From bea at codespeak.net  Thu Dec  1 00:01:20 2005
From: bea at codespeak.net (bea at codespeak.net)
Date: Thu, 1 Dec 2005 00:01:20 +0100 (CET)
Subject: [pypy-svn] r20468 - pypy/extradoc/talk/22c3
Message-ID: <20051130230120.F1CC527B51@code1.codespeak.net>

Author: bea
Date: Thu Dec  1 00:01:19 2005
New Revision: 20468

Added:
   pypy/extradoc/talk/22c3/agility_v1.txt.txt
Log:
one version of structure and content for the agile ccc talk - this
following a more evolutionary approach

Added: pypy/extradoc/talk/22c3/agility_v1.txt.txt
==============================================================================
--- (empty file)
+++ pypy/extradoc/talk/22c3/agility_v1.txt.txt	Thu Dec  1 00:01:19 2005
@@ -0,0 +1,64 @@
+Agile Business and EU funding: sprint methodology in funded OSS project
+-----------------------------------------------------------------------
+
+Introduction:
+-------------
+
+
+
+The vision: the creation of an OSS community
+--------------------------------------------
+
+Founding PyPy:
+
+Agile approaches:
+- sprints
+- test-driven development
+
+Community structure:
+- transparent communication
+- decision making
+- interaction with other communities
+
+
+The idea: Framework 6 programme IST funding for OSS work
+--------------------------------------------------------
+
+"Why do you want money - aren't you guys non-profit?":
+- impact for the EU
+
+"Why do we want money - isn't OSS non-profit?":
+- impact for the community
+
+Proposal and negotiations:
+- formal requirements
+- organizational limbo
+
+The Project: consortium and companies within an OSS community structure
+-----------------------------------------------------------------------
+
+Forced entrepreneurship:
+
+Creating the consortium:
+
+Formalizing aspects of the community:
+- roles and responsibilities
+
+
+The challenge: balancing agile OSS community structures with EU requirements
+------------------------------------------------------------------------------
+
+Sprints - the key agile approach:
+
+Physical persons:
+
+Communication channels:
+
+Managing diversities: agile business - a successful marriage?
+-------------------------------------------------------------
+
+Agile EU-project:
+
+Agile businesses:
+

From bea at codespeak.net  Thu Dec  1 00:02:08 2005
From: bea at codespeak.net (bea at codespeak.net)
Date: Thu, 1 Dec 2005 00:02:08 +0100 (CET)
Subject: [pypy-svn] r20470 - pypy/extradoc/talk/22c3
Message-ID: <20051130230208.31D5127B53@code1.codespeak.net>

Author: bea
Date: Thu Dec  1 00:02:07 2005
New Revision: 20470

Added:
   pypy/extradoc/talk/22c3/agility_v2.txt.txt
Log:
a second version of structure and content for the agile ccc talk - this
one a bit more traditional in structure, covering different segments

Added: pypy/extradoc/talk/22c3/agility_v2.txt.txt
==============================================================================
--- (empty file)
+++ pypy/extradoc/talk/22c3/agility_v2.txt.txt	Thu Dec  1 00:02:07 2005
@@ -0,0 +1,34 @@
+Agile Business and EU funding: sprint methodology in funded OSS project
+-----------------------------------------------------------------------
+
+Introduction:
+-------------
+
+
+
+1. Agile open source practices
+
+- community structure (communication and decision making)
+- sprints (source, Python community, PyPy)
+
+
+2. Agile technical practices
+
+- working distributed (pypy-sync)
+- test-driven development
+- version control
+- infrastructure
+
+3. EU-funding in an OSS community
+
+- company creation
+- consortium structure
+- requirements versus agility
+- managing diversities (roles, responsibilities, communication, culture)
+
+4.
Designing agile businesses
+
+- tailoring an agile project process
+- challenges and recommendations
+
\ No newline at end of file

From pedronis at codespeak.net  Thu Dec  1 00:14:57 2005
From: pedronis at codespeak.net (pedronis at codespeak.net)
Date: Thu, 1 Dec 2005 00:14:57 +0100 (CET)
Subject: [pypy-svn] r20471 - pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem
Message-ID: <20051130231457.9EC1127B46@code1.codespeak.net>

Author: pedronis
Date: Thu Dec  1 00:14:56 2005
New Revision: 20471

Modified:
   pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rclass.py
Log:
when reading value to fill instances fallback to the classdesc if the
getattr on the instance failed

Modified: pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rclass.py
==============================================================================
--- pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rclass.py	(original)
+++ pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rclass.py	Thu Dec  1 00:14:56 2005
@@ -401,11 +401,14 @@
         else:
             try:
                 attrvalue = getattr(value, name)
+                llattrvalue = r.convert_const(attrvalue)
             except AttributeError:
-                warning("prebuilt instance %r has no attribute %r" % (
-                    value, name))
-                continue
-            llattrvalue = r.convert_const(attrvalue)
+                attrvalue = self.classdef.classdesc.read_attribute(name, None)
+                if attrvalue is None:
+                    warning("prebuilt instance %r has no attribute %r" % (
+                        value, name))
+                    continue
+                llattrvalue = r.convert_desc_or_cosnt(attrvalue)
             setattr(result, mangled_name, llattrvalue)
     else:
         # OBJECT part

From pedronis at codespeak.net  Thu Dec  1 00:23:23 2005
From: pedronis at codespeak.net (pedronis at codespeak.net)
Date: Thu, 1 Dec 2005 00:23:23 +0100 (CET)
Subject: [pypy-svn] r20472 - pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem
Message-ID: <20051130232323.B7F5927B49@code1.codespeak.net>

Author: pedronis
Date: Thu Dec  1 00:23:23 2005
New Revision: 20472

Modified:
   pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rclass.py
Log:
typo

Modified: pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rclass.py
==============================================================================
--- pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rclass.py	(original)
+++ pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rclass.py	Thu Dec  1 00:23:23 2005
@@ -408,7 +408,7 @@
                     warning("prebuilt instance %r has no attribute %r" % (
                         value, name))
                     continue
-                llattrvalue = r.convert_desc_or_cosnt(attrvalue)
+                llattrvalue = r.convert_desc_or_const(attrvalue)
             setattr(result, mangled_name, llattrvalue)
     else:
         # OBJECT part

From pedronis at codespeak.net  Thu Dec  1 00:32:09 2005
From: pedronis at codespeak.net (pedronis at codespeak.net)
Date: Thu, 1 Dec 2005 00:32:09 +0100 (CET)
Subject: [pypy-svn] r20473 - pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem
Message-ID: <20051130233209.810AF27B49@code1.codespeak.net>

Author: pedronis
Date: Thu Dec  1 00:32:08 2005
New Revision: 20473

Modified:
   pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rclass.py
Log:
oops, don't hide exceptions

Modified: pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rclass.py
==============================================================================
--- pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rclass.py	(original)
+++ pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rclass.py	Thu Dec  1 00:32:08 2005
@@ -401,7 +401,6 @@
         else:
             try:
                 attrvalue = getattr(value, name)
-                llattrvalue = r.convert_const(attrvalue)
             except AttributeError:
                 attrvalue = self.classdef.classdesc.read_attribute(name, None)
                 if attrvalue is None:
@@ -409,6 +408,8 @@
                         value, name))
                     continue
                 llattrvalue = r.convert_desc_or_const(attrvalue)
+            else:
+                llattrvalue = r.convert_const(attrvalue)
             setattr(result, mangled_name, llattrvalue)
     else:
         # OBJECT part

From pedronis at codespeak.net  Thu Dec  1 00:41:11 2005
From: pedronis at codespeak.net (pedronis at codespeak.net)
Date: Thu, 1 Dec 2005 00:41:11 +0100 (CET)
Subject: [pypy-svn] r20474 - pypy/branch/somepbc-refactoring/pypy/translator
Message-ID: <20051130234111.1705027B51@code1.codespeak.net>

Author: pedronis
Date: Thu Dec  1 00:41:10 2005
New Revision: 20474

Modified:
   pypy/branch/somepbc-refactoring/pypy/translator/ann_override.py
Log:
safer ann_override

Modified: pypy/branch/somepbc-refactoring/pypy/translator/ann_override.py
==============================================================================
--- pypy/branch/somepbc-refactoring/pypy/translator/ann_override.py	(original)
+++ pypy/branch/somepbc-refactoring/pypy/translator/ann_override.py	Thu Dec  1 00:41:10 2005
@@ -120,14 +120,20 @@
     def event(pol, bookkeeper, what, x):
         from pypy.objspace.std import typeobject
         if isinstance(x, typeobject.W_TypeObject):
+            from pypy.annotation.classdef import InstanceSource
+            clsdef = bookkeeper.getuniqueclassdef(typeobject.W_TypeObject)
            pol.pypytypes[x] = True
            print "TYPE", x
            for attr in pol.lookups:
-                if attr:
-                    pol.attach_lookup(x, attr)
+                if attr and pol.attach_lookup(x, attr):
+                    cached = "cached_%s" % attr
+                    source = InstanceSource(bookkeeper, x)
+                    clsdef.add_source_for_attribute(cached, source)
            for attr in pol.lookups_where:
-                if attr:
-                    pol.attach_lookup_in_type_where(x, attr)
+                if attr and pol.attach_lookup_in_type_where(x, attr):
+                    cached = "cached_where_%s" % attr
+                    source = InstanceSource(bookkeeper, x)
+                    clsdef.add_source_for_attribute(cached, source)
         return

 CACHED_LOOKUP = """

From cfbolz at codespeak.net  Thu Dec  1 01:53:47 2005
From: cfbolz at codespeak.net (cfbolz at codespeak.net)
Date: Thu, 1 Dec 2005 01:53:47 +0100 (CET)
Subject: [pypy-svn] r20475 - pypy/extradoc/talk/22c3
Message-ID: <20051201005347.F0C4C27B53@code1.codespeak.net>

Author: cfbolz
Date: Thu Dec  1 01:53:42 2005
New Revision: 20475

Added:
   pypy/extradoc/talk/22c3/techpaper.sty
Modified:
   pypy/extradoc/talk/22c3/techpaper.pdf
   pypy/extradoc/talk/22c3/techpaper.txt
Log: fix some small things. added a style file to include the author info (hard to do in ReST). regenerate the pdf -- manual tweaks in the tex files are necessary :-( Modified: pypy/extradoc/talk/22c3/techpaper.pdf ============================================================================== Files pypy/extradoc/talk/22c3/techpaper.pdf (original) and pypy/extradoc/talk/22c3/techpaper.pdf Thu Dec 1 01:53:42 2005 differ Added: pypy/extradoc/talk/22c3/techpaper.sty ============================================================================== --- (empty file) +++ pypy/extradoc/talk/22c3/techpaper.sty Thu Dec 1 01:53:42 2005 @@ -0,0 +1,4 @@ +\author {Carl Friedrich Bolz\\\texttt{cfbolz at gmx.de} \and + Holger Krekel\\\texttt{hpk at merlinux} \and + Armin Rigo\\\texttt{arigo at tunes.org}} +\date{} Modified: pypy/extradoc/talk/22c3/techpaper.txt ============================================================================== --- pypy/extradoc/talk/22c3/techpaper.txt (original) +++ pypy/extradoc/talk/22c3/techpaper.txt Thu Dec 1 01:53:42 2005 @@ -1,11 +1,15 @@ -================================================== -PyPy - The new Python Implemention on the Block -================================================== +=============================================== +PyPy - The new Python Implemention on the Block +=============================================== + +.. command to produce latex file: +.. rst2latex --use-latex-citations --use-latex-footnotes --use-latex-toc --documentoptions=twocolumn,10pt,a4paper --hyperlink-color=0 techpaper.txt --use-latex-docinfo --stylesheet=techpaper.sty techpaper.tex +.. manual tweaking of the title in the latex file is necessary :-( Mission statement ==================== -PyPy is an implementation of the Python [#]_ programming language written in +PyPy [#]_ is an implementation of the Python [#]_ programming language written in Python itself, flexible and easy to experiment with. 
Our long-term goals are to target a large variety of platforms, small and large, by providing a compiler toolsuite that can produce custom Python versions. Platform, memory @@ -14,6 +18,7 @@ Eventually, dynamic optimization techniques - implemented as another translation aspect - should become robust against language changes. +.. [#] http://codespeak.net/pypy .. [#] http://docs.python.org/ref PyPy - an implementation of Python in Python @@ -76,7 +81,7 @@ this *translation* process. It can be configured to use reference counting or not; thus we have already achieved two very different mappings of application Python code over C/Posix. We have -successfully also translated our Python interpreter into the LLVM [#]_ code, +also successfully translated our Python interpreter into LLVM [#]_ code, and we are working on targeting higher-level environments like Java and Squeak. @@ -241,7 +246,7 @@ .. [#] http://codespeak.net/pypy -Status of the implementation (Oct 2005) +Status of the implementation (Nov 2005) ========================================== With the pypy-0.8.0 release we have integrated our AST compiler with From cfbolz at codespeak.net Thu Dec 1 10:07:45 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 1 Dec 2005 10:07:45 +0100 (CET) Subject: [pypy-svn] r20476 - pypy/extradoc/talk/22c3 Message-ID: <20051201090745.9290E27B51@code1.codespeak.net> Author: cfbolz Date: Thu Dec 1 10:07:43 2005 New Revision: 20476 Added: pypy/extradoc/talk/22c3/sprintprocess.gif - copied unchanged from r20475, pypy/funding/sprintprocess.gif Log: sprintprocess picture for the business talk From cfbolz at codespeak.net Thu Dec 1 10:15:37 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 1 Dec 2005 10:15:37 +0100 (CET) Subject: [pypy-svn] r20477 - pypy/extradoc/talk/22c3 Message-ID: <20051201091537.820D827B51@code1.codespeak.net> Author: cfbolz Date: Thu Dec 1 10:15:28 2005 New Revision: 20477 Modified: pypy/extradoc/talk/22c3/techpaper.pdf 
pypy/extradoc/talk/22c3/techpaper.sty Log: add organizations + fix holgers email Modified: pypy/extradoc/talk/22c3/techpaper.pdf ============================================================================== Files pypy/extradoc/talk/22c3/techpaper.pdf (original) and pypy/extradoc/talk/22c3/techpaper.pdf Thu Dec 1 10:15:28 2005 differ Modified: pypy/extradoc/talk/22c3/techpaper.sty ============================================================================== --- pypy/extradoc/talk/22c3/techpaper.sty (original) +++ pypy/extradoc/talk/22c3/techpaper.sty Thu Dec 1 10:15:28 2005 @@ -1,4 +1,4 @@ -\author {Carl Friedrich Bolz\\\texttt{cfbolz at gmx.de} \and - Holger Krekel\\\texttt{hpk at merlinux} \and - Armin Rigo\\\texttt{arigo at tunes.org}} +\author {Carl Friedrich Bolz\\merlinux\\\texttt{cfbolz at gmx.de} \and + Holger Krekel\\merlinux\\\texttt{hpk at merlinux.de} \and + Armin Rigo\\Heinrich Heine Universit\"at D\"usseldorf\\\texttt{arigo at tunes.org}} \date{} From arigo at codespeak.net Thu Dec 1 10:37:02 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 1 Dec 2005 10:37:02 +0100 (CET) Subject: [pypy-svn] r20480 - pypy/dist/pypy/doc Message-ID: <20051201093702.E8FC227B53@code1.codespeak.net> Author: arigo Date: Thu Dec 1 10:37:01 2005 New Revision: 20480 Modified: pypy/dist/pypy/doc/architecture.txt Log: Port of mwh's anglicizations from extradoc/talk/22c3/techpaper.txt. Modified: pypy/dist/pypy/doc/architecture.txt ============================================================================== --- pypy/dist/pypy/doc/architecture.txt (original) +++ pypy/dist/pypy/doc/architecture.txt Thu Dec 1 10:37:01 2005 @@ -46,7 +46,7 @@ and extensions, following the ideas of Stackless_ and others, that will increase the expressive power available to python programmers. -.. _Python: http://www.python.org/doc/current/ref/ref.html +.. _Python: http://docs.python.org/ref .. _Psyco: http://psyco.sourceforge.net .. 
_Stackless: http://stackless.com @@ -55,7 +55,7 @@ ======================== PyPy is not only about writing another Python interpreter. -Traditionally, interpreters are written in some target platform language +Traditionally, interpreters are written in a target platform language like C/Posix, Java or C#. Each such interpreter provides a "mapping" from application source code to the target environment. One of the goals of the "all-encompassing" environments, like the .NET framework @@ -68,20 +68,20 @@ Python interpreter for a specific target platform. We have written a Python interpreter in Python, without many references to low-level details. (Because of the nature of Python, this is already a -complicated task, although not as much as writing it -- say -- in C.) +complicated task, although not as much as writing it in - say - C.) Then we use this as a "language specification" and manipulate it to produce the more traditional interpreters that we want. In the above sense, we are generating the concrete "mappings" of Python into lower-level target platforms. -So far (fall 2005), we have already succeeded in turning this "language +So far (autumn 2005), we have already succeeded in turning this "language specification" into reasonably efficient C-level code that performs basically the same job as CPython. Memory management is inserted during -this "translation" process. It can be configured to use reference +this *translation* process. It can be configured to use reference counting or not; thus we have already achieved two very different -"mappings" of application Python code over C/Posix. We have -successfully translated our Python interpreter to the LLVM_ target as -well, and we are working on targeting higher-level environments like +mappings of application Python code over C/Posix. We have +successfully also translated our Python interpreter into LLVM_ code, +and we are working on targeting higher-level environments like Java and Squeak. 
In some senses, PyPy project's central component is not its @@ -99,7 +99,7 @@ * we can tweak the translation process to produce low-level code based on different models and tradeoffs. -By contrast, a standardized target environment -- say .NET -- +By contrast, a standardized target environment - say .NET - enforces ``m=1`` as far as it's concerned. This helps making ``o`` a bit smaller by providing a higher-level base to build upon. Still, we believe that enforcing the use of one common environment @@ -107,9 +107,9 @@ as far as language implementation is concerned - showing an approach to the ``n * m * o`` problem that does not rely on standardization. -This is the "meta-goal"; a more concrete goal worth mentioning at this +This is the *meta-goal*; a more concrete goal worth mentioning at this point is that language specifications can be used to generate cool stuff -in addition to traditional interpreters -- e.g. Just-In-Time Compilers. +in addition to traditional interpreters - e.g. Just-In-Time compilers. Higher level picture @@ -176,7 +176,7 @@ - the *annotator* which performs type inference on the flow graph; - the *typer* which, based on the type annotations, turns the flow graph - into one using only low-level, operations that fit the model of the + into one using only low-level operations that fit the model of the target platform; - the *code generator* which translates the resulting flow graph into @@ -200,8 +200,7 @@ Python code at some point. However, in the start-up phase, we are completely free to use all kinds of powerful python constructs, including metaclasses and execution of dynamically constructed strings. However, -when the initialization phase (mainly, the function -``objspace.initialize()``) finishes, all code objects involved need to +when the initialization phase finishes, all code objects involved need to adhere to a more static subset of Python: Restricted Python, also known as `RPython`_. 
@@ -241,7 +240,7 @@ uses them to modify the flow graph in-place to replace its operations with low-level ones, directly manipulating C-like values and data structures. -Here is an overview of the whole process (`PDF color version`_): +Here is an overview of the translation process (`PDF color version`_): .. image:: image/translation-greyscale-small.png @@ -279,12 +278,12 @@ written in Python to the resulting self-contained executable. Our rather complete and Python 2.4-compliant interpreter consists -of about 30'000-50'000 lines of code (depending on the way you +of about 30,000-50,000 lines of code (depending on the way you count code borrowed and adapted from other sources), with -another 14'000 lines of unit tests. If we include the tools, +another 14,000 lines of unit tests. If we include the tools, the parts related to code analysis and generation, and the -standard library modules ported from C, PyPy is now 138'000 -lines of code and 32'000 lines of tests. Refer to +standard library modules ported from C, PyPy is now 138,000 +lines of code and 32,000 lines of tests. Refer to the `statistics web page`_ for more detailed information. .. 
_`statistics web page`: http://codespeak.net/~hpk/pypy-stat/ From cfbolz at codespeak.net Thu Dec 1 10:46:20 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 1 Dec 2005 10:46:20 +0100 (CET) Subject: [pypy-svn] r20481 - pypy/extradoc/talk/22c3 Message-ID: <20051201094620.D1E3C27B51@code1.codespeak.net> Author: cfbolz Date: Thu Dec 1 10:46:16 2005 New Revision: 20481 Modified: pypy/extradoc/talk/22c3/techpaper.pdf pypy/extradoc/talk/22c3/techpaper.txt Log: change "Mission statement" to "Abstract" Modified: pypy/extradoc/talk/22c3/techpaper.pdf ============================================================================== Files pypy/extradoc/talk/22c3/techpaper.pdf (original) and pypy/extradoc/talk/22c3/techpaper.pdf Thu Dec 1 10:46:16 2005 differ Modified: pypy/extradoc/talk/22c3/techpaper.txt ============================================================================== --- pypy/extradoc/talk/22c3/techpaper.txt (original) +++ pypy/extradoc/talk/22c3/techpaper.txt Thu Dec 1 10:46:16 2005 @@ -6,8 +6,8 @@ .. rst2latex --use-latex-citations --use-latex-footnotes --use-latex-toc --documentoptions=twocolumn,10pt,a4paper --hyperlink-color=0 techpaper.txt --use-latex-docinfo --stylesheet=techpaper.sty techpaper.tex .. manual tweaking of the title in the latex file is necessary :-( -Mission statement -==================== +Abstract +======== PyPy [#]_ is an implementation of the Python [#]_ programming language written in Python itself, flexible and easy to experiment with. 
Our long-term goals are From cfbolz at codespeak.net Thu Dec 1 10:56:21 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 1 Dec 2005 10:56:21 +0100 (CET) Subject: [pypy-svn] r20482 - pypy/extradoc/talk/22c3 Message-ID: <20051201095621.D47E327B54@code1.codespeak.net> Author: cfbolz Date: Thu Dec 1 10:56:17 2005 New Revision: 20482 Modified: pypy/extradoc/talk/22c3/techpaper.pdf pypy/extradoc/talk/22c3/techpaper.txt Log: don't number the abstract Modified: pypy/extradoc/talk/22c3/techpaper.pdf ============================================================================== Files pypy/extradoc/talk/22c3/techpaper.pdf (original) and pypy/extradoc/talk/22c3/techpaper.pdf Thu Dec 1 10:56:17 2005 differ Modified: pypy/extradoc/talk/22c3/techpaper.txt ============================================================================== --- pypy/extradoc/talk/22c3/techpaper.txt (original) +++ pypy/extradoc/talk/22c3/techpaper.txt Thu Dec 1 10:56:17 2005 @@ -6,8 +6,10 @@ .. rst2latex --use-latex-citations --use-latex-footnotes --use-latex-toc --documentoptions=twocolumn,10pt,a4paper --hyperlink-color=0 techpaper.txt --use-latex-docinfo --stylesheet=techpaper.sty techpaper.tex .. manual tweaking of the title in the latex file is necessary :-( -Abstract -======== + +.. raw:: latex + + \begin{abstract} PyPy [#]_ is an implementation of the Python [#]_ programming language written in Python itself, flexible and easy to experiment with. Our long-term goals are @@ -21,6 +23,10 @@ .. [#] http://codespeak.net/pypy .. [#] http://docs.python.org/ref +.. 
raw:: latex
+
+  \end{abstract}
+
 PyPy - an implementation of Python in Python
 ============================================

From bea at codespeak.net  Thu Dec  1 10:59:17 2005
From: bea at codespeak.net (bea at codespeak.net)
Date: Thu, 1 Dec 2005 10:59:17 +0100 (CET)
Subject: [pypy-svn] r20483 - pypy/extradoc/talk/22c3
Message-ID: <20051201095917.9843227B54@code1.codespeak.net>

Author: bea
Date: Thu Dec  1 10:59:16 2005
New Revision: 20483

Modified:
   pypy/extradoc/talk/22c3/agility_v1.txt.txt
Log:
introduction - explaining the structure of the paper

Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt
==============================================================================
--- pypy/extradoc/talk/22c3/agility_v1.txt.txt	(original)
+++ pypy/extradoc/talk/22c3/agility_v1.txt.txt	Thu Dec  1 10:59:16 2005
@@ -3,7 +3,14 @@
 
 Introduction:
 -------------
-
+This paper takes an evolutionary approach: a walkthrough of the history of the
+PyPy project, touching on different aspects of agility.
+
+In the founding of the community there was a clear vision of agile development, with sprints as the key method. The idea of EU funding, and the process of obtaining it, created a paradox: how to keep the agile open source community structure when key aspects of the project are funded through the EU.
+
+This exposed the project to formal requirements for planning, estimation and resource tracking, and the challenge was to design a process that struck a balance between community and consortium, between a developer-driven process and a formal organizational structure.
+
+The evolution of the project - from a non-profit Open Source initiative to a partially funded EU project - made possible the growth of Agile Business.
 The vision: the creation of an OSS community

From cfbolz at codespeak.net  Thu Dec  1 11:59:48 2005
From: cfbolz at codespeak.net (cfbolz at codespeak.net)
Date: Thu, 1 Dec 2005 11:59:48 +0100 (CET)
Subject: [pypy-svn] r20487 - pypy/dist/pypy/doc/statistic
Message-ID: <20051201105948.1A0DA27B53@code1.codespeak.net>

Author: cfbolz
Date: Thu Dec  1 11:59:46 2005
New Revision: 20487

Modified:
   pypy/dist/pypy/doc/statistic/format.py
Log:
disentangle release- and sprint-text

Modified: pypy/dist/pypy/doc/statistic/format.py
==============================================================================
--- pypy/dist/pypy/doc/statistic/format.py	(original)
+++ pypy/dist/pypy/doc/statistic/format.py	Thu Dec  1 11:59:46 2005
@@ -57,11 +57,11 @@
         args = [dates, d, colors[i]]
         pylab.plot_date(*args)
 
-    ymax = max(pylab.yticks()[0]) * 0.88 #just below the legend
+    ymax = max(pylab.yticks()[0]) #just below the legend
     for i, release_date in enumerate(release_dates):
         release_name = release_names[i]
         pylab.axvline(release_date, linewidth=2, color="g", alpha=0.5)
-        ax.text(release_date, ymax, release_name,
+        ax.text(release_date, ymax * 0.5, release_name,
                 fontsize=10,
                 horizontalalignment='right',
                 verticalalignment='top',
@@ -72,7 +72,7 @@
         end = sprint_end_dates[i]
         if float(begin) >= float(min(dates[0],dates[-1])):
             pylab.axvspan(begin, end, facecolor="y", alpha=0.2)
-        ax.text(begin, ymax, location,
+        ax.text(begin, ymax * 0.88, location,
                 fontsize=10,
                 horizontalalignment='right',
                 verticalalignment='top',

From cfbolz at codespeak.net  Thu Dec  1 11:59:57 2005
From: cfbolz at codespeak.net (cfbolz at codespeak.net)
Date: Thu, 1 Dec 2005 11:59:57 +0100 (CET)
Subject: [pypy-svn] r20488 - pypy/dist/pypy/doc/statistic
Message-ID: <20051201105957.EB10327B54@code1.codespeak.net>

Author: cfbolz
Date: Thu Dec  1 11:59:56 2005
New Revision: 20488

Added:
   pypy/dist/pypy/doc/statistic/webaccess.txt
Log:
add web access statistic

Added: pypy/dist/pypy/doc/statistic/webaccess.txt
============================================================================== --- (empty file) +++ pypy/dist/pypy/doc/statistic/webaccess.txt Thu Dec 1 11:59:56 2005 @@ -0,0 +1,824 @@ +Codespeak web access +date, number of visits +01 Sep 2003, 0 +02 Sep 2003, 0 +03 Sep 2003, 0 +04 Sep 2003, 0 +05 Sep 2003, 0 +06 Sep 2003, 0 +07 Sep 2003, 61 +08 Sep 2003, 161 +09 Sep 2003, 110 +10 Sep 2003, 118 +11 Sep 2003, 112 +12 Sep 2003, 84 +13 Sep 2003, 99 +14 Sep 2003, 23 +15 Sep 2003, 134 +16 Sep 2003, 113 +17 Sep 2003, 89 +18 Sep 2003, 98 +19 Sep 2003, 12 +20 Sep 2003, 16 +21 Sep 2003, 12 +22 Sep 2003, 52 +23 Sep 2003, 21 +24 Sep 2003, 0 +25 Sep 2003, 130 +26 Sep 2003, 101 +27 Sep 2003, 22 +28 Sep 2003, 72 +29 Sep 2003, 43 +30 Sep 2003, 101 +01 Oct 2003, 63 +02 Oct 2003, 114 +03 Oct 2003, 8 +04 Oct 2003, 20 +05 Oct 2003, 55 +06 Oct 2003, 134 +07 Oct 2003, 172 +08 Oct 2003, 154 +09 Oct 2003, 130 +10 Oct 2003, 134 +11 Oct 2003, 32 +12 Oct 2003, 149 +13 Oct 2003, 165 +14 Oct 2003, 117 +15 Oct 2003, 121 +16 Oct 2003, 151 +17 Oct 2003, 100 +18 Oct 2003, 46 +19 Oct 2003, 70 +20 Oct 2003, 230 +21 Oct 2003, 87 +22 Oct 2003, 181 +23 Oct 2003, 171 +24 Oct 2003, 123 +25 Oct 2003, 52 +26 Oct 2003, 117 +27 Oct 2003, 13 +28 Oct 2003, 164 +29 Oct 2003, 172 +30 Oct 2003, 138 +31 Oct 2003, 39 +01 Nov 2003, 107 +02 Nov 2003, 128 +03 Nov 2003, 61 +04 Nov 2003, 77 +05 Nov 2003, 125 +06 Nov 2003, 149 +07 Nov 2003, 106 +08 Nov 2003, 10 +09 Nov 2003, 51 +10 Nov 2003, 98 +11 Nov 2003, 0 +12 Nov 2003, 18 +13 Nov 2003, 94 +14 Nov 2003, 53 +15 Nov 2003, 29 +16 Nov 2003, 42 +17 Nov 2003, 32 +18 Nov 2003, 153 +19 Nov 2003, 95 +20 Nov 2003, 144 +21 Nov 2003, 15 +22 Nov 2003, 62 +23 Nov 2003, 26 +24 Nov 2003, 60 +25 Nov 2003, 114 +26 Nov 2003, 14 +27 Nov 2003, 94 +28 Nov 2003, 48 +29 Nov 2003, 14 +30 Nov 2003, 36 +01 Dec 2003, 130 +02 Dec 2003, 146 +03 Dec 2003, 132 +04 Dec 2003, 106 +05 Dec 2003, 132 +06 Dec 2003, 74 +07 Dec 2003, 75 +08 Dec 2003, 115 +09 Dec 2003, 79 +10 Dec 2003, 83 +11 Dec 2003, 91 
+12 Dec 2003, 145 +13 Dec 2003, 53 +14 Dec 2003, 76 +15 Dec 2003, 190 +16 Dec 2003, 204 +17 Dec 2003, 20 +18 Dec 2003, 0 +19 Dec 2003, 0 +20 Dec 2003, 0 +21 Dec 2003, 91 +22 Dec 2003, 183 +23 Dec 2003, 149 +24 Dec 2003, 72 +25 Dec 2003, 71 +26 Dec 2003, 41 +27 Dec 2003, 62 +28 Dec 2003, 16 +29 Dec 2003, 104 +30 Dec 2003, 45 +31 Dec 2003, 37 +01 Jan 2004, 79 +02 Jan 2004, 20 +03 Jan 2004, 20 +04 Jan 2004, 0 +05 Jan 2004, 21 +06 Jan 2004, 119 +07 Jan 2004, 79 +08 Jan 2004, 76 +09 Jan 2004, 172 +10 Jan 2004, 121 +11 Jan 2004, 104 +12 Jan 2004, 129 +13 Jan 2004, 0 +14 Jan 2004, 0 +15 Jan 2004, 164 +16 Jan 2004, 0 +17 Jan 2004, 90 +18 Jan 2004, 93 +19 Jan 2004, 76 +20 Jan 2004, 136 +21 Jan 2004, 105 +22 Jan 2004, 175 +23 Jan 2004, 102 +24 Jan 2004, 156 +25 Jan 2004, 34 +26 Jan 2004, 182 +27 Jan 2004, 21 +28 Jan 2004, 167 +29 Jan 2004, 85 +30 Jan 2004, 14 +31 Jan 2004, 35 +01 Feb 2004, 96 +02 Feb 2004, 85 +03 Feb 2004, 120 +04 Feb 2004, 100 +05 Feb 2004, 141 +06 Feb 2004, 173 +07 Feb 2004, 47 +08 Feb 2004, 112 +09 Feb 2004, 185 +10 Feb 2004, 138 +11 Feb 2004, 24 +12 Feb 2004, 0 +13 Feb 2004, 40 +14 Feb 2004, 26 +15 Feb 2004, 0 +16 Feb 2004, 142 +17 Feb 2004, 118 +18 Feb 2004, 178 +19 Feb 2004, 104 +20 Feb 2004, 25 +21 Feb 2004, 81 +22 Feb 2004, 87 +23 Feb 2004, 152 +24 Feb 2004, 66 +25 Feb 2004, 0 +26 Feb 2004, 0 +27 Feb 2004, 0 +28 Feb 2004, 104 +29 Feb 2004, 86 +01 Mar 2004, 170 +02 Mar 2004, 72 +03 Mar 2004, 0 +04 Mar 2004, 10 +05 Mar 2004, 63 +06 Mar 2004, 99 +07 Mar 2004, 95 +08 Mar 2004, 0 +09 Mar 2004, 0 +10 Mar 2004, 117 +11 Mar 2004, 0 +12 Mar 2004, 78 +13 Mar 2004, 89 +14 Mar 2004, 43 +15 Mar 2004, 0 +16 Mar 2004, 184 +17 Mar 2004, 101 +18 Mar 2004, 145 +19 Mar 2004, 0 +20 Mar 2004, 0 +21 Mar 2004, 99 +22 Mar 2004, 142 +23 Mar 2004, 28 +24 Mar 2004, 55 +25 Mar 2004, 82 +26 Mar 2004, 128 +27 Mar 2004, 112 +28 Mar 2004, 42 +29 Mar 2004, 0 +30 Mar 2004, 0 +31 Mar 2004, 112 +01 Apr 2004, 105 +02 Apr 2004, 142 +03 Apr 2004, 86 +04 Apr 2004, 107 +05 Apr 2004, 95 +06 
Apr 2004, 191 +07 Apr 2004, 138 +08 Apr 2004, 0 +09 Apr 2004, 62 +10 Apr 2004, 46 +11 Apr 2004, 104 +12 Apr 2004, 110 +13 Apr 2004, 84 +14 Apr 2004, 196 +15 Apr 2004, 77 +16 Apr 2004, 164 +17 Apr 2004, 0 +18 Apr 2004, 0 +19 Apr 2004, 147 +20 Apr 2004, 15 +21 Apr 2004, 0 +22 Apr 2004, 0 +23 Apr 2004, 123 +24 Apr 2004, 0 +25 Apr 2004, 0 +26 Apr 2004, 178 +27 Apr 2004, 47 +28 Apr 2004, 147 +29 Apr 2004, 48 +30 Apr 2004, 83 +01 May 2004, 49 +02 May 2004, 0 +03 May 2004, 109 +04 May 2004, 148 +05 May 2004, 165 +06 May 2004, 182 +07 May 2004, 0 +08 May 2004, 0 +09 May 2004, 26 +10 May 2004, 0 +11 May 2004, 0 +12 May 2004, 173 +13 May 2004, 178 +14 May 2004, 86 +15 May 2004, 97 +16 May 2004, 100 +17 May 2004, 165 +18 May 2004, 145 +19 May 2004, 67 +20 May 2004, 136 +21 May 2004, 128 +22 May 2004, 76 +23 May 2004, 51 +24 May 2004, 207 +25 May 2004, 159 +26 May 2004, 197 +27 May 2004, 160 +28 May 2004, 136 +29 May 2004, 20 +30 May 2004, 0 +31 May 2004, 0 +01 Jun 2004, 180 +02 Jun 2004, 176 +03 Jun 2004, 185 +04 Jun 2004, 98 +05 Jun 2004, 105 +06 Jun 2004, 7 +07 Jun 2004, 0 +08 Jun 2004, 60 +09 Jun 2004, 221 +10 Jun 2004, 156 +11 Jun 2004, 0 +12 Jun 2004, 15 +13 Jun 2004, 106 +14 Jun 2004, 93 +15 Jun 2004, 0 +16 Jun 2004, 267 +17 Jun 2004, 317 +18 Jun 2004, 208 +19 Jun 2004, 95 +20 Jun 2004, 139 +21 Jun 2004, 58 +22 Jun 2004, 643 +23 Jun 2004, 85 +24 Jun 2004, 0 +25 Jun 2004, 0 +26 Jun 2004, 97 +27 Jun 2004, 53 +28 Jun 2004, 76 +29 Jun 2004, 251 +30 Jun 2004, 291 +01 Jul 2004, 252 +02 Jul 2004, 245 +03 Jul 2004, 170 +04 Jul 2004, 55 +05 Jul 2004, 125 +06 Jul 2004, 84 +07 Jul 2004, 208 +08 Jul 2004, 44 +09 Jul 2004, 148 +10 Jul 2004, 150 +11 Jul 2004, 90 +12 Jul 2004, 284 +13 Jul 2004, 220 +14 Jul 2004, 57 +15 Jul 2004, 131 +16 Jul 2004, 37 +17 Jul 2004, 0 +18 Jul 2004, 62 +19 Jul 2004, 0 +20 Jul 2004, 149 +21 Jul 2004, 0 +22 Jul 2004, 124 +23 Jul 2004, 108 +24 Jul 2004, 106 +25 Jul 2004, 20 +26 Jul 2004, 174 +27 Jul 2004, 120 +28 Jul 2004, 40 +29 Jul 2004, 0 +30 Jul 2004, 
319 +31 Jul 2004, 285 +01 Aug 2004, 237 +02 Aug 2004, 376 +03 Aug 2004, 306 +04 Aug 2004, 417 +05 Aug 2004, 393 +06 Aug 2004, 279 +07 Aug 2004, 219 +08 Aug 2004, 199 +09 Aug 2004, 325 +10 Aug 2004, 314 +11 Aug 2004, 275 +12 Aug 2004, 320 +13 Aug 2004, 316 +14 Aug 2004, 199 +15 Aug 2004, 232 +16 Aug 2004, 267 +17 Aug 2004, 286 +18 Aug 2004, 317 +19 Aug 2004, 297 +20 Aug 2004, 256 +21 Aug 2004, 174 +22 Aug 2004, 201 +23 Aug 2004, 319 +24 Aug 2004, 302 +25 Aug 2004, 261 +26 Aug 2004, 251 +27 Aug 2004, 224 +28 Aug 2004, 98 +29 Aug 2004, 0 +30 Aug 2004, 290 +31 Aug 2004, 229 +01 Sep 2004, 261 +02 Sep 2004, 226 +03 Sep 2004, 243 +04 Sep 2004, 157 +05 Sep 2004, 79 +06 Sep 2004, 186 +07 Sep 2004, 236 +08 Sep 2004, 238 +09 Sep 2004, 254 +10 Sep 2004, 256 +11 Sep 2004, 130 +12 Sep 2004, 159 +13 Sep 2004, 245 +14 Sep 2004, 259 +15 Sep 2004, 279 +16 Sep 2004, 265 +17 Sep 2004, 261 +18 Sep 2004, 130 +19 Sep 2004, 135 +20 Sep 2004, 229 +21 Sep 2004, 283 +22 Sep 2004, 277 +23 Sep 2004, 276 +24 Sep 2004, 351 +25 Sep 2004, 260 +26 Sep 2004, 235 +27 Sep 2004, 325 +28 Sep 2004, 316 +29 Sep 2004, 302 +30 Sep 2004, 268 +01 Oct 2004, 262 +02 Oct 2004, 186 +03 Oct 2004, 197 +04 Oct 2004, 292 +05 Oct 2004, 323 +06 Oct 2004, 293 +07 Oct 2004, 311 +08 Oct 2004, 333 +09 Oct 2004, 229 +10 Oct 2004, 239 +11 Oct 2004, 379 +12 Oct 2004, 340 +13 Oct 2004, 309 +14 Oct 2004, 445 +15 Oct 2004, 384 +16 Oct 2004, 210 +17 Oct 2004, 222 +18 Oct 2004, 335 +19 Oct 2004, 388 +20 Oct 2004, 368 +21 Oct 2004, 431 +22 Oct 2004, 338 +23 Oct 2004, 181 +24 Oct 2004, 168 +25 Oct 2004, 315 +26 Oct 2004, 296 +27 Oct 2004, 284 +28 Oct 2004, 307 +29 Oct 2004, 306 +30 Oct 2004, 175 +31 Oct 2004, 224 +01 Nov 2004, 270 +02 Nov 2004, 283 +03 Nov 2004, 302 +04 Nov 2004, 325 +05 Nov 2004, 323 +06 Nov 2004, 172 +07 Nov 2004, 168 +08 Nov 2004, 295 +09 Nov 2004, 351 +10 Nov 2004, 342 +11 Nov 2004, 470 +12 Nov 2004, 355 +13 Nov 2004, 218 +14 Nov 2004, 255 +15 Nov 2004, 521 +16 Nov 2004, 402 +17 Nov 2004, 335 +18 Nov 2004, 303 
+19 Nov 2004, 296 +20 Nov 2004, 284 +21 Nov 2004, 286 +22 Nov 2004, 518 +23 Nov 2004, 461 +24 Nov 2004, 325 +25 Nov 2004, 300 +26 Nov 2004, 299 +27 Nov 2004, 219 +28 Nov 2004, 200 +29 Nov 2004, 294 +30 Nov 2004, 338 +01 Dec 2004, 392 +02 Dec 2004, 408 +03 Dec 2004, 351 +04 Dec 2004, 270 +05 Dec 2004, 220 +06 Dec 2004, 222 +07 Dec 2004, 339 +08 Dec 2004, 386 +09 Dec 2004, 556 +10 Dec 2004, 1159 +11 Dec 2004, 465 +12 Dec 2004, 15 +13 Dec 2004, 467 +14 Dec 2004, 455 +15 Dec 2004, 420 +16 Dec 2004, 410 +17 Dec 2004, 404 +18 Dec 2004, 293 +19 Dec 2004, 243 +20 Dec 2004, 396 +21 Dec 2004, 339 +22 Dec 2004, 320 +23 Dec 2004, 311 +24 Dec 2004, 198 +25 Dec 2004, 191 +26 Dec 2004, 244 +27 Dec 2004, 301 +28 Dec 2004, 316 +29 Dec 2004, 326 +30 Dec 2004, 313 +31 Dec 2004, 185 +01 Jan 2005, 172 +02 Jan 2005, 229 +03 Jan 2005, 328 +04 Jan 2005, 367 +05 Jan 2005, 622 +06 Jan 2005, 321 +07 Jan 2005, 410 +08 Jan 2005, 222 +09 Jan 2005, 239 +10 Jan 2005, 415 +11 Jan 2005, 456 +12 Jan 2005, 407 +13 Jan 2005, 653 +14 Jan 2005, 438 +15 Jan 2005, 360 +16 Jan 2005, 278 +17 Jan 2005, 544 +18 Jan 2005, 453 +19 Jan 2005, 385 +20 Jan 2005, 384 +21 Jan 2005, 370 +22 Jan 2005, 292 +23 Jan 2005, 279 +24 Jan 2005, 426 +25 Jan 2005, 438 +26 Jan 2005, 234 +27 Jan 2005, 0 +28 Jan 2005, 0 +29 Jan 2005, 0 +30 Jan 2005, 0 +31 Jan 2005, 0 +01 Feb 2005, 0 +02 Feb 2005, 0 +03 Feb 2005, 0 +04 Feb 2005, 0 +05 Feb 2005, 0 +06 Feb 2005, 0 +07 Feb 2005, 0 +08 Feb 2005, 449 +09 Feb 2005, 488 +10 Feb 2005, 191 +11 Feb 2005, 0 +12 Feb 2005, 206 +13 Feb 2005, 224 +14 Feb 2005, 355 +15 Feb 2005, 463 +16 Feb 2005, 409 +17 Feb 2005, 381 +18 Feb 2005, 394 +19 Feb 2005, 313 +20 Feb 2005, 273 +21 Feb 2005, 403 +22 Feb 2005, 407 +23 Feb 2005, 392 +24 Feb 2005, 454 +25 Feb 2005, 387 +26 Feb 2005, 242 +27 Feb 2005, 241 +28 Feb 2005, 429 +01 Mar 2005, 438 +02 Mar 2005, 382 +03 Mar 2005, 420 +04 Mar 2005, 398 +05 Mar 2005, 274 +06 Mar 2005, 219 +07 Mar 2005, 436 +08 Mar 2005, 389 +09 Mar 2005, 398 +10 Mar 2005, 431 +11 Mar 
2005, 529 +12 Mar 2005, 347 +13 Mar 2005, 295 +14 Mar 2005, 492 +15 Mar 2005, 541 +16 Mar 2005, 445 +17 Mar 2005, 485 +18 Mar 2005, 425 +19 Mar 2005, 319 +20 Mar 2005, 276 +21 Mar 2005, 497 +22 Mar 2005, 484 +23 Mar 2005, 484 +24 Mar 2005, 341 +25 Mar 2005, 0 +26 Mar 2005, 0 +27 Mar 2005, 62 +28 Mar 2005, 358 +29 Mar 2005, 728 +30 Mar 2005, 656 +31 Mar 2005, 641 +01 Apr 2005, 572 +02 Apr 2005, 356 +03 Apr 2005, 329 +04 Apr 2005, 580 +05 Apr 2005, 549 +06 Apr 2005, 524 +07 Apr 2005, 552 +08 Apr 2005, 473 +09 Apr 2005, 398 +10 Apr 2005, 411 +11 Apr 2005, 703 +12 Apr 2005, 809 +13 Apr 2005, 674 +14 Apr 2005, 704 +15 Apr 2005, 646 +16 Apr 2005, 366 +17 Apr 2005, 341 +18 Apr 2005, 593 +19 Apr 2005, 600 +20 Apr 2005, 569 +21 Apr 2005, 555 +22 Apr 2005, 482 +23 Apr 2005, 368 +24 Apr 2005, 347 +25 Apr 2005, 496 +26 Apr 2005, 515 +27 Apr 2005, 558 +28 Apr 2005, 709 +29 Apr 2005, 632 +30 Apr 2005, 347 +01 May 2005, 335 +02 May 2005, 531 +03 May 2005, 543 +04 May 2005, 625 +05 May 2005, 500 +06 May 2005, 506 +07 May 2005, 330 +08 May 2005, 290 +09 May 2005, 585 +10 May 2005, 525 +11 May 2005, 593 +12 May 2005, 537 +13 May 2005, 544 +14 May 2005, 346 +15 May 2005, 352 +16 May 2005, 548 +17 May 2005, 608 +18 May 2005, 724 +19 May 2005, 604 +20 May 2005, 615 +21 May 2005, 797 +22 May 2005, 788 +23 May 2005, 1441 +24 May 2005, 1158 +25 May 2005, 1360 +26 May 2005, 915 +27 May 2005, 639 +28 May 2005, 380 +29 May 2005, 334 +30 May 2005, 553 +31 May 2005, 564 +01 Jun 2005, 795 +02 Jun 2005, 901 +03 Jun 2005, 769 +04 Jun 2005, 430 +05 Jun 2005, 50 +06 Jun 2005, 0 +07 Jun 2005, 0 +08 Jun 2005, 0 +09 Jun 2005, 0 +10 Jun 2005, 0 +11 Jun 2005, 0 +12 Jun 2005, 246 +13 Jun 2005, 678 +14 Jun 2005, 645 +15 Jun 2005, 660 +16 Jun 2005, 718 +17 Jun 2005, 702 +18 Jun 2005, 329 +19 Jun 2005, 337 +20 Jun 2005, 616 +21 Jun 2005, 662 +22 Jun 2005, 777 +23 Jun 2005, 655 +24 Jun 2005, 601 +25 Jun 2005, 383 +26 Jun 2005, 418 +27 Jun 2005, 645 +28 Jun 2005, 555 +29 Jun 2005, 616 +30 Jun 2005, 498 +01 
Jul 2005, 445 +02 Jul 2005, 383 +03 Jul 2005, 316 +04 Jul 2005, 525 +05 Jul 2005, 592 +06 Jul 2005, 368 +07 Jul 2005, 748 +08 Jul 2005, 719 +09 Jul 2005, 365 +10 Jul 2005, 387 +11 Jul 2005, 571 +12 Jul 2005, 583 +13 Jul 2005, 617 +14 Jul 2005, 642 +15 Jul 2005, 483 +16 Jul 2005, 0 +17 Jul 2005, 0 +18 Jul 2005, 580 +19 Jul 2005, 791 +20 Jul 2005, 702 +21 Jul 2005, 683 +22 Jul 2005, 663 +23 Jul 2005, 457 +24 Jul 2005, 539 +25 Jul 2005, 935 +26 Jul 2005, 862 +27 Jul 2005, 660 +28 Jul 2005, 632 +29 Jul 2005, 597 +30 Jul 2005, 455 +31 Jul 2005, 503 +01 Aug 2005, 824 +02 Aug 2005, 990 +03 Aug 2005, 971 +04 Aug 2005, 877 +05 Aug 2005, 765 +06 Aug 2005, 501 +07 Aug 2005, 643 +08 Aug 2005, 694 +09 Aug 2005, 825 +10 Aug 2005, 817 +11 Aug 2005, 865 +12 Aug 2005, 860 +13 Aug 2005, 669 +14 Aug 2005, 667 +15 Aug 2005, 769 +16 Aug 2005, 902 +17 Aug 2005, 955 +18 Aug 2005, 982 +19 Aug 2005, 973 +20 Aug 2005, 808 +21 Aug 2005, 704 +22 Aug 2005, 1051 +23 Aug 2005, 1007 +24 Aug 2005, 1201 +25 Aug 2005, 934 +26 Aug 2005, 981 +27 Aug 2005, 820 +28 Aug 2005, 781 +29 Aug 2005, 1205 +30 Aug 2005, 1245 +31 Aug 2005, 1154 +01 Sep 2005, 1192 +02 Sep 2005, 1074 +03 Sep 2005, 728 +04 Sep 2005, 633 +05 Sep 2005, 1019 +06 Sep 2005, 1089 +07 Sep 2005, 1083 +08 Sep 2005, 1061 +09 Sep 2005, 1060 +10 Sep 2005, 885 +11 Sep 2005, 781 +12 Sep 2005, 1191 +13 Sep 2005, 1067 +14 Sep 2005, 1175 +15 Sep 2005, 1274 +16 Sep 2005, 1106 +17 Sep 2005, 782 +18 Sep 2005, 834 +19 Sep 2005, 1198 +20 Sep 2005, 1218 +21 Sep 2005, 1322 +22 Sep 2005, 1151 +23 Sep 2005, 1153 +24 Sep 2005, 732 +25 Sep 2005, 954 +26 Sep 2005, 1195 +27 Sep 2005, 1267 +28 Sep 2005, 1197 +29 Sep 2005, 1106 +30 Sep 2005, 1109 +01 Oct 2005, 901 +02 Oct 2005, 744 +03 Oct 2005, 1065 +04 Oct 2005, 1393 +05 Oct 2005, 1281 +06 Oct 2005, 1355 +07 Oct 2005, 1148 +08 Oct 2005, 857 +09 Oct 2005, 812 +10 Oct 2005, 1111 +11 Oct 2005, 1160 +12 Oct 2005, 1100 +13 Oct 2005, 1157 +14 Oct 2005, 1130 +15 Oct 2005, 781 +16 Oct 2005, 878 +17 Oct 2005, 1315 +18 
Oct 2005, 1555 +19 Oct 2005, 1288 +20 Oct 2005, 1284 +21 Oct 2005, 1159 +22 Oct 2005, 791 +23 Oct 2005, 667 +24 Oct 2005, 1094 +25 Oct 2005, 1200 +26 Oct 2005, 1430 +27 Oct 2005, 1123 +28 Oct 2005, 1001 +29 Oct 2005, 654 +30 Oct 2005, 662 +31 Oct 2005, 1159 +2005/11/01, 1099 +2005/11/02, 1264 +2005/11/03, 1594 +2005/11/04, 1588 +2005/11/05, 1030 +2005/11/06, 915 +2005/11/07, 1382 +2005/11/08, 1207 +2005/11/09, 1181 +2005/11/10, 1070 +2005/11/11, 1169 +2005/11/12, 723 +2005/11/13, 785 +2005/11/14, 1244 +2005/11/15, 1386 +2005/11/16, 1380 +2005/11/17, 1306 +2005/11/18, 1210 +2005/11/19, 763 +2005/11/20, 754 +2005/11/21, 1218 +2005/11/22, 1233 +2005/11/23, 1218 +2005/11/24, 1116 +2005/11/25, 1043 +2005/11/26, 696 +2005/11/27, 713 +2005/11/28, 1205 +2005/11/29, 1144 +2005/11/30, 1143 From arigo at codespeak.net Thu Dec 1 12:59:33 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 1 Dec 2005 12:59:33 +0100 (CET) Subject: [pypy-svn] r20489 - in pypy/branch/somepbc-refactoring/pypy: annotation rpython rpython/lltypesystem Message-ID: <20051201115933.9BEED27B51@code1.codespeak.net> Author: arigo Date: Thu Dec 1 12:59:32 2005 New Revision: 20489 Modified: pypy/branch/somepbc-refactoring/pypy/annotation/classdef.py pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rpbc.py pypy/branch/somepbc-refactoring/pypy/rpython/rpbc.py Log: * fix obscure rtyper bug: a bad caching. * small unrelated fix in classdef. Modified: pypy/branch/somepbc-refactoring/pypy/annotation/classdef.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/annotation/classdef.py (original) +++ pypy/branch/somepbc-refactoring/pypy/annotation/classdef.py Thu Dec 1 12:59:32 2005 @@ -150,7 +150,6 @@ def add_source_for_attribute(self, attr, source): """Adds information about a constant source for an attribute. 
""" - sources = self.attr_sources.setdefault(attr, []) for cdef in self.getmro(): if attr in cdef.attrs: # the Attribute() exists already for this class (or a parent) @@ -165,6 +164,7 @@ return else: # remember the source in self.attr_sources + sources = self.attr_sources.setdefault(attr, []) sources.append(source) # register the source in any Attribute found in subclasses, # to restore invariant (III) Modified: pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rpbc.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rpbc.py (original) +++ pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rpbc.py Thu Dec 1 12:59:32 2005 @@ -28,10 +28,9 @@ class MultipleFrozenPBCRepr(MultiplePBCRepr): """Representation selected for multiple non-callable pre-built constants.""" - def __init__(self, rtyper, frozendescs): + def __init__(self, rtyper, access_set): self.rtyper = rtyper - self.descs = frozendescs - self.access_set = frozendescs[0].queryattrfamily() + self.access_set = access_set self.pbc_type = ForwardReference() self.lowleveltype = Ptr(self.pbc_type) self.pbc_cache = {} @@ -52,8 +51,9 @@ self.llfieldmap = llfieldmap def convert_desc(self, frozendesc): - if self.access_set is not None and frozendesc not in self.descs: - raise TyperError("not found in PBC set: %r" % (frozendesc,)) + if (self.access_set is not None and + frozendesc not in self.access_set.descs): + raise TyperError("not found in PBC access set: %r" % (frozendesc,)) try: return self.pbc_cache[frozendesc] except KeyError: Modified: pypy/branch/somepbc-refactoring/pypy/rpython/rpbc.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/rpython/rpbc.py (original) +++ pypy/branch/somepbc-refactoring/pypy/rpython/rpbc.py Thu Dec 1 12:59:32 2005 @@ -308,7 +308,8 @@ try: return rtyper.pbc_reprs[access] except KeyError: - result = 
rtyper.type_system.rpbc.MultipleFrozenPBCRepr(rtyper, descs) + result = rtyper.type_system.rpbc.MultipleFrozenPBCRepr(rtyper, + access) rtyper.pbc_reprs[access] = result rtyper.add_pendingsetup(result) return result From ericvrp at codespeak.net Thu Dec 1 13:11:34 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Thu, 1 Dec 2005 13:11:34 +0100 (CET) Subject: [pypy-svn] r20490 - in pypy/dist/pypy/translator/js: . src test Message-ID: <20051201121134.413E827B52@code1.codespeak.net> Author: ericvrp Date: Thu Dec 1 13:11:32 2005 New Revision: 20490 Modified: pypy/dist/pypy/translator/js/codewriter.py pypy/dist/pypy/translator/js/opwriter.py pypy/dist/pypy/translator/js/src/ll_stackless.js pypy/dist/pypy/translator/js/test/test_stackless.py Log: more passing stackless tests in genjs Modified: pypy/dist/pypy/translator/js/codewriter.py ============================================================================== --- pypy/dist/pypy/translator/js/codewriter.py (original) +++ pypy/dist/pypy/translator/js/codewriter.py Thu Dec 1 13:11:32 2005 @@ -41,8 +41,11 @@ else: eol = ';\n' - do_log = self.js.logging and line and not line.endswith('}') - for x in ['var', '//', 'for (;;) {', 'switch (block) {', 'slp_new_frame', 'case ', 'if ', 'elif ', 'else ', '} else ']: + do_log = self.js.logging and line and not line.endswith('}') and \ + not ' = slp_frame_stack_top.' 
in line + for x in ['var' , '//' , 'for (;;) {', 'switch (block) {', + 'slp_new_frame', 'case ', 'if ' , 'elif ' , + 'else ' , '} else ', 'slp_stack_depth' ]: if line.startswith(x): do_log = False break @@ -190,20 +193,26 @@ def neg(self, targetvar, source): self.append('%(targetvar)s = -%(source)s' % locals()) - def call(self, targetvar, functionref, argrefs, no_exception=None, exceptions=[]): + def call(self, targetvar, functionref, argrefs=[], no_exception=None, exceptions=[], specialreturnvalue=None): args = ", ".join(argrefs) if not exceptions: assert no_exception is None + if self.js.stackless: + self.append("slp_stack_depth++") self.append('%s = %s(%s)' % (targetvar, functionref, args)) if self.js.stackless: + self.append("slp_stack_depth--") selfdecl = self.decl.split('(')[0] usedvars = ', '.join(self._usedvars.keys()) self.append('if (slp_frame_stack_bottom) {') self.indent_more() self.append('slp_new_frame("%s", %s, %d, new Array(%s))' % (targetvar, selfdecl, self._resume_blocknum, usedvars)) - self.append('return') + if specialreturnvalue: + self.append("return " + specialreturnvalue) + else: + self.append('return') self.indent_less() self.append('}') self.indent_less() @@ -217,7 +226,10 @@ self.indent_more() if self.js.stackless: self.comment('TODO: XXX stackless in combination with exceptions handling') + self.append("slp_stack_depth++") self.append('%s = %s(%s)' % (targetvar, functionref, args)) + if self.js.stackless: + self.append("slp_stack_depth--") #XXX we don't actually get here when an exception occurs! 
self._phi(no_exception_exit) self._goto_block(no_exception_label) self.indent_less() @@ -252,6 +264,7 @@ self.indent_less() self.append('}') + def cast(self, targetvar, fromtype, fromvar, targettype): if fromtype == 'void' and targettype == 'void': return Modified: pypy/dist/pypy/translator/js/opwriter.py ============================================================================== --- pypy/dist/pypy/translator/js/opwriter.py (original) +++ pypy/dist/pypy/translator/js/opwriter.py Thu Dec 1 13:11:32 2005 @@ -124,7 +124,7 @@ self.codewriter.cast(targetvar, mult_type, res_val, mult_type) def _skipped(self, op): - #self.codewriter.comment('Skipping operation %s()' % op.opname) + self.codewriter.comment('Skipping operation %s()' % op.opname) pass keepalive = _skipped @@ -389,4 +389,8 @@ #Stackless def yield_current_frame_to_caller(self, op): - self._skipped(op) + '''special handling of this operation: call stack_unwind() to force the + current frame to be saved into the heap, but don't propagate the + unwind -- instead, capture it and return it normally''' + targetvar = self.db.repr_arg(op.result) + self.codewriter.call(targetvar, "ll_stack_unwind", specialreturnvalue="slp_return_current_frame_to_caller()") Modified: pypy/dist/pypy/translator/js/src/ll_stackless.js ============================================================================== --- pypy/dist/pypy/translator/js/src/ll_stackless.js (original) +++ pypy/dist/pypy/translator/js/src/ll_stackless.js Thu Dec 1 13:11:32 2005 @@ -3,6 +3,7 @@ var slp_frame_stack_top = null; var slp_frame_stack_bottom = null; var slp_return_value = undefined; +var slp_stack_depth = 0; // This gets called with --log @@ -10,7 +11,7 @@ try { alert(s); // in browser } catch (e) { - print('log:' + s); // commandline + print('log: ' + s); // commandline } } @@ -22,152 +23,120 @@ return s } -// +// example function for testing + +function ll_stackless_stack_frames_depth() { + if (!slp_frame_stack_top) { + 
LOG("ll_stackless_stack_frames_depth init"); + slp_frame_stack_top = slp_frame_stack_bottom = slp_new_frame_simple(ll_stackless_stack_frames_depth); + return; + } -function ll_stack_too_big_helper(depth) { - if (depth > 0) { - ll_stack_too_big_helper(depth-1) + LOG("ll_stackless_stack_frames_depth resume"); + var f = slp_frame_stack_top; + slp_frame_stack_top = null; + for (var result = 0;f;result++) { + f = f.f_back; } -} + return result; +} + +// function ll_stack_too_big() { - try { - ll_stack_too_big_helper(5); // some magic number that seems to work - } catch (e) { //stack overflow when recursing some more - LOG("stack is too big") - return true; - } - LOG("stack is not too big yet") - return false; -} + var result = slp_stack_depth > 500; // Firefox has a recursion limit of 1000 (others allow more) + LOG("ll_stack_to_big result=" + result); + return result; +} function slp_new_frame(targetvar, func, resume_blocknum, vars) { - LOG("starting slp_new_frame("+targetvar+","+function_name(func)+","+resume_blocknum+","+vars.toSource()+")"); + //LOG("slp_new_frame("+targetvar+","+function_name(func)+","+resume_blocknum+","+vars.toSource()+")"); + LOG("slp_new_frame("+function_name(func)+")"); var f = new Object(); f.func = func; f.targetvar = targetvar; f.resume_blocknum = resume_blocknum; f.vars = vars; f.f_back = null; - // push below current bottom so after unwinding the current stack - // the slp_frame_stack will be correctly sorted - slp_frame_stack_bottom.f_back = f; - slp_frame_stack_bottom = f; - LOG("finished slp_new_frame"); + slp_frame_stack_bottom.f_back = f; // push below bottom, to keep stack + slp_frame_stack_bottom = f; // correctly sorted after unwind } function slp_new_frame_simple(func) { - LOG("starting slp_new_frame_simple("+function_name(func)+")"); + LOG("slp_new_frame_simple("+function_name(func)+")"); var f = new Object(); f.func = func; f.targetvar = undefined; f.resume_blocknum = undefined; f.vars = undefined; f.f_back = null; - 
LOG("finished slp_new_frame_simple"); return f; // note: the non-simple version returns nothing } -// - function ll_stack_unwind() { - LOG("starting ll_stackless_stack_unwind"); + LOG("ll_stack_unwind"); if (slp_frame_stack_top) { - slp_frame_stack_top = null; // no need to resume + slp_frame_stack_top = null; } else { slp_frame_stack_top = slp_frame_stack_bottom = slp_new_frame_simple(ll_stack_unwind); } - LOG("finished ll_stackless_stack_unwind"); + LOG('slp_frame_stack_top='+slp_frame_stack_top + ', slp_frame_stack_bottom='+slp_frame_stack_bottom) + return null; } -// // ll_stack_unwind = ll_stackless_stack_unwind; // alias (XXX really need both?) -// ll_stackless_stack_unwind = ll_stack_unwind; // alias (XXX really need both?) function slp_return_current_frame_to_caller() { - LOG("starting slp_return_current_frame_to_caller"); - if (!slp_frame_stack_top) alert('!slp_frame_stack_top'); - if (!slp_frame_stack_bottom) alert('!slp_frame_stack_bottom'); + LOG("slp_return_current_frame_to_caller"); + if (!slp_frame_stack_top) log('!slp_frame_stack_top'); + if (!slp_frame_stack_bottom) log('!slp_frame_stack_bottom'); var result = slp_frame_stack_top; - slp_frame_stack_bottom.f_back = slp_new_frame_simple(slp_return_current_frame_to_caller); + slp_frame_stack_bottom.f_back = slp_new_frame_simple(slp_end_of_yielding_function); //special case! 
slp_frame_stack_top = slp_frame_stack_bottom = null; // stop unwinding - LOG("finished slp_return_current_frame_to_caller"); return result; } function slp_end_of_yielding_function() { - LOG("starting slp_end_of_yielding_function"); - if (!slp_frame_stack_top) alert('!slp_frame_stack_top'); // can only resume from slp_return_current_frame_to_caller() - if (!slp_return_value) alert('!slp_return_value'); + LOG("slp_end_of_yielding_function"); + if (!slp_frame_stack_top) log('slp_end_of_yielding_function !slp_frame_stack_top'); // can only resume from slp_return_current_frame_to_caller() + if (!slp_return_value) log('slp_end_of_yielding_function !slp_return_value'); slp_frame_stack_top = slp_return_value; - LOG("finished slp_end_of_yielding_function"); - return null; // XXX or just return? + return null; } -function ll_stackless_switch(c) { - LOG("starting ll_stackless_switch"); +function ll_stackless_switch__frame_stack_topPtr(c) { + LOG("ll_stackless_switch__frame_stack_topPtr"); var f; var result; if (slp_frame_stack_top) { //resume - LOG("slp_frame_stack_top != null"); - // ready to do the switch. The current (old) frame_stack_top is - // f.f_back, which we store where it will be found immediately - // after the switch + LOG("slp_frame_stack_top != null, SWITCH"); + // ready to do the switch. 
The current (old) frame_stack_top is f.f_back, + // which we store where it will be found immediately after the switch f = slp_frame_stack_top; result = f.f_back; // grab the saved value of 'c' and do the switch slp_frame_stack_top = f.p0; - LOG("finished ll_stackless_switch"); return result; } LOG("slp_frame_stack_top == null"); // first, unwind the current stack - f = slp_new_frame_simple(ll_stackless_switch); + f = slp_new_frame_simple(ll_stackless_switch__frame_stack_topPtr); f.p0 = c; slp_frame_stack_top = slp_frame_stack_bottom = f; - LOG("finished ll_stackless_switch"); - return null; -} -ll_stackless_switch__frame_stack_topPtr = ll_stackless_switch; // alias (XXX really need both?) - -// - -// example function for testing - -function ll_stackless_stack_frames_depth() { - if (!slp_frame_stack_top) { - LOG("starting ll_stackless_stack_frames_depth init"); - slp_frame_stack_top = slp_frame_stack_bottom = slp_new_frame_simple(ll_stackless_stack_frames_depth); - LOG("finished ll_stackless_stack_frames_depth init"); - return; - } - - LOG("starting ll_stackless_stack_frames_depth resume"); - var f = slp_frame_stack_top; - slp_frame_stack_top = null; - for (var result = 0;f;result++) { - f = f.f_back; - } - LOG("stack_frames_depth = " + result); - LOG("finished ll_stackless_stack_frames_depth resume"); - return result; } // main dispatcher loop function slp_main_loop() { var f_back; - LOG("starting slp_main_loop"); while (true) { - LOG("slp_main_loop (outer loop)"); - slp_frame_stack_bottom = null; pending = slp_frame_stack_top; while (true) { - LOG("slp_main_loop (inner loop)"); f_back = pending.f_back; LOG('calling: ' + function_name(pending.func)); + slp_stack_depth = 0; // we are restarting to recurse slp_return_value = pending.func(); // params get initialized in the function because it's a resume! 
if (slp_frame_stack_top) { break; @@ -180,26 +149,19 @@ } if (slp_frame_stack_bottom) { // returning from switch() - if (slp_frame_stack_bottom.f_back) alert('slp_frame_stack_bottom.f_back'); + if (slp_frame_stack_bottom.f_back) log('slp_frame_stack_bottom.f_back'); slp_frame_stack_bottom.f_back = f_back; } } - LOG("finished slp_main_loop"); } function slp_entry_point(funcstring) { - LOG("starting slp_standalone_entry_point"); + slp_stack_depth = 0; /// initial stack depth var result = eval(funcstring); - LOG("RESULT = " + result); - LOG("slp_frame_stack_bottom = " + slp_frame_stack_bottom); - if (slp_frame_stack_bottom) { - // if the stack unwound we need to run the dispatch loop - // to retrieve the actual result + if (slp_frame_stack_bottom) { // get with dispatch loop when stack unwound slp_main_loop(); result = slp_return_value; } - LOG("FINAL RESULT = " + result); - LOG("finished slp_standalone_entry_point"); return result; } Modified: pypy/dist/pypy/translator/js/test/test_stackless.py ============================================================================== --- pypy/dist/pypy/translator/js/test/test_stackless.py (original) +++ pypy/dist/pypy/translator/js/test/test_stackless.py Thu Dec 1 13:11:32 2005 @@ -2,6 +2,7 @@ from pypy.rpython.rstack import stack_unwind, stack_frames_depth, stack_too_big from pypy.rpython.rstack import yield_current_frame_to_caller +from pypy.rpython.lltypesystem import lltype from pypy.translator.js.test.runtest import compile_function from pypy.translator.js import conftest @@ -12,9 +13,6 @@ # ____________________________________________________________ def test_stack_depth(): - if not conftest.option.jsstackless: - py.test.skip("stackless disabled (enable with py.test --stackless)") - def g1(): "just to check Void special cases around the code" def g2(ignored): @@ -37,9 +35,6 @@ assert data.strip() == '10' def test_stack_withptr(): - if not conftest.option.jsstackless: - py.test.skip("stackless disabled (enable with py.test 
--stackless)") - def f(n): if n > 0: res = f(n-1) @@ -56,9 +51,6 @@ assert data.strip() == '10' def test_stackless_manytimes(): - if not conftest.option.jsstackless: - py.test.skip("stackless disabled (enable with py.test --stackless)") - def f(n): if n > 0: stack_frames_depth() @@ -76,9 +68,7 @@ assert data.strip() == '100' def test_stackless_arguments(): - if not conftest.option.jsstackless: - py.test.skip("stackless disabled (enable with py.test --stackless)") - py.test.skip("stackless feature not incomplete") + py.test.skip("stackless feature incomplete (empty Object mallocs)") def f(n, d, t): if n > 0: @@ -97,10 +87,6 @@ def test_stack_too_big(): - if not conftest.option.jsstackless: - py.test.skip("stackless disabled (enable with py.test --stackless)") - #py.test.skip("stackless feature not incomplete") - def f1(): return stack_too_big() def f2(): @@ -121,13 +107,10 @@ def fn(): return f(0) data = wrap_stackless_function(fn) - assert int(data.strip()) > 500 + assert int(data.strip()) == 494 def test_stack_unwind(): - if not conftest.option.jsstackless: - py.test.skip("stackless disabled (enable with py.test --stackless)") - def f(): stack_unwind() return 42 @@ -136,24 +119,17 @@ assert int(data.strip()) == 42 def test_auto_stack_unwind(): - if not conftest.option.jsstackless: - py.test.skip("stackless disabled (enable with py.test --stackless)") - py.test.skip("stackless feature not incomplete") - def f(n): if n == 1: return 1 return (n+f(n-1)) % 1291 def fn(): - return f(10**6) + return f(10**4) data = wrap_stackless_function(fn) - assert int(data.strip()) == 704 - + assert int(data.strip()) == 697 #10**4==697(6seconds, 10**5==545(45seconds) -def test_yield_frame(): - if not conftest.option.jsstackless: - py.test.skip("stackless disabled (enable with py.test --stackless)") +def test_yield_frame1(): py.test.skip("stackless feature not incomplete") def g(lst): @@ -172,7 +148,7 @@ lst.append(5) frametop_after_return = frametop_before_6.switch() lst.append(7) - 
assert frametop_after_return is None + #assert frametop_after_return is None n = 0 for i in lst: n = n*10 + i @@ -180,3 +156,31 @@ data = wrap_stackless_function(f) assert int(data.strip()) == 1234567 + +def test_yield_frame2(): + py.test.skip("stackless feature incomplete (exception handling?)") + + S = lltype.GcStruct("base", ('a', lltype.Signed)) + s = lltype.malloc(S) + + def g(x): + x.a <<= 2 + frametop_before_5 = yield_current_frame_to_caller() + x.a <<= 4 + frametop_before_7 = frametop_before_5.switch() + x.a <<= 6 + return frametop_before_7 + + def f(): + s.a = 1 + frametop_before_4 = g(s) + s.a += 3 + frametop_before_6 = frametop_before_4.switch() + s.a += 5 + frametop_after_return = frametop_before_6.switch() + s.a += 7 + #assert frametop_after_return is None + return s.a + + data = wrap_stackless_function(f) + assert int(data.strip()) == 7495 From arigo at codespeak.net Thu Dec 1 13:15:53 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 1 Dec 2005 13:15:53 +0100 (CET) Subject: [pypy-svn] r20491 - in pypy/branch/somepbc-refactoring/pypy/rpython: lltypesystem test Message-ID: <20051201121553.6D3B027B52@code1.codespeak.net> Author: arigo Date: Thu Dec 1 13:15:52 2005 New Revision: 20491 Modified: pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rclass.py pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rclass.py Log: (mwh, pedronis, arigo) * Made the test actually test something (magic flow space again) * Fix for the test: when a class attribute is read directly from a family of classes, put it in the vtable for exactly these classes (not subclasses). 
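The class-attribute pattern that r20491 fixes can be illustrated in plain Python. The sketch below is modeled on the new test in the diff (class names and values taken from it); it is a standalone illustration, not the RPython test itself. The point is that the lookup only ever reaches A and C (the "access set"), so only those classes, and not every subclass, need the method in their vtables:

```python
# Standalone sketch of the pattern the fixed test exercises: a method is
# read off one of several classes (the family {A, C}), then called with an
# explicit instance argument.  B.meth is never looked up this way.

class A(object):
    def meth(self):
        return self.value + 1

class B(A):
    def meth(self):
        return self.value + 2   # present in the hierarchy, but unreachable below

class C(B):
    def meth(self):
        return self.value - 1

def pick_class(i):
    # only A and C belong to the family of classes seen by the lookup
    return A if i > 0 else C

def f(i):
    meth = pick_class(i).meth   # class attribute read from a family of classes
    x = C()
    x.value = 12
    return meth(x)              # calls A.meth or C.meth, ignoring B.meth
```

Per the `rclass.py` hunk, the RTyper now skips classes whose descriptor is not in the access set (`if rsubcls.classdef.classdesc not in access_set.descs: continue`), instead of installing the PBC attribute in all subclasses.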
Modified: pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rclass.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rclass.py (original) +++ pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rclass.py Thu Dec 1 13:15:52 2005 @@ -200,8 +200,10 @@ value = rsubcls.classdef.classdesc.read_attribute(fldname, None) if value is not None: assign(mangled_name, value) - # extra PBC attributes # xxx couldn't they be implemented as regular readonly attrs? + # extra PBC attributes for (access_set, attr), (mangled_name, r) in self.pbcfields.items(): + if rsubcls.classdef.classdesc not in access_set.descs: + continue # only for the classes in the same pbc access set if r.lowleveltype is Void: continue attrvalue = rsubcls.classdef.classdesc.read_attribute(attr, None) Modified: pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rclass.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rclass.py (original) +++ pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rclass.py Thu Dec 1 13:15:52 2005 @@ -342,12 +342,13 @@ class C(B): def meth(self): return self.value - 1 - def f(i): + def pick_class(i): if i > 0: - cls = A + return A else: - cls = C - meth = cls.meth + return C + def f(i): + meth = pick_class(i).meth x = C() x.value = 12 return meth(x) # calls A.meth or C.meth, completely ignores B.meth From arigo at codespeak.net Thu Dec 1 14:40:17 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 1 Dec 2005 14:40:17 +0100 (CET) Subject: [pypy-svn] r20493 - pypy/branch/somepbc-refactoring/pypy/translator/test Message-ID: <20051201134017.6FBC927B52@code1.codespeak.net> Author: arigo Date: Thu Dec 1 14:40:16 2005 New Revision: 20493 Modified: pypy/branch/somepbc-refactoring/pypy/translator/test/test_annrpython.py Log: a failing annotation test.
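The failing annotation test added in r20493 (shown in the next diff) is about a bound method that exists as a prebuilt constant. In plain Python the same shape is runnable as-is; the code below follows the test's names:

```python
class C(object):
    def __init__(self, value):
        self.value = value
    def meth(self):
        return self.value

meth = C(1).meth      # a prebuilt *bound* method: a function plus its instance

def f():
    return meth()     # annotating f requires knowing about meth's instance

# a bound method keeps a reference to the instance it was taken from
assert meth.__self__.value == 1
```

The annotator must account for the `C(1)` instance even though it is only reachable through the bound method; that is exactly the case r20494 below handles.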
Modified: pypy/branch/somepbc-refactoring/pypy/translator/test/test_annrpython.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/translator/test/test_annrpython.py (original) +++ pypy/branch/somepbc-refactoring/pypy/translator/test/test_annrpython.py Thu Dec 1 14:40:16 2005 @@ -1871,6 +1871,19 @@ assert s.items[0].knowntype == int assert s.items[1].knowntype == str + def test_constant_bound_method(self): + class C: + def __init__(self, value): + self.value = value + def meth(self): + return self.value + meth = C(1).meth + def f(): + return meth() + a = self.RPythonAnnotator() + s = a.build_types(f, []) + assert s.knowntype == int + def g(n): return [0,1,2,n] From mwh at codespeak.net Thu Dec 1 14:58:57 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Thu, 1 Dec 2005 14:58:57 +0100 (CET) Subject: [pypy-svn] r20494 - pypy/branch/somepbc-refactoring/pypy/annotation Message-ID: <20051201135857.E1B5127B53@code1.codespeak.net> Author: mwh Date: Thu Dec 1 14:58:56 2005 New Revision: 20494 Modified: pypy/branch/somepbc-refactoring/pypy/annotation/bookkeeper.py Log: Create InstanceSources for instances that are only seen as the im_self of a bound method. Fixes the new test in test_annrpython.py. 
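The shape of the `see_mutable` refactoring in r20494 can be sketched with a stripped-down, hypothetical bookkeeper. The names `attr_sources` and `Obj` are inventions for this example; the real method also creates an `InstanceSource` per attribute and can trigger reflowing:

```python
class Bookkeeper(object):
    def __init__(self):
        self.seen_mutable = {}     # memo of prebuilt instances already registered
        self.attr_sources = []     # stand-in for per-classdef attribute sources

    def see_mutable(self, x):
        if x in self.seen_mutable:
            return                 # already seen: avoids circular reflowing
        self.seen_mutable[x] = True
        for attr in x.__dict__:
            self.attr_sources.append((x, attr))

class Obj(object):
    pass

bk = Bookkeeper()
o = Obj()
o.a = 1
bk.see_mutable(o)   # registers the instance and its 'a' attribute once
bk.see_mutable(o)   # seen again (e.g. as the im_self of a bound method): no-op
```

Factoring this into one memoized helper lets both `immutablevalue()` and the bound-method (`im_self`) path in the diff share the same registration.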
Modified: pypy/branch/somepbc-refactoring/pypy/annotation/bookkeeper.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/annotation/bookkeeper.py (original) +++ pypy/branch/somepbc-refactoring/pypy/annotation/bookkeeper.py Thu Dec 1 14:58:56 2005 @@ -377,15 +377,8 @@ if frozen: result = SomePBC([self.getdesc(x)]) else: - clsdef = self.getuniqueclassdef(x.__class__) - if x not in self.seen_mutable: # avoid circular reflowing, - # see for example test_circular_mutable_getattr - self.seen_mutable[x] = True - self.event('mutable', x) - source = InstanceSource(self, x) - for attr in x.__dict__: - clsdef.add_source_for_attribute(attr, source) # can trigger reflowing - result = SomeInstance(clsdef) + self.see_mutable(x) + result = SomeInstance(self.getuniqueclassdef(x.__class__)) elif x is None: return s_None else: @@ -420,6 +413,7 @@ self.getdesc(pyobj.im_self)) # frozendesc else: # regular method origincls, name = origin_of_meth(pyobj) + self.see_mutable(pyobj.im_self) result = self.getmethoddesc( self.getdesc(pyobj.im_func), # funcdesc self.getuniqueclassdef(origincls), # originclassdef @@ -449,6 +443,16 @@ self.methoddescs[key] = result return result + def see_mutable(self, x): + if x in self.seen_mutable: + return + clsdef = self.getuniqueclassdef(x.__class__) + self.seen_mutable[x] = True + self.event('mutable', x) + source = InstanceSource(self, x) + for attr in x.__dict__: + clsdef.add_source_for_attribute(attr, source) # can trigger reflowing + def valueoftype(self, t): """The most precise SomeValue instance that contains all objects of type t.""" From arigo at codespeak.net Thu Dec 1 14:59:05 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 1 Dec 2005 14:59:05 +0100 (CET) Subject: [pypy-svn] r20495 - pypy/branch/somepbc-refactoring/pypy/rpython Message-ID: <20051201135905.3DB6327B58@code1.codespeak.net> Author: arigo Date: Thu Dec 1 14:59:04 2005 New Revision: 20495 
Modified: pypy/branch/somepbc-refactoring/pypy/rpython/llinterp.py pypy/branch/somepbc-refactoring/pypy/rpython/rbool.py pypy/branch/somepbc-refactoring/pypy/rpython/rfloat.py Log: (pedronis, arigo) We need conversions in the float -> int -> bool direction too. Mostly untested, we'll expode when we *really* need all of them :-( Plus, cast_float_to_uint is a new operation probably not supported by the back-ends. Modified: pypy/branch/somepbc-refactoring/pypy/rpython/llinterp.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/rpython/llinterp.py (original) +++ pypy/branch/somepbc-refactoring/pypy/rpython/llinterp.py Thu Dec 1 14:59:04 2005 @@ -411,7 +411,11 @@ def op_cast_float_to_int(self, f): assert type(f) is float return ovfcheck(int(f)) - + + def op_cast_float_to_uint(self, f): + assert type(f) is float + return r_uint(int(f)) + def op_cast_char_to_int(self, b): assert type(b) is str and len(b) == 1 return ord(b) Modified: pypy/branch/somepbc-refactoring/pypy/rpython/rbool.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/rpython/rbool.py (original) +++ pypy/branch/somepbc-refactoring/pypy/rpython/rbool.py Thu Dec 1 14:59:04 2005 @@ -51,6 +51,16 @@ return llops.genop('cast_bool_to_int', [v], resulttype=Signed) return NotImplemented +class __extend__(pairtype(IntegerRepr, BoolRepr)): + def convert_from_to((r_from, r_to), v, llops): + if r_from.lowleveltype == Unsigned and r_to.lowleveltype == Bool: + log.debug('explicit cast_uint_to_bool') + return llops.genop('uint_is_true', [v], resulttype=Bool) + if r_from.lowleveltype == Signed and r_to.lowleveltype == Bool: + log.debug('explicit cast_int_to_bool') + return llops.genop('int_is_true', [v], resulttype=Bool) + return NotImplemented + class __extend__(pairtype(PyObjRepr, BoolRepr)): def convert_from_to((r_from, r_to), v, llops): if r_to.lowleveltype == Bool: Modified: 
pypy/branch/somepbc-refactoring/pypy/rpython/rfloat.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/rpython/rfloat.py (original) +++ pypy/branch/somepbc-refactoring/pypy/rpython/rfloat.py Thu Dec 1 14:59:04 2005 @@ -147,6 +147,16 @@ return llops.genop('cast_int_to_float', [v], resulttype=Float) return NotImplemented +class __extend__(pairtype(FloatRepr, IntegerRepr)): + def convert_from_to((r_from, r_to), v, llops): + if r_from.lowleveltype == Float and r_to.lowleveltype == Unsigned: + log.debug('explicit cast_float_to_uint') + return llops.genop('cast_float_to_uint', [v], resulttype=Unsigned) + if r_from.lowleveltype == Float and r_to.lowleveltype == Signed: + log.debug('explicit cast_float_to_int') + return llops.genop('cast_float_to_int', [v], resulttype=Signed) + return NotImplemented + class __extend__(pairtype(BoolRepr, FloatRepr)): def convert_from_to((r_from, r_to), v, llops): if r_from.lowleveltype == Bool and r_to.lowleveltype == Float: @@ -154,6 +164,13 @@ return llops.genop('cast_bool_to_float', [v], resulttype=Float) return NotImplemented +class __extend__(pairtype(FloatRepr, BoolRepr)): + def convert_from_to((r_from, r_to), v, llops): + if r_from.lowleveltype == Float and r_to.lowleveltype == Bool: + log.debug('explicit cast_float_to_bool') + return llops.genop('float_is_true', [v], resulttype=Bool) + return NotImplemented + class __extend__(pairtype(PyObjRepr, FloatRepr)): def convert_from_to((r_from, r_to), v, llops): if r_to.lowleveltype == Float: From arigo at codespeak.net Thu Dec 1 15:00:29 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 1 Dec 2005 15:00:29 +0100 (CET) Subject: [pypy-svn] r20496 - in pypy/branch/somepbc-refactoring/pypy/rpython: . 
lltypesystem test Message-ID: <20051201140029.645BC27B57@code1.codespeak.net> Author: arigo Date: Thu Dec 1 15:00:28 2005 New Revision: 20496 Modified: pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rclass.py pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rpbc.py pypy/branch/somepbc-refactoring/pypy/rpython/rpbc.py pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rpbc.py Log: (pedronis, arigo) Tests and fixes for the now-more-precise getattr() on classes and frozen PBCs. Modified: pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rclass.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rclass.py (original) +++ pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rclass.py Thu Dec 1 15:00:28 2005 @@ -219,6 +219,7 @@ def fromtypeptr(self, vcls, llops): """Return the type pointer cast to self's vtable type.""" + self.setup() castable(self.lowleveltype, vcls.concretetype) # sanity check return llops.genop('cast_pointer', [vcls], resulttype=self.lowleveltype) Modified: pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rpbc.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rpbc.py (original) +++ pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/rpbc.py Thu Dec 1 15:00:28 2005 @@ -83,7 +83,9 @@ def rtype_getattr(self, hop): attr = hop.args_s[1].const vpbc, vattr = hop.inputargs(self, Void) - return self.getfield(vpbc, attr, hop.llops) + v_res = self.getfield(vpbc, attr, hop.llops) + mangled_name, r_res = self.llfieldmap[attr] + return hop.llops.convertvar(v_res, r_res, hop.r_result) def getfield(self, vpbc, attr, llops): mangled_name, r_value = self.llfieldmap[attr] @@ -128,17 +130,17 @@ raise TyperError("unsupported: variable of type " "method-of-frozen-PBC or None") - im_selves = {} + im_selves = [] for desc in 
s_pbc.descriptions: assert desc.funcdesc is self.funcdesc - im_selves[desc.frozendesc] = True + im_selves.append(desc.frozendesc) self.s_im_self = annmodel.SomePBC(im_selves) self.r_im_self = rtyper.getrepr(self.s_im_self) self.lowleveltype = self.r_im_self.lowleveltype def get_s_callable(self): - return annmodel.SomePBC({self.funcdesc: True}) + return annmodel.SomePBC([self.funcdesc]) def get_r_implfunc(self): r_func = self.rtyper.getrepr(self.get_s_callable()) @@ -285,24 +287,6 @@ hop2.dispatch() return v_instance - -class __extend__(pairtype(ClassesPBCRepr, rclass.AbstractClassRepr)): - def convert_from_to((r_clspbc, r_cls), v, llops): - if r_cls.lowleveltype != r_clspbc.lowleveltype: - return NotImplemented # good enough for now - return v - -class __extend__(pairtype(ClassesPBCRepr, ClassesPBCRepr)): - def convert_from_to((r_clspbc1, r_clspbc2), v, llops): - # this check makes sense because both source and dest repr are ClassesPBCRepr - if r_clspbc1.lowleveltype == r_clspbc2.lowleveltype: - return v - if r_clspbc1.lowleveltype is Void: - return inputconst(r_clspbc2, r_clspbc1.s_pbc.const) - return NotImplemented - - - # ____________________________________________________________ ##def rtype_call_memo(hop): Modified: pypy/branch/somepbc-refactoring/pypy/rpython/rpbc.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/rpython/rpbc.py (original) +++ pypy/branch/somepbc-refactoring/pypy/rpython/rpbc.py Thu Dec 1 15:00:28 2005 @@ -413,19 +413,28 @@ return hop.inputconst(hop.r_result, hop.s_result.const) else: attr = hop.args_s[1].const - vcls, vattr = hop.inputargs(self, Void) - return self.getfield(vcls, attr, hop.llops) - - def getfield(self, vcls, attr, llops): - access_set = self.get_access_set() - class_repr = self.get_class_repr() - return class_repr.getpbcfield(vcls, access_set, attr, llops) + access_set = self.get_access_set() + class_repr = self.get_class_repr() + vcls, vattr = 
hop.inputargs(class_repr, Void) + v_res = class_repr.getpbcfield(vcls, access_set, attr, hop.llops) + s_res = access_set.attrs[attr] + r_res = self.rtyper.getrepr(s_res) + return hop.llops.convertvar(v_res, r_res, hop.r_result) class __extend__(pairtype(AbstractClassesPBCRepr, rclass.AbstractClassRepr)): def convert_from_to((r_clspbc, r_cls), v, llops): - if r_cls.lowleveltype != r_clspbc.lowleveltype: - return NotImplemented # good enough for now - return v + # turn a PBC of classes to a standard pointer-to-vtable class repr + if r_clspbc.lowleveltype == r_cls.lowleveltype: + return v + if r_clspbc.lowleveltype is Void: + return inputconst(r_cls, r_clspbc.s_pbc.const) + # convert from ptr-to-object-vtable to ptr-to-more-precise-vtable + # but first check if it is safe + assert (r_clspbc.lowleveltype == + r_clspbc.rtyper.type_system.rclass.CLASSTYPE) + if not r_clspbc.get_class_repr().classdef.issubclass(r_cls.classdef): + return NotImplemented + return r_cls.fromtypeptr(v, llops) class __extend__(pairtype(AbstractClassesPBCRepr, AbstractClassesPBCRepr)): def convert_from_to((r_clspbc1, r_clspbc2), v, llops): Modified: pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rpbc.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rpbc.py (original) +++ pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rpbc.py Thu Dec 1 15:00:28 2005 @@ -1112,3 +1112,60 @@ assert res == 184 res = interpret(f, [3, 100]) assert res == -1 + +def test_pbc_getattr_conversion(): + fr1 = Freezing() + fr2 = Freezing() + fr3 = Freezing() + fr1.value = 10 + fr2.value = 5 + fr3.value = 2.5 + def pick12(i): + if i > 0: + return fr1 + else: + return fr2 + def pick23(i): + if i > 5: + return fr2 + else: + return fr3 + def f(i): + x = pick12(i) + y = pick23(i) + return x.value, y.value + for i in [0, 5, 10]: + res = interpret(f, [i]) + assert type(res.item0) is int # precise + assert type(res.item1) is 
float + assert res.item0 == f(i)[0] + assert res.item1 == f(i)[1] + +def test_pbc_getattr_conversion_with_classes(): + class base: pass + class fr1(base): pass + class fr2(base): pass + class fr3(base): pass + fr1.value = 10 + fr2.value = 5 + fr3.value = 2.5 + def pick12(i): + if i > 0: + return fr1 + else: + return fr2 + def pick23(i): + if i > 5: + return fr2 + else: + return fr3 + def f(i): + x = pick12(i) + y = pick23(i) + return x.value, y.value + for i in [0, 5, 10]: + res = interpret(f, [i]) + assert type(res.item0) is int # precise + assert type(res.item1) is float + assert res.item0 == f(i)[0] + assert res.item1 == f(i)[1] From arigo at codespeak.net Thu Dec 1 15:05:10 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 1 Dec 2005 15:05:10 +0100 (CET) Subject: [pypy-svn] r20497 - pypy/branch/somepbc-refactoring/pypy/annotation Message-ID: <20051201140510.105A527B53@code1.codespeak.net> Author: arigo Date: Thu Dec 1 15:05:09 2005 New Revision: 20497 Modified: pypy/branch/somepbc-refactoring/pypy/annotation/bookkeeper.py Log: (mwh) Oups, this was suppose to go before r20496 :) Be more precise about annotating the result of a getattr operation on a pbc (makes most difference when the PBC is a constant). 
Modified: pypy/branch/somepbc-refactoring/pypy/annotation/bookkeeper.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/annotation/bookkeeper.py (original) +++ pypy/branch/somepbc-refactoring/pypy/annotation/bookkeeper.py Thu Dec 1 15:05:09 2005 @@ -500,11 +500,12 @@ attrfamily.read_locations[position] = True actuals = [] - for desc in attrfamily.descs: + for desc in descs: actuals.append(desc.s_read_attribute(attr)) s_result = unionof(*actuals) - attrfamily.attrs[attr] = s_result + attrfamily.attrs[attr] = unionof(s_result, + attrfamily.attrs.get(attr, s_ImpossibleValue)) if change: for position in attrfamily.read_locations: From cfbolz at codespeak.net Thu Dec 1 15:18:32 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 1 Dec 2005 15:18:32 +0100 (CET) Subject: [pypy-svn] r20498 - pypy/extradoc/talk/22c3 Message-ID: <20051201141832.CD10727B5E@code1.codespeak.net> Author: cfbolz Date: Thu Dec 1 15:18:28 2005 New Revision: 20498 Modified: pypy/extradoc/talk/22c3/techpaper.pdf pypy/extradoc/talk/22c3/techpaper.sty pypy/extradoc/talk/22c3/techpaper.txt Log: add GmbH to merlinux. remove -- since rest escapes then. 
Modified: pypy/extradoc/talk/22c3/techpaper.pdf ============================================================================== Files pypy/extradoc/talk/22c3/techpaper.pdf (original) and pypy/extradoc/talk/22c3/techpaper.pdf Thu Dec 1 15:18:28 2005 differ Modified: pypy/extradoc/talk/22c3/techpaper.sty ============================================================================== --- pypy/extradoc/talk/22c3/techpaper.sty (original) +++ pypy/extradoc/talk/22c3/techpaper.sty Thu Dec 1 15:18:28 2005 @@ -1,4 +1,4 @@ -\author {Carl Friedrich Bolz\\merlinux\\\texttt{cfbolz at gmx.de} \and - Holger Krekel\\merlinux\\\texttt{hpk at merlinux.de} \and - Armin Rigo\\Heinrich Heine Universit\"at D\"usseldorf\\\texttt{arigo at tunes.org}} +\author {Carl Friedrich Bolz\\merlinux GmbH\\\texttt{cfbolz at gmx.de} \and + Holger Krekel\\merlinux GmbH\\\texttt{hpk at merlinux.de} \and + Armin Rigo\\Heinrich-Heine-Universit\"at D\"usseldorf\\\texttt{arigo at tunes.org}} \date{} Modified: pypy/extradoc/talk/22c3/techpaper.txt ============================================================================== --- pypy/extradoc/talk/22c3/techpaper.txt (original) +++ pypy/extradoc/talk/22c3/techpaper.txt Thu Dec 1 15:18:28 2005 @@ -106,7 +106,7 @@ * we can tweak the translation process to produce low-level code based on different models and tradeoffs. -By contrast, a standardized target environment -- say .NET -- +By contrast, a standardized target environment - say .NET - enforces ``m=1`` as far as it is concerned. This helps making ``o`` a bit smaller by providing a higher-level base to build upon. Still, we believe that enforcing the use of one common environment @@ -116,7 +116,7 @@ This is the *meta-goal*; a more concrete goal worth mentioning at this point is that language specifications can be used to generate cool stuff -in addition to traditional interpreters -- e.g. Just-In-Time compilers. +in addition to traditional interpreters - e.g. Just-In-Time compilers. .. 
[#] http://llvm.cs.uiuc.edu/ From bea at codespeak.net Thu Dec 1 15:36:16 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Thu, 1 Dec 2005 15:36:16 +0100 (CET) Subject: [pypy-svn] r20500 - pypy/extradoc/talk/22c3 Message-ID: <20051201143616.5113F27B5B@code1.codespeak.net> Author: bea Date: Thu Dec 1 15:36:15 2005 New Revision: 20500 Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt Log: inserting and modifying text on sprinting - more on how it was changed later in the text Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt ============================================================================== --- pypy/extradoc/talk/22c3/agility_v1.txt.txt (original) +++ pypy/extradoc/talk/22c3/agility_v1.txt.txt Thu Dec 1 15:36:15 2005 @@ -17,10 +17,27 @@ -------------------------------------------- Founding PyPy: + + +Agile approaches: sprinting + +The first drafts of ideas of what was to become PyPy started during a sprint held in Hildesheim in February 2003, inspired by a practice used by other Python-oriented projects such as Zope. Originally the sprint methodology used in the Python community grew from practices within Zope Corporation. Their definition of a sprint is "two-day or three-day focused development session, in which developers pair off together +in a room and focus on building a particular subsystem". + +It was decided early that sprinting was to be the key technique in creating a collaborative and open community. The early PyPy sprints moved around, being organised by core developers together with local Pythonistas and soon-to-become PyPy:ers in Louvain-la-Neuve, Gothenburg, Vilnius and Amsterdam. This strategy helped to create as well as strengthen the growing community, and sprints gave people the opportunity to help, participate in and influence the idea of PyPy.
+ +Sprints as such are not part of the Agile portfolio of techniques; the closest thing to them comes from Scrum, which names its 30-day-long programming iterations "sprints", each covering a certain increment. In the Scrum method considerable effort is placed into performing the sprint planning as well as creating and documenting the "sprint backlog" which is then fed back into the "Product backlog". The sprint ends with a "sprint review" - an informal planning session in which the team decides on upcoming work; there are also techniques in which the team looks at ways to improve the development methodology and future sprints. + +The practice used within the Python community and by Zope Corporation is an adoption of just this aspect of Scrum - not the entire Scrum methodology, which covers more than just sprinting. Here - and even in the early days of PyPy - sprints were limited to 2-3 days, which in some sense reduces the need for rigorous planning beforehand but also the need to review the process. We will come back to this subject later on. + +Why did PyPy choose sprinting as a key technique? It is a method that fits distributed teams well because it gets the team focused around clear (and challenging) goals while working collaboratively (pair programming, status meetings, discussions etc.) as well as in an accelerated way (short increments and tasks, "doing" and testing instead of long start-ups of planning and requirement gathering). This means that most of the time a sprint +is a great way of getting results, but also to get new people acquainted with +the codebase. It is also a great method for dissemination and learning within +the team because of the pair programming.
+ +Agile approaches: testdriven development + -Agile approaches: -- sprints -- testdriven development Community structure: - transparent communication From bea at codespeak.net Thu Dec 1 15:38:57 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Thu, 1 Dec 2005 15:38:57 +0100 (CET) Subject: [pypy-svn] r20501 - pypy/extradoc/talk/22c3 Message-ID: <20051201143857.D145527B5E@code1.codespeak.net> Author: bea Date: Thu Dec 1 15:38:56 2005 New Revision: 20501 Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt Log: changed introduction to abstract in order to conform with the tech paper Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt ============================================================================== --- pypy/extradoc/talk/22c3/agility_v1.txt.txt (original) +++ pypy/extradoc/talk/22c3/agility_v1.txt.txt Thu Dec 1 15:38:56 2005 @@ -1,7 +1,7 @@ Agile Business and EU funding: sprint methodology in funded OSS project ----------------------------------------------------------------------- -Introduction: +Abstract: ------------- This paper uses an evolutionary approach, a walkthrough of the history of the PyPy project, touching down on different aspects of agility. From bea at codespeak.net Thu Dec 1 16:51:43 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Thu, 1 Dec 2005 16:51:43 +0100 (CET) Subject: [pypy-svn] r20506 - pypy/extradoc/talk/22c3 Message-ID: <20051201155143.D756527B5E@code1.codespeak.net> Author: bea Date: Thu Dec 1 16:51:42 2005 New Revision: 20506 Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt Log: some more text.... 
Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt ============================================================================== --- pypy/extradoc/talk/22c3/agility_v1.txt.txt (original) +++ pypy/extradoc/talk/22c3/agility_v1.txt.txt Thu Dec 1 16:51:42 2005 @@ -16,8 +16,7 @@ The vision: the creation of an OSS community -------------------------------------------- -Founding PyPy: - +Founding PyPy: Agile approaches: sprinting @@ -37,7 +36,14 @@ Agile approaches: testdriven development +Test-driven development is the cornerstone of a developer-driven process. Seen from an Agile Manifesto perspective it is right up there as one of the key elements, since it puts the focus on producing working code rather than plans, papers and faulty software. +Seen from an Open Source community perspective it is a vital strategy - especially when combined with a transparent, open process in which anyone interested can participate - if only for just a few days at a sprint. Some of the key problems identified by Frederick P. Brooks in the latest version of "The Mythical Man-Month" (unfortunately still very relevant today) are estimating the correct amount of time for communication and +testing/debugging. Automated test-driven development and version control will solve many of those problems, especially in the hands of a team sprinting its way through the Python community - welcoming everyone to participate. + +The early choice of the PyPy team was an almost extreme test-driven approach. Experiences from the Subversion project, merged with the results of the py.lib (Holger????py.test - your other hobby project ;-) created a stable platform for the early development efforts. + +These two agile approaches combined (sprints and test-driven development) and the way they were implemented were the building blocks of the PyPy community.
Community structure: - transparent communication @@ -48,11 +54,15 @@ The idea: Framework 6 programme IST funding for OSS work -------------------------------------------------------- +In XXXX the idea of trying to get EU-funding for the project was identified. The community stretched outside of the regular Open Source world to try to gather as much information and contacts as possible in order to answer the question: "Should we go for it?" To be able to answer that question, two other questions needed to be understood and answered: + "Why do you want money - aren't you guys non-profit?": -- impact for the EU +There had been a growing interest from the European Commission, IST division, to look closer at the Open Source world and its achievements. Several funded research projects in the 5th framework programme studied the phenomenon (FLOSS-POLS, FLOSS) - its organization, business models and licensing. A few other funded software projects used Open Source in their work as tools (languages and applications). There was no previous experience of an Open Source community making a bid for funding. + +The areas in the 6th Framework programme, second call, fitted well enough with the objectives of PyPy (xxxxx). The idea of strengthening the European software development companies and businesses by supporting an open source language implementation was new but appealing to the EU. But being an Open Source project wasn't enough - the challenges and the idea of a flexible, configurable "translator" or "compiler" met the research targets of the FP6, as well as trying out and documenting the agile methodology being used. The EU wanted the PyPy team to show very concretely how funding PyPy would have a strategic impact for Europe.
"Why do we want money - isn?t OSS non-profit?": -- impact for the community +There was of course the risk of alienating parts of the Open Source community that had evolved around PyPy, not to mention the "collegues" working with the other Python Implentation Projects. To make a bid for funding for core developers and trying to find a model to channel funding for others to be able to participate in sprints was the idea. The decision to stay true to the vision of working agile and the strategy to strengtening the community via eu-funding was the key. Previously, all sprints from 2003 and onwards had been funded privately by the participants. The idea of using eu-funding to make sure that more people could contribute and participate in sprints made sure thar the project wouldn?t abruptly change it?s nature and that contribution wouldn?t be exploited. In the end the response was somewhat opposite - other OSS projects became curious - "PyPy had opened a new market" (Paul Everitt, Zope Europe). Proposal and negotiations: - formal requirements From arigo at codespeak.net Thu Dec 1 17:02:00 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 1 Dec 2005 17:02:00 +0100 (CET) Subject: [pypy-svn] r20507 - in pypy/branch/somepbc-refactoring/pypy: annotation rpython/test Message-ID: <20051201160200.E3DDD27B5B@code1.codespeak.net> Author: arigo Date: Thu Dec 1 17:02:00 2005 New Revision: 20507 Modified: pypy/branch/somepbc-refactoring/pypy/annotation/bookkeeper.py pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rclass.py Log: (mwh, pedronis, arigo) A test and fix for annotating calls to prebuilt bound methods. 
Modified: pypy/branch/somepbc-refactoring/pypy/annotation/bookkeeper.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/annotation/bookkeeper.py (original) +++ pypy/branch/somepbc-refactoring/pypy/annotation/bookkeeper.py Thu Dec 1 17:02:00 2005 @@ -414,10 +414,15 @@ else: # regular method origincls, name = origin_of_meth(pyobj) self.see_mutable(pyobj.im_self) + assert pyobj == getattr(pyobj.im_self, name), ( + "%r is not %s.%s ??" % (pyobj, pyobj.im_self, name)) + # emulate a getattr to make sure it's on the classdef + classdef = self.getuniqueclassdef(pyobj.im_class) + classdef.find_attribute(name) result = self.getmethoddesc( self.getdesc(pyobj.im_func), # funcdesc self.getuniqueclassdef(origincls), # originclassdef - self.getuniqueclassdef(pyobj.im_class), # selfclassdef + classdef, # selfclassdef name) else: # must be a frozen pre-built constant, but let's check Modified: pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rclass.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rclass.py (original) +++ pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rclass.py Thu Dec 1 17:02:00 2005 @@ -356,3 +356,14 @@ assert res == 54 res = interpret(f, [0]) assert res == 11 + +def test_constant_bound_method(): + class C: + value = 1 + def meth(self): + return self.value + meth = C().meth + def f(): + return meth() + res = interpret(f, []) + assert res == 1 From bea at codespeak.net Thu Dec 1 17:15:03 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Thu, 1 Dec 2005 17:15:03 +0100 (CET) Subject: [pypy-svn] r20508 - pypy/extradoc/talk/22c3 Message-ID: <20051201161503.AA06D27B60@code1.codespeak.net> Author: bea Date: Thu Dec 1 17:15:02 2005 New Revision: 20508 Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt Log: some more text ;-) Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt 
============================================================================== --- pypy/extradoc/talk/22c3/agility_v1.txt.txt (original) +++ pypy/extradoc/talk/22c3/agility_v1.txt.txt Thu Dec 1 17:15:02 2005 @@ -64,9 +64,10 @@ "Why do we want money - isn't OSS non-profit?": There was of course the risk of alienating parts of the Open Source community that had evolved around PyPy, not to mention the "colleagues" working with the other Python Implementation Projects. The idea was to make a bid for funding for core developers and to try to find a model to channel funding so that others could participate in sprints. The decision to stay true to the vision of working agile, and the strategy of strengthening the community via EU-funding, was the key. Previously, all sprints from 2003 and onwards had been funded privately by the participants. The idea of using EU-funding to make sure that more people could contribute and participate in sprints made sure that the project wouldn't abruptly change its nature and that contribution wouldn't be exploited. In the end the response was somewhat the opposite - other OSS projects became curious - "PyPy had opened a new market" (Paul Everitt, Zope Europe). -Proposal and negotiations: -- formal requirements -- organizational limbo +Acting on the answer to these questions proved to be a more difficult task. The entire proposal and negotiation process took over a year (Autumn 2003 to Dec 2004 Holger???). +Creating the formal requirements, the description of work, had not previously been a part of the development process. Drafting the high-level requirements (in total 14 workpackages and 58 deliverables) was done during sprints as well as distributed between sprints. This first EU-related work has been useful for the project and the community, clearly stating the idea of PyPy - a design document on a high level, helping others better understand the vision to be implemented.
+ +Unfortunately the negotiations got stuck in organizational limbo, and the project is still suffering from the effects of this even today. The vision of funding contribution during and between sprints, for people inside and outside of the formal funding project structure, was based on a neutral non-profit party - the Python Business Forum. This solution wasn't seen as realistic or feasible by the EU. The agile approach, keeping the process developer-driven as much as possible, needed to be restructured. The Project: consortium and companies within an OSS community structure ---------------------------------------------------------------------- From cfbolz at codespeak.net Thu Dec 1 17:20:38 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 1 Dec 2005 17:20:38 +0100 (CET) Subject: [pypy-svn] r20510 - pypy/extradoc/talk/22c3 Message-ID: <20051201162038.A346327B61@code1.codespeak.net> Author: cfbolz Date: Thu Dec 1 17:20:36 2005 New Revision: 20510 Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt (contents, props changed) pypy/extradoc/talk/22c3/agility_v2.txt.txt (contents, props changed) Log: fixeol Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt ============================================================================== --- pypy/extradoc/talk/22c3/agility_v1.txt.txt (original) +++ pypy/extradoc/talk/22c3/agility_v1.txt.txt Thu Dec 1 17:20:36 2005 @@ -1,99 +1,99 @@ -Agile Business and EU funding: sprint methodology in funded OSS project ------------------------------------------------------------------------ - -Abstract: ------------- -This paper uses an evolutionary approach, a walkthrough of the history of the -PyPy project, touching down on different aspects of agility. - -In the founding of the community there was a clear vision of agile development and sprints as the key method.
The idea of EU-funding and the process in achieving this created a paradox: how to keep the agile open source community structure with key aspects of the project being funded through EU. - -This then exposed the project to formal requirements planning, estimation,resource tracking and the challenge was to design a process in which a balance was struck between community and consortium, between a developer driven process and formal organinizational structure. - -The evolution of the project - from a non profit Open Source initiative to a partial funded EU project - made possible the growth of Agile Business. - - -The vision: the creation of an OSS community --------------------------------------------- - -Founding PyPy: - -Agile approaches: sprinting - -The first drafts of ideas of what was to become PyPy started during a sprint, held in Hildesheim in February 2003. Inspired by this practice, used by other Python oriented projects such as Zope. Originally the sprint methodology used in the Python community grew from practices within Zope Corporation. Their definition of a sprint is "two-day or three-day focused development session, in which developers pair off together -in a room and focus on building a particular subsystem". - -It was decided early that sprinting was to be the key technique in creating a collaborative and open community. The early PyPy sprints moved around, being organised by core developers together with local Pythonistas and soon to become PyPy:ers in LOvain Le Neuve, Gothenburg,Vilnius and Amsterdam.This strategy helped to create as well as strengthen the growing community and sprints gave the opportunity to both help, participate and influence the idea of PyPy. - -Sprints as such is not part of the Agile portfolio of techniques, the closes thing to it comes from Scrum who names the 30 days long programming iterations "sprints", covering a certain increment. 
In the Scrum method considerable effort is placed into performing the sprint planning as well as creating and documenting the "sprint backlog" which is then feedbacked into the "Product backlog".The sprint ends with a "sprint review" - an informal planning session in which the team decides on upcoming work, there are also techniques in which the team looks at ways to improve the development methodology and future sprints. - -The practise used within the Python community and by Zope Corporation is an adoption of just this aspect of Scrum - not the entire Scrum methodology which covers more than just sprinting. Here - and even in the early days of PyPy sprints where limited to 2-3 days, which in some sense reduces the need for rigourous planning beforehand but also the need to review the process. We will come back to this subject later on. - -Why did PyPy choose sprinting as a key technique? It is a method that fits distributed teams well because it gets the team focused around clear (and challenging) goals while working collarobative (pairprogramming, status meeting, discussions etc) as well as accelerated (short increments and tasks, "doing" and testing instead of long start ups of planning and requirement gathering). This means that most of the time a sprint -is a great way of getting results, but also to get new people aquinted with -the codebase. It is also a great method for dissemination and learning within -the team because of the pairprogramming. - -Agile approaches: testdriven development - -Testdriven development is the cornerstone of a developer driven process. Seen from an Agile Manifesto perspective it is right up there as one of the key elements since it puts focus on producing working code, rather than plans and papers and faulty software. 
- -Seen from an Open Source community perspective it is a vital strategy - especially when combined with an transparent open process in which anyone interested can participate - if only for just a few days at a sprint. Some of the key problems identified by Frederick P Brooks in the latest version of "The mythical Man-month" (unfortunately still very actual today) are estimating correct amount of time for communication and -testing/debugging. Automated test-driven development and version control will solve many of those problems, especially in the hands of a team sprinting its way through the Python community - welcoming everyone to participate. - -The early choice of the PyPy team was an almost extreme test driven approach. Experiences from the Subversion project, merged with the results of the py.lib (Holger????py.test - your other hobby project ;-) created a stable platform for the early development efforts. - -These two agile approaches combined (sprints and test driven development) and the way they where implemented where the building block of the PyPy community. - -Community structure: -- transparent communication -- decision making -- interaction with other communities - - -The idea: Framework 6 programme IST funding for OSS work --------------------------------------------------------- - -In XXXX the idea of trying to get EU-funding for the project was identified. The community stretched outside of the regular Open Source world to try to gather as much information and contacts as possible in order to answer the question: "Should we go for it?" To be able to answer that question - two other questions needed to be understood and answered: - -"Why do you want money - aren?t you guys non-profit?": -There had been a growing interest from the European Commission, IST division, to look closer at the Open Source world and its achievements. 
Several funded research projects in the 5th framework programme studied the phenomenon (FLOSS-POLS, FLOSS) - its organization, business models and licensings. A few other funded software projects used Open Source in their work as tools (languages and applications). There was no previous experience of an Open Source community making a bid for funding. - -The areas in the 6th Framework programme, second call fitted well enough with the objectives of PyPy (xxxxx). The idea of strengthening the european software development companies and businesses with supporting an open source language implementation was new but appealing to the EU. But being an Open Source project wasn?t enough - the challenges and the idea of an flexible, configurable "translator" or "compiler" met the research targets of the FP6, as well as trying out and documenting the agile methodology being used.The EU wanted the PyPy team to very concrete show how funding PyPy would have an strategic for Europe. - -"Why do we want money - isn?t OSS non-profit?": -There was of course the risk of alienating parts of the Open Source community that had evolved around PyPy, not to mention the "collegues" working with the other Python Implentation Projects. To make a bid for funding for core developers and trying to find a model to channel funding for others to be able to participate in sprints was the idea. The decision to stay true to the vision of working agile and the strategy to strengtening the community via eu-funding was the key. Previously, all sprints from 2003 and onwards had been funded privately by the participants. The idea of using eu-funding to make sure that more people could contribute and participate in sprints made sure thar the project wouldn?t abruptly change it?s nature and that contribution wouldn?t be exploited. In the end the response was somewhat opposite - other OSS projects became curious - "PyPy had opened a new market" (Paul Everitt, Zope Europe). 
- -Acting on the answer to these questions proved to be a more difficult task. The entire proposal and negotiation process took over a year (Autumn 2003 to Dec 2004 Holger???). -Creating the formal requirements, the description of work, had not previously been a part of the development process. Drafting the high-level requirements (in total 14 workpackages and 58 deliverables) was made during sprints as well as distributed between sprints. This first eu-related work have been useful for the project and the community, clearly stating the idea of the PyPy, a design document on a high level - helping others better understand the vision to be implemented. - -Unfortunately the negotiations got stuck in organizational limbo and the project is still suffering from the effects of this even today. The vision of funding contribution during and between sprints to people inside and outside of the formal funding project structure was based on a neutral non-profit party - Python Business Forum. This solution wasn?t seen as realistic or feasible by the EU. The agile approach, keeping the process developer driven as much as possible, needed to be restructured. - -The Project: consortium and companies within a OSS community structure ----------------------------------------------------------------------- - -Forced entrepreneurship: - -Creating the consortium: - -Formalizing aspects of the community: -- roles and responsibilities - - -The challenge: balancing agile OSS community structures with EU requirements ------------------------------------------------------------------------------- - -Sprints - the key agile approach: - -Physical persons: - -Communication channels: - -Managing diversities: agile business - a succesful marriage ? 
------------------------------------------------------------ - -Agile EU-project: - -Agile businesses: - - +Agile Business and EU funding: sprint methodology in funded OSS project +----------------------------------------------------------------------- + +Abstract: +------------- +This paper uses an evolutionary approach, a walkthrough of the history of the +PyPy project, touching down on different aspects of agility. + +In the founding of the community there was a clear vision of agile development and sprints as the key method. The idea of EU-funding and the process in achieving this created a paradox: how to keep the agile open source community structure with key aspects of the project being funded through EU. + +This then exposed the project to formal requirements planning, estimation,resource tracking and the challenge was to design a process in which a balance was struck between community and consortium, between a developer driven process and formal organinizational structure. + +The evolution of the project - from a non profit Open Source initiative to a partial funded EU project - made possible the growth of Agile Business. + + +The vision: the creation of an OSS community +-------------------------------------------- + +Founding PyPy: + +Agile approaches: sprinting + +The first drafts of ideas of what was to become PyPy started during a sprint, held in Hildesheim in February 2003. Inspired by this practice, used by other Python oriented projects such as Zope. Originally the sprint methodology used in the Python community grew from practices within Zope Corporation. Their definition of a sprint is "two-day or three-day focused development session, in which developers pair off together +in a room and focus on building a particular subsystem". + +It was decided early that sprinting was to be the key technique in creating a collaborative and open community. 
The early PyPy sprints moved around, being organised by core developers together with local Pythonistas and soon to become PyPy:ers in LOvain Le Neuve, Gothenburg,Vilnius and Amsterdam.This strategy helped to create as well as strengthen the growing community and sprints gave the opportunity to both help, participate and influence the idea of PyPy. + +Sprints as such is not part of the Agile portfolio of techniques, the closes thing to it comes from Scrum who names the 30 days long programming iterations "sprints", covering a certain increment. In the Scrum method considerable effort is placed into performing the sprint planning as well as creating and documenting the "sprint backlog" which is then feedbacked into the "Product backlog".The sprint ends with a "sprint review" - an informal planning session in which the team decides on upcoming work, there are also techniques in which the team looks at ways to improve the development methodology and future sprints. + +The practise used within the Python community and by Zope Corporation is an adoption of just this aspect of Scrum - not the entire Scrum methodology which covers more than just sprinting. Here - and even in the early days of PyPy sprints where limited to 2-3 days, which in some sense reduces the need for rigourous planning beforehand but also the need to review the process. We will come back to this subject later on. + +Why did PyPy choose sprinting as a key technique? It is a method that fits distributed teams well because it gets the team focused around clear (and challenging) goals while working collarobative (pairprogramming, status meeting, discussions etc) as well as accelerated (short increments and tasks, "doing" and testing instead of long start ups of planning and requirement gathering). This means that most of the time a sprint +is a great way of getting results, but also to get new people aquinted with +the codebase. 
It is also a great method for dissemination and learning within +the team because of the pairprogramming. + +Agile approaches: testdriven development + +Testdriven development is the cornerstone of a developer driven process. Seen from an Agile Manifesto perspective it is right up there as one of the key elements since it puts focus on producing working code, rather than plans and papers and faulty software. + +Seen from an Open Source community perspective it is a vital strategy - especially when combined with an transparent open process in which anyone interested can participate - if only for just a few days at a sprint. Some of the key problems identified by Frederick P Brooks in the latest version of "The mythical Man-month" (unfortunately still very actual today) are estimating correct amount of time for communication and +testing/debugging. Automated test-driven development and version control will solve many of those problems, especially in the hands of a team sprinting its way through the Python community - welcoming everyone to participate. + +The early choice of the PyPy team was an almost extreme test driven approach. Experiences from the Subversion project, merged with the results of the py.lib (Holger????py.test - your other hobby project ;-) created a stable platform for the early development efforts. + +These two agile approaches combined (sprints and test driven development) and the way they where implemented where the building block of the PyPy community. + +Community structure: +- transparent communication +- decision making +- interaction with other communities + + +The idea: Framework 6 programme IST funding for OSS work +-------------------------------------------------------- + +In XXXX the idea of trying to get EU-funding for the project was identified. The community stretched outside of the regular Open Source world to try to gather as much information and contacts as possible in order to answer the question: "Should we go for it?" 
To be able to answer that question - two other questions needed to be understood and answered: + +"Why do you want money - aren't you guys non-profit?": +There had been a growing interest from the European Commission, IST division, to look closer at the Open Source world and its achievements. Several funded research projects in the 5th framework programme studied the phenomenon (FLOSS-POLS, FLOSS) - its organization, business models and licensing. A few other funded software projects used Open Source in their work as tools (languages and applications). There was no previous experience of an Open Source community making a bid for funding. + +The areas in the 6th Framework programme, second call, fitted well enough with the objectives of PyPy (xxxxx). The idea of strengthening the European software development companies and businesses by supporting an open source language implementation was new but appealing to the EU. But being an Open Source project wasn't enough - the challenges and the idea of a flexible, configurable "translator" or "compiler" met the research targets of the FP6, as well as trying out and documenting the agile methodology being used. The EU wanted the PyPy team to very concretely show how funding PyPy would have a strategic impact for Europe. + +"Why do we want money - isn't OSS non-profit?": +There was of course the risk of alienating parts of the Open Source community that had evolved around PyPy, not to mention the "colleagues" working with the other Python Implementation Projects. To make a bid for funding for core developers and trying to find a model to channel funding for others to be able to participate in sprints was the idea. The decision to stay true to the vision of working agile and the strategy of strengthening the community via eu-funding was the key. Previously, all sprints from 2003 and onwards had been funded privately by the participants. 
The idea of using eu-funding to make sure that more people could contribute and participate in sprints made sure that the project wouldn't abruptly change its nature and that contribution wouldn't be exploited. In the end the response was somewhat the opposite - other OSS projects became curious - "PyPy had opened a new market" (Paul Everitt, Zope Europe). + +Acting on the answer to these questions proved to be a more difficult task. The entire proposal and negotiation process took over a year (Autumn 2003 to Dec 2004 Holger???). +Creating the formal requirements, the description of work, had not previously been a part of the development process. Drafting the high-level requirements (in total 14 workpackages and 58 deliverables) was done during sprints as well as distributed between sprints. This first eu-related work has been useful for the project and the community, clearly stating the idea of PyPy, a design document on a high level - helping others better understand the vision to be implemented. + +Unfortunately the negotiations got stuck in organizational limbo and the project is still suffering from the effects of this even today. The vision of funding contribution during and between sprints to people inside and outside of the formal funding project structure was based on a neutral non-profit party - Python Business Forum. This solution wasn't seen as realistic or feasible by the EU. The agile approach, keeping the process developer driven as much as possible, needed to be restructured. 
+ +The Project: consortium and companies within a OSS community structure +---------------------------------------------------------------------- + +Forced entrepreneurship: + +Creating the consortium: + +Formalizing aspects of the community: +- roles and responsibilities + + +The challenge: balancing agile OSS community structures with EU requirements +------------------------------------------------------------------------------ + +Sprints - the key agile approach: + +Physical persons: + +Communication channels: + +Managing diversities: agile business - a succesful marriage ? +----------------------------------------------------------- + +Agile EU-project: + +Agile businesses: + + Modified: pypy/extradoc/talk/22c3/agility_v2.txt.txt ============================================================================== --- pypy/extradoc/talk/22c3/agility_v2.txt.txt (original) +++ pypy/extradoc/talk/22c3/agility_v2.txt.txt Thu Dec 1 17:20:36 2005 @@ -1,34 +1,34 @@ -Agile Business and EU funding: sprint methodology in funded OSS project ------------------------------------------------------------------------ - -Introduction: -------------- - - - -1. Agile open source practices - -- community structure (communication and decision making) -- sprints (source, Python community, PyPy) - - -2. Agile technical practises - -- working distributed (pypy-sync) -- testdriven development -- version control -- infrastructure - -3. EU-funding in an OSS community - -- company creation -- consortium structure -- requirements versus agility -- managing diversities (roles, responsibilities, communication,culture) - -4. Designing agile businesses - -- tailoring an agile project process -- challenges and recommendations - +Agile Business and EU funding: sprint methodology in funded OSS project +----------------------------------------------------------------------- + +Introduction: +------------- + + + +1. 
Agile open source practices + +- community structure (communication and decision making) +- sprints (source, Python community, PyPy) + + +2. Agile technical practises + +- working distributed (pypy-sync) +- testdriven development +- version control +- infrastructure + +3. EU-funding in an OSS community + +- company creation +- consortium structure +- requirements versus agility +- managing diversities (roles, responsibilities, communication,culture) + +4. Designing agile businesses + +- tailoring an agile project process +- challenges and recommendations + \ No newline at end of file From cfbolz at codespeak.net Thu Dec 1 17:46:27 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 1 Dec 2005 17:46:27 +0100 (CET) Subject: [pypy-svn] r20511 - pypy/extradoc/talk/22c3 Message-ID: <20051201164627.5404827B5E@code1.codespeak.net> Author: cfbolz Date: Thu Dec 1 17:46:25 2005 New Revision: 20511 Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt Log: reformatting and some typo fixes Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt ============================================================================== --- pypy/extradoc/talk/22c3/agility_v1.txt.txt (original) +++ pypy/extradoc/talk/22c3/agility_v1.txt.txt Thu Dec 1 17:46:25 2005 @@ -3,14 +3,22 @@ Abstract: ------------- + This paper uses an evolutionary approach, a walkthrough of the history of the PyPy project, touching down on different aspects of agility. -In the founding of the community there was a clear vision of agile development and sprints as the key method. The idea of EU-funding and the process in achieving this created a paradox: how to keep the agile open source community structure with key aspects of the project being funded through EU. 
- -This then exposed the project to formal requirements planning, estimation,resource tracking and the challenge was to design a process in which a balance was struck between community and consortium, between a developer driven process and formal organinizational structure. +In the founding of the community there was a clear vision of agile development +and sprints as the key method. The idea of EU-funding and the process in +achieving this created a paradox: how to keep the agile open source community +structure with key aspects of the project being funded through the EU. + +This then exposed the project to formal requirements planning, estimation, +resource tracking and the challenge was to design a process in which a balance +was struck between community and consortium, between a developer driven process +and formal organizational structure. -The evolution of the project - from a non profit Open Source initiative to a partial funded EU project - made possible the growth of Agile Business. +The evolution of the project - from a non-profit Open Source initiative to a +partially funded EU project - made the growth of Agile Business possible. The vision: the creation of an OSS community -------------------------------------------- Founding PyPy: Agile approaches: sprinting +++++++++++++++++++++++++++++ 
The early PyPy sprints moved around, being organised by core developers together with local Pythonistas and soon to become PyPy:ers in LOvain Le Neuve, Gothenburg,Vilnius and Amsterdam.This strategy helped to create as well as strengthen the growing community and sprints gave the opportunity to both help, participate and influence the idea of PyPy. - -Sprints as such is not part of the Agile portfolio of techniques, the closes thing to it comes from Scrum who names the 30 days long programming iterations "sprints", covering a certain increment. In the Scrum method considerable effort is placed into performing the sprint planning as well as creating and documenting the "sprint backlog" which is then feedbacked into the "Product backlog".The sprint ends with a "sprint review" - an informal planning session in which the team decides on upcoming work, there are also techniques in which the team looks at ways to improve the development methodology and future sprints. - -The practise used within the Python community and by Zope Corporation is an adoption of just this aspect of Scrum - not the entire Scrum methodology which covers more than just sprinting. Here - and even in the early days of PyPy sprints where limited to 2-3 days, which in some sense reduces the need for rigourous planning beforehand but also the need to review the process. We will come back to this subject later on. - -Why did PyPy choose sprinting as a key technique? It is a method that fits distributed teams well because it gets the team focused around clear (and challenging) goals while working collarobative (pairprogramming, status meeting, discussions etc) as well as accelerated (short increments and tasks, "doing" and testing instead of long start ups of planning and requirement gathering). This means that most of the time a sprint -is a great way of getting results, but also to get new people aquinted with -the codebase. 
It is also a great method for dissemination and learning within -the team because of the pairprogramming. - -Agile approaches: testdriven development - -Testdriven development is the cornerstone of a developer driven process. Seen from an Agile Manifesto perspective it is right up there as one of the key elements since it puts focus on producing working code, rather than plans and papers and faulty software. +The first drafts of ideas of what was to become PyPy started during a sprint, +held in Hildesheim in February 2003. It was inspired by a practice used by +other Python oriented projects such as Zope. Originally the sprint methodology +used in the Python community grew from practices within Zope Corporation. Their +definition of a sprint is: "a two-day or three-day focused development session, +in which developers pair off together in a room and focus on building a +particular subsystem". + +It was decided early that sprinting was to be the key technique in creating a +collaborative and open community. The early PyPy sprints moved around, being +organised by core developers together with local Pythonistas and soon to become +PyPy'ers in Louvain-la-Neuve, Gothenburg, Vilnius and Amsterdam. This strategy +helped to create and later strengthen the growing community. Sprints gave +the opportunity to both help, participate and influence the idea of PyPy. + +Sprints as such are not part of the Agile portfolio of techniques, the closest +thing to it comes from Scrum, which names the 30 days long programming iterations +"sprints", covering a certain increment. In the Scrum method considerable +effort is placed into performing the sprint planning as well as creating and +documenting the "sprint backlog" which is then fed back into the "Product +backlog". The sprint ends with a "sprint review" - an informal planning session +in which the team decides on upcoming work. 
There are also techniques in which +the team looks at ways to improve the development methodology and future +sprints. + +The practice used within the Python community and by Zope Corporation is an +adoption of just this aspect of Scrum - not the entire Scrum methodology which +covers more than just sprinting. In the Zope community - and even in the early +days of PyPy - sprints were limited to 2-3 days, which in some sense reduces the +need for rigorous planning beforehand but also the need to review the process. +We will come back to this subject later on. + +Why did PyPy choose sprinting as a key technique? It is a method that fits +distributed teams well because it gets the team focused around clear (and +challenging) goals while working collaboratively (pair-programming, status +meetings, discussions etc) as well as at an accelerated pace (short increments +and tasks, "doing" and testing instead of long start ups of planning and +requirement gathering). This means that most of the time a sprint is a great +way of getting results, but also to get new people acquainted with the codebase. +It is also a great method for dissemination and learning within the team +because of the pair-programming. + +Agile approaches: Test-Driven Development +----------------------------------------- + +Test-driven development is the cornerstone of a developer-driven process. Seen +from an Agile Manifesto perspective it is right up there as one of the key +elements since it puts focus on producing working code, rather than plans and +papers and faulty software. + +Seen from an Open Source community perspective it is a vital strategy - +especially when combined with a transparent open process in which anyone +interested can participate - if only for just a few days at a sprint. Some of +the key problems identified by Frederick P. 
Brooks in the latest version of "The +mythical Man-month" (unfortunately still very relevant today) are estimating the +correct amount of time for communication and testing/debugging. Automated +test-driven development and version control will solve many of those problems, +especially in the hands of a team sprinting its way through the Python +community - welcoming everyone to participate. + +The early choice of the PyPy team was an almost extreme test driven approach. +Experiences from the Subversion project, merged with the results of the py.lib +(Holger????py.test - your other hobby project ;-) created a stable platform for +the early development efforts. XXX (cf) is the py-lib really as old as PyPy? -Seen from an Open Source community perspective it is a vital strategy - especially when combined with an transparent open process in which anyone interested can participate - if only for just a few days at a sprint. Some of the key problems identified by Frederick P Brooks in the latest version of "The mythical Man-month" (unfortunately still very actual today) are estimating correct amount of time for communication and -testing/debugging. Automated test-driven development and version control will solve many of those problems, especially in the hands of a team sprinting its way through the Python community - welcoming everyone to participate. - -The early choice of the PyPy team was an almost extreme test driven approach. Experiences from the Subversion project, merged with the results of the py.lib (Holger????py.test - your other hobby project ;-) created a stable platform for the early development efforts. - -These two agile approaches combined (sprints and test driven development) and the way they where implemented where the building block of the PyPy community. +These two agile approaches combined (sprints and test driven development) and +the way they were implemented were the building blocks of the PyPy community. 
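The test-first style described above can be sketched with a tiny py.test-style example. This is a hypothetical illustration only: the function and test names are invented for this sketch and are not taken from the PyPy or py.lib codebase.

```python
# A minimal, hypothetical sketch of test-driven development in the
# plain-assert style that py.test popularised. Names are invented.

def interp_add(a, b):
    # The implementation is written only after the tests below exist
    # and fail; it stays as small as the tests require.
    return a + b

# py.test collects plain functions named test_* and reports a failure
# for any bare assert that does not hold - no TestCase boilerplate.
def test_add_ints():
    assert interp_add(2, 3) == 5

def test_add_is_generic():
    # the same helper works for any objects supporting "+"
    assert interp_add("py", "py") == "pypy"
```

Saved as a test_*.py file, such tests run automatically under py.test, which is what makes the "anyone can show up at a sprint and verify the build" workflow cheap.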
Community structure: - transparent communication @@ -54,20 +106,66 @@ The idea: Framework 6 programme IST funding for OSS work -------------------------------------------------------- -In XXXX the idea of trying to get EU-funding for the project was identified. The community stretched outside of the regular Open Source world to try to gather as much information and contacts as possible in order to answer the question: "Should we go for it?" To be able to answer that question - two other questions needed to be understood and answered: - -"Why do you want money - aren?t you guys non-profit?": -There had been a growing interest from the European Commission, IST division, to look closer at the Open Source world and its achievements. Several funded research projects in the 5th framework programme studied the phenomenon (FLOSS-POLS, FLOSS) - its organization, business models and licensings. A few other funded software projects used Open Source in their work as tools (languages and applications). There was no previous experience of an Open Source community making a bid for funding. - -The areas in the 6th Framework programme, second call fitted well enough with the objectives of PyPy (xxxxx). The idea of strengthening the european software development companies and businesses with supporting an open source language implementation was new but appealing to the EU. But being an Open Source project wasn?t enough - the challenges and the idea of an flexible, configurable "translator" or "compiler" met the research targets of the FP6, as well as trying out and documenting the agile methodology being used.The EU wanted the PyPy team to very concrete show how funding PyPy would have an strategic for Europe. - -"Why do we want money - isn?t OSS non-profit?": -There was of course the risk of alienating parts of the Open Source community that had evolved around PyPy, not to mention the "collegues" working with the other Python Implentation Projects. 
To make a bid for funding for core developers and trying to find a model to channel funding for others to be able to participate in sprints was the idea. The decision to stay true to the vision of working agile and the strategy to strengtening the community via eu-funding was the key. Previously, all sprints from 2003 and onwards had been funded privately by the participants. The idea of using eu-funding to make sure that more people could contribute and participate in sprints made sure thar the project wouldn?t abruptly change it?s nature and that contribution wouldn?t be exploited. In the end the response was somewhat opposite - other OSS projects became curious - "PyPy had opened a new market" (Paul Everitt, Zope Europe). - -Acting on the answer to these questions proved to be a more difficult task. The entire proposal and negotiation process took over a year (Autumn 2003 to Dec 2004 Holger???). -Creating the formal requirements, the description of work, had not previously been a part of the development process. Drafting the high-level requirements (in total 14 workpackages and 58 deliverables) was made during sprints as well as distributed between sprints. This first eu-related work have been useful for the project and the community, clearly stating the idea of the PyPy, a design document on a high level - helping others better understand the vision to be implemented. - -Unfortunately the negotiations got stuck in organizational limbo and the project is still suffering from the effects of this even today. The vision of funding contribution during and between sprints to people inside and outside of the formal funding project structure was based on a neutral non-profit party - Python Business Forum. This solution wasn?t seen as realistic or feasible by the EU. The agile approach, keeping the process developer driven as much as possible, needed to be restructured. +In XXXX the idea of trying to get EU-funding for the project was identified. 
+The community stretched outside of the regular Open Source world to try to +gather as much information and contacts as possible in order to answer the +question: "Should we go for it?" To be able to answer that question - two other +questions needed to be understood and answered: + +"Why do you want money - aren't you guys non-profit?": ++++++++++++++++++++++++++++++++++++++++++++++++++++++++ + +There had been a growing interest from the European Commission, IST division, +to look closer at the Open Source world and its achievements. Several funded +research projects in the 5th framework programme studied the phenomenon +(FLOSS-POLS, FLOSS) - its organization, business models and licensing. A few +other funded software projects used Open Source in their work as tools +(languages and applications). There was no previous experience of an Open +Source community making a bid for funding. + +The areas in the 6th Framework programme, second call, fitted well enough with +the objectives of PyPy (XXX). The idea of strengthening the European software +development companies and businesses by supporting an open source language +implementation was new but appealing to the EU. But being an Open Source +project wasn't enough - the challenges and the idea of a flexible, +configurable "translator" or "compiler" met the research targets of the FP6, as +well as trying out and documenting the agile methodology being used. The EU +wanted the PyPy team to very concretely show how funding PyPy would have a +strategic impact for Europe. + +"Why do we want money - isn't OSS non-profit?": ++++++++++++++++++++++++++++++++++++++++++++++++ + +There was of course the risk of alienating parts of the Open Source community +that had evolved around PyPy, not to mention the "colleagues" working with the +other Python implementation projects. The idea was to make a bid for funding for core +developers and to find a model to channel funding for others to be able +to participate in sprints. 
The decision to stay true to the vision +of working agile and the strategy of strengthening the community via EU funding +was the key. Previously, all sprints from 2003 and onwards had been funded +privately by the participants. The idea of using EU funding to make sure that +more people could contribute and participate in sprints made sure that the +project wouldn't abruptly change its nature and that contribution wouldn't be +exploited. In the end the response was somewhat opposite - other OSS projects +became curious - "PyPy had opened a new market" (Paul Everitt, Zope Europe). + +Acting on the answer to these questions proved to be a more difficult task. The +entire proposal and negotiation process took over a year (Autumn 2003 to Dec +2004 Holger???). Creating the formal requirements, the description of work, +had not previously been a part of the development process. Drafting the +high-level requirements (in total 14 workpackages and 58 deliverables) was done +during sprints as well as distributed between sprints. This first EU-related +work has been useful for the project and the community, clearly stating the +idea of PyPy, a design document on a high level - helping others better +understand the vision to be implemented. + +Unfortunately the negotiations got stuck in organizational limbo and the +project is still suffering from the effects of this even today. The vision of +funding contribution during and between sprints to people inside and outside of +the formal funding project structure was based on a neutral non-profit party - +Python Business Forum. This solution wasn't seen as realistic or feasible by +the EU. The agile approach, keeping the process developer driven as much as +possible, needed to be restructured. 
The Project: consortium and companies within a OSS community structure ---------------------------------------------------------------------- From hpk at codespeak.net Thu Dec 1 18:12:54 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 1 Dec 2005 18:12:54 +0100 (CET) Subject: [pypy-svn] r20512 - pypy/extradoc/talk/22c3 Message-ID: <20051201171254.CF27227B5E@code1.codespeak.net> Author: hpk Date: Thu Dec 1 18:12:54 2005 New Revision: 20512 Removed: pypy/extradoc/talk/22c3/agility_v2.txt.txt Log: remove alternative approach From cfbolz at codespeak.net Thu Dec 1 18:19:42 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 1 Dec 2005 18:19:42 +0100 (CET) Subject: [pypy-svn] r20515 - pypy/dist/pypy/doc/statistic Message-ID: <20051201171942.0B40B27B5B@code1.codespeak.net> Author: cfbolz Date: Thu Dec 1 18:19:40 2005 New Revision: 20515 Modified: pypy/dist/pypy/doc/statistic/release_dates.csv Log: remove the 0.6.1 release as it is making the line too thick Modified: pypy/dist/pypy/doc/statistic/release_dates.csv ============================================================================== --- pypy/dist/pypy/doc/statistic/release_dates.csv (original) +++ pypy/dist/pypy/doc/statistic/release_dates.csv Thu Dec 1 18:19:40 2005 @@ -1,6 +1,5 @@ PyPy releases date, release 2005-05-20,"PyPy 0.6 & 0.6.1" -2005-05-21,"" 2005-08-28,"PyPy 0.7.0" 2005-11-03,"PyPy 0.8.0" From cfbolz at codespeak.net Thu Dec 1 18:19:57 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 1 Dec 2005 18:19:57 +0100 (CET) Subject: [pypy-svn] r20516 - pypy/dist/pypy/doc/statistic Message-ID: <20051201171957.2C45F27B5B@code1.codespeak.net> Author: cfbolz Date: Thu Dec 1 18:19:55 2005 New Revision: 20516 Modified: pypy/dist/pypy/doc/statistic/format.py Log: allow the generation of black/white plots Modified: pypy/dist/pypy/doc/statistic/format.py ============================================================================== --- 
pypy/dist/pypy/doc/statistic/format.py (original) +++ pypy/dist/pypy/doc/statistic/format.py Thu Dec 1 18:19:55 2005 @@ -6,6 +6,8 @@ import pylab import matplotlib +greyscale = True + def get_data(p): data = p.readlines() title = data[0].strip() @@ -39,7 +41,10 @@ result = parser.parse(s) return pylab.date2num(result) -colors = "brg" +if greyscale: + colors = ["k", "k--", "k."] +else: + colors = "brg" def txt2png(p): print p @@ -60,7 +65,11 @@ ymax = max(pylab.yticks()[0]) #just below the legend for i, release_date in enumerate(release_dates): release_name = release_names[i] - pylab.axvline(release_date, linewidth=2, color="g", alpha=0.5) + if greyscale: + color = 0.3 + else: + color = "g" + pylab.axvline(release_date, linewidth=2, color=color, alpha=0.5) ax.text(release_date, ymax * 0.5, release_name, fontsize=10, horizontalalignment='right', @@ -71,7 +80,11 @@ begin = sprint_begin_dates[i] end = sprint_end_dates[i] if float(begin) >= float(min(dates[0],dates[-1])): - pylab.axvspan(begin, end, facecolor="y", alpha=0.2) + if greyscale: + color = 0.8 + else: + color = "y" + pylab.axvspan(begin, end, facecolor=color, alpha=0.2) ax.text(begin, ymax * 0.88, location, fontsize=10, horizontalalignment='right', From cfbolz at codespeak.net Thu Dec 1 18:23:20 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 1 Dec 2005 18:23:20 +0100 (CET) Subject: [pypy-svn] r20517 - pypy/dist/pypy/doc/image Message-ID: <20051201172320.92CA427B5B@code1.codespeak.net> Author: cfbolz Date: Thu Dec 1 18:23:18 2005 New Revision: 20517 Modified: pypy/dist/pypy/doc/image/lattice1.png pypy/dist/pypy/doc/image/lattice2.png pypy/dist/pypy/doc/image/lattice3.png Log: regenerate these Modified: pypy/dist/pypy/doc/image/lattice1.png ============================================================================== Binary files. No diff available. 
Modified: pypy/dist/pypy/doc/image/lattice2.png ============================================================================== Binary files. No diff available. Modified: pypy/dist/pypy/doc/image/lattice3.png ============================================================================== Binary files. No diff available. From arigo at codespeak.net Thu Dec 1 18:23:30 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 1 Dec 2005 18:23:30 +0100 (CET) Subject: [pypy-svn] r20518 - in pypy/branch/somepbc-refactoring/pypy: annotation rpython/test Message-ID: <20051201172330.7335827B5B@code1.codespeak.net> Author: arigo Date: Thu Dec 1 18:23:29 2005 New Revision: 20518 Modified: pypy/branch/somepbc-refactoring/pypy/annotation/description.py pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rdict.py Log: (mwh, pedronis, arigo) Test and fix for a rather obscure bug, caused by a FunctionDesc being first built at just the wrong time when the policy is not the expected one. Fixed by capturing the specializer from the current policy only when specialize() is first called. 
Modified: pypy/branch/somepbc-refactoring/pypy/annotation/description.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/annotation/description.py (original) +++ pypy/branch/somepbc-refactoring/pypy/annotation/description.py Thu Dec 1 18:23:29 2005 @@ -145,10 +145,6 @@ signature = cpython_code_signature(pyobj.func_code) if defaults is None: defaults = pyobj.func_defaults - if specializer is None: - tag = getattr(pyobj, '_annspecialcase_', None) - policy = bookkeeper.annotator.policy - specializer = policy.get_specializer(tag) self.name = name self.signature = signature self.defaults = defaults or () @@ -194,6 +190,12 @@ return inputcells def specialize(self, inputcells): + if self.specializer is None: + # get the specializer based on the tag of the 'pyobj' + # (if any), according to the current policy + tag = getattr(self.pyobj, '_annspecialcase_', None) + policy = self.bookkeeper.annotator.policy + self.specializer = policy.get_specializer(tag) return self.specializer(self, inputcells) def pycall(self, schedule, args, s_previous_result): Modified: pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rdict.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rdict.py (original) +++ pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rdict.py Thu Dec 1 18:23:29 2005 @@ -515,3 +515,14 @@ res = interpret(f, []) assert res == 2 + +def test_specific_obscure_bug(): + class A: pass + class B: pass # unrelated kinds of instances + def f(): + lst = [A()] + res1 = A() in lst + d2 = {B(): None, B(): None} + return res1+len(d2) + res = interpret(f, []) + assert res == 2 From cfbolz at codespeak.net Thu Dec 1 18:25:27 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 1 Dec 2005 18:25:27 +0100 (CET) Subject: [pypy-svn] r20519 - pypy/dist/pypy/doc Message-ID: 
<20051201172527.C0D4D27B5E@code1.codespeak.net> Author: cfbolz Date: Thu Dec 1 18:25:26 2005 New Revision: 20519 Modified: pypy/dist/pypy/doc/dynamic-language-translation.txt Log: use the new graphviz directive that I introduced. it does the right thing when producing a pdf and an html file: for pdfs it converts the dot file to pdf and embeds it, for html it uses pngs. Modified: pypy/dist/pypy/doc/dynamic-language-translation.txt ============================================================================== --- pypy/dist/pypy/doc/dynamic-language-translation.txt (original) +++ pypy/dist/pypy/doc/dynamic-language-translation.txt Thu Dec 1 18:25:26 2005 @@ -837,7 +837,7 @@ \ \ | / / `--------`-- Bottom ------' -.. image:: image/lattice1.png +.. graphviz:: image/lattice1.dot Here is the part about instances and nullable instances, assuming a simple class hierarchy with only two direct subclasses of ``object``: @@ -866,7 +866,7 @@ \ / / Bottom -.. image:: image/lattice2.png +.. graphviz:: image/lattice2.dot All list terms for all variables are unordered: @@ -881,7 +881,7 @@ \ \ \ / / / '------------'--- None ----'------------' -.. image:: image/lattice3.png +.. graphviz:: image/lattice3.dot The Pbcs form a classical finite set-of-subsets lattice. 
In practice, we consider ``None`` as a degenerated prebuilt constant, so the None From bea at codespeak.net Thu Dec 1 18:26:22 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Thu, 1 Dec 2005 18:26:22 +0100 (CET) Subject: [pypy-svn] r20520 - pypy/extradoc/talk/22c3 Message-ID: <20051201172622.10B7A27B5F@code1.codespeak.net> Author: bea Date: Thu Dec 1 18:26:20 2005 New Revision: 20520 Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt Log: new disgusting file ;-) Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt ============================================================================== --- pypy/extradoc/talk/22c3/agility_v1.txt.txt (original) +++ pypy/extradoc/talk/22c3/agility_v1.txt.txt Thu Dec 1 18:26:20 2005 @@ -3,22 +3,14 @@ Abstract: ------------- - This paper uses an evolutionary approach, a walkthrough of the history of the PyPy project, touching down on different aspects of agility. -In the founding of the community there was a clear vision of agile development -and sprints as the key method. The idea of EU-funding and the process in -achieving this created a paradox: how to keep the agile open source community -structure with key aspects of the project being funded through the EU. - -This then exposed the project to formal requirements planning, estimation, -resource tracking and the challenge was to design a process in which a balance -was struck between community and consortium, between a developer driven process -and formal organizational structure. +In the founding of the community there was a clear vision of agile development and sprints as the key method. The idea of EU-funding and the process in achieving this created a paradox: how to keep the agile open source community structure with key aspects of the project being funded through the EU. 
+ +This then exposed the project to formal requirements planning, estimation, resource tracking and the challenge was to design a process in which a balance was struck between community and consortium, between a developer driven process and formal organizational structure. -The evolution of the project - from a non profit Open Source initiative to a -partial funded EU project - made the growth of Agile Business possible. +The evolution of the project - from a non profit Open Source initiative to a partially funded EU project - made possible the growth of Agile Business. The vision: the creation of an OSS community @@ -27,75 +19,31 @@ Founding PyPy: Agile approaches: sprinting -++++++++++++++++++++++++++++ -The first drafts of ideas of what was to become PyPy started during a sprint, -held in Hildesheim in February 2003. It was inspired by a practice used by -other Python oriented projects such as Zope. Originally the sprint methodology -used in the Python community grew from practices within Zope Corporation. Their -definition of a sprint is: "a two-day or three-day focused development session, -in which developers pair off together in a room and focus on building a -particular subsystem". - -It was decided early that sprinting was to be the key technique in creating a -collaborative and open community. The early PyPy sprints moved around, being -organised by core developers together with local Pythonistas and soon to become -PyPy'ers in Louvain LaNeuve, Gothenburg, Vilnius and Amsterdam. This strategy -helped to create and later strengthen the growing community. Sprints gave -the opportunity to both help, participate and influence the idea of PyPy. - -Sprints as such are not part of the Agile portfolio of techniques, the closest -thing to it comes from Scrum who names the 30 days long programming iterations -"sprints", covering a certain increment. 
In the Scrum method considerable -effort is placed into performing the sprint planning as well as creating and -documenting the "sprint backlog" which is then fed back into the "Product -backlog".The sprint ends with a "sprint review" - an informal planning session -in which the team decides on upcoming work. There are also techniques in which -the team looks at ways to improve the development methodology and future -sprints. - -The practise used within the Python community and by Zope Corporation is an -adoption of just this aspect of Scrum - not the entire Scrum methodology which -covers more than just sprinting. In the Zope community - and even in the early -days of PyPy sprints where limited to 2-3 days, which in some sense reduces the -need for rigourous planning beforehand but also the need to review the process. -We will come back to this subject later on. - -Why did PyPy choose sprinting as a key technique? It is a method that fits -distributed teams well because it gets the team focused around clear (and -challenging) goals while working collaboratively (pair-programming, status -meetings, discussions etc) as well as acceleratedly (short increments and -tasks, "doing" and testing instead of long start ups of planning and -requirement gathering). This means that most of the time a sprint is a great -way of getting results, but also to get new people aquinted with the codebase. -It is also a great method for dissemination and learning within the team -because of the pair-programming. - -Agile approaches: Test-Driven Development ------------------------------------------ - -Test-driven development is the cornerstone of a developer-driven process. Seen -from an Agile Manifesto perspective it is right up there as one of the key -elements since it puts focus on producing working code, rather than plans and -papers and faulty software. 
- -Seen from an Open Source community perspective it is a vital strategy - -especially when combined with an transparent open process in which anyone -interested can participate - if only for just a few days at a sprint. Some of -the key problems identified by Frederick P. Brooks in the latest version of "The -mythical Man-month" (unfortunately still very actual today) are estimating -correct amount of time for communication and testing/debugging. Automated -test-driven development and version control will solve many of those problems, -especially in the hands of a team sprinting its way through the Python -community - welcoming everyone to participate. - -The early choice of the PyPy team was an almost extreme test driven approach. -Experiences from the Subversion project, merged with the results of the py.lib -(Holger????py.test - your other hobby project ;-) created a stable platform for -the early development efforts. XXX (cf) is the py-lib really as old as PyPy? +The first drafts of ideas of what was to become PyPy started during a sprint, held in Hildesheim in February 2003. It was inspired by a practice used by other Python oriented projects such as Zope. Originally the sprint methodology used in the Python community grew from practices within Zope Corporation. Their definition of a sprint is "a two-day or three-day focused development session, in which developers pair off together +in a room and focus on building a particular subsystem". + +It was decided early that sprinting was to be the key technique in creating a collaborative and open community. The early PyPy sprints moved around, being organised by core developers together with local Pythonistas and soon to become PyPy'ers in Louvain-la-Neuve, Gothenburg, Vilnius and Amsterdam. This strategy helped to create as well as strengthen the growing community, and sprints gave the opportunity to both help, participate and influence the idea of PyPy. 
+ +Sprints as such are not part of the Agile portfolio of techniques; the closest thing to it comes from Scrum, which names the 30 days long programming iterations "sprints", covering a certain increment. In the Scrum method considerable effort is placed into performing the sprint planning as well as creating and documenting the "sprint backlog" which is then fed back into the "Product backlog". The sprint ends with a "sprint review" - an informal planning session in which the team decides on upcoming work; there are also techniques in which the team looks at ways to improve the development methodology and future sprints. + +The practice used within the Python community and by Zope Corporation is an adoption of just this aspect of Scrum - not the entire Scrum methodology, which covers more than just sprinting. Here - and even in the early days of PyPy - sprints were limited to 2-3 days, which in some sense reduces the need for rigorous planning beforehand but also the need to review the process. We will come back to this subject later on. + +Why did PyPy choose sprinting as a key technique? It is a method that fits distributed teams well because it gets the team focused around clear (and challenging) goals while working collaboratively (pair programming, status meetings, discussions etc) as well as at an accelerated pace (short increments and tasks, "doing" and testing instead of long start ups of planning and requirement gathering). This means that most of the time a sprint +is a great way of getting results, but also to get new people acquainted with +the codebase. It is also a great method for dissemination and learning within +the team because of the pair programming. + +Agile approaches: test-driven development + +Test-driven development is the cornerstone of a developer driven process. Seen from an Agile Manifesto perspective it is right up there as one of the key elements since it puts focus on producing working code, rather than plans and papers and faulty software. 
+ +Seen from an Open Source community perspective it is a vital strategy - especially when combined with a transparent open process in which anyone interested can participate - if only for just a few days at a sprint. Some of the key problems identified by Frederick P. Brooks in the latest version of "The mythical Man-month" (unfortunately still very relevant today) are estimating the correct amount of time for communication and +testing/debugging. Automated test-driven development and version control will solve many of those problems, especially in the hands of a team sprinting its way through the Python community - welcoming everyone to participate. -These two agile approaches combined (sprints and test driven development) and -the way they where implemented where the building block of the PyPy community. +The early choice of the PyPy team was an almost extreme test driven approach. Experiences from the Subversion project, merged with the results of the py.lib (Holger????py.test - your other hobby project ;-) created a stable platform for the early development efforts. + +These two agile approaches combined (sprints and test driven development) and the way they were implemented were the building blocks of the PyPy community. 
To be able to answer that question - two other -questions needed to be understood and answered: - -"Why do you want money - aren't you guys non-profit?": -+++++++++++++++++++++++++++++++++++++++++++++++++++++++ - -There had been a growing interest from the European Commission, IST division, -to look closer at the Open Source world and its achievements. Several funded -research projects in the 5th framework programme studied the phenomenon -(FLOSS-POLS, FLOSS) - its organization, business models and licensings. A few -other funded software projects used Open Source in their work as tools -(languages and applications). There was no previous experience of an Open -Source community making a bid for funding. - -The areas in the 6th Framework programme, second call fitted well enough with -the objectives of PyPy (XXX). The idea of strengthening the european software -development companies and businesses with supporting an open source language -implementation was new but appealing to the EU. But being an Open Source -project wasn't enough - the challenges and the idea of an flexible, -configurable "translator" or "compiler" met the research targets of the FP6, as -well as trying out and documenting the agile methodology being used.The EU -wanted the PyPy team to very concretely show how funding PyPy would have an -strategic impact for Europe. - -"Why do we want money - isn't OSS non-profit?": -+++++++++++++++++++++++++++++++++++++++++++++++ - -There was of course the risk of alienating parts of the Open Source community -that had evolved around PyPy, not to mention the "colleagues" working with the -other Python implentation projects. To make a bid for funding for core -developers and trying to find a model to channel funding for others to be able -to participate in sprints was the idea. The decision to stay true to the vision -of working agile and the strategy to strengtening the community via eu-funding -was the key. 
Previously, all sprints from 2003 and onwards had been funded -privately by the participants. The idea of using eu-funding to make sure that -more people could contribute and participate in sprints made sure thar the -project wouldn?t abruptly change it?s nature and that contribution wouldn?t be -exploited. In the end the response was somewhat opposite - other OSS projects -became curious - "PyPy had opened a new market" (Paul Everitt, Zope Europe). - -Acting on the answer to these questions proved to be a more difficult task. The -entire proposal and negotiation process took over a year (Autumn 2003 to Dec -2004 Holger???). Creating the formal requirements, the description of work, -had not previously been a part of the development process. Drafting the -high-level requirements (in total 14 workpackages and 58 deliverables) was made -during sprints as well as distributed between sprints. This first eu-related -work have been useful for the project and the community, clearly stating the -idea of the PyPy, a design document on a high level - helping others better -understand the vision to be implemented. - -Unfortunately the negotiations got stuck in organizational limbo and the -project is still suffering from the effects of this even today. The vision of -funding contribution during and between sprints to people inside and outside of -the formal funding project structure was based on a neutral non-profit party - -Python Business Forum. This solution wasn't seen as realistic or feasible by -the EU. The agile approach, keeping the process developer driven as much as -possible, needed to be restructured. +In XXXX the idea of trying to get EU-funding for the project was identified. The community stretched outside of the regular Open Source world to try to gather as much information and contacts as possible in order to answer the question: "Should we go for it?" 
To be able to answer that question - two other questions needed to be understood and answered: + +"Why do you want money - aren't you guys non-profit?": +There had been a growing interest from the European Commission, IST division, to look closer at the Open Source world and its achievements. Several funded research projects in the 5th framework programme studied the phenomenon (FLOSS-POLS, FLOSS) - its organization, business models and licensing. A few other funded software projects used Open Source in their work as tools (languages and applications). There was no previous experience of an Open Source community making a bid for funding. + +The areas in the 6th Framework programme, second call, fitted well enough with the objectives of PyPy (xxxxx). The idea of strengthening the European software development companies and businesses by supporting an open source language implementation was new but appealing to the EU. But being an Open Source project wasn't enough - the challenges and the idea of a flexible, configurable "translator" or "compiler" met the research targets of the FP6, as well as trying out and documenting the agile methodology being used. The EU wanted the PyPy team to very concretely show how funding PyPy would have a strategic impact for Europe. + +"Why do we want money - isn't OSS non-profit?": +There was of course the risk of alienating parts of the Open Source community that had evolved around PyPy, not to mention the "colleagues" working with the other Python implementation projects. The idea was to make a bid for funding for core developers and to find a model to channel funding for others to be able to participate in sprints. The decision to stay true to the vision of working agile and the strategy of strengthening the community via EU funding was the key. Previously, all sprints from 2003 and onwards had been funded privately by the participants. 
The idea of using EU funding to make sure that more people could contribute and participate in sprints made sure that the project wouldn't abruptly change its nature and that contribution wouldn't be exploited. In the end the response was somewhat opposite - other OSS projects became curious - "PyPy had opened a new market" (Paul Everitt, Zope Europe). + +Acting on the answer to these questions proved to be a more difficult task. The entire proposal and negotiation process took over a year (Autumn 2003 to Dec 2004 Holger???). +Creating the formal requirements, the description of work, had not previously been a part of the development process. Drafting the high-level requirements (in total 14 workpackages and 58 deliverables) was done during sprints as well as distributed between sprints. This first EU-related work has been useful for the project and the community, clearly stating the idea of PyPy, a design document on a high level - helping others better understand the vision to be implemented. + +Unfortunately the negotiations got stuck in organizational limbo and the project is still suffering from the effects of this even today. The vision of funding contribution during and between sprints to people inside and outside of the formal funding project structure was based on a neutral non-profit party - Python Business Forum. This solution wasn't seen as realistic or feasible by the EU. The agile approach, keeping the process developer driven as much as possible, needed to be restructured. The Project: consortium and companies within a OSS community structure ---------------------------------------------------------------------- -Forced entrepreneurship: +To replace the Python Business Forum solution, some of the core developers entered a state of forced entrepreneurship and created two companies: Merlinux and Tismerysoft. 
What first felt like an EU-related obstacle became an opportunity, but with an added load of legal and organizational responsibilities, in itself adding inertia to an agile process. -Creating the consortium: +Other adjustments were made, such as recruiting companies with previous EU project experience that were not part of the original PyPy community. There was also the recruitment of a company totally unrelated to the developer work being done in the PyPy community - one focused on process management and designing learning processes, with a background from the Chaospilot school in Aarhus, Denmark. When creating the formal consortium of seven partners, new cultures and perspectives were mixed with the strong collaborative Open Source core team, adding new complexities in communication and cooperation. Getting the new "playmates" to adopt the vision, culture and spirit of the original idea, and holding true to it during the work on the proposal and negotiation process, was a challenge indeed. -Formalizing aspects of the community: -- roles and responsibilities +The formal project organization required by the EU imposed new restrictions on the previous agile process. Roles and responsibilities were staked out, conforming with the requirements of the roles but delegating as much as possible of the responsibilities and decision-making to the core developers. The strategy was to keep the "conceptual integrity" (Brooks) of the vision and the idea in the hands of the core developers. The result was just that, but also an added workload when the project got started, which had a negative effect - adding inertia and hindering the agility of the process. The challenge: balancing agile OSS community structures with EU requirements ------------------------------------------------------------------------------ -Sprints - the key agile approach: +The designed agile development process in the funded work of the PyPy project centers around the sprints (see picture - sprint process).
A budget had been created to fund contributions from the community (the non-consortium members) and the strategy of the project was to sprint every sixth week, moving around and making it possible for developers to get in touch with the project. Sprinting in connection with major conferences was also a key strategy. + +The nature of sprints changed. The need to meet milestones of the EU-funded deliverables and the need to keep an open sprint process, still welcoming newcomers into the world of PyPy, made the sprints longer (7 days with a break day in the middle) but also changed their character. The team started to distinguish between sprints open for all to attend, without prior PyPy experience, and sprints requiring PyPy experience. Tutorials, start-up planning meetings as well as daily status meetings evolved; the latest additions to the sprints are closure planning meetings (planning the work between sprints) and work-groups - a version of pair-programming in groups. + +Another effect of sprinting within the EU structure is that the sprint becomes a forum for non-development work - coordinating and tracking the project. The challenge here is not to affect the main work and "disturb" visiting developers with EU-related work. It could also be argued that the prolonged sprints could make it more difficult for non-consortium members to attend full time, disturbing other engagements etc. + +The project continues to try to enhance the method of sprinting, evaluating feedback from sprint participants. Maybe the implementation within the PyPy team is slowly conforming to the Scrum standard of sprinting, but not as a conscious effort?
Physical persons: From hpk at codespeak.net Thu Dec 1 18:41:48 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 1 Dec 2005 18:41:48 +0100 (CET) Subject: [pypy-svn] r20521 - pypy/extradoc/talk/22c3/plots Message-ID: <20051201174148.627D227B5E@code1.codespeak.net> Author: hpk Date: Thu Dec 1 18:41:39 2005 New Revision: 20521 Added: pypy/extradoc/talk/22c3/plots/ pypy/extradoc/talk/22c3/plots/irc_messages.png (contents, props changed) pypy/extradoc/talk/22c3/plots/loc.png (contents, props changed) pypy/extradoc/talk/22c3/plots/number_files.png (contents, props changed) pypy/extradoc/talk/22c3/plots/post.png (contents, props changed) pypy/extradoc/talk/22c3/plots/statistic_irc_log.png (contents, props changed) pypy/extradoc/talk/22c3/plots/subscribers.png (contents, props changed) pypy/extradoc/talk/22c3/plots/webaccess.png (contents, props changed) Log: added a snapshot of the current plots for perusing them in the paper Added: pypy/extradoc/talk/22c3/plots/irc_messages.png ============================================================================== Binary file. No diff available. Added: pypy/extradoc/talk/22c3/plots/loc.png ============================================================================== Binary file. No diff available. Added: pypy/extradoc/talk/22c3/plots/number_files.png ============================================================================== Binary file. No diff available. Added: pypy/extradoc/talk/22c3/plots/post.png ============================================================================== Binary file. No diff available. Added: pypy/extradoc/talk/22c3/plots/statistic_irc_log.png ============================================================================== Binary file. No diff available. Added: pypy/extradoc/talk/22c3/plots/subscribers.png ============================================================================== Binary file. No diff available.
Added: pypy/extradoc/talk/22c3/plots/webaccess.png ============================================================================== Binary file. No diff available. From cfbolz at codespeak.net Thu Dec 1 18:56:24 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 1 Dec 2005 18:56:24 +0100 (CET) Subject: [pypy-svn] r20522 - pypy/extradoc/talk/22c3 Message-ID: <20051201175624.2713127B61@code1.codespeak.net> Author: cfbolz Date: Thu Dec 1 18:56:22 2005 New Revision: 20522 Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt Log: fix rest/typos (again) Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt ============================================================================== --- pypy/extradoc/talk/22c3/agility_v1.txt.txt (original) +++ pypy/extradoc/talk/22c3/agility_v1.txt.txt Thu Dec 1 18:56:22 2005 @@ -1,49 +1,102 @@ -Agile Business and EU funding: sprint methodology in funded OSS project ------------------------------------------------------------------------ +========================================================================= +Agile Business and EU funding: sprint methodology in a funded OSS project +========================================================================= Abstract: -------------- -This paper uses an evolutionary approach, a walkthrough of the history of the -PyPy project, touching down on different aspects of agility. +========== -In the founding of the community there was a clear vision of agile development and sprints as the key method. The idea of EU-funding and the process in achieving this created a paradox: how to keep the agile open source community structure with key aspects of the project being funded through EU. +This paper uses an evolutionary approach, a walkthrough through the history of +the PyPy project, touching down on different aspects of agility. 
-This then exposed the project to formal requirements planning, estimation,resource tracking and the challenge was to design a process in which a balance was struck between community and consortium, between a developer driven process and formal organinizational structure. +In the founding of the community there was a clear vision of agile development +and sprints as the key method. The idea of EU-funding and the process in +achieving this created a paradox: how to keep the agile open source community +structure with key aspects of the project being funded through the EU. + +This then exposed the project to formal requirements planning, estimation, +resource tracking and the challenge was to design a process in which a balance +was struck between community and consortium, between a developer driven process +and formal organizational structure. -The evolution of the project - from a non profit Open Source initiative to a partial funded EU project - made possible the growth of Agile Business. +The evolution of the project - from a non +profit Open Source initiative to a +partially funded EU project - made possible the growth of Agile Business. The vision: the creation of an OSS community --------------------------------------------- +============================================== Founding PyPy: Agile approaches: sprinting +---------------------------- -The first drafts of ideas of what was to become PyPy started during a sprint, held in Hildesheim in February 2003. Inspired by this practice, used by other Python oriented projects such as Zope. Originally the sprint methodology used in the Python community grew from practices within Zope Corporation. Their definition of a sprint is "two-day or three-day focused development session, in which developers pair off together -in a room and focus on building a particular subsystem". +The first drafts of ideas of what was to become PyPy started during a sprint, +held in Hildesheim in February 2003.
The sprint was inspired by practices used +by other Python oriented projects such as Zope. Originally the sprint +methodology used in the Python community grew from practices within Zope +Corporation. Their definition of a sprint is a "two-day or three-day focused +development session, in which developers pair off together in a room and focus +on building a particular subsystem". + +It was decided early that sprinting was to be the key technique in creating a +collaborative and open community. The early PyPy sprints moved around, being +organised by core developers together with local Pythonistas and soon-to-become +PyPyers in Louvain-la-Neuve, Gothenburg, Vilnius and Amsterdam. This strategy +helped to create and later strengthen the growing community. Sprints gave +the opportunity to help, participate and influence the idea of PyPy. + +Sprints as such are not part of the Agile portfolio of techniques; the closest +thing to them comes from Scrum, which names its 30-day-long programming iterations +"sprints", covering a certain increment. In the Scrum method considerable +effort is put into performing the sprint planning as well as creating and +documenting the "sprint backlog", which is then fed back into the "Product +backlog". The sprint ends with a "sprint review" - an informal planning session +in which the team decides on upcoming work. There are also techniques in which +the team looks at ways to improve the development methodology and future +sprints. + +The practice used within the Python community and by Zope Corporation is an +adaptation of just this aspect of Scrum - not the entire Scrum methodology, which +covers more than just sprinting. Here - and even in the early days of PyPy - +sprints were limited to 2-3 days, which in some sense reduces the need for +rigorous planning beforehand but also the need to review the process. We will +come back to this subject later on. + +Why did PyPy choose sprinting as a key technique?
It is a method that fits +distributed teams well because it gets the team focused around clear (and +challenging) goals while working collaboratively (pair-programming, status +meetings, discussions etc.) as well as at an accelerated pace (short increments and +tasks, "doing" and testing instead of long startups of planning and +requirement gathering). This means that most of the time a sprint is a great +way of getting results, but also of getting new people acquainted with the codebase. +It is also a great method for dissemination and learning within the team +because of the pair-programming. + +Agile approaches: test-driven development +----------------------------------------- + +Test-driven development is the cornerstone of a developer driven process. Seen +from an Agile Manifesto perspective it is right up there as one of the key +elements, since it puts the focus on producing working code rather than plans, +papers and faulty software. + +Seen from an Open Source community perspective it is a vital strategy - +especially when combined with a transparent open process in which anyone +interested can participate - if only for just a few days at a sprint. Some of +the key problems identified by Frederick P. Brooks in the latest version of +"The Mythical Man-Month" (unfortunately still very relevant today) are estimating +the correct amount of time for communication and testing/debugging. Automated +test-driven development and version control will solve many of those problems, +especially in the hands of a team sprinting its way through the Python +community - welcoming everyone to participate. + +The early choice of the PyPy team was an almost extreme test-driven approach. +Experiences from the Subversion project, merged with the results of the py.lib +(Holger????py.test - your other hobby project ;-) XXX (cf): the py-lib is not +so old, right?) created a stable platform for the early development efforts.
-It was decided early that sprinting was to be the key technique in creating a collaborative and open community. The early PyPy sprints moved around, being organised by core developers together with local Pythonistas and soon to become PyPy:ers in LOvain Le Neuve, Gothenburg,Vilnius and Amsterdam.This strategy helped to create as well as strengthen the growing community and sprints gave the opportunity to both help, participate and influence the idea of PyPy. - -Sprints as such is not part of the Agile portfolio of techniques, the closes thing to it comes from Scrum who names the 30 days long programming iterations "sprints", covering a certain increment. In the Scrum method considerable effort is placed into performing the sprint planning as well as creating and documenting the "sprint backlog" which is then feedbacked into the "Product backlog".The sprint ends with a "sprint review" - an informal planning session in which the team decides on upcoming work, there are also techniques in which the team looks at ways to improve the development methodology and future sprints. - -The practise used within the Python community and by Zope Corporation is an adoption of just this aspect of Scrum - not the entire Scrum methodology which covers more than just sprinting. Here - and even in the early days of PyPy sprints where limited to 2-3 days, which in some sense reduces the need for rigourous planning beforehand but also the need to review the process. We will come back to this subject later on. - -Why did PyPy choose sprinting as a key technique? It is a method that fits distributed teams well because it gets the team focused around clear (and challenging) goals while working collarobative (pairprogramming, status meeting, discussions etc) as well as accelerated (short increments and tasks, "doing" and testing instead of long start ups of planning and requirement gathering). 
This means that most of the time a sprint -is a great way of getting results, but also to get new people aquinted with -the codebase. It is also a great method for dissemination and learning within -the team because of the pairprogramming. - -Agile approaches: testdriven development - -Testdriven development is the cornerstone of a developer driven process. Seen from an Agile Manifesto perspective it is right up there as one of the key elements since it puts focus on producing working code, rather than plans and papers and faulty software. - -Seen from an Open Source community perspective it is a vital strategy - especially when combined with an transparent open process in which anyone interested can participate - if only for just a few days at a sprint. Some of the key problems identified by Frederick P Brooks in the latest version of "The mythical Man-month" (unfortunately still very actual today) are estimating correct amount of time for communication and -testing/debugging. Automated test-driven development and version control will solve many of those problems, especially in the hands of a team sprinting its way through the Python community - welcoming everyone to participate. - -The early choice of the PyPy team was an almost extreme test driven approach. Experiences from the Subversion project, merged with the results of the py.lib (Holger????py.test - your other hobby project ;-) created a stable platform for the early development efforts. - -These two agile approaches combined (sprints and test driven development) and the way they where implemented where the building block of the PyPy community. +These two agile approaches combined (sprints and test driven development) and +the way they where implemented where the building block of the PyPy community. 
Community structure: - transparent communication @@ -52,50 +105,138 @@ The idea: Framework 6 programme IST funding for OSS work --------------------------------------------------------- +========================================================== -In XXXX the idea of trying to get EU-funding for the project was identified. The community stretched outside of the regular Open Source world to try to gather as much information and contacts as possible in order to answer the question: "Should we go for it?" To be able to answer that question - two other questions needed to be understood and answered: +In XXXX the idea of trying to get EU-funding for the project was identified. +The community stretched outside of the regular Open Source world to try to +gather as much information and contacts as possible in order to answer the +question: "Should we go for it?" To be able to answer that question - two other +questions needed to be understood and answered: "Why do you want money - aren't you guys non-profit?": +------------------------------------------------------
But being an Open Source project wasn't enough - the challenges and the idea of an flexible, configurable "translator" or "compiler" met the research targets of the FP6, as well as trying out and documenting the agile methodology being used.The EU wanted the PyPy team to very concrete show how funding PyPy would have an strategic for Europe. +There had been a growing interest from the European Commission, IST division, +to look closer at the Open Source world and its achievements. Several funded +research projects in the 5th framework programme studied the phenomenon +(FLOSS-POLS, FLOSS) - its organization, business models and licensing. A few +other funded software projects used Open Source in their work as tools +(languages and applications). There was no previous experience of an Open +Source community making a bid for funding. + +The areas in the 6th Framework programme, second call, fitted well enough with +the objectives of PyPy (XXX). The idea of strengthening European software +development companies and businesses by supporting an open source language +implementation was new but appealing to the EU. But being an Open Source +project wasn't enough - the challenges and the idea of a flexible, +configurable "translator" or "compiler" met the research targets of the FP6, as +well as trying out and documenting the agile methodology being used. The EU +wanted the PyPy team to very concretely show how funding PyPy would have a +strategic impact for Europe. "Why do we want money - isn't OSS non-profit?": +------------------------------------------------
The decision to stay true to the vision of working agile and the strategy to strengtening the community via eu-funding was the key. Previously, all sprints from 2003 and onwards had been funded privately by the participants. The idea of using eu-funding to make sure that more people could contribute and participate in sprints made sure thar the project wouldn't abruptly change it's nature and that contribution wouldn't be exploited. In the end the response was somewhat opposite - other OSS projects became curious - "PyPy had opened a new market" (Paul Everitt, Zope Europe). - -Acting on the answer to these questions proved to be a more difficult task. The entire proposal and negotiation process took over a year (Autumn 2003 to Dec 2004 Holger???). -Creating the formal requirements, the description of work, had not previously been a part of the development process. Drafting the high-level requirements (in total 14 workpackages and 58 deliverables) was made during sprints as well as distributed between sprints. This first eu-related work have been useful for the project and the community, clearly stating the idea of the PyPy, a design document on a high level - helping others better understand the vision to be implemented. +------------------------------------------------ -Unfortunately the negotiations got stuck in organizational limbo and the project is still suffering from the effects of this even today. The vision of funding contribution during and between sprints to people inside and outside of the formal funding project structure was based on a neutral non-profit party - Python Business Forum. This solution wasn't seen as realistic or feasible by the EU. The agile approach, keeping the process developer driven as much as possible, needed to be restructured. +There was of course the risk of alienating parts of the Open Source community +that had evolved around PyPy, not to mention the "colleagues" working with the +other Python Implementation Projects.
The idea was to make a bid for funding for core +developers and to find a model to channel funding so that others would be able +to participate in sprints. The decision to stay true to the +vision of working agile and the strategy of strengthening the community via +EU-funding was the key. Previously, all sprints from 2003 and onwards had been +funded privately by the participants. The idea of using EU-funding to make sure +that more people could contribute and participate in sprints made sure that the +project wouldn't abruptly change its nature and that contributors wouldn't be +exploited. In the end the response was somewhat the opposite - other OSS projects +became curious - "PyPy had opened a new market" (Paul Everitt, Zope Europe). + +Acting on the answer to these questions proved to be a more difficult task. The +entire proposal and negotiation process took over a year (Autumn 2003 to Dec +2004 Holger???). Creating the formal requirements, the description of work, +had not previously been a part of the development process. Drafting the +high-level requirements (in total 14 workpackages and 58 deliverables) was done +during sprints as well as distributed between sprints. This first EU-related +work has been useful for the project and the community, clearly stating the +idea of PyPy - a high-level design document helping others better +understand the vision to be implemented. + +Unfortunately the negotiations got stuck in organizational limbo and the +project is still suffering from the effects of this even today. The vision of +funding contributions during and between sprints for people inside and outside of +the formal funding project structure was based on a neutral non-profit party - +the Python Business Forum. This solution wasn't seen as realistic or feasible by +the EU. The agile approach, keeping the process developer driven as much as +possible, needed to be restructured (XXX cf: I don't get the last sentence).
The Project: consortium and companies within a OSS community structure ---------------------------------------------------------------------- -In order to solve the Python Business Forum solution some of the core developers entered a state of forced entrepreneurship and created two companies: Merlinux and Tismerysoft. What first felt as an EU-related obstacle became an opportunity, but with an added load of legal and organizational responsibilities, in it self adding inertia to an agile process. - -Other adjustments, recruiting companies with previous EU project experiences and not part of the original PyPy community was done. There was also an recruitment of a company totally unrelated to the developer work being done in the PyPy community - focused on process management and designing learning processes with a background from the Chaospilot school in Aarhus, Denmark. When creating the formal consortium of seven partners, new cultures and perspectives were mixed with the strong collaborative Open Source core team, adding new complexities in communication and cooperation. Getting the new "playmates" to adopt the vision, culture and spirit of the original idea and holding true to it during the work on the proposal and negotiation process was a challenge indeed. - -The formal project organization required by the EU imposed new restrictions on the previous agile process. Roles and responsibilities where staked out, conforming with the requirements of the roles but delegating as much as possible of the responsibilities and decision-making to the core developers. The strategy was to keep "conceptual integrity" (Brookes) of the vision and the idea in the hands of the core developers. The result was just that but also an added workload when the project got started, which had a negative effect - adding inertia and hindering the agility of the process. 
+In order to replace the Python Business Forum solution some of the core +developers entered a state of forced entrepreneurship and created two +companies: Merlinux and Tismerysoft. What first felt like an EU-related obstacle +became an opportunity, but with an added load of legal and organizational +responsibilities, in itself adding inertia to an agile process. + +Other adjustments were made, such as recruiting companies with previous EU +project experience that were not part of the original PyPy community. There was also the +recruitment of a company totally unrelated to the developer work being done in +the PyPy community - focused on process management and designing learning +processes, with a background from the Chaospilot school in Aarhus, Denmark. When +creating the formal consortium of seven partners, new cultures and perspectives +were mixed with the strong collaborative Open Source core team, adding new +complexities in communication and cooperation. Getting the new "playmates" to +adopt the vision, culture and spirit of the original idea, and holding true to +it during the work on the proposal and negotiation process, was a challenge +indeed. + +The formal project organization required by the EU imposed new restrictions on +the previous agile process. Roles and responsibilities were staked out, +conforming with the requirements of the roles but delegating as much as +possible of the responsibilities and decision-making to the core developers. +The strategy was to keep the "conceptual integrity" (Brooks) of the vision and the +idea in the hands of the core developers. The result was just that, but also an +added workload when the project got started, which had a negative effect - +adding inertia and hindering the agility of the process.
The challenge: balancing agile OSS community structures with EU requirements ------------------------------------------------------------------------------ -The designed agile development process in the funded work of the PyPy project centers arund the sprints (see picture - sprint process). A budget had been created to fund contribution from the community (the non consortium members) and the strategy of the project was to sprint every 6th week, moving around and making it possible for developers to get in touch with the project. Sprinting in connection with major conferences was also a key strategy. - -The nature of sprints changed. The need to meet milestones of the EU-funded deliverables and the need to keep an open sprint process, still welcoming newcomers into the world of Pypy, made the sprints longer (7 days with a break day in the middle) but also changed the nature of the sprints. The team started to distuingish between sprints open for all to attend, without prior PyPy experience, and sprints requiring PyPy experience. Tutorials, start up planning meetings as well as daily status meetings evolved, the latest additions to the sprints are closure planning meetings (planning the work between sprints) and work-groups - a version of pair-programming in groups. - -Some other effects of sprinting within the EU-structure is that the sprint becomes a forum for non-development work - coordinating and tracking the project. The challenge here is not affecting the main work and "disturbing" visiting developers with EU-related work. It could also be argued that the prolonged sprints could possibly make it more difficult for non consortium members to attend the full time, disturbing other engagements etc. - -The project continues to try to enhance the method of sprinting, evaluating feedback from sprint participants. Maybe it the implementation within the PyPy team is slowly conforming to the Scrum standard of sprinting, but not as a conscious effort? 
+The designed agile development process in the funded work of the PyPy project +centers around the sprints (see picture - sprint process). A budget had been +created to fund contributions from the community (the non-consortium members) +and the strategy of the project was to sprint every sixth week, moving around and +making it possible for developers to get in touch with the project. Sprinting +in connection with major conferences was also a key strategy. + +The nature of sprints changed. The need to meet milestones of the EU-funded +deliverables and the need to keep an open sprint process, still welcoming +newcomers into the world of PyPy, made the sprints longer (7 days with a break +day in the middle) but also changed their character. The team started +to distinguish between sprints open for all to attend, without prior PyPy +experience, and sprints requiring PyPy experience. Tutorials, start-up planning +meetings as well as daily status meetings evolved; the latest additions to the +sprints are closing planning meetings (planning the work between sprints) and +work-groups - a version of pair-programming in groups. + +Another effect of sprinting within the EU structure is that the sprint +becomes a forum for non-development work - coordinating and tracking the +project. The challenge here is not to affect the main work and "disturb" +visiting developers with EU-related work. It could also be argued that the +prolonged sprints could make it more difficult for non-consortium +members to attend full time, disturbing other engagements etc. + +The project continues to try to enhance the method of sprinting, evaluating +feedback from sprint participants. Maybe the implementation within the PyPy +team is slowly conforming to the Scrum standard of sprinting, but not as a +conscious effort? Physical persons: Communication channels: Managing diversities: agile business - a successful marriage?
------------------------------------------------------------ +-------------------------------------------------------------- Agile EU-project: From arigo at codespeak.net Thu Dec 1 18:58:45 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 1 Dec 2005 18:58:45 +0100 (CET) Subject: [pypy-svn] r20523 - pypy/branch/somepbc-refactoring/pypy/translator/llvm Message-ID: <20051201175845.9B41D27B61@code1.codespeak.net> Author: arigo Date: Thu Dec 1 18:58:45 2005 New Revision: 20523 Removed: pypy/branch/somepbc-refactoring/pypy/translator/llvm/ Log: Removing llvm from the branch... From arigo at codespeak.net Thu Dec 1 18:59:21 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 1 Dec 2005 18:59:21 +0100 (CET) Subject: [pypy-svn] r20524 - pypy/branch/somepbc-refactoring/pypy/translator/llvm Message-ID: <20051201175921.30A2727B61@code1.codespeak.net> Author: arigo Date: Thu Dec 1 18:59:20 2005 New Revision: 20524 Added: pypy/branch/somepbc-refactoring/pypy/translator/llvm/ - copied from r20523, pypy/dist/pypy/translator/llvm/ Log: Copied the current llvm from the trunk. 
From hpk at codespeak.net Thu Dec 1 19:12:07 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 1 Dec 2005 19:12:07 +0100 (CET) Subject: [pypy-svn] r20525 - pypy/extradoc/talk/22c3 Message-ID: <20051201181207.4211A27B57@code1.codespeak.net> Author: hpk Date: Thu Dec 1 19:12:01 2005 New Revision: 20525 Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt Log: rewrote the abstract Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt ============================================================================== --- pypy/extradoc/talk/22c3/agility_v1.txt.txt (original) +++ pypy/extradoc/talk/22c3/agility_v1.txt.txt Thu Dec 1 19:12:01 2005 @@ -2,27 +2,35 @@ Agile Business and EU funding: sprint methodology in a funded OSS project ========================================================================= -Abstract: +Abstract ========== -This paper uses an evolutionary approach, a walkthrough through the history of -the PyPy project, touching down on different aspects of agility. +This paper uses an evolutionary approach, a walkthrough +through the history of the PyPy project, touching down on +different aspects of agility. Agility played a key role from +the beginning - the PyPy project started from some mails +between a few people, quickly had a first one-week "sprint" +from where it evolved into a structure that was able to carry +out a research project - and got accepted by the European +Union. During the course, two companies got founded and are +now growing and employing key developers. + +Technical Development is strongly rooted in open-source +contexts and this adds another agility aspect - dynamic +communication, cooperation and exchange with other people and +projects. 
+ +The process of getting EU-funding posed (and continues to pose) a +challenge to the community-rooted PyPy project: how to connect +agile open source culture with formal structures: exposition +to requirements like planning, budget estimation, work +distribution and resource tracking. After our first year we +are reasonably happy with the balance we strike between +organisations and EU funding on the one side and the developers +driving the technical aspects of the project on the other side. -In the founding of the community there was a clear vision of agile development -and sprints as the key method. The idea of EU-funding and the process in -achieving this created a paradox: how to keep the agile open source community -structure with key aspects of the project being funded through EU. -This then exposed the project to formal requirements planning, estimation, -resource tracking and the challenge was to design a process in which a balance -was struck between community and consortium, between a developer driven process -and formal organinizational structure. - -The evolution of the project - from a non profit Open Source initiative to a -partial funded EU project - made possible the growth of Agile Business. - - -The vision: the creation of an OSS community +The vision: how the creation of an OSS community ============================================== Founding PyPy: From arigo at codespeak.net Thu Dec 1 19:46:46 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 1 Dec 2005 19:46:46 +0100 (CET) Subject: [pypy-svn] r20526 - in pypy/branch/somepbc-refactoring/pypy/translator/pyrex: . test Message-ID: <20051201184646.3C13827B5B@code1.codespeak.net> Author: arigo Date: Thu Dec 1 19:46:44 2005 New Revision: 20526 Modified: pypy/branch/somepbc-refactoring/pypy/translator/pyrex/genpyrex.py pypy/branch/somepbc-refactoring/pypy/translator/pyrex/test/test_pyrextrans.py Log: hack hack hack until the genpyrex tests pass again.
Modified: pypy/branch/somepbc-refactoring/pypy/translator/pyrex/genpyrex.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/translator/pyrex/genpyrex.py (original) +++ pypy/branch/somepbc-refactoring/pypy/translator/pyrex/genpyrex.py Thu Dec 1 19:46:44 2005 @@ -8,7 +8,8 @@ from pypy.objspace.flow.model import mkentrymap, last_exception from pypy.translator.annrpython import RPythonAnnotator from pypy.annotation.model import SomePBC -from pypy.annotation.classdef import isclassdef +from pypy.annotation.description import MethodDesc +from pypy.annotation.classdef import ClassDef from pypy.tool.uid import uid import inspect @@ -213,7 +214,7 @@ # make the function visible from the outside # under its original name args = ', '.join([var.name for var in fun.getargs()]) - self.putline("def %s(%s):" % (fun.name, args)) + self.putline("def %s(%s):" % (fun.name.split('.')[-1], args)) self.indent += 1 self.putline("return %s(%s)" % ( self.getfunctionname(function_object), args)) @@ -247,6 +248,10 @@ def get_type(self, var): if isinstance(var, Constant): + tp = var.value.__class__ + if self.annotator and tp in self.annotator.bookkeeper.descs: + classdesc = self.annotator.bookkeeper.descs[tp] + return classdesc.getuniqueclassdef() return type(var.value) elif self.annotator: return self.annotator.gettype(var) @@ -257,7 +262,7 @@ vartype = self.get_type(var) if vartype in (int, bool): prefix = "i_" - elif self.annotator and vartype in self.annotator.getuserclasses(): + elif isinstance(vartype, ClassDef): prefix = "p_" else: prefix = "" @@ -271,8 +276,7 @@ def _gettypename(self, vartype): if vartype in (int, bool): ctype = "int" - elif (self.annotator and vartype in self.annotator.getuserclasses() - and vartype.__module__ != '__builtin__'): + elif isinstance(vartype, ClassDef): ctype = self.getclassname(vartype) else: ctype = "object" @@ -286,9 +290,9 @@ return "" def getclassname(self,cls): - assert 
inspect.isclass(cls) - name = cls.__name__ - if issubclass(cls,Exception): + assert isinstance(cls, ClassDef) + name = cls.shortname + if cls.issubclass(self.annotator.bookkeeper.getuniqueclassdef(Exception)): return name return '%s__%x' % (name, uid(cls))#self._hackname(cls) @@ -316,7 +320,8 @@ elif isinstance(obj, Constant): import types if isinstance(obj.value,(types.ClassType,type)): - fff=self.getclassname(obj.value) + bk = self.annotator.bookkeeper + fff=self.getclassname(bk.getuniqueclassdef(obj.value)) elif isinstance(obj.value,(types.FunctionType, types.MethodType, type)): @@ -423,42 +428,41 @@ self.lines = [] self.indent = 0 delay_methods={} - for cls in self.annotator.getuserclassdefinitions(): + for cls in self.annotator.bookkeeper.classdefs: if cls.basedef: - bdef="(%s)" % (self.getclassname(cls.basedef.cls)) + bdef="(%s)" % (self.getclassname(cls.basedef)) else: bdef="" - self.putline("cdef class %s%s:" % (self.getclassname(cls.cls),bdef)) + self.putline("cdef class %s%s:" % (self.getclassname(cls),bdef)) self.indent += 1 empty = True for attr, attrdef in cls.attrs.items(): s_value = attrdef.s_value if isinstance(s_value, SomePBC): - for py_fun,fun_class in s_value.prebuiltinstances.items(): - assert isclassdef(fun_class), ("don't support " - "prebuilt constants like %r" % py_fun) - delay_methods.setdefault(fun_class,[]).append(py_fun) + assert s_value.getKind() is MethodDesc, ("don't support " + "prebuilt constants like %r" % (s_value,)) + for methdesc in s_value.descriptions: + meth_class = methdesc.originclassdef + delay_methods.setdefault(meth_class,[]).append(methdesc) else: vartype=self._gettypename(s_value.knowntype) self.putline("cdef public %s %s" % (vartype, attr)) empty = False list_methods=delay_methods.get(cls,[]) - for py_fun in list_methods: + for methdesc in list_methods: # XXX! 
- try: - fun = self.annotator.translator.flowgraphs[py_fun] - except KeyError: - continue # method present in class but never called - hackedargs = ', '.join([var.name for var in fun.getargs()]) - self.putline("def %s(%s):" % (py_fun.__name__, hackedargs)) + graph = methdesc.funcdesc.cachedgraph(None) + hackedargs = ', '.join([var.name for var in graph.getargs()]) + name = graph.name.split('.')[-1] + self.putline("def %s(%s):" % (name, hackedargs)) self.indent += 1 # XXX special case hack: cannot use 'return' in __init__ - if py_fun.__name__ == "__init__": + if name == "__init__": statement = "" else: statement = "return " self.putline("%s%s(%s)" % (statement, - self.getfunctionname(py_fun), + self.getfunctionname(graph.func), hackedargs)) self.indent -= 1 empty = False Modified: pypy/branch/somepbc-refactoring/pypy/translator/pyrex/test/test_pyrextrans.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/translator/pyrex/test/test_pyrextrans.py (original) +++ pypy/branch/somepbc-refactoring/pypy/translator/pyrex/test/test_pyrextrans.py Thu Dec 1 19:46:44 2005 @@ -4,8 +4,9 @@ from pypy.translator.pyrex.genpyrex import GenPyrex from pypy.objspace.flow.model import * from pypy.translator.tool.cbuild import build_cfunc +from pypy.translator.tool.cbuild import make_module_from_pyxstring from pypy.translator.tool.cbuild import skip_missing_compiler -from pypy.translator.translator import Translator +from pypy.translator.translator import TranslationContext from pypy.objspace.flow import FlowObjSpace from pypy import conftest @@ -113,7 +114,7 @@ class TestTypedTestCase: def getcompiled(self, func): - t = Translator(func, simplifying=True) + t = TranslationContext() # builds starting-types from func_defs argstypelist = [] if func.func_defaults: @@ -121,8 +122,23 @@ if isinstance(spec, tuple): spec = spec[0] # use the first type only for the tests argstypelist.append(spec) - t.annotate(argstypelist) - 
return skip_missing_compiler(t.pyrexcompile) + t.buildannotator().build_types(func, argstypelist) + name = func.func_name + + blobs = [] + for graph in t.graphs: + g = GenPyrex(graph) + g.by_the_way_the_function_was = graph.func # XXX + g.setannotator(t.annotator) + blobs.append(g.emitcode()) + code = g.globaldeclarations() # any 'g' is fine here... + if code: + blobs.insert(0, code) + pyxcode = '\n\n#_________________\n\n'.join(blobs) + + mod = skip_missing_compiler( + make_module_from_pyxstring, name, udir, pyxcode) + return getattr(mod, name) def test_set_attr(self): set_attr = self.getcompiled(snippet.set_attr) From hpk at codespeak.net Thu Dec 1 20:03:14 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 1 Dec 2005 20:03:14 +0100 (CET) Subject: [pypy-svn] r20527 - pypy/extradoc/talk/22c3 Message-ID: <20051201190314.48CC427B5F@code1.codespeak.net> Author: hpk Date: Thu Dec 1 20:03:13 2005 New Revision: 20527 Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt Log: a first rough review and refinements to the CCC paper Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt ============================================================================== --- pypy/extradoc/talk/22c3/agility_v1.txt.txt (original) +++ pypy/extradoc/talk/22c3/agility_v1.txt.txt Thu Dec 1 20:03:13 2005 @@ -30,49 +30,46 @@ driving the technical aspects of the project on the other side. -The vision: how the creation of an OSS community -============================================== - -Founding PyPy: +Agility in Technical Development and Organisation +========================================================= Agile approaches: sprinting ---------------------------- The first drafts of ideas of what was to become PyPy started during a sprint, -held in Hildesheim in February 2003. The sprint was inspired by practices, used -by other Python oriented projects such as Zope. Originally the sprint -methodology used in the Python community grew from practices within Zope -Corporation. 
Their definition of a sprint is "two-day or three-day focused +held at Trillke-Gut in Hildesheim in February 2003. The sprint was inspired by +practices used by other Python oriented projects such as Zope3. Originally the +sprint methodology used in the Python community grew from practices within Zope +Corporation. Their definition of a sprint was "two-day or three-day focused development session, in which developers pair off together in a room and focus on building a particular subsystem". -It was decided early that sprinting was to be the key technique in creating a -collaborative and open community. The early PyPy sprints moved around, being -organised by core developers together with local Pythonistas and soon to become -PyPyers in Louvain LaNeuve, Gothenburg, Vilnius and Amsterdam. This strategy -helped to create and later strengthen the growing community. Sprints gave -the opportunity to both help, participate and influence the idea of PyPy. - -Sprints as such are not part of the Agile portfolio of techniques, the closest -thing to it comes from Scrum who names the 30 days long programming iterations -"sprints", covering a certain increment. In the Scrum method considerable -effort is put into performing the sprint planning as well as creating and -documenting the "sprint backlog" which is then feedbacked into the "Product -backlog". The sprint ends with a "sprint review" - an informal planning session -in which the team decides on upcoming work. There are also techniques in which -the team looks at ways to improve the development methodology and future -sprints. - -The practise used within the Python community and by Zope Corporation is an -adoption of just this aspect of Scrum - not the entire Scrum methodology which -covers more than just sprinting. Here - and even in the early days of PyPy - -sprints where limited to 2-3 days, which in some sense reduces the need for -rigourous planning beforehand but also the need to review the process. 
We will -come back to this subject later on. +It turned out that sprinting got to be a key technique in evolving +the code base and the community/people around it. The early PyPy sprints moved +around, being organised by core developers together with local Pythonistas +in Louvain LaNeuve, Gothenburg, Vilnius and Amsterdam. Sprints gave +the opportunity to both help, participate and influence the ideas within PyPy. + +Sprints as such are not part of the traditional Agile +portfolio of techniques, the closest thing to it comes from +Scrum, which names the 30 days long programming iterations +"sprints", covering a certain increment. In the Scrum method +considerable effort is put into performing the sprint planning +as well as creating and documenting the "sprint backlog" which +is then fed back into the "Product backlog". The sprint ends +with a "sprint review" - an informal planning session in which +the team decides on upcoming work. There are also techniques +in which the team looks at ways to improve the development +methodology and future sprints. + +To our knowledge, most open-source projects are only sprinting up to a week +which also reflects the fact that many contributors give their time +and even money to gather and work together and thus it's different +from having fully funded people from one company working together. Why did PyPy choose sprinting as a key technique? It is a method that fits -distributed teams well because it gets the team focused around clear (and -challenging) goals while working collarobatively (pair-programming, status +distributed teams well because it gets the team focused around visible +challenging goals while working collaboratively (pair-programming, status meetings, discussions etc) as well as acceleratedly (short increments and tasks, "doing" and testing instead of long startups of planning and requirement gathering).
This means that most of the time a sprint is a great @@ -80,48 +77,62 @@ It is also a great method for dissemination and learning within the team because of the pair-programming. + Agile approaches: test-driven development ----------------------------------------- -Testdriven development is the cornerstone of a developer driven process. Seen -from an Agile Manifesto perspective it is right up there as one of the key -elements since it puts focus on producing working code, rather than plans and -papers and faulty software. +Test-driven development is a cornerstone for programming together +in a distributed team effectively. Seen from an Agile +Manifesto perspective it is right up there as one of the key +elements since it puts focus on producing working code, rather +than diagrams, plans and papers and then faulty software. -Seen from an Open Source community perspective it is a vital strategy - +Seen from an Open Source community perspective it is a vitalising strategy - especially when combined with a transparent open process in which anyone interested can participate - if only for just a few days at a sprint. Some of the key problems identified by Frederick P. Brooks in the latest version of "The Mythical Man-Month" (unfortunately still very relevant today) are estimating -correct amount of time for communication and testing/debugging. Automated -testing development and version control will solve many of those problems, +the correct amount of time for communication and testing/debugging. Automated +testing development and version control help with many of those problems, especially in the hands of a team sprinting its way through the Python community - welcoming everyone to participate. -The early choice of the PyPy team was an almost extreme test driven approach. -Experiences from the Subversion project, merged with the results of the py.lib -(Holger????py.test - your other hobby project ;-) XXX (cf): the py-lib is not -so old, right?)
created a stable platform for the early development efforts. - -These two agile approaches combined (sprints and test driven development) and -the way they where implemented where the building block of the PyPy community. - -Community structure: -- transparent communication -- decision making -- interaction with other communities - - -The idea: Framework 6 programme IST funding for OSS work -========================================================== - -In XXXX the idea of trying to get EU-funding for the project was identified. -The community stretched outside of the regular Open Source world to try to -gather as much information and contacts as possible in order to answer the +Apart from rewriting the language within itself, PyPy also evolved a number of +development tools useful for writing tests and glueing things together. + +Agility: Open Communication and organisation +---------------------------------------------------- + +Another agility aspect relates to the transparent and open communication +about the project. Only very few (EU-contract related) documents are +access restricted, everything else is freely available and modifiable. +Announcing Sprints, Releases and development goals leads to more and +more people subscribing to mailing lists or participating in development. + +Moreover, the PyPy developers evolved a model of weekly 30-minute +IRC chat meetings where topics are briefly discussed, delegated +or decided upon: those meetings are open to all active developers +and usually do not discuss internal EU matters much except that +funded developers probably keep EU goals more in mind than others. +Minutes of these weekly developer meetings get archived and posted +to the development list. + +.. image:: plots/subscribe.png +..
overview of PyPy mailing list subscriptions + + +How and why EU Framework 6 programme IST funding for OSS work +===================================================================== + +Mid 2003 the idea of trying to get EU-funding for the project was born. +It became clear that the project had a very large scale and that +receiving some funding would dramatically increase the pace and seriousness +of the project. The community stretched outside of the regular Open Source world +to try to gather as much information and contacts as possible in order to answer the question: "Should we go for it?" To be able to answer that question - two other questions needed to be understood and answered: -"Why do you want money - aren't you guys non-profit?": +"Why do you want money - aren't you guys non-profit?" ------------------------------------------------------ There had been a growing interest from the European Commission, IST division, @@ -130,63 +141,57 @@ (FLOSS-POLS, FLOSS) - its organization, business models and licensing. A few other funded software projects used Open Source in their work as tools (languages and applications). There was no previous experience of an Open -Source community making a bid for funding. +Source community based project making a bid for funding. -The areas in the 6th Framework programme, second call fitted well enough with -the objectives of PyPy (XXX). The idea of strengthening the european software +The areas in the 6th Framework programme (second call) luckily fitted very well +with the objectives of PyPy. The idea of strengthening the European software development companies and businesses with supporting an open source language -implementation was new but appealing to the EU. But being an Open Source +implementation was new but appealing to the EU.
But being an Open Source project wasn?t enough - the challenges and the idea of an flexible, configurable "translator" or "compiler" met the research targets of the FP6, as -well as trying out and documenting the agile methodology being used. The EU -wanted the PyPy team to very concretely show how funding PyPy would have an -strategic impact for Europe. - -"Why do we want money - isn?t OSS non-profit?": ------------------------------------------------- - -There was of course the risk of alienating parts of the Open Source community -that had evolved around PyPy, not to mention the "collegues" working with the -other Python Implentation Projects. To make a bid for funding for core -developers and trying to find a model to channel funding for others to be able -to participate in sprints was the idea. The decision to stay true to the -vision of working agile and the strategy to strengthening the community via -EU-funding was the key. Previously, all sprints from 2003 and onwards had been -funded privately by the participants. The idea of using EU-funding to make sure -that more people could contribute and participate in sprints made sure that the -project wouldn?t abruptly change it's nature and that contributors wouldn't be -exploited. In the end the response was somewhat opposite - other OSS projects -became curious - "PyPy had opened a new market" (Paul Everitt, Zope Europe). - -Acting on the answer to these questions proved to be a more difficult task. The -entire proposal and negotiation process took over a year (Autumn 2003 to Dec -2004 Holger???). Creating the formal requirements, the description of work, -had not previously been a part of the development process. Drafting the +well as trying out and documenting the agile methodology being used. + +In short, we argued that EU funding allows the project to go for +reaching a critical mass and position to continue to evolve from +there. + +Acting on this proved to be a more difficult task. 
The +entire proposal and negotiation process took over a year (Autumn 2003 till +November 2004). Satisfying the formal requirements, the description of work, +had not previously been a part of the development process and both the EU +and the parties involved had to adapt to the situation. Yet, drafting the high-level requirements (in total 14 workpackages and 58 deliverables) was made -during sprints as well as distributed between sprints. This first eu-related -work have been useful for the project and the community, clearly stating the -idea of the PyPy, a design document on a high level - helping others better -understand the vision to be implemented. +using the same version-control/review based work style and also papers +were written during sprints. Writing the proposal and specifying according +objectives on a higher level has been useful for clarifying goals on a +longer term - also helping others better understand the visions. Unfortunately the negotiations got stuck in organizational limbo and the project is still suffering from the effects of this even today. The vision of funding contribution during and between sprints to people inside and outside of -the formal funding project structure was based on a neutral non-profit party - -Python Business Forum. This solution wasn't seen as realistic or feasible by -the EU. The agile approach, keeping the process developer driven as much as -possible, needed to be restructured (XXX cf: I don't get the last sentence). +the formal funding project structure was originally based on a neutral +non-profit association. This solution wasn't seen as realistic or feasible by +the EU. In the course, we reached an alternative solution that has a few +drawbacks: Contributors have to become Partners within the Consortium +(which is by itself not hard) and can then at least claim travel and +accommodation costs when attending sprints.
However, this does not +easily allow them to get paid for working and also has some formal +requirements. This leads to current considerations of developers +to shift private money between them in order to circumvent the +current problems with implementing an agile model with the EU. + -The Project: consortium and companies within a OSS community structure +consortium and companies within a OSS community structure ---------------------------------------------------------------------- -In order to solve the Python Business Forum solution some of the core -developers entered a state of forced entrepreneurship and created two -companies: Merlinux and Tismerysoft. What first felt as an EU-related obstacle -became an opportunity, but with an added load of legal and organizational -responsibilities, in itself adding inertia to an agile process. +Two of the core developers founded companies allowing them to +participate in EU funding - what first might have felt as an +EU-related obstacle became an opportunity, but with an added +load of legal and organizational responsibilities, in itself +adding inertia to an agile process. Other adjustments, recruiting companies with previous EU project experiences -and not part of the original PyPy community was done. There was also an +and not part of the original PyPy community, were done. There was also an recruitment of a company totally unrelated to the developer work being done in the PyPy community - focused on process management and designing learning processes with a background from the Chaospilot school in Aarhus, Denmark. When @@ -212,7 +217,7 @@ The designed agile development process in the funded work of the PyPy project centers arund the sprints (see picture - sprint process). 
A budget had been -created to fund contribution from the community (the non consortium members) +calculated to fund contribution from the community (the non consortium members) and the strategy of the project was to sprint every 6th week, moving around and making it possible for developers to get in touch with the project. Sprinting in connection with major conferences was also a key strategy. From hpk at codespeak.net Thu Dec 1 20:13:57 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 1 Dec 2005 20:13:57 +0100 (CET) Subject: [pypy-svn] r20528 - pypy/extradoc/talk/22c3 Message-ID: <20051201191357.9D43C27B61@code1.codespeak.net> Author: hpk Date: Thu Dec 1 20:13:56 2005 New Revision: 20528 Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt Log: added two diagrams/images and a few refinements Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt ============================================================================== --- pypy/extradoc/talk/22c3/agility_v1.txt.txt (original) +++ pypy/extradoc/talk/22c3/agility_v1.txt.txt Thu Dec 1 20:13:56 2005 @@ -100,6 +100,9 @@ Apart from rewriting the language within itself, PyPy also evolved a number of development tools useful for writing tests and glueing things together. +.. image:: plots/loc.png +.. test driven development + Agility: Open Communication and organisation ---------------------------------------------------- @@ -117,7 +120,8 @@ Minutes of these weekly developer meetings get archived and posted to the development list. -.. image:: plots/subscribe.png +.. image:: plots/subscribers.png + .. overview of PyPy mailing list subscriptions @@ -127,10 +131,10 @@ Mid 2003 the idea of trying to get EU-funding for the project was born. It became clear that the project had a very large scale and that receiving some funding would dramatically increase the pace and seriousness -of the project. The community stretched outside of the regular Open Source world +of the project. 
The community stretched outside of the Open Source ecologies to try to gather as much information and contacts as possible in order to answer the -question: "Should we go for it?" To be able to answer that question - two other -questions needed to be understood and answered: +question: "Should we go for it?" to which the answer quickly became +"Let's see how far we get!". "Why do you want money - aren?t you guys non-profit?" ------------------------------------------------------ From hpk at codespeak.net Thu Dec 1 20:31:36 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 1 Dec 2005 20:31:36 +0100 (CET) Subject: [pypy-svn] r20529 - pypy/extradoc/talk/22c3 Message-ID: <20051201193136.27EE027B5F@code1.codespeak.net> Author: hpk Date: Thu Dec 1 20:31:35 2005 New Revision: 20529 Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt Log: completing/refining more things .. still the second half is somewhat missing a red evolving line. Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt ============================================================================== --- pypy/extradoc/talk/22c3/agility_v1.txt.txt (original) +++ pypy/extradoc/talk/22c3/agility_v1.txt.txt Thu Dec 1 20:31:35 2005 @@ -125,18 +125,18 @@ .. overview of PyPy mailing list subscriptions -How and why EU Framework 6 programme IST funding for OSS work +How and why EU funding? ===================================================================== Mid 2003 the idea of trying to get EU-funding for the project was born. -It became clear that the project had a very large scale and that +It became clear that the project had an arbitrarily large scale and that receiving some funding would dramatically increase the pace and seriousness -of the project. The community stretched outside of the Open Source ecologies -to try to gather as much information and contacts as possible in order to answer the -question: "Should we go for it?" to which the answer quickly became -"Let's see how far we get!". 
+of the project. The involved developers and people stretched outside of the +Open Source ecologies to try to gather as much information and contacts as +possible in order to answer the question: "Should we go for it?" to which +the answer quickly became "Let's see how far we get!". -"Why do you want money - aren?t you guys non-profit?" +Making things fit with EU perspectives ------------------------------------------------------ There had been a growing interest from the European Commission, IST division, @@ -165,24 +165,27 @@ had not previously been a part of the development process and both the EU and the parties involved had to adapt to the situation. Yet, drafting the high-level requirements (in total 14 workpackages and 58 deliverables) was made -using the same version-control/review based work style and also papers -were written during sprints. Writing the proposal and specifying according -objectives on a higher level has been useful for clarifying goals on a -longer term - also helping - helping others better understand the visions. - -Unfortunately the negotiations got stuck in organizational limbo and the -project is still suffering from the effects of this even today. The vision of -funding contribution during and between sprints to people inside and outside of -the formal funding project structure was originally based on a neutral -non-profit association. This solution wasn't seen as realistic or feasible by -the EU. In the course, we reached an alternative solution that has a few -drawbacks: Contributors have to become Partners within the Consortium -(which is by itself not hard) and can then at least claim travel and -accomodation costs when attending sprints. However, this does not -easily allow them to get paid for working and also has some formal -requirements. This leads to current considerations of developers -to shift private money between them in order to circumvent the -current problems with implementing an agile model with the EU. 
+using the same version-control/open-communication based work style, including +evolving the proposal at sprints. Writing the proposal and specifying according +objectives on a higher level has proved to be generally useful for clarifying goals +on a longer term - also helping others better understand the project. + +Unfortunately the negotiations with the EU got stuck in +organizational limbo and the project is still suffering from +the effects of this even today. The goal of funding +contributors especially coming to sprints was originally +based on a non-profit association. This solution +wasn't seen as realistic or feasible by the EU although +it remains an interesting approach for the future. During +negotiations, we got to an alternative solution which - however - +has a few drawbacks: Contributors have to become Partners within the +EU-level Consortium (which is by itself not hard) and can then at least +claim travel and accomodation costs when attending sprints. +However, this does not easily allow them to get paid for +working and also has some formal requirements. This leads to +current considerations of developers to shift private money +between them in order to circumvent the current problems with +implementing an agile model within the EU contract framing. consortium and companies within a OSS community structure @@ -192,7 +195,7 @@ participate in EU funding - what first might have felt as an EU-related obstacle became an opportunity, but with an added load of legal and organizational responsibilities, in itself -adding inertia to an agile process. +adding inertia to an agile process. Other adjustments, recruiting companies with previous EU project experiences and not part of the original PyPy community, were done. There was also an @@ -206,32 +209,38 @@ it during the work on the proposal and negotiation process was a challenge indeed. -The formal project organization required by the EU imposed new restrictions on -the previous agile process. 
Roles and responsibilities where staked out, -conforming with the requirements of the roles but delegating as much as +The formal project organization required by the EU imposed more structure on +the previous more free-floating agile process. Roles and responsibilities where +staked out, conforming with the requirements of the roles but delegating as much as possible of the responsibilities and decision-making to the core developers. The strategy was to keep "conceptual integrity" (Brooks) of the vision and the -idea in the hands of the core developers. The result was just that but also an -added workload when the project got started, which had a negative effect - -adding inertia and hindering the agility of the process. +idea in the hands of the core developers. A somewhat negative result was +the added workload and responsibility on developers regarding EU related work. +It is interesting, though, that the consortium with its member organisation +now employs a version-control/review based scheme regarding EU documents +similar to the technical development approaches. + +It remains a challenge for all partners of the consortium, +universities and companies alike, to connect an ongoing +medium-scale open-source project with EU regulations and +requirements - not to speak of the fact that companies need to +fund 50% of the costs themselves. - -The challenge: balancing agile OSS community structures with EU requirements +challenge: balancing agile OSS community structures with EU requirements ------------------------------------------------------------------------------ -The designed agile development process in the funded work of the PyPy project -centers arund the sprints (see picture - sprint process). A budget had been -calculated to fund contribution from the community (the non consortium members) -and the strategy of the project was to sprint every 6th week, moving around and -making it possible for developers to get in touch with the project. 
Sprinting -in connection with major conferences was also a key strategy. +The agile development process in the funded work of the PyPy project +centers around the sprints (see picture - sprint process) - which are planned +to take place every 6th week at different places to allow many developers +to get in direct touch with each other. Sprinting in connection with +major conferences also became a key strategy. -The nature of sprints changed. The need to meet milestones of the EU-funded +The nature of sprints changed. The need to meet milestones of the EU-funded deliverables and the need to keep an open sprint process, still welcoming -newcomers into the world of Pypy, made the sprints longer (7 days with a break -day in the middle) but also changed the nature of the sprints. The team started +newcomers into the world of Pypy, made the sprints longer (at least 7 days with a +break day in the middle) but also changed the nature of the sprints. The team started to distuingish between sprints open for all to attend, without prior PyPy -experience, and sprints requiring PyPy experience. Tutorials, start up planning +experience, and sprints requiring PyPy experience. Tutorials, start up planning meetings as well as daily status meetings evolved, the latest additions to the sprints are closing planning meetings (planning the work between sprints) and work-groups - a version of pair-programming in groups. @@ -248,13 +257,15 @@ team is slowly conforming to the Scrum standard of sprinting, but not as a conscious effort? -Physical persons: - -Communication channels: Managing diversities: agile business - a succesful marriage ? -------------------------------------------------------------- +For a diverse group of organisations and people, agility is +helpful at various levels: you cannot make all-encompassing +plans and hope to simply follow them and succeed. New developments, +twists and opportunities evolve all the time. 
+ Agile EU-project: Agile businesses: From arigo at codespeak.net Thu Dec 1 20:55:21 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 1 Dec 2005 20:55:21 +0100 (CET) Subject: [pypy-svn] r20530 - pypy/branch/somepbc-refactoring/pypy/rpython Message-ID: <20051201195521.3DAE327B61@code1.codespeak.net> Author: arigo Date: Thu Dec 1 20:55:19 2005 New Revision: 20530 Modified: pypy/branch/somepbc-refactoring/pypy/rpython/rmodel.py pypy/branch/somepbc-refactoring/pypy/rpython/rpbc.py pypy/branch/somepbc-refactoring/pypy/rpython/rtyper.py Log: (pedronis, arigo) Pushed and pulled obscurity around to make the genc start_new_thread tests pass. The idea is to use a HalfConcreteWrapper marker when calling a ll helper from RPython-level code with a function argument. Such a function is meant as a call-back, so it needs particular care, e.g. it should be turned exactly into a ll function pointer (and not, say, a Void because it's a single function). Modified: pypy/branch/somepbc-refactoring/pypy/rpython/rmodel.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/rpython/rmodel.py (original) +++ pypy/branch/somepbc-refactoring/pypy/rpython/rmodel.py Thu Dec 1 20:55:19 2005 @@ -333,6 +333,13 @@ else: return '%s_%s' % (prefix, name) +class HalfConcreteWrapper: + # see rtyper.gendirectcall() + def __init__(self, callback): + self.concretize = callback # should produce a concrete const + def _freeze_(self): + return True + # __________ utilities __________ PyObjPtr = Ptr(PyObject) Modified: pypy/branch/somepbc-refactoring/pypy/rpython/rpbc.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/rpython/rpbc.py (original) +++ pypy/branch/somepbc-refactoring/pypy/rpython/rpbc.py Thu Dec 1 20:55:19 2005 @@ -7,7 +7,7 @@ from pypy.rpython.lltypesystem.lltype import \ typeOf, Void, Bool, nullptr, frozendict from pypy.rpython.error 
import TyperError -from pypy.rpython.rmodel import Repr, inputconst +from pypy.rpython.rmodel import Repr, inputconst, HalfConcreteWrapper from pypy.rpython import rclass from pypy.rpython import robject @@ -202,7 +202,7 @@ def convert_desc(self, funcdesc): # get the whole "column" of the call table corresponding to this desc if self.lowleveltype is Void: - return funcdesc.pyobj + return HalfConcreteWrapper(self.get_unique_llfn) llfns = {} found_anything = False for row in self.uniquerows: @@ -225,7 +225,7 @@ if isinstance(value, types.MethodType) and value.im_self is None: value = value.im_func # unbound method -> bare function if self.lowleveltype is Void: - return value + return HalfConcreteWrapper(self.get_unique_llfn) if value is None: null = self.rtyper.type_system.null_callable(self.lowleveltype) return null @@ -250,6 +250,30 @@ else: XXX_later + def get_unique_llfn(self): + # try to build a unique low-level function. Avoid to use + # whenever possible! Doesn't work with specialization, multiple + # different call sites, etc. 
+ if self.lowleveltype is not Void: + raise TyperError("cannot pass multiple functions here") + assert len(self.s_pbc.descriptions) == 1 + # lowleveltype wouldn't be Void otherwise + funcdesc, = self.s_pbc.descriptions + if len(self.callfamily.calltables) != 1: + raise TyperError("cannot pass a function with various call shapes") + table, = self.callfamily.calltables.values() + graphs = [] + for row in table: + if funcdesc in row: + graphs.append(row[funcdesc]) + if not graphs: + raise TyperError("cannot pass here a function that is not called") + graph = graphs[0] + if graphs != [graph]*len(graphs): + raise TyperError("cannot pass a specialized function here") + llfn = self.rtyper.getcallable(graph) + return inputconst(typeOf(llfn), llfn) + def rtype_simple_call(self, hop): return self.call('simple_call', hop) Modified: pypy/branch/somepbc-refactoring/pypy/rpython/rtyper.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/rpython/rtyper.py (original) +++ pypy/branch/somepbc-refactoring/pypy/rpython/rtyper.py Thu Dec 1 20:55:19 2005 @@ -28,7 +28,7 @@ from pypy.translator.transform import insert_stackcheck from pypy.rpython.error import TyperError from pypy.rpython.rmodel import Repr, inputconst, BrokenReprTyperError -from pypy.rpython.rmodel import warning +from pypy.rpython.rmodel import warning, HalfConcreteWrapper from pypy.rpython.normalizecalls import perform_normalizations from pypy.rpython.annlowlevel import annotate_lowlevel_helper from pypy.rpython.rmodel import log @@ -762,14 +762,13 @@ if not s_value.is_constant(): raise TyperError("non-constant variable of type Void") if not isinstance(s_value, annmodel.SomePBC): - # a Void non-PBC constant: can be a SomePtr pointing to a - # constant function. - assert isinstance(s_value, annmodel.SomePtr) - # Drop the 'const'. 
- s_value = annmodel.SomePtr(s_value.ll_ptrtype) - # Modify args_v so that 'v' gets the llptr concretetype - # stored in s_value - v = inputconst(s_value.ll_ptrtype, v.value) + raise TyperError("non-PBC Void argument: %r", (s_value,)) + if isinstance(s_value.const, HalfConcreteWrapper): + # Modify args_v so that 'v' gets the concrete value + # returned by the wrapper + wrapper = s_value.const + v = wrapper.concretize() + s_value = annmodel.lltype_to_annotation(v.concretetype) args_s.append(s_value) else: args_s.append(annmodel.lltype_to_annotation(v.concretetype)) From hpk at codespeak.net Thu Dec 1 21:20:38 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 1 Dec 2005 21:20:38 +0100 (CET) Subject: [pypy-svn] r20531 - pypy/extradoc/talk/22c3 Message-ID: <20051201202038.4B8FA27B61@code1.codespeak.net> Author: hpk Date: Thu Dec 1 21:20:37 2005 New Revision: 20531 Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt Log: trying to streamline the latter half - still not satisfying. Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt ============================================================================== --- pypy/extradoc/talk/22c3/agility_v1.txt.txt (original) +++ pypy/extradoc/talk/22c3/agility_v1.txt.txt Thu Dec 1 21:20:37 2005 @@ -1,5 +1,5 @@ ========================================================================= -Agile Business and EU funding: sprint methodology in a funded OSS project +Agile Business and EU funding: sprint methods in a funded OSS project ========================================================================= Abstract @@ -36,69 +36,71 @@ Agile approaches: sprinting ---------------------------- -The first drafts of ideas of what was to become PyPy started during a sprint, -held at Trillke-Gut in Hildesheim in February 2003. The sprint was inspired by +The first bits of PyPy started during a one-week meeting, a "sprint", +held at Trillke-Gut in Hildesheim February 2003. 
The sprint was inspired by practices used by other Python oriented projects such as Zope3. Originally the sprint methodology used in the Python community grew from practices within Zope Corporation. Their definition of a sprint was "two-day or three-day focused development session, in which developers pair off together in a room and focus on building a particular subsystem". -It turned out that sprinting got to be a key technique in evolving -the code base and the community/people around it. The early PyPy sprints moved -around, being organised by core developers together with local Pythonistas +Sprinting up to a week became the initial driving factor in developing +the code base and the community/people around it. The early PyPy sprints +were organised by core developers together with local Pythonistas in Louvain LaNeuve, Gothenburg, Vilnius and Amsterdam. Sprints gave the opportunity to both help, participate and influence the ideas within PyPy. -Sprints as such are not part of the traditional Agile +Sprints are not really part of the traditional Agile portfolio of techniques, the closest thing to it comes from Scrum who names the 30 days long programming iterations -"sprints", covering a certain increment. In the Scrum method +"sprints", covering a certain increment. In the Scrum method considerable effort is put into performing the sprint planning as well as creating and documenting the "sprint backlog" which -is then feedbacked into the "Product backlog". The sprint ends +is then feedbacked into the "Product backlog". The sprint ends with a "sprint review" - an informal planning session in which -the team decides on upcoming work. There are also techniques +the team decides on upcoming work. There are also techniques in which the team looks at ways to improve the development methodology and future sprints. 
-To our knowledge, most open-source projects are only sprinting up to a week +To our knowledge, open-source projects are only sprinting up to a week which also reflects the fact that many contributors give their time -and even money to gather and work together and thus it's different +and even money to gather and work together. This is different from having fully funded people from one company working together. -Why did PyPy choose sprinting as a key technique? It is a method that fits +Why did PyPy choose sprinting as a key technique? It is a method that fits distributed teams well because it gets the team focused around visible challenging goals while working collarobatively (pair-programming, status meetings, discussions etc) as well as acceleratedly (short increments and tasks, "doing" and testing instead of long startups of planning and -requirement gathering). This means that most of the time a sprint is a great -way of getting results, but also to get new people aquinted with the codebase. -It is also a great method for dissemination and learning within the team -because of the pair-programming. +requirement gathering). This means that most of the time a sprint is a great +way of getting results, but also to get new people aquainted - a great +method for dissemination and learning within the team. Agile approaches: test-driven development ----------------------------------------- -Test-driven development is a cornerstone for programming together -in a distributed team effectively. Seen from an Agile -Manifesto perspective it is right up there as one of the key -elements since it puts focus on producing working code, rather -than diagrams, plans and papers and then faulty software. +Test-driven development is a cornerstone for programming +efficiently together in a distributed team. 
Seen from the +Agile Manifesto perspective it is right up there as one of the +key elements since it puts focus on producing working code, +rather than diagrams, plans and papers (and then faulty +software). Seen from an Open Source community perspective it is a vitalising strategy - -especially when combined with an transparent open process in which anyone -interested can participate - if only for just a few days at a sprint. Some of +especially in combination with a transparent open process in which anyone +interested can participate - if only for just a few days at a sprint. Some of the key problems identified by Frederick P. Brooks in the latest version of "The Mythical Man-Month" (unfortunately still very actual today) are estimating correct amount of time for communication and testing/debugging. Automated -testing development and version control help with many of those problems, +testing development and strict version tracking helps with those problems, especially in the hands of a team sprinting its way through the Python community - welcoming everyone to participate. Apart from rewriting the language within itself, PyPy also evolved a number of -development tools useful for writing tests and glueing things together. +development tools useful for writing tests and glueing things together. PyPy's +testing tool ("py.test") is used separately and continues to evolve on +its own by now. .. image:: plots/loc.png .. test driven development @@ -106,11 +108,14 @@ Agility: Open Communication and organisation ---------------------------------------------------- -Another agility aspects relates to the transparent and open communication +Another agility aspect relates to the transparent and open communication about the project. Only very few (EU-contract related) documents are access restricted, everything else is freely available and modifiable. 
-Announcing Sprints, Releases and development goals lead to more and -more people subscribing to mailing lists or participating in development. +There are no hierarchies for commit rights. In fact, the hosting server +also gives home to a couple of other projects and all projects share +commit rights ("Coding Wiki"). Announcing Sprints, Releases and development +goals lead to more and more people subscribing to mailing lists or +participating in development. Moreover, the PyPy developers evolved a model of weekly 30-minute IRC chat meetings where topics are briefly discussed, delegated @@ -120,6 +125,10 @@ Minutes of these weekly developer meetings get archived and posted to the development list. +A rather recent invention is "This week in PyPy" which tries to +summarize what is going on in the lively IRC development #pypy +channel - main place of technical coordination. + .. image:: plots/subscribers.png .. overview of PyPy mailing list subscriptions @@ -131,7 +140,8 @@ Mid 2003 the idea of trying to get EU-funding for the project was born. It became clear that the project had an arbitrarily large scale and that receiving some funding would dramatically increase the pace and seriousness -of the project. The involved developers and people stretched outside of the +of the project - because funded developers can dedicate more of their time +to the project. The involved developers and people stretched outside of the Open Source ecologies to try to gather as much information and contacts as possible in order to answer the question: "Should we go for it?" to which the answer quickly became "Let's see how far we get!". @@ -147,26 +157,29 @@ (languages and applications). There was no previous experience of an Open Source community based project making a bid for funding. -The areas in the 6th Framework programme (second call) luckily fitted very well -with the objectives of PyPy. 
The idea of strengthening the european software +The areas in the 6th Framework programme (second call) fitted very well +with the objectives of PyPy. The idea of strengthening the European Software development companies and businesses with supporting an open source language implementation was new but appealing to the EU. But being an Open Source project wasn?t enough - the challenges and the idea of an flexible, configurable "translator" or "compiler" met the research targets of the FP6, as well as trying out and documenting the agile methodology being used. +It is interesting to note that todays computer industrial language +research and development happens mostly in the US. In short, we argued that EU funding allows the project to go for reaching a critical mass and position to continue to evolve from -there. +there and that it would help European Organisations to make some +ground. -Acting on this proved to be a more difficult task. The +Acting on this strategy proved to be a more difficult task. The entire proposal and negotiation process took over a year (Autumn 2003 till -November 2004). Satisfying the formal requirements, the description of work, -had not previously been a part of the development process and both the EU +November 2004). Satisfying the formal requirements, a proper description of +planned work, had not previously been part of the development focus and both the EU and the parties involved had to adapt to the situation. Yet, drafting the high-level requirements (in total 14 workpackages and 58 deliverables) was made using the same version-control/open-communication based work style, including -evolving the proposal at sprints. Writing the proposal and specifying according +evolving the proposal at sprints. Writing the proposal and specifying according objectives on a higher level has proved to be generally useful for clarifying goals on a longer term - also helping others better understand the project. 
@@ -176,74 +189,88 @@ contributors especially coming to sprints was originally based on a non-profit association. This solution wasn't seen as realistic or feasible by the EU although -it remains an interesting approach for the future. During +we think it remains a viable approach for the future. During negotiations, we got to an alternative solution which - however - -has a few drawbacks: Contributors have to become Partners within the -EU-level Consortium (which is by itself not hard) and can then at least +has a few drawbacks: Contributors have to become Contract Partners within +the EU-level Consortium (which is by itself not hard) and can then at least claim travel and accomodation costs when attending sprints. -However, this does not easily allow them to get paid for -working and also has some formal requirements. This leads to -current considerations of developers to shift private money + +However, this construction does not allow them to get paid for +work time and also has some formal requirements. This practically +leads to current considerations of developers to shift private money between them in order to circumvent the current problems with implementing an agile model within the EU contract framing. -consortium and companies within a OSS community structure +Seven Organisations / The consortium ---------------------------------------------------------------------- -Two of the core developers founded companies allowing them to -participate in EU funding - what first might have felt as an -EU-related obstacle became an opportunity, but with an added -load of legal and organizational responsibilities, in itself -adding inertia to an agile process. - -Other adjustments, recruiting companies with previous EU project experiences -and not part of the original PyPy community, were done. 
There was also an -recruitment of a company totally unrelated to the developer work being done in -the PyPy community - focused on process management and designing learning -processes with a background from the Chaospilot school in Aarhus, Denmark. When -creating the formal consortium of seven partners, new cultures and perspectives -were mixed with the strong collaborative Open Source core team, adding new -complexities in communication and cooperation. Getting the new "playmates" to -adopt the vision, culture and spirit of the original idea and holding true to -it during the work on the proposal and negotiation process was a challenge -indeed. - -The formal project organization required by the EU imposed more structure on -the previous more free-floating agile process. Roles and responsibilities where -staked out, conforming with the requirements of the roles but delegating as much as -possible of the responsibilities and decision-making to the core developers. -The strategy was to keep "conceptual integrity" (Brooks) of the vision and the -idea in the hands of the core developers. A somewhat negative result was -the added workload and responsibility on developers regarding EU related work. -It is interesting, though, that the consortium with its member organisation -now employs a version-control/review based scheme regarding EU documents +The guiding idea for receiving funding is to have organisations +through which key developers and other parties are employed. +Two companies out of the seven organisations in the initial +consortium got funded during the EU negotiation process - +what first might have felt as an EU-related obstacle became an +opportunity, but with some overhead in legal and organizational +responsibilities. + +Other adjustments and recruiting companies with previous EU +project experiences took place. 
There was also an recruitment +of a company totally unrelated to the developer work being +done so far - focused on process management and +designing learning processes with a background from the +Chaospilot school in Aarhus, Denmark. When creating the formal +consortium of seven partners, new cultures and perspectives +were mixed with the strong collaborative Open Source core +team, adding new complexities in communication and +cooperation. Getting the new "playmates" to adopt the vision, +culture and spirit of the original idea and holding true to it +during the work on the proposal and negotiation process was a +challenge indeed. + +The formal project organization required by the EU imposed +more structure on the previous more free-floating agile +process. Roles and responsibilities where staked out, +conforming with the requirements of the roles but delegating +as much as possible of the responsibilities and +decision-making to the core developers. The strategy was to +keep "conceptual integrity" (Brooks) of the vision and the +idea in the hands of the core developers. A somewhat negative +result was the added workload and responsibility on developers +regarding EU related work. It is interesting, though, that +the consortium with its member organisation now employs a +version-control/review based scheme regarding EU documents similar to the technical development approaches. It remains a challenge for all partners of the consortium, universities and companies alike, to connect an ongoing medium-scale open-source project with EU regulations and requirements - not to speak of the fact that companies need to -fund 50% of the costs themselves. +fund 50% of the costs themselves. It is, in fact, too early +to judge on the overall success of our approaches although +we are confident that things work out reasonably well. 
+ challenge: balancing agile OSS community structures with EU requirements ------------------------------------------------------------------------------ -The agile development process in the funded work of the PyPy project -centers around the sprints (see picture - sprint process) - which are planned -to take place every 6th week at different places to allow many developers -to get in direct touch with each other. Sprinting in connection with -major conferences also became a key strategy. - -The nature of sprints changed. The need to meet milestones of the EU-funded -deliverables and the need to keep an open sprint process, still welcoming -newcomers into the world of Pypy, made the sprints longer (at least 7 days with a -break day in the middle) but also changed the nature of the sprints. The team started -to distuingish between sprints open for all to attend, without prior PyPy -experience, and sprints requiring PyPy experience. Tutorials, start up planning -meetings as well as daily status meetings evolved, the latest additions to the -sprints are closing planning meetings (planning the work between sprints) and -work-groups - a version of pair-programming in groups. +The agile development process in the EU funded work of the +PyPy project centers around sprints - which are planned to +take place every 6th week at different places to allow many +developers to get in direct touch with each other. Sprinting +around conferences also became a key strategy. + +Tut the nature of sprints changed when EU funding started. The +need to meet milestones of promised *deliverables* and the +goal to keep an open sprint process, still welcoming newcomers +into the world of Pypy, made the sprints longer (at least 7 +days with a break day in the middle) but also changed the +nature of the sprints. The team started to distuingish between +sprints open for all to attend, without any prior PyPy experience, +and sprints requiring earlier PyPy involvement. 
Tutorials, start up
+planning meetings as well as daily status meetings evolved,
+the latest additions to the sprints are closing planning
+meetings (planning the work between sprints) and work-groups -
+a version of pair-programming in groups.

Some other effects of sprinting within the EU-structure is that
the sprint becomes a forum for non-development work -
coordinating and tracking the

From hpk at codespeak.net Thu Dec 1 21:34:13 2005
From: hpk at codespeak.net (hpk at codespeak.net)
Date: Thu, 1 Dec 2005 21:34:13 +0100 (CET)
Subject: [pypy-svn] r20532 - pypy/extradoc/talk/22c3
Message-ID: <20051201203413.A798127B61@code1.codespeak.net>

Author: hpk
Date: Thu Dec 1 21:34:12 2005
New Revision: 20532

Modified:
   pypy/extradoc/talk/22c3/agility_v1.txt.txt
Log:
insert the sprint process picture


Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt
==============================================================================
--- pypy/extradoc/talk/22c3/agility_v1.txt.txt	(original)
+++ pypy/extradoc/talk/22c3/agility_v1.txt.txt	Thu Dec 1 21:34:12 2005
@@ -76,6 +76,7 @@
 way of getting results, but also to get new people aquainted - a great
 method for dissemination and learning within the team.
 
+.. image:: sprintprocess.gif
 
 Agile approaches: test-driven development
 -----------------------------------------

From bea at codespeak.net Thu Dec 1 21:47:37 2005
From: bea at codespeak.net (bea at codespeak.net)
Date: Thu, 1 Dec 2005 21:47:37 +0100 (CET)
Subject: [pypy-svn] r20533 - pypy/extradoc/talk/22c3
Message-ID: <20051201204737.2BC7127B5F@code1.codespeak.net>

Author: bea
Date: Thu Dec 1 21:47:34 2005
New Revision: 20533

Modified:
   pypy/extradoc/talk/22c3/agility_v1.txt.txt
Log:
try


Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt
==============================================================================
--- pypy/extradoc/talk/22c3/agility_v1.txt.txt	(original)
+++ pypy/extradoc/talk/22c3/agility_v1.txt.txt	Thu Dec 1 21:47:34 2005
@@ -294,8 +294,14 @@
 plans and hope to simply follow them and succeed. New developments,
 twists and opportunities evolve all the time.
 
-Agile EU-project:
+Our experience with evolving PyPy from an non profit Open Source project to a partially funded EU research project shows the following:
+
+- what first seemed like too diverse interests and views, impossible to tailor into a single project, was instead a fruitful mix of diversities. The challenge is to manage these diversities and channel them into constructive team efforts. Homogenity is the real threat.
+
+- what first seemed like unbeatable odds and too big obstacles turned into new frontiers of possibilities. The challenge is to create an atmosphere in which a team can act on those and within the short timeframe of opportunity. Change is inevitable - how you handle it is the real challenge.
+
+Concluding - the cumulative effects of an agile, open and dynamic team process combined with a market and curious clients creates new business models - agile business. A path that we hope others will follow.
+
 
-Agile businesses:

From hpk at codespeak.net Thu Dec 1 21:50:17 2005
From: hpk at codespeak.net (hpk at codespeak.net)
Date: Thu, 1 Dec 2005 21:50:17 +0100 (CET)
Subject: [pypy-svn] r20534 - pypy/extradoc/talk/22c3
Message-ID: <20051201205017.9830F27B5B@code1.codespeak.net>

Author: hpk
Date: Thu Dec 1 21:50:16 2005
New Revision: 20534

Modified:
   pypy/extradoc/talk/22c3/agility_v1.txt.txt
Log:
slight reformatting / minor refinement


Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt
==============================================================================
--- pypy/extradoc/talk/22c3/agility_v1.txt.txt	(original)
+++ pypy/extradoc/talk/22c3/agility_v1.txt.txt	Thu Dec 1 21:50:16 2005
@@ -289,18 +289,31 @@
 Managing diversities: agile business - a succesful marriage ?
 --------------------------------------------------------------
 
-For a diverse group of organisations and people, agility is
-helpful at various levels: you cannot make all-encompassing
-plans and hope to simply follow them and succeed. New developments,
-twists and opportunities evolve all the time.
-
-Our experience with evolving PyPy from an non profit Open Source project to a partially funded EU research project shows the following:
-
-- what first seemed like too diverse interests and views, impossible to tailor into a single project, was instead a fruitful mix of diversities. The challenge is to manage these diversities and channel them into constructive team efforts. Homogenity is the real threat.
-
-- what first seemed like unbeatable odds and too big obstacles turned into new frontiers of possibilities. The challenge is to create an atmosphere in which a team can act on those and within the short timeframe of opportunity. Change is inevitable - how you handle it is the real challenge.
-
-Concluding - the cumulative effects of an agile, open and dynamic team process combined with a market and curious clients creates new business models - agile business. A path that we hope others will follow.
+For a diverse group of organisations and people, agility is
+helpful at various levels: you cannot make all-encompassing
+plans and hope to simply follow them and succeed. New
+developments, twists and opportunities evolve all the time.
+
+Our experience with evolving PyPy from an non profit Open
+Source project to a partially funded EU research project shows
+the following:
+
+- what first seemed like too diverse interests and views,
+  impossible to tailor into a single project, was instead a
+  fruitful mix of diversities. The challenge is to manage
+  these diversities and channel them into constructive team
+  efforts. Aiming for homogenity is the real threat.
+
+- what first seemed like unbeatable odds and too big obstacles
+  turned into new frontiers of possibilities. The challenge is
+  to create an atmosphere in which a team can act on those and
+  within the short timeframe of opportunity. Change is
+  inevitable - how you handle it is the real challenge.
+
+Concluding - the cumulative effects of an agile, open and
+dynamic team process combined with a market and curious
+clients creates new business models - agile business.
+A path that we hope others will follow.

From arigo at codespeak.net Thu Dec 1 22:02:01 2005
From: arigo at codespeak.net (arigo at codespeak.net)
Date: Thu, 1 Dec 2005 22:02:01 +0100 (CET)
Subject: [pypy-svn] r20535 - in pypy/branch/somepbc-refactoring/pypy/translator/llvm: . module
Message-ID: <20051201210201.E847D27B5F@code1.codespeak.net>

Author: arigo
Date: Thu Dec 1 22:02:00 2005
New Revision: 20535

Modified:
   pypy/branch/somepbc-refactoring/pypy/translator/llvm/externs2ll.py
   pypy/branch/somepbc-refactoring/pypy/translator/llvm/genllvm.py
   pypy/branch/somepbc-refactoring/pypy/translator/llvm/module/support.py
   pypy/branch/somepbc-refactoring/pypy/translator/llvm/opwriter.py
Log:
(pedronis, arigo)

Fixing LLVM was rather easy.
Modified: pypy/branch/somepbc-refactoring/pypy/translator/llvm/externs2ll.py
==============================================================================
--- pypy/branch/somepbc-refactoring/pypy/translator/llvm/externs2ll.py	(original)
+++ pypy/branch/somepbc-refactoring/pypy/translator/llvm/externs2ll.py	Thu Dec 1 22:02:00 2005
@@ -3,7 +3,8 @@
 import types
 import urllib
 
-from pypy.rpython.rmodel import inputconst, getfunctionptr
+from pypy.objspace.flow.model import FunctionGraph
+from pypy.rpython.rmodel import inputconst
 from pypy.rpython.lltypesystem import lltype
 from pypy.translator.llvm.codewriter import DEFAULT_CCONV
 
@@ -113,8 +114,8 @@
     for c_name, obj in decls:
         if isinstance(obj, lltype.LowLevelType):
             db.prepare_type(obj)
-        elif isinstance(obj, types.FunctionType):
-            funcptr = getfunctionptr(db.translator, obj)
+        elif isinstance(obj, FunctionGraph):
+            funcptr = rtyper.getcallable(obj)
             c = inputconst(lltype.typeOf(funcptr), funcptr)
             db.prepare_arg_value(c)
         elif isinstance(lltype.typeOf(obj), lltype.Ptr):
@@ -156,8 +157,8 @@
         if isinstance(obj, lltype.LowLevelType):
             s = "#define %s struct %s\n%s;\n" % (c_name, c_name, c_name)
             ccode.append(s)
-        elif isinstance(obj, types.FunctionType):
-            funcptr = getfunctionptr(db.translator, obj)
+        elif isinstance(obj, FunctionGraph):
+            funcptr = db.translator.rtyper.getcallable(obj)
             c = inputconst(lltype.typeOf(funcptr), funcptr)
             predeclarefn(c_name, db.repr_arg(c))
         elif isinstance(lltype.typeOf(obj), lltype.Ptr):

Modified: pypy/branch/somepbc-refactoring/pypy/translator/llvm/genllvm.py
==============================================================================
--- pypy/branch/somepbc-refactoring/pypy/translator/llvm/genllvm.py	(original)
+++ pypy/branch/somepbc-refactoring/pypy/translator/llvm/genllvm.py	Thu Dec 1 22:02:00 2005
@@ -3,7 +3,8 @@
 from pypy.translator.llvm import build_llvm_module
 from pypy.translator.llvm.database import Database
 from pypy.translator.llvm.pyxwrapper import write_pyx_wrapper
-from pypy.rpython.rmodel import inputconst, getfunctionptr
+from pypy.rpython.rmodel import inputconst
+from pypy.rpython.typesystem import getfunctionptr
 from pypy.rpython.lltypesystem import lltype
 from pypy.tool.udir import udir
 from pypy.translator.llvm.codewriter import CodeWriter
@@ -161,7 +162,8 @@
         func = self.translator.entrypoint
         self.entrypoint = func
 
-        ptr = getfunctionptr(self.translator, func)
+        bk = self.translator.annotator.bookkeeper
+        ptr = getfunctionptr(bk.getdesc(func).cachedgraph(None))
         c = inputconst(lltype.typeOf(ptr), ptr)
         self.db.prepare_arg_value(c)
         self.entry_func_name = func.func_name

Modified: pypy/branch/somepbc-refactoring/pypy/translator/llvm/module/support.py
==============================================================================
--- pypy/branch/somepbc-refactoring/pypy/translator/llvm/module/support.py	(original)
+++ pypy/branch/somepbc-refactoring/pypy/translator/llvm/module/support.py	Thu Dec 1 22:02:00 2005
@@ -36,7 +36,7 @@
 internal fastcc %RPyString* %RPyString_FromString(sbyte* %s) {
     %lenu = call ccc uint %strlen(sbyte* %s)
     %len = cast uint %lenu to int
-    %rpy = call fastcc %RPyString* %pypy_RPyString_New__Signed(int %len)
+    %rpy = call fastcc %RPyString* %pypy_RPyString_New(int %len)
     %rpystrptr = getelementptr %RPyString* %rpy, int 0, uint 1, uint 1
     %rpystr = cast [0 x sbyte]* %rpystrptr to sbyte*

Modified: pypy/branch/somepbc-refactoring/pypy/translator/llvm/opwriter.py
==============================================================================
--- pypy/branch/somepbc-refactoring/pypy/translator/llvm/opwriter.py	(original)
+++ pypy/branch/somepbc-refactoring/pypy/translator/llvm/opwriter.py	Thu Dec 1 22:02:00 2005
@@ -324,7 +324,7 @@
                                  argtypes, none_label, exc_label)
 
         e = self.db.translator.rtyper.getexceptiondata()
-        ll_exception_match = '%pypy_' + e.ll_exception_match.__name__
+        ll_exception_match = '%pypy_' + e.fn_exception_match._obj._name
        # XXX Can we use database?
        lltype_of_exception_type = ('%structtype_' +

From hpk at codespeak.net Thu Dec 1 22:03:35 2005
From: hpk at codespeak.net (hpk at codespeak.net)
Date: Thu, 1 Dec 2005 22:03:35 +0100 (CET)
Subject: [pypy-svn] r20536 - pypy/extradoc/talk/22c3
Message-ID: <20051201210335.3FDDE27B61@code1.codespeak.net>

Author: hpk
Date: Thu Dec 1 22:03:34 2005
New Revision: 20536

Modified:
   pypy/extradoc/talk/22c3/agility_v1.txt.txt
Log:
extended ending: a sharper shot at large european companies
regarding their views on open-source.


Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt
==============================================================================
--- pypy/extradoc/talk/22c3/agility_v1.txt.txt	(original)
+++ pypy/extradoc/talk/22c3/agility_v1.txt.txt	Thu Dec 1 22:03:34 2005
@@ -294,9 +294,8 @@
 plans and hope to simply follow them and succeed. New
 developments, twists and opportunities evolve all the time.
 
-Our experience with evolving PyPy from an non profit Open
-Source project to a partially funded EU research project shows
-the following:
+Our experience with evolving PyPy from a loose Open Source project
+to a partially funded EU research project shows the following:
 
 - what first seemed like too diverse interests and views,
   impossible to tailor into a single project, was instead a
@@ -305,16 +304,25 @@
   efforts. Aiming for homogenity is the real threat.
 
 - what first seemed like unbeatable odds and too big obstacles
-  turned into new frontiers of possibilities. The challenge is
-  to create an atmosphere in which a team can act on those and
-  within the short timeframe of opportunity. Change is
+  even turned sometimes into new possibilities. The challenge is
+  to maintain an atmosphere in which a team can act on those and
+  within short timeframes of opportunities. Change is
   inevitable - how you handle it is the real challenge.
+- there are many other projects and organisations who are
+  heading in similar directions of trying to connect and
+  evolve agile open source strategies with business matters.
+  Especially models for developers distributed between
+  different countries allows people with special interests
+  to effectively work together and learn from each other.
+
 Concluding - the cumulative effects of an agile, open and
 dynamic team process combined with a market and curious
-clients creates new business models - agile business.
-A path that we hope others will follow.
-
-
-
-
+first adopters facilitates "agile business": a positive result
+is that a lot of people within the PyPy context found
+jobs which they enjoy and there now is evolving commercial
+interest despite the still early stages of the project - mostly
+from US companies though ... Why european companies, especially
+larger ones, appear to prefer taking dumb views on agile open-source
+development ("great, it's cheaper, no license fees!") is
+another interesting topic.

From hpk at codespeak.net Thu Dec 1 22:38:01 2005
From: hpk at codespeak.net (hpk at codespeak.net)
Date: Thu, 1 Dec 2005 22:38:01 +0100 (CET)
Subject: [pypy-svn] r20537 - pypy/extradoc/talk/22c3
Message-ID: <20051201213801.CCD2E27B61@code1.codespeak.net>

Author: hpk
Date: Thu Dec 1 22:38:01 2005
New Revision: 20537

Modified:
   pypy/extradoc/talk/22c3/agility_v1.txt.txt
Log:
another reviewing run over the whole document ...
Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt
==============================================================================
--- pypy/extradoc/talk/22c3/agility_v1.txt.txt	(original)
+++ pypy/extradoc/talk/22c3/agility_v1.txt.txt	Thu Dec 1 22:38:01 2005
@@ -1,33 +1,34 @@
 =========================================================================
-Agile Business and EU funding: sprint methods in a funded OSS project
+Open Source and EU funding: agile methods in a funded OSS project
 =========================================================================
 
 Abstract
 ==========
 
-This paper uses an evolutionary approach, a walkthrough
-through the history of the PyPy project, touching down on
-different aspects of agility. Agility played a key role from
-the beginning - the PyPy project started from some mails
-between a few people, quickly had a first one-week "sprint"
+This paper walks through different aspects of agility
+within the open-source driven PyPy project. Agility played a key role
+from the beginning - the PyPy project started from some mails
+between a few people, quickly had a first one-week meeting, a "sprint"
 from where it evolved into a structure that was able to carry
 out a research project - and got accepted by the European
 Union. During the course, two companies got founded and are
 now growing and employing key developers.
 
-Technical Development is strongly rooted in open-source
-contexts and this adds another agility aspect - dynamic
-communication, cooperation and exchange with other people and
+PyPy's technical development is strongly rooted in open-source
+contexts and this adds another agility aspect - free
+communication, co-operation and exchange with other people and
 projects.
 
-The process of getting EU-funding posed (and continues to pose) a
-challenge to the community-rooted PyPy project: how to connect
-agile open source culture with formal structures: exposition
-to requirements like planning, budget estimation, work
-distribution and resource tracking. After our first year we
-are reasonably happy with the balance we strike between
-organisations and EU funding on the one and the developers
-driving the technical aspects of the project on the other side.
+The process of obtaining EU-funding is a continous challenge to
+the community-rooted PyPy project: how to connect
+agile open source culture with formal structures: interaction
+with requirements like planning, budget estimation, work
+distribution and resource tracking.
+
+After our first "funded" year we are reasonably happy with the
+balance we strike between organisations and EU funding on the
+one and the developers driving the technical aspects of the
+project on the other side.
 
 
 Agility in Technical Development and Organisation
@@ -39,21 +40,21 @@
 The first bits of PyPy started during a one-week meeting, a "sprint",
 held at Trillke-Gut in Hildesheim February 2003. The sprint was inspired
 by practices used by other Python oriented projects such as Zope3. Originally the
-sprint methodology used in the Python community grew from practices within Zope
-Corporation. Their definition of a sprint was "two-day or three-day focused
+sprint methodology used in the Python community grew from practices applied by
+the Zope Corporation. Their definition of a sprint was "two-day or three-day focused
 development session, in which developers pair off together in a room and
 focus on building a particular subsystem".
 
 Sprinting up to a week became the initial driving factor in developing
 the code base and the community/people around it. The early
 PyPy sprints were organised by core developers together with local Pythonistas
-in Louvain LaNeuve, Gothenburg, Vilnius and Amsterdam. Sprints gave
+in Louvain La Neuve, Gothenburg, Vilnius and Amsterdam. Sprints gave
 the opportunity to both help, participate and influence the ideas within PyPy.
 
-Sprints are not really part of the traditional Agile
+Sprints are actually not part of the traditional Agile
 portfolio of techniques, the closest thing to it comes from
 Scrum who names the 30 days long programming iterations
-"sprints", covering a certain increment. In the Scrum method
+"sprints", covering a certain increment. With the Scrum method,
 considerable effort is put into performing the sprint planning
 as well as creating and documenting the "sprint backlog" which
 is then feedbacked into the "Product backlog". The sprint ends
@@ -62,9 +63,9 @@
 in which the team looks at ways to improve the development
 methodology and future sprints.
 
-To our knowledge, open-source projects are only sprinting up to a week
-which also reflects the fact that many contributors give their time
-and even money to gather and work together. This is different
+To our knowledge, open-source projects these days are sprinting for at most
+a week which reflects the fact that many contributors give their time
+and even money to gather and work together. This is odifferent
 from having fully funded people from one company working together.
 
 Why did PyPy choose sprinting as a key technique? It is a method that fits
@@ -73,15 +74,16 @@
 meetings, discussions etc) as well as acceleratedly (short
 increments and tasks, "doing" and testing instead of long startups of
 planning and requirement gathering). This means that most of the time a sprint is a great
-way of getting results, but also to get new people aquainted - a great
-method for dissemination and learning within the team.
+way of getting results and getting new people aquainted - a good
+method for dissemination of knowledge and learning within the team.
 
 .. image:: sprintprocess.gif
+
 
 Agile approaches: test-driven development
 -----------------------------------------
 
-Test-driven development is a cornerstone for programming
+Test-driven development is a technical cornerstone for programming
 efficiently together in a distributed team. Seen from the
 Agile Manifesto perspective it is right up there as one of the
 key elements since it puts focus on producing working code,
@@ -94,41 +96,43 @@
 the key problems identified by Frederick P. Brooks in the latest version of
 "The Mythical Man-Month" (unfortunately still very actual today) are
 estimating correct amount of time for communication and testing/debugging. Automated
-testing development and strict version tracking helps with those problems,
-especially in the hands of a team sprinting its way through the Python
-community - welcoming everyone to participate.
-
-Apart from rewriting the language within itself, PyPy also evolved a number of
-development tools useful for writing tests and glueing things together. PyPy's
-testing tool ("py.test") is used separately and continues to evolve on
-its own by now.
+testing, rather barrier-free communication and strict version tracking helps
+with those problems, especially in the hands of a team sprinting its way through
+the Python community - welcoming everyone to participate.
+
+Apart from rewriting a practical programming language within
+itself, PyPy also evolved a number of development tools useful
+for writing tests and glueing things together. PyPy's testing
+tool ("py.test") is used separately and evolves on its own by now.
 
 .. image:: plots/loc.png
 
 .. test driven development
 
+
 Agility: Open Communication and organisation
 ----------------------------------------------------
 
-Another agility aspect relates to the transparent and open communication
-about the project. Only very few (EU-contract related) documents are
-access restricted, everything else is freely available and modifiable.
-There are no hierarchies for commit rights. In fact, the hosting server
-also gives home to a couple of other projects and all projects share
-commit rights ("Coding Wiki"). Announcing Sprints, Releases and development
-goals lead to more and more people subscribing to mailing lists or
+Another agility aspect relates to transparent and open
+communication within the project. Only very few (EU-contract
+related) documents are access restricted, everything else is
+freely available. There are no hierarchies for commit rights.
+In fact, the hosting server also gives home to a couple of
+other projects and all projects share commit rights ("Coding
+Wiki"). Announcing Sprints, Releases and development goals
+lead to increasdingly many people subscribing to mailing lists or
 participating in development.
 
-Moreover, the PyPy developers evolved a model of weekly 30-minute
+Moreover, the PyPy developers installed a model of weekly 30-minute
 IRC chat meetings where topics are briefly discussed, delegated
 or decided upon: those meetings are open to all active developers
 and usually do not discuss internal EU matters much except that
-funded developers probably keep EU goals more in mind than others.
+funded developers keep EU goals more in mind than others.
 Minutes of these weekly developer meetings get archived and
 posted to the development list.
 
-A rather recent invention is "This week in PyPy" which tries to
-summarize what is going on in the lively IRC development #pypy
-channel - main place of technical coordination.
+A rather recent invention is the postings of "This week in PyPy"
+which try to summarize what is going on in the lively IRC
+development #pypy channel - main place of technical coordination.
 
 .. image:: plots/subscribers.png
 
@@ -215,9 +219,9 @@
 responsibilities.
 
 Other adjustments and recruiting companies with previous EU
-project experiences took place. There was also an recruitment
-of a company totally unrelated to the developer work being
-done so far - focused on process management and
+project experiences took place. There also is one company
+involved quite unrelated to the previous developer work -
+but rather focused on process management and
 designing learning processes with a background from the
 Chaospilot school in Aarhus, Denmark. When creating the formal
 consortium of seven partners, new cultures and perspectives
@@ -237,10 +241,10 @@
 keep "conceptual integrity" (Brooks) of the vision and the
 idea in the hands of the core developers. A somewhat negative
 result was the added workload and responsibility on developers
-regarding EU related work. It is interesting, though, that
+regarding EU related work. It is not too surprising that
 the consortium with its member organisation now employs a
 version-control/review based scheme regarding EU documents
-similar to the technical development approaches.
+reflecting the technical development approaches.
 
 It remains a challenge for all partners of the consortium,
 universities and companies alike, to connect an ongoing
@@ -251,7 +255,7 @@
 we are confident that things work out reasonably well.
 
 
-challenge: balancing agile OSS community structures with EU requirements
+challenge: balancing community interests with EU requirements
 ------------------------------------------------------------------------------
 
 The agile development process in the EU funded work of the
@@ -260,7 +264,7 @@
 developers to get in direct touch with each other. Sprinting
 around conferences also became a key strategy.
 
-Tut the nature of sprints changed when EU funding started. The
+But the nature of sprints changed when EU funding started. The
 need to meet milestones of promised *deliverables* and the
 goal to keep an open sprint process, still welcoming newcomers
 into the world of Pypy, made the sprints longer (at least 7
@@ -281,17 +285,17 @@
 members to attend the full time, disturbing other engagements etc.
 
 The project continues to try to enhance the method of sprinting, evaluating
-feedback from sprint participants. Maybe the implementation within the PyPy
+feedback from sprint participants. Maybe the implementation within the PyPy
 team is slowly conforming to the Scrum standard of sprinting, but not as a
 conscious effort?
 
 
-Managing diversities: agile business - a succesful marriage ?
+Managing diversities: agile business - a succesful marriage?
 --------------------------------------------------------------
 
 For a diverse group of organisations and people, agility is
 helpful at various levels: you cannot make all-encompassing
-plans and hope to simply follow them and succeed. New
+plans and hope to statically follow them and succeed. New
 developments, twists and opportunities evolve all the time.
 
 Our experience with evolving PyPy from a loose Open Source project
@@ -312,17 +316,17 @@
 - there are many other projects and organisations who are
   heading in similar directions of trying to connect and
   evolve agile open source strategies with business matters.
-  Especially models for developers distributed between
+  Emerging models for developers distributed between
   different countries allows people with special interests
   to effectively work together and learn from each other.
 
 Concluding - the cumulative effects of an agile, open and
 dynamic team process combined with a market and curious
-first adopters facilitates "agile business": a positive result
+first adopters facilitates agile business. A positive result
 is that a lot of people within the PyPy context found
-jobs which they enjoy and there now is evolving commercial
+enjoyable jobs and there now already is evolving commercial
 interest despite the still early stages of the project - mostly
-from US companies though ... Why european companies, especially
+from US companies though ... why european companies, especially
 larger ones, appear to prefer taking dumb views on agile open-source
 development ("great, it's cheaper, no license fees!") is
-another interesting topic.
+another interesting topic ...

From arigo at codespeak.net Thu Dec 1 22:48:11 2005
From: arigo at codespeak.net (arigo at codespeak.net)
Date: Thu, 1 Dec 2005 22:48:11 +0100 (CET)
Subject: [pypy-svn] r20538 - in pypy/branch/somepbc-refactoring/pypy/translator/squeak: . test
Message-ID: <20051201214811.E230627B5F@code1.codespeak.net>

Author: arigo
Date: Thu Dec 1 22:48:10 2005
New Revision: 20538

Modified:
   pypy/branch/somepbc-refactoring/pypy/translator/squeak/gensqueak.py
   pypy/branch/somepbc-refactoring/pypy/translator/squeak/test/test_oo.py
   pypy/branch/somepbc-refactoring/pypy/translator/squeak/test/test_squeaktrans.py
Log:
(pedronis, arigo)

Fixed the Squeak tests.
(Hopefully didn't harm gensqueak too much in the process.)
Modified: pypy/branch/somepbc-refactoring/pypy/translator/squeak/gensqueak.py
==============================================================================
--- pypy/branch/somepbc-refactoring/pypy/translator/squeak/gensqueak.py	(original)
+++ pypy/branch/somepbc-refactoring/pypy/translator/squeak/gensqueak.py	Thu Dec 1 22:48:10 2005
@@ -22,13 +22,13 @@
         words[i] = words[i].capitalize()
     return ''.join(words)
 
-def arg_names(func, names = None):
+def arg_names(graph):
     #XXX need to handle more args, see
     #    http://docs.python.org/ref/types.html#l2h-139
-    co = func.func_code
-    if not names:
-        names = co.co_varnames
-    return names[:co.co_argcount]
+    names, vararg, kwarg = graph.signature
+    assert vararg is None
+    assert kwarg is None
+    return names
 
 def selector(name, args):
     s = name
@@ -82,39 +82,38 @@
         self.sqdir = sqdir
         self.translator = translator
         self.modname = (modname or
-                        translator.functions[0].__name__)
+                        translator.graphs[0].name)
         self.sqnames = {
             Constant(None).key: 'nil',
             Constant(False).key: 'false',
            Constant(True).key: 'true',
        }
        self.seennames = {}
-        self.pendingfunctions = []
+        self.pendinggraphs = []
        self.pendingclasses = []
        self.pendingmethods = []
        self.classes = []
        self.methods = []
 
        t = self.translator
-        func = t.functions[0]
-        graph = t.getflowgraph(func)
+        graph = t.graphs[0]
        simplify_graph(graph)
        remove_direct_loops(t, graph)
        checkgraph(graph)
        #self.translator.view()
 
-        self.nameof(func) #add to pending
-        file = self.sqdir.join('%s.st' % func.__name__).open('w')
+        self.nameof(graph) #add to pending
+        file = self.sqdir.join('%s.st' % graph.name).open('w')
        self.gen_source(file)
        file.close()
        #self.translator.view()
 
     def gen_source(self, file):
-        while self.pendingfunctions or self.pendingclasses or self.pendingmethods:
-            while self.pendingfunctions:
-                func = self.pendingfunctions.pop()
-                self.gen_sqfunction(func, file)
+        while self.pendinggraphs or self.pendingclasses or self.pendingmethods:
+            while self.pendinggraphs:
+                graph = self.pendinggraphs.pop()
+                self.gen_sqfunction(graph, file)
            while self.pendingclasses:
                inst = self.pendingclasses.pop()
                self.gen_sqclass(inst, file)
@@ -146,7 +145,7 @@
 
        print >> f
 
-    def gen_sqfunction(self, func, f):
+    def gen_sqfunction(self, graph, f):
 
        def expr(v):
            if isinstance(v, Variable):
@@ -239,12 +238,9 @@
                yield "    %s" % line
            yield "]"
 
-        t = self.translator
-        graph = t.getflowgraph(func)
-
        start = graph.startblock
        args = [expr(arg) for arg in start.inputargs]
-        print >> f, '%s' % signature(self.nameof(func), args)
+        print >> f, '%s' % signature(self.nameof(graph), args)
 
        loops = LoopFinder(start).loops
 
@@ -276,27 +272,35 @@
     def nameof_str(self, s):
        return "'s'"
 
-    def nameof_function(self, func):
+    def nameof_FunctionGraph(self, graph):
        #XXX this should actually be a StaticMeth
-        printable_name = '(%s:%d) %s' % (
-            func.func_globals.get('__name__', '?'),
-            func.func_code.co_firstlineno,
-            func.__name__)
-        if self.translator.frozen:
-            if func not in self.translator.flowgraphs:
-                print "NOT GENERATING", printable_name
-                return self.skipped_function(func)
-        else:
-            if (func.func_doc and
-                func.func_doc.lstrip().startswith('NOT_RPYTHON')):
-                print "skipped", printable_name
-                return self.skipped_function(func)
-        name = self.unique_name(func.__name__)
-        args = arg_names(func)
+        name = self.unique_name(graph.name.split('.')[-1])
+        args = arg_names(graph)
        sel = selector(name, args)
-        self.pendingfunctions.append(func)
+        self.pendinggraphs.append(graph)
        return sel
 
+    #def nameof_function(self, func):
+    #    #XXX this should actually be a StaticMeth
+    #    printable_name = '(%s:%d) %s' % (
+    #        func.func_globals.get('__name__', '?'),
+    #        func.func_code.co_firstlineno,
+    #        func.__name__)
+    #    if self.translator.frozen:
+    #        if func not in self.translator.flowgraphs:
+    #            print "NOT GENERATING", printable_name
+    #            return self.skipped_function(func)
+    #    else:
+    #        if (func.func_doc and
+    #            func.func_doc.lstrip().startswith('NOT_RPYTHON')):
+    #            print "skipped", printable_name
+    #            return self.skipped_function(func)
+    #    name = self.unique_name(func.__name__)
+    #    args = arg_names(func)
+    #    sel = selector(name, args)
+    #    self.pendingfunctions.append(func)
+    #    return sel
+
     def nameof_Instance(self, inst):
        if inst is None:
            #empty superclass

Modified: pypy/branch/somepbc-refactoring/pypy/translator/squeak/test/test_oo.py
==============================================================================
--- pypy/branch/somepbc-refactoring/pypy/translator/squeak/test/test_oo.py	(original)
+++ pypy/branch/somepbc-refactoring/pypy/translator/squeak/test/test_oo.py	Thu Dec 1 22:48:10 2005
@@ -1,16 +1,15 @@
 from pypy.tool.udir import udir
 from pypy.translator.squeak.gensqueak import GenSqueak
-from pypy.translator.translator import Translator
+from pypy.translator.translator import TranslationContext
 from pypy.rpython.ootypesystem.ootype import *
 
 def build_sqfunc(func, args=[], view=False):
     try: func = func.im_func
     except AttributeError: pass
-    t = Translator(func)
-    t.annotate(args)
-    t.specialize(type_system="ootype")
-    t.simplify()
+    t = TranslationContext()
+    t.buildannotator().build_types(func, args)
+    t.buildrtyper(type_system="ootype").specialize()
     if view:
        t.viewcg()
     GenSqueak(udir, t)

Modified: pypy/branch/somepbc-refactoring/pypy/translator/squeak/test/test_squeaktrans.py
==============================================================================
--- pypy/branch/somepbc-refactoring/pypy/translator/squeak/test/test_squeaktrans.py	(original)
+++ pypy/branch/somepbc-refactoring/pypy/translator/squeak/test/test_squeaktrans.py	Thu Dec 1 22:48:10 2005
@@ -1,7 +1,7 @@
 from pypy.tool.udir import udir
 from pypy.translator.test import snippet
 from pypy.translator.squeak.gensqueak import GenSqueak
-from pypy.translator.translator import Translator
+from pypy.translator.translator import TranslationContext
 
 
 def looping(i = (int), j = (int)):
@@ -15,8 +15,9 @@
     def build_sqfunc(self, func):
        try: func = func.im_func
        except AttributeError: pass
-        t = Translator(func)
-        t.simplify()
+        t = TranslationContext()
+        graph = t.buildflowgraph(func)
+        t._prebuilt_graphs[func] = graph
        self.gen = GenSqueak(udir, t)
 
     def test_simple_func(self):

From arigo at codespeak.net Thu Dec 1 22:49:26 2005
From: arigo at codespeak.net (arigo at codespeak.net)
Date: Thu, 1 Dec 2005 22:49:26 +0100 (CET)
Subject: [pypy-svn] r20539 - pypy/branch/somepbc-refactoring/pypy/translator/c/test
Message-ID: <20051201214926.39CA727B61@code1.codespeak.net>

Author: arigo
Date: Thu Dec 1 22:49:25 2005
New Revision: 20539

Modified:
   pypy/branch/somepbc-refactoring/pypy/translator/c/test/test_annotated.py
Log:
(pedronis, arigo)

Removed this hack, which is apparently not needed any more and breaks the
pyrex tests *if* they are run after the C tests in the same process.  Duh.


Modified: pypy/branch/somepbc-refactoring/pypy/translator/c/test/test_annotated.py
==============================================================================
--- pypy/branch/somepbc-refactoring/pypy/translator/c/test/test_annotated.py	(original)
+++ pypy/branch/somepbc-refactoring/pypy/translator/c/test/test_annotated.py	Thu Dec 1 22:49:25 2005
@@ -13,18 +13,13 @@
 
     def annotatefunc(self, func):
        t = TranslationContext(simplifying=True)
-        if hasattr(func, 'starting_types'):
-            argstypelist = func.starting_types
-        else:
-            # builds starting-types from func_defs
-            argstypelist = []
-            if func.func_defaults:
-                for spec in func.func_defaults:
-                    if isinstance(spec, tuple):
-                        spec = spec[0] # use the first type only for the tests
-                    argstypelist.append(spec)
-            func.func_defaults = None
-            func.starting_types = argstypelist
+        # builds starting-types from func_defs
+        argstypelist = []
+        if func.func_defaults:
+            for spec in func.func_defaults:
+                if isinstance(spec, tuple):
+                    spec = spec[0] # use the first type only for the tests
+                argstypelist.append(spec)
        a = t.buildannotator()
        a.build_types(func, argstypelist)
        a.simplify()

From bea at codespeak.net Thu Dec 1 22:49:51 2005
From: bea at codespeak.net (bea at codespeak.net)
Date: Thu, 1 Dec 2005 22:49:51 +0100 (CET)
Subject: [pypy-svn] r20540 - pypy/extradoc/talk/22c3
Message-ID: <20051201214951.BCDD027B64@code1.codespeak.net>

Author: bea
Date: Thu Dec 1 22:49:50 2005
New Revision: 20540

Modified:
   pypy/extradoc/talk/22c3/agility_v1.txt.txt
Log:
fixed typo


Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt
==============================================================================
--- pypy/extradoc/talk/22c3/agility_v1.txt.txt	(original)
+++ pypy/extradoc/talk/22c3/agility_v1.txt.txt	Thu Dec 1 22:49:50 2005
@@ -65,7 +65,7 @@
 
 To our knowledge, open-source projects these days are sprinting for at most
 a week which reflects the fact that many contributors give their time
-and even money to gather and work together. This is odifferent
+and even money to gather and work together. This is different
 from having fully funded people from one company working together.
 
 Why did PyPy choose sprinting as a key technique? It is a method that fits

From hpk at codespeak.net Thu Dec 1 22:56:45 2005
From: hpk at codespeak.net (hpk at codespeak.net)
Date: Thu, 1 Dec 2005 22:56:45 +0100 (CET)
Subject: [pypy-svn] r20541 - pypy/extradoc/talk/22c3
Message-ID: <20051201215645.518C227B5F@code1.codespeak.net>

Author: hpk
Date: Thu Dec 1 22:56:44 2005
New Revision: 20541

Modified:
   pypy/extradoc/talk/22c3/agility_v1.txt.txt
Log:
streamlined the last sentence a bit.


Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt
==============================================================================
--- pypy/extradoc/talk/22c3/agility_v1.txt.txt	(original)
+++ pypy/extradoc/talk/22c3/agility_v1.txt.txt	Thu Dec 1 22:56:44 2005
@@ -327,6 +327,6 @@
 enjoyable jobs and there now already is evolving commercial
 interest despite the still early stages of the project - mostly
 from US companies though ... why european companies, especially
-larger ones, appear to prefer taking dumb views on agile open-source
-development ("great, it's cheaper, no license fees!") is
-another interesting topic ...
+larger ones, appear to prefer taking rather naive views on agile
+open-source development ("great, it's cheaper, no license fees!")
+is another interesting topic.

From arigo at codespeak.net Thu Dec 1 22:58:07 2005
From: arigo at codespeak.net (arigo at codespeak.net)
Date: Thu, 1 Dec 2005 22:58:07 +0100 (CET)
Subject: [pypy-svn] r20542 - pypy/branch/somepbc-refactoring/pypy/translator/goal
Message-ID: <20051201215807.4D38D27B5F@code1.codespeak.net>

Author: arigo
Date: Thu Dec 1 22:58:06 2005
New Revision: 20542

Modified:
   pypy/branch/somepbc-refactoring/pypy/translator/goal/driver.py
Log:
(pedronis, arigo)

fixed the --llinterp option of translate_pypy.


Modified: pypy/branch/somepbc-refactoring/pypy/translator/goal/driver.py
==============================================================================
--- pypy/branch/somepbc-refactoring/pypy/translator/goal/driver.py	(original)
+++ pypy/branch/somepbc-refactoring/pypy/translator/goal/driver.py	Thu Dec 1 22:58:06 2005
@@ -262,10 +262,12 @@
        py.log.setconsumer("llinterp operation", None)
 
        translator = self.translator
-        interp = LLInterpreter(translator.flowgraphs, translator.rtyper)
-        v = interp.eval_function(translator.entrypoint,
-                                 self.extra.get('get_llinterp_args',
-                                                lambda: [])())
+        interp = LLInterpreter(translator.rtyper)
+        bk = translator.annotator.bookkeeper
+        graph = bk.getdesc(self.entry_point).cachedgraph(None)
+        v = interp.eval_graph(graph,
+                              self.extra.get('get_llinterp_args',
+                                             lambda: [])())
 
        log.llinterpret.event("result -> %s" % v)
 
        #

From bea at codespeak.net Thu Dec 1 23:30:14 2005
From: bea at codespeak.net (bea at codespeak.net)
Date: Thu, 1 Dec 2005 23:30:14 +0100 (CET)
Subject: [pypy-svn] r20543 - pypy/extradoc/talk/22c3
Message-ID: <20051201223014.312FB27B57@code1.codespeak.net>

Author: bea
Date: Thu Dec
1 23:30:12 2005 New Revision: 20543 Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt Log: grammar fixes Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt ============================================================================== --- pypy/extradoc/talk/22c3/agility_v1.txt.txt (original) +++ pypy/extradoc/talk/22c3/agility_v1.txt.txt Thu Dec 1 23:30:12 2005 @@ -7,11 +7,11 @@ This paper walks through different aspects of agility within the open-source driven PyPy project. Agility played a key role -from the beginning - the PyPy project started from some mails -between a few people, quickly had a first one-week meeting, a "sprint" +from the beginning. The PyPy project started from some mails +between a few people, quickly had a first one-week meeting, a "sprint", from where it evolved into a structure that was able to carry out a research project - and got accepted by the European -Union. During the course, two companies got founded and are +Union. During the course, two companies were founded. They are now growing and employing key developers. PyPy's technical development is strongly rooted in open-source @@ -20,8 +20,8 @@ projects. The process of obtaining EU-funding is a continous challenge to -the community-rooted PyPy project: how to connect -agile open source culture with formal structures: interaction +the community-rooted PyPy project; how to connect +agile open source culture with formal structures, interaction with requirements like planning, budget estimation, work distribution and resource tracking. @@ -39,9 +39,9 @@ The first bits of PyPy started during a one-week meeting, a "sprint", held at Trillke-Gut in Hildesheim February 2003. The sprint was inspired by -practices used by other Python oriented projects such as Zope3. Originally the +practices used by other Python projects such as Zope3. Originally the sprint methodology used in the Python community grew from practices applied by -the Zope Corporation. 
Their definition of a sprint was "two-day or three-day focused +the Zope Corporation. Their definition of a sprint was: "two-day or three-day focused development session, in which developers pair off together in a room and focus on building a particular subsystem". @@ -92,12 +92,12 @@ Seen from an Open Source community perspective it is a vitalising strategy - especially in combination with a transparent open process in which anyone -interested can participate - if only for just a few days at a sprint. Some of +interested can participate - if only for just a few days at a sprint. One of the key problems identified by Frederick P. Brooks in the latest version of -"The Mythical Man-Month" (unfortunately still very actual today) are estimating +"The Mythical Man-Month" (unfortunately still very actual today) is estimating correct amount of time for communication and testing/debugging. Automated testing, rather barrier-free communication and strict version tracking helps -with those problems, especially in the hands of a team sprinting its way through +with that problem, especially in the hands of a team sprinting its way through the Python community - welcoming everyone to participate. Apart from rewriting a practical programming language within @@ -116,22 +116,22 @@ communication within the project. Only very few (EU-contract related) documents are access restricted, everything else is freely available. There are no hierarchies for commit rights. -In fact, the hosting server also gives home to a couple of +In fact, the server also hosts a couple of other projects and all projects share commit rights ("Coding Wiki"). Announcing Sprints, Releases and development goals -lead to increasdingly many people subscribing to mailing lists or +lead to an increasing amount of people subscribing to mailing lists or participating in development. 
-Moreover, the PyPy developers installed a model of weekly 30-minute -IRC chat meetings where topics are briefly discussed, delegated -or decided upon: those meetings are open to all active developers -and usually do not discuss internal EU matters much except that +Moreover, the PyPy developers implemented a method with weekly 30-minute +IRC chat meetings where topics were briefly discussed, delegated +or decided upon. Those meetings are open to all active developers +and usually do not touch upon internal EU matters much except that funded developers keep EU goals more in mind than others. Minutes of these weekly developer meetings get archived and posted to the development list. -A rather recent invention is the postings of "This week in PyPy" -which try to summarize what is going on in the lively IRC +A rather recent invention is the postings of "This week in PyPy". +The text is a summary of what is going on in the lively IRC development #pypy channel - main place of technical coordination. .. image:: plots/subscribers.png @@ -156,37 +156,37 @@ There had been a growing interest from the European Commission, IST division, to look closer at the Open Source world and its achievements. Several funded -research projects in the 5th framework programme studied the phenomenon +research projects in the 5th framework programme studied the phenomen (FLOSS-POLS, FLOSS) - its organization, business models and licensings. A few other funded software projects used Open Source in their work as tools (languages and applications). There was no previous experience of an Open Source community based project making a bid for funding. -The areas in the 6th Framework programme (second call) fitted very well +The areas in the 6th Framework programme (second call) fit very well with the objectives of PyPy. The idea of strengthening the European Software development companies and businesses with supporting an open source language -implementation was new but appealing to the EU. 
But being an Open Source -project wasn't enough - the challenges and the idea of an flexible, +implementation was new but appealing to the EU. However, being an Open Source +project wasn't enough. The challenges and the idea of a flexible, configurable "translator" or "compiler" met the research targets of the FP6, as well as trying out and documenting the agile methodology being used. It is interesting to note that todays computer industrial language -research and development happens mostly in the US. +research and development occurs mostly in the US. -In short, we argued that EU funding allows the project to go for +In short, we argued that EU funding allowed the project to go from reaching a critical mass and position to continue to evolve from -there and that it would help European Organisations to make some +there, and that it would help European Organisations to make some ground. Acting on this strategy proved to be a more difficult task. The -entire proposal and negotiation process took over a year (Autumn 2003 till +entire proposal and negotiation process took over a year (Autumn 2003 until November 2004). Satisfying the formal requirements, a proper description of planned work, had not previously been part of the development focus and both the EU and the parties involved had to adapt to the situation. Yet, drafting the -high-level requirements (in total 14 workpackages and 58 deliverables) was made +high-level requirements (in total 14 workpackages and 58 deliverables) was done using the same version-control/open-communication based work style, including evolving the proposal at sprints. Writing the proposal and specifying according objectives on a higher level has proved to be generally useful for clarifying goals -on a longer term - also helping others better understand the project. 
Unfortunately the negotiations with the EU got stuck in organizational limbo and the project is still suffering from @@ -195,13 +195,13 @@ based on a non-profit association. This solution wasn't seen as realistic or feasible by the EU although we think it remains a viable approach for the future. During -negotiations, we got to an alternative solution which - however - -has a few drawbacks: Contributors have to become Contract Partners within -the EU-level Consortium (which is by itself not hard) and can then at least +negotiations, we got to an alternative solution which +had a few drawbacks: contributors have to become Contract Partners within +the EU-level Consortium (which is by itself not difficult) and can then at least claim travel and accomodation costs when attending sprints. However, this construction does not allow them to get paid for -work time and also has some formal requirements. This practically +work time and also has some formal requirements. In practice this leads to current considerations of developers to shift private money between them in order to circumvent the current problems with implementing an agile model within the EU contract framing. @@ -211,20 +211,20 @@ ---------------------------------------------------------------------- The guiding idea for receiving funding is to have organisations -through which key developers and other parties are employed. +in which key developers and other parties are employed. Two companies out of the seven organisations in the initial -consortium got funded during the EU negotiation process - +consortium were funded during the EU negotiation process. what first might have felt as an EU-related obstacle became an -opportunity, but with some overhead in legal and organizational +opportunity, but with some overhead like legal and organizational responsibilities. Other adjustments and recruiting companies with previous EU project experiences took place. 
There also is one company -involved quite unrelated to the previous developer work - +involved quite unrelated to the previous developer work but rather focused on process management and designing learning processes with a background from the Chaospilot school in Aarhus, Denmark. When creating the formal -consortium of seven partners, new cultures and perspectives +consortium of seven partners new cultures and perspectives were mixed with the strong collaborative Open Source core team, adding new complexities in communication and cooperation. Getting the new "playmates" to adopt the vision, @@ -251,7 +251,7 @@ medium-scale open-source project with EU regulations and requirements - not to speak of the fact that companies need to fund 50% of the costs themselves. It is, in fact, too early -to judge on the overall success of our approaches although +to judge the overall success of our approaches although we are confident that things work out reasonably well. From arigo at codespeak.net Thu Dec 1 23:32:48 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 1 Dec 2005 23:32:48 +0100 (CET) Subject: [pypy-svn] r20544 - pypy/tag/dist-ext-someobject Message-ID: <20051201223248.9DE8F27B5F@code1.codespeak.net> Author: arigo Date: Thu Dec 1 23:32:48 2005 New Revision: 20544 Added: pypy/tag/dist-ext-someobject/ - copied from r20542, pypy/dist/ Log: Saved a snapshot of the trunk before the somepbc-refactoring merge. It is difficult to merge r19917, so we are ignoring it. It can be redone after the merge, slightly differently -- the Translator class is going away. (Also, only Christian can do so as there are no test to make sure things still work as intended). 
From hpk at codespeak.net Thu Dec 1 23:33:40 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 1 Dec 2005 23:33:40 +0100 (CET) Subject: [pypy-svn] r20545 - pypy/extradoc/talk/22c3 Message-ID: <20051201223340.D4CD927B61@code1.codespeak.net> Author: hpk Date: Thu Dec 1 23:33:40 2005 New Revision: 20545 Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt Log: shorten things a bit by removing one para about sprints Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt ============================================================================== --- pypy/extradoc/talk/22c3/agility_v1.txt.txt (original) +++ pypy/extradoc/talk/22c3/agility_v1.txt.txt Thu Dec 1 23:33:40 2005 @@ -255,21 +255,15 @@ we are confident that things work out reasonably well. -challenge: balancing community interests with EU requirements +challenges: balancing community interests with EU requirements ------------------------------------------------------------------------------ -The agile development process in the EU funded work of the -PyPy project centers around sprints - which are planned to -take place every 6th week at different places to allow many -developers to get in direct touch with each other. Sprinting -around conferences also became a key strategy. - -But the nature of sprints changed when EU funding started. The +The nature of sprints changed when EU funding started. The need to meet milestones of promised *deliverables* and the goal to keep an open sprint process, still welcoming newcomers into the world of Pypy, made the sprints longer (at least 7 days with a break day in the middle) but also changed the -nature of the sprints. The team started to distuingish between +nature of the sprints. The team started to distuingish between sprints open for all to attend, without any prior PyPy experience, and sprints requiring earlier PyPy involvement. 
Tutorials, start up planning meetings as well as daily status meetings evolved, @@ -284,7 +278,7 @@ prolonged sprints could possibly make it more difficult for non consortium members to attend the full time, disturbing other engagements etc. -The project continues to try to enhance the method of sprinting, evaluating +The project continues to try to adapt the method of sprinting, evaluating feedback from sprint participants. Maybe the implementation within the PyPy team is slowly conforming to the Scrum standard of sprinting, but not as a conscious effort? From arigo at codespeak.net Thu Dec 1 23:35:54 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 1 Dec 2005 23:35:54 +0100 (CET) Subject: [pypy-svn] r20546 - in pypy/dist/pypy: annotation translator translator/c Message-ID: <20051201223554.5C2E627B5F@code1.codespeak.net> Author: arigo Date: Thu Dec 1 23:35:54 2005 New Revision: 20546 Removed: pypy/dist/pypy/annotation/bookkeeper.py pypy/dist/pypy/translator/c/pyobj.py pypy/dist/pypy/translator/translator.py Log: Reverting r19917, step 1 (see pypy/tag/dist-ext-someobject for explanations) From arigo at codespeak.net Thu Dec 1 23:37:01 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 1 Dec 2005 23:37:01 +0100 (CET) Subject: [pypy-svn] r20547 - in pypy/dist/pypy: annotation translator translator/c Message-ID: <20051201223701.ACE9927B5F@code1.codespeak.net> Author: arigo Date: Thu Dec 1 23:37:01 2005 New Revision: 20547 Added: pypy/dist/pypy/annotation/bookkeeper.py - copied unchanged from r19916, pypy/dist/pypy/annotation/bookkeeper.py pypy/dist/pypy/translator/c/pyobj.py - copied unchanged from r19916, pypy/dist/pypy/translator/c/pyobj.py pypy/dist/pypy/translator/translator.py - copied unchanged from r19916, pypy/dist/pypy/translator/translator.py Log: Reverted r19917, step 2. 
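Two commits below, r20549 adds `ll_hash_float` to `rpython/rfloat.py`, replaying the CPython float-hashing scheme so float-keyed dicts work at the low level. The arithmetic is self-contained and runs outside PyPy; here is a plain-Python sketch of the same scheme (it matches CPython 2.x's `float_hash`, not the modular hashing modern CPython uses, so do not expect it to equal today's built-in `hash`):

```python
import math

TAKE_NEXT = float(2 ** 31)

def ll_hash_float(f):
    # Split f into mantissa and exponent (f == v * 2**expo, 0.5 <= |v| < 1),
    # fold the mantissa into two 31-bit integer chunks, and mix in the
    # exponent shifted left by 15 -- the scheme from r20549's rfloat.py.
    v, expo = math.frexp(f)
    v *= TAKE_NEXT
    hipart = int(v)
    v = (v - float(hipart)) * TAKE_NEXT
    return hipart + int(v) + (expo << 15)
```

For 1.5, `math.frexp` gives (0.75, 1), so the hash is 0.75 * 2**31 + (1 << 15) = 1610645504; equal floats hash equal, which is the invariant the rtyper's dicts need.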
From cfbolz at codespeak.net Thu Dec 1 23:38:42 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 1 Dec 2005 23:38:42 +0100 (CET) Subject: [pypy-svn] r20548 - in pypy/extradoc/talk/22c3: . plots Message-ID: <20051201223842.54E4D27B61@code1.codespeak.net> Author: cfbolz Date: Thu Dec 1 23:38:31 2005 New Revision: 20548 Added: pypy/extradoc/talk/22c3/agility.pdf pypy/extradoc/talk/22c3/plots/loc.pdf (contents, props changed) pypy/extradoc/talk/22c3/plots/subscribers.pdf (contents, props changed) pypy/extradoc/talk/22c3/sprintprocess.png (contents, props changed) Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt Log: use pdfs for figures and proper latex code. Added: pypy/extradoc/talk/22c3/agility.pdf ============================================================================== Files (empty file) and pypy/extradoc/talk/22c3/agility.pdf Thu Dec 1 23:38:31 2005 differ Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt ============================================================================== --- pypy/extradoc/talk/22c3/agility_v1.txt.txt (original) +++ pypy/extradoc/talk/22c3/agility_v1.txt.txt Thu Dec 1 23:38:31 2005 @@ -77,8 +77,11 @@ way of getting results and getting new people aquainted - a good method for dissemination of knowledge and learning within the team. -.. image:: sprintprocess.gif - +.. raw:: latex + + \begin{figure*}[htbp]\begin{center} + \scalebox{0.500000}{\includegraphics{sprintprocess.png}} + \end{center}\end{figure*} Agile approaches: test-driven development ----------------------------------------- @@ -105,9 +108,13 @@ for writing tests and glueing things together. PyPy's testing tool ("py.test") is used separately and evolves on its own by now. -.. image:: plots/loc.png -.. test driven development +.. raw:: latex + + \begin{figure*}[htbp]\begin{center} + \scalebox{0.75}{\includegraphics{plots/loc.pdf}} + \end{center}\end{figure*} +.. 
test driven development Agility: Open Communication and organisation ---------------------------------------------------- @@ -134,7 +141,12 @@ The text is a summary of what is going on in the lively IRC development #pypy channel - main place of technical coordination. -.. image:: plots/subscribers.png + +.. raw:: latex + + \begin{figure*}[htbp]\begin{center} + \scalebox{0.75}{\includegraphics{plots/subscribers.pdf}} + \end{center}\end{figure*} .. overview of PyPy mailing list subscriptions Added: pypy/extradoc/talk/22c3/plots/loc.pdf ============================================================================== Binary file. No diff available. Added: pypy/extradoc/talk/22c3/plots/subscribers.pdf ============================================================================== Binary file. No diff available. Added: pypy/extradoc/talk/22c3/sprintprocess.png ============================================================================== Binary file. No diff available. From arigo at codespeak.net Thu Dec 1 23:41:53 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 1 Dec 2005 23:41:53 +0100 (CET) Subject: [pypy-svn] r20549 - in pypy/branch/somepbc-refactoring/pypy: annotation rpython rpython/test Message-ID: <20051201224153.04F4727B5F@code1.codespeak.net> Author: arigo Date: Thu Dec 1 23:41:52 2005 New Revision: 20549 Modified: pypy/branch/somepbc-refactoring/pypy/annotation/unaryop.py pypy/branch/somepbc-refactoring/pypy/rpython/rfloat.py pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rfloat.py Log: (tismer, replayed from r20152) added support for float valued dict keys Modified: pypy/branch/somepbc-refactoring/pypy/annotation/unaryop.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/annotation/unaryop.py (original) +++ pypy/branch/somepbc-refactoring/pypy/annotation/unaryop.py Thu Dec 1 23:41:52 2005 @@ -191,6 +191,9 @@ return 
getbookkeeper().immutablevalue(bool(self.const)) return SomeBool() + def hash(flt): + return SomeInteger() + class __extend__(SomeInteger): def invert(self): Modified: pypy/branch/somepbc-refactoring/pypy/rpython/rfloat.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/rpython/rfloat.py (original) +++ pypy/branch/somepbc-refactoring/pypy/rpython/rfloat.py Thu Dec 1 23:41:52 2005 @@ -10,6 +10,7 @@ from pypy.rpython import rstr from pypy.rpython.rmodel import log +import math class __extend__(annmodel.SomeFloat): def rtyper_makerepr(self, rtyper): @@ -105,7 +106,10 @@ return float(value) def get_ll_eq_function(self): - return None + return None + + def get_ll_hash_function(self): + return ll_hash_float def rtype_is_true(_, hop): vlist = hop.inputargs(Float) @@ -133,7 +137,27 @@ from pypy.rpython.module.ll_strtod import ll_strtod_formatd return ll_strtod_formatd(percent_f, f) + def rtype_hash(_, hop): + v_flt, = hop.inputargs(float_repr) + return hop.gendirectcall(ll_hash_float, v_flt) + percent_f = string_repr.convert_const("%f") + +TAKE_NEXT = float(2**31) + +def ll_hash_float(f): + """ + this implementation is identical to the CPython implementation, + despite the fact that the integer case is not treated, specially. + This should be special-cased in W_FloatObject. + In the low-level case, floats cannot be used with ints in dicts, anyway. 
+ """ + v, expo = math.frexp(f) + v *= TAKE_NEXT + hipart = int(v) + v = (v - float(hipart)) * TAKE_NEXT + x = hipart + int(v) + (expo << 15) + return x # # _________________________ Conversions _________________________ Modified: pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rfloat.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rfloat.py (original) +++ pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rfloat.py Thu Dec 1 23:41:52 2005 @@ -49,3 +49,9 @@ res = interpret(fn, [1.5]) assert float(''.join(res.chars)) == 1.5 + +def test_hash(): + def fn(f): + return hash(f) + res = interpret(fn, [1.5]) + assert res == hash(1.5) From arigo at codespeak.net Thu Dec 1 23:44:38 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 1 Dec 2005 23:44:38 +0100 (CET) Subject: [pypy-svn] r20550 - pypy/branch/somepbc-refactoring/pypy/tool Message-ID: <20051201224438.DD65627B5F@code1.codespeak.net> Author: arigo Date: Thu Dec 1 23:44:38 2005 New Revision: 20550 Modified: pypy/branch/somepbc-refactoring/pypy/tool/sourcetools.py Log: (tismer, replayed from r19693) support display of source attached by compile2 Modified: pypy/branch/somepbc-refactoring/pypy/tool/sourcetools.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/tool/sourcetools.py (original) +++ pypy/branch/somepbc-refactoring/pypy/tool/sourcetools.py Thu Dec 1 23:44:38 2005 @@ -101,8 +101,11 @@ find the parameters of formatting generated methods and functions. 
""" - src = inspect.getsource(object) name = inspect.getfile(object) + if hasattr(name, '__source__'): + src = str(name.__source__) + else: + src = inspect.getsource(object) if hasattr(name, "__sourceargs__"): return src % name.__sourceargs__ return src From hpk at codespeak.net Thu Dec 1 23:47:31 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 1 Dec 2005 23:47:31 +0100 (CET) Subject: [pypy-svn] r20551 - pypy/extradoc/talk/22c3 Message-ID: <20051201224731.6E3AA27B61@code1.codespeak.net> Author: hpk Date: Thu Dec 1 23:47:30 2005 New Revision: 20551 Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt Log: renaming of the main title and a section Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt ============================================================================== --- pypy/extradoc/talk/22c3/agility_v1.txt.txt (original) +++ pypy/extradoc/talk/22c3/agility_v1.txt.txt Thu Dec 1 23:47:30 2005 @@ -1,5 +1,5 @@ ========================================================================= -Open Source and EU funding: agile methods in a funded OSS project +Open Source, EU Funding and Agile Methods ========================================================================= Abstract @@ -296,7 +296,7 @@ conscious effort? -Managing diversities: agile business - a succesful marriage? +Managing diversities: agile business -------------------------------------------------------------- For a diverse group of organisations and people, agility is From cfbolz at codespeak.net Thu Dec 1 23:50:05 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 1 Dec 2005 23:50:05 +0100 (CET) Subject: [pypy-svn] r20552 - in pypy/extradoc/talk/22c3: . 
plots Message-ID: <20051201225005.88FC927B57@code1.codespeak.net> Author: cfbolz Date: Thu Dec 1 23:49:51 2005 New Revision: 20552 Added: pypy/extradoc/talk/22c3/agility.sty Modified: pypy/extradoc/talk/22c3/agility.pdf pypy/extradoc/talk/22c3/agility_v1.txt.txt pypy/extradoc/talk/22c3/plots/loc.pdf pypy/extradoc/talk/22c3/plots/subscribers.pdf Log: fixes + regenerate pdf Modified: pypy/extradoc/talk/22c3/agility.pdf ============================================================================== Files pypy/extradoc/talk/22c3/agility.pdf (original) and pypy/extradoc/talk/22c3/agility.pdf Thu Dec 1 23:49:51 2005 differ Added: pypy/extradoc/talk/22c3/agility.sty ============================================================================== --- (empty file) +++ pypy/extradoc/talk/22c3/agility.sty Thu Dec 1 23:49:51 2005 @@ -0,0 +1,3 @@ +\author {Bea D\"uring\\Change Maker\\\texttt{bea at changemaker.nu} \and + Holger Krekel\\merlinux GmbH\\\texttt{hpk at merlinux.de}} +\date{} Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt ============================================================================== --- pypy/extradoc/talk/22c3/agility_v1.txt.txt (original) +++ pypy/extradoc/talk/22c3/agility_v1.txt.txt Thu Dec 1 23:49:51 2005 @@ -2,8 +2,9 @@ Open Source, EU Funding and Agile Methods ========================================================================= -Abstract -========== +.. raw:: latex + + \begin{abstract} This paper walks through different aspects of agility within the open-source driven PyPy project. Agility played a key role @@ -30,6 +31,9 @@ one and the developers driving the technical aspects of the project on the other side. +.. raw:: latex + + \end{abstract} Agility in Technical Development and Organisation ========================================================= @@ -111,9 +115,10 @@ .. 
raw:: latex \begin{figure*}[htbp]\begin{center} - \scalebox{0.75}{\includegraphics{plots/loc.pdf}} + \scalebox{0.7}{\includegraphics{plots/loc.pdf}} \end{center}\end{figure*} + .. test driven development Agility: Open Communication and organisation @@ -141,15 +146,12 @@ The text is a summary of what is going on in the lively IRC development #pypy channel - main place of technical coordination. - .. raw:: latex \begin{figure*}[htbp]\begin{center} - \scalebox{0.75}{\includegraphics{plots/subscribers.pdf}} + \scalebox{0.7}{\includegraphics{plots/subscribers.pdf}} \end{center}\end{figure*} -.. overview of PyPy mailing list subscriptions - How and why EU funding? ===================================================================== Modified: pypy/extradoc/talk/22c3/plots/loc.pdf ============================================================================== Binary files. No diff available. Modified: pypy/extradoc/talk/22c3/plots/subscribers.pdf ============================================================================== Binary files. No diff available. From lac at codespeak.net Thu Dec 1 23:52:58 2005 From: lac at codespeak.net (lac at codespeak.net) Date: Thu, 1 Dec 2005 23:52:58 +0100 (CET) Subject: [pypy-svn] r20553 - pypy/extradoc/talk/22c3 Message-ID: <20051201225258.54DF127B5F@code1.codespeak.net> Author: lac Date: Thu Dec 1 23:52:57 2005 New Revision: 20553 Added: pypy/extradoc/talk/22c3/lac.comments Log: >is my comments. There is still a lot of clunky English, but this before the deadline ... 
Added: pypy/extradoc/talk/22c3/lac.comments ============================================================================== --- (empty file) +++ pypy/extradoc/talk/22c3/lac.comments Thu Dec 1 23:52:57 2005 @@ -0,0 +1,367 @@ +========================================================================= +Open Source and EU funding: agile methods in a funded OSS project +========================================================================= + +Abstract +========== + +This paper walks through different aspects of agility +within the open-source driven PyPy project. + +>what's 'Open Source Driven'? + +Agility + +>has + +played a key role +from the beginning - the PyPy project started from some mails +between a few people, quickly had a first one-week meeting, a "sprint" + +where it evolved into a structure that was able to carry +out a research project - and got accepted by the European +Union. During the course, two companies got founded and are +now growing and employing key developers. + +PyPy's technical development is strongly rooted in open-source +contexts and this adds another agility aspect - free +communication, co-operation and exchange with other people and +projects. + +The process of obtaining EU-funding is a continous challenge to +the community-rooted PyPy project: how to connect +agile open source culture with formal structures: interaction +with requirements like planning, budget estimation, work +distribution and resource tracking. + +After our first "funded" year we are reasonably happy with the +balance we strike between organisations and EU funding on the +one and the developers driving the technical aspects of the +project on the other side. 
+ + +Agility in Technical Development and Organisation +========================================================= + +Agile approaches: sprinting +---------------------------- + +>PyPy first (with bits I am seeing something dismembered -- bloody bits) + +started during a one-week meeting, a "sprint", +held at Trillke-Gut in Hildesheim in February 2003. The sprint was inspired by +practices used by other Python-oriented projects such as Zope3. Originally the +sprint methodology used in the Python community grew from practices applied by +the Zope Corporation. Their definition of a sprint was a "two-day or three-day focused +development session, in which developers pair off together in a room and focus +on building a particular subsystem". + +Sprinting for up to a week became the initial driving factor in developing +the code base and the community/people around it. The early PyPy sprints +were organised by core developers together with local Pythonistas +in Louvain-la-Neuve, Gothenburg, Vilnius and Amsterdam. Sprints gave +people the opportunity to help with, participate in and influence the ideas within PyPy. + +Sprints are actually not part of the traditional Agile +portfolio of techniques; the closest thing comes from +Scrum, which names its 30-day programming iterations +"sprints", each covering a certain increment. With the Scrum method, +considerable effort is put into performing the sprint planning +as well as creating and documenting the "sprint backlog" which +is then fed back into the "Product backlog". The sprint ends +with a "sprint review" - an informal planning session in which +the team decides on upcoming work. There are also techniques +in which the team looks at ways to improve the development +methodology and future sprints. + +To our knowledge, open-source projects these days are sprinting for at most +a week, which reflects the fact that many contributors give their time +and even money to gather and work together.
This is different +from having fully funded people from one company working together. + +Why did PyPy choose sprinting as a key technique? It is a method that fits +distributed teams well because it gets the team focused around visible, +challenging goals while working collaboratively (pair programming, status +meetings, discussions, etc.) and at an accelerated pace (short increments and +tasks, "doing" and testing instead of long start-up phases of planning and +requirements gathering). This means that most of the time a sprint is a great +way of getting results and getting new people acquainted - a good +method for the dissemination of knowledge and learning within the team. + +.. image:: sprintprocess.gif + +>Interesting. I think that you have missed what I think is the most +>important aspect. Most EU projects do very poorly. They define +>success as 'whatever they did' so that it is rare for an EU project +>to ever formally fail -- but people with a less charitable +>interpretation of failure might announce that they fail a lot. + +>Why? + +>I think it is because the whole notion of 'divide the job into tasks, +>and let different groups work on a different piece and then try to +>stitch it all together through meetings' is a very flawed way to work. +>This is creative work we are doing, not assembly line production. +>If you want creative people to actually produce creative work, then you +>have to let them work together. + + +Agile approaches: test-driven development +----------------------------------------- + +Test-driven development is a technical cornerstone for programming +efficiently together in a distributed team. Seen from the +Agile Manifesto perspective it is right up there as one of the +key elements, since it puts the focus on producing working code +rather than diagrams, plans and papers (and then faulty +software).
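The assert-based testing style described here can be illustrated with a minimal py.test-style module. Note that the function under test and the file name below are hypothetical examples, not code from the PyPy tree; only the collection convention (plain `test_*` functions with bare `assert` statements, no TestCase boilerplate) is py.test's.

```python
# test_example.py -- a minimal py.test-style test module (illustrative).
# py.test collects functions whose names start with "test_" and reports
# failing plain `assert` statements with detailed introspection, so no
# unittest.TestCase boilerplate is needed.

def interp_add(a, b):
    # Hypothetical stand-in for the code under test.
    return a + b

def test_add_integers():
    assert interp_add(2, 3) == 5

def test_add_strings():
    # The same file runs unchanged for every developer, which is part of
    # what makes distributed, test-driven work practical.
    assert interp_add("py", "py") == "pypy"
```

Anyone joining a sprint for a few days can run such files over the whole tree the same way the core developers do, which is what makes the tests a shared communication medium rather than a private discipline.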
+ +Seen from an Open Source community perspective it is a vitalising strategy - +especially in combination with a transparent, open process in which anyone +interested can participate - if only for just a few days at a sprint. Some of +the key problems identified by Frederick P. Brooks in the latest version of +"The Mythical Man-Month" (unfortunately still very relevant today) are estimating +the correct amount of time for communication and testing/debugging. Automated +testing, rather barrier-free communication and strict version tracking help +with those problems, especially in the hands of a team sprinting its way through +the Python community - welcoming everyone to participate. + +Apart from rewriting a practical programming language within +itself, PyPy also evolved a number of development tools useful +for writing tests and gluing things together. PyPy's testing +tool ("py.test") is used separately and evolves on its own by now. + +.. image:: plots/loc.png +.. test driven development + + +Agility: Open Communication and organisation +---------------------------------------------------- + +Another agility aspect relates to transparent and open +communication within the project. Only very few (EU-contract +related) documents are access restricted; everything else is +freely available. There are no hierarchies for commit rights. +In fact, the hosting server is also home to a couple of +other projects and all projects share commit rights ("Coding +Wiki"). Announcing sprints, releases and development goals +leads to increasingly many people subscribing to mailing lists or +participating in development. + +Moreover, the PyPy developers established a model of weekly 30-minute +IRC chat meetings where topics are briefly discussed, delegated +or decided upon: those meetings are open to all active developers +and usually do not discuss internal EU matters much, except that +funded developers keep EU goals more in mind than others.
+Minutes of these weekly developer meetings get archived and posted +to the development list. + +A rather recent invention is the posting of "This week in PyPy", +which tries to summarize what is going on in the lively IRC +development #pypy channel - the main place of technical coordination. + +.. image:: plots/subscribers.png + +.. overview of PyPy mailing list subscriptions + + +How and why EU funding? +===================================================================== + +In mid-2003 the idea of trying to get EU funding for the project was born. +It became clear that the project had an arbitrarily large scale and that +receiving some funding would dramatically increase the pace and seriousness +of the project - because funded developers can dedicate more of their time +to the project. The involved developers and people reached outside of the +Open Source ecosystem to try to gather as much information and as many contacts as +possible in order to answer the question: "Should we go for it?", to which +the answer quickly became "Let's see how far we get!". + +Making things fit with EU perspectives +------------------------------------------------------ + +There had been a growing interest from the European Commission, IST division, +to look closer at the Open Source world and its achievements. Several funded +research projects in the 5th framework programme studied the phenomenon +(FLOSS-POLS, FLOSS) - its organization, business models and licensing. A few +other funded software projects used Open Source in their work as tools +(languages and applications). There was no previous experience of an Open +Source community-based project making a bid for funding. + +The areas in the 6th Framework programme (second call) fitted very well +with the objectives of PyPy. The idea of strengthening European software +development companies and businesses by supporting an open source language +implementation was new but appealing to the EU.
But being an Open Source +project wasn't enough - the challenges and the idea of a flexible, +configurable "translator" or "compiler" met the research targets of the FP6, as +did trying out and documenting the agile methodology being used. +It is interesting to note that + +> most of today's computer language research and development happens in the US. + +In short, we argued that EU funding allows the project to go for +reaching a critical mass and a position from which to continue to evolve, +and that it would help European organisations to gain some +ground. + +Acting on this strategy proved to be a more difficult task. The +entire proposal and negotiation process took over a year (Autumn 2003 till +November 2004). + + +>a proper description of planned work, +>necessary to satisfy formal requirements + +had not previously been part of the development focus and both the EU +and the parties involved had to adapt to the situation. Yet, drafting the +high-level requirements (in total 14 workpackages and 58 deliverables) was done +using the same version-control/open-communication based work style, including +evolving the proposal at sprints. Writing the proposal and specifying the corresponding +objectives at a higher level has proved to be generally useful for clarifying longer-term +goals - also helping others better understand the project. + +Unfortunately the negotiations with the EU got stuck in +organizational limbo and the project is still suffering from +the effects of this even today. The goal of funding +contributors, especially for coming to sprints, was originally +based on a non-profit association. This solution +wasn't seen as realistic or feasible by the EU, although +we think it remains a viable approach for the future.
During +negotiations, we got to an alternative solution which - however - +has a few drawbacks: contributors have to become Contract Partners within +the EU-level Consortium (which is by itself not hard) and can then at least +claim travel and accommodation costs when attending sprints. + +However, this construction does not allow them to get paid for +work time and also has some formal requirements. In practice this +leads developers to consider shifting private money +between themselves in order to work around the current problems with +implementing an agile model within the EU contract framework. + + +Seven Organisations / The consortium +---------------------------------------------------------------------- + +The guiding idea for receiving funding is to have organisations +through which key developers and other parties are employed. +Two companies out of the seven organisations in the initial +consortium were founded during the EU negotiation process - +what at first might have felt like an EU-related obstacle became an +opportunity, but with some overhead in legal and organizational +responsibilities. + +Other adjustments took place, such as recruiting companies with previous EU +project experience. There is also one company +involved that is quite unrelated to the previous developer work - +being focused instead on process management and +designing learning processes, with a background from the +Chaospilot school in Aarhus, Denmark. When creating the formal +consortium of seven partners, new cultures and perspectives +were mixed with the strong collaborative Open Source core +team, adding new complexities in communication and +cooperation. Getting the new "playmates" to adopt the vision, +culture and spirit of the original idea and holding true to it +during the work on the proposal and negotiation process was a +challenge indeed. + +The formal project organization required by the EU imposed +more structure on the previously more free-floating agile +process.
Roles and responsibilities were staked out, +conforming with the requirements of the roles but delegating +as much as possible of the responsibilities and +decision-making to the core developers. The strategy was to +keep the "conceptual integrity" (Brooks) of the vision and the +idea in the hands of the core developers. A somewhat negative +result was the added workload and responsibility on developers +regarding EU-related work. It is not too surprising that +the consortium with its member organisations now employs a +version-control/review based scheme for EU documents, +reflecting the technical development approaches. + +It remains a challenge for all partners of the consortium, +universities and companies alike, to connect an ongoing +medium-scale open-source project with EU regulations and +requirements - not to speak of the fact that companies need to +fund 50% of the costs themselves. It is, in fact, too early +to judge the overall success of our approaches, although +we are confident that things work out reasonably well. + + +The challenge: balancing community interests with EU requirements +------------------------------------------------------------------------------ + +The agile development process in the EU-funded work of the +PyPy project centers around sprints - which are planned to +take place every sixth week at different places to allow many +developers to get in direct touch with each other. Sprinting +around conferences also became a key strategy. + +But the nature of sprints changed when EU funding started. The +need to meet milestones of promised *deliverables* and the +goal to keep an open sprint process, still welcoming newcomers +into the world of PyPy, made the sprints longer (at least 7 +days with a break day in the middle) but also changed their +character. The team started to distinguish between +sprints open for all to attend, without any prior PyPy experience, +and sprints requiring earlier PyPy involvement.
Tutorials, start-up +planning meetings as well as daily status meetings evolved; +the latest additions to the sprints are closing planning +meetings (planning the work between sprints) and work-groups - +a version of pair programming in groups. + +Another effect of sprinting within the EU structure is that the sprint +becomes a forum for non-development work - coordinating and tracking the +project. The challenge here is to avoid affecting the main work and "disturbing" +visiting developers with EU-related work. It could also be argued that the +prolonged sprints could possibly make it more difficult for non-consortium +members to attend the full time, conflicting with other engagements, etc. + +The project continues to try to enhance the method of sprinting, evaluating +feedback from sprint participants. Maybe the implementation within the PyPy +team is slowly conforming to the Scrum standard of sprinting, but not as a +conscious effort? + + +Managing diversities: agile business - a successful marriage? +-------------------------------------------------------------- + +For a diverse group of organisations and people, agility is +helpful at various levels: you cannot make all-encompassing +plans and hope to statically follow them and succeed. New +developments, twists and opportunities evolve all the time. + +Our experience with evolving PyPy from a loose Open Source project +to a partially funded EU research project shows the following: + +- what first seemed like too diverse interests and views, + impossible to tailor into a single project, was instead a + fruitful mix of diversities. The challenge is to manage + these diversities and channel them into constructive team + efforts. Aiming for homogeneity is the real threat. + +- what first seemed like unbeatable odds and too big obstacles + +>sometimes + + even turned into new possibilities. The challenge is + to maintain an atmosphere in which a team can act on those + within the short timeframes of opportunity.
Change is + inevitable - how you handle it is the real challenge. + +- there are many other projects and organisations who are + heading in similar directions, trying to connect and + evolve agile open source strategies with business matters. + Emerging models for developers distributed between + different countries allow people with special interests + to effectively work together and learn from each other. + +Concluding - the cumulative effects of an agile, open and +dynamic team process, combined with a market and curious +early adopters, facilitate agile business. A positive result +is that a lot of people within the PyPy context found +enjoyable jobs and there is already evolving commercial +interest despite the still early stages of the project - mostly +from US companies though ... why European companies, especially +larger ones, appear to prefer taking rather naive views on agile +open-source development ("great, it's cheaper, no license fees!") +is another interesting topic. From arigo at codespeak.net Thu Dec 1 23:57:16 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 1 Dec 2005 23:57:16 +0100 (CET) Subject: [pypy-svn] r20554 - in pypy/branch/somepbc-refactoring/pypy/translator: goal llvm Message-ID: <20051201225716.9F7CE27B5F@code1.codespeak.net> Author: arigo Date: Thu Dec 1 23:57:16 2005 New Revision: 20554 Modified: pypy/branch/somepbc-refactoring/pypy/translator/goal/driver.py pypy/branch/somepbc-refactoring/pypy/translator/llvm/genllvm.py Log: (rxe, replay from r19947, r20194, r20368; pedronis, arigo) * Ported the LLVM changes to driver.py to the branch. * Fixed a detail for the new model of the branch.
Modified: pypy/branch/somepbc-refactoring/pypy/translator/goal/driver.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/translator/goal/driver.py (original) +++ pypy/branch/somepbc-refactoring/pypy/translator/goal/driver.py Thu Dec 1 23:57:16 2005 @@ -223,18 +223,21 @@ # task_source_c = taskdef(task_source_c, ['database_c'], "Generating c source") + def create_exe(self): + import shutil + exename = mkexename(self.c_entryp) + newexename = mkexename('./pypy-%s' % self.options.backend) + shutil.copy(exename, newexename) + self.c_entryp = newexename + self.log.info("created: %s" % (self.c_entryp,)) + def task_compile_c(self): # xxx messy cbuilder = self.cbuilder cbuilder.compile() - + if self.standalone: - c_entryp = cbuilder.executable_name - import shutil - exename = mkexename(c_entryp) - newexename = mkexename('./'+'pypy-c') - shutil.copy(exename, newexename) - self.c_entryp = newexename - self.log.info("created: %s" % (self.c_entryp,)) + self.c_entryp = cbuilder.executable_name + self.create_exe() else: cbuilder.import_module() self.c_entryp = cbuilder.get_entry_point() @@ -275,33 +278,32 @@ ['?backendopt', 'rtype'], "LLInterpreting") - def task_source_llvm(self): # xxx messy + def task_source_llvm(self): translator = self.translator opts = self.options if translator.annotator is None: - raise ValueError, "function has to be annotated." + raise ValueError, "llvm requires annotation." + from pypy.translator.llvm import genllvm - self.llvmgen = genllvm.GenLLVM(translator, - genllvm.GcPolicy.new(opts.gc), - genllvm.ExceptionPolicy.new(None)) - self.llvm_filename = self.llvmgen.gen_llvm_source() - self.log.info("written: %s" % (self.llvm_filename,)) + + # XXX Need more options for policies/llvm-backendoptions here? 
+ self.llvmgen = genllvm.GenLLVM(translator, self.options.gc, + None, self.standalone) + + llvm_filename = self.llvmgen.gen_llvm_source(self.entry_point) + self.log.info("written: %s" % (llvm_filename,)) # task_source_llvm = taskdef(task_source_llvm, ['backendopt', 'rtype'], "Generating llvm source") - def task_compile_llvm(self): # xxx messy - self.c_entryp = self.llvmgen.create_module(self.llvm_filename, - standalone=self.standalone, - exe_name = 'pypy-llvm') + def task_compile_llvm(self): + gen = self.llvmgen if self.standalone: - import shutil - exename = mkexename(self.c_entryp) - newexename = mkexename('./pypy-llvm') - shutil.copy(exename, newexename) - self.c_entryp = newexename - self.log.info("created: %s" % (self.c_entryp,)) + self.c_entryp = gen.compile_llvm_source(exe_name='pypy-llvm') + self.create_exe() + else: + self.c_entryp = gen.compile_llvm_source(return_fn=True) # task_compile_llvm = taskdef(task_compile_llvm, ['source_llvm'], Modified: pypy/branch/somepbc-refactoring/pypy/translator/llvm/genllvm.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/translator/llvm/genllvm.py (original) +++ pypy/branch/somepbc-refactoring/pypy/translator/llvm/genllvm.py Thu Dec 1 23:57:16 2005 @@ -48,7 +48,7 @@ self.source_generated = False - def gen_llvm_source(self, func=None): + def gen_llvm_source(self, func): self._checkpoint() codewriter = self.setup(func) @@ -158,8 +158,7 @@ codewriter.comment("End of file") def get_entry_point(self, func): - if func is None: - func = self.translator.entrypoint + assert func is not None self.entrypoint = func bk = self.translator.annotator.bookkeeper From cfbolz at codespeak.net Fri Dec 2 00:00:01 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Fri, 2 Dec 2005 00:00:01 +0100 (CET) Subject: [pypy-svn] r20555 - pypy/extradoc/talk/22c3 Message-ID: <20051201230001.8917E27B5B@code1.codespeak.net> Author: cfbolz Date: Thu Dec 1 23:59:56 2005 
New Revision: 20555 Modified: pypy/extradoc/talk/22c3/agility.pdf pypy/extradoc/talk/22c3/agility.sty pypy/extradoc/talk/22c3/agility_v1.txt.txt Log: five pages Modified: pypy/extradoc/talk/22c3/agility.pdf ============================================================================== Files pypy/extradoc/talk/22c3/agility.pdf (original) and pypy/extradoc/talk/22c3/agility.pdf Thu Dec 1 23:59:56 2005 differ Modified: pypy/extradoc/talk/22c3/agility.sty ============================================================================== --- pypy/extradoc/talk/22c3/agility.sty (original) +++ pypy/extradoc/talk/22c3/agility.sty Thu Dec 1 23:59:56 2005 @@ -1,3 +1,6 @@ \author {Bea D\"uring\\Change Maker\\\texttt{bea at changemaker.nu} \and Holger Krekel\\merlinux GmbH\\\texttt{hpk at merlinux.de}} \date{} +\usepackage{geometry} +\geometry{verbose,letterpaper,tmargin=2.5cm,bmargin=2cm,lmargin=2.5cm,rmargin=2.5cm,headheight=0in,headsep=0cm,footskip=0.5cm} + Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt ============================================================================== --- pypy/extradoc/talk/22c3/agility_v1.txt.txt (original) +++ pypy/extradoc/talk/22c3/agility_v1.txt.txt Thu Dec 1 23:59:56 2005 @@ -7,7 +7,7 @@ \begin{abstract} This paper walks through different aspects of agility -within the open-source driven PyPy project. Agility played a key role +within the open-source driven PyPy project [#]_. Agility played a key role from the beginning. The PyPy project started from some mails between a few people, quickly had a first one-week meeting, a "sprint", from where it evolved into a structure that was able to carry @@ -30,6 +30,8 @@ balance we strike between organisations and EU funding on the one and the developers driving the technical aspects of the project on the other side. + +.. [#] http://codespeak.net/pypy .. 
raw:: latex From arigo at codespeak.net Fri Dec 2 00:10:10 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 2 Dec 2005 00:10:10 +0100 (CET) Subject: [pypy-svn] r20556 - pypy/branch/somepbc-refactoring/pypy/translator/llvm Message-ID: <20051201231010.2526627B5F@code1.codespeak.net> Author: arigo Date: Fri Dec 2 00:10:09 2005 New Revision: 20556 Modified: pypy/branch/somepbc-refactoring/pypy/translator/llvm/genllvm.py Log: (pedronis, arigo) Fixed a bit the interface in genllvm -- we need to pass an entry point. It would be nice if the back-ends could converge to some common base interface :-/ Modified: pypy/branch/somepbc-refactoring/pypy/translator/llvm/genllvm.py ============================================================================== --- pypy/branch/somepbc-refactoring/pypy/translator/llvm/genllvm.py (original) +++ pypy/branch/somepbc-refactoring/pypy/translator/llvm/genllvm.py Fri Dec 2 00:10:09 2005 @@ -15,7 +15,6 @@ from pypy.translator.llvm.externs2ll import setup_externs, generate_llfile from pypy.translator.llvm.gc import GcPolicy from pypy.translator.llvm.exception import ExceptionPolicy -from pypy.translator.translator import Translator from pypy.translator.llvm.log import log class GenLLVM(object): @@ -270,12 +269,13 @@ for s in stats: log('STATS %s' % str(s)) -def genllvm(translator, gcpolicy=None, exceptionpolicy=None, standalone=False, +def genllvm(translator, entry_point, gcpolicy=None, + exceptionpolicy=None, standalone=False, log_source=False, logging=False, **kwds): gen = GenLLVM(translator, gcpolicy, exceptionpolicy, standalone, logging=logging) - filename = gen.gen_llvm_source() + filename = gen.gen_llvm_source(entry_point) if log_source: log(open(filename).read()) @@ -283,16 +283,17 @@ return gen.compile_llvm_source(**kwds) def genllvm_compile(function, annotation, view=False, **kwds): - t = Translator(function) - a = t.annotate(annotation) - a.simplify() - t.specialize() - t.backend_optimizations(ssa_form=False) + 
from pypy.translator.translator import TranslationContext + from pypy.translator.backendopt.all import backend_optimizations + t = TranslationContext() + t.buildannotator().build_types(function, annotation) + t.buildrtyper().specialize() + backend_optimizations(t, ssa_form=False) # note: this is without policy transforms if view: t.view() - return genllvm(t, **kwds) + return genllvm(t, function, **kwds) def compile_function(function, annotation, **kwds): """ Helper - which get the compiled module from CPython. """ From arigo at codespeak.net Fri Dec 2 00:12:21 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 2 Dec 2005 00:12:21 +0100 (CET) Subject: [pypy-svn] r20557 - in pypy/dist/pypy: annotation rpython rpython/test tool translator/goal Message-ID: <20051201231221.4164D27B61@code1.codespeak.net> Author: arigo Date: Fri Dec 2 00:12:20 2005 New Revision: 20557 Removed: pypy/dist/pypy/annotation/unaryop.py pypy/dist/pypy/rpython/rfloat.py pypy/dist/pypy/rpython/test/test_rfloat.py pypy/dist/pypy/tool/sourcetools.py pypy/dist/pypy/translator/goal/driver.py Log: (pedronis, arigo, hpk looking :-) Starting the merge of the somepbc-refactoring branch. Step 1: the trunk changes on these files have been ported to the branch; remove them from the trunk. 
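This commit, together with the following r20558 and r20559, performs a branch merge in three steps: remove trunk files whose changes were already ported, delete everything the branch will replace, then copy the branch versions back. A dry-run sketch of that workflow is below; the paths are repo-relative, the two files shown are illustrative picks from the much longer lists in the commit messages, and the `svn` helper is a hypothetical stand-in that only prints the commands rather than running a real Subversion client.

```python
# Dry-run sketch of the three-step somepbc-refactoring merge (r20557-r20559).
# Repo-relative paths; the individual files are illustrative examples only.

BRANCH = "pypy/branch/somepbc-refactoring"
TRUNK = "pypy/dist"

def svn(*args):
    """Print an svn command instead of executing it, so the merge plan
    can be reviewed before touching the repository (dry run)."""
    cmd = "svn " + " ".join(args)
    print(cmd)
    return cmd

# Step 1 (r20557): trunk files whose changes were already ported to the
# branch are removed from the trunk, so nothing gets merged twice.
svn("rm", TRUNK + "/pypy/translator/goal/driver.py")

# Step 2 (r20558): delete every trunk file that the branch will replace
# wholesale.
svn("rm", TRUNK + "/pypy/annotation/model.py")

# Step 3 (r20559): copy the branch versions back onto the trunk, pinned to
# a fixed revision ("copied unchanged from r20557").
svn("cp", "-r", "20557",
    BRANCH + "/pypy/annotation/model.py",
    TRUNK + "/pypy/annotation/model.py")
```

Pinning the copies to r20557 keeps the trunk identical to the exact branch state that the step-1 and step-2 removals were computed against.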
From arigo at codespeak.net Fri Dec 2 00:15:35 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 2 Dec 2005 00:15:35 +0100 (CET) Subject: [pypy-svn] r20558 - in pypy/dist/pypy: annotation annotation/test interpreter module/thread/rpython objspace/flow rpython rpython/l3interp rpython/lltypesystem rpython/memory rpython/ootypesystem rpython/test tool translator translator/asm translator/backendopt translator/c translator/c/test translator/goal translator/llvm translator/pyrex translator/squeak translator/test translator/tool Message-ID: <20051201231535.0518D27B62@code1.codespeak.net> Author: arigo Date: Fri Dec 2 00:15:34 2005 New Revision: 20558 Removed: pypy/dist/pypy/annotation/binaryop.py pypy/dist/pypy/annotation/bookkeeper.py pypy/dist/pypy/annotation/builtin.py pypy/dist/pypy/annotation/classdef.py pypy/dist/pypy/annotation/dictdef.py pypy/dist/pypy/annotation/model.py pypy/dist/pypy/annotation/policy.py pypy/dist/pypy/annotation/specialize.py pypy/dist/pypy/annotation/test/ pypy/dist/pypy/interpreter/ pypy/dist/pypy/module/thread/rpython/ pypy/dist/pypy/objspace/flow/ pypy/dist/pypy/rpython/annlowlevel.py pypy/dist/pypy/rpython/callparse.py pypy/dist/pypy/rpython/l3interp/ pypy/dist/pypy/rpython/llinterp.py pypy/dist/pypy/rpython/lltypesystem/ pypy/dist/pypy/rpython/memory/ pypy/dist/pypy/rpython/normalizecalls.py pypy/dist/pypy/rpython/ootypesystem/ pypy/dist/pypy/rpython/rbool.py pypy/dist/pypy/rpython/rbuiltin.py pypy/dist/pypy/rpython/rclass.py pypy/dist/pypy/rpython/rdict.py pypy/dist/pypy/rpython/rmodel.py pypy/dist/pypy/rpython/rpbc.py pypy/dist/pypy/rpython/rptr.py pypy/dist/pypy/rpython/rspecialcase.py pypy/dist/pypy/rpython/rtyper.py pypy/dist/pypy/rpython/test/test_exception.py pypy/dist/pypy/rpython/test/test_llann.py pypy/dist/pypy/rpython/test/test_llinterp.py pypy/dist/pypy/rpython/test/test_nongc.py pypy/dist/pypy/rpython/test/test_normalizecalls.py pypy/dist/pypy/rpython/test/test_objectmodel.py 
pypy/dist/pypy/rpython/test/test_rbool.py pypy/dist/pypy/rpython/test/test_rbuiltin.py pypy/dist/pypy/rpython/test/test_rclass.py pypy/dist/pypy/rpython/test/test_rdict.py pypy/dist/pypy/rpython/test/test_rint.py pypy/dist/pypy/rpython/test/test_rlist.py pypy/dist/pypy/rpython/test/test_rpbc.py pypy/dist/pypy/rpython/test/test_rptr.py pypy/dist/pypy/rpython/test/test_rspecialcase.py pypy/dist/pypy/rpython/test/test_rtuple.py pypy/dist/pypy/rpython/test/test_rtyper.py pypy/dist/pypy/rpython/typesystem.py pypy/dist/pypy/tool/cache.py pypy/dist/pypy/translator/ann_override.py pypy/dist/pypy/translator/annrpython.py pypy/dist/pypy/translator/asm/ pypy/dist/pypy/translator/backendopt/ pypy/dist/pypy/translator/c/extfunc.py pypy/dist/pypy/translator/c/genc.py pypy/dist/pypy/translator/c/pyobj.py pypy/dist/pypy/translator/c/stackless.py pypy/dist/pypy/translator/c/test/ pypy/dist/pypy/translator/c/wrapper.py pypy/dist/pypy/translator/gencl.py pypy/dist/pypy/translator/geninterplevel.py pypy/dist/pypy/translator/goal/query.py pypy/dist/pypy/translator/goal/translate_pypy.py pypy/dist/pypy/translator/llvm/ pypy/dist/pypy/translator/pyrex/ pypy/dist/pypy/translator/squeak/ pypy/dist/pypy/translator/test/ pypy/dist/pypy/translator/tool/graphpage.py pypy/dist/pypy/translator/transform.py pypy/dist/pypy/translator/translator.py Log: Merging, step 2: deleting all the files from the trunk that need to be copied from the branch. 
From arigo at codespeak.net Fri Dec 2 00:17:56 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 2 Dec 2005 00:17:56 +0100 (CET) Subject: [pypy-svn] r20559 - in pypy/dist/pypy: annotation annotation/test doc/discussion interpreter module/thread/rpython objspace/flow rpython rpython/l3interp rpython/lltypesystem rpython/memory rpython/ootypesystem rpython/test tool translator translator/asm translator/backendopt translator/c translator/c/test translator/goal translator/llvm translator/pyrex translator/squeak translator/test translator/tool Message-ID: <20051201231756.CE55927B5F@code1.codespeak.net> Author: arigo Date: Fri Dec 2 00:17:55 2005 New Revision: 20559 Added: pypy/dist/pypy/annotation/binaryop.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/annotation/binaryop.py pypy/dist/pypy/annotation/bookkeeper.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/annotation/bookkeeper.py pypy/dist/pypy/annotation/builtin.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/annotation/builtin.py pypy/dist/pypy/annotation/classdef.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/annotation/classdef.py pypy/dist/pypy/annotation/description.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/annotation/description.py pypy/dist/pypy/annotation/dictdef.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/annotation/dictdef.py pypy/dist/pypy/annotation/model.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/annotation/model.py pypy/dist/pypy/annotation/policy.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/annotation/policy.py pypy/dist/pypy/annotation/specialize.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/annotation/specialize.py pypy/dist/pypy/annotation/test/ - copied from r20557, pypy/branch/somepbc-refactoring/pypy/annotation/test/ 
pypy/dist/pypy/annotation/unaryop.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/annotation/unaryop.py pypy/dist/pypy/doc/discussion/somepbc-refactoring-plan.txt - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/doc/discussion/somepbc-refactoring-plan.txt pypy/dist/pypy/interpreter/ - copied from r20557, pypy/branch/somepbc-refactoring/pypy/interpreter/ pypy/dist/pypy/module/thread/rpython/ - copied from r20557, pypy/branch/somepbc-refactoring/pypy/module/thread/rpython/ pypy/dist/pypy/objspace/flow/ - copied from r20557, pypy/branch/somepbc-refactoring/pypy/objspace/flow/ pypy/dist/pypy/rpython/annlowlevel.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/annlowlevel.py pypy/dist/pypy/rpython/callparse.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/callparse.py pypy/dist/pypy/rpython/l3interp/ - copied from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/l3interp/ pypy/dist/pypy/rpython/llinterp.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/llinterp.py pypy/dist/pypy/rpython/lltypesystem/ - copied from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/lltypesystem/ pypy/dist/pypy/rpython/memory/ - copied from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/memory/ pypy/dist/pypy/rpython/normalizecalls.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/normalizecalls.py pypy/dist/pypy/rpython/ootypesystem/ - copied from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/ootypesystem/ pypy/dist/pypy/rpython/rbool.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/rbool.py pypy/dist/pypy/rpython/rbuiltin.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/rbuiltin.py pypy/dist/pypy/rpython/rclass.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/rclass.py pypy/dist/pypy/rpython/rdict.py - copied 
unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/rdict.py pypy/dist/pypy/rpython/rfloat.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/rfloat.py pypy/dist/pypy/rpython/rmodel.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/rmodel.py pypy/dist/pypy/rpython/rpbc.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/rpbc.py pypy/dist/pypy/rpython/rptr.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/rptr.py pypy/dist/pypy/rpython/rspecialcase.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/rspecialcase.py pypy/dist/pypy/rpython/rtyper.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/rtyper.py pypy/dist/pypy/rpython/test/test_exception.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/test/test_exception.py pypy/dist/pypy/rpython/test/test_llann.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/test/test_llann.py pypy/dist/pypy/rpython/test/test_llinterp.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/test/test_llinterp.py pypy/dist/pypy/rpython/test/test_nongc.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/test/test_nongc.py pypy/dist/pypy/rpython/test/test_normalizecalls.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/test/test_normalizecalls.py pypy/dist/pypy/rpython/test/test_objectmodel.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/test/test_objectmodel.py pypy/dist/pypy/rpython/test/test_rbool.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rbool.py pypy/dist/pypy/rpython/test/test_rbuiltin.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rbuiltin.py pypy/dist/pypy/rpython/test/test_rclass.py - 
copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rclass.py pypy/dist/pypy/rpython/test/test_rdict.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rdict.py pypy/dist/pypy/rpython/test/test_rfloat.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rfloat.py pypy/dist/pypy/rpython/test/test_rint.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rint.py pypy/dist/pypy/rpython/test/test_rlist.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rlist.py pypy/dist/pypy/rpython/test/test_rpbc.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rpbc.py pypy/dist/pypy/rpython/test/test_rptr.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rptr.py pypy/dist/pypy/rpython/test/test_rspecialcase.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rspecialcase.py pypy/dist/pypy/rpython/test/test_rtuple.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rtuple.py pypy/dist/pypy/rpython/test/test_rtyper.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/test/test_rtyper.py pypy/dist/pypy/rpython/typesystem.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/rpython/typesystem.py pypy/dist/pypy/tool/cache.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/tool/cache.py pypy/dist/pypy/tool/sourcetools.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/tool/sourcetools.py pypy/dist/pypy/translator/ann_override.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/translator/ann_override.py pypy/dist/pypy/translator/annrpython.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/translator/annrpython.py 
pypy/dist/pypy/translator/asm/ - copied from r20557, pypy/branch/somepbc-refactoring/pypy/translator/asm/ pypy/dist/pypy/translator/backendopt/ - copied from r20557, pypy/branch/somepbc-refactoring/pypy/translator/backendopt/ pypy/dist/pypy/translator/c/extfunc.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/translator/c/extfunc.py pypy/dist/pypy/translator/c/genc.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/translator/c/genc.py pypy/dist/pypy/translator/c/pyobj.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/translator/c/pyobj.py pypy/dist/pypy/translator/c/stackless.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/translator/c/stackless.py pypy/dist/pypy/translator/c/test/ - copied from r20557, pypy/branch/somepbc-refactoring/pypy/translator/c/test/ pypy/dist/pypy/translator/c/wrapper.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/translator/c/wrapper.py pypy/dist/pypy/translator/gencl.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/translator/gencl.py pypy/dist/pypy/translator/geninterplevel.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/translator/geninterplevel.py pypy/dist/pypy/translator/goal/driver.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/translator/goal/driver.py pypy/dist/pypy/translator/goal/old_queries.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/translator/goal/old_queries.py pypy/dist/pypy/translator/goal/query.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/translator/goal/query.py pypy/dist/pypy/translator/goal/translate_pypy.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/translator/goal/translate_pypy.py pypy/dist/pypy/translator/llvm/ - copied from r20557, pypy/branch/somepbc-refactoring/pypy/translator/llvm/ pypy/dist/pypy/translator/pyrex/ - copied from r20557, 
pypy/branch/somepbc-refactoring/pypy/translator/pyrex/ pypy/dist/pypy/translator/squeak/ - copied from r20557, pypy/branch/somepbc-refactoring/pypy/translator/squeak/ pypy/dist/pypy/translator/test/ - copied from r20557, pypy/branch/somepbc-refactoring/pypy/translator/test/ pypy/dist/pypy/translator/tool/graphpage.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/translator/tool/graphpage.py pypy/dist/pypy/translator/transform.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/translator/transform.py pypy/dist/pypy/translator/translator.py - copied unchanged from r20557, pypy/branch/somepbc-refactoring/pypy/translator/translator.py Log: Merging, step 3: copy relevant files and directories from the branch. The branch should not be modified any more now. From pedronis at codespeak.net Fri Dec 2 01:13:24 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 2 Dec 2005 01:13:24 +0100 (CET) Subject: [pypy-svn] r20560 - pypy/dist/pypy/translator/c/test Message-ID: <20051202001324.CD47927B5F@code1.codespeak.net> Author: pedronis Date: Fri Dec 2 01:13:24 2005 New Revision: 20560 Modified: pypy/dist/pypy/translator/c/test/test_boehm.py Log: update test_boehm (still skipped and in need of a different approach) Modified: pypy/dist/pypy/translator/c/test/test_boehm.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_boehm.py (original) +++ pypy/dist/pypy/translator/c/test/test_boehm.py Fri Dec 2 01:13:24 2005 @@ -1,12 +1,12 @@ import py test_src = """ -from pypy.translator.translator import Translator +from pypy.translator.translator import TranslationContext from pypy.translator.tool.cbuild import skip_missing_compiler from pypy.translator.c.genc import CExtModuleBuilder def getcompiled(func): from pypy.translator.c.gc import BoehmGcPolicy - t = Translator(func, simplifying=True) + t = TranslationContext(simplifying=True) # builds 
starting-types from func_defs argstypelist = [] if func.func_defaults: @@ -14,12 +14,11 @@ if isinstance(spec, tuple): spec = spec[0] # use the first type only for the tests argstypelist.append(spec) - a = t.annotate(argstypelist) - a.simplify() - t.specialize() + a = t.buildannotator().build_types(func, argstypelist) + t.buildrtyper().specialize() t.checkgraphs() def compile(): - cbuilder = CExtModuleBuilder(t, gcpolicy=BoehmGcPolicy) + cbuilder = CExtModuleBuilder(t, func, gcpolicy=BoehmGcPolicy) c_source_filename = cbuilder.generate_source() cbuilder.compile() cbuilder.import_module() @@ -42,7 +41,7 @@ def run_test(fn): fn() - channel.send(None) + channel.send("ok") run_test(test_malloc_a_lot) """ @@ -57,7 +56,7 @@ gw = py.execnet.PopenGateway() chan = gw.remote_exec(py.code.Source(test_src)) res = chan.receive() - assert not res + assert res == "ok" chan.close() From pedronis at codespeak.net Fri Dec 2 01:19:17 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 2 Dec 2005 01:19:17 +0100 (CET) Subject: [pypy-svn] r20561 - pypy/dist/pypy/translator/test Message-ID: <20051202001917.D2ACA27B5F@code1.codespeak.net> Author: pedronis Date: Fri Dec 2 01:19:16 2005 New Revision: 20561 Modified: pypy/dist/pypy/translator/test/test_translator.py Log: Translator -> TranslationContext Modified: pypy/dist/pypy/translator/test/test_translator.py ============================================================================== --- pypy/dist/pypy/translator/test/test_translator.py (original) +++ pypy/dist/pypy/translator/test/test_translator.py Fri Dec 2 01:19:16 2005 @@ -1,5 +1,5 @@ import autopath -from pypy.translator.translator import Translator +from pypy.translator.translator import TranslationContext def example(d): @@ -9,6 +9,7 @@ d['key'] = 'value' def test_example(): - t = Translator(example) - t.simplify() # this specific example triggered a bug in simplify.py + t = TranslationContext(simplifying=True) + t.buildflowgraph(example) + # this 
specific example triggered a bug in simplify.py #t.view() From pedronis at codespeak.net Fri Dec 2 01:33:42 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 2 Dec 2005 01:33:42 +0100 (CET) Subject: [pypy-svn] r20562 - pypy/dist/pypy/translator/test Message-ID: <20051202003342.93CB327B62@code1.codespeak.net> Author: pedronis Date: Fri Dec 2 01:33:41 2005 New Revision: 20562 Modified: pypy/dist/pypy/translator/test/test_simplify.py pypy/dist/pypy/translator/test/test_unsimplify.py Log: Translator -> TranslationContext Modified: pypy/dist/pypy/translator/test/test_simplify.py ============================================================================== --- pypy/dist/pypy/translator/test/test_simplify.py (original) +++ pypy/dist/pypy/translator/test/test_simplify.py Fri Dec 2 01:33:41 2005 @@ -1,27 +1,29 @@ -from pypy.translator.translator import Translator, graphof +from pypy.translator.translator import TranslationContext, graphof +from pypy.translator.backendopt.all import backend_optimizations from pypy.objspace.flow.model import traverse, Block +def translate(func, argtypes): + t = TranslationContext() + t.buildannotator().build_types(func, argtypes) + t.buildrtyper().specialize() + backend_optimizations(t) + return graphof(t, func), t + def test_remove_direct_call_without_side_effects(): def f(x): return x + 123 def g(x): a = f(x) return x * 12 - t = Translator(g) - a = t.annotate([int]) - t.specialize() - t.backend_optimizations() - assert len(graphof(t, g).startblock.operations) == 1 + graph, _ = translate(g, [int]) + assert len(graph.startblock.operations) == 1 def test_dont_remove_external_calls(): import os def f(x): os.close(x) - t = Translator(f) - a = t.annotate([int]) - t.specialize() - t.backend_optimizations() - assert len(graphof(t, f).startblock.operations) == 1 + graph, _ = translate(f, [int]) + assert len(graph.startblock.operations) == 1 def test_remove_recursive_call(): def rec(a): @@ -32,11 +34,8 @@ def f(x): a = 
rec(x) return x + 12 - t = Translator(f) - a = t.annotate([int]) - t.specialize() - t.backend_optimizations() - assert len(graphof(t, f).startblock.operations) + graph, _ = translate(f, [int]) + assert len(graph.startblock.operations) def test_dont_remove_if_exception_guarded(): def f(x): @@ -51,11 +50,8 @@ raise else: return 1 - t = Translator(g) - a = t.annotate([int]) - t.specialize() - t.backend_optimizations() - assert graphof(t, g).startblock.operations[-1].opname == 'direct_call' + graph, _ = translate(g, [int]) + assert graph.startblock.operations[-1].opname == 'direct_call' def test_remove_pointless_keepalive(): @@ -77,15 +73,10 @@ n = c.z2 objectmodel.keepalive_until_here(c, n) - t = Translator(f) - a = t.annotate([int]) - t.specialize() - t.backend_optimizations() + graph, t = translate(f, [bool]) #t.view() - graph = graphof(t, f) - for block in graph.iterblocks(): for op in block.operations: assert op.opname != 'getfield' Modified: pypy/dist/pypy/translator/test/test_unsimplify.py ============================================================================== --- pypy/dist/pypy/translator/test/test_unsimplify.py (original) +++ pypy/dist/pypy/translator/test/test_unsimplify.py Fri Dec 2 01:33:41 2005 @@ -1,6 +1,12 @@ -from pypy.rpython.llinterp import LLInterpreter -from pypy.translator.translator import Translator, graphof +from pypy.translator.translator import TranslationContext, graphof from pypy.translator.unsimplify import split_block +from pypy.rpython.llinterp import LLInterpreter + +def translate(func, argtypes): + t = TranslationContext() + t.buildannotator().build_types(func, argtypes) + t.buildrtyper().specialize() + return graphof(t, func), t def test_split_blocks_simple(): for i in range(4): @@ -8,10 +14,7 @@ z = x + y w = x * y return z + w - t = Translator(f) - a = t.annotate([int, int]) - t.specialize() - graph = graphof(t, f) + graph, t = translate(f, [int, int]) split_block(t, graph, graph.startblock, i) interp = LLInterpreter(t.rtyper) 
result = interp.eval_graph(graph, [1, 2]) @@ -24,10 +27,7 @@ return y + 1 else: return y + 2 - t = Translator(f) - a = t.annotate([int, int]) - t.specialize() - graph = graphof(t, f) + graph, t = translate(f, [int, int]) split_block(t, graph, graph.startblock, i) interp = LLInterpreter(t.rtyper) result = interp.eval_graph(graph, [-12, 2]) @@ -52,10 +52,7 @@ except KeyError: return 1 return x - t = Translator(catches) - a = t.annotate([int]) - t.specialize() - graph = graphof(t, catches) + graph, t = translate(catches, [int]) split_block(t, graph, graph.startblock, i) interp = LLInterpreter(t.rtyper) result = interp.eval_graph(graph, [0]) From pedronis at codespeak.net Fri Dec 2 02:23:46 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 2 Dec 2005 02:23:46 +0100 (CET) Subject: [pypy-svn] r20563 - pypy/dist/pypy/translator/backendopt/test Message-ID: <20051202012346.07AA027B61@code1.codespeak.net> Author: pedronis Date: Fri Dec 2 02:23:45 2005 New Revision: 20563 Modified: pypy/dist/pypy/translator/backendopt/test/test_all.py pypy/dist/pypy/translator/backendopt/test/test_inline.py pypy/dist/pypy/translator/backendopt/test/test_malloc.py pypy/dist/pypy/translator/backendopt/test/test_propagate.py pypy/dist/pypy/translator/backendopt/test/test_removenoops.py pypy/dist/pypy/translator/backendopt/test/test_tailrecursion.py Log: mostly last round of Translator -> TranslationContext Modified: pypy/dist/pypy/translator/backendopt/test/test_all.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_all.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_all.py Fri Dec 2 02:23:45 2005 @@ -1,11 +1,17 @@ import py from pypy.translator.backendopt.all import backend_optimizations from pypy.translator.backendopt.test.test_malloc import check_malloc_removed -from pypy.translator.translator import Translator, graphof +from pypy.translator.translator import 
TranslationContext, graphof from pypy.objspace.flow.model import Constant from pypy.annotation import model as annmodel from pypy.rpython.llinterp import LLInterpreter +def translateopt(func, sig, **optflags): + t = TranslationContext() + t.buildannotator().build_types(func, sig) + t.buildrtyper().specialize() + backend_optimizations(t, **optflags) + return t class A: def __init__(self, x, y): @@ -40,10 +46,7 @@ def test_big(): assert big() == 83 - t = Translator(big) - t.annotate([]) - t.specialize() - backend_optimizations(t, inline_threshold=100, mallocs=True) + t = translateopt(big, [], inline_threshold=100, mallocs=True) big_graph = graphof(t, big) check_malloc_removed(big_graph) @@ -60,10 +63,8 @@ for i in range(n): total += i return total - t = Translator(f) - t.annotate([int]) - t.specialize() - t.backend_optimizations(inline_threshold=1, mallocs=True) + + t = translateopt(f, [int], inline_threshold=1, mallocs=True) # this also checks that the BASE_INLINE_THRESHOLD is enough for 'for' loops f_graph = graph = graphof(t, f) @@ -78,10 +79,8 @@ def f(n1, n2): c = [i for i in range(n2)] return 33 - t = Translator(f) - t.annotate([int, int]) - t.specialize() - t.backend_optimizations(inline_threshold=10, mallocs=True) + + t = translateopt(f, [int, int], inline_threshold=10, mallocs=True) f_graph = graphof(t, f) check_malloc_removed(f_graph) @@ -110,10 +109,7 @@ #debug(" lowered -> " + a) return 0 - t = Translator(entry_point) - t.annotate(inputtypes) - t.specialize() - t.backend_optimizations(inline_threshold=1, mallocs=True) + t = translateopt(entry_point, inputtypes, inline_threshold=1, mallocs=True) entry_point_graph = graphof(t, entry_point) Modified: pypy/dist/pypy/translator/backendopt/test/test_inline.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_inline.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_inline.py Fri Dec 2 02:23:45 2005 @@ -7,7 +7,7 @@ from 
pypy.translator.backendopt.inline import auto_inlining from pypy.translator.backendopt.inline import collect_called_functions from pypy.translator.backendopt.inline import measure_median_execution_cost -from pypy.translator.translator import Translator, TranslationContext, graphof +from pypy.translator.translator import TranslationContext, graphof from pypy.rpython.llinterp import LLInterpreter from pypy.rpython.rarithmetic import ovfcheck from pypy.translator.test.snippet import is_perfect_number @@ -36,22 +36,39 @@ checkgraph(graph) traverse(no_missing_concretetype, graph) -def check_inline(func, in_func, sig): - t = Translator(in_func) - a = t.annotate(sig) - a.simplify() - t.specialize() +def translate(func, argtypes): + t = TranslationContext() + t.buildannotator().build_types(func, argtypes) + t.buildrtyper().specialize() + return t + +def check_inline(func, in_func, sig, entry=None): + if entry is None: + entry = in_func + t = translate(entry, sig) # inline! sanity_check(t) # also check before inlining (so we don't blame it) - in_func_graph = graphof(t, in_func) - func_graph = graphof(t, func) - inline_function(t, func, in_func_graph) + inline_function(t, func, graphof(t, in_func)) sanity_check(t) interp = LLInterpreter(t.rtyper) def eval_func(args): - return interp.eval_graph(in_func_graph, args) + return interp.eval_graph(graphof(t, entry), args) return eval_func +def check_auto_inlining(func, sig, threshold=None): + t = translate(func, sig) + # inline! 
+ sanity_check(t) # also check before inlining (so we don't blame it) + if threshold is None: + auto_inlining(t) + else: + auto_inlining(t, threshold=threshold) + sanity_check(t) + interp = LLInterpreter(t.rtyper) + def eval_func(args): + return interp.eval_graph(graphof(t, func), args) + return eval_func, t + def test_inline_simple(): def f(x, y): @@ -95,20 +112,12 @@ except KeyError: return 2 return x - t = Translator(h) - a = t.annotate([int]) - a.simplify() - t.specialize() - sanity_check(t) # also check before inlining (so we don't blame it) - inline_function(t, f, graphof(t, g)) - sanity_check(t) - interp = LLInterpreter(t.rtyper) - h_graph = graphof(t, h) - result = interp.eval_graph(h_graph, [0]) + eval_func = check_inline(f,g, [int], entry=h) + result = eval_func([0]) assert result == 0 - result = interp.eval_graph(h_graph, [1]) + result = eval_func([1]) assert result == 1 - result = interp.eval_graph(h_graph, [2]) + result = eval_func([2]) assert result == 2 def test_inline_several_times(): @@ -165,20 +174,13 @@ except KeyError: return 3 return 1 - t = Translator(g) - a = t.annotate([int]) - a.simplify() - t.specialize() - auto_inlining(t, threshold=10) - for graph in t.graphs: - traverse(no_missing_concretetype, graph) - interp = LLInterpreter(t.rtyper) - g_graph = graphof(t, g) - result = interp.eval_graph(g_graph, [0]) + + eval_func, _ = check_auto_inlining(g, [int], threshold=10) + result = eval_func([0]) assert result == 2 - result = interp.eval_graph(g_graph, [1]) + result = eval_func([1]) assert result == 3 - result = interp.eval_graph(g_graph, [42]) + result = eval_func([42]) assert result == 1 def test_inline_nonraising_into_catching(): @@ -208,17 +210,10 @@ return f(x)+3 except KeyError: return -1 - t = Translator(g) - a = t.annotate([int]) - a.simplify() - t.specialize() - sanity_check(t) # also check before inlining (so we don't blame it) - inline_function(t, f, t.flowgraphs[g]) - sanity_check(t) - interp = LLInterpreter(t.flowgraphs, 
t.rtyper) - result = interp.eval_function(g, [100]) + eval_func = check_inline(f, g, [int]) + result = eval_func([100]) assert result == 106 - result = interp.eval_function(g, [-100]) + result = eval_func(g, [-100]) assert result == -1 def test_for_loop(): @@ -227,16 +222,13 @@ for i in range(0, x): result += i return result - t = Translator(f) - a = t.annotate([int]) - a.simplify() - t.specialize() + t = translate(f, [int]) + sanity_check(t) # also check before inlining (so we don't blame it) for graph in t.graphs: if graph.name.startswith('ll_rangenext'): break else: assert 0, "cannot find ll_rangenext_*() function" - sanity_check(t) # also check before inlining (so we don't blame it) inline_function(t, graph, graphof(t, f)) sanity_check(t) interp = LLInterpreter(t.rtyper) @@ -283,17 +275,13 @@ return g(n) except OverflowError: return -1 - t = Translator(f) - a = t.annotate([int]) - a.simplify() - t.specialize() - auto_inlining(t, threshold=10) + eval_func, t = check_auto_inlining(f, [int], threshold=10) f_graph = graphof(t, f) assert len(collect_called_functions(f_graph)) == 0 - interp = LLInterpreter(t.rtyper) - result = interp.eval_graph(f_graph, [10]) + + result = eval_func([10]) assert result == 45 - result = interp.eval_graph(f_graph, [15]) + result = eval_func([15]) assert result == -1 def test_inline_exception_catching(): @@ -332,13 +320,8 @@ directory = "./." 
def f(): return os.path.isdir(directory) - t = Translator(f) - a = t.annotate([]) - a.simplify() - t.specialize() - auto_inlining(t) - interp = LLInterpreter(t.rtyper) - result = interp.eval_graph(graphof(t, f), []) + eval_func, _ = check_auto_inlining(f, []) + result = eval_func([]) assert result is True def test_inline_raiseonly(): Modified: pypy/dist/pypy/translator/backendopt/test/test_malloc.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_malloc.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_malloc.py Fri Dec 2 02:23:45 2005 @@ -1,6 +1,6 @@ from pypy.translator.backendopt.malloc import remove_simple_mallocs from pypy.translator.backendopt.inline import inline_function -from pypy.translator.translator import Translator, graphof +from pypy.translator.translator import TranslationContext, graphof from pypy.objspace.flow.model import checkgraph, flatten, Block from pypy.rpython.llinterp import LLInterpreter @@ -18,9 +18,9 @@ assert count2 == 0 # number of direct_calls left def check(fn, signature, args, expected_result, must_be_removed=True): - t = Translator(fn) - t.annotate(signature) - t.specialize() + t = TranslationContext() + t.buildannotator().build_types(fn, signature) + t.buildrtyper().specialize() graph = graphof(t, fn) remove_simple_mallocs(graph) if must_be_removed: Modified: pypy/dist/pypy/translator/backendopt/test/test_propagate.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_propagate.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_propagate.py Fri Dec 2 02:23:45 2005 @@ -1,13 +1,14 @@ -from pypy.translator.translator import Translator, graphof +from pypy.translator.translator import TranslationContext, graphof from pypy.translator.backendopt.propagate import * +from pypy.translator.backendopt.all import backend_optimizations from 
pypy.rpython.llinterp import LLInterpreter def get_graph(fn, signature): - t = Translator(fn) - t.annotate(signature) - t.specialize() - t.backend_optimizations(ssa_form=False, propagate=False) + t = TranslationContext() + t.buildannotator().build_types(fn, signature) + t.buildrtyper().specialize() + backend_optimizations(t, ssa_form=False, propagate=False) graph = graphof(t, fn) return graph, t @@ -115,6 +116,6 @@ def test_call_list_default_argument(): graph, t = get_graph(call_list_default_argument, [int]) - t.backend_optimizations(propagate=True, ssa_form=False) + backend_optimizations(t, propagate=True, ssa_form=False) for i in range(10): check_graph(graph, [i], call_list_default_argument(i), t) Modified: pypy/dist/pypy/translator/backendopt/test/test_removenoops.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_removenoops.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_removenoops.py Fri Dec 2 02:23:45 2005 @@ -1,6 +1,6 @@ from pypy.translator.backendopt.removenoops import remove_void, remove_same_as from pypy.translator.backendopt.inline import inline_function -from pypy.translator.translator import Translator, graphof +from pypy.translator.translator import TranslationContext, graphof from pypy.translator.test.snippet import simple_method from pypy.objspace.flow.model import checkgraph, flatten, Block from pypy.rpython.lltypesystem.lltype import Void @@ -11,9 +11,9 @@ def annotate_and_remove_void(f, annotate): - t = Translator(f) - a = t.annotate(annotate) - t.specialize() + t = TranslationContext() + t.buildannotator().build_types(f, annotate) + t.buildrtyper().specialize() remove_void(t) return t @@ -51,9 +51,9 @@ return 42 else: return 666 - t = Translator(f) - a = t.annotate([]) - t.specialize() + t = TranslationContext() + t.buildannotator().build_types(f, []) + t.buildrtyper().specialize() # now we make the 'if True' appear f_graph = graphof(t, f) 
inline_function(t, nothing, f_graph) Modified: pypy/dist/pypy/translator/backendopt/test/test_tailrecursion.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_tailrecursion.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_tailrecursion.py Fri Dec 2 02:23:45 2005 @@ -1,6 +1,6 @@ from pypy.objspace.flow.model import traverse, Block, Link, Variable, Constant from pypy.translator.backendopt.tailrecursion import remove_tail_calls_to_self -from pypy.translator.translator import Translator, graphof +from pypy.translator.translator import TranslationContext, graphof from pypy.rpython.llinterp import LLInterpreter from pypy.translator.test.snippet import is_perfect_number @@ -11,9 +11,9 @@ if a > b: return gcd(b, a) return gcd(b % a, a) - t = Translator(gcd) - a = t.annotate([int, int]) - t.specialize() + t = TranslationContext() + t.buildannotator().build_types(gcd, [int, int]) + t.buildrtyper().specialize() gcd_graph = graphof(t, gcd) remove_tail_calls_to_self(t, gcd_graph ) lli = LLInterpreter(t.rtyper) From ericvrp at codespeak.net Fri Dec 2 11:14:42 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 2 Dec 2005 11:14:42 +0100 (CET) Subject: [pypy-svn] r20565 - pypy/dist/pypy/translator/js Message-ID: <20051202101442.B6C5627B69@code1.codespeak.net> Author: ericvrp Date: Fri Dec 2 11:14:41 2005 New Revision: 20565 Modified: pypy/dist/pypy/translator/js/js.py pypy/dist/pypy/translator/js/opwriter.py Log: Fixes from genjs. There is one fix I don't know how to do. Can someone please have a quick look? 
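The API shift that genjs is wrestling with in the diff below — `getfunctionptr` no longer takes the translator plus a Python function, but a flow graph looked up through the annotator's bookkeeper — can be mirrored with stand-in classes. This is a hypothetical simplified model of the call shape only (the `Desc`, `Bookkeeper`, and tuple "graphs" here are illustrative, not the real PyPy objects):

```python
# Simplified model of the post-refactoring lookup chain:
#   bookkeeper.getdesc(func) -> description -> cachedgraph(key) -> graph
# and then getfunctionptr(graph) -> low-level function pointer.

class Desc:
    """Stands in for a function description held by the bookkeeper."""
    def __init__(self, func):
        self.func = func
        self._graphs = {}

    def cachedgraph(self, key):
        # one graph per specialization key; None means the default variant
        return self._graphs.setdefault(key, ('graph', self.func.__name__, key))

class Bookkeeper:
    """Caches one Desc per function, like the annotator's bookkeeper."""
    def __init__(self):
        self._descs = {}

    def getdesc(self, func):
        return self._descs.setdefault(func, Desc(func))

def getfunctionptr(graph):
    # new-style: the pointer is derived from a graph, not from (translator, func)
    return ('fnptr', graph)

def entry():
    return 42

bk = Bookkeeper()
ptr = getfunctionptr(bk.getdesc(entry).cachedgraph(None))
print(ptr)  # ('fnptr', ('graph', 'entry', None))
```

The caching matters for the fix being asked about: looking the same function up twice must yield the same graph, so the generated pointer is stable across call sites.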
Modified: pypy/dist/pypy/translator/js/js.py ============================================================================== --- pypy/dist/pypy/translator/js/js.py (original) +++ pypy/dist/pypy/translator/js/js.py Fri Dec 2 11:14:41 2005 @@ -10,7 +10,8 @@ import py import os -from pypy.rpython.rmodel import inputconst, getfunctionptr +from pypy.rpython.rmodel import inputconst +from pypy.rpython.typesystem import getfunctionptr from pypy.rpython.lltypesystem import lltype from pypy.tool.udir import udir from pypy.translator.js.node import Node @@ -34,15 +35,20 @@ def write_source(self): func = self.entrypoint - ptr = getfunctionptr(self.db.translator, func) + bk = self.db.translator.annotator.bookkeeper + ptr = getfunctionptr(bk.getdesc(func).cachedgraph(None)) c = inputconst(lltype.typeOf(ptr), ptr) self.db.prepare_arg_value(c) #add exception matching function (XXX should only be done when needed) - e = self.db.translator.rtyper.getexceptiondata() - matchptr = getfunctionptr(self.db.translator, e.ll_exception_match) - matchconst = inputconst(lltype.typeOf(matchptr), matchptr) - self.db.prepare_arg_value(matchconst) + try: + e = self.db.translator.rtyper.getexceptiondata() + #matchptr = getfunctionptr(bk.getdesc(e.fn_exception_match).cachedgraph(None)) + matchptr = getfunctionptr(bk.getdesc(e.fn_exception_match._obj).cachedgraph(None)) + matchconst = inputconst(lltype.typeOf(matchptr), matchptr) + self.db.prepare_arg_value(matchconst) + except: + pass #XXX need a fix here # set up all nodes self.db.setup_all() Modified: pypy/dist/pypy/translator/js/opwriter.py ============================================================================== --- pypy/dist/pypy/translator/js/opwriter.py (original) +++ pypy/dist/pypy/translator/js/opwriter.py Fri Dec 2 11:14:41 2005 @@ -1,7 +1,6 @@ import py from pypy.objspace.flow.model import Constant from pypy.rpython.lltypesystem import lltype -from pypy.rpython.rmodel import inputconst, getfunctionptr from pypy.translator.js.log 
import log log = log.opwriter @@ -278,7 +277,7 @@ exceptions = [] for exit in self.block.exits[1:]: assert issubclass(exit.exitcase, Exception) - exception_match = self.db.translator.rtyper.getexceptiondata().ll_exception_match.__name__ + exception_match = self.db.translator.rtyper.getexceptiondata().fn_exception_match._obj._name exception_ref = self.db.obj2node[exit.llexitcase._obj].ref #get _ref() exception_target = self.node.blockindex[exit.target] exception = (exception_match, exception_ref, exception_target, exit) From arigo at codespeak.net Fri Dec 2 11:43:45 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 2 Dec 2005 11:43:45 +0100 (CET) Subject: [pypy-svn] r20566 - in pypy/dist/pypy/rpython: . test Message-ID: <20051202104345.35B7727B5F@code1.codespeak.net> Author: arigo Date: Fri Dec 2 11:43:44 2005 New Revision: 20566 Modified: pypy/dist/pypy/rpython/rpbc.py pypy/dist/pypy/rpython/test/test_rpbc.py Log: (mwh, arre-and-ericvrp-overlooking, arigo) Implemented multiple-specializations, multiple-functions PBCs. 
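The merging rule this commit tightens — two call-table rows can only be merged when they have the same low-level function type, agree on every funcdesc they share, and have at least one funcdesc in common — can be modeled with plain dicts. A simplified sketch, not the real rpbc.py code (the `fntype` strings stand in for actual low-level pointer types):

```python
def try_merge(row_a, type_a, row_b, type_b):
    """Merge two call-table rows (funcdesc -> llfn dicts), mirroring the
    constraints in rpbc.py; returns the merged row or None if no match."""
    if type_a != type_b:
        return None                      # different pointer types never match
    for desc, llfn in row_a.items():
        if desc in row_b and row_b[desc] != llfn:
            return None                  # conflicting entry for a shared desc
    merged = dict(row_a)
    merged.update(row_b)
    if len(merged) == len(row_a) + len(row_b):
        return None                      # no common funcdesc: not a match
    return merged

# rows sharing 'f' with the same llfn and type merge into one bigger row:
print(try_merge({'f': 1, 'g': 2}, 'int->int', {'f': 1, 'h': 3}, 'int->int'))
# rows of different fntype never merge, even with identical entries:
print(try_merge({'f': 1}, 'int->int', {'f': 1}, 'str->str'))
```

Keeping rows of different function types apart is what lets the new code give each unique row its own typed field (`variant0`, `variant1`, ...) in the per-function Struct built in the diff below.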
Modified: pypy/dist/pypy/rpython/rpbc.py ============================================================================== --- pypy/dist/pypy/rpython/rpbc.py (original) +++ pypy/dist/pypy/rpython/rpbc.py Fri Dec 2 11:43:44 2005 @@ -5,7 +5,7 @@ from pypy.annotation import description from pypy.objspace.flow.model import Constant from pypy.rpython.lltypesystem.lltype import \ - typeOf, Void, Bool, nullptr, frozendict + typeOf, Void, Bool, nullptr, frozendict, Ptr, Struct, malloc from pypy.rpython.error import TyperError from pypy.rpython.rmodel import Repr, inputconst, HalfConcreteWrapper from pypy.rpython import rclass @@ -91,6 +91,8 @@ # a 'matching' row is one that has the same llfn, expect # that it may have more or less 'holes' for existingindex, existingrow in enumerate(uniquerows): + if row.fntype != existingrow.fntype: + continue # not the same pointer type, cannot match for funcdesc, llfn in row.items(): if funcdesc in existingrow: if llfn != existingrow[funcdesc]: @@ -99,6 +101,7 @@ # potential match, unless the two rows have no common funcdesc merged = ConcreteCallTableRow(row) merged.update(existingrow) + merged.fntype = row.fntype if len(merged) == len(row) + len(existingrow): pass # no common funcdesc, not a match else: @@ -125,14 +128,19 @@ for funcdesc, graph in row.items(): llfn = rtyper.getcallable(graph) concreterow[funcdesc] = llfn + assert len(concreterow) > 0 + concreterow.fntype = typeOf(llfn) # 'llfn' from the loop above + # (they should all have the same type) concreterows[shape, index] = concreterow for row in concreterows.values(): addrow(row) for (shape, index), row in concreterows.items(): - _, biggerrow = lookuprow(row) - concretetable[shape, index] = biggerrow + existingindex, biggerrow = lookuprow(row) + row = uniquerows[existingindex] + assert biggerrow == row # otherwise, addrow() is broken + concretetable[shape, index] = row for finalindex, row in enumerate(uniquerows): row.attrname = 'variant%d' % finalindex @@ -174,10 +182,16 @@ 
self.uniquerows = uniquerows if len(uniquerows) == 1: row = uniquerows[0] - examplellfn = row.itervalues().next() - self.lowleveltype = typeOf(examplellfn) + self.lowleveltype = row.fntype else: - XXX_later + # several functions, each with several specialized variants. + # each function becomes a pointer to a Struct containing + # pointers to its variants. + fields = [] + for row in uniquerows: + fields.append((row.attrname, row.fntype)) + self.lowleveltype = Ptr(Struct('specfunc', *fields)) + self.funccache = {} def get_s_callable(self): return self.s_pbc @@ -201,25 +215,36 @@ def convert_desc(self, funcdesc): # get the whole "column" of the call table corresponding to this desc + try: + return self.funccache[funcdesc] + except KeyError: + pass if self.lowleveltype is Void: - return HalfConcreteWrapper(self.get_unique_llfn) - llfns = {} - found_anything = False - for row in self.uniquerows: - if funcdesc in row: - llfn = row[funcdesc] - found_anything = True - else: - null = self.rtyper.type_system.null_callable(self.lowleveltype) - llfn = null - llfns[row.attrname] = llfn - if not found_anything: - raise TyperError("%r not in %r" % (funcdesc, - self.s_pbc.descriptions)) - if len(self.uniquerows) == 1: - return llfn # from the loop above + result = HalfConcreteWrapper(self.get_unique_llfn) else: - XXX_later + llfns = {} + found_anything = False + for row in self.uniquerows: + if funcdesc in row: + llfn = row[funcdesc] + found_anything = True + else: + # missing entry -- need a 'null' of the type that matches + # this row + llfn = self.rtyper.type_system.null_callable(row.fntype) + llfns[row.attrname] = llfn + if not found_anything: + raise TyperError("%r not in %r" % (funcdesc, + self.s_pbc.descriptions)) + if len(self.uniquerows) == 1: + result = llfn # from the loop above + else: + # build a Struct with all the values collected in 'llfns' + result = malloc(self.lowleveltype.TO, immortal=True) + for attrname, llfn in llfns.items(): + setattr(result, attrname, 
llfn) + self.funccache[funcdesc] = result + return result def convert_const(self, value): if isinstance(value, types.MethodType) and value.im_self is None: @@ -237,6 +262,7 @@ low-level function. In case the call table contains multiple rows, 'index' and 'shape' tells which of its items we are interested in. """ + assert v.concretetype == self.lowleveltype if self.lowleveltype is Void: assert len(self.s_pbc.descriptions) == 1 # lowleveltype wouldn't be Void otherwise @@ -248,7 +274,10 @@ elif len(self.uniquerows) == 1: return v else: - XXX_later + # 'v' is a Struct pointer, read the corresponding field + row = self.concretetable[shape, index] + cname = inputconst(Void, row.attrname) + return llop.genop('getfield', [v, cname], resulttype = row.fntype) def get_unique_llfn(self): # try to build a unique low-level function. Avoid to use Modified: pypy/dist/pypy/rpython/test/test_rpbc.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rpbc.py (original) +++ pypy/dist/pypy/rpython/test/test_rpbc.py Fri Dec 2 11:43:44 2005 @@ -1169,3 +1169,52 @@ assert type(res.item1) is float assert res.item0 == f(i)[0] assert res.item1 == f(i)[1] + +def test_multiple_specialized_functions(): + def myadder(x, y): # int,int->int or str,str->str + return x+y + def myfirst(x, y): # int,int->int or str,str->str + return x + def mysecond(x, y): # int,int->int or str,str->str + return y + myadder._annspecialcase_ = 'specialize:argtype0' + myfirst._annspecialcase_ = 'specialize:argtype0' + mysecond._annspecialcase_ = 'specialize:argtype0' + def f(i): + if i == 0: + g = myfirst + elif i == 1: + g = mysecond + else: + g = myadder + s = g("hel", "lo") + n = g(40, 2) + return len(s) * n + for i in range(3): + res = interpret(f, [i]) + assert res == f(i) + +def test_specialized_method_of_frozen(): + class space: + def __init__(self, tag): + self.tag = tag + def wrap(self, x): + if isinstance(x, int): + return self.tag + '< %d >' % x 
+ else: + return self.tag + x + wrap._annspecialcase_ = 'specialize:argtype1' + space1 = space("tag1:") + space2 = space("tag2:") + def f(i): + if i == 1: + sp = space1 + else: + sp = space2 + w1 = sp.wrap('hello') + w2 = sp.wrap(42) + return w1 + w2 + res = interpret(f, [1]) + assert ''.join(res.chars) == 'tag1:hellotag1:< 42 >' + res = interpret(f, [0]) + assert ''.join(res.chars) == 'tag2:hellotag2:< 42 >' From bea at codespeak.net Fri Dec 2 12:44:48 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Fri, 2 Dec 2005 12:44:48 +0100 (CET) Subject: [pypy-svn] r20568 - pypy/extradoc/talk/22c3 Message-ID: <20051202114448.7FC5827B69@code1.codespeak.net> Author: bea Date: Fri Dec 2 12:44:47 2005 New Revision: 20568 Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt Log: changes made based on Laura's feedback - thanks Laura - please check again if you have the time.... Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt ============================================================================== --- pypy/extradoc/talk/22c3/agility_v1.txt.txt (original) +++ pypy/extradoc/talk/22c3/agility_v1.txt.txt Fri Dec 2 12:44:47 2005 @@ -7,7 +7,7 @@ \begin{abstract} This paper walks through different aspects of agility -within the open-source driven PyPy project [#]_. Agility played a key role +within the open-source PyPy project [#]_. Agility has played a key role from the beginning. The PyPy project started from some mails between a few people, quickly had a first one-week meeting, a "sprint", from where it evolved into a structure that was able to carry @@ -43,7 +43,7 @@ Agile approaches: sprinting ---------------------------- -The first bits of PyPy started during a one-week meeting, a "sprint", +PyPy first started during a one-week meeting, a "sprint", held at Trillke-Gut in Hildesheim February 2003. The sprint was inspired by practices used by other Python projects such as Zope3. 
Originally the sprint methodology used in the Python community grew from practices applied by @@ -83,6 +83,9 @@ way of getting results and getting new people aquainted - a good method for dissemination of knowledge and learning within the team. +A key insight, wothwhile for other EU-projects to ponder about, is how an agile +process like sprinting is much more suited for creative work between groups of distributed people. Traditional software development, as well as traditional project management techniques have a tendency to hinder creativity due to the inbuilt over-structured, segmented and control-oriented approach which in most cases end in less quality when results are being measured. + .. raw:: latex \begin{figure*}[htbp]\begin{center} @@ -185,8 +188,7 @@ project wasn't enough. The challenges and the idea of a flexible, configurable "translator" or "compiler" met the research targets of the FP6, as well as trying out and documenting the agile methodology being used. -It is interesting to note that todays computer industrial language -research and development occurs mostly in the US. +It is interesting to note that most of today's computer language research and development happens in the US In short, we argued that EU funding allowed the project to go from reaching a critical mass and position to continue to evolve from @@ -195,8 +197,7 @@ Acting on this strategy proved to be a more difficult task. The entire proposal and negotiation process took over a year (Autumn 2003 until -November 2004). Satisfying the formal requirements, a proper description of -planned work, had not previously been part of the development focus and both the EU +November 2004).A proper description of planned work, necessary to satisfy formal requirements, had not previously been part of the development focus and both the EU and the parties involved had to adapt to the situation. 
Yet, drafting the high-level requirements (in total 14 workpackages and 58 deliverables) was done using the same version-control/open-communication based work style, including @@ -318,7 +319,7 @@ efforts. Aiming for homogenity is the real threat. - what first seemed like unbeatable odds and too big obstacles - even turned sometimes into new possibilities. The challenge is + sometimes even turned into new possibilities. The challenge is to maintain an atmosphere in which a team can act on those and within short timeframes of opportunities. Change is inevitable - how you handle it is the real challenge. From cfbolz at codespeak.net Fri Dec 2 13:10:11 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Fri, 2 Dec 2005 13:10:11 +0100 (CET) Subject: [pypy-svn] r20569 - pypy/extradoc/talk/22c3 Message-ID: <20051202121011.DF51227B6E@code1.codespeak.net> Author: cfbolz Date: Fri Dec 2 13:10:05 2005 New Revision: 20569 Modified: pypy/extradoc/talk/22c3/agility.pdf pypy/extradoc/talk/22c3/agility_v1.txt.txt Log: adding references + regeneration. Modified: pypy/extradoc/talk/22c3/agility.pdf ============================================================================== Files pypy/extradoc/talk/22c3/agility.pdf (original) and pypy/extradoc/talk/22c3/agility.pdf Fri Dec 2 13:10:05 2005 differ Modified: pypy/extradoc/talk/22c3/agility_v1.txt.txt ============================================================================== --- pypy/extradoc/talk/22c3/agility_v1.txt.txt (original) +++ pypy/extradoc/talk/22c3/agility_v1.txt.txt Fri Dec 2 13:10:05 2005 @@ -43,13 +43,13 @@ Agile approaches: sprinting ---------------------------- -PyPy first started during a one-week meeting, a "sprint", -held at Trillke-Gut in Hildesheim February 2003. The sprint was inspired by -practices used by other Python projects such as Zope3. Originally the -sprint methodology used in the Python community grew from practices applied by -the Zope Corporation. 
Their definition of a sprint was: "two-day or three-day focused -development session, in which developers pair off together in a room and focus -on building a particular subsystem". +PyPy first started during a one-week meeting, a "sprint", held at Trillke-Gut +in Hildesheim February 2003. The sprint was inspired by practices used by other +Python projects such as Zope3. Originally the sprint methodology used in the +Python community grew from practices applied by the Zope Corporation. Their +definition of a sprint was: "two-day or three-day focused development session, +in which developers pair off together in a room and focus on building a +particular subsystem". Sprinting up to a week became the initial driving factor in developing the code base and the community/people around it. The early PyPy sprints @@ -59,7 +59,7 @@ Sprints are actually not part of the traditional Agile portfolio of techniques, the closest thing to it comes from -Scrum who names the 30 days long programming iterations +Scrum [#]_ who names the 30 days long programming iterations "sprints", covering a certain increment. With the Scrum method, considerable effort is put into performing the sprint planning as well as creating and documenting the "sprint backlog" which @@ -83,8 +83,14 @@ way of getting results and getting new people aquainted - a good method for dissemination of knowledge and learning within the team. -A key insight, wothwhile for other EU-projects to ponder about, is how an agile -process like sprinting is much more suited for creative work between groups of distributed people. Traditional software development, as well as traditional project management techniques have a tendency to hinder creativity due to the inbuilt over-structured, segmented and control-oriented approach which in most cases end in less quality when results are being measured. 
+A key insight, worthwhile for other EU-projects to ponder, is how an agile +process like sprinting is much more suited for creative work between groups of +distributed people. Traditional software development, as well as traditional +project management techniques have a tendency to hinder creativity due to the +inbuilt over-structured, segmented and control-oriented approach which in most +cases ends in less quality when results are being measured. + +.. [#] "Agile project management with Scrum", Ken Schwaber, Microsoft Professional 2004, p. 8-9 .. raw:: latex @@ -106,7 +112,7 @@ especially in combination with a transparent open process in which anyone interested can participate - if only for just a few days at a sprint. One of the key problems identified by Frederick P. Brooks in the latest version of -"The Mythical Man-Month" (unfortunately still very actual today) is estimating +"The Mythical Man-Month" [#]_ (unfortunately still very actual today) is estimating correct amount of time for communication and testing/debugging. Automated testing, rather barrier-free communication and strict version tracking helps with that problem, especially in the hands of a team sprinting its way through @@ -117,6 +123,8 @@ for writing tests and glueing things together. PyPy's testing tool ("py.test") is used separately and evolves on its own by now. +.. [#] "The mythical man-month, anniversary edition", Frederick P. Brooks, Jr, Addison-Wesley 2004 + .. raw:: latex \begin{figure*}[htbp]\begin{center} @@ -181,29 +189,31 @@ (languages and applications). There was no previous experience of an Open Source community based project making a bid for funding. -The areas in the 6th Framework programme (second call) fit very well -with the objectives of PyPy. The idea of strengthening the European Software +The areas in the 6th Framework programme (second call) fit very well with the +objectives of PyPy. 
The idea of strengthening the European Software development companies and businesses with supporting an open source language implementation was new but appealing to the EU. However, being an Open Source -project wasn?t enough. The challenges and the idea of a flexible, -configurable "translator" or "compiler" met the research targets of the FP6, as -well as trying out and documenting the agile methodology being used. -It is interesting to note that most of today's computer language research and development happens in the US +project wasn?t enough. The challenges and the idea of a flexible, configurable +"translator" or "compiler" met the research targets of the FP6, as well as +trying out and documenting the agile methodology being used. It is interesting +to note that most of today's computer language research and development happens +in the US. In short, we argued that EU funding allowed the project to go from reaching a critical mass and position to continue to evolve from there, and that it would help European Organisations to make some ground. -Acting on this strategy proved to be a more difficult task. The -entire proposal and negotiation process took over a year (Autumn 2003 until -November 2004).A proper description of planned work, necessary to satisfy formal requirements, had not previously been part of the development focus and both the EU -and the parties involved had to adapt to the situation. Yet, drafting the -high-level requirements (in total 14 workpackages and 58 deliverables) was done -using the same version-control/open-communication based work style, including -evolving the proposal at sprints. Writing the proposal and specifying according -objectives on a higher level has proved to be generally useful for clarifying goals -on a longer term. It also helps others to understand the project better. +Acting on this strategy proved to be a more difficult task. 
The entire proposal +and negotiation process took over a year (Autumn 2003 until November 2004).A +proper description of planned work, necessary to satisfy formal requirements, +had not previously been part of the development focus and both the EU and the +parties involved had to adapt to the situation. Yet, drafting the high-level +requirements (in total 14 workpackages and 58 deliverables) was done using the +same version-control/open-communication based work style, including evolving +the proposal at sprints. Writing the proposal and specifying according +objectives on a higher level has proved to be generally useful for clarifying +goals on a longer term. It also helps others to understand the project better. Unfortunately the negotiations with the EU got stuck in organizational limbo and the project is still suffering from From arigo at codespeak.net Fri Dec 2 13:18:41 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 2 Dec 2005 13:18:41 +0100 (CET) Subject: [pypy-svn] r20570 - in pypy/dist/pypy/translator/js: . src test Message-ID: <20051202121841.F328327B6E@code1.codespeak.net> Author: arigo Date: Fri Dec 2 13:18:40 2005 New Revision: 20570 Modified: pypy/dist/pypy/translator/js/js.py pypy/dist/pypy/translator/js/src/ll_stackless.js pypy/dist/pypy/translator/js/test/test_stackless.py Log: (arigo and ericvrp) Fix for getting at exception_match function Fix for yield_frame_to_caller. (return value was overwritten) All but one test passes now. The last test fails because structure fields are not defined with a new Object in genjs at the moment. 
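For orientation, the control flow checked by `test_yield_frame1` (re-enabled below; digits 1 through 7 are appended strictly in order while execution bounces between two frames) can be imitated in ordinary Python with a generator. This is only an analogy: the real stackless code switches via frame objects returned by `yield_current_frame_to_caller()` and `.switch()`, not via `next()`:

```python
# Generator analogy for the stackless yield_frame tests: g() gives
# control back to f() at each switch point, and the shared list ends
# up as [1, 2, 3, 4, 5, 6, 7].

def g(lst):
    lst.append(2)
    yield            # ~ yield_current_frame_to_caller()
    lst.append(4)
    yield            # ~ switching back out of g
    lst.append(6)    # g ends; the final switch yields no further frame

def f():
    lst = [1]
    frame = g(lst)
    next(frame)      # run g up to its first switch point
    lst.append(3)
    next(frame)      # ~ frametop_before_4.switch()
    lst.append(5)
    try:
        next(frame)  # last switch: g runs to completion
    except StopIteration:
        pass         # ~ frametop_after_return is None
    lst.append(7)
    n = 0
    for i in lst:
        n = n * 10 + i
    return n
```

`f()` returns 1234567, matching the assertion in the test below.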
Modified: pypy/dist/pypy/translator/js/js.py ============================================================================== --- pypy/dist/pypy/translator/js/js.py (original) +++ pypy/dist/pypy/translator/js/js.py Fri Dec 2 13:18:40 2005 @@ -41,14 +41,10 @@ self.db.prepare_arg_value(c) #add exception matching function (XXX should only be done when needed) - try: - e = self.db.translator.rtyper.getexceptiondata() - #matchptr = getfunctionptr(bk.getdesc(e.fn_exception_match).cachedgraph(None)) - matchptr = getfunctionptr(bk.getdesc(e.fn_exception_match._obj).cachedgraph(None)) - matchconst = inputconst(lltype.typeOf(matchptr), matchptr) - self.db.prepare_arg_value(matchconst) - except: - pass #XXX need a fix here + e = self.db.translator.rtyper.getexceptiondata() + matchptr = e.fn_exception_match + matchconst = inputconst(lltype.typeOf(matchptr), matchptr) + self.db.prepare_arg_value(matchconst) # set up all nodes self.db.setup_all() Modified: pypy/dist/pypy/translator/js/src/ll_stackless.js ============================================================================== --- pypy/dist/pypy/translator/js/src/ll_stackless.js (original) +++ pypy/dist/pypy/translator/js/src/ll_stackless.js Fri Dec 2 13:18:40 2005 @@ -81,7 +81,7 @@ slp_frame_stack_top = slp_frame_stack_bottom = slp_new_frame_simple(ll_stack_unwind); } LOG('slp_frame_stack_top='+slp_frame_stack_top + ', slp_frame_stack_bottom='+slp_frame_stack_bottom) - return null; + return slp_return_value; } function slp_return_current_frame_to_caller() { @@ -98,12 +98,13 @@ LOG("slp_end_of_yielding_function"); if (!slp_frame_stack_top) log('slp_end_of_yielding_function !slp_frame_stack_top'); // can only resume from slp_return_current_frame_to_caller() if (!slp_return_value) log('slp_end_of_yielding_function !slp_return_value'); + LOG('slp_return_value is going to ' + function_name(slp_return_value.func)) slp_frame_stack_top = slp_return_value; return null; } -function ll_stackless_switch__frame_stack_topPtr(c) { - 
LOG("ll_stackless_switch__frame_stack_topPtr"); +function ll_stackless_switch(c) { + LOG("ll_stackless_switch"); var f; var result; if (slp_frame_stack_top) { //resume @@ -120,7 +121,7 @@ LOG("slp_frame_stack_top == null"); // first, unwind the current stack - f = slp_new_frame_simple(ll_stackless_switch__frame_stack_topPtr); + f = slp_new_frame_simple(ll_stackless_switch); f.p0 = c; slp_frame_stack_top = slp_frame_stack_bottom = f; } Modified: pypy/dist/pypy/translator/js/test/test_stackless.py ============================================================================== --- pypy/dist/pypy/translator/js/test/test_stackless.py (original) +++ pypy/dist/pypy/translator/js/test/test_stackless.py Fri Dec 2 13:18:40 2005 @@ -130,8 +130,6 @@ assert int(data.strip()) == 697 #10**4==697(6seconds, 10**5==545(45seconds) def test_yield_frame1(): - py.test.skip("stackless feature not incomplete") - def g(lst): lst.append(2) frametop_before_5 = yield_current_frame_to_caller() @@ -148,7 +146,7 @@ lst.append(5) frametop_after_return = frametop_before_6.switch() lst.append(7) - #assert frametop_after_return is None + assert frametop_after_return is None n = 0 for i in lst: n = n*10 + i @@ -158,8 +156,6 @@ assert int(data.strip()) == 1234567 def test_yield_frame2(): - py.test.skip("stackless feature incomplete (exception handling?)") - S = lltype.GcStruct("base", ('a', lltype.Signed)) s = lltype.malloc(S) @@ -179,7 +175,7 @@ s.a += 5 frametop_after_return = frametop_before_6.switch() s.a += 7 - #assert frametop_after_return is None + assert frametop_after_return is None return s.a data = wrap_stackless_function(f) From jacob at codespeak.net Fri Dec 2 13:43:18 2005 From: jacob at codespeak.net (jacob at codespeak.net) Date: Fri, 2 Dec 2005 13:43:18 +0100 (CET) Subject: [pypy-svn] r20571 - pypy/extradoc/talk/22c3 Message-ID: <20051202124318.5DEB327B5A@code1.codespeak.net> Author: jacob Date: Fri Dec 2 13:43:17 2005 New Revision: 20571 Modified: pypy/extradoc/talk/22c3/techpaper.txt 
Log: Some clarifications and language changes. Modified: pypy/extradoc/talk/22c3/techpaper.txt ============================================================================== --- pypy/extradoc/talk/22c3/techpaper.txt (original) +++ pypy/extradoc/talk/22c3/techpaper.txt Fri Dec 2 13:43:17 2005 @@ -17,7 +17,7 @@ compiler toolsuite that can produce custom Python versions. Platform, memory and threading models are to become aspects of the translation process - as opposed to encoding low level details into the language implementation itself. -Eventually, dynamic optimization techniques - implemented as another +Eventually, dynamic optimisation techniques - implemented as another translation aspect - should become robust against language changes. .. [#] http://codespeak.net/pypy @@ -49,10 +49,12 @@ This eases reuse and allows experimenting with multiple implementations of specific features. -Later in the project we will introduce optimizations, following the ideas -of Psyco [#]_ that should make PyPy run Python programs faster than CPython, -and extensions, following the ideas of Stackless [#]_ and others, that will -increase the expressive power available to python programmers. +Later in the project we will introduce optimisations, following the +ideas of Psyco [#]_, a Just in Time Specialiser, that should make PyPy +run Python programs faster than CPython. Extensions that increase the +expressive power are also planned. For instance, we will include the +ideas of Stackless [#]_, which moves the execution frames off the stack into +heap space, allowing for massive parallellism. .. [#] http://psyco.sourceforge.net .. [#] http://stackless.com @@ -66,19 +68,19 @@ like C/Posix, Java or C#. Each such interpreter provides a "mapping" from application source code to the target environment. 
One of the goals of the "all-encompassing" environments, like the .NET framework -and to some extent the Java virtual machine, is to provide standardized +and to some extent the Java virtual machine, is to provide standardised and higher level functionalities to language implementors. This reduces the burden of having to write and maintain many interpreters or compilers. PyPy is experimenting with a different approach. We are not writing a Python interpreter for a specific target platform. We have written a -Python interpreter in Python, without many references to low-level -details. (Because of the nature of Python, this is already a -complicated task, although not as much as writing it in - say - C.) -Then we use this as a "language specification" and manipulate it to -produce the more traditional interpreters that we want. In the above -sense, we are generating the concrete "mappings" of Python into +Python interpreter in Python, with as few references as possible to +low-level details. (Because of the nature of Python, this is already +a complicated task, although not as complicated as writing it in - say +- C.) Then we use this as a "language specification" and manipulate +it to produce the more traditional interpreters that we want. In the +above sense, we are generating the concrete "mappings" of Python into lower-level target platforms. So far (autumn 2005), we have already succeeded in turning this "language @@ -149,7 +151,7 @@ The *bytecode interpreter* is the part that interprets the compact bytecode format produced from user Python sources by a preprocessing phase, the *bytecode compiler*. The bytecode compiler itself is -implemented as a chain of flexible passes (tokenizer, lexer, parser, +implemented as a chain of flexible passes (tokeniser, lexer, parser, abstract syntax tree builder, bytecode generator). The bytecode interpreter then does its work by delegating all actual manipulation of user objects to the *object space*. 
The latter can be thought of as the @@ -173,7 +175,7 @@ - producing a *flow graph* representation of the standard interpreter. A combination of the bytecode interpreter and a *flow object space* performs *abstract interpretation* to record the flow of objects - and execution throughout a python program into such a *flow graph*; + and execution throughout a Python program into such a *flow graph*; - the *annotator* which performs type inference on the flow graph; @@ -198,16 +200,16 @@ In order for our translation and type inference mechanisms to work effectively, we need to restrict the dynamism of our interpreter-level Python code at some point. However, in the start-up phase, we are -completely free to use all kinds of powerful python constructs, including +completely free to use all kinds of powerful Python constructs, including metaclasses and execution of dynamically constructed strings. However, -when the initialization phase finishes, all code objects involved need to +when the initialisation phase finishes, all code objects involved need to adhere to a more static subset of Python: Restricted Python, also known as RPython. The Flow Object Space then, with the help of our bytecode interpreter, -works through those initialized RPython code objects. The result of +works through those initialised RPython code objects. The result of this abstract interpretation is a flow graph: yet another -representation of a python program, but one which is suitable for +representation of a Python program, but one which is suitable for applying translation and type inference techniques. The nodes of the graph are basic blocks consisting of Object Space operations, flowing of values, and an exitswitch to one, two or multiple links which connect @@ -255,30 +257,31 @@ Status of the implementation (Nov 2005) ========================================== -With the pypy-0.8.0 release we have integrated our AST compiler with -the rest of PyPy. 
The compiler gets translated with the rest to a -static self-contained version of our standard interpreter. Like -with 0.7.0 this version is very compliant [#]_ to CPython 2.4.1 but you -cannot run many existing programs on it yet because we are -still missing a number of C-modules like socket or support for process -creation. +With the pypy-0.8.0 release we have integrated our Abstract Syntax +Tree (AST) compiler with the rest of PyPy. The compiler gets +translated with the rest to a static self-contained version of the +standard interpreter. Like with 0.7.0 this version is very compliant +[#]_ to CPython 2.4.1 but you cannot run many existing programs on it +yet because we are still missing a number of C-modules like socket or +support for process creation. The self-contained PyPy version (single-threaded and using the -Boehm-Demers-Weiser garbage collector [#]_) now runs around 10-20 times -slower than CPython, i.e. around 10 times faster than 0.7.0. -This is the result of optimizing, adding short -cuts for some common paths in our interpreter and adding relatively -straightforward optimization transforms to our tool chain, like inlining -paired with simple escape analysis to remove unnecessary heap allocations. -We still have some way to go, and we still expect most of our speed -will come from our Just-In-Time compiler work, which we have barely started -at the moment. - -With the 0.8.0 release the "thunk" object space can also be translated, -obtaining a self-contained version of PyPy -with its features (and some speed degradation), show-casing at a small -scale how our whole tool-chain supports flexibility from the interpreter -written in Python to the resulting self-contained executable. +Boehm-Demers-Weiser garbage collector [#]_) now runs around 10-20 +times slower than CPython, i.e. around 10 times faster than 0.7.0. 
+This is the result of optimisations, adding short cuts for some common +paths in our interpreter and adding relatively straight forward +optimising transforms to our tool chain, like inlining paired with +simple escape analysis to remove unnecessary heap allocations. We +still have some way to go. However we expect that most of our speed +will come from the Just-In-Time compiler - work which we have barely +started yet. + +With the 0.8.0 release the "Thunk Object Space" can also be +translated. This is a module that proxies the Standard Object Space, +adding lazy evaluation features to Python. It is a small scale +show-case for how our whole tool-chain supports flexibility from the +interpreter written in Python to the resulting self-contained +executable. Our rather complete and Python 2.4-compliant interpreter consists of about 30,000-50,000 lines of code (depending on the way you @@ -298,23 +301,23 @@ In 2006, the PyPy project aims to translate the standard Python Interpreter to a JIT-compiler and also to support massive parallelism -aka micro-threads within the language. These are not trivial tasks -especially if we want to retain and improve the modularity and -flexibility aspects of our implementation - like giving an -independent choice of memory or threading models for translation. -Moreover it is likely that our javascript and other higher -level backends (in contrast to our current low-level ones) will -continue to evolve. - -Apart from optimization-related translation choices PyPy is to enable new -possibilities regarding persistence, security and distribution issues. We -intend to experiment with ortoghonal persistence for Python objects, i.e. -one that doesn't require application objects to behave in a -particular manner. Security wise we will look at sandboxing -or capabilities based schemes. 
For distribution we already experimented -with allowing transparent migration of objects between processes with -the help of the existing (and translateable) Thunk Object Space. -In general, according experiments are much easier to conduct with PyPy -and should provide a resulting standalone executable in shorter time. +(micro-threads) within the language. These are not trivial tasks +especially if we want to retain and improve the modularity and +flexibility aspects of our implementation - like giving an independent +choice of memory or threading models for translation. Moreover it is +likely that our Javascript and other higher level backends (in +contrast to our current low-level ones) will continue to evolve. + +Apart from optimisation-related translation choices PyPy is to enable +new possibilities regarding persistence, security and distribution +issues. We intend to experiment with ortoghonal persistence for +Python objects, i.e. one that doesn't require application objects to +behave in a particular manner. Security-wise we will look at +sandboxing or capabilities based schemes. For distribution we already +experimented with allowing transparent migration of objects between +processes with the help of the existing (and translateable) Thunk +Object Space. In general, all experiments are much easier to conduct +in PyPy and should provide a resulting standalone executable in +a shorter time than traditional approaches. From cfbolz at codespeak.net Fri Dec 2 14:02:04 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Fri, 2 Dec 2005 14:02:04 +0100 (CET) Subject: [pypy-svn] r20572 - pypy/extradoc/talk/22c3 Message-ID: <20051202130204.8D54527B69@code1.codespeak.net> Author: cfbolz Date: Fri Dec 2 14:01:59 2005 New Revision: 20572 Modified: pypy/extradoc/talk/22c3/techpaper.pdf pypy/extradoc/talk/22c3/techpaper.txt Log: try to stick to british spelling. 
regenerate pdf Modified: pypy/extradoc/talk/22c3/techpaper.pdf ============================================================================== Files pypy/extradoc/talk/22c3/techpaper.pdf (original) and pypy/extradoc/talk/22c3/techpaper.pdf Fri Dec 2 14:01:59 2005 differ Modified: pypy/extradoc/talk/22c3/techpaper.txt ============================================================================== --- pypy/extradoc/talk/22c3/techpaper.txt (original) +++ pypy/extradoc/talk/22c3/techpaper.txt Fri Dec 2 14:01:59 2005 @@ -17,7 +17,7 @@ compiler toolsuite that can produce custom Python versions. Platform, memory and threading models are to become aspects of the translation process - as opposed to encoding low level details into the language implementation itself. -Eventually, dynamic optimisation techniques - implemented as another +Eventually, dynamic optimization techniques - implemented as another translation aspect - should become robust against language changes. .. [#] http://codespeak.net/pypy @@ -49,8 +49,8 @@ This eases reuse and allows experimenting with multiple implementations of specific features. -Later in the project we will introduce optimisations, following the -ideas of Psyco [#]_, a Just in Time Specialiser, that should make PyPy +Later in the project we will introduce optimizations, following the +ideas of Psyco [#]_, a Just in Time Specializer, that should make PyPy run Python programs faster than CPython. Extensions that increase the expressive power are also planned. For instance, we will include the ideas of Stackless [#]_, which moves the execution frames off the stack into @@ -68,7 +68,7 @@ like C/Posix, Java or C#. Each such interpreter provides a "mapping" from application source code to the target environment. 
One of the goals of the "all-encompassing" environments, like the .NET framework -and to some extent the Java virtual machine, is to provide standardised +and to some extent the Java virtual machine, is to provide standardized and higher level functionalities to language implementors. This reduces the burden of having to write and maintain many interpreters or compilers. @@ -151,7 +151,7 @@ The *bytecode interpreter* is the part that interprets the compact bytecode format produced from user Python sources by a preprocessing phase, the *bytecode compiler*. The bytecode compiler itself is -implemented as a chain of flexible passes (tokeniser, lexer, parser, +implemented as a chain of flexible passes (tokenizer, lexer, parser, abstract syntax tree builder, bytecode generator). The bytecode interpreter then does its work by delegating all actual manipulation of user objects to the *object space*. The latter can be thought of as the @@ -202,12 +202,12 @@ Python code at some point. However, in the start-up phase, we are completely free to use all kinds of powerful Python constructs, including metaclasses and execution of dynamically constructed strings. However, -when the initialisation phase finishes, all code objects involved need to +when the initialization phase finishes, all code objects involved need to adhere to a more static subset of Python: Restricted Python, also known as RPython. The Flow Object Space then, with the help of our bytecode interpreter, -works through those initialised RPython code objects. The result of +works through those initialized RPython code objects. The result of this abstract interpretation is a flow graph: yet another representation of a Python program, but one which is suitable for applying translation and type inference techniques. The nodes of the @@ -268,7 +268,7 @@ The self-contained PyPy version (single-threaded and using the Boehm-Demers-Weiser garbage collector [#]_) now runs around 10-20 times slower than CPython, i.e. 
around 10 times faster than 0.7.0. -This is the result of optimisations, adding short cuts for some common +This is the result of optimizations, adding short cuts for some common paths in our interpreter and adding relatively straight forward optimising transforms to our tool chain, like inlining paired with simple escape analysis to remove unnecessary heap allocations. We @@ -308,7 +308,7 @@ likely that our Javascript and other higher level backends (in contrast to our current low-level ones) will continue to evolve. -Apart from optimisation-related translation choices PyPy is to enable +Apart from optimization-related translation choices PyPy is to enable new possibilities regarding persistence, security and distribution issues. We intend to experiment with ortoghonal persistence for Python objects, i.e. one that doesn't require application objects to From arigo at codespeak.net Fri Dec 2 14:51:57 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 2 Dec 2005 14:51:57 +0100 (CET) Subject: [pypy-svn] r20573 - in pypy/dist/pypy: annotation objspace/std translator/goal Message-ID: <20051202135157.6EAE627B5D@code1.codespeak.net> Author: arigo Date: Fri Dec 2 14:51:56 2005 New Revision: 20573 Added: pypy/dist/pypy/translator/goal/targetmultiplespaces.py - copied unchanged from r19545, pypy/dist/pypy/translator/goal/targetmultiplespaces.py Modified: pypy/dist/pypy/annotation/builtin.py pypy/dist/pypy/objspace/std/objspace.py Log: (mwh, pedronis, arre, arigo) Reintroduced the two-spaces PyPy target. With a few tweaks in the source of PyPy, this can now compile! Three weeks of refactoring... 
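[Editorial aside: the objspace.py hunk in this commit replaces `boundmethod = func.__get__(self)` with a `make_boundmethod(func=func)` helper. Binding the current value of a variable as a default argument is the standard way around Python's late-binding closures when callables are created in a loop. The sketch below is a minimal, hypothetical illustration of that idiom, not PyPy code:]

```python
# Minimal sketch (not PyPy's actual code) of the late-binding closure
# pitfall that the make_boundmethod(func=func) idiom sidesteps.

def make_adders_buggy(ns):
    adders = []
    for n in ns:
        # 'n' is looked up when the closure is *called*, so after the
        # loop ends every adder sees the final value of 'n'.
        adders.append(lambda x: x + n)
    return adders

def make_adders_fixed(ns):
    adders = []
    for n in ns:
        # A default argument is evaluated at definition time, freezing
        # the current value of 'n' -- the same trick as make_boundmethod.
        def adder(x, n=n):
            return x + n
        adders.append(adder)
    return adders

if __name__ == "__main__":
    print([f(0) for f in make_adders_buggy([1, 10, 100])])  # [100, 100, 100]
    print([f(0) for f in make_adders_fixed([1, 10, 100])])  # [1, 10, 100]
```

The commit wraps the default-argument trick in a factory function (`make_boundmethod`) for the same reason: each generated bound method must remember its own `func`, not whichever one the loop saw last.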
Modified: pypy/dist/pypy/annotation/builtin.py ============================================================================== --- pypy/dist/pypy/annotation/builtin.py (original) +++ pypy/dist/pypy/annotation/builtin.py Fri Dec 2 14:51:56 2005 @@ -11,7 +11,9 @@ from pypy.annotation.model import SomeExternalObject from pypy.annotation.model import annotation_to_lltype, lltype_to_annotation from pypy.annotation.model import add_knowntypedata +from pypy.annotation.model import s_ImpossibleValue from pypy.annotation.bookkeeper import getbookkeeper +from pypy.annotation import description from pypy.objspace.flow.model import Constant import pypy.rpython.rarithmetic import pypy.rpython.objectmodel @@ -169,6 +171,14 @@ r = SomeBool() if s_obj.is_constant(): r.const = hasattr(s_obj.const, s_attr.const) + elif (isinstance(s_obj, SomePBC) + and s_obj.getKind() is description.FrozenDesc): + answers = {} + for d in s_obj.descriptions: + answer = (d.s_read_attribute(s_attr.const) != s_ImpossibleValue) + answers[answer] = True + if len(answers) == 1: + r.const, = answers return r ##def builtin_callable(s_obj): Modified: pypy/dist/pypy/objspace/std/objspace.py ============================================================================== --- pypy/dist/pypy/objspace/std/objspace.py (original) +++ pypy/dist/pypy/objspace/std/objspace.py Fri Dec 2 14:51:56 2005 @@ -5,6 +5,7 @@ from pypy.rpython.objectmodel import instantiate from pypy.interpreter.gateway import PyPyCacheDir from pypy.tool.cache import Cache +from pypy.tool.sourcetools import func_with_new_name from pypy.objspace.std.model import W_Object, UnwrapError from pypy.objspace.std.model import W_ANY, StdObjSpaceMultiMethod, StdTypeModel from pypy.objspace.std.multimethod import FailedToImplement @@ -52,7 +53,11 @@ mm) # e.g. 
add(space, w_x, w_y) - boundmethod = func.__get__(self) # bind the 'space' argument + def make_boundmethod(func=func): + def boundmethod(*args): + return func(self, *args) + return func_with_new_name(boundmethod, 'boundmethod_'+name) + boundmethod = make_boundmethod() setattr(self, name, boundmethod) # store into 'space' instance # hack to avoid imports in the time-critical functions below From pedronis at codespeak.net Fri Dec 2 15:37:55 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 2 Dec 2005 15:37:55 +0100 (CET) Subject: [pypy-svn] r20574 - pypy/dist/pypy/tool Message-ID: <20051202143755.2F39427B5A@code1.codespeak.net> Author: pedronis Date: Fri Dec 2 15:37:54 2005 New Revision: 20574 Modified: pypy/dist/pypy/tool/run_translation.py Log: Translator -> TranslationContext Modified: pypy/dist/pypy/tool/run_translation.py ============================================================================== --- pypy/dist/pypy/tool/run_translation.py (original) +++ pypy/dist/pypy/tool/run_translation.py Fri Dec 2 15:37:54 2005 @@ -7,11 +7,12 @@ module = __import__('pypy.translator.goal.%s', None, None, ['target']) entry_point, arg_s = module.target() -from pypy.translator.translator import Translator +from pypy.translator.translator import TranslationContext from pypy.translator.goal.query import polluted -t = Translator(entry_point) -a = t.annotate(arg_s) +t = TranslationContext() +a = t.buildannotator() +a.build_types(entry_point, args_s) a.simplify() print polluted(t) From cfbolz at codespeak.net Fri Dec 2 15:53:40 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Fri, 2 Dec 2005 15:53:40 +0100 (CET) Subject: [pypy-svn] r20575 - pypy/dist/pypy/doc Message-ID: <20051202145340.C3CCA27B5A@code1.codespeak.net> Author: cfbolz Date: Fri Dec 2 15:53:39 2005 New Revision: 20575 Removed: pypy/dist/pypy/doc/getting-started-0.8.txt Log: remove old 0.8 getting started From cfbolz at codespeak.net Fri Dec 2 16:06:01 2005 From: cfbolz at 
codespeak.net (cfbolz at codespeak.net) Date: Fri, 2 Dec 2005 16:06:01 +0100 (CET) Subject: [pypy-svn] r20576 - pypy/dist/pypy/doc Message-ID: <20051202150601.4AA3D27B5D@code1.codespeak.net> Author: cfbolz Date: Fri Dec 2 16:06:00 2005 New Revision: 20576 Modified: pypy/dist/pypy/doc/getting-started.txt Log: add warning that some of the getting-started examples might not work on the trunk. Modified: pypy/dist/pypy/doc/getting-started.txt ============================================================================== --- pypy/dist/pypy/doc/getting-started.txt (original) +++ pypy/dist/pypy/doc/getting-started.txt Fri Dec 2 16:06:00 2005 @@ -52,6 +52,9 @@ Svn-check out & run the latest PyPy as a two-liner -------------------------------------------------- +**WARNING**: because of an ongoing major refactoring some of the examples +below don't work in the svn version. Please use the 0.8.0 release for now! + If you want to play with the ongoing development PyPy version you can check it out from the repository using subversion. Download and install subversion_ if you don't already have it.
Then you can From sanxiyn at codespeak.net Fri Dec 2 16:31:53 2005 From: sanxiyn at codespeak.net (sanxiyn at codespeak.net) Date: Fri, 2 Dec 2005 16:31:53 +0100 (CET) Subject: [pypy-svn] r20577 - pypy/dist/pypy/translator Message-ID: <20051202153153.9F47E27B5F@code1.codespeak.net> Author: sanxiyn Date: Fri Dec 2 16:31:50 2005 New Revision: 20577 Modified: pypy/dist/pypy/translator/gencl.py Log: Insert some blank lines Modified: pypy/dist/pypy/translator/gencl.py ============================================================================== --- pypy/dist/pypy/translator/gencl.py (original) +++ pypy/dist/pypy/translator/gencl.py Fri Dec 2 16:31:50 2005 @@ -16,6 +16,7 @@ class Op: + def __init__(self, gen, op): self.gen = gen self.str = gen.str @@ -23,6 +24,7 @@ self.opname = op.opname self.args = op.args self.result = op.result + def __call__(self): if DEBUG: self.op_default() @@ -33,9 +35,11 @@ default = self.op_default meth = getattr(self, "op_" + self.opname, default) meth() + def op_default(self): print ";;", self.op print ";; Op", self.opname, "is missing" + binary_ops = { #"add": "+", "sub": "-", @@ -49,15 +53,18 @@ "and_": "logand", "getitem": "elt", } + def op_binary(self, op): s = self.str result, (arg1, arg2) = self.result, self.args cl_op = self.binary_ops[op] print "(setq", s(result), "(", cl_op, s(arg1), s(arg2), "))" + def op_contains(self): s = self.str result, (arg1, arg2) = self.result, self.args print "(setq", s(result), "(not (not (find", s(arg2), s(arg1), "))))" + def op_add(self): s = self.str result, (arg1, arg2) = self.result, self.args @@ -72,6 +79,7 @@ } self.gen.emit_typecase(table, arg1, arg2) print ")" + def op_not_(self): s = self.str result, (arg1,) = self.result, self.args @@ -84,6 +92,7 @@ } self.gen.emit_typecase(table, arg1) print "))" + def op_is_true(self): s = self.str result, (arg1,) = self.result, self.args @@ -96,41 +105,49 @@ } self.gen.emit_typecase(table, arg1) print ")" + def op_newtuple(self): s = self.str print "(setq", 
s(self.result), "(list", for arg in self.args: print s(arg), print "))" + def op_newlist(self): s = self.str print "(setq", s(self.result), "(vector", for arg in self.args: print s(arg), print "))" + def op_alloc_and_set(self): s = self.str result, (size, init) = self.result, self.args print "(setq", s(result), "(make-array", s(size), "))" print "(fill", s(result), s(init), ")" + def op_setitem(self): s = self.str (seq, index, element) = self.args print "(setf (elt", s(seq), s(index), ")", s(element), ")" + def op_iter(self): s = self.str result, (seq,) = self.result, self.args print "(setq", s(result), "(make-iterator", s(seq), "))" + def op_next(self): s = self.str result, (iterator,) = self.result, self.args print "(let ((result (funcall", s(iterator), ")))" print " (setq", s(result), "(car result))" print " (setq last-exc (cdr result)))" + builtin_map = { pow: "expt", range: "python-range", } + def op_simple_call(self): func = self.args[0] if not isinstance(func, Constant): @@ -146,10 +163,12 @@ for arg in args: print s(arg), print "))" + def op_getslice(self): s = self.str result, (seq, start, end) = self.result, self.args print "(setq", s(result), "(python-slice", s(seq), s(start), s(end), "))" + def op_pow(self): s = self.str result, (x,y,z) = self.result, self.args @@ -160,7 +179,9 @@ self.gen.emit_typecase(table, x, y, z) print ")" + class GenCL: + def __init__(self, fun, input_arg_types=[]): # NB. 'fun' is a graph! 
simplify_graph(fun) @@ -169,15 +190,19 @@ self.annotate(input_arg_types) transform_graph(self.ann, extra_passes=default_extra_passes +[transform_slice]) + def annotate(self, input_arg_types): ann = RPythonAnnotator() inputcells = [ann.typeannotation(t) for t in input_arg_types] ann.build_graph_types(self.fun, inputcells) self.setannotator(ann) + def setannotator(self, annotator): self.ann = annotator + def get_type(self, var): return self.ann.gettype(var) + def str(self, obj): if isinstance(obj, Variable): return obj.name @@ -185,6 +210,7 @@ return self.conv(obj.value) else: return "#<%r>" % (obj,) + def conv(self, val): if isinstance(val, tuple): val = map(self.conv, val) @@ -211,6 +237,7 @@ return "'last-exc-value" else: return "#<%r>" % (val,) + def emitcode(self, public=True): import sys from cStringIO import StringIO @@ -220,8 +247,10 @@ self.emit() sys.stdout = oldstdout return out.getvalue() + def emit(self): self.emit_defun(self.fun) + def emit_defun(self, fun): print ";;;; Main" print "(defun", fun.name @@ -259,6 +288,7 @@ self.emit_block(block) print ")" print ")" + def emit_block(self, block): self.cur_block = block tag = self.blockref[block] @@ -301,9 +331,11 @@ else: retval = self.str(block.inputargs[0]) print "(return", retval, ")" + def emit_jump(self, block): tag = self.blockref[block] print "(go", "tag" + str(tag), ")" + def emit_link(self, link): source = link.args target = link.target.inputargs @@ -313,6 +345,7 @@ print self.str(t), self.str(s), print ")" self.emit_jump(link.target) + typemap = { bool: "boolean", int: "fixnum", @@ -320,6 +353,7 @@ type(''): "string", # hack, 'str' is in the namespace! 
list: "vector", } + def emit_typecase(self, table, *args): argreprs = tuple(map(self.str, args)) argtypes = tuple(map(self.get_type, args)) @@ -341,6 +375,7 @@ print trans % argreprs, print ")" print ")" + def globaldeclarations(self): return prelude From sanxiyn at codespeak.net Fri Dec 2 16:52:30 2005 From: sanxiyn at codespeak.net (sanxiyn at codespeak.net) Date: Fri, 2 Dec 2005 16:52:30 +0100 (CET) Subject: [pypy-svn] r20578 - in pypy/dist/pypy/translator: . tool Message-ID: <20051202155230.A93DD27B68@code1.codespeak.net> Author: sanxiyn Date: Fri Dec 2 16:52:26 2005 New Revision: 20578 Modified: pypy/dist/pypy/translator/gencl.py pypy/dist/pypy/translator/tool/buildcl.py Log: Rename str/conv to repr_arg, repr_var, repr_const, repr_unknown Modified: pypy/dist/pypy/translator/gencl.py ============================================================================== --- pypy/dist/pypy/translator/gencl.py (original) +++ pypy/dist/pypy/translator/gencl.py Fri Dec 2 16:52:26 2005 @@ -19,7 +19,7 @@ def __init__(self, gen, op): self.gen = gen - self.str = gen.str + self.str = gen.repr_arg self.op = op self.opname = op.opname self.args = op.args @@ -203,17 +203,15 @@ def get_type(self, var): return self.ann.gettype(var) - def str(self, obj): - if isinstance(obj, Variable): - return obj.name - elif isinstance(obj, Constant): - return self.conv(obj.value) - else: - return "#<%r>" % (obj,) + def repr_unknown(self, obj): + return '#<%r>' % (obj,) + + def repr_var(self, var): + return var.name - def conv(self, val): + def repr_const(self, val): if isinstance(val, tuple): - val = map(self.conv, val) + val = map(self.repr_const, val) return "'(%s)" % ' '.join(val) elif isinstance(val, bool): # should precedes int if val: @@ -236,7 +234,15 @@ elif val is last_exc_value: return "'last-exc-value" else: - return "#<%r>" % (val,) + return self.repr_unknown(val) + + def repr_arg(self, arg): + if isinstance(arg, Variable): + return self.repr_var(arg) + elif isinstance(arg, Constant): + 
return self.repr_const(arg.value) + else: + return self.repr_unknown(arg) def emitcode(self, public=True): import sys @@ -257,7 +263,7 @@ arglist = fun.getargs() print "(", for arg in arglist: - print self.str(arg), + print self.repr_var(arg), print ")" print "(prog" blocklist = [] @@ -274,15 +280,15 @@ print "( last-exc", for var in vardict: if var in arglist: - print "(", self.str(var), self.str(var), ")", + print "(", self.repr_var(var), self.repr_var(var), ")", else: - print self.str(var), + print self.repr_var(var), print ")" print ";; DEBUG: type inference" for var in vardict: tp = vardict[var] if tp: - print ";;", self.str(var), "is", tp.__name__ + print ";;", self.repr_var(var), "is", tp.__name__ print "(setq last-exc nil)" for block in blocklist: self.emit_block(block) @@ -304,7 +310,7 @@ if (len(exits) == 2 and exits[0].exitcase == False and exits[1].exitcase == True): - print "(if", self.str(block.exitswitch) + print "(if", self.repr_arg(block.exitswitch) print "(progn" self.emit_link(exits[1]) print ") ; else" @@ -316,20 +322,20 @@ # shouldn't be needed but in Python 2.2 we can't tell apart # 0 vs nil and 1 vs t :-( for exit in exits[:-1]: - print "(if (equalp", self.str(block.exitswitch), - print self.conv(exit.exitcase), ')' + print "(if (equalp", self.repr_arg(block.exitswitch), + print self.repr_const(exit.exitcase), ')' print "(progn" self.emit_link(exit) print ")" - print "(progn ; else should be", self.conv(exits[-1].exitcase) + print "(progn ; else should be", self.repr_const(exits[-1].exitcase) self.emit_link(exits[-1]) print ")" * len(exits) elif len(block.inputargs) == 2: # exc_cls, exc_value - exc_cls = self.str(block.inputargs[0]) - exc_value = self.str(block.inputargs[1]) + exc_cls = self.repr_var(block.inputargs[0]) + exc_value = self.repr_var(block.inputargs[1]) print "(something-like-throw-exception %s %s)" % (exc_cls, exc_value) else: - retval = self.str(block.inputargs[0]) + retval = self.repr_var(block.inputargs[0]) print "(return", 
retval, ")" def emit_jump(self, block): @@ -337,12 +343,11 @@ print "(go", "tag" + str(tag), ")" def emit_link(self, link): - source = link.args - target = link.target.inputargs + source = map(self.repr_arg, link.args) + target = map(self.repr_var, link.target.inputargs) print "(psetq", # parallel assignment for s, t in zip(source, target): - if s != t: # and s != Constant(undefined_value): - print self.str(t), self.str(s), + print t, s print ")" self.emit_jump(link.target) @@ -350,12 +355,12 @@ bool: "boolean", int: "fixnum", long: "bignum", - type(''): "string", # hack, 'str' is in the namespace! + str: "string", list: "vector", } def emit_typecase(self, table, *args): - argreprs = tuple(map(self.str, args)) + argreprs = tuple(map(self.repr_arg, args)) argtypes = tuple(map(self.get_type, args)) if argtypes in table: trans = table[argtypes] Modified: pypy/dist/pypy/translator/tool/buildcl.py ============================================================================== --- pypy/dist/pypy/translator/tool/buildcl.py (original) +++ pypy/dist/pypy/translator/tool/buildcl.py Fri Dec 2 16:52:26 2005 @@ -26,7 +26,7 @@ def writelisp(gen, obj): #if isinstance(obj, (bool, int, type(None), str)): if isinstance(obj, (int, type(None), str)): - return gen.conv(obj) + return gen.repr_const(obj) if isinstance(obj, (tuple, list)): content = ' '.join([writelisp(gen, elt) for elt in obj]) content = '(' + content + ')' @@ -65,4 +65,4 @@ it = writelisp(gen, what) print what print it - assert it == '#(t "universe" 42 nil ("of" "them" #("eternal" 95)))' + assert it == '#(t "universe" 42 nil \'("of" "them" #("eternal" 95)))' From sanxiyn at codespeak.net Fri Dec 2 17:04:34 2005 From: sanxiyn at codespeak.net (sanxiyn at codespeak.net) Date: Fri, 2 Dec 2005 17:04:34 +0100 (CET) Subject: [pypy-svn] r20579 - pypy/dist/pypy/translator Message-ID: <20051202160434.2BF8A27B69@code1.codespeak.net> Author: sanxiyn Date: Fri Dec 2 17:04:31 2005 New Revision: 20579 Modified: 
pypy/dist/pypy/translator/gencl.py Log: Remove some obsolete codes Modified: pypy/dist/pypy/translator/gencl.py ============================================================================== --- pypy/dist/pypy/translator/gencl.py (original) +++ pypy/dist/pypy/translator/gencl.py Fri Dec 2 17:04:31 2005 @@ -5,16 +5,6 @@ from pypy.translator.transform import transform_graph, default_extra_passes, transform_slice -DEBUG = False - - -# XXX For 2.2 the emitted code isn't quite right, because we cannot tell -# when we should write "0"/"1" or "nil"/"t". -if not isinstance(bool, type): - class bool(int): - pass - - class Op: def __init__(self, gen, op): @@ -26,9 +16,6 @@ self.result = op.result def __call__(self): - if DEBUG: - self.op_default() - return if self.opname in self.binary_ops: self.op_binary(self.opname) else: From sanxiyn at codespeak.net Fri Dec 2 17:22:03 2005 From: sanxiyn at codespeak.net (sanxiyn at codespeak.net) Date: Fri, 2 Dec 2005 17:22:03 +0100 (CET) Subject: [pypy-svn] r20580 - pypy/dist/pypy/translator Message-ID: <20051202162203.6559427B5D@code1.codespeak.net> Author: sanxiyn Date: Fri Dec 2 17:22:00 2005 New Revision: 20580 Modified: pypy/dist/pypy/translator/gencl.py Log: traverse -> iterblocks Modified: pypy/dist/pypy/translator/gencl.py ============================================================================== --- pypy/dist/pypy/translator/gencl.py (original) +++ pypy/dist/pypy/translator/gencl.py Fri Dec 2 17:22:00 2005 @@ -253,11 +253,7 @@ print self.repr_var(arg), print ")" print "(prog" - blocklist = [] - def collect_block(node): - if isinstance(node, Block): - blocklist.append(node) - traverse(collect_block, fun) + blocklist = list(fun.iterblocks()) vardict = {} for block in blocklist: tag = len(self.blockref) From sanxiyn at codespeak.net Fri Dec 2 17:35:09 2005 From: sanxiyn at codespeak.net (sanxiyn at codespeak.net) Date: Fri, 2 Dec 2005 17:35:09 +0100 (CET) Subject: [pypy-svn] r20581 - pypy/dist/pypy/translator Message-ID: 
<20051202163509.5C11927B5D@code1.codespeak.net> Author: sanxiyn Date: Fri Dec 2 17:35:06 2005 New Revision: 20581 Modified: pypy/dist/pypy/translator/gencl.py Log: Remove starred import Modified: pypy/dist/pypy/translator/gencl.py ============================================================================== --- pypy/dist/pypy/translator/gencl.py (original) +++ pypy/dist/pypy/translator/gencl.py Fri Dec 2 17:35:06 2005 @@ -1,4 +1,4 @@ -from pypy.objspace.flow.model import * +from pypy.objspace.flow.model import Constant, Variable, last_exception from pypy.translator.annrpython import RPythonAnnotator from pypy.translator.simplify import simplify_graph From jacob at codespeak.net Fri Dec 2 18:21:38 2005 From: jacob at codespeak.net (jacob at codespeak.net) Date: Fri, 2 Dec 2005 18:21:38 +0100 (CET) Subject: [pypy-svn] r20582 - pypy/extradoc/sprintinfo/gothenburg-2005 Message-ID: <20051202172138.5BEED27B6E@code1.codespeak.net> Author: jacob Date: Fri Dec 2 18:21:36 2005 New Revision: 20582 Added: pypy/extradoc/sprintinfo/gothenburg-2005/logistics.txt Log: Added some logistics info. Added: pypy/extradoc/sprintinfo/gothenburg-2005/logistics.txt ============================================================================== --- (empty file) +++ pypy/extradoc/sprintinfo/gothenburg-2005/logistics.txt Fri Dec 2 18:21:36 2005 @@ -0,0 +1,143 @@ +Getting to the sprint +===================== + +The sprint will be in the home of Laura Creighton and Jacob Hallén on +Götabergsgatan 22. + +If you arrive by regular air travel, you will arrive at Landvetter +Airport. Take the airport bus to Park Avenue Hotel. Walk 1 block North +and 3 blocks East. + +If you arrive by boat, take a tram to Vasaplatsen, walk 1 block +East, 1 block South or to Valand and walk 3 blocks West, 1 block South. + +If you arrive by train, either take a tram to Vasaplatsen or to Valand. +Follow instructions above. It is also possible to walk. The walk takes +15-20 minutes.
+ +If you arrive by airborne cattle transport, take the airport shuttle +to the Central Station. Follow instructions above. + +For instructions on how to use the public transit system, and for other +local information, please see http://www.europython.org. + +Events +====== + +Concert +------- + +Our house is 3 blocks away from the Concert House. Both the house and +the symphony orchestra are world class. They will perform Beethoven's +violin concerto with Maxim Vengerov on violin and Rachmaninov's +Symphony No. 2. There is one performance on Wednesday at 19.30, for +which Laura and Jacob already have tickets. Only returns left for that +concert. It plays again Friday at 18.00. If more people want to go, +Laura and Jacob will try to exchange their tickets and go Friday +instead. Normal ticket price is SEK 350. You can take a chance on +getting last minute tickets at SEK 100. + +Museum visit +------------ + +The Art Museum, which is next to the concert house, has a nice special +exhibition of Marie and P.S. Krøyer. It also has a lovely regular +collection of mainly Scandinavian art. I suggest a visit on Wednesday +when the museum is open until 21.00. + +Eating out +---------- + +For those who are interested in a gourmet experience, we would like to +suggest a dinner at a restaurant called Popcorn. Despite its name, it +does really good food at prices that are quite reasonable. For +non-vegetarians we recommend their game menu and for vegetarians the +vegetarian menu. The first set menu costs SEK 475. Matching wines with +each dish at SEK 325. The vegetarian menu is SEK 225. Wines at SEK 190. +If there are many people who want to go, doing this mid-week is easier +to arrange than during the weekend. + +Food +==== + +The plan is to do quite a bit of food in the house. However, we expect +all who participate in the eating to also participate in preparations +and clean-up. + +If you want to prepare your own stuff, you are welcome to use the +kitchen.
Just make sure you leave it in good condition. + +There will be lists for food. You have to sign up at the latest the +night before to be sure you get fed on a certain day. Everything will +be approximately at cost and we will ask you for money before you +leave. While it may be a bit more expensive than at home, it will be +nowhere near restaurant prices. + +Breakfast +--------- + +Muesli +Yoghurt +Milk +Bread +Butter +Cheese +Ham +Cucumber +Tomatoes +Tea +Coffee +Makings for hot chocolate + +If there are people craving things like egg&bacon or porridge, that +can be arranged too. + +Lunch +----- + +Sandwich makings and on some days a soup. If you crave a hot lunch, +there are many lunch places in the vicinity (open Monday-Friday generally). + +Dinner +------ + +We plan to have hot dinners in the house on most nights. Exactly what +food we will serve is not planned yet. Sometimes vegetarian alternatives +will not be a problem, sometimes we will need help in coming up with +a suitable solution. Some ideas for food are: + +- Pasta and pasta sauce (vegetarian pasta sauce as alternative) + +- Chicken breasts in Parma ham + +- Yellow pea soup, arrack punch and waffles (vegetarian soup as alternative) + +- Beef curry stew with rice and condiments + +- Christmas ham with cream boiled green cabbage and red cabbage + +- Beetroot soup (vegetarian) + +- Beetroot pie (vegetarian) + +- Janssons frestelse (Potato/onion gratin with pickled Baltic herring) + +- Lamb burgers with couscous + +- Potato and lamb stew with parsley + +- Lamb/cabbage stew + +- Fishballs + +- Cow tongue + +- Boeuf Bourguignon + +- Coq au vin + +- Smoked pork with red cabbage in mustard cream + +- Cabbage/minced meat pudding + +We will cost food and drink separately for dinner.
\ No newline at end of file From arigo at codespeak.net Fri Dec 2 18:54:10 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 2 Dec 2005 18:54:10 +0100 (CET) Subject: [pypy-svn] r20583 - in pypy/dist/pypy: doc/discussion translator translator/goal translator/test translator/tool Message-ID: <20051202175410.0A4AB27B7B@code1.codespeak.net> Author: arigo Date: Fri Dec 2 18:54:08 2005 New Revision: 20583 Added: pypy/dist/pypy/doc/discussion/cmd-prompt-translation.txt pypy/dist/pypy/translator/interactive.py pypy/dist/pypy/translator/test/test_interactive.py Modified: pypy/dist/pypy/translator/goal/driver.py pypy/dist/pypy/translator/tool/graphpage.py pypy/dist/pypy/translator/tool/taskengine.py Log: (mwh, cfbolz, pedronis, arigo) * start of a simpler Translator substitute based on goal/driver * skecthy description of the interface in doc/discussion/cmd-prompt-translation.txt * some tweaks to driver.py to allow it to work even if backend is not set, also to let task return values bubble up * remove legacy support in graphpage for Translator.entrypoint Added: pypy/dist/pypy/doc/discussion/cmd-prompt-translation.txt ============================================================================== --- (empty file) +++ pypy/dist/pypy/doc/discussion/cmd-prompt-translation.txt Fri Dec 2 18:54:08 2005 @@ -0,0 +1,18 @@ + +t = Translation(entry_point[,]) +t.annotate([]) +t.rtype([]) +t.backendopt[_]([]) +t.source[_]([]) +f = t.compile[_]([]) + +and t.view(), t.viewcg() + + = c|llvm (for now) +you can skip steps + + = argtypes (for annotation) plus + keyword args: gc=...|policy= etc + + + Modified: pypy/dist/pypy/translator/goal/driver.py ============================================================================== --- pypy/dist/pypy/translator/goal/driver.py (original) +++ pypy/dist/pypy/translator/goal/driver.py Fri Dec 2 18:54:08 2005 @@ -52,8 +52,9 @@ self.done = {} maybe_skip = [] - for goal in self.backend_select_goals(disable): - 
maybe_skip.extend(self._depending_on_closure(goal)) + if disable: + for goal in self.backend_select_goals(disable): + maybe_skip.extend(self._depending_on_closure(goal)) self.maybe_skip = dict.fromkeys(maybe_skip).keys() if default_goal: @@ -67,20 +68,23 @@ def expose_task(task): backend_goal, = self.backend_select_goals([task]) def proc(): - self.proceed(backend_goal) + return self.proceed(backend_goal) setattr(self, task, proc) - for task in ('annotate', 'rtype', 'backendopt', 'source', 'compile', 'run', 'llinterpret'): - expose_task(task) - + if self.options.backend: + for task in ('annotate', 'rtype', 'backendopt', 'source', 'compile', 'run', 'llinterpret'): + expose_task(task) + else: + for task in self.tasks: + expose_task(task) + def backend_select_goals(self, goals): backend = self.options.backend - assert backend l = [] for goal in goals: if goal in self.tasks: l.append(goal) - else: + elif backend: goal = "%s_%s" % (goal, backend) assert goal in self.tasks l.append(goal) @@ -125,10 +129,10 @@ return else: self.log.info("%s..." 
% title) - func() + res = func() if not func.task_idempotent: self.done[goal] = True - + return res def task_annotate(self): # includes annotation and annotatation simplifications @@ -138,9 +142,10 @@ annmodel.DEBUG = self.options.debug annotator = translator.buildannotator(policy=policy) - annotator.build_types(self.entry_point, self.inputtypes) + s = annotator.build_types(self.entry_point, self.inputtypes) self.sanity_check_annotation() - annotator.simplify() + annotator.simplify() + return s # task_annotate = taskdef(task_annotate, [], "Annotating&simplifying") @@ -326,7 +331,7 @@ elif isinstance(goals, str): goals = [goals] goals = self.backend_select_goals(goals) - self._execute(goals, task_skip = self.maybe_skip) + return self._execute(goals, task_skip = self.maybe_skip) def from_targetspec(targetspec_dic, options=None, args=None, empty_translator=None, disable=[], Added: pypy/dist/pypy/translator/interactive.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/interactive.py Fri Dec 2 18:54:08 2005 @@ -0,0 +1,91 @@ +import optparse + +from pypy.translator.translator import TranslationContext +from pypy.translator.goal import driver + +DEFAULT_OPTIONS = optparse.Values(defaults={ + 'gc': 'ref', + + 'thread': False, # influences GC policy + + 'stackless': False, + 'debug': True, + 'insist': False, + + 'backend': None, + 'lowmem': False, + + 'fork_before': None +}) + +class Translation(object): + + def __init__(self, entry_point, argtypes=None, **kwds): + self.entry_point = entry_point + self.context = TranslationContext() + # for t.view() to work just after construction + graph = self.context.buildflowgraph(entry_point) + self.context._prebuilt_graphs[entry_point] = graph + + self.driver = driver.TranslationDriver(DEFAULT_OPTIONS) + + # hook into driver events + driver_own_event = self.driver._event + def _event(kind, goal, func): + self.driver_event(kind, goal, func) + 
driver_own_event(kind, goal, func) + self.driver._event = _event + self.driver_setup = False + + self.frozen_options = {} + + self.update_options(argtypes, kwds) + + def driver_event(self, kind, goal, func): + if kind == 'pre': + print goal + self.ensure_setup() + elif kind == 'post': + if 'goal' == 'annotate': # xxx use a table instead + self.frozen_options['debug'] = True + + def ensure_setup(self, argtypes=None, policy=None): + if not self.driver_setup: + if argtypes is None: + argtypes = [] + self.driver.setup(self.entry_point, argtypes, policy) + self.ann_argtypes = argtypes + self.ann_policy = policy + self.driver_setup = True + else: + # check consistency + if argtypes is not None and argtypes != self.ann_argtypes: + raise Exception("xxx") + if policy is not None and policy != self.ann_policy: + raise Exception("xxx") + + def update_options(self, argtypes, kwds): + if argtypes or kwds.get('policy'): + self.ensure_setup(argtypes, kwds.get('policy')) + for optname, value in kwds: + if optname in self.frozen_options: + if getattr(self.driver.options, optname) != value: + raise Exception("xxx") + else: + setattr(self.driver.options, optname, value) + self.frozen_options[optname] = True + + def annotate(self, argtypes=None, **kwds): + self.update_options(argtypes, kwds) + return self.driver.annotate() + + def source(self, argtypes, **kwds): + backend = self.ensure_backend() + self.update_options(argtypes, kwds) + getattr(self.driver, 'source_'+backend)() + + def source_c(self, argtypes, **kwds): + self.ensure_backend('c') + self.update_options(argtypes, kwds) + self.driver.source_c() + Added: pypy/dist/pypy/translator/test/test_interactive.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/test/test_interactive.py Fri Dec 2 18:54:08 2005 @@ -0,0 +1,20 @@ +from pypy.translator.interactive import Translation +import py + +def test_simple_annotate(): + + def f(x,y): + return x+y + + t = 
Translation(f, [int, int]) + s = t.annotate([int, int]) + assert s.knowntype == int + + t = Translation(f) + s = t.annotate([int, int]) + assert s.knowntype == int + + t = Translation(f, [int, int]) + py.test.raises(Exception, "t.annotate([int, float])") + + Modified: pypy/dist/pypy/translator/tool/graphpage.py ============================================================================== --- pypy/dist/pypy/translator/tool/graphpage.py (original) +++ pypy/dist/pypy/translator/tool/graphpage.py Fri Dec 2 18:54:08 2005 @@ -122,9 +122,6 @@ graphs += graphsof(translator, func) else: graphs = self.translator.graphs - if not graphs: - if hasattr(translator, 'entrypoint'): - graphs = graphsof(translator, translator.entrypoint) gs = [(graph.name, graph) for graph in graphs] if self.annotator and self.annotator.blocked_graphs: for block, was_annotated in self.annotator.annotated.items(): @@ -244,10 +241,7 @@ and possibily the class hierarchy.""" def allgraphs(self): - graphs = list(self.translator.graphs) - if not graphs and hasattr(self.translator, 'entrypoint'): - graphs = graphsof(self.translator, self.translator.entrypoint) - return graphs + return list(self.translator.graphs) def graph_name(self, *args): raise NotImplementedError @@ -336,8 +330,6 @@ graphs = self.allgraphs() if len(graphs) > huge: - if hasattr(translator, 'entrypoint'): - graphs = graphsof(translator, translator.entrypoint) assert graphs, "no graph to show!" 
LocalizedCallGraphPage.do_compute.im_func(self, dotgen, graphs[0]) return Modified: pypy/dist/pypy/translator/tool/taskengine.py ============================================================================== --- pypy/dist/pypy/translator/tool/taskengine.py (original) +++ pypy/dist/pypy/translator/tool/taskengine.py Fri Dec 2 18:54:08 2005 @@ -100,20 +100,22 @@ def _execute(self, goals, *args, **kwds): task_skip = kwds.get('task_skip', []) + res = None for goal in self._plan(goals, skip=task_skip): taskcallable, _ = self.tasks[goal] self._event('pre', goal, taskcallable) try: - self._do(goal, taskcallable, *args, **kwds) + res = self._do(goal, taskcallable, *args, **kwds) except (SystemExit, KeyboardInterrupt): raise except: self._error(goal) raise self._event('post', goal, taskcallable) - + return res + def _do(self, goal, func, *args, **kwds): - func() + return func() def _event(self, kind, goal, func): pass From arigo at codespeak.net Fri Dec 2 19:00:46 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 2 Dec 2005 19:00:46 +0100 (CET) Subject: [pypy-svn] r20584 - pypy/dist/pypy/tool Message-ID: <20051202180046.64D9327B75@code1.codespeak.net> Author: arigo Date: Fri Dec 2 19:00:45 2005 New Revision: 20584 Removed: pypy/dist/pypy/tool/run_translation.py Log: (pedronis, arigo) this is not used, remove it. 
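[Editorial note: the r20583 commit above introduces an interactive Translation wrapper whose tasks return their results up to the caller and whose options become fixed once a task has used them. The following self-contained sketch illustrates that pattern; MiniTranslation and FrozenOptionError are illustrative stand-ins invented here, not the real PyPy classes, and the "annotation" step is deliberately a toy.]

```python
# Sketch of the task-driver pattern from r20583: a wrapper exposes driver
# tasks as methods, lets each task's return value bubble up, and freezes
# options once a task has consumed them.  All names here are hypothetical.

class FrozenOptionError(Exception):
    """Raised when changing an option that an earlier task already used."""

class MiniTranslation(object):
    def __init__(self, entry_point, argtypes=None, **opts):
        self.entry_point = entry_point
        self.argtypes = argtypes
        self.options = {'gc': 'ref', 'backend': None}   # toy defaults
        self.frozen = set()                             # options already in use
        self.update_options(argtypes, opts)

    def update_options(self, argtypes, opts):
        # Consistency-check argtypes against any earlier setting, and refuse
        # to change an option that a completed task depends on.
        if argtypes is not None:
            if self.argtypes is not None and argtypes != self.argtypes:
                raise Exception("inconsistent argtypes: %r" % (argtypes,))
            self.argtypes = argtypes
        for name, value in opts.items():
            if name in self.frozen and self.options.get(name) != value:
                raise FrozenOptionError(name)
            self.options[name] = value

    def annotate(self, argtypes=None, **opts):
        self.update_options(argtypes, opts)
        # The real task infers types by abstract interpretation; this toy
        # version just calls the entry point on zero values of each argtype.
        result = self.entry_point(*[t() for t in self.argtypes])
        self.frozen.update(self.options)  # later tasks rely on these options
        return type(result)               # the task's return value bubbles up
```

Usage mirrors test_interactive.py above: for `def f(x, y): return x + y`, `MiniTranslation(f, [int, int]).annotate()` returns `int`, a later `annotate([int, float])` on the same object raises because the argtypes disagree, and changing `gc` after annotation raises FrozenOptionError.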
From pedronis at codespeak.net Sat Dec 3 01:29:19 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sat, 3 Dec 2005 01:29:19 +0100 (CET) Subject: [pypy-svn] r20596 - pypy/dist/pypy/translator/tool Message-ID: <20051203002919.028CB27B5D@code1.codespeak.net> Author: pedronis Date: Sat Dec 3 01:29:18 2005 New Revision: 20596 Modified: pypy/dist/pypy/translator/tool/graphpage.py pypy/dist/pypy/translator/tool/pdbplus.py Log: pass at fixing graph commands in pdbplus Modified: pypy/dist/pypy/translator/tool/graphpage.py ============================================================================== --- pypy/dist/pypy/translator/tool/graphpage.py (original) +++ pypy/dist/pypy/translator/tool/graphpage.py Sat Dec 3 01:29:18 2005 @@ -308,7 +308,7 @@ def followlink(self, name): if name.endswith('...'): obj = self.object_by_name[name] - return LocalizedCallGraphPage(self.translator, obj) + return LocalizedCallGraphPage(self.translator, [obj]) obj = self.object_by_name[name] if isinstance(obj, ClassDef): return ClassDefPage(self.translator, obj) @@ -331,7 +331,7 @@ if len(graphs) > huge: assert graphs, "no graph to show!" 
- LocalizedCallGraphPage.do_compute.im_func(self, dotgen, graphs[0]) + LocalizedCallGraphPage.do_compute.im_func(self, dotgen, [graphs[0]]) return blocked_graphs = self.get_blocked_graphs(graphs) @@ -361,16 +361,18 @@ """A GraphPage showing the localized call graph for a function, that means just including direct callers and callees""" - def graph_name(self, graph0): - return 'LCG_%s' % nameof(graph0) + def graph_name(self, centers): + return 'LCG_%s' % nameof(centers[0]) + + def do_compute(self, dotgen, centers): + centers = dict.fromkeys(centers) - def do_compute(self, dotgen, graph0): translator = self.translator graphs = {} for g1, g2 in translator.callgraph.values(): - if g1 is graph0 or g2 is graph0: + if g1 in centers or g2 in centers: dotgen.emit_edge(nameof(g1), nameof(g2)) graphs[g1] = True graphs[g2] = True @@ -391,7 +393,7 @@ kw = {} dotgen.emit_node(nameof(graph), label=data, shape="box", **kw) - if graph is not graph0: + if graph not in centers: lcg = 'LCG_%s' % nameof(graph) label = graph.name+'...' 
dotgen.emit_node(lcg, label=label) Modified: pypy/dist/pypy/translator/tool/pdbplus.py ============================================================================== --- pypy/dist/pypy/translator/tool/pdbplus.py (original) +++ pypy/dist/pypy/translator/tool/pdbplus.py Sat Dec 3 01:29:18 2005 @@ -1,4 +1,6 @@ import threading, pdb +import types +from pypy.objspace.flow.model import FunctionGraph class _EnableGraphic: def __init__(self, port=None): @@ -176,8 +178,10 @@ return if hasattr(obj, 'im_func'): obj = obj.im_func - if obj in translator.flowgraphs: - page = graphpage.LocalizedCallGraphPage(translator, obj) + if isinstance(obj, types.FunctionType): + page = graphpage.LocalizedCallGraphPage(translator, self._allgraphs(obj)) + elif isinstance(obj, FunctionGraph): + page = graphpage.FlowGraphPage(translator, [obj]) elif obj in getattr(translator.annotator, 'getuserclasses', lambda: {})(): page = graphpage.ClassDefPage(translator, translator.annotator.getuserclasses()[obj]) elif isinstance(obj, ClassDef): @@ -194,7 +198,6 @@ obj = self._getobj(arg) if obj is None: return - import types if isinstance(obj, (type, types.ClassType)): obj = [obj] else: @@ -306,33 +309,50 @@ """callg obj show flow graph for function obj, obj can be an expression or a dotted name (in which case prefixing with some packages in pypy is tried (see help pypyprefixes))""" - import types from pypy.translator.tool import graphpage obj = self._getobj(arg) if obj is None: return if hasattr(obj, 'im_func'): obj = obj.im_func - if not isinstance(obj, types.FunctionType): + if isinstance(obj, types.FunctionType): + graphs = self._allgraphs(obj) + elif isinstance(obj, FunctionGraph): + graphs = [obj] + else: print "*** Not a function" return - self._show(graphpage.FlowGraphPage(self.translator, [obj])) + self._show(graphpage.FlowGraphPage(self.translator, graphs)) + + def _allgraphs(self, func): + graphs = {} + funcdesc = self.translator.annotator.bookkeeper.getdesc(func) + for graph in 
funcdesc._cache.itervalues(): + graphs[graph] = True + for graph in self.translator.graphs: + if getattr(graph, 'func', None) is func: + graphs[graph] = True + return graphs.keys() + def do_callg(self, arg): """callg obj show localized call-graph for function obj, obj can be an expression or a dotted name (in which case prefixing with some packages in pypy is tried (see help pypyprefixes))""" - import types from pypy.translator.tool import graphpage obj = self._getobj(arg) if obj is None: return if hasattr(obj, 'im_func'): obj = obj.im_func - if not isinstance(obj, types.FunctionType): + if isinstance(obj, types.FunctionType): + graphs = self._allgraphs(obj) + elif isinstance(obj, FunctionGraph): + graphs = [obj] + else: print "*** Not a function" return - self._show(graphpage.LocalizedCallGraphPage(self.translator, obj)) + self._show(graphpage.LocalizedCallGraphPage(self.translator, graphs)) def do_classhier(self, arg): """classhier From pedronis at codespeak.net Sat Dec 3 02:07:04 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sat, 3 Dec 2005 02:07:04 +0100 (CET) Subject: [pypy-svn] r20598 - pypy/dist/pypy/translator/tool Message-ID: <20051203010704.B67CA27B5A@code1.codespeak.net> Author: pedronis Date: Sat Dec 3 02:07:04 2005 New Revision: 20598 Modified: pypy/dist/pypy/translator/tool/pdbplus.py Log: pass at fixing attribute annotation related commands (now they take either classes or classdefs) Modified: pypy/dist/pypy/translator/tool/pdbplus.py ============================================================================== --- pypy/dist/pypy/translator/tool/pdbplus.py (original) +++ pypy/dist/pypy/translator/tool/pdbplus.py Sat Dec 3 02:07:04 2005 @@ -182,8 +182,9 @@ page = graphpage.LocalizedCallGraphPage(translator, self._allgraphs(obj)) elif isinstance(obj, FunctionGraph): page = graphpage.FlowGraphPage(translator, [obj]) - elif obj in getattr(translator.annotator, 'getuserclasses', lambda: {})(): - page = 
graphpage.ClassDefPage(translator, translator.annotator.getuserclasses()[obj]) + elif isinstance(obj, (type, types.ClassType)): + classdef = translator.annotator.bookkeeper.getuniqueclassdef(obj) + page = graphpage.ClassDefPage(translator, classdef) elif isinstance(obj, ClassDef): page = graphpage.ClassDefPage(translator, obj) else: @@ -198,30 +199,36 @@ obj = self._getobj(arg) if obj is None: return - if isinstance(obj, (type, types.ClassType)): - obj = [obj] - else: + try: obj = list(obj) + except: + obj = [obj] + getcdef = self.translator.annotator.bookkeeper.getuniqueclassdef + clsdefs = [] + for x in obj: + if isinstance(x, (type, types.ClassType)): + clsdefs.append(getcdef(x)) + else: + clsdefs.append(x) + def longname(c): - return "%s.%s" % (c.__module__, c.__name__) - obj.sort(lambda x,y: cmp(longname(x), longname(y))) - cls = self.translator.annotator.getuserclasses() + return c.name + clsdefs.sort(lambda x,y: cmp(longname(x), longname(y))) flt = self._make_flt(expr) if flt is None: return - for c in obj: - if c in cls: - try: - attrs = [a for a in cls[c].attrs.itervalues() if flt(a)] - except self.GiveUp: - return - if attrs: - print "%s:" % longname(c) - pr(attrs) + for cdef in clsdefs: + try: + attrs = [a for a in cdef.attrs.itervalues() if flt(a)] + except self.GiveUp: + return + if attrs: + print "%s:" % cdef.name + pr(attrs) def do_attrs(self, arg): """attrs obj [match expr] -list annotated attrs of class obj or list of classes obj, +list annotated attrs of class|def obj or list of classe(def)s obj, obj can be an expression or a dotted name (in which case prefixing with some packages in pypy is tried (see help pypyprefixes)); expr is an optional filtering expression; cand in it refer to the candidate Attribute @@ -232,7 +239,7 @@ def do_attrsann(self, arg): """attrsann obj [match expr] -list with their annotation annotated attrs of class obj or list of classes obj, +list with their annotation annotated attrs of class|def obj or list of classe(def)s 
obj, obj can be an expression or a dotted name (in which case prefixing with some packages in pypy is tried (see help pypyprefixes)); expr is an optional filtering expression; cand in it refer to the candidate Attribute @@ -244,14 +251,15 @@ def do_readpos(self, arg): """readpos obj attrname [match expr] [as var] -list the read positions of annotated attr with attrname of class obj, +list the read positions of annotated attr with attrname of class or classdef obj, obj can be an expression or a dotted name (in which case prefixing with some packages in pypy is tried (see help pypyprefixes)); expr is an optional filtering expression; cand in it refer to the candidate read -position information, which has a .func and .block and .i; +position information, which has a .func (which can be None), a .graph and .block and .i; the list of the read positions functions is set to var or _.""" class Pos: - def __init__(self, func, block, i): + def __init__(self, graph, func, block, i): + self.graph = graph self.func = func self.block = block self.i = i @@ -272,10 +280,9 @@ obj = self._getobj(arg) if obj is None: return - cls = self.translator.annotator.getuserclasses() - if obj not in cls: - return - attrs = cls[obj].attrs + if isinstance(obj, (type, types.ClassType)): + obj = self.translator.annotator.bookkeeper.getuniqueclassdef(obj) + attrs = obj.attrs if attrname not in attrs: print "*** bogus:", attrname return @@ -288,9 +295,16 @@ r = {} try: for p in pos: - func, block, i = p - if flt(Pos(func, block, i)): - print func.__module__ or '?', func.__name__, block, i + graph, block, i = p + if hasattr(graph, 'func'): + func = graph.func + else: + func = None + if flt(Pos(graph, func, block, i)): + if func is not None: + print func.__module__ or '?', func.__name__, block, i + else: + print graph, block, i if i >= 0: op = block.operations[i] print " ", op From pedronis at codespeak.net Sat Dec 3 02:35:38 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sat, 
3 Dec 2005 02:35:38 +0100 (CET) Subject: [pypy-svn] r20600 - pypy/dist/pypy/translator/tool Message-ID: <20051203013538.CBEB527B5A@code1.codespeak.net> Author: pedronis Date: Sat Dec 3 02:35:38 2005 New Revision: 20600 Modified: pypy/dist/pypy/translator/tool/pdbplus.py Log: * make the code more robust against getuniqueclassdef for a specialized class * findfuncs | findclasses -> more general finddescs pdbplus extension commands should be reasonably fixed now Modified: pypy/dist/pypy/translator/tool/pdbplus.py ============================================================================== --- pypy/dist/pypy/translator/tool/pdbplus.py (original) +++ pypy/dist/pypy/translator/tool/pdbplus.py Sat Dec 3 02:35:38 2005 @@ -106,6 +106,13 @@ class GiveUp(Exception): pass + def _getcdef(self, cls): + try: + return self.translator.annotator.bookkeeper.getuniqueclassdef(cls) + except Exception: + print "*** cannot get classdef: likely specialized class: %s" % cls + return None + def _make_flt(self, expr): try: expr = compile(expr, '', 'eval') @@ -130,39 +137,32 @@ del self.curframe.f_locals['cand'] return flt - def do_findclasses(self, arg): - """findclasses expr [as var] -find annotated classes for which expr is true, cand in it referes to -the candidate class; the result list is assigned to var or _.""" + def do_finddescs(self, arg): + """finddescs kind expr [as var] +find annotation descs of kind (ClassDesc|FuncionDesc|...) 
+ for which expr is true, cand in it referes to +the candidate desc; the result list is assigned to var or _.""" expr, var = self._parse_modif(arg) + kind, expr = expr.split(None, 1) flt = self._make_flt(expr) if flt is None: return - cls = [] - try: - for c in self.translator.annotator.getuserclasses(): - if flt(c): - cls.append(c) - except self.GiveUp: + from pypy.annotation import description + kind_cls = getattr(description, kind, None) + if kind_cls is None: + kind = kind.title()+'Desc' + kind_cls = getattr(description, kind, None) + if kind_cls is None: return - self._setvar(var, cls) - def do_findfuncs(self, arg): - """findfuncs expr [as var] -find flow-graphed functions for which expr is true, cand in it referes to -the candidate function; the result list is assigned to var or _.""" - expr, var = self._parse_modif(arg) - flt = self._make_flt(expr) - if flt is None: - return - funcs = [] + descs = [] try: - for f in self.translator.flowgraphs: - if flt(f): - funcs.append(f) + for c in self.translator.annotator.bookkeeper.descs.itervalues(): + if isinstance(c, kind_cls) and flt(c): + descs.append(c) except self.GiveUp: return - self._setvar(var, funcs) + self._setvar(var, descs) def do_showg(self, arg): """showg obj @@ -183,7 +183,9 @@ elif isinstance(obj, FunctionGraph): page = graphpage.FlowGraphPage(translator, [obj]) elif isinstance(obj, (type, types.ClassType)): - classdef = translator.annotator.bookkeeper.getuniqueclassdef(obj) + classdef = self._getcdef(obj) + if classdef is None: + return page = graphpage.ClassDefPage(translator, classdef) elif isinstance(obj, ClassDef): page = graphpage.ClassDefPage(translator, obj) @@ -203,11 +205,13 @@ obj = list(obj) except: obj = [obj] - getcdef = self.translator.annotator.bookkeeper.getuniqueclassdef clsdefs = [] for x in obj: if isinstance(x, (type, types.ClassType)): - clsdefs.append(getcdef(x)) + cdef = self._getcdef(x) + if cdef is None: + continue + clsdefs.append(cdef) else: clsdefs.append(x) @@ -281,7 
+285,9 @@ if obj is None: return if isinstance(obj, (type, types.ClassType)): - obj = self.translator.annotator.bookkeeper.getuniqueclassdef(obj) + obj = self._getcdef(obj) + if obj is None: + return attrs = obj.attrs if attrname not in attrs: print "*** bogus:", attrname @@ -395,7 +401,7 @@ print "graph commands are: showg, flowg, callg, classhier, enable_graphic" def help_ann_other(self): - print "other annotation related commands are: find, findclasses, findfuncs, attrs, attrsann, readpos" + print "other annotation related commands are: find, finddescs, attrs, attrsann, readpos" def help_pypyprefixes(self): print "these prefixes are tried for dotted names in graph commands:" From arigo at codespeak.net Sat Dec 3 11:44:26 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 3 Dec 2005 11:44:26 +0100 (CET) Subject: [pypy-svn] r20603 - pypy/dist/pypy/translator/tool Message-ID: <20051203104426.4ABED27B56@code1.codespeak.net> Author: arigo Date: Sat Dec 3 11:44:25 2005 New Revision: 20603 Modified: pypy/dist/pypy/translator/tool/graphpage.py Log: (pedronis, arigo) Localized graph page: fixed the labels, added all edges between the displayed nodes. Modified: pypy/dist/pypy/translator/tool/graphpage.py ============================================================================== --- pypy/dist/pypy/translator/tool/graphpage.py (original) +++ pypy/dist/pypy/translator/tool/graphpage.py Sat Dec 3 11:44:25 2005 @@ -373,10 +373,14 @@ for g1, g2 in translator.callgraph.values(): if g1 in centers or g2 in centers: - dotgen.emit_edge(nameof(g1), nameof(g2)) graphs[g1] = True graphs[g2] = True + # show all edges that exist between these graphs + for g1, g2 in translator.callgraph.values(): + if g1 in graphs and g2 in graphs: + dotgen.emit_edge(nameof(g1), nameof(g2)) + graphs = graphs.keys() # show the call graph @@ -395,7 +399,7 @@ if graph not in centers: lcg = 'LCG_%s' % nameof(graph) - label = graph.name+'...' + label = data+'...' 
dotgen.emit_node(lcg, label=label) dotgen.emit_edge(nameof(graph), lcg) self.links[label] = 'go to its localized call graph' From mwh at codespeak.net Sat Dec 3 12:16:07 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Sat, 3 Dec 2005 12:16:07 +0100 (CET) Subject: [pypy-svn] r20604 - pypy/dist/pypy/doc/weekly Message-ID: <20051203111607.0CC5E27B5D@code1.codespeak.net> Author: mwh Date: Sat Dec 3 12:16:05 2005 New Revision: 20604 Added: pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt Modified: pypy/dist/pypy/doc/weekly/index.txt Log: This Week in PyPy 5, rev 1 Modified: pypy/dist/pypy/doc/weekly/index.txt ============================================================================== --- pypy/dist/pypy/doc/weekly/index.txt (original) +++ pypy/dist/pypy/doc/weekly/index.txt Sat Dec 3 12:16:05 2005 @@ -60,8 +60,14 @@ - Resource consumption - PyPy at conferences + * `Week ending 2005-12-02`_ + + - SomePBC-refactoring + - CCC papers + .. _`Week ending 2005-11-04`: summary-2005-11-04.html .. _`Week ending 2005-11-11`: summary-2005-11-11.html .. _`Week ending 2005-11-18`: summary-2005-11-18.html .. _`Week ending 2005-11-25`: summary-2005-11-25.html +.. _`Week ending 2005-12-02`: summary-2005-12-02.html Added: pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt ============================================================================== --- (empty file) +++ pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt Sat Dec 3 12:16:05 2005 @@ -0,0 +1,55 @@ +======================= + This Week in PyPy 5 +======================= + +Introduction +============ + +This is the fifth of what will hopefully be many summaries of what's +been going on in the world of PyPy in the last week. I'd still like +to remind people that when something worth summarizing happens to +recommend it for "This Week in PyPy" as mentioned on: + + http://codespeak.net/pypy/dist/pypy/doc/weekly/ + +where you can also find old summaries. 
I note in passing that the +idea of keeping track of IRC conversations in the weekly summary has +pretty much fizzled. Oh well. + +There were about 230 commits to the pypy section of codespeak's +repository in the last week (a busy one, it seems :-). + + +SomePBC-refactoring +=================== + +We merged the branch at last! Finishing the branch off and getting +translate_pypy running again seemed to mostly involve fighting with +memoized functions and methods, and the "strange details" hinted at in +the last "This Week in PyPy" were not so bad -- indeed once we got to +the point of rtyping finishing, the backend optimizations, source +generation, compilation and resulting binary all worked first time +(there must be something to be said for this Test Driven Development +stuff :). + +If you recall from the second This Week in PyPy the thing that +motivated us to start the branch was wanting to support multiple +independent object spaces in the translated binary. After three weeks +of refactoring we hoped we'd made this possible... and so it proved, +though a couple of small tweaks were needed to the PyPy source. The +resulting binary is quite a lot (40%) bigger but only a little (10%) +slower. + + +CCC papers +========== + +As mentioned last week, two PyPy talks have been accepted for the +Chaos Communication Congress in Berlin, from December 27th to the +30th. 
The CCC asks that speakers provide papers to accompany their +talks (they make a proceedings book) so that's what we've done, and +the results are two quite nice pieces of propaganda for the project: + + http://codespeak.net/pypy/extradoc/talk/22c3/agility.pdf + http://codespeak.net/pypy/extradoc/talk/22c3/techpaper.pdf + From mwh at codespeak.net Sat Dec 3 12:28:16 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Sat, 3 Dec 2005 12:28:16 +0100 (CET) Subject: [pypy-svn] r20605 - pypy/dist/pypy/doc/weekly Message-ID: <20051203112816.CC12927B5D@code1.codespeak.net> Author: mwh Date: Sat Dec 3 12:28:15 2005 New Revision: 20605 Modified: pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt Log: a bit of blather about pypy-sync Modified: pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt ============================================================================== --- pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt (original) +++ pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt Sat Dec 3 12:28:15 2005 @@ -53,3 +53,26 @@ http://codespeak.net/pypy/extradoc/talk/22c3/agility.pdf http://codespeak.net/pypy/extradoc/talk/22c3/techpaper.pdf + +Where did PyPy-sync go? +======================= + +What's a pypy-sync meeting? Apparently:: + + It's an XP-style meeting that serves to synchronize + development work and let everybody know who is + working on what. It also serves as a decision + board of the PyPy developers. If discussions + last too long and decisions cannot be reached + they are delegated to a sub-group or get postponed. + +And they usually happen at 1pm CET on the #pypy-sync meeting on +freenode. Except that the last couple haven't really happened -- a +few people have turned up, but not many and mostly it's just the +people who are in #pypy all week anyway. So after the Gtbg sprint +next week we're going to try hard to get everyone to attend these +meetings again and talk about what they are doing. 
This is especially +important as we head into phases 2 and 3 of the project as the areas +of work have less intrinsically in common and so maintaining an +overall impression of where the project is requires more explicit +effort. From hpk at codespeak.net Sat Dec 3 12:44:46 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Sat, 3 Dec 2005 12:44:46 +0100 (CET) Subject: [pypy-svn] r20606 - pypy/dist/pypy/doc/weekly Message-ID: <20051203114446.E39A027B5D@code1.codespeak.net> Author: hpk Date: Sat Dec 3 12:44:46 2005 New Revision: 20606 Modified: pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt Log: - more on pypy-sync - more on CCC and that people can still attend Modified: pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt ============================================================================== --- pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt (original) +++ pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt Sat Dec 3 12:44:46 2005 @@ -53,6 +53,14 @@ http://codespeak.net/pypy/extradoc/talk/22c3/agility.pdf http://codespeak.net/pypy/extradoc/talk/22c3/techpaper.pdf +Btw, it's still possible to attend the CCC conference 27th-30th December +in Berlin: + + http://events.ccc.de/congress/2005 + +A number of PyPy people will be around and innocently mix with +people from other communities and generally be available for +discussing all things PyPy and the future. Where did PyPy-sync go? ======================= @@ -62,17 +70,19 @@ It's an XP-style meeting that serves to synchronize development work and let everybody know who is working on what. It also serves as a decision - board of the PyPy developers. If discussions + board of the PyPy active developers. If discussions last too long and decisions cannot be reached they are delegated to a sub-group or get postponed. -And they usually happen at 1pm CET on the #pypy-sync meeting on -freenode. 
Except that the last couple haven't really happened -- a -few people have turned up, but not many and mostly it's just the -people who are in #pypy all week anyway. So after the Gtbg sprint -next week we're going to try hard to get everyone to attend these -meetings again and talk about what they are doing. This is especially -important as we head into phases 2 and 3 of the project as the areas -of work have less intrinsically in common and so maintaining an -overall impression of where the project is requires more explicit -effort. +pypy-sync meetings usually happen on thursdays at 1pm CET on the +#pypy-sync IRC channel on freenode, and they are usually +prepared an Agenda and minutes after the meeting. Except that +the last couple haven't really happened this way -- no agenda, +a few people have turned up, but not many and mostly it's just the +people who are in #pypy all week anyway. + +So after the Gtbg sprint next week we're going to try harder +to prepare and get developers to attend pypy-sync meetings +again. This is especially important as we head towards +JIT-compiler efforts, integrating more of our GC works, more +and refined backends and lots of other challenges. From mwh at codespeak.net Sat Dec 3 12:54:29 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Sat, 3 Dec 2005 12:54:29 +0100 (CET) Subject: [pypy-svn] r20607 - pypy/dist/pypy/doc/weekly Message-ID: <20051203115429.0EA0227B5D@code1.codespeak.net> Author: mwh Date: Sat Dec 3 12:54:27 2005 New Revision: 20607 Modified: pypy/dist/pypy/doc/weekly/index.txt pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt Log: make holger's words look more like i wrote them and remove a little repetition. 
Modified: pypy/dist/pypy/doc/weekly/index.txt ============================================================================== --- pypy/dist/pypy/doc/weekly/index.txt (original) +++ pypy/dist/pypy/doc/weekly/index.txt Sat Dec 3 12:54:27 2005 @@ -64,6 +64,7 @@ - SomePBC-refactoring - CCC papers + - Where did PyPy-sync go? .. _`Week ending 2005-11-04`: summary-2005-11-04.html .. _`Week ending 2005-11-11`: summary-2005-11-11.html Modified: pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt ============================================================================== --- pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt (original) +++ pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt Sat Dec 3 12:54:27 2005 @@ -45,22 +45,23 @@ ========== As mentioned last week, two PyPy talks have been accepted for the -Chaos Communication Congress in Berlin, from December 27th to the -30th. The CCC asks that speakers provide papers to accompany their -talks (they make a proceedings book) so that's what we've done, and -the results are two quite nice pieces of propaganda for the project: +Chaos Communication Congress. The CCC asks that speakers provide +papers to accompany their talks (they make a proceedings book) so +that's what we've done, and the results are two quite nice pieces of +propaganda for the project: http://codespeak.net/pypy/extradoc/talk/22c3/agility.pdf http://codespeak.net/pypy/extradoc/talk/22c3/techpaper.pdf -Btw, it's still possible to attend the CCC conference 27th-30th December -in Berlin: +It's still possible to attend the conference in Berlin, from December +27th to the 30th: http://events.ccc.de/congress/2005 -A number of PyPy people will be around and innocently mix with -people from other communities and generally be available for -discussing all things PyPy and the future. +A number of PyPy people will be around and innocently mixing with +people from other communities and generally be available for +discussing all things PyPy and the future. 
+ Where did PyPy-sync go? ======================= @@ -74,15 +75,15 @@ last too long and decisions cannot be reached they are delegated to a sub-group or get postponed. -pypy-sync meetings usually happen on thursdays at 1pm CET on the -#pypy-sync IRC channel on freenode, and they are usually -prepared an Agenda and minutes after the meeting. Except that -the last couple haven't really happened this way -- no agenda, -a few people have turned up, but not many and mostly it's just the -people who are in #pypy all week anyway. - -So after the Gtbg sprint next week we're going to try harder -to prepare and get developers to attend pypy-sync meetings -again. This is especially important as we head towards -JIT-compiler efforts, integrating more of our GC works, more -and refined backends and lots of other challenges. +pypy-sync meetings usually happen on thursdays at 1pm CET on the +#pypy-sync IRC channel on freenode, with an agenda prepared beforehand +and minutes posted to pypy-dev after the meeting. Except that the +last couple haven't really happened this way -- no agenda and only a +few people have turned up and mostly just the people who are in #pypy +all week anyway. + +So after the Gtbg sprint next week we're going to try harder to +prepare and get developers to attend pypy-sync meetings again. This +is especially important as we head towards more varied and less +intrinsically related challenges such as a JIT compiler, integration +of logic programming, higher level backend and much more. 
From hpk at codespeak.net Sat Dec 3 13:02:44 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Sat, 3 Dec 2005 13:02:44 +0100 (CET) Subject: [pypy-svn] r20609 - pypy/dist/pypy/doc/weekly Message-ID: <20051203120244.0A4CA27B56@code1.codespeak.net> Author: hpk Date: Sat Dec 3 13:02:43 2005 New Revision: 20609 Modified: pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt Log: - added some bits about background EU things - added 4 letters (", GC") to pypy-sync forecast Modified: pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt ============================================================================== --- pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt (original) +++ pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt Sat Dec 3 13:02:43 2005 @@ -63,6 +63,26 @@ discussing all things PyPy and the future. +Background work regarding EU involvement +============================================ + +Less visible but requiring quite some work, organisations +funding and organizing the EU PyPy project are currently +preparing a lot of paperwork and reports. Most of the (technical) +reports are done by now but the next Gothenborg sprint +will still internally start with two days of finalizing +those reports. Let's all hope that everything +goes well regarding our first major EU review end January. + +Meanwhile, Holger was invited to give a talk about PyPy's +technical organisation at a workshop from the german EU office +on the 5th December. Also, Bea, Alastair and Holger will +talk about PyPy at an EU workshop in Bruxelles. Hopefully, this +provides us some more insights and possibilities to have PyPy +recognized as an interesting "live" project within +the EU context. + + Where did PyPy-sync go? ======================= @@ -86,4 +106,5 @@ prepare and get developers to attend pypy-sync meetings again. 
This is especially important as we head towards more varied and less intrinsically related challenges such as a JIT compiler, integration -of logic programming, higher level backend and much more. +of logic programming, GC, higher level backend and much more. + From hpk at codespeak.net Sat Dec 3 13:04:54 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Sat, 3 Dec 2005 13:04:54 +0100 (CET) Subject: [pypy-svn] r20610 - pypy/dist/pypy/doc/weekly Message-ID: <20051203120454.55E3F27B56@code1.codespeak.net> Author: hpk Date: Sat Dec 3 13:04:53 2005 New Revision: 20610 Modified: pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt Log: add the date of the bruxelles workshop Modified: pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt ============================================================================== --- pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt (original) +++ pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt Sat Dec 3 13:04:53 2005 @@ -77,9 +77,9 @@ Meanwhile, Holger was invited to give a talk about PyPy's technical organisation at a workshop from the german EU office on the 5th December. Also, Bea, Alastair and Holger will -talk about PyPy at an EU workshop in Bruxelles. Hopefully, this -provides us some more insights and possibilities to have PyPy -recognized as an interesting "live" project within +talk about PyPy at an EU workshop 8th December in Bruxelles. +Hopefully, this provides us some more insights and possibilities +to have PyPy recognized as an interesting "live" project within the EU context. 
From mwh at codespeak.net Sat Dec 3 13:20:55 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Sat, 3 Dec 2005 13:20:55 +0100 (CET) Subject: [pypy-svn] r20611 - pypy/dist/pypy/doc/weekly Message-ID: <20051203122055.478E327B56@code1.codespeak.net> Author: mwh Date: Sat Dec 3 13:20:54 2005 New Revision: 20611 Modified: pypy/dist/pypy/doc/weekly/index.txt pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt Log: "michael-ification" Modified: pypy/dist/pypy/doc/weekly/index.txt ============================================================================== --- pypy/dist/pypy/doc/weekly/index.txt (original) +++ pypy/dist/pypy/doc/weekly/index.txt Sat Dec 3 13:20:54 2005 @@ -64,6 +64,7 @@ - SomePBC-refactoring - CCC papers + - Background EU-related work - Where did PyPy-sync go? .. _`Week ending 2005-11-04`: summary-2005-11-04.html Modified: pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt ============================================================================== --- pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt (original) +++ pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt Sat Dec 3 13:20:54 2005 @@ -63,24 +63,24 @@ discussing all things PyPy and the future. -Background work regarding EU involvement -============================================ -Less visible but requiring quite some work, organisations -funding and organizing the EU PyPy project are currently -preparing a lot of paperwork and reports. Most of the (technical) -reports are done by now but the next Gothenborg sprint -will still internally start with two days of finalizing -those reports. Let's all hope that everything -goes well regarding our first major EU review end January. - -Meanwhile, Holger was invited to give a talk about PyPy's -technical organisation at a workshop from the german EU office -on the 5th December. Also, Bea, Alastair and Holger will -talk about PyPy at an EU workshop 8th December in Bruxelles. 
-Hopefully, this provides us some more insights and possibilities -to have PyPy recognized as an interesting "live" project within -the EU context. +Background EU-related work +========================== + +Less visible but still requiring work, organisations funding and +organizing the EU PyPy project are currently preparing a lot of +paperwork and reports. Most of the reports are done by now but +the next Göteborg sprint will start with two (insider only) days of +dotting the 'i's and crossing the 't's. Let's all hope that +everything goes well at our first major EU review at the end of +January. + +Meanwhile, Holger was invited to give a talk about PyPy's technical +organisation at a workshop given by the German EU office on the 5th of +December. Also, Bea, Alastair and Holger will talk about PyPy at an +EU workshop on the 8th of December in Bruxelles. Hopefully, this will +enable us to find more opportunities to get PyPy recognized as an +interesting "live" project in the EU's corner of the world. Where did PyPy-sync go? @@ -102,9 +102,8 @@ few people have turned up and mostly just the people who are in #pypy all week anyway. -So after the Gtbg sprint next week we're going to try harder to +So after the Göteborg sprint next week we're going to try harder to prepare and get developers to attend pypy-sync meetings again. This is especially important as we head towards more varied and less intrinsically related challenges such as a JIT compiler, integration -of logic programming, GC, higher level backend and much more. - +of logic programming, GC, higher level backends and much more. 
From arigo at codespeak.net Sat Dec 3 13:27:42 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 3 Dec 2005 13:27:42 +0100 (CET) Subject: [pypy-svn] r20613 - pypy/branch/somepbc-refactoring Message-ID: <20051203122742.A7D5927B56@code1.codespeak.net> Author: arigo Date: Sat Dec 3 13:27:42 2005 New Revision: 20613 Removed: pypy/branch/somepbc-refactoring/ Log: Deleting the branch, now merged. From pedronis at codespeak.net Sat Dec 3 13:35:37 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sat, 3 Dec 2005 13:35:37 +0100 (CET) Subject: [pypy-svn] r20614 - pypy/dist/pypy/doc/discussion Message-ID: <20051203123537.2D24D27B56@code1.codespeak.net> Author: pedronis Date: Sat Dec 3 13:35:36 2005 New Revision: 20614 Added: pypy/dist/pypy/doc/discussion/draft-jit-ideas.txt (contents, props changed) Log: start of putting into text ideas and discussions about the JIT and JIT work Armin and me had or that were already floating in the air. Slightly provisional/draftish for now. Added: pypy/dist/pypy/doc/discussion/draft-jit-ideas.txt ============================================================================== --- (empty file) +++ pypy/dist/pypy/doc/discussion/draft-jit-ideas.txt Sat Dec 3 13:35:36 2005 @@ -0,0 +1,52 @@ +JIT ideas and areas of work +------------------------------ + +Low-level graphs abstract interpreter +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +In the context of PyPy architecture a JIT can be envisioned as a +run-time specialiser (doing constant folding, partial evaluation and +keeping allocation virtual as much as possible). 
The specialiser would +be able to produce for some runtime execution state of a function new +low-level graphs from a predetermined relevant subset of the forest of +low-level graphs making up PyPy and given entry points and parameter +variables whose run-time constantness can be exploited (think the eval +loop for an entry-point and the bytecode of the function for such a +variable and the graphs for the directly involved PyPy functions as the +subset). These new low-level graphs could then be turned into machine +code by a run-time machine code backend, progressively mapping +the function bytecode into machine code. + +Ideally PyPy translation should generate code from this determined +subset, list of entry-points and variables that implements run-time +specialisation for it, plus management/bookkeeping and instrumentation +code. + +To explore and understand this problem space, we should probably start +by writing a pure Python abstract interpreter doing constant-folding +and partial evaluation of low-level graphs. With increasing abstraction +this could maybe evolve into the code for generating the specialiser or +at least be used to analyse which subset of the graphs is relevant. + +issue: too fine granularity of low-level implementations of rpython dicts + +Simple target interpreter for experimentation and testing +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +Given that the PyPy eval loop is quite a large chunk to swallow, ahem, +analyse, it would be nice to have some kind of minimal bytecode eval +loop for some very simple bytecode written in RPython to use for +testing/experimenting and as a first target. Ideally the interpreter +state for this should not be much more than an instruction counter and +a value stack. 
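[Editorial aside: a minimal eval loop of the kind sketched above — interpreter state consisting of nothing but an instruction counter and a value stack — might look like the following plain-Python illustration. The opcode names and encoding here are made up for the example; they are not taken from the PyPy tree.]

```python
# Minimal stack-based bytecode interpreter sketch.
# State is only an instruction counter (pc) and a value stack.
# The opcodes and their encoding are hypothetical, for illustration.
PUSH, ADD, MUL, RETURN = range(4)

def interpret(code):
    pc = 0        # instruction counter
    stack = []    # value stack
    while True:
        op = code[pc]
        pc += 1
        if op == PUSH:          # PUSH takes one inline operand
            stack.append(code[pc])
            pc += 1
        elif op == ADD:
            y = stack.pop()
            x = stack.pop()
            stack.append(x + y)
        elif op == MUL:
            y = stack.pop()
            x = stack.pop()
            stack.append(x * y)
        elif op == RETURN:      # result is the top of the stack
            return stack.pop()

# computes (3 + 4) * 2
program = [PUSH, 3, PUSH, 4, ADD, PUSH, 2, MUL, RETURN]
```

Such a loop is also simple enough to be written in RPython style (no dynamic typing tricks), which is what would make it usable as a first translation target.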
+ +L3 interpreter +~~~~~~~~~~~~~~~~~~~ + +xxx + + +Machine code backends and register allocation +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +xxx \ No newline at end of file From pedronis at codespeak.net Sat Dec 3 16:23:33 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sat, 3 Dec 2005 16:23:33 +0100 (CET) Subject: [pypy-svn] r20615 - pypy/dist/pypy/doc/discussion Message-ID: <20051203152333.8DF4527B6A@code1.codespeak.net> Author: pedronis Date: Sat Dec 3 16:23:32 2005 New Revision: 20615 Modified: pypy/dist/pypy/doc/discussion/draft-jit-ideas.txt Log: some scattered notes about the L3 interpreter core ideas Modified: pypy/dist/pypy/doc/discussion/draft-jit-ideas.txt ============================================================================== --- pypy/dist/pypy/doc/discussion/draft-jit-ideas.txt (original) +++ pypy/dist/pypy/doc/discussion/draft-jit-ideas.txt Sat Dec 3 16:23:32 2005 @@ -43,7 +43,23 @@ L3 interpreter ~~~~~~~~~~~~~~~~~~~ -xxx +* in RPython + +* the code should try to be straightforward (also for efficiency) + +* try to avoid needing maximal parameters that need to be + computed over all the graph (makes emitting a graph incrementally harder) + +* one major issue to keep in mind is where the information + about offsets into low-level data types comes from/is computed + (C compiler offset values, vs. made-up values) + +* translatable together with an RPython program, and capable + of accepting (constant) data and functions from it + in the interpreted graphs + +* ideally this should evolve into the intermediate representation + used between the JIT and the machine code backends Machine code backends and register allocation From arigo at codespeak.net Sun Dec 4 11:50:35 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 4 Dec 2005 11:50:35 +0100 (CET) Subject: [pypy-svn] r20632 - in pypy/dist/pypy/rpython: . 
lltypesystem Message-ID: <20051204105035.7389C27B53@code1.codespeak.net> Author: arigo Date: Sun Dec 4 11:50:34 2005 New Revision: 20632 Modified: pypy/dist/pypy/rpython/lltypesystem/rpbc.py pypy/dist/pypy/rpython/rmodel.py pypy/dist/pypy/rpython/rpbc.py Log: A helper for Reprs that represent true/false objects with non-null/null pointers. Modified: pypy/dist/pypy/rpython/lltypesystem/rpbc.py ============================================================================== --- pypy/dist/pypy/rpython/lltypesystem/rpbc.py (original) +++ pypy/dist/pypy/rpython/lltypesystem/rpbc.py Sun Dec 4 11:50:34 2005 @@ -7,11 +7,11 @@ typeOf, Void, ForwardReference, Struct, Bool, \ Ptr, malloc, nullptr from pypy.rpython.rmodel import Repr, TyperError, inputconst, inputdesc -from pypy.rpython.rmodel import warning, mangle +from pypy.rpython.rmodel import warning, mangle, CanBeNull from pypy.rpython import robject from pypy.rpython import rtuple from pypy.rpython.rpbc import SingleFrozenPBCRepr, samesig,\ - commonbase, allattributenames, MultiplePBCRepr, FunctionsPBCRepr, \ + commonbase, allattributenames, FunctionsPBCRepr, \ AbstractClassesPBCRepr, AbstractMethodsPBCRepr, OverriddenFunctionPBCRepr from pypy.rpython.lltypesystem import rclass from pypy.tool.sourcetools import has_varargs @@ -26,7 +26,7 @@ # ____________________________________________________________ -class MultipleFrozenPBCRepr(MultiplePBCRepr): +class MultipleFrozenPBCRepr(CanBeNull, Repr): """Representation selected for multiple non-callable pre-built constants.""" def __init__(self, rtyper, access_set): self.rtyper = rtyper Modified: pypy/dist/pypy/rpython/rmodel.py ============================================================================== --- pypy/dist/pypy/rpython/rmodel.py (original) +++ pypy/dist/pypy/rpython/rmodel.py Sun Dec 4 11:50:34 2005 @@ -179,6 +179,17 @@ return 0 +class CanBeNull(object): + """A mix-in base class for subclasses of Repr that represent None as + 'null' and true values as non-'null'. 
+ """ + def rtype_is_true(self, hop): + if hop.s_result.is_constant(): + return hop.inputconst(Bool, hop.s_result.const) + else: + return hop.rtyper.type_system.check_null(self, hop) + + class IteratorRepr(Repr): """Base class of Reprs of any kind of iterator.""" Modified: pypy/dist/pypy/rpython/rpbc.py ============================================================================== --- pypy/dist/pypy/rpython/rpbc.py (original) +++ pypy/dist/pypy/rpython/rpbc.py Sun Dec 4 11:50:34 2005 @@ -7,7 +7,7 @@ from pypy.rpython.lltypesystem.lltype import \ typeOf, Void, Bool, nullptr, frozendict, Ptr, Struct, malloc from pypy.rpython.error import TyperError -from pypy.rpython.rmodel import Repr, inputconst, HalfConcreteWrapper +from pypy.rpython.rmodel import Repr, inputconst, HalfConcreteWrapper, CanBeNull from pypy.rpython import rclass from pypy.rpython import robject @@ -66,17 +66,6 @@ # ____________________________________________________________ -class MultiplePBCRepr(Repr): - """Base class for PBCReprs of multiple PBCs that can include None - (represented as a NULL pointer).""" - def rtype_is_true(self, hop): - if hop.s_result.is_constant(): - assert hop.s_result.const is True # custom __nonzero__ on PBCs? 
- return hop.inputconst(Bool, hop.s_result.const) - else: - return hop.rtyper.type_system.check_null(self, hop) - - class ConcreteCallTableRow(dict): """A row in a concrete call table.""" @@ -165,7 +154,7 @@ return concretetable, uniquerows -class FunctionsPBCRepr(MultiplePBCRepr): +class FunctionsPBCRepr(CanBeNull, Repr): """Representation selected for a PBC of function(s).""" def __init__(self, rtyper, s_pbc): From arigo at codespeak.net Sun Dec 4 11:51:36 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 4 Dec 2005 11:51:36 +0100 (CET) Subject: [pypy-svn] r20633 - pypy/dist/pypy Message-ID: <20051204105136.1A40F27B53@code1.codespeak.net> Author: arigo Date: Sun Dec 4 11:51:34 2005 New Revision: 20633 Modified: pypy/dist/pypy/conftest.py Log: Hacked on conftest.py until it again does the Right Thing with interp- and app-level KeyboardInterrupt. Modified: pypy/dist/pypy/conftest.py ============================================================================== --- pypy/dist/pypy/conftest.py (original) +++ pypy/dist/pypy/conftest.py Sun Dec 4 11:51:34 2005 @@ -62,16 +62,8 @@ kwds.setdefault('usemodules', option.usemodules) kwds.setdefault('compiler', option.compiler) space = Space(**kwds) - except KeyboardInterrupt: - raise - except OperationError, e: - # we cannot easily convert w_KeyboardInterrupt to - # KeyboardInterrupt so we have to jump through hoops - try: - if e.w_type.name == 'KeyboardInterrupt': - raise KeyboardInterrupt - except AttributeError: - pass + except OperationError, e: + check_keyboard_interrupt(e) if option.verbose: import traceback traceback.print_exc() @@ -87,6 +79,19 @@ space.eq_w = appsupport.eq_w.__get__(space) return space +class OpErrKeyboardInterrupt(KeyboardInterrupt): + pass + +def check_keyboard_interrupt(e): + # we cannot easily convert w_KeyboardInterrupt to KeyboardInterrupt + # in general without a space -- here is an approximation + try: + if e.w_type.name == 'KeyboardInterrupt': + tb = sys.exc_info()[2] + 
raise OpErrKeyboardInterrupt, OpErrKeyboardInterrupt(), tb + except AttributeError: + pass + # # Interfacing/Integrating with py.test's collection process # @@ -147,8 +152,9 @@ try: target(*args) except OperationError, e: - if e.match(space, space.w_KeyboardInterrupt): - raise KeyboardInterrupt + if e.match(space, space.w_KeyboardInterrupt): + tb = sys.exc_info()[2] + raise OpErrKeyboardInterrupt, OpErrKeyboardInterrupt(), tb appexcinfo = appsupport.AppExceptionInfo(space, e) if appexcinfo.traceback: raise self.Failed(excinfo=appsupport.AppExceptionInfo(space, e)) @@ -159,11 +165,15 @@ class IntTestFunction(PyPyTestFunction): def execute(self, target, *args): co = target.func_code - if 'space' in co.co_varnames[:co.co_argcount]: - space = gettestobjspace() - target(space, *args) - else: - target(*args) + try: + if 'space' in co.co_varnames[:co.co_argcount]: + space = gettestobjspace() + target(space, *args) + else: + target(*args) + except OperationError, e: + check_keyboard_interrupt(e) + raise if 'pygame' in sys.modules: global _pygame_warned if not _pygame_warned: From arigo at codespeak.net Sun Dec 4 13:46:50 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 4 Dec 2005 13:46:50 +0100 (CET) Subject: [pypy-svn] r20639 - in pypy/dist/pypy/rpython/l3interp: . test Message-ID: <20051204124650.0F66427B56@code1.codespeak.net> Author: arigo Date: Sun Dec 4 13:46:47 2005 New Revision: 20639 Modified: pypy/dist/pypy/rpython/l3interp/l3interp.py pypy/dist/pypy/rpython/l3interp/model.py pypy/dist/pypy/rpython/l3interp/test/test_convert.py pypy/dist/pypy/rpython/l3interp/test/test_l3interp.py Log: Simplifying the L3 graph model... 
Modified: pypy/dist/pypy/rpython/l3interp/l3interp.py ============================================================================== --- pypy/dist/pypy/rpython/l3interp/l3interp.py (original) +++ pypy/dist/pypy/rpython/l3interp/l3interp.py Sun Dec 4 13:46:47 2005 @@ -1,80 +1,207 @@ from pypy.rpython.l3interp import model from pypy.rpython.memory import lladdress +from pypy.rpython.rarithmetic import r_uint +from pypy.interpreter.miscutils import InitializedClass + +class L3Exception(Exception): + pass -class LLException(Exception): - def __init__(self): - pass - -class LLInterpreter(object): - def __init__(self, globals): - self.globals = globals - def eval_graph_int(self, graph, args): - frame = LLFrame(graph, self) - returnlink = frame.eval(args) - return frame.get_int(0) - -class LLFrame(object): - def __init__(self, graph, lli): - self.interp = lli - self.graph = graph - self.int_vars = [0] * graph.max_num_ints - def eval(self, int_values): - link = self.graph.startlink - self.copy_startlink_vars(link, int_values) - while not link.stop_graph_evaluation: - link = self.eval_block(link.target) - self.copy_link_vars(link) - return link - - def eval_block(self, block): - for op in block.operations: -# print op.opimpl, op.result, op.args, self.int_vars - op.opimpl(self, op.result, op.args) - exitswitch = block.exitswitch - if exitswitch >= 0: - link = block.exits[self.int_vars[exitswitch]] - return link - return block.exits[0] - - def copy_startlink_vars(self, link, int_values): -# print "copy_startlink_vars", int_values, link.move_int_registers - if link.move_int_registers is None: - return - for i in range(0, len(link.move_int_registers), 2): - source = link.move_int_registers[i] - target = link.move_int_registers[i + 1] - self.set_int(target, int_values[source]) - - def copy_link_vars(self, link): -# print "copy_link_vars", link.move_int_registers, self.int_vars - if link.move_int_registers is None: - return - for i in range(0, len(link.move_int_registers), 2): - 
source = link.move_int_registers[i] - target = link.move_int_registers[i + 1] - self.set_int(target, self.get_int(source)) - - def get_int(self, index): - if index < 0: - return self.graph.constants_int[~index] +class L3Value(object): + pass + +class L3Integer(L3Value): + def __init__(self, intval): + self.intval = intval + +class L3Double(L3Value): + def __init__(self, dblval): + self.dblval = dblval + +class L3Pointer(L3Value): + def __init__(self, ptrval): + self.ptrval = ptrval + + +def l3interpret(graph, args_int, args_dbl, args_ptr): + assert len(args_int) == graph.nargs_int + assert len(args_dbl) == graph.nargs_dbl + assert len(args_ptr) == graph.nargs_ptr + frame = L3Frame(graph, args_int, args_dbl, args_ptr) + frame.execute() + nint = len(args_int) - graph.nargs_int + ndbl = len(args_dbl) - graph.nargs_dbl + nptr = len(args_ptr) - graph.nargs_ptr + if nint == 1 and ndbl == 0 and nptr == 0: + return L3Integer(args_int.pop()) + if nint == 0 and ndbl == 1 and nptr == 0: + return L3Double(args_dbl.pop()) + if nint == 0 and ndbl == 0 and nptr == 1: + return L3Pointer(args_ptr.pop()) + raise AssertionError("stacks corrupted") + +class L3Frame(object): + + def __init__(self, graph, stack_int, stack_dbl, stack_ptr): + self.graph = graph + self.block = self.graph.startblock + self.i = 0 + self.stack_int = stack_int + self.stack_dbl = stack_dbl + self.stack_ptr = stack_ptr + self.base_int = len(stack_int) + self.base_dbl = len(stack_dbl) + self.base_ptr = len(stack_ptr) + + def nextop(self): + i = self.i + self.i = i+1 + return self.block.insns[i] + + def nextuop(self): + return r_uint(self.nextop()) + + def execute(self): + try: + while True: + op = self.nextuop() + ophandler = L3Frame.dispatch_table[op] + ophandler(self) + except L3Return: + pass + + def followlink(self, link): + assert isinstance(link, model.Link) + if link.targetregs_int is None: + del self.stack_int[self.base_int:] + else: + buf = [0] * len(link.targetregs_int) + for i in 
range(len(link.targetregs_int)): + op = link.targetregs_int[i] + if op >= 0: buf[i] = self.block.constants_int[op] + else: buf[i] = self.stack_int[op] + del self.stack_int[self.base_int:] + self.stack_int.extend(buf) + if link.targetregs_dbl is None: + del self.stack_dbl[self.base_dbl:] else: - return self.int_vars[index] + buf = [0.0] * len(link.targetregs_dbl) + for i in range(len(link.targetregs_dbl)): + op = link.targetregs_dbl[i] + if op >= 0: buf[i] = self.block.constants_dbl[op] + else: buf[i] = self.stack_dbl[op] + del self.stack_dbl[self.base_dbl:] + self.stack_dbl.extend(buf) + if link.targetregs_ptr is None: + del self.stack_ptr[self.base_ptr:] + else: + buf = [lladdress.NULL] * len(link.targetregs_ptr) + for i in range(len(link.targetregs_ptr)): + op = link.targetregs_ptr[i] + if op >= 0: buf[i] = self.block.constants_ptr[op] + else: buf[i] = self.stack_ptr[op] + del self.stack_ptr[self.base_ptr:] + self.stack_ptr.extend(buf) + self.block = link.target + self.i = 0 + + __metaclass__ = InitializedClass + def __initclass__(cls): + "NOT_RPYTHON" + def make_missing_handler(opname): + def missing_handler(self): + print 'XXX missing handler for operation', opname + raise NotImplementedError + cls.dispatch_table = [] + for opname in model.very_low_level_ops: + try: + fn = getattr(cls, 'op_' + opname).im_func + except AttributeError: + fn = make_missing_handler(opname) + cls.dispatch_table.append(fn) + + # ____________________________________________________________ + + def getint(self): + op = self.nextop() + if op >= 0: return self.block.constants_int[op] + else: return self.stack_int[op] + + def getdbl(self): + op = self.nextop() + if op >= 0: return self.block.constants_dbl[op] + else: return self.stack_dbl[op] + + def getptr(self): + op = self.nextop() + if op >= 0: return self.block.constants_ptr[op] + else: return self.stack_ptr[op] + + def restorestacks(self): + del self.stack_int[self.base_int:] + del self.stack_dbl[self.base_dbl:] + del 
self.stack_ptr[self.base_ptr:] + + def op_void_return(self): + self.restorestacks() + raise L3Return + + def op_int_return(self): + x = self.getint() + self.restorestacks() + self.stack_int.append(x) + raise L3Return + + def op_dbl_return(self): + x = self.getdbl() + self.restorestacks() + self.stack_dbl.append(x) + raise L3Return + + def op_ptr_return(self): + x = self.getptr() + self.restorestacks() + self.stack_ptr.append(x) + raise L3Return + + def op_jump(self): + self.followlink(self.block.exit0) + + def op_jump_cond(self): + x = self.getint() + if x: + link = self.block.exit1 + else: + link = self.block.exit0 + self.followlink(link) + + def op_int_add(self): + x = self.getint() + y = self.getint() + self.stack_int.append(x + y) + + def op_direct_call(self): + assert self.block.called_graphs is not None + graph = self.block.called_graphs[self.nextuop()] + if graph.nargs_int: + buf = [0] * graph.nargs_int + for i in range(graph.nargs_int): + buf[i] = self.getint() + self.stack_int.extend(buf) + if graph.nargs_dbl: + buf = [0.0] * graph.nargs_dbl + for i in range(graph.nargs_dbl): + buf[i] = self.getdbl() + self.stack_dbl.extend(buf) + if graph.nargs_ptr: + buf = [lladdress.NULL] * graph.nargs_ptr + for i in range(graph.nargs_ptr): + buf[i] = self.getptr() + self.stack_ptr.extend(buf) + frame = L3Frame(graph, self.stack_int, self.stack_dbl, self.stack_ptr) + frame.execute() - def set_int(self, index, val): - self.int_vars[index] = val + # ____________________________________________________________ - def op_int_add(self, result, args): - int1 = self.get_int(args[0]) - int2 = self.get_int(args[1]) - self.set_int(result, int1 + int2) - - def op_int_is_true(self, result, args): - int1 = self.get_int(args[0]) - self.set_int(result, bool(int1)) - - def op_call_graph_int(self, result, args): - graph = self.interp.globals.graphs[args[0]] - concrete_args = [self.get_int(arg) for arg in args[1:]] - r = self.interp.eval_graph_int(graph, concrete_args) - 
self.set_int(result, r) +class L3Return(Exception): + pass Modified: pypy/dist/pypy/rpython/l3interp/model.py ============================================================================== --- pypy/dist/pypy/rpython/l3interp/model.py (original) +++ pypy/dist/pypy/rpython/l3interp/model.py Sun Dec 4 13:46:47 2005 @@ -3,6 +3,12 @@ from pypy.rpython.lltypesystem import lltype very_low_level_ops = [ + 'nop', + + #control flow operations (only at the end of blocks) + 'jump', 'jump_cond', + 'void_return', 'int_return', 'float_return', 'adr_return', + #operations with adresses: 'adr_add', 'adr_delta', 'adr_eq', 'adr_ge', 'adr_gt', 'adr_le', 'adr_lt', 'adr_ne', 'adr_sub', @@ -72,7 +78,11 @@ 'unichar_eq', 'unichar_ne' ] - +#assert len(very_low_level_ops) <= 256 +very_low_level_opcode = {} +for i, op in enumerate(very_low_level_ops): + very_low_level_opcode[op] = i +del i, op primitives = [lltype.Signed, lltype.Unsigned, lltype.Float, lltype.Char, @@ -81,56 +91,41 @@ primitive_to_number = {} for i, p in enumerate(primitives): primitive_to_number[p] = -i - 1 -del p +del i, p +class Op: "Attribute-based interface to very_low_level_opcode" +Op = Op() +Op.__dict__ = very_low_level_opcode -# possible values for exitswitch: -ONE_EXIT = -1 -LAST_EXCEPTION = -2 - -class Operation(object): - def __init__(self, opimpl, result, args): - self.opimpl = opimpl # unbound method of LLFrame - self.args = args # list of ints: how to represent constants? 
- self.result = result # resulting variable - -class Link(object): - stop_graph_evaluation = False - def __init__(self, target, exitcase=None): - self.target = target # target is a Block - self.exitcase = exitcase # NULL for non-exceptional case - # address of exception class else - self.move_int_registers = None - -class ReturnLink(Link): - stop_graph_evaluation = True - def __init__(self, return_val=0, exitcase=None): - Link.__init__(self, None, exitcase) - if return_val != 0: - self.move_int_registers = [return_val, 0] - pass - -class StartLink(Link): - pass class Block(object): - def __init__(self): - self.operations = [] # list of Operations - self.exitswitch = 0 # positives are variables - # negatives see above - self.exits = [] # list of Links + def __init__(self, insns, exit0=None, + exit1=None, + constants_int=None, + constants_dbl=None, + constants_ptr=None, + called_graphs=None): + self.insns = insns + self.exit0 = exit0 + self.exit1 = exit1 + self.constants_int = constants_int + self.constants_dbl = constants_dbl + self.constants_ptr = constants_ptr + self.called_graphs = called_graphs -class Graph(object): - def __init__(self, name, startlink): - self.name = name # string - self.startlink = startlink # Block - self.constants_int = [] - self.max_num_ints = 17 #XXX calculate this - - def set_constants_int(self, constants): - self.constants_int = constants - -class Globals(object): - def __init__(self): - self.graphs = [] # list of Graphs +class Link(object): + def __init__(self, target, targetregs_int=None, + targetregs_dbl=None, + targetregs_ptr=None): + self.target = target + self.targetregs_int = targetregs_int + self.targetregs_dbl = targetregs_dbl + self.targetregs_ptr = targetregs_ptr +class Graph(object): + def __init__(self, name, startblock, nargs_int, nargs_dbl, nargs_ptr): + self.name = name + self.startblock = startblock + self.nargs_int = nargs_int + self.nargs_dbl = nargs_dbl + self.nargs_ptr = nargs_ptr Modified: 
pypy/dist/pypy/rpython/l3interp/test/test_convert.py ============================================================================== --- pypy/dist/pypy/rpython/l3interp/test/test_convert.py (original) +++ pypy/dist/pypy/rpython/l3interp/test/test_convert.py Sun Dec 4 13:46:47 2005 @@ -1,7 +1,9 @@ +import py from pypy.rpython.l3interp import convertgraph, l3interp from pypy.translator.translator import TranslationContext def test_convert_add(): + py.test.skip("in-progress") def f(x): return x + 4 t = TranslationContext() Modified: pypy/dist/pypy/rpython/l3interp/test/test_l3interp.py ============================================================================== --- pypy/dist/pypy/rpython/l3interp/test/test_l3interp.py (original) +++ pypy/dist/pypy/rpython/l3interp/test/test_l3interp.py Sun Dec 4 13:46:47 2005 @@ -1,5 +1,6 @@ from pypy.rpython.l3interp import l3interp from pypy.rpython.l3interp import model +from pypy.rpython.l3interp.model import Op from pypy.translator.c.test.test_genc import compile from pypy.translator.translator import TranslationContext from pypy.annotation import policy @@ -24,19 +25,13 @@ def eval_seven(): #def f(): # return 3 + 4 - op = model.Operation(l3interp.LLFrame.op_int_add, 0, [-1, -2]) - returnlink = model.ReturnLink() - block = model.Block() - block.exitswitch = model.ONE_EXIT - block.exits = [returnlink] - block.operations.append(op) - startlink = model.Link(block, []) - graph = model.Graph("testgraph", startlink) - graph.set_constants_int([3, 4]) - g = model.Globals() - g.graphs = [graph] - interp = l3interp.LLInterpreter(g) - return interp.eval_graph_int(graph, []) + block = model.Block([Op.int_add, 0, 1, + Op.int_return, -1], + constants_int = [3, 4]) + graph = model.Graph("testgraph", block, 0, 0, 0) + value = l3interp.l3interpret(graph, [], [], []) + assert isinstance(value, l3interp.L3Integer) + return value.intval def test_very_simple(): result = eval_seven() @@ -50,20 +45,13 @@ def eval_eight(number): #def f(x): # return x + 
4 - op = model.Operation(l3interp.LLFrame.op_int_add, 1, [0, -1]) - returnlink = model.ReturnLink(return_val=1) - block = model.Block() - block.exitswitch = model.ONE_EXIT - block.exits = [returnlink] - block.operations.append(op) - startlink = model.Link(target=block) - startlink.move_int_registers = [0, 0] - graph = model.Graph("testgraph", startlink) - graph.set_constants_int([4]) - g = model.Globals() - g.graphs = [graph] - interp = l3interp.LLInterpreter(g) - return interp.eval_graph_int(graph, [number]) + block = model.Block([Op.int_add, -1, 0, + Op.int_return, -1], + constants_int = [4]) + graph = model.Graph("testgraph", block, 1, 0, 0) + value = l3interp.l3interpret(graph, [number], [], []) + assert isinstance(value, l3interp.L3Integer) + return value.intval def test_simple(): result = eval_eight(4) @@ -77,69 +65,51 @@ def eval_branch(number): #def f(x): # if x: - # return 2 + # return x # return 1 - op = model.Operation(l3interp.LLFrame.op_int_is_true, 1, [0]) - returnlink1 = model.ReturnLink(-1) - returnlink2 = model.ReturnLink(-2) - block = model.Block() - block.exitswitch = 1 - block.exits = [returnlink1, returnlink2] - block.operations.append(op) - startlink = model.Link(target=block) - startlink.move_int_registers = [0, 0] - graph = model.Graph("testgraph", startlink) - graph.set_constants_int([1, 2]) - g = model.Globals() - g.graphs = [graph] - interp = l3interp.LLInterpreter(g) - return interp.eval_graph_int(graph, [number]) + block1 = model.Block([Op.jump_cond, -1]) + block2 = model.Block([Op.int_return, -1]) + block3 = model.Block([Op.int_return, 0], constants_int=[1]) + block1.exit0 = model.Link(block3) + block1.exit1 = model.Link(block2, targetregs_int=[-1]) + graph = model.Graph("testgraph", block1, 1, 0, 0) + value = l3interp.l3interpret(graph, [number], [], []) + assert isinstance(value, l3interp.L3Integer) + return value.intval def test_branch(): result = eval_branch(4) - assert result == 2 + assert result == 4 result = eval_branch(0) 
assert result == 1 def test_branch_translated(): fn = translate(eval_branch, [int]) - assert fn(4) == 2 + assert fn(4) == 4 assert fn(0) == 1 #---------------------------------------------------------------------- def eval_call(number): - #XXX uh: this isn't funny anymore #def g(x): # return x + 1 #def f(x): # return g(x) + 2 - op_g = model.Operation(l3interp.LLFrame.op_int_add, 1, [0, -1]) - op_f = model.Operation(l3interp.LLFrame.op_int_add, 2, [1, -1]) - call_op = model.Operation(l3interp.LLFrame.op_call_graph_int, 1, [0, 0]) - returnlink_g = model.ReturnLink(1) - returnlink_f = model.ReturnLink(2) - block_g = model.Block() - block_g.exitswitch = model.ONE_EXIT - block_g.exits = [returnlink_g] - block_g.operations.append(op_g) - startlink_g = model.StartLink(target=block_g) - startlink_g.move_int_registers = [0, 0] - graph_g = model.Graph("g", startlink_g) - graph_g.set_constants_int([1]) - - block_f = model.Block() - block_f.exitswitch = model.ONE_EXIT - block_f.exits = [returnlink_f] - block_f.operations.extend([call_op, op_f]) - startlink_f = model.StartLink(target=block_f) - startlink_f.move_int_registers = [0, 0] - graph_f = model.Graph("f", startlink_f) - graph_f.set_constants_int([2]) - g = model.Globals() - g.graphs = [graph_g, graph_f] - interp = l3interp.LLInterpreter(g) - return interp.eval_graph_int(graph_f, [number]) + block = model.Block([Op.int_add, -1, 0, + Op.int_return, -1], + constants_int = [1]) + graph1 = model.Graph("g", block, 1, 0, 0) + + block = model.Block([Op.direct_call, 0, -1, + Op.int_add, -1, 0, + Op.int_return, -1], + constants_int = [2], + called_graphs = [graph1]) + graph2 = model.Graph("f", block, 1, 0, 0) + + value = l3interp.l3interpret(graph2, [number], [], []) + assert isinstance(value, l3interp.L3Integer) + return value.intval def test_call(): result = eval_call(4) @@ -151,5 +121,3 @@ fn = translate(eval_call, [int]) assert fn(4) == 7 assert fn(0) == 3 - - From arigo at codespeak.net Sun Dec 4 14:05:14 2005 From: arigo at 
codespeak.net (arigo at codespeak.net) Date: Sun, 4 Dec 2005 14:05:14 +0100 (CET) Subject: [pypy-svn] r20640 - pypy/dist/pypy/rpython/l3interp Message-ID: <20051204130514.E6AFF27B56@code1.codespeak.net> Author: arigo Date: Sun Dec 4 14:05:13 2005 New Revision: 20640 Modified: pypy/dist/pypy/rpython/l3interp/l3interp.py Log: Got rid of the intermediate buffer, by copying data around the stack. Moved the code in a function that gets specialized for the three different stacks. Modified: pypy/dist/pypy/rpython/l3interp/l3interp.py ============================================================================== --- pypy/dist/pypy/rpython/l3interp/l3interp.py (original) +++ pypy/dist/pypy/rpython/l3interp/l3interp.py Sun Dec 4 14:05:13 2005 @@ -72,36 +72,13 @@ def followlink(self, link): assert isinstance(link, model.Link) - if link.targetregs_int is None: - del self.stack_int[self.base_int:] - else: - buf = [0] * len(link.targetregs_int) - for i in range(len(link.targetregs_int)): - op = link.targetregs_int[i] - if op >= 0: buf[i] = self.block.constants_int[op] - else: buf[i] = self.stack_int[op] - del self.stack_int[self.base_int:] - self.stack_int.extend(buf) - if link.targetregs_dbl is None: - del self.stack_dbl[self.base_dbl:] - else: - buf = [0.0] * len(link.targetregs_dbl) - for i in range(len(link.targetregs_dbl)): - op = link.targetregs_dbl[i] - if op >= 0: buf[i] = self.block.constants_dbl[op] - else: buf[i] = self.stack_dbl[op] - del self.stack_dbl[self.base_dbl:] - self.stack_dbl.extend(buf) - if link.targetregs_ptr is None: - del self.stack_ptr[self.base_ptr:] - else: - buf = [lladdress.NULL] * len(link.targetregs_ptr) - for i in range(len(link.targetregs_ptr)): - op = link.targetregs_ptr[i] - if op >= 0: buf[i] = self.block.constants_ptr[op] - else: buf[i] = self.stack_ptr[op] - del self.stack_ptr[self.base_ptr:] - self.stack_ptr.extend(buf) + block = self.block + followlink1(L3Integer, self.stack_int, self.base_int, + link.targetregs_int, 
block.constants_int) + followlink1(L3Double, self.stack_dbl, self.base_dbl, + link.targetregs_dbl, block.constants_dbl) + followlink1(L3Pointer, self.stack_ptr, self.base_ptr, + link.targetregs_ptr, block.constants_ptr) self.block = link.target self.i = 0 @@ -181,23 +158,15 @@ self.stack_int.append(x + y) def op_direct_call(self): - assert self.block.called_graphs is not None - graph = self.block.called_graphs[self.nextuop()] - if graph.nargs_int: - buf = [0] * graph.nargs_int - for i in range(graph.nargs_int): - buf[i] = self.getint() - self.stack_int.extend(buf) - if graph.nargs_dbl: - buf = [0.0] * graph.nargs_dbl - for i in range(graph.nargs_dbl): - buf[i] = self.getdbl() - self.stack_dbl.extend(buf) - if graph.nargs_ptr: - buf = [lladdress.NULL] * graph.nargs_ptr - for i in range(graph.nargs_ptr): - buf[i] = self.getptr() - self.stack_ptr.extend(buf) + block = self.block + assert block.called_graphs is not None + graph = block.called_graphs[self.nextuop()] + directcall1(L3Integer, graph.nargs_int, self.stack_int, + block.constants_int, self.nextop) + directcall1(L3Double, graph.nargs_dbl, self.stack_dbl, + block.constants_dbl, self.nextop) + directcall1(L3Pointer, graph.nargs_ptr, self.stack_ptr, + block.constants_ptr, self.nextop) frame = L3Frame(graph, self.stack_int, self.stack_dbl, self.stack_ptr) frame.execute() @@ -205,3 +174,28 @@ class L3Return(Exception): pass + +def followlink1(marker, stack, stackbase, targetregs, constants): + if targetregs is None: + del stack[stackbase:] + else: + top = r_uint(len(stack)) + for op in targetregs: + if op >= 0: newval = constants[op] + else: newval = stack[top + op] + stack.append(newval) + targetlen = len(targetregs) + for i in range(targetlen): + stack[stackbase + i] = stack[top + i] + del stack[stackbase + targetlen:] +followlink1._annspecialcase_ = 'specialize:arg0' + +def directcall1(marker, nargs, stack, constants, nextop): + if nargs > 0: + top = r_uint(len(stack)) + for i in range(nargs): + op = nextop() + 
if op >= 0: newval = constants[op] + else: newval = stack[top + op] + stack.append(newval) +directcall1._annspecialcase_ = 'specialize:arg0' From arigo at codespeak.net Sun Dec 4 14:11:55 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 4 Dec 2005 14:11:55 +0100 (CET) Subject: [pypy-svn] r20641 - in pypy/dist/pypy: annotation rpython/l3interp rpython/test Message-ID: <20051204131155.E6D0B27B56@code1.codespeak.net> Author: arigo Date: Sun Dec 4 14:11:53 2005 New Revision: 20641 Modified: pypy/dist/pypy/annotation/policy.py pypy/dist/pypy/annotation/specialize.py pypy/dist/pypy/rpython/l3interp/l3interp.py pypy/dist/pypy/rpython/test/test_rpbc.py Log: Make and use a more appropriate specialization tag. Modified: pypy/dist/pypy/annotation/policy.py ============================================================================== --- pypy/dist/pypy/annotation/policy.py (original) +++ pypy/dist/pypy/annotation/policy.py Sun Dec 4 14:11:53 2005 @@ -1,6 +1,6 @@ # base annotation policy for overrides and specialization from pypy.annotation.specialize import default_specialize as default -from pypy.annotation.specialize import argtype, argvalue +from pypy.annotation.specialize import argtype, argvalue, arglistitemtype from pypy.annotation.specialize import memo, methodmemo # for some reason, model must be imported first, # or we create a cycle. 
@@ -55,16 +55,22 @@ specialize__methodmemo = staticmethod(methodmemo) specialize__arg0 = staticmethod(argvalue(0)) specialize__argtype0 = staticmethod(argtype(0)) + specialize__arglistitemtype0 = staticmethod(arglistitemtype(0)) specialize__arg1 = staticmethod(argvalue(1)) specialize__argtype1 = staticmethod(argtype(1)) + specialize__arglistitemtype1 = staticmethod(arglistitemtype(1)) specialize__arg2 = staticmethod(argvalue(2)) specialize__argtype2 = staticmethod(argtype(2)) + specialize__arglistitemtype2 = staticmethod(arglistitemtype(2)) specialize__arg3 = staticmethod(argvalue(3)) specialize__argtype3 = staticmethod(argtype(3)) + specialize__arglistitemtype3 = staticmethod(arglistitemtype(3)) specialize__arg4 = staticmethod(argvalue(4)) specialize__argtype4 = staticmethod(argtype(4)) + specialize__arglistitemtype4 = staticmethod(arglistitemtype(4)) specialize__arg5 = staticmethod(argvalue(5)) specialize__argtype5 = staticmethod(argtype(5)) + specialize__arglistitemtype5 = staticmethod(arglistitemtype(5)) def override__ignore(pol, *args): bk = getbookkeeper() Modified: pypy/dist/pypy/annotation/specialize.py ============================================================================== --- pypy/dist/pypy/annotation/specialize.py (original) +++ pypy/dist/pypy/annotation/specialize.py Sun Dec 4 14:11:53 2005 @@ -186,3 +186,13 @@ key = args_s[i].knowntype return funcdesc.cachedgraph(key) return specialize_argtype + +def arglistitemtype(i): + def specialize_arglistitemtype(funcdesc, args_s): + s = args_s[i] + if s.knowntype is not list: + key = None + else: + key = s.listdef.listitem.s_value.knowntype + return funcdesc.cachedgraph(key) + return specialize_arglistitemtype Modified: pypy/dist/pypy/rpython/l3interp/l3interp.py ============================================================================== --- pypy/dist/pypy/rpython/l3interp/l3interp.py (original) +++ pypy/dist/pypy/rpython/l3interp/l3interp.py Sun Dec 4 14:11:53 2005 @@ -73,12 +73,12 @@ def 
followlink(self, link): assert isinstance(link, model.Link) block = self.block - followlink1(L3Integer, self.stack_int, self.base_int, - link.targetregs_int, block.constants_int) - followlink1(L3Double, self.stack_dbl, self.base_dbl, - link.targetregs_dbl, block.constants_dbl) - followlink1(L3Pointer, self.stack_ptr, self.base_ptr, - link.targetregs_ptr, block.constants_ptr) + followlink1(self.stack_int, self.base_int, + link.targetregs_int, block.constants_int) + followlink1(self.stack_dbl, self.base_dbl, + link.targetregs_dbl, block.constants_dbl) + followlink1(self.stack_ptr, self.base_ptr, + link.targetregs_ptr, block.constants_ptr) self.block = link.target self.i = 0 @@ -161,12 +161,12 @@ block = self.block assert block.called_graphs is not None graph = block.called_graphs[self.nextuop()] - directcall1(L3Integer, graph.nargs_int, self.stack_int, - block.constants_int, self.nextop) - directcall1(L3Double, graph.nargs_dbl, self.stack_dbl, - block.constants_dbl, self.nextop) - directcall1(L3Pointer, graph.nargs_ptr, self.stack_ptr, - block.constants_ptr, self.nextop) + directcall1(self.stack_int, graph.nargs_int, + block.constants_int, self.nextop) + directcall1(self.stack_dbl, graph.nargs_dbl, + block.constants_dbl, self.nextop) + directcall1(self.stack_ptr, graph.nargs_ptr, + block.constants_ptr, self.nextop) frame = L3Frame(graph, self.stack_int, self.stack_dbl, self.stack_ptr) frame.execute() @@ -175,7 +175,7 @@ class L3Return(Exception): pass -def followlink1(marker, stack, stackbase, targetregs, constants): +def followlink1(stack, stackbase, targetregs, constants): if targetregs is None: del stack[stackbase:] else: @@ -188,9 +188,9 @@ for i in range(targetlen): stack[stackbase + i] = stack[top + i] del stack[stackbase + targetlen:] -followlink1._annspecialcase_ = 'specialize:arg0' +followlink1._annspecialcase_ = 'specialize:arglistitemtype0' -def directcall1(marker, nargs, stack, constants, nextop): +def directcall1(stack, nargs, constants, nextop): if 
nargs > 0: top = r_uint(len(stack)) for i in range(nargs): @@ -198,4 +198,4 @@ if op >= 0: newval = constants[op] else: newval = stack[top + op] stack.append(newval) -directcall1._annspecialcase_ = 'specialize:arg0' +directcall1._annspecialcase_ = 'specialize:arglistitemtype0' Modified: pypy/dist/pypy/rpython/test/test_rpbc.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rpbc.py (original) +++ pypy/dist/pypy/rpython/test/test_rpbc.py Sun Dec 4 14:11:53 2005 @@ -1218,3 +1218,16 @@ assert ''.join(res.chars) == 'tag1:hellotag1:< 42 >' res = interpret(f, [0]) assert ''.join(res.chars) == 'tag2:hellotag2:< 42 >' + +def test_call_from_list(): + def f0(n): return n+200 + def f1(n): return n+192 + def f2(n): return n+46 + def f3(n): return n+2987 + def f4(n): return n+217 + lst = [f0, f1, f2, f3, f4] + def f(i, n): + return lst[i](n) + for i in range(5): + res = interpret(f, [i, 1000]) + assert res == f(i, 1000) From arigo at codespeak.net Sun Dec 4 15:42:35 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 4 Dec 2005 15:42:35 +0100 (CET) Subject: [pypy-svn] r20642 - pypy/dist/pypy/doc/discussion Message-ID: <20051204144235.F3A4A27B49@code1.codespeak.net> Author: arigo Date: Sun Dec 4 15:42:34 2005 New Revision: 20642 Modified: pypy/dist/pypy/doc/discussion/draft-jit-ideas.txt Log: Concrete short-term JIT plans. Modified: pypy/dist/pypy/doc/discussion/draft-jit-ideas.txt ============================================================================== --- pypy/dist/pypy/doc/discussion/draft-jit-ideas.txt (original) +++ pypy/dist/pypy/doc/discussion/draft-jit-ideas.txt Sun Dec 4 15:42:34 2005 @@ -1,6 +1,24 @@ JIT ideas and areas of work ------------------------------ +Plan +=========================== + +Short-term plans: + +1. Write a small interpreter in RPython for whatever bytecode language, + as an example and for testing. The goal is to turn that interpreter + into a JIT. + +2. 
Write code that takes LL graphs and "specializes" them, by making a + variable constant and propagating it. + +3. Think more about how to plug 1 into 2 :-) + + +Discussion and details +=========================== + Low-level graphs abstract interpreter ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ From arigo at codespeak.net Sun Dec 4 16:43:14 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 4 Dec 2005 16:43:14 +0100 (CET) Subject: [pypy-svn] r20643 - in pypy/dist/pypy/rpython/l3interp: . test Message-ID: <20051204154314.CD27027B4D@code1.codespeak.net> Author: arigo Date: Sun Dec 4 16:43:12 2005 New Revision: 20643 Modified: pypy/dist/pypy/rpython/l3interp/convertgraph.py pypy/dist/pypy/rpython/l3interp/test/test_convert.py Log: Adapted convertgraph to the new model. Fun so far :-) Modified: pypy/dist/pypy/rpython/l3interp/convertgraph.py ============================================================================== --- pypy/dist/pypy/rpython/l3interp/convertgraph.py (original) +++ pypy/dist/pypy/rpython/l3interp/convertgraph.py Sun Dec 4 16:43:12 2005 @@ -1,85 +1,131 @@ -from pypy.rpython.l3interp import model from pypy.rpython.l3interp import l3interp +from pypy.rpython.l3interp import model +from pypy.rpython.l3interp.model import Op from pypy.objspace.flow import model as flowmodel +from pypy.rpython.lltypesystem import lltype -def convert(entrygraph): - cvter = LL2L3Converter(entrygraph) - return cvter.globals class LL2L3Converter(object): - def __init__(self, entrygraph): - self.globals = model.Globals() - self.convert_graph(entrygraph) + def __init__(self): + pass def convert_graph(self, graph): - graph_cvter = LL2L3GraphConverter(graph, self) - l3graph = graph_cvter.l3graph - self.globals.graphs.append(l3graph) - return l3graph - -class LL2L3GraphConverter(object): - def __init__(self, graph, cvter): - self.cvter = cvter - self.graph = graph - self.blocks_ll2l3 = {} - self.constants_to_index = {} - self.constants = [] - startlink = 
self.convert_startlink(graph.startblock) - self.l3graph = model.Graph(graph.name, startlink) - self.l3graph.constants_int = self.constants - - def convert_startlink(self, block): - var_to_register = dict([(var, i) - for i, var in enumerate(block.inputargs)]) - target = self.convert_block(block, var_to_register) - startlink = model.Link(target) - startlink.move_int_register = [i // 2 - for i in range(len(block.inputargs) * 2)] - return startlink - - def convert_block(self, block, var_to_register): - if block in self.blocks_ll2l3: - return self.blocks_ll2l3[block] - def get_reg_number(var): - if var not in var_to_register: - var_to_register[var] = len(var_to_register) - return var_to_register[var] - l3ops = [] - for op in block.operations: - l3ops.append(self.convert_op(op, get_reg_number)) - assert block.exitswitch is None - l3block = model.Block() - self.blocks_ll2l3[block] = l3block - l3block.exitswitch = model.ONE_EXIT - l3block.exits = [self.convert_link(block.exits[0], var_to_register)] - l3block.operations = l3ops + l3block = convert_block(graph.startblock, {}) + nargs = {'int': 0, + 'dbl': 0, + 'ptr': 0} + for v in graph.getargs(): + nargs[getkind(v.concretetype)] += 1 + return model.Graph(graph.name, l3block, + nargs['int'], nargs['dbl'], nargs['ptr']) + + +def getkind(T): + assert isinstance(T, lltype.LowLevelType) + if isinstance(T, lltype.Primitive): + if T == lltype.Float: + return 'dbl' + elif T == lltype.Void: + raise Exception("Void not implemented") + else: + return 'int' + else: + return 'ptr' + +def convert_block(block, memo): + if block in memo: + return memo[block] + + stacksizes = {'int': 0, + 'dbl': 0, + 'ptr': 0} + constants = {'int': [], + 'dbl': [], + 'ptr': []} + var2stack = {} + + def push(v): + kind = getkind(v.concretetype) + position = stacksizes[kind] + stacksizes[kind] += 1 + var2stack[v] = position + + def get(v): + kind = getkind(v.concretetype) + if isinstance(v, flowmodel.Constant): + clist = constants[kind] + try: + res = 
clist.index(v.value) + except ValueError: + res = len(clist) + clist.append(v.value) + return res + else: + position = var2stack[v] + return position - stacksizes[kind] # < 0 + + for v in block.inputargs: + push(v) + + insns = [] + l3block = model.Block(insns) + memo[block] = l3block + + if block.operations == (): + if len(block.inputargs) == 1: # return block + if block.inputargs[0].concretetype is lltype.Void: + l3block.insns.append(Op.void_return) + else: + kind = getkind(block.inputargs[0].concretetype) + l3block.insns.append(model.very_low_level_opcode[ + {'int': 'int_return', + 'dbl': 'float_return', + 'ptr': 'adr_return'}[kind]]) + l3block.insns.append(-1) + else: + raise NotImplementedError("except block") return l3block - def convert_link(self, link, var_to_register): - if link.target is self.graph.returnblock: - l3link = model.ReturnLink(var_to_register[link.args[0]]) - return l3link - assert 0, "not yet implemented" - - def convert_op(self, op, get_reg_number): - c_op = getattr(self, "op_" + op.opname, None) - if c_op is not None: - return c_op(op, get_reg_number) - l3args = [] - for arg in op.args: - if isinstance(arg, flowmodel.Variable): - l3args.append(get_reg_number(arg)) - else: - l3args.append(self.convert_const(arg)) - l3op = model.Operation(getattr(l3interp.LLFrame, "op_" + op.opname), - get_reg_number(op.result), l3args) - return l3op - - def convert_const(self, arg): - arg = int(arg.value) - if arg in self.constants_to_index: - return self.constants_to_index[arg] - index = len(self.constants) - self.constants.append(arg) - self.constants_to_index[arg] = index - return ~index + for spaceop in block.operations: + insns.append(model.very_low_level_opcode[spaceop.opname]) + for v in spaceop.args: + insns.append(get(v)) + if spaceop.result.concretetype is not lltype.Void: + push(spaceop.result) + + def convert_link(link): + targetregs = {'int': [], + 'dbl': [], + 'ptr': []} + for v in link.args: + kind = getkind(v.concretetype) + 
targetregs[kind].append(get(v)) + return model.Link(convert_block(link.target, memo), + targetregs['int'] or None, + targetregs['dbl'] or None, + targetregs['ptr'] or None) + + if block.exitswitch is None: + insns.append(Op.jump) + link, = block.exits + l3block.exit0 = convert_link(link) + + elif block.exitswitch != flowmodel.Constant(flowmodel.last_exception): + link0, link1 = block.exits + if link0.exitcase: + link0, link1 = link1, link0 + assert not link0.exitcase + assert link1.exitcase + insns.append(Op.jump_cond) + insns.append(get(block.exitswitch)) + l3block.exit0 = convert_link(link0) + l3block.exit1 = convert_link(link1) + + else: + raise NotImplementedError("exceptions") + + if constants['int']: l3block.constants_int = constants['int'] + if constants['dbl']: l3block.constants_dbl = constants['dbl'] + if constants['ptr']: l3block.constants_ptr = constants['ptr'] + + return l3block Modified: pypy/dist/pypy/rpython/l3interp/test/test_convert.py ============================================================================== --- pypy/dist/pypy/rpython/l3interp/test/test_convert.py (original) +++ pypy/dist/pypy/rpython/l3interp/test/test_convert.py Sun Dec 4 16:43:12 2005 @@ -3,14 +3,13 @@ from pypy.translator.translator import TranslationContext def test_convert_add(): - py.test.skip("in-progress") def f(x): return x + 4 t = TranslationContext() t.buildannotator().build_types(f, [int]) t.buildrtyper().specialize() - globals = convertgraph.convert(t.graphs[0]) - interp = l3interp.LLInterpreter(globals) - graph = globals.graphs[0] - result = interp.eval_graph_int(graph, [0]) - assert result == 4 + conv = convertgraph.LL2L3Converter() + l3graph = conv.convert_graph(t.graphs[0]) + result = l3interp.l3interpret(l3graph, [42], [], []) + assert isinstance(result, l3interp.L3Integer) + assert result.intval == 46 From arigo at codespeak.net Sun Dec 4 18:24:43 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 4 Dec 2005 18:24:43 +0100 (CET) Subject: 
[pypy-svn] r20645 - in pypy/dist/pypy: bin translator/goal Message-ID: <20051204172443.0DB0027B52@code1.codespeak.net> Author: arigo Date: Sun Dec 4 18:24:42 2005 New Revision: 20645 Modified: pypy/dist/pypy/bin/py.py pypy/dist/pypy/translator/goal/app_main.py Log: Insert the dir of the script in sys.path[0]. Thanks Seo. This initialization is rather messy at the moment, it will need a good refactoring at one point... Modified: pypy/dist/pypy/bin/py.py ============================================================================== --- pypy/dist/pypy/bin/py.py (original) +++ pypy/dist/pypy/bin/py.py Sun Dec 4 18:24:42 2005 @@ -100,6 +100,9 @@ def doit(): main.run_string(Options.command[0], space=space) elif args: + scriptdir = os.path.dirname(os.path.abspath(args[0])) + space.call_method(space.sys.get('path'), 'insert', + space.wrap(0), space.wrap(scriptdir)) def doit(): main.run_file(args[0], space=space) else: Modified: pypy/dist/pypy/translator/goal/app_main.py ============================================================================== --- pypy/dist/pypy/translator/goal/app_main.py (original) +++ pypy/dist/pypy/translator/goal/app_main.py Sun Dec 4 18:24:42 2005 @@ -173,6 +173,10 @@ exec cmd in mainmodule.__dict__ run_toplevel(run_it) else: + import os + # XXX resolve symlinks + scriptdir = os.path.dirname(os.path.abspath(sys.argv[0])) + sys.path.insert(0, scriptdir) run_toplevel(execfile, sys.argv[0], mainmodule.__dict__) else: go_interactive = True From bea at codespeak.net Sun Dec 4 22:22:19 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Sun, 4 Dec 2005 22:22:19 +0100 (CET) Subject: [pypy-svn] r20646 - pypy/extradoc/talk/pypy_euworkshop_2005-12-08 Message-ID: <20051204212219.019CA27B4B@code1.codespeak.net> Author: bea Date: Sun Dec 4 22:22:18 2005 New Revision: 20646 Added: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/bea_part3_agility Log: my part of the eu-workshop talk,,,, Added: 
pypy/extradoc/talk/pypy_euworkshop_2005-12-08/bea_part3_agility ============================================================================== --- (empty file) +++ pypy/extradoc/talk/pypy_euworkshop_2005-12-08/bea_part3_agility Sun Dec 4 22:22:18 2005 @@ -0,0 +1,72 @@ +SEMINAR +Best Practice in the Use and Development of Free and Open Source Software +3. Case study: + +Part 3/Bea: + +Slides: + +1. + +- Agile development grew out of a need to handle rapid change + in processes surrounding commercial software development + +- How then do agile approaches fit distributed, open-source projects without + the need to handle changing formal requirements and client relations? + +- The answer points to the core of Agile practices: the people factor + "Agile processes are designed to capitalize on each individual and each team's + unique strengths" (Cockburn, Highsmith, 2001) + +- The OSS nature of teams being self-organized and intensely collaborative fits the + agile approach, although OSS teams are a unique implementation due to the distributed + nature of work + +2. + +- Agile approaches aim at: + * reducing "cost of information" and distance between decision-making and its implementation + * by locating the team closer in a physical sense, replace documentation with face-to-face dissemination + * resulting in improved sense of community and team "morale", the foundation of pro-active teams + +- OSS teams fit the criteria very well if you look at the "physical" aspect in a more unorthodox sense. Transparent, + intense, daily communication via IRC, emails and wikis makes up for a lot of this. + +- It is no wonder though that the Python community (Zope Foundation) tailored agile approaches to add one crucial + technique, sprints, to make up for the lack of "physical" interaction between programmers. This technique is now + widely used within the Python community. + +3. 
+ +- Sprints are "two-day or three-day focused development session[s], in which developers pair off + together in a room and focus on building a particular subsystem". In this implementation it fits + agile criteria because of the knowledge/learning aspects as well as the incremental approaches. + +- Sprinting was the key agile technique in the start-up of PyPy, work being non-funded. While working on the + proposal (during sprints) the challenge was to tailor a project process based on sprinting that would fit and + work within an EU framework. (picture - sprint process) + +- Sprinting is central to the PyPy project because it is the focus point of the funded, consortium-based efforts + as well as the non-funded OSS efforts. Primarily focused on programming but there are also regular dissemination + activities (tutorials, talks) as well as consortium/management coordination. + +4. + +- "Agile teams are characterized by self-organization and intense collaboration, within and across organizational + boundaries" (Cockburn, Highsmith, 2001) How does one structure an agile OSS community into a consortium of 7 partners? + +- In order to stay true to the agile vision as much as possible, the consortium structure and roles/responsibilities + are supporting a developer-driven, flat organization. Much of the coordination of work is delegated to the core + developers. Regular "sync" meetings (once per week) are done via IRC in which the community of developers (funded + as well as non-funded) coordinate development work, keeping communication as transparent as possible. + +- Consortium meetings are done once every month via IRC with developers attending as well; physical consortium meetings + are done in conjunction with sprints. The tools for automated test driven development and version control are implemented + on consortium documentation, reducing the gap between the consortium and the community in ways of working. 
+ +- Contribution from the community is partially funded through the process of "physical persons", entering the consortium + as individual partners, receiving funding for travel and accommodation during sprints. + +- Striking a balance between agile approaches within the OSS community of PyPy and the funded consortium structure of PyPy + is a constant challenge but a crucial one. The results from the first year of the project show important progress supporting + this effort. \ No newline at end of file From mwh at codespeak.net Mon Dec 5 00:24:28 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Mon, 5 Dec 2005 00:24:28 +0100 (CET) Subject: [pypy-svn] r20647 - pypy/dist/pypy/doc Message-ID: <20051204232428.4650527B4E@code1.codespeak.net> Author: mwh Date: Mon Dec 5 00:24:27 2005 New Revision: 20647 Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt Log: Hey, multiple spaces work now :) Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt ============================================================================== --- pypy/dist/pypy/doc/low-level-encapsulation.txt (original) +++ pypy/dist/pypy/doc/low-level-encapsulation.txt Mon Dec 5 00:24:27 2005 @@ -255,9 +255,12 @@ Multiple Interpreters - No experimental data available so far. We are working on removing a - minor technical restriction that prevents our translation toolchain - from handling this case. + A binary that allowed selection between two copies of the standard + object space with a command line switch was about 10% slower and + about 40% bigger larger than the default. Most of the extra size + is likely accounted for by the duplication of the large amount + prebuilt data involved in an instance of the standard object + space. 
Memory Management From mwh at codespeak.net Mon Dec 5 00:55:48 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Mon, 5 Dec 2005 00:55:48 +0100 (CET) Subject: [pypy-svn] r20648 - pypy/dist/pypy/doc Message-ID: <20051204235548.24D4427B4E@code1.codespeak.net> Author: mwh Date: Mon Dec 5 00:55:46 2005 New Revision: 20648 Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt Log: remove some not-english that escaped from my tired brain (thanks jacob!) Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt ============================================================================== --- pypy/dist/pypy/doc/low-level-encapsulation.txt (original) +++ pypy/dist/pypy/doc/low-level-encapsulation.txt Mon Dec 5 00:55:46 2005 @@ -257,8 +257,8 @@ A binary that allowed selection between two copies of the standard object space with a command line switch was about 10% slower and - about 40% bigger larger than the default. Most of the extra size - is likely accounted for by the duplication of the large amount + about 40% larger than the default. Most of the extra size is + likely accounted for by the duplication of the large amount prebuilt data involved in an instance of the standard object space. From cfbolz at codespeak.net Mon Dec 5 01:39:32 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Mon, 5 Dec 2005 01:39:32 +0100 (CET) Subject: [pypy-svn] r20649 - pypy/dist/pypy/doc/tool Message-ID: <20051205003932.3145A27B4E@code1.codespeak.net> Author: cfbolz Date: Mon Dec 5 01:39:31 2005 New Revision: 20649 Modified: pypy/dist/pypy/doc/tool/mydot.py Log: fix mydot: now it does properly handle dot-files that contain latex formulas. the problem is right now that the bounding box is a bit too big (which is a huge improvement already, because before it was too small). 
Modified: pypy/dist/pypy/doc/tool/mydot.py ============================================================================== --- pypy/dist/pypy/doc/tool/mydot.py (original) +++ pypy/dist/pypy/doc/tool/mydot.py Mon Dec 5 01:39:31 2005 @@ -52,22 +52,14 @@ temppath = py.test.ensuretemp("dot") tex, texcontent = create_tex_eps(dot, temppath) dvi = tex.new(ext="dvi") - output = dvi.new(ext="ps") + output = dvi.new(purebasename=dvi.purebasename + "out", ext="eps") oldpath = dot.dirpath() dvi.dirpath().chdir() py.process.cmdexec("latex %s" % (tex, )) - py.process.cmdexec("dvips -o %s %s" % (output, dvi)) + py.process.cmdexec("dvips -E -o %s %s" % (output, dvi)) oldpath.chdir() return output -def ps2eps(ps): - try: - py.process.cmdexec("ps2eps -l -f %s" % ps) - except: - try: - py.process.cmdexec("ps2epsi %s %s" % (psfile, eps)) - except: - raise OSError("neither ps2eps nor ps2epsi found") if __name__ == '__main__': import optparse @@ -77,22 +69,10 @@ options, args = parser.parse_args() if len(args) != 1: raise ValueError, "need exactly one argument" - psfile = process_dot(py.path.local(args[0])) - if options.format == "ps": - print psfile.read() - elif options.format == "eps": - ps2eps(psfile) - eps = psfile.new(ext="eps") - print eps.read() + epsfile = process_dot(py.path.local(args[0])) + if options.format == "ps" or options.format == "eps": + print epsfile.read() elif options.format == "png": - png = psfile.new(ext="png") - eps = psfile.new(ext="eps") - try: - ps2eps(psfile) - except: - #ok, no eps converter found - py.process.cmdexec("convert %s %s" % (psfile, png)) - else: - py.process.cmdexec("convert %s %s" % (eps, png)) + png = epsfile.new(ext="png") + py.process.cmdexec("convert %s %s" % (epsfile, png)) print png.read() - From mwh at codespeak.net Mon Dec 5 11:47:25 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Mon, 5 Dec 2005 11:47:25 +0100 (CET) Subject: [pypy-svn] r20656 - pypy/dist/pypy/doc Message-ID: 
<20051205104725.E9A7C27B71@code1.codespeak.net> Author: mwh Date: Mon Dec 5 11:47:24 2005 New Revision: 20656 Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt Log: small tweaks. Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt ============================================================================== --- pypy/dist/pypy/doc/low-level-encapsulation.txt (original) +++ pypy/dist/pypy/doc/low-level-encapsulation.txt Mon Dec 5 11:47:24 2005 @@ -11,7 +11,7 @@ It has always been a major goal of PyPy to not force implementation decisions. This means that even after the implementation of the -standard interpreter has been written we are still able to experiment +standard interpreter [#]_ has been written we are still able to experiment with different approaches to memory management or concurrency and to target wildly different platforms such as the Java Virtual Machine or a very memory-limited embedded environment. @@ -23,6 +23,10 @@ successfully encapsulated in more detail and contrast the potential of our approach with CPython. +.. [#] `standard interpreter`_ is our term for the code which + implements the Python language, i.e. the interpreter and the + standard object space. + Background ========== @@ -37,17 +41,17 @@ exercise. One solution would have been for the patches to become part of core -CPython but this was not done partly because the code that fully enabled -stackless required widespread modifications that made the code harder to -understand (as the "stackless" model contains control flow that is not -easily expressable in C, the implementation became much less -"natural" in some sense). +CPython but this was not done partly because the code that fully +enabled stackless required widespread modifications that made the code +harder to understand (as the "stackless" model contains control flow +that is not easily expressable in C, the implementation became much +less "natural" in some sense). 
With PyPy, however, it is possible to obtain this flexible control flow whilst retaining transparent implementation code as the necessary modifications can be implemented as a localized translation aspect, and indeed this was done at the Paris sprint in a couple of days (as -compared to XXX weeks for the original stackless patches). +compared to around six months for the original stackless patches). Of course, this is not the only aspect that can be so decided a posteriori, during translation. @@ -56,7 +60,7 @@ Translation aspects =================== -Our standard interpreter [#]_ is implemented at a very high level of +Our standard interpreter is implemented at a very high level of abstraction. This has a number of happy consequences, among which is enabling the encapsulation of language aspects as described in this document. For example, the implementation code simply makes no @@ -65,9 +69,6 @@ CPython where the decision to use reference counting is reflected tens or even hundreds of times in each C source file in the codebase. -.. [#] `standard interpreter`_ in this context means the code which - implements the interpreter and the standard object space. - As described in [ARCH]_, producing a Python implementation from the source of our standard interpreter involves various stages: the initialization code is run, the resulting code is annotated, typed and @@ -86,7 +87,7 @@ ------------- The stackless modifications are mostly implemented in the C backend, -with a single extra flow graph operations to influence some detail of +with a single extra flow graph operation to influence some details of the generated C code. The total changes only required about 300 lines of source, vindicating our abstract approach. @@ -98,12 +99,12 @@ re-entering the innermost (most recent) frame: all previous (older) frames can continue to live in the heap and be resumed only when their callees return. 
In this way, unlimited recursion is possible even on -OSes that limit the size of the C stack. Alternatively, a different -stack can be resumed, which implements software-switching (coroutines, -or green threads if scheduling is implicit). We reobtain in this way -all major benefits of the original "stackless" patches. More generally, -we are able to compile any RPython program into a C program that can -explicitly control its C stack. +operating systems that limit the size of the C stack. Alternatively, +a different stack can be resumed, which implements software-switching +(coroutines, or green threads if scheduling is implicit). We reobtain +in this way all major benefits of the original "stackless" patches. +More generally, we are able to compile any RPython program into a C +program that can explicitly control its C stack. This effect requires a number of changes in each and every C function that would be extremely tedious to write by hand: checking for the From arigo at codespeak.net Mon Dec 5 12:11:54 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Mon, 5 Dec 2005 12:11:54 +0100 (CET) Subject: [pypy-svn] r20660 - pypy/dist/pypy/doc Message-ID: <20051205111154.A5F3E27B5D@code1.codespeak.net> Author: arigo Date: Mon Dec 5 12:11:53 2005 New Revision: 20660 Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt Log: Completed the 'stacklessness' section. Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt ============================================================================== --- pypy/dist/pypy/doc/low-level-encapsulation.txt (original) +++ pypy/dist/pypy/doc/low-level-encapsulation.txt Mon Dec 5 12:11:53 2005 @@ -91,38 +91,35 @@ the generated C code. The total changes only required about 300 lines of source, vindicating our abstract approach. 
-In stackless mode, the C backend generates functions answering to a -special "unwind" exception: when this exception goes through a stack -frame, the function saves away its local variables into the heap. By -propagating the exception through the whole stack, the latter gets -entierely moved to the heap. The stack can then be resumed by just -re-entering the innermost (most recent) frame: all previous (older) -frames can continue to live in the heap and be resumed only when their -callees return. In this way, unlimited recursion is possible even on -operating systems that limit the size of the C stack. Alternatively, -a different stack can be resumed, which implements software-switching -(coroutines, or green threads if scheduling is implicit). We reobtain +In stackless mode, the C backend generates functions that are +systematically extended with a small amount of bookkeeping code. This +allows the C code to save its own stack to the heap on demand, where it +can then be inspected, manipulated and eventually resumed. This is +described in more detail in [TA]_. In this way, unlimited (or more +precisely heap-limited) recursion is possible, even on operating systems +that limit the size of the C stack. Alternatively, a different saved +stack can be resumed, thus implementing soft context switches - +coroutines, or green threads with an appropriate scheduler. We reobtain in this way all major benefits of the original "stackless" patches. -More generally, we are able to compile any RPython program into a C -program that can explicitly control its C stack. This effect requires a number of changes in each and every C function that would be extremely tedious to write by hand: checking for the -"unwind" exception, saving away precisely the currently active local -variables, and when re-entering the function check which variables are -being restored and which call site is resumed. There are moreover a -couple of global tables assisting the process. 
The key point is that we -can fine-tune all these interactions freely, without having to rewrite -the whole code all the time but only modifying the C backend. So far, -this allowed us to find a style that does not hinder the compiler -optimisations and so has only a minor impact on performance in the -non-exceptional case. - -XXX Start documenting it and link to it from here! +signal triggering the saving of the stack, actually saving precisely the +currently active local variables, and when re-entering the function +check which variables are being restored and which call site is resumed. +In addition, a couple of global tables must be maintained to drive the +process. The key point is that we can fine-tune all these interactions +freely, without having to rewrite the whole code all the time but only +modifying the C backend (in addition, of course, to being able to change +at any time the high-level code that is the input of the translation +process). So far, this allowed us to find a style that does not hinder +the compiler optimisations and so has only a minor impact on performance +in the normal case. Also note that the fact that the C stack can be fully saved into the -heap can tremendously simplify the life of garbage collection: after a -stack unwind, there are no stack roots left. +heap can tremendously simplify the portable implementation of garbage +collection: after the stack has been completely transfered to the heap, +there are no roots left on the stack. Multiple Interpreters @@ -314,6 +311,9 @@ .. [STK] `Stackless Python`_, a Python implementation that does not use the C stack, Christian Tismer, 1999-2004 +.. [TA] `Memory management and threading models as translation aspects`_, + PyPy documentation (and EU Deliverable D05.3), 2005 + .. _`standard interpreter`: architecture.html#standard-interpreter .. _`Architecture Overview`: architecture.html .. _`Coding Guide`: coding-guide.html @@ -321,3 +321,4 @@ .. _`Object Spaces`: objspace.html .. 
_`Stackless Python`: http://www.stackless.com .. _`Boehm-Demers-Weiser garbage collector`: http://www.hpl.hp.com/personal/Hans_Boehm/gc/ +.. _`Memory management and threading models as translation aspects`: translation-aspects.html From mwh at codespeak.net Mon Dec 5 12:16:40 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Mon, 5 Dec 2005 12:16:40 +0100 (CET) Subject: [pypy-svn] r20662 - pypy/dist/pypy/doc Message-ID: <20051205111640.994F927B8C@code1.codespeak.net> Author: mwh Date: Mon Dec 5 12:16:39 2005 New Revision: 20662 Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt Log: a couple more words. Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt ============================================================================== --- pypy/dist/pypy/doc/low-level-encapsulation.txt (original) +++ pypy/dist/pypy/doc/low-level-encapsulation.txt Mon Dec 5 12:16:39 2005 @@ -100,7 +100,7 @@ that limit the size of the C stack. Alternatively, a different saved stack can be resumed, thus implementing soft context switches - coroutines, or green threads with an appropriate scheduler. We reobtain -in this way all major benefits of the original "stackless" patches. +in this way all the major benefits of the original "stackless" patches. This effect requires a number of changes in each and every C function that would be extremely tedious to write by hand: checking for the @@ -113,8 +113,8 @@ modifying the C backend (in addition, of course, to being able to change at any time the high-level code that is the input of the translation process). So far, this allowed us to find a style that does not hinder -the compiler optimisations and so has only a minor impact on performance -in the normal case. +the optimisations performed by the C compiler and so has only a minor +impact on performance in the normal case. 
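The stackless transformation described in r20647-r20662 above, saving C activation records to the heap so they can be inspected and resumed, has a convenient small-scale analogue in ordinary Python: generator frames already live on the heap between resumptions. The sketch below is purely illustrative, not PyPy's mechanism; the `worker` and `run` names are invented. It shows how heap-saved frames plus a trivial round-robin scheduler give cooperative "green threads":

```python
# Illustrative analogy only: a Python generator keeps its frame state on
# the heap between resumptions, much as the stackless C backend saves
# activation records to the heap on demand.
from collections import deque

def worker(name, steps, log):
    for i in range(steps):
        log.append((name, i))
        yield  # suspend: the frame stays alive on the heap

def run(tasks):
    queue = deque(tasks)
    while queue:
        task = queue.popleft()
        try:
            next(task)         # resume the saved frame
        except StopIteration:
            continue           # task finished, drop it
        queue.append(task)     # reschedule at the back of the queue

log = []
run([worker("a", 2, log), worker("b", 2, log)])
# the two tasks are interleaved: a0, b0, a1, b1
```

Real green threads in the stackless model switch whole C stacks rather than single frames, but the scheduling structure is the same.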
Also note that the fact that the C stack can be fully saved into the heap can tremendously simplify the portable implementation of garbage From cfbolz at codespeak.net Mon Dec 5 12:21:45 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Mon, 5 Dec 2005 12:21:45 +0100 (CET) Subject: [pypy-svn] r20663 - pypy/dist/pypy/interpreter Message-ID: <20051205112145.0AA8627B71@code1.codespeak.net> Author: cfbolz Date: Mon Dec 5 12:21:44 2005 New Revision: 20663 Modified: pypy/dist/pypy/interpreter/interactive.py Log: now that we startup reasonably fast we can also remove one t from the startupttime Modified: pypy/dist/pypy/interpreter/interactive.py ============================================================================== --- pypy/dist/pypy/interpreter/interactive.py (original) +++ pypy/dist/pypy/interpreter/interactive.py Mon Dec 5 12:21:44 2005 @@ -130,7 +130,7 @@ w_sys = self.space.sys major, minor, micro, _, _ = self.space.unwrap(self.space.sys.get('pypy_version_info')) elapsed = time.time() - self.space._starttime - banner = "PyPy %d.%d.%d in %r on top of Python %s (startupttime: %.2f secs)" % ( + banner = "PyPy %d.%d.%d in %r on top of Python %s (startuptime: %.2f secs)" % ( major, minor, micro, self.space, sys.version.split()[0], elapsed) code.InteractiveConsole.interact(self, banner) From mwh at codespeak.net Mon Dec 5 12:26:24 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Mon, 5 Dec 2005 12:26:24 +0100 (CET) Subject: [pypy-svn] r20664 - pypy/dist/pypy/doc Message-ID: <20051205112624.576D527B71@code1.codespeak.net> Author: mwh Date: Mon Dec 5 12:26:19 2005 New Revision: 20664 Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt Log: small tweaks to the multiple interpreteter section. 
Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt ============================================================================== --- pypy/dist/pypy/doc/low-level-encapsulation.txt (original) +++ pypy/dist/pypy/doc/low-level-encapsulation.txt Mon Dec 5 12:26:19 2005 @@ -141,15 +141,15 @@ pointer to all Python objects would create. In PyPy, all of our implementation code manipulates an explicit object -space instance, as described in [CODG]_. Ideally, the situation of -multiple interpreters is thus handled automatically: if there is only -one space instance, it is regarded as a pre-constructed constant and the -space object pointer (though not its non-constant contents) disappears -from the produced source, i.e. both from function arguments and local -variables and from instance fields. If there are two or more such -instances, a 'space' attribute will be automatically added to all -application objects (or more precisely, it will not be removed by the -translation process), the best of both worlds. +space instance, as described in [CODG]_. The situation of multiple +interpreters is thus handled automatically: if there is only one space +instance, it is regarded as a pre-constructed constant and the space +object pointer (though not its non-constant contents) disappears from +the produced source, i.e. from function arguments, local variables and +instance fields. If there are two or more such instances, a 'space' +attribute will be automatically added to all application objects (or +more precisely, it will not be removed by the translation process), the +best of both worlds. 
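The "explicit object space" style discussed in this hunk can be reduced to a toy model. The `Space`, `wrap` and `interpret_add` names below are invented for illustration and are far simpler than PyPy's real interfaces; the point is only that interpreter code touches values exclusively through a space instance, so two interpreters are just two instances, while a translator that sees a single global instance can treat it as a pre-built constant:

```python
# Toy model of the explicit-object-space style (not PyPy's actual API).
class Space:
    def wrap(self, value):
        # a "wrapped" object remembers which space owns it
        return (self, value)

    def add(self, w_a, w_b):
        return self.wrap(w_a[1] + w_b[1])

def interpret_add(space, a, b):
    # interpreter code never manipulates values directly, only via the space
    return space.add(space.wrap(a), space.wrap(b))

s1, s2 = Space(), Space()
r1 = interpret_add(s1, 2, 3)
r2 = interpret_add(s2, 2, 3)
assert r1[1] == 5 and r2[1] == 5
assert r1[0] is s1 and r2[0] is s2   # results belong to distinct spaces
```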
Concurrency From mwh at codespeak.net Mon Dec 5 12:34:51 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Mon, 5 Dec 2005 12:34:51 +0100 (CET) Subject: [pypy-svn] r20666 - pypy/dist/pypy/doc Message-ID: <20051205113451.D629027B8A@code1.codespeak.net> Author: mwh Date: Mon Dec 5 12:34:51 2005 New Revision: 20666 Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt Log: move section on concurrency after that on memory management. tweaks to memory management section. Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt ============================================================================== --- pypy/dist/pypy/doc/low-level-encapsulation.txt (original) +++ pypy/dist/pypy/doc/low-level-encapsulation.txt Mon Dec 5 12:34:51 2005 @@ -152,6 +152,31 @@ best of both worlds. +Memory Management +----------------- + +As mentioned above, CPython's decision to use a garbage collector based +on reference counting is reflected throughout its source. In the +implementation code of PyPy, it is not, and in fact the standard +interpreter can currently be compiled to use a reference counted scheme +or the Boehm GC [BOEHM]_. Again, more details are in [TA]_. We also +have an experimental framework for developing custom exact GCs [GC]_, +but it is not yet integrated with the low-level translation back-ends. + +Another advantage of the aspect oriented approach shows itself most +clearly with this memory management aspect: that of correctness. +Although reference counting is a fairly simple scheme, writing code for +CPython requires that the programmer make a large number of +not-quite-trivial decisions about the refcounting code. Experience +suggests that mistakes will always creep in, leading to crashes or +leaks. While tools exist to help find these mistakes, it is surely +better to not write the reference count manipulations at all and this is +what PyPy's approach allows. 
Writing the code that emits the correct +reference count manipulations is surely harder than writing any one +piece of explicit refcounting code, but once it is done and tested, it +just works without further effort. + + Concurrency ----------- @@ -182,34 +207,6 @@ example, various forms of object-level locking. -Memory Management ------------------ - -A final low-level aspect is that of memory management. As mentioned -above, CPython's decision to use a garbage collector based on -reference counting is reflected throughout the source. In the -implementation code of PyPy, it is not, and in fact the standard -interpreter can currently be compiled to use a reference counted -scheme or the Boehm GC [BOEHM]_. We also have an experimental -framework for developing custom exact GCs [GC]_, but it is not yet -integrated with the low-level translation back-ends. - -XXX Start documentation and link to it from here. - -Another advantage of the aspect oriented approach shows itself most -clearly with this memory management aspect: that of correctness. -Although reference counting is a fairly simple scheme, writing code -for CPython requires that the programmer make a large number of -not-quite-trivial decisions about the refcounting code and experience -suggests that mistakes will always creep in, leading to crashes or -leaks. While tools exist to help find these mistakes, it is surely -better to not write the reference count manipulations at all and this -is what PyPy's approach allows. Writing the code that emits the -correct reference count manipulations is surely harder than writing -any one piece of explicit refcounting code, but once it's done and -tested, it just works without further effort. 
- - Evaluation Strategy ------------------- From arigo at codespeak.net Mon Dec 5 12:37:33 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Mon, 5 Dec 2005 12:37:33 +0100 (CET) Subject: [pypy-svn] r20668 - pypy/dist/pypy/doc Message-ID: <20051205113733.CC0DF27B8A@code1.codespeak.net> Author: arigo Date: Mon Dec 5 12:37:32 2005 New Revision: 20668 Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt Log: Add a reference. Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt ============================================================================== --- pypy/dist/pypy/doc/low-level-encapsulation.txt (original) +++ pypy/dist/pypy/doc/low-level-encapsulation.txt Mon Dec 5 12:37:32 2005 @@ -192,11 +192,10 @@ PyPy will offer the opportunity to experiment with different models, although currently we only offer a version with no thread support and -another with a GIL-like model. (We also plan to support soon "green" -software-only threads in the Stackless model described above, but -obviously this would not solve the multi-processor scalability issue.) - -XXX Start documentation and link to it from here. +another with a GIL-like model [TA]_. (We also plan to support soon +"green" software-only threads in the Stackless model described above, +but obviously this would not solve the multi-processor scalability +issue.) The future work in this direction is to collect the numerous possible approaches that have between thought out along the years and From arigo at codespeak.net Mon Dec 5 12:45:44 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Mon, 5 Dec 2005 12:45:44 +0100 (CET) Subject: [pypy-svn] r20669 - pypy/dist/pypy/doc Message-ID: <20051205114544.AA1CC27BF8@code1.codespeak.net> Author: arigo Date: Mon Dec 5 12:45:42 2005 New Revision: 20669 Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt Log: Expand a bit on evaluation strategy. 
Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt ============================================================================== --- pypy/dist/pypy/doc/low-level-encapsulation.txt (original) +++ pypy/dist/pypy/doc/low-level-encapsulation.txt Mon Dec 5 12:45:42 2005 @@ -211,12 +211,17 @@ Possibly the most radical aspect to tinker with is the evaluation strategy. The thunk object space [OBJS]_ wraps the standard object -space to allow the production of "lazily computed objects", objects -whose values are only calculated when needed, and to allow the global -and total replacement of one object with another. The thunk object -space is mostly meant as an example of what our approach can acheive -- -the combination of side-effects and lazy evaluation is not easy to -understand. +space to allow the production of "lazily computed objects", i.e. objects +whose values are only calculated when needed. It also allows global and +total replacement of one object with another. + +The thunk object space is mostly meant as an example of what our +approach can acheive -- the combination of side-effects and lazy +evaluation is not easy to understand. This demonstration is important +because this level of flexibility will be required to implement future +features along the lines of Prolog-style logic variables, transparent +persistency, object distribution across several machines, or +object-level security. Experimental results From arigo at codespeak.net Mon Dec 5 12:54:24 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Mon, 5 Dec 2005 12:54:24 +0100 (CET) Subject: [pypy-svn] r20670 - pypy/dist/pypy/doc Message-ID: <20051205115424.5404627BF6@code1.codespeak.net> Author: arigo Date: Mon Dec 5 12:54:22 2005 New Revision: 20670 Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt Log: Minor tweaks in Experimental Results. 
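The "lazily computed objects" of the thunk object space follow the classic thunk pattern. A minimal free-standing sketch (this `Thunk` class is invented for illustration; PyPy's thunk space wraps every operation of the standard object space rather than exposing a single `force` method):

```python
# A thunk wraps a zero-argument function and only runs it the first time
# the value is needed, caching the result afterwards.
class Thunk:
    _UNSET = object()

    def __init__(self, compute):
        self._compute = compute
        self._value = self._UNSET

    def force(self):
        if self._value is self._UNSET:
            self._value = self._compute()   # computed at most once
        return self._value

calls = []
t = Thunk(lambda: calls.append("computed") or 42)
assert calls == []                # nothing computed yet
assert t.force() == 42
assert t.force() == 42
assert calls == ["computed"]      # the side effect happened exactly once
```

The caching of the side effect is exactly why, as the text says, combining side-effects with lazy evaluation is not easy to understand: the effect happens at first use, not at definition.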
Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt ============================================================================== --- pypy/dist/pypy/doc/low-level-encapsulation.txt (original) +++ pypy/dist/pypy/doc/low-level-encapsulation.txt Mon Dec 5 12:54:22 2005 @@ -245,19 +245,21 @@ Stacklessness - Producing Stackless-style C code currently means that all the - functions of the PyPy interpreter use the new style. The current - performance impact is to make PyPy slower by about 10%. A couple of - minor pending optimisations could reduce this figure a bit. We - expect the rest of the performance impact to be mainly caused by the - increase of size of the generated executable (+20%). + Producing Stackless-style C code means that all the functions of the + PyPy interpreter that can be involved in recursions contain stack + bookkeeping code (leaf functions, functions calling only leaves, + etc. do not need to use this style). The current performance impact + is to make PyPy slower by about 10%. A couple of minor pending + optimisations could reduce this figure a bit. We expect the rest of + the performance impact to be mainly caused by the increase of size + of the generated executable (+20%). Multiple Interpreters A binary that allowed selection between two copies of the standard object space with a command line switch was about 10% slower and about 40% larger than the default. Most of the extra size is - likely accounted for by the duplication of the large amount + likely accounted for by the duplication of the large amount of prebuilt data involved in an instance of the standard object space. @@ -267,7 +269,7 @@ comparison, using reference counting instead makes the interpreter twice as slow. 
This is almost certainly due to the naive approach to reference counting used so far, which updates the counter far - more often than theoretically necessary; we also still have a lot of + more often than strictly necessary; we also still have a lot of objects that would theoretically not need a reference counter, either because they are short-lived or because we can prove that they are "owned" by another object and can share its lifetime. In From mwh at codespeak.net Mon Dec 5 13:19:06 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Mon, 5 Dec 2005 13:19:06 +0100 (CET) Subject: [pypy-svn] r20672 - pypy/dist/pypy/doc Message-ID: <20051205121906.791FB27BB3@code1.codespeak.net> Author: mwh Date: Mon Dec 5 13:19:05 2005 New Revision: 20672 Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt Log: Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt ============================================================================== --- pypy/dist/pypy/doc/low-level-encapsulation.txt (original) +++ pypy/dist/pypy/doc/low-level-encapsulation.txt Mon Dec 5 13:19:05 2005 @@ -82,6 +82,10 @@ remainder of this section describes a few aspects that we have successfully enscapsulated. +An advantage of our approach is that any combination of aspects can be +freely selected, avoiding the problem of combinatorial explosion of +variants seen in manually written interpreters. + Stacklessness ------------- @@ -288,6 +292,13 @@ performance impact of 5%. The executable is 12% bigger (probably due to the arguably excessive inlining we perform). +We have described five aspects in this document, each currently with +two implementation choices, leading to 32 possible translations. We +have not measured the performance of each variant, but the few we have +tried suggests that the performance impacts are what one would expect, +e.g. a translated stackless binary using the thunk object space is +about 1.05 x 1.10 = 1.16 times slower than the default. 
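The arithmetic in the last hunk assumes that the overheads of independently selected aspects compose multiplicatively: 1.05 x 1.10 = 1.155, quoted as "about 1.16" after rounding. A short sketch makes the assumption explicit (the factor values come from the text; the `combined_slowdown` helper is invented for illustration):

```python
# Multiplicative composition of per-aspect slowdowns, as assumed above.
from math import prod

factors = {"stackless": 1.10, "thunk": 1.05}   # values quoted in the document

def combined_slowdown(selected):
    return prod(factors[name] for name in selected)

slowdown = combined_slowdown(["stackless", "thunk"])
assert abs(slowdown - 1.155) < 1e-9   # reported as "about 1.16" once rounded
```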
+ Conclusion ========== From mwh at codespeak.net Mon Dec 5 13:29:04 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Mon, 5 Dec 2005 13:29:04 +0100 (CET) Subject: [pypy-svn] r20673 - pypy/dist/pypy/doc Message-ID: <20051205122904.E09EE27D78@code1.codespeak.net> Author: mwh Date: Mon Dec 5 13:29:02 2005 New Revision: 20673 Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt Log: add a few more words to the conclusion. Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt ============================================================================== --- pypy/dist/pypy/doc/low-level-encapsulation.txt (original) +++ pypy/dist/pypy/doc/low-level-encapsulation.txt Mon Dec 5 13:29:02 2005 @@ -305,7 +305,11 @@ Although still a work in progress, we believe that the successes we have had in enscapsulating implementation aspects justifies the -approach we have taken. +approach we have taken. In particular, the relative ease of +implementing the translation aspects described in this paper -- as +mentioned above, the stackless modifications took only a few days -- +means we are confident that it will be easily possible to encapsulate +implementation aspects we have not yet considered. 
References From bea at codespeak.net Mon Dec 5 13:30:41 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Mon, 5 Dec 2005 13:30:41 +0100 (CET) Subject: [pypy-svn] r20674 - pypy/extradoc/sprintinfo Message-ID: <20051205123041.0944327D78@code1.codespeak.net> Author: bea Date: Mon Dec 5 13:30:39 2005 New Revision: 20674 Modified: pypy/extradoc/sprintinfo/LeysinReport.txt Log: just clarified time and participants Modified: pypy/extradoc/sprintinfo/LeysinReport.txt ============================================================================== --- pypy/extradoc/sprintinfo/LeysinReport.txt (original) +++ pypy/extradoc/sprintinfo/LeysinReport.txt Mon Dec 5 13:30:39 2005 @@ -1,3 +1,20 @@ +Date: 2005-01-24-2005-01-30, Leysin Switzerland +Participants: +Adrien Di Mascio, +Anders Chrigström, +Armin Rigo, +Christian Tismer, +Holger Krekel, +Jacek Generowicz, +Ludovic Aubry, +Olivier Dormond, +Samuele Pedroni, +Laura Creighton +Jacob Hallén +Beatrice Düring +Marcus Denker +Michael Hudson Hello, From bea at codespeak.net Mon Dec 5 13:34:19 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Mon, 5 Dec 2005 13:34:19 +0100 (CET) Subject: [pypy-svn] r20676 - pypy/extradoc/sprintinfo Message-ID: <20051205123419.C263127D84@code1.codespeak.net> Author: bea Date: Mon Dec 5 13:34:18 2005 New Revision: 20676 Modified: pypy/extradoc/sprintinfo/pycon_sprint_report.txt Log: just clarified Modified: pypy/extradoc/sprintinfo/pycon_sprint_report.txt ============================================================================== --- pypy/extradoc/sprintinfo/pycon_sprint_report.txt (original) +++ pypy/extradoc/sprintinfo/pycon_sprint_report.txt Mon Dec 5 13:34:18 2005 @@ -1,3 +1,20 @@ +Time: 2005-03-19-2005-03-22 +Participants: +Michael Chermside +Anders Chrigström +Brian Dorsey +Richard Emslie +Jacob Hallén +Holger Krekel +Alex Martelli +Alan Mcintyre +Lutz Pälike +Samuele Pedroni +Jonathan Riehl +Armin Rigo +Christian Tismer + + Pypy sprint and conference report The Pypy
project held a 4 day sprint at the Marvin Center, George From arigo at codespeak.net Mon Dec 5 13:39:22 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Mon, 5 Dec 2005 13:39:22 +0100 (CET) Subject: [pypy-svn] r20677 - pypy/dist/pypy/doc/image Message-ID: <20051205123922.BF8C727D7A@code1.codespeak.net> Author: arigo Date: Mon Dec 5 13:39:21 2005 New Revision: 20677 Removed: pypy/dist/pypy/doc/image/lattice1.png pypy/dist/pypy/doc/image/lattice2.png pypy/dist/pypy/doc/image/lattice3.png Log: Remove these images because they are generated automatically, and everybody gets slightly different files (with slightly different fonts). From tismer at codespeak.net Mon Dec 5 13:44:33 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Mon, 5 Dec 2005 13:44:33 +0100 (CET) Subject: [pypy-svn] r20678 - pypy/dist/pypy/doc Message-ID: <20051205124433.7315527D7A@code1.codespeak.net> Author: tismer Date: Mon Dec 5 13:44:32 2005 New Revision: 20678 Modified: pypy/dist/pypy/doc/objspace.txt Log: some wording, removed "I"-style, etc. Modified: pypy/dist/pypy/doc/objspace.txt ============================================================================== --- pypy/dist/pypy/doc/objspace.txt (original) +++ pypy/dist/pypy/doc/objspace.txt Mon Dec 5 13:44:32 2005 @@ -236,7 +236,7 @@ In our case it is a later optimization that we could make. We just don't want to make it now (and certainly not hard-coded at this level -- it could be introduced by the code generators at translation time). So in summary: wrapping integers as instances is the simple path, while using plain integers instead is the complex path, not the other way around. -Note that the Standard Object Space implementation uses MultiMethod_ dispatch instead of the complex rules of "Object/abstract.c". I think that this can be translated to a different low-level dispatch implementation that would be binary compatible with CPython's (basically the PyTypeObject structure and its function pointers). 
If compatibility is not required it will be more straightforwardly converted into some efficient multimethod code. +Note that the Standard Object Space implementation uses MultiMethod_ dispatch instead of the complex rules of "Object/abstract.c". This can probably be translated to a different low-level dispatch implementation that would be binary compatible with CPython's (basically the PyTypeObject structure and its function pointers). If compatibility is not required it will be more straightforwardly converted into some efficient multimethod code. .. _StdObjSpace: http://codespeak.net/svn/pypy/dist/pypy/objspace/std/ .. _MultiMethod: theory.html#multimethods From arigo at codespeak.net Mon Dec 5 13:46:17 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Mon, 5 Dec 2005 13:46:17 +0100 (CET) Subject: [pypy-svn] r20680 - in pypy/dist/pypy/doc: . discussion image statistic weekly Message-ID: <20051205124617.28F9127D7A@code1.codespeak.net> Author: arigo Date: Mon Dec 5 13:46:09 2005 New Revision: 20680 Removed: pypy/dist/pypy/doc/image/stackless_informal.png Modified: pypy/dist/pypy/doc/discussion/cmd-prompt-translation.txt (props changed) pypy/dist/pypy/doc/image/ (props changed) pypy/dist/pypy/doc/statistic/ (props changed) pypy/dist/pypy/doc/statistic/conftest.py (props changed) pypy/dist/pypy/doc/statistic/format.py (props changed) pypy/dist/pypy/doc/statistic/loc.txt (props changed) pypy/dist/pypy/doc/statistic/number_files.txt (props changed) pypy/dist/pypy/doc/statistic/post.txt (props changed) pypy/dist/pypy/doc/statistic/statistic_irc_log.txt (props changed) pypy/dist/pypy/doc/statistic/subscribers.txt (props changed) pypy/dist/pypy/doc/statistic/webaccess.txt (props changed) pypy/dist/pypy/doc/translation-aspects.txt (contents, props changed) pypy/dist/pypy/doc/weekly/summary-2005-11-11.txt (props changed) pypy/dist/pypy/doc/weekly/summary-2005-11-18.txt (props changed) pypy/dist/pypy/doc/weekly/summary-2005-11-25.txt (props changed) 
pypy/dist/pypy/doc/weekly/summary-2005-12-02.txt (props changed) Log: * fixeol, and adding a few svn:ignore manually. * turned an ..image directive into a ..graphviz one. Modified: pypy/dist/pypy/doc/translation-aspects.txt ============================================================================== --- pypy/dist/pypy/doc/translation-aspects.txt (original) +++ pypy/dist/pypy/doc/translation-aspects.txt Mon Dec 5 13:46:09 2005 @@ -366,7 +366,7 @@ the non-exceptional case. Most optimisations performed by C compilers, like register allocation, continue to work... -.. image:: image/stackless_informal.png +.. graphviz:: image/stackless_informal.dot Open Challenges From cfbolz at codespeak.net Mon Dec 5 13:53:53 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Mon, 5 Dec 2005 13:53:53 +0100 (CET) Subject: [pypy-svn] r20681 - pypy/dist/pypy/doc Message-ID: <20051205125353.5FB6327D8E@code1.codespeak.net> Author: cfbolz Date: Mon Dec 5 13:53:52 2005 New Revision: 20681 Modified: pypy/dist/pypy/doc/translation-aspects.txt Log: (pedronis, cfbolz): * described low level object model in more detail * clarified paragraph about id hashes * added paragraph about cashed pbc arguments * added paragraph about fixed sized lists/tagged pointers * clarified GIL section * added open challenges about garbage collection Modified: pypy/dist/pypy/doc/translation-aspects.txt ============================================================================== --- pypy/dist/pypy/doc/translation-aspects.txt (original) +++ pypy/dist/pypy/doc/translation-aspects.txt Mon Dec 5 13:53:52 2005 @@ -36,12 +36,15 @@ XXX proper references to translation.txt and dynamic-language-translation One important part of the translation process is *rtyping*. Before that step -all objects in our flow graphs still represent regular python objects. During -rtyping they are transformed into objects that match the model of the specific -target platform. 
For C this model consists of a set of C-like types like structures, -arrays and functions in addition to primitive types (integers, characters, -floating point numbers). This multi-stage approach gives a lot of flexibility -how a certain object is represented at C level. +all objects in our flow graphs are annotated with types on the level of the +RPython type system which is still quite high-level and target-independent. +During rtyping they are transformed into objects that match the model of the +specific target platform. For C or C-like targets this model consists of a set +of C-like types like structures, arrays and functions in addition to primitive +types (integers, characters, floating point numbers). This multi-stage approach +gives a lot of flexibility how a certain object is represented on the target +level. The RPython process can decide what representation to use based on the +type annotation and on the context and usages of the object. In the following the structures used to represent user classes are described. There is one "vtable" per user class, with the following structure: A root @@ -64,6 +67,10 @@ ... // extra class attributes } +The extra class attributes usually contain function pointers to the methods +of that class. In addition the class attributes (which are +supported by the RPython object model) are stored there. + The type of the instances is:: struct object { // for the root class @@ -75,8 +82,10 @@ ... // extra instance attributes } +The extra instance attributes are all the attributes of an instance. -XXX low level object model, data structures current layouts +These structure layouts are quite similar to how classes are usually +implemented in C++. Subclass checking ----------------- @@ -90,23 +99,27 @@ during rtyping and inserting the necessary fields into the class structure. It would be similarly easy to switch to another implementation. 
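[Editorial sketch] The relative numbering technique for subclass checks described above can be pictured in a few lines of Python. This is an illustrative model only, not PyPy's rtyper code, and the helper names (`number_classes`, `is_subclass`) are invented for this sketch: each class receives an id from a depth-first walk of the class tree, together with the largest id occurring in its subtree, so a subclass test reduces to a single range comparison.

```python
# Illustrative model of "relative numbering" for constant-time subclass
# checks (invented helper names; not PyPy's actual implementation).
# Each class gets (my_id, max_subtree_id) from a depth-first walk;
# "cls is a subclass of other" becomes a range check on ids.

def number_classes(root, subclasses):
    """Assign (min_id, max_id) pairs to root and all its subclasses."""
    ids = {}
    def walk(cls, counter):
        my_id = counter
        counter += 1
        for sub in subclasses.get(cls, []):
            counter = walk(sub, counter)
        ids[cls] = (my_id, counter - 1)   # ids covering cls and its subtree
        return counter
    walk(root, 0)
    return ids

def is_subclass(ids, cls, of):
    cls_min, _ = ids[cls]
    of_min, of_max = ids[of]
    return of_min <= cls_min <= of_max

# A small hierarchy: object -> (A -> (B, C), D)
subclasses = {'object': ['A', 'D'], 'A': ['B', 'C']}
ids = number_classes('object', subclasses)
print(is_subclass(ids, 'B', 'A'))   # True
print(is_subclass(ids, 'D', 'A'))   # False
```

Because a class's subtree occupies a contiguous id range, the check costs two comparisons regardless of hierarchy depth, which is what makes it preferable to walking the superclass chain.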
+XXX reference to the paper + ID hashes --------- -XXX motivate the presence of this example +In the RPython type system class instances can be used as dictionary-keys using +a default hash implementation based on identity which in practise is +implemented using the memory address. This is similar to how standard Python +behaves if no user-defined hash function is present. The annotator keeps track +for which classes this hashing is ever used. One of the peculiarities of PyPy's approach is that live objects are analyzed by our translation toolchain. This leads to the presence of instances of user classes that were built before the translation started. These are called -prebuilt-constants (PBCs for short). During rtyping these instances have to be +prebuilt-constants (PBCs for short). During rtyping, these instances have to be converted to the low level model. One of the problems with doing this is that -the standard hash implementation of Python is take the id of an object, which -is just the memory address. If a PBC is stored in a dictionary the memory -address of the original object is used as a key. If the dictionary is also -built before the translation process starts then the conversion of the -dictionary together with its keys is problematic: after the conversion the PBCs -that are used as keys have a different memory addresses and are therefore no -longer found in the dictionary. +the standard hash implementation of Python is to take the id of an object, which +is just the memory address. If the RPython program explicitely stores the hashes +of PBCS somewhere (for example in the implementation of a data structure) then +the stored hash value would not match the value of the object's address after +translation anymore. To prevent this the following strategy is used: for every class with instances that are hashed somewhere in the program (either when storing them in a @@ -118,24 +131,33 @@ return value of the hash function is the content of the field. 
This means that instances of such a class that are converted PBCs retain the hash values they had before the conversion whereas new objects of the class have their memory -address as hash values. Therefor the structural integrity of dictionaries with -PBC keys is conserved during conversion. This might of course lead to hash -collisions but in practice these should be rare. +address as hash values. Cached functions with PBC arguments ------------------------------------ -XXX -(also a word on the -example of cached functions with PBC argument?) - - -Changing the representation of an object completely ---------------------------------------------------- - -XXX -probably describe in more detail the possibilies to completely change the -representation of objects, etc. +As explained in (XXX reference to dynamicblabla) the annotated code can contain +functions from a finite set of PBCs to something else. The set itself has to be +finite but its content does not need to be provided explictly but is discovered +as the annotation of the input argument by the annotator itself. This kind of +function is translated by recording the input-result relationship by calling +the function concretely at annotation time, and adding a field to the PBCs in +the set and emitting code reading that field instead of the function call. + +Changing the representation of an object +---------------------------------------- + +One example of the flexibility the RTyper provides is how we deal with lists. +Based on information gathered by the annotator the RTyper chooses between two +different list implementations. If a list never changes its size after creation +a low-level array is used directly. For lists which get resized a +representation consisting of a structure with a pointer to an array is used and +overallocation is performed. + +We plan to use similar techniques to use tagged pointers instead of box-classes +to represent builtin types of the PyPy-interpreter such as integers. 
This would +require attaching explicit hints to the involved classes. Field acces would +then be translated to the corresponging masking operations. Automatic Memory Management Implementations @@ -226,8 +248,6 @@ simple approach which does not track objects accros function boundaries only works well in the presence of function inlining. -XXX - A general garbage collection framework -------------------------------------- @@ -266,11 +286,12 @@ tackled in phase 2 as generating assembly directly is needed anyway for a just-in-time compiler. The other possibility (which would be much easier to implement) is to move all the data away from the stack to the heap, as -described below in section XXXXX. +described below in section (XXX reference). Threading Model Implementations ============================================ + XXX nice introductory paragraph No threading @@ -287,16 +308,17 @@ At the moment there is one non-trivial threading model implemented. It follows the threading implementation of CPython and thus uses a global interpreter lock. This lock prevents that any two threads can interpret python code at any -time. The global interpreter lock is released around calls to blocking I/O functions. -This approach has a number of advantages: it gives very little runtime penalty -for single-threaded applications, makes many of the common uses for threading -possible and is relatively easy to implement and maintain. It has the -disadvantages that multiple threads cannot be distributed accros multiple -proccessors (XXX is this really a major point? it is repeated very often which -of course does not make it true). +time. The global interpreter lock is released around calls to blocking I/O +functions. This approach has a number of advantages: it gives very little +runtime penalty for single-threaded applications, makes many of the common uses +for threading possible and is relatively easy to implement and maintain. 
It has +the disadvantages that multiple threads cannot be distributed accros multiple +proccessors. + +To make this threading-model useable for I/O-bound applications the global +intepreter lock should be released around blocking external function calls +(which is also what CPython does). This has not been fully implemented yet. -XXX GIL release around system calls (how do we mention that here? as if it was -already implemented?) Stackless C code ----------------- @@ -369,21 +391,42 @@ .. graphviz:: image/stackless_informal.dot -Open Challenges +Future work ================ -XXX - open challenges for phase 2: - - more clever incref/decref policy, circularity detector - - more sophisticated structure inlining ? possibly - - full GC hooks? (we have started a framework for GC construction, only simulated for now) - - exact GC needs -- control over low-level machine code - - Finalization and weak references +Garbage collection +------------------ + +One of the biggest missing features of our current garbage collectors is +missing finalization. Right now finalizers are not invoked if an object is +freed by the garbage collector. Along the same lines weak references are not +supported yet. It should be possible to implement these with a reasonable +amount of effort for reference counting as well as the Boehm collector (which +provides the necessary hooks). + +Integrating the now only simulated GC framework into the rtyping process and +the code generation will require considerable effort. It requires to be able to +keep track of the GC roots which is hard to do with portable C code. One +solution would be to use stackless since it moves the stack completely to the +heap. We expect that we can insert GC barriers as function calls into the +graphs and rely on inlining to make them less inefficient. + +We may also spent some time on improving the existing reference counting +implementation by removing unnecessary incref-decref pairs. 
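[Editorial sketch] The removal of unnecessary incref-decref pairs mentioned above can be pictured as a peephole pass over a toy instruction stream. This is a simplified illustration under invented names (`remove_incref_decref_pairs`, tuple-encoded ops), not PyPy's actual code; it only cancels an incref immediately followed by a decref of the same variable, which is trivially safe because nothing can observe the count in between.

```python
# Toy peephole pass removing redundant incref/decref pairs (invented
# names and op format; not PyPy code).  Ops are (opname, variable)
# tuples; an "incref" directly followed by a "decref" of the same
# variable has no observable effect and is dropped.

def remove_incref_decref_pairs(ops):
    result = []
    for op in ops:
        if (op[0] == 'decref' and result
                and result[-1] == ('incref', op[1])):
            result.pop()          # cancel the matching incref
        else:
            result.append(op)
    return result

ops = [('incref', 'x'), ('decref', 'x'),
       ('incref', 'y'), ('use', 'y'), ('decref', 'y')]
print(remove_incref_decref_pairs(ops))
# [('incref', 'y'), ('use', 'y'), ('decref', 'y')]
```

The pair around 'y' survives because a use intervenes; a real pass would also reason across basic blocks and aliases, which is where most of the difficulty lies.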
A bigger task would +be to add support for detecing circular references. + +Threading model +--------------- + - green threads? - threading model with various granularities of locking +Object model +------------ + - more sophisticated structure inlining ? possibly + Conclusion =========== From bea at codespeak.net Mon Dec 5 15:45:26 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Mon, 5 Dec 2005 15:45:26 +0100 (CET) Subject: [pypy-svn] r20687 - pypy/extradoc/talk/pypy_euworkshop_2005-12-08 Message-ID: <20051205144526.5961C27D77@code1.codespeak.net> Author: bea Date: Mon Dec 5 15:45:25 2005 New Revision: 20687 Added: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/holger_part2_OSSdev.txt Log: My suggested structure for Holgers part of the talk. Alastair - please dont use this yet since Holger hasnt even seen it - he will submit stuff later this evening.... Added: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/holger_part2_OSSdev.txt ============================================================================== --- (empty file) +++ pypy/extradoc/talk/pypy_euworkshop_2005-12-08/holger_part2_OSSdev.txt Mon Dec 5 15:45:25 2005 @@ -0,0 +1,42 @@ +SEMINAR +Best Practice in the Use and Development of Free and Open Source Software +3. Case study: + +Part 2/Holger: + +Slides: + +1. What makes Open Source communities like Python work: the people factor + + - collaborative + - communication + - transparent + - organization (decision making) + +2. What makes Open Source communities like Python work: the technical framework + + - version control (Subversion) + - automated test driven development + - releases + + +3. Typical aspects of the Python community? + + - lively community + - lots of different python implementation projects + - good interaction between the projects + - different from other OSS communities? + +4.
PyPy: the vision + + - grew out of the Python community + - started through agile practices and evolved via them + - what is PyPy (short high level explanation - configurable "interpreter") + + +5. OSS and EU funding: PyPy as a case study + + - why a fusion between an oss community and EU? (PyPyagenda) + - why fund a OSS community (EUagenda) + - impact so far + From cfbolz at codespeak.net Mon Dec 5 16:03:28 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Mon, 5 Dec 2005 16:03:28 +0100 (CET) Subject: [pypy-svn] r20690 - pypy/dist/pypy/doc Message-ID: <20051205150328.429CE27B5D@code1.codespeak.net> Author: cfbolz Date: Mon Dec 5 16:03:27 2005 New Revision: 20690 Modified: pypy/dist/pypy/doc/translation-aspects.txt Log: (pedronis, cfbolz): added future work in the threading and object model section Modified: pypy/dist/pypy/doc/translation-aspects.txt ============================================================================== --- pypy/dist/pypy/doc/translation-aspects.txt (original) +++ pypy/dist/pypy/doc/translation-aspects.txt Mon Dec 5 16:03:27 2005 @@ -417,16 +417,27 @@ implementation by removing unnecessary incref-decref pairs. A bigger task would be to add support for detecing circular references. + Threading model --------------- - - green threads? - - threading model with various granularities of locking +One of the interesting possibities that stackless offers is to implement green +threading. This would involve writing a scheduler and preemption logic. + +We should also investigate other threading models based on operating system +threads with various granularities of locking for access of shared access. Object model ------------ - - more sophisticated structure inlining ? possibly +We also might want to experiment with more sophisticated structure inlining. 
+That means identifying a field in a structure A that points to another object B +on the heap in such a way, that the pointer in A gets assigned only once to and +that no other pointer to B exists from a heap object. If this is the case the +object B can be inlined into the A since B lives exactly as long as A. + +As noted above, another plan is to implement builtin application level objects +by using tagged pointer. Conclusion =========== From cfbolz at codespeak.net Mon Dec 5 17:23:24 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Mon, 5 Dec 2005 17:23:24 +0100 (CET) Subject: [pypy-svn] r20704 - pypy/dist/pypy/doc Message-ID: <20051205162324.0EF1A27B4D@code1.codespeak.net> Author: cfbolz Date: Mon Dec 5 17:23:23 2005 New Revision: 20704 Modified: pypy/dist/pypy/doc/translation-aspects.txt Log: add a very feeble not very nice concluding paragraph Modified: pypy/dist/pypy/doc/translation-aspects.txt ============================================================================== --- pypy/dist/pypy/doc/translation-aspects.txt (original) +++ pypy/dist/pypy/doc/translation-aspects.txt Mon Dec 5 17:23:23 2005 @@ -9,11 +9,11 @@ ========= One of the goals of the PyPy project is it to have the memory and threading -model flexible and changeable without having to manually reimplement -the interpreter. In fact, PyPy with the 0.7 and 0.8 releases contain -code for memory management and threading models which allows experimentation -without requiring early design decisions. This document describes the current state -of the implementation of the memory object model, automatic memory management and +model flexible and changeable without having to manually reimplement the +interpreter. In fact, PyPy with the 0.7 and 0.8 releases contain code for +memory management and threading models which allows experimentation without +requiring early design decisions. 
This document describes the current state of +the implementation of the memory object model, automatic memory management and threading models and describes possible future developments. @@ -28,7 +28,6 @@ fundamentally different ways to implement these things is possible and reasonably easy. -XXX The low level object model =========================== @@ -442,8 +441,9 @@ Conclusion =========== -XXX nice concluding paragraphs - +As shown with various examples our approach gives us flexibility and lets us +chooses various aspects at translation time instead of encoding them into the +implementation itself. References =========== From mwh at codespeak.net Mon Dec 5 18:07:20 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Mon, 5 Dec 2005 18:07:20 +0100 (CET) Subject: [pypy-svn] r20709 - pypy/dist/pypy/doc Message-ID: <20051205170720.7EA4427B5F@code1.codespeak.net> Author: mwh Date: Mon Dec 5 18:07:19 2005 New Revision: 20709 Modified: pypy/dist/pypy/doc/translation-aspects.txt Log: minor wording tweaks and a couple more XXXs Modified: pypy/dist/pypy/doc/translation-aspects.txt ============================================================================== --- pypy/dist/pypy/doc/translation-aspects.txt (original) +++ pypy/dist/pypy/doc/translation-aspects.txt Mon Dec 5 18:07:19 2005 @@ -9,7 +9,7 @@ ========= One of the goals of the PyPy project is it to have the memory and threading -model flexible and changeable without having to manually reimplement the +models flexible and changeable without having to manually reimplement the interpreter. In fact, PyPy with the 0.7 and 0.8 releases contain code for memory management and threading models which allows experimentation without requiring early design decisions. This document describes the current state of @@ -41,12 +41,12 @@ specific target platform. 
For C or C-like targets this model consists of a set of C-like types like structures, arrays and functions in addition to primitive types (integers, characters, floating point numbers). This multi-stage approach -gives a lot of flexibility how a certain object is represented on the target -level. The RPython process can decide what representation to use based on the -type annotation and on the context and usages of the object. +gives a lot of flexibility how a particular object is represented on the +target's level. The RPython process can decide what representation to use based +on the type annotation and on the way the object is used. -In the following the structures used to represent user classes are described. -There is one "vtable" per user class, with the following structure: A root +In the following the structures used to represent RPython classes are described. +There is one "vtable" per RPython class, with the following structure: A root class "object" has:: struct object_vtable { @@ -72,11 +72,11 @@ The type of the instances is:: - struct object { // for the root class + struct object { // for instance of the root class struct object_vtable* typeptr; } - struct X { + struct X { // for instances of every other class struct Y super; // inlined ... // extra instance attributes } @@ -94,8 +94,8 @@ algorithm. Since subclass checking is quite common (it is also used to check whether an object is an instance of a certain class) we wanted to replace it with the more efficient relative numbering algorithm. This was a matter of just -changing the appropriate code of the rtyping process, calculating the class-ids -during rtyping and inserting the necessary fields into the class structure. It +changing the appropriate code of the rtyping process to calculate the class-ids +during rtyping and insert the necessary fields into the class structure. It would be similarly easy to switch to another implementation. 
XXX reference to the paper @@ -103,7 +103,7 @@ ID hashes --------- -In the RPython type system class instances can be used as dictionary-keys using +In the RPython type system class instances can be used as dictionary keys using a default hash implementation based on identity which in practise is implemented using the memory address. This is similar to how standard Python behaves if no user-defined hash function is present. The annotator keeps track @@ -130,7 +130,8 @@ return value of the hash function is the content of the field. This means that instances of such a class that are converted PBCs retain the hash values they had before the conversion whereas new objects of the class have their memory -address as hash values. +address as hash values. A strategy along these lines will be required if we ever +switch to using a copying garbage collector. Cached functions with PBC arguments ------------------------------------ @@ -154,9 +155,9 @@ overallocation is performed. We plan to use similar techniques to use tagged pointers instead of box-classes -to represent builtin types of the PyPy-interpreter such as integers. This would -require attaching explicit hints to the involved classes. Field acces would -then be translated to the corresponging masking operations. +to represent builtin types of the PyPy interpreter such as integers. This would +require attaching explicit hints to the involved classes. Field access would +then be translated to the corresponding masking operations. Automatic Memory Management Implementations @@ -165,7 +166,7 @@ The whole implementation of the PyPy interpreter assumes automatic memory management, e.g. automatic reclamation of memory that is no longer used. The whole analysis toolchain also assumes that memory management is being taken -care of. Only the backends have to concern themselves with that issue. For +care of -- only the backends have to concern themselves with that issue. 
For backends that target environments that have their own garbage collector, like Smalltalk or Javascript, this is not an issue. For other targets like C and LLVM the backend has to produce code that uses some sort of garbage collection. @@ -195,7 +196,7 @@ signals to the collector that it does not need to consider this memory when tracing pointers. -Using the Boehm collector has disadvantages as well. Its problems stem from the +Using the Boehm collector has disadvantages as well. The problems stem from the fact that the Boehm collector is conservative which means that it has to consider every word in memory to be a potential pointer. Since PyPy's toolchain has complete knowledge of the placement of data in memory we can generate an @@ -272,8 +273,8 @@ At the moment we have three simple garbage collectors implemented for this framework: a simple copying collector, a mark-and-sweep collector and a deferred reference counting collector. These garbage collectors are working on -top of the memory simulator at the moment it is not yet possible to translate -PyPy to C with them, though. This is due to the fact that it is not easy to +top of the memory simulator, but at the moment it is not yet possible to translate +PyPy to C with them. This is due to the fact that it is not easy to find the root pointers that reside on the C stack because the C stack layout is heavily platform dependent and because of the possibility of roots that are not only on the stack but also in registers (which would give a problem for moving @@ -306,12 +307,12 @@ At the moment there is one non-trivial threading model implemented. It follows the threading implementation of CPython and thus uses a global interpreter -lock. This lock prevents that any two threads can interpret python code at any +lock. This lock prevents any two threads from interpreting python code at any time. The global interpreter lock is released around calls to blocking I/O functions. 
This approach has a number of advantages: it gives very little runtime penalty for single-threaded applications, makes many of the common uses for threading possible and is relatively easy to implement and maintain. It has -the disadvantages that multiple threads cannot be distributed accros multiple +the disadvantages that multiple threads cannot be distributed accross multiple proccessors. To make this threading-model useable for I/O-bound applications the global @@ -332,8 +333,8 @@ The technique we have implemented is based on an old but recurring idea of emulating this style via exceptions: a specific program point can generate a pseudo-exception whose purpose is to unwind the whole C stack -in a restartable way. More precisely, the "unwind" exception has the -effect of saving the C stack into the heap, in a compact and explicit +in a restartable way. More precisely, the "unwind" exception causes +the C stack to be saved into the heap in a compact and explicit format, as described below. It is then possible to resume only the innermost (most recent) frame of the saved stack -- allowing unlimited recursion on OSes that limit the size of the C stack -- or to resume a @@ -383,7 +384,7 @@ * implicitly-scheduled microthreads, also known as green threads. An important property of the changes in all the generated C functions is -to be written in a way that almost does not degrade their performance in +to be written in a way that does not significantly degrade their performance in the non-exceptional case. Most optimisations performed by C compilers, like register allocation, continue to work... @@ -405,12 +406,12 @@ amount of effort for reference counting as well as the Boehm collector (which provides the necessary hooks). -Integrating the now only simulated GC framework into the rtyping process and -the code generation will require considerable effort. 
It requires to be able to +Integrating the now simulated-only GC framework into the rtyping process and +the code generation will require considerable effort. It requires being able to keep track of the GC roots which is hard to do with portable C code. One solution would be to use stackless since it moves the stack completely to the -heap. We expect that we can insert GC barriers as function calls into the -graphs and rely on inlining to make them less inefficient. +heap. We expect that we can implement GC read and write barriers as function +calls and rely on inlining to make them less inefficient. We may also spent some time on improving the existing reference counting implementation by removing unnecessary incref-decref pairs. A bigger task would @@ -433,10 +434,11 @@ That means identifying a field in a structure A that points to another object B on the heap in such a way, that the pointer in A gets assigned only once to and that no other pointer to B exists from a heap object. If this is the case the -object B can be inlined into the A since B lives exactly as long as A. +object B can be inlined into the A since B lives exactly as long as A. +XXX makes little sense! As noted above, another plan is to implement builtin application level objects -by using tagged pointer. +by using tagged pointers. XXX also makes little sense! Conclusion =========== From mwh at codespeak.net Mon Dec 5 18:22:38 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Mon, 5 Dec 2005 18:22:38 +0100 (CET) Subject: [pypy-svn] r20711 - pypy/dist/pypy/doc Message-ID: <20051205172238.A47D027B56@code1.codespeak.net> Author: mwh Date: Mon Dec 5 18:22:37 2005 New Revision: 20711 Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt Log: up-to-date performance numbers. 
Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt ============================================================================== --- pypy/dist/pypy/doc/low-level-encapsulation.txt (original) +++ pypy/dist/pypy/doc/low-level-encapsulation.txt Mon Dec 5 18:22:37 2005 @@ -253,7 +253,7 @@ PyPy interpreter that can be involved in recursions contain stack bookkeeping code (leaf functions, functions calling only leaves, etc. do not need to use this style). The current performance impact - is to make PyPy slower by about 10%. A couple of minor pending + is to make PyPy slower by about 8%. A couple of minor pending optimisations could reduce this figure a bit. We expect the rest of the performance impact to be mainly caused by the increase of size of the generated executable (+20%). @@ -289,15 +289,16 @@ Evaluation Strategy When translated to C code, the Thunk object space has a global - performance impact of 5%. The executable is 12% bigger (probably + performance impact of 6%. The executable is 10% bigger (probably due to the arguably excessive inlining we perform). We have described five aspects in this document, each currently with two implementation choices, leading to 32 possible translations. We have not measured the performance of each variant, but the few we have tried suggests that the performance impacts are what one would expect, -e.g. a translated stackless binary using the thunk object space is -about 1.05 x 1.10 = 1.16 times slower than the default. +e.g. a translated stackless binary using the thunk object space would +be expected to be about 1.06 x 1.08 ~= 1.14 times slower than the +default and was found to be 1.15 times slower. 
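[Editorial sketch] As a quick sanity check of the combined figure quoted above: independent overheads compose multiplicatively, so the ~8% stackless and ~6% thunk slowdowns predict about a 1.14x overall slowdown, close to the measured 1.15x.

```python
# Independent slowdowns multiply: stackless translation (~8%) on top of
# the thunk object space (~6%), the figures measured in the text.
stackless, thunk = 1.08, 1.06
combined = stackless * thunk          # 1.1448, i.e. about 1.14x slower
print(round(combined, 2))             # prints 1.14
```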
Conclusion From lac at codespeak.net Mon Dec 5 19:03:01 2005 From: lac at codespeak.net (lac at codespeak.net) Date: Mon, 5 Dec 2005 19:03:01 +0100 (CET) Subject: [pypy-svn] r20712 - pypy/dist/pypy/doc Message-ID: <20051205180301.A211027B6E@code1.codespeak.net> Author: lac Date: Mon Dec 5 19:03:00 2005 New Revision: 20712 Modified: pypy/dist/pypy/doc/dynamic-language-translation.txt Log: fix one typo and standardize (sic) spelling Modified: pypy/dist/pypy/doc/dynamic-language-translation.txt ============================================================================== --- pypy/dist/pypy/doc/dynamic-language-translation.txt (original) +++ pypy/dist/pypy/doc/dynamic-language-translation.txt Mon Dec 5 19:03:00 2005 @@ -32,9 +32,9 @@ they previously were. The following aspects in particular are typical not only of Python but of most modern dynamic languages: -* The driving force is not minimalistic elegance. It is a balance between - elegance and practicality, and rather un-minimalistic -- the feature - sets built into languages tend to be relatively large and growing +* The driving force is not minimalist elegance. It is a balance between + elegance and practicality, and rather un-minimalist -- the feature + sets built into languages tend to be relatively large and grow (to some extent, depending on the community driving the evolution of the language). @@ -295,7 +295,7 @@ * the `Annotator`_ performs type inference. This part is best implemented separately from other parts because it is based on a - fixpoint search algorithm. It is mostly this part that defines and + fixed point search algorithm. It is mostly this part that defines and restricts what RPython exactly is. After annotation, the control flow graphs still contain all the original relatively-high-level RPython operations; the inferred information is only used in the next step. @@ -490,7 +490,7 @@ answer corresponding to the branch to follow, and switches to the next recorder in the chain. -.. 
[#] "eggs, eggs, eggs, eggs and spam" -- references to Monthy Python +.. [#] "spam, spam, spam, egg and spam" -- references to Monty Python are common in Python :-) This mechanism ensures that all flow paths are considered, including @@ -682,7 +682,7 @@ outwards. Naturally, simply propagating annotations forward requires the use of a -fixpoint algorithm in the presence of loops in the flow graphs or in the +fixed point algorithm in the presence of loops in the flow graphs or in the inter-procedural call graph. Indeed, we flow annotations forward from the beginning of the entry point function into each block, operation after operation, and follow all calls recursively. During this process, @@ -693,7 +693,7 @@ where they appear for reflowing. The more general annotations can generalise the annotations of the results of the variables in the block, which in turn can generalise the annotations that flow into the -following blocks, and so on. This process continues until a fixpoint is +following blocks, and so on. This process continues until a fixed point is reached. We can consider that all variables are initially assigned the "bottom" @@ -785,7 +785,7 @@ (there is one such term per variable); * Pbc(*set*) -- where the *set* is a subset of the (finite) set of all - `Prebuilt Constants`_, defined below. This set includes all the + `Pre-built Constants`_, defined below. This set includes all the callables of the user program: functions, classes, and methods. * None -- stands for the singleton ``None`` object of Python. @@ -884,7 +884,7 @@ .. graphviz:: image/lattice3.dot The Pbcs form a classical finite set-of-subsets lattice. In practice, -we consider ``None`` as a degenerated prebuilt constant, so the None +we consider ``None`` as a degenerated pre-built constant, so the None annotation is actually Pbc({None}). We should mention (but ignore for the sequel) that all annotations also @@ -927,7 +927,7 @@ The goal of the annotator is to find the least general (i.e. 
most precise) state that is sound (i.e. correct for the user program). The -algorithm used is a fixpoint search: we start from the least general +algorithm used is a fixed point search: we start from the least general state and consider the conditions repeatedly; if a condition is not met, we generalise the state incrementally to accommodate for it. This process continues until all conditions are satisfied. @@ -1025,9 +1025,9 @@ Note that a priori, all rules should be tried repeatedly until none of them generalises the state any more, at which point we have reached a -fixpoint. However, the rules are well suited to a simple metarule that +fixed point. However, the rules are well suited to a simple meta-rule that tracks a small set of rules that can possibly apply. Only these -"scheduled" rules are tried. The metarule is as follows: +"scheduled" rules are tried. The meta-rule is as follows: - when an identification *x~y* is added to *E*, then the rule ``(x~y) in E`` is scheduled to be considered; @@ -1038,7 +1038,7 @@ This also includes the cases where *x* is the auxiliary variable of an operation (see `Flow graph model`_). -These rules and metarule favour a forward propagation: the rule +These rules and meta-rule favour a forward propagation: the rule corresponding to an operation in a flow graph typically modifies the binding of the operation's result variable which is used in a following operation in the same block, thus scheduling the following operation's @@ -1085,7 +1085,7 @@ the same variable *v*. The binding of *v* itself, i.e. ``b(v)``, is updated to reflect generalisation of the list item's type; such an update is instantly visible to all aliases. Moreover, the update is -described as a change of binding, which means that the metarules will +described as a change of binding, which means that the meta-rules will ensure that any rule based on the binding of this variable will be reconsidered. 
@@ -1109,7 +1109,7 @@ so by identifying the hidden variable with the current operation's auxiliary variable. The identification ensures that the hidden variable's binding will eventually propagate to the auxiliary variable, -which -- according to the metarule -- will reschedule the operation's +which -- according to the meta-rule -- will reschedule the operation's rule:: z=getitem(x,y) | z', b(x)=List(v) @@ -1119,7 +1119,7 @@ We cannot directly set ``z->b(v)`` because that would be an "illegal" use of a binding, in the sense explained above: it would defeat the -metarule for rescheduling the rule when ``b(v)`` is modified. (In the +meta-rule for rescheduling the rule when ``b(v)`` is modified. (In the source code, the same effect is actually achieved by recording on a case-by-case basis at which locations the binding ``b(v)`` has been read; in the theory we use the equivalence relation *E* to make this @@ -1146,8 +1146,8 @@ As with `merge`_, it identifies the two lists. -Prebuilt constants -~~~~~~~~~~~~~~~~~~ +Pre-built constants +~~~~~~~~~~~~~~~~~~~ The ``Pbc`` annotations play a special role in our approach. They group in a single family all the constant user-defined objects that exist @@ -1159,46 +1159,46 @@ new problems to solve -- is a distinguishing property of the idea of analysing a live program instead of static source code. All the user objects that exist before the annotation phase are divided in two -further families: the "prebuilt instances" and the "frozen prebuilt +further families: the "pre-built instances" and the "frozen pre-built constants". 1. Normally, all instances of user-defined classes have the same behaviour, independently of whether they exist before annotation or are built dynamically by the program after annotation and compilation. Both correspond to the ``Inst(C)`` annotation. Instances that are - prebuilt will simply be compiled into the resulting executable as - prebuilt data structures. 
+ pre-built will simply be compiled into the resulting executable as + pre-built data structures. 2. However, as an exception to 1., the user program can give a hint that - forces the annotator to consider such an object as a "frozen prebuilt + forces the annotator to consider such an object as a "frozen pre-built constant" instead. The object is then considered as an *immutable* container of attributes. It loses its object-oriented aspects and its class becomes irrelevant. It is not possible to further instantiate its class at run-time. -In summary, the prebuilt constants are: +In summary, the pre-built constants are: * all functions ``f`` of the user program (including the ones appearing as methods); * all classes ``C`` of the user program; -* all frozen prebuilt constants. +* all frozen pre-built constants. For convenience, we add the following objects to the above set: * for each function ``f`` and class ``C``, a "potential bound method" object written ``C.f``, used below to handle method calls; -* the singleton None object (a special case of frozen prebuilt constant). +* the singleton None object (a special case of frozen pre-built constant). The annotation ``Pbc(set)`` stands for an object that belongs to the -specified ``set`` of prebuilt constant objects, which is a subset of all -the prebuilt constant objects. +specified ``set`` of pre-built constant objects, which is a subset of all +the pre-built constant objects. -In practice, the set of all prebuilt constants is not fixed in advance, +In practice, the set of all pre-built constants is not fixed in advance, but grows while annotation discovers new functions and classes and -frozen prebuilt constants; in this way we can be sure that only the +frozen pre-built constants; in this way we can be sure that only the objects that are still alive will be included in the set, leaving out the ones that were only relevant during the initialisation phase of the program. 
@@ -1769,7 +1769,7 @@ The lattice is finite, although its size depends on the size of the program. The List part has the same size as *V*, and the Pbc part is -exponential on the number of prebuilt constants. However, in this model +exponential on the number of pre-built constants. However, in this model a chain of annotations cannot be longer than:: max(5, number-of-pbcs + 3, depth-of-class-hierarchy + 3). @@ -1811,7 +1811,7 @@ aspects. -Specialisation +Specialization *************** The type system used by the annotator does not include polymorphism @@ -1821,7 +1821,7 @@ most cases sufficient for the kind of system programming RPython is aimed at and matching our main targets. -Not all of our target code or expressivity needs fit into this model. +Not all of our target code or our needs for expressiveness fit into this model. The fact that we allow unrestricted dynamism at bootstrap helps a great deal, but in addition we also support the explicit flagging of certain functions or classes as requiring special treatment. One such @@ -1838,19 +1838,19 @@ object space and annotator abstractly interpret the function's bytecode. In more details, the following special-cases are supported by default -(more advanced specialisations have been implemented specifically for +(more advanced specializations have been implemented specifically for PyPy): -* specialising a function by the annotation of a given argument +* specializing a function by the annotation of a given argument -* specialising a function by the value of a given argument (requires all +* specializing a function by the value of a given argument (requires all calls to the function to resolve the argument to a constant value) * ignoring -- the function call is ignored. Useful for writing tests or debugging support code that should be removed during translation.
* by arity -- for functions taking a variable number of (non-keyword) - arguments via a ``*args``, the default specialisation is by the number + arguments via a ``*args``, the default specialization is by the number of extra arguments. (This follows naturally from the fact that the extended annotation lattice we use has annotations of the form ``Tuple(A_1, ..., A_n)`` representing a heterogeneous tuple of length @@ -1859,7 +1859,7 @@ * ctr_location -- for classes. A fresh independent copy of the class is made for each program point that instantiate the class. This is a - simple (but potentially over-specialising) way to obtain class + simple (but potentially over-specializing) way to obtain class polymorphism for the couple of container classes we needed in PyPy (e.g. Stack). @@ -1875,11 +1875,11 @@ Concrete mode execution *********************** -The *memo* specialisation_ is used at key points in PyPy to obtain the +The *memo* specialization_ is used at key points in PyPy to obtain the effect described in the introduction (see `Abstract interpretation`_): the memo functions and all the code it invokes is concretely executed during annotation. There is no staticness restriction on that code -- -it will typically instantiate classes, creating more prebuilt instances, +it will typically instantiate classes, creating more pre-built instances, and sometimes even build new classes and functions; this possibility is used quite extensively in PyPy. @@ -1910,10 +1910,10 @@ The dead code removal effect is used in an essential way to hide bootstrap-only code from the annotator where it could not analyse such -code. For example, some frozen prebuilt constants force some of their +code. For example, some frozen pre-built constants force some of their caches to be filled when they are frozen (which occurs the first time the annotator discovers such a constant).
This allows the regular -access methods of the frozen prebuilt constant to contain code like:: +access methods of the frozen pre-built constant to contain code like:: if self.not_computed_yet: self.compute_result() @@ -1938,7 +1938,7 @@ In the basic block at the beginning of the positive case, the input block variable corresponding to the source-level ``obj`` variable is -annotated as ``Inst(MySubClass)``. Similarily, in:: +annotated as ``Inst(MySubClass)``. Similarly, in:: if x > y: ...positive case... @@ -1990,7 +1990,7 @@ annotations, as mentioned at the end of the `Annotation model`_. It requires constant ``Bool`` annotations -- i.e. known to be True or known to be False -- that are nevertheless extended as above, even though it -seems redundant, just in case the annotation needs to be generalized to +seems redundant, just in case the annotation needs to be generalised to a non-constant extended annotation. See for example ``builtin_isinstance()`` in `pypy/annotation/builtin.py`_.) @@ -2017,7 +2017,7 @@ Code Generation -=============================== +=============== The actual generation of low-level code from the information computed by the annotator is not the central subject of the present report, so we @@ -2052,7 +2052,7 @@ RTyper -~~~~~~~~~~ +~~~~~~ The first step is called "RTyping", short for "RPython low-level typing". It turns general high-level operations into low-level C-like @@ -2109,7 +2109,7 @@ One representation is created for each used annotation. The representation maps a low-level type to each annotation in a way that -depends on information dicovered by the annotator. For example, the +depends on information discovered by the annotator. For example, the representation of ``Inst`` annotations are responsible for building the low-level type -- nested structures and vtable pointers, in the case of lltype_. 
In addition, the representation objects' central role is to @@ -2140,7 +2140,7 @@ back into the flow object space and the annotator and the RTyper itself, so that it gets turned into another low-level control flow graph. At this point, the annotator runs with a different set of default -specialisations: it allows several copies of the helper functions to be +specializations: it allows several copies of the helper functions to be automatically built, one for each low-level type of its arguments. We do this by default at this level because of the intended purpose of these helpers: they are usually methods of a polymorphic container. @@ -2149,7 +2149,7 @@ accommodate different kinds of sub-languages at different levels: it is straightforward to adapt it for the so-called "low-level Python" language in which we constrain ourselves to write the low-level -operation helpers. Automatic specialisation was a key point here; the +operation helpers. Automatic specialization was a key point here; the resulting language feels like a basic C++ without any type or template declarations. @@ -2157,7 +2157,7 @@ The back-ends ~~~~~~~~~~~~~ -So far, all data structures (flow graphs, prebuilt constants...) +So far, all data structures (flow graphs, pre-built constants...) manipulated by the translation process only existed as objects in memory. The last step is to turn them into an external representation. This step, while basically straightforward, is messy in practice for @@ -2178,7 +2178,7 @@ The C back-end works itself again in two steps: * it first collects recursively all functions (i.e. their low-level flow - graphs) and all prebuilt data structures, remembering all "struct" C + graphs) and all pre-built data structures, remembering all "struct" C types that will need to be declared; * it then generates one or multiple C source files containing: @@ -2187,11 +2187,11 @@ 2. the full declarations of the latter; - 3. a forward declaration of all the functions and prebuilt data + 3.
a forward declaration of all the functions and pre-built data structures; 4. the implementation of the latter (i.e. the body of functions and - the static initialisers of prebuilt data structures). + the static initialisers of pre-built data structures). Each function's body is implemented as basic blocks (following the basic blocks of the control flow graphs) with jumps between them. The @@ -2249,9 +2249,9 @@ In PyPy, our short-term future work is to focus on using the translation toolchain presented here to generate a modified low-level version of the same full Python interpreter. This modified version will drive a -just-in-time specialisation process, in the sense of providing a +just-in-time specialization process, in the sense of providing a description of full Python that will not be directly executed, but -specialised for the particular user Python program. +specialized for the particular user Python program. As of October 2005, we are only starting the work in this direction. The details are not fleshed out nor documented yet, but the [Psyco]_ @@ -2306,7 +2306,7 @@ Glossary and links mentioned in the text: * Abstract interpretation: http://en.wikipedia.org/wiki/Abstract_interpretation - +p * Flow Object Space: see `Object Space`_. * GenC back-end: see [TR]_.
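The fixed point search that the annotation chapters patched above keep referring to can be sketched as a small worklist-style loop over a toy lattice. This is an illustrative sketch only, not PyPy's actual annotator: the three-level lattice, the `join` function, and the transfer rule below are invented for the example.

```python
# Toy model of fixed point annotation: keep reapplying inference rules,
# generalising variable bindings via the lattice join, until nothing
# changes any more (the fixed point).
BOTTOM, TOP = "Bottom", "Top"

def join(a, b):
    """Least upper bound in the toy lattice: Bottom < Int, Str < Top."""
    if a == BOTTOM:
        return b
    if b == BOTTOM:
        return a
    return a if a == b else TOP

def annotate(ops, bindings):
    """ops: list of (result_var, arg_vars, transfer_fn) rules.
    Bindings only ever move upwards in the lattice, so this terminates."""
    changed = True
    while changed:
        changed = False
        for result, args, fn in ops:
            inferred = fn(*[bindings.get(a, BOTTOM) for a in args])
            new = join(bindings.get(result, BOTTOM), inferred)
            if new != bindings.get(result, BOTTOM):
                bindings[result] = new
                changed = True
    return bindings

def add_rule(a, b):
    # int + int stays int; anything else is generalised to Top
    return "Int" if a == b == "Int" else TOP

result = annotate([("z", ("x", "y"), add_rule)],
                  {"x": "Int", "y": "Int"})
```

At the fixed point `z` is bound to `"Int"`; the real annotator adds the meta-rule scheduling described in the text so that only rules whose inputs changed are reconsidered.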
From cfbolz at codespeak.net Mon Dec 5 19:25:19 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Mon, 5 Dec 2005 19:25:19 +0100 (CET) Subject: [pypy-svn] r20713 - pypy/dist/pypy/doc Message-ID: <20051205182519.45C9D27B5D@code1.codespeak.net> Author: cfbolz Date: Mon Dec 5 19:25:14 2005 New Revision: 20713 Modified: pypy/dist/pypy/doc/translation-aspects.txt Log: (mwh, pedronis, cfbolz): small fixes Modified: pypy/dist/pypy/doc/translation-aspects.txt ============================================================================== --- pypy/dist/pypy/doc/translation-aspects.txt (original) +++ pypy/dist/pypy/doc/translation-aspects.txt Mon Dec 5 19:25:14 2005 @@ -12,13 +12,16 @@ models flexible and changeable without having to manually reimplement the interpreter. In fact, PyPy with the 0.7 and 0.8 releases contain code for memory management and threading models which allows experimentation without -requiring early design decisions. This document describes the current state of -the implementation of the memory object model, automatic memory management and -threading models and describes possible future developments. +requiring early design decisions. This document describes many details of the +current state of the implementation of the memory object model, automatic +memory management and threading models and describes possible future +developments. + + +Introduction +============ -Background -=========== The main emphasis of the PyPy project is that of integration: we want to make changing memory management and threading techniques possible while at the same @@ -154,7 +157,7 @@ representation consisting of a structure with a pointer to an array is used and overallocation is performed. -We plan to use similar techniques to use tagged pointers instead of box-classes +We plan to use similar techniques to use tagged pointers instead of using boxing to represent builtin types of the PyPy interpreter such as integers. 
This would require attaching explicit hints to the involved classes. Field access would then be translated to the corresponding masking operations. @@ -431,14 +434,12 @@ ------------ We also might want to experiment with more sophisticated structure inlining. -That means identifying a field in a structure A that points to another object B -on the heap in such a way, that the pointer in A gets assigned only once to and -that no other pointer to B exists from a heap object. If this is the case the -object B can be inlined into the A since B lives exactly as long as A. -XXX makes little sense! +Sometimes it is possible to find out that one structure object that is +allocated on the heap lives exactly as long as another structure object on the +heap pointing to it. If this is the case it is possible to inline the first +object into the second. This saves the space of one pointer and avoids +pointer-chasing. -As noted above, another plan is to implement builtin application level objects -by using tagged pointers. XXX also makes little sense! Conclusion =========== From mwh at codespeak.net Mon Dec 5 19:28:06 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Mon, 5 Dec 2005 19:28:06 +0100 (CET) Subject: [pypy-svn] r20714 - pypy/dist/pypy/doc Message-ID: <20051205182806.E698C27B5D@code1.codespeak.net> Author: mwh Date: Mon Dec 5 19:28:06 2005 New Revision: 20714 Modified: pypy/dist/pypy/doc/dynamic-language-translation.txt Log: remove presumably accidentally added 'p' Modified: pypy/dist/pypy/doc/dynamic-language-translation.txt ============================================================================== --- pypy/dist/pypy/doc/dynamic-language-translation.txt (original) +++ pypy/dist/pypy/doc/dynamic-language-translation.txt Mon Dec 5 19:28:06 2005 @@ -2306,7 +2306,7 @@ Glossary and links mentioned in the text: * Abstract interpretation: http://en.wikipedia.org/wiki/Abstract_interpretation -p + * Flow Object Space: see `Object Space`_.
* GenC back-end: see [TR]_. From cfbolz at codespeak.net Mon Dec 5 20:58:47 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Mon, 5 Dec 2005 20:58:47 +0100 (CET) Subject: [pypy-svn] r20717 - pypy/dist/pypy/doc/statistic Message-ID: <20051205195847.E4E3227B5D@code1.codespeak.net> Author: cfbolz Date: Mon Dec 5 20:58:46 2005 New Revision: 20717 Modified: pypy/dist/pypy/doc/statistic/format.py pypy/dist/pypy/doc/statistic/sprint_dates.csv Log: change back to color. add the missing berlin sprint Modified: pypy/dist/pypy/doc/statistic/format.py ============================================================================== --- pypy/dist/pypy/doc/statistic/format.py (original) +++ pypy/dist/pypy/doc/statistic/format.py Mon Dec 5 20:58:46 2005 @@ -6,7 +6,7 @@ import pylab import matplotlib -greyscale = True +greyscale = False def get_data(p): data = p.readlines() Modified: pypy/dist/pypy/doc/statistic/sprint_dates.csv ============================================================================== --- pypy/dist/pypy/doc/statistic/sprint_dates.csv (original) +++ pypy/dist/pypy/doc/statistic/sprint_dates.csv Mon Dec 5 20:58:46 2005 @@ -3,6 +3,7 @@ "Hildesheim",2003-02-17,2003-02-23 "Gothenburg",2003-05-24,2003-05-31 "LovainLaNeuve",2003-06-21,2003-06-24 +"Berlin",2003-09-29,2003-10-04 "Amsterdam",2003-12-14,2003-12-21 "Europython/Gothenburg",2004-06-01,2004-06-07 "Vilnius",2004-11-15,2004-11-23 From hpk at codespeak.net Mon Dec 5 21:42:34 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Mon, 5 Dec 2005 21:42:34 +0100 (CET) Subject: [pypy-svn] r20719 - pypy/extradoc/talk/pypy_euworkshop_2005-12-08 Message-ID: <20051205204234.CD6EA27B5F@code1.codespeak.net> Author: hpk Date: Mon Dec 5 21:42:33 2005 New Revision: 20719 Modified: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/holger_part2_OSSdev.txt Log: first go over slides for EU workshop on 8th Modified: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/holger_part2_OSSdev.txt 
============================================================================== --- pypy/extradoc/talk/pypy_euworkshop_2005-12-08/holger_part2_OSSdev.txt (original) +++ pypy/extradoc/talk/pypy_euworkshop_2005-12-08/holger_part2_OSSdev.txt Mon Dec 5 21:42:33 2005 @@ -6,37 +6,56 @@ Slides: -1. What makes Open Source communities like Python work: the people factor - - collaborative - - communication - - transparent - - organization (decision making) +1. personal background + - worked in gaming companies, banks and car companies for + several years + - studied computer science + - left job and went into open-source scenes (2001) + - various project involvements, started PyPy 2003 by + inviting people to the first "sprint" + +2. What makes Open Source communities like Python work: the people factor + + - collaborative - driven by interest + - communication - quite transparent to everyone involved + - email / IRC / version-control + - organization - rather informal -2. What makes Open Source communities like Python work: the technical framework +3. the technical factors - - version control (Subversion) - automated test driven development + - specific expertise/special interest + - version control (Subversion) - releases - -3. Typical aspects of the Python community? +4. Typical aspects of the Python community? - lively community - - lot's of different python implementation projects - - good interaction between the projects - - different from other OSS communities? - -4. PyPy: the vision - - - grew out of the Python community - - started through agile practices and evolved via them - - what is PyPy (short high level explanation - configurable "interpreter") + - lots of different python implementation projects + - good contacts between the projects + - maybe less fragmented than other OSS communities? + +5.
PyPy: the vision + + - founders came from the Python community + - "sprints" were the initial factor + - what is PyPy/Python - one of the five most used programming + languages today -5. OSS and EU funding: PyPy as a case study +6. OSS and EU funding: PyPy as a case study - - why a fusion between an oss community and EU? (PyPyagenda) - - why fund a OSS community (EUagenda) - - impact so far + - driven by EU funded and non-EU funded parties + - technically challenging + - IBM or Sun have done similarly challenging projects + in much more time and with more funding + +7. PyPy: It's all about communication ... + - pypy-sync meetings, 30 minutes IRC + - pypy-svn/eu-tracking tracks all code and document + changes + - around 20000 visitors per month on website + - lots of blogs and subscribers to pypy-dev (dev-list) + - 300-500 people across the world following the project From mwh at codespeak.net Tue Dec 6 01:12:53 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Tue, 6 Dec 2005 01:12:53 +0100 (CET) Subject: [pypy-svn] r20729 - pypy/dist/pypy/doc Message-ID: <20051206001253.E761C27B5D@code1.codespeak.net> Author: mwh Date: Tue Dec 6 01:12:53 2005 New Revision: 20729 Modified: pypy/dist/pypy/doc/translation-aspects.txt Log: massaging of the language of about the first half of translation-aspects. Modified: pypy/dist/pypy/doc/translation-aspects.txt ============================================================================== --- pypy/dist/pypy/doc/translation-aspects.txt (original) +++ pypy/dist/pypy/doc/translation-aspects.txt Tue Dec 6 01:12:53 2005 @@ -10,7 +10,7 @@ One of the goals of the PyPy project is to have the memory and threading models flexible and changeable without having to manually reimplement the -interpreter.
In fact, PyPy by the time of the 0.8 release contains code for memory management and threading models which allows experimentation without requiring early design decisions. This document describes many details of the current state of the implementation of the memory object model, automatic @@ -21,15 +21,13 @@ Introduction ============ - - -The main emphasis of the PyPy project is that of integration: we want to make -changing memory management and threading techniques possible while at the same -time influencing the interpreter as little as possible. It is not the current -goal to optimize the current approaches in extreme ways rather to produce solid -implementations and to provide an environment where experiments with -fundamentally different ways to implement these things is possible and -reasonably easy. +The main emphasis of the PyPy project is that of flexible integration: we want +to make changing memory management and threading techniques possible while at +the same time influencing the source code of the interpreter as little as +possible. It is not the current goal to optimize the current approaches in +extreme ways but rather to produce solid implementations and to provide an +environment where experiments with fundamentally different ways to implement +these things are possible and reasonably easy. The low level object model =========================== XXX proper references to translation.txt and dynamic-language-translation One important part of the translation process is *rtyping*. Before that step -all objects in our flow graphs are annotated with types on the level of the +all objects in our flow graphs are annotated with types at the level of the RPython type system which is still quite high-level and target-independent. During rtyping they are transformed into objects that match the model of the specific target platform.
For C or C-like targets this model consists of a set of C-like types like structures, arrays and functions in addition to primitive types (integers, characters, floating point numbers). This multi-stage approach -gives a lot of flexibility how a particular object is represented on the +gives a lot of flexibility in how a given object is represented at the target's level. The RPython process can decide what representation to use based on the type annotation and on the way the object is used. In the following the structures used to represent RPython classes are described. -There is one "vtable" per RPython class, with the following structure: A root -class "object" has:: +There is one "vtable" per RPython class, with the following structure: The root +class "object" has a vtable of the following type (expressed in a C-like +syntax):: struct object_vtable { struct object_vtable* parenttypeptr; @@ -62,7 +61,8 @@ } The structure members ``subclassrange_min`` and ``subclassrange_max`` are used -for subclass checking. Every other class X, with parent Y, has the structure:: +for subclass checking (see below). Every other class X, with parent Y, has the +structure:: struct vtable_X { struct vtable_Y super; // inlined @@ -70,12 +70,12 @@ } The extra class attributes usually contain function pointers to the methods -of that class. In addition the class attributes (which are -supported by the RPython object model) are stored there. +of that class, although the data class attributes (which are supported by the +RPython object model) are stored there. The type of the instances is:: - struct object { // for instance of the root class + struct object { // for instances of the root class struct object_vtable* typeptr; } @@ -94,47 +94,48 @@ The way we do subclass checking is a good example of the flexibility provided by our approach: in the beginning we were using a naive linear lookup -algorithm. 
Since subclass checking is quite common (it is also used to check -whether an object is an instance of a certain class) we wanted to replace it -with the more efficient relative numbering algorithm. This was a matter of just -changing the appropriate code of the rtyping process to calculate the class-ids -during rtyping and insert the necessary fields into the class structure. It -would be similarly easy to switch to another implementation. +algorithm. Since subclass checking is quite a common operation (it is also +used to check whether an object is an instance of a certain class) we wanted +to replace it with the more efficient relative numbering algorithm. This was a +matter of just changing the appropriate code of the rtyping process to +calculate the class-ids during rtyping and insert the necessary fields into +the class structure. It would be similarly easy to switch to another +implementation. XXX reference to the paper -ID hashes ---------- +Identity hashes +--------------- In the RPython type system class instances can be used as dictionary keys using a default hash implementation based on identity which in practise is -implemented using the memory address. This is similar to how standard Python -behaves if no user-defined hash function is present. The annotator keeps track -for which classes this hashing is ever used. +implemented using the memory address. This is similar to how CPython behaves if +no user-defined hash function is present. The annotator keeps track of the +classes for which this hashing is ever used. One of the peculiarities of PyPy's approach is that live objects are analyzed -by our translation toolchain. This leads to the presence of instances of user +by our translation toolchain. This leads to the presence of instances of RPython classes that were built before the translation started. These are called -prebuilt-constants (PBCs for short). During rtyping, these instances have to be +"pre-built constants" (PBCs for short). 
During rtyping, these instances must be
converted to the low level model. One of the problems with doing this is that
the standard hash implementation of Python is to take the id of an object, which
is just the memory address. If the RPython program explicitly stores the hashes
-of PBCS somewhere (for example in the implementation of a data structure) then
-the stored hash value would not match the value of the object's address after
-translation anymore.
+of a PBC somewhere (for example in the implementation of a data structure) then
+the stored hash value would be extremely unlikely to match the value of the object's
+address after translation.

-To prevent this the following strategy is used: for every class with instances
-that are hashed somewhere in the program (either when storing them in a
+To prevent this the following strategy is used: for every class whose instances
+are hashed somewhere in the program (either when storing them in a
 dictionary or by calling the hash function) an extra field is introduced in the
 structure used for the instances of that class. For PBCs of such a class this
-field is used to store the memory address of the original object, new objects
+field is used to store the memory address of the original object and new objects
 have this field initialized to zero. The hash function for instances of such a
 class stores the object's memory address in this field if it is zero. The
 return value of the hash function is the content of the field. This means that
 instances of such a class that are converted PBCs retain the hash values they
 had before the conversion whereas new objects of the class have their memory
-address as hash values. A strategy along these lines will be required if we ever
-switch to using a copying garbage collector.
+address as hash values. A strategy along these lines would in any case have been
+required if we ever switch to using a copying garbage collector.
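To make the strategy concrete, here is a purely illustrative Python model of the lazy hash field (not PyPy source; the field name is invented and ``id()`` stands in for the memory address):

```python
# Toy model of the lazy identity-hash field described above.

class Instance:
    def __init__(self, prebuilt_hash=0):
        # PBCs get their pre-translation hash stored here;
        # freshly allocated objects start with 0
        self.hash_field = prebuilt_hash

def identity_hash(obj):
    if obj.hash_field == 0:
        obj.hash_field = id(obj)   # stand-in for "memory address"
    return obj.hash_field

pbc = Instance(prebuilt_hash=1234)   # hash recorded before "translation"
fresh = Instance()
assert identity_hash(pbc) == 1234                    # PBC keeps its old hash
assert identity_hash(fresh) == identity_hash(fresh)  # stable for new objects
```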
Cached functions with PBC arguments
------------------------------------

@@ -153,7 +154,7 @@
 One example of the flexibility the RTyper provides is how we deal with lists.
 Based on information gathered by the annotator the RTyper chooses between two
 different list implementations. If a list never changes its size after creation
-a low-level array is used directly. For lists which get resized a
+a low-level array is used directly. For lists which might be resized a
 representation consisting of a structure with a pointer to an array is used and
 overallocation is performed.

@@ -187,23 +188,23 @@
 Using the Boehm garbage collector
-----------------------------------

-At the moment there are two different garbage collectors implemented in the C
+Currently there are two different garbage collectors implemented in the C
 backend (which is the most complete backend right now). One of them uses the
 existing Boehm-Demers-Weiser garbage collector [BOEHM]_. For every memory
 allocating operation in a low level flow graph the C backend introduces a call
 to a function of the boehm collector which returns a suitable amount of memory.
-Since the C backends has a lot of information avaiable about the data structure
+Since the C backend has a lot of information available about the data structure
 being allocated it can choose the memory allocation function out of the Boehm
-API that fits best. For example for objects that do not contain references to
+API that fits best. For example, for objects that do not contain references to
 other objects (e.g. strings) there is a special allocation function that
 signals to the collector that it does not need to consider this memory when
 tracing pointers.

 Using the Boehm collector has disadvantages as well. The problems stem from
 the fact that the Boehm collector is conservative which means that it has to
-consider every word in memory as a potential pointer.
Since PyPy's toolchain
 has complete knowledge of the placement of data in memory we can generate an
-exact garbage collector that considers only pointers.
+exact garbage collector that considers only genuine pointers.

 Using a simple reference counting garbage collector
-----------------------------------------------------

@@ -215,7 +216,7 @@
 every reference decrement operation a check is performed whether the reference
 count has dropped to zero. If this is the case the memory of the object will be
 reclaimed after the reference counts of the objects the original object
-references are decremented as well.
+refers to are decremented as well.

 The current placement of reference counter updates is far from optimal: The
 reference counts are updated much more often than theoretically necessary (e.g.

From hpk at codespeak.net Tue Dec 6 08:00:06 2005
From: hpk at codespeak.net (hpk at codespeak.net)
Date: Tue, 6 Dec 2005 08:00:06 +0100 (CET)
Subject: [pypy-svn] r20731 - pypy/extradoc/talk/pypy_euworkshop_2005-12-08
Message-ID: <20051206070006.2E93727B4B@code1.codespeak.net>

Author: hpk
Date: Tue Dec 6 08:00:04 2005
New Revision: 20731

Modified: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/holger_part2_OSSdev.txt
Log: updates to my 8th dec.
slides

Modified: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/holger_part2_OSSdev.txt
==============================================================================
--- pypy/extradoc/talk/pypy_euworkshop_2005-12-08/holger_part2_OSSdev.txt	(original)
+++ pypy/extradoc/talk/pypy_euworkshop_2005-12-08/holger_part2_OSSdev.txt	Tue Dec 6 08:00:04 2005
@@ -11,7 +11,7 @@
 - worked in gaming companies, banks and car companies for several years
 - studied computer science
- - left job and went into open-source scenes (2001)
+ - left well-paid job and went into open-source scenes (2001)
 - various project involvements, started PyPy 2003 by inviting people to the first "sprint"

@@ -22,7 +22,7 @@
 - email / IRC / version-control
 - organization - rather informal

-3. the technical factors
+3. technical production factors
 - automated test driven development
 - specific expertise/special interest

@@ -42,20 +42,46 @@
 - "sprints" were the initial factor
 - what is PyPy/Python
 - one of the five most used programming languages today
+ - grass-roots approach

6. OSS and EU funding: PyPy as a case study
- - driven by EU funded and non-EU funded parties
- - technically challenging
+ - driven by partially EU funded and non-EU funded parties
+ - focus on avoiding friction and turning PyPy into a long
+ term project
- IBM or Sun have done similarly challenging projects
- in much more time and with more funding
+ in more time and with more funding
+ - yet not found completely satisfying "funding"
+ interactions with communities.
+
+7. PyPy technical status
+ - three public releases in 2005, well received by the
+ community
+ - core deliverables fulfilled
+ - contributors add different directions

-7. PyPy: It's all about communication ...
- pypy-sync meetings, 30 minutes IRC
- - pypy-svn/eu-tracking tracks all code and document
- changes
+ - day-to-day IRC discussions
+ - "this week in PyPy"
+ - mailing lists: pypy-svn/eu-tracking tracks code and document changes
- around 20000 visitors per month on website
- - lots of blogs and subscribers to pypy-dev (dev-list)
+ - lots of blogs and pypy-dev (developer/researcher list)
- 300-500 people across the world following the project

+9. all well and good, but the main thing is: sprints
+ - one-week intense work-meetings with one break day
+ - EU and non-EU researchers/developers get together
+ - daily planning sessions
+ - pair programming
+ - evolving and adapting to more attendees
+ - organisational/management tasks happen also on sprints
+
+10. next
+ - tackling research and technical goals (challenging!)
+ - mid-term EU review planned for 20th January
+ - looking into adjusting some work planning
+ - increased dissemination, attending conferences (movie features?)
+ - start talking to and interacting with commercial stakeholders
+

From mwh at codespeak.net Tue Dec 6 10:30:11 2005
From: mwh at codespeak.net (mwh at codespeak.net)
Date: Tue, 6 Dec 2005 10:30:11 +0100 (CET)
Subject: [pypy-svn] r20735 - pypy/dist/pypy/doc
Message-ID: <20051206093011.578F327B48@code1.codespeak.net>

Author: mwh
Date: Tue Dec 6 10:30:10 2005
New Revision: 20735

Modified: pypy/dist/pypy/doc/translation-aspects.txt
Log: more hacking at the language, couple more XXXs in the stackless section :/

Modified: pypy/dist/pypy/doc/translation-aspects.txt
==============================================================================
--- pypy/dist/pypy/doc/translation-aspects.txt	(original)
+++ pypy/dist/pypy/doc/translation-aspects.txt	Tue Dec 6 10:30:10 2005
@@ -237,9 +237,9 @@
 Simple escape analysis to remove memory allocation
---------------------------------------------------

-We also implemented a technique to prevent some amount of memory allocation.
+We also implemented a technique to reduce the amount of memory allocation.
 Sometimes it is possible to deduce from the flow graphs that an object lives
-exactly as long as the stack frame of the function where it is allocated in.
+exactly as long as the stack frame of the function it is allocated in.
 This happens if no pointer to the object is stored into another object and if
 no pointer to the object is returned from the function. If this is the case and
 if the size of the object is known in advance the object can be allocated on
@@ -248,8 +248,8 @@
 the graph. Reads from elements of the structure are removed and just replaced
 by one of the variables, writes by assignments to same.

-Since quite a lot of objects are allocated in small "helper" functions this
-simple approach which does not track objects accross function boundaries only
+Since quite a lot of objects are allocated in small helper functions, this
+simple approach which does not track objects across function boundaries only
 works well in the presence of function inlining.

 A general garbage collection framework
--------------------------------------

(XXX I have no idea whether/how detailed this should be described here. It kind
of fits the "solutions for memory models", though)

 In addition to the garbage collectors implemented in the C backend we have also
 started writing a more general toolkit for implementing exact garbage
 collectors in Python. The general idea is to express the garbage collection
@@ -276,10 +276,10 @@
 At the moment we have three simple garbage collectors implemented for this
 framework: a simple copying collector, a mark-and-sweep collector and a
-deferred reference counting collector. These garbage collectors are working on
+deferred reference counting collector. These garbage collectors work when run on
 top of the memory simulator, but at the moment it is not yet possible to translate
-PyPy to C with them. This is due to the fact that it is not easy to
+PyPy to C with them. This is because it is not easy to
 find the root pointers that reside on the C stack -- both because the C stack layout is
 heavily platform dependent and because of the possibility of roots that are
 not only on the stack but also in registers (which would give a problem for
 moving garbage collectors).
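As an illustration of the kind of collector that can be written in Python for such a framework, here is a toy mark-and-sweep pass over an explicit object graph (all names are invented for illustration; the real framework of course works on the simulated memory, not on ordinary Python objects):

```python
# Toy mark-and-sweep over an explicit object graph.

class Obj:
    def __init__(self, *refs):
        self.refs = list(refs)   # outgoing pointers
        self.marked = False

def mark_and_sweep(roots, heap):
    # mark phase: everything reachable from the roots
    pending = list(roots)
    while pending:
        obj = pending.pop()
        if not obj.marked:
            obj.marked = True
            pending.extend(obj.refs)
    # sweep phase: unreachable objects are dropped, marks are reset
    survivors = [obj for obj in heap if obj.marked]
    for obj in survivors:
        obj.marked = False
    return survivors

a = Obj()
b = Obj(a)           # b keeps a alive
garbage = Obj()      # unreachable from the roots
live = mark_and_sweep(roots=[b], heap=[a, b, garbage])
assert set(live) == {a, b}
```

The hard part mentioned above, finding the roots on the C stack, is exactly the part this sketch hand-waves by passing ``roots`` in explicitly.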
@@ -302,8 +302,8 @@ ------------- By default multi-threading is not supported at all, which gives some small -benefits for single-threaded applications since even for single-threaded -applications some overhead is there if threading-capabilities are built into +benefits for single-threaded applications since even in the single-threaded +case there is some overhead if threading capabilities are built into the interpreter. Threading with a Global Interpreter Lock @@ -345,7 +345,8 @@ different previously-saved C stack altogether, thus implementing coroutines. -In our case, exception handling is always explicit: the C backend always +In our case, exception handling is always explicit in the generated code: +the C backend always puts after each call site a cheap check to detect if the callee exited normally or generated an exception. So when compiling functions in stackless mode, the generated exception handling code special-cases the @@ -358,9 +359,9 @@ At this point, the whole C stack is stored away in the heap. This is a very interesting state in itself, because precisely there is no C stack -left. It allows us to write in a portable way all the algorithms that -normally require machine-specific instructions to inspect the stack, -e.g. garbage collectors. +left. It is this which will allow us to write in a portable way all the algorithms that +normally require machine-specific code to inspect the stack, +in particular garbage collectors. To continue execution, the dispatcher can resume either the freshly saved or a completely different stack. Moreover, it can resume directly @@ -369,10 +370,10 @@ stack switches fast, but it also allows the frame to continue to run on top of a clean C stack. When that frame eventually exits normally, it returns to the dispatcher, which then invokes the previous (parent) -saved frame, and so on. In this model, the C stack can be considered as -a cache for the heap-based saved frame. 
When we run out of C stack -space, we flush the cache. When the cache is empty, we fill it with the -next item from the heap. +saved frame, and so on. XXX mention that we unwind when we use too much stack! +In this model, the C stack can be considered as a cache for the heap-based saved +frame. When we run out of C stack space, we flush the cache. When the cache is +empty, we fill it with the next item from the heap. To give the translated program some amount of control over the heap-based stack structures and over the top-level dispatcher that jumps @@ -392,6 +393,8 @@ the non-exceptional case. Most optimisations performed by C compilers, like register allocation, continue to work... +XXX talk about the picture + .. graphviz:: image/stackless_informal.dot @@ -404,7 +407,7 @@ ------------------ One of the biggest missing features of our current garbage collectors is -missing finalization. Right now finalizers are not invoked if an object is +finalization. At present finalizers are simply not invoked if an object is freed by the garbage collector. Along the same lines weak references are not supported yet. It should be possible to implement these with a reasonable amount of effort for reference counting as well as the Boehm collector (which @@ -413,11 +416,11 @@ Integrating the now simulated-only GC framework into the rtyping process and the code generation will require considerable effort. It requires being able to keep track of the GC roots which is hard to do with portable C code. One -solution would be to use stackless since it moves the stack completely to the -heap. We expect that we can implement GC read and write barriers as function -calls and rely on inlining to make them less inefficient. +solution would be to use the "stackless" code since it can move the stack +completely to the heap. We expect that we can implement GC read and write +barriers as function calls and rely on inlining to make them less inefficient. 
-We may also spent some time on improving the existing reference counting +We may also spend some time on improving the existing reference counting implementation by removing unnecessary incref-decref pairs. A bigger task would be to add support for detecing circular references. @@ -429,7 +432,7 @@ threading. This would involve writing a scheduler and preemption logic. We should also investigate other threading models based on operating system -threads with various granularities of locking for access of shared access. +threads with various granularities of locking for access of shared objects. Object model ------------ From mwh at codespeak.net Tue Dec 6 11:35:26 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Tue, 6 Dec 2005 11:35:26 +0100 (CET) Subject: [pypy-svn] r20739 - pypy/dist/pypy/doc Message-ID: <20051206103526.BB55D27DB5@code1.codespeak.net> Author: mwh Date: Tue Dec 6 11:35:24 2005 New Revision: 20739 Modified: pypy/dist/pypy/doc/dynamic-language-translation.txt Log: delete-trailing-whitespace, but mainly to see if graphviz picks up the fonts installed in /usr/local/share/fonts on codespeak. Modified: pypy/dist/pypy/doc/dynamic-language-translation.txt ============================================================================== --- pypy/dist/pypy/doc/dynamic-language-translation.txt (original) +++ pypy/dist/pypy/doc/dynamic-language-translation.txt Tue Dec 6 11:35:24 2005 @@ -25,7 +25,7 @@ The analysis of dynamic languages =============================================== -Dynamic languages are definitely not new on the computing scene. +Dynamic languages are definitely not new on the computing scene. However, new conditions like increased computing power and designs driven by larger communities have enabled the emergence of new aspects in the recent members of the family, or at least made them more practical than @@ -195,7 +195,7 @@ whose objects are empty placeholders. 
The over-simplified view is that to analyse a function, we bind its input arguments to such placeholders, and execute the function -- i.e. let the interpreter follow -its bytecode and invoke the object space for each operations, one by one. +its bytecode and invoke the object space for each operations, one by one. The Flow object space records each operation when it is issued, and returns a new placeholder as a result. At the end, the list of recorded operations, along with the involved placeholders, gives an assembler-like @@ -628,7 +628,7 @@ -Annotator +Annotator ================================= The annotator is the type inference part of our toolchain. The @@ -857,7 +857,7 @@ / / \ \ / NuInst(cls2) \ NuInst(cls1) / / \ \ / / - Inst(cls2) \ Inst(cls1) / + Inst(cls2) \ Inst(cls1) / \ \ / / \ \ / / \ \/ / @@ -2157,7 +2157,7 @@ The back-ends ~~~~~~~~~~~~~ -So far, all data structures (flow graphs, pre-built constants...) +So far, all data structures (flow graphs, pre-built constants...) manipulated by the translation process only existed as objects in memory. The last step is to turn them into an external representation. 
This step, while basically straightforward, is messy in practice for From mwh at codespeak.net Tue Dec 6 11:41:02 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Tue, 6 Dec 2005 11:41:02 +0100 (CET) Subject: [pypy-svn] r20740 - pypy/dist/pypy/doc Message-ID: <20051206104102.D34F427DB5@code1.codespeak.net> Author: mwh Date: Tue Dec 6 11:41:01 2005 New Revision: 20740 Modified: pypy/dist/pypy/doc/dynamic-language-translation.txt Log: ytpo Modified: pypy/dist/pypy/doc/dynamic-language-translation.txt ============================================================================== --- pypy/dist/pypy/doc/dynamic-language-translation.txt (original) +++ pypy/dist/pypy/doc/dynamic-language-translation.txt Tue Dec 6 11:41:01 2005 @@ -2149,7 +2149,7 @@ accommodate different kinds of sub-languages at different levels: it is straightforward to adapt it for the so-called "low-level Python" language in which we constrain ourselves to write the low-level -operation helpers. Automatic sprecialization was a key point here; the +operation helpers. Automatic specialization was a key point here; the resulting language feels like a basic C++ without any type or template declarations. 
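What "a basic C++ without any type or template declarations" means can be hinted at with an ordinary Python helper (a plain illustration, not actual RPython machinery):

```python
# One generic helper, written once without any type declarations ...
def my_max(a, b):
    return a if a > b else b

# ... used at differently-typed call sites.  Under automatic
# specialization the annotator would produce separate low-level
# versions, one per argument-type combination, much like implicit
# C++ template instantiation, but inferred rather than declared.
assert my_max(2, 3) == 3             # an "int" version
assert my_max("ab", "cd") == "cd"    # a "string" version
```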
From cfbolz at codespeak.net Tue Dec 6 12:32:09 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Tue, 6 Dec 2005 12:32:09 +0100 (CET) Subject: [pypy-svn] r20743 - pypy/dist/pypy/doc Message-ID: <20051206113209.1FF8A27DB5@code1.codespeak.net> Author: cfbolz Date: Tue Dec 6 12:32:08 2005 New Revision: 20743 Modified: pypy/dist/pypy/doc/translation-aspects.txt Log: (pedronis, cfbolz, help from arigo): proper references, clarified stackless section, write a "nice introductory paragraph" Modified: pypy/dist/pypy/doc/translation-aspects.txt ============================================================================== --- pypy/dist/pypy/doc/translation-aspects.txt (original) +++ pypy/dist/pypy/doc/translation-aspects.txt Tue Dec 6 12:32:08 2005 @@ -29,22 +29,20 @@ environment where experiments with fundamentally different ways to implement these things are possible and reasonably easy. - The low level object model =========================== -XXX proper references to translation.txt and dynamic-language-translation - -One important part of the translation process is *rtyping*. Before that step -all objects in our flow graphs are annotated with types at the level of the -RPython type system which is still quite high-level and target-independent. -During rtyping they are transformed into objects that match the model of the -specific target platform. For C or C-like targets this model consists of a set -of C-like types like structures, arrays and functions in addition to primitive -types (integers, characters, floating point numbers). This multi-stage approach -gives a lot of flexibility in how a given object is represented at the -target's level. The RPython process can decide what representation to use based -on the type annotation and on the way the object is used. +One important part of the translation process is *rtyping* [DLT]_, [TR]_. 
+Before that step all objects in our flow graphs are annotated with types at the +level of the RPython type system which is still quite high-level and +target-independent. During rtyping they are transformed into objects that +match the model of the specific target platform. For C or C-like targets this +model consists of a set of C-like types like structures, arrays and functions +in addition to primitive types (integers, characters, floating point numbers). +This multi-stage approach gives a lot of flexibility in how a given object is +represented at the target's level. The RPython process can decide what +representation to use based on the type annotation and on the way the object is +used. In the following the structures used to represent RPython classes are described. There is one "vtable" per RPython class, with the following structure: The root @@ -94,15 +92,13 @@ The way we do subclass checking is a good example of the flexibility provided by our approach: in the beginning we were using a naive linear lookup -algorithm. Since subclass checking is quite a common operation (it is also -used to check whether an object is an instance of a certain class) we wanted -to replace it with the more efficient relative numbering algorithm. This was a -matter of just changing the appropriate code of the rtyping process to -calculate the class-ids during rtyping and insert the necessary fields into -the class structure. It would be similarly easy to switch to another -implementation. - -XXX reference to the paper +algorithm. Since subclass checking is quite a common operation (it is also used +to check whether an object is an instance of a certain class) we wanted to +replace it with the more efficient relative numbering algorithm (see [PVE] for +an overview of techniques). This was a matter of just changing the appropriate +code of the rtyping process to calculate the class-ids during rtyping and +insert the necessary fields into the class structure. 
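The relative numbering scheme mentioned above can be sketched in a few lines of illustrative Python (helper names invented): class ids are assigned in depth-first preorder, after which the subclass test becomes a constant-time range check on the ``subclassrange_min``/``subclassrange_max`` fields.

```python
class VTable:
    def __init__(self, name):
        self.name = name
        self.subclassrange_min = 0
        self.subclassrange_max = 0

def assign_class_ids(cls, children, next_id=0):
    # depth-first preorder numbering of the class tree
    cls.subclassrange_min = next_id
    next_id += 1
    for child in children.get(cls, []):
        next_id = assign_class_ids(child, children, next_id)
    cls.subclassrange_max = next_id
    return next_id

def is_subclass(x, y):
    # constant-time check replacing the linear parent-chain walk
    return (y.subclassrange_min <= x.subclassrange_min and
            x.subclassrange_max <= y.subclassrange_max)

obj, a, b, c = VTable("object"), VTable("A"), VTable("B"), VTable("C")
children = {obj: [a, c], a: [b]}     # B subclasses A; A and C subclass object
assign_class_ids(obj, children)
assert is_subclass(b, a) and not is_subclass(c, a)
```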
It would be similarly
+easy to switch to another implementation.

 Identity hashes
---------------

@@ -140,7 +136,7 @@
 Cached functions with PBC arguments
------------------------------------

-As explained in (XXX reference to dynamicblabla) the annotated code can contain
+As explained in [DLT]_ the annotated code can contain
 functions from a finite set of PBCs to something else. The set itself has to be
 finite but its content does not need to be provided explicitly but is discovered
 as the annotation of the input argument by the annotator itself. This kind of
@@ -255,9 +251,6 @@
 A general garbage collection framework
--------------------------------------

-(XXX I have no idea whether/how detailed this should be described here. It kind
-of fits the "solutions for memory models", though)
-
 In addition to the garbage collectors implemented in the C backend we have also
 started writing a more general toolkit for implementing exact garbage
 collectors in Python. The general idea is to express the garbage collection
@@ -290,13 +283,14 @@
 tackled in phase 2 as generating assembly directly is needed anyway for a
 just-in-time compiler. The other possibility (which would be much easier to
 implement) is to move all the data away from the stack to the heap, as
-described below in section (XXX reference).
+described below in section "Stackless C code".

-Threading Model Implementations
+Concurrency Model Implementations
============================================
-
-XXX nice introductory paragraph
+At the moment we have implemented two different concurrency models and the
+option to not support concurrency (which proves the modularity of our approach):
+threading with a global interpreter lock and a "stackless" model.

 No threading
-------------

@@ -363,17 +357,19 @@
 normally require machine-specific code to inspect the stack,
 in particular garbage collectors.

-To continue execution, the dispatcher can resume either the freshly
-saved or a completely different stack.
Moreover, it can resume directly -the innermost (most recent) saved frame in the heap chain, without -having to resume all intermediate frames first. This not only makes -stack switches fast, but it also allows the frame to continue to run on -top of a clean C stack. When that frame eventually exits normally, it -returns to the dispatcher, which then invokes the previous (parent) -saved frame, and so on. XXX mention that we unwind when we use too much stack! -In this model, the C stack can be considered as a cache for the heap-based saved -frame. When we run out of C stack space, we flush the cache. When the cache is -empty, we fill it with the next item from the heap. +To continue execution, the dispatcher can resume either the freshly saved or a +completely different stack. Moreover, it can resume directly the innermost +(most recent) saved frame in the heap chain, without having to resume all +intermediate frames first. This not only makes stack switches fast, but it +also allows the frame to continue to run on top of a clean C stack. When that +frame eventually exits normally, it returns to the dispatcher, which then +invokes the previous (parent) saved frame, and so on. We insert stack checks +before calls that can lead to recursion by detecting cycles in the call graph. +These stack checks copy the stack to the heap (by raising the special +exception) if it is deeper than a certain level. In this model, the C stack +can be considered as a cache for the heap-based saved frames. When we run out +of C stack space, we flush the cache. When the cache is empty, we fill it with +the next item from the heap. To give the translated program some amount of control over the heap-based stack structures and over the top-level dispatcher that jumps @@ -381,10 +377,10 @@ in C. These functions provide an elementary interface on top of which useful abstractions can be implemented, like: -* coroutines: explicitly switching code, similar to Greenlets_. 
+* coroutines: explicitly switching code, similar to Greenlets [GREENLET]_. * "tasklets": cooperatively-scheduled microthreads, as introduced in - `Stackless Python`_. + Stackless Python [STK]_. * implicitly-scheduled microthreads, also known as green threads. @@ -393,9 +389,14 @@ the non-exceptional case. Most optimisations performed by C compilers, like register allocation, continue to work... -XXX talk about the picture +The following picture shows a graph function together with the modifications +necessary for the stackless style: the check whether the stack is too big and +should be unwound, the check whether we are in the process of currently storing +away the stack and the check whether the call to the function is not a regular +call but a reentry call. .. graphviz:: image/stackless_informal.dot + :scale: 70 Future work @@ -459,5 +460,23 @@ for C and C++, Hans Boehm, 1988-2004 .. _`Boehm-Demers-Weiser garbage collector`: http://www.hpl.hp.com/personal/Hans_Boehm/gc/ -.. _Greenlets: http://codespeak.net/py/current/doc/greenlet.html +.. [GREENLET] `Lightweight concurrent programming`_, py-lib Documentation 2003-2005 +.. _`Lightweight concurrent programming`: http://codespeak.net/py/current/doc/greenlet.html + +.. [STK] `Stackless Python`_, a Python implementation that does not use + the C stack, Christian Tismer, 1999-2004 .. _`Stackless Python`: http://www.stackless.com + +.. [TR] `Translation`_, PyPy documentation, 2003-2005 +.. _`Translation`: translation.html + +.. [LE] `Encapsulating low-level implementation aspects`_, + PyPy documentation (and EU deliverable D05.4), 2005 +.. _`Encapsulating low-level implementation aspects`: low-level-encapsulation.html + +.. [DLT] `Compiling dynamic language implementations`_, + PyPy documentation (and EU deliverable D05.1), 2005 +.. _`Compiling dynamic language implementations`: dynamic-language-translation.html + +.. 
[PVE] `Simple and Efficient Subclass Tests`_, Jonathan Bachrach, Draft submission to ECOOP-02, 2001
+.. _`Simple and Efficient Subclass Tests`: http://people.csail.mit.edu/jrb/pve/pve.pdf

From cfbolz at codespeak.net Tue Dec 6 12:53:26 2005
From: cfbolz at codespeak.net (cfbolz at codespeak.net)
Date: Tue, 6 Dec 2005 12:53:26 +0100 (CET)
Subject: [pypy-svn] r20746 - pypy/dist/pypy/doc
Message-ID: <20051206115326.A412527DB5@code1.codespeak.net>

Author: cfbolz
Date: Tue Dec 6 12:53:25 2005
New Revision: 20746

Modified: pypy/dist/pypy/doc/translation-aspects.txt
Log: (pedronis, cfbolz): remove the strange introduction and replaced with the former abstract. nicefy.

Modified: pypy/dist/pypy/doc/translation-aspects.txt
==============================================================================
--- pypy/dist/pypy/doc/translation-aspects.txt	(original)
+++ pypy/dist/pypy/doc/translation-aspects.txt	Tue Dec 6 12:53:25 2005
@@ -5,29 +5,18 @@
 .. contents::
 .. sectnum::

-Abstract
-=========
+Introduction
+=============

-One of the goals of the PyPy project is it to have the memory and threading
+One of the goals of the PyPy project is to have the memory and concurrency
 models flexible and changeable without having to manually reimplement the
-interpreter. In fact, PyPy by time of the 0.8 release contains code for
-memory management and threading models which allows experimentation without
-requiring early design decisions. This document describes many details of the
-current state of the implementation of the memory object model, automatic
-memory management and threading models and describes possible future
-developments.
-
-
-Introduction
-============
+interpreter. In fact, PyPy by time of the 0.8 release contains code for memory
+management and concurrency models which allows experimentation without
+requiring early design decisions. This document describes many of the more
This document describes many of the more +technical details of the current state of the implementation of the memory +object model, automatic memory management and concurrency models and describes +possible future developments. -The main emphasis of the PyPy project is that of flexible integration: we want -to make changing memory management and threading techniques possible while at -the same time influencing the source code of interpreter as little as -possible. It is not the current goal to optimize the current approaches in -extreme ways but rather to produce solid implementations and to provide an -environment where experiments with fundamentally different ways to implement -these things are possible and reasonably easy. The low level object model =========================== @@ -94,7 +83,7 @@ by our approach: in the beginning we were using a naive linear lookup algorithm. Since subclass checking is quite a common operation (it is also used to check whether an object is an instance of a certain class) we wanted to -replace it with the more efficient relative numbering algorithm (see [PVE] for +replace it with the more efficient relative numbering algorithm (see [PVE]_ for an overview of techniques). This was a matter of just changing the appropriate code of the rtyping process to calculate the class-ids during rtyping and insert the necessary fields into the class structure. It would be similarly From arigo at codespeak.net Tue Dec 6 12:59:10 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 6 Dec 2005 12:59:10 +0100 (CET) Subject: [pypy-svn] r20751 - pypy/dist/pypy/doc Message-ID: <20051206115910.511D227DB6@code1.codespeak.net> Author: arigo Date: Tue Dec 6 12:59:08 2005 New Revision: 20751 Modified: pypy/dist/pypy/doc/getting-started.txt Log: compliancy->compliance. There are many more around, but this one is a starting point... 
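The relative numbering algorithm for subclass checks cited as [PVE]_ in the translation-aspects change above can be sketched in a few lines. This is a minimal illustration only, not PyPy's actual rtyper code: the names (`ClassDesc`, `number_classes`, `is_subclass`) are hypothetical, and the sketch covers the single-inheritance case. Classes are numbered in depth-first preorder, each class records the highest id in its subtree, and the subclass test becomes two integer comparisons on precomputed fields instead of a linear walk up the hierarchy.

```python
class ClassDesc:
    """A node in a single-inheritance class tree."""
    def __init__(self, name, parent=None):
        self.name = name
        self.children = []
        self.min_id = self.max_id = None   # filled in by number_classes()
        if parent is not None:
            parent.children.append(self)

def number_classes(cls, next_id=0):
    """Assign preorder ids; cls covers the id interval [min_id, max_id]."""
    cls.min_id = next_id
    next_id += 1
    for child in cls.children:
        next_id = number_classes(child, next_id)
    cls.max_id = next_id - 1
    return next_id

def is_subclass(sub, sup):
    """Constant-time check: sub's preorder id falls inside sup's interval."""
    return sup.min_id <= sub.min_id <= sup.max_id

# Tiny hierarchy: object -> A -> B, and object -> C
root = ClassDesc('object')
A = ClassDesc('A', root)
B = ClassDesc('B', A)
C = ClassDesc('C', root)
number_classes(root)
assert is_subclass(B, A) and is_subclass(A, root)
assert not is_subclass(C, A) and not is_subclass(A, B)
```

Because the ids are computed once while building the class structures (in PyPy's case, during rtyping), `isinstance`-style checks need no pointer chasing at run time; the cost moved from the check itself to the numbering pass.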
Modified: pypy/dist/pypy/doc/getting-started.txt ============================================================================== --- pypy/dist/pypy/doc/getting-started.txt (original) +++ pypy/dist/pypy/doc/getting-started.txt Tue Dec 6 12:59:08 2005 @@ -602,7 +602,7 @@ The `py library`_ is used for supporting PyPy development and running our tests against code and documentation as well as -compliancy tests. You don't need to install the py library because +compliance tests. You don't need to install the py library because it ships with PyPy and `pypy/test_all.py`_ is an alias for ``py.test`` but if you want to have the ``py.test`` tool generally in your path, you might like to visit: From arigo at codespeak.net Tue Dec 6 13:27:57 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 6 Dec 2005 13:27:57 +0100 (CET) Subject: [pypy-svn] r20759 - pypy/dist/pypy/doc Message-ID: <20051206122757.933DE27DB5@code1.codespeak.net> Author: arigo Date: Tue Dec 6 13:27:56 2005 New Revision: 20759 Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt Log: Fix size percentables. Modified: pypy/dist/pypy/doc/low-level-encapsulation.txt ============================================================================== --- pypy/dist/pypy/doc/low-level-encapsulation.txt (original) +++ pypy/dist/pypy/doc/low-level-encapsulation.txt Tue Dec 6 13:27:56 2005 @@ -256,7 +256,7 @@ is to make PyPy slower by about 8%. A couple of minor pending optimisations could reduce this figure a bit. We expect the rest of the performance impact to be mainly caused by the increase of size - of the generated executable (+20%). + of the generated executable (+28%). Multiple Interpreters @@ -289,7 +289,7 @@ Evaluation Strategy When translated to C code, the Thunk object space has a global - performance impact of 6%. The executable is 10% bigger (probably + performance impact of 6%. The executable is 13% bigger (probably due to the arguably excessive inlining we perform). 
We have described five aspects in this document, each currently with

From cfbolz at codespeak.net  Tue Dec  6 13:55:17 2005
From: cfbolz at codespeak.net (cfbolz at codespeak.net)
Date: Tue, 6 Dec 2005 13:55:17 +0100 (CET)
Subject: [pypy-svn] r20760 - pypy/dist/pypy/doc
Message-ID: <20051206125517.DA1D927DB5@code1.codespeak.net>

Author: cfbolz
Date: Tue Dec  6 13:55:17 2005
New Revision: 20760

Modified:
   pypy/dist/pypy/doc/translation-aspects.txt
Log:
(pedronis): minor fix to the conclusion


Modified: pypy/dist/pypy/doc/translation-aspects.txt
==============================================================================
--- pypy/dist/pypy/doc/translation-aspects.txt	(original)
+++ pypy/dist/pypy/doc/translation-aspects.txt	Tue Dec  6 13:55:17 2005
@@ -438,9 +438,7 @@
 Conclusion
 ===========
 
-As shown with various examples our approach gives us flexibility and lets us
-chooses various aspects at translation time instead of encoding them into the
-implementation itself.
+As concretely shown with various detailed examples our approach gives us flexibility and lets us choose various aspects at translation time instead of encoding them into the implementation itself.
References =========== From bea at codespeak.net Tue Dec 6 13:58:10 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Tue, 6 Dec 2005 13:58:10 +0100 (CET) Subject: [pypy-svn] r20762 - pypy/extradoc/talk/pypy_euworkshop_2005-12-08 Message-ID: <20051206125810.B121C27DB7@code1.codespeak.net> Author: bea Date: Tue Dec 6 13:58:07 2005 New Revision: 20762 Added: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/pypy_alastair_part1_intro.txt Log: alastairs part of eu-talk, he couldn?t commit to codespeak Added: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/pypy_alastair_part1_intro.txt ============================================================================== --- (empty file) +++ pypy/extradoc/talk/pypy_euworkshop_2005-12-08/pypy_alastair_part1_intro.txt Tue Dec 6 13:58:07 2005 @@ -0,0 +1,211 @@ +Introduction + +World Domination + + - Quote from Linus Torvalds, when he first asked for help in building + Linux. + +--------------------------------------------------------------------------- + +Notes: + + - Figures for growth of Linux + + - A student just learning computing. + + - Now works full time on Linux, in a non-profit mainly targeted at + promoting his baby. + + - Linux now at the centre of the business strategy of IBM, Sun, + Nokia, Sony etc. + + - Without the internet, it would have remained the hobby project of + a student. + + - The talk will cover the mechanisms through which Torvalds and + others have built up such effective communities. It will centre on + the community we know best - that of Python and PyPy. + +--------------------------------------------------------------------------- + +Talk Structure + + - Introduction + + - Free / Open Source Software. + + - Python programming language. + + - Elements of typical F/OSS development. + + - View from the Trenches + + - Typical Python development. + + - PyPy - building a better Python. + + - A F/OSS community and the EU. + + - Agile Programming Practices + + - Best practice in software engineering. 
+ + - Agile methods and the typical F/OSS project. + + - Agile methods and PyPy - sprints. + + - Discussion + +--------------------------------------------------------------------------- + +Notes: + + - Handouts + + - Weblinks for further inforamtion. + + - Contacts. + +--------------------------------------------------------------------------- + +Free and Open Source Software + + - Four freedoms + +--------------------------------------------------------------------------- + +Notes: + + - The two communities and the F/OSS acronym (also FLOSS). + + - Python has elements of both communities. + + - Some Python developers also produce proprietary source code. + + - One man projects are just as free and open source as big ones, but + the openness supports cooperative development. + +--------------------------------------------------------------------------- + +What is Python? + + - Executable pseudo-code. Example. + + - Guido van Rossum - Benevolent Dictator for Life (BDFL). + + - The typical Pythonista. + + - Not the most popular F/OSS language, but has many dedicated fans: + + - Google. + + - Tim Berners Lee - plane flight coding projects. + +--------------------------------------------------------------------------- + +Notes: + + - Python is a general purpose language. PyPy will remove barriers to + its take-up in some sectors. + + - GvR - started in academic environment, in Netherlands, then US. + + - Currently, most people working with Python do so out of + choice. This is reflected in the nature of the + community. Popularity may change this. + +--------------------------------------------------------------------------- + +Python Principles - What Shapes the Community + + - Priorities and principle - Gabriel "Worse is better" + + - Python principles - highlight the interesting ones. 
+ +--------------------------------------------------------------------------- + +Notes: + + - Almost all programmers value the same set of principles, but + priorities are everything. + +--------------------------------------------------------------------------- + +F/OSS Community Practices + + - Mailing list - no email, no community. + + - Source code management - read / write access. + + - Bug / feature request tracking. + + - Newsgroups, forums - users and developers. + + - Web page: + + - About, News + + - Download - often stable and development branches + + - Support - Documentation, FAQ, Wiki. + + - Related projects. + + - IRC - chat. + + - Developer weblogs. + + - Newsletters. + +--------------------------------------------------------------------------- + +Notes: + + - Role of Sourceforge - and moves from Sourceforge: + + - Central point of failure. + + - CVS to SVN. + + - Declining role of USENET. + + - Formal documentation spotty - forums and IRC can provide excellent + support. This can be a deciding factor in choosing software. + + - Related projects - friendly rivalry. Much resuing of ideas and + some code. + +--------------------------------------------------------------------------- + +F/OSS Community Practices - Formal Structure + + - Sub-comunities in larger projects - special interest groups. + + - Regular meetings and conferences. + + - Non-proft organisations: + + - Organise meetings and marketing. + + - Hold copyright. + + - Parallel to technical structure. + + - Semi-formal decision processes: + + - Python PEPS - proposed changes in the language and development. + + - BDFLs and the art of saying "no". + +--------------------------------------------------------------------------- + +Notes: + + - Meritocracy. + + - Ability to fork. + + - Tensions between large commercial developers and outsiders. 
+ + +--------------------------------------------------------------------------- \ No newline at end of file From cfbolz at codespeak.net Tue Dec 6 14:13:46 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Tue, 6 Dec 2005 14:13:46 +0100 (CET) Subject: [pypy-svn] r20765 - pypy/extradoc/talk/pypy_euworkshop_2005-12-08 Message-ID: <20051206131346.8814A27DB5@code1.codespeak.net> Author: cfbolz Date: Tue Dec 6 14:13:42 2005 New Revision: 20765 Added: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/loc.png (contents, props changed) pypy/extradoc/talk/pypy_euworkshop_2005-12-08/subscribers.png (contents, props changed) pypy/extradoc/talk/pypy_euworkshop_2005-12-08/webaccess.png (contents, props changed) Log: add pictures Added: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/loc.png ============================================================================== Binary file. No diff available. Added: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/subscribers.png ============================================================================== Binary file. No diff available. Added: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/webaccess.png ============================================================================== Binary file. No diff available. 
From bea at codespeak.net Tue Dec 6 15:03:01 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Tue, 6 Dec 2005 15:03:01 +0100 (CET) Subject: [pypy-svn] r20771 - pypy/extradoc/talk/pypy_euworkshop_2005-12-08 Message-ID: <20051206140301.43F9927DBB@code1.codespeak.net> Author: bea Date: Tue Dec 6 15:03:00 2005 New Revision: 20771 Modified: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/pypy_alastair_part1_intro.txt Log: some typos Modified: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/pypy_alastair_part1_intro.txt ============================================================================== --- pypy/extradoc/talk/pypy_euworkshop_2005-12-08/pypy_alastair_part1_intro.txt (original) +++ pypy/extradoc/talk/pypy_euworkshop_2005-12-08/pypy_alastair_part1_intro.txt Tue Dec 6 15:03:00 2005 @@ -62,7 +62,7 @@ - Handouts - - Weblinks for further inforamtion. + - Weblinks for further information. - Contacts. @@ -179,7 +179,7 @@ F/OSS Community Practices - Formal Structure - - Sub-comunities in larger projects - special interest groups. + - Sub-communities in larger projects - special interest groups. - Regular meetings and conferences. 
From bea at codespeak.net Tue Dec 6 15:38:31 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Tue, 6 Dec 2005 15:38:31 +0100 (CET) Subject: [pypy-svn] r20772 - pypy/extradoc/talk/pypy_euworkshop_2005-12-08 Message-ID: <20051206143831.3CA4927DB6@code1.codespeak.net> Author: bea Date: Tue Dec 6 15:38:29 2005 New Revision: 20772 Modified: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/bea_part3_agility Log: created slides (9 in total) from my notes Modified: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/bea_part3_agility ============================================================================== --- pypy/extradoc/talk/pypy_euworkshop_2005-12-08/bea_part3_agility (original) +++ pypy/extradoc/talk/pypy_euworkshop_2005-12-08/bea_part3_agility Tue Dec 6 15:38:29 2005 @@ -6,29 +6,30 @@ Slides: +------------------------------------------------------------------------------------------------------------------------- 1. -- Agile development grew out of a need to handle rapid change - in processes surrounding commercial software development - -- How then do agile approaches fit distributed, open-source projects without - the need to handle changing formal requirements and client relations? +Agile development +- Need to handle rapid change in commercial software development +- How do agile approaches fit distributed, open-source projects? +-------------------------------------------------------------------------------------------------------------------------- +2. 
-- The answer points to the core of Agile practises: the people factor
-  "Agile processes are designed to capitalize on each individual and each team's
+Core of Agile practises: the people factor
+- "Agile processes are designed to capitalize on each individual and each team's
   unique strengths" (Cockburn, Highsmith, 2001)
+- OSS nature of teams: self-organized, intensely collaborative - fit the agile approach
+- OSS teams are a unique implementation of agile practices
+-------------------------------------------------------------------------------------------------------------------------
+3.
 
-- The OSS nature of teams being self-organized and intensely collaborative fits the
-  agile approach, although OSS teams are an unique implementation due to the distributed
-  nature of work
-
-2.
-
-- Agile approaches aim at:
-  * reducing "cost of information" and distance between decision-making and it's implementation
-  * by locating the team closer in a physical sense, replace documentation with face-to-face dissemination
-  * resulting in improved sense of community and team "morale", the foundation of pro-active teams
+Agile approaches aim at:
+ * reducing "cost of information", distance from decision-making
+ * by physical location, unorthodox dissemination
+ * resulting in improved sense of community, team "morale"
+-----------------------------------------------------------------------------------------------------------------------
+Notes:
 - OSS teams fit the criteria very well if you look at the "physical" aspect in a more unorthodox
   sense. Transparent intense, daily communication via IRC, emails and wiki:s make up for a lot of this.
@@ -36,37 +37,70 @@
   technique, sprints, to make up for the lack of "physical" interaction between programmers. This
   technique is now widely used within the Python community.
-3.
-
-- Sprints are ""two-day or three-day focused development session, in which developers pair off
-  together in a room and focus on building a particular subsystem". In this implementation it fits
-  agile criterias because of the knowledge/learning aspects as well as the incremental approaches.
-
-- Sprinting was the key agile technique in the start-up of PyPy, work being non-funded. While working on the
-  proposal (during sprints) the challenge was to tailor a project process based on sprinting that would fit and
-  work within an EU framework. (picture - sprint process)
-
-- Sprinting is central to the PyPy project because it is the focus point of the funded, consortium-based efforts
-  as well as the non-funded OSS efforts. Primarily focused on programming but there are also regular dissemination
-  activities (tutorials, talks) as well as consortium/management coordination.
-
+-------------------------------------------------------------------------------------------------------------------------
 4.
 
-- "Agile teams are characterized by self-organization and intense collaboration, within and across organizational
-  boundaries" (Cockburn, Highsmith, 2001) How do one structure an agile OSS community into a consortium of 7 partners?
+"Agile teams are characterized by self-organization and intense collaboration, within and across organizational
+ boundaries" (Cockburn, Highsmith, 2001)
+How does one structure an agile OSS community into a consortium of 7 partners?
+ - create developer driven organization
+ - roles and responsibility (management team + technical board)
+ - uses IRC channels, version control (SVN) on consortium level
+--------------------------------------------------------------------------------------------------------------------------
+Notes:
 - In order to stay true to the agile vision as much as possible, the consortium structure and roles/responsibilities
   are supporting a developer-driven, flat organization.
Much of the coordination of work is delegated to the core developers. Regular "sync" meetings (once per week) are done via IRC in which the community of developers (funded - as well as non-funded) coordinate development work, keeping communication as transparent as possible. +- as well as non-funded) coordinate development work, keeping communication as transparent as possible. - Consortium meetings are done once every month via IRC with developers attending as well, physical consortium meetings are done in conjunction with sprints. The tools for automated test driven development and version control are implemented on consortium documentation, reducing the gap between the consortium and the community in ways of working. - -- Contribution from the community is partially funded through the process of "physical persons", entering the consortium - as individual partners, recieving funding for travel and accommodation during sprints. - -- Striking a balance between agile approaches within the OSS community of PyPy and the funded consortium structure of PyPy - is a constant challenge but an crucial one. The results from the first year of the project show important results supporting - this effort. \ No newline at end of file + +------------------------------------------------------------------------------------------------------------------------- +5. + +Sprints are + "two-day or three-day focused development session, in which developers pair off + together in a room and focus on building a particular subsystem". +------------------------------------------------------------------------------------------------------------------------- +Notes: +- In this implementation it fits agile criterias because of the knowledge/learning aspects as well as the + incremental approaches. + +- Sprinting was the key agile technique in the start-up of PyPy, work being non-funded. 
While working on the
+  proposal (during sprints) the challenge was to tailor a project process based on sprinting that would fit and
+  work within an EU framework. (picture - sprint process)
+-------------------------------------------------------------------------------------------------------------------------
+6.
+
+Sprinting is central to the PyPy project:
+  - funded as well as non-funded work
+  - dissemination (talks, tutorials, pairprogramming)
+  - consortium activities (meetings, planning, coordinating wp work)
+  - contribution from community via "physical persons" structure
+
+-------------------------------------------------------------------------------------------------------------------------
+7.
+
+-------------------------------------------------------------------------------------------------------------------------
+8.
+
+PyPy sprints are evolving
+  - evaluations are done with "external" participants
+  - different forms of sprints with different focus
+  - sprints at conferences are growing into workshops
+-------------------------------------------------------------------------------------------------------------------------
+9.
+
+Sprints have already resulted in:
+  - more people knowing and understanding the vision of PyPy
+  - more people "recruited" into the community
+  - people "recruited" into the consortium (physical persons)
+  - people "recruited" into PyPy companies
+
+---------------------------------------------------------------------------------------------------------------------------
+Notes:
+Striking a balance between agile approaches within the OSS community of PyPy and the funded consortium structure of PyPy
+is a constant challenge but a crucial one. The important results from the first year of the project show this.
\ No newline at end of file From bea at codespeak.net Tue Dec 6 15:50:58 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Tue, 6 Dec 2005 15:50:58 +0100 (CET) Subject: [pypy-svn] r20774 - pypy/extradoc/talk/pypy_euworkshop_2005-12-08 Message-ID: <20051206145058.EC24827DB4@code1.codespeak.net> Author: bea Date: Tue Dec 6 15:50:57 2005 New Revision: 20774 Modified: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/pypy_alastair_part1_intro.txt Log: fixed typo Modified: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/pypy_alastair_part1_intro.txt ============================================================================== --- pypy/extradoc/talk/pypy_euworkshop_2005-12-08/pypy_alastair_part1_intro.txt (original) +++ pypy/extradoc/talk/pypy_euworkshop_2005-12-08/pypy_alastair_part1_intro.txt Tue Dec 6 15:50:57 2005 @@ -172,7 +172,7 @@ - Formal documentation spotty - forums and IRC can provide excellent support. This can be a deciding factor in choosing software. - - Related projects - friendly rivalry. Much resuing of ideas and + - Related projects - friendly rivalry. Much reusing of ideas and some code. --------------------------------------------------------------------------- From arigo at codespeak.net Tue Dec 6 16:03:55 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 6 Dec 2005 16:03:55 +0100 (CET) Subject: [pypy-svn] r20776 - pypy/dist/pypy/doc Message-ID: <20051206150355.321F427B47@code1.codespeak.net> Author: arigo Date: Tue Dec 6 16:03:53 2005 New Revision: 20776 Modified: pypy/dist/pypy/doc/ (props changed) pypy/dist/pypy/doc/conftest.py pypy/dist/pypy/doc/test_redirections.py Log: py.test --generate-redirections now produces stub HTML files based on the 'redirections' file. 
Modified: pypy/dist/pypy/doc/conftest.py ============================================================================== --- pypy/dist/pypy/doc/conftest.py (original) +++ pypy/dist/pypy/doc/conftest.py Tue Dec 6 16:03:53 2005 @@ -1,6 +1,13 @@ import py from py.__.documentation.conftest import Directory, DoctestText, ReSTChecker +Option = py.test.Config.Option +option = py.test.Config.addoptions("pypy-doc options", + Option('--generate-redirections', action="store_true", + dest="generateredirections", + default=False, help="Generate the redirecting HTML files"), + ) + class PyPyDoctestText(DoctestText): def run(self): Modified: pypy/dist/pypy/doc/test_redirections.py ============================================================================== --- pypy/dist/pypy/doc/test_redirections.py (original) +++ pypy/dist/pypy/doc/test_redirections.py Tue Dec 6 16:03:53 2005 @@ -1,21 +1,53 @@ import py +from pypy.doc.conftest import option redir = py.magic.autopath().dirpath('redirections') def checkexist(path): print "checking", path assert path.new(ext='.txt').check(file=1) - + +def checkredirection(oldname, newname): + print "checking", newname + newpath = redir.dirpath(newname.split('#')[0]) + checkexist(newpath) + # HACK: create the redirecting HTML file here... 
+ # XXX obscure fishing + if option.generateredirections and '#' not in oldname: + generate_redirection(oldname, newname) + def test_eval(): d = eval(redir.read(mode='r')) return d def test_redirections(): d = test_eval() - for newname in d.values(): - yield checkexist, redir.dirpath(newname) + for oldname, newname in d.items(): + yield checkredirection, oldname, newname def test_navlist(): navlist = eval(redir.dirpath('navlist').read()) for entry in navlist: yield checkexist, redir.dirpath(entry) + +# ____________________________________________________________ + +def generate_redirection(oldname, newname): + print "redirecting from", oldname + oldpath = redir.dirpath(oldname) + url = newname # relative URL + oldpath.write(""" + + + + + + +

+ you should be redirected to + %s +

+ + +""" % (url, url, url)) From arigo at codespeak.net Tue Dec 6 16:18:32 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 6 Dec 2005 16:18:32 +0100 (CET) Subject: [pypy-svn] r20778 - pypy/dist/pypy/doc Message-ID: <20051206151832.0647427B47@code1.codespeak.net> Author: arigo Date: Tue Dec 6 16:18:31 2005 New Revision: 20778 Removed: pypy/dist/pypy/doc/misc.txt Modified: pypy/dist/pypy/doc/redirections Log: Got rid of misc.txt, fixed some redirections, don't care about the very very very old redirections to misc.html. Modified: pypy/dist/pypy/doc/redirections ============================================================================== --- pypy/dist/pypy/doc/redirections (original) +++ pypy/dist/pypy/doc/redirections Tue Dec 6 16:18:31 2005 @@ -15,11 +15,8 @@ 'checking_ReST.html' : 'coding-guide.html#pypy-documentation', 'optionaltools.html' : 'coding-guide.html#optionaltool', - 'newrepolayout.html' : 'misc.html#newrepolayout', - 'goals.html' : 'misc.html#goals', - - 'developers.html' : 'misc.html#developers', - 'cmodules.html' : 'misc.html#cmodules', + 'developers.html' : 'contributor.html', + 'goals.html' : 'architecture.html', 'annotation.html' : 'translation.html#annotator', 'basicblock.asc' : 'objspace.html#the-flow-model', From bea at codespeak.net Tue Dec 6 16:25:30 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Tue, 6 Dec 2005 16:25:30 +0100 (CET) Subject: [pypy-svn] r20781 - pypy/extradoc/sprintinfo/paris Message-ID: <20051206152530.2BEBD27B47@code1.codespeak.net> Author: bea Date: Tue Dec 6 16:25:29 2005 New Revision: 20781 Added: pypy/extradoc/sprintinfo/paris/paris-report.txt Log: recreated the paris sprint report out fo mails to pypy-dev Added: pypy/extradoc/sprintinfo/paris/paris-report.txt ============================================================================== --- (empty file) +++ pypy/extradoc/sprintinfo/paris/paris-report.txt Tue Dec 6 16:25:29 2005 @@ -0,0 +1,234 @@ +Paris sprint report 
2005-10-10-2005-10-16
+Authors: Michael Hudson, Carl Friedrich Bolz, Armin Rigo
+
+Participants:
+
+Ludovic Aubry
+Adrien Di Mascio
+Jacob Hallen
+Laura Creighton
+Beatrice Duering
+Armin Rigo
+Samuele Pedroni
+Anders Chrigstroem
+Holger Krekel
+Lene Wagner
+Michael Hudson
+Carl Friedrich Bolz
+Bert Freudenberg
+Anders Lehmann
+Boris Feigin
+Amaury Forgeot d'Arc
+Andrew Thompson
+Christian Tismer
+Valentino Volonghi
+Aurelien Campeas
+Stephan Busemann
+Nicholas Chauvat
+
+-----------------------------------------------------------------------------------------------------------------------------------------
+The largest PyPy sprint yet (in terms of attendees) was held in the offices of Logilab in Paris,
+2005-10-10 to 2005-10-16.
+
+Possible tasks for the sprint were:
+
+RTyper tasks:
+- fixed size lists
+- poor-man's type erasure for rlist and rdict
+
+- rtyping of classes/instances/methods for target languages with direct/prevalent OO/class support:
+  devise a low-level model variation for this case
+
+Annotator related tasks:
+- support the notion of separate/external functions (and classes/PBCs)
+  in preparation for separate compilations
+
+JIT related work:
+- (DONE) support addresses in the backends
+- an ll interpreter written in RPython
+- Saving ll graphs plus a loader written in RPython
+
+- Start thinking/experimenting with JIT generation at translation time
+
+- (DONE the starting) start a genasm back-end
+
+Threading/concurrency:
+- release the GIL around system calls
+- understand and possibly fix where the overhead, when threads are enabled, comes from
+
+- (DONE) generating (single-threaded) stackless C code
+
+Implementation/translation:
+- (somewhat DONE) stack overflow detection (needed to be able to run many compliancy tests)
+- try more interp. level optimisations (dicts with string keys, more aggressive use of fastcall*...)
+
+- compute correct max stack size in compiler (?)
+- cleanup our multiple compiler situation: remove + testcompiler, fix the tests to work on CPython 2.3 too, + decide what to put in lib-python/modified-2.4.1/compiler -- + stablecompiler or astcompiler? -- and submit it back to + CPython. Clean up pyparser/pythonutil.py. + +- (socket module, PEP302) + +GC related tasks: +- look into implementing weakrefs +- Boehm: fix the x=range(10**7) issue +- (improve refcounting) +- (passes toward integrating other GCs, rpython/memory) + +Refactorings/cleanups: +- cbuild/translator.Translator (use SCons?, use/generalize TranslationDriver) +- PyPy option handling unification, passing py.py options to targetpypy* +- inline (transforms in general) + +- (DONE) genc: producing multiple .h/.c files tracking Python code origin + +Larger whole projects: +- Javascript frontend +- support for developing C extensions for CPython with RPython code +- writing a sort of flow object space for the llinterpreter + to experiment with JIT work + +---------------------------------------------------------------------------------------------------------------------------------------- +Activities during the sprint - day-to-day: + +Monday-Wednesday: +The morning (once everyone had found their way/fought with the +metro/...) began with a tutorial for the newcomers and two discussion +groups -- one on implementing stackless-like functionality and one +titled "towards a translatable llinterpreter". + +After lunch, everyone met to hear the results of the discussion groups +and decide what to do next. The stackless group's conclusion was +"it'll be easy!" :) The llinterp group concluded "it might be doable". +More details can be found in svn at + +http://codespeak.net/svn/pypy/extradoc/sprintinfo/paris-2005-stackless-discussion.txt +http://codespeak.net/svn/pypy/extradoc/sprintinfo/paris/tllinterpreter_planning.txt + +(consistency? We're researchers!) 
+
+Christian (he didn't get a choice), Valentino, Anders (L), Adrien,
+Armin and Amaury became the stackless working group and by Tuesday
+lunchtime had progressed via an unlikely sounding six-person-pair
+programming methodology involving a beamer to a fully stackless C
+translation, albeit with limited functionality visible even to
+RPython.
+
+Samuele, Bert, Arre, Aurelien, and Boris became what was ultimately
+known as the 'ootype group' working on a variation of the rtyper more
+suited to translation to a language with a richer type system than C
+(classes, lists, some vague notion of type safety, etc) such as Java,
+Smalltalk, ...
+
+Michael and Andrew worked on a backend that emits machine code -- in
+particular ppc32 machine code -- directly. By the end of Monday a toy
+function doing some simple integer calculations had been translated
+but on Tuesday restructuring towards re-use (and comprehensibility)
+became the main goal. Oh, and not assuming an infinite supply of
+registers...
+
+Carl and Holger started implementing Addresses in the C backend to
+prepare for the coming llinterpreter work, finishing on Monday
+evening. Carl then worked on data structures needed for a
+translatable llinterpreter.
+
+Thursday:
+Consortium meeting + breakday
+
+Friday-Sunday:
+The stackless group reported good progress, having compiled a working
+pypy-c with stackless support and implemented stack overflow detection
+for non-stackless builds. Unfortunately for the stackless builds,
+several CPython tests expect infinite recursion to result in an error
+-- and do so fairly quickly, i.e. in less time than it takes to fill
+the entire heap with stack frames. This group's work is basically
+done for this sprint, although Anders and Christian are going to work
+on compliancy testing with the new pypy-c. While waiting for builds
+targeting compliancy tests, they are also going to investigate
+reorganizing code generation to improve locality.
+
+The ootype group is also progressing well. The RTyper has mostly been
+refactored to be independent of the targeted type system and work is
+continuing on implementing the new OOType type system alongside the
+existing LLType target. This group will be continuing, although with
+somewhat different membership -- Michael joining and half of each of
+Samuele and Arre leaving.
+
+Michael and Andrew's work on the PPC backend has progressed to the
+point where essentially any function that only manipulates integers
+can be translated (with an exceedingly stupid register allocator).
+Further work depends to a large extent on the llinterpreter work (see
+below) so this work will wait until after the sprint. Andrew is
+moving to work on implementing a Numeric-a-like for PyPy, together
+with Ludovic.
+
+The LLInterpreter grouplette (Carl and 0.5 Armins and a little Holger)
+did not produce much code since there are many decisions to be made
+and the implications of these decisions are not understood. A
+discussion group of Carl, Armin, Samuele, Holger, Christian and Arre
+will try to shine lights into these shadows, and report after lunch.
+
+Valentino and Amaury are going to implement the socket module. This
+is a step towards allowing Valentino to run Twisted on PyPy and thus
+make him very happy.
+
+This sprint is working in quite a different way to previous sprints --
+there is lots of discussion which isn't new, but the farming off of
+discussion to groups of 5-6 people who present a report to the larger
+group is a novelty and seems to be working well (15+ people is too
+many for a focussed technical discussion).
Another difference is a
+less strict emphasis on "pair" programming -- or, if you like, we are
+still pair programming but we have redefined "pair" to mean a group of
+two to six programmers :)
+
+On Friday morning another discussion group was founded and discussed -
+again - the state and future of the l3interpreter (l3 = lll = low low
+level), that is the translatable llinterpreter. The results were
+presented after lunch, together with some ideas about the JIT.
+Afterwards Carl gave a short talk on the results of his summer of code
+project on writing garbage collectors in RPython.
+
+Boris, Michael and Bert, with help from Samuele, spent the whole rest of the
+sprint working on the many open issues related to ootyping. Simple
+programs can now be ootyped, including inheritance, methods, instance
+attributes and right at the end some support for prebuilt instances. In
+addition they extended the llinterpreter to understand the ootype
+operations as well (we were worried that our names were starting to make
+sense).
+
+Armin spent the last days fixing different cases that crashed pypy-c
+which he found by running the CPython compliancy tests. In addition he
+helped various people to find their way around the codebase.
+
+Adrien and Arre worked on fixing compiler and parser issues that led to
+wrong line numbers and different issues that popped up.
+
+Ludovic and Adrien experimented with rewriting parts of the Numeric
+package in RPython.
+
+Valentino and Amaury continued on implementing the socket module which
+turned out (as expected) to be a platform-dependent nightmare. They have
+a kind of complete socket module now, but some functions cannot yet be
+translated.
+
+Christian worked on an experiment to reorder functions in the created C
+code to improve code locality.
+
+Finally, after a week of scrapped attempts, much headscratching and heated
+discussion, there was some code written for the l3interpreter.
On
+Saturday afternoon Holger and Carl wrote the basic model and managed to
+interpret interesting functions like x + 4. On Sunday Samuele and Carl
+continued and started on a graph converter that takes ll graphs and
+transforms them into the form the l3interpreter expects.
+
+On Saturday afternoon there was a planning meeting where the actions of
+the following weeks were discussed. The EU-report writing was
+distributed to the different consortium members.
+Furthermore we discussed the various conferences and sprints planned for autumn 2005
+and spring 2006.
+
+All in all it was a very productive sprint but of course we all have to
+recover for two weeks now.
+

From rxe at codespeak.net Tue Dec 6 16:26:46 2005
From: rxe at codespeak.net (rxe at codespeak.net)
Date: Tue, 6 Dec 2005 16:26:46 +0100 (CET)
Subject: [pypy-svn] r20782 - in pypy/dist/pypy/translator/llvm: . module test
Message-ID: <20051206152646.6993327B47@code1.codespeak.net>

Author: rxe
Date: Tue Dec 6 16:26:45 2005
New Revision: 20782

Modified:
   pypy/dist/pypy/translator/llvm/externs2ll.py
   pypy/dist/pypy/translator/llvm/module/genexterns.c
   pypy/dist/pypy/translator/llvm/test/runtest.py
Log:
Local changes that were pending before merge occurred:

llvm-gcc is now required on the local machine, including for tests (so they
might mysteriously skip now if you don't have the llvm front end
compiler installed.)
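[Editorial note, not part of the original mail: the skip behaviour described
in this log comes down to probing PATH for the compiler. The 2005 code used
py.path.local.sysfind("llvm-gcc") -- and, earlier, os.popen('which llvm-gcc')
as visible in the diff below. A present-day stdlib sketch of the same check;
have_tool is a hypothetical helper name, and shutil.which (Python 3.3+) is
assumed as the modern analogue of sysfind:]

```python
import shutil

def have_tool(name):
    # shutil.which mirrors the shell's `which`: it returns the full path
    # of an executable found on PATH, or None if the tool is absent.
    return shutil.which(name) is not None

# A test harness would skip its llvm tests when this is False:
if not have_tool("llvm-gcc"):
    print("llvm-gcc not found -- llvm tests would mysteriously skip")
```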
Modified: pypy/dist/pypy/translator/llvm/externs2ll.py ============================================================================== --- pypy/dist/pypy/translator/llvm/externs2ll.py (original) +++ pypy/dist/pypy/translator/llvm/externs2ll.py Tue Dec 6 16:26:45 2005 @@ -24,6 +24,16 @@ "%LLVM_RPython_StartupCode", ] +def get_c_cpath(): + from pypy.translator import translator + return os.path.dirname(translator.__file__) + +def get_genexterns_path(): + return os.path.join(get_llvm_cpath(), "genexterns.c") + +def get_llvm_cpath(): + return os.path.join(os.path.dirname(__file__), "module") + def get_ll(ccode, function_names): function_names += support_functions filename = str(udir.join("ccode.c")) @@ -31,16 +41,13 @@ f.write(ccode) f.close() - llvm_gcc = os.popen('which llvm-gcc 2>&1').read() - if llvm_gcc and not llvm_gcc.startswith('which'): #local llvm CFE available - #log('using local llvm-gcc') - plain = filename[:-2] - os.system("llvm-gcc -S %s.c -o %s.ll 2>&1" % (plain, plain)) - llcode = open(plain + '.ll').read() - else: #as fallback use remove CFE. XXX local and remote should be similar machines! 
- #log('falling back on remote llvm-gcc') - request = urllib.urlencode({'ccode':ccode}) # goto codespeak and compile our c code - llcode = urllib.urlopen('http://codespeak.net/pypy/llvm-gcc.cgi', request).read() + plain = filename[:-2] + cmd = "llvm-gcc -I%s -I%s -S %s.c -o %s.ll 2>&1" % (get_llvm_cpath(), + get_c_cpath(), + plain, + plain) + os.system(cmd) + llcode = open(plain + '.ll').read() # strip lines ll_lines = [] @@ -125,12 +132,6 @@ return decls -def path_join(root_path, *paths): - path = root_path - for p in paths: - path = os.path.join(path, p) - return path - def generate_llfile(db, extern_decls, entrynode, standalone): ccode = [] function_names = [] @@ -142,16 +143,9 @@ assert '\n' not in llname ccode.append('#define\t%s\t%s\n' % (c_name, llname)) - # special case name entry_point XXX bit underhand - for k, v in db.obj2node.items(): - try: - if isinstance(lltype.typeOf(k), lltype.FuncType): - if v == entrynode and standalone: - predeclarefn("__ENTRY_POINT__", v.get_ref()) - ccode.append('#define ENTRY_POINT_DEFINED 1\n\n') - break - except TypeError, exc: - pass + if standalone: + predeclarefn("__ENTRY_POINT__", entrynode.get_ref()) + ccode.append('#define ENTRY_POINT_DEFINED 1\n\n') for c_name, obj in extern_decls: if isinstance(obj, lltype.LowLevelType): @@ -171,46 +165,17 @@ assert False, "unhandled extern_decls %s %s %s" % (c_name, type(obj), obj) # start building our source - src = open(path_join(os.path.dirname(__file__), - "module", - "genexterns.c")).read() + src = open(get_genexterns_path()).read() - # set python version to include + # XXX MESS: set python version to include if sys.platform == 'darwin': python_h = '"/System/Library/Frameworks/Python.framework/Versions/2.3/include/python2.3/Python.h"' else: python_h = '' - src = src.replace('__PYTHON_H__', python_h) - - # add our raising ops - s = open(path_join(os.path.dirname(__file__), - "module", - "raisingop.h")).read() - src = src.replace('__RAISING_OPS__', s) - - - from 
pypy.translator.c import extfunc - src_path = path_join(os.path.dirname(extfunc.__file__), "src") - - include_files = [path_join(src_path, f + ".h") for f in - ["thread", "ll_os", "ll_math", "ll_time", - "ll_strtod", "ll_thread", "stack"]] - - includes = [] - for f in include_files: - s = open(f).read() - - # XXX this is getting a tad (even more) ridiculous - for name in ["ll_osdefs.h", "thread_pthread.h"]: - include_str = '#include "%s"' % name - if s.find(include_str) >= 0: - s2 = open(path_join(src_path, name)).read() - s = s.replace(include_str, s2) - includes.append(s) + src = src.replace('__PYTHON_H__', python_h) - src = src.replace('__INCLUDE_FILES__', "".join(includes)) - ccode.append(src) ccode = "".join(ccode) + ccode += src return get_ll(ccode, function_names) Modified: pypy/dist/pypy/translator/llvm/module/genexterns.c ============================================================================== --- pypy/dist/pypy/translator/llvm/module/genexterns.c (original) +++ pypy/dist/pypy/translator/llvm/module/genexterns.c Tue Dec 6 16:26:45 2005 @@ -24,10 +24,16 @@ #include __PYTHON_H__ // overflows/zeros/values raising operations -__RAISING_OPS__ +#include "raisingop.h" // append some genc files here manually from python -__INCLUDE_FILES__ +#include "c/src/thread.h" +#include "c/src/ll_os.h" +#include "c/src/ll_math.h" +#include "c/src/ll_time.h" +#include "c/src/ll_strtod.h" +#include "c/src/ll_thread.h" +#include "c/src/stack.h" // setup code for ThreadLock Opaque types char *RPyOpaque_LLVM_SETUP_ThreadLock(struct RPyOpaque_ThreadLock *lock, Modified: pypy/dist/pypy/translator/llvm/test/runtest.py ============================================================================== --- pypy/dist/pypy/translator/llvm/test/runtest.py (original) +++ pypy/dist/pypy/translator/llvm/test/runtest.py Tue Dec 6 16:26:45 2005 @@ -7,6 +7,7 @@ def llvm_is_on_path(): try: py.path.local.sysfind("llvm-as") + py.path.local.sysfind("llvm-gcc") except py.error.ENOENT: return False 
return True

From bea at codespeak.net Tue Dec 6 16:47:19 2005
From: bea at codespeak.net (bea at codespeak.net)
Date: Tue, 6 Dec 2005 16:47:19 +0100 (CET)
Subject: [pypy-svn] r20786 - pypy/extradoc/talk
Message-ID: <20051206154719.60DB727B58@code1.codespeak.net>

Author: bea
Date: Tue Dec 6 16:47:18 2005
New Revision: 20786

Modified:
   pypy/extradoc/talk/conference-attendance.txt
Log:
updated with info (acceptance PyCon) about the agile 2006 conference

Modified: pypy/extradoc/talk/conference-attendance.txt
==============================================================================
--- pypy/extradoc/talk/conference-attendance.txt	(original)
+++ pypy/extradoc/talk/conference-attendance.txt	Tue Dec 6 16:47:18 2005
@@ -44,8 +44,28 @@
 
 Time & location: 24-26 Feburary, 2006, Addison, Texas (near Dallas)
 
-out of the pypy group four talks were submitted.
-Acceptance pending (15th November ASFAIK).
+Out of the pypy group four talks were submitted.
+Acceptance pending (15th November AFAIK).
+The 3 PyPy talks were accepted into the PyCon programme!
 
 Due: sprint planning, possibly inviting Ka Ping (regarding E-lang)
+
+Agile 2006
+-----------
+
+DEADLINE: 23 June 2005 (end of early bird registration)
+
+Annual US agile conference (the main agile alliance conference) - last year was
+sold out.
+ +Call for papers: +http://www.agile2006.org./ +- 31 January for research papers +- 28 February for experience reports + +Time and location: 23-28th of July 2006, Minneapolis, Minnesota, USA + +Change Maker (Bea) will plan to submit: +- experience report +- research paper From tismer at codespeak.net Tue Dec 6 17:07:11 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Tue, 6 Dec 2005 17:07:11 +0100 (CET) Subject: [pypy-svn] r20789 - pypy/dist/pypy/doc Message-ID: <20051206160711.5B43927B47@code1.codespeak.net> Author: tismer Date: Tue Dec 6 17:07:10 2005 New Revision: 20789 Modified: pypy/dist/pypy/doc/translation-aspects.txt Log: Modified: pypy/dist/pypy/doc/translation-aspects.txt ============================================================================== --- pypy/dist/pypy/doc/translation-aspects.txt (original) +++ pypy/dist/pypy/doc/translation-aspects.txt Tue Dec 6 17:07:10 2005 @@ -8,9 +8,9 @@ Introduction ============= -One of the goals of the PyPy project is it to have the memory and concurrency -models flexible and changeable without having to manually reimplement the -interpreter. In fact, PyPy by time of the 0.8 release contains code for memory +One of the goals of the PyPy project is to have the memory and concurrency +models flexible and changeable without having to reimplement the +interpreter manually. In fact, PyPy, by the time of the 0.8 release contains code for memory management and concurrency models which allows experimentation without requiring early design decisions. This document describes many of the more technical details of the current state of the implementation of the memory @@ -82,9 +82,9 @@ The way we do subclass checking is a good example of the flexibility provided by our approach: in the beginning we were using a naive linear lookup algorithm. 
Since subclass checking is quite a common operation (it is also used -to check whether an object is an instance of a certain class) we wanted to +to check whether an object is an instance of a certain class), we wanted to replace it with the more efficient relative numbering algorithm (see [PVE]_ for -an overview of techniques). This was a matter of just changing the appropriate +an overview of techniques). This was a matter of changing just the appropriate code of the rtyping process to calculate the class-ids during rtyping and insert the necessary fields into the class structure. It would be similarly easy to switch to another implementation. @@ -92,10 +92,10 @@ Identity hashes --------------- -In the RPython type system class instances can be used as dictionary keys using -a default hash implementation based on identity which in practise is -implemented using the memory address. This is similar to how CPython behaves if -no user-defined hash function is present. The annotator keeps track of the +In the RPython type system, class instances can be used as dictionary keys using +a default hash implementation based on identity, which in practice is +implemented using the memory address. This is similar to CPython's behavior +when no user-defined hash function is present. The annotator keeps track of the classes for which this hashing is ever used. One of the peculiarities of PyPy's approach is that live objects are analyzed @@ -104,10 +104,8 @@ "pre-built constants" (PBCs for short). During rtyping, these instances must be converted to the low level model. One of the problems with doing this is that the standard hash implementation of Python is to take the id of an object, which -is just the memory address. If the RPython program explicitely stores the hashes -of a PBC somewhere (for example in the implementation of a data structure) then -the stored hash value would be extremely unlikely to match the value of the object's -address after translation. 
+is just the memory address. This is problematic for creating PBCs, because +the address of an object is not persistent after translation. To prevent this the following strategy is used: for every class whose instances are hashed somewhere in the program (either when storing them in a @@ -138,10 +136,10 @@ One example of the flexibility the RTyper provides is how we deal with lists. Based on information gathered by the annotator the RTyper chooses between two -different list implementations. If a list never changes its size after creation -a low-level array is used directly. For lists which might be resized a -representation consisting of a structure with a pointer to an array is used and -overallocation is performed. +different list implementations. If a list never changes its size after creation, +a low-level array is used directly. For lists which might be resized, a +representation consisting of a structure with a pointer to an array is used, +together with over-allocation. We plan to use similar techniques to use tagged pointers instead of using boxing to represent builtin types of the PyPy interpreter such as integers. This would @@ -161,12 +159,12 @@ LLVM the backend has to produce code that uses some sort of garbage collection. This approach has several advantages. It makes it possible to target different -platforms, with and without integrated garbage collection. Furthermore the +platforms, with and without integrated garbage collection. Furthermore, the interpreter implementation is not complicated by the need to do explicit memory management everywhere. Even more important the backend can optimize the memory handling to fit a certain situation (like a machine with very restricted memory) or completely replace the memory management technique or memory model -with a different one without having to change interpreter code. Additionally +with a different one without the need to change source code. 
Additionally, the backend can use information that was inferred by the rest of the toolchain to improve the quality of memory management. @@ -181,7 +179,7 @@ Since the C backend has a lot of information avaiable about the data structure being allocated it can choose the memory allocation function out of the Boehm API that fits best. For example, for objects that do not contain references to -other objects (e.g. strings) there is a special allocation function that +other objects (e.g. strings) there is a special allocation function which signals to the collector that it does not need to consider this memory when tracing pointers. @@ -206,6 +204,8 @@ The current placement of reference counter updates is far from optimal: The reference counts are updated much more often than theoretically necessary (e.g. sometimes a counter is increased and then immediately decreased again). +Objects passed into a function as arguments can almost always use a "trusted reference", +because the call-site is responsible to create a valid reference. Furthermore some more analysis could show that some objects don't need a reference counter at all because they either have a very short, foreseeable life-time or because they live exactly as long as another object. From tismer at codespeak.net Tue Dec 6 17:28:00 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Tue, 6 Dec 2005 17:28:00 +0100 (CET) Subject: [pypy-svn] r20792 - pypy/dist/pypy/doc Message-ID: <20051206162800.4D8CE27B47@code1.codespeak.net> Author: tismer Date: Tue Dec 6 17:27:59 2005 New Revision: 20792 Modified: pypy/dist/pypy/doc/translation-aspects.txt Log: made the text till the Stackless section. changed some wordings, removed some speling errors, added a few things, and manny commas (!) 
:-) Modified: pypy/dist/pypy/doc/translation-aspects.txt ============================================================================== --- pypy/dist/pypy/doc/translation-aspects.txt (original) +++ pypy/dist/pypy/doc/translation-aspects.txt Tue Dec 6 17:27:59 2005 @@ -231,7 +231,7 @@ the stack. To achieve this, the object is "exploded", that means that for every element of the structure a new variable is generated that is handed around in the graph. Reads from elements of the structure are removed and just replaced -by one of the variables, writes by assignements to same. +by one of the variables, writes by assignments to same. Since quite a lot of objects are allocated in small helper functions, this simple approach which does not track objects accross function boundaries only @@ -263,28 +263,29 @@ PyPy to C with them. This is because is not easy to find the root pointers that reside on the C stack -- both because the C stack layout is heavily platform dependent and because of the possibility of roots that are not -only on the stack but also in registers (which would give a problem for moving -garbage collectors). +only on the stack but also in registers (which would give a problem for *moving +garbage collectors*). There are several possible solutions for this problem: One -of them is to not use C compilers to generate machine code so that the stack -frame layout can be controlled by us. This is one of the tasks that need to be -tackled in phase 2 as generating assembly directly is needed anyway for a +of them is to not use C compilers to generate machine code, so that the stack +frame layout get into our control. This is one of the tasks that need to be +tackled in phase 2, as directly generating assembly is needed anyway for a just-in-time compiler. The other possibility (which would be much easier to -implement) is to move all the data away from the stack to the heap, as -described below in section "Stackless C code". 
+implement) is to move all the data away from the stack to the heap +before collecting garbage, as described in section "Stackless C code" below. Concurrency Model Implementations ============================================ -At the moment we have implemented two different concurrency models and the -option to not support concurrency (which proof the modularity of our approach): +At the moment we have implemented two different concurrency models, and the +option to not support concurrency at all +(another proof of the modularity of our approach): threading with a global interpreter lock and a "stackless" model. No threading ------------- -By default multi-threading is not supported at all, which gives some small +By default, multi-threading is not supported at all, which gives some small benefits for single-threaded applications since even in the single-threaded case there is some overhead if threading capabilities are built into the interpreter. @@ -292,19 +293,19 @@ Threading with a Global Interpreter Lock ------------------------------------------ -At the moment there is one non-trivial threading model implemented. It follows +Right now, there is one non-trivial threading model implemented. It follows the threading implementation of CPython and thus uses a global interpreter -lock. This lock prevents any two threads from interpreting python code at any -time. The global interpreter lock is released around calls to blocking I/O +lock. This lock prevents any two threads from interpreting python code at +the same time. The global interpreter lock is released around calls to blocking I/O functions. This approach has a number of advantages: it gives very little runtime penalty for single-threaded applications, makes many of the common uses -for threading possible and is relatively easy to implement and maintain. 
It has -the disadvantages that multiple threads cannot be distributed accross multiple +for threading possible, and it is relatively easy to implement and maintain. It has +the disadvantage that multiple threads cannot be distributed accross multiple proccessors. -To make this threading-model useable for I/O-bound applications the global +To make this threading-model usable for I/O-bound applications, the global intepreter lock should be released around blocking external function calls -(which is also what CPython does). This has not been fully implemented yet. +(which is also what CPython does). This has been partially implemented, yet. Stackless C code From tismer at codespeak.net Tue Dec 6 17:40:39 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Tue, 6 Dec 2005 17:40:39 +0100 (CET) Subject: [pypy-svn] r20793 - pypy/dist/pypy/doc Message-ID: <20051206164039.B176D27B47@code1.codespeak.net> Author: tismer Date: Tue Dec 6 17:40:38 2005 New Revision: 20793 Modified: pypy/dist/pypy/doc/translation-aspects.txt Log: added/changed the Stackless part, only slightly, I like it very much Just another pass before the final check-in Modified: pypy/dist/pypy/doc/translation-aspects.txt ============================================================================== --- pypy/dist/pypy/doc/translation-aspects.txt (original) +++ pypy/dist/pypy/doc/translation-aspects.txt Tue Dec 6 17:40:38 2005 @@ -260,10 +260,10 @@ framework: a simple copying collector, a mark-and-sweep collector and a deferred reference counting collector. These garbage collectors are work when run on top of the memory simulator, but at the moment it is not yet possible to translate -PyPy to C with them. This is because is not easy to +PyPy to C with them. 
This is because it is not easy to find the root pointers that reside on the C stack -- both because the C stack layout is -heavily platform dependent and because of the possibility of roots that are not -only on the stack but also in registers (which would give a problem for *moving +heavily platform dependent, and also due to the possibility of roots that are not +only on the stack but also hiding in registers (which would give a problem for *moving garbage collectors*). There are several possible solutions for this problem: One @@ -311,14 +311,14 @@ Stackless C code ----------------- -"Stackless" C code is C code that only uses a bounded amount of the -space in the C stack, and that can more generally obtain explicit -control of its own stack. This is generally known as "continuations", +"Stackless" C code is C code that only uses a bounded amount of +space in the C stack, and that can generally obtain explicit +control of its own stack. This is commonly known as "continuations", or "continuation-passing style" code, although in our case we will limit ourselves to single-shot continuations, i.e. continuations that are -captured and subsequently only resumed once. +captured and subsequently will be resumed exactly once. -The technique we have implemented is based on an old but recurring idea +The technique we have implemented is based on an ancient but recurring idea of emulating this style via exceptions: a specific program point can generate a pseudo-exception whose purpose is to unwind the whole C stack in a restartable way. More precisely, the "unwind" exception causes @@ -327,11 +327,11 @@ innermost (most recent) frame of the saved stack -- allowing unlimited recursion on OSes that limit the size of the C stack -- or to resume a different previously-saved C stack altogether, thus implementing -coroutines. +coroutines or light-weight threads. 
In our case, exception handling is always explicit in the generated code: -the C backend always -puts after each call site a cheap check to detect if the callee exited +the C backend puts a cheap check +after each call site to detect if the callee exited normally or generated an exception. So when compiling functions in stackless mode, the generated exception handling code special-cases the new "unwind" exception. This exception causes the current function to @@ -343,7 +343,9 @@ At this point, the whole C stack is stored away in the heap. This is a very interesting state in itself, because precisely there is no C stack -left. It is this which will allow us to write in a portable way all the algorithms that +below the dispatcher +left. It is this which will allow us to write all the algorithms +in a portable way, that normally require machine-specific code to inspect the stack, in particular garbage collectors. @@ -356,15 +358,16 @@ invokes the previous (parent) saved frame, and so on. We insert stack checks before calls that can lead to recursion by detecting cycles in the call graph. These stack checks copy the stack to the heap (by raising the special -exception) if it is deeper than a certain level. In this model, the C stack -can be considered as a cache for the heap-based saved frames. When we run out +exception) if it is about to grow deeper than a certain level. +As a different point of view, the C stack can also be considered as a cache +for the heap-based saved frames in this model. When we run out of C stack space, we flush the cache. When the cache is empty, we fill it with the next item from the heap. To give the translated program some amount of control over the heap-based stack structures and over the top-level dispatcher that jumps between them, there are a few "external" functions directly implemented -in C. These functions provide an elementary interface on top of which +in C. 
These functions provide an elementary interface, on top of which useful abstractions can be implemented, like: * coroutines: explicitly switching code, similar to Greenlets [GREENLET]_. @@ -372,7 +375,7 @@ * "tasklets": cooperatively-scheduled microthreads, as introduced in Stackless Python [STK]_. -* implicitly-scheduled microthreads, also known as green threads. +* implicitly-scheduled (preemptive) microthreads, also known as green threads. An important property of the changes in all the generated C functions is to be written in a way that does not significantly degrade their performance in From tismer at codespeak.net Tue Dec 6 17:46:59 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Tue, 6 Dec 2005 17:46:59 +0100 (CET) Subject: [pypy-svn] r20795 - pypy/dist/pypy/doc Message-ID: <20051206164659.A92AD27B47@code1.codespeak.net> Author: tismer Date: Tue Dec 6 17:46:58 2005 New Revision: 20795 Modified: pypy/dist/pypy/doc/translation-aspects.txt Log: final check-in. Now you can trash my nice commas if you like :-) Modified: pypy/dist/pypy/doc/translation-aspects.txt ============================================================================== --- pypy/dist/pypy/doc/translation-aspects.txt (original) +++ pypy/dist/pypy/doc/translation-aspects.txt Tue Dec 6 17:46:58 2005 @@ -378,7 +378,7 @@ * implicitly-scheduled (preemptive) microthreads, also known as green threads. An important property of the changes in all the generated C functions is -to be written in a way that does not significantly degrade their performance in +that they are written in a way that does only minimally degrade their performance in the non-exceptional case. Most optimisations performed by C compilers, like register allocation, continue to work... @@ -412,18 +412,19 @@ keep track of the GC roots which is hard to do with portable C code. One solution would be to use the "stackless" code since it can move the stack completely to the heap. 
We expect that we can implement GC read and write -barriers as function calls and rely on inlining to make them less inefficient. +barriers as function calls and rely on inlining to make them more efficient. We may also spend some time on improving the existing reference counting -implementation by removing unnecessary incref-decref pairs. A bigger task would +implementation by removing unnecessary incref-decref pairs and identifying +trustworthy references. A bigger task would be to add support for detecing circular references. Threading model --------------- -One of the interesting possibities that stackless offers is to implement green -threading. This would involve writing a scheduler and preemption logic. +One of the interesting possibities that stackless offers is to implement *green +threading*. This would involve writing a scheduler and some preemption logic. We should also investigate other threading models based on operating system threads with various granularities of locking for access of shared objects. @@ -432,7 +433,7 @@ ------------ We also might want to experiment with more sophisticated structure inlining. -Sometimes it is possible to find out that one structure object that is +Sometimes it is possible to find out that one structure object allocated on the heap lives exactly as long as another structure object on the heap pointing to it. If this is the case it is possible to inline the first object into the second. This saves the space of one pointer and avoids @@ -442,7 +443,9 @@ Conclusion =========== -As concretely shown with various detailed examples our approach gives us flexibility and lets us chooses various aspects at translation time instead of encoding them into the implementation itself. +As concretely shown with various detailed examples, our approach gives us +flexibility and lets us choose various aspects at translation time instead +of encoding them into the implementation itself. 
References =========== From rxe at codespeak.net Tue Dec 6 17:52:08 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Tue, 6 Dec 2005 17:52:08 +0100 (CET) Subject: [pypy-svn] r20796 - in pypy/dist/pypy/translator/llvm: . module Message-ID: <20051206165208.8521027B58@code1.codespeak.net> Author: rxe Date: Tue Dec 6 17:52:07 2005 New Revision: 20796 Modified: pypy/dist/pypy/translator/llvm/externs2ll.py pypy/dist/pypy/translator/llvm/module/genexterns.c Log: Use distutils to decide where default python include files live. Modified: pypy/dist/pypy/translator/llvm/externs2ll.py ============================================================================== --- pypy/dist/pypy/translator/llvm/externs2ll.py (original) +++ pypy/dist/pypy/translator/llvm/externs2ll.py Tue Dec 6 17:52:07 2005 @@ -42,10 +42,11 @@ f.close() plain = filename[:-2] - cmd = "llvm-gcc -I%s -I%s -S %s.c -o %s.ll 2>&1" % (get_llvm_cpath(), - get_c_cpath(), - plain, - plain) + cmd = "llvm-gcc -I%s -I%s -I%s -S %s.c -o %s.ll 2>&1" % (get_llvm_cpath(), + get_c_cpath(), + get_python_inc(), + plain, + plain) os.system(cmd) llcode = open(plain + '.ll').read() @@ -132,6 +133,10 @@ return decls +def get_python_inc(): + import distutils.sysconfig + return distutils.sysconfig.get_python_inc() + def generate_llfile(db, extern_decls, entrynode, standalone): ccode = [] function_names = [] @@ -165,17 +170,7 @@ assert False, "unhandled extern_decls %s %s %s" % (c_name, type(obj), obj) # start building our source - src = open(get_genexterns_path()).read() - - # XXX MESS: set python version to include - if sys.platform == 'darwin': - python_h = '"/System/Library/Frameworks/Python.framework/Versions/2.3/include/python2.3/Python.h"' - else: - python_h = '' - - src = src.replace('__PYTHON_H__', python_h) - ccode = "".join(ccode) - ccode += src + ccode += open(get_genexterns_path()).read() return get_ll(ccode, function_names) Modified: pypy/dist/pypy/translator/llvm/module/genexterns.c 
============================================================================== --- pypy/dist/pypy/translator/llvm/module/genexterns.c (original) +++ pypy/dist/pypy/translator/llvm/module/genexterns.c Tue Dec 6 17:52:07 2005 @@ -16,12 +16,8 @@ RPyListOfString *_RPyListOfString_New(int); void _RPyListOfString_SetItem(RPyListOfString *, int, RPyString *); -#include -#include -#include - -// the placeholder in the next line gets replaced by the actual python.h path -#include __PYTHON_H__ +// include this to get constants and macros for below includes +#include // overflows/zeros/values raising operations #include "raisingop.h" From cfbolz at codespeak.net Tue Dec 6 18:20:04 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Tue, 6 Dec 2005 18:20:04 +0100 (CET) Subject: [pypy-svn] r20800 - pypy/dist/pypy/doc Message-ID: <20051206172004.9778827B47@code1.codespeak.net> Author: cfbolz Date: Tue Dec 6 18:20:03 2005 New Revision: 20800 Modified: pypy/dist/pypy/doc/translation-aspects.txt Log: fixed some of the fixes. this document is getting better, slowly Modified: pypy/dist/pypy/doc/translation-aspects.txt ============================================================================== --- pypy/dist/pypy/doc/translation-aspects.txt (original) +++ pypy/dist/pypy/doc/translation-aspects.txt Tue Dec 6 18:20:03 2005 @@ -104,8 +104,11 @@ "pre-built constants" (PBCs for short). During rtyping, these instances must be converted to the low level model. One of the problems with doing this is that the standard hash implementation of Python is to take the id of an object, which -is just the memory address. This is problematic for creating PBCs, because -the address of an object is not persistent after translation. + +is just the memory address. If the RPython program explicitly captures the +hash of a PBC by storing it (for example in the implementation of a data +structure) then the stored hash value will not match the value of the object's +address after translation. 
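The id-based hashing problem described above can be demonstrated with ordinary Python (the class name is hypothetical; the point is only that the default hash encodes the object's identity, i.e. its address, which a translated program cannot reproduce):

```python
class PreBuilt(object):
    pass  # no __eq__/__hash__: Python falls back to identity-based hashing

a, b = PreBuilt(), PreBuilt()

# Distinct instances hash differently even with identical contents, because
# the default hash is derived from the object's id(), i.e. its address.
assert hash(a) != hash(b)
assert hash(a) == object.__hash__(a)

# Consequently a stored hash value is only meaningful within one run of one
# process -- exactly what breaks once the program is translated.
stored_hash = hash(a)
```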
To prevent this the following strategy is used: for every class whose instances are hashed somewhere in the program (either when storing them in a @@ -268,7 +271,7 @@ There are several possible solutions for this problem: One of them is to not use C compilers to generate machine code, so that the stack -frame layout get into our control. This is one of the tasks that need to be +frame layout gets into our control. This is one of the tasks that need to be tackled in phase 2, as directly generating assembly is needed anyway for a just-in-time compiler. The other possibility (which would be much easier to implement) is to move all the data away from the stack to the heap @@ -305,7 +308,7 @@ To make this threading-model usable for I/O-bound applications, the global interpreter lock should be released around blocking external function calls -(which is also what CPython does). This has been partially implemented, yet. +(which is also what CPython does). This has been partially implemented. Stackless C code @@ -318,7 +321,7 @@ ourselves to single-shot continuations, i.e. continuations that are captured and subsequently will be resumed exactly once. -The technique we have implemented is based on an ancient but recurring idea +The technique we have implemented is based on the recurring idea of emulating this style via exceptions: a specific program point can generate a pseudo-exception whose purpose is to unwind the whole C stack in a restartable way. More precisely, the "unwind" exception causes From ericvrp at codespeak.net Tue Dec 6 18:28:47 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Tue, 6 Dec 2005 18:28:47 +0100 (CET) Subject: [pypy-svn] r20801 - in pypy/dist/pypy/translator/llvm: . 
module Message-ID: <20051206172847.DF50427B47@code1.codespeak.net> Author: ericvrp Date: Tue Dec 6 18:28:45 2005 New Revision: 20801 Modified: pypy/dist/pypy/translator/llvm/build_llvm_module.py pypy/dist/pypy/translator/llvm/externs2ll.py pypy/dist/pypy/translator/llvm/gc.py pypy/dist/pypy/translator/llvm/module/genexterns.c Log: Removed some hardcoded library and include paths Modified: pypy/dist/pypy/translator/llvm/build_llvm_module.py ============================================================================== --- pypy/dist/pypy/translator/llvm/build_llvm_module.py (original) +++ pypy/dist/pypy/translator/llvm/build_llvm_module.py Tue Dec 6 18:28:45 2005 @@ -87,7 +87,9 @@ cleanup = False if sys.platform == 'darwin': - gc_libs_path = '-L/sw/lib -ldl' + import distutils.sysconfig + libdir = distutils.sysconfig.EXEC_PREFIX + "/lib" + gc_libs_path = '-L%s -ldl' % libdir else: gc_libs_path = '-static' Modified: pypy/dist/pypy/translator/llvm/externs2ll.py ============================================================================== --- pypy/dist/pypy/translator/llvm/externs2ll.py (original) +++ pypy/dist/pypy/translator/llvm/externs2ll.py Tue Dec 6 18:28:45 2005 @@ -42,7 +42,8 @@ f.close() plain = filename[:-2] - cmd = "llvm-gcc -I%s -I%s -I%s -S %s.c -o %s.ll 2>&1" % (get_llvm_cpath(), + cmd = "llvm-gcc -I%s -I%s -I%s -I%s -S %s.c -o %s.ll 2>&1" % (get_llvm_cpath(), + get_incdir(), get_c_cpath(), get_python_inc(), plain, @@ -137,6 +138,10 @@ import distutils.sysconfig return distutils.sysconfig.get_python_inc() +def get_incdir(): + import distutils.sysconfig + return distutils.sysconfig.EXEC_PREFIX + "/include" + def generate_llfile(db, extern_decls, entrynode, standalone): ccode = [] function_names = [] Modified: pypy/dist/pypy/translator/llvm/gc.py ============================================================================== --- pypy/dist/pypy/translator/llvm/gc.py (original) +++ pypy/dist/pypy/translator/llvm/gc.py Tue Dec 6 18:28:45 2005 @@ -20,10 +20,11 
@@ def new(gcpolicy=None): #factory gcpolicy = gcpolicy or 'boehm' - + + import distutils.sysconfig from os.path import exists - boehm_on_path = exists('/usr/lib/libgc.so') or exists('/usr/lib/libgc.a') or \ - exists('/sw/lib/libgc.so') or exists('/sw/lib/libgc.a') + libdir = distutils.sysconfig.EXEC_PREFIX + "/lib" + boehm_on_path = exists(libdir + '/libgc.so') or exists(libdir + '/libgc.a') if gcpolicy == 'boehm' and not boehm_on_path: log.gc.WARNING('warning: Boehm GC libary not found in /usr/lib, falling back on no gc') gcpolicy = 'none' Modified: pypy/dist/pypy/translator/llvm/module/genexterns.c ============================================================================== --- pypy/dist/pypy/translator/llvm/module/genexterns.c (original) +++ pypy/dist/pypy/translator/llvm/module/genexterns.c Tue Dec 6 18:28:45 2005 @@ -60,7 +60,7 @@ memcpy((void *) ptr2, (void *) ptr1, size); } -#include +#include #define USING_BOEHM_GC char *LLVM_RPython_StartupCode(); From rxe at codespeak.net Tue Dec 6 18:58:07 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Tue, 6 Dec 2005 18:58:07 +0100 (CET) Subject: [pypy-svn] r20805 - pypy/dist/pypy/translator/llvm Message-ID: <20051206175807.283DB27B47@code1.codespeak.net> Author: rxe Date: Tue Dec 6 18:58:06 2005 New Revision: 20805 Modified: pypy/dist/pypy/translator/llvm/externs2ll.py Log: Small updates to include paths... not all tested yet. 
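The lookup that these patches perform through `distutils.sysconfig` can be tried in isolation; on current Pythons the `sysconfig` module exposes the same information, so this sketch uses it as a stand-in (the variable names are illustrative, not taken from the patch):

```python
import os
import sysconfig

# Directory holding CPython's own headers (Python.h); the patches obtain
# this via distutils.sysconfig.get_python_inc().
python_inc = sysconfig.get_paths()["include"]

# EXEC_PREFIX-relative directories, used for locating libgc and extra
# headers instead of hardcoded paths like /usr/lib or /sw/lib.
exec_prefix = sysconfig.get_config_var("exec_prefix")
libdir = os.path.join(exec_prefix, "lib")
incdir = os.path.join(exec_prefix, "include")

# Assemble -I flags the way externs2ll.py builds its llvm-gcc command line.
include_flags = " ".join("-I%s" % d for d in (python_inc, incdir))
```

Deriving the directories from the interpreter's own configuration is what lets the same build code work on Linux, OSX and elsewhere without the per-platform special cases removed by these commits.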
Modified: pypy/dist/pypy/translator/llvm/externs2ll.py ============================================================================== --- pypy/dist/pypy/translator/llvm/externs2ll.py (original) +++ pypy/dist/pypy/translator/llvm/externs2ll.py Tue Dec 6 18:58:06 2005 @@ -24,16 +24,9 @@ "%LLVM_RPython_StartupCode", ] -def get_c_cpath(): - from pypy.translator import translator - return os.path.dirname(translator.__file__) - def get_genexterns_path(): return os.path.join(get_llvm_cpath(), "genexterns.c") -def get_llvm_cpath(): - return os.path.join(os.path.dirname(__file__), "module") - def get_ll(ccode, function_names): function_names += support_functions filename = str(udir.join("ccode.c")) @@ -42,12 +35,10 @@ f.close() plain = filename[:-2] - cmd = "llvm-gcc -I%s -I%s -I%s -I%s -S %s.c -o %s.ll 2>&1" % (get_llvm_cpath(), - get_incdir(), - get_c_cpath(), - get_python_inc(), - plain, - plain) + includes = get_incdirs() + cmd = "llvm-gcc %s -S %s.c -o %s.ll 2>&1" % (includes, + plain, + plain) os.system(cmd) llcode = open(plain + '.ll').read() @@ -134,13 +125,26 @@ return decls -def get_python_inc(): - import distutils.sysconfig - return distutils.sysconfig.get_python_inc() +def get_c_cpath(): + from pypy.translator import translator + return os.path.dirname(translator.__file__) + +def get_llvm_cpath(): + return os.path.join(os.path.dirname(__file__), "module") + +def get_incdirs(): -def get_incdir(): import distutils.sysconfig - return distutils.sysconfig.EXEC_PREFIX + "/include" + includes = (distutils.sysconfig.EXEC_PREFIX + "/include", + distutils.sysconfig.EXEC_PREFIX + "/include/gc", + distutils.sysconfig.get_python_inc(), + get_c_cpath(), + get_llvm_cpath()) + + includestr = "" + for ii in includes: + includestr += "-I %s " % ii + return includestr def generate_llfile(db, extern_decls, entrynode, standalone): ccode = [] From ericvrp at codespeak.net Tue Dec 6 19:04:35 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Tue, 6 Dec 2005 19:04:35 
+0100 (CET) Subject: [pypy-svn] r20806 - pypy/dist/pypy/translator/llvm Message-ID: <20051206180435.D4CEE27B47@code1.codespeak.net> Author: ericvrp Date: Tue Dec 6 19:04:34 2005 New Revision: 20806 Modified: pypy/dist/pypy/translator/llvm/build_llvm_module.py Log: Fix so OSX compilers find libgc, OSX unittest pass now Modified: pypy/dist/pypy/translator/llvm/build_llvm_module.py ============================================================================== --- pypy/dist/pypy/translator/llvm/build_llvm_module.py (original) +++ pypy/dist/pypy/translator/llvm/build_llvm_module.py Tue Dec 6 19:04:34 2005 @@ -65,7 +65,8 @@ source_files = [ "%s.c" % modname ] else: source_files = [] - object_files = [] + from distutils.sysconfig import EXEC_PREFIX + object_files = ["-L%s/lib" % EXEC_PREFIX] library_files = genllvm.gcpolicy.gc_libraries() gc_libs = ' '.join(['-l' + lib for lib in library_files]) From cfbolz at codespeak.net Tue Dec 6 19:07:30 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Tue, 6 Dec 2005 19:07:30 +0100 (CET) Subject: [pypy-svn] r20807 - pypy/extradoc/sprintinfo/gothenburg-2005 Message-ID: <20051206180730.EB9CF27B5C@code1.codespeak.net> Author: cfbolz Date: Tue Dec 6 19:07:30 2005 New Revision: 20807 Added: pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt - copied, changed from r20800, pypy/extradoc/sprintinfo/gothenburg-2005/people.txt Log: (cfbolz, tismer, mwh, ludal, arigo, pedronis) listed possible tasks for the coding sprint starting tomorrow. From adim at codespeak.net Tue Dec 6 19:12:36 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Tue, 6 Dec 2005 19:12:36 +0100 (CET) Subject: [pypy-svn] r20808 - in pypy/dist/pypy/interpreter/pyparser: . 
test Message-ID: <20051206181236.9FFF627B58@code1.codespeak.net> Author: adim Date: Tue Dec 6 19:12:33 2005 New Revision: 20808 Modified: pypy/dist/pypy/interpreter/pyparser/ebnfparse.py pypy/dist/pypy/interpreter/pyparser/grammar.py pypy/dist/pypy/interpreter/pyparser/test/test_lookahead.py Log: fix issue167 (KleenStar becomes KleeneStar) Modified: pypy/dist/pypy/interpreter/pyparser/ebnfparse.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/ebnfparse.py (original) +++ pypy/dist/pypy/interpreter/pyparser/ebnfparse.py Tue Dec 6 19:12:33 2005 @@ -1,6 +1,6 @@ #!/usr/bin/env python from grammar import BaseGrammarBuilder, Alternative, Sequence, Token, \ - KleenStar, GrammarElement, build_first_sets, EmptyToken + KleeneStar, GrammarElement, build_first_sets, EmptyToken from ebnflexer import GrammarSource from syntaxtree import AbstractSyntaxVisitor import pytoken @@ -182,7 +182,7 @@ def handle_option( self, node ): rule = node.nodes[1].visit(self) - return self.new_item( KleenStar( self.new_symbol(), 0, 1, rule ) ) + return self.new_item( KleeneStar( self.new_symbol(), 0, 1, rule ) ) def handle_group( self, node ): rule = node.nodes[1].visit(self) @@ -214,10 +214,10 @@ rule_name = self.new_symbol() tok = star_opt.nodes[0].nodes[0] if tok.value == '+': - item = KleenStar(rule_name, _min=1, rule=myrule) + item = KleeneStar(rule_name, _min=1, rule=myrule) return self.new_item(item) elif tok.value == '*': - item = KleenStar(rule_name, _min=0, rule=myrule) + item = KleeneStar(rule_name, _min=0, rule=myrule) return self.new_item(item) else: raise SyntaxError("Got symbol star_opt with value='%s'" @@ -260,7 +260,7 @@ S = g_add_symbol # star: '*' | '+' star = Alternative( S("star"), [Token(S('*')), Token(S('+'))] ) - star_opt = KleenStar ( S("star_opt"), 0, 1, rule=star ) + star_opt = KleeneStar ( S("star_opt"), 0, 1, rule=star ) # rule: SYMBOL ':' alternative symbol = Sequence( S("symbol"), 
[Token(S('SYMBOL')), star_opt] ) @@ -269,12 +269,12 @@ rule = Sequence( S("rule"), [symboldef, alternative] ) # grammar: rule+ - grammar = KleenStar( S("grammar"), _min=1, rule=rule ) + grammar = KleeneStar( S("grammar"), _min=1, rule=rule ) # alternative: sequence ( '|' sequence )* - sequence = KleenStar( S("sequence"), 1 ) + sequence = KleeneStar( S("sequence"), 1 ) seq_cont_list = Sequence( S("seq_cont_list"), [Token(S('|')), sequence] ) - sequence_cont = KleenStar( S("sequence_cont"),0, rule=seq_cont_list ) + sequence_cont = KleeneStar( S("sequence_cont"),0, rule=seq_cont_list ) alternative.args = [ sequence, sequence_cont ] Modified: pypy/dist/pypy/interpreter/pyparser/grammar.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/grammar.py (original) +++ pypy/dist/pypy/interpreter/pyparser/grammar.py Tue Dec 6 19:12:33 2005 @@ -4,7 +4,7 @@ the objects of the grammar are : Alternative : as in S -> A | B | C Sequence : as in S -> A B C -KleenStar : as in S -> A* or S -> A+ +KleeneStar : as in S -> A* or S -> A+ Token : a lexer token """ @@ -122,7 +122,7 @@ # # we use the term root for a grammar rule to specify rules that are given a name # by the grammar -# a rule like S -> A B* is mapped as Sequence( SCODE, KleenStar(-3, B)) +# a rule like S -> A B* is mapped as Sequence( SCODE, KleeneStar(-3, B)) # so S is a root and the subrule describing B* is not. 
# SCODE is the numerical value for rule "S" @@ -190,9 +190,9 @@ return True -###################################################################### -# Grammar Elements Classes (Alternative, Sequence, KleenStar, Token) # -###################################################################### +####################################################################### +# Grammar Elements Classes (Alternative, Sequence, KleeneStar, Token) # +####################################################################### class GrammarElement(object): """Base parser class""" @@ -515,14 +515,14 @@ return True -class KleenStar(GrammarElement): - """Represents a KleenStar in a grammar rule as in (S -> A+) or (S -> A*)""" +class KleeneStar(GrammarElement): + """Represents a KleeneStar in a grammar rule as in (S -> A+) or (S -> A*)""" def __init__(self, name, _min = 0, _max = -1, rule=None): GrammarElement.__init__( self, name ) self.args = [rule] self.min = _min if _max == 0: - raise ValueError("KleenStar needs max==-1 or max>1") + raise ValueError("KleeneStar needs max==-1 or max>1") self.max = _max self.star = "x" if self.min == 0: Modified: pypy/dist/pypy/interpreter/pyparser/test/test_lookahead.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_lookahead.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_lookahead.py Tue Dec 6 19:12:33 2005 @@ -1,4 +1,4 @@ -from pypy.interpreter.pyparser.grammar import Alternative, Sequence, KleenStar, \ +from pypy.interpreter.pyparser.grammar import Alternative, Sequence, KleeneStar, \ Token, EmptyToken, build_first_sets class TestLookAheadBasics: @@ -32,10 +32,10 @@ def test_basic_kleenstar(self): tok1, tok2, tok3 = self.tokens - kstar = KleenStar(self.nextid(), 1, 3, tok1) + kstar = KleeneStar(self.nextid(), 1, 3, tok1) build_first_sets([kstar]) assert kstar.first_set == [tok1] - kstar = KleenStar(self.nextid(), 0, 3, tok1) + kstar = 
KleeneStar(self.nextid(), 0, 3, tok1) build_first_sets([kstar]) assert kstar.first_set == [tok1, EmptyToken] @@ -45,8 +45,8 @@ ==> S.first_set = [tok1, tok2, EmptyToken] """ tok1, tok2, tok3 = self.tokens - k1 = KleenStar(self.nextid(), 0, 2, tok1) - k2 = KleenStar(self.nextid(), 0, 2, tok2) + k1 = KleeneStar(self.nextid(), 0, 2, tok1) + k2 = KleeneStar(self.nextid(), 0, 2, tok2) seq = Sequence(self.nextid(), [k1, k2]) build_first_sets([k1, k2, seq]) assert seq.first_set == [tok1, tok2, EmptyToken] @@ -57,8 +57,8 @@ ==> S.first_set = [tok1, tok2] """ tok1, tok2, tok3 = self.tokens - k1 = KleenStar(self.nextid(), 0, 2, tok1) - k2 = KleenStar(self.nextid(), 1, 2, tok2) + k1 = KleeneStar(self.nextid(), 0, 2, tok1) + k2 = KleeneStar(self.nextid(), 1, 2, tok2) seq = Sequence(self.nextid(), [k1, k2]) build_first_sets([k1, k2, seq]) assert seq.first_set == [tok1, tok2] @@ -83,8 +83,8 @@ self.LOW = Token(LOW, 'low') self.CAP = Token(CAP ,'cap') self.A = Alternative(R_A, []) - k1 = KleenStar(R_k1, 0, rule=self.LOW) - k2 = KleenStar(R_k2, 0, rule=self.CAP) + k1 = KleeneStar(R_k1, 0, rule=self.LOW) + k2 = KleeneStar(R_k2, 0, rule=self.CAP) self.B = Sequence(R_B, [k1, self.A]) self.C = Sequence(R_C, [k2, self.A]) self.A.args = [self.B, self.C] From mwh at codespeak.net Wed Dec 7 11:12:16 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Wed, 7 Dec 2005 11:12:16 +0100 (CET) Subject: [pypy-svn] r20823 - in pypy/dist/pypy: annotation objspace/flow tool Message-ID: <20051207101216.9ADFA27B47@code1.codespeak.net> Author: mwh Date: Wed Dec 7 11:12:14 2005 New Revision: 20823 Modified: pypy/dist/pypy/annotation/description.py pypy/dist/pypy/objspace/flow/model.py pypy/dist/pypy/tool/sourcetools.py Log: supply a default alt_name for specialized functions if none is supplied in the spirit of what happened before the somepbc-refactoring branch landed, so you get names like wrap__str in the resulting graph. 
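The default-name scheme this log message describes can be exercised on its own; the following mirrors the `nameof` helper from the diff below as a standalone sketch:

```python
def nameof(thing):
    """Derive a short printable name for a specialization key."""
    if isinstance(thing, str):
        return thing
    elif hasattr(thing, '__name__'):   # mostly types and functions
        return thing.__name__
    elif hasattr(thing, 'name'):       # mostly ClassDescs
        return thing.name
    elif isinstance(thing, tuple):
        return '_'.join(map(nameof, thing))
    else:
        return str(thing)[:30]

# A function 'wrap' specialized on the str type gets this graph name:
alt_name = "%s__%s" % ('wrap', nameof(str))
```

Here `alt_name` comes out as `wrap__str`, matching the example given in the log message.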
Modified: pypy/dist/pypy/annotation/description.py ============================================================================== --- pypy/dist/pypy/annotation/description.py (original) +++ pypy/dist/pypy/annotation/description.py Wed Dec 7 11:12:14 2005 @@ -168,6 +168,21 @@ try: return self._cache[key] except KeyError: + def nameof(thing): + if isinstance(thing, str): + return thing + elif hasattr(thing, '__name__'): # mostly types and functions + return thing.__name__ + elif hasattr(thing, 'name'): # mostly ClassDescs + return thing.name + elif isinstance(thing, tuple): + return '_'.join(map(nameof, thing)) + else: + return str(thing)[:30] + + if key is not None and alt_name is None: + postfix = nameof(key) + alt_name = "%s__%s"%(self.name, postfix) graph = self.buildgraph(alt_name, builder) self._cache[key] = graph return graph Modified: pypy/dist/pypy/objspace/flow/model.py ============================================================================== --- pypy/dist/pypy/objspace/flow/model.py (original) +++ pypy/dist/pypy/objspace/flow/model.py Wed Dec 7 11:12:14 2005 @@ -68,7 +68,7 @@ def __repr__(self): if hasattr(self, 'func'): - fnrepr = nice_repr_for_func(self.func) + fnrepr = nice_repr_for_func(self.func, self.name) else: fnrepr = self.name return '' % (fnrepr, uid(self)) Modified: pypy/dist/pypy/tool/sourcetools.py ============================================================================== --- pypy/dist/pypy/tool/sourcetools.py (original) +++ pypy/dist/pypy/tool/sourcetools.py Wed Dec 7 11:12:14 2005 @@ -249,11 +249,12 @@ func = getattr(func, 'func_code', func) return (func.co_flags & CO_VARKEYWORDS) != 0 -def nice_repr_for_func(fn): +def nice_repr_for_func(fn, name=None): mod = getattr(fn, '__module__', None) if mod is None: mod = '?' 
- name = getattr(fn, '__name__', None) + if name is None: + name = getattr(fn, '__name__', None) if name is not None: firstlineno = fn.func_code.co_firstlineno else: From arigo at codespeak.net Wed Dec 7 11:25:48 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Wed, 7 Dec 2005 11:25:48 +0100 (CET) Subject: [pypy-svn] r20824 - pypy/dist/pypy/translator/llvm/module Message-ID: <20051207102548.9365427B47@code1.codespeak.net> Author: arigo Date: Wed Dec 7 11:25:47 2005 New Revision: 20824 Modified: pypy/dist/pypy/translator/llvm/module/support.py Log: fix llvm for mwh's previous check-in. Modified: pypy/dist/pypy/translator/llvm/module/support.py ============================================================================== --- pypy/dist/pypy/translator/llvm/module/support.py (original) +++ pypy/dist/pypy/translator/llvm/module/support.py Wed Dec 7 11:25:47 2005 @@ -36,7 +36,7 @@ internal fastcc %RPyString* %RPyString_FromString(sbyte* %s) { %lenu = call ccc uint %strlen(sbyte* %s) %len = cast uint %lenu to int - %rpy = call fastcc %RPyString* %pypy_RPyString_New(int %len) + %rpy = call fastcc %RPyString* %pypy_RPyString_New__Signed(int %len) %rpystrptr = getelementptr %RPyString* %rpy, int 0, uint 1, uint 1 %rpystr = cast [0 x sbyte]* %rpystrptr to sbyte* From cfbolz at codespeak.net Wed Dec 7 11:26:34 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Wed, 7 Dec 2005 11:26:34 +0100 (CET) Subject: [pypy-svn] r20825 - pypy/extradoc/sprintinfo/gothenburg-2005 Message-ID: <20051207102634.8076D27B47@code1.codespeak.net> Author: cfbolz Date: Wed Dec 7 11:26:34 2005 New Revision: 20825 Modified: pypy/extradoc/sprintinfo/gothenburg-2005/people.txt pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt Log: (all present) * fix dates * pairing up for today Modified: pypy/extradoc/sprintinfo/gothenburg-2005/people.txt ============================================================================== --- 
pypy/extradoc/sprintinfo/gothenburg-2005/people.txt (original) +++ pypy/extradoc/sprintinfo/gothenburg-2005/people.txt Wed Dec 7 11:26:34 2005 @@ -11,19 +11,20 @@ ==================== ============== ===================== Ludovic Aubry 5/12-10/12 some hotel Adrien Di Mascio 5/12-10/12 some hotel -Jacob Hallen ? private -Laura Creighton ? private -Beatrice Duering 5/12-11/12 private -Armin Rigo ? private -Samuele Pedroni ? private -Anders Chrigstroem ? private +Jacob Hallen private +Laura Creighton private +Beatrice Duering 5/12-11/12 private +Armin Rigo 5/12-11/12 private +Samuele Pedroni 5/12-11/12 private +Anders Chrigstroem 5/12-11/12 private Eric van Riet Paap 6/12-11/12 private -Michael Hudson ? ? -Carl Friedrich Bolz ? ? +Michael Hudson 5/12-11/12 SGS +Carl Friedrich Bolz 5/12-11/12 SGS Anders Lehmann 4/12-11/12 SGS Christian Tismer 4/12-11/12 some Hotel ? Niklaus Haldimann 6/12-11/12 private -Richard Emslie 5/12-? private +Richard Emslie 5/12-11/12 private +Johan Hahn 7/12- ==================== ============== ===================== People on the following list were present at previous sprints: Modified: pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt ============================================================================== --- pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt (original) +++ pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt Wed Dec 7 11:26:34 2005 @@ -19,11 +19,15 @@ Richard Emslie 5/12-11/12 Johan Hahn 7/12- +Thursday afternoon: half-breakday (starting at 15.00) + Possible sprint tasks ========================= -JIT work -~~~~~~~~~ +JIT work +~~~~~~~~~~~~~~~~~ +(Armin, Carl Friedrich, Samuele, Arre, Eric) + see doc/discussion/draft-jit-ideas.txt - toy target intepreter @@ -32,8 +36,10 @@ Stackless ~~~~~~~~~~ +(Richard, Christian) + +Expose the low-level switching facilities: -- expose the low-level switching facilities - write RPython structures (tasklet, channel) and basic functions for switching - add an app-level 
interface (mixed module) @@ -51,10 +57,13 @@ _socket, C gluing for extensions ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +(Nik, Anders L) - work on _socket - this exposes limitations in our way to glue to C libraries, think/design solutions + +(Johan, Michael) - support more basic integer types. Decide on the proper design (explicit spelling of sizes, or the long-long way?) note that we already have functions which return 64 bit values. @@ -77,7 +86,8 @@ - ... Logic programming, WP9 -~~~~~~~~~~~~~~~~~~~~~~~ +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +(Ludovic, Adrien) - export the AST nodes hierarchy to application level through the compiler module From cfbolz at codespeak.net Wed Dec 7 12:02:17 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Wed, 7 Dec 2005 12:02:17 +0100 (CET) Subject: [pypy-svn] r20826 - pypy/dist/pypy/jit Message-ID: <20051207110217.BF2F227B45@code1.codespeak.net> Author: cfbolz Date: Wed Dec 7 12:02:16 2005 New Revision: 20826 Added: pypy/dist/pypy/jit/ Log: new directory for jit stuff From ericvrp at codespeak.net Wed Dec 7 12:04:34 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Wed, 7 Dec 2005 12:04:34 +0100 (CET) Subject: [pypy-svn] r20827 - pypy/dist/pypy/jit/test Message-ID: <20051207110434.6619227B47@code1.codespeak.net> Author: ericvrp Date: Wed Dec 7 12:04:33 2005 New Revision: 20827 Added: pypy/dist/pypy/jit/test/ Log: testdir From ericvrp at codespeak.net Wed Dec 7 13:04:15 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Wed, 7 Dec 2005 13:04:15 +0100 (CET) Subject: [pypy-svn] r20828 - in pypy/dist/pypy/jit: . test Message-ID: <20051207120415.253C627B45@code1.codespeak.net> Author: ericvrp Date: Wed Dec 7 13:04:12 2005 New Revision: 20828 Added: pypy/dist/pypy/jit/__init__.py pypy/dist/pypy/jit/bytecode.py pypy/dist/pypy/jit/test/test_tl.py pypy/dist/pypy/jit/tl.py Log: Initial tl (Toy Language) commit with tests. 
(tl.interp is translatable) Added: pypy/dist/pypy/jit/__init__.py ============================================================================== Added: pypy/dist/pypy/jit/bytecode.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/jit/bytecode.py Wed Dec 7 13:04:12 2005 @@ -0,0 +1,4 @@ +PUSH = 'P' +POP = 'p' +ADD = '+' +INVALID = '!' Added: pypy/dist/pypy/jit/test/test_tl.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/jit/test/test_tl.py Wed Dec 7 13:04:12 2005 @@ -0,0 +1,53 @@ +import py +from pypy.jit.tl import interp +from pypy.jit.bytecode import * + +#from pypy.rpython.l3interp import l3interp +#from pypy.rpython.l3interp import model +#from pypy.rpython.l3interp.model import Op +#from pypy.translator.c.test.test_genc import compile +from pypy.translator.translator import TranslationContext +from pypy.annotation import policy + +def translate(func, inputargs): + t = TranslationContext() + pol = policy.AnnotatorPolicy() + pol.allow_someobjects = False + t.buildannotator(policy=pol).build_types(func, inputargs) + t.buildrtyper().specialize() + + from pypy.translator.tool.cbuild import skip_missing_compiler + from pypy.translator.c import genc + builder = genc.CExtModuleBuilder(t, func) + builder.generate_source() + skip_missing_compiler(builder.compile) + builder.import_module() + return builder.get_entry_point() + +# actual tests go here + +def test_tl_push(): + assert interp(PUSH+chr(16)) == 16 + +def test_tl_pop(): + assert interp( ''.join([PUSH,chr(16), PUSH,chr(42), PUSH,chr(200), POP]) ) == 42 + +def test_tl_add(): + assert interp( ''.join([PUSH,chr(42), PUSH,chr(200), ADD]) ) == 242 + assert interp( ''.join([PUSH,chr(16), PUSH,chr(42), PUSH,chr(200), ADD]) ) == 242 + +def test_tl_error(): + py.test.raises(IndexError, interp,POP) + py.test.raises(IndexError, interp,ADD) + py.test.raises(IndexError, 
interp,''.join([PUSH,chr(200), ADD]) ) + +def test_tl_invalid_codetype(): + py.test.raises(TypeError, interp,[INVALID]) + +def test_tl_invalid_bytecode(): + py.test.raises(RuntimeError, interp,INVALID) + +def test_tl_translatable(): + code = ''.join([PUSH,chr(42), PUSH,chr(200), ADD]) + fn = translate(interp, [str]) + assert interp(code) == fn(code) Added: pypy/dist/pypy/jit/tl.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/jit/tl.py Wed Dec 7 13:04:12 2005 @@ -0,0 +1,32 @@ +'''Toy Language''' + +import py +from bytecode import * + +def interp(code=''): + if not isinstance(code,str): + raise TypeError("code '%s' should be a string" % str(code)) + + code_len = len(code) + stack = [] + pc = 0 + + while pc < code_len: + opcode = code[pc] + pc += 1 + + if opcode == PUSH: + stack.append(ord(code[pc])) + pc += 1 + + elif opcode == POP: + stack.pop() + + elif opcode == ADD: + stack.append( stack.pop() + stack.pop() ) + + else: + raise RuntimeError("unknown opcode: " + str(opcode)) + + return stack[-1] + From mwh at codespeak.net Wed Dec 7 15:19:19 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Wed, 7 Dec 2005 15:19:19 +0100 (CET) Subject: [pypy-svn] r20831 - pypy/dist/pypy/translator/c/src Message-ID: <20051207141919.D3DAC27B5C@code1.codespeak.net> Author: mwh Date: Wed Dec 7 15:19:18 2005 New Revision: 20831 Modified: pypy/dist/pypy/translator/c/src/g_include.h Log: don't #include pyobj.h when building standalone Modified: pypy/dist/pypy/translator/c/src/g_include.h ============================================================================== --- pypy/dist/pypy/translator/c/src/g_include.h (original) +++ pypy/dist/pypy/translator/c/src/g_include.h Wed Dec 7 15:19:18 2005 @@ -21,13 +21,13 @@ #ifndef PYPY_STANDALONE # include "src/module.h" +# include "src/pyobj.h" #endif #include "src/int.h" #include "src/char.h" #include "src/unichar.h" #include "src/float.h" -#include "src/pyobj.h" 
#include "src/address.h" /*** modules ***/ From adim at codespeak.net Wed Dec 7 16:05:52 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Wed, 7 Dec 2005 16:05:52 +0100 (CET) Subject: [pypy-svn] r20832 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20051207150552.AC08127B5C@code1.codespeak.net> Author: adim Date: Wed Dec 7 16:05:51 2005 New Revision: 20832 Modified: pypy/dist/pypy/interpreter/astcompiler/astgen.py Log: don't use load_boilerplate() anymore to find prologue and epilogue Modified: pypy/dist/pypy/interpreter/astcompiler/astgen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/astgen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/astgen.py Wed Dec 7 16:05:51 2005 @@ -18,16 +18,6 @@ SPEC = "ast.txt" COMMA = ", " -def load_boilerplate(file): - f = open(file) - buf = f.read() - f.close() - i = buf.find('### ''PROLOGUE') - j = buf.find('### ''EPILOGUE') - pro = buf[i+12:j].strip() - epi = buf[j+12:].strip() - return pro, epi - def strip_default(arg): """Return the argname from an 'arg = default' string""" i = arg.find('=') @@ -115,6 +105,8 @@ self._gen_repr(buf) print >> buf self._gen_visit(buf) + print >> buf + self._gen_descr_visit(buf) buf.seek(0, 0) return buf.read() @@ -226,6 +218,12 @@ print >> buf, " def accept(self, visitor):" print >> buf, " return visitor.visit%s(self)" % self.name + def _gen_descr_visit(self, buf): + print >> buf, " def descr_accept(self, space, w_visitor):" + print >> buf, " w_callable = space.getattr(w_visitor, space.wrap('visit%s'))" % self.name + print >> buf, " args = Arguments(space, [ self ])" + print >> buf, " return space.call_args(w_callable, args)" + def _gen_additional_methods(self, buf): for key, value in self.additional_methods.iteritems(): if key not in '_cur_': @@ -333,7 +331,6 @@ print buf.getvalue() def main(): - prologue, epilogue = load_boilerplate(sys.argv[-1]) print prologue print classes = 
parse_spec(SPEC) @@ -348,18 +345,17 @@ emit(info) gen_ast_visitor(classes) print epilogue - -if __name__ == "__main__": - main() - sys.exit(0) - -### PROLOGUE + +prologue = ''' """Python abstract syntax node definitions This file is automatically generated by Tools/compiler/astgen.py """ from consts import CO_VARARGS, CO_VARKEYWORDS, OP_ASSIGN from pypy.interpreter.baseobjspace import Wrappable +from pypy.interpreter.typedef import TypeDef +from pypy.interpreter.gateway import interp2app +from pypy.interpreter.argument import Arguments def flatten(list): l = [] @@ -405,6 +401,28 @@ res.append( self ) return res + def __repr__(self): + return "Node()" + + def descr_repr( self, space ): + return space.wrap( self.__repr__() ) + +def descr_node_repr( space, w_obj ): + return w_obj.descr_repr( space ) + +def descr_getChildNodes( space, w_obj ): + lst = w_obj.getChildNodes() + return space.newlist( lst ) + +def descr_accept( space, w_obj, w_visitor ): + return w_obj.descr_accept( space, w_visitor ) + +Node.typedef = TypeDef('ASTNode', + __repr__ = interp2app(descr_node_repr), + getChildNodes = interp2app(descr_getChildNodes), + accept = interp2app(descr_accept), + ) + class EmptyNode(Node): def accept(self, visitor): @@ -429,7 +447,15 @@ def accept(self, visitor): return visitor.visitExpression(self) -### EPILOGUE +''' + +epilogue = ''' for name, obj in globals().items(): if isinstance(obj, type) and issubclass(obj, Node): nodes[name.lower()] = obj +''' + +if __name__ == "__main__": + main() + sys.exit(0) + From adim at codespeak.net Wed Dec 7 16:09:17 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Wed, 7 Dec 2005 16:09:17 +0100 (CET) Subject: [pypy-svn] r20833 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20051207150917.5EF2427B66@code1.codespeak.net> Author: adim Date: Wed Dec 7 16:09:09 2005 New Revision: 20833 Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py Log: forgot to checkin regenerated ast.py Modified: 
pypy/dist/pypy/interpreter/astcompiler/ast.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.py Wed Dec 7 16:09:09 2005 @@ -1,9 +1,13 @@ + """Python abstract syntax node definitions This file is automatically generated by Tools/compiler/astgen.py """ from consts import CO_VARARGS, CO_VARKEYWORDS, OP_ASSIGN from pypy.interpreter.baseobjspace import Wrappable +from pypy.interpreter.typedef import TypeDef +from pypy.interpreter.gateway import interp2app +from pypy.interpreter.argument import Arguments def flatten(list): l = [] @@ -49,6 +53,28 @@ res.append( self ) return res + def __repr__(self): + return "Node()" + + def descr_repr( self, space ): + return space.wrap( self.__repr__() ) + +def descr_node_repr( space, w_obj ): + return w_obj.descr_repr( space ) + +def descr_getChildNodes( space, w_obj ): + lst = w_obj.getChildNodes() + return space.newlist( lst ) + +def descr_accept( space, w_obj, w_visitor ): + return w_obj.descr_accept( space, w_visitor ) + +Node.typedef = TypeDef('ASTNode', + __repr__ = interp2app(descr_node_repr), + getChildNodes = interp2app(descr_getChildNodes), + accept = interp2app(descr_accept), + ) + class EmptyNode(Node): def accept(self, visitor): @@ -73,6 +99,8 @@ def accept(self, visitor): return visitor.visitExpression(self) + + class AbstractFunction(Node): def __init__(self, lineno=-1): Node.__init__(self, lineno) @@ -90,6 +118,11 @@ def accept(self, visitor): return visitor.visitAbstractFunction(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitAbstractFunction')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class AbstractTest(Node): def __init__(self, lineno=-1): Node.__init__(self, lineno) @@ -107,6 +140,11 @@ def accept(self, visitor): return visitor.visitAbstractTest(self) + def descr_accept(self, 
space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitAbstractTest')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class BinaryOp(Node): def __init__(self, lineno=-1): Node.__init__(self, lineno) @@ -124,6 +162,11 @@ def accept(self, visitor): return visitor.visitBinaryOp(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitBinaryOp')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Add(BinaryOp): def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) @@ -143,6 +186,11 @@ def accept(self, visitor): return visitor.visitAdd(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitAdd')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class And(AbstractTest): def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) @@ -163,6 +211,11 @@ def accept(self, visitor): return visitor.visitAnd(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitAnd')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class AssAttr(Node): def __init__(self, expr, attrname, flags, lineno=-1): Node.__init__(self, lineno) @@ -183,6 +236,11 @@ def accept(self, visitor): return visitor.visitAssAttr(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitAssAttr')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class AssSeq(Node): def __init__(self, lineno=-1): Node.__init__(self, lineno) @@ -200,6 +258,11 @@ def accept(self, visitor): return visitor.visitAssSeq(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitAssSeq')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class 
AssList(AssSeq): def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) @@ -220,6 +283,11 @@ def accept(self, visitor): return visitor.visitAssList(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitAssList')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class AssName(Node): def __init__(self, name, flags, lineno=-1): Node.__init__(self, lineno) @@ -239,6 +307,11 @@ def accept(self, visitor): return visitor.visitAssName(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitAssName')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class AssTuple(AssSeq): def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) @@ -272,6 +345,11 @@ def accept(self, visitor): return visitor.visitAssTuple(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitAssTuple')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Assert(Node): def __init__(self, test, fail, lineno=-1): Node.__init__(self, lineno) @@ -298,6 +376,11 @@ def accept(self, visitor): return visitor.visitAssert(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitAssert')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Assign(Node): def __init__(self, nodes, expr, lineno=-1): Node.__init__(self, lineno) @@ -323,6 +406,11 @@ def accept(self, visitor): return visitor.visitAssign(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitAssign')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class AugAssign(Node): def __init__(self, node, op, expr, lineno=-1): Node.__init__(self, lineno) @@ -343,6 +431,11 @@ def accept(self, visitor): return 
visitor.visitAugAssign(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitAugAssign')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class UnaryOp(Node): def __init__(self, lineno=-1): Node.__init__(self, lineno) @@ -360,6 +453,11 @@ def accept(self, visitor): return visitor.visitUnaryOp(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitUnaryOp')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Backquote(UnaryOp): def __init__(self, expr, lineno=-1): Node.__init__(self, lineno) @@ -378,6 +476,11 @@ def accept(self, visitor): return visitor.visitBackquote(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitBackquote')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class BitOp(Node): def __init__(self, lineno=-1): Node.__init__(self, lineno) @@ -395,6 +498,11 @@ def accept(self, visitor): return visitor.visitBitOp(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitBitOp')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Bitand(BitOp): def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) @@ -415,6 +523,11 @@ def accept(self, visitor): return visitor.visitBitand(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitBitand')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Bitor(BitOp): def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) @@ -435,6 +548,11 @@ def accept(self, visitor): return visitor.visitBitor(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitBitor')) + args = Arguments(space, [ self ]) + return 
space.call_args(w_callable, args) + class Bitxor(BitOp): def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) @@ -455,6 +573,11 @@ def accept(self, visitor): return visitor.visitBitxor(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitBitxor')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Break(Node): def __init__(self, lineno=-1): Node.__init__(self, lineno) @@ -472,6 +595,11 @@ def accept(self, visitor): return visitor.visitBreak(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitBreak')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class CallFunc(Node): def __init__(self, node, args, star_args = None, dstar_args = None, lineno=-1): Node.__init__(self, lineno) @@ -505,6 +633,11 @@ def accept(self, visitor): return visitor.visitCallFunc(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitCallFunc')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Class(Node): def __init__(self, name, bases, doc, code, lineno=-1): Node.__init__(self, lineno) @@ -534,6 +667,11 @@ def accept(self, visitor): return visitor.visitClass(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitClass')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Compare(Node): def __init__(self, expr, ops, lineno=-1): Node.__init__(self, lineno) @@ -562,6 +700,11 @@ def accept(self, visitor): return visitor.visitCompare(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitCompare')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Const(Node): def __init__(self, value, lineno=-1): Node.__init__(self, lineno) @@ -580,6 +723,11 @@ 
def accept(self, visitor): return visitor.visitConst(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitConst')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Continue(Node): def __init__(self, lineno=-1): Node.__init__(self, lineno) @@ -597,6 +745,11 @@ def accept(self, visitor): return visitor.visitContinue(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitContinue')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Decorators(Node): def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) @@ -617,6 +770,11 @@ def accept(self, visitor): return visitor.visitDecorators(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitDecorators')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Dict(Node): def __init__(self, items, lineno=-1): Node.__init__(self, lineno) @@ -641,6 +799,11 @@ def accept(self, visitor): return visitor.visitDict(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitDict')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Discard(Node): def __init__(self, expr, lineno=-1): Node.__init__(self, lineno) @@ -659,6 +822,11 @@ def accept(self, visitor): return visitor.visitDiscard(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitDiscard')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Div(BinaryOp): def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) @@ -678,6 +846,11 @@ def accept(self, visitor): return visitor.visitDiv(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitDiv')) + args = Arguments(space, 
[ self ]) + return space.call_args(w_callable, args) + class Ellipsis(Node): def __init__(self, lineno=-1): Node.__init__(self, lineno) @@ -695,6 +868,11 @@ def accept(self, visitor): return visitor.visitEllipsis(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitEllipsis')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Exec(Node): def __init__(self, expr, locals, globals, lineno=-1): Node.__init__(self, lineno) @@ -725,6 +903,11 @@ def accept(self, visitor): return visitor.visitExec(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitExec')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class FloorDiv(BinaryOp): def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) @@ -744,6 +927,11 @@ def accept(self, visitor): return visitor.visitFloorDiv(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitFloorDiv')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class For(Node): def __init__(self, assign, list, body, else_, lineno=-1): Node.__init__(self, lineno) @@ -776,6 +964,11 @@ def accept(self, visitor): return visitor.visitFor(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitFor')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class From(Node): def __init__(self, modname, names, lineno=-1): Node.__init__(self, lineno) @@ -795,6 +988,11 @@ def accept(self, visitor): return visitor.visitFrom(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitFrom')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Function(AbstractFunction): def __init__(self, decorators, name, argnames, defaults, flags, doc, code, 
lineno=-1): Node.__init__(self, lineno) @@ -839,6 +1037,11 @@ def accept(self, visitor): return visitor.visitFunction(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitFunction')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class GenExpr(AbstractFunction): def __init__(self, code, lineno=-1): Node.__init__(self, lineno) @@ -861,6 +1064,11 @@ def accept(self, visitor): return visitor.visitGenExpr(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitGenExpr')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class GenExprFor(Node): def __init__(self, assign, iter, ifs, lineno=-1): Node.__init__(self, lineno) @@ -892,6 +1100,11 @@ def accept(self, visitor): return visitor.visitGenExprFor(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitGenExprFor')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class GenExprIf(Node): def __init__(self, test, lineno=-1): Node.__init__(self, lineno) @@ -910,6 +1123,11 @@ def accept(self, visitor): return visitor.visitGenExprIf(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitGenExprIf')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class GenExprInner(Node): def __init__(self, expr, quals, lineno=-1): Node.__init__(self, lineno) @@ -935,6 +1153,11 @@ def accept(self, visitor): return visitor.visitGenExprInner(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitGenExprInner')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Getattr(Node): def __init__(self, expr, attrname, lineno=-1): Node.__init__(self, lineno) @@ -954,6 +1177,11 @@ def accept(self, visitor): return 
visitor.visitGetattr(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitGetattr')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Global(Node): def __init__(self, names, lineno=-1): Node.__init__(self, lineno) @@ -972,6 +1200,11 @@ def accept(self, visitor): return visitor.visitGlobal(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitGlobal')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class If(Node): def __init__(self, tests, else_, lineno=-1): Node.__init__(self, lineno) @@ -1002,6 +1235,11 @@ def accept(self, visitor): return visitor.visitIf(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitIf')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Import(Node): def __init__(self, names, lineno=-1): Node.__init__(self, lineno) @@ -1020,6 +1258,11 @@ def accept(self, visitor): return visitor.visitImport(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitImport')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Invert(UnaryOp): def __init__(self, expr, lineno=-1): Node.__init__(self, lineno) @@ -1038,6 +1281,11 @@ def accept(self, visitor): return visitor.visitInvert(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitInvert')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Keyword(Node): def __init__(self, name, expr, lineno=-1): Node.__init__(self, lineno) @@ -1057,6 +1305,11 @@ def accept(self, visitor): return visitor.visitKeyword(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitKeyword')) + args = Arguments(space, [ self ]) + return 
space.call_args(w_callable, args) + class Lambda(AbstractFunction): def __init__(self, argnames, defaults, flags, code, lineno=-1): Node.__init__(self, lineno) @@ -1093,6 +1346,11 @@ def accept(self, visitor): return visitor.visitLambda(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitLambda')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class LeftShift(BinaryOp): def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) @@ -1112,6 +1370,11 @@ def accept(self, visitor): return visitor.visitLeftShift(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitLeftShift')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class List(Node): def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) @@ -1132,6 +1395,11 @@ def accept(self, visitor): return visitor.visitList(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitList')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class ListComp(Node): def __init__(self, expr, quals, lineno=-1): Node.__init__(self, lineno) @@ -1157,6 +1425,11 @@ def accept(self, visitor): return visitor.visitListComp(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitListComp')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class ListCompFor(Node): def __init__(self, assign, list, ifs, lineno=-1): Node.__init__(self, lineno) @@ -1185,6 +1458,11 @@ def accept(self, visitor): return visitor.visitListCompFor(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitListCompFor')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class ListCompIf(Node): def __init__(self, test, lineno=-1): 
Node.__init__(self, lineno) @@ -1203,6 +1481,11 @@ def accept(self, visitor): return visitor.visitListCompIf(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitListCompIf')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Mod(BinaryOp): def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) @@ -1222,6 +1505,11 @@ def accept(self, visitor): return visitor.visitMod(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitMod')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Module(Node): def __init__(self, doc, node, lineno=-1): Node.__init__(self, lineno) @@ -1241,6 +1529,11 @@ def accept(self, visitor): return visitor.visitModule(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitModule')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Mul(BinaryOp): def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) @@ -1260,6 +1553,11 @@ def accept(self, visitor): return visitor.visitMul(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitMul')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Name(Node): def __init__(self, varname, lineno=-1): Node.__init__(self, lineno) @@ -1278,6 +1576,11 @@ def accept(self, visitor): return visitor.visitName(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitName')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class NoneConst(Node): def __init__(self, lineno=-1): Node.__init__(self, lineno) @@ -1295,6 +1598,11 @@ def accept(self, visitor): return visitor.visitNoneConst(self) + def descr_accept(self, space, w_visitor): + w_callable = 
space.getattr(w_visitor, space.wrap('visitNoneConst')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Not(UnaryOp): def __init__(self, expr, lineno=-1): Node.__init__(self, lineno) @@ -1313,6 +1621,11 @@ def accept(self, visitor): return visitor.visitNot(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitNot')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class NumberConst(Node): def __init__(self, number_value, lineno=-1): Node.__init__(self, lineno) @@ -1331,6 +1644,11 @@ def accept(self, visitor): return visitor.visitNumberConst(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitNumberConst')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Or(AbstractTest): def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) @@ -1351,6 +1669,11 @@ def accept(self, visitor): return visitor.visitOr(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitOr')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Pass(Node): def __init__(self, lineno=-1): Node.__init__(self, lineno) @@ -1368,6 +1691,11 @@ def accept(self, visitor): return visitor.visitPass(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitPass')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Power(BinaryOp): def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) @@ -1387,6 +1715,11 @@ def accept(self, visitor): return visitor.visitPower(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitPower')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Print(Node): def __init__(self, nodes, dest, 
lineno=-1): Node.__init__(self, lineno) @@ -1413,6 +1746,11 @@ def accept(self, visitor): return visitor.visitPrint(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitPrint')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Printnl(Node): def __init__(self, nodes, dest, lineno=-1): Node.__init__(self, lineno) @@ -1439,6 +1777,11 @@ def accept(self, visitor): return visitor.visitPrintnl(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitPrintnl')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Raise(Node): def __init__(self, expr1, expr2, expr3, lineno=-1): Node.__init__(self, lineno) @@ -1470,6 +1813,11 @@ def accept(self, visitor): return visitor.visitRaise(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitRaise')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Return(Node): def __init__(self, value, lineno=-1): Node.__init__(self, lineno) @@ -1491,6 +1839,11 @@ def accept(self, visitor): return visitor.visitReturn(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitReturn')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class RightShift(BinaryOp): def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) @@ -1510,6 +1863,11 @@ def accept(self, visitor): return visitor.visitRightShift(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitRightShift')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Slice(Node): def __init__(self, expr, flags, lower, upper, lineno=-1): Node.__init__(self, lineno) @@ -1542,6 +1900,11 @@ def accept(self, visitor): return visitor.visitSlice(self) + def 
descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitSlice')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Sliceobj(Node): def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) @@ -1562,6 +1925,11 @@ def accept(self, visitor): return visitor.visitSliceobj(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitSliceobj')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Stmt(Node): def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) @@ -1582,6 +1950,11 @@ def accept(self, visitor): return visitor.visitStmt(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitStmt')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class StringConst(Node): def __init__(self, string_value, lineno=-1): Node.__init__(self, lineno) @@ -1600,6 +1973,11 @@ def accept(self, visitor): return visitor.visitStringConst(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitStringConst')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Sub(BinaryOp): def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) @@ -1619,6 +1997,11 @@ def accept(self, visitor): return visitor.visitSub(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitSub')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Subscript(Node): def __init__(self, expr, flags, subs, lineno=-1): Node.__init__(self, lineno) @@ -1646,6 +2029,11 @@ def accept(self, visitor): return visitor.visitSubscript(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitSubscript')) + args = Arguments(space, [ self ]) + return 
space.call_args(w_callable, args) + class TryExcept(Node): def __init__(self, body, handlers, else_, lineno=-1): Node.__init__(self, lineno) @@ -1683,6 +2071,11 @@ def accept(self, visitor): return visitor.visitTryExcept(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitTryExcept')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class TryFinally(Node): def __init__(self, body, final, lineno=-1): Node.__init__(self, lineno) @@ -1702,6 +2095,11 @@ def accept(self, visitor): return visitor.visitTryFinally(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitTryFinally')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Tuple(Node): def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) @@ -1722,6 +2120,11 @@ def accept(self, visitor): return visitor.visitTuple(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitTuple')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class UnaryAdd(UnaryOp): def __init__(self, expr, lineno=-1): Node.__init__(self, lineno) @@ -1740,6 +2143,11 @@ def accept(self, visitor): return visitor.visitUnaryAdd(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitUnaryAdd')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class UnarySub(UnaryOp): def __init__(self, expr, lineno=-1): Node.__init__(self, lineno) @@ -1758,6 +2166,11 @@ def accept(self, visitor): return visitor.visitUnarySub(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitUnarySub')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class While(Node): def __init__(self, test, body, else_, lineno=-1): Node.__init__(self, lineno) @@ 
-1787,6 +2200,11 @@ def accept(self, visitor): return visitor.visitWhile(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitWhile')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class Yield(Node): def __init__(self, value, lineno=-1): Node.__init__(self, lineno) @@ -1805,6 +2223,11 @@ def accept(self, visitor): return visitor.visitYield(self) + def descr_accept(self, space, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitYield')) + args = Arguments(space, [ self ]) + return space.call_args(w_callable, args) + class ASTVisitor(object): """This is a visitor base class used to provide the visit @@ -1982,6 +2405,8 @@ def visitYield(self, node): return self.default( node ) + for name, obj in globals().items(): if isinstance(obj, type) and issubclass(obj, Node): nodes[name.lower()] = obj + From mwh at codespeak.net Wed Dec 7 16:09:17 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Wed, 7 Dec 2005 16:09:17 +0100 (CET) Subject: [pypy-svn] r20834 - pypy/dist/pypy/rpython Message-ID: <20051207150917.7810727B43@code1.codespeak.net> Author: mwh Date: Wed Dec 7 16:09:12 2005 New Revision: 20834 Modified: pypy/dist/pypy/rpython/rint.py pypy/dist/pypy/rpython/rmodel.py Log: (johahn, mwh) Reduce if-elsing on Signed/Unsigned in rint.py. 
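The commit log above describes collapsing the duplicated Signed/Unsigned branches into one code path. Sketched in isolation, the idea is to let each integer representation carry an `opprefix` string so the generic helper can build the low-level operation name instead of if-elsing on signedness. This is a hypothetical, simplified model: the real `IntegerRepr` lives in `pypy/rpython/rmodel.py` and holds low-level type objects, not the placeholder strings used here.

```python
# Hypothetical sketch of the 'opprefix' refactoring from this commit:
# one generic helper replaces duplicated signed/unsigned branches.

class IntegerRepr:
    def __init__(self, lowleveltype, opprefix):
        self.lowleveltype = lowleveltype  # 'Signed'/'Unsigned' stand in for real lltypes
        self.opprefix = opprefix          # 'int_' or 'uint_'

signed_repr = IntegerRepr('Signed', 'int_')
unsigned_repr = IntegerRepr('Unsigned', 'uint_')

def genop_name(repr_, func):
    # The rtyper can now emit e.g. 'int_add' or 'uint_add' from one code path,
    # instead of two near-identical if/else arms per operation.
    return repr_.opprefix + func

assert genop_name(signed_repr, 'add') == 'int_add'
assert genop_name(unsigned_repr, 'add') == 'uint_add'
```

The payoff shows in helpers like `_rtype_template` in the diff below, which shrink to a single `hop.genop(repr.opprefix + func, ...)` call.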
Modified: pypy/dist/pypy/rpython/rint.py ============================================================================== --- pypy/dist/pypy/rpython/rint.py (original) +++ pypy/dist/pypy/rpython/rint.py Wed Dec 7 16:09:12 2005 @@ -20,9 +20,8 @@ def rtyper_makekey(self): return self.__class__, self.unsigned -signed_repr = IntegerRepr() -unsigned_repr = IntegerRepr() -unsigned_repr.lowleveltype = Unsigned +signed_repr = IntegerRepr(Signed, 'int_') +unsigned_repr = IntegerRepr(Unsigned, 'uint_') class __extend__(pairtype(IntegerRepr, IntegerRepr)): @@ -112,20 +111,13 @@ if hop.has_implicit_exception(ZeroDivisionError): suffix += '_zer' s_int3 = hop.args_s[2] - if hop.s_result.unsigned: - if s_int3.is_constant() and s_int3.const is None: - vlist = hop.inputargs(Unsigned, Unsigned, Void)[:2] - else: - vlist = hop.inputargs(Unsigned, Unsigned, Unsigned) - hop.exception_is_here() - return hop.genop('uint_pow' + suffix, vlist, resulttype=Unsigned) + rresult = hop.rtyper.makerepr(hop.s_result) + if s_int3.is_constant() and s_int3.const is None: + vlist = hop.inputargs(rresult, rresult, Void)[:2] else: - if s_int3.is_constant() and s_int3.const is None: - vlist = hop.inputargs(Signed, Signed, Void)[:2] - else: - vlist = hop.inputargs(Signed, Signed, Signed) - hop.exception_is_here() - return hop.genop('int_pow' + suffix, vlist, resulttype=Signed) + vlist = hop.inputargs(rresult, rresult, rresult) + hop.exception_is_here() + return hop.genop(rresult.opprefix + 'pow' + suffix, vlist, resulttype=rresult) def rtype_pow_ovf(_, hop): if hop.s_result.unsigned: @@ -161,23 +153,22 @@ #Helper functions def _rtype_template(hop, func, implicit_excs=[]): - func1 = func + if func.endswith('_ovf'): + if hop.s_result.unsigned: + raise TyperError("forbidden unsigned " + func) + else: + hop.has_implicit_exception(OverflowError) + for implicit_exc in implicit_excs: if hop.has_implicit_exception(implicit_exc): appendix = op_appendices[implicit_exc] func += '_' + appendix - if 
hop.s_result.unsigned: - if func1.endswith('_ovf'): - raise TyperError("forbidden uint_" + func) - vlist = hop.inputargs(Unsigned, Unsigned) - hop.exception_is_here() - return hop.genop('uint_'+func, vlist, resulttype=Unsigned) - else: - if func1.endswith('_ovf'): # record that we know about it - hop.has_implicit_exception(OverflowError) - vlist = hop.inputargs(Signed, Signed) - hop.exception_is_here() - return hop.genop('int_'+func, vlist, resulttype=Signed) + + repr = hop.rtyper.makerepr(hop.s_result) + vlist = hop.inputargs(repr, repr) + hop.exception_is_here() + return hop.genop(repr.opprefix+func, vlist, resulttype=repr) + #Helper functions for comparisons @@ -186,11 +177,11 @@ if s_int1.unsigned or s_int2.unsigned: if not s_int1.nonneg or not s_int2.nonneg: raise TyperError("comparing a signed and an unsigned number") - vlist = hop.inputargs(Unsigned, Unsigned) - return hop.genop('uint_'+func, vlist, resulttype=Bool) - else: - vlist = hop.inputargs(Signed, Signed) - return hop.genop('int_'+func, vlist, resulttype=Bool) + + repr = hop.rtyper.makerepr(annmodel.unionof(s_int1, s_int2)) + vlist = hop.inputargs(repr, repr) + hop.exception_is_here() + return hop.genop(repr.opprefix+func, vlist, resulttype=Bool) # @@ -220,75 +211,60 @@ return hop.genop('cast_int_to_char', vlist, resulttype=Char) def rtype_unichr(_, hop): - vlist = hop.inputargs(Signed) + vlist = hop.inputargs(Signed) if hop.has_implicit_exception(ValueError): hop.exception_is_here() hop.gendirectcall(ll_check_unichr, vlist[0]) return hop.genop('cast_int_to_unichar', vlist, resulttype=UniChar) def rtype_is_true(self, hop): - if self.lowleveltype == Unsigned: - vlist = hop.inputargs(Unsigned) - return hop.genop('uint_is_true', vlist, resulttype=Bool) - else: - vlist = hop.inputargs(Signed) - return hop.genop('int_is_true', vlist, resulttype=Bool) + vlist = hop.inputargs(self) + return hop.genop(self.opprefix + 'is_true', vlist, resulttype=Bool) #Unary arithmetic operations - def rtype_abs(_, hop): + 
def rtype_abs(self, hop): if hop.s_result.unsigned: - vlist = hop.inputargs(Unsigned) + vlist = hop.inputargs(self) return vlist[0] else: - vlist = hop.inputargs(Signed) - return hop.genop('int_abs', vlist, resulttype=Signed) + vlist = hop.inputargs(self) + return hop.genop(self.opprefix + 'abs', vlist, resulttype=self) - def rtype_abs_ovf(_, hop): + def rtype_abs_ovf(self, hop): if hop.s_result.unsigned: raise TyperError("forbidden uint_abs_ovf") else: - vlist = hop.inputargs(Signed) + vlist = hop.inputargs(self) hop.has_implicit_exception(OverflowError) # record we know about it hop.exception_is_here() - return hop.genop('int_abs_ovf', vlist, resulttype=Signed) - - def rtype_invert(_, hop): - if hop.s_result.unsigned: - vlist = hop.inputargs(Unsigned) - return hop.genop('uint_invert', vlist, resulttype=Unsigned) - else: - vlist = hop.inputargs(Signed) - return hop.genop('int_invert', vlist, resulttype=Signed) + return hop.genop(self.opprefix + 'abs_ovf', vlist, resulttype=self) - def rtype_neg(_, hop): - if hop.s_result.unsigned: - vlist = hop.inputargs(Unsigned) - return hop.genop('uint_neg', vlist, resulttype=Unsigned) - else: - vlist = hop.inputargs(Signed) - return hop.genop('int_neg', vlist, resulttype=Signed) + def rtype_invert(self, hop): + vlist = hop.inputargs(self) + return hop.genop(self.opprefix + 'invert', vlist, resulttype=self) + + def rtype_neg(self, hop): + vlist = hop.inputargs(self) + return hop.genop(self.opprefix + 'neg', vlist, resulttype=self) - def rtype_neg_ovf(_, hop): + def rtype_neg_ovf(self, hop): if hop.s_result.unsigned: raise TyperError("forbidden uint_neg_ovf") else: - vlist = hop.inputargs(Signed) + vlist = hop.inputargs(self) hop.has_implicit_exception(OverflowError) # record we know about it hop.exception_is_here() - return hop.genop('int_neg_ovf', vlist, resulttype=Signed) + return hop.genop(self.opprefix + 'neg_ovf', vlist, resulttype=self) - def rtype_pos(_, hop): - if hop.s_result.unsigned: - vlist = hop.inputargs(Unsigned) 
- else: - vlist = hop.inputargs(Signed) + def rtype_pos(self, hop): + vlist = hop.inputargs(self) return vlist[0] - def rtype_int(r_int, hop): - if r_int.lowleveltype == Unsigned: + def rtype_int(self, hop): + if self.lowleveltype == Unsigned: raise TyperError("use intmask() instead of int(r_uint(...))") - vlist = hop.inputargs(Signed) + vlist = hop.inputargs(self) return vlist[0] def rtype_float(_, hop): Modified: pypy/dist/pypy/rpython/rmodel.py ============================================================================== --- pypy/dist/pypy/rpython/rmodel.py (original) +++ pypy/dist/pypy/rpython/rmodel.py Wed Dec 7 16:09:12 2005 @@ -271,10 +271,15 @@ lowleveltype = Float class IntegerRepr(FloatRepr): - lowleveltype = Signed + def __init__(self, lowleveltype, opprefix): + self.lowleveltype = lowleveltype + self.opprefix = opprefix class BoolRepr(IntegerRepr): lowleveltype = Bool + opprefix = 'int_' + def __init__(self): + pass class StringRepr(Repr): pass From cfbolz at codespeak.net Wed Dec 7 16:09:59 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Wed, 7 Dec 2005 16:09:59 +0100 (CET) Subject: [pypy-svn] r20835 - in pypy/dist/pypy: jit jit/test objspace/flow Message-ID: <20051207150959.12A4927B45@code1.codespeak.net> Author: cfbolz Date: Wed Dec 7 16:09:57 2005 New Revision: 20835 Added: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_llabstractinterp.py Modified: pypy/dist/pypy/objspace/flow/model.py Log: (cfbolz, arigo): tiny attempts to write an abstract interpreter for l2 graphs. 
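The log above introduces an abstract interpreter that specializes a graph given "hints", i.e. variables asserted to hold known values. A toy version of the underlying idea — constant-fold operations whose inputs are all known, and emit the rest as residual code — fits in a few lines of plain Python. The function and data shapes here are invented for illustration and are not PyPy's actual `LLAbstractInterp` API:

```python
import operator

# Toy partial evaluator in the spirit of r20835: variables listed in
# `hints` are treated as known constants; operations whose inputs are all
# known are folded away, the rest are kept as residual operations with
# known inputs replaced by their constant values.

def partial_eval(ops, hints):
    env = dict(hints)          # var name -> known constant value
    residual = []
    for result, opname, args in ops:
        fn = getattr(operator, opname)
        if all(a in env for a in args):
            env[result] = fn(*(env[a] for a in args))    # fold completely
        else:
            newargs = [env.get(a, a) for a in args]      # substitute knowns
            residual.append((result, opname, newargs))
    return residual

# compute z = (x + y) * 2 with the hint y = 42
ops = [('t1', 'add', ['x', 'y']),
       ('z', 'mul', ['t1', 'two'])]
residual = partial_eval(ops, {'y': 42, 'two': 2})
print(residual)   # [('t1', 'add', ['x', 42]), ('z', 'mul', ['t1', 2])]
```

The real interpreter additionally has to handle control flow — hence the `LLState` matching, pending links, and block scheduling in the diff — but the fold-or-residualize decision per operation is the same.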
Added: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/jit/llabstractinterp.py Wed Dec 7 16:09:57 2005 @@ -0,0 +1,245 @@ +import operator +from pypy.objspace.flow.model import Variable, Constant, SpaceOperation +from pypy.objspace.flow.model import Block, Link, FunctionGraph +from pypy.rpython.lltypesystem import lltype + + +class LLAbstractValue(object): + pass + +class LLConcreteValue(LLAbstractValue): + + def __init__(self, value): + self.value = value + +# def __eq__(self, other): +# return self.__class__ is other.__class__ and self.value == other.value +# +# def __ne__(self, other): +# return not (self == other) +# +# def __hash__(self): +# return hash(self.value) + + def getconcretetype(self): + return lltype.typeOf(self.value) + + def getvarorconst(self): + c = Constant(self.value) + c.concretetype = self.getconcretetype() + return c + + def match(self, other): + return isinstance(other, LLConcreteValue) and self.value == other.value + + +class LLRuntimeValue(LLAbstractValue): + + def __init__(self, orig_v): + if isinstance(orig_v, Variable): + self.copy_v = Variable(orig_v) + self.copy_v.concretetype = orig_v.concretetype + else: + # we can share the Constant() + self.copy_v = orig_v + + def getconcretetype(self): + return self.copy_v.concretetype + + def getvarorconst(self): + return self.copy_v + + def match(self, other): + return isinstance(other, LLRuntimeValue) # XXX and ... + + +class LLState(object): + + def __init__(self, origblock, args_a): + assert len(args_a) == len(origblock.inputargs) + self.args_a = args_a + self.origblock = origblock + self.copyblock = None + self.pendinglinks = [] + + def patchlink(self, copylink): + if self.copyblock is None: + print 'PENDING', self, id(copylink) + self.pendinglinks.append(copylink) + else: + # XXX nice interface required! 
+ print 'LINKING', self, id(copylink), self.copyblock + copylink.settarget(self.copyblock) + + def resolveblock(self, newblock): + self.copyblock = newblock + for copylink in self.pendinglinks: + self.patchlink(copylink) + del self.pendinglinks[:] + + def match(self, args_a): + # simple for now + for a1, a2 in zip(self.args_a, args_a): + if not a1.match(a2): + return False + else: + return True + +# ____________________________________________________________ + +class GotReturnValue(Exception): + def __init__(self, returnstate): + self.returnstate = returnstate + + +class LLAbstractInterp(object): + + def __init__(self): + pass + + def eval(self, origgraph, hints): + # for now, 'hints' means "I'm absolutely sure that the + # given variables will have the given ll value" + self.allpendingstates = [] + self.hints = hints + self.blocks = {} # {origblock: list-of-LLStates} + args_a = [LLRuntimeValue(orig_v=v) for v in origgraph.getargs()] + newstartlink = self.schedule(args_a, origgraph.startblock) + + return_a = LLRuntimeValue(orig_v=origgraph.getreturnvar()) + returnstate = LLState(origgraph.returnblock, [return_a]) + self.allpendingstates.append(returnstate) + self.blocks[origgraph.returnblock] = [returnstate] + self.complete(returnstate) + + copygraph = FunctionGraph(origgraph.name, newstartlink.target) + # XXX messy -- what about len(returnlink.args) == 0 ?? 
+ copygraph.getreturnvar().concretetype = ( + origgraph.getreturnvar().concretetype) + returnstate.resolveblock(copygraph.returnblock) + return copygraph + + def applyhint(self, args_a, origblock): + result_a = [] + for a, origv in zip(args_a, origblock.inputargs): + if origv in self.hints: + # use the hint, ignore the source binding + a = LLConcreteValue(self.hints[origv]) + result_a.append(a) + return result_a + + def schedule(self, args_a, origblock): + print "SCHEDULE", args_a, origblock + # args_a: [a_value for v in origblock.inputargs] + args_a = self.applyhint(args_a, origblock) + args_v = [a.getvarorconst() for a in args_a + if not isinstance(a, LLConcreteValue)] + newlink = Link(args_v, None) + # try to match this new state with an existing one + pendingstates = self.blocks.setdefault(origblock, []) + for state in pendingstates: + if state.match(args_a): + # already matched + break + else: + # schedule this new state + state = LLState(origblock, args_a) + pendingstates.append(state) + self.allpendingstates.append(state) + state.patchlink(newlink) + return newlink + + def complete(self, returnstate): + while self.allpendingstates: + state = self.allpendingstates.pop() + print 'CONSIDERING', state + try: + self.flowin(state) + except GotReturnValue, e: + assert e.returnstate is returnstate # XXX + + def flowin(self, state): + # flow in the block + assert state.copyblock is None + origblock = state.origblock + if origblock.operations == (): + if len(origblock.inputargs) == 1: + # return block + raise GotReturnValue(state) + elif len(origblock.inputargs) == 2: + # except block + XXX + else: + raise Exception("uh?") + self.residual_operations = [] + bindings = {} # {Variables-of-origblock: a_value} + def binding(v): + if isinstance(v, Constant): + return LLRuntimeValue(orig_v=v) + else: + return bindings[v] + for v, a in zip(origblock.inputargs, state.args_a): + if not isinstance(a, LLConcreteValue): + a = LLRuntimeValue(orig_v=v) + bindings[v] = a + for op in 
origblock.operations: + handler = getattr(self, 'op_' + op.opname) + a_result = handler(op, *[binding(v) for v in op.args]) + bindings[op.result] = a_result + if 1: # self.residual_operations: + inputargs = [] + for v in origblock.inputargs: + a = bindings[v] + if not isinstance(a, LLConcreteValue): + inputargs.append(a.getvarorconst()) + newblock = Block(inputargs) + newblock.operations = self.residual_operations + del self.residual_operations # just in case + assert origblock.exitswitch is None # XXX + origlink, = origblock.exits # XXX + args_a = [binding(v) for v in origlink.args] + newlink = self.schedule(args_a, origlink.target) + print "CLOSING" + newblock.closeblock(newlink) + state.resolveblock(newblock) + else: + XXX + + def constantfold(self, constant_op, args_a): + concretevalues = [] + any_concrete = False + for a in args_a: + v = a.getvarorconst() + if isinstance(v, Constant): + concretevalues.append(v.value) + else: + return None # cannot constant-fold + any_concrete = any_concrete or isinstance(a, LLConcreteValue) + # can constant-fold + concreteresult = constant_op(*concretevalues) + if any_concrete: + return LLConcreteValue(concreteresult) + else: + c = Constant(concreteresult) + c.concretetype = typeOf(concreteresult) + return LLRuntimeValue(c) + + def residual(self, opname, args_a, a_result): + op = SpaceOperation(opname, + [a.getvarorconst() for a in args_a], + a_result.getvarorconst()) + self.residual_operations.append(op) + + def residualize(self, op, args_a, constant_op=None): + if constant_op: + a_result = self.constantfold(constant_op, args_a) + if a_result is not None: + return a_result + a_result = LLRuntimeValue(op.result) + self.residual(op.opname, args_a, a_result) + return a_result + + # ____________________________________________________________ + + def op_int_add(self, op, a1, a2): + return self.residualize(op, [a1, a2], operator.add) Added: pypy/dist/pypy/jit/test/test_llabstractinterp.py 
============================================================================== --- (empty file) +++ pypy/dist/pypy/jit/test/test_llabstractinterp.py Wed Dec 7 16:09:57 2005 @@ -0,0 +1,29 @@ +from pypy.translator.translator import TranslationContext +from pypy.rpython.annlowlevel import annotate_lowlevel_helper +from pypy.rpython.lltypesystem import lltype +from pypy.jit.llabstractinterp import LLAbstractInterp + +def test_simple(): + def ll_function(x, y): + return x + y + + t = TranslationContext() + a = t.buildannotator() + argtypes = [a.typeannotation(int), a.typeannotation(int)] + graph1 = annotate_lowlevel_helper(a, ll_function, argtypes) + t.buildrtyper().specialize() + interp = LLAbstractInterp() + # tell 'y=42' + hints = {graph1.getargs()[1]: 42} + graph2 = interp.eval(graph1, hints) + # check that the result is "lambda x: x+42" + assert len(graph2.startblock.operations) == 1 + assert len(graph2.getargs()) == 1 + op = graph2.startblock.operations[0] + assert op.opname == 'int_add' + assert op.args[0] is graph2.getargs()[0] + assert op.args[0].concretetype == lltype.Signed + assert op.args[1].value == 42 + assert op.args[1].concretetype == lltype.Signed + assert len(graph2.startblock.exits) == 1 + assert graph2.startblock.exits[0].target is graph2.returnblock Modified: pypy/dist/pypy/objspace/flow/model.py ============================================================================== --- pypy/dist/pypy/objspace/flow/model.py (original) +++ pypy/dist/pypy/objspace/flow/model.py Wed Dec 7 16:09:57 2005 @@ -108,7 +108,8 @@ last_exception last_exc_value""".split() def __init__(self, args, target, exitcase=None): - assert len(args) == len(target.inputargs), "output args mismatch" + if target is not None: + assert len(args) == len(target.inputargs), "output args mismatch" self.args = list(args) # mixed list of var/const self.target = target # block self.exitcase = exitcase # this is a concrete value @@ -142,6 +143,11 @@ newlink.llexitcase = self.llexitcase return 
newlink + def settarget(self, targetblock): + assert len(self.args) == len(targetblock.inputargs), ( + "output args mismatch") + self.target = targetblock + def __repr__(self): return "link from %s to %s" % (str(self.prevblock), str(self.target)) From cfbolz at codespeak.net Wed Dec 7 16:10:35 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Wed, 7 Dec 2005 16:10:35 +0100 (CET) Subject: [pypy-svn] r20836 - in pypy/dist/pypy/jit: . test Message-ID: <20051207151035.D8E3727B66@code1.codespeak.net> Author: cfbolz Date: Wed Dec 7 16:10:34 2005 New Revision: 20836 Modified: pypy/dist/pypy/jit/ (props changed) pypy/dist/pypy/jit/__init__.py (props changed) pypy/dist/pypy/jit/bytecode.py (props changed) pypy/dist/pypy/jit/llabstractinterp.py (props changed) pypy/dist/pypy/jit/test/ (props changed) pypy/dist/pypy/jit/test/test_llabstractinterp.py (props changed) pypy/dist/pypy/jit/test/test_tl.py (props changed) pypy/dist/pypy/jit/tl.py (props changed) Log: fixeol From nik at codespeak.net Wed Dec 7 16:13:11 2005 From: nik at codespeak.net (nik at codespeak.net) Date: Wed, 7 Dec 2005 16:13:11 +0100 (CET) Subject: [pypy-svn] r20837 - in pypy/dist/pypy: module/_socket module/_socket/rpython module/_socket/rpython/test translator/c/test Message-ID: <20051207151311.7FF9027B5C@code1.codespeak.net> Author: nik Date: Wed Dec 7 16:13:08 2005 New Revision: 20837 Modified: pypy/dist/pypy/module/_socket/interp_socket.py pypy/dist/pypy/module/_socket/rpython/exttable.py pypy/dist/pypy/module/_socket/rpython/ll__socket.py pypy/dist/pypy/module/_socket/rpython/rsocket.py pypy/dist/pypy/module/_socket/rpython/test/test_ll__socket.py pypy/dist/pypy/translator/c/test/test_ext__socket.py Log: (ale, nik) first tentative steps towards creating socket objects. doesn't translate, yet. 
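The `newsocket` helper added in this commit creates a real socket and hands back only its integer file descriptor, so the translated code never has to model the full socket object. A stand-alone sketch of that shape follows; note it uses `detach()` rather than the diff's `fileno()`, an adjustment for modern Python where the descriptor would otherwise be closed when the wrapper object is garbage-collected:

```python
import os
import socket

# Sketch of the r20837 approach: the RPython-visible primitive returns
# only an integer file descriptor. The real helper lives in
# module/_socket/rpython/rsocket.py; this version just mirrors its shape.

def newsocket(family, type_, proto):
    s = socket.socket(family, type_, proto)
    return s.detach()   # caller owns the raw fd from here on

fd = newsocket(socket.AF_INET, socket.SOCK_STREAM, 0)
assert isinstance(fd, int) and fd >= 0
os.close(fd)            # the caller is responsible for closing it
```

Keeping the boundary value a plain `int` is what lets the annotator conclude `gettype(...) == int` in the accompanying test.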
Modified: pypy/dist/pypy/module/_socket/interp_socket.py ============================================================================== --- pypy/dist/pypy/module/_socket/interp_socket.py (original) +++ pypy/dist/pypy/module/_socket/interp_socket.py Wed Dec 7 16:13:08 2005 @@ -618,12 +618,16 @@ socket.setdefaulttimeout(timeout) try: - fd = socket.socket(family, type, proto) + fd = rsocket.newsocket(family, type, proto) except socket.error, e: raise wrap_socketerror(space, e) - sock = space.allocate_instance(Socket, w_subtype) - Socket.__init__(sock, space, fd, family, type, proto) - return space.wrap(sock) + # XXX If we want to support subclassing the socket type we will need + # something along these lines. But allocate_instance is only defined + # on the standard object space, so this is not really correct. + #sock = space.allocate_instance(Socket, w_subtype) + #Socket.__init__(sock, space, fd, family, type, proto) + #return space.wrap(sock) + return space.wrap(Socket(space, fd, family, type, proto)) descr_socket_new = interp2app(newsocket, unwrap_spec=[ObjSpace, W_Root, int, int, int]) Modified: pypy/dist/pypy/module/_socket/rpython/exttable.py ============================================================================== --- pypy/dist/pypy/module/_socket/rpython/exttable.py (original) +++ pypy/dist/pypy/module/_socket/rpython/exttable.py Wed Dec 7 16:13:08 2005 @@ -41,6 +41,8 @@ declare(_socket.ntohl, int, '%s/ntohl' % module) declare(_socket.htonl, int, '%s/htonl' % module) +declare(rsocket.newsocket, int, '%s/newsocket' % module) + # ____________________________________________________________ # _socket.error can be raised by the above Modified: pypy/dist/pypy/module/_socket/rpython/ll__socket.py ============================================================================== --- pypy/dist/pypy/module/_socket/rpython/ll__socket.py (original) +++ pypy/dist/pypy/module/_socket/rpython/ll__socket.py Wed Dec 7 16:13:08 2005 @@ -73,3 +73,7 @@ return 
_socket.ntohl(htonl) ll__socket_ntohl.suggested_primitive = True +def ll__socket_newsocket(family, type, protocol): + return _socket.socket(family, type, protocol).fileno() +ll__socket_newsocket.suggested_primitive = True + Modified: pypy/dist/pypy/module/_socket/rpython/rsocket.py ============================================================================== --- pypy/dist/pypy/module/_socket/rpython/rsocket.py (original) +++ pypy/dist/pypy/module/_socket/rpython/rsocket.py Wed Dec 7 16:13:08 2005 @@ -25,3 +25,6 @@ def getaddrinfo(host, port, family, socktype, proto, flags): return ADDRINFO(host, port, family, socktype, proto, flags) + +def newsocket(family, type, protocol): + return socket.socket(family, type, protocol).fileno() Modified: pypy/dist/pypy/module/_socket/rpython/test/test_ll__socket.py ============================================================================== --- pypy/dist/pypy/module/_socket/rpython/test/test_ll__socket.py (original) +++ pypy/dist/pypy/module/_socket/rpython/test/test_ll__socket.py Wed Dec 7 16:13:08 2005 @@ -29,3 +29,7 @@ info = ll__socket_nextaddrinfo(addr) info = info[:4] + (info[4:],) assert info == _socket.getaddrinfo(host, port)[0] + +def test_newsocket(): + fd = ll__socket_newsocket(_socket.AF_INET, _socket.SOCK_STREAM, 0) + assert isinstance(fd, int) Modified: pypy/dist/pypy/translator/c/test/test_ext__socket.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_ext__socket.py (original) +++ pypy/dist/pypy/translator/c/test/test_ext__socket.py Wed Dec 7 16:13:08 2005 @@ -2,6 +2,7 @@ import py import _socket from pypy.translator.c.test.test_genc import compile +from pypy.translator.translator import Translator def setup_module(mod): import pypy.module._socket.rpython.exttable # for declare()/declaretype() @@ -60,3 +61,19 @@ f1 = compile(does_stuff, [str, str]) res = f1("localhost", "25") assert eval(res) == _socket.getaddrinfo("localhost", "25") + +def 
test_newsocket_annotation(): + from pypy.module._socket.rpython import rsocket + def does_stuff(): + return rsocket.newsocket(_socket.AF_INET, _socket.SOCK_STREAM, 0) + t = Translator(does_stuff) + a = t.annotate([]) + assert a.gettype(t.graphs[0].getreturnvar()) == int + +def DONOT_test_newsocket(): + from pypy.module._socket.rpython import rsocket + def does_stuff(): + return rsocket.newsocket(_socket.AF_INET, _socket.SOCK_STREAM, 0) + f1 = compile(does_stuff, []) + res = f1() + assert isinstance(res, int) From adim at codespeak.net Wed Dec 7 16:13:15 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Wed, 7 Dec 2005 16:13:15 +0100 (CET) Subject: [pypy-svn] r20838 - pypy/dist/pypy/module/recparser/leftout Message-ID: <20051207151315.C3A9027DB7@code1.codespeak.net> Author: adim Date: Wed Dec 7 16:13:14 2005 New Revision: 20838 Modified: pypy/dist/pypy/module/recparser/leftout/builders.py pypy/dist/pypy/module/recparser/leftout/pgen.py Log: changed KleenStar to KleeneStar even on old modules Modified: pypy/dist/pypy/module/recparser/leftout/builders.py ============================================================================== --- pypy/dist/pypy/module/recparser/leftout/builders.py (original) +++ pypy/dist/pypy/module/recparser/leftout/builders.py Wed Dec 7 16:13:14 2005 @@ -1,7 +1,7 @@ """DEPRECATED""" raise DeprecationWarning("This module is broken and out of date. 
Don't use it !") -from grammar import BaseGrammarBuilder, Alternative, Token, Sequence, KleenStart +from grammar import BaseGrammarBuilder, Alternative, Token, Sequence, KleeneStar class BuilderToken(object): def __init__(self, name, value): @@ -149,7 +149,7 @@ _min = 0 elif star=='+': _min = 1 - sym = KleenStar( self.get_name(), _min, _max, rule=sym ) + sym = KleeneStar( self.get_name(), _min, _max, rule=sym ) sym.star = star debug_rule( sym ) self.items.append(sym) @@ -161,7 +161,7 @@ def build_option( self, values ): """option: '[' alternative ']'""" - sym = KleenStar( self.get_name(), 0, 1, rule=values[1] ) + sym = KleeneStar( self.get_name(), 0, 1, rule=values[1] ) debug_rule( sym ) self.items.append(sym) return sym Modified: pypy/dist/pypy/module/recparser/leftout/pgen.py ============================================================================== --- pypy/dist/pypy/module/recparser/leftout/pgen.py (original) +++ pypy/dist/pypy/module/recparser/leftout/pgen.py Wed Dec 7 16:13:14 2005 @@ -55,7 +55,7 @@ import re import grammar -from grammar import Token, Alternative, KleenStar, Sequence, TokenSource, BaseGrammarBuilder, Proxy, Pgen +from grammar import Token, Alternative, KleeneStar, Sequence, TokenSource, BaseGrammarBuilder, Proxy, Pgen g_symdef = re.compile(r"[a-zA-Z_][a-zA-Z0-9_]*:",re.M) g_symbol = re.compile(r"[a-zA-Z_][a-zA-Z0-9_]*",re.M) @@ -114,7 +114,7 @@ def debug_rule( rule ): nm = rule.__class__.__name__ print nm, rule.name, "->", - if nm=='KleenStar': + if nm=='KleeneStar': print "(%d,%d,%s)" % (rule.min, rule.max, rule.star), for x in rule.args: print x.name, @@ -223,7 +223,7 @@ _min = 0 elif star=='+': _min = 1 - sym = KleenStar( self.get_name(), _min, _max, rule=sym ) + sym = KleeneStar( self.get_name(), _min, _max, rule=sym ) sym.star = star debug_rule( sym ) self.items.append(sym) @@ -235,7 +235,7 @@ def build_option( self, values ): """option: '[' alternative ']'""" - sym = KleenStar( self.get_name(), 0, 1, rule=values[1] ) + sym = 
KleeneStar( self.get_name(), 0, 1, rule=values[1] ) debug_rule( sym ) self.items.append(sym) return sym @@ -343,7 +343,7 @@ def visit_option( self, node ): rule = node.nodes[1].visit(self) - return self.new_item( KleenStar( self.new_name(), 0, 1, rule ) ) + return self.new_item( KleeneStar( self.new_name(), 0, 1, rule ) ) def visit_group( self, node ): rule = node.nodes[1].visit(self) @@ -372,9 +372,9 @@ rule_name = self.new_name() tok = star_opt.nodes[0].nodes[0] if tok.value == '+': - return self.new_item( KleenStar( rule_name, _min=1, rule = myrule ) ) + return self.new_item( KleeneStar( rule_name, _min=1, rule = myrule ) ) elif tok.value == '*': - return self.new_item( KleenStar( rule_name, _min=0, rule = myrule ) ) + return self.new_item( KleeneStar( rule_name, _min=0, rule = myrule ) ) else: raise SyntaxError("Got symbol star_opt with value='%s'" % tok.value ) return myrule @@ -395,7 +395,7 @@ """ # star: '*' | '+' star = Alternative( "star", Token('*'), Token('+') ) - star_opt = KleenStar ( "star_opt", 0, 1, rule=star ) + star_opt = KleeneStar ( "star_opt", 0, 1, rule=star ) # rule: SYMBOL ':' alternative symbol = Sequence( "symbol", Token('SYMBOL'), star_opt ) @@ -404,12 +404,12 @@ rule = Sequence( "rule", symboldef, alternative ) # grammar: rule+ - grammar = KleenStar( "grammar", _min=1, rule=rule ) + grammar = KleeneStar( "grammar", _min=1, rule=rule ) # alternative: sequence ( '|' sequence )* - sequence = KleenStar( "sequence", 1 ) + sequence = KleeneStar( "sequence", 1 ) seq_cont_list = Sequence( "seq_cont_list", Token('|'), sequence ) - sequence_cont = KleenStar( "sequence_cont",0, rule=seq_cont_list ) + sequence_cont = KleeneStar( "sequence_cont",0, rule=seq_cont_list ) alternative.args = [ sequence, sequence_cont ] From adim at codespeak.net Wed Dec 7 16:16:52 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Wed, 7 Dec 2005 16:16:52 +0100 (CET) Subject: [pypy-svn] r20839 - pypy/dist/pypy/doc Message-ID: 
<20051207151652.8434627B66@code1.codespeak.net> Author: adim Date: Wed Dec 7 16:16:51 2005 New Revision: 20839 Modified: pypy/dist/pypy/doc/parser.txt Log: forgot to checkin the parser's doc yesterday after the KleenStar/KleeneStar change Modified: pypy/dist/pypy/doc/parser.txt ============================================================================== --- pypy/dist/pypy/doc/parser.txt (original) +++ pypy/dist/pypy/doc/parser.txt Wed Dec 7 16:16:51 2005 @@ -37,7 +37,7 @@ * token The four types are represented by a class in pyparser/grammar.py -( Sequence, Alternative, KleenStar, Token) all classes have a ``match()`` method +( Sequence, Alternative, KleeneStar, Token) all classes have a ``match()`` method accepting a source (the tokenizer) and a builder (an object responsible for building something out of the grammar). @@ -50,7 +50,7 @@ In python: V = Alternative( Token('x'), Token('y') ) A = Sequence( V, - KleenStar( + KleeneStar( Sequence( Alternative( Token('*'), Token('/') ), V ) From adim at codespeak.net Wed Dec 7 16:24:35 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Wed, 7 Dec 2005 16:24:35 +0100 (CET) Subject: [pypy-svn] r20840 - in pypy/dist/pypy: interpreter module/recparser Message-ID: <20051207152435.8D57927B66@code1.codespeak.net> Author: adim Date: Wed Dec 7 16:24:33 2005 New Revision: 20840 Modified: pypy/dist/pypy/interpreter/pycompiler.py pypy/dist/pypy/module/recparser/__init__.py Log: (ludal, adim) : added a simple hook to be able to modify AST at run-time Modified: pypy/dist/pypy/interpreter/pycompiler.py ============================================================================== --- pypy/dist/pypy/interpreter/pycompiler.py (original) +++ pypy/dist/pypy/interpreter/pycompiler.py Wed Dec 7 16:24:33 2005 @@ -13,6 +13,7 @@ def __init__(self, space): self.space = space + self.compile_hook = None def compile(self, source, filename, mode, flags): """Compile and return an pypy.interpreter.eval.Code instance.""" @@ -218,6 +219,15 
@@ raise OperationError(space.w_SyntaxError, e.wrap_info(space, filename)) + try: + if self.compile_hook is not None: + new_tree = space.call_function(self.compile_hook, + space.wrap(ast_tree), + space.wrap(encoding)) + except Exception, e: + # XXX find a better way to handle exceptions at this point + raise OperationError(space.w_Exception, + space.wrap(str(e))) try: astcompiler.misc.set_filename(filename, ast_tree) flag_names = get_flag_names(space, flags) @@ -239,3 +249,10 @@ assert isinstance(c,PyCode) return c + + +def install_compiler_hook(space, w_callable): + if space.is_w(w_callable, space.w_None): + space.default_compiler.compile_hook = None + else: + space.default_compiler.compile_hook = w_callable Modified: pypy/dist/pypy/module/recparser/__init__.py ============================================================================== --- pypy/dist/pypy/module/recparser/__init__.py (original) +++ pypy/dist/pypy/module/recparser/__init__.py Wed Dec 7 16:24:33 2005 @@ -7,6 +7,7 @@ import pyparser import pypy.interpreter.pyparser.pythonlexer import pypy.interpreter.pyparser.pythonparse +import pypy.interpreter.pycompiler class Module(MixedModule): """The builtin parser module. 
@@ -44,5 +45,6 @@ # PyPy extension 'decode_string_literal': 'pyparser.decode_string_literal', + 'install_compiler_hook' : 'pypy.interpreter.pycompiler.install_compiler_hook', } From cfbolz at codespeak.net Wed Dec 7 16:32:02 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Wed, 7 Dec 2005 16:32:02 +0100 (CET) Subject: [pypy-svn] r20841 - pypy/dist/pypy/jit/test Message-ID: <20051207153202.4EB3027B5C@code1.codespeak.net> Author: cfbolz Date: Wed Dec 7 16:32:01 2005 New Revision: 20841 Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py Log: (arigo, cfbolz): refactored test to use llinterp to interpret each graph Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/test/test_llabstractinterp.py (original) +++ pypy/dist/pypy/jit/test/test_llabstractinterp.py Wed Dec 7 16:32:01 2005 @@ -1,21 +1,55 @@ from pypy.translator.translator import TranslationContext from pypy.rpython.annlowlevel import annotate_lowlevel_helper from pypy.rpython.lltypesystem import lltype +from pypy.rpython.llinterp import LLInterpreter +from pypy.rpython import rstr +from pypy.annotation import model as annmodel from pypy.jit.llabstractinterp import LLAbstractInterp -def test_simple(): - def ll_function(x, y): - return x + y +def annotation(a, x): + T = lltype.typeOf(x) + if T == lltype.Ptr(rstr.STR): + t = str + else: + t = annmodel.lltype_to_annotation(T) + return a.typeannotation(t) + +def abstrinterp(ll_function, argvalues, arghints): t = TranslationContext() a = t.buildannotator() - argtypes = [a.typeannotation(int), a.typeannotation(int)] + argtypes = [annotation(a, value) for value in argvalues] graph1 = annotate_lowlevel_helper(a, ll_function, argtypes) - t.buildrtyper().specialize() + rtyper = t.buildrtyper() + rtyper.specialize() interp = LLAbstractInterp() - # tell 'y=42' - hints = {graph1.getargs()[1]: 42} + hints = {} + argvalues2 = argvalues[:] + lst 
= list(arghints) + lst.sort() + lst.reverse() + for hint in lst: + hints[graph1.getargs()[hint]] = argvalues2[hint] + del argvalues2[hint] graph2 = interp.eval(graph1, hints) + # check the result by running it + llinterp = LLInterpreter(rtyper) + result1 = llinterp.eval_graph(graph1, argvalues) + result2 = llinterp.eval_graph(graph2, argvalues2) + assert result1 == result2 + # return a summary of the instructions left in graph2 + insns = {} + for block in graph2.iterblocks(): + for op in block.operations: + insns[op.opname] = insns.get(op.opname, 0) + 1 + return graph2, insns + + +def test_simple(): + def ll_function(x, y): + return x + y + + graph2, insns = abstrinterp(ll_function, [6, 42], [1]) # check that the result is "lambda x: x+42" assert len(graph2.startblock.operations) == 1 assert len(graph2.getargs()) == 1 From ludal at codespeak.net Wed Dec 7 16:33:52 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Wed, 7 Dec 2005 16:33:52 +0100 (CET) Subject: [pypy-svn] r20842 - pypy/dist/pypy/interpreter Message-ID: <20051207153352.D242427B5C@code1.codespeak.net> Author: ludal Date: Wed Dec 7 16:33:49 2005 New Revision: 20842 Modified: pypy/dist/pypy/interpreter/pycompiler.py Log: don't touch the exception we just let the OperationError propagate from the callback to applevel again Modified: pypy/dist/pypy/interpreter/pycompiler.py ============================================================================== --- pypy/dist/pypy/interpreter/pycompiler.py (original) +++ pypy/dist/pypy/interpreter/pycompiler.py Wed Dec 7 16:33:49 2005 @@ -219,15 +219,10 @@ raise OperationError(space.w_SyntaxError, e.wrap_info(space, filename)) - try: - if self.compile_hook is not None: - new_tree = space.call_function(self.compile_hook, - space.wrap(ast_tree), - space.wrap(encoding)) - except Exception, e: - # XXX find a better way to handle exceptions at this point - raise OperationError(space.w_Exception, - space.wrap(str(e))) + if self.compile_hook is not None: + 
new_tree = space.call_function(self.compile_hook, + space.wrap(ast_tree), + space.wrap(encoding)) try: astcompiler.misc.set_filename(filename, ast_tree) flag_names = get_flag_names(space, flags) @@ -250,9 +245,10 @@ return c - def install_compiler_hook(space, w_callable): if space.is_w(w_callable, space.w_None): space.default_compiler.compile_hook = None else: +# if not space.get( w_callable ): +# raise OperationError( space.w_TypeError( space.wrap( "must have a callable" ) ) space.default_compiler.compile_hook = w_callable From cfbolz at codespeak.net Wed Dec 7 16:39:10 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Wed, 7 Dec 2005 16:39:10 +0100 (CET) Subject: [pypy-svn] r20843 - in pypy/dist/pypy/jit: . test Message-ID: <20051207153910.7652527B66@code1.codespeak.net> Author: cfbolz Date: Wed Dec 7 16:39:09 2005 New Revision: 20843 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_llabstractinterp.py Log: (cfbolz, arigo, pedronis around) normalizing links to the return block. 
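[Editor's aside on the r20842 pycompiler change above: it drops the blanket exception wrapping around the compile hook, so an error raised inside the hook now propagates to application level unchanged. A minimal plain-Python sketch of the dispatch — `FakeCompiler` and the plain-function `install_compiler_hook` here are hypothetical stand-ins for the interpreter-level objects and object space, not PyPy's real API:]

```python
# Hypothetical stand-in for the PyCompiler object: it stores an optional
# hook that is called on the parsed tree (plus its encoding) before
# code generation proceeds.
class FakeCompiler:
    def __init__(self):
        self.compile_hook = None

    def compile(self, ast_tree, encoding='utf-8'):
        new_tree = ast_tree
        if self.compile_hook is not None:
            # after r20842 there is no try/except around this call:
            # whatever the hook raises propagates to the caller as-is
            new_tree = self.compile_hook(ast_tree, encoding)
        return new_tree

def install_compiler_hook(compiler, callable_or_none):
    # passing None uninstalls the hook, mirroring the space.is_w(...,
    # space.w_None) check in the real install_compiler_hook
    compiler.compile_hook = callable_or_none

compiler = FakeCompiler()
seen = []
install_compiler_hook(compiler, lambda tree, enc: (seen.append(enc), tree)[1])
result = compiler.compile('<tree>')
```

A hook that returns its argument unchanged is always safe; a hook that raises now surfaces that exception directly at application level.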
Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Wed Dec 7 16:39:09 2005 @@ -121,11 +121,19 @@ def applyhint(self, args_a, origblock): result_a = [] - for a, origv in zip(args_a, origblock.inputargs): - if origv in self.hints: - # use the hint, ignore the source binding - a = LLConcreteValue(self.hints[origv]) - result_a.append(a) + if origblock.operations == (): + # make sure args_s does *not* contain LLConcreteValues + for a in args_a: + if isinstance(a, LLConcreteValue): + a = LLRuntimeValue(orig_v=a.getvarorconst()) + result_a.append(a) + else: + # apply the hints to make more LLConcreteValues + for a, origv in zip(args_a, origblock.inputargs): + if origv in self.hints: + # use the hint, ignore the source binding + a = LLConcreteValue(self.hints[origv]) + result_a.append(a) return result_a def schedule(self, args_a, origblock): @@ -156,7 +164,7 @@ try: self.flowin(state) except GotReturnValue, e: - assert e.returnstate is returnstate # XXX + assert e.returnstate is returnstate def flowin(self, state): # flow in the block Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/test/test_llabstractinterp.py (original) +++ pypy/dist/pypy/jit/test/test_llabstractinterp.py Wed Dec 7 16:39:09 2005 @@ -61,3 +61,10 @@ assert op.args[1].concretetype == lltype.Signed assert len(graph2.startblock.exits) == 1 assert graph2.startblock.exits[0].target is graph2.returnblock + +def test_simple2(): + def ll_function(x, y): + return x + y + + graph2, insns = abstrinterp(ll_function, [6, 42], [0, 1]) + assert not insns From ludal at codespeak.net Wed Dec 7 16:50:23 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Wed, 7 Dec 2005 16:50:23 +0100 (CET) Subject: [pypy-svn] r20844 - 
pypy/branch/pypy-compiler Message-ID: <20051207155023.3C6F227B5C@code1.codespeak.net> Author: ludal Date: Wed Dec 7 16:50:22 2005 New Revision: 20844 Added: pypy/branch/pypy-compiler/ - copied from r20843, pypy/dist/ Log: branch to export compiler/ast internals From cfbolz at codespeak.net Wed Dec 7 17:04:32 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Wed, 7 Dec 2005 17:04:32 +0100 (CET) Subject: [pypy-svn] r20845 - in pypy/dist/pypy/jit: . test Message-ID: <20051207160432.31F0827B5C@code1.codespeak.net> Author: cfbolz Date: Wed Dec 7 17:04:30 2005 New Revision: 20845 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_llabstractinterp.py Log: (cfbolz, arigo, pedronis around) Branching. Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Wed Dec 7 17:04:30 2005 @@ -1,6 +1,7 @@ import operator from pypy.objspace.flow.model import Variable, Constant, SpaceOperation from pypy.objspace.flow.model import Block, Link, FunctionGraph +from pypy.objspace.flow.model import checkgraph, last_exception from pypy.rpython.lltypesystem import lltype @@ -117,6 +118,7 @@ copygraph.getreturnvar().concretetype = ( origgraph.getreturnvar().concretetype) returnstate.resolveblock(copygraph.returnblock) + checkgraph(copygraph) # sanity-check return copygraph def applyhint(self, args_a, origblock): @@ -194,24 +196,34 @@ handler = getattr(self, 'op_' + op.opname) a_result = handler(op, *[binding(v) for v in op.args]) bindings[op.result] = a_result - if 1: # self.residual_operations: - inputargs = [] - for v in origblock.inputargs: - a = bindings[v] - if not isinstance(a, LLConcreteValue): - inputargs.append(a.getvarorconst()) - newblock = Block(inputargs) - newblock.operations = self.residual_operations - del self.residual_operations # just in case - assert 
origblock.exitswitch is None # XXX - origlink, = origblock.exits # XXX + inputargs = [] + for v in origblock.inputargs: + a = bindings[v] + if not isinstance(a, LLConcreteValue): + inputargs.append(a.getvarorconst()) + newblock = Block(inputargs) + newblock.operations = self.residual_operations + del self.residual_operations # just in case + if origblock.exitswitch is None: + links = origblock.exits + elif origblock.exitswitch == Constant(last_exception): + XXX + else: + v = bindings[origblock.exitswitch].getvarorconst() + if isinstance(v, Variable): + newblock.exitswitch = v + links = origblock.exits + else: + links = [link for link in origblock.exits + if link.llexitcase == v.value] + newlinks = [] + for origlink in links: args_a = [binding(v) for v in origlink.args] newlink = self.schedule(args_a, origlink.target) - print "CLOSING" - newblock.closeblock(newlink) - state.resolveblock(newblock) - else: - XXX + newlinks.append(newlink) + print "CLOSING" + newblock.closeblock(*newlinks) + state.resolveblock(newblock) def constantfold(self, constant_op, args_a): concretevalues = [] @@ -249,5 +261,8 @@ # ____________________________________________________________ + def op_int_is_true(self, op, a): + return self.residualize(op, [a], operator.truth) + def op_int_add(self, op, a1, a2): return self.residualize(op, [a1, a2], operator.add) Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/test/test_llabstractinterp.py (original) +++ pypy/dist/pypy/jit/test/test_llabstractinterp.py Wed Dec 7 17:04:30 2005 @@ -65,6 +65,32 @@ def test_simple2(): def ll_function(x, y): return x + y + graph2, insns = abstrinterp(ll_function, [6, 42], [0, 1]) + assert not insns + +def test_constantbranch(): + def ll_function(x, y): + if x: + y += 1 + y += 2 + return y + graph2, insns = abstrinterp(ll_function, [6, 42], [0]) + assert insns == {'int_add': 2} +def 
test_constantbranch_two_constants(): + def ll_function(x, y): + if x: + y += 1 + y += 2 + return y graph2, insns = abstrinterp(ll_function, [6, 42], [0, 1]) assert not insns + +def test_branch(): + def ll_function(x, y): + if x: + y += 1 + y += 2 + return y + graph2, insns = abstrinterp(ll_function, [6, 42], []) + assert insns == {'int_is_true': 1, 'int_add': 2} From ludal at codespeak.net Wed Dec 7 17:16:37 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Wed, 7 Dec 2005 17:16:37 +0100 (CET) Subject: [pypy-svn] r20846 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20051207161637.64C1727B5C@code1.codespeak.net> Author: ludal Date: Wed Dec 7 17:16:36 2005 New Revision: 20846 Modified: pypy/dist/pypy/interpreter/astcompiler/astgen.py Log: remove repr from export of nodes Modified: pypy/dist/pypy/interpreter/astcompiler/astgen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/astgen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/astgen.py Wed Dec 7 17:16:36 2005 @@ -205,7 +205,7 @@ fmt = COMMA.join(["%s"] * self.nargs) if '(' in self.args: fmt = '(%s)' % fmt - vals = ["repr(self.%s)" % name for name in self.argnames] + vals = ["self.%s.__repr__()" % name for name in self.argnames] vals = COMMA.join(vals) if self.nargs == 1: vals = vals + "," @@ -418,7 +418,7 @@ return w_obj.descr_accept( space, w_visitor ) Node.typedef = TypeDef('ASTNode', - __repr__ = interp2app(descr_node_repr), + #__repr__ = interp2app(descr_node_repr), getChildNodes = interp2app(descr_getChildNodes), accept = interp2app(descr_accept), ) From adim at codespeak.net Wed Dec 7 17:18:31 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Wed, 7 Dec 2005 17:18:31 +0100 (CET) Subject: [pypy-svn] r20847 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20051207161831.AF8E227DB4@code1.codespeak.net> Author: adim Date: Wed Dec 7 17:18:30 2005 New Revision: 20847 Modified: 
pypy/dist/pypy/interpreter/astcompiler/ast.py Log: regenreated ast.py Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.py Wed Dec 7 17:18:30 2005 @@ -70,7 +70,7 @@ return w_obj.descr_accept( space, w_visitor ) Node.typedef = TypeDef('ASTNode', - __repr__ = interp2app(descr_node_repr), + #__repr__ = interp2app(descr_node_repr), getChildNodes = interp2app(descr_getChildNodes), accept = interp2app(descr_accept), ) @@ -181,7 +181,7 @@ return [self.left, self.right] def __repr__(self): - return "Add((%s, %s))" % (repr(self.left), repr(self.right)) + return "Add((%s, %s))" % (self.left.__repr__(), self.right.__repr__()) def accept(self, visitor): return visitor.visitAdd(self) @@ -206,7 +206,7 @@ return nodelist def __repr__(self): - return "And(%s)" % (repr(self.nodes),) + return "And(%s)" % (self.nodes.__repr__(),) def accept(self, visitor): return visitor.visitAnd(self) @@ -231,7 +231,7 @@ return [self.expr,] def __repr__(self): - return "AssAttr(%s, %s, %s)" % (repr(self.expr), repr(self.attrname), repr(self.flags)) + return "AssAttr(%s, %s, %s)" % (self.expr.__repr__(), self.attrname.__repr__(), self.flags.__repr__()) def accept(self, visitor): return visitor.visitAssAttr(self) @@ -278,7 +278,7 @@ return nodelist def __repr__(self): - return "AssList(%s)" % (repr(self.nodes),) + return "AssList(%s)" % (self.nodes.__repr__(),) def accept(self, visitor): return visitor.visitAssList(self) @@ -302,7 +302,7 @@ return [] def __repr__(self): - return "AssName(%s, %s)" % (repr(self.name), repr(self.flags)) + return "AssName(%s, %s)" % (self.name.__repr__(), self.flags.__repr__()) def accept(self, visitor): return visitor.visitAssName(self) @@ -340,7 +340,7 @@ return argnames def __repr__(self): - return "AssTuple(%s)" % (repr(self.nodes),) + return "AssTuple(%s)" % 
(self.nodes.__repr__(),) def accept(self, visitor): return visitor.visitAssTuple(self) @@ -371,7 +371,7 @@ return nodelist def __repr__(self): - return "Assert(%s, %s)" % (repr(self.test), repr(self.fail)) + return "Assert(%s, %s)" % (self.test.__repr__(), self.fail.__repr__()) def accept(self, visitor): return visitor.visitAssert(self) @@ -401,7 +401,7 @@ return nodelist def __repr__(self): - return "Assign(%s, %s)" % (repr(self.nodes), repr(self.expr)) + return "Assign(%s, %s)" % (self.nodes.__repr__(), self.expr.__repr__()) def accept(self, visitor): return visitor.visitAssign(self) @@ -426,7 +426,7 @@ return [self.node, self.expr] def __repr__(self): - return "AugAssign(%s, %s, %s)" % (repr(self.node), repr(self.op), repr(self.expr)) + return "AugAssign(%s, %s, %s)" % (self.node.__repr__(), self.op.__repr__(), self.expr.__repr__()) def accept(self, visitor): return visitor.visitAugAssign(self) @@ -471,7 +471,7 @@ return [self.expr,] def __repr__(self): - return "Backquote(%s)" % (repr(self.expr),) + return "Backquote(%s)" % (self.expr.__repr__(),) def accept(self, visitor): return visitor.visitBackquote(self) @@ -518,7 +518,7 @@ return nodelist def __repr__(self): - return "Bitand(%s)" % (repr(self.nodes),) + return "Bitand(%s)" % (self.nodes.__repr__(),) def accept(self, visitor): return visitor.visitBitand(self) @@ -543,7 +543,7 @@ return nodelist def __repr__(self): - return "Bitor(%s)" % (repr(self.nodes),) + return "Bitor(%s)" % (self.nodes.__repr__(),) def accept(self, visitor): return visitor.visitBitor(self) @@ -568,7 +568,7 @@ return nodelist def __repr__(self): - return "Bitxor(%s)" % (repr(self.nodes),) + return "Bitxor(%s)" % (self.nodes.__repr__(),) def accept(self, visitor): return visitor.visitBitxor(self) @@ -628,7 +628,7 @@ return nodelist def __repr__(self): - return "CallFunc(%s, %s, %s, %s)" % (repr(self.node), repr(self.args), repr(self.star_args), repr(self.dstar_args)) + return "CallFunc(%s, %s, %s, %s)" % (self.node.__repr__(), 
self.args.__repr__(), self.star_args.__repr__(), self.dstar_args.__repr__()) def accept(self, visitor): return visitor.visitCallFunc(self) @@ -662,7 +662,7 @@ return nodelist def __repr__(self): - return "Class(%s, %s, %s, %s)" % (repr(self.name), repr(self.bases), repr(self.doc), repr(self.code)) + return "Class(%s, %s, %s, %s)" % (self.name.__repr__(), self.bases.__repr__(), self.doc.__repr__(), self.code.__repr__()) def accept(self, visitor): return visitor.visitClass(self) @@ -695,7 +695,7 @@ return nodelist def __repr__(self): - return "Compare(%s, %s)" % (repr(self.expr), repr(self.ops)) + return "Compare(%s, %s)" % (self.expr.__repr__(), self.ops.__repr__()) def accept(self, visitor): return visitor.visitCompare(self) @@ -718,7 +718,7 @@ return [] def __repr__(self): - return "Const(%s)" % (repr(self.value),) + return "Const(%s)" % (self.value.__repr__(),) def accept(self, visitor): return visitor.visitConst(self) @@ -765,7 +765,7 @@ return nodelist def __repr__(self): - return "Decorators(%s)" % (repr(self.nodes),) + return "Decorators(%s)" % (self.nodes.__repr__(),) def accept(self, visitor): return visitor.visitDecorators(self) @@ -794,7 +794,7 @@ return nodelist def __repr__(self): - return "Dict(%s)" % (repr(self.items),) + return "Dict(%s)" % (self.items.__repr__(),) def accept(self, visitor): return visitor.visitDict(self) @@ -817,7 +817,7 @@ return [self.expr,] def __repr__(self): - return "Discard(%s)" % (repr(self.expr),) + return "Discard(%s)" % (self.expr.__repr__(),) def accept(self, visitor): return visitor.visitDiscard(self) @@ -841,7 +841,7 @@ return [self.left, self.right] def __repr__(self): - return "Div((%s, %s))" % (repr(self.left), repr(self.right)) + return "Div((%s, %s))" % (self.left.__repr__(), self.right.__repr__()) def accept(self, visitor): return visitor.visitDiv(self) @@ -898,7 +898,7 @@ return nodelist def __repr__(self): - return "Exec(%s, %s, %s)" % (repr(self.expr), repr(self.locals), repr(self.globals)) + return "Exec(%s, 
%s, %s)" % (self.expr.__repr__(), self.locals.__repr__(), self.globals.__repr__()) def accept(self, visitor): return visitor.visitExec(self) @@ -922,7 +922,7 @@ return [self.left, self.right] def __repr__(self): - return "FloorDiv((%s, %s))" % (repr(self.left), repr(self.right)) + return "FloorDiv((%s, %s))" % (self.left.__repr__(), self.right.__repr__()) def accept(self, visitor): return visitor.visitFloorDiv(self) @@ -959,7 +959,7 @@ return nodelist def __repr__(self): - return "For(%s, %s, %s, %s)" % (repr(self.assign), repr(self.list), repr(self.body), repr(self.else_)) + return "For(%s, %s, %s, %s)" % (self.assign.__repr__(), self.list.__repr__(), self.body.__repr__(), self.else_.__repr__()) def accept(self, visitor): return visitor.visitFor(self) @@ -983,7 +983,7 @@ return [] def __repr__(self): - return "From(%s, %s)" % (repr(self.modname), repr(self.names)) + return "From(%s, %s)" % (self.modname.__repr__(), self.names.__repr__()) def accept(self, visitor): return visitor.visitFrom(self) @@ -1032,7 +1032,7 @@ return nodelist def __repr__(self): - return "Function(%s, %s, %s, %s, %s, %s, %s)" % (repr(self.decorators), repr(self.name), repr(self.argnames), repr(self.defaults), repr(self.flags), repr(self.doc), repr(self.code)) + return "Function(%s, %s, %s, %s, %s, %s, %s)" % (self.decorators.__repr__(), self.name.__repr__(), self.argnames.__repr__(), self.defaults.__repr__(), self.flags.__repr__(), self.doc.__repr__(), self.code.__repr__()) def accept(self, visitor): return visitor.visitFunction(self) @@ -1059,7 +1059,7 @@ return [self.code,] def __repr__(self): - return "GenExpr(%s)" % (repr(self.code),) + return "GenExpr(%s)" % (self.code.__repr__(),) def accept(self, visitor): return visitor.visitGenExpr(self) @@ -1095,7 +1095,7 @@ return nodelist def __repr__(self): - return "GenExprFor(%s, %s, %s)" % (repr(self.assign), repr(self.iter), repr(self.ifs)) + return "GenExprFor(%s, %s, %s)" % (self.assign.__repr__(), self.iter.__repr__(), 
self.ifs.__repr__()) def accept(self, visitor): return visitor.visitGenExprFor(self) @@ -1118,7 +1118,7 @@ return [self.test,] def __repr__(self): - return "GenExprIf(%s)" % (repr(self.test),) + return "GenExprIf(%s)" % (self.test.__repr__(),) def accept(self, visitor): return visitor.visitGenExprIf(self) @@ -1148,7 +1148,7 @@ return nodelist def __repr__(self): - return "GenExprInner(%s, %s)" % (repr(self.expr), repr(self.quals)) + return "GenExprInner(%s, %s)" % (self.expr.__repr__(), self.quals.__repr__()) def accept(self, visitor): return visitor.visitGenExprInner(self) @@ -1172,7 +1172,7 @@ return [self.expr,] def __repr__(self): - return "Getattr(%s, %s)" % (repr(self.expr), repr(self.attrname)) + return "Getattr(%s, %s)" % (self.expr.__repr__(), self.attrname.__repr__()) def accept(self, visitor): return visitor.visitGetattr(self) @@ -1195,7 +1195,7 @@ return [] def __repr__(self): - return "Global(%s)" % (repr(self.names),) + return "Global(%s)" % (self.names.__repr__(),) def accept(self, visitor): return visitor.visitGlobal(self) @@ -1230,7 +1230,7 @@ return nodelist def __repr__(self): - return "If(%s, %s)" % (repr(self.tests), repr(self.else_)) + return "If(%s, %s)" % (self.tests.__repr__(), self.else_.__repr__()) def accept(self, visitor): return visitor.visitIf(self) @@ -1253,7 +1253,7 @@ return [] def __repr__(self): - return "Import(%s)" % (repr(self.names),) + return "Import(%s)" % (self.names.__repr__(),) def accept(self, visitor): return visitor.visitImport(self) @@ -1276,7 +1276,7 @@ return [self.expr,] def __repr__(self): - return "Invert(%s)" % (repr(self.expr),) + return "Invert(%s)" % (self.expr.__repr__(),) def accept(self, visitor): return visitor.visitInvert(self) @@ -1300,7 +1300,7 @@ return [self.expr,] def __repr__(self): - return "Keyword(%s, %s)" % (repr(self.name), repr(self.expr)) + return "Keyword(%s, %s)" % (self.name.__repr__(), self.expr.__repr__()) def accept(self, visitor): return visitor.visitKeyword(self) @@ -1341,7 +1341,7 
@@ return nodelist def __repr__(self): - return "Lambda(%s, %s, %s, %s)" % (repr(self.argnames), repr(self.defaults), repr(self.flags), repr(self.code)) + return "Lambda(%s, %s, %s, %s)" % (self.argnames.__repr__(), self.defaults.__repr__(), self.flags.__repr__(), self.code.__repr__()) def accept(self, visitor): return visitor.visitLambda(self) @@ -1365,7 +1365,7 @@ return [self.left, self.right] def __repr__(self): - return "LeftShift((%s, %s))" % (repr(self.left), repr(self.right)) + return "LeftShift((%s, %s))" % (self.left.__repr__(), self.right.__repr__()) def accept(self, visitor): return visitor.visitLeftShift(self) @@ -1390,7 +1390,7 @@ return nodelist def __repr__(self): - return "List(%s)" % (repr(self.nodes),) + return "List(%s)" % (self.nodes.__repr__(),) def accept(self, visitor): return visitor.visitList(self) @@ -1420,7 +1420,7 @@ return nodelist def __repr__(self): - return "ListComp(%s, %s)" % (repr(self.expr), repr(self.quals)) + return "ListComp(%s, %s)" % (self.expr.__repr__(), self.quals.__repr__()) def accept(self, visitor): return visitor.visitListComp(self) @@ -1453,7 +1453,7 @@ return nodelist def __repr__(self): - return "ListCompFor(%s, %s, %s)" % (repr(self.assign), repr(self.list), repr(self.ifs)) + return "ListCompFor(%s, %s, %s)" % (self.assign.__repr__(), self.list.__repr__(), self.ifs.__repr__()) def accept(self, visitor): return visitor.visitListCompFor(self) @@ -1476,7 +1476,7 @@ return [self.test,] def __repr__(self): - return "ListCompIf(%s)" % (repr(self.test),) + return "ListCompIf(%s)" % (self.test.__repr__(),) def accept(self, visitor): return visitor.visitListCompIf(self) @@ -1500,7 +1500,7 @@ return [self.left, self.right] def __repr__(self): - return "Mod((%s, %s))" % (repr(self.left), repr(self.right)) + return "Mod((%s, %s))" % (self.left.__repr__(), self.right.__repr__()) def accept(self, visitor): return visitor.visitMod(self) @@ -1524,7 +1524,7 @@ return [self.node,] def __repr__(self): - return "Module(%s, %s)" % 
(repr(self.doc), repr(self.node)) + return "Module(%s, %s)" % (self.doc.__repr__(), self.node.__repr__()) def accept(self, visitor): return visitor.visitModule(self) @@ -1548,7 +1548,7 @@ return [self.left, self.right] def __repr__(self): - return "Mul((%s, %s))" % (repr(self.left), repr(self.right)) + return "Mul((%s, %s))" % (self.left.__repr__(), self.right.__repr__()) def accept(self, visitor): return visitor.visitMul(self) @@ -1571,7 +1571,7 @@ return [] def __repr__(self): - return "Name(%s)" % (repr(self.varname),) + return "Name(%s)" % (self.varname.__repr__(),) def accept(self, visitor): return visitor.visitName(self) @@ -1616,7 +1616,7 @@ return [self.expr,] def __repr__(self): - return "Not(%s)" % (repr(self.expr),) + return "Not(%s)" % (self.expr.__repr__(),) def accept(self, visitor): return visitor.visitNot(self) @@ -1639,7 +1639,7 @@ return [] def __repr__(self): - return "NumberConst(%s)" % (repr(self.number_value),) + return "NumberConst(%s)" % (self.number_value.__repr__(),) def accept(self, visitor): return visitor.visitNumberConst(self) @@ -1664,7 +1664,7 @@ return nodelist def __repr__(self): - return "Or(%s)" % (repr(self.nodes),) + return "Or(%s)" % (self.nodes.__repr__(),) def accept(self, visitor): return visitor.visitOr(self) @@ -1710,7 +1710,7 @@ return [self.left, self.right] def __repr__(self): - return "Power((%s, %s))" % (repr(self.left), repr(self.right)) + return "Power((%s, %s))" % (self.left.__repr__(), self.right.__repr__()) def accept(self, visitor): return visitor.visitPower(self) @@ -1741,7 +1741,7 @@ return nodelist def __repr__(self): - return "Print(%s, %s)" % (repr(self.nodes), repr(self.dest)) + return "Print(%s, %s)" % (self.nodes.__repr__(), self.dest.__repr__()) def accept(self, visitor): return visitor.visitPrint(self) @@ -1772,7 +1772,7 @@ return nodelist def __repr__(self): - return "Printnl(%s, %s)" % (repr(self.nodes), repr(self.dest)) + return "Printnl(%s, %s)" % (self.nodes.__repr__(), self.dest.__repr__()) def 
accept(self, visitor): return visitor.visitPrintnl(self) @@ -1808,7 +1808,7 @@ return nodelist def __repr__(self): - return "Raise(%s, %s, %s)" % (repr(self.expr1), repr(self.expr2), repr(self.expr3)) + return "Raise(%s, %s, %s)" % (self.expr1.__repr__(), self.expr2.__repr__(), self.expr3.__repr__()) def accept(self, visitor): return visitor.visitRaise(self) @@ -1834,7 +1834,7 @@ return nodelist def __repr__(self): - return "Return(%s)" % (repr(self.value),) + return "Return(%s)" % (self.value.__repr__(),) def accept(self, visitor): return visitor.visitReturn(self) @@ -1858,7 +1858,7 @@ return [self.left, self.right] def __repr__(self): - return "RightShift((%s, %s))" % (repr(self.left), repr(self.right)) + return "RightShift((%s, %s))" % (self.left.__repr__(), self.right.__repr__()) def accept(self, visitor): return visitor.visitRightShift(self) @@ -1895,7 +1895,7 @@ return nodelist def __repr__(self): - return "Slice(%s, %s, %s, %s)" % (repr(self.expr), repr(self.flags), repr(self.lower), repr(self.upper)) + return "Slice(%s, %s, %s, %s)" % (self.expr.__repr__(), self.flags.__repr__(), self.lower.__repr__(), self.upper.__repr__()) def accept(self, visitor): return visitor.visitSlice(self) @@ -1920,7 +1920,7 @@ return nodelist def __repr__(self): - return "Sliceobj(%s)" % (repr(self.nodes),) + return "Sliceobj(%s)" % (self.nodes.__repr__(),) def accept(self, visitor): return visitor.visitSliceobj(self) @@ -1945,7 +1945,7 @@ return nodelist def __repr__(self): - return "Stmt(%s)" % (repr(self.nodes),) + return "Stmt(%s)" % (self.nodes.__repr__(),) def accept(self, visitor): return visitor.visitStmt(self) @@ -1968,7 +1968,7 @@ return [] def __repr__(self): - return "StringConst(%s)" % (repr(self.string_value),) + return "StringConst(%s)" % (self.string_value.__repr__(),) def accept(self, visitor): return visitor.visitStringConst(self) @@ -1992,7 +1992,7 @@ return [self.left, self.right] def __repr__(self): - return "Sub((%s, %s))" % (repr(self.left), 
repr(self.right)) + return "Sub((%s, %s))" % (self.left.__repr__(), self.right.__repr__()) def accept(self, visitor): return visitor.visitSub(self) @@ -2024,7 +2024,7 @@ return nodelist def __repr__(self): - return "Subscript(%s, %s, %s)" % (repr(self.expr), repr(self.flags), repr(self.subs)) + return "Subscript(%s, %s, %s)" % (self.expr.__repr__(), self.flags.__repr__(), self.subs.__repr__()) def accept(self, visitor): return visitor.visitSubscript(self) @@ -2066,7 +2066,7 @@ return nodelist def __repr__(self): - return "TryExcept(%s, %s, %s)" % (repr(self.body), repr(self.handlers), repr(self.else_)) + return "TryExcept(%s, %s, %s)" % (self.body.__repr__(), self.handlers.__repr__(), self.else_.__repr__()) def accept(self, visitor): return visitor.visitTryExcept(self) @@ -2090,7 +2090,7 @@ return [self.body, self.final] def __repr__(self): - return "TryFinally(%s, %s)" % (repr(self.body), repr(self.final)) + return "TryFinally(%s, %s)" % (self.body.__repr__(), self.final.__repr__()) def accept(self, visitor): return visitor.visitTryFinally(self) @@ -2115,7 +2115,7 @@ return nodelist def __repr__(self): - return "Tuple(%s)" % (repr(self.nodes),) + return "Tuple(%s)" % (self.nodes.__repr__(),) def accept(self, visitor): return visitor.visitTuple(self) @@ -2138,7 +2138,7 @@ return [self.expr,] def __repr__(self): - return "UnaryAdd(%s)" % (repr(self.expr),) + return "UnaryAdd(%s)" % (self.expr.__repr__(),) def accept(self, visitor): return visitor.visitUnaryAdd(self) @@ -2161,7 +2161,7 @@ return [self.expr,] def __repr__(self): - return "UnarySub(%s)" % (repr(self.expr),) + return "UnarySub(%s)" % (self.expr.__repr__(),) def accept(self, visitor): return visitor.visitUnarySub(self) @@ -2195,7 +2195,7 @@ return nodelist def __repr__(self): - return "While(%s, %s, %s)" % (repr(self.test), repr(self.body), repr(self.else_)) + return "While(%s, %s, %s)" % (self.test.__repr__(), self.body.__repr__(), self.else_.__repr__()) def accept(self, visitor): return 
visitor.visitWhile(self) @@ -2218,7 +2218,7 @@ return [self.value,] def __repr__(self): - return "Yield(%s)" % (repr(self.value),) + return "Yield(%s)" % (self.value.__repr__(),) def accept(self, visitor): return visitor.visitYield(self) From ericvrp at codespeak.net Wed Dec 7 17:23:01 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Wed, 7 Dec 2005 17:23:01 +0100 (CET) Subject: [pypy-svn] r20848 - in pypy/dist/pypy/jit: . test Message-ID: <20051207162301.CF6B327DB4@code1.codespeak.net> Author: ericvrp Date: Wed Dec 7 17:23:00 2005 New Revision: 20848 Modified: pypy/dist/pypy/jit/bytecode.py pypy/dist/pypy/jit/test/test_tl.py pypy/dist/pypy/jit/tl.py Log: (arre, ericvrp) Added lots more tests and basic bytecodes, including comparisons, conditional branches, function calls, returns, exits. Modified: pypy/dist/pypy/jit/bytecode.py ============================================================================== --- pypy/dist/pypy/jit/bytecode.py (original) +++ pypy/dist/pypy/jit/bytecode.py Wed Dec 7 17:23:00 2005 @@ -1,4 +1,34 @@ -PUSH = 'P' -POP = 'p' -ADD = '+' -INVALID = '!' 
+opcode = 0 +def next_opcode(): + global opcode + opcode += 1 + return opcode + +PUSH = next_opcode() #1 operand +POP = next_opcode() +SWAP = next_opcode() +ROT = next_opcode() + +PICK = next_opcode() #1 operand (DUP = PICK,0) +PUT = next_opcode() #1 operand + +ADD = next_opcode() +SUB = next_opcode() +MUL = next_opcode() +DIV = next_opcode() + +EQ = next_opcode() +NE = next_opcode() +LT = next_opcode() +LE = next_opcode() +GT = next_opcode() +GE = next_opcode() + +BR_COND = next_opcode() #1 operand offset + +CALL = next_opcode() #1 operand offset +RETURN = next_opcode() + +EXIT = next_opcode() + +INVALID = next_opcode() Modified: pypy/dist/pypy/jit/test/test_tl.py ============================================================================== --- pypy/dist/pypy/jit/test/test_tl.py (original) +++ pypy/dist/pypy/jit/test/test_tl.py Wed Dec 7 17:23:00 2005 @@ -1,4 +1,5 @@ import py +import operator from pypy.jit.tl import interp from pypy.jit.bytecode import * @@ -24,30 +25,108 @@ builder.import_module() return builder.get_entry_point() +def compile(insn): + return ''.join([chr(i & 0xff) for i in insn]) + # actual tests go here def test_tl_push(): - assert interp(PUSH+chr(16)) == 16 + assert interp(compile([PUSH, 16])) == 16 def test_tl_pop(): - assert interp( ''.join([PUSH,chr(16), PUSH,chr(42), PUSH,chr(200), POP]) ) == 42 + assert interp( compile([PUSH,16, PUSH,42, PUSH,100, POP]) ) == 42 def test_tl_add(): - assert interp( ''.join([PUSH,chr(42), PUSH,chr(200), ADD]) ) == 242 - assert interp( ''.join([PUSH,chr(16), PUSH,chr(42), PUSH,chr(200), ADD]) ) == 242 + assert interp( compile([PUSH,42, PUSH,100, ADD]) ) == 142 + assert interp( compile([PUSH,16, PUSH,42, PUSH,100, ADD]) ) == 142 def test_tl_error(): - py.test.raises(IndexError, interp,POP) - py.test.raises(IndexError, interp,ADD) - py.test.raises(IndexError, interp,''.join([PUSH,chr(200), ADD]) ) + py.test.raises(IndexError, interp,compile([POP])) + py.test.raises(IndexError, interp,compile([ADD])) + 
py.test.raises(IndexError, interp,compile([PUSH,100, ADD]) ) def test_tl_invalid_codetype(): py.test.raises(TypeError, interp,[INVALID]) def test_tl_invalid_bytecode(): - py.test.raises(RuntimeError, interp,INVALID) + py.test.raises(RuntimeError, interp,compile([INVALID])) def test_tl_translatable(): - code = ''.join([PUSH,chr(42), PUSH,chr(200), ADD]) + code = compile([PUSH,42, PUSH,100, ADD]) fn = translate(interp, [str]) assert interp(code) == fn(code) + +def test_swap(): + code = [PUSH,42, PUSH, 84] + assert interp(compile(code)) == 84 + code.append(SWAP) + assert interp(compile(code)) == 42 + code.append(POP) + assert interp(compile(code)) == 84 + +def test_pick(): + values = [7, 8, 9] + code = [] + for v in reversed(values): + code.extend([PUSH, v]) + + for i, v in enumerate(values): + assert interp(compile(code + [PICK,i])) == v + +def test_put(): + values = [1,2,7,-3] + code = [PUSH,0] * len(values) + for i, v in enumerate(values): + code += [PUSH,v, PUT,i] + + for i, v in enumerate(values): + assert interp(compile(code + [PICK,i])) == v + +ops = [ (ADD, operator.add, ((2, 4), (1, 1), (-1, 1))), + (SUB, operator.sub, ((2, 4), (4, 2), (1, 1))), + (MUL, operator.mul, ((2, 4), (4, 2), (1, 1), (-1, 6), (0, 5))), + (DIV, operator.div, ((2, 4), (4, 2), (1, 1), (-4, -2), (0, 9), (9, -3))), + (EQ, operator.eq, ((0, 0), (0, 1), (1, 0), (1, 1), (-1, 0), (0, -1), (-1, -1), (1, -1), (-1, 1))), + (NE, operator.ne, ((0, 0), (0, 1), (1, 0), (1, 1), (-1, 0), (0, -1), (-1, -1), (1, -1), (-1, 1))), + (LT, operator.lt, ((0, 0), (0, 1), (1, 0), (1, 1), (-1, 0), (0, -1), (-1, -1), (1, -1), (-1, 1))), + (LE, operator.le, ((0, 0), (0, 1), (1, 0), (1, 1), (-1, 0), (0, -1), (-1, -1), (1, -1), (-1, 1))), + (GT, operator.gt, ((0, 0), (0, 1), (1, 0), (1, 1), (-1, 0), (0, -1), (-1, -1), (1, -1), (-1, 1))), + (GE, operator.ge, ((0, 0), (0, 1), (1, 0), (1, 1), (-1, 0), (0, -1), (-1, -1), (1, -1), (-1, 1))), + ] + +def test_ops(): + for insn, pyop, values in ops: + for first, second in 
values: + code = [PUSH, first, PUSH, second, insn] + assert interp(compile(code)) == pyop(first, second) + + +def test_branch_forward(): + assert interp(compile([PUSH,1, PUSH,0, BR_COND,2, PUSH,-1])) == -1 + assert interp(compile([PUSH,1, PUSH,1, BR_COND,2, PUSH,-1])) == 1 + assert interp(compile([PUSH,1, PUSH,-1, BR_COND,2, PUSH,-1])) == 1 + +def test_branch_backwards(): + assert interp(compile([PUSH,0, PUSH,1, BR_COND,6, PUSH,-1, PUSH,3, BR_COND,4, PUSH,2, BR_COND,-10])) == -1 + +def test_branch0(): + assert interp(compile([PUSH,7, PUSH,1, BR_COND,0])) == 7 + +def test_exit(): + assert py.test.raises(IndexError, interp, compile([EXIT])) + assert interp(compile([PUSH,7, EXIT, PUSH,5])) == 7 + +def test_rot(): + code = [PUSH,1, PUSH,2, PUSH,3, ROT,3] + assert interp(compile(code)) == 2 + assert interp(compile(code + [POP])) == 1 + assert interp(compile(code + [POP, POP])) == 3 + + py.test.raises(IndexError, interp, compile([PUSH,1, PUSH,2, PUSH,3, ROT,4])) + +def test_call_ret(): + assert py.test.raises(IndexError, interp, compile([RETURN])) + assert interp(compile([PUSH,6, RETURN, PUSH,4, EXIT, PUSH,9])) == 9 + assert interp(compile([CALL,0])) == 2 + + assert interp(compile([PUSH,1, CALL,5, PUSH,2, CALL,2, EXIT, RETURN, ROT,3, ADD, SWAP, RETURN])) == 3 Modified: pypy/dist/pypy/jit/tl.py ============================================================================== --- pypy/dist/pypy/jit/tl.py (original) +++ pypy/dist/pypy/jit/tl.py Wed Dec 7 17:23:00 2005 @@ -3,6 +3,12 @@ import py from bytecode import * +def char2int(c): + t = ord(c) + if t & 128: + t = -(-ord(c) & 0xff) + return t + def interp(code=''): if not isinstance(code,str): raise TypeError("code '%s' should be a string" % str(code)) @@ -12,21 +18,94 @@ pc = 0 while pc < code_len: - opcode = code[pc] + opcode = ord(code[pc]) pc += 1 if opcode == PUSH: - stack.append(ord(code[pc])) + stack.append( char2int(code[pc]) ) pc += 1 elif opcode == POP: stack.pop() + elif opcode == SWAP: + a, b = stack.pop(), 
stack.pop() + stack.append(a) + stack.append(b) + + elif opcode == ROT: #rotate stack top to somewhere below + r = char2int(code[pc]) + if r > 1: + i = len(stack) - r + if i < 0: + raise IndexError + stack.insert( i, stack.pop() ) + pc += 1 + + elif opcode == PICK: + stack.append( stack[-1 - char2int(code[pc])] ) + pc += 1 + + elif opcode == PUT: + stack[-1 - char2int(code[pc])] = stack.pop() + pc += 1 + elif opcode == ADD: - stack.append( stack.pop() + stack.pop() ) + a, b = stack.pop(), stack.pop() + stack.append( b + a ) + + elif opcode == SUB: + a, b = stack.pop(), stack.pop() + stack.append( b - a ) + + elif opcode == MUL: + a, b = stack.pop(), stack.pop() + stack.append( b * a ) + + elif opcode == DIV: + a, b = stack.pop(), stack.pop() + stack.append( b / a ) + + elif opcode == EQ: + a, b = stack.pop(), stack.pop() + stack.append( b == a ) + + elif opcode == NE: + a, b = stack.pop(), stack.pop() + stack.append( b != a ) + + elif opcode == LT: + a, b = stack.pop(), stack.pop() + stack.append( b < a ) + + elif opcode == LE: + a, b = stack.pop(), stack.pop() + stack.append( b <= a ) + + elif opcode == GT: + a, b = stack.pop(), stack.pop() + stack.append( b > a ) + + elif opcode == GE: + a, b = stack.pop(), stack.pop() + stack.append( b >= a ) + + elif opcode == BR_COND: + if stack.pop(): + pc += char2int(code[pc]) + pc += 1 + + elif opcode == CALL: + stack.append( pc+1 ) + pc += char2int(code[pc]) + 1 + + elif opcode == RETURN: + pc = stack.pop() + + elif opcode == EXIT: + break else: raise RuntimeError("unknown opcode: " + str(opcode)) return stack[-1] - From cfbolz at codespeak.net Wed Dec 7 17:24:51 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Wed, 7 Dec 2005 17:24:51 +0100 (CET) Subject: [pypy-svn] r20849 - in pypy/dist/pypy/jit: . 
test Message-ID: <20051207162451.459F927DB4@code1.codespeak.net> Author: cfbolz Date: Wed Dec 7 17:24:50 2005 New Revision: 20849 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_llabstractinterp.py Log: (cfbolz, arigo, pedronis) A few more tests and operations. Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Wed Dec 7 17:24:50 2005 @@ -266,3 +266,12 @@ def op_int_add(self, op, a1, a2): return self.residualize(op, [a1, a2], operator.add) + + def op_int_sub(self, op, a1, a2): + return self.residualize(op, [a1, a2], operator.sub) + + def op_int_gt(self, op, a1, a2): + return self.residualize(op, [a1, a2], operator.gt) + + def op_same_as(self, op, a): + return a Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/test/test_llabstractinterp.py (original) +++ pypy/dist/pypy/jit/test/test_llabstractinterp.py Wed Dec 7 17:24:50 2005 @@ -94,3 +94,40 @@ return y graph2, insns = abstrinterp(ll_function, [6, 42], []) assert insns == {'int_is_true': 1, 'int_add': 2} + +def test_unrolling_loop(): + def ll_function(x, y): + while x > 0: + y += x + x -= 1 + return y + graph2, insns = abstrinterp(ll_function, [6, 42], [0]) + assert insns == {'int_add': 6} + +def test_loop(): + def ll_function(x, y): + while x > 0: + y += x + x -= 1 + return y + graph2, insns = abstrinterp(ll_function, [6, 42], []) + assert insns == {'int_gt': 1, 'int_add': 1, 'int_sub': 1} + +def test_loop2(): + def ll_function(x, y): + while x > 0: + y += x + x -= 1 + return y + graph2, insns = abstrinterp(ll_function, [6, 42], [1]) + assert insns == {'int_gt': 2, 'int_add': 2, 'int_sub': 2} + +def test_not_merging(): + def ll_function(x, y, z): + if x: + a = y + z + else: + a = y - z + return a + x + 
graph2, insns = abstrinterp(ll_function, [3, 4, 5], [1, 2]) + assert insns == {'int_is_true': 1, 'int_add': 2} From ale at codespeak.net Wed Dec 7 17:26:55 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Wed, 7 Dec 2005 17:26:55 +0100 (CET) Subject: [pypy-svn] r20850 - in pypy/dist/pypy: module/_socket module/_socket/rpython translator/c/test Message-ID: <20051207162655.BA10827DB4@code1.codespeak.net> Author: ale Date: Wed Dec 7 17:26:53 2005 New Revision: 20850 Modified: pypy/dist/pypy/module/_socket/interp_socket.py pypy/dist/pypy/module/_socket/rpython/ll__socket.py pypy/dist/pypy/translator/c/test/test_ext__socket.py Log: (nik,ale) Finally, we managed to specialise newsocket (the trick was to have a simple ll_helper which the specializer can't try to build a flowgraph for) Modified: pypy/dist/pypy/module/_socket/interp_socket.py ============================================================================== --- pypy/dist/pypy/module/_socket/interp_socket.py (original) +++ pypy/dist/pypy/module/_socket/interp_socket.py Wed Dec 7 17:26:53 2005 @@ -992,5 +992,4 @@ [*] not available on all platforms!""", __new__ = descr_socket_new, - **socketmethods ) Modified: pypy/dist/pypy/module/_socket/rpython/ll__socket.py ============================================================================== --- pypy/dist/pypy/module/_socket/rpython/ll__socket.py (original) +++ pypy/dist/pypy/module/_socket/rpython/ll__socket.py Wed Dec 7 17:26:53 2005 @@ -74,6 +74,8 @@ ll__socket_ntohl.suggested_primitive = True def ll__socket_newsocket(family, type, protocol): - return _socket.socket(family, type, protocol).fileno() +# from pypy.module._socket.rpython import rsocket +# return rsocket.newsocket(family, type, protocol).fileno() + return 0 ll__socket_newsocket.suggested_primitive = True Modified: pypy/dist/pypy/translator/c/test/test_ext__socket.py ============================================================================== --- 
pypy/dist/pypy/translator/c/test/test_ext__socket.py (original) +++ pypy/dist/pypy/translator/c/test/test_ext__socket.py Wed Dec 7 17:26:53 2005 @@ -70,7 +70,7 @@ a = t.annotate([]) assert a.gettype(t.graphs[0].getreturnvar()) == int -def DONOT_test_newsocket(): +def test_newsocket(): from pypy.module._socket.rpython import rsocket def does_stuff(): return rsocket.newsocket(_socket.AF_INET, _socket.SOCK_STREAM, 0) From ale at codespeak.net Wed Dec 7 17:35:56 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Wed, 7 Dec 2005 17:35:56 +0100 (CET) Subject: [pypy-svn] r20851 - in pypy/dist/pypy/translator/c: . src Message-ID: <20051207163556.3D13C27B5C@code1.codespeak.net> Author: ale Date: Wed Dec 7 17:35:54 2005 New Revision: 20851 Modified: pypy/dist/pypy/translator/c/extfunc.py pypy/dist/pypy/translator/c/src/ll__socket.h Log: (nik,ale) Well, newsocket works (without any error checking whatsoever) Modified: pypy/dist/pypy/translator/c/extfunc.py ============================================================================== --- pypy/dist/pypy/translator/c/extfunc.py (original) +++ pypy/dist/pypy/translator/c/extfunc.py Wed Dec 7 17:35:54 2005 @@ -66,6 +66,7 @@ ll__socket.ll__socket_htons: 'LL__socket_htons', ll__socket.ll__socket_htonl: 'LL__socket_htonl', ll__socket.ll__socket_ntohl: 'LL__socket_htonl', + ll__socket.ll__socket_newsocket: 'LL__socket_newsocket', } #______________________________________________________ Modified: pypy/dist/pypy/translator/c/src/ll__socket.h ============================================================================== --- pypy/dist/pypy/translator/c/src/ll__socket.h (original) +++ pypy/dist/pypy/translator/c/src/ll__socket.h Wed Dec 7 17:35:54 2005 @@ -13,6 +13,7 @@ int LL__socket_htons(int ntohs); long LL__socket_ntohl(long htonl); long LL__socket_htonl(long ntohl); +int LL__socket_newsocket(int family, int type, int protocol); RPyString *LL__socket_gethostname(void); RPyString *LL__socket_gethostbyname(RPyString *name); 
struct RPyOpaque_ADDRINFO *LL__socket_getaddrinfo(RPyString *host, RPyString *port, @@ -75,6 +76,10 @@ return htonl(ntohl); } +int LL__socket_newsocket(int family, int type, int protocol) +{ + return socket(family, type, protocol); +} /* ____________________________________________________________________________ */ /* Lock to allow python interpreter to continue, but only allow one From mwh at codespeak.net Wed Dec 7 17:37:39 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Wed, 7 Dec 2005 17:37:39 +0100 (CET) Subject: [pypy-svn] r20852 - in pypy/dist/pypy/rpython: . test Message-ID: <20051207163739.7023127B66@code1.codespeak.net> Author: mwh Date: Wed Dec 7 17:37:38 2005 New Revision: 20852 Modified: pypy/dist/pypy/rpython/rarithmetic.py pypy/dist/pypy/rpython/test/test_rarithmetic.py Log: (mwh, johahn) fairly substantial reworking of rarithmetic. there are now four exported classes: r_int, r_uint, r_longlong and r_ulonglong. they correspond to the most similarly named C type. r_int now inherits from long, which is a change, but not one that seems to affect anything (fingers crossed!). 
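[Editorial note on the log above: the r_int/r_uint family emulates C's fixed-width integer behaviour on top of Python's unbounded integers by masking every arithmetic result to the type's bit width. A minimal standalone sketch of that masking idea follows; it is an illustration only, not PyPy's actual rarithmetic code, and the class name WrapUint32 is invented for this example.]

```python
# Sketch of the fixed-width wrapping idea behind r_uint (illustration
# only -- not PyPy's rarithmetic code; WrapUint32 is a made-up name).

class WrapUint32(int):
    """Unsigned 32-bit integer: every result wraps modulo 2**32."""
    MASK = 2 ** 32 - 1

    def __new__(cls, value):
        # Masking on construction is what gives C-style wraparound.
        return int.__new__(cls, value & cls.MASK)

    def __add__(self, other):
        return WrapUint32(int(self) + int(other))
    __radd__ = __add__

    def __sub__(self, other):
        return WrapUint32(int(self) - int(other))

    def __mul__(self, other):
        return WrapUint32(int(self) * int(other))
    __rmul__ = __mul__


print(int(WrapUint32(2 ** 32 - 1) + 1))   # wraps around to 0
print(int(WrapUint32(0) - 1))             # wraps to 4294967295
```

The real classes in the diff below go further: a typemap decides, for mixed operations such as r_uint + r_ulonglong, which width the result should have, and the signed variants raise OverflowError instead of wrapping on construction.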
Modified: pypy/dist/pypy/rpython/rarithmetic.py ============================================================================== --- pypy/dist/pypy/rpython/rarithmetic.py (original) +++ pypy/dist/pypy/rpython/rarithmetic.py Wed Dec 7 17:37:38 2005 @@ -28,127 +28,6 @@ """ import math -class r_int(int): - """ fake integer implementation in order to make sure that - primitive integer operations do overflow """ - - def __add__(self, other): - x = int(self) - y = int(other) - return r_int(x + y) - __radd__ = __add__ - - def __sub__(self, other): - x = int(self) - y = int(other) - return r_int(x - y) - - def __rsub__(self, other): - y = int(self) - x = int(other) - return r_int(x - y) - - def __mul__(self, other): - x = int(self) - if not isinstance(other, (int, long)): - return x * other - y = int(other) - return r_int(x * y) - __rmul__ = __mul__ - - def __div__(self, other): - x = int(self) - y = int(other) - return r_int(x // y) - - __floordiv__ = __div__ - - def __rdiv__(self, other): - y = int(self) - x = int(other) - return r_int(x // y) - - __rfloordiv__ = __rdiv__ - - def __mod__(self, other): - x = int(self) - y = int(other) - return r_int(x % y) - - def __rmod__(self, other): - y = int(self) - x = int(other) - return r_int(x % y) - - def __divmod__(self, other): - x = int(self) - y = int(other) - res = divmod(x, y) - return (r_int(res[0]), r_int(res[1])) - - def __lshift__(self, n): - # ensure long shift, so we don't depend on - # shift truncation (2.3) vs. 
long(2.4) - x = long(self) - y = int(n) - return r_int(x << y) - - def __rlshift__(self, n): - y = long(self) - x = int(n) - return r_int(x << y) - - def __rshift__(self, n): - x = int(self) - y = int(n) - return r_int(x >> y) - - def __rrshift__(self, n): - y = int(self) - x = int(n) - return r_int(x >> y) - - def __or__(self, other): - x = int(self) - y = int(other) - return r_int(x | y) - __ror__ = __or__ - - def __and__(self, other): - x = int(self) - y = int(other) - return r_int(x & y) - __rand__ = __and__ - - def __xor__(self, other): - x = int(self) - y = int(other) - return r_int(x ^ y) - __rxor__ = __xor__ - - def __neg__(self): - x = int(self) - return r_int(-x) - - def __pos__(self): - return r_int(self) - - def __invert__(self): - x = int(self) - return r_int(~x) - - def __pow__(self, other, m=None): - x = int(self) - y = int(other) - res = pow(x, y, m) - return r_int(res) - - def __rpow__(self, other, m=None): - y = int(self) - x = int(other) - res = pow(x, y, m) - return r_int(res) - # set up of machine internals _bits = 0 _itest = 1 @@ -203,21 +82,22 @@ raise OverflowError -def _widen(self, other, value): - """ - if one argument is int or long, the other type wins. - otherwise, produce the largest class to hold the result. - """ - return _typemap[ type(self), type(other) ](value) - -class r_uint(long): +class base_int(long): """ fake unsigned integer implementation """ - MASK = LONG_MASK - BITS = LONG_BIT + + def _widen(self, other, value): + """ + if one argument is int or long, the other type wins. + otherwise, produce the largest class to hold the result. 
+ """ + return self.typemap[ type(self), type(other) ](value) def __new__(klass, val): - return long.__new__(klass, val & klass.MASK) + if klass is base_int: + raise TypeError("abstract base!") + else: + return super(base_int, klass).__new__(klass, val) def __int__(self): if self < LONG_TEST: @@ -228,56 +108,56 @@ def __add__(self, other): x = long(self) y = long(other) - return _widen(self, other, x + y) + return self._widen(other, x + y) __radd__ = __add__ def __sub__(self, other): x = long(self) y = long(other) - return _widen(self, other, x - y) + return self._widen(other, x - y) def __rsub__(self, other): y = long(self) x = long(other) - return _widen(self, other, x - y) + return self._widen(other, x - y) def __mul__(self, other): x = long(self) if not isinstance(other, (int, long)): return x * other y = long(other) - return _widen(self, other, x * y) + return self._widen(other, x * y) __rmul__ = __mul__ def __div__(self, other): x = long(self) y = long(other) - return _widen(self, other, x // y) + return self._widen(other, x // y) __floordiv__ = __div__ def __rdiv__(self, other): y = long(self) x = long(other) - return _widen(self, other, x // y) + return self._widen(other, x // y) __rfloordiv__ = __rdiv__ def __mod__(self, other): x = long(self) y = long(other) - return _widen(self, other, x % y) + return self._widen(other, x % y) def __rmod__(self, other): y = long(self) x = long(other) - return _widen(self, other, x % y) + return self._widen(other, x % y) def __divmod__(self, other): x = long(self) y = long(other) res = divmod(x, y) - return (r_uint(res[0]), r_uint(res[1])) + return (self.__class__(res[0]), self.__class__(res[1])) def __lshift__(self, n): x = long(self) @@ -287,34 +167,34 @@ def __rlshift__(self, n): y = long(self) x = long(n) - return _widen(self, n, x << y) + return self._widen(n, x << y) def __rshift__(self, n): x = long(self) y = long(n) - return _widen(self, n, x >> y) + return self._widen(n, x >> y) def __rrshift__(self, n): y = 
long(self) x = long(n) - return _widen(self, n, x >> y) + return self._widen(n, x >> y) def __or__(self, other): x = long(self) y = long(other) - return _widen(self, other, x | y) + return self._widen(other, x | y) __ror__ = __or__ def __and__(self, other): x = long(self) y = long(other) - return _widen(self, other, x & y) + return self._widen(other, x & y) __rand__ = __and__ def __xor__(self, other): x = long(self) y = long(other) - return _widen(self, other, x ^ y) + return self._widen(other, x ^ y) __rxor__ = __xor__ def __neg__(self): @@ -332,26 +212,53 @@ x = long(self) y = long(other) res = pow(x, y, m) - return _widen(self, other, res) + return self._widen(other, res) def __rpow__(self, other, m=None): y = long(self) x = long(other) res = pow(x, y, m) - return _widen(self, other, res) + return self._widen(other, res) -class r_ushort(r_uint): - """ fake unsigned short integer implementation """ - BITS = r_uint.BITS // 2 - MASK = (1L << BITS) - 1 - -class r_ulong(r_uint): - """ fake unsigned long integer implementation """ - BITS = r_uint.BITS * 2 - MASK = (1L << BITS) - 1 +class signed_int(base_int): + def __new__(klass, val): + if val > klass.MASK>>1 or val < -(klass.MASK>>1)-1: + raise OverflowError("%s does not fit in signed %d-bit integer"%(val, klass.BITS)) + if val < 0: + val = - ((-val) & klass.MASK) + return super(signed_int, klass).__new__(klass, val) + typemap = {} + +class unsigned_int(base_int): + def __new__(klass, val): + return super(unsigned_int, klass).__new__(klass, val & klass.MASK) + typemap = {} -def setup_typemap(): - types = int, long, r_uint, r_ushort, r_ulong +class r_int(signed_int): + MASK = LONG_MASK + BITS = LONG_BIT + + +class r_uint(unsigned_int): + MASK = LONG_MASK + BITS = LONG_BIT + + +if LONG_BIT == 64: + r_ulonglong = r_uint + r_longlong = r_int +else: + assert LONG_BIT == 32 + + class r_longlong(signed_int): + BITS = LONG_BIT * 2 + MASK = 2**BITS-1 + + class r_ulonglong(unsigned_int): + BITS = LONG_BIT * 2 + MASK = 
2**BITS-1 + +def setup_typemap(typemap, types): for left in types: for right in types: if left in (int, long): @@ -364,10 +271,11 @@ else: restype = right if restype not in (int, long): - _typemap[ left, right ] = restype -_typemap = {} + typemap[ left, right ] = restype + +setup_typemap(unsigned_int.typemap, (int, long, r_uint, r_ulonglong)) +setup_typemap(signed_int.typemap, (int, long, r_int, r_longlong)) -setup_typemap() del setup_typemap # string -> float helper Modified: pypy/dist/pypy/rpython/test/test_rarithmetic.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rarithmetic.py (original) +++ pypy/dist/pypy/rpython/test/test_rarithmetic.py Wed Dec 7 17:37:38 2005 @@ -1,6 +1,6 @@ from pypy.rpython.rarithmetic import * import sys - +import py maxint_mask = (sys.maxint*2 + 1) machbits = 0 @@ -52,7 +52,7 @@ self.unary_test(lambda x: ~x) def test__pow__(self): self.binary_test(lambda x, y: x**y, (2, 3)) - self.binary_test(lambda x, y: pow(x, y, 42), (2, 3, 5, 1000)) + self.binary_test(lambda x, y: pow(x, y, 42L), (2, 3, 5, 1000)) def unary_test(self, f): for arg in (-10, -1, 0, 3, 12345): @@ -139,7 +139,7 @@ assert res == cmp def test_mixed_types(): - types = [r_ushort, r_uint, r_ulong] + types = [r_uint, r_ulonglong] for left in types: for right in types: x = left(3) + right(5) @@ -147,17 +147,17 @@ assert types.index(type(x)) == expected def test_limits(): - mask = r_ushort.MASK - assert r_ushort(mask) == mask - assert r_ushort(mask+1) == 0 - mask = (mask << r_ushort.BITS) + mask - assert mask == r_uint.MASK - assert r_uint(mask == mask) - assert r_uint(mask+1) == 0 - mask = (mask << r_uint.BITS) + mask - assert mask == r_ulong.MASK - assert r_ulong(mask == mask) - assert r_ulong(mask+1) == 0 + for cls in r_uint, r_ulonglong: + mask = cls.MASK + assert cls(mask) == mask + assert cls(mask+1) == 0 + + for cls in r_int, r_longlong: + mask = cls.MASK>>1 + assert cls(mask) == mask + assert 
cls(-mask-1) == -mask-1 + py.test.raises(OverflowError, "cls(mask) + 1") + py.test.raises(OverflowError, "cls(-mask-1) - 1") def test_intmask(): assert intmask(1) == 1 From ludal at codespeak.net Wed Dec 7 18:07:37 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Wed, 7 Dec 2005 18:07:37 +0100 (CET) Subject: [pypy-svn] r20853 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20051207170737.E55DC27B6C@code1.codespeak.net> Author: ludal Date: Wed Dec 7 18:07:36 2005 New Revision: 20853 Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py pypy/dist/pypy/interpreter/astcompiler/astgen.py Log: implements some descr as methods to correctly unwrap and annotate self Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.py Wed Dec 7 18:07:36 2005 @@ -6,7 +6,7 @@ from consts import CO_VARARGS, CO_VARKEYWORDS, OP_ASSIGN from pypy.interpreter.baseobjspace import Wrappable from pypy.interpreter.typedef import TypeDef -from pypy.interpreter.gateway import interp2app +from pypy.interpreter.gateway import interp2app, W_Root, ObjSpace from pypy.interpreter.argument import Arguments def flatten(list): @@ -58,21 +58,20 @@ def descr_repr( self, space ): return space.wrap( self.__repr__() ) - -def descr_node_repr( space, w_obj ): - return w_obj.descr_repr( space ) -def descr_getChildNodes( space, w_obj ): - lst = w_obj.getChildNodes() - return space.newlist( lst ) - -def descr_accept( space, w_obj, w_visitor ): - return w_obj.descr_accept( space, w_visitor ) + def descr_getChildNodes( self, space ): + lst = self.getChildNodes() + return space.newlist( [ self.wrap( it ) for it in lst ] ) + +def descr_node_accept( space, w_self, w_visitor ): + w_callable = space.getattr(w_visitor, space.wrap('visitNode')) + args = Arguments(space, [ w_self ]) + return space.call_args( w_callable, 
args ) Node.typedef = TypeDef('ASTNode', - #__repr__ = interp2app(descr_node_repr), - getChildNodes = interp2app(descr_getChildNodes), - accept = interp2app(descr_accept), + #__repr__ = interp2app(descr_node_repr, unwrap_spec=['self', ObjSpace] ), + getChildNodes = interp2app(Node.descr_getChildNodes, unwrap_spec=[ 'self', ObjSpace ] ), + accept = interp2app(descr_node_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), ) @@ -118,10 +117,13 @@ def accept(self, visitor): return visitor.visitAbstractFunction(self) - def descr_accept(self, space, w_visitor): - w_callable = space.getattr(w_visitor, space.wrap('visitAbstractFunction')) - args = Arguments(space, [ self ]) - return space.call_args(w_callable, args) +def descr_AbstractFunction_accept( space, w_self, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitAbstractFunction')) + args = Arguments(space, [ w_self ]) + return space.call_args(w_callable, args) + +AbstractFunction.typedef = TypeDef('AbstractFunction', Node.typedef, + accept=interp2app(descr_AbstractFunction_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) class AbstractTest(Node): def __init__(self, lineno=-1): @@ -140,10 +142,13 @@ def accept(self, visitor): return visitor.visitAbstractTest(self) - def descr_accept(self, space, w_visitor): - w_callable = space.getattr(w_visitor, space.wrap('visitAbstractTest')) - args = Arguments(space, [ self ]) - return space.call_args(w_callable, args) +def descr_AbstractTest_accept( space, w_self, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitAbstractTest')) + args = Arguments(space, [ w_self ]) + return space.call_args(w_callable, args) + +AbstractTest.typedef = TypeDef('AbstractTest', Node.typedef, + accept=interp2app(descr_AbstractTest_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) class BinaryOp(Node): def __init__(self, lineno=-1): @@ -162,10 +167,13 @@ def accept(self, visitor): return visitor.visitBinaryOp(self) - def descr_accept(self, space, w_visitor): - 
w_callable = space.getattr(w_visitor, space.wrap('visitBinaryOp')) - args = Arguments(space, [ self ]) - return space.call_args(w_callable, args) +def descr_BinaryOp_accept( space, w_self, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitBinaryOp')) + args = Arguments(space, [ w_self ]) + return space.call_args(w_callable, args) + +BinaryOp.typedef = TypeDef('BinaryOp', Node.typedef, + accept=interp2app(descr_BinaryOp_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) class Add(BinaryOp): def __init__(self, (left, right), lineno=-1): @@ -186,10 +194,13 @@ def accept(self, visitor): return visitor.visitAdd(self) - def descr_accept(self, space, w_visitor): - w_callable = space.getattr(w_visitor, space.wrap('visitAdd')) - args = Arguments(space, [ self ]) - return space.call_args(w_callable, args) +def descr_Add_accept( space, w_self, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitAdd')) + args = Arguments(space, [ w_self ]) + return space.call_args(w_callable, args) + +Add.typedef = TypeDef('Add', Node.typedef, + accept=interp2app(descr_Add_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) class And(AbstractTest): def __init__(self, nodes, lineno=-1): @@ -211,10 +222,13 @@ def accept(self, visitor): return visitor.visitAnd(self) - def descr_accept(self, space, w_visitor): - w_callable = space.getattr(w_visitor, space.wrap('visitAnd')) - args = Arguments(space, [ self ]) - return space.call_args(w_callable, args) +def descr_And_accept( space, w_self, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitAnd')) + args = Arguments(space, [ w_self ]) + return space.call_args(w_callable, args) + +And.typedef = TypeDef('And', Node.typedef, + accept=interp2app(descr_And_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) class AssAttr(Node): def __init__(self, expr, attrname, flags, lineno=-1): @@ -236,10 +250,13 @@ def accept(self, visitor): return visitor.visitAssAttr(self) - def descr_accept(self, space, 
w_visitor): - w_callable = space.getattr(w_visitor, space.wrap('visitAssAttr')) - args = Arguments(space, [ self ]) - return space.call_args(w_callable, args) +def descr_AssAttr_accept( space, w_self, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitAssAttr')) + args = Arguments(space, [ w_self ]) + return space.call_args(w_callable, args) + +AssAttr.typedef = TypeDef('AssAttr', Node.typedef, + accept=interp2app(descr_AssAttr_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) class AssSeq(Node): def __init__(self, lineno=-1): @@ -258,10 +275,13 @@ def accept(self, visitor): return visitor.visitAssSeq(self) - def descr_accept(self, space, w_visitor): - w_callable = space.getattr(w_visitor, space.wrap('visitAssSeq')) - args = Arguments(space, [ self ]) - return space.call_args(w_callable, args) +def descr_AssSeq_accept( space, w_self, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitAssSeq')) + args = Arguments(space, [ w_self ]) + return space.call_args(w_callable, args) + +AssSeq.typedef = TypeDef('AssSeq', Node.typedef, + accept=interp2app(descr_AssSeq_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) class AssList(AssSeq): def __init__(self, nodes, lineno=-1): @@ -283,10 +303,13 @@ def accept(self, visitor): return visitor.visitAssList(self) - def descr_accept(self, space, w_visitor): - w_callable = space.getattr(w_visitor, space.wrap('visitAssList')) - args = Arguments(space, [ self ]) - return space.call_args(w_callable, args) +def descr_AssList_accept( space, w_self, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitAssList')) + args = Arguments(space, [ w_self ]) + return space.call_args(w_callable, args) + +AssList.typedef = TypeDef('AssList', Node.typedef, + accept=interp2app(descr_AssList_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) class AssName(Node): def __init__(self, name, flags, lineno=-1): @@ -307,10 +330,13 @@ def accept(self, visitor): return visitor.visitAssName(self) - def 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitAssName'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_AssName_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitAssName'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+AssName.typedef = TypeDef('AssName', Node.typedef,
+    accept=interp2app(descr_AssName_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class AssTuple(AssSeq):
     def __init__(self, nodes, lineno=-1):
@@ -345,10 +371,13 @@
     def accept(self, visitor):
         return visitor.visitAssTuple(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitAssTuple'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_AssTuple_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitAssTuple'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+AssTuple.typedef = TypeDef('AssTuple', Node.typedef,
+    accept=interp2app(descr_AssTuple_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Assert(Node):
     def __init__(self, test, fail, lineno=-1):
@@ -376,10 +405,13 @@
     def accept(self, visitor):
         return visitor.visitAssert(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitAssert'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Assert_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitAssert'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Assert.typedef = TypeDef('Assert', Node.typedef,
+    accept=interp2app(descr_Assert_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Assign(Node):
     def __init__(self, nodes, expr, lineno=-1):
@@ -406,10 +438,13 @@
     def accept(self, visitor):
         return visitor.visitAssign(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitAssign'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Assign_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitAssign'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Assign.typedef = TypeDef('Assign', Node.typedef,
+    accept=interp2app(descr_Assign_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class AugAssign(Node):
     def __init__(self, node, op, expr, lineno=-1):
@@ -431,10 +466,13 @@
     def accept(self, visitor):
         return visitor.visitAugAssign(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitAugAssign'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_AugAssign_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitAugAssign'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+AugAssign.typedef = TypeDef('AugAssign', Node.typedef,
+    accept=interp2app(descr_AugAssign_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class UnaryOp(Node):
     def __init__(self, lineno=-1):
@@ -453,10 +491,13 @@
     def accept(self, visitor):
         return visitor.visitUnaryOp(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitUnaryOp'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_UnaryOp_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitUnaryOp'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+UnaryOp.typedef = TypeDef('UnaryOp', Node.typedef,
+    accept=interp2app(descr_UnaryOp_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Backquote(UnaryOp):
     def __init__(self, expr, lineno=-1):
@@ -476,10 +517,13 @@
     def accept(self, visitor):
         return visitor.visitBackquote(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitBackquote'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Backquote_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitBackquote'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Backquote.typedef = TypeDef('Backquote', Node.typedef,
+    accept=interp2app(descr_Backquote_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class BitOp(Node):
     def __init__(self, lineno=-1):
@@ -498,10 +542,13 @@
     def accept(self, visitor):
         return visitor.visitBitOp(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitBitOp'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_BitOp_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitBitOp'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+BitOp.typedef = TypeDef('BitOp', Node.typedef,
+    accept=interp2app(descr_BitOp_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Bitand(BitOp):
     def __init__(self, nodes, lineno=-1):
@@ -523,10 +570,13 @@
     def accept(self, visitor):
         return visitor.visitBitand(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitBitand'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Bitand_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitBitand'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Bitand.typedef = TypeDef('Bitand', Node.typedef,
+    accept=interp2app(descr_Bitand_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Bitor(BitOp):
     def __init__(self, nodes, lineno=-1):
@@ -548,10 +598,13 @@
     def accept(self, visitor):
         return visitor.visitBitor(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitBitor'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Bitor_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitBitor'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Bitor.typedef = TypeDef('Bitor', Node.typedef,
+    accept=interp2app(descr_Bitor_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Bitxor(BitOp):
     def __init__(self, nodes, lineno=-1):
@@ -573,10 +626,13 @@
     def accept(self, visitor):
        return visitor.visitBitxor(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitBitxor'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Bitxor_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitBitxor'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Bitxor.typedef = TypeDef('Bitxor', Node.typedef,
+    accept=interp2app(descr_Bitxor_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Break(Node):
     def __init__(self, lineno=-1):
@@ -595,10 +651,13 @@
     def accept(self, visitor):
         return visitor.visitBreak(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitBreak'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Break_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitBreak'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Break.typedef = TypeDef('Break', Node.typedef,
+    accept=interp2app(descr_Break_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class CallFunc(Node):
     def __init__(self, node, args, star_args = None, dstar_args = None, lineno=-1):
@@ -633,10 +692,13 @@
     def accept(self, visitor):
         return visitor.visitCallFunc(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitCallFunc'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_CallFunc_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitCallFunc'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+CallFunc.typedef = TypeDef('CallFunc', Node.typedef,
+    accept=interp2app(descr_CallFunc_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Class(Node):
     def __init__(self, name, bases, doc, code, lineno=-1):
@@ -667,10 +729,13 @@
     def accept(self, visitor):
         return visitor.visitClass(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitClass'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Class_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitClass'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Class.typedef = TypeDef('Class', Node.typedef,
+    accept=interp2app(descr_Class_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Compare(Node):
     def __init__(self, expr, ops, lineno=-1):
@@ -700,10 +765,13 @@
     def accept(self, visitor):
         return visitor.visitCompare(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitCompare'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Compare_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitCompare'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Compare.typedef = TypeDef('Compare', Node.typedef,
+    accept=interp2app(descr_Compare_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Const(Node):
     def __init__(self, value, lineno=-1):
@@ -723,10 +791,13 @@
     def accept(self, visitor):
         return visitor.visitConst(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitConst'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Const_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitConst'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Const.typedef = TypeDef('Const', Node.typedef,
+    accept=interp2app(descr_Const_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Continue(Node):
     def __init__(self, lineno=-1):
@@ -745,10 +816,13 @@
     def accept(self, visitor):
         return visitor.visitContinue(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitContinue'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Continue_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitContinue'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Continue.typedef = TypeDef('Continue', Node.typedef,
+    accept=interp2app(descr_Continue_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Decorators(Node):
     def __init__(self, nodes, lineno=-1):
@@ -770,10 +844,13 @@
     def accept(self, visitor):
         return visitor.visitDecorators(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitDecorators'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Decorators_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitDecorators'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Decorators.typedef = TypeDef('Decorators', Node.typedef,
+    accept=interp2app(descr_Decorators_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Dict(Node):
     def __init__(self, items, lineno=-1):
@@ -799,10 +876,13 @@
     def accept(self, visitor):
         return visitor.visitDict(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitDict'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Dict_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitDict'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Dict.typedef = TypeDef('Dict', Node.typedef,
+    accept=interp2app(descr_Dict_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Discard(Node):
     def __init__(self, expr, lineno=-1):
@@ -822,10 +902,13 @@
     def accept(self, visitor):
         return visitor.visitDiscard(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitDiscard'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Discard_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitDiscard'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Discard.typedef = TypeDef('Discard', Node.typedef,
+    accept=interp2app(descr_Discard_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Div(BinaryOp):
     def __init__(self, (left, right), lineno=-1):
@@ -846,10 +929,13 @@
     def accept(self, visitor):
         return visitor.visitDiv(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitDiv'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Div_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitDiv'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Div.typedef = TypeDef('Div', Node.typedef,
+    accept=interp2app(descr_Div_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Ellipsis(Node):
     def __init__(self, lineno=-1):
@@ -868,10 +954,13 @@
     def accept(self, visitor):
         return visitor.visitEllipsis(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitEllipsis'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Ellipsis_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitEllipsis'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Ellipsis.typedef = TypeDef('Ellipsis', Node.typedef,
+    accept=interp2app(descr_Ellipsis_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Exec(Node):
     def __init__(self, expr, locals, globals, lineno=-1):
@@ -903,10 +992,13 @@
     def accept(self, visitor):
         return visitor.visitExec(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitExec'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Exec_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitExec'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Exec.typedef = TypeDef('Exec', Node.typedef,
+    accept=interp2app(descr_Exec_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class FloorDiv(BinaryOp):
     def __init__(self, (left, right), lineno=-1):
@@ -927,10 +1019,13 @@
     def accept(self, visitor):
         return visitor.visitFloorDiv(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitFloorDiv'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_FloorDiv_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitFloorDiv'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+FloorDiv.typedef = TypeDef('FloorDiv', Node.typedef,
+    accept=interp2app(descr_FloorDiv_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class For(Node):
     def __init__(self, assign, list, body, else_, lineno=-1):
@@ -964,10 +1059,13 @@
     def accept(self, visitor):
         return visitor.visitFor(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitFor'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_For_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitFor'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+For.typedef = TypeDef('For', Node.typedef,
+    accept=interp2app(descr_For_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class From(Node):
     def __init__(self, modname, names, lineno=-1):
@@ -988,10 +1086,13 @@
     def accept(self, visitor):
         return visitor.visitFrom(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitFrom'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_From_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitFrom'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+From.typedef = TypeDef('From', Node.typedef,
+    accept=interp2app(descr_From_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Function(AbstractFunction):
     def __init__(self, decorators, name, argnames, defaults, flags, doc, code, lineno=-1):
@@ -1037,10 +1138,13 @@
     def accept(self, visitor):
         return visitor.visitFunction(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitFunction'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Function_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitFunction'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Function.typedef = TypeDef('Function', Node.typedef,
+    accept=interp2app(descr_Function_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class GenExpr(AbstractFunction):
     def __init__(self, code, lineno=-1):
@@ -1064,10 +1168,13 @@
     def accept(self, visitor):
         return visitor.visitGenExpr(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitGenExpr'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_GenExpr_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitGenExpr'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+GenExpr.typedef = TypeDef('GenExpr', Node.typedef,
+    accept=interp2app(descr_GenExpr_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class GenExprFor(Node):
     def __init__(self, assign, iter, ifs, lineno=-1):
@@ -1100,10 +1207,13 @@
     def accept(self, visitor):
         return visitor.visitGenExprFor(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitGenExprFor'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_GenExprFor_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitGenExprFor'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+GenExprFor.typedef = TypeDef('GenExprFor', Node.typedef,
+    accept=interp2app(descr_GenExprFor_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class GenExprIf(Node):
     def __init__(self, test, lineno=-1):
@@ -1123,10 +1233,13 @@
     def accept(self, visitor):
         return visitor.visitGenExprIf(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitGenExprIf'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_GenExprIf_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitGenExprIf'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+GenExprIf.typedef = TypeDef('GenExprIf', Node.typedef,
+    accept=interp2app(descr_GenExprIf_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class GenExprInner(Node):
     def __init__(self, expr, quals, lineno=-1):
@@ -1153,10 +1266,13 @@
     def accept(self, visitor):
         return visitor.visitGenExprInner(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitGenExprInner'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_GenExprInner_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitGenExprInner'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+GenExprInner.typedef = TypeDef('GenExprInner', Node.typedef,
+    accept=interp2app(descr_GenExprInner_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Getattr(Node):
     def __init__(self, expr, attrname, lineno=-1):
@@ -1177,10 +1293,13 @@
     def accept(self, visitor):
         return visitor.visitGetattr(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitGetattr'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Getattr_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitGetattr'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Getattr.typedef = TypeDef('Getattr', Node.typedef,
+    accept=interp2app(descr_Getattr_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Global(Node):
     def __init__(self, names, lineno=-1):
@@ -1200,10 +1319,13 @@
     def accept(self, visitor):
         return visitor.visitGlobal(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitGlobal'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Global_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitGlobal'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Global.typedef = TypeDef('Global', Node.typedef,
+    accept=interp2app(descr_Global_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class If(Node):
     def __init__(self, tests, else_, lineno=-1):
@@ -1235,10 +1357,13 @@
     def accept(self, visitor):
         return visitor.visitIf(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitIf'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_If_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitIf'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+If.typedef = TypeDef('If', Node.typedef,
+    accept=interp2app(descr_If_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Import(Node):
     def __init__(self, names, lineno=-1):
@@ -1258,10 +1383,13 @@
     def accept(self, visitor):
         return visitor.visitImport(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitImport'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Import_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitImport'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Import.typedef = TypeDef('Import', Node.typedef,
+    accept=interp2app(descr_Import_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Invert(UnaryOp):
    def __init__(self, expr, lineno=-1):
@@ -1281,10 +1409,13 @@
     def accept(self, visitor):
         return visitor.visitInvert(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitInvert'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Invert_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitInvert'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Invert.typedef = TypeDef('Invert', Node.typedef,
+    accept=interp2app(descr_Invert_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Keyword(Node):
     def __init__(self, name, expr, lineno=-1):
@@ -1305,10 +1436,13 @@
     def accept(self, visitor):
         return visitor.visitKeyword(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitKeyword'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Keyword_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitKeyword'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Keyword.typedef = TypeDef('Keyword', Node.typedef,
+    accept=interp2app(descr_Keyword_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Lambda(AbstractFunction):
     def __init__(self, argnames, defaults, flags, code, lineno=-1):
@@ -1346,10 +1480,13 @@
     def accept(self, visitor):
         return visitor.visitLambda(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitLambda'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Lambda_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitLambda'))
    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Lambda.typedef = TypeDef('Lambda', Node.typedef,
+    accept=interp2app(descr_Lambda_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class LeftShift(BinaryOp):
     def __init__(self, (left, right), lineno=-1):
@@ -1370,10 +1507,13 @@
     def accept(self, visitor):
         return visitor.visitLeftShift(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitLeftShift'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_LeftShift_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitLeftShift'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+LeftShift.typedef = TypeDef('LeftShift', Node.typedef,
+    accept=interp2app(descr_LeftShift_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class List(Node):
     def __init__(self, nodes, lineno=-1):
@@ -1395,10 +1535,13 @@
     def accept(self, visitor):
         return visitor.visitList(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitList'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_List_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitList'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+List.typedef = TypeDef('List', Node.typedef,
+    accept=interp2app(descr_List_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class ListComp(Node):
    def __init__(self, expr, quals, lineno=-1):
@@ -1425,10 +1568,13 @@
     def accept(self, visitor):
         return visitor.visitListComp(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitListComp'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_ListComp_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitListComp'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+ListComp.typedef = TypeDef('ListComp', Node.typedef,
+    accept=interp2app(descr_ListComp_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class ListCompFor(Node):
     def __init__(self, assign, list, ifs, lineno=-1):
@@ -1458,10 +1604,13 @@
     def accept(self, visitor):
         return visitor.visitListCompFor(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitListCompFor'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_ListCompFor_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitListCompFor'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+ListCompFor.typedef = TypeDef('ListCompFor', Node.typedef,
+    accept=interp2app(descr_ListCompFor_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class ListCompIf(Node):
     def __init__(self, test, lineno=-1):
@@ -1481,10 +1630,13 @@
     def accept(self, visitor):
         return visitor.visitListCompIf(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitListCompIf'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_ListCompIf_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitListCompIf'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+ListCompIf.typedef = TypeDef('ListCompIf', Node.typedef,
+    accept=interp2app(descr_ListCompIf_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Mod(BinaryOp):
     def __init__(self, (left, right), lineno=-1):
@@ -1505,10 +1657,13 @@
     def accept(self, visitor):
         return visitor.visitMod(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitMod'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Mod_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitMod'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Mod.typedef = TypeDef('Mod', Node.typedef,
+    accept=interp2app(descr_Mod_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Module(Node):
     def __init__(self, doc, node, lineno=-1):
@@ -1529,10 +1684,13 @@
     def accept(self, visitor):
         return visitor.visitModule(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitModule'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Module_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitModule'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Module.typedef = TypeDef('Module', Node.typedef,
+    accept=interp2app(descr_Module_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Mul(BinaryOp):
     def __init__(self, (left, right), lineno=-1):
@@ -1553,10 +1711,13 @@
     def accept(self, visitor):
         return visitor.visitMul(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitMul'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Mul_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitMul'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Mul.typedef = TypeDef('Mul', Node.typedef,
+    accept=interp2app(descr_Mul_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Name(Node):
     def __init__(self, varname, lineno=-1):
@@ -1576,10 +1737,13 @@
     def accept(self, visitor):
         return visitor.visitName(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitName'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Name_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitName'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Name.typedef = TypeDef('Name', Node.typedef,
+    accept=interp2app(descr_Name_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class NoneConst(Node):
     def __init__(self, lineno=-1):
@@ -1598,10 +1762,13 @@
     def accept(self, visitor):
         return visitor.visitNoneConst(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitNoneConst'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_NoneConst_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitNoneConst'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+NoneConst.typedef = TypeDef('NoneConst', Node.typedef,
+    accept=interp2app(descr_NoneConst_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Not(UnaryOp):
     def __init__(self, expr, lineno=-1):
@@ -1621,10 +1788,13 @@
     def accept(self, visitor):
         return visitor.visitNot(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitNot'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Not_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitNot'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Not.typedef = TypeDef('Not', Node.typedef,
+    accept=interp2app(descr_Not_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class NumberConst(Node):
     def __init__(self, number_value, lineno=-1):
@@ -1644,10 +1814,13 @@
     def accept(self, visitor):
         return visitor.visitNumberConst(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitNumberConst'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_NumberConst_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitNumberConst'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+NumberConst.typedef = TypeDef('NumberConst', Node.typedef,
+    accept=interp2app(descr_NumberConst_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Or(AbstractTest):
     def __init__(self, nodes, lineno=-1):
@@ -1669,10 +1842,13 @@
     def accept(self, visitor):
         return visitor.visitOr(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitOr'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Or_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitOr'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Or.typedef = TypeDef('Or', Node.typedef,
+    accept=interp2app(descr_Or_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Pass(Node):
     def __init__(self, lineno=-1):
@@ -1691,10 +1867,13 @@
     def accept(self, visitor):
         return visitor.visitPass(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitPass'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Pass_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitPass'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Pass.typedef = TypeDef('Pass', Node.typedef,
+    accept=interp2app(descr_Pass_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Power(BinaryOp):
     def __init__(self, (left, right), lineno=-1):
@@ -1715,10 +1894,13 @@
     def accept(self, visitor):
         return visitor.visitPower(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitPower'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Power_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitPower'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Power.typedef = TypeDef('Power', Node.typedef,
+    accept=interp2app(descr_Power_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Print(Node):
     def __init__(self, nodes, dest, lineno=-1):
@@ -1746,10 +1928,13 @@
     def accept(self, visitor):
         return visitor.visitPrint(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitPrint'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Print_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitPrint'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Print.typedef = TypeDef('Print', Node.typedef,
+    accept=interp2app(descr_Print_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Printnl(Node):
     def __init__(self, nodes, dest, lineno=-1):
@@ -1777,10 +1962,13 @@
     def accept(self, visitor):
         return visitor.visitPrintnl(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitPrintnl'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Printnl_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitPrintnl'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Printnl.typedef = TypeDef('Printnl', Node.typedef,
+    accept=interp2app(descr_Printnl_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Raise(Node):
     def __init__(self, expr1, expr2, expr3, lineno=-1):
@@ -1813,10 +2001,13 @@
     def accept(self, visitor):
         return visitor.visitRaise(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitRaise'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Raise_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitRaise'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Raise.typedef = TypeDef('Raise', Node.typedef,
+    accept=interp2app(descr_Raise_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Return(Node):
     def __init__(self, value, lineno=-1):
@@ -1839,10 +2030,13 @@
     def accept(self, visitor):
         return visitor.visitReturn(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitReturn'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Return_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitReturn'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Return.typedef = TypeDef('Return', Node.typedef,
+    accept=interp2app(descr_Return_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class RightShift(BinaryOp):
     def __init__(self, (left, right), lineno=-1):
@@ -1863,10 +2057,13 @@
     def accept(self, visitor):
         return visitor.visitRightShift(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitRightShift'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_RightShift_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitRightShift'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+RightShift.typedef = TypeDef('RightShift', Node.typedef,
+    accept=interp2app(descr_RightShift_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
 
 class Slice(Node):
     def __init__(self, expr, flags, lower, upper, lineno=-1):
@@ -1900,10 +2097,13 @@
     def accept(self, visitor):
         return visitor.visitSlice(self)
 
-    def descr_accept(self, space, w_visitor):
-        w_callable = space.getattr(w_visitor, space.wrap('visitSlice'))
-        args = Arguments(space, [ self ])
-        return space.call_args(w_callable, args)
+def descr_Slice_accept( space, w_self, w_visitor):
+    w_callable = space.getattr(w_visitor, space.wrap('visitSlice'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args(w_callable, args)
+
+Slice.typedef = TypeDef('Slice', Node.typedef,
+    accept=interp2app(descr_Slice_accept, unwrap_spec=[ ObjSpace,
W_Root, W_Root ] )) class Sliceobj(Node): def __init__(self, nodes, lineno=-1): @@ -1925,10 +2125,13 @@ def accept(self, visitor): return visitor.visitSliceobj(self) - def descr_accept(self, space, w_visitor): - w_callable = space.getattr(w_visitor, space.wrap('visitSliceobj')) - args = Arguments(space, [ self ]) - return space.call_args(w_callable, args) +def descr_Sliceobj_accept( space, w_self, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitSliceobj')) + args = Arguments(space, [ w_self ]) + return space.call_args(w_callable, args) + +Sliceobj.typedef = TypeDef('Sliceobj', Node.typedef, + accept=interp2app(descr_Sliceobj_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) class Stmt(Node): def __init__(self, nodes, lineno=-1): @@ -1950,10 +2153,13 @@ def accept(self, visitor): return visitor.visitStmt(self) - def descr_accept(self, space, w_visitor): - w_callable = space.getattr(w_visitor, space.wrap('visitStmt')) - args = Arguments(space, [ self ]) - return space.call_args(w_callable, args) +def descr_Stmt_accept( space, w_self, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitStmt')) + args = Arguments(space, [ w_self ]) + return space.call_args(w_callable, args) + +Stmt.typedef = TypeDef('Stmt', Node.typedef, + accept=interp2app(descr_Stmt_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) class StringConst(Node): def __init__(self, string_value, lineno=-1): @@ -1973,10 +2179,13 @@ def accept(self, visitor): return visitor.visitStringConst(self) - def descr_accept(self, space, w_visitor): - w_callable = space.getattr(w_visitor, space.wrap('visitStringConst')) - args = Arguments(space, [ self ]) - return space.call_args(w_callable, args) +def descr_StringConst_accept( space, w_self, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitStringConst')) + args = Arguments(space, [ w_self ]) + return space.call_args(w_callable, args) + +StringConst.typedef = TypeDef('StringConst', Node.typedef, + 
accept=interp2app(descr_StringConst_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) class Sub(BinaryOp): def __init__(self, (left, right), lineno=-1): @@ -1997,10 +2206,13 @@ def accept(self, visitor): return visitor.visitSub(self) - def descr_accept(self, space, w_visitor): - w_callable = space.getattr(w_visitor, space.wrap('visitSub')) - args = Arguments(space, [ self ]) - return space.call_args(w_callable, args) +def descr_Sub_accept( space, w_self, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitSub')) + args = Arguments(space, [ w_self ]) + return space.call_args(w_callable, args) + +Sub.typedef = TypeDef('Sub', Node.typedef, + accept=interp2app(descr_Sub_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) class Subscript(Node): def __init__(self, expr, flags, subs, lineno=-1): @@ -2029,10 +2241,13 @@ def accept(self, visitor): return visitor.visitSubscript(self) - def descr_accept(self, space, w_visitor): - w_callable = space.getattr(w_visitor, space.wrap('visitSubscript')) - args = Arguments(space, [ self ]) - return space.call_args(w_callable, args) +def descr_Subscript_accept( space, w_self, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitSubscript')) + args = Arguments(space, [ w_self ]) + return space.call_args(w_callable, args) + +Subscript.typedef = TypeDef('Subscript', Node.typedef, + accept=interp2app(descr_Subscript_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) class TryExcept(Node): def __init__(self, body, handlers, else_, lineno=-1): @@ -2071,10 +2286,13 @@ def accept(self, visitor): return visitor.visitTryExcept(self) - def descr_accept(self, space, w_visitor): - w_callable = space.getattr(w_visitor, space.wrap('visitTryExcept')) - args = Arguments(space, [ self ]) - return space.call_args(w_callable, args) +def descr_TryExcept_accept( space, w_self, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitTryExcept')) + args = Arguments(space, [ w_self ]) + return 
space.call_args(w_callable, args) + +TryExcept.typedef = TypeDef('TryExcept', Node.typedef, + accept=interp2app(descr_TryExcept_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) class TryFinally(Node): def __init__(self, body, final, lineno=-1): @@ -2095,10 +2313,13 @@ def accept(self, visitor): return visitor.visitTryFinally(self) - def descr_accept(self, space, w_visitor): - w_callable = space.getattr(w_visitor, space.wrap('visitTryFinally')) - args = Arguments(space, [ self ]) - return space.call_args(w_callable, args) +def descr_TryFinally_accept( space, w_self, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitTryFinally')) + args = Arguments(space, [ w_self ]) + return space.call_args(w_callable, args) + +TryFinally.typedef = TypeDef('TryFinally', Node.typedef, + accept=interp2app(descr_TryFinally_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) class Tuple(Node): def __init__(self, nodes, lineno=-1): @@ -2120,10 +2341,13 @@ def accept(self, visitor): return visitor.visitTuple(self) - def descr_accept(self, space, w_visitor): - w_callable = space.getattr(w_visitor, space.wrap('visitTuple')) - args = Arguments(space, [ self ]) - return space.call_args(w_callable, args) +def descr_Tuple_accept( space, w_self, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitTuple')) + args = Arguments(space, [ w_self ]) + return space.call_args(w_callable, args) + +Tuple.typedef = TypeDef('Tuple', Node.typedef, + accept=interp2app(descr_Tuple_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) class UnaryAdd(UnaryOp): def __init__(self, expr, lineno=-1): @@ -2143,10 +2367,13 @@ def accept(self, visitor): return visitor.visitUnaryAdd(self) - def descr_accept(self, space, w_visitor): - w_callable = space.getattr(w_visitor, space.wrap('visitUnaryAdd')) - args = Arguments(space, [ self ]) - return space.call_args(w_callable, args) +def descr_UnaryAdd_accept( space, w_self, w_visitor): + w_callable = space.getattr(w_visitor, 
space.wrap('visitUnaryAdd')) + args = Arguments(space, [ w_self ]) + return space.call_args(w_callable, args) + +UnaryAdd.typedef = TypeDef('UnaryAdd', Node.typedef, + accept=interp2app(descr_UnaryAdd_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) class UnarySub(UnaryOp): def __init__(self, expr, lineno=-1): @@ -2166,10 +2393,13 @@ def accept(self, visitor): return visitor.visitUnarySub(self) - def descr_accept(self, space, w_visitor): - w_callable = space.getattr(w_visitor, space.wrap('visitUnarySub')) - args = Arguments(space, [ self ]) - return space.call_args(w_callable, args) +def descr_UnarySub_accept( space, w_self, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitUnarySub')) + args = Arguments(space, [ w_self ]) + return space.call_args(w_callable, args) + +UnarySub.typedef = TypeDef('UnarySub', Node.typedef, + accept=interp2app(descr_UnarySub_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) class While(Node): def __init__(self, test, body, else_, lineno=-1): @@ -2200,10 +2430,13 @@ def accept(self, visitor): return visitor.visitWhile(self) - def descr_accept(self, space, w_visitor): - w_callable = space.getattr(w_visitor, space.wrap('visitWhile')) - args = Arguments(space, [ self ]) - return space.call_args(w_callable, args) +def descr_While_accept( space, w_self, w_visitor): + w_callable = space.getattr(w_visitor, space.wrap('visitWhile')) + args = Arguments(space, [ w_self ]) + return space.call_args(w_callable, args) + +While.typedef = TypeDef('While', Node.typedef, + accept=interp2app(descr_While_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) class Yield(Node): def __init__(self, value, lineno=-1): @@ -2223,10 +2456,13 @@ def accept(self, visitor): return visitor.visitYield(self) - def descr_accept(self, space, w_visitor): - w_callable = space.getattr(w_visitor, space.wrap('visitYield')) - args = Arguments(space, [ self ]) - return space.call_args(w_callable, args) +def descr_Yield_accept( space, w_self, w_visitor): + 
w_callable = space.getattr(w_visitor, space.wrap('visitYield')) + args = Arguments(space, [ w_self ]) + return space.call_args(w_callable, args) + +Yield.typedef = TypeDef('Yield', Node.typedef, + accept=interp2app(descr_Yield_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) class ASTVisitor(object): Modified: pypy/dist/pypy/interpreter/astcompiler/astgen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/astgen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/astgen.py Wed Dec 7 18:07:36 2005 @@ -106,7 +106,7 @@ print >> buf self._gen_visit(buf) print >> buf - self._gen_descr_visit(buf) + self._gen_typedef(buf) buf.seek(0, 0) return buf.read() @@ -218,11 +218,15 @@ print >> buf, " def accept(self, visitor):" print >> buf, " return visitor.visit%s(self)" % self.name - def _gen_descr_visit(self, buf): - print >> buf, " def descr_accept(self, space, w_visitor):" - print >> buf, " w_callable = space.getattr(w_visitor, space.wrap('visit%s'))" % self.name - print >> buf, " args = Arguments(space, [ self ])" - print >> buf, " return space.call_args(w_callable, args)" + def _gen_typedef(self, buf): + print >> buf, "def descr_%s_accept( space, w_self, w_visitor):" %self.name + print >> buf, " w_callable = space.getattr(w_visitor, space.wrap('visit%s'))" % self.name + print >> buf, " args = Arguments(space, [ w_self ])" + print >> buf, " return space.call_args(w_callable, args)" + print >> buf, "" + print >> buf, "%s.typedef = TypeDef('%s', Node.typedef, " % (self.name,self.name) + print >> buf, " accept=interp2app(descr_%s_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))" % self.name + def _gen_additional_methods(self, buf): for key, value in self.additional_methods.iteritems(): @@ -354,7 +358,7 @@ from consts import CO_VARARGS, CO_VARKEYWORDS, OP_ASSIGN from pypy.interpreter.baseobjspace import Wrappable from pypy.interpreter.typedef import TypeDef -from pypy.interpreter.gateway import 
interp2app
+from pypy.interpreter.gateway import interp2app, W_Root, ObjSpace
 from pypy.interpreter.argument import Arguments
 
 def flatten(list):
@@ -406,21 +410,20 @@
     def descr_repr( self, space ):
         return space.wrap( self.__repr__() )
-
-def descr_node_repr( space, w_obj ):
-    return w_obj.descr_repr( space )
-def descr_getChildNodes( space, w_obj ):
-    lst = w_obj.getChildNodes()
-    return space.newlist( lst )
-
-def descr_accept( space, w_obj, w_visitor ):
-    return w_obj.descr_accept( space, w_visitor )
+    def descr_getChildNodes( self, space ):
+        lst = self.getChildNodes()
+        return space.newlist( [ self.wrap( it ) for it in lst ] )
+
+def descr_node_accept( space, w_self, w_visitor ):
+    w_callable = space.getattr(w_visitor, space.wrap('visitNode'))
+    args = Arguments(space, [ w_self ])
+    return space.call_args( w_callable, args )
 
 Node.typedef = TypeDef('ASTNode',
-    #__repr__ = interp2app(descr_node_repr),
-    getChildNodes = interp2app(descr_getChildNodes),
-    accept = interp2app(descr_accept),
+    #__repr__ = interp2app(descr_node_repr, unwrap_spec=['self', ObjSpace] ),
+    getChildNodes = interp2app(Node.descr_getChildNodes, unwrap_spec=[ 'self', ObjSpace ] ),
+    accept = interp2app(descr_node_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
     )

From mwh at codespeak.net  Wed Dec  7 18:19:56 2005
From: mwh at codespeak.net (mwh at codespeak.net)
Date: Wed, 7 Dec 2005 18:19:56 +0100 (CET)
Subject: [pypy-svn] r20854 - in pypy/dist/pypy: annotation rpython
	rpython/lltypesystem translator/c translator/c/src translator/c/test
Message-ID: <20051207171956.E4FB227B6C@code1.codespeak.net>

Author: mwh
Date: Wed Dec  7 18:19:53 2005
New Revision: 20854

Modified:
   pypy/dist/pypy/annotation/binaryop.py
   pypy/dist/pypy/annotation/bookkeeper.py
   pypy/dist/pypy/annotation/model.py
   pypy/dist/pypy/annotation/unaryop.py
   pypy/dist/pypy/rpython/lltypesystem/lltype.py
   pypy/dist/pypy/rpython/rint.py
   pypy/dist/pypy/translator/c/primitive.py
   pypy/dist/pypy/translator/c/src/int.h
   pypy/dist/pypy/translator/c/src/pyobj.h
   pypy/dist/pypy/translator/c/test/test_annotated.py
Log:
(mwh, johahn)
Support for long long signed and unsigned integers in the annotator,
rtyper and c backend.  SomeInteger now has a 'size' attribute measured
in 'sizeof(long)'s (this isn't terribly nice, no).  the other changes
should be /fairly/ uncontroversial :)

Modified: pypy/dist/pypy/annotation/binaryop.py
==============================================================================
--- pypy/dist/pypy/annotation/binaryop.py	(original)
+++ pypy/dist/pypy/annotation/binaryop.py	Wed Dec  7 18:19:53 2005
@@ -222,7 +222,8 @@
     def union((int1, int2)):
         unsigned = int1.unsigned or int2.unsigned
         return SomeInteger(nonneg = unsigned or (int1.nonneg and int2.nonneg),
-                           unsigned=unsigned)
+                           unsigned=unsigned,
+                           size = max(int1.size, int2.size))
 
     or_ = xor = add = mul = _clone(union, [])
     add_ovf = mul_ovf = _clone(union, [OverflowError])
@@ -235,28 +236,33 @@
     truediv_ovf = _clone(truediv, [ZeroDivisionError, OverflowError])
 
     def sub((int1, int2)):
-        return SomeInteger(unsigned = int1.unsigned or int2.unsigned)
+        return SomeInteger(unsigned = int1.unsigned or int2.unsigned,
+                           size = max(int1.size, int2.size))
     sub.can_only_throw = []
     sub_ovf = _clone(sub, [OverflowError])
 
     def and_((int1, int2)):
         unsigned = int1.unsigned or int2.unsigned
         return SomeInteger(nonneg = unsigned or int1.nonneg or int2.nonneg,
-                           unsigned = unsigned)
+                           unsigned = unsigned,
+                           size = max(int1.size, int2.size))
     and_.can_only_throw = []
 
     def lshift((int1, int2)):
         if int1.unsigned:
             return SomeInteger(unsigned=True)
-        return SomeInteger(nonneg = int1.nonneg)
+        return SomeInteger(nonneg = int1.nonneg,
+                           size = max(int1.size, int2.size))
    lshift.can_only_throw = [ValueError]
    rshift = lshift
    lshift_ovf = _clone(lshift, [ValueError, OverflowError])

    def pow((int1, int2), obj3):
        if int1.unsigned or int2.unsigned or getattr(obj3, 'unsigned', False):
-            return SomeInteger(unsigned=True)
-        return SomeInteger(nonneg = int1.nonneg)
+ return SomeInteger(unsigned=True, + size = max(int1.size, int2.size)) + return SomeInteger(nonneg = int1.nonneg, + size = max(int1.size, int2.size)) pow.can_only_throw = [ZeroDivisionError] pow_ovf = _clone(pow, [ZeroDivisionError, OverflowError]) Modified: pypy/dist/pypy/annotation/bookkeeper.py ============================================================================== --- pypy/dist/pypy/annotation/bookkeeper.py (original) +++ pypy/dist/pypy/annotation/bookkeeper.py Wed Dec 7 18:19:53 2005 @@ -17,7 +17,7 @@ from pypy.annotation.dictdef import DictDef, MOST_GENERAL_DICTDEF from pypy.annotation import description from pypy.interpreter.argument import Arguments, ArgErr -from pypy.rpython.rarithmetic import r_uint +from pypy.rpython.rarithmetic import r_uint, r_ulonglong, r_longlong from pypy.rpython.objectmodel import r_dict from pypy.tool.algo.unionfind import UnionFind from pypy.rpython.lltypesystem import lltype @@ -306,6 +306,10 @@ result = SomeInteger(nonneg = x>=0) elif tp is r_uint: result = SomeInteger(nonneg = True, unsigned = True) + elif tp is r_ulonglong: + result = SomeInteger(nonneg = True, unsigned = True, size = 2) + elif tp is r_longlong: + result = SomeInteger(nonneg = x>0, size = 2) elif issubclass(tp, str): # py.lib uses annotated str subclasses if len(x) == 1: result = SomeChar() @@ -468,6 +472,10 @@ return SomeInteger() elif t is r_uint: return SomeInteger(nonneg = True, unsigned = True) + elif t is r_ulonglong: + return SomeInteger(nonneg = True, unsigned = True, size = 2) + elif t is r_longlong: + return SomeInteger(size = 2) elif issubclass(t, str): # py.lib uses annotated str subclasses return SomeString() elif t is float: Modified: pypy/dist/pypy/annotation/model.py ============================================================================== --- pypy/dist/pypy/annotation/model.py (original) +++ pypy/dist/pypy/annotation/model.py Wed Dec 7 18:19:53 2005 @@ -158,9 +158,15 @@ class SomeInteger(SomeFloat): "Stands for an object which is 
known to be an integer." knowntype = int - def __init__(self, nonneg=False, unsigned=False): + # size is in multiples of C's sizeof(long)! + def __init__(self, nonneg=False, unsigned=False, size=1): self.nonneg = unsigned or nonneg self.unsigned = unsigned # pypy.rpython.rarithmetic.r_uint + self.size = size + + def fmt_size(self, s): + if s != 1: + return str(s) class SomeBool(SomeInteger): @@ -168,6 +174,7 @@ knowntype = bool nonneg = True unsigned = False + size = 1 def __init__(self): pass @@ -470,7 +477,9 @@ (s_None, lltype.Void), # also matches SomeImpossibleValue() (SomeBool(), lltype.Bool), (SomeInteger(), lltype.Signed), + (SomeInteger(size=2), lltype.SignedLongLong), (SomeInteger(nonneg=True, unsigned=True), lltype.Unsigned), + (SomeInteger(nonneg=True, unsigned=True, size=2), lltype.UnsignedLongLong), (SomeFloat(), lltype.Float), (SomeChar(), lltype.Char), (SomeUnicodeCodePoint(), lltype.UniChar), Modified: pypy/dist/pypy/annotation/unaryop.py ============================================================================== --- pypy/dist/pypy/annotation/unaryop.py (original) +++ pypy/dist/pypy/annotation/unaryop.py Wed Dec 7 18:19:53 2005 @@ -198,8 +198,8 @@ def invert(self): if self.unsigned: - return SomeInteger(unsigned=True) - return SomeInteger() + return SomeInteger(unsigned=True, size=self.size) + return SomeInteger(size=self.size) invert.can_only_throw = [] @@ -213,8 +213,8 @@ def neg(self): if self.unsigned: - return SomeInteger(unsigned=True) - return SomeInteger() + return SomeInteger(unsigned=True, size=self.size) + return SomeInteger(size=self.size) neg.can_only_throw = [] neg_ovf = _clone(neg, [OverflowError]) @@ -222,7 +222,7 @@ def abs(self): if self.unsigned: return self - return SomeInteger(nonneg=True) + return SomeInteger(nonneg=True, size=self.size) abs.can_only_throw = [] abs_ovf = _clone(abs, [OverflowError]) Modified: pypy/dist/pypy/rpython/lltypesystem/lltype.py 
============================================================================== --- pypy/dist/pypy/rpython/lltypesystem/lltype.py (original) +++ pypy/dist/pypy/rpython/lltypesystem/lltype.py Wed Dec 7 18:19:53 2005 @@ -1,6 +1,6 @@ import weakref import py -from pypy.rpython.rarithmetic import r_uint +from pypy.rpython.rarithmetic import r_uint, r_ulonglong, r_longlong from pypy.tool.uid import Hashable from pypy.tool.tls import tlsobject from types import NoneType @@ -416,7 +416,9 @@ Signed = Primitive("Signed", 0) +SignedLongLong = Primitive("SignedLongLong", r_longlong(0)) Unsigned = Primitive("Unsigned", r_uint(0)) +UnsignedLongLong = Primitive("UnsignedLongLong", r_ulonglong(0)) Float = Primitive("Float", 0.0) Char = Primitive("Char", '\x00') Bool = Primitive("Bool", False) @@ -469,6 +471,10 @@ return Bool if tp is r_uint: return Unsigned + if tp is r_ulonglong: + return UnsignedLongLong + if tp is r_longlong: + return SignedLongLong if tp is float: return Float if tp is str: Modified: pypy/dist/pypy/rpython/rint.py ============================================================================== --- pypy/dist/pypy/rpython/rint.py (original) +++ pypy/dist/pypy/rpython/rint.py Wed Dec 7 18:19:53 2005 @@ -3,10 +3,11 @@ from pypy.annotation import model as annmodel from pypy.objspace.flow.objspace import op_appendices from pypy.rpython.lltypesystem.lltype import Signed, Unsigned, Bool, Float, \ - Void, Char, UniChar, GcArray, malloc, Array, pyobjectptr + Void, Char, UniChar, GcArray, malloc, Array, pyobjectptr, \ + UnsignedLongLong, SignedLongLong from pypy.rpython.rmodel import IntegerRepr, inputconst from pypy.rpython.robject import PyObjRepr, pyobj_repr -from pypy.rpython.rarithmetic import intmask, r_uint +from pypy.rpython.rarithmetic import intmask, r_uint, r_ulonglong, r_longlong from pypy.rpython.error import TyperError from pypy.rpython.rmodel import log @@ -14,14 +15,24 @@ class __extend__(annmodel.SomeInteger): def rtyper_makerepr(self, rtyper): if 
self.unsigned: - return unsigned_repr + if self.size == 2: + return unsignedlonglong_repr + else: + assert self.size == 1 + return unsigned_repr else: - return signed_repr + if self.size == 2: + return signedlonglong_repr + else: + assert self.size == 1 + return signed_repr def rtyper_makekey(self): - return self.__class__, self.unsigned + return self.__class__, self.unsigned, self.size signed_repr = IntegerRepr(Signed, 'int_') +signedlonglong_repr = IntegerRepr(SignedLongLong, 'llong_') unsigned_repr = IntegerRepr(Unsigned, 'uint_') +unsignedlonglong_repr = IntegerRepr(UnsignedLongLong, 'ullong_') class __extend__(pairtype(IntegerRepr, IntegerRepr)): @@ -195,6 +206,10 @@ return intmask(value) if self.lowleveltype == Unsigned: return r_uint(value) + if self.lowleveltype == UnsignedLongLong: + return r_ulonglong(value) + if self.lowleveltype == SignedLongLong: + return r_longlong(value) raise NotImplementedError def get_ll_eq_function(self): @@ -262,7 +277,7 @@ return vlist[0] def rtype_int(self, hop): - if self.lowleveltype == Unsigned: + if self.lowleveltype in (Unsigned, UnsignedLongLong): raise TyperError("use intmask() instead of int(r_uint(...))") vlist = hop.inputargs(self) return vlist[0] @@ -403,24 +418,32 @@ # # _________________________ Conversions _________________________ + +py_to_ll_conversion_functions = { + UnsignedLongLong: ('RPyLong_AsUnsignedLongLong', lambda pyo: r_ulonglong(pyo._obj.value)), + SignedLongLong: ('RPyLong_AsLongLong', lambda pyo: r_longlong(pyo._obj.value)), + Unsigned: ('PyLong_AsUnsignedLong', lambda pyo: r_uint(pyo._obj.value)), + Signed: ('PyInt_AsLong', lambda pyo: int(pyo._obj.value)) +} + +ll_to_py_conversion_functions = { + UnsignedLongLong: ('PyLong_FromUnsignedLongLong', lambda i: pyobjectptr(i)), + SignedLongLong: ('PyLong_FromLongLong', lambda i: pyobjectptr(i)), + Unsigned: ('PyLong_FromUnsignedLong', lambda i: pyobjectptr(i)), + Signed: ('PyLong_FromLong', lambda i: pyobjectptr(i)), +} + + class 
__extend__(pairtype(PyObjRepr, IntegerRepr)): def convert_from_to((r_from, r_to), v, llops): - if r_to.lowleveltype == Unsigned: - return llops.gencapicall('PyLong_AsUnsignedLong', [v], - resulttype=Unsigned) - if r_to.lowleveltype == Signed: - return llops.gencapicall('PyInt_AsLong', [v], - resulttype=Signed, - _callable = lambda pyo: int(pyo._obj.value)) - return NotImplemented + tolltype = r_to.lowleveltype + fnname, callable = py_to_ll_conversion_functions[tolltype] + return llops.gencapicall(fnname, [v], + resulttype=r_to, _callable=callable) class __extend__(pairtype(IntegerRepr, PyObjRepr)): def convert_from_to((r_from, r_to), v, llops): - if r_from.lowleveltype == Unsigned: - return llops.gencapicall('PyLong_FromUnsignedLong', [v], - resulttype=pyobj_repr) - if r_from.lowleveltype == Signed: - # xxx put in table - return llops.gencapicall('PyInt_FromLong', [v], - resulttype=pyobj_repr, _callable = lambda i: pyobjectptr(i)) - return NotImplemented + fromlltype = r_from.lowleveltype + fnname, callable = ll_to_py_conversion_functions[fromlltype] + return llops.gencapicall(fnname, [v], + resulttype=pyobj_repr, _callable=callable) Modified: pypy/dist/pypy/translator/c/primitive.py ============================================================================== --- pypy/dist/pypy/translator/c/primitive.py (original) +++ pypy/dist/pypy/translator/c/primitive.py Wed Dec 7 18:19:53 2005 @@ -16,6 +16,13 @@ assert value >= 0 return '%dUL' % value +def name_unsignedlonglong(value): + assert value >= 0 + return '%dULL' % value + +def name_signedlonglong(value): + return '%dLL' % value + def isinf(x): return x != 0.0 and x / 2 == x @@ -52,7 +59,9 @@ PrimitiveName = { Signed: name_signed, + SignedLongLong: name_signedlonglong, Unsigned: name_unsigned, + UnsignedLongLong: name_unsignedlonglong, Float: name_float, Char: name_char, UniChar: name_unichar, @@ -63,7 +72,9 @@ PrimitiveType = { Signed: 'long @', + SignedLongLong: 'long long @', Unsigned: 'unsigned long @', + 
UnsignedLongLong: 'unsigned long long @', Float: 'double @', Char: 'char @', UniChar: 'unsigned int @', @@ -74,7 +85,9 @@ PrimitiveErrorValue = { Signed: '-1', + SignedLongLong: '-1LL', Unsigned: '((unsigned) -1)', + UnsignedLongLong: '((unsigned long long) -1)', Float: '-1.0', Char: '((char) -1)', UniChar: '((unsigned) -1)', Modified: pypy/dist/pypy/translator/c/src/int.h ============================================================================== --- pypy/dist/pypy/translator/c/src/int.h (original) +++ pypy/dist/pypy/translator/c/src/int.h Wed Dec 7 18:19:53 2005 @@ -275,3 +275,6 @@ #define OP_UINT_AND OP_INT_AND #define OP_UINT_OR OP_INT_OR #define OP_UINT_XOR OP_INT_XOR + +#define OP_ULLONG_MUL OP_INT_MUL +#define OP_LLONG_MUL OP_INT_MUL Modified: pypy/dist/pypy/translator/c/src/pyobj.h ============================================================================== --- pypy/dist/pypy/translator/c/src/pyobj.h (original) +++ pypy/dist/pypy/translator/c/src/pyobj.h Wed Dec 7 18:19:53 2005 @@ -215,3 +215,26 @@ if (!(r=decode_arg(fname, pos, name, vargs, vkwds, def))) CFAIL(err) #define OP_CHECK_NO_MORE_ARG(fname, n, vargs, r, err) \ if (check_no_more_arg(fname, n, vargs) < 0) CFAIL(err) + +unsigned long long RPyLong_AsUnsignedLongLong(PyObject *v); +long long RPyLong_AsLongLong(PyObject *v); + +#ifndef PYPY_NOT_MAIN_FILE + +unsigned long long RPyLong_AsUnsignedLongLong(PyObject *v) +{ + if (PyInt_Check(v)) + return PyInt_AsLong(v); + else + return PyLong_AsUnsignedLongLong(v); +} + +long long RPyLong_AsLongLong(PyObject *v) +{ + if (PyInt_Check(v)) + return PyInt_AsLong(v); + else + return PyLong_AsLongLong(v); +} + +#endif Modified: pypy/dist/pypy/translator/c/test/test_annotated.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_annotated.py (original) +++ pypy/dist/pypy/translator/c/test/test_annotated.py Wed Dec 7 18:19:53 2005 @@ -233,3 +233,15 @@ assert fn(2) == 42 assert fn(-2) == 789 
 assert fn(-3) == 42
+
+    def test_long_long(self):
+        from pypy.rpython.rarithmetic import r_ulonglong, r_longlong
+        def f(i=r_ulonglong):
+            return 4*i
+        fn = self.getcompiled(f, view=False)
+        assert fn(sys.maxint) == 4*sys.maxint
+
+        def g(i=r_longlong):
+            return 4*i
+        gn = self.getcompiled(g, view=False)
+        assert gn(sys.maxint) == 4*sys.maxint

From ericvrp at codespeak.net  Wed Dec  7 18:30:26 2005
From: ericvrp at codespeak.net (ericvrp at codespeak.net)
Date: Wed, 7 Dec 2005 18:30:26 +0100 (CET)
Subject: [pypy-svn] r20855 - in pypy/dist/pypy/jit: . test
Message-ID: <20051207173026.56BEC27DB5@code1.codespeak.net>

Author: ericvrp
Date: Wed Dec  7 18:30:24 2005
New Revision: 20855

Added:
   pypy/dist/pypy/jit/opcode.py
      - copied, changed from r20848, pypy/dist/pypy/jit/bytecode.py
Removed:
   pypy/dist/pypy/jit/bytecode.py
Modified:
   pypy/dist/pypy/jit/test/test_tl.py
   pypy/dist/pypy/jit/tl.py
Log:
(arre, ericvrp)
Added simple source to bytecode parser/compiler.

Modified: pypy/dist/pypy/jit/test/test_tl.py
==============================================================================
--- pypy/dist/pypy/jit/test/test_tl.py	(original)
+++ pypy/dist/pypy/jit/test/test_tl.py	Wed Dec  7 18:30:24 2005
@@ -1,12 +1,8 @@
 import py
 import operator
-from pypy.jit.tl import interp
-from pypy.jit.bytecode import *
+from pypy.jit.tl import interp, compile
+from pypy.jit.opcode import *
 
-#from pypy.rpython.l3interp import l3interp
-#from pypy.rpython.l3interp import model
-#from pypy.rpython.l3interp.model import Op
-#from pypy.translator.c.test.test_genc import compile
 from pypy.translator.translator import TranslationContext
 from pypy.annotation import policy
 
@@ -25,44 +21,44 @@
         builder.import_module()
         return builder.get_entry_point()
 
-def compile(insn):
+def list2bytecode(insn):
     return ''.join([chr(i & 0xff) for i in insn])
 
 # actual tests go here
 
 def test_tl_push():
-    assert interp(compile([PUSH, 16])) == 16
+    assert interp(list2bytecode([PUSH, 16])) == 16
 
 def test_tl_pop():
-    assert
interp( compile([PUSH,16, PUSH,42, PUSH,100, POP]) ) == 42 + assert interp( list2bytecode([PUSH,16, PUSH,42, PUSH,100, POP]) ) == 42 def test_tl_add(): - assert interp( compile([PUSH,42, PUSH,100, ADD]) ) == 142 - assert interp( compile([PUSH,16, PUSH,42, PUSH,100, ADD]) ) == 142 + assert interp( list2bytecode([PUSH,42, PUSH,100, ADD]) ) == 142 + assert interp( list2bytecode([PUSH,16, PUSH,42, PUSH,100, ADD]) ) == 142 def test_tl_error(): - py.test.raises(IndexError, interp,compile([POP])) - py.test.raises(IndexError, interp,compile([ADD])) - py.test.raises(IndexError, interp,compile([PUSH,100, ADD]) ) + py.test.raises(IndexError, interp,list2bytecode([POP])) + py.test.raises(IndexError, interp,list2bytecode([ADD])) + py.test.raises(IndexError, interp,list2bytecode([PUSH,100, ADD]) ) def test_tl_invalid_codetype(): py.test.raises(TypeError, interp,[INVALID]) def test_tl_invalid_bytecode(): - py.test.raises(RuntimeError, interp,compile([INVALID])) + py.test.raises(RuntimeError, interp,list2bytecode([INVALID])) def test_tl_translatable(): - code = compile([PUSH,42, PUSH,100, ADD]) + code = list2bytecode([PUSH,42, PUSH,100, ADD]) fn = translate(interp, [str]) assert interp(code) == fn(code) def test_swap(): code = [PUSH,42, PUSH, 84] - assert interp(compile(code)) == 84 + assert interp(list2bytecode(code)) == 84 code.append(SWAP) - assert interp(compile(code)) == 42 + assert interp(list2bytecode(code)) == 42 code.append(POP) - assert interp(compile(code)) == 84 + assert interp(list2bytecode(code)) == 84 def test_pick(): values = [7, 8, 9] @@ -71,7 +67,7 @@ code.extend([PUSH, v]) for i, v in enumerate(values): - assert interp(compile(code + [PICK,i])) == v + assert interp(list2bytecode(code + [PICK,i])) == v def test_put(): values = [1,2,7,-3] @@ -80,7 +76,7 @@ code += [PUSH,v, PUT,i] for i, v in enumerate(values): - assert interp(compile(code + [PICK,i])) == v + assert interp(list2bytecode(code + [PICK,i])) == v ops = [ (ADD, operator.add, ((2, 4), (1, 1), (-1, 1))), 
(SUB, operator.sub, ((2, 4), (4, 2), (1, 1))), @@ -98,35 +94,68 @@ for insn, pyop, values in ops: for first, second in values: code = [PUSH, first, PUSH, second, insn] - assert interp(compile(code)) == pyop(first, second) + assert interp(list2bytecode(code)) == pyop(first, second) def test_branch_forward(): - assert interp(compile([PUSH,1, PUSH,0, BR_COND,2, PUSH,-1])) == -1 - assert interp(compile([PUSH,1, PUSH,1, BR_COND,2, PUSH,-1])) == 1 - assert interp(compile([PUSH,1, PUSH,-1, BR_COND,2, PUSH,-1])) == 1 + assert interp(list2bytecode([PUSH,1, PUSH,0, BR_COND,2, PUSH,-1])) == -1 + assert interp(list2bytecode([PUSH,1, PUSH,1, BR_COND,2, PUSH,-1])) == 1 + assert interp(list2bytecode([PUSH,1, PUSH,-1, BR_COND,2, PUSH,-1])) == 1 def test_branch_backwards(): - assert interp(compile([PUSH,0, PUSH,1, BR_COND,6, PUSH,-1, PUSH,3, BR_COND,4, PUSH,2, BR_COND,-10])) == -1 + assert interp(list2bytecode([PUSH,0, PUSH,1, BR_COND,6, PUSH,-1, PUSH,3, BR_COND,4, PUSH,2, BR_COND,-10])) == -1 def test_branch0(): - assert interp(compile([PUSH,7, PUSH,1, BR_COND,0])) == 7 + assert interp(list2bytecode([PUSH,7, PUSH,1, BR_COND,0])) == 7 def test_exit(): - assert py.test.raises(IndexError, interp, compile([EXIT])) - assert interp(compile([PUSH,7, EXIT, PUSH,5])) == 7 + assert py.test.raises(IndexError, interp, list2bytecode([EXIT])) + assert interp(list2bytecode([PUSH,7, EXIT, PUSH,5])) == 7 def test_rot(): code = [PUSH,1, PUSH,2, PUSH,3, ROT,3] - assert interp(compile(code)) == 2 - assert interp(compile(code + [POP])) == 1 - assert interp(compile(code + [POP, POP])) == 3 + assert interp(list2bytecode(code)) == 2 + assert interp(list2bytecode(code + [POP])) == 1 + assert interp(list2bytecode(code + [POP, POP])) == 3 - py.test.raises(IndexError, interp, compile([PUSH,1, PUSH,2, PUSH,3, ROT,4])) + py.test.raises(IndexError, interp, list2bytecode([PUSH,1, PUSH,2, PUSH,3, ROT,4])) def test_call_ret(): - assert py.test.raises(IndexError, interp, compile([RETURN])) - assert 
interp(compile([PUSH,6, RETURN, PUSH,4, EXIT, PUSH,9])) == 9 - assert interp(compile([CALL,0])) == 2 - - assert interp(compile([PUSH,1, CALL,5, PUSH,2, CALL,2, EXIT, RETURN, ROT,3, ADD, SWAP, RETURN])) == 3 + assert py.test.raises(IndexError, interp, list2bytecode([RETURN])) + assert interp(list2bytecode([PUSH,6, RETURN, PUSH,4, EXIT, PUSH,9])) == 9 + assert interp(list2bytecode([CALL,0])) == 2 + assert interp(list2bytecode([PUSH,1, CALL,5, PUSH,2, CALL,2, EXIT, RETURN, ROT,3, ADD, SWAP, RETURN])) == 3 + +def test_compile_branch_backwards(): + code = compile(""" +main: + PUSH 0 + PUSH 1 + BR_COND somename +label1: + PUSH -1 + PUSH 3 + BR_COND end +somename: + PUSH 2 + BR_COND label1 +end: +""") + assert code == list2bytecode([PUSH,0, PUSH,1, BR_COND,6, PUSH,-1, PUSH,3, BR_COND,4, PUSH,2, BR_COND,-10]) + +def test_compile_call_ret(): + code = compile("""PUSH 1 + CALL func1 + PUSH 2 + CALL func2 + EXIT + +func1: + RETURN + +func2: + ROT 3 + ADD + SWAP + RETURN""") + assert code == list2bytecode([PUSH,1, CALL,5, PUSH,2, CALL,2, EXIT, RETURN, ROT,3, ADD, SWAP, RETURN]) Modified: pypy/dist/pypy/jit/tl.py ============================================================================== --- pypy/dist/pypy/jit/tl.py (original) +++ pypy/dist/pypy/jit/tl.py Wed Dec 7 18:30:24 2005 @@ -1,7 +1,8 @@ '''Toy Language''' import py -from bytecode import * +from opcode import * +import opcode def char2int(c): t = ord(c) @@ -109,3 +110,28 @@ raise RuntimeError("unknown opcode: " + str(opcode)) return stack[-1] + +def compile(code=''): + bytecode = [] + labels = {} #[key] = pc + label_usage = [] #(name, pc) + for s in code.split('\n'): + for comment in '; # //'.split(): + s = s.split(comment, 1)[0] + s = s.strip() + if not s: + continue + t = s.split() + if t[0].endswith(':'): + labels[ t[0][:-1] ] = len(bytecode) + continue + bytecode.append( opcode.names[ t[0] ] ) + if len(t) > 1: + try: + bytecode.append( int(t[1]) ) + except ValueError: + label_usage.append( (t[1], len(bytecode)) ) + 
bytecode.append( 0 ) + for label, pc in label_usage: + bytecode[pc] = labels[label] - pc - 1 + return ''.join([chr(i & 0xff) for i in bytecode]) From ale at codespeak.net Wed Dec 7 18:32:54 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Wed, 7 Dec 2005 18:32:54 +0100 (CET) Subject: [pypy-svn] r20856 - in pypy/dist/pypy: module/_socket module/_socket/test translator/c/src translator/c/test Message-ID: <20051207173254.6B77027DBA@code1.codespeak.net> Author: ale Date: Wed Dec 7 18:32:52 2005 New Revision: 20856 Modified: pypy/dist/pypy/module/_socket/interp_socket.py pypy/dist/pypy/module/_socket/test/test_socket2.py pypy/dist/pypy/translator/c/src/ll__socket.h pypy/dist/pypy/translator/c/test/test_ext__socket.py Log: (nik,ale) Intermediate checkin - caring about errors Modified: pypy/dist/pypy/module/_socket/interp_socket.py ============================================================================== --- pypy/dist/pypy/module/_socket/interp_socket.py (original) +++ pypy/dist/pypy/module/_socket/interp_socket.py Wed Dec 7 18:32:52 2005 @@ -992,4 +992,5 @@ [*] not available on all platforms!""", __new__ = descr_socket_new, + ** socketmethods ) Modified: pypy/dist/pypy/module/_socket/test/test_socket2.py ============================================================================== --- pypy/dist/pypy/module/_socket/test/test_socket2.py (original) +++ pypy/dist/pypy/module/_socket/test/test_socket2.py Wed Dec 7 18:32:52 2005 @@ -286,7 +286,17 @@ else: assert False - def test_newsocket(self): import socket s = socket.socket() + + def test_newsocket_error(self): + import socket + import errno + try: + s = socket.socket(1001,socket.SOCK_STREAM,0) + except socket.error, ex: + print ex,ex.args[0] + assert ex.args[0] == errno.EAFNOSUPPORT + else: + assert 0 Modified: pypy/dist/pypy/translator/c/src/ll__socket.h ============================================================================== --- pypy/dist/pypy/translator/c/src/ll__socket.h (original) +++ 
pypy/dist/pypy/translator/c/src/ll__socket.h Wed Dec 7 18:32:52 2005 @@ -78,7 +78,18 @@ int LL__socket_newsocket(int family, int type, int protocol) { - return socket(family, type, protocol); + int fd; + + fd = socket(family, type, protocol); + +#ifdef MS_WINDOWS + if (fd == INVALID_SOCKET) +#else + if (fd < 0) +#endif + { + return -1; + } } /* ____________________________________________________________________________ */ Modified: pypy/dist/pypy/translator/c/test/test_ext__socket.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_ext__socket.py (original) +++ pypy/dist/pypy/translator/c/test/test_ext__socket.py Wed Dec 7 18:32:52 2005 @@ -77,3 +77,12 @@ f1 = compile(does_stuff, []) res = f1() assert isinstance(res, int) + +def test_newsocket_error(): + from pypy.module._socket.rpython import rsocket + def does_stuff(): + return rsocket.newsocket(1001, _socket.SOCK_STREAM, 0) + f1 = compile(does_stuff, []) + res = f1() + assert res == -1 + From ericvrp at codespeak.net Wed Dec 7 18:36:21 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Wed, 7 Dec 2005 18:36:21 +0100 (CET) Subject: [pypy-svn] r20857 - pypy/dist/pypy/jit/test Message-ID: <20051207173621.CD1C327DB5@code1.codespeak.net> Author: ericvrp Date: Wed Dec 7 18:36:20 2005 New Revision: 20857 Modified: pypy/dist/pypy/jit/test/test_tl.py Log: (arre, ericvrp) Added tests for sourcecode comments Modified: pypy/dist/pypy/jit/test/test_tl.py ============================================================================== --- pypy/dist/pypy/jit/test/test_tl.py (original) +++ pypy/dist/pypy/jit/test/test_tl.py Wed Dec 7 18:36:20 2005 @@ -136,10 +136,13 @@ PUSH -1 PUSH 3 BR_COND end -somename: - PUSH 2 - BR_COND label1 -end: +somename: ; + PUSH 2 // + BR_COND label1// +end:// comment + // +// +//comment """) assert code == list2bytecode([PUSH,0, PUSH,1, BR_COND,6, PUSH,-1, PUSH,3, BR_COND,4, PUSH,2, BR_COND,-10]) @@ -151,10 
+154,10 @@ EXIT func1: - RETURN + RETURN # comment func2: - ROT 3 + ROT 3 ;comment ADD SWAP RETURN""") From nik at codespeak.net Wed Dec 7 18:39:20 2005 From: nik at codespeak.net (nik at codespeak.net) Date: Wed, 7 Dec 2005 18:39:20 +0100 (CET) Subject: [pypy-svn] r20858 - pypy/dist/pypy/translator/c/src Message-ID: <20051207173920.3195627DB5@code1.codespeak.net> Author: nik Date: Wed Dec 7 18:39:19 2005 New Revision: 20858 Modified: pypy/dist/pypy/translator/c/src/ll__socket.h Log: added netinet/in.h header apparently required on my particular OS X version Modified: pypy/dist/pypy/translator/c/src/ll__socket.h ============================================================================== --- pypy/dist/pypy/translator/c/src/ll__socket.h (original) +++ pypy/dist/pypy/translator/c/src/ll__socket.h Wed Dec 7 18:39:19 2005 @@ -7,6 +7,7 @@ # include # include # include +# include #endif int LL__socket_ntohs(int htons); From ludal at codespeak.net Wed Dec 7 18:55:43 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Wed, 7 Dec 2005 18:55:43 +0100 (CET) Subject: [pypy-svn] r20859 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20051207175543.9DDC927B40@code1.codespeak.net> Author: ludal Date: Wed Dec 7 18:55:42 2005 New Revision: 20859 Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py pypy/dist/pypy/interpreter/astcompiler/astgen.py Log: mini bug Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.py Wed Dec 7 18:55:42 2005 @@ -61,7 +61,7 @@ def descr_getChildNodes( self, space ): lst = self.getChildNodes() - return space.newlist( [ self.wrap( it ) for it in lst ] ) + return space.newlist( [ space.wrap( it ) for it in lst ] ) def descr_node_accept( space, w_self, w_visitor ): w_callable = space.getattr(w_visitor, space.wrap('visitNode')) Modified: 
pypy/dist/pypy/interpreter/astcompiler/astgen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/astgen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/astgen.py Wed Dec 7 18:55:42 2005 @@ -413,7 +413,7 @@ def descr_getChildNodes( self, space ): lst = self.getChildNodes() - return space.newlist( [ self.wrap( it ) for it in lst ] ) + return space.newlist( [ space.wrap( it ) for it in lst ] ) def descr_node_accept( space, w_self, w_visitor ): w_callable = space.getattr(w_visitor, space.wrap('visitNode')) From ale at codespeak.net Wed Dec 7 19:02:49 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Wed, 7 Dec 2005 19:02:49 +0100 (CET) Subject: [pypy-svn] r20860 - pypy/dist/pypy/module/_socket/test Message-ID: <20051207180249.220ED27B47@code1.codespeak.net> Author: ale Date: Wed Dec 7 19:02:47 2005 New Revision: 20860 Modified: pypy/dist/pypy/module/_socket/test/test_socket2.py Log: (nik,ale) A test for socket errors Modified: pypy/dist/pypy/module/_socket/test/test_socket2.py ============================================================================== --- pypy/dist/pypy/module/_socket/test/test_socket2.py (original) +++ pypy/dist/pypy/module/_socket/test/test_socket2.py Wed Dec 7 19:02:47 2005 @@ -262,6 +262,10 @@ "(_socket): return _socket.getdefaulttimeout()") assert space.unwrap(w_t) is None +def app_test_newsocket_error(): + import socket + raises (socket.error, socket.socket, 10001, socket.SOCK_STREAM, 0) + class AppTestSocket: def setup_class(cls): cls.space = space @@ -290,13 +294,3 @@ import socket s = socket.socket() - def test_newsocket_error(self): - import socket - import errno - try: - s = socket.socket(1001,socket.SOCK_STREAM,0) - except socket.error, ex: - print ex,ex.args[0] - assert ex.args[0] == errno.EAFNOSUPPORT - else: - assert 0 From alastair at codespeak.net Wed Dec 7 19:36:44 2005 From: alastair at codespeak.net (alastair at codespeak.net) Date: Wed, 
7 Dec 2005 19:36:44 +0100 (CET) Subject: [pypy-svn] r20861 - pypy/extradoc/talk/pypy_euworkshop_2005-12-08 Message-ID: <20051207183644.4F2E927B64@code1.codespeak.net> Author: alastair Date: Wed Dec 7 19:36:43 2005 New Revision: 20861 Added: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/pypy-brussels-2005-12-08.odp (contents, props changed) pypy/extradoc/talk/pypy_euworkshop_2005-12-08/pypy-brussels-2005-12-08.pdf (contents, props changed) Log: First draft of slides. Added: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/pypy-brussels-2005-12-08.odp ============================================================================== Binary file. No diff available. Added: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/pypy-brussels-2005-12-08.pdf ============================================================================== Binary file. No diff available. From alastair at codespeak.net Wed Dec 7 21:36:10 2005 From: alastair at codespeak.net (alastair at codespeak.net) Date: Wed, 7 Dec 2005 21:36:10 +0100 (CET) Subject: [pypy-svn] r20862 - pypy/extradoc/talk/pypy_euworkshop_2005-12-08 Message-ID: <20051207203610.9D9EB27B64@code1.codespeak.net> Author: alastair Date: Wed Dec 7 21:36:08 2005 New Revision: 20862 Modified: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/pypy-brussels-2005-12-08.odp pypy/extradoc/talk/pypy_euworkshop_2005-12-08/pypy-brussels-2005-12-08.pdf Log: Getting there - colour and content added. Modified: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/pypy-brussels-2005-12-08.odp ============================================================================== Binary files. No diff available. Modified: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/pypy-brussels-2005-12-08.pdf ============================================================================== Binary files. No diff available. 
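An aside on the tl.py assembler from r20856/r20857 above: the new compile() is a classic two-pass assembler. The first pass records label addresses and leaves a placeholder operand wherever a label is used before it is defined; the second pass back-patches the relative branch offsets. A standalone sketch of the same scheme (the opcode numbers here are made up for the sketch; the real code looks names up in PyPy's opcode table and packs the result into a byte string):

```python
# Two-pass assembly with back-patched relative branches, as in tl.py's
# compile().  Opcode numbers are hypothetical stand-ins.
NAMES = {'PUSH': 1, 'BR_COND': 2}

def assemble(source):
    bytecode = []
    labels = {}        # label name -> bytecode position
    fixups = []        # (label name, position of placeholder operand)
    for line in source.splitlines():
        for marker in (';', '#', '//'):        # strip comments (cf. r20857)
            line = line.split(marker, 1)[0]
        line = line.strip()
        if not line:
            continue
        parts = line.split()
        if parts[0].endswith(':'):             # label definition
            labels[parts[0][:-1]] = len(bytecode)
            continue
        bytecode.append(NAMES[parts[0]])
        if len(parts) > 1:
            try:
                bytecode.append(int(parts[1]))
            except ValueError:                 # operand is a label name
                fixups.append((parts[1], len(bytecode)))
                bytecode.append(0)             # placeholder for pass two
    for label, pos in fixups:                  # second pass: back-patch
        bytecode[pos] = labels[label] - pos - 1  # offset relative to next pc
    return bytecode
```

A forward branch comes out with a positive offset and a backward branch with a negative one, matching literals like `BR_COND,-10` in the tests above.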
From alastair at codespeak.net Wed Dec 7 22:34:15 2005 From: alastair at codespeak.net (alastair at codespeak.net) Date: Wed, 7 Dec 2005 22:34:15 +0100 (CET) Subject: [pypy-svn] r20863 - pypy/extradoc/talk/pypy_euworkshop_2005-12-08 Message-ID: <20051207213415.8FFCA27B3E@code1.codespeak.net> Author: alastair Date: Wed Dec 7 22:34:09 2005 New Revision: 20863 Modified: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/pypy-brussels-2005-12-08.odp pypy/extradoc/talk/pypy_euworkshop_2005-12-08/pypy-brussels-2005-12-08.pdf Log: Final version. Modified: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/pypy-brussels-2005-12-08.odp ============================================================================== Binary files. No diff available. Modified: pypy/extradoc/talk/pypy_euworkshop_2005-12-08/pypy-brussels-2005-12-08.pdf ============================================================================== Binary files. No diff available. From cfbolz at codespeak.net Thu Dec 8 10:55:01 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 8 Dec 2005 10:55:01 +0100 (CET) Subject: [pypy-svn] r20876 - pypy/extradoc/sprintinfo/gothenburg-2005 Message-ID: <20051208095501.AB97627B68@code1.codespeak.net> Author: cfbolz Date: Thu Dec 8 10:55:01 2005 New Revision: 20876 Modified: pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt Log: plans for today Modified: pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt ============================================================================== --- pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt (original) +++ pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt Thu Dec 8 10:55:01 2005 @@ -30,8 +30,8 @@ see doc/discussion/draft-jit-ideas.txt -- toy target intepreter -- low-level graphs abstract interpreter +- toy target intepreter + parser/assembler (DONE) +- low-level graphs abstract interpreter (IN-PROGRESS) (- L3 interpreter) Stackless @@ -41,7 +41,7 @@ Expose the low-level switching facilities: - write RPython 
structures (tasklet, channel) and basic - functions for switching + functions for switching (IN-PROGRESS, Richard wants annotation help) - add an app-level interface (mixed module) - implement support structures - a deque module exists already which can be used for channel queues @@ -59,12 +59,12 @@ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ (Nik, Anders L) -- work on _socket +- work on _socket (IN-PROGRESS) - this exposes limitations in our way to glue to C libraries, think/design solutions (Johan, Michael) -- support more basic integer types. Decide on the proper +- support more basic integer types. Decide on the proper (IN-PROGRESS) design (explicit spelling of sizes, or the long-long way?) note that we already have functions which return 64 bit values. @@ -79,7 +79,7 @@ ~~~~~~~~~~~~~~~~~~~~~~~~~~~ - look into the perfomance and code path for function calls - in our intepreter + in our intepreter (Arre, Eric) - look into converting the indirect call in the eval loop for bytecode dispatch into a switch: probably needs a representation choice in the RTyper, a transformation, and integer exitswitch implementation as switch in the backends @@ -90,7 +90,7 @@ (Ludovic, Adrien) - export the AST nodes hierarchy to application level through the - compiler module + compiler module (IN-PROGRESS) - export the Grammar representation and provide means to (at least) add new rules (long) which involve providing ST->AST transformation functions From nik at codespeak.net Thu Dec 8 11:16:16 2005 From: nik at codespeak.net (nik at codespeak.net) Date: Thu, 8 Dec 2005 11:16:16 +0100 (CET) Subject: [pypy-svn] r20877 - in pypy/dist/pypy/translator/c: src test Message-ID: <20051208101616.31C5327DB9@code1.codespeak.net> Author: nik Date: Thu Dec 8 11:16:14 2005 New Revision: 20877 Modified: pypy/dist/pypy/translator/c/src/ll__socket.h pypy/dist/pypy/translator/c/test/test_ext__socket.py Log: (ale, nik) for now raise OSError from C code on socket errors. more sophisticated error tests. 
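The test added in this commit pins the failure for an invalid address family to errno.EAFNOSUPPORT. The underlying idea — check the errno an implementation reports rather than a message string — can be sketched in modern Python (the diffs use Python 2's `except socket.error, ex` spelling; today `socket.error` is an alias of `OSError`; the exact errno is platform-dependent, though EAFNOSUPPORT is what Linux and OS X report):

```python
import errno
import socket

def newsocket_errno(family, type_, proto):
    """Errno raised when creating a socket, or 0 if creation succeeded."""
    try:
        s = socket.socket(family, type_, proto)
    except OSError as exc:
        return exc.errno
    s.close()
    return 0

# A bogus family such as 10001 is rejected by the socket() system call,
# typically with EAFNOSUPPORT.
bad = newsocket_errno(10001, socket.SOCK_STREAM, 0)
```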
Modified: pypy/dist/pypy/translator/c/src/ll__socket.h ============================================================================== --- pypy/dist/pypy/translator/c/src/ll__socket.h (original) +++ pypy/dist/pypy/translator/c/src/ll__socket.h Thu Dec 8 11:16:14 2005 @@ -89,7 +89,7 @@ if (fd < 0) #endif { - return -1; + RPYTHON_RAISE_OSERROR(errno); } } /* ____________________________________________________________________________ */ Modified: pypy/dist/pypy/translator/c/test/test_ext__socket.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_ext__socket.py (original) +++ pypy/dist/pypy/translator/c/test/test_ext__socket.py Thu Dec 8 11:16:14 2005 @@ -80,9 +80,16 @@ def test_newsocket_error(): from pypy.module._socket.rpython import rsocket - def does_stuff(): - return rsocket.newsocket(1001, _socket.SOCK_STREAM, 0) - f1 = compile(does_stuff, []) - res = f1() - assert res == -1 - + tests = [(1001, _socket.SOCK_STREAM, 0)] + def does_stuff(family, type, protocol): + return rsocket.newsocket(family, type, protocol) + f1 = compile(does_stuff, [int, int, int]) + for args in tests: + try: + f1(*args) + except OSError, ex: + try: + import socket + socket.socket(*args) + except socket.error, ex_socket: + assert ex_socket.args[0] == ex.errno From mwh at codespeak.net Thu Dec 8 11:18:29 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Thu, 8 Dec 2005 11:18:29 +0100 (CET) Subject: [pypy-svn] r20878 - in pypy/dist/pypy: annotation rpython rpython/module translator/c/src Message-ID: <20051208101829.5E43127DB9@code1.codespeak.net> Author: mwh Date: Thu Dec 8 11:18:27 2005 New Revision: 20878 Modified: pypy/dist/pypy/annotation/builtin.py pypy/dist/pypy/rpython/extfunctable.py pypy/dist/pypy/rpython/module/ll_os.py pypy/dist/pypy/rpython/rbuiltin.py pypy/dist/pypy/translator/c/src/int.h Log: ll_os_lseek now returns a long long. this required changing far too many places in the code... 
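Why the return type of lseek matters: file offsets are 64-bit (off_t) even on 32-bit platforms, so a result squeezed through intmask() would silently wrap. A small sketch — seeking past EOF allocates nothing, so this is cheap on any filesystem:

```python
import os
import tempfile

def seek_far(offset):
    fd, path = tempfile.mkstemp()
    try:
        # lseek past EOF is legal and writes no data; the kernel simply
        # returns the new 64-bit file position.
        return os.lseek(fd, offset, os.SEEK_SET)
    finally:
        os.close(fd)
        os.unlink(path)

FIVE_GIB = 5 * 1024 ** 3          # 0x140000000: needs more than 32 bits
# Truncating to 32 bits would mangle the position down to 1 GiB:
assert FIVE_GIB & 0xffffffff == 1024 ** 3
```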
Modified: pypy/dist/pypy/annotation/builtin.py ============================================================================== --- pypy/dist/pypy/annotation/builtin.py (original) +++ pypy/dist/pypy/annotation/builtin.py Thu Dec 8 11:18:27 2005 @@ -85,6 +85,10 @@ return constpropagate(pypy.rpython.rarithmetic.r_uint, [s_obj], SomeInteger(nonneg=True, unsigned=True)) +def restricted_longlong(s_obj): # for r_uint + return constpropagate(pypy.rpython.rarithmetic.r_longlong, [s_obj], + SomeInteger(size=2)) + def builtin_float(s_obj): return constpropagate(float, [s_obj], SomeFloat()) @@ -314,6 +318,7 @@ BUILTIN_ANALYZERS[original] = value BUILTIN_ANALYZERS[pypy.rpython.rarithmetic.r_uint] = restricted_uint +BUILTIN_ANALYZERS[pypy.rpython.rarithmetic.r_longlong] = restricted_longlong ##BUILTIN_ANALYZERS[pypy.rpython.rarithmetic.ovfcheck] = rarith_ovfcheck ##BUILTIN_ANALYZERS[pypy.rpython.rarithmetic.ovfcheck_lshift] = rarith_ovfcheck_lshift BUILTIN_ANALYZERS[pypy.rpython.rarithmetic.intmask] = rarith_intmask Modified: pypy/dist/pypy/rpython/extfunctable.py ============================================================================== --- pypy/dist/pypy/rpython/extfunctable.py (original) +++ pypy/dist/pypy/rpython/extfunctable.py Thu Dec 8 11:18:27 2005 @@ -5,7 +5,7 @@ import time import math import types - +from pypy.rpython.rarithmetic import r_longlong class ExtFuncInfo: def __init__(self, func, annotation, ll_function_path, ll_annotable, backend_functiontemplate): @@ -149,7 +149,7 @@ declare(os.write , posannotation , 'll_os/write') declare(os.close , noneannotation, 'll_os/close') declare(os.dup , int , 'll_os/dup') -declare(os.lseek , int , 'll_os/lseek') +declare(os.lseek , r_longlong , 'll_os/lseek') declare(os.isatty , bool , 'll_os/isatty') if hasattr(posix, 'ftruncate'): declare(os.ftruncate, noneannotation, 'll_os/ftruncate') Modified: pypy/dist/pypy/rpython/module/ll_os.py ============================================================================== --- 
pypy/dist/pypy/rpython/module/ll_os.py (original) +++ pypy/dist/pypy/rpython/module/ll_os.py Thu Dec 8 11:18:27 2005 @@ -20,6 +20,7 @@ from pypy.rpython.module.support import to_rstr, from_rstr, ll_strcpy, _ll_strfill from pypy.rpython.module.support import to_opaque_object, from_opaque_object from pypy.rpython import ros +from pypy.rpython.rarithmetic import r_longlong def ll_os_open(fname, flag, mode): return os.open(from_rstr(fname), flag, mode) @@ -64,7 +65,7 @@ ll_os_dup.suggested_primitive = True def ll_os_lseek(fd,pos,how): - return intmask(os.lseek(fd,pos,how)) + return r_longlong(os.lseek(fd,pos,how)) ll_os_lseek.suggested_primitive = True def ll_os_isatty(fd): Modified: pypy/dist/pypy/rpython/rbuiltin.py ============================================================================== --- pypy/dist/pypy/rpython/rbuiltin.py (original) +++ pypy/dist/pypy/rpython/rbuiltin.py Thu Dec 8 11:18:27 2005 @@ -146,6 +146,10 @@ vlist = hop.inputargs(lltype.Unsigned) return vlist[0] +def rtype_r_longlong(hop): + vlist = hop.inputargs(lltype.SignedLongLong) + return vlist[0] + def rtype_builtin_min(hop): rint1, rint2 = hop.args_r assert isinstance(rint1, IntegerRepr) @@ -281,6 +285,7 @@ BUILTIN_TYPER[lltype.runtime_type_info] = rtype_runtime_type_info BUILTIN_TYPER[rarithmetic.intmask] = rtype_intmask BUILTIN_TYPER[rarithmetic.r_uint] = rtype_r_uint +BUILTIN_TYPER[rarithmetic.r_longlong] = rtype_r_longlong BUILTIN_TYPER[objectmodel.r_dict] = rtype_r_dict BUILTIN_TYPER[objectmodel.we_are_translated] = rtype_we_are_translated BUILTIN_TYPER[rstack.yield_current_frame_to_caller] = ( Modified: pypy/dist/pypy/translator/c/src/int.h ============================================================================== --- pypy/dist/pypy/translator/c/src/int.h (original) +++ pypy/dist/pypy/translator/c/src/int.h Thu Dec 8 11:18:27 2005 @@ -277,4 +277,6 @@ #define OP_UINT_XOR OP_INT_XOR #define OP_ULLONG_MUL OP_INT_MUL + #define OP_LLONG_MUL OP_INT_MUL +#define OP_LLONG_EQ OP_INT_EQ From 
nik at codespeak.net Thu Dec 8 11:53:59 2005 From: nik at codespeak.net (nik at codespeak.net) Date: Thu, 8 Dec 2005 11:53:59 +0100 (CET) Subject: [pypy-svn] r20879 - in pypy/dist/pypy: module/_socket translator/c/src translator/c/test Message-ID: <20051208105359.2698027DB9@code1.codespeak.net> Author: nik Date: Thu Dec 8 11:53:57 2005 New Revision: 20879 Modified: pypy/dist/pypy/module/_socket/interp_socket.py pypy/dist/pypy/translator/c/src/ll__socket.h pypy/dist/pypy/translator/c/test/test_ext__socket.py Log: (ale, nik) error handling for socket creation complete (but convoluted). Modified: pypy/dist/pypy/module/_socket/interp_socket.py ============================================================================== --- pypy/dist/pypy/module/_socket/interp_socket.py (original) +++ pypy/dist/pypy/module/_socket/interp_socket.py Thu Dec 8 11:53:57 2005 @@ -619,8 +619,10 @@ try: fd = rsocket.newsocket(family, type, proto) - except socket.error, e: + except socket.error, e: # On untranslated PyPy raise wrap_socketerror(space, e) + except OSError, e: # On translated PyPy + raise w_get_socketerror(space, e.strerror, e.errno) # XXX If we want to support subclassing the socket type we will need # something along these lines. But allocate_instance is only defined # on the standard object space, so this is not really correct. Modified: pypy/dist/pypy/translator/c/src/ll__socket.h ============================================================================== --- pypy/dist/pypy/translator/c/src/ll__socket.h (original) +++ pypy/dist/pypy/translator/c/src/ll__socket.h Thu Dec 8 11:53:57 2005 @@ -89,6 +89,9 @@ if (fd < 0) #endif { + // Raise OSError instead of socket.error for convenience. + // XXX For some reason the errno attribute of the OSError is not set + // at interpreter level. Investigate ... 
RPYTHON_RAISE_OSERROR(errno); } } Modified: pypy/dist/pypy/translator/c/test/test_ext__socket.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_ext__socket.py (original) +++ pypy/dist/pypy/translator/c/test/test_ext__socket.py Thu Dec 8 11:53:57 2005 @@ -80,16 +80,12 @@ def test_newsocket_error(): from pypy.module._socket.rpython import rsocket - tests = [(1001, _socket.SOCK_STREAM, 0)] + tests = [ + (1001, _socket.SOCK_STREAM, 0), + (_socket.AF_INET, 555555, 0), + ] def does_stuff(family, type, protocol): return rsocket.newsocket(family, type, protocol) f1 = compile(does_stuff, [int, int, int]) for args in tests: - try: - f1(*args) - except OSError, ex: - try: - import socket - socket.socket(*args) - except socket.error, ex_socket: - assert ex_socket.args[0] == ex.errno + py.test.raises(OSError, f1, *args) From mwh at codespeak.net Thu Dec 8 12:01:22 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Thu, 8 Dec 2005 12:01:22 +0100 (CET) Subject: [pypy-svn] r20880 - in pypy/dist/pypy: annotation rpython rpython/test Message-ID: <20051208110122.A981427DBE@code1.codespeak.net> Author: mwh Date: Thu Dec 8 12:01:21 2005 New Revision: 20880 Modified: pypy/dist/pypy/annotation/model.py pypy/dist/pypy/rpython/llinterp.py pypy/dist/pypy/rpython/rint.py pypy/dist/pypy/rpython/test/test_rint.py Log: implement converting Signeds to SignedLongLongs. be more honest about .knowntype on SomeIntegers. (no-one else has noticed that we broke translate_pypy yet...) 
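The `.knowntype` logic this commit adds to SomeInteger is a four-way choice on (unsigned, size). Written out as a table (strings stand in for the rarithmetic classes, and size follows the diff's convention where 2 means long long):

```python
# knowntype selection from r20880's SomeInteger.__init__, as a table.
KNOWNTYPE = {
    (False, 1): 'int',
    (False, 2): 'r_longlong',
    (True,  1): 'r_uint',
    (True,  2): 'r_ulonglong',
}

def knowntype(unsigned=False, size=1):
    return KNOWNTYPE[unsigned, size]
```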
Modified: pypy/dist/pypy/annotation/model.py ============================================================================== --- pypy/dist/pypy/annotation/model.py (original) +++ pypy/dist/pypy/annotation/model.py Thu Dec 8 12:01:21 2005 @@ -32,6 +32,7 @@ import pypy from pypy.annotation.pairtype import pair, extendabletype from pypy.tool.tls import tlsobject +from pypy.rpython.rarithmetic import r_uint, r_longlong, r_ulonglong import inspect @@ -163,6 +164,16 @@ self.nonneg = unsigned or nonneg self.unsigned = unsigned # pypy.rpython.rarithmetic.r_uint self.size = size + if self.unsigned: + if self.size == 2: + self.knowntype = r_ulonglong + else: + self.knowntype = r_uint + else: + if self.size == 2: + self.knowntype = r_longlong + else: + self.knowntype = int def fmt_size(self, s): if s != 1: Modified: pypy/dist/pypy/rpython/llinterp.py ============================================================================== --- pypy/dist/pypy/rpython/llinterp.py (original) +++ pypy/dist/pypy/rpython/llinterp.py Thu Dec 8 12:01:21 2005 @@ -1,5 +1,5 @@ from pypy.objspace.flow.model import FunctionGraph, Constant, Variable, last_exception -from pypy.rpython.rarithmetic import intmask, r_uint, ovfcheck +from pypy.rpython.rarithmetic import intmask, r_uint, ovfcheck, r_longlong from pypy.rpython.lltypesystem import lltype from pypy.rpython.memory import lladdress from pypy.rpython.ootypesystem import ootype @@ -436,6 +436,10 @@ assert type(b) is r_uint return intmask(b) + def op_cast_int_to_longlong(self, b): + assert type(b) is int + return r_longlong(b) + def op_int_floordiv_ovf_zer(self, a, b): assert type(a) is int assert type(b) is int @@ -534,12 +538,14 @@ # __________________________________________________________ # primitive operations - for typ in (float, int, r_uint): + for typ in (float, int, r_uint, r_longlong): typname = typ.__name__ optup = ('add', 'sub', 'mul', 'div', 'truediv', 'floordiv', 'mod', 'gt', 'lt', 'ge', 'ne', 'le', 'eq',) if typ is r_uint: 
opnameprefix = 'uint' + elif typ is r_longlong: + opnameprefix = 'llong' else: opnameprefix = typname if typ in (int, r_uint): Modified: pypy/dist/pypy/rpython/rint.py ============================================================================== --- pypy/dist/pypy/rpython/rint.py (original) +++ pypy/dist/pypy/rpython/rint.py Thu Dec 8 12:01:21 2005 @@ -7,7 +7,7 @@ UnsignedLongLong, SignedLongLong from pypy.rpython.rmodel import IntegerRepr, inputconst from pypy.rpython.robject import PyObjRepr, pyobj_repr -from pypy.rpython.rarithmetic import intmask, r_uint, r_ulonglong, r_longlong +from pypy.rpython.rarithmetic import intmask, r_int, r_uint, r_ulonglong, r_longlong from pypy.rpython.error import TyperError from pypy.rpython.rmodel import log @@ -44,6 +44,8 @@ if r_from.lowleveltype == Unsigned and r_to.lowleveltype == Signed: log.debug('explicit cast_uint_to_int') return llops.genop('cast_uint_to_int', [v], resulttype=Signed) + if r_from.lowleveltype == Signed and r_to.lowleveltype == SignedLongLong: + return llops.genop('cast_int_to_longlong', [v], resulttype=SignedLongLong) return NotImplemented #arithmetic @@ -200,7 +202,7 @@ class __extend__(IntegerRepr): def convert_const(self, value): - if not isinstance(value, (int, r_uint)): # can be bool + if not isinstance(value, (int, r_uint, r_int, r_longlong, r_ulonglong)): # can be bool raise TyperError("not an integer: %r" % (value,)) if self.lowleveltype == Signed: return intmask(value) Modified: pypy/dist/pypy/rpython/test/test_rint.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rint.py (original) +++ pypy/dist/pypy/rpython/test/test_rint.py Thu Dec 8 12:01:21 2005 @@ -3,7 +3,7 @@ from pypy.annotation import model as annmodel from pypy.rpython.test import snippet from pypy.rpython.test.test_llinterp import interpret -from pypy.rpython.rarithmetic import r_uint +from pypy.rpython.rarithmetic import r_uint, r_longlong class 
TestSnippet(object): @@ -101,3 +101,17 @@ res = interpret(dummy, [-1]) assert res is False # -1 ==> 0xffffffff +def test_specializing_int_functions(): + def f(i): + return i + 1 + f._annspecialcase_ = "specialize:argtype0" + def g(n): + if n > 0: + return f(r_longlong(0)) + else: + return f(0) + res = interpret(g, [0]) + assert res == 1 + + res = interpret(g, [1]) + assert res == 1 From mwh at codespeak.net Thu Dec 8 12:08:36 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Thu, 8 Dec 2005 12:08:36 +0100 (CET) Subject: [pypy-svn] r20881 - in pypy/dist/pypy/translator/c: src test Message-ID: <20051208110836.C7A9127DC0@code1.codespeak.net> Author: mwh Date: Thu Dec 8 12:08:34 2005 New Revision: 20881 Modified: pypy/dist/pypy/translator/c/src/int.h pypy/dist/pypy/translator/c/test/test_annotated.py Log: (mwh, johahn) Copy test over from the rpython tests (...) and made it pass by implementing a couple more operations in int.h Modified: pypy/dist/pypy/translator/c/src/int.h ============================================================================== --- pypy/dist/pypy/translator/c/src/int.h (original) +++ pypy/dist/pypy/translator/c/src/int.h Thu Dec 8 12:08:34 2005 @@ -156,6 +156,7 @@ #define OP_CAST_BOOL_TO_UINT(x,r,err) r = (unsigned long)(x) #define OP_CAST_UINT_TO_INT(x,r,err) r = (long)(x) #define OP_CAST_INT_TO_UINT(x,r,err) r = (unsigned long)(x) +#define OP_CAST_INT_TO_LONGLONG(x,r,err) r = (long long)(x) #define OP_CAST_CHAR_TO_INT(x,r,err) r = (long)((unsigned char)(x)) #define OP_CAST_INT_TO_CHAR(x,r,err) r = (char)(x) #define OP_CAST_PTR_TO_INT(x,r,err) r = (long)(x) /* XXX */ @@ -280,3 +281,4 @@ #define OP_LLONG_MUL OP_INT_MUL #define OP_LLONG_EQ OP_INT_EQ +#define OP_LLONG_ADD OP_INT_ADD Modified: pypy/dist/pypy/translator/c/test/test_annotated.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_annotated.py (original) +++ 
pypy/dist/pypy/translator/c/test/test_annotated.py Thu Dec 8 12:08:34 2005 @@ -245,3 +245,18 @@ return 4*i gn = self.getcompiled(g, view=False) assert gn(sys.maxint) == 4*sys.maxint + + def test_specializing_int_functions(self): + from pypy.rpython.rarithmetic import r_longlong + def f(i): + return i + 1 + f._annspecialcase_ = "specialize:argtype0" + def g(n=int): + if n > 0: + return f(r_longlong(0)) + else: + return f(0) + + fn = self.getcompiled(g) + assert g(0) == 1 + assert g(1) == 1 From cfbolz at codespeak.net Thu Dec 8 12:21:05 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 8 Dec 2005 12:21:05 +0100 (CET) Subject: [pypy-svn] r20882 - in pypy/dist/pypy/jit: . test Message-ID: <20051208112105.956C927DC2@code1.codespeak.net> Author: cfbolz Date: Thu Dec 8 12:21:04 2005 New Revision: 20882 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_llabstractinterp.py Log: (pedronis, cfbolz, arigo) Support for 'direct_call': introduced GraphState, changed the way the new graphs are patched around, pushed, pulled, etc. 
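The patchsource()/resolveblock() pair being reshuffled in this commit implements deferred back-patching: a link whose target block has not been built yet is queued, then patched as soon as the block exists, while links arriving after resolution connect immediately. The pattern in isolation (a string stands in for the copied block):

```python
# Deferred linking as in BlockState.patchsource()/resolveblock().
class PendingTarget:
    def __init__(self):
        self.target = None
        self.pending = []              # sources waiting for the target

    def patchsource(self, source):
        if self.target is None:
            self.pending.append(source)    # target not built yet: queue
        else:
            source.settarget(self.target)  # already built: link now

    def resolve(self, target):
        self.target = target
        for source in self.pending:        # back-patch everything queued
            source.settarget(target)
        del self.pending[:]

class Link:
    def __init__(self):
        self.target = None
    def settarget(self, target):
        self.target = target

p = PendingTarget()
early = Link()
p.patchsource(early)       # queued: the target block does not exist yet
p.resolve('block42')       # every queued source is back-patched here
late = Link()
p.patchsource(late)        # a resolved target is linked immediately
```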
Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Thu Dec 8 12:21:04 2005 @@ -55,48 +55,72 @@ class LLState(object): + """Entry state of a block or a graph, as a combination of LLAbstractValues + for its input arguments.""" + + def __init__(self, args_a): + self.args_a = args_a + + def match(self, args_a): + # simple for now + for a1, a2 in zip(self.args_a, args_a): + if not a1.match(a2): + return False + else: + return True + + +class BlockState(LLState): + """Entry state of a block.""" def __init__(self, origblock, args_a): assert len(args_a) == len(origblock.inputargs) - self.args_a = args_a + super(BlockState, self).__init__(args_a) self.origblock = origblock self.copyblock = None - self.pendinglinks = [] + self.pendingsources = [] - def patchlink(self, copylink): + def patchsource(self, source): if self.copyblock is None: - print 'PENDING', self, id(copylink) - self.pendinglinks.append(copylink) + print 'PENDING', self, hex(id(source)) + self.pendingsources.append(source) else: # XXX nice interface required! 
- print 'LINKING', self, id(copylink), self.copyblock - copylink.settarget(self.copyblock) + print 'LINKING', self, id(source), self.copyblock + source.settarget(self.copyblock) def resolveblock(self, newblock): + print "RESOLVING BLOCK", newblock self.copyblock = newblock - for copylink in self.pendinglinks: - self.patchlink(copylink) - del self.pendinglinks[:] - - def match(self, args_a): - # simple for now - for a1, a2 in zip(self.args_a, args_a): - if not a1.match(a2): - return False - else: - return True + for source in self.pendingsources: + self.patchsource(source) + del self.pendingsources[:] + + +class GraphState(LLState): + """Entry state of a graph.""" + + def __init__(self, origgraph, args_a): + super(GraphState, self).__init__(args_a) + self.origgraph = origgraph + self.copygraph = FunctionGraph(origgraph.name, Block([])) # grumble + + def settarget(self, block): + block.isstartblock = True + self.copygraph.startblock = block # ____________________________________________________________ -class GotReturnValue(Exception): - def __init__(self, returnstate): - self.returnstate = returnstate - - class LLAbstractInterp(object): def __init__(self): - pass + self.graphs = {} # {origgraph: {BlockState: GraphState}} + self.fixreturnblocks = [] + + def itercopygraphs(self): + for d in self.graphs.itervalues(): + for graphstate in d.itervalues(): + yield graphstate.copygraph def eval(self, origgraph, hints): # for now, 'hints' means "I'm absolutely sure that the @@ -105,21 +129,24 @@ self.hints = hints self.blocks = {} # {origblock: list-of-LLStates} args_a = [LLRuntimeValue(orig_v=v) for v in origgraph.getargs()] - newstartlink = self.schedule(args_a, origgraph.startblock) - - return_a = LLRuntimeValue(orig_v=origgraph.getreturnvar()) - returnstate = LLState(origgraph.returnblock, [return_a]) - self.allpendingstates.append(returnstate) - self.blocks[origgraph.returnblock] = [returnstate] - self.complete(returnstate) - - copygraph = FunctionGraph(origgraph.name, 
newstartlink.target) - # XXX messy -- what about len(returnlink.args) == 0 ?? - copygraph.getreturnvar().concretetype = ( - origgraph.getreturnvar().concretetype) - returnstate.resolveblock(copygraph.returnblock) - checkgraph(copygraph) # sanity-check - return copygraph + graphstate = self.schedule_graph(args_a, origgraph) + self.complete() + self.fixgraphs() + return graphstate.copygraph + + def fixgraphs(self): + # add the missing '.returnblock' attribute + for graph in self.fixreturnblocks: + for block in graph.iterblocks(): + if block.operations == () and len(block.inputargs) == 1: + # here it is :-) + graph.returnblock = block + break + else: + # no return block... + graph.getreturnvar().concretevalue = lltype.Void + checkgraph(graph) # sanity-check + del self.fixreturnblocks def applyhint(self, args_a, origblock): result_a = [] @@ -138,50 +165,56 @@ result_a.append(a) return result_a + def schedule_graph(self, args_a, origgraph): + origblock = origgraph.startblock + state, args_a = self.schedule_getstate(args_a, origblock) + try: + graphstate = self.graphs[origgraph][state] + except KeyError: + graphstate = GraphState(origgraph, args_a) + self.fixreturnblocks.append(graphstate.copygraph) + d = self.graphs.setdefault(origgraph, {}) + d[state] = graphstate + print "SCHEDULE_GRAPH", graphstate + state.patchsource(graphstate) + return graphstate + def schedule(self, args_a, origblock): print "SCHEDULE", args_a, origblock # args_a: [a_value for v in origblock.inputargs] - args_a = self.applyhint(args_a, origblock) + state, args_a = self.schedule_getstate(args_a, origblock) args_v = [a.getvarorconst() for a in args_a if not isinstance(a, LLConcreteValue)] newlink = Link(args_v, None) - # try to match this new state with an existing one + state.patchsource(newlink) + return newlink + + def schedule_getstate(self, args_a, origblock): + # NOTA BENE: copyblocks can get shared between different copygraphs! 
+ args_a = self.applyhint(args_a, origblock) pendingstates = self.blocks.setdefault(origblock, []) + # try to match this new state with an existing one for state in pendingstates: if state.match(args_a): # already matched - break + return state, args_a else: # schedule this new state - state = LLState(origblock, args_a) + state = BlockState(origblock, args_a) pendingstates.append(state) self.allpendingstates.append(state) - state.patchlink(newlink) - return newlink + return state, args_a - def complete(self, returnstate): + def complete(self): while self.allpendingstates: state = self.allpendingstates.pop() print 'CONSIDERING', state - try: - self.flowin(state) - except GotReturnValue, e: - assert e.returnstate is returnstate + self.flowin(state) def flowin(self, state): # flow in the block assert state.copyblock is None origblock = state.origblock - if origblock.operations == (): - if len(origblock.inputargs) == 1: - # return block - raise GotReturnValue(state) - elif len(origblock.inputargs) == 2: - # except block - XXX - else: - raise Exception("uh?") - self.residual_operations = [] bindings = {} # {Variables-of-origblock: a_value} def binding(v): if isinstance(v, Constant): @@ -192,10 +225,14 @@ if not isinstance(a, LLConcreteValue): a = LLRuntimeValue(orig_v=v) bindings[v] = a - for op in origblock.operations: - handler = getattr(self, 'op_' + op.opname) - a_result = handler(op, *[binding(v) for v in op.args]) - bindings[op.result] = a_result + if origblock.operations == (): + self.residual_operations = () + else: + self.residual_operations = [] + for op in origblock.operations: + handler = getattr(self, 'op_' + op.opname) + a_result = handler(op, *[binding(v) for v in op.args]) + bindings[op.result] = a_result inputargs = [] for v in origblock.inputargs: a = bindings[v] @@ -275,3 +312,30 @@ def op_same_as(self, op, a): return a + + def op_direct_call(self, op, a_func, *args_a): + v_func = a_func.getvarorconst() + if isinstance(v_func, Constant): + fnobj = 
v_func.value._obj + if hasattr(fnobj, 'graph'): + origgraph = fnobj.graph + graphstate = self.schedule_graph(args_a, origgraph) + origfptr = v_func.value + ARGS = [] + new_args_a = [] + for a in graphstate.args_a: + if not isinstance(a, LLConcreteValue): + ARGS.append(a.getconcretetype()) + new_args_a.append(a) + args_a = new_args_a + TYPE = lltype.FuncType( + ARGS, lltype.typeOf(origfptr).TO.RESULT) + fptr = lltype.functionptr( + TYPE, fnobj._name, graph=graphstate.copygraph) + fconst = Constant(fptr) + fconst.concretetype = lltype.typeOf(fptr) + a_func = LLRuntimeValue(fconst) + a_result = LLRuntimeValue(op.result) + self.residual("direct_call", [a_func] + args_a, a_result) + return a_result + Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/test/test_llabstractinterp.py (original) +++ pypy/dist/pypy/jit/test/test_llabstractinterp.py Thu Dec 8 12:21:04 2005 @@ -39,9 +39,10 @@ assert result1 == result2 # return a summary of the instructions left in graph2 insns = {} - for block in graph2.iterblocks(): - for op in block.operations: - insns[op.opname] = insns.get(op.opname, 0) + 1 + for copygraph in interp.itercopygraphs(): + for block in copygraph.iterblocks(): + for op in block.operations: + insns[op.opname] = insns.get(op.opname, 0) + 1 return graph2, insns @@ -128,6 +129,15 @@ a = y + z else: a = y - z - return a + x + a += x + return a graph2, insns = abstrinterp(ll_function, [3, 4, 5], [1, 2]) assert insns == {'int_is_true': 1, 'int_add': 2} + +def test_simple_call(): + def ll2(x, y): + return x + (y + 42) + def ll1(x, y, z): + return ll2(x, y - z) + graph2, insns = abstrinterp(ll1, [3, 4, 5], [1, 2]) + assert insns == {'direct_call': 1, 'int_add': 1} From cfbolz at codespeak.net Thu Dec 8 12:28:13 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 8 Dec 2005 12:28:13 +0100 (CET) Subject: [pypy-svn] r20883 - 
pypy/dist/pypy/rpython/test Message-ID: <20051208112813.442EE27DBF@code1.codespeak.net> Author: cfbolz Date: Thu Dec 8 12:28:12 2005 New Revision: 20883 Modified: pypy/dist/pypy/rpython/test/test_llinterp.py Log: remove this very very very old __main__. it's broken. it sucks. nobody should use it. Modified: pypy/dist/pypy/rpython/test/test_llinterp.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_llinterp.py (original) +++ pypy/dist/pypy/rpython/test/test_llinterp.py Thu Dec 8 12:28:12 2005 @@ -418,22 +418,3 @@ except ValueError: raise TypeError -#__________________________________________________________________ -# interactive playing - -if __name__ == '__main__': - try: - import rlcompleter2 as _rl2 - _rl2.setup() - except ImportError: - pass - - t, typer = gengraph(number_ops, [int]) - interp = LLInterpreter(t.flowgraphs, typer) - res = interp.eval_function(number_ops, [3]) - assert res == number_ops(3) - for name, value in globals().items(): - if name not in _snap and name[0] != '_': - print "%20s: %s" %(name, value) - - From mwh at codespeak.net Thu Dec 8 12:35:55 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Thu, 8 Dec 2005 12:35:55 +0100 (CET) Subject: [pypy-svn] r20884 - in pypy/dist/pypy: rpython rpython/test translator/c/src translator/c/test Message-ID: <20051208113555.F010B27DC1@code1.codespeak.net> Author: mwh Date: Thu Dec 8 12:35:54 2005 New Revision: 20884 Modified: pypy/dist/pypy/rpython/llinterp.py pypy/dist/pypy/rpython/rint.py pypy/dist/pypy/rpython/test/test_rint.py pypy/dist/pypy/translator/c/src/int.h pypy/dist/pypy/translator/c/test/test_annotated.py Log: (mwh, johahn) implement truncation of signed long long to signed long (need to do all this stuff for unsigned variants at some point too ... fun). 
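The semantics the llinterp half of this patch gives to the new operation can be restated as a small sketch. This is an illustration only, with `sys.maxsize` standing in for the historical Python 2 `sys.maxint`: values outside the signed-long range are a bug at this point, not a silent wraparound (the C macro simply casts).

```python
import sys

def truncate_longlong_to_int(b):
    # mirrors op_truncate_longlong_to_int in llinterp.py: the value must
    # already fit in a signed long for the truncation to be meaningful
    assert -sys.maxsize - 1 <= b <= sys.maxsize
    return int(b)
```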
Modified: pypy/dist/pypy/rpython/llinterp.py ============================================================================== --- pypy/dist/pypy/rpython/llinterp.py (original) +++ pypy/dist/pypy/rpython/llinterp.py Thu Dec 8 12:35:54 2005 @@ -4,6 +4,7 @@ from pypy.rpython.memory import lladdress from pypy.rpython.ootypesystem import ootype +import sys import math import py @@ -440,6 +441,11 @@ assert type(b) is int return r_longlong(b) + def op_truncate_longlong_to_int(self, b): + assert type(b) is r_longlong + assert -sys.maxint-1 <= b <= sys.maxint + return int(b) + def op_int_floordiv_ovf_zer(self, a, b): assert type(a) is int assert type(b) is int Modified: pypy/dist/pypy/rpython/rint.py ============================================================================== --- pypy/dist/pypy/rpython/rint.py (original) +++ pypy/dist/pypy/rpython/rint.py Thu Dec 8 12:35:54 2005 @@ -46,6 +46,8 @@ return llops.genop('cast_uint_to_int', [v], resulttype=Signed) if r_from.lowleveltype == Signed and r_to.lowleveltype == SignedLongLong: return llops.genop('cast_int_to_longlong', [v], resulttype=SignedLongLong) + if r_from.lowleveltype == SignedLongLong and r_to.lowleveltype == Signed: + return llops.genop('truncate_longlong_to_int', [v], resulttype=Signed) return NotImplemented #arithmetic @@ -281,7 +283,7 @@ def rtype_int(self, hop): if self.lowleveltype in (Unsigned, UnsignedLongLong): raise TyperError("use intmask() instead of int(r_uint(...))") - vlist = hop.inputargs(self) + vlist = hop.inputargs(Signed) return vlist[0] def rtype_float(_, hop): Modified: pypy/dist/pypy/rpython/test/test_rint.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rint.py (original) +++ pypy/dist/pypy/rpython/test/test_rint.py Thu Dec 8 12:35:54 2005 @@ -115,3 +115,9 @@ res = interpret(g, [1]) assert res == 1 + +def test_downcast_int(): + def f(i): + return int(i) + res = interpret(f, [r_longlong(0)]) + assert res == 0 Modified: 
pypy/dist/pypy/translator/c/src/int.h ============================================================================== --- pypy/dist/pypy/translator/c/src/int.h (original) +++ pypy/dist/pypy/translator/c/src/int.h Thu Dec 8 12:35:54 2005 @@ -161,6 +161,8 @@ #define OP_CAST_INT_TO_CHAR(x,r,err) r = (char)(x) #define OP_CAST_PTR_TO_INT(x,r,err) r = (long)(x) /* XXX */ +#define OP_TRUNCATE_LONGLONG_TO_INT(x,r,err) r = (long)(x) + #define OP_CAST_UNICHAR_TO_INT(x,r,err) r = (long)((unsigned long)(x)) /*?*/ #define OP_CAST_INT_TO_UNICHAR(x,r,err) r = (unsigned int)(x) Modified: pypy/dist/pypy/translator/c/test/test_annotated.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_annotated.py (original) +++ pypy/dist/pypy/translator/c/test/test_annotated.py Thu Dec 8 12:35:54 2005 @@ -5,6 +5,9 @@ from pypy.translator.test import snippet +from pypy.rpython.rarithmetic import r_ulonglong, r_longlong + + # XXX this tries to make compiling faster for full-scale testing from pypy.translator.tool import cbuild cbuild.enable_fast_compilation() @@ -235,7 +238,6 @@ assert fn(-3) == 42 def test_long_long(self): - from pypy.rpython.rarithmetic import r_ulonglong, r_longlong def f(i=r_ulonglong): return 4*i fn = self.getcompiled(f, view=False) @@ -247,7 +249,6 @@ assert gn(sys.maxint) == 4*sys.maxint def test_specializing_int_functions(self): - from pypy.rpython.rarithmetic import r_longlong def f(i): return i + 1 f._annspecialcase_ = "specialize:argtype0" @@ -260,3 +261,9 @@ fn = self.getcompiled(g) assert g(0) == 1 assert g(1) == 1 + + def test_downcast_int(self): + def f(i=r_longlong): + return int(i) + fn = self.getcompiled(f) + assert f(0) == 0 From nik at codespeak.net Thu Dec 8 12:38:42 2005 From: nik at codespeak.net (nik at codespeak.net) Date: Thu, 8 Dec 2005 12:38:42 +0100 (CET) Subject: [pypy-svn] r20885 - in pypy/dist/pypy/module/_socket: . 
rpython test Message-ID: <20051208113842.08F6127DC6@code1.codespeak.net> Author: nik Date: Thu Dec 8 12:38:40 2005 New Revision: 20885 Modified: pypy/dist/pypy/module/_socket/interp_socket.py pypy/dist/pypy/module/_socket/rpython/rsocket.py pypy/dist/pypy/module/_socket/test/test_socket2.py Log: (ale, nik) implemented socket.fileno() and socket.close() Modified: pypy/dist/pypy/module/_socket/interp_socket.py ============================================================================== --- pypy/dist/pypy/module/_socket/interp_socket.py (original) +++ pypy/dist/pypy/module/_socket/interp_socket.py Thu Dec 8 12:38:40 2005 @@ -1,4 +1,4 @@ -import _socket, socket, errno, sys +import _socket, socket, errno, os, sys from pypy.interpreter.typedef import TypeDef from pypy.interpreter.baseobjspace import Wrappable from pypy.interpreter.error import OperationError @@ -642,6 +642,7 @@ self.type = type self.proto = proto self.timeout = getstate(space).defaulttimeout + self.closed = False def accept(self, space): """accept() -> (socket object, address info) @@ -677,10 +678,15 @@ Close the socket. It cannot be used after this call. """ - if self.fd is not None: - fd = self.fd - self.fd = None - fd.close() + if not self.closed: + try: + # Reusing the os.close primitive to save us from writing a + # socket-specific close primitive. This might not be perfectly + # cross-platform (Windows?). + os.close(self.fd) + except OSError, e: + raise w_get_socketerror(space, e.strerror, e.errno) + self.closed = True close.unwrap_spec = ['self', ObjSpace] def connect(self, space, w_addr): @@ -729,7 +735,10 @@ Return the integer file descriptor of the socket. 
""" - return space.wrap(self.fd.fileno()) + if not self.closed: + return space.wrap(self.fd) + else: + raise w_get_socketerror(space, "Bad file descriptor", errno.EBADF) fileno.unwrap_spec = ['self', ObjSpace] def getpeername(self, space): Modified: pypy/dist/pypy/module/_socket/rpython/rsocket.py ============================================================================== --- pypy/dist/pypy/module/_socket/rpython/rsocket.py (original) +++ pypy/dist/pypy/module/_socket/rpython/rsocket.py Thu Dec 8 12:38:40 2005 @@ -4,6 +4,8 @@ import socket +keep_sockets_alive = [] + class ADDRINFO(object): # a simulated addrinfo structure from C, i.e. a chained list # returned by getaddrinfo() @@ -27,4 +29,8 @@ return ADDRINFO(host, port, family, socktype, proto, flags) def newsocket(family, type, protocol): - return socket.socket(family, type, protocol).fileno() + s = socket.socket(family, type, protocol) + # HACK: We have to prevent GC to collect the socket object because we don't + # want it to be closed. 
+ keep_sockets_alive.append(s) + return s.fileno() Modified: pypy/dist/pypy/module/_socket/test/test_socket2.py ============================================================================== --- pypy/dist/pypy/module/_socket/test/test_socket2.py (original) +++ pypy/dist/pypy/module/_socket/test/test_socket2.py Thu Dec 8 12:38:40 2005 @@ -263,8 +263,34 @@ assert space.unwrap(w_t) is None def app_test_newsocket_error(): - import socket - raises (socket.error, socket.socket, 10001, socket.SOCK_STREAM, 0) + import _socket + raises(_socket.error, _socket.socket, 10001, _socket.SOCK_STREAM, 0) + +def app_test_socket_fileno(): + import _socket + s = _socket.socket(_socket.AF_INET, _socket.SOCK_STREAM, 0) + assert s.fileno() > -1 + assert isinstance(s.fileno(), int) + +def app_test_socket_close(): + import _socket, errno + s = _socket.socket(_socket.AF_INET, _socket.SOCK_STREAM, 0) + fileno = s.fileno() + s.close() + s.close() + try: + s.fileno() + except _socket.error, ex: + assert ex.args[0], errno.EBADF + else: + assert 0 + +def app_test_socket_close_error(): + import _socket, os + s = _socket.socket(_socket.AF_INET, _socket.SOCK_STREAM, 0) + os.close(s.fileno()) + raises(_socket.error, s.close) + class AppTestSocket: def setup_class(cls): From rxe at codespeak.net Thu Dec 8 12:48:59 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Thu, 8 Dec 2005 12:48:59 +0100 (CET) Subject: [pypy-svn] r20886 - pypy/dist/pypy/translator/c/test Message-ID: <20051208114859.DD6D127DC8@code1.codespeak.net> Author: rxe Date: Thu Dec 8 12:48:58 2005 New Revision: 20886 Added: pypy/dist/pypy/translator/c/test/test_tasklets.py (contents, props changed) Log: Proof of concept tasklet switching in rpython. 
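The control flow that `test_tasklets.py` below exercises can be approximated with ordinary generators standing in for `yield_current_frame_to_caller()`. This is only a sketch of the scheduling discipline, not of the rpython stack-switching mechanism itself; each `yield` plays the role of `schedule()`:

```python
def simple(name, counter, rounds=5):
    # the tasklet body from test_simple: increment, then yield control
    for _ in range(rounds):
        counter[0] += 1
        yield  # schedule(): hand control back to the scheduler

def run(tasklets):
    # round-robin, as in Scheduler.run(): pop, resume, requeue if alive
    runnables = list(tasklets)
    while runnables:
        t = runnables.pop(0)
        try:
            next(t)
        except StopIteration:
            continue          # tasklet finished, drop it
        runnables.append(t)   # still alive, schedule it again

counter = [0]
run([simple("T%d" % i, counter) for i in range(5)])
assert counter[0] == 25  # the same invariant test_simple checks
```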
Added: pypy/dist/pypy/translator/c/test/test_tasklets.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/c/test/test_tasklets.py Thu Dec 8 12:48:58 2005 @@ -0,0 +1,120 @@ +from pypy.translator.translator import TranslationContext +from pypy.translator.c.genc import CStandaloneBuilder +from pypy.annotation.model import SomeList, SomeString +from pypy.annotation.listdef import ListDef +from pypy.rpython.rstack import stack_unwind, stack_frames_depth, stack_too_big +from pypy.rpython.rstack import yield_current_frame_to_caller +import os + +def wrap_stackless_function(fn): + def entry_point(argv): + os.write(1, str(fn())) + return 0 + + s_list_of_strings = SomeList(ListDef(None, SomeString())) + s_list_of_strings.listdef.resize() + t = TranslationContext() + t.buildannotator().build_types(entry_point, [s_list_of_strings]) + t.buildrtyper().specialize() + cbuilder = CStandaloneBuilder(t, entry_point) + cbuilder.stackless = True + cbuilder.generate_source() + cbuilder.compile() + return cbuilder.cmdexec('') + +# ____________________________________________________________ + +def debug(s): + #os.write(1, "%s\n" % s) + pass + +class Tasklet(object): + + def __init__(self, name, fn): + self.fn = fn + self.name = name + self.alive = False + + def start(self): + debug("starting %s" % self.name) + self.caller = yield_current_frame_to_caller() + + debug("entering %s" % self.name) + self.fn(self.name) + debug("leaving %s" % self.name) + return self.caller + + def setalive(self, resumable): + self.alive = True + self.resumable = resumable + + def schedule(self): + debug("scheduling %s" % self.name) + self.caller = self.caller.switch() + + def resume(self): + debug("resuming %s" % self.name) + self.resumable = self.resumable.switch() + self.alive = self.resumable is not None + + +class Scheduler(object): + def __init__(self): + self.runnables = [] + self.current_tasklet = None + + def add_tasklet(self, 
tasklet): + self.runnables.append(tasklet) + + def run(self): + debug("running: length of runnables %s" % len(self.runnables)) + while len(self.runnables): + t = self.runnables.pop(0) + debug("resuming %s(%s)" % (t.name, t.alive)) + self.current_tasklet = t + t.resume() + self.current_tasklet = None + if t.alive: + self.runnables.append(t) + + debug("ran") + +scheduler = Scheduler() +def start_tasklet(tasklet): + res = tasklet.start() + tasklet.setalive(res) + scheduler.add_tasklet(tasklet) + +def run(): + scheduler.run() + +def schedule(): + assert scheduler.current_tasklet + scheduler.current_tasklet.schedule() + +def test_simple(): + class Counter: + def __init__(self): + self.count = 0 + + def increment(self): + self.count += 1 + + def get_count(self): + return self.count + + c = Counter() + + def simple(name): + for ii in range(5): + debug("xxx %s %s" % (name, ii)) + c.increment() + schedule() + + def f(): + for ii in range(5): + start_tasklet(Tasklet("T%s" % ii, simple)) + run() + return c.get_count() == 25 + + assert wrap_stackless_function(f) == '1' From cfbolz at codespeak.net Thu Dec 8 12:52:23 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 8 Dec 2005 12:52:23 +0100 (CET) Subject: [pypy-svn] r20887 - in pypy/dist/pypy: jit jit/test rpython/lltypesystem Message-ID: <20051208115223.9608127DCD@code1.codespeak.net> Author: cfbolz Date: Thu Dec 8 12:52:21 2005 New Revision: 20887 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_llabstractinterp.py pypy/dist/pypy/rpython/lltypesystem/lltype.py Log: (cfbolz, arigo, pedronis) Support for reading from structs. Hints about immutable structs. 
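The folding rule that `op_getfield` in the patch below implements can be restated as a small sketch. The `Struct` class here is a hypothetical stand-in for an lltype struct carrying hints; the real code works on abstract values and Constants:

```python
class Struct(object):
    # hypothetical stand-in for an lltype struct with a hints dict
    def __init__(self, hints=None, **fields):
        self._hints = hints or {}
        for name, value in fields.items():
            setattr(self, name, value)

def abstract_getfield(struct, fieldname, ptr_is_constant):
    # fold the read only when the pointer is a compile-time constant AND
    # the struct type is hinted immutable, as in op_getfield below;
    # otherwise emit a residual getfield operation
    if ptr_is_constant and struct._hints.get('immutable', False):
        return ('constant', getattr(struct, fieldname))
    return ('residual', 'getfield', fieldname)

s = Struct(hints={'immutable': True}, hello=6, world=7)
```

Both conditions are needed: a mutable struct may change between compile time and run time, and a non-constant pointer gives nothing to read from.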
Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Thu Dec 8 12:52:21 2005 @@ -307,6 +307,9 @@ def op_int_sub(self, op, a1, a2): return self.residualize(op, [a1, a2], operator.sub) + def op_int_mul(self, op, a1, a2): + return self.residualize(op, [a1, a2], operator.mul) + def op_int_gt(self, op, a1, a2): return self.residualize(op, [a1, a2], operator.gt) @@ -338,4 +341,24 @@ a_result = LLRuntimeValue(op.result) self.residual("direct_call", [a_func] + args_a, a_result) return a_result - + + def op_getfield(self, op, a_ptr, a_attrname): + T = a_ptr.getconcretetype().TO + attrname = a_attrname.getvarorconst().value + RESULT = getattr(T, attrname) + if RESULT is lltype.Void: + return XXX_later + v_ptr = a_ptr.getvarorconst() + if isinstance(v_ptr, Constant): + if T._hints.get('immutable', False): + concreteresult = getattr(v_ptr.value, attrname) + if isinstance(a_ptr, LLConcreteValue): + a_result = LLConcreteValue(concreteresult) + else: + c_result = Constant(concreteresult) + c_result.concretetype = lltype.typeOf(concreteresult) + a_result = LLRuntimeValue(c_result) + return a_result + a_result = LLRuntimeValue(op.result) + self.residual("getfield", [a_ptr, a_attrname], a_result) + return a_result Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/test/test_llabstractinterp.py (original) +++ pypy/dist/pypy/jit/test/test_llabstractinterp.py Thu Dec 8 12:52:21 2005 @@ -141,3 +141,15 @@ return ll2(x, y - z) graph2, insns = abstrinterp(ll1, [3, 4, 5], [1, 2]) assert insns == {'direct_call': 1, 'int_add': 1} + +def test_simple_struct(): + S = lltype.GcStruct('helloworld', ('hello', lltype.Signed), + ('world', lltype.Signed), + hints={'immutable': True}) + s = lltype.malloc(S) + s.hello = 6 + 
s.world = 7 + def ll_function(s): + return s.hello * s.world + graph2, insns = abstrinterp(ll_function, [s], [0]) + assert insns == {} Modified: pypy/dist/pypy/rpython/lltypesystem/lltype.py ============================================================================== --- pypy/dist/pypy/rpython/lltypesystem/lltype.py (original) +++ pypy/dist/pypy/rpython/lltypesystem/lltype.py Thu Dec 8 12:52:21 2005 @@ -127,8 +127,9 @@ def _inline_is_varsize(self, last): raise TypeError, "%r cannot be inlined in structure" % self - def _install_adtmeths(self, adtmeths={}): + def _install_extras(self, adtmeths={}, hints={}): self._adtmeths = frozendict(adtmeths) + self._hints = frozendict(hints) def __getattr__(self, name): adtmeth = self._adtmeths.get(name, NFOUND) @@ -175,7 +176,7 @@ self._flds = frozendict(flds) self._names = tuple(names) - self._install_adtmeths(**kwds) + self._install_extras(**kwds) def _first_struct(self): if self._names: @@ -277,7 +278,7 @@ raise TypeError("cannot have a GC structure as array item type") self.OF._inline_is_varsize(False) - self._install_adtmeths(**kwds) + self._install_extras(**kwds) def _inline_is_varsize(self, last): if not last: From mwh at codespeak.net Thu Dec 8 13:20:38 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Thu, 8 Dec 2005 13:20:38 +0100 (CET) Subject: [pypy-svn] r20888 - in pypy/dist/pypy: annotation rpython/test Message-ID: <20051208122038.45F1C27DC8@code1.codespeak.net> Author: mwh Date: Thu Dec 8 13:20:36 2005 New Revision: 20888 Modified: pypy/dist/pypy/annotation/builtin.py pypy/dist/pypy/rpython/test/test_rint.py Log: (mwh, johahn) fix up builtin_isinstance wrt the changes to how integers are annotated. 
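The generalization this commit makes, from special-casing `r_uint` to handling any `base_int` subclass via the class hierarchy, can be sketched as follows. The class definitions are simplified stand-ins for rarithmetic's types:

```python
class base_int(int):
    pass

class r_uint(base_int):
    pass

class r_longlong(base_int):
    pass

def ann_isinstance(knowntype, typ):
    # post-r20888 rule: for rpython integer types the answer follows from
    # the class hierarchy of the annotated type, not from an 'unsigned'
    # flag on the annotation
    if issubclass(typ, base_int):
        return issubclass(knowntype, typ)
    raise NotImplementedError("only the integer branch is sketched here")
```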
Modified: pypy/dist/pypy/annotation/builtin.py ============================================================================== --- pypy/dist/pypy/annotation/builtin.py (original) +++ pypy/dist/pypy/annotation/builtin.py Thu Dec 8 13:20:36 2005 @@ -112,12 +112,11 @@ r = SomeBool() if s_type.is_constant(): typ = s_type.const - if typ == pypy.rpython.rarithmetic.r_uint: + if issubclass(typ, pypy.rpython.rarithmetic.base_int): if s_obj.is_constant(): r.const = isinstance(s_obj.const, typ) else: - if s_obj.knowntype == int: - r.const = s_obj.unsigned + r.const = issubclass(s_obj.knowntype, typ) else: if typ == long: getbookkeeper().warning("isinstance(., long) is not RPython") Modified: pypy/dist/pypy/rpython/test/test_rint.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rint.py (original) +++ pypy/dist/pypy/rpython/test/test_rint.py Thu Dec 8 13:20:36 2005 @@ -121,3 +121,21 @@ return int(i) res = interpret(f, [r_longlong(0)]) assert res == 0 + +def test_isinstance_vs_int_types(): + class FakeSpace(object): + def wrap(self, x): + if x is None: + return [None] + if isinstance(x, str): + return x + if isinstance(x, r_longlong): + return int(x) + return "XXX" + wrap._annspecialcase_ = 'specialize:argtype0' + + space = FakeSpace() + def wrap(x): + return space.wrap(x) + res = interpret(wrap, [r_longlong(0)]) + assert res == 0 From mwh at codespeak.net Thu Dec 8 13:29:58 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Thu, 8 Dec 2005 13:29:58 +0100 (CET) Subject: [pypy-svn] r20889 - in pypy/dist/pypy/rpython: . test Message-ID: <20051208122958.92B7027DCA@code1.codespeak.net> Author: mwh Date: Thu Dec 8 13:29:57 2005 New Revision: 20889 Modified: pypy/dist/pypy/rpython/rarithmetic.py pypy/dist/pypy/rpython/test/test_rarithmetic.py Log: implement __abs__ for our integer types. 
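The point of adding `__abs__` is that arithmetic on the rpython integer classes must preserve the class; without it, `abs()` falls back to `int.__abs__` and silently returns a plain `int`. A minimal subclass demonstrates this (with `int` standing in for the real `base_int` machinery):

```python
class r_longlong(int):
    # subset of rarithmetic's wrapping: unary ops re-wrap their result
    def __neg__(self):
        return self.__class__(-int(self))

    def __pos__(self):
        return self.__class__(self)

    def __abs__(self):
        # the method added in r20889: re-wrap so the type is preserved
        return self.__class__(abs(int(self)))
```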
Modified: pypy/dist/pypy/rpython/rarithmetic.py ============================================================================== --- pypy/dist/pypy/rpython/rarithmetic.py (original) +++ pypy/dist/pypy/rpython/rarithmetic.py Thu Dec 8 13:29:57 2005 @@ -201,6 +201,10 @@ x = long(self) return self.__class__(-x) + def __abs__(self): + x = long(self) + return self.__class__(abs(x)) + def __pos__(self): return self.__class__(self) Modified: pypy/dist/pypy/rpython/test/test_rarithmetic.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rarithmetic.py (original) +++ pypy/dist/pypy/rpython/test/test_rarithmetic.py Thu Dec 8 13:29:57 2005 @@ -262,5 +262,5 @@ else: assert False - - +def test_abs(): + assert type(abs(r_longlong(1))) is r_longlong From mwh at codespeak.net Thu Dec 8 13:30:49 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Thu, 8 Dec 2005 13:30:49 +0100 (CET) Subject: [pypy-svn] r20890 - pypy/dist/pypy/objspace/std Message-ID: <20051208123049.0D79127DCC@code1.codespeak.net> Author: mwh Date: Thu Dec 8 13:30:48 2005 New Revision: 20890 Modified: pypy/dist/pypy/objspace/std/objspace.py Log: a r_longlong case in wrap() Modified: pypy/dist/pypy/objspace/std/objspace.py ============================================================================== --- pypy/dist/pypy/objspace/std/objspace.py (original) +++ pypy/dist/pypy/objspace/std/objspace.py Thu Dec 8 13:30:48 2005 @@ -11,6 +11,7 @@ from pypy.objspace.std.multimethod import FailedToImplement from pypy.objspace.descroperation import DescrOperation from pypy.objspace.std import stdtypedef +from pypy.rpython.rarithmetic import r_longlong import sys import os @@ -262,6 +263,9 @@ if isinstance(x, long): from pypy.objspace.std.longobject import args_from_long return W_LongObject(self, *args_from_long(x)) + if isinstance(x, r_longlong): + from pypy.objspace.std.longobject import args_from_long + return W_LongObject(self, *args_from_long(x)) if 
isinstance(x, slice): return W_SliceObject(self, self.wrap(x.start), self.wrap(x.stop), From arigo at codespeak.net Thu Dec 8 13:51:49 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 8 Dec 2005 13:51:49 +0100 (CET) Subject: [pypy-svn] r20891 - in pypy/dist/pypy/rpython: . test Message-ID: <20051208125149.5CA8B27B68@code1.codespeak.net> Author: arigo Date: Thu Dec 8 13:51:48 2005 New Revision: 20891 Modified: pypy/dist/pypy/rpython/llinterp.py pypy/dist/pypy/rpython/rint.py pypy/dist/pypy/rpython/rmodel.py pypy/dist/pypy/rpython/test/test_rbool.py Log: Fixed arithmetic on bools. Modified: pypy/dist/pypy/rpython/llinterp.py ============================================================================== --- pypy/dist/pypy/rpython/llinterp.py (original) +++ pypy/dist/pypy/rpython/llinterp.py Thu Dec 8 13:51:48 2005 @@ -401,6 +401,10 @@ assert type(b) is bool return int(b) + def op_cast_bool_to_uint(self, b): + assert type(b) is bool + return r_uint(int(b)) + def op_cast_bool_to_float(self, b): assert type(b) is bool return float(b) Modified: pypy/dist/pypy/rpython/rint.py ============================================================================== --- pypy/dist/pypy/rpython/rint.py (original) +++ pypy/dist/pypy/rpython/rint.py Thu Dec 8 13:51:48 2005 @@ -193,7 +193,7 @@ if not s_int1.nonneg or not s_int2.nonneg: raise TyperError("comparing a signed and an unsigned number") - repr = hop.rtyper.makerepr(annmodel.unionof(s_int1, s_int2)) + repr = hop.rtyper.makerepr(annmodel.unionof(s_int1, s_int2)).as_int vlist = hop.inputargs(repr, repr) hop.exception_is_here() return hop.genop(repr.opprefix+func, vlist, resulttype=Bool) @@ -237,12 +237,14 @@ return hop.genop('cast_int_to_unichar', vlist, resulttype=UniChar) def rtype_is_true(self, hop): + assert self is self.as_int # rtype_is_true() is overridden in BoolRepr vlist = hop.inputargs(self) return hop.genop(self.opprefix + 'is_true', vlist, resulttype=Bool) #Unary arithmetic operations def 
rtype_abs(self, hop): + self = self.as_int if hop.s_result.unsigned: vlist = hop.inputargs(self) return vlist[0] @@ -251,6 +253,7 @@ return hop.genop(self.opprefix + 'abs', vlist, resulttype=self) def rtype_abs_ovf(self, hop): + self = self.as_int if hop.s_result.unsigned: raise TyperError("forbidden uint_abs_ovf") else: @@ -260,14 +263,17 @@ return hop.genop(self.opprefix + 'abs_ovf', vlist, resulttype=self) def rtype_invert(self, hop): + self = self.as_int vlist = hop.inputargs(self) return hop.genop(self.opprefix + 'invert', vlist, resulttype=self) def rtype_neg(self, hop): + self = self.as_int vlist = hop.inputargs(self) return hop.genop(self.opprefix + 'neg', vlist, resulttype=self) def rtype_neg_ovf(self, hop): + self = self.as_int if hop.s_result.unsigned: raise TyperError("forbidden uint_neg_ovf") else: @@ -277,6 +283,7 @@ return hop.genop(self.opprefix + 'neg_ovf', vlist, resulttype=self) def rtype_pos(self, hop): + self = self.as_int vlist = hop.inputargs(self) return vlist[0] @@ -320,13 +327,15 @@ j += 1 return result - def rtype_hex(_, hop): - varg = hop.inputarg(hop.args_r[0], 0) + def rtype_hex(self, hop): + self = self.as_int + varg = hop.inputarg(self, 0) true = inputconst(Bool, True) return hop.gendirectcall(ll_int2hex, varg, true) - def rtype_oct(_, hop): - varg = hop.inputarg(hop.args_r[0], 0) + def rtype_oct(self, hop): + self = self.as_int + varg = hop.inputarg(self, 0) true = inputconst(Bool, True) return hop.gendirectcall(ll_int2oct, varg, true) Modified: pypy/dist/pypy/rpython/rmodel.py ============================================================================== --- pypy/dist/pypy/rpython/rmodel.py (original) +++ pypy/dist/pypy/rpython/rmodel.py Thu Dec 8 13:51:48 2005 @@ -274,12 +274,14 @@ def __init__(self, lowleveltype, opprefix): self.lowleveltype = lowleveltype self.opprefix = opprefix + self.as_int = self class BoolRepr(IntegerRepr): lowleveltype = Bool - opprefix = 'int_' + # NB. no 'opprefix' here. Use 'as_int' systematically. 
def __init__(self): - pass + from pypy.rpython.rint import signed_repr + self.as_int = signed_repr class StringRepr(Repr): pass Modified: pypy/dist/pypy/rpython/test/test_rbool.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rbool.py (original) +++ pypy/dist/pypy/rpython/test/test_rbool.py Thu Dec 8 13:51:48 2005 @@ -44,3 +44,33 @@ assert res == 0 and res is not False # forced to int by static typing res = interpret(f, [True]) assert res == 2 + + def test_arithmetic_with_bool_inputs(self): + def f(n): + a = n * ((n>2) + (n>=2)) + a -= (a != n) > False + return a + (-(n<0)) + for i in [-1, 1, 2, 42]: + res = interpret(f, [i]) + assert res == f(i) + + def test_bool2str(self): + def f(n, m): + if m == 1: + return hex(n > 5) + elif m == 2: + return oct(n > 5) + else: + return str(n > 5) + res = interpret(f, [2, 0]) + assert ''.join(res.chars) in ('0', 'False') # unspecified so far + res = interpret(f, [9, 0]) + assert ''.join(res.chars) in ('1', 'True') # unspecified so far + res = interpret(f, [2, 1]) + assert ''.join(res.chars) == '0x0' + res = interpret(f, [9, 1]) + assert ''.join(res.chars) == '0x1' + res = interpret(f, [2, 2]) + assert ''.join(res.chars) == '0' + res = interpret(f, [9, 2]) + assert ''.join(res.chars) == '01' From rxe at codespeak.net Thu Dec 8 13:53:28 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Thu, 8 Dec 2005 13:53:28 +0100 (CET) Subject: [pypy-svn] r20892 - pypy/dist/pypy/translator/c/test Message-ID: <20051208125328.78E4D27B68@code1.codespeak.net> Author: rxe Date: Thu Dec 8 13:53:27 2005 New Revision: 20892 Modified: pypy/dist/pypy/translator/c/test/test_typed.py Log: Try to track some obscure bug with list length and is true - but failed. 
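The test added in r20892 below exercises the interaction between a list's implicit truth value and an explicit `len()` check. As a plain-Python sketch (outside the C backend; the original runs this through `getcompiled()`), the expected behaviour is:

```python
# Standalone sketch of the r20892 test: for a non-empty list, the
# implicit truth test in `while` and the explicit len() test must agree.
class X(object):
    pass

class A(object):
    def __init__(self):
        self.l = []

    def append_to_list(self, e):
        self.l.append(e)

    def check_list_is_true(self):
        did_loop = 0
        while self.l:          # implicit truth test on the list
            did_loop = 1
            if len(self.l):    # explicit length test must agree
                break
        return did_loop

a1 = A()
a2 = A()
for ii in range(1):
    a1.append_to_list(X())
    a2.append_to_list(X())

result = a1.check_list_is_true() + 2 * a2.check_list_is_true()
print(result)  # 3
```

The compiled version was expected to return the same value, 3; the log notes the obscure bug did not reproduce here.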
Modified: pypy/dist/pypy/translator/c/test/test_typed.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_typed.py (original) +++ pypy/dist/pypy/translator/c/test/test_typed.py Thu Dec 8 13:53:27 2005 @@ -397,3 +397,35 @@ assert fn(7, 1) == 5040 # detection must work several times, too assert fn(7, 1) == 5040 py.test.raises(RuntimeError, fn, -1, 0) + + def test_list_len_is_true(self): + + class X(object): + pass + class A(object): + def __init__(self): + self.l = [] + + def append_to_list(self, e): + self.l.append(e) + + def check_list_is_true(self): + did_loop = 0 + while self.l: + did_loop = 1 + if len(self.l): + break + return did_loop + + a1 = A() + def f(): + a2 = A() + for ii in range(1): + a1.append_to_list(X()) + a2.append_to_list(X()) + return a1.check_list_is_true() + 2 * a2.check_list_is_true() + fn = self.getcompiled(f) + assert fn() == 3 + + + From mwh at codespeak.net Thu Dec 8 13:56:23 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Thu, 8 Dec 2005 13:56:23 +0100 (CET) Subject: [pypy-svn] r20893 - pypy/dist/pypy/translator/c/src Message-ID: <20051208125623.6CDD727B68@code1.codespeak.net> Author: mwh Date: Thu Dec 8 13:56:22 2005 New Revision: 20893 Modified: pypy/dist/pypy/translator/c/src/int.h Log: (johahn, mwh) more operations for llongs Modified: pypy/dist/pypy/translator/c/src/int.h ============================================================================== --- pypy/dist/pypy/translator/c/src/int.h (original) +++ pypy/dist/pypy/translator/c/src/int.h Thu Dec 8 13:56:22 2005 @@ -281,6 +281,21 @@ #define OP_ULLONG_MUL OP_INT_MUL -#define OP_LLONG_MUL OP_INT_MUL -#define OP_LLONG_EQ OP_INT_EQ #define OP_LLONG_ADD OP_INT_ADD +#define OP_LLONG_SUB OP_INT_SUB +#define OP_LLONG_MUL OP_INT_MUL +#define OP_LLONG_DIV OP_INT_DIV + +#define OP_LLONG_EQ OP_INT_EQ +#define OP_LLONG_NE OP_INT_NE +#define OP_LLONG_LT OP_INT_LT +#define OP_LLONG_LE OP_INT_LE +#define 
OP_LLONG_GT OP_INT_GT +#define OP_LLONG_GE OP_INT_GE + +#define OP_LLONG_CMP OP_INT_CMP + +#define OP_LLONG_INVERT OP_INT_INVERT +#define OP_LLONG_AND OP_INT_AND +#define OP_LLONG_OR OP_INT_OR +#define OP_LLONG_XOR OP_INT_XOR From mwh at codespeak.net Thu Dec 8 14:17:39 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Thu, 8 Dec 2005 14:17:39 +0100 (CET) Subject: [pypy-svn] r20894 - pypy/dist/pypy/translator/c/src Message-ID: <20051208131739.23AB627B70@code1.codespeak.net> Author: mwh Date: Thu Dec 8 14:17:38 2005 New Revision: 20894 Modified: pypy/dist/pypy/translator/c/src/int.h Log: (mwh, johahn) more operations on long longs, enough for translate_pypy to work again. simple tests suggest os.lseek (the only function so far that returns a long long) works. woohoo! Modified: pypy/dist/pypy/translator/c/src/int.h ============================================================================== --- pypy/dist/pypy/translator/c/src/int.h (original) +++ pypy/dist/pypy/translator/c/src/int.h Thu Dec 8 14:17:38 2005 @@ -281,6 +281,12 @@ #define OP_ULLONG_MUL OP_INT_MUL +#define OP_LLONG_IS_TRUE OP_INT_IS_TRUE +#define OP_LLONG_INVERT OP_INT_INVERT + +#define OP_LLONG_POS OP_INT_POS +#define OP_LLONG_NEG OP_INT_NEG + #define OP_LLONG_ADD OP_INT_ADD #define OP_LLONG_SUB OP_INT_SUB #define OP_LLONG_MUL OP_INT_MUL @@ -295,7 +301,10 @@ #define OP_LLONG_CMP OP_INT_CMP -#define OP_LLONG_INVERT OP_INT_INVERT #define OP_LLONG_AND OP_INT_AND #define OP_LLONG_OR OP_INT_OR #define OP_LLONG_XOR OP_INT_XOR + +#define OP_LLONG_ABS OP_INT_ABS +#define OP_LLONG_RSHIFT OP_INT_RSHIFT +#define OP_LLONG_LSHIFT OP_INT_LSHIFT From arigo at codespeak.net Thu Dec 8 14:21:33 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 8 Dec 2005 14:21:33 +0100 (CET) Subject: [pypy-svn] r20895 - in pypy/dist/pypy: annotation translator/c/test Message-ID: <20051208132133.D816C27B70@code1.codespeak.net> Author: arigo Date: Thu Dec 8 14:21:32 2005 New Revision: 20895 Modified: 
pypy/dist/pypy/annotation/binaryop.py pypy/dist/pypy/annotation/builtin.py pypy/dist/pypy/annotation/model.py pypy/dist/pypy/annotation/unaryop.py pypy/dist/pypy/translator/c/test/test_tasklets.py pypy/dist/pypy/translator/c/test/test_typed.py Log: (pedronis, arigo) Fixed misusages of s.is_constant() for non-immutable objects -- in this case, s.const is valid as far as its identity is concerned, but s.const can still get mutated. Modified: pypy/dist/pypy/annotation/binaryop.py ============================================================================== --- pypy/dist/pypy/annotation/binaryop.py (original) +++ pypy/dist/pypy/annotation/binaryop.py Thu Dec 8 14:21:32 2005 @@ -101,42 +101,42 @@ inplace_mod.can_only_throw = [ZeroDivisionError] def lt((obj1, obj2)): - if obj1.is_constant() and obj2.is_constant(): + if obj1.is_immutable_constant() and obj2.is_immutable_constant(): return immutablevalue(obj1.const < obj2.const) else: getbookkeeper().count("non_int_comp", obj1, obj2) return SomeBool() def le((obj1, obj2)): - if obj1.is_constant() and obj2.is_constant(): + if obj1.is_immutable_constant() and obj2.is_immutable_constant(): return immutablevalue(obj1.const <= obj2.const) else: getbookkeeper().count("non_int_comp", obj1, obj2) return SomeBool() def eq((obj1, obj2)): - if obj1.is_constant() and obj2.is_constant(): + if obj1.is_immutable_constant() and obj2.is_immutable_constant(): return immutablevalue(obj1.const == obj2.const) else: getbookkeeper().count("non_int_eq", obj1, obj2) return SomeBool() def ne((obj1, obj2)): - if obj1.is_constant() and obj2.is_constant(): + if obj1.is_immutable_constant() and obj2.is_immutable_constant(): return immutablevalue(obj1.const != obj2.const) else: getbookkeeper().count("non_int_eq", obj1, obj2) return SomeBool() def gt((obj1, obj2)): - if obj1.is_constant() and obj2.is_constant(): + if obj1.is_immutable_constant() and obj2.is_immutable_constant(): return immutablevalue(obj1.const > obj2.const) else: 
getbookkeeper().count("non_int_comp", obj1, obj2) return SomeBool() def ge((obj1, obj2)): - if obj1.is_constant() and obj2.is_constant(): + if obj1.is_immutable_constant() and obj2.is_immutable_constant(): return immutablevalue(obj1.const >= obj2.const) else: getbookkeeper().count("non_int_comp", obj1, obj2) @@ -144,7 +144,7 @@ def cmp((obj1, obj2)): getbookkeeper().count("cmp", obj1, obj2) - if obj1.is_constant() and obj2.is_constant(): + if obj1.is_immutable_constant() and obj2.is_immutable_constant(): return immutablevalue(cmp(obj1.const, obj2.const)) else: return SomeInteger() Modified: pypy/dist/pypy/annotation/builtin.py ============================================================================== --- pypy/dist/pypy/annotation/builtin.py (original) +++ pypy/dist/pypy/annotation/builtin.py Thu Dec 8 14:21:32 2005 @@ -30,7 +30,7 @@ """ args = [] for s in args_s: - if not s.is_constant(): + if not s.is_immutable_constant(): return s_result args.append(s.const) realresult = func(*args) @@ -70,7 +70,7 @@ builtin_xrange = builtin_range # xxx for now allow it def builtin_bool(s_obj): - return constpropagate(bool, [s_obj], SomeBool()) + return s_obj.is_true() def builtin_int(s_obj, s_base=None): assert (s_base is None or isinstance(s_base, SomeInteger) @@ -172,7 +172,7 @@ getbookkeeper().warning('hasattr(%r, %r) is not RPythonic enough' % (s_obj, s_attr)) r = SomeBool() - if s_obj.is_constant(): + if s_obj.is_immutable_constant(): r.const = hasattr(s_obj.const, s_attr.const) elif (isinstance(s_obj, SomePBC) and s_obj.getKind() is description.FrozenDesc): Modified: pypy/dist/pypy/annotation/model.py ============================================================================== --- pypy/dist/pypy/annotation/model.py (original) +++ pypy/dist/pypy/annotation/model.py Thu Dec 8 14:21:32 2005 @@ -63,6 +63,7 @@ for an arbitrary object about which nothing is known.""" __metaclass__ = extendabletype knowntype = object + immutable = False def __eq__(self, other): return 
(self.__class__ is other.__class__ and @@ -113,6 +114,9 @@ def is_constant(self): return hasattr(self, 'const') + def is_immutable_constant(self): + return self.immutable and hasattr(self, 'const') + # for debugging, record where each instance comes from # this is disabled if DEBUG is set to False _coming_from = {} @@ -152,6 +156,7 @@ "Stands for a float or an integer." knowntype = float # if we don't know if it's a float or an int, # pretend it's a float. + immutable = True def can_be_none(self): return False @@ -192,6 +197,7 @@ class SomeString(SomeObject): "Stands for an object which is known to be a string." knowntype = str + immutable = True def __init__(self, can_be_None=False): self.can_be_None = can_be_None @@ -205,8 +211,9 @@ "Stands for an object known to be a string of length 1." class SomeUnicodeCodePoint(SomeObject): - knowntype = unicode "Stands for an object known to be a unicode codepoint." + knowntype = unicode + immutable = True def can_be_none(self): return False @@ -232,6 +239,7 @@ class SomeSlice(SomeObject): knowntype = slice + immutable = True def __init__(self, start, stop, step): self.start = start self.stop = stop @@ -243,6 +251,7 @@ class SomeTuple(SomeObject): "Stands for a tuple of known length." knowntype = tuple + immutable = True def __init__(self, items): self.items = tuple(items) # tuple of s_xxx elements for i in items: @@ -316,6 +325,8 @@ class SomePBC(SomeObject): """Stands for a global user instance, built prior to the analysis, or a set of such instances.""" + immutable = True + def __init__(self, descriptions, can_be_None=False): # descriptions is a set of Desc instances. descriptions = dict.fromkeys(descriptions) @@ -390,6 +401,8 @@ class SomeBuiltin(SomeObject): "Stands for a built-in function or method with special-cased analysis." 
knowntype = BuiltinFunctionType # == BuiltinMethodType + immutable = True + def __init__(self, analyser, s_self=None, methodname=None): self.analyser = analyser self.s_self = s_self @@ -414,6 +427,7 @@ class SomeImpossibleValue(SomeObject): """The empty set. Instances are placeholders for objects that will never show up at run-time, e.g. elements of an empty list.""" + immutable = True def can_be_none(self): return False @@ -428,6 +442,7 @@ from pypy.rpython.memory import lladdress class SomeAddress(SomeObject): + immutable = True def __init__(self, is_null=False): self.is_null = is_null @@ -450,6 +465,7 @@ # annotation of low-level types class SomePtr(SomeObject): + immutable = True def __init__(self, ll_ptrtype): self.ll_ptrtype = ll_ptrtype @@ -457,6 +473,7 @@ return False class SomeLLADTMeth(SomeObject): + immutable = True def __init__(self, ll_ptrtype, func): self.ll_ptrtype = ll_ptrtype self.func = func @@ -473,11 +490,13 @@ self.ootype = ootype class SomeOOBoundMeth(SomeObject): + immutable = True def __init__(self, ootype, name): self.ootype = ootype self.name = name class SomeOOStaticMeth(SomeObject): + immutable = True def __init__(self, method): self.method = method Modified: pypy/dist/pypy/annotation/unaryop.py ============================================================================== --- pypy/dist/pypy/annotation/unaryop.py (original) +++ pypy/dist/pypy/annotation/unaryop.py Thu Dec 8 14:21:32 2005 @@ -66,7 +66,7 @@ return SomeInteger(nonneg=True) def is_true_behavior(obj): - if obj.is_constant(): + if obj.is_immutable_constant(): return immutablevalue(bool(obj.const)) else: s_len = obj.len() @@ -151,7 +151,7 @@ if s_method is not None: return s_method # if the SomeObject is itself a constant, allow reading its attrs - if obj.is_constant() and hasattr(obj.const, attr): + if obj.is_immutable_constant() and hasattr(obj.const, attr): return immutablevalue(getattr(obj.const, attr)) else: getbookkeeper().warning('getattr(%r, %r) is not RPythonic enough' 
% @@ -479,6 +479,12 @@ getbookkeeper().needs_hash_support[ins.classdef] = True return SomeInteger() + def is_true_behavior(ins): + if ins.can_be_None: + return SomeBool() + else: + return immutablevalue(True) + class __extend__(SomeBuiltin): def simple_call(bltn, *args): Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_tasklets.py (original) +++ pypy/dist/pypy/translator/c/test/test_tasklets.py Thu Dec 8 14:21:32 2005 @@ -68,7 +68,7 @@ def run(self): debug("running: length of runnables %s" % len(self.runnables)) - while len(self.runnables): + while self.runnables: t = self.runnables.pop(0) debug("resuming %s(%s)" % (t.name, t.alive)) self.current_tasklet = t @@ -117,4 +117,5 @@ run() return c.get_count() == 25 - assert wrap_stackless_function(f) == '1' + res = wrap_stackless_function(f) + assert res == '1' Modified: pypy/dist/pypy/translator/c/test/test_typed.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_typed.py (original) +++ pypy/dist/pypy/translator/c/test/test_typed.py Thu Dec 8 14:21:32 2005 @@ -419,13 +419,8 @@ a1 = A() def f(): - a2 = A() for ii in range(1): a1.append_to_list(X()) - a2.append_to_list(X()) - return a1.check_list_is_true() + 2 * a2.check_list_is_true() + return a1.check_list_is_true() fn = self.getcompiled(f) - assert fn() == 3 - - - + assert fn() == 1 From arigo at codespeak.net Thu Dec 8 14:29:19 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 8 Dec 2005 14:29:19 +0100 (CET) Subject: [pypy-svn] r20896 - pypy/dist/pypy/translator/test Message-ID: <20051208132919.2582B27DB9@code1.codespeak.net> Author: arigo Date: Thu Dec 8 14:29:18 2005 New Revision: 20896 Modified: pypy/dist/pypy/translator/test/snippet.py Log: (pedronis, arigo) Moved bits around this file, to hide them from geninterp. 
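The r20895 change above replaces `is_constant()` with `is_immutable_constant()` wherever constant-folding happens: a prebuilt mutable object has a well-defined identity in `s.const`, but its contents can still change, so folding its value is only safe for immutable annotations. A minimal self-contained sketch of the distinction introduced in `annotation/model.py` (class names `SomeString`/`SomeList` here stand in for the real annotation hierarchy):

```python
# Sketch of r20895: constant-folding is only safe when the constant's
# value cannot be mutated behind the annotator's back.
class SomeObject(object):
    immutable = False          # default, as in annotation/model.py

    def is_constant(self):
        return hasattr(self, 'const')

    def is_immutable_constant(self):
        return self.immutable and hasattr(self, 'const')

class SomeString(SomeObject):
    immutable = True           # a string never mutates: folding is safe

class SomeList(SomeObject):    # hypothetical mutable annotation
    pass

s = SomeString(); s.const = "abc"
l = SomeList();   l.const = [1, 2, 3]

print(s.is_constant(), s.is_immutable_constant())  # True True
print(l.is_constant(), l.is_immutable_constant())  # True False
```

Comparisons like `lt`/`eq` in `binaryop.py` now fold only in the first case; the list keeps its identity but is never folded.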
Modified: pypy/dist/pypy/translator/test/snippet.py ============================================================================== --- pypy/dist/pypy/translator/test/snippet.py (original) +++ pypy/dist/pypy/translator/test/snippet.py Thu Dec 8 14:29:18 2005 @@ -84,54 +84,6 @@ def simple_func(i=numtype): return i + 1 -from pypy.rpython.rarithmetic import ovfcheck, ovfcheck_lshift - -def add_func(i=numtype): - try: - return ovfcheck(i + 1) - except OverflowError: - raise - -from sys import maxint - -def div_func(i=numtype): - try: - return ovfcheck((-maxint-1) // i) - except (OverflowError, ZeroDivisionError): - raise - -def mod_func(i=numtype): - try: - return ovfcheck((-maxint-1) % i) - except OverflowError: - raise - except ZeroDivisionError: - raise - -def rshift_func(i=numtype): - try: - return (-maxint-1) >> i - except ValueError: - raise - -class hugelmugel(OverflowError):pass - -def hugo(a, b, c):pass - -def lshift_func(i=numtype): - try: - hugo(2, 3, 5) - return ovfcheck_lshift((-maxint-1), i) - except (hugelmugel, OverflowError, StandardError, ValueError): - raise - -def unary_func(i=numtype): - try: - return ovfcheck(-i), ovfcheck(abs(i-1)) - except: raise - # XXX it would be nice to get it right without an exception - # handler at all, but then we need to do much harder parsing - def while_func(i=numtype): total = 0 while i > 0: @@ -1174,3 +1126,52 @@ #if not isinstance(istk, PolyStk): # return "confused" return istk.top(), sstk.top() + + +from pypy.rpython.rarithmetic import ovfcheck, ovfcheck_lshift + +def add_func(i=numtype): + try: + return ovfcheck(i + 1) + except OverflowError: + raise + +from sys import maxint + +def div_func(i=numtype): + try: + return ovfcheck((-maxint-1) // i) + except (OverflowError, ZeroDivisionError): + raise + +def mod_func(i=numtype): + try: + return ovfcheck((-maxint-1) % i) + except OverflowError: + raise + except ZeroDivisionError: + raise + +def rshift_func(i=numtype): + try: + return (-maxint-1) >> i + except 
ValueError: + raise + +class hugelmugel(OverflowError):pass + +def hugo(a, b, c):pass + +def lshift_func(i=numtype): + try: + hugo(2, 3, 5) + return ovfcheck_lshift((-maxint-1), i) + except (hugelmugel, OverflowError, StandardError, ValueError): + raise + +def unary_func(i=numtype): + try: + return ovfcheck(-i), ovfcheck(abs(i-1)) + except: raise + # XXX it would be nice to get it right without an exception + # handler at all, but then we need to do much harder parsing From cfbolz at codespeak.net Thu Dec 8 14:58:19 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 8 Dec 2005 14:58:19 +0100 (CET) Subject: [pypy-svn] r20897 - in pypy/dist/pypy/jit: . test Message-ID: <20051208135819.724F627DB9@code1.codespeak.net> Author: cfbolz Date: Thu Dec 8 14:58:18 2005 New Revision: 20897 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_llabstractinterp.py Log: (cfbolz; arigo, pedronis floating around) Make the first arraytest pass by adding some operations Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Thu Dec 8 14:58:18 2005 @@ -288,6 +288,9 @@ self.residual_operations.append(op) def residualize(self, op, args_a, constant_op=None): + RESULT = op.result.concretetype + if RESULT is lltype.Void: + return XXX_later if constant_op: a_result = self.constantfold(constant_op, args_a) if a_result is not None: @@ -313,6 +316,12 @@ def op_int_gt(self, op, a1, a2): return self.residualize(op, [a1, a2], operator.gt) + def op_int_lt(self, op, a1, a2): + return self.residualize(op, [a1, a2], operator.lt) + + def op_cast_char_to_int(self, op, a): + return self.residualize(op, [a], ord) + def op_same_as(self, op, a): return a @@ -343,22 +352,31 @@ return a_result def op_getfield(self, op, a_ptr, a_attrname): + constant_op = None T = a_ptr.getconcretetype().TO - 
attrname = a_attrname.getvarorconst().value - RESULT = getattr(T, attrname) - if RESULT is lltype.Void: - return XXX_later v_ptr = a_ptr.getvarorconst() if isinstance(v_ptr, Constant): if T._hints.get('immutable', False): - concreteresult = getattr(v_ptr.value, attrname) - if isinstance(a_ptr, LLConcreteValue): - a_result = LLConcreteValue(concreteresult) - else: - c_result = Constant(concreteresult) - c_result.concretetype = lltype.typeOf(concreteresult) - a_result = LLRuntimeValue(c_result) - return a_result - a_result = LLRuntimeValue(op.result) - self.residual("getfield", [a_ptr, a_attrname], a_result) - return a_result + constant_op = getattr + return self.residualize(op, [a_ptr, a_attrname], constant_op) + op_getsubstruct = op_getfield + + def op_getarraysize(self, op, a_ptr): + constant_op = None + T = a_ptr.getconcretetype().TO + v_ptr = a_ptr.getvarorconst() + if isinstance(v_ptr, Constant): + if T._hints.get('immutable', False): + constant_op = len + return self.residualize(op, [a_ptr], constant_op) + + def op_getarrayitem(self, op, a_ptr, a_index): + constant_op = None + T = a_ptr.getconcretetype().TO + v_ptr = a_ptr.getvarorconst() + if isinstance(v_ptr, Constant): + if T._hints.get('immutable', False): + constant_op = operator.getitem + return self.residualize(op, [a_ptr, a_index], constant_op) + + Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/test/test_llabstractinterp.py (original) +++ pypy/dist/pypy/jit/test/test_llabstractinterp.py Thu Dec 8 14:58:18 2005 @@ -153,3 +153,19 @@ return s.hello * s.world graph2, insns = abstrinterp(ll_function, [s], [0]) assert insns == {} + +def test_simple_array(): + A = lltype.Array(lltype.Char, + hints={'immutable': True}) + S = lltype.GcStruct('str', ('chars', A), + hints={'immutable': True}) + s = lltype.malloc(S, 11) + for i, c in enumerate("hello world"): + s.chars[i] = c + def ll_function(s, i, 
total): + while i < len(s.chars): + total += ord(s.chars[i]) + i += 1 + return total + graph2, insns = abstrinterp(ll_function, [s, 0, 0], [0, 1, 2]) + assert insns == {} From nik at codespeak.net Thu Dec 8 15:09:37 2005 From: nik at codespeak.net (nik at codespeak.net) Date: Thu, 8 Dec 2005 15:09:37 +0100 (CET) Subject: [pypy-svn] r20898 - in pypy/dist/pypy/module/_socket: . rpython test Message-ID: <20051208140937.4C48F27DB9@code1.codespeak.net> Author: nik Date: Thu Dec 8 15:09:35 2005 New Revision: 20898 Modified: pypy/dist/pypy/module/_socket/interp_socket.py pypy/dist/pypy/module/_socket/rpython/rsocket.py pypy/dist/pypy/module/_socket/test/test_socket2.py Log: (ale, nik) first steps implementing socket.connect. not rtyper support, yet. some mystery test failures remaining. Modified: pypy/dist/pypy/module/_socket/interp_socket.py ============================================================================== --- pypy/dist/pypy/module/_socket/interp_socket.py (original) +++ pypy/dist/pypy/module/_socket/interp_socket.py Thu Dec 8 15:09:35 2005 @@ -695,13 +695,27 @@ Connect the socket to a remote address. For IP sockets, the address is a pair (host, port). 
""" - addr = space.unwrap(w_addr) - try: - self.fd.connect(addr) - except socket.timeout: - raise wrap_timeouterror(space) - except socket.error, e: - raise wrap_socketerror(space, e) + if self.family == socket.AF_INET: + if not (space.is_true(space.isinstance(w_addr, space.w_tuple)) and + space.int_w(space.len(w_addr)) == 2): + raise OperationError(space.w_TypeError, + space.wrap("AF_INET address must be tuple of length 2")) + addr_w = space.unpackiterable(w_addr) + if not (space.is_true(space.isinstance(addr_w[0], space.w_str)) + and space.is_true(space.isinstance(addr_w[1], space.w_int))): + raise OperationError(space.w_TypeError, + space.wrap("tuple of a string and an int required")) + host = space.str_w(addr_w[0]) + port = space.int_w(addr_w[1]) + try: + rsocket.connect(self.fd, host, port) + except socket.timeout: + raise wrap_timeouterror(space) + except socket.error, e: + raise wrap_socketerror(space, e) + else: + # XXX IPv6 and Unix sockets missing here + pass connect.unwrap_spec = ['self', ObjSpace, W_Root] def connect_ex(self, space, w_addr): @@ -748,7 +762,9 @@ info is a pair (hostaddr, port). 
""" try: - return space.wrap(self.fd.getpeername()) + name = rsocket.getpeername(self.fd) + # XXX IPv4 only + return space.newtuple([space.wrap(name[0]), space.wrap(name[1])]) except socket.error, e: raise wrap_socketerror(space, e) getpeername.unwrap_spec = ['self', ObjSpace] @@ -959,10 +975,11 @@ """.split() socketmethods = {} for methodname in socketmethodnames: - method = getattr(Socket, methodname) - assert hasattr(method,'unwrap_spec'), methodname - assert method.im_func.func_code.co_argcount == len(method.unwrap_spec), methodname - socketmethods[methodname] = interp2app(method, unwrap_spec=method.unwrap_spec) + if hasattr(_socket.socket, methodname): + method = getattr(Socket, methodname) + assert hasattr(method,'unwrap_spec'), methodname + assert method.im_func.func_code.co_argcount == len(method.unwrap_spec), methodname + socketmethods[methodname] = interp2app(method, unwrap_spec=method.unwrap_spec) Socket.typedef = TypeDef("_socket.socket", __doc__ = """\ Modified: pypy/dist/pypy/module/_socket/rpython/rsocket.py ============================================================================== --- pypy/dist/pypy/module/_socket/rpython/rsocket.py (original) +++ pypy/dist/pypy/module/_socket/rpython/rsocket.py Thu Dec 8 15:09:35 2005 @@ -4,7 +4,7 @@ import socket -keep_sockets_alive = [] +keep_sockets_alive = {} class ADDRINFO(object): # a simulated addrinfo structure from C, i.e. a chained list @@ -32,5 +32,19 @@ s = socket.socket(family, type, protocol) # HACK: We have to prevent GC to collect the socket object because we don't # want it to be closed. 
- keep_sockets_alive.append(s) - return s.fileno() + fileno = s.fileno() + keep_sockets_alive[fileno] = s + return fileno + +def connect(fd, host, port): + # XXX IPv4 only + s = keep_sockets_alive[fd] + try: + s.connect((host, port)) + except Exception, ex: + print ex + +def getpeername(fd): + s = keep_sockets_alive[fd] + return s.getpeername() + Modified: pypy/dist/pypy/module/_socket/test/test_socket2.py ============================================================================== --- pypy/dist/pypy/module/_socket/test/test_socket2.py (original) +++ pypy/dist/pypy/module/_socket/test/test_socket2.py Thu Dec 8 15:09:35 2005 @@ -291,6 +291,27 @@ os.close(s.fileno()) raises(_socket.error, s.close) +def app_test_socket_connect(): + import _socket, os + s = _socket.socket(_socket.AF_INET, _socket.SOCK_STREAM, 0) + s.connect(("codespeak.net", 80)) + name = s.getpeername() # Will raise socket.error if not connected + assert name[1] == 80 + s.close() + +def DONOT_app_test_socket_connect_typeerrors(): + tests = [ + "", + ("80"), + ("80", "80"), + (80, 80), + ] + import _socket + s = _socket.socket(_socket.AF_INET, _socket.SOCK_STREAM, 0) + for args in tests: + raises(TypeError, s.connect, args) + s.close() + class AppTestSocket: def setup_class(cls): From tismer at codespeak.net Thu Dec 8 17:49:06 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Thu, 8 Dec 2005 17:49:06 +0100 (CET) Subject: [pypy-svn] r20900 - pypy/dist/pypy/translator/c/test Message-ID: <20051208164906.6637827B66@code1.codespeak.net> Author: tismer Date: Thu Dec 8 17:49:05 2005 New Revision: 20900 Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py Log: added backend optimizations. scheduler_run juggled 15 variables around, now it is down to 12. But I still think this is too much! 
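The r20898 change above turns `keep_sockets_alive` from a list into a dict keyed by fileno, so that `connect()` and `getpeername()` can look a socket object back up from the bare file descriptor while still pinning it against garbage collection (which would close the fd). A hedged sketch of the pattern, using the real `socket` module (the function names mirror `rpython/rsocket.py`; error handling is omitted):

```python
import socket

# r20898 pattern: hold a reference to each socket object, keyed by its
# file descriptor, so the GC cannot collect (and thereby close) it while
# the rest of the code only passes around the integer fd.
keep_sockets_alive = {}

def newsocket(family, type, protocol):
    s = socket.socket(family, type, protocol)
    fileno = s.fileno()
    keep_sockets_alive[fileno] = s   # keep the object alive
    return fileno

def getpeername(fd):
    # recover the socket object from the fd and delegate
    return keep_sockets_alive[fd].getpeername()

fd = newsocket(socket.AF_INET, socket.SOCK_STREAM, 0)
print(fd >= 0, fd in keep_sockets_alive)  # True True
```

As the log says, this is explicitly a hack: the dict grows without bound, since nothing ever removes a closed socket.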
Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_tasklets.py (original) +++ pypy/dist/pypy/translator/c/test/test_tasklets.py Thu Dec 8 17:49:05 2005 @@ -4,6 +4,7 @@ from pypy.annotation.listdef import ListDef from pypy.rpython.rstack import stack_unwind, stack_frames_depth, stack_too_big from pypy.rpython.rstack import yield_current_frame_to_caller +from pypy.translator.backendopt.all import backend_optimizations import os def wrap_stackless_function(fn): @@ -16,6 +17,7 @@ t = TranslationContext() t.buildannotator().build_types(entry_point, [s_list_of_strings]) t.buildrtyper().specialize() + backend_optimizations(t) cbuilder = CStandaloneBuilder(t, entry_point) cbuilder.stackless = True cbuilder.generate_source() From ericvrp at codespeak.net Thu Dec 8 18:14:48 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Thu, 8 Dec 2005 18:14:48 +0100 (CET) Subject: [pypy-svn] r20902 - in pypy/dist/pypy/translator/js: src test Message-ID: <20051208171448.5B32127B68@code1.codespeak.net> Author: ericvrp Date: Thu Dec 8 18:14:47 2005 New Revision: 20902 Modified: pypy/dist/pypy/translator/js/src/ll_stackless.js pypy/dist/pypy/translator/js/test/test_stackless.py Log: test for playing with long running processes Modified: pypy/dist/pypy/translator/js/src/ll_stackless.js ============================================================================== --- pypy/dist/pypy/translator/js/src/ll_stackless.js (original) +++ pypy/dist/pypy/translator/js/src/ll_stackless.js Thu Dec 8 18:14:47 2005 @@ -3,6 +3,7 @@ var slp_frame_stack_top = null; var slp_frame_stack_bottom = null; var slp_return_value = undefined; +var slp_start_time = undefined; var slp_stack_depth = 0; // This gets called with --log @@ -40,14 +41,26 @@ } return result; } +ll_stackless_stack_frames_depth__ = ll_stackless_stack_frames_depth // function ll_stack_too_big() { var 
result = slp_stack_depth > 500; // Firefox has a recursion limit of 1000 (others allow more) LOG("ll_stack_to_big result=" + result); + + if (!result) { + var t = new Date().getTime(); + var d = t - slp_start_time; + result = d > 1000; + if (result) { + print('XXX d='+d + ' XXX t='+t); + slp_start_time = t; + } + } return result; } +ll_stack_too_big__ = ll_stack_too_big function slp_new_frame(targetvar, func, resume_blocknum, vars) { //LOG("slp_new_frame("+targetvar+","+function_name(func)+","+resume_blocknum+","+vars.toSource()+")"); @@ -83,6 +96,7 @@ LOG('slp_frame_stack_top='+slp_frame_stack_top + ', slp_frame_stack_bottom='+slp_frame_stack_bottom) return slp_return_value; } +ll_stack_unwind__ = ll_stack_unwind function slp_return_current_frame_to_caller() { LOG("slp_return_current_frame_to_caller"); @@ -125,6 +139,7 @@ f.p0 = c; slp_frame_stack_top = slp_frame_stack_bottom = f; } +ll_stackless_switch__frame_stack_topPtr = ll_stackless_switch; // main dispatcher loop @@ -137,6 +152,7 @@ while (true) { f_back = pending.f_back; LOG('calling: ' + function_name(pending.func)); + //slp_start_time = new Date().getTime(); //XXX should really exit javascript and resume with setTimeout(...) slp_stack_depth = 0; // we are restarting to recurse slp_return_value = pending.func(); // params get initialized in the function because it's a resume! 
if (slp_frame_stack_top) { @@ -157,6 +173,7 @@ } function slp_entry_point(funcstring) { + slp_start_time = new Date().getTime(); slp_stack_depth = 0; /// initial stack depth var result = eval(funcstring); if (slp_frame_stack_bottom) { // get with dispatch loop when stack unwound Modified: pypy/dist/pypy/translator/js/test/test_stackless.py ============================================================================== --- pypy/dist/pypy/translator/js/test/test_stackless.py (original) +++ pypy/dist/pypy/translator/js/test/test_stackless.py Thu Dec 8 18:14:47 2005 @@ -180,3 +180,17 @@ data = wrap_stackless_function(f) assert int(data.strip()) == 7495 + +def test_long_running(): + n_iterations = 50000 + + def g(x): + if x > 0: + g(x-1) + return x + + def lp(): + return g(n_iterations) + + data = wrap_stackless_function(lp) + assert int(data.strip()) == n_iterations From ludal at codespeak.net Thu Dec 8 18:25:36 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Thu, 8 Dec 2005 18:25:36 +0100 (CET) Subject: [pypy-svn] r20903 - in pypy/dist/pypy: interpreter/astcompiler interpreter/pyparser interpreter/pyparser/test interpreter/stablecompiler module/recparser Message-ID: <20051208172536.D80EF27B68@code1.codespeak.net> Author: ludal Date: Thu Dec 8 18:25:31 2005 New Revision: 20903 Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py pypy/dist/pypy/interpreter/astcompiler/ast.txt pypy/dist/pypy/interpreter/astcompiler/astgen.py pypy/dist/pypy/interpreter/astcompiler/consts.py pypy/dist/pypy/interpreter/astcompiler/pycodegen.py pypy/dist/pypy/interpreter/astcompiler/symbols.py pypy/dist/pypy/interpreter/pyparser/astbuilder.py pypy/dist/pypy/interpreter/pyparser/grammar.py pypy/dist/pypy/interpreter/pyparser/pythonparse.py pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py pypy/dist/pypy/interpreter/pyparser/test/test_samples.py pypy/dist/pypy/interpreter/stablecompiler/consts.py pypy/dist/pypy/interpreter/stablecompiler/pycodegen.py 
pypy/dist/pypy/module/recparser/__init__.py
Log:
(adim, ludal) Allow the AST tree to be exported to application level, and also export the grammar rules. All attributes of all AST nodes are made available, and a callback can be installed with parser.install_compiler_hook, which allows modification of the AST just before compilation.
Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.py Thu Dec 8 18:25:31 2005 @@ -5,9 +5,10 @@ """ from consts import CO_VARARGS, CO_VARKEYWORDS, OP_ASSIGN from pypy.interpreter.baseobjspace import Wrappable -from pypy.interpreter.typedef import TypeDef +from pypy.interpreter.typedef import TypeDef, GetSetProperty from pypy.interpreter.gateway import interp2app, W_Root, ObjSpace from pypy.interpreter.argument import Arguments +from pypy.interpreter.error import OperationError def flatten(list): l = [] @@ -117,13 +118,15 @@ def accept(self, visitor): return visitor.visitAbstractFunction(self) + def descr_AbstractFunction_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitAbstractFunction')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) AbstractFunction.typedef = TypeDef('AbstractFunction', Node.typedef, - accept=interp2app(descr_AbstractFunction_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_AbstractFunction_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + ) class AbstractTest(Node): def __init__(self, lineno=-1): @@ -142,13 +145,15 @@ def accept(self, visitor): return visitor.visitAbstractTest(self) + def descr_AbstractTest_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitAbstractTest')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) AbstractTest.typedef = TypeDef('AbstractTest', Node.typedef, -
accept=interp2app(descr_AbstractTest_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_AbstractTest_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + ) class BinaryOp(Node): def __init__(self, lineno=-1): @@ -167,16 +172,18 @@ def accept(self, visitor): return visitor.visitBinaryOp(self) + def descr_BinaryOp_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitBinaryOp')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) BinaryOp.typedef = TypeDef('BinaryOp', Node.typedef, - accept=interp2app(descr_BinaryOp_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_BinaryOp_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + ) class Add(BinaryOp): - def __init__(self, (left, right), lineno=-1): + def __init__(self, left, right, lineno=-1): Node.__init__(self, lineno) self.left = left self.right = right @@ -189,18 +196,36 @@ return [self.left, self.right] def __repr__(self): - return "Add((%s, %s))" % (self.left.__repr__(), self.right.__repr__()) + return "Add(%s, %s)" % (self.left.__repr__(), self.right.__repr__()) def accept(self, visitor): return visitor.visitAdd(self) + def fget_left( space, self): + return space.wrap(self.left) + def fset_left( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.left = obj + def fget_right( space, self): + return space.wrap(self.right) + def fset_right( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.right = obj + def descr_Add_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitAdd')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) -Add.typedef = TypeDef('Add', Node.typedef, - 
accept=interp2app(descr_Add_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) +Add.typedef = TypeDef('Add', BinaryOp.typedef, + accept=interp2app(descr_Add_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + left=GetSetProperty(Add.fget_left, Add.fset_left ), + right=GetSetProperty(Add.fget_right, Add.fset_right ), + ) class And(AbstractTest): def __init__(self, nodes, lineno=-1): @@ -222,13 +247,22 @@ def accept(self, visitor): return visitor.visitAnd(self) + def fget_nodes( space, self): + return space.newlist( [space.wrap(itm) for itm in self.nodes] ) + def fset_nodes( space, self, w_arg): + del self.nodes[:] + for w_itm in space.unpackiterable( w_arg ): + self.nodes.append( space.interpclass_w( w_arg ) ) + def descr_And_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitAnd')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) -And.typedef = TypeDef('And', Node.typedef, - accept=interp2app(descr_And_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) +And.typedef = TypeDef('And', AbstractTest.typedef, + accept=interp2app(descr_And_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + nodes=GetSetProperty(And.fget_nodes, And.fset_nodes ), + ) class AssAttr(Node): def __init__(self, expr, attrname, flags, lineno=-1): @@ -250,13 +284,33 @@ def accept(self, visitor): return visitor.visitAssAttr(self) + def fget_expr( space, self): + return space.wrap(self.expr) + def fset_expr( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.expr = obj + def fget_attrname( space, self): + return space.wrap(self.attrname) + def fset_attrname( space, self, w_arg): + self.attrname = space.str_w(w_arg) + def fget_flags( space, self): + return space.wrap(self.flags) + def fset_flags( space, self, w_arg): + self.flags = space.int_w(w_arg) + def descr_AssAttr_accept( space, w_self, w_visitor): 
w_callable = space.getattr(w_visitor, space.wrap('visitAssAttr')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) AssAttr.typedef = TypeDef('AssAttr', Node.typedef, - accept=interp2app(descr_AssAttr_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_AssAttr_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + expr=GetSetProperty(AssAttr.fget_expr, AssAttr.fset_expr ), + attrname=GetSetProperty(AssAttr.fget_attrname, AssAttr.fset_attrname ), + flags=GetSetProperty(AssAttr.fget_flags, AssAttr.fset_flags ), + ) class AssSeq(Node): def __init__(self, lineno=-1): @@ -275,13 +329,15 @@ def accept(self, visitor): return visitor.visitAssSeq(self) + def descr_AssSeq_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitAssSeq')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) AssSeq.typedef = TypeDef('AssSeq', Node.typedef, - accept=interp2app(descr_AssSeq_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_AssSeq_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + ) class AssList(AssSeq): def __init__(self, nodes, lineno=-1): @@ -303,13 +359,22 @@ def accept(self, visitor): return visitor.visitAssList(self) + def fget_nodes( space, self): + return space.newlist( [space.wrap(itm) for itm in self.nodes] ) + def fset_nodes( space, self, w_arg): + del self.nodes[:] + for w_itm in space.unpackiterable( w_arg ): + self.nodes.append( space.interpclass_w( w_arg ) ) + def descr_AssList_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitAssList')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) -AssList.typedef = TypeDef('AssList', Node.typedef, - accept=interp2app(descr_AssList_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) +AssList.typedef = TypeDef('AssList', AssSeq.typedef, + accept=interp2app(descr_AssList_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + 
nodes=GetSetProperty(AssList.fget_nodes, AssList.fset_nodes ), + ) class AssName(Node): def __init__(self, name, flags, lineno=-1): @@ -330,13 +395,25 @@ def accept(self, visitor): return visitor.visitAssName(self) + def fget_name( space, self): + return space.wrap(self.name) + def fset_name( space, self, w_arg): + self.name = space.str_w(w_arg) + def fget_flags( space, self): + return space.wrap(self.flags) + def fset_flags( space, self, w_arg): + self.flags = space.int_w(w_arg) + def descr_AssName_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitAssName')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) AssName.typedef = TypeDef('AssName', Node.typedef, - accept=interp2app(descr_AssName_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_AssName_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + name=GetSetProperty(AssName.fget_name, AssName.fset_name ), + flags=GetSetProperty(AssName.fget_flags, AssName.fset_flags ), + ) class AssTuple(AssSeq): def __init__(self, nodes, lineno=-1): @@ -364,6 +441,7 @@ else: assert False, "should only have AssName and AssTuple as children" return argnames + def __repr__(self): return "AssTuple(%s)" % (self.nodes.__repr__(),) @@ -371,13 +449,22 @@ def accept(self, visitor): return visitor.visitAssTuple(self) + def fget_nodes( space, self): + return space.newlist( [space.wrap(itm) for itm in self.nodes] ) + def fset_nodes( space, self, w_arg): + del self.nodes[:] + for w_itm in space.unpackiterable( w_arg ): + self.nodes.append( space.interpclass_w( w_arg ) ) + def descr_AssTuple_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitAssTuple')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) -AssTuple.typedef = TypeDef('AssTuple', Node.typedef, - accept=interp2app(descr_AssTuple_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) +AssTuple.typedef = TypeDef('AssTuple', 
AssSeq.typedef, + accept=interp2app(descr_AssTuple_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + nodes=GetSetProperty(AssTuple.fget_nodes, AssTuple.fset_nodes ), + ) class Assert(Node): def __init__(self, test, fail, lineno=-1): @@ -405,13 +492,37 @@ def accept(self, visitor): return visitor.visitAssert(self) + def fget_test( space, self): + return space.wrap(self.test) + def fset_test( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.test = obj + def fget_fail( space, self): + if self.fail is None: + return space.w_None + else: + return space.wrap(self.fail) + def fset_fail( space, self, w_arg): + if space.is_w( w_arg, space.w_None ): + self.fail = None + else: + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.fail = obj + def descr_Assert_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitAssert')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Assert.typedef = TypeDef('Assert', Node.typedef, - accept=interp2app(descr_Assert_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_Assert_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + test=GetSetProperty(Assert.fget_test, Assert.fset_test ), + fail=GetSetProperty(Assert.fget_fail, Assert.fset_fail ), + ) class Assign(Node): def __init__(self, nodes, expr, lineno=-1): @@ -438,13 +549,30 @@ def accept(self, visitor): return visitor.visitAssign(self) + def fget_nodes( space, self): + return space.newlist( [space.wrap(itm) for itm in self.nodes] ) + def fset_nodes( space, self, w_arg): + del self.nodes[:] + for w_itm in space.unpackiterable( w_arg ): + self.nodes.append( space.interpclass_w( w_arg ) ) + def fget_expr( space, self): + return space.wrap(self.expr) + def fset_expr( space, 
self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.expr = obj + def descr_Assign_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitAssign')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Assign.typedef = TypeDef('Assign', Node.typedef, - accept=interp2app(descr_Assign_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_Assign_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + nodes=GetSetProperty(Assign.fget_nodes, Assign.fset_nodes ), + expr=GetSetProperty(Assign.fget_expr, Assign.fset_expr ), + ) class AugAssign(Node): def __init__(self, node, op, expr, lineno=-1): @@ -466,13 +594,36 @@ def accept(self, visitor): return visitor.visitAugAssign(self) + def fget_node( space, self): + return space.wrap(self.node) + def fset_node( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.node = obj + def fget_op( space, self): + return space.wrap(self.op) + def fset_op( space, self, w_arg): + self.op = space.str_w(w_arg) + def fget_expr( space, self): + return space.wrap(self.expr) + def fset_expr( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.expr = obj + def descr_AugAssign_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitAugAssign')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) AugAssign.typedef = TypeDef('AugAssign', Node.typedef, - accept=interp2app(descr_AugAssign_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_AugAssign_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + 
node=GetSetProperty(AugAssign.fget_node, AugAssign.fset_node ), + op=GetSetProperty(AugAssign.fget_op, AugAssign.fset_op ), + expr=GetSetProperty(AugAssign.fget_expr, AugAssign.fset_expr ), + ) class UnaryOp(Node): def __init__(self, lineno=-1): @@ -491,13 +642,15 @@ def accept(self, visitor): return visitor.visitUnaryOp(self) + def descr_UnaryOp_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitUnaryOp')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) UnaryOp.typedef = TypeDef('UnaryOp', Node.typedef, - accept=interp2app(descr_UnaryOp_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_UnaryOp_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + ) class Backquote(UnaryOp): def __init__(self, expr, lineno=-1): @@ -517,13 +670,23 @@ def accept(self, visitor): return visitor.visitBackquote(self) + def fget_expr( space, self): + return space.wrap(self.expr) + def fset_expr( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.expr = obj + def descr_Backquote_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitBackquote')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) -Backquote.typedef = TypeDef('Backquote', Node.typedef, - accept=interp2app(descr_Backquote_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) +Backquote.typedef = TypeDef('Backquote', UnaryOp.typedef, + accept=interp2app(descr_Backquote_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + expr=GetSetProperty(Backquote.fget_expr, Backquote.fset_expr ), + ) class BitOp(Node): def __init__(self, lineno=-1): @@ -542,13 +705,15 @@ def accept(self, visitor): return visitor.visitBitOp(self) + def descr_BitOp_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitBitOp')) args = 
Arguments(space, [ w_self ]) return space.call_args(w_callable, args) BitOp.typedef = TypeDef('BitOp', Node.typedef, - accept=interp2app(descr_BitOp_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_BitOp_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + ) class Bitand(BitOp): def __init__(self, nodes, lineno=-1): @@ -570,13 +735,22 @@ def accept(self, visitor): return visitor.visitBitand(self) + def fget_nodes( space, self): + return space.newlist( [space.wrap(itm) for itm in self.nodes] ) + def fset_nodes( space, self, w_arg): + del self.nodes[:] + for w_itm in space.unpackiterable( w_arg ): + self.nodes.append( space.interpclass_w( w_arg ) ) + def descr_Bitand_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitBitand')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) -Bitand.typedef = TypeDef('Bitand', Node.typedef, - accept=interp2app(descr_Bitand_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) +Bitand.typedef = TypeDef('Bitand', BitOp.typedef, + accept=interp2app(descr_Bitand_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + nodes=GetSetProperty(Bitand.fget_nodes, Bitand.fset_nodes ), + ) class Bitor(BitOp): def __init__(self, nodes, lineno=-1): @@ -598,13 +772,22 @@ def accept(self, visitor): return visitor.visitBitor(self) + def fget_nodes( space, self): + return space.newlist( [space.wrap(itm) for itm in self.nodes] ) + def fset_nodes( space, self, w_arg): + del self.nodes[:] + for w_itm in space.unpackiterable( w_arg ): + self.nodes.append( space.interpclass_w( w_arg ) ) + def descr_Bitor_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitBitor')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) -Bitor.typedef = TypeDef('Bitor', Node.typedef, - accept=interp2app(descr_Bitor_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) +Bitor.typedef = TypeDef('Bitor', BitOp.typedef, + 
accept=interp2app(descr_Bitor_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + nodes=GetSetProperty(Bitor.fget_nodes, Bitor.fset_nodes ), + ) class Bitxor(BitOp): def __init__(self, nodes, lineno=-1): @@ -626,13 +809,22 @@ def accept(self, visitor): return visitor.visitBitxor(self) + def fget_nodes( space, self): + return space.newlist( [space.wrap(itm) for itm in self.nodes] ) + def fset_nodes( space, self, w_arg): + del self.nodes[:] + for w_itm in space.unpackiterable( w_arg ): + self.nodes.append( space.interpclass_w( w_arg ) ) + def descr_Bitxor_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitBitxor')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) -Bitxor.typedef = TypeDef('Bitxor', Node.typedef, - accept=interp2app(descr_Bitxor_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) +Bitxor.typedef = TypeDef('Bitxor', BitOp.typedef, + accept=interp2app(descr_Bitxor_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + nodes=GetSetProperty(Bitxor.fget_nodes, Bitxor.fset_nodes ), + ) class Break(Node): def __init__(self, lineno=-1): @@ -651,13 +843,15 @@ def accept(self, visitor): return visitor.visitBreak(self) + def descr_Break_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitBreak')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Break.typedef = TypeDef('Break', Node.typedef, - accept=interp2app(descr_Break_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_Break_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + ) class CallFunc(Node): def __init__(self, node, args, star_args = None, dstar_args = None, lineno=-1): @@ -692,20 +886,65 @@ def accept(self, visitor): return visitor.visitCallFunc(self) + def fget_node( space, self): + return space.wrap(self.node) + def fset_node( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise 
OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.node = obj + def fget_args( space, self): + return space.newlist( [space.wrap(itm) for itm in self.args] ) + def fset_args( space, self, w_arg): + del self.args[:] + for w_itm in space.unpackiterable( w_arg ): + self.args.append( space.interpclass_w( w_arg ) ) + def fget_star_args( space, self): + if self.star_args is None: + return space.w_None + else: + return space.wrap(self.star_args) + def fset_star_args( space, self, w_arg): + if space.is_w( w_arg, space.w_None ): + self.star_args = None + else: + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.star_args = obj + def fget_dstar_args( space, self): + if self.dstar_args is None: + return space.w_None + else: + return space.wrap(self.dstar_args) + def fset_dstar_args( space, self, w_arg): + if space.is_w( w_arg, space.w_None ): + self.dstar_args = None + else: + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.dstar_args = obj + def descr_CallFunc_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitCallFunc')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) CallFunc.typedef = TypeDef('CallFunc', Node.typedef, - accept=interp2app(descr_CallFunc_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_CallFunc_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + node=GetSetProperty(CallFunc.fget_node, CallFunc.fset_node ), + args=GetSetProperty(CallFunc.fget_args, CallFunc.fset_args ), + star_args=GetSetProperty(CallFunc.fget_star_args, CallFunc.fset_star_args ), + dstar_args=GetSetProperty(CallFunc.fget_dstar_args, CallFunc.fset_dstar_args ), + ) class Class(Node): - def __init__(self, name, bases, doc, code, lineno=-1): + def __init__(self, name, 
bases, w_doc, code, lineno=-1): Node.__init__(self, lineno) self.name = name self.bases = bases - self.doc = doc + self.w_doc = w_doc self.code = code def getChildren(self): @@ -713,7 +952,7 @@ children = [] children.append(self.name) children.extend(flatten(self.bases)) - children.append(self.doc) + children.append(self.w_doc) children.append(self.code) return tuple(children) @@ -724,18 +963,45 @@ return nodelist def __repr__(self): - return "Class(%s, %s, %s, %s)" % (self.name.__repr__(), self.bases.__repr__(), self.doc.__repr__(), self.code.__repr__()) + return "Class(%s, %s, %s, %s)" % (self.name.__repr__(), self.bases.__repr__(), self.w_doc.__repr__(), self.code.__repr__()) def accept(self, visitor): return visitor.visitClass(self) + def fget_name( space, self): + return space.wrap(self.name) + def fset_name( space, self, w_arg): + self.name = space.str_w(w_arg) + def fget_bases( space, self): + return space.newlist( [space.wrap(itm) for itm in self.bases] ) + def fset_bases( space, self, w_arg): + del self.bases[:] + for w_itm in space.unpackiterable( w_arg ): + self.bases.append( space.interpclass_w( w_arg ) ) + def fget_w_doc( space, self): + return self.w_doc + def fset_w_doc( space, self, w_arg): + self.w_doc = w_arg + def fget_code( space, self): + return space.wrap(self.code) + def fset_code( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.code = obj + def descr_Class_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitClass')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Class.typedef = TypeDef('Class', Node.typedef, - accept=interp2app(descr_Class_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_Class_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + name=GetSetProperty(Class.fget_name, Class.fset_name ), + 
bases=GetSetProperty(Class.fget_bases, Class.fset_bases ), + w_doc=GetSetProperty(Class.fget_w_doc, Class.fset_w_doc ), + code=GetSetProperty(Class.fget_code, Class.fset_code ), + ) class Compare(Node): def __init__(self, expr, ops, lineno=-1): @@ -759,19 +1025,49 @@ nodelist.append(node) return nodelist + def fset_ops( space, self, w_arg ): + del self.ops[:] + for w_obj in space.unpackiterable( w_arg ): + w_opname = space.getitem( w_obj, space.wrap(0) ) + w_node = space.getitem( w_obj, space.wrap(1) ) + ops = space.str_w(w_opname) + node = space.interpclass_w( w_node ) + if not isinstance(node, Node): + raise OperationError(space.w_TypeError, space.wrap("ops must be a list of (name,node)")) + self.ops.append( (ops,node) ) + + + def fget_ops( space, self ): + lst = [] + for op_name, node in self.ops: + lst.append( space.newtuple( [ space.wrap(op_name), space.wrap(node) ] ) ) + return space.newlist( lst ) + + def __repr__(self): return "Compare(%s, %s)" % (self.expr.__repr__(), self.ops.__repr__()) def accept(self, visitor): return visitor.visitCompare(self) + def fget_expr( space, self): + return space.wrap(self.expr) + def fset_expr( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.expr = obj + def descr_Compare_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitCompare')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Compare.typedef = TypeDef('Compare', Node.typedef, - accept=interp2app(descr_Compare_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_Compare_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + expr=GetSetProperty(Compare.fget_expr, Compare.fset_expr ), + ops=GetSetProperty(Compare.fget_ops, Compare.fset_ops ), + ) class Const(Node): def __init__(self, value, lineno=-1): @@ -791,13 +1087,20 @@ def accept(self, visitor): return 
visitor.visitConst(self) + def fget_value( space, self): + return self.value + def fset_value( space, self, w_arg): + self.value = w_arg + def descr_Const_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitConst')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Const.typedef = TypeDef('Const', Node.typedef, - accept=interp2app(descr_Const_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_Const_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + value=GetSetProperty(Const.fget_value, Const.fset_value ), + ) class Continue(Node): def __init__(self, lineno=-1): @@ -816,13 +1119,15 @@ def accept(self, visitor): return visitor.visitContinue(self) + def descr_Continue_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitContinue')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Continue.typedef = TypeDef('Continue', Node.typedef, - accept=interp2app(descr_Continue_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_Continue_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + ) class Decorators(Node): def __init__(self, nodes, lineno=-1): @@ -844,13 +1149,22 @@ def accept(self, visitor): return visitor.visitDecorators(self) + def fget_nodes( space, self): + return space.newlist( [space.wrap(itm) for itm in self.nodes] ) + def fset_nodes( space, self, w_arg): + del self.nodes[:] + for w_itm in space.unpackiterable( w_arg ): + self.nodes.append( space.interpclass_w( w_arg ) ) + def descr_Decorators_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitDecorators')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Decorators.typedef = TypeDef('Decorators', Node.typedef, - accept=interp2app(descr_Decorators_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_Decorators_accept, unwrap_spec=[ 
ObjSpace, W_Root, W_Root ] ),
+                     nodes=GetSetProperty(Decorators.fget_nodes, Decorators.fset_nodes ),
+                     )

 class Dict(Node):
     def __init__(self, items, lineno=-1):
@@ -870,19 +1184,39 @@
             nodelist.append(value)
         return nodelist

+    def fset_items( space, self, w_arg ):
+        del self.items[:]
+        for w_tup in space.unpackiterable( w_arg ):
+            w_key = space.getitem( w_tup, space.wrap(0) )
+            w_value = space.getitem( w_tup, space.wrap(1) )
+            key = space.interpclass_w( w_key )
+            value = space.interpclass_w( w_value )
+            if not isinstance( key, Node ) or not isinstance( value, Node ):
+                raise OperationError(space.w_TypeError, space.wrap("Need a list of (key node, value node)"))
+            self.items.append( (key,value) )
+
+
+    def fget_items( space, self ):
+        return space.newlist( [ space.newtuple( [ space.wrap(key), space.wrap(value) ] )
+                                for key, value in self.items ] )
+
+
     def __repr__(self):
         return "Dict(%s)" % (self.items.__repr__(),)

     def accept(self, visitor):
         return visitor.visitDict(self)

+
 def descr_Dict_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitDict'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

 Dict.typedef = TypeDef('Dict', Node.typedef,
-                     accept=interp2app(descr_Dict_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+                     accept=interp2app(descr_Dict_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     items=GetSetProperty(Dict.fget_items, Dict.fset_items ),
+                     )

 class Discard(Node):
     def __init__(self, expr, lineno=-1):
@@ -902,16 +1236,26 @@
     def accept(self, visitor):
         return visitor.visitDiscard(self)

+    def fget_expr( space, self):
+        return space.wrap(self.expr)
+    def fset_expr( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.expr = obj
+
 def descr_Discard_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitDiscard'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

 Discard.typedef = TypeDef('Discard', Node.typedef,
-                     accept=interp2app(descr_Discard_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+                     accept=interp2app(descr_Discard_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     expr=GetSetProperty(Discard.fget_expr, Discard.fset_expr ),
+                     )

 class Div(BinaryOp):
-    def __init__(self, (left, right), lineno=-1):
+    def __init__(self, left, right, lineno=-1):
         Node.__init__(self, lineno)
         self.left = left
         self.right = right
@@ -924,18 +1268,36 @@
         return [self.left, self.right]

     def __repr__(self):
-        return "Div((%s, %s))" % (self.left.__repr__(), self.right.__repr__())
+        return "Div(%s, %s)" % (self.left.__repr__(), self.right.__repr__())

     def accept(self, visitor):
         return visitor.visitDiv(self)

+    def fget_left( space, self):
+        return space.wrap(self.left)
+    def fset_left( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.left = obj
+    def fget_right( space, self):
+        return space.wrap(self.right)
+    def fset_right( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.right = obj
+
 def descr_Div_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitDiv'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

-Div.typedef = TypeDef('Div', Node.typedef,
-                     accept=interp2app(descr_Div_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+Div.typedef = TypeDef('Div', BinaryOp.typedef,
+                     accept=interp2app(descr_Div_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     left=GetSetProperty(Div.fget_left, Div.fset_left ),
+                     right=GetSetProperty(Div.fget_right, Div.fset_right ),
+                     )

 class Ellipsis(Node):
     def __init__(self, lineno=-1):
@@ -954,13 +1316,15 @@
     def accept(self, visitor):
         return visitor.visitEllipsis(self)

+
 def descr_Ellipsis_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitEllipsis'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

 Ellipsis.typedef = TypeDef('Ellipsis', Node.typedef,
-                     accept=interp2app(descr_Ellipsis_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+                     accept=interp2app(descr_Ellipsis_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     )

 class Exec(Node):
     def __init__(self, expr, locals, globals, lineno=-1):
@@ -992,16 +1356,54 @@
     def accept(self, visitor):
         return visitor.visitExec(self)

+    def fget_expr( space, self):
+        return space.wrap(self.expr)
+    def fset_expr( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.expr = obj
+    def fget_locals( space, self):
+        if self.locals is None:
+            return space.w_None
+        else:
+            return space.wrap(self.locals)
+    def fset_locals( space, self, w_arg):
+        if space.is_w( w_arg, space.w_None ):
+            self.locals = None
+        else:
+            obj = space.interpclass_w( w_arg )
+            if not isinstance( obj, Node):
+                raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+            self.locals = obj
+    def fget_globals( space, self):
+        if self.globals is None:
+            return space.w_None
+        else:
+            return space.wrap(self.globals)
+    def fset_globals( space, self, w_arg):
+        if space.is_w( w_arg, space.w_None ):
+            self.globals = None
+        else:
+            obj = space.interpclass_w( w_arg )
+            if not isinstance( obj, Node):
+                raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+            self.globals = obj
+
 def descr_Exec_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitExec'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

 Exec.typedef = TypeDef('Exec', Node.typedef,
-                     accept=interp2app(descr_Exec_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+                     accept=interp2app(descr_Exec_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     expr=GetSetProperty(Exec.fget_expr, Exec.fset_expr ),
+                     locals=GetSetProperty(Exec.fget_locals, Exec.fset_locals ),
+                     globals=GetSetProperty(Exec.fget_globals, Exec.fset_globals ),
+                     )

 class FloorDiv(BinaryOp):
-    def __init__(self, (left, right), lineno=-1):
+    def __init__(self, left, right, lineno=-1):
         Node.__init__(self, lineno)
         self.left = left
         self.right = right
@@ -1014,18 +1416,36 @@
         return [self.left, self.right]

     def __repr__(self):
-        return "FloorDiv((%s, %s))" % (self.left.__repr__(), self.right.__repr__())
+        return "FloorDiv(%s, %s)" % (self.left.__repr__(), self.right.__repr__())

     def accept(self, visitor):
         return visitor.visitFloorDiv(self)

+    def fget_left( space, self):
+        return space.wrap(self.left)
+    def fset_left( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.left = obj
+    def fget_right( space, self):
+        return space.wrap(self.right)
+    def fset_right( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.right = obj
+
 def descr_FloorDiv_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitFloorDiv'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

-FloorDiv.typedef = TypeDef('FloorDiv', Node.typedef,
-                     accept=interp2app(descr_FloorDiv_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+FloorDiv.typedef = TypeDef('FloorDiv', BinaryOp.typedef,
+                     accept=interp2app(descr_FloorDiv_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     left=GetSetProperty(FloorDiv.fget_left, FloorDiv.fset_left ),
+                     right=GetSetProperty(FloorDiv.fget_right, FloorDiv.fset_right ),
+                     )

 class For(Node):
     def __init__(self, assign, list, body, else_, lineno=-1):
@@ -1059,13 +1479,53 @@
     def accept(self, visitor):
         return visitor.visitFor(self)

+    def fget_assign( space, self):
+        return space.wrap(self.assign)
+    def fset_assign( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.assign = obj
+    def fget_list( space, self):
+        return space.wrap(self.list)
+    def fset_list( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.list = obj
+    def fget_body( space, self):
+        return space.wrap(self.body)
+    def fset_body( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.body = obj
+    def fget_else_( space, self):
+        if self.else_ is None:
+            return space.w_None
+        else:
+            return space.wrap(self.else_)
+    def fset_else_( space, self, w_arg):
+        if space.is_w( w_arg, space.w_None ):
+            self.else_ = None
+        else:
+            obj = space.interpclass_w( w_arg )
+            if not isinstance( obj, Node):
+                raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+            self.else_ = obj
+
 def descr_For_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitFor'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

 For.typedef = TypeDef('For', Node.typedef,
-                     accept=interp2app(descr_For_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+                     accept=interp2app(descr_For_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     assign=GetSetProperty(For.fget_assign, For.fset_assign ),
+                     list=GetSetProperty(For.fget_list, For.fset_list ),
+                     body=GetSetProperty(For.fget_body, For.fset_body ),
+                     else_=GetSetProperty(For.fget_else_, For.fset_else_ ),
+                     )

 class From(Node):
     def __init__(self, modname, names, lineno=-1):
@@ -1080,29 +1540,54 @@
     def getChildNodes(self):
         return []

+    def fget_names( space, self ):
+        return space.newlist( [ space.newtuple( [ space.wrap(name), space.wrap(as_name) ] )
+                                for name, as_name in self.names ] )
+
+
+    def fset_names( space, self, w_arg ):
+        del self.names[:]
+        for w_tup in space.unpackiterable( w_arg ):
+            w_name = space.getitem( w_tup, space.wrap(0) )
+            w_as_name = space.getitem( w_tup, space.wrap(1) )
+            name = space.str_w( w_name )
+            as_name = None
+            if not space.is_w( w_as_name, space.w_None ):
+                as_name = space.str_w( w_as_name )
+            self.names.append( (name, as_name) )
+
+
     def __repr__(self):
         return "From(%s, %s)" % (self.modname.__repr__(), self.names.__repr__())

     def accept(self, visitor):
         return visitor.visitFrom(self)

+    def fget_modname( space, self):
+        return space.wrap(self.modname)
+    def fset_modname( space, self, w_arg):
+        self.modname = space.str_w(w_arg)
+
 def descr_From_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitFrom'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

 From.typedef = TypeDef('From', Node.typedef,
-                     accept=interp2app(descr_From_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+                     accept=interp2app(descr_From_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     modname=GetSetProperty(From.fget_modname, From.fset_modname ),
+                     names=GetSetProperty(From.fget_names, From.fset_names ),
+                     )

 class Function(AbstractFunction):
-    def __init__(self, decorators, name, argnames, defaults, flags, doc, code, lineno=-1):
+    def __init__(self, decorators, name, argnames, defaults, flags, w_doc, code, lineno=-1):
         Node.__init__(self, lineno)
         self.decorators = decorators
         self.name = name
         self.argnames = argnames
         self.defaults = defaults
         self.flags = flags
-        self.doc = doc
+        self.w_doc = w_doc
         self.code = code
         self.varargs = self.kwargs = 0
         if flags & CO_VARARGS:
@@ -1117,10 +1602,10 @@
         children = []
         children.append(self.decorators)
         children.append(self.name)
-        children.append(self.argnames)
+        children.extend(flatten(self.argnames))
         children.extend(flatten(self.defaults))
         children.append(self.flags)
-        children.append(self.doc)
+        children.append(self.w_doc)
         children.append(self.code)
         return tuple(children)

@@ -1128,23 +1613,77 @@
         nodelist = []
         if self.decorators is not None:
             nodelist.append(self.decorators)
+        nodelist.extend(self.argnames)
         nodelist.extend(self.defaults)
         nodelist.append(self.code)
         return nodelist

     def __repr__(self):
-        return "Function(%s, %s, %s, %s, %s, %s, %s)" % (self.decorators.__repr__(), self.name.__repr__(), self.argnames.__repr__(), self.defaults.__repr__(), self.flags.__repr__(), self.doc.__repr__(), self.code.__repr__())
+        return "Function(%s, %s, %s, %s, %s, %s, %s)" % (self.decorators.__repr__(), self.name.__repr__(), self.argnames.__repr__(), self.defaults.__repr__(), self.flags.__repr__(), self.w_doc.__repr__(), self.code.__repr__())

     def accept(self, visitor):
         return visitor.visitFunction(self)

+    def fget_decorators( space, self):
+        if self.decorators is None:
+            return space.w_None
+        else:
+            return space.wrap(self.decorators)
+    def fset_decorators( space, self, w_arg):
+        if space.is_w( w_arg, space.w_None ):
+            self.decorators = None
+        else:
+            obj = space.interpclass_w( w_arg )
+            if not isinstance( obj, Node):
+                raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+            self.decorators = obj
+    def fget_name( space, self):
+        return space.wrap(self.name)
+    def fset_name( space, self, w_arg):
+        self.name = space.str_w(w_arg)
+    def fget_argnames( space, self):
+        return space.newlist( [space.wrap(itm) for itm in self.argnames] )
+    def fset_argnames( space, self, w_arg):
+        del self.argnames[:]
+        for w_itm in space.unpackiterable( w_arg ):
+            self.argnames.append( space.interpclass_w( w_arg ) )
+    def fget_defaults( space, self):
+        return space.newlist( [space.wrap(itm) for itm in self.defaults] )
+    def fset_defaults( space, self, w_arg):
+        del self.defaults[:]
+        for w_itm in space.unpackiterable( w_arg ):
+            self.defaults.append( space.interpclass_w( w_arg ) )
+    def fget_flags( space, self):
+        return space.wrap(self.flags)
+    def fset_flags( space, self, w_arg):
+        self.flags = space.int_w(w_arg)
+    def fget_w_doc( space, self):
+        return self.w_doc
+    def fset_w_doc( space, self, w_arg):
+        self.w_doc = w_arg
+    def fget_code( space, self):
+        return space.wrap(self.code)
+    def fset_code( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.code = obj
+
 def descr_Function_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitFunction'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

-Function.typedef = TypeDef('Function', Node.typedef,
-                     accept=interp2app(descr_Function_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+Function.typedef = TypeDef('Function', AbstractFunction.typedef,
+                     accept=interp2app(descr_Function_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     decorators=GetSetProperty(Function.fget_decorators, Function.fset_decorators ),
+                     name=GetSetProperty(Function.fget_name, Function.fset_name ),
+                     argnames=GetSetProperty(Function.fget_argnames, Function.fset_argnames ),
+                     defaults=GetSetProperty(Function.fget_defaults, Function.fset_defaults ),
+                     flags=GetSetProperty(Function.fget_flags, Function.fset_flags ),
+                     w_doc=GetSetProperty(Function.fget_w_doc, Function.fset_w_doc ),
+                     code=GetSetProperty(Function.fget_code, Function.fset_code ),
+                     )

 class GenExpr(AbstractFunction):
     def __init__(self, code, lineno=-1):
@@ -1168,13 +1707,23 @@
     def accept(self, visitor):
         return visitor.visitGenExpr(self)

+    def fget_code( space, self):
+        return space.wrap(self.code)
+    def fset_code( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.code = obj
+
 def descr_GenExpr_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitGenExpr'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

-GenExpr.typedef = TypeDef('GenExpr', Node.typedef,
-                     accept=interp2app(descr_GenExpr_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+GenExpr.typedef = TypeDef('GenExpr', AbstractFunction.typedef,
+                     accept=interp2app(descr_GenExpr_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     code=GetSetProperty(GenExpr.fget_code, GenExpr.fset_code ),
+                     )

 class GenExprFor(Node):
     def __init__(self, assign, iter, ifs, lineno=-1):
@@ -1207,13 +1756,38 @@
     def accept(self, visitor):
         return visitor.visitGenExprFor(self)

+    def fget_assign( space, self):
+        return space.wrap(self.assign)
+    def fset_assign( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.assign = obj
+    def fget_iter( space, self):
+        return space.wrap(self.iter)
+    def fset_iter( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.iter = obj
+    def fget_ifs( space, self):
+        return space.newlist( [space.wrap(itm) for itm in self.ifs] )
+    def fset_ifs( space, self, w_arg):
+        del self.ifs[:]
+        for w_itm in space.unpackiterable( w_arg ):
+            self.ifs.append( space.interpclass_w( w_arg ) )
+
 def descr_GenExprFor_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitGenExprFor'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

 GenExprFor.typedef = TypeDef('GenExprFor', Node.typedef,
-                     accept=interp2app(descr_GenExprFor_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+                     accept=interp2app(descr_GenExprFor_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     assign=GetSetProperty(GenExprFor.fget_assign, GenExprFor.fset_assign ),
+                     iter=GetSetProperty(GenExprFor.fget_iter, GenExprFor.fset_iter ),
+                     ifs=GetSetProperty(GenExprFor.fget_ifs, GenExprFor.fset_ifs ),
+                     )

 class GenExprIf(Node):
     def __init__(self, test, lineno=-1):
@@ -1233,13 +1807,23 @@
     def accept(self, visitor):
         return visitor.visitGenExprIf(self)

+    def fget_test( space, self):
+        return space.wrap(self.test)
+    def fset_test( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.test = obj
+
 def descr_GenExprIf_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitGenExprIf'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

 GenExprIf.typedef = TypeDef('GenExprIf', Node.typedef,
-                     accept=interp2app(descr_GenExprIf_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+                     accept=interp2app(descr_GenExprIf_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     test=GetSetProperty(GenExprIf.fget_test, GenExprIf.fset_test ),
+                     )

 class GenExprInner(Node):
     def __init__(self, expr, quals, lineno=-1):
@@ -1266,13 +1850,30 @@
     def accept(self, visitor):
         return visitor.visitGenExprInner(self)

+    def fget_expr( space, self):
+        return space.wrap(self.expr)
+    def fset_expr( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.expr = obj
+    def fget_quals( space, self):
+        return space.newlist( [space.wrap(itm) for itm in self.quals] )
+    def fset_quals( space, self, w_arg):
+        del self.quals[:]
+        for w_itm in space.unpackiterable( w_arg ):
+            self.quals.append( space.interpclass_w( w_arg ) )
+
 def descr_GenExprInner_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitGenExprInner'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

 GenExprInner.typedef = TypeDef('GenExprInner', Node.typedef,
-                     accept=interp2app(descr_GenExprInner_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+                     accept=interp2app(descr_GenExprInner_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     expr=GetSetProperty(GenExprInner.fget_expr, GenExprInner.fset_expr ),
+                     quals=GetSetProperty(GenExprInner.fget_quals, GenExprInner.fset_quals ),
+                     )

 class Getattr(Node):
     def __init__(self, expr, attrname, lineno=-1):
@@ -1293,13 +1894,28 @@
     def accept(self, visitor):
         return visitor.visitGetattr(self)

+    def fget_expr( space, self):
+        return space.wrap(self.expr)
+    def fset_expr( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.expr = obj
+    def fget_attrname( space, self):
+        return space.wrap(self.attrname)
+    def fset_attrname( space, self, w_arg):
+        self.attrname = space.str_w(w_arg)
+
 def descr_Getattr_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitGetattr'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

 Getattr.typedef = TypeDef('Getattr', Node.typedef,
-                     accept=interp2app(descr_Getattr_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+                     accept=interp2app(descr_Getattr_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     expr=GetSetProperty(Getattr.fget_expr, Getattr.fset_expr ),
+                     attrname=GetSetProperty(Getattr.fget_attrname, Getattr.fset_attrname ),
+                     )

 class Global(Node):
     def __init__(self, names, lineno=-1):
@@ -1319,13 +1935,22 @@
     def accept(self, visitor):
         return visitor.visitGlobal(self)

+    def fget_names( space, self):
+        return space.newlist( [space.wrap(itm) for itm in self.names] )
+    def fset_names( space, self, w_arg):
+        del self.names[:]
+        for itm in space.unpackiterable(w_arg):
+            self.names.append( space.str_w(itm) )
+
 def descr_Global_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitGlobal'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

 Global.typedef = TypeDef('Global', Node.typedef,
-                     accept=interp2app(descr_Global_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+                     accept=interp2app(descr_Global_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     names=GetSetProperty(Global.fget_names, Global.fset_names ),
+                     )

 class If(Node):
     def __init__(self, tests, else_, lineno=-1):
@@ -1351,19 +1976,56 @@
             nodelist.append(self.else_)
         return nodelist

+    def fset_tests( space, self, w_arg ):
+        del self.tests[:]
+        for w_tup in space.unpackiterable( w_arg ):
+            w_test = space.getitem( w_tup, space.wrap(0) )
+            w_suite = space.getitem( w_tup, space.wrap(1) )
+            test = space.interpclass_w( w_test )
+            suite = space.interpclass_w( w_suite )
+            if not isinstance( test, Node ) or not isinstance( suite, Node ):
+                raise OperationError(space.w_TypeError, space.wrap("Need a list of (test,suite) nodes") )
+            self.tests.append( (test,suite) )
+
+
+
+
+    def fget_tests( space, self ):
+        return space.newlist( [ space.newtuple( [ space.wrap(test),
+                                                  space.wrap(suite) ] )
+                                for test, suite in self.tests ] )
+
+
     def __repr__(self):
         return "If(%s, %s)" % (self.tests.__repr__(), self.else_.__repr__())

     def accept(self, visitor):
         return visitor.visitIf(self)

+    def fget_else_( space, self):
+        if self.else_ is None:
+            return space.w_None
+        else:
+            return space.wrap(self.else_)
+    def fset_else_( space, self, w_arg):
+        if space.is_w( w_arg, space.w_None ):
+            self.else_ = None
+        else:
+            obj = space.interpclass_w( w_arg )
+            if not isinstance( obj, Node):
+                raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+            self.else_ = obj
+
 def descr_If_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitIf'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

 If.typedef = TypeDef('If', Node.typedef,
-                     accept=interp2app(descr_If_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+                     accept=interp2app(descr_If_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     tests=GetSetProperty(If.fget_tests, If.fset_tests ),
+                     else_=GetSetProperty(If.fget_else_, If.fset_else_ ),
+                     )

 class Import(Node):
     def __init__(self, names, lineno=-1):
@@ -1377,19 +2039,39 @@
     def getChildNodes(self):
         return []

+    def fget_names( space, self ):
+        return space.newlist( [ space.newtuple( [ space.wrap(name), space.wrap(as_name) ] )
+                                for name, as_name in self.names ] )
+
+
+    def fset_names( space, self, w_arg ):
+        del self.names[:]
+        for w_tup in space.unpackiterable( w_arg ):
+            w_name = space.getitem( w_tup, space.wrap(0) )
+            w_as_name = space.getitem( w_tup, space.wrap(1) )
+            name = space.str_w( w_name )
+            as_name = None
+            if not space.is_w( w_as_name, space.w_None ):
+                as_name = space.str_w( w_as_name )
+            self.names.append( (name, as_name) )
+
+
     def __repr__(self):
         return "Import(%s)" % (self.names.__repr__(),)

     def accept(self, visitor):
         return visitor.visitImport(self)

+
 def descr_Import_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitImport'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

 Import.typedef = TypeDef('Import', Node.typedef,
-                     accept=interp2app(descr_Import_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+                     accept=interp2app(descr_Import_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     names=GetSetProperty(Import.fget_names, Import.fset_names ),
+                     )

 class Invert(UnaryOp):
     def __init__(self, expr, lineno=-1):
@@ -1409,13 +2091,23 @@
     def accept(self, visitor):
         return visitor.visitInvert(self)

+    def fget_expr( space, self):
+        return space.wrap(self.expr)
+    def fset_expr( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.expr = obj
+
 def descr_Invert_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitInvert'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

-Invert.typedef = TypeDef('Invert', Node.typedef,
-                     accept=interp2app(descr_Invert_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+Invert.typedef = TypeDef('Invert', UnaryOp.typedef,
+                     accept=interp2app(descr_Invert_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     expr=GetSetProperty(Invert.fget_expr, Invert.fset_expr ),
+                     )

 class Keyword(Node):
     def __init__(self, name, expr, lineno=-1):
@@ -1436,13 +2128,28 @@
     def accept(self, visitor):
         return visitor.visitKeyword(self)

+    def fget_name( space, self):
+        return space.wrap(self.name)
+    def fset_name( space, self, w_arg):
+        self.name = space.str_w(w_arg)
+    def fget_expr( space, self):
+        return space.wrap(self.expr)
+    def fset_expr( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.expr = obj
+
 def descr_Keyword_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitKeyword'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

 Keyword.typedef = TypeDef('Keyword', Node.typedef,
-                     accept=interp2app(descr_Keyword_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+                     accept=interp2app(descr_Keyword_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     name=GetSetProperty(Keyword.fget_name, Keyword.fset_name ),
+                     expr=GetSetProperty(Keyword.fget_expr, Keyword.fset_expr ),
+                     )

 class Lambda(AbstractFunction):
     def __init__(self, argnames, defaults, flags, code, lineno=-1):
@@ -1462,7 +2169,7 @@
     def getChildren(self):
         "NOT_RPYTHON"
         children = []
-        children.append(self.argnames)
+        children.extend(flatten(self.argnames))
         children.extend(flatten(self.defaults))
         children.append(self.flags)
         children.append(self.code)
@@ -1470,6 +2177,7 @@
     def getChildNodes(self):
         nodelist = []
+        nodelist.extend(self.argnames)
         nodelist.extend(self.defaults)
        nodelist.append(self.code)
         return nodelist
@@ -1480,16 +2188,45 @@
     def accept(self, visitor):
         return visitor.visitLambda(self)

+    def fget_argnames( space, self):
+        return space.newlist( [space.wrap(itm) for itm in self.argnames] )
+    def fset_argnames( space, self, w_arg):
+        del self.argnames[:]
+        for w_itm in space.unpackiterable( w_arg ):
+            self.argnames.append( space.interpclass_w( w_arg ) )
+    def fget_defaults( space, self):
+        return space.newlist( [space.wrap(itm) for itm in self.defaults] )
+    def fset_defaults( space, self, w_arg):
+        del self.defaults[:]
+        for w_itm in space.unpackiterable( w_arg ):
+            self.defaults.append( space.interpclass_w( w_arg ) )
+    def fget_flags( space, self):
+        return space.wrap(self.flags)
+    def fset_flags( space, self, w_arg):
+        self.flags = space.int_w(w_arg)
+    def fget_code( space, self):
+        return space.wrap(self.code)
+    def fset_code( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.code = obj
+
 def descr_Lambda_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitLambda'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

-Lambda.typedef = TypeDef('Lambda', Node.typedef,
-                     accept=interp2app(descr_Lambda_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+Lambda.typedef = TypeDef('Lambda', AbstractFunction.typedef,
+                     accept=interp2app(descr_Lambda_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     argnames=GetSetProperty(Lambda.fget_argnames, Lambda.fset_argnames ),
+                     defaults=GetSetProperty(Lambda.fget_defaults, Lambda.fset_defaults ),
+                     flags=GetSetProperty(Lambda.fget_flags, Lambda.fset_flags ),
+                     code=GetSetProperty(Lambda.fget_code, Lambda.fset_code ),
+                     )

 class LeftShift(BinaryOp):
-    def __init__(self, (left, right), lineno=-1):
+    def __init__(self, left, right, lineno=-1):
         Node.__init__(self, lineno)
         self.left = left
         self.right = right
@@ -1502,18 +2239,36 @@
         return [self.left, self.right]

     def __repr__(self):
-        return "LeftShift((%s, %s))" % (self.left.__repr__(), self.right.__repr__())
+        return "LeftShift(%s, %s)" % (self.left.__repr__(), self.right.__repr__())

     def accept(self, visitor):
         return visitor.visitLeftShift(self)

+    def fget_left( space, self):
+        return space.wrap(self.left)
+    def fset_left( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.left = obj
+    def fget_right( space, self):
+        return space.wrap(self.right)
+    def fset_right( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.right = obj
+
 def descr_LeftShift_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitLeftShift'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

-LeftShift.typedef = TypeDef('LeftShift', Node.typedef,
-                     accept=interp2app(descr_LeftShift_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+LeftShift.typedef = TypeDef('LeftShift', BinaryOp.typedef,
+                     accept=interp2app(descr_LeftShift_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     left=GetSetProperty(LeftShift.fget_left, LeftShift.fset_left ),
+                     right=GetSetProperty(LeftShift.fget_right, LeftShift.fset_right ),
+                     )

 class List(Node):
     def __init__(self, nodes, lineno=-1):
@@ -1535,13 +2290,22 @@
     def accept(self, visitor):
         return visitor.visitList(self)

+    def fget_nodes( space, self):
+        return space.newlist( [space.wrap(itm) for itm in self.nodes] )
+    def fset_nodes( space, self, w_arg):
+        del self.nodes[:]
+        for w_itm in space.unpackiterable( w_arg ):
+            self.nodes.append( space.interpclass_w( w_arg ) )
+
 def descr_List_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitList'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

 List.typedef = TypeDef('List', Node.typedef,
-                     accept=interp2app(descr_List_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+                     accept=interp2app(descr_List_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     nodes=GetSetProperty(List.fget_nodes, List.fset_nodes ),
+                     )

 class ListComp(Node):
     def __init__(self, expr, quals, lineno=-1):
@@ -1568,13 +2332,30 @@
     def accept(self, visitor):
         return visitor.visitListComp(self)

+    def fget_expr( space, self):
+        return space.wrap(self.expr)
+    def fset_expr( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.expr = obj
+    def fget_quals( space, self):
+        return space.newlist( [space.wrap(itm) for itm in self.quals] )
+    def fset_quals( space, self, w_arg):
+        del self.quals[:]
+        for w_itm in space.unpackiterable( w_arg ):
+            self.quals.append( space.interpclass_w( w_arg ) )
+
 def descr_ListComp_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitListComp'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

 ListComp.typedef = TypeDef('ListComp', Node.typedef,
-                     accept=interp2app(descr_ListComp_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+                     accept=interp2app(descr_ListComp_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     expr=GetSetProperty(ListComp.fget_expr, ListComp.fset_expr ),
+                     quals=GetSetProperty(ListComp.fget_quals, ListComp.fset_quals ),
+                     )

 class ListCompFor(Node):
     def __init__(self, assign, list, ifs, lineno=-1):
@@ -1604,13 +2385,38 @@
     def accept(self, visitor):
         return visitor.visitListCompFor(self)

+    def fget_assign( space, self):
+        return space.wrap(self.assign)
+    def fset_assign( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.assign = obj
+    def fget_list( space, self):
+        return space.wrap(self.list)
+    def fset_list( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.list = obj
+    def fget_ifs( space, self):
+        return space.newlist( [space.wrap(itm) for itm in self.ifs] )
+    def fset_ifs( space, self, w_arg):
+        del self.ifs[:]
+        for w_itm in space.unpackiterable( w_arg ):
+            self.ifs.append( space.interpclass_w( w_arg ) )
+
 def descr_ListCompFor_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitListCompFor'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

 ListCompFor.typedef = TypeDef('ListCompFor', Node.typedef,
-                     accept=interp2app(descr_ListCompFor_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+                     accept=interp2app(descr_ListCompFor_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     assign=GetSetProperty(ListCompFor.fget_assign, ListCompFor.fset_assign ),
+                     list=GetSetProperty(ListCompFor.fget_list, ListCompFor.fset_list ),
+                     ifs=GetSetProperty(ListCompFor.fget_ifs, ListCompFor.fset_ifs ),
+                     )

 class ListCompIf(Node):
     def __init__(self, test, lineno=-1):
@@ -1630,16 +2436,26 @@
     def accept(self, visitor):
         return visitor.visitListCompIf(self)

+    def fget_test( space, self):
+        return space.wrap(self.test)
+    def fset_test( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.test = obj
+
 def descr_ListCompIf_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitListCompIf'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

 ListCompIf.typedef = TypeDef('ListCompIf', Node.typedef,
-                     accept=interp2app(descr_ListCompIf_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+                     accept=interp2app(descr_ListCompIf_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     test=GetSetProperty(ListCompIf.fget_test, ListCompIf.fset_test ),
+                     )

 class Mod(BinaryOp):
-    def __init__(self, (left, right), lineno=-1):
+    def __init__(self, left, right, lineno=-1):
         Node.__init__(self, lineno)
         self.left = left
         self.right = right
@@ -1652,48 +2468,81 @@
         return [self.left, self.right]

     def __repr__(self):
-        return "Mod((%s, %s))" % (self.left.__repr__(), self.right.__repr__())
+        return "Mod(%s, %s)" % (self.left.__repr__(), self.right.__repr__())

     def accept(self, visitor):
         return visitor.visitMod(self)

+    def fget_left( space, self):
+        return space.wrap(self.left)
+    def fset_left( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.left = obj
+    def fget_right( space, self):
+        return space.wrap(self.right)
+    def fset_right( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.right = obj
+
 def descr_Mod_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitMod'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

-Mod.typedef = TypeDef('Mod', Node.typedef,
-                     accept=interp2app(descr_Mod_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+Mod.typedef = TypeDef('Mod', BinaryOp.typedef,
+                     accept=interp2app(descr_Mod_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     left=GetSetProperty(Mod.fget_left, Mod.fset_left ),
+                     right=GetSetProperty(Mod.fget_right, Mod.fset_right ),
+                     )

 class Module(Node):
-    def __init__(self, doc, node, lineno=-1):
+    def __init__(self, w_doc, node, lineno=-1):
         Node.__init__(self, lineno)
-        self.doc = doc
+        self.w_doc = w_doc
         self.node = node

     def getChildren(self):
         "NOT_RPYTHON"
-        return self.doc, self.node
+        return self.w_doc, self.node

     def getChildNodes(self):
         return [self.node,]

     def __repr__(self):
-        return "Module(%s, %s)" % (self.doc.__repr__(), self.node.__repr__())
+        return "Module(%s, %s)" % (self.w_doc.__repr__(), self.node.__repr__())

     def accept(self, visitor):
         return visitor.visitModule(self)

+    def fget_w_doc( space, self):
+        return self.w_doc
+    def fset_w_doc( space, self, w_arg):
+        self.w_doc = w_arg
+    def fget_node( space, self):
+        return space.wrap(self.node)
+    def fset_node( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.node = obj
+
 def descr_Module_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitModule'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

 Module.typedef = TypeDef('Module', Node.typedef,
-                     accept=interp2app(descr_Module_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+                     accept=interp2app(descr_Module_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     w_doc=GetSetProperty(Module.fget_w_doc, Module.fset_w_doc ),
+                     node=GetSetProperty(Module.fget_node, Module.fset_node ),
+                     )

 class Mul(BinaryOp):
-    def __init__(self, (left, right), lineno=-1):
+    def __init__(self, left, right, lineno=-1):
         Node.__init__(self, lineno)
         self.left = left
         self.right = right
@@ -1706,18 +2555,36 @@
         return [self.left, self.right]

     def __repr__(self):
-        return "Mul((%s, %s))" % (self.left.__repr__(), self.right.__repr__())
+        return "Mul(%s, %s)" % (self.left.__repr__(), self.right.__repr__())

     def accept(self, visitor):
         return visitor.visitMul(self)

+    def fget_left( space, self):
+        return space.wrap(self.left)
+    def fset_left( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.left = obj
+    def fget_right( space, self):
+        return space.wrap(self.right)
+    def fset_right( space, self, w_arg):
+        obj = space.interpclass_w( w_arg )
+        if not isinstance( obj, Node):
+            raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))
+        self.right = obj
+
 def descr_Mul_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitMul'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

-Mul.typedef = TypeDef('Mul', Node.typedef,
-                     accept=interp2app(descr_Mul_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+Mul.typedef = TypeDef('Mul', BinaryOp.typedef,
+                     accept=interp2app(descr_Mul_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     left=GetSetProperty(Mul.fget_left, Mul.fset_left ),
+                     right=GetSetProperty(Mul.fget_right, Mul.fset_right ),
+                     )

 class Name(Node):
     def __init__(self, varname, lineno=-1):
@@ -1737,13 +2604,20 @@
     def accept(self, visitor):
         return visitor.visitName(self)

+    def fget_varname( space, self):
+        return space.wrap(self.varname)
+    def fset_varname( space, self, w_arg):
+        self.varname = space.str_w(w_arg)
+
 def descr_Name_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitName'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

 Name.typedef = TypeDef('Name', Node.typedef,
-                     accept=interp2app(descr_Name_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+                     accept=interp2app(descr_Name_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+                     varname=GetSetProperty(Name.fget_varname, Name.fset_varname ),
+                     )

 class NoneConst(Node):
     def __init__(self, lineno=-1):
@@ -1762,13 +2636,15 @@
     def accept(self, visitor):
         return visitor.visitNoneConst(self)

+
 def descr_NoneConst_accept( space, w_self, w_visitor):
     w_callable = space.getattr(w_visitor, space.wrap('visitNoneConst'))
     args = Arguments(space, [ w_self ])
     return space.call_args(w_callable, args)

 NoneConst.typedef = TypeDef('NoneConst', Node.typedef,
-                     accept=interp2app(descr_NoneConst_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))
+                     accept=interp2app(descr_NoneConst_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ),
+ ) class Not(UnaryOp): def __init__(self, expr, lineno=-1): @@ -1788,39 +2664,23 @@ def accept(self, visitor): return visitor.visitNot(self) + def fget_expr( space, self): + return space.wrap(self.expr) + def fset_expr( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.expr = obj + def descr_Not_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitNot')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) -Not.typedef = TypeDef('Not', Node.typedef, - accept=interp2app(descr_Not_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) - -class NumberConst(Node): - def __init__(self, number_value, lineno=-1): - Node.__init__(self, lineno) - self.number_value = number_value - - def getChildren(self): - "NOT_RPYTHON" - return self.number_value, - - def getChildNodes(self): - return [] - - def __repr__(self): - return "NumberConst(%s)" % (self.number_value.__repr__(),) - - def accept(self, visitor): - return visitor.visitNumberConst(self) - -def descr_NumberConst_accept( space, w_self, w_visitor): - w_callable = space.getattr(w_visitor, space.wrap('visitNumberConst')) - args = Arguments(space, [ w_self ]) - return space.call_args(w_callable, args) - -NumberConst.typedef = TypeDef('NumberConst', Node.typedef, - accept=interp2app(descr_NumberConst_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) +Not.typedef = TypeDef('Not', UnaryOp.typedef, + accept=interp2app(descr_Not_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + expr=GetSetProperty(Not.fget_expr, Not.fset_expr ), + ) class Or(AbstractTest): def __init__(self, nodes, lineno=-1): @@ -1842,13 +2702,22 @@ def accept(self, visitor): return visitor.visitOr(self) + def fget_nodes( space, self): + return space.newlist( [space.wrap(itm) for itm in self.nodes] ) + def fset_nodes( space, self, w_arg): + del self.nodes[:] + for w_itm in 
space.unpackiterable( w_arg ): + self.nodes.append( space.interpclass_w( w_arg ) ) + def descr_Or_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitOr')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) -Or.typedef = TypeDef('Or', Node.typedef, - accept=interp2app(descr_Or_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) +Or.typedef = TypeDef('Or', AbstractTest.typedef, + accept=interp2app(descr_Or_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + nodes=GetSetProperty(Or.fget_nodes, Or.fset_nodes ), + ) class Pass(Node): def __init__(self, lineno=-1): @@ -1867,16 +2736,18 @@ def accept(self, visitor): return visitor.visitPass(self) + def descr_Pass_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitPass')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Pass.typedef = TypeDef('Pass', Node.typedef, - accept=interp2app(descr_Pass_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_Pass_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + ) class Power(BinaryOp): - def __init__(self, (left, right), lineno=-1): + def __init__(self, left, right, lineno=-1): Node.__init__(self, lineno) self.left = left self.right = right @@ -1889,18 +2760,36 @@ return [self.left, self.right] def __repr__(self): - return "Power((%s, %s))" % (self.left.__repr__(), self.right.__repr__()) + return "Power(%s, %s)" % (self.left.__repr__(), self.right.__repr__()) def accept(self, visitor): return visitor.visitPower(self) + def fget_left( space, self): + return space.wrap(self.left) + def fset_left( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.left = obj + def fget_right( space, self): + return space.wrap(self.right) + def fset_right( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not 
isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.right = obj + def descr_Power_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitPower')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) -Power.typedef = TypeDef('Power', Node.typedef, - accept=interp2app(descr_Power_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) +Power.typedef = TypeDef('Power', BinaryOp.typedef, + accept=interp2app(descr_Power_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + left=GetSetProperty(Power.fget_left, Power.fset_left ), + right=GetSetProperty(Power.fget_right, Power.fset_right ), + ) class Print(Node): def __init__(self, nodes, dest, lineno=-1): @@ -1928,13 +2817,36 @@ def accept(self, visitor): return visitor.visitPrint(self) + def fget_nodes( space, self): + return space.newlist( [space.wrap(itm) for itm in self.nodes] ) + def fset_nodes( space, self, w_arg): + del self.nodes[:] + for w_itm in space.unpackiterable( w_arg ): + self.nodes.append( space.interpclass_w( w_arg ) ) + def fget_dest( space, self): + if self.dest is None: + return space.w_None + else: + return space.wrap(self.dest) + def fset_dest( space, self, w_arg): + if space.is_w( w_arg, space.w_None ): + self.dest = None + else: + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.dest = obj + def descr_Print_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitPrint')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Print.typedef = TypeDef('Print', Node.typedef, - accept=interp2app(descr_Print_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_Print_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + nodes=GetSetProperty(Print.fget_nodes, Print.fset_nodes ), + 
dest=GetSetProperty(Print.fget_dest, Print.fset_dest ), + ) class Printnl(Node): def __init__(self, nodes, dest, lineno=-1): @@ -1962,13 +2874,36 @@ def accept(self, visitor): return visitor.visitPrintnl(self) + def fget_nodes( space, self): + return space.newlist( [space.wrap(itm) for itm in self.nodes] ) + def fset_nodes( space, self, w_arg): + del self.nodes[:] + for w_itm in space.unpackiterable( w_arg ): + self.nodes.append( space.interpclass_w( w_arg ) ) + def fget_dest( space, self): + if self.dest is None: + return space.w_None + else: + return space.wrap(self.dest) + def fset_dest( space, self, w_arg): + if space.is_w( w_arg, space.w_None ): + self.dest = None + else: + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.dest = obj + def descr_Printnl_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitPrintnl')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Printnl.typedef = TypeDef('Printnl', Node.typedef, - accept=interp2app(descr_Printnl_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_Printnl_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + nodes=GetSetProperty(Printnl.fget_nodes, Printnl.fset_nodes ), + dest=GetSetProperty(Printnl.fget_dest, Printnl.fset_dest ), + ) class Raise(Node): def __init__(self, expr1, expr2, expr3, lineno=-1): @@ -2001,13 +2936,57 @@ def accept(self, visitor): return visitor.visitRaise(self) + def fget_expr1( space, self): + if self.expr1 is None: + return space.w_None + else: + return space.wrap(self.expr1) + def fset_expr1( space, self, w_arg): + if space.is_w( w_arg, space.w_None ): + self.expr1 = None + else: + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.expr1 = obj + def fget_expr2( space, self): + if self.expr2 is 
None: + return space.w_None + else: + return space.wrap(self.expr2) + def fset_expr2( space, self, w_arg): + if space.is_w( w_arg, space.w_None ): + self.expr2 = None + else: + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.expr2 = obj + def fget_expr3( space, self): + if self.expr3 is None: + return space.w_None + else: + return space.wrap(self.expr3) + def fset_expr3( space, self, w_arg): + if space.is_w( w_arg, space.w_None ): + self.expr3 = None + else: + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.expr3 = obj + def descr_Raise_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitRaise')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Raise.typedef = TypeDef('Raise', Node.typedef, - accept=interp2app(descr_Raise_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_Raise_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + expr1=GetSetProperty(Raise.fget_expr1, Raise.fset_expr1 ), + expr2=GetSetProperty(Raise.fget_expr2, Raise.fset_expr2 ), + expr3=GetSetProperty(Raise.fget_expr3, Raise.fset_expr3 ), + ) class Return(Node): def __init__(self, value, lineno=-1): @@ -2030,16 +3009,32 @@ def accept(self, visitor): return visitor.visitReturn(self) + def fget_value( space, self): + if self.value is None: + return space.w_None + else: + return space.wrap(self.value) + def fset_value( space, self, w_arg): + if space.is_w( w_arg, space.w_None ): + self.value = None + else: + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.value = obj + def descr_Return_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitReturn')) args = 
Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Return.typedef = TypeDef('Return', Node.typedef, - accept=interp2app(descr_Return_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_Return_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + value=GetSetProperty(Return.fget_value, Return.fset_value ), + ) class RightShift(BinaryOp): - def __init__(self, (left, right), lineno=-1): + def __init__(self, left, right, lineno=-1): Node.__init__(self, lineno) self.left = left self.right = right @@ -2052,18 +3047,36 @@ return [self.left, self.right] def __repr__(self): - return "RightShift((%s, %s))" % (self.left.__repr__(), self.right.__repr__()) + return "RightShift(%s, %s)" % (self.left.__repr__(), self.right.__repr__()) def accept(self, visitor): return visitor.visitRightShift(self) + def fget_left( space, self): + return space.wrap(self.left) + def fset_left( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.left = obj + def fget_right( space, self): + return space.wrap(self.right) + def fset_right( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.right = obj + def descr_RightShift_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitRightShift')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) -RightShift.typedef = TypeDef('RightShift', Node.typedef, - accept=interp2app(descr_RightShift_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) +RightShift.typedef = TypeDef('RightShift', BinaryOp.typedef, + accept=interp2app(descr_RightShift_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + left=GetSetProperty(RightShift.fget_left, RightShift.fset_left ), + right=GetSetProperty(RightShift.fget_right, 
RightShift.fset_right ), + ) class Slice(Node): def __init__(self, expr, flags, lower, upper, lineno=-1): @@ -2097,13 +3110,56 @@ def accept(self, visitor): return visitor.visitSlice(self) + def fget_expr( space, self): + return space.wrap(self.expr) + def fset_expr( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.expr = obj + def fget_flags( space, self): + return space.wrap(self.flags) + def fset_flags( space, self, w_arg): + self.flags = space.int_w(w_arg) + def fget_lower( space, self): + if self.lower is None: + return space.w_None + else: + return space.wrap(self.lower) + def fset_lower( space, self, w_arg): + if space.is_w( w_arg, space.w_None ): + self.lower = None + else: + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.lower = obj + def fget_upper( space, self): + if self.upper is None: + return space.w_None + else: + return space.wrap(self.upper) + def fset_upper( space, self, w_arg): + if space.is_w( w_arg, space.w_None ): + self.upper = None + else: + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.upper = obj + def descr_Slice_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitSlice')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Slice.typedef = TypeDef('Slice', Node.typedef, - accept=interp2app(descr_Slice_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_Slice_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + expr=GetSetProperty(Slice.fget_expr, Slice.fset_expr ), + flags=GetSetProperty(Slice.fget_flags, Slice.fset_flags ), + lower=GetSetProperty(Slice.fget_lower, Slice.fset_lower ), + 
upper=GetSetProperty(Slice.fget_upper, Slice.fset_upper ), + ) class Sliceobj(Node): def __init__(self, nodes, lineno=-1): @@ -2125,13 +3181,22 @@ def accept(self, visitor): return visitor.visitSliceobj(self) + def fget_nodes( space, self): + return space.newlist( [space.wrap(itm) for itm in self.nodes] ) + def fset_nodes( space, self, w_arg): + del self.nodes[:] + for w_itm in space.unpackiterable( w_arg ): + self.nodes.append( space.interpclass_w( w_arg ) ) + def descr_Sliceobj_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitSliceobj')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Sliceobj.typedef = TypeDef('Sliceobj', Node.typedef, - accept=interp2app(descr_Sliceobj_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_Sliceobj_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + nodes=GetSetProperty(Sliceobj.fget_nodes, Sliceobj.fset_nodes ), + ) class Stmt(Node): def __init__(self, nodes, lineno=-1): @@ -2153,42 +3218,25 @@ def accept(self, visitor): return visitor.visitStmt(self) + def fget_nodes( space, self): + return space.newlist( [space.wrap(itm) for itm in self.nodes] ) + def fset_nodes( space, self, w_arg): + del self.nodes[:] + for w_itm in space.unpackiterable( w_arg ): + self.nodes.append( space.interpclass_w( w_arg ) ) + def descr_Stmt_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitStmt')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Stmt.typedef = TypeDef('Stmt', Node.typedef, - accept=interp2app(descr_Stmt_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) - -class StringConst(Node): - def __init__(self, string_value, lineno=-1): - Node.__init__(self, lineno) - self.string_value = string_value - - def getChildren(self): - "NOT_RPYTHON" - return self.string_value, - - def getChildNodes(self): - return [] - - def __repr__(self): - return "StringConst(%s)" % 
(self.string_value.__repr__(),) - - def accept(self, visitor): - return visitor.visitStringConst(self) - -def descr_StringConst_accept( space, w_self, w_visitor): - w_callable = space.getattr(w_visitor, space.wrap('visitStringConst')) - args = Arguments(space, [ w_self ]) - return space.call_args(w_callable, args) - -StringConst.typedef = TypeDef('StringConst', Node.typedef, - accept=interp2app(descr_StringConst_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_Stmt_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + nodes=GetSetProperty(Stmt.fget_nodes, Stmt.fset_nodes ), + ) class Sub(BinaryOp): - def __init__(self, (left, right), lineno=-1): + def __init__(self, left, right, lineno=-1): Node.__init__(self, lineno) self.left = left self.right = right @@ -2201,18 +3249,36 @@ return [self.left, self.right] def __repr__(self): - return "Sub((%s, %s))" % (self.left.__repr__(), self.right.__repr__()) + return "Sub(%s, %s)" % (self.left.__repr__(), self.right.__repr__()) def accept(self, visitor): return visitor.visitSub(self) + def fget_left( space, self): + return space.wrap(self.left) + def fset_left( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.left = obj + def fget_right( space, self): + return space.wrap(self.right) + def fset_right( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.right = obj + def descr_Sub_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitSub')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) -Sub.typedef = TypeDef('Sub', Node.typedef, - accept=interp2app(descr_Sub_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) +Sub.typedef = TypeDef('Sub', BinaryOp.typedef, + 
accept=interp2app(descr_Sub_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + left=GetSetProperty(Sub.fget_left, Sub.fset_left ), + right=GetSetProperty(Sub.fget_right, Sub.fset_right ), + ) class Subscript(Node): def __init__(self, expr, flags, subs, lineno=-1): @@ -2241,13 +3307,35 @@ def accept(self, visitor): return visitor.visitSubscript(self) + def fget_expr( space, self): + return space.wrap(self.expr) + def fset_expr( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.expr = obj + def fget_flags( space, self): + return space.wrap(self.flags) + def fset_flags( space, self, w_arg): + self.flags = space.int_w(w_arg) + def fget_subs( space, self): + return space.newlist( [space.wrap(itm) for itm in self.subs] ) + def fset_subs( space, self, w_arg): + del self.subs[:] + for w_itm in space.unpackiterable( w_arg ): + self.subs.append( space.interpclass_w( w_arg ) ) + def descr_Subscript_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitSubscript')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Subscript.typedef = TypeDef('Subscript', Node.typedef, - accept=interp2app(descr_Subscript_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_Subscript_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + expr=GetSetProperty(Subscript.fget_expr, Subscript.fset_expr ), + flags=GetSetProperty(Subscript.fget_flags, Subscript.fset_flags ), + subs=GetSetProperty(Subscript.fget_subs, Subscript.fset_subs ), + ) class TryExcept(Node): def __init__(self, body, handlers, else_, lineno=-1): @@ -2280,19 +3368,65 @@ nodelist.append(self.else_) return nodelist + def fget_handlers( space, self ): + return space.newlist( [ space.newtuple( [ space.wrap(expr1), + space.wrap(expr2), + space.wrap(body) ] ) + for expr1, expr2, body in self.handlers ] ) + + + def 
fset_handlers( space, self, w_arg ): + del self.handlers[:] + for w_tup in space.unpackiterable( w_arg ): + w_expr1 = space.getitem( w_tup, space.wrap(0) ) + w_expr2 = space.getitem( w_tup, space.wrap(1) ) + w_body = space.getitem( w_tup, space.wrap(2) ) + expr1 = space.interpclass_w( w_expr1 ) + expr2 = space.interpclass_w( w_expr2 ) + body = space.interpclass_w( w_body ) + if not isinstance( expr1, Node ) or not isinstance( expr2, Node ) or not isinstance( body, Node ): + raise OperationError(space.w_TypeError, space.wrap("Need a list of (expr1,expr2,body) nodes") ) + self.handlers.append( (expr1,expr2,body) ) + + def __repr__(self): return "TryExcept(%s, %s, %s)" % (self.body.__repr__(), self.handlers.__repr__(), self.else_.__repr__()) def accept(self, visitor): return visitor.visitTryExcept(self) + def fget_body( space, self): + return space.wrap(self.body) + def fset_body( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.body = obj + def fget_else_( space, self): + if self.else_ is None: + return space.w_None + else: + return space.wrap(self.else_) + def fset_else_( space, self, w_arg): + if space.is_w( w_arg, space.w_None ): + self.else_ = None + else: + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.else_ = obj + def descr_TryExcept_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitTryExcept')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) TryExcept.typedef = TypeDef('TryExcept', Node.typedef, - accept=interp2app(descr_TryExcept_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_TryExcept_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + body=GetSetProperty(TryExcept.fget_body, TryExcept.fset_body ), + 
handlers=GetSetProperty(TryExcept.fget_handlers, TryExcept.fset_handlers ), + else_=GetSetProperty(TryExcept.fget_else_, TryExcept.fset_else_ ), + ) class TryFinally(Node): def __init__(self, body, final, lineno=-1): @@ -2313,13 +3447,31 @@ def accept(self, visitor): return visitor.visitTryFinally(self) + def fget_body( space, self): + return space.wrap(self.body) + def fset_body( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.body = obj + def fget_final( space, self): + return space.wrap(self.final) + def fset_final( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.final = obj + def descr_TryFinally_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitTryFinally')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) TryFinally.typedef = TypeDef('TryFinally', Node.typedef, - accept=interp2app(descr_TryFinally_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_TryFinally_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + body=GetSetProperty(TryFinally.fget_body, TryFinally.fset_body ), + final=GetSetProperty(TryFinally.fget_final, TryFinally.fset_final ), + ) class Tuple(Node): def __init__(self, nodes, lineno=-1): @@ -2341,13 +3493,22 @@ def accept(self, visitor): return visitor.visitTuple(self) + def fget_nodes( space, self): + return space.newlist( [space.wrap(itm) for itm in self.nodes] ) + def fset_nodes( space, self, w_arg): + del self.nodes[:] + for w_itm in space.unpackiterable( w_arg ): + self.nodes.append( space.interpclass_w( w_arg ) ) + def descr_Tuple_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitTuple')) args = Arguments(space, [ w_self ]) return 
space.call_args(w_callable, args) Tuple.typedef = TypeDef('Tuple', Node.typedef, - accept=interp2app(descr_Tuple_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_Tuple_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + nodes=GetSetProperty(Tuple.fget_nodes, Tuple.fset_nodes ), + ) class UnaryAdd(UnaryOp): def __init__(self, expr, lineno=-1): @@ -2367,13 +3528,23 @@ def accept(self, visitor): return visitor.visitUnaryAdd(self) + def fget_expr( space, self): + return space.wrap(self.expr) + def fset_expr( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.expr = obj + def descr_UnaryAdd_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitUnaryAdd')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) -UnaryAdd.typedef = TypeDef('UnaryAdd', Node.typedef, - accept=interp2app(descr_UnaryAdd_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) +UnaryAdd.typedef = TypeDef('UnaryAdd', UnaryOp.typedef, + accept=interp2app(descr_UnaryAdd_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + expr=GetSetProperty(UnaryAdd.fget_expr, UnaryAdd.fset_expr ), + ) class UnarySub(UnaryOp): def __init__(self, expr, lineno=-1): @@ -2393,13 +3564,23 @@ def accept(self, visitor): return visitor.visitUnarySub(self) + def fget_expr( space, self): + return space.wrap(self.expr) + def fset_expr( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.expr = obj + def descr_UnarySub_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitUnarySub')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) -UnarySub.typedef = TypeDef('UnarySub', Node.typedef, - accept=interp2app(descr_UnarySub_accept, 
unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) +UnarySub.typedef = TypeDef('UnarySub', UnaryOp.typedef, + accept=interp2app(descr_UnarySub_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + expr=GetSetProperty(UnarySub.fget_expr, UnarySub.fset_expr ), + ) class While(Node): def __init__(self, test, body, else_, lineno=-1): @@ -2430,13 +3611,45 @@ def accept(self, visitor): return visitor.visitWhile(self) + def fget_test( space, self): + return space.wrap(self.test) + def fset_test( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.test = obj + def fget_body( space, self): + return space.wrap(self.body) + def fset_body( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.body = obj + def fget_else_( space, self): + if self.else_ is None: + return space.w_None + else: + return space.wrap(self.else_) + def fset_else_( space, self, w_arg): + if space.is_w( w_arg, space.w_None ): + self.else_ = None + else: + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.else_ = obj + def descr_While_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitWhile')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) While.typedef = TypeDef('While', Node.typedef, - accept=interp2app(descr_While_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_While_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + test=GetSetProperty(While.fget_test, While.fset_test ), + body=GetSetProperty(While.fget_body, While.fset_body ), + else_=GetSetProperty(While.fget_else_, While.fset_else_ ), + ) class Yield(Node): def __init__(self, value, lineno=-1): @@ -2456,13 +3669,23 
@@ def accept(self, visitor): return visitor.visitYield(self) + def fget_value( space, self): + return space.wrap(self.value) + def fset_value( space, self, w_arg): + obj = space.interpclass_w( w_arg ) + if not isinstance( obj, Node): + raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) + self.value = obj + def descr_Yield_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitYield')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Yield.typedef = TypeDef('Yield', Node.typedef, - accept=interp2app(descr_Yield_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )) + accept=interp2app(descr_Yield_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + value=GetSetProperty(Yield.fget_value, Yield.fset_value ), + ) class ASTVisitor(object): @@ -2594,8 +3817,6 @@ return self.default( node ) def visitNot(self, node): return self.default( node ) - def visitNumberConst(self, node): - return self.default( node ) def visitOr(self, node): return self.default( node ) def visitPass(self, node): @@ -2618,8 +3839,6 @@ return self.default( node ) def visitStmt(self, node): return self.default( node ) - def visitStringConst(self, node): - return self.default( node ) def visitSub(self, node): return self.default( node ) def visitSubscript(self, node): Modified: pypy/dist/pypy/interpreter/astcompiler/ast.txt ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.txt (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.txt Thu Dec 8 18:25:31 2005 @@ -2,17 +2,27 @@ # generated by astgen.py. # The descriptions use the following special notation to describe # properties of the children: + # * this child is not a node + +# *int this child is not a node +# *str this child is not a node +# *[int] this child is not a node +# *[str] this child is not a node + +# % this child is a wrapped object # ! 
this child is a sequence that contains nodes in it # & this child may be set to None +# (type) where type is int, str +# [type] # = ... a default value for the node constructor (optional args) -Module: doc*, node +Module: w_doc%, node Stmt: nodes! Decorators: nodes! AbstractFunction: -Function(AbstractFunction): decorators&, name*, argnames*, defaults!, flags*, doc*, code -Lambda(AbstractFunction): argnames*, defaults!, flags*, code -Class: name*, bases!, doc*, code +Function(AbstractFunction): decorators&, name*str, argnames!, defaults!, flags*int, w_doc%, code +Lambda(AbstractFunction): argnames!, defaults!, flags*int, code +Class: name*str, bases!, w_doc%, code Pass: Break: Continue: @@ -20,27 +30,27 @@ While: test, body, else_& If: tests!, else_& Exec: expr, locals&, globals& -From: modname*, names* +From: modname*str, names* Import: names* Raise: expr1&, expr2&, expr3& TryFinally: body, final TryExcept: body, handlers!, else_& Return: value& Yield: value -Const: value* +Const: value% NoneConst: -StringConst: string_value* -NumberConst: number_value* +#StringConst: string_value* +#NumberConst: number_value* Print: nodes!, dest& Printnl: nodes!, dest& Discard: expr -AugAssign: node, op*, expr +AugAssign: node, op*str, expr Assign: nodes!, expr AssSeq: AssTuple(AssSeq): nodes! AssList(AssSeq): nodes! -AssName: name*, flags* -AssAttr: expr, attrname*, flags* +AssName: name*str, flags*int +AssAttr: expr, attrname*str, flags*int ListComp: expr, quals! ListCompFor: assign, list, ifs! ListCompIf: test @@ -53,16 +63,16 @@ UnaryOp: Not(UnaryOp): expr Compare: expr, ops! -Name: varname* -Global: names* +Name: varname*str +Global: names*[str] Backquote(UnaryOp): expr -Getattr: expr, attrname* +Getattr: expr, attrname*str CallFunc: node, args!, star_args& = None, dstar_args& = None -Keyword: name*, expr -Subscript: expr, flags*, subs! +Keyword: name*str, expr +Subscript: expr, flags*int, subs! Ellipsis: Sliceobj: nodes! 
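[Editorial note] The suffix notation introduced in the ast.txt comments above (`*int`, `*str`, `*[int]`, `*[str]`, `%`, `!`, `&`) drives the code generator's property-kind decisions shown later in astgen.py. A minimal illustrative sketch of that classification, with made-up helper names rather than PyPy's actual astgen.py code:

```python
# Illustrative sketch only -- not the real astgen.py. Classify each child
# declared in an ast.txt entry by its suffix, mirroring the notation
# documented in the comments above.
P_NODE, P_OTHER, P_STR, P_INT, P_STR_LIST, P_INT_LIST, P_WRAPPED, P_NESTED, P_NONE = range(9)

SUFFIXES = [           # longest suffixes first, so '*[str]' wins over '*'
    ('*[int]', P_INT_LIST),
    ('*[str]', P_STR_LIST),
    ('*int',   P_INT),
    ('*str',   P_STR),
    ('*',      P_OTHER),
    ('%',      P_WRAPPED),
    ('!',      P_NESTED),
    ('&',      P_NONE),
]

def classify(arg):
    """Return (bare_name, property_kind) for one spec argument."""
    for suffix, kind in SUFFIXES:
        if arg.endswith(suffix):
            return arg[:-len(suffix)], kind
    return arg, P_NODE            # no suffix: a plain child node

def parse_spec_line(line):
    """Split 'Name: arg, arg, ...' into the name and classified children."""
    name, _, args = line.partition(':')
    return name.strip(), [classify(a.strip())
                          for a in args.split(',') if a.strip()]

name, props = parse_spec_line('While: test, body, else_&')
# props pairs each child name with its property kind
```

The ordering of the suffix table matters: checking `*` before `*str` would misclassify `name*str` as a generic non-node child.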
-Slice: expr, flags*, lower&, upper& +Slice: expr, flags*int, lower&, upper& Assert: test, fail& Tuple: nodes! AbstractTest: @@ -73,19 +83,21 @@ Bitxor(BitOp): nodes! Bitand(BitOp): nodes! BinaryOp: -LeftShift(BinaryOp): (left, right) -RightShift(BinaryOp): (left, right) -Add(BinaryOp): (left, right) -Sub(BinaryOp): (left, right) -Mul(BinaryOp): (left, right) -Div(BinaryOp): (left, right) -Mod(BinaryOp): (left, right) -Power(BinaryOp): (left, right) -FloorDiv(BinaryOp): (left, right) +LeftShift(BinaryOp): left, right +RightShift(BinaryOp): left, right +Add(BinaryOp): left, right +Sub(BinaryOp): left, right +Mul(BinaryOp): left, right +Div(BinaryOp): left, right +Mod(BinaryOp): left, right +Power(BinaryOp): left, right +FloorDiv(BinaryOp): left, right UnaryAdd(UnaryOp): expr UnarySub(UnaryOp): expr Invert(UnaryOp): expr +== OVERRIDES == + init(Function): self.varargs = self.kwargs = 0 if flags & CO_VARARGS: @@ -146,3 +158,120 @@ else: assert False, "should only have AssName and AssTuple as children" return argnames + +Compare.fget_ops( space, self ): + lst = [] + for op_name, node in self.ops: + lst.append( space.newtuple( [ space.wrap(op_name), space.wrap(node) ] ) ) + return space.newlist( lst ) + +Compare.fset_ops( space, self, w_arg ): + del self.ops[:] + for w_obj in space.unpackiterable( w_arg ): + w_opname = space.getitem( w_obj, space.wrap(0) ) + w_node = space.getitem( w_obj, space.wrap(1) ) + ops = space.str_w(w_opname) + node = space.interpclass_w( w_node ) + if not isinstance(node, Node): + raise OperationError(space.w_TypeError, space.wrap("ops must be a list of (name,node)")) + self.ops.append( (ops,node) ) + +Dict.fget_items( space, self ): + return space.newlist( [ space.newtuple( [ space.wrap(key), space.wrap(value) ] ) + for key, value in self.items ] ) + +Dict.fset_items( space, self, w_arg ): + del self.items[:] + for w_tup in space.unpackiterable( w_arg ): + w_key = space.getitem( w_tup, space.wrap(0) ) + w_value = space.getitem( w_tup, 
space.wrap(1) ) + key = space.interpclass_w( w_key ) + value = space.interpclass_w( w_value ) + if not isinstance( key, Node ) or not isinstance( value, Node ): + raise OperationError(space.w_TypeError, space.wrap("Need a list of (key node, value node)")) + self.items.append( (key,value) ) + +flatten_nodes(TryExcept.handlers): + # handlers is a list of triplets (expr1, expr2, body) + for expr1, expr2, body in self.handlers: + if expr1 is not None: + nodelist.append(expr1) + if expr2 is not None: + nodelist.append(expr2) + if body is not None: + nodelist.append(body) + +flatten_nodes(If.tests): + # tests is a list of couples (node (test), node (suite)) + for test, suite in self.tests: + nodelist.append(test) + nodelist.append(suite) + + +If.fget_tests( space, self ): + return space.newlist( [ space.newtuple( [ space.wrap(test), + space.wrap(suite) ] ) + for test, suite in self.tests ] ) + +If.fset_tests( space, self, w_arg ): + del self.tests[:] + for w_tup in space.unpackiterable( w_arg ): + w_test = space.getitem( w_tup, space.wrap(0) ) + w_suite = space.getitem( w_tup, space.wrap(1) ) + test = space.interpclass_w( w_test ) + suite = space.interpclass_w( w_suite ) + if not isinstance( test, Node ) or not isinstance( suite, Node ): + raise OperationError(space.w_TypeError, space.wrap("Need a list of (test,suite) nodes") ) + self.tests.append( (test,suite) ) + + + +TryExcept.fget_handlers( space, self ): + return space.newlist( [ space.newtuple( [ space.wrap(expr1), + space.wrap(expr2), + space.wrap(body) ] ) + for expr1, expr2, body in self.handlers ] ) + +TryExcept.fset_handlers( space, self, w_arg ): + del self.handlers[:] + for w_tup in space.unpackiterable( w_arg ): + w_expr1 = space.getitem( w_tup, space.wrap(0) ) + w_expr2 = space.getitem( w_tup, space.wrap(1) ) + w_body = space.getitem( w_tup, space.wrap(2) ) + expr1 = space.interpclass_w( w_expr1 ) + expr2 = space.interpclass_w( w_expr2 ) + body = space.interpclass_w( w_body ) + if not isinstance( expr1, 
Node ) or not isinstance( expr2, Node ) or not isinstance( body, Node ): + raise OperationError(space.w_TypeError, space.wrap("Need a list of (expr1,expr2,body) nodes") ) + self.handlers.append( (expr1,expr2,body) ) + +Import.fget_names( space, self ): + return space.newlist( [ space.newtuple( [ space.wrap(name), space.wrap(as_name) ] ) + for name, as_name in self.names ] ) + +Import.fset_names( space, self, w_arg ): + del self.names[:] + for w_tup in space.unpackiterable( w_arg ): + w_name = space.getitem( w_tup, space.wrap(0) ) + w_as_name = space.getitem( w_tup, space.wrap(1) ) + name = space.str_w( w_name ) + as_name = None + if not space.is_w( w_as_name, space.w_None ): + as_name = space.str_w( w_as_name ) + self.names.append( (name, as_name) ) + +From.fget_names( space, self ): + return space.newlist( [ space.newtuple( [ space.wrap(name), space.wrap(as_name) ] ) + for name, as_name in self.names ] ) + +From.fset_names( space, self, w_arg ): + del self.names[:] + for w_tup in space.unpackiterable( w_arg ): + w_name = space.getitem( w_tup, space.wrap(0) ) + w_as_name = space.getitem( w_tup, space.wrap(1) ) + name = space.str_w( w_name ) + as_name = None + if not space.is_w( w_as_name, space.w_None ): + as_name = space.str_w( w_as_name ) + self.names.append( (name, as_name) ) + Modified: pypy/dist/pypy/interpreter/astcompiler/astgen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/astgen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/astgen.py Thu Dec 8 18:25:31 2005 @@ -6,7 +6,7 @@ Instead, it is rather complex to generate the appropriate code. And the Node interface has changed more often than the grammar. 
""" -# This is a slightly modified version from the original that adds a +# This is a heavily modified version from the original that adds a # visit method to each node import fileinput @@ -28,8 +28,13 @@ P_NODE = 1 P_OTHER = 2 -P_NESTED = 3 -P_NONE = 4 +P_STR = 3 +P_INT = 4 +P_STR_LIST = 5 +P_INT_LIST = 6 +P_WRAPPED = 7 +P_NESTED = 8 +P_NONE = 9 class NodeInfo: """Each instance describes a specific AST node""" @@ -51,12 +56,7 @@ self.parent = Node_NodeInfo def get_argnames(self): - if '(' in self.args: - i = self.args.find('(') - j = self.args.rfind(')') - args = self.args[i+1:j] - else: - args = self.args + args = self.args return [strip_default(arg.strip()) for arg in args.split(',') if arg] @@ -73,6 +73,26 @@ arg = self.argnames[i] = arg[:-1] d[arg] = P_OTHER hardest_arg = max(hardest_arg, P_OTHER) + elif arg.endswith('*int'): + arg = self.argnames[i] = arg[:-4] + d[arg] = P_INT + hardest_arg = max(hardest_arg, P_INT) + elif arg.endswith('*str'): + arg = self.argnames[i] = arg[:-4] + d[arg] = P_STR + hardest_arg = max(hardest_arg, P_STR) + elif arg.endswith('*[int]'): + arg = self.argnames[i] = arg[:-6] + d[arg] = P_INT_LIST + hardest_arg = max(hardest_arg, P_INT_LIST) + elif arg.endswith('*[str]'): + arg = self.argnames[i] = arg[:-6] + d[arg] = P_STR_LIST + hardest_arg = max(hardest_arg, P_STR_LIST) + elif arg.endswith('%'): + arg = self.argnames[i] = arg[:-1] + d[arg] = P_WRAPPED + hardest_arg = max(hardest_arg, P_WRAPPED) elif arg.endswith('!'): arg = self.argnames[i] = arg[:-1] d[arg] = P_NESTED @@ -86,9 +106,14 @@ self.hardest_arg = hardest_arg if hardest_arg > P_NODE: + self.args = self.args.replace('*str', '') + self.args = self.args.replace('*int', '') + self.args = self.args.replace('*[str]', '') + self.args = self.args.replace('*[int]', '') self.args = self.args.replace('*', '') self.args = self.args.replace('!', '') self.args = self.args.replace('&', '') + self.args = self.args.replace('%', '') return d @@ -106,6 +131,8 @@ print >> buf 
self._gen_visit(buf) print >> buf + self._gen_attrs(buf) + print >> buf self._gen_typedef(buf) buf.seek(0, 0) return buf.read() @@ -218,91 +245,185 @@ print >> buf, " def accept(self, visitor):" print >> buf, " return visitor.visit%s(self)" % self.name + + def _gen_fget_func(self, buf, attr, prop ): + # FGET + print >> buf, " def fget_%s( space, self):" % attr + if prop[attr]==P_WRAPPED: + print >> buf, " return self.%s" % attr + elif prop[attr] in (P_INT,P_STR, P_NODE): + print >> buf, " return space.wrap(self.%s)" % attr + elif prop[attr] in (P_INT_LIST, P_STR_LIST, P_NESTED ): + print >> buf, " return space.newlist( [space.wrap(itm) for itm in self.%s] )" % attr + elif prop[attr]==P_NONE: + print >> buf, " if self.%s is None:" % attr + print >> buf, " return space.w_None" + print >> buf, " else:" + print >> buf, " return space.wrap(self.%s)" % attr + else: + assert False, "Unkown node type" + + def _gen_fset_func(self, buf, attr, prop ): + # FSET + print >> buf, " def fset_%s( space, self, w_arg):" % attr + if prop[attr]==P_WRAPPED: + print >> buf, " self.%s = w_arg" % attr + elif prop[attr]==P_INT: + print >> buf, " self.%s = space.int_w(w_arg)" % attr + elif prop[attr]==P_STR: + print >> buf, " self.%s = space.str_w(w_arg)" % attr + elif prop[attr]==P_INT_LIST: + print >> buf, " del self.%s[:]" % attr + print >> buf, " for itm in space.unpackiterable(w_arg):" + print >> buf, " self.%s.append( space.int_w(itm) )" % attr + elif prop[attr]==P_STR_LIST: + print >> buf, " del self.%s[:]" % attr + print >> buf, " for itm in space.unpackiterable(w_arg):" + print >> buf, " self.%s.append( space.str_w(itm) )" % attr + elif prop[attr]==P_NESTED: + print >> buf, " del self.%s[:]" % attr + print >> buf, " for w_itm in space.unpackiterable( w_arg ):" + print >> buf, " self.%s.append( space.interpclass_w( w_arg ) )" % attr + elif prop[attr]==P_NONE: + print >> buf, " if space.is_w( w_arg, space.w_None ):" + print >> buf, " self.%s = None" % attr + print >> buf, " else:" + 
print >> buf, " obj = space.interpclass_w( w_arg )" + print >> buf, " if not isinstance( obj, Node):" + print >> buf, " raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))" + print >> buf, " self.%s = obj" % attr + else: # P_NODE + print >> buf, " obj = space.interpclass_w( w_arg )" + print >> buf, " if not isinstance( obj, Node):" + print >> buf, " raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))" + print >> buf, " self.%s = obj" % attr + + def _gen_attrs(self, buf): + prop = self.argprops + for attr in self.argnames: + if "fget_%s" % attr not in self.additional_methods: + self._gen_fget_func( buf, attr, prop ) + + if "fset_%s" % attr not in self.additional_methods: + self._gen_fset_func( buf, attr, prop ) + + def _gen_typedef(self, buf): + parent_type = "%s.typedef" % self.parent.name print >> buf, "def descr_%s_accept( space, w_self, w_visitor):" %self.name print >> buf, " w_callable = space.getattr(w_visitor, space.wrap('visit%s'))" % self.name print >> buf, " args = Arguments(space, [ w_self ])" print >> buf, " return space.call_args(w_callable, args)" print >> buf, "" - print >> buf, "%s.typedef = TypeDef('%s', Node.typedef, " % (self.name,self.name) - print >> buf, " accept=interp2app(descr_%s_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ))" % self.name + print >> buf, "%s.typedef = TypeDef('%s', %s, " % (self.name,self.name,parent_type) + print >> buf, " accept=interp2app(descr_%s_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] )," % self.name + for attr in self.argnames: + print >> buf, " %s=GetSetProperty(%s.fget_%s, %s.fset_%s )," % (attr,self.name,attr,self.name,attr) + print >> buf, " )" def _gen_additional_methods(self, buf): for key, value in self.additional_methods.iteritems(): - if key not in '_cur_': - print >> buf, ''.join(value) - # print >> buf, '\n\n' + print >> buf, ''.join(value) def gen_base_visit(self, buf): print >> buf, " def visit%s(self, node):" % self.name print >> buf, " return 
self.default( node )" + def gen_print_visit(self, buf): + # This is a print visitor for application level tests + print >> buf, " def visit%s(self, node):" % self.name + print >> buf, " print '%s('," % self.name + for attr in self.argnames: + if self.argprops[attr] == P_NODE: + print >> buf, " node.%s.accept(self)" % attr + print >> buf, " print ','," + if self.argprops[attr] == P_NONE: + print >> buf, " if node.%s: node.%s.accept(self)" % (attr,attr) + print >> buf, " print ','," + elif self.argprops[attr] == P_NESTED: + print >> buf, " for nd in node.%s:" % attr + print >> buf, " nd.accept(self)" + print >> buf, " print ','," + else: + print >> buf, " print node.%s,','," % attr + print >> buf, " print ')'," + + + Node_NodeInfo = NodeInfo("Node","") rx_init = re.compile('init\((.*)\):') rx_flatten_nodes = re.compile('flatten_nodes\((.*)\.(.*)\):') -rx_additional_methods = re.compile('(.*)\.(.*)\((.*?)\):') +rx_additional_methods = re.compile('(\\w+)\.(\w+)\((.*?)\):') def parse_spec(file): classes = {} cur = None kind = None - for line in fileinput.input(file): - mo = None + fiter = fileinput.input(file) + for line in fiter: + if line.startswith("== OVERRIDES =="): + break comment = line.strip().startswith('#') - if not comment: - mo = rx_init.search(line) - if mo: - kind = 'init' - else: - mo = rx_flatten_nodes.search(line) - if mo: - kind = 'flatten_nodes' - else: - mo = rx_additional_methods.search(line) - if mo: - kind = 'additional_method' - if mo is None: - if cur is None: - if comment: - continue - # a normal entry - try: - name, args = line.split(':') - except ValueError: - continue - if "(" in name: - name, parent = name.split("(") - parent = parent[:-1] - else: - parent = None - classes[name] = NodeInfo(name, args, parent) - cur = None - elif kind == 'init': - # some code for the __init__ method - cur.init.append(line) - elif kind == 'flatten_nodes': - cur.flatten_nodes['_cur_'].append(line) - elif kind == 'additional_method': - 
cur.additional_methods['_cur_'].append(' '*4 + line) - elif kind == 'init': - # some extra code for a Node's __init__ method - name = mo.group(1) - cur = classes[name] - elif kind == 'flatten_nodes': - # special case for getChildNodes flattening - name = mo.group(1) - attr = mo.group(2) - cur = classes[name] - cur.flatten_nodes[attr] = cur.flatten_nodes['_cur_'] = [] - elif kind == 'additional_method': - name = mo.group(1) - methname = mo.group(2) - params = mo.group(3) - cur = classes[name] - cur.additional_methods['_cur_'] = [' def %s(%s):\n' % (methname, params)] - cur.additional_methods[methname] = cur.additional_methods['_cur_'] + if comment: + continue + # a normal entry + try: + name, args = line.split(':') + except ValueError: + continue + if "(" in name: + name, parent = name.split("(") + parent = parent[:-1] + else: + parent = None + classes[name] = NodeInfo(name, args, parent) + + + for line in fiter: + mo = None + mo = rx_init.match(line) + if mo: + kind = 'init' + # some extra code for a Node's __init__ method + name = mo.group(1) + cur = classes[name] + continue + + mo = rx_flatten_nodes.match(line) + if mo: + kind = 'flatten_nodes' + # special case for getChildNodes flattening + name = mo.group(1) + attr = mo.group(2) + cur = classes[name] + _cur_ = attr + cur.flatten_nodes[attr] = [] + flatten_expect_comment = True + continue + + mo = rx_additional_methods.match(line) + if mo: + kind = 'additional_method' + name = mo.group(1) + methname = mo.group(2) + params = mo.group(3) + cur = classes[name] + _cur_ = methname + cur.additional_methods[_cur_] = [' def %s(%s):\n' % (methname, params)] + continue + + if kind == 'init': + # some code for the __init__ method + cur.init.append(line) + elif kind == 'flatten_nodes': + if flatten_expect_comment: + assert line.strip().startswith("#") + flatten_expect_comment=False + cur.flatten_nodes[_cur_].append(line) + elif kind == 'additional_method': + cur.additional_methods[_cur_].append(' '*4 + line) for node in 
classes.values(): node.setup_parent(classes) @@ -334,6 +455,19 @@ info.gen_base_visit(buf) print buf.getvalue() +def gen_print_visitor(classes, f): + print >>f, ASTVISITORCLASS + buf = StringIO() + for info in classes: + info.gen_base_visit(buf) + print >>f, buf.getvalue() + print >>f, "class ASTPrintVisitor(ASTVisitor):" + buf = StringIO() + for info in classes: + info.gen_print_visit(buf) + print >>f, buf.getvalue() + + def main(): print prologue print @@ -348,6 +482,7 @@ for info in classes: emit(info) gen_ast_visitor(classes) + gen_print_visitor(classes,file("ast_test.py","w")) print epilogue prologue = ''' @@ -357,9 +492,10 @@ """ from consts import CO_VARARGS, CO_VARKEYWORDS, OP_ASSIGN from pypy.interpreter.baseobjspace import Wrappable -from pypy.interpreter.typedef import TypeDef +from pypy.interpreter.typedef import TypeDef, GetSetProperty from pypy.interpreter.gateway import interp2app, W_Root, ObjSpace from pypy.interpreter.argument import Arguments +from pypy.interpreter.error import OperationError def flatten(list): l = [] Modified: pypy/dist/pypy/interpreter/astcompiler/consts.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/consts.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/consts.py Thu Dec 8 18:25:31 2005 @@ -1,7 +1,8 @@ # operation flags -OP_ASSIGN = 'OP_ASSIGN' -OP_DELETE = 'OP_DELETE' -OP_APPLY = 'OP_APPLY' +OP_ASSIGN = 0 # 'OP_ASSIGN' +OP_DELETE = 1 # 'OP_DELETE' +OP_APPLY = 2 # 'OP_APPLY' +OP_NONE = 3 SC_LOCAL = 1 SC_GLOBAL = 2 Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Thu Dec 8 18:25:31 2005 @@ -7,7 +7,7 @@ from pypy.interpreter.astcompiler import ast, parse, walk, syntax from pypy.interpreter.astcompiler import pyassem, misc, future, 
symbols from pypy.interpreter.astcompiler.consts import SC_LOCAL, SC_GLOBAL, \ - SC_FREE, SC_CELL, SC_DEFAULT + SC_FREE, SC_CELL, SC_DEFAULT, OP_APPLY, OP_ASSIGN, OP_DELETE, OP_NONE from pypy.interpreter.astcompiler.consts import CO_VARARGS, CO_VARKEYWORDS, \ CO_NEWLOCALS, CO_NESTED, CO_GENERATOR, CO_GENERATOR_ALLOWED, CO_FUTURE_DIVISION from pypy.interpreter.pyparser.error import SyntaxError @@ -313,9 +313,9 @@ assert node.scope is not None self.scope = node.scope self.emitop_int('SET_LINENO', 0) - if not space.is_w(node.doc, space.w_None): + if not space.is_w(node.w_doc, space.w_None): self.set_lineno(node) - self.emitop_obj('LOAD_CONST', node.doc) + self.emitop_obj('LOAD_CONST', node.w_doc) self.storeName('__doc__', node.lineno) node.node.accept( self ) self.emitop_obj('LOAD_CONST', space.w_None ) @@ -332,8 +332,8 @@ def visitFunction(self, node): self._visitFuncOrLambda(node, isLambda=0) space = self.space - if not space.is_w(node.doc, space.w_None): - self.setDocstring(node.doc) + if not space.is_w(node.w_doc, space.w_None): + self.setDocstring(node.w_doc) self.storeName(node.name, node.lineno) def visitLambda(self, node): @@ -892,29 +892,29 @@ elt.accept( self ) def visitAssName(self, node): - if node.flags == 'OP_ASSIGN': + if node.flags == OP_ASSIGN: self.storeName(node.name, node.lineno) - elif node.flags == 'OP_DELETE': + elif node.flags == OP_DELETE: self.set_lineno(node) self.delName(node.name, node.lineno) else: - assert False, "visitAssName unexpected flags: %s" % node.flags + assert False, "visitAssName unexpected flags: %d" % node.flags def visitAssAttr(self, node): node.expr.accept( self ) - if node.flags == 'OP_ASSIGN': + if node.flags == OP_ASSIGN: if node.attrname == 'None': raise SyntaxError('assignment to None is not allowed', node.lineno) self.emitop('STORE_ATTR', self.mangle(node.attrname)) - elif node.flags == 'OP_DELETE': + elif node.flags == OP_DELETE: if node.attrname == 'None': raise SyntaxError('deleting None is not allowed', 
node.lineno) self.emitop('DELETE_ATTR', self.mangle(node.attrname)) else: - assert False, "visitAssAttr unexpected flags: %s" % node.flags + assert False, "visitAssAttr unexpected flags: %d" % node.flags def _visitAssSequence(self, node, op='UNPACK_SEQUENCE'): - if findOp(node) != 'OP_DELETE': + if findOp(node) != OP_DELETE: self.emitop_int(op, len(node.nodes)) for child in node.nodes: child.accept( self ) @@ -1052,14 +1052,14 @@ self.emitop_int('DUP_TOPX', 3) else: self.emitop_int('DUP_TOPX', 2) - if node.flags == 'OP_APPLY': + if node.flags == OP_APPLY: self.emit('SLICE+%d' % slice) - elif node.flags == 'OP_ASSIGN': + elif node.flags == OP_ASSIGN: self.emit('STORE_SLICE+%d' % slice) - elif node.flags == 'OP_DELETE': + elif node.flags == OP_DELETE: self.emit('DELETE_SLICE+%d' % slice) else: - assert False, "weird slice %s" % node.flags + assert False, "weird slice %d" % node.flags def visitSubscript(self, node): return self._visitSubscript(node, False) @@ -1072,11 +1072,11 @@ self.emitop_int('DUP_TOPX', 2) if len(node.subs) > 1: self.emitop_int('BUILD_TUPLE', len(node.subs)) - if node.flags == 'OP_APPLY': + if node.flags == OP_APPLY: self.emit('BINARY_SUBSCR') - elif node.flags == 'OP_ASSIGN': + elif node.flags == OP_ASSIGN: self.emit('STORE_SUBSCR') - elif node.flags == 'OP_DELETE': + elif node.flags == OP_DELETE: self.emit('DELETE_SUBSCR') # binary ops @@ -1268,8 +1268,8 @@ CodeGenerator.__init__(self, space, graph) self.optimized = 1 - if not isLambda and not space.is_w(func.doc, space.w_None): - self.setDocstring(func.doc) + if not isLambda and not space.is_w(func.w_doc, space.w_None): + self.setDocstring(func.w_doc) if func.varargs: self.graph.setFlag(CO_VARARGS) @@ -1348,8 +1348,8 @@ CodeGenerator.__init__(self, space, graph) self.class_name = klass.name self.graph.setFlag(CO_NEWLOCALS) - if not space.is_w(klass.doc, space.w_None): - self.setDocstring(klass.doc) + if not space.is_w(klass.w_doc, space.w_None): + self.setDocstring(klass.w_doc) def 
get_module(self): return self.module @@ -1371,8 +1371,8 @@ self.set_lineno(klass) self.emitop("LOAD_GLOBAL", "__name__") self.storeName("__module__", klass.lineno) - if not space.is_w(klass.doc, space.w_None): - self.emitop_obj("LOAD_CONST", klass.doc) + if not space.is_w(klass.w_doc, space.w_None): + self.emitop_obj("LOAD_CONST", klass.w_doc) self.storeName('__doc__', klass.lineno) def findOp(node): @@ -1384,20 +1384,20 @@ class OpFinder(ast.ASTVisitor): def __init__(self): - self.op = None + self.op = OP_NONE def visitAssName(self, node): - if self.op is None: + if self.op is OP_NONE: self.op = node.flags elif self.op != node.flags: raise ValueError("mixed ops in stmt") def visitAssAttr(self, node): - if self.op is None: + if self.op is OP_NONE: self.op = node.flags elif self.op != node.flags: raise ValueError("mixed ops in stmt") def visitSubscript(self, node): - if self.op is None: + if self.op is OP_NONE: self.op = node.flags elif self.op != node.flags: raise ValueError("mixed ops in stmt") Modified: pypy/dist/pypy/interpreter/astcompiler/symbols.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/symbols.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/symbols.py Thu Dec 8 18:25:31 2005 @@ -361,7 +361,7 @@ scope = ClassScope(node.name, self.module) if parent.nested or isinstance(parent, FunctionScope): scope.nested = 1 - if node.doc is not None: + if node.w_doc is not None: scope.add_def('__doc__') scope.add_def('__module__') node.scope = scope Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Thu Dec 8 18:25:31 2005 @@ -591,7 +591,7 @@ token = atoms[-2] if isinstance(token, TokenObject) and token.name == tok.DOUBLESTAR: obj = parse_attraccess(slicecut(atoms, 0, -2)) - 
builder.push(ast.Power([obj, atoms[-1]], lineno)) + builder.push(ast.Power( obj, atoms[-1], lineno)) else: obj = parse_attraccess(atoms) builder.push(obj) @@ -620,13 +620,13 @@ op_node = atoms[i-1] assert isinstance(op_node, TokenObject) if op_node.name == tok.STAR: - left = ast.Mul( [ left, right ], left.lineno ) + left = ast.Mul( left, right, left.lineno ) elif op_node.name == tok.SLASH: - left = ast.Div( [ left, right ], left.lineno ) + left = ast.Div( left, right, left.lineno ) elif op_node.name == tok.PERCENT: - left = ast.Mod( [ left, right ], left.lineno ) + left = ast.Mod( left, right, left.lineno ) elif op_node.name == tok.DOUBLESLASH: - left = ast.FloorDiv( [ left, right ], left.lineno ) + left = ast.FloorDiv( left, right, left.lineno ) else: token = atoms[i-1] raise SyntaxError("unexpected token", token.lineno, token.col) @@ -641,9 +641,9 @@ op_node = atoms[i-1] assert isinstance(op_node, TokenObject) if op_node.name == tok.PLUS: - left = ast.Add([ left, right ], left.lineno) + left = ast.Add( left, right, left.lineno) elif op_node.name == tok.MINUS: - left = ast.Sub([ left, right ], left.lineno) + left = ast.Sub( left, right, left.lineno) else: token = atoms[i-1] raise SyntaxError("unexpected token", token.lineno, token.col) @@ -659,9 +659,9 @@ op_node = atoms[i-1] assert isinstance(op_node, TokenObject) if op_node.name == tok.LEFTSHIFT: - left = ast.LeftShift( [left, right], lineno ) + left = ast.LeftShift( left, right, lineno ) elif op_node.name == tok.RIGHTSHIFT: - left = ast.RightShift( [ left, right ], lineno ) + left = ast.RightShift( left, right, lineno ) else: token = atoms[i-1] raise SyntaxError("unexpected token", token.lineno, token.col) @@ -1264,7 +1264,9 @@ names.append((name, as_name)) # move forward until next ',' # XXX: what is it supposed to do ? 
- for atom in atoms[index:]: + while index A* or S -> A+ Token : a lexer token """ +from pypy.interpreter.baseobjspace import Wrappable +from pypy.interpreter.typedef import TypeDef +from pypy.interpreter.gateway import interp2app, ObjSpace, W_Root +from pypy.interpreter.argument import Arguments +from pypy.interpreter.error import OperationError DEBUG = 0 USE_LOOKAHEAD = True @@ -193,7 +198,7 @@ ####################################################################### # Grammar Elements Classes (Alternative, Sequence, KleeneStar, Token) # ####################################################################### -class GrammarElement(object): +class GrammarElement(Wrappable): """Base parser class""" symbols = {} # dirty trick to provide a symbols mapping while printing (and not putting it in every object) @@ -344,6 +349,21 @@ pass + def descr_repr( self, space ): + """TODO: make __repr__ RPython""" + import pysymbol + return space.wrap( self.display(0, pysymbol.sym_name) ) + + def descr_get_children( self, space ): + return space.newlist( [ space.wrap(it) for it in self.args ] ) + +GrammarElement.typedef = TypeDef( "GrammarElement", + #__repr__ = interp2app(GrammarElement.descr_repr, + # unwrap_spec=['self', ObjSpace] ), + get_children = interp2app(GrammarElement.descr_get_children, + unwrap_spec=['self', ObjSpace] ), + ) + class Alternative(GrammarElement): """Represents an alternative in a grammar rule (as in S -> A | B | C)""" def __init__(self, name, args): @@ -440,7 +460,47 @@ return True return False - + def descr_alternative_append( self, space, w_rule ): + rule = space.interpclass_w(w_rule) + if not isinstance( rule, GrammarElement ): + raise OperationError( space.w_TypeError, space.wrap("Need a GrammarElement instance") ) + self.args.append( rule ) + + def descr_alternative___getitem__(self, space, idx ): + return space.wrap(self.args[idx]) + + def descr_alternative___setitem__(self, space, idx, w_rule ): + rule = space.interpclass_w(w_rule) + if not 
isinstance( rule, GrammarElement ): + raise OperationError( space.w_TypeError, space.wrap("Need a GrammarElement instance") ) + return space.wrap( self.args[idx] ) + + def descr_alternative___delitem__(self, space, idx ): + del self.args[idx] + + def descr_alternative_insert(self, space, idx, w_rule ): + rule = space.interpclass_w(w_rule) + if not isinstance( rule, GrammarElement ): + raise OperationError( space.w_TypeError, space.wrap("Need a GrammarElement instance") ) + if idx<0 or idx>len(self.args): + raise OperationError( space.w_IndexError, space.wrap("Invalid index") ) + self.args.insert( idx, rule ) + + + +Alternative.typedef = TypeDef("Alternative", GrammarElement.typedef, + __getitem__ = interp2app( Alternative.descr_alternative___getitem__, + unwrap_spec=['self',ObjSpace,int]), + __setitem__ = interp2app( Alternative.descr_alternative___setitem__, + unwrap_spec=['self',ObjSpace,int,W_Root]), + __delitem__ = interp2app( Alternative.descr_alternative___delitem__, + unwrap_spec=['self',ObjSpace,int]), + insert = interp2app( Alternative.descr_alternative_insert, + unwrap_spec = ['self', ObjSpace, int, W_Root ] ), + append = interp2app( Alternative.descr_alternative_append, + unwrap_spec = ['self', ObjSpace, W_Root ] ), + ) + class Sequence(GrammarElement): """Reprensents a Sequence in a grammar rule (as in S -> A B C)""" def __init__(self, name, args): @@ -452,7 +512,7 @@ def _match(self, source, builder, level=0): """matches all of the symbols in order""" if DEBUG > 1: - print "try seq:", self.display(level, builder.symbols ) + print "try seq:", self.display(0, builder.symbols ) ctx = source.context() bctx = builder.context() for rule in self.args: @@ -474,7 +534,7 @@ return name else: name = "" - items = [a.display(1) for a in self.args] + items = [a.display(1,symbols) for a in self.args] return name + "(" + " ".join( items ) + ")" def calc_first_set(self): @@ -514,6 +574,47 @@ return False return True + def descr_alternative_append( self, space, w_rule 
): + rule = space.interpclass_w(w_rule) + if not isinstance( rule, GrammarElement ): + raise OperationError( space.w_TypeError, space.wrap("Need a GrammarElement instance") ) + self.args.append( rule ) + + def descr_alternative___getitem__(self, space, idx ): + return space.wrap(self.args[idx]) + + def descr_alternative___setitem__(self, space, idx, w_rule ): + rule = space.interpclass_w(w_rule) + if not isinstance( rule, GrammarElement ): + raise OperationError( space.w_TypeError, space.wrap("Need a GrammarElement instance") ) + return space.wrap( self.args[idx] ) + + def descr_alternative___delitem__(self, space, idx ): + del self.args[idx] + + def descr_alternative_insert(self, space, idx, w_rule ): + rule = space.interpclass_w(w_rule) + if not isinstance( rule, GrammarElement ): + raise OperationError( space.w_TypeError, space.wrap("Need a GrammarElement instance") ) + if idx<0 or idx>len(self.args): + raise OperationError( space.w_IndexError, space.wrap("Invalid index") ) + self.args.insert( idx, rule ) + + + +Sequence.typedef = TypeDef("Sequence", GrammarElement.typedef, + __getitem__ = interp2app( Sequence.descr_alternative___getitem__, + unwrap_spec=['self',ObjSpace,int]), + __setitem__ = interp2app( Sequence.descr_alternative___setitem__, + unwrap_spec=['self',ObjSpace,int,W_Root]), + __delitem__ = interp2app( Sequence.descr_alternative___delitem__, + unwrap_spec=['self',ObjSpace,int]), + insert = interp2app( Sequence.descr_alternative_insert, + unwrap_spec = ['self', ObjSpace, int, W_Root ] ), + append = interp2app( Sequence.descr_alternative_append, + unwrap_spec = ['self', ObjSpace, W_Root ] ), + ) + class KleeneStar(GrammarElement): """Represents a KleeneStar in a grammar rule as in (S -> A+) or (S -> A*)""" @@ -535,7 +636,7 @@ represent infinity """ if DEBUG > 1: - print "try kle:", self.display() + print "try kle:", self.display(0,builder.symbols) ctx = 0 bctx = None if self.min: @@ -573,7 +674,7 @@ star = "*" elif self.min==1 and self.max==-1: star 
= "+" - s = self.args[0].display(1) + s = self.args[0].display(1, symbols) return name + "%s%s" % (s, star) @@ -605,6 +706,28 @@ return False return True + def descr_kleenestar___getitem__(self, space, idx ): + if idx!=0: + raise OperationError( space.w_ValueError, space.wrap("KleeneStar only support one child")) + return space.wrap(self.args[idx]) + + def descr_kleenestar___setitem__(self, space, idx, w_rule ): + rule = space.interpclass_w(w_rule) + if idx!=0: + raise OperationError( space.w_ValueError, space.wrap("KleeneStar only support one child")) + if not isinstance( rule, GrammarElement ): + raise OperationError( space.w_TypeError, space.wrap("Need a GrammarElement instance") ) + self.args[idx] = rule + + + +KleeneStar.typedef = TypeDef("KleeneStar", GrammarElement.typedef, + __getitem__ = interp2app(KleeneStar.descr_kleenestar___getitem__, + unwrap_spec=[ 'self', ObjSpace, int]), + __setitem__ = interp2app(KleeneStar.descr_kleenestar___setitem__, + unwrap_spec=[ 'self', ObjSpace, int, W_Root ]), + ) + class Token(GrammarElement): """Represents a Token in a grammar rule (a lexer token)""" @@ -681,6 +804,8 @@ return True return False +Token.typedef = TypeDef("Token", GrammarElement.typedef ) + from pypy.interpreter.pyparser.pytoken import NULLTOKEN EmptyToken = Token(NULLTOKEN, None) Modified: pypy/dist/pypy/interpreter/pyparser/pythonparse.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/pythonparse.py (original) +++ pypy/dist/pypy/interpreter/pyparser/pythonparse.py Thu Dec 8 18:25:31 2005 @@ -149,3 +149,7 @@ def parse_eval_input(textsrc, gram, builder): """Parse a python expression""" return gram.parse_source( textsrc, "eval_input", builder ) + + +def grammar_rules( space ): + return space.wrap( PYTHON_PARSER.rules ) Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py ============================================================================== --- 
pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py Thu Dec 8 18:25:31 2005 @@ -7,6 +7,8 @@ import pypy.interpreter.stablecompiler.ast as test_ast import pypy.interpreter.astcompiler.ast as ast_ast +flatten = ast_ast.flatten + import py.test from pypy.interpreter.astcompiler import ast @@ -24,8 +26,10 @@ return False else: print "Type mismatch", repr(l), repr(r) - print "l is str", type(l)==str - print "r is AssName", isinstance(r,ast_ast.AssName) + print "l is str", repr(l), type(l)==str + print "r is AssName", repr(r), isinstance(r,ast_ast.AssName) + print "left is", repr(left) + print "right is", repr(right) return False return True @@ -38,7 +42,14 @@ return False if isinstance(left,test_ast.Function) and isinstance(right,ast_ast.Function): left_nodes = list(left.getChildren()) - right_nodes = list(right.getChildren()) + right_nodes = [] # generated ast differ here because argnames is a list of nodes in + right_nodes.append(right.decorators) + right_nodes.append(right.name) + right_nodes.append(right.argnames) + right_nodes.extend(flatten(right.defaults)) + right_nodes.append(right.flags) + right_nodes.append(right.w_doc) + right_nodes.append(right.code) left_args = left_nodes[2] del left_nodes[2] right_args = right_nodes[2] @@ -47,7 +58,14 @@ return False elif isinstance(left,test_ast.Lambda) and isinstance(right,ast_ast.Lambda): left_nodes = list(left.getChildren()) - right_nodes = list(right.getChildren()) + right_nodes = [] # generated ast differ here because argnames is a list of nodes in + right_nodes.append(right.argnames) + right_nodes.extend(flatten(right.defaults)) + right_nodes.append(right.flags) + right_nodes.append(right.code) + + print "left", repr(left_nodes) + print "right", repr(right_nodes) left_args = left_nodes[0] del left_nodes[0] right_args = right_nodes[0] Modified: pypy/dist/pypy/interpreter/pyparser/test/test_samples.py 
============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_samples.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_samples.py Thu Dec 8 18:25:31 2005 @@ -95,6 +95,7 @@ from pypy.interpreter.stablecompiler.transformer import Transformer as PyPyTransformer from compiler.transformer import Transformer as PythonTransformer +from pypy.interpreter.astcompiler.consts import OP_ASSIGN, OP_DELETE, OP_APPLY def _check_tuples_equality(pypy_tuples, python_tuples, testname): # compare the two tuples by transforming them into AST, to hide irrelevant @@ -126,6 +127,11 @@ # astbuilder.py repr_pypy = repr_pypy.replace("[]", "()") repr_python = repr_python.replace("[]", "()") + # We also changed constants 'OP_ASSIGN' 'OP_DELETE' 'OP_APPLY' to use numeric values + repr_python = repr_python.replace("'OP_ASSIGN'", repr(OP_ASSIGN) ) + repr_python = repr_python.replace("'OP_DELETE'", repr(OP_DELETE) ) + repr_python = repr_python.replace("'OP_APPLY'", repr(OP_APPLY) ) + assert repr_pypy == repr_python Modified: pypy/dist/pypy/interpreter/stablecompiler/consts.py ============================================================================== --- pypy/dist/pypy/interpreter/stablecompiler/consts.py (original) +++ pypy/dist/pypy/interpreter/stablecompiler/consts.py Thu Dec 8 18:25:31 2005 @@ -1,7 +1,7 @@ # operation flags -OP_ASSIGN = 'OP_ASSIGN' -OP_DELETE = 'OP_DELETE' -OP_APPLY = 'OP_APPLY' +OP_ASSIGN = 0 # 'OP_ASSIGN' +OP_DELETE = 1 # 'OP_DELETE' +OP_APPLY = 2 # 'OP_APPLY' SC_LOCAL = 1 SC_GLOBAL = 2 Modified: pypy/dist/pypy/interpreter/stablecompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/stablecompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/stablecompiler/pycodegen.py Thu Dec 8 18:25:31 2005 @@ -9,7 +9,7 @@ from pypy.interpreter.stablecompiler import ast, parse, walk, syntax from 
pypy.interpreter.stablecompiler import pyassem, misc, future, symbols from pypy.interpreter.stablecompiler.consts import SC_LOCAL, SC_GLOBAL, \ - SC_FREE, SC_CELL, SC_DEFAULT + SC_FREE, SC_CELL, SC_DEFAULT, OP_APPLY, OP_DELETE, OP_ASSIGN from pypy.interpreter.stablecompiler.consts import CO_VARARGS, CO_VARKEYWORDS, \ CO_NEWLOCALS, CO_NESTED, CO_GENERATOR, CO_GENERATOR_ALLOWED, CO_FUTURE_DIVISION from pypy.interpreter.stablecompiler.pyassem import TupleArg @@ -914,9 +914,9 @@ self.visit(elt) def visitAssName(self, node): - if node.flags == 'OP_ASSIGN': + if node.flags == OP_ASSIGN: self.storeName(node.name) - elif node.flags == 'OP_DELETE': + elif node.flags == OP_DELETE: self.set_lineno(node) self.delName(node.name) else: @@ -924,16 +924,16 @@ def visitAssAttr(self, node): self.visit(node.expr) - if node.flags == 'OP_ASSIGN': + if node.flags == OP_ASSIGN: self.emit('STORE_ATTR', self.mangle(node.attrname)) - elif node.flags == 'OP_DELETE': + elif node.flags == OP_DELETE: self.emit('DELETE_ATTR', self.mangle(node.attrname)) else: print "warning: unexpected flags:", node.flags print node def _visitAssSequence(self, node, op='UNPACK_SEQUENCE'): - if findOp(node) != 'OP_DELETE': + if findOp(node) != OP_DELETE: self.emit(op, len(node.nodes)) for child in node.nodes: self.visit(child) @@ -1098,11 +1098,11 @@ self.emit('DUP_TOPX', 3) else: self.emit('DUP_TOPX', 2) - if node.flags == 'OP_APPLY': + if node.flags == OP_APPLY: self.emit('SLICE+%d' % slice) - elif node.flags == 'OP_ASSIGN': + elif node.flags == OP_ASSIGN: self.emit('STORE_SLICE+%d' % slice) - elif node.flags == 'OP_DELETE': + elif node.flags == OP_DELETE: self.emit('DELETE_SLICE+%d' % slice) else: print "weird slice", node.flags @@ -1116,11 +1116,11 @@ self.emit('DUP_TOPX', 2) if len(node.subs) > 1: self.emit('BUILD_TUPLE', len(node.subs)) - if node.flags == 'OP_APPLY': + if node.flags == OP_APPLY: self.emit('BINARY_SUBSCR') - elif node.flags == 'OP_ASSIGN': + elif node.flags == OP_ASSIGN: 
self.emit('STORE_SUBSCR') - elif node.flags == 'OP_DELETE': + elif node.flags == OP_DELETE: self.emit('DELETE_SUBSCR') # binary ops Modified: pypy/dist/pypy/module/recparser/__init__.py ============================================================================== --- pypy/dist/pypy/module/recparser/__init__.py (original) +++ pypy/dist/pypy/module/recparser/__init__.py Thu Dec 8 18:25:31 2005 @@ -46,5 +46,6 @@ # PyPy extension 'decode_string_literal': 'pyparser.decode_string_literal', 'install_compiler_hook' : 'pypy.interpreter.pycompiler.install_compiler_hook', + 'rules' : 'pypy.interpreter.pyparser.pythonparse.grammar_rules', } From mwh at codespeak.net Thu Dec 8 18:40:55 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Thu, 8 Dec 2005 18:40:55 +0100 (CET) Subject: [pypy-svn] r20904 - in pypy/dist/pypy/translator/llvm: . module Message-ID: <20051208174055.6683527B68@code1.codespeak.net> Author: mwh Date: Thu Dec 8 18:40:54 2005 New Revision: 20904 Modified: pypy/dist/pypy/translator/llvm/database.py pypy/dist/pypy/translator/llvm/module/support.py pypy/dist/pypy/translator/llvm/opwriter.py Log: almost enough operations for pypy-llvm to work again (the way in which it fails now confuses me, someone who actually knows about llvm should try it, i guess). 
Modified: pypy/dist/pypy/translator/llvm/database.py ============================================================================== --- pypy/dist/pypy/translator/llvm/database.py (original) +++ pypy/dist/pypy/translator/llvm/database.py Thu Dec 8 18:40:54 2005 @@ -36,6 +36,8 @@ lltype.Float: "double", lltype.UniChar: "uint", lltype.Void: "void", + lltype.UnsignedLongLong: "ulong", + lltype.SignedLongLong: "long", Address: "sbyte*"} # 32 bit platform Modified: pypy/dist/pypy/translator/llvm/module/support.py ============================================================================== --- pypy/dist/pypy/translator/llvm/module/support.py (original) +++ pypy/dist/pypy/translator/llvm/module/support.py Thu Dec 8 18:40:54 2005 @@ -57,6 +57,18 @@ ret int %result } +internal fastcc long %pypyop_long_abs(long %x) { +block0: + %cond1 = setge long %x, 0 + br bool %cond1, label %return_block, label %block1 +block1: + %x2 = sub long 0, %x + br label %return_block +return_block: + %result = phi long [%x, %block0], [%x2, %block1] + ret long %result +} + internal fastcc double %pypyop_float_abs(double %x) { block0: %cond1 = setge double %x, 0.0 Modified: pypy/dist/pypy/translator/llvm/opwriter.py ============================================================================== --- pypy/dist/pypy/translator/llvm/opwriter.py (original) +++ pypy/dist/pypy/translator/llvm/opwriter.py Thu Dec 8 18:40:54 2005 @@ -19,6 +19,21 @@ 'int_ge': 'setge', 'int_gt': 'setgt', + 'llong_mul': 'mul', + 'llong_add': 'add', + 'llong_sub': 'sub', + 'llong_floordiv': 'div', + 'llong_mod': 'rem', + 'llong_and': 'and', + 'llong_or': 'or', + 'llong_xor': 'xor', + 'llong_lt': 'setlt', + 'llong_le': 'setle', + 'llong_eq': 'seteq', + 'llong_ne': 'setne', + 'llong_ge': 'setge', + 'llong_gt': 'setgt', + 'uint_mul': 'mul', 'uint_add': 'add', 'uint_sub': 'sub', @@ -61,6 +76,9 @@ 'uint_lshift': 'shl', 'uint_rshift': 'shr', + + 'llong_lshift': 'shl', + 'llong_rshift': 'shr', } @@ -88,7 +106,7 @@ self.shiftop(op) elif 
op.opname in self.char_operations: self.char_binaryop(op) - elif op.opname.startswith('cast_'): + elif op.opname.startswith('cast_') or op.opname.startswith('truncate_'): if op.opname == 'cast_char_to_int': self.cast_char_to_int(op) else: @@ -140,6 +158,7 @@ [self.db.repr_arg(op.args[0])], [self.db.repr_arg_type(op.args[0])]) float_abs = int_abs + llong_abs = int_abs def int_pow(self, op): self._generic_pow(op, "1") @@ -248,6 +267,7 @@ self.db.repr_arg(op.args[0]), "0") uint_is_true = int_is_true + llong_is_true = int_is_true def float_is_true(self, op): self.codewriter.binaryop("setne", From adim at codespeak.net Thu Dec 8 19:12:18 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Thu, 8 Dec 2005 19:12:18 +0100 (CET) Subject: [pypy-svn] r20905 - pypy/dist/pypy/module/recparser/test Message-ID: <20051208181218.9F06727B69@code1.codespeak.net> Author: adim Date: Thu Dec 8 19:12:17 2005 New Revision: 20905 Added: pypy/dist/pypy/module/recparser/test/test_compilehooks.py Log: added a small high-level test for compile hooks Added: pypy/dist/pypy/module/recparser/test/test_compilehooks.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/module/recparser/test/test_compilehooks.py Thu Dec 8 19:12:17 2005 @@ -0,0 +1,28 @@ +class AppTest_CompilerHooks: + + def test_basic_hook(self): + # define the hook + def threebecomestwo(ast, enc): + class ChangeConstVisitor: + def visitConst(self, node): + if node.value == 3: + node.value = 2 + + def defaultvisit(self, node): + for child in node.getChildNodes(): + child.accept(self) + + def __getattr__(self, attrname): + if attrname.startswith('visit'): + return self.defaultvisit + raise AttributeError(attrname) + + ast.accept(ChangeConstVisitor()) + return ast + + # install the hook + import parser + parser.install_compiler_hook(threebecomestwo) + d = {} + exec "a = 3" in d + assert d['a'] == 2 # well, yes ... 
From ericvrp at codespeak.net Fri Dec 9 00:01:23 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 9 Dec 2005 00:01:23 +0100 (CET) Subject: [pypy-svn] r20911 - in pypy/dist/pypy/translator/js: src test Message-ID: <20051208230123.DEE4F27DBA@code1.codespeak.net> Author: ericvrp Date: Fri Dec 9 00:01:21 2005 New Revision: 20911 Added: pypy/dist/pypy/translator/js/src/misc.js Modified: pypy/dist/pypy/translator/js/src/ll_stackless.js pypy/dist/pypy/translator/js/src/stack.js pypy/dist/pypy/translator/js/test/browsertest.py pypy/dist/pypy/translator/js/test/runtest.py pypy/dist/pypy/translator/js/test/test_stackless.py Log: Some refactoring to getting closer to running long running processes. Modified: pypy/dist/pypy/translator/js/src/ll_stackless.js ============================================================================== --- pypy/dist/pypy/translator/js/src/ll_stackless.js (original) +++ pypy/dist/pypy/translator/js/src/ll_stackless.js Fri Dec 9 00:01:21 2005 @@ -3,19 +3,11 @@ var slp_frame_stack_top = null; var slp_frame_stack_bottom = null; var slp_return_value = undefined; + +var slp_timeout = false; var slp_start_time = undefined; var slp_stack_depth = 0; -// This gets called with --log - -function log(s) { - try { - alert(s); // in browser - } catch (e) { - print('log: ' + s); // commandline - } -} - function function_name(func) { var s = func.toString().split("\n"); s = s[0].length == 0 ? 
s[1] : s[0]; @@ -24,8 +16,6 @@ return s } -// example function for testing - function ll_stackless_stack_frames_depth() { if (!slp_frame_stack_top) { LOG("ll_stackless_stack_frames_depth init"); @@ -49,18 +39,17 @@ var result = slp_stack_depth > 500; // Firefox has a recursion limit of 1000 (others allow more) LOG("ll_stack_to_big result=" + result); - if (!result) { + if (!result && in_browser && false) { var t = new Date().getTime(); var d = t - slp_start_time; - result = d > 1000; + result = d > 100; if (result) { - print('XXX d='+d + ' XXX t='+t); - slp_start_time = t; + slp_timeout = true; } } return result; } -ll_stack_too_big__ = ll_stack_too_big +ll_stack_too_big__ = ll_stack_too_big; function slp_new_frame(targetvar, func, resume_blocknum, vars) { //LOG("slp_new_frame("+targetvar+","+function_name(func)+","+resume_blocknum+","+vars.toSource()+")"); @@ -145,6 +134,9 @@ function slp_main_loop() { var f_back; + log("SLP_MAIN_LOOP"); + slp_timeout = false; + slp_start_time = new Date().getTime(); while (true) { slp_frame_stack_bottom = null; pending = slp_frame_stack_top; @@ -152,9 +144,12 @@ while (true) { f_back = pending.f_back; LOG('calling: ' + function_name(pending.func)); - //slp_start_time = new Date().getTime(); //XXX should really exit javascript and resume with setTimeout(...) slp_stack_depth = 0; // we are restarting to recurse slp_return_value = pending.func(); // params get initialized in the function because it's a resume! + if (slp_timeout) { + setTimeout('slp_main_loop()', 0); + return undefined; + } if (slp_frame_stack_top) { break; } @@ -164,19 +159,29 @@ pending = f_back; slp_frame_stack_top = pending; } - + if (slp_frame_stack_bottom) { // returning from switch() if (slp_frame_stack_bottom.f_back) log('slp_frame_stack_bottom.f_back'); slp_frame_stack_bottom.f_back = f_back; } } + log("REALLY FINISHED"); } -function slp_entry_point(funcstring) { +// +// note: this function returns undefined for long running processes. 
+// In that case it should be seen as similar to thread.run() +// +function slp_entry_point(funcstring) { //new thread().run() + slp_timeout = false; slp_start_time = new Date().getTime(); slp_stack_depth = 0; /// initial stack depth var result = eval(funcstring); if (slp_frame_stack_bottom) { // get with dispatch loop when stack unwound + if (slp_timeout) { + setTimeout('slp_main_loop()', 0); + return undefined; + } slp_main_loop(); result = slp_return_value; } Added: pypy/dist/pypy/translator/js/src/misc.js ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/js/src/misc.js Fri Dec 9 00:01:21 2005 @@ -0,0 +1,16 @@ +var in_browser; +try { + dummy = alert; + in_browser = true; +} catch (e) { + in_browser = false; +} + +function log(s) { + if (in_browser) { + var logdiv = document.getElementById('logdiv'); + logdiv.innerHTML = new Date().getTime() + ': ' + s + "
" + logdiv.innerHTML; + } else { + print('log: ' + s); + } +} Modified: pypy/dist/pypy/translator/js/src/stack.js ============================================================================== --- pypy/dist/pypy/translator/js/src/stack.js (original) +++ pypy/dist/pypy/translator/js/src/stack.js Fri Dec 9 00:01:21 2005 @@ -1,7 +1,3 @@ -function logme(s) { - print("logme: " + s); -} - function ll_stack_too_big_helper(depth) { if (depth > 0) { ll_stack_too_big_helper(depth-1) @@ -16,3 +12,4 @@ } return false; } +ll_stack_too_big__ = ll_stack_too_big; Modified: pypy/dist/pypy/translator/js/test/browsertest.py ============================================================================== --- pypy/dist/pypy/translator/js/test/browsertest.py (original) +++ pypy/dist/pypy/translator/js/test/browsertest.py Fri Dec 9 00:01:21 2005 @@ -29,9 +29,12 @@ result = "raise Exception('unknown')"; } } - var resultform = document.forms['resultform']; - resultform.result.value = result; - resultform.submit(); + + if (result != undefined || !in_browser) { // if no timeout (i.e. not long running) + var resultform = document.forms['resultform']; + resultform.result.value = result; + resultform.submit(); + } }; @@ -40,6 +43,7 @@
+
""" @@ -76,6 +80,7 @@ jstestcase = jstest.jstestcase jscode = jstest.jscode html_page = config.html_page % locals() + open("html_page.html", "w").write(html_page) self.serve_data('text/html', html_page) do_status = 'do_GET' Modified: pypy/dist/pypy/translator/js/test/runtest.py ============================================================================== --- pypy/dist/pypy/translator/js/test/runtest.py (original) +++ pypy/dist/pypy/translator/js/test/runtest.py Fri Dec 9 00:01:21 2005 @@ -36,7 +36,7 @@ function_call = "%s(%s)" % (self.js.graph.name, args) if self.js.stackless: - function_call = "slp_entry_point(%s)" % function_call + function_call = "slp_entry_point('%s')" % function_call if use_browsertest: output = jstest(self.js.filename, function_call) Modified: pypy/dist/pypy/translator/js/test/test_stackless.py ============================================================================== --- pypy/dist/pypy/translator/js/test/test_stackless.py (original) +++ pypy/dist/pypy/translator/js/test/test_stackless.py Fri Dec 9 00:01:21 2005 @@ -181,11 +181,19 @@ data = wrap_stackless_function(f) assert int(data.strip()) == 7495 +# XXX +# need test to detect timeout (return=undefined), call slp_main_loop() until no timeout +# and only then check result. 
+ def test_long_running(): + py.test.skip("stackless feature incomplete (no long running processes yet)") + n_iterations = 50000 def g(x): if x > 0: + for q in range(1000): + pass g(x-1) return x @@ -193,4 +201,6 @@ return g(n_iterations) data = wrap_stackless_function(lp) - assert int(data.strip()) == n_iterations + + #note: because long running processes can't return a value like this + assert int(data.strip()) == undefined From ericvrp at codespeak.net Fri Dec 9 00:13:19 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 9 Dec 2005 00:13:19 +0100 (CET) Subject: [pypy-svn] r20912 - pypy/dist/pypy/translator/js Message-ID: <20051208231319.0721427DB9@code1.codespeak.net> Author: ericvrp Date: Fri Dec 9 00:13:18 2005 New Revision: 20912 Modified: pypy/dist/pypy/translator/js/js.py Log: oops forgot to check this in Modified: pypy/dist/pypy/translator/js/js.py ============================================================================== --- pypy/dist/pypy/translator/js/js.py (original) +++ pypy/dist/pypy/translator/js/js.py Fri Dec 9 00:13:18 2005 @@ -55,6 +55,12 @@ codewriter.comment('filename: %s' % self.filename) codewriter.newline() + + src_filename = _path_join(os.path.dirname(__file__), 'src', 'misc.js') + s = open(src_filename).read() + f.write(s) + + codewriter.newline() for node in self.db.getnodes(): node.write_implementation(codewriter) From ericvrp at codespeak.net Fri Dec 9 10:39:23 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 9 Dec 2005 10:39:23 +0100 (CET) Subject: [pypy-svn] r20916 - pypy/dist/pypy/translator/llvm Message-ID: <20051209093923.2E82927DC7@code1.codespeak.net> Author: ericvrp Date: Fri Dec 9 10:39:22 2005 New Revision: 20916 Modified: pypy/dist/pypy/translator/llvm/node.py Log: Probable fix to get pypy-llvm translatable again Modified: pypy/dist/pypy/translator/llvm/node.py ============================================================================== --- 
pypy/dist/pypy/translator/llvm/node.py (original) +++ pypy/dist/pypy/translator/llvm/node.py Fri Dec 9 10:39:22 2005 @@ -7,16 +7,17 @@ def make_name(self, name): " helper for creating names" - if " " in name or "<" in name: - name = '"%s"' % name - if name in self.nodename_count: postfix = '_%d' % self.nodename_count[name] self.nodename_count[name] += 1 else: postfix = '' self.nodename_count[name] = 1 - return name + postfix + + name += postfix + if " " in name or "<" in name: + name = '"%s"' % name + return name def make_ref(self, prefix, name): return self.make_name(prefix + name) From nik at codespeak.net Fri Dec 9 11:00:43 2005 From: nik at codespeak.net (nik at codespeak.net) Date: Fri, 9 Dec 2005 11:00:43 +0100 (CET) Subject: [pypy-svn] r20917 - in pypy/dist/pypy/module/_socket: rpython test Message-ID: <20051209100043.EAB6327DC7@code1.codespeak.net> Author: nik Date: Fri Dec 9 11:00:42 2005 New Revision: 20917 Modified: pypy/dist/pypy/module/_socket/rpython/rsocket.py pypy/dist/pypy/module/_socket/test/test_socket2.py Log: (ale, nik) fixed mystery test failures related to the way we hack around the association of file descriptors and socket objects on untranslated pypy. Modified: pypy/dist/pypy/module/_socket/rpython/rsocket.py ============================================================================== --- pypy/dist/pypy/module/_socket/rpython/rsocket.py (original) +++ pypy/dist/pypy/module/_socket/rpython/rsocket.py Fri Dec 9 11:00:42 2005 @@ -4,7 +4,11 @@ import socket -keep_sockets_alive = {} +# HACK: We have to prevent GC to collect the socket object we create within this +#?module. Because socket.close() is called on GC this can lead to strange +# effects in corner cases where file descriptors are reused. +socket_cache = {} +keep_sockets_alive = [] class ADDRINFO(object): # a simulated addrinfo structure from C, i.e. 
a chained list @@ -30,21 +34,21 @@ def newsocket(family, type, protocol): s = socket.socket(family, type, protocol) - # HACK: We have to prevent GC to collect the socket object because we don't - # want it to be closed. fileno = s.fileno() - keep_sockets_alive[fileno] = s + if socket_cache.has_key(fileno): + keep_sockets_alive.append(socket_cache[fileno]) + socket_cache[fileno] = s return fileno def connect(fd, host, port): # XXX IPv4 only - s = keep_sockets_alive[fd] + s = socket_cache[fd] try: s.connect((host, port)) except Exception, ex: print ex def getpeername(fd): - s = keep_sockets_alive[fd] + s = socket_cache[fd] return s.getpeername() Modified: pypy/dist/pypy/module/_socket/test/test_socket2.py ============================================================================== --- pypy/dist/pypy/module/_socket/test/test_socket2.py (original) +++ pypy/dist/pypy/module/_socket/test/test_socket2.py Fri Dec 9 11:00:42 2005 @@ -294,12 +294,15 @@ def app_test_socket_connect(): import _socket, os s = _socket.socket(_socket.AF_INET, _socket.SOCK_STREAM, 0) + # XXX temporarily we use codespeak to test, will have more robust tests in + # the absence of a network connection later when mroe parts of the socket + # API are implemented. 
s.connect(("codespeak.net", 80)) name = s.getpeername() # Will raise socket.error if not connected assert name[1] == 80 s.close() -def DONOT_app_test_socket_connect_typeerrors(): +def app_test_socket_connect_typeerrors(): tests = [ "", ("80"), From ericvrp at codespeak.net Fri Dec 9 11:10:25 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 9 Dec 2005 11:10:25 +0100 (CET) Subject: [pypy-svn] r20918 - in pypy/dist/pypy/translator/js: src test Message-ID: <20051209101025.6168827DC9@code1.codespeak.net> Author: ericvrp Date: Fri Dec 9 11:10:23 2005 New Revision: 20918 Modified: pypy/dist/pypy/translator/js/src/ll_stackless.js pypy/dist/pypy/translator/js/src/stack.js pypy/dist/pypy/translator/js/test/browsertest.py pypy/dist/pypy/translator/js/test/test_genllvm1.py pypy/dist/pypy/translator/js/test/test_stackless.py Log: Fixes to get things working in Safari (which is the one with the most limited recusion limit (100) Modified: pypy/dist/pypy/translator/js/src/ll_stackless.js ============================================================================== --- pypy/dist/pypy/translator/js/src/ll_stackless.js (original) +++ pypy/dist/pypy/translator/js/src/ll_stackless.js Fri Dec 9 11:10:23 2005 @@ -7,6 +7,7 @@ var slp_timeout = false; var slp_start_time = undefined; var slp_stack_depth = 0; +var slp_max_stack_depth = 75; // XXX make this browser dependent (75:Safari, 750:Firefox/Spidermonkey, more:IE) function function_name(func) { var s = func.toString().split("\n"); @@ -36,7 +37,7 @@ // function ll_stack_too_big() { - var result = slp_stack_depth > 500; // Firefox has a recursion limit of 1000 (others allow more) + var result = slp_stack_depth > slp_max_stack_depth; // Firefox has a recursion limit of 1000 (others allow more) LOG("ll_stack_to_big result=" + result); if (!result && in_browser && false) { @@ -166,6 +167,7 @@ } } log("REALLY FINISHED"); + handle_result(slp_return_value); } // Modified: pypy/dist/pypy/translator/js/src/stack.js 
============================================================================== --- pypy/dist/pypy/translator/js/src/stack.js (original) +++ pypy/dist/pypy/translator/js/src/stack.js Fri Dec 9 11:10:23 2005 @@ -13,3 +13,8 @@ return false; } ll_stack_too_big__ = ll_stack_too_big; + +function ll_stack_unwind() { + throw "Recursion limit exceeded"; +} +ll_stack_unwind__ = ll_stack_unwind; Modified: pypy/dist/pypy/translator/js/test/browsertest.py ============================================================================== --- pypy/dist/pypy/translator/js/test/browsertest.py (original) +++ pypy/dist/pypy/translator/js/test/browsertest.py Fri Dec 9 11:10:23 2005 @@ -30,12 +30,16 @@ } } - if (result != undefined || !in_browser) { // if no timeout (i.e. not long running) - var resultform = document.forms['resultform']; - resultform.result.value = result; - resultform.submit(); + if (result != undefined || !in_browser) { // if valid result (no timeout) + handle_result(result); } }; + +function handle_result(result) { + var resultform = document.forms['resultform']; + resultform.result.value = result; + resultform.submit(); +}; Modified: pypy/dist/pypy/translator/js/test/test_genllvm1.py ============================================================================== --- pypy/dist/pypy/translator/js/test/test_genllvm1.py (original) +++ pypy/dist/pypy/translator/js/test/test_genllvm1.py Fri Dec 9 11:10:23 2005 @@ -28,7 +28,7 @@ def test_ackermann(self): f = compile_function(llvmsnippet.ackermann, [int, int]) - for i in range(7): #>7 js error: too much recursion?!? 
+ for i in range(4): # (otherwise too much recursion) max 4 in Safari, max 7 in Firefox, IE allows more recursion assert f(0, i) == i + 1 assert f(1, i) == i + 2 assert f(2, i) == 2 * i + 3 Modified: pypy/dist/pypy/translator/js/test/test_stackless.py ============================================================================== --- pypy/dist/pypy/translator/js/test/test_stackless.py (original) +++ pypy/dist/pypy/translator/js/test/test_stackless.py Fri Dec 9 11:10:23 2005 @@ -107,7 +107,7 @@ def fn(): return f(0) data = wrap_stackless_function(fn) - assert int(data.strip()) == 494 + assert int(data.strip()) > 50 #conservative estimate because the value is browser dependent def test_stack_unwind(): From cfbolz at codespeak.net Fri Dec 9 11:34:46 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Fri, 9 Dec 2005 11:34:46 +0100 (CET) Subject: [pypy-svn] r20920 - pypy/extradoc/sprintinfo/gothenburg-2005 Message-ID: <20051209103446.AC6FF27DC2@code1.codespeak.net> Author: cfbolz Date: Fri Dec 9 11:34:46 2005 New Revision: 20920 Modified: pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt Log: update Modified: pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt ============================================================================== --- pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt (original) +++ pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt Fri Dec 9 11:34:46 2005 @@ -41,7 +41,7 @@ Expose the low-level switching facilities: - write RPython structures (tasklet, channel) and basic - functions for switching (IN-PROGRESS, Richard wants annotation help) + functions for switching (IN-PROGRESS) - add an app-level interface (mixed module) - implement support structures - a deque module exists already which can be used for channel queues @@ -64,7 +64,7 @@ glue to C libraries, think/design solutions (Johan, Michael) -- support more basic integer types. Decide on the proper (IN-PROGRESS) +- (DONE) support more basic integer types. 
Decide on the proper design (explicit spelling of sizes, or the long-long way?) note that we already have functions which return 64 bit values. @@ -79,7 +79,7 @@ ~~~~~~~~~~~~~~~~~~~~~~~~~~~ - look into the perfomance and code path for function calls - in our intepreter (Arre, Eric) + in our interpreter (Arre, Eric, with help from Richard) - look into converting the indirect call in the eval loop for bytecode dispatch into a switch: probably needs a representation choice in the RTyper, a transformation, and integer exitswitch implementation as switch in the backends @@ -87,16 +87,21 @@ Logic programming, WP9 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -(Ludovic, Adrien) +(Ludovic, Adrien, maybe some help needed) - export the AST nodes hierarchy to application level through the compiler module (IN-PROGRESS) + - all the AST tree exported to applevel + - still need to create new nodes - export the Grammar representation and provide means to (at least) add new rules (long) which involve providing ST->AST transformation functions + - the grammar rules are exported too US travel report, maybe towards WP03/WP07 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +Saturday morning + - telling the story about a commercial travel to the states to optimize some Python application - done using RPython - discussing possible advantages/new goals/extensions to the project From arigo at codespeak.net Fri Dec 9 11:37:00 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 9 Dec 2005 11:37:00 +0100 (CET) Subject: [pypy-svn] r20921 - in pypy/dist/pypy/jit: . test Message-ID: <20051209103700.2D14B27DCB@code1.codespeak.net> Author: arigo Date: Fri Dec 9 11:36:58 2005 New Revision: 20921 Added: pypy/dist/pypy/jit/tlopcode.py - copied unchanged from r20919, pypy/dist/pypy/jit/opcode.py Removed: pypy/dist/pypy/jit/opcode.py Modified: pypy/dist/pypy/jit/test/test_tl.py pypy/dist/pypy/jit/tl.py Log: Renamed opcode.py to tlopcode.py to avoid conficts with the stdlib module. 
Modified: pypy/dist/pypy/jit/test/test_tl.py ============================================================================== --- pypy/dist/pypy/jit/test/test_tl.py (original) +++ pypy/dist/pypy/jit/test/test_tl.py Fri Dec 9 11:36:58 2005 @@ -1,7 +1,7 @@ import py import operator from pypy.jit.tl import interp, compile -from pypy.jit.opcode import * +from pypy.jit.tlopcode import * from pypy.translator.translator import TranslationContext from pypy.annotation import policy Modified: pypy/dist/pypy/jit/tl.py ============================================================================== --- pypy/dist/pypy/jit/tl.py (original) +++ pypy/dist/pypy/jit/tl.py Fri Dec 9 11:36:58 2005 @@ -1,8 +1,8 @@ '''Toy Language''' import py -from opcode import * -import opcode +from tlopcode import * +import tlopcode def char2int(c): t = ord(c) @@ -125,7 +125,7 @@ if t[0].endswith(':'): labels[ t[0][:-1] ] = len(bytecode) continue - bytecode.append( opcode.names[ t[0] ] ) + bytecode.append( tlopcode.names[ t[0] ] ) if len(t) > 1: try: bytecode.append( int(t[1]) ) From mwh at codespeak.net Fri Dec 9 12:00:58 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Fri, 9 Dec 2005 12:00:58 +0100 (CET) Subject: [pypy-svn] r20923 - pypy/dist/pypy/doc Message-ID: <20051209110058.7A75527DCE@code1.codespeak.net> Author: mwh Date: Fri Dec 9 12:00:57 2005 New Revision: 20923 Modified: pypy/dist/pypy/doc/contact.txt Log: mention the PyPy calendar. Modified: pypy/dist/pypy/doc/contact.txt ============================================================================== --- pypy/dist/pypy/doc/contact.txt (original) +++ pypy/dist/pypy/doc/contact.txt Fri Dec 9 12:00:57 2005 @@ -22,7 +22,7 @@ .. _`development bug/feature tracker`: https://codespeak.net/issue/pypy-dev/ -IRC Channel #pypy on freenode +IRC channel #pypy on freenode ============================= Many of the core developers are hanging out at #pypy on irc.freenode.net. @@ -31,3 +31,14 @@ the channel here_. .. 
_here: http://tismerysoft.de/pypy/irc-logs/pypy + +PyPy calendar +============= + +There is a PyPy calendar at http://pypycal.sabi.net/. Currently +Michael Hudson is the only person who can reliably modify the +calendar, so if there's a mistake or something to add please tell him +about it. If you have a webcal client, you could try clicking +`this`_. + +.. _`this`: webcal://pypycal.sabi.net///calendars/PyPy.ics From rxe at codespeak.net Fri Dec 9 12:20:16 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Fri, 9 Dec 2005 12:20:16 +0100 (CET) Subject: [pypy-svn] r20925 - pypy/dist/pypy/translator/llvm Message-ID: <20051209112016.0947527DC5@code1.codespeak.net> Author: rxe Date: Fri Dec 9 12:20:15 2005 New Revision: 20925 Modified: pypy/dist/pypy/translator/llvm/database.py pypy/dist/pypy/translator/llvm/opwriter.py Log: tab pedantries (sorry) Modified: pypy/dist/pypy/translator/llvm/database.py ============================================================================== --- pypy/dist/pypy/translator/llvm/database.py (original) +++ pypy/dist/pypy/translator/llvm/database.py Fri Dec 9 12:20:15 2005 @@ -36,8 +36,8 @@ lltype.Float: "double", lltype.UniChar: "uint", lltype.Void: "void", - lltype.UnsignedLongLong: "ulong", - lltype.SignedLongLong: "long", + lltype.UnsignedLongLong: "ulong", + lltype.SignedLongLong: "long", Address: "sbyte*"} # 32 bit platform Modified: pypy/dist/pypy/translator/llvm/opwriter.py ============================================================================== --- pypy/dist/pypy/translator/llvm/opwriter.py (original) +++ pypy/dist/pypy/translator/llvm/opwriter.py Fri Dec 9 12:20:15 2005 @@ -77,7 +77,7 @@ 'uint_lshift': 'shl', 'uint_rshift': 'shr', - 'llong_lshift': 'shl', + 'llong_lshift': 'shl', 'llong_rshift': 'shr', } From rxe at codespeak.net Fri Dec 9 12:38:41 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Fri, 9 Dec 2005 12:38:41 +0100 (CET) Subject: [pypy-svn] r20928 - pypy/dist/pypy/translator/backendopt 
Message-ID: <20051209113841.A13DE27DDB@code1.codespeak.net> Author: rxe Date: Fri Dec 9 12:38:40 2005 New Revision: 20928 Modified: pypy/dist/pypy/translator/backendopt/inline.py Log: Hint not to inline when we have yield_current_frame_to_caller operations. Modified: pypy/dist/pypy/translator/backendopt/inline.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/inline.py (original) +++ pypy/dist/pypy/translator/backendopt/inline.py Fri Dec 9 12:38:40 2005 @@ -278,6 +278,7 @@ 'cast_pointer': 0, 'keepalive': 0, 'direct_call': 2, # guess + 'yield_current_frame_to_caller': sys.maxint, # XXX bit extreme } def block_weight(block, weights=OP_WEIGHTS): From rxe at codespeak.net Fri Dec 9 12:39:37 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Fri, 9 Dec 2005 12:39:37 +0100 (CET) Subject: [pypy-svn] r20929 - pypy/dist/pypy/translator/c/test Message-ID: <20051209113937.AF1BE27DDC@code1.codespeak.net> Author: rxe Date: Fri Dec 9 12:39:36 2005 New Revision: 20929 Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py Log: More experiments... should probably move this out of here? 
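The inlining hint in r20928 above works by pricing operations: each operation type has a cost, and giving yield_current_frame_to_caller a cost of sys.maxint makes any function containing it effectively too expensive to inline. A rough sketch of that heuristic in isolation (Python 3 here, so sys.maxsize stands in for Python 2's sys.maxint; the function names are illustrative, not the actual backendopt API):

```python
import sys

# Sketch of a weight-based inlining heuristic, assuming operations are
# represented as name strings. An "infinite" weight on one operation
# forbids inlining the whole block that contains it.
OP_WEIGHTS = {
    'same_as': 0,
    'cast_pointer': 0,
    'keepalive': 0,
    'direct_call': 2,                              # guess, as in the original
    'yield_current_frame_to_caller': sys.maxsize,  # XXX bit extreme
}

def block_weight(ops, weights=OP_WEIGHTS, default=1):
    # unknown operations get a small default cost
    return sum(weights.get(op, default) for op in ops)

def should_inline(ops, threshold=10):
    return block_weight(ops) <= threshold

print(should_inline(['same_as', 'direct_call']))         # True
print(should_inline(['yield_current_frame_to_caller']))  # False
```

The sys.maxint/sys.maxsize trick is blunt but robust: no matter how low the inlining threshold is tuned, a frame-switching operation can never sneak past it.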
Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_tasklets.py (original) +++ pypy/dist/pypy/translator/c/test/test_tasklets.py Fri Dec 9 12:39:36 2005 @@ -1,13 +1,35 @@ -from pypy.translator.translator import TranslationContext -from pypy.translator.c.genc import CStandaloneBuilder -from pypy.annotation.model import SomeList, SomeString -from pypy.annotation.listdef import ListDef -from pypy.rpython.rstack import stack_unwind, stack_frames_depth, stack_too_big +import os +from pypy.rpython.memory.lladdress import NULL from pypy.rpython.rstack import yield_current_frame_to_caller -from pypy.translator.backendopt.all import backend_optimizations import os +# ____________________________________________________________ +# For testing + +from pypy.translator.c.gc import BoehmGcPolicy +gcpolicy = None #BoehmGcPolicy +debug_flag = True + +# count of loops in tests (set lower to speed up) +loops = 5000 + +def debug(s): + if debug_flag: + os.write(2, "%s\n" % s) + +class Globals: + def __init__(self): + pass + +globals = Globals() +globals.count = 0 def wrap_stackless_function(fn): + from pypy.translator.translator import TranslationContext + from pypy.translator.c.genc import CStandaloneBuilder + from pypy.annotation.model import SomeList, SomeString + from pypy.annotation.listdef import ListDef + from pypy.translator.backendopt.all import backend_optimizations + def entry_point(argv): os.write(1, str(fn())) return 0 @@ -18,7 +40,7 @@ t.buildannotator().build_types(entry_point, [s_list_of_strings]) t.buildrtyper().specialize() backend_optimizations(t) - cbuilder = CStandaloneBuilder(t, entry_point) + cbuilder = CStandaloneBuilder(t, entry_point, gcpolicy=gcpolicy) cbuilder.stackless = True cbuilder.generate_source() cbuilder.compile() @@ -26,39 +48,79 @@ # ____________________________________________________________ -def debug(s): - 
#os.write(1, "%s\n" % s) - pass - -class Tasklet(object): - - def __init__(self, name, fn): +class Resumable(object): + def __init__(self, fn): self.fn = fn - self.name = name self.alive = False + # propogates round suspend-resume to tell scheduler in run() + # XXX too late to think this thru + self.remove = False + def start(self): - debug("starting %s" % self.name) self.caller = yield_current_frame_to_caller() - - debug("entering %s" % self.name) self.fn(self.name) - debug("leaving %s" % self.name) return self.caller - def setalive(self, resumable): - self.alive = True + def set_resumable(self, resumable): self.resumable = resumable - def schedule(self): - debug("scheduling %s" % self.name) + def suspend(self, remove): self.caller = self.caller.switch() - + self.remove = remove + def resume(self): - debug("resuming %s" % self.name) + debug("resuming %s" % self.name) self.resumable = self.resumable.switch() self.alive = self.resumable is not None + # not sure what to do with alive yetXXX + #XXX arggh - why NOT?? + #if not alive: + # self.caller = # None / NULL + return self.alive and not self.remove + +class Tasklet(Resumable): + def __init__(self, name, fn): + Resumable.__init__(self, fn) + self.name = name + self.blocked = False + +class Channel: + def __init__(self): + self.balance = 0 + self.queue = [] + + def send(self, value): + self.balance += 1 + if self.balance < 0: + t = self.queue.pop(0) + t.data = value + t.blocked = 0 + else: + t = getcurrent() + # Remove the tasklet from the list of running tasklets. 
+ #XXX dont need this - t.remove() + + # let it wait for a receiver to come along + self.queue.append((t, value)) + t.blocked = 1 + scheduler.schedule() + + def receive(self): + self.balance -= 1 + # good to go + if self.balance > 0: + t, value = self.queue.pop(0) + t.blocked = 0 + scheduler.add_tasklet(t) + return value + + # block until ready + t = getcurrent() + self.queue.append(t) + t.blocked = -1 + scheduler.schedule() class Scheduler(object): def __init__(self): @@ -69,55 +131,89 @@ self.runnables.append(tasklet) def run(self): - debug("running: length of runnables %s" % len(self.runnables)) + debug("len1 %s" % len(self.runnables)) while self.runnables: - t = self.runnables.pop(0) - debug("resuming %s(%s)" % (t.name, t.alive)) - self.current_tasklet = t - t.resume() - self.current_tasklet = None - if t.alive: - self.runnables.append(t) - - debug("ran") + runnables = self.runnables + debug("len2 %s" % len(runnables)) + self.runnables = [] + for t in runnables: + assert self.current_tasklet is None + self.current_tasklet = t + if t.resume(): + self.runnables.append(self.current_tasklet) + self.current_tasklet = None + + def schedule(self, remove=False): + assert self.current_tasklet is not None + self.current_tasklet.suspend(remove) + +# ____________________________________________________________ scheduler = Scheduler() def start_tasklet(tasklet): res = tasklet.start() - tasklet.setalive(res) + tasklet.set_resumable(res) scheduler.add_tasklet(tasklet) +def schedule(): + scheduler.schedule() + +def schedule_remove(): + scheduler.schedule(remove=True) + def run(): scheduler.run() -def schedule(): - assert scheduler.current_tasklet - scheduler.current_tasklet.schedule() +def getcurrent(): + return scheduler.current_tasklet + +# ____________________________________________________________ def test_simple(): - class Counter: - def __init__(self): - self.count = 0 + + def simple(name): + for ii in range(5): + globals.count += 1 + schedule() - def increment(self): - 
self.count += 1 + def f(): + for ii in range(loops): + start_tasklet(Tasklet("T%s" % ii, simple)) + run() + return globals.count == loops * 5 - def get_count(self): - return self.count + res = wrap_stackless_function(f) + assert res == '1' - c = Counter() +def test_multiple_simple(): def simple(name): for ii in range(5): - debug("xxx %s %s" % (name, ii)) - c.increment() + globals.count += 1 schedule() - def f(): + def simple2(name): for ii in range(5): - start_tasklet(Tasklet("T%s" % ii, simple)) + globals.count += 1 + schedule() + globals.count += 1 + + def simple3(name): + schedule() + for ii in range(10): + globals.count += 1 + if ii % 2: + schedule() + schedule() + + def f(): + globals.count = 0 + for ii in range(loops): + start_tasklet(Tasklet("T1%s" % ii, simple)) + start_tasklet(Tasklet("T2%s" % ii, simple2)) + start_tasklet(Tasklet("T3%s" % ii, simple3)) run() - return c.get_count() == 25 + return globals.count == loops * 25 res = wrap_stackless_function(f) assert res == '1' From rxe at codespeak.net Fri Dec 9 12:56:26 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Fri, 9 Dec 2005 12:56:26 +0100 (CET) Subject: [pypy-svn] r20930 - pypy/dist/pypy/translator/c/test Message-ID: <20051209115626.0473727DDB@code1.codespeak.net> Author: rxe Date: Fri Dec 9 12:56:25 2005 New Revision: 20930 Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py Log: Resumable included too much - thanks Chris. 
(this is all just experimental, wouldnt read too closely) Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_tasklets.py (original) +++ pypy/dist/pypy/translator/c/test/test_tasklets.py Fri Dec 9 12:56:25 2005 @@ -52,10 +52,6 @@ def __init__(self, fn): self.fn = fn self.alive = False - - # propogates round suspend-resume to tell scheduler in run() - # XXX too late to think this thru - self.remove = False def start(self): self.caller = yield_current_frame_to_caller() @@ -65,20 +61,12 @@ def set_resumable(self, resumable): self.resumable = resumable - def suspend(self, remove): + def suspend(self): self.caller = self.caller.switch() - self.remove = remove def resume(self): - debug("resuming %s" % self.name) self.resumable = self.resumable.switch() self.alive = self.resumable is not None - # not sure what to do with alive yetXXX - - #XXX arggh - why NOT?? - #if not alive: - # self.caller = # None / NULL - return self.alive and not self.remove class Tasklet(Resumable): def __init__(self, name, fn): @@ -86,6 +74,23 @@ self.name = name self.blocked = False + # propogates round suspend-resume to tell scheduler in run() + # XXX too late to think this thru + self.remove = False + + def suspend_and_remove(self, remove): + self.suspend() + self.remove = remove + + def resume(self): + Resumable.resume(self) + # not sure what to do with alive yetXXX + + #XXX arggh - why NOT?? 
+ #if not alive: + # self.caller = # None / NULL + return self.alive and not self.remove + class Channel: def __init__(self): self.balance = 0 @@ -145,7 +150,7 @@ def schedule(self, remove=False): assert self.current_tasklet is not None - self.current_tasklet.suspend(remove) + self.current_tasklet.suspend_and_remove(remove) # ____________________________________________________________ From rxe at codespeak.net Fri Dec 9 13:30:20 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Fri, 9 Dec 2005 13:30:20 +0100 (CET) Subject: [pypy-svn] r20933 - pypy/dist/pypy/translator/c/test Message-ID: <20051209123020.7C24F27DE5@code1.codespeak.net> Author: rxe Date: Fri Dec 9 13:30:19 2005 New Revision: 20933 Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py Log: Make schedule_remove() work. Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_tasklets.py (original) +++ pypy/dist/pypy/translator/c/test/test_tasklets.py Fri Dec 9 13:30:19 2005 @@ -6,11 +6,11 @@ # For testing from pypy.translator.c.gc import BoehmGcPolicy -gcpolicy = None #BoehmGcPolicy +gcpolicy = BoehmGcPolicy debug_flag = True # count of loops in tests (set lower to speed up) -loops = 5000 +loops = 1000 def debug(s): if debug_flag: @@ -79,11 +79,13 @@ self.remove = False def suspend_and_remove(self, remove): - self.suspend() self.remove = remove + self.suspend() def resume(self): + assert not self.remove Resumable.resume(self) + # not sure what to do with alive yetXXX #XXX arggh - why NOT?? 
@@ -136,10 +138,8 @@ self.runnables.append(tasklet) def run(self): - debug("len1 %s" % len(self.runnables)) while self.runnables: runnables = self.runnables - debug("len2 %s" % len(runnables)) self.runnables = [] for t in runnables: assert self.current_tasklet is None @@ -212,7 +212,6 @@ schedule() def f(): - globals.count = 0 for ii in range(loops): start_tasklet(Tasklet("T1%s" % ii, simple)) start_tasklet(Tasklet("T2%s" % ii, simple2)) @@ -222,3 +221,25 @@ res = wrap_stackless_function(f) assert res == '1' + +def test_schedule_remove(): + + def simple(name): + for ii in range(20): + if ii < 10: + schedule() + else: + schedule_remove() + globals.count += 1 + + def f(): + for ii in range(loops): + start_tasklet(Tasklet("T%s" % ii, simple)) + run() + for ii in range(loops): + start_tasklet(Tasklet("T%s" % ii, simple)) + run() + return globals.count == loops * 10 * 2 + + res = wrap_stackless_function(f) + assert res == '1' From rxe at codespeak.net Fri Dec 9 14:59:09 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Fri, 9 Dec 2005 14:59:09 +0100 (CET) Subject: [pypy-svn] r20936 - pypy/dist/pypy/translator/c/test Message-ID: <20051209135909.F217427DF0@code1.codespeak.net> Author: rxe Date: Fri Dec 9 14:59:09 2005 New Revision: 20936 Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py Log: Add start_tasklet_now() and test. 
Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_tasklets.py (original) +++ pypy/dist/pypy/translator/c/test/test_tasklets.py Fri Dec 9 14:59:09 2005 @@ -10,7 +10,7 @@ debug_flag = True # count of loops in tests (set lower to speed up) -loops = 1000 +loops = 10 def debug(s): if debug_flag: @@ -95,24 +95,27 @@ class Channel: def __init__(self): - self.balance = 0 self.queue = [] + self.balance = 0 def send(self, value): self.balance += 1 - if self.balance < 0: + if self.balance <= 0: t = self.queue.pop(0) t.data = value t.blocked = 0 + + # XXX Wrong - should run immediately + scheduler.add_tasklet(t) + scheduler.schedule() + else: t = getcurrent() - # Remove the tasklet from the list of running tasklets. - #XXX dont need this - t.remove() - # let it wait for a receiver to come along self.queue.append((t, value)) t.blocked = 1 - scheduler.schedule() + scheduler.schedule_remove() + def receive(self): self.balance -= 1 @@ -127,27 +130,39 @@ t = getcurrent() self.queue.append(t) t.blocked = -1 - scheduler.schedule() + scheduler.schedule_remove() class Scheduler(object): def __init__(self): self.runnables = [] self.current_tasklet = None + self.immediately_schedule = None def add_tasklet(self, tasklet): self.runnables.append(tasklet) + def run_immediately(self, tasklet): + self.immediately_schedule = tasklet + def run(self): while self.runnables: runnables = self.runnables self.runnables = [] + count = 0 for t in runnables: assert self.current_tasklet is None self.current_tasklet = t if t.resume(): self.runnables.append(self.current_tasklet) self.current_tasklet = None + count += 1 + if self.immediately_schedule: + self.runnables = [self.immediately_schedule] \ + + runnables[count:] + self.runnables + self.immediately_schedule = None + break + def schedule(self, remove=False): assert self.current_tasklet is not None 
self.current_tasklet.suspend_and_remove(remove) @@ -160,6 +175,11 @@ tasklet.set_resumable(res) scheduler.add_tasklet(tasklet) +def start_tasklet_now(tasklet): + res = tasklet.start() + tasklet.set_resumable(res) + scheduler.run_immediately(tasklet) + def schedule(): scheduler.schedule() @@ -243,3 +263,36 @@ res = wrap_stackless_function(f) assert res == '1' + +def test_run_immediately(): + globals.intermediate = 0 + globals.count = 0 + def simple(name): + for ii in range(20): + globals.count += 1 + schedule() + + def run_immediately(name): + globals.intermediate = globals.count + schedule() + + def simple2(name): + for ii in range(20): + globals.count += 1 + if ii == 10: + start_tasklet_now(Tasklet("intermediate", run_immediately)) + schedule() + + def f(): + start_tasklet(Tasklet("simple2", simple2)) + for ii in range(loops): + start_tasklet(Tasklet("T%s" % ii, simple)) + run() + total_expected = (loops + 1) * 20 + return (globals.intermediate == total_expected / 2 + 1 and + globals.count == total_expected) + + res = wrap_stackless_function(f) + assert res == '1' + + From cfbolz at codespeak.net Fri Dec 9 15:23:05 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Fri, 9 Dec 2005 15:23:05 +0100 (CET) Subject: [pypy-svn] r20937 - in pypy/dist/pypy: annotation translator/test Message-ID: <20051209142305.DA9B527DCE@code1.codespeak.net> Author: cfbolz Date: Fri Dec 9 15:23:04 2005 New Revision: 20937 Modified: pypy/dist/pypy/annotation/description.py pypy/dist/pypy/translator/test/test_annrpython.py Log: (johahn, cfbolz): make sure that __del__ methods are annotated if found. 
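The __del__ annotation fix logged above matters because a destructor has no explicit call site: the runtime invokes it when an object dies, so a whole-program analysis has to go looking for it rather than discovering it through the call graph. Plain CPython shows the implicit call (the gc.collect() is defensive, for interpreters without immediate refcounting):

```python
import gc

# __del__ runs with no call site in user code: the interpreter invokes it
# when the object becomes unreachable.
log = []

class A(object):
    def __init__(self):
        self.a = 2
    def __del__(self):
        log.append(self.a)  # invoked implicitly at collection time

def f():
    return A().a  # the temporary A() dies as soon as .a is read

result = f()
gc.collect()           # force collection on non-refcounted interpreters
print(result, log)     # 2 [2]
```

An annotator that only followed explicit calls would never reach A.__del__, which is precisely why the commit hooks destructor lookup into class-definition setup.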
Modified: pypy/dist/pypy/annotation/description.py ============================================================================== --- pypy/dist/pypy/annotation/description.py (original) +++ pypy/dist/pypy/annotation/description.py Fri Dec 9 15:23:04 2005 @@ -387,6 +387,13 @@ for attr in self.classdict: classsources[attr] = self # comes from this ClassDesc classdef.setup(classsources) + # look for a __del__ method and annotate it if it's there + if '__del__' in self.classdict: + from pypy.annotation.model import s_None, SomeInstance + s_func = self.s_read_attribute('__del__') + args_s = [SomeInstance(classdef)] + s = self.bookkeeper.emulate_pbc_call(None, s_func, args_s) + assert s_None.contains(s) return classdef def getuniqueclassdef(self): Modified: pypy/dist/pypy/translator/test/test_annrpython.py ============================================================================== --- pypy/dist/pypy/translator/test/test_annrpython.py (original) +++ pypy/dist/pypy/translator/test/test_annrpython.py Fri Dec 9 15:23:04 2005 @@ -1884,6 +1884,39 @@ s = a.build_types(f, []) assert s.knowntype == int + def test_annotate__del__(self): + class A(object): + def __init__(self): + self.a = 2 + def __del__(self): + self.a = 1 + def f(): + return A().a + a = self.RPythonAnnotator() + t = a.translator + s = a.build_types(f, []) + assert s.knowntype == int + graph = tgraphof(t, A.__del__.im_func) + assert graph.startblock in a.annotated + + def test_annotate__del__baseclass(self): + class A(object): + def __init__(self): + self.a = 2 + def __del__(self): + self.a = 1 + class B(A): + def __init__(self): + self.a = 3 + def f(): + return B().a + a = self.RPythonAnnotator() + t = a.translator + s = a.build_types(f, []) + assert s.knowntype == int + graph = tgraphof(t, A.__del__.im_func) + assert graph.startblock in a.annotated + def g(n): return [0,1,2,n] From nik at codespeak.net Fri Dec 9 15:32:16 2005 From: nik at codespeak.net (nik at codespeak.net) Date: Fri, 9 Dec 2005 15:32:16 
+0100 (CET) Subject: [pypy-svn] r20938 - in pypy/dist/pypy: module/_socket/rpython module/_socket/test translator/c translator/c/src translator/c/test Message-ID: <20051209143216.4924027DDF@code1.codespeak.net> Author: nik Date: Fri Dec 9 15:32:12 2005 New Revision: 20938 Modified: pypy/dist/pypy/module/_socket/rpython/exttable.py pypy/dist/pypy/module/_socket/rpython/ll__socket.py pypy/dist/pypy/module/_socket/rpython/rsocket.py pypy/dist/pypy/module/_socket/test/test_socket2.py pypy/dist/pypy/translator/c/extfunc.py pypy/dist/pypy/translator/c/src/ll__socket.h pypy/dist/pypy/translator/c/test/test_ext__socket.py Log: (ale, nik) low-level implementation of socket.connect() and socket.getpeername(). intermediate checkin for changing machines. Modified: pypy/dist/pypy/module/_socket/rpython/exttable.py ============================================================================== --- pypy/dist/pypy/module/_socket/rpython/exttable.py (original) +++ pypy/dist/pypy/module/_socket/rpython/exttable.py Fri Dec 9 15:32:12 2005 @@ -5,7 +5,7 @@ import _socket from pypy.module._socket.rpython import rsocket from pypy.rpython.extfunctable import declare, declaretype, declareptrtype -from pypy.rpython.extfunctable import standardexceptions +from pypy.rpython.extfunctable import standardexceptions, noneannotation from pypy.annotation.model import SomeTuple, SomeInteger, SomeString from pypy.annotation import classdef @@ -28,6 +28,9 @@ ]) return addrinfo +def ann_sockname(*args): + return SomeTuple([SomeString(), SomeInteger(), SomeInteger(), SomeInteger()]) + declare(_socket.gethostname, str, '%s/gethostname' % module) declare(_socket.gethostbyname, str, '%s/gethostbyname' % module) @@ -42,6 +45,9 @@ declare(_socket.htonl, int, '%s/htonl' % module) declare(rsocket.newsocket, int, '%s/newsocket' % module) +declare(rsocket.connect, noneannotation, '%s/connect' % module) +declare(rsocket.getpeername, ann_sockname, '%s/getpeername' % module) +declare(rsocket.freesockname, 
noneannotation, '%s/freesockname' % module) # ____________________________________________________________ # _socket.error can be raised by the above Modified: pypy/dist/pypy/module/_socket/rpython/ll__socket.py ============================================================================== --- pypy/dist/pypy/module/_socket/rpython/ll__socket.py (original) +++ pypy/dist/pypy/module/_socket/rpython/ll__socket.py Fri Dec 9 15:32:12 2005 @@ -34,6 +34,13 @@ ('item7', Signed), ) +SOCKNAME = GcStruct('tuple4', + ('item0', Ptr(STR)), + ('item1', Signed), + ('item2', Signed), + ('item3', Signed), + ) + def ll__socket_addrinfo(family, socktype, proto, canonname, ipaddr, port, flowinfo, scopeid): tup = malloc(ADDRINFO_RESULT) @@ -47,6 +54,14 @@ tup.item7 = scopeid # ipV6 return tup +def ll__socket_sockname(host, port, flowinfo, scopeid): + tup = malloc(SOCKNAME) + tup.item0 = host + tup.item1 = port + tup.item2 = flowinfo # ipV6 + tup.item3 = scopeid # ipV6 + return tup + def ll__socket_nextaddrinfo(opaqueaddr): addr = from_opaque_object(opaqueaddr) return addr.nextinfo() @@ -73,9 +88,21 @@ return _socket.ntohl(htonl) ll__socket_ntohl.suggested_primitive = True +# Can't actually create socket objects in these ll helpers because they are +# turned into flowgraphs at some point, and that fails on SocketType.__init__. 
+ def ll__socket_newsocket(family, type, protocol): -# from pypy.module._socket.rpython import rsocket -# return rsocket.newsocket(family, type, protocol).fileno() return 0 ll__socket_newsocket.suggested_primitive = True +def ll__socket_connect(fd, host, port): + return None +ll__socket_connect.suggested_primitive = True + +def ll__socket_getpeername(fd): + return ("", 0, 0, 0) +ll__socket_getpeername.suggested_primitive = True + +def ll__socket_freesockname(sockname): + return None +ll__socket_freesockname.suggested_primitive = True Modified: pypy/dist/pypy/module/_socket/rpython/rsocket.py ============================================================================== --- pypy/dist/pypy/module/_socket/rpython/rsocket.py (original) +++ pypy/dist/pypy/module/_socket/rpython/rsocket.py Fri Dec 9 15:32:12 2005 @@ -52,3 +52,5 @@ s = socket_cache[fd] return s.getpeername() +def freesockname(sockname): + pass Modified: pypy/dist/pypy/module/_socket/test/test_socket2.py ============================================================================== --- pypy/dist/pypy/module/_socket/test/test_socket2.py (original) +++ pypy/dist/pypy/module/_socket/test/test_socket2.py Fri Dec 9 15:32:12 2005 @@ -315,6 +315,8 @@ raises(TypeError, s.connect, args) s.close() +# XXX also need tests for other connection and timeout errors + class AppTestSocket: def setup_class(cls): Modified: pypy/dist/pypy/translator/c/extfunc.py ============================================================================== --- pypy/dist/pypy/translator/c/extfunc.py (original) +++ pypy/dist/pypy/translator/c/extfunc.py Fri Dec 9 15:32:12 2005 @@ -67,6 +67,9 @@ ll__socket.ll__socket_htonl: 'LL__socket_htonl', ll__socket.ll__socket_ntohl: 'LL__socket_htonl', ll__socket.ll__socket_newsocket: 'LL__socket_newsocket', + ll__socket.ll__socket_connect: 'LL__socket_connect', + ll__socket.ll__socket_getpeername: 'LL__socket_getpeername', + ll__socket.ll__socket_freesockname: 'LL__socket_freesockname', } 
#______________________________________________________ @@ -91,6 +94,7 @@ yield ('RPyMODF_RESULT', ll_math.MODF_RESULT) yield ('RPySTAT_RESULT', ll_os.STAT_RESULT) yield ('RPySOCKET_ADDRINFO', ll__socket.ADDRINFO_RESULT) + yield ('RPySOCKET_SOCKNAME', ll__socket.SOCKNAME) def predeclare_utility_functions(db, rtyper): # Common utility functions @@ -132,6 +136,8 @@ args = [lltype.Signed, lltype.Signed, lltype.Signed, lltype.Ptr(STR), lltype.Ptr(STR), lltype.Signed, lltype.Signed, lltype.Signed] yield annotate(ll__socket.ll__socket_addrinfo, *args) + args = [lltype.Ptr(STR), lltype.Signed, lltype.Signed, lltype.Signed] + yield annotate(ll__socket.ll__socket_sockname, *args) def predeclare_extfuncs(db, rtyper): modules = {} Modified: pypy/dist/pypy/translator/c/src/ll__socket.h ============================================================================== --- pypy/dist/pypy/translator/c/src/ll__socket.h (original) +++ pypy/dist/pypy/translator/c/src/ll__socket.h Fri Dec 9 15:32:12 2005 @@ -77,6 +77,8 @@ return htonl(ntohl); } +// XXX Check what should be done threading-wise around blocking system calls + int LL__socket_newsocket(int family, int type, int protocol) { int fd; @@ -95,6 +97,43 @@ RPYTHON_RAISE_OSERROR(errno); } } + +void LL__socket_connect(int fd, RPyString *host, int port) +{ + struct sockaddr_in addr; + + addr.sin_family = AF_INET; + addr.sin_port = htons((short)port); + if (setipaddr(RPyString_AsString(host), (struct sockaddr *) &addr, + sizeof(addr), AF_INET) < 0) { + // XXX raise some error here + } + if (connect(fd, &addr, sizeof(addr)) < 0) { + // XXX raise some error here + } +} + +RPySOCKET_SOCKNAME *LL__socket_getpeername(int fd) +{ + struct sockaddr_in addr; // XXX IPv4 only + int addr_len; + RPySOCKET_SOCKNAME* sockname; + RPyString* host; + + memset((void *) &addr, '\0', sizeof(addr)); + if (getpeername(fd, (struct sockaddr *) &addr, &addr_len) < 0) { + // XXX raise some error + } + + host = RPyString_FromString(inet_ntoa(addr.sin_addr)); + 
return ll__socket_sockname(host, addr.sin_port, 0, 0); +} + +void LL__socket_freesockname(RPySOCKET_SOCKNAME *sockname) +{ + free(sockname); +} + /* ____________________________________________________________________________ */ /* Lock to allow python interpreter to continue, but only allow one Modified: pypy/dist/pypy/translator/c/test/test_ext__socket.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_ext__socket.py (original) +++ pypy/dist/pypy/translator/c/test/test_ext__socket.py Fri Dec 9 15:32:12 2005 @@ -89,3 +89,16 @@ f1 = compile(does_stuff, [int, int, int]) for args in tests: py.test.raises(OSError, f1, *args) + +def test_connect(): + from pypy.module._socket.rpython import rsocket + def does_stuff(): + fd = rsocket.newsocket(_socket.AF_INET, _socket.SOCK_STREAM, 0) + # XXX need to think of a test without connecting to outside servers + rsocket.connect(fd, "codespeak.net", 80) + sockname = rsocket.getpeername(fd) + port = sockname[1] + rsocket.freesockname(sockname) + return port + f1 = compile(does_stuff, []) + assert f1() == 80 From rxe at codespeak.net Fri Dec 9 15:35:05 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Fri, 9 Dec 2005 15:35:05 +0100 (CET) Subject: [pypy-svn] r20939 - pypy/dist/pypy/translator/c/test Message-ID: <20051209143505.D8FEF27DDF@code1.codespeak.net> Author: rxe Date: Fri Dec 9 15:35:00 2005 New Revision: 20939 Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py Log: Hacking in progress... 
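The connect()/getpeername() pair that the r20938 socket glue above wraps can be exercised without touching outside servers (the concern raised in that commit's XXX comment) by pairing two sockets over loopback. A self-contained sketch using the ordinary Python socket module, not the RPython glue itself:

```python
import socket

# Loopback-only illustration of connect()/getpeername(): after connect(),
# getpeername() reports the remote (host, port) of the connection.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))
peer_host, peer_port = client.getpeername()
print(peer_host, peer_port == port)  # 127.0.0.1 True

client.close()
server.close()
```

Binding to port 0 and reading the assigned port back with getsockname() is the standard way to make such a test independent of any fixed server, avoiding the hard-coded codespeak.net:80 dependency in test_connect() above.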
Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_tasklets.py (original) +++ pypy/dist/pypy/translator/c/test/test_tasklets.py Fri Dec 9 15:35:00 2005 @@ -1,7 +1,8 @@ import os + from pypy.rpython.memory.lladdress import NULL from pypy.rpython.rstack import yield_current_frame_to_caller -import os + # ____________________________________________________________ # For testing @@ -73,6 +74,7 @@ Resumable.__init__(self, fn) self.name = name self.blocked = False + self.data = -1 # propogates round suspend-resume to tell scheduler in run() # XXX too late to think this thru @@ -104,34 +106,32 @@ t = self.queue.pop(0) t.data = value t.blocked = 0 - - # XXX Wrong - should run immediately - scheduler.add_tasklet(t) + scheduler.run_immediately(tasklet) scheduler.schedule() else: t = getcurrent() + assert isinstance(t, Tasklet) # let it wait for a receiver to come along - self.queue.append((t, value)) + self.queue.append(t) t.blocked = 1 scheduler.schedule_remove() - def receive(self): self.balance -= 1 # good to go - if self.balance > 0: - t, value = self.queue.pop(0) + if self.balance >= 0: + t = self.queue.pop(0) t.blocked = 0 scheduler.add_tasklet(t) - return value - - # block until ready - t = getcurrent() - self.queue.append(t) - t.blocked = -1 - scheduler.schedule_remove() + else: + # block until ready + t = getcurrent() + self.queue.append(t) + t.blocked = -1 + scheduler.schedule_remove() + class Scheduler(object): def __init__(self): self.runnables = [] @@ -295,4 +295,25 @@ res = wrap_stackless_function(f) assert res == '1' +def test_channels(): + ch = Channel() + def f1(name): + for ii in range(5): + ch.send(ii) + debug("done sending") + + def f2(name): + while True: + ch.receive() + debug("received") + + def f(): + start_tasklet(Tasklet("f2", f2)) + start_tasklet(Tasklet("f1", f1)) + run() + + return 0 + + res = 
wrap_stackless_function(f) + assert res == '1' From tismer at codespeak.net Fri Dec 9 15:47:30 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Fri, 9 Dec 2005 15:47:30 +0100 (CET) Subject: [pypy-svn] r20940 - pypy/dist/pypy/translator/c/test Message-ID: <20051209144730.5664927DE7@code1.codespeak.net> Author: tismer Date: Fri Dec 9 15:47:29 2005 New Revision: 20940 Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py Log: temporary check-in. Problems were non-existing methods which were globals. New problem: t.clocked cannot be set. I commented this out and it works partially. My guess is that it is a problem to create the scheduler instance statically: we get wrong annotations on this global thing. Proposal: build the scheduler explicitly at run-time. Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_tasklets.py (original) +++ pypy/dist/pypy/translator/c/test/test_tasklets.py Fri Dec 9 15:47:29 2005 @@ -73,7 +73,7 @@ def __init__(self, name, fn): Resumable.__init__(self, fn) self.name = name - self.blocked = False + self.blocked = 0 self.data = -1 # propogates round suspend-resume to tell scheduler in run() @@ -105,7 +105,7 @@ if self.balance <= 0: t = self.queue.pop(0) t.data = value - t.blocked = 0 + ##!!t.blocked = 0 scheduler.run_immediately(tasklet) scheduler.schedule() @@ -114,23 +114,23 @@ assert isinstance(t, Tasklet) # let it wait for a receiver to come along self.queue.append(t) - t.blocked = 1 - scheduler.schedule_remove() + ##!!t.blocked = 1 + schedule_remove() def receive(self): self.balance -= 1 # good to go if self.balance >= 0: t = self.queue.pop(0) - t.blocked = 0 + ##!!t.blocked = 0 scheduler.add_tasklet(t) else: # block until ready t = getcurrent() self.queue.append(t) - t.blocked = -1 - scheduler.schedule_remove() + ##!!t.blocked = -1 + schedule_remove() class Scheduler(object): def 
__init__(self): From cfbolz at codespeak.net Fri Dec 9 16:07:06 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Fri, 9 Dec 2005 16:07:06 +0100 (CET) Subject: [pypy-svn] r20941 - in pypy/dist/pypy/rpython: lltypesystem test Message-ID: <20051209150706.8F4C627DDF@code1.codespeak.net> Author: cfbolz Date: Fri Dec 9 16:07:05 2005 New Revision: 20941 Modified: pypy/dist/pypy/rpython/lltypesystem/lltype.py pypy/dist/pypy/rpython/test/test_lltype.py Log: (johahn, cfbolz): make it possible to attach custom destructor function pointers to GcStructs. Modified: pypy/dist/pypy/rpython/lltypesystem/lltype.py ============================================================================== --- pypy/dist/pypy/rpython/lltypesystem/lltype.py (original) +++ pypy/dist/pypy/rpython/lltypesystem/lltype.py Fri Dec 9 16:07:05 2005 @@ -250,7 +250,7 @@ class GcStruct(Struct): _runtime_type_info = None - def _attach_runtime_type_info_funcptr(self, funcptr): + def _attach_runtime_type_info_funcptr(self, funcptr, destrptr): if self._runtime_type_info is None: self._runtime_type_info = opaqueptr(RuntimeTypeInfo, name=self._name, about=self)._obj if funcptr is not None: @@ -263,6 +263,17 @@ raise TypeError("expected a runtime type info function " "implementation, got: %s" % funcptr) self._runtime_type_info.query_funcptr = funcptr + if destrptr is not None : + T = typeOf(destrptr) + if (not isinstance(T, Ptr) or + not isinstance(T.TO, FuncType) or + len(T.TO.ARGS) != 1 or + T.TO.RESULT != Void or + castable(T.TO.ARGS[0], Ptr(self)) < 0): + raise TypeError("expected a destructor function " + "implementation, got: %s" % destrptr) + self._runtime_type_info.destructor_funcptr = destrptr + class Array(ContainerType): __name__ = 'array' @@ -969,10 +980,10 @@ return id(obj) -def attachRuntimeTypeInfo(GCSTRUCT, funcptr=None): +def attachRuntimeTypeInfo(GCSTRUCT, funcptr=None, destrptr=None): if not isinstance(GCSTRUCT, GcStruct): raise TypeError, "expected a GcStruct: %s" % GCSTRUCT - 
GCSTRUCT._attach_runtime_type_info_funcptr(funcptr) + GCSTRUCT._attach_runtime_type_info_funcptr(funcptr, destrptr) return _ptr(Ptr(RuntimeTypeInfo), GCSTRUCT._runtime_type_info) def getRuntimeTypeInfo(GCSTRUCT): Modified: pypy/dist/pypy/rpython/test/test_lltype.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_lltype.py (original) +++ pypy/dist/pypy/rpython/test/test_lltype.py Fri Dec 9 16:07:05 2005 @@ -361,6 +361,27 @@ assert getRuntimeTypeInfo(Sbis) != pinf0 assert Sbis != S # the attached runtime type info distinguishes them +def test_getRuntimeTypeInfo_destrpointer(): + S = GcStruct('s', ('x', Signed)) + def f(s): + s.x = 1 + def type_info_S(p): + return getRuntimeTypeInfo(S) + qp = functionptr(FuncType([Ptr(S)], Ptr(RuntimeTypeInfo)), + "type_info_S", + _callable=type_info_S) + dp = functionptr(FuncType([Ptr(S)], Void), + "destructor_funcptr", + _callable=f) + pinf0 = attachRuntimeTypeInfo(S, qp, destrptr=dp) + assert pinf0._obj.about == S + pinf = getRuntimeTypeInfo(S) + assert pinf == pinf0 + pinf1 = getRuntimeTypeInfo(S) + assert pinf == pinf1 + assert pinf._obj.destructor_funcptr == dp + assert pinf._obj.query_funcptr == qp + def test_runtime_type_info(): S = GcStruct('s', ('x', Signed)) attachRuntimeTypeInfo(S) From ludal at codespeak.net Fri Dec 9 16:08:00 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Fri, 9 Dec 2005 16:08:00 +0100 (CET) Subject: [pypy-svn] r20942 - in pypy/dist/pypy: annotation interpreter/pyparser module/_socket/rpython module/recparser module/recparser/test rpython translator/goal Message-ID: <20051209150800.8A85827DE1@code1.codespeak.net> Author: ludal Date: Fri Dec 9 16:07:56 2005 New Revision: 20942 Added: pypy/dist/pypy/interpreter/pyparser/ebnfgrammar.py pypy/dist/pypy/translator/goal/targetebnflexer.py Modified: pypy/dist/pypy/annotation/bookkeeper.py pypy/dist/pypy/annotation/builtin.py pypy/dist/pypy/interpreter/pyparser/ebnflexer.py 
pypy/dist/pypy/interpreter/pyparser/ebnfparse.py pypy/dist/pypy/interpreter/pyparser/grammar.py pypy/dist/pypy/interpreter/pyparser/pysymbol.py pypy/dist/pypy/interpreter/pyparser/syntaxtree.py pypy/dist/pypy/module/_socket/rpython/rsocket.py pypy/dist/pypy/module/recparser/pyparser.py pypy/dist/pypy/module/recparser/test/test_compilehooks.py pypy/dist/pypy/rpython/rbuiltin.py Log: first steps into making the ebnf parser translatable - the lexer translates - some cleanup/reorg for the next part moved Typedefs for grammar object into the recparser module allows object.__init__ to be ignored by the annotator/rtyper Modified: pypy/dist/pypy/annotation/bookkeeper.py ============================================================================== --- pypy/dist/pypy/annotation/bookkeeper.py (original) +++ pypy/dist/pypy/annotation/bookkeeper.py Fri Dec 9 16:07:56 2005 @@ -348,7 +348,8 @@ result.dictdef.generalize_key(self.immutablevalue(ek)) result.dictdef.generalize_value(self.immutablevalue(ev)) elif ishashable(x) and x in BUILTIN_ANALYZERS: - result = SomeBuiltin(BUILTIN_ANALYZERS[x], methodname="%s.%s" % (x.__module__, x.__name__)) + _module = getattr(x,"__module__","unknown") + result = SomeBuiltin(BUILTIN_ANALYZERS[x], methodname="%s.%s" % (_module, x.__name__)) elif tp in EXTERNAL_TYPE_ANALYZERS: result = SomeExternalObject(tp) elif isinstance(x, lltype._ptr): Modified: pypy/dist/pypy/annotation/builtin.py ============================================================================== --- pypy/dist/pypy/annotation/builtin.py (original) +++ pypy/dist/pypy/annotation/builtin.py Fri Dec 9 16:07:56 2005 @@ -233,6 +233,11 @@ def exception_init(s_self, *args): pass # XXX check correctness of args, maybe +def object_init(s_self, *args): + # ignore - mostly used for abstract classes initialization + pass + + def count(s_obj): return SomeInteger() @@ -339,6 +344,9 @@ import unicodedata BUILTIN_ANALYZERS[unicodedata.decimal] = unicodedata_decimal # xxx +# object - just ignore 
object.__init__ +BUILTIN_ANALYZERS[object.__init__] = object_init + # import BUILTIN_ANALYZERS[__import__] = import_func Added: pypy/dist/pypy/interpreter/pyparser/ebnfgrammar.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/interpreter/pyparser/ebnfgrammar.py Fri Dec 9 16:07:56 2005 @@ -0,0 +1,109 @@ +# This module contains the grammar parser +# and the symbol mappings + +from grammar import BaseGrammarBuilder, Alternative, Sequence, Token, \ + KleeneStar, GrammarElement, build_first_sets, EmptyToken + + +sym_map = {} +sym_rmap = {} +_count = 0 + +def g_add_symbol( name ): + global _count + if name in sym_rmap: + return sym_rmap[name] + val = _count + _count += 1 + sym_map[val] = name + sym_rmap[name] = val + return val + + +tok_map = {} +tok_rmap = {} + +def g_add_token( **kwargs ): + global _count + assert len(kwargs) == 1 + sym, name = kwargs.popitem() + if name in tok_rmap: + return tok_rmap[name] + val = _count + _count += 1 + tok_map[val] = name + tok_rmap[name] = val + sym_map[val] = sym + sym_rmap[sym] = val + return val + +g_add_token( EOF='EOF' ) + + +def grammar_grammar(): + """NOT RPYTHON (mostly because of g_add_token I suppose) + Builds the grammar for the grammar file + + Here's the description of the grammar's grammar :: + + grammar: rule+ + rule: SYMDEF alternative + + alternative: sequence ( '|' sequence )+ + star: '*' | '+' + sequence: (SYMBOL star? | STRING | option | group star? )+ + option: '[' alternative ']' + group: '(' alternative ')' star? 
+ """ + global sym_map + S = g_add_symbol + T = g_add_token + # star: '*' | '+' + star = Alternative( S("star"), [Token(T(TOK_STAR='*')), Token(T(TOK_ADD='+'))] ) + star_opt = KleeneStar ( S("star_opt"), 0, 1, rule=star ) + + # rule: SYMBOL ':' alternative + symbol = Sequence( S("symbol"), [Token(T(TOK_SYMBOL='SYMBOL')), star_opt] ) + symboldef = Token( T(TOK_SYMDEF="SYMDEF") ) + alternative = Sequence( S("alternative"), []) + rule = Sequence( S("rule"), [symboldef, alternative] ) + + # grammar: rule+ + grammar = KleeneStar( S("grammar"), _min=1, rule=rule ) + + # alternative: sequence ( '|' sequence )* + sequence = KleeneStar( S("sequence"), 1 ) + seq_cont_list = Sequence( S("seq_cont_list"), [Token(T(TOK_BAR='|')), sequence] ) + sequence_cont = KleeneStar( S("sequence_cont"),0, rule=seq_cont_list ) + + alternative.args = [ sequence, sequence_cont ] + + # option: '[' alternative ']' + option = Sequence( S("option"), [Token(T(TOK_LBRACKET='[')), alternative, Token(T(TOK_RBRACKET=']'))] ) + + # group: '(' alternative ')' + group = Sequence( S("group"), [Token(T(TOK_LPAR='(')), alternative, Token(T(TOK_RPAR=')')), star_opt] ) + + # sequence: (SYMBOL | STRING | option | group )+ + string = Token(T(TOK_STRING='STRING')) + alt = Alternative( S("sequence_alt"), [symbol, string, option, group] ) + sequence.args = [ alt ] + + + rules = [ star, star_opt, symbol, alternative, rule, grammar, sequence, + seq_cont_list, sequence_cont, option, group, alt ] + build_first_sets( rules ) + return grammar + + +GRAMMAR_GRAMMAR = grammar_grammar() + +for _sym, _value in sym_rmap.items(): + globals()[_sym] = _value + +# cleanup +del _sym +del _value +del grammar_grammar +del g_add_symbol +del g_add_token Modified: pypy/dist/pypy/interpreter/pyparser/ebnflexer.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/ebnflexer.py (original) +++ pypy/dist/pypy/interpreter/pyparser/ebnflexer.py Fri Dec 9 16:07:56 2005 @@ -3,18 
+3,21 @@ analyser in grammar.py """ -import re from grammar import TokenSource, Token +from ebnfgrammar import * -## Lexer for Python's grammar ######################################## -g_symdef = re.compile(r"[a-zA-Z_][a-zA-Z0-9_]*:",re.M) -g_symbol = re.compile(r"[a-zA-Z_][a-zA-Z0-9_]*",re.M) -g_string = re.compile(r"'[^']+'",re.M) -g_tok = re.compile(r"\[|\]|\(|\)|\*|\+|\|",re.M) -g_skip = re.compile(r"\s*(#.*$)?",re.M) + +def match_symbol( input, start, stop ): + idx = start + while idx=0 + return Token(TOK_STRING,inp[pos+1:_endpos]) + else: + npos = match_symbol( inp, pos, end) + if npos!=pos: + self.pos = npos + if npos!=end and inp[npos]==":": + self.pos += 1 + return Token(TOK_SYMDEF,inp[pos:npos]) + else: + return Token(TOK_SYMBOL,inp[pos:npos]) + + # we still have pos!=end here + chr = inp[pos] + if chr in "[]()*+|": + self.pos = pos+1 + return Token(tok_rmap[chr], chr) + self.RaiseError( "Unknown token" ) def peek(self): """take a peek at the next token""" @@ -113,3 +168,21 @@ """A simple helper function returning the stream at the last parsed position""" return self.input[self.pos:self.pos+N] + + +# a simple target used to annotate/translate the tokenizer +def target_parse_input( txt ): + lst = [] + src = GrammarSource( txt ) + while 1: + x = src.next() + lst.append( x ) + if x.codename == EOF: + break + #return lst + +if __name__ == "__main__": + import sys + f = file(sys.argv[-1]) + lst = target_parse_input( f.read() ) + for i in lst: print i Modified: pypy/dist/pypy/interpreter/pyparser/ebnfparse.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/ebnfparse.py (original) +++ pypy/dist/pypy/interpreter/pyparser/ebnfparse.py Fri Dec 9 16:07:56 2005 @@ -2,6 +2,7 @@ from grammar import BaseGrammarBuilder, Alternative, Sequence, Token, \ KleeneStar, GrammarElement, build_first_sets, EmptyToken from ebnflexer import GrammarSource +from ebnfgrammar import GRAMMAR_GRAMMAR, sym_map from 
syntaxtree import AbstractSyntaxVisitor import pytoken import pysymbol @@ -16,13 +17,6 @@ '%', '<<', '//', '\\', '', '\n\\)', '\\(', ';', ':', '@', '\\[', '\\]', '`', '\\{', '\\}'] -py_punct = re.compile(r""" ->=|<>|!=|<|>|<=|==|~| -\*=|//=|%=|\^=|<<=|\*\*=|\|=|\+=|>>=|=|&=|/=|-=| -,|\^|>>|&|\+|\*|-|/|\.|\*\*|%|<<|//|\|| -\)|\(|;|:|@|\[|\]|`|\{|\} -""", re.M | re.X) - TERMINALS = [ 'NAME', 'NUMBER', 'STRING', 'NEWLINE', 'ENDMARKER', @@ -188,7 +182,7 @@ rule = node.nodes[1].visit(self) return self.repeat( node.nodes[3], rule ) - def handle_STRING( self, node ): + def handle_TOK_STRING( self, node ): value = node.value tokencode = pytoken.tok_punct.get( value ) if tokencode is None: @@ -224,76 +218,6 @@ % tok.value) return myrule -rules = None - -sym_map = {} -sym_rmap = {} -sym_count = 0 - -def g_add_symbol( name ): - global sym_count - if name in sym_rmap: - return sym_rmap[name] - val = sym_count - sym_count += 1 - sym_map[val] = name - sym_rmap[name] = val - return val - -g_add_symbol( 'EOF' ) - -def grammar_grammar(): - """Builds the grammar for the grammar file - - Here's the description of the grammar's grammar :: - - grammar: rule+ - rule: SYMDEF alternative - - alternative: sequence ( '|' sequence )+ - star: '*' | '+' - sequence: (SYMBOL star? | STRING | option | group star? )+ - option: '[' alternative ']' - group: '(' alternative ')' star? 
- """ - global rules, sym_map - S = g_add_symbol - # star: '*' | '+' - star = Alternative( S("star"), [Token(S('*')), Token(S('+'))] ) - star_opt = KleeneStar ( S("star_opt"), 0, 1, rule=star ) - - # rule: SYMBOL ':' alternative - symbol = Sequence( S("symbol"), [Token(S('SYMBOL')), star_opt] ) - symboldef = Token( S("SYMDEF") ) - alternative = Sequence( S("alternative"), []) - rule = Sequence( S("rule"), [symboldef, alternative] ) - - # grammar: rule+ - grammar = KleeneStar( S("grammar"), _min=1, rule=rule ) - - # alternative: sequence ( '|' sequence )* - sequence = KleeneStar( S("sequence"), 1 ) - seq_cont_list = Sequence( S("seq_cont_list"), [Token(S('|')), sequence] ) - sequence_cont = KleeneStar( S("sequence_cont"),0, rule=seq_cont_list ) - - alternative.args = [ sequence, sequence_cont ] - - # option: '[' alternative ']' - option = Sequence( S("option"), [Token(S('[')), alternative, Token(S(']'))] ) - - # group: '(' alternative ')' - group = Sequence( S("group"), [Token(S('(')), alternative, Token(S(')')), star_opt] ) - - # sequence: (SYMBOL | STRING | option | group )+ - string = Token(S('STRING')) - alt = Alternative( S("sequence_alt"), [symbol, string, option, group] ) - sequence.args = [ alt ] - - - rules = [ star, star_opt, symbol, alternative, rule, grammar, sequence, - seq_cont_list, sequence_cont, option, group, alt ] - build_first_sets( rules ) - return grammar def parse_grammar(stream): @@ -301,15 +225,27 @@ stream : file-like object representing the grammar to parse """ - source = GrammarSource(stream.read(), sym_rmap) - rule = grammar_grammar() + source = GrammarSource(stream.read()) builder = BaseGrammarBuilder() - result = rule.match(source, builder) + result = GRAMMAR_GRAMMAR.match(source, builder) node = builder.stack[-1] vis = EBNFVisitor() node.visit(vis) return vis +def parse_grammar_text(txt): + """parses a grammar input + + stream : file-like object representing the grammar to parse + """ + source = GrammarSource(txt) + builder = 
BaseGrammarBuilder() + result = GRAMMAR_GRAMMAR.match(source, builder) + node = builder.stack[-1] + vis = EBNFVisitor() + node.visit(vis) + return vis + from pprint import pprint if __name__ == "__main__": Modified: pypy/dist/pypy/interpreter/pyparser/grammar.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/grammar.py (original) +++ pypy/dist/pypy/interpreter/pyparser/grammar.py Fri Dec 9 16:07:56 2005 @@ -7,11 +7,16 @@ KleeneStar : as in S -> A* or S -> A+ Token : a lexer token """ -from pypy.interpreter.baseobjspace import Wrappable -from pypy.interpreter.typedef import TypeDef -from pypy.interpreter.gateway import interp2app, ObjSpace, W_Root -from pypy.interpreter.argument import Arguments -from pypy.interpreter.error import OperationError +try: + from pypy.interpreter.baseobjspace import Wrappable + from pypy.interpreter.pyparser.pytoken import NULLTOKEN +except ImportError: + # allows standalone testing + Wrappable = object + NULLTOKEN = None + +from syntaxtree import SyntaxNode, TempSyntaxNode, TokenNode + DEBUG = 0 USE_LOOKAHEAD = True @@ -26,6 +31,7 @@ #### Abstract interface for a lexer/tokenizer class TokenSource(object): """Abstract base class for a source tokenizer""" + def context(self): """Returns a context to restore the state of the object later""" @@ -123,7 +129,6 @@ def token(self, name, value, source): return False -from syntaxtree import SyntaxNode, TempSyntaxNode, TokenNode # # we use the term root for a grammar rule to specify rules that are given a name # by the grammar @@ -349,20 +354,7 @@ pass - def descr_repr( self, space ): - """TODO: make __repr__ RPython""" - import pysymbol - return space.wrap( self.display(0, pysymbol.sym_name) ) - - def descr_get_children( self, space ): - return space.newlist( [ space.wrap(it) for it in self.args ] ) - -GrammarElement.typedef = TypeDef( "GrammarElement", - #__repr__ = interp2app(GrammarElement.descr_repr, - # 
unwrap_spec=['self', ObjSpace] ), - get_children = interp2app(GrammarElement.descr_get_children, - unwrap_spec=['self', ObjSpace] ), - ) + class Alternative(GrammarElement): """Represents an alternative in a grammar rule (as in S -> A | B | C)""" @@ -460,47 +452,8 @@ return True return False - def descr_alternative_append( self, space, w_rule ): - rule = space.interpclass_w(w_rule) - if not isinstance( rule, GrammarElement ): - raise OperationError( space.w_TypeError, space.wrap("Need a GrammarElement instance") ) - self.args.append( rule ) - - def descr_alternative___getitem__(self, space, idx ): - return space.wrap(self.args[idx]) - - def descr_alternative___setitem__(self, space, idx, w_rule ): - rule = space.interpclass_w(w_rule) - if not isinstance( rule, GrammarElement ): - raise OperationError( space.w_TypeError, space.wrap("Need a GrammarElement instance") ) - return space.wrap( self.args[idx] ) - - def descr_alternative___delitem__(self, space, idx ): - del self.args[idx] - - def descr_alternative_insert(self, space, idx, w_rule ): - rule = space.interpclass_w(w_rule) - if not isinstance( rule, GrammarElement ): - raise OperationError( space.w_TypeError, space.wrap("Need a GrammarElement instance") ) - if idx<0 or idx>len(self.args): - raise OperationError( space.w_IndexError, space.wrap("Invalid index") ) - self.args.insert( idx, rule ) - -Alternative.typedef = TypeDef("Alternative", GrammarElement.typedef, - __getitem__ = interp2app( Alternative.descr_alternative___getitem__, - unwrap_spec=['self',ObjSpace,int]), - __setitem__ = interp2app( Alternative.descr_alternative___setitem__, - unwrap_spec=['self',ObjSpace,int,W_Root]), - __delitem__ = interp2app( Alternative.descr_alternative___delitem__, - unwrap_spec=['self',ObjSpace,int]), - insert = interp2app( Alternative.descr_alternative_insert, - unwrap_spec = ['self', ObjSpace, int, W_Root ] ), - append = interp2app( Alternative.descr_alternative_append, - unwrap_spec = ['self', ObjSpace, W_Root ] ), - ) 
- class Sequence(GrammarElement): """Reprensents a Sequence in a grammar rule (as in S -> A B C)""" def __init__(self, name, args): @@ -574,46 +527,9 @@ return False return True - def descr_alternative_append( self, space, w_rule ): - rule = space.interpclass_w(w_rule) - if not isinstance( rule, GrammarElement ): - raise OperationError( space.w_TypeError, space.wrap("Need a GrammarElement instance") ) - self.args.append( rule ) - - def descr_alternative___getitem__(self, space, idx ): - return space.wrap(self.args[idx]) - - def descr_alternative___setitem__(self, space, idx, w_rule ): - rule = space.interpclass_w(w_rule) - if not isinstance( rule, GrammarElement ): - raise OperationError( space.w_TypeError, space.wrap("Need a GrammarElement instance") ) - return space.wrap( self.args[idx] ) - - def descr_alternative___delitem__(self, space, idx ): - del self.args[idx] - - def descr_alternative_insert(self, space, idx, w_rule ): - rule = space.interpclass_w(w_rule) - if not isinstance( rule, GrammarElement ): - raise OperationError( space.w_TypeError, space.wrap("Need a GrammarElement instance") ) - if idx<0 or idx>len(self.args): - raise OperationError( space.w_IndexError, space.wrap("Invalid index") ) - self.args.insert( idx, rule ) -Sequence.typedef = TypeDef("Sequence", GrammarElement.typedef, - __getitem__ = interp2app( Sequence.descr_alternative___getitem__, - unwrap_spec=['self',ObjSpace,int]), - __setitem__ = interp2app( Sequence.descr_alternative___setitem__, - unwrap_spec=['self',ObjSpace,int,W_Root]), - __delitem__ = interp2app( Sequence.descr_alternative___delitem__, - unwrap_spec=['self',ObjSpace,int]), - insert = interp2app( Sequence.descr_alternative_insert, - unwrap_spec = ['self', ObjSpace, int, W_Root ] ), - append = interp2app( Sequence.descr_alternative_append, - unwrap_spec = ['self', ObjSpace, W_Root ] ), - ) class KleeneStar(GrammarElement): @@ -706,28 +622,6 @@ return False return True - def descr_kleenestar___getitem__(self, space, idx ): - 
if idx!=0: - raise OperationError( space.w_ValueError, space.wrap("KleeneStar only support one child")) - return space.wrap(self.args[idx]) - - def descr_kleenestar___setitem__(self, space, idx, w_rule ): - rule = space.interpclass_w(w_rule) - if idx!=0: - raise OperationError( space.w_ValueError, space.wrap("KleeneStar only support one child")) - if not isinstance( rule, GrammarElement ): - raise OperationError( space.w_TypeError, space.wrap("Need a GrammarElement instance") ) - self.args[idx] = rule - - - -KleeneStar.typedef = TypeDef("KleeneStar", GrammarElement.typedef, - __getitem__ = interp2app(KleeneStar.descr_kleenestar___getitem__, - unwrap_spec=[ 'self', ObjSpace, int]), - __setitem__ = interp2app(KleeneStar.descr_kleenestar___setitem__, - unwrap_spec=[ 'self', ObjSpace, int, W_Root ]), - ) - class Token(GrammarElement): """Represents a Token in a grammar rule (a lexer token)""" @@ -804,9 +698,7 @@ return True return False -Token.typedef = TypeDef("Token", GrammarElement.typedef ) -from pypy.interpreter.pyparser.pytoken import NULLTOKEN EmptyToken = Token(NULLTOKEN, None) Modified: pypy/dist/pypy/interpreter/pyparser/pysymbol.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/pysymbol.py (original) +++ pypy/dist/pypy/interpreter/pyparser/pysymbol.py Fri Dec 9 16:07:56 2005 @@ -1,5 +1,9 @@ # replacement for the CPython symbol module -from pypy.interpreter.pyparser import symbol +try: + from pypy.interpreter.pyparser import symbol +except ImportError: + # for standalone testing + import symbol # try to avoid numeric values conflict with tokens # it's important for CPython, but I'm not so sure it's still Modified: pypy/dist/pypy/interpreter/pyparser/syntaxtree.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/syntaxtree.py (original) +++ pypy/dist/pypy/interpreter/pyparser/syntaxtree.py Fri Dec 9 16:07:56 2005 @@ -1,7 
+1,11 @@ """SyntaxTree class definition""" -from pypy.interpreter.pyparser.pysymbol import sym_values -from pypy.interpreter.pyparser.pytoken import tok_values - +try: + from pypy.interpreter.pyparser.pysymbol import sym_values + from pypy.interpreter.pyparser.pytoken import tok_values +except ImportError: + from pysymbol import sym_values + from pytoken import tok_values + class AbstractSyntaxVisitor(object): def visit_syntaxnode( self, node ): pass Modified: pypy/dist/pypy/module/_socket/rpython/rsocket.py ============================================================================== --- pypy/dist/pypy/module/_socket/rpython/rsocket.py (original) +++ pypy/dist/pypy/module/_socket/rpython/rsocket.py Fri Dec 9 16:07:56 2005 @@ -5,7 +5,7 @@ import socket # HACK: We have to prevent GC to collect the socket object we create within this -#?module. Because socket.close() is called on GC this can lead to strange +# module. Because socket.close() is called on GC this can lead to strange # effects in corner cases where file descriptors are reused. 
socket_cache = {} keep_sockets_alive = [] Modified: pypy/dist/pypy/module/recparser/pyparser.py ============================================================================== --- pypy/dist/pypy/module/recparser/pyparser.py (original) +++ pypy/dist/pypy/module/recparser/pyparser.py Fri Dec 9 16:07:56 2005 @@ -11,6 +11,8 @@ from pypy.interpreter.pyparser.pythonutil import PYTHON_PARSER from pypy.interpreter.pyparser.error import SyntaxError from pypy.interpreter.pyparser import grammar, pysymbol, pytoken +from pypy.interpreter.argument import Arguments + __all__ = [ "ASTType", "STType", "suite", "expr" ] @@ -206,3 +208,121 @@ encoding = None return parsestr(space, encoding, s) decode_string_literal.unwrap_spec = [ObjSpace, str, W_Root] + + +# append typedefs to the grammar objects +from pypy.interpreter.pyparser.grammar import GrammarElement, Alternative +from pypy.interpreter.pyparser.grammar import Sequence, KleeneStar, Token + + +def descr_grammarelement_repr( self, space ): + """TODO: make __repr__ RPython""" + import pysymbol + return space.wrap( self.display(0, pysymbol.sym_name) ) + +def descr_grammarelement_get_children( self, space ): + return space.newlist( [ space.wrap(it) for it in self.args ] ) + +GrammarElement.descr_grammarelement_repr = descr_grammarelement_repr +GrammarElement.descr_grammarelement_get_children = descr_grammarelement_get_children + +GrammarElement.typedef = TypeDef( "GrammarElement", + #__repr__ = interp2app(GrammarElement.descr_grammarelement_repr, + # unwrap_spec=['self', ObjSpace] ), + get_children = interp2app(GrammarElement.descr_grammarelement_get_children, + unwrap_spec=['self', ObjSpace] ), + ) + + + +def descr_alternative_append( self, space, w_rule ): + rule = space.interpclass_w(w_rule) + if not isinstance( rule, GrammarElement ): + raise OperationError( space.w_TypeError, space.wrap("Need a GrammarElement instance") ) + self.args.append( rule ) + + +def descr_alternative___getitem__(self, space, idx ): + return 
space.wrap(self.args[idx]) + +def descr_alternative___setitem__(self, space, idx, w_rule ): + rule = space.interpclass_w(w_rule) + if not isinstance( rule, GrammarElement ): + raise OperationError( space.w_TypeError, space.wrap("Need a GrammarElement instance") ) + return space.wrap( self.args[idx] ) + +def descr_alternative___delitem__(self, space, idx ): + del self.args[idx] + +def descr_alternative_insert(self, space, idx, w_rule ): + rule = space.interpclass_w(w_rule) + if not isinstance( rule, GrammarElement ): + raise OperationError( space.w_TypeError, space.wrap("Need a GrammarElement instance") ) + if idx<0 or idx>len(self.args): + raise OperationError( space.w_IndexError, space.wrap("Invalid index") ) + self.args.insert( idx, rule ) + +Alternative.descr_alternative_append = descr_alternative_append +Alternative.descr_alternative_insert = descr_alternative_insert +Alternative.descr_alternative___getitem__ = descr_alternative___getitem__ +Alternative.descr_alternative___setitem__ = descr_alternative___setitem__ +Alternative.descr_alternative___delitem__ = descr_alternative___delitem__ + + +Alternative.typedef = TypeDef("Alternative", GrammarElement.typedef, + __getitem__ = interp2app( Alternative.descr_alternative___getitem__, + unwrap_spec=['self',ObjSpace,int]), + __setitem__ = interp2app( Alternative.descr_alternative___setitem__, + unwrap_spec=['self',ObjSpace,int,W_Root]), + __delitem__ = interp2app( Alternative.descr_alternative___delitem__, + unwrap_spec=['self',ObjSpace,int]), + insert = interp2app( Alternative.descr_alternative_insert, + unwrap_spec = ['self', ObjSpace, int, W_Root ] ), + append = interp2app( Alternative.descr_alternative_append, + unwrap_spec = ['self', ObjSpace, W_Root ] ), + ) + +Sequence.descr_alternative_append = descr_alternative_append +Sequence.descr_alternative_insert = descr_alternative_insert +Sequence.descr_alternative___getitem__ = descr_alternative___getitem__ +Sequence.descr_alternative___setitem__ = 
descr_alternative___setitem__ +Sequence.descr_alternative___delitem__ = descr_alternative___delitem__ + + +Sequence.typedef = TypeDef("Sequence", GrammarElement.typedef, + __getitem__ = interp2app( Sequence.descr_alternative___getitem__, + unwrap_spec=['self',ObjSpace,int]), + __setitem__ = interp2app( Sequence.descr_alternative___setitem__, + unwrap_spec=['self',ObjSpace,int,W_Root]), + __delitem__ = interp2app( Sequence.descr_alternative___delitem__, + unwrap_spec=['self',ObjSpace,int]), + insert = interp2app( Sequence.descr_alternative_insert, + unwrap_spec = ['self', ObjSpace, int, W_Root ] ), + append = interp2app( Sequence.descr_alternative_append, + unwrap_spec = ['self', ObjSpace, W_Root ] ), + ) + +def descr_kleenestar___getitem__(self, space, idx ): + if idx!=0: + raise OperationError( space.w_ValueError, space.wrap("KleeneStar only support one child")) + return space.wrap(self.args[idx]) + +def descr_kleenestar___setitem__(self, space, idx, w_rule ): + rule = space.interpclass_w(w_rule) + if idx!=0: + raise OperationError( space.w_ValueError, space.wrap("KleeneStar only support one child")) + if not isinstance( rule, GrammarElement ): + raise OperationError( space.w_TypeError, space.wrap("Need a GrammarElement instance") ) + self.args[idx] = rule + +KleeneStar.descr_kleenestar___getitem__ = descr_kleenestar___getitem__ +KleeneStar.descr_kleenestar___setitem__ = descr_kleenestar___setitem__ + +KleeneStar.typedef = TypeDef("KleeneStar", GrammarElement.typedef, + __getitem__ = interp2app(KleeneStar.descr_kleenestar___getitem__, + unwrap_spec=[ 'self', ObjSpace, int]), + __setitem__ = interp2app(KleeneStar.descr_kleenestar___setitem__, + unwrap_spec=[ 'self', ObjSpace, int, W_Root ]), + ) + +Token.typedef = TypeDef("Token", GrammarElement.typedef ) Modified: pypy/dist/pypy/module/recparser/test/test_compilehooks.py ============================================================================== --- pypy/dist/pypy/module/recparser/test/test_compilehooks.py 
(original) +++ pypy/dist/pypy/module/recparser/test/test_compilehooks.py Fri Dec 9 16:07:56 2005 @@ -26,3 +26,32 @@ d = {} exec "a = 3" in d assert d['a'] == 2 # well, yes ... + + +class DISABLEDAppTest_GlobalsAsConsts: + def test_ast_parser(self): + # define the hook + def change_globals(ast, enc): + class ChangeGlobalsVisitor: + def visitConst(self, node): + pass + + def defaultvisit(self, node): + for child in node.getChildNodes(): + child.accept(self) + + def __getattr__(self, attrname): + if attrname.startswith('visit'): + return self.defaultvisit + raise AttributeError(attrname) + + ast.accept(ChangeConstVisitor()) + return ast + + # install the hook + import parser + parser.install_compiler_hook(change_globals) + # check that the visitor changed all globals + # in the code into Consts + # TODO + # simplest version of the test : dis(code) | grep -v LOAD_GLOBAL == dis(code) Modified: pypy/dist/pypy/rpython/rbuiltin.py ============================================================================== --- pypy/dist/pypy/rpython/rbuiltin.py (original) +++ pypy/dist/pypy/rpython/rbuiltin.py Fri Dec 9 16:07:56 2005 @@ -179,6 +179,9 @@ def rtype_Exception__init__(hop): pass +def rtype_object__init__(hop): + pass + def rtype_OSError__init__(hop): if hop.nb_args == 2: raise TyperError("OSError() should not be called with " @@ -241,6 +244,7 @@ BUILTIN_TYPER[Exception.__init__.im_func] = rtype_Exception__init__ BUILTIN_TYPER[AssertionError.__init__.im_func] = rtype_Exception__init__ BUILTIN_TYPER[OSError.__init__.im_func] = rtype_OSError__init__ +BUILTIN_TYPER[object.__init__] = rtype_object__init__ # annotation of low-level types def rtype_malloc(hop): Added: pypy/dist/pypy/translator/goal/targetebnflexer.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/goal/targetebnflexer.py Fri Dec 9 16:07:56 2005 @@ -0,0 +1,33 @@ +from pypy.interpreter.pyparser.ebnflexer import target_parse_input + + 
+entry_point = target_parse_input + +# _____ Define and setup target ___ + +def target(*args): + return entry_point, [str] + +def get_llinterp_args(): + return [1] + +# _____ Run translated _____ +def run(c_entry_point): + import sys + NBC=100 + import time + src = file("../../interpreter/pyparser/data/Grammar2.4").read() + print "Translated:" + t1 = time.time() + for i in range(NBC): + c_entry_point( src ) + t2 = time.time() + print "%8.5f sec/loop" % (float(t2-t1)/NBC) + print "CPython:" + t1 = time.time() + for i in range(NBC): + entry_point( src ) + t2 = time.time() + print "%8.5f sec/loop" % (float(t2-t1)/NBC) + + From ac at codespeak.net Fri Dec 9 16:12:41 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Fri, 9 Dec 2005 16:12:41 +0100 (CET) Subject: [pypy-svn] r20943 - pypy/dist/pypy/translator/c Message-ID: <20051209151241.5DC2B27DE1@code1.codespeak.net> Author: ac Date: Fri Dec 9 16:12:41 2005 New Revision: 20943 Modified: pypy/dist/pypy/translator/c/gc.py Log: Increase performance by 35-40% on Linux.
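[Editorial aside: the run() helper in targetebnflexer.py above benchmarks the translated entry point against the plain CPython one by averaging wall-clock time over NBC calls. Pulled out as a generic helper, that measurement pattern looks roughly like the sketch below; the function name and defaults are illustrative, not part of any checked-in file.]

```python
import time

def secs_per_call(fn, arg, loops=100):
    # Average wall-clock seconds per call, as in targetebnflexer's run():
    # time the whole loop once and divide by the iteration count.
    t1 = time.time()
    for _ in range(loops):
        fn(arg)
    t2 = time.time()
    return (t2 - t1) / loops
```

As in run(), the per-call figure is only meaningful when the measured work dominates the loop overhead.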
Modified: pypy/dist/pypy/translator/c/gc.py ============================================================================== --- pypy/dist/pypy/translator/c/gc.py (original) +++ pypy/dist/pypy/translator/c/gc.py Fri Dec 9 16:12:41 2005 @@ -371,18 +371,22 @@ return ['gc'] def pre_pre_gc_code(self): - if sys.platform == "linux2" and self.thread_enabled: + if sys.platform == "linux2": yield "#define _REENTRANT 1" yield "#define GC_LINUX_THREADS 1" - yield '#include ' - yield '#define USING_BOEHM_GC' + yield "#define GC_REDIRECT_TO_LOCAL 1" + yield '#include ' + yield '#define USING_BOEHM_GC' + else: + yield '#include ' + yield '#define USING_BOEHM_GC' def gc_startup_code(self): if sys.platform == 'win32': yield 'assert(GC_all_interior_pointers == 0);' else: yield 'GC_all_interior_pointers = 0;' - yield 'GC_INIT();' + yield 'GC_init();' class BoehmGcRuntimeTypeInfo_OpaqueNode(ContainerNode): From ericvrp at codespeak.net Fri Dec 9 16:12:57 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 9 Dec 2005 16:12:57 +0100 (CET) Subject: [pypy-svn] r20944 - pypy/dist/pypy/translator/llvm Message-ID: <20051209151257.2EB2F27DE3@code1.codespeak.net> Author: ericvrp Date: Fri Dec 9 16:12:54 2005 New Revision: 20944 Modified: pypy/dist/pypy/translator/llvm/build_llvm_module.py pypy/dist/pypy/translator/llvm/extfuncnode.py Log: fix for int/long mistake with LL_os_seek Modified: pypy/dist/pypy/translator/llvm/build_llvm_module.py ============================================================================== --- pypy/dist/pypy/translator/llvm/build_llvm_module.py (original) +++ pypy/dist/pypy/translator/llvm/build_llvm_module.py Fri Dec 9 16:12:54 2005 @@ -83,7 +83,7 @@ #ball = str(dirpath.join('%s_all.bc' % b)) #cmds.append("opt %s %s -f -o %s.bc" % (OPTIMIZATION_SWITCHES, ball, b)) - use_gcc = True + use_gcc = sys.platform == 'linux2' profile = False cleanup = False Modified: pypy/dist/pypy/translator/llvm/extfuncnode.py 
============================================================================== --- pypy/dist/pypy/translator/llvm/extfuncnode.py (original) +++ pypy/dist/pypy/translator/llvm/extfuncnode.py Fri Dec 9 16:12:54 2005 @@ -13,6 +13,7 @@ ext_func_sigs = { "%LL_os_isatty" : ExtFuncSig("int", None), "%LL_stack_too_big" : ExtFuncSig("int", None), + "%LL_os_lseek" : ExtFuncSig("long", None), "%LL_thread_acquirelock" : ExtFuncSig("int", [None, "int"]), "%LL_thread_start" : ExtFuncSig(None, ["sbyte*", "sbyte*"])} From tismer at codespeak.net Fri Dec 9 16:43:04 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Fri, 9 Dec 2005 16:43:04 +0100 (CET) Subject: [pypy-svn] r20946 - pypy/dist/pypy/translator/c/test Message-ID: <20051209154304.9693627DE4@code1.codespeak.net> Author: tismer Date: Fri Dec 9 16:43:03 2005 New Revision: 20946 Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py Log: temporary check-in, stopping to hack on the same source. Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_tasklets.py (original) +++ pypy/dist/pypy/translator/c/test/test_tasklets.py Fri Dec 9 16:43:03 2005 @@ -39,6 +39,7 @@ s_list_of_strings.listdef.resize() t = TranslationContext() t.buildannotator().build_types(entry_point, [s_list_of_strings]) + t.view() t.buildrtyper().specialize() backend_optimizations(t) cbuilder = CStandaloneBuilder(t, entry_point, gcpolicy=gcpolicy) @@ -105,8 +106,8 @@ if self.balance <= 0: t = self.queue.pop(0) t.data = value - ##!!t.blocked = 0 - scheduler.run_immediately(tasklet) + t.blocked = 0 + scheduler.run_immediately(t) scheduler.schedule() else: @@ -114,7 +115,7 @@ assert isinstance(t, Tasklet) # let it wait for a receiver to come along self.queue.append(t) - ##!!t.blocked = 1 + t.blocked = 1 schedule_remove() def receive(self): @@ -122,15 +123,18 @@ # good to go if self.balance >= 0: t = self.queue.pop(0) 
- ##!!t.blocked = 0 + t.blocked = 0 + data = t.data scheduler.add_tasklet(t) - + return data else: # block until ready t = getcurrent() + assert isinstance(t, Tasklet) self.queue.append(t) - ##!!t.blocked = -1 + t.blocked = -1 schedule_remove() + return -1 # never reached class Scheduler(object): def __init__(self): @@ -304,9 +308,14 @@ debug("done sending") def f2(name): - while True: - ch.receive() - debug("received") + for ii in range(5): + data = ch.receive() + debug("done receiving") +## while True: +## data = ch.receive() +## if data == 42: +## break +## debug("received") def f(): start_tasklet(Tasklet("f2", f2)) From nik at codespeak.net Fri Dec 9 17:04:47 2005 From: nik at codespeak.net (nik at codespeak.net) Date: Fri, 9 Dec 2005 17:04:47 +0100 (CET) Subject: [pypy-svn] r20947 - in pypy/dist/pypy: module/_socket/rpython translator/c translator/c/src translator/c/test Message-ID: <20051209160447.03E5327DDC@code1.codespeak.net> Author: nik Date: Fri Dec 9 17:04:45 2005 New Revision: 20947 Modified: pypy/dist/pypy/module/_socket/rpython/exttable.py pypy/dist/pypy/module/_socket/rpython/ll__socket.py pypy/dist/pypy/translator/c/extfunc.py pypy/dist/pypy/translator/c/src/ll__socket.h pypy/dist/pypy/translator/c/test/test_ext__socket.py Log: (ale, nik) got basic connect() and getpeername() working (for IPv4). still needs error handling. 
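[Editorial aside: the connect()/getpeername() flow that r20947 exercises through rsocket corresponds, at the plain CPython level, to the sketch below. The helper name is hypothetical, and like the commit's RPython version it does no error handling.]

```python
import socket

def peer_port(host, port):
    # Connect to (host, port) and return the peer's port number,
    # mirroring the does_stuff() test in test_ext__socket.py.
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        s.connect((host, port))
        return s.getpeername()[1]
    finally:
        s.close()
```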
Modified: pypy/dist/pypy/module/_socket/rpython/exttable.py ============================================================================== --- pypy/dist/pypy/module/_socket/rpython/exttable.py (original) +++ pypy/dist/pypy/module/_socket/rpython/exttable.py Fri Dec 9 17:04:45 2005 @@ -47,7 +47,6 @@ declare(rsocket.newsocket, int, '%s/newsocket' % module) declare(rsocket.connect, noneannotation, '%s/connect' % module) declare(rsocket.getpeername, ann_sockname, '%s/getpeername' % module) -declare(rsocket.freesockname, noneannotation, '%s/freesockname' % module) # ____________________________________________________________ # _socket.error can be raised by the above Modified: pypy/dist/pypy/module/_socket/rpython/ll__socket.py ============================================================================== --- pypy/dist/pypy/module/_socket/rpython/ll__socket.py (original) +++ pypy/dist/pypy/module/_socket/rpython/ll__socket.py Fri Dec 9 17:04:45 2005 @@ -102,7 +102,3 @@ def ll__socket_getpeername(fd): return ("", 0, 0, 0) ll__socket_getpeername.suggested_primitive = True - -def ll__socket_freesockname(sockname): - return None -ll__socket_freesockname.suggested_primitive = True Modified: pypy/dist/pypy/translator/c/extfunc.py ============================================================================== --- pypy/dist/pypy/translator/c/extfunc.py (original) +++ pypy/dist/pypy/translator/c/extfunc.py Fri Dec 9 17:04:45 2005 @@ -69,7 +69,6 @@ ll__socket.ll__socket_newsocket: 'LL__socket_newsocket', ll__socket.ll__socket_connect: 'LL__socket_connect', ll__socket.ll__socket_getpeername: 'LL__socket_getpeername', - ll__socket.ll__socket_freesockname: 'LL__socket_freesockname', } #______________________________________________________ Modified: pypy/dist/pypy/translator/c/src/ll__socket.h ============================================================================== --- pypy/dist/pypy/translator/c/src/ll__socket.h (original) +++ pypy/dist/pypy/translator/c/src/ll__socket.h Fri 
Dec 9 17:04:45 2005 @@ -102,12 +102,12 @@ { struct sockaddr_in addr; - addr.sin_family = AF_INET; - addr.sin_port = htons((short)port); if (setipaddr(RPyString_AsString(host), (struct sockaddr *) &addr, sizeof(addr), AF_INET) < 0) { - // XXX raise some error here + // XXX raise some error here } + addr.sin_family = AF_INET; + addr.sin_port = htons(port); if (connect(fd, &addr, sizeof(addr)) < 0) { // XXX raise some error here } @@ -122,18 +122,17 @@ memset((void *) &addr, '\0', sizeof(addr)); if (getpeername(fd, (struct sockaddr *) &addr, &addr_len) < 0) { - // XXX raise some error + // XXX raise some error here } host = RPyString_FromString(inet_ntoa(addr.sin_addr)); +#if !defined(USING_BOEHM_GC) && !defined(USING_NO_GC) + host->refcount--; // XXX this is not sane, but there is no better way + // at the moment. +#endif return ll__socket_sockname(host, addr.sin_port, 0, 0); } -void LL__socket_freesockname(RPySOCKET_SOCKNAME *sockname) -{ - free(sockname); -} - /* ____________________________________________________________________________ */ /* Lock to allow python interpreter to continue, but only allow one Modified: pypy/dist/pypy/translator/c/test/test_ext__socket.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_ext__socket.py (original) +++ pypy/dist/pypy/translator/c/test/test_ext__socket.py Fri Dec 9 17:04:45 2005 @@ -91,14 +91,15 @@ py.test.raises(OSError, f1, *args) def test_connect(): + import os from pypy.module._socket.rpython import rsocket def does_stuff(): fd = rsocket.newsocket(_socket.AF_INET, _socket.SOCK_STREAM, 0) # XXX need to think of a test without connecting to outside servers rsocket.connect(fd, "codespeak.net", 80) sockname = rsocket.getpeername(fd) - port = sockname[1] - rsocket.freesockname(sockname) - return port + os.close(fd) + return sockname[1] f1 = compile(does_stuff, []) - assert f1() == 80 + res = f1() + assert res == 80 From cfbolz at codespeak.net Fri Dec 
9 17:09:07 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Fri, 9 Dec 2005 17:09:07 +0100 (CET) Subject: [pypy-svn] r20948 - in pypy/dist/pypy/rpython: . lltypesystem test Message-ID: <20051209160907.5F4FF27DE3@code1.codespeak.net> Author: cfbolz Date: Fri Dec 9 17:09:05 2005 New Revision: 20948 Modified: pypy/dist/pypy/rpython/lltypesystem/rclass.py pypy/dist/pypy/rpython/rtyper.py pypy/dist/pypy/rpython/test/test_rclass.py Log: (cfbolz, johahn): now all the classes that have a __del__ get a pointer to it attached to the rtti opaque object. Modified: pypy/dist/pypy/rpython/lltypesystem/rclass.py ============================================================================== --- pypy/dist/pypy/rpython/lltypesystem/rclass.py (original) +++ pypy/dist/pypy/rpython/lltypesystem/rclass.py Fri Dec 9 17:09:05 2005 @@ -16,7 +16,7 @@ cast_pointer, castable, nullptr, \ RuntimeTypeInfo, getRuntimeTypeInfo, typeOf, \ Array, Char, Void, attachRuntimeTypeInfo, \ - FuncType, Bool, Signed + FuncType, Bool, Signed, functionptr, FuncType # # There is one "vtable" per user class, with the following structure: @@ -336,9 +336,21 @@ def _setup_repr_final(self): if self.needsgc: # only gc-case + if (self.classdef is not None and + self.classdef.classdesc.lookup('__del__') is not None): + s_func = self.classdef.classdesc.s_read_attribute('__del__') + assert len(s_func.descriptions) == 1 + funcdesc = s_func.descriptions.keys()[0] + graph = funcdesc.cachedgraph(None) + FUNCTYPE = FuncType([Ptr(self.object_type)], Void) + destrptr = functionptr(FUNCTYPE, graph.name, + graph=graph, + _callable=graph.func) + else: + destrptr = None self.rtyper.attachRuntimeTypeInfoFunc(self.object_type, ll_runtime_type_info, - OBJECT) + OBJECT, destrptr) def common_repr(self): # -> object or nongcobject reprs return getinstancerepr(self.rtyper, None, nogc=not self.needsgc) Modified: pypy/dist/pypy/rpython/rtyper.py ============================================================================== 
--- pypy/dist/pypy/rpython/rtyper.py (original) +++ pypy/dist/pypy/rpython/rtyper.py Fri Dec 9 17:09:05 2005 @@ -547,7 +547,8 @@ ll_function, args_s) return helper_graph - def attachRuntimeTypeInfoFunc(self, GCSTRUCT, func, ARG_GCSTRUCT=None): + def attachRuntimeTypeInfoFunc(self, GCSTRUCT, func, ARG_GCSTRUCT=None, + destrptr=None): self.call_all_setups() # compute ForwardReferences now if ARG_GCSTRUCT is None: ARG_GCSTRUCT = GCSTRUCT @@ -560,7 +561,7 @@ raise TyperError("runtime type info function %r returns %r, " "excepted Ptr(RuntimeTypeInfo)" % (func, s)) funcptr = self.getcallable(graph) - attachRuntimeTypeInfo(GCSTRUCT, funcptr) + attachRuntimeTypeInfo(GCSTRUCT, funcptr, destrptr) # ____________________________________________________________ Modified: pypy/dist/pypy/rpython/test/test_rclass.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rclass.py (original) +++ pypy/dist/pypy/rpython/test/test_rclass.py Fri Dec 9 17:09:05 2005 @@ -1,4 +1,4 @@ -from pypy.translator.translator import Translator +from pypy.translator.translator import TranslationContext, graphof from pypy.rpython.lltypesystem.lltype import * from pypy.rpython.test.test_llinterp import interpret from pypy.rpython.rarithmetic import intmask @@ -367,3 +367,25 @@ return meth() res = interpret(f, []) assert res == 1 + +def test__del__(): + class A(object): + def __init__(self): + self.a = 2 + def __del__(self): + self.a = 3 + def f(): + a = A() + return a.a + t = TranslationContext() + t.buildannotator().build_types(f, []) + t.buildrtyper().specialize() + graph = graphof(t, f) + TYPE = graph.startblock.operations[0].args[0].value + RTTI = getRuntimeTypeInfo(TYPE) + queryptr = RTTI._obj.query_funcptr # should not raise + destrptr = RTTI._obj.destructor_funcptr + assert destrptr is not None + + + From arigo at codespeak.net Fri Dec 9 17:30:44 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 9 Dec 2005 17:30:44 
+0100 (CET) Subject: [pypy-svn] r20949 - in pypy/dist/pypy/jit: . test Message-ID: <20051209163044.048DC27DDC@code1.codespeak.net> Author: arigo Date: Fri Dec 9 17:30:43 2005 New Revision: 20949 Added: pypy/dist/pypy/jit/test/__init__.py Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_llabstractinterp.py pypy/dist/pypy/jit/test/test_tl.py pypy/dist/pypy/jit/tl.py Log: (arigo, mwh) Much code reorganization with the goal of being able to tell when a direct_call operation results in a concrete value. The code has an insanity rating of at least 25% (but probably less than 100%). Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Fri Dec 9 17:30:43 2005 @@ -54,12 +54,15 @@ return isinstance(other, LLRuntimeValue) # XXX and ... -class LLState(object): - """Entry state of a block or a graph, as a combination of LLAbstractValues +class BlockState(object): + """Entry state of a block, as a combination of LLAbstractValues for its input arguments.""" - def __init__(self, args_a): + def __init__(self, origblock, args_a): + assert len(args_a) == len(origblock.inputargs) self.args_a = args_a + self.origblock = origblock + self.copyblock = None def match(self, args_a): # simple for now @@ -69,53 +72,47 @@ else: return True - -class BlockState(LLState): - """Entry state of a block.""" - - def __init__(self, origblock, args_a): - assert len(args_a) == len(origblock.inputargs) - super(BlockState, self).__init__(args_a) - self.origblock = origblock - self.copyblock = None - self.pendingsources = [] - - def patchsource(self, source): - if self.copyblock is None: - print 'PENDING', self, hex(id(source)) - self.pendingsources.append(source) - else: - # XXX nice interface required! 
- print 'LINKING', self, id(source), self.copyblock - source.settarget(self.copyblock) - def resolveblock(self, newblock): - print "RESOLVING BLOCK", newblock + #print "RESOLVING BLOCK", newblock self.copyblock = newblock - for source in self.pendingsources: - self.patchsource(source) - del self.pendingsources[:] -class GraphState(LLState): +class GraphState(object): """Entry state of a graph.""" def __init__(self, origgraph, args_a): super(GraphState, self).__init__(args_a) self.origgraph = origgraph self.copygraph = FunctionGraph(origgraph.name, Block([])) # grumble + for orig_v, copy_v in [(origgraph.getreturnvar(), + self.copygraph.getreturnvar()), + (origgraph.exceptblock.inputargs[0], + self.copygraph.exceptblock.inputargs[0]), + (origgraph.exceptblock.inputargs[1], + self.copygraph.exceptblock.inputargs[1])]: + if hasattr(orig_v, 'concretetype'): + copy_v.concretetype = orig_v.concretetype + self.a_return = None + self.state = "before" def settarget(self, block): block.isstartblock = True self.copygraph.startblock = block + def complete(self, interp): + assert self.state != "during" + if self.state == "before": + builderframe = LLAbstractFrame(interp, self) + builderframe.complete() + self.state = "after" + # ____________________________________________________________ class LLAbstractInterp(object): def __init__(self): - self.graphs = {} # {origgraph: {BlockState: GraphState}} - self.fixreturnblocks = [] + self.graphs = {} # {origgraph: {BlockState: GraphState}} + self.pendingstates = {} # {Link-or-GraphState: next-BlockState} def itercopygraphs(self): for d in self.graphs.itervalues(): @@ -125,29 +122,13 @@ def eval(self, origgraph, hints): # for now, 'hints' means "I'm absolutely sure that the # given variables will have the given ll value" - self.allpendingstates = [] self.hints = hints self.blocks = {} # {origblock: list-of-LLStates} args_a = [LLRuntimeValue(orig_v=v) for v in origgraph.getargs()] - graphstate = self.schedule_graph(args_a, origgraph) - 
self.complete() - self.fixgraphs() + graphstate, args_a = self.schedule_graph(args_a, origgraph) + graphstate.complete(self) return graphstate.copygraph - def fixgraphs(self): - # add the missing '.returnblock' attribute - for graph in self.fixreturnblocks: - for block in graph.iterblocks(): - if block.operations == () and len(block.inputargs) == 1: - # here it is :-) - graph.returnblock = block - break - else: - # no return block... - graph.getreturnvar().concretevalue = lltype.Void - checkgraph(graph) # sanity-check - del self.fixreturnblocks - def applyhint(self, args_a, origblock): result_a = [] if origblock.operations == (): @@ -172,21 +153,20 @@ graphstate = self.graphs[origgraph][state] except KeyError: graphstate = GraphState(origgraph, args_a) - self.fixreturnblocks.append(graphstate.copygraph) d = self.graphs.setdefault(origgraph, {}) d[state] = graphstate - print "SCHEDULE_GRAPH", graphstate - state.patchsource(graphstate) - return graphstate + self.pendingstates[graphstate] = state + #print "SCHEDULE_GRAPH", graphstate + return graphstate, args_a def schedule(self, args_a, origblock): - print "SCHEDULE", args_a, origblock + #print "SCHEDULE", args_a, origblock # args_a: [a_value for v in origblock.inputargs] state, args_a = self.schedule_getstate(args_a, origblock) args_v = [a.getvarorconst() for a in args_a if not isinstance(a, LLConcreteValue)] newlink = Link(args_v, None) - state.patchsource(newlink) + self.pendingstates[newlink] = state return newlink def schedule_getstate(self, args_a, origblock): @@ -202,18 +182,38 @@ # schedule this new state state = BlockState(origblock, args_a) pendingstates.append(state) - self.allpendingstates.append(state) return state, args_a + +class LLAbstractFrame(object): + + def __init__(self, interp, graphstate): + self.interp = interp + self.graphstate = graphstate + def complete(self): - while self.allpendingstates: - state = self.allpendingstates.pop() - print 'CONSIDERING', state - self.flowin(state) + graph = 
self.graphstate.copygraph + interp = self.interp + pending = [self.graphstate] + seen = {} + # follow all possible links, forcing the blocks along the way to be + # computed + while pending: + next = pending.pop() + state = interp.pendingstates[next] + if state.copyblock is None: + self.flowin(state) + next.settarget(state.copyblock) + for link in state.copyblock.exits: + if (link not in seen and link.target is not graph.returnblock + and link.target is not graph.exceptblock): + pending.append(link) + seen[link] = True + # the graph should be complete now; sanity-check + checkgraph(graph) def flowin(self, state): # flow in the block - assert state.copyblock is None origblock = state.origblock bindings = {} # {Variables-of-origblock: a_value} def binding(v): @@ -225,14 +225,12 @@ if not isinstance(a, LLConcreteValue): a = LLRuntimeValue(orig_v=v) bindings[v] = a - if origblock.operations == (): - self.residual_operations = () - else: - self.residual_operations = [] - for op in origblock.operations: - handler = getattr(self, 'op_' + op.opname) - a_result = handler(op, *[binding(v) for v in op.args]) - bindings[op.result] = a_result + print + self.residual_operations = [] + for op in origblock.operations: + handler = getattr(self, 'op_' + op.opname) + a_result = handler(op, *[binding(v) for v in op.args]) + bindings[op.result] = a_result inputargs = [] for v in origblock.inputargs: a = bindings[v] @@ -241,24 +239,39 @@ newblock = Block(inputargs) newblock.operations = self.residual_operations del self.residual_operations # just in case - if origblock.exitswitch is None: - links = origblock.exits - elif origblock.exitswitch == Constant(last_exception): - XXX - else: - v = bindings[origblock.exitswitch].getvarorconst() - if isinstance(v, Variable): - newblock.exitswitch = v + + if origblock.operations != (): + # build exit links and schedule their target for later completion + if origblock.exitswitch is None: links = origblock.exits + elif origblock.exitswitch == 
Constant(last_exception): + XXX + else: + v = bindings[origblock.exitswitch].getvarorconst() + if isinstance(v, Variable): + newblock.exitswitch = v + links = origblock.exits + else: + links = [link for link in origblock.exits + if link.llexitcase == v.value] + newlinks = [] + for origlink in links: + args_a = [binding(v) for v in origlink.args] + newlink = self.interp.schedule(args_a, origlink.target) + newlinks.append(newlink) + else: + # copies of return and except blocks are *normal* blocks currently; + # they are linked to the official return or except block of the + # copygraph. If needed, LLConcreteValues are turned into Constants. + if len(origblock.inputargs) == 1: + self.graphstate.a_return = bindings[origblock.inputargs[0]] + target = self.graphstate.copygraph.returnblock else: - links = [link for link in origblock.exits - if link.llexitcase == v.value] - newlinks = [] - for origlink in links: - args_a = [binding(v) for v in origlink.args] - newlink = self.schedule(args_a, origlink.target) - newlinks.append(newlink) - print "CLOSING" + XXX_later + target = self.graphstate.copygraph.exceptblock + args_v = [binding(v).getvarorconst() for v in origblock.inputargs] + newlinks = [Link(args_v, target)] + #print "CLOSING" newblock.closeblock(*newlinks) state.resolveblock(newblock) @@ -273,6 +286,7 @@ return None # cannot constant-fold any_concrete = any_concrete or isinstance(a, LLConcreteValue) # can constant-fold + print 'fold:', constant_op, concretevalues concreteresult = constant_op(*concretevalues) if any_concrete: return LLConcreteValue(concreteresult) @@ -282,9 +296,15 @@ return LLRuntimeValue(c) def residual(self, opname, args_a, a_result): + v_result = a_result.getvarorconst() + if isinstance(v_result, Constant): + v = Variable() + v.concretetype = v_result.concretetype + v_result = v op = SpaceOperation(opname, [a.getvarorconst() for a in args_a], - a_result.getvarorconst()) + v_result) + print 'keep:', op self.residual_operations.append(op) def 
residualize(self, op, args_a, constant_op=None): @@ -319,6 +339,18 @@ def op_int_lt(self, op, a1, a2): return self.residualize(op, [a1, a2], operator.lt) + def op_int_ge(self, op, a1, a2): + return self.residualize(op, [a1, a2], operator.ge) + + def op_int_le(self, op, a1, a2): + return self.residualize(op, [a1, a2], operator.le) + + def op_int_eq(self, op, a1, a2): + return self.residualize(op, [a1, a2], operator.eq) + + def op_int_ne(self, op, a1, a2): + return self.residualize(op, [a1, a2], operator.ne) + def op_cast_char_to_int(self, op, a): return self.residualize(op, [a], ord) @@ -326,16 +358,23 @@ return a def op_direct_call(self, op, a_func, *args_a): + a_result = LLRuntimeValue(op.result) v_func = a_func.getvarorconst() if isinstance(v_func, Constant): fnobj = v_func.value._obj if hasattr(fnobj, 'graph'): origgraph = fnobj.graph - graphstate = self.schedule_graph(args_a, origgraph) + graphstate, args_a = self.interp.schedule_graph( + args_a, origgraph) + if graphstate.state != "during": + graphstate.complete(self.interp) + if isinstance(graphstate.a_return, LLConcreteValue): + a_result = graphstate.a_return + origfptr = v_func.value ARGS = [] new_args_a = [] - for a in graphstate.args_a: + for a in args_a: if not isinstance(a, LLConcreteValue): ARGS.append(a.getconcretetype()) new_args_a.append(a) @@ -347,36 +386,27 @@ fconst = Constant(fptr) fconst.concretetype = lltype.typeOf(fptr) a_func = LLRuntimeValue(fconst) - a_result = LLRuntimeValue(op.result) self.residual("direct_call", [a_func] + args_a, a_result) return a_result def op_getfield(self, op, a_ptr, a_attrname): constant_op = None T = a_ptr.getconcretetype().TO - v_ptr = a_ptr.getvarorconst() - if isinstance(v_ptr, Constant): - if T._hints.get('immutable', False): - constant_op = getattr + if T._hints.get('immutable', False): + constant_op = getattr return self.residualize(op, [a_ptr, a_attrname], constant_op) - op_getsubstruct = op_getfield + + def op_getsubstruct(self, op, a_ptr, a_attrname): + 
return self.residualize(op, [a_ptr, a_attrname], getattr) def op_getarraysize(self, op, a_ptr): - constant_op = None - T = a_ptr.getconcretetype().TO - v_ptr = a_ptr.getvarorconst() - if isinstance(v_ptr, Constant): - if T._hints.get('immutable', False): - constant_op = len - return self.residualize(op, [a_ptr], constant_op) + return self.residualize(op, [a_ptr], len) def op_getarrayitem(self, op, a_ptr, a_index): constant_op = None T = a_ptr.getconcretetype().TO - v_ptr = a_ptr.getvarorconst() - if isinstance(v_ptr, Constant): - if T._hints.get('immutable', False): - constant_op = operator.getitem + if T._hints.get('immutable', False): + constant_op = operator.getitem return self.residualize(op, [a_ptr, a_index], constant_op) Added: pypy/dist/pypy/jit/test/__init__.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/jit/test/__init__.py Fri Dec 9 17:30:43 2005 @@ -0,0 +1 @@ +#empty Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/test/test_llabstractinterp.py (original) +++ pypy/dist/pypy/jit/test/test_llabstractinterp.py Fri Dec 9 17:30:43 2005 @@ -61,7 +61,7 @@ assert op.args[1].value == 42 assert op.args[1].concretetype == lltype.Signed assert len(graph2.startblock.exits) == 1 - assert graph2.startblock.exits[0].target is graph2.returnblock + assert insns == {'int_add': 1} def test_simple2(): def ll_function(x, y): @@ -157,8 +157,7 @@ def test_simple_array(): A = lltype.Array(lltype.Char, hints={'immutable': True}) - S = lltype.GcStruct('str', ('chars', A), - hints={'immutable': True}) + S = lltype.GcStruct('str', ('chars', A)) s = lltype.malloc(S, 11) for i, c in enumerate("hello world"): s.chars[i] = c @@ -169,3 +168,4 @@ return total graph2, insns = abstrinterp(ll_function, [s, 0, 0], [0, 1, 2]) assert insns == {} + Modified: pypy/dist/pypy/jit/test/test_tl.py 
============================================================================== --- pypy/dist/pypy/jit/test/test_tl.py (original) +++ pypy/dist/pypy/jit/test/test_tl.py Fri Dec 9 17:30:43 2005 @@ -49,7 +49,7 @@ def test_tl_translatable(): code = list2bytecode([PUSH,42, PUSH,100, ADD]) - fn = translate(interp, [str]) + fn = translate(interp, [str, int]) assert interp(code) == fn(code) def test_swap(): Modified: pypy/dist/pypy/jit/tl.py ============================================================================== --- pypy/dist/pypy/jit/tl.py (original) +++ pypy/dist/pypy/jit/tl.py Fri Dec 9 17:30:43 2005 @@ -10,13 +10,12 @@ t = -(-ord(c) & 0xff) return t -def interp(code=''): +def interp(code='', pc=0): if not isinstance(code,str): raise TypeError("code '%s' should be a string" % str(code)) code_len = len(code) stack = [] - pc = 0 while pc < code_len: opcode = ord(code[pc]) From ericvrp at codespeak.net Fri Dec 9 17:34:00 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 9 Dec 2005 17:34:00 +0100 (CET) Subject: [pypy-svn] r20950 - pypy/dist/pypy/translator/goal Message-ID: <20051209163400.6F89727DDF@code1.codespeak.net> Author: ericvrp Date: Fri Dec 9 17:33:59 2005 New Revision: 20950 Modified: pypy/dist/pypy/translator/goal/bench-cronjob.py pypy/dist/pypy/translator/goal/bench-unix.py Log: * don't stop when one of the backends is unable to generate an exe. 
* enhance benchmark output * post the benchmark result on http://codespeak.net/~ericvrp/benchmark/ Modified: pypy/dist/pypy/translator/goal/bench-cronjob.py ============================================================================== --- pypy/dist/pypy/translator/goal/bench-cronjob.py (original) +++ pypy/dist/pypy/translator/goal/bench-cronjob.py Fri Dec 9 17:33:59 2005 @@ -2,21 +2,6 @@ import time, os, sys, stat -current_result = ''' -executable richards pystone -python 2.4.2c1 855ms ( 1.00x) 44642 ( 1.00x) -pypy-llvm-17884 11034ms ( 12.91x) 3362 ( 13.28x) -pypy-llvm-17881 11702ms ( 13.69x) 3240 ( 13.78x) -pypy-llvm-17870 12683ms ( 14.83x) 3073 ( 14.53x) -pypy-llvm-17862 13053ms ( 15.27x) 3017 ( 14.79x) -pypy-llvm-17797 13497ms ( 15.79x) 2832 ( 15.76x) -pypy-llvm-17792 13808ms ( 16.15x) 2818 ( 15.84x) -pypy-llvm-17758 16998ms ( 19.88x) 2237 ( 19.96x) -pypy-c-17853 22389ms ( 26.19x) 1651 ( 27.04x) -pypy-c-17806 22328ms ( 26.11x) 1660 ( 26.88x) -pypy-c-17758 23485ms ( 27.47x) 1598 ( 27.92x) -''' - homedir = os.getenv('HOME') os.putenv('PATH','~/bin:/usr/local/bin:/usr/bin:/bin:/opt/bin:/usr/i686-pc-linux-gnu/gcc-bin/3.3.6') @@ -27,7 +12,8 @@ def update_llvm(): os.chdir(homedir + '/projects/llvm') os.system('cvs -q up 2>&1') - os.system('make tools-only 2>&1') + os.system('make clean 2>&1') + os.system('make -j3 tools-only 2>&1') def compile(backend): os.chdir(homedir + '/projects/pypy-dist/pypy/translator/goal') @@ -51,7 +37,11 @@ os.system('cat /proc/cpuinfo') os.system('free') os.chdir(homedir + '/projects/pypy-dist/pypy/translator/goal') - os.system('python bench-unix.py 2>&1' % locals()) + os.system('python bench-unix.py 2>&1 | tee benchmark.txt' % locals()) + os.system('echo "
"    >  benchmark.html')
+    os.system('cat benchmark.txt           >> benchmark.html')
+    os.system('echo "</pre>
" >> benchmark.html') + os.system('scp benchmark.html ericvrp at codespeak.net:public_html/benchmark/index.html') def main(backends=[]): if backends == []: @@ -60,7 +50,10 @@ update_pypy() update_llvm() for backend in backends: - compile(backend) + try: + compile(backend) + except: + pass benchmark() print time.ctime() print 80*'-' Modified: pypy/dist/pypy/translator/goal/bench-unix.py ============================================================================== --- pypy/dist/pypy/translator/goal/bench-unix.py (original) +++ pypy/dist/pypy/translator/goal/bench-unix.py Fri Dec 9 17:33:59 2005 @@ -2,20 +2,7 @@ # to be executed in the goal folder, # where a couple of pypy-* files is expected. -import os, sys - -current_result = ''' -executable richards pystone -python 2.4.2c1 864ms ( 1.00x) 43103 ( 1.00x) -pypy-llvm-17870 12574ms ( 14.55x) 3069 ( 14.04x) -pypy-llvm-17862 12980ms ( 15.02x) 3041 ( 14.17x) -pypy-llvm-17797 13473ms ( 15.59x) 2824 ( 15.26x) -pypy-llvm-17792 13755ms ( 15.92x) 2823 ( 15.27x) -pypy-llvm-17758 17057ms ( 19.74x) 2229 ( 19.34x) -pypy-c-17853 22411ms ( 25.94x) 1653 ( 26.07x) -pypy-c-17806 22315ms ( 25.83x) 1656 ( 26.03x) -pypy-c-17758 23500ms ( 27.20x) 1570 ( 27.45x) -''' +import os, sys, time PYSTONE_CMD = 'from test import pystone;pystone.main(%s)' PYSTONE_PATTERN = 'This machine benchmarks at' @@ -54,21 +41,22 @@ exes = [s[1] for s in exes] return exes -HEADLINE = 'executable richards pystone' -FMT = '%-30s %6dms (%6.2fx) %6d (%6.2fx)' +HEADLINE = 'date executable richards pystone' +FMT = '%-26s %-30s %6dms (%6.2fx) %6d (%6.2fx)' def main(): print HEADLINE sys.stdout.flush() ref_rich = run_richards() ref_stone = run_pystone() - print FMT % ('python %s' % sys.version.split()[0], ref_rich, 1.0, ref_stone, 1.0) + print FMT % (time.ctime(), 'python %s' % sys.version.split()[0], ref_rich, 1.0, ref_stone, 1.0) sys.stdout.flush() for exe in get_executables(): exename = os.path.splitext(exe)[0].lstrip('./') + ctime = time.ctime( 
os.path.getctime(exename) ) rich = run_richards(exe, 1) stone = run_pystone(exe) - print FMT % (exename, rich, rich / ref_rich, stone, ref_stone / stone) + print FMT % (ctime, exename, rich, rich / ref_rich, stone, ref_stone / stone) sys.stdout.flush() if __name__ == '__main__': From ale at codespeak.net Fri Dec 9 17:50:46 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Fri, 9 Dec 2005 17:50:46 +0100 (CET) Subject: [pypy-svn] r20951 - pypy/dist/pypy/translator/c Message-ID: <20051209165046.8DA2B27DE4@code1.codespeak.net> Author: ale Date: Fri Dec 9 17:50:45 2005 New Revision: 20951 Modified: pypy/dist/pypy/translator/c/stackless.py Log: typo Modified: pypy/dist/pypy/translator/c/stackless.py ============================================================================== --- pypy/dist/pypy/translator/c/stackless.py (original) +++ pypy/dist/pypy/translator/c/stackless.py Fri Dec 9 17:50:45 2005 @@ -23,7 +23,7 @@ # start the decoding table with entries for the functions that # are written manually in ll_stackless.h def reg(llfn): - """Register the given ll_ primitive function as being able to unwing + """Register the given ll_ primitive function as being able to unwind the stack. Required to compute 'can_reach_unwind' correctly.""" assert llfn.suggested_primitive self.stackless_roots[llfn] = True From ale at codespeak.net Fri Dec 9 17:53:46 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Fri, 9 Dec 2005 17:53:46 +0100 (CET) Subject: [pypy-svn] r20952 - in pypy/dist/pypy: module/_socket/rpython translator/c/src translator/c/test Message-ID: <20051209165346.A235327DE4@code1.codespeak.net> Author: ale Date: Fri Dec 9 17:53:42 2005 New Revision: 20952 Modified: pypy/dist/pypy/module/_socket/rpython/ll__socket.py pypy/dist/pypy/translator/c/src/ll__socket.h pypy/dist/pypy/translator/c/test/test_ext__socket.py Log: (nik, ale) Fixed "htons" (misspelled). 
Fixed the test_socket connect (the length of the structure shall be set before calling getpeername) Modified: pypy/dist/pypy/module/_socket/rpython/ll__socket.py ============================================================================== --- pypy/dist/pypy/module/_socket/rpython/ll__socket.py (original) +++ pypy/dist/pypy/module/_socket/rpython/ll__socket.py Fri Dec 9 17:53:42 2005 @@ -77,7 +77,7 @@ ll__socket_ntohs.suggested_primitive = True def ll__socket_htons(ntohs): - return _socket.ntohs(htons) + return _socket.htons(ntohs) ll__socket_htons.suggested_primitive = True def ll__socket_htonl(ntohl): Modified: pypy/dist/pypy/translator/c/src/ll__socket.h ============================================================================== --- pypy/dist/pypy/translator/c/src/ll__socket.h (original) +++ pypy/dist/pypy/translator/c/src/ll__socket.h Fri Dec 9 17:53:42 2005 @@ -10,6 +10,8 @@ # include #endif +static int +setipaddr(char *name, struct sockaddr *addr_ret, size_t addr_ret_size, int af); int LL__socket_ntohs(int htons); int LL__socket_htons(int ntohs); long LL__socket_ntohl(long htonl); @@ -121,6 +123,7 @@ RPyString* host; memset((void *) &addr, '\0', sizeof(addr)); + addr_len = sizeof(addr); if (getpeername(fd, (struct sockaddr *) &addr, &addr_len) < 0) { // XXX raise some error here } Modified: pypy/dist/pypy/translator/c/test/test_ext__socket.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_ext__socket.py (original) +++ pypy/dist/pypy/translator/c/test/test_ext__socket.py Fri Dec 9 17:53:42 2005 @@ -76,7 +76,7 @@ return rsocket.newsocket(_socket.AF_INET, _socket.SOCK_STREAM, 0) f1 = compile(does_stuff, []) res = f1() - assert isinstance(res, int) + assert isinstance(res, (int, long)) def test_newsocket_error(): from pypy.module._socket.rpython import rsocket From arigo at codespeak.net Fri Dec 9 18:02:24 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 9 Dec 2005 
18:02:24 +0100 (CET) Subject: [pypy-svn] r20953 - in pypy/dist/pypy/jit: . test Message-ID: <20051209170224.F1ECF27DE5@code1.codespeak.net> Author: arigo Date: Fri Dec 9 18:02:24 2005 New Revision: 20953 Added: pypy/dist/pypy/jit/test/test_jit_tl.py Modified: pypy/dist/pypy/jit/llabstractinterp.py Log: (mwh, arigo) Intermediate check-in to switch machines. Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Fri Dec 9 18:02:24 2005 @@ -53,6 +53,11 @@ def match(self, other): return isinstance(other, LLRuntimeValue) # XXX and ... +orig_v = Constant(None) +orig_v.concretetype = lltype.Void +ll_no_return_value = LLRuntimeValue(orig_v) +del orig_v + class BlockState(object): """Entry state of a block, as a combination of LLAbstractValues @@ -205,10 +210,20 @@ self.flowin(state) next.settarget(state.copyblock) for link in state.copyblock.exits: - if (link not in seen and link.target is not graph.returnblock - and link.target is not graph.exceptblock): - pending.append(link) + if link not in seen: seen[link] = True + if link.target is None or link.target.operations != (): + pending.append(link) + else: + # link.target is a return or except block; make sure + # that it is really the one from 'graph' -- by patching + # 'graph' if necessary. 
+ if len(link.target.inputargs) == 1: + graph.returnblock = link.target + elif len(link.target.inputargs) == 2: + graph.exceptblock = link.target + else: + raise Exception("uh?") # the graph should be complete now; sanity-check checkgraph(graph) @@ -308,10 +323,10 @@ self.residual_operations.append(op) def residualize(self, op, args_a, constant_op=None): - RESULT = op.result.concretetype - if RESULT is lltype.Void: - return XXX_later if constant_op: + RESULT = op.result.concretetype + if RESULT is lltype.Void: + return ll_no_return_value a_result = self.constantfold(constant_op, args_a) if a_result is not None: return a_result @@ -333,6 +348,15 @@ def op_int_mul(self, op, a1, a2): return self.residualize(op, [a1, a2], operator.mul) + def op_int_and(self, op, a1, a2): + return self.residualize(op, [a1, a2], operator.and_) + + def op_int_rshift(self, op, a1, a2): + return self.residualize(op, [a1, a2], operator.rshift) + + def op_int_neg(self, op, a1): + return self.residualize(op, [a1], operator.neg) + def op_int_gt(self, op, a1, a2): return self.residualize(op, [a1, a2], operator.gt) @@ -409,4 +433,14 @@ constant_op = operator.getitem return self.residualize(op, [a_ptr, a_index], constant_op) - + def op_malloc(self, op, a_T): + return self.residualize(op, [a_T]) + + def op_malloc_varsize(self, op, a_T, a_size): + return self.residualize(op, [a_T, a_size]) + + def op_setfield(self, op, a_ptr, a_attrname, a_value): + return self.residualize(op, [a_ptr, a_attrname, a_value]) + + def op_setarrayitem(self, op, a_ptr, a_index, a_value): + return self.residualize(op, [a_ptr, a_index, a_value]) Added: pypy/dist/pypy/jit/test/test_jit_tl.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/jit/test/test_jit_tl.py Fri Dec 9 18:02:24 2005 @@ -0,0 +1,28 @@ +# "coughcoughcough" applies to most of this file + +from pypy.translator.translator import TranslationContext +from pypy.jit import tl +from 
pypy.jit.llabstractinterp import LLAbstractInterp +from pypy.rpython.rstr import string_repr + + +def jit_tl(code): + t = TranslationContext() + t.buildannotator().build_types(tl.interp, [str, int]) + rtyper = t.buildrtyper() + rtyper.specialize() + graph1 = t.graphs[0] + + interp = LLAbstractInterp() + hints = {graph1.getargs()[0]: string_repr.convert_const(code), + graph1.getargs()[1]: 0} + + graph2 = interp.eval(graph1, hints) + graph2.show() + + +def INPROGRESS_test_jit_tl_1(): + code = tl.compile(""" + PUSH 42 + """) + jit_tl(code) From arigo at codespeak.net Fri Dec 9 18:04:30 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 9 Dec 2005 18:04:30 +0100 (CET) Subject: [pypy-svn] r20954 - pypy/dist/pypy/rpython Message-ID: <20051209170430.4C49B27DE5@code1.codespeak.net> Author: arigo Date: Fri Dec 9 18:04:29 2005 New Revision: 20954 Modified: pypy/dist/pypy/rpython/rstr.py Log: (mwh, arigo) More of the same as the previous check-in, left behind by mistake. Modified: pypy/dist/pypy/rpython/rstr.py ============================================================================== --- pypy/dist/pypy/rpython/rstr.py (original) +++ pypy/dist/pypy/rpython/rstr.py Fri Dec 9 18:04:29 2005 @@ -26,8 +26,7 @@ # } STR = GcStruct('rpy_string', ('hash', Signed), - ('chars', Array(Char))) - + ('chars', Array(Char, hints={'immutable': True}))) SIGNED_ARRAY = GcArray(Signed) From cfbolz at codespeak.net Fri Dec 9 18:08:59 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Fri, 9 Dec 2005 18:08:59 +0100 (CET) Subject: [pypy-svn] r20955 - in pypy/dist/pypy/translator/backendopt: . 
test Message-ID: <20051209170859.AE24627DE5@code1.codespeak.net> Author: cfbolz Date: Fri Dec 9 18:08:58 2005 New Revision: 20955 Modified: pypy/dist/pypy/translator/backendopt/malloc.py pypy/dist/pypy/translator/backendopt/test/test_malloc.py Log: (johahn, cfbolz): prevent the malloc removal from removing instances of classes that have a __del__ method (or other structs that have a destructor registered). Modified: pypy/dist/pypy/translator/backendopt/malloc.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/malloc.py (original) +++ pypy/dist/pypy/translator/backendopt/malloc.py Fri Dec 9 18:08:58 2005 @@ -119,6 +119,16 @@ continue # ok return False + # must not remove mallocs of structures that have a RTTI with a destructor + + try: + destr_ptr = lltype.getRuntimeTypeInfo(STRUCT)._obj.destructor_funcptr + if destr_ptr: + return False + except (ValueError, AttributeError), e: + print e + pass + # success: replace each variable with a family of variables (one per field) example = STRUCT._container_example() flatnames = [] Modified: pypy/dist/pypy/translator/backendopt/test/test_malloc.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_malloc.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_malloc.py Fri Dec 9 18:08:58 2005 @@ -1,6 +1,7 @@ +import py from pypy.translator.backendopt.malloc import remove_simple_mallocs from pypy.translator.backendopt.inline import inline_function -from pypy.translator.translator import TranslationContext, graphof +from pypy.translator.translator import TranslationContext, Translator, graphof from pypy.objspace.flow.model import checkgraph, flatten, Block from pypy.rpython.llinterp import LLInterpreter @@ -109,3 +110,33 @@ keepalive_until_here(t) return s*d check(fn1, [int, int], [15, 10], 125) + +def test_dont_remove_with__del__(): + import os + delcalls = [0] + class
A(object): + nextid = 0 + def __init__(self): + self.id = self.nextid + self.nextid += 1 + + def __del__(self): + delcalls[0] += 1 + os.write(1, "__del__\n") + + def f(x=int): + a = A() + i = 0 + while i < x: + a = A() + os.write(1, str(delcalls[0]) + "\n") + i += 1 + return 1 + t = Translator(f) + t.buildannotator().build_types(f, [int]) + t.buildrtyper().specialize() + graph = graphof(t, f) + t.backend_optimizations() + op = graph.startblock.exits[0].target.exits[1].target.operations[0] + assert op.opname == "malloc" + From cfbolz at codespeak.net Fri Dec 9 18:09:33 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Fri, 9 Dec 2005 18:09:33 +0100 (CET) Subject: [pypy-svn] r20956 - pypy/dist/pypy/translator/backendopt Message-ID: <20051209170933.37C5627DE8@code1.codespeak.net> Author: cfbolz Date: Fri Dec 9 18:09:32 2005 New Revision: 20956 Modified: pypy/dist/pypy/translator/backendopt/malloc.py Log: (johahn, cfbolz): remove debug statement Modified: pypy/dist/pypy/translator/backendopt/malloc.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/malloc.py (original) +++ pypy/dist/pypy/translator/backendopt/malloc.py Fri Dec 9 18:09:32 2005 @@ -126,7 +126,6 @@ if destr_ptr: return False except (ValueError, AttributeError), e: - print e pass # success: replace each variable with a family of variables (one per field) From rxe at codespeak.net Fri Dec 9 18:10:36 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Fri, 9 Dec 2005 18:10:36 +0100 (CET) Subject: [pypy-svn] r20957 - pypy/dist/pypy/translator/c/test Message-ID: <20051209171036.243BD27DE5@code1.codespeak.net> Author: rxe Date: Fri Dec 9 18:10:35 2005 New Revision: 20957 Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py Log: Channels work! 
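The channel implementation being tested in the diff that follows uses the Stackless convention: a `balance` counter that is positive when senders are parked waiting for receivers and negative when receivers are parked waiting for senders. The real code switches C stacks; as a rough sketch of just the balance/queue bookkeeping, here is a toy version in plain Python using generators as cooperative tasks (all names are illustrative, not PyPy's API):

```python
from collections import deque

class Channel:
    """Rendezvous channel, Stackless-style: balance > 0 means senders
    are parked waiting for receivers, balance < 0 the other way round."""
    def __init__(self):
        self.balance = 0
        self.queue = deque()          # parked (task, value) pairs

class Scheduler:
    """Round-robin trampoline; tasks are generators that yield
    ('send', channel, value) or ('recv', channel, None) requests."""
    def __init__(self):
        self.ready = deque()          # (task, value to resume it with)

    def add(self, task):
        self.ready.append((task, None))

    def run(self):
        while self.ready:
            task, val = self.ready.popleft()
            try:
                kind, ch, payload = task.send(val)
            except StopIteration:
                continue              # task finished
            if kind == 'send':
                ch.balance += 1
                if ch.balance <= 0:   # a receiver is already parked
                    receiver, _ = ch.queue.popleft()
                    self.ready.append((receiver, payload))
                    self.ready.append((task, None))
                else:                 # no receiver yet: park the sender
                    ch.queue.append((task, payload))
            elif kind == 'recv':
                ch.balance -= 1
                if ch.balance >= 0:   # a sender is already parked
                    sender, value = ch.queue.popleft()
                    self.ready.append((sender, None))
                    self.ready.append((task, value))
                else:                 # no sender yet: park the receiver
                    ch.queue.append((task, None))

def producer(ch):
    for i in range(5):
        yield ('send', ch, i)

def consumer(ch, out):
    for _ in range(5):
        value = yield ('recv', ch, None)
        out.append(value)

out = []
ch = Channel()
sched = Scheduler()
sched.add(producer(ch))
sched.add(consumer(ch, out))
sched.run()
# out is now [0, 1, 2, 3, 4]; sum(out) == 10, like globals.count in the test
```

The key property, mirrored by the test below, is that a blocked task carries its data with it while parked and gets it handed back when the matching peer arrives.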
Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_tasklets.py (original) +++ pypy/dist/pypy/translator/c/test/test_tasklets.py Fri Dec 9 18:10:35 2005 @@ -39,7 +39,6 @@ s_list_of_strings.listdef.resize() t = TranslationContext() t.buildannotator().build_types(entry_point, [s_list_of_strings]) - t.view() t.buildrtyper().specialize() backend_optimizations(t) cbuilder = CStandaloneBuilder(t, entry_point, gcpolicy=gcpolicy) @@ -64,9 +63,11 @@ self.resumable = resumable def suspend(self): + # we suspend ourself self.caller = self.caller.switch() def resume(self): + # the caller resumes me self.resumable = self.resumable.switch() self.alive = self.resumable is not None @@ -75,7 +76,6 @@ Resumable.__init__(self, fn) self.name = name self.blocked = 0 - self.data = -1 # propogates round suspend-resume to tell scheduler in run() # XXX too late to think this thru @@ -107,16 +107,26 @@ t = self.queue.pop(0) t.data = value t.blocked = 0 + t.remove = False scheduler.run_immediately(t) scheduler.schedule() + + # resuming + t = getcurrent() + assert t.blocked == 0 else: t = getcurrent() assert isinstance(t, Tasklet) + t.data = value # let it wait for a receiver to come along self.queue.append(t) t.blocked = 1 schedule_remove() + + # resuming + assert t == getcurrent() + assert t.blocked == 0 def receive(self): self.balance -= 1 @@ -124,17 +134,25 @@ if self.balance >= 0: t = self.queue.pop(0) t.blocked = 0 + t.remove = False data = t.data scheduler.add_tasklet(t) return data else: - # block until ready + # queue ourself t = getcurrent() assert isinstance(t, Tasklet) self.queue.append(t) + + # block until send has reenabled me t.blocked = -1 schedule_remove() - return -1 # never reached + + # resuming + assert t == getcurrent() + assert t.blocked == 0 + + return t.data class Scheduler(object): def __init__(self): @@ -155,6 +173,7 @@ count = 0 for t in 
runnables: assert self.current_tasklet is None + self.current_tasklet = t if t.resume(): self.runnables.append(self.current_tasklet) @@ -299,30 +318,70 @@ res = wrap_stackless_function(f) assert res == '1' -def test_channels(): +def test_channel1(): ch = Channel() def f1(name): for ii in range(5): ch.send(ii) - debug("done sending") def f2(name): - for ii in range(5): - data = ch.receive() - debug("done receiving") -## while True: -## data = ch.receive() -## if data == 42: -## break -## debug("received") + #while True: + for ii in range(6): + globals.count += ch.receive() def f(): start_tasklet(Tasklet("f2", f2)) start_tasklet(Tasklet("f1", f1)) run() + return (globals.count == 10) + + res = wrap_stackless_function(f) + assert res == '1' + +def test_channel2(): + ch = Channel() + + def f1(name): + for ii in range(5): + ch.send(ii) + + def f2(name): + #while True: + for ii in range(6): + res = ch.receive() + globals.count += res + + def f(): + start_tasklet(Tasklet("f1", f1)) + start_tasklet(Tasklet("f2", f2)) + run() + return (globals.count == 10) + + res = wrap_stackless_function(f) + assert res == '1' + + +def test_channel3(): + ch = Channel() + + def f1(name): + for ii in range(5): + ch.send(ii) + + def f2(name): + #while True: + for ii in range(16): + res = ch.receive() + globals.count += res + + def f(): + start_tasklet(Tasklet("f1x", f1)) + start_tasklet(Tasklet("f1xx", f1)) + start_tasklet(Tasklet("f1xxx", f1)) + start_tasklet(Tasklet("f2", f2)) + run() + return (globals.count == 30) - return 0 - res = wrap_stackless_function(f) assert res == '1' From ludal at codespeak.net Fri Dec 9 18:17:25 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Fri, 9 Dec 2005 18:17:25 +0100 (CET) Subject: [pypy-svn] r20958 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20051209171725.09C6D27DE5@code1.codespeak.net> Author: ludal Date: Fri Dec 9 18:17:24 2005 New Revision: 20958 Modified: pypy/dist/pypy/interpreter/astcompiler/future.py Log: minor cleanup 
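The cleanup in the diff below replaces the free-standing `walk(tree, visitor)` helper with the visitor protocol `tree.accept(visitor)`, where each node dispatches to a type-specific handler on the visitor. A minimal sketch of that double-dispatch pattern (illustrative node classes, not PyPy's actual AST):

```python
class Node:
    """Base node: dispatch to visit_<ClassName> on the visitor,
    falling back to a default handler when none is defined."""
    def accept(self, visitor):
        method = getattr(visitor, 'visit_' + type(self).__name__,
                         visitor.default)
        return method(self)

class Name(Node):
    def __init__(self, id):
        self.id = id

class Num(Node):
    def __init__(self, n):
        self.n = n

class NameCollector:
    """A visitor that records every Name node it is shown."""
    def __init__(self):
        self.names = []
    def visit_Name(self, node):
        self.names.append(node.id)
    def default(self, node):
        pass

collector = NameCollector()
for node in [Name('x'), Num(1), Name('y')]:
    node.accept(collector)    # node.accept(v) instead of walk(node, v)
# collector.names is now ['x', 'y']
```

Putting the dispatch on the node rather than in a generic `walk` function keeps the traversal statically analyzable, which matters for RPython annotation.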
Modified: pypy/dist/pypy/interpreter/astcompiler/future.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/future.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/future.py Fri Dec 9 18:17:24 2005 @@ -2,7 +2,7 @@ """ from pypy.interpreter.pyparser.error import SyntaxError -from pypy.interpreter.astcompiler import ast, walk +from pypy.interpreter.astcompiler import ast def is_future(stmt): """Return true if statement is a well-formed future statement""" @@ -82,18 +82,18 @@ def find_futures(node): p1 = FutureParser() p2 = BadFutureParser() - walk(node, p1) - walk(node, p2) + node.accept( p1 ) + node.accept( p2 ) return p1.get_features() if __name__ == "__main__": import sys - from pypy.interpreter.astcompiler import parseFile, walk + from pypy.interpreter.astcompiler import parseFile for file in sys.argv[1:]: print file tree = parseFile(file) v = FutureParser() - walk(tree, v) + tree.accept(v) print v.found print From mwh at codespeak.net Fri Dec 9 18:24:23 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Fri, 9 Dec 2005 18:24:23 +0100 (CET) Subject: [pypy-svn] r20959 - in pypy/dist/pypy/jit: . test Message-ID: <20051209172423.A648A27DE5@code1.codespeak.net> Author: mwh Date: Fri Dec 9 18:24:22 2005 New Revision: 20959 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_jit_tl.py Log: delete code and the test passes! extend the test and it still passes! 
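The abstract interpreter exercised here is doing a form of partial evaluation: arguments marked as compile-time constants via `hints` get folded away, and only operations on genuinely runtime values are emitted as residual code. As a toy sketch of that idea over a flat list of three-address ops (this is not PyPy's hint/LLAbstractValue machinery; all names are made up for illustration):

```python
import operator

OPS = {'add': operator.add, 'mul': operator.mul}

def residualize(ops, consts):
    """Partially evaluate a list of (result, op, a, b) tuples.
    Operands are ints (literals) or strings (variable names); names bound
    in `consts` are compile-time constants.  Any op whose operands are all
    known is folded immediately; the rest is kept as residual code."""
    env = dict(consts)
    residual = []
    for res, op, a, b in ops:
        va = env.get(a, a)            # substitute known constants
        vb = env.get(b, b)
        if not isinstance(va, str) and not isinstance(vb, str):
            env[res] = OPS[op](va, vb)          # fold at "compile time"
        else:
            residual.append((res, op, va, vb))  # emit a runtime op
    return residual, env

ops = [('t1', 'mul', 'n', 'n'),   # n is a runtime value
       ('t2', 'add', 'k', 1),     # k is hinted as the constant 41
       ('t3', 'add', 't1', 't2')]
residual, env = residualize(ops, {'k': 41})
# t2 folds to 42; t1 and t3 survive as residual code:
# residual == [('t1', 'mul', 'n', 'n'), ('t3', 'add', 't1', 42)]
```

The commit's test checks the analogous property end to end: running the original graph and the specialized residual graph must produce the same result.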
Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Fri Dec 9 18:24:22 2005 @@ -136,19 +136,12 @@ def applyhint(self, args_a, origblock): result_a = [] - if origblock.operations == (): - # make sure args_s does *not* contain LLConcreteValues - for a in args_a: - if isinstance(a, LLConcreteValue): - a = LLRuntimeValue(orig_v=a.getvarorconst()) - result_a.append(a) - else: - # apply the hints to make more LLConcreteValues - for a, origv in zip(args_a, origblock.inputargs): - if origv in self.hints: - # use the hint, ignore the source binding - a = LLConcreteValue(self.hints[origv]) - result_a.append(a) + # apply the hints to make more LLConcreteValues + for a, origv in zip(args_a, origblock.inputargs): + if origv in self.hints: + # use the hint, ignore the source binding + a = LLConcreteValue(self.hints[origv]) + result_a.append(a) return result_a def schedule_graph(self, args_a, origgraph): Modified: pypy/dist/pypy/jit/test/test_jit_tl.py ============================================================================== --- pypy/dist/pypy/jit/test/test_jit_tl.py (original) +++ pypy/dist/pypy/jit/test/test_jit_tl.py Fri Dec 9 18:24:22 2005 @@ -4,7 +4,7 @@ from pypy.jit import tl from pypy.jit.llabstractinterp import LLAbstractInterp from pypy.rpython.rstr import string_repr - +from pypy.rpython.llinterp import LLInterpreter def jit_tl(code): t = TranslationContext() @@ -16,12 +16,17 @@ interp = LLAbstractInterp() hints = {graph1.getargs()[0]: string_repr.convert_const(code), graph1.getargs()[1]: 0} - graph2 = interp.eval(graph1, hints) - graph2.show() + + llinterp = LLInterpreter(rtyper) + result1 = llinterp.eval_graph(graph1, [string_repr.convert_const(code), 0]) + result2 = llinterp.eval_graph(graph2, []) + + assert result1 == result2 + #graph2.show() -def INPROGRESS_test_jit_tl_1(): +def 
test_jit_tl_1(): code = tl.compile(""" PUSH 42 """) From ludal at codespeak.net Fri Dec 9 18:27:27 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Fri, 9 Dec 2005 18:27:27 +0100 (CET) Subject: [pypy-svn] r20960 - in pypy/dist/pypy: interpreter/astcompiler interpreter/pyparser interpreter/stablecompiler module/symbol Message-ID: <20051209172727.4336227DE3@code1.codespeak.net> Author: ludal Date: Fri Dec 9 18:27:24 2005 New Revision: 20960 Modified: pypy/dist/pypy/interpreter/astcompiler/__init__.py pypy/dist/pypy/interpreter/astcompiler/pycodegen.py pypy/dist/pypy/interpreter/astcompiler/symbols.py pypy/dist/pypy/interpreter/pyparser/astbuilder.py pypy/dist/pypy/interpreter/pyparser/ebnfparse.py pypy/dist/pypy/interpreter/pyparser/pysymbol.py pypy/dist/pypy/interpreter/pyparser/pythonparse.py pypy/dist/pypy/interpreter/pyparser/syntaxtree.py pypy/dist/pypy/interpreter/stablecompiler/transformer.py pypy/dist/pypy/module/symbol/__init__.py Log: more refactoring towards making the grammar parser annotatable Modified: pypy/dist/pypy/interpreter/astcompiler/__init__.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/__init__.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/__init__.py Fri Dec 9 18:27:24 2005 @@ -21,6 +21,6 @@ Generates a .pyc file by compiling filename. 
""" -from transformer import parse, parseFile -from visitor import walk -from pycodegen import compile, compileFile +# from transformer import parse, parseFile +# from visitor import walk +# from pycodegen import compile, compileFile Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Fri Dec 9 18:27:24 2005 @@ -4,7 +4,8 @@ import struct import sys -from pypy.interpreter.astcompiler import ast, parse, walk, syntax +#from pypy.interpreter.astcompiler import ast, parse, walk, syntax +from pypy.interpreter.astcompiler import ast from pypy.interpreter.astcompiler import pyassem, misc, future, symbols from pypy.interpreter.astcompiler.consts import SC_LOCAL, SC_GLOBAL, \ SC_FREE, SC_CELL, SC_DEFAULT, OP_APPLY, OP_ASSIGN, OP_DELETE, OP_NONE @@ -350,7 +351,7 @@ gen = FunctionCodeGenerator(self.space, node, isLambda, self.class_name, self.get_module(), self.scopeambiguity) - walk(node.code, gen) + node.code.accept( gen ) gen.finish() self.set_lineno(node) for default in node.defaults: @@ -374,7 +375,7 @@ gen = ClassCodeGenerator(self.space, node, self.get_module(), self.scopeambiguity) - walk(node.code, gen) + node.code.accept( gen ) gen.finish() self.set_lineno(node) self.emitop_obj('LOAD_CONST', self.space.wrap(node.name) ) @@ -620,7 +621,7 @@ self.get_module(), self.scopeambiguity) inner = node.code assert isinstance(inner, ast.GenExprInner) - walk(inner, gen) + inner.accept( gen ) gen.finish() self.set_lineno(node) frees = gen.scope.get_free_vars() @@ -1378,7 +1379,6 @@ def findOp(node): """Find the op (DELETE, LOAD, STORE) in an AssTuple tree""" v = OpFinder() - # walk(node, v, verbose=0) node.accept(v) return v.op Modified: pypy/dist/pypy/interpreter/astcompiler/symbols.py ============================================================================== --- 
pypy/dist/pypy/interpreter/astcompiler/symbols.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/symbols.py Fri Dec 9 18:27:24 2005 @@ -516,7 +516,7 @@ if __name__ == "__main__": import sys - from pypy.interpreter.astcompiler import parseFile, walk + from pypy.interpreter.astcompiler import parseFile import symtable def get_names(syms): @@ -532,7 +532,7 @@ mod_names = get_names(syms) tree = parseFile(file) s = SymbolVisitor() - walk(tree, s) + tree.accept(s) # compare module-level symbols names2 = tree.scope.get_names() Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Fri Dec 9 18:27:24 2005 @@ -5,7 +5,7 @@ from grammar import BaseGrammarBuilder, AbstractContext from pypy.interpreter.astcompiler import ast, consts -import pypy.interpreter.pyparser.pysymbol as sym +from pypy.interpreter.pyparser.pysymbol import _cpython_symbols as sym import pypy.interpreter.pyparser.pytoken as tok from pypy.interpreter.pyparser.error import SyntaxError from pypy.interpreter.pyparser.parsestring import parsestr @@ -1442,55 +1442,55 @@ ASTRULES = { - sym.atom : build_atom, - sym.power : build_power, - sym.factor : build_factor, - sym.term : build_term, - sym.arith_expr : build_arith_expr, - sym.shift_expr : build_shift_expr, - sym.and_expr : build_and_expr, - sym.xor_expr : build_xor_expr, - sym.expr : build_expr, - sym.comparison : build_comparison, - sym.comp_op : build_comp_op, - sym.and_test : build_and_test, - sym.not_test : build_not_test, - sym.test : build_test, - sym.testlist : build_testlist, - sym.expr_stmt : build_expr_stmt, - sym.small_stmt : return_one, - sym.simple_stmt : build_simple_stmt, - sym.single_input : build_single_input, - sym.file_input : build_file_input, - sym.testlist_gexp : build_testlist_gexp, - sym.lambdef : build_lambdef, - sym.trailer : 
build_trailer, - sym.arglist : build_arglist, - sym.subscript : build_subscript, - sym.listmaker : build_listmaker, - sym.funcdef : build_funcdef, - sym.classdef : build_classdef, - sym.return_stmt : build_return_stmt, - sym.suite : build_suite, - sym.if_stmt : build_if_stmt, - sym.pass_stmt : build_pass_stmt, - sym.break_stmt : build_break_stmt, - sym.for_stmt : build_for_stmt, - sym.while_stmt : build_while_stmt, - sym.import_name : build_import_name, - sym.import_from : build_import_from, - sym.yield_stmt : build_yield_stmt, - sym.continue_stmt : build_continue_stmt, - sym.del_stmt : build_del_stmt, - sym.assert_stmt : build_assert_stmt, - sym.exec_stmt : build_exec_stmt, - sym.print_stmt : build_print_stmt, - sym.global_stmt : build_global_stmt, - sym.raise_stmt : build_raise_stmt, - sym.try_stmt : build_try_stmt, - sym.exprlist : build_exprlist, - sym.decorator : build_decorator, - sym.eval_input : build_eval_input, + sym['atom'] : build_atom, + sym['power'] : build_power, + sym['factor'] : build_factor, + sym['term'] : build_term, + sym['arith_expr'] : build_arith_expr, + sym['shift_expr'] : build_shift_expr, + sym['and_expr'] : build_and_expr, + sym['xor_expr'] : build_xor_expr, + sym['expr'] : build_expr, + sym['comparison'] : build_comparison, + sym['comp_op'] : build_comp_op, + sym['and_test'] : build_and_test, + sym['not_test'] : build_not_test, + sym['test'] : build_test, + sym['testlist'] : build_testlist, + sym['expr_stmt'] : build_expr_stmt, + sym['small_stmt'] : return_one, + sym['simple_stmt'] : build_simple_stmt, + sym['single_input'] : build_single_input, + sym['file_input'] : build_file_input, + sym['testlist_gexp'] : build_testlist_gexp, + sym['lambdef'] : build_lambdef, + sym['trailer'] : build_trailer, + sym['arglist'] : build_arglist, + sym['subscript'] : build_subscript, + sym['listmaker'] : build_listmaker, + sym['funcdef'] : build_funcdef, + sym['classdef'] : build_classdef, + sym['return_stmt'] : build_return_stmt, + sym['suite'] : 
build_suite, + sym['if_stmt'] : build_if_stmt, + sym['pass_stmt'] : build_pass_stmt, + sym['break_stmt'] : build_break_stmt, + sym['for_stmt'] : build_for_stmt, + sym['while_stmt'] : build_while_stmt, + sym['import_name'] : build_import_name, + sym['import_from'] : build_import_from, + sym['yield_stmt'] : build_yield_stmt, + sym['continue_stmt'] : build_continue_stmt, + sym['del_stmt'] : build_del_stmt, + sym['assert_stmt'] : build_assert_stmt, + sym['exec_stmt'] : build_exec_stmt, + sym['print_stmt'] : build_print_stmt, + sym['global_stmt'] : build_global_stmt, + sym['raise_stmt'] : build_raise_stmt, + sym['try_stmt'] : build_try_stmt, + sym['exprlist'] : build_exprlist, + sym['decorator'] : build_decorator, + sym['eval_input'] : build_eval_input, } ## Stack elements definitions ################################### Modified: pypy/dist/pypy/interpreter/pyparser/ebnfparse.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/ebnfparse.py (original) +++ pypy/dist/pypy/interpreter/pyparser/ebnfparse.py Fri Dec 9 18:27:24 2005 @@ -2,6 +2,7 @@ from grammar import BaseGrammarBuilder, Alternative, Sequence, Token, \ KleeneStar, GrammarElement, build_first_sets, EmptyToken from ebnflexer import GrammarSource +import ebnfgrammar from ebnfgrammar import GRAMMAR_GRAMMAR, sym_map from syntaxtree import AbstractSyntaxVisitor import pytoken @@ -66,6 +67,116 @@ return True + +def ebnf_handle_grammar(self, node): + for rule in node.nodes: + rule.visit(self) + # the rules are registered already + # we do a pass through the variables to detect + # terminal symbols from non terminals + for r in self.items: + for i,a in enumerate(r.args): + if a.codename in self.rules: + assert isinstance(a,Token) + r.args[i] = self.rules[a.codename] + if a.codename in self.terminals: + del self.terminals[a.codename] + # XXX .keywords also contains punctuations + self.terminals['NAME'].keywords = self.keywords + +def 
ebnf_handle_rule(self, node): + symdef = node.nodes[0].value + self.current_rule = symdef + self.current_subrule = 0 + alt = node.nodes[1] + rule = alt.visit(self) + if not isinstance(rule, Token): + rule.codename = self.symbols.add_symbol( symdef ) + self.rules[rule.codename] = rule + +def ebnf_handle_alternative(self, node): + items = [node.nodes[0].visit(self)] + items += node.nodes[1].visit(self) + if len(items) == 1 and not items[0].is_root(): + return items[0] + alt = Alternative(self.new_symbol(), items) + return self.new_item(alt) + +def ebnf_handle_sequence( self, node ): + """ """ + items = [] + for n in node.nodes: + items.append( n.visit(self) ) + if len(items)==1: + return items[0] + elif len(items)>1: + return self.new_item( Sequence( self.new_symbol(), items) ) + raise RuntimeError("Found empty sequence") + +def ebnf_handle_sequence_cont( self, node ): + """Returns a list of sequences (possibly empty)""" + return [n.visit(self) for n in node.nodes] + +def ebnf_handle_seq_cont_list(self, node): + return node.nodes[1].visit(self) + + +def ebnf_handle_symbol(self, node): + star_opt = node.nodes[1] + sym = node.nodes[0].value + terminal = self.terminals.get( sym, None ) + if not terminal: + tokencode = pytoken.tok_values.get( sym, None ) + if tokencode is None: + tokencode = self.symbols.add_symbol( sym ) + terminal = Token( tokencode ) + else: + terminal = Token( tokencode ) + self.terminals[sym] = terminal + + return self.repeat( star_opt, terminal ) + +def ebnf_handle_option( self, node ): + rule = node.nodes[1].visit(self) + return self.new_item( KleeneStar( self.new_symbol(), 0, 1, rule ) ) + +def ebnf_handle_group( self, node ): + rule = node.nodes[1].visit(self) + return self.repeat( node.nodes[3], rule ) + +def ebnf_handle_TOK_STRING( self, node ): + value = node.value + tokencode = pytoken.tok_punct.get( value, None ) + if tokencode is None: + if not py_name.match( value ): + raise RuntimeError("Unknown STRING value ('%s')" % value ) + # assume 
a keyword + tok = Token( pytoken.NAME, value ) + if value not in self.keywords: + self.keywords.append( value ) + else: + # punctuation + tok = Token( tokencode ) + return tok + +def ebnf_handle_sequence_alt( self, node ): + res = node.nodes[0].visit(self) + assert isinstance( res, GrammarElement ) + return res + +# This will setup a mapping between +# ebnf_handle_xxx functions and ebnfgrammar.xxx +ebnf_handles = {} +for name, value in globals().items(): + if name.startswith("ebnf_handle_"): + name = name[12:] + key = getattr(ebnfgrammar, name ) + ebnf_handles[key] = value + +def handle_unknown( self, node ): + raise RuntimeError("Unknown Visitor for %r" % node.name) + + class EBNFVisitor(AbstractSyntaxVisitor): def __init__(self): @@ -76,11 +187,12 @@ self.keywords = [] self.items = [] self.terminals['NAME'] = NameToken() + self.symbols = pysymbol.SymbolMapper( pysymbol._cpython_symbols.sym_name ) def new_symbol(self): rule_name = ":%s_%s" % (self.current_rule, self.current_subrule) self.current_subrule += 1 - symval = pysymbol.add_anon_symbol( rule_name ) + symval = self.symbols.add_anon_symbol( rule_name ) return symval def new_item(self, itm): @@ -88,17 +200,8 @@ return itm def visit_syntaxnode( self, node ): - """NOT RPYTHON, used only at bootstrap time anyway""" - name = sym_map[node.name] - visit_meth = getattr(self, "handle_%s" % name, None) - if visit_meth: - return visit_meth(node) - else: - print "Unknown handler for %s" %name - # helper function for nodes that have only one subnode: - if len(node.nodes) == 1: - return node.nodes[0].visit(visitor) - raise RuntimeError("Unknown Visitor for %r" % name) + visit_func = ebnf_handles.get( node.name, handle_unknown ) + return visit_func( self, node ) def visit_tokennode( self, node ): return self.visit_syntaxnode( node ) @@ -106,101 +209,6 @@ def visit_tempsyntaxnode( self, node ): return self.visit_syntaxnode( node ) - def handle_grammar(self, node): - for rule in node.nodes: - rule.visit(self) - # the rules 
are registered already - # we do a pass through the variables to detect - # terminal symbols from non terminals - for r in self.items: - for i,a in enumerate(r.args): - if a.codename in self.rules: - assert isinstance(a,Token) - r.args[i] = self.rules[a.codename] - if a.codename in self.terminals: - del self.terminals[a.codename] - # XXX .keywords also contains punctuations - self.terminals['NAME'].keywords = self.keywords - - def handle_rule(self, node): - symdef = node.nodes[0].value - self.current_rule = symdef - self.current_subrule = 0 - alt = node.nodes[1] - rule = alt.visit(self) - if not isinstance(rule, Token): - rule.codename = pysymbol.add_symbol( symdef ) - self.rules[rule.codename] = rule - - def handle_alternative(self, node): - items = [node.nodes[0].visit(self)] - items += node.nodes[1].visit(self) - if len(items) == 1 and not items[0].is_root(): - return items[0] - alt = Alternative(self.new_symbol(), items) - return self.new_item(alt) - - def handle_sequence( self, node ): - """ """ - items = [] - for n in node.nodes: - items.append( n.visit(self) ) - if len(items)==1: - return items[0] - elif len(items)>1: - return self.new_item( Sequence( self.new_symbol(), items) ) - raise SyntaxError("Found empty sequence") - - def handle_sequence_cont( self, node ): - """Returns a list of sequences (possibly empty)""" - return [n.visit(self) for n in node.nodes] - - def handle_seq_cont_list(self, node): - return node.nodes[1].visit(self) - - - def handle_symbol(self, node): - star_opt = node.nodes[1] - sym = node.nodes[0].value - terminal = self.terminals.get( sym ) - if not terminal: - tokencode = pytoken.tok_values.get( sym ) - if tokencode is None: - tokencode = pysymbol.add_symbol( sym ) - terminal = Token( tokencode ) - else: - terminal = Token( tokencode ) - self.terminals[sym] = terminal - - return self.repeat( star_opt, terminal ) - - def handle_option( self, node ): - rule = node.nodes[1].visit(self) - return self.new_item( KleeneStar( 
self.new_symbol(), 0, 1, rule ) ) - - def handle_group( self, node ): - rule = node.nodes[1].visit(self) - return self.repeat( node.nodes[3], rule ) - - def handle_TOK_STRING( self, node ): - value = node.value - tokencode = pytoken.tok_punct.get( value ) - if tokencode is None: - if not py_name.match( value ): - raise SyntaxError("Unknown STRING value ('%s')" % value ) - # assume a keyword - tok = Token( pytoken.NAME, value ) - if value not in self.keywords: - self.keywords.append( value ) - else: - # punctuation - tok = Token( tokencode ) - return tok - - def handle_sequence_alt( self, node ): - res = node.nodes[0].visit(self) - assert isinstance( res, GrammarElement ) - return res def repeat( self, star_opt, myrule ): assert isinstance( myrule, GrammarElement ) @@ -214,7 +222,7 @@ item = KleeneStar(rule_name, _min=0, rule=myrule) return self.new_item(item) else: - raise SyntaxError("Got symbol star_opt with value='%s'" + raise RuntimeError("Got symbol star_opt with value='%s'" % tok.value) return myrule @@ -245,7 +253,10 @@ vis = EBNFVisitor() node.visit(vis) return vis - + +def target_parse_grammar_text(txt): + vis = parse_grammar_text(txt) + # do nothing from pprint import pprint if __name__ == "__main__": Modified: pypy/dist/pypy/interpreter/pyparser/pysymbol.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/pysymbol.py (original) +++ pypy/dist/pypy/interpreter/pyparser/pysymbol.py Fri Dec 9 18:27:24 2005 @@ -9,48 +9,63 @@ # it's important for CPython, but I'm not so sure it's still # important here -_anoncount = -10 -_count = 0 +class SymbolMapper(object): + def __init__(self, sym_name=None ): + _anoncount = self._anoncount = -10 + _count = self._count = 0 + self.sym_name = {} + self.sym_values = {} + if sym_name is not None: + for _value, _name in sym_name.items(): + if _value<_anoncount: + _anoncount = _value + if _value>_count: + _count = _value + self.sym_values[_name] = _value + 
self.sym_name[_value] = _name + self._anoncount = _anoncount + self._count = _count + + def add_symbol( self, sym ): + assert type(sym)==str + if not sym in self.sym_values: + self._count += 1 + val = self._count + self.sym_values[sym] = val + self.sym_name[val] = sym + return val + return self.sym_values[ sym ] + + def add_anon_symbol( self, sym ): + assert type(sym)==str + if not sym in self.sym_values: + self._anoncount -= 1 + val = self._anoncount + self.sym_values[sym] = val + self.sym_name[val] = sym + return val + return self.sym_values[ sym ] + + def __getitem__(self, sym ): + """NOT RPYTHON""" + assert type(sym)==str + return self.sym_values[ sym ] + + +_cpython_symbols = SymbolMapper( symbol.sym_name ) + -sym_name = {} -sym_values = {} +# prepopulate symbol table from symbols used by CPython +for _value, _name in _cpython_symbols.sym_name.items(): + globals()[_name] = _value -for _name, _value in symbol.__dict__.items(): - if type(_value) is type(0): - _count = max(_count, _value) - -def add_symbol( sym ): - assert type(sym)==str - if not sym_values.has_key( sym ): - if hasattr(symbol, sym): - val = getattr(symbol, sym) - else: - global _count - _count += 1 - val = _count - sym_values[sym] = val - sym_name[val] = sym - globals()[sym] = val - return val - return sym_values[ sym ] - -def add_anon_symbol( sym ): - global _anoncount - assert type(sym)==str - if not sym_values.has_key( sym ): - val = _anoncount - sym_values[sym] = val - sym_name[val] = sym - _anoncount -= 1 - return val - return sym_values[ sym ] def update_symbols( parser ): """Update the symbol module according to rules in PythonParser instance : parser""" for rule in parser.rules: - add_symbol( rule ) + _cpython_symbols.add_symbol( rule ) # There is no symbol in this module until the grammar is loaded # once loaded the grammar parser will fill the mappings with the Modified: pypy/dist/pypy/interpreter/pyparser/pythonparse.py 
============================================================================== --- pypy/dist/pypy/interpreter/pyparser/pythonparse.py (original) +++ pypy/dist/pypy/interpreter/pyparser/pythonparse.py Fri Dec 9 18:27:24 2005 @@ -45,7 +45,7 @@ return self.parse_lines(lines, goal, builder, flags) def parse_lines(self, lines, goal, builder, flags=0): - goalnumber = pysymbol.sym_values[goal] + goalnumber = pysymbol._cpython_symbols.sym_values[goal] target = self.rules[goalnumber] src = Source(lines, flags) @@ -153,3 +153,8 @@ def grammar_rules( space ): return space.wrap( PYTHON_PARSER.rules ) + + +def make_rule( space, w_rule ): + rule = space.str_w( w_rule ) + Modified: pypy/dist/pypy/interpreter/pyparser/syntaxtree.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/syntaxtree.py (original) +++ pypy/dist/pypy/interpreter/pyparser/syntaxtree.py Fri Dec 9 18:27:24 2005 @@ -1,10 +1,10 @@ """SyntaxTree class definition""" -try: - from pypy.interpreter.pyparser.pysymbol import sym_values - from pypy.interpreter.pyparser.pytoken import tok_values -except ImportError: - from pysymbol import sym_values - from pytoken import tok_values +# try: +# # from pypy.interpreter.pyparser.pysymbol import sym_values +# from pypy.interpreter.pyparser.pytoken import tok_values +# except ImportError: +# # from pysymbol import sym_values +# from pytoken import tok_values class AbstractSyntaxVisitor(object): def visit_syntaxnode( self, node ): @@ -60,11 +60,13 @@ a TempSyntaxNode""" return [ self ] - def totuple(self, lineno=False ): - """returns a tuple representation of the syntax tree""" + def totuple(self, sym_values, lineno=False ): + """returns a tuple representation of the syntax tree + needs symbols+tokens value to name mapping to represent the nodes + """ symvalue = sym_values.get( self.name, (0, self.name) ) l = [ symvalue ] - l += [node.totuple(lineno) for node in self.nodes] + l += [node.totuple(lineno, 
sym_values ) for node in self.nodes] return tuple(l) @@ -102,8 +104,10 @@ assert isinstance(visitor, AbstractSyntaxVisitor) return visitor.visit_tokennode(self) - def totuple(self, lineno=False): - """returns a tuple representation of the syntax tree""" + def totuple(self, tok_values, lineno=False): + """returns a tuple representation of the syntax tree + needs symbols+tokens value to name mapping to represent the nodes + """ num = tok_values.get(self.name, -1) if num == -1: print "Unknown", self.name, self.value Modified: pypy/dist/pypy/interpreter/stablecompiler/transformer.py ============================================================================== --- pypy/dist/pypy/interpreter/stablecompiler/transformer.py (original) +++ pypy/dist/pypy/interpreter/stablecompiler/transformer.py Fri Dec 9 18:27:24 2005 @@ -33,6 +33,8 @@ import pypy.interpreter.pyparser.pytoken as token import sys +sym_name = symbol._cpython_symbols.sym_name + # transforming is requiring a lot of recursion depth so make sure we have enough if sys.getrecursionlimit()<5000: sys.setrecursionlimit(5000) @@ -113,7 +115,7 @@ def __init__(self, filename=''): self._dispatch = {} self.filename = filename - for value, name in symbol.sym_name.items(): + for value, name in sym_name.items(): if hasattr(self, name): self._dispatch[value] = getattr(self, name) self._dispatch[token.NEWLINE] = self.com_NEWLINE @@ -1481,7 +1483,7 @@ import types _names = {} -for k, v in symbol.sym_name.items(): +for k, v in sym_name.items(): _names[k] = v for k, v in token.tok_name.items(): _names[k] = v Modified: pypy/dist/pypy/module/symbol/__init__.py ============================================================================== --- pypy/dist/pypy/module/symbol/__init__.py (original) +++ pypy/dist/pypy/module/symbol/__init__.py Fri Dec 9 18:27:24 2005 @@ -24,7 +24,7 @@ from pypy.interpreter.pyparser import pysymbol sym_name = {} -for val, name in pysymbol.sym_name.items(): +for val, name in 
pysymbol._cpython_symbols.sym_name.items(): if val >= 0: Module.interpleveldefs[name] = 'space.wrap(%d)' % val sym_name[val] = name From mwh at codespeak.net Fri Dec 9 18:47:34 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Fri, 9 Dec 2005 18:47:34 +0100 (CET) Subject: [pypy-svn] r20961 - pypy/dist/pypy/jit Message-ID: <20051209174734.4A1A527DE5@code1.codespeak.net> Author: mwh Date: Fri Dec 9 18:47:33 2005 New Revision: 20961 Modified: pypy/dist/pypy/jit/llabstractinterp.py Log: (arigo, mwh) Apply a couple of especially likely-to-be-useful transformations to the partially specialized graph. Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Fri Dec 9 18:47:33 2005 @@ -3,6 +3,7 @@ from pypy.objspace.flow.model import Block, Link, FunctionGraph from pypy.objspace.flow.model import checkgraph, last_exception from pypy.rpython.lltypesystem import lltype +from pypy.translator.simplify import eliminate_empty_blocks, join_blocks class LLAbstractValue(object): @@ -219,6 +220,8 @@ raise Exception("uh?") # the graph should be complete now; sanity-check checkgraph(graph) + eliminate_empty_blocks(graph) + join_blocks(graph) def flowin(self, state): # flow in the block From rxe at codespeak.net Fri Dec 9 19:00:25 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Fri, 9 Dec 2005 19:00:25 +0100 (CET) Subject: [pypy-svn] r20962 - pypy/dist/pypy/translator/c/test Message-ID: <20051209180025.3B7D127DEF@code1.codespeak.net> Author: rxe Date: Fri Dec 9 19:00:24 2005 New Revision: 20962 Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py Log: Yet another channel test... 
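The channel tests in this series all exercise the same rendezvous bookkeeping: a `Channel` with a `balance` counter (positive when senders are parked, negative when receivers are parked) and a queue of blocked tasklets. A stand-alone sketch of that bookkeeping, using plain generators in place of stackless frames — this is an illustration with hypothetical names, not the code under test:

```python
from collections import deque

class Channel(object):
    # balance > 0: senders parked in queue; balance < 0: receivers parked
    def __init__(self):
        self.queue = deque()
        self.balance = 0

class Task(object):
    def __init__(self, gen):
        self.gen = gen
        self.data = None          # value handed to the generator on resume

runnables = deque()

def run():
    while runnables:
        t = runnables.popleft()
        try:
            kind, ch, value = t.gen.send(t.data)
        except StopIteration:
            continue              # tasklet finished
        t.data = None
        if kind == 'send':
            ch.balance += 1
            if ch.balance <= 0:               # a receiver is already waiting
                r = ch.queue.popleft()
                r.data = value                # hand the value over
                runnables.append(r)
                runnables.append(t)           # sender continues
            else:
                t.data = value                # park the sender with its value
                ch.queue.append(t)
        else:                                 # kind == 'recv'
            ch.balance -= 1
            if ch.balance >= 0:               # a sender is already parked
                s = ch.queue.popleft()
                t.data = s.data               # take the parked value
                s.data = None
                runnables.append(t)
                runnables.append(s)           # unblock the sender
            else:
                ch.queue.append(t)            # block until a sender arrives

def producer(ch, n):
    for i in range(n):
        yield ('send', ch, i)

def consumer(ch, n, out):
    for _ in range(n):
        out.append((yield ('recv', ch, None)))

ch = Channel()
received = []
runnables.append(Task(producer(ch, 5)))
runnables.append(Task(consumer(ch, 5, received)))
run()
assert received == [0, 1, 2, 3, 4]
```

The invariant the real tests check (`globals.count` reaching the expected total) falls out of the same balance accounting: every matched send/receive pair returns the balance toward zero.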
Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_tasklets.py (original) +++ pypy/dist/pypy/translator/c/test/test_tasklets.py Fri Dec 9 19:00:24 2005 @@ -25,6 +25,11 @@ globals.count = 0 def wrap_stackless_function(fn): + # ensure we have no SomeObject s + from pypy.annotation.policy import AnnotatorPolicy + annotatorpolicy = AnnotatorPolicy() + annotatorpolicy.allow_someobjects = False + from pypy.translator.translator import TranslationContext from pypy.translator.c.genc import CStandaloneBuilder from pypy.annotation.model import SomeList, SomeString @@ -38,7 +43,7 @@ s_list_of_strings = SomeList(ListDef(None, SomeString())) s_list_of_strings.listdef.resize() t = TranslationContext() - t.buildannotator().build_types(entry_point, [s_list_of_strings]) + t.buildannotator(annotatorpolicy).build_types(entry_point, [s_list_of_strings]) t.buildrtyper().specialize() backend_optimizations(t) cbuilder = CStandaloneBuilder(t, entry_point, gcpolicy=gcpolicy) @@ -385,3 +390,64 @@ res = wrap_stackless_function(f) assert res == '1' + + +def test_channel4(): + """ test with something other than int """ + + class A: + pass + + class Data(object): + pass + + class IntData(Data): + def __init__(self, d): + self.d = d + + class StringData(Data): + def __init__(self, d): + self.d = d + + class InstanceAData(Data): + def __init__(self, d): + self.d = d + + ch1 = Channel() + ch2 = Channel() + ch3 = Channel() + + def f1(name): + for ii in range(5): + ch1.send(IntData(ii)) + + def f2(name): + for ii in range(5): + ch2.send(StringData("asda")) + + def f3(name): + for ii in range(5): + ch3.send(StringData("asda")) + + def fr(name): + #while True: + for ii in range(11): + data3 = ch3.receive() + globals.count += 1 + data1 = ch1.receive() + globals.count += 1 + data2 = ch2.receive() + globals.count += 1 + + def f(): + start_tasklet(Tasklet("fr", fr)) + 
start_tasklet(Tasklet("f1", f1)) + start_tasklet(Tasklet("f2", f2)) + start_tasklet(Tasklet("f3", f3)) + run() + debug("asd %s" % globals.count) + return (globals.count == 15) + + res = wrap_stackless_function(f) + assert res == '1' + From arigo at codespeak.net Fri Dec 9 19:21:56 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 9 Dec 2005 19:21:56 +0100 (CET) Subject: [pypy-svn] r20963 - in pypy/dist/pypy/jit: . test Message-ID: <20051209182156.27D9F27DEF@code1.codespeak.net> Author: arigo Date: Fri Dec 9 19:21:53 2005 New Revision: 20963 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_jit_tl.py Log: (mwh, arigo) One more test, with one more fix: 'a_return' not being attached to 'graphstate' when the residual blocks are shared among multiple residual graphs. Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Fri Dec 9 19:21:53 2005 @@ -14,6 +14,9 @@ def __init__(self, value): self.value = value + def __repr__(self): + return '' % (self.value,) + # def __eq__(self, other): # return self.__class__ is other.__class__ and self.value == other.value # @@ -45,6 +48,9 @@ # we can share the Constant() self.copy_v = orig_v + def __repr__(self): + return '' % (self.copy_v,) + def getconcretetype(self): return self.copy_v.concretetype @@ -213,6 +219,7 @@ # that it is really the one from 'graph' -- by patching # 'graph' if necessary. if len(link.target.inputargs) == 1: + self.graphstate.a_return = state.args_a[0] graph.returnblock = link.target elif len(link.target.inputargs) == 2: graph.exceptblock = link.target @@ -275,7 +282,6 @@ # they are linked to the official return or except block of the # copygraph. If needed, LLConcreteValues are turned into Constants. 
if len(origblock.inputargs) == 1: - self.graphstate.a_return = bindings[origblock.inputargs[0]] target = self.graphstate.copygraph.returnblock else: XXX_later @@ -387,9 +393,11 @@ graphstate, args_a = self.interp.schedule_graph( args_a, origgraph) if graphstate.state != "during": + print 'ENTERING', graphstate.copygraph.name, args_a graphstate.complete(self.interp) if isinstance(graphstate.a_return, LLConcreteValue): a_result = graphstate.a_return + print 'LEAVING', graphstate.copygraph.name, graphstate.a_return origfptr = v_func.value ARGS = [] Modified: pypy/dist/pypy/jit/test/test_jit_tl.py ============================================================================== --- pypy/dist/pypy/jit/test/test_jit_tl.py (original) +++ pypy/dist/pypy/jit/test/test_jit_tl.py Fri Dec 9 19:21:53 2005 @@ -26,8 +26,17 @@ #graph2.show() -def test_jit_tl_1(): - code = tl.compile(""" - PUSH 42 - """) +def run_jit(code): + code = tl.compile(code) jit_tl(code) + + +def test_jit_tl_1(): + for code in [ + ''' PUSH 42 + ''', + ''' PUSH 6 + PUSH 7 + ADD + ''']: + yield run_jit, code From arigo at codespeak.net Fri Dec 9 19:22:49 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 9 Dec 2005 19:22:49 +0100 (CET) Subject: [pypy-svn] r20964 - pypy/dist/pypy/jit/test Message-ID: <20051209182249.1778627DEF@code1.codespeak.net> Author: arigo Date: Fri Dec 9 19:22:47 2005 New Revision: 20964 Modified: pypy/dist/pypy/jit/test/test_tl.py Log: No reversed() in Python 2.3. 
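The r20964 fix swaps `reversed()` (which only appeared in Python 2.4) for an extended slice that already works on 2.3. The two spellings agree on element order but differ in kind: the slice builds a fresh list, while `reversed()` returns a one-shot iterator. A quick check:

```python
values = [7, 8, 9]

# values[::-1] copies the list in reverse order; reversed() (Python 2.4+)
# yields the same elements lazily without copying.
assert values[::-1] == [9, 8, 7]
assert list(reversed(values)) == values[::-1]
assert values == [7, 8, 9]   # neither form mutates the original
```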
Modified: pypy/dist/pypy/jit/test/test_tl.py ============================================================================== --- pypy/dist/pypy/jit/test/test_tl.py (original) +++ pypy/dist/pypy/jit/test/test_tl.py Fri Dec 9 19:22:47 2005 @@ -63,7 +63,7 @@ def test_pick(): values = [7, 8, 9] code = [] - for v in reversed(values): + for v in values[::-1]: code.extend([PUSH, v]) for i, v in enumerate(values): From rxe at codespeak.net Fri Dec 9 19:30:10 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Fri, 9 Dec 2005 19:30:10 +0100 (CET) Subject: [pypy-svn] r20965 - pypy/dist/pypy/translator/js/test Message-ID: <20051209183010.7CD8527DEF@code1.codespeak.net> Author: rxe Date: Fri Dec 9 19:30:09 2005 New Revision: 20965 Added: pypy/dist/pypy/translator/js/test/test_tasklets.py (contents, props changed) Log: Add bunch of failing tests from genc. Added: pypy/dist/pypy/translator/js/test/test_tasklets.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/js/test/test_tasklets.py Fri Dec 9 19:30:09 2005 @@ -0,0 +1,430 @@ +import os + +from pypy.rpython.memory.lladdress import NULL +from pypy.rpython.rstack import yield_current_frame_to_caller + + +# ____________________________________________________________ + +def wrap_stackless_function(fn): + from pypy.translator.js.test.runtest import compile_function + jsfn = compile_function(fn, [], stackless=True) + return str(jsfn()) + "\n" +# ____________________________________________________________ +# For testing +debug_flag = True + +# count of loops in tests (set lower to speed up) +loops = 10 + +def debug(s): + if debug_flag: + os.write(2, "%s\n" % s) + +class Globals: + def __init__(self): + pass + +globals = Globals() +globals.count = 0 + + +# ____________________________________________________________ + +class Resumable(object): + def __init__(self, fn): + self.fn = fn + self.alive = False + + def start(self): + self.caller = 
yield_current_frame_to_caller() + self.fn(self.name) + return self.caller + + def set_resumable(self, resumable): + self.resumable = resumable + + def suspend(self): + # we suspend ourself + self.caller = self.caller.switch() + + def resume(self): + # the caller resumes me + self.resumable = self.resumable.switch() + self.alive = self.resumable is not None + +class Tasklet(Resumable): + def __init__(self, name, fn): + Resumable.__init__(self, fn) + self.name = name + self.blocked = 0 + + # propogates round suspend-resume to tell scheduler in run() + # XXX too late to think this thru + self.remove = False + + def suspend_and_remove(self, remove): + self.remove = remove + self.suspend() + + def resume(self): + assert not self.remove + Resumable.resume(self) + + # not sure what to do with alive yetXXX + + #XXX arggh - why NOT?? + #if not alive: + # self.caller = # None / NULL + return self.alive and not self.remove + +class Channel: + def __init__(self): + self.queue = [] + self.balance = 0 + + def send(self, value): + self.balance += 1 + if self.balance <= 0: + t = self.queue.pop(0) + t.data = value + t.blocked = 0 + t.remove = False + scheduler.run_immediately(t) + scheduler.schedule() + + # resuming + t = getcurrent() + assert t.blocked == 0 + + else: + t = getcurrent() + assert isinstance(t, Tasklet) + t.data = value + # let it wait for a receiver to come along + self.queue.append(t) + t.blocked = 1 + schedule_remove() + + # resuming + assert t == getcurrent() + assert t.blocked == 0 + + def receive(self): + self.balance -= 1 + # good to go + if self.balance >= 0: + t = self.queue.pop(0) + t.blocked = 0 + t.remove = False + data = t.data + scheduler.add_tasklet(t) + return data + else: + # queue ourself + t = getcurrent() + assert isinstance(t, Tasklet) + self.queue.append(t) + + # block until send has reenabled me + t.blocked = -1 + schedule_remove() + + # resuming + assert t == getcurrent() + assert t.blocked == 0 + + return t.data + +class Scheduler(object): + 
def __init__(self): + self.runnables = [] + self.current_tasklet = None + self.immediately_schedule = None + + def add_tasklet(self, tasklet): + self.runnables.append(tasklet) + + def run_immediately(self, tasklet): + self.immediately_schedule = tasklet + + def run(self): + while self.runnables: + runnables = self.runnables + self.runnables = [] + count = 0 + for t in runnables: + assert self.current_tasklet is None + + self.current_tasklet = t + if t.resume(): + self.runnables.append(self.current_tasklet) + self.current_tasklet = None + count += 1 + + if self.immediately_schedule: + self.runnables = [self.immediately_schedule] \ + + runnables[count:] + self.runnables + self.immediately_schedule = None + break + + def schedule(self, remove=False): + assert self.current_tasklet is not None + self.current_tasklet.suspend_and_remove(remove) + +# ____________________________________________________________ + +scheduler = Scheduler() +def start_tasklet(tasklet): + res = tasklet.start() + tasklet.set_resumable(res) + scheduler.add_tasklet(tasklet) + +def start_tasklet_now(tasklet): + res = tasklet.start() + tasklet.set_resumable(res) + scheduler.run_immediately(tasklet) + +def schedule(): + scheduler.schedule() + +def schedule_remove(): + scheduler.schedule(remove=True) + +def run(): + scheduler.run() + +def getcurrent(): + return scheduler.current_tasklet + +# ____________________________________________________________ + +def test_simple(): + + def simple(name): + for ii in range(5): + globals.count += 1 + schedule() + + def f(): + for ii in range(loops): + start_tasklet(Tasklet("T%s" % ii, simple)) + run() + return globals.count == loops * 5 + + res = wrap_stackless_function(f) + assert res == '1' + +def test_multiple_simple(): + + def simple(name): + for ii in range(5): + globals.count += 1 + schedule() + + def simple2(name): + for ii in range(5): + globals.count += 1 + schedule() + globals.count += 1 + + def simple3(name): + schedule() + for ii in range(10): + 
globals.count += 1 + if ii % 2: + schedule() + schedule() + + def f(): + for ii in range(loops): + start_tasklet(Tasklet("T1%s" % ii, simple)) + start_tasklet(Tasklet("T2%s" % ii, simple2)) + start_tasklet(Tasklet("T3%s" % ii, simple3)) + run() + return globals.count == loops * 25 + + res = wrap_stackless_function(f) + assert res == '1' + +def test_schedule_remove(): + + def simple(name): + for ii in range(20): + if ii < 10: + schedule() + else: + schedule_remove() + globals.count += 1 + + def f(): + for ii in range(loops): + start_tasklet(Tasklet("T%s" % ii, simple)) + run() + for ii in range(loops): + start_tasklet(Tasklet("T%s" % ii, simple)) + run() + return globals.count == loops * 10 * 2 + + res = wrap_stackless_function(f) + assert res == '1' + +def test_run_immediately(): + globals.intermediate = 0 + globals.count = 0 + def simple(name): + for ii in range(20): + globals.count += 1 + schedule() + + def run_immediately(name): + globals.intermediate = globals.count + schedule() + + def simple2(name): + for ii in range(20): + globals.count += 1 + if ii == 10: + start_tasklet_now(Tasklet("intermediate", run_immediately)) + schedule() + + def f(): + start_tasklet(Tasklet("simple2", simple2)) + for ii in range(loops): + start_tasklet(Tasklet("T%s" % ii, simple)) + run() + total_expected = (loops + 1) * 20 + return (globals.intermediate == total_expected / 2 + 1 and + globals.count == total_expected) + + res = wrap_stackless_function(f) + assert res == '1' + +def test_channel1(): + ch = Channel() + + def f1(name): + for ii in range(5): + ch.send(ii) + + def f2(name): + #while True: + for ii in range(6): + globals.count += ch.receive() + + def f(): + start_tasklet(Tasklet("f2", f2)) + start_tasklet(Tasklet("f1", f1)) + run() + return (globals.count == 10) + + res = wrap_stackless_function(f) + assert res == '1' + +def test_channel2(): + ch = Channel() + + def f1(name): + for ii in range(5): + ch.send(ii) + + def f2(name): + #while True: + for ii in range(6): + res = 
ch.receive() + globals.count += res + + def f(): + start_tasklet(Tasklet("f1", f1)) + start_tasklet(Tasklet("f2", f2)) + run() + return (globals.count == 10) + + res = wrap_stackless_function(f) + assert res == '1' + + +def test_channel3(): + ch = Channel() + + def f1(name): + for ii in range(5): + ch.send(ii) + + def f2(name): + #while True: + for ii in range(16): + res = ch.receive() + globals.count += res + + def f(): + start_tasklet(Tasklet("f1x", f1)) + start_tasklet(Tasklet("f1xx", f1)) + start_tasklet(Tasklet("f1xxx", f1)) + start_tasklet(Tasklet("f2", f2)) + run() + return (globals.count == 30) + + res = wrap_stackless_function(f) + assert res == '1' + + +def test_channel4(): + """ test with something other than int """ + + class A: + pass + + class Data(object): + pass + + class IntData(Data): + def __init__(self, d): + self.d = d + + class StringData(Data): + def __init__(self, d): + self.d = d + + class InstanceAData(Data): + def __init__(self, d): + self.d = d + + ch1 = Channel() + ch2 = Channel() + ch3 = Channel() + + def f1(name): + for ii in range(5): + ch1.send(IntData(ii)) + + def f2(name): + for ii in range(5): + ch2.send(StringData("asda")) + + def f3(name): + for ii in range(5): + ch3.send(StringData("asda")) + + def fr(name): + #while True: + for ii in range(11): + data3 = ch3.receive() + globals.count += 1 + data1 = ch1.receive() + globals.count += 1 + data2 = ch2.receive() + globals.count += 1 + + def f(): + start_tasklet(Tasklet("fr", fr)) + start_tasklet(Tasklet("f1", f1)) + start_tasklet(Tasklet("f2", f2)) + start_tasklet(Tasklet("f3", f3)) + run() + debug("asd %s" % globals.count) + return (globals.count == 15) + + res = wrap_stackless_function(f) + assert res == '1' + From ale at codespeak.net Fri Dec 9 19:36:36 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Fri, 9 Dec 2005 19:36:36 +0100 (CET) Subject: [pypy-svn] r20966 - pypy/dist/pypy/translator/c/src Message-ID: <20051209183636.5523327DEF@code1.codespeak.net> Author: 
ale Date: Fri Dec 9 19:36:35 2005 New Revision: 20966 Modified: pypy/dist/pypy/translator/c/src/ll__socket.h Log: (nik, ale) Oops, sheer luck made the test pass - this should be more robust Modified: pypy/dist/pypy/translator/c/src/ll__socket.h ============================================================================== --- pypy/dist/pypy/translator/c/src/ll__socket.h (original) +++ pypy/dist/pypy/translator/c/src/ll__socket.h Fri Dec 9 19:36:35 2005 @@ -97,7 +97,9 @@ // XXX For some reason the errno attribute of the OSError is not set // at interpreter level. Investigate ... RPYTHON_RAISE_OSERROR(errno); + return -1; } + return fd; } void LL__socket_connect(int fd, RPyString *host, int port) From nik at codespeak.net Fri Dec 9 20:02:02 2005 From: nik at codespeak.net (nik at codespeak.net) Date: Fri, 9 Dec 2005 20:02:02 +0100 (CET) Subject: [pypy-svn] r20967 - in pypy/dist/pypy: module/_socket translator/c/src translator/c/test Message-ID: <20051209190202.A9DA127DEF@code1.codespeak.net> Author: nik Date: Fri Dec 9 20:02:00 2005 New Revision: 20967 Modified: pypy/dist/pypy/module/_socket/interp_socket.py pypy/dist/pypy/translator/c/src/ll__socket.h pypy/dist/pypy/translator/c/test/test_ext__socket.py Log: (ale, nik) raising errors out of C socket helpers. they are OSErrors at the moment, because socket.errors can't be raised from C. must be investigated at some later point ...
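The two socket commits above funnel C-level failures through `RPYTHON_RAISE_OSERROR(errno)`: the helper captures `errno` and surfaces it as an `OSError` carrying `.errno` and `.strerror`. A Python-level sketch of that same mapping — the helper name here is hypothetical, chosen only to mirror what the C macro does:

```python
import errno
import os

def raise_os_error(error_number):
    # What RPYTHON_RAISE_OSERROR(errno) amounts to on the C side: wrap the
    # saved errno value in an OSError with its standard message text.
    raise OSError(error_number, os.strerror(error_number))

try:
    raise_os_error(errno.ECONNREFUSED)
except OSError as e:
    assert e.errno == errno.ECONNREFUSED
    assert e.strerror == os.strerror(errno.ECONNREFUSED)
```

As the log notes, raising `socket.error` instead of a bare `OSError` from the C helpers was still an open problem at this point; the interpreter-level code in r20967 re-wraps the `OSError` into a socket error afterwards.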
Modified: pypy/dist/pypy/module/_socket/interp_socket.py ============================================================================== --- pypy/dist/pypy/module/_socket/interp_socket.py (original) +++ pypy/dist/pypy/module/_socket/interp_socket.py Fri Dec 9 20:02:00 2005 @@ -709,6 +709,9 @@ port = space.int_w(addr_w[1]) try: rsocket.connect(self.fd, host, port) + except OSError, ex: + raise w_get_socketerror(space, e.strerror, e.errno) + # XXX timeout doesn't really work at the moment except socket.timeout: raise wrap_timeouterror(space) except socket.error, e: Modified: pypy/dist/pypy/translator/c/src/ll__socket.h ============================================================================== --- pypy/dist/pypy/translator/c/src/ll__socket.h (original) +++ pypy/dist/pypy/translator/c/src/ll__socket.h Fri Dec 9 20:02:00 2005 @@ -108,12 +108,16 @@ if (setipaddr(RPyString_AsString(host), (struct sockaddr *) &addr, sizeof(addr), AF_INET) < 0) { - // XXX raise some error here + // XXX we actually want to raise socket.error + RPYTHON_RAISE_OSERROR(errno); + return NULL; } addr.sin_family = AF_INET; addr.sin_port = htons(port); if (connect(fd, &addr, sizeof(addr)) < 0) { - // XXX raise some error here + // XXX we actually want to raise socket.error + RPYTHON_RAISE_OSERROR(errno); + return NULL; } } Modified: pypy/dist/pypy/translator/c/test/test_ext__socket.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_ext__socket.py (original) +++ pypy/dist/pypy/translator/c/test/test_ext__socket.py Fri Dec 9 20:02:00 2005 @@ -103,3 +103,19 @@ f1 = compile(does_stuff, []) res = f1() assert res == 80 + +def test_connect_error(): + from pypy.module._socket.rpython import rsocket + import os + tests = [ + ("blablablablabla", 80), + ("127.0.0.1", 909090), + ("127.0.0.1", -2), + ] + def does_stuff(host, port): + fd = rsocket.newsocket(_socket.AF_INET, _socket.SOCK_STREAM, 0) + rsocket.connect(fd, host, port) + 
os.close(fd) + f1 = compile(does_stuff, [str, int]) + for args in tests: + py.test.raises(OSError, f1, *args) From ac at codespeak.net Fri Dec 9 20:14:29 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Fri, 9 Dec 2005 20:14:29 +0100 (CET) Subject: [pypy-svn] r20968 - pypy/dist/pypy/interpreter/pyparser/test Message-ID: <20051209191429.020C527DEF@code1.codespeak.net> Author: ac Date: Fri Dec 9 20:14:29 2005 New Revision: 20968 Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py Log: Fix code_comparison. Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py Fri Dec 9 20:14:29 2005 @@ -117,7 +117,7 @@ assert len(ac_code.co_consts) == len(sc_code.co_consts) for c1, c2 in zip( ac_code.co_consts, sc_code.co_consts ): - if type(c1)==PyCode: + if isinstance(c1, PyCode): return compare_code( c1, c2, space ) else: assert c1 == c2 From cfbolz at codespeak.net Fri Dec 9 20:20:02 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Fri, 9 Dec 2005 20:20:02 +0100 (CET) Subject: [pypy-svn] r20969 - in pypy/dist/pypy/rpython: lltypesystem test Message-ID: <20051209192002.E0DE427DEF@code1.codespeak.net> Author: cfbolz Date: Fri Dec 9 20:20:01 2005 New Revision: 20969 Modified: pypy/dist/pypy/rpython/lltypesystem/rclass.py pypy/dist/pypy/rpython/test/test_rclass.py Log: (johahn, cfbolz): fixed argument types of the destructor function Modified: pypy/dist/pypy/rpython/lltypesystem/rclass.py ============================================================================== --- pypy/dist/pypy/rpython/lltypesystem/rclass.py (original) +++ pypy/dist/pypy/rpython/lltypesystem/rclass.py Fri Dec 9 20:20:01 2005 @@ -339,10 +339,13 @@ if (self.classdef is not None and self.classdef.classdesc.lookup('__del__') is not 
None): s_func = self.classdef.classdesc.s_read_attribute('__del__') + source_desc = self.classdef.classdesc.lookup('__del__') + source_classdef = source_desc.getclassdef(None) + source_repr = getinstancerepr(self.rtyper, source_classdef) assert len(s_func.descriptions) == 1 funcdesc = s_func.descriptions.keys()[0] graph = funcdesc.cachedgraph(None) - FUNCTYPE = FuncType([Ptr(self.object_type)], Void) + FUNCTYPE = FuncType([Ptr(source_repr.object_type)], Void) destrptr = functionptr(FUNCTYPE, graph.name, graph=graph, _callable=graph.func) Modified: pypy/dist/pypy/rpython/test/test_rclass.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rclass.py (original) +++ pypy/dist/pypy/rpython/test/test_rclass.py Fri Dec 9 20:20:01 2005 @@ -387,5 +387,48 @@ destrptr = RTTI._obj.destructor_funcptr assert destrptr is not None - - + +def test_del_inheritance(): + class State: + pass + s = State() + s.a_dels = 0 + s.b_dels = 0 + class A(object): + def __del__(self): + s.a_dels += 1 + class B(A): + def __del__(self): + s.b_dels += 1 + class C(A): + pass + def f(): + A() + B() + C() + A() + B() + C() + return s.a_dels * 10 + s.b_dels + res = f() + assert res == 42 + t = TranslationContext() + t.buildannotator().build_types(f, []) + t.buildrtyper().specialize() + graph = graphof(t, f) + TYPEA = graph.startblock.operations[0].args[0].value + RTTIA = getRuntimeTypeInfo(TYPEA) + TYPEB = graph.startblock.operations[3].args[0].value + RTTIB = getRuntimeTypeInfo(TYPEB) + TYPEC = graph.startblock.operations[6].args[0].value + RTTIC = getRuntimeTypeInfo(TYPEC) + queryptra = RTTIA._obj.query_funcptr # should not raise + queryptrb = RTTIB._obj.query_funcptr # should not raise + queryptrc = RTTIC._obj.query_funcptr # should not raise + destrptra = RTTIA._obj.destructor_funcptr + destrptrb = RTTIB._obj.destructor_funcptr + destrptrc = RTTIC._obj.destructor_funcptr + assert destrptra == destrptrc + assert 
typeOf(destrptra).TO.ARGS[0] != typeOf(destrptrb).TO.ARGS[0] + assert destrptra is not None + assert destrptrb is not None From nik at codespeak.net Fri Dec 9 20:25:48 2005 From: nik at codespeak.net (nik at codespeak.net) Date: Fri, 9 Dec 2005 20:25:48 +0100 (CET) Subject: [pypy-svn] r20970 - in pypy/dist/pypy: module/_socket module/_socket/rpython translator/c/src translator/c/test Message-ID: <20051209192548.81A4927DF3@code1.codespeak.net> Author: nik Date: Fri Dec 9 20:25:46 2005 New Revision: 20970 Modified: pypy/dist/pypy/module/_socket/interp_socket.py pypy/dist/pypy/module/_socket/rpython/ll__socket.py pypy/dist/pypy/module/_socket/rpython/rsocket.py pypy/dist/pypy/translator/c/src/ll__socket.h pypy/dist/pypy/translator/c/test/test_ext__socket.py Log: (ale, nik) make our low-level socket.connect() take a sockname argument to be able to add IPv6 support in the future. Modified: pypy/dist/pypy/module/_socket/interp_socket.py ============================================================================== --- pypy/dist/pypy/module/_socket/interp_socket.py (original) +++ pypy/dist/pypy/module/_socket/interp_socket.py Fri Dec 9 20:25:46 2005 @@ -707,18 +707,19 @@ space.wrap("tuple of a string and an int required")) host = space.str_w(addr_w[0]) port = space.int_w(addr_w[1]) - try: - rsocket.connect(self.fd, host, port) - except OSError, e: - raise w_get_socketerror(space, e.strerror, e.errno) - # XXX timeout doesn't really work at the moment - except socket.timeout: - raise wrap_timeouterror(space) - except socket.error, e: - raise wrap_socketerror(space, e) + sockname = (host, port, 0, 0) else: # XXX IPv6 and Unix sockets missing here pass + try: + rsocket.connect(self.fd, sockname) + except OSError, e: + raise w_get_socketerror(space, e.strerror, e.errno) + # XXX timeout doesn't really work at the moment + except socket.timeout: + raise wrap_timeouterror(space) + except socket.error, e: + raise wrap_socketerror(space, e) connect.unwrap_spec = ['self', ObjSpace, 
W_Root] def connect_ex(self, space, w_addr): Modified: pypy/dist/pypy/module/_socket/rpython/ll__socket.py ============================================================================== --- pypy/dist/pypy/module/_socket/rpython/ll__socket.py (original) +++ pypy/dist/pypy/module/_socket/rpython/ll__socket.py Fri Dec 9 20:25:46 2005 @@ -95,7 +95,7 @@ return 0 ll__socket_newsocket.suggested_primitive = True -def ll__socket_connect(fd, host, port): +def ll__socket_connect(fd, sockname): return None ll__socket_connect.suggested_primitive = True Modified: pypy/dist/pypy/module/_socket/rpython/rsocket.py ============================================================================== --- pypy/dist/pypy/module/_socket/rpython/rsocket.py (original) +++ pypy/dist/pypy/module/_socket/rpython/rsocket.py Fri Dec 9 20:25:46 2005 @@ -40,13 +40,9 @@ socket_cache[fileno] = s return fileno -def connect(fd, host, port): - # XXX IPv4 only +def connect(fd, sockname): s = socket_cache[fd] - try: - s.connect((host, port)) - except Exception, ex: - print ex + s.connect(sockname[:2]) # XXX IPv4 only def getpeername(fd): s = socket_cache[fd] Modified: pypy/dist/pypy/translator/c/src/ll__socket.h ============================================================================== --- pypy/dist/pypy/translator/c/src/ll__socket.h (original) +++ pypy/dist/pypy/translator/c/src/ll__socket.h Fri Dec 9 20:25:46 2005 @@ -102,18 +102,18 @@ return fd; } -void LL__socket_connect(int fd, RPyString *host, int port) +void LL__socket_connect(int fd, RPySOCKET_SOCKNAME* sockname) { struct sockaddr_in addr; - if (setipaddr(RPyString_AsString(host), (struct sockaddr *) &addr, + if (setipaddr(RPyString_AsString(sockname->t_item0), (struct sockaddr *) &addr, sizeof(addr), AF_INET) < 0) { // XXX we actually want to raise socket.error RPYTHON_RAISE_OSERROR(errno); return NULL; } addr.sin_family = AF_INET; - addr.sin_port = htons(port); + addr.sin_port = htons(sockname->t_item1); if (connect(fd, &addr, sizeof(addr)) < 0) 
{ // XXX we actually want to raise socket.error RPYTHON_RAISE_OSERROR(errno); Modified: pypy/dist/pypy/translator/c/test/test_ext__socket.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_ext__socket.py (original) +++ pypy/dist/pypy/translator/c/test/test_ext__socket.py Fri Dec 9 20:25:46 2005 @@ -96,7 +96,7 @@ def does_stuff(): fd = rsocket.newsocket(_socket.AF_INET, _socket.SOCK_STREAM, 0) # XXX need to think of a test without connecting to outside servers - rsocket.connect(fd, "codespeak.net", 80) + rsocket.connect(fd, ("codespeak.net", 80, 0, 0)) sockname = rsocket.getpeername(fd) os.close(fd) return sockname[1] @@ -114,7 +114,7 @@ ] def does_stuff(host, port): fd = rsocket.newsocket(_socket.AF_INET, _socket.SOCK_STREAM, 0) - rsocket.connect(fd, host, port) + rsocket.connect(fd, (host, port, 0, 0)) os.close(fd) f1 = compile(does_stuff, [str, int]) for args in tests: From cfbolz at codespeak.net Fri Dec 9 20:27:44 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Fri, 9 Dec 2005 20:27:44 +0100 (CET) Subject: [pypy-svn] r20971 - in pypy/dist/pypy/translator/c: . test Message-ID: <20051209192744.6B3B027DF6@code1.codespeak.net> Author: cfbolz Date: Fri Dec 9 20:27:43 2005 New Revision: 20971 Modified: pypy/dist/pypy/translator/c/gc.py pypy/dist/pypy/translator/c/test/test_backendoptimized.py Log: (johahn, cfbolz) support __del__ in genc if using refcounting. 
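The behaviour this commit adds to genc can be illustrated in plain CPython, which is also reference-counted: an object's `__del__` runs as soon as its last reference disappears, and a subclass destructor fully overrides the parent's. This is only an illustrative sketch of what the new `test_del_inheritance` checks (written for modern Python 3, unlike the 2.4-era code in the diffs), not the generated C code:

```python
class State:
    """Shared counters, since __del__ cannot easily return values."""
    a_dels = 0
    b_dels = 0

s = State()

class A(object):
    def __del__(self):
        s.a_dels += 1          # also inherited by C

class B(A):
    def __del__(self):
        s.b_dels += 1          # completely overrides A.__del__

class C(A):
    pass

def f():
    # each instance is dropped immediately, so under refcounting its
    # destructor runs right away, before the return statement
    A(); B(); C(); A(); B(); C()
    return s.a_dels * 10 + s.b_dels

print(f())  # 4 A/C destructor calls and 2 B destructor calls -> 42
```

With a non-refcounting collector the destructors would only run at some later collection point, which is why the translated variant of `test__del__` in the diff observes one extra deletion for the last dangling reference.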
Modified: pypy/dist/pypy/translator/c/gc.py ============================================================================== --- pypy/dist/pypy/translator/c/gc.py (original) +++ pypy/dist/pypy/translator/c/gc.py Fri Dec 9 20:27:43 2005 @@ -84,6 +84,7 @@ class RefcountingInfo: deallocator = None static_deallocator = None + destructor = None class RefcountingGcPolicy(BasicGcPolicy): @@ -184,6 +185,10 @@ gcinfo.rtti_query_funcptr = db.get(fnptr) T = typeOf(fnptr).TO.ARGS[0] gcinfo.rtti_query_funcptr_argtype = db.gettype(T) + if hasattr(rtti._obj, 'destructor_funcptr'): + destrptr = rtti._obj.destructor_funcptr + gcinfo.destructor = db.get(destrptr) + T = typeOf(destrptr).TO.ARGS[0] else: # is a deallocator really needed, or would it be empty? if list(self.deallocator_lines(structdefnode, '')): @@ -198,14 +203,32 @@ def struct_implementationcode(self, structdefnode): if structdefnode.gcinfo: gcinfo = structdefnode.gcinfo - if gcinfo.static_deallocator: + has_dynamic_deallocator = gcinfo.deallocator and gcinfo.deallocator != gcinfo.static_deallocator + if gcinfo.static_deallocator and not has_dynamic_deallocator: yield 'void %s(struct %s *p) {' % (gcinfo.static_deallocator, structdefnode.name) + # insert decrefs to objects we have a reference to for line in self.deallocator_lines(structdefnode, '(*p)'): yield '\t' + line yield '\tOP_FREE(p);' yield '}' - if gcinfo.deallocator and gcinfo.deallocator != gcinfo.static_deallocator: + elif has_dynamic_deallocator: + # write static deallocator + yield 'void %s(struct %s *p) {' % (gcinfo.static_deallocator, + structdefnode.name) + # insert call to __del__ if necessary + if gcinfo.destructor: + yield "\t%s((%s) p);" % (gcinfo.destructor, + cdecl(gcinfo.destructor_argtype, '')) + # insert decrefs to objects we have a reference to + yield '\tif (!--p->%s) {' % (structdefnode.gcheader,) + for line in self.deallocator_lines(structdefnode, '(*p)'): + yield '\t\t' + line + yield '\t\tOP_FREE(p);' + yield '\t}' + yield '}' + + # write 
dynamic deallocator yield 'void %s(struct %s *p) {' % (gcinfo.deallocator, structdefnode.name) yield '\tvoid (*staticdealloc) (void *);' # the refcount should be 0; temporarily bump it to 1 @@ -214,8 +237,7 @@ yield '\tstaticdealloc = %s((%s) p);' % ( gcinfo.rtti_query_funcptr, cdecl(gcinfo.rtti_query_funcptr_argtype, '')) - yield '\tif (!--p->%s)' % (structdefnode.gcheader,) - yield '\t\tstaticdealloc(p);' + yield '\tstaticdealloc(p);' yield '}' Modified: pypy/dist/pypy/translator/c/test/test_backendoptimized.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_backendoptimized.py (original) +++ pypy/dist/pypy/translator/c/test/test_backendoptimized.py Fri Dec 9 20:27:43 2005 @@ -7,6 +7,7 @@ def process(self, t): _TestTypedTestCase.process(self, t) + self.t = t backend_optimizations(t) def test_remove_same_as(self): @@ -18,3 +19,60 @@ fn = self.getcompiled(f) assert f(True) == 123 assert f(False) == 456 + + def test__del__(self): + import os + class B(object): + pass + b = B() + b.nextid = 0 + b.num_deleted = 0 + class A(object): + def __init__(self): + self.id = b.nextid + b.nextid += 1 + + def __del__(self): + b.num_deleted += 1 + + def f(x=int): + a = A() + for i in range(x): + a = A() + return b.num_deleted + + fn = self.getcompiled(f) + res = f(5) + assert res == 5 + res = fn(5) + # translated function looses its last reference earlier + assert res == 6 + + def test_del_inheritance(self): + class State: + pass + s = State() + s.a_dels = 0 + s.b_dels = 0 + class A(object): + def __del__(self): + s.a_dels += 1 + class B(A): + def __del__(self): + s.b_dels += 1 + class C(A): + pass + def f(): + A() + B() + C() + A() + B() + C() + return s.a_dels * 10 + s.b_dels + res = f() + assert res == 42 + fn = self.getcompiled(f) + res = fn() + assert res == 42 + From rxe at codespeak.net Fri Dec 9 20:37:14 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Fri, 9 Dec 2005 20:37:14 +0100 (CET) 
Subject: [pypy-svn] r20972 - pypy/dist/pypy/translator/llvm Message-ID: <20051209193714.8C8C827DF4@code1.codespeak.net> Author: rxe Date: Fri Dec 9 20:37:13 2005 New Revision: 20972 Modified: pypy/dist/pypy/translator/llvm/database.py pypy/dist/pypy/translator/llvm/opwriter.py Log: Attempt to use the database in a few places where we weren't. This is an attempt to fix weirdness during translating pypy-llvm, testing on snake now. Modified: pypy/dist/pypy/translator/llvm/database.py ============================================================================== --- pypy/dist/pypy/translator/llvm/database.py (original) +++ pypy/dist/pypy/translator/llvm/database.py Fri Dec 9 20:37:13 2005 @@ -323,6 +323,10 @@ def repr_name(self, obj): return self.obj2node[obj].ref + def repr_value(self, value): + # XXX Testing + return self.obj2node[value].get_ref() + # __________________________________________________________ # Primitive stuff Modified: pypy/dist/pypy/translator/llvm/opwriter.py ============================================================================== --- pypy/dist/pypy/translator/llvm/opwriter.py (original) +++ pypy/dist/pypy/translator/llvm/opwriter.py Fri Dec 9 20:37:13 2005 @@ -305,9 +305,9 @@ def last_exception_type_ptr(self, op): e = self.db.translator.rtyper.getexceptiondata() - # XXX Can we use database? - lltype_of_exception_type = ('%structtype_' + e.lltype_of_exception_type.TO.__name__ + '*') - self.codewriter.load('%'+str(op.result), lltype_of_exception_type, '%last_exception_type') + self.codewriter.load('%' + str(op.result), + self.db.repr_type(e.lltype_of_exception_type), + '%last_exception_type') def invoke(self, op): op_args = [arg for arg in op.args @@ -344,15 +344,9 @@ argtypes, none_label, exc_label) e = self.db.translator.rtyper.getexceptiondata() - ll_exception_match = '%pypy_' + e.fn_exception_match._obj._name - - # XXX Can we use database? 
- lltype_of_exception_type = ('%structtype_' + - e.lltype_of_exception_type.TO.__name__ + '*') - lltype_of_exception_value = ('%structtype_' + - e.lltype_of_exception_value.TO.__name__ - + '*') + ll_exception_match = self.db.repr_value(e.fn_exception_match._obj) + lltype_of_exception_type = self.db.repr_type(e.lltype_of_exception_type) + lltype_of_exception_value = self.db.repr_type(e.lltype_of_exception_value) self.codewriter.label(exc_label) From ericvrp at codespeak.net Fri Dec 9 20:44:04 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 9 Dec 2005 20:44:04 +0100 (CET) Subject: [pypy-svn] r20973 - in pypy/dist/pypy/translator/js: . test Message-ID: <20051209194404.3D52127DF4@code1.codespeak.net> Author: ericvrp Date: Fri Dec 9 20:44:02 2005 New Revision: 20973 Modified: pypy/dist/pypy/translator/js/database.py pypy/dist/pypy/translator/js/opwriter.py pypy/dist/pypy/translator/js/test/test_tasklets.py Log: * made 7 out of 8 tasklet tests pass in genjs (with some hacking that needs refactoring later) * fix bug where an object instead of an array was created * refactoring to eliminate creation of additional temporary variables Modified: pypy/dist/pypy/translator/js/database.py ============================================================================== --- pypy/dist/pypy/translator/js/database.py (original) +++ pypy/dist/pypy/translator/js/database.py Fri Dec 9 20:44:02 2005 @@ -154,14 +154,14 @@ assert isinstance(arg, Variable) return str(arg) - def repr_type(self, arg): - try: - node = self.obj2node.get(arg.value._obj) - if isinstance(node, ArrayNode): - return 'Array' - except: - pass - return 'Object' + #def repr_type(self, arg): + # try: + # node = self.obj2node.get(arg.value._obj) + # if isinstance(node, ArrayNode): + # return 'Array' + # except: + # pass + # return 'Object' def repr_concretetype(self, ct): #used by casts try: @@ -200,16 +200,16 @@ assert False, "%s not supported" % (type(value)) - def repr_tmpvar(self): - count = 
self._tmpcount - self._tmpcount += 1 - return "tmp_" + str(count) - - def repr_constructor(self, type_): - return self.obj2node[type_].constructor_ref - - def repr_name(self, obj): - return self.obj2node[obj].ref + #def repr_tmpvar(self): + # count = self._tmpcount + # self._tmpcount += 1 + # return "tmp_" + str(count) + # + #def repr_constructor(self, type_): + # return self.obj2node[type_].constructor_ref + # + #def repr_name(self, obj): + # return self.obj2node[obj].ref # __________________________________________________________ # Primitive stuff Modified: pypy/dist/pypy/translator/js/opwriter.py ============================================================================== --- pypy/dist/pypy/translator/js/opwriter.py (original) +++ pypy/dist/pypy/translator/js/opwriter.py Fri Dec 9 20:44:02 2005 @@ -113,7 +113,7 @@ else: res_val = mult_val for ii in range(operand - 1): - res_val = self.db.repr_tmpvar() + #res_val = self.db.repr_tmpvar() self.codewriter.binaryop('*', res_val, last_val, @@ -178,22 +178,27 @@ name = self.char_operations[op.opname] assert len(op.args) == 2 res = self.db.repr_arg(op.result) - c1 = self.db.repr_tmpvar() - c2 = self.db.repr_tmpvar() - self.codewriter.cast(c1, "sbyte", self.db.repr_arg(op.args[0]), "ubyte") - self.codewriter.cast(c2, "sbyte", self.db.repr_arg(op.args[1]), "ubyte") + if True: + c1 = self.db.repr_arg(op.args[0]) + c2 = self.db.repr_arg(op.args[1]) + else: + c1 = self.db.repr_tmpvar() + c2 = self.db.repr_tmpvar() + self.codewriter.cast(c1, "sbyte", self.db.repr_arg(op.args[0]), "ubyte") + self.codewriter.cast(c2, "sbyte", self.db.repr_arg(op.args[1]), "ubyte") self.codewriter.binaryop(name, res, c1, c2) def cast_char_to_int(self, op): " works for all casts " assert len(op.args) == 1 - targetvar = self.db.repr_arg(op.result) - #targettype = self.db.repr_arg_type(op.result) + targetvar = self.db.repr_arg(op.result) targettype = self.db.repr_concretetype(op.result.concretetype) - fromvar = self.db.repr_arg(op.args[0]) - 
#fromtype = self.db.repr_arg_type(op.args[0]) + fromvar = self.db.repr_arg(op.args[0]) fromtype = self.db.repr_concretetype(op.args[0].concretetype) - intermediate = self.db.repr_tmpvar() + if True: + intermediate = self.db.repr_arg(op.args[0]) + else: + intermediate = self.db.repr_tmpvar() self.codewriter.cast(intermediate, fromtype, fromvar, "ubyte") self.codewriter.cast(targetvar, "ubyte", intermediate, targettype) @@ -288,8 +293,16 @@ def malloc(self, op): arg_type = op.args[0].value targetvar = self.db.repr_arg(op.result) - type_ = self.db.repr_type(arg_type) + t = str(op.args[0]).split() + if t[0].endswith('Array'): #XXX ouch do I really want to go down this road + type_ = 'Array' + else: + type_ = 'Object' #self.db.repr_type(arg_type) + # self.codewriter.comment(str(arg_type)) + self.codewriter.comment(str(op.args[0])) self.codewriter.malloc(targetvar, type_) + if t[1] == 'rpy_string': + self.codewriter.append(targetvar + '.chars = ""') #XXX this should be done correctly for all types of course! 
malloc_exception = malloc malloc_varsize = malloc Modified: pypy/dist/pypy/translator/js/test/test_tasklets.py ============================================================================== --- pypy/dist/pypy/translator/js/test/test_tasklets.py (original) +++ pypy/dist/pypy/translator/js/test/test_tasklets.py Fri Dec 9 20:44:02 2005 @@ -1,3 +1,4 @@ +import py import os from pypy.rpython.memory.lladdress import NULL @@ -10,9 +11,10 @@ from pypy.translator.js.test.runtest import compile_function jsfn = compile_function(fn, [], stackless=True) return str(jsfn()) + "\n" + # ____________________________________________________________ # For testing -debug_flag = True +debug_flag = False # count of loops in tests (set lower to speed up) loops = 10 @@ -213,7 +215,7 @@ return globals.count == loops * 5 res = wrap_stackless_function(f) - assert res == '1' + assert res == "True\n" def test_multiple_simple(): @@ -245,7 +247,7 @@ return globals.count == loops * 25 res = wrap_stackless_function(f) - assert res == '1' + assert res == "True\n" def test_schedule_remove(): @@ -267,7 +269,7 @@ return globals.count == loops * 10 * 2 res = wrap_stackless_function(f) - assert res == '1' + assert res == "True\n" def test_run_immediately(): globals.intermediate = 0 @@ -298,7 +300,7 @@ globals.count == total_expected) res = wrap_stackless_function(f) - assert res == '1' + assert res == "True\n" def test_channel1(): ch = Channel() @@ -319,7 +321,7 @@ return (globals.count == 10) res = wrap_stackless_function(f) - assert res == '1' + assert res == "True\n" def test_channel2(): ch = Channel() @@ -341,10 +343,12 @@ return (globals.count == 10) res = wrap_stackless_function(f) - assert res == '1' + assert res == "True\n" def test_channel3(): + py.test.skip("would fail because of uncaught exception") + ch = Channel() def f1(name): @@ -353,7 +357,7 @@ def f2(name): #while True: - for ii in range(16): + for ii in range(6): res = ch.receive() globals.count += res @@ -366,7 +370,7 @@ return 
(globals.count == 30) res = wrap_stackless_function(f) - assert res == '1' + assert res == "True\n" def test_channel4(): @@ -426,5 +430,4 @@ return (globals.count == 15) res = wrap_stackless_function(f) - assert res == '1' - + assert res == "True\n" From arigo at codespeak.net Fri Dec 9 20:55:59 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 9 Dec 2005 20:55:59 +0100 (CET) Subject: [pypy-svn] r20974 - in pypy/dist/pypy/jit: . test Message-ID: <20051209195559.1C5D527DF4@code1.codespeak.net> Author: arigo Date: Fri Dec 9 20:55:55 2005 New Revision: 20974 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_jit_tl.py pypy/dist/pypy/jit/test/test_llabstractinterp.py pypy/dist/pypy/jit/test/test_tl.py pypy/dist/pypy/jit/tl.py pypy/dist/pypy/jit/tlopcode.py Log: (mwh, arigo) * Changed the Toy Language CALL/RETURN to use recursive calls. * Implemented some more operations, fixed bugs. * Added tests. Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Fri Dec 9 20:55:55 2005 @@ -92,10 +92,10 @@ class GraphState(object): """Entry state of a graph.""" - def __init__(self, origgraph, args_a): - super(GraphState, self).__init__(args_a) + def __init__(self, origgraph, args_a, n): self.origgraph = origgraph - self.copygraph = FunctionGraph(origgraph.name, Block([])) # grumble + name = '%s_%d' % (origgraph.name, n) + self.copygraph = FunctionGraph(name, Block([])) # grumble for orig_v, copy_v in [(origgraph.getreturnvar(), self.copygraph.getreturnvar()), (origgraph.exceptblock.inputargs[0], @@ -114,6 +114,7 @@ def complete(self, interp): assert self.state != "during" if self.state == "before": + self.state = "during" builderframe = LLAbstractFrame(interp, self) builderframe.complete() self.state = "after" @@ -157,8 +158,8 @@ try: graphstate = 
self.graphs[origgraph][state] except KeyError: - graphstate = GraphState(origgraph, args_a) d = self.graphs.setdefault(origgraph, {}) + graphstate = GraphState(origgraph, args_a, n=len(d)) d[state] = graphstate self.pendingstates[graphstate] = state #print "SCHEDULE_GRAPH", graphstate @@ -284,7 +285,6 @@ if len(origblock.inputargs) == 1: target = self.graphstate.copygraph.returnblock else: - XXX_later target = self.graphstate.copygraph.exceptblock args_v = [binding(v).getvarorconst() for v in origblock.inputargs] newlinks = [Link(args_v, target)] @@ -388,14 +388,18 @@ v_func = a_func.getvarorconst() if isinstance(v_func, Constant): fnobj = v_func.value._obj - if hasattr(fnobj, 'graph'): + if (hasattr(fnobj, 'graph') and + not getattr(fnobj._callable, 'suggested_primitive', False)): origgraph = fnobj.graph graphstate, args_a = self.interp.schedule_graph( args_a, origgraph) + #print 'SCHEDULE_GRAPH', args_a, '==>', graphstate.copygraph.name if graphstate.state != "during": print 'ENTERING', graphstate.copygraph.name, args_a graphstate.complete(self.interp) - if isinstance(graphstate.a_return, LLConcreteValue): + if (graphstate.a_return is not None and + isinstance(graphstate.a_return.getvarorconst(), + Constant)): a_result = graphstate.a_return print 'LEAVING', graphstate.copygraph.name, graphstate.a_return @@ -410,11 +414,11 @@ TYPE = lltype.FuncType( ARGS, lltype.typeOf(origfptr).TO.RESULT) fptr = lltype.functionptr( - TYPE, fnobj._name, graph=graphstate.copygraph) + TYPE, graphstate.copygraph.name, graph=graphstate.copygraph) fconst = Constant(fptr) fconst.concretetype = lltype.typeOf(fptr) a_func = LLRuntimeValue(fconst) - self.residual("direct_call", [a_func] + args_a, a_result) + self.residual("direct_call", [a_func] + list(args_a), a_result) return a_result def op_getfield(self, op, a_ptr, a_attrname): @@ -448,3 +452,8 @@ def op_setarrayitem(self, op, a_ptr, a_index, a_value): return self.residualize(op, [a_ptr, a_index, a_value]) + + def op_cast_pointer(self, 
op, a_ptr): + def constant_op(ptr): + return lltype.cast_pointer(op.result.concretetype, ptr) + return self.residualize(op, [a_ptr], constant_op) Modified: pypy/dist/pypy/jit/test/test_jit_tl.py ============================================================================== --- pypy/dist/pypy/jit/test/test_jit_tl.py (original) +++ pypy/dist/pypy/jit/test/test_jit_tl.py Fri Dec 9 20:55:55 2005 @@ -6,9 +6,14 @@ from pypy.rpython.rstr import string_repr from pypy.rpython.llinterp import LLInterpreter +def entry_point(code, pc): + # indirection needed, because the hints are not about *all* calls to + # interp() + return tl.interp(code, pc) + def jit_tl(code): t = TranslationContext() - t.buildannotator().build_types(tl.interp, [str, int]) + t.buildannotator().build_types(entry_point, [str, int]) rtyper = t.buildrtyper() rtyper.specialize() graph1 = t.graphs[0] @@ -31,12 +36,54 @@ jit_tl(code) -def test_jit_tl_1(): - for code in [ - ''' PUSH 42 - ''', - ''' PUSH 6 - PUSH 7 +def test_simple1(): + run_jit(''' PUSH 42 + ''') + +def test_simple2(): + run_jit(''' PUSH 6 + PUSH 7 + ADD + ''') + +def test_branches(): + run_jit(''' + main: + PUSH 0 + PUSH 1 + BR_COND somename + label1: + PUSH -1 + PUSH 3 + BR_COND end + somename: ; + PUSH 2 // + BR_COND label1// + end:// should return 3 + ''') + +def test_exceptions(): + run_jit(''' + PUSH 42 + PUSH -42 + ROT 2 # at the moment we see a potential IndexError here + ''') + +def test_calls(): + run_jit(''' + PUSH 1 + CALL func1 + PUSH 3 + CALL func2 + RETURN + + func1: + PUSH 2 + RETURN # comment + + func2: + PUSH 4 ;comment + PUSH 5 ADD - ''']: - yield run_jit, code + RETURN + ''') Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/test/test_llabstractinterp.py (original) +++ pypy/dist/pypy/jit/test/test_llabstractinterp.py Fri Dec 9 20:55:55 2005 @@ -169,3 +169,16 @@ graph2, insns = abstrinterp(ll_function, [s, 0, 0], [0, 
1, 2]) assert insns == {} +def test_recursive_call(): + def ll_factorial(k): + if k <= 1: + return 1 + else: + return ll_factorial(k-1) * k + def ll_function(k): + # indirection needed, because the hint is not about *all* calls to + # ll_factorial() + return ll_factorial(k) + graph2, insns = abstrinterp(ll_function, [7], [0]) + # the direct_calls are messy to count, with calls to ll_stack_check + assert insns.keys() == ['direct_call'] Modified: pypy/dist/pypy/jit/test/test_tl.py ============================================================================== --- pypy/dist/pypy/jit/test/test_tl.py (original) +++ pypy/dist/pypy/jit/test/test_tl.py Fri Dec 9 20:55:55 2005 @@ -108,9 +108,9 @@ def test_branch0(): assert interp(list2bytecode([PUSH,7, PUSH,1, BR_COND,0])) == 7 -def test_exit(): - assert py.test.raises(IndexError, interp, list2bytecode([EXIT])) - assert interp(list2bytecode([PUSH,7, EXIT, PUSH,5])) == 7 +def test_return(): + assert py.test.raises(IndexError, interp, list2bytecode([RETURN])) + assert interp(list2bytecode([PUSH,7, RETURN, PUSH,5])) == 7 def test_rot(): code = [PUSH,1, PUSH,2, PUSH,3, ROT,3] @@ -121,10 +121,8 @@ py.test.raises(IndexError, interp, list2bytecode([PUSH,1, PUSH,2, PUSH,3, ROT,4])) def test_call_ret(): - assert py.test.raises(IndexError, interp, list2bytecode([RETURN])) - assert interp(list2bytecode([PUSH,6, RETURN, PUSH,4, EXIT, PUSH,9])) == 9 - assert interp(list2bytecode([CALL,0])) == 2 - assert interp(list2bytecode([PUSH,1, CALL,5, PUSH,2, CALL,2, EXIT, RETURN, ROT,3, ADD, SWAP, RETURN])) == 3 + assert interp(list2bytecode([CALL,1, RETURN, PUSH,2])) == 2 + assert interp(list2bytecode([PUSH,6, CALL,2, MUL, RETURN, PUSH,7, RETURN])) == 42 def test_compile_branch_backwards(): code = compile(""" @@ -149,16 +147,19 @@ def test_compile_call_ret(): code = compile("""PUSH 1 CALL func1 - PUSH 2 + PUSH 3 CALL func2 - EXIT + RETURN func1: + PUSH 2 RETURN # comment func2: - ROT 3 ;comment - ADD - SWAP + PUSH 4 ;comment + PUSH 5 + ADD 
RETURN""") - assert code == list2bytecode([PUSH,1, CALL,5, PUSH,2, CALL,2, EXIT, RETURN, ROT,3, ADD, SWAP, RETURN]) + assert code == list2bytecode([PUSH,1, CALL,5, PUSH,3, CALL,4, RETURN, + PUSH,2, RETURN, + PUSH,4, PUSH,5, ADD, RETURN]) Modified: pypy/dist/pypy/jit/tl.py ============================================================================== --- pypy/dist/pypy/jit/tl.py (original) +++ pypy/dist/pypy/jit/tl.py Fri Dec 9 20:55:55 2005 @@ -96,13 +96,12 @@ pc += 1 elif opcode == CALL: - stack.append( pc+1 ) - pc += char2int(code[pc]) + 1 + offset = char2int(code[pc]) + pc += 1 + res = interp(code, pc + offset) + stack.append( res ) elif opcode == RETURN: - pc = stack.pop() - - elif opcode == EXIT: break else: Modified: pypy/dist/pypy/jit/tlopcode.py ============================================================================== --- pypy/dist/pypy/jit/tlopcode.py (original) +++ pypy/dist/pypy/jit/tlopcode.py Fri Dec 9 20:55:55 2005 @@ -31,8 +31,6 @@ opcode("CALL") #1 operand offset opcode("RETURN") -opcode("EXIT") - opcode("INVALID") del opcode From rxe at codespeak.net Fri Dec 9 21:15:58 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Fri, 9 Dec 2005 21:15:58 +0100 (CET) Subject: [pypy-svn] r20975 - pypy/dist/pypy/translator/llvm/module Message-ID: <20051209201558.2E69227DF3@code1.codespeak.net> Author: rxe Date: Fri Dec 9 21:15:57 2005 New Revision: 20975 Modified: pypy/dist/pypy/translator/llvm/module/support.py Log: opyt Modified: pypy/dist/pypy/translator/llvm/module/support.py ============================================================================== --- pypy/dist/pypy/translator/llvm/module/support.py (original) +++ pypy/dist/pypy/translator/llvm/module/support.py Fri Dec 9 21:15:57 2005 @@ -57,7 +57,7 @@ ret int %result } -internal fastcc long %pypyop_long_abs(long %x) { +internal fastcc long %pypyop_llong_abs(long %x) { block0: %cond1 = setge long %x, 0 br bool %cond1, label %return_block, label %block1 From ericvrp at codespeak.net Fri 
Dec 9 21:16:39 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 9 Dec 2005 21:16:39 +0100 (CET) Subject: [pypy-svn] r20976 - in pypy/dist/pypy/translator/js: . test Message-ID: <20051209201639.CDBDC27DF3@code1.codespeak.net> Author: ericvrp Date: Fri Dec 9 21:16:38 2005 New Revision: 20976 Modified: pypy/dist/pypy/translator/js/opwriter.py pypy/dist/pypy/translator/js/test/test_genllvm.py pypy/dist/pypy/translator/js/test/test_genllvm1.py Log: some more passing tests Modified: pypy/dist/pypy/translator/js/opwriter.py ============================================================================== --- pypy/dist/pypy/translator/js/opwriter.py (original) +++ pypy/dist/pypy/translator/js/opwriter.py Fri Dec 9 21:16:38 2005 @@ -298,11 +298,13 @@ type_ = 'Array' else: type_ = 'Object' #self.db.repr_type(arg_type) - # self.codewriter.comment(str(arg_type)) + self.codewriter.comment(str(arg_type)) self.codewriter.comment(str(op.args[0])) self.codewriter.malloc(targetvar, type_) - if t[1] == 'rpy_string': - self.codewriter.append(targetvar + '.chars = ""') #XXX this should be done correctly for all types of course! + if t[1] == 'rpy_string': #XXX this should be done correctly for all types of course! 
+ #self.codewriter.append(targetvar + '.length = 0') + self.codewriter.append(targetvar + '.hash = 0') + self.codewriter.append(targetvar + '.chars = ""') malloc_exception = malloc malloc_varsize = malloc Modified: pypy/dist/pypy/translator/js/test/test_genllvm.py ============================================================================== --- pypy/dist/pypy/translator/js/test/test_genllvm.py (original) +++ pypy/dist/pypy/translator/js/test/test_genllvm.py Fri Dec 9 21:16:38 2005 @@ -215,7 +215,7 @@ assert f(0) == 5 assert f(1) == 2 -def DONTtest_pbc_fns(): #issue with arrayinstance output in incorrect order +def test_pbc_fns(): def f2(x): return x+1 def f3(x): @@ -271,7 +271,7 @@ assert f(0) == 0 assert f(1) == 1 -def DONTtest_list_basic_ops(): #issue unknown +def test_list_basic_ops(): def list_basic_ops(i, j): l = [1,2,3] l.insert(0, 42) @@ -288,7 +288,7 @@ for j in range(6): assert f(i,j) == list_basic_ops(i,j) -def DONTtest_string_simple(): #issue because malloc(sometype) doesn't populate the Object with data(types) +def test_string_simple(): def string_simple(i): return ord(str(i)) f = compile_function(string_simple, [int]) Modified: pypy/dist/pypy/translator/js/test/test_genllvm1.py ============================================================================== --- pypy/dist/pypy/translator/js/test/test_genllvm1.py (original) +++ pypy/dist/pypy/translator/js/test/test_genllvm1.py Fri Dec 9 21:16:38 2005 @@ -84,4 +84,3 @@ assert f(1) == 15 assert f(2) == 17 assert f(3) == 19 - From rxe at codespeak.net Fri Dec 9 22:13:28 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Fri, 9 Dec 2005 22:13:28 +0100 (CET) Subject: [pypy-svn] r20977 - pypy/dist/pypy/translator/llvm Message-ID: <20051209211328.AC2DA27DF3@code1.codespeak.net> Author: rxe Date: Fri Dec 9 22:13:27 2005 New Revision: 20977 Modified: pypy/dist/pypy/translator/llvm/extfuncnode.py Log: oops Modified: pypy/dist/pypy/translator/llvm/extfuncnode.py 
============================================================================== --- pypy/dist/pypy/translator/llvm/extfuncnode.py (original) +++ pypy/dist/pypy/translator/llvm/extfuncnode.py Fri Dec 9 22:13:27 2005 @@ -13,7 +13,7 @@ ext_func_sigs = { "%LL_os_isatty" : ExtFuncSig("int", None), "%LL_stack_too_big" : ExtFuncSig("int", None), - "%LL_os_lseek" : ExtFuncSig("long", None), + "%LL_os_lseek" : ExtFuncSig("int", None), "%LL_thread_acquirelock" : ExtFuncSig("int", [None, "int"]), "%LL_thread_start" : ExtFuncSig(None, ["sbyte*", "sbyte*"])} From rxe at codespeak.net Fri Dec 9 22:41:57 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Fri, 9 Dec 2005 22:41:57 +0100 (CET) Subject: [pypy-svn] r20978 - pypy/dist/pypy/translator/llvm Message-ID: <20051209214157.54CA227DF7@code1.codespeak.net> Author: rxe Date: Fri Dec 9 22:41:54 2005 New Revision: 20978 Modified: pypy/dist/pypy/translator/llvm/arraynode.py pypy/dist/pypy/translator/llvm/database.py pypy/dist/pypy/translator/llvm/exception.py pypy/dist/pypy/translator/llvm/gc.py pypy/dist/pypy/translator/llvm/varsize.py Log: Phew - pypy-llvm compiling again. Try some pending experimental optimisations on snake. Modified: pypy/dist/pypy/translator/llvm/arraynode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/arraynode.py (original) +++ pypy/dist/pypy/translator/llvm/arraynode.py Fri Dec 9 22:41:54 2005 @@ -70,6 +70,17 @@ def writedatatypedecl(self, codewriter): codewriter.typedef(self.ref, self.db.get_machine_word()) + + +class StrArrayTypeNode(ArrayTypeNode): + def writeimpl(self, codewriter): + log.writeimpl(self.ref) + varsize.write_constructor(self.db, codewriter, self.ref, + self.constructor_decl, + self.array, + atomic=self.array._is_atomic(), + is_str=True) + class ArrayNode(ConstantLLVMNode): """ An arraynode.
Elements can be Modified: pypy/dist/pypy/translator/llvm/database.py ============================================================================== --- pypy/dist/pypy/translator/llvm/database.py (original) +++ pypy/dist/pypy/translator/llvm/database.py Fri Dec 9 22:41:54 2005 @@ -7,7 +7,7 @@ from pypy.translator.llvm.structnode import StructNode, StructVarsizeNode, \ StructTypeNode, StructVarsizeTypeNode from pypy.translator.llvm.arraynode import ArrayNode, StrArrayNode, \ - VoidArrayNode, ArrayTypeNode, VoidArrayTypeNode + VoidArrayNode, ArrayTypeNode, StrArrayTypeNode, VoidArrayTypeNode from pypy.translator.llvm.opaquenode import OpaqueNode, ExtOpaqueNode, \ OpaqueTypeNode, ExtOpaqueTypeNode from pypy.rpython.lltypesystem import lltype @@ -155,6 +155,8 @@ elif isinstance(type_, lltype.Array): if type_.OF is lltype.Void: self.addpending(type_, VoidArrayTypeNode(self, type_)) + elif type_.OF is lltype.Char: + self.addpending(type_, StrArrayTypeNode(self, type_)) else: self.addpending(type_, ArrayTypeNode(self, type_)) Modified: pypy/dist/pypy/translator/llvm/exception.py ============================================================================== --- pypy/dist/pypy/translator/llvm/exception.py (original) +++ pypy/dist/pypy/translator/llvm/exception.py Fri Dec 9 22:41:54 2005 @@ -201,7 +201,8 @@ def invoke(self, codewriter, targetvar, tail_, cconv, returntype, functionref, args, label, except_label): if returntype == 'void': - if functionref != '%keepalive': #XXX I think keepalive should not be the last operation here! + #XXX I think keepalive should not be the last operation here! 
+ if functionref != '%keepalive': codewriter.indent('%scall %s void %s(%s)' % (tail_, cconv, functionref, args)) else: codewriter.indent('%s = %scall %s %s %s(%s)' % (targetvar, tail_, cconv, returntype, functionref, args)) Modified: pypy/dist/pypy/translator/llvm/gc.py ============================================================================== --- pypy/dist/pypy/translator/llvm/gc.py (original) +++ pypy/dist/pypy/translator/llvm/gc.py Fri Dec 9 22:41:54 2005 @@ -67,10 +67,11 @@ %%malloc_Ptr%(cnt)s = call fastcc sbyte* %%pypy_malloc%(atomic)s(%(uword)s %%malloc_SizeU%(cnt)s) %(targetvar)s = cast sbyte* %%malloc_Ptr%(cnt)s to %(type_)s* ''' % locals() - if is_atomic: - t += ''' -call ccc void %%llvm.memset(sbyte* %%malloc_Ptr%(cnt)s, ubyte 0, uint %%malloc_SizeU%(cnt)s, uint 0) -''' % locals() + + #if is_atomic: + # t += ''' + #call ccc void %%llvm.memset(sbyte* %%malloc_Ptr%(cnt)s, ubyte 0, uint %%malloc_SizeU%(cnt)s, uint 0) + #''' % locals() return t def pyrex_code(self): Modified: pypy/dist/pypy/translator/llvm/varsize.py ============================================================================== --- pypy/dist/pypy/translator/llvm/varsize.py (original) +++ pypy/dist/pypy/translator/llvm/varsize.py Fri Dec 9 22:41:54 2005 @@ -1,7 +1,7 @@ from pypy.rpython.rstr import STR def write_constructor(db, codewriter, ref, constructor_decl, ARRAY, - indices_to_array=(), atomic=False): + indices_to_array=(), atomic=False, is_str=False): #varsized arrays and structs look like this: #Array: {int length , elemtype*} @@ -27,12 +27,27 @@ codewriter.cast("%result", "sbyte*", "%ptr", ref + "*") indices_to_arraylength = tuple(indices_to_array) + (("uint", 0),) + # the following accesses the length field of the array codewriter.getelementptr("%arraylength", ref + "*", "%result", *indices_to_arraylength) codewriter.store(lentype, "%len", "%arraylength") + if is_str: + indices_to_hash = (("uint", 0),) + codewriter.getelementptr("%xxx1", ref + "*", + "%result", + 
*indices_to_hash) + codewriter.store("int", "0", "%arraylength") + + codewriter.getelementptr("%xxx2", ref + "*", + "%result", + *elemindices) + codewriter.store(elemtype, "0", "%xxx2") + + + codewriter.ret(ref + "*", "%result") codewriter.closefunc() From rxe at codespeak.net Fri Dec 9 23:09:37 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Fri, 9 Dec 2005 23:09:37 +0100 (CET) Subject: [pypy-svn] r20979 - pypy/dist/pypy/translator/llvm/module Message-ID: <20051209220937.58A1827DF7@code1.codespeak.net> Author: rxe Date: Fri Dec 9 23:09:36 2005 New Revision: 20979 Added: pypy/dist/pypy/translator/llvm/module/boehm.h Modified: pypy/dist/pypy/translator/llvm/module/genexterns.c Log: hummmph when will something be back ported from genllvm to genc... :-) Use the correct macros to use thread local storage if threading support has been compiled into boehm library. Need to detect this by compiling small programs - will do tomorrow morning... Added: pypy/dist/pypy/translator/llvm/module/boehm.h ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/llvm/module/boehm.h Fri Dec 9 23:09:36 2005 @@ -0,0 +1,14 @@ +#define USING_THREADED_BOEHM = 1 + +#ifdef USING_THREADED_BOEHM + +#define GC_REDIRECT_TO_LOCAL 1 +#include + +#else + +#include + +#endif + +#define USING_BOEHM_GC Modified: pypy/dist/pypy/translator/llvm/module/genexterns.c ============================================================================== --- pypy/dist/pypy/translator/llvm/module/genexterns.c (original) +++ pypy/dist/pypy/translator/llvm/module/genexterns.c Fri Dec 9 23:09:36 2005 @@ -60,8 +60,8 @@ memcpy((void *) ptr2, (void *) ptr1, size); } -#include -#define USING_BOEHM_GC +// overflows/zeros/values raising operations +#include "boehm.h" char *LLVM_RPython_StartupCode(); @@ -80,7 +80,7 @@ extern GC_all_interior_pointers; char *RPython_StartupCode() { GC_all_interior_pointers = 0; - GC_INIT(); + GC_init(); return 
LLVM_RPython_StartupCode(); } @@ -125,7 +125,7 @@ char *RPython_StartupCode() { GC_all_interior_pointers = 0; - GC_INIT(); + GC_init(); return LLVM_RPython_StartupCode(); } From nik at codespeak.net Fri Dec 9 23:19:33 2005 From: nik at codespeak.net (nik at codespeak.net) Date: Fri, 9 Dec 2005 23:19:33 +0100 (CET) Subject: [pypy-svn] r20980 - in pypy/dist/pypy: module/_socket/test translator/c/test Message-ID: <20051209221933.C362F27DF7@code1.codespeak.net> Author: nik Date: Fri Dec 9 23:19:32 2005 New Revision: 20980 Added: pypy/dist/pypy/module/_socket/test/echoserver.py (contents, props changed) pypy/dist/pypy/translator/c/test/test_ext__socket_conn.py (contents, props changed) Log: (ale, nik) start of a trivial tcp server to use in socket tests. we can't terminate it gracefully, yet. Added: pypy/dist/pypy/module/_socket/test/echoserver.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/module/_socket/test/echoserver.py Fri Dec 9 23:19:32 2005 @@ -0,0 +1,22 @@ +import SocketServer +import sys, time + +# user-accessible port +PORT = 8037 + +class EchoRequestHandler(SocketServer.StreamRequestHandler): + + def handle(self): + while True: + client_string = "" + char = "" + while char != "\n": + char = self.rfile.read(1) + client_string += char + if client_string.startswith("shutdown"): + sys.exit(1) + self.wfile.write(client_string) + +if __name__ == "__main__": + server = SocketServer.TCPServer(("", PORT), EchoRequestHandler) + server.serve_forever() Added: pypy/dist/pypy/translator/c/test/test_ext__socket_conn.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/c/test/test_ext__socket_conn.py Fri Dec 9 23:19:32 2005 @@ -0,0 +1,34 @@ +import autopath +import py +import os.path, subprocess, sys +import _socket +from pypy.translator.c.test.test_genc import compile +from pypy.translator.translator import Translator + +HOST = 
"localhost" +PORT = 8037 + +def setup_module(mod): + import pypy.module._socket.rpython.exttable # for declare()/declaretype() + serverpath = os.path.join(autopath.pypydir, "module/_socket/test/echoserver.py") + mod.server_pid = subprocess.Popen([sys.executable, serverpath]).pid + +def teardown_module(mod): + import telnetlib + tn = telnetlib.Telnet(HOST, PORT) + tn.write("shutdown\n") + tn.close() + +def DONOT_test_connect(): + import os + from pypy.module._socket.rpython import rsocket + def does_stuff(): + fd = rsocket.newsocket(_socket.AF_INET, _socket.SOCK_STREAM, 0) + # XXX need to think of a test without connecting to outside servers + rsocket.connect(fd, (HOST, PORT, 0, 0)) + sockname = rsocket.getpeername(fd) + os.close(fd) + return sockname[1] + f1 = compile(does_stuff, []) + res = f1() + assert res == PORT From rxe at codespeak.net Fri Dec 9 23:20:22 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Fri, 9 Dec 2005 23:20:22 +0100 (CET) Subject: [pypy-svn] r20981 - pypy/dist/pypy/translator/llvm/module Message-ID: <20051209222022.5BC2B27DF8@code1.codespeak.net> Author: rxe Date: Fri Dec 9 23:20:21 2005 New Revision: 20981 Modified: pypy/dist/pypy/translator/llvm/module/boehm.h pypy/dist/pypy/translator/llvm/module/genexterns.c Log: Last checkin was a mess... :-( Time to go home. 
Modified: pypy/dist/pypy/translator/llvm/module/boehm.h ============================================================================== --- pypy/dist/pypy/translator/llvm/module/boehm.h (original) +++ pypy/dist/pypy/translator/llvm/module/boehm.h Fri Dec 9 23:20:21 2005 @@ -12,3 +12,18 @@ #endif #define USING_BOEHM_GC + +char *pypy_malloc(unsigned int size) { + return GC_MALLOC(size); +} + +char *pypy_malloc_atomic(unsigned int size) { + return GC_MALLOC_ATOMIC(size); +} + +extern GC_all_interior_pointers; + char *RPython_StartupCode() { + GC_all_interior_pointers = 0; + GC_init(); + return LLVM_RPython_StartupCode(); +} Modified: pypy/dist/pypy/translator/llvm/module/genexterns.c ============================================================================== --- pypy/dist/pypy/translator/llvm/module/genexterns.c (original) +++ pypy/dist/pypy/translator/llvm/module/genexterns.c Fri Dec 9 23:20:21 2005 @@ -60,30 +60,13 @@ memcpy((void *) ptr2, (void *) ptr1, size); } -// overflows/zeros/values raising operations -#include "boehm.h" - char *LLVM_RPython_StartupCode(); -char *pypy_malloc(unsigned int size) { - // use the macros luke - return GC_MALLOC(size); -} - -char *pypy_malloc_atomic(unsigned int size) { - // use the macros luke - return GC_MALLOC_ATOMIC(size); -} +// boehm includes +#include "boehm.h" #ifdef ENTRY_POINT_DEFINED -extern GC_all_interior_pointers; -char *RPython_StartupCode() { - GC_all_interior_pointers = 0; - GC_init(); - return LLVM_RPython_StartupCode(); -} - int __ENTRY_POINT__(RPyListOfString *); int main(int argc, char *argv[]) @@ -121,13 +104,6 @@ } #else -extern GC_all_interior_pointers; - -char *RPython_StartupCode() { - GC_all_interior_pointers = 0; - GC_init(); - return LLVM_RPython_StartupCode(); -} int Pyrex_RPython_StartupCode() { From rxe at codespeak.net Fri Dec 9 23:35:44 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Fri, 9 Dec 2005 23:35:44 +0100 (CET) Subject: [pypy-svn] r20982 - 
pypy/dist/pypy/translator/llvm/module Message-ID: <20051209223544.DB04627DF8@code1.codespeak.net> Author: rxe Date: Fri Dec 9 23:35:44 2005 New Revision: 20982 Modified: pypy/dist/pypy/translator/llvm/module/boehm.h Log: Dont enable gc_local_alloc.h by default and fixed some typos. Modified: pypy/dist/pypy/translator/llvm/module/boehm.h ============================================================================== --- pypy/dist/pypy/translator/llvm/module/boehm.h (original) +++ pypy/dist/pypy/translator/llvm/module/boehm.h Fri Dec 9 23:35:44 2005 @@ -1,9 +1,9 @@ -#define USING_THREADED_BOEHM = 1 +//#define USING_THREADED_BOEHM #ifdef USING_THREADED_BOEHM #define GC_REDIRECT_TO_LOCAL 1 -#include +#include #else From mwh at codespeak.net Sat Dec 10 00:00:04 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Sat, 10 Dec 2005 00:00:04 +0100 (CET) Subject: [pypy-svn] r20983 - pypy/dist/pypy/doc/weekly Message-ID: <20051209230004.29AD727DDE@code1.codespeak.net> Author: mwh Date: Sat Dec 10 00:00:02 2005 New Revision: 20983 Added: pypy/dist/pypy/doc/weekly/summary-2005-12-09.txt Modified: pypy/dist/pypy/doc/weekly/index.txt Log: This Week in PyPy 6 -- you have maybe 20 minutes to complain before it gets sent out :) Modified: pypy/dist/pypy/doc/weekly/index.txt ============================================================================== --- pypy/dist/pypy/doc/weekly/index.txt (original) +++ pypy/dist/pypy/doc/weekly/index.txt Sat Dec 10 00:00:02 2005 @@ -67,9 +67,16 @@ - Background EU-related work - Where did PyPy-sync go? + * `Week ending 2005-12-09`_ + + - The Sprint! + - IRC Summary + - EU-related Talks + .. _`Week ending 2005-11-04`: summary-2005-11-04.html .. _`Week ending 2005-11-11`: summary-2005-11-11.html .. _`Week ending 2005-11-18`: summary-2005-11-18.html .. _`Week ending 2005-11-25`: summary-2005-11-25.html .. _`Week ending 2005-12-02`: summary-2005-12-02.html +.. 
_`Week ending 2005-12-09`: summary-2005-12-09.html Added: pypy/dist/pypy/doc/weekly/summary-2005-12-09.txt ============================================================================== --- (empty file) +++ pypy/dist/pypy/doc/weekly/summary-2005-12-09.txt Sat Dec 10 00:00:02 2005 @@ -0,0 +1,102 @@ +======================= + This Week in PyPy 6 +======================= + +Introduction +============ + +This is the sixth of what will hopefully be many summaries of what's +been going on in the world of PyPy in the last week. I'd still like +to remind people that when something worth summarizing happens to +recommend it for "This Week in PyPy" as mentioned on: + + http://codespeak.net/pypy/dist/pypy/doc/weekly/ + +where you can also find old summaries. This week features the first +IRC summary from Pieter Holtzhausen, a feature that will hopefully +continue. + +There were about 150 commits to the pypy section of codespeak's +repository in the last week (a relatively small number for a sprint +week -- lots of thinking going on here). + + +The Sprint! +=========== + +This is covered in more detail in the `sprint report`_, but seems to be +going well. There has been work on the JIT, supporting larger integers and +sockets in RPython, making the stackless option more useful, performance, +compiler flexibility, documentation and probably even more. + +.. _`sprint report`: http://please/write/me + + +IRC Summary +=========== + +Thanks again to Pieter for this. We need to talk about formatting :) + +**Friday** http://tismerysoft.de/pypy/irc-logs/pypy/%23pypy.log.20051202:: + + [00:04] Arigo states it is time to merge the PBC branch. Merging henceforth + commences. + [15:46] Pedronis and mwh discuss the simplification of the backend + selection of the translator. Some translator planning documents + checked in later.
+ +**Saturday** http://tismerysoft.de/pypy/irc-logs/pypy/%23pypy.log.20051203:: + + [15:45] Stakkars mentions the idea he posted to pypy-dev, that involves + the substitution of CPython modules piecewise with pypy generated + modules. Pedronis replies that he has thought of a similar + approach to integrate pypy and Jython, but that this effort needs + to be balanced with the fact that the pypy JIT currently needs + attention. + +**Sunday** http://tismerysoft.de/pypy/irc-logs/pypy/%23pypy.log.20051204:: + + [14:03] Stakkars asks about the necessity of 3 stacks in the pypy + system. One for floats, ints and addresses. After remarks about + easier CPU support, Arigo replies that there is simply no sane + way to do RPython with a single one. + [18:26] Gromit asks how ready pypy is for production usage. He is + interested in pypy as a smalltalk-like environment, since its + objects spaces is reminiscent of smalltalk vm images. + [18:31] Stakkars states that he believes the project should postpone + advanced technologies, in favour of getting the groundwork to a + level where the project really becomes a CPython alternative. + +**Monday** http://tismerysoft.de/pypy/irc-logs/pypy/%23pypy.log.20051205:: + + [01:44] Pedronis running counting microbenchmarks, one 4.7 times slower + than CPython, the other one 11.3 times. Function calling takes + its toll in the latter. + +**Tuesday, Wednesday**:: + + [xx:xx] Sprint background radiation. Braintone rings like a bell. Not + much to report. + +**Thursday** http://tismerysoft.de/pypy/irc-logs/pypy/%23pypy.log.20051208:: + + [17:55] Stakkars guesses that RPython may get basic coroutine support, and + is excited about that. + [18:05] Stakkars votes for having stackless enabled all the time. The + advantages: + - real garbage collection + - iterator implementation without clumsy state machines + [20:19] Rhamphoryncus wonders whether dynamic specialization (e.g. psyco) + can possibly improve memory layout.
+ [20:46] Sabi is glad that long long is now supported (courtesy of mwh and + Johan). He yanks out his workaround. + + +EU-related Talks +================ + +On Monday Holger spoke at a German EU office workshop in Bonn and two days +later he, Alastair and Bea spoke at a more union-wide EU workshop in +Brussels. Both talks were very well received and while ostensibly we were +telling the EU about our project, we gained much immediately useful +information about how the EU actually administers projects such as ours. From cfbolz at codespeak.net Sat Dec 10 00:08:48 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Sat, 10 Dec 2005 00:08:48 +0100 (CET) Subject: [pypy-svn] r20984 - pypy/dist/pypy/doc/weekly Message-ID: <20051209230848.BF92527DF0@code1.codespeak.net> Author: cfbolz Date: Sat Dec 10 00:08:47 2005 New Revision: 20984 Modified: pypy/dist/pypy/doc/weekly/summary-2005-12-09.txt Log: add a link to the sprint report, small fixes in IRC logs Modified: pypy/dist/pypy/doc/weekly/summary-2005-12-09.txt ============================================================================== --- pypy/dist/pypy/doc/weekly/summary-2005-12-09.txt (original) +++ pypy/dist/pypy/doc/weekly/summary-2005-12-09.txt Sat Dec 10 00:08:47 2005 @@ -29,7 +29,7 @@ sockets in RPython, making the stackless option more useful, performance, compiler flexibility, documentation and probably even more. -.. _`sprint report`: http://please/write/me +.. _`sprint report`: http://codespeak.net/pipermail/pypy-dev/2005q4/002656.html IRC Summary @@ -56,13 +56,13 @@ **Sunday** http://tismerysoft.de/pypy/irc-logs/pypy/%23pypy.log.20051204:: - [14:03] Stakkars asks about the necessity of 3 stacks in the pypy - system. One for floats, ints and addresses. After remarks about - easier CPU support, Arigo replies that there is simply no sane - way to do RPython with a single one. + [14:03] Stakkars asks about the necessity of 3 stacks in the l3interpreter + that Armin has been working on.
One for floats, ints and + addresses. After remarks about easier CPU support, Arigo replies + that there is simply no sane way to write RPython with a single one. [18:26] Gromit asks how ready pypy is for production usage. He is - interested in pypy as a smalltalk-like environment, since its - objects spaces is reminiscent of smalltalk vm images. + interested in pypy as a smalltalk-like environment, since he deems + objects spaces to be reminiscent of smalltalk vm images. [18:31] Stakkars states that he believes the project should postpone advanced technologies, in favour of getting the groundwork to a level where the project really becomes a CPython alternative. From ericvrp at codespeak.net Sat Dec 10 01:17:10 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Sat, 10 Dec 2005 01:17:10 +0100 (CET) Subject: [pypy-svn] r20987 - pypy/dist/pypy/translator/js/test Message-ID: <20051210001710.B1E1227DFC@code1.codespeak.net> Author: ericvrp Date: Sat Dec 10 01:17:09 2005 New Revision: 20987 Modified: pypy/dist/pypy/translator/js/test/test_exception.py pypy/dist/pypy/translator/js/test/test_lltype.py pypy/dist/pypy/translator/js/test/test_seq.py Log: Some more passing tests and a better understanding of what else needs fixing Modified: pypy/dist/pypy/translator/js/test/test_exception.py ============================================================================== --- pypy/dist/pypy/translator/js/test/test_exception.py (original) +++ pypy/dist/pypy/translator/js/test/test_exception.py Sat Dec 10 01:17:09 2005 @@ -164,7 +164,7 @@ for i in range(10, 20): assert f(i) == fn(i) -def DONTtest_catches(): #issue empty Object mallocs +def DONTtest_catches(): #issue with last exception value not being set def raises(i): if i == 3: raise MyException, 12 Modified: pypy/dist/pypy/translator/js/test/test_lltype.py ============================================================================== --- pypy/dist/pypy/translator/js/test/test_lltype.py (original) +++ 
pypy/dist/pypy/translator/js/test/test_lltype.py Sat Dec 10 01:17:09 2005 @@ -101,7 +101,7 @@ f = compile_function(struct_constant, []) assert f() == struct_constant() -def DONTtest_aliasing(): #issue looks like empty Object mallocs +def DONTtest_aliasing(): #issue with missing operation (v229 = getelementptr arrayinstance, word 0, uint 1, i_0) B = lltype.Struct('B', ('x', lltype.Signed)) A = lltype.Array(B) global_a = lltype.malloc(A, 5, immortal=True) @@ -113,7 +113,7 @@ assert f(2) == 0 assert f(3) == 17 -def DONTtest_aliasing2(): #issue mallocs +def DONTtest_aliasing2(): #issue with missing operation (v230 = getelementptr arrayinstance, word 0, uint 1, i_0) B = lltype.Struct('B', ('x', lltype.Signed)) A = lltype.Array(B) C = lltype.Struct('C', ('x', lltype.Signed), ('bptr', lltype.Ptr(B))) @@ -150,7 +150,7 @@ f = compile_function(array_constant, []) assert f() == array_constant() -def DONTtest_array_constant3(): #malloc issue +def DONTtest_array_constant3(): #issue with missing operation (v289 = getelementptr arrayinstance, word 0, uint 1, 0) A = lltype.GcArray(('x', lltype.Signed)) a = lltype.malloc(A, 3) a[0].x = 100 Modified: pypy/dist/pypy/translator/js/test/test_seq.py ============================================================================== --- pypy/dist/pypy/translator/js/test/test_seq.py (original) +++ pypy/dist/pypy/translator/js/test/test_seq.py Sat Dec 10 01:17:09 2005 @@ -22,7 +22,7 @@ assert f(2) == 13 assert f(3) == 3 - def DONTtest_array_add(self): #unknown issue + def test_array_add(self): f = compile_function(llvmsnippet.array_add, [int, int, int, int, int]) assert f(1,2,3,4,0) == 1 assert f(1,2,3,4,1) == 2 @@ -41,7 +41,7 @@ f = compile_function(llvmsnippet.array_arg, [int]) assert f(5) == 0 - def DONTtest_array_len(self): #unknown issue + def test_array_len(self): f = compile_function(llvmsnippet.array_len, []) assert f() == 10 @@ -86,7 +86,7 @@ for j in range(5): assert f(i, j, 0) == i + j - def DONTtest_circular_list(self): #unkjown 
issue + def test_circular_list(self): f = compile_function(llvmsnippet.circular_list, [int]) assert f(0) == 0 assert f(1) == 1 From cfbolz at codespeak.net Sat Dec 10 11:04:02 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Sat, 10 Dec 2005 11:04:02 +0100 (CET) Subject: [pypy-svn] r20989 - pypy/dist/pypy/translator/c Message-ID: <20051210100402.689BE27DCE@code1.codespeak.net> Author: cfbolz Date: Sat Dec 10 11:04:01 2005 New Revision: 20989 Modified: pypy/dist/pypy/translator/c/gc.py Log: why does BoehmGcPolicy use the RefcountingInfo? Modified: pypy/dist/pypy/translator/c/gc.py ============================================================================== --- pypy/dist/pypy/translator/c/gc.py (original) +++ pypy/dist/pypy/translator/c/gc.py Sat Dec 10 11:04:01 2005 @@ -308,7 +308,7 @@ -class BoehmGcInfo: +class BoehmInfo: finalizer = None class BoehmGcPolicy(BasicGcPolicy): @@ -329,7 +329,7 @@ def array_setup(self, arraydefnode): if isinstance(arraydefnode.LLTYPE, GcArray) and list(self.deallocator_lines(arraydefnode, '')): - gcinfo = arraydefnode.gcinfo = RefcountingInfo() + gcinfo = arraydefnode.gcinfo = BoehmInfo() gcinfo.finalizer = self.db.namespace.uniquename('finalize_'+arraydefnode.barename) def array_implementationcode(self, arraydefnode): @@ -347,7 +347,7 @@ # for structs def struct_setup(self, structdefnode, rtti): if isinstance(structdefnode.LLTYPE, GcStruct) and list(self.deallocator_lines(structdefnode, '')): - gcinfo = structdefnode.gcinfo = RefcountingInfo() + gcinfo = structdefnode.gcinfo = BoehmInfo() gcinfo.finalizer = self.db.namespace.uniquename('finalize_'+structdefnode.barename) struct_after_definition = common_after_definition From cfbolz at codespeak.net Sat Dec 10 11:28:05 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Sat, 10 Dec 2005 11:28:05 +0100 (CET) Subject: [pypy-svn] r20990 - pypy/extradoc/sprintinfo/gothenburg-2005 Message-ID: <20051210102805.6DE4427DC3@code1.codespeak.net> Author: cfbolz 
Date: Sat Dec 10 11:28:04 2005 New Revision: 20990 Modified: pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt Log: planning for tonight Modified: pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt ============================================================================== --- pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt (original) +++ pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt Sat Dec 10 11:28:04 2005 @@ -26,12 +26,12 @@ JIT work ~~~~~~~~~~~~~~~~~ -(Armin, Carl Friedrich, Samuele, Arre, Eric) +(Armin, Michael, Samuele, Arre, Eric) see doc/discussion/draft-jit-ideas.txt - toy target intepreter + parser/assembler (DONE) -- low-level graphs abstract interpreter (IN-PROGRESS) +- low-level graphs abstract interpreter (MORE PROGRESS) (- L3 interpreter) Stackless @@ -42,6 +42,7 @@ - write RPython structures (tasklet, channel) and basic functions for switching (IN-PROGRESS) +- prototypes for channels and tasklets + tests (FINISHED) - add an app-level interface (mixed module) - implement support structures - a deque module exists already which can be used for channel queues @@ -49,7 +50,9 @@ GC, __del__, weakref ~~~~~~~~~~~~~~~~~~~~~ -- implement __del__ support in the RTyper and backends +(Carl Friedrich) +- implement __del__ support in the RTyper and backends (FINISHED for + refcounting, some work left for Boehm) (- possibly implement weakref (at least with Boehm)) - integrate GC construction framework in the backends @@ -63,7 +66,7 @@ - this exposes limitations in our way to glue to C libraries, think/design solutions -(Johan, Michael) + - (DONE) support more basic integer types. Decide on the proper design (explicit spelling of sizes, or the long-long way?) note that we already have functions which return 64 bit values. 
@@ -71,7 +74,7 @@ threading ~~~~~~~~~~~ -- fix stack_too_big with threads on Windows +- fix stack_too_big with threads on Windows (Johan) - investigate why enabling threads creates such a large overhead - think of a design to release the GIL around blocking calls @@ -79,7 +82,7 @@ ~~~~~~~~~~~~~~~~~~~~~~~~~~~ - look into the perfomance and code path for function calls - in our interpreter (Arre, Eric, with help from Richard) + in our interpreter (IN-PROGRESS) Arre, Eric, with help from Richard) - look into converting the indirect call in the eval loop for bytecode dispatch into a switch: probably needs a representation choice in the RTyper, a transformation, and integer exitswitch implementation as switch in the backends @@ -100,7 +103,7 @@ US travel report, maybe towards WP03/WP07 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -Saturday morning +Saturday morning (waiting a bit for Richard?) - telling the story about a commercial travel to the states to optimize some Python application - done using RPython From arigo at codespeak.net Sat Dec 10 11:31:09 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 10 Dec 2005 11:31:09 +0100 (CET) Subject: [pypy-svn] r20991 - pypy/dist/pypy/tool/pytest Message-ID: <20051210103109.EA9A127DC8@code1.codespeak.net> Author: arigo Date: Sat Dec 10 11:31:09 2005 New Revision: 20991 Modified: pypy/dist/pypy/tool/pytest/regrverbose.py Log: Not happy with CPython imports. Modified: pypy/dist/pypy/tool/pytest/regrverbose.py ============================================================================== --- pypy/dist/pypy/tool/pytest/regrverbose.py (original) +++ pypy/dist/pypy/tool/pytest/regrverbose.py Sat Dec 10 11:31:09 2005 @@ -1,6 +1,9 @@ # refer to 2.4.1/test/regrtest.py's runtest() for comparison import sys -from test import test_support + +# ARGH! we need 'test' from the standard library, not the local one! 
+del sys.path[0] +from test import test_support test_support.verbose = int(sys.argv[1]) sys.argv[:] = sys.argv[2:] From hpk at codespeak.net Sat Dec 10 11:47:31 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Sat, 10 Dec 2005 11:47:31 +0100 (CET) Subject: [pypy-svn] r20992 - in pypy/dist: lib-python pypy/tool/pytest pypy/tool/pytest/run-script Message-ID: <20051210104731.2558B27DC6@code1.codespeak.net> Author: hpk Date: Sat Dec 10 11:47:30 2005 New Revision: 20992 Added: pypy/dist/pypy/tool/pytest/run-script/ pypy/dist/pypy/tool/pytest/run-script/regrverbose.py - copied, changed from r20991, pypy/dist/pypy/tool/pytest/regrverbose.py Removed: pypy/dist/pypy/tool/pytest/regrverbose.py Modified: pypy/dist/lib-python/conftest.py Log: fix compliance test run problem - the problem arose because pypy is now properly inserting the basedirectory of a script-to-be-run into sys.path so that "import test" now found the local pypy/tool/pytest/test instead of the stdlib one. - so i moved the regrverbose script into its own directory (run-script) for now and reverted armin's argh-fix Modified: pypy/dist/lib-python/conftest.py ============================================================================== --- pypy/dist/lib-python/conftest.py (original) +++ pypy/dist/lib-python/conftest.py Sat Dec 10 11:47:30 2005 @@ -886,7 +886,8 @@ python = sys.executable pypy_script = pypydir.join('bin', 'py.py') alarm_script = pypydir.join('tool', 'alarm.py') - regr_script = pypydir.join('tool', 'pytest', 'regrverbose.py') + regr_script = pypydir.join('tool', 'pytest', + 'run-script', 'regrverbose.py') pypy_options = [] if regrtest.oldstyle: pypy_options.append('--oldstyle') From ac at codespeak.net Sat Dec 10 12:51:44 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Sat, 10 Dec 2005 12:51:44 +0100 (CET) Subject: [pypy-svn] r20993 - in pypy/dist/pypy: interpreter objspace Message-ID: <20051210115144.9969127DC8@code1.codespeak.net> Author: ac Date: Sat Dec 10 12:51:44 2005 
New Revision: 20993 Modified: pypy/dist/pypy/interpreter/baseobjspace.py pypy/dist/pypy/interpreter/eval.py pypy/dist/pypy/interpreter/gateway.py pypy/dist/pypy/interpreter/pycode.py pypy/dist/pypy/objspace/descroperation.py Log: (arre, eric) Optimise some calls of applevel functions. (only single argument functions for now). Modified: pypy/dist/pypy/interpreter/baseobjspace.py ============================================================================== --- pypy/dist/pypy/interpreter/baseobjspace.py (original) +++ pypy/dist/pypy/interpreter/baseobjspace.py Sat Dec 10 12:51:44 2005 @@ -465,7 +465,7 @@ from pypy.interpreter.function import Function if isinstance(w_func, Function): if len(args_w) == 1: - w_res = w_func.code.fastcall_1(self, args_w[0]) + w_res = w_func.code.fastcall_1(self, w_func, args_w[0]) if w_res is not None: return w_res elif len(args_w) == 2: Modified: pypy/dist/pypy/interpreter/eval.py ============================================================================== --- pypy/dist/pypy/interpreter/eval.py (original) +++ pypy/dist/pypy/interpreter/eval.py Sat Dec 10 12:51:44 2005 @@ -51,7 +51,7 @@ return None # a performance hack (see gateway.BuiltinCode1/2/3) - def fastcall_1(self, space, w1): + def fastcall_1(self, space, func, w1): return None def fastcall_2(self, space, w1, w2): return None Modified: pypy/dist/pypy/interpreter/gateway.py ============================================================================== --- pypy/dist/pypy/interpreter/gateway.py (original) +++ pypy/dist/pypy/interpreter/gateway.py Sat Dec 10 12:51:44 2005 @@ -438,7 +438,7 @@ # (verbose) performance hack below class BuiltinCode1(BuiltinCode): - def fastcall_1(self, space, w1): + def fastcall_1(self, space, w_func, w1): try: w_result = self.fastfunc_1(space, w1) except KeyboardInterrupt: Modified: pypy/dist/pypy/interpreter/pycode.py ============================================================================== --- pypy/dist/pypy/interpreter/pycode.py (original) +++ 
pypy/dist/pypy/interpreter/pycode.py Sat Dec 10 12:51:44 2005 @@ -101,7 +101,7 @@ self.co_firstlineno = 0 # first source line number self.co_lnotab = "" # string: encoding addr<->lineno mapping self.hidden_applevel = False - + self.do_fastcall = -1 def _code_new( self, argcount, nlocals, stacksize, flags, code, consts, names, varnames, filename, @@ -155,8 +155,16 @@ magic, = struct.unpack(" Author: cfbolz Date: Sat Dec 10 12:53:31 2005 New Revision: 20994 Modified: pypy/dist/pypy/translator/c/gc.py Log: argh! one missing crucial line! Modified: pypy/dist/pypy/translator/c/gc.py ============================================================================== --- pypy/dist/pypy/translator/c/gc.py (original) +++ pypy/dist/pypy/translator/c/gc.py Sat Dec 10 12:53:31 2005 @@ -189,6 +189,7 @@ destrptr = rtti._obj.destructor_funcptr gcinfo.destructor = db.get(destrptr) T = typeOf(destrptr).TO.ARGS[0] + gcinfo.destructor_argtype = db.gettype(T) else: # is a deallocator really needed, or would it be empty? if list(self.deallocator_lines(structdefnode, '')): From nik at codespeak.net Sat Dec 10 13:37:09 2005 From: nik at codespeak.net (nik at codespeak.net) Date: Sat, 10 Dec 2005 13:37:09 +0100 (CET) Subject: [pypy-svn] r20995 - in pypy/dist/pypy: module/_socket/test translator/c/test Message-ID: <20051210123709.9382C27DC8@code1.codespeak.net> Author: nik Date: Sat Dec 10 13:37:06 2005 New Revision: 20995 Modified: pypy/dist/pypy/module/_socket/test/echoserver.py pypy/dist/pypy/translator/c/test/test_ext__socket_conn.py Log: (ale, nik) our test echo server works, but we still don't manage to shutdown the server process properly ... the combination of the subprocess module with python's telnetlib or socket seems to be a problem. 
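The r20993 change ("Optimise some calls of applevel functions") threads the function object itself through the `fastcall_1` fast path, so a code object's specialized entry point can reach it. A toy sketch of the arity-dispatch pattern; the names are simplified stand-ins, not the real PyPy classes, and the real fast paths also handle exceptions:

```python
class Code:
    """Base code object: fast paths return None, meaning 'no fast path,
    fall back to the generic call'."""
    def fastcall_1(self, space, func, w1):
        return None

class BuiltinCode1(Code):
    """One-argument builtin, in the spirit of gateway.BuiltinCode1."""
    def __init__(self, fastfunc_1):
        self.fastfunc_1 = fastfunc_1

    def fastcall_1(self, space, func, w1):
        # 'func' (the Function object) is now passed in as well,
        # which is what the r20993 signature change is about
        return self.fastfunc_1(space, w1)

class Function:
    def __init__(self, code):
        self.code = code

    def generic_call(self, space, args_w):
        return ("slow", len(args_w))

def call_function(space, func, *args_w):
    """Mimics baseobjspace: try the arity-specialized fast path first."""
    if len(args_w) == 1:
        w_res = func.code.fastcall_1(space, func, args_w[0])
        if w_res is not None:
            return w_res
    return func.generic_call(space, args_w)
```

Here `call_function(None, Function(BuiltinCode1(lambda space, w1: ("fast", w1))), 42)` takes the fast path, while a two-argument call falls back to the generic one.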
Modified: pypy/dist/pypy/module/_socket/test/echoserver.py ============================================================================== --- pypy/dist/pypy/module/_socket/test/echoserver.py (original) +++ pypy/dist/pypy/module/_socket/test/echoserver.py Sat Dec 10 13:37:06 2005 @@ -4,6 +4,19 @@ # user-accessible port PORT = 8037 +class EchoServer(SocketServer.TCPServer): + + def __init__(self, *args, **kwargs): + SocketServer.TCPServer.__init__(self, *args, **kwargs) + self.stop = False + + def handle_error(self, request, client_address): + self.stop = True + + def serve(self): + while not self.stop: + self.handle_request() + class EchoRequestHandler(SocketServer.StreamRequestHandler): def handle(self): @@ -14,9 +27,9 @@ char = self.rfile.read(1) client_string += char if client_string.startswith("shutdown"): - sys.exit(1) + raise RuntimeError() self.wfile.write(client_string) if __name__ == "__main__": - server = SocketServer.TCPServer(("", PORT), EchoRequestHandler) - server.serve_forever() + server = EchoServer(("", PORT), EchoRequestHandler) + server.serve() Modified: pypy/dist/pypy/translator/c/test/test_ext__socket_conn.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_ext__socket_conn.py (original) +++ pypy/dist/pypy/translator/c/test/test_ext__socket_conn.py Sat Dec 10 13:37:06 2005 @@ -11,20 +11,21 @@ def setup_module(mod): import pypy.module._socket.rpython.exttable # for declare()/declaretype() serverpath = os.path.join(autopath.pypydir, "module/_socket/test/echoserver.py") - mod.server_pid = subprocess.Popen([sys.executable, serverpath]).pid + mod.process = subprocess.Popen([sys.executable, serverpath]) def teardown_module(mod): import telnetlib tn = telnetlib.Telnet(HOST, PORT) tn.write("shutdown\n") tn.close() + del tn + del mod.process -def DONOT_test_connect(): +def test_connect(): import os from pypy.module._socket.rpython import rsocket def does_stuff(): fd = 
rsocket.newsocket(_socket.AF_INET, _socket.SOCK_STREAM, 0) - # XXX need to think of a test without connecting to outside servers rsocket.connect(fd, (HOST, PORT, 0, 0)) sockname = rsocket.getpeername(fd) os.close(fd) From arigo at codespeak.net Sat Dec 10 14:07:16 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 10 Dec 2005 14:07:16 +0100 (CET) Subject: [pypy-svn] r20996 - pypy/dist/pypy/jit Message-ID: <20051210130716.61B5127DC8@code1.codespeak.net> Author: arigo Date: Sat Dec 10 14:07:14 2005 New Revision: 20996 Modified: pypy/dist/pypy/jit/llabstractinterp.py Log: (mwh, arigo, pedronis around) Did stuff. Reorganized a bit the classes: LLAbstractFrame is replaced by a BlockBuilder, responsible for building only one block; the process of completing a graph is now driven by the GraphState directly. We also replaced most usages of 'isinstance(_, LLXxxValue)' with method calls, in preparation for the next test -- a new LLXxxValue class for virtual structures. Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Sat Dec 10 14:07:14 2005 @@ -29,11 +29,22 @@ def getconcretetype(self): return lltype.typeOf(self.value) - def getvarorconst(self): + def forcevarorconst(self, builder): c = Constant(self.value) c.concretetype = self.getconcretetype() return c + def getruntimevars(self): + return [] + + def maybe_get_constant(self): + c = Constant(self.value) + c.concretetype = self.getconcretetype() + return c + + def with_fresh_variables(self, to_be_stored_into): + return self + def match(self, other): return isinstance(other, LLConcreteValue) and self.value == other.value @@ -54,9 +65,21 @@ def getconcretetype(self): return self.copy_v.concretetype - def getvarorconst(self): + def forcevarorconst(self, builder): return self.copy_v + def getruntimevars(self): + return [self.copy_v] + + 
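The r20995 log notes that the echo server could not be shut down cleanly: `serve_forever()` never returns, so the new `EchoServer` replaces it with a `serve()` loop guarded by a stop flag that `handle_error()` sets when the handler raises on a "shutdown" request. The control flow, modelled here without real sockets (request strings stand in for connections):

```python
class StoppableServer:
    """Models the EchoServer from r20995: serve() polls a stop flag instead
    of looping forever, and handle_error() is the hook that sets it."""

    def __init__(self, requests):
        self.requests = iter(requests)
        self.stop = False
        self.echoed = []

    def handle_one_request(self):
        data = next(self.requests)
        if data.startswith("shutdown"):
            # the real handler raises RuntimeError; SocketServer.TCPServer
            # then calls handle_error(), which flips the flag
            raise RuntimeError("shutdown requested")
        self.echoed.append(data)

    def handle_request(self):
        try:
            self.handle_one_request()
        except RuntimeError:
            self.handle_error()

    def handle_error(self):
        self.stop = True

    def serve(self):
        while not self.stop:
            self.handle_request()
```

Raising inside the handler (rather than calling sys.exit) matters because SocketServer catches handler exceptions and routes them to handle_error(), giving the server a clean place to decide to stop.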
def maybe_get_constant(self): + if isinstance(self.copy_v, Constant): + return self.copy_v + else: + return None + + def with_fresh_variables(self, to_be_stored_into): + return LLRuntimeValue(orig_v=to_be_stored_into) + def match(self, other): return isinstance(other, LLRuntimeValue) # XXX and ... @@ -88,37 +111,6 @@ #print "RESOLVING BLOCK", newblock self.copyblock = newblock - -class GraphState(object): - """Entry state of a graph.""" - - def __init__(self, origgraph, args_a, n): - self.origgraph = origgraph - name = '%s_%d' % (origgraph.name, n) - self.copygraph = FunctionGraph(name, Block([])) # grumble - for orig_v, copy_v in [(origgraph.getreturnvar(), - self.copygraph.getreturnvar()), - (origgraph.exceptblock.inputargs[0], - self.copygraph.exceptblock.inputargs[0]), - (origgraph.exceptblock.inputargs[1], - self.copygraph.exceptblock.inputargs[1])]: - if hasattr(orig_v, 'concretetype'): - copy_v.concretetype = orig_v.concretetype - self.a_return = None - self.state = "before" - - def settarget(self, block): - block.isstartblock = True - self.copygraph.startblock = block - - def complete(self, interp): - assert self.state != "during" - if self.state == "before": - self.state = "during" - builderframe = LLAbstractFrame(interp, self) - builderframe.complete() - self.state = "after" - # ____________________________________________________________ class LLAbstractInterp(object): @@ -139,7 +131,7 @@ self.blocks = {} # {origblock: list-of-LLStates} args_a = [LLRuntimeValue(orig_v=v) for v in origgraph.getargs()] graphstate, args_a = self.schedule_graph(args_a, origgraph) - graphstate.complete(self) + graphstate.complete() return graphstate.copygraph def applyhint(self, args_a, origblock): @@ -159,7 +151,7 @@ graphstate = self.graphs[origgraph][state] except KeyError: d = self.graphs.setdefault(origgraph, {}) - graphstate = GraphState(origgraph, args_a, n=len(d)) + graphstate = GraphState(self, origgraph, args_a, n=len(d)) d[state] = graphstate 
self.pendingstates[graphstate] = state #print "SCHEDULE_GRAPH", graphstate @@ -167,10 +159,11 @@ def schedule(self, args_a, origblock): #print "SCHEDULE", args_a, origblock - # args_a: [a_value for v in origblock.inputargs] + # args_a: [the-a-corresponding-to-v for v in origblock.inputargs] state, args_a = self.schedule_getstate(args_a, origblock) - args_v = [a.getvarorconst() for a in args_a - if not isinstance(a, LLConcreteValue)] + args_v = [] + for a in args_a: + args_v.extend(a.getruntimevars()) newlink = Link(args_v, None) self.pendingstates[newlink] = state return newlink @@ -191,16 +184,37 @@ return state, args_a -class LLAbstractFrame(object): +class GraphState(object): + """Entry state of a graph.""" - def __init__(self, interp, graphstate): + def __init__(self, interp, origgraph, args_a, n): self.interp = interp - self.graphstate = graphstate + self.origgraph = origgraph + name = '%s_%d' % (origgraph.name, n) + self.copygraph = FunctionGraph(name, Block([])) # grumble + for orig_v, copy_v in [(origgraph.getreturnvar(), + self.copygraph.getreturnvar()), + (origgraph.exceptblock.inputargs[0], + self.copygraph.exceptblock.inputargs[0]), + (origgraph.exceptblock.inputargs[1], + self.copygraph.exceptblock.inputargs[1])]: + if hasattr(orig_v, 'concretetype'): + copy_v.concretetype = orig_v.concretetype + self.a_return = None + self.state = "before" + + def settarget(self, block): + block.isstartblock = True + self.copygraph.startblock = block def complete(self): - graph = self.graphstate.copygraph + assert self.state != "during" + if self.state == "after": + return + self.state = "during" + graph = self.copygraph interp = self.interp - pending = [self.graphstate] + pending = [self] seen = {} # follow all possible links, forcing the blocks along the way to be # computed @@ -220,7 +234,7 @@ # that it is really the one from 'graph' -- by patching # 'graph' if necessary. 
if len(link.target.inputargs) == 1: - self.graphstate.a_return = state.args_a[0] + self.a_return = state.args_a[0] graph.returnblock = link.target elif len(link.target.inputargs) == 2: graph.exceptblock = link.target @@ -230,35 +244,21 @@ checkgraph(graph) eliminate_empty_blocks(graph) join_blocks(graph) + self.state = "after" def flowin(self, state): # flow in the block origblock = state.origblock - bindings = {} # {Variables-of-origblock: a_value} - def binding(v): - if isinstance(v, Constant): - return LLRuntimeValue(orig_v=v) - else: - return bindings[v] + builder = BlockBuilder(self.interp) for v, a in zip(origblock.inputargs, state.args_a): - if not isinstance(a, LLConcreteValue): - a = LLRuntimeValue(orig_v=v) - bindings[v] = a + builder.bindings[v] = a.with_fresh_variables(to_be_stored_into=v) print - self.residual_operations = [] + # flow the actual operations of the block for op in origblock.operations: - handler = getattr(self, 'op_' + op.opname) - a_result = handler(op, *[binding(v) for v in op.args]) - bindings[op.result] = a_result - inputargs = [] - for v in origblock.inputargs: - a = bindings[v] - if not isinstance(a, LLConcreteValue): - inputargs.append(a.getvarorconst()) - newblock = Block(inputargs) - newblock.operations = self.residual_operations - del self.residual_operations # just in case + builder.dispatch(op) + # done + newexitswitch = None if origblock.operations != (): # build exit links and schedule their target for later completion if origblock.exitswitch is None: @@ -266,16 +266,17 @@ elif origblock.exitswitch == Constant(last_exception): XXX else: - v = bindings[origblock.exitswitch].getvarorconst() + a = builder.bindings[origblock.exitswitch] + v = a.forcevarorconst(builder) if isinstance(v, Variable): - newblock.exitswitch = v + newexitswitch = v links = origblock.exits else: links = [link for link in origblock.exits if link.llexitcase == v.value] newlinks = [] for origlink in links: - args_a = [binding(v) for v in origlink.args] + 
args_a = [builder.binding(v) for v in origlink.args] newlink = self.interp.schedule(args_a, origlink.target) newlinks.append(newlink) else: @@ -283,24 +284,57 @@ # they are linked to the official return or except block of the # copygraph. If needed, LLConcreteValues are turned into Constants. if len(origblock.inputargs) == 1: - target = self.graphstate.copygraph.returnblock + target = self.copygraph.returnblock else: - target = self.graphstate.copygraph.exceptblock - args_v = [binding(v).getvarorconst() for v in origblock.inputargs] + target = self.copygraph.exceptblock + args_v = [builder.binding(v).forcevarorconst(builder) + for v in origblock.inputargs] newlinks = [Link(args_v, target)] #print "CLOSING" - newblock.closeblock(*newlinks) + + newblock = builder.buildblock(origblock.inputargs, + newexitswitch, newlinks) state.resolveblock(newblock) + +class BlockBuilder(object): + + def __init__(self, interp): + self.interp = interp + self.bindings = {} # {Variables-of-origblock: a_value} + self.residual_operations = [] + + def buildblock(self, originputargs, newexitswitch, newlinks): + inputargs = [] + for v in originputargs: + a = self.bindings[v] + inputargs.extend(a.getruntimevars()) + b = Block(inputargs) + b.operations = self.residual_operations + b.exitswitch = newexitswitch + b.closeblock(*newlinks) + return b + + def binding(self, v): + if isinstance(v, Constant): + return LLRuntimeValue(orig_v=v) + else: + return self.bindings[v] + + def dispatch(self, op): + handler = getattr(self, 'op_' + op.opname) + a_result = handler(op, *[self.binding(v) for v in op.args]) + self.bindings[op.result] = a_result + + def constantfold(self, constant_op, args_a): concretevalues = [] any_concrete = False for a in args_a: - v = a.getvarorconst() - if isinstance(v, Constant): - concretevalues.append(v.value) - else: + v = a.maybe_get_constant() + if v is None: return None # cannot constant-fold + concretevalues.append(v.value) any_concrete = any_concrete or isinstance(a, 
LLConcreteValue) # can constant-fold print 'fold:', constant_op, concretevalues @@ -313,13 +347,13 @@ return LLRuntimeValue(c) def residual(self, opname, args_a, a_result): - v_result = a_result.getvarorconst() + v_result = a_result.forcevarorconst(self) if isinstance(v_result, Constant): v = Variable() v.concretetype = v_result.concretetype v_result = v op = SpaceOperation(opname, - [a.getvarorconst() for a in args_a], + [a.forcevarorconst(self) for a in args_a], v_result) print 'keep:', op self.residual_operations.append(op) @@ -385,8 +419,8 @@ def op_direct_call(self, op, a_func, *args_a): a_result = LLRuntimeValue(op.result) - v_func = a_func.getvarorconst() - if isinstance(v_func, Constant): + v_func = a_func.maybe_get_constant() + if v_func is not None: fnobj = v_func.value._obj if (hasattr(fnobj, 'graph') and not getattr(fnobj._callable, 'suggested_primitive', False)): @@ -396,10 +430,9 @@ #print 'SCHEDULE_GRAPH', args_a, '==>', graphstate.copygraph.name if graphstate.state != "during": print 'ENTERING', graphstate.copygraph.name, args_a - graphstate.complete(self.interp) + graphstate.complete() if (graphstate.a_return is not None and - isinstance(graphstate.a_return.getvarorconst(), - Constant)): + graphstate.a_return.maybe_get_constant() is not None): a_result = graphstate.a_return print 'LEAVING', graphstate.copygraph.name, graphstate.a_return From cfbolz at codespeak.net Sat Dec 10 14:23:44 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Sat, 10 Dec 2005 14:23:44 +0100 (CET) Subject: [pypy-svn] r20997 - in pypy/dist/pypy/translator/c: . 
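The r20996 refactoring replaces `isinstance(_, LLXxxValue)` checks with method calls such as `maybe_get_constant()`; `BlockBuilder.constantfold` then folds an operation only when every argument answers with a constant. A stripped-down version of that protocol (toy classes, not the real llabstractinterp ones):

```python
class AbstractValue:
    def maybe_get_constant(self):
        return None            # default: value only known at runtime

class ConstValue(AbstractValue):
    def __init__(self, value):
        self.value = value
    def maybe_get_constant(self):
        return self            # this abstract value knows its concrete value

class RuntimeValue(AbstractValue):
    pass

def constantfold(op, args_a, residual_ops):
    """Fold 'op' if all arguments are constants; otherwise keep it as a
    residual operation, as BlockBuilder.constantfold/residual do."""
    concretevalues = []
    for a in args_a:
        c = a.maybe_get_constant()
        if c is None:
            residual_ops.append((op, args_a))       # keep: cannot fold
            return RuntimeValue()
        concretevalues.append(c.value)
    return ConstValue(op(*concretevalues))          # fold: no residual op
```

The payoff of the method-based protocol is that new value kinds (such as the virtual pointers added later) can participate in folding just by answering `maybe_get_constant()` appropriately, with no isinstance checks to update.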
test Message-ID: <20051210132344.1250D27DC8@code1.codespeak.net> Author: cfbolz Date: Sat Dec 10 14:23:43 2005 New Revision: 20997 Modified: pypy/dist/pypy/translator/c/gc.py pypy/dist/pypy/translator/c/test/test_boehm.py Log: boehm should support __del__ now too -- code looks good, a bit hard to test Modified: pypy/dist/pypy/translator/c/gc.py ============================================================================== --- pypy/dist/pypy/translator/c/gc.py (original) +++ pypy/dist/pypy/translator/c/gc.py Sat Dec 10 14:23:43 2005 @@ -347,10 +347,18 @@ # for structs def struct_setup(self, structdefnode, rtti): - if isinstance(structdefnode.LLTYPE, GcStruct) and list(self.deallocator_lines(structdefnode, '')): + if isinstance(structdefnode.LLTYPE, GcStruct): + has_del = rtti is not None and hasattr(rtti._obj, 'destructor_funcptr') gcinfo = structdefnode.gcinfo = BoehmInfo() gcinfo.finalizer = self.db.namespace.uniquename('finalize_'+structdefnode.barename) - + if list(self.deallocator_lines(structdefnode, '')): + if has_del: + raise Exception("you cannot use __del__ with PyObjects and Boehm") + if has_del: + destrptr = rtti._obj.destructor_funcptr + gcinfo.destructor = self.db.get(destrptr) + T = typeOf(destrptr).TO.ARGS[0] + gcinfo.destructor_argtype = self.db.gettype(T) struct_after_definition = common_after_definition def struct_implementationcode(self, structdefnode): @@ -359,6 +367,9 @@ if gcinfo.finalizer: yield 'void %s(GC_PTR obj, GC_PTR ignore) {' % gcinfo.finalizer yield '\tstruct %s *p = (struct %s *)obj;' % (structdefnode.name, structdefnode.name) + if hasattr(gcinfo, 'destructor'): + yield '\t%s((%s) p);' % ( + gcinfo.destructor, cdecl(gcinfo.destructor_argtype, '')) for line in self.deallocator_lines(structdefnode, '(*p)'): yield '\t' + line yield '}' Modified: pypy/dist/pypy/translator/c/test/test_boehm.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_boehm.py (original) +++ 
pypy/dist/pypy/translator/c/test/test_boehm.py Sat Dec 10 14:23:43 2005 @@ -39,11 +39,41 @@ fn = getcompiled(malloc_a_lot) fn() +def test__del__(): + class State: + pass + s = State() + class A(object): + def __del__(self): + s.a_dels += 1 + class B(A): + def __del__(self): + s.b_dels += 1 + class C(A): + pass + def f(): + s.a_dels = 0 + s.b_dels = 0 + A() + B() + C() + A() + B() + C() + return s.a_dels * 10 + s.b_dels + fn = getcompiled(f) + res = f() + assert res == 42 + res = fn() #does not crash + res = fn() #does not crash + assert 0 <= res <= 42 # 42 cannot be guaranteed + def run_test(fn): fn() channel.send("ok") run_test(test_malloc_a_lot) +run_test(test__del__) """ @@ -57,6 +87,8 @@ chan = gw.remote_exec(py.code.Source(test_src)) res = chan.receive() assert res == "ok" + res = chan.receive() + assert res == "ok" chan.close() From mwh at codespeak.net Sat Dec 10 14:34:14 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Sat, 10 Dec 2005 14:34:14 +0100 (CET) Subject: [pypy-svn] r20999 - in pypy/dist/pypy/translator: asm/ppcgen c/test Message-ID: <20051210133414.9407B27DCC@code1.codespeak.net> Author: mwh Date: Sat Dec 10 14:34:13 2005 New Revision: 20999 Modified: pypy/dist/pypy/translator/asm/ppcgen/_ppcgen.c pypy/dist/pypy/translator/c/test/test_boehm.py Log: remove this strange hack. now the test is skipped (as before) but if the skip is commented out the test can be run in the regular way. 
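The Boehm test above ends with `assert 0 <= res <= 42  # 42 cannot be guaranteed`: a conservative collector may not have finalized every dead object by the time the function returns. Under CPython's reference counting the same code is deterministic, which is where the expected value 42 comes from: four calls to `A.__del__` (two `A()` and two `C()` instances) and two to `B.__del__`, since `B.__del__` overrides without chaining up. Re-run directly under CPython:

```python
class State:
    pass

s = State()

class A(object):
    def __del__(self):
        s.a_dels += 1

class B(A):
    def __del__(self):          # overrides A.__del__ and does not call it
        s.b_dels += 1

class C(A):
    pass                        # inherits A.__del__ unchanged

def f():
    s.a_dels = 0
    s.b_dels = 0
    A(); B(); C(); A(); B(); C()    # each dies as soon as its refcount hits 0
    return s.a_dels * 10 + s.b_dels
```

Under CPython's refcounting `f()` yields 42 every time; a Boehm-compiled build may finalize any subset of the six objects before returning, hence the relaxed range assertion in the test.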
Modified: pypy/dist/pypy/translator/asm/ppcgen/_ppcgen.c ============================================================================== --- pypy/dist/pypy/translator/asm/ppcgen/_ppcgen.c (original) +++ pypy/dist/pypy/translator/asm/ppcgen/_ppcgen.c Sat Dec 10 14:34:13 2005 @@ -121,6 +121,10 @@ {0, 0} }; +#if !defined(MAP_ANON) && defined(__APPLE__) +#define MAP_ANON 0x1000 +#endif + PyMODINIT_FUNC init_ppcgen(void) { Modified: pypy/dist/pypy/translator/c/test/test_boehm.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_boehm.py (original) +++ pypy/dist/pypy/translator/c/test/test_boehm.py Sat Dec 10 14:34:13 2005 @@ -1,9 +1,11 @@ import py -test_src = """ from pypy.translator.translator import TranslationContext from pypy.translator.tool.cbuild import skip_missing_compiler from pypy.translator.c.genc import CExtModuleBuilder +py.test.skip("boehm test is fragile wrt. the number of dynamically loaded libs") + + def getcompiled(func): from pypy.translator.c.gc import BoehmGcPolicy t = TranslationContext(simplifying=True) @@ -67,28 +69,3 @@ res = fn() #does not crash res = fn() #does not crash assert 0 <= res <= 42 # 42 cannot be guaranteed - -def run_test(fn): - fn() - channel.send("ok") - -run_test(test_malloc_a_lot) -run_test(test__del__) -""" - - -def test_boehm(): - import py - py.test.skip("boehm test is fragile wrt. the number of dynamically loaded libs") - from pypy.translator.tool import cbuild - if not cbuild.check_boehm_presence(): - py.test.skip("no boehm gc on this machine") - gw = py.execnet.PopenGateway() - chan = gw.remote_exec(py.code.Source(test_src)) - res = chan.receive() - assert res == "ok" - res = chan.receive() - assert res == "ok" - chan.close() - - From arigo at codespeak.net Sat Dec 10 15:37:00 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 10 Dec 2005 15:37:00 +0100 (CET) Subject: [pypy-svn] r21002 - in pypy/dist/pypy/jit: . 
test Message-ID: <20051210143700.5CE3327DC8@code1.codespeak.net> Author: arigo Date: Sat Dec 10 15:36:57 2005 New Revision: 21002 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_llabstractinterp.py Log: Support virtual structures. Missing: support for inlined substructures and related operations, e.g. cast_pointer. Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Sat Dec 10 15:36:57 2005 @@ -6,6 +6,17 @@ from pypy.translator.simplify import eliminate_empty_blocks, join_blocks +def const(value, T=None): + c = Constant(value) + c.concretetype = T or lltype.typeOf(value) + return c + +def newvar(T): + v = Variable() + v.concretetype = T + return v + + class LLAbstractValue(object): pass @@ -30,17 +41,13 @@ return lltype.typeOf(self.value) def forcevarorconst(self, builder): - c = Constant(self.value) - c.concretetype = self.getconcretetype() - return c + return const(self.value) def getruntimevars(self): return [] def maybe_get_constant(self): - c = Constant(self.value) - c.concretetype = self.getconcretetype() - return c + return const(self.value) def with_fresh_variables(self, to_be_stored_into): return self @@ -81,13 +88,104 @@ return LLRuntimeValue(orig_v=to_be_stored_into) def match(self, other): - return isinstance(other, LLRuntimeValue) # XXX and ... 
+ if isinstance(other, LLRuntimeValue): + if isinstance(self.copy_v, Variable): + return isinstance(other.copy_v, Variable) + else: + return self.copy_v == other.copy_v + else: + return False + +ll_no_return_value = LLRuntimeValue(const(None, lltype.Void)) + + +class VirtualStruct(object): + def __init__(self, STRUCT): + self.T = STRUCT + self.fields = {} + + def getfield(self, name): + try: + return self.fields[name] + except KeyError: + T = getattr(self.T, name) + return LLRuntimeValue(const(T._defl())) + + def setfield(self, name, value): + self.fields[name] = value + + def copy(self): + result = VirtualStruct(self.T) + for name, a_value in self.fields.items(): + v = newvar(a_value.getconcretetype()) + result.fields[name] = a_value.with_fresh_variables(v) + return result + + def force(self, builder): + v_result = newvar(lltype.Ptr(self.T)) + op = SpaceOperation('malloc', [const(self.T, lltype.Void)], v_result) + print 'force:', op + builder.residual_operations.append(op) + # initialize all fields by relying on the assumption that the + # structure is initialized to zeros + for name in self.T._names: + if name in self.fields: + v_value = self.fields[name].forcevarorconst(builder) + op = SpaceOperation('setfield', [v_result, + const(name, lltype.Void), + v_value], + newvar(lltype.Void)) + print 'force:', op + builder.residual_operations.append(op) + return v_result + + def getruntimevars(self): + result = [] + for name in self.T._names: + result.extend(self.getfield(name).getruntimevars()) + return result + + def match(self, other): + assert self.T == other.T + for name in self.T._names: + a1 = self.getfield(name) + a2 = other.getfield(name) + if not a1.match(a2): + return False + else: + return True + + +class LLVirtualPtr(LLAbstractValue): + + def __init__(self, containerobj): + self.containerobj = containerobj # a VirtualStruct + + def getconcretetype(self): + return lltype.Ptr(self.containerobj.T) + + def forcevarorconst(self, builder): + v_result = 
self.containerobj.force(builder) + self.__class__ = LLRuntimeValue + self.__dict__ = {'copy_v': v_result} + return v_result -orig_v = Constant(None) -orig_v.concretetype = lltype.Void -ll_no_return_value = LLRuntimeValue(orig_v) -del orig_v + def getruntimevars(self): + return self.containerobj.getruntimevars() + + def maybe_get_constant(self): + return None + + def with_fresh_variables(self, to_be_stored_into): + return LLVirtualPtr(self.containerobj.copy()) + def match(self, other): + if isinstance(other, LLVirtualPtr): + return self.containerobj.match(other.containerobj) + else: + return False + +# ____________________________________________________________ class BlockState(object): """Entry state of a block, as a combination of LLAbstractValues @@ -116,13 +214,12 @@ class LLAbstractInterp(object): def __init__(self): - self.graphs = {} # {origgraph: {BlockState: GraphState}} + self.graphs = [] + self.graphstates = {} # {origgraph: {BlockState: GraphState}} self.pendingstates = {} # {Link-or-GraphState: next-BlockState} def itercopygraphs(self): - for d in self.graphs.itervalues(): - for graphstate in d.itervalues(): - yield graphstate.copygraph + return self.graphs def eval(self, origgraph, hints): # for now, 'hints' means "I'm absolutely sure that the @@ -148,9 +245,9 @@ origblock = origgraph.startblock state, args_a = self.schedule_getstate(args_a, origblock) try: - graphstate = self.graphs[origgraph][state] + graphstate = self.graphstates[origgraph][state] except KeyError: - d = self.graphs.setdefault(origgraph, {}) + d = self.graphstates.setdefault(origgraph, {}) graphstate = GraphState(self, origgraph, args_a, n=len(d)) d[state] = graphstate self.pendingstates[graphstate] = state @@ -192,6 +289,7 @@ self.origgraph = origgraph name = '%s_%d' % (origgraph.name, n) self.copygraph = FunctionGraph(name, Block([])) # grumble + interp.graphs.append(self.copygraph) for orig_v, copy_v in [(origgraph.getreturnvar(), self.copygraph.getreturnvar()), 
                                  (origgraph.exceptblock.inputargs[0],
@@ -250,8 +348,11 @@
         # flow in the block
         origblock = state.origblock
         builder = BlockBuilder(self.interp)
+        newinputargs = []
         for v, a in zip(origblock.inputargs, state.args_a):
-            builder.bindings[v] = a.with_fresh_variables(to_be_stored_into=v)
+            a = a.with_fresh_variables(to_be_stored_into=v)
+            builder.bindings[v] = a
+            newinputargs.extend(a.getruntimevars())
         print
         # flow the actual operations of the block
         for op in origblock.operations:
@@ -292,8 +393,7 @@
             newlinks = [Link(args_v, target)]
         #print "CLOSING"
-        newblock = builder.buildblock(origblock.inputargs,
-                                      newexitswitch, newlinks)
+        newblock = builder.buildblock(newinputargs, newexitswitch, newlinks)
         state.resolveblock(newblock)
@@ -304,12 +404,8 @@
         self.bindings = {}      # {Variables-of-origblock: a_value}
         self.residual_operations = []

-    def buildblock(self, originputargs, newexitswitch, newlinks):
-        inputargs = []
-        for v in originputargs:
-            a = self.bindings[v]
-            inputargs.extend(a.getruntimevars())
-        b = Block(inputargs)
+    def buildblock(self, newinputargs, newexitswitch, newlinks):
+        b = Block(newinputargs)
         b.operations = self.residual_operations
         b.exitswitch = newexitswitch
         b.closeblock(*newlinks)
@@ -349,9 +445,7 @@
     def residual(self, opname, args_a, a_result):
         v_result = a_result.forcevarorconst(self)
         if isinstance(v_result, Constant):
-            v = Variable()
-            v.concretetype = v_result.concretetype
-            v_result = v
+            v_result = newvar(v_result.concretetype)
         op = SpaceOperation(opname,
                             [a.forcevarorconst(self) for a in args_a],
                             v_result)
@@ -455,6 +549,10 @@
         return a_result

     def op_getfield(self, op, a_ptr, a_attrname):
+        if isinstance(a_ptr, LLVirtualPtr):
+            c_attrname = a_attrname.maybe_get_constant()
+            assert c_attrname is not None
+            return a_ptr.containerobj.getfield(c_attrname.value)
         constant_op = None
         T = a_ptr.getconcretetype().TO
         if T._hints.get('immutable', False):
@@ -475,12 +573,20 @@
         return self.residualize(op, [a_ptr, a_index], constant_op)

     def op_malloc(self, op, a_T):
-        return self.residualize(op, [a_T])
+        c_T = a_T.maybe_get_constant()
+        assert c_T is not None
+        S = VirtualStruct(c_T.value)
+        return LLVirtualPtr(S)

     def op_malloc_varsize(self, op, a_T, a_size):
         return self.residualize(op, [a_T, a_size])

     def op_setfield(self, op, a_ptr, a_attrname, a_value):
+        if isinstance(a_ptr, LLVirtualPtr):
+            c_attrname = a_attrname.maybe_get_constant()
+            assert c_attrname is not None
+            a_ptr.containerobj.setfield(c_attrname.value, a_value)
+            return ll_no_return_value
         return self.residualize(op, [a_ptr, a_attrname, a_value])

     def op_setarrayitem(self, op, a_ptr, a_index, a_value):

Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py
==============================================================================
--- pypy/dist/pypy/jit/test/test_llabstractinterp.py	(original)
+++ pypy/dist/pypy/jit/test/test_llabstractinterp.py	Sat Dec 10 15:36:57 2005
@@ -182,3 +182,13 @@
     graph2, insns = abstrinterp(ll_function, [7], [0])
     # the direct_calls are messy to count, with calls to ll_stack_check
     assert insns.keys() == ['direct_call']
+
+def test_simple_malloc_removal():
+    S = lltype.GcStruct('S', ('n', lltype.Signed))
+    def ll_function(k):
+        s = lltype.malloc(S)
+        s.n = k
+        l = s.n
+        return l+1
+    graph2, insns = abstrinterp(ll_function, [7], [0])
+    assert insns == {}

From cfbolz at codespeak.net  Sat Dec 10 15:49:31 2005
From: cfbolz at codespeak.net (cfbolz at codespeak.net)
Date: Sat, 10 Dec 2005 15:49:31 +0100 (CET)
Subject: [pypy-svn] r21003 - pypy/extradoc/sprintinfo/gothenburg-2005
Message-ID: <20051210144931.D47C327DC7@code1.codespeak.net>

Author: cfbolz
Date: Sat Dec 10 15:49:30 2005
New Revision: 21003

Modified:
   pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt
Log:
this is done now


Modified: pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt
==============================================================================
--- pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt	(original)
+++ pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt	Sat Dec 10 15:49:30 2005
@@ -50,9 +50,7 @@
 GC, __del__, weakref
 ~~~~~~~~~~~~~~~~~~~~~
-(Carl Friedrich)
-- implement __del__ support in the RTyper and backends (FINISHED for
-  refcounting, some work left for Boehm)
+- implement __del__ support in the RTyper and backends (DONE)
 (- possibly implement weakref (at least with Boehm))
 - integrate GC construction framework in the backends

From arigo at codespeak.net  Sat Dec 10 16:35:25 2005
From: arigo at codespeak.net (arigo at codespeak.net)
Date: Sat, 10 Dec 2005 16:35:25 +0100 (CET)
Subject: [pypy-svn] r21005 - in pypy/dist/pypy/jit: . test
Message-ID: <20051210153525.DAB2127DD3@code1.codespeak.net>

Author: arigo
Date: Sat Dec 10 16:35:23 2005
New Revision: 21005

Modified:
   pypy/dist/pypy/jit/llabstractinterp.py
   pypy/dist/pypy/jit/test/test_llabstractinterp.py
Log:
(pedronis, mwh, arigo)
First shot at substructures.  For now, just enough to pass the test.


Modified: pypy/dist/pypy/jit/llabstractinterp.py
==============================================================================
--- pypy/dist/pypy/jit/llabstractinterp.py	(original)
+++ pypy/dist/pypy/jit/llabstractinterp.py	Sat Dec 10 16:35:23 2005
@@ -49,7 +49,7 @@
     def maybe_get_constant(self):
         return const(self.value)

-    def with_fresh_variables(self, to_be_stored_into):
+    def with_fresh_variables(self, memo):
         return self

     def match(self, other):
@@ -62,9 +62,14 @@
         if isinstance(orig_v, Variable):
             self.copy_v = Variable(orig_v)
             self.copy_v.concretetype = orig_v.concretetype
-        else:
+        elif isinstance(orig_v, Constant):
             # we can share the Constant()
             self.copy_v = orig_v
+        elif isinstance(orig_v, lltype.LowLevelType):
+            # hackish interface :-(  we accept a type too
+            self.copy_v = newvar(orig_v)
+        else:
+            raise TypeError(repr(orig_v))

     def __repr__(self):
         return '' % (self.copy_v,)
@@ -84,8 +89,8 @@
         else:
             return None

-    def with_fresh_variables(self, to_be_stored_into):
-        return LLRuntimeValue(orig_v=to_be_stored_into)
+    def with_fresh_variables(self, memo):
+        return LLRuntimeValue(self.getconcretetype())

     def match(self, other):
         if isinstance(other, LLRuntimeValue):
@@ -100,52 +105,122 @@

 class VirtualStruct(object):
+    parent = None
+    parentindex = None
+
     def __init__(self, STRUCT):
         self.T = STRUCT
         self.fields = {}

+    def setparent(self, parent, parentindex):
+        self.parent = parent
+        self.parentindex = parentindex
+
+    def topmostparent(self):
+        obj = self
+        while obj.parent is not None:
+            obj = obj.parent
+        return obj
+
     def getfield(self, name):
         try:
             return self.fields[name]
         except KeyError:
             T = getattr(self.T, name)
-            return LLRuntimeValue(const(T._defl()))
+            if isinstance(T, lltype.ContainerType):
+                # reading a substructure
+                substr = VirtualStruct(T)
+                substr.setparent(self, name)
+                a_result = LLVirtualPtr(substr)
+                self.fields[name] = a_result
+                return a_result
+            else:
+                # no value ever set, return a default
+                return LLRuntimeValue(const(T._defl()))

     def setfield(self, name, value):
         self.fields[name] = value

-    def copy(self):
-        result = VirtualStruct(self.T)
-        for name, a_value in self.fields.items():
-            v = newvar(a_value.getconcretetype())
-            result.fields[name] = a_value.with_fresh_variables(v)
-        return result
+    def copy(self, memo):
+        if self in memo:
+            return memo[self]    # already seen
+        else:
+            result = VirtualStruct(self.T)
+            memo[self] = result
+            if self.parent is not None:
+                # build the parent first -- note that parent.copy() will pick
+                # up 'result' again, because it is already in the memo
+                result.setparent(self.parent.copy(memo), self.parentindex)
+
+            for name, a_value in self.fields.items():
+                a = a_value.with_fresh_variables(memo)
+                result.fields[name] = a
+            return result

     def force(self, builder):
         v_result = newvar(lltype.Ptr(self.T))
-        op = SpaceOperation('malloc', [const(self.T, lltype.Void)], v_result)
-        print 'force:', op
-        builder.residual_operations.append(op)
-        # initialize all fields by relying on the assumption that the
-        # structure is initialized to zeros
+        if self.parent is not None:
+            v_parent = self.parent.force(builder)
+            op = SpaceOperation('getsubstruct', [v_parent,
+                                                 const(self.parentindex,
+                                                       lltype.Void)],
+                                v_result)
+            print 'force:', op
+            builder.residual_operations.append(op)
+        else:
+            op = SpaceOperation('malloc', [const(self.T, lltype.Void)], v_result)
+            print 'force:', op
+            builder.residual_operations.append(op)
+        self.buildcontent(builder, v_result)
+        return v_result
+
+    def buildcontent(self, builder, v_target):
+        # initialize all fields
         for name in self.T._names:
             if name in self.fields:
-                v_value = self.fields[name].forcevarorconst(builder)
-                op = SpaceOperation('setfield', [v_result,
-                                                 const(name, lltype.Void),
-                                                 v_value],
-                                    newvar(lltype.Void))
-                print 'force:', op
-                builder.residual_operations.append(op)
-        return v_result
+                a_value = self.fields[name]
+                T = getattr(self.T, name)
+                if isinstance(T, lltype.ContainerType):
+                    # initialize the substructure
+                    v_subptr = newvar(lltype.Ptr(T))
+                    op = SpaceOperation('getsubstruct',
+                                        [v_target, const(name, lltype.Void)],
+                                        v_subptr)
+                    print 'force:', op
+                    builder.residual_operations.append(op)
+                    assert isinstance(a_value, LLVirtualPtr)
+                    a_value.containerobj.buildcontent(builder, v_subptr)
+                else:
+                    v_value = a_value.forcevarorconst(builder)
+                    op = SpaceOperation('setfield', [v_target,
+                                                     const(name, lltype.Void),
+                                                     v_value],
+                                        newvar(lltype.Void))
+                    print 'force:', op
+                    builder.residual_operations.append(op)
+
+    def rec_fields(self):
+        # enumerate all the fields of this structure and each of
+        # its substructures
+        for name in self.T._names:
+            a_value = self.getfield(name)
+            T = getattr(self.T, name)
+            if isinstance(T, lltype.ContainerType):
+                assert isinstance(a_value, LLVirtualPtr)
+                for obj, fld in a_value.containerobj.rec_fields():
+                    yield obj, fld
+            else:
+                yield self, name

     def getruntimevars(self):
         result = []
-        for name in self.T._names:
-            result.extend(self.getfield(name).getruntimevars())
+        for obj, name in self.topmostparent().rec_fields():
+            result.extend(obj.getfield(name).getruntimevars())
         return result

     def match(self, other):
+        if self is other:
+            return True
         assert self.T == other.T
         for name in self.T._names:
             a1 = self.getfield(name)
@@ -176,8 +251,8 @@
     def maybe_get_constant(self):
         return None

-    def with_fresh_variables(self, to_be_stored_into):
-        return LLVirtualPtr(self.containerobj.copy())
+    def with_fresh_variables(self, memo):
+        return LLVirtualPtr(self.containerobj.copy(memo))

     def match(self, other):
         if isinstance(other, LLVirtualPtr):
@@ -349,8 +424,12 @@
         origblock = state.origblock
         builder = BlockBuilder(self.interp)
         newinputargs = []
+        memo = {}
         for v, a in zip(origblock.inputargs, state.args_a):
-            a = a.with_fresh_variables(to_be_stored_into=v)
+            a = a.with_fresh_variables(memo)
+            # try to preserve the name
+            if isinstance(a, LLRuntimeValue) and isinstance(a.copy_v, Variable):
+                a.copy_v.rename(v)
             builder.bindings[v] = a
             newinputargs.extend(a.getruntimevars())
         print
@@ -438,9 +517,7 @@
         if any_concrete:
             return LLConcreteValue(concreteresult)
         else:
-            c = Constant(concreteresult)
-            c.concretetype = typeOf(concreteresult)
-            return LLRuntimeValue(c)
+            return LLRuntimeValue(const(concreteresult))

     def residual(self, opname, args_a, a_result):
         v_result = a_result.forcevarorconst(self)
@@ -542,9 +619,7 @@
             ARGS, lltype.typeOf(origfptr).TO.RESULT)
         fptr = lltype.functionptr(
             TYPE, graphstate.copygraph.name, graph=graphstate.copygraph)
-        fconst = Constant(fptr)
-        fconst.concretetype = lltype.typeOf(fptr)
-        a_func = LLRuntimeValue(fconst)
+        a_func = LLRuntimeValue(const(fptr))
         self.residual("direct_call", [a_func] + list(args_a), a_result)
         return a_result
@@ -560,6 +635,11 @@
         return self.residualize(op, [a_ptr, a_attrname], constant_op)

     def op_getsubstruct(self, op, a_ptr, a_attrname):
+        if isinstance(a_ptr, LLVirtualPtr):
+            c_attrname = a_attrname.maybe_get_constant()
+            assert c_attrname is not None
+            # this should return a new LLVirtualPtr
+            return a_ptr.containerobj.getfield(c_attrname.value)
         return self.residualize(op, [a_ptr, a_attrname], getattr)

     def op_getarraysize(self, op, a_ptr):

Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py
==============================================================================
--- pypy/dist/pypy/jit/test/test_llabstractinterp.py	(original)
+++ pypy/dist/pypy/jit/test/test_llabstractinterp.py	Sat Dec 10 16:35:23 2005
@@ -192,3 +192,14 @@
         return l+1
     graph2, insns = abstrinterp(ll_function, [7], [0])
     assert insns == {}
+
+def test_inlined_substructure():
+    S = lltype.Struct('S', ('n', lltype.Signed))
+    T = lltype.GcStruct('T', ('s', S), ('n', lltype.Float))
+    def ll_function(k):
+        t = lltype.malloc(T)
+        t.s.n = k
+        l = t.s.n
+        return l
+    graph2, insns = abstrinterp(ll_function, [7], [0])
+    assert insns == {}

From arigo at codespeak.net  Sat Dec 10 16:44:13 2005
From: arigo at codespeak.net (arigo at codespeak.net)
Date: Sat, 10 Dec 2005 16:44:13 +0100 (CET)
Subject: [pypy-svn] r21006 - pypy/dist/pypy/jit
Message-ID: <20051210154413.B1E0E27DD5@code1.codespeak.net>

Author: arigo
Date: Sat Dec 10 16:44:11 2005
New Revision: 21006

Modified:
   pypy/dist/pypy/jit/llabstractinterp.py
Log:
Clarified the meaning of match().


Modified: pypy/dist/pypy/jit/llabstractinterp.py
==============================================================================
--- pypy/dist/pypy/jit/llabstractinterp.py	(original)
+++ pypy/dist/pypy/jit/llabstractinterp.py	Sat Dec 10 16:44:11 2005
@@ -93,13 +93,10 @@
         return LLRuntimeValue(self.getconcretetype())

     def match(self, other):
-        if isinstance(other, LLRuntimeValue):
-            if isinstance(self.copy_v, Variable):
-                return isinstance(other.copy_v, Variable)
-            else:
-                return self.copy_v == other.copy_v
-        else:
-            return False
+        # Note: the meaning of match() is actually to see if calling
+        # with_fresh_variables() on both 'self' and 'other' would give the
+        # same result.  This is why any two LLRuntimeValues match each other.
+        return isinstance(other, LLRuntimeValue)

 ll_no_return_value = LLRuntimeValue(const(None, lltype.Void))

From arigo at codespeak.net  Sat Dec 10 16:58:32 2005
From: arigo at codespeak.net (arigo at codespeak.net)
Date: Sat, 10 Dec 2005 16:58:32 +0100 (CET)
Subject: [pypy-svn] r21007 - in pypy/dist/pypy/jit: . test
Message-ID: <20051210155832.980B927DD5@code1.codespeak.net>

Author: arigo
Date: Sat Dec 10 16:58:30 2005
New Revision: 21007

Modified:
   pypy/dist/pypy/jit/llabstractinterp.py
   pypy/dist/pypy/jit/test/test_llabstractinterp.py
Log:
Added a test and a corresponding bug fix.


Modified: pypy/dist/pypy/jit/llabstractinterp.py
==============================================================================
--- pypy/dist/pypy/jit/llabstractinterp.py	(original)
+++ pypy/dist/pypy/jit/llabstractinterp.py	Sat Dec 10 16:58:30 2005
@@ -149,8 +149,10 @@
                 # up 'result' again, because it is already in the memo
                 result.setparent(self.parent.copy(memo), self.parentindex)

-            for name, a_value in self.fields.items():
-                a = a_value.with_fresh_variables(memo)
+            # cannot keep lazy fields around: the copy is expected to have
+            # only variables, not constants
+            for name in self.T._names:
+                a = self.getfield(name).with_fresh_variables(memo)
                 result.fields[name] = a
             return result

Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py
==============================================================================
--- pypy/dist/pypy/jit/test/test_llabstractinterp.py	(original)
+++ pypy/dist/pypy/jit/test/test_llabstractinterp.py	Sat Dec 10 16:58:30 2005
@@ -203,3 +203,22 @@
         return l
     graph2, insns = abstrinterp(ll_function, [7], [0])
     assert insns == {}
+
+def test_merge_with_inlined_substructure():
+    S = lltype.Struct('S', ('n1', lltype.Signed), ('n2', lltype.Signed))
+    T = lltype.GcStruct('T', ('s', S), ('n', lltype.Float))
+    def ll_function(k, flag):
+        if flag:
+            t = lltype.malloc(T)
+            t.s.n1 = k
+            t.s.n2 = flag
+        else:
+            t = lltype.malloc(T)
+            t.s.n1 = 14 - k
+            t.s.n2 = flag + 42
+        # 't.s.n1' should always be 7 here, so the two branches should merge
+        n1 = t.s.n1
+        n2 = t.s.n2
+        return n1 * n2
+    graph2, insns = abstrinterp(ll_function, [7, 1], [0])
+    assert insns == {'int_is_true': 1, 'int_add': 1, 'int_mul': 1}

From arigo at codespeak.net  Sat Dec 10 17:38:22 2005
From: arigo at codespeak.net (arigo at codespeak.net)
Date: Sat, 10 Dec 2005 17:38:22 +0100 (CET)
Subject: [pypy-svn] r21008 - in pypy/dist/pypy/jit: . test
Message-ID: <20051210163822.5BCEB27DD5@code1.codespeak.net>

Author: arigo
Date: Sat Dec 10 17:38:20 2005
New Revision: 21008

Modified:
   pypy/dist/pypy/jit/llabstractinterp.py
   pypy/dist/pypy/jit/test/test_jit_tl.py
Log:
In-progress: be careful about aliasing -- i.e. structures that can be
reached via several pointers, e.g. several local vars or something
more indirect.


Modified: pypy/dist/pypy/jit/llabstractinterp.py
==============================================================================
--- pypy/dist/pypy/jit/llabstractinterp.py	(original)
+++ pypy/dist/pypy/jit/llabstractinterp.py	Sat Dec 10 17:38:20 2005
@@ -43,7 +43,7 @@
     def forcevarorconst(self, builder):
         return const(self.value)

-    def getruntimevars(self):
+    def getruntimevars(self, memo):
         return []

     def maybe_get_constant(self):
@@ -80,7 +80,7 @@
     def forcevarorconst(self, builder):
         return self.copy_v

-    def getruntimevars(self):
+    def getruntimevars(self, memo):
         return [self.copy_v]

     def maybe_get_constant(self):
@@ -199,6 +199,7 @@
             builder.residual_operations.append(op)

     def rec_fields(self):
+        # -- not used at the moment --
        # enumerate all the fields of this structure and each of
        # its substructures
         for name in self.T._names:
@@ -211,10 +212,14 @@
             else:
                 yield self, name

-    def getruntimevars(self):
+    def getruntimevars(self, memo):
         result = []
-        for obj, name in self.topmostparent().rec_fields():
-            result.extend(obj.getfield(name).getruntimevars())
+        if self not in memo:
+            memo[self] = True
+            if self.parent is not None:
+                result.extend(self.parent.getruntimevars(memo))
+            for name in self.T._names:
+                result.extend(self.getfield(name).getruntimevars(memo))
         return result

     def match(self, other):
@@ -244,8 +249,8 @@
             self.__dict__ = {'copy_v': v_result}
         return v_result

-    def getruntimevars(self):
-        return self.containerobj.getruntimevars()
+    def getruntimevars(self, memo):
+        return self.containerobj.getruntimevars(memo)

     def maybe_get_constant(self):
         return None
@@ -333,8 +338,9 @@
         # args_a: [the-a-corresponding-to-v for v in origblock.inputargs]
         state, args_a = self.schedule_getstate(args_a, origblock)
         args_v = []
+        memo = {}
         for a in args_a:
-            args_v.extend(a.getruntimevars())
+            args_v.extend(a.getruntimevars(memo))
         newlink = Link(args_v, None)
         self.pendingstates[newlink] = state
         return newlink
@@ -413,7 +419,11 @@
             else:
                 raise Exception("uh?")
         # the graph should be complete now; sanity-check
-        checkgraph(graph)
+        try:
+            checkgraph(graph)
+        except:
+            graph.show()
+            raise
         eliminate_empty_blocks(graph)
         join_blocks(graph)
         self.state = "after"
@@ -424,13 +434,14 @@
         builder = BlockBuilder(self.interp)
         newinputargs = []
         memo = {}
+        memo2 = {}
         for v, a in zip(origblock.inputargs, state.args_a):
             a = a.with_fresh_variables(memo)
             # try to preserve the name
             if isinstance(a, LLRuntimeValue) and isinstance(a.copy_v, Variable):
                 a.copy_v.rename(v)
             builder.bindings[v] = a
-            newinputargs.extend(a.getruntimevars())
+            newinputargs.extend(a.getruntimevars(memo2))
         print
         # flow the actual operations of the block
         for op in origblock.operations:
@@ -595,6 +606,11 @@
         if (hasattr(fnobj, 'graph') and
             not getattr(fnobj._callable, 'suggested_primitive', False)):
             origgraph = fnobj.graph
+
+            # for now, we need to force all arguments
+            for a in args_a:
+                a.forcevarorconst(self)
+
             graphstate, args_a = self.interp.schedule_graph(
                 args_a, origgraph)
             #print 'SCHEDULE_GRAPH', args_a, '==>', graphstate.copygraph.name
@@ -675,3 +691,13 @@
         def constant_op(ptr):
             return lltype.cast_pointer(op.result.concretetype, ptr)
         return self.residualize(op, [a_ptr], constant_op)
+
+    def op_keepalive(self, op, a_ptr):
+        if isinstance(a_ptr, LLVirtualPtr):
+            for v in a_ptr.getruntimevars({}):
+                if isinstance(v, Variable) and not v.concretetype._is_atomic():
+                    op = SpaceOperation('keepalive', [v], newvar(lltype.Void))
+                    print 'virtual:', op
+                    self.residual_operations.append(op)
+            return ll_no_return_value
+        return self.residualize(op, [a_ptr])

Modified: pypy/dist/pypy/jit/test/test_jit_tl.py
==============================================================================
--- pypy/dist/pypy/jit/test/test_jit_tl.py	(original)
+++ pypy/dist/pypy/jit/test/test_jit_tl.py	Sat Dec 10 17:38:20 2005
@@ -1,10 +1,14 @@
 #  "coughcoughcough" applies to most of this file
+import py
 from pypy.translator.translator import TranslationContext
 from pypy.jit import tl
 from pypy.jit.llabstractinterp import LLAbstractInterp
 from pypy.rpython.rstr import string_repr
 from pypy.rpython.llinterp import LLInterpreter
+from pypy.translator.backendopt import inline
+
+py.test.skip("in-progress")

 def entry_point(code, pc):
     # indirection needed, because the hints are not about *all* calls to
@@ -16,6 +20,7 @@
     t.buildannotator().build_types(entry_point, [str, int])
     rtyper = t.buildrtyper()
     rtyper.specialize()
+    inline.auto_inlining(t, 0.5)

     graph1 = t.graphs[0]
     interp = LLAbstractInterp()
@@ -28,7 +33,8 @@
     result2 = llinterp.eval_graph(graph2, [])
     assert result1 == result2
-    #graph2.show()
+
+    #interp.graphs[1].show()   # graphs[0] should be the entry_point

 def run_jit(code):

From ac at codespeak.net  Sat Dec 10 17:40:29 2005
From: ac at codespeak.net (ac at codespeak.net)
Date: Sat, 10 Dec 2005 17:40:29 +0100 (CET)
Subject: [pypy-svn] r21009 - in pypy/dist/pypy: interpreter objspace
Message-ID: <20051210164029.43EE027DD5@code1.codespeak.net>

Author: ac
Date: Sat Dec 10 17:40:28 2005
New Revision: 21009

Modified:
   pypy/dist/pypy/interpreter/baseobjspace.py
   pypy/dist/pypy/interpreter/eval.py
   pypy/dist/pypy/interpreter/gateway.py
   pypy/dist/pypy/interpreter/pycode.py
   pypy/dist/pypy/objspace/descroperation.py
Log:
(arre, eric)
Fully implement fastcall for applevel functions.


Modified: pypy/dist/pypy/interpreter/baseobjspace.py
==============================================================================
--- pypy/dist/pypy/interpreter/baseobjspace.py	(original)
+++ pypy/dist/pypy/interpreter/baseobjspace.py	Sat Dec 10 17:40:28 2005
@@ -464,19 +464,29 @@
         # XXX start of hack for performance
         from pypy.interpreter.function import Function
         if isinstance(w_func, Function):
-            if len(args_w) == 1:
+            if len(args_w) == 0:
+                w_res = w_func.code.fastcall_0(self, w_func)
+                if w_res is not None:
+                    return w_res
+            elif len(args_w) == 1:
                 w_res = w_func.code.fastcall_1(self, w_func, args_w[0])
                 if w_res is not None:
                     return w_res
             elif len(args_w) == 2:
-                w_res = w_func.code.fastcall_2(self, args_w[0], args_w[1])
+                w_res = w_func.code.fastcall_2(self, w_func, args_w[0],
+                                               args_w[1])
                 if w_res is not None:
                     return w_res
             elif len(args_w) == 3:
-                w_res = w_func.code.fastcall_3(self, args_w[0],
+                w_res = w_func.code.fastcall_3(self, w_func, args_w[0],
                                                args_w[1], args_w[2])
                 if w_res is not None:
                     return w_res
+            elif len(args_w) == 4:
+                w_res = w_func.code.fastcall_4(self, w_func, args_w[0],
+                                               args_w[1], args_w[2], args_w[3])
+                if w_res is not None:
+                    return w_res
         args = Arguments(self, list(args_w))
         return w_func.call_args(args)
         # XXX end of hack for performance

Modified: pypy/dist/pypy/interpreter/eval.py
==============================================================================
--- pypy/dist/pypy/interpreter/eval.py	(original)
+++ pypy/dist/pypy/interpreter/eval.py	Sat Dec 10 17:40:28 2005
@@ -50,12 +50,16 @@
     def getdocstring(self):
         return None

-    # a performance hack (see gateway.BuiltinCode1/2/3)
+    # a performance hack (see gateway.BuiltinCode1/2/3 and pycode.PyCode)
+    def fastcall_0(self, space, func):
+        return None
     def fastcall_1(self, space, func, w1):
         return None
-    def fastcall_2(self, space, w1, w2):
+    def fastcall_2(self, space, func, w1, w2):
+        return None
+    def fastcall_3(self, space, func, w1, w2, w3):
         return None
-    def fastcall_3(self, space, w1, w2, w3):
+    def fastcall_4(self, space, func, w1, w2, w3, w4):
         return None

 class Frame(Wrappable):

Modified: pypy/dist/pypy/interpreter/gateway.py
==============================================================================
--- pypy/dist/pypy/interpreter/gateway.py	(original)
+++ pypy/dist/pypy/interpreter/gateway.py	Sat Dec 10 17:40:28 2005
@@ -415,6 +415,9 @@
             self.framefactory = make_builtin_frame_factory(func, orig_sig,
                                                            unwrap_spec)
         # speed hack
+        if unwrap_spec == [ObjSpace]:
+            self.__class__ = BuiltinCode0
+            self.fastfunc_0 = func
         if unwrap_spec == [ObjSpace, W_Root]:
             self.__class__ = BuiltinCode1
             self.fastfunc_1 = func
@@ -424,6 +427,9 @@
         elif unwrap_spec == [ObjSpace, W_Root, W_Root, W_Root]:
             self.__class__ = BuiltinCode3
             self.fastfunc_3 = func
+        elif unwrap_spec == [ObjSpace, W_Root, W_Root, W_Root, W_Root]:
+            self.__class__ = BuiltinCode4
+            self.fastfunc_4 = func

     def create_frame(self, space, w_globals, closure=None):
         return self.framefactory.create(space, self, w_globals)
@@ -437,6 +443,21 @@

 # (verbose) performance hack below

+class BuiltinCode0(BuiltinCode):
+    def fastcall_0(self, space, w_func):
+        try:
+            w_result = self.fastfunc_0(space)
+        except KeyboardInterrupt:
+            raise OperationError(space.w_KeyboardInterrupt, space.w_None)
+        except MemoryError:
+            raise OperationError(space.w_MemoryError, space.w_None)
+        except RuntimeError, e:
+            raise OperationError(space.w_RuntimeError,
+                                 space.wrap("internal error: " + str(e)))
+        if w_result is None:
+            w_result = space.w_None
+        return w_result
+
 class BuiltinCode1(BuiltinCode):
     def fastcall_1(self, space, w_func, w1):
         try:
@@ -453,7 +474,7 @@
         return w_result

 class BuiltinCode2(BuiltinCode):
-    def fastcall_2(self, space, w1, w2):
+    def fastcall_2(self, space, w_func, w1, w2):
         try:
             w_result = self.fastfunc_2(space, w1, w2)
         except KeyboardInterrupt:
@@ -468,7 +489,7 @@
         return w_result

 class BuiltinCode3(BuiltinCode):
-    def fastcall_3(self, space, w1, w2, w3):
+    def fastcall_3(self, space, func, w1, w2, w3):
         try:
             w_result = self.fastfunc_3(space, w1, w2, w3)
         except KeyboardInterrupt:
@@ -482,6 +503,21 @@
             w_result = space.w_None
         return w_result

+class BuiltinCode4(BuiltinCode):
+    def fastcall_4(self, space, func, w1, w2, w3, w4):
+        try:
+            w_result = self.fastfunc_4(space, w1, w2, w3, w4)
+        except KeyboardInterrupt:
+            raise OperationError(space.w_KeyboardInterrupt, space.w_None)
+        except MemoryError:
+            raise OperationError(space.w_MemoryError, space.w_None)
+        except RuntimeError, e:
+            raise OperationError(space.w_RuntimeError,
+                                 space.wrap("internal error: " + str(e)))
+        if w_result is None:
+            w_result = space.w_None
+        return w_result
+
 class interp2app(Wrappable):
     """Build a gateway that calls 'f' at interp-level."""

Modified: pypy/dist/pypy/interpreter/pycode.py
==============================================================================
--- pypy/dist/pypy/interpreter/pycode.py	(original)
+++ pypy/dist/pypy/interpreter/pycode.py	Sat Dec 10 17:40:28 2005
@@ -158,12 +158,6 @@
         self._compute_fastcall()
         return self

-    def _compute_fastcall(self):
-        # Speed hack!
-        self.do_fastcall = -1
-        if self.co_flags & (CO_VARARGS | CO_VARKEYWORDS) == 0:
-            if self.co_argcount == 1:
-                self.do_fastcall = 1

     def _from_code(self, code, hidden_applevel=False, from_cpython=True):
         """ Initialize the code object from a real (CPython) one.
@@ -223,13 +217,65 @@
         self._compute_fastcall()
         return self

+    def _compute_fastcall(self):
+        # Speed hack!
+        self.do_fastcall = -1
+        if not (0 <= self.co_argcount <= 4):
+            return
+        if self.co_flags & (CO_VARARGS | CO_VARKEYWORDS):
+            return
+        if self.co_cellvars:
+            first_cellvar = self.co_cellvars[0]
+            for i in range(self.co_argcount):
+                if first_cellvar == self.co_varnames[i]:
+                    return
+
+        self.do_fastcall = self.co_argcount
+
+    def fastcall_0(self, space, w_func):
+        if self.do_fastcall == 0:
+            frame = self.create_frame(space, w_func.w_func_globals,
+                                      w_func.closure)
+            return frame.run()
+        return None
+
     def fastcall_1(self, space, w_func, w_arg):
-        if self.do_fastcall != 1:
-            return None
-        frame = self.create_frame(space, w_func.w_func_globals,
-                                  w_func.closure)
-        frame.setfastscope([w_arg])
-        return frame.run()
+        if self.do_fastcall == 1:
+            frame = self.create_frame(space, w_func.w_func_globals,
+                                      w_func.closure)
+            frame.fastlocals_w[0] = w_arg # frame.setfastscope([w_arg])
+            return frame.run()
+        return None
+
+    def fastcall_2(self, space, w_func, w_arg1, w_arg2):
+        if self.do_fastcall == 2:
+            frame = self.create_frame(space, w_func.w_func_globals,
+                                      w_func.closure)
+            frame.fastlocals_w[0] = w_arg1 # frame.setfastscope([w_arg])
+            frame.fastlocals_w[1] = w_arg2
+            return frame.run()
+        return None
+
+    def fastcall_3(self, space, w_func, w_arg1, w_arg2, w_arg3):
+        if self.do_fastcall == 3:
+            frame = self.create_frame(space, w_func.w_func_globals,
+                                      w_func.closure)
+            frame.fastlocals_w[0] = w_arg1 # frame.setfastscope([w_arg])
+            frame.fastlocals_w[1] = w_arg2
+            frame.fastlocals_w[2] = w_arg3
+            return frame.run()
+        return None
+
+    def fastcall_4(self, space, w_func, w_arg1, w_arg2, w_arg3, w_arg4):
+        if self.do_fastcall == 4:
+            frame = self.create_frame(space, w_func.w_func_globals,
+                                      w_func.closure)
+            frame.fastlocals_w[0] = w_arg1 # frame.setfastscope([w_arg])
+            frame.fastlocals_w[1] = w_arg2
+            frame.fastlocals_w[2] = w_arg3
+            frame.fastlocals_w[3] = w_arg4
+            return frame.run()
+        return None

     def create_frame(self, space, w_globals, closure=None):
         "Create an empty PyFrame suitable for this code object."

Modified: pypy/dist/pypy/objspace/descroperation.py
==============================================================================
--- pypy/dist/pypy/objspace/descroperation.py	(original)
+++ pypy/dist/pypy/objspace/descroperation.py	Sat Dec 10 17:40:28 2005
@@ -89,14 +89,19 @@
                 if w_res is not None:
                     return w_res
             elif len(args_w) == 1:
-                w_res = descr.code.fastcall_2(space, w_obj, args_w[0])
+                w_res = descr.code.fastcall_2(space, descr, w_obj, args_w[0])
                 if w_res is not None:
                     return w_res
             elif len(args_w) == 2:
-                w_res = descr.code.fastcall_3(space, w_obj, args_w[0],
+                w_res = descr.code.fastcall_3(space, descr, w_obj, args_w[0],
                                               args_w[1])
                 if w_res is not None:
                     return w_res
+            elif len(args_w) == 3:
+                w_res = descr.code.fastcall_4(space, descr, w_obj, args_w[0],
+                                              args_w[1], args_w[2])
+                if w_res is not None:
+                    return w_res
             args = Arguments(space, list(args_w))
             return descr.call_args(args.prepend(w_obj))
         else:

From nik at codespeak.net  Sat Dec 10 22:30:14 2005
From: nik at codespeak.net (nik at codespeak.net)
Date: Sat, 10 Dec 2005 22:30:14 +0100 (CET)
Subject: [pypy-svn] r21010 - pypy/dist/pypy/translator/c/test
Message-ID: <20051210213014.0225B27DC2@code1.codespeak.net>

Author: nik
Date: Sat Dec 10 22:30:13 2005
New Revision: 21010

Modified:
   pypy/dist/pypy/translator/c/test/test_ext__socket_conn.py
Log:
(ale, nik)
disable test until our test server works correctly.


Modified: pypy/dist/pypy/translator/c/test/test_ext__socket_conn.py
==============================================================================
--- pypy/dist/pypy/translator/c/test/test_ext__socket_conn.py	(original)
+++ pypy/dist/pypy/translator/c/test/test_ext__socket_conn.py	Sat Dec 10 22:30:13 2005
@@ -21,7 +21,7 @@
     del tn
     del mod.process

-def test_connect():
+def DONOT_test_connect():
     import os
     from pypy.module._socket.rpython import rsocket
     def does_stuff():

From cfbolz at codespeak.net  Sat Dec 10 23:45:01 2005
From: cfbolz at codespeak.net (cfbolz at codespeak.net)
Date: Sat, 10 Dec 2005 23:45:01 +0100 (CET)
Subject: [pypy-svn] r21011 - in pypy/dist/pypy/rpython: . test
Message-ID: <20051210224501.EFB6C27DD2@code1.codespeak.net>

Author: cfbolz
Date: Sat Dec 10 23:45:00 2005
New Revision: 21011

Modified:
   pypy/dist/pypy/rpython/objectmodel.py
   pypy/dist/pypy/rpython/test/test_objectmodel.py
Log:
add some hackish functions that allow one to "cast" python objects to
integers and back. those are meant to be translated to real casts.


Modified: pypy/dist/pypy/rpython/objectmodel.py
==============================================================================
--- pypy/dist/pypy/rpython/objectmodel.py	(original)
+++ pypy/dist/pypy/rpython/objectmodel.py	Sat Dec 10 23:45:00 2005
@@ -4,7 +4,7 @@
 """

 import new
-
+import weakref

 def instantiate(cls):
     "Create an empty instance of 'cls'."
@@ -33,6 +33,29 @@
     obj.__dict__ = {}
     obj.__class__ = FREED_OBJECT

+
+# support for cast from object to int and back
+
+__int_to_weakref = {}
+
+def cast_object_to_int(obj):
+    i = id(obj)
+    if i not in __int_to_weakref:
+        __int_to_weakref[i] = weakref.ref(obj)
+    return i
+
+def cast_int_to_object(i, expected_class):
+    # only ints are valid that are the result of cast_object_to_int
+    if i not in __int_to_weakref:
+        raise ValueError("not a valid object")
+    obj = __int_to_weakref[i]()
+    if obj is not None:
+        if type(obj) != expected_class:
+            raise ValueError("class of obj != expected_class")
+        return obj
+    else:
+        return FREED_OBJECT()
+
 # __ invoke XXX this doesn't seem completely the right place for this
 def hlinvoke(repr, llcallable, *args):

Modified: pypy/dist/pypy/rpython/test/test_objectmodel.py
==============================================================================
--- pypy/dist/pypy/rpython/test/test_objectmodel.py	(original)
+++ pypy/dist/pypy/rpython/test/test_objectmodel.py	Sat Dec 10 23:45:00 2005
@@ -10,6 +10,17 @@
     res = interpret(fn, [])
     assert res is True

+def test_casttoandfromint():
+    class A(object):
+        pass
+    class B(object):
+        pass
+    a = A()
+    b = B()
+    i1 = cast_object_to_int(a)
+    i2 = cast_object_to_int(b)
+    assert cast_int_to_object(i1, A) is a
+    assert cast_int_to_object(i2, B) is b

 def strange_key_eq(key1, key2):
     return key1[0] == key2[0]   # only the 1st character is relevant

From cfbolz at codespeak.net  Sat Dec 10 23:47:15 2005
From: cfbolz at codespeak.net (cfbolz at codespeak.net)
Date: Sat, 10 Dec 2005 23:47:15 +0100 (CET)
Subject: [pypy-svn] r21012 - pypy/dist/pypy/rpython/test
Message-ID: <20051210224715.E376C27DD2@code1.codespeak.net>

Author: cfbolz
Date: Sat Dec 10 23:47:15 2005
New Revision: 21012

Modified:
   pypy/dist/pypy/rpython/test/test_objectmodel.py
Log:
extend the test a bit


Modified: pypy/dist/pypy/rpython/test/test_objectmodel.py
==============================================================================
--- pypy/dist/pypy/rpython/test/test_objectmodel.py	(original)
+++ pypy/dist/pypy/rpython/test/test_objectmodel.py	Sat Dec 10 23:47:15 2005
@@ -1,3 +1,4 @@
+import py
 from pypy.rpython.objectmodel import *
 from pypy.translator.translator import TranslationContext, graphof
 from pypy.rpython.test.test_llinterp import interpret
@@ -21,6 +22,11 @@
     i2 = cast_object_to_int(b)
     assert cast_int_to_object(i1, A) is a
     assert cast_int_to_object(i2, B) is b
+    a = None
+    b = None
+    a = cast_int_to_object(i1, A)
+    py.test.raises(RuntimeError, "a.b")
+

 def strange_key_eq(key1, key2):
     return key1[0] == key2[0]   # only the 1st character is relevant

From cfbolz at codespeak.net  Sun Dec 11 00:03:56 2005
From: cfbolz at codespeak.net (cfbolz at codespeak.net)
Date: Sun, 11 Dec 2005 00:03:56 +0100 (CET)
Subject: [pypy-svn] r21013 - in pypy/dist/pypy: annotation translator/test
Message-ID: <20051210230356.2A23627DD3@code1.codespeak.net>

Author: cfbolz
Date: Sun Dec 11 00:03:55 2005
New Revision: 21013

Modified:
   pypy/dist/pypy/annotation/builtin.py
   pypy/dist/pypy/translator/test/test_annrpython.py
Log:
proper annotations for these functions


Modified: pypy/dist/pypy/annotation/builtin.py
==============================================================================
--- pypy/dist/pypy/annotation/builtin.py	(original)
+++ pypy/dist/pypy/annotation/builtin.py	Sun Dec 11 00:03:55 2005
@@ -247,6 +247,15 @@
 def rarith_intmask(s_obj):
     return SomeInteger()

+def robjmodel_cast_obj_to_int(s_instance):
+    return SomeInteger()
+
+def robjmodel_cast_int_to_obj(s_int, s_clspbc):
+    assert len(s_clspbc.descriptions) == 1
+    desc = s_clspbc.descriptions.keys()[0]
+    cdef = desc.getuniqueclassdef()
+    return SomeInstance(cdef)
+
 def robjmodel_instantiate(s_clspbc):
     assert isinstance(s_clspbc, SomePBC)
     clsdef = None
@@ -326,6 +335,8 @@
 ##BUILTIN_ANALYZERS[pypy.rpython.rarithmetic.ovfcheck] = rarith_ovfcheck
 ##BUILTIN_ANALYZERS[pypy.rpython.rarithmetic.ovfcheck_lshift] = rarith_ovfcheck_lshift
 BUILTIN_ANALYZERS[pypy.rpython.rarithmetic.intmask] = rarith_intmask
+BUILTIN_ANALYZERS[pypy.rpython.objectmodel.cast_object_to_int] = robjmodel_cast_obj_to_int
+BUILTIN_ANALYZERS[pypy.rpython.objectmodel.cast_int_to_object] = robjmodel_cast_int_to_obj
 BUILTIN_ANALYZERS[pypy.rpython.objectmodel.instantiate] = robjmodel_instantiate
 BUILTIN_ANALYZERS[pypy.rpython.objectmodel.we_are_translated] = (
     robjmodel_we_are_translated)

Modified: pypy/dist/pypy/translator/test/test_annrpython.py
==============================================================================
--- pypy/dist/pypy/translator/test/test_annrpython.py	(original)
+++ pypy/dist/pypy/translator/test/test_annrpython.py	Sun Dec 11 00:03:55 2005
@@ -11,6 +11,7 @@
 from pypy.annotation.dictdef import DictDef
 from pypy.objspace.flow.model import *
 from pypy.rpython.rarithmetic import r_uint
+from pypy.rpython import objectmodel
 from pypy.objspace.flow import FlowObjSpace
 from pypy.translator.test import snippet
@@ -1916,6 +1917,22 @@
         assert s.knowntype == int
         graph = tgraphof(t, A.__del__.im_func)
         assert graph.startblock in a.annotated
+
+    def test_casttoandfromint(self):
+        class A(object):
+            pass
+        def f():
+            a = A()
+            return objectmodel.cast_object_to_int(a)
+        def g(i):
+            return objectmodel.cast_int_to_object(i, A)
+        a = self.RPythonAnnotator()
+        s = a.build_types(f, [])
+        assert isinstance(s, annmodel.SomeInteger)
+        s = a.build_types(g, [annmodel.SomeInteger()])
+        assert isinstance(s, annmodel.SomeInstance)
+        assert s.classdef.classdesc.pyobj is A
+

 def g(n):
     return [0,1,2,n]

From cfbolz at codespeak.net  Sun Dec 11 00:53:36 2005
From: cfbolz at codespeak.net (cfbolz at codespeak.net)
Date: Sun, 11 Dec 2005 00:53:36 +0100 (CET)
Subject: [pypy-svn] r21014 - in pypy/dist/pypy/rpython: . test
Message-ID: <20051210235336.89B5D27DD3@code1.codespeak.net>

Author: cfbolz
Date: Sun Dec 11 00:53:35 2005
New Revision: 21014

Modified:
   pypy/dist/pypy/rpython/rbuiltin.py
   pypy/dist/pypy/rpython/test/test_objectmodel.py
Log:
rtyping support for these strange casts -- not clear to me yet how to
support them in the llinterpreter


Modified: pypy/dist/pypy/rpython/rbuiltin.py
==============================================================================
--- pypy/dist/pypy/rpython/rbuiltin.py	(original)
+++ pypy/dist/pypy/rpython/rbuiltin.py	Sun Dec 11 00:53:35 2005
@@ -1,8 +1,8 @@
 from pypy.annotation.pairtype import pairtype
 from pypy.annotation import model as annmodel
 from pypy.objspace.flow.model import Constant
-from pypy.rpython.lltypesystem import lltype
-from pypy.rpython import rarithmetic, objectmodel, rstack
+from pypy.rpython.lltypesystem import lltype, rclass
+from pypy.rpython import rarithmetic, objectmodel, rstack, rint
 from pypy.rpython.error import TyperError
 from pypy.rpython.rmodel import Repr, IntegerRepr
 from pypy.rpython.rrange import rtype_builtin_range, rtype_builtin_xrange
@@ -280,6 +280,18 @@
     return hop.genop('runtime_type_info', vlist,
                      resulttype = rptr.PtrRepr(lltype.Ptr(lltype.RuntimeTypeInfo)))

+def rtype_cast_object_to_int(hop):
+    assert isinstance(hop.args_r[0], rclass.InstanceRepr)
+    vlist = hop.inputargs(hop.args_r[0])
+    return hop.genop('cast_ptr_to_int', vlist,
+                     resulttype = rint.signed_repr)
+
+def rtype_cast_int_to_object(hop):
+    assert isinstance(hop.args_r[0], rint.IntegerRepr)
+    vlist = [hop.inputarg(rint.signed_repr, arg=0)]
+    return hop.genop('cast_int_to_ptr', vlist,
+                     resulttype = hop.r_result.lowleveltype)
+
 BUILTIN_TYPER[lltype.malloc] = rtype_malloc
 BUILTIN_TYPER[lltype.cast_pointer] = rtype_cast_pointer
 BUILTIN_TYPER[lltype.cast_ptr_to_int] = rtype_cast_ptr_to_int
@@ -292,6 +304,8 @@
 BUILTIN_TYPER[rarithmetic.r_longlong] = rtype_r_longlong
 BUILTIN_TYPER[objectmodel.r_dict] = rtype_r_dict
BUILTIN_TYPER[objectmodel.we_are_translated] = rtype_we_are_translated +BUILTIN_TYPER[objectmodel.cast_object_to_int] = rtype_cast_object_to_int +BUILTIN_TYPER[objectmodel.cast_int_to_object] = rtype_cast_int_to_object BUILTIN_TYPER[rstack.yield_current_frame_to_caller] = ( rtype_yield_current_frame_to_caller) Modified: pypy/dist/pypy/rpython/test/test_objectmodel.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_objectmodel.py (original) +++ pypy/dist/pypy/rpython/test/test_objectmodel.py Sun Dec 11 00:53:35 2005 @@ -11,7 +11,7 @@ res = interpret(fn, []) assert res is True -def test_casttoandfromint(): +def test_casttoandfromint_on_cpython(): class A(object): pass class B(object): @@ -27,6 +27,23 @@ a = cast_int_to_object(i1, A) py.test.raises(RuntimeError, "a.b") +def test_casttoandfromint(): + class A(object): + pass + def f(): + a = A() + return cast_object_to_int(a) + def g(): + a = A() + i = cast_object_to_int(a) + return cast_object_to_int(cast_int_to_object(i, A)) == i + res = interpret(f, []) + # XXX humpf: there is no sane way to implement cast_ptr_to_int + # without going for the same hacks as in robjectmodel.cast_XXX_to_XXX + py.test.raises(AssertionError, interpret, g, []) + assert res > 0 + + def strange_key_eq(key1, key2): return key1[0] == key2[0] # only the 1st character is relevant From cfbolz at codespeak.net Sun Dec 11 01:08:26 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Sun, 11 Dec 2005 01:08:26 +0100 (CET) Subject: [pypy-svn] r21015 - pypy/dist/pypy/rpython/test Message-ID: <20051211000826.87A8727DD3@code1.codespeak.net> Author: cfbolz Date: Sun Dec 11 01:08:25 2005 New Revision: 21015 Modified: pypy/dist/pypy/rpython/test/test_objectmodel.py Log: make order of tests a bit clearer Modified: pypy/dist/pypy/rpython/test/test_objectmodel.py ============================================================================== --- 
pypy/dist/pypy/rpython/test/test_objectmodel.py (original) +++ pypy/dist/pypy/rpython/test/test_objectmodel.py Sun Dec 11 01:08:25 2005 @@ -38,12 +38,10 @@ i = cast_object_to_int(a) return cast_object_to_int(cast_int_to_object(i, A)) == i res = interpret(f, []) + assert res > 0 # XXX humpf: there is no sane way to implement cast_ptr_to_int # without going for the same hacks as in robjectmodel.cast_XXX_to_XXX py.test.raises(AssertionError, interpret, g, []) - assert res > 0 - - def strange_key_eq(key1, key2): return key1[0] == key2[0] # only the 1st character is relevant From cfbolz at codespeak.net Sun Dec 11 01:20:23 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Sun, 11 Dec 2005 01:20:23 +0100 (CET) Subject: [pypy-svn] r21016 - in pypy/dist/pypy/translator/c: . test Message-ID: <20051211002023.838FC27DD3@code1.codespeak.net> Author: cfbolz Date: Sun Dec 11 01:20:22 2005 New Revision: 21016 Modified: pypy/dist/pypy/translator/c/funcgen.py pypy/dist/pypy/translator/c/test/test_backendoptimized.py Log: final step: support the new operation in genc Modified: pypy/dist/pypy/translator/c/funcgen.py ============================================================================== --- pypy/dist/pypy/translator/c/funcgen.py (original) +++ pypy/dist/pypy/translator/c/funcgen.py Sun Dec 11 01:20:22 2005 @@ -545,6 +545,12 @@ result.append(self.pyobj_incref(op.result)) return '\t'.join(result) + def OP_CAST_INT_TO_PTR(self, op, err): + TYPE = self.lltypemap(op.result) + typename = self.db.gettype(TYPE) + return "%s = (%s)%s;" % (self.expr(op.result), cdecl(typename, ""), + self.expr(op.args[0])) + def OP_SAME_AS(self, op, err): result = [] TYPE = self.lltypemap(op.result) Modified: pypy/dist/pypy/translator/c/test/test_backendoptimized.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_backendoptimized.py (original) +++ pypy/dist/pypy/translator/c/test/test_backendoptimized.py Sun Dec 11 
01:20:22 2005 @@ -1,6 +1,7 @@ import autopath from pypy.translator.c.test.test_typed import TestTypedTestCase as _TestTypedTestCase from pypy.translator.backendopt.all import backend_optimizations +from pypy.rpython import objectmodel class TestTypedOptimizedTestCase(_TestTypedTestCase): @@ -75,4 +76,22 @@ fn = self.getcompiled(f) res = fn() assert res == 42 - + + def test_casttoandfromint(self): + class A(object): + pass + def f(): + a = A() + return objectmodel.cast_object_to_int(a) + def g(): + a = A() + i = objectmodel.cast_object_to_int(a) + return objectmodel.cast_object_to_int( + objectmodel.cast_int_to_object(i, A)) == i + fn = self.getcompiled(f) + res = fn() + assert res > 0 + gn = self.getcompiled(g) + res = gn() + assert res + From sanxiyn at codespeak.net Sun Dec 11 05:42:19 2005 From: sanxiyn at codespeak.net (sanxiyn at codespeak.net) Date: Sun, 11 Dec 2005 05:42:19 +0100 (CET) Subject: [pypy-svn] r21017 - pypy/dist/pypy/translator/tool Message-ID: <20051211044219.C474127DD8@code1.codespeak.net> Author: sanxiyn Date: Sun Dec 11 05:42:16 2005 New Revision: 21017 Modified: pypy/dist/pypy/translator/tool/graphpage.py Log: Import reordering. Because of a circular import, importing classdef before model fails. 
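The circular-import failure this log describes can be reproduced in a few lines. Below is a minimal, self-contained sketch — the module names `mod_a`/`mod_b` are invented for illustration (not PyPy's real `model`/`classdef`), and it is written for a current Python rather than the 2.x of the repository. Importing the module that reads an attribute of its partially initialized partner fails, while importing in the other order succeeds, which is exactly why reordering the import lines helps:

```python
# Minimal reproduction of a circular import that only works in one order.
import os, sys, tempfile, textwrap

tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "mod_a.py"), "w") as f:
    f.write(textwrap.dedent("""
        import mod_b        # mod_a needs mod_b...
        VALUE_A = 1
    """))
with open(os.path.join(tmp, "mod_b.py"), "w") as f:
    f.write(textwrap.dedent("""
        import mod_a        # ...and mod_b needs mod_a,
        VALUE_B = mod_a.VALUE_A   # and reads an attribute at import time
    """))
sys.path.insert(0, tmp)

failed = False
try:
    # Wrong order: mod_b's body runs while mod_a is only partially
    # initialized, so mod_a.VALUE_A does not exist yet.
    import mod_a
except AttributeError as exc:
    failed = True
    print("importing mod_a first fails:", exc)

for name in ("mod_a", "mod_b"):
    sys.modules.pop(name, None)   # reset import state

# Right order: mod_a finishes executing before mod_b touches it.
import mod_b
import mod_a
print(mod_b.VALUE_B)  # -> 1
```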
Modified: pypy/dist/pypy/translator/tool/graphpage.py ============================================================================== --- pypy/dist/pypy/translator/tool/graphpage.py (original) +++ pypy/dist/pypy/translator/tool/graphpage.py Sun Dec 11 05:42:16 2005 @@ -1,8 +1,9 @@ import inspect, types from pypy.objspace.flow.model import traverse, Block, Link, FunctionGraph from pypy.translator.tool.make_dot import DotGen, make_dot, make_dot_graphs +from pypy.annotation.model import SomePBC +from pypy.annotation.description import MethodDesc from pypy.annotation.classdef import ClassDef -from pypy.annotation import model as annmodel, description from pypy.tool.uid import uid @@ -206,8 +207,8 @@ s_value = attrdef.s_value linkname = name info = s_value - if (classattrs and isinstance(s_value, annmodel.SomePBC) - and s_value.getKind() == description.MethodDesc): + if (classattrs and isinstance(s_value, SomePBC) + and s_value.getKind() == MethodDesc): name += '()' info = 'SomePBC(%s)' % ', '.join( ['method %s.%s' % ( From nik at codespeak.net Sun Dec 11 11:13:27 2005 From: nik at codespeak.net (nik at codespeak.net) Date: Sun, 11 Dec 2005 11:13:27 +0100 (CET) Subject: [pypy-svn] r21018 - in pypy/dist/pypy: module/_socket/test translator/c/test Message-ID: <20051211101327.5E82427DD2@code1.codespeak.net> Author: nik Date: Sun Dec 11 11:13:25 2005 New Revision: 21018 Added: pypy/dist/pypy/module/_socket/test/__init__.py (contents, props changed) Modified: pypy/dist/pypy/module/_socket/test/echoserver.py pypy/dist/pypy/translator/c/test/test_ext__socket_conn.py Log: (ale, nik) starting our test server in a thread actually just works, no idea why it didn't work with processes. 
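The pattern this log describes — a blocking test server moved into a background thread during test setup — looks roughly like the sketch below in a current Python (`socketserver`/`threading` instead of the era's `SocketServer`/`thread`; the handler and the ephemeral-port choice are illustrative, not PyPy's actual echoserver):

```python
import socket, socketserver, threading

class EchoHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # echo one line back to the client
        self.wfile.write(self.rfile.readline())

# Port 0 lets the OS pick a free port, avoiding clashes between test runs.
server = socketserver.TCPServer(("127.0.0.1", 0), EchoHandler)
port = server.server_address[1]

# serve_forever() blocks, so run it in a daemon thread -- the same trick
# the commit applies with thread.start_new_thread() in setup_module().
threading.Thread(target=server.serve_forever, daemon=True).start()

with socket.create_connection(("127.0.0.1", port)) as s:
    s.sendall(b"hello\n")
    reply = s.makefile("rb").readline()

server.shutdown()       # unblocks serve_forever() cleanly
server.server_close()
print(reply)            # -> b'hello\n'
```

`server.shutdown()` can be called directly from a test teardown, which plays the role of the telnet-based "shutdown" message used in the commit's `teardown_module`.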
Added: pypy/dist/pypy/module/_socket/test/__init__.py ============================================================================== Modified: pypy/dist/pypy/module/_socket/test/echoserver.py ============================================================================== --- pypy/dist/pypy/module/_socket/test/echoserver.py (original) +++ pypy/dist/pypy/module/_socket/test/echoserver.py Sun Dec 11 11:13:25 2005 @@ -30,6 +30,9 @@ raise RuntimeError() self.wfile.write(client_string) -if __name__ == "__main__": +def start_server(): server = EchoServer(("", PORT), EchoRequestHandler) - server.serve() + server.serve() + +if __name__ == "__main__": + start_server() Modified: pypy/dist/pypy/translator/c/test/test_ext__socket_conn.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_ext__socket_conn.py (original) +++ pypy/dist/pypy/translator/c/test/test_ext__socket_conn.py Sun Dec 11 11:13:25 2005 @@ -1,6 +1,6 @@ import autopath import py -import os.path, subprocess, sys +import os.path, subprocess, sys, thread import _socket from pypy.translator.c.test.test_genc import compile from pypy.translator.translator import Translator @@ -10,8 +10,8 @@ def setup_module(mod): import pypy.module._socket.rpython.exttable # for declare()/declaretype() - serverpath = os.path.join(autopath.pypydir, "module/_socket/test/echoserver.py") - mod.process = subprocess.Popen([sys.executable, serverpath]) + from pypy.module._socket.test import echoserver + thread.start_new_thread(echoserver.start_server, ()) def teardown_module(mod): import telnetlib @@ -19,9 +19,8 @@ tn.write("shutdown\n") tn.close() del tn - del mod.process -def DONOT_test_connect(): +def test_connect(): import os from pypy.module._socket.rpython import rsocket def does_stuff(): From nik at codespeak.net Sun Dec 11 11:16:39 2005 From: nik at codespeak.net (nik at codespeak.net) Date: Sun, 11 Dec 2005 11:16:39 +0100 (CET) Subject: [pypy-svn] r21019 - 
pypy/dist/pypy/translator/c/test Message-ID: <20051211101639.DDFF627DC1@code1.codespeak.net> Author: nik Date: Sun Dec 11 11:16:38 2005 New Revision: 21019 Modified: pypy/dist/pypy/translator/c/test/test_ext__socket.py Log: (ale, nik) remove this test which is now in the other test file, using the local test server. Modified: pypy/dist/pypy/translator/c/test/test_ext__socket.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_ext__socket.py (original) +++ pypy/dist/pypy/translator/c/test/test_ext__socket.py Sun Dec 11 11:16:38 2005 @@ -90,20 +90,6 @@ for args in tests: py.test.raises(OSError, f1, *args) -def test_connect(): - import os - from pypy.module._socket.rpython import rsocket - def does_stuff(): - fd = rsocket.newsocket(_socket.AF_INET, _socket.SOCK_STREAM, 0) - # XXX need to think of a test without connecting to outside servers - rsocket.connect(fd, ("codespeak.net", 80, 0, 0)) - sockname = rsocket.getpeername(fd) - os.close(fd) - return sockname[1] - f1 = compile(does_stuff, []) - res = f1() - assert res == 80 - def test_connect_error(): from pypy.module._socket.rpython import rsocket import os From rxe at codespeak.net Sun Dec 11 11:27:56 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Sun, 11 Dec 2005 11:27:56 +0100 (CET) Subject: [pypy-svn] r21020 - pypy/dist/pypy/translator/llvm Message-ID: <20051211102756.270FB27DD2@code1.codespeak.net> Author: rxe Date: Sun Dec 11 11:27:54 2005 New Revision: 21020 Modified: pypy/dist/pypy/translator/llvm/arraynode.py pypy/dist/pypy/translator/llvm/database.py pypy/dist/pypy/translator/llvm/gc.py pypy/dist/pypy/translator/llvm/structnode.py pypy/dist/pypy/translator/llvm/varsize.py Log: Keep memset() ing of memory for now - lets llvm compile again. 
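The hunk this commit restores toggles whether a `call ccc void %llvm.memset(...)` line is appended to the textual malloc template, so freshly allocated blocks are zeroed before use. A toy sketch of that style of string-based IR emission — `emit_malloc` and the `%GC_malloc`/`%malloc_Size` names are invented for illustration, not PyPy's actual backend API:

```python
def emit_malloc(targetvar, type_, cnt, zero=True):
    """Build the LLVM IR text for one allocation (toy version)."""
    ir = (
        f"%malloc_Ptr{cnt} = call ccc sbyte* %GC_malloc(uint %malloc_Size{cnt})\n"
        f"{targetvar} = cast sbyte* %malloc_Ptr{cnt} to {type_}*\n"
    )
    if zero:
        # zero the fresh block so no field is ever read uninitialized
        ir += (f"call ccc void %llvm.memset(sbyte* %malloc_Ptr{cnt}, "
               f"ubyte 0, uint %malloc_Size{cnt}, uint 0)\n")
    return ir

print(emit_malloc("%v1", "%structtype.object", 3))
```

Keeping the memset unconditional (rather than only for atomic allocations, as the commented-out version did) trades a little speed for never handing out garbage memory.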
Modified: pypy/dist/pypy/translator/llvm/arraynode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/arraynode.py (original) +++ pypy/dist/pypy/translator/llvm/arraynode.py Sun Dec 11 11:27:54 2005 @@ -70,17 +70,6 @@ def writedatatypedecl(self, codewriter): codewriter.typedef(self.ref, self.db.get_machine_word()) - - -class StrArrayTypeNode(ArrayTypeNode): - def writeimpl(self, codewriter): - log.writeimpl(self.ref) - varsize.write_constructor(self.db, codewriter, self.ref, - self.constructor_decl, - self.array, - atomic=self.array._is_atomic(), - is_str=True) - class ArrayNode(ConstantLLVMNode): """ An arraynode. Elements can be Modified: pypy/dist/pypy/translator/llvm/database.py ============================================================================== --- pypy/dist/pypy/translator/llvm/database.py (original) +++ pypy/dist/pypy/translator/llvm/database.py Sun Dec 11 11:27:54 2005 @@ -7,7 +7,7 @@ from pypy.translator.llvm.structnode import StructNode, StructVarsizeNode, \ StructTypeNode, StructVarsizeTypeNode from pypy.translator.llvm.arraynode import ArrayNode, StrArrayNode, \ - VoidArrayNode, ArrayTypeNode, StrArrayTypeNode, VoidArrayTypeNode + VoidArrayNode, ArrayTypeNode, VoidArrayTypeNode from pypy.translator.llvm.opaquenode import OpaqueNode, ExtOpaqueNode, \ OpaqueTypeNode, ExtOpaqueTypeNode from pypy.rpython.lltypesystem import lltype @@ -155,8 +155,6 @@ elif isinstance(type_, lltype.Array): if type_.OF is lltype.Void: self.addpending(type_, VoidArrayTypeNode(self, type_)) - elif type_.OF is lltype.Char: - self.addpending(type_, StrArrayTypeNode(self, type_)) else: self.addpending(type_, ArrayTypeNode(self, type_)) Modified: pypy/dist/pypy/translator/llvm/gc.py ============================================================================== --- pypy/dist/pypy/translator/llvm/gc.py (original) +++ pypy/dist/pypy/translator/llvm/gc.py Sun Dec 11 11:27:54 2005 @@ -68,10 +68,10 @@ 
%(targetvar)s = cast sbyte* %%malloc_Ptr%(cnt)s to %(type_)s* ''' % locals() - #if is_atomic: - # t += ''' - #call ccc void %%llvm.memset(sbyte* %%malloc_Ptr%(cnt)s, ubyte 0, uint %%malloc_SizeU%(cnt)s, uint 0) - #''' % locals() + if is_atomic: + t += ''' + call ccc void %%llvm.memset(sbyte* %%malloc_Ptr%(cnt)s, ubyte 0, uint %%malloc_SizeU%(cnt)s, uint 0) + ''' % locals() return t def pyrex_code(self): Modified: pypy/dist/pypy/translator/llvm/structnode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/structnode.py (original) +++ pypy/dist/pypy/translator/llvm/structnode.py Sun Dec 11 11:27:54 2005 @@ -77,7 +77,8 @@ self.constructor_decl, current, indices_to_array, - self.struct._is_atomic()) + self.struct._is_atomic(), + is_str=self.struct._name == "rpy_string") class StructNode(ConstantLLVMNode): """ A struct constant. Can simply contain Modified: pypy/dist/pypy/translator/llvm/varsize.py ============================================================================== --- pypy/dist/pypy/translator/llvm/varsize.py (original) +++ pypy/dist/pypy/translator/llvm/varsize.py Sun Dec 11 11:27:54 2005 @@ -34,17 +34,19 @@ *indices_to_arraylength) codewriter.store(lentype, "%len", "%arraylength") - if is_str: - indices_to_hash = (("uint", 0),) - codewriter.getelementptr("%xxx1", ref + "*", - "%result", - *indices_to_hash) - codewriter.store("int", "0", "%arraylength") - - codewriter.getelementptr("%xxx2", ref + "*", - "%result", - *elemindices) - codewriter.store(elemtype, "0", "%xxx2") + #if is_str: + # indices_to_hash = (("uint", 0),) + # codewriter.getelementptr("%ptrhash", ref + "*", + # "%result", + # *indices_to_hash) + # codewriter.store("int", "0", "%ptrhash") + + + #if ARRAY is STR.chars: + # codewriter.getelementptr("%ptrendofchar", ref + "*", + # "%result", + # *elemindices) + # codewriter.store(elemtype, "0", "%ptrendofchar") From nik at codespeak.net Sun Dec 11 11:36:07 2005 From: nik at 
codespeak.net (nik at codespeak.net) Date: Sun, 11 Dec 2005 11:36:07 +0100 (CET) Subject: [pypy-svn] r21021 - in pypy/dist/pypy: module/_socket/test translator/c/test Message-ID: <20051211103607.0D11F27DD2@code1.codespeak.net> Author: nik Date: Sun Dec 11 11:36:05 2005 New Revision: 21021 Removed: pypy/dist/pypy/translator/c/test/test_ext__socket_conn.py Modified: pypy/dist/pypy/module/_socket/test/echoserver.py pypy/dist/pypy/translator/c/test/test_ext__socket.py Log: (ale, nik) test reorganisation. merged the two test files, we can now reuse the ipv4 tests for ipv6 and unix sockets. Modified: pypy/dist/pypy/module/_socket/test/echoserver.py ============================================================================== --- pypy/dist/pypy/module/_socket/test/echoserver.py (original) +++ pypy/dist/pypy/module/_socket/test/echoserver.py Sun Dec 11 11:36:05 2005 @@ -1,4 +1,4 @@ -import SocketServer +import socket, SocketServer import sys, time # user-accessible port @@ -7,6 +7,8 @@ class EchoServer(SocketServer.TCPServer): def __init__(self, *args, **kwargs): + self.address_family = kwargs["address_family"] + del kwargs["address_family"] SocketServer.TCPServer.__init__(self, *args, **kwargs) self.stop = False @@ -30,8 +32,8 @@ raise RuntimeError() self.wfile.write(client_string) -def start_server(): - server = EchoServer(("", PORT), EchoRequestHandler) +def start_server(address_family=socket.AF_INET): + server = EchoServer(("", PORT), EchoRequestHandler, address_family=address_family) server.serve() if __name__ == "__main__": Modified: pypy/dist/pypy/translator/c/test/test_ext__socket.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_ext__socket.py (original) +++ pypy/dist/pypy/translator/c/test/test_ext__socket.py Sun Dec 11 11:36:05 2005 @@ -1,8 +1,12 @@ import autopath import py -import _socket +import _socket, thread from pypy.translator.c.test.test_genc import compile from 
pypy.translator.translator import Translator +from pypy.module._socket.test import echoserver + +HOST = "localhost" +PORT = 8037 def setup_module(mod): import pypy.module._socket.rpython.exttable # for declare()/declaretype() @@ -105,3 +109,36 @@ f1 = compile(does_stuff, [str, int]) for args in tests: py.test.raises(OSError, f1, *args) + + +class TestConnectedIPv4: + + family = _socket.AF_INET + + def setup_class(cls): + thread.start_new_thread(echoserver.start_server, (), + {"address_family": cls.family}) + + def teardown_class(cls): + import telnetlib + tn = telnetlib.Telnet(HOST, PORT) + tn.write("shutdown\n") + tn.close() + + def test_connect(self): + import os + from pypy.module._socket.rpython import rsocket + def does_stuff(): + fd = rsocket.newsocket(self.family, _socket.SOCK_STREAM, 0) + rsocket.connect(fd, (HOST, PORT, 0, 0)) + sockname = rsocket.getpeername(fd) + os.close(fd) + return sockname[1] + f1 = compile(does_stuff, []) + res = f1() + assert res == PORT + +class DONOT_TestConnectedIPv6(TestConnectedIPv4): + + disabled = not _socket.has_ipv6 + family = _socket.AF_INET6 From cfbolz at codespeak.net Sun Dec 11 12:40:48 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Sun, 11 Dec 2005 12:40:48 +0100 (CET) Subject: [pypy-svn] r21023 - pypy/extradoc/sprintinfo/gothenburg-2005 Message-ID: <20051211114048.2085527DD6@code1.codespeak.net> Author: cfbolz Date: Sun Dec 11 12:40:47 2005 New Revision: 21023 Modified: pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt Log: planning for the next weeks Modified: pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt ============================================================================== --- pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt (original) +++ pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt Sun Dec 11 12:40:47 2005 @@ -21,58 +21,56 @@ Thursday afternoon: half-breakday (starting at 15.00) -Possible sprint tasks -========================= +Possible sprint tasks / Work 
distribution after the sprint +============================================================= JIT work ~~~~~~~~~~~~~~~~~ -(Armin, Michael, Samuele, Arre, Eric) see doc/discussion/draft-jit-ideas.txt - toy target intepreter + parser/assembler (DONE) -- low-level graphs abstract interpreter (MORE PROGRESS) -(- L3 interpreter) +- low-level graphs abstract interpreter: Armin, Michael, Carl Friedrich, + Samuele) +- (L3 interpreter: Carl Friedrich) Stackless ~~~~~~~~~~ -(Richard, Christian) Expose the low-level switching facilities: +Christian, Richard? + - write RPython structures (tasklet, channel) and basic - functions for switching (IN-PROGRESS) -- prototypes for channels and tasklets + tests (FINISHED) -- add an app-level interface (mixed module) + functions for switching +- prototypes for channels and tasklets + tests +- add an app-level interface - implement support structures - a deque module exists already which can be used for channel queues GC, __del__, weakref ~~~~~~~~~~~~~~~~~~~~~ -- implement __del__ support in the RTyper and backends (DONE) +- implement __del__ support in the RTyper and backends (DONE, + performance killer! 10 times slower! argh!!!), Samuele, Carl Friedrich (- possibly implement weakref (at least with Boehm)) -- integrate GC construction framework in the backends - (quite a big task) +- integrate GC construction framework in the backends: Eric, Carl + Friedrich _socket, C gluing for extensions ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -(Nik, Anders L) -- work on _socket (IN-PROGRESS) +- work on _socket: Nik, later - this exposes limitations in our way to - glue to C libraries, think/design solutions - + glue to C libraries, think/design solutions: Armin, maybe -- (DONE) support more basic integer types. Decide on the proper - design (explicit spelling of sizes, or the long-long way?) - note that we already have functions which return 64 bit values. 
threading ~~~~~~~~~~~ -- fix stack_too_big with threads on Windows (Johan) +- fix stack_too_big with threads on Windows (Johan, work started, help + needed, Christian) - investigate why enabling threads creates such a large overhead - think of a design to release the GIL around blocking calls @@ -80,10 +78,11 @@ ~~~~~~~~~~~~~~~~~~~~~~~~~~~ - look into the perfomance and code path for function calls - in our interpreter (IN-PROGRESS) Arre, Eric, with help from Richard) + in our interpreter (MOSTLY DONE) - look into converting the indirect call in the eval loop for bytecode dispatch into a switch: probably needs a representation choice in the RTyper, - a transformation, and integer exitswitch implementation as switch in the backends + a transformation, and integer exitswitch implementation as switch in the + backends: Arre, Eric - ... Logic programming, WP9 @@ -101,10 +100,22 @@ US travel report, maybe towards WP03/WP07 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -Saturday morning (waiting a bit for Richard?) +Saturday morning (DONE) - telling the story about a commercial travel to the states to optimize some Python application - done using RPython - discussing possible advantages/new goals/extensions to the project - ideas about producing extension modules: a new object space? 
+Reports finalization (important!!!, done before 16th December) +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +- pdf generation (Carl Friedrich) +- beautification (people responsible) + +next pypy-sync meeting topics (eric) +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +- pypy-sync meetings attendance +- mallorca sprint topics + From ac at codespeak.net Sun Dec 11 13:07:02 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Sun, 11 Dec 2005 13:07:02 +0100 (CET) Subject: [pypy-svn] r21029 - pypy/dist/pypy/interpreter Message-ID: <20051211120702.6A69D27DD2@code1.codespeak.net> Author: ac Date: Sun Dec 11 13:07:02 2005 New Revision: 21029 Modified: pypy/dist/pypy/interpreter/baseobjspace.py pypy/dist/pypy/interpreter/pyopcode.py Log: Implement a fastcall path for methodcalls. Modified: pypy/dist/pypy/interpreter/baseobjspace.py ============================================================================== --- pypy/dist/pypy/interpreter/baseobjspace.py (original) +++ pypy/dist/pypy/interpreter/baseobjspace.py Sun Dec 11 13:07:02 2005 @@ -462,6 +462,12 @@ def call_function(self, w_func, *args_w): # XXX start of hack for performance + from pypy.interpreter.function import Function, Method + if (isinstance(w_func, Method) and + w_func.w_instance is not None and + len(args_w) <= 3): + return self.call_function(w_func.w_function, w_func.w_instance, *args_w) + from pypy.interpreter.function import Function if isinstance(w_func, Function): if len(args_w) == 0: @@ -490,6 +496,7 @@ args = Arguments(self, list(args_w)) return w_func.call_args(args) # XXX end of hack for performance + args = Arguments(self, list(args_w)) return self.call_args(w_func, args) Modified: pypy/dist/pypy/interpreter/pyopcode.py ============================================================================== --- pypy/dist/pypy/interpreter/pyopcode.py (original) +++ pypy/dist/pypy/interpreter/pyopcode.py Sun Dec 11 13:07:02 2005 @@ -650,7 +650,11 @@ def CALL_FUNCTION(f, oparg): 
# XXX start of hack for performance - if oparg == 1: # 1 arg, 0 keyword arg + if oparg == 0: # 0 arg, 0 keyword arg + w_function = f.valuestack.pop() + w_result = f.space.call_function(w_function) + f.valuestack.push(w_result) + elif oparg == 1: # 1 arg, 0 keyword arg w_arg = f.valuestack.pop() w_function = f.valuestack.pop() w_result = f.space.call_function(w_function, w_arg) From rxe at codespeak.net Sun Dec 11 13:15:33 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Sun, 11 Dec 2005 13:15:33 +0100 (CET) Subject: [pypy-svn] r21030 - pypy/dist/pypy/translator/llvm/module Message-ID: <20051211121533.7DE8527DD2@code1.codespeak.net> Author: rxe Date: Sun Dec 11 13:15:32 2005 New Revision: 21030 Modified: pypy/dist/pypy/translator/llvm/module/boehm.h Log: Make threaded by default. Modified: pypy/dist/pypy/translator/llvm/module/boehm.h ============================================================================== --- pypy/dist/pypy/translator/llvm/module/boehm.h (original) +++ pypy/dist/pypy/translator/llvm/module/boehm.h Sun Dec 11 13:15:32 2005 @@ -1,4 +1,4 @@ -//#define USING_THREADED_BOEHM +#define USING_THREADED_BOEHM #ifdef USING_THREADED_BOEHM From ericvrp at codespeak.net Sun Dec 11 13:18:00 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Sun, 11 Dec 2005 13:18:00 +0100 (CET) Subject: [pypy-svn] r21031 - pypy/dist/pypy/translator/llvm Message-ID: <20051211121800.DEF3627DD2@code1.codespeak.net> Author: ericvrp Date: Sun Dec 11 13:18:00 2005 New Revision: 21031 Modified: pypy/dist/pypy/translator/llvm/funcnode.py Log: enable exception malloc removal by default again Modified: pypy/dist/pypy/translator/llvm/funcnode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/funcnode.py (original) +++ pypy/dist/pypy/translator/llvm/funcnode.py Sun Dec 11 13:18:00 2005 @@ -40,7 +40,7 @@ self.graph = value.graph self.db.genllvm.exceptionpolicy.transform(self.db.translator, 
self.graph) - #remove_exception_mallocs(self.db.translator, self.graph, self.ref) + remove_exception_mallocs(self.db.translator, self.graph, self.ref) #merge_mallocs(self.db.translator, self.graph, self.ref) remove_double_links(self.db.translator, self.graph) From cfbolz at codespeak.net Sun Dec 11 13:18:38 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Sun, 11 Dec 2005 13:18:38 +0100 (CET) Subject: [pypy-svn] r21032 - in pypy/dist/pypy: interpreter objspace/std Message-ID: <20051211121838.4A27027DD2@code1.codespeak.net> Author: cfbolz Date: Sun Dec 11 13:18:36 2005 New Revision: 21032 Modified: pypy/dist/pypy/interpreter/typedef.py pypy/dist/pypy/objspace/std/objspace.py pypy/dist/pypy/objspace/std/typeobject.py Log: (pedronis, cfbolz): fun with typdef: add yet another layer of inheritance for classes for user-defined classes to minimize the number of classes on interp-level that have a __del__ defined. Now __del__ works only if its defined at type-definition time. Modified: pypy/dist/pypy/interpreter/typedef.py ============================================================================== --- pypy/dist/pypy/interpreter/typedef.py (original) +++ pypy/dist/pypy/interpreter/typedef.py Sun Dec 11 13:18:36 2005 @@ -38,54 +38,79 @@ # we cannot specialize:memo by more than one PBC key # so we need to work a bit to allow that -def get_unique_interplevel_subclass(cls, hasdict, wants_slots): - if hasdict: - if wants_slots: - return get_unique_interplevel_WithDictWithSlots(cls) - else: - return get_unique_interplevel_WithDictNoSlots(cls) - else: - if wants_slots: - return get_unique_interplevel_NoDictWithSlots(cls) - else: - return get_unique_interplevel_NoDictNoSlots(cls) +def get_unique_interplevel_subclass(cls, hasdict, wants_slots, needsdel=False): + if needsdel: + if hasdict: + if wants_slots: + return get_unique_interplevel_WithDictWithSlotsWithDel(cls) + else: + return get_unique_interplevel_WithDictNoSlotsWithDel(cls) + else: + if wants_slots: + 
return get_unique_interplevel_NoDictWithSlotsWithDel(cls) + else: + return get_unique_interplevel_NoDictNoSlotsWithDel(cls) + else: + if hasdict: + if wants_slots: + return get_unique_interplevel_WithDictWithSlotsNoDel(cls) + else: + return get_unique_interplevel_WithDictNoSlotsNoDel(cls) + else: + if wants_slots: + return get_unique_interplevel_NoDictWithSlotsNoDel(cls) + else: + return get_unique_interplevel_NoDictNoSlotsNoDel(cls) get_unique_interplevel_subclass._annspecialcase_ = "specialize:arg0" for hasdict in False, True: - for wants_slots in False, True: - name = hasdict and "WithDict" or "NoDict" - name += wants_slots and "WithSlots" or "NoSlots" - funcname = "get_unique_interplevel_%s" % (name,) - exec compile2(""" - subclass_cache_%(name)s = {} - def %(funcname)s(cls): - try: - return subclass_cache_%(name)s[cls] - except KeyError: - subcls = _buildusercls(cls, %(hasdict)r, %(wants_slots)r) - subclass_cache_%(name)s[cls] = subcls - return subcls - %(funcname)s._annspecialcase_ = "specialize:memo" - """ % locals()) + for wants_del in False, True: + for wants_slots in False, True: + name = hasdict and "WithDict" or "NoDict" + name += wants_slots and "WithSlots" or "NoSlots" + name += wants_del and "WithDel" or "NoDel" + funcname = "get_unique_interplevel_%s" % (name,) + exec compile2(""" + subclass_cache_%(name)s = {} + def %(funcname)s(cls): + try: + return subclass_cache_%(name)s[cls] + except KeyError: + subcls = _buildusercls(cls, %(hasdict)r, %(wants_slots)r, %(wants_del)r) + subclass_cache_%(name)s[cls] = subcls + return subcls + %(funcname)s._annspecialcase_ = "specialize:memo" + """ % locals()) -def _buildusercls(cls, hasdict, wants_slots): +def _buildusercls(cls, hasdict, wants_slots, wants_del): "NOT_RPYTHON: initialization-time only" typedef = cls.typedef if hasdict and typedef.hasdict: - return get_unique_interplevel_subclass(cls, False, wants_slots) + return get_unique_interplevel_subclass(cls, False, wants_slots, wants_del) name = ['User'] if 
not hasdict: name.append('NoDict') if wants_slots: name.append('WithSlots') + if wants_del: + name.append('WithDel') name.append(cls.__name__) name = ''.join(name) - if wants_slots: - supercls = get_unique_interplevel_subclass(cls, hasdict, False) + if wants_del: + supercls = get_unique_interplevel_subclass(cls, hasdict, wants_slots, False) + class Proto(object): + def __del__(self): + try: + self.space.userdel(self) + except OperationError, e: + e.write_unraisable(self.space, 'method __del__ of ', self) + e.clear(self.space) # break up reference cycles + elif wants_slots: + supercls = get_unique_interplevel_subclass(cls, hasdict, False, False) class Proto(object): def user_setup_slots(self, nslots): @@ -97,7 +122,7 @@ def getslotvalue(self, index): return self.slots_w[index] elif hasdict: - supercls = get_unique_interplevel_subclass(cls, False, False) + supercls = get_unique_interplevel_subclass(cls, False, False, False) class Proto(object): def getdict(self): @@ -126,12 +151,6 @@ # only used by descr_set___class__ self.w__class__ = w_subtype - def __del__(self): - try: - self.space.userdel(self) - except OperationError, e: - e.write_unraisable(self.space, 'method __del__ of ', self) - e.clear(self.space) # break up reference cycles def user_setup(self, space, w_subtype, nslots): self.space = space Modified: pypy/dist/pypy/objspace/std/objspace.py ============================================================================== --- pypy/dist/pypy/objspace/std/objspace.py (original) +++ pypy/dist/pypy/objspace/std/objspace.py Sun Dec 11 13:18:36 2005 @@ -372,7 +372,7 @@ instance = instantiate(cls) else: w_subtype = w_type.check_user_subclass(w_subtype) - subcls = get_unique_interplevel_subclass(cls, w_subtype.hasdict, w_subtype.nslots != 0) + subcls = get_unique_interplevel_subclass(cls, w_subtype.hasdict, w_subtype.nslots != 0, w_subtype.needsdel) instance = instantiate(subcls) instance.user_setup(self, w_subtype, w_subtype.nslots) assert isinstance(instance, cls) 
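The commit above adds a third axis, `wants_del`, to the per-flag subclass cache: every `(hasdict, wants_slots, wants_del)` combination gets its own memoizing accessor and its own cache dict. A minimal stand-alone sketch of that caching pattern (the helper names here are illustrative, not PyPy's real API):

```python
# Sketch of a per-flag-combination subclass cache, in the spirit of
# get_unique_interplevel_subclass.  Helper names are made up.

_caches = {}  # (hasdict, wants_slots, wants_del) -> {cls: subcls}

def get_unique_subclass(cls, hasdict, wants_slots, wants_del):
    cache = _caches.setdefault((hasdict, wants_slots, wants_del), {})
    try:
        return cache[cls]
    except KeyError:
        subcls = _build_subclass(cls, hasdict, wants_slots, wants_del)
        cache[cls] = subcls
        return subcls

def _build_subclass(cls, hasdict, wants_slots, wants_del):
    # Mirror the name-building scheme of _buildusercls.
    name = ''.join(['User',
                    'WithDict' if hasdict else 'NoDict',
                    'WithSlots' if wants_slots else 'NoSlots',
                    'WithDel' if wants_del else 'NoDel',
                    cls.__name__])
    return type(name, (cls,), {})
```

Asking twice with the same flags returns the identical class object, which is what lets the annotator treat the call as a memoized constant.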
Modified: pypy/dist/pypy/objspace/std/typeobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/typeobject.py (original) +++ pypy/dist/pypy/objspace/std/typeobject.py Sun Dec 11 13:18:36 2005 @@ -50,6 +50,7 @@ w_self.dict_w = dict_w w_self.ensure_static__new__() w_self.nslots = 0 + w_self.needsdel = False w_self.w_bestbase = None # make sure there is a __doc__ in dict_w @@ -119,6 +120,7 @@ space.wrap("instance layout conflicts in " "multiple inheritance")) w_self.hasdict = w_self.hasdict or w_base.hasdict + w_self.needsdel = w_self.needsdel or w_base.needsdel if not w_newstyle: # only classic bases raise OperationError(space.w_TypeError, space.wrap("a new-style class can't have only classic bases")) @@ -176,7 +178,8 @@ if wantdict and not w_self.hasdict: w_self.dict_w['__dict__'] = space.wrap(std_dict_descr) w_self.hasdict = True - + if '__del__' in dict_w: + w_self.needsdel = True w_type = space.type(w_self) if not space.is_w(w_type, space.w_type): w_self.mro_w = [] @@ -185,7 +188,6 @@ w_mro = space.call_args(mro_func, mro_func_args) w_self.mro_w = space.unpackiterable(w_mro) return - w_self.mro_w = w_self.compute_mro() # compute the most parent class with the same layout as us From arigo at codespeak.net Sun Dec 11 13:19:54 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 11 Dec 2005 13:19:54 +0100 (CET) Subject: [pypy-svn] r21033 - in pypy/dist/pypy: doc objspace/flow rpython Message-ID: <20051211121954.C8EA427DD2@code1.codespeak.net> Author: arigo Date: Sun Dec 11 13:19:51 2005 New Revision: 21033 Modified: pypy/dist/pypy/doc/objspace.txt pypy/dist/pypy/objspace/flow/model.py pypy/dist/pypy/rpython/llinterp.py Log: * Should not rely on the exit links to be "False-True" in this order. * Fixed the llinterp in this respect. * Fixed checkgraph() to detect several exit links with the same exitcase. 
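The `needsdel` flag threaded through typeobject.py above is computed the same way as `hasdict`: it is set when the class body itself defines `__del__`, and it is inherited by OR-ing over the bases. A toy model of just that rule (`FakeType` is illustrative; the real bookkeeping lives in `W_TypeObject`):

```python
# Toy model: a type "needs del" if its own dict defines __del__,
# or if any base already needed it.

class FakeType:
    def __init__(self, dict_w, bases=()):
        self.needsdel = '__del__' in dict_w
        for base in bases:
            self.needsdel = self.needsdel or base.needsdel

plain = FakeType({})
finalized = FakeType({'__del__': lambda self: None})
derived = FakeType({}, bases=(finalized,))   # inherits the flag
```

This is why the objspace can pick the `WithDel` interp-level subclass for a user class even when `__del__` only appears on a base.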
Modified: pypy/dist/pypy/doc/objspace.txt ============================================================================== --- pypy/dist/pypy/doc/objspace.txt (original) +++ pypy/dist/pypy/doc/objspace.txt Sun Dec 11 13:19:51 2005 @@ -396,7 +396,7 @@ Each Block ends in one of the following ways: * unconditional jump: exitswitch is None, exits contains a single Link. - * conditional jump: exitswitch is one of the Variables that appear in the Block, and exits contains one or more Links (usually 2). Each Link's exitcase gives a concrete value. This is the equivalent of a "switch": the control follows the Link whose exitcase matches the run-time value of the exitswitch Variable. It is a run-time error if the Variable doesn't match any exitcase. (Currently only used with 2 Links whose exitcase are False and True, respectively.) + * conditional jump: exitswitch is one of the Variables that appear in the Block, and exits contains one or more Links (usually 2). Each Link's exitcase gives a concrete value. This is the equivalent of a "switch": the control follows the Link whose exitcase matches the run-time value of the exitswitch Variable. It is a run-time error if the Variable doesn't match any exitcase. (Currently only used with 2 Links whose exitcase are False and True, respectively -- but this can change, so don't rely on it!) * exception catching: exitswitch is ``Constant(last_exception)``. The first Link has exitcase set to None and represents the non-exceptional path. The next Links have exitcase set to a subclass of Exception, and are taken when the *last* operation of the basic block raises a matching exception. (Thus the basic block must not be empty, and only the last operation is protected by the handler.) * return or except: the returnblock and the exceptblock have operations set to an empty tuple, exitswitch to None, and exits empty. 
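Since the documentation change above explicitly warns against relying on the False/True ordering of exit links, the safe way to follow a conditional jump is to search the links for a matching `exitcase` rather than indexing `exits` by the switch value, which is exactly what the llinterp fix in this commit does. A toy illustration (`Link` here is a stripped-down stand-in for the flow-model class):

```python
# Toy illustration: follow a conditional exit by matching exitcase,
# never by indexing block.exits.

class Link:
    def __init__(self, exitcase, target):
        self.exitcase = exitcase
        self.target = target

def follow_exit(exits, switch_value):
    for link in exits:
        if link.exitcase == switch_value:
            return link.target
    # mirrors the llinterp change: an unmatched switch value is an error
    raise ValueError("exit case %r not found" % (switch_value,))

# The links may be stored in either order; lookup is unaffected.
exits = [Link(True, "then-block"), Link(False, "else-block")]
```

The companion `checkgraph()` change enforces the other half of the contract: no two links of one block may share an `exitcase`, so the lookup above is unambiguous.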
Modified: pypy/dist/pypy/objspace/flow/model.py ============================================================================== --- pypy/dist/pypy/objspace/flow/model.py (original) +++ pypy/dist/pypy/objspace/flow/model.py Sun Dec 11 13:19:51 2005 @@ -520,6 +520,7 @@ assert isinstance(block.exitswitch, Variable) assert block.exitswitch in vars + allexitcases = {} for link in block.exits: assert len(link.args) == len(link.target.inputargs) assert link.prevblock is block @@ -542,6 +543,8 @@ # if not exc_link: # assert v.value is not last_exception # #assert v.value != last_exc_value + allexitcases[link.exitcase] = True + assert len(allexitcases) == len(block.exits) vars_previous_blocks.update(vars) try: Modified: pypy/dist/pypy/rpython/llinterp.py ============================================================================== --- pypy/dist/pypy/rpython/llinterp.py (original) +++ pypy/dist/pypy/rpython/llinterp.py Sun Dec 11 13:19:51 2005 @@ -208,8 +208,13 @@ # no handler found, pass on raise e else: - index = self.getval(block.exitswitch) - link = block.exits[index] + llexitvalue = self.getval(block.exitswitch) + for link in block.exits: + if link.llexitcase == llexitvalue: + break # found -- the result is in 'link' + else: + raise ValueError("exit case %r not found in the exit links " + "of %r" % (llexitvalue, block)) return link.target, [self.getval(x) for x in link.args] def eval_operation(self, operation): From nik at codespeak.net Sun Dec 11 13:21:04 2005 From: nik at codespeak.net (nik at codespeak.net) Date: Sun, 11 Dec 2005 13:21:04 +0100 (CET) Subject: [pypy-svn] r21034 - pypy/dist/pypy/translator/c/test Message-ID: <20051211122104.D554F27DD2@code1.codespeak.net> Author: nik Date: Sun Dec 11 13:21:03 2005 New Revision: 21034 Modified: pypy/dist/pypy/translator/c/test/test_ext__socket.py Log: (ale, nik) towards IPv6 support for socket.connect. not quite working, yet. 
Modified: pypy/dist/pypy/translator/c/test/test_ext__socket.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_ext__socket.py (original) +++ pypy/dist/pypy/translator/c/test/test_ext__socket.py Sun Dec 11 13:21:03 2005 @@ -104,7 +104,7 @@ ] def does_stuff(host, port): fd = rsocket.newsocket(_socket.AF_INET, _socket.SOCK_STREAM, 0) - rsocket.connect(fd, (host, port, 0, 0)) + rsocket.connect(fd, (host, port, 0, 0), _socket.AF_INET) os.close(fd) f1 = compile(does_stuff, [str, int]) for args in tests: @@ -130,7 +130,7 @@ from pypy.module._socket.rpython import rsocket def does_stuff(): fd = rsocket.newsocket(self.family, _socket.SOCK_STREAM, 0) - rsocket.connect(fd, (HOST, PORT, 0, 0)) + rsocket.connect(fd, (HOST, PORT, 0, 0), self.family) sockname = rsocket.getpeername(fd) os.close(fd) return sockname[1] From nik at codespeak.net Sun Dec 11 13:21:42 2005 From: nik at codespeak.net (nik at codespeak.net) Date: Sun, 11 Dec 2005 13:21:42 +0100 (CET) Subject: [pypy-svn] r21035 - in pypy/dist/pypy: module/_socket module/_socket/rpython translator/c/src Message-ID: <20051211122142.5D35F27DD2@code1.codespeak.net> Author: nik Date: Sun Dec 11 13:21:40 2005 New Revision: 21035 Modified: pypy/dist/pypy/module/_socket/interp_socket.py pypy/dist/pypy/module/_socket/rpython/ll__socket.py pypy/dist/pypy/module/_socket/rpython/rsocket.py pypy/dist/pypy/translator/c/src/ll__socket.h Log: (ale, nik) oops, missed these files in the previous commit. towards ipv6 support for connect. 
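The rsocket change being tested here passes the address family down so that `connect` can decide how much of the sockname tuple to use: AF_INET wants only `(host, port)`, while AF_INET6 keeps the full `(host, port, flowinfo, scope_id)` 4-tuple. A stand-alone sketch of that dispatch using the stdlib `socket` constants (the helper name is illustrative, not PyPy's actual API):

```python
import socket

def sockaddr_for_family(family, sockname):
    """Trim a 4-tuple sockname to what the given family's connect() expects."""
    if family == socket.AF_INET:
        return sockname[:2]                 # (host, port)
    elif family == socket.AF_INET6:
        return sockname                     # (host, port, flowinfo, scope_id)
    raise ValueError("unsupported address family %r" % (family,))
```

This replaces the earlier "XXX IPv4 only" truncation with an explicit per-family choice, matching the switch added on the C side of the commit.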
Modified: pypy/dist/pypy/module/_socket/interp_socket.py ============================================================================== --- pypy/dist/pypy/module/_socket/interp_socket.py (original) +++ pypy/dist/pypy/module/_socket/interp_socket.py Sun Dec 11 13:21:40 2005 @@ -712,7 +712,7 @@ # XXX IPv6 and Unix sockets missing here pass try: - rsocket.connect(self.fd, sockname) + rsocket.connect(self.fd, sockname, self.family) except OSError, ex: raise w_get_socketerror(space, e.strerror, e.errno) # XXX timeout doesn't really work at the moment Modified: pypy/dist/pypy/module/_socket/rpython/ll__socket.py ============================================================================== --- pypy/dist/pypy/module/_socket/rpython/ll__socket.py (original) +++ pypy/dist/pypy/module/_socket/rpython/ll__socket.py Sun Dec 11 13:21:40 2005 @@ -95,7 +95,7 @@ return 0 ll__socket_newsocket.suggested_primitive = True -def ll__socket_connect(fd, sockname): +def ll__socket_connect(fd, sockname, family): return None ll__socket_connect.suggested_primitive = True Modified: pypy/dist/pypy/module/_socket/rpython/rsocket.py ============================================================================== --- pypy/dist/pypy/module/_socket/rpython/rsocket.py (original) +++ pypy/dist/pypy/module/_socket/rpython/rsocket.py Sun Dec 11 13:21:40 2005 @@ -40,9 +40,12 @@ socket_cache[fileno] = s return fileno -def connect(fd, sockname): +def connect(fd, sockname, family): s = socket_cache[fd] - s.connect(sockname[:2]) # XXX IPv4 only + if family == socket.AF_INET: + s.connect(sockname[:2]) + elif family == socket.AF_INET6: + s.connect(sockname) def getpeername(fd): s = socket_cache[fd] Modified: pypy/dist/pypy/translator/c/src/ll__socket.h ============================================================================== --- pypy/dist/pypy/translator/c/src/ll__socket.h (original) +++ pypy/dist/pypy/translator/c/src/ll__socket.h Sun Dec 11 13:21:40 2005 @@ -102,19 +102,53 @@ return fd; } -void 
LL__socket_connect(int fd, RPySOCKET_SOCKNAME* sockname) +void LL__socket_connect(int fd, RPySOCKET_SOCKNAME* sockname, int family) { - struct sockaddr_in addr; + struct sockaddr addr; + int addr_len; if (setipaddr(RPyString_AsString(sockname->t_item0), (struct sockaddr *) &addr, - sizeof(addr), AF_INET) < 0) { + sizeof(addr), family) < 0) { // XXX we actually want to raise socket.error RPYTHON_RAISE_OSERROR(errno); return NULL; } - addr.sin_family = AF_INET; - addr.sin_port = htons(sockname->t_item1); - if (connect(fd, &addr, sizeof(addr)) < 0) { + + switch (family) { + + case AF_INET: + { + struct sockaddr_in* addr_in; + addr_in = (struct sockaddr_in *) &addr; + addr_in->sin_family = family; + addr_in->sin_port = htons(sockname->t_item1); + addr_len = sizeof(addr_in); + break; + } + +#ifdef ENABLE_IPV6 + case AF_INET6: + { + struct sockaddr_in6* addr_in6; + addr_in6 = (struct sockaddr_in6 *) &addr; + addr_in6->sin6_family = family; + addr_in6->sin6_port = htons((short)sockname->t_item1); + addr_in6->sin6_flowinfo = sockname->t_item2; + addr_in6->sin6_scope_id = sockname->t_item3; + addr_len = sizeof(addr_in6); + break; + } +#endif + + default: + { + // XXX raise some error + break; + } + + } + + if (connect(fd, &addr, addr_len) < 0) { // XXX we actually want to raise socket.error RPYTHON_RAISE_OSERROR(errno); return NULL; From ericvrp at codespeak.net Sun Dec 11 13:23:16 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Sun, 11 Dec 2005 13:23:16 +0100 (CET) Subject: [pypy-svn] r21036 - pypy/dist/pypy/translator/goal Message-ID: <20051211122316.BCAAC27DD2@code1.codespeak.net> Author: ericvrp Date: Sun Dec 11 13:23:15 2005 New Revision: 21036 Modified: pypy/dist/pypy/translator/goal/bench-cronjob.py pypy/dist/pypy/translator/goal/bench-unix.py Log: * Limit the number of executables that are benchmarked * cleanup *.pyc files and _cache before benchmarking Modified: pypy/dist/pypy/translator/goal/bench-cronjob.py 
============================================================================== --- pypy/dist/pypy/translator/goal/bench-cronjob.py (original) +++ pypy/dist/pypy/translator/goal/bench-cronjob.py Sun Dec 11 13:23:15 2005 @@ -16,6 +16,12 @@ os.system('make -j3 tools-only 2>&1') def compile(backend): + os.chdir(homedir + '/projects/pypy-dist') + os.system('rm `find . -name *.pyc`') + + os.chdir(homedir + '/projects/pypy-dist/pypy/_cache') + os.system('rm *') + os.chdir(homedir + '/projects/pypy-dist/pypy/translator/goal') os.system('python translate_pypy.py --backend=%(backend)s --text --batch targetpypystandalone 2>&1' % locals()) @@ -47,8 +53,9 @@ if backends == []: backends = 'llvm c'.split() print time.ctime() + #if 'llvm' in backends: + # update_llvm() update_pypy() - update_llvm() for backend in backends: try: compile(backend) Modified: pypy/dist/pypy/translator/goal/bench-unix.py ============================================================================== --- pypy/dist/pypy/translator/goal/bench-unix.py (original) +++ pypy/dist/pypy/translator/goal/bench-unix.py Sun Dec 11 13:23:15 2005 @@ -4,6 +4,7 @@ import os, sys, time +MAX_BENCHMARKS = 40 PYSTONE_CMD = 'from test import pystone;pystone.main(%s)' PYSTONE_PATTERN = 'This machine benchmarks at' RICHARDS_CMD = 'from richards import *;main(iterations=%d)' @@ -51,7 +52,7 @@ ref_stone = run_pystone() print FMT % (time.ctime(), 'python %s' % sys.version.split()[0], ref_rich, 1.0, ref_stone, 1.0) sys.stdout.flush() - for exe in get_executables(): + for exe in get_executables()[:MAX_BENCHMARKS]: exename = os.path.splitext(exe)[0].lstrip('./') ctime = time.ctime( os.path.getctime(exename) ) rich = run_richards(exe, 1) From arigo at codespeak.net Sun Dec 11 13:27:05 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 11 Dec 2005 13:27:05 +0100 (CET) Subject: [pypy-svn] r21037 - in pypy/dist/pypy/jit: . 
test Message-ID: <20051211122705.D57C027DD2@code1.codespeak.net> Author: arigo Date: Sun Dec 11 13:27:03 2005 New Revision: 21037 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_llabstractinterp.py Log: * Bug fix & test: forgot to create the exitcase and llexitcase on the new links! * Added caching of specializations for the tests, similar to test_llinterp. Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Sun Dec 11 13:27:03 2005 @@ -52,7 +52,7 @@ def with_fresh_variables(self, memo): return self - def match(self, other): + def match(self, other, memo): return isinstance(other, LLConcreteValue) and self.value == other.value @@ -92,7 +92,7 @@ def with_fresh_variables(self, memo): return LLRuntimeValue(self.getconcretetype()) - def match(self, other): + def match(self, other, memo): # Note: the meaning of match() is actually to see if calling # with_fresh_variables() on both 'self' and 'other' would give the # same result. This is why any two LLRuntimeValues match each other. 
@@ -222,14 +222,18 @@ result.extend(self.getfield(name).getruntimevars(memo)) return result - def match(self, other): - if self is other: - return True + def match(self, other, memo): + if (False, self) in memo: + return other is memo[False, self] + if (True, other) in memo: + return self is memo[True, other] + memo[False, self] = other + memo[True, other] = self assert self.T == other.T for name in self.T._names: a1 = self.getfield(name) a2 = other.getfield(name) - if not a1.match(a2): + if not a1.match(a2, memo): return False else: return True @@ -258,9 +262,9 @@ def with_fresh_variables(self, memo): return LLVirtualPtr(self.containerobj.copy(memo)) - def match(self, other): + def match(self, other, memo): if isinstance(other, LLVirtualPtr): - return self.containerobj.match(other.containerobj) + return self.containerobj.match(other.containerobj, memo) else: return False @@ -278,8 +282,9 @@ def match(self, args_a): # simple for now + memo = {} for a1, a2 in zip(self.args_a, args_a): - if not a1.match(a2): + if not a1.match(a2, memo): return False else: return True @@ -468,6 +473,9 @@ for origlink in links: args_a = [builder.binding(v) for v in origlink.args] newlink = self.interp.schedule(args_a, origlink.target) + if newexitswitch is not None: + newlink.exitcase = origlink.exitcase + newlink.llexitcase = origlink.llexitcase newlinks.append(newlink) else: # copies of return and except blocks are *normal* blocks currently; Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/test/test_llabstractinterp.py (original) +++ pypy/dist/pypy/jit/test/test_llabstractinterp.py Sun Dec 11 13:27:03 2005 @@ -15,23 +15,38 @@ t = annmodel.lltype_to_annotation(T) return a.typeannotation(t) -def abstrinterp(ll_function, argvalues, arghints): +_lastinterpreted = [] +def get_and_residualize_graph(ll_function, argvalues, arghints): + key = ll_function, tuple(arghints), 
tuple([argvalues[n] for n in arghints]) + for key1, value1 in _lastinterpreted: # 'key' is not hashable + if key1 == key: + return value1 + if len(_lastinterpreted) >= 3: + del _lastinterpreted[0] + # build the normal ll graphs for ll_function t = TranslationContext() a = t.buildannotator() argtypes = [annotation(a, value) for value in argvalues] graph1 = annotate_lowlevel_helper(a, ll_function, argtypes) rtyper = t.buildrtyper() rtyper.specialize() + # build the residual ll graphs by propagating the hints interp = LLAbstractInterp() hints = {} - argvalues2 = argvalues[:] - lst = list(arghints) - lst.sort() - lst.reverse() - for hint in lst: - hints[graph1.getargs()[hint]] = argvalues2[hint] - del argvalues2[hint] + for hint in arghints: + hints[graph1.getargs()[hint]] = argvalues[hint] graph2 = interp.eval(graph1, hints) + # cache and return the original and the residual ll graph + result = t, interp, graph1, graph2 + _lastinterpreted.append((key, result)) + return result + +def abstrinterp(ll_function, argvalues, arghints): + t, interp, graph1, graph2 = get_and_residualize_graph( + ll_function, argvalues, arghints) + argvalues2 = [argvalues[n] for n in range(len(argvalues)) + if n not in arghints] + rtyper = t.rtyper # check the result by running it llinterp = LLInterpreter(rtyper) result1 = llinterp.eval_graph(graph1, argvalues) @@ -95,6 +110,8 @@ return y graph2, insns = abstrinterp(ll_function, [6, 42], []) assert insns == {'int_is_true': 1, 'int_add': 2} + graph2, insns = abstrinterp(ll_function, [0, 42], []) + assert insns == {'int_is_true': 1, 'int_add': 2} def test_unrolling_loop(): def ll_function(x, y): From mwh at codespeak.net Sun Dec 11 13:30:08 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Sun, 11 Dec 2005 13:30:08 +0100 (CET) Subject: [pypy-svn] r21038 - pypy/dist/pypy/rpython/test Message-ID: <20051211123008.745F227DC6@code1.codespeak.net> Author: mwh Date: Sun Dec 11 13:30:07 2005 New Revision: 21038 Modified: 
pypy/dist/pypy/rpython/test/test_llinterp.py
Log:
a test for Armin's last checkin to the llinterpreter.

Modified: pypy/dist/pypy/rpython/test/test_llinterp.py
==============================================================================
--- pypy/dist/pypy/rpython/test/test_llinterp.py	(original)
+++ pypy/dist/pypy/rpython/test/test_llinterp.py	Sun Dec 11 13:30:07 2005
@@ -333,6 +333,32 @@
     res = interpret(f, [o, o], someobjects=True)
     assert res is True

+
+def test_funny_links():
+    from pypy.objspace.flow.model import Block, FunctionGraph, \
+         SpaceOperation, Variable, Constant, Link
+    for i in range(2):
+        v_i = Variable("i")
+        v_case = Variable("case")
+        block = Block([v_i])
+        g = FunctionGraph("is_one", block)
+        block.operations.append(SpaceOperation("eq", [v_i, Constant(1)], v_case))
+        block.exitswitch = v_case
+        tlink = Link([Constant(1)], g.returnblock, True)
+        flink = Link([Constant(0)], g.returnblock, False)
+        links = [tlink, flink]
+        if i:
+            links.reverse()
+        block.closeblock(*links)
+        t = TranslationContext()
+        a = t.buildannotator()
+        a.build_graph_types(g, [annmodel.SomeInteger()])
+        rtyper = t.buildrtyper()
+        rtyper.specialize()
+        interp = LLInterpreter(rtyper)
+        assert interp.eval_graph(g, [1]) == 1
+        assert interp.eval_graph(g, [0]) == 0
+
 #__________________________________________________________________
 #
 # Test objects and instances

From ale at codespeak.net  Sun Dec 11 14:20:40 2005
From: ale at codespeak.net (ale at codespeak.net)
Date: Sun, 11 Dec 2005 14:20:40 +0100 (CET)
Subject: [pypy-svn] r21041 - pypy/dist/pypy/translator/c/src
Message-ID: <20051211132040.9EE9127DC3@code1.codespeak.net>

Author: ale
Date: Sun Dec 11 14:20:39 2005
New Revision: 21041

Modified:
   pypy/dist/pypy/translator/c/src/ll__socket.h
Log:
(nik, ale) Oops - we have to get the size of the struct not the size of the pointer

Modified: pypy/dist/pypy/translator/c/src/ll__socket.h
==============================================================================
---
pypy/dist/pypy/translator/c/src/ll__socket.h (original) +++ pypy/dist/pypy/translator/c/src/ll__socket.h Sun Dec 11 14:20:39 2005 @@ -106,7 +106,7 @@ { struct sockaddr addr; int addr_len; - + if (setipaddr(RPyString_AsString(sockname->t_item0), (struct sockaddr *) &addr, sizeof(addr), family) < 0) { // XXX we actually want to raise socket.error @@ -122,7 +122,7 @@ addr_in = (struct sockaddr_in *) &addr; addr_in->sin_family = family; addr_in->sin_port = htons(sockname->t_item1); - addr_len = sizeof(addr_in); + addr_len = sizeof(*addr_in); break; } @@ -135,7 +135,7 @@ addr_in6->sin6_port = htons((short)sockname->t_item1); addr_in6->sin6_flowinfo = sockname->t_item2; addr_in6->sin6_scope_id = sockname->t_item3; - addr_len = sizeof(addr_in6); + addr_len = sizeof(*addr_in6); break; } #endif From arigo at codespeak.net Sun Dec 11 15:31:51 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 11 Dec 2005 15:31:51 +0100 (CET) Subject: [pypy-svn] r21042 - in pypy/dist/pypy/jit: . test Message-ID: <20051211143151.7676D27DC8@code1.codespeak.net> Author: arigo Date: Sun Dec 11 15:31:49 2005 New Revision: 21042 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_llabstractinterp.py Log: Bug fix, with a test. 
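The one-character fix in r21041 just above — `sizeof(addr_in)` versus `sizeof(*addr_in)` — is the classic C pitfall of measuring the pointer instead of the structure it points to. The same gap can be demonstrated from Python with `ctypes`; the field layout below is a rough, illustrative stand-in for `struct sockaddr_in6`, not an exact replica:

```python
import ctypes

# Illustrative approximation of struct sockaddr_in6 (layout simplified).
class SockaddrIn6Like(ctypes.Structure):
    _fields_ = [("sin6_family",   ctypes.c_short),
                ("sin6_port",     ctypes.c_ushort),
                ("sin6_flowinfo", ctypes.c_uint32),
                ("sin6_addr",     ctypes.c_ubyte * 16),
                ("sin6_scope_id", ctypes.c_uint32)]

struct_size  = ctypes.sizeof(SockaddrIn6Like)                  # the struct itself
pointer_size = ctypes.sizeof(ctypes.POINTER(SockaddrIn6Like))  # just a pointer
```

Passing the pointer size as `addr_len` to `connect()` truncates the address, which is why the IPv6 path in the previous commit did not work until this fix.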
Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Sun Dec 11 15:31:49 2005 @@ -260,7 +260,13 @@ return None def with_fresh_variables(self, memo): - return LLVirtualPtr(self.containerobj.copy(memo)) + if self in memo: + return memo[self] + else: + result = LLVirtualPtr(None) + memo[self] = result + result.containerobj = self.containerobj.copy(memo) + return result def match(self, other, memo): if isinstance(other, LLVirtualPtr): Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/test/test_llabstractinterp.py (original) +++ pypy/dist/pypy/jit/test/test_llabstractinterp.py Sun Dec 11 15:31:49 2005 @@ -239,3 +239,36 @@ return n1 * n2 graph2, insns = abstrinterp(ll_function, [7, 1], [0]) assert insns == {'int_is_true': 1, 'int_add': 1, 'int_mul': 1} + +def test_dont_merge_forced_and_not_forced(): + S = lltype.GcStruct('S', ('n', lltype.Signed)) + def ll_do_nothing(s): + s.n = 2 + def ll_function(flag): + s = lltype.malloc(S) + s.n = 12 + t = s.n + if flag: + ll_do_nothing(s) + return t + s.n + graph2, insns = abstrinterp(ll_function, [0], []) + # XXX fragile test: at the moment, the two branches of the 'if' are not + # being merged at all because 's' was forced in one case only. 
+ assert insns == {'direct_call': 1, 'int_is_true': 1, 'int_add': 2, + 'malloc': 1, 'setfield': 2, 'getfield': 1} + +def test_unique_virtualptrs(): + S = lltype.GcStruct('S', ('n', lltype.Signed)) + def ll_do_nothing(s): + s.n = 2 + def ll_function(flag, flag2): + s = lltype.malloc(S) + s.n = 12 + if flag2: # flag2 should always be 0 + t = lltype.nullptr(S) + else: + t = s + if flag: + ll_do_nothing(s) + return s.n * t.n + graph2, insns = abstrinterp(ll_function, [1, 0], []) From arigo at codespeak.net Sun Dec 11 15:43:06 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 11 Dec 2005 15:43:06 +0100 (CET) Subject: [pypy-svn] r21043 - pypy/dist/pypy/jit Message-ID: <20051211144306.CAE8527DD0@code1.codespeak.net> Author: arigo Date: Sun Dec 11 15:43:05 2005 New Revision: 21043 Modified: pypy/dist/pypy/jit/llabstractinterp.py Log: Merged the VirtualStruct and LLVirtualPtr classes into one. Indeed, following the example of Psyco, they should always be in one-to-one correspondance, so this check-in makes sure they are :-) Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Sun Dec 11 15:43:05 2005 @@ -101,7 +101,10 @@ ll_no_return_value = LLRuntimeValue(const(None, lltype.Void)) -class VirtualStruct(object): +class LLVirtualStruct(LLAbstractValue): + """Stands for a pointer to a malloc'ed structure; the structure is not + malloc'ed so far, but we record which fields have which value. 
+ """ parent = None parentindex = None @@ -109,6 +112,12 @@ self.T = STRUCT self.fields = {} + def getconcretetype(self): + return lltype.Ptr(self.T) + + def maybe_get_constant(self): + return None + def setparent(self, parent, parentindex): self.parent = parent self.parentindex = parentindex @@ -126,28 +135,29 @@ T = getattr(self.T, name) if isinstance(T, lltype.ContainerType): # reading a substructure - substr = VirtualStruct(T) - substr.setparent(self, name) - a_result = LLVirtualPtr(substr) - self.fields[name] = a_result - return a_result + a_substr = LLVirtualStruct(T) + a_substr.setparent(self, name) + self.fields[name] = a_substr + return a_substr else: # no value ever set, return a default return LLRuntimeValue(const(T._defl())) - def setfield(self, name, value): - self.fields[name] = value + def setfield(self, name, a_value): + self.fields[name] = a_value - def copy(self, memo): + def with_fresh_variables(self, memo): if self in memo: return memo[self] # already seen else: - result = VirtualStruct(self.T) + result = LLVirtualStruct(self.T) memo[self] = result if self.parent is not None: - # build the parent first -- note that parent.copy() will pick - # up 'result' again, because it is already in the memo - result.setparent(self.parent.copy(memo), self.parentindex) + # build the parent first -- note that + # parent.with_fresh_variables() will pick up 'result' again, + # because it is already in the memo + result.setparent(self.parent.with_fresh_variables(memo), + self.parentindex) # cannot keep lazy fields around: the copy is expected to have # only variables, not constants @@ -156,7 +166,7 @@ result.fields[name] = a return result - def force(self, builder): + def forcevarorconst(self, builder): v_result = newvar(lltype.Ptr(self.T)) if self.parent is not None: v_parent = self.parent.force(builder) @@ -171,6 +181,8 @@ print 'force:', op builder.residual_operations.append(op) self.buildcontent(builder, v_result) + self.__class__ = LLRuntimeValue + 
self.__dict__ = {'copy_v': v_result} return v_result def buildcontent(self, builder, v_target): @@ -187,8 +199,8 @@ v_subptr) print 'force:', op builder.residual_operations.append(op) - assert isinstance(a_value, LLVirtualPtr) - a_value.containerobj.buildcontent(builder, v_subptr) + assert isinstance(a_value, LLVirtualStruct) + a_value.buildcontent(builder, v_subptr) else: v_value = a_value.forcevarorconst(builder) op = SpaceOperation('setfield', [v_target, @@ -206,8 +218,8 @@ a_value = self.getfield(name) T = getattr(self.T, name) if isinstance(T, lltype.ContainerType): - assert isinstance(a_value, LLVirtualPtr) - for obj, fld in a_value.containerobj.rec_fields(): + assert isinstance(a_value, LLVirtualStruct) + for obj, fld in a_value.rec_fields(): yield obj, fld else: yield self, name @@ -223,6 +235,8 @@ return result def match(self, other, memo): + if not isinstance(other, LLVirtualStruct): + return False if (False, self) in memo: return other is memo[False, self] if (True, other) in memo: @@ -238,42 +252,6 @@ else: return True - -class LLVirtualPtr(LLAbstractValue): - - def __init__(self, containerobj): - self.containerobj = containerobj # a VirtualStruct - - def getconcretetype(self): - return lltype.Ptr(self.containerobj.T) - - def forcevarorconst(self, builder): - v_result = self.containerobj.force(builder) - self.__class__ = LLRuntimeValue - self.__dict__ = {'copy_v': v_result} - return v_result - - def getruntimevars(self, memo): - return self.containerobj.getruntimevars(memo) - - def maybe_get_constant(self): - return None - - def with_fresh_variables(self, memo): - if self in memo: - return memo[self] - else: - result = LLVirtualPtr(None) - memo[self] = result - result.containerobj = self.containerobj.copy(memo) - return result - - def match(self, other, memo): - if isinstance(other, LLVirtualPtr): - return self.containerobj.match(other.containerobj, memo) - else: - return False - # ____________________________________________________________ class 
BlockState(object): @@ -653,10 +631,10 @@ return a_result def op_getfield(self, op, a_ptr, a_attrname): - if isinstance(a_ptr, LLVirtualPtr): + if isinstance(a_ptr, LLVirtualStruct): c_attrname = a_attrname.maybe_get_constant() assert c_attrname is not None - return a_ptr.containerobj.getfield(c_attrname.value) + return a_ptr.getfield(c_attrname.value) constant_op = None T = a_ptr.getconcretetype().TO if T._hints.get('immutable', False): @@ -664,11 +642,11 @@ return self.residualize(op, [a_ptr, a_attrname], constant_op) def op_getsubstruct(self, op, a_ptr, a_attrname): - if isinstance(a_ptr, LLVirtualPtr): + if isinstance(a_ptr, LLVirtualStruct): c_attrname = a_attrname.maybe_get_constant() assert c_attrname is not None - # this should return a new LLVirtualPtr - return a_ptr.containerobj.getfield(c_attrname.value) + # this should return new LLVirtualStruct as well + return a_ptr.getfield(c_attrname.value) return self.residualize(op, [a_ptr, a_attrname], getattr) def op_getarraysize(self, op, a_ptr): @@ -684,17 +662,16 @@ def op_malloc(self, op, a_T): c_T = a_T.maybe_get_constant() assert c_T is not None - S = VirtualStruct(c_T.value) - return LLVirtualPtr(S) + return LLVirtualStruct(c_T.value) def op_malloc_varsize(self, op, a_T, a_size): return self.residualize(op, [a_T, a_size]) def op_setfield(self, op, a_ptr, a_attrname, a_value): - if isinstance(a_ptr, LLVirtualPtr): + if isinstance(a_ptr, LLVirtualStruct): c_attrname = a_attrname.maybe_get_constant() assert c_attrname is not None - a_ptr.containerobj.setfield(c_attrname.value, a_value) + a_ptr.setfield(c_attrname.value, a_value) return ll_no_return_value return self.residualize(op, [a_ptr, a_attrname, a_value]) @@ -707,7 +684,7 @@ return self.residualize(op, [a_ptr], constant_op) def op_keepalive(self, op, a_ptr): - if isinstance(a_ptr, LLVirtualPtr): + if isinstance(a_ptr, LLVirtualStruct): for v in a_ptr.getruntimevars({}): if isinstance(v, Variable) and not v.concretetype._is_atomic(): op = 
SpaceOperation('keepalive', [v], newvar(lltype.Void)) From mwh at codespeak.net Sun Dec 11 15:43:12 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Sun, 11 Dec 2005 15:43:12 +0100 (CET) Subject: [pypy-svn] r21044 - pypy/dist/pypy/jit/test Message-ID: <20051211144312.9EE1527DDB@code1.codespeak.net> Author: mwh Date: Sun Dec 11 15:43:11 2005 New Revision: 21044 Modified: pypy/dist/pypy/jit/test/test_jit_tl.py Log: only build the llgraph of the interpreter once and reduce inlining threshold. speeds execution of the test by a factor of about 10. Modified: pypy/dist/pypy/jit/test/test_jit_tl.py ============================================================================== --- pypy/dist/pypy/jit/test/test_jit_tl.py (original) +++ pypy/dist/pypy/jit/test/test_jit_tl.py Sun Dec 11 15:43:11 2005 @@ -8,27 +8,30 @@ from pypy.rpython.llinterp import LLInterpreter from pypy.translator.backendopt import inline -py.test.skip("in-progress") +#py.test.skip("in-progress") def entry_point(code, pc): # indirection needed, because the hints are not about *all* calls to # interp() return tl.interp(code, pc) -def jit_tl(code): +def setup_module(mod): t = TranslationContext() t.buildannotator().build_types(entry_point, [str, int]) rtyper = t.buildrtyper() rtyper.specialize() - inline.auto_inlining(t, 0.5) - graph1 = t.graphs[0] + inline.auto_inlining(t, 0.3) + + mod.graph1 = t.graphs[0] + mod.llinterp = LLInterpreter(rtyper) + +def jit_tl(code): interp = LLAbstractInterp() hints = {graph1.getargs()[0]: string_repr.convert_const(code), graph1.getargs()[1]: 0} graph2 = interp.eval(graph1, hints) - llinterp = LLInterpreter(rtyper) result1 = llinterp.eval_graph(graph1, [string_repr.convert_const(code), 0]) result2 = llinterp.eval_graph(graph2, []) From arigo at codespeak.net Sun Dec 11 16:06:18 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 11 Dec 2005 16:06:18 +0100 (CET) Subject: [pypy-svn] r21045 - in pypy/dist/pypy/jit: . 
test Message-ID: <20051211150618.69DE627DD6@code1.codespeak.net> Author: arigo Date: Sun Dec 11 16:06:16 2005 New Revision: 21045 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_llabstractinterp.py Log: Cast_pointer support and test. Bug fix. Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Sun Dec 11 16:06:16 2005 @@ -169,7 +169,7 @@ def forcevarorconst(self, builder): v_result = newvar(lltype.Ptr(self.T)) if self.parent is not None: - v_parent = self.parent.force(builder) + v_parent = self.parent.forcevarorconst(builder) op = SpaceOperation('getsubstruct', [v_parent, const(self.parentindex, lltype.Void)], @@ -679,6 +679,17 @@ return self.residualize(op, [a_ptr, a_index, a_value]) def op_cast_pointer(self, op, a_ptr): + if isinstance(a_ptr, LLVirtualStruct): + down_or_up = lltype.castable(op.result.concretetype, + a_ptr.getconcretetype()) + a = a_ptr + if down_or_up >= 0: + for n in range(down_or_up): + a = a.getfield(a.T._names[0]) + else: + for n in range(-down_or_up): + a = a.parent + return a def constant_op(ptr): return lltype.cast_pointer(op.result.concretetype, ptr) return self.residualize(op, [a_ptr], constant_op) Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/test/test_llabstractinterp.py (original) +++ pypy/dist/pypy/jit/test/test_llabstractinterp.py Sun Dec 11 16:06:16 2005 @@ -272,3 +272,15 @@ ll_do_nothing(s) return s.n * t.n graph2, insns = abstrinterp(ll_function, [1, 0], []) + +def test_cast_pointer(): + S = lltype.GcStruct('S', ('n1', lltype.Signed), ('n2', lltype.Signed)) + PS = lltype.Ptr(S) + T = lltype.GcStruct('T', ('s', S), ('n', lltype.Float)) + def ll_function(): + t = lltype.malloc(T) + s = lltype.cast_pointer(PS, t) + 
t.s.n1 = 12 + return s.n1 + graph2, insns = abstrinterp(ll_function, [], []) + assert insns == {} From rxe at codespeak.net Sun Dec 11 16:07:19 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Sun, 11 Dec 2005 16:07:19 +0100 (CET) Subject: [pypy-svn] r21046 - pypy/dist/pypy/translator/c/test Message-ID: <20051211150719.ED48C27DD9@code1.codespeak.net> Author: rxe Date: Sun Dec 11 16:07:19 2005 New Revision: 21046 Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py Log: Simplified the API - as this is rpython, not python. Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_tasklets.py (original) +++ pypy/dist/pypy/translator/c/test/test_tasklets.py Sun Dec 11 16:07:19 2005 @@ -60,13 +60,13 @@ self.alive = False def start(self): + self.resumable = self._start() + + def _start(self): self.caller = yield_current_frame_to_caller() - self.fn(self.name) + self.fn() return self.caller - def set_resumable(self, resumable): - self.resumable = resumable - def suspend(self): # we suspend ourself self.caller = self.caller.switch() @@ -75,17 +75,25 @@ # the caller resumes me self.resumable = self.resumable.switch() self.alive = self.resumable is not None - + class Tasklet(Resumable): - def __init__(self, name, fn): + def __init__(self, fn): Resumable.__init__(self, fn) - self.name = name self.blocked = 0 # propogates round suspend-resume to tell scheduler in run() - # XXX too late to think this thru + # XXX this should probably be done by setting a value in the + # scheduler?? 
self.remove = False + def start(self): + Resumable.start(self) + scheduler.add_tasklet(self) + + def run(self): + Resumable.start(self) + scheduler.run_immediately(self) + def suspend_and_remove(self, remove): self.remove = remove self.suspend() @@ -198,15 +206,6 @@ # ____________________________________________________________ scheduler = Scheduler() -def start_tasklet(tasklet): - res = tasklet.start() - tasklet.set_resumable(res) - scheduler.add_tasklet(tasklet) - -def start_tasklet_now(tasklet): - res = tasklet.start() - tasklet.set_resumable(res) - scheduler.run_immediately(tasklet) def schedule(): scheduler.schedule() @@ -222,16 +221,19 @@ # ____________________________________________________________ +#XXX start_tasklet +#XXX start_tasklet_now + def test_simple(): - def simple(name): + def simple(): for ii in range(5): globals.count += 1 schedule() def f(): for ii in range(loops): - start_tasklet(Tasklet("T%s" % ii, simple)) + Tasklet(simple).start() run() return globals.count == loops * 5 @@ -240,18 +242,18 @@ def test_multiple_simple(): - def simple(name): + def simple(): for ii in range(5): globals.count += 1 schedule() - def simple2(name): + def simple2(): for ii in range(5): globals.count += 1 schedule() globals.count += 1 - def simple3(name): + def simple3(): schedule() for ii in range(10): globals.count += 1 @@ -261,9 +263,9 @@ def f(): for ii in range(loops): - start_tasklet(Tasklet("T1%s" % ii, simple)) - start_tasklet(Tasklet("T2%s" % ii, simple2)) - start_tasklet(Tasklet("T3%s" % ii, simple3)) + Tasklet(simple).start() + Tasklet(simple2).start() + Tasklet(simple3).start() run() return globals.count == loops * 25 @@ -272,7 +274,7 @@ def test_schedule_remove(): - def simple(name): + def simple(): for ii in range(20): if ii < 10: schedule() @@ -282,10 +284,10 @@ def f(): for ii in range(loops): - start_tasklet(Tasklet("T%s" % ii, simple)) + Tasklet(simple).start() run() for ii in range(loops): - start_tasklet(Tasklet("T%s" % ii, simple)) + 
Tasklet(simple).start() run() return globals.count == loops * 10 * 2 @@ -295,26 +297,26 @@ def test_run_immediately(): globals.intermediate = 0 globals.count = 0 - def simple(name): + def simple(): for ii in range(20): globals.count += 1 schedule() - def run_immediately(name): + def run_immediately(): globals.intermediate = globals.count schedule() - def simple2(name): + def simple2(): for ii in range(20): globals.count += 1 if ii == 10: - start_tasklet_now(Tasklet("intermediate", run_immediately)) + Tasklet(run_immediately).run() schedule() def f(): - start_tasklet(Tasklet("simple2", simple2)) + Tasklet(simple2).start() for ii in range(loops): - start_tasklet(Tasklet("T%s" % ii, simple)) + Tasklet(simple).start() run() total_expected = (loops + 1) * 20 return (globals.intermediate == total_expected / 2 + 1 and @@ -326,18 +328,18 @@ def test_channel1(): ch = Channel() - def f1(name): + def f1(): for ii in range(5): ch.send(ii) - def f2(name): - #while True: + def f2(): + #while True: XXX Doesnt annotate for ii in range(6): globals.count += ch.receive() def f(): - start_tasklet(Tasklet("f2", f2)) - start_tasklet(Tasklet("f1", f1)) + Tasklet(f2).start() + Tasklet(f1).start() run() return (globals.count == 10) @@ -347,51 +349,49 @@ def test_channel2(): ch = Channel() - def f1(name): + def f1(): for ii in range(5): ch.send(ii) - def f2(name): - #while True: + def f2(): + #while True:XXX Doesnt annotate for ii in range(6): res = ch.receive() globals.count += res def f(): - start_tasklet(Tasklet("f1", f1)) - start_tasklet(Tasklet("f2", f2)) + Tasklet(f1).start() + Tasklet(f2).start() run() return (globals.count == 10) res = wrap_stackless_function(f) assert res == '1' - def test_channel3(): ch = Channel() - def f1(name): + def f1(): for ii in range(5): ch.send(ii) - def f2(name): - #while True: + def f2(): + #while True: XXX Doesnt annotate for ii in range(16): res = ch.receive() globals.count += res def f(): - start_tasklet(Tasklet("f1x", f1)) - 
start_tasklet(Tasklet("f1xx", f1)) - start_tasklet(Tasklet("f1xxx", f1)) - start_tasklet(Tasklet("f2", f2)) + Tasklet(f1).start() + Tasklet(f1).start() + Tasklet(f1).start() + Tasklet(f2).start() run() return (globals.count == 30) res = wrap_stackless_function(f) assert res == '1' - def test_channel4(): """ test with something other than int """ @@ -417,19 +417,19 @@ ch2 = Channel() ch3 = Channel() - def f1(name): + def f1(): for ii in range(5): ch1.send(IntData(ii)) - def f2(name): + def f2(): for ii in range(5): ch2.send(StringData("asda")) - def f3(name): + def f3(): for ii in range(5): ch3.send(StringData("asda")) - def fr(name): + def fr(): #while True: for ii in range(11): data3 = ch3.receive() @@ -440,14 +440,13 @@ globals.count += 1 def f(): - start_tasklet(Tasklet("fr", fr)) - start_tasklet(Tasklet("f1", f1)) - start_tasklet(Tasklet("f2", f2)) - start_tasklet(Tasklet("f3", f3)) + Tasklet(fr).start() + Tasklet(f1).start() + Tasklet(f2).start() + Tasklet(f3).start() run() - debug("asd %s" % globals.count) return (globals.count == 15) res = wrap_stackless_function(f) assert res == '1' - + From arigo at codespeak.net Sun Dec 11 16:18:29 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 11 Dec 2005 16:18:29 +0100 (CET) Subject: [pypy-svn] r21047 - in pypy/dist/pypy/jit: . test Message-ID: <20051211151829.C479B27DD0@code1.codespeak.net> Author: arigo Date: Sun Dec 11 16:18:27 2005 New Revision: 21047 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_llabstractinterp.py Log: Don't residualize called graphs if they are called without any LLConcreteValue. Write a call to the original graph instead. 
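The decision described in the log above — only specialize a called graph when at least one argument is a compile-time constant, and otherwise emit a plain call to the original graph — can be sketched roughly as follows. This is a hedged illustration only, not the actual PyPy llabstractinterp code; the `Concrete`/`Runtime` classes and `handle_call` are invented stand-ins for `LLConcreteValue`, the runtime values, and the interpreter's call handling.

```python
# Hedged sketch of the call-handling policy from the log message above.
# A called graph is worth residualizing into a specialized copy only if
# at least one argument carries a value known at specialization time;
# with purely runtime arguments, specialization would just duplicate the
# graph, so a direct call to the original graph is emitted instead.
# All names here are made-up stand-ins, not PyPy's real API.

class Concrete:
    """A value known at specialization time."""
    def __init__(self, value):
        self.value = value

class Runtime:
    """A value only known at run time."""

def handle_call(original_graph, args):
    any_concrete = any(isinstance(a, Concrete) for a in args)
    if not any_concrete:
        # Nothing to specialize on: call the original, unmodified graph.
        return ('direct_call', original_graph, args)
    # Otherwise schedule a specialized copy of the graph for these
    # concrete arguments (the copying itself is elided in this sketch).
    return ('direct_call', ('specialized', original_graph), args)
```

Under this policy a helper like `ll_uninteresting()` in the new test, which only ever receives runtime values, never shows up as a specialized copy in the residual graph.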
Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Sun Dec 11 16:18:27 2005 @@ -590,44 +590,58 @@ def op_same_as(self, op, a): return a - def op_direct_call(self, op, a_func, *args_a): - a_result = LLRuntimeValue(op.result) + def op_direct_call(self, op, *args_a): + a_result = self.handle_call(op, *args_a) + if a_result is None: + a_result = self.residualize(op, args_a) + return a_result + + def handle_call(self, op, a_func, *args_a): v_func = a_func.maybe_get_constant() - if v_func is not None: - fnobj = v_func.value._obj - if (hasattr(fnobj, 'graph') and - not getattr(fnobj._callable, 'suggested_primitive', False)): - origgraph = fnobj.graph - - # for now, we need to force all arguments - for a in args_a: - a.forcevarorconst(self) - - graphstate, args_a = self.interp.schedule_graph( - args_a, origgraph) - #print 'SCHEDULE_GRAPH', args_a, '==>', graphstate.copygraph.name - if graphstate.state != "during": - print 'ENTERING', graphstate.copygraph.name, args_a - graphstate.complete() - if (graphstate.a_return is not None and - graphstate.a_return.maybe_get_constant() is not None): - a_result = graphstate.a_return - print 'LEAVING', graphstate.copygraph.name, graphstate.a_return - - origfptr = v_func.value - ARGS = [] - new_args_a = [] - for a in args_a: - if not isinstance(a, LLConcreteValue): - ARGS.append(a.getconcretetype()) - new_args_a.append(a) - args_a = new_args_a - TYPE = lltype.FuncType( - ARGS, lltype.typeOf(origfptr).TO.RESULT) - fptr = lltype.functionptr( - TYPE, graphstate.copygraph.name, graph=graphstate.copygraph) - a_func = LLRuntimeValue(const(fptr)) - self.residual("direct_call", [a_func] + list(args_a), a_result) + if v_func is None: + return None + fnobj = v_func.value._obj + if not hasattr(fnobj, 'graph'): + return None + if getattr(fnobj._callable, 'suggested_primitive', 
False): + return None + + origgraph = fnobj.graph + + # for now, we need to force all arguments + any_concrete = False + for a in args_a: + a.forcevarorconst(self) + any_concrete = any_concrete or isinstance(a,LLConcreteValue) + if not any_concrete: + return None + + a_result = LLRuntimeValue(op.result) + graphstate, args_a = self.interp.schedule_graph( + args_a, origgraph) + #print 'SCHEDULE_GRAPH', args_a, '==>', graphstate.copygraph.name + if graphstate.state != "during": + print 'ENTERING', graphstate.copygraph.name, args_a + graphstate.complete() + if (graphstate.a_return is not None and + graphstate.a_return.maybe_get_constant() is not None): + a_result = graphstate.a_return + print 'LEAVING', graphstate.copygraph.name, graphstate.a_return + + origfptr = v_func.value + ARGS = [] + new_args_a = [] + for a in args_a: + if not isinstance(a, LLConcreteValue): + ARGS.append(a.getconcretetype()) + new_args_a.append(a) + args_a = new_args_a + TYPE = lltype.FuncType( + ARGS, lltype.typeOf(origfptr).TO.RESULT) + fptr = lltype.functionptr( + TYPE, graphstate.copygraph.name, graph=graphstate.copygraph) + a_func = LLRuntimeValue(const(fptr)) + self.residual("direct_call", [a_func] + args_a, a_result) return a_result def op_getfield(self, op, a_ptr, a_attrname): Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/test/test_llabstractinterp.py (original) +++ pypy/dist/pypy/jit/test/test_llabstractinterp.py Sun Dec 11 16:18:27 2005 @@ -255,7 +255,7 @@ # XXX fragile test: at the moment, the two branches of the 'if' are not # being merged at all because 's' was forced in one case only. 
assert insns == {'direct_call': 1, 'int_is_true': 1, 'int_add': 2, - 'malloc': 1, 'setfield': 2, 'getfield': 1} + 'malloc': 1, 'setfield': 1, 'getfield': 1} def test_unique_virtualptrs(): S = lltype.GcStruct('S', ('n', lltype.Signed)) @@ -284,3 +284,14 @@ return s.n1 graph2, insns = abstrinterp(ll_function, [], []) assert insns == {} + +def test_residual_direct_call(): + def ll_uninteresting(x, y): + return x * y + def ll_function(a, b): + return ll_uninteresting(a+b, b+1) + graph2, insns = abstrinterp(ll_function, [2, 5], [0]) + # ll_uninteresting() should not be residualized because it is only passed + # non-concrete values, so 'insns' should only see the residualized + # ll_function(). + assert insns == {'direct_call': 1, 'int_add': 2} From sanxiyn at codespeak.net Sun Dec 11 16:23:39 2005 From: sanxiyn at codespeak.net (sanxiyn at codespeak.net) Date: Sun, 11 Dec 2005 16:23:39 +0100 (CET) Subject: [pypy-svn] r21048 - in pypy/dist/pypy: annotation rpython rpython/test Message-ID: <20051211152339.ABC0027DD0@code1.codespeak.net> Author: sanxiyn Date: Sun Dec 11 16:23:34 2005 New Revision: 21048 Modified: pypy/dist/pypy/annotation/unaryop.py pypy/dist/pypy/rpython/rstr.py pypy/dist/pypy/rpython/test/test_rstr.py Log: str.{strip,lstrip,rstrip} on RPython Modified: pypy/dist/pypy/annotation/unaryop.py ============================================================================== --- pypy/dist/pypy/annotation/unaryop.py (original) +++ pypy/dist/pypy/annotation/unaryop.py Sun Dec 11 16:23:34 2005 @@ -368,6 +368,15 @@ def method_rfind(str, frag, start=None, end=None): return SomeInteger() + def method_strip(str, chr): + return SomeString() + + def method_lstrip(str, chr): + return SomeString() + + def method_rstrip(str, chr): + return SomeString() + def method_join(str, s_list): getbookkeeper().count("str_join", str) s_item = s_list.listdef.read_item() Modified: pypy/dist/pypy/rpython/rstr.py ============================================================================== 
--- pypy/dist/pypy/rpython/rstr.py (original) +++ pypy/dist/pypy/rpython/rstr.py Sun Dec 11 16:23:34 2005 @@ -138,6 +138,19 @@ def rtype_method_rfind(self, hop): return self.rtype_method_find(hop, reverse=True) + def rtype_method_strip(_, hop, left=True, right=True): + v_str = hop.inputarg(string_repr, arg=0) + v_char = hop.inputarg(char_repr, arg=1) + v_left = hop.inputconst(Bool, left) + v_right = hop.inputconst(Bool, right) + return hop.gendirectcall(ll_strip, v_str, v_char, v_left, v_right) + + def rtype_method_lstrip(self, hop): + return self.rtype_method_strip(hop, left=True, right=False) + + def rtype_method_rstrip(self, hop): + return self.rtype_method_strip(hop, left=False, right=True) + def rtype_method_upper(_, hop): v_str, = hop.inputargs(string_repr) hop.exception_cannot_occur() @@ -822,6 +835,28 @@ emptystr = string_repr.convert_const("") +def ll_strip(s, ch, left, right): + s_len = len(s.chars) + if s_len == 0: + return emptystr + lpos = 0 + rpos = s_len - 1 + if left: + while lpos < rpos and s.chars[lpos] == ch: + lpos += 1 + if right: + while lpos < rpos and s.chars[rpos] == ch: + rpos -= 1 + r_len = rpos - lpos + 1 + result = malloc(STR, r_len) + i = 0 + j = lpos + while i < r_len: + result.chars[i] = s.chars[j] + i += 1 + j += 1 + return result + def ll_upper(s): s_chars = s.chars s_len = len(s_chars) Modified: pypy/dist/pypy/rpython/test/test_rstr.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rstr.py (original) +++ pypy/dist/pypy/rpython/test/test_rstr.py Sun Dec 11 16:23:34 2005 @@ -262,6 +262,20 @@ res = interpret(fn, [ch]) assert res == fn(ch) +def test_strip(): + def both(): + return '!ab!'.strip('!') + def left(): + return '!ab!'.lstrip('!') + def right(): + return '!ab!'.rstrip('!') + res = interpret(both, []) + assert ''.join(res.chars) == 'ab' + res = interpret(left, []) + assert ''.join(res.chars) == 'ab!' 
+ res = interpret(right, []) + assert ''.join(res.chars) == '!ab' + def test_upper(): strings = ['', ' ', 'upper', 'UpPeR', ',uppEr,'] for i in range(256): strings.append(chr(i)) From cfbolz at codespeak.net Sun Dec 11 16:33:20 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Sun, 11 Dec 2005 16:33:20 +0100 (CET) Subject: [pypy-svn] r21049 - pypy/dist/pypy/translator/c Message-ID: <20051211153320.C08ED27DC5@code1.codespeak.net> Author: cfbolz Date: Sun Dec 11 16:33:19 2005 New Revision: 21049 Modified: pypy/dist/pypy/translator/c/gc.py Log: argh! we were registering _a_lot_ of finalizers thus making my last checkin useless. Modified: pypy/dist/pypy/translator/c/gc.py ============================================================================== --- pypy/dist/pypy/translator/c/gc.py (original) +++ pypy/dist/pypy/translator/c/gc.py Sun Dec 11 16:33:19 2005 @@ -349,16 +349,18 @@ def struct_setup(self, structdefnode, rtti): if isinstance(structdefnode.LLTYPE, GcStruct): has_del = rtti is not None and hasattr(rtti._obj, 'destructor_funcptr') + needs_deallocator = bool(list(self.deallocator_lines(structdefnode, ''))) gcinfo = structdefnode.gcinfo = BoehmInfo() - gcinfo.finalizer = self.db.namespace.uniquename('finalize_'+structdefnode.barename) - if list(self.deallocator_lines(structdefnode, '')): - if has_del: - raise Exception("you cannot use __del__ with PyObjects and Boehm") - if has_del: - destrptr = rtti._obj.destructor_funcptr - gcinfo.destructor = self.db.get(destrptr) - T = typeOf(destrptr).TO.ARGS[0] - gcinfo.destructor_argtype = self.db.gettype(T) + if needs_deallocator and has_del: + raise Exception("you cannot use __del__ with PyObjects and Boehm") + if needs_deallocator or has_del: + name = 'finalize_'+structdefnode.barename + gcinfo.finalizer = self.db.namespace.uniquename(name) + if has_del: + destrptr = rtti._obj.destructor_funcptr + gcinfo.destructor = self.db.get(destrptr) + T = typeOf(destrptr).TO.ARGS[0] + gcinfo.destructor_argtype 
= self.db.gettype(T) struct_after_definition = common_after_definition def struct_implementationcode(self, structdefnode): From ale at codespeak.net Sun Dec 11 17:29:46 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Sun, 11 Dec 2005 17:29:46 +0100 (CET) Subject: [pypy-svn] r21050 - pypy/dist/pypy/translator/c/test Message-ID: <20051211162946.5522527DD0@code1.codespeak.net> Author: ale Date: Sun Dec 11 17:29:45 2005 New Revision: 21050 Modified: pypy/dist/pypy/translator/c/test/test_ext__socket.py Log: A test that was skipped (due to segfault) passes on MAC OS X. Small changes to the connect tests Modified: pypy/dist/pypy/translator/c/test/test_ext__socket.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_ext__socket.py (original) +++ pypy/dist/pypy/translator/c/test/test_ext__socket.py Sun Dec 11 17:29:45 2005 @@ -5,8 +5,6 @@ from pypy.translator.translator import Translator from pypy.module._socket.test import echoserver -HOST = "localhost" -PORT = 8037 def setup_module(mod): import pypy.module._socket.rpython.exttable # for declare()/declaretype() @@ -48,7 +46,7 @@ assert res == _socket.gethostbyname("localhost") def test_getaddrinfo(): - py.test.skip("segfaulting on linux right now") +# py.test.skip("segfaulting on linux right now") import pypy.module._socket.rpython.exttable # for declare()/declaretype() from pypy.module._socket.rpython import rsocket def does_stuff(host, port): @@ -58,8 +56,13 @@ info = addr.nextinfo() if info[0] == 0: break - result.append("(%d, %d, %d, '%s', ('%s', %d))" % + if info[0] == _socket.AF_INET: + result.append("(%d, %d, %d, '%s', ('%s', %d))" % (info[0],info[1],info[2],info[3],info[4],info[5])) + elif info[0] == _socket.AF_INET6: + result.append("(%d, %d, %d, '%s', ('%s', %d, %d, %d))" % + (info[0],info[1],info[2],info[3],info[4],info[5],info[6],info[7])) + addr.free() return str(result) f1 = compile(does_stuff, [str, str]) @@ -113,6 +116,8 @@ 
class TestConnectedIPv4: + HOST = "localhost" + PORT = 8037 family = _socket.AF_INET def setup_class(cls): @@ -121,7 +126,7 @@ def teardown_class(cls): import telnetlib - tn = telnetlib.Telnet(HOST, PORT) + tn = telnetlib.Telnet(cls.HOST, cls.PORT) tn.write("shutdown\n") tn.close() @@ -130,15 +135,16 @@ from pypy.module._socket.rpython import rsocket def does_stuff(): fd = rsocket.newsocket(self.family, _socket.SOCK_STREAM, 0) - rsocket.connect(fd, (HOST, PORT, 0, 0), self.family) + rsocket.connect(fd, (self.HOST, self.PORT, 0, 0), self.family) sockname = rsocket.getpeername(fd) os.close(fd) return sockname[1] - f1 = compile(does_stuff, []) + f1 = compile(does_stuff, [], True) res = f1() - assert res == PORT + assert res == self.PORT class DONOT_TestConnectedIPv6(TestConnectedIPv4): + HOST = "0000:0000:0000:0000:0000:0000:0000:0001" disabled = not _socket.has_ipv6 family = _socket.AF_INET6 From arigo at codespeak.net Sun Dec 11 18:05:59 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 11 Dec 2005 18:05:59 +0100 (CET) Subject: [pypy-svn] r21052 - in pypy/dist/pypy/jit: . test Message-ID: <20051211170559.983C027DD0@code1.codespeak.net> Author: arigo Date: Sun Dec 11 18:05:57 2005 New Revision: 21052 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_llabstractinterp.py Log: Virtual arrays. Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Sun Dec 11 18:05:57 2005 @@ -101,15 +101,16 @@ ll_no_return_value = LLRuntimeValue(const(None, lltype.Void)) -class LLVirtualStruct(LLAbstractValue): - """Stands for a pointer to a malloc'ed structure; the structure is not - malloc'ed so far, but we record which fields have which value. 
- """ +class LLVirtualContainer(LLAbstractValue): + parent = None parentindex = None - def __init__(self, STRUCT): - self.T = STRUCT + def __init__(self, T, a_length=None): + assert (a_length is not None) == T._is_varsize() + self.T = T + self.a_length = a_length + self.names = self.getnames() self.fields = {} def getconcretetype(self): @@ -132,13 +133,17 @@ try: return self.fields[name] except KeyError: - T = getattr(self.T, name) + T = self.fieldtype(name) if isinstance(T, lltype.ContainerType): # reading a substructure - a_substr = LLVirtualStruct(T) - a_substr.setparent(self, name) - self.fields[name] = a_substr - return a_substr + if T._is_varsize(): + a_length = self.a_length + else: + a_length = None + a_sub = virtualcontainer(T, a_length) + a_sub.setparent(self, name) + self.fields[name] = a_sub + return a_sub else: # no value ever set, return a default return LLRuntimeValue(const(T._defl())) @@ -150,7 +155,7 @@ if self in memo: return memo[self] # already seen else: - result = LLVirtualStruct(self.T) + result = virtualcontainer(self.T, self.a_length) memo[self] = result if self.parent is not None: # build the parent first -- note that @@ -161,7 +166,7 @@ # cannot keep lazy fields around: the copy is expected to have # only variables, not constants - for name in self.T._names: + for name in self.names: a = self.getfield(name).with_fresh_variables(memo) result.fields[name] = a return result @@ -177,7 +182,14 @@ print 'force:', op builder.residual_operations.append(op) else: - op = SpaceOperation('malloc', [const(self.T, lltype.Void)], v_result) + if self.T._is_varsize(): + op = SpaceOperation('malloc_varsize', [ + const(self.T, lltype.Void), + self.a_length.forcevarorconst(builder)], + v_result) + else: + op = SpaceOperation('malloc', [const(self.T, lltype.Void)], + v_result) print 'force:', op builder.residual_operations.append(op) self.buildcontent(builder, v_result) @@ -187,12 +199,12 @@ def buildcontent(self, builder, v_target): # initialize all fields - 
for name in self.T._names: + for name in self.names: if name in self.fields: a_value = self.fields[name] - T = getattr(self.T, name) + T = self.fieldtype(name) if isinstance(T, lltype.ContainerType): - # initialize the substructure + # initialize the substructure/subarray v_subptr = newvar(lltype.Ptr(T)) op = SpaceOperation('getsubstruct', [v_target, const(name, lltype.Void)], @@ -203,39 +215,22 @@ a_value.buildcontent(builder, v_subptr) else: v_value = a_value.forcevarorconst(builder) - op = SpaceOperation('setfield', [v_target, - const(name, lltype.Void), - v_value], - newvar(lltype.Void)) + op = self.setop(v_target, name, v_value) print 'force:', op builder.residual_operations.append(op) - def rec_fields(self): - # -- not used at the moment -- - # enumerate all the fields of this structure and each of - # its substructures - for name in self.T._names: - a_value = self.getfield(name) - T = getattr(self.T, name) - if isinstance(T, lltype.ContainerType): - assert isinstance(a_value, LLVirtualStruct) - for obj, fld in a_value.rec_fields(): - yield obj, fld - else: - yield self, name - def getruntimevars(self, memo): result = [] if self not in memo: memo[self] = True if self.parent is not None: result.extend(self.parent.getruntimevars(memo)) - for name in self.T._names: + for name in self.names: result.extend(self.getfield(name).getruntimevars(memo)) return result def match(self, other, memo): - if not isinstance(other, LLVirtualStruct): + if self.__class__ is not other.__class__: return False if (False, self) in memo: return other is memo[False, self] @@ -244,7 +239,10 @@ memo[False, self] = other memo[True, other] = self assert self.T == other.T - for name in self.T._names: + if self.a_length is not None: + if not self.a_length.match(other.a_length, memo): + return False + for name in self.names: a1 = self.getfield(name) a2 = other.getfield(name) if not a1.match(a2, memo): @@ -252,6 +250,51 @@ else: return True + +class LLVirtualStruct(LLVirtualContainer): + 
"""Stands for a pointer to a malloc'ed structure; the structure is not + malloc'ed so far, but we record which fields have which value. + """ + def getnames(self): + return self.T._names + + def fieldtype(self, name): + return getattr(self.T, name) + + def setop(self, v_target, name, v_value): + return SpaceOperation('setfield', [v_target, + const(name, lltype.Void), + v_value], + newvar(lltype.Void)) + +class LLVirtualArray(LLVirtualContainer): + """Stands for a pointer to a malloc'ed array; the array is not + malloc'ed so far, but we record which fields have which value -- here + a field is an item, indexed by an integer instead of a string field name. + """ + def getnames(self): + c = self.a_length.maybe_get_constant() + assert c is not None + return range(c.value) + + def fieldtype(self, index): + return self.T.OF + + def setop(self, v_target, name, v_value): + return SpaceOperation('setarrayitem', [v_target, + const(name, lltype.Signed), + v_value], + newvar(lltype.Void)) + +def virtualcontainer(T, a_length=None): + if isinstance(T, lltype.Struct): + cls = LLVirtualStruct + elif isinstance(T, lltype.Array): + cls = LLVirtualArray + else: + raise TypeError("unsupported container type %r" % (T,)) + return cls(T, a_length) + # ____________________________________________________________ class BlockState(object): @@ -656,17 +699,23 @@ return self.residualize(op, [a_ptr, a_attrname], constant_op) def op_getsubstruct(self, op, a_ptr, a_attrname): - if isinstance(a_ptr, LLVirtualStruct): + if isinstance(a_ptr, LLVirtualContainer): c_attrname = a_attrname.maybe_get_constant() assert c_attrname is not None - # this should return new LLVirtualStruct as well + # this should return new LLVirtualContainer as well return a_ptr.getfield(c_attrname.value) return self.residualize(op, [a_ptr, a_attrname], getattr) def op_getarraysize(self, op, a_ptr): + if isinstance(a_ptr, LLVirtualArray): + return a_ptr.a_length return self.residualize(op, [a_ptr], len) def 
op_getarrayitem(self, op, a_ptr, a_index): + if isinstance(a_ptr, LLVirtualArray): + c_index = a_index.maybe_get_constant() + if c_index is not None: + return a_ptr.getfield(c_index.value) constant_op = None T = a_ptr.getconcretetype().TO if T._hints.get('immutable', False): @@ -679,6 +728,10 @@ return LLVirtualStruct(c_T.value) def op_malloc_varsize(self, op, a_T, a_size): + if a_size.maybe_get_constant() is not None: + c_T = a_T.maybe_get_constant() + assert c_T is not None + return virtualcontainer(c_T.value, a_length=a_size) return self.residualize(op, [a_T, a_size]) def op_setfield(self, op, a_ptr, a_attrname, a_value): @@ -690,6 +743,11 @@ return self.residualize(op, [a_ptr, a_attrname, a_value]) def op_setarrayitem(self, op, a_ptr, a_index, a_value): + if isinstance(a_ptr, LLVirtualArray): + c_index = a_index.maybe_get_constant() + if c_index is not None: + a_ptr.setfield(c_index.value, a_value) + return ll_no_return_value return self.residualize(op, [a_ptr, a_index, a_value]) def op_cast_pointer(self, op, a_ptr): Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/test/test_llabstractinterp.py (original) +++ pypy/dist/pypy/jit/test/test_llabstractinterp.py Sun Dec 11 18:05:57 2005 @@ -295,3 +295,14 @@ # non-concrete values, so 'insns' should only see the residualized # ll_function(). 
assert insns == {'direct_call': 1, 'int_add': 2} + +def test_virtual_array(): + A = lltype.GcArray(lltype.Signed) + def ll_function(k, l): + a = lltype.malloc(A, 3) + a[0] = k + a[1] = 12 + a[2] = l + return (a[0] + a[1]) + a[2] + graph2, insns = abstrinterp(ll_function, [7, 983], [0]) + assert insns == {'int_add': 1} From cfbolz at codespeak.net Sun Dec 11 21:27:15 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Sun, 11 Dec 2005 21:27:15 +0100 (CET) Subject: [pypy-svn] r21055 - pypy/dist/pypy/translator/c/test Message-ID: <20051211202715.1AA8A27DC8@code1.codespeak.net> Author: cfbolz Date: Sun Dec 11 21:27:07 2005 New Revision: 21055 Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py Log: first checkin from a ferry! skip test if boehm is not found. Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_tasklets.py (original) +++ pypy/dist/pypy/translator/c/test/test_tasklets.py Sun Dec 11 21:27:07 2005 @@ -1,3 +1,4 @@ +import py import os from pypy.rpython.memory.lladdress import NULL @@ -6,10 +7,14 @@ # ____________________________________________________________ # For testing +from pypy.translator.tool import cbuild from pypy.translator.c.gc import BoehmGcPolicy gcpolicy = BoehmGcPolicy debug_flag = True +if not cbuild.check_boehm_presence(): + raise py.test.skip("boehm not found") + # count of loops in tests (set lower to speed up) loops = 10 From rxe at codespeak.net Mon Dec 12 08:40:00 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Mon, 12 Dec 2005 08:40:00 +0100 (CET) Subject: [pypy-svn] r21064 - pypy/dist/pypy/translator/c/test Message-ID: <20051212074000.843C327DC5@code1.codespeak.net> Author: rxe Date: Mon Dec 12 08:39:59 2005 New Revision: 21064 Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py Log: Just use default gcpolicy if boehm is not found Modified: 
pypy/dist/pypy/translator/c/test/test_tasklets.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_tasklets.py (original) +++ pypy/dist/pypy/translator/c/test/test_tasklets.py Mon Dec 12 08:39:59 2005 @@ -9,15 +9,15 @@ from pypy.translator.tool import cbuild from pypy.translator.c.gc import BoehmGcPolicy -gcpolicy = BoehmGcPolicy -debug_flag = True -if not cbuild.check_boehm_presence(): - raise py.test.skip("boehm not found") +gcpolicy = None +if cbuild.check_boehm_presence(): + gcpolicy = BoehmGcPolicy # count of loops in tests (set lower to speed up) loops = 10 +debug_flag = False def debug(s): if debug_flag: os.write(2, "%s\n" % s) From lene at codespeak.net Mon Dec 12 09:29:18 2005 From: lene at codespeak.net (lene at codespeak.net) Date: Mon, 12 Dec 2005 09:29:18 +0100 (CET) Subject: [pypy-svn] r21067 - pypy/extradoc/pypy.org Message-ID: <20051212082918.1A5D027DDE@code1.codespeak.net> Author: lene Date: Mon Dec 12 09:29:17 2005 New Revision: 21067 Modified: pypy/extradoc/pypy.org/index.txt Log: right contract number - we have number 004779, hope this is the right place to change it Modified: pypy/extradoc/pypy.org/index.txt ============================================================================== --- pypy/extradoc/pypy.org/index.txt (original) +++ pypy/extradoc/pypy.org/index.txt Mon Dec 12 09:29:17 2005 @@ -1,11 +1,11 @@ -PyPy EU project title (contract number: 004479) +PyPy EU project title (contract number: 004779) ------------------------------------------------ Researching a higly flexible and modular language platform and implementing it by leveraging the Open Source Python Language and Community -PyPy EU project description (004479) +PyPy EU project description (004779) -------------------------------------- The PyPy project have been an ongoing Open Source Python language From hpk at codespeak.net Mon Dec 12 10:33:01 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: 
Mon, 12 Dec 2005 10:33:01 +0100 (CET) Subject: [pypy-svn] r21068 - pypy/extradoc/sprintinfo/gothenburg-2005 Message-ID: <20051212093301.4124727DC8@code1.codespeak.net> Author: hpk Date: Mon Dec 12 10:32:55 2005 New Revision: 21068 Modified: pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt Log: fix ReST problem Modified: pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt ============================================================================== --- pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt (original) +++ pypy/extradoc/sprintinfo/gothenburg-2005/planning.txt Mon Dec 12 10:32:55 2005 @@ -53,7 +53,8 @@ - implement __del__ support in the RTyper and backends (DONE, performance killer! 10 times slower! argh!!!), Samuele, Carl Friedrich -(- possibly implement weakref (at least with Boehm)) + +- (possibly implement weakref (at least with Boehm)) - integrate GC construction framework in the backends: Eric, Carl Friedrich From arigo at codespeak.net Mon Dec 12 13:01:53 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Mon, 12 Dec 2005 13:01:53 +0100 (CET) Subject: [pypy-svn] r21069 - pypy/extradoc/talk/chalmers Message-ID: <20051212120153.2CE8527DC6@code1.codespeak.net> Author: arigo Date: Mon Dec 12 13:01:51 2005 New Revision: 21069 Added: pypy/extradoc/talk/chalmers/ (props changed) pypy/extradoc/talk/chalmers/winter-meeting-2006.txt (contents, props changed) Log: A talk for the (internal) Winter Meeting of the Chalmers Computer Science department, in Gothenburg. Abstract only so far. I plan to submit something very similar to the CCC paper, if neither conference objects to that.
Added: pypy/extradoc/talk/chalmers/winter-meeting-2006.txt ============================================================================== --- (empty file) +++ pypy/extradoc/talk/chalmers/winter-meeting-2006.txt Mon Dec 12 13:01:51 2005 @@ -0,0 +1,27 @@ +Reference: http://www.cs.chalmers.se/wm/ +DEADLINE: 3rd January 2006 (Tuesday) + +Title: PyPy - the new Python implementation on the block + +Talker: Armin Rigo + +Abstract (max 150 words): + + PyPy (http://codespeak.net/pypy/dist/pypy/doc/architecture.html) is + an implementation of the Python (http://www.python.org) programming + language written in Python itself, flexible and easy to experiment + with. We are targetting a large variety of platforms, small and + large, by providing a compiler toolsuite that can produce custom + Python versions. Platform, memory and threading models are aspects + of the translation process - as opposed to encoding low level + details into the language implementation itself. + + The talk will give a quick overview of this toolsuite, which is so + far a static type inferencer and compiler for RPython, a subset of + Python. I will then talk about the current work on dynamic + optimization techniques: implemented as another translation aspect, + they should become robust against language changes. In other words, + the toolsuite will be able to turn an interpreter for any language + or dialect into a just-in-time compiler - more exactly, a + "just-in-time specializer", like Psyco for the Python language + (http://psyco.sf.net). 
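[Editorial note on the abstract above: the "just-in-time specializer" it describes folds away operations whose inputs are fixed at specialization time and emits the rest as residual code. The following is a toy, non-PyPy sketch of that idea -- all names and the op representation are invented for illustration; it mirrors the test_virtual_array test earlier in this archive, where hinting k=7 leaves a single residual int_add.]

```python
# Toy sketch of a specializer (invented names, not PyPy code): ops whose
# inputs are all known at specialization time are folded away; the rest
# survive as residual operations, with known operands substituted in.

def specialize(ops, env):
    """ops: [(dest, arg1, arg2)] meaning dest = arg1 + arg2.
    env maps names to values known at specialization time."""
    residual = []
    for dest, x, y in ops:
        a, b = env.get(x), env.get(y)
        if a is not None and b is not None:
            env[dest] = a + b            # fold: no residual op at all
        else:
            # keep the op, substituting any operand already known
            residual.append((dest, a if a is not None else x,
                                   b if b is not None else y))
    return residual

# k is hinted to 7, l stays a runtime value:
residual = specialize([('t1', 'k', 'c12'),   # t1 = k + 12  -> folded to 19
                       ('t2', 't1', 'l')],   # t2 = t1 + l  -> residual
                      {'k': 7, 'c12': 12})
# -> [('t2', 19, 'l')]: only one add survives to run time
```

The real LLAbstractInterp does the same bookkeeping per flow-graph block with LLConcreteValue/LLRuntimeValue bindings rather than a flat dict.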
From arigo at codespeak.net Mon Dec 12 13:07:54 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Mon, 12 Dec 2005 13:07:54 +0100 (CET) Subject: [pypy-svn] r21070 - pypy/extradoc/talk/chalmers Message-ID: <20051212120754.2FA4F27DC6@code1.codespeak.net> Author: arigo Date: Mon Dec 12 13:07:52 2005 New Revision: 21070 Modified: pypy/extradoc/talk/chalmers/winter-meeting-2006.txt Log: Changed the title, focusing on the particular aspect that I want to talk about. Modified: pypy/extradoc/talk/chalmers/winter-meeting-2006.txt ============================================================================== --- pypy/extradoc/talk/chalmers/winter-meeting-2006.txt (original) +++ pypy/extradoc/talk/chalmers/winter-meeting-2006.txt Mon Dec 12 13:07:52 2005 @@ -1,7 +1,7 @@ Reference: http://www.cs.chalmers.se/wm/ DEADLINE: 3rd January 2006 (Tuesday) -Title: PyPy - the new Python implementation on the block +Title: PyPy: Dynamic optimizations for your favorite language Talker: Armin Rigo @@ -10,7 +10,7 @@ PyPy (http://codespeak.net/pypy/dist/pypy/doc/architecture.html) is an implementation of the Python (http://www.python.org) programming language written in Python itself, flexible and easy to experiment - with. We are targetting a large variety of platforms, small and + with. We are targeting a large variety of platforms, small and large, by providing a compiler toolsuite that can produce custom Python versions. 
Platform, memory and threading models are aspects of the translation process - as opposed to encoding low level From arigo at codespeak.net Mon Dec 12 13:41:17 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Mon, 12 Dec 2005 13:41:17 +0100 (CET) Subject: [pypy-svn] r21072 - pypy/dist/pypy/jit Message-ID: <20051212124117.BE2E527DC6@code1.codespeak.net> Author: arigo Date: Mon Dec 12 13:41:16 2005 New Revision: 21072 Modified: pypy/dist/pypy/jit/llabstractinterp.py Log: Starting to make the BlockState a subclass of LLAbstractValue, in the idea that when inlining a graph the current state should contain a chain of BlockStates corresponding to the caller's frames. Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Mon Dec 12 13:41:16 2005 @@ -297,20 +297,61 @@ # ____________________________________________________________ -class BlockState(object): +class LLBlockState(LLAbstractValue): """Entry state of a block, as a combination of LLAbstractValues for its input arguments.""" - def __init__(self, origblock, args_a): + def __init__(self, args_a, origblock, origposition=0, a_back=None): assert len(args_a) == len(origblock.inputargs) self.args_a = args_a self.origblock = origblock + self.origposition = origposition self.copyblock = None + self.a_back = a_back - def match(self, args_a): - # simple for now - memo = {} - for a1, a2 in zip(self.args_a, args_a): + def key(self): + # two LLBlockStates should return different keys if they cannot match(). 
+ if self.a_back is None: + backkey = None + else: + backkey = self.a_back.key() + return (self.origblock, self.origposition, backkey) + + def getruntimevars(self, memo): + if self.a_back is None: + result = [] + else: + result = self.a_back.getruntimevars(memo) + for a in self.args_a: + result.extend(a.getruntimevars(memo)) + return result + + def maybe_get_constant(self): + return None + + def with_fresh_variables(self, memo): + if self.a_back is None: + new_a_back = self.a_back.with_fresh_variables(memo) + else: + new_a_back = None + new_args_a = [a.with_fresh_variables(memo) for a in self.args_a] + return LLBlockState(new_args_a, self.origblock, self.origposition, + new_a_back) + + def match(self, other, memo): + if self.origblock is not other.origblock: + return False + if self.origposition != other.origposition: + return False + if self.a_back is None: + if other.a_back is not None: + return False + else: + if other.a_back is None: + return False + if not self.a_back.match(other.a_back, memo): + return False + for a1, a2 in zip(self.args_a, other.args_a): if not a1.match(a2, memo): return False else: @@ -328,6 +369,7 @@ self.graphs = [] self.graphstates = {} # {origgraph: {BlockState: GraphState}} self.pendingstates = {} # {Link-or-GraphState: next-BlockState} + self.blocks = {} # {BlockState.key(): list-of-LLBlockStates} def itercopygraphs(self): return self.graphs @@ -336,7 +378,6 @@ # for now, 'hints' means "I'm absolutely sure that the # given variables will have the given ll value" self.hints = hints - self.blocks = {} # {origblock: list-of-LLStates} args_a = [LLRuntimeValue(orig_v=v) for v in origgraph.getargs()] graphstate, args_a = self.schedule_graph(args_a, origgraph) graphstate.complete() @@ -352,7 +393,7 @@ result_a.append(a) return result_a - def schedule_graph(self, args_a, origgraph): + def schedule_graph(self, args_a, origgraph, a_back=None): origblock = origgraph.startblock state, args_a = self.schedule_getstate(args_a, origblock) try: @@ 
-377,20 +418,20 @@ self.pendingstates[newlink] = state return newlink - def schedule_getstate(self, args_a, origblock): + def schedule_getstate(self, args_a, origblock, a_back=None): # NOTA BENE: copyblocks can get shared between different copygraphs! args_a = self.applyhint(args_a, origblock) - pendingstates = self.blocks.setdefault(origblock, []) + newstate = LLBlockState(args_a, origblock, a_back) + pendingstates = self.blocks.setdefault(newstate.key(), []) # try to match this new state with an existing one for state in pendingstates: - if state.match(args_a): + if state.match(newstate, {}): # already matched return state, args_a else: - # schedule this new state - state = BlockState(origblock, args_a) - pendingstates.append(state) - return state, args_a + # cache and return this new state + pendingstates.append(newstate) + return newstate, args_a class GraphState(object): From arigo at codespeak.net Mon Dec 12 13:49:10 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Mon, 12 Dec 2005 13:49:10 +0100 (CET) Subject: [pypy-svn] r21073 - in pypy/dist/pypy/jit: . test Message-ID: <20051212124910.1163027DC6@code1.codespeak.net> Author: arigo Date: Mon Dec 12 13:49:08 2005 New Revision: 21073 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_jit_tl.py pypy/dist/pypy/jit/test/test_llabstractinterp.py Log: Get rid of the too general 'hints'. They go in the way... 
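[Editorial note: r21073 replaces the Variable-keyed `hints` dict with an `arghints` dict keyed by argument position, which is also why the `entry_point` indirection functions disappear from the tests. A minimal sketch of the new convention -- the two classes are stand-ins for the real LLAbstractValue hierarchy, not its implementation:]

```python
# Sketch of the arghints convention from r21073 (stand-in classes):
# hints are keyed by argument index, and a hinted argument enters the
# graph as a compile-time constant.

class LLConcreteValue:                # stand-in: value fixed at compile time
    def __init__(self, value):
        self.value = value

class LLRuntimeValue:                 # stand-in: value only known at run time
    def __init__(self, name):
        self.name = name

def entry_args(argnames, arghints):
    """arghints maps argument index -> ll value assumed constant."""
    return [LLConcreteValue(arghints[i]) if i in arghints
            else LLRuntimeValue(name)
            for i, name in enumerate(argnames)]

# as in jit_tl: the bytecode string and the initial pc are both hinted
args_a = entry_args(['code', 'pc'], {0: 'PUSH 1', 1: 0})
```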
Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Mon Dec 12 13:49:08 2005 @@ -374,28 +374,22 @@ def itercopygraphs(self): return self.graphs - def eval(self, origgraph, hints): - # for now, 'hints' means "I'm absolutely sure that the - # given variables will have the given ll value" - self.hints = hints - args_a = [LLRuntimeValue(orig_v=v) for v in origgraph.getargs()] - graphstate, args_a = self.schedule_graph(args_a, origgraph) + def eval(self, origgraph, arghints): + # 'arghints' maps argument index to a given ll value + args_a = [] + for i, v in enumerate(origgraph.getargs()): + if i in arghints: + a = LLConcreteValue(arghints[i]) + else: + a = LLRuntimeValue(orig_v=v) + args_a.append(a) + graphstate = self.schedule_graph(args_a, origgraph) graphstate.complete() return graphstate.copygraph - def applyhint(self, args_a, origblock): - result_a = [] - # apply the hints to make more LLConcreteValues - for a, origv in zip(args_a, origblock.inputargs): - if origv in self.hints: - # use the hint, ignore the source binding - a = LLConcreteValue(self.hints[origv]) - result_a.append(a) - return result_a - def schedule_graph(self, args_a, origgraph, a_back=None): origblock = origgraph.startblock - state, args_a = self.schedule_getstate(args_a, origblock) + state = self.schedule_getstate(args_a, origblock) try: graphstate = self.graphstates[origgraph][state] except KeyError: @@ -404,12 +398,12 @@ d[state] = graphstate self.pendingstates[graphstate] = state #print "SCHEDULE_GRAPH", graphstate - return graphstate, args_a + return graphstate def schedule(self, args_a, origblock): #print "SCHEDULE", args_a, origblock # args_a: [the-a-corresponding-to-v for v in origblock.inputargs] - state, args_a = self.schedule_getstate(args_a, origblock) + state = self.schedule_getstate(args_a, origblock) args_v = [] 
memo = {} for a in args_a: @@ -420,18 +414,17 @@ def schedule_getstate(self, args_a, origblock, a_back=None): # NOTA BENE: copyblocks can get shared between different copygraphs! - args_a = self.applyhint(args_a, origblock) newstate = LLBlockState(args_a, origblock, a_back) pendingstates = self.blocks.setdefault(newstate.key(), []) # try to match this new state with an existing one for state in pendingstates: if state.match(newstate, {}): # already matched - return state, args_a + return state else: # cache and return this new state pendingstates.append(newstate) - return newstate, args_a + return newstate class GraphState(object): @@ -701,8 +694,7 @@ return None a_result = LLRuntimeValue(op.result) - graphstate, args_a = self.interp.schedule_graph( - args_a, origgraph) + graphstate = self.interp.schedule_graph(args_a, origgraph) #print 'SCHEDULE_GRAPH', args_a, '==>', graphstate.copygraph.name if graphstate.state != "during": print 'ENTERING', graphstate.copygraph.name, args_a Modified: pypy/dist/pypy/jit/test/test_jit_tl.py ============================================================================== --- pypy/dist/pypy/jit/test/test_jit_tl.py (original) +++ pypy/dist/pypy/jit/test/test_jit_tl.py Mon Dec 12 13:49:08 2005 @@ -10,14 +10,9 @@ #py.test.skip("in-progress") -def entry_point(code, pc): - # indirection needed, because the hints are not about *all* calls to - # interp() - return tl.interp(code, pc) - def setup_module(mod): t = TranslationContext() - t.buildannotator().build_types(entry_point, [str, int]) + t.buildannotator().build_types(tl.interp, [str, int]) rtyper = t.buildrtyper() rtyper.specialize() inline.auto_inlining(t, 0.3) @@ -28,8 +23,8 @@ def jit_tl(code): interp = LLAbstractInterp() - hints = {graph1.getargs()[0]: string_repr.convert_const(code), - graph1.getargs()[1]: 0} + hints = {0: string_repr.convert_const(code), + 1: 0} graph2 = interp.eval(graph1, hints) result1 = llinterp.eval_graph(graph1, [string_repr.convert_const(code), 0]) @@ 
-37,7 +32,7 @@ assert result1 == result2 - #interp.graphs[1].show() # graphs[0] should be the entry_point + #interp.graphs[0].show() def run_jit(code): Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/test/test_llabstractinterp.py (original) +++ pypy/dist/pypy/jit/test/test_llabstractinterp.py Mon Dec 12 13:49:08 2005 @@ -34,7 +34,7 @@ interp = LLAbstractInterp() hints = {} for hint in arghints: - hints[graph1.getargs()[hint]] = argvalues[hint] + hints[hint] = argvalues[hint] graph2 = interp.eval(graph1, hints) # cache and return the original and the residual ll graph result = t, interp, graph1, graph2 @@ -192,11 +192,7 @@ return 1 else: return ll_factorial(k-1) * k - def ll_function(k): - # indirection needed, because the hint is not about *all* calls to - # ll_factorial() - return ll_factorial(k) - graph2, insns = abstrinterp(ll_function, [7], [0]) + graph2, insns = abstrinterp(ll_factorial, [7], [0]) # the direct_calls are messy to count, with calls to ll_stack_check assert insns.keys() == ['direct_call'] From arigo at codespeak.net Mon Dec 12 13:56:58 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Mon, 12 Dec 2005 13:56:58 +0100 (CET) Subject: [pypy-svn] r21074 - pypy/dist/pypy/jit Message-ID: <20051212125658.97D1F27B60@code1.codespeak.net> Author: arigo Date: Mon Dec 12 13:56:57 2005 New Revision: 21074 Modified: pypy/dist/pypy/jit/llabstractinterp.py Log: Consistently pass LLBlockStates around instead of args_a lists. 
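[Editorial note: r21074 makes schedule_getstate() take a ready-made LLBlockState and rely on its key()/match() pair: key() gives a coarse bucket (states with different keys can never match), match() then decides exactly whether an already-scheduled block can be reused. A compressed, self-contained model of that cache -- the toy state below is invented, not the real LLBlockState:]

```python
# Toy model of the r21074 state cache: bucket by key(), confirm with match().

class BlockState:
    def __init__(self, origblock, args_a):
        self.origblock = origblock   # which source block this state enters
        self.args_a = args_a         # per argument: constant value, or None

    def key(self):
        # states in different buckets can never match, so the cache can
        # bucket by key instead of scanning every pending state
        return self.origblock

    def match(self, other):
        return self.args_a == other.args_a

pending = {}        # key -> list of states, like self.blocks in the interp

def schedule_getstate(state):
    bucket = pending.setdefault(state.key(), [])
    for old in bucket:
        if old.match(state):
            return old               # reuse the block compiled for 'old'
    bucket.append(state)             # genuinely new: schedule for later
    return state

s1 = schedule_getstate(BlockState('loop', [7, None]))
s2 = schedule_getstate(BlockState('loop', [7, None]))   # same -> reused
s3 = schedule_getstate(BlockState('loop', [8, None]))   # new constant
assert s2 is s1 and s3 is not s1
```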
Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Mon Dec 12 13:56:57 2005 @@ -387,50 +387,46 @@ graphstate.complete() return graphstate.copygraph - def schedule_graph(self, args_a, origgraph, a_back=None): - origblock = origgraph.startblock - state = self.schedule_getstate(args_a, origblock) + def schedule_graph(self, args_a, origgraph): + inputstate = LLBlockState(args_a, origgraph.startblock) + state = self.schedule_getstate(inputstate) try: graphstate = self.graphstates[origgraph][state] except KeyError: d = self.graphstates.setdefault(origgraph, {}) - graphstate = GraphState(self, origgraph, args_a, n=len(d)) + graphstate = GraphState(self, origgraph, inputstate, n=len(d)) d[state] = graphstate self.pendingstates[graphstate] = state #print "SCHEDULE_GRAPH", graphstate return graphstate - def schedule(self, args_a, origblock): + def schedule(self, inputstate): #print "SCHEDULE", args_a, origblock # args_a: [the-a-corresponding-to-v for v in origblock.inputargs] - state = self.schedule_getstate(args_a, origblock) - args_v = [] - memo = {} - for a in args_a: - args_v.extend(a.getruntimevars(memo)) + state = self.schedule_getstate(inputstate) + args_v = inputstate.getruntimevars({}) newlink = Link(args_v, None) self.pendingstates[newlink] = state return newlink - def schedule_getstate(self, args_a, origblock, a_back=None): + def schedule_getstate(self, inputstate): # NOTA BENE: copyblocks can get shared between different copygraphs! 
- newstate = LLBlockState(args_a, origblock, a_back) - pendingstates = self.blocks.setdefault(newstate.key(), []) - # try to match this new state with an existing one + pendingstates = self.blocks.setdefault(inputstate.key(), []) + # try to match the input state with an existing one for state in pendingstates: - if state.match(newstate, {}): + if state.match(inputstate, {}): # already matched return state else: # cache and return this new state - pendingstates.append(newstate) - return newstate + pendingstates.append(inputstate) + return inputstate class GraphState(object): """Entry state of a graph.""" - def __init__(self, interp, origgraph, args_a, n): + def __init__(self, interp, origgraph, inputstate, n): self.interp = interp self.origgraph = origgraph name = '%s_%d' % (origgraph.name, n) @@ -533,7 +529,9 @@ newlinks = [] for origlink in links: args_a = [builder.binding(v) for v in origlink.args] - newlink = self.interp.schedule(args_a, origlink.target) + nextinputstate = LLBlockState(args_a, origlink.target, + a_back=state.a_back) + newlink = self.interp.schedule(nextinputstate) if newexitswitch is not None: newlink.exitcase = origlink.exitcase newlink.llexitcase = origlink.llexitcase From arigo at codespeak.net Mon Dec 12 15:56:13 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Mon, 12 Dec 2005 15:56:13 +0100 (CET) Subject: [pypy-svn] r21079 - in pypy/dist/pypy/jit: . test Message-ID: <20051212145613.7D5E627DC0@code1.codespeak.net> Author: arigo Date: Mon Dec 12 15:56:11 2005 New Revision: 21079 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_llabstractinterp.py Log: Inlining. Currently with a policy to select between either the old behavior or full inlining. At some point we should try to detect when inlining is necessary. 
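[Editorial note: part of the r21079 diff below is a live_variables() helper: when a direct_call suspends a block mid-way for inlining, the suspended state only needs the variables that later operations or exit links still read. A condensed, runnable model of it follows -- the exitswitch handling of the real helper is omitted, and the block representation is simplified to tuples:]

```python
# Condensed model of live_variables() from r21079: variables alive at
# 'position' are those read by the remaining operations or exit links,
# listed in creation order (input args first, then earlier op results).

def live_variables(inputargs, operations, exits, position):
    """operations: list of (result, args); exits: per-link argument lists."""
    used = set()
    for _result, args in operations[position:]:
        used.update(args)
    for link_args in exits:
        used.update(link_args)
    live = [v for v in inputargs if v in used]
    live += [res for res, _args in operations[:position] if res in used]
    return live

ops = [('t1', ['x', 'y']),   # t1 = add(x, y)
       ('t2', ['t1']),       # <- the call being inlined (position 1)
       ('t3', ['t2', 'y'])]
# 'x' is dead after position 1; 'y' and 't1' must be kept alive:
assert live_variables(['x', 'y'], ops, [['t3']], 1) == ['y', 't1']
```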
Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Mon Dec 12 15:56:11 2005 @@ -297,25 +297,23 @@ # ____________________________________________________________ -class LLBlockState(LLAbstractValue): +class LLState(LLAbstractValue): """Entry state of a block, as a combination of LLAbstractValues - for its input arguments.""" + for its input arguments. Abstract base class.""" - def __init__(self, args_a, origblock, origposition=0, a_back=None): - assert len(args_a) == len(origblock.inputargs) + def __init__(self, a_back, args_a, origblock): + self.a_back = a_back self.args_a = args_a self.origblock = origblock - self.origposition = origposition self.copyblock = None - self.a_back = a_back + assert len(args_a) == len(self.getlivevars()) def key(self): - # two LLBlockStates should return different keys if they cannot match(). - if self.a_back is None: - backkey = None - else: - backkey = self.a_back.key() - return (self.origblock, self.origposition, backkey) + # two LLStates should return different keys if they cannot match(). 
+ result = self.localkey() + if self.a_back is not None: + result += self.a_back.key() + return result def getruntimevars(self, memo): if self.a_back is None: @@ -330,18 +328,22 @@ return None def with_fresh_variables(self, memo): - if self.a_back is None: + if self.a_back is not None: new_a_back = self.a_back.with_fresh_variables(memo) else: new_a_back = None - new_args_a = [a.with_fresh_variables(memo) for a in self.args_a] - return LLBlockState(new_args_a, self.origblock, self.origposition, - new_a_back) + new_args_a = [] + for v, a in zip(self.getlivevars(), self.args_a): + a = a.with_fresh_variables(memo) + # try to preserve the name + if isinstance(a, LLRuntimeValue) and isinstance(a.copy_v, Variable): + a.copy_v.rename(v) + new_args_a.append(a) + return self.__class__(new_a_back, new_args_a, *self.localkey()) def match(self, other, memo): - if self.origblock is not other.origblock: - return False - if self.origposition != other.origposition: + assert self.__class__ is other.__class__ + if self.localkey() != other.localkey(): return False if self.a_back is None: if other.a_back is not None: @@ -361,15 +363,51 @@ #print "RESOLVING BLOCK", newblock self.copyblock = newblock + def getbindings(self): + return dict(zip(self.getlivevars(), self.args_a)) + + +class LLBlockState(LLState): + """Entry state of a block, as a combination of LLAbstractValues + for its input arguments.""" + + def localkey(self): + return (self.origblock,) + + def getlivevars(self): + return self.origblock.inputargs + + +class LLSuspendedBlockState(LLBlockState): + """Block state in the middle of the execution of one instruction + (typically a direct_call() that is causing inlining).""" + + def __init__(self, a_back, args_a, origblock, origposition): + self.origposition = origposition + super(LLSuspendedBlockState, self).__init__(a_back, args_a, origblock) + + def localkey(self): + return (self.origblock, self.origposition) + + def getlivevars(self): + return live_variables(self.origblock, 
self.origposition) + + # ____________________________________________________________ +class Policy(object): + def __init__(self, inlining=False): + self.inlining = inlining + + class LLAbstractInterp(object): - def __init__(self): + def __init__(self, policy=Policy()): self.graphs = [] self.graphstates = {} # {origgraph: {BlockState: GraphState}} self.pendingstates = {} # {Link-or-GraphState: next-BlockState} self.blocks = {} # {BlockState.key(): list-of-LLBlockStates} + self.policy = policy def itercopygraphs(self): return self.graphs @@ -388,7 +426,7 @@ return graphstate.copygraph def schedule_graph(self, args_a, origgraph): - inputstate = LLBlockState(args_a, origgraph.startblock) + inputstate = LLBlockState(None, args_a, origgraph.startblock) state = self.schedule_getstate(inputstate) try: graphstate = self.graphstates[origgraph][state] @@ -402,7 +440,6 @@ def schedule(self, inputstate): #print "SCHEDULE", args_a, origblock - # args_a: [the-a-corresponding-to-v for v in origblock.inputargs] state = self.schedule_getstate(inputstate) args_v = inputstate.getruntimevars({}) newlink = Link(args_v, None) @@ -483,7 +520,11 @@ # the graph should be complete now; sanity-check try: checkgraph(graph) - except: + except Exception, e: + print 'INVALID GRAPH:' + import traceback + traceback.print_exc() + print 'graph.show()...' 
graph.show() raise eliminate_empty_blocks(graph) @@ -492,27 +533,59 @@ def flowin(self, state): # flow in the block + assert isinstance(state, LLBlockState) origblock = state.origblock - builder = BlockBuilder(self.interp) - newinputargs = [] - memo = {} - memo2 = {} - for v, a in zip(origblock.inputargs, state.args_a): - a = a.with_fresh_variables(memo) - # try to preserve the name - if isinstance(a, LLRuntimeValue) and isinstance(a.copy_v, Variable): - a.copy_v.rename(v) - builder.bindings[v] = a - newinputargs.extend(a.getruntimevars(memo2)) + origposition = 0 + builder = BlockBuilder(self.interp, state) + newexitswitch = None print - # flow the actual operations of the block - for op in origblock.operations: - builder.dispatch(op) - # done + try: + if origblock.operations == (): + if state.a_back is None: + # copies of return and except blocks are *normal* blocks + # currently; they are linked to the official return or + # except block of the copygraph. If needed, + # LLConcreteValues are turned into Constants. 
+ if len(origblock.inputargs) == 1: + target = self.copygraph.returnblock + else: + target = self.copygraph.exceptblock + args_v = [builder.binding(v).forcevarorconst(builder) + for v in origblock.inputargs] + raise InsertNextLink(Link(args_v, target)) + else: + # finishing a handle_call_inlining(): link back to + # the parent, passing the return value + # XXX GENERATE KEEPALIVES HERE + if len(origblock.inputargs) == 1: + a_result = builder.binding(origblock.inputargs[0]) + builder.runningstate = builder.runningstate.a_back + origblock = builder.runningstate.origblock + origposition = builder.runningstate.origposition + builder.bindings = builder.runningstate.getbindings() + op = origblock.operations[origposition] + builder.bindings[op.result] = a_result + origposition += 1 + else: + XXX_later - newexitswitch = None - if origblock.operations != (): - # build exit links and schedule their target for later completion + # flow the actual operations of the block + for i in range(origposition, len(origblock.operations)): + op = origblock.operations[i] + builder.enter(origblock, i) + try: + builder.dispatch(op) + finally: + builder.leave() + # done + + except InsertNextLink, e: + # the current operation forces a jump to another block + newlinks = [e.link] + + else: + # normal path: build exit links and schedule their target for + # later completion if origblock.exitswitch is None: links = origblock.exits elif origblock.exitswitch == Constant(last_exception): @@ -529,39 +602,30 @@ newlinks = [] for origlink in links: args_a = [builder.binding(v) for v in origlink.args] - nextinputstate = LLBlockState(args_a, origlink.target, - a_back=state.a_back) + nextinputstate = LLBlockState(builder.runningstate.a_back, + args_a, origlink.target) newlink = self.interp.schedule(nextinputstate) if newexitswitch is not None: newlink.exitcase = origlink.exitcase newlink.llexitcase = origlink.llexitcase newlinks.append(newlink) - else: - # copies of return and except blocks are *normal* 
blocks currently; - # they are linked to the official return or except block of the - # copygraph. If needed, LLConcreteValues are turned into Constants. - if len(origblock.inputargs) == 1: - target = self.copygraph.returnblock - else: - target = self.copygraph.exceptblock - args_v = [builder.binding(v).forcevarorconst(builder) - for v in origblock.inputargs] - newlinks = [Link(args_v, target)] - #print "CLOSING" - newblock = builder.buildblock(newinputargs, newexitswitch, newlinks) + newblock = builder.buildblock(newexitswitch, newlinks) state.resolveblock(newblock) class BlockBuilder(object): - def __init__(self, interp): + def __init__(self, interp, initialstate): self.interp = interp - self.bindings = {} # {Variables-of-origblock: a_value} + self.runningstate = initialstate.with_fresh_variables({}) + self.newinputargs = self.runningstate.getruntimevars({}) + # {Variables-of-origblock: a_value} + self.bindings = self.runningstate.getbindings() self.residual_operations = [] - def buildblock(self, newinputargs, newexitswitch, newlinks): - b = Block(newinputargs) + def buildblock(self, newexitswitch, newlinks): + b = Block(self.newinputargs) b.operations = self.residual_operations b.exitswitch = newexitswitch b.closeblock(*newlinks) @@ -578,6 +642,14 @@ a_result = handler(op, *[self.binding(v) for v in op.args]) self.bindings[op.result] = a_result + def enter(self, origblock, origposition): + self.blockpos = origblock, origposition + + def leave(self): + del self.blockpos + + # ____________________________________________________________ + # Utilities def constantfold(self, constant_op, args_a): concretevalues = [] @@ -619,6 +691,7 @@ return a_result # ____________________________________________________________ + # Operation handlers def op_int_is_true(self, op, a): return self.residualize(op, [a], operator.truth) @@ -682,8 +755,23 @@ return None origgraph = fnobj.graph + if self.interp.policy.inlining: + return self.handle_call_inlining(op, origgraph, *args_a) + 
else: + return self.handle_call_residual(op, origgraph, *args_a) - # for now, we need to force all arguments + def handle_call_inlining(self, op, origgraph, *args_a): + origblock, origposition = self.blockpos + alive_a = [] + for v in live_variables(origblock, origposition): + alive_a.append(self.bindings[v]) + parentstate = LLSuspendedBlockState(self.runningstate.a_back, alive_a, + origblock, origposition) + nextstate = LLBlockState(parentstate, args_a, origgraph.startblock) + raise InsertNextLink(self.interp.schedule(nextstate)) + + def handle_call_residual(self, op, origgraph, *args_a): + # residual call: for now we need to force all arguments any_concrete = False for a in args_a: a.forcevarorconst(self) @@ -702,7 +790,6 @@ a_result = graphstate.a_return print 'LEAVING', graphstate.copygraph.name, graphstate.a_return - origfptr = v_func.value ARGS = [] new_args_a = [] for a in args_a: @@ -710,8 +797,7 @@ ARGS.append(a.getconcretetype()) new_args_a.append(a) args_a = new_args_a - TYPE = lltype.FuncType( - ARGS, lltype.typeOf(origfptr).TO.RESULT) + TYPE = lltype.FuncType(ARGS, a_result.getconcretetype()) fptr = lltype.functionptr( TYPE, graphstate.copygraph.name, graph=graphstate.copygraph) a_func = LLRuntimeValue(const(fptr)) @@ -806,3 +892,28 @@ self.residual_operations.append(op) return ll_no_return_value return self.residualize(op, [a_ptr]) + + +class InsertNextLink(Exception): + def __init__(self, link): + self.link = link + + +def live_variables(block, position): + # return a list of all variables alive in the block at the beginning of + # the given 'position', in the order of creation. 
+ used = {block.exitswitch: True} + for op in block.operations[position:]: + for v in op.args: + used[v] = True + for link in block.exits: + for v in link.args: + used[v] = True + result = [] + for v in block.inputargs: + if v in used: + result.append(v) + for op in block.operations[:position]: + if op.result in used: + result.append(op.result) + return result Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/test/test_llabstractinterp.py (original) +++ pypy/dist/pypy/jit/test/test_llabstractinterp.py Mon Dec 12 15:56:11 2005 @@ -4,7 +4,7 @@ from pypy.rpython.llinterp import LLInterpreter from pypy.rpython import rstr from pypy.annotation import model as annmodel -from pypy.jit.llabstractinterp import LLAbstractInterp +from pypy.jit.llabstractinterp import LLAbstractInterp, Policy def annotation(a, x): @@ -16,8 +16,9 @@ return a.typeannotation(t) _lastinterpreted = [] -def get_and_residualize_graph(ll_function, argvalues, arghints): - key = ll_function, tuple(arghints), tuple([argvalues[n] for n in arghints]) +def get_and_residualize_graph(ll_function, argvalues, arghints, policy): + key = (ll_function, tuple(arghints), + tuple([argvalues[n] for n in arghints]), policy) for key1, value1 in _lastinterpreted: # 'key' is not hashable if key1 == key: return value1 @@ -31,7 +32,7 @@ rtyper = t.buildrtyper() rtyper.specialize() # build the residual ll graphs by propagating the hints - interp = LLAbstractInterp() + interp = LLAbstractInterp(policy) hints = {} for hint in arghints: hints[hint] = argvalues[hint] @@ -41,9 +42,9 @@ _lastinterpreted.append((key, result)) return result -def abstrinterp(ll_function, argvalues, arghints): +def abstrinterp(ll_function, argvalues, arghints, policy=Policy()): t, interp, graph1, graph2 = get_and_residualize_graph( - ll_function, argvalues, arghints) + ll_function, argvalues, arghints, policy) argvalues2 = [argvalues[n] for n 
in range(len(argvalues)) if n not in arghints] rtyper = t.rtyper @@ -60,6 +61,8 @@ insns[op.opname] = insns.get(op.opname, 0) + 1 return graph2, insns +P_INLINE = Policy(inlining=True) + def test_simple(): def ll_function(x, y): @@ -302,3 +305,11 @@ return (a[0] + a[1]) + a[2] graph2, insns = abstrinterp(ll_function, [7, 983], [0]) assert insns == {'int_add': 1} + +def test_simple_call_with_inlining(): + def ll2(x, y): + return x + (y + 42) + def ll1(x, y, z): + return ll2(x, y - z) + graph2, insns = abstrinterp(ll1, [3, 4, 5], [1, 2], policy=P_INLINE) + assert insns == {'int_add': 1} From arigo at codespeak.net Mon Dec 12 15:58:44 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Mon, 12 Dec 2005 15:58:44 +0100 (CET) Subject: [pypy-svn] r21080 - pypy/dist/pypy/jit/test Message-ID: <20051212145844.F251C27DC0@code1.codespeak.net> Author: arigo Date: Mon Dec 12 15:58:43 2005 New Revision: 21080 Modified: pypy/dist/pypy/jit/test/test_jit_tl.py Log: Use the LLAbstractInterp inlining instead of the backendopt one in this test. 
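[Editorial aside: the `live_variables()` helper added in the inlining patch above can be illustrated standalone. The classes below (`Op`, `Lnk`, `Blk`) are simplified stand-ins for illustration only, not PyPy's actual flowgraph classes; only the `live_variables()` body mirrors the patch.]

```python
# Standalone sketch of the live_variables() idea from the inlining patch.
# Op/Lnk/Blk are hypothetical, simplified stand-ins for PyPy's
# SpaceOperation/Link/Block flowgraph classes.

class Op:
    def __init__(self, result, args):
        self.result = result      # variable produced by the operation
        self.args = args          # variables read by the operation

class Lnk:
    def __init__(self, args):
        self.args = args          # variables passed along the exit link

class Blk:
    def __init__(self, inputargs, operations, exits, exitswitch=None):
        self.inputargs = inputargs
        self.operations = operations
        self.exits = exits
        self.exitswitch = exitswitch

def live_variables(block, position):
    # variables still needed at 'position': anything read by the remaining
    # operations, by the exit links, or by the exit switch
    used = {block.exitswitch: True}
    for op in block.operations[position:]:
        for v in op.args:
            used[v] = True
    for link in block.exits:
        for v in link.args:
            used[v] = True
    # report them in order of creation: input args first, then the results
    # of the operations already executed
    result = [v for v in block.inputargs if v in used]
    result += [op.result for op in block.operations[:position]
               if op.result in used]
    return result

# tiny example: v2 = add(v0, v1); v3 = mul(v2, v0); the exit returns v3
b = Blk(inputargs=['v0', 'v1'],
        operations=[Op('v2', ['v0', 'v1']), Op('v3', ['v2', 'v0'])],
        exits=[Lnk(['v3'])])
print(live_variables(b, 1))   # -> ['v0', 'v2']   (v1 is dead after op 0)
```

This is exactly the information `handle_call_inlining()` needs: the values that must survive in the suspended parent block while the callee's blocks are scheduled.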
Modified: pypy/dist/pypy/jit/test/test_jit_tl.py ============================================================================== --- pypy/dist/pypy/jit/test/test_jit_tl.py (original) +++ pypy/dist/pypy/jit/test/test_jit_tl.py Mon Dec 12 15:58:43 2005 @@ -3,10 +3,10 @@ import py from pypy.translator.translator import TranslationContext from pypy.jit import tl -from pypy.jit.llabstractinterp import LLAbstractInterp +from pypy.jit.llabstractinterp import LLAbstractInterp, Policy from pypy.rpython.rstr import string_repr from pypy.rpython.llinterp import LLInterpreter -from pypy.translator.backendopt import inline +#from pypy.translator.backendopt import inline #py.test.skip("in-progress") @@ -15,14 +15,14 @@ t.buildannotator().build_types(tl.interp, [str, int]) rtyper = t.buildrtyper() rtyper.specialize() - inline.auto_inlining(t, 0.3) + #inline.auto_inlining(t, 0.3) mod.graph1 = t.graphs[0] mod.llinterp = LLInterpreter(rtyper) def jit_tl(code): - interp = LLAbstractInterp() + interp = LLAbstractInterp(Policy(inlining=True)) hints = {0: string_repr.convert_const(code), 1: 0} graph2 = interp.eval(graph1, hints) From cfbolz at codespeak.net Mon Dec 12 21:56:15 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Mon, 12 Dec 2005 21:56:15 +0100 (CET) Subject: [pypy-svn] r21089 - pypy/dist/pypy/doc/statistic Message-ID: <20051212205615.3C25327DB4@code1.codespeak.net> Author: cfbolz Date: Mon Dec 12 21:56:14 2005 New Revision: 21089 Modified: pypy/dist/pypy/doc/statistic/release_dates.csv Log: add start of funding line Modified: pypy/dist/pypy/doc/statistic/release_dates.csv ============================================================================== --- pypy/dist/pypy/doc/statistic/release_dates.csv (original) +++ pypy/dist/pypy/doc/statistic/release_dates.csv Mon Dec 12 21:56:14 2005 @@ -1,5 +1,6 @@ PyPy releases date, release +2004-12-01,"Start of EU-funding" 2005-05-20,"PyPy 0.6 & 0.6.1" 2005-08-28,"PyPy 0.7.0" 2005-11-03,"PyPy 0.8.0" From rxe at 
codespeak.net Tue Dec 13 11:17:46 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Tue, 13 Dec 2005 11:17:46 +0100 (CET) Subject: [pypy-svn] r21121 - pypy/dist/pypy/translator/c/test Message-ID: <20051213101746.B080327B6C@code1.codespeak.net> Author: rxe Date: Tue Dec 13 11:17:45 2005 New Revision: 21121 Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py Log: API change. Now Resumable is longer (!) as it also includes the function. Funny how inheritance seems so much more useful in a static language. :-) Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_tasklets.py (original) +++ pypy/dist/pypy/translator/c/test/test_tasklets.py Tue Dec 13 11:17:45 2005 @@ -15,12 +15,10 @@ gcpolicy = BoehmGcPolicy # count of loops in tests (set lower to speed up) -loops = 10 +loops = 1000 -debug_flag = False def debug(s): - if debug_flag: - os.write(2, "%s\n" % s) + os.write(2, "%s\n" % s) class Globals: def __init__(self): @@ -60,8 +58,7 @@ # ____________________________________________________________ class Resumable(object): - def __init__(self, fn): - self.fn = fn + def __init__(self): self.alive = False def start(self): @@ -74,16 +71,19 @@ def suspend(self): # we suspend ourself - self.caller = self.caller.switch() + self.caller = self.caller.switch() def resume(self): # the caller resumes me self.resumable = self.resumable.switch() self.alive = self.resumable is not None + def fn(self): + pass + class Tasklet(Resumable): - def __init__(self, fn): - Resumable.__init__(self, fn) + def __init__(self): + Resumable.__init__(self) self.blocked = 0 # propogates round suspend-resume to tell scheduler in run() @@ -226,51 +226,53 @@ # ____________________________________________________________ -#XXX start_tasklet -#XXX start_tasklet_now +def test_simplex(): -def test_simple(): - - def simple(): - for ii in range(5): - globals.count += 
1 - schedule() + class Tasklet1(Tasklet): + def fn(self): + for ii in range(5): + globals.count += 1 + schedule() def f(): for ii in range(loops): - Tasklet(simple).start() + Tasklet1().start() run() return globals.count == loops * 5 res = wrap_stackless_function(f) + print res assert res == '1' def test_multiple_simple(): - def simple(): - for ii in range(5): - globals.count += 1 - schedule() - - def simple2(): - for ii in range(5): - globals.count += 1 - schedule() - globals.count += 1 + class Tasklet1(Tasklet): + def fn(self): + for ii in range(5): + globals.count += 1 + schedule() - def simple3(): - schedule() - for ii in range(10): - globals.count += 1 - if ii % 2: + class Tasklet2(Tasklet): + def fn(self): + for ii in range(5): + globals.count += 1 schedule() - schedule() + globals.count += 1 + + class Tasklet3(Tasklet): + def fn(self): + schedule() + for ii in range(10): + globals.count += 1 + if ii % 2: + schedule() + schedule() def f(): for ii in range(loops): - Tasklet(simple).start() - Tasklet(simple2).start() - Tasklet(simple3).start() + Tasklet1().start() + Tasklet2().start() + Tasklet3().start() run() return globals.count == loops * 25 @@ -278,21 +280,22 @@ assert res == '1' def test_schedule_remove(): - - def simple(): - for ii in range(20): - if ii < 10: - schedule() - else: - schedule_remove() - globals.count += 1 + + class Tasklet1(Tasklet): + def fn(self): + for ii in range(20): + if ii < 10: + schedule() + else: + schedule_remove() + globals.count += 1 def f(): for ii in range(loops): - Tasklet(simple).start() + Tasklet1().start() run() for ii in range(loops): - Tasklet(simple).start() + Tasklet1().start() run() return globals.count == loops * 10 * 2 @@ -301,27 +304,29 @@ def test_run_immediately(): globals.intermediate = 0 - globals.count = 0 - def simple(): - for ii in range(20): - globals.count += 1 - schedule() + class Tasklet1(Tasklet): + def fn(self): + for ii in range(20): + globals.count += 1 + schedule() - def run_immediately(): - 
globals.intermediate = globals.count - schedule() - - def simple2(): - for ii in range(20): - globals.count += 1 - if ii == 10: - Tasklet(run_immediately).run() + class RunImmediate(Tasklet): + def fn(self): + globals.intermediate = globals.count schedule() + + class Tasklet2(Tasklet): + def fn(self): + for ii in range(20): + globals.count += 1 + if ii == 10: + RunImmediate().run() + schedule() def f(): - Tasklet(simple2).start() + Tasklet2().start() for ii in range(loops): - Tasklet(simple).start() + Tasklet1().start() run() total_expected = (loops + 1) * 20 return (globals.intermediate == total_expected / 2 + 1 and @@ -333,18 +338,20 @@ def test_channel1(): ch = Channel() - def f1(): - for ii in range(5): - ch.send(ii) + class Tasklet1(Tasklet): + def fn(self): + for ii in range(5): + ch.send(ii) - def f2(): - #while True: XXX Doesnt annotate - for ii in range(6): - globals.count += ch.receive() + class Tasklet2(Tasklet): + def fn(self): + #while True: XXX Doesnt annotate + for ii in range(6): + globals.count += ch.receive() def f(): - Tasklet(f2).start() - Tasklet(f1).start() + Tasklet2().start() + Tasklet1().start() run() return (globals.count == 10) @@ -352,106 +359,168 @@ assert res == '1' def test_channel2(): - ch = Channel() - def f1(): - for ii in range(5): - ch.send(ii) + class Tasklet1(Tasklet): + def __init__(self, ch): + self.ch = ch + def fn(self): + for ii in range(5): + self.ch.send(ii) - def f2(): - #while True:XXX Doesnt annotate - for ii in range(6): - res = ch.receive() - globals.count += res + class Tasklet2(Tasklet): + def __init__(self, ch): + self.ch = ch + def fn(self): + #while True:XXX Doesnt annotate + for ii in range(6): + res = self.ch.receive() + globals.count += res def f(): - Tasklet(f1).start() - Tasklet(f2).start() + ch = Channel() + Tasklet1(ch).start() + Tasklet2(ch).start() run() - return (globals.count == 10) + return globals.count == 10 res = wrap_stackless_function(f) assert res == '1' def test_channel3(): - ch = Channel() - 
def f1(): - for ii in range(5): - ch.send(ii) + class Tasklet1(Tasklet): + def __init__(self, ch): + self.ch = ch + def fn(self): + for ii in range(5): + self.ch.send(ii) - def f2(): - #while True: XXX Doesnt annotate - for ii in range(16): - res = ch.receive() - globals.count += res + class Tasklet2(Tasklet): + def __init__(self, ch): + self.ch = ch + def fn(self): + #while True: XXX Doesnt annotate + for ii in range(16): + res = self.ch.receive() + globals.count += res def f(): - Tasklet(f1).start() - Tasklet(f1).start() - Tasklet(f1).start() - Tasklet(f2).start() + ch = Channel() + Tasklet1(ch).start() + Tasklet1(ch).start() + Tasklet1(ch).start() + Tasklet2(ch).start() run() - return (globals.count == 30) + return globals.count == 30 res = wrap_stackless_function(f) assert res == '1' -def test_channel4(): +def test_flexible_channels(): """ test with something other than int """ - class A: - pass - + class A(object): + def __init__(self, num): + self.num = num + def getvalue(self): + res = self.num + self.num *= 2 + return res + class Data(object): pass class IntData(Data): - def __init__(self, d): - self.d = d + def __init__(self, i): + self.int = i class StringData(Data): - def __init__(self, d): - self.d = d + def __init__(self, s): + self.str = s - class InstanceAData(Data): - def __init__(self, d): - self.d = d - - ch1 = Channel() - ch2 = Channel() - ch3 = Channel() - - def f1(): - for ii in range(5): - ch1.send(IntData(ii)) + class InstanceData(Data): + def __init__(self, i): + self.instance = i + + + class Tasklet1(Tasklet): + def __init__(self, ch): + self.ch = ch + def fn(self): + for ii in range(5): + self.ch.send(IntData(ii)) + + class Tasklet2(Tasklet): + def __init__(self, ch, strdata): + self.ch = ch + self.strdata = strdata + def fn(self): + for ii in range(5): + self.ch.send(StringData(self.strdata)) + + class Tasklet3(Tasklet): + def __init__(self, ch, instance): + self.ch = ch + self.instance = instance + def fn(self): + for ii in range(5): + 
self.ch.send(InstanceData(self.instance)) + + class Server(Tasklet): + def __init__(self, ch): + self.ch = ch + self.loop = True + + def stop(self): + self.loop = False - def f2(): - for ii in range(5): - ch2.send(StringData("asda")) + def fn(self): + while self.loop: + data = self.ch.receive() + if isinstance(data, IntData): + globals.count += data.int + elif isinstance(data, StringData): + globals.count += len(data.str) + elif isinstance(data, InstanceData): + globals.count += data.instance.getvalue() + + ch = Channel() + server = Server(ch) + + def f(): + Tasklet1(ch).start() + Tasklet2(ch, "abcd").start() + Tasklet2(ch, "xxx").start() + Tasklet3(ch, A(1)).start() + server.start() + run() + return globals.count == (0+1+2+3+4) + (5*4) + (5*3) + (1+2+4+8+16) + + res = wrap_stackless_function(f) + assert res == '1' - def f3(): +def test_original_api(): + + class TaskletAsFunction(Tasklet): + def __init__(self, fn): + self.redirect_fn = fn + def fn(self): + self.redirect_fn() + + def tasklet(fn): + return TaskletAsFunction(fn) + + def simple(): for ii in range(5): - ch3.send(StringData("asda")) - - def fr(): - #while True: - for ii in range(11): - data3 = ch3.receive() globals.count += 1 - data1 = ch1.receive() - globals.count += 1 - data2 = ch2.receive() - globals.count += 1 - + schedule() + def f(): - Tasklet(fr).start() - Tasklet(f1).start() - Tasklet(f2).start() - Tasklet(f3).start() + for ii in range(loops): + tasklet(simple).start() + run() run() - return (globals.count == 15) + return globals.count == loops * 5 res = wrap_stackless_function(f) assert res == '1' - From adim at codespeak.net Tue Dec 13 11:20:22 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Tue, 13 Dec 2005 11:20:22 +0100 (CET) Subject: [pypy-svn] r21122 - in pypy/dist/pypy/interpreter: . 
test Message-ID: <20051213102022.6F8EC27B6C@code1.codespeak.net> Author: adim Date: Tue Dec 13 11:20:21 2005 New Revision: 21122 Modified: pypy/dist/pypy/interpreter/baseobjspace.py pypy/dist/pypy/interpreter/test/test_objspace.py Log: (adim, arigo) A helper space.interp_w() with proper type checking. Modified: pypy/dist/pypy/interpreter/baseobjspace.py ============================================================================== --- pypy/dist/pypy/interpreter/baseobjspace.py (original) +++ pypy/dist/pypy/interpreter/baseobjspace.py Tue Dec 13 11:20:21 2005 @@ -390,12 +390,29 @@ def interpclass_w(space, w_obj): """ If w_obj is a wrapped internal interpreter class instance unwrap to it, - otherwise return None + otherwise return None. (Can be overridden in specific spaces; you + should generally use the helper space.interp_w() instead.) """ if isinstance(w_obj, Wrappable): return w_obj return None + def interp_w(self, RequiredClass, w_obj, can_be_None=False): + """ + Unwrap w_obj, checking that it is an instance of the required internal + interpreter class (a subclass of Wrappable). + """ + if can_be_None and self.is_w(w_obj, self.w_None): + return None + obj = self.interpclass_w(w_obj) + if not isinstance(obj, RequiredClass): # or obj is None + msg = "expected a %s, got %s instead" % ( + RequiredClass.typedef.name, + w_obj.getclass(self).getname(self, '?')) + raise OperationError(self.w_TypeError, self.wrap(msg)) + return obj + interp_w._annspecialcase_ = 'specialize:arg1' + def unpackiterable(self, w_iterable, expected_length=-1): """Unpack an iterable object into a real (interpreter-level) list. 
Raise a real (subclass of) ValueError if the length is wrong.""" Modified: pypy/dist/pypy/interpreter/test/test_objspace.py ============================================================================== --- pypy/dist/pypy/interpreter/test/test_objspace.py (original) +++ pypy/dist/pypy/interpreter/test/test_objspace.py Tue Dec 13 11:20:21 2005 @@ -1,5 +1,7 @@ import autopath from py.test import raises +from pypy.interpreter.function import Function +from pypy.interpreter.pycode import PyCode # this test isn't so much to test that the objspace interface *works* # -- it's more to test that it's *there* @@ -87,6 +89,17 @@ assert not self.space.exception_match(self.space.w_ValueError, self.space.w_LookupError) + def test_interp_w(self): + w = self.space.wrap + w_bltinfunction = self.space.builtin.get('len') + res = self.space.interp_w(Function, w_bltinfunction) + assert res is w_bltinfunction # with the std objspace only + self.space.raises_w(self.space.w_TypeError, self.space.interp_w, PyCode, w_bltinfunction) + self.space.raises_w(self.space.w_TypeError, self.space.interp_w, Function, w(42)) + self.space.raises_w(self.space.w_TypeError, self.space.interp_w, Function, w(None)) + res = self.space.interp_w(Function, w(None), can_be_None=True) + assert res is None + class TestModuleMinimal: def test_sys_exists(self): assert self.space.sys From rxe at codespeak.net Tue Dec 13 11:24:58 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Tue, 13 Dec 2005 11:24:58 +0100 (CET) Subject: [pypy-svn] r21123 - pypy/dist/pypy/translator/c/test Message-ID: <20051213102458.CACCB27B6C@code1.codespeak.net> Author: rxe Date: Tue Dec 13 11:24:57 2005 New Revision: 21123 Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py Log: Move caller out to a global - as I am guessing there will only ever be one per thread?
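[Editorial aside: the Resumable/Tasklet refactoring in this patch can be modelled in a few lines of plain Python. Generators stand in for the stackless continuations: each `yield` plays the role of `schedule()`/`suspend()`. In the real patch the caller continuation is kept in a single per-thread slot (`threadlocals.cc`); generators manage that return path implicitly, so the slot is not needed here. All names below are illustrative, not PyPy's actual code.]

```python
# Toy model of the Resumable/Tasklet API using generators in place of
# stackless continuations.  Hypothetical sketch, not PyPy's implementation.

counter = [0]

class Resumable:
    def __init__(self):
        self.alive = False

    def fn(self):
        # overridden in subclasses; a generator yielding once per "step"
        return iter(())

    def start(self):
        self._steps = self.fn()
        self.resume()

    def resume(self):
        # advance to the next yield; the tasklet dies when fn() is exhausted
        try:
            next(self._steps)
            self.alive = True
        except StopIteration:
            self.alive = False

class Counting(Resumable):
    def fn(self):
        for i in range(5):
            counter[0] += 1
            yield              # schedule(): give the other tasklets a turn

def run(tasklets):
    # round-robin scheduler: keep resuming until every tasklet has finished
    while tasklets:
        for t in tasklets:
            t.resume()
        tasklets = [t for t in tasklets if t.alive]

tasklets = [Counting() for _ in range(3)]
for t in tasklets:
    t.start()
run(list(tasklets))
print(counter[0])   # 3 tasklets x 5 increments -> 15
```

Subclassing with an overridden `fn()` is the same API shape the patch moves the tests to (`class Tasklet1(Tasklet): def fn(self): ...` instead of `Tasklet(simple)`).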
Modified: pypy/dist/pypy/translator/c/test/test_tasklets.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_tasklets.py (original) +++ pypy/dist/pypy/translator/c/test/test_tasklets.py Tue Dec 13 11:24:57 2005 @@ -15,7 +15,7 @@ gcpolicy = BoehmGcPolicy # count of loops in tests (set lower to speed up) -loops = 1000 +loops = 1 def debug(s): os.write(2, "%s\n" % s) @@ -57,6 +57,10 @@ # ____________________________________________________________ +class ThreadLocals(object): + pass +threadlocals = ThreadLocals() + class Resumable(object): def __init__(self): self.alive = False @@ -65,13 +69,13 @@ self.resumable = self._start() def _start(self): - self.caller = yield_current_frame_to_caller() + threadlocals.cc = yield_current_frame_to_caller() self.fn() - return self.caller + return threadlocals.cc def suspend(self): # we suspend ourself - self.caller = self.caller.switch() + threadlocals.cc = threadlocals.cc.switch() def resume(self): # the caller resumes me @@ -82,6 +86,7 @@ pass class Tasklet(Resumable): + def __init__(self): Resumable.__init__(self) self.blocked = 0 @@ -106,12 +111,6 @@ def resume(self): assert not self.remove Resumable.resume(self) - - # not sure what to do with alive yetXXX - - #XXX arggh - why NOT?? - #if not alive: - # self.caller = # None / NULL return self.alive and not self.remove class Channel: @@ -281,7 +280,7 @@ def test_schedule_remove(): - class Tasklet1(Tasklet): + class Tasklet1(Tasklet): def fn(self): for ii in range(20): if ii < 10: @@ -506,7 +505,7 @@ self.redirect_fn = fn def fn(self): self.redirect_fn() - + def tasklet(fn): return TaskletAsFunction(fn) From arigo at codespeak.net Tue Dec 13 12:11:42 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 13 Dec 2005 12:11:42 +0100 (CET) Subject: [pypy-svn] r21124 - in pypy/dist/pypy/jit: . 
test Message-ID: <20051213111142.9E00E27B6C@code1.codespeak.net> Author: arigo Date: Tue Dec 13 12:11:40 2005 New Revision: 21124 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_jit_tl.py pypy/dist/pypy/jit/test/test_llabstractinterp.py Log: Try to propagate non-concrete constants, until they merge. Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Tue Dec 13 12:11:40 2005 @@ -81,7 +81,11 @@ return self.copy_v def getruntimevars(self, memo): - return [self.copy_v] + if (isinstance(self.copy_v, Variable) or + self not in memo.propagate_as_constants): + return [self.copy_v] + else: + return [] # we propagate this constant as a constant def maybe_get_constant(self): if isinstance(self.copy_v, Constant): @@ -90,13 +94,23 @@ return None def with_fresh_variables(self, memo): - return LLRuntimeValue(self.getconcretetype()) + # don't use memo.seen here: shared variables must become distinct + if (isinstance(self.copy_v, Variable) or + self not in memo.propagate_as_constants): + return LLRuntimeValue(self.getconcretetype()) + else: + return self # we are allowed to propagate this constant def match(self, other, memo): - # Note: the meaning of match() is actually to see if calling - # with_fresh_variables() on both 'self' and 'other' would give the - # same result. This is why any two LLRuntimeValues match each other. 
- return isinstance(other, LLRuntimeValue) + if not isinstance(other, LLRuntimeValue): + return False + if isinstance(self.copy_v, Variable): + return True + if self.copy_v == other.copy_v: + memo.propagate_as_constant[other] = True # exact match + else: + memo.exact_match = False + return True ll_no_return_value = LLRuntimeValue(const(None, lltype.Void)) @@ -152,11 +166,11 @@ self.fields[name] = a_value def with_fresh_variables(self, memo): - if self in memo: - return memo[self] # already seen + if self in memo.seen: + return memo.seen[self] # already seen else: result = virtualcontainer(self.T, self.a_length) - memo[self] = result + memo.seen[self] = result if self.parent is not None: # build the parent first -- note that # parent.with_fresh_variables() will pick up 'result' again, @@ -221,8 +235,8 @@ def getruntimevars(self, memo): result = [] - if self not in memo: - memo[self] = True + if self not in memo.seen: + memo.seen[self] = True if self.parent is not None: result.extend(self.parent.getruntimevars(memo)) for name in self.names: @@ -232,12 +246,14 @@ def match(self, other, memo): if self.__class__ is not other.__class__: return False - if (False, self) in memo: - return other is memo[False, self] - if (True, other) in memo: - return self is memo[True, other] - memo[False, self] = other - memo[True, other] = self + + if self in memo.self_alias: + return other is memo.self_alias[self] + if other in memo.other_alias: + return self is memo.other_alias[other] + memo.self_alias[self] = other + memo.other_alias[other] = self + assert self.T == other.T if self.a_length is not None: if not self.a_length.match(other.a_length, memo): @@ -370,6 +386,7 @@ class LLBlockState(LLState): """Entry state of a block, as a combination of LLAbstractValues for its input arguments.""" + propagate_as_constants = {} def localkey(self): return (self.origblock,) @@ -396,13 +413,16 @@ # ____________________________________________________________ class Policy(object): - def 
__init__(self, inlining=False): + def __init__(self, inlining=False, const_propagate=False): self.inlining = inlining + self.const_propagate = const_propagate + +best_policy = Policy(inlining=True, const_propagate=True) class LLAbstractInterp(object): - def __init__(self, policy=Policy()): + def __init__(self, policy=best_policy): self.graphs = [] self.graphstates = {} # {origgraph: {BlockState: GraphState}} self.pendingstates = {} # {Link-or-GraphState: next-BlockState} @@ -441,7 +461,8 @@ def schedule(self, inputstate): #print "SCHEDULE", args_a, origblock state = self.schedule_getstate(inputstate) - args_v = inputstate.getruntimevars({}) + memo = VarMemo(state.propagate_as_constants) + args_v = inputstate.getruntimevars(memo) newlink = Link(args_v, None) self.pendingstates[newlink] = state return newlink @@ -451,13 +472,24 @@ pendingstates = self.blocks.setdefault(inputstate.key(), []) # try to match the input state with an existing one for state in pendingstates: - if state.match(inputstate, {}): + memo = MatchMemo() + if state.match(inputstate, memo): # already matched - return state - else: - # cache and return this new state - pendingstates.append(inputstate) - return inputstate + if memo.exact_match: + return state # exact match + if not self.policy.const_propagate: + return state # all constants will be generalized anyway + # partial match: in the old state, some constants need to + # be turned into variables. 
XXX patch oldstate.block to point + # to the new state, as in the flow object space + inputstate.propagate_as_constants = memo.propagate_as_constants + break + else: + if self.policy.const_propagate: + inputstate.propagate_as_constants = ALL + # cache and return this new state + pendingstates.append(inputstate) + return inputstate class GraphState(object): @@ -618,8 +650,10 @@ def __init__(self, interp, initialstate): self.interp = interp - self.runningstate = initialstate.with_fresh_variables({}) - self.newinputargs = self.runningstate.getruntimevars({}) + memo = VarMemo(initialstate.propagate_as_constants) + self.runningstate = initialstate.with_fresh_variables(memo) + memo = VarMemo(initialstate.propagate_as_constants) + self.newinputargs = self.runningstate.getruntimevars(memo) # {Variables-of-origblock: a_value} self.bindings = self.runningstate.getbindings() self.residual_operations = [] @@ -885,7 +919,7 @@ def op_keepalive(self, op, a_ptr): if isinstance(a_ptr, LLVirtualStruct): - for v in a_ptr.getruntimevars({}): + for v in a_ptr.getruntimevars(VarMemo()): if isinstance(v, Variable) and not v.concretetype._is_atomic(): op = SpaceOperation('keepalive', [v], newvar(lltype.Void)) print 'virtual:', op @@ -898,6 +932,23 @@ def __init__(self, link): self.link = link +class MatchMemo(object): + def __init__(self): + self.exact_match = True + self.propagate_as_constant = {} + self.self_alias = {} + self.other_alias = {} + +class VarMemo(object): + def __init__(self, propagate_as_constants={}): + self.seen = {} + self.propagate_as_constants = propagate_as_constants + +class ALL(object): + def __contains__(self, other): + return True +ALL = ALL() + def live_variables(block, position): # return a list of all variables alive in the block at the beginning of Modified: pypy/dist/pypy/jit/test/test_jit_tl.py ============================================================================== --- pypy/dist/pypy/jit/test/test_jit_tl.py (original) +++ 
pypy/dist/pypy/jit/test/test_jit_tl.py Tue Dec 13 12:11:40 2005 @@ -3,7 +3,7 @@ import py from pypy.translator.translator import TranslationContext from pypy.jit import tl -from pypy.jit.llabstractinterp import LLAbstractInterp, Policy +from pypy.jit.llabstractinterp import LLAbstractInterp from pypy.rpython.rstr import string_repr from pypy.rpython.llinterp import LLInterpreter #from pypy.translator.backendopt import inline @@ -22,7 +22,7 @@ def jit_tl(code): - interp = LLAbstractInterp(Policy(inlining=True)) + interp = LLAbstractInterp() hints = {0: string_repr.convert_const(code), 1: 0} graph2 = interp.eval(graph1, hints) Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/test/test_llabstractinterp.py (original) +++ pypy/dist/pypy/jit/test/test_llabstractinterp.py Tue Dec 13 12:11:40 2005 @@ -62,6 +62,7 @@ return graph2, insns P_INLINE = Policy(inlining=True) +P_CONST_INLINE = Policy(inlining=True, const_propagate=True) def test_simple(): @@ -313,3 +314,11 @@ return ll2(x, y - z) graph2, insns = abstrinterp(ll1, [3, 4, 5], [1, 2], policy=P_INLINE) assert insns == {'int_add': 1} + +def test_const_propagate(): + def ll_add(x, y): + return x + y + def ll1(x): + return ll_add(x, 42) + graph2, insns = abstrinterp(ll1, [3], [0], policy=P_CONST_INLINE) + assert insns == {} From arigo at codespeak.net Tue Dec 13 12:12:38 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 13 Dec 2005 12:12:38 +0100 (CET) Subject: [pypy-svn] r21125 - in pypy/dist/pypy: interpreter module/__builtin__ module/_sre module/recparser objspace/std Message-ID: <20051213111238.3ABE027B6C@code1.codespeak.net> Author: arigo Date: Tue Dec 13 12:12:35 2005 New Revision: 21125 Modified: pypy/dist/pypy/interpreter/baseobjspace.py pypy/dist/pypy/interpreter/function.py pypy/dist/pypy/interpreter/gateway.py pypy/dist/pypy/interpreter/main.py 
pypy/dist/pypy/interpreter/nestedscope.py pypy/dist/pypy/interpreter/pyopcode.py pypy/dist/pypy/interpreter/typedef.py pypy/dist/pypy/module/__builtin__/importing.py pypy/dist/pypy/module/_sre/interp_sre.py pypy/dist/pypy/module/recparser/pyparser.py pypy/dist/pypy/objspace/std/marshal_impl.py Log: Use the new space.interp_w() instead of space.interpclass_w() where it makes sense to do so. Modified: pypy/dist/pypy/interpreter/baseobjspace.py ============================================================================== --- pypy/dist/pypy/interpreter/baseobjspace.py (original) +++ pypy/dist/pypy/interpreter/baseobjspace.py Tue Dec 13 12:12:35 2005 @@ -406,7 +406,7 @@ return None obj = self.interpclass_w(w_obj) if not isinstance(obj, RequiredClass): # or obj is None - msg = "expected a %s, got %s instead" % ( + msg = "'%s' object expected, got '%s' instead" % ( RequiredClass.typedef.name, w_obj.getclass(self).getname(self, '?')) raise OperationError(self.w_TypeError, self.wrap(msg)) Modified: pypy/dist/pypy/interpreter/function.py ============================================================================== --- pypy/dist/pypy/interpreter/function.py (original) +++ pypy/dist/pypy/interpreter/function.py Tue Dec 13 12:12:35 2005 @@ -50,9 +50,7 @@ def descr_method__new__(space, w_subtype, w_code, w_globals, w_name=None, w_argdefs=None, w_closure=None): - code = space.interpclass_w(w_code) - if code is None or not isinstance(code, Code): - raise OperationError(space.w_TypeError, space.wrap("expected code")) + code = space.interp_w(Code, w_code) if not space.is_true(space.isinstance(w_globals, space.w_dict)): raise OperationError(space.w_TypeError, space.wrap("expected dict")) if not space.is_w(w_name, space.w_None): @@ -79,12 +77,7 @@ raise OperationError(space.w_ValueError, space.wrap("no closure needed")) elif nfreevars != n: raise OperationError(space.w_ValueError, space.wrap("closure is wrong size")) - closure = [] - for w_cell in closure_w: - cell = 
space.interpclass_w(w_cell) - if not isinstance(cell, Cell): - raise OperationError(space.w_TypeError, space.wrap("non-cell in closure")) - closure.append(cell) + closure = [space.interp_w(Cell, w_cell) for w_cell in closure_w] func = space.allocate_instance(Function, w_subtype) Function.__init__(func, space, code, w_globals, defs_w, closure, name) return space.wrap(func) @@ -153,9 +146,7 @@ def fset_func_code(space, self, w_code): from pypy.interpreter.pycode import PyCode - code = space.interpclass_w(w_code) - if not isinstance(code, Code): - raise OperationError(space.w_TypeError, space.wrap("func_code must be set to a code object") ) + code = space.interp_w(Code, w_code) closure_len = 0 if self.closure: closure_len = len(self.closure) @@ -320,10 +311,7 @@ self.w_module = func.w_module def descr_method__new__(space, w_subtype, w_func): - func = space.interpclass_w(w_func) - if func is None or not isinstance(func, Function): - raise OperationError(space.w_TypeError, - space.wrap("expected a function object")) + func = space.interp_w(Function, w_func) bltin = space.allocate_instance(BuiltinFunction, w_subtype) BuiltinFunction.__init__(bltin, func) return space.wrap(bltin) Modified: pypy/dist/pypy/interpreter/gateway.py ============================================================================== --- pypy/dist/pypy/interpreter/gateway.py (original) +++ pypy/dist/pypy/interpreter/gateway.py Tue Dec 13 12:12:35 2005 @@ -165,13 +165,8 @@ name = el.__name__ cur = emit_sig.through_scope_w emit_sig.setfastscope.append( - "obj = self.space.interpclass_w(scope_w[%d])" % cur) - emit_sig.setfastscope.append( - "if obj is None or not isinstance(obj, %s):" % name) - emit_sig.setfastscope.append( - " raise OperationError(self.space.w_TypeError,self.space.wrap('expected %%s' %% %s.typedef.name ))" % name) # xxx + "obj = self.space.interp_w(%s, scope_w[%d])" % (name, cur)) emit_sig.miniglobals[name] = el - emit_sig.miniglobals['OperationError'] = OperationError 
emit_sig.setfastscope.append( "self.%s_arg%d = obj" % (name,cur)) emit_sig.through_scope_w += 1 @@ -374,7 +369,7 @@ # It is a list of types or singleton objects: # baseobjspace.ObjSpace is used to specify the space argument # baseobjspace.W_Root is for wrapped arguments to keep wrapped - # baseobjspace.Wrappable subclasses imply interpclass_w and a typecheck + # baseobjspace.Wrappable subclasses imply interp_w and a typecheck # argument.Arguments is for a final rest arguments Arguments object # 'args_w' for unpacktuple applied to rest arguments # 'w_args' for rest arguments passed as wrapped tuple Modified: pypy/dist/pypy/interpreter/main.py ============================================================================== --- pypy/dist/pypy/interpreter/main.py (original) +++ pypy/dist/pypy/interpreter/main.py Tue Dec 13 12:12:35 2005 @@ -20,8 +20,7 @@ w = space.wrap w_code = space.builtin.call('compile', w(source), w(filename), w(cmd), w(0), w(0)) - pycode = space.interpclass_w(w_code) - assert isinstance(pycode, eval.Code) + pycode = space.interp_w(eval.Code, w_code) return pycode Modified: pypy/dist/pypy/interpreter/nestedscope.py ============================================================================== --- pypy/dist/pypy/interpreter/nestedscope.py (original) +++ pypy/dist/pypy/interpreter/nestedscope.py Tue Dec 13 12:12:35 2005 @@ -166,24 +166,15 @@ def MAKE_CLOSURE(f, numdefaults): w_codeobj = f.valuestack.pop() - codeobj = f.space.interpclass_w(w_codeobj) - assert isinstance(codeobj, pycode.PyCode) + codeobj = f.space.interp_w(pycode.PyCode, w_codeobj) if codeobj.magic >= 0xa0df281: # CPython 2.5 AST branch merge w_freevarstuple = f.valuestack.pop() - freevars = [] - for cell in f.space.unpacktuple(w_freevarstuple): - cell = f.space.interpclass_w(cell) - if not isinstance(cell, Cell): - raise pyframe.BytecodeCorruption - freevars.append(cell) + freevars = [f.space.interp_w(Cell, cell) + for cell in f.space.unpacktuple(w_freevarstuple)] else: nfreevars = 
len(codeobj.co_freevars) - freevars = [] - for i in range(nfreevars): - cell = f.space.interpclass_w(f.valuestack.pop()) - if not isinstance(cell, Cell): - raise pyframe.BytecodeCorruption - freevars.append(cell) + freevars = [f.space.interp_w(Cell, f.valuestack.pop()) + for i in range(nfreevars)] freevars.reverse() defaultarguments = [f.valuestack.pop() for i in range(numdefaults)] defaultarguments.reverse() Modified: pypy/dist/pypy/interpreter/pyopcode.py ============================================================================== --- pypy/dist/pypy/interpreter/pyopcode.py (original) +++ pypy/dist/pypy/interpreter/pyopcode.py Tue Dec 13 12:12:35 2005 @@ -6,7 +6,7 @@ from pypy.interpreter.error import OperationError from pypy.interpreter.baseobjspace import UnpackValueError -from pypy.interpreter import gateway, function +from pypy.interpreter import gateway, function, eval from pypy.interpreter import pyframe, pytraceback from pypy.interpreter.miscutils import InitializedClass from pypy.interpreter.argument import Arguments @@ -360,9 +360,8 @@ plain = f.w_locals is not None and f.space.is_w(w_locals, f.w_locals) if plain: w_locals = f.getdictscope() - pycode = f.space.interpclass_w(w_prog) - assert isinstance(pycode, PyCode) - pycode.exec_code(f.space, w_globals, w_locals) + co = f.space.interp_w(eval.Code, w_prog) + co.exec_code(f.space, w_globals, w_locals) if plain: f.setdictscope(w_locals) @@ -692,8 +691,7 @@ def MAKE_FUNCTION(f, numdefaults): w_codeobj = f.valuestack.pop() - codeobj = f.space.interpclass_w(w_codeobj) - assert isinstance(codeobj, PyCode) + codeobj = f.space.interp_w(PyCode, w_codeobj) defaultarguments = [f.valuestack.pop() for i in range(numdefaults)] defaultarguments.reverse() fn = function.Function(f.space, codeobj, f.w_globals, defaultarguments) Modified: pypy/dist/pypy/interpreter/typedef.py ============================================================================== --- pypy/dist/pypy/interpreter/typedef.py (original) +++ 
pypy/dist/pypy/interpreter/typedef.py Tue Dec 13 12:12:35 2005 @@ -2,6 +2,7 @@ """ +import py from pypy.interpreter.gateway import interp2app from pypy.interpreter.argument import Arguments from pypy.interpreter.baseobjspace import Wrappable, W_Root, ObjSpace @@ -184,34 +185,30 @@ if isinstance(cls, str): #print "len(self.args): raise OperationError( space.w_IndexError, space.wrap("Invalid index") ) self.args.insert( idx, rule ) @@ -308,11 +302,9 @@ return space.wrap(self.args[idx]) def descr_kleenestar___setitem__(self, space, idx, w_rule ): - rule = space.interpclass_w(w_rule) if idx!=0: raise OperationError( space.w_ValueError, space.wrap("KleeneStar only support one child")) - if not isinstance( rule, GrammarElement ): - raise OperationError( space.w_TypeError, space.wrap("Need a GrammarElement instance") ) + rule = space.interp_w(GrammarElement, w_rule) self.args[idx] = rule KleeneStar.descr_kleenestar___getitem__ = descr_kleenestar___getitem__ Modified: pypy/dist/pypy/objspace/std/marshal_impl.py ============================================================================== --- pypy/dist/pypy/objspace/std/marshal_impl.py (original) +++ pypy/dist/pypy/objspace/std/marshal_impl.py Tue Dec 13 12:12:35 2005 @@ -369,8 +369,7 @@ def marshal_w_pycode(space, w_pycode, m): m.start(TYPE_CODE) # see pypy.interpreter.pycode for the layout - x = space.interpclass_w(w_pycode) - assert isinstance(x, PyCode) + x = space.interp_w(PyCode, w_pycode) m.put_int(x.co_argcount) m.put_int(x.co_nlocals) m.put_int(x.co_stacksize) From cfbolz at codespeak.net Tue Dec 13 12:19:14 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Tue, 13 Dec 2005 12:19:14 +0100 (CET) Subject: [pypy-svn] r21126 - pypy/dist/pypy/doc/statistic Message-ID: <20051213111914.A9B5227B6C@code1.codespeak.net> Author: cfbolz Date: Tue Dec 13 12:19:13 2005 New Revision: 21126 Modified: pypy/dist/pypy/doc/statistic/post.txt Log: fix off by one error in mailing list data Modified: 
pypy/dist/pypy/doc/statistic/post.txt ============================================================================== --- pypy/dist/pypy/doc/statistic/post.txt (original) +++ pypy/dist/pypy/doc/statistic/post.txt Tue Dec 13 12:19:13 2005 @@ -1,36 +1,36 @@ Posts to mailing lists -month, pypy-dev, pypy-svn -2003-2, 371, 0 -2003-3, 187, 0 -2003-4, 76, 0 -2003-5, 36, 0 -2003-6, 32, 304 -2003-7, 115, 301 -2003-8, 101, 46 -2003-9, 24, 7 -2003-10, 115, 200 -2003-11, 112, 579 -2003-12, 49, 52 -2004-1, 62, 380 -2004-2, 49, 18 -2004-3, 29, 5 -2004-4, 13, 68 -2004-5, 55, 200 -2004-6, 24, 82 -2004-7, 52, 332 -2004-8, 80, 107 -2004-9, 38, 23 -2004-10, 23, 44 -2004-11, 11, 14 -2004-12, 79, 400 -2005-1, 58, 77 -2005-2, 71, 371 -2005-3, 95, 358 -2005-4, 54, 342 -2005-5, 66, 472 -2005-6, 108, 731 -2005-7, 54, 732 -2005-8, 71, 922 -2005-9, 68, 1038 -2005-10, 162, 559 -2005-11, 109, 737 +month,pypy-dev,pypy-svn +2003-1,371,0 +2003-2,187,0 +2003-3,76,0 +2003-4,36,0 +2003-5,32,304 +2003-6,115,301 +2003-7,101,46 +2003-8,24,7 +2003-9,115,200 +2003-10,112,579 +2003-11,49,52 +2003-12,62,380 +2004-1,49,18 +2004-2,29,5 +2004-3,13,68 +2004-4,55,200 +2004-5,24,82 +2004-6,52,332 +2004-7,80,107 +2004-8,38,23 +2004-9,23,44 +2004-10,11,14 +2004-11,79,400 +2004-12,58,77 +2005-1,71,371 +2005-2,95,358 +2005-3,54,342 +2005-4,66,472 +2005-5,108,731 +2005-6,54,732 +2005-7,71,922 +2005-8,68,1038 +2005-9,162,559 +2005-10,109,737 From arigo at codespeak.net Tue Dec 13 13:04:07 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 13 Dec 2005 13:04:07 +0100 (CET) Subject: [pypy-svn] r21127 - in pypy/dist/pypy/jit: . 
test Message-ID: <20051213120407.A4FA527B69@code1.codespeak.net> Author: arigo Date: Tue Dec 13 13:04:05 2005 New Revision: 21127 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_llabstractinterp.py Log: More on eager constant propagation: don't be too eager :-) This fixes the merging detection, and adds a second phase after a graph is complete to "compactify" it by linking blocks for old states directly to their generalized counterpart. Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Tue Dec 13 13:04:05 2005 @@ -107,7 +107,7 @@ if isinstance(self.copy_v, Variable): return True if self.copy_v == other.copy_v: - memo.propagate_as_constant[other] = True # exact match + memo.propagate_as_constants[other] = True # exact match else: memo.exact_match = False return True @@ -316,6 +316,7 @@ class LLState(LLAbstractValue): """Entry state of a block, as a combination of LLAbstractValues for its input arguments. Abstract base class.""" + generalized_by = None def __init__(self, a_back, args_a, origblock): self.a_back = a_back @@ -377,7 +378,15 @@ def resolveblock(self, newblock): #print "RESOLVING BLOCK", newblock - self.copyblock = newblock + if self.copyblock is not None: + # uncommon case: must patch the existing Block + assert len(self.copyblock.inputargs) == len(newblock.inputargs) + self.copyblock.inputargs = newblock.inputargs + self.copyblock.operations = newblock.operations + self.copyblock.exitswitch = newblock.exitswitch + self.copyblock.recloseblock(*newblock.exits) + else: + self.copyblock = newblock def getbindings(self): return dict(zip(self.getlivevars(), self.args_a)) @@ -471,7 +480,7 @@ # NOTA BENE: copyblocks can get shared between different copygraphs! 
pendingstates = self.blocks.setdefault(inputstate.key(), []) # try to match the input state with an existing one - for state in pendingstates: + for i, state in enumerate(pendingstates): memo = MatchMemo() if state.match(inputstate, memo): # already matched @@ -480,16 +489,18 @@ if not self.policy.const_propagate: return state # all constants will be generalized anyway # partial match: in the old state, some constants need to - # be turned into variables. XXX patch oldstate.block to point - # to the new state, as in the flow object space + # be turned into variables. inputstate.propagate_as_constants = memo.propagate_as_constants - break + # The generalized state replaces the existing one. + pendingstates[i] = inputstate + state.generalized_by = inputstate + return inputstate else: + # cache and return this new state if self.policy.const_propagate: inputstate.propagate_as_constants = ALL - # cache and return this new state - pendingstates.append(inputstate) - return inputstate + pendingstates.append(inputstate) + return inputstate class GraphState(object): @@ -534,21 +545,25 @@ self.flowin(state) next.settarget(state.copyblock) for link in state.copyblock.exits: - if link not in seen: - seen[link] = True - if link.target is None or link.target.operations != (): + if link.target is None or link.target.operations != (): + if link not in seen: + seen[link] = True pending.append(link) + else: + # link.target is a return or except block; make sure + # that it is really the one from 'graph' -- by patching + # 'graph' if necessary. + if len(link.target.inputargs) == 1: + self.a_return = state.args_a[0] + graph.returnblock = link.target + elif len(link.target.inputargs) == 2: + graph.exceptblock = link.target else: - # link.target is a return or except block; make sure - # that it is really the one from 'graph' -- by patching - # 'graph' if necessary. 
- if len(link.target.inputargs) == 1: - self.a_return = state.args_a[0] - graph.returnblock = link.target - elif len(link.target.inputargs) == 2: - graph.exceptblock = link.target - else: - raise Exception("uh?") + raise Exception("uh?") + + if interp.policy.const_propagate: + self.compactify(seen) + # the graph should be complete now; sanity-check try: checkgraph(graph) @@ -563,6 +578,25 @@ join_blocks(graph) self.state = "after" + def compactify(self, links): + # remove the parts of the graph that use constants that were later + # generalized + interp = self.interp + for link in links: + oldstate = interp.pendingstates[link] + if oldstate.generalized_by is not None: + newstate = oldstate.generalized_by + while newstate.generalized_by: + newstate = newstate.generalized_by + # Patch oldstate.block to point to the new state, + # as in the flow object space + builder = BlockBuilder(self, oldstate) + memo = VarMemo(newstate.propagate_as_constants) + args_v = builder.runningstate.getruntimevars(memo) + oldlink = Link(args_v, newstate.copyblock) + oldblock = builder.buildblock(None, [oldlink]) + oldstate.resolveblock(oldblock) + def flowin(self, state): # flow in the block assert isinstance(state, LLBlockState) @@ -935,7 +969,7 @@ class MatchMemo(object): def __init__(self): self.exact_match = True - self.propagate_as_constant = {} + self.propagate_as_constants = {} self.self_alias = {} self.other_alias = {} Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/test/test_llabstractinterp.py (original) +++ pypy/dist/pypy/jit/test/test_llabstractinterp.py Tue Dec 13 13:04:05 2005 @@ -322,3 +322,14 @@ return ll_add(x, 42) graph2, insns = abstrinterp(ll1, [3], [0], policy=P_CONST_INLINE) assert insns == {} + +def test_dont_unroll_loop(): + def ll_factorial(n): + i = 1 + result = 1 + while i < n: + i += 1 + result *= i + return result + graph2, insns = 
abstrinterp(ll_factorial, [7], [], policy=P_CONST_INLINE) + assert insns == {'int_lt': 1, 'int_add': 1, 'int_mul': 1} From cfbolz at codespeak.net Tue Dec 13 13:41:52 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Tue, 13 Dec 2005 13:41:52 +0100 (CET) Subject: [pypy-svn] r21128 - pypy/dist/pypy/doc/statistic Message-ID: <20051213124152.A001627B69@code1.codespeak.net> Author: cfbolz Date: Tue Dec 13 13:41:51 2005 New Revision: 21128 Added: pypy/dist/pypy/doc/statistic/rebin.py Log: add script to rebin data Added: pypy/dist/pypy/doc/statistic/rebin.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/doc/statistic/rebin.py Tue Dec 13 13:41:51 2005 @@ -0,0 +1,22 @@ +import py + +chunks = 7 + +p = py.path.local(py.std.sys.argv[1]) + +data = [l.strip() for l in p.readlines()] + +result = data[:2] +data = data[2:] + +acc = 0 +for i, line in enumerate(data): + print line + line = line.split(',') + acc += int(line[1]) + if i % chunks == chunks - 1: + line[1] = str(acc) + result.append(", ".join(line)) + acc = 0 + +p.write("\n".join(result)) From arigo at codespeak.net Tue Dec 13 17:50:55 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 13 Dec 2005 17:50:55 +0100 (CET) Subject: [pypy-svn] r21141 - in pypy/dist/pypy: jit objspace/flow rpython translator translator/backendopt translator/c translator/js translator/llvm translator/llvm/backendopt Message-ID: <20051213165055.88DEC27B66@code1.codespeak.net> Author: arigo Date: Tue Dec 13 17:50:51 2005 New Revision: 21141 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/objspace/flow/flowcontext.py pypy/dist/pypy/objspace/flow/model.py pypy/dist/pypy/rpython/llinterp.py pypy/dist/pypy/rpython/rtyper.py pypy/dist/pypy/translator/annrpython.py pypy/dist/pypy/translator/backendopt/inline.py pypy/dist/pypy/translator/backendopt/propagate.py pypy/dist/pypy/translator/c/funcgen.py 
pypy/dist/pypy/translator/geninterplevel.py pypy/dist/pypy/translator/js/funcnode.py pypy/dist/pypy/translator/llvm/backendopt/exception.py pypy/dist/pypy/translator/llvm/funcnode.py pypy/dist/pypy/translator/simplify.py pypy/dist/pypy/translator/transform.py pypy/dist/pypy/translator/unsimplify.py Log: Replaced the many calls 'Constant(last_exception)' with a prebuilt 'c_last_exception'. Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Tue Dec 13 17:50:51 2005 @@ -1,7 +1,7 @@ import operator from pypy.objspace.flow.model import Variable, Constant, SpaceOperation from pypy.objspace.flow.model import Block, Link, FunctionGraph -from pypy.objspace.flow.model import checkgraph, last_exception +from pypy.objspace.flow.model import checkgraph, c_last_exception from pypy.rpython.lltypesystem import lltype from pypy.translator.simplify import eliminate_empty_blocks, join_blocks @@ -654,7 +654,7 @@ # later completion if origblock.exitswitch is None: links = origblock.exits - elif origblock.exitswitch == Constant(last_exception): + elif origblock.exitswitch == c_last_exception: XXX else: a = builder.bindings[origblock.exitswitch] Modified: pypy/dist/pypy/objspace/flow/flowcontext.py ============================================================================== --- pypy/dist/pypy/objspace/flow/flowcontext.py (original) +++ pypy/dist/pypy/objspace/flow/flowcontext.py Tue Dec 13 17:50:51 2005 @@ -233,7 +233,7 @@ else: yield 'last_exception', Variable('last_exception') yield 'last_exc_value', Variable('last_exc_value') - outcome = self.guessbool(Constant(last_exception), + outcome = self.guessbool(c_last_exception, cases = [None] + list(classes), replace_last_variable_except_in_first_case = replace_exc_values) if outcome is None: Modified: pypy/dist/pypy/objspace/flow/model.py 
============================================================================== --- pypy/dist/pypy/objspace/flow/model.py (original) +++ pypy/dist/pypy/objspace/flow/model.py Tue Dec 13 17:50:51 2005 @@ -357,6 +357,7 @@ return self.__name__ last_exception = Atom('last_exception') +c_last_exception = Constant(last_exception) # if Block().exitswitch == Constant(last_exception), it means that we are # interested in catching the exception that the *last operation* of the # block could raise. The exitcases of the links are None for no exception @@ -504,7 +505,7 @@ assert len(block.exits) <= 1 if block.exits: assert block.exits[0].exitcase is None - elif block.exitswitch == Constant(last_exception): + elif block.exitswitch == c_last_exception: assert len(block.operations) >= 1 # check if an exception catch is done on a reasonable # operation Modified: pypy/dist/pypy/rpython/llinterp.py ============================================================================== --- pypy/dist/pypy/rpython/llinterp.py (original) +++ pypy/dist/pypy/rpython/llinterp.py Tue Dec 13 17:50:51 2005 @@ -1,4 +1,4 @@ -from pypy.objspace.flow.model import FunctionGraph, Constant, Variable, last_exception +from pypy.objspace.flow.model import FunctionGraph, Constant, Variable, c_last_exception from pypy.rpython.rarithmetic import intmask, r_uint, ovfcheck, r_longlong from pypy.rpython.lltypesystem import lltype from pypy.rpython.memory import lladdress @@ -163,7 +163,7 @@ is None, values is the concrete return value. 
""" self.curr_block = block - catch_exception = block.exitswitch == Constant(last_exception) + catch_exception = block.exitswitch == c_last_exception e = None try: Modified: pypy/dist/pypy/rpython/rtyper.py ============================================================================== --- pypy/dist/pypy/rpython/rtyper.py (original) +++ pypy/dist/pypy/rpython/rtyper.py Tue Dec 13 17:50:51 2005 @@ -17,7 +17,7 @@ from pypy.annotation.pairtype import pair from pypy.annotation import model as annmodel from pypy.objspace.flow.model import Variable, Constant -from pypy.objspace.flow.model import SpaceOperation, last_exception +from pypy.objspace.flow.model import SpaceOperation, c_last_exception from pypy.rpython.lltypesystem.lltype import \ Signed, Unsigned, Float, Char, Bool, Void, \ LowLevelType, Ptr, ContainerType, \ @@ -302,7 +302,7 @@ if (pos is not None and pos != len(newops)-1): # this is for the case where the llop that raises the exceptions # is not the last one in the list. - assert block.exitswitch == Constant(last_exception) + assert block.exitswitch == c_last_exception noexclink = block.exits[0] assert noexclink.exitcase is None if pos == "removed": @@ -342,7 +342,7 @@ if isinstance(block.exitswitch, Variable): r_case = self.bindingrepr(block.exitswitch) else: - assert block.exitswitch == Constant(last_exception) + assert block.exitswitch == c_last_exception r_case = rclass.get_type_repr(self) link.llexitcase = r_case.convert_const(link.exitcase) else: @@ -408,7 +408,7 @@ for op in block.operations[:-1]: yield HighLevelOp(self, op, [], llops) # look for exception links for the last operation - if block.exitswitch == Constant(last_exception): + if block.exitswitch == c_last_exception: exclinks = block.exits[1:] else: exclinks = [] Modified: pypy/dist/pypy/translator/annrpython.py ============================================================================== --- pypy/dist/pypy/translator/annrpython.py (original) +++ pypy/dist/pypy/translator/annrpython.py Tue 
Dec 13 17:50:51 2005 @@ -7,7 +7,7 @@ from pypy.annotation.bookkeeper import Bookkeeper from pypy.objspace.flow.model import Variable, Constant from pypy.objspace.flow.model import FunctionGraph -from pypy.objspace.flow.model import last_exception, checkgraph +from pypy.objspace.flow.model import c_last_exception, checkgraph import py log = py.log.Producer("annrpython") py.log.setconsumer("annrpython", ansi_log) @@ -190,7 +190,6 @@ elif isinstance(arg, Constant): #if arg.value is undefined_value: # undefined local variables # return annmodel.SomeImpossibleValue() - assert not arg.value is last_exception return self.bookkeeper.immutablevalue(arg.value) else: raise TypeError, 'Variable or Constant expected, got %r' % (arg,) @@ -434,7 +433,7 @@ self.why_not_annotated[block] = sys.exc_info() if (e.op is block.operations[-1] and - block.exitswitch == Constant(last_exception)): + block.exitswitch == c_last_exception): # this is the case where the last operation of the block will # always raise an exception which is immediately caught by # an exception handler. We then only follow the exceptional @@ -471,7 +470,7 @@ # filter out those exceptions which cannot # occour for this specific, typed operation. 
- if block.exitswitch == Constant(last_exception): + if block.exitswitch == c_last_exception: op = block.operations[-1] if op.opname in annmodel.BINARY_OPERATIONS: arg1 = self.binding(op.args[0]) Modified: pypy/dist/pypy/translator/backendopt/inline.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/inline.py (original) +++ pypy/dist/pypy/translator/backendopt/inline.py Tue Dec 13 17:50:51 2005 @@ -3,7 +3,7 @@ from pypy.translator.simplify import remove_identical_vars from pypy.translator.unsimplify import copyvar, split_block from pypy.objspace.flow.model import Variable, Constant, Block, Link -from pypy.objspace.flow.model import SpaceOperation, last_exception +from pypy.objspace.flow.model import SpaceOperation, c_last_exception from pypy.objspace.flow.model import traverse, mkentrymap, checkgraph from pypy.annotation import model as annmodel from pypy.rpython.lltypesystem.lltype import Bool, typeOf, Void @@ -79,7 +79,7 @@ op = block.operations[index_operation] graph_to_inline = op.args[0].value._obj.graph exception_guarded = False - if (block.exitswitch == Constant(last_exception) and + if (block.exitswitch == c_last_exception and index_operation == len(block.operations) - 1): exception_guarded = True if len(collect_called_functions(graph_to_inline)) != 0: Modified: pypy/dist/pypy/translator/backendopt/propagate.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/propagate.py (original) +++ pypy/dist/pypy/translator/backendopt/propagate.py Tue Dec 13 17:50:51 2005 @@ -1,4 +1,4 @@ -from pypy.objspace.flow.model import Block, Variable, Constant, last_exception +from pypy.objspace.flow.model import Block, Variable, Constant, c_last_exception from pypy.objspace.flow.model import traverse, mkentrymap, checkgraph from pypy.objspace.flow.model import SpaceOperation from pypy.rpython.lltypesystem.lltype import Bool @@ -63,7 
+63,7 @@ return if len(block.exits) != 2: return - if block.exitswitch == Constant(last_exception): + if block.exitswitch == c_last_exception: return if (block.exits[0].args == block.exits[1].args and block.exits[0].target is block.exits[1].target): @@ -172,7 +172,7 @@ called_graph = get_graph(op.args[0], translator) if (called_graph is not None and simplify.has_no_side_effects(translator, called_graph) and - (block.exitswitch != Constant(last_exception) or + (block.exitswitch != c_last_exception or i != len(block.operations) - 1)): args = [arg.value for arg in op.args[1:]] countingframe = CountingLLFrame(called_graph, args, lli) Modified: pypy/dist/pypy/translator/c/funcgen.py ============================================================================== --- pypy/dist/pypy/translator/c/funcgen.py (original) +++ pypy/dist/pypy/translator/c/funcgen.py Tue Dec 13 17:50:51 2005 @@ -3,7 +3,7 @@ from pypy.translator.c.support import cdecl, ErrorValue from pypy.translator.c.support import llvalue_from_constant, gen_assignments from pypy.objspace.flow.model import Variable, Constant, Block -from pypy.objspace.flow.model import traverse, last_exception +from pypy.objspace.flow.model import traverse, c_last_exception from pypy.rpython.lltypesystem.lltype import \ Ptr, PyObject, Void, Bool, pyobjectptr, Struct, Array @@ -270,7 +270,7 @@ for op in gen_link(block.exits[0]): yield op yield '' - elif block.exitswitch == Constant(last_exception): + elif block.exitswitch == c_last_exception: # block catching the exceptions raised by its last operation # we handle the non-exceptional case first link = block.exits[0] Modified: pypy/dist/pypy/translator/geninterplevel.py ============================================================================== --- pypy/dist/pypy/translator/geninterplevel.py (original) +++ pypy/dist/pypy/translator/geninterplevel.py Tue Dec 13 17:50:51 2005 @@ -48,7 +48,7 @@ import cPickle as pickle, __builtin__ from copy_reg import _HEAPTYPE from 
pypy.objspace.flow.model import Variable, Constant, SpaceOperation -from pypy.objspace.flow.model import last_exception, checkgraph +from pypy.objspace.flow.model import c_last_exception, checkgraph from pypy.interpreter.pycode import CO_VARARGS, CO_VARKEYWORDS from types import FunctionType, CodeType, ModuleType from pypy.interpreter.error import OperationError @@ -1264,7 +1264,7 @@ yield " while True:" def render_block(block): - catch_exception = block.exitswitch == Constant(last_exception) + catch_exception = block.exitswitch == c_last_exception regular_op = len(block.operations) - catch_exception # render all but maybe the last op for op in block.operations[:regular_op]: Modified: pypy/dist/pypy/translator/js/funcnode.py ============================================================================== --- pypy/dist/pypy/translator/js/funcnode.py (original) +++ pypy/dist/pypy/translator/js/funcnode.py Tue Dec 13 17:50:51 2005 @@ -1,7 +1,7 @@ import py import sys from pypy.objspace.flow.model import Block, Constant, Variable, Link -from pypy.objspace.flow.model import flatten, mkentrymap, traverse, last_exception +from pypy.objspace.flow.model import flatten, mkentrymap, traverse, c_last_exception from pypy.rpython.lltypesystem import lltype from pypy.translator.js.node import Node from pypy.translator.js.opwriter import OpWriter @@ -30,7 +30,7 @@ for op in block.operations: map(self.db.prepare_arg, op.args) self.db.prepare_arg(op.result) - if block.exitswitch != Constant(last_exception): + if block.exitswitch != c_last_exception: continue for link in block.exits[1:]: self.db.prepare_constant(lltype.typeOf(link.llexitcase), @@ -75,7 +75,7 @@ self.write_block_branches(codewriter, block) def write_block_branches(self, codewriter, block): - if block.exitswitch == Constant(last_exception): + if block.exitswitch == c_last_exception: return if len(block.exits) == 1: codewriter.br_uncond(self.blockindex[block.exits[0].target], block.exits[0]) @@ -87,7 +87,7 @@ def 
write_block_operations(self, codewriter, block): opwriter = OpWriter(self.db, codewriter, self, block) - if block.exitswitch == Constant(last_exception): + if block.exitswitch == c_last_exception: last_op_index = len(block.operations) - 1 else: last_op_index = None Modified: pypy/dist/pypy/translator/llvm/backendopt/exception.py ============================================================================== --- pypy/dist/pypy/translator/llvm/backendopt/exception.py (original) +++ pypy/dist/pypy/translator/llvm/backendopt/exception.py Tue Dec 13 17:50:51 2005 @@ -1,6 +1,6 @@ from pypy.translator.unsimplify import split_block from pypy.objspace.flow.model import Block, Constant, Variable, Link, \ - last_exception, flatten, SpaceOperation + c_last_exception, flatten, SpaceOperation from pypy.annotation import model as annmodel from pypy.rpython.lltypesystem.lltype import Bool @@ -20,7 +20,7 @@ blocks = [x for x in flatten(graph) if isinstance(x, Block)] for block in blocks: last_operation = len(block.operations)-1 - if block.exitswitch == Constant(last_exception): + if block.exitswitch == c_last_exception: last_operation -= 1 for i in range(last_operation, -1, -1): op = block.operations[i] Modified: pypy/dist/pypy/translator/llvm/funcnode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/funcnode.py (original) +++ pypy/dist/pypy/translator/llvm/funcnode.py Tue Dec 13 17:50:51 2005 @@ -1,5 +1,5 @@ from pypy.objspace.flow.model import Block, Constant, Link -from pypy.objspace.flow.model import flatten, mkentrymap, traverse, last_exception +from pypy.objspace.flow.model import flatten, mkentrymap, traverse, c_last_exception from pypy.rpython.lltypesystem import lltype from pypy.translator.llvm.node import LLVMNode, ConstantLLVMNode from pypy.translator.llvm.opwriter import OpWriter @@ -59,7 +59,7 @@ for op in block.operations: map(self.db.prepare_arg, op.args) self.db.prepare_arg(op.result) - if 
block.exitswitch != Constant(last_exception): + if block.exitswitch != c_last_exception: continue for link in block.exits[1:]: self.db.prepare_constant(lltype.typeOf(link.llexitcase), @@ -154,7 +154,7 @@ blocknames = [self.block_to_name[link.prevblock] for link in entrylinks] for i, link in enumerate(entrylinks): #XXX refactor into a transformation - if link.prevblock.exitswitch == Constant(last_exception) and \ + if link.prevblock.exitswitch == c_last_exception and \ link.prevblock.exits[0].target != block: blocknames[i] += '_exception_found_branchto_' + self.block_to_name[block] data.append( (arg, type_, names, blocknames) ) @@ -167,7 +167,7 @@ def write_block_branches(self, codewriter, block): #assert len(block.exits) <= 2 #more exits are possible (esp. in combination with exceptions) - if block.exitswitch == Constant(last_exception): + if block.exitswitch == c_last_exception: #codewriter.comment('FuncNode(ConstantLLVMNode) *last_exception* write_block_branches @%s@' % str(block.exits)) return if len(block.exits) == 1: @@ -179,7 +179,7 @@ def write_block_operations(self, codewriter, block): opwriter = OpWriter(self.db, codewriter, self, block) - if block.exitswitch == Constant(last_exception): + if block.exitswitch == c_last_exception: last_op_index = len(block.operations) - 1 else: last_op_index = None Modified: pypy/dist/pypy/translator/simplify.py ============================================================================== --- pypy/dist/pypy/translator/simplify.py (original) +++ pypy/dist/pypy/translator/simplify.py Tue Dec 13 17:50:51 2005 @@ -7,7 +7,7 @@ from pypy.objspace.flow.model import SpaceOperation from pypy.objspace.flow.model import Variable, Constant, Block, Link -from pypy.objspace.flow.model import last_exception +from pypy.objspace.flow.model import c_last_exception from pypy.objspace.flow.model import checkgraph, traverse, mkentrymap def get_graph(arg, translator): @@ -42,7 +42,7 @@ if isinstance(link, Link): while not 
link.target.operations: if (len(link.target.exits) != 1 and - link.target.exitswitch != Constant(last_exception)): + link.target.exitswitch != c_last_exception): break assert link.target is not link.prevblock, ( "the graph contains an empty infinite loop") @@ -190,7 +190,7 @@ chain of is_/issubtype tests. We collapse them all into the block's single list of exits. """ - clastexc = Constant(last_exception) + clastexc = c_last_exception renaming = {} def rename(v): return renaming.get(v, v) @@ -251,7 +251,7 @@ def remove_dead_exceptions(graph): """Exceptions can be removed if they are unreachable""" - clastexc = Constant(last_exception) + clastexc = c_last_exception def issubclassofmember(cls, seq): for member in seq: @@ -331,7 +331,7 @@ # can we remove this exit without breaking the graph? if len(block.exits) < 2: break - if block.exitswitch == Constant(last_exception): + if block.exitswitch == c_last_exception: if exit.exitcase is None: break if len(block.exits) == 2: @@ -434,7 +434,7 @@ def canremove(op, block): if op.opname not in CanRemove: return False - if block.exitswitch != Constant(last_exception): + if block.exitswitch != c_last_exception: return True # cannot remove the exc-raising operation return op is not block.operations[-1] @@ -544,7 +544,7 @@ graph = get_graph(op.args[0], translator) if (graph is not None and has_no_side_effects(translator, graph) and - (block.exitswitch != Constant(last_exception) or + (block.exitswitch != c_last_exception or i != len(block.operations)- 1)): del block.operations[i] # look for output variables never used Modified: pypy/dist/pypy/translator/transform.py ============================================================================== --- pypy/dist/pypy/translator/transform.py (original) +++ pypy/dist/pypy/translator/transform.py Tue Dec 13 17:50:51 2005 @@ -10,7 +10,7 @@ import types from pypy.objspace.flow.model import SpaceOperation from pypy.objspace.flow.model import Variable, Constant, Link -from 
pypy.objspace.flow.model import last_exception, checkgraph +from pypy.objspace.flow.model import c_last_exception, checkgraph from pypy.annotation import model as annmodel from pypy.rpython.rstack import stack_check @@ -98,7 +98,7 @@ if not block.exits: # oups! cannot reach the end of this block cutoff_alwaysraising_block(self, block) - elif block.exitswitch == Constant(last_exception): + elif block.exitswitch == c_last_exception: # exceptional exit if block.exits[0].exitcase is not None: # killed the non-exceptional path! Modified: pypy/dist/pypy/translator/unsimplify.py ============================================================================== --- pypy/dist/pypy/translator/unsimplify.py (original) +++ pypy/dist/pypy/translator/unsimplify.py Tue Dec 13 17:50:51 2005 @@ -38,7 +38,7 @@ def split_block(translator, graph, block, index): """split a block in two, inserting a proper link between the new blocks""" assert 0 <= index <= len(block.operations) - if block.exitswitch == Constant(last_exception): + if block.exitswitch == c_last_exception: assert index < len(block.operations) #varmap is the map between names in the new and the old block #but only for variables that are produced in the old block and needed in From arigo at codespeak.net Tue Dec 13 19:51:36 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 13 Dec 2005 19:51:36 +0100 (CET) Subject: [pypy-svn] r21143 - pypy/dist/pypy/translator/tool Message-ID: <20051213185136.9649227B69@code1.codespeak.net> Author: arigo Date: Tue Dec 13 19:51:34 2005 New Revision: 21143 Modified: pypy/dist/pypy/translator/tool/make_dot.py Log: Append an underscore to graph name, to avoid collision with the keywords of dot, like 'Node' when we try to see the class ast.Node... 
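The collision mentioned in this log entry is easy to demonstrate outside the translator: dot reserves "node", "edge", "graph", "digraph", "subgraph" and "strict" as keywords, case-insensitively, so a graph named after the class ast.Node yields invalid output. A minimal sketch of the mangling approach the commit adopts (the helper name and the explicit keyword set are illustrative, not part of the commit):

```python
# Graphviz "dot" reserves these words case-insensitively, so a graph
# named "Node" (e.g. derived from the class ast.Node) breaks the output.
DOT_KEYWORDS = {'node', 'edge', 'graph', 'digraph', 'subgraph', 'strict'}

def safe_dot_name(name):
    # Illustrative helper mirroring the r21143 fix: dots become
    # underscores, and a trailing '_' guarantees no keyword clash.
    return name.replace('.', '_') + '_'

header = 'digraph %s {' % safe_dot_name('ast.Node')
```

Unconditionally appending the underscore (rather than renaming only known keywords) keeps the scheme deterministic and also covers names that match a keyword only case-insensitively.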
Modified: pypy/dist/pypy/translator/tool/make_dot.py
==============================================================================
--- pypy/dist/pypy/translator/tool/make_dot.py	(original)
+++ pypy/dist/pypy/translator/tool/make_dot.py	Tue Dec 13 19:51:34 2005
@@ -14,10 +14,10 @@
 class DotGen:
 
     def __init__(self, graphname, rankdir=None):
-        self.graphname = graphname
+        self.graphname = graphname + '_'
         self.lines = []
         self.source = None
-        self.emit("digraph %s {" % graphname)
+        self.emit("digraph %s {" % self.graphname)
         if rankdir:
             self.emit('rankdir="%s"' % rankdir)

@@ -80,7 +80,7 @@
         DotGen.__init__(self, graphname.replace('.', '_'), rankdir)

     def emit_subgraph(self, name, node):
-        name = name.replace('.', '_')
+        name = name.replace('.', '_') + '_'
         self.blocks = {}
         self.func = None
         self.prefix = name

From arigo at codespeak.net  Tue Dec 13 20:01:03 2005
From: arigo at codespeak.net (arigo at codespeak.net)
Date: Tue, 13 Dec 2005 20:01:03 +0100 (CET)
Subject: [pypy-svn] r21144 - in pypy/dist/pypy: annotation tool translator
Message-ID: <20051213190103.0E2E627B6C@code1.codespeak.net>

Author: arigo
Date: Tue Dec 13 20:00:58 2005
New Revision: 21144

Modified:
   pypy/dist/pypy/annotation/bookkeeper.py
   pypy/dist/pypy/annotation/policy.py
   pypy/dist/pypy/annotation/specialize.py
   pypy/dist/pypy/tool/cache.py
   pypy/dist/pypy/translator/annrpython.py
Log:
(pedronis, arigo)

Yet Another Refactoring of The Memo Mess (tm).  There are again two
phases: discovery of the possible argument values (at which point the
specializer returns the most precise annotation as a result, but not a
complete graph to compute it); and then, just before the fixpoint is
reached, the memo tables built so far are "forced" and become graphs
that can be called.

This allows the general N-arguments logic to be written in a
kind-of-sane-but-a-bit-longish way.  It would be easy to add support
for Bool arguments now.
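The two-phase scheme described in this log message can be sketched with plain dictionaries, independent of the annotator. Everything below (class and method names included) is an invented illustration of the idea, not code from the commit: phase one merely records which concrete argument tuples were seen and what the memoized function returns for them; at fixpoint the table is "forced" into a pure lookup, the analogue of building a callable graph.

```python
class MemoSketch:
    """Toy model of the memo specializer's two phases."""
    def __init__(self, func):
        self.func = func
        self.table = {}   # {args-tuple: result}, filled during discovery

    def see(self, *args):
        # phase 1 (discovery): compute and record the concrete result;
        # callers only learn the value, not a callable graph yet
        if args not in self.table:
            self.table[args] = self.func(*args)
        return self.table[args]

    def force(self):
        # phase 2 (at fixpoint): freeze the table into a function that
        # can actually be called
        frozen = dict(self.table)
        return lambda *args: frozen[args]

memo = MemoSketch(lambda cls: cls.__name__)
memo.see(int)
memo.see(str)
lookup = memo.force()
```

The real implementation additionally handles N arguments by taking the cartesian product of the possible values per position, which is what the cartesian_product() generator in the diff below provides.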
Modified: pypy/dist/pypy/annotation/bookkeeper.py
==============================================================================
--- pypy/dist/pypy/annotation/bookkeeper.py	(original)
+++ pypy/dist/pypy/annotation/bookkeeper.py	Tue Dec 13 20:00:58 2005
@@ -168,6 +168,8 @@
         self.pbc_maximal_call_families = UnionFind(description.CallFamily)
         self.emulated_pbc_calls = {}

+        self.all_specializations = {}  # {FuncDesc: specialization-info}
+        self.pending_specializations = []  # list of callbacks
         self.needs_hash_support = {}
         self.needs_generic_instantiate = {}

Modified: pypy/dist/pypy/annotation/policy.py
==============================================================================
--- pypy/dist/pypy/annotation/policy.py	(original)
+++ pypy/dist/pypy/annotation/policy.py	Tue Dec 13 20:00:58 2005
@@ -1,7 +1,7 @@
 # base annotation policy for overrides and specialization
 from pypy.annotation.specialize import default_specialize as default
 from pypy.annotation.specialize import argtype, argvalue, arglistitemtype
-from pypy.annotation.specialize import memo, methodmemo
+from pypy.annotation.specialize import memo
 # for some reason, model must be imported first,
 # or we create a cycle.
from pypy.annotation import model as annmodel @@ -20,8 +20,11 @@ def no_specialization(pol, funcdesc, args_s): return funcdesc.cachedgraph(None) - def compute_at_fixpoint(pol, annotator): - annotator.bookkeeper.compute_at_fixpoint() + def no_more_blocks_to_annotate(pol, annotator): + # hint to all pending specializers that we are done + for callback in annotator.bookkeeper.pending_specializations: + callback() + del annotator.bookkeeper.pending_specializations[:] class AnnotatorPolicy(BasicAnnotatorPolicy): @@ -52,7 +55,6 @@ default_specialize = staticmethod(default) specialize__memo = staticmethod(memo) - specialize__methodmemo = staticmethod(methodmemo) specialize__arg0 = staticmethod(argvalue(0)) specialize__argtype0 = staticmethod(argtype(0)) specialize__arglistitemtype0 = staticmethod(arglistitemtype(0)) Modified: pypy/dist/pypy/annotation/specialize.py ============================================================================== --- pypy/dist/pypy/annotation/specialize.py (original) +++ pypy/dist/pypy/annotation/specialize.py Tue Dec 13 20:00:58 2005 @@ -1,6 +1,9 @@ # specialization support import types +import py from pypy.tool.uid import uid +from pypy.tool.sourcetools import func_with_new_name +from pypy.tool.algo.unionfind import UnionFind from pypy.objspace.flow.model import Block, Link, Variable, SpaceOperation from pypy.objspace.flow.model import Constant, checkgraph @@ -54,126 +57,298 @@ # ____________________________________________________________________________ # specializations +class MemoTable: + def __init__(self, funcdesc, args, value): + self.funcdesc = funcdesc + self.table = {args: value} + self.graph = None + + def update(self, other): + self.table.update(other.table) + self.graph = None # just in case + + fieldnamecounter = 0 + + def getuniquefieldname(self, descs): + name = self.funcdesc.name + fieldname = 'memofield_%s_%d' % (name, MemoTable.fieldnamecounter) + MemoTable.fieldnamecounter += 1 + # look for name clashes + for desc in 
descs: + try: + desc.read_attribute(fieldname) + except AttributeError: + pass # no clash + else: + # clash! try again... + return self.getuniquefieldname(descs) + else: + return fieldname + + def finish(self): + from pypy.annotation.model import unionof + # list of which argument positions can take more than one value + example_args, example_value = self.table.iteritems().next() + nbargs = len(example_args) + # list of sets of possible argument values -- one set per argument index + sets = [{} for i in range(nbargs)] + for args in self.table: + for i in range(nbargs): + sets[i][args[i]] = True + + bookkeeper = self.funcdesc.bookkeeper + annotator = bookkeeper.annotator + name = self.funcdesc.name + argnames = ['a%d' % i for i in range(nbargs)] + + def make_helper(firstarg, expr, miniglobals): + source = """ + def f(%s): + return %s + """ % (', '.join(argnames[firstarg:]), expr) + exec py.code.Source(source).compile() in miniglobals + f = miniglobals['f'] + return func_with_new_name(f, 'memo_%s_%d' % (name, firstarg)) + + def make_constant_subhelper(firstarg, result): + # make a function that just returns the constant answer 'result' + f = make_helper(firstarg, 'result', {'result': result}) + f.constant_result = result + return f + + def make_subhelper(args_so_far=()): + firstarg = len(args_so_far) + if firstarg == nbargs: + # no argument left, return the known result + # (or a dummy value if none corresponds exactly) + result = self.table.get(args_so_far, example_value) + return make_constant_subhelper(firstarg, result) + else: + nextargvalues = list(sets[len(args_so_far)]) + nextfns = [make_subhelper(args_so_far + (arg,)) + for arg in nextargvalues] + # do all graphs return a constant? + try: + constants = [fn.constant_result for fn in nextfns] + except AttributeError: + constants = None # one of the 'fn' has no constant_result + + # is there actually only one possible value for the current arg? 
+ if len(nextargvalues) == 1: + if constants: # is the result a constant? + result = constants[0] + return make_constant_subhelper(firstarg, result) + else: + # ignore the first argument and just call the subhelper + expr = 'subhelper(%s)' % ( + ', '.join(argnames[firstarg+1:]),) + return make_helper(firstarg, expr, + {'subhelper': nextfns[0]}) + else: + descs = [bookkeeper.getdesc(pbc) for pbc in nextargvalues] + fieldname = self.getuniquefieldname(descs) + expr = 'getattr(%s, %r)' % (argnames[firstarg], + fieldname) + if constants: + # instead of calling these subhelpers indirectly, + # we store what they would return directly in the + # pbc memo fields + store = constants + else: + store = nextfns + # call the result of the getattr() + expr += '(%s)' % (', '.join(argnames[firstarg+1:]),) + + # store the memo field values + for desc, value_to_store in zip(descs, store): + desc.create_new_attribute(fieldname, value_to_store) + + return make_helper(firstarg, expr, {}) + + entrypoint = make_subhelper(args_so_far = ()) + self.graph = annotator.translator.buildflowgraph(entrypoint) + + # schedule this new graph for being annotated + args_s = [] + for set in sets: + values_s = [bookkeeper.immutablevalue(x) for x in set] + args_s.append(unionof(*values_s)) + annotator.addpendinggraph(self.graph, args_s) + + def memo(funcdesc, arglist_s): - """NOT_RPYTHON""" - from pypy.annotation.model import SomePBC, SomeImpossibleValue + from pypy.annotation.model import SomePBC, SomeImpossibleValue, unionof # call the function now, and collect possible results + argvalues = [] for s in arglist_s: if not isinstance(s, SomePBC): if isinstance(s, SomeImpossibleValue): return s # we will probably get more possible args later raise Exception("memo call: argument must be a class or a frozen " "PBC, got %r" % (s,)) - if len(arglist_s) != 1: - raise Exception("memo call: only 1 argument functions supported" - " at the moment (%r)" % (funcdesc,)) - s, = arglist_s - from pypy.annotation.model 
import SomeImpossibleValue - func = funcdesc.pyobj - if func is None: - raise Exception("memo call: no Python function object to call (%r)" % - (funcdesc,)) - return memo1(funcdesc, func, s) - -# XXX OBSCURE to support methodmemo()... needs to find something more -# reasonable :-( -KEY_NUMBERS = {} - -def memo1(funcdesc, func, s, key='memo1'): - from pypy.annotation.model import SomeImpossibleValue - # compute the concrete results and store them directly on the descs, - # using a strange attribute name - num = KEY_NUMBERS.setdefault(key, len(KEY_NUMBERS)) - attrname = '$memo%d_%d_%s' % (uid(funcdesc), num, funcdesc.name) - for desc in s.descriptions: - s_result = desc.s_read_attribute(attrname) - if isinstance(s_result, SomeImpossibleValue): - # first time we see this 'desc' + assert not s.can_be_None, "memo call: arguments must never be None" + values = [] + for desc in s.descriptions: if desc.pyobj is None: raise Exception("memo call with a class or PBC that has no " "corresponding Python object (%r)" % (desc,)) - result = func(desc.pyobj) - desc.create_new_attribute(attrname, result) - # get or build the graph of the function that reads this strange attr - def memoized(x, y=None): - return getattr(x, attrname) - def builder(translator, func): - return translator.buildflowgraph(memoized) # instead of 'func' - return funcdesc.cachedgraph(key, alt_name='memo_%s' % funcdesc.name, - builder=builder) - -def methodmemo(funcdesc, arglist_s): - """NOT_RPYTHON""" - from pypy.annotation.model import SomePBC, SomeImpossibleValue - # call the function now, and collect possible results - for s in arglist_s: - if not isinstance(s, SomePBC): - if isinstance(s, SomeImpossibleValue): - return s # we will probably get more possible args later - raise Exception("method-memo call: argument must be a class or" - " a frozen PBC, got %r" % (s,)) - if len(arglist_s) != 2: - raise Exception("method-memo call: expected 2 arguments function" - " at the moment (%r)" % (funcdesc,)) - from 
pypy.annotation.model import SomeImpossibleValue - from pypy.annotation.description import FrozenDesc - func = funcdesc.pyobj - if func is None: - raise Exception("method-memo call: no Python function object to call" - " (%r)" % (funcdesc,)) - # compute the concrete results and store them directly on the descs, - # using a strange attribute name. The goal is to store in the pbcs of - # 's1' under the common 'attrname' a reader function; each reader function - # will read a field 'attrname2' from the pbcs of 's2', where 'attrname2' - # differs for each pbc of 's1'. This is all specialized also - # considering the type of s1 to support return value - # polymorphism. - s1, s2 = arglist_s - s1_type = s1.knowntype - if s2.is_constant(): - return memo1(funcdesc, lambda val1: func(val1, s2.const), - s1, ('memo1of2', s1_type, Constant(s2.const))) - memosig = "%d_%d_%s" % (uid(funcdesc), uid(s1_type), funcdesc.name) - - attrname = '$memoreader%s' % memosig - for desc1 in s1.descriptions: - attrname2 = '$memofield%d_%s' % (uid(desc1), memosig) - s_reader = desc1.s_read_attribute(attrname) - if isinstance(s_reader, SomeImpossibleValue): - # first time we see this 'desc1': sanity-check 'desc1' and - # create its reader function - assert isinstance(desc1, FrozenDesc), ( - "XXX not implemented: memo call with a class as first arg") - if desc1.pyobj is None: - raise Exception("method-memo call with a class or PBC" - " that has no " - "corresponding Python object (%r)" % (desc1,)) - def reader(y, attrname2=attrname2): - return getattr(y, attrname2) - desc1.create_new_attribute(attrname, reader) - for desc2 in s2.descriptions: - s_result = desc2.s_read_attribute(attrname2) - if isinstance(s_result, SomeImpossibleValue): - # first time we see this 'desc1+desc2' combination - if desc2.pyobj is None: - raise Exception("method-memo call with a class or PBC" - " that has no " - "corresponding Python object (%r)" % (desc2,)) - # concrete call, to get the concrete result - result = 
func(desc1.pyobj, desc2.pyobj) - #print 'func(%s, %s) -> %s' % (desc1.pyobj, desc2.pyobj, result) - #print 'goes into %s.%s'% (desc2,attrname2) - #print 'with reader %s.%s'% (desc1,attrname) - desc2.create_new_attribute(attrname2, result) - # get or build the graph of the function that reads this indirect - # settings of attributes - def memoized(x, y): - reader_fn = getattr(x, attrname) - return reader_fn(y) - def builder(translator, func): - return translator.buildflowgraph(memoized) # instead of 'func' - return funcdesc.cachedgraph(s1_type, alt_name='memo_%s' % funcdesc.name, - builder=builder) + values.append(desc.pyobj) + argvalues.append(values) + # the list of all possible tuples of arguments to give to the memo function + possiblevalues = cartesian_product(argvalues) + + # a MemoTable factory -- one MemoTable per family of arguments that can + # be called together, merged via a UnionFind. + bookkeeper = funcdesc.bookkeeper + try: + memotables = bookkeeper.all_specializations[funcdesc] + except KeyError: + func = funcdesc.pyobj + if func is None: + raise Exception("memo call: no Python function object to call " + "(%r)" % (funcdesc,)) + + def compute_one_result(args): + value = func(*args) + return MemoTable(funcdesc, args, value) + + def finish(): + for memotable in memotables.infos(): + memotable.finish() + + memotables = UnionFind(compute_one_result) + bookkeeper.all_specializations[funcdesc] = memotables + bookkeeper.pending_specializations.append(finish) + + # merge the MemoTables for the individual argument combinations + firstvalues = possiblevalues.next() + _, _, memotable = memotables.find(firstvalues) + for values in possiblevalues: + _, _, memotable = memotables.union(firstvalues, values) + + if memotable.graph is not None: + return memotable.graph # if already computed + else: + # otherwise, for now, return the union of each possible result + return unionof(*[bookkeeper.immutablevalue(v) + for v in memotable.table.values()]) + +def 
cartesian_product(lstlst): + if not lstlst: + yield () + return + for tuple_tail in cartesian_product(lstlst[1:]): + for value in lstlst[0]: + yield (value,) + tuple_tail + +## """NOT_RPYTHON""" +## if len(arglist_s) != 1: +## raise Exception("memo call: only 1 argument functions supported" +## " at the moment (%r)" % (funcdesc,)) +## s, = arglist_s +## from pypy.annotation.model import SomeImpossibleValue +## return memo1(funcdesc, func, s) + +### XXX OBSCURE to support methodmemo()... needs to find something more +### reasonable :-( +##KEY_NUMBERS = {} + +##def memo1(funcdesc, func, s, key='memo1'): +## from pypy.annotation.model import SomeImpossibleValue +## # compute the concrete results and store them directly on the descs, +## # using a strange attribute name +## num = KEY_NUMBERS.setdefault(key, len(KEY_NUMBERS)) +## attrname = '$memo%d_%d_%s' % (uid(funcdesc), num, funcdesc.name) +## for desc in s.descriptions: +## s_result = desc.s_read_attribute(attrname) +## if isinstance(s_result, SomeImpossibleValue): +## # first time we see this 'desc' +## if desc.pyobj is None: +## raise Exception("memo call with a class or PBC that has no " +## "corresponding Python object (%r)" % (desc,)) +## result = func(desc.pyobj) +## desc.create_new_attribute(attrname, result) +## # get or build the graph of the function that reads this strange attr +## def memoized(x, y=None): +## return getattr(x, attrname) +## def builder(translator, func): +## return translator.buildflowgraph(memoized) # instead of 'func' +## return funcdesc.cachedgraph(key, alt_name='memo_%s' % funcdesc.name, +## builder=builder) + +##def methodmemo(funcdesc, arglist_s): +## """NOT_RPYTHON""" +## from pypy.annotation.model import SomePBC, SomeImpossibleValue +## # call the function now, and collect possible results +## for s in arglist_s: +## if not isinstance(s, SomePBC): +## if isinstance(s, SomeImpossibleValue): +## return s # we will probably get more possible args later +## raise 
Exception("method-memo call: argument must be a class or" +## " a frozen PBC, got %r" % (s,)) +## if len(arglist_s) != 2: +## raise Exception("method-memo call: expected 2 arguments function" +## " at the moment (%r)" % (funcdesc,)) +## from pypy.annotation.model import SomeImpossibleValue +## from pypy.annotation.description import FrozenDesc +## func = funcdesc.pyobj +## if func is None: +## raise Exception("method-memo call: no Python function object to call" +## " (%r)" % (funcdesc,)) +## # compute the concrete results and store them directly on the descs, +## # using a strange attribute name. The goal is to store in the pbcs of +## # 's1' under the common 'attrname' a reader function; each reader function +## # will read a field 'attrname2' from the pbcs of 's2', where 'attrname2' +## # differs for each pbc of 's1'. This is all specialized also +## # considering the type of s1 to support return value +## # polymorphism. +## s1, s2 = arglist_s +## s1_type = s1.knowntype +## if s2.is_constant(): +## return memo1(funcdesc, lambda val1: func(val1, s2.const), +## s1, ('memo1of2', s1_type, Constant(s2.const))) +## memosig = "%d_%d_%s" % (uid(funcdesc), uid(s1_type), funcdesc.name) + +## attrname = '$memoreader%s' % memosig +## for desc1 in s1.descriptions: +## attrname2 = '$memofield%d_%s' % (uid(desc1), memosig) +## s_reader = desc1.s_read_attribute(attrname) +## if isinstance(s_reader, SomeImpossibleValue): +## # first time we see this 'desc1': sanity-check 'desc1' and +## # create its reader function +## assert isinstance(desc1, FrozenDesc), ( +## "XXX not implemented: memo call with a class as first arg") +## if desc1.pyobj is None: +## raise Exception("method-memo call with a class or PBC" +## " that has no " +## "corresponding Python object (%r)" % (desc1,)) +## def reader(y, attrname2=attrname2): +## return getattr(y, attrname2) +## desc1.create_new_attribute(attrname, reader) +## for desc2 in s2.descriptions: +## s_result = desc2.s_read_attribute(attrname2) 
+## if isinstance(s_result, SomeImpossibleValue): +## # first time we see this 'desc1+desc2' combination +## if desc2.pyobj is None: +## raise Exception("method-memo call with a class or PBC" +## " that has no " +## "corresponding Python object (%r)" % (desc2,)) +## # concrete call, to get the concrete result +## result = func(desc1.pyobj, desc2.pyobj) +## #print 'func(%s, %s) -> %s' % (desc1.pyobj, desc2.pyobj, result) +## #print 'goes into %s.%s'% (desc2,attrname2) +## #print 'with reader %s.%s'% (desc1,attrname) +## desc2.create_new_attribute(attrname2, result) +## # get or build the graph of the function that reads this indirect +## # settings of attributes +## def memoized(x, y): +## reader_fn = getattr(x, attrname) +## return reader_fn(y) +## def builder(translator, func): +## return translator.buildflowgraph(memoized) # instead of 'func' +## return funcdesc.cachedgraph(s1_type, alt_name='memo_%s' % funcdesc.name, +## builder=builder) + def argvalue(i): def specialize_argvalue(funcdesc, args_s): Modified: pypy/dist/pypy/tool/cache.py ============================================================================== --- pypy/dist/pypy/tool/cache.py (original) +++ pypy/dist/pypy/tool/cache.py Tue Dec 13 20:00:58 2005 @@ -37,7 +37,7 @@ result = self._build(key) self.content[key] = result return result - getorbuild._annspecialcase_ = "specialize:methodmemo" + getorbuild._annspecialcase_ = "specialize:memo" def _freeze_(self): # needs to be SomePBC, but otherwise we can't really freeze the Modified: pypy/dist/pypy/translator/annrpython.py ============================================================================== --- pypy/dist/pypy/translator/annrpython.py (original) +++ pypy/dist/pypy/translator/annrpython.py Tue Dec 13 20:00:58 2005 @@ -94,7 +94,6 @@ def build_graph_types(self, flowgraph, inputcells): checkgraph(flowgraph) - self._register_returnvar(flowgraph) nbarg = len(flowgraph.getargs()) if len(inputcells) != nbarg: @@ -102,7 +101,7 @@ flowgraph, nbarg, 
len(inputcells))) # register the entry point - self.addpendingblock(flowgraph, flowgraph.startblock, inputcells) + self.addpendinggraph(flowgraph, inputcells) # recursively proceed until no more pending block is left self.complete() return self.binding(flowgraph.getreturnvar(), extquery=True) @@ -128,6 +127,10 @@ #___ medium-level interface ____________________________ + def addpendinggraph(self, flowgraph, inputcells): + self._register_returnvar(flowgraph) + self.addpendingblock(flowgraph, flowgraph.startblock, inputcells) + def addpendingblock(self, graph, block, cells, called_from_graph=None): """Register an entry point into block with the given input cells.""" assert not self.frozen @@ -142,9 +145,13 @@ def complete(self): """Process pending blocks until none is left.""" - while self.pendingblocks: - block, graph = self.pendingblocks.popitem() - self.processblock(graph, block) + while True: + while self.pendingblocks: + block, graph = self.pendingblocks.popitem() + self.processblock(graph, block) + self.policy.no_more_blocks_to_annotate(self) + if not self.pendingblocks: + break # finished if False in self.annotated.values(): if annmodel.DEBUG: for block in self.annotated: @@ -175,7 +182,7 @@ if v not in self.bindings: self.setbinding(v, annmodel.SomeImpossibleValue()) # policy-dependent computation - self.policy.compute_at_fixpoint(self) + self.bookkeeper.compute_at_fixpoint() def binding(self, arg, extquery=False): "Gives the SomeValue corresponding to the given Variable or Constant." 
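The reworked complete() loop in the annrpython.py hunk above follows a general worklist-with-callbacks pattern: drain the pending blocks, then let the end-of-round callbacks (here, the memo-table forcing) schedule more work, and stop only once a round produces nothing new. A generic sketch of that pattern, with illustrative names not taken from the commit:

```python
def run_to_fixpoint(pending, callbacks):
    # Drain the worklist; after each round, give the registered hooks a
    # chance to append fresh work (e.g. newly forced memo graphs), and
    # terminate only when a round leaves the worklist empty.
    processed = []
    while True:
        while pending:
            processed.append(pending.pop())
        for cb in callbacks:
            cb(pending)
        del callbacks[:]   # each hook fires once, like the pending list
        if not pending:
            break
    return processed

pending = ['block1', 'block2']
callbacks = [lambda p: p.append('memo_graph_block')]
done = run_to_fixpoint(pending, callbacks)
```

Firing the callbacks only when the worklist is empty mirrors the "just before fixpoint" timing of r21144: by then all possible argument values have been discovered, so the forced graphs are complete.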
From adim at codespeak.net Tue Dec 13 22:14:15 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Tue, 13 Dec 2005 22:14:15 +0100 (CET) Subject: [pypy-svn] r21146 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20051213211415.9FBE227B69@code1.codespeak.net> Author: adim Date: Tue Dec 13 22:14:14 2005 New Revision: 21146 Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py pypy/dist/pypy/interpreter/astcompiler/ast.txt pypy/dist/pypy/interpreter/astcompiler/astgen.py Log: export __new__ methods to be able to create AST nodes at applevel. Thanks a lot to Armin who helped me to find out why PyPy refused to compile. Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.py Tue Dec 13 22:14:14 2005 @@ -5,7 +5,7 @@ """ from consts import CO_VARARGS, CO_VARKEYWORDS, OP_ASSIGN from pypy.interpreter.baseobjspace import Wrappable -from pypy.interpreter.typedef import TypeDef, GetSetProperty +from pypy.interpreter.typedef import TypeDef, GetSetProperty, interp_attrproperty from pypy.interpreter.gateway import interp2app, W_Root, ObjSpace from pypy.interpreter.argument import Arguments from pypy.interpreter.error import OperationError @@ -69,10 +69,18 @@ args = Arguments(space, [ w_self ]) return space.call_args( w_callable, args ) +def descr_Node_new(space, w_subtype, lineno=-1): + node = space.allocate_instance(Node, w_subtype) + node.lineno = lineno + return space.wrap(node) + Node.typedef = TypeDef('ASTNode', + __new__ = interp2app(descr_Node_new, unwrap_spec=[ObjSpace, W_Root, int]), #__repr__ = interp2app(descr_node_repr, unwrap_spec=['self', ObjSpace] ), getChildNodes = interp2app(Node.descr_getChildNodes, unwrap_spec=[ 'self', ObjSpace ] ), accept = interp2app(descr_node_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + lineno = interp_attrproperty('lineno', 
cls=Node), + filename = interp_attrproperty('filename', cls=Node), ) @@ -119,13 +127,19 @@ return visitor.visitAbstractFunction(self) +def descr_AbstractFunction_new(space, w_subtype, lineno=-1): + self = space.allocate_instance(AbstractFunction, w_subtype) + self.lineno = lineno + return space.wrap(self) + def descr_AbstractFunction_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitAbstractFunction')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) AbstractFunction.typedef = TypeDef('AbstractFunction', Node.typedef, - accept=interp2app(descr_AbstractFunction_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_AbstractFunction_new, unwrap_spec=[ObjSpace, W_Root, int]), + accept=interp2app(descr_AbstractFunction_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), ) class AbstractTest(Node): @@ -146,13 +160,19 @@ return visitor.visitAbstractTest(self) +def descr_AbstractTest_new(space, w_subtype, lineno=-1): + self = space.allocate_instance(AbstractTest, w_subtype) + self.lineno = lineno + return space.wrap(self) + def descr_AbstractTest_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitAbstractTest')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) AbstractTest.typedef = TypeDef('AbstractTest', Node.typedef, - accept=interp2app(descr_AbstractTest_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_AbstractTest_new, unwrap_spec=[ObjSpace, W_Root, int]), + accept=interp2app(descr_AbstractTest_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), ) class BinaryOp(Node): @@ -173,13 +193,19 @@ return visitor.visitBinaryOp(self) +def descr_BinaryOp_new(space, w_subtype, lineno=-1): + self = space.allocate_instance(BinaryOp, w_subtype) + self.lineno = lineno + return space.wrap(self) + def descr_BinaryOp_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, 
space.wrap('visitBinaryOp')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) BinaryOp.typedef = TypeDef('BinaryOp', Node.typedef, - accept=interp2app(descr_BinaryOp_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_BinaryOp_new, unwrap_spec=[ObjSpace, W_Root, int]), + accept=interp2app(descr_BinaryOp_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), ) class Add(BinaryOp): @@ -204,17 +230,20 @@ def fget_left( space, self): return space.wrap(self.left) def fset_left( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.left = obj + self.left = space.interp_w(Node, w_arg, can_be_None=False) def fget_right( space, self): return space.wrap(self.right) def fset_right( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.right = obj + self.right = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_Add_new(space, w_subtype, w_left, w_right, lineno=-1): + self = space.allocate_instance(Add, w_subtype) + left = space.interp_w(Node, w_left, can_be_None=False) + self.left = left + right = space.interp_w(Node, w_right, can_be_None=False) + self.right = right + self.lineno = lineno + return space.wrap(self) def descr_Add_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitAdd')) @@ -222,7 +251,8 @@ return space.call_args(w_callable, args) Add.typedef = TypeDef('Add', BinaryOp.typedef, - accept=interp2app(descr_Add_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Add_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_Add_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), left=GetSetProperty(Add.fget_left, Add.fset_left ), right=GetSetProperty(Add.fget_right, 
Add.fset_right ), ) @@ -251,8 +281,15 @@ return space.newlist( [space.wrap(itm) for itm in self.nodes] ) def fset_nodes( space, self, w_arg): del self.nodes[:] - for w_itm in space.unpackiterable( w_arg ): - self.nodes.append( space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + self.nodes.append( space.interp_w(Node, w_arg)) + +def descr_And_new(space, w_subtype, w_nodes, lineno=-1): + self = space.allocate_instance(And, w_subtype) + nodes = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_nodes)] + self.nodes = nodes + self.lineno = lineno + return space.wrap(self) def descr_And_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitAnd')) @@ -260,7 +297,8 @@ return space.call_args(w_callable, args) And.typedef = TypeDef('And', AbstractTest.typedef, - accept=interp2app(descr_And_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_And_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_And_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), nodes=GetSetProperty(And.fget_nodes, And.fset_nodes ), ) @@ -287,10 +325,7 @@ def fget_expr( space, self): return space.wrap(self.expr) def fset_expr( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.expr = obj + self.expr = space.interp_w(Node, w_arg, can_be_None=False) def fget_attrname( space, self): return space.wrap(self.attrname) def fset_attrname( space, self, w_arg): @@ -300,13 +335,25 @@ def fset_flags( space, self, w_arg): self.flags = space.int_w(w_arg) +def descr_AssAttr_new(space, w_subtype, w_expr, w_attrname, w_flags, lineno=-1): + self = space.allocate_instance(AssAttr, w_subtype) + expr = space.interp_w(Node, w_expr, can_be_None=False) + self.expr = expr + attrname = space.str_w(w_attrname) + self.attrname = attrname + flags = space.int_w(w_flags) + self.flags 
= flags + self.lineno = lineno + return space.wrap(self) + def descr_AssAttr_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitAssAttr')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) AssAttr.typedef = TypeDef('AssAttr', Node.typedef, - accept=interp2app(descr_AssAttr_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_AssAttr_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_AssAttr_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), expr=GetSetProperty(AssAttr.fget_expr, AssAttr.fset_expr ), attrname=GetSetProperty(AssAttr.fget_attrname, AssAttr.fset_attrname ), flags=GetSetProperty(AssAttr.fget_flags, AssAttr.fset_flags ), @@ -330,13 +377,19 @@ return visitor.visitAssSeq(self) +def descr_AssSeq_new(space, w_subtype, lineno=-1): + self = space.allocate_instance(AssSeq, w_subtype) + self.lineno = lineno + return space.wrap(self) + def descr_AssSeq_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitAssSeq')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) AssSeq.typedef = TypeDef('AssSeq', Node.typedef, - accept=interp2app(descr_AssSeq_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_AssSeq_new, unwrap_spec=[ObjSpace, W_Root, int]), + accept=interp2app(descr_AssSeq_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), ) class AssList(AssSeq): @@ -363,8 +416,15 @@ return space.newlist( [space.wrap(itm) for itm in self.nodes] ) def fset_nodes( space, self, w_arg): del self.nodes[:] - for w_itm in space.unpackiterable( w_arg ): - self.nodes.append( space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + self.nodes.append( space.interp_w(Node, w_arg)) + +def descr_AssList_new(space, w_subtype, w_nodes, lineno=-1): + self = space.allocate_instance(AssList, w_subtype) + nodes = [space.interp_w(Node, w_node) for w_node 
in space.unpackiterable(w_nodes)] + self.nodes = nodes + self.lineno = lineno + return space.wrap(self) def descr_AssList_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitAssList')) @@ -372,7 +432,8 @@ return space.call_args(w_callable, args) AssList.typedef = TypeDef('AssList', AssSeq.typedef, - accept=interp2app(descr_AssList_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_AssList_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_AssList_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), nodes=GetSetProperty(AssList.fget_nodes, AssList.fset_nodes ), ) @@ -404,13 +465,23 @@ def fset_flags( space, self, w_arg): self.flags = space.int_w(w_arg) +def descr_AssName_new(space, w_subtype, w_name, w_flags, lineno=-1): + self = space.allocate_instance(AssName, w_subtype) + name = space.str_w(w_name) + self.name = name + flags = space.int_w(w_flags) + self.flags = flags + self.lineno = lineno + return space.wrap(self) + def descr_AssName_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitAssName')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) AssName.typedef = TypeDef('AssName', Node.typedef, - accept=interp2app(descr_AssName_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_AssName_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_AssName_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), name=GetSetProperty(AssName.fget_name, AssName.fset_name ), flags=GetSetProperty(AssName.fget_flags, AssName.fset_flags ), ) @@ -453,8 +524,15 @@ return space.newlist( [space.wrap(itm) for itm in self.nodes] ) def fset_nodes( space, self, w_arg): del self.nodes[:] - for w_itm in space.unpackiterable( w_arg ): - self.nodes.append( space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + self.nodes.append( space.interp_w(Node, w_arg)) + 
+def descr_AssTuple_new(space, w_subtype, w_nodes, lineno=-1): + self = space.allocate_instance(AssTuple, w_subtype) + nodes = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_nodes)] + self.nodes = nodes + self.lineno = lineno + return space.wrap(self) def descr_AssTuple_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitAssTuple')) @@ -462,7 +540,8 @@ return space.call_args(w_callable, args) AssTuple.typedef = TypeDef('AssTuple', AssSeq.typedef, - accept=interp2app(descr_AssTuple_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_AssTuple_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_AssTuple_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), nodes=GetSetProperty(AssTuple.fget_nodes, AssTuple.fset_nodes ), ) @@ -495,23 +574,23 @@ def fget_test( space, self): return space.wrap(self.test) def fset_test( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.test = obj + self.test = space.interp_w(Node, w_arg, can_be_None=False) def fget_fail( space, self): if self.fail is None: return space.w_None else: return space.wrap(self.fail) def fset_fail( space, self, w_arg): - if space.is_w( w_arg, space.w_None ): - self.fail = None - else: - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.fail = obj + self.fail = space.interp_w(Node, w_arg, can_be_None=True) + +def descr_Assert_new(space, w_subtype, w_test, w_fail, lineno=-1): + self = space.allocate_instance(Assert, w_subtype) + test = space.interp_w(Node, w_test, can_be_None=False) + self.test = test + fail = space.interp_w(Node, w_fail, can_be_None=True) + self.fail = fail + self.lineno = lineno + return space.wrap(self) def descr_Assert_accept( space, w_self, w_visitor): w_callable = 
space.getattr(w_visitor, space.wrap('visitAssert')) @@ -519,7 +598,8 @@ return space.call_args(w_callable, args) Assert.typedef = TypeDef('Assert', Node.typedef, - accept=interp2app(descr_Assert_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Assert_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_Assert_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), test=GetSetProperty(Assert.fget_test, Assert.fset_test ), fail=GetSetProperty(Assert.fget_fail, Assert.fset_fail ), ) @@ -553,15 +633,21 @@ return space.newlist( [space.wrap(itm) for itm in self.nodes] ) def fset_nodes( space, self, w_arg): del self.nodes[:] - for w_itm in space.unpackiterable( w_arg ): - self.nodes.append( space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + self.nodes.append( space.interp_w(Node, w_arg)) def fget_expr( space, self): return space.wrap(self.expr) def fset_expr( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.expr = obj + self.expr = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_Assign_new(space, w_subtype, w_nodes, w_expr, lineno=-1): + self = space.allocate_instance(Assign, w_subtype) + nodes = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_nodes)] + self.nodes = nodes + expr = space.interp_w(Node, w_expr, can_be_None=False) + self.expr = expr + self.lineno = lineno + return space.wrap(self) def descr_Assign_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitAssign')) @@ -569,7 +655,8 @@ return space.call_args(w_callable, args) Assign.typedef = TypeDef('Assign', Node.typedef, - accept=interp2app(descr_Assign_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Assign_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_Assign_accept, 
unwrap_spec=[ObjSpace, W_Root, W_Root] ), nodes=GetSetProperty(Assign.fget_nodes, Assign.fset_nodes ), expr=GetSetProperty(Assign.fget_expr, Assign.fset_expr ), ) @@ -597,10 +684,7 @@ def fget_node( space, self): return space.wrap(self.node) def fset_node( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.node = obj + self.node = space.interp_w(Node, w_arg, can_be_None=False) def fget_op( space, self): return space.wrap(self.op) def fset_op( space, self, w_arg): @@ -608,10 +692,18 @@ def fget_expr( space, self): return space.wrap(self.expr) def fset_expr( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.expr = obj + self.expr = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_AugAssign_new(space, w_subtype, w_node, w_op, w_expr, lineno=-1): + self = space.allocate_instance(AugAssign, w_subtype) + node = space.interp_w(Node, w_node, can_be_None=False) + self.node = node + op = space.str_w(w_op) + self.op = op + expr = space.interp_w(Node, w_expr, can_be_None=False) + self.expr = expr + self.lineno = lineno + return space.wrap(self) def descr_AugAssign_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitAugAssign')) @@ -619,7 +711,8 @@ return space.call_args(w_callable, args) AugAssign.typedef = TypeDef('AugAssign', Node.typedef, - accept=interp2app(descr_AugAssign_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_AugAssign_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_AugAssign_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), node=GetSetProperty(AugAssign.fget_node, AugAssign.fset_node ), op=GetSetProperty(AugAssign.fget_op, AugAssign.fset_op ), expr=GetSetProperty(AugAssign.fget_expr, 
AugAssign.fset_expr ), @@ -643,13 +736,19 @@ return visitor.visitUnaryOp(self) +def descr_UnaryOp_new(space, w_subtype, lineno=-1): + self = space.allocate_instance(UnaryOp, w_subtype) + self.lineno = lineno + return space.wrap(self) + def descr_UnaryOp_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitUnaryOp')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) UnaryOp.typedef = TypeDef('UnaryOp', Node.typedef, - accept=interp2app(descr_UnaryOp_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_UnaryOp_new, unwrap_spec=[ObjSpace, W_Root, int]), + accept=interp2app(descr_UnaryOp_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), ) class Backquote(UnaryOp): @@ -673,10 +772,14 @@ def fget_expr( space, self): return space.wrap(self.expr) def fset_expr( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.expr = obj + self.expr = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_Backquote_new(space, w_subtype, w_expr, lineno=-1): + self = space.allocate_instance(Backquote, w_subtype) + expr = space.interp_w(Node, w_expr, can_be_None=False) + self.expr = expr + self.lineno = lineno + return space.wrap(self) def descr_Backquote_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitBackquote')) @@ -684,7 +787,8 @@ return space.call_args(w_callable, args) Backquote.typedef = TypeDef('Backquote', UnaryOp.typedef, - accept=interp2app(descr_Backquote_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Backquote_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_Backquote_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), expr=GetSetProperty(Backquote.fget_expr, Backquote.fset_expr ), ) @@ -706,13 +810,19 @@ return visitor.visitBitOp(self) +def 
descr_BitOp_new(space, w_subtype, lineno=-1): + self = space.allocate_instance(BitOp, w_subtype) + self.lineno = lineno + return space.wrap(self) + def descr_BitOp_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitBitOp')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) BitOp.typedef = TypeDef('BitOp', Node.typedef, - accept=interp2app(descr_BitOp_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_BitOp_new, unwrap_spec=[ObjSpace, W_Root, int]), + accept=interp2app(descr_BitOp_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), ) class Bitand(BitOp): @@ -739,8 +849,15 @@ return space.newlist( [space.wrap(itm) for itm in self.nodes] ) def fset_nodes( space, self, w_arg): del self.nodes[:] - for w_itm in space.unpackiterable( w_arg ): - self.nodes.append( space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + self.nodes.append( space.interp_w(Node, w_arg)) + +def descr_Bitand_new(space, w_subtype, w_nodes, lineno=-1): + self = space.allocate_instance(Bitand, w_subtype) + nodes = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_nodes)] + self.nodes = nodes + self.lineno = lineno + return space.wrap(self) def descr_Bitand_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitBitand')) @@ -748,7 +865,8 @@ return space.call_args(w_callable, args) Bitand.typedef = TypeDef('Bitand', BitOp.typedef, - accept=interp2app(descr_Bitand_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Bitand_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_Bitand_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), nodes=GetSetProperty(Bitand.fget_nodes, Bitand.fset_nodes ), ) @@ -776,8 +894,15 @@ return space.newlist( [space.wrap(itm) for itm in self.nodes] ) def fset_nodes( space, self, w_arg): del self.nodes[:] - for w_itm in space.unpackiterable( w_arg ): - 
self.nodes.append( space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + self.nodes.append( space.interp_w(Node, w_arg)) + +def descr_Bitor_new(space, w_subtype, w_nodes, lineno=-1): + self = space.allocate_instance(Bitor, w_subtype) + nodes = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_nodes)] + self.nodes = nodes + self.lineno = lineno + return space.wrap(self) def descr_Bitor_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitBitor')) @@ -785,7 +910,8 @@ return space.call_args(w_callable, args) Bitor.typedef = TypeDef('Bitor', BitOp.typedef, - accept=interp2app(descr_Bitor_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Bitor_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_Bitor_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), nodes=GetSetProperty(Bitor.fget_nodes, Bitor.fset_nodes ), ) @@ -813,8 +939,15 @@ return space.newlist( [space.wrap(itm) for itm in self.nodes] ) def fset_nodes( space, self, w_arg): del self.nodes[:] - for w_itm in space.unpackiterable( w_arg ): - self.nodes.append( space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + self.nodes.append( space.interp_w(Node, w_arg)) + +def descr_Bitxor_new(space, w_subtype, w_nodes, lineno=-1): + self = space.allocate_instance(Bitxor, w_subtype) + nodes = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_nodes)] + self.nodes = nodes + self.lineno = lineno + return space.wrap(self) def descr_Bitxor_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitBitxor')) @@ -822,7 +955,8 @@ return space.call_args(w_callable, args) Bitxor.typedef = TypeDef('Bitxor', BitOp.typedef, - accept=interp2app(descr_Bitxor_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Bitxor_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_Bitxor_accept, 
unwrap_spec=[ObjSpace, W_Root, W_Root] ), nodes=GetSetProperty(Bitxor.fget_nodes, Bitxor.fset_nodes ), ) @@ -844,13 +978,19 @@ return visitor.visitBreak(self) +def descr_Break_new(space, w_subtype, lineno=-1): + self = space.allocate_instance(Break, w_subtype) + self.lineno = lineno + return space.wrap(self) + def descr_Break_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitBreak')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Break.typedef = TypeDef('Break', Node.typedef, - accept=interp2app(descr_Break_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Break_new, unwrap_spec=[ObjSpace, W_Root, int]), + accept=interp2app(descr_Break_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), ) class CallFunc(Node): @@ -889,42 +1029,40 @@ def fget_node( space, self): return space.wrap(self.node) def fset_node( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.node = obj + self.node = space.interp_w(Node, w_arg, can_be_None=False) def fget_args( space, self): return space.newlist( [space.wrap(itm) for itm in self.args] ) def fset_args( space, self, w_arg): del self.args[:] - for w_itm in space.unpackiterable( w_arg ): - self.args.append( space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + self.args.append( space.interp_w(Node, w_arg)) def fget_star_args( space, self): if self.star_args is None: return space.w_None else: return space.wrap(self.star_args) def fset_star_args( space, self, w_arg): - if space.is_w( w_arg, space.w_None ): - self.star_args = None - else: - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.star_args = obj + self.star_args = space.interp_w(Node, w_arg, can_be_None=True) def fget_dstar_args( space, 
self): if self.dstar_args is None: return space.w_None else: return space.wrap(self.dstar_args) def fset_dstar_args( space, self, w_arg): - if space.is_w( w_arg, space.w_None ): - self.dstar_args = None - else: - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.dstar_args = obj + self.dstar_args = space.interp_w(Node, w_arg, can_be_None=True) + +def descr_CallFunc_new(space, w_subtype, w_node, w_args, w_star_args, w_dstar_args, lineno=-1): + self = space.allocate_instance(CallFunc, w_subtype) + node = space.interp_w(Node, w_node, can_be_None=False) + self.node = node + args = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_args)] + self.args = args + star_args = space.interp_w(Node, w_star_args, can_be_None=True) + self.star_args = star_args + dstar_args = space.interp_w(Node, w_dstar_args, can_be_None=True) + self.dstar_args = dstar_args + self.lineno = lineno + return space.wrap(self) def descr_CallFunc_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitCallFunc')) @@ -932,7 +1070,8 @@ return space.call_args(w_callable, args) CallFunc.typedef = TypeDef('CallFunc', Node.typedef, - accept=interp2app(descr_CallFunc_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_CallFunc_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_CallFunc_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), node=GetSetProperty(CallFunc.fget_node, CallFunc.fset_node ), args=GetSetProperty(CallFunc.fget_args, CallFunc.fset_args ), star_args=GetSetProperty(CallFunc.fget_star_args, CallFunc.fset_star_args ), @@ -976,8 +1115,8 @@ return space.newlist( [space.wrap(itm) for itm in self.bases] ) def fset_bases( space, self, w_arg): del self.bases[:] - for w_itm in space.unpackiterable( w_arg ): - self.bases.append( space.interpclass_w( w_arg ) ) + for w_itm in 
space.unpackiterable(w_arg): + self.bases.append( space.interp_w(Node, w_arg)) def fget_w_doc( space, self): return self.w_doc def fset_w_doc( space, self, w_arg): @@ -985,10 +1124,21 @@ def fget_code( space, self): return space.wrap(self.code) def fset_code( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.code = obj + self.code = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_Class_new(space, w_subtype, w_name, w_bases, w_w_doc, w_code, lineno=-1): + self = space.allocate_instance(Class, w_subtype) + name = space.str_w(w_name) + self.name = name + bases = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_bases)] + self.bases = bases + # This dummy assignment is auto-generated; astgen.py should be fixed to avoid it + w_doc = w_w_doc + self.w_doc = w_doc + code = space.interp_w(Node, w_code, can_be_None=False) + self.code = code + self.lineno = lineno + return space.wrap(self) def descr_Class_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitClass')) @@ -996,7 +1146,8 @@ return space.call_args(w_callable, args) Class.typedef = TypeDef('Class', Node.typedef, - accept=interp2app(descr_Class_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Class_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_Class_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), name=GetSetProperty(Class.fget_name, Class.fset_name ), bases=GetSetProperty(Class.fget_bases, Class.fset_bases ), w_doc=GetSetProperty(Class.fget_w_doc, Class.fset_w_doc ), @@ -1031,9 +1182,7 @@ w_opname = space.getitem( w_obj, space.wrap(0) ) w_node = space.getitem( w_obj, space.wrap(1) ) ops = space.str_w(w_opname) - node = space.interpclass_w( w_node ) - if not isinstance(node, Node): - raise OperationError(space.w_TypeError, space.wrap("ops 
must be a list of (name,node)")) + node = space.interp_w(Node, w_node) self.ops.append( (ops,node) ) @@ -1053,10 +1202,23 @@ def fget_expr( space, self): return space.wrap(self.expr) def fset_expr( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.expr = obj + self.expr = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_Compare_new(space, w_subtype, w_expr, w_ops, lineno=-1): + self = space.allocate_instance(Compare, w_subtype) + self.expr = space.interp_w(Node, w_expr) + ops = [] + for w_tuple in space.unpackiterable(w_ops): + w_opname = space.getitem(w_tuple, space.wrap(0)) + w_node = space.getitem(w_tuple, space.wrap(1)) + opname = space.str_w(w_opname) + node = space.interp_w(Node, w_node) + ops.append((opname, node)) + self.ops = ops + self.lineno = lineno + return space.wrap(self) + + def descr_Compare_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitCompare')) @@ -1064,7 +1226,8 @@ return space.call_args(w_callable, args) Compare.typedef = TypeDef('Compare', Node.typedef, - accept=interp2app(descr_Compare_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Compare_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_Compare_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), expr=GetSetProperty(Compare.fget_expr, Compare.fset_expr ), ops=GetSetProperty(Compare.fget_ops, Compare.fset_ops ), ) @@ -1092,13 +1255,22 @@ def fset_value( space, self, w_arg): self.value = w_arg +def descr_Const_new(space, w_subtype, w_value, lineno=-1): + self = space.allocate_instance(Const, w_subtype) + # This dummy assignment is auto-generated; astgen.py should be fixed to avoid it + value = w_value + self.value = value + self.lineno = lineno + return space.wrap(self) + def descr_Const_accept( space, w_self, w_visitor): w_callable = 
space.getattr(w_visitor, space.wrap('visitConst')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Const.typedef = TypeDef('Const', Node.typedef, - accept=interp2app(descr_Const_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Const_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_Const_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), value=GetSetProperty(Const.fget_value, Const.fset_value ), ) @@ -1120,13 +1292,19 @@ return visitor.visitContinue(self) +def descr_Continue_new(space, w_subtype, lineno=-1): + self = space.allocate_instance(Continue, w_subtype) + self.lineno = lineno + return space.wrap(self) + def descr_Continue_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitContinue')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Continue.typedef = TypeDef('Continue', Node.typedef, - accept=interp2app(descr_Continue_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Continue_new, unwrap_spec=[ObjSpace, W_Root, int]), + accept=interp2app(descr_Continue_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), ) class Decorators(Node): @@ -1153,8 +1331,15 @@ return space.newlist( [space.wrap(itm) for itm in self.nodes] ) def fset_nodes( space, self, w_arg): del self.nodes[:] - for w_itm in space.unpackiterable( w_arg ): - self.nodes.append( space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + self.nodes.append( space.interp_w(Node, w_arg)) + +def descr_Decorators_new(space, w_subtype, w_nodes, lineno=-1): + self = space.allocate_instance(Decorators, w_subtype) + nodes = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_nodes)] + self.nodes = nodes + self.lineno = lineno + return space.wrap(self) def descr_Decorators_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitDecorators')) @@ -1162,7 +1347,8 @@ 
return space.call_args(w_callable, args) Decorators.typedef = TypeDef('Decorators', Node.typedef, - accept=interp2app(descr_Decorators_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Decorators_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_Decorators_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), nodes=GetSetProperty(Decorators.fget_nodes, Decorators.fset_nodes ), ) @@ -1189,14 +1375,13 @@ for w_tup in space.unpackiterable( w_arg ): w_key = space.getitem( w_tup, space.wrap(0) ) w_value = space.getitem( w_tup, space.wrap(1) ) - key = space.interpclass_w( w_key ) - value = space.interpclass_w( w_value ) - if not isinstance( key, Node ) or not isinstance( value, Node ): - raise OperationError(space.w_TypeError, space.wrap("Need a list of (key node, value node)")) + key = space.interp_w(Node, w_key) + value = space.interp_w(Node, w_value) self.items.append( (key,value) ) + - def fget_items( space, self ): + def fget_items(space, self): return space.newlist( [ space.newtuple( [ space.wrap(key), space.wrap(value) ] ) for key, value in self.items ] ) @@ -1208,13 +1393,30 @@ return visitor.visitDict(self) +def descr_Dict_new(space, w_subtype, w_items, lineno=-1): + self = space.allocate_instance(Dict, w_subtype) + items = [] + for w_tuple in space.unpackiterable(w_items): + w_key = space.getitem(w_tuple, space.wrap(0)) + w_value = space.getitem(w_tuple, space.wrap(1)) + key = space.interp_w(Node, w_key) + value = space.interp_w(Node, w_value) + items.append((key, value)) + self.items = items + self.lineno = lineno + return space.wrap(self) + + + + def descr_Dict_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitDict')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Dict.typedef = TypeDef('Dict', Node.typedef, - accept=interp2app(descr_Dict_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Dict_new, 
unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_Dict_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), items=GetSetProperty(Dict.fget_items, Dict.fset_items ), ) @@ -1239,10 +1441,14 @@ def fget_expr( space, self): return space.wrap(self.expr) def fset_expr( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.expr = obj + self.expr = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_Discard_new(space, w_subtype, w_expr, lineno=-1): + self = space.allocate_instance(Discard, w_subtype) + expr = space.interp_w(Node, w_expr, can_be_None=False) + self.expr = expr + self.lineno = lineno + return space.wrap(self) def descr_Discard_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitDiscard')) @@ -1250,7 +1456,8 @@ return space.call_args(w_callable, args) Discard.typedef = TypeDef('Discard', Node.typedef, - accept=interp2app(descr_Discard_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Discard_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_Discard_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), expr=GetSetProperty(Discard.fget_expr, Discard.fset_expr ), ) @@ -1276,17 +1483,20 @@ def fget_left( space, self): return space.wrap(self.left) def fset_left( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.left = obj + self.left = space.interp_w(Node, w_arg, can_be_None=False) def fget_right( space, self): return space.wrap(self.right) def fset_right( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.right = obj + self.right = space.interp_w(Node, w_arg, can_be_None=False) + +def 
descr_Div_new(space, w_subtype, w_left, w_right, lineno=-1): + self = space.allocate_instance(Div, w_subtype) + left = space.interp_w(Node, w_left, can_be_None=False) + self.left = left + right = space.interp_w(Node, w_right, can_be_None=False) + self.right = right + self.lineno = lineno + return space.wrap(self) def descr_Div_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitDiv')) @@ -1294,7 +1504,8 @@ return space.call_args(w_callable, args) Div.typedef = TypeDef('Div', BinaryOp.typedef, - accept=interp2app(descr_Div_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Div_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_Div_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), left=GetSetProperty(Div.fget_left, Div.fset_left ), right=GetSetProperty(Div.fget_right, Div.fset_right ), ) @@ -1317,13 +1528,19 @@ return visitor.visitEllipsis(self) +def descr_Ellipsis_new(space, w_subtype, lineno=-1): + self = space.allocate_instance(Ellipsis, w_subtype) + self.lineno = lineno + return space.wrap(self) + def descr_Ellipsis_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitEllipsis')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Ellipsis.typedef = TypeDef('Ellipsis', Node.typedef, - accept=interp2app(descr_Ellipsis_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Ellipsis_new, unwrap_spec=[ObjSpace, W_Root, int]), + accept=interp2app(descr_Ellipsis_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), ) class Exec(Node): @@ -1359,36 +1576,32 @@ def fget_expr( space, self): return space.wrap(self.expr) def fset_expr( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.expr = obj + self.expr = space.interp_w(Node, w_arg, can_be_None=False) def 
fget_locals( space, self): if self.locals is None: return space.w_None else: return space.wrap(self.locals) def fset_locals( space, self, w_arg): - if space.is_w( w_arg, space.w_None ): - self.locals = None - else: - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.locals = obj + self.locals = space.interp_w(Node, w_arg, can_be_None=True) def fget_globals( space, self): if self.globals is None: return space.w_None else: return space.wrap(self.globals) def fset_globals( space, self, w_arg): - if space.is_w( w_arg, space.w_None ): - self.globals = None - else: - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.globals = obj + self.globals = space.interp_w(Node, w_arg, can_be_None=True) + +def descr_Exec_new(space, w_subtype, w_expr, w_locals, w_globals, lineno=-1): + self = space.allocate_instance(Exec, w_subtype) + expr = space.interp_w(Node, w_expr, can_be_None=False) + self.expr = expr + locals = space.interp_w(Node, w_locals, can_be_None=True) + self.locals = locals + globals = space.interp_w(Node, w_globals, can_be_None=True) + self.globals = globals + self.lineno = lineno + return space.wrap(self) def descr_Exec_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitExec')) @@ -1396,7 +1609,8 @@ return space.call_args(w_callable, args) Exec.typedef = TypeDef('Exec', Node.typedef, - accept=interp2app(descr_Exec_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Exec_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_Exec_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), expr=GetSetProperty(Exec.fget_expr, Exec.fset_expr ), locals=GetSetProperty(Exec.fget_locals, Exec.fset_locals ), globals=GetSetProperty(Exec.fget_globals, Exec.fset_globals ), @@ -1424,17 
+1638,20 @@ def fget_left( space, self): return space.wrap(self.left) def fset_left( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.left = obj + self.left = space.interp_w(Node, w_arg, can_be_None=False) def fget_right( space, self): return space.wrap(self.right) def fset_right( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.right = obj + self.right = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_FloorDiv_new(space, w_subtype, w_left, w_right, lineno=-1): + self = space.allocate_instance(FloorDiv, w_subtype) + left = space.interp_w(Node, w_left, can_be_None=False) + self.left = left + right = space.interp_w(Node, w_right, can_be_None=False) + self.right = right + self.lineno = lineno + return space.wrap(self) def descr_FloorDiv_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitFloorDiv')) @@ -1442,7 +1659,8 @@ return space.call_args(w_callable, args) FloorDiv.typedef = TypeDef('FloorDiv', BinaryOp.typedef, - accept=interp2app(descr_FloorDiv_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_FloorDiv_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_FloorDiv_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), left=GetSetProperty(FloorDiv.fget_left, FloorDiv.fset_left ), right=GetSetProperty(FloorDiv.fget_right, FloorDiv.fset_right ), ) @@ -1482,37 +1700,35 @@ def fget_assign( space, self): return space.wrap(self.assign) def fset_assign( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.assign = obj + self.assign = space.interp_w(Node, w_arg, can_be_None=False) def fget_list( 
space, self): return space.wrap(self.list) def fset_list( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.list = obj + self.list = space.interp_w(Node, w_arg, can_be_None=False) def fget_body( space, self): return space.wrap(self.body) def fset_body( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.body = obj + self.body = space.interp_w(Node, w_arg, can_be_None=False) def fget_else_( space, self): if self.else_ is None: return space.w_None else: return space.wrap(self.else_) def fset_else_( space, self, w_arg): - if space.is_w( w_arg, space.w_None ): - self.else_ = None - else: - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.else_ = obj + self.else_ = space.interp_w(Node, w_arg, can_be_None=True) + +def descr_For_new(space, w_subtype, w_assign, w_list, w_body, w_else_, lineno=-1): + self = space.allocate_instance(For, w_subtype) + assign = space.interp_w(Node, w_assign, can_be_None=False) + self.assign = assign + list = space.interp_w(Node, w_list, can_be_None=False) + self.list = list + body = space.interp_w(Node, w_body, can_be_None=False) + self.body = body + else_ = space.interp_w(Node, w_else_, can_be_None=True) + self.else_ = else_ + self.lineno = lineno + return space.wrap(self) def descr_For_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitFor')) @@ -1520,7 +1736,8 @@ return space.call_args(w_callable, args) For.typedef = TypeDef('For', Node.typedef, - accept=interp2app(descr_For_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_For_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, W_Root, W_Root, int]), + 
accept=interp2app(descr_For_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), assign=GetSetProperty(For.fget_assign, For.fset_assign ), list=GetSetProperty(For.fget_list, For.fset_list ), body=GetSetProperty(For.fget_body, For.fset_body ), @@ -1568,13 +1785,33 @@ def fset_modname( space, self, w_arg): self.modname = space.str_w(w_arg) +def descr_From_new(space, w_subtype, w_modname, w_names, lineno=-1): + self = space.allocate_instance(From, w_subtype) + modname = space.str_w(w_modname) + self.modname = modname + names = [] + for w_tuple in space.unpackiterable(w_names): + w_name = space.getitem(w_tuple, space.wrap(0)) + w_as_name = space.getitem(w_tuple, space.wrap(1)) + name = space.str_w(w_name) + as_name = None + if not space.is_w(w_as_name, space.w_None): + as_name = space.str_w(w_as_name) + names.append((name, as_name)) + self.names = names + self.lineno = lineno + return space.wrap(self) + + + def descr_From_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitFrom')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) From.typedef = TypeDef('From', Node.typedef, - accept=interp2app(descr_From_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_From_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_From_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), modname=GetSetProperty(From.fget_modname, From.fset_modname ), names=GetSetProperty(From.fget_names, From.fset_names ), ) @@ -1630,13 +1867,7 @@ else: return space.wrap(self.decorators) def fset_decorators( space, self, w_arg): - if space.is_w( w_arg, space.w_None ): - self.decorators = None - else: - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.decorators = obj + self.decorators = space.interp_w(Node, w_arg, can_be_None=True) def fget_name( space, self): return 
space.wrap(self.name) def fset_name( space, self, w_arg): @@ -1645,14 +1876,14 @@ return space.newlist( [space.wrap(itm) for itm in self.argnames] ) def fset_argnames( space, self, w_arg): del self.argnames[:] - for w_itm in space.unpackiterable( w_arg ): - self.argnames.append( space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + self.argnames.append( space.interp_w(Node, w_arg)) def fget_defaults( space, self): return space.newlist( [space.wrap(itm) for itm in self.defaults] ) def fset_defaults( space, self, w_arg): del self.defaults[:] - for w_itm in space.unpackiterable( w_arg ): - self.defaults.append( space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + self.defaults.append( space.interp_w(Node, w_arg)) def fget_flags( space, self): return space.wrap(self.flags) def fset_flags( space, self, w_arg): @@ -1664,10 +1895,27 @@ def fget_code( space, self): return space.wrap(self.code) def fset_code( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.code = obj + self.code = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_Function_new(space, w_subtype, w_decorators, w_name, w_argnames, w_defaults, w_flags, w_w_doc, w_code, lineno=-1): + self = space.allocate_instance(Function, w_subtype) + decorators = space.interp_w(Node, w_decorators, can_be_None=True) + self.decorators = decorators + name = space.str_w(w_name) + self.name = name + argnames = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_argnames)] + self.argnames = argnames + defaults = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_defaults)] + self.defaults = defaults + flags = space.int_w(w_flags) + self.flags = flags + # This dummy assingment is auto-generated, astgen.py should be fixed to avoid that + w_doc = w_w_doc + self.w_doc = w_doc + code = space.interp_w(Node, w_code, 
can_be_None=False) + self.code = code + self.lineno = lineno + return space.wrap(self) def descr_Function_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitFunction')) @@ -1675,7 +1923,8 @@ return space.call_args(w_callable, args) Function.typedef = TypeDef('Function', AbstractFunction.typedef, - accept=interp2app(descr_Function_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Function_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, W_Root, W_Root, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_Function_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), decorators=GetSetProperty(Function.fget_decorators, Function.fset_decorators ), name=GetSetProperty(Function.fget_name, Function.fset_name ), argnames=GetSetProperty(Function.fget_argnames, Function.fset_argnames ), @@ -1710,10 +1959,14 @@ def fget_code( space, self): return space.wrap(self.code) def fset_code( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.code = obj + self.code = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_GenExpr_new(space, w_subtype, w_code, lineno=-1): + self = space.allocate_instance(GenExpr, w_subtype) + code = space.interp_w(Node, w_code, can_be_None=False) + self.code = code + self.lineno = lineno + return space.wrap(self) def descr_GenExpr_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitGenExpr')) @@ -1721,7 +1974,8 @@ return space.call_args(w_callable, args) GenExpr.typedef = TypeDef('GenExpr', AbstractFunction.typedef, - accept=interp2app(descr_GenExpr_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_GenExpr_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_GenExpr_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), code=GetSetProperty(GenExpr.fget_code, 
GenExpr.fset_code ), ) @@ -1759,23 +2013,28 @@ def fget_assign( space, self): return space.wrap(self.assign) def fset_assign( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.assign = obj + self.assign = space.interp_w(Node, w_arg, can_be_None=False) def fget_iter( space, self): return space.wrap(self.iter) def fset_iter( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.iter = obj + self.iter = space.interp_w(Node, w_arg, can_be_None=False) def fget_ifs( space, self): return space.newlist( [space.wrap(itm) for itm in self.ifs] ) def fset_ifs( space, self, w_arg): del self.ifs[:] - for w_itm in space.unpackiterable( w_arg ): - self.ifs.append( space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + self.ifs.append( space.interp_w(Node, w_arg)) + +def descr_GenExprFor_new(space, w_subtype, w_assign, w_iter, w_ifs, lineno=-1): + self = space.allocate_instance(GenExprFor, w_subtype) + assign = space.interp_w(Node, w_assign, can_be_None=False) + self.assign = assign + iter = space.interp_w(Node, w_iter, can_be_None=False) + self.iter = iter + ifs = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_ifs)] + self.ifs = ifs + self.lineno = lineno + return space.wrap(self) def descr_GenExprFor_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitGenExprFor')) @@ -1783,7 +2042,8 @@ return space.call_args(w_callable, args) GenExprFor.typedef = TypeDef('GenExprFor', Node.typedef, - accept=interp2app(descr_GenExprFor_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_GenExprFor_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_GenExprFor_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), 
assign=GetSetProperty(GenExprFor.fget_assign, GenExprFor.fset_assign ), iter=GetSetProperty(GenExprFor.fget_iter, GenExprFor.fset_iter ), ifs=GetSetProperty(GenExprFor.fget_ifs, GenExprFor.fset_ifs ), @@ -1810,10 +2070,14 @@ def fget_test( space, self): return space.wrap(self.test) def fset_test( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.test = obj + self.test = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_GenExprIf_new(space, w_subtype, w_test, lineno=-1): + self = space.allocate_instance(GenExprIf, w_subtype) + test = space.interp_w(Node, w_test, can_be_None=False) + self.test = test + self.lineno = lineno + return space.wrap(self) def descr_GenExprIf_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitGenExprIf')) @@ -1821,7 +2085,8 @@ return space.call_args(w_callable, args) GenExprIf.typedef = TypeDef('GenExprIf', Node.typedef, - accept=interp2app(descr_GenExprIf_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_GenExprIf_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_GenExprIf_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), test=GetSetProperty(GenExprIf.fget_test, GenExprIf.fset_test ), ) @@ -1853,16 +2118,22 @@ def fget_expr( space, self): return space.wrap(self.expr) def fset_expr( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.expr = obj + self.expr = space.interp_w(Node, w_arg, can_be_None=False) def fget_quals( space, self): return space.newlist( [space.wrap(itm) for itm in self.quals] ) def fset_quals( space, self, w_arg): del self.quals[:] - for w_itm in space.unpackiterable( w_arg ): - self.quals.append( space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + 
self.quals.append( space.interp_w(Node, w_arg)) + +def descr_GenExprInner_new(space, w_subtype, w_expr, w_quals, lineno=-1): + self = space.allocate_instance(GenExprInner, w_subtype) + expr = space.interp_w(Node, w_expr, can_be_None=False) + self.expr = expr + quals = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_quals)] + self.quals = quals + self.lineno = lineno + return space.wrap(self) def descr_GenExprInner_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitGenExprInner')) @@ -1870,7 +2141,8 @@ return space.call_args(w_callable, args) GenExprInner.typedef = TypeDef('GenExprInner', Node.typedef, - accept=interp2app(descr_GenExprInner_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_GenExprInner_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_GenExprInner_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), expr=GetSetProperty(GenExprInner.fget_expr, GenExprInner.fset_expr ), quals=GetSetProperty(GenExprInner.fget_quals, GenExprInner.fset_quals ), ) @@ -1897,22 +2169,29 @@ def fget_expr( space, self): return space.wrap(self.expr) def fset_expr( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.expr = obj + self.expr = space.interp_w(Node, w_arg, can_be_None=False) def fget_attrname( space, self): return space.wrap(self.attrname) def fset_attrname( space, self, w_arg): self.attrname = space.str_w(w_arg) +def descr_Getattr_new(space, w_subtype, w_expr, w_attrname, lineno=-1): + self = space.allocate_instance(Getattr, w_subtype) + expr = space.interp_w(Node, w_expr, can_be_None=False) + self.expr = expr + attrname = space.str_w(w_attrname) + self.attrname = attrname + self.lineno = lineno + return space.wrap(self) + def descr_Getattr_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, 
space.wrap('visitGetattr')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Getattr.typedef = TypeDef('Getattr', Node.typedef, - accept=interp2app(descr_Getattr_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Getattr_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_Getattr_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), expr=GetSetProperty(Getattr.fget_expr, Getattr.fset_expr ), attrname=GetSetProperty(Getattr.fget_attrname, Getattr.fset_attrname ), ) @@ -1942,13 +2221,21 @@ for itm in space.unpackiterable(w_arg): self.names.append( space.str_w(itm) ) +def descr_Global_new(space, w_subtype, w_names, lineno=-1): + self = space.allocate_instance(Global, w_subtype) + names = [space.str_w(w_str) for w_str in space.unpackiterable(w_names)] + self.names = names + self.lineno = lineno + return space.wrap(self) + def descr_Global_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitGlobal')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Global.typedef = TypeDef('Global', Node.typedef, - accept=interp2app(descr_Global_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Global_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_Global_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), names=GetSetProperty(Global.fget_names, Global.fset_names ), ) @@ -1981,14 +2268,11 @@ for w_tup in space.unpackiterable( w_arg ): w_test = space.getitem( w_tup, space.wrap(0) ) w_suite = space.getitem( w_tup, space.wrap(1) ) - test = space.interpclass_w( w_test ) - suite = space.interpclass_w( w_suite ) - if not isinstance( test, Node ) or not isinstance( suite, Node ): - raise OperationError(space.w_TypeError, space.wrap("Need a list of (test,suite) nodes") ) + test = space.interp_w(Node, w_test) + suite = space.interp_w(Node, w_suite) self.tests.append( (test,suite) ) - def 
fget_tests( space, self ): return space.newlist( [ space.newtuple( [ space.wrap(test), @@ -2008,13 +2292,24 @@ else: return space.wrap(self.else_) def fset_else_( space, self, w_arg): - if space.is_w( w_arg, space.w_None ): - self.else_ = None - else: - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.else_ = obj + self.else_ = space.interp_w(Node, w_arg, can_be_None=True) + +def descr_If_new(space, w_subtype, w_tests, w_else_, lineno=-1): + self = space.allocate_instance(If, w_subtype) + tests = [] + for w_tuple in space.unpackiterable(w_tests): + w_test = space.getitem(w_tuple, space.wrap(0)) + w_suite = space.getitem(w_tuple, space.wrap(1)) + test = space.interp_w(Node, w_test) + suite = space.interp_w(Node, w_suite) + tests.append((test, suite)) + self.tests = tests + self.else_ = space.interp_w(Node, w_else_, can_be_None=True) + self.lineno = lineno + return space.wrap(self) + + + def descr_If_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitIf')) @@ -2022,7 +2317,8 @@ return space.call_args(w_callable, args) If.typedef = TypeDef('If', Node.typedef, - accept=interp2app(descr_If_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_If_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_If_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), tests=GetSetProperty(If.fget_tests, If.fset_tests ), else_=GetSetProperty(If.fget_else_, If.fset_else_ ), ) @@ -2063,13 +2359,31 @@ return visitor.visitImport(self) +def descr_Import_new(space, w_subtype, w_names, lineno=-1): + self = space.allocate_instance(Import, w_subtype) + names = [] + for w_tuple in space.unpackiterable(w_names): + w_name = space.getitem(w_tuple, space.wrap(0)) + w_as_name = space.getitem(w_tuple, space.wrap(1)) + name = space.str_w(w_name) + as_name = None + if not space.is_w(w_as_name, space.w_None): + 
as_name = space.str_w(w_as_name) + names.append((name, as_name)) + self.names = names + self.lineno = lineno + return space.wrap(self) + + + def descr_Import_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitImport')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Import.typedef = TypeDef('Import', Node.typedef, - accept=interp2app(descr_Import_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Import_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_Import_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), names=GetSetProperty(Import.fget_names, Import.fset_names ), ) @@ -2094,10 +2408,14 @@ def fget_expr( space, self): return space.wrap(self.expr) def fset_expr( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.expr = obj + self.expr = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_Invert_new(space, w_subtype, w_expr, lineno=-1): + self = space.allocate_instance(Invert, w_subtype) + expr = space.interp_w(Node, w_expr, can_be_None=False) + self.expr = expr + self.lineno = lineno + return space.wrap(self) def descr_Invert_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitInvert')) @@ -2105,7 +2423,8 @@ return space.call_args(w_callable, args) Invert.typedef = TypeDef('Invert', UnaryOp.typedef, - accept=interp2app(descr_Invert_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Invert_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_Invert_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), expr=GetSetProperty(Invert.fget_expr, Invert.fset_expr ), ) @@ -2135,10 +2454,16 @@ def fget_expr( space, self): return space.wrap(self.expr) def fset_expr( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not 
isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.expr = obj + self.expr = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_Keyword_new(space, w_subtype, w_name, w_expr, lineno=-1): + self = space.allocate_instance(Keyword, w_subtype) + name = space.str_w(w_name) + self.name = name + expr = space.interp_w(Node, w_expr, can_be_None=False) + self.expr = expr + self.lineno = lineno + return space.wrap(self) def descr_Keyword_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitKeyword')) @@ -2146,7 +2471,8 @@ return space.call_args(w_callable, args) Keyword.typedef = TypeDef('Keyword', Node.typedef, - accept=interp2app(descr_Keyword_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Keyword_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_Keyword_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), name=GetSetProperty(Keyword.fget_name, Keyword.fset_name ), expr=GetSetProperty(Keyword.fget_expr, Keyword.fset_expr ), ) @@ -2192,14 +2518,14 @@ return space.newlist( [space.wrap(itm) for itm in self.argnames] ) def fset_argnames( space, self, w_arg): del self.argnames[:] - for w_itm in space.unpackiterable( w_arg ): - self.argnames.append( space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + self.argnames.append( space.interp_w(Node, w_arg)) def fget_defaults( space, self): return space.newlist( [space.wrap(itm) for itm in self.defaults] ) def fset_defaults( space, self, w_arg): del self.defaults[:] - for w_itm in space.unpackiterable( w_arg ): - self.defaults.append( space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + self.defaults.append( space.interp_w(Node, w_arg)) def fget_flags( space, self): return space.wrap(self.flags) def fset_flags( space, self, w_arg): @@ -2207,10 +2533,20 @@ def fget_code( space, self): return space.wrap(self.code) def 
fset_code( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.code = obj + self.code = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_Lambda_new(space, w_subtype, w_argnames, w_defaults, w_flags, w_code, lineno=-1): + self = space.allocate_instance(Lambda, w_subtype) + argnames = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_argnames)] + self.argnames = argnames + defaults = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_defaults)] + self.defaults = defaults + flags = space.int_w(w_flags) + self.flags = flags + code = space.interp_w(Node, w_code, can_be_None=False) + self.code = code + self.lineno = lineno + return space.wrap(self) def descr_Lambda_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitLambda')) @@ -2218,7 +2554,8 @@ return space.call_args(w_callable, args) Lambda.typedef = TypeDef('Lambda', AbstractFunction.typedef, - accept=interp2app(descr_Lambda_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Lambda_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_Lambda_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), argnames=GetSetProperty(Lambda.fget_argnames, Lambda.fset_argnames ), defaults=GetSetProperty(Lambda.fget_defaults, Lambda.fset_defaults ), flags=GetSetProperty(Lambda.fget_flags, Lambda.fset_flags ), @@ -2247,17 +2584,20 @@ def fget_left( space, self): return space.wrap(self.left) def fset_left( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.left = obj + self.left = space.interp_w(Node, w_arg, can_be_None=False) def fget_right( space, self): return space.wrap(self.right) def fset_right( space, self, w_arg): - obj = 
space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.right = obj + self.right = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_LeftShift_new(space, w_subtype, w_left, w_right, lineno=-1): + self = space.allocate_instance(LeftShift, w_subtype) + left = space.interp_w(Node, w_left, can_be_None=False) + self.left = left + right = space.interp_w(Node, w_right, can_be_None=False) + self.right = right + self.lineno = lineno + return space.wrap(self) def descr_LeftShift_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitLeftShift')) @@ -2265,7 +2605,8 @@ return space.call_args(w_callable, args) LeftShift.typedef = TypeDef('LeftShift', BinaryOp.typedef, - accept=interp2app(descr_LeftShift_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_LeftShift_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_LeftShift_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), left=GetSetProperty(LeftShift.fget_left, LeftShift.fset_left ), right=GetSetProperty(LeftShift.fget_right, LeftShift.fset_right ), ) @@ -2294,8 +2635,15 @@ return space.newlist( [space.wrap(itm) for itm in self.nodes] ) def fset_nodes( space, self, w_arg): del self.nodes[:] - for w_itm in space.unpackiterable( w_arg ): - self.nodes.append( space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + self.nodes.append( space.interp_w(Node, w_arg)) + +def descr_List_new(space, w_subtype, w_nodes, lineno=-1): + self = space.allocate_instance(List, w_subtype) + nodes = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_nodes)] + self.nodes = nodes + self.lineno = lineno + return space.wrap(self) def descr_List_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitList')) @@ -2303,7 +2651,8 @@ return space.call_args(w_callable, args) List.typedef = 
TypeDef('List', Node.typedef, - accept=interp2app(descr_List_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_List_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_List_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), nodes=GetSetProperty(List.fget_nodes, List.fset_nodes ), ) @@ -2335,16 +2684,22 @@ def fget_expr( space, self): return space.wrap(self.expr) def fset_expr( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.expr = obj + self.expr = space.interp_w(Node, w_arg, can_be_None=False) def fget_quals( space, self): return space.newlist( [space.wrap(itm) for itm in self.quals] ) def fset_quals( space, self, w_arg): del self.quals[:] - for w_itm in space.unpackiterable( w_arg ): - self.quals.append( space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + self.quals.append( space.interp_w(Node, w_arg)) + +def descr_ListComp_new(space, w_subtype, w_expr, w_quals, lineno=-1): + self = space.allocate_instance(ListComp, w_subtype) + expr = space.interp_w(Node, w_expr, can_be_None=False) + self.expr = expr + quals = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_quals)] + self.quals = quals + self.lineno = lineno + return space.wrap(self) def descr_ListComp_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitListComp')) @@ -2352,7 +2707,8 @@ return space.call_args(w_callable, args) ListComp.typedef = TypeDef('ListComp', Node.typedef, - accept=interp2app(descr_ListComp_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_ListComp_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_ListComp_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), expr=GetSetProperty(ListComp.fget_expr, ListComp.fset_expr ), quals=GetSetProperty(ListComp.fget_quals, 
ListComp.fset_quals ), ) @@ -2388,23 +2744,28 @@ def fget_assign( space, self): return space.wrap(self.assign) def fset_assign( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.assign = obj + self.assign = space.interp_w(Node, w_arg, can_be_None=False) def fget_list( space, self): return space.wrap(self.list) def fset_list( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.list = obj + self.list = space.interp_w(Node, w_arg, can_be_None=False) def fget_ifs( space, self): return space.newlist( [space.wrap(itm) for itm in self.ifs] ) def fset_ifs( space, self, w_arg): del self.ifs[:] - for w_itm in space.unpackiterable( w_arg ): - self.ifs.append( space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + self.ifs.append( space.interp_w(Node, w_arg)) + +def descr_ListCompFor_new(space, w_subtype, w_assign, w_list, w_ifs, lineno=-1): + self = space.allocate_instance(ListCompFor, w_subtype) + assign = space.interp_w(Node, w_assign, can_be_None=False) + self.assign = assign + list = space.interp_w(Node, w_list, can_be_None=False) + self.list = list + ifs = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_ifs)] + self.ifs = ifs + self.lineno = lineno + return space.wrap(self) def descr_ListCompFor_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitListCompFor')) @@ -2412,7 +2773,8 @@ return space.call_args(w_callable, args) ListCompFor.typedef = TypeDef('ListCompFor', Node.typedef, - accept=interp2app(descr_ListCompFor_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_ListCompFor_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_ListCompFor_accept, unwrap_spec=[ObjSpace, W_Root, 
W_Root] ), assign=GetSetProperty(ListCompFor.fget_assign, ListCompFor.fset_assign ), list=GetSetProperty(ListCompFor.fget_list, ListCompFor.fset_list ), ifs=GetSetProperty(ListCompFor.fget_ifs, ListCompFor.fset_ifs ), @@ -2439,10 +2801,14 @@ def fget_test( space, self): return space.wrap(self.test) def fset_test( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.test = obj + self.test = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_ListCompIf_new(space, w_subtype, w_test, lineno=-1): + self = space.allocate_instance(ListCompIf, w_subtype) + test = space.interp_w(Node, w_test, can_be_None=False) + self.test = test + self.lineno = lineno + return space.wrap(self) def descr_ListCompIf_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitListCompIf')) @@ -2450,7 +2816,8 @@ return space.call_args(w_callable, args) ListCompIf.typedef = TypeDef('ListCompIf', Node.typedef, - accept=interp2app(descr_ListCompIf_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_ListCompIf_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_ListCompIf_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), test=GetSetProperty(ListCompIf.fget_test, ListCompIf.fset_test ), ) @@ -2476,17 +2843,20 @@ def fget_left( space, self): return space.wrap(self.left) def fset_left( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.left = obj + self.left = space.interp_w(Node, w_arg, can_be_None=False) def fget_right( space, self): return space.wrap(self.right) def fset_right( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.right = obj + 
self.right = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_Mod_new(space, w_subtype, w_left, w_right, lineno=-1): + self = space.allocate_instance(Mod, w_subtype) + left = space.interp_w(Node, w_left, can_be_None=False) + self.left = left + right = space.interp_w(Node, w_right, can_be_None=False) + self.right = right + self.lineno = lineno + return space.wrap(self) def descr_Mod_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitMod')) @@ -2494,7 +2864,8 @@ return space.call_args(w_callable, args) Mod.typedef = TypeDef('Mod', BinaryOp.typedef, - accept=interp2app(descr_Mod_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Mod_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_Mod_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), left=GetSetProperty(Mod.fget_left, Mod.fset_left ), right=GetSetProperty(Mod.fget_right, Mod.fset_right ), ) @@ -2525,10 +2896,17 @@ def fget_node( space, self): return space.wrap(self.node) def fset_node( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.node = obj + self.node = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_Module_new(space, w_subtype, w_w_doc, w_node, lineno=-1): + self = space.allocate_instance(Module, w_subtype) + # This dummy assignment is auto-generated, astgen.py should be fixed to avoid that + w_doc = w_w_doc + self.w_doc = w_doc + node = space.interp_w(Node, w_node, can_be_None=False) + self.node = node + self.lineno = lineno + return space.wrap(self) def descr_Module_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitModule')) @@ -2536,7 +2914,8 @@ return space.call_args(w_callable, args) Module.typedef = TypeDef('Module', Node.typedef, -
__new__ = interp2app(descr_Module_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_Module_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), w_doc=GetSetProperty(Module.fget_w_doc, Module.fset_w_doc ), node=GetSetProperty(Module.fget_node, Module.fset_node ), ) @@ -2563,17 +2942,20 @@ def fget_left( space, self): return space.wrap(self.left) def fset_left( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.left = obj + self.left = space.interp_w(Node, w_arg, can_be_None=False) def fget_right( space, self): return space.wrap(self.right) def fset_right( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.right = obj + self.right = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_Mul_new(space, w_subtype, w_left, w_right, lineno=-1): + self = space.allocate_instance(Mul, w_subtype) + left = space.interp_w(Node, w_left, can_be_None=False) + self.left = left + right = space.interp_w(Node, w_right, can_be_None=False) + self.right = right + self.lineno = lineno + return space.wrap(self) def descr_Mul_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitMul')) @@ -2581,7 +2963,8 @@ return space.call_args(w_callable, args) Mul.typedef = TypeDef('Mul', BinaryOp.typedef, - accept=interp2app(descr_Mul_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Mul_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_Mul_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), left=GetSetProperty(Mul.fget_left, Mul.fset_left ), right=GetSetProperty(Mul.fget_right, Mul.fset_right ), ) @@ -2609,13 +2992,21 @@ def fset_varname( space, self, w_arg): self.varname = space.str_w(w_arg) +def descr_Name_new(space, 
w_subtype, w_varname, lineno=-1): + self = space.allocate_instance(Name, w_subtype) + varname = space.str_w(w_varname) + self.varname = varname + self.lineno = lineno + return space.wrap(self) + def descr_Name_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitName')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Name.typedef = TypeDef('Name', Node.typedef, - accept=interp2app(descr_Name_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Name_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_Name_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), varname=GetSetProperty(Name.fget_varname, Name.fset_varname ), ) @@ -2637,13 +3028,19 @@ return visitor.visitNoneConst(self) +def descr_NoneConst_new(space, w_subtype, lineno=-1): + self = space.allocate_instance(NoneConst, w_subtype) + self.lineno = lineno + return space.wrap(self) + def descr_NoneConst_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitNoneConst')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) NoneConst.typedef = TypeDef('NoneConst', Node.typedef, - accept=interp2app(descr_NoneConst_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_NoneConst_new, unwrap_spec=[ObjSpace, W_Root, int]), + accept=interp2app(descr_NoneConst_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), ) class Not(UnaryOp): @@ -2667,10 +3064,14 @@ def fget_expr( space, self): return space.wrap(self.expr) def fset_expr( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.expr = obj + self.expr = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_Not_new(space, w_subtype, w_expr, lineno=-1): + self = space.allocate_instance(Not, w_subtype) + expr = space.interp_w(Node, w_expr, 
can_be_None=False) + self.expr = expr + self.lineno = lineno + return space.wrap(self) def descr_Not_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitNot')) @@ -2678,7 +3079,8 @@ return space.call_args(w_callable, args) Not.typedef = TypeDef('Not', UnaryOp.typedef, - accept=interp2app(descr_Not_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Not_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_Not_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), expr=GetSetProperty(Not.fget_expr, Not.fset_expr ), ) @@ -2706,8 +3108,15 @@ return space.newlist( [space.wrap(itm) for itm in self.nodes] ) def fset_nodes( space, self, w_arg): del self.nodes[:] - for w_itm in space.unpackiterable( w_arg ): - self.nodes.append( space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + self.nodes.append( space.interp_w(Node, w_itm)) + +def descr_Or_new(space, w_subtype, w_nodes, lineno=-1): + self = space.allocate_instance(Or, w_subtype) + nodes = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_nodes)] + self.nodes = nodes + self.lineno = lineno + return space.wrap(self) def descr_Or_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitOr')) @@ -2715,7 +3124,8 @@ return space.call_args(w_callable, args) Or.typedef = TypeDef('Or', AbstractTest.typedef, - accept=interp2app(descr_Or_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Or_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_Or_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), nodes=GetSetProperty(Or.fget_nodes, Or.fset_nodes ), ) @@ -2737,13 +3147,19 @@ return visitor.visitPass(self) +def descr_Pass_new(space, w_subtype, lineno=-1): + self = space.allocate_instance(Pass, w_subtype) + self.lineno = lineno + return space.wrap(self) + def descr_Pass_accept( space, w_self, w_visitor): w_callable =
space.getattr(w_visitor, space.wrap('visitPass')) args = Arguments(space, [ w_self ]) return space.call_args(w_callable, args) Pass.typedef = TypeDef('Pass', Node.typedef, - accept=interp2app(descr_Pass_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Pass_new, unwrap_spec=[ObjSpace, W_Root, int]), + accept=interp2app(descr_Pass_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), ) class Power(BinaryOp): @@ -2768,17 +3184,20 @@ def fget_left( space, self): return space.wrap(self.left) def fset_left( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.left = obj + self.left = space.interp_w(Node, w_arg, can_be_None=False) def fget_right( space, self): return space.wrap(self.right) def fset_right( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.right = obj + self.right = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_Power_new(space, w_subtype, w_left, w_right, lineno=-1): + self = space.allocate_instance(Power, w_subtype) + left = space.interp_w(Node, w_left, can_be_None=False) + self.left = left + right = space.interp_w(Node, w_right, can_be_None=False) + self.right = right + self.lineno = lineno + return space.wrap(self) def descr_Power_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitPower')) @@ -2786,7 +3205,8 @@ return space.call_args(w_callable, args) Power.typedef = TypeDef('Power', BinaryOp.typedef, - accept=interp2app(descr_Power_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Power_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_Power_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), left=GetSetProperty(Power.fget_left, Power.fset_left ), 
right=GetSetProperty(Power.fget_right, Power.fset_right ), ) @@ -2821,21 +3241,24 @@ return space.newlist( [space.wrap(itm) for itm in self.nodes] ) def fset_nodes( space, self, w_arg): del self.nodes[:] - for w_itm in space.unpackiterable( w_arg ): - self.nodes.append( space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + self.nodes.append( space.interp_w(Node, w_itm)) def fget_dest( space, self): if self.dest is None: return space.w_None else: return space.wrap(self.dest) def fset_dest( space, self, w_arg): - if space.is_w( w_arg, space.w_None ): - self.dest = None - else: - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.dest = obj + self.dest = space.interp_w(Node, w_arg, can_be_None=True) + +def descr_Print_new(space, w_subtype, w_nodes, w_dest, lineno=-1): + self = space.allocate_instance(Print, w_subtype) + nodes = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_nodes)] + self.nodes = nodes + dest = space.interp_w(Node, w_dest, can_be_None=True) + self.dest = dest + self.lineno = lineno + return space.wrap(self) def descr_Print_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitPrint')) @@ -2843,7 +3266,8 @@ return space.call_args(w_callable, args) Print.typedef = TypeDef('Print', Node.typedef, - accept=interp2app(descr_Print_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Print_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_Print_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), nodes=GetSetProperty(Print.fget_nodes, Print.fset_nodes ), dest=GetSetProperty(Print.fget_dest, Print.fset_dest ), ) @@ -2878,21 +3302,24 @@ return space.newlist( [space.wrap(itm) for itm in self.nodes] ) def fset_nodes( space, self, w_arg): del self.nodes[:] - for w_itm in space.unpackiterable( w_arg ): - self.nodes.append(
space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + self.nodes.append( space.interp_w(Node, w_itm)) def fget_dest( space, self): if self.dest is None: return space.w_None else: return space.wrap(self.dest) def fset_dest( space, self, w_arg): - if space.is_w( w_arg, space.w_None ): - self.dest = None - else: - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.dest = obj + self.dest = space.interp_w(Node, w_arg, can_be_None=True) + +def descr_Printnl_new(space, w_subtype, w_nodes, w_dest, lineno=-1): + self = space.allocate_instance(Printnl, w_subtype) + nodes = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_nodes)] + self.nodes = nodes + dest = space.interp_w(Node, w_dest, can_be_None=True) + self.dest = dest + self.lineno = lineno + return space.wrap(self) def descr_Printnl_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitPrintnl')) @@ -2900,7 +3327,8 @@ return space.call_args(w_callable, args) Printnl.typedef = TypeDef('Printnl', Node.typedef, - accept=interp2app(descr_Printnl_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Printnl_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_Printnl_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), nodes=GetSetProperty(Printnl.fget_nodes, Printnl.fset_nodes ), dest=GetSetProperty(Printnl.fget_dest, Printnl.fset_dest ), ) @@ -2942,39 +3370,32 @@ else: return space.wrap(self.expr1) def fset_expr1( space, self, w_arg): - if space.is_w( w_arg, space.w_None ): - self.expr1 = None - else: - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.expr1 = obj + self.expr1 = space.interp_w(Node, w_arg, can_be_None=True) def fget_expr2( space, self): if self.expr2 is None: return
space.w_None else: return space.wrap(self.expr2) def fset_expr2( space, self, w_arg): - if space.is_w( w_arg, space.w_None ): - self.expr2 = None - else: - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.expr2 = obj + self.expr2 = space.interp_w(Node, w_arg, can_be_None=True) def fget_expr3( space, self): if self.expr3 is None: return space.w_None else: return space.wrap(self.expr3) def fset_expr3( space, self, w_arg): - if space.is_w( w_arg, space.w_None ): - self.expr3 = None - else: - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.expr3 = obj + self.expr3 = space.interp_w(Node, w_arg, can_be_None=True) + +def descr_Raise_new(space, w_subtype, w_expr1, w_expr2, w_expr3, lineno=-1): + self = space.allocate_instance(Raise, w_subtype) + expr1 = space.interp_w(Node, w_expr1, can_be_None=True) + self.expr1 = expr1 + expr2 = space.interp_w(Node, w_expr2, can_be_None=True) + self.expr2 = expr2 + expr3 = space.interp_w(Node, w_expr3, can_be_None=True) + self.expr3 = expr3 + self.lineno = lineno + return space.wrap(self) def descr_Raise_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitRaise')) @@ -2982,7 +3403,8 @@ return space.call_args(w_callable, args) Raise.typedef = TypeDef('Raise', Node.typedef, - accept=interp2app(descr_Raise_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Raise_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_Raise_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), expr1=GetSetProperty(Raise.fget_expr1, Raise.fset_expr1 ), expr2=GetSetProperty(Raise.fget_expr2, Raise.fset_expr2 ), expr3=GetSetProperty(Raise.fget_expr3, Raise.fset_expr3 ), @@ -3015,13 +3437,14 @@ else: return space.wrap(self.value) def fset_value( space, self, 
w_arg): - if space.is_w( w_arg, space.w_None ): - self.value = None - else: - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.value = obj + self.value = space.interp_w(Node, w_arg, can_be_None=True) + +def descr_Return_new(space, w_subtype, w_value, lineno=-1): + self = space.allocate_instance(Return, w_subtype) + value = space.interp_w(Node, w_value, can_be_None=True) + self.value = value + self.lineno = lineno + return space.wrap(self) def descr_Return_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitReturn')) @@ -3029,7 +3452,8 @@ return space.call_args(w_callable, args) Return.typedef = TypeDef('Return', Node.typedef, - accept=interp2app(descr_Return_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Return_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_Return_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), value=GetSetProperty(Return.fget_value, Return.fset_value ), ) @@ -3055,17 +3479,20 @@ def fget_left( space, self): return space.wrap(self.left) def fset_left( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.left = obj + self.left = space.interp_w(Node, w_arg, can_be_None=False) def fget_right( space, self): return space.wrap(self.right) def fset_right( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.right = obj + self.right = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_RightShift_new(space, w_subtype, w_left, w_right, lineno=-1): + self = space.allocate_instance(RightShift, w_subtype) + left = space.interp_w(Node, w_left, can_be_None=False) + self.left = left + right = space.interp_w(Node, 
w_right, can_be_None=False) + self.right = right + self.lineno = lineno + return space.wrap(self) def descr_RightShift_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitRightShift')) @@ -3073,7 +3500,8 @@ return space.call_args(w_callable, args) RightShift.typedef = TypeDef('RightShift', BinaryOp.typedef, - accept=interp2app(descr_RightShift_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_RightShift_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_RightShift_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), left=GetSetProperty(RightShift.fget_left, RightShift.fset_left ), right=GetSetProperty(RightShift.fget_right, RightShift.fset_right ), ) @@ -3113,10 +3541,7 @@ def fget_expr( space, self): return space.wrap(self.expr) def fset_expr( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.expr = obj + self.expr = space.interp_w(Node, w_arg, can_be_None=False) def fget_flags( space, self): return space.wrap(self.flags) def fset_flags( space, self, w_arg): @@ -3127,26 +3552,27 @@ else: return space.wrap(self.lower) def fset_lower( space, self, w_arg): - if space.is_w( w_arg, space.w_None ): - self.lower = None - else: - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.lower = obj + self.lower = space.interp_w(Node, w_arg, can_be_None=True) def fget_upper( space, self): if self.upper is None: return space.w_None else: return space.wrap(self.upper) def fset_upper( space, self, w_arg): - if space.is_w( w_arg, space.w_None ): - self.upper = None - else: - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.upper = obj + self.upper = 
space.interp_w(Node, w_arg, can_be_None=True) + +def descr_Slice_new(space, w_subtype, w_expr, w_flags, w_lower, w_upper, lineno=-1): + self = space.allocate_instance(Slice, w_subtype) + expr = space.interp_w(Node, w_expr, can_be_None=False) + self.expr = expr + flags = space.int_w(w_flags) + self.flags = flags + lower = space.interp_w(Node, w_lower, can_be_None=True) + self.lower = lower + upper = space.interp_w(Node, w_upper, can_be_None=True) + self.upper = upper + self.lineno = lineno + return space.wrap(self) def descr_Slice_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitSlice')) @@ -3154,7 +3580,8 @@ return space.call_args(w_callable, args) Slice.typedef = TypeDef('Slice', Node.typedef, - accept=interp2app(descr_Slice_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Slice_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_Slice_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), expr=GetSetProperty(Slice.fget_expr, Slice.fset_expr ), flags=GetSetProperty(Slice.fget_flags, Slice.fset_flags ), lower=GetSetProperty(Slice.fget_lower, Slice.fset_lower ), @@ -3185,8 +3612,15 @@ return space.newlist( [space.wrap(itm) for itm in self.nodes] ) def fset_nodes( space, self, w_arg): del self.nodes[:] - for w_itm in space.unpackiterable( w_arg ): - self.nodes.append( space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + self.nodes.append( space.interp_w(Node, w_itm)) + +def descr_Sliceobj_new(space, w_subtype, w_nodes, lineno=-1): + self = space.allocate_instance(Sliceobj, w_subtype) + nodes = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_nodes)] + self.nodes = nodes + self.lineno = lineno + return space.wrap(self) def descr_Sliceobj_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitSliceobj')) @@ -3194,7 +3628,8 @@ return space.call_args(w_callable, args)
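The recurring edit throughout this patch replaces the four-line `interpclass_w` plus `isinstance` check with a single `space.interp_w(Node, w_arg, can_be_None=...)` call. A minimal standalone analogue of that helper can sketch the idea — note that `interp_w` here is written as a plain function and raises `TypeError`, whereas the real PyPy version is a method on the object space and raises `OperationError`:

```python
# Standalone sketch of the interp_w() refactoring: the names below are
# simplified stand-ins for PyPy's object-space API, not the real thing.

class Node:
    """Base AST node (stand-in for the compiler-package Node)."""

class Name(Node):
    def __init__(self, varname):
        self.varname = varname

def interp_w(cls, w_obj, can_be_None=False):
    """Unwrap w_obj, checking that it is an instance of cls.

    Replaces the old four-line idiom:
        obj = interpclass_w(w_obj)
        if not isinstance(obj, Node):
            raise TypeError('Need a Node instance')
    """
    if w_obj is None and can_be_None:
        return None
    if not isinstance(w_obj, cls):
        raise TypeError('Need a %s instance' % cls.__name__)
    return w_obj

class Return(Node):
    def set_value(self, w_arg):
        # 'value' may legitimately be None (a bare "return" statement),
        # hence can_be_None=True -- exactly like Return.fset_value above.
        self.value = interp_w(Node, w_arg, can_be_None=True)
```

The `can_be_None` flag is what lets one helper cover both the mandatory children (e.g. `left`/`right` of a binary op) and the optional ones (e.g. `dest` of `Print` or the three exprs of `Raise`).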
Sliceobj.typedef = TypeDef('Sliceobj', Node.typedef, - accept=interp2app(descr_Sliceobj_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Sliceobj_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_Sliceobj_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), nodes=GetSetProperty(Sliceobj.fget_nodes, Sliceobj.fset_nodes ), ) @@ -3222,8 +3657,15 @@ return space.newlist( [space.wrap(itm) for itm in self.nodes] ) def fset_nodes( space, self, w_arg): del self.nodes[:] - for w_itm in space.unpackiterable( w_arg ): - self.nodes.append( space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + self.nodes.append( space.interp_w(Node, w_itm)) + +def descr_Stmt_new(space, w_subtype, w_nodes, lineno=-1): + self = space.allocate_instance(Stmt, w_subtype) + nodes = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_nodes)] + self.nodes = nodes + self.lineno = lineno + return space.wrap(self) def descr_Stmt_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitStmt')) @@ -3231,7 +3673,8 @@ return space.call_args(w_callable, args) Stmt.typedef = TypeDef('Stmt', Node.typedef, - accept=interp2app(descr_Stmt_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Stmt_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_Stmt_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), nodes=GetSetProperty(Stmt.fget_nodes, Stmt.fset_nodes ), ) @@ -3257,17 +3700,20 @@ def fget_left( space, self): return space.wrap(self.left) def fset_left( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.left = obj + self.left = space.interp_w(Node, w_arg, can_be_None=False) def fget_right( space, self): return space.wrap(self.right) def fset_right( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not
isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.right = obj + self.right = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_Sub_new(space, w_subtype, w_left, w_right, lineno=-1): + self = space.allocate_instance(Sub, w_subtype) + left = space.interp_w(Node, w_left, can_be_None=False) + self.left = left + right = space.interp_w(Node, w_right, can_be_None=False) + self.right = right + self.lineno = lineno + return space.wrap(self) def descr_Sub_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitSub')) @@ -3275,7 +3721,8 @@ return space.call_args(w_callable, args) Sub.typedef = TypeDef('Sub', BinaryOp.typedef, - accept=interp2app(descr_Sub_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Sub_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_Sub_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), left=GetSetProperty(Sub.fget_left, Sub.fset_left ), right=GetSetProperty(Sub.fget_right, Sub.fset_right ), ) @@ -3310,10 +3757,7 @@ def fget_expr( space, self): return space.wrap(self.expr) def fset_expr( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.expr = obj + self.expr = space.interp_w(Node, w_arg, can_be_None=False) def fget_flags( space, self): return space.wrap(self.flags) def fset_flags( space, self, w_arg): @@ -3322,8 +3766,19 @@ return space.newlist( [space.wrap(itm) for itm in self.subs] ) def fset_subs( space, self, w_arg): del self.subs[:] - for w_itm in space.unpackiterable( w_arg ): - self.subs.append( space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + self.subs.append( space.interp_w(Node, w_itm)) + +def descr_Subscript_new(space, w_subtype, w_expr, w_flags, w_subs, lineno=-1): + self = space.allocate_instance(Subscript, w_subtype) +
expr = space.interp_w(Node, w_expr, can_be_None=False) + self.expr = expr + flags = space.int_w(w_flags) + self.flags = flags + subs = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_subs)] + self.subs = subs + self.lineno = lineno + return space.wrap(self) def descr_Subscript_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitSubscript')) @@ -3331,7 +3786,8 @@ return space.call_args(w_callable, args) Subscript.typedef = TypeDef('Subscript', Node.typedef, - accept=interp2app(descr_Subscript_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Subscript_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_Subscript_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), expr=GetSetProperty(Subscript.fget_expr, Subscript.fset_expr ), flags=GetSetProperty(Subscript.fget_flags, Subscript.fset_flags ), subs=GetSetProperty(Subscript.fget_subs, Subscript.fset_subs ), @@ -3381,11 +3837,9 @@ w_expr1 = space.getitem( w_tup, space.wrap(0) ) w_expr2 = space.getitem( w_tup, space.wrap(1) ) w_body = space.getitem( w_tup, space.wrap(2) ) - expr1 = space.interpclass_w( w_expr1 ) - expr2 = space.interpclass_w( w_expr2 ) - body = space.interpclass_w( w_body ) - if not isinstance( expr1, Node ) or not isinstance( expr2, Node ) or not isinstance( body, Node ): - raise OperationError(space.w_TypeError, space.wrap("Need a list of (expr1,expr2,body) nodes") ) + expr1 = space.interp_w(Node, w_expr1, can_be_None=True) + expr2 = space.interp_w(Node, w_expr2, can_be_None=True) + body = space.interp_w(Node, w_body, can_be_None=False) self.handlers.append( (expr1,expr2,body) ) @@ -3398,23 +3852,33 @@ def fget_body( space, self): return space.wrap(self.body) def fset_body( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.body = obj + self.body = 
space.interp_w(Node, w_arg, can_be_None=False) def fget_else_( space, self): if self.else_ is None: return space.w_None else: return space.wrap(self.else_) def fset_else_( space, self, w_arg): - if space.is_w( w_arg, space.w_None ): - self.else_ = None - else: - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.else_ = obj + self.else_ = space.interp_w(Node, w_arg, can_be_None=True) + +def descr_TryExcept_new(space, w_subtype, w_body, w_handlers, w_else_, lineno=-1): + self = space.allocate_instance(TryExcept, w_subtype) + self.body = space.interp_w(Node, w_body) + handlers = [] + for w_tuple in space.unpackiterable( w_handlers ): + w_expr1 = space.getitem( w_tuple, space.wrap(0) ) + w_expr2 = space.getitem( w_tuple, space.wrap(1) ) + w_body = space.getitem( w_tuple, space.wrap(2) ) + expr1 = space.interp_w(Node, w_expr1, can_be_None=True) + expr2 = space.interp_w(Node, w_expr2, can_be_None=True) + body = space.interp_w(Node, w_body, can_be_None=False) + handlers.append((expr1, expr2, body)) + self.handlers = handlers + self.else_ = space.interp_w(Node, w_else_, can_be_None=True) + self.lineno = lineno + return space.wrap(self) + + def descr_TryExcept_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitTryExcept')) @@ -3422,7 +3886,8 @@ return space.call_args(w_callable, args) TryExcept.typedef = TypeDef('TryExcept', Node.typedef, - accept=interp2app(descr_TryExcept_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_TryExcept_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_TryExcept_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), body=GetSetProperty(TryExcept.fget_body, TryExcept.fset_body ), handlers=GetSetProperty(TryExcept.fget_handlers, TryExcept.fset_handlers ), else_=GetSetProperty(TryExcept.fget_else_, TryExcept.fset_else_ ), @@ -3450,17 
+3915,20 @@ def fget_body( space, self): return space.wrap(self.body) def fset_body( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.body = obj + self.body = space.interp_w(Node, w_arg, can_be_None=False) def fget_final( space, self): return space.wrap(self.final) def fset_final( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.final = obj + self.final = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_TryFinally_new(space, w_subtype, w_body, w_final, lineno=-1): + self = space.allocate_instance(TryFinally, w_subtype) + body = space.interp_w(Node, w_body, can_be_None=False) + self.body = body + final = space.interp_w(Node, w_final, can_be_None=False) + self.final = final + self.lineno = lineno + return space.wrap(self) def descr_TryFinally_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitTryFinally')) @@ -3468,7 +3936,8 @@ return space.call_args(w_callable, args) TryFinally.typedef = TypeDef('TryFinally', Node.typedef, - accept=interp2app(descr_TryFinally_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_TryFinally_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_TryFinally_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), body=GetSetProperty(TryFinally.fget_body, TryFinally.fset_body ), final=GetSetProperty(TryFinally.fget_final, TryFinally.fset_final ), ) @@ -3497,8 +3966,15 @@ return space.newlist( [space.wrap(itm) for itm in self.nodes] ) def fset_nodes( space, self, w_arg): del self.nodes[:] - for w_itm in space.unpackiterable( w_arg ): - self.nodes.append( space.interpclass_w( w_arg ) ) + for w_itm in space.unpackiterable(w_arg): + self.nodes.append( space.interp_w(Node, w_itm)) + +def
descr_Tuple_new(space, w_subtype, w_nodes, lineno=-1): + self = space.allocate_instance(Tuple, w_subtype) + nodes = [space.interp_w(Node, w_node) for w_node in space.unpackiterable(w_nodes)] + self.nodes = nodes + self.lineno = lineno + return space.wrap(self) def descr_Tuple_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitTuple')) @@ -3506,7 +3982,8 @@ return space.call_args(w_callable, args) Tuple.typedef = TypeDef('Tuple', Node.typedef, - accept=interp2app(descr_Tuple_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Tuple_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_Tuple_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), nodes=GetSetProperty(Tuple.fget_nodes, Tuple.fset_nodes ), ) @@ -3531,10 +4008,14 @@ def fget_expr( space, self): return space.wrap(self.expr) def fset_expr( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.expr = obj + self.expr = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_UnaryAdd_new(space, w_subtype, w_expr, lineno=-1): + self = space.allocate_instance(UnaryAdd, w_subtype) + expr = space.interp_w(Node, w_expr, can_be_None=False) + self.expr = expr + self.lineno = lineno + return space.wrap(self) def descr_UnaryAdd_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitUnaryAdd')) @@ -3542,7 +4023,8 @@ return space.call_args(w_callable, args) UnaryAdd.typedef = TypeDef('UnaryAdd', UnaryOp.typedef, - accept=interp2app(descr_UnaryAdd_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_UnaryAdd_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_UnaryAdd_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), expr=GetSetProperty(UnaryAdd.fget_expr, UnaryAdd.fset_expr ), ) @@ -3567,10 +4049,14 @@ def fget_expr( space, 
self): return space.wrap(self.expr) def fset_expr( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.expr = obj + self.expr = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_UnarySub_new(space, w_subtype, w_expr, lineno=-1): + self = space.allocate_instance(UnarySub, w_subtype) + expr = space.interp_w(Node, w_expr, can_be_None=False) + self.expr = expr + self.lineno = lineno + return space.wrap(self) def descr_UnarySub_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitUnarySub')) @@ -3578,7 +4064,8 @@ return space.call_args(w_callable, args) UnarySub.typedef = TypeDef('UnarySub', UnaryOp.typedef, - accept=interp2app(descr_UnarySub_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_UnarySub_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_UnarySub_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), expr=GetSetProperty(UnarySub.fget_expr, UnarySub.fset_expr ), ) @@ -3614,30 +4101,29 @@ def fget_test( space, self): return space.wrap(self.test) def fset_test( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.test = obj + self.test = space.interp_w(Node, w_arg, can_be_None=False) def fget_body( space, self): return space.wrap(self.body) def fset_body( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.body = obj + self.body = space.interp_w(Node, w_arg, can_be_None=False) def fget_else_( space, self): if self.else_ is None: return space.w_None else: return space.wrap(self.else_) def fset_else_( space, self, w_arg): - if space.is_w( w_arg, space.w_None ): - self.else_ = None - else: - obj = 
space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.else_ = obj + self.else_ = space.interp_w(Node, w_arg, can_be_None=True) + +def descr_While_new(space, w_subtype, w_test, w_body, w_else_, lineno=-1): + self = space.allocate_instance(While, w_subtype) + test = space.interp_w(Node, w_test, can_be_None=False) + self.test = test + body = space.interp_w(Node, w_body, can_be_None=False) + self.body = body + else_ = space.interp_w(Node, w_else_, can_be_None=True) + self.else_ = else_ + self.lineno = lineno + return space.wrap(self) def descr_While_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitWhile')) @@ -3645,7 +4131,8 @@ return space.call_args(w_callable, args) While.typedef = TypeDef('While', Node.typedef, - accept=interp2app(descr_While_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_While_new, unwrap_spec=[ObjSpace, W_Root, W_Root, W_Root, W_Root, int]), + accept=interp2app(descr_While_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), test=GetSetProperty(While.fget_test, While.fset_test ), body=GetSetProperty(While.fget_body, While.fset_body ), else_=GetSetProperty(While.fget_else_, While.fset_else_ ), @@ -3672,10 +4159,14 @@ def fget_value( space, self): return space.wrap(self.value) def fset_value( space, self, w_arg): - obj = space.interpclass_w( w_arg ) - if not isinstance( obj, Node): - raise OperationError(space.w_TypeError,space.wrap('Need a Node instance')) - self.value = obj + self.value = space.interp_w(Node, w_arg, can_be_None=False) + +def descr_Yield_new(space, w_subtype, w_value, lineno=-1): + self = space.allocate_instance(Yield, w_subtype) + value = space.interp_w(Node, w_value, can_be_None=False) + self.value = value + self.lineno = lineno + return space.wrap(self) def descr_Yield_accept( space, w_self, w_visitor): w_callable = space.getattr(w_visitor, space.wrap('visitYield')) 
@@ -3683,7 +4174,8 @@ return space.call_args(w_callable, args) Yield.typedef = TypeDef('Yield', Node.typedef, - accept=interp2app(descr_Yield_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + __new__ = interp2app(descr_Yield_new, unwrap_spec=[ObjSpace, W_Root, W_Root, int]), + accept=interp2app(descr_Yield_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] ), value=GetSetProperty(Yield.fget_value, Yield.fset_value ), ) @@ -3861,7 +4353,9 @@ return self.default( node ) +nodeclasses = [] for name, obj in globals().items(): if isinstance(obj, type) and issubclass(obj, Node): nodes[name.lower()] = obj + nodeclasses.append(name) Modified: pypy/dist/pypy/interpreter/astcompiler/ast.txt ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.txt (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.txt Tue Dec 13 22:14:14 2005 @@ -171,12 +171,10 @@ w_opname = space.getitem( w_obj, space.wrap(0) ) w_node = space.getitem( w_obj, space.wrap(1) ) ops = space.str_w(w_opname) - node = space.interpclass_w( w_node ) - if not isinstance(node, Node): - raise OperationError(space.w_TypeError, space.wrap("ops must be a list of (name,node)")) + node = space.interp_w(Node, w_node) self.ops.append( (ops,node) ) -Dict.fget_items( space, self ): +Dict.fget_items(space, self): return space.newlist( [ space.newtuple( [ space.wrap(key), space.wrap(value) ] ) for key, value in self.items ] ) @@ -185,28 +183,10 @@ for w_tup in space.unpackiterable( w_arg ): w_key = space.getitem( w_tup, space.wrap(0) ) w_value = space.getitem( w_tup, space.wrap(1) ) - key = space.interpclass_w( w_key ) - value = space.interpclass_w( w_value ) - if not isinstance( key, Node ) or not isinstance( value, Node ): - raise OperationError(space.w_TypeError, space.wrap("Need a list of (key node, value node)")) + key = space.interp_w(Node, w_key) + value = space.interp_w(Node, w_value) self.items.append( (key,value) ) 
-flatten_nodes(TryExcept.handlers): - # handlers is a list of triplets (expr1, expr2, body) - for expr1, expr2, body in self.handlers: - if expr1 is not None: - nodelist.append(expr1) - if expr2 is not None: - nodelist.append(expr2) - if body is not None: - nodelist.append(body) - -flatten_nodes(If.tests): - # tests is a list of couples (node (test), node (suite)) - for test, suite in self.tests: - nodelist.append(test) - nodelist.append(suite) - If.fget_tests( space, self ): return space.newlist( [ space.newtuple( [ space.wrap(test), @@ -218,14 +198,11 @@ for w_tup in space.unpackiterable( w_arg ): w_test = space.getitem( w_tup, space.wrap(0) ) w_suite = space.getitem( w_tup, space.wrap(1) ) - test = space.interpclass_w( w_test ) - suite = space.interpclass_w( w_suite ) - if not isinstance( test, Node ) or not isinstance( suite, Node ): - raise OperationError(space.w_TypeError, space.wrap("Need a list of (test,suite) nodes") ) + test = space.interp_w(Node, w_test) + suite = space.interp_w(Node, w_suite) self.tests.append( (test,suite) ) - TryExcept.fget_handlers( space, self ): return space.newlist( [ space.newtuple( [ space.wrap(expr1), space.wrap(expr2), @@ -238,11 +215,9 @@ w_expr1 = space.getitem( w_tup, space.wrap(0) ) w_expr2 = space.getitem( w_tup, space.wrap(1) ) w_body = space.getitem( w_tup, space.wrap(2) ) - expr1 = space.interpclass_w( w_expr1 ) - expr2 = space.interpclass_w( w_expr2 ) - body = space.interpclass_w( w_body ) - if not isinstance( expr1, Node ) or not isinstance( expr2, Node ) or not isinstance( body, Node ): - raise OperationError(space.w_TypeError, space.wrap("Need a list of (expr1,expr2,body) nodes") ) + expr1 = space.interp_w(Node, w_expr1, can_be_None=True) + expr2 = space.interp_w(Node, w_expr2, can_be_None=True) + body = space.interp_w(Node, w_body, can_be_None=False) self.handlers.append( (expr1,expr2,body) ) Import.fget_names( space, self ): @@ -275,3 +250,95 @@ as_name = space.str_w( w_as_name ) self.names.append( (name, 
as_name) ) +def descr_From_new(space, w_subtype, w_modname, w_names, lineno=-1): + self = space.allocate_instance(From, w_subtype) + modname = space.str_w(w_modname) + self.modname = modname + names = [] + for w_tuple in space.unpackiterable(w_names): + w_name = space.getitem(w_tuple, space.wrap(0)) + w_as_name = space.getitem(w_tuple, space.wrap(1)) + name = space.str_w(w_name) + as_name = None + if not space.is_w(w_as_name, space.w_None): + as_name = space.str_w(w_as_name) + names.append((name, as_name)) + self.names = names + self.lineno = lineno + return space.wrap(self) + +def descr_Import_new(space, w_subtype, w_names, lineno=-1): + self = space.allocate_instance(Import, w_subtype) + names = [] + for w_tuple in space.unpackiterable(w_names): + w_name = space.getitem(w_tuple, space.wrap(0)) + w_as_name = space.getitem(w_tuple, space.wrap(1)) + name = space.str_w(w_name) + as_name = None + if not space.is_w(w_as_name, space.w_None): + as_name = space.str_w(w_as_name) + names.append((name, as_name)) + self.names = names + self.lineno = lineno + return space.wrap(self) + +def descr_Compare_new(space, w_subtype, w_expr, w_ops, lineno=-1): + self = space.allocate_instance(Compare, w_subtype) + self.expr = space.interp_w(Node, w_expr) + ops = [] + for w_tuple in space.unpackiterable(w_ops): + w_opname = space.getitem(w_tuple, space.wrap(0)) + w_node = space.getitem(w_tuple, space.wrap(1)) + opname = space.str_w(w_opname) + node = space.interp_w(Node, w_node) + ops.append((opname, node)) + self.ops = ops + self.lineno = lineno + return space.wrap(self) + +def descr_Dict_new(space, w_subtype, w_items, lineno=-1): + self = space.allocate_instance(Dict, w_subtype) + items = [] + for w_tuple in space.unpackiterable(w_items): + w_key = space.getitem(w_tuple, space.wrap(0)) + w_value = space.getitem(w_tuple, space.wrap(1)) + key = space.interp_w(Node, w_key) + value = space.interp_w(Node, w_value) + items.append((key, value)) + self.items = items + self.lineno = lineno + 
return space.wrap(self) + + +def descr_If_new(space, w_subtype, w_tests, w_else_, lineno=-1): + self = space.allocate_instance(If, w_subtype) + tests = [] + for w_tuple in space.unpackiterable(w_tests): + w_test = space.getitem(w_tuple, space.wrap(0)) + w_suite = space.getitem(w_tuple, space.wrap(1)) + test = space.interp_w(Node, w_test) + suite = space.interp_w(Node, w_suite) + tests.append((test, suite)) + self.tests = tests + self.else_ = space.interp_w(Node, w_else_, can_be_None=True) + self.lineno = lineno + return space.wrap(self) + + +def descr_TryExcept_new(space, w_subtype, w_body, w_handlers, w_else_, lineno=-1): + self = space.allocate_instance(TryExcept, w_subtype) + self.body = space.interp_w(Node, w_body) + handlers = [] + for w_tuple in space.unpackiterable( w_handlers ): + w_expr1 = space.getitem( w_tuple, space.wrap(0) ) + w_expr2 = space.getitem( w_tuple, space.wrap(1) ) + w_body = space.getitem( w_tuple, space.wrap(2) ) + expr1 = space.interp_w(Node, w_expr1, can_be_None=True) + expr2 = space.interp_w(Node, w_expr2, can_be_None=True) + body = space.interp_w(Node, w_body, can_be_None=False) + handlers.append((expr1, expr2, body)) + self.handlers = handlers + self.else_ = space.interp_w(Node, w_else_, can_be_None=True) + self.lineno = lineno + return space.wrap(self) + Modified: pypy/dist/pypy/interpreter/astcompiler/astgen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/astgen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/astgen.py Tue Dec 13 22:14:14 2005 @@ -45,6 +45,7 @@ self.argprops = self.get_argprops() self.nargs = len(self.argnames) self.init = [] + self.applevel_new = [] self.flatten_nodes = {} self.additional_methods = {} self.parent = parent @@ -117,6 +118,13 @@ return d + def get_initargs(self): + if self.parent.args and self.args: + args = self.parent.args +","+ self.args + else: + args = self.parent.args or self.args + return args + def 
gen_source(self): buf = StringIO() print >> buf, "class %s(%s):" % (self.name, self.parent.name) @@ -133,21 +141,20 @@ print >> buf self._gen_attrs(buf) print >> buf + self._gen_new(buf) + print >> buf self._gen_typedef(buf) buf.seek(0, 0) return buf.read() def _gen_init(self, buf): - if self.parent.args and self.args: - args = self.parent.args +","+ self.args - else: - args = self.parent.args or self.args - if args: - print >> buf, " def __init__(self, %s, lineno=-1):" % args + initargs = self.get_initargs() + if initargs: + print >> buf, " def __init__(self, %s, lineno=-1):" % initargs else: print >> buf, " def __init__(self, lineno=-1):" if self.parent.args: - print >> buf, " %s.__init__(self, %s, lineno)" % self.parent.args + print >> buf, " %s.__init__(self, %s, lineno)" % (self.parent.name, self.parent.args) else: print >> buf, " Node.__init__(self, lineno)" if self.argnames: @@ -158,6 +165,47 @@ if self.init: print >> buf, "".join([" " + line for line in self.init]) + def _gen_new(self, buf): + if self.applevel_new: + print >> buf, ''.join(self.applevel_new) + return + args = self.get_initargs() + argprops = self.argprops + if args: + w_args = ['w_%s' % strip_default(arg.strip()) + for arg in args.split(',') if arg] + print >> buf, "def descr_%s_new(space, w_subtype, %s, lineno=-1):" % (self.name, ', '.join(w_args)) + else: + w_args = [] + print >> buf, "def descr_%s_new(space, w_subtype, lineno=-1):" % (self.name,) + print >> buf, " self = space.allocate_instance(%s, w_subtype)" % (self.name,) + # w_args = ['w_%s' % strip_default(arg.strip()) for arg in self.args.split(',') if arg] + for w_arg in w_args: + argname = w_arg[2:] + prop = argprops[argname] + if prop == P_NONE: + print >> buf, " %s = space.interp_w(Node, %s, can_be_None=True)" % (argname, w_arg) + elif prop == P_NODE: + print >> buf, " %s = space.interp_w(Node, %s, can_be_None=False)" % (argname, w_arg) + elif prop == P_NESTED: + print >> buf, " %s = [space.interp_w(Node, w_node) for w_node in 
space.unpackiterable(%s)]" % (argname, w_arg) + elif prop == P_STR: + print >> buf, " %s = space.str_w(%s)" % (argname, w_arg) + elif prop == P_INT: + print >> buf, " %s = space.int_w(%s)" % (argname, w_arg) + elif prop == P_STR_LIST: + print >> buf, " %s = [space.str_w(w_str) for w_str in space.unpackiterable(%s)]" % (argname, w_arg) + elif prop == P_INT_LIST: + print >> buf, " %s = [space.int_w(w_int) for w_int in space.unpackiterable(%s)]" % (argname, w_arg) + elif prop == P_WRAPPED: + print >> buf, " # This dummy assignment is auto-generated, astgen.py should be fixed to avoid that" + print >> buf, " %s = %s" % (argname, w_arg) + else: + raise ValueError("Don't know how to handle property '%s'" % prop) + print >> buf, " self.%s = %s" % (argname, argname) + print >> buf, " self.lineno = lineno" + print >> buf, " return space.wrap(self)" + def _gen_getChildren(self, buf): print >> buf, " def getChildren(self):" print >> buf, ' "NOT_RPYTHON"' @@ -266,37 +314,28 @@ def _gen_fset_func(self, buf, attr, prop ): # FSET print >> buf, " def fset_%s( space, self, w_arg):" % attr - if prop[attr]==P_WRAPPED: + if prop[attr] == P_WRAPPED: print >> buf, " self.%s = w_arg" % attr - elif prop[attr]==P_INT: + elif prop[attr] == P_INT: print >> buf, " self.%s = space.int_w(w_arg)" % attr - elif prop[attr]==P_STR: + elif prop[attr] == P_STR: print >> buf, " self.%s = space.str_w(w_arg)" % attr - elif prop[attr]==P_INT_LIST: + elif prop[attr] == P_INT_LIST: print >> buf, " del self.%s[:]" % attr print >> buf, " for itm in space.unpackiterable(w_arg):" print >> buf, " self.%s.append( space.int_w(itm) )" % attr - elif prop[attr]==P_STR_LIST: + elif prop[attr] == P_STR_LIST: print >> buf, " del self.%s[:]" % attr print >> buf, " for itm in space.unpackiterable(w_arg):" print >> buf, " self.%s.append( space.str_w(itm) )" % attr - elif prop[attr]==P_NESTED: + elif prop[attr] == P_NESTED: print >> buf, " del self.%s[:]" % attr - print >> buf, " for w_itm in space.unpackiterable( w_arg 
):" - print >> buf, " self.%s.append( space.interpclass_w( w_arg ) )" % attr - elif prop[attr]==P_NONE: - print >> buf, " if space.is_w( w_arg, space.w_None ):" - print >> buf, " self.%s = None" % attr - print >> buf, " else:" - print >> buf, " obj = space.interpclass_w( w_arg )" - print >> buf, " if not isinstance( obj, Node):" - print >> buf, " raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))" - print >> buf, " self.%s = obj" % attr + print >> buf, " for w_itm in space.unpackiterable(w_arg):" + print >> buf, " self.%s.append( space.interp_w(Node, w_arg))" % attr + elif prop[attr] == P_NONE: + print >> buf, " self.%s = space.interp_w(Node, w_arg, can_be_None=True)" % attr else: # P_NODE - print >> buf, " obj = space.interpclass_w( w_arg )" - print >> buf, " if not isinstance( obj, Node):" - print >> buf, " raise OperationError(space.w_TypeError,space.wrap('Need a Node instance'))" - print >> buf, " self.%s = obj" % attr + print >> buf, " self.%s = space.interp_w(Node, w_arg, can_be_None=False)" % attr def _gen_attrs(self, buf): prop = self.argprops @@ -307,16 +346,22 @@ if "fset_%s" % attr not in self.additional_methods: self._gen_fset_func( buf, attr, prop ) - def _gen_typedef(self, buf): + initargs = [strip_default(arg.strip()) + for arg in self.get_initargs().split(',') if arg] + if initargs: + new_unwrap_spec = ['ObjSpace', 'W_Root'] + ['W_Root'] * len(initargs) + ['int'] + else: + new_unwrap_spec = ['ObjSpace', 'W_Root', 'int'] parent_type = "%s.typedef" % self.parent.name print >> buf, "def descr_%s_accept( space, w_self, w_visitor):" %self.name print >> buf, " w_callable = space.getattr(w_visitor, space.wrap('visit%s'))" % self.name print >> buf, " args = Arguments(space, [ w_self ])" print >> buf, " return space.call_args(w_callable, args)" print >> buf, "" - print >> buf, "%s.typedef = TypeDef('%s', %s, " % (self.name,self.name,parent_type) - print >> buf, " accept=interp2app(descr_%s_accept, unwrap_spec=[ ObjSpace, W_Root, 
W_Root ] )," % self.name + print >> buf, "%s.typedef = TypeDef('%s', %s, " % (self.name, self.name, parent_type) + print >> buf, " __new__ = interp2app(descr_%s_new, unwrap_spec=[%s])," % (self.name, ', '.join(new_unwrap_spec)) + print >> buf, " accept=interp2app(descr_%s_accept, unwrap_spec=[ObjSpace, W_Root, W_Root] )," % self.name for attr in self.argnames: print >> buf, " %s=GetSetProperty(%s.fget_%s, %s.fset_%s )," % (attr,self.name,attr,self.name,attr) print >> buf, " )" @@ -356,7 +401,7 @@ rx_init = re.compile('init\((.*)\):') rx_flatten_nodes = re.compile('flatten_nodes\((.*)\.(.*)\):') rx_additional_methods = re.compile('(\\w+)\.(\w+)\((.*?)\):') - +rx_descr_news_methods = re.compile('def\s+descr_(\\w+)_new\((.*?)\):') def parse_spec(file): classes = {} cur = None @@ -414,6 +459,14 @@ cur.additional_methods[_cur_] = [' def %s(%s):\n' % (methname, params)] continue + mo = rx_descr_news_methods.match(line) + if mo: + kind = 'applevel_new' + name = mo.group(1) + cur = classes[name] + cur.applevel_new = [mo.group(0) + '\n'] + continue + if kind == 'init': # some code for the __init__ method cur.init.append(line) @@ -424,6 +477,8 @@ cur.flatten_nodes[_cur_].append(line) elif kind == 'additional_method': cur.additional_methods[_cur_].append(' '*4 + line) + elif kind == 'applevel_new': + cur.applevel_new.append(line) for node in classes.values(): node.setup_parent(classes) @@ -492,7 +547,7 @@ """ from consts import CO_VARARGS, CO_VARKEYWORDS, OP_ASSIGN from pypy.interpreter.baseobjspace import Wrappable -from pypy.interpreter.typedef import TypeDef, GetSetProperty +from pypy.interpreter.typedef import TypeDef, GetSetProperty, interp_attrproperty from pypy.interpreter.gateway import interp2app, W_Root, ObjSpace from pypy.interpreter.argument import Arguments from pypy.interpreter.error import OperationError @@ -556,10 +611,18 @@ args = Arguments(space, [ w_self ]) return space.call_args( w_callable, args ) +def descr_Node_new(space, w_subtype, lineno=-1): + node = 
space.allocate_instance(Node, w_subtype) + node.lineno = lineno + return space.wrap(node) + Node.typedef = TypeDef('ASTNode', + __new__ = interp2app(descr_Node_new, unwrap_spec=[ObjSpace, W_Root, int]), #__repr__ = interp2app(descr_node_repr, unwrap_spec=['self', ObjSpace] ), getChildNodes = interp2app(Node.descr_getChildNodes, unwrap_spec=[ 'self', ObjSpace ] ), accept = interp2app(descr_node_accept, unwrap_spec=[ ObjSpace, W_Root, W_Root ] ), + lineno = interp_attrproperty('lineno', cls=Node), + filename = interp_attrproperty('filename', cls=Node), ) @@ -589,9 +652,11 @@ ''' epilogue = ''' +nodeclasses = [] for name, obj in globals().items(): if isinstance(obj, type) and issubclass(obj, Node): nodes[name.lower()] = obj + nodeclasses.append(name) ''' if __name__ == "__main__": From adim at codespeak.net Tue Dec 13 22:18:26 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Tue, 13 Dec 2005 22:18:26 +0100 (CET) Subject: [pypy-svn] r21147 - pypy/dist/pypy/module/recparser Message-ID: <20051213211826.3CAA927B69@code1.codespeak.net> Author: adim Date: Tue Dec 13 22:18:25 2005 New Revision: 21147 Modified: pypy/dist/pypy/module/recparser/__init__.py Log: export each AST type through the parser module Modified: pypy/dist/pypy/module/recparser/__init__.py ============================================================================== --- pypy/dist/pypy/module/recparser/__init__.py (original) +++ pypy/dist/pypy/module/recparser/__init__.py Tue Dec 13 22:18:25 2005 @@ -47,5 +47,9 @@ 'decode_string_literal': 'pyparser.decode_string_literal', 'install_compiler_hook' : 'pypy.interpreter.pycompiler.install_compiler_hook', 'rules' : 'pypy.interpreter.pyparser.pythonparse.grammar_rules', - } + } +# Automatically exports each AST class +from pypy.interpreter.astcompiler.ast import nodeclasses +for klass_name in nodeclasses: + Module.interpleveldefs['AST' + klass_name] = 'pypy.interpreter.astcompiler.ast.%s' % klass_name From ale at codespeak.net Wed Dec 14 09:42:46 
2005 From: ale at codespeak.net (ale at codespeak.net) Date: Wed, 14 Dec 2005 09:42:46 +0100 (CET) Subject: [pypy-svn] r21151 - pypy/dist/pypy/translator/c/test Message-ID: <20051214084246.3ED5C27B64@code1.codespeak.net> Author: ale Date: Wed Dec 14 09:42:45 2005 New Revision: 21151 Modified: pypy/dist/pypy/translator/c/test/test_ext__socket.py Log: Forgot to remove view = True in the test Modified: pypy/dist/pypy/translator/c/test/test_ext__socket.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_ext__socket.py (original) +++ pypy/dist/pypy/translator/c/test/test_ext__socket.py Wed Dec 14 09:42:45 2005 @@ -139,7 +139,7 @@ sockname = rsocket.getpeername(fd) os.close(fd) return sockname[1] - f1 = compile(does_stuff, [], True) + f1 = compile(does_stuff, []) res = f1() assert res == self.PORT From arigo at codespeak.net Wed Dec 14 12:05:18 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Wed, 14 Dec 2005 12:05:18 +0100 (CET) Subject: [pypy-svn] r21152 - in pypy/dist/pypy: annotation rpython/test translator/test Message-ID: <20051214110518.3AEEE27B5C@code1.codespeak.net> Author: arigo Date: Wed Dec 14 12:05:13 2005 New Revision: 21152 Modified: pypy/dist/pypy/annotation/specialize.py pypy/dist/pypy/rpython/test/test_rpbc.py pypy/dist/pypy/translator/test/test_annrpython.py Log: Memo calls with boolean arguments. 
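[Editor's note: the r21152 change below extends PyPy's `specialize:memo` so that memoized calls may take boolean arguments in addition to frozen PBCs. As a rough plain-Python illustration of what that specialization amounts to (this is not the RPython machinery; `Freezing`, `getorbuild`, and `memoize_over` are hypothetical stand-ins modeled on the test in the diff): the annotator evaluates the function once for every combination of allowed key values and the two boolean values, so the residual call is a pure table lookup with no general computation left.]

```python
# Hypothetical sketch of memo specialization over a finite set of frozen
# keys plus boolean flags; all argument combinations are computed up front.
from itertools import product

class Freezing(object):
    """Stand-in for a frozen prebuilt constant (PBC)."""

fr1, fr2 = Freezing(), Freezing()

def getorbuild(key, flag1, flag2):
    # Mirrors the shape of test_call_memoized_function_with_bools.
    result = 3 if key is fr1 else 7
    if flag1:
        result += 100
    if flag2:
        result += 1000
    return result

def memoize_over(func, *argsets):
    # Precompute the full result table, one entry per argument combination.
    table = {args: func(*args) for args in product(*argsets)}
    return lambda *args: table[args]

memo = memoize_over(getorbuild, [fr1, fr2], [False, True], [False, True])
assert memo(fr1, True, False) == getorbuild(fr1, True, False)  # == 103
```

[In the real implementation the lookup is compiled into a chain of attribute reads and `if` branches rather than a dict, but the effect is the same: the memoized function is total over its finite argument space.]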
Modified: pypy/dist/pypy/annotation/specialize.py ============================================================================== --- pypy/dist/pypy/annotation/specialize.py (original) +++ pypy/dist/pypy/annotation/specialize.py Wed Dec 14 12:05:13 2005 @@ -69,21 +69,11 @@ fieldnamecounter = 0 - def getuniquefieldname(self, descs): + def getuniquefieldname(self): name = self.funcdesc.name - fieldname = 'memofield_%s_%d' % (name, MemoTable.fieldnamecounter) + fieldname = '$memofield_%s_%d' % (name, MemoTable.fieldnamecounter) MemoTable.fieldnamecounter += 1 - # look for name clashes - for desc in descs: - try: - desc.read_attribute(fieldname) - except AttributeError: - pass # no clash - else: - # clash! try again... - return self.getuniquefieldname(descs) - else: - return fieldname + return fieldname def finish(self): from pypy.annotation.model import unionof @@ -101,18 +91,17 @@ name = self.funcdesc.name argnames = ['a%d' % i for i in range(nbargs)] - def make_helper(firstarg, expr, miniglobals): - source = """ - def f(%s): - return %s - """ % (', '.join(argnames[firstarg:]), expr) - exec py.code.Source(source).compile() in miniglobals + def make_helper(firstarg, stmt, miniglobals): + header = "def f(%s):" % (', '.join(argnames[firstarg:],)) + source = py.code.Source(stmt) + source = source.putaround(header) + exec source.compile() in miniglobals f = miniglobals['f'] return func_with_new_name(f, 'memo_%s_%d' % (name, firstarg)) def make_constant_subhelper(firstarg, result): # make a function that just returns the constant answer 'result' - f = make_helper(firstarg, 'result', {'result': result}) + f = make_helper(firstarg, 'return result', {'result': result}) f.constant_result = result return f @@ -125,6 +114,8 @@ return make_constant_subhelper(firstarg, result) else: nextargvalues = list(sets[len(args_so_far)]) + if nextargvalues == [True, False]: + nextargvalues = [False, True] nextfns = [make_subhelper(args_so_far + (arg,)) for arg in nextargvalues] # do all 
graphs return a constant? @@ -132,6 +123,7 @@ constants = [fn.constant_result for fn in nextfns] except AttributeError: constants = None # one of the 'fn' has no constant_result + restargs = ', '.join(argnames[firstarg+1:]) # is there actually only one possible value for the current arg? if len(nextargvalues) == 1: @@ -140,15 +132,43 @@ return make_constant_subhelper(firstarg, result) else: # ignore the first argument and just call the subhelper - expr = 'subhelper(%s)' % ( - ', '.join(argnames[firstarg+1:]),) - return make_helper(firstarg, expr, + stmt = 'return subhelper(%s)' % restargs + return make_helper(firstarg, stmt, {'subhelper': nextfns[0]}) + + # is the arg a bool? + elif nextargvalues == [False, True]: + fieldname0 = self.getuniquefieldname() + fieldname1 = self.getuniquefieldname() + stmt = ['if %s:' % argnames[firstarg]] + if hasattr(nextfns[True], 'constant_result'): + # the True branch has a constant result + case1 = nextfns[True].constant_result + stmt.append(' return case1') + else: + # must call the subhelper + case1 = nextfns[True] + stmt.append(' return case1(%s)' % restargs) + stmt.append('else:') + if hasattr(nextfns[False], 'constant_result'): + # the False branch has a constant result + case0 = nextfns[False].constant_result + stmt.append(' return case0') + else: + # must call the subhelper + case0 = nextfns[False] + stmt.append(' return case0(%s)' % restargs) + + return make_helper(firstarg, '\n'.join(stmt), + {'case0': case0, + 'case1': case1}) + + # the arg is a set of PBCs else: descs = [bookkeeper.getdesc(pbc) for pbc in nextargvalues] - fieldname = self.getuniquefieldname(descs) - expr = 'getattr(%s, %r)' % (argnames[firstarg], - fieldname) + fieldname = self.getuniquefieldname() + stmt = 'return getattr(%s, %r)' % (argnames[firstarg], + fieldname) if constants: # instead of calling these subhelpers indirectly, # we store what they would return directly in the @@ -157,13 +177,13 @@ else: store = nextfns # call the result of the 
getattr() - expr += '(%s)' % (', '.join(argnames[firstarg+1:]),) + stmt += '(%s)' % restargs # store the memo field values for desc, value_to_store in zip(descs, store): desc.create_new_attribute(fieldname, value_to_store) - return make_helper(firstarg, expr, {}) + return make_helper(firstarg, stmt, {}) entrypoint = make_subhelper(args_so_far = ()) self.graph = annotator.translator.buildflowgraph(entrypoint) @@ -177,22 +197,28 @@ def memo(funcdesc, arglist_s): - from pypy.annotation.model import SomePBC, SomeImpossibleValue, unionof + from pypy.annotation.model import SomePBC, SomeImpossibleValue, SomeBool + from pypy.annotation.model import unionof # call the function now, and collect possible results argvalues = [] for s in arglist_s: - if not isinstance(s, SomePBC): - if isinstance(s, SomeImpossibleValue): - return s # we will probably get more possible args later + if s.is_constant(): + values = [s.const] + elif isinstance(s, SomePBC): + values = [] + assert not s.can_be_None, "memo call: cannot mix None and PBCs" + for desc in s.descriptions: + if desc.pyobj is None: + raise Exception("memo call with a class or PBC that has no " + "corresponding Python object (%r)" % (desc,)) + values.append(desc.pyobj) + elif isinstance(s, SomeImpossibleValue): + return s # we will probably get more possible args later + elif isinstance(s, SomeBool): + values = [False, True] + else: raise Exception("memo call: argument must be a class or a frozen " "PBC, got %r" % (s,)) - assert not s.can_be_None, "memo call: arguments must never be None" - values = [] - for desc in s.descriptions: - if desc.pyobj is None: - raise Exception("memo call with a class or PBC that has no " - "corresponding Python object (%r)" % (desc,)) - values.append(desc.pyobj) argvalues.append(values) # the list of all possible tuples of arguments to give to the memo function possiblevalues = cartesian_product(argvalues) Modified: pypy/dist/pypy/rpython/test/test_rpbc.py 
============================================================================== --- pypy/dist/pypy/rpython/test/test_rpbc.py (original) +++ pypy/dist/pypy/rpython/test/test_rpbc.py Wed Dec 14 12:05:13 2005 @@ -259,7 +259,34 @@ assert res == 7 res = interpret(f1, [1]) assert res == 3 - + +def test_call_memoized_function_with_bools(): + fr1 = Freezing() + fr2 = Freezing() + def getorbuild(key, flag1, flag2): + a = 1 + if key is fr1: + result = eval("a+2") + else: + result = eval("a+6") + if flag1: + result += 100 + if flag2: + result += 1000 + return result + getorbuild._annspecialcase_ = "specialize:memo" + + def f1(i): + if i > 0: + fr = fr1 + else: + fr = fr2 + return getorbuild(fr, i % 2 == 0, i % 3 == 0) + + for n in [0, 1, 2, -3, 6]: + res = interpret(f1, [n]) + assert res == f1(n) + def test_call_memoized_cache(): # this test checks that we add a separate field Modified: pypy/dist/pypy/translator/test/test_annrpython.py ============================================================================== --- pypy/dist/pypy/translator/test/test_annrpython.py (original) +++ pypy/dist/pypy/translator/test/test_annrpython.py Wed Dec 14 12:05:13 2005 @@ -1791,6 +1791,33 @@ s = a.build_types(f1, [int]) assert s.knowntype == int + def test_call_memoized_function_with_bools(self): + fr1 = Freezing() + fr2 = Freezing() + def getorbuild(key, flag1, flag2): + a = 1 + if key is fr1: + result = eval("a+2") + else: + result = eval("a+6") + if flag1: + result += 100 + if flag2: + result += 1000 + return result + getorbuild._annspecialcase_ = "specialize:memo" + + def f1(i): + if i > 0: + fr = fr1 + else: + fr = fr2 + return getorbuild(fr, i % 2 == 0, i % 3 == 0) + + a = self.RPythonAnnotator() + s = a.build_types(f1, [int]) + assert s.knowntype == int + def test_stored_bound_method(self): # issue 129 class H: From mwh at codespeak.net Wed Dec 14 12:08:32 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Wed, 14 Dec 2005 12:08:32 +0100 (CET) Subject: [pypy-svn] r21153 - 
pypy/dist/pypy/objspace/std Message-ID: <20051214110832.6899127B5C@code1.codespeak.net> Author: mwh Date: Wed Dec 14 12:08:31 2005 New Revision: 21153 Modified: pypy/dist/pypy/objspace/std/fake.py Log: this fixes the 'failing socket tests kill traceback printing' issue. i'm not 100% sure about the whys and wheres, but at this point in fake.py's life i'm not sure it's worth worrying over. it doesn't break any more tests and it doesn't break translation... Modified: pypy/dist/pypy/objspace/std/fake.py ============================================================================== --- pypy/dist/pypy/objspace/std/fake.py (original) +++ pypy/dist/pypy/objspace/std/fake.py Wed Dec 14 12:08:31 2005 @@ -80,7 +80,7 @@ else: for s, v in cpy_type.__dict__.items(): if not (cpy_type is unicode and s in ['__add__', '__contains__']): - if s != '__getattribute__' or cpy_type is type(sys): + if s != '__getattribute__' or cpy_type is type(sys) or cpy_type is type(Exception): kw[s] = v kw['__module__'] = cpy_type.__module__ From arigo at codespeak.net Wed Dec 14 12:48:44 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Wed, 14 Dec 2005 12:48:44 +0100 (CET) Subject: [pypy-svn] r21156 - pypy/dist/pypy/interpreter Message-ID: <20051214114844.8F5F127B5E@code1.codespeak.net> Author: arigo Date: Wed Dec 14 12:48:43 2005 New Revision: 21156 Modified: pypy/dist/pypy/interpreter/typedef.py Log: Simplified the logic in get_unique_user_subclass(), using the support for bools in memo calls. 
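The simplification is easiest to see outside the interpreter: once bools are accepted as memo keys, the eight generated `get_unique_interplevel_*` helpers collapse into a single function keyed on the whole `(cls, flag, flag, flag)` tuple. A minimal stand-alone sketch of that pattern (with `type()` standing in for the real `_buildusercls()`):

```python
_subclass_cache = {}

def get_unique_subclass(cls, hasdict, wants_slots, needsdel=False):
    # one cache for all flag combinations; the annotator's new support
    # for bools in memo calls makes this single entry point specializable
    key = cls, hasdict, wants_slots, needsdel
    try:
        return _subclass_cache[key]
    except KeyError:
        name = cls.__name__
        name += hasdict and "WithDict" or "NoDict"
        name += wants_slots and "WithSlots" or "NoSlots"
        name += needsdel and "WithDel" or "NoDel"
        subcls = type(name, (cls,), {})  # stand-in for _buildusercls()
        _subclass_cache[key] = subcls
        return subcls
```

Repeated calls with the same flags return the identical class object, which is exactly what the memo specialization relies on.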
Modified: pypy/dist/pypy/interpreter/typedef.py ============================================================================== --- pypy/dist/pypy/interpreter/typedef.py (original) +++ pypy/dist/pypy/interpreter/typedef.py Wed Dec 14 12:48:43 2005 @@ -36,52 +36,19 @@ return True -# we cannot specialize:memo by more than one PBC key -# so we need to work a bit to allow that - def get_unique_interplevel_subclass(cls, hasdict, wants_slots, needsdel=False): - if needsdel: - if hasdict: - if wants_slots: - return get_unique_interplevel_WithDictWithSlotsWithDel(cls) - else: - return get_unique_interplevel_WithDictNoSlotsWithDel(cls) - else: - if wants_slots: - return get_unique_interplevel_NoDictWithSlotsWithDel(cls) - else: - return get_unique_interplevel_NoDictNoSlotsWithDel(cls) - else: - if hasdict: - if wants_slots: - return get_unique_interplevel_WithDictWithSlotsNoDel(cls) - else: - return get_unique_interplevel_WithDictNoSlotsNoDel(cls) - else: - if wants_slots: - return get_unique_interplevel_NoDictWithSlotsNoDel(cls) - else: - return get_unique_interplevel_NoDictNoSlotsNoDel(cls) -get_unique_interplevel_subclass._annspecialcase_ = "specialize:arg0" - -for hasdict in False, True: - for wants_del in False, True: - for wants_slots in False, True: - name = hasdict and "WithDict" or "NoDict" - name += wants_slots and "WithSlots" or "NoSlots" - name += wants_del and "WithDel" or "NoDel" - funcname = "get_unique_interplevel_%s" % (name,) - exec compile2(""" - subclass_cache_%(name)s = {} - def %(funcname)s(cls): - try: - return subclass_cache_%(name)s[cls] - except KeyError: - subcls = _buildusercls(cls, %(hasdict)r, %(wants_slots)r, %(wants_del)r) - subclass_cache_%(name)s[cls] = subcls - return subcls - %(funcname)s._annspecialcase_ = "specialize:memo" - """ % locals()) + key = cls, hasdict, wants_slots, needsdel + try: + return _subclass_cache[key] + except KeyError: + name = hasdict and "WithDict" or "NoDict" + name += wants_slots and "WithSlots" or "NoSlots" + 
name += needsdel and "WithDel" or "NoDel" + subcls = _buildusercls(cls, hasdict, wants_slots, needsdel) + _subclass_cache[key] = subcls + return subcls +get_unique_interplevel_subclass._annspecialcase_ = "specialize:memo" +_subclass_cache = {} def _buildusercls(cls, hasdict, wants_slots, wants_del): "NOT_RPYTHON: initialization-time only" From mwh at codespeak.net Wed Dec 14 13:09:14 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Wed, 14 Dec 2005 13:09:14 +0100 (CET) Subject: [pypy-svn] r21157 - in pypy/dist/pypy/objspace/flow: . test Message-ID: <20051214120914.F1A4D27B68@code1.codespeak.net> Author: mwh Date: Wed Dec 14 13:09:13 2005 New Revision: 21157 Added: pypy/dist/pypy/objspace/flow/test/test_checkgraph.py (contents, props changed) Modified: pypy/dist/pypy/objspace/flow/model.py Log: rewrite checkgraph to use iterblocks() also add another check: that exitswitch != None => len(exits) > 1 add a fair few tests for things that checkgraph should check. Modified: pypy/dist/pypy/objspace/flow/model.py ============================================================================== --- pypy/dist/pypy/objspace/flow/model.py (original) +++ pypy/dist/pypy/objspace/flow/model.py Wed Dec 14 13:09:13 2005 @@ -158,7 +158,7 @@ def __init__(self, inputargs): self.isstartblock = False - self.inputargs = list(inputargs) # mixed list of variable/const + self.inputargs = list(inputargs) # mixed list of variable/const XXX self.operations = [] # list of SpaceOperation(s) self.exitswitch = None # a variable or # Constant(last_exception), see below @@ -462,106 +462,103 @@ def checkgraph(graph): "Check the consistency of a flow graph." 
- if __debug__: - this_block = [None] + if not __debug__: + return + try: + + vars_previous_blocks = {} + exitblocks = {graph.returnblock: 1, # retval graph.exceptblock: 2} # exc_cls, exc_value - - def visit(block): - if isinstance(block, Block): - this_block[0] = block - assert bool(block.isstartblock) == (block is graph.startblock) - if not block.exits: - assert block in exitblocks - vars = {} - - def definevar(v, only_in_link=None): - assert isinstance(v, Variable) - assert v not in vars, "duplicate variable %r" % (v,) - assert v not in vars_previous_blocks, ( - "variable %r used in more than one block" % (v,)) - vars[v] = only_in_link - - def usevar(v, in_link=None): - assert v in vars - if in_link is not None: - assert vars[v] is None or vars[v] is in_link - - for v in block.inputargs: - definevar(v) - - for op in block.operations: - for v in op.args: - assert isinstance(v, (Constant, Variable)) - if isinstance(v, Variable): - usevar(v) - else: - assert v.value is not last_exception - #assert v.value != last_exc_value - definevar(op.result) - - exc_links = {} - if block.exitswitch is None: - assert len(block.exits) <= 1 - if block.exits: - assert block.exits[0].exitcase is None - elif block.exitswitch == c_last_exception: - assert len(block.operations) >= 1 - # check if an exception catch is done on a reasonable - # operation - assert block.operations[-1].opname not in ("keepalive", - "cast_pointer", - "same_as") - assert len(block.exits) >= 2 - assert block.exits[0].exitcase is None - for link in block.exits[1:]: - assert issubclass(link.exitcase, Exception) - exc_links[link] = True - else: - assert isinstance(block.exitswitch, Variable) - assert block.exitswitch in vars - allexitcases = {} - for link in block.exits: - assert len(link.args) == len(link.target.inputargs) - assert link.prevblock is block - exc_link = link in exc_links - if exc_link: - for v in [link.last_exception, link.last_exc_value]: - assert isinstance(v, (Variable, Constant)) - if 
isinstance(v, Variable): - definevar(v, only_in_link=link) + for block, nbargs in exitblocks.items(): + assert len(block.inputargs) == nbargs + assert not block.operations + assert not block.exits + + for block in graph.iterblocks(): + assert bool(block.isstartblock) == (block is graph.startblock) + if not block.exits: + assert block in exitblocks + vars = {} + + def definevar(v, only_in_link=None): + assert isinstance(v, Variable) + assert v not in vars, "duplicate variable %r" % (v,) + assert v not in vars_previous_blocks, ( + "variable %r used in more than one block" % (v,)) + vars[v] = only_in_link + + def usevar(v, in_link=None): + assert v in vars + if in_link is not None: + assert vars[v] is None or vars[v] is in_link + + for v in block.inputargs: + definevar(v) + + for op in block.operations: + for v in op.args: + assert isinstance(v, (Constant, Variable)) + if isinstance(v, Variable): + usevar(v) else: - assert link.last_exception is None - assert link.last_exc_value is None - for v in link.args: - assert isinstance(v, (Constant, Variable)) + assert v.value is not last_exception + #assert v.value != last_exc_value + definevar(op.result) + + exc_links = {} + if block.exitswitch is None: + assert len(block.exits) <= 1 + if block.exits: + assert block.exits[0].exitcase is None + elif block.exitswitch == Constant(last_exception): + assert len(block.operations) >= 1 + # check if an exception catch is done on a reasonable + # operation + assert block.operations[-1].opname not in ("keepalive", + "cast_pointer", + "same_as") + assert len(block.exits) >= 2 + assert block.exits[0].exitcase is None + for link in block.exits[1:]: + assert issubclass(link.exitcase, Exception) + exc_links[link] = True + else: + assert isinstance(block.exitswitch, Variable) + assert block.exitswitch in vars + assert len(block.exits) > 1 + + allexitcases = {} + for link in block.exits: + assert len(link.args) == len(link.target.inputargs) + assert link.prevblock is block + exc_link = link 
in exc_links + if exc_link: + for v in [link.last_exception, link.last_exc_value]: + assert isinstance(v, (Variable, Constant)) if isinstance(v, Variable): - usevar(v, in_link=link) - if exc_link: - assert v != block.operations[-1].result - #else: - # if not exc_link: - # assert v.value is not last_exception - # #assert v.value != last_exc_value - allexitcases[link.exitcase] = True - assert len(allexitcases) == len(block.exits) - vars_previous_blocks.update(vars) - - try: - for block, nbargs in exitblocks.items(): - this_block[0] = block - assert len(block.inputargs) == nbargs - assert not block.operations - assert not block.exits - - vars_previous_blocks = {} - - traverse(visit, graph) - - except AssertionError, e: - # hack for debug tools only - #graph.show() # <== ENABLE THIS TO SEE THE BROKEN GRAPH - if this_block[0] and not hasattr(e, '__annotator_block'): - setattr(e, '__annotator_block', this_block[0]) - raise + definevar(v, only_in_link=link) + else: + assert link.last_exception is None + assert link.last_exc_value is None + for v in link.args: + assert isinstance(v, (Constant, Variable)) + if isinstance(v, Variable): + usevar(v, in_link=link) + if exc_link: + assert v != block.operations[-1].result + #else: + # if not exc_link: + # assert v.value is not last_exception + # #assert v.value != last_exc_value + allexitcases[link.exitcase] = True + assert len(allexitcases) == len(block.exits) + vars_previous_blocks.update(vars) + + except AssertionError, e: + # hack for debug tools only + #graph.show() # <== ENABLE THIS TO SEE THE BROKEN GRAPH + if block and not hasattr(e, '__annotator_block'): + setattr(e, '__annotator_block', block) + raise Added: pypy/dist/pypy/objspace/flow/test/test_checkgraph.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/objspace/flow/test/test_checkgraph.py Wed Dec 14 13:09:13 2005 @@ -0,0 +1,102 @@ +from pypy.objspace.flow.model import * +import py + +def 
test_mingraph(): + g = FunctionGraph("g", Block([])) + g.startblock.closeblock(Link([Constant(1)], g.returnblock)) + checkgraph(g) + +def template(): + g = FunctionGraph("g", Block([])) + g.startblock.closeblock(Link([Constant(1)], g.returnblock)) + checkgraph(g) + py.test.raises(AssertionError, checkgraph, g) + + +def test_nostartblock(): + g = FunctionGraph("g", Block([])) + g.startblock.closeblock(Link([Constant(1)], g.returnblock)) + g.startblock.isstartblock = False + py.test.raises(AssertionError, checkgraph, g) + +def test_twostartblocks(): + g = FunctionGraph("g", Block([])) + b = Block([]) + b.isstartblock = True + g.startblock.closeblock(Link([], b)) + b.closeblock(Link([Constant(1)], g.returnblock)) + py.test.raises(AssertionError, checkgraph, g) + +def test_exitlessblocknotexitblock(): + g = FunctionGraph("g", Block([])) + py.test.raises(AssertionError, checkgraph, g) + + +def test_nonvariableinputarg(): + b = Block([Constant(1)]) + g = FunctionGraph("g", b) + g.startblock.closeblock(Link([Constant(1)], g.returnblock)) + + py.test.raises(AssertionError, checkgraph, g) + +def test_multiplydefinedvars(): + v = Variable() + g = FunctionGraph("g", Block([v, v])) + g.startblock.closeblock(Link([v], g.returnblock)) + py.test.raises(AssertionError, checkgraph, g) + + v = Variable() + b = Block([v]) + b.operations.append(SpaceOperation("add", [Constant(1), Constant(2)], v)) + g = FunctionGraph("g", b) + g.startblock.closeblock(Link([v], g.returnblock)) + + py.test.raises(AssertionError, checkgraph, g) + +def test_varinmorethanoneblock(): + v = Variable() + g = FunctionGraph("g", Block([])) + g.startblock.operations.append(SpaceOperation("pos", [Constant(1)], v)) + b = Block([v]) + g.startblock.closeblock(Link([v], b)) + b.closeblock(Link([v], g.returnblock)) + py.test.raises(AssertionError, checkgraph, g) + +def test_useundefinedvar(): + v = Variable() + g = FunctionGraph("g", Block([])) + g.startblock.closeblock(Link([v], g.returnblock)) + 
py.test.raises(AssertionError, checkgraph, g) + + v = Variable() + g = FunctionGraph("g", Block([])) + g.startblock.exitswitch = v + g.startblock.closeblock(Link([Constant(1)], g.returnblock)) + py.test.raises(AssertionError, checkgraph, g) + +def test_invalid_arg(): + v = Variable() + g = FunctionGraph("g", Block([])) + g.startblock.operations.append(SpaceOperation("pos", [1], v)) + g.startblock.closeblock(Link([v], g.returnblock)) + py.test.raises(AssertionError, checkgraph, g) + +def test_invalid_links(): + g = FunctionGraph("g", Block([])) + g.startblock.closeblock(Link([Constant(1)], g.returnblock), Link([Constant(1)], g.returnblock)) + py.test.raises(AssertionError, checkgraph, g) + + v = Variable() + g = FunctionGraph("g", Block([v])) + g.startblock.exitswitch = v + g.startblock.closeblock(Link([Constant(1)], g.returnblock, True), + Link([Constant(1)], g.returnblock, True)) + py.test.raises(AssertionError, checkgraph, g) + + v = Variable() + g = FunctionGraph("g", Block([v])) + g.startblock.exitswitch = v + g.startblock.closeblock(Link([Constant(1)], g.returnblock)) + checkgraph(g) + py.test.raises(AssertionError, checkgraph, g) + From cfbolz at codespeak.net Wed Dec 14 23:40:23 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Wed, 14 Dec 2005 23:40:23 +0100 (CET) Subject: [pypy-svn] r21163 - pypy/dist/pypy/translator/backendopt Message-ID: <20051214224023.1829027B5C@code1.codespeak.net> Author: cfbolz Date: Wed Dec 14 23:40:21 2005 New Revision: 21163 Modified: pypy/dist/pypy/translator/backendopt/inline.py Log: make functions that are called exactly once more likely to get inlined. 
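The heuristic change boils down to one scaling factor: since the inliner pops the lowest weight from the heap first, shrinking the weight of single-caller functions makes them get inlined much earlier. A stand-alone sketch (the cost numbers are illustrative, not the real `measure_median_execution_cost`):

```python
def inlining_weight(median_cost, instruction_count, callers=None):
    # a function with exactly one call site is nearly free to inline:
    # scale its weight down so it rises to the top of the heap
    factor = 1.0
    if callers is not None and len(callers) == 1:
        factor = 0.1
    return (0.9999 * median_cost + instruction_count) * factor
```

With two callers the weight is unchanged; with one caller it drops by an order of magnitude.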
Modified: pypy/dist/pypy/translator/backendopt/inline.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/inline.py (original) +++ pypy/dist/pypy/translator/backendopt/inline.py Wed Dec 14 23:40:21 2005 @@ -320,10 +320,14 @@ count += block_weight(block) return count -def inlining_heuristic(graph): +def inlining_heuristic(graph, callers=None, callees=None): # XXX ponderation factors? + factor = 1 + if callers is not None: + if len(callers) == 1: + factor = 0.1 return (0.9999 * measure_median_execution_cost(graph) + - static_instruction_count(graph)) + static_instruction_count(graph)) * factor def static_callers(translator, ignore_primitives=False): @@ -361,7 +365,7 @@ while fiboheap: weight, graph = fiboheap[0] if not valid_weight.get(graph): - weight = inlining_heuristic(graph) + weight = inlining_heuristic(graph, callers.get(graph), callees.get(graph)) #print ' + cost %7.2f %50s' % (weight, graph.name) heapreplace(fiboheap, (weight, graph)) valid_weight[graph] = True From ale at codespeak.net Thu Dec 15 10:31:40 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Thu, 15 Dec 2005 10:31:40 +0100 (CET) Subject: [pypy-svn] r21164 - in pypy/dist/pypy/lib/pyontology: . test Message-ID: <20051215093140.AE77D27B58@code1.codespeak.net> Author: ale Date: Thu Dec 15 10:31:39 2005 New Revision: 21164 Added: pypy/dist/pypy/lib/pyontology/ pypy/dist/pypy/lib/pyontology/__init__.py pypy/dist/pypy/lib/pyontology/pyontology.py pypy/dist/pypy/lib/pyontology/test/ pypy/dist/pypy/lib/pyontology/test/test_ontology.py Log: First checkin of OWL parser/inferer. This is work in progress. The tests are disabled if rdflib and/or logilab.constraint is not installed. 
There is still a lot missing Added: pypy/dist/pypy/lib/pyontology/__init__.py ============================================================================== Added: pypy/dist/pypy/lib/pyontology/pyontology.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/lib/pyontology/pyontology.py Thu Dec 15 10:31:39 2005 @@ -0,0 +1,473 @@ +from rdflib import Graph, URIRef, BNode +from logilab.constraint import Repository, Solver +from logilab.constraint.fd import Equals, AllDistinct, BinaryExpression, Expression +from logilab.constraint.fd import FiniteDomain as fd +from logilab.constraint.propagation import AbstractDomain, AbstractConstraint, ConsistencyFailure +import sys + +namespaces = {'rdf':'http://www.w3.org/1999/02/22-rdf-syntax-ns', + 'rdfs':'http://www.w3.org/2000/01/rdf-schema', + 'dc':'http://purl.org/dc/elements/1.0/', + 'xmlns':'http://www.w3.org/1999/xhtml', + 'owl':'http://www.w3.org/2002/07/owl', +} +uris = {} +for k,v in namespaces.items(): + uris[v] = k + +Thing = URIRef(u'http://www.w3.org/2002/07/owl#Thing') +Class = URIRef(u'http://www.w3.org/2002/07/owl#Class') +builtin_voc = [ + 'Thing', + 'Class', + 'ObjectProperty', + 'AllDifferent', + 'AnnotationProperty', + 'DataRange', + 'DatatypeProperty', + 'DeprecatedClass', + 'DeprecatedProperty', + 'FunctionalProperty', + 'InverseFunctionalProperty', + 'Nothing', + 'ObjectProperty', + 'Ontology', + 'OntologyProperty', + 'Restriction', + 'SymmetricProperty', + 'TransitiveProperty' + ] + +class Ontology(Graph): + + def __init__(self): + Graph.__init__(self) + self.variables = {} + self.constraints = [] + self.seen = {} + self.var2ns ={} + + def add_file(self, f): + tmp = Graph() + tmp.load(f) + for i in tmp.triples((None,)*3): + self.add(i) + + def attach_fd(self): + for (s, p, o) in (self.triples((None, None, None))): + if p.find('#') != -1: + owl,func = p.split('#') + else: + owl ='' + func = p + #print s, p, o + #raise Exception + if owl in 
[namespaces['owl'],namespaces['rdf'],namespaces['rdfs']]: + pred = getattr(self, func) + else: + pred = None + if pred: + res = pred(s, p, o) + if res == None: + continue + if type(res) != list : + res = [res] + avar = self.make_var(s) + else: + res = [o] + avar = self.make_var(s,p) + if self.variables.get(avar) and type(self.variables[avar]) == fd: + self.variables[avar] = fd(list(self.variables[avar].getValues()) + res) + else: + self.variables[avar] = fd(res) + # for var in self.seen: + # self.variables.pop(var) + # self.seen = {} + + def solve(self,verbose=0): + rep = Repository(self.variables.keys(), self.variables, self.constraints) + return Solver().solve(rep,verbose) + + def consistency(self): + rep = Repository(self.variables.keys(), self.variables, self.constraints) + rep.consistency() + + def get_list(self, subject): + res = [] + p = URIRef(u'http://www.w3.org/1999/02/22-rdf-syntax-ns#first') + first = list(self.objects(subject, p)) + assert len(first) == 1 + self.seen[self.make_var(subject,p)]= 1 + if type(first[0]) == URIRef: + var = self.make_var(first[0]) + if var not in self.variables.keys(): + self.variables[var] = ClassDomain(var) + res += first + + p = URIRef(u'http://www.w3.org/1999/02/22-rdf-syntax-ns#rest') + rest = list(self.objects(subject, p)) + self.seen[self.make_var(subject,p)]= 1 + if "#nil" in rest[0] : + return res + else: + res += self.get_list(rest[0]) + return res + + def make_var(self,*args): + res = [] + for a in args: + if type(a) == URIRef: + if a.find('#') != -1: + ns,name = a.split('#') + else: + ns,name = a,'' + if ns not in uris.keys(): + uris[ns] = ns.split('/')[-1] + namespaces[uris[ns]] = ns + mangle_name = uris[ns] + '_' + name + res.append(mangle_name) + else: + res.append(a) + var = '.'.join([str(a.replace('-','_')) for a in res]) + return var + + def find_prop(self, s): + p = URIRef(u'http://www.w3.org/2002/07/owl#onProperty') + pr = list(self.objects(s,p)) + assert len(pr) == 1 + return pr[0] + + def find_cls(self, 
s): + p = URIRef(u'http://www.w3.org/1999/02/22-rdf-syntax-ns#type') + r = URIRef(u'http://www.w3.org/2000/01/rdf-schema#subClassOf') + if type(s) == BNode: + pr = list( self.subjects(p,s) ) + if len(pr) == 0: + return + # pr = list( self.subjects(r,s) ) + # assert len(pr) == 1 + return pr[0] + else: + return s + + def find_uriref(self, s): + while type(s) == BNode: + s = list(self.subjects(None,s))[0] + return s + + def find_property(self, s): + prop = self.find_prop(s) + cls = self.find_cls(s) + if cls : + avar = self.make_var(cls, prop) + else: + avar = self.make_var( prop) + if not self.variables.get(avar): + self.variables[avar] = ClassDomain(avar) + return avar + +#---------------- Implementation ---------------- + + def type(self, s, p, var): + avar = self.make_var(var) + svar = self.make_var(s) + if (type(var) == URIRef and not + (var in [URIRef(namespaces['owl']+'#'+x) for x in builtin_voc])): + # var is not one of the builtin classes + if not self.variables.get(svar): + self.variables[svar] = ClassDomain(svar) + if not self.variables.get(avar): + self.variables[avar] = ClassDomain(avar) + +# if self.variables[avar].values: + self.variables[svar].values += self.variables[avar].values + constrain = BinaryExpression([svar, avar],"%s in %s" %(svar, avar)) + self.constraints.append(constrain) + else: + # var is a builtin class + pass + + def first(self, s, p, var): + pass + + def rest(self, s, p, var): + pass + + def range(self, s, p, var): + pass + + def domain(self, s, p, var): + pass + +# --------- Class Axioms --------------------- + + def subClassOf(self, s, p, var): + # s is a subclass of var means that the + # class extension of s is a subset of the + # class extension of var. 
+ avar = self.make_var(var) + svar = self.make_var(s) + if not self.variables.get(avar): + self.variables[avar] = ClassDomain(avar) + constrain = SubClassConstraint(svar, avar) + self.constraints.append(constrain) + + def equivalentClass(self, s, p, var): + avar = self.make_var(var) + svar = self.make_var(s) + if not self.variables.get(avar): + self.variables[avar] = ClassDomain(avar) +# constrain = EquivalentClassConstraint(svar, avar) +# self.constraints.append(constrain) + self.subClassOf(s, p, var) + self.subClassOf(var, p, s) + + def disjointWith(self, s, p, var): + avar = self.make_var(var) + svar = self.make_var(s) + if not self.variables.get(avar): + self.variables[avar] = ClassDomain(avar) + constrain = DisjointClassConstraint(svar, avar) + self.constraints.append(constrain) + + def oneOf(self, s, p, var): + res = self.get_list(var) + prop = self.find_uriref(s) + avar = self.make_var( prop) + if self.variables.get(avar) and type(self.variables[avar]) == fd: + self.variables[avar] = fd(list(self.variables[avar].getValues()) + res) + else: + self.variables[avar] = fd(res) + + def maxCardinality(self, s, p, var): + """ Len of finite domain of the property shall be less than or equal to var""" + avar = self.find_property(s) + constrain = MaxCardinality(avar,int(var)) + self.constraints.append(constrain) + + def minCardinality(self, s, p, var): + """ Len of finite domain of the property shall be greater than or equal to var""" + avar = self.find_property(s) + constrain = MinCardinality(avar,int(var)) + self.constraints.append(constrain) + + def cardinality(self, s, p, var): + """ Len of finite domain of the property shall be equal to var""" + avar = self.find_property(s) + # Check if var is an int, else find the int buried in the structure + constrain = Cardinality(avar,int(var)) + self.constraints.append(constrain) + + def unionOf(self,s, p, var): + res = self.get_list(var) + return res #There might be doubles (but fd takes care of that) + + def 
intersectionOf(self, s, p, var): + res = self.get_list(var) + result = {}.fromkeys(res[0]) + for el in res: + for cls in result.keys(): + if cls not in el: + result.pop(cls) + return result.keys() + + def differentFrom(self, s, p, var): + s_var = self.make_var(s) + var_var = self.make_var(var) + if not self.variables.get(s_var): + self.variables[s_var] = ClassDomain(s_var) + if not self.variables.get(var_var): + self.variables[var_var] = fd([]) + constrain = BinaryExpression([s_var, var_var],"%s != %s" %(s_var, var_var)) + self.constraints.append(constrain) + + def distinctMembers(self, s, p, var): + res = self.get_list(var) + self.constraints.append(AllDistinct([self.make_var(y) for y in res])) + return res + + def sameAs(self, s, p, var): + constrain = BinaryExpression([self.make_var(s), self.make_var(var)],"%s == %s" %(self.make_var(s), self.make_var( var))) + self.constraints.append(constrain) + + def complementOf(self, s, p, var): + # add constraint of not var + pass + + def onProperty(self, s, p, var): + pass + + def hasValue(self, s, p, var): + pass + + def allValuesFrom(self, s, p, var): + pass + + def someValuesFrom(self, s, p, var): + pass + + def equivalentProperty(self, s, p, var): + pass + + def inverseOf(self, s, p, var): + pass + + def someValuesFrom(self, s, p, var): + pass + + def subPropertyOf(self, s, p, var): + pass + + def imports(self, s, p, var): + pass + +# ----------------- Helper classes ---------------- + +class MaxCardinality(AbstractConstraint): + """Contraint: all values must be distinct""" + + def __init__(self, variable, cardinality): + AbstractConstraint.__init__(self, [variable]) + # worst case complexity + self.__cost = 1 #len(variables) * (len(variables) - 1) / 2 + self.cardinality = cardinality + + def __repr__(self): + return '' % (str(self._variables[0]),self.cardinality) + + def estimateCost(self, domains): + return self.__cost + + def narrow(self, domains): + """narrowing algorithm for the constraint""" + if 
len(domains[self._variables[0]]) > self.cardinality: + print " I Think I will raise an exception" + raise ConsistencyFailure("Maxcardinality exceeded") + else: + return 1 + +class MinCardinality(MaxCardinality): + + def __repr__(self): + return '' % (str(self._variables[0]),self.cardinality) + + def narrow(self, domains): + """narrowing algorithm for the constraint""" + + if len(domains[self._variables[0]]) < self.cardinality: + raise ConsistencyFailure() + else: + return 1 + +class Cardinality(MaxCardinality): + + def __repr__(self): + return '' % (str(self._variables[0]),self.cardinality) + + def narrow(self, domains): + """narrowing algorithm for the constraint""" + + if len(domains[self._variables[0]]) != self.cardinality: + raise ConsistencyFailure() + else: + return 1 + +def get_bases(cls_dom, domains): + res = {} + for bas in cls_dom.bases: + res[bas] = 1 + if bas in domains.keys(): + res.update( get_bases(bas, domains)) + res[cls_dom] = 1 + return res + +class SubClassConstraint(AbstractConstraint): + + def __init__(self, variable, cls_or_restriction): + AbstractConstraint.__init__(self, [variable]) + # worst case complexity + self.__cost = 1 #len(variables) * (len(variables) - 1) / 2 + self.super = cls_or_restriction + self.variable = variable + + def narrow(self, domains): + subdom = domains[self.variable] + superdom = domains[self.super] + bases = get_bases(superdom, domains).keys() + print subdom,superdom, bases, subdom.bases + subdom.bases += [bas for bas in bases if bas not in subdom.bases] + +class EquivalentClassConstraint(AbstractConstraint): + + def __init__(self, variable, cls_or_restriction): + AbstractConstraint.__init__(self, [variable]) + # worst case complexity + self.__cost = 1 #len(variables) * (len(variables) - 1) / 2 + self.other = cls_or_restriction + self.variable = variable + + def narrow(self, domains): + subdom = domains[self.variable] + otherdom = domains[self.other] + bases = get_bases(subdom, domains).keys() + otherbases = 
get_bases(otherdom, domains).keys() + print subdom, otherdom, "----",bases , otherbases + if bases != otherbases: + raise ConsistencyFailure() + else: + return 1 + +class DisjointClassConstraint(AbstractConstraint): + + def __init__(self, variable, cls_or_restriction): + AbstractConstraint.__init__(self, [variable]) + # worst case complexity + self.__cost = 1 #len(variables) * (len(variables) - 1) / 2 + self.super = cls_or_restriction + self.variable = variable + + def narrow(self, domains): + subdom = domains[self.variable] + superdom = domains[self.super] + bases = get_bases(superdom, domains).keys() + print subdom,superdom, bases, subdom.bases + subdom.bases += [bas for bas in bases if bas not in subdom.bases] + +class ClassDomain(AbstractDomain): + # Class domain is intended as a (abstract/virtual) domain for implementing + # Class axioms. Working on class descriptions the class domain should allow + # creation of classes through axioms. + # The instances of a class can be represented as a FiniteDomain in values (not always see Disjointwith) + # Properties of a class is in the dictionary "properties" + # The bases of a class is in the list "bases" + + def __init__(self, name='', values=[], bases = []): + AbstractDomain.__init__(self) + self.bases = bases+[self] + self.values = values + self.name = name + + def __repr__(self): + return "" % str(self.name) + + def __getitem__(self, index): + return None + + def __iter__(self): + return iter(self.bases) + + def size(self): + return sys.maxint + + __len__ = size + + def copy(self): + return self + + def removeValues(self, values): + print "remove values from ClassDomain", values + self.bases.pop(self.bases.index(values[0])) + + def getValues(self): + return self.bases + Added: pypy/dist/pypy/lib/pyontology/test/test_ontology.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/lib/pyontology/test/test_ontology.py Thu Dec 15 10:31:39 2005 @@ -0,0 +1,97 
@@ +# tests for the Ontology class + +try: + import logilab.constraint + import rdflib +except ImportError: + import py + py.test.skip("Logilab.constraint and/or rdflib not installed") + +from pypy.lib.pyontology.pyontology import * # Ontology, ClassDomain, SubClassConstraint +from rdflib import Graph, URIRef, BNode + +def test_makevar(): + O = Ontology() + var = URIRef(u'http://www.w3.org/2002/03owlt/unionOf/premises004#A-and-B') + cod = O.make_var(var)+' = 1' + exec cod + assert O.make_var(var) in locals() + +def DONOT_test_subClassof(): + O = Ontology() + a = b = c = URIRef(u'http://www.w3.org/2002/03owlt/unionOf/premises004#A-and-B') + O.subClassOf(b, None, a) + O.subClassOf(c, None, b) + assert O.solve() + O.subClassOf(c, None, a) + assert O.solve() + +def test_ClassDomain(): + a = ClassDomain() + assert a.bases == [a] + cls = 1 + b = ClassDomain('B',[],[a]) + assert b in b.bases + assert a in b.bases + assert len(b.bases) == 2 + +def test_subClassconstraint(): + a = ClassDomain('A') + b = ClassDomain('B') + c = ClassDomain('C') + con = SubClassConstraint('b','a') + con2 = SubClassConstraint('c','b') + con.narrow({'a': a, 'b': b, 'c': c}) + con2.narrow({'a': a, 'b': b, 'c': c}) + assert a in c.bases + assert b in c.bases + assert c in c.bases + +def test_subClassconstraintMulti(): + a = ClassDomain('A') + b = ClassDomain('B') + c = ClassDomain('C') + con = SubClassConstraint('c','a') + con2 = SubClassConstraint('c','b') + con.narrow({'a': a, 'b': b, 'c': c}) + con2.narrow({'a': a, 'b': b, 'c': c}) + assert a in c.bases + assert b in c.bases + assert c in c.bases + +def test_subClassconstraintMulti2(): + a = ClassDomain('A') + b = ClassDomain('B') + c = ClassDomain('C') + con = SubClassConstraint('c','a') + con2 = SubClassConstraint('c','b') + con3 = SubClassConstraint('a','c') + con.narrow({'a': a, 'b': b, 'c': c}) + con2.narrow({'a': a, 'b': b, 'c': c}) + con3.narrow({'a': a, 'b': b, 'c': c}) + assert a in c.bases + assert b in c.bases + assert c in c.bases + 
assert c in a.bases + assert len(c.bases) == len(a.bases) + assert [bas in a.bases for bas in c.bases] == [True]*len(a.bases) + +def DONOT_test_equivalentClass(): + a = ClassDomain('A') + b = ClassDomain('B') + c = ClassDomain('C') + con = EquivalentClassConstraint('c','a') + con2 = EquivalentClassConstraint('c','b') + con.narrow({'a': a, 'b': b, 'c': c}) + con2.narrow({'a': a, 'b': b, 'c': c}) + assert a == b + +def test_type(): + sub = URIRef('a') + pred = URIRef('type') + obj = URIRef('o') + O = Ontology() + O.type(sub, pred , obj) + assert O.variables[O.make_var(sub)].__class__ == ClassDomain + + From adim at codespeak.net Thu Dec 15 11:27:47 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Thu, 15 Dec 2005 11:27:47 +0100 (CET) Subject: [pypy-svn] r21165 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20051215102747.B64F627B62@code1.codespeak.net> Author: adim Date: Thu Dec 15 11:27:46 2005 New Revision: 21165 Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py pypy/dist/pypy/interpreter/astcompiler/astgen.py Log: fixed bug for setters() in ast.py Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.py Thu Dec 15 11:27:46 2005 @@ -282,7 +282,7 @@ def fset_nodes( space, self, w_arg): del self.nodes[:] for w_itm in space.unpackiterable(w_arg): - self.nodes.append( space.interp_w(Node, w_arg)) + self.nodes.append( space.interp_w(Node, w_itm)) def descr_And_new(space, w_subtype, w_nodes, lineno=-1): self = space.allocate_instance(And, w_subtype) @@ -417,7 +417,7 @@ def fset_nodes( space, self, w_arg): del self.nodes[:] for w_itm in space.unpackiterable(w_arg): - self.nodes.append( space.interp_w(Node, w_arg)) + self.nodes.append( space.interp_w(Node, w_itm)) def descr_AssList_new(space, w_subtype, w_nodes, lineno=-1): self = 
space.allocate_instance(AssList, w_subtype) @@ -525,7 +525,7 @@ def fset_nodes( space, self, w_arg): del self.nodes[:] for w_itm in space.unpackiterable(w_arg): - self.nodes.append( space.interp_w(Node, w_arg)) + self.nodes.append( space.interp_w(Node, w_itm)) def descr_AssTuple_new(space, w_subtype, w_nodes, lineno=-1): self = space.allocate_instance(AssTuple, w_subtype) @@ -634,7 +634,7 @@ def fset_nodes( space, self, w_arg): del self.nodes[:] for w_itm in space.unpackiterable(w_arg): - self.nodes.append( space.interp_w(Node, w_arg)) + self.nodes.append( space.interp_w(Node, w_itm)) def fget_expr( space, self): return space.wrap(self.expr) def fset_expr( space, self, w_arg): @@ -850,7 +850,7 @@ def fset_nodes( space, self, w_arg): del self.nodes[:] for w_itm in space.unpackiterable(w_arg): - self.nodes.append( space.interp_w(Node, w_arg)) + self.nodes.append( space.interp_w(Node, w_itm)) def descr_Bitand_new(space, w_subtype, w_nodes, lineno=-1): self = space.allocate_instance(Bitand, w_subtype) @@ -895,7 +895,7 @@ def fset_nodes( space, self, w_arg): del self.nodes[:] for w_itm in space.unpackiterable(w_arg): - self.nodes.append( space.interp_w(Node, w_arg)) + self.nodes.append( space.interp_w(Node, w_itm)) def descr_Bitor_new(space, w_subtype, w_nodes, lineno=-1): self = space.allocate_instance(Bitor, w_subtype) @@ -940,7 +940,7 @@ def fset_nodes( space, self, w_arg): del self.nodes[:] for w_itm in space.unpackiterable(w_arg): - self.nodes.append( space.interp_w(Node, w_arg)) + self.nodes.append( space.interp_w(Node, w_itm)) def descr_Bitxor_new(space, w_subtype, w_nodes, lineno=-1): self = space.allocate_instance(Bitxor, w_subtype) @@ -1035,7 +1035,7 @@ def fset_args( space, self, w_arg): del self.args[:] for w_itm in space.unpackiterable(w_arg): - self.args.append( space.interp_w(Node, w_arg)) + self.args.append( space.interp_w(Node, w_itm)) def fget_star_args( space, self): if self.star_args is None: return space.w_None @@ -1116,7 +1116,7 @@ def fset_bases( 
space, self, w_arg): del self.bases[:] for w_itm in space.unpackiterable(w_arg): - self.bases.append( space.interp_w(Node, w_arg)) + self.bases.append( space.interp_w(Node, w_itm)) def fget_w_doc( space, self): return self.w_doc def fset_w_doc( space, self, w_arg): @@ -1332,7 +1332,7 @@ def fset_nodes( space, self, w_arg): del self.nodes[:] for w_itm in space.unpackiterable(w_arg): - self.nodes.append( space.interp_w(Node, w_arg)) + self.nodes.append( space.interp_w(Node, w_itm)) def descr_Decorators_new(space, w_subtype, w_nodes, lineno=-1): self = space.allocate_instance(Decorators, w_subtype) @@ -1877,13 +1877,13 @@ def fset_argnames( space, self, w_arg): del self.argnames[:] for w_itm in space.unpackiterable(w_arg): - self.argnames.append( space.interp_w(Node, w_arg)) + self.argnames.append( space.interp_w(Node, w_itm)) def fget_defaults( space, self): return space.newlist( [space.wrap(itm) for itm in self.defaults] ) def fset_defaults( space, self, w_arg): del self.defaults[:] for w_itm in space.unpackiterable(w_arg): - self.defaults.append( space.interp_w(Node, w_arg)) + self.defaults.append( space.interp_w(Node, w_itm)) def fget_flags( space, self): return space.wrap(self.flags) def fset_flags( space, self, w_arg): @@ -2023,7 +2023,7 @@ def fset_ifs( space, self, w_arg): del self.ifs[:] for w_itm in space.unpackiterable(w_arg): - self.ifs.append( space.interp_w(Node, w_arg)) + self.ifs.append( space.interp_w(Node, w_itm)) def descr_GenExprFor_new(space, w_subtype, w_assign, w_iter, w_ifs, lineno=-1): self = space.allocate_instance(GenExprFor, w_subtype) @@ -2124,7 +2124,7 @@ def fset_quals( space, self, w_arg): del self.quals[:] for w_itm in space.unpackiterable(w_arg): - self.quals.append( space.interp_w(Node, w_arg)) + self.quals.append( space.interp_w(Node, w_itm)) def descr_GenExprInner_new(space, w_subtype, w_expr, w_quals, lineno=-1): self = space.allocate_instance(GenExprInner, w_subtype) @@ -2519,13 +2519,13 @@ def fset_argnames( space, self, w_arg): 
del self.argnames[:] for w_itm in space.unpackiterable(w_arg): - self.argnames.append( space.interp_w(Node, w_arg)) + self.argnames.append( space.interp_w(Node, w_itm)) def fget_defaults( space, self): return space.newlist( [space.wrap(itm) for itm in self.defaults] ) def fset_defaults( space, self, w_arg): del self.defaults[:] for w_itm in space.unpackiterable(w_arg): - self.defaults.append( space.interp_w(Node, w_arg)) + self.defaults.append( space.interp_w(Node, w_itm)) def fget_flags( space, self): return space.wrap(self.flags) def fset_flags( space, self, w_arg): @@ -2636,7 +2636,7 @@ def fset_nodes( space, self, w_arg): del self.nodes[:] for w_itm in space.unpackiterable(w_arg): - self.nodes.append( space.interp_w(Node, w_arg)) + self.nodes.append( space.interp_w(Node, w_itm)) def descr_List_new(space, w_subtype, w_nodes, lineno=-1): self = space.allocate_instance(List, w_subtype) @@ -2690,7 +2690,7 @@ def fset_quals( space, self, w_arg): del self.quals[:] for w_itm in space.unpackiterable(w_arg): - self.quals.append( space.interp_w(Node, w_arg)) + self.quals.append( space.interp_w(Node, w_itm)) def descr_ListComp_new(space, w_subtype, w_expr, w_quals, lineno=-1): self = space.allocate_instance(ListComp, w_subtype) @@ -2754,7 +2754,7 @@ def fset_ifs( space, self, w_arg): del self.ifs[:] for w_itm in space.unpackiterable(w_arg): - self.ifs.append( space.interp_w(Node, w_arg)) + self.ifs.append( space.interp_w(Node, w_itm)) def descr_ListCompFor_new(space, w_subtype, w_assign, w_list, w_ifs, lineno=-1): self = space.allocate_instance(ListCompFor, w_subtype) @@ -3109,7 +3109,7 @@ def fset_nodes( space, self, w_arg): del self.nodes[:] for w_itm in space.unpackiterable(w_arg): - self.nodes.append( space.interp_w(Node, w_arg)) + self.nodes.append( space.interp_w(Node, w_itm)) def descr_Or_new(space, w_subtype, w_nodes, lineno=-1): self = space.allocate_instance(Or, w_subtype) @@ -3242,7 +3242,7 @@ def fset_nodes( space, self, w_arg): del self.nodes[:] for w_itm in 
space.unpackiterable(w_arg): - self.nodes.append( space.interp_w(Node, w_arg)) + self.nodes.append( space.interp_w(Node, w_itm)) def fget_dest( space, self): if self.dest is None: return space.w_None @@ -3303,7 +3303,7 @@ def fset_nodes( space, self, w_arg): del self.nodes[:] for w_itm in space.unpackiterable(w_arg): - self.nodes.append( space.interp_w(Node, w_arg)) + self.nodes.append( space.interp_w(Node, w_itm)) def fget_dest( space, self): if self.dest is None: return space.w_None @@ -3613,7 +3613,7 @@ def fset_nodes( space, self, w_arg): del self.nodes[:] for w_itm in space.unpackiterable(w_arg): - self.nodes.append( space.interp_w(Node, w_arg)) + self.nodes.append( space.interp_w(Node, w_itm)) def descr_Sliceobj_new(space, w_subtype, w_nodes, lineno=-1): self = space.allocate_instance(Sliceobj, w_subtype) @@ -3658,7 +3658,7 @@ def fset_nodes( space, self, w_arg): del self.nodes[:] for w_itm in space.unpackiterable(w_arg): - self.nodes.append( space.interp_w(Node, w_arg)) + self.nodes.append( space.interp_w(Node, w_itm)) def descr_Stmt_new(space, w_subtype, w_nodes, lineno=-1): self = space.allocate_instance(Stmt, w_subtype) @@ -3767,7 +3767,7 @@ def fset_subs( space, self, w_arg): del self.subs[:] for w_itm in space.unpackiterable(w_arg): - self.subs.append( space.interp_w(Node, w_arg)) + self.subs.append( space.interp_w(Node, w_itm)) def descr_Subscript_new(space, w_subtype, w_expr, w_flags, w_subs, lineno=-1): self = space.allocate_instance(Subscript, w_subtype) @@ -3967,7 +3967,7 @@ def fset_nodes( space, self, w_arg): del self.nodes[:] for w_itm in space.unpackiterable(w_arg): - self.nodes.append( space.interp_w(Node, w_arg)) + self.nodes.append( space.interp_w(Node, w_itm)) def descr_Tuple_new(space, w_subtype, w_nodes, lineno=-1): self = space.allocate_instance(Tuple, w_subtype) Modified: pypy/dist/pypy/interpreter/astcompiler/astgen.py ============================================================================== --- 
pypy/dist/pypy/interpreter/astcompiler/astgen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/astgen.py Thu Dec 15 11:27:46 2005 @@ -331,7 +331,7 @@ elif prop[attr] == P_NESTED: print >> buf, " del self.%s[:]" % attr print >> buf, " for w_itm in space.unpackiterable(w_arg):" - print >> buf, " self.%s.append( space.interp_w(Node, w_arg))" % attr + print >> buf, " self.%s.append( space.interp_w(Node, w_itm))" % attr elif prop[attr] == P_NONE: print >> buf, " self.%s = space.interp_w(Node, w_arg, can_be_None=True)" % attr else: # P_NODE From arigo at codespeak.net Thu Dec 15 12:14:24 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 15 Dec 2005 12:14:24 +0100 (CET) Subject: [pypy-svn] r21166 - pypy/dist/pypy/translator/c/test Message-ID: <20051215111424.0BD9627B62@code1.codespeak.net> Author: arigo Date: Thu Dec 15 12:14:23 2005 New Revision: 21166 Modified: pypy/dist/pypy/translator/c/test/test_backendoptimized.py Log: casting an object to an int doesn't always return a positive number. 
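[Editorial note: the point in the log above — an object's address reinterpreted as a machine integer carries no sign guarantee — can be pictured with a small sketch. This is plain CPython, not PyPy's cast_object_to_int; the `struct` round-trip merely mimics the unsigned-to-signed reinterpretation.]

```python
# Hypothetical illustration, not PyPy code: an object's address viewed as a
# *signed* machine word has no guaranteed sign, which is why the test below
# stops asserting 'res > 0'.
import struct

obj = object()
addr = id(obj)  # CPython: the object's address, as a non-negative int
# Reinterpret the unsigned address ('N' = native size_t) as a signed
# word of the same width ('n' = native ssize_t).
signed = struct.unpack('n', struct.pack('N', addr))[0]
# 'signed' comes out negative whenever the address has its top bit set;
# the two views always agree modulo 2**bits.
bits = struct.calcsize('N') * 8
assert signed % (2 ** bits) == addr % (2 ** bits)
```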
Modified: pypy/dist/pypy/translator/c/test/test_backendoptimized.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_backendoptimized.py (original) +++ pypy/dist/pypy/translator/c/test/test_backendoptimized.py Thu Dec 15 12:14:23 2005 @@ -90,7 +90,7 @@ objectmodel.cast_int_to_object(i, A)) == i fn = self.getcompiled(f) res = fn() - assert res > 0 + # cannot really test anything about 'res' here gn = self.getcompiled(g) res = gn() assert res From tismer at codespeak.net Thu Dec 15 12:14:54 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Thu, 15 Dec 2005 12:14:54 +0100 (CET) Subject: [pypy-svn] r21167 - pypy/dist/pypy/translator/c/test Message-ID: <20051215111454.3C89027B70@code1.codespeak.net> Author: tismer Date: Thu Dec 15 12:14:52 2005 New Revision: 21167 Added: pypy/dist/pypy/translator/c/test/test_coroutine.py Log: trying to build a minimal coroutine implementation. There seem to be some issues left which keep me from verifying if this is right. Added: pypy/dist/pypy/translator/c/test/test_coroutine.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/c/test/test_coroutine.py Thu Dec 15 12:14:52 2005 @@ -0,0 +1,187 @@ +""" +Hi Armin: +When I set DEBUG to False, the program crashes. Maybe I'm doing something +wrong and re-using some used continuation, don't know. +So I was trying to set things to None after becoming invalid, but +that breaks the rtyper. 
+""" + +DEBUG = False +# set to true and compilation crashes +USE_NONE = False +# set to true and rtyper crashes + +# the above are exclusive right now + +CHECKED_IN = True +# set this to false to skip skipping :-) + +import os +from pypy.rpython.rstack import yield_current_frame_to_caller + +def wrap_stackless_function(fn): + from pypy.translator.translator import TranslationContext + from pypy.translator.c.genc import CStandaloneBuilder + from pypy.annotation.model import SomeList, SomeString + from pypy.annotation.listdef import ListDef + from pypy.translator.backendopt.all import backend_optimizations + + def entry_point(argv): + os.write(1, str(fn())) + return 0 + + s_list_of_strings = SomeList(ListDef(None, SomeString())) + s_list_of_strings.listdef.resize() + t = TranslationContext() + t.buildannotator().build_types(entry_point, [s_list_of_strings]) + #t.view() + t.buildrtyper().specialize() + backend_optimizations(t) + cbuilder = CStandaloneBuilder(t, entry_point) + cbuilder.stackless = True + cbuilder.generate_source() + cbuilder.compile() + return cbuilder.cmdexec('') + +# ____________________________________________________________ + +""" +Trying to build the simplest possible coroutine interface. + +A coroutine is a tiny wrapper around a frame, or better +to say a one-shot continuation. This continuation is +resumed whenever we switch to the coroutine. On depart, +the coroutine is updated with its current state, that is, +the continuation is replaced. To avoid confusion with +general continuations, we are naming them as 'frame' +in the code. By frame, we are referring to the toplevel +frame as a placeholder for the whole structure appended +to it. This might be a chain of frames, or even a special +stack structure, when we implement 'hard switching'. The +abstraction layer should make this invisible. + +The 'seed' of coroutines is actually the special function +yield_current_frame_to_caller(). It is, in a sense, able +to return twice. 
When yield_current_frame_to_caller() is +reached, it creates a resumable frame and returns it to the +caller of the current function. This frame serves as the +entry point to the coroutine. + +On every entry to the coroutine, the return value of the +point where we left off is the continuation of the caller. +We need to update the caller's frame with it. +This is not necessarily the caller which created ourself. +We are therefore keeping track of the current coroutine. + +The update sequence during a switch to a coroutine is: + +- save the return value (caller's continuation) in the + calling coroutine, which is still 'current' +- change current to ourself (the callee) +- invalidate our continuation by setting it to None. +""" + + +class CoState(object): + pass + +costate = CoState() + +class CoroutineDamage(SystemError): + pass + +class Coroutine(object): + + if DEBUG: + def __init__(self): + self._switchable = False + + if USE_NONE: + def __init__(self): + self.frame = None + + def bind(self, thunk): + if USE_NONE: + assert self.frame is None + self.frame = self._bind(thunk) + + def _bind(self, thunk): + if self is costate.current or self is costate.main: + raise CoroutineDamage + frame = yield_current_frame_to_caller() + costate.current.frame = frame + if DEBUG: + costate.current._switchable = True + assert self._switchable == True + self._switchable = False + costate.current = self + thunk.call() + return self.frame # just for the annotator + + def switch(self): + if DEBUG: + assert self._switchable == True + assert costate.current._switchable == False + if USE_NONE: + assert costate.current.frame is None + assert self.frame is not None + frame = self.frame.switch() + if DEBUG: + assert costate.current._switchable == False + costate.current._switchable = True + if USE_NONE: + assert costate.current.frame is None + costate.current.frame = frame + costate.current = self + # XXX support: self.frame = None + +costate.current = costate.main = Coroutine() + +def 
output(stuff): + os.write(2, stuff + '\n') + +def test_coroutine(): + if CHECKED_IN: + import py.test + py.test.skip("in-progress") + + def g(lst): + lst.append(2) + output('g appended 2') + costate.main.switch() + lst.append(4) + output('g appended 4') + costate.main.switch() + lst.append(6) + output('g appended 6') + + class T: + def __init__(self, func, arg): + self.func = func + self.arg = arg + def call(self): + self.func(self.arg) + + def f(): + lst = [1] + coro_g = Coroutine() + t = T(g, lst) + output('binding after f set 1') + coro_g.bind(t) + output('switching') + coro_g.switch() + lst.append(3) + output('f appended 3') + coro_g.switch() + lst.append(5) + output('f appended 5') + coro_g.switch() + lst.append(7) + output('f appended 7') + n = 0 + for i in lst: + n = n*10 + i + return n + + data = wrap_stackless_function(f) + assert int(data.strip()) == 1234567 From arigo at codespeak.net Thu Dec 15 12:54:02 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 15 Dec 2005 12:54:02 +0100 (CET) Subject: [pypy-svn] r21169 - in pypy/dist/pypy: annotation rpython rpython/test Message-ID: <20051215115402.8C8B327B62@code1.codespeak.net> Author: arigo Date: Thu Dec 15 12:53:59 2005 New Revision: 21169 Modified: pypy/dist/pypy/annotation/binaryop.py pypy/dist/pypy/rpython/rexternalobj.py pypy/dist/pypy/rpython/test/test_rexternalobj.py Log: Added support for merging SomeExternalObjects and None (thanks Christian). Cleaned up a bit the corresponding code in binaryop.py. Modified: pypy/dist/pypy/annotation/binaryop.py ============================================================================== --- pypy/dist/pypy/annotation/binaryop.py (original) +++ pypy/dist/pypy/annotation/binaryop.py Thu Dec 15 12:53:59 2005 @@ -2,6 +2,7 @@ Binary operations between SomeValues. 
""" +import py import operator from pypy.annotation.pairtype import pair, pairtype from pypy.annotation.model import SomeObject, SomeInteger, SomeBool @@ -572,57 +573,32 @@ def union((obj1, imp2)): return obj1 -class __extend__(pairtype(SomeInstance, SomePBC)): - def union((ins, pbc)): - if pbc.isNone(): - return SomeInstance(classdef=ins.classdef, can_be_None = True) - raise UnionError("mixing pbc and instance not supported anymore: %s %s" % (pbc, ins)) - # XXX is the following still useful? - #classdef = ins.classdef.superdef_containing(pbc.knowntype) - #if classdef is None: - # # print warning? - # return SomeObject() - #if not getattr(TLS, 'no_side_effects_in_union', 0): - # raise UnionError("mixing pbc and instance not supported anymore: %s %s" % (pbc, ins)) - #return SomeInstance(classdef) - -class __extend__(pairtype(SomePBC, SomeInstance)): - def union((pbc, ins)): - return pair(ins, pbc).union() - -# let mix lists and None for now -class __extend__(pairtype(SomeList, SomePBC)): - def union((lst, pbc)): - if pbc.isNone(): - return SomeList(lst.listdef) - return SomeObject() - -class __extend__(pairtype(SomePBC, SomeList )): - def union((pbc, lst)): - return pair(lst, pbc).union() - -# let mix dicts and None -class __extend__(pairtype(SomeDict, SomePBC)): - def union((dct, pbc)): - if pbc.isNone(): - return SomeDict(dct.dictdef) - return SomeObject() +# mixing Nones with other objects -class __extend__(pairtype(SomePBC, SomeDict )): - def union((pbc, dct)): - return pair(dct, pbc).union() - -# mixing strings and None - -class __extend__(pairtype(SomeString, SomePBC)): - def union((s, pbc)): - if pbc.isNone(): - return SomeString(can_be_None=True) - return SomeObject() +def _make_none_union(classname, constructor_args=''): + loc = locals() + source = py.code.Source(""" + class __extend__(pairtype(%(classname)s, SomePBC)): + def union((obj, pbc)): + if pbc.isNone(): + return %(classname)s(%(constructor_args)s) + else: + return SomeObject() -class 
__extend__(pairtype(SomePBC, SomeString )): - def union((pbc, s)): - return pair(s, pbc).union() + class __extend__(pairtype(SomePBC, %(classname)s)): + def union((pbc, obj)): + if pbc.isNone(): + return %(classname)s(%(constructor_args)s) + else: + return SomeObject() + """ % loc) + exec source.compile() in globals() + +_make_none_union('SomeInstance', 'classdef=obj.classdef, can_be_None=True') +_make_none_union('SomeString', 'can_be_None=True') +_make_none_union('SomeList', 'obj.listdef') +_make_none_union('SomeDict', 'obj.dictdef') +_make_none_union('SomeExternalObject', 'obj.knowntype') # getitem on SomePBCs, in particular None fails Modified: pypy/dist/pypy/rpython/rexternalobj.py ============================================================================== --- pypy/dist/pypy/rpython/rexternalobj.py (original) +++ pypy/dist/pypy/rpython/rexternalobj.py Thu Dec 15 12:53:59 2005 @@ -32,7 +32,7 @@ def convert_const(self, value): T = self.exttypeinfo.get_lltype() if value is None: - return nullptr(T) + return lltype.nullptr(T) if not isinstance(value, self.exttypeinfo.typ): raise TyperError("expected a %r: %r" % (self.exttypeinfo.typ, value)) Modified: pypy/dist/pypy/rpython/test/test_rexternalobj.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rexternalobj.py (original) +++ pypy/dist/pypy/rpython/test/test_rexternalobj.py Thu Dec 15 12:53:59 2005 @@ -19,3 +19,21 @@ assert res is True res = interpret(fn, [1], policy=policy) assert res is False + +def test_lock_or_None(): + import thread + import pypy.module.thread.rpython.exttable # for declare()/declaretype() + def makelock(i): + if i > 0: + return thread.allocate_lock() + else: + return None + def fn(i): + lock = makelock(i) + return lock is not None and lock.acquire(False) + policy = AnnotatorPolicy() + policy.allow_someobjects = False + res = interpret(fn, [0], policy=policy) + assert res is False + res = interpret(fn, [1], policy=policy) 
+ assert res is True From arigo at codespeak.net Thu Dec 15 13:38:30 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 15 Dec 2005 13:38:30 +0100 (CET) Subject: [pypy-svn] r21170 - pypy/dist/pypy/rpython/test Message-ID: <20051215123830.750B527B62@code1.codespeak.net> Author: arigo Date: Thu Dec 15 13:38:29 2005 New Revision: 21170 Modified: pypy/dist/pypy/rpython/test/test_objectmodel.py Log: Some more cast_object_to_int-can-return-a-negative-value. Modified: pypy/dist/pypy/rpython/test/test_objectmodel.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_objectmodel.py (original) +++ pypy/dist/pypy/rpython/test/test_objectmodel.py Thu Dec 15 13:38:29 2005 @@ -38,7 +38,8 @@ i = cast_object_to_int(a) return cast_object_to_int(cast_int_to_object(i, A)) == i res = interpret(f, []) - assert res > 0 + # cannot really check anything about 'res' here + # XXX humpf: there is no sane way to implement cast_ptr_to_int # without going for the same hacks as in robjectmodel.cast_XXX_to_XXX py.test.raises(AssertionError, interpret, g, []) From arigo at codespeak.net Thu Dec 15 13:41:30 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 15 Dec 2005 13:41:30 +0100 (CET) Subject: [pypy-svn] r21171 - in pypy/dist/pypy/rpython: . lltypesystem test Message-ID: <20051215124130.90B1A27B62@code1.codespeak.net> Author: arigo Date: Thu Dec 15 13:41:29 2005 New Revision: 21171 Modified: pypy/dist/pypy/rpython/lltypesystem/rpbc.py pypy/dist/pypy/rpython/rpbc.py pypy/dist/pypy/rpython/test/test_rpbc.py Log: More precise function and method calls, to fix the following (now-tested) buggy case: at the end of annotation, consider_call_site() fills a call table using the SomePBCs of each call site; later, the RTyper used to try to perform lookups in this call table by using a less precise SomePBC (the whole call family). This lookup can thus fail. 
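[Editorial note: the failure mode described in this log message can be pictured with ordinary dictionaries. The following is a hypothetical sketch of the keying problem only, not the actual rtyper call-table structures; the 'A.meth' etc. keys are made up.]

```python
# Hypothetical sketch, not the real rtyper code: during annotation the call
# table is keyed by the exact set of descriptions seen at each call site,
# so a later lookup using the broader call-family key can fail.
call_table = {
    frozenset(['A.meth', 'B.meth']): 'row for x.meth(n)',
    frozenset(['A.meth']): 'row for C().meth()',
}

# The old RTyper behaviour: look up with the whole family -> key not found.
whole_family = frozenset(['A.meth', 'B.meth', 'C.meth'])
assert whole_family not in call_table

# The fix: rebuild the precise key from the call site itself
# (hop.args_s[0] in the patch) and look that up instead.
precise_site = frozenset(['A.meth'])
assert call_table[precise_site] == 'row for C().meth()'
```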
Modified: pypy/dist/pypy/rpython/lltypesystem/rpbc.py ============================================================================== --- pypy/dist/pypy/rpython/lltypesystem/rpbc.py (original) +++ pypy/dist/pypy/rpython/lltypesystem/rpbc.py Thu Dec 15 13:41:29 2005 @@ -206,7 +206,10 @@ mangled_name, r_func = r_class.clsfields[self.methodname] assert isinstance(r_func, (FunctionsPBCRepr, OverriddenFunctionPBCRepr)) - s_func = r_func.s_pbc + # s_func = r_func.s_pbc -- not precise enough, see + # test_precise_method_call_1. Build a more precise one... + funcdescs = [desc.funcdesc for desc in hop.args_s[0].descriptions] + s_func = annmodel.SomePBC(funcdescs) v_im_self = hop.inputarg(self, arg=0) v_cls = self.r_im_self.getfield(v_im_self, '__class__', hop.llops) v_func = r_class.getclsfield(v_cls, self.methodname, hop.llops) Modified: pypy/dist/pypy/rpython/rpbc.py ============================================================================== --- pypy/dist/pypy/rpython/rpbc.py (original) +++ pypy/dist/pypy/rpython/rpbc.py Thu Dec 15 13:41:29 2005 @@ -301,7 +301,8 @@ def call(self, opname, hop): bk = self.rtyper.annotator.bookkeeper args = bk.build_args(opname, hop.args_s[1:]) - descs = self.s_pbc.descriptions.keys() + s_pbc = hop.args_s[0] # possibly more precise than self.s_pbc + descs = s_pbc.descriptions.keys() shape, index = description.FunctionDesc.variant_for_call_site(bk, self.callfamily, descs, args) row_of_graphs = self.callfamily.calltables[shape][index] anygraph = row_of_graphs.itervalues().next() # pick any witness Modified: pypy/dist/pypy/rpython/test/test_rpbc.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rpbc.py (original) +++ pypy/dist/pypy/rpython/test/test_rpbc.py Thu Dec 15 13:41:29 2005 @@ -1258,3 +1258,54 @@ for i in range(5): res = interpret(f, [i, 1000]) assert res == f(i, 1000) + +def test_precise_method_call_1(): + class A(object): + def meth(self, x=5): + return x+1 + class 
B(A): + def meth(self, x=5): + return x+2 + class C(A): + pass + def f(i, n): + # call both A.meth and B.meth with an explicit argument + if i > 0: + x = A() + else: + x = B() + result1 = x.meth(n) + # now call A.meth only, using the default argument + result2 = C().meth() + return result1 * result2 + for i in [0, 1]: + res = interpret(f, [i, 1234]) + assert res == f(i, 1234) + +def test_precise_method_call_2(): + class A(object): + def meth(self, x=5): + return x+1 + class B(A): + def meth(self, x=5): + return x+2 + class C(A): + def meth(self, x=5): + return x+3 + def f(i, n): + # call both A.meth and B.meth with an explicit argument + if i > 0: + x = A() + else: + x = B() + result1 = x.meth(n) + # now call A.meth and C.meth, using the default argument + if i > 0: + x = C() + else: + x = A() + result2 = x.meth() + return result1 * result2 + for i in [0, 1]: + res = interpret(f, [i, 1234]) + assert res == f(i, 1234) From ericvrp at codespeak.net Thu Dec 15 14:54:16 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Thu, 15 Dec 2005 14:54:16 +0100 (CET) Subject: [pypy-svn] r21172 - in pypy/dist/pypy/translator/backendopt: . 
test Message-ID: <20051215135416.B252D27B62@code1.codespeak.net> Author: ericvrp Date: Thu Dec 15 14:54:15 2005 New Revision: 21172 Added: pypy/dist/pypy/translator/backendopt/merge_if_blocks.py pypy/dist/pypy/translator/backendopt/test/test_merge_if_blocks.py Log: (cfbolz, ericvrp): start of a transformation to merge consecutive equality tests on the same variable into a switch (taking that variable as an exitswitch and having several exits with the cases) Added: pypy/dist/pypy/translator/backendopt/merge_if_blocks.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/backendopt/merge_if_blocks.py Thu Dec 15 14:54:15 2005 @@ -0,0 +1,117 @@ +from pypy.objspace.flow.model import Block, Constant, Variable, flatten, checkgraph +from pypy.translator.backendopt.support import log + +log = log.mergeifblocks + +''' +[backendopt:mergeifblocks] merge1 +[backendopt:mergeifblocks] 1 +[backendopt:mergeifblocks] int_eq, args[n_0, (1)], result=v0 +[backendopt:mergeifblocks] exitswitch...v0 +[backendopt:mergeifblocks] exits...(link from block at -1 to block at -1, link from block at -1 to codeless block) +[backendopt:mergeifblocks] merge1 +[backendopt:mergeifblocks] 1 +[backendopt:mergeifblocks] int_eq, args[n_1, (2)], result=v1 +[backendopt:mergeifblocks] exitswitch...v1 +[backendopt:mergeifblocks] exits...(link from block at -1 to block at -1, link from block at -1 to codeless block) +[backendopt:mergeifblocks] merge1 +[backendopt:mergeifblocks] 1 +[backendopt:mergeifblocks] int_eq, args[v2, (3)], result=v3 +[backendopt:mergeifblocks] exitswitch...v3 +[backendopt:mergeifblocks] exits...(link from block at -1 to codeless block, link from block at -1 to codeless block) +[backendopt:mergeifblocks] merge1 +[backendopt:mergeifblocks] 0 +[backendopt:mergeifblocks] exitswitch...None +[backendopt:mergeifblocks] exits...() +''' + +def is_chain_block(block, first=False): + if len(block.operations) == 0: + return 
False + if len(block.operations) > 1 and not first: + return False + op = block.operations[-1] + if op.opname != 'int_eq' or op.result != block.exitswitch: + return False + if isinstance(op.args[0], Variable) and isinstance(op.args[1], Variable): + return False + return True + +def merge_chain(chain, checkvar, varmap): + def get_new_arg(var_or_const): + if isinstance(var_or_const, Constant): + return var_or_const + return varmap[var_or_const] + print chain, checkvar + firstblock, case = chain[0] + firstblock.operations = firstblock.operations[:-1] + firstblock.exitswitch = checkvar + links = [] + default = chain[-1][0].exits[0] + default.exitcase = "default" + default.llexitcase = None + default.prevblock = firstblock + default.args = [get_new_arg(arg) for arg in default.args] + for block, case in chain: + link = block.exits[1] + links.append(link) + link.exitcase = case + link.llexitcase = case.value + link.prevblock = firstblock + link.args = [get_new_arg(arg) for arg in link.args] + links.append(default) + firstblock.exits = links + +def merge_if_blocks_once(graph): + """Convert consecutive blocks that all compare a variable (of Primitive type) + with a constant into one block with multiple exits. The backends can in + turn output this block as a switch statement. 
+ """ + candidates = [block for block in graph.iterblocks() + if is_chain_block(block, first=True)] + print "candidates", candidates + for firstblock in candidates: + chain = [] + checkvars = [] + varmap = {} # {var in a block in the chain: var in the first block} + for var in firstblock.exits[0].args: + varmap[var] = var + for var in firstblock.exits[1].args: + varmap[var] = var + def add_to_varmap(var, newvar): + if isinstance(var, Variable): + varmap[newvar] = varmap[var] + else: + varmap[newvar] = var + current = firstblock + while 1: + # check whether the chain can be extended with the block that follows the + # False link + checkvar = [var for var in current.operations[-1].args + if isinstance(var, Variable)][0] + case = [var for var in current.operations[-1].args + if isinstance(var, Constant)][0] + chain.append((current, case)) + checkvars.append(checkvar) + falseexit = current.exits[0] + assert not falseexit.exitcase + trueexit = current.exits[1] + for i, var in enumerate(trueexit.args): + add_to_varmap(var, trueexit.target.inputargs[i]) + for i, var in enumerate(falseexit.args): + add_to_varmap(var, falseexit.target.inputargs[i]) + targetblock = falseexit.target + if checkvar not in falseexit.args: + break + newcheckvar = targetblock.inputargs[falseexit.args.index(checkvar)] + if not is_chain_block(targetblock): + break + if newcheckvar not in targetblock.operations[0].args: + break + current = targetblock + if len(chain) > 1: + break + else: + return False + merge_chain(chain, checkvars[0], varmap) + checkgraph(graph) Added: pypy/dist/pypy/translator/backendopt/test/test_merge_if_blocks.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/backendopt/test/test_merge_if_blocks.py Thu Dec 15 14:54:15 2005 @@ -0,0 +1,47 @@ +from pypy.translator.backendopt.merge_if_blocks import merge_if_blocks_once +from pypy.translator.translator import TranslationContext, graphof as tgraphof +from 
pypy.objspace.flow.model import flatten, Block +from pypy.translator.backendopt.removenoops import remove_same_as + +def test_merge1(): + def merge1(n): + n += 1 + if n == 1: + return 1 + elif n == 2: + return 2 + elif n == 3: + return 3 + return 4 + t = TranslationContext() + a = t.buildannotator() + a.build_types(merge1, [int]) + rtyper = t.buildrtyper() + rtyper.specialize() + graph = tgraphof(t, merge1) + assert len(list(graph.iterblocks())) == 4 #startblock, blocks, returnblock + remove_same_as(graph) + merge_if_blocks_once(graph) + assert len(graph.startblock.exits) == 4 + assert len(list(graph.iterblocks())) == 2 #startblock, returnblock + +def test_merge_passonvars(): + def merge(n, m): + if n == 1: + return m + 1 + elif n == 2: + return m + 2 + elif n == 3: + return m + 3 + return m + 4 + t = TranslationContext() + a = t.buildannotator() + a.build_types(merge, [int, int]) + rtyper = t.buildrtyper() + rtyper.specialize() + graph = tgraphof(t, merge) + assert len(list(graph.iterblocks())) == 8 + remove_same_as(graph) + merge_if_blocks_once(graph) + assert len(graph.startblock.exits) == 4 + From cfbolz at codespeak.net Thu Dec 15 15:10:05 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 15 Dec 2005 15:10:05 +0100 (CET) Subject: [pypy-svn] r21173 - in pypy/dist/pypy/translator/backendopt: . 
test Message-ID: <20051215141005.7095D27B62@code1.codespeak.net> Author: cfbolz Date: Thu Dec 15 15:10:03 2005 New Revision: 21173 Modified: pypy/dist/pypy/translator/backendopt/merge_if_blocks.py pypy/dist/pypy/translator/backendopt/test/test_merge_if_blocks.py Log: merge as many blocks as possible per graph Modified: pypy/dist/pypy/translator/backendopt/merge_if_blocks.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/merge_if_blocks.py (original) +++ pypy/dist/pypy/translator/backendopt/merge_if_blocks.py Thu Dec 15 15:10:03 2005 @@ -3,28 +3,6 @@ log = log.mergeifblocks -''' -[backendopt:mergeifblocks] merge1 -[backendopt:mergeifblocks] 1 -[backendopt:mergeifblocks] int_eq, args[n_0, (1)], result=v0 -[backendopt:mergeifblocks] exitswitch...v0 -[backendopt:mergeifblocks] exits...(link from block at -1 to block at -1, link from block at -1 to codeless block) -[backendopt:mergeifblocks] merge1 -[backendopt:mergeifblocks] 1 -[backendopt:mergeifblocks] int_eq, args[n_1, (2)], result=v1 -[backendopt:mergeifblocks] exitswitch...v1 -[backendopt:mergeifblocks] exits...(link from block at -1 to block at -1, link from block at -1 to codeless block) -[backendopt:mergeifblocks] merge1 -[backendopt:mergeifblocks] 1 -[backendopt:mergeifblocks] int_eq, args[v2, (3)], result=v3 -[backendopt:mergeifblocks] exitswitch...v3 -[backendopt:mergeifblocks] exits...(link from block at -1 to codeless block, link from block at -1 to codeless block) -[backendopt:mergeifblocks] merge1 -[backendopt:mergeifblocks] 0 -[backendopt:mergeifblocks] exitswitch...None -[backendopt:mergeifblocks] exits...() -''' - def is_chain_block(block, first=False): if len(block.operations) == 0: return False @@ -42,7 +20,6 @@ if isinstance(var_or_const, Constant): return var_or_const return varmap[var_or_const] - print chain, checkvar firstblock, case = chain[0] firstblock.operations = firstblock.operations[:-1] firstblock.exitswitch = 
checkvar @@ -69,7 +46,6 @@ """ candidates = [block for block in graph.iterblocks() if is_chain_block(block, first=True)] - print "candidates", candidates for firstblock in candidates: chain = [] checkvars = [] @@ -113,5 +89,11 @@ break else: return False + log("merging blocks in %s" % (graph.name, )) merge_chain(chain, checkvars[0], varmap) checkgraph(graph) + return True + +def merge_if_blocks(graph): + while merge_if_blocks_once(graph): + pass Modified: pypy/dist/pypy/translator/backendopt/test/test_merge_if_blocks.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_merge_if_blocks.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_merge_if_blocks.py Thu Dec 15 15:10:03 2005 @@ -1,4 +1,4 @@ -from pypy.translator.backendopt.merge_if_blocks import merge_if_blocks_once +from pypy.translator.backendopt.merge_if_blocks import merge_if_blocks_once, merge_if_blocks from pypy.translator.translator import TranslationContext, graphof as tgraphof from pypy.objspace.flow.model import flatten, Block from pypy.translator.backendopt.removenoops import remove_same_as @@ -45,3 +45,30 @@ merge_if_blocks_once(graph) assert len(graph.startblock.exits) == 4 +def test_merge_several(): + def merge(n, m): + r = -1 + if n == 0: + if m == 0: + r = 0 + elif m == 1: + r = 1 + else: + r = 2 + elif n == 1: + r = 4 + else: + r = 6 + return r + t = TranslationContext() + a = t.buildannotator() + a.build_types(merge, [int, int]) + rtyper = t.buildrtyper() + rtyper.specialize() + graph = tgraphof(t, merge) + remove_same_as(graph) + merge_if_blocks(graph) + assert len(graph.startblock.exits) == 3 + assert len(list(graph.iterblocks())) == 3 + + From cfbolz at codespeak.net Thu Dec 15 15:21:57 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 15 Dec 2005 15:21:57 +0100 (CET) Subject: [pypy-svn] r21174 - in pypy/dist/pypy/translator/backendopt: . 
test Message-ID: <20051215142157.5D16627B62@code1.codespeak.net> Author: cfbolz Date: Thu Dec 15 15:21:55 2005 New Revision: 21174 Modified: pypy/dist/pypy/translator/backendopt/merge_if_blocks.py pypy/dist/pypy/translator/backendopt/test/test_merge_if_blocks.py Log: oops! don't merge chains of ifs, only chains of one if and elifs. Modified: pypy/dist/pypy/translator/backendopt/merge_if_blocks.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/merge_if_blocks.py (original) +++ pypy/dist/pypy/translator/backendopt/merge_if_blocks.py Thu Dec 15 15:21:55 2005 @@ -1,4 +1,5 @@ -from pypy.objspace.flow.model import Block, Constant, Variable, flatten, checkgraph +from pypy.objspace.flow.model import Block, Constant, Variable, flatten +from pypy.objspace.flow.model import checkgraph, mkentrymap from pypy.translator.backendopt.support import log log = log.mergeifblocks @@ -46,6 +47,7 @@ """ candidates = [block for block in graph.iterblocks() if is_chain_block(block, first=True)] + entrymap = mkentrymap(graph) for firstblock in candidates: chain = [] checkvars = [] @@ -72,11 +74,9 @@ falseexit = current.exits[0] assert not falseexit.exitcase trueexit = current.exits[1] - for i, var in enumerate(trueexit.args): - add_to_varmap(var, trueexit.target.inputargs[i]) - for i, var in enumerate(falseexit.args): - add_to_varmap(var, falseexit.target.inputargs[i]) targetblock = falseexit.target + if len(entrymap[targetblock]) != 1: + break if checkvar not in falseexit.args: break newcheckvar = targetblock.inputargs[falseexit.args.index(checkvar)] @@ -84,6 +84,10 @@ break if newcheckvar not in targetblock.operations[0].args: break + for i, var in enumerate(trueexit.args): + add_to_varmap(var, trueexit.target.inputargs[i]) + for i, var in enumerate(falseexit.args): + add_to_varmap(var, falseexit.target.inputargs[i]) current = targetblock if len(chain) > 1: break Modified: 
pypy/dist/pypy/translator/backendopt/test/test_merge_if_blocks.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_merge_if_blocks.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_merge_if_blocks.py Thu Dec 15 15:21:55 2005 @@ -71,4 +71,23 @@ assert len(graph.startblock.exits) == 3 assert len(list(graph.iterblocks())) == 3 - +def test_dont_merge(): + def merge(n, m): + r = -1 + if n == 0: + r += m + if n == 1: + r += 2 * m + else: + r += 6 + return r + t = TranslationContext() + a = t.buildannotator() + a.build_types(merge, [int, int]) + rtyper = t.buildrtyper() + rtyper.specialize() + graph = tgraphof(t, merge) + remove_same_as(graph) + blocknum = len(list(graph.iterblocks())) + merge_if_blocks(graph) + assert blocknum == len(list(graph.iterblocks())) From ericvrp at codespeak.net Thu Dec 15 15:26:51 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Thu, 15 Dec 2005 15:26:51 +0100 (CET) Subject: [pypy-svn] r21176 - pypy/dist/pypy/translator/backendopt Message-ID: <20051215142651.1B5E627B84@code1.codespeak.net> Author: ericvrp Date: Thu Dec 15 15:26:50 2005 New Revision: 21176 Modified: pypy/dist/pypy/translator/backendopt/all.py Log: (ericvrp, cfbolz): adding a switch so that the if merging can be done for all graphs Modified: pypy/dist/pypy/translator/backendopt/all.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/all.py (original) +++ pypy/dist/pypy/translator/backendopt/all.py Thu Dec 15 15:26:50 2005 @@ -3,12 +3,14 @@ from pypy.translator.backendopt.malloc import remove_simple_mallocs from pypy.translator.backendopt.ssa import SSI_to_SSA from pypy.translator.backendopt.propagate import propagate_all +from pypy.translator.backendopt.merge_if_blocks import merge_if_blocks from pypy.translator import simplify def backend_optimizations(translator, inline_threshold=1, 
mallocs=True, ssa_form=True, + merge_if_blocks_to_switch=False, propagate=False): # remove obvious no-ops for graph in translator.graphs: @@ -34,6 +36,10 @@ simplify.transform_dead_op_vars(graph, translator) if propagate: propagate_all(translator) + + if merge_if_blocks_to_switch: + for graph in translator.graphs: + merge_if_blocks(graph) if ssa_form: for graph in translator.graphs: From cfbolz at codespeak.net Thu Dec 15 15:41:18 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 15 Dec 2005 15:41:18 +0100 (CET) Subject: [pypy-svn] r21178 - in pypy/dist/pypy: rpython translator/backendopt/test Message-ID: <20051215144118.AF0D727B62@code1.codespeak.net> Author: cfbolz Date: Thu Dec 15 15:41:16 2005 New Revision: 21178 Modified: pypy/dist/pypy/rpython/llinterp.py pypy/dist/pypy/translator/backendopt/test/test_merge_if_blocks.py Log: fixes to llinterp to be able to cope with the switches. make the tests use llinterp Modified: pypy/dist/pypy/rpython/llinterp.py ============================================================================== --- pypy/dist/pypy/rpython/llinterp.py (original) +++ pypy/dist/pypy/rpython/llinterp.py Thu Dec 15 15:41:16 2005 @@ -213,8 +213,12 @@ if link.llexitcase == llexitvalue: break # found -- the result is in 'link' else: - raise ValueError("exit case %r not found in the exit links " - "of %r" % (llexitvalue, block)) + if block.exits[-1].exitcase == "default": + assert block.exits[-1].llexitcase is None + link = block.exits[-1] + else: + raise ValueError("exit case %r not found in the exit links " + "of %r" % (llexitvalue, block)) return link.target, [self.getval(x) for x in link.args] def eval_operation(self, operation): Modified: pypy/dist/pypy/translator/backendopt/test/test_merge_if_blocks.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_merge_if_blocks.py (original) +++ 
pypy/dist/pypy/translator/backendopt/test/test_merge_if_blocks.py Thu Dec 15 15:41:16 2005 @@ -2,6 +2,7 @@ from pypy.translator.translator import TranslationContext, graphof as tgraphof from pypy.objspace.flow.model import flatten, Block from pypy.translator.backendopt.removenoops import remove_same_as +from pypy.rpython.llinterp import LLInterpreter def test_merge1(): def merge1(n): @@ -24,6 +25,10 @@ merge_if_blocks_once(graph) assert len(graph.startblock.exits) == 4 assert len(list(graph.iterblocks())) == 2 #startblock, returnblock + interp = LLInterpreter(rtyper) + for i in range(4): + res = interp.eval_graph(graph, [i]) + assert res == i + 1 def test_merge_passonvars(): def merge(n, m): @@ -44,6 +49,10 @@ remove_same_as(graph) merge_if_blocks_once(graph) assert len(graph.startblock.exits) == 4 + interp = LLInterpreter(rtyper) + for i in range(1, 5): + res = interp.eval_graph(graph, [i, 1]) + assert res == i + 1 def test_merge_several(): def merge(n, m): @@ -70,6 +79,16 @@ merge_if_blocks(graph) assert len(graph.startblock.exits) == 3 assert len(list(graph.iterblocks())) == 3 + interp = LLInterpreter(rtyper) + for m in range(3): + res = interp.eval_graph(graph, [0, m]) + assert res == m + res = interp.eval_graph(graph, [1, 0]) + assert res == 4 + res = interp.eval_graph(graph, [2, 0]) + assert res == 6 + + def test_dont_merge(): def merge(n, m): @@ -91,3 +110,4 @@ blocknum = len(list(graph.iterblocks())) merge_if_blocks(graph) assert blocknum == len(list(graph.iterblocks())) + From ac at codespeak.net Fri Dec 16 10:02:17 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Fri, 16 Dec 2005 10:02:17 +0100 (CET) Subject: [pypy-svn] r21201 - in pypy/dist/pypy/translator/c: . 
test Message-ID: <20051216090217.519F727B52@code1.codespeak.net> Author: ac Date: Fri Dec 16 10:02:16 2005 New Revision: 21201 Modified: pypy/dist/pypy/translator/c/funcgen.py pypy/dist/pypy/translator/c/test/test_backendoptimized.py Log: Support exitswitch on a Signed variable using a switch statement. Modified: pypy/dist/pypy/translator/c/funcgen.py ============================================================================== --- pypy/dist/pypy/translator/c/funcgen.py (original) +++ pypy/dist/pypy/translator/c/funcgen.py Fri Dec 16 10:02:16 2005 @@ -5,7 +5,7 @@ from pypy.objspace.flow.model import Variable, Constant, Block from pypy.objspace.flow.model import traverse, c_last_exception from pypy.rpython.lltypesystem.lltype import \ - Ptr, PyObject, Void, Bool, pyobjectptr, Struct, Array + Ptr, PyObject, Void, Bool, Signed, pyobjectptr, Struct, Array PyObjPtr = Ptr(PyObject) @@ -320,33 +320,55 @@ else: # block ending in a switch on a value TYPE = self.lltypemap(block.exitswitch) - for link in block.exits[:-1]: + if TYPE in (Bool, PyObjPtr): + expr = self.expr(block.exitswitch) + for link in block.exits[:-1]: + assert link.exitcase in (False, True) + if TYPE == Bool: + if not link.exitcase: + expr = '!' + expr + elif TYPE == PyObjPtr: + yield 'assert(%s == Py_True || %s == Py_False);' % ( + expr, expr) + if link.exitcase: + expr = '%s == Py_True' % expr + else: + expr = '%s == Py_False' % expr + yield 'if (%s) {' % expr + for op in gen_link(link): + yield '\t' + op + yield '}' + link = block.exits[-1] assert link.exitcase in (False, True) + #yield 'assert(%s == %s);' % (self.expr(block.exitswitch), + # self.genc.nameofvalue(link.exitcase, ct)) + for op in gen_link(block.exits[-1]): + yield op + yield '' + elif TYPE == Signed: + defaultlink = None expr = self.expr(block.exitswitch) - if TYPE == Bool: - if not link.exitcase: - expr = '!' 
+ expr - elif TYPE == PyObjPtr: - yield 'assert(%s == Py_True || %s == Py_False);' % ( - expr, expr) - if link.exitcase: - expr = '%s == Py_True' % expr - else: - expr = '%s == Py_False' % expr - else: - raise TypeError("switches can only be on Bool or " - "PyObjPtr. Got %r" % (TYPE,)) - yield 'if (%s) {' % expr - for op in gen_link(link): + yield 'switch (%s) {' % self.expr(block.exitswitch) + for link in block.exits: + if link.exitcase is 'default': + defaultlink = link + continue + yield 'case %s:' % link.llexitcase + for op in gen_link(link): + yield '\t' + op + yield 'break;' + + # ? Emit default case + if defaultlink is None: + raise TypeError('switches must have a default case.') + yield 'default:' + for op in gen_link(defaultlink): yield '\t' + op + yield '}' - link = block.exits[-1] - assert link.exitcase in (False, True) - #yield 'assert(%s == %s);' % (self.expr(block.exitswitch), - # self.genc.nameofvalue(link.exitcase, ct)) - for op in gen_link(block.exits[-1]): - yield op - yield '' + else: + raise TypeError("switches can only be on Bool or " + "PyObjPtr. 
Got %r" % (TYPE,)) for i in range(reachable_err, -1, -1): if not fallthrough: Modified: pypy/dist/pypy/translator/c/test/test_backendoptimized.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_backendoptimized.py (original) +++ pypy/dist/pypy/translator/c/test/test_backendoptimized.py Fri Dec 16 10:02:16 2005 @@ -95,3 +95,27 @@ res = gn() assert res + + +class TestTypedOptimizedSwitchTestCase: + + class CodeGenerator(_TestTypedTestCase): + def process(self, t): + _TestTypedTestCase.process(self, t) + self.t = t + backend_optimizations(t, merge_if_blocks_to_switch=True) + + def test_switch(self): + def f(x=int): + if x == 3: + return 9 + elif x == 9: + return 27 + elif x == 27: + return 3 + return 0 + codegenerator = self.CodeGenerator() + fn = codegenerator.getcompiled(f) + for x in (0,1,2,3,9,27,48, -9): + assert fn(x) == f(x) + From cfbolz at codespeak.net Fri Dec 16 10:36:26 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Fri, 16 Dec 2005 10:36:26 +0100 (CET) Subject: [pypy-svn] r21202 - in pypy/dist/pypy/translator/backendopt: . test Message-ID: <20051216093626.7930527B52@code1.codespeak.net> Author: cfbolz Date: Fri Dec 16 10:36:24 2005 New Revision: 21202 Modified: pypy/dist/pypy/translator/backendopt/merge_if_blocks.py pypy/dist/pypy/translator/backendopt/test/test_merge_if_blocks.py Log: bug fix + test: after inlining it is possible that there are int_eq operations that contain two constants. merge_if_blocks assumed that there was always at least one variable in such an operation. 
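The eligibility test that these commits keep refining can be sketched in plain Python. `Var`, `Const` and `eligible_for_chain` below are hypothetical stand-ins, not PyPy's actual flow-graph classes; the point is only the shape of the predicate, including the guard added in r21202 that rejects comparisons between two constants (which can appear after inlining):

```python
class Var(object):
    """Hypothetical stand-in for a flow-graph Variable."""

class Const(object):
    """Hypothetical stand-in for a flow-graph Constant."""
    def __init__(self, value):
        self.value = value

def eligible_for_chain(opname, args):
    # A block can join an if/elif chain only if it ends in an
    # integer equality test between a Variable and a Constant.
    if opname != 'int_eq':
        return False
    if isinstance(args[0], Var) and isinstance(args[1], Var):
        return False
    # Guard from r21202: after inlining, both operands may be
    # constants, so the merger must not assume a Variable exists.
    if isinstance(args[0], Const) and isinstance(args[1], Const):
        return False
    return True

n = Var()
assert eligible_for_chain('int_eq', [n, Const(1)])             # normal case
assert not eligible_for_chain('int_lt', [n, Const(1)])         # wrong operation
assert not eligible_for_chain('int_eq', [Var(), Var()])        # two variables
assert not eligible_for_chain('int_eq', [Const(1), Const(2)])  # two constants
```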
Modified: pypy/dist/pypy/translator/backendopt/merge_if_blocks.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/merge_if_blocks.py (original) +++ pypy/dist/pypy/translator/backendopt/merge_if_blocks.py Fri Dec 16 10:36:24 2005 @@ -14,6 +14,8 @@ return False if isinstance(op.args[0], Variable) and isinstance(op.args[1], Variable): return False + if isinstance(op.args[0], Constant) and isinstance(op.args[1], Constant): + return False return True def merge_chain(chain, checkvar, varmap): Modified: pypy/dist/pypy/translator/backendopt/test/test_merge_if_blocks.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_merge_if_blocks.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_merge_if_blocks.py Fri Dec 16 10:36:24 2005 @@ -1,4 +1,6 @@ -from pypy.translator.backendopt.merge_if_blocks import merge_if_blocks_once, merge_if_blocks +from pypy.translator.backendopt.merge_if_blocks import merge_if_blocks_once +from pypy.translator.backendopt.merge_if_blocks import merge_if_blocks +from pypy.translator.backendopt.all import backend_optimizations from pypy.translator.translator import TranslationContext, graphof as tgraphof from pypy.objspace.flow.model import flatten, Block from pypy.translator.backendopt.removenoops import remove_same_as @@ -111,3 +113,19 @@ merge_if_blocks(graph) assert blocknum == len(list(graph.iterblocks())) +def test_two_constants(): + def fn(): + r = range(10, 37, 4) + r.reverse() + return r[0] + t = TranslationContext() + a = t.buildannotator() + a.build_types(fn, []) + rtyper = t.buildrtyper() + rtyper.specialize() + backend_optimizations(t, merge_if_blocks_to_switch=True) + graph = tgraphof(t, fn) + blocknum = len(list(graph.iterblocks())) + merge_if_blocks(graph) + assert blocknum == len(list(graph.iterblocks())) + From ericvrp at codespeak.net Fri Dec 16 11:03:20 2005 From: ericvrp 
at codespeak.net (ericvrp at codespeak.net) Date: Fri, 16 Dec 2005 11:03:20 +0100 (CET) Subject: [pypy-svn] r21205 - pypy/dist/pypy/interpreter Message-ID: <20051216100320.1B5B327B62@code1.codespeak.net> Author: ericvrp Date: Fri Dec 16 11:03:19 2005 New Revision: 21205 Modified: pypy/dist/pypy/interpreter/pyopcode.py Log: Unrolled dispatch loop. (should work nicely with merge_if_blocks transformation) Modified: pypy/dist/pypy/interpreter/pyopcode.py ============================================================================== --- pypy/dist/pypy/interpreter/pyopcode.py (original) +++ pypy/dist/pypy/interpreter/pyopcode.py Fri Dec 16 11:03:19 2005 @@ -47,16 +47,6 @@ # Currently, they are always setup in pyopcode.py # but it could be a custom table. - def dispatch(self): - opcode = self.nextop() - if self.opcode_has_arg[opcode]: - fn = self.dispatch_table_w_arg[opcode] - oparg = self.nextarg() - fn(self, oparg) - else: - fn = self.dispatch_table_no_arg[opcode] - fn(self) - def nextop(self): c = self.pycode.co_code[self.next_instr] self.next_instr += 1 @@ -777,6 +767,21 @@ cls.dispatch_table_no_arg = dispatch_table_no_arg cls.dispatch_table_w_arg = dispatch_table_w_arg + ### create unrolled dispatch loop ### + import py + dispatch_code = 'def dispatch(self):\n' + dispatch_code += ' opcode = self.nextop()\n' + for i in range(256): + dispatch_code += ' %s opcode == %d:\n' % (('if', 'elif')[i > 0], i) + opcode_has_arg = i >= dis.HAVE_ARGUMENT + opname = dis.opname[i].replace('+', '_') + missingname = ('MISSING_OPCODE', 'MISSING_OPCODE_W_ARG')[opcode_has_arg] + func_name = (missingname, opname)[hasattr(cls, opname)] + dispatch_code += ' self.%s(%s)\n' % (func_name, ('', 'self.nextarg()')[opcode_has_arg]) + exec py.code.Source(dispatch_code).compile() + cls.dispatch = dispatch + del dispatch_code, i, opcode_has_arg, opname, missingname, func_name + ### helpers written at the application-level ### # Some of these functions are expected to be generally useful if other From 
ericvrp at codespeak.net Fri Dec 16 11:07:27 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 16 Dec 2005 11:07:27 +0100 (CET) Subject: [pypy-svn] r21206 - pypy/dist/pypy/interpreter Message-ID: <20051216100727.B320727B82@code1.codespeak.net> Author: ericvrp Date: Fri Dec 16 11:07:27 2005 New Revision: 21206 Modified: pypy/dist/pypy/interpreter/pyopcode.py Log: "XXX performance hack!" Modified: pypy/dist/pypy/interpreter/pyopcode.py ============================================================================== --- pypy/dist/pypy/interpreter/pyopcode.py (original) +++ pypy/dist/pypy/interpreter/pyopcode.py Fri Dec 16 11:07:27 2005 @@ -767,6 +767,8 @@ cls.dispatch_table_no_arg = dispatch_table_no_arg cls.dispatch_table_w_arg = dispatch_table_w_arg + #XXX performance hack! + ### create unrolled dispatch loop ### import py dispatch_code = 'def dispatch(self):\n' From ericvrp at codespeak.net Fri Dec 16 11:35:16 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 16 Dec 2005 11:35:16 +0100 (CET) Subject: [pypy-svn] r21207 - pypy/dist/pypy/interpreter Message-ID: <20051216103516.7D61A27B54@code1.codespeak.net> Author: ericvrp Date: Fri Dec 16 11:35:15 2005 New Revision: 21207 Modified: pypy/dist/pypy/interpreter/pyopcode.py Log: * Use dispatch performance hack when translated only. 
* put MISSING_OPCODE in else block Modified: pypy/dist/pypy/interpreter/pyopcode.py ============================================================================== --- pypy/dist/pypy/interpreter/pyopcode.py (original) +++ pypy/dist/pypy/interpreter/pyopcode.py Fri Dec 16 11:35:15 2005 @@ -12,6 +12,7 @@ from pypy.interpreter.argument import Arguments from pypy.interpreter.pycode import PyCode from pypy.tool.sourcetools import func_with_new_name +from pypy.rpython.objectmodel import we_are_translated def unaryoperation(operationname): """NOT_RPYTHON""" @@ -47,6 +48,17 @@ # Currently, they are always setup in pyopcode.py # but it could be a custom table. + # note: __initclass__ might override dispatch() with a more efficient version + def dispatch(self): + opcode = self.nextop() + if self.opcode_has_arg[opcode]: + fn = self.dispatch_table_w_arg[opcode] + oparg = self.nextarg() + fn(self, oparg) + else: + fn = self.dispatch_table_no_arg[opcode] + fn(self) + def nextop(self): c = self.pycode.co_code[self.next_instr] self.next_instr += 1 @@ -768,21 +780,24 @@ cls.dispatch_table_w_arg = dispatch_table_w_arg #XXX performance hack! 
- - ### create unrolled dispatch loop ### - import py - dispatch_code = 'def dispatch(self):\n' - dispatch_code += ' opcode = self.nextop()\n' - for i in range(256): - dispatch_code += ' %s opcode == %d:\n' % (('if', 'elif')[i > 0], i) - opcode_has_arg = i >= dis.HAVE_ARGUMENT - opname = dis.opname[i].replace('+', '_') - missingname = ('MISSING_OPCODE', 'MISSING_OPCODE_W_ARG')[opcode_has_arg] - func_name = (missingname, opname)[hasattr(cls, opname)] - dispatch_code += ' self.%s(%s)\n' % (func_name, ('', 'self.nextarg()')[opcode_has_arg]) - exec py.code.Source(dispatch_code).compile() - cls.dispatch = dispatch - del dispatch_code, i, opcode_has_arg, opname, missingname, func_name + if we_are_translated(): ### create unrolled dispatch thingy ### + import py + dispatch_code = 'def dispatch(self):\n' + dispatch_code += ' opcode = self.nextop()\n' + n_outputed = 0 + for i in range(256): + opname = dis.opname[i].replace('+', '_') + if not hasattr(cls, opname): + continue + dispatch_code += ' %s opcode == %d:\n' % (('if', 'elif')[n_outputed > 0], i) + opcode_has_arg = cls.opcode_has_arg[i] + dispatch_code += ' self.%s(%s)\n' % (opname, ('', 'self.nextarg()')[opcode_has_arg]) + n_outputed += 1 + dispatch_code += ' else:\n' + dispatch_code += ' self.MISSING_OPCODE()\n' + exec py.code.Source(dispatch_code).compile() + cls.dispatch = dispatch + del dispatch_code, i, opcode_has_arg, opname ### helpers written at the application-level ### From ac at codespeak.net Fri Dec 16 11:36:44 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Fri, 16 Dec 2005 11:36:44 +0100 (CET) Subject: [pypy-svn] r21208 - pypy/dist/pypy/translator/goal Message-ID: <20051216103644.84B4327B54@code1.codespeak.net> Author: ac Date: Fri Dec 16 11:36:44 2005 New Revision: 21208 Modified: pypy/dist/pypy/translator/goal/driver.py pypy/dist/pypy/translator/goal/translate_pypy.py Log: Add option for merging if-blocks. 
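The `exec`-generated dispatch loop from r21205/r21207 can be re-created in a self-contained form. The names here (`build_unrolled_dispatch`, the handler table) are illustrative, not PyPy's real API; the sketch only demonstrates the build-source-then-`exec` technique, which deliberately produces the `opcode == N` if/elif chain that merge_if_blocks later collapses into a switch:

```python
def build_unrolled_dispatch(handlers):
    """Generate a dispatch(opcode) function as one long if/elif chain.

    `handlers` maps opcode numbers to callables.  The generated chain
    of equality tests is exactly the shape that the merge_if_blocks
    transformation can turn into a C switch statement.
    """
    lines = ['def dispatch(opcode):']
    for n, code in enumerate(sorted(handlers)):
        kw = 'if' if n == 0 else 'elif'
        lines.append('    %s opcode == %d:' % (kw, code))
        lines.append('        return handlers[%d]()' % code)
    lines.append('    else:')
    lines.append('        raise ValueError("missing opcode %d" % opcode)')
    namespace = {'handlers': handlers}
    exec('\n'.join(lines), namespace)
    return namespace['dispatch']

handlers = {1: lambda: 'LOAD', 2: lambda: 'STORE'}
dispatch = build_unrolled_dispatch(handlers)
assert dispatch(1) == 'LOAD'
assert dispatch(2) == 'STORE'
```

The table-based dispatch it replaces is shorter to write, but an unrolled chain gives the translator a single block of constant comparisons to optimize, which is the whole point of the "performance hack".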
Modified: pypy/dist/pypy/translator/goal/driver.py ============================================================================== --- pypy/dist/pypy/translator/goal/driver.py (original) +++ pypy/dist/pypy/translator/goal/driver.py Fri Dec 16 11:36:44 2005 @@ -24,7 +24,8 @@ 'insist': False, 'backend': 'c', 'lowmem': False, - 'fork_before': None + 'fork_before': None, + 'merge_if_blocks': False }) def taskdef(taskfunc, deps, title, new_state=None, expected_states=[], idemp=False): @@ -182,7 +183,8 @@ def task_backendopt(self): from pypy.translator.backendopt.all import backend_optimizations opt = self.options - backend_optimizations(self.translator, ssa_form=opt.backend != 'llvm') + backend_optimizations(self.translator, ssa_form=opt.backend != 'llvm', + merge_if_blocks_to_switch=opt.merge_if_blocks) # task_backendopt = taskdef(task_backendopt, ['rtype'], "Back-end optimisations") Modified: pypy/dist/pypy/translator/goal/translate_pypy.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy.py Fri Dec 16 11:36:44 2005 @@ -50,6 +50,7 @@ '2_gc': [OPT(('--gc',), "Garbage collector", ['boehm', 'ref', 'none'])], '3_stackless': [OPT(('--stackless',), "Stackless code generation", True)], + '4_merge_if_blocks': [OPT(('--merge_if_blocks',), "Merge if ... elif ... 
chains and use a switch statement for them.", True)], }, @@ -101,6 +102,7 @@ 'gc': 'boehm', 'backend': 'c', 'stackless': False, + 'merge_if_blocks': False, 'batch': False, 'text': False, From mwh at codespeak.net Fri Dec 16 11:51:10 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Fri, 16 Dec 2005 11:51:10 +0100 (CET) Subject: [pypy-svn] r21209 - pypy/dist/pypy/annotation Message-ID: <20051216105110.4ADEC27B54@code1.codespeak.net> Author: mwh Date: Fri Dec 16 11:51:09 2005 New Revision: 21209 Modified: pypy/dist/pypy/annotation/unaryop.py Log: remove an import Modified: pypy/dist/pypy/annotation/unaryop.py ============================================================================== --- pypy/dist/pypy/annotation/unaryop.py (original) +++ pypy/dist/pypy/annotation/unaryop.py Fri Dec 16 11:51:09 2005 @@ -2,7 +2,6 @@ Unary operations on SomeValues. """ -from types import FunctionType from pypy.annotation.model import \ SomeObject, SomeInteger, SomeBool, SomeString, SomeChar, SomeList, \ SomeDict, SomeUnicodeCodePoint, SomeTuple, SomeImpossibleValue, \ From mwh at codespeak.net Fri Dec 16 11:51:42 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Fri, 16 Dec 2005 11:51:42 +0100 (CET) Subject: [pypy-svn] r21210 - pypy/dist/pypy/objspace/std Message-ID: <20051216105142.8762C27B82@code1.codespeak.net> Author: mwh Date: Fri Dec 16 11:51:41 2005 New Revision: 21210 Modified: pypy/dist/pypy/objspace/std/objspace.py Log: reorder the conditions in wrap() slightly. add a comment. Modified: pypy/dist/pypy/objspace/std/objspace.py ============================================================================== --- pypy/dist/pypy/objspace/std/objspace.py (original) +++ pypy/dist/pypy/objspace/std/objspace.py Fri Dec 16 11:51:41 2005 @@ -230,6 +230,12 @@ def wrap(self, x): "Wraps the Python value 'x' into one of the wrapper classes." + # You might notice that this function is rather conspicuously + # not RPython. 
We can get away with this because the function + # is speicalized (see after the function body). Also worth + # noting is that the isinstance's involving integer types + # behave rather differently to how you might expect during + # annotation (see pypy/annotation/builtin.py) if x is None: return self.w_None if isinstance(x, W_Object): @@ -260,10 +266,10 @@ w_result = x.__spacebind__(self) #print 'wrapping', x, '->', w_result return w_result - if isinstance(x, long): + if isinstance(x, r_longlong): from pypy.objspace.std.longobject import args_from_long return W_LongObject(self, *args_from_long(x)) - if isinstance(x, r_longlong): + if isinstance(x, long): from pypy.objspace.std.longobject import args_from_long return W_LongObject(self, *args_from_long(x)) if isinstance(x, slice): From mwh at codespeak.net Fri Dec 16 11:54:58 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Fri, 16 Dec 2005 11:54:58 +0100 (CET) Subject: [pypy-svn] r21211 - pypy/dist/pypy/objspace/std Message-ID: <20051216105458.55AD827B62@code1.codespeak.net> Author: mwh Date: Fri Dec 16 11:54:57 2005 New Revision: 21211 Modified: pypy/dist/pypy/objspace/std/objspace.py Log: typo (thanks carl) Modified: pypy/dist/pypy/objspace/std/objspace.py ============================================================================== --- pypy/dist/pypy/objspace/std/objspace.py (original) +++ pypy/dist/pypy/objspace/std/objspace.py Fri Dec 16 11:54:57 2005 @@ -232,7 +232,7 @@ "Wraps the Python value 'x' into one of the wrapper classes." # You might notice that this function is rather conspicuously # not RPython. We can get away with this because the function - # is speicalized (see after the function body). Also worth + # is specialized (see after the function body). 
Also worth # noting is that the isinstance's involving integer types # behave rather differently to how you might expect during # annotation (see pypy/annotation/builtin.py) From ericvrp at codespeak.net Fri Dec 16 15:05:49 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 16 Dec 2005 15:05:49 +0100 (CET) Subject: [pypy-svn] r21216 - pypy/dist/pypy/interpreter Message-ID: <20051216140549.E033327B82@code1.codespeak.net> Author: ericvrp Date: Fri Dec 16 15:05:49 2005 New Revision: 21216 Modified: pypy/dist/pypy/interpreter/pyopcode.py Log: fix to get the hacked dispatch code only when translated Modified: pypy/dist/pypy/interpreter/pyopcode.py ============================================================================== --- pypy/dist/pypy/interpreter/pyopcode.py (original) +++ pypy/dist/pypy/interpreter/pyopcode.py Fri Dec 16 15:05:49 2005 @@ -48,8 +48,13 @@ # Currently, they are always setup in pyopcode.py # but it could be a custom table. - # note: __initclass__ might override dispatch() with a more efficient version def dispatch(self): + if we_are_translated(): + self.dispatch_translated() + else: + self.dispatch_not_translated() + + def dispatch_not_translated(self): opcode = self.nextop() if self.opcode_has_arg[opcode]: fn = self.dispatch_table_w_arg[opcode] @@ -780,24 +785,25 @@ cls.dispatch_table_w_arg = dispatch_table_w_arg #XXX performance hack! 
- if we_are_translated(): ### create unrolled dispatch thingy ### - import py - dispatch_code = 'def dispatch(self):\n' - dispatch_code += ' opcode = self.nextop()\n' - n_outputed = 0 - for i in range(256): - opname = dis.opname[i].replace('+', '_') - if not hasattr(cls, opname): - continue - dispatch_code += ' %s opcode == %d:\n' % (('if', 'elif')[n_outputed > 0], i) - opcode_has_arg = cls.opcode_has_arg[i] - dispatch_code += ' self.%s(%s)\n' % (opname, ('', 'self.nextarg()')[opcode_has_arg]) - n_outputed += 1 - dispatch_code += ' else:\n' - dispatch_code += ' self.MISSING_OPCODE()\n' - exec py.code.Source(dispatch_code).compile() - cls.dispatch = dispatch - del dispatch_code, i, opcode_has_arg, opname + ### Create dispatch with a lot of if,elifs ### + ### (this gets optimized for translated pypy by the merge_if_blocks transformation) ### + import py + dispatch_code = 'def dispatch_translated(self):\n' + dispatch_code += ' opcode = self.nextop()\n' + n_outputed = 0 + for i in range(256): + opname = dis.opname[i].replace('+', '_') + if not hasattr(cls, opname): + continue + dispatch_code += ' %s opcode == %d:\n' % (('if', 'elif')[n_outputed > 0], i) + opcode_has_arg = cls.opcode_has_arg[i] + dispatch_code += ' self.%s(%s)\n' % (opname, ('', 'self.nextarg()')[opcode_has_arg]) + n_outputed += 1 + dispatch_code += ' else:\n' + dispatch_code += ' self.MISSING_OPCODE()\n' + exec py.code.Source(dispatch_code).compile() + cls.dispatch_translated = dispatch_translated + del dispatch_code, i, opcode_has_arg, opname ### helpers written at the application-level ### From arigo at codespeak.net Fri Dec 16 17:42:30 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 16 Dec 2005 17:42:30 +0100 (CET) Subject: [pypy-svn] r21223 - pypy/dist/pypy/jit Message-ID: <20051216164230.2172027DBF@code1.codespeak.net> Author: arigo Date: Fri Dec 16 17:42:28 2005 New Revision: 21223 Modified: pypy/dist/pypy/jit/llabstractinterp.py Log: Checking in minor local changes to 
llabstractinterp: a __repr__ for the virtual structures and an extra supported operation. Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Fri Dec 16 17:42:28 2005 @@ -271,6 +271,12 @@ """Stands for a pointer to a malloc'ed structure; the structure is not malloc'ed so far, but we record which fields have which value. """ + def __repr__(self): + items = self.fields.items() + items.sort() + flds = ['%s=%r' % item for item in items] + return '' % (self.T._name, ', '.join(flds)) + def getnames(self): return self.T._names @@ -288,6 +294,12 @@ malloc'ed so far, but we record which fields have which value -- here a field is an item, indexed by an integer instead of a string field name. """ + def __repr__(self): + items = self.fields.items() + items.sort() + flds = ['%s=%r' % item for item in items] + return '' % (', '.join(flds),) + def getnames(self): c = self.a_length.maybe_get_constant() assert c is not None @@ -803,6 +815,9 @@ def op_cast_char_to_int(self, op, a): return self.residualize(op, [a], ord) + def op_cast_bool_to_int(self, op, a): + return self.residualize(op, [a], int) + def op_same_as(self, op, a): return a From arigo at codespeak.net Fri Dec 16 18:29:45 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 16 Dec 2005 18:29:45 +0100 (CET) Subject: [pypy-svn] r21226 - in pypy/dist/pypy/jit: . test Message-ID: <20051216172945.916E027B82@code1.codespeak.net> Author: arigo Date: Fri Dec 16 18:29:43 2005 New Revision: 21226 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_llabstractinterp.py Log: A test, disabled for now, showing what I *believe* I expect from the pypy-dev plan (http://codespeak.net/pipermail/pypy-dev/2005q4/002667.html). 
Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Fri Dec 16 18:29:43 2005 @@ -434,9 +434,11 @@ # ____________________________________________________________ class Policy(object): - def __init__(self, inlining=False, const_propagate=False): + def __init__(self, inlining=False, const_propagate=False, + concrete_propagate=True): self.inlining = inlining self.const_propagate = const_propagate + self.concrete_propagate = concrete_propagate best_policy = Policy(inlining=True, const_propagate=True) @@ -743,7 +745,7 @@ # can constant-fold print 'fold:', constant_op, concretevalues concreteresult = constant_op(*concretevalues) - if any_concrete: + if any_concrete and self.policy.concrete_propagate: return LLConcreteValue(concreteresult) else: return LLRuntimeValue(const(concreteresult)) Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/test/test_llabstractinterp.py (original) +++ pypy/dist/pypy/jit/test/test_llabstractinterp.py Fri Dec 16 18:29:43 2005 @@ -63,6 +63,7 @@ P_INLINE = Policy(inlining=True) P_CONST_INLINE = Policy(inlining=True, const_propagate=True) +P_HINT_DRIVEN = Policy(inlining=True, concrete_propagate=False) def test_simple(): @@ -333,3 +334,28 @@ return result graph2, insns = abstrinterp(ll_factorial, [7], [], policy=P_CONST_INLINE) assert insns == {'int_lt': 1, 'int_add': 1, 'int_mul': 1} + +def INPROGRESS_test_hint(): + from pypy.rpython.objectmodel import hint + A = lltype.GcArray(lltype.Char) + def ll_interp(code): + accum = 0 + pc = 0 + while pc < len(code): + opcode = hint(code[pc], concrete=True) + pc += 1 + if opcode == 'A': + accum += 6 + elif opcode == 'B': + if accum < 20: + pc = 0 + return accum + bytecode = lltype.malloc(A, 5) + bytecode[0] = 'A' + bytecode[1] 
= 'A' + bytecode[2] = 'A' + bytecode[3] = 'B' + bytecode[4] = 'A' + graph2, insns = abstrinterp(ll_interp, [bytecode], [], + policy=P_HINT_DRIVEN) + assert insns == {'int_add': 4, 'int_lt': 1} From ericvrp at codespeak.net Sat Dec 17 20:16:29 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Sat, 17 Dec 2005 20:16:29 +0100 (CET) Subject: [pypy-svn] r21259 - in pypy/dist/pypy/translator/llvm: . backendopt Message-ID: <20051217191629.262A527B82@code1.codespeak.net> Author: ericvrp Date: Sat Dec 17 20:16:23 2005 New Revision: 21259 Modified: pypy/dist/pypy/translator/llvm/backendopt/exception.py pypy/dist/pypy/translator/llvm/backendopt/mergemallocs.py pypy/dist/pypy/translator/llvm/backendopt/removeexcmallocs.py pypy/dist/pypy/translator/llvm/funcnode.py Log: use graph.iterblocks() Modified: pypy/dist/pypy/translator/llvm/backendopt/exception.py ============================================================================== --- pypy/dist/pypy/translator/llvm/backendopt/exception.py (original) +++ pypy/dist/pypy/translator/llvm/backendopt/exception.py Sat Dec 17 20:16:23 2005 @@ -17,8 +17,7 @@ global n_calls, n_calls_patched n_calls_patched_begin = n_calls_patched e = translator.rtyper.getexceptiondata() - blocks = [x for x in flatten(graph) if isinstance(x, Block)] - for block in blocks: + for block in graph.iterblocks(): last_operation = len(block.operations)-1 if block.exitswitch == c_last_exception: last_operation -= 1 Modified: pypy/dist/pypy/translator/llvm/backendopt/mergemallocs.py ============================================================================== --- pypy/dist/pypy/translator/llvm/backendopt/mergemallocs.py (original) +++ pypy/dist/pypy/translator/llvm/backendopt/mergemallocs.py Sat Dec 17 20:16:23 2005 @@ -1,4 +1,4 @@ -from pypy.objspace.flow.model import Block, flatten, SpaceOperation, Constant, Variable +from pypy.objspace.flow.model import SpaceOperation, Constant, Variable from pypy.rpython.lltypesystem.lltype import 
GcStruct, Void, Ptr from pypy.translator.llvm.backendopt.support import log @@ -14,8 +14,7 @@ warning: some will consider this a dirty hack, that's ok! :) """ n_times_merged = 0 - blocks = [x for x in flatten(graph) if isinstance(x, Block)] - for block in blocks: + for block in graph.iterblocks(): mallocs = [[], []] for i, op in enumerate(block.operations): if op.opname != 'malloc' or op.args[0].value._arrayfld: Modified: pypy/dist/pypy/translator/llvm/backendopt/removeexcmallocs.py ============================================================================== --- pypy/dist/pypy/translator/llvm/backendopt/removeexcmallocs.py (original) +++ pypy/dist/pypy/translator/llvm/backendopt/removeexcmallocs.py Sat Dec 17 20:16:23 2005 @@ -18,8 +18,7 @@ """ n_removed = 0 n_removed_of_type = {} - blocks = [x for x in flatten(graph) if isinstance(x, Block)] - for block in blocks: + for block in graph.iterblocks(): ops = block.operations if len(ops) < 3 or \ ops[0].opname != 'malloc' or ops[1].opname != 'cast_pointer' or \ Modified: pypy/dist/pypy/translator/llvm/funcnode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/funcnode.py (original) +++ pypy/dist/pypy/translator/llvm/funcnode.py Sat Dec 17 20:16:23 2005 @@ -1,5 +1,5 @@ from pypy.objspace.flow.model import Block, Constant, Link -from pypy.objspace.flow.model import flatten, mkentrymap, traverse, c_last_exception +from pypy.objspace.flow.model import mkentrymap, traverse, c_last_exception from pypy.rpython.lltypesystem import lltype from pypy.translator.llvm.node import LLVMNode, ConstantLLVMNode from pypy.translator.llvm.opwriter import OpWriter @@ -79,11 +79,10 @@ codewriter.openfunc(self.getdecl(), self is self.db.entrynode) nextblock = graph.startblock args = graph.startblock.inputargs - l = [x for x in flatten(graph) if isinstance(x, Block)] self.block_to_name = {} - for i, block in enumerate(l): + for i, block in enumerate(graph.iterblocks()): 
self.block_to_name[block] = "block%s" % i - for block in l: + for block in graph.iterblocks(): codewriter.label(self.block_to_name[block]) for name in 'startblock returnblock exceptblock'.split(): if block is getattr(graph, name): @@ -95,8 +94,7 @@ def writecomments(self, codewriter): """ write operations strings for debugging purposes. """ - blocks = [x for x in flatten(self.graph) if isinstance(x, Block)] - for block in blocks: + for block in self.graph.iterblocks(): for op in block.operations: strop = str(op) + "\n\x00" l = len(strop) From ericvrp at codespeak.net Sat Dec 17 20:23:39 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Sat, 17 Dec 2005 20:23:39 +0100 (CET) Subject: [pypy-svn] r21260 - in pypy/dist/pypy/translator/js: . src test Message-ID: <20051217192339.5C89527B82@code1.codespeak.net> Author: ericvrp Date: Sat Dec 17 20:23:35 2005 New Revision: 21260 Added: pypy/dist/pypy/translator/js/optimize.py Modified: pypy/dist/pypy/translator/js/codewriter.py pypy/dist/pypy/translator/js/conftest.py pypy/dist/pypy/translator/js/database.py pypy/dist/pypy/translator/js/funcnode.py pypy/dist/pypy/translator/js/js.py pypy/dist/pypy/translator/js/opwriter.py pypy/dist/pypy/translator/js/src/ll_stackless.js pypy/dist/pypy/translator/js/src/stack.js pypy/dist/pypy/translator/js/support.py pypy/dist/pypy/translator/js/test/test_class.py pypy/dist/pypy/translator/js/test/test_exception.py pypy/dist/pypy/translator/js/test/test_genllvm.py pypy/dist/pypy/translator/js/test/test_lltype.py Log: * about 10 more passing tests (by adding pow(), etc) * added some js code for handling basic string manipulation natively * add a --compress option to conftest.py to do just that * only output exception match code when exceptions are used Modified: pypy/dist/pypy/translator/js/codewriter.py ============================================================================== --- pypy/dist/pypy/translator/js/codewriter.py (original) +++ 
pypy/dist/pypy/translator/js/codewriter.py Sat Dec 17 20:23:35 2005 @@ -1,5 +1,6 @@ import py from itertools import count +from pypy.translator.js.optimize import optimize_call from pypy.translator.js.log import log log = log.codewriter @@ -163,8 +164,9 @@ if self.js.stackless: # XXX and this funcnode has resumepoints self.append("if (slp_frame_stack_top) {") self.indent_more() + self.append('var t = slp_frame_stack_top.vars') for i, k in enumerate(self._usedvars.keys()): - self.append('%-19s = slp_frame_stack_top.vars[%d]' % (k, i)) + self.append('%-19s = t[%d]' % (k, i)) self.append('%-19s = slp_frame_stack_top.resume_blocknum' % 'block') self.append('eval(slp_frame_stack_top.targetvar + " = slp_return_value")') self.append('slp_frame_stack_top = null') @@ -180,7 +182,7 @@ self.indent_less() self.append("}") #end of forever (block) loop self.indent_less() - self.append("};") #end of function + self.append("}") #end of function self.newline() def ret(self, ref=''): @@ -200,7 +202,7 @@ assert no_exception is None if self.js.stackless: self.append("slp_stack_depth++") - self.append('%s = %s(%s)' % (targetvar, functionref, args)) + self.append( optimize_call('%s = %s(%s)' % (targetvar, functionref, args)) ) if self.js.stackless: self.append("slp_stack_depth--") selfdecl = self.decl.split('(')[0] @@ -227,7 +229,7 @@ if self.js.stackless: self.comment('TODO: XXX stackless in combination with exceptions handling') self.append("slp_stack_depth++") - self.append('%s = %s(%s)' % (targetvar, functionref, args)) + self.append( optimize_call('%s = %s(%s)' % (targetvar, functionref, args)) ) if self.js.stackless: self.append("slp_stack_depth--") #XXX we don't actually get here when an exception occurs! 
self._phi(no_exception_exit) @@ -264,7 +266,6 @@ self.indent_less() self.append('}') - def cast(self, targetvar, fromtype, fromvar, targettype): if fromtype == 'void' and targettype == 'void': return @@ -288,9 +289,9 @@ res += ", ".join(["%s %s" % (t, i) for t, i in indices]) self.comment(res) - #res = "%(targetvar)s = %(typevar)s" % locals() - #res += ''.join(['[%s]' % i for t, i in indices]) - #self.append(res) + res = "%(targetvar)s = %(typevar)s" % locals() + res += ''.join(['[%s]' % i for t, i in indices[1:]]) + self.append(res) def load(self, destvar, src, srcindices): res = "%(destvar)s = %(src)s" % locals() Modified: pypy/dist/pypy/translator/js/conftest.py ============================================================================== --- pypy/dist/pypy/translator/js/conftest.py (original) +++ pypy/dist/pypy/translator/js/conftest.py Sat Dec 17 20:23:35 2005 @@ -7,6 +7,8 @@ default=False, help="run Javascript tests in your default browser"), Option('--stackless', action="store_true",dest="jsstackless", default=False, help="enable stackless feature"), + Option('--compress', action="store_true",dest="jscompress", + default=False, help="enable javascript compression"), Option('--log', action="store_true",dest="jslog", default=False, help="log debugging info"), ) Modified: pypy/dist/pypy/translator/js/database.py ============================================================================== --- pypy/dist/pypy/translator/js/database.py (original) +++ pypy/dist/pypy/translator/js/database.py Sat Dec 17 20:23:35 2005 @@ -23,12 +23,13 @@ lltype.UniChar: "uint", lltype.Void: "void"} - def __init__(self, translator): + def __init__(self, translator, js): self.translator = translator + self.js = js self.obj2node = {} self._pendingsetup = [] self._tmpcount = 1 - self.namespace = JavascriptNameManager() + self.namespace = JavascriptNameManager(js) #_______setting up and preperation______________________________ Modified: pypy/dist/pypy/translator/js/funcnode.py 
============================================================================== --- pypy/dist/pypy/translator/js/funcnode.py (original) +++ pypy/dist/pypy/translator/js/funcnode.py Sat Dec 17 20:23:35 2005 @@ -32,6 +32,12 @@ self.db.prepare_arg(op.result) if block.exitswitch != c_last_exception: continue + if hasattr(self.graph, 'exceptblock'): + from pypy.rpython.rmodel import inputconst + e = self.db.translator.rtyper.getexceptiondata() + matchptr = e.fn_exception_match + matchconst = inputconst(lltype.typeOf(matchptr), matchptr) + self.db.prepare_arg_value(matchconst) for link in block.exits[1:]: self.db.prepare_constant(lltype.typeOf(link.llexitcase), link.llexitcase) @@ -118,5 +124,5 @@ def __init__(self, db, value): self.db = db self.value = value - self.ref = db.namespace.uniquename(value.graph.name) + self.ref = value.graph.name #keep the exact name (do not compress) self.graph = value.graph Modified: pypy/dist/pypy/translator/js/js.py ============================================================================== --- pypy/dist/pypy/translator/js/js.py (original) +++ pypy/dist/pypy/translator/js/js.py Sat Dec 17 20:23:35 2005 @@ -17,6 +17,7 @@ from pypy.translator.js.node import Node from pypy.translator.js.database import Database from pypy.translator.js.codewriter import CodeWriter +from pypy.translator.js.optimize import optimize_filesize from pypy.translator.js.log import log from pypy.translator.js import conftest @@ -27,11 +28,12 @@ return path class JS(object): # JS = Javascript - def __init__(self, translator, entrypoint=None, stackless=False, logging=False): - self.db = Database(translator) + def __init__(self, translator, entrypoint=None, stackless=False, compress=False, logging=False): self.entrypoint = entrypoint or translator.entrypoint self.stackless = stackless or conftest.option.jsstackless + self.compress = compress or conftest.option.jscompress self.logging = logging or conftest.option.jslog + self.db = Database(translator, self) def 
write_source(self): func = self.entrypoint @@ -40,12 +42,6 @@ c = inputconst(lltype.typeOf(ptr), ptr) self.db.prepare_arg_value(c) - #add exception matching function (XXX should only be done when needed) - e = self.db.translator.rtyper.getexceptiondata() - matchptr = e.fn_exception_match - matchconst = inputconst(lltype.typeOf(matchptr), matchptr) - self.db.prepare_arg_value(matchconst) - # set up all nodes self.db.setup_all() @@ -96,6 +92,9 @@ f.close() + if self.compress: + optimize_filesize(str(self.filename)) + entry_point= c.value._obj self.graph = self.db.obj2node[entry_point].graph Added: pypy/dist/pypy/translator/js/optimize.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/js/optimize.py Sat Dec 17 20:23:35 2005 @@ -0,0 +1,61 @@ +optimized_functions = [ + 'll_strlen__rpy_stringPtr', + 'll_strconcat__rpy_stringPtr_rpy_stringPtr', + 'll_stritem_nonneg__rpy_stringPtr_Signed', + 'll_stritem__rpy_stringPtr_Signed', + 'll_streq__rpy_stringPtr_rpy_stringPtr', + 'll_issubclass__object_vtablePtr_object_vtablePtr' #TODO +] + + +def optimize_call(statement): + targetvar, statement = statement.split(' = ', 1) + funcname, params = statement.split('(', 1) + params = [param.strip() for param in params[:-1].split(',')] + + if funcname == 'll_strlen__rpy_stringPtr': + return '%s = %s.chars.length' % (targetvar, params[0]) + + elif funcname == 'll_strconcat__rpy_stringPtr_rpy_stringPtr': + #XXX javascript of ll_strconcat__rpy_stringPtr_rpy_stringPtr actually does not work, FIX IT! 
+ # by outcommenting this code end running js/test/test_genllvm.py -k test_simple_chars + p = '%s.chars' % '.chars + '.join(params) + return '%s = new Object({hash:0, chars:%s})' % (targetvar, p) + + elif funcname == 'll_stritem_nonneg__rpy_stringPtr_Signed': + return '%s = %s.chars[%s]' % (targetvar, params[0], params[1]) + + elif funcname == 'll_stritem__rpy_stringPtr_Signed': + s, i = params + return '%s = %s.chars[%s >= 0 ? %s : %s + %s.chars.length]' % (targetvar, s, i, i, i, s) + + elif funcname == 'll_streq__rpy_stringPtr_rpy_stringPtr': + s0, s1 = params + return '%s = (%s == %s) || (%s && %s && %s.chars == %s.chars)' %\ + (targetvar, s0,s1, s0,s1, s0,s1) + + return '%s = %s(%s)' % (targetvar, funcname, ', '.join(params)) + +def optimize_filesize(filename): + f = open(filename, "r") + lines = f.readlines() + f.close() + + f = open(filename, "w") + for line in lines: + line = line.strip() + if not line or line.startswith('//'): + continue + t = line.split('//', 1) + if len(t) == 2 and '\'"' not in t[0]: + line = t[0].strip() + if not line: + continue + t = line.split('=', 1) + if len(t) == 2: + t[0] = t[0].strip() + t[1] = t[1].strip() + if '\'"' not in t[0]: + line = '%s=%s' % (t[0], t[1]) + f.write(line) + f.close() Modified: pypy/dist/pypy/translator/js/opwriter.py ============================================================================== --- pypy/dist/pypy/translator/js/opwriter.py (original) +++ pypy/dist/pypy/translator/js/opwriter.py Sat Dec 17 20:23:35 2005 @@ -97,31 +97,11 @@ meth(op) def _generic_pow(self, op, onestr): - mult_val = self.db.repr_arg(op.args[0]) - last_val = mult_val - try: - value = "NO VALUE" - value = op.args[1].value - operand = int(value) - except Exception, exc: - msg = 'XXX: Error: _generic_pow: Variable '\ - '%s - failed to convert to int %s' % (value, str(exc)) - self.codewriter.comment(msg) - return - if operand < 1: - res_val = onestr - else: - res_val = mult_val - for ii in range(operand - 1): - #res_val = 
self.db.repr_tmpvar() - self.codewriter.binaryop('*', - res_val, - last_val, - mult_val) - last_val = res_val targetvar = self.db.repr_arg(op.result) - self.codewriter.cast(targetvar, mult_type, res_val, mult_type) - + mult_val = self.db.repr_arg(op.args[0]) + value = op.args[1].value + self.codewriter.append('%s = Math.pow(%s, %s)' % (targetvar, mult_val, value)) + def _skipped(self, op): self.codewriter.comment('Skipping operation %s()' % op.opname) pass @@ -298,13 +278,13 @@ type_ = 'Array' else: type_ = 'Object' #self.db.repr_type(arg_type) - self.codewriter.comment(str(arg_type)) - self.codewriter.comment(str(op.args[0])) - self.codewriter.malloc(targetvar, type_) - if t[1] == 'rpy_string': #XXX this should be done correctly for all types offcourse! - #self.codewriter.append(targetvar + '.length = 0') - self.codewriter.append(targetvar + '.hash = 0') - self.codewriter.append(targetvar + '.chars = ""') + #XXX this should be done correctly for all types offcourse! + if type_ == 'Object' and t[1] == 'rpy_string': + self.codewriter.append(targetvar + ' = new Object({hash:0, chars:""})') + else: + self.codewriter.comment(str(arg_type)) + self.codewriter.comment(str(op.args[0])) + self.codewriter.malloc(targetvar, type_) malloc_exception = malloc malloc_varsize = malloc Modified: pypy/dist/pypy/translator/js/src/ll_stackless.js ============================================================================== --- pypy/dist/pypy/translator/js/src/ll_stackless.js (original) +++ pypy/dist/pypy/translator/js/src/ll_stackless.js Sat Dec 17 20:23:35 2005 @@ -32,7 +32,7 @@ } return result; } -ll_stackless_stack_frames_depth__ = ll_stackless_stack_frames_depth +ll_stackless_stack_frames_depth__ = ll_stackless_stack_frames_depth; // @@ -86,7 +86,7 @@ LOG('slp_frame_stack_top='+slp_frame_stack_top + ', slp_frame_stack_bottom='+slp_frame_stack_bottom) return slp_return_value; } -ll_stack_unwind__ = ll_stack_unwind +ll_stack_unwind__ = ll_stack_unwind; function 
slp_return_current_frame_to_caller() { LOG("slp_return_current_frame_to_caller"); Modified: pypy/dist/pypy/translator/js/src/stack.js ============================================================================== --- pypy/dist/pypy/translator/js/src/stack.js (original) +++ pypy/dist/pypy/translator/js/src/stack.js Sat Dec 17 20:23:35 2005 @@ -1,3 +1,5 @@ +// Start of helpers + function ll_stack_too_big_helper(depth) { if (depth > 0) { ll_stack_too_big_helper(depth-1) @@ -18,3 +20,5 @@ throw "Recursion limit exceeded"; } ll_stack_unwind__ = ll_stack_unwind; + +// End of helpers Modified: pypy/dist/pypy/translator/js/support.py ============================================================================== --- pypy/dist/pypy/translator/js/support.py (original) +++ pypy/dist/pypy/translator/js/support.py Sat Dec 17 20:23:35 2005 @@ -1,9 +1,10 @@ from pypy.translator.gensupp import NameManager - +from pypy.translator.js.optimize import optimized_functions class JavascriptNameManager(NameManager): - def __init__(self): + def __init__(self, js): NameManager.__init__(self) + self.js = js # keywords cannot be reused. This is the C99 draft's list. #XXX this reserved_names list is incomplete! 
reserved_names_string = ''' @@ -18,6 +19,11 @@ self.reserved_names[name] = True self.make_reserved_names(reserved_names_string) + def uniquename(self, name): + if self.js.compress and name != self.js.entrypoint.func_name and name not in optimized_functions: + name = 'f' + return NameManager.uniquename(self, name) + def ensure_non_reserved(self, name): while name in self.reserved_names: name += '_' Modified: pypy/dist/pypy/translator/js/test/test_class.py ============================================================================== --- pypy/dist/pypy/translator/js/test/test_class.py (original) +++ pypy/dist/pypy/translator/js/test/test_class.py Sat Dec 17 20:23:35 2005 @@ -22,7 +22,7 @@ f = compile_function(llvmsnippet.class_simple2, [int]) assert f(2) == 10 - def DONTtest_inherit1(self): #unknown issue + def DONTtest_inherit1(self): #issue with empty object mallocs f = compile_function(llvmsnippet.class_inherit1, []) assert f() == 11 @@ -56,7 +56,7 @@ assert f(True) == 1 assert f(False) == 2 - def DONTtest_global_instance(self): #issue unknown + def DONTtest_global_instance(self): #issue unknown TEST THIS! 
f = compile_function(llvmsnippet.global_instance, [int]) assert f(-1) == llvmsnippet.global_instance(-1) for i in range(20): Modified: pypy/dist/pypy/translator/js/test/test_exception.py ============================================================================== --- pypy/dist/pypy/translator/js/test/test_exception.py (original) +++ pypy/dist/pypy/translator/js/test/test_exception.py Sat Dec 17 20:23:35 2005 @@ -164,7 +164,7 @@ for i in range(10, 20): assert f(i) == fn(i) -def DONTtest_catches(): #issue with last exception value not being set +def DONTtest_catches(): #issue with empty object mallocs def raises(i): if i == 3: raise MyException, 12 Modified: pypy/dist/pypy/translator/js/test/test_genllvm.py ============================================================================== --- pypy/dist/pypy/translator/js/test/test_genllvm.py (original) +++ pypy/dist/pypy/translator/js/test/test_genllvm.py Sat Dec 17 20:23:35 2005 @@ -56,7 +56,7 @@ assert f(True) == 12 assert f(False) == 13 -def DONTtest_int_ops(): #issue in opwriter._generic_pow +def test_int_ops(): def ops(i): x = 0 x += i < i @@ -70,14 +70,12 @@ x += x ** 1 x += x ** 2 x += i + 1 * i // i - 1 - #x += i is not None - #x += i is None return x f = compile_function(ops, [int]) assert f(1) == ops(1) assert f(2) == ops(2) -def DONTtest_uint_ops(): #issue in opwriter._generic_pow +def test_uint_ops(): def ops(i): x = 0 x += i < i @@ -91,14 +89,12 @@ x += x ** 1 x += x ** 2 x += i + 1 * i // i - 1 - #x += i is not None - #x += i is None return x f = compile_function(ops, [r_uint]) assert f(1) == ops(1) assert f(2) == ops(2) -def DONTtest_float_ops(): #issue with opwriter.generic_pow +def test_float_ops(): def ops(flt): x = 0 x += flt < flt @@ -111,8 +107,6 @@ x += x ** 1 x += x ** 2 x += int(flt + 1 * flt / flt - 1) - #x += flt fs not None - #x += flt is None return x f = compile_function(ops, [float]) assert f(1) == ops(1) @@ -230,7 +224,7 @@ assert f(-1) == 3 assert f(0) == 5 -def DONTtest_simple_chars(): 
#issue unknown +def test_simple_chars(): #XXX test this also without optimize_call(...) def char_constant2(s): s = s + s + s return len(s + '.') @@ -294,7 +288,7 @@ f = compile_function(string_simple, [int]) assert f(0) -def DONTtest_string_simple_ops(): #same issue with empty Object mallocs +def DONTtest_string_simple_ops(): #issue with casts def string_simple_ops(i): res = 0 s = str(i) @@ -321,7 +315,7 @@ f = compile_function(string_test, [int]) assert f(0) == ord("H") -def DONTtest_list_of_string(): #issue probably also with malloc of empty Objects +def DONTtest_list_of_string(): #issue with casts a = ["hello", "world"] def string_simple(i, j, k, l): s = a[i][j] + a[k][l] @@ -359,7 +353,7 @@ f = compile_function(method_call, []) assert f() == 4 -def DONTtest_dict_creation(): #issue unknown +def test_dict_creation(): d = {'hello' : 23, 'world' : 21} l = ["hello", "world"] Modified: pypy/dist/pypy/translator/js/test/test_lltype.py ============================================================================== --- pypy/dist/pypy/translator/js/test/test_lltype.py (original) +++ pypy/dist/pypy/translator/js/test/test_lltype.py Sat Dec 17 20:23:35 2005 @@ -101,7 +101,7 @@ f = compile_function(struct_constant, []) assert f() == struct_constant() -def DONTtest_aliasing(): #issue with missing operation (v229 = getelementptr arrayinstance, word 0, uint 1, i_0) +def test_aliasing(): B = lltype.Struct('B', ('x', lltype.Signed)) A = lltype.Array(B) global_a = lltype.malloc(A, 5, immortal=True) @@ -113,7 +113,7 @@ assert f(2) == 0 assert f(3) == 17 -def DONTtest_aliasing2(): #issue with missing operation (v230 = getelementptr arrayinstance, word 0, uint 1, i_0) +def test_aliasing2(): B = lltype.Struct('B', ('x', lltype.Signed)) A = lltype.Array(B) C = lltype.Struct('C', ('x', lltype.Signed), ('bptr', lltype.Ptr(B))) @@ -150,7 +150,7 @@ f = compile_function(array_constant, []) assert f() == array_constant() -def DONTtest_array_constant3(): #issue with missing operation (v289 = 
getelementptr arrayinstance, word 0, uint 1, 0) +def test_array_constant3(): A = lltype.GcArray(('x', lltype.Signed)) a = lltype.malloc(A, 3) a[0].x = 100 From arigo at codespeak.net Sat Dec 17 21:50:07 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 17 Dec 2005 21:50:07 +0100 (CET) Subject: [pypy-svn] r21261 - in pypy/dist/pypy: annotation rpython rpython/test Message-ID: <20051217205007.5D57B27B66@code1.codespeak.net> Author: arigo Date: Sat Dec 17 21:50:02 2005 New Revision: 21261 Modified: pypy/dist/pypy/annotation/builtin.py pypy/dist/pypy/annotation/unaryop.py pypy/dist/pypy/rpython/llinterp.py pypy/dist/pypy/rpython/objectmodel.py pypy/dist/pypy/rpython/rbuiltin.py pypy/dist/pypy/rpython/test/test_objectmodel.py Log: The hint() built-in and corresponding ll operation. Modified: pypy/dist/pypy/annotation/builtin.py ============================================================================== --- pypy/dist/pypy/annotation/builtin.py (original) +++ pypy/dist/pypy/annotation/builtin.py Sat Dec 17 21:50:02 2005 @@ -298,6 +298,9 @@ def robjmodel_keepalive_until_here(*args_s): return immutablevalue(None) +def robjmodel_hint(s, **kwds_s): + return s + def rstack_yield_current_frame_to_caller(): return SomeExternalObject(pypy.rpython.rstack.frame_stack_top) @@ -343,6 +346,7 @@ BUILTIN_ANALYZERS[pypy.rpython.objectmodel.r_dict] = robjmodel_r_dict BUILTIN_ANALYZERS[pypy.rpython.objectmodel.hlinvoke] = robjmodel_hlinvoke BUILTIN_ANALYZERS[pypy.rpython.objectmodel.keepalive_until_here] = robjmodel_keepalive_until_here +BUILTIN_ANALYZERS[pypy.rpython.objectmodel.hint] = robjmodel_hint BUILTIN_ANALYZERS[pypy.rpython.rstack.yield_current_frame_to_caller] = ( rstack_yield_current_frame_to_caller) Modified: pypy/dist/pypy/annotation/unaryop.py ============================================================================== --- pypy/dist/pypy/annotation/unaryop.py (original) +++ pypy/dist/pypy/annotation/unaryop.py Sat Dec 17 21:50:02 2005 @@ -503,12 
+503,15 @@ return bltn.analyser(*args) def call(bltn, args, implicit_init=False): - args, kw = args.unpack() - assert not kw, "don't call builtins with keywords arguments" + args_s, kwds = args.unpack() + # prefix keyword arguments with 's_' + kwds_s = {} + for key, s_value in kwds.items(): + kwds_s['s_'+key] = s_value if bltn.s_self is not None: - return bltn.analyser(bltn.s_self, *args) + return bltn.analyser(bltn.s_self, *args_s, **kwds_s) else: - return bltn.analyser(*args) + return bltn.analyser(*args_s, **kwds_s) class __extend__(SomePBC): Modified: pypy/dist/pypy/rpython/llinterp.py ============================================================================== --- pypy/dist/pypy/rpython/llinterp.py (original) +++ pypy/dist/pypy/rpython/llinterp.py Sat Dec 17 21:50:02 2005 @@ -271,6 +271,9 @@ def op_same_as(self, x): return x + def op_hint(self, x, hints): + return x + def op_setfield(self, obj, fieldname, fieldvalue): # obj should be pointer FIELDTYPE = getattr(self.llt.typeOf(obj).TO, fieldname) Modified: pypy/dist/pypy/rpython/objectmodel.py ============================================================================== --- pypy/dist/pypy/rpython/objectmodel.py (original) +++ pypy/dist/pypy/rpython/objectmodel.py Sat Dec 17 21:50:02 2005 @@ -20,6 +20,9 @@ def keepalive_until_here(*values): pass +def hint(x, **kwds): + return x + class FREED_OBJECT(object): def __getattribute__(self, attr): Modified: pypy/dist/pypy/rpython/rbuiltin.py ============================================================================== --- pypy/dist/pypy/rpython/rbuiltin.py (original) +++ pypy/dist/pypy/rpython/rbuiltin.py Sat Dec 17 21:50:02 2005 @@ -64,6 +64,23 @@ hop2.r_s_popfirstarg() return bltintyper(hop2) + def rtype_call_args(self, hop): + # calling a built-in function with keyword arguments: + # mostly for rpython.objectmodel.hint() + from pypy.interpreter.argument import Arguments + arguments = Arguments.fromshape(None, hop.args_s[1].const, # shape + hop.args_s[2:]) + 
args_s, kwds = arguments.unpack() + # prefix keyword arguments with 's_' + kwds_s = {} + for key, s_value in kwds.items(): + kwds_s['s_'+key] = s_value + bltintyper = BUILTIN_TYPER[self.builtinfunc] + hop2 = hop.copy() + hop2.r_s_popfirstarg() + hop2.r_s_popfirstarg() + return bltintyper(hop2, **kwds_s) + class BuiltinMethodRepr(Repr): @@ -390,3 +407,18 @@ return hop.inputconst(lltype.Void, None) BUILTIN_TYPER[objectmodel.keepalive_until_here] = rtype_keepalive_until_here + +# hint + +def rtype_hint(hop, **kwds_s): + hints = {} + for key, s_value in kwds_s.items(): + if not s_value.is_constant(): + raise TyperError("hint %r is not constant" % (key,)) + assert key.startswith('s_') + hints[key[2:]] = s_value.const + v = hop.inputarg(hop.args_r[0], arg=0) + c_hint = hop.inputconst(lltype.Void, hints) + return hop.genop('hint', [v, c_hint], resulttype=v.concretetype) + +BUILTIN_TYPER[objectmodel.hint] = rtype_hint Modified: pypy/dist/pypy/rpython/test/test_objectmodel.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_objectmodel.py (original) +++ pypy/dist/pypy/rpython/test/test_objectmodel.py Sat Dec 17 21:50:02 2005 @@ -190,3 +190,11 @@ res = interpret(f, []) assert res == 1 + +def test_hint(): + from pypy.rpython import objectmodel + def f(): + x = objectmodel.hint(5, hello="world") + return x + res = interpret(f, []) + assert res == 5 From arigo at codespeak.net Sun Dec 18 11:05:39 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 18 Dec 2005 11:05:39 +0100 (CET) Subject: [pypy-svn] r21262 - in pypy/dist/pypy/translator: backendopt test Message-ID: <20051218100539.90F5027B54@code1.codespeak.net> Author: arigo Date: Sun Dec 18 11:05:36 2005 New Revision: 21262 Modified: pypy/dist/pypy/translator/backendopt/ssa.py pypy/dist/pypy/translator/test/test_simplify.py Log: Improve translator.simplify.remove_identical_vars() to find and eliminate duplicate input variables of a block also 
in the case where a link provides several times the same *constant* for this input variable. Seems to be required for hint() to work as expected in the jit. XXX does this noticeably slow down translate_pypy? Modified: pypy/dist/pypy/translator/backendopt/ssa.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/ssa.py (original) +++ pypy/dist/pypy/translator/backendopt/ssa.py Sun Dec 18 11:05:36 2005 @@ -15,21 +15,24 @@ # the nth output variable from each of the incoming links, in a list: # [Block, blockvar, linkvar, linkvar, linkvar...] opportunities = [] + opportunities_with_const = [] for block, links in mkentrymap(graph).items(): if block is graph.startblock: continue assert links for n, inputvar in enumerate(block.inputargs): vars = [block, inputvar] + put_in = opportunities for link in links: var = link.args[n] if not isinstance(var, Variable): - break + put_in = opportunities_with_const vars.append(var) - else: - # if no Constant found in the incoming links - opportunities.append(vars) + # if any link provides a Constant, record this in + # the opportunities_with_const list instead + put_in.append(vars) self.opportunities = opportunities + self.opportunities_with_const = opportunities_with_const self.variable_families = UnionFind() def complete(self): @@ -66,9 +69,9 @@ while progress: progress = False block_phi_nodes = {} # in the SSA sense - for vars in self.opportunities: + for vars in self.opportunities + self.opportunities_with_const: block, blockvar = vars[:2] - linksvars = vars[2:] # from the incoming links + linksvars = vars[2:] # from the incoming links (vars+consts) linksvars = [variable_families.find_rep(v) for v in linksvars] phi_node = (block,) + tuple(linksvars) # ignoring n and blockvar if phi_node in block_phi_nodes: Modified: pypy/dist/pypy/translator/test/test_simplify.py ============================================================================== --- 
pypy/dist/pypy/translator/test/test_simplify.py (original) +++ pypy/dist/pypy/translator/test/test_simplify.py Sun Dec 18 11:05:36 2005 @@ -82,4 +82,15 @@ assert op.opname != 'getfield' if op.opname == 'keepalive': assert op.args[0] in graph.getargs() - + + +def test_remove_identical_variables(): + def g(code): + pc = 0 + while pc < len(code): + pc += 1 + return pc + + graph = TranslationContext().buildflowgraph(g) + for block in graph.iterblocks(): + assert len(block.inputargs) <= 2 # at most 'pc' and 'code' From arigo at codespeak.net Sun Dec 18 11:38:12 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 18 Dec 2005 11:38:12 +0100 (CET) Subject: [pypy-svn] r21263 - in pypy/dist/pypy: jit jit/test translator/c Message-ID: <20051218103812.D659527DB5@code1.codespeak.net> Author: arigo Date: Sun Dec 18 11:38:09 2005 New Revision: 21263 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_jit_tl.py pypy/dist/pypy/jit/test/test_llabstractinterp.py pypy/dist/pypy/jit/test/test_tl.py pypy/dist/pypy/jit/tl.py pypy/dist/pypy/translator/c/funcgen.py Log: Start of the hint()-driven approach. So far, it works on the small example (and only there :-). The idea is to have a flag 'fixed' on LLRuntimeValues that can be set to True a posteriori, to mean that the constant contained in this LLRuntimeValue should be passed over to the next block as a constant -- instead of the default behavior which is to turn it into a variable. The 'hint' operation sets some 'fixed' flags to True and raises a RestartCompleting to restart the whole process. An 'origin' attribute on LLRuntimeValues tracks the multiple possible histories of a variable back to the point(s) where it was constant. I removed the code for constant propagation with automatic generalization because the new a-posteriori-constantification logic is confusing enough to get right without having to additionally worry about not breaking the const_propagate policy. 
I guess we'll have to re-insert this code when things start to work again. Added the hint() in the TL source code and implemented the 'hint' operation in GenC. The test_jit_tl is disabled because it explodes. Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Sun Dec 18 11:38:09 2005 @@ -57,6 +57,8 @@ class LLRuntimeValue(LLAbstractValue): + origin = None + fixed = False def __init__(self, orig_v): if isinstance(orig_v, Variable): @@ -81,11 +83,10 @@ return self.copy_v def getruntimevars(self, memo): - if (isinstance(self.copy_v, Variable) or - self not in memo.propagate_as_constants): - return [self.copy_v] + if memo.get_fixed_flags: + return [self.fixed] else: - return [] # we propagate this constant as a constant + return [self.copy_v] def maybe_get_constant(self): if isinstance(self.copy_v, Constant): @@ -95,22 +96,17 @@ def with_fresh_variables(self, memo): # don't use memo.seen here: shared variables must become distinct - if (isinstance(self.copy_v, Variable) or - self not in memo.propagate_as_constants): - return LLRuntimeValue(self.getconcretetype()) - else: - return self # we are allowed to propagate this constant + if memo.key is not None: + c = memo.key.next() + if c is not None: + return LLRuntimeValue(c) + result = LLRuntimeValue(self.getconcretetype()) + result.origin = [self] + return result def match(self, other, memo): - if not isinstance(other, LLRuntimeValue): - return False - if isinstance(self.copy_v, Variable): - return True - if self.copy_v == other.copy_v: - memo.propagate_as_constants[other] = True # exact match - else: - memo.exact_match = False - return True + memo.dependencies.append((self, other, self.fixed)) + return isinstance(other, LLRuntimeValue) ll_no_return_value = LLRuntimeValue(const(None, lltype.Void)) @@ -334,7 +330,7 @@ self.a_back = a_back self.args_a 
= args_a self.origblock = origblock - self.copyblock = None + self.copyblocks = {} assert len(args_a) == len(self.getlivevars()) def key(self): @@ -388,18 +384,6 @@ else: return True - def resolveblock(self, newblock): - #print "RESOLVING BLOCK", newblock - if self.copyblock is not None: - # uncommon case: must patch the existing Block - assert len(self.copyblock.inputargs) == len(newblock.inputargs) - self.copyblock.inputargs = newblock.inputargs - self.copyblock.operations = newblock.operations - self.copyblock.exitswitch = newblock.exitswitch - self.copyblock.recloseblock(*newblock.exits) - else: - self.copyblock = newblock - def getbindings(self): return dict(zip(self.getlivevars(), self.args_a)) @@ -407,7 +391,6 @@ class LLBlockState(LLState): """Entry state of a block, as a combination of LLAbstractValues for its input arguments.""" - propagate_as_constants = {} def localkey(self): return (self.origblock,) @@ -434,13 +417,13 @@ # ____________________________________________________________ class Policy(object): - def __init__(self, inlining=False, const_propagate=False, - concrete_propagate=True): + def __init__(self, inlining=False, + concrete_propagate=True, concrete_args=True): self.inlining = inlining - self.const_propagate = const_propagate self.concrete_propagate = concrete_propagate + self.concrete_args = concrete_args -best_policy = Policy(inlining=True, const_propagate=True) +best_policy = Policy(inlining=True, concrete_args=False) class LLAbstractInterp(object): @@ -460,7 +443,10 @@ args_a = [] for i, v in enumerate(origgraph.getargs()): if i in arghints: - a = LLConcreteValue(arghints[i]) + if self.policy.concrete_args: + a = LLConcreteValue(arghints[i]) + else: + a = LLRuntimeValue(const(arghints[i])) else: a = LLRuntimeValue(orig_v=v) args_a.append(a) @@ -484,8 +470,7 @@ def schedule(self, inputstate): #print "SCHEDULE", args_a, origblock state = self.schedule_getstate(inputstate) - memo = VarMemo(state.propagate_as_constants) - args_v = 
inputstate.getruntimevars(memo) + args_v = inputstate.getruntimevars(VarMemo()) newlink = Link(args_v, None) self.pendingstates[newlink] = state return newlink @@ -498,24 +483,41 @@ memo = MatchMemo() if state.match(inputstate, memo): # already matched - if memo.exact_match: - return state # exact match - if not self.policy.const_propagate: - return state # all constants will be generalized anyway - # partial match: in the old state, some constants need to - # be turned into variables. - inputstate.propagate_as_constants = memo.propagate_as_constants - # The generalized state replaces the existing one. - pendingstates[i] = inputstate - state.generalized_by = inputstate - return inputstate + must_restart = False + for statevar, inputvar, fixed in memo.dependencies: + if fixed: + must_restart |= self.hint_needs_constant(inputvar) + if must_restart: + raise RestartCompleting + for statevar, inputvar, fixed in memo.dependencies: + if statevar.origin is None: + statevar.origin = [] + statevar.origin.append(inputvar) + return state else: # cache and return this new state - if self.policy.const_propagate: - inputstate.propagate_as_constants = ALL pendingstates.append(inputstate) return inputstate + def hint_needs_constant(self, a): + if a.maybe_get_constant() is not None: + return False + fix_me = [a] + while fix_me: + a = fix_me.pop() + if not a.origin: + raise Exception("hint() failed: cannot trace the variable %r " + "back to a link where it was a constant" % (a,)) + for a_origin in a.origin: + # 'a_origin' is a LLRuntimeValue attached to a saved state + assert isinstance(a_origin, LLRuntimeValue) + if not a_origin.fixed: + print 'fixing:', a_origin + a_origin.fixed = True + if a_origin.maybe_get_constant() is None: + fix_me.append(a_origin) + return True + class GraphState(object): """Entry state of a graph.""" @@ -534,6 +536,9 @@ self.copygraph.exceptblock.inputargs[1])]: if hasattr(orig_v, 'concretetype'): copy_v.concretetype = orig_v.concretetype + # The 'args' 
attribute is needed by process_constant_input(), + # which looks for it on either a GraphState or a Link + self.args = inputstate.getruntimevars(VarMemo()) self.a_return = None self.state = "before" @@ -546,6 +551,15 @@ if self.state == "after": return self.state = "during" + while True: + try: + self.try_to_complete() + break + except RestartCompleting: + print '--- restarting ---' + continue + + def try_to_complete(self): graph = self.copygraph interp = self.interp pending = [self] @@ -555,10 +569,22 @@ while pending: next = pending.pop() state = interp.pendingstates[next] - if state.copyblock is None: - self.flowin(state) - next.settarget(state.copyblock) - for link in state.copyblock.exits: + fixed_flags = state.getruntimevars(VarMemo(get_fixed_flags=True)) + key = [] + for fixed, c in zip(fixed_flags, next.args): + if fixed: + assert isinstance(c, Constant), ( + "unexpected Variable %r reaching a fixed input arg" % + (c,)) + key.append(c) + else: + key.append(None) + key = tuple(key) + if key not in state.copyblocks: + self.flowin(state, key) + block = state.copyblocks[key] + next.settarget(block) + for link in block.exits: if link.target is None or link.target.operations != (): if link not in seen: seen[link] = True @@ -575,8 +601,7 @@ else: raise Exception("uh?") - if interp.policy.const_propagate: - self.compactify(seen) + remove_constant_inputargs(graph) # the graph should be complete now; sanity-check try: @@ -592,33 +617,28 @@ join_blocks(graph) self.state = "after" - def compactify(self, links): - # remove the parts of the graph that use constants that were later - # generalized - interp = self.interp - for link in links: - oldstate = interp.pendingstates[link] - if oldstate.generalized_by is not None: - newstate = oldstate.generalized_by - while newstate.generalized_by: - newstate = newstate.generalized_by - # Patch oldstate.block to point to the new state, - # as in the flow object space - builder = BlockBuilder(self, oldstate) - memo = 
VarMemo(newstate.propagate_as_constants) - args_v = builder.runningstate.getruntimevars(memo) - oldlink = Link(args_v, newstate.copyblock) - oldblock = builder.buildblock(None, [oldlink]) - oldstate.resolveblock(oldblock) - - def flowin(self, state): + def flowin(self, state, key): # flow in the block assert isinstance(state, LLBlockState) origblock = state.origblock origposition = 0 - builder = BlockBuilder(self.interp, state) + builder = BlockBuilder(self.interp, state, key) newexitswitch = None + # debugging print + arglist = [] + for v1, v2, k in zip(state.getruntimevars(VarMemo()), + builder.runningstate.getruntimevars(VarMemo()), + key): + if k is None: + assert isinstance(v2, Variable) + else: + assert v2 == k + arglist.append('%s => %s' % (v1, v2)) print + print '--> %s [%s]' % (origblock, ', '.join(arglist)) + for op in origblock.operations: + print '\t\t', op + # end of debugging print try: if origblock.operations == (): if state.a_back is None: @@ -691,17 +711,16 @@ newlinks.append(newlink) newblock = builder.buildblock(newexitswitch, newlinks) - state.resolveblock(newblock) + state.copyblocks[key] = newblock class BlockBuilder(object): - def __init__(self, interp, initialstate): + def __init__(self, interp, initialstate, key): self.interp = interp - memo = VarMemo(initialstate.propagate_as_constants) + memo = VarMemo(iter(key)) self.runningstate = initialstate.with_fresh_variables(memo) - memo = VarMemo(initialstate.propagate_as_constants) - self.newinputargs = self.runningstate.getruntimevars(memo) + self.newinputargs = self.runningstate.getruntimevars(VarMemo()) # {Variables-of-origblock: a_value} self.bindings = self.runningstate.getbindings() self.residual_operations = [] @@ -743,9 +762,9 @@ concretevalues.append(v.value) any_concrete = any_concrete or isinstance(a, LLConcreteValue) # can constant-fold - print 'fold:', constant_op, concretevalues + print 'fold:', constant_op.__name__, concretevalues concreteresult = constant_op(*concretevalues) - if 
any_concrete and self.policy.concrete_propagate: + if any_concrete and self.interp.policy.concrete_propagate: return LLConcreteValue(concreteresult) else: return LLRuntimeValue(const(concreteresult)) @@ -769,9 +788,21 @@ if a_result is not None: return a_result a_result = LLRuntimeValue(op.result) + if constant_op: + self.record_origin(a_result, args_a) self.residual(op.opname, args_a, a_result) return a_result + def record_origin(self, a_result, args_a): + origin = [] + for a in args_a: + if a.maybe_get_constant() is not None: + continue + if not isinstance(a, LLRuntimeValue) or a.origin is None: + return + origin.extend(a.origin) + a_result.origin = origin + # ____________________________________________________________ # Operation handlers @@ -814,6 +845,9 @@ def op_int_ne(self, op, a1, a2): return self.residualize(op, [a1, a2], operator.ne) + op_char_eq = op_int_eq + op_char_ne = op_int_ne + def op_cast_char_to_int(self, op, a): return self.residualize(op, [a], ord) @@ -823,6 +857,23 @@ def op_same_as(self, op, a): return a + def op_hint(self, op, a, a_hints): + c_hints = a_hints.maybe_get_constant() + assert c_hints is not None, "hint dict not constant" + hints = c_hints.value + if hints.get('concrete'): + # turn this 'a' into a concrete value + c = a.forcevarorconst(self) + if isinstance(c, Constant): + a = LLConcreteValue(c.value) + else: + # Oups! it's not a constant. Try to trace it back to a + # constant that was turned into a variable by a link. 
+ restart = self.interp.hint_needs_constant(a) + assert restart + raise RestartCompleting + return a + def op_direct_call(self, op, *args_a): a_result = self.handle_call(op, *args_a) if a_result is None: @@ -983,22 +1034,20 @@ def __init__(self, link): self.link = link +class RestartCompleting(Exception): + pass + class MatchMemo(object): def __init__(self): - self.exact_match = True - self.propagate_as_constants = {} + self.dependencies = [] self.self_alias = {} self.other_alias = {} class VarMemo(object): - def __init__(self, propagate_as_constants={}): + def __init__(self, key=None, get_fixed_flags=False): self.seen = {} - self.propagate_as_constants = propagate_as_constants - -class ALL(object): - def __contains__(self, other): - return True -ALL = ALL() + self.key = key + self.get_fixed_flags = get_fixed_flags def live_variables(block, position): @@ -1019,3 +1068,22 @@ if op.result in used: result.append(op.result) return result + +def remove_constant_inputargs(graph): + # for simplicity, the logic in GraphState produces graphs that can + # pass constants from one block to the next explicitly, via a + # link.args -> block.inputargs. Remove them now. 
+ for link in graph.iterlinks(): + i = 0 + for v in link.target.inputargs: + if isinstance(v, Constant): + del link.args[i] + else: + i += 1 + for block in graph.iterblocks(): + i = 0 + for v in block.inputargs[:]: + if isinstance(v, Constant): + del block.inputargs[i] + else: + i += 1 Modified: pypy/dist/pypy/jit/test/test_jit_tl.py ============================================================================== --- pypy/dist/pypy/jit/test/test_jit_tl.py (original) +++ pypy/dist/pypy/jit/test/test_jit_tl.py Sun Dec 18 11:38:09 2005 @@ -8,7 +8,7 @@ from pypy.rpython.llinterp import LLInterpreter #from pypy.translator.backendopt import inline -#py.test.skip("in-progress") +py.test.skip("in-progress") def setup_module(mod): t = TranslationContext() @@ -32,7 +32,7 @@ assert result1 == result2 - #interp.graphs[0].show() + interp.graphs[0].show() def run_jit(code): @@ -90,4 +90,29 @@ PUSH 5 ADD RETURN - ''') + ''') + +def test_factorial(): + run_jit(''' + PUSH 1 # accumulator + PUSH 7 # N + + start: + PICK 0 + PUSH 1 + LE + BR_COND exit + + SWAP + PICK 1 + MUL + SWAP + PUSH 1 + SUB + PUSH 1 + BR_COND start + + exit: + POP + RETURN + ''') Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/test/test_llabstractinterp.py (original) +++ pypy/dist/pypy/jit/test/test_llabstractinterp.py Sun Dec 18 11:38:09 2005 @@ -62,8 +62,7 @@ return graph2, insns P_INLINE = Policy(inlining=True) -P_CONST_INLINE = Policy(inlining=True, const_propagate=True) -P_HINT_DRIVEN = Policy(inlining=True, concrete_propagate=False) +P_HINT_DRIVEN = Policy(inlining=True, concrete_args=False) def test_simple(): @@ -316,13 +315,13 @@ graph2, insns = abstrinterp(ll1, [3, 4, 5], [1, 2], policy=P_INLINE) assert insns == {'int_add': 1} -def test_const_propagate(): - def ll_add(x, y): - return x + y - def ll1(x): - return ll_add(x, 42) - graph2, insns = abstrinterp(ll1, [3], [0], policy=P_CONST_INLINE) - 
assert insns == {} +##def test_const_propagate(): +## def ll_add(x, y): +## return x + y +## def ll1(x): +## return ll_add(x, 42) +## graph2, insns = abstrinterp(ll1, [3], [0], policy=P_CONST_INLINE) +## assert insns == {} def test_dont_unroll_loop(): def ll_factorial(n): @@ -332,12 +331,12 @@ i += 1 result *= i return result - graph2, insns = abstrinterp(ll_factorial, [7], [], policy=P_CONST_INLINE) + graph2, insns = abstrinterp(ll_factorial, [7], [], policy=P_INLINE) assert insns == {'int_lt': 1, 'int_add': 1, 'int_mul': 1} -def INPROGRESS_test_hint(): +def test_hint(): from pypy.rpython.objectmodel import hint - A = lltype.GcArray(lltype.Char) + A = lltype.GcArray(lltype.Char, hints={'immutable': True}) def ll_interp(code): accum = 0 pc = 0 @@ -356,6 +355,6 @@ bytecode[2] = 'A' bytecode[3] = 'B' bytecode[4] = 'A' - graph2, insns = abstrinterp(ll_interp, [bytecode], [], + graph2, insns = abstrinterp(ll_interp, [bytecode], [0], policy=P_HINT_DRIVEN) assert insns == {'int_add': 4, 'int_lt': 1} Modified: pypy/dist/pypy/jit/test/test_tl.py ============================================================================== --- pypy/dist/pypy/jit/test/test_tl.py (original) +++ pypy/dist/pypy/jit/test/test_tl.py Sun Dec 18 11:38:09 2005 @@ -163,3 +163,30 @@ assert code == list2bytecode([PUSH,1, CALL,5, PUSH,3, CALL,4, RETURN, PUSH,2, RETURN, PUSH,4, PUSH,5, ADD, RETURN]) + +def test_factorial(): + code = compile(''' + PUSH 1 # accumulator + PUSH 7 # N + + start: + PICK 0 + PUSH 1 + LE + BR_COND exit + + SWAP + PICK 1 + MUL + SWAP + PUSH 1 + SUB + PUSH 1 + BR_COND start + + exit: + POP + RETURN + ''') + res = interp(code) + assert res == 5040 Modified: pypy/dist/pypy/jit/tl.py ============================================================================== --- pypy/dist/pypy/jit/tl.py (original) +++ pypy/dist/pypy/jit/tl.py Sun Dec 18 11:38:09 2005 @@ -3,6 +3,7 @@ import py from tlopcode import * import tlopcode +from pypy.rpython.objectmodel import hint def char2int(c): t = 
ord(c) @@ -19,6 +20,7 @@ while pc < code_len: opcode = ord(code[pc]) + opcode = hint(opcode, concrete=True) pc += 1 if opcode == PUSH: Modified: pypy/dist/pypy/translator/c/funcgen.py ============================================================================== --- pypy/dist/pypy/translator/c/funcgen.py (original) +++ pypy/dist/pypy/translator/c/funcgen.py Sun Dec 18 11:38:09 2005 @@ -584,6 +584,10 @@ result.append(self.pyobj_incref(op.result)) return '\t'.join(result) + def OP_HINT(self, op, err): + hints = op.args[1].value + return '%s\t/* hint: %r */' % (self.OP_SAME_AS(op, err), hints) + def OP_KEEPALIVE(self, op, err): # xxx what should be the semantic consequences of this return "/* kept alive: %s */ ;" % self.expr(op.args[0], special_case_void=False) From cfbolz at codespeak.net Sun Dec 18 12:44:40 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Sun, 18 Dec 2005 12:44:40 +0100 (CET) Subject: [pypy-svn] r21265 - in pypy/dist/pypy/translator/backendopt: . test Message-ID: <20051218114440.E085427B66@code1.codespeak.net> Author: cfbolz Date: Sun Dec 18 12:44:38 2005 New Revision: 21265 Modified: pypy/dist/pypy/translator/backendopt/all.py pypy/dist/pypy/translator/backendopt/test/test_propagate.py Log: make this test pass again (and test something meaningful) after my recent changes to the inliner: the test was supposed to test whether calls can be folded as well. The new inliner just inlined the call, so the folding did not work. Now I don't inline at all in this test (which involved changing a for-loop to a while-loop, because in a for-loop the calls to the iterator etc. would need to be inlined).
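[Editor's note: the for-to-while rewrite described in the log above matters because, in RPython flow graphs, a `for i in range(...)` loop turns into calls to `range()` and the iterator's next method, and those helper calls would themselves have to be inlined before the surrounding call could be constant-folded; a plain `while` loop has no such hidden calls. A minimal sketch of the two loop shapes — the function names here are illustrative, the actual test uses a function named `s` as shown in the r21265 diff:]

```python
# For-loop version: the flow graph contains hidden calls to range()
# and to the iterator's next method, which block constant folding of
# a call like s_for(100) unless the inliner runs first.
def s_for(x):
    res = 0
    for i in range(1, x + 1):
        res += i
    return res

# While-loop version: the same computation with no hidden helper
# calls, so a call like s_while(100) can be folded without inlining.
def s_while(x):
    res = 0
    i = 1
    while i <= x:
        res += i
        i += 1
    return res
```

[Both compute the sum 1..x (e.g. 5050 for x=100); only the second shape is friendly to the no-inlining constant folder exercised by the r21265 test.]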
Modified: pypy/dist/pypy/translator/backendopt/all.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/all.py (original) +++ pypy/dist/pypy/translator/backendopt/all.py Sun Dec 18 12:44:38 2005 @@ -10,7 +10,7 @@ def backend_optimizations(translator, inline_threshold=1, mallocs=True, ssa_form=True, - merge_if_blocks_to_switch=False, + merge_if_blocks_to_switch=True, propagate=False): # remove obvious no-ops for graph in translator.graphs: Modified: pypy/dist/pypy/translator/backendopt/test/test_propagate.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_propagate.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_propagate.py Sun Dec 18 12:44:38 2005 @@ -4,11 +4,12 @@ from pypy.rpython.llinterp import LLInterpreter -def get_graph(fn, signature): +def get_graph(fn, signature, inline_threshold=True): t = TranslationContext() t.buildannotator().build_types(fn, signature) t.buildrtyper().specialize() - backend_optimizations(t, ssa_form=False, propagate=False) + backend_optimizations(t, inline_threshold=inline_threshold, + ssa_form=False, propagate=False) graph = graphof(t, fn) return graph, t @@ -59,12 +60,14 @@ def test_constant_fold_call(): def s(x): res = 0 - for i in range(1, x + 1): + i = 1 + while i <= x: res += i + i += 1 return res def g(x): return s(100) + s(1) + x - graph, t = get_graph(g, [int]) + graph, t = get_graph(g, [int], inline_threshold=0) while constant_folding(graph, t): pass assert len(graph.startblock.operations) == 1 From cfbolz at codespeak.net Sun Dec 18 13:04:21 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Sun, 18 Dec 2005 13:04:21 +0100 (CET) Subject: [pypy-svn] r21266 - pypy/dist/pypy/translator/backendopt Message-ID: <20051218120421.4C54E27B82@code1.codespeak.net> Author: cfbolz Date: Sun Dec 18 13:04:20 2005 New Revision: 21266 Modified: 
pypy/dist/pypy/translator/backendopt/all.py Log: reverting r21265: accidentally checked in my local changes to this file. sorry :-( Modified: pypy/dist/pypy/translator/backendopt/all.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/all.py (original) +++ pypy/dist/pypy/translator/backendopt/all.py Sun Dec 18 13:04:20 2005 @@ -10,7 +10,7 @@ def backend_optimizations(translator, inline_threshold=1, mallocs=True, ssa_form=True, - merge_if_blocks_to_switch=True, + merge_if_blocks_to_switch=False, propagate=False): # remove obvious no-ops for graph in translator.graphs: From arigo at codespeak.net Sun Dec 18 13:28:18 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 18 Dec 2005 13:28:18 +0100 (CET) Subject: [pypy-svn] r21267 - in pypy/dist/pypy/jit: . test Message-ID: <20051218122818.D72C327B82@code1.codespeak.net> Author: arigo Date: Sun Dec 18 13:28:16 2005 New Revision: 21267 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_jit_tl.py pypy/dist/pypy/jit/test/test_llabstractinterp.py Log: Reintroduce something like the const_propagate policy. It doesn't look too bad, and the tests so far pass, but there is a flaw in here... Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Sun Dec 18 13:28:16 2005 @@ -57,8 +57,18 @@ class LLRuntimeValue(LLAbstractValue): - origin = None - fixed = False + + origin = None # list of LLRuntimeValues attached to a saved state: + # the sources that did or could allow 'self' to be + # computed as a constant + + becomes = "var" # only meaningful on LLRuntimeValues attached to a + # saved state. 
Describes what this LLRuntimeValue will + # become in the next block: + # "var" - becomes a Variable in the next block + # "single" - only one Constant seen so far, so + # for now it can stay a Constant + # "fixed" - forced by hint() to stay a Constant def __init__(self, orig_v): if isinstance(orig_v, Variable): @@ -67,6 +77,8 @@ elif isinstance(orig_v, Constant): # we can share the Constant() self.copy_v = orig_v + self.origin = [] + self.becomes = "single" elif isinstance(orig_v, lltype.LowLevelType): # hackish interface :-( we accept a type too self.copy_v = newvar(orig_v) @@ -83,10 +95,17 @@ return self.copy_v def getruntimevars(self, memo): - if memo.get_fixed_flags: - return [self.fixed] - else: + if memo.key is None: return [self.copy_v] + else: + c = memo.key.next() + if self.becomes == "var": + return [None] + else: + assert isinstance(c, Constant), ( + "unexpected Variable %r reaching a %r input arg" % + (c, self.becomes)) + return [c] def maybe_get_constant(self): if isinstance(self.copy_v, Constant): @@ -96,16 +115,16 @@ def with_fresh_variables(self, memo): # don't use memo.seen here: shared variables must become distinct - if memo.key is not None: - c = memo.key.next() - if c is not None: - return LLRuntimeValue(c) - result = LLRuntimeValue(self.getconcretetype()) + c = memo.key and memo.key.next() + if c is not None: # allowed to propagate as a Constant? 
+ result = LLRuntimeValue(c) + else: + result = LLRuntimeValue(self.getconcretetype()) result.origin = [self] return result def match(self, other, memo): - memo.dependencies.append((self, other, self.fixed)) + memo.dependencies.append((self, other)) return isinstance(other, LLRuntimeValue) ll_no_return_value = LLRuntimeValue(const(None, lltype.Void)) @@ -417,13 +436,14 @@ # ____________________________________________________________ class Policy(object): - def __init__(self, inlining=False, + def __init__(self, inlining=False, const_propagate=False, concrete_propagate=True, concrete_args=True): self.inlining = inlining + self.const_propagate = const_propagate self.concrete_propagate = concrete_propagate self.concrete_args = concrete_args -best_policy = Policy(inlining=True, concrete_args=False) +best_policy = Policy(inlining=True, const_propagate=True) class LLAbstractInterp(object): @@ -484,15 +504,26 @@ if state.match(inputstate, memo): # already matched must_restart = False - for statevar, inputvar, fixed in memo.dependencies: - if fixed: + for statevar, inputvar in memo.dependencies: + if statevar.becomes == "single": + # the saved state only records one possible Constant + # incoming value so far. Are we seeing a different + # Constant, or even a Variable? + if inputvar.copy_v != statevar.copy_v: + statevar.becomes = "var" + must_restart = True + elif statevar.becomes == "fixed": + # the saved state says that this new incoming + # variable must be forced to a constant must_restart |= self.hint_needs_constant(inputvar) if must_restart: raise RestartCompleting - for statevar, inputvar, fixed in memo.dependencies: - if statevar.origin is None: - statevar.origin = [] - statevar.origin.append(inputvar) + # The new inputstate is merged into the existing saved state. + # Record this inputstate's variables in the possible origins + # of the saved state's variables. 
+ for statevar, inputvar in memo.dependencies: + if statevar.origin is not None: + statevar.origin.append(inputvar) return state else: # cache and return this new state @@ -500,23 +531,28 @@ return inputstate def hint_needs_constant(self, a): - if a.maybe_get_constant() is not None: - return False + # Force the given LLRuntimeValue to be a fixed constant. + must_restart = False fix_me = [a] while fix_me: a = fix_me.pop() - if not a.origin: + assert isinstance(a, LLRuntimeValue) + if a.becomes == "fixed": + continue # already fixed + print 'fixing:', a + if a.becomes == "var": + must_restart = True # this Var is now fixed + # (no need to restart if a.becomes was "single") + a.becomes = "fixed" + if a.origin: + fix_me.extend(a.origin) + elif a.maybe_get_constant() is None: + # a Variable with no recorded origin raise Exception("hint() failed: cannot trace the variable %r " "back to a link where it was a constant" % (a,)) - for a_origin in a.origin: - # 'a_origin' is a LLRuntimeValue attached to a saved state - assert isinstance(a_origin, LLRuntimeValue) - if not a_origin.fixed: - print 'fixing:', a_origin - a_origin.fixed = True - if a_origin.maybe_get_constant() is None: - fix_me.append(a_origin) - return True + assert self.policy.const_propagate, ( + "hint() can only be used with a policy of const_propagate=True") + return must_restart class GraphState(object): @@ -569,17 +605,10 @@ while pending: next = pending.pop() state = interp.pendingstates[next] - fixed_flags = state.getruntimevars(VarMemo(get_fixed_flags=True)) - key = [] - for fixed, c in zip(fixed_flags, next.args): - if fixed: - assert isinstance(c, Constant), ( - "unexpected Variable %r reaching a fixed input arg" % - (c,)) - key.append(c) - else: - key.append(None) - key = tuple(key) + if interp.policy.const_propagate: + key = tuple(state.getruntimevars(VarMemo(next.args))) + else: + key = None if key not in state.copyblocks: self.flowin(state, key) block = state.copyblocks[key] @@ -626,14 +655,15 @@ 
newexitswitch = None # debugging print arglist = [] - for v1, v2, k in zip(state.getruntimevars(VarMemo()), - builder.runningstate.getruntimevars(VarMemo()), - key): - if k is None: - assert isinstance(v2, Variable) - else: - assert v2 == k - arglist.append('%s => %s' % (v1, v2)) + if key: + for v1, v2, k in zip(state.getruntimevars(VarMemo()), + builder.runningstate.getruntimevars(VarMemo()), + key): + if k is None: + assert isinstance(v2, Variable) + else: + assert v2 == k + arglist.append('%s => %s' % (v1, v2)) print print '--> %s [%s]' % (origblock, ', '.join(arglist)) for op in origblock.operations: @@ -718,7 +748,7 @@ def __init__(self, interp, initialstate, key): self.interp = interp - memo = VarMemo(iter(key)) + memo = VarMemo(key) self.runningstate = initialstate.with_fresh_variables(memo) self.newinputargs = self.runningstate.getruntimevars(VarMemo()) # {Variables-of-origblock: a_value} @@ -767,7 +797,9 @@ if any_concrete and self.interp.policy.concrete_propagate: return LLConcreteValue(concreteresult) else: - return LLRuntimeValue(const(concreteresult)) + a_result = LLRuntimeValue(const(concreteresult)) + self.record_origin(a_result, args_a) + return a_result def residual(self, opname, args_a, a_result): v_result = a_result.forcevarorconst(self) @@ -796,7 +828,7 @@ def record_origin(self, a_result, args_a): origin = [] for a in args_a: - if a.maybe_get_constant() is not None: + if isinstance(a, LLConcreteValue): continue if not isinstance(a, LLRuntimeValue) or a.origin is None: return @@ -1044,10 +1076,12 @@ self.other_alias = {} class VarMemo(object): - def __init__(self, key=None, get_fixed_flags=False): + def __init__(self, key=None): self.seen = {} - self.key = key - self.get_fixed_flags = get_fixed_flags + if key is not None: + self.key = iter(key) + else: + self.key = None def live_variables(block, position): Modified: pypy/dist/pypy/jit/test/test_jit_tl.py ============================================================================== --- 
pypy/dist/pypy/jit/test/test_jit_tl.py (original) +++ pypy/dist/pypy/jit/test/test_jit_tl.py Sun Dec 18 13:28:16 2005 @@ -32,7 +32,7 @@ assert result1 == result2 - interp.graphs[0].show() + #interp.graphs[0].show() def run_jit(code): Modified: pypy/dist/pypy/jit/test/test_llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/test/test_llabstractinterp.py (original) +++ pypy/dist/pypy/jit/test/test_llabstractinterp.py Sun Dec 18 13:28:16 2005 @@ -62,7 +62,8 @@ return graph2, insns P_INLINE = Policy(inlining=True) -P_HINT_DRIVEN = Policy(inlining=True, concrete_args=False) +P_CONST_INLINE = Policy(inlining=True, const_propagate=True) +P_HINT_DRIVEN = Policy(inlining=True, const_propagate=True, concrete_args=False) def test_simple(): @@ -315,13 +316,13 @@ graph2, insns = abstrinterp(ll1, [3, 4, 5], [1, 2], policy=P_INLINE) assert insns == {'int_add': 1} -##def test_const_propagate(): -## def ll_add(x, y): -## return x + y -## def ll1(x): -## return ll_add(x, 42) -## graph2, insns = abstrinterp(ll1, [3], [0], policy=P_CONST_INLINE) -## assert insns == {} +def test_const_propagate(): + def ll_add(x, y): + return x + y + def ll1(x): + return ll_add(x, 42) + graph2, insns = abstrinterp(ll1, [3], [0], policy=P_CONST_INLINE) + assert insns == {} def test_dont_unroll_loop(): def ll_factorial(n): @@ -331,7 +332,7 @@ i += 1 result *= i return result - graph2, insns = abstrinterp(ll_factorial, [7], [], policy=P_INLINE) + graph2, insns = abstrinterp(ll_factorial, [7], [], policy=P_CONST_INLINE) assert insns == {'int_lt': 1, 'int_add': 1, 'int_mul': 1} def test_hint(): @@ -358,3 +359,32 @@ graph2, insns = abstrinterp(ll_interp, [bytecode], [0], policy=P_HINT_DRIVEN) assert insns == {'int_add': 4, 'int_lt': 1} + +def test_hint_across_call(): + from pypy.rpython.objectmodel import hint + A = lltype.GcArray(lltype.Char, hints={'immutable': True}) + def ll_length(a): + return len(a) + def ll_getitem(a, i): + return 
a[i] + def ll_interp(code): + accum = 0 + pc = 0 + while pc < ll_length(code): + opcode = hint(ll_getitem(code, pc), concrete=True) + pc += 1 + if opcode == 'A': + accum += 6 + elif opcode == 'B': + if accum < 20: + pc = 0 + return accum + bytecode = lltype.malloc(A, 5) + bytecode[0] = 'A' + bytecode[1] = 'A' + bytecode[2] = 'A' + bytecode[3] = 'B' + bytecode[4] = 'A' + graph2, insns = abstrinterp(ll_interp, [bytecode], [0], + policy=P_HINT_DRIVEN) + assert insns == {'int_add': 4, 'int_lt': 1} From arigo at codespeak.net Sun Dec 18 14:29:52 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 18 Dec 2005 14:29:52 +0100 (CET) Subject: [pypy-svn] r21268 - pypy/dist/pypy/jit Message-ID: <20051218132952.4264527B82@code1.codespeak.net> Author: arigo Date: Sun Dec 18 14:29:50 2005 New Revision: 21268 Modified: pypy/dist/pypy/jit/llabstractinterp.py Log: I think that this is doing what I described in the pypy-dev mail, but I'm not sure. The factorial test of test_jit_tl.py still becomes 'return 5040' instead of staying a loop... Before computing each block, we compute a 'key' which is derived from the current state's fixed constants. Instead of only one residual block per state, there is one residual block per 'key'. The residual block in question has inputargs that are constants -- at least for each fixed constant, but possibly for more, if policy.const_propagate is True. When we consider a link that should go to a given block, we compute the 'key' and check if there is already a corresponding residual block; if so, we check the constants that have been put in the inputargs. If they don't match the new link's constants, we throw away the existing residual block and compute a new one with less constants in its inputargs. These recomputations are based on the official 'key', so that links with different *fixed* constants don't interfere with each other. 
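The per-'key' caching scheme described in this log message can be sketched in isolation. The toy model below uses hypothetical names (plain tuples and a `VAR` marker); the real interpreter works on LLRuntimeValues and flow-graph blocks, so this only mirrors the core merging rule: one cached residual block per key, with conflicting tentative constants generalized away to variables.

```python
# Toy model of the per-key residual-block cache described above.
# VAR stands for "this inputarg is a Variable in the residual block";
# anything else is a tentatively propagated Constant.
VAR = None

copyblocks = {}   # key -> inputargs chosen for the residual block

def get_inputargs(key, link_args):
    """Return the inputargs of the residual block for 'key', widening
    tentative constants that conflict with the new link's constants."""
    if key not in copyblocks:
        # first time: optimistically propagate all of the link's constants
        copyblocks[key] = list(link_args)
    else:
        merged = []
        for old, new in zip(copyblocks[key], link_args):
            if old is not VAR and old != new:
                merged.append(VAR)    # conflict: force a Variable
            else:
                merged.append(old)
        copyblocks[key] = merged      # the real code recomputes the block here
    return copyblocks[key]

# Two links with the same key but different constants in the second
# position: the block keeps 1 as a constant, generalizes 2/3 away.
assert get_inputargs("k", [1, 2]) == [1, 2]
assert get_inputargs("k", [1, 3]) == [1, VAR]
```

The point of keying on the *fixed* constants only is visible here: links that disagree merely on optimistically propagated constants share (and progressively widen) one residual block, instead of multiplying blocks.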
Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Sun Dec 18 14:29:50 2005 @@ -62,13 +62,10 @@ # the sources that did or could allow 'self' to be # computed as a constant - becomes = "var" # only meaningful on LLRuntimeValues attached to a + fixed = False # only meaningful on LLRuntimeValues attached to a # saved state. Describes what this LLRuntimeValue will - # become in the next block: - # "var" - becomes a Variable in the next block - # "single" - only one Constant seen so far, so - # for now it can stay a Constant - # "fixed" - forced by hint() to stay a Constant + # become in the next block. Set to True by hint() + # to force a Constant to stay a Constant. def __init__(self, orig_v): if isinstance(orig_v, Variable): @@ -78,7 +75,6 @@ # we can share the Constant() self.copy_v = orig_v self.origin = [] - self.becomes = "single" elif isinstance(orig_v, lltype.LowLevelType): # hackish interface :-( we accept a type too self.copy_v = newvar(orig_v) @@ -99,13 +95,12 @@ return [self.copy_v] else: c = memo.key.next() - if self.becomes == "var": - return [None] - else: + if self.fixed: assert isinstance(c, Constant), ( - "unexpected Variable %r reaching a %r input arg" % - (c, self.becomes)) + "unexpected Variable %r reaching a fixed input arg" % (c,)) return [c] + else: + return [None] def maybe_get_constant(self): if isinstance(self.copy_v, Constant): @@ -116,7 +111,7 @@ def with_fresh_variables(self, memo): # don't use memo.seen here: shared variables must become distinct c = memo.key and memo.key.next() - if c is not None: # allowed to propagate as a Constant? + if isinstance(c, Constant): # allowed to propagate as a Constant? 
result = LLRuntimeValue(c) else: result = LLRuntimeValue(self.getconcretetype()) @@ -505,14 +500,7 @@ # already matched must_restart = False for statevar, inputvar in memo.dependencies: - if statevar.becomes == "single": - # the saved state only records one possible Constant - # incoming value so far. Are we seeing a different - # Constant, or even a Variable? - if inputvar.copy_v != statevar.copy_v: - statevar.becomes = "var" - must_restart = True - elif statevar.becomes == "fixed": + if statevar.fixed: # the saved state says that this new incoming # variable must be forced to a constant must_restart |= self.hint_needs_constant(inputvar) @@ -537,21 +525,21 @@ while fix_me: a = fix_me.pop() assert isinstance(a, LLRuntimeValue) - if a.becomes == "fixed": + if a.fixed: continue # already fixed print 'fixing:', a - if a.becomes == "var": - must_restart = True # this Var is now fixed - # (no need to restart if a.becomes was "single") - a.becomes = "fixed" + a.fixed = True + # If 'a' is already a Constant, we just fixed it and we can + # continue. If it is a Variable, restart the whole process. + is_variable = a.maybe_get_constant() is None + if is_variable: + must_restart = True if a.origin: fix_me.extend(a.origin) - elif a.maybe_get_constant() is None: + elif is_variable: # a Variable with no recorded origin raise Exception("hint() failed: cannot trace the variable %r " "back to a link where it was a constant" % (a,)) - assert self.policy.const_propagate, ( - "hint() can only be used with a policy of const_propagate=True") return must_restart @@ -605,13 +593,60 @@ while pending: next = pending.pop() state = interp.pendingstates[next] - if interp.policy.const_propagate: - key = tuple(state.getruntimevars(VarMemo(next.args))) + + # Before computing each block, we compute a 'key' which is + # derived from the current state's fixed constants. Instead + # of only one residual block per state, there is one residual + # block per 'key'. 
The residual block in question has + # inputargs that are constants -- at least for each fixed + # constant, but possibly for more, if policy.const_propagate + # is True. + # + # When we consider a link that should go to a given block, we + # compute the 'key' and check if there is already a + # corresponding residual block; if so, we check the constants + # that have been put in the inputargs. If they don't match + # the new link's constants, we throw away the existing + # residual block and compute a new one with less constants in + # its inputargs. + # + # These recomputations are based on the official 'key', so + # that links with different *fixed* constants don't interfere + # with each other. + + key = tuple(state.getruntimevars(VarMemo(next.args))) + print + print 'key=', key + try: + block = state.copyblocks[key] + except KeyError: + if interp.policy.const_propagate: + # originally, propagate all constants from next.args + # optimistically to the new block + initial_key = next.args + else: + # don't propagate anything more than required ('fixed') + initial_key = key + print 'flowin() with initial key', initial_key + block = self.flowin(state, initial_key) + state.copyblocks[key] = block else: - key = None - if key not in state.copyblocks: - self.flowin(state, key) - block = state.copyblocks[key] + # check if the tentative constants of the existing block + # are compatible with the ones specified by the new link + merged_key = [] + recompute = False + for c1, c2 in zip(block.inputargs, next.args): + if isinstance(c1, Constant) and c1 != c2: + # incompatibility + merged_key.append(None) # force a Variable + recompute = True + else: + merged_key.append(c1) # unmodified + if recompute: + print 'flowin() merged as', merged_key + block = self.flowin(state, merged_key) + state.copyblocks[key] = block + raise RestartCompleting next.settarget(block) for link in block.exits: if link.target is None or link.target.operations != (): @@ -659,10 +694,10 @@ for v1, v2, k 
in zip(state.getruntimevars(VarMemo()), builder.runningstate.getruntimevars(VarMemo()), key): - if k is None: - assert isinstance(v2, Variable) - else: + if isinstance(k, Constant): assert v2 == k + else: + assert isinstance(v2, Variable) arglist.append('%s => %s' % (v1, v2)) print print '--> %s [%s]' % (origblock, ', '.join(arglist)) @@ -741,7 +776,7 @@ newlinks.append(newlink) newblock = builder.buildblock(newexitswitch, newlinks) - state.copyblocks[key] = newblock + return newblock class BlockBuilder(object): From arigo at codespeak.net Sun Dec 18 14:39:21 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 18 Dec 2005 14:39:21 +0100 (CET) Subject: [pypy-svn] r21269 - pypy/dist/pypy/jit Message-ID: <20051218133921.7F6BE27B82@code1.codespeak.net> Author: arigo Date: Sun Dec 18 14:39:19 2005 New Revision: 21269 Modified: pypy/dist/pypy/jit/llabstractinterp.py Log: Forgot to adjust the policy used by test_jit_tl.py. Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Sun Dec 18 14:39:19 2005 @@ -438,7 +438,8 @@ self.concrete_propagate = concrete_propagate self.concrete_args = concrete_args -best_policy = Policy(inlining=True, const_propagate=True) +# hint-driven policy +best_policy = Policy(inlining=True, const_propagate=True, concrete_args=False) class LLAbstractInterp(object): From arigo at codespeak.net Sun Dec 18 16:13:34 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 18 Dec 2005 16:13:34 +0100 (CET) Subject: [pypy-svn] r21270 - in pypy/dist/pypy/rpython: . lltypesystem Message-ID: <20051218151334.AD6A527B47@code1.codespeak.net> Author: arigo Date: Sun Dec 18 16:13:30 2005 New Revision: 21270 Modified: pypy/dist/pypy/rpython/llinterp.py pypy/dist/pypy/rpython/lltypesystem/lltype.py Log: Don't print raw characters; use repr() instead. 
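The point of this change can be seen with a quick standalone example (plain Python, independent of the llinterp logging machinery): printing character data raw emits unprintable control characters, while `repr()` yields an unambiguous, printable form.

```python
raw = chr(7) + 'A' + chr(0)           # BEL, 'A', NUL
# Printed raw, this rings the terminal bell and the NUL is invisible.
# With repr(), the debug line is unambiguous and printable:
line = "returning %s" % (repr(raw),)
assert line == "returning '\\x07A\\x00'"
```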
Modified: pypy/dist/pypy/rpython/llinterp.py ============================================================================== --- pypy/dist/pypy/rpython/llinterp.py (original) +++ pypy/dist/pypy/rpython/llinterp.py Sun Dec 18 16:13:30 2005 @@ -186,7 +186,7 @@ raise LLException(etype, evalue) resultvar, = block.getvariables() result = self.getval(resultvar) - log.operation("returning", result) + log.operation("returning", repr(result)) return None, result elif block.exitswitch is None: # single-exit block Modified: pypy/dist/pypy/rpython/lltypesystem/lltype.py ============================================================================== --- pypy/dist/pypy/rpython/lltypesystem/lltype.py (original) +++ pypy/dist/pypy/rpython/lltypesystem/lltype.py Sun Dec 18 16:13:30 2005 @@ -862,10 +862,10 @@ else: return "%s {%s}" % (of._name, item._str_fields()) else: - return item + return repr(item) def __str__(self): - return 'array [ %s ]' % (', '.join(['%s' % self._str_item(item) + return 'array [ %s ]' % (', '.join([self._str_item(item) for item in self.items]),) assert not '__dict__' in dir(_array) From arigo at codespeak.net Sun Dec 18 16:14:25 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 18 Dec 2005 16:14:25 +0100 (CET) Subject: [pypy-svn] r21271 - in pypy/dist/pypy/jit: . test Message-ID: <20051218151425.19B0127B47@code1.codespeak.net> Author: arigo Date: Sun Dec 18 16:14:23 2005 New Revision: 21271 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_jit_tl.py Log: Missing check in match(): arrays of different sizes cannot match at the moment. The JIT-TL test now passes! It takes about 55 seconds to compute the factorial function, but it works... 
Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Sun Dec 18 16:14:23 2005 @@ -265,6 +265,8 @@ memo.other_alias[other] = self assert self.T == other.T + if self.names != other.names: + return False if self.a_length is not None: if not self.a_length.match(other.a_length, memo): return False Modified: pypy/dist/pypy/jit/test/test_jit_tl.py ============================================================================== --- pypy/dist/pypy/jit/test/test_jit_tl.py (original) +++ pypy/dist/pypy/jit/test/test_jit_tl.py Sun Dec 18 16:14:23 2005 @@ -8,7 +8,7 @@ from pypy.rpython.llinterp import LLInterpreter #from pypy.translator.backendopt import inline -py.test.skip("in-progress") +#py.test.skip("in-progress") def setup_module(mod): t = TranslationContext() From arigo at codespeak.net Sun Dec 18 16:38:38 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 18 Dec 2005 16:38:38 +0100 (CET) Subject: [pypy-svn] r21273 - in pypy/dist/pypy/jit: . test Message-ID: <20051218153838.DC53827B47@code1.codespeak.net> Author: arigo Date: Sun Dec 18 16:38:36 2005 New Revision: 21273 Modified: pypy/dist/pypy/jit/test/test_jit_tl.py pypy/dist/pypy/jit/test/test_tl.py pypy/dist/pypy/jit/tl.py pypy/dist/pypy/jit/tlopcode.py Log: Added a BR_COND_STK operation (credits: pedronis) that takes the jump offset from the value stack. Verified by looking at the graph that it is still melted away by the existing hint(code[pc]). (We need a way to check the resulting graphs automatically...) 
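The log above describes BR_COND_STK informally: it takes no operand and pops both the jump offset and the condition from the value stack. As a rough illustration, here is a standalone mini-interpreter sketch (hypothetical tuple-based opcodes, not the actual compact bytecode of pypy/jit/tl.py):

```python
# Standalone sketch of the BR_COND_STK semantics described above.
# The offset is pushed last, so it is popped first; the condition
# sits below it on the stack.

def run(ops):
    stack = []
    pc = 0
    while pc < len(ops):
        op = ops[pc]
        pc += 1
        if op[0] == "PUSH":
            stack.append(op[1])
        elif op[0] == "BR_COND_STK":
            offset = stack.pop()      # jump offset comes from the stack...
            if stack.pop():           # ...and so does the condition
                pc += offset
        elif op[0] == "RETURN":
            return stack.pop()

# Condition 1, offset 2: the two PUSH 0 instructions are skipped,
# so the value 99 pushed at the start is returned.
result = run([("PUSH", 99), ("PUSH", 1), ("PUSH", 2),
              ("BR_COND_STK",), ("PUSH", 0), ("PUSH", 0),
              ("RETURN",)])
assert result == 99
```

This mirrors why the factorial test below pushes the LE result first and the `exit` label second before executing BR_COND_STK.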
Modified: pypy/dist/pypy/jit/test/test_jit_tl.py ============================================================================== --- pypy/dist/pypy/jit/test/test_jit_tl.py (original) +++ pypy/dist/pypy/jit/test/test_jit_tl.py Sun Dec 18 16:38:36 2005 @@ -116,3 +116,30 @@ POP RETURN ''') + +def test_factorial_harder(): + run_jit(''' + PUSH 1 # accumulator + PUSH 7 # N + + start: + PICK 0 + PUSH 1 + LE + PUSH exit + BR_COND_STK + + SWAP + PICK 1 + MUL + SWAP + PUSH 1 + SUB + PUSH 1 + BR_COND start + + exit: + NOP # BR_COND_STK skips this instruction + POP + RETURN + ''') Modified: pypy/dist/pypy/jit/test/test_tl.py ============================================================================== --- pypy/dist/pypy/jit/test/test_tl.py (original) +++ pypy/dist/pypy/jit/test/test_tl.py Sun Dec 18 16:38:36 2005 @@ -190,3 +190,32 @@ ''') res = interp(code) assert res == 5040 + +def test_factorial_harder(): + code = compile(''' + PUSH 1 # accumulator + PUSH 7 # N + + start: + PICK 0 + PUSH 1 + LE + PUSH exit + BR_COND_STK + + SWAP + PICK 1 + MUL + SWAP + PUSH 1 + SUB + PUSH 1 + BR_COND start + + exit: + NOP # BR_COND_STK skips this instruction + POP + RETURN + ''') + res = interp(code) + assert res == 5040 Modified: pypy/dist/pypy/jit/tl.py ============================================================================== --- pypy/dist/pypy/jit/tl.py (original) +++ pypy/dist/pypy/jit/tl.py Sun Dec 18 16:38:36 2005 @@ -23,7 +23,10 @@ opcode = hint(opcode, concrete=True) pc += 1 - if opcode == PUSH: + if opcode == NOP: + pass + + elif opcode == PUSH: stack.append( char2int(code[pc]) ) pc += 1 @@ -97,6 +100,11 @@ pc += char2int(code[pc]) pc += 1 + elif opcode == BR_COND_STK: + offset = stack.pop() + if stack.pop(): + pc += offset + elif opcode == CALL: offset = char2int(code[pc]) pc += 1 Modified: pypy/dist/pypy/jit/tlopcode.py ============================================================================== --- pypy/dist/pypy/jit/tlopcode.py (original) +++ 
pypy/dist/pypy/jit/tlopcode.py Sun Dec 18 16:38:36 2005 @@ -6,6 +6,7 @@ g_opcode += 1 names[opcode_name] = globals()[opcode_name] = g_opcode +opcode("NOP") opcode("PUSH") #1 operand opcode("POP") opcode("SWAP") @@ -27,6 +28,7 @@ opcode("GE") opcode("BR_COND") #1 operand offset +opcode("BR_COND_STK") # no operand, takes [condition, offset] from the stack opcode("CALL") #1 operand offset opcode("RETURN") From cfbolz at codespeak.net Sun Dec 18 18:08:46 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Sun, 18 Dec 2005 18:08:46 +0100 (CET) Subject: [pypy-svn] r21277 - pypy/dist/pypy/doc Message-ID: <20051218170846.8AB2C27DB4@code1.codespeak.net> Author: cfbolz Date: Sun Dec 18 18:08:45 2005 New Revision: 21277 Modified: pypy/dist/pypy/doc/news.txt Log: finally update the news item. Modified: pypy/dist/pypy/doc/news.txt ============================================================================== --- pypy/dist/pypy/doc/news.txt (original) +++ pypy/dist/pypy/doc/news.txt Sun Dec 18 18:08:45 2005 @@ -12,15 +12,14 @@ PyPy Sprint in Göteborg 7th - 11th December 2005 ================================================= -The next PyPy sprint is scheduled to be in December 2005 in Göteborg. -Its main focus is heading towards phase 2, which means JIT work, -alternate threading models and logic programming (but there are also -other possible topics). We'll give newcomer-friendly introductions. -Read more in `the announcement`_, and see who's currently planning to -attend on the `people page`_. - -.. _`the announcement`: http://codespeak.net/pypy/extradoc/sprintinfo/gothenburg-2005/announcement.html -.. _`people page`: http://codespeak.net/svn/pypy/extradoc/sprintinfo/gothenburg-2005/people.txt +The Gothenburg sprint is over. 
It was a very productive sprint: work has +been started on a JIT prototype, we added support for __del__ in PyPy, +the socket module had some progress, PyPy got faster and work was started to +expose the internals of our parser and bytecode compiler to the user. +Michael and Carl have written a `report about the first half`_ and `one about +the second half`_ of the sprint. *(12/18/2005)* +.. _`report about the first half`: http://codespeak.net/pipermail/pypy-dev/2005q4/002656.html +.. _`one about the second half`: http://codespeak.net/pipermail/pypy-dev/2005q4/002660.html PyPy release 0.8.0 =================== From arigo at codespeak.net Sun Dec 18 19:42:51 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 18 Dec 2005 19:42:51 +0100 (CET) Subject: [pypy-svn] r21282 - in pypy/dist/pypy/jit: . test Message-ID: <20051218184251.0137A27B82@code1.codespeak.net> Author: arigo Date: Sun Dec 18 19:42:49 2005 New Revision: 21282 Modified: pypy/dist/pypy/jit/llabstractinterp.py pypy/dist/pypy/jit/test/test_jit_tl.py Log: More useful debugging prints. Make the factorial tests check exactly what operations are left in the residual graphs. Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Sun Dec 18 19:42:49 2005 @@ -597,6 +597,24 @@ next = pending.pop() state = interp.pendingstates[next] + # debugging: print the current call stack + print + st = state + stlist = [] + while st.a_back is not None: + st = st.a_back + stlist.append(st) + stlist.reverse() + for st in stlist: + op = st.origblock.operations[st.origposition] + if op.opname == 'direct_call': + v = op.args[0] + if isinstance(v, Constant): + v = v.value + else: + v = '?' + print 'In %r:' % (v,) + # Before computing each block, we compute a 'key' which is # derived from the current state's fixed constants. 
Instead # of only one residual block per state, there is one residual @@ -618,8 +636,6 @@ # with each other. key = tuple(state.getruntimevars(VarMemo(next.args))) - print - print 'key=', key try: block = state.copyblocks[key] except KeyError: Modified: pypy/dist/pypy/jit/test/test_jit_tl.py ============================================================================== --- pypy/dist/pypy/jit/test/test_jit_tl.py (original) +++ pypy/dist/pypy/jit/test/test_jit_tl.py Sun Dec 18 19:42:49 2005 @@ -34,10 +34,18 @@ #interp.graphs[0].show() + # return a summary of the instructions left in graph2 + insns = {} + for copygraph in interp.itercopygraphs(): + for block in copygraph.iterblocks(): + for op in block.operations: + insns[op.opname] = insns.get(op.opname, 0) + 1 + return insns + def run_jit(code): code = tl.compile(code) - jit_tl(code) + return jit_tl(code) def test_simple1(): @@ -93,7 +101,7 @@ ''') def test_factorial(): - run_jit(''' + insns = run_jit(''' PUSH 1 # accumulator PUSH 7 # N @@ -116,9 +124,17 @@ POP RETURN ''') + # currently, the condition is turned from the bool to an int and back + # so ignore that + if 'cast_bool_to_int' in insns: + assert insns['cast_bool_to_int'] == 1 + assert insns['int_is_true'] == 1 + del insns['cast_bool_to_int'] + del insns['int_is_true'] + assert insns == {'int_le': 1, 'int_mul': 1, 'int_sub': 1} def test_factorial_harder(): - run_jit(''' + insns = run_jit(''' PUSH 1 # accumulator PUSH 7 # N @@ -143,3 +159,11 @@ POP RETURN ''') + # currently, the condition is turned from the bool to an int and back + # so ignore that + if 'cast_bool_to_int' in insns: + assert insns['cast_bool_to_int'] == 1 + assert insns['int_is_true'] == 1 + del insns['cast_bool_to_int'] + del insns['int_is_true'] + assert insns == {'int_le': 1, 'int_mul': 1, 'int_sub': 1} From arigo at codespeak.net Sun Dec 18 19:55:30 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 18 Dec 2005 19:55:30 +0100 (CET) Subject: [pypy-svn] r21283 - 
pypy/dist/pypy/jit Message-ID: <20051218185530.EF7C927B82@code1.codespeak.net> Author: arigo Date: Sun Dec 18 19:55:29 2005 New Revision: 21283 Modified: pypy/dist/pypy/jit/llabstractinterp.py Log: This makes the factorial examples much faster. I'm not sure I understand why; in theory it could compute (and throw away) more blocks. As a guess, computing the 'key' is expensive. Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Sun Dec 18 19:55:29 2005 @@ -591,6 +591,7 @@ interp = self.interp pending = [self] seen = {} + did_any_generalization = False # follow all possible links, forcing the blocks along the way to be # computed while pending: @@ -665,7 +666,7 @@ print 'flowin() merged as', merged_key block = self.flowin(state, merged_key) state.copyblocks[key] = block - raise RestartCompleting + did_any_generalization = True next.settarget(block) for link in block.exits: if link.target is None or link.target.operations != (): @@ -684,6 +685,9 @@ else: raise Exception("uh?") + if did_any_generalization: + raise RestartCompleting + remove_constant_inputargs(graph) # the graph should be complete now; sanity-check From arigo at codespeak.net Sun Dec 18 22:32:17 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 18 Dec 2005 22:32:17 +0100 (CET) Subject: [pypy-svn] r21285 - pypy/dist/pypy/jit Message-ID: <20051218213217.8F43727DB5@code1.codespeak.net> Author: arigo Date: Sun Dec 18 22:32:15 2005 New Revision: 21285 Modified: pypy/dist/pypy/jit/llabstractinterp.py Log: Some fixes to hint_needs_constant() and its usages. 
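The fixed-point loop that hint_needs_constant() implements can be modelled compactly. This sketch uses a hypothetical `Value` class in place of LLRuntimeValue; only the worklist logic, which walks a value's recorded origins and marks everything along the way as fixed, mirrors the real code.

```python
class Value:
    """Hypothetical stand-in for LLRuntimeValue."""
    def __init__(self, constant=None, origin=None):
        self.constant = constant      # the constant, or None for a Variable
        self.origin = origin or []    # values this one was copied from
        self.fixed = False

def hint_needs_constant(a):
    """Force 'a', and everything it originates from, to stay a constant."""
    fix_me = [a]
    while fix_me:
        v = fix_me.pop()
        if v.fixed:
            continue                  # already fixed
        v.fixed = True
        if v.origin:
            fix_me.extend(v.origin)   # trace back across links
        elif v.constant is None:
            # a Variable with no recorded origin: hint() cannot succeed
            raise Exception("hint() failed: cannot trace %r back to a "
                            "link where it was a constant" % (v,))

c = Value(constant=42)                # a Constant on some incoming link
v = Value(origin=[c])                 # a Variable merged from that link
hint_needs_constant(v)
assert v.fixed and c.fixed            # both are now pinned as constants
```

In the real interpreter, fixing a value that is currently a Variable additionally forces a RestartCompleting, so that the next pass sees it as a constant from the start.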
Modified: pypy/dist/pypy/jit/llabstractinterp.py ============================================================================== --- pypy/dist/pypy/jit/llabstractinterp.py (original) +++ pypy/dist/pypy/jit/llabstractinterp.py Sun Dec 18 22:32:15 2005 @@ -506,7 +506,11 @@ if statevar.fixed: # the saved state says that this new incoming # variable must be forced to a constant - must_restart |= self.hint_needs_constant(inputvar) + self.hint_needs_constant(inputvar) + # we'll have to restart if we are trying to turn + # a variable into a constant + if inputvar.maybe_get_constant() is None: + must_restart = True if must_restart: raise RestartCompleting # The new inputstate is merged into the existing saved state. @@ -523,7 +527,6 @@ def hint_needs_constant(self, a): # Force the given LLRuntimeValue to be a fixed constant. - must_restart = False fix_me = [a] while fix_me: a = fix_me.pop() @@ -534,16 +537,12 @@ a.fixed = True # If 'a' is already a Constant, we just fixed it and we can # continue. If it is a Variable, restart the whole process. - is_variable = a.maybe_get_constant() is None - if is_variable: - must_restart = True if a.origin: fix_me.extend(a.origin) - elif is_variable: + elif a.maybe_get_constant() is None: # a Variable with no recorded origin raise Exception("hint() failed: cannot trace the variable %r " "back to a link where it was a constant" % (a,)) - return must_restart class GraphState(object): @@ -953,15 +952,18 @@ hints = c_hints.value if hints.get('concrete'): # turn this 'a' into a concrete value - c = a.forcevarorconst(self) - if isinstance(c, Constant): + a.forcevarorconst(self) + if not isinstance(a, LLConcreteValue): + self.interp.hint_needs_constant(a) + c = a.maybe_get_constant() + if c is None: + # Oups! it's not a constant. But hint_needs_constant() + # traced it back to a constant that was turned into a + # variable by a link. 
This constant has been marked as + # 'fixed', so if we restart now, op_hint() should receive + # a constant the next time. + raise RestartCompleting a = LLConcreteValue(c.value) - else: - # Oups! it's not a constant. Try to trace it back to a - # constant that was turned into a variable by a link. - restart = self.interp.hint_needs_constant(a) - assert restart - raise RestartCompleting return a def op_direct_call(self, op, *args_a): From mwh at codespeak.net Mon Dec 19 09:24:05 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Mon, 19 Dec 2005 09:24:05 +0100 (CET) Subject: [pypy-svn] r21294 - pypy/dist/pypy/doc Message-ID: <20051219082405.6849927DB5@code1.codespeak.net> Author: mwh Date: Mon Dec 19 09:24:04 2005 New Revision: 21294 Modified: pypy/dist/pypy/doc/news.txt Log: add a blank line which makes the news item look less weird. Modified: pypy/dist/pypy/doc/news.txt ============================================================================== --- pypy/dist/pypy/doc/news.txt (original) +++ pypy/dist/pypy/doc/news.txt Mon Dec 19 09:24:04 2005 @@ -18,6 +18,7 @@ expose the internals of our parser and bytecode compiler to the user. Michael and Carl have written a `report about the first half`_ and `one about the second half`_ of the sprint. *(12/18/2005)* + .. _`report about the first half`: http://codespeak.net/pipermail/pypy-dev/2005q4/002656.html .. 
_`one about the second half`: http://codespeak.net/pipermail/pypy-dev/2005q4/002660.html From mwh at codespeak.net Mon Dec 19 09:28:53 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Mon, 19 Dec 2005 09:28:53 +0100 (CET) Subject: [pypy-svn] r21295 - pypy/dist/pypy/doc/weekly Message-ID: <20051219082853.3570727DBA@code1.codespeak.net> Author: mwh Date: Mon Dec 19 09:28:52 2005 New Revision: 21295 Added: pypy/dist/pypy/doc/weekly/summary-2005-12-16.txt Modified: pypy/dist/pypy/doc/weekly/index.txt Log: this week in pypy skeleton, a bit late Modified: pypy/dist/pypy/doc/weekly/index.txt ============================================================================== --- pypy/dist/pypy/doc/weekly/index.txt (original) +++ pypy/dist/pypy/doc/weekly/index.txt Mon Dec 19 09:28:52 2005 @@ -73,10 +73,16 @@ - IRC Summary - EU-related Talks + * `Week ending 2005-12-16`_ + + - The Sprint! + - IRC Summary + .. _`Week ending 2005-11-04`: summary-2005-11-04.html .. _`Week ending 2005-11-11`: summary-2005-11-11.html .. _`Week ending 2005-11-18`: summary-2005-11-18.html .. _`Week ending 2005-11-25`: summary-2005-11-25.html .. _`Week ending 2005-12-02`: summary-2005-12-02.html .. _`Week ending 2005-12-09`: summary-2005-12-09.html +.. _`Week ending 2005-12-16`: summary-2005-12-16.html Added: pypy/dist/pypy/doc/weekly/summary-2005-12-16.txt ============================================================================== --- (empty file) +++ pypy/dist/pypy/doc/weekly/summary-2005-12-16.txt Mon Dec 19 09:28:52 2005 @@ -0,0 +1,58 @@ +======================= + This Week in PyPy 7 +======================= + +Introduction +============ + +This is the seventh summary of what's been going on in the world of +PyPy in the last week. I'd still like to remind people that when +something worth summarizing happens to recommend if for "This Week in +PyPy" as mentioned on: + + http://codespeak.net/pypy/dist/pypy/doc/weekly/ + +where you can also find old summaries. 
+ +There were about 110 commits to the pypy section of codespeak's +repository in the last week. + + +The Sprint! +=========== + +The last weekly summary was written towards then end of the sprint. +The things we did in the couple of remaining weeks were written up in +the second sprint report: + + http://codespeak.net/pipermail/pypy-dev/2005q4/002660.html + +Apart from continuing our work from the first half of the sprint, the +main new work was implementing __del__ support in the translated PyPy. + + +IRC Summary +=========== + +Thanks again to Pieter for this. + +**Monday** http://tismerysoft.de/pypy/irc-logs/pypy/%23pypy.log.20051212:: + + [00:26] Stakkars says that it is great that pypy does not punish you for + indirection. He is of meaning that he writes better style in RPython + than in Python, because the "it is slow" aspect is gone. + +**Tuesday** http://tismerysoft.de/pypy/irc-logs/pypy/%23pypy.log.20051213:: + + [21:01] Heatsink says that he is doing some dynamic optimizations in CPython. + This turns into a discussion about the nature of pypy, and Arigo takes + us on a tour of how pypy and the JIT will interact in the future. A + good read of general pypy ideas. + +**Thursday** http://tismerysoft.de/pypy/irc-logs/pypy/%23pypy.log.20051215:: + + [10:24] Ericvrp discovers an optimization that makes pypy 6.8x slower than + CPython on the richards test suite. All if-elses are converted to + switches. Cfbolz replies that it is time to write a graph + transformation to implement this optimization officially. 
+ From arigo at codespeak.net Mon Dec 19 11:20:52 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Mon, 19 Dec 2005 11:20:52 +0100 (CET) Subject: [pypy-svn] r21298 - pypy/dist/pypy/doc Message-ID: <20051219102052.BE24927DBB@code1.codespeak.net> Author: arigo Date: Mon Dec 19 11:20:51 2005 New Revision: 21298 Modified: pypy/dist/pypy/doc/dynamic-language-translation.txt Log: Page layout: add a "Figure N" caption to the graphviz figures, scale them down a little bit so that two of them fit on a PDF page. Modified: pypy/dist/pypy/doc/dynamic-language-translation.txt ============================================================================== --- pypy/dist/pypy/doc/dynamic-language-translation.txt (original) +++ pypy/dist/pypy/doc/dynamic-language-translation.txt Mon Dec 19 11:20:51 2005 @@ -819,7 +819,11 @@ It is left as an exercise to show that this partial order makes *A* a lattice. -Graphically: + +.. graphviz:: image/lattice1.dot + :scale: 80 + +:Figure 1: the lattice of annotations. .. ____________ Top ___________ @@ -837,10 +841,13 @@ \ \ | / / `--------`-- Bottom ------' -.. graphviz:: image/lattice1.dot -Here is the part about instances and nullable instances, assuming a -simple class hierarchy with only two direct subclasses of ``object``: +.. graphviz:: image/lattice2.dot + :scale: 80 + +:Figure 2: The part about instances and nullable instances, assuming a + simple class hierarchy with only two direct subclasses of + ``object``. .. Top @@ -866,9 +873,11 @@ \ / / Bottom -.. graphviz:: image/lattice2.dot -All list terms for all variables are unordered: +.. graphviz:: image/lattice3.dot + :scale: 80 + +:Figure 3: All list terms for all variables are unordered. .. __________________ Top __________________ @@ -881,7 +890,8 @@ \ \ \ / / / '------------'--- None ----'------------' -.. graphviz:: image/lattice3.dot + +~~~~~~~~ The Pbcs form a classical finite set-of-subsets lattice. 
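The set-of-subsets structure named in the last sentence can be made concrete with plain Python sets. This is a toy sketch of the idea only (the constant names are invented), not the annotator's actual representation of Pbc annotations:

```python
# Toy model of the Pbc part of the lattice: an annotation is a set of
# pre-built constants, the partial order is set inclusion, the join
# (least upper bound) is union and the meet is intersection.
def join(a, b):
    return a | b

def meet(a, b):
    return a & b

def leq(a, b):
    return a <= b    # the partial order itself

A = frozenset(['func_f', 'func_g'])
B = frozenset(['func_g', 'func_h'])
joined = join(A, B)
met = meet(A, B)
```

Both bounds always exist for finite sets, which is what makes this family of annotations a lattice.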
In practice, we consider ``None`` as a degenerated pre-built constant, so the None From cfbolz at codespeak.net Mon Dec 19 13:05:13 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Mon, 19 Dec 2005 13:05:13 +0100 (CET) Subject: [pypy-svn] r21304 - pypy/dist/pypy/doc/weekly Message-ID: <20051219120513.6C91D27DBB@code1.codespeak.net> Author: cfbolz Date: Mon Dec 19 13:05:12 2005 New Revision: 21304 Modified: pypy/dist/pypy/doc/weekly/summary-2005-12-16.txt Log: so we are having sprints now that last several weeks? Modified: pypy/dist/pypy/doc/weekly/summary-2005-12-16.txt ============================================================================== --- pypy/dist/pypy/doc/weekly/summary-2005-12-16.txt (original) +++ pypy/dist/pypy/doc/weekly/summary-2005-12-16.txt Mon Dec 19 13:05:12 2005 @@ -22,7 +22,7 @@ =========== The last weekly summary was written towards then end of the sprint. -The things we did in the couple of remaining weeks were written up in +The things we did in the couple of remaining days were written up in the second sprint report: http://codespeak.net/pipermail/pypy-dev/2005q4/002660.html From ac at codespeak.net Mon Dec 19 13:22:02 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Mon, 19 Dec 2005 13:22:02 +0100 (CET) Subject: [pypy-svn] r21305 - pypy/dist/pypy/rpython Message-ID: <20051219122202.E120127DBB@code1.codespeak.net> Author: ac Date: Mon Dec 19 13:22:02 2005 New Revision: 21305 Modified: pypy/dist/pypy/rpython/llinterp.py Log: Add support for r_ulonglong to llinterp. 
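The semantics this commit wires into the llinterpreter can be mimicked in a few lines. The class below is a toy stand-in for pypy.rpython.rarithmetic.r_ulonglong, shown only to illustrate the wrap-around behaviour of 64-bit unsigned arithmetic; it is not the real implementation.

```python
MASK = 2 ** 64 - 1   # unsigned long long arithmetic is modulo 2**64

class ToyUlonglong(object):
    """Toy model of r_ulonglong: every result is truncated to 64 bits."""
    def __init__(self, value):
        self.value = value & MASK

    def __add__(self, other):
        return ToyUlonglong(self.value + int(other))

    def __mul__(self, other):
        return ToyUlonglong(self.value * int(other))

    def __int__(self):
        return self.value

wrapped = int(ToyUlonglong(MASK) + 1)    # overflow wraps to 0
negative = int(ToyUlonglong(-1))         # negatives wrap to 2**64 - 1
```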
Modified: pypy/dist/pypy/rpython/llinterp.py ============================================================================== --- pypy/dist/pypy/rpython/llinterp.py (original) +++ pypy/dist/pypy/rpython/llinterp.py Mon Dec 19 13:22:02 2005 @@ -1,5 +1,5 @@ from pypy.objspace.flow.model import FunctionGraph, Constant, Variable, c_last_exception -from pypy.rpython.rarithmetic import intmask, r_uint, ovfcheck, r_longlong +from pypy.rpython.rarithmetic import intmask, r_uint, ovfcheck, r_longlong, r_ulonglong from pypy.rpython.lltypesystem import lltype from pypy.rpython.memory import lladdress from pypy.rpython.ootypesystem import ootype @@ -560,7 +560,7 @@ # __________________________________________________________ # primitive operations - for typ in (float, int, r_uint, r_longlong): + for typ in (float, int, r_uint, r_longlong, r_ulonglong): typname = typ.__name__ optup = ('add', 'sub', 'mul', 'div', 'truediv', 'floordiv', 'mod', 'gt', 'lt', 'ge', 'ne', 'le', 'eq',) @@ -568,6 +568,8 @@ opnameprefix = 'uint' elif typ is r_longlong: opnameprefix = 'llong' + elif typ is r_ulonglong: + opnameprefix = 'ullong' else: opnameprefix = typname if typ in (int, r_uint): From ac at codespeak.net Mon Dec 19 14:31:28 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Mon, 19 Dec 2005 14:31:28 +0100 (CET) Subject: [pypy-svn] r21309 - in pypy/dist/pypy: annotation rpython Message-ID: <20051219133128.9D18E27DC4@code1.codespeak.net> Author: ac Date: Mon Dec 19 14:31:28 2005 New Revision: 21309 Modified: pypy/dist/pypy/annotation/builtin.py pypy/dist/pypy/rpython/rbuiltin.py Log: Imporve on r_ulonglong support. 
Modified: pypy/dist/pypy/annotation/builtin.py ============================================================================== --- pypy/dist/pypy/annotation/builtin.py (original) +++ pypy/dist/pypy/annotation/builtin.py Mon Dec 19 14:31:28 2005 @@ -89,6 +89,10 @@ return constpropagate(pypy.rpython.rarithmetic.r_longlong, [s_obj], SomeInteger(size=2)) +def restricted_ulonglong(s_obj): # for r_uint + return constpropagate(pypy.rpython.rarithmetic.r_ulonglong, [s_obj], + SomeInteger(size=2, nonneg=True, unsigned=True)) + def builtin_float(s_obj): return constpropagate(float, [s_obj], SomeFloat()) @@ -335,6 +339,7 @@ BUILTIN_ANALYZERS[pypy.rpython.rarithmetic.r_uint] = restricted_uint BUILTIN_ANALYZERS[pypy.rpython.rarithmetic.r_longlong] = restricted_longlong +BUILTIN_ANALYZERS[pypy.rpython.rarithmetic.r_ulonglong] = restricted_ulonglong ##BUILTIN_ANALYZERS[pypy.rpython.rarithmetic.ovfcheck] = rarith_ovfcheck ##BUILTIN_ANALYZERS[pypy.rpython.rarithmetic.ovfcheck_lshift] = rarith_ovfcheck_lshift BUILTIN_ANALYZERS[pypy.rpython.rarithmetic.intmask] = rarith_intmask Modified: pypy/dist/pypy/rpython/rbuiltin.py ============================================================================== --- pypy/dist/pypy/rpython/rbuiltin.py (original) +++ pypy/dist/pypy/rpython/rbuiltin.py Mon Dec 19 14:31:28 2005 @@ -167,6 +167,10 @@ vlist = hop.inputargs(lltype.SignedLongLong) return vlist[0] +def rtype_r_ulonglong(hop): + vlist = hop.inputargs(lltype.UnsignedLongLong) + return vlist[0] + def rtype_builtin_min(hop): rint1, rint2 = hop.args_r assert isinstance(rint1, IntegerRepr) @@ -319,6 +323,7 @@ BUILTIN_TYPER[rarithmetic.intmask] = rtype_intmask BUILTIN_TYPER[rarithmetic.r_uint] = rtype_r_uint BUILTIN_TYPER[rarithmetic.r_longlong] = rtype_r_longlong +BUILTIN_TYPER[rarithmetic.r_ulonglong] = rtype_r_ulonglong BUILTIN_TYPER[objectmodel.r_dict] = rtype_r_dict BUILTIN_TYPER[objectmodel.we_are_translated] = rtype_we_are_translated BUILTIN_TYPER[objectmodel.cast_object_to_int] = 
rtype_cast_object_to_int From arigo at codespeak.net Mon Dec 19 15:11:18 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Mon, 19 Dec 2005 15:11:18 +0100 (CET) Subject: [pypy-svn] r21311 - pypy/dist/pypy/doc Message-ID: <20051219141118.9421627DD6@code1.codespeak.net> Author: arigo Date: Mon Dec 19 15:11:16 2005 New Revision: 21311 Modified: pypy/dist/pypy/doc/dynamic-language-translation.txt Log: Removed the :scale: which have a bad effect e.g. in the HTML for Firefox. Added a missing block quote. Modified: pypy/dist/pypy/doc/dynamic-language-translation.txt ============================================================================== --- pypy/dist/pypy/doc/dynamic-language-translation.txt (original) +++ pypy/dist/pypy/doc/dynamic-language-translation.txt Mon Dec 19 15:11:16 2005 @@ -821,7 +821,6 @@ .. graphviz:: image/lattice1.dot - :scale: 80 :Figure 1: the lattice of annotations. @@ -843,7 +842,6 @@ .. graphviz:: image/lattice2.dot - :scale: 80 :Figure 2: The part about instances and nullable instances, assuming a simple class hierarchy with only two direct subclasses of @@ -875,7 +873,6 @@ .. graphviz:: image/lattice3.dot - :scale: 80 :Figure 3: All list terms for all variables are unordered. @@ -1708,7 +1705,7 @@ which concludes the induction step. Case 2: *r* is not in *S_i*. By induction hypothesis ``(b_i, E_i) = - r( (b_i, E_i) )``. + r( (b_i, E_i) )``. 
:: (b_i, E_i) = (b_i, E_i) union (bf,Ef) (b_i, E_i) >= (bf,Ef) From ac at codespeak.net Mon Dec 19 15:34:13 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Mon, 19 Dec 2005 15:34:13 +0100 (CET) Subject: [pypy-svn] r21314 - in pypy/dist/pypy/translator: backendopt backendopt/test c c/test Message-ID: <20051219143413.6E46C27DD6@code1.codespeak.net> Author: ac Date: Mon Dec 19 15:34:13 2005 New Revision: 21314 Modified: pypy/dist/pypy/translator/backendopt/merge_if_blocks.py pypy/dist/pypy/translator/backendopt/test/test_merge_if_blocks.py pypy/dist/pypy/translator/c/funcgen.py pypy/dist/pypy/translator/c/test/test_backendoptimized.py Log: Extend the 'if ... elif ...' merging to other types than Signed. Modified: pypy/dist/pypy/translator/backendopt/merge_if_blocks.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/merge_if_blocks.py (original) +++ pypy/dist/pypy/translator/backendopt/merge_if_blocks.py Mon Dec 19 15:34:13 2005 @@ -10,7 +10,9 @@ if len(block.operations) > 1 and not first: return False op = block.operations[-1] - if op.opname != 'int_eq' or op.result != block.exitswitch: + if (op.opname not in ('int_eq', 'uint_eq', 'llong_eq', 'ullong_eq', + 'char_eq', 'unichar_eq') + or op.result != block.exitswitch): return False if isinstance(op.args[0], Variable) and isinstance(op.args[1], Variable): return False Modified: pypy/dist/pypy/translator/backendopt/test/test_merge_if_blocks.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_merge_if_blocks.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_merge_if_blocks.py Mon Dec 19 15:34:13 2005 @@ -5,33 +5,64 @@ from pypy.objspace.flow.model import flatten, Block from pypy.translator.backendopt.removenoops import remove_same_as from pypy.rpython.llinterp import LLInterpreter +from pypy.rpython.rarithmetic import r_uint, r_ulonglong, 
r_longlong +from pypy.annotation.model import SomeChar, SomeUnicodeCodePoint -def test_merge1(): - def merge1(n): - n += 1 - if n == 1: - return 1 - elif n == 2: - return 2 - elif n == 3: - return 3 - return 4 +def do_test_merge(fn, testvalues): t = TranslationContext() a = t.buildannotator() - a.build_types(merge1, [int]) + a.build_types(fn, [type(testvalues[0])]) rtyper = t.buildrtyper() rtyper.specialize() - graph = tgraphof(t, merge1) + graph = tgraphof(t, fn) assert len(list(graph.iterblocks())) == 4 #startblock, blocks, returnblock remove_same_as(graph) merge_if_blocks_once(graph) assert len(graph.startblock.exits) == 4 assert len(list(graph.iterblocks())) == 2 #startblock, returnblock interp = LLInterpreter(rtyper) - for i in range(4): - res = interp.eval_graph(graph, [i]) - assert res == i + 1 + for i in testvalues: + expected = fn(i) + actual = interp.eval_graph(graph, [i]) + assert actual == expected +def test_merge1(): + def merge_int(n): + n += 1 + if n == 1: + return 1 + elif n == 2: + return 2 + elif n == 3: + return 3 + return 4 + do_test_merge(merge_int, range(4)) + do_test_merge(merge_int, [r_uint(i) for i in range(4)]) + do_test_merge(merge_int, [r_longlong(i) for i in range(4)]) + do_test_merge(merge_int, [r_ulonglong(i) for i in range(4)]) + + def merge_chr(n): + c = chr(n + 1) + if c == 'a': + return 'a' + elif c == 'b': + return 'b' + elif c == 'c': + return 'c' + return 'd' + do_test_merge(merge_chr, range(96, 101)) + + def merge_uchr(n): + c = unichr(n + 1) + if c == u'a': + return u'a' + elif c == u'b': + return u'b' + elif c == u'c': + return u'c' + return u'd' + do_test_merge(merge_uchr, range(96, 101)) + def test_merge_passonvars(): def merge(n, m): if n == 1: Modified: pypy/dist/pypy/translator/c/funcgen.py ============================================================================== --- pypy/dist/pypy/translator/c/funcgen.py (original) +++ pypy/dist/pypy/translator/c/funcgen.py Mon Dec 19 15:34:13 2005 @@ -5,7 +5,7 @@ from 
pypy.objspace.flow.model import Variable, Constant, Block from pypy.objspace.flow.model import traverse, c_last_exception from pypy.rpython.lltypesystem.lltype import \ - Ptr, PyObject, Void, Bool, Signed, pyobjectptr, Struct, Array + Ptr, PyObject, Void, Bool, Signed, Unsigned, SignedLongLong, UnsignedLongLong,Char, UniChar, pyobjectptr, Struct, Array PyObjPtr = Ptr(PyObject) @@ -345,7 +345,8 @@ for op in gen_link(block.exits[-1]): yield op yield '' - elif TYPE == Signed: + elif TYPE in (Signed, Unsigned, SignedLongLong, + UnsignedLongLong, Char, UniChar): defaultlink = None expr = self.expr(block.exitswitch) yield 'switch (%s) {' % self.expr(block.exitswitch) @@ -353,7 +354,7 @@ if link.exitcase is 'default': defaultlink = link continue - yield 'case %s:' % link.llexitcase + yield 'case %s:' % self.db.get(link.llexitcase) for op in gen_link(link): yield '\t' + op yield 'break;' Modified: pypy/dist/pypy/translator/c/test/test_backendoptimized.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_backendoptimized.py (original) +++ pypy/dist/pypy/translator/c/test/test_backendoptimized.py Mon Dec 19 15:34:13 2005 @@ -2,7 +2,7 @@ from pypy.translator.c.test.test_typed import TestTypedTestCase as _TestTypedTestCase from pypy.translator.backendopt.all import backend_optimizations from pypy.rpython import objectmodel - +from pypy.rpython.rarithmetic import r_uint, r_longlong, r_ulonglong class TestTypedOptimizedTestCase(_TestTypedTestCase): @@ -105,7 +105,7 @@ self.t = t backend_optimizations(t, merge_if_blocks_to_switch=True) - def test_switch(self): + def test_int_switch(self): def f(x=int): if x == 3: return 9 @@ -119,3 +119,77 @@ for x in (0,1,2,3,9,27,48, -9): assert fn(x) == f(x) + def test_uint_switch(self): + def f(x=r_uint): + if x == r_uint(3): + return 9 + elif x == r_uint(9): + return 27 + elif x == r_uint(27): + return 3 + return 0 + codegenerator = self.CodeGenerator() + fn = 
codegenerator.getcompiled(f) + for x in (0,1,2,3,9,27,48): + assert fn(x) == f(x) + + def test_longlong_switch(self): + def f(x=r_longlong): + if x == r_longlong(3): + return 9 + elif x == r_longlong(9): + return 27 + elif x == r_longlong(27): + return 3 + return 0 + codegenerator = self.CodeGenerator() + fn = codegenerator.getcompiled(f) + for x in (0,1,2,3,9,27,48, -9): + assert fn(x) == f(x) + + def test_ulonglong_switch(self): + def f(x=r_ulonglong): + if x == r_ulonglong(3): + return 9 + elif x == r_ulonglong(9): + return 27 + elif x == r_ulonglong(27): + return 3 + return 0 + codegenerator = self.CodeGenerator() + fn = codegenerator.getcompiled(f) + for x in (0,1,2,3,9,27,48, -9): + assert fn(x) == f(x) + + def test_chr_switch(self): + def f(y=int): + x = chr(y) + if x == 'a': + return 'b' + elif x == 'b': + return 'c' + elif x == 'c': + return 'd' + return '@' + codegenerator = self.CodeGenerator() + fn = codegenerator.getcompiled(f) + for x in 'ABCabc@': + y = ord(x) + assert fn(y) == f(y) + + def test_unichr_switch(self): + def f(y=int): + x = unichr(y) + if x == u'a': + return 'b' + elif x == u'b': + return 'c' + elif x == u'c': + return 'd' + return '@' + codegenerator = self.CodeGenerator() + fn = codegenerator.getcompiled(f) + for x in u'ABCabc@': + y = ord(x) + assert fn(y) == f(y) + From cfbolz at codespeak.net Mon Dec 19 16:43:11 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Mon, 19 Dec 2005 16:43:11 +0100 (CET) Subject: [pypy-svn] r21316 - pypy/dist/pypy/doc/weekly Message-ID: <20051219154311.91AC427DD4@code1.codespeak.net> Author: cfbolz Date: Mon Dec 19 16:43:10 2005 New Revision: 21316 Modified: pypy/dist/pypy/doc/weekly/summary-2005-12-16.txt Log: add a strange, strange paragraph about the new bytecode dispatching code Modified: pypy/dist/pypy/doc/weekly/summary-2005-12-16.txt ============================================================================== --- pypy/dist/pypy/doc/weekly/summary-2005-12-16.txt (original) +++ 
pypy/dist/pypy/doc/weekly/summary-2005-12-16.txt Mon Dec 19 16:43:10 2005 @@ -56,3 +56,26 @@ switches. Cfbolz replies that it is time to write a graph transformation to implement this optimization officially. + +PyPy's Bytecode Dispatcher +========================== + +Until now the bytecode dispatching in PyPy was done using a list of functions +that contain the implementation of the respective bytecode. The dispatch +function retrieved the correct function by using the bytecode as an index into +this list. This was turned by the translator and the C backend into an array of +function pointers. This has the drawback that the bytecode implementing +functions can never be inlined (although some of them are quite small) and +there always is a read from memory for every bytecode. + +During the Gothenburg sprint we discussed a strategy to transform the dispatch +code into something more efficient. During the last week Eric, Arre and Carl +Friedrich implemented this strategy. Now the dispatching is done with a huge +chain of if/elif/else blocks that all test the value of the same variable. In addition +there is a transformation that turns chains of such if/elif/else blocks +into a block that has an integer variable as an exitswitch and links whose +exitcases correspond to the different values of the single integer variable. +The C backend outputs such a block as a switch. In addition this technique +makes it possible for our inliner to inline some of the bytecode implementing +functions. Using the new dispatcher pypy-c got XXX slower and is now XXX +times slower than CPython.
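The two dispatching styles contrasted in the paragraph above can be sketched in ordinary Python. The opcode numbers and handler names below are invented for the illustration and are not PyPy's real opcode table:

```python
class Frame(object):
    def __init__(self, locals_):
        self.locals = locals_
        self.stack = []

def LOAD_FAST(frame, arg):
    frame.stack.append(frame.locals[arg])

def POP_TOP(frame, arg):
    frame.stack.pop()

# Old style: a list indexed by the bytecode.  In C this becomes an
# array of function pointers, so the handlers can never be inlined and
# every dispatch costs a read from memory.
DISPATCH_TABLE = [LOAD_FAST, POP_TOP]

def dispatch_with_table(frame, opcode, arg):
    DISPATCH_TABLE[opcode](frame, arg)

# New style: a chain of if/elif on a single variable.  The graph
# transformation described above merges such chains into one block that
# the C backend emits as a switch, and small handlers become inlinable.
def dispatch_with_chain(frame, opcode, arg):
    if opcode == 0:
        LOAD_FAST(frame, arg)
    elif opcode == 1:
        POP_TOP(frame, arg)
    else:
        raise ValueError('unknown opcode %d' % opcode)

frame = Frame([42])
dispatch_with_table(frame, 0, 0)    # push locals[0]
stack_after_push = list(frame.stack)
dispatch_with_chain(frame, 1, 0)    # pop it again
stack_after_pop = list(frame.stack)
```

Both styles compute the same thing; the difference only shows up in the machine code the C backend produces.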
From tismer at codespeak.net Mon Dec 19 17:34:12 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Mon, 19 Dec 2005 17:34:12 +0100 (CET) Subject: [pypy-svn] r21317 - pypy/dist/pypy/translator/c/test Message-ID: <20051219163412.EF91F27DD2@code1.codespeak.net> Author: tismer Date: Mon Dec 19 17:34:11 2005 New Revision: 21317 Modified: pypy/dist/pypy/translator/c/test/test_coroutine.py Log: a minimalist coroutine implementation which seems to work fine. We explicitly avoid to put anything into the coro structure than a single continuation. Anything else can be added via Greenlets, Tasklets, ... Note that it is also not necessary to store a callable - it is passed in to bind() as a thunk object. Modified: pypy/dist/pypy/translator/c/test/test_coroutine.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_coroutine.py (original) +++ pypy/dist/pypy/translator/c/test/test_coroutine.py Mon Dec 19 17:34:11 2005 @@ -1,21 +1,7 @@ """ -Hi Armin: -When I set DEBUG to False, the program crashes. Maybe I'm doing something -wrong and re-using some used continuation, don't know. -So I was trying to set things to Nonw after becoming invalid, but -that breaks the rtyper. +minimalistic coroutine implementation """ -DEBUG = False -# set to true and compilation crashes -USE_NONE = False -# set to true and rtyper crashes - -# the above are exclusive right now - -CHECKED_IN = True -# set this to false to skip skipping :-) - import os from pypy.rpython.rstack import yield_current_frame_to_caller @@ -67,7 +53,7 @@ caller of the current function. This frame serves as the entry point to the coroutine. -On evetry entry to the coroutine, the return value of the +On every entry to the coroutine, the return value of the point where we left off is the continuation of the caller. We need to update the caller's frame with it. This is not necessarily the caller which created ourself. 
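The switching discipline described in this docstring can be imitated with ordinary generators driven by a small trampoline. This is only an analogy for the frame switching that yield_current_frame_to_caller provides (generators cannot really switch stacks), but it reproduces the 1..8 interleaving of the test in this file:

```python
# Generator-based analogy: each coroutine yields the name of the
# coroutine it switches to, and a coroutine whose body ends falls back
# to 'f', playing the role of costate.main in the real code.
def demo():
    lst = []

    def f():
        lst.append(1); yield 'g'
        lst.append(4); yield 'g'
        lst.append(6); yield 'h'
        lst.append(8)

    def g():
        lst.append(2); yield 'h'
        lst.append(5)        # g ends -> control returns to main

    def h():
        lst.append(3); yield 'f'
        lst.append(7)        # h ends -> control returns to main

    coros = {'f': f(), 'g': g(), 'h': h()}
    current = 'f'
    while True:
        try:
            current = next(coros[current])    # the "switch"
        except StopIteration:
            if current == 'f':
                break                         # main itself finished
            current = 'f'
    n = 0
    for i in lst:
        n = n * 10 + i
    return n

result = demo()
```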
@@ -92,96 +78,85 @@ class Coroutine(object): - if DEBUG: - def __init__(self): - self._switchable = False - - if USE_NONE: - def __init__(self): - self.frame = None + def __init__(self): + self.frame = None def bind(self, thunk): - if USE_NONE: - assert self.frame is None + if self.frame is not None: + raise CoroutineDamage self.frame = self._bind(thunk) def _bind(self, thunk): - if self is costate.current or self is costate.main: - raise CoroutineDamage - frame = yield_current_frame_to_caller() - costate.current.frame = frame - if DEBUG: - costate.current.switchable = True - assert self._switchable == True - self._switchable = False - costate.current = self + costate.last.frame = yield_current_frame_to_caller() thunk.call() - return self.frame # just for the annotator + costate.last, costate.current = costate.current, costate.main + frame, costate.main.frame = costate.main.frame, None + return frame def switch(self): - if DEBUG: - assert self._switchable == True - assert costate.current._switchable == False - if USE_NONE: - assert costate.current.frame is None - assert self.frame is not None - frame = self.frame.switch() - if DEBUG: - assert costate.current._switchable == False - costate.current._switchable = True - if USE_NONE: - assert costate.current.frame is None - costate.current.frame = frame - costate.current = self - # XXX support: self.frame = None + if self.frame is None: + raise CoroutineDamage + costate.last, costate.current = costate.current, self + frame, self.frame = self.frame, None + costate.last.frame = frame.switch() -costate.current = costate.main = Coroutine() +costate.current = costate.last = costate.main = Coroutine() def output(stuff): os.write(2, stuff + '\n') def test_coroutine(): - if CHECKED_IN: - import py.test - py.test.skip("in-progress") - def g(lst): + def g(lst, coros): + coro_f, coro_g, coro_h = coros lst.append(2) output('g appended 2') - costate.main.switch() - lst.append(4) - output('g appended 4') - costate.main.switch() - 
lst.append(6) - output('g appended 6') + coro_h.switch() + lst.append(5) + output('g appended 5') + + def h(lst, coros): + coro_f, coro_g, coro_h = coros + lst.append(3) + output('h appended 3') + coro_f.switch() + lst.append(7) + output('h appended 7') class T: - def __init__(self, func, arg): + def __init__(self, func, arg1, arg2): self.func = func - self.arg = arg + self.arg1 = arg1 + self.arg2 = arg2 def call(self): - self.func(self.arg) + self.func(self.arg1, self.arg2) def f(): lst = [1] + coro_f = costate.main coro_g = Coroutine() - t = T(g, lst) - output('binding after f set 1') - coro_g.bind(t) - output('switching') - coro_g.switch() - lst.append(3) - output('f appended 3') + coro_h = Coroutine() + coros = [coro_f, coro_g, coro_h] + thunk_g = T(g, lst, coros) + output('binding g after f set 1') + coro_g.bind(thunk_g) + thunk_h = T(h, lst, coros) + output('binding h after f set 1') + coro_h.bind(thunk_h) + output('switching to g') coro_g.switch() - lst.append(5) - output('f appended 5') + lst.append(4) + output('f appended 4') coro_g.switch() - lst.append(7) - output('f appended 7') + lst.append(6) + output('f appended 6') + coro_h.switch() + lst.append(8) + output('f appended 8') n = 0 for i in lst: n = n*10 + i return n data = wrap_stackless_function(f) - assert int(data.strip()) == 1234567 + assert int(data.strip()) == 12345678 From rxe at codespeak.net Mon Dec 19 21:52:19 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Mon, 19 Dec 2005 21:52:19 +0100 (CET) Subject: [pypy-svn] r21330 - pypy/dist/pypy/translator/c/test Message-ID: <20051219205219.5ED8E27B69@code1.codespeak.net> Author: rxe Date: Mon Dec 19 21:52:18 2005 New Revision: 21330 Modified: pypy/dist/pypy/translator/c/test/test_coroutine.py Log: A failing test trying to understand exiting of coroutines and bind from outside main. 
Modified: pypy/dist/pypy/translator/c/test/test_coroutine.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_coroutine.py (original) +++ pypy/dist/pypy/translator/c/test/test_coroutine.py Mon Dec 19 21:52:18 2005 @@ -160,3 +160,82 @@ data = wrap_stackless_function(f) assert int(data.strip()) == 12345678 + +def test_coroutine2(): + + class TBase: + def call(self): + pass + + class T(TBase): + def __init__(self, func, arg1, arg2): + self.func = func + self.arg1 = arg1 + self.arg2 = arg2 + def call(self): + self.res = self.func(self.arg1, self.arg2) + + class T1(TBase): + def __init__(self, func, arg1): + self.func = func + self.arg1 = arg1 + def call(self): + self.res = self.func(self.arg1) + + def g(lst, coros): + coro_f1, coro_g, coro_h = coros + lst.append(2) + output('g appended 2') + coro_h.switch() + lst.append(5) + output('g appended 5') + output('exiting g') + + def h(lst, coros): + coro_f1, coro_g, coro_h = coros + lst.append(3) + output('h appended 3') + coro_f1.switch() + lst.append(7) + output('h appended 7') + output('exiting h') + + def f1(coro_f1): + lst = [1] + coro_g = Coroutine() + coro_h = Coroutine() + coros = [coro_f1, coro_g, coro_h] + thunk_g = T(g, lst, coros) + output('binding g after f1 set 1') + coro_g.bind(thunk_g) + thunk_h = T(h, lst, coros) + output('binding h after f1 set 1') + coro_h.bind(thunk_h) + output('switching to g') + coro_g.switch() + lst.append(4) + output('f1 appended 4') + coro_g.switch() + lst.append(6) + output('f1 appended 6') + coro_h.switch() + lst.append(8) + output('f1 appended 8') + n = 0 + for i in lst: + n = n*10 + i + output('exiting f1') + return n + + def f(): + coro_f = costate.main + coro_f1 = Coroutine() + thunk_f1 = T1(f1, coro_f1) + output('binding f1 after f set 1') + coro_f1.bind(thunk_f1) + coro_f1.switch() + output('return to main :-(') + return thunk_f1.res + + data = wrap_stackless_function(f) + assert int(data.strip()) == 
12345678 From rxe at codespeak.net Mon Dec 19 21:53:22 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Mon, 19 Dec 2005 21:53:22 +0100 (CET) Subject: [pypy-svn] r21331 - pypy/dist/pypy/translator/c/test Message-ID: <20051219205322.B0F5527B69@code1.codespeak.net> Author: rxe Date: Mon Dec 19 21:53:21 2005 New Revision: 21331 Modified: pypy/dist/pypy/translator/c/test/test_coroutine.py Log: Ooops - forgot to disable failing test. Modified: pypy/dist/pypy/translator/c/test/test_coroutine.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_coroutine.py (original) +++ pypy/dist/pypy/translator/c/test/test_coroutine.py Mon Dec 19 21:53:21 2005 @@ -3,6 +3,7 @@ """ import os +import py from pypy.rpython.rstack import yield_current_frame_to_caller def wrap_stackless_function(fn): @@ -163,6 +164,8 @@ def test_coroutine2(): + py.test.skip("failing test") + class TBase: def call(self): pass From rxe at codespeak.net Mon Dec 19 23:11:35 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Mon, 19 Dec 2005 23:11:35 +0100 (CET) Subject: [pypy-svn] r21333 - pypy/dist/pypy/translator/c/test Message-ID: <20051219221135.9699227DB4@code1.codespeak.net> Author: rxe Date: Mon Dec 19 23:11:34 2005 New Revision: 21333 Modified: pypy/dist/pypy/translator/c/test/test_coroutine.py Log: Hack to make test work - not sure it makes sense. The idea is we return to binder when the coroutine ends iff the binder still has a frame (IOW is alive). If the binder is not available, fall back to main. 
Modified: pypy/dist/pypy/translator/c/test/test_coroutine.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_coroutine.py (original) +++ pypy/dist/pypy/translator/c/test/test_coroutine.py Mon Dec 19 23:11:34 2005 @@ -88,10 +88,13 @@ self.frame = self._bind(thunk) def _bind(self, thunk): + binder = costate.current costate.last.frame = yield_current_frame_to_caller() thunk.call() - costate.last, costate.current = costate.current, costate.main - frame, costate.main.frame = costate.main.frame, None + if binder.frame is None: + binder = costate.main + costate.last, costate.current = costate.current, binder + frame, binder.frame = binder.frame, None return frame def switch(self): @@ -164,8 +167,6 @@ def test_coroutine2(): - py.test.skip("failing test") - class TBase: def call(self): pass From hpk at codespeak.net Tue Dec 20 00:16:04 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Tue, 20 Dec 2005 00:16:04 +0100 (CET) Subject: [pypy-svn] r21334 - pypy/dist/pypy/doc/weekly Message-ID: <20051219231604.CA13227B5E@code1.codespeak.net> Author: hpk Date: Tue Dec 20 00:16:04 2005 New Revision: 21334 Modified: pypy/dist/pypy/doc/weekly/summary-2005-12-16.txt Log: added a blurb about ongoing EU-review preps (maybe a bit general). Modified: pypy/dist/pypy/doc/weekly/summary-2005-12-16.txt ============================================================================== --- pypy/dist/pypy/doc/weekly/summary-2005-12-16.txt (original) +++ pypy/dist/pypy/doc/weekly/summary-2005-12-16.txt Tue Dec 20 00:16:04 2005 @@ -79,3 +79,16 @@ makes it possible for our inliner to inline some of the bytecode implementing functions work. Using the new dispatcher pypy-c got XXX slower and is now XXX times slower thatn CPython. + +Preparations for EU-review still ongoing +=========================================== + +Still many developers are involved in ongoing +preparations for the EU review on 20th January. 
+Reports are being finalized and there are discussions +about various issues that are only indirectly related +to the development efforts (in so far as it provides +the basis for the partial funding we receive). +We probably will only know on the 20th if everything +works out suitably. + From arigo at codespeak.net Tue Dec 20 12:06:49 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 20 Dec 2005 12:06:49 +0100 (CET) Subject: [pypy-svn] r21337 - pypy/dist/pypy/doc Message-ID: <20051220110649.2792E27B5B@code1.codespeak.net> Author: arigo Date: Tue Dec 20 12:06:47 2005 New Revision: 21337 Modified: pypy/dist/pypy/doc/dynamic-language-translation.txt Log: Typo -- a nicely repeated one, too :-) Modified: pypy/dist/pypy/doc/dynamic-language-translation.txt ============================================================================== --- pypy/dist/pypy/doc/dynamic-language-translation.txt (original) +++ pypy/dist/pypy/doc/dynamic-language-translation.txt Tue Dec 20 12:06:47 2005 @@ -1818,7 +1818,7 @@ aspects. -Sprecialization +Specialization *************** The type system used by the annotator does not include polymorphism @@ -1845,19 +1845,19 @@ object space and annotator abstractly interpret the function's bytecode. In more details, the following special-cases are supported by default -(more advanced sprecializations have been implemented specifically for +(more advanced specializations have been implemented specifically for PyPy): -* sprecializing a function by the annotation of a given argument +* specializing a function by the annotation of a given argument -* sprecializing a function by the value of a given argument (requires all +* specializing a function by the value of a given argument (requires all calls to the function to resolve the argument to a constant value) * ignoring -- the function call is ignored. Useful for writing tests or debugging support code that should be removed during translation. 
* by arity -- for functions taking a variable number of (non-keyword) - arguments via a ``*args``, the default sprecialization is by the number + arguments via a ``*args``, the default specialization is by the number of extra arguments. (This follows naturally from the fact that the extended annotation lattice we use has annotations of the form ``Tuple(A_1, ..., A_n)`` representing a heterogeneous tuple of length @@ -1866,7 +1866,7 @@ * ctr_location -- for classes. A fresh independent copy of the class is made for each program point that instantiate the class. This is a - simple (but potentially over-sprecializing) way to obtain class + simple (but potentially over-specializing) way to obtain class polymorphism for the couple of container classes we needed in PyPy (e.g. Stack). @@ -1882,7 +1882,7 @@ Concrete mode execution *********************** -The *memo* sprecialization_ is used at key points in PyPy to obtain the +The *memo* specialization_ is used at key points in PyPy to obtain the effect described in the introduction (see `Abstract interpretation`_): the memo functions and all the code it invokes is concretely executed during annotation. There is no staticness restriction on that code -- @@ -2147,7 +2147,7 @@ back into the flow object space and the annotator and the RTyper itself, so that it gets turned into another low-level control flow graph. At this point, the annotator runs with a different set of default -sprecializations: it allows several copies of the helper functions to be +specializations: it allows several copies of the helper functions to be automatically built, one for each low-level type of its arguments. We do this by default at this level because of the intended purpose of these helpers: they are usually methods of a polymorphic container. @@ -2256,9 +2256,9 @@ In PyPy, our short-term future work is to focus on using the translation toolchain presented here to generate a modified low-level version of the same full Python interpreter. 
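The memo specialization mentioned in this section can be pictured with a caching decorator. This is a rough run-time sketch of the idea only; in PyPy the function body is executed concretely during annotation, once per constant argument, not at run time, and the names below are invented.

```python
# Sketch of the *memo* idea: the decorated function runs once per
# distinct constant argument and only its cached results survive.
calls = []

def memo(func):
    cache = {}
    def wrapper(value):
        if value not in cache:
            cache[value] = func(value)    # "concrete" execution, once
        return cache[value]
    return wrapper

@memo
def class_tag(name):
    calls.append(name)    # the body itself runs only on the first call
    return {'Stack': 1, 'Frame': 2}[name]

first = class_tag('Stack')
second = class_tag('Stack')    # served from the cache
```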
This modified version will drive a -just-in-time sprecialization process, in the sense of providing a +just-in-time specialization process, in the sense of providing a description of full Python that will not be directly executed, but -sprecialized for the particular user Python program. +specialized for the particular user Python program. As of October 2005, we are only starting the work in this direction. The details are not fleshed out nor documented yet, but the [Psyco]_ From ale at codespeak.net Tue Dec 20 12:46:38 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Tue, 20 Dec 2005 12:46:38 +0100 (CET) Subject: [pypy-svn] r21339 - in pypy/dist/pypy/lib/pyontology: . test Message-ID: <20051220114638.5264727B46@code1.codespeak.net> Author: ale Date: Tue Dec 20 12:46:36 2005 New Revision: 21339 Modified: pypy/dist/pypy/lib/pyontology/pyontology.py pypy/dist/pypy/lib/pyontology/test/test_ontology.py Log: Moved things around, fixed a test Modified: pypy/dist/pypy/lib/pyontology/pyontology.py ============================================================================== --- pypy/dist/pypy/lib/pyontology/pyontology.py (original) +++ pypy/dist/pypy/lib/pyontology/pyontology.py Tue Dec 20 12:46:36 2005 @@ -5,38 +5,121 @@ from logilab.constraint.propagation import AbstractDomain, AbstractConstraint, ConsistencyFailure import sys -namespaces = {'rdf':'http://www.w3.org/1999/02/22-rdf-syntax-ns', - 'rdfs':'http://www.w3.org/2000/01/rdf-schema', - 'dc':'http://purl.org/dc/elements/1.0/', - 'xmlns':'http://www.w3.org/1999/xhtml', - 'owl':'http://www.w3.org/2002/07/owl', +namespaces = { + 'rdf' : 'http://www.w3.org/1999/02/22-rdf-syntax-ns', + 'rdfs' : 'http://www.w3.org/2000/01/rdf-schema', + 'xmlns' : 'http://www.w3.org/1999/xhtml', + 'owl' : 'http://www.w3.org/2002/07/owl', } + uris = {} for k,v in namespaces.items(): uris[v] = k -Thing = URIRef(u'http://www.w3.org/2002/07/owl#Thing') Class = URIRef(u'http://www.w3.org/2002/07/owl#Class') -builtin_voc = [ - 
'Thing', - 'Class', - 'ObjectProperty', - 'AllDifferent', - 'AnnotationProperty', - 'DataRange', - 'DatatypeProperty', - 'DeprecatedClass', - 'DeprecatedProperty', - 'FunctionalProperty', - 'InverseFunctionalProperty', - 'Nothing', - 'ObjectProperty', - 'Ontology', - 'OntologyProperty', - 'Restriction', - 'SymmetricProperty', - 'TransitiveProperty' - ] +rdf_rest = URIRef(u'http://www.w3.org/1999/02/22-rdf-syntax-ns#rest') +rdf_first = URIRef(u'http://www.w3.org/1999/02/22-rdf-syntax-ns#first') + +class ClassDomain(AbstractDomain): + + # Class domain is intended as a (abstract/virtual) domain for implementing + # Class axioms. Working on class descriptions the class domain should allow + # creation of classes through axioms. + # The instances of a class can be represented as a FiniteDomain in values (not always see Disjointwith) + # Properties of a class is in the dictionary "properties" + # The bases of a class is in the list "bases" + + def __init__(self, name='', values=[], bases = []): + AbstractDomain.__init__(self) + self.bases = bases+[self] + self.values = values + self.name = name + self.properties = {} + + def __repr__(self): + return "" % str(self.name) + + def __getitem__(self, index): + return None + + def __iter__(self): + return iter(self.bases) + + def size(self): + return len(self.bases) + + __len__ = size + + def copy(self): + return self + + def removeValues(self, values): + print "remove values from ClassDomain %r"%self, values + if len(values) > 0: + self.bases.pop(self.bases.index(values[0])) + + def getValues(self): + return self.bases + +class Property(ClassDomain): + pass + +class ObjectProperty(Property): + + pass + +class DataTypeProperty(Property): + + pass + +class Thing: + + def __init__(self): + pass + +class AllDifferent(ClassDomain): + # A special class whose members are distinct + # Syntactic sugar + pass + +class Nothing: + + pass + + +class FunctionalProperty: + + def __init__(self): + pass + +class DataRange: + + def 
__init__(self): + pass + +class Restriction(ClassDomain): + pass + +builtin_voc = { + 'Thing' : Thing, + 'Class' : ClassDomain, + 'ObjectProperty' : ObjectProperty, + 'AllDifferent' : AllDifferent , +## 'AnnotationProperty' : AnnotationProperty, +## 'DataRange' : DataRange, +## 'DatatypeProperty' : DatatypeProperty, +## 'DeprecatedClass' : DeprecatedClass, +## 'DeprecatedProperty' : DeprecatedProperty, +## 'FunctionalProperty' : FunctionalProperty, +## 'InverseFunctionalProperty' : InverseFunctionalProperty, +## 'Nothing' : Nothing, +## 'ObjectProperty' : ObjectProperty, +## 'Ontology' : Ontology, +## 'OntologyProperty' : OntologyProperty, + 'Restriction' : Restriction, +## 'SymmetricProperty' : SymmetricProperty, +## 'TransitiveProperty' : TransitiveProperty + } class Ontology(Graph): @@ -56,37 +139,30 @@ def attach_fd(self): for (s, p, o) in (self.triples((None, None, None))): if p.find('#') != -1: - owl,func = p.split('#') + ns, func = p.split('#') else: - owl ='' + ns ='' func = p - #print s, p, o - #raise Exception - if owl in [namespaces['owl'],namespaces['rdf'],namespaces['rdfs']]: + + if ns in namespaces.items(): pred = getattr(self, func) - else: - pred = None - if pred: res = pred(s, p, o) if res == None: continue if type(res) != list : res = [res] - avar = self.make_var(s) + avar = self.make_var(fd, s) else: res = [o] - avar = self.make_var(s,p) + avar = self.make_var(fd, s, p) if self.variables.get(avar) and type(self.variables[avar]) == fd: self.variables[avar] = fd(list(self.variables[avar].getValues()) + res) else: self.variables[avar] = fd(res) - # for var in self.seen: - # self.variables.pop(var) - # self.seen = {} - + def solve(self,verbose=0): rep = Repository(self.variables.keys(), self.variables, self.constraints) - return Solver().solve(rep,verbose) + return Solver().solve(rep, verbose) def consistency(self): rep = Repository(self.variables.keys(), self.variables, self.constraints) @@ -94,26 +170,24 @@ def get_list(self, subject): res = [] - p 
= URIRef(u'http://www.w3.org/1999/02/22-rdf-syntax-ns#first') - first = list(self.objects(subject, p)) + first = list(self.objects(subject, rdf_first)) assert len(first) == 1 - self.seen[self.make_var(subject,p)]= 1 + self.seen[self.make_var(fd, subject, p)]= 1 if type(first[0]) == URIRef: - var = self.make_var(first[0]) + var = self.make_var(fd, first[0]) if var not in self.variables.keys(): self.variables[var] = ClassDomain(var) res += first - - p = URIRef(u'http://www.w3.org/1999/02/22-rdf-syntax-ns#rest') - rest = list(self.objects(subject, p)) - self.seen[self.make_var(subject,p)]= 1 + + rest = list(self.objects(subject, rdf_rest)) + self.seen[self.make_var(fd, subject, p)]= 1 if "#nil" in rest[0] : return res else: res += self.get_list(rest[0]) return res - def make_var(self,*args): + def make_var(self, cls=fd, *args): res = [] for a in args: if type(a) == URIRef: @@ -129,6 +203,9 @@ else: res.append(a) var = '.'.join([str(a.replace('-','_')) for a in res]) + if not var in self.variables.keys(): + print var + self.variables[var] = cls(name=var) return var def find_prop(self, s): @@ -144,8 +221,6 @@ pr = list( self.subjects(p,s) ) if len(pr) == 0: return - # pr = list( self.subjects(r,s) ) - # assert len(pr) == 1 return pr[0] else: return s @@ -159,9 +234,9 @@ prop = self.find_prop(s) cls = self.find_cls(s) if cls : - avar = self.make_var(cls, prop) + avar = self.make_var(ClassDomain, cls, prop) else: - avar = self.make_var( prop) + avar = self.make_var(ClassDomain, prop) if not self.variables.get(avar): self.variables[avar] = ClassDomain(avar) return avar @@ -169,15 +244,11 @@ #---------------- Implementation ---------------- def type(self, s, p, var): - avar = self.make_var(var) - svar = self.make_var(s) + avar = self.make_var(ClassDomain, var) + svar = self.make_var(ClassDomain, s) if (type(var) == URIRef and not (var in [URIRef(namespaces['owl']+'#'+x) for x in builtin_voc])): # var is not one of the builtin classes - if not self.variables.get(svar): - 
self.variables[svar] = ClassDomain(svar) - if not self.variables.get(avar): - self.variables[avar] = ClassDomain(avar) # if self.variables[avar].values: self.variables[svar].values += self.variables[avar].values @@ -185,7 +256,8 @@ self.constraints.append(constrain) else: # var is a builtin class - pass + self.variables[svar] = builtin_voc[var.split('#')[-1]]() + def first(self, s, p, var): pass @@ -193,52 +265,78 @@ def rest(self, s, p, var): pass - def range(self, s, p, var): - pass - - def domain(self, s, p, var): - pass - # --------- Class Axioms --------------------- def subClassOf(self, s, p, var): # s is a subclass of var means that the # class extension of s is a subset of the # class extension of var. - avar = self.make_var(var) - svar = self.make_var(s) - if not self.variables.get(avar): - self.variables[avar] = ClassDomain(avar) - constrain = SubClassConstraint(svar, avar) - self.constraints.append(constrain) - + avar = self.make_var(ClassDomain, var) + svar = self.make_var(ClassDomain, s) + res = get_bases(self.variables[avar], self.variables) + self.variables[svar].bases.extend(res.keys()) + def equivalentClass(self, s, p, var): - avar = self.make_var(var) - svar = self.make_var(s) - if not self.variables.get(avar): - self.variables[avar] = ClassDomain(avar) -# constrain = EquivalentClassConstraint(svar, avar) -# self.constraints.append(constrain) + avar = self.make_var(ClassDomain, var) + svar = self.make_var(ClassDomain, s) self.subClassOf(s, p, var) self.subClassOf(var, p, s) def disjointWith(self, s, p, var): - avar = self.make_var(var) - svar = self.make_var(s) - if not self.variables.get(avar): - self.variables[avar] = ClassDomain(avar) + avar = self.make_var(ClassDomain, var) + svar = self.make_var(ClassDomain, s) constrain = DisjointClassConstraint(svar, avar) self.constraints.append(constrain) + def complementOf(self, s, p, var): + # add constraint of not var + pass + def oneOf(self, s, p, var): res = self.get_list(var) prop = 
self.find_uriref(s) - avar = self.make_var( prop) + avar = self.make_var(fd, prop) if self.variables.get(avar) and type(self.variables[avar]) == fd: self.variables[avar] = fd(list(self.variables[avar].getValues()) + res) else: self.variables[avar] = fd(res) + def unionOf(self,s, p, var): + res = self.get_list(var) + return res #There might be doubles (but fd takes care of that) + + def intersectionOf(self, s, p, var): + res = self.get_list(var) + result = {}.fromkeys(res[0]) + for el in res: + for cls in result.keys(): + if cls not in el: + result.pop(cls) + return result.keys() + +#---------Property axioms-------------------- + + def range(self, s, p, var): + pass + + def domain(self, s, p, var): + avar = self.make_var(CassDomain, var) + svar = self.make_var(Property, s) + assert isinstance(self.variables[svar], Property) + assert isinstance(self.variables[avar], ClassDomain) + self.variables[avar].properties[svar] = self.variables[svar] + + def subPropertyOf(self, s, p, var): + pass + + def equivalentProperty(self, s, p, var): + pass + + def inverseOf(self, s, p, var): + pass + +#------------------------------------------- + def maxCardinality(self, s, p, var): """ Len of finite domain of the property shall be less than or equal to var""" avar = self.find_property(s) @@ -258,42 +356,22 @@ constrain = Cardinality(avar,int(var)) self.constraints.append(constrain) - def unionOf(self,s, p, var): - res = self.get_list(var) - return res #There might be doubles (but fd takes care of that) - - def intersectionOf(self, s, p, var): - res = self.get_list(var) - result = {}.fromkeys(res[0]) - for el in res: - for cls in result.keys(): - if cls not in el: - result.pop(cls) - return result.keys() - def differentFrom(self, s, p, var): - s_var = self.make_var(s) - var_var = self.make_var(var) - if not self.variables.get(s_var): - self.variables[s_var] = ClassDomain(s_var) - if not self.variables.get(var_var): - self.variables[var_var] = fd([]) + s_var = 
self.make_var(ClassDomain, s) + var_var = self.make_var(fd, var) constrain = BinaryExpression([s_var, var_var],"%s != %s" %(s_var, var_var)) self.constraints.append(constrain) def distinctMembers(self, s, p, var): res = self.get_list(var) - self.constraints.append(AllDistinct([self.make_var(y) for y in res])) + self.constraints.append(AllDistinct([self.make_var(ClassDomain, y) for y in res])) return res def sameAs(self, s, p, var): - constrain = BinaryExpression([self.make_var(s), self.make_var(var)],"%s == %s" %(self.make_var(s), self.make_var( var))) + constrain = BinaryExpression([self.make_var(ClassDomain, s), self.make_var(ClassDomain, var)], + "%s == %s" %(self.make_var(ClassDomain, s), self.make_var(ClassDomain, var))) self.constraints.append(constrain) - def complementOf(self, s, p, var): - # add constraint of not var - pass - def onProperty(self, s, p, var): pass @@ -306,18 +384,9 @@ def someValuesFrom(self, s, p, var): pass - def equivalentProperty(self, s, p, var): - pass - - def inverseOf(self, s, p, var): - pass - def someValuesFrom(self, s, p, var): pass - def subPropertyOf(self, s, p, var): - pass - def imports(self, s, p, var): pass @@ -433,41 +502,3 @@ print subdom,superdom, bases, subdom.bases subdom.bases += [bas for bas in bases if bas not in subdom.bases] -class ClassDomain(AbstractDomain): - # Class domain is intended as a (abstract/virtual) domain for implementing - # Class axioms. Working on class descriptions the class domain should allow - # creation of classes through axioms. 
- # The instances of a class can be represented as a FiniteDomain in values (not always see Disjointwith) - # Properties of a class is in the dictionary "properties" - # The bases of a class is in the list "bases" - - def __init__(self, name='', values=[], bases = []): - AbstractDomain.__init__(self) - self.bases = bases+[self] - self.values = values - self.name = name - - def __repr__(self): - return "" % str(self.name) - - def __getitem__(self, index): - return None - - def __iter__(self): - return iter(self.bases) - - def size(self): - return sys.maxint - - __len__ = size - - def copy(self): - return self - - def removeValues(self, values): - print "remove values from ClassDomain", values - self.bases.pop(self.bases.index(values[0])) - - def getValues(self): - return self.bases - Modified: pypy/dist/pypy/lib/pyontology/test/test_ontology.py ============================================================================== --- pypy/dist/pypy/lib/pyontology/test/test_ontology.py (original) +++ pypy/dist/pypy/lib/pyontology/test/test_ontology.py Tue Dec 20 12:46:36 2005 @@ -13,18 +13,23 @@ def test_makevar(): O = Ontology() var = URIRef(u'http://www.w3.org/2002/03owlt/unionOf/premises004#A-and-B') - cod = O.make_var(var)+' = 1' + name = O.make_var(ClassDomain, var) + cod = name+' = 1' exec cod - assert O.make_var(var) in locals() - -def DONOT_test_subClassof(): + assert O.make_var(ClassDomain, var) in locals() + assert isinstance(O.variables[name], ClassDomain) + +def test_subClassof(): O = Ontology() - a = b = c = URIRef(u'http://www.w3.org/2002/03owlt/unionOf/premises004#A-and-B') + a = URIRef(u'http://www.w3.org/2002/03owlt/unionOf/premises004#A') + b = URIRef(u'http://www.w3.org/2002/03owlt/unionOf/premises004#B') + c = URIRef(u'http://www.w3.org/2002/03owlt/unionOf/premises004#C') O.subClassOf(b, None, a) O.subClassOf(c, None, b) - assert O.solve() - O.subClassOf(c, None, a) - assert O.solve() + A = O.make_var(ClassDomain, a) + C = O.make_var(ClassDomain, c) + 
assert len(O.variables) == 3 + assert O.variables[A] in O.variables[C].bases def test_ClassDomain(): a = ClassDomain() @@ -92,6 +97,13 @@ obj = URIRef('o') O = Ontology() O.type(sub, pred , obj) - assert O.variables[O.make_var(sub)].__class__ == ClassDomain + assert O.variables[O.make_var(ClassDomain, sub)].__class__ == ClassDomain + +def test_ObjectProperty(): + sub = URIRef('a') + pred = URIRef('type') + obj = URIRef(namespaces['owl']+'#ObjectProperty') + O = Ontology() + O.type(sub, pred , obj) + assert O.variables[O.make_var(ClassDomain, sub)].__class__ == ObjectProperty - From arigo at codespeak.net Tue Dec 20 15:30:08 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 20 Dec 2005 15:30:08 +0100 (CET) Subject: [pypy-svn] r21350 - pypy/dist/pypy/doc Message-ID: <20051220143008.3845227B54@code1.codespeak.net> Author: arigo Date: Tue Dec 20 15:30:05 2005 New Revision: 21350 Modified: pypy/dist/pypy/doc/dynamic-language-translation.txt Log: Typoish inconsistencies. Modified: pypy/dist/pypy/doc/dynamic-language-translation.txt ============================================================================== --- pypy/dist/pypy/doc/dynamic-language-translation.txt (original) +++ pypy/dist/pypy/doc/dynamic-language-translation.txt Tue Dec 20 15:30:05 2005 @@ -749,7 +749,7 @@ For a function ``f`` of the user program, we call *arg_f_1, ..., arg_f_n* the variables bound to the input arguments of ``f`` (which are actually the input variables of the first block in the flow graph of -``f``) and *return_f* the variable bound to the return value of ``f`` +``f``) and *returnvar_f* the variable bound to the return value of ``f`` (which is the single input variable of a special empty "return" block ending the flow graph). @@ -986,7 +986,7 @@ Also ``[note]`` that we do not generally try to prove the correctness and safety of the user program, preferring to rely on test coverage for -that. This is apparent in the third rule above, which considers +that. 
This is apparent in the last rule above, which considers concatenation of two potentially "nullable" strings, i.e. strings that the annotator could not prove to be non-None. Instead of reporting an error, we take it as a hint that the two strings will not actually be From rxe at codespeak.net Tue Dec 20 16:10:06 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Tue, 20 Dec 2005 16:10:06 +0100 (CET) Subject: [pypy-svn] r21352 - in pypy/dist/pypy/translator/llvm: . module Message-ID: <20051220151006.D929C27B5A@code1.codespeak.net> Author: rxe Date: Tue Dec 20 16:10:04 2005 New Revision: 21352 Modified: pypy/dist/pypy/translator/llvm/externs2ll.py pypy/dist/pypy/translator/llvm/gc.py pypy/dist/pypy/translator/llvm/genllvm.py pypy/dist/pypy/translator/llvm/module/boehm.h pypy/dist/pypy/translator/llvm/module/genexterns.c Log: * Make extern code a little less boehm specific * Actually really cache our generated ll file for externs while running tests Modified: pypy/dist/pypy/translator/llvm/externs2ll.py ============================================================================== --- pypy/dist/pypy/translator/llvm/externs2ll.py (original) +++ pypy/dist/pypy/translator/llvm/externs2ll.py Tue Dec 20 16:10:04 2005 @@ -146,7 +146,7 @@ includestr += "-I %s " % ii return includestr -def generate_llfile(db, extern_decls, entrynode, standalone): +def generate_llfile(db, extern_decls, entrynode, standalone, gcpolicy): ccode = [] function_names = [] @@ -178,7 +178,14 @@ else: assert False, "unhandled extern_decls %s %s %s" % (c_name, type(obj), obj) - # start building our source + + # include this early to get constants and macros for any further includes + ccode.append('#include \n') + + # ask gcpolicy for any code needed + ccode.append('%s\n' % gcpolicy.genextern_code()) + + # append our source file ccode = "".join(ccode) ccode += open(get_genexterns_path()).read() Modified: pypy/dist/pypy/translator/llvm/gc.py 
============================================================================== --- pypy/dist/pypy/translator/llvm/gc.py (original) +++ pypy/dist/pypy/translator/llvm/gc.py Tue Dec 20 16:10:04 2005 @@ -5,6 +5,9 @@ def __init__(self): raise Exception, 'GcPolicy should not be used directly' + def genextern_code(self): + return "" + def gc_libraries(self): return [] @@ -50,6 +53,9 @@ def __init__(self): self.n_malloced = 0 + def genextern_code(self): + return '#include "boehm.h"' + def gc_libraries(self): return ['gc', 'pthread'] # XXX on windows? Modified: pypy/dist/pypy/translator/llvm/genllvm.py ============================================================================== --- pypy/dist/pypy/translator/llvm/genllvm.py (original) +++ pypy/dist/pypy/translator/llvm/genllvm.py Tue Dec 20 16:10:04 2005 @@ -171,11 +171,12 @@ # we only cache the llexterns to make tests run faster if self.llexterns_header is None: assert self.llexterns_functions is None - self.llexterns_header, self.llexterns_functions = \ + GenLLVM.llexterns_header, GenLLVM.llexterns_functions = \ generate_llfile(self.db, self.extern_decls, self.entrynode, - self.standalone) + self.standalone, + self.gcpolicy) def create_codewriter(self): # prevent running the same function twice in a test Modified: pypy/dist/pypy/translator/llvm/module/boehm.h ============================================================================== --- pypy/dist/pypy/translator/llvm/module/boehm.h (original) +++ pypy/dist/pypy/translator/llvm/module/boehm.h Tue Dec 20 16:10:04 2005 @@ -1,3 +1,5 @@ +// XXX use some form of "configure" script +// disable this for boehm compiled without threading #define USING_THREADED_BOEHM #ifdef USING_THREADED_BOEHM @@ -22,8 +24,8 @@ } extern GC_all_interior_pointers; - char *RPython_StartupCode() { - GC_all_interior_pointers = 0; + +// startup specific code for boehm +#define __GC_STARTUP_CODE__ \ + GC_all_interior_pointers = 0; \ GC_init(); - return LLVM_RPython_StartupCode(); -} Modified: 
pypy/dist/pypy/translator/llvm/module/genexterns.c ============================================================================== --- pypy/dist/pypy/translator/llvm/module/genexterns.c (original) +++ pypy/dist/pypy/translator/llvm/module/genexterns.c Tue Dec 20 16:10:04 2005 @@ -16,9 +16,6 @@ RPyListOfString *_RPyListOfString_New(int); void _RPyListOfString_SetItem(RPyListOfString *, int, RPyString *); -// include this to get constants and macros for below includes -#include - // overflows/zeros/values raising operations #include "raisingop.h" @@ -47,7 +44,7 @@ return NULL; } - +// raw malloc code char *raw_malloc(int size) { return malloc(size); } @@ -62,8 +59,13 @@ char *LLVM_RPython_StartupCode(); -// boehm includes -#include "boehm.h" +char *RPython_StartupCode() { + + // is there any garbage collection / memory management initialisation + __GC_STARTUP_CODE__ + + return LLVM_RPython_StartupCode(); +} #ifdef ENTRY_POINT_DEFINED From pedronis at codespeak.net Tue Dec 20 23:02:49 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 20 Dec 2005 23:02:49 +0100 (CET) Subject: [pypy-svn] r21375 - pypy/dist/pypy/doc Message-ID: <20051220220249.CFC2427B5A@code1.codespeak.net> Author: pedronis Date: Tue Dec 20 23:02:48 2005 New Revision: 21375 Modified: pypy/dist/pypy/doc/dynamic-language-translation.txt Log: backport consistency change Modified: pypy/dist/pypy/doc/dynamic-language-translation.txt ============================================================================== --- pypy/dist/pypy/doc/dynamic-language-translation.txt (original) +++ pypy/dist/pypy/doc/dynamic-language-translation.txt Tue Dec 20 23:02:48 2005 @@ -1860,8 +1860,8 @@ arguments via a ``*args``, the default specialization is by the number of extra arguments. 
(This follows naturally from the fact that the extended annotation lattice we use has annotations of the form - ``Tuple(A_1, ..., A_n)`` representing a heterogeneous tuple of length - *n* whose items are respectively annotated with ``A_1, ..., A_n``, but + ``Tuple(a_1, ..., a_n)`` representing a heterogeneous tuple of length + *n* whose items are respectively annotated with ``a_1, ..., a_n``, but there is no annotation for tuples of unknown length.) * ctr_location -- for classes. A fresh independent copy of the class is From ale at codespeak.net Wed Dec 21 11:34:04 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Wed, 21 Dec 2005 11:34:04 +0100 (CET) Subject: [pypy-svn] r21392 - in pypy/dist/pypy/lib/pyontology: . test Message-ID: <20051221103404.1E39C27B58@code1.codespeak.net> Author: ale Date: Wed Dec 21 11:34:02 2005 New Revision: 21392 Modified: pypy/dist/pypy/lib/pyontology/pyontology.py pypy/dist/pypy/lib/pyontology/test/test_ontology.py Log: Make a test pass. Some cleanup Modified: pypy/dist/pypy/lib/pyontology/pyontology.py ============================================================================== --- pypy/dist/pypy/lib/pyontology/pyontology.py (original) +++ pypy/dist/pypy/lib/pyontology/pyontology.py Wed Dec 21 11:34:02 2005 @@ -249,8 +249,6 @@ if (type(var) == URIRef and not (var in [URIRef(namespaces['owl']+'#'+x) for x in builtin_voc])): # var is not one of the builtin classes - -# if self.variables[avar].values: self.variables[svar].values += self.variables[avar].values constrain = BinaryExpression([svar, avar],"%s in %s" %(svar, avar)) self.constraints.append(constrain) @@ -258,14 +256,13 @@ # var is a builtin class self.variables[svar] = builtin_voc[var.split('#')[-1]]() - def first(self, s, p, var): pass def rest(self, s, p, var): pass -# --------- Class Axioms --------------------- +#---Class Axioms---#000000#FFFFFF----------------------------------------------- def subClassOf(self, s, p, var): # s is a subclass of var means that 
the @@ -273,12 +270,10 @@ # class extension of var. avar = self.make_var(ClassDomain, var) svar = self.make_var(ClassDomain, s) - res = get_bases(self.variables[avar], self.variables) - self.variables[svar].bases.extend(res.keys()) + cons = SubClassConstraint( svar, avar) + self.constraints.append(cons) def equivalentClass(self, s, p, var): - avar = self.make_var(ClassDomain, var) - svar = self.make_var(ClassDomain, s) self.subClassOf(s, p, var) self.subClassOf(var, p, s) @@ -290,7 +285,9 @@ def complementOf(self, s, p, var): # add constraint of not var - pass + # TODO: implementthis for OWL DL + avar = self.make_var(ClassDomain, var) + svar = self.make_var(ClassDomain, s) def oneOf(self, s, p, var): res = self.get_list(var) @@ -314,28 +311,35 @@ result.pop(cls) return result.keys() -#---------Property axioms-------------------- +#---Property Axioms---#000000#FFFFFF-------------------------------------------- def range(self, s, p, var): - pass + avar = self.make_var(ClassDomain, var) + svar = self.make_var(Property, s) + cons = RangeConstraint( svar, avar) + self.constraints.append(cons) + def domain(self, s, p, var): - avar = self.make_var(CassDomain, var) + avar = self.make_var(ClassDomain, var) svar = self.make_var(Property, s) assert isinstance(self.variables[svar], Property) assert isinstance(self.variables[avar], ClassDomain) self.variables[avar].properties[svar] = self.variables[svar] def subPropertyOf(self, s, p, var): + # TODO: implement this pass def equivalentProperty(self, s, p, var): + # TODO: implement this pass def inverseOf(self, s, p, var): + # TODO: implement this pass -#------------------------------------------- +#---Label---#000000#FFFFFF------------------------------------------------------ def maxCardinality(self, s, p, var): """ Len of finite domain of the property shall be less than or equal to var""" @@ -373,21 +377,27 @@ self.constraints.append(constrain) def onProperty(self, s, p, var): + # TODO: implement this pass def hasValue(self, s, 
p, var): + # TODO: implement this pass def allValuesFrom(self, s, p, var): + # TODO: implement this pass def someValuesFrom(self, s, p, var): + # TODO: implement this pass def someValuesFrom(self, s, p, var): + # TODO: implement this pass def imports(self, s, p, var): + # TODO: implement this pass # ----------------- Helper classes ---------------- @@ -441,15 +451,16 @@ else: return 1 -def get_bases(cls_dom, domains): + +def get_values(dom, domains, attr = 'values'): res = {} - for bas in cls_dom.bases: - res[bas] = 1 - if bas in domains.keys(): - res.update( get_bases(bas, domains)) - res[cls_dom] = 1 + for val in getattr(dom, attr): + res[val] = 1 + if val in domains.keys(): + res.update( get_values(val, domains, attr)) + res[dom] = 1 return res - + class SubClassConstraint(AbstractConstraint): def __init__(self, variable, cls_or_restriction): @@ -462,31 +473,34 @@ def narrow(self, domains): subdom = domains[self.variable] superdom = domains[self.super] - bases = get_bases(superdom, domains).keys() + bases = get_values(superdom, domains, 'bases').keys() print subdom,superdom, bases, subdom.bases subdom.bases += [bas for bas in bases if bas not in subdom.bases] - -class EquivalentClassConstraint(AbstractConstraint): + vals = get_values(subdom, domains, 'values') + superdom.values += [val for val in vals if val not in superdom.values] + +class DisjointClassConstraint(AbstractConstraint): def __init__(self, variable, cls_or_restriction): AbstractConstraint.__init__(self, [variable]) # worst case complexity self.__cost = 1 #len(variables) * (len(variables) - 1) / 2 - self.other = cls_or_restriction + self.super = cls_or_restriction self.variable = variable def narrow(self, domains): subdom = domains[self.variable] - otherdom = domains[self.other] - bases = get_bases(subdom, domains).keys() - otherbases = get_bases(otherdom, domains).keys() - print subdom, otherdom, "----",bases , otherbases - if bases != otherbases: - raise ConsistencyFailure() - else: - return 1 + 
superdom = domains[self.super] + bases = get_values(superdom, domains, 'bases').keys() + print subdom,superdom, bases, subdom.bases + subdom.bases += [bas for bas in bases if bas not in subdom.bases] + vals1 = get_values(superdom, domains, 'values').keys() + vals2 = get_values(variable, domains, 'values').keys() + for i in vals1: + if i in vals2: + raise ConsistencyError -class DisjointClassConstraint(AbstractConstraint): +class ComplementClassConstraint(AbstractConstraint): def __init__(self, variable, cls_or_restriction): AbstractConstraint.__init__(self, [variable]) @@ -498,7 +512,4 @@ def narrow(self, domains): subdom = domains[self.variable] superdom = domains[self.super] - bases = get_bases(superdom, domains).keys() - print subdom,superdom, bases, subdom.bases - subdom.bases += [bas for bas in bases if bas not in subdom.bases] - + \ No newline at end of file Modified: pypy/dist/pypy/lib/pyontology/test/test_ontology.py ============================================================================== --- pypy/dist/pypy/lib/pyontology/test/test_ontology.py (original) +++ pypy/dist/pypy/lib/pyontology/test/test_ontology.py Wed Dec 21 11:34:02 2005 @@ -28,6 +28,8 @@ O.subClassOf(c, None, b) A = O.make_var(ClassDomain, a) C = O.make_var(ClassDomain, c) + for con in O.constraints: + con.narrow(O.variables) assert len(O.variables) == 3 assert O.variables[A] in O.variables[C].bases @@ -81,15 +83,16 @@ assert len(c.bases) == len(a.bases) assert [bas in a.bases for bas in c.bases] == [True]*len(a.bases) -def DONOT_test_equivalentClass(): - a = ClassDomain('A') - b = ClassDomain('B') - c = ClassDomain('C') - con = EquivalentClassConstraint('c','a') - con2 = EquivalentClassConstraint('c','b') - con.narrow({'a': a, 'b': b, 'c': c}) - con2.narrow({'a': a, 'b': b, 'c': c}) - assert a == b +def test_equivalentClass(): + a = URIRef('A') + b = URIRef('B') + c = URIRef('C') + O = Ontology() + O.equivalentClass(c, None, a) + O.equivalentClass(c, None, b) + A = 
O.make_var(ClassDomain, a) + B = O.make_var(ClassDomain, b) + assert O.variables[A].values == O.variables[B].values def test_type(): sub = URIRef('a') From mwh at codespeak.net Wed Dec 21 16:31:26 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Wed, 21 Dec 2005 16:31:26 +0100 (CET) Subject: [pypy-svn] r21417 - pypy/dist/pypy/doc/weekly Message-ID: <20051221153126.46A9A27B52@code1.codespeak.net> Author: mwh Date: Wed Dec 21 16:31:23 2005 New Revision: 21417 Modified: pypy/dist/pypy/doc/weekly/summary-2005-12-16.txt Log: 'last' week in pypy now nearly ready :/ Modified: pypy/dist/pypy/doc/weekly/summary-2005-12-16.txt ============================================================================== --- pypy/dist/pypy/doc/weekly/summary-2005-12-16.txt (original) +++ pypy/dist/pypy/doc/weekly/summary-2005-12-16.txt Wed Dec 21 16:31:23 2005 @@ -21,7 +21,7 @@ The Sprint! =========== -The last weekly summary was written towards then end of the sprint. +The last weekly summary was written towards the end of the sprint. The things we did in the couple of remaining days were written up in the second sprint report: @@ -60,35 +60,43 @@ PyPy's Bytecode Dispatcher ========================== -Until now the bytecode dispatching in PyPy was done using a list of functions -that contain the implementation of the respective bytecode. The dispatch -function retrieved the correct function by using the bytecode as an index into -this list. This was turned by the translator and the C backend into an array of -function pointers. This has the drawback that the bytecode implementing -functions can never be inlined (although some of them are quite small) and -there always is a read from memory for every bytecode. - -During the Gothenburg sprint we discussed a strategy to transform the dispatch -code into something more efficient. During the last week Eric, Arre and Carl -Friedrich implemented this strategy. 
Now the dispatching is done with a huge -chain of if/elif/else that all test the value of the same variable. In addition -there is a transformation that transforms chains of such if/elif/else blocks -into a block that has an integer variable as an exitswitch and links which -exitcases corresponding to the different values of the single integer variable. -The C backend outputs such a block as a switch. In addition this technique -makes it possible for our inliner to inline some of the bytecode implementing -functions work. Using the new dispatcher pypy-c got XXX slower and is now XXX -times slower thatn CPython. +Something that was suggested but never got around to at the last +sprint was to modify the translation process so that the bytecode +dispatch loop of the interpreter used a C switch rather than a table +of function pointers. + +The bytecode implementation code in PyPy builds a list of functions +that contain the implementation of the respective bytecode. Up until +a few days ago, the dispatch function retrieved the correct function +by using the bytecode as an index into this list. This was turned by +the translator and the C backend into an array of function +pointers. This has the drawback that the bytecode-implementing +functions can never be inlined (even though some of them are quite +small) and there always is a read from memory for every bytecode +executed. + +During the Gothenburg sprint we discussed a strategy to transform +the dispatch code into something more efficient, and in the last week +Eric, Arre and Carl Friedrich implemented this strategy. Now the +dispatching is done by a huge (automatically generated, of course) +chain of if/elif/else that all test the value of the same variable. +In addition there is a transformation that transforms chains of such +if/elif/else blocks into a block that has an integer variable as an +exitswitch and links with exitcases corresponding to the different +values of the single integer variable. 
The C backend converts such a +block into a switch. In addition this technique makes it possible for +our inliner to inline some of the bytecode-implementing functions. +Using the new dispatcher pypy-c got 10% or so faster (though +the *first* time we ran it, it was much much faster! Benchmarking is +hard). Preparations for EU-review still ongoing =========================================== -Still many developers are involved in ongoing -preparations for the EU review on 20th January. -Reports are being finalized and there are discussions -about various issues that are only indirectly related -to the development efforts (in so far as it provides -the basis for the partial funding we receive). -We probably will only know on the 20th if everything -works out suitably. +Many developers are still involved in preparations for the EU review +on 20th January. Reports are being finalized and there are +discussions about various issues that are only indirectly related to +the development efforts (in so far as it provides the basis for the +partial funding we receive). We probably will only know on the 20th +if everything works out suitably. 
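The difference between the two dispatch styles described in the summary above can be illustrated with a tiny toy interpreter. This is purely an illustrative sketch, not PyPy's actual interpreter or translator code; the opcode numbers and helper names are invented for the example.

```python
# Toy illustration of the two bytecode-dispatch styles.
# NOT PyPy's real code: opcodes and helpers are made up.

def push_one(stack):
    stack.append(1)

def add(stack):
    b = stack.pop()
    a = stack.pop()
    stack.append(a + b)

# Style 1: a table of functions.  A C backend turns this into an
# array of function pointers, so every bytecode costs an indirect
# call plus a memory read, and the small helpers cannot be inlined.
TABLE = {0: push_one, 1: add}

def dispatch_table(code):
    stack = []
    for opcode in code:
        TABLE[opcode](stack)
    return stack

# Style 2: an if/elif/else chain that always tests the same variable.
# This is the shape a C backend can compile to a switch, after which
# small cases can be inlined straight into the dispatch loop.
def dispatch_switch(code):
    stack = []
    for opcode in code:
        if opcode == 0:
            stack.append(1)
        elif opcode == 1:
            b = stack.pop()
            a = stack.pop()
            stack.append(a + b)
        else:
            raise ValueError("unknown opcode %d" % opcode)
    return stack
```

Both styles compute the same result; the point is only how the dispatch step lowers to machine code.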
From ericvrp at codespeak.net Wed Dec 21 16:40:35 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Wed, 21 Dec 2005 16:40:35 +0100 (CET) Subject: [pypy-svn] r21421 - pypy/dist/pypy/translator/microbench Message-ID: <20051221154035.1AF1A27B54@code1.codespeak.net> Author: ericvrp Date: Wed Dec 21 16:40:34 2005 New Revision: 21421 Added: pypy/dist/pypy/translator/microbench/ pypy/dist/pypy/translator/microbench/microbench.py (contents, props changed) pypy/dist/pypy/translator/microbench/test_count1.py pypy/dist/pypy/translator/microbench/test_create1.py Log: basic framework to stuff our micro benchmarks into Added: pypy/dist/pypy/translator/microbench/microbench.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/microbench/microbench.py Wed Dec 21 16:40:34 2005 @@ -0,0 +1,43 @@ +#!/usr/bin/python + +import os, time, sys + +microbenches = [] +for fname in os.listdir('.'): + if not fname.startswith('test_') or not fname.endswith('.py'): + continue + microbench = fname[:-3] + exec 'import ' + microbench + microbenches.append(microbench) + +def run(): + MINIMUM_MICROBENCH_TIME = 2.5 + + for microbench in microbenches: + for k in [s for s in globals()[microbench].__dict__ if s.startswith('test_')] : + testcase = microbench + '.' 
+ k + '()' + start = time.clock() + n = 0 + duration = 0.0 + while duration < MINIMUM_MICROBENCH_TIME: + exec testcase + n += 1 + duration = time.clock() - start + print '%s took %.2f seconds' % (testcase, duration / float(n)) + +if __name__ == '__main__': + for n, exe in enumerate(sys.argv[1:3]): + print 'exe:', exe + data = [s for s in os.popen(exe + ' microbench.py 2>&1').readlines() if not s.startswith('debug:')] + benchdata = {} + for d in data: + testcase, took, duration, seconds = d.split() + benchdata[testcase] = float(duration) + if n == 0: + benchdata_ref = benchdata + else: + for k, v in benchdata.iteritems(): + print '%s %.2fx slower' % (k, v / benchdata_ref[k]) + + if len(sys.argv) == 1: + run() Added: pypy/dist/pypy/translator/microbench/test_count1.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/microbench/test_count1.py Wed Dec 21 16:40:34 2005 @@ -0,0 +1,19 @@ +N = int(2**19 - 1) + +def test_loop(): + x = 0 + n = N + while x < n: + x = x + 1 + +# +def plus1(x): + return x + 1 + +def test_call_function(): + x = 0 + n = N + while x < n: + x = plus1(x) + +# Added: pypy/dist/pypy/translator/microbench/test_create1.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/microbench/test_create1.py Wed Dec 21 16:40:34 2005 @@ -0,0 +1,15 @@ +LOOPS = 1 << 18 + +class Foo: + pass + +def test_simple_loop(): + i = 0 + while i < LOOPS: + i += 1 + +def test_simple_loop_with_class_creation(): + i = 0 + while i < LOOPS: + Foo() + i += 1 From bea at codespeak.net Wed Dec 21 17:23:04 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Wed, 21 Dec 2005 17:23:04 +0100 (CET) Subject: [pypy-svn] r21434 - pypy/extradoc/sprintinfo Message-ID: <20051221162304.ED9B927B58@code1.codespeak.net> Author: bea Date: Wed Dec 21 17:22:32 2005 New Revision: 21434 Added: 
pypy/extradoc/sprintinfo/mallorca-sprint_announcement.txt Log: draft version of the mallorca sprint announcement - lacking technical goals...and the people page Added: pypy/extradoc/sprintinfo/mallorca-sprint_announcement.txt ============================================================================== --- (empty file) +++ pypy/extradoc/sprintinfo/mallorca-sprint_announcement.txt Wed Dec 21 17:22:32 2005 @@ -0,0 +1,122 @@ +Palma de Mallorca PyPy Sprint II: 23rd - 29th January 2006 +========================================================== + +The next PyPy sprint is scheduled to be in January 2006 in Palma de Mallorca, +Balearic Isles, Spain. We'll give newcomer-friendly +introductions. To learn more about the new PyPy Python-in-Python +implementation look here: + + http://codespeak.net/pypy + +Goals and topics of the sprint +------------------------------ + + +.. _`pypy-0.8.0`: http://codespeak.net/pypy/dist/pypy/doc/release-0.8.0.html + +Location & Accommodation +------------------------ + +The sprint will be held at the Palma University (UIB - Universitat de les Illes Balears), +in their GNU/Linux lab (http://mnm.uib.es/phpwiki/AulaLinux). +We are hosted by the Computer Science department and Ricardo Galli +is our contact person there, helping with arranging facilities. + +The University is located 7 km away from central Palma. Buses to +the University depart from "Plaza de España" (which is a +very central location in Palma). Take bus 19 to the UIB campus. +A ticket for one urban trip costs 1 euro. You can also buy a card that is +valid for 10 trips and costs 7.51 euros. +Information about bus timetables and routes can be found on: + +http://www.a-palma.es. + +A map of the UIB campus area can be found on: + +http://www.uib.es/imagenes/planoCampus.html +The actual address is: 3r pis de l'Anselm Turmeda, which can be found +on the UIB Campus map. + +At "Plaza de España" there is a hostel (Hostal Residencia Terminus) +which has been recommended to us. 
It's cheap (ca 50 euros/double +room with bathroom). Some more links to accommodations (flats, +student homes and hotels): + +http://www.lodging-in-spain.com/hotel/town/Islas_Baleares,Mallorca,Palma_de_Mallorca,1/ + +http://www.uib.es/fuguib/residencia/english/index.html + +http://www.homelidays.com/EN-Holidays-Rental/110_Search/SearchList.asp?DESTINATION=Palma%20de%20Mallorca&ADR_PAYS=ES&ADR_ +LOCALISATION=ES%20ISLASBALEARES%20MALLORCA + +If you want to find a given street, you can search here: + +http://www.callejeando.com/Pueblos/pueblo7_1.htm + +To get to Palma de Mallorca, almost all low-fare airlines and travel agencies +have cheap tickets to get there. Information about Mallorca and Palma +(maps, tourist information, local transports, recommended airlines, ferries +and much more) can be found on: + +http://www.palmademallorca.es/portalPalma/home.jsp + +Comments on the weather: in January it is cold and wet on Mallorca. + +Average temperature: 8,4 degrees Celsius +Lowest temperature: 2 degrees Celsius +Highest temperature: 14,5 degrees Celsius +Average humidity rate: 77,6 % + +So more time for coding and less time for sunbathing and beaches ;-) + +Exact times +----------- + +The public PyPy sprint is held Monday 23rd - Sunday 29th January +2006. Hours will be from 10:00 until people have had enough. +It's a good idea to arrive a day before the sprint starts and leave a day +later. In the middle of the sprint there usually is a break day and +it's usually ok to take half-days off if you feel like it. + +For this particular break day, Thursday, we are invited to the studio of Ginés Quiñonero, +a local artist and painter. Ginés has also been the person helping us +get connections to UIB and providing much appreciated help regarding +accommodation and other logistical information. 
+ +For those of you interested - here is his website, which also has paintings +showing his studio: + +http://www.hermetex4.com/damnans/ + +For those interested in playing collectable card games, this will also be an +opportunity to get acquainted with V:TES, which will be demoed by Ginés and +Beatrice and Sten Düring. For more information on this card game - see: +http://www.white-wolf.com/vtes/index.php. +(The Mallorca sprint was organized through contacts within the V:TES +community). + +Network, Food, currency +------------------------ + +Currency is Euro. + +Food is available in the UIB Campus area as well as in cheap restaurants in Palma. + +You normally need a wireless network card to access the network, but we +can provide a wireless/ethernet bridge. + +230V AC plugs are used in Mallorca. + +Registration etc.pp. +-------------------- + +Please subscribe to the `PyPy sprint mailing list`_, introduce yourself +and post a note that you want to come. Feel free to ask any questions +there! There also is a separate `Mallorca people`_ page tracking who +is expected to come. If you have commit rights on codespeak then +you can add yourself in a checkout of + + http://codespeak.net/svn/pypy/extradoc/sprintinfo/mallorca-2006/people.txt + +.. _`PyPy sprint mailing list`: http://codespeak.net/mailman/listinfo/pypy-sprint +.. 
_`Mallorca people`: http://codespeak.net/pypy/extradoc/sprintinfo/mallorca-2006/people.html From pedronis at codespeak.net Wed Dec 21 17:47:44 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Wed, 21 Dec 2005 17:47:44 +0100 (CET) Subject: [pypy-svn] r21437 - pypy/dist/pypy/doc Message-ID: <20051221164744.F0E3627B62@code1.codespeak.net> Author: pedronis Date: Wed Dec 21 17:47:43 2005 New Revision: 21437 Modified: pypy/dist/pypy/doc/news.txt Log: compliancy -> compliance Modified: pypy/dist/pypy/doc/news.txt ============================================================================== --- pypy/dist/pypy/doc/news.txt (original) +++ pypy/dist/pypy/doc/news.txt Wed Dec 21 17:47:43 2005 @@ -79,7 +79,7 @@ The last `PyPy sprint`_ took place at the Heidelberg University in Germany from 22nd August to 29th August (both days included). Its main focus is translation of the whole PyPy interpreter -to a low level language and reaching 2.4.1 Python compliancy. +to a low level language and reaching 2.4.1 Python compliance. The goal of the sprint is to release a first self-contained PyPy-0.7 version. Carl has written a report about `day 1 - 3`_, there are some pictures_ online and a `heidelberg summary report`_ From rxe at codespeak.net Wed Dec 21 18:38:38 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Wed, 21 Dec 2005 18:38:38 +0100 (CET) Subject: [pypy-svn] r21440 - in pypy/dist/pypy/translator/llvm: . 
module Message-ID: <20051221173838.0887327B54@code1.codespeak.net> Author: rxe Date: Wed Dec 21 18:38:34 2005 New Revision: 21440 Removed: pypy/dist/pypy/translator/llvm/varsize.py Modified: pypy/dist/pypy/translator/llvm/arraynode.py pypy/dist/pypy/translator/llvm/build_llvm_module.py pypy/dist/pypy/translator/llvm/codewriter.py pypy/dist/pypy/translator/llvm/exception.py pypy/dist/pypy/translator/llvm/externs2ll.py pypy/dist/pypy/translator/llvm/funcnode.py pypy/dist/pypy/translator/llvm/gc.py pypy/dist/pypy/translator/llvm/genllvm.py pypy/dist/pypy/translator/llvm/module/support.py pypy/dist/pypy/translator/llvm/opaquenode.py pypy/dist/pypy/translator/llvm/opwriter.py pypy/dist/pypy/translator/llvm/pyxwrapper.py pypy/dist/pypy/translator/llvm/structnode.py Log: Fairly hefty refactoring mostly to codewriter.py (now all llvm writing is emitted through this) and funcnode.py (for stackless support). Fixed an obscure bug in exception handling which allows pypy-llvm now to translate again (was failing 2 times of 3 before this). Only ExplicitExceptionPolicy and BoehmGcPolicy working for now (or at least only these have been tested). 
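The "all llvm writing is emitted through codewriter.py" idea described in the log above can be sketched roughly as follows. This is a simplified stand-in, not the real pypy/translator/llvm CodeWriter; the method set is reduced and signatures only loosely follow the diff below.

```python
# Rough sketch of the refactoring: every line of emitted LLVM
# assembly goes through one CodeWriter object instead of being
# written ad hoc.  Simplified; not the real codewriter.py.
import io

class ToyCodeWriter(object):
    def __init__(self, file):
        self.file = file

    def _append(self, line):
        # the single point through which all output flows
        self.file.write(line + '\n')

    def _indent(self, line):
        self._append("    " + line)

    def typedef(self, name, type_):
        self._append("%s = type %s" % (name, type_))

    def structdef(self, name, typereprs):
        # structs are just a braced special case of typedef
        self.typedef(name, "{ %s }" % ", ".join(typereprs))

    def ret(self, type_, ref):
        if type_ == 'void':
            self._indent("ret void")
        else:
            self._indent("ret %s %s" % (type_, ref))

buf = io.StringIO()
writer = ToyCodeWriter(buf)
writer.structdef("%pair", ["int", "double"])
writer.ret("void", None)
```

Funneling all emission through `_append` is what makes changes like the `typedef` reshuffle in the diff below a local edit rather than a search through the whole backend.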
Modified: pypy/dist/pypy/translator/llvm/arraynode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/arraynode.py (original) +++ pypy/dist/pypy/translator/llvm/arraynode.py Wed Dec 21 18:38:34 2005 @@ -1,7 +1,6 @@ from pypy.rpython.lltypesystem import lltype from pypy.translator.llvm.log import log from pypy.translator.llvm.node import LLVMNode, ConstantLLVMNode -from pypy.translator.llvm import varsize log = log.structnode class ArrayTypeNode(LLVMNode): @@ -53,11 +52,9 @@ def writeimpl(self, codewriter): log.writeimpl(self.ref) - varsize.write_constructor(self.db, codewriter, self.ref, - self.constructor_decl, - self.array, - atomic=self.array._is_atomic()) - + gp = self.db.gcpolicy + gp.write_constructor(codewriter, self.ref, self.constructor_decl, + self.array, atomic=self.array._is_atomic()) class VoidArrayTypeNode(LLVMNode): __slots__ = "db array ref".split() @@ -69,7 +66,7 @@ self.ref = "%arraytype_Void" def writedatatypedecl(self, codewriter): - codewriter.typedef(self.ref, self.db.get_machine_word()) + codewriter.typedef(self.ref, "{ %s }" % self.db.get_machine_word()) class ArrayNode(ConstantLLVMNode): """ An arraynode. 
Elements can be Modified: pypy/dist/pypy/translator/llvm/build_llvm_module.py ============================================================================== --- pypy/dist/pypy/translator/llvm/build_llvm_module.py (original) +++ pypy/dist/pypy/translator/llvm/build_llvm_module.py Wed Dec 21 18:38:34 2005 @@ -67,7 +67,7 @@ source_files = [] from distutils.sysconfig import EXEC_PREFIX object_files = ["-L%s/lib" % EXEC_PREFIX] - library_files = genllvm.gcpolicy.gc_libraries() + library_files = genllvm.db.gcpolicy.gc_libraries() gc_libs = ' '.join(['-l' + lib for lib in library_files]) if optimize: @@ -96,14 +96,14 @@ cmds = ["llvm-as < %s.ll | opt %s -f -o %s.bc" % (b, optimization_switches, b)] if not use_gcc: - cmds.append("llc %s %s.bc -f -o %s.s" % (genllvm.exceptionpolicy.llc_options(), b, b)) + cmds.append("llc %s %s.bc -f -o %s.s" % (genllvm.db.exceptionpolicy.llc_options(), b, b)) cmds.append("as %s.s -o %s.o" % (b, b)) if exe_name: cmd = "gcc %s.o %s %s -lm -pipe -o %s" % (b, gc_libs_path, gc_libs, exe_name) cmds.append(cmd) object_files.append("%s.o" % b) else: - cmds.append("llc %s %s.bc -march=c -f -o %s.c" % (genllvm.exceptionpolicy.llc_options(), b, b)) + cmds.append("llc %s %s.bc -march=c -f -o %s.c" % (genllvm.db.exceptionpolicy.llc_options(), b, b)) if exe_name: cmd = "gcc %s.c -c -O2 -pipe" % b if profile: Modified: pypy/dist/pypy/translator/llvm/codewriter.py ============================================================================== --- pypy/dist/pypy/translator/llvm/codewriter.py (original) +++ pypy/dist/pypy/translator/llvm/codewriter.py Wed Dec 21 18:38:34 2005 @@ -6,21 +6,43 @@ DEFAULT_CCONV = 'fastcc' #ccc/fastcc class CodeWriter(object): - def __init__(self, f, genllvm): - self.f = f - self.genllvm = genllvm - self.word = genllvm.db.get_machine_word() - self.uword = genllvm.db.get_machine_uword() + def __init__(self, file, db): + self.file = file + self.word_repr = db.get_machine_word() + + def _resolvetail(self, tail, cconv): + # from: 
http://llvm.cs.uiuc.edu/docs/LangRef.html + # The optional "tail" marker indicates whether the callee function + # accesses any allocas or varargs in the caller. If the "tail" marker + # is present, the function call is eligible for tail call + # optimization. Note that calls may be marked "tail" even if they do + # not occur before a ret instruction. - def append(self, line): - self.f.write(line + '\n') + if cconv is not 'fastcc': + tail_ = '' + else: + tail_ = tail + if tail_: + tail_ += ' ' + return tail_ + # keep these two internal for now - incase we try a different API + def _append(self, line): + self.file.write(line + '\n') + + def _indent(self, line): + self._append(" " + line) + + def write_lines(self, lines): + for l in lines.split("\n"): + self._append(l) + def comment(self, line, indent=True): line = ";; " + line if indent: - self.indent(line) + self._indent(line) else: - self.append(line) + self._append(line) def header_comment(self, s): self.newline() @@ -28,139 +50,125 @@ self.newline() def newline(self): - self.append("") - - def indent(self, line): - self.append(" " + line) + self._append("") def label(self, name): self.newline() - self.append(" %s:" % name) + self._append(" %s:" % name) def globalinstance(self, name, typeandata): - self.append("%s = %s global %s" % (name, "internal", typeandata)) + self._append("%s = %s global %s" % (name, "internal", typeandata)) - def typedef(self, name, elements): - self.append("%s = type { %s }" % (name, elements)) + def typedef(self, name, type_): + self._append("%s = type %s" % (name, type_)) def structdef(self, name, typereprs): - self.typedef(name, ", ".join(typereprs)) + self.typedef(name, "{ %s }" % ", ".join(typereprs)) def arraydef(self, name, lentype, typerepr): - self.typedef(name, "%s, [0 x %s]" % (lentype, typerepr)) + self.typedef(name, "{ %s, [0 x %s] }" % (lentype, typerepr)) def funcdef(self, name, rettyperepr, argtypereprs): - self.append("%s = type %s (%s)" % (name, rettyperepr, - ", 
".join(argtypereprs))) + self.typedef(name, "%s (%s)" % (rettyperepr, + ", ".join(argtypereprs))) def declare(self, decl, cconv=DEFAULT_CCONV): - self.append("declare %s %s" %(cconv, decl,)) + self._append("declare %s %s" %(cconv, decl,)) def startimpl(self): self.newline() - self.append("implementation") + self._append("implementation") self.newline() def br_uncond(self, blockname): - self.indent("br label %%%s" %(blockname,)) + self._indent("br label %%%s" %(blockname,)) def br(self, cond, blockname_false, blockname_true): - self.indent("br bool %s, label %%%s, label %%%s" - % (cond, blockname_true, blockname_false)) + self._indent("br bool %s, label %%%s, label %%%s" + % (cond, blockname_true, blockname_false)) def switch(self, intty, cond, defaultdest, value_label): labels = '' for value, label in value_label: labels += ' %s %s, label %%%s' % (intty, value, label) - self.indent("switch %s %s, label %%%s [%s ]" - % (intty, cond, defaultdest, labels)) + self._indent("switch %s %s, label %%%s [%s ]" + % (intty, cond, defaultdest, labels)) - def openfunc(self, decl, is_entrynode=False, cconv=DEFAULT_CCONV): + def openfunc(self, decl, cconv=DEFAULT_CCONV): self.newline() - #if is_entrynode: - # linkage_type = '' - #else: - # linkage_type = 'internal ' - linkage_type = 'internal ' - self.append("%s%s %s {" % (linkage_type, cconv, decl,)) + self._append("internal %s %s {" % (cconv, decl,)) def closefunc(self): - self.append("}") + self._append("}") def ret(self, type_, ref): if type_ == 'void': - self.indent("ret void") + self._indent("ret void") else: - self.indent("ret %s %s" % (type_, ref)) + self._indent("ret %s %s" % (type_, ref)) def phi(self, targetvar, type_, refs, blocknames): - assert targetvar.startswith('%') - assert refs and len(refs) == len(blocknames), "phi node requires blocks" + assert len(refs) == len(blocknames), "phi node requires blocks" mergelist = ", ".join( ["[%s, %%%s]" % item for item in zip(refs, blocknames)]) s = "%s = phi %s %s" % 
(targetvar, type_, mergelist) - self.indent(s) + self._indent(s) def binaryop(self, name, targetvar, type_, ref1, ref2): - self.indent("%s = %s %s %s, %s" % (targetvar, name, type_, ref1, ref2)) + self._indent("%s = %s %s %s, %s" % (targetvar, name, type_, + ref1, ref2)) def shiftop(self, name, targetvar, type_, ref1, ref2): - self.indent("%s = %s %s %s, ubyte %s" % (targetvar, name, type_, ref1, ref2)) - - #from: http://llvm.cs.uiuc.edu/docs/LangRef.html - #The optional "tail" marker indicates whether the callee function accesses any - # allocas or varargs in the caller. If the "tail" marker is present, the function - # call is eligible for tail call optimization. Note that calls may be marked - # "tail" even if they do not occur before a ret instruction. - def call(self, targetvar, returntype, functionref, argrefs, argtypes, label=None, except_label=None, tail=DEFAULT_TAIL, cconv=DEFAULT_CCONV): - if cconv is not 'fastcc': - tail_ = '' - else: - tail_ = tail - if tail_: - tail_ += ' ' - args = ", ".join(["%s %s" % item for item in zip(argtypes, argrefs)]) - if except_label: - self.genllvm.exceptionpolicy.invoke(self, targetvar, tail_, cconv, returntype, functionref, args, label, except_label) - else: - if returntype == 'void': - self.indent("%scall %s void %s(%s)" % (tail_, cconv, functionref, args)) - else: - self.indent("%s = %scall %s %s %s(%s)" % (targetvar, tail_, cconv, returntype, functionref, args)) + self._indent("%s = %s %s %s, ubyte %s" % (targetvar, name, type_, + ref1, ref2)) def cast(self, targetvar, fromtype, fromvar, targettype): if fromtype == 'void' and targettype == 'void': - return - self.indent("%(targetvar)s = cast %(fromtype)s " - "%(fromvar)s to %(targettype)s" % locals()) - - def malloc(self, targetvar, type_, size=1, atomic=False, cconv=DEFAULT_CCONV): - for s in self.genllvm.gcpolicy.malloc(targetvar, type_, size, atomic, self.word, self.uword).split('\n'): - self.indent(s) + return + self._indent("%(targetvar)s = cast %(fromtype)s " + 
"%(fromvar)s to %(targettype)s" % locals()) + # XXX refactor - should only be one getelementptr def raw_getelementptr(self, targetvar, type, typevar, *indices): - word = self.word + word = self.word_repr res = "%(targetvar)s = getelementptr %(type)s %(typevar)s, " % locals() res += ", ".join(["%s %s" % (t, i) for t, i in indices]) - self.indent(res) + self._indent(res) def getelementptr(self, targetvar, type, typevar, *indices): - word = self.word - res = "%(targetvar)s = getelementptr %(type)s %(typevar)s, %(word)s 0, " % locals() + word = self.word_repr + res = "%(targetvar)s = getelementptr " \ + "%(type)s %(typevar)s, %(word)s 0, " % locals() res += ", ".join(["%s %s" % (t, i) for t, i in indices]) - self.indent(res) + self._indent(res) + + def load(self, target, targettype, ptr): + self._indent("%(target)s = load %(targettype)s* %(ptr)s" % locals()) - def load(self, targetvar, targettype, ptr): - self.indent("%(targetvar)s = load %(targettype)s* %(ptr)s" % locals()) + def store(self, valuetype, value, ptr): + l = "store %(valuetype)s %(value)s, %(valuetype)s* %(ptr)s" % locals() + self._indent(l) - def store(self, valuetype, valuevar, ptr): - self.indent("store %(valuetype)s %(valuevar)s, " - "%(valuetype)s* %(ptr)s" % locals()) - - def debugcomment(self, tempname, len, tmpname): - word = self.word - res = "%s = call ccc %(word)s (sbyte*, ...)* %%printf(" % locals() - res += "sbyte* getelementptr ([%s x sbyte]* %s, %(word)s 0, %(word)s 0) )" % locals() - res = res % (tmpname, len, tmpname) - self.indent(res) + def unwind(self): + self._indent("unwind") + + def call(self, targetvar, returntype, functionref, argrefs, argtypes, + tail=DEFAULT_TAIL, cconv=DEFAULT_CCONV): + + tail = self._resolvetail(tail, cconv) + args = ", ".join(["%s %s" % item for item in zip(argtypes, argrefs)]) + + if returntype == 'void': + self._indent("%scall %s void %s(%s)" % (tail, + cconv, + functionref, + args)) + else: + self._indent("%s = %scall %s %s %s(%s)" % (targetvar, + tail, + 
cconv, + returntype, + functionref, + args)) + Modified: pypy/dist/pypy/translator/llvm/exception.py ============================================================================== --- pypy/dist/pypy/translator/llvm/exception.py (original) +++ pypy/dist/pypy/translator/llvm/exception.py Wed Dec 21 18:38:34 2005 @@ -1,5 +1,6 @@ +from pypy.objspace.flow.model import c_last_exception from pypy.translator.llvm.codewriter import DEFAULT_CCONV - +from pypy.translator.llvm.backendopt.exception import create_exception_handling class ExceptionPolicy: RINGBUGGER_SIZE = 8192 @@ -24,7 +25,7 @@ } ''' % (RINGBUFFER_ENTRY_MAXSIZE, RINGBUGGER_OVERSIZE, RINGBUGGER_SIZE-1) - def __init__(self): + def __init__(self, db): raise Exception, 'ExceptionPolicy should not be used directly' def transform(self, translator, graph=None): @@ -42,31 +43,57 @@ return r + ' 0' return r + ' null' + def _noresult2(self, returntype): + r = returntype.strip() + if r == 'void': + return 'void' + elif r == 'bool': + return 'false' + elif r in 'float double'.split(): + return '0.0' + elif r in 'ubyte sbyte ushort short uint int ulong long'.split(): + return ' 0' + return 'null' + + def _nonoderesult(self, node): returntype, name, dummy = node.getdecl_parts() noresult = self._noresult(returntype) return noresult - def new(exceptionpolicy=None): #factory + def new(db, exceptionpolicy=None): #factory exceptionpolicy = exceptionpolicy or 'explicit' if exceptionpolicy == 'invokeunwind': - exceptionpolicy = InvokeUnwindExceptionPolicy() + exceptionpolicy = InvokeUnwindExceptionPolicy(db) elif exceptionpolicy == 'explicit': - exceptionpolicy = ExplicitExceptionPolicy() + exceptionpolicy = ExplicitExceptionPolicy(db) elif exceptionpolicy == 'none': - exceptionpolicy = NoneExceptionPolicy() + exceptionpolicy = NoneExceptionPolicy(db) else: raise Exception, 'unknown exceptionpolicy: ' + str(exceptionpolicy) return exceptionpolicy new = staticmethod(new) - -class NoneExceptionPolicy(ExceptionPolicy): #XXX untested - 
def __init__(self): - pass - - -class InvokeUnwindExceptionPolicy(ExceptionPolicy): #uses issubclass() and llvm invoke&unwind + def update_phi_data(self, funcnode, entrylinks, block, blocknames): + """ Exceptions handling code introduces intermediate blocks for + exception handling cases, hence we modify our input phi data + accordingly. """ + for ii, link in enumerate(entrylinks): + if (link.prevblock.exitswitch == c_last_exception and + link.prevblock.exits[0].target != block): + blocknames[ii] += '_exception_found_branchto_' + blocknames[ii] += funcnode.block_to_name[block] + +class NoneExceptionPolicy(ExceptionPolicy): + """ XXX untested """ + + def __init__(self, db): + self.db = db + +class InvokeUnwindExceptionPolicy(ExceptionPolicy): + """ uses issubclass() and llvm invoke&unwind + XXX Untested for a while """ + def __init__(self): pass @@ -97,12 +124,14 @@ } ''' % locals() + self.RINGBUFFER_LLVMCODE - def invoke(self, codewriter, targetvar, tail_, cconv, returntype, functionref, args, label, except_label): + def invoke(self, codewriter, targetvar, tail_, cconv, returntype, + functionref, args, label, except_label): + labels = 'to label %%%s except label %%%s' % (label, except_label) if returntype == 'void': - codewriter.indent('%sinvoke %s void %s(%s) %s' % (tail_, cconv, functionref, args, labels)) + codewriter._indent('%sinvoke %s void %s(%s) %s' % (tail_, cconv, functionref, args, labels)) else: - codewriter.indent('%s = %sinvoke %s %s %s(%s) %s' % (targetvar, tail_, cconv, returntype, functionref, args, labels)) + codewriter._indent('%s = %sinvoke %s %s %s(%s) %s' % (targetvar, tail_, cconv, returntype, functionref, args, labels)) def _is_raise_new_exception(self, db, graph, block): from pypy.objspace.flow.model import mkentrymap @@ -137,7 +166,7 @@ #Which is already stored in the global variables. #So nothing needs to happen here! 
- codewriter.indent('unwind') + codewriter.unwind() def fetch_exceptions(self, codewriter, exc_found_labels, lltype_of_exception_type, lltype_of_exception_value): for label, target, last_exc_type_var, last_exc_value_var in exc_found_labels: @@ -150,14 +179,15 @@ def reraise(self, funcnode, codewriter): codewriter.comment('reraise when exception is not caught') - codewriter.indent('unwind') + codewriter.unwind() def llc_options(self): return '-enable-correct-eh-support' - -class ExplicitExceptionPolicy(ExceptionPolicy): #uses issubclass() and last_exception tests after each call - def __init__(self): +class ExplicitExceptionPolicy(ExceptionPolicy): + """ uses issubclass() and last_exception tests after each call """ + def __init__(self, db): + self.db = db self.invoke_count = 0 def llvmcode(self, entrynode): @@ -191,56 +221,73 @@ ''' % locals() + self.RINGBUFFER_LLVMCODE def transform(self, translator, graph=None): - from pypy.translator.llvm.backendopt.exception import create_exception_handling if graph: create_exception_handling(translator, graph) else: for graph in translator.flowgraphs.itervalues(): create_exception_handling(translator, graph) - #translator.view() - def invoke(self, codewriter, targetvar, tail_, cconv, returntype, functionref, args, label, except_label): - if returntype == 'void': - #XXX I think keepalive should not be the last operation here! 
- if functionref != '%keepalive': - codewriter.indent('%scall %s void %s(%s)' % (tail_, cconv, functionref, args)) - else: - codewriter.indent('%s = %scall %s %s %s(%s)' % (targetvar, tail_, cconv, returntype, functionref, args)) + + def invoke(self, codewriter, targetvar, returntype, functionref, + argrefs, argtypes, label, except_label): + + assert functionref != '%keepalive' + tmp = '%%invoke.tmp.%d' % self.invoke_count exc = '%%invoke.exc.%d' % self.invoke_count self.invoke_count += 1 - codewriter.indent('%(tmp)s = load %%RPYTHON_EXCEPTION_VTABLE** %%last_exception_type' % locals()) - codewriter.indent('%(exc)s = seteq %%RPYTHON_EXCEPTION_VTABLE* %(tmp)s, null' % locals()) - codewriter.indent('br bool %(exc)s, label %%%(label)s, label %%%(except_label)s' % locals()) + + # XXX Hardcoded type... + type_ = "%RPYTHON_EXCEPTION_VTABLE*" + + codewriter.call(targetvar, returntype, functionref, argrefs, argtypes) + codewriter.load(tmp, type_, "%last_exception_type") + codewriter.binaryop("seteq", exc, type_, tmp, "null") + codewriter.br(exc, except_label, label) def write_exceptblock(self, funcnode, codewriter, block): + """ Raises an exception - called from FuncNode """ + assert len(block.inputargs) == 2 - noresult = self._nonoderesult(funcnode) + returntype, name, dummy = funcnode.getdecl_parts() funcnode.write_block_phi_nodes(codewriter, block) - inputargs = funcnode.db.repr_arg_multi(block.inputargs) + inputargs = funcnode.db.repr_arg_multi(block.inputargs) inputargtypes = funcnode.db.repr_arg_type_multi(block.inputargs) codewriter.store(inputargtypes[0], inputargs[0], '%last_exception_type') codewriter.store(inputargtypes[1], inputargs[1], '%last_exception_value') - codewriter.indent('ret ' + noresult) + codewriter.ret(returntype, self._noresult2(returntype)) + + def fetch_exceptions(self, codewriter, exc_found_labels, + lltype_of_exception_type, lltype_of_exception_value): + + for (label, target, + last_exc_type_var, last_exc_value_var) in exc_found_labels: - def 
fetch_exceptions(self, codewriter, exc_found_labels, lltype_of_exception_type, lltype_of_exception_value): - for label, target, last_exc_type_var, last_exc_value_var in exc_found_labels: codewriter.label(label) if last_exc_type_var: - codewriter.load(last_exc_type_var, lltype_of_exception_type, '%last_exception_type') + codewriter.load(last_exc_type_var, + lltype_of_exception_type, + '%last_exception_type') if last_exc_value_var: - codewriter.load(last_exc_value_var, lltype_of_exception_value, '%last_exception_value') - codewriter.store(lltype_of_exception_type , 'null', '%last_exception_type') - codewriter.store(lltype_of_exception_value, 'null', '%last_exception_value') + codewriter.load(last_exc_value_var, + lltype_of_exception_value, + '%last_exception_value') + codewriter.store(lltype_of_exception_type, + 'null', + '%last_exception_type') + codewriter.store(lltype_of_exception_value, + 'null', + '%last_exception_value') codewriter.br_uncond(target) def reraise(self, funcnode, codewriter): - noresult = self._nonoderesult(funcnode) - codewriter.indent('ret ' + noresult) + returntype, name, dummy = funcnode.getdecl_parts() + codewriter.ret(returntype, self._noresult2(returntype)) def llc_options(self): return '' + Modified: pypy/dist/pypy/translator/llvm/externs2ll.py ============================================================================== --- pypy/dist/pypy/translator/llvm/externs2ll.py (original) +++ pypy/dist/pypy/translator/llvm/externs2ll.py Wed Dec 21 18:38:34 2005 @@ -146,7 +146,7 @@ includestr += "-I %s " % ii return includestr -def generate_llfile(db, extern_decls, entrynode, standalone, gcpolicy): +def generate_llfile(db, extern_decls, entrynode, standalone): ccode = [] function_names = [] @@ -183,7 +183,7 @@ ccode.append('#include \n') # ask gcpolicy for any code needed - ccode.append('%s\n' % gcpolicy.genextern_code()) + ccode.append('%s\n' % db.gcpolicy.genextern_code()) # append our source file ccode = "".join(ccode) Modified: 
pypy/dist/pypy/translator/llvm/funcnode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/funcnode.py (original) +++ pypy/dist/pypy/translator/llvm/funcnode.py Wed Dec 21 18:38:34 2005 @@ -5,7 +5,6 @@ from pypy.translator.llvm.opwriter import OpWriter from pypy.translator.llvm.log import log from pypy.translator.llvm.backendopt.removeexcmallocs import remove_exception_mallocs -#from pypy.translator.llvm.backendopt.mergemallocs import merge_mallocs from pypy.translator.unsimplify import remove_double_links log = log.funcnode @@ -30,6 +29,9 @@ inputargtypes = [self.db.repr_type(a) for a in self.type_._trueargs()] codewriter.funcdef(self.ref, returntype, inputargtypes) +class BlockBranchWriterException(Exception): + pass + class FuncNode(ConstantLLVMNode): __slots__ = "db value ref graph block_to_name".split() @@ -39,8 +41,13 @@ self.ref = self.make_ref('%pypy_', value.graph.name) self.graph = value.graph - self.db.genllvm.exceptionpolicy.transform(self.db.translator, self.graph) + self.db.exceptionpolicy.transform(self.db.translator, + self.graph) + remove_exception_mallocs(self.db.translator, self.graph, self.ref) + + #XXX experimental + #from pypy.translator.llvm.backendopt.mergemallocs import merge_mallocs #merge_mallocs(self.db.translator, self.graph, self.ref) remove_double_links(self.db.translator, self.graph) @@ -49,7 +56,6 @@ return "" %(self.ref,) def setup(self): - #log("setup", self) def visit(node): if isinstance(node, Link): map(self.db.prepare_arg, node.args) @@ -62,8 +68,8 @@ if block.exitswitch != c_last_exception: continue for link in block.exits[1:]: - self.db.prepare_constant(lltype.typeOf(link.llexitcase), - link.llexitcase) + type_ = lltype.typeOf(link.llexitcase) + self.db.prepare_constant(type_, link.llexitcase) assert self.graph, "cannot traverse" traverse(visit, self.graph) @@ -76,7 +82,7 @@ def writeimpl(self, codewriter): graph = self.graph log.writeimpl(graph.name) - 
codewriter.openfunc(self.getdecl(), self is self.db.entrynode) + codewriter.openfunc(self.getdecl()) nextblock = graph.startblock args = graph.startblock.inputargs self.block_to_name = {} @@ -92,29 +98,6 @@ self.write_block(codewriter, block) codewriter.closefunc() - def writecomments(self, codewriter): - """ write operations strings for debugging purposes. """ - for block in self.graph.iterblocks(): - for op in block.operations: - strop = str(op) + "\n\x00" - l = len(strop) - if strop.find("direct_call") == -1: - continue - tempname = self.db.add_op2comment(l, op) - printables = dict([(ord(i), None) for i in - ("0123456789abcdefghijklmnopqrstuvwxyz" + - "ABCDEFGHIJKLMNOPQRSTUVWXYZ" + - "!#$%&()*+,-./:;<=>?@[\\]^_`{|}~ '")]) - s = [] - for c in strop: - if ord(c) in printables: - s.append(c) - else: - s.append("\\%02x" % ord(c)) - r = 'c"%s"' % "".join(s) - typeandata = '[%s x sbyte] %s' % (l, r) - codewriter.globalinstance(tempname, typeandata) - def writeglobalconstants(self, codewriter): pass @@ -136,26 +119,36 @@ returntype, ref, args = self.getdecl_parts() return "%s %s(%s)" % (returntype, ref, ", ".join(args)) - def write_block(self, codewriter, block): - self.write_block_phi_nodes(codewriter, block) - self.write_block_operations(codewriter, block) - self.write_block_branches(codewriter, block) - + # ______________________________________________________________________ + # helpers for block writers + def get_phi_data(self, block): + exceptionpolicy = self.db.exceptionpolicy data = [] + entrylinks = mkentrymap(self.graph)[block] entrylinks = [x for x in entrylinks if x.prevblock is not None] + inputargs = self.db.repr_arg_multi(block.inputargs) inputargtypes = self.db.repr_arg_type_multi(block.inputargs) - for i, (arg, type_) in enumerate(zip(inputargs, inputargtypes)): - names = self.db.repr_arg_multi([link.args[i] for link in entrylinks]) + + # for each argument in block, return a 4 tuple of + # arg_name, arg_type, [list of names from previous blocks, + # 
[corresponding list of block names] + for ii, (arg, type_) in enumerate(zip(inputargs, inputargtypes)): + + names = self.db.repr_arg_multi([link.args[ii] + for link in entrylinks]) + blocknames = [self.block_to_name[link.prevblock] - for link in entrylinks] - for i, link in enumerate(entrylinks): #XXX refactor into a transformation - if link.prevblock.exitswitch == c_last_exception and \ - link.prevblock.exits[0].target != block: - blocknames[i] += '_exception_found_branchto_' + self.block_to_name[block] - data.append( (arg, type_, names, blocknames) ) + for link in entrylinks] + + assert len(names) == len(blocknames) + + # some exception policies will add new blocks... + exceptionpolicy.update_phi_data(self, entrylinks, block, blocknames) + data.append((arg, type_, names, blocknames)) + return data def write_block_phi_nodes(self, codewriter, block): @@ -164,51 +157,54 @@ codewriter.phi(arg, type_, names, blocknames) def write_block_branches(self, codewriter, block): - #assert len(block.exits) <= 2 #more exits are possible (esp. 
in combination with exceptions) if block.exitswitch == c_last_exception: - #codewriter.comment('FuncNode(ConstantLLVMNode) *last_exception* write_block_branches @%s@' % str(block.exits)) + # special case - handled by exception policy return + if len(block.exits) == 1: codewriter.br_uncond(self.block_to_name[block.exits[0].target]) elif len(block.exits) == 2: cond = self.db.repr_arg(block.exitswitch) - codewriter.br(cond, self.block_to_name[block.exits[0].target], + codewriter.br(cond, + self.block_to_name[block.exits[0].target], self.block_to_name[block.exits[1].target]) + else: + raise BranchException("only support branches with 2 exit cases") def write_block_operations(self, codewriter, block): opwriter = OpWriter(self.db, codewriter, self, block) + if block.exitswitch == c_last_exception: - last_op_index = len(block.operations) - 1 - else: - last_op_index = None - for op_index, op in enumerate(block.operations): - if False: # print out debug string - codewriter.newline() - codewriter.comment("** %s **" % str(op)) - info = self.db.get_op2comment(op) - if info is not None: - lenofopstr, opstrname = info - codewriter.debugcomment(self.db.repr_tmpvar(), - lenofopstr, - opstrname) - if op_index == last_op_index: - #could raise an exception and should therefor have a function - #implementation that can be invoked by the llvm-code. - invoke_prefix = 'invoke:' - assert not op.opname.startswith(invoke_prefix) - op.opname = invoke_prefix + op.opname + invoke_prefix = 'invoke:' + # could raise an exception and should therefor have a function + # implementation that can be invoked by the llvm-code. 
+ op = block.operations[len(block.operations) - 1] + assert not op.opname.startswith(invoke_prefix) + op.opname = invoke_prefix + op.opname + + # emit operations + for op in block.operations: opwriter.write_operation(op) + # ______________________________________________________________________ + # actual block writers + def write_startblock(self, codewriter, block): self.write_block_operations(codewriter, block) self.write_block_branches(codewriter, block) + def write_block(self, codewriter, block): + self.write_block_phi_nodes(codewriter, block) + self.write_block_operations(codewriter, block) + self.write_block_branches(codewriter, block) + def write_returnblock(self, codewriter, block): assert len(block.inputargs) == 1 self.write_block_phi_nodes(codewriter, block) - inputargtype = self.db.repr_arg_type(block.inputargs[0]) - inputarg = self.db.repr_arg(block.inputargs[0]) + inputarg, inputargtype = self.db.repr_argwithtype(block.inputargs[0]) codewriter.ret(inputargtype, inputarg) def write_exceptblock(self, codewriter, block): - self.db.genllvm.exceptionpolicy.write_exceptblock(self, codewriter, block) + self.db.exceptionpolicy.write_exceptblock(self, + codewriter, + block) Modified: pypy/dist/pypy/translator/llvm/gc.py ============================================================================== --- pypy/dist/pypy/translator/llvm/gc.py (original) +++ pypy/dist/pypy/translator/llvm/gc.py Wed Dec 21 18:38:34 2005 @@ -1,8 +1,10 @@ +from pypy.rpython.rstr import STR + from pypy.translator.llvm.log import log log = log.gc class GcPolicy: - def __init__(self): + def __init__(self, db): raise Exception, 'GcPolicy should not be used directly' def genextern_code(self): @@ -11,17 +13,68 @@ def gc_libraries(self): return [] - def declarations(self): - return '' + # malloc is not an codewriter specific thing + def malloc(self, codewriter, targetvar, type_, size=1, atomic=False): + # XXX _indent & test + codewriter._indent('%(targetvar)s = malloc %(type_)s, uint %(s)s' % 
locals()) + + def write_constructor(self, codewriter, ref, constructor_decl, ARRAY, + indices_to_array=(), atomic=False, is_str=False): - def malloc(self, targetvar, type_, size, is_atomic, word, uword): - s = str(size) - return '%(targetvar)s = malloc %(type_)s, uint %(s)s' % locals() + + #varsized arrays and structs look like this: + #Array: {int length , elemtype*} + #Struct: {...., Array} + + # the following indices access the last element in the array + elemtype = self.db.repr_type(ARRAY.OF) + word = lentype = self.db.get_machine_word() + uword = self.db.get_machine_uword() + + codewriter.openfunc(constructor_decl) + + # Need room for NUL terminator + if ARRAY is STR.chars: + codewriter.binaryop("add", "%actuallen", lentype, "%len", 1) + else: + codewriter.cast("%actuallen", lentype, "%len", lentype) + + elemindices = list(indices_to_array) + elemindices += [("uint", 1), (lentype, "%actuallen")] + codewriter.getelementptr("%size", ref + "*", "null", *elemindices) + codewriter.cast("%usize", elemtype + "*", "%size", uword) + self.malloc(codewriter, "%ptr", "sbyte", "%usize", atomic=atomic) + codewriter.cast("%result", "sbyte*", "%ptr", ref + "*") + + indices_to_arraylength = tuple(indices_to_array) + (("uint", 0),) + + # the following accesses the length field of the array + codewriter.getelementptr("%arraylength", ref + "*", + "%result", + *indices_to_arraylength) + codewriter.store(lentype, "%len", "%arraylength") + + #if is_str: + # indices_to_hash = (("uint", 0),) + # codewriter.getelementptr("%ptrhash", ref + "*", + # "%result", + # *indices_to_hash) + # codewriter.store("int", "0", "%ptrhash") + + + #if ARRAY is STR.chars: + # codewriter.getelementptr("%ptrendofchar", ref + "*", + # "%result", + # *elemindices) + # codewriter.store(elemtype, "0", "%ptrendofchar") + + codewriter.ret(ref + "*", "%result") + codewriter.closefunc() def pyrex_code(self): return '' - def new(gcpolicy=None): #factory + def new(db, gcpolicy=None): #factory gcpolicy = gcpolicy or 
'boehm' import distutils.sysconfig @@ -33,11 +86,11 @@ gcpolicy = 'none' if gcpolicy == 'boehm': - gcpolicy = BoehmGcPolicy() + gcpolicy = BoehmGcPolicy(db) elif gcpolicy == 'ref': - gcpolicy = RefcountingGcPolicy() + gcpolicy = RefcountingGcPolicy(db) elif gcpolicy == 'none': - gcpolicy = NoneGcPolicy() + gcpolicy = NoneGcPolicy(db) else: raise Exception, 'unknown gcpolicy: ' + str(gcpolicy) return gcpolicy @@ -45,12 +98,13 @@ class NoneGcPolicy(GcPolicy): - def __init__(self): - pass + def __init__(self, db): + self.db = db class BoehmGcPolicy(GcPolicy): - def __init__(self): + def __init__(self, db): + self.db = db self.n_malloced = 0 def genextern_code(self): @@ -59,10 +113,10 @@ def gc_libraries(self): return ['gc', 'pthread'] # XXX on windows? - def declarations(self): - return '' - def malloc(self, targetvar, type_, size, is_atomic, word, uword): + def malloc(self, codewriter, targetvar, type_, size=1, atomic=False): + is_atomic = atomic + uword = self.db.get_machine_uword() s = str(size) self.n_malloced += 1 cnt = '.%d' % self.n_malloced @@ -78,7 +132,7 @@ t += ''' call ccc void %%llvm.memset(sbyte* %%malloc_Ptr%(cnt)s, ubyte 0, uint %%malloc_SizeU%(cnt)s, uint 0) ''' % locals() - return t + codewriter.write_lines(t) def pyrex_code(self): return ''' Modified: pypy/dist/pypy/translator/llvm/genllvm.py ============================================================================== --- pypy/dist/pypy/translator/llvm/genllvm.py (original) +++ pypy/dist/pypy/translator/llvm/genllvm.py Wed Dec 21 18:38:34 2005 @@ -31,11 +31,12 @@ # create and set internals self.db = Database(self, translator) + self.db.gcpolicy = GcPolicy.new(self.db, gcpolicy) + self.db.exceptionpolicy = ExceptionPolicy.new(self.db, + exceptionpolicy) - self.gcpolicy = GcPolicy.new(gcpolicy) self.standalone = standalone self.translator = translator - self.exceptionpolicy = ExceptionPolicy.new(exceptionpolicy) # the debug flag is for creating comments of every operation # that may be executed @@ 
-99,7 +100,7 @@ def write_headers(self, codewriter): # write external function headers codewriter.header_comment('External Function Headers') - codewriter.append(self.llexterns_header) + codewriter.write_lines(self.llexterns_header) codewriter.header_comment("Type Declarations") @@ -122,10 +123,7 @@ codewriter.header_comment("Function Prototypes") # write external protos - codewriter.append(extdeclarations) - - # write garbage collection protos - codewriter.append(self.gcpolicy.declarations()) + codewriter.write_lines(extdeclarations) # write node protos for typ_decl in self.db.getnodes(): @@ -138,15 +136,16 @@ # write external function implementations codewriter.header_comment('External Function Implementation') - codewriter.append(self.llexterns_functions) - codewriter.append(extfunctions) + codewriter.write_lines(self.llexterns_functions) + codewriter.write_lines(extfunctions) self.write_extern_impls(codewriter) self.write_setup_impl(codewriter) self._checkpoint('write support implentations') # write exception implementaions - codewriter.append(self.exceptionpolicy.llvmcode(self.entrynode)) + ep = self.db.exceptionpolicy + codewriter.write_lines(ep.llvmcode(self.entrynode)) # write all node implementations for typ_decl in self.db.getnodes(): @@ -175,8 +174,7 @@ generate_llfile(self.db, self.extern_decls, self.entrynode, - self.standalone, - self.gcpolicy) + self.standalone) def create_codewriter(self): # prevent running the same function twice in a test @@ -188,7 +186,7 @@ self.function_count[self.entry_func_name] = 1 filename = udir.join(self.entry_func_name + postfix).new(ext='.ll') f = open(str(filename), 'w') - return CodeWriter(f, self), filename + return CodeWriter(f, self.db), filename def write_extern_decls(self, codewriter): for c_name, obj in self.extern_decls: @@ -197,7 +195,7 @@ obj = obj.TO l = "%%%s = type %s" % (c_name, self.db.repr_type(obj)) - codewriter.append(l) + codewriter.write_lines(l) def write_extern_impls(self, codewriter): for c_name, 
obj in self.extern_decls: @@ -283,18 +281,20 @@ return gen.compile_llvm_source(**kwds) -def genllvm_compile(function, annotation, view=False, **kwds): +def genllvm_compile(function, annotation, view=False, optimize=True, **kwds): from pypy.translator.translator import TranslationContext from pypy.translator.backendopt.all import backend_optimizations t = TranslationContext() t.buildannotator().build_types(function, annotation) t.buildrtyper().specialize() - backend_optimizations(t, ssa_form=False) - + if optimize: + backend_optimizations(t, ssa_form=False) + else: + backend_optimizations(t, ssa_form=False, mallocs=False, inline_threshold=0) # note: this is without policy transforms if view: t.view() - return genllvm(t, function, **kwds) + return genllvm(t, function, optimize=optimize, **kwds) def compile_function(function, annotation, **kwds): """ Helper - which get the compiled module from CPython. """ Modified: pypy/dist/pypy/translator/llvm/module/support.py ============================================================================== --- pypy/dist/pypy/translator/llvm/module/support.py (original) +++ pypy/dist/pypy/translator/llvm/module/support.py Wed Dec 21 18:38:34 2005 @@ -96,5 +96,5 @@ ret void } """ % (c_name, exc_repr) - codewriter.append(l) + codewriter.write_lines(l) Modified: pypy/dist/pypy/translator/llvm/opaquenode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/opaquenode.py (original) +++ pypy/dist/pypy/translator/llvm/opaquenode.py Wed Dec 21 18:38:34 2005 @@ -16,7 +16,7 @@ # main entry points from genllvm def writedatatypedecl(self, codewriter): - codewriter.append("%s = type opaque*" % self.ref) + codewriter.typedef(self.ref, "opaque*") class ExtOpaqueTypeNode(OpaqueTypeNode): def writedatatypedecl(self, codewriter): Modified: pypy/dist/pypy/translator/llvm/opwriter.py ============================================================================== --- 
pypy/dist/pypy/translator/llvm/opwriter.py (original) +++ pypy/dist/pypy/translator/llvm/opwriter.py Wed Dec 21 18:38:34 2005 @@ -1,8 +1,12 @@ -from pypy.objspace.flow.model import Constant +from pypy.objspace.flow.model import Constant, Variable from pypy.rpython.lltypesystem import lltype from pypy.translator.llvm.log import log log = log.opwriter +def repr_if_variable(db, arg): + if isinstance(arg, Variable): + return db.repr_arg(arg) + class OpWriter(object): binary_operations = {'int_mul': 'mul', 'int_add': 'add', @@ -115,7 +119,6 @@ meth = getattr(self, op.opname, None) if not meth: raise Exception, "operation %s not found" % op.opname - return meth(op) def _generic_pow(self, op, onestr): @@ -130,7 +133,8 @@ msg = 'XXX: Error: _generic_pow: Variable '\ '%s - failed to convert to int %s' % (value, str(exc)) self.codewriter.comment(msg) - return + raise Exception(msg) + if operand < 1: res_val = onestr else: @@ -301,102 +305,8 @@ argtypes = self.db.repr_arg_type_multi(op_args[1:]) if self.db.is_function_ptr(op.result): returntype = "%s (%s)*" % (returntype, ", ".join(argtypes)) - self.codewriter.call(targetvar,returntype,functionref,argrefs,argtypes) - - def last_exception_type_ptr(self, op): - e = self.db.translator.rtyper.getexceptiondata() - self.codewriter.load('%' + str(op.result), - self.db.repr_type(e.lltype_of_exception_type), - '%last_exception_type') - - def invoke(self, op): - op_args = [arg for arg in op.args - if arg.concretetype is not lltype.Void] - - if op.opname == 'invoke:direct_call': - functionref = self.db.repr_arg(op_args[0]) - - else: - # operation - provided by genexterns - opname = op.opname.split(':',1)[1] - op_args = ['%pypyop_' + opname] + op_args - functionref = op_args[0] - - assert len(op_args) >= 1 - # at least one label and one exception label - assert len(self.block.exits) >= 2 - - link = self.block.exits[0] - assert link.exitcase is None - - targetvar = self.db.repr_arg(op.result) - returntype = 
self.db.repr_arg_type(op.result) - argrefs = self.db.repr_arg_multi(op_args[1:]) - argtypes = self.db.repr_arg_type_multi(op_args[1:]) - - none_label = self.node.block_to_name[link.target] - block_label = self.node.block_to_name[self.block] - exc_label = block_label + '_exception_handling' - - if self.db.is_function_ptr(op.result): #use longhand form - returntype = "%s (%s)*" % (returntype, ", ".join(argtypes)) - self.codewriter.call(targetvar, returntype, functionref, argrefs, - argtypes, none_label, exc_label) - - e = self.db.translator.rtyper.getexceptiondata() - ll_exception_match = self.db.repr_value(e.fn_exception_match._obj) - lltype_of_exception_type = self.db.repr_type(e.lltype_of_exception_type) - lltype_of_exception_value = self.db.repr_type(e.lltype_of_exception_value) - - self.codewriter.label(exc_label) - - exc_found_labels, last_exception_type = [], None - catch_all = False - for link in self.block.exits[1:]: - assert issubclass(link.exitcase, Exception) - - etype = self.db.obj2node[link.llexitcase._obj] - current_exception_type = etype.get_ref() - target = self.node.block_to_name[link.target] - exc_found_label = block_label + '_exception_found_branchto_' + target - last_exc_type_var, last_exc_value_var = None, None - - for p in self.node.get_phi_data(link.target): - arg, type_, names, blocknames = p - for name, blockname in zip(names, blocknames): - if blockname != exc_found_label: - continue - if name.startswith('%last_exception_'): - last_exc_type_var = name - if name.startswith('%last_exc_value_'): - last_exc_value_var = name - - t = (exc_found_label,target,last_exc_type_var,last_exc_value_var) - exc_found_labels.append(t) - - not_this_exception_label = block_label + '_not_exception_' + etype.ref[1:] - - if current_exception_type.find('getelementptr') == -1: #catch all (except:) - catch_all = True - self.codewriter.br_uncond(exc_found_label) - else: #catch specific exception (class) type - if not last_exception_type: #load pointer only once - 
last_exception_type = self.db.repr_tmpvar() - self.codewriter.load(last_exception_type, lltype_of_exception_type, '%last_exception_type') - self.codewriter.newline() - ll_issubclass_cond = self.db.repr_tmpvar() - self.codewriter.call(ll_issubclass_cond, - 'bool', - ll_exception_match, - [last_exception_type, current_exception_type], - [lltype_of_exception_type, lltype_of_exception_type]) - self.codewriter.br(ll_issubclass_cond, not_this_exception_label, exc_found_label) - self.codewriter.label(not_this_exception_label) - - ep = self.codewriter.genllvm.exceptionpolicy - if not catch_all: - ep.reraise(self.node, self.codewriter) - ep.fetch_exceptions(self.codewriter, exc_found_labels, lltype_of_exception_type, lltype_of_exception_value) + self.codewriter.call(targetvar, returntype, + functionref, argrefs, argtypes) def malloc_exception(self, op): arg_type = op.args[0].value @@ -405,8 +315,10 @@ tmpvar1 = self.db.repr_tmpvar() tmpvar2 = self.db.repr_tmpvar() tmpvar3 = self.db.repr_tmpvar() - self.codewriter.indent('%(tmpvar1)s = getelementptr %(type_)s* null, int 1' % locals()) - self.codewriter.cast(tmpvar2, type_+'*', tmpvar1, 'uint') + + ptr_type = type_ + '*' + self.codewriter.raw_getelementptr(tmpvar1, ptr_type, "null", ("int", 1)) + self.codewriter.cast(tmpvar2, ptr_type, tmpvar1, 'uint') self.codewriter.call(tmpvar3, 'sbyte*', '%malloc_exception', [tmpvar2], ['uint']) self.codewriter.cast(targetvar, 'sbyte*', tmpvar3, type_+'*') @@ -414,7 +326,9 @@ arg_type = op.args[0].value targetvar = self.db.repr_arg(op.result) type_ = self.db.repr_type(arg_type) - self.codewriter.malloc(targetvar, type_, atomic=arg_type._is_atomic()) + gp = self.db.gcpolicy + gp.malloc(self.codewriter, targetvar, type_, + atomic=arg_type._is_atomic()) def malloc_varsize(self, op): arg_type = op.args[0].value @@ -589,7 +503,7 @@ self._op_adr_comparison_generic(op, "setge") def raw_malloc(self, op): - # XXX Ignore raise as not last op + # XXX ignore raise as not last op targetvar = 
self.db.repr_arg(op.result) targettype = self.db.repr_arg_type(op.result) argrefs = self.db.repr_arg_multi(op.args) @@ -662,3 +576,134 @@ self.codewriter.load(targetvar, targettype, cast_addr) + # ______________________________________________________________________ + # exception specific + + def last_exception_type_ptr(self, op): + e = self.db.translator.rtyper.getexceptiondata() + self.codewriter.load('%' + str(op.result), + self.db.repr_type(e.lltype_of_exception_type), + '%last_exception_type') + + def invoke(self, op): + ep = self.db.exceptionpolicy + + op_args = [arg for arg in op.args + if arg.concretetype is not lltype.Void] + + if op.opname == 'invoke:direct_call': + functionref = self.db.repr_arg(op_args[0]) + + else: + # operation - provided by genexterns + opname = op.opname.split(':', 1)[1] + op_args = ['%pypyop_' + opname] + op_args + functionref = op_args[0] + + assert len(op_args) >= 1 + + # at least one label and one exception label + assert len(self.block.exits) >= 2 + + link = self.block.exits[0] + assert link.exitcase is None + + targetvar = self.db.repr_arg(op.result) + returntype = self.db.repr_arg_type(op.result) + argrefs = self.db.repr_arg_multi(op_args[1:]) + argtypes = self.db.repr_arg_type_multi(op_args[1:]) + + none_label = self.node.block_to_name[link.target] + block_label = self.node.block_to_name[self.block] + exc_label = block_label + '_exception_handling' + + # use longhand form + if self.db.is_function_ptr(op.result): + returntype = "%s (%s)*" % (returntype, ", ".join(argtypes)) + + ep.invoke(self.codewriter, targetvar, returntype, functionref, + argrefs, argtypes, none_label, exc_label) + + # write exception handling blocks + + e = self.db.translator.rtyper.getexceptiondata() + ll_exception_match = self.db.repr_value(e.fn_exception_match._obj) + lltype_of_exception_type = self.db.repr_type(e.lltype_of_exception_type) + lltype_of_exception_value = self.db.repr_type(e.lltype_of_exception_value) + + # start with the exception 
handling block + # * load the last exception type + # * check it with call to ll_exception_match() + # * branch to to correct block? + + self.codewriter.label(exc_label) + + catch_all = False + found_blocks_info = [] + last_exception_type = None + + # XXX tmp - debugging info + + # block_label = "block28" + # exc_label = "block28_exception_handling" + # ll_exception_match = function for catching exception + # lltype_of_exception_type, lltype_of_exception_value = generic + # catch_all = ??? + # found_blocks_info = list of found block data to write those blocks + # last_exception_type = Load exception pointer once for handle and not found blocks + + # link = iteration thru rest of links in block + # etype = node for exception + # current_exception_type = repr for node etype + # target = label of the destination block + # exc_found_label = label of intermediate exc found block + # last_exc_type_var = ???? + # last_exc_value_var = ??? + + for link in self.block.exits[1:]: + assert issubclass(link.exitcase, Exception) + + # information for found blocks + target = self.node.block_to_name[link.target] + exc_found_label = block_label + '_exception_found_branchto_' + target + link_exc_type = repr_if_variable(self.db, link.last_exception) + link_exc_value = repr_if_variable(self.db, link.last_exc_value) + found_blocks_info.append((exc_found_label, target, + link_exc_type, link_exc_value)) + + # XXX fix database to handle this case + etype = self.db.obj2node[link.llexitcase._obj] + current_exception_type = etype.get_ref() + not_this_exception_label = block_label + '_not_exception_' + etype.ref[1:] + + # catch specific exception (class) type + + # load pointer only once + if not last_exception_type: + last_exception_type = self.db.repr_tmpvar() + self.codewriter.load(last_exception_type, + lltype_of_exception_type, + '%last_exception_type') + self.codewriter.newline() + + ll_issubclass_cond = self.db.repr_tmpvar() + + self.codewriter.call(ll_issubclass_cond, + 'bool', + 
ll_exception_match, + [last_exception_type, current_exception_type], + [lltype_of_exception_type, lltype_of_exception_type]) + + self.codewriter.br(ll_issubclass_cond, + not_this_exception_label, + exc_found_label) + + self.codewriter.label(not_this_exception_label) + + if not catch_all: + ep.reraise(self.node, self.codewriter) + + ep.fetch_exceptions(self.codewriter, + found_blocks_info, + lltype_of_exception_type, + lltype_of_exception_value) Modified: pypy/dist/pypy/translator/llvm/pyxwrapper.py ============================================================================== --- pypy/dist/pypy/translator/llvm/pyxwrapper.py (original) +++ pypy/dist/pypy/translator/llvm/pyxwrapper.py Wed Dec 21 18:38:34 2005 @@ -46,7 +46,7 @@ append("class LLVMException(Exception):") append(" pass") append("") - append(genllvm.gcpolicy.pyrex_code()) + append(genllvm.db.gcpolicy.pyrex_code()) append("def %s_wrapper(%s):" % (funcgen.ref.strip("%"), ", ".join(inputargs))) append(" if not setup[0]:") append(" startup = Pyrex_RPython_StartupCode()") Modified: pypy/dist/pypy/translator/llvm/structnode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/structnode.py (original) +++ pypy/dist/pypy/translator/llvm/structnode.py Wed Dec 21 18:38:34 2005 @@ -1,6 +1,5 @@ from pypy.translator.llvm.log import log from pypy.translator.llvm.node import LLVMNode, ConstantLLVMNode -from pypy.translator.llvm import varsize from pypy.rpython.lltypesystem import lltype log = log.structnode @@ -71,14 +70,14 @@ name = current._names_without_voids()[-1] current = current._flds[name] assert isinstance(current, lltype.Array) - varsize.write_constructor(self.db, - codewriter, - self.ref, - self.constructor_decl, - current, - indices_to_array, - self.struct._is_atomic(), - is_str=self.struct._name == "rpy_string") + gp = self.db.gcpolicy + gp.write_constructor(codewriter, + self.ref, + self.constructor_decl, + current, + indices_to_array, + 
self.struct._is_atomic(), + is_str=self.struct._name == "rpy_string") class StructNode(ConstantLLVMNode): """ A struct constant. Can simply contain From cfbolz at codespeak.net Thu Dec 22 01:17:01 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 22 Dec 2005 01:17:01 +0100 (CET) Subject: [pypy-svn] r21487 - in pypy/extradoc/sprintinfo: . mallorca Message-ID: <20051222001701.41EBC27B5B@code1.codespeak.net> Author: cfbolz Date: Thu Dec 22 01:16:59 2005 New Revision: 21487 Added: pypy/extradoc/sprintinfo/mallorca/ pypy/extradoc/sprintinfo/mallorca/people.txt pypy/extradoc/sprintinfo/mallorca/sprint-announcement.txt - copied, changed from r21485, pypy/extradoc/sprintinfo/mallorca-sprint_announcement.txt Removed: pypy/extradoc/sprintinfo/mallorca-sprint_announcement.txt Log: move sprint announcement, restify it, add people-page Added: pypy/extradoc/sprintinfo/mallorca/people.txt ============================================================================== --- (empty file) +++ pypy/extradoc/sprintinfo/mallorca/people.txt Thu Dec 22 01:16:59 2005 @@ -0,0 +1,42 @@ + +People coming to the Mallorca sprint 2006 +================================================== + +People who have a ``?`` in their arrive/depart or accomodation +column are known to be coming but there are no details +available yet from them. + +==================== ============== ===================== + Name Arrive/Depart Accomodation +==================== ============== ===================== +Beatrice Duering ? ? +Armin Rigo ? ? +Samuele Pedroni ? ? +Eric van Riet Paap ? ? +Michael Hudson ? ? +Carl Friedrich Bolz ? ? +Anders Chrigstroem ? ? +Christian Tismer ? ? +Jacob Hallen ? ? +Holger Krekel ? ? 
+==================== ============== ===================== + +People on the following list were present at previous sprints: + +==================== ============== ===================== + Name Arrive/Depart Accomodation +==================== ============== ===================== +Ludovic Aubry ? ? +Adrien Di Mascio ? ? +Laura Creighton ? ? +Lene Wagner ? ? +Anders Lehmann ? ? +Niklaus Haldimann ? ? +Richard Emslie ? ? +Johan Hahn ? ? +Amaury Forgeot d'Arc ? ? +Valentino Volonghi ? ? +Boris Feigin ? ? +Andrew Thompson ? ? +Bert Freudenberg ? ? +==================== ============== ===================== From pedronis at codespeak.net Thu Dec 22 01:36:06 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Thu, 22 Dec 2005 01:36:06 +0100 (CET) Subject: [pypy-svn] r21489 - pypy/extradoc/sprintinfo/mallorca Message-ID: <20051222003606.26F3A27B5B@code1.codespeak.net> Author: pedronis Date: Thu Dec 22 01:36:05 2005 New Revision: 21489 Modified: pypy/extradoc/sprintinfo/mallorca/sprint-announcement.txt Log: spelling Modified: pypy/extradoc/sprintinfo/mallorca/sprint-announcement.txt ============================================================================== --- pypy/extradoc/sprintinfo/mallorca/sprint-announcement.txt (original) +++ pypy/extradoc/sprintinfo/mallorca/sprint-announcement.txt Thu Dec 22 01:36:05 2005 @@ -37,12 +37,12 @@ http://www.uib.es/imagenes/planoCampus.html -The actual adress is: 3r pis de l'Anselm Turmeda which can be found on +The actual address is: 3r pis de l'Anselm Turmeda which can be found on the UIB Campus map. At "Plaza de Espa?a" there is a hostel (Hostal Residencia Terminus) which has been recommended to us. It's cheap (ca 50 euros/double room -with bathroom). Some more links to accommodations (flats, student homes +with bathroom). 
Some more links to accomodations (flats, student homes and hotels): http://www.lodging-in-spain.com/hotel/town/Islas_Baleares,Mallorca,Palma_de_Mallorca,1/ From cfbolz at codespeak.net Thu Dec 22 01:37:08 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 22 Dec 2005 01:37:08 +0100 (CET) Subject: [pypy-svn] r21490 - pypy/extradoc/sprintinfo/mallorca Message-ID: <20051222003708.91C7127B70@code1.codespeak.net> Author: cfbolz Date: Thu Dec 22 01:37:06 2005 New Revision: 21490 Modified: pypy/extradoc/sprintinfo/mallorca/sprint-announcement.txt Log: add sprint goals Modified: pypy/extradoc/sprintinfo/mallorca/sprint-announcement.txt ============================================================================== --- pypy/extradoc/sprintinfo/mallorca/sprint-announcement.txt (original) +++ pypy/extradoc/sprintinfo/mallorca/sprint-announcement.txt Thu Dec 22 01:37:06 2005 @@ -1,10 +1,12 @@ -Parma de Mallorca PyPy Sprint II: 23rd - 29th January 2006 +Palma de Mallorca PyPy Sprint II: 23rd - 29th January 2006 ============================================================ -The next PyPy sprint is scheduled to be in January 2006 in Parma De +The next PyPy sprint is scheduled to be in January 2006 in Palma De Mallorca, Balearic Isles, Spain. We'll give newcomer-friendly -introductions. To learn more about the new PyPy Python-in-Python -implementation look here: +introductions. Its focus will mainly be on the current JIT work, garbage +collection, alternative threading models, logic programming and on +improving the interface with external functions. To learn more about the +new PyPy Python-in-Python implementation look here: http://codespeak.net/pypy @@ -12,8 +14,31 @@ Goals and topics of the sprint ------------------------------ +In Gothenburg we have made some first forays into the interesting topics +of phase 2 of the PyPy project. In Mallorca we will continue the work +that we started there. 
+ +The currently scheduled main topics are: + + - Further work/experimentation with the Abstract Interpreter that was + started in Gothenburg. + + - Integrating our garbage collection toolkit with the backends and the + code generation. + + - Heading into the direction of adding logic programming to PyPy. + + - Optimization work: our threading implementation is still incredibly + slow, we need to work on that. Furthermore there are still quite + some slow places in the interpreter that could be improved. + + - If someone feels like it: although vastly improved our socket module + is still far from complete + + - In general we need to improve the way we interface with external + functions. + -.. _`pypy-0.8.0`: http://codespeak.net/pypy/dist/pypy/doc/release-0.8.0.html Location & Accomodation ------------------------ @@ -56,7 +81,7 @@ http://www.callejeando.com/Pueblos/pueblo7_1.htm -To get to Parma De Mallorca almost all low fare airlines and travel +To get to Palma De Mallorca almost all low fare airlines and travel agencies have cheap tickets to get there. 
Information about Mallorca and Palma (maps, tourist information, local transports, recommended air lines, ferries and much more) can be found on: From hpk at codespeak.net Thu Dec 22 01:54:37 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 22 Dec 2005 01:54:37 +0100 (CET) Subject: [pypy-svn] r21491 - pypy/extradoc/sprintinfo/mallorca Message-ID: <20051222005437.8A7A627B68@code1.codespeak.net> Author: hpk Date: Thu Dec 22 01:54:30 2005 New Revision: 21491 Modified: pypy/extradoc/sprintinfo/mallorca/sprint-announcement.txt Log: refinments and simpler formulations of sprint goals + slight reformatting Modified: pypy/extradoc/sprintinfo/mallorca/sprint-announcement.txt ============================================================================== --- pypy/extradoc/sprintinfo/mallorca/sprint-announcement.txt (original) +++ pypy/extradoc/sprintinfo/mallorca/sprint-announcement.txt Thu Dec 22 01:54:30 2005 @@ -1,24 +1,21 @@ Palma de Mallorca PyPy Sprint II: 23rd - 29th January 2006 ============================================================ -The next PyPy sprint is scheduled to be in January 2006 in Palma De -Mallorca, Balearic Isles, Spain. We'll give newcomer-friendly -introductions. Its focus will mainly be on the current JIT work, garbage +The next PyPy sprint is scheduled to take place January 2006 in +Palma De Mallorca, Balearic Isles, Spain. We'll give newcomer-friendly +introductions and the focus will mainly be on current JIT work, garbage collection, alternative threading models, logic programming and on improving the interface with external functions. To learn more about the -new PyPy Python-in-Python implementation look here: +new Python-in-Python implementation look here: http://codespeak.net/pypy - Goals and topics of the sprint ------------------------------ In Gothenburg we have made some first forays into the interesting topics -of phase 2 of the PyPy project. In Mallorca we will continue the work -that we started there. 
- -The currently scheduled main topics are: +of Just-in-Time compilation. In Mallorca we will continue that +and have the following ideas: - Further work/experimentation with the Abstract Interpreter that was started in Gothenburg. @@ -32,12 +29,14 @@ slow, we need to work on that. Furthermore there are still quite some slow places in the interpreter that could be improved. - - If someone feels like it: although vastly improved our socket module - is still far from complete + - getting the socket module to a more complete state (it is + already improved but still far from complete) - - In general we need to improve the way we interface with external - functions. + - generally improving the way we interface with external functions. + - whatever participants want to do with PyPy (please send + suggestions to the mailing list before to allow us to plan + and give feedback) Location & Accomodation @@ -56,11 +55,11 @@ trips and costs 7.51 euros. Information about bus timetables and routes can be found on: -http://www.a-palma.es. + http://www.a-palma.es A map over the UIB campus are can be found on: -http://www.uib.es/imagenes/planoCampus.html + http://www.uib.es/imagenes/planoCampus.html The actual address is: 3r pis de l'Anselm Turmeda which can be found on the UIB Campus map. @@ -79,14 +78,14 @@ If you want to find a given street, you can search here: -http://www.callejeando.com/Pueblos/pueblo7_1.htm + http://www.callejeando.com/Pueblos/pueblo7_1.htm To get to Palma De Mallorca almost all low fare airlines and travel agencies have cheap tickets to get there. 
Information about Mallorca and Palma (maps, tourist information, local transports, recommended air lines, ferries and much more) can be found on: -http://www.palmademallorca.es/portalPalma/home.jsp + http://www.palmademallorca.es/portalPalma/home.jsp Comments on the weather: In January it is cold and wet on Mallorca @@ -115,7 +114,7 @@ For those of you interested - here is his website where there also are paintings showing his studio: -http://www.hermetex4.com/damnans/ + http://www.hermetex4.com/damnans/ For those interested in playing collectable card games, this will also be an opportunity to get aquainted with V:TES which will be demoed by From pedronis at codespeak.net Thu Dec 22 01:56:58 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Thu, 22 Dec 2005 01:56:58 +0100 (CET) Subject: [pypy-svn] r21492 - pypy/extradoc/sprintinfo/mallorca Message-ID: <20051222005658.AE03F27B5B@code1.codespeak.net> Author: pedronis Date: Thu Dec 22 01:56:57 2005 New Revision: 21492 Modified: pypy/extradoc/sprintinfo/mallorca/sprint-announcement.txt Log: slight reformulation of JIT task Modified: pypy/extradoc/sprintinfo/mallorca/sprint-announcement.txt ============================================================================== --- pypy/extradoc/sprintinfo/mallorca/sprint-announcement.txt (original) +++ pypy/extradoc/sprintinfo/mallorca/sprint-announcement.txt Thu Dec 22 01:56:57 2005 @@ -17,8 +17,8 @@ of Just-in-Time compilation. In Mallorca we will continue that and have the following ideas: - - Further work/experimentation with the Abstract Interpreter that was - started in Gothenburg. + - Further work/experimentation toward Just-In-Time Compiler generation, + which was initiated with the Abstract Interpreter started in Gothenburg. - Integrating our garbage collection toolkit with the backends and the code generation. 
From cfbolz at codespeak.net Thu Dec 22 02:14:04 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 22 Dec 2005 02:14:04 +0100 (CET) Subject: [pypy-svn] r21493 - pypy/extradoc/sprintinfo/mallorca Message-ID: <20051222011404.99E8F27B68@code1.codespeak.net> Author: cfbolz Date: Thu Dec 22 02:14:03 2005 New Revision: 21493 Modified: pypy/extradoc/sprintinfo/mallorca/sprint-announcement.txt Log: Tonight on "It's the Mind" we examine the phenomena of deja vu: That strange feeling we sometimes get that we've lived through something before. Modified: pypy/extradoc/sprintinfo/mallorca/sprint-announcement.txt ============================================================================== --- pypy/extradoc/sprintinfo/mallorca/sprint-announcement.txt (original) +++ pypy/extradoc/sprintinfo/mallorca/sprint-announcement.txt Thu Dec 22 02:14:03 2005 @@ -1,4 +1,4 @@ -Palma de Mallorca PyPy Sprint II: 23rd - 29th January 2006 +Palma de Mallorca PyPy Sprint: 23rd - 29th January 2006 ============================================================ The next PyPy sprint is scheduled to take place January 2006 in From ale at codespeak.net Thu Dec 22 10:44:53 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Thu, 22 Dec 2005 10:44:53 +0100 (CET) Subject: [pypy-svn] r21498 - in pypy/dist/pypy/lib/pyontology: . test Message-ID: <20051222094453.2A5FF27B84@code1.codespeak.net> Author: ale Date: Thu Dec 22 10:44:51 2005 New Revision: 21498 Modified: pypy/dist/pypy/lib/pyontology/pyontology.py pypy/dist/pypy/lib/pyontology/test/test_ontology.py Log: Added tests for range. 
The beginings of support for subproperty,equivalentproperty Modified: pypy/dist/pypy/lib/pyontology/pyontology.py ============================================================================== --- pypy/dist/pypy/lib/pyontology/pyontology.py (original) +++ pypy/dist/pypy/lib/pyontology/pyontology.py Thu Dec 22 10:44:51 2005 @@ -37,7 +37,7 @@ self.properties = {} def __repr__(self): - return "" % str(self.name) + return "<%s %s %r>" % (self.__class__, str(self.name),self.values) def __getitem__(self, index): return None @@ -54,13 +54,18 @@ return self def removeValues(self, values): - print "remove values from ClassDomain %r"%self, values - if len(values) > 0: - self.bases.pop(self.bases.index(values[0])) + for val in values: + self.values.pop(self.values.index(val)) - def getValues(self): + def getBases(self): return self.bases + def getValues(self): + return self.values + + def setValues(self, values): + self.values = values + class Property(ClassDomain): pass @@ -160,7 +165,22 @@ else: self.variables[avar] = fd(res) + def merge_constraints(self): + # Make the intersection of multiple rdfs:range constraints on the same variable + cons_dict = {} + new_cons =[] + for con in self.constraints: + if isinstance(con, RangeConstraint): + cons_dict.setdefault(con.variable, []) + cons_dict[con.variable].append(con) + else: + new_cons.append(con) + for k,v in cons_dict.items(): + for con in v: + pass + def solve(self,verbose=0): + #self.merge_constraints() rep = Repository(self.variables.keys(), self.variables, self.constraints) return Solver().solve(rep, verbose) @@ -204,7 +224,6 @@ res.append(a) var = '.'.join([str(a.replace('-','_')) for a in res]) if not var in self.variables.keys(): - print var self.variables[var] = cls(name=var) return var @@ -244,17 +263,19 @@ #---------------- Implementation ---------------- def type(self, s, p, var): - avar = self.make_var(ClassDomain, var) svar = self.make_var(ClassDomain, s) if (type(var) == URIRef and not (var in 
[URIRef(namespaces['owl']+'#'+x) for x in builtin_voc])): # var is not one of the builtin classes + avar = self.make_var(ClassDomain, var) self.variables[svar].values += self.variables[avar].values constrain = BinaryExpression([svar, avar],"%s in %s" %(svar, avar)) self.constraints.append(constrain) else: # var is a builtin class - self.variables[svar] = builtin_voc[var.split('#')[-1]]() + cls =builtin_voc[var.split('#')[-1]](name=svar) + cls.setValues(self.variables[svar].getValues()) + self.variables[svar] = cls def first(self, s, p, var): pass @@ -316,24 +337,37 @@ def range(self, s, p, var): avar = self.make_var(ClassDomain, var) svar = self.make_var(Property, s) + vals = get_values(self.variables[avar], self.variables).keys() + for v in self.variables[svar].getValues(): + if not v in vals: + vals.append(v) + self.variables[svar].setValues(vals) cons = RangeConstraint( svar, avar) self.constraints.append(cons) def domain(self, s, p, var): + # The classes that has this property (s) must belong to the class extension of var avar = self.make_var(ClassDomain, var) svar = self.make_var(Property, s) assert isinstance(self.variables[svar], Property) assert isinstance(self.variables[avar], ClassDomain) - self.variables[avar].properties[svar] = self.variables[svar] + def subPropertyOf(self, s, p, var): # TODO: implement this - pass + # s is a subproperty of var + avar = self.make_var(ClassDomain, var) + svar = self.make_var(Property, s) + cons = SubPropertyConstraint( svar, avar) + self.constraints.append(cons) def equivalentProperty(self, s, p, var): # TODO: implement this - pass + avar = self.make_var(ClassDomain, var) + svar = self.make_var(Property, s) + cons = EquivalentConstraint( svar, avar) + self.constraints.append(cons) def inverseOf(self, s, p, var): # TODO: implement this @@ -420,7 +454,6 @@ def narrow(self, domains): """narrowing algorithm for the constraint""" if len(domains[self._variables[0]]) > self.cardinality: - print " I Think I will raise an 
exception" raise ConsistencyFailure("Maxcardinality exceeded") else: return 1 @@ -452,13 +485,13 @@ return 1 -def get_values(dom, domains, attr = 'values'): +def get_values(dom, domains, attr = 'getValues'): res = {} - for val in getattr(dom, attr): + for val in getattr(dom, attr)(): res[val] = 1 if val in domains.keys(): res.update( get_values(val, domains, attr)) - res[dom] = 1 + #res[dom] = 1 return res class SubClassConstraint(AbstractConstraint): @@ -473,10 +506,9 @@ def narrow(self, domains): subdom = domains[self.variable] superdom = domains[self.super] - bases = get_values(superdom, domains, 'bases').keys() - print subdom,superdom, bases, subdom.bases + bases = get_values(superdom, domains, 'getBases').keys() subdom.bases += [bas for bas in bases if bas not in subdom.bases] - vals = get_values(subdom, domains, 'values') + vals = get_values(subdom, domains, 'getValues') superdom.values += [val for val in vals if val not in superdom.values] class DisjointClassConstraint(AbstractConstraint): @@ -491,11 +523,10 @@ def narrow(self, domains): subdom = domains[self.variable] superdom = domains[self.super] - bases = get_values(superdom, domains, 'bases').keys() - print subdom,superdom, bases, subdom.bases + bases = get_values(superdom, domains, 'getBases').keys() subdom.bases += [bas for bas in bases if bas not in subdom.bases] - vals1 = get_values(superdom, domains, 'values').keys() - vals2 = get_values(variable, domains, 'values').keys() + vals1 = get_values(superdom, domains, 'getValues').keys() + vals2 = get_values(variable, domains, 'getValues').keys() for i in vals1: if i in vals2: raise ConsistencyError @@ -512,4 +543,57 @@ def narrow(self, domains): subdom = domains[self.variable] superdom = domains[self.super] - \ No newline at end of file + +class RangeConstraint(AbstractConstraint): + + def __init__(self, variable, cls_or_restriction): + AbstractConstraint.__init__(self, [variable]) + # worst case complexity + self.__cost = 1 #len(variables) * 
(len(variables) - 1) / 2 + self.super = cls_or_restriction + self.variable = variable + + def narrow(self, domains): + subdom = domains[self.variable] + superdom = domains[self.super] + vals = get_values(superdom, domains, 'getValues').keys() + res = [] + for val in get_values(subdom, domains, 'getValues').keys(): + if not val in vals: + res.append(val) + subdom.removeValues(res) + +class SubPropertyConstraint(AbstractConstraint): + + def __init__(self, variable, cls_or_restriction): + AbstractConstraint.__init__(self, [variable]) + # worst case complexity + self.__cost = 1 #len(variables) * (len(variables) - 1) / 2 + self.super = cls_or_restriction + self.variable = variable + + def narrow(self, domains): + subdom = domains[self.variable] + superdom = domains[self.super] + vals = get_values(superdom, domains, 'getValues').keys() + for val in subdom.getValues(): + if not val in vals: + raise ConsistencyError("Value not in prescribed range") + +class EquivalentPropertyConstraint(AbstractConstraint): + + def __init__(self, variable, cls_or_restriction): + AbstractConstraint.__init__(self, [variable]) + # worst case complexity + self.__cost = 1 #len(variables) * (len(variables) - 1) / 2 + self.super = cls_or_restriction + self.variable = variable + + def narrow(self, domains): + subdom = domains[self.variable] + superdom = domains[self.super] + vals = get_values(superdom, domains, 'getValues').keys() + for val in subdom.getValues(): + if not val in vals: + raise ConsistencyError("Value not in prescribed range") + \ No newline at end of file Modified: pypy/dist/pypy/lib/pyontology/test/test_ontology.py ============================================================================== --- pypy/dist/pypy/lib/pyontology/test/test_ontology.py (original) +++ pypy/dist/pypy/lib/pyontology/test/test_ontology.py Thu Dec 22 10:44:51 2005 @@ -110,3 +110,34 @@ O.type(sub, pred , obj) assert O.variables[O.make_var(ClassDomain, sub)].__class__ == ObjectProperty +def test_range(): + O = 
Ontology() + sub = URIRef('a') + obj = URIRef('b') + O.variables['b_'] = fd([1,2,3,4]) + O.range(sub, None , obj) + sub = URIRef('a') + pred = URIRef('type') + obj = URIRef(namespaces['owl']+'#ObjectProperty') + O.type(sub, pred , obj) + assert len(O.constraints) == 1 + O.constraints[0].narrow(O.variables) + assert list(O.variables['a_'].getValues()) == [1,2,3,4] + +def test_merge(): + O = Ontology() + sub = URIRef('a') + obj = URIRef('b') + O.variables['b_'] = fd([1,2,3,4]) + O.range(sub, None , obj) + sub = URIRef('a') + obj = URIRef('c') + O.variables['c_'] = fd([3,4,5,6]) + O.range(sub, None , obj) + sub = URIRef('a') + pred = URIRef('type') + obj = URIRef(namespaces['owl']+'#ObjectProperty') + O.type(sub, pred , obj) + assert len(O.constraints) == 2 + O.consistency() + assert list(O.variables['a_'].getValues()) == [3,4] From ericvrp at codespeak.net Thu Dec 22 11:25:49 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Thu, 22 Dec 2005 11:25:49 +0100 (CET) Subject: [pypy-svn] r21502 - pypy/dist/pypy/translator/microbench Message-ID: <20051222102549.D132F27B70@code1.codespeak.net> Author: ericvrp Date: Thu Dec 22 11:25:49 2005 New Revision: 21502 Modified: pypy/dist/pypy/translator/microbench/microbench.py pypy/dist/pypy/translator/microbench/test_count1.py pypy/dist/pypy/translator/microbench/test_create1.py Log: Added some tests. These are some results to show the performance between python2.4 and python2.3 and between python2.4 and pypy-c. 
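[Editor's note] The relative-performance report that follows is produced by timing each microbenchmark under every interpreter, dividing each result by the reference interpreter's timing, and sorting the ratios — the same scheme the `microbench.py` diff below introduces. A minimal standalone sketch of that reporting step (the timing dictionaries here are invented sample numbers, not real measurements):

```python
# Sketch of the sorted-slowdown report added to microbench.py in r21502.
# The timing dicts below are made-up sample data, not real measurements.

def slowdown_report(benchdata_ref, benchdata):
    """Return (slowdown, testcase) pairs, fastest-relative first."""
    result = []
    for testcase, seconds in benchdata.items():
        # ratio > 1.0 means this exe is slower than the reference exe
        result.append((seconds / benchdata_ref[testcase], testcase))
    result.sort()
    return result

benchdata_ref = {'test_loop': 0.10, 'test_call_function': 0.20}  # reference exe
benchdata     = {'test_loop': 0.57, 'test_call_function': 1.28}  # exe under test

for slowdown, testcase in slowdown_report(benchdata_ref, benchdata):
    print('%5.2fx slower on %s' % (slowdown, testcase))
```

With the sample numbers above this prints " 5.70x slower on test_loop" and " 6.40x slower on test_call_function", mirroring the report format of the real script.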
ericvrp at snake ~/projects/pypy-dist/pypy/translator/microbench $ ./microbench.py python2.4 python2.3 ./pypy exe: python2.4 exe: python2.3 1.03x slower on test_count1.test_call_method_of_new_style_class() 1.12x slower on test_count1.test_call_method_of_old_style_class() 1.17x slower on test_count1.test_loop() 1.17x slower on test_create1.test_simple_loop_with_new_style_class_creation() 1.18x slower on test_count1.test_call_function() 1.19x slower on test_count1.test_call_nested_function() 1.27x slower on test_create1.test_simple_loop_with_old_style_class_creation() 1.33x slower on test_create1.test_simple_loop() exe: ./pypy 5.52x slower on test_count1.test_call_nested_function() 5.53x slower on test_count1.test_call_method_of_new_style_class() 5.67x slower on test_count1.test_loop() 5.70x slower on test_count1.test_call_method_of_old_style_class() 6.41x slower on test_count1.test_call_function() 9.33x slower on test_create1.test_simple_loop() 37.27x slower on test_create1.test_simple_loop_with_old_style_class_creation() 46.75x slower on test_create1.test_simple_loop_with_new_style_class_creation() Modified: pypy/dist/pypy/translator/microbench/microbench.py ============================================================================== --- pypy/dist/pypy/translator/microbench/microbench.py (original) +++ pypy/dist/pypy/translator/microbench/microbench.py Thu Dec 22 11:25:49 2005 @@ -1,5 +1,9 @@ #!/usr/bin/python +"""This script computes the relative performance between python +implementations on a set of microbenchmarks. 
The script usally is started +with "./microbench.py python ./pypy" where pypy is a symlink to you pypy exectable.""" + import os, time, sys microbenches = [] @@ -11,7 +15,7 @@ microbenches.append(microbench) def run(): - MINIMUM_MICROBENCH_TIME = 2.5 + MINIMUM_MICROBENCH_TIME = 1.0 for microbench in microbenches: for k in [s for s in globals()[microbench].__dict__ if s.startswith('test_')] : @@ -26,7 +30,7 @@ print '%s took %.2f seconds' % (testcase, duration / float(n)) if __name__ == '__main__': - for n, exe in enumerate(sys.argv[1:3]): + for n, exe in enumerate(sys.argv[1:]): print 'exe:', exe data = [s for s in os.popen(exe + ' microbench.py 2>&1').readlines() if not s.startswith('debug:')] benchdata = {} @@ -36,8 +40,13 @@ if n == 0: benchdata_ref = benchdata else: + result = [] for k, v in benchdata.iteritems(): - print '%s %.2fx slower' % (k, v / benchdata_ref[k]) + result.append( (v / benchdata_ref[k], k) ) + result.sort() + for r in result: + slowdown, testcase = r + print '%5.2fx slower on %s' % (slowdown, testcase) if len(sys.argv) == 1: run() Modified: pypy/dist/pypy/translator/microbench/test_count1.py ============================================================================== --- pypy/dist/pypy/translator/microbench/test_count1.py (original) +++ pypy/dist/pypy/translator/microbench/test_count1.py Thu Dec 22 11:25:49 2005 @@ -17,3 +17,36 @@ x = plus1(x) # +def test_call_nested_function(): + def plus2(x): + return x + 1 + + x = 0 + n = N + while x < n: + x = plus2(x) + +# +class MyOldStyleClass: + def my_method(self, x): + return x + 1 + +class MyNewStyleClass(object): + def my_method(self, x): + return x + 1 + +def test_call_method_of_old_style_class(): + c = MyOldStyleClass() + x = 0 + n = N + while x < n: + x = c.my_method(x) + +def test_call_method_of_new_style_class(): + c = MyNewStyleClass() + x = 0 + n = N + while x < n: + x = c.my_method(x) + +# Modified: pypy/dist/pypy/translator/microbench/test_create1.py 
============================================================================== --- pypy/dist/pypy/translator/microbench/test_create1.py (original) +++ pypy/dist/pypy/translator/microbench/test_create1.py Thu Dec 22 11:25:49 2005 @@ -1,6 +1,9 @@ LOOPS = 1 << 18 -class Foo: +class OldStyleFoo: + pass + +class NewStyleFoo(object): pass def test_simple_loop(): @@ -8,8 +11,14 @@ while i < LOOPS: i += 1 -def test_simple_loop_with_class_creation(): +def test_simple_loop_with_old_style_class_creation(): + i = 0 + while i < LOOPS: + OldStyleFoo() + i += 1 + +def test_simple_loop_with_new_style_class_creation(): i = 0 while i < LOOPS: - Foo() + NewStyleFoo() i += 1 From ericvrp at codespeak.net Fri Dec 23 15:03:59 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 23 Dec 2005 15:03:59 +0100 (CET) Subject: [pypy-svn] r21549 - in pypy/dist/pypy/translator/llvm: . test Message-ID: <20051223140359.D8AB827DB6@code1.codespeak.net> Author: ericvrp Date: Fri Dec 23 15:03:58 2005 New Revision: 21549 Added: pypy/dist/pypy/translator/llvm/test/test_merge_if_blocks.py Modified: pypy/dist/pypy/translator/llvm/codewriter.py pypy/dist/pypy/translator/llvm/funcnode.py pypy/dist/pypy/translator/llvm/genllvm.py pypy/dist/pypy/translator/llvm/test/runtest.py pypy/dist/pypy/translator/llvm/test/test_lltype.py Log: Added code to handle result of merge_if_blocks transformation. LLVM still does not like to generate a jumptable for this! 
:( (Should be safe now for genc and genllvm to enable it by default) Modified: pypy/dist/pypy/translator/llvm/codewriter.py ============================================================================== --- pypy/dist/pypy/translator/llvm/codewriter.py (original) +++ pypy/dist/pypy/translator/llvm/codewriter.py Fri Dec 23 15:03:58 2005 @@ -87,9 +87,11 @@ self._indent("br bool %s, label %%%s, label %%%s" % (cond, blockname_true, blockname_false)) - def switch(self, intty, cond, defaultdest, value_label): + def switch(self, intty, cond, defaultdest, value_labels): + if not defaultdest: + raise TypeError('switches must have a default case.') labels = '' - for value, label in value_label: + for value, label in value_labels: labels += ' %s %s, label %%%s' % (intty, value, label) self._indent("switch %s %s, label %%%s [%s ]" % (intty, cond, defaultdest, labels)) Modified: pypy/dist/pypy/translator/llvm/funcnode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/funcnode.py (original) +++ pypy/dist/pypy/translator/llvm/funcnode.py Fri Dec 23 15:03:58 2005 @@ -29,7 +29,7 @@ inputargtypes = [self.db.repr_type(a) for a in self.type_._trueargs()] codewriter.funcdef(self.ref, returntype, inputargtypes) -class BlockBranchWriterException(Exception): +class BranchException(Exception): pass class FuncNode(ConstantLLVMNode): @@ -163,13 +163,29 @@ if len(block.exits) == 1: codewriter.br_uncond(self.block_to_name[block.exits[0].target]) - elif len(block.exits) == 2: - cond = self.db.repr_arg(block.exitswitch) + return + + cond, condtype = self.db.repr_argwithtype(block.exitswitch) + if block.exitswitch.concretetype == lltype.Bool: + assert len(block.exits) == 2 codewriter.br(cond, self.block_to_name[block.exits[0].target], self.block_to_name[block.exits[1].target]) + + elif block.exitswitch.concretetype in \ + (lltype.Signed, lltype.Unsigned, lltype.SignedLongLong, + lltype.UnsignedLongLong, lltype.Char, 
lltype.UniChar): + defaultlink = None + value_labels = [] + for link in block.exits: + if link.exitcase is 'default': + defaultlink = link + continue + value_labels.append( (link.llexitcase, self.block_to_name[link.target]) ) + codewriter.switch(condtype, cond, self.block_to_name[defaultlink.target], value_labels) + else: - raise BranchException("only support branches with 2 exit cases") + raise BranchException("exitswitch type '%s' not supported" % block.exitswitch.concretetype) def write_block_operations(self, codewriter, block): opwriter = OpWriter(self.db, codewriter, self, block) Modified: pypy/dist/pypy/translator/llvm/genllvm.py ============================================================================== --- pypy/dist/pypy/translator/llvm/genllvm.py (original) +++ pypy/dist/pypy/translator/llvm/genllvm.py Fri Dec 23 15:03:58 2005 @@ -290,7 +290,7 @@ if optimize: backend_optimizations(t, ssa_form=False) else: - backend_optimizations(t, ssa_form=False, mallocs=False, inline_threshold=0) + backend_optimizations(t, ssa_form=False, mallocs=False, inline_threshold=0, merge_if_blocks_to_switch=False, propagate=False) # note: this is without policy transforms if view: t.view() Modified: pypy/dist/pypy/translator/llvm/test/runtest.py ============================================================================== --- pypy/dist/pypy/translator/llvm/test/runtest.py (original) +++ pypy/dist/pypy/translator/llvm/test/runtest.py Fri Dec 23 15:03:58 2005 @@ -35,7 +35,16 @@ return genllvm_compile(function, annotation, optimize=optimize_tests, logging=False, **kwds) +def compile_optimized_test(function, annotation, **kwds): + if llvm_test(): + return genllvm_compile(function, annotation, optimize=True, + logging=False, **kwds) + def compile_function(function, annotation, **kwds): if llvm_test(): return compile_test(function, annotation, return_fn=True, **kwds) +def compile_optimized_function(function, annotation, **kwds): + if llvm_test(): + return 
compile_optimized_test(function, annotation, return_fn=True, **kwds) + Modified: pypy/dist/pypy/translator/llvm/test/test_lltype.py ============================================================================== --- pypy/dist/pypy/translator/llvm/test/test_lltype.py (original) +++ pypy/dist/pypy/translator/llvm/test/test_lltype.py Fri Dec 23 15:03:58 2005 @@ -211,7 +211,7 @@ f = compile_function(struct_opaque, []) assert f() == struct_opaque() -def test_floats(): +def test_floats(): #note: this is known to fail with llvm1.6 and llvm1.7cvs when not using gcc " test pbc of floats " F = lltype.GcStruct("f", ('f1', lltype.Float), Added: pypy/dist/pypy/translator/llvm/test/test_merge_if_blocks.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/llvm/test/test_merge_if_blocks.py Fri Dec 23 15:03:58 2005 @@ -0,0 +1,54 @@ +import py + +from pypy.translator.llvm.test.runtest import compile_optimized_function + +import sys + +def test_merge_if_blocks_simple(): + def merge_if_blocks_simple(i): + if i == 5: + return 1005 + else: + return 2222 + simple = compile_optimized_function(merge_if_blocks_simple, [int]) + for i in range(-20,20): + assert simple(i) == merge_if_blocks_simple(i) + +def test_merge_if_blocks_basic(): + def merge_if_blocks_basic(i): + if i == 5: + return 1005 + elif i == 8: + return 1008 + return 2222 + basic = compile_optimized_function(merge_if_blocks_basic , [int]) + for i in range(-20,20): + assert basic(i) == merge_if_blocks_basic(i) + +def test_merge_if_blocks_many(): + def merge_if_blocks_many(i): + if i == 0: + return 1000 + elif i == 1: + return 1001 + elif i == 2: + return 1002 + elif i == 3: + return 1003 + elif i == 4: + return 1004 + elif i == 5: + return 1005 + elif i == 6: + return 1006 + elif i == 7: + return 1007 + elif i == 8: + return 1008 + elif i == 9: + return 1009 + else: + return 2222 + many = compile_optimized_function(merge_if_blocks_many , [int]) + for i in 
range(-20,20): + assert many(i) == merge_if_blocks_many(i) From ericvrp at codespeak.net Fri Dec 23 16:40:08 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 23 Dec 2005 16:40:08 +0100 (CET) Subject: [pypy-svn] r21554 - in pypy/dist/pypy/translator/c: . test Message-ID: <20051223154008.62C4627DB6@code1.codespeak.net> Author: ericvrp Date: Fri Dec 23 16:40:07 2005 New Revision: 21554 Modified: pypy/dist/pypy/translator/c/funcgen.py pypy/dist/pypy/translator/c/test/test_backendoptimized.py Log: fix error message Modified: pypy/dist/pypy/translator/c/funcgen.py ============================================================================== --- pypy/dist/pypy/translator/c/funcgen.py (original) +++ pypy/dist/pypy/translator/c/funcgen.py Fri Dec 23 16:40:07 2005 @@ -368,8 +368,8 @@ yield '}' else: - raise TypeError("switches can only be on Bool or " - "PyObjPtr. Got %r" % (TYPE,)) + raise TypeError("exitswitch type not supported" + " Got %r" % (TYPE,)) for i in range(reachable_err, -1, -1): if not fallthrough: Modified: pypy/dist/pypy/translator/c/test/test_backendoptimized.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_backendoptimized.py (original) +++ pypy/dist/pypy/translator/c/test/test_backendoptimized.py Fri Dec 23 16:40:07 2005 @@ -9,7 +9,7 @@ def process(self, t): _TestTypedTestCase.process(self, t) self.t = t - backend_optimizations(t) + backend_optimizations(t, merge_if_blocks_to_switch=False) def test_remove_same_as(self): def f(n=bool): From ericvrp at codespeak.net Fri Dec 23 16:40:54 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 23 Dec 2005 16:40:54 +0100 (CET) Subject: [pypy-svn] r21555 - in pypy/dist/pypy/translator: backendopt goal Message-ID: <20051223154054.6338527DB8@code1.codespeak.net> Author: ericvrp Date: Fri Dec 23 16:40:53 2005 New Revision: 21555 Modified: pypy/dist/pypy/translator/backendopt/all.py 
pypy/dist/pypy/translator/goal/driver.py pypy/dist/pypy/translator/goal/translate_pypy.py Log: Enable merge_if_block transformation by default Modified: pypy/dist/pypy/translator/backendopt/all.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/all.py (original) +++ pypy/dist/pypy/translator/backendopt/all.py Fri Dec 23 16:40:53 2005 @@ -10,7 +10,7 @@ def backend_optimizations(translator, inline_threshold=1, mallocs=True, ssa_form=True, - merge_if_blocks_to_switch=False, + merge_if_blocks_to_switch=True, propagate=False): # remove obvious no-ops for graph in translator.graphs: Modified: pypy/dist/pypy/translator/goal/driver.py ============================================================================== --- pypy/dist/pypy/translator/goal/driver.py (original) +++ pypy/dist/pypy/translator/goal/driver.py Fri Dec 23 16:40:53 2005 @@ -25,7 +25,7 @@ 'backend': 'c', 'lowmem': False, 'fork_before': None, - 'merge_if_blocks': False + 'merge_if_blocks': True }) def taskdef(taskfunc, deps, title, new_state=None, expected_states=[], idemp=False): Modified: pypy/dist/pypy/translator/goal/translate_pypy.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy.py Fri Dec 23 16:40:53 2005 @@ -50,7 +50,7 @@ '2_gc': [OPT(('--gc',), "Garbage collector", ['boehm', 'ref', 'none'])], '3_stackless': [OPT(('--stackless',), "Stackless code generation", True)], - '4_merge_if_blocks': [OPT(('--merge_if_blocks',), "Merge if ... elif ... chains and use a switch statement for them.", True)], + '4_merge_if_blocks': [OPT(('--merge_if_blocks',), "Merge if ... elif ... 
chains and use a switch statement for them.", False)],
 },
@@ -102,7 +102,7 @@
     'gc': 'boehm',
     'backend': 'c',
     'stackless': False,
-    'merge_if_blocks': False,
+    'merge_if_blocks': True,
     'batch': False,
     'text': False,

From pedronis at codespeak.net  Fri Dec 23 16:57:29 2005
From: pedronis at codespeak.net (pedronis at codespeak.net)
Date: Fri, 23 Dec 2005 16:57:29 +0100 (CET)
Subject: [pypy-svn] r21556 - pypy/dist/pypy/translator/goal
Message-ID: <20051223155729.7CE5227DB6@code1.codespeak.net>

Author: pedronis
Date: Fri Dec 23 16:57:26 2005
New Revision: 21556

Modified:
   pypy/dist/pypy/translator/goal/translate_pypy.py
Log:
perhaps saner option name

Modified: pypy/dist/pypy/translator/goal/translate_pypy.py
==============================================================================
--- pypy/dist/pypy/translator/goal/translate_pypy.py    (original)
+++ pypy/dist/pypy/translator/goal/translate_pypy.py    Fri Dec 23 16:57:26 2005
@@ -50,7 +50,7 @@
     '2_gc': [OPT(('--gc',), "Garbage collector", ['boehm', 'ref', 'none'])],
     '3_stackless': [OPT(('--stackless',), "Stackless code generation", True)],
-    '4_merge_if_blocks': [OPT(('--merge_if_blocks',), "Merge if ... elif ... chains and use a switch statement for them.", False)],
+    '4_merge_if_blocks': [OPT(('--no-if-blocks-merge',), "Do not merge if ... elif ... chains and use a switch statement for them.", False)],
 },

From rxe at codespeak.net  Fri Dec 23 17:58:57 2005
From: rxe at codespeak.net (rxe at codespeak.net)
Date: Fri, 23 Dec 2005 17:58:57 +0100 (CET)
Subject: [pypy-svn] r21557 - in pypy/dist/pypy/translator/llvm: . test
Message-ID: <20051223165857.24B6F27DB6@code1.codespeak.net>

Author: rxe
Date: Fri Dec 23 17:58:55 2005
New Revision: 21557

Modified:
   pypy/dist/pypy/translator/llvm/funcnode.py
   pypy/dist/pypy/translator/llvm/test/test_merge_if_blocks.py
Log:
char needs an ord(). Fix and test.
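For readers skimming the archive: the bug fixed here is that a flow-graph exit switch on a `Char` carries a one-character string as its exit case, while an LLVM `switch` instruction needs integer case labels, so the backend must convert with `ord()`. A minimal sketch of that normalization step (the helper name is hypothetical, not the actual backend API):

```python
def switch_case_label(exitcase):
    # LLVM switch labels must be integers.  A Char exit case arrives as a
    # 1-character string, so convert it with ord(); plain integer cases
    # (e.g. from an int-typed switch) pass through unchanged.
    if isinstance(exitcase, str) and len(exitcase) == 1:
        return ord(exitcase)
    return exitcase

print(switch_case_label('\x05'))  # 5
print(switch_case_label('!'))     # 33
print(switch_case_label(1005))    # 1005
```

The follow-up commit r21558 below applies the same conversion to `UniChar` exit cases.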
Modified: pypy/dist/pypy/translator/llvm/funcnode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/funcnode.py (original) +++ pypy/dist/pypy/translator/llvm/funcnode.py Fri Dec 23 17:58:55 2005 @@ -75,7 +75,7 @@ traverse(visit, self.graph) # ______________________________________________________________________ - # main entry points from genllvm + # main entry points from genllvm def writedecl(self, codewriter): codewriter.declare(self.getdecl()) @@ -180,12 +180,20 @@ for link in block.exits: if link.exitcase is 'default': defaultlink = link - continue - value_labels.append( (link.llexitcase, self.block_to_name[link.target]) ) - codewriter.switch(condtype, cond, self.block_to_name[defaultlink.target], value_labels) + continue + + exitcase = link.llexitcase + if block.exitswitch.concretetype is lltype.Char: + exitcase = ord(exitcase) + value_labels.append( (exitcase, + self.block_to_name[link.target]) ) + + codewriter.switch(condtype, cond, + self.block_to_name[defaultlink.target], value_labels) else: - raise BranchException("exitswitch type '%s' not supported" % block.exitswitch.concretetype) + raise BranchException("exitswitch type '%s' not supported" % + block.exitswitch.concretetype) def write_block_operations(self, codewriter, block): opwriter = OpWriter(self.db, codewriter, self, block) Modified: pypy/dist/pypy/translator/llvm/test/test_merge_if_blocks.py ============================================================================== --- pypy/dist/pypy/translator/llvm/test/test_merge_if_blocks.py (original) +++ pypy/dist/pypy/translator/llvm/test/test_merge_if_blocks.py Fri Dec 23 17:58:55 2005 @@ -23,7 +23,19 @@ return 2222 basic = compile_optimized_function(merge_if_blocks_basic , [int]) for i in range(-20,20): - assert basic(i) == merge_if_blocks_basic(i) + assert basic(i) == merge_if_blocks_basic(i) + +def test_merge_if_blocks_chr(): + def merge_if_blocks_chr(i): + c = chr(i) + if c == 
'\x05': + return 1005 + elif c == '!': + return 1008 + return 2222 + basic = compile_optimized_function(merge_if_blocks_chr , [int]) + for i in range(0, 50): + assert basic(i) == merge_if_blocks_chr(i) def test_merge_if_blocks_many(): def merge_if_blocks_many(i): From rxe at codespeak.net Fri Dec 23 19:00:10 2005 From: rxe at codespeak.net (rxe at codespeak.net) Date: Fri, 23 Dec 2005 19:00:10 +0100 (CET) Subject: [pypy-svn] r21558 - in pypy/dist/pypy/translator/llvm: . test Message-ID: <20051223180010.7265627DB8@code1.codespeak.net> Author: rxe Date: Fri Dec 23 19:00:08 2005 New Revision: 21558 Modified: pypy/dist/pypy/translator/llvm/funcnode.py pypy/dist/pypy/translator/llvm/test/test_merge_if_blocks.py Log: arrghh - im an idiot! Modified: pypy/dist/pypy/translator/llvm/funcnode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/funcnode.py (original) +++ pypy/dist/pypy/translator/llvm/funcnode.py Fri Dec 23 19:00:08 2005 @@ -182,8 +182,8 @@ defaultlink = link continue - exitcase = link.llexitcase - if block.exitswitch.concretetype is lltype.Char: + exitcase = link.llexitcase + if block.exitswitch.concretetype in [lltype.Char, lltype.UniChar]: exitcase = ord(exitcase) value_labels.append( (exitcase, self.block_to_name[link.target]) ) Modified: pypy/dist/pypy/translator/llvm/test/test_merge_if_blocks.py ============================================================================== --- pypy/dist/pypy/translator/llvm/test/test_merge_if_blocks.py (original) +++ pypy/dist/pypy/translator/llvm/test/test_merge_if_blocks.py Fri Dec 23 19:00:08 2005 @@ -37,6 +37,19 @@ for i in range(0, 50): assert basic(i) == merge_if_blocks_chr(i) +def test_merge_if_blocks_uni(): + def merge_if_blocks_uni(i): + c = unichr(i) + if c == u'\x05': + return 1005 + elif c == u'!': + return 1008 + return 2222 + basic = compile_optimized_function(merge_if_blocks_uni , [int]) + for i in range(0, 50): + assert basic(i) == 
merge_if_blocks_uni(i) + + def test_merge_if_blocks_many(): def merge_if_blocks_many(i): if i == 0: From arigo at codespeak.net Fri Dec 23 22:53:45 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 23 Dec 2005 22:53:45 +0100 (CET) Subject: [pypy-svn] r21559 - pypy/dist/pypy/annotation Message-ID: <20051223215345.87D7D27DB6@code1.codespeak.net> Author: arigo Date: Fri Dec 23 22:53:43 2005 New Revision: 21559 Modified: pypy/dist/pypy/annotation/description.py Log: issue175 resolved Make sure that specialized function names are valid identifiers. Issue 175 shows an example where we get spaces in the name, which confuses dot quite a bit. Modified: pypy/dist/pypy/annotation/description.py ============================================================================== --- pypy/dist/pypy/annotation/description.py (original) +++ pypy/dist/pypy/annotation/description.py Fri Dec 23 22:53:43 2005 @@ -3,6 +3,7 @@ from pypy.interpreter.pycode import cpython_code_signature from pypy.interpreter.argument import rawshape from pypy.interpreter.argument import ArgErr +from pypy.tool.sourcetools import valid_identifier class CallFamily: @@ -181,7 +182,7 @@ return str(thing)[:30] if key is not None and alt_name is None: - postfix = nameof(key) + postfix = valid_identifier(nameof(key)) alt_name = "%s__%s"%(self.name, postfix) graph = self.buildgraph(alt_name, builder) self._cache[key] = graph From arigo at codespeak.net Sat Dec 24 11:24:14 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 24 Dec 2005 11:24:14 +0100 (CET) Subject: [pypy-svn] r21560 - in pypy/dist/pypy/rpython: lltypesystem test Message-ID: <20051224102414.6C1DD27B82@code1.codespeak.net> Author: arigo Date: Sat Dec 24 11:24:08 2005 New Revision: 21560 Modified: pypy/dist/pypy/rpython/lltypesystem/rbuiltin.py pypy/dist/pypy/rpython/lltypesystem/rclass.py pypy/dist/pypy/rpython/test/test_rclass.py Log: Minor optimization in the rtyper for isinstance() on classes with no subclasses. 
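The trick behind this optimization (the `subclassrange_min`/`subclassrange_max` names appear in the rtyper diff below): classes are numbered depth-first so that each class's subclasses occupy one contiguous id range, turning `isinstance` into a range check; when the range has length one the class has no subclasses, and a single equality test on the type pointer suffices. A rough pure-Python sketch of the idea, illustrative only and not the rtyper's actual data structures:

```python
def assign_ranges(root):
    """Depth-first numbering: each class gets (min_id, max_id) such that
    every subclass's id falls inside the half-open range [min_id, max_id)."""
    ranges = {}
    counter = [0]
    def walk(cls):
        start = counter[0]
        counter[0] += 1
        for sub in cls.__subclasses__():
            walk(sub)
        ranges[cls] = (start, counter[0])
    walk(root)
    return ranges

class A: pass
class B(A): pass
class C(B): pass
class D(A): pass

ranges = assign_ranges(A)

def fast_isinstance(obj, cls, ranges):
    lo, hi = ranges[cls]
    if hi == lo + 1:
        # a class with no subclasses: exact type comparison
        # (the fast path this commit adds)
        return type(obj) is cls
    # otherwise: range check on the instance's class id
    oid = ranges[type(obj)][0]
    return lo <= oid < hi

assert fast_isinstance(C(), A, ranges)      # C's id lies in A's range
assert not fast_isinstance(D(), B, ranges)
assert fast_isinstance(C(), C, ranges)      # leaf class: pointer equality
```

The same length-one-range test drives the `issubclass` path in the diff, where it degenerates to a `ptr_eq` on the two class pointers.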
Modified: pypy/dist/pypy/rpython/lltypesystem/rbuiltin.py ============================================================================== --- pypy/dist/pypy/rpython/lltypesystem/rbuiltin.py (original) +++ pypy/dist/pypy/rpython/lltypesystem/rbuiltin.py Sat Dec 24 11:24:08 2005 @@ -30,10 +30,15 @@ v_obj, v_cls = hop.inputargs(instance_repr, class_repr) if isinstance(v_cls, Constant): - minid = hop.inputconst(lltype.Signed, v_cls.value.subclassrange_min) - maxid = hop.inputconst(lltype.Signed, v_cls.value.subclassrange_max) - return hop.gendirectcall(rclass.ll_isinstance_const, v_obj, minid, - maxid) + cls = v_cls.value + if cls.subclassrange_max == cls.subclassrange_min + 1: + # a class with no subclass + return hop.gendirectcall(rclass.ll_isinstance_exact, v_obj, v_cls) + else: + minid = hop.inputconst(lltype.Signed, cls.subclassrange_min) + maxid = hop.inputconst(lltype.Signed, cls.subclassrange_max) + return hop.gendirectcall(rclass.ll_isinstance_const, v_obj, minid, + maxid) else: return hop.gendirectcall(rclass.ll_isinstance, v_obj, v_cls) Modified: pypy/dist/pypy/rpython/lltypesystem/rclass.py ============================================================================== --- pypy/dist/pypy/rpython/lltypesystem/rclass.py (original) +++ pypy/dist/pypy/rpython/lltypesystem/rclass.py Sat Dec 24 11:24:08 2005 @@ -260,9 +260,15 @@ class_repr = get_type_repr(self.rtyper) v_cls1, v_cls2 = hop.inputargs(class_repr, class_repr) if isinstance(v_cls2, Constant): - minid = hop.inputconst(Signed, v_cls2.value.subclassrange_min) - maxid = hop.inputconst(Signed, v_cls2.value.subclassrange_max) - return hop.gendirectcall(ll_issubclass_const, v_cls1, minid, maxid) + cls2 = v_cls2.value + if cls2.subclassrange_max == cls2.subclassrange_min + 1: + # a class with no subclass + return hop.genop('ptr_eq', [v_cls1, v_cls2], resulttype=Bool) + else: + minid = hop.inputconst(Signed, cls2.subclassrange_min) + maxid = hop.inputconst(Signed, cls2.subclassrange_max) + return 
hop.gendirectcall(ll_issubclass_const, v_cls1, minid, + maxid) else: v_cls1, v_cls2 = hop.inputargs(class_repr, class_repr) return hop.gendirectcall(ll_issubclass, v_cls1, v_cls2) @@ -632,6 +638,12 @@ return False return ll_issubclass_const(obj.typeptr, minid, maxid) +def ll_isinstance_exact(obj, cls): + if not obj: + return False + obj_cls = obj.typeptr + return obj_cls == cls + def ll_runtime_type_info(obj): return obj.typeptr.rtti Modified: pypy/dist/pypy/rpython/test/test_rclass.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rclass.py (original) +++ pypy/dist/pypy/rpython/test/test_rclass.py Sat Dec 24 11:24:08 2005 @@ -148,7 +148,9 @@ assert res == 246 def test_issubclass_type(): - class A: + class Abstract: + pass + class A(Abstract): pass class B(A): pass From arigo at codespeak.net Sun Dec 25 17:57:07 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 25 Dec 2005 17:57:07 +0100 (CET) Subject: [pypy-svn] r21561 - pypy/dist/pypy/objspace/test Message-ID: <20051225165707.EBB0027B4E@code1.codespeak.net> Author: arigo Date: Sun Dec 25 17:57:04 2005 New Revision: 21561 Modified: pypy/dist/pypy/objspace/test/test_descroperation.py Log: A test to see that NotImplemented returned by all binary __xxx__ special methods are correctly turned into TypeErrors. 
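The semantics this test exercises is standard Python behaviour: when every applicable binary special method (e.g. `__add__` on the left operand and `__radd__` on the right) returns `NotImplemented`, the interpreter must give up and raise `TypeError`. A minimal standalone illustration of that rule, independent of the PyPy test machinery:

```python
class Left:
    def __add__(self, other):
        return NotImplemented   # decline: let the other operand try

class Right:
    def __radd__(self, other):
        return NotImplemented   # the other operand declines as well

try:
    Left() + Right()
except TypeError as e:
    # both sides returned NotImplemented, so the interpreter raises TypeError
    print("TypeError:", e)
```

The test in the diff below checks this for the full set of binary and in-place operators, on both operand positions.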
Modified: pypy/dist/pypy/objspace/test/test_descroperation.py ============================================================================== --- pypy/dist/pypy/objspace/test/test_descroperation.py (original) +++ pypy/dist/pypy/objspace/test/test_descroperation.py Sun Dec 25 17:57:04 2005 @@ -191,3 +191,49 @@ assert len(s.__dict__) == 1 assert type(s.__dict__.keys()[0]) is str # don't store S keys assert s.abc is s + + def test_notimplemented(self): + #import types + import operator + + def specialmethod(self, other): + return NotImplemented + + def check(expr, x, y, operator=operator): + raises(TypeError, expr) + + for metaclass in [type]: # [type, types.ClassType]: + for name, expr, iexpr in [ + ('__add__', 'x + y', 'x += y'), + ('__sub__', 'x - y', 'x -= y'), + ('__mul__', 'x * y', 'x *= y'), + ('__truediv__', 'operator.truediv(x, y)', None), + ('__floordiv__', 'operator.floordiv(x, y)', None), + ('__div__', 'x / y', 'x /= y'), + ('__mod__', 'x % y', 'x %= y'), + ('__divmod__', 'divmod(x, y)', None), + ('__pow__', 'x ** y', 'x **= y'), + ('__lshift__', 'x << y', 'x <<= y'), + ('__rshift__', 'x >> y', 'x >>= y'), + ('__and__', 'x & y', 'x &= y'), + ('__or__', 'x | y', 'x |= y'), + ('__xor__', 'x ^ y', 'x ^= y'), + ('__coerce__', 'coerce(x, y)', None)]: + if name == '__coerce__': + rname = name + else: + rname = '__r' + name[2:] + A = metaclass('A', (), {name: specialmethod}) + B = metaclass('B', (), {rname: specialmethod}) + a = A() + b = B() + check(expr, a, a) + check(expr, a, b) + check(expr, b, a) + check(expr, b, b) + if iexpr: + iname = '__i' + name[2:] + C = metaclass('C', (), {iname: specialmethod}) + c = C() + check(expr, c, a) + check(expr, c, b) From arigo at codespeak.net Sun Dec 25 18:03:11 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 25 Dec 2005 18:03:11 +0100 (CET) Subject: [pypy-svn] r21562 - pypy/dist/pypy/objspace/test Message-ID: <20051225170311.8477927B52@code1.codespeak.net> Author: arigo Date: Sun Dec 25 18:03:09 2005 
New Revision: 21562

Modified:
   pypy/dist/pypy/objspace/test/test_descroperation.py
Log:
Oops, typo (not testing the in-place operators correctly).


Modified: pypy/dist/pypy/objspace/test/test_descroperation.py
==============================================================================
--- pypy/dist/pypy/objspace/test/test_descroperation.py    (original)
+++ pypy/dist/pypy/objspace/test/test_descroperation.py    Sun Dec 25 18:03:09 2005
@@ -232,8 +232,12 @@
                 check(expr, b, a)
                 check(expr, b, b)
                 if iexpr:
+                    check(iexpr, a, a)
+                    check(iexpr, a, b)
+                    check(iexpr, b, a)
+                    check(iexpr, b, b)
                     iname = '__i' + name[2:]
                     C = metaclass('C', (), {iname: specialmethod})
                     c = C()
-                    check(expr, c, a)
-                    check(expr, c, b)
+                    check(iexpr, c, a)
+                    check(iexpr, c, b)

From arigo at codespeak.net  Sun Dec 25 18:12:27 2005
From: arigo at codespeak.net (arigo at codespeak.net)
Date: Sun, 25 Dec 2005 18:12:27 +0100 (CET)
Subject: [pypy-svn] r21563 - pypy/dist/pypy/objspace/test
Message-ID: <20051225171227.0925027B52@code1.codespeak.net>

Author: arigo
Date: Sun Dec 25 18:12:25 2005
New Revision: 21563

Modified:
   pypy/dist/pypy/objspace/test/test_descroperation.py
Log:
A few more tests.  No problem in PyPy, but they crash CPython
hilariously (or scarily, if you prefer).
Modified: pypy/dist/pypy/objspace/test/test_descroperation.py ============================================================================== --- pypy/dist/pypy/objspace/test/test_descroperation.py (original) +++ pypy/dist/pypy/objspace/test/test_descroperation.py Sun Dec 25 18:12:25 2005 @@ -231,13 +231,17 @@ check(expr, a, b) check(expr, b, a) check(expr, b, b) + check(expr, a, 5) + check(expr, 5, b) if iexpr: check(iexpr, a, a) check(iexpr, a, b) check(iexpr, b, a) check(iexpr, b, b) + check(iexpr, a, 5) iname = '__i' + name[2:] C = metaclass('C', (), {iname: specialmethod}) c = C() check(iexpr, c, a) check(iexpr, c, b) + check(iexpr, c, 5) From ale at codespeak.net Tue Dec 27 14:00:07 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Tue, 27 Dec 2005 14:00:07 +0100 (CET) Subject: [pypy-svn] r21569 - in pypy/dist/pypy/lib/pyontology: . test Message-ID: <20051227130007.05EFD27B38@code1.codespeak.net> Author: ale Date: Tue Dec 27 14:00:06 2005 New Revision: 21569 Modified: pypy/dist/pypy/lib/pyontology/pyontology.py pypy/dist/pypy/lib/pyontology/test/test_ontology.py Log: Added support and tests for domain and range. Modified: pypy/dist/pypy/lib/pyontology/pyontology.py ============================================================================== --- pypy/dist/pypy/lib/pyontology/pyontology.py (original) +++ pypy/dist/pypy/lib/pyontology/pyontology.py Tue Dec 27 14:00:06 2005 @@ -37,7 +37,7 @@ self.properties = {} def __repr__(self): - return "<%s %s %r>" % (self.__class__, str(self.name),self.values) + return "<%s %s %r>" % (self.__class__, str(self.name),self.getValues()) def __getitem__(self, index): return None @@ -67,8 +67,34 @@ self.values = values class Property(ClassDomain): - pass - + # Property contains the relationship between a class instance and a value + # - a pair. 
To accomodate global assertions like 'range' and 'domain' anonymous + # pairs are allowed, ie None as key + def __init__(self, name='', values=[], bases = []): + ClassDomain.__init__(self, name, values, bases) + self._dict = {} + + def getValues(self): + return tuple(self._dict.items()) + + def setValues(self, values): + for k,v in values: + print "####",k,v + self._dict.setdefault(k,[]) + if type(v) == list: + self._dict[k] = v + else: + if not v in self._dict[k]: + self._dict[k].append(v) + + def removeValues(self, values): + for k,v in values: + vals = self._dict[k] + if vals == [None]: + self._dict.pop(k) + else: + self._dict[k] = [ x for x in vals if x != v] + class ObjectProperty(Property): pass @@ -274,7 +300,8 @@ else: # var is a builtin class cls =builtin_voc[var.split('#')[-1]](name=svar) - cls.setValues(self.variables[svar].getValues()) + vals = self.variables[svar].getValues() + cls.setValues(vals) self.variables[svar] = cls def first(self, s, p, var): @@ -337,34 +364,42 @@ def range(self, s, p, var): avar = self.make_var(ClassDomain, var) svar = self.make_var(Property, s) - vals = get_values(self.variables[avar], self.variables).keys() - for v in self.variables[svar].getValues(): - if not v in vals: - vals.append(v) + vals = get_values(self.variables[avar], self.variables) + for k,v in self.variables[svar].getValues(): + for x in v: + if not x in vals: + vals.append(x) + vals =[(None,val) for val in vals] self.variables[svar].setValues(vals) - cons = RangeConstraint( svar, avar) + cons = RangeConstraint(svar, avar) self.constraints.append(cons) - def domain(self, s, p, var): # The classes that has this property (s) must belong to the class extension of var avar = self.make_var(ClassDomain, var) svar = self.make_var(Property, s) assert isinstance(self.variables[svar], Property) assert isinstance(self.variables[avar], ClassDomain) - + vals = get_values(self.variables[avar], self.variables) + if len(vals) == 0 : + vals = [(self.variables[avar], None)] + 
for k,v in self.variables[svar].getValues(): + if not k in vals: + vals.append((k,v)) + self.variables[svar].setValues(vals) + print "->",vals, self.variables[svar] + cons = DomainConstraint(svar, avar) + self.constraints.append(cons) def subPropertyOf(self, s, p, var): - # TODO: implement this # s is a subproperty of var - avar = self.make_var(ClassDomain, var) + avar = self.make_var(Property, var) svar = self.make_var(Property, s) cons = SubPropertyConstraint( svar, avar) self.constraints.append(cons) def equivalentProperty(self, s, p, var): - # TODO: implement this - avar = self.make_var(ClassDomain, var) + avar = self.make_var(Property, var) svar = self.make_var(Property, s) cons = EquivalentConstraint( svar, avar) self.constraints.append(cons) @@ -426,10 +461,6 @@ # TODO: implement this pass - def someValuesFrom(self, s, p, var): - # TODO: implement this - pass - def imports(self, s, p, var): # TODO: implement this pass @@ -486,14 +517,16 @@ def get_values(dom, domains, attr = 'getValues'): - res = {} + res = [] for val in getattr(dom, attr)(): - res[val] = 1 + res.append(val) + if type(val) == tuple: + val = val[0] if val in domains.keys(): - res.update( get_values(val, domains, attr)) + res.extend(get_values(val, domains, attr)) #res[dom] = 1 return res - + class SubClassConstraint(AbstractConstraint): def __init__(self, variable, cls_or_restriction): @@ -506,7 +539,7 @@ def narrow(self, domains): subdom = domains[self.variable] superdom = domains[self.super] - bases = get_values(superdom, domains, 'getBases').keys() + bases = get_values(superdom, domains, 'getBases') subdom.bases += [bas for bas in bases if bas not in subdom.bases] vals = get_values(subdom, domains, 'getValues') superdom.values += [val for val in vals if val not in superdom.values] @@ -523,10 +556,10 @@ def narrow(self, domains): subdom = domains[self.variable] superdom = domains[self.super] - bases = get_values(superdom, domains, 'getBases').keys() + bases = get_values(superdom, domains, 
'getBases') subdom.bases += [bas for bas in bases if bas not in subdom.bases] - vals1 = get_values(superdom, domains, 'getValues').keys() - vals2 = get_values(variable, domains, 'getValues').keys() + vals1 = get_values(superdom, domains, 'getValues') + vals2 = get_values(variable, domains, 'getValues') for i in vals1: if i in vals2: raise ConsistencyError @@ -556,12 +589,35 @@ def narrow(self, domains): subdom = domains[self.variable] superdom = domains[self.super] - vals = get_values(superdom, domains, 'getValues').keys() + vals = get_values(superdom, domains, 'getValues') res = [] - for val in get_values(subdom, domains, 'getValues').keys(): - if not val in vals: - res.append(val) + svals = get_values(subdom, domains, 'getValues') + for k,val in svals: + for v in val: + if not v in vals: + res.append((k,v)) + subdom.removeValues(res) + +class DomainConstraint(AbstractConstraint): + + def __init__(self, variable, cls_or_restriction): + AbstractConstraint.__init__(self, [variable]) + # worst case complexity + self.__cost = 1 #len(variables) * (len(variables) - 1) / 2 + self.super = cls_or_restriction + self.variable = variable + + def narrow(self, domains): + subdom = domains[self.variable] + superdom = domains[self.super] + vals = get_values(superdom, domains, 'getValues') + res = [] + for k,val in get_values(subdom, domains, 'getValues'): + if not k in vals and k != superdom: + res.append((k,val)) + print "res",res,vals,superdom subdom.removeValues(res) + print "---",subdom class SubPropertyConstraint(AbstractConstraint): @@ -575,7 +631,7 @@ def narrow(self, domains): subdom = domains[self.variable] superdom = domains[self.super] - vals = get_values(superdom, domains, 'getValues').keys() + vals = get_values(superdom, domains, 'getValues') for val in subdom.getValues(): if not val in vals: raise ConsistencyError("Value not in prescribed range") @@ -592,7 +648,7 @@ def narrow(self, domains): subdom = domains[self.variable] superdom = domains[self.super] - vals = 
get_values(superdom, domains, 'getValues').keys() + vals = get_values(superdom, domains, 'getValues') for val in subdom.getValues(): if not val in vals: raise ConsistencyError("Value not in prescribed range") Modified: pypy/dist/pypy/lib/pyontology/test/test_ontology.py ============================================================================== --- pypy/dist/pypy/lib/pyontology/test/test_ontology.py (original) +++ pypy/dist/pypy/lib/pyontology/test/test_ontology.py Tue Dec 27 14:00:06 2005 @@ -122,7 +122,7 @@ O.type(sub, pred , obj) assert len(O.constraints) == 1 O.constraints[0].narrow(O.variables) - assert list(O.variables['a_'].getValues()) == [1,2,3,4] + assert O.variables['a_'].getValues() == ((None,[1,2,3,4]),) def test_merge(): O = Ontology() @@ -130,7 +130,6 @@ obj = URIRef('b') O.variables['b_'] = fd([1,2,3,4]) O.range(sub, None , obj) - sub = URIRef('a') obj = URIRef('c') O.variables['c_'] = fd([3,4,5,6]) O.range(sub, None , obj) @@ -140,4 +139,47 @@ O.type(sub, pred , obj) assert len(O.constraints) == 2 O.consistency() - assert list(O.variables['a_'].getValues()) == [3,4] + assert O.variables['a_'].getValues() == ((None, [3,4]),) + +def test_domain(): + O = Ontology() + sub = URIRef('a') + obj = URIRef('b') + O.variables['b_'] = ClassDomain('b') + O.domain(sub, None , obj) + sub = URIRef('a') + pred = URIRef('type') + obj = URIRef(namespaces['owl']+'#ObjectProperty') + O.type(sub, pred , obj) + assert len(O.constraints) == 1 + O.constraints[0].narrow(O.variables) + assert O.variables['a_'].getValues() == ((O.variables['b_'], [None]),) + +def test_domain_merge(): + O = Ontology() + sub = URIRef('a') + obj = URIRef('b') + O.variables['b_'] = ClassDomain('b') + O.domain(sub, None , obj) + obj = URIRef('c') + O.variables['c_'] = ClassDomain('c') + O.domain(sub, None , obj) + pred = URIRef('type') + obj = URIRef(namespaces['owl']+'#ObjectProperty') + O.type(sub, pred , obj) + + assert len(O.constraints) == 2 + for con in O.constraints: + 
con.narrow(O.variables) + assert O.variables['a_'].getValues() ==() #O.variables['b_'] + +def test_subproperty(): + O = Ontology() + sub = URIRef('a') + obj = URIRef(namespaces['owl']+'#ObjectProperty') + O.type(sub, None, obj) + b = URIRef('b') + O.type(b, None, obj) + O.subPropertyOf(sub, None, b) + assert len(O.constraints) ==1 + assert O.variables['a_'].getValues() in O.variables['b_'].getValues() \ No newline at end of file From hpk at codespeak.net Tue Dec 27 17:35:06 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Tue, 27 Dec 2005 17:35:06 +0100 (CET) Subject: [pypy-svn] r21570 - pypy/extradoc/talk/22c3 Message-ID: <20051227163506.9702D27B38@code1.codespeak.net> Author: hpk Date: Tue Dec 27 17:35:05 2005 New Revision: 21570 Added: pypy/extradoc/talk/22c3/hpk-tech.txt (contents, props changed) Log: first draft of talk for 22c3 (process with rest2s5.py from docutils-svn) Added: pypy/extradoc/talk/22c3/hpk-tech.txt ============================================================================== --- (empty file) +++ pypy/extradoc/talk/22c3/hpk-tech.txt Tue Dec 27 17:35:05 2005 @@ -0,0 +1,237 @@ +.. include:: + +================================================= +PyPy - The new Python implementation on the block +================================================= + +:Authors: Holger Krekel & Carl Friedrich Bolz +:Date: 28th December 2005 + +PyPy/Translation overview +========================= + +.. 
image:: translation-overview.png
+   :scale: 500
+
+Python implementation facts
+===========================
+
+- Parser/Compiler produces bytecode
+- Virtual Machine interprets bytecode
+- strongly dynamically typed
+- clean object model at Python and C level
+
+Python implementations
+===========================
+
+- CPython: main Python version (BDFL'ed by Guido)
+- Jython: compiles to Java Bytecode
+- IronPython (MS): compiles to .NET's CLR
+- PyPy: self-contained - self-translating
+
+PyPy implementation facts
+============================
+
+- implements full Python language in Python itself
+- parts implemented in a restricted subset: RPython
+- "static enough" for full-program type inference
+- but at boot time we allow unrestricted python!
+- 350 subscribers to pypy-dev, 150.000 LOCs, 20.000 visitors per month,
+- MIT license
+
+PyPy/Python architecture
+=============================
+[picture]
+
+- parser and compiler
+- bytecode interpreter
+- Standard Object Space / Type implementations
+- Python VM = interpreter + Standard Object Space
+- builtin and fundamental modules
+
+Parser and Compiler
+===================
+
+- parses python source code to AST
+- compiles AST to code objects (bytecode)
+- works from the CPython grammar definition
+- can be modified/extended at runtime (almost)
+- [showing interactively] ...
+
+Bytecode interpreter
+====================
+
+- interprets bytecode/code objects through "Frames"
+- Frames tie to global and local variable scopes
+- implements control flow (loops, branches, exceptions, calls)
+- dispatches all operations on objects to an Object Library
+  ("Space")
+
+Object Spaces
+=============
+
+- library of all python types and operations on them
+- is not concerned with control flow or bytecode
+- ...
+
+Builtin and Fundamental Modules
+===============================
+
+- around 200 builtin functions and classes
+- fundamental modules like 'sys' and 'os' implemented
+- largely compliant with CPython's regression tests
+
+Animation on Interpreter/Objspace interaction
+===============================================
+
+- shown on pygame-window ...
+
+
+.. Carl
+
+
+PyPy/Translation architecture
+=============================
+
+- bytecode interpreter
+- Flow Object Space
+- Annotation
+- Specialising to lltypesystem / ootypesystem
+- C and LLVM Backends to lltypesystem
+
+
+Abstract Interpretation
+========================
+
+- bytecode interpreter dispatches to Flow Object Space
+- Flow Object Space implements abstract operations
+- produces flow graphs as a side effect
+- starts from "live" byte code NOT source code
+- pygame demonstration
+
+
+Type Inference
+===============
+
+- performs forward propagating type inference
+- is used to infer the types in flow graphs
+- needs types of the entry point function's arguments
+- assumes that the used types are static
+- goes from very special to more general values
+
+Specialization
+===========================
+
+- annotated flow graphs are specialized for language families
+- lltypesystem (for C like languages): C, LLVM
+- ootypesystem (for OO languages): Java, Javascript, Smalltalk
+- result is specialized flow graphs
+- these contain operations at target level
+
+Backends
+==========
+
+- produce code out of specialized flow graphs
+- complete backends: C, LLVM
+- ongoing: JavaScript, Squeak
+- foreign function calls: manually written glue snippets
+- big example
+
+Translation Aspects
+====================
+
+- implementation decisions (GC, threading, CC) at translation time
+- most language implementations make a "fixed" decision
+- translation aspects are woven into the produced code
+- independent from language semantics (python interpreter)
+
+Aspects: Memory Models
+===================================
+
+- Currently implemented: refcounting, Boehm-collector +- more general exact GCs (copying, mark&sweep, ...) - not yet +- different allocation strategies - not yet + +Aspects: Threading Models +================================= + +- currently implemented: single thread and global interpreter lock +- future plans: free threading models +- stacklessness: don't use the C stack for user-level recursion +- implemented as a part of the backends + +comparison to other approaches +=============================== + +========== =========== ============ ============= +Project languages environments impl aspects +========== =========== ============ ============= +PyPy 1 (for now) variable variable +Java variable 1 semi-variable +.NET variable 1 semi-variable +========== =========== ============ ============= + +- environments: language backends, standard runtime + environments +- implementation aspects: GC, threading, calling + conventions, security, ... + + +.. holger + +three public releases +===================== + +- 0.6 quite compliant python implementation +- 0.7 compliant self-contained python implementation +- 0.8 full parser and compiler, "10-50 times" better speed + +lots of documentation +====================== +- http://codespeak.net/pypy +- 23rd December: release of 10 PyPy reports to the EU +- talks, papers, slides available on the site + + +PyPy cross pollination +======================= +- perl6: Object Spaces +- llvm +- cpython +- squeak (started last CCC conf) +- IronPython/Microsoft + +the speed issue +=============== +- currently 5-15 times slower than CPython +- now seriously starting with optmisations at various levels +- pypy can translate (R-)python code to something 10-50 times + faster compared to running on top of CPython + +pypy development method +======================== +- sprints +- test-driven +- open source culture +- see talk tomorrow 5pm (29th Dec. 
2005) + +technical outlook 2006 +====================== + +- specialising JIT-compiler, processor backends +- stackless/non-c calling conventions (CPS) +- GC / threading integration + extensions +- orthogonal persistence and distribution (see thunk example) +- built-in security (e-lang ...) + +outlook on whole project level +============================== + +- surviving the EU review in Bruxelles 20th January 2006 +- better interactions with community & contribution +- taking care about post-EU development (2007++) +- visiting the US, Japan ... + + +.. |bullet| unicode:: U+02022 +.. footer:: Holger Krekel, Carl Friedrich Bolz (merlinux) |bullet| 22C3 |bullet| 28th December 2005 From hpk at codespeak.net Tue Dec 27 17:36:19 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Tue, 27 Dec 2005 17:36:19 +0100 (CET) Subject: [pypy-svn] r21571 - pypy/extradoc/talk/22c3 Message-ID: <20051227163619.216E427B3E@code1.codespeak.net> Author: hpk Date: Tue Dec 27 17:36:15 2005 New Revision: 21571 Added: pypy/extradoc/talk/22c3/hpk-agility.txt (contents, props changed) Log: still rough draft (missing input from bea) of the agility talk on 29th, slides are in a somewhat random order Added: pypy/extradoc/talk/22c3/hpk-agility.txt ============================================================================== --- (empty file) +++ pypy/extradoc/talk/22c3/hpk-agility.txt Tue Dec 27 17:36:15 2005 @@ -0,0 +1,196 @@ +============================================================ +Open Source, EU Funding and Agile Methods +============================================================ + +:Authors: Bea During (changemaker), Holger Krekel (merlinux) +:Date: 29th December 2005 + +how it got started +================================================== + +- 2003 first emails between Armin Rigo, Christian Tismer and + Holger Krekel +- participated in zope3 coding events ("sprints") +- initial invitation for a one-week sprint to Trillke, Hildesheim +- participants got to know each other at 
conferences +- goal: Python implementation in Python (various motivations) + +key elements of the technical development +================================================== + +- test-driven from the start +- driven by architectural experiences +- welcomed by python community (just ask for name for a + project :) +- based on zen of python / python culture +- focus on correctness of concepts, then speed + +python community +================================================== + +- strong open-source cultural background +- strong focus on glue & integration esp. with C-libraries +- few flaming fights inside / much collaboration +- has many windows and unix hackers + +evolving agility +================================================== + +- all large python projects rely and depend on automated testing +- several projects regularly "sprint" and work together distributedly +- first community conference EuroPython in 2002 (now yearly) +- many test tools and methods available + +PyPy test-driven development +================================================== + +- identify problems/evolution by tests first +- our own testing and development tools +- rule: first get the semantics and concepts right! + optimize later! +- today around 3000 tests (plus CPython regression tests) + +PyPy's evolution as a project +================================================== + +- 2003: four one-week meetings, basic architecture evolved +- mid 2003: realisation that we'd need to work + full time on it to make it succeed +- idea for EU funding was born! +- collaborative work on a proposal ... + +EU funding +================================================== + +- proposal got good reviews from EU contracted experts +- negotiation and finalisation: 1 year! +- Dec. 2004 EU contract starts +- 2 year project, 1.3 Million Euro funding +- contractual framework + reporting obligations + +individuals ... 
need to organise +================================================== + +- PyPy was first and still is a network of people +- but EU only funds organisations +- two companies were founded +- 7 partners form the EU consortium: 6 companies, 1 university + +balance of interests +================================================== + +- developers want to (continue to) drive the project +- companies have to co-finance 50% of all costs + (travel/salary), commercial interests +- EU wants challenging research goals and tracking of goals + +- **at all levels it is about finding + models for co-operation that fit** + +free co-operation basics +================================================== + +- Christoph Spehr's "foundation of free co-operation": +- negotiate any model you want (including dictatorship) +- question and change rules and roles at any time +- everyone can "leave" the co-operation without penalty +- leaving party taking a share with him/her + +developers collaboration +================================================== + +- weekly 30 minute synchronisation meetings +- open collaborative open-source work style +- representation through Trusted "Technical Board" + within the EU project +- research/architecture informally guided by accepted experts + +company collaboration +================================================== + +- contractually through the EU consortium +- exchange of knowledge and people, shared tools +- evolving commercial opportunities + US companies asking for consulting (test tool) + or wanting to hire/pay developers on pypy related tools + +----- bea ----- + +- sprints +- how does that EU thing work + +- EU review + +- agile experiences from other projects + +---------------- + +problems and perspectives +================================================== + +- confrontation with people from +- same planet, different place +- different planet +- different solar system +- ... +- what follows is slightly abstracted ... 
+ +working with people from the same planet +================================================== + +- generally shared perspectives (synchronised + rotation around a common center) on project success +- different continents can be hard enough +- potential for misunderstanding/friction + +different planet +================================================== +- shared view regarding the center (project success) +- quite different working perspectives / methods + (e.g. "open-source collaborative" versus "formal traditional") +- potential for mistrust and dis-connection + +different solar system +================================================== + +- missing shared focus on project success?! +- alien or alienating interests + +the universal truth is ... +================================================== + +- often good intentions (!) even from aliens +- confrontation with lots of levels, planets and solar systems +- the challenge is to find a fitting model for case-by-case co-operation! +- determine location in universe and try to match and + synchronize + +cross-project pollination +================================================== + +- zope-europe +- dissemination: universities, IONA, Intel, HP ... +- Alan Kay / Squeak ... + +business collaboration example: canonical +================================================== + +- founded and funded by Mark Shuttleworth (Thawte founder) +- drives infrastructure development for Ubuntu +- by now 30-50 developers worldwide +- main language is python +- weekly sync meetings / open-source collaborative work style + +business collaboration example: merlinux +================================================== + +- founded just before the EU project (see last year's talk) +- hires 8 people distributed across europe (and brasil) +- weekly sync meetings +- looks for fitting contracting models, "making it fit" +- it's not about products, but about viable solutions based on + an efficient development model + +.. 
|bullet| unicode:: U+02022 +.. footer:: Bea During, Holger Krekel |bullet| 22C3 |bullet| 29th December 2005 + From hpk at codespeak.net Tue Dec 27 17:38:24 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Tue, 27 Dec 2005 17:38:24 +0100 (CET) Subject: [pypy-svn] r21572 - pypy/extradoc/talk/22c3 Message-ID: <20051227163824.A94CE27B3E@code1.codespeak.net> Author: hpk Date: Tue Dec 27 17:38:22 2005 New Revision: 21572 Added: pypy/extradoc/talk/22c3/translation-overview.png (contents, props changed) pypy/extradoc/talk/22c3/translation-overview.sxd (contents, props changed) Modified: pypy/extradoc/talk/22c3/hpk-tech.txt Log: - added simple "translation overview" images to be used for the 22c3 talk Modified: pypy/extradoc/talk/22c3/hpk-tech.txt ============================================================================== --- pypy/extradoc/talk/22c3/hpk-tech.txt (original) +++ pypy/extradoc/talk/22c3/hpk-tech.txt Tue Dec 27 17:38:22 2005 @@ -7,6 +7,8 @@ :Authors: Holger Krekel & Carl Friedrich Bolz :Date: 28th December 2005 +.. holger + PyPy/Translation overview ========================= Added: pypy/extradoc/talk/22c3/translation-overview.png ============================================================================== Binary file. No diff available. Added: pypy/extradoc/talk/22c3/translation-overview.sxd ============================================================================== Binary file. No diff available. 
From arigo at codespeak.net Tue Dec 27 18:04:55 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 27 Dec 2005 18:04:55 +0100 (CET) Subject: [pypy-svn] r21573 - pypy/extradoc/talk/22c3 Message-ID: <20051227170455.ADBAF27B39@code1.codespeak.net> Author: arigo Date: Tue Dec 27 18:04:53 2005 New Revision: 21573 Modified: pypy/extradoc/talk/22c3/hpk-tech.txt Log: Typos Modified: pypy/extradoc/talk/22c3/hpk-tech.txt ============================================================================== --- pypy/extradoc/talk/22c3/hpk-tech.txt (original) +++ pypy/extradoc/talk/22c3/hpk-tech.txt Tue Dec 27 18:04:53 2005 @@ -36,7 +36,7 @@ - implements full Python language in Python itself - parts implemented in a restricted subset: RPython -- "static enough" for full-programm type inference +- "static enough" for full-program type inference - but at boot time we allow unrestricted python! - 350 subscribers to pypy-dev, 150.000 LOCs, 20.000 visitors per month, - MIT license @@ -143,11 +143,11 @@ ==================== - implementation decisions (GC, threading, CC) at translation time -- most language implementation do a "fixed" decision +- most other language implementations do a "fixed" decision - translation aspects are weaved into the produced code - independent from language semantics (python interpreter) -aspects: Memory Models +Aspects: Memory Models =================================== - Currently implemented: refcounting, Boehm-collector @@ -206,7 +206,7 @@ the speed issue =============== - currently 5-15 times slower than CPython -- now seriously starting with optmisations at various levels +- now seriously starting with optimisations at various levels - pypy can translate (R-)python code to something 10-50 times faster compared to running on top of CPython @@ -221,7 +221,7 @@ ====================== - specialising JIT-compiler, processor backends -- stackless/non-c calling conventions (CPS) +- stackless/non-C calling conventions (CPS) - GC / threading 
integration + extensions - orthogonal persistence and distribution (see thunk example) - built-in security (e-lang ...) From bea at codespeak.net Tue Dec 27 21:36:50 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Tue, 27 Dec 2005 21:36:50 +0100 (CET) Subject: [pypy-svn] r21574 - pypy/extradoc/talk/22c3 Message-ID: <20051227203650.913BC27B36@code1.codespeak.net> Author: bea Date: Tue Dec 27 21:36:49 2005 New Revision: 21574 Added: pypy/extradoc/talk/22c3/part_bea.txt Log: beas notes for 22c3 talk - to be merged with holgers notes Added: pypy/extradoc/talk/22c3/part_bea.txt ============================================================================== --- (empty file) +++ pypy/extradoc/talk/22c3/part_bea.txt Tue Dec 27 21:36:49 2005 @@ -0,0 +1,80 @@ +22C3 "Open Source, EU Funding and Agile Methods" +29th of December 2005/Berlin + +Slides Bea: + + +Core of Agile practices: the people factor +- "Agile processes are designed to capitalize on each individual and each team's unique strengths" (Cockburn, Highsmith, 2001) +- OSS nature of teams: self-organized, intensely collaborative - fit the agile approach +- OSS teams are a unique implementation of agile practices - why? +Picture: man on the moon + +Agile approaches aim at: + * reducing "cost of information", distance from decision-making + * by physical location, unorthodox dissemination + * resulting in improved sense of community, team "morale" + +Origins of sprinting +- Scrum (Agile community): 1 month long iteration of development work, increments + (also supporting activities: planning, documentation, tracking work, evaluation) +- Zope Foundation (Python Community): "two-day or three-day focused development session, + in which developers pair off together in a room and focus on building a particular subsystem". 
+ +PyPy sprints +- The project "started" during a sprint +- Changing facilities and location as a strategy (Vilnius, Louvain-la-Neuve, Leysin, Gothenburg, + Paris, Heidelberg, Hildesheim, Washington etc) +- The nature of sprints has evolved since the project started 2003 and since receiving + partial EU-funding 2004/2005 + +Bidding for the EU-funding +- Project needed to reach critical mass, EU needed novel compiler design techniques in OSS contexts +- Proposal was written during sprints as well as distributed (submitted Oct 2003) +- Negotiations in Brussels (March 2004): key issues being 30% cuts in budget and denied procedures for funding contribution +- Project "started" 1 Dec 2004: troubles with creating consortium agreement fitting the OSS structure needed + +Organising the consortium +- 7 partners, 3 previously not involved in the PyPy community +- 2 new companies: "forced" entrepreneurship +- All partners but one partially funded (50% cost models) +- Less than 5% of the involved developers were covered by this partial funding + +Organising the work +- 14 workpackages and 58 deliverables, 3 phases +- Need for consortium meetings every month (IRC) +- Sprints every 6th week (coordinating the development and management work) + +The different cultures of the PyPy project +- OSS (Python) culture (agile and distributed workstyle) +- EU project culture +- Traditional project management culture +- Chaospilot (actionlearning and process design) culture +- 3 different national cultures + +The challenge: managing diversities part 1. +- Formal project organization vs developer driven process + - management team, technical board and partners + - sprint organising +- Resulting in: increased risk of added workload of management work on core developers + +The challenge: managing diversities part 2. 
+- Formal EU requirements vs agile strategies + - written high level requirements + - change control structures complicated +- Resulting in: increased risk of missing opportunities and not creating/reacting to change fast enough + +The challenge: managing diversities part 3. +- OSS community vs "conceptual integrity" + - pypy-dev/core developers in technical board + - industrial usage vs research oriented work +- Resulting in: increased risk for unbalancing the community + +Picture: chaos vs structure + +Conclusion +- A shared and challenging vision +- Respecting and "exploiting" strengths of the different cultures involved +- Designing minimalistic project structures channeling work, not hindering work +- Room for group learning and creating change - not just reacting to change \ No newline at end of file From bea at codespeak.net Tue Dec 27 21:38:29 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Tue, 27 Dec 2005 21:38:29 +0100 (CET) Subject: [pypy-svn] r21575 - pypy/extradoc/talk/22c3 Message-ID: <20051227203829.6059027B36@code1.codespeak.net> Author: bea Date: Tue Dec 27 21:38:27 2005 New Revision: 21575 Modified: pypy/extradoc/talk/22c3/part_bea.txt (contents, props changed) Log: eolstyle Modified: pypy/extradoc/talk/22c3/part_bea.txt ============================================================================== --- pypy/extradoc/talk/22c3/part_bea.txt (original) +++ pypy/extradoc/talk/22c3/part_bea.txt Tue Dec 27 21:38:27 2005 @@ -1,80 +1,80 @@ -22C3 "Open Source, EU Funding and Agile Methods" -29th of December 2005/Berlin - -Slides Bea: - - -Core of Agile practices: the people factor -- "Agile processes are designed to capitalize on each individual and each team's unique strengths" (Cockburn, Highsmith, 2001) -- OSS nature of teams: self-organized, intensely collaborative - fit the agile approach -- OSS teams are a unique implementation of agile practices - why? 
- -Picture: man on the moon - -Agile approaches aim at: - * reducing "cost of information", distance from decision-making - * by physical location, unorthodox dissemination - * resulting in improved sense of community, team "morale" - -Origins of sprinting -- Scrum (Agile community): 1 month long iteration of development work, increments - (also supporting activities: planning, documentation, tracking work, evaluation) -- Zope Foundation (Python Community): "two-day or three-day focused development session, - in which developers pair off together in a room and focus on building a particular subsystem". - -PyPy sprints -- The project "started" during a sprint -- Changing facilities and location as a strategy (Vilnius, Louvain-la-Neuve, Leysin, Gothenburg, - Paris, Heidelberg, Hildesheim, Washington etc) -- The nature of sprints has evolved since the project started 2003 and since receiving - partial EU-funding 2004/2005 - -Bidding for the EU-funding -- Project needed to reach critical mass, EU needed novel compiler design techniques in OSS contexts -- Proposal was written during sprints as well as distributed (submitted Oct 2003) -- Negotiations in Brussels (March 2004): key issues being 30% cuts in budget and denied procedures for funding contribution -- Project "started" 1 Dec 2004: troubles with creating consortium agreement fitting the OSS structure needed - -Organising the consortium -- 7 partners, 3 previously not involved in the PyPy community -- 2 new companies: "forced" entrepreneurship -- All partners but one partially funded (50% cost models) -- Less than 5% of the involved developers were covered by this partial funding - -Organising the work -- 14 workpackages and 58 deliverables, 3 phases -- Need for consortium meetings every month (IRC) -- Sprints every 6th week (coordinating the development and management work) - -The different cultures of the PyPy project -- OSS (Python) culture (agile and distributed workstyle) -- EU project culture -- Traditional
project management culture -- Chaospilot (actionlearning and process design) culture -- 3 different national cultures - -The challenge: managing diversities part 1. -- Formal project organization vs developer driven process - - management team, technical board and partners - - sprint organising -- Resulting in: increased risk of added workload of management work on core developers - -The challenge: managing diversities part 2. -- Formal EU requirements vs agile strategies - - written high level requirements - - change control structures complicated -- Resulting in: increased risk of missing opportunities and not creating/reacting to change fast enough - -The challenge: managing diversities part 3. -- OSS community vs "conceptual integrity" - - pypy-dev/core developers in technical board - - industrial usage vs research oriented work -- Resulting in: increased risk for unbalancing the community - -Picture: chaos vs structure - -Conclusion -- A shared and challenging vision -- Respecting and "exploiting" strengths of the different cultures involved -- Designing minimalistic project structures channeling work, not hindering work +22C3 "Open Source, EU Funding and Agile Methods" +29th of December 2005/Berlin + +Slides Bea: + + +Core of Agile practices: the people factor +- "Agile processes are designed to capitalize on each individual and each team's unique strengths" (Cockburn, Highsmith, 2001) +- OSS nature of teams: self-organized, intensely collaborative - fit the agile approach +- OSS teams are a unique implementation of agile practices - why? 
+ +Picture: man on the moon + +Agile approaches aim at: + * reducing "cost of information", distance from decision-making + * by physical location, unorthodox dissemination + * resulting in improved sense of community, team "morale" + +Origins of sprinting +- Scrum (Agile community): 1 month long iteration of development work, increments + (also supporting activities: planning, documentation, tracking work, evaluation) +- Zope Foundation (Python Community): "two-day or three-day focused development session, + in which developers pair off together in a room and focus on building a particular subsystem". + +PyPy sprints +- The project "started" during a sprint +- Changing facilities and location as a strategy (Vilnius, Louvain-la-Neuve, Leysin, Gothenburg, + Paris, Heidelberg, Hildesheim, Washington etc) +- The nature of sprints has evolved since the project started 2003 and since receiving + partial EU-funding 2004/2005 + +Bidding for the EU-funding +- Project needed to reach critical mass, EU needed novel compiler design techniques in OSS contexts +- Proposal was written during sprints as well as distributed (submitted Oct 2003) +- Negotiations in Brussels (March 2004): key issues being 30% cuts in budget and denied procedures for funding contribution +- Project "started" 1 Dec 2004: troubles with creating consortium agreement fitting the OSS structure needed + +Organising the consortium +- 7 partners, 3 previously not involved in the PyPy community +- 2 new companies: "forced" entrepreneurship +- All partners but one partially funded (50% cost models) +- Less than 5% of the involved developers were covered by this partial funding + +Organising the work +- 14 workpackages and 58 deliverables, 3 phases +- Need for consortium meetings every month (IRC) +- Sprints every 6th week (coordinating the development and management work) + +The different cultures of the PyPy project +- OSS (Python) culture (agile and distributed workstyle) +- EU project culture +- Traditional
project management culture +- Chaospilot (actionlearning and process design) culture +- 3 different national cultures + +The challenge: managing diversities part 1. +- Formal project organization vs developer driven process + - management team, technical board and partners + - sprint organising +- Resulting in: increased risk of added workload of management work on core developers + +The challenge: managing diversities part 2. +- Formal EU requirements vs agile strategies + - written high level requirements + - change control structures complicated +- Resulting in: increased risk of missing opportunities and not creating/reacting to change fast enough + +The challenge: managing diversities part 3. +- OSS community vs "conceptual integrity" + - pypy-dev/core developers in technical board + - industrial usage vs research oriented work +- Resulting in: increased risk for unbalancing the community + +Picture: chaos vs structure + +Conclusion +- A shared and challenging vision +- Respecting and "exploiting" strengths of the different cultures involved +- Designing minimalistic project structures channeling work, not hindering work - Room for group learning and creating change - not just reacting to change \ No newline at end of file From pedronis at codespeak.net Tue Dec 27 21:44:29 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 27 Dec 2005 21:44:29 +0100 (CET) Subject: [pypy-svn] r21576 - pypy/extradoc/talk/22c3 Message-ID: <20051227204429.A49F227B39@code1.codespeak.net> Author: pedronis Date: Tue Dec 27 21:44:28 2005 New Revision: 21576 Modified: pypy/extradoc/talk/22c3/hpk-tech.txt Log: fix typo. 
Java the platform is the JVM Modified: pypy/extradoc/talk/22c3/hpk-tech.txt ============================================================================== --- pypy/extradoc/talk/22c3/hpk-tech.txt (original) +++ pypy/extradoc/talk/22c3/hpk-tech.txt Tue Dec 27 21:44:28 2005 @@ -117,7 +117,7 @@ - performs forward propagating type inference - is used to infer the types in flow graphs -- needs types of the entry poing function's arguments +- needs types of the entry point function's arguments - assumes that the used types are static - goes from very special to more general values @@ -169,7 +169,7 @@ Project languages environments impl aspects ========== =========== ============ ============= PyPy 1 (for now) variable variable -Java variable 1 semi-variable +JVM/Java variable 1 semi-variable .NET variable 1 semi-variable ========== =========== ============ ============= From hpk at codespeak.net Wed Dec 28 09:39:20 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Wed, 28 Dec 2005 09:39:20 +0100 (CET) Subject: [pypy-svn] r21581 - pypy/extradoc/talk/22c3 Message-ID: <20051228083920.BA3C827B35@code1.codespeak.net> Author: hpk Date: Wed Dec 28 09:39:18 2005 New Revision: 21581 Modified: pypy/extradoc/talk/22c3/hpk-tech.txt pypy/extradoc/talk/22c3/part_bea.txt Log: formatting checkin Modified: pypy/extradoc/talk/22c3/hpk-tech.txt ============================================================================== --- pypy/extradoc/talk/22c3/hpk-tech.txt (original) +++ pypy/extradoc/talk/22c3/hpk-tech.txt Wed Dec 28 09:39:18 2005 @@ -9,12 +9,6 @@ .. holger -PyPy/Translation overview -========================= - -.. 
image:: translation-overview.png - :scale: 500 - Python implementation facts =========================== @@ -29,21 +23,28 @@ - CPython: main Python version (BDFL'ed by Guido) - Jython: compiles to Java Bytecode - IronPython (MS): compiles to .NET's CLR -- PyPy: self-contained - self-translating +- PyPy: self-contained - self-translating - flexible + +PyPy project facts +======================= + +- started 2002 as a grass-root effort +- aims: flexibility, research, speed +- test-driven development +- received EU-funding from end 2004 on +- 350 subscribers to pypy-dev, 150.000 LOCs, 20.000 visitors per month, +- MIT license PyPy implementation facts ============================ -- implements full Python language in Python itself +- implements Python language in Python itself - parts implemented in a restricted subset: RPython - "static enough" for full-program type inference -- but at boot time we allow unrestricted python! -- 350 subscribers to pypy-dev, 150.000 LOCs, 20.000 visitors per month, -- MIT license +- at boot time we allow unrestricted python! PyPy/Python architecture ============================= -[picture] - parser and compiler - bytecode interpreter @@ -51,6 +52,14 @@ - Python VM = interpreter + Standard Object Space - builtin and fundamental modules +PyPy/Python architecture picture +================================== + +.. image:: interpreter-overview.png + :width: 650 + :height: 400 + + Parser and Compiler =================== @@ -58,23 +67,24 @@ - compiles AST to code objects (bytecode) - works from the CPython grammar definition - can be modified/extended at runtime (almost) -- [showing interactively] ... +- [interactive command line dis-example] ... 
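The "[interactive command line dis-example]" placeholder in the hunk above presumably stands for a short session with CPython's standard `dis` module, which makes the parser/compiler -> bytecode step visible (a sketch, not necessarily the exact demo given in the talk):

```python
import dis

# compile() runs the parser/compiler stage: source text -> code object
code = compile("a = 2\nb = a * 21", "<talk-demo>", "exec")

# dis prints the bytecode instructions the virtual machine interprets
dis.dis(code)
```

The constants, names, and instruction stream of the resulting code object can also be inspected directly via `code.co_consts`, `code.co_names`, and `dis.get_instructions(code)`.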
Bytecode interpreter ==================== -- interprets bytecode/code objects through "Frames" +- interprets bytecode/code objects through Frame objects - Frames tie to global and local variable scopes - implements control flow (loops, branches, exceptions, calls) - dispatches all operations on objects to an Object Library - ("Space") + or "Object Space" Object Spaces ============= - library of all python types and operations on them +- encapsulates all knowledge about app-level objects - is not concerned with control flow or bytecode -- ... +- e.g. enough control to implement lazy evaluation Builtin and Fundamental Modules =============================== @@ -101,6 +111,12 @@ - Specialising to lltypesystem / ootypesystem - C and LLVM Backends to lltypesystem +PyPy/Translation overview +========================= + +.. image:: translation-overview.png + :height: 422 + :width: 760 Abstract Interpretation ======================== @@ -215,7 +231,7 @@ - sprints - test-driven - open source culture -- see talk tomorrow 5pm (29th Dec. 2005) +- see talk tomorrow 2pm (29th Dec. 
2005) technical outlook 2006 ====================== Modified: pypy/extradoc/talk/22c3/part_bea.txt ============================================================================== --- pypy/extradoc/talk/22c3/part_bea.txt (original) +++ pypy/extradoc/talk/22c3/part_bea.txt Wed Dec 28 09:39:18 2005 @@ -5,48 +5,70 @@ Core of Agile practises: the people factor -- "Agile processes are designed to capitalize on each individual and each team?s unique strenghts" (Cockburn, Highsmith, 2001) -- OSS nature of teams: self-organized, intensely collaborative - fit the agile approach +============================================================ + +- "Agile processes are designed to capitalize on each + individual and each team?s unique strenghts" (Cockburn, Highsmith, 2001) +- OSS nature of teams: self-organized, intensely + collaborative - fit the agile approach - OSS teams are an unique implementation of agile practices - why? Picture: man on the moon Agile approaches aim at: - * reducing "cost of information",distance from decision-making - * by physical location, unorthodox dissemination - * resulting in improved sense of community, team "morale" +============================================================ + +* reducing ... "cost of information",distance from decision-making +* by ... physical location, unorthodox dissemination +* resulting in ... improved sense of community, team "morale" Origins of sprinting -- Scrum (Agile community): 1 month long iteration of development work, increments - (also supporting activities: planning, documentation, tracking work, evaluation) -- Zope Foundation (Python Community): "two-day or three-day focused development session, - in which developers pair off together in a room and focus on building a particular subsystem". 
+============================================================ + +- Scrum (Agile community): 1 month long iteration of + development work, increments (also supporting activities: + planning, documentation, tracking work, evaluation) + +- Zope Foundation (Python Community): "two-day or three-day + focused development session, in which developers pair off + together in a room and focus on building a particular + subsystem". PyPy sprints +============================================================ - The project "started" during a sprint -- Changing facilities and location as a strategy (Vilnius, Louvain-la-Neuve, Leysin, Gothenburg, - Paris, Heidelberg, Hildesheim, Washington etc) -- The nature of sprints has evolved since the project started 2003 and since receiving - partial EU-funding 2004/2005 +- Changing facilities and location as a strategy (Vilnius, + Louvain-la-Neuve, Leysin, Gothenburg, Paris, Heidelberg, + Hildesheim, Washington etc) +- The nature of sprints has evolved since the project started + 2003 and since receiving partial EU-funding 2004/2005 Bidding for the EU-funding -- Project needed to reach critical mass, EU needed novel compiler design techniques in OSS contexts -- Proposal was written during sprints as well as distributed (submitted Oct 2003) +============================================================ +- Project needed to reach critical mass, EU needed novel + compiler design techniques in OSS contexts + +- Proposal was written during sprints as well as distributed + (submitted Oct 2003) - Negotiations in Brussels (March 2004): key issues being 30% cuts in budget and denied procedures for funding contribution -- Project "started" 1 Dec 2004: troubles with creating consortium agreement fitting the OSS structure needed +- Project "started" 1 Dec 2004: troubles with creating + consortium agreement fitting the OSS structure needed Organising the consortium +============================================================ - 7 partners, 3 previously not 
involved in the PyPy community - 2 new companies: "forced" entrepreneurship - All partners but one partially funded (50% cost models) - Less than 5% of the involved developers were covered by this partial funding Organising the work +============================================================ - 14 workpackages and 58 deliverables, 3 phases - Need for consortium meetings every month (IRC) -- Sprints every 6th week (coordinating the development and management work) +- Sprints every 6th week (coordinating development and management work) The different cultures of the PyPy project +============================================================ - OSS (Python) culture (agile and distributed workstyle) - EU project culture - Traditional project management culture @@ -54,18 +76,23 @@ - 3 different national cultures The challenge: managing diversities part 1. +============================================================ - Formal project organization vs developer driven process - management team, technical board and partners - sprint organising -- Resulting in: increased risk of added workload of management work on core developers +- Resulting in: increased risk of added workload of management + work on core developers The challenge: managing diversities part 2. +============================================================ - Formal EU requirements vs agile strategies - written high level requirements - change control structures complicated -- Resulting in: increased risk of missing opportunities and not creating/reacting to change fast enough +- Resulting in: increased risk of missing opportunities and not + creating/reacting to change fast enough The challenge: managing diversities part 3. 
+============================================================ - OSS community vs "conceptual integrity" - pypy-dev/core developers in technical board - industrial usage vs research oriented work @@ -74,7 +101,8 @@ Picture: chaos vs structure Conclusion +============================================================ - A shared and challenging vision - Respecting and "exploiting" strengths of the different cultures involved - Designing minimalistic project structures channeling work, not hindering work -- Room for group learning and creating change - not just reacting to change \ No newline at end of file +- Room for group learning and creating change - not just reacting to change From hpk at codespeak.net Wed Dec 28 09:43:01 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Wed, 28 Dec 2005 09:43:01 +0100 (CET) Subject: [pypy-svn] r21582 - pypy/extradoc/talk/22c3 Message-ID: <20051228084301.2B8D727B3E@code1.codespeak.net> Author: hpk Date: Wed Dec 28 09:42:59 2005 New Revision: 21582 Added: pypy/extradoc/talk/22c3/slides-agility.txt (contents, props changed) Log: rough merge - work in progress Added: pypy/extradoc/talk/22c3/slides-agility.txt ============================================================================== --- (empty file) +++ pypy/extradoc/talk/22c3/slides-agility.txt Wed Dec 28 09:42:59 2005 @@ -0,0 +1,296 @@ +============================================================ +Open Source, EU Funding and Agile Methods +============================================================ + +:Authors: Bea During (changemaker), Holger Krekel (merlinux) +:Date: 29th December 2005 + +how it got started +================================================== + +- 2003 first emails between Armin Rigo, Christian Tismer and + Holger Krekel +- participated in zope3 coding events ("sprints") +- initial invitation for a one-week sprint to Trillke, Hildesheim +- participants got to know each other at conferences +- goal: Python implementation in Python (various 
motivations) + +key elements of the technical development +================================================== + +- test-driven from the start +- driven by architectural experiences +- welcomed by python community (just ask for name for a + project :) +- based on zen of python / python culture +- focus on correctness of concepts, then speed + +python community +================================================== + +- strong open-source cultural background +- strong focus on glue & integration esp. with C-libraries +- few flaming fights inside / much collaboration +- has many windows and unix hackers + +evolving agility +================================================== + +- all large python projects rely and depend on automated testing +- several projects regularly "sprint" and work together distributedly +- first community conference EuroPython in 2002 (now yearly) +- many test tools and methods available + +PyPy test-driven development +================================================== + +- identify problems/evolution by tests first +- our own testing and development tools +- rule: first get the semantics and concepts right! + optimize later! +- today around 3000 tests (plus CPython regression tests) + +PyPy's evolution as a project +================================================== + +- 2003: four one-week meetings, basic architecture evolved +- mid 2003: realisation that we'd need to work + full time on it to make it succeed +- idea for EU funding was born! +- collaborative work on a proposal ... + +EU funding +================================================== + +- proposal got good reviews from EU contracted experts +- negotiation and finalisation: 1 year! +- Dec. 2004 EU contract starts +- 2 year project, 1.3 Million Euro funding +- contractual framework + reporting obligations + +individuals ... 
need to organise +================================================== + +- PyPy was first and still is a network of people +- but EU only funds organisations +- two companies were founded +- 7 partners form the EU consortium: 6 companies, 1 university + +balance of interests +================================================== + +- developers want to (continue to) drive the project +- companies have to co-finance 50% of all costs + (travel/salary), commercial interests +- EU wants challenging research goals and tracking of goals + +- **at all levels it is about finding + models for co-operation that fit** + +free co-operation basics +================================================== + +- Christoph Spehr's "foundation of free co-operation": +- negotiate any model you want (including dictatorship) +- question and change rules and roles at any time +- everyone can "leave" the co-operation without penalty +- leaving party taking a share with him/her + +developers collaboration +================================================== + +- weekly 30 minute synchronisation meetings +- open collaborative open-source work style +- representation through Trusted "Technical Board" + within the EU project +- research/architecture informally guided by accepted experts + +company collaboration +================================================== + +- contractually through the EU consortium +- exchange of knowledge and people, shared tools +- evolving commercial opportunities + US companies asking for consulting (test tool) + or wanting to hire/pay developers on pypy related tools + +Core of Agile practices: the people factor +============================================================ + +- "Agile processes are designed to capitalize on each + individual and each team's unique strengths" (Cockburn, Highsmith, 2001) +- OSS nature of teams: self-organized, intensely + collaborative - fit the agile approach +- OSS teams are a unique implementation of agile practices - why? 
+ +Picture: man on the moon + +Agile approaches aim at: +============================================================ + +* reducing ... "cost of information", distance from decision-making +* by ... physical location, unorthodox dissemination +* resulting in ... improved sense of community, team "morale" + +Origins of sprinting +============================================================ + +- Scrum (Agile community): 1 month long iteration of + development work, increments (also supporting activities: + planning, documentation, tracking work, evaluation) + +- Zope Foundation (Python Community): "two-day or three-day + focused development session, in which developers pair off + together in a room and focus on building a particular + subsystem". + +PyPy sprints +============================================================ +- The project "started" during a sprint +- Changing facilities and location as a strategy (Vilnius, + Louvain-la-Neuve, Leysin, Gothenburg, Paris, Heidelberg, + Hildesheim, Washington etc) +- The nature of sprints has evolved since the project started + 2003 and since receiving partial EU-funding 2004/2005 + +Bidding for the EU-funding +============================================================ +- Project needed to reach critical mass, EU needed novel + compiler design techniques in OSS contexts + +- Proposal was written during sprints as well as distributed + (submitted Oct 2003) +- Negotiations in Brussels (March 2004): key issues being 30% cuts in budget and denied procedures for funding contribution +- Project "started" 1 Dec 2004: troubles with creating + consortium agreement fitting the OSS structure needed + +Organising the consortium +============================================================ +- 7 partners, 3 previously not involved in the PyPy community +- 2 new companies: "forced" entrepreneurship +- All partners but one partially funded (50% cost models) +- Less than 5% of the involved developers were covered by this partial funding + 
+Organising the work +============================================================ +- 14 workpackages and 58 deliverables, 3 phases +- Need for consortium meetings every month (IRC) +- Sprints every 6th week (coordinating development and management work) + +The different cultures of the PyPy project +============================================================ +- OSS (Python) culture (agile and distributed workstyle) +- EU project culture +- Traditional project management culture +- Chaospilot (actionlearning and process design) culture +- 3 different national cultures + +The challenge: managing diversities part 1. +============================================================ +- Formal project organization vs developer driven process + - management team, technical board and partners + - sprint organising +- Resulting in: increased risk of added workload of management + work on core developers + +The challenge: managing diversities part 2. +============================================================ +- Formal EU requirements vs agile strategies + - written high level requirements + - change control structures complicated +- Resulting in: increased risk of missing opportunities and not + creating/reacting to change fast enough + +The challenge: managing diversities part 3. +============================================================ +- OSS community vs "conceptual integrity" + - pypy-dev/core developers in technical board + - industrial usage vs research oriented work +- Resulting in: increased risk for unbalancing the community + +Picture: chaos vs structure + + +problems and perspectives +================================================== + +- confrontation with people from +- same planet, different place +- different planet +- different solar system +- ... +- what follows is slightly abstracted ... 
+ +working with people from the same planet +================================================== + +- generally shared perspectives (synchronised + rotation around a common center) on project success +- different continents can be hard enough +- potential for misunderstanding/friction + +different planet +================================================== +- shared view regarding the center (project success) +- quite different working perspectives / methods + (e.g. "open-source collaborative" versus "formal traditional") +- potential for mistrust and dis-connection + +different solar system +================================================== + +- missing shared focus on project success?! +- alien or alienating interests + +the universal truth is ... +================================================== + +- often good intentions (!) even from aliens +- confrontation with lots of levels, planets and solar systems +- the challenge is to find a fitting model for case-by-case co-operation! +- determine location in universe and try to match and + synchronize + +cross-project pollination +================================================== + +- zope-europe +- dissemination: universities, IONA, Intel, HP ... +- Alan Kay / Squeak ... + +business collaboration example: canonical +================================================== + +- founded and funded by Mark Shuttleworth (Thawte founder) +- drives infrastructure development for Ubuntu +- by now 30-50 developers worldwide +- main language is python +- weekly sync meetings / open-source collaborative work style + +business collaboration example: merlinux +================================================== + +- founded just before the EU project (see last year's talk) +- hires 8 people distributed across europe (and brasil) +- weekly sync meetings +- looks for fitting contracting models, "making it fit" +- it's not about products, but about viable solutions based on + an efficient development model + +.. 
|bullet| unicode:: U+02022 +.. footer:: Bea During, Holger Krekel |bullet| 22C3 |bullet| 29th December 2005 + +Conclusion +============================================================ +- A shared and challenging vision +- Respecting and "exploiting" strengths of the different cultures involved +- Designing minimalistic project structures channeling work, not hindering work +- Room for group learning and creating change - not just reacting to change +- sprints +- how does that EU thing work + +- EU review + +- agile experiences from other projects + +---------------- From hpk at codespeak.net Wed Dec 28 10:11:39 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Wed, 28 Dec 2005 10:11:39 +0100 (CET) Subject: [pypy-svn] r21583 - pypy/extradoc/talk/22c3 Message-ID: <20051228091139.C7B3327B35@code1.codespeak.net> Author: hpk Date: Wed Dec 28 10:11:37 2005 New Revision: 21583 Added: pypy/extradoc/talk/22c3/interpreter-overview.png (contents, props changed) pypy/extradoc/talk/22c3/interpreter-overview.sxd (contents, props changed) Log: add interpreter overview pictures Added: pypy/extradoc/talk/22c3/interpreter-overview.png ============================================================================== Binary file. No diff available. Added: pypy/extradoc/talk/22c3/interpreter-overview.sxd ============================================================================== Binary file. No diff available. 
From hpk at codespeak.net Wed Dec 28 10:26:51 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Wed, 28 Dec 2005 10:26:51 +0100 (CET) Subject: [pypy-svn] r21584 - pypy/extradoc/talk/22c3 Message-ID: <20051228092651.26D5C27B3E@code1.codespeak.net> Author: hpk Date: Wed Dec 28 10:26:49 2005 New Revision: 21584 Modified: pypy/extradoc/talk/22c3/slides-agility.txt Log: (bea,hpk) first common pass through all slides Modified: pypy/extradoc/talk/22c3/slides-agility.txt ============================================================================== --- pypy/extradoc/talk/22c3/slides-agility.txt (original) +++ pypy/extradoc/talk/22c3/slides-agility.txt Wed Dec 28 10:26:49 2005 @@ -20,8 +20,7 @@ - test-driven from the start - driven by architectural experiences -- welcomed by python community (just ask for name for a - project :) +- welcomed by python community - based on zen of python / python culture - focus on correctness of concepts, then speed @@ -29,7 +28,7 @@ ================================================== - strong open-source cultural background -- strong focus on glue & integration esp. with C-libraries +- strong focus on glue & integration esp. 
with C/other languages - few flaming fights inside / much collaboration - has many windows and unix hackers @@ -38,7 +37,7 @@ - all large python projects rely and depend on automated testing - several projects regularly "sprint" and work together distributedly -- first community conference EuroPython in 2002 (now yearly) +- community conference EuroPython in 2002 (now yearly) - many test tools and methods available PyPy test-driven development @@ -118,18 +117,18 @@ ============================================================ - "Agile processes are designed to capitalize on each - individual and each team?s unique strengths" (Cockburn, Highsmith, 2001) + individual and each team's unique strengths" (Cockburn, Highsmith, 2001) - OSS nature of teams: self-organized, intensely collaborative - fit the agile approach - OSS teams are a unique implementation of agile practices - why? Picture: man on the moon -Agile approaches aim at: +Agile approaches aim at ... ============================================================ * reducing ... "cost of information", distance from decision-making -* by ... physical location, unorthodox dissemination +* by ... physical location, unorthodox exchange of knowledge * resulting in ... 
improved sense of community, team "morale" Origins of sprinting @@ -146,7 +145,8 @@ PyPy sprints ============================================================ -- The project "started" during a sprint + +- The project "started" with a sprint - Changing facilities and location as a strategy (Vilnius, Louvain-la-Neuve, Leysin, Gothenburg, Paris, Heidelberg, Hildesheim, Washington etc) @@ -157,56 +157,56 @@ ============================================================ - Project needed to reach critical mass, EU needed novel compiler design techniques in OSS contexts - - Proposal was written during sprints as well as distributed (submitted Oct 2003) - Negotiations in Brussels (March 2004): key issues being 30% cuts in budget and denied procedures for funding contribution -- Project "started" 1 Dec 2004: troubles with creating +- Project "started" Dec 2004: troubles with creating consortium agreement fitting the OSS structure needed Organising the consortium ============================================================ - 7 partners, 3 previously not involved in the PyPy community +- 6 partners only partially funded (50% cost models) - 2 new companies: "forced" entrepreneurship -- All partners but one partially funded (50% cost models) -- Less than 5% of the involved developers were covered by this partial funding +- not all involved developers got/get funded! 
Organising the work ============================================================ - 14 workpackages and 58 deliverables, 3 phases - Need for consortium meetings every month (IRC) - Sprints every 6th week (coordinating development and management work) +- EU project aspects helped to gain mid-term/long-term focus The different cultures of the PyPy project ============================================================ -- OSS (Python) culture (agile and distributed workstyle) +- OSS/Python culture (agile and distributed workstyle) - EU project culture - Traditional project management culture - Chaospilot (actionlearning and process design) culture -- 3 different national cultures +- 5 different national cultures The challenge: managing diversities part 1. ============================================================ -- Formal project organization vs developer driven process +- Developer driven process and Formal project organization - management team, technical board and partners - sprint organising -- Resulting in: increased risk of added workload of management +- increased risk of added workload of management work on core developers The challenge: managing diversities part 2. ============================================================ -- Formal EU requirements vs agile strategies +- Agile strategies and Formal EU requirements - written high level requirements - change control structures complicated -- Resulting in:increased risk of missing opportunities and not +- increased risk of missing opportunities and not creating/reacting to change fast enough The challenge: managing diversities part 3. 
============================================================ -- OSS community vs "conceptual integrity" - - pypy-dev/core developers in technical board - - industrial usage vs research oriented work -- Resulting in: increased risk for unbalancing the community +- OSS community and hierarchies for "conceptual integrity" + - pypy-dev/core developers in technical board + - industrial usage vs research oriented work +- increased risk for unbalancing the community Picture: chaos vs structure @@ -215,10 +215,9 @@ ================================================== - confrontation with people from -- same planet, different place -- different planet -- different solar system -- ... + - same planet, different place + - different planet + - different solar system - what follows is slightly abstracted ... working with people from the same planet @@ -241,6 +240,8 @@ - missing shared focus on project success?! - alien or alienating interests +- potential for defense/attack thinking, secret agendas + the universal truth is ... ================================================== @@ -248,38 +249,15 @@ - often good intentions (!) even from aliens - confrontation with lots of levels, planets and solar systems - the challenge is to find a fitting model for case-by-case co-operation! -- determine location in universe and try to match and - synchronize +- identify location in universe and try to match and synchronize cross-project pollination ================================================== -- zope-europe +- zope-europe, canonical - dissemination: universities, IONA, Intel, HP ... - Alan Kay / Squeak ... 
-business collaboration example: canonical -================================================== - -- founded and funded by Mark Shuttleworth (Thawte founder) -- drives infrastructure development for Ubuntu -- by now 30-50 developers worldwide -- main language is python -- weekly sync meetings / open-source collaborative work style - -business collaboration example: merlinux -================================================== - -- founded just before the EU project (see last year's talk) -- hires 8 people distributed across europe (and brasil) -- weekly sync meetings -- looks for fitting contracting models, "making it fit" -- it's not about products, but about viable solutions based on - an efficient development model - -.. |bullet| unicode:: U+02022 -.. footer:: Bea During, Holger Krekel |bullet| 22C3 |bullet| 29th December 2005 - Conclusion ============================================================ - A shared and challenging vision @@ -288,9 +266,10 @@ - Room for group learning and creating change - not just reacting to change - sprints - how does that EU thing work - - EU review - - agile experiences from other projects ----------------- + + +.. |bullet| unicode:: U+02022 +.. footer:: Bea During, Holger Krekel |bullet| 22C3 |bullet| 29th December 2005 From bea at codespeak.net Wed Dec 28 10:28:03 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Wed, 28 Dec 2005 10:28:03 +0100 (CET) Subject: [pypy-svn] r21585 - pypy/extradoc/talk/22c3 Message-ID: <20051228092803.B0E8B27B39@code1.codespeak.net> Author: bea Date: Wed Dec 28 10:28:00 2005 New Revision: 21585 Modified: pypy/extradoc/talk/22c3/interpreter-overview.png pypy/extradoc/talk/22c3/translation-overview.png Log: updated images Modified: pypy/extradoc/talk/22c3/interpreter-overview.png ============================================================================== Binary files. No diff available. 
Modified: pypy/extradoc/talk/22c3/translation-overview.png ============================================================================== Binary files. No diff available. From hpk at codespeak.net Wed Dec 28 16:00:06 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Wed, 28 Dec 2005 16:00:06 +0100 (CET) Subject: [pypy-svn] r21589 - pypy/extradoc/talk/22c3 Message-ID: <20051228150006.EA38527B39@code1.codespeak.net> Author: hpk Date: Wed Dec 28 16:00:04 2005 New Revision: 21589 Modified: pypy/extradoc/talk/22c3/hpk-tech.txt pypy/extradoc/talk/22c3/interpreter-overview.png pypy/extradoc/talk/22c3/interpreter-overview.sxd Log: formatting checkin Modified: pypy/extradoc/talk/22c3/hpk-tech.txt ============================================================================== --- pypy/extradoc/talk/22c3/hpk-tech.txt (original) +++ pypy/extradoc/talk/22c3/hpk-tech.txt Wed Dec 28 16:00:04 2005 @@ -35,6 +35,14 @@ - 350 subscribers to pypy-dev, 150.000 LOCs, 20.000 visitors per month, - MIT license +PyPy development method +======================== + +- sprints +- test-driven +- open source culture +- see talk tomorrow 2pm (29th Dec. 2005) + PyPy implementation facts ============================ @@ -56,8 +64,6 @@ ================================== .. image:: interpreter-overview.png - :width: 650 - :height: 400 Parser and Compiler @@ -106,8 +112,8 @@ ============================= - bytecode interpreter -- Flow Object Space -- Annotation +- Abstract Interpretation (Flow Object Space) +- Type Inference (Annotation) - Specialising to lltypesystem / ootypesystem - C and LLVM Backends to lltypesystem @@ -127,7 +133,6 @@ - starts from "live" byte code NOT source code - pygame demonstration - Type Inference =============== @@ -167,7 +172,10 @@ =================================== - Currently implemented: refcounting, Boehm-collector -- more general exact GCs (copying, mark&sweep, ...) - not yet +- more general exact GCs (not yet integrated) + - copying + - mark & sweep + - ... 
- different allocation strategies - not yet Aspects: Threading Models @@ -176,6 +184,7 @@ - currently implemented: single thread and global interpreter lock - future plans: free threading models - stacklessness: don't use the C stack for user-level recursion +- Continuation Passing Style (CPS) - implemented as a part of the backends comparison to other approaches @@ -213,26 +222,26 @@ PyPy cross pollination ======================= -- perl6: Object Spaces -- llvm -- cpython -- squeak (started last CCC conf) -- IronPython/Microsoft +- `perl6: Object Spaces`_ +- `llvm`_ +- cpython_ +- squeak_ (started last CCC conf) +- IronPython_/Microsoft + +.. _`perl6: Object Spaces`: http://www.nntp.perl.org/group/perl.perl6.compiler/1107 +.. _`llvm`: http://llvm.org +.. _`cpython`: http://www.python.org +.. _`squeak`: http://squeak.org +.. _`IronPython`: http://www.gotdotnet.com/workspaces/workspace.aspx?id=ad7acff7-ab1e-4bcb-99c0-57ac5a3a9742 -the speed issue -=============== -- currently 5-15 times slower than CPython +one thing: the speed issue +============================ + +- currently interpreting programs 5-15 times slower than CPython - now seriously starting with optimisations at various levels -- pypy can translate (R-)python code to something 10-50 times +- pypy can translate (R-)python code to something 10-100 times faster compared to running on top of CPython -pypy development method -======================== -- sprints -- test-driven -- open source culture -- see talk tomorrow 2pm (29th Dec. 2005) - technical outlook 2006 ====================== @@ -249,6 +258,9 @@ - better interactions with community & contribution - taking care about post-EU development (2007++) - visiting the US, Japan ... +- commercial opportunities ... + +http://codespeak.net/pypy .. |bullet| unicode:: U+02022 Modified: pypy/extradoc/talk/22c3/interpreter-overview.png ============================================================================== Binary files. No diff available. 
Modified: pypy/extradoc/talk/22c3/interpreter-overview.sxd ============================================================================== Binary files. No diff available. From bea at codespeak.net Wed Dec 28 16:10:03 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Wed, 28 Dec 2005 16:10:03 +0100 (CET) Subject: [pypy-svn] r21590 - pypy/extradoc/talk/22c3 Message-ID: <20051228151003.F32DA27B35@code1.codespeak.net> Author: bea Date: Wed Dec 28 16:10:02 2005 New Revision: 21590 Modified: pypy/extradoc/talk/22c3/slides-agility.txt Log: reshuffled the merged talk, added two slides about sprints (how its done) Modified: pypy/extradoc/talk/22c3/slides-agility.txt ============================================================================== --- pypy/extradoc/talk/22c3/slides-agility.txt (original) +++ pypy/extradoc/talk/22c3/slides-agility.txt Wed Dec 28 16:10:02 2005 @@ -5,7 +5,7 @@ :Authors: Bea During (changemaker), Holger Krekel (merlinux) :Date: 29th December 2005 -how it got started +How it got started ================================================== - 2003 first emails between Armin Rigo, Christian Tismer and @@ -15,7 +15,7 @@ - participants got to know each other at conferences - goal: Python implementation in Python (various motivations) -key elements of the technical development +Key elements of the technical development ================================================== - test-driven from the start @@ -24,7 +24,7 @@ - based on zen of python / python culture - focus on correctness of concepts, then speed -python community +Python community ================================================== - strong open-source cultural background @@ -32,7 +32,7 @@ - few flaming fights inside / much collaboration - has many windows and unix hackers -evolving agility +Evolving agility ================================================== - all large python projects rely and depend on automated testing @@ -67,7 +67,18 @@ - 2 year project, 1.3 Million Euro funding - 
contractual framework + reporting obligations -individuals ... need to organise +XXX SKIPXXXXBidding for the EU-funding +============================================================ + +- Project needed to reach critical mass, EU needed novel + compiler design techniques in OSS contexts +- Proposal was written during sprints as well as distributed + (submitted Oct 2003) +- Negotiations in Brussels (March 2004): key issues being 30% cuts in budget and denied procedures for funding contribution +- Project "started" Dec 2004: troubles with creating + consortium agreement fitting the OSS structure needed + +Individuals ... need to organise ================================================== - PyPy was first and still is a network of people @@ -75,7 +86,7 @@ - two companies were founded - 7 partners form the EU consortium: 6 companies, 1 university -balance of interests +Balance of interests ================================================== - developers want to (continue to) drive the project @@ -86,7 +97,7 @@ - **at all levels it is about finding models for co-operation that fit** -free co-operation basics +Free co-operation basics ================================================== - Christoph Spehr's "foundation of free co-operation": @@ -95,7 +106,7 @@ - everyone can "leave" the co-operation without penalty - leaving party taking a share with him/her -developers collaboration +Developers collaboration ================================================== - weekly 30 minute synchronisation meetings @@ -104,7 +115,7 @@ within the EU project - research/architecture informally guided by accepted experts -company collaboration +Company collaboration ================================================== - contractually through the EU consortium @@ -112,6 +123,22 @@ - evolving commercial opportunities US companies asking for consulting (test tool) or wanting to hire/pay developers on pypy related tools + +Organising the consortium 
+============================================================ + +- 7 partners, 3 previously not involved in the PyPy community +- 6 partners only partially funded (50% cost models) +- 2 new companies: "forced" entrepreneurship +- not all involved developers get funded! + +Organising the work +============================================================ + +- 14 workpackages and 58 deliverables, 3 phases +- Need for consortium meetings every month (IRC) +- Sprints every 6th week (coordinating development and management work) +- EU project aspects helped to gain mid-term/long-term focus Core of Agile practises: the people factor ============================================================ @@ -122,8 +149,6 @@ collaborative - fit the agile approach - OSS teams are a unique implementation of agile practices - why? -Picture: man on the moon - Agile approaches aim at ... ============================================================ @@ -152,33 +177,24 @@ Hildesheim, Washington etc) - The nature of sprints has evolved since the project started 2003 and since receiving partial EU-funding 2004/2005 - -Bidding for the EU-funding + +Sprinting the PyPy way 1 ============================================================ -- Project needed to reach critical mass, EU needed novel - compiler design techniques in OSS contexts -- Proposal was written during sprints as well as distributed - (submitted Oct 2003) -- Negotiations in Brussels (March 2004): key issues being 30% cuts in budget and denied procedures for funding contribution -- Project "started" Dec 2004: troubles with creating - consortium agreement fitting the OSS structure needed -Organising the consortium -============================================================ -- 7 partners, 3 previously not involved in the PyPy community -- 6 partners only partially funded (50% cost models) -- 2 new companies: "forced" entrepreneurship -- not all involved developers got/get funded!
+- Planning: location, venue, rough goals and activities, preparation with local hosts +- Doing: start up meeting, daily status meetings, pairprogramming +- Closing: closure meeting (planning work between sprints), sprint reports, evaluations -Organising the work +Sprinting the PyPy way 2 ============================================================ -- 14 workpackages and 58 deliverables, 3 phases -- Need for consortium meetings every month (IRC) -- Sprints every 6th week (coordinating development and management work) -- EU project aspects helped to gain mid-term/long-term focus + +- 7 days with 1 break day +- "open" sprints and "closed" sprints - levels of PyPy knowledge in participants +- sprints at conferences (PyCon, EuroPython) The different cultures of the PyPy project ============================================================ + - OSS/Python culture (agile and distributed workstyle) - EU project culture - Traditional project management culture @@ -187,6 +203,7 @@ The challenge: managing diversities part 1. ============================================================ + - Developer driven process and Formal project organization - management team, technical board and partners - sprint organising @@ -195,6 +212,7 @@ The challenge: managing diversities part 2. ============================================================ + - Agile strategies and Formal EU requirements - written high level requirements - change control structures complicated @@ -203,14 +221,12 @@ The challenge: managing diversities part 3. 
============================================================ + - OSS community and hierarchies for "conceptual integrity" - pypy-dev/core developers in technical board - industrial usage vs research oriented work - increased risk for unbalancing the community -Picture: chaos vs structure - - problems and perspectives ================================================== @@ -230,6 +246,7 @@ different planet ================================================== + - shared view regarding the center (project success) - quite different working perspectives / methods (e.g. "open-source collaborative" versus "formal traditional") From bea at codespeak.net Wed Dec 28 21:53:00 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Wed, 28 Dec 2005 21:53:00 +0100 (CET) Subject: [pypy-svn] r21591 - pypy/extradoc/talk/22c3 Message-ID: <20051228205300.8AD2027B35@code1.codespeak.net> Author: bea Date: Wed Dec 28 21:52:57 2005 New Revision: 21591 Added: pypy/extradoc/talk/22c3/pypy_manonthemoon.ppt (contents, props changed) Log: a nice picture of a man on the moon Added: pypy/extradoc/talk/22c3/pypy_manonthemoon.ppt ============================================================================== Binary file. No diff available. 
From bea at codespeak.net Wed Dec 28 23:03:06 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Wed, 28 Dec 2005 23:03:06 +0100 (CET) Subject: [pypy-svn] r21592 - pypy/extradoc/talk/22c3 Message-ID: <20051228220306.A209827B38@code1.codespeak.net> Author: bea Date: Wed Dec 28 23:03:05 2005 New Revision: 21592 Modified: pypy/extradoc/talk/22c3/slides-agility.txt Log: some more adjustments Modified: pypy/extradoc/talk/22c3/slides-agility.txt ============================================================================== --- pypy/extradoc/talk/22c3/slides-agility.txt (original) +++ pypy/extradoc/talk/22c3/slides-agility.txt Wed Dec 28 23:03:05 2005 @@ -227,16 +227,18 @@ - industrial usage vs research oriented work - increased risk for unbalancing the community -problems and perspectives +Picture: man on the moon + +Problems and perspectives ================================================== - confrontation with people from - same planet, different place - different planet - different solar system -- what follows is slightly abstracted ... +- what follows is slightly abstract ... -working with people from the same planet +Working with people from the same planet ================================================== - generally shared perspectives (synchronised @@ -244,7 +246,7 @@ - different continents can be hard enough - potential for misunderstanding/friction -different planet +Different planet ================================================== - shared view regarding the center (project success) @@ -252,7 +254,7 @@ (e.g. "open-source collaborative" versus "formal traditional") - potential for mistrust and dis-connection -different solar system +Different solar system ================================================== - missing shared focus on project success?! @@ -260,7 +262,7 @@ - potential for defense/attack thinking, secret agendas -the universal truth is ... +The universal truth is ... 
================================================== - often good intentions (!) even from aliens @@ -268,7 +270,7 @@ - the challenge is to find a fitting model for case-by-case co-operation! - identify location in universe and try to match and synchronize -cross-project pollination +Cross-project pollination ================================================== - zope-europe, canonical @@ -281,11 +283,12 @@ - Respecting and "exploiting" strengths of the different cultures involved - Designing minimalistic project structures channeling work, not hindering work - Room for group learning and creating change - not just reacting to change +XXX ?? - sprints - how does that EU thing work - EU review - agile experiences from other projects - +XXX .. |bullet| unicode:: U+02022 From hpk at codespeak.net Wed Dec 28 23:03:28 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Wed, 28 Dec 2005 23:03:28 +0100 (CET) Subject: [pypy-svn] r21593 - pypy/extradoc/talk/22c3 Message-ID: <20051228220328.044A927B38@code1.codespeak.net> Author: hpk Date: Wed Dec 28 23:03:26 2005 New Revision: 21593 Modified: pypy/extradoc/talk/22c3/hpk-tech.txt Log: final slides used in the actual talk (which went quite good, btw ...) Modified: pypy/extradoc/talk/22c3/hpk-tech.txt ============================================================================== --- pypy/extradoc/talk/22c3/hpk-tech.txt (original) +++ pypy/extradoc/talk/22c3/hpk-tech.txt Wed Dec 28 23:03:26 2005 @@ -6,9 +6,16 @@ :Authors: Holger Krekel & Carl Friedrich Bolz :Date: 28th December 2005 +:Location: 22C3, Berlin .. holger +The missing talker: Armin Rigo +=============================== + +.. image:: armin_rigo.jpg + + Python implementation facts =========================== @@ -98,6 +105,7 @@ - around 200 builtin functions and classes - fundamental modules like 'sys' and 'os' implemented - quite fully compliant to CPython's regression tests +- a number of modules missing or incomplete (socket ...) 
Animation on Interpreter/Objspace interaction =============================================== @@ -255,7 +263,7 @@ ============================== - surviving the EU review in Bruxelles 20th January 2006 -- better interactions with community & contribution +- improve interactions with community & contribution - taking care about post-EU development (2007++) - visiting the US, Japan ... - commercial opportunities ... From bea at codespeak.net Wed Dec 28 23:48:26 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Wed, 28 Dec 2005 23:48:26 +0100 (CET) Subject: [pypy-svn] r21594 - pypy/extradoc/talk/22c3 Message-ID: <20051228224826.3649227B3F@code1.codespeak.net> Author: bea Date: Wed Dec 28 23:48:20 2005 New Revision: 21594 Added: pypy/extradoc/talk/22c3/manmoon.PNG (contents, props changed) Log: man on the moon Added: pypy/extradoc/talk/22c3/manmoon.PNG ============================================================================== Binary file. No diff available. From hpk at codespeak.net Thu Dec 29 00:14:52 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 29 Dec 2005 00:14:52 +0100 (CET) Subject: [pypy-svn] r21597 - pypy/extradoc/talk/22c3 Message-ID: <20051228231452.98C1527B3F@code1.codespeak.net> Author: hpk Date: Thu Dec 29 00:14:36 2005 New Revision: 21597 Added: pypy/extradoc/talk/22c3/armin_rigo.jpg (contents, props changed) pypy/extradoc/talk/22c3/closure.jpg (contents, props changed) pypy/extradoc/talk/22c3/location.jpg (contents, props changed) pypy/extradoc/talk/22c3/manmoon.jpg (contents, props changed) pypy/extradoc/talk/22c3/pair.jpg (contents, props changed) Modified: pypy/extradoc/talk/22c3/slides-agility.txt Log: (bea,hpk) added some pictures, refined the slides Added: pypy/extradoc/talk/22c3/armin_rigo.jpg ============================================================================== Binary file. No diff available. 
Added: pypy/extradoc/talk/22c3/closure.jpg ============================================================================== Binary file. No diff available. Added: pypy/extradoc/talk/22c3/location.jpg ============================================================================== Binary file. No diff available. Added: pypy/extradoc/talk/22c3/manmoon.jpg ============================================================================== Binary file. No diff available. Added: pypy/extradoc/talk/22c3/pair.jpg ============================================================================== Binary file. No diff available. Modified: pypy/extradoc/talk/22c3/slides-agility.txt ============================================================================== --- pypy/extradoc/talk/22c3/slides-agility.txt (original) +++ pypy/extradoc/talk/22c3/slides-agility.txt Thu Dec 29 00:14:36 2005 @@ -61,30 +61,20 @@ EU funding ================================================== -- proposal got good reviews from EU contracted experts +- Proposal written during sprints as well as distributed + (submitted Oct 2003) +- got good reviews from EU contracted experts - negotiation and finalisation: 1 year! -- Dec. 2004 EU contract starts - 2 year project, 1.3 Million Euro funding - contractual framework + reporting obligations -XXX SKIPXXXXBidding for the EU-funding +Work organisation (in the EU proposal) ============================================================ -- Project needed to reach critical mass, EU needed novel - compiler design techniques in OSS contexts -- Proposal was written during sprints as well as distributed - (submitted Oct 2003) -- Negotiations in Brussels (March 2004): key issues being 30% cuts in budget and denied procedures for funding contribution -- Project "started" Dec 2004: troubles with creating - consortium agreement fitting the OSS structure needed - -Individuals ... 
need to organise -================================================== - -- PyPy was first and still is a network of people -- but EU only funds organisations -- two companies were founded -- 7 partners form the EU consortium: 6 companies, 1 university +- 14 workpackages and 58 deliverables, 3 phases +- Need for consortium meetings every month (IRC) +- Sprints every 6th week (coordinating development and management work) +- EU project aspects helped to gain mid-term/long-term focus Balance of interests ================================================== @@ -97,14 +87,14 @@ - **at all levels it is about finding models for co-operation that fit** -Free co-operation basics +Sidenote: Free co-operation basics ================================================== - Christoph Spehr's "foundation of free co-operation": - negotiate any model you want (including dictatorship) - question and change rules and roles at any time - everyone can "leave" the co-operation without penalty -- leaving party taking a share with him/her +- leaving party can even take a share with him/her Developers collaboration ================================================== @@ -123,22 +113,23 @@ - evolving commercial opportunities US companies asking for consulting (test tool) or wanting to hire/pay developers on pypy related tools - + Organising the consortium ============================================================ +- PyPy was first and still is a network of people +- but EU only funds organisations - 7 partners, 3 previously not involved in the PyPy community - 6 partners only partially funded (50% cost models) - 2 new companies: "forced" entrepreneurship -- not all involved developers get funded! -Organising the work +Consortium Meetings ... 
============================================================ -- 14 workpackages and 58 deliverables, 3 phases -- Need for consortium meetings every month (IRC) -- Sprints every 6th week (coordinating development and management work) -- EU project aspects helped to gain mid-term/long-term focus +.. image:: saarbruecken_consortium.jpg + +.. Bea + Core of Agile practises: the people factor ============================================================ @@ -182,10 +173,25 @@ ============================================================ - Planning: location, venue, rough goals and activities, preparation with local hosts + +.. image:: location.jpg + + +Sprinting the PyPy way 2 +============================================================ - Doing: start up meeting, daily status meetings, pairprogramming + +.. image:: pair.jpg + +Sprinting the PyPy way 3 +============================================================ + - Closing: closure meeting (planning work between sprints), sprint reports, evaluations -Sprinting the PyPy way 2 +.. image:: closure.jpg + + +Sprinting the PyPy way 4 ============================================================ - 7 days with 1 break day @@ -199,35 +205,39 @@ - EU project culture - Traditional project management culture - Chaospilot (actionlearning and process design) culture -- 5 different national cultures +- 5+X different national cultures The challenge: managing diversities part 1. ============================================================ -- Developer driven process and Formal project organization - - management team, technical board and partners - - sprint organising -- increased risk of added workload of management +- Developer driven process and formal project organization + - management team, technical board and partners + - sprint organising + - planning and focusing on technical tasks +- constant risk of added workload of management work on core developers The challenge: managing diversities part 2. 
============================================================ - Agile strategies and Formal EU requirements - - written high level requirements - - change control structures complicated -- increased risk of missing opportunities and not + - written high level requirements + - change control structures complicated +- constant risk of missing opportunities and not creating/reacting to change fast enough The challenge: managing diversities part 3. ============================================================ - OSS community and hierarchies for "conceptual integrity" - - pypy-dev/core developers in technical board - - industrial usage vs research oriented work -- increased risk for unbalancing the community + - pypy-dev/core developers in technical board + - industrial usage vs research oriented work +- risk for unbalancing the community -Picture: man on the moon +hitchikers guide ... +============================================================ + +.. image:: manmoon.png Problems and perspectives ================================================== @@ -241,8 +251,8 @@ Working with people from the same planet ================================================== -- generally shared perspectives (synchronised - rotation around a common center) on project success +- generally shared perspectives, synchronised + rotation around a common center ... on project success - different continents can be hard enough - potential for misunderstanding/friction @@ -273,23 +283,19 @@ Cross-project pollination ================================================== -- zope-europe, canonical +- zope-europe, canonical, Calibre - dissemination: universities, IONA, Intel, HP ... -- Alan Kay / Squeak ... +- Alan Kay +- Squeak (21c3) +- ... 
-Conclusion +Conclusion / Food for thought ============================================================ + - A shared and challenging vision - Respecting and "exploiting" strengths of the different cultures involved - Designing minimalistic project structures channeling work, not hindering work - Room for group learning and creating change - not just reacting to change -XXX ?? -- sprints -- how does that EU thing work -- EU review -- agile experiences from other projects -XXX - .. |bullet| unicode:: U+02022 .. footer:: Bea During, Holger Krekel |bullet| 22C3 |bullet| 29th December 2005 From hpk at codespeak.net Thu Dec 29 00:38:21 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 29 Dec 2005 00:38:21 +0100 (CET) Subject: [pypy-svn] r21598 - pypy/extradoc/talk/22c3 Message-ID: <20051228233821.71FB327B3F@code1.codespeak.net> Author: hpk Date: Thu Dec 29 00:38:20 2005 New Revision: 21598 Modified: pypy/extradoc/talk/22c3/slides-agility.txt Log: (bea,hpk) basically finalised except formatting problems of nested lists Modified: pypy/extradoc/talk/22c3/slides-agility.txt ============================================================================== --- pypy/extradoc/talk/22c3/slides-agility.txt (original) +++ pypy/extradoc/talk/22c3/slides-agility.txt Thu Dec 29 00:38:20 2005 @@ -175,6 +175,9 @@ - Planning: location, venue, rough goals and activities, preparation with local hosts .. image:: location.jpg + :width: 400 + :height: 300 + Sprinting the PyPy way 2 @@ -182,6 +185,8 @@ - Doing: start up meeting, daily status meetings, pairprogramming .. image:: pair.jpg + :width: 400 + :height: 300 Sprinting the PyPy way 3 ============================================================ @@ -189,6 +194,8 @@ - Closing: closure meeting (planning work between sprints), sprint reports, evaluations .. 
image:: closure.jpg + :width: 400 + :height: 300 Sprinting the PyPy way 4 @@ -207,7 +214,7 @@ - Chaospilot (actionlearning and process design) culture - 5+X different national cultures -The challenge: managing diversities part 1. +The challenge: managing diversities part 1 ============================================================ - Developer driven process and formal project organization @@ -217,7 +224,7 @@ - constant risk of added workload of management work on core developers -The challenge: managing diversities part 2. +The challenge: managing diversities part 2 ============================================================ - Agile strategies and Formal EU requirements @@ -226,7 +233,7 @@ - constant risk of missing opportunities and not creating/reacting to change fast enough -The challenge: managing diversities part 3. +The challenge: managing diversities part 3 ============================================================ - OSS community and hierarchies for "conceptual integrity" @@ -234,7 +241,7 @@ - industrial usage vs research oriented work - risk for unbalancing the community -hitchikers guide ... +Hitchikers guide ... ============================================================ .. image:: manmoon.png From hpk at codespeak.net Thu Dec 29 12:23:35 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 29 Dec 2005 12:23:35 +0100 (CET) Subject: [pypy-svn] r21599 - pypy/extradoc/talk/22c3 Message-ID: <20051229112335.9741B27B38@code1.codespeak.net> Author: hpk Date: Thu Dec 29 12:23:34 2005 New Revision: 21599 Removed: pypy/extradoc/talk/22c3/hpk-agility.txt pypy/extradoc/talk/22c3/part_bea.txt Log: remove origin al parts now merged into slides-agility.txt From hpk at codespeak.net Thu Dec 29 12:39:21 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 29 Dec 2005 12:39:21 +0100 (CET) Subject: [pypy-svn] r21600 - in pypy/extradoc/talk/22c3: . 
plots Message-ID: <20051229113921.52D1827B41@code1.codespeak.net> Author: hpk Date: Thu Dec 29 12:39:10 2005 New Revision: 21600 Modified: pypy/extradoc/talk/22c3/plots/irc_messages.png pypy/extradoc/talk/22c3/plots/loc.png pypy/extradoc/talk/22c3/plots/number_files.png pypy/extradoc/talk/22c3/plots/post.png pypy/extradoc/talk/22c3/plots/statistic_irc_log.png pypy/extradoc/talk/22c3/plots/subscribers.png pypy/extradoc/talk/22c3/plots/webaccess.png pypy/extradoc/talk/22c3/slides-agility.txt Log: color images and last updates to slides Modified: pypy/extradoc/talk/22c3/plots/irc_messages.png ============================================================================== Binary files. No diff available. Modified: pypy/extradoc/talk/22c3/plots/loc.png ============================================================================== Binary files. No diff available. Modified: pypy/extradoc/talk/22c3/plots/number_files.png ============================================================================== Binary files. No diff available. Modified: pypy/extradoc/talk/22c3/plots/post.png ============================================================================== Binary files. No diff available. Modified: pypy/extradoc/talk/22c3/plots/statistic_irc_log.png ============================================================================== Binary files. No diff available. Modified: pypy/extradoc/talk/22c3/plots/subscribers.png ============================================================================== Binary files. No diff available. Modified: pypy/extradoc/talk/22c3/plots/webaccess.png ============================================================================== Binary files. No diff available. 
Modified: pypy/extradoc/talk/22c3/slides-agility.txt ============================================================================== --- pypy/extradoc/talk/22c3/slides-agility.txt (original) +++ pypy/extradoc/talk/22c3/slides-agility.txt Thu Dec 29 12:39:10 2005 @@ -24,6 +24,13 @@ - based on zen of python / python culture - focus on correctness of concepts, then speed +Lines of Code and tests +============================================================ + +.. image:: plots/loc.png + :width: 600 + :height: 450 + Python community ================================================== @@ -205,6 +212,16 @@ - "open" sprints and "closed" sprints - levels of PyPy knowledge in participants - sprints at conferences (PyCon, EuroPython) + +Effects of sprints on community participation +============================================================ + +.. image:: plots/subscribers.png + :width: 600 + :height: 450 + + + The different cultures of the PyPy project ============================================================ @@ -304,5 +321,18 @@ - Designing minimalistic project structures channeling work, not hindering work - Room for group learning and creating change - not just reacting to change +Outlook on whole project level +============================== + +- surviving the EU review in Bruxelles 20th January 2006 +- improve interactions with community & contribution +- taking care about post-EU development (2007++) +- visiting Mallorca, Texas, Tokyo, Ireland, ... +- commercial opportunities ... +- *Questions?* (talk to us ...) + +http://codespeak.net/pypy and http://pypy.org + + .. |bullet| unicode:: U+02022 .. footer:: Bea During, Holger Krekel |bullet| 22C3 |bullet| 29th December 2005 From pedronis at codespeak.net Fri Dec 30 00:51:27 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 30 Dec 2005 00:51:27 +0100 (CET) Subject: [pypy-svn] r21604 - in pypy/dist/pypy/tool: . 
test Message-ID: <20051229235127.DDA5927B69@code1.codespeak.net> Author: pedronis Date: Fri Dec 30 00:51:25 2005 New Revision: 21604 Added: pypy/dist/pypy/tool/isolate.py (contents, props changed) pypy/dist/pypy/tool/test/simple.py (contents, props changed) pypy/dist/pypy/tool/test/test_isolate.py (contents, props changed) Log: simple interface to load a module in a separate process and invoke functions in it with simple args and return values. For now using py.execnet, but it should be easy to reimplement or have multiple implementation with fork/pipes for more speed or popen directly. A isolate instance can be used as a module mod to be invoke functions, mod.f(...). the only difference is that it is better to call the close_isolate cleanup function on it when done. To be used for things like test_boehm... Added: pypy/dist/pypy/tool/isolate.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/tool/isolate.py Fri Dec 30 00:51:25 2005 @@ -0,0 +1,88 @@ +import py +import exceptions + +ISOLATE = """ +import sys +import imp + +mod = channel.receive() +if isinstance(mod, str): + mod = __import__(mod, {}, {}, ['__doc__']) +else: + dir, name = mod + file, pathname, description = imp.find_module(name, [str(dir)]) + try: + mod = imp.load_module(name, file, pathname, description) + finally: + if file: + file.close() +channel.send("loaded") +while True: + func, args = channel.receive() + try: + res = getattr(mod, func)(*args) + except KeyboardInterrupt: + raise + except: + exc_type = sys.exc_info()[0] + channel.send(('exc', (exc_type.__module__, exc_type.__name__))) + else: + channel.send(('ok', res)) +""" + +class IsolateException(Exception): + pass + +class IsolateInvoker(object): + # to have a nice repr + + def __init__(self, isolate, name): + self.isolate = isolate + self.name = name + + def __call__(self, *args): + return self.isolate._invoke(self.name, args) + + def __repr__(self): + return "" % 
(self.isolate.module, self.name) + +class Isolate(object): + """ + Isolate lets load a module in a different process, + and support invoking functions from it passing and + returning simple values + + module: a dotted module name or a tuple (directory, module-name) + """ + + def __init__(self, module): + self.gw = py.execnet.PopenGateway() + chan = self.chan = self.gw.remote_exec(ISOLATE) + chan.send(module) + assert chan.receive() == "loaded" + + def __getattr__(self, name): + return IsolateInvoker(self, name) + + def _invoke(self, func, args): + self.chan.send((func, args)) + status, value = self.chan.receive() + if status == 'ok': + return value + else: + exc_type_module, exc_type_name = value + if exc_type_module == 'exceptions': + raise getattr(exceptions, exc_type_name) + else: + raise IsolateException, "%s.%s" % value + + def _close(self): + self.chan.close() + self.gw.exit() + + def __del__(self): + self.close() + +def close_isolate(isolate): + assert isinstance(isolate, Isolate) + isolate._close() Added: pypy/dist/pypy/tool/test/simple.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/tool/test/simple.py Fri Dec 30 00:51:25 2005 @@ -0,0 +1,12 @@ + +def f(a,b): + return a+b + +def g(): + raise ValueError, "booh" + +class FancyException(Exception): + pass + +def h(): + raise FancyException, "booh" Added: pypy/dist/pypy/tool/test/test_isolate.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/tool/test/test_isolate.py Fri Dec 30 00:51:25 2005 @@ -0,0 +1,40 @@ +import os +import py +from pypy.tool import isolate + +def test_init(): + simple = isolate.Isolate('pypy.tool.test.simple') + isolate.close_isolate(simple) + +def test_init_dir_name(): + simple = isolate.Isolate((os.path.dirname(__file__), 'simple')) + isolate.close_isolate(simple) + +def test_simple(): + simple = isolate.Isolate('pypy.tool.test.simple') + f = simple.f + 
res =f(1,2) + assert res == 3 + res = f(2,3) + assert res == 5 + isolate.close_isolate(simple) + +def test_simple_dir_name(): + simple = isolate.Isolate((os.path.dirname(__file__), 'simple')) + f = simple.f + res = f(1,2) + assert res == 3 + res = f(2,3) + assert res == 5 + isolate.close_isolate(simple) + +def test_raising(): + simple = isolate.Isolate('pypy.tool.test.simple') + py.test.raises(ValueError, "simple.g()") + isolate.close_isolate(simple) + +def test_raising_fancy(): + simple = isolate.Isolate('pypy.tool.test.simple') + py.test.raises(isolate.IsolateException, "simple.h()") + isolate.close_isolate(simple) + #os.system("ps") From pedronis at codespeak.net Fri Dec 30 00:56:47 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 30 Dec 2005 00:56:47 +0100 (CET) Subject: [pypy-svn] r21605 - pypy/dist/pypy/tool/test Message-ID: <20051229235647.EB57027B69@code1.codespeak.net> Author: pedronis Date: Fri Dec 30 00:56:42 2005 New Revision: 21605 Added: pypy/dist/pypy/tool/test/isolate_simple.py - copied unchanged from r21604, pypy/dist/pypy/tool/test/simple.py Removed: pypy/dist/pypy/tool/test/simple.py Modified: pypy/dist/pypy/tool/test/test_isolate.py Log: less ambiguous naming for this module used by the test Modified: pypy/dist/pypy/tool/test/test_isolate.py ============================================================================== --- pypy/dist/pypy/tool/test/test_isolate.py (original) +++ pypy/dist/pypy/tool/test/test_isolate.py Fri Dec 30 00:56:42 2005 @@ -3,15 +3,15 @@ from pypy.tool import isolate def test_init(): - simple = isolate.Isolate('pypy.tool.test.simple') + simple = isolate.Isolate('pypy.tool.test.isolate_simple') isolate.close_isolate(simple) def test_init_dir_name(): - simple = isolate.Isolate((os.path.dirname(__file__), 'simple')) + simple = isolate.Isolate((os.path.dirname(__file__), 'isolate_simple')) isolate.close_isolate(simple) def test_simple(): - simple = isolate.Isolate('pypy.tool.test.simple') + simple = 
isolate.Isolate('pypy.tool.test.isolate_simple') f = simple.f res =f(1,2) assert res == 3 @@ -20,7 +20,7 @@ isolate.close_isolate(simple) def test_simple_dir_name(): - simple = isolate.Isolate((os.path.dirname(__file__), 'simple')) + simple = isolate.Isolate((os.path.dirname(__file__), 'isolate_simple')) f = simple.f res = f(1,2) assert res == 3 @@ -29,12 +29,12 @@ isolate.close_isolate(simple) def test_raising(): - simple = isolate.Isolate('pypy.tool.test.simple') + simple = isolate.Isolate('pypy.tool.test.isolate_simple') py.test.raises(ValueError, "simple.g()") isolate.close_isolate(simple) def test_raising_fancy(): - simple = isolate.Isolate('pypy.tool.test.simple') + simple = isolate.Isolate('pypy.tool.test.isolate_simple') py.test.raises(isolate.IsolateException, "simple.h()") isolate.close_isolate(simple) #os.system("ps") From pedronis at codespeak.net Fri Dec 30 18:21:11 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 30 Dec 2005 18:21:11 +0100 (CET) Subject: [pypy-svn] r21607 - pypy/dist/pypy/tool Message-ID: <20051230172111.E050227B5E@code1.codespeak.net> Author: pedronis Date: Fri Dec 30 18:21:10 2005 New Revision: 21607 Modified: pypy/dist/pypy/tool/isolate.py Log: fixed bug close->_close. 
Modified: pypy/dist/pypy/tool/isolate.py ============================================================================== --- pypy/dist/pypy/tool/isolate.py (original) +++ pypy/dist/pypy/tool/isolate.py Fri Dec 30 18:21:10 2005 @@ -10,7 +10,7 @@ mod = __import__(mod, {}, {}, ['__doc__']) else: dir, name = mod - file, pathname, description = imp.find_module(name, [str(dir)]) + file, pathname, description = imp.find_module(name, [dir]) try: mod = imp.load_module(name, file, pathname, description) finally: @@ -54,6 +54,7 @@ module: a dotted module name or a tuple (directory, module-name) """ + _closed = False def __init__(self, module): self.gw = py.execnet.PopenGateway() @@ -77,11 +78,13 @@ raise IsolateException, "%s.%s" % value def _close(self): - self.chan.close() - self.gw.exit() + if not self._closed: + self.chan.close() + self.gw.exit() + self._closed = True def __del__(self): - self.close() + self._close() def close_isolate(isolate): assert isinstance(isolate, Isolate) From pedronis at codespeak.net Fri Dec 30 23:13:23 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 30 Dec 2005 23:13:23 +0100 (CET) Subject: [pypy-svn] r21608 - pypy/dist/pypy/tool Message-ID: <20051230221323.81D1B27B5E@code1.codespeak.net> Author: pedronis Date: Fri Dec 30 23:13:22 2005 New Revision: 21608 Added: pypy/dist/pypy/tool/isolate_slave.py (contents, props changed) pypy/dist/pypy/tool/slaveproc.py (contents, props changed) Modified: pypy/dist/pypy/tool/isolate.py Log: switch to using popen directly for isolate. The problem is that threads used by execnet and Boehm result in segfaults because obviously the hosting cpython is not compiled in such a way that new threads are advertised to Boehm. 
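The slaveproc.py wire format introduced in r21608 is a simple length-prefixed framing: each message is marshal-encoded and preceded by a native unsigned-long header carrying the payload size, with a short read on either part raising EOFError. A self-contained sketch of that framing, exercised here over an in-memory buffer instead of a popen pipe pair:

```python
import io
import marshal
import struct

HSIZE = struct.calcsize('L')  # size of the native unsigned-long length header

def send(out, data):
    # frame = length header + marshal payload, flushed as one write
    s = marshal.dumps(data)
    out.write(struct.pack('L', len(s)) + s)

def recv(inp):
    h = inp.read(HSIZE)
    if len(h) < HSIZE:
        raise EOFError          # peer went away mid-header
    size, = struct.unpack('L', h)
    s = inp.read(size)
    if len(s) < size:
        raise EOFError          # peer went away mid-payload
    return marshal.loads(s)

buf = io.BytesIO()
send(buf, ('invoke', ('f', (1, 2))))
buf.seek(0)
assert recv(buf) == ('invoke', ('f', (1, 2)))
```

marshal only handles simple builtin types (ints, strings, tuples, dicts, ...), which is exactly why the real protocol ships function names and plain argument tuples rather than arbitrary objects.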
Modified: pypy/dist/pypy/tool/isolate.py ============================================================================== --- pypy/dist/pypy/tool/isolate.py (original) +++ pypy/dist/pypy/tool/isolate.py Fri Dec 30 23:13:22 2005 @@ -1,34 +1,5 @@ -import py -import exceptions - -ISOLATE = """ -import sys -import imp - -mod = channel.receive() -if isinstance(mod, str): - mod = __import__(mod, {}, {}, ['__doc__']) -else: - dir, name = mod - file, pathname, description = imp.find_module(name, [dir]) - try: - mod = imp.load_module(name, file, pathname, description) - finally: - if file: - file.close() -channel.send("loaded") -while True: - func, args = channel.receive() - try: - res = getattr(mod, func)(*args) - except KeyboardInterrupt: - raise - except: - exc_type = sys.exc_info()[0] - channel.send(('exc', (exc_type.__module__, exc_type.__name__))) - else: - channel.send(('ok', res)) -""" +import exceptions, os +from pypy.tool import slaveproc class IsolateException(Exception): pass @@ -57,17 +28,17 @@ _closed = False def __init__(self, module): - self.gw = py.execnet.PopenGateway() - chan = self.chan = self.gw.remote_exec(ISOLATE) - chan.send(module) - assert chan.receive() == "loaded" + self.slave = slaveproc.SlaveProcess(os.path.join(os.path.dirname(__file__), + 'isolate_slave.py')) + res = self.slave.cmd(('load', module)) + assert res == 'loaded' def __getattr__(self, name): return IsolateInvoker(self, name) def _invoke(self, func, args): - self.chan.send((func, args)) - status, value = self.chan.receive() + status, value = self.slave.cmd(('invoke', (func, args))) + print 'OK' if status == 'ok': return value else: @@ -79,8 +50,7 @@ def _close(self): if not self._closed: - self.chan.close() - self.gw.exit() + self.slave.close() self._closed = True def __del__(self): Added: pypy/dist/pypy/tool/isolate_slave.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/tool/isolate_slave.py Fri Dec 30 23:13:22 2005 
@@ -0,0 +1,41 @@ +import autopath +import sys, imp +from pypy.tool import slaveproc + +class IsolateSlave(slaveproc.Slave): + mod = None + + def do_cmd(self, cmd): + cmd, data = cmd + if cmd == 'load': + assert self.mod is None + mod = data + if isinstance(mod, str): + mod = __import__(mod, {}, {}, ['__doc__']) + else: + dir, name = mod + file, pathname, description = imp.find_module(name, [dir]) + try: + mod = imp.load_module(name, file, pathname, description) + finally: + if file: + file.close() + self.mod = mod + return 'loaded' + elif cmd == 'invoke': + assert self.mod is not None + func, args = data + try: + res = getattr(self.mod, func)(*args) + except KeyboardInterrupt: + raise + except: + exc_type = sys.exc_info()[0] + return ('exc', (exc_type.__module__, exc_type.__name__)) + else: + return ('ok', res) + else: + return 'no-clue' + +if __name__ == '__main__': + IsolateSlave().do() Added: pypy/dist/pypy/tool/slaveproc.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/tool/slaveproc.py Fri Dec 30 23:13:22 2005 @@ -0,0 +1,52 @@ +import os, struct, marshal, sys + +class Exchange(object): + def __init__(self, inp, out): + self.out = out + self.inp = inp + + def send(self, data): + s = marshal.dumps(data) + h = struct.pack('L', len(s)) + self.out.write(h+s) + self.out.flush() + + def recv(self): + HSIZE = struct.calcsize('L') + h = self.inp.read(HSIZE) + if len(h) < HSIZE: + raise EOFError + size = struct.unpack('L', h)[0] + s = self.inp.read(size) + if len(s) < size: + raise EOFError + return marshal.loads(s) + +class SlaveProcess(object): + + def __init__(self, slave_impl): + inp, out = os.popen2('%s -u %s' % (sys.executable, os.path.abspath(slave_impl))) + self.exchg = Exchange(out, inp) + + def cmd(self, data): + self.exchg.send(data) + return self.exchg.recv() + + def close(self): + assert self.cmd(None) == 'done' + +class Slave(object): + + def do_cmd(self, data): + raise 
NotImplementedError + + def do(self): + exchg = Exchange(sys.stdin, sys.stdout) + while True: + cmd = exchg.recv() + if cmd is None: + exchg.send('done') + break + result = self.do_cmd(cmd) + exchg.send(result) + From pedronis at codespeak.net Fri Dec 30 23:17:00 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 30 Dec 2005 23:17:00 +0100 (CET) Subject: [pypy-svn] r21609 - in pypy/dist/pypy/translator/c: . test Message-ID: <20051230221700.C955327B60@code1.codespeak.net> Author: pedronis Date: Fri Dec 30 23:16:58 2005 New Revision: 21609 Modified: pypy/dist/pypy/translator/c/genc.py pypy/dist/pypy/translator/c/test/test_boehm.py Log: make test_boehm runnable using the isolate.py mechanism. Modified: pypy/dist/pypy/translator/c/genc.py ============================================================================== --- pypy/dist/pypy/translator/c/genc.py (original) +++ pypy/dist/pypy/translator/c/genc.py Fri Dec 30 23:16:58 2005 @@ -9,6 +9,7 @@ from pypy.translator.tool.cbuild import import_module_from_directory from pypy.rpython.lltypesystem import lltype from pypy.tool.udir import udir +from pypy.tool import isolate from pypy.translator.locality.calltree import CallTree from pypy.translator.c.support import log from pypy.rpython.typesystem import getfunctionptr @@ -108,12 +109,23 @@ if self.symboltable: self.symboltable.attach(mod) # hopefully temporary hack return mod + + def isolated_import(self): + assert self._compiled + assert not self.c_ext_module + self.c_ext_module = isolate.Isolate((str(self.c_source_filename.dirpath()), + self.c_source_filename.purebasename)) + return self.c_ext_module def get_entry_point(self): assert self.c_ext_module return getattr(self.c_ext_module, self.entrypoint.func_name) + def cleanup(self): + assert self.c_ext_module + if isinstance(self.c_ext_module, isolate.Isolate): + isolate.close_isolate(self.c_ext_module) class CStandaloneBuilder(CBuilder): standalone = True Modified: 
pypy/dist/pypy/translator/c/test/test_boehm.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_boehm.py (original) +++ pypy/dist/pypy/translator/c/test/test_boehm.py Fri Dec 30 23:16:58 2005 @@ -1,71 +1,83 @@ import py from pypy.translator.translator import TranslationContext -from pypy.translator.tool.cbuild import skip_missing_compiler +from pypy.translator.tool.cbuild import skip_missing_compiler, check_boehm_presence from pypy.translator.c.genc import CExtModuleBuilder -py.test.skip("boehm test is fragile wrt. the number of dynamically loaded libs") - - -def getcompiled(func): - from pypy.translator.c.gc import BoehmGcPolicy - t = TranslationContext(simplifying=True) - # builds starting-types from func_defs - argstypelist = [] - if func.func_defaults: - for spec in func.func_defaults: - if isinstance(spec, tuple): - spec = spec[0] # use the first type only for the tests - argstypelist.append(spec) - a = t.buildannotator().build_types(func, argstypelist) - t.buildrtyper().specialize() - t.checkgraphs() - def compile(): - cbuilder = CExtModuleBuilder(t, func, gcpolicy=BoehmGcPolicy) - c_source_filename = cbuilder.generate_source() - cbuilder.compile() - cbuilder.import_module() - return cbuilder.get_entry_point() - return skip_missing_compiler(compile) - - -def test_malloc_a_lot(): - def malloc_a_lot(): - i = 0 - while i < 10: - i += 1 - a = [1] * 10 - j = 0 - while j < 20: - j += 1 - a.append(j) - fn = getcompiled(malloc_a_lot) - fn() - -def test__del__(): - class State: - pass - s = State() - class A(object): - def __del__(self): - s.a_dels += 1 - class B(A): - def __del__(self): - s.b_dels += 1 - class C(A): - pass - def f(): - s.a_dels = 0 - s.b_dels = 0 - A() - B() - C() - A() - B() - C() - return s.a_dels * 10 + s.b_dels - fn = getcompiled(f) - res = f() - assert res == 42 - res = fn() #does not crash - res = fn() #does not crash - assert 0 <= res <= 42 # 42 cannot be guaranteed +def 
setup_module(mod): + if not check_boehm_presence(): + py.test.skip("Boehm GC not present") + +class TestUsingBoehm: + + # deal with cleanups + def setup_method(self, meth): + self._cleanups = [] + def teardown_method(self, meth): + while self._cleanups: + #print "CLEANUP" + self._cleanups.pop()() + + def getcompiled(self, func): + from pypy.translator.c.gc import BoehmGcPolicy + t = TranslationContext(simplifying=True) + # builds starting-types from func_defs + argstypelist = [] + if func.func_defaults: + for spec in func.func_defaults: + if isinstance(spec, tuple): + spec = spec[0] # use the first type only for the tests + argstypelist.append(spec) + a = t.buildannotator().build_types(func, argstypelist) + t.buildrtyper().specialize() + t.checkgraphs() + def compile(): + cbuilder = CExtModuleBuilder(t, func, gcpolicy=BoehmGcPolicy) + c_source_filename = cbuilder.generate_source() + cbuilder.compile() + mod = cbuilder.isolated_import() + self._cleanups.append(cbuilder.cleanup) # schedule cleanup after test + return cbuilder.get_entry_point() + return skip_missing_compiler(compile) + + + def test_malloc_a_lot(self): + def malloc_a_lot(): + i = 0 + while i < 10: + i += 1 + a = [1] * 10 + j = 0 + while j < 20: + j += 1 + a.append(j) + fn = self.getcompiled(malloc_a_lot) + fn() + + def test__del__(self): + class State: + pass + s = State() + class A(object): + def __del__(self): + s.a_dels += 1 + class B(A): + def __del__(self): + s.b_dels += 1 + class C(A): + pass + def f(): + s.a_dels = 0 + s.b_dels = 0 + A() + B() + C() + A() + B() + C() + return s.a_dels * 10 + s.b_dels + fn = self.getcompiled(f) + res = f() + assert res == 42 + res = fn() #does not crash + res = fn() #does not crash + assert 0 <= res <= 42 # 42 cannot be guaranteed From pedronis at codespeak.net Sat Dec 31 14:22:26 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sat, 31 Dec 2005 14:22:26 +0100 (CET) Subject: [pypy-svn] r21610 - in pypy/dist/pypy/tool: . 
test Message-ID: <20051231132226.B8DD027B60@code1.codespeak.net> Author: pedronis Date: Sat Dec 31 14:22:24 2005 New Revision: 21610 Modified: pypy/dist/pypy/tool/slaveproc.py pypy/dist/pypy/tool/test/isolate_simple.py pypy/dist/pypy/tool/test/test_isolate.py Log: slightly better slave process crash handling (the test run output will contain the stderr of the child python) Modified: pypy/dist/pypy/tool/slaveproc.py ============================================================================== --- pypy/dist/pypy/tool/slaveproc.py (original) +++ pypy/dist/pypy/tool/slaveproc.py Sat Dec 31 14:22:24 2005 @@ -23,17 +23,23 @@ return marshal.loads(s) class SlaveProcess(object): - + _broken = False + def __init__(self, slave_impl): inp, out = os.popen2('%s -u %s' % (sys.executable, os.path.abspath(slave_impl))) self.exchg = Exchange(out, inp) def cmd(self, data): self.exchg.send(data) - return self.exchg.recv() + try: + return self.exchg.recv() + except EOFError: + self._broken = True + raise def close(self): - assert self.cmd(None) == 'done' + if not self._broken: + assert self.cmd(None) == 'done' class Slave(object): Modified: pypy/dist/pypy/tool/test/isolate_simple.py ============================================================================== --- pypy/dist/pypy/tool/test/isolate_simple.py (original) +++ pypy/dist/pypy/tool/test/isolate_simple.py Sat Dec 31 14:22:24 2005 @@ -10,3 +10,6 @@ def h(): raise FancyException, "booh" + +def bomb(): + raise KeyboardInterrupt Modified: pypy/dist/pypy/tool/test/test_isolate.py ============================================================================== --- pypy/dist/pypy/tool/test/test_isolate.py (original) +++ pypy/dist/pypy/tool/test/test_isolate.py Sat Dec 31 14:22:24 2005 @@ -38,3 +38,10 @@ py.test.raises(isolate.IsolateException, "simple.h()") isolate.close_isolate(simple) #os.system("ps") + +def test_bomb(): + simple = isolate.Isolate('pypy.tool.test.isolate_simple') + py.test.raises(EOFError, "simple.bomb()") + 
isolate.close_isolate(simple) + + From pedronis at codespeak.net Sat Dec 31 15:01:32 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sat, 31 Dec 2005 15:01:32 +0100 (CET) Subject: [pypy-svn] r21611 - pypy/dist/pypy/annotation Message-ID: <20051231140132.326B727B5E@code1.codespeak.net> Author: pedronis Date: Sat Dec 31 15:01:30 2005 New Revision: 21611 Modified: pypy/dist/pypy/annotation/builtin.py Log: XXX GC behavior of int<->object Modified: pypy/dist/pypy/annotation/builtin.py ============================================================================== --- pypy/dist/pypy/annotation/builtin.py (original) +++ pypy/dist/pypy/annotation/builtin.py Sat Dec 31 15:01:30 2005 @@ -251,10 +251,10 @@ def rarith_intmask(s_obj): return SomeInteger() -def robjmodel_cast_obj_to_int(s_instance): +def robjmodel_cast_obj_to_int(s_instance): # XXX GC behavior return SomeInteger() -def robjmodel_cast_int_to_obj(s_int, s_clspbc): +def robjmodel_cast_int_to_obj(s_int, s_clspbc): # XXX GC behavior assert len(s_clspbc.descriptions) == 1 desc = s_clspbc.descriptions.keys()[0] cdef = desc.getuniqueclassdef() From pedronis at codespeak.net Sat Dec 31 18:44:31 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sat, 31 Dec 2005 18:44:31 +0100 (CET) Subject: [pypy-svn] r21615 - pypy/dist/pypy/annotation Message-ID: <20051231174431.B26BE27B60@code1.codespeak.net> Author: pedronis Date: Sat Dec 31 18:44:29 2005 New Revision: 21615 Modified: pypy/dist/pypy/annotation/bookkeeper.py pypy/dist/pypy/annotation/builtin.py pypy/dist/pypy/annotation/description.py Log: avoid to build classdefs for builtin types (was happening because of the somepbc-refactoring changes) Modified: pypy/dist/pypy/annotation/bookkeeper.py ============================================================================== --- pypy/dist/pypy/annotation/bookkeeper.py (original) +++ pypy/dist/pypy/annotation/bookkeeper.py Sat Dec 31 18:44:29 2005 @@ -409,7 +409,10 @@ elif 
isinstance(pyobj, (type, types.ClassType)): if pyobj is object: raise Exception, "ClassDesc for object not supported" - result = description.ClassDesc(self, pyobj) + if pyobj.__module__ == '__builtin__': # avoid making classdefs for builtin types + result = self.getfrozen(pyobj) + else: + result = description.ClassDesc(self, pyobj) elif isinstance(pyobj, types.MethodType): if pyobj.im_self is None: # unbound return self.getdesc(pyobj.im_func) @@ -434,17 +437,17 @@ else: # must be a frozen pre-built constant, but let's check assert pyobj._freeze_() - result = description.FrozenDesc(self, pyobj) - cls = result.knowntype - if cls not in self.pbctypes: - self.pbctypes[cls] = True - # XXX what to do about this old check?: - #if cls in self.userclasses: - # self.warning("making some PBC of type %r, which has " - # "already got a ClassDef" % (cls,)) + result = self.getfrozen(pyobj) self.descs[pyobj] = result return result + def getfrozen(self, pyobj): + result = description.FrozenDesc(self, pyobj) + cls = result.knowntype + if cls not in self.pbctypes: + self.pbctypes[cls] = True + return result + def getmethoddesc(self, funcdesc, originclassdef, selfclassdef, name): key = funcdesc, originclassdef, selfclassdef, name try: Modified: pypy/dist/pypy/annotation/builtin.py ============================================================================== --- pypy/dist/pypy/annotation/builtin.py (original) +++ pypy/dist/pypy/annotation/builtin.py Sat Dec 31 18:44:29 2005 @@ -110,7 +110,33 @@ def our_issubclass(cls1, cls2): """ we're going to try to be less silly in the face of old-style classes""" - return cls2 is object or issubclass(cls1, cls2) + from pypy.annotation.classdef import ClassDef + if cls2 is object: + return True + def classify(cls): + if isinstance(cls, ClassDef): + return 'def' + if cls.__module__ == '__builtin__': + return 'builtin' + else: + return 'cls' + kind1 = classify(cls1) + kind2 = classify(cls2) + if kind1 != 'def' and kind2 != 'def': + return 
issubclass(cls1, cls2) + if kind1 == 'builtin' and kind2 == 'def': + return False + elif kind1 == 'def' and kind2 == 'builtin': + return issubclass(object, cls2) + else: + bk = getbookkeeper() + def toclassdef(kind, cls): + if kind != 'def': + return bk.getuniqueclassdef(cls) + else: + return cls + return toclassdef(kind1, cls1).issubclass(toclassdef(kind2, cls2)) + def builtin_isinstance(s_obj, s_type, variables=None): r = SomeBool() @@ -138,13 +164,6 @@ if s_obj.is_constant(): r.const = isinstance(s_obj.const, typ) - elif isinstance(s_obj, SomeInstance): - typdef = getbookkeeper().getuniqueclassdef(typ) - if s_obj.classdef.issubclass(typdef): - if not s_obj.can_be_none(): - r.const = True - elif not typdef.issubclass(s_obj.classdef): - r.const = False elif our_issubclass(s_obj.knowntype, typ): if not s_obj.can_be_none(): r.const = True Modified: pypy/dist/pypy/annotation/description.py ============================================================================== --- pypy/dist/pypy/annotation/description.py (original) +++ pypy/dist/pypy/annotation/description.py Sat Dec 31 18:44:29 2005 @@ -320,6 +320,7 @@ self._classdefs = {} if pyobj is not None: + assert pyobj.__module__ != '__builtin__' cls = pyobj base = object baselist = list(cls.__bases__) From pedronis at codespeak.net Sat Dec 31 20:23:47 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sat, 31 Dec 2005 20:23:47 +0100 (CET) Subject: [pypy-svn] r21616 - pypy/dist/pypy/objspace/flow/test Message-ID: <20051231192347.0A12D27B5E@code1.codespeak.net> Author: pedronis Date: Sat Dec 31 20:23:46 2005 New Revision: 21616 Modified: pypy/dist/pypy/objspace/flow/test/test_checkgraph.py Log: fix broken test. 
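The `our_issubclass` change above (r21615) works by first classifying each operand and then dispatching on the pair of kinds. A rough standalone analogue of that idea, omitting the annotator's ClassDef arm and keeping only the builtin-vs-user split (names here are illustrative, not PyPy's):

```python
def classify(cls):
    # mirror of the kind split in our_issubclass, minus the ClassDef case;
    # '__builtin__' is the Python 2 name, 'builtins' the Python 3 one
    if cls.__module__ in ('__builtin__', 'builtins'):
        return 'builtin'
    return 'cls'

def less_silly_issubclass(cls1, cls2):
    if cls2 is object:
        return True              # everything is below object
    if classify(cls1) == 'builtin' and classify(cls2) != 'builtin':
        return False             # a builtin type never sits below a user class
    return issubclass(cls1, cls2)

class Base(object):
    pass

class Child(Base):
    pass

assert less_silly_issubclass(Child, Base)
assert less_silly_issubclass(int, object)
assert not less_silly_issubclass(int, Base)
```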
Modified: pypy/dist/pypy/objspace/flow/test/test_checkgraph.py ============================================================================== --- pypy/dist/pypy/objspace/flow/test/test_checkgraph.py (original) +++ pypy/dist/pypy/objspace/flow/test/test_checkgraph.py Sat Dec 31 20:23:46 2005 @@ -97,6 +97,5 @@ g = FunctionGraph("g", Block([v])) g.startblock.exitswitch = v g.startblock.closeblock(Link([Constant(1)], g.returnblock)) - checkgraph(g) py.test.raises(AssertionError, checkgraph, g) From pedronis at codespeak.net Sat Dec 31 21:49:07 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sat, 31 Dec 2005 21:49:07 +0100 (CET) Subject: [pypy-svn] r21617 - in pypy/dist/pypy: rpython translator/c/src Message-ID: <20051231204907.0B2E227B60@code1.codespeak.net> Author: pedronis Date: Sat Dec 31 21:49:06 2005 New Revision: 21617 Modified: pypy/dist/pypy/rpython/rint.py pypy/dist/pypy/translator/c/src/pyobj.h Log: fix for test_backendoptimized test_uint_switch for Python 2.3. Before 2.4 PyLong_AsUnsignedLong didn't accept just an int. Last bit of PyPy hacking for me for 2005. 
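The r21617 fix below adds a C shim because, before Python 2.4, PyLong_AsUnsignedLong rejected plain ints: the wrapper special-cases ints, turning negatives into an OverflowError and passing longs through unchanged. The intended semantics can be rendered in Python like this (the 32-bit width is an illustrative assumption; the real bound is the platform's ULONG_MAX):

```python
def rpylong_as_unsigned_long(v, bits=32):
    """Python rendering of the RPyLong_AsUnsignedLong shim's semantics:
    negative values overflow, values at or above 2**bits overflow,
    everything else converts unchanged."""
    if v < 0:
        raise OverflowError("can't convert negative value to unsigned long")
    if v >= 1 << bits:
        raise OverflowError("value too large for unsigned long")
    return v

assert rpylong_as_unsigned_long(42) == 42
try:
    rpylong_as_unsigned_long(-1)
except OverflowError:
    pass
else:
    assert False
```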
Modified: pypy/dist/pypy/rpython/rint.py ============================================================================== --- pypy/dist/pypy/rpython/rint.py (original) +++ pypy/dist/pypy/rpython/rint.py Sat Dec 31 21:49:06 2005 @@ -435,7 +435,7 @@ py_to_ll_conversion_functions = { UnsignedLongLong: ('RPyLong_AsUnsignedLongLong', lambda pyo: r_ulonglong(pyo._obj.value)), SignedLongLong: ('RPyLong_AsLongLong', lambda pyo: r_longlong(pyo._obj.value)), - Unsigned: ('PyLong_AsUnsignedLong', lambda pyo: r_uint(pyo._obj.value)), + Unsigned: ('RPyLong_AsUnsignedLong', lambda pyo: r_uint(pyo._obj.value)), Signed: ('PyInt_AsLong', lambda pyo: int(pyo._obj.value)) } Modified: pypy/dist/pypy/translator/c/src/pyobj.h ============================================================================== --- pypy/dist/pypy/translator/c/src/pyobj.h (original) +++ pypy/dist/pypy/translator/c/src/pyobj.h Sat Dec 31 21:49:06 2005 @@ -221,6 +221,27 @@ #ifndef PYPY_NOT_MAIN_FILE +#if (PY_VERSION_HEX < 0x02040000) + +unsigned long RPyLong_AsUnsignedLong(PyObject *v) +{ + if (PyInt_Check(v)) { + long val = PyInt_AsLong(v); + if (val < 0) { + PyErr_SetNone(PyExc_OverflowError); + return (unsigned long)-1; + } + return val; + } else { + return PyLong_AsUnsignedLong(v); + } +} + +#else +#define RPyLong_AsUnsignedLong PyLong_AsUnsignedLong +#endif + + unsigned long long RPyLong_AsUnsignedLongLong(PyObject *v) { if (PyInt_Check(v))