From cfbolz at codespeak.net Thu Sep 1 05:24:23 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 1 Sep 2005 05:24:23 +0200 (CEST) Subject: [pypy-svn] r17135 - pypy/dist/pypy/doc Message-ID: <20050901032423.CC5DB27B54@code1.codespeak.net> Author: cfbolz Date: Thu Sep 1 05:24:13 2005 New Revision: 17135 Added: pypy/dist/pypy/doc/garbage_collection.txt - copied, changed from r17103, pypy/dist/pypy/doc/gc_planning.txt Removed: pypy/dist/pypy/doc/gc_planning.txt Modified: pypy/dist/pypy/doc/_ref.txt pypy/dist/pypy/doc/index.txt pypy/dist/pypy/doc/translation.txt Log: documentation about the recent garbage collection work Modified: pypy/dist/pypy/doc/_ref.txt ============================================================================== --- pypy/dist/pypy/doc/_ref.txt (original) +++ pypy/dist/pypy/doc/_ref.txt Thu Sep 1 05:24:13 2005 @@ -39,11 +39,16 @@ .. _`pypy/rpython`: .. _`rpython/`: ../../pypy/rpython .. _`pypy/rpython/extfunctable.py`: ../../pypy/rpython/extfunctable.py -.. _`pypy/rpython/lltype.py`: -.. _`rpython/lltype.py`: ../../pypy/rpython/lltype.py +.. _`rpython/lltype.py`: +.. _`pypy/rpython/lltype.py`: ../../pypy/rpython/lltype.py +.. _`pypy/rpython/memory/gc.py`: ../../pypy/rpython/memory/gc.py +.. _`pypy/rpython/memory/lladdress.py`: ../../pypy/rpython/memory/lladdress.py +.. _`pypy/rpython/memory/simulator.py`: ../../pypy/rpython/memory/simulator.py +.. _`pypy/rpython/memory/support.py`: ../../pypy/rpython/memory/support.py .. _`pypy/rpython/module/`: ../../pypy/rpython/module .. _`pypy/rpython/module/ll_os.py`: ../../pypy/rpython/module/ll_os.py .. _`pypy/rpython/module/test`: ../../pypy/rpython/module/test +.. _`pypy/rpython/objectmodel.py`: ../../pypy/rpython/objectmodel.py .. _`rpython/rint.py`: ../../pypy/rpython/rint.py .. _`rpython/rlist.py`: ../../pypy/rpython/rlist.py .. _`rpython/rmodel.py`: ../../pypy/rpython/rmodel.py Modified: pypy/dist/pypy/doc/index.txt ============================================================================== --- pypy/dist/pypy/doc/index.txt (original) +++ pypy/dist/pypy/doc/index.txt Thu Sep 1 05:24:13 2005 @@ -23,6 +23,9 @@ `parser`_ contains the beginnings of documentation about the parser (and the compiler at some point). +`garbage collection`_ contains documentation about +garbage collection in PyPy. + `FAQ`_ contains the beginning of frequently asked questions. Right now it's a bit empty. @@ -50,7 +53,7 @@ .. _`getting started`: getting-started.html .. _`theory`: theory.html .. _`bytecode interpreter`: interpreter.html - +.. _`garbage collection`: garbage_collection.html .. _`directory reference`: PyPy directory cross-reference Modified: pypy/dist/pypy/doc/translation.txt ============================================================================== --- pypy/dist/pypy/doc/translation.txt (original) +++ pypy/dist/pypy/doc/translation.txt Thu Sep 1 05:24:13 2005 @@ -921,6 +921,8 @@ exception-catching links.) +.. _LLInterpreter: + The LLInterpreter ----------------- From hpk at codespeak.net Thu Sep 1 08:16:40 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 1 Sep 2005 08:16:40 +0200 (CEST) Subject: [pypy-svn] r17136 - in pypy/extradoc: . 
minute pypy.org Message-ID: <20050901061640.6CC7727B50@code1.codespeak.net> Author: hpk Date: Thu Sep 1 08:16:38 2005 New Revision: 17136 Added: pypy/extradoc/pypy.org/ pypy/extradoc/pypy.org/index.txt - copied unchanged from r17135, pypy/extradoc/pypyorg_20050827.txt Removed: pypy/extradoc/pypyorg_20050827.txt Modified: pypy/extradoc/minute/pypy-sync-08-11-2005.txt Log: move Bea's pypy.org web page to its own pypy.org directory Modified: pypy/extradoc/minute/pypy-sync-08-11-2005.txt ============================================================================== --- pypy/extradoc/minute/pypy-sync-08-11-2005.txt (original) +++ pypy/extradoc/minute/pypy-sync-08-11-2005.txt Thu Sep 1 08:16:38 2005 @@ -137,7 +137,7 @@ Holger closes the meeting in time at 13:30pm. -.. _`IRC-log`: : +.. _`IRC-log`: Here is the full IRC log:: From hpk at codespeak.net Thu Sep 1 08:23:09 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 1 Sep 2005 08:23:09 +0200 (CEST) Subject: [pypy-svn] r17137 - pypy/extradoc/pypy.org Message-ID: <20050901062309.6C56627B50@code1.codespeak.net> Author: hpk Date: Thu Sep 1 08:23:08 2005 New Revision: 17137 Added: pypy/extradoc/pypy.org/news.txt - copied, changed from r17136, pypy/extradoc/pypy.org/index.txt Modified: pypy/extradoc/pypy.org/index.txt Log: separating a news page out of Beas Modified: pypy/extradoc/pypy.org/index.txt ============================================================================== --- pypy/extradoc/pypy.org/index.txt (original) +++ pypy/extradoc/pypy.org/index.txt Thu Sep 1 08:23:08 2005 @@ -1,126 +1,41 @@ PyPy -Researching a higly flexible and modular language platform and implementing it by leveraging -the Open Source Python Language and Community - -The PyPy project have been an ongoing Open Source Python language implementation since 2003. -In December 2004 PyPy recieved EU-funding within the Framework Programme 6, second call for proposals -("Open development platforms and services" IST). - -A consortium of 7 partners in Germany, France and Sweden are working to achieve the goal of a -open run-time environment for the Open Source Programming Language Python. The scientific aspects of the project is -to investigate novel techniques (based on aspect-oriented programming code generation and abstract interpretation) for the implementation of +Researching a higly flexible and modular language platform and +implementing it by leveraging the Open Source Python Language and +Community + +The PyPy project have been an ongoing Open Source Python language +implementation since 2003. In December 2004 PyPy recieved EU-funding +within the Framework Programme 6, second call for proposals ("Open +development platforms and services" IST). + +A consortium of 7 partners in Germany, France and Sweden are working to +achieve the goal of a open run-time environment for the Open Source +Programming Language Python. The scientific aspects of the project is to +investigate novel techniques (based on aspect-oriented programming code +generation and abstract interpretation) for the implementation of practical dynamic languages. -A methodological goal of the project is also to show case a novel software engineering process, Sprint Driven Development. This is an Agile -methodology, providing adynamic and adaptive environment, suitable for co-operative and distributed development. 
- -The project is divided into three major phases, phase 1 has the focus of developing the actual research tool - the self contained compiler, -phase 2 has the focus of optimisations (core, translation and dynamic) and in phase 3 the actual integration of efforts and dissemination of the results. -The project has an expected deadline in November 2006. - -PyPy is still, though EU-funded, heavily integrated in the Open Source community of Python.The methodology of choice -is the key strategy to make sure that the community of skilled and enthusiastic developers can contribute in ways that wouldn't have been possible -without EU-funding. - -For questions regarding the PyPy-project, please email our consortium at pypy-funding at codespeak.net. - -For more detailed information, documentation and code - please visit the PyPy community housed at Codespeak -http://codespeak.net/pypy - -______________________________________________________________________________________________________________________________ - -News - -PyPy in Heidelberg August 2005 -The team met up, in total 14 people to work on the planned 0.7 release. - -PyPy sprint in Hildesheim July 2005 -The team met up at Trillke Gut and worked on the upcoming phase 1 -deliverables. The result was PyPy - selfcontained! - - -PyPy at Europython in Gothenburg June/July 2005 -The Pypy team was busy sprinting 4 days before Europython and 6 days afterwards. -There where also several PyPy talks at the conference, a joint talk by Holger Krekel, -Armin Rigo, Carl Friedrich Bolz about translation aspects of PyPy, Holger Krekel again about Py.test -and Beatrice Düring about sprint driven development. -This sprint had 20 participants which is a PyPy record. - -PyPy at ACCU April 2005 -Armin Rigo and Jacob Hallén held 2 talks at the ACCU conference,about PyPy and sprint driven -development. - -PyPy at PyCon in Washington March 2005 -The Pypy team sprinted at PyCon - 13 developers worked on getting -Pypy compliant with the CPython regression test. Armin Rigo and Holger Krekel -also did talks about the Object space and Py.test. - -PyPy at the Calibre conference in Paris March 2005 -Beatrice Düring from the PyPy team participated in the Calibre workshop -"Libre software - which business model?". - -Sprint in Leysin, Switzerland January 2005 -The PyPy team met up for the first sprint after the official start of the EU funded project. -13 people participated and worked together for 7 days. -The team succeded in importing and running CPython test on PyPy. - - -Contract signed - PyPy is flying December 2004 -The PyPy team recieved contract confirmation 1 December 2004 form the Commission -The team kicked of the work with a Consortium meeting i Saarbruecken. 
- - -Partners - - -DFKI www.dfki.de - Stephan Busemann stephan.busemann at dfki.de - (coordinator) - - Alastair Burt alastair.burt at dfki.de - - Anders Lehmann anders.lehmann at dfki.de - -AB Strakt www.strakt.com - Jacob Hallén jacob at strakt.com - (project manager) - - Samuele Pedroni pedronis at strakt.com - (technical board) - - Anders Chrigström ac at strakt.com - -Change Maker www.changemaker.nu - Beatrice Düring bea at changemaker.nu - (assistant project manager) - -Merlinux GmbH www.merlinux.de - Holger Krekel krekel at merlinux.de - (technical board) - -Heinrich Heine Universität Düsseldorf www.uni-duesseldorf.de/ - Armin Rigo arigo at tunes.org - (technical board) - -Tismerysoft www.stackless.com - Christian Tismer tismer at stackless.com - (technical board) - -Logilab www.logilab.fr - Nicholas Chauvat Nicolas.Chauvat at logilab.fr - - Ludovic Aubry ludal at logilab.fr - - Adrien DiMascio adim at logilab.fr - -impara GmbH www.impara.de - bert at impara.de - -Also - as individual partners: +A methodological goal of the project is also to show case a novel +software engineering process, Sprint Driven Development. This is an +Agile methodology, providing adynamic and adaptive environment, suitable +for co-operative and distributed development. + +The project is divided into three major phases, phase 1 has the focus of +developing the actual research tool - the self contained compiler, phase +2 has the focus of optimisations (core, translation and dynamic) and in +phase 3 the actual integration of efforts and dissemination of the +results. The project has an expected deadline in November 2006. + +PyPy is still, though EU-funded, heavily integrated in the Open Source +community of Python. The methodology of choice is the key strategy to +make sure that the community of skilled and enthusiastic developers can +contribute in ways that wouldn't have been possible without EU-funding. -Laura Creighton lac at strakt.com -Eric van Riet Paap eric at vanrietpaap.nl -Richard Emslie rxe at ukshells.co.uk +For questions regarding the PyPy-project, please email our consortium at +pypy-funding at codespeak.net or Bea During (bea at changemaker nu). +For more detailed information, documentation and code - please visit the +`PyPy community housed at codespeak`_. +.. _`PyPy community housed at codespeak`: http://codespeak.net/pypy From hpk at codespeak.net Thu Sep 1 08:25:47 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 1 Sep 2005 08:25:47 +0200 (CEST) Subject: [pypy-svn] r17138 - pypy/extradoc/pypy.org Message-ID: <20050901062547.2FE5F27B57@code1.codespeak.net> Author: hpk Date: Thu Sep 1 08:25:46 2005 New Revision: 17138 Added: pypy/extradoc/pypy.org/consortium.txt - copied, changed from r17137, pypy/extradoc/pypy.org/news.txt Modified: pypy/extradoc/pypy.org/news.txt Log: split of partners into its own page (consortium.txt) Modified: pypy/extradoc/pypy.org/news.txt ============================================================================== --- pypy/extradoc/pypy.org/news.txt (original) +++ pypy/extradoc/pypy.org/news.txt Thu Sep 1 08:25:46 2005 @@ -62,55 +62,3 @@ Saarbruecken. 
-Consortium Partners - -DFKI www.dfki.de - Stephan Busemann stephan.busemann at dfki.de - (coordinator) - - Alastair Burt alastair.burt at dfki.de - - Anders Lehmann anders.lehmann at dfki.de - -AB Strakt www.strakt.com - Jacob Hall?n jacob at strakt.com - (project manager) - - Samuele Pedroni pedronis at strakt.com - (technical board) - - Anders Chrigstr?m ac at strakt.com - -Change Maker www.changemaker.nu - Beatrice D?ring bea at changemaker.nu - (assistant project manager) - -Merlinux GmbH www.merlinux.de - Holger Krekel krekel at merlinux.de - (technical director/technical board) - -Heinrich Heine Universit?t D?sseldorf www.uni-duesseldorf.de/ - Armin Rigo arigo at tunes.org - (technical board) - -Tismerysoft www.stackless.com - Christian Tismer tismer at stackless.com - (technical board) - -Logilab www.logilab.fr - Nicholas Chauvat Nicolas.Chauvat at logilab.fr - - Ludovic Aubry ludal at logilab.fr - - Adrien DiMascio adim at logilab.fr - -impara GmbH www.impara.de - Bert Freudenberg bert at impara.de - -Also - as individual partners: - -Laura Creighton lac at strakt.com -Eric van Riet Paap eric at vanrietpaap.nl -Richard Emslie rxe at ukshells.co.uk - - From hpk at codespeak.net Thu Sep 1 08:29:08 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 1 Sep 2005 08:29:08 +0200 (CEST) Subject: [pypy-svn] r17139 - pypy/extradoc/pypy.org Message-ID: <20050901062908.9952B27B50@code1.codespeak.net> Author: hpk Date: Thu Sep 1 08:29:08 2005 New Revision: 17139 Added: pypy/extradoc/pypy.org/confrest.py - copied, changed from r17082, pypy/dist/pypy/doc/confrest.py pypy/extradoc/pypy.org/conftest.py - copied unchanged from r17082, pypy/dist/pypy/doc/conftest.py Modified: pypy/extradoc/pypy.org/consortium.txt pypy/extradoc/pypy.org/news.txt Log: converting to UTF-8 Modified: pypy/extradoc/pypy.org/consortium.txt ============================================================================== --- pypy/extradoc/pypy.org/consortium.txt (original) +++ pypy/extradoc/pypy.org/consortium.txt Thu Sep 1 08:29:08 2005 @@ -8,17 +8,17 @@ Anders Lehmann anders.lehmann at dfki.de AB Strakt http://www.strakt.com - Jacob Hall?n jacob at strakt.com (project manager) + Jacob Hall??n jacob at strakt.com (project manager) Samuele Pedroni pedronis at strakt.com (technical board) - Anders Chrigstr?m ac at strakt.com + Anders Chrigstr??m ac at strakt.com Change Maker http://www.changemaker.nu - Beatrice D?ring bea at changemaker.nu (assistant project manager) + Beatrice D??ring bea at changemaker.nu (assistant project manager) Merlinux GmbH http://merlinux.de Holger Krekel krekel at merlinux.de (technical director/board) -Heinrich Heine Universit?t D?sseldorf http://www.uni-duesseldorf.de/ +Heinrich Heine Universit??t D??sseldorf http://www.uni-duesseldorf.de/ Armin Rigo arigo at tunes.org (technical board) Tismerysoft http://www.stackless.com Modified: pypy/extradoc/pypy.org/news.txt ============================================================================== --- pypy/extradoc/pypy.org/news.txt (original) +++ pypy/extradoc/pypy.org/news.txt Thu Sep 1 08:29:08 2005 @@ -4,7 +4,7 @@ The team met up, in total 14 people to work on the planned 0.7 release. PyPy sprint in Hildesheim July 2005 ----------------------------------- +---------------------------------------- The team met up at Trillke Gut and worked on the upcoming phase 1 deliverables. The result was PyPy - selfcontained! @@ -17,13 +17,13 @@ afterwards. 
There where also several PyPy talks at the conference, a joint talk by Holger Krekel, Armin Rigo, Carl Friedrich Bolz about translation aspects of PyPy, Holger Krekel about py.test and -Beatrice D?ring about sprint driven development. This sprint had 20 +Beatrice D??ring about sprint driven development. This sprint had 20 participants which is a PyPy record. PyPy at ACCU April 2005 ------------------------------------------------------ -Armin Rigo and Jacob Hall?n held 2 talks at the ACCU conference,about PyPy and sprint driven +Armin Rigo and Jacob Hall??n held 2 talks at the ACCU conference,about PyPy and sprint driven development. PyPy at PyCon in Washington March 2005 @@ -36,7 +36,7 @@ PyPy at the Calibre conference in Paris March 2005 ------------------------------------------------------ -Beatrice D?ring from the PyPy team participated in the Calibre workshop +Beatrice D??ring from the PyPy team participated in the Calibre workshop "Libre software - which business model?". Sprint in Leysin, Switzerland January 2005 From hpk at codespeak.net Thu Sep 1 08:32:11 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 1 Sep 2005 08:32:11 +0200 (CEST) Subject: [pypy-svn] r17140 - pypy/extradoc/pypy.org Message-ID: <20050901063211.44A1027B57@code1.codespeak.net> Author: hpk Date: Thu Sep 1 08:32:10 2005 New Revision: 17140 Modified: pypy/extradoc/pypy.org/ (props changed) pypy/extradoc/pypy.org/confrest.py Log: fixeol + adjustments Modified: pypy/extradoc/pypy.org/confrest.py ============================================================================== --- pypy/extradoc/pypy.org/confrest.py (original) +++ pypy/extradoc/pypy.org/confrest.py Thu Sep 1 08:32:10 2005 @@ -4,7 +4,7 @@ def fill(self): super(PyPyPage, self).fill() self.menubar[:] = html.div( - html.a("home", href="home.html", class_="menu"), " ", + html.a("home", href="index.html", class_="menu"), " ", html.a("news", href="news.html", class_="menu"), " ", html.a("consortium", href="consortium.html", class_="menu"), " ", " ", id="menubar") From hpk at codespeak.net Thu Sep 1 08:51:15 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 1 Sep 2005 08:51:15 +0200 (CEST) Subject: [pypy-svn] r17141 - pypy/dist/pypy/doc Message-ID: <20050901065115.5402527B56@code1.codespeak.net> Author: hpk Date: Thu Sep 1 08:51:14 2005 New Revision: 17141 Modified: pypy/dist/pypy/doc/style.css Log: added the position for another (EU) logo Modified: pypy/dist/pypy/doc/style.css ============================================================================== --- pypy/dist/pypy/doc/style.css (original) +++ pypy/dist/pypy/doc/style.css Thu Sep 1 08:51:14 2005 @@ -772,6 +772,12 @@ top: 4px; left: 4px; } + +img#extraimg { + position: absolute; + right: 14px; + top: 4px; +} div#navspace { position: absolute; From hpk at codespeak.net Thu Sep 1 08:51:34 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 1 Sep 2005 08:51:34 +0200 (CEST) Subject: [pypy-svn] r17142 - pypy/extradoc/pypy.org Message-ID: <20050901065134.09B3A27B5C@code1.codespeak.net> Author: hpk Date: Thu Sep 1 08:51:33 2005 New Revision: 17142 Added: pypy/extradoc/pypy.org/eu-logo-small.jpg (contents, props changed) Modified: pypy/extradoc/pypy.org/confrest.py pypy/extradoc/pypy.org/conftest.py pypy/extradoc/pypy.org/consortium.txt pypy/extradoc/pypy.org/index.txt Log: bring the web page into shape, add an EU logo Modified: pypy/extradoc/pypy.org/confrest.py ============================================================================== --- 
pypy/extradoc/pypy.org/confrest.py (original) +++ pypy/extradoc/pypy.org/confrest.py Thu Sep 1 08:51:33 2005 @@ -1,24 +1,30 @@ from py.__.documentation.confrest import * -class PyPyPage(Page): +class PyPyPage(Page): def fill(self): super(PyPyPage, self).fill() self.menubar[:] = html.div( html.a("home", href="index.html", class_="menu"), " ", html.a("news", href="news.html", class_="menu"), " ", html.a("consortium", href="consortium.html", class_="menu"), " ", + html.a("community/coding", + href="http://codespeak.net/pypy/dist/pypy/doc/index.html", + class_="menu"), " ", " ", id="menubar") -class Project(Project): - title = "PyPy EU Project" - stylesheet = 'style.css' - encoding = 'latin1' +class Project(Project): + title = "PyPy EU Project" + stylesheet = 'http://codespeak.net/pypy/dist/pypy/doc/style.css' + encoding = 'latin1' prefix_title = "EU/PyPy" logo = html.div( - html.a( - html.img(alt="PyPy", id="pyimg", - src="http://codespeak.net/pypy/img/py-web1.png", - height=110, width=149))) - Page = PyPyPage + html.a(html.img(alt="PyPy", id="pyimg", + src="http://codespeak.net/pypy/img/py-web1.png", + height=110, width=149)), + html.img(alt="EU Logo", id="extraimg", + src="eu-logo-small.jpg", + height=105, width=154), + ) + Page = PyPyPage Modified: pypy/extradoc/pypy.org/conftest.py ============================================================================== --- pypy/extradoc/pypy.org/conftest.py (original) +++ pypy/extradoc/pypy.org/conftest.py Thu Sep 1 08:51:33 2005 @@ -1,31 +1,31 @@ import py from py.__.documentation.conftest import Directory, DoctestText, ReSTChecker -class PyPyDoctestText(DoctestText): +class PyPyDoctestText(DoctestText): - def run(self): - # XXX refine doctest support with respect to scoping - return - - def execute(self, module, docstring): - # XXX execute PyPy prompts as well + def run(self): + # XXX refine doctest support with respect to scoping + return + + def execute(self, module, docstring): + # XXX execute PyPy prompts as well l = [] - for line in docstring.split('\n'): - if line.find('>>>>') != -1: - line = "" - l.append(line) - text = "\n".join(l) - super(PyPyDoctestText, self).execute(module, text) + for line in docstring.split('\n'): + if line.find('>>>>') != -1: + line = "" + l.append(line) + text = "\n".join(l) + super(PyPyDoctestText, self).execute(module, text) - #mod = py.std.types.ModuleType(self.fspath.basename, text) - #self.mergescopes(mod, scopes) + #mod = py.std.types.ModuleType(self.fspath.basename, text) + #self.mergescopes(mod, scopes) #failed, tot = py.std.doctest.testmod(mod, verbose=1) #if failed: # py.test.fail("doctest %s: %s failed out of %s" %( # self.fspath, failed, tot)) -class PyPyReSTChecker(ReSTChecker): - DoctestText = PyPyDoctestText - -class Directory(Directory): - ReSTChecker = PyPyReSTChecker +class PyPyReSTChecker(ReSTChecker): + DoctestText = PyPyDoctestText + +class Directory(Directory): + ReSTChecker = PyPyReSTChecker Modified: pypy/extradoc/pypy.org/consortium.txt ============================================================================== --- pypy/extradoc/pypy.org/consortium.txt (original) +++ pypy/extradoc/pypy.org/consortium.txt Thu Sep 1 08:51:33 2005 @@ -15,7 +15,7 @@ Change Maker http://www.changemaker.nu Beatrice D?ring bea at changemaker.nu (assistant project manager) -Merlinux GmbH http://merlinux.de +merlinux GmbH http://merlinux.de Holger Krekel krekel at merlinux.de (technical director/board) Heinrich Heine Universit?t D?sseldorf http://www.uni-duesseldorf.de/ Added: 
pypy/extradoc/pypy.org/eu-logo-small.jpg ============================================================================== Binary file. No diff available. Modified: pypy/extradoc/pypy.org/index.txt ============================================================================== --- pypy/extradoc/pypy.org/index.txt (original) +++ pypy/extradoc/pypy.org/index.txt Thu Sep 1 08:51:33 2005 @@ -1,9 +1,13 @@ -PyPy +PyPy EU project title +-------------------------------- Researching a higly flexible and modular language platform and implementing it by leveraging the Open Source Python Language and Community +PyPy EU project description +-------------------------------- + The PyPy project have been an ongoing Open Source Python language implementation since 2003. In December 2004 PyPy recieved EU-funding within the Framework Programme 6, second call for proposals ("Open From hpk at codespeak.net Thu Sep 1 08:59:08 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 1 Sep 2005 08:59:08 +0200 (CEST) Subject: [pypy-svn] r17143 - pypy/dist/pypy/doc Message-ID: <20050901065908.A404D27B56@code1.codespeak.net> Author: hpk Date: Thu Sep 1 08:59:08 2005 New Revision: 17143 Modified: pypy/dist/pypy/doc/confrest.py Log: insert a top-level link to the EU project pages Modified: pypy/dist/pypy/doc/confrest.py ============================================================================== --- pypy/dist/pypy/doc/confrest.py (original) +++ pypy/dist/pypy/doc/confrest.py Thu Sep 1 08:59:08 2005 @@ -9,6 +9,8 @@ html.a("contact", href="contact.html", class_="menu"), " ", html.a("getting-started", href="getting-started.html", class_="menu"), " ", + html.a("EU/project", + href="http://pypy.org/", class_="menu"), " ", html.a("issue", href="https://codespeak.net/issue/pypy-dev/", class_="menu"), From hpk at codespeak.net Thu Sep 1 09:11:15 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 1 Sep 2005 09:11:15 +0200 (CEST) Subject: [pypy-svn] r17144 - pypy/dist/pypy/doc Message-ID: <20050901071115.D1ED627B56@code1.codespeak.net> Author: hpk Date: Thu Sep 1 09:11:15 2005 New Revision: 17144 Modified: pypy/dist/pypy/doc/style.css Log: smaller fonts for menu/links Modified: pypy/dist/pypy/doc/style.css ============================================================================== --- pypy/dist/pypy/doc/style.css (original) +++ pypy/dist/pypy/doc/style.css Thu Sep 1 09:11:15 2005 @@ -111,7 +111,7 @@ span.menu_selected { color: black; - font: 140% Verdana, Helvetica, Arial, sans-serif; + font: 120% Verdana, Helvetica, Arial, sans-serif; text-decoration: none; padding-right: 0.3em; background-color: #cccccc; @@ -120,14 +120,14 @@ a.menu { /*color: #3ba6ec; */ - font: 140% Verdana, Helvetica, Arial, sans-serif; + font: 120% Verdana, Helvetica, Arial, sans-serif; text-decoration: none; padding-right: 0.3em; } a.menu[href]:visited, a.menu[href]:link{ /*color: #3ba6ec; */ - font: 140% Verdana, Helvetica, Arial, sans-serif; + font: 120% Verdana, Helvetica, Arial, sans-serif; text-decoration: none; } From hpk at codespeak.net Thu Sep 1 09:17:42 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 1 Sep 2005 09:17:42 +0200 (CEST) Subject: [pypy-svn] r17145 - pypy/extradoc/pypy.org Message-ID: <20050901071742.B61C227B57@code1.codespeak.net> Author: hpk Date: Thu Sep 1 09:17:42 2005 New Revision: 17145 Modified: pypy/extradoc/pypy.org/confrest.py pypy/extradoc/pypy.org/consortium.txt pypy/extradoc/pypy.org/index.txt Log: - obfuscate email addresses - small other changes Modified: 
pypy/extradoc/pypy.org/confrest.py ============================================================================== --- pypy/extradoc/pypy.org/confrest.py (original) +++ pypy/extradoc/pypy.org/confrest.py Thu Sep 1 09:17:42 2005 @@ -7,6 +7,8 @@ html.a("home", href="index.html", class_="menu"), " ", html.a("news", href="news.html", class_="menu"), " ", html.a("consortium", href="consortium.html", class_="menu"), " ", + html.a("EU-Issues", href="https://codespeak.net/issue/pypy-eu/", + class_="menu"), " ", html.a("community/coding", href="http://codespeak.net/pypy/dist/pypy/doc/index.html", class_="menu"), " ", Modified: pypy/extradoc/pypy.org/consortium.txt ============================================================================== --- pypy/extradoc/pypy.org/consortium.txt (original) +++ pypy/extradoc/pypy.org/consortium.txt Thu Sep 1 09:17:42 2005 @@ -2,40 +2,40 @@ ------------------------------ DFKI http://www.dfki.de - Stephan Busemann stephan.busemann at dfki.de + Stephan Busemann [stephan.busemann at dfki de] (EU coordinator) - Alastair Burt alastair.burt at dfki.de - Anders Lehmann anders.lehmann at dfki.de + Alastair Burt [alastair.burt at dfki de] + Anders Lehmann [anders.lehmann at dfki de] AB Strakt http://www.strakt.com - Jacob Hallén jacob at strakt.com (project manager) - Samuele Pedroni pedronis at strakt.com (technical board) - Anders Chrigström ac at strakt.com + Jacob Hallén [jacob at strakt com] (project manager) + Samuele Pedroni [pedronis at strakt com] (technical board) + Anders Chrigström [ac at strakt com] Change Maker http://www.changemaker.nu - Beatrice Düring bea at changemaker.nu (assistant project manager) + Beatrice Düring [bea at changemaker nu] (assistant project manager) merlinux GmbH http://merlinux.de - Holger Krekel krekel at merlinux.de (technical director/board) + Holger Krekel [krekel at merlinux de] (technical director/board) Heinrich Heine Universität Düsseldorf http://www.uni-duesseldorf.de/ - Armin Rigo arigo at tunes.org (technical board) + Armin Rigo [arigo at tunes org] (technical board) Tismerysoft http://www.stackless.com - Christian Tismer tismer at stackless.com (technical board) + Christian Tismer [tismer at stackless com] (technical board) Logilab http://www.logilab.fr - Nicholas Chauvat Nicolas.Chauvat at logilab.fr - Ludovic Aubry ludal at logilab.fr - Adrien DiMascio adim at logilab.fr + Nicholas Chauvat [Nicolas.Chauvat at logilab fr] + Ludovic Aubry [ludal at logilab fr] + Adrien DiMascio [adim at logilab fr] impara GmbH www.impara.de - Bert Freudenberg bert at impara.de + Bert Freudenberg [bert at impara de] Individual Partners ------------------------- -Laura Creighton lac at strakt.com -Eric van Riet Paap eric at vanrietpaap.nl -Richard Emslie rxe at ukshells.co.uk +Laura Creighton [lac at strakt com] +Eric van Riet Paap [eric at vanrietpaap nl] +Richard Emslie [rxe at ukshells co uk] Modified: pypy/extradoc/pypy.org/index.txt ============================================================================== --- pypy/extradoc/pypy.org/index.txt (original) +++ pypy/extradoc/pypy.org/index.txt Thu Sep 1 09:17:42 2005 @@ -1,12 +1,12 @@ -PyPy EU project title --------------------------------- +PyPy EU project title (contract number: 004479) +------------------------------------------------ Researching a higly flexible and modular language platform and implementing it by leveraging the Open Source Python Language and Community -PyPy EU project description 
+-------------------------------------- The PyPy project have been an ongoing Open Source Python language implementation since 2003. In December 2004 PyPy recieved EU-funding @@ -37,7 +37,7 @@ contribute in ways that wouldn't have been possible without EU-funding. For questions regarding the PyPy-project, please email our consortium at -pypy-funding at codespeak.net or Bea During (bea at changemaker nu). +[pypy-funding at codespeak net] or Bea During (bea at changemaker nu). For more detailed information, documentation and code - please visit the `PyPy community housed at codespeak`_. From ale at codespeak.net Thu Sep 1 13:01:21 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Thu, 1 Sep 2005 13:01:21 +0200 (CEST) Subject: [pypy-svn] r17150 - pypy/dist/pypy/module/_codecs Message-ID: <20050901110121.C5C1827B56@code1.codespeak.net> Author: ale Date: Thu Sep 1 13:01:20 2005 New Revision: 17150 Modified: pypy/dist/pypy/module/_codecs/app_codecs.py Log: Cleanup of app_codecs. Sorry for the large diff. Trying to follow the coding conventions Modified: pypy/dist/pypy/module/_codecs/app_codecs.py ============================================================================== --- pypy/dist/pypy/module/_codecs/app_codecs.py (original) +++ pypy/dist/pypy/module/_codecs/app_codecs.py Thu Sep 1 13:01:20 2005 @@ -124,22 +124,22 @@ """ if encoding == None: encoding = sys.getdefaultencoding() - if isinstance(encoding,str): + if isinstance(encoding, str): decoder = lookup(encoding)[1] - if decoder and isinstance(errors,str): - res = decoder(obj,errors) - if not isinstance(res,tuple) or len(res) != 2: - raise TypeError("encoder must return a tuple (object,integer)") + if decoder and isinstance(errors, str): + res = decoder(obj, errors) + if not isinstance(res, tuple) or len(res) != 2: + raise TypeError("encoder must return a tuple (object, integer)") return res[0] else: raise TypeError("Errors must be a string") else: raise TypeError("Encoding must be a string") -def latin_1_encode( obj,errors='strict'): +def latin_1_encode( obj, errors='strict'): """None """ - res = PyUnicode_EncodeLatin1(obj,len(obj),errors) + res = PyUnicode_EncodeLatin1(obj, len(obj), errors) res = ''.join(res) return res, len(res) # XXX MBCS codec might involve ctypes ? 
@@ -148,87 +148,87 @@ """ pass -def readbuffer_encode( obj,errors='strict'): +def readbuffer_encode( obj, errors='strict'): """None """ res = str(obj) - return res,len(res) + return res, len(res) -def escape_encode( obj,errors='strict'): +def escape_encode( obj, errors='strict'): """None """ s = repr(obj) v = s[1:-1] - return v,len(v) + return v, len(v) -def utf_8_decode( data,errors='strict',final=False): +def utf_8_decode( data, errors='strict', final=False): """None """ consumed = len(data) if final: consumed = 0 - res,consumed = PyUnicode_DecodeUTF8Stateful(data, len(data), errors, final) + res, consumed = PyUnicode_DecodeUTF8Stateful(data, len(data), errors, final) res = u''.join(res) return res, consumed -def raw_unicode_escape_decode( data,errors='strict'): +def raw_unicode_escape_decode( data, errors='strict'): """None """ res = PyUnicode_DecodeRawUnicodeEscape(data, len(data), errors) res = u''.join(res) - return res,len(res) + return res, len(res) -def utf_7_decode( data,errors='strict'): +def utf_7_decode( data, errors='strict'): """None """ - res = PyUnicode_DecodeUTF7(data,len(data),errors) + res = PyUnicode_DecodeUTF7(data, len(data), errors) res = u''.join(res) - return res,len(res) + return res, len(res) -def unicode_escape_encode( obj,errors='strict'): +def unicode_escape_encode( obj, errors='strict'): """None """ - res = unicodeescape_string(obj,len(obj),0) + res = unicodeescape_string(obj, len(obj), 0) res = ''.join(res) return res, len(res) -def latin_1_decode( data,errors='strict'): +def latin_1_decode( data, errors='strict'): """None """ - res = PyUnicode_DecodeLatin1(data,len(data),errors) + res = PyUnicode_DecodeLatin1(data, len(data), errors) res = u''.join(res) return res, len(res) -def utf_16_decode( data,errors='strict',final=False): +def utf_16_decode( data, errors='strict', final=False): """None """ consumed = len(data) if final: consumed = 0 - res,consumed,byteorder = PyUnicode_DecodeUTF16Stateful(data,len(data),errors,'native',final) + res, consumed, byteorder = PyUnicode_DecodeUTF16Stateful(data, len(data), errors, 'native', final) res = ''.join(res) return res, consumed -def unicode_escape_decode( data,errors='strict'): +def unicode_escape_decode( data, errors='strict'): """None """ - res = PyUnicode_DecodeUnicodeEscape(data,len(data),errors) + res = PyUnicode_DecodeUnicodeEscape(data, len(data), errors) res = u''.join(res) return res, len(res) -def ascii_decode( data,errors='strict'): +def ascii_decode( data, errors='strict'): """None """ - res = PyUnicode_DecodeASCII(data,len(data),errors) + res = PyUnicode_DecodeASCII(data, len(data), errors) res = u''.join(res) return res, len(res) -def charmap_encode(obj,errors='strict',mapping='latin-1'): +def charmap_encode(obj, errors='strict', mapping='latin-1'): """None """ - res = PyUnicode_EncodeCharmap(obj,len(obj),mapping,errors) + res = PyUnicode_EncodeCharmap(obj, len(obj), mapping, errors) res = ''.join(res) return res, len(res) @@ -237,7 +237,7 @@ else: unicode_bytes = 4 -def unicode_internal_encode( obj,errors='strict'): +def unicode_internal_encode( obj, errors='strict'): """None """ if type(obj) == unicode: @@ -255,16 +255,16 @@ return res, len(res) else: res = "You can do better than this" # XXX make this right - return res,len(res) + return res, len(res) -def unicode_internal_decode( unistr,errors='strict'): +def unicode_internal_decode( unistr, errors='strict'): """None """ if type(unistr) == unicode: - return unistr,len(unistr) + return unistr, len(unistr) else: - p=[] - i=0 + p = [] + i = 0 if 
sys.byteorder == "big": start = unicode_bytes - 1 stop = -1 @@ -296,18 +296,18 @@ consumed = len(data) if final: consumed = 0 - res,consumed,byteorder = PyUnicode_DecodeUTF16Stateful(data, len(data), errors, bm, consumed) + res, consumed, byteorder = PyUnicode_DecodeUTF16Stateful(data, len(data), errors, bm, consumed) res = ''.join(res) return res, consumed, byteorder # XXX needs error messages when the input is invalid -def escape_decode(data,errors='strict'): +def escape_decode(data, errors='strict'): """None """ l = len(data) i = 0 res = [] - while i= 0x00010000): - p.append('U') - p.append("%.8x" % ord(c)) + p += 'U' + p += "%.8x" % ord(c) elif (oc >= 0x100): - p.append('u') - p.append("%.4x" % ord(c)) + p += 'u' + p += "%.4x" % ord(c) else: - p.append('x') - p.append("%.2x" % ord(c)) - return u''.join(p),exc.end + p += 'x' + p += "%.2x" % ord(c) + return ''.join(p), exc.end else: raise TypeError("don't know how to handle %.400s in error callback"%type(exc)) @@ -526,7 +528,7 @@ 3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 3, 3, 3, 1, 1, ] -unicode_latin1=[None]*256 +unicode_latin1 = [None]*256 def lookup_error(errors): @@ -556,11 +558,11 @@ else: raise TypeError("handler must be callable") -register_error("strict",strict_errors) -register_error("ignore",ignore_errors) -register_error("replace",replace_errors) -register_error("xmlcharrefreplace",xmlcharrefreplace_errors) -register_error("backslashreplace",backslashreplace_errors) +register_error("strict", strict_errors) +register_error("ignore", ignore_errors) +register_error("replace", replace_errors) +register_error("xmlcharrefreplace", xmlcharrefreplace_errors) +register_error("backslashreplace", backslashreplace_errors) def SPECIAL(c, encodeO, encodeWS): c = ord(c) @@ -584,16 +586,11 @@ return ord(c) + 4 def ENCODE( ch, bits) : - charvalue = 0 out = [] -## for c in ch: -## charvalue <<= 16 -## charvalue += ord(c) while (bits >= 6): out += B64(ch >> (bits-6)) bits -= 6 - return out,bits - + return out, bits def PyUnicode_DecodeUTF7(s, size, errors): @@ -631,7 +628,7 @@ ## it in a 16-bit character surrogate = 1 msg = "code pairs are not supported" - out,x = unicode_call_errorhandler(errors,'utf-7',msg,s,i-1,i) + out, x = unicode_call_errorhandler(errors, 'utf-7', msg, s, i-1, i) p += out bitsleft = 0 break @@ -643,7 +640,7 @@ ## bitsleft < 6 then we could just classify it as padding ## but that is not the case here */ msg = "partial character in shift sequence" - out,x = unicode_call_errorhandler(errors,'utf-7',msg,s,i-1,i) + out, x = unicode_call_errorhandler(errors, 'utf-7', msg, s, i-1, i) ## /* According to RFC2152 the remaining bits should be zero. 
We ## choose to signal an error/insert a replacement character @@ -657,32 +654,32 @@ p += '-' inShift = 1 - elif SPECIAL(ch,0,0) : - raise UnicodeDecodeError,"unexpected special character" + elif SPECIAL(ch, 0, 0) : + raise UnicodeDecodeError, "unexpected special character" else: p += ch else: charsleft = (charsleft << 6) | UB64(ch) bitsleft += 6 - i+=1 + i += 1 ## /* p, charsleft, bitsleft, surrogate = */ DECODE(p, charsleft, bitsleft, surrogate); elif ( ch == '+' ): startinpos = i - i+=1 + i += 1 if (i= 0x10000): p += '\\' p += 'U' - p += '%08x'%ord(ch) + p += '%08x' % ord(ch) pos += 1 continue #endif @@ -811,7 +792,7 @@ ucs = (((ord(ch) & 0x03FF) << 10) | (ord(ch2) & 0x03FF)) + 0x00010000 p += '\\' p += 'U' - p += '%08x'%ucs + p += '%08x' % ucs pos += 1 continue @@ -822,7 +803,7 @@ if (ord(ch) >= 256): p += '\\' p += 'u' - p += '%04x'%ord(ch) + p += '%04x' % ord(ch) #/* Map special whitespace to '\t', \n', '\r' */ elif (ch == '\t'): @@ -841,7 +822,7 @@ elif (ch < ' ' or ch >= 0x7F) : p += '\\' p += 'x' - p += '%02x'%ord(ch) + p += '%02x' % ord(ch) #/* Copy everything else as-is */ else: p += chr(ord(ch)) @@ -873,7 +854,7 @@ pos = res[1] return p -def PyUnicode_EncodeASCII(p,size,errors): +def PyUnicode_EncodeASCII(p, size, errors): return unicode_encode_ucs1(p, size, errors, 128) @@ -885,7 +866,7 @@ len(unicode), None) -def PyUnicode_DecodeUTF16Stateful(s,size,errors,byteorder='native',final=True): +def PyUnicode_DecodeUTF16Stateful(s, size, errors, byteorder='native', final=True): bo = 0 #/* assume native ordering by default */ consumed = 0 @@ -912,26 +893,26 @@ bom = (ord(s[ihi]) << 8) | ord(s[ilo]) #ifdef BYTEORDER_IS_LITTLE_ENDIAN if sys.byteorder == 'little': - if (bom == 0xFEFF): - q += 2 - bo = -1 - elif bom == 0xFFFE: - q += 2 - bo = 1 + if (bom == 0xFEFF): + q += 2 + bo = -1 + elif bom == 0xFFFE: + q += 2 + bo = 1 else: - if bom == 0xFEFF: - q += 2 - bo = 1 - elif bom == 0xFFFE: - q += 2 - bo = -1 + if bom == 0xFEFF: + q += 2 + bo = 1 + elif bom == 0xFFFE: + q += 2 + bo = -1 elif byteorder == 'little': bo = -1 else: bo = 1 if (size == 0): - return [u''],0,bo + return [u''], 0, bo if (bo == -1): #/* force LE */ @@ -952,7 +933,7 @@ errmsg = "truncated data" startinpos = q endinpos = len(s) - unicode_call_errorhandler(errors,'utf-16',errmsg,s,startinpos,endinpos,True) + unicode_call_errorhandler(errors, 'utf-16', errmsg, s, startinpos, endinpos, True) # /* The remaining input chars are ignored if the callback ## chooses to skip the input */ @@ -960,15 +941,15 @@ q += 2 if (ch < 0xD800 or ch > 0xDFFF): - p += unichr(ch) - continue + p += unichr(ch) + continue #/* UTF-16 code pair: */ if (q >= len(s)): errmsg = "unexpected end of data" startinpos = q-2 endinpos = len(s) - unicode_call_errorhandler(errors,'utf-16',errmsg,s,startinpos,endinpos,True) + unicode_call_errorhandler(errors, 'utf-16', errmsg, s, startinpos, endinpos, True) if (0xD800 <= ch and ch <= 0xDBFF): ch2 = (ord(s[q+ihi]) << 8) | ord(s[q+ilo]) @@ -987,12 +968,12 @@ errmsg = "illegal UTF-16 surrogate" startinpos = q-4 endinpos = startinpos+2 - unicode_call_errorhandler(errors,'utf-16',errmsg,s,startinpos,endinpos,True) + unicode_call_errorhandler(errors, 'utf-16', errmsg, s, startinpos, endinpos, True) errmsg = "illegal encoding" startinpos = q-2 endinpos = startinpos+2 - unicode_call_errorhandler(errors,'utf-16',errmsg,s,startinpos,endinpos,True) + unicode_call_errorhandler(errors, 'utf-16', errmsg, s, startinpos, endinpos, True) return p, q, bo @@ -1017,7 +998,7 @@ if (byteorder == 'native'): bom = sys.byteorder - p 
+= STORECHAR(0xFEFF,bom) + p += STORECHAR(0xFEFF, bom) if (size == 0): return "" @@ -1035,9 +1016,9 @@ ch2 = 0xDC00 | ((ch-0x10000) & 0x3FF) ch = 0xD800 | ((ch-0x10000) >> 10) - p += STORECHAR(ch,bom) + p += STORECHAR(ch, bom) if (ch2): - p +=STORECHAR(ch2,bom) + p += STORECHAR(ch2, bom) return p @@ -1047,24 +1028,9 @@ def PyUnicode_EncodeMBCS(p, size, errors): pass -#### /* If there are no characters, bail now! */ -## if (size==0) -## return "" -## from ctypes import * -## WideCharToMultiByte = windll.kernel32.WideCharToMultiByte -#### /* First get the size of the result */ -## mbcssize = WideCharToMultiByte(CP_ACP, 0, p, size, s, 0, None, None); -## if (mbcssize==0) -## raise UnicodeEncodeError, "Windows cannot decode the string %s" %p -### More error handling required (check windows errors and such) -## -### /* Do the conversion */ -#### s = ' '*mbcssize -#### if (0 == WideCharToMultiByte(CP_ACP, 0, p, size, s, mbcssize, NULL, NULL)): -#### raise UnicodeEncodeError, "Windows cannot decode the string %s" %p -## return s + def unicode_call_errorhandler(errors, encoding, - reason, input, startinpos, endinpos,decode=True): + reason, input, startinpos, endinpos, decode=True): errorHandler = lookup_error(errors) if decode: @@ -1072,18 +1038,17 @@ else: exceptionObject = UnicodeEncodeError(encoding, input, startinpos, endinpos, reason) res = errorHandler(exceptionObject) - if isinstance(res,tuple) and isinstance(res[0],unicode) and isinstance(res[1],int): + if isinstance(res, tuple) and isinstance(res[0], unicode) and isinstance(res[1], int): newpos = res[1] - if (newpos<0): - newpos = len(input)+newpos - if newpos<0 or newpos>len(input): + if (newpos < 0): + newpos = len(input) + newpos + if newpos < 0 or newpos > len(input): raise IndexError( "position %d from error handler out of bounds" % newpos) - return res[0],newpos + return res[0], newpos else: - raise TypeError("encoding error handler must return (unicode, int) tuple") + raise TypeError("encoding error handler must return (unicode, int) tuple, not %s" % repr(res)) def PyUnicode_DecodeUTF8(s, size, errors): - return PyUnicode_DecodeUTF8Stateful(s, size, errors, False) ## /* Map UTF-8 encoded prefix byte to sequence length. 
zero means @@ -1107,7 +1072,7 @@ 4, 4, 4, 4, 4, 4, 4, 4, 5, 5, 5, 5, 6, 6, 0, 0 ] -def PyUnicode_DecodeUTF8Stateful(s,size,errors,final): +def PyUnicode_DecodeUTF8Stateful(s, size, errors, final): consumed = 0 if (size == 0): @@ -1163,7 +1128,7 @@ pos = res[1] else: c = ((ord(s[pos]) & 0x1f) << 6) + (ord(s[pos+1]) & 0x3f) - if c<0x80: + if c < 0x80: errmsg = "illegal encoding" endinpos = startinpos+2 res = unicode_call_errorhandler( @@ -1240,7 +1205,7 @@ pos = res[1] else: #ifdef Py_UNICODE_WIDE - if c= 0) p = [] i = 0 - while i> 18)))) p += (chr((0x80 | ((ch >> 12) & 0x3f)))) p += (chr((0x80 | ((ch >> 6) & 0x3f)))) @@ -1330,7 +1295,7 @@ pos += 1 return p -def unicode_encode_ucs1(p,size,errors,limit): +def unicode_encode_ucs1(p, size, errors, limit): if limit == 256: reason = "ordinal not in range(256)" @@ -1342,7 +1307,7 @@ if (size == 0): return [''] res = [] - pos=0 + pos = 0 while pos < len(p): #for ch in p: ch = p[pos] @@ -1356,41 +1321,41 @@ collend = pos+1 while collend < len(p) and ord(p[collend]) >= limit: collend += 1 - x = unicode_call_errorhandler(errors,encoding,reason,p,collstart,collend,False) + x = unicode_call_errorhandler(errors, encoding, reason, p, collstart, collend, False) res += str(x[0]) pos = x[1] return res -def PyUnicode_EncodeLatin1(p,size,errors): - res=unicode_encode_ucs1(p, size, errors, 256) +def PyUnicode_EncodeLatin1(p, size, errors): + res = unicode_encode_ucs1(p, size, errors, 256) return res -hexdigits = [hex(i)[-1] for i in range(16)]+[hex(i)[-1].upper() for i in range(10,16)] +hexdigits = [hex(i)[-1] for i in range(16)]+[hex(i)[-1].upper() for i in range(10, 16)] -def hexescape(s,pos,digits,message,errors): +def hexescape(s, pos, digits, message, errors): chr = 0 p = [] if (pos+digits>len(s)): message = "end of string in escape sequence" - x = unicode_call_errorhandler(errors,"unicodeescape",message,s,pos-2,len(s)) + x = unicode_call_errorhandler(errors, "unicodeescape", message, s, pos-2, len(s)) p += x[0] pos = x[1] else: try: - chr = int(s[pos:pos+digits],16) + chr = int(s[pos:pos+digits], 16) except ValueError: endinpos = pos while s[endinpos] in hexdigits: - endinpos +=1 - x = unicode_call_errorhandler(errors,"unicodeescape",message,s,pos-2, + endinpos += 1 + x = unicode_call_errorhandler(errors, "unicodeescape", message, s, pos-2, endinpos+1) p += x[0] pos = x[1] #/* when we get here, chr is a 32-bit unicode character */ else: if chr <= sys.maxunicode: - p += [unichr(chr)] + p += unichr(chr) pos += digits elif (chr <= 0x10ffff): @@ -1400,12 +1365,12 @@ pos += digits else: message = "illegal Unicode character" - x = unicode_call_errorhandler(errors,"unicodeescape",message,s,pos-2, + x = unicode_call_errorhandler(errors, "unicodeescape", message, s, pos-2, pos+1) p += x[0] pos = x[1] res = p - return res,pos + return res, pos def PyUnicode_DecodeUnicodeEscape(s, size, errors): @@ -1422,45 +1387,41 @@ continue ## /* \ - Escapes */ else: - pos +=1 - if pos>=len(s): + pos += 1 + if pos >= len(s): errmessage = "\\ at end of string" - unicode_call_errorhandler(errors,"unicodeescape",errmessage,s,pos-1,size) + unicode_call_errorhandler(errors, "unicodeescape", errmessage, s, pos-1, size) ch = s[pos] pos += 1 ## /* \x escapes */ - #if ch == '\n': break; - if ch == '\\': p += u'\\' + if ch == '\\' : p += u'\\' elif ch == '\'': p += u'\'' elif ch == '\"': p += u'\"' - elif ch == 'b': p += u'\b' - elif ch == 'f': p += u'\014' #/* FF */ - elif ch == 't': p += u'\t' - elif ch == 'n': p += u'\n' - elif ch == 'r': - p += u'\r' - + elif ch == 'b' : p += u'\b' + 
elif ch == 'f' : p += u'\014' #/* FF */ + elif ch == 't' : p += u'\t' + elif ch == 'n' : p += u'\n' + elif ch == 'r' : p += u'\r' elif ch == 'v': p += u'\013' #break; /* VT */ elif ch == 'a': p += u'\007' # break; /* BEL, not classic C */ - - ## /* \OOO (octal) escapes */ - elif ch in [ '0','1', '2', '3','4', '5', '6','7']: - x = ord(ch) - ord('0') - ch = s[pos] - if ('0' <= ch and ch <= '7'): - x = (x<<3) + ord(ch) - ord('0') - ch = s[pos+1] - if ('0' <= ch and ch <= '7'): - x = (x<<3) + ord(ch) - ord('0') - pos += 2 - + elif ch in [ '0', '1', '2', '3', '4', '5', '6', '7']: + x = int(s[pos, pos+3], 8) + # x = ord(ch) - ord('0') + # ch = s[pos] + # if ('0' <= ch and ch <= '7'): + # x = (x<<3) + ord(ch) - ord('0') + # ch = s[pos+1] + # if ('0' <= ch and ch <= '7'): + # x = (x<<3) + ord(ch) - ord('0') + # pos += 2 + pos += 3 p += unichr(x) ## /* hex escapes */ ## /* \xXX */ elif ch == 'x': digits = 2 message = "truncated \\xXX escape" - x = hexescape(s,pos,digits,message,errors) + x = hexescape(s, pos, digits, message, errors) p += x[0] pos = x[1] @@ -1468,7 +1429,7 @@ elif ch == 'u': digits = 4 message = "truncated \\uXXXX escape" - x = hexescape(s,pos,digits,message,errors) + x = hexescape(s, pos, digits, message, errors) p += x[0] pos = x[1] @@ -1476,7 +1437,7 @@ elif ch == 'U': digits = 8 message = "truncated \\UXXXXXXXX escape" - x = hexescape(s,pos,digits,message,errors) + x = hexescape(s, pos, digits, message, errors) p += x[0] pos = x[1] ## /* \N{name} */ @@ -1488,7 +1449,7 @@ import unicodedata except ImportError: message = "\\N escapes not supported (can't load unicodedata module)" - unicode_call_errorhandler(errors,"unicodeescape",message,s,pos-1,size) + unicode_call_errorhandler(errors, "unicodeescape", message, s, pos-1, size) if look < size and s[look] == '{': #/* look for the closing brace */ while (look < size and s[look] != '}'): @@ -1500,21 +1461,21 @@ try: chr = unicodedata.lookup("%s" % st) except KeyError, e: - x=unicode_call_errorhandler(errors,"unicodeescape",message,s,pos-1,look+1) + x = unicode_call_errorhandler(errors, "unicodeescape", message, s, pos-1, look+1) else: - x = chr,look + 1 + x = chr, look + 1 p += x[0] pos = x[1] else: - x=unicode_call_errorhandler(errors,"unicodeescape",message,s,pos-1,look+1) + x = unicode_call_errorhandler(errors, "unicodeescape", message, s, pos-1, look+1) else: - x=unicode_call_errorhandler(errors,"unicodeescape",message,s,pos-1,look+1) + x = unicode_call_errorhandler(errors, "unicodeescape", message, s, pos-1, look+1) else: if (pos > size): message = "\\ at end of string" handler = lookup_error(errors) - x = handler(UnicodeDecodeError("unicodeescape",s,pos, - size,message)) + x = handler(UnicodeDecodeError("unicodeescape", s, pos, + size, message)) p += x[0] pos = x[1] else: @@ -1522,7 +1483,7 @@ p += s[pos] return p -def PyUnicode_EncodeRawUnicodeEscape(s,size): +def PyUnicode_EncodeRawUnicodeEscape(s, size): if (size == 0): return '' @@ -1533,12 +1494,12 @@ if (ord(ch) >= 0x10000): p += '\\' p += 'U' - p += '%08x'%(ord(ch)) + p += '%08x' % (ord(ch)) elif (ord(ch) >= 256) : # /* Map 16-bit characters to '\uxxxx' */ p += '\\' p += 'u' - p += '%04x'%(ord(ch)) + p += '%04x' % (ord(ch)) # /* Copy everything else as-is */ else: p += chr(ord(ch)) @@ -1546,22 +1507,22 @@ #p += '\0' return p -def charmapencode_output(c,mapping): +def charmapencode_output(c, mapping): rep = mapping[c] - if isinstance(rep,int) or isinstance(rep, long): - if rep<256: + if isinstance(rep, int) or isinstance(rep, long): + if rep < 256: return chr(rep) else: 
raise TypeError("character mapping must be in range(256)") - elif isinstance(rep,str): + elif isinstance(rep, str): return rep elif rep == None: raise KeyError("character maps to ") else: raise TypeError("character mapping must return integer, None or str") -def PyUnicode_EncodeCharmap(p,size,mapping='latin-1',errors='strict'): +def PyUnicode_EncodeCharmap(p, size, mapping='latin-1', errors='strict'): ## /* the following variable is used for caching string comparisons ## * -1=not initialized, 0=unknown, 1=strict, 2=replace, @@ -1577,27 +1538,17 @@ while (inpos",p,inpos,inpos+1,False) + x = unicode_call_errorhandler(errors, "charmap", + "character maps to ", p, inpos, inpos+1, False) try: - res += [charmapencode_output(ord(y),mapping) for y in x[0]] + res += [charmapencode_output(ord(y), mapping) for y in x[0]] except KeyError: - raise UnicodeEncodeError("charmap",p,inpos,inpos+1, + raise UnicodeEncodeError("charmap", p, inpos, inpos+1, "character maps to ") -## except TypeError,err: -## x = unicode_call_errorhandler(errors,"charmap", -## err,p,inpos,inpos+1,False) -## try: -## res += [charmapencode_output(ord(y),mapping) for y in x[0]] -## except KeyError: -## raise UnicodeEncodeError("charmap",p,inpos,inpos+1, -## "character maps to ") -## - #/* done with this character => adjust input position */ - inpos+=1 + inpos += 1 return res def PyUnicode_DecodeCharmap(s, size, mapping, errors): @@ -1616,30 +1567,25 @@ ch = s[inpos] try: x = mapping[ord(ch)] - if isinstance(x,int): - if x<65536: + if isinstance(x, int): + if x < 65536: p += unichr(x) else: raise TypeError("character mapping must be in range(65536)") - elif isinstance(x,unicode): + elif isinstance(x, unicode): p += x elif not x: raise KeyError else: raise TypeError except KeyError: - x = unicode_call_errorhandler(errors,"charmap", - "character maps to ",s,inpos,inpos+1) + x = unicode_call_errorhandler(errors, "charmap", + "character maps to ", s, inpos, inpos+1) p += x[0] -## except TypeError: -## x = unicode_call_errorhandler(errors,"charmap", -## "character mapping must return integer, None or unicode", -## s,inpos,inpos+1) -## p += x[0] - inpos +=1 + inpos += 1 return p -def PyUnicode_DecodeRawUnicodeEscape(s, size,errors): +def PyUnicode_DecodeRawUnicodeEscape(s, size, errors): if (size == 0): return u'' @@ -1653,7 +1599,6 @@ pos += 1 continue startinpos = pos - #pos += 1 ## /* \u-escapes are only interpreted iff the number of leading ## backslashes is odd */ bs = pos @@ -1678,11 +1623,9 @@ pos += 1 #/* \uXXXX with 4 hex digits, \Uxxxxxxxx with 8 */ - - i = 0 x = 0 try: - x = int(s[pos:pos+count],16) + x = int(s[pos:pos+count], 16) except ValueError: res = unicode_call_errorhandler( errors, "rawunicodeescape", "truncated \\uXXXX", @@ -1696,9 +1639,8 @@ res = unicode_call_errorhandler( errors, "rawunicodeescape", "\\Uxxxxxxxx out of range", s, size, pos, pos+1) - pos = i = res[1] + pos = res[1] p += res[0] - i += 1 else: p += unichr(x) pos += count @@ -1707,7 +1649,7 @@ res = unicode_call_errorhandler( errors, "rawunicodeescape", "\\Uxxxxxxxx out of range", s, size, pos, pos+1) - pos = i = res[1] + pos = res[1] p += res[0] #endif From hpk at codespeak.net Thu Sep 1 14:45:03 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 1 Sep 2005 14:45:03 +0200 (CEST) Subject: [pypy-svn] r17151 - pypy/extradoc/minute Message-ID: <20050901124503.4B20C27B50@code1.codespeak.net> Author: hpk Date: Thu Sep 1 14:45:01 2005 New Revision: 17151 Added: pypy/extradoc/minute/pypy-sync-09-01-2005.txt Log: minutes for today's pypy-sync 
meeting. next meeting is being moderated by Armin's. Added: pypy/extradoc/minute/pypy-sync-09-01-2005.txt ============================================================================== --- (empty file) +++ pypy/extradoc/minute/pypy-sync-09-01-2005.txt Thu Sep 1 14:45:01 2005 @@ -0,0 +1,253 @@ +============================================= +pypy-sync developer meeting 1st September +============================================= + +Time & location: 1pm (30 minutes) at #pypy-sync + +Attendees:: + + Samuele Pedroni, + Anders Lehmann, + Ludovic Aubrien, + Adrien Di Mascio, + Christian Tismer, + Armin Rigo, Carl Friedrich Bolz (later) + Holger Krekel (minutes/moderation) + +Regular Topics +==================== + +- activity reports (3 prepared lines of info). + All Attendees submitted activity reports (see `IRC-Log`_ + at the end and 'LAST/NEXT/BLOCKERS' entries in particular) + +- resolve conflicts/blockers + No conflicts were discovered. + +Topics of the week +=================== + +Recap of discussed development areas until October +------------------------------------------------------- + +Everyone agreed that we want to deal with cleanups +and simplification refactorings until the Paris +Sprint rather than optimizations. Optimizations should only +be tackled for "low hanging fruits" which is defined to have +a good ratio of LOCs against speedup. For example, +the recent specializations of function calls which lead +to a speed-up by 2-3 times were considered as having +a good ratio. Moreover we looked in more detail +at the following cleanup/refactoring issues: + +- bringing the compiler to good state. Ludovic and + Adrien with help from Samuele and Arre will work + on making the current 'astcompiler' the (translateable) + and compliant default compiler. + +- the translation driving (translate_pypy.py and friends) + + Anders Lehmann is going to work on this and Armin has + some refactored code and intends to help Anders along + with Samuele. Eric will keep an eye on having LLVM + properly integrated. + +- improved test/reporting support. Holger intends to + work on this. + + +Paris Sprint announcement +----------------------------------------------- + +Ludovic is going to prepare a Paris Sprint announcement +tomorrow with help from Holger. The meeting reconfirms +the following sprint topics: + + - threading and GC + - refactoring/translation features + - start on JIT/optimizations/stackless + +Preparation/Moderation Next pypy-sync meeting +-------------------------------------------------- + +Armin, Anders L. and Christian are ready to prepare +the next three pypy-sync meetings, respectively: + + 8th September: Armin + 15th September: Christian + 22rd September: Anders L. + +We'll see who is going to do the next pypy-sync +meetings after that. + +Closing +------------------ + +Holger closes the meeting in time at 13:26pm. + +.. _`IRC-log`: + +Here is the full IRC log:: + + **** BEGIN LOGGING AT Thu Aug 18 12:59:41 2005 + + Sep 01 12:53:14 --> You are now talking on #pypy-sync + Sep 01 12:56:18 --> pedronis (n=Samuele_ at c-398b70d5.022-54-67626719.cust.bredbandsbolaget.se) has joined #pypy-sync + Sep 01 12:56:24 --> stakkars (i=pyanan at i577B4CC6.versanet.de) has joined #pypy-sync + Sep 01 13:02:03 i guess we start (hope arigo drops over soon) + Sep 01 13:02:18 here is the agenda (as mailed yesterday): + Sep 01 13:02:21 - activity reports (3 prepared lines of info). 
+ Sep 01 13:02:21 - resolve conflicts/blockers + Sep 01 13:02:21 *Topics of the week* + Sep 01 13:02:21 - Recap of discussed development areas until October + Sep 01 13:02:21 - Paris Sprint announcement + Sep 01 13:02:21 - Preparation/Moderation next pypy-sync meeting + Sep 01 13:02:22 --> ludal (n=ludal at logilab.net2.nerim.net) has joined #pypy-sync + Sep 01 13:02:53 let's start with activity reports in this order: + Sep 01 13:03:02 adim,aleale,ericvrp,hpk,ludal,pedronis,stakkars + Sep 01 13:03:24 LAST: none + Sep 01 13:03:24 NEXT: astcompiler + Sep 01 13:03:24 BLOCKERS: none + Sep 01 13:03:28 This week: recover from Heidelberg, clean-up _codecs, planning of wp 9 and 10 + Sep 01 13:03:30 how about reverse,mine is in the clipboad + Sep 01 13:03:37 Next week: more cleanups, more planning, some compliancy issues + Sep 01 13:03:45 Blockers : - + Sep 01 13:03:52 LAST: heidelberg sprint, reporting, mailing + Sep 01 13:03:52 NEXT: steps towards better test reporting and general refactoring + Sep 01 13:03:52 BLOCKERS: None + Sep 01 13:04:03 last: slotified llvm backend, bugfixes, added gc atomic calls to genc + Sep 01 13:04:04 next: test experimental llvm escape analysis + Sep 01 13:04:06 blockers: - + Sep 01 13:04:30 stakkars: go ahead if you can't keep your clipboard content :) + Sep 01 13:04:30 LAST: sprint, small fixes, looked a bit at performance + Sep 01 13:04:32 NEXT: report planning with Armin, cleanups/help with compiler + Sep 01 13:04:33 BLOCKERS: how we want to distribute cleanups work/design + Sep 01 13:04:55 Last week: sprint then none + Sep 01 13:04:55 Next week: astcompiler + Sep 01 13:04:55 blockers: none + Sep 01 13:05:18 DONE: reimplemented list overallocation in low-level, removing it from listobject + Sep 01 13:05:25 NEXT: making dictionaries low-level, maybe looking into trusted refcounting. + Sep 01 13:05:31 BLOCK: maybe a missing interface between lowlevel types that prevents malicious use of fields + Sep 01 13:06:07 ok, pedronis's blocker is a topic of this meeting + Sep 01 13:06:19 and stakkars blocker should be discussed on #pypy i guess + Sep 01 13:06:40 yes + Sep 01 13:06:49 ok, first topic of the week: + Sep 01 13:06:54 Recap of discussed development areas until October + Sep 01 13:07:14 we said that we want to focus on cleanups/refactoring until we start with otpm,ization wp06-XX efforts in october + Sep 01 13:07:29 the three areas i wrote down at last sprint: + Sep 01 13:07:33 compiler + Sep 01 13:07:34 translation driving + Sep 01 13:07:42 testing stuff + Sep 01 13:07:51 is there anything that is missing or different opinions? + Sep 01 13:08:13 consolodating the use of _codecs + Sep 01 13:08:28 in which sense? + Sep 01 13:08:41 The same algoritm of escaped strings is at least 33 places + Sep 01 13:08:49 s/33/3 + Sep 01 13:09:18 there might be more + Sep 01 13:09:20 at least the string one needs to be at interp-level for bootstrapping reason + Sep 01 13:09:47 ok lets defer it #pypy + Sep 01 13:09:52 ok + Sep 01 13:09:53 --> arigo (n=arigo at pdpc/supporter/sustaining/arigo) has joined #pypy-sync + Sep 01 13:10:12 ludal,adim: you are aware that arre and samuele are willing to help with astcompiler and the compiler efforts? 
+ Sep 01 13:10:31 now yes :) + Sep 01 13:10:49 ludal: it also is in the technical board minutes from heidelberg + Sep 01 13:11:00 jsut so you know + Sep 01 13:11:14 it needs to become translatable, and probably we'll then have to repass the compliance tests + Sep 01 13:11:28 stakkars: so simplifying/unifying our dict implementations also goes in the camp of refactoring/cleanups i guess + Sep 01 13:11:45 quick question: how clear is what we are doing when? Do we want to work sequentially or do some + Sep 01 13:11:58 optimization in parallel? I'm asking since this has happened already. + Sep 01 13:12:15 i know, i am more in the camp of not tackling too much optimization + Sep 01 13:12:32 but rather get our code base into better shape + Sep 01 13:13:04 I think there are different opinions form what I see happending. We should be clear about this. + Sep 01 13:13:07 it is open who is going to work on improving translate_pypy and friends + Sep 01 13:13:22 the optimisation that was done had an enormous number of lines/efforts speed up ratio + Sep 01 13:13:37 unless there is some other such very low hanging fruit + Sep 01 13:13:50 I think cleanups take precedence + Sep 01 13:13:57 yes, i agree + Sep 01 13:14:12 othewise they risk to never to be done + Sep 01 13:14:12 I se, it was the "very much issue". Then I will trash my list overallocation. + Sep 01 13:14:28 everyone else agrees with "only very good ratios of LoC/speed ups" are ok? + Sep 01 13:14:32 maybe looking at the possible optimization would help coming up with a more flexible design + Sep 01 13:14:51 stakkars: ? + Sep 01 13:15:02 trashing done work doesn't seem a good idea + Sep 01 13:15:12 at least we should do considerations. I found some corner cases on list allocation, + Sep 01 13:15:23 I aggree to only addres low hanging fruits at this time + Sep 01 13:15:30 which need a bit of discussion, because optimization is hard wihtout knowledge of GC + Sep 01 13:16:01 is anyone here ready to tackle translate_pypy cleanups? + Sep 01 13:16:02 low-hanging fruit is ok. What do I do if the fruit wasn't that sleshy, after all? :-) + Sep 01 13:16:08 yes + Sep 01 13:16:36 hpk: yes + Sep 01 13:16:47 I started some time ago something about cleaning up Translator + Sep 01 13:17:00 but I'm not sure I'd like to finish it right now (it looks like work) + Sep 01 13:17:15 arigo: :-) + Sep 01 13:17:22 for me, outlining the reports' content and starting on them is higher priority + Sep 01 13:17:38 on cleaning up dictionaries, this gives both speed and nicer source.Postpone as well? + Sep 01 13:17:39 makes sense + Sep 01 13:17:55 right, ok. then aleale can give it a go and whoever wants to join. + Sep 01 13:18:12 I can help + Sep 01 13:18:17 great + Sep 01 13:18:20 i think we can close this topic now (there also is and should be discussion on pypy-dev regarding some of the topics) + Sep 01 13:18:26 but it has less priority than reports and compiler + Sep 01 13:18:27 I will keep checking if it's still llvm compatible + Sep 01 13:18:34 pedronis: sure + Sep 01 13:18:35 ok, I misunderstood the sudden massive speedup-checkins, it was just a big fruit, right? + Sep 01 13:18:51 yes, a factor of 2-3 speedup in pystones + Sep 01 13:19:03 and richards + Sep 01 13:19:06 (and I got just 5 percent) + Sep 01 13:19:11 pedronis: what do you mean by compiler ? 
+ Sep 01 13:19:27 arigo: we discussed this earlier, the translateable astcompiler + Sep 01 13:19:28 helping logilab making it translatable + Sep 01 13:19:30 translate it + Sep 01 13:19:34 hpk: ah, sorry of course + Sep 01 13:19:40 please next topic: Paris Sprint announcement + Sep 01 13:20:04 adim, ludal: do you intend to draft up an annoucnement in extradoc/sprintinfo/paris-annoucnement.txt? + Sep 01 13:20:20 yes, I'll do that tomorrow + Sep 01 13:20:35 ok, i can help you a bit. + Sep 01 13:20:44 do we still agree on the topics? + Sep 01 13:20:50 - threading and GC + Sep 01 13:20:50 - refactoring/translation features + Sep 01 13:20:50 - start discussing/heading for JIT/optimizations/stackless + Sep 01 13:20:53 thanks + Sep 01 13:21:26 yes + Sep 01 13:21:31 yes + Sep 01 13:21:40 seems good to me + Sep 01 13:21:54 yes + Sep 01 13:22:16 "finish up reports in a hurry" shouldn't probably be announced :-) + Sep 01 13:22:24 :) + Sep 01 13:22:37 one more thing i'd like to consider (but it needn't go into the announcement) is the issue of easing porting of C-modules for developers new to pypy + Sep 01 13:22:54 yes, also refactoring Translator is probably related what we want to do in phase2 + Sep 01 13:23:01 so maybe it can partly wait for paris + Sep 01 13:23:01 yes + Sep 01 13:23:28 ok, that's it for the paris sprint/annoucnement for now, i guess. + Sep 01 13:23:42 last topic: Preparation/Moderation Next pypy-sync meeting + Sep 01 13:24:11 --> cfbolz (n=carlson at hdlb-d9b945fb.pool.mediaWays.net) has joined #pypy-sync + Sep 01 13:24:14 who would like to do it the next time? (Can be multiple people so we can rotate a bit) + Sep 01 13:24:18 cfbolz: good morning + Sep 01 13:24:26 I could do it + Sep 01 13:24:28 sorry + Sep 01 13:24:35 i can do it. + Sep 01 13:24:42 I'd like to volonteer + Sep 01 13:25:03 ok, let's pick a random number. + Sep 01 13:25:08 42 + Sep 01 13:25:17 49. mine is higher. + Sep 01 13:25:29 great, then maybe just arigo, stakkars, aleale in that order? + Sep 01 13:25:40 let me just note that so far i had to come up with topics for the meetings myself + Sep 01 13:25:46 wise man spoke :-) + Sep 01 13:26:39 4 minutes left, is there some opne point? + Sep 01 13:26:42 ok, then I will do the next week one + Sep 01 13:27:00 and it would probably help the new moderators if at least sometimes topics are pushed to them + Sep 01 13:27:10 indeed :-) + Sep 01 13:27:28 but part of the game is identifying what would benefit from pypy-sync attention and decisions + Sep 01 13:27:37 :-) + Sep 01 13:27:52 ok, then let me close my last pypy-sync meeting for a few weeks :-) + Sep 01 13:28:06 * arigo claps + Sep 01 13:28:08 should I post my lines? + Sep 01 13:28:17 do it,not ask + Sep 01 13:28:19 LAST: worked on/finished summer of code project on GC + Sep 01 13:28:19 NEXT: PyPy unrelated stuff (another exam) + Sep 01 13:28:19 BLOCKER: some sort of strange behaviour under python 2.3 + Sep 01 13:28:33 cfbolz: congrats, btw! 
+ Sep 01 13:28:59 cfbolz: please tell about the behavior on #pypy + Sep 01 13:29:04 :-) + Sep 01 13:29:35 ok + Sep 01 13:29:39 <-- cfbolz (n=carlson at hdlb-d9b945fb.pool.mediaWays.net) has left #pypy-sync ("Verlassend") From ericvrp at codespeak.net Thu Sep 1 14:58:54 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Thu, 1 Sep 2005 14:58:54 +0200 (CEST) Subject: [pypy-svn] r17153 - pypy/extradoc/minute Message-ID: <20050901125854.133C327B50@code1.codespeak.net> Author: ericvrp Date: Thu Sep 1 14:58:53 2005 New Revision: 17153 Modified: pypy/extradoc/minute/pypy-sync-09-01-2005.txt Log: Eric was here Modified: pypy/extradoc/minute/pypy-sync-09-01-2005.txt ============================================================================== --- pypy/extradoc/minute/pypy-sync-09-01-2005.txt (original) +++ pypy/extradoc/minute/pypy-sync-09-01-2005.txt Thu Sep 1 14:58:53 2005 @@ -13,6 +13,7 @@ Christian Tismer, Armin Rigo, Carl Friedrich Bolz (later) Holger Krekel (minutes/moderation) + Eric van Riet Paap Regular Topics ==================== From pedronis at codespeak.net Thu Sep 1 15:39:01 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Thu, 1 Sep 2005 15:39:01 +0200 (CEST) Subject: [pypy-svn] r17155 - pypy/dist/pypy/doc Message-ID: <20050901133901.C8FD327B57@code1.codespeak.net> Author: pedronis Date: Thu Sep 1 15:39:00 2005 New Revision: 17155 Modified: pypy/dist/pypy/doc/garbage_collection.txt Log: fix typos Modified: pypy/dist/pypy/doc/garbage_collection.txt ============================================================================== --- pypy/dist/pypy/doc/garbage_collection.txt (original) +++ pypy/dist/pypy/doc/garbage_collection.txt Thu Sep 1 15:39:00 2005 @@ -60,7 +60,7 @@ ++++++++++++++++++ Pointer arithmetic between the addresses is done via overloaded operators: -Substraction of two addresses gives an offset (integer), addition/substraction +Subtraction of two addresses gives an offset (integer), addition/subtraction of an offset (integer) gives another address. Furthermore addresses can be compared to each other. @@ -84,7 +84,7 @@ --------------------------------------- Instances of the ``address`` class are annotated as ``SomeAddress``. The -RTyper produces senseful results for operations with addresses: All the +RTyper produces sensible results for operations with addresses: All the basic functions manipulating addresses like ``raw_malloc`` and so on are turned into operations in the flow graph. @@ -123,7 +123,7 @@ Getting Information about Object Layout --------------------------------------- -The following functions are avaiable to the GC to get information about +The following functions are available to the GC to get information about objects: ``is_varsize(typeid) --> bool``: @@ -161,9 +161,9 @@ -------------------------- The data structures of the GC can not be handled by the GC itself. Therefore -it is neccessary to have explicit management of memory. One possibility for -doing this is via ``raw_malloc, raw_free`` and addresses. Another possiblity -is the following: Classes can be declared as being explicitely mangaed by +it is necessary to have explicit management of memory. One possibility for +doing this is via ``raw_malloc, raw_free`` and addresses. Another possibility +is the following: Classes can be declared as being explicitly managed by attaching a attribute ``_alloc_flavor_ = "raw"`` to the class. 
Instance creation is done the regular way, to free an instance there is a @@ -211,10 +211,10 @@ -------------------------------------------------- The Garbage Collector stores the data it needs to attach to an object directly -in front of it. The program sees gets only pointers to the part of the object -that containt non-GC-data:: +in front of it. The program sees only pointers to the part of the object +that contains non-GC-specific data:: - +---<- object model sees only this + +---<- program sees only this | +---------+---------+----------------------------+ | gc info | type id | object data | From tismer at codespeak.net Thu Sep 1 23:00:37 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Thu, 1 Sep 2005 23:00:37 +0200 (CEST) Subject: [pypy-svn] r17157 - pypy/dist/pypy/interpreter Message-ID: <20050901210037.A348427B56@code1.codespeak.net> Author: tismer Date: Thu Sep 1 23:00:36 2005 New Revision: 17157 Modified: pypy/dist/pypy/interpreter/pyframe.py Log: fixed a pseudo-leak that ate quite some memory on every app-level exception when pypy was compiled. The point is that we don't break the cycle for frame.last_exception when we only have reference counting. Modified: pypy/dist/pypy/interpreter/pyframe.py ============================================================================== --- pypy/dist/pypy/interpreter/pyframe.py (original) +++ pypy/dist/pypy/interpreter/pyframe.py Thu Sep 1 23:00:36 2005 @@ -125,6 +125,8 @@ # leave that frame w_exitvalue = e.w_exitvalue executioncontext.return_trace(self, w_exitvalue) + # on exit, always release self.last_exception in case of no GC + self.last_exception = None return w_exitvalue ### line numbers ### From tismer at codespeak.net Thu Sep 1 23:06:15 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Thu, 1 Sep 2005 23:06:15 +0200 (CEST) Subject: [pypy-svn] r17158 - pypy/dist/pypy/translator/goal Message-ID: <20050901210615.E201827B62@code1.codespeak.net> Author: tismer Date: Thu Sep 1 23:06:14 2005 New Revision: 17158 Modified: pypy/dist/pypy/translator/goal/translate_pypy.py Log: using shutil.copy instead of move: reasoning: - the executable is small compared to the C source which we leave behind.
- having the duplicate binary in the usession saves some time, if you forget to manually rename your executable in the working directory Modified: pypy/dist/pypy/translator/goal/translate_pypy.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy.py Thu Sep 1 23:06:14 2005 @@ -665,7 +665,7 @@ import shutil exename = mkexename(c_entry_point) newexename = mkexename('./pypy-c') - shutil.move(exename, newexename) + shutil.copy(exename, newexename) c_entry_point = newexename update_usession_dir() if not options['-o']: From tismer at codespeak.net Thu Sep 1 23:12:04 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Thu, 1 Sep 2005 23:12:04 +0200 (CEST) Subject: [pypy-svn] r17159 - pypy/dist/pypy/objspace/std Message-ID: <20050901211204.AA31427B6A@code1.codespeak.net> Author: tismer Date: Thu Sep 1 23:12:03 2005 New Revision: 17159 Modified: pypy/dist/pypy/objspace/std/model.py Log: just a comment on why I think we do need assignment of space to all objects Modified: pypy/dist/pypy/objspace/std/model.py ============================================================================== --- pypy/dist/pypy/objspace/std/model.py (original) +++ pypy/dist/pypy/objspace/std/model.py Thu Sep 1 23:12:03 2005 @@ -120,7 +120,9 @@ "Parent base class for wrapped objects." def __init__(w_self, space): - w_self.space = space # XXX not sure this is ever used any more + w_self.space = space # XXX not sure this is ever used any more + # YYY I think we need it for calling hash() from an ll dicts impl. + # without explicitly passing the space. def __repr__(self): s = '%s(%s)' % ( From tismer at codespeak.net Thu Sep 1 23:17:14 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Thu, 1 Sep 2005 23:17:14 +0200 (CEST) Subject: [pypy-svn] r17160 - pypy/dist/pypy/rpython Message-ID: <20050901211714.5382727B6A@code1.codespeak.net> Author: tismer Date: Thu Sep 1 23:17:12 2005 New Revision: 17160 Modified: pypy/dist/pypy/rpython/rstr.py Log: Another "bug" that showed up after making the low-level list length explicit. We now need to pass all the lengths explicitly. I think we are lacking a more abstract interface here. Instead of referring to properties of other primitives directly, we should have a set of standard accessors which do the right thing, regardless of whether we change the layout or not. For instance, lltype.length always means the current length of an object, whether this is the length of an underlying structure or the length the object wants to pretend.
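A minimal sketch of what such an accessor could look like (the names and the toy class below are made up for illustration, they are not code from the repository)::

    class ToyList(object):
        # stand-in for a low-level list with overallocated storage
        def __init__(self, chars):
            self.length = len(chars)                  # logical length
            self.items = list(chars) + [None] * 4    # overallocated

    def ll_len(lst):
        # the one accessor every caller uses; if the layout changes,
        # only this helper has to know whether the answer comes from a
        # field or from the size of the underlying array
        return lst.length

    def ll_join_chars(lst):
        # go through the accessor instead of len(lst.items), which
        # would also count the unused, overallocated slots
        return ''.join(lst.items[:ll_len(lst)])

    assert ll_join_chars(ToyList("abc")) == "abc"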
Modified: pypy/dist/pypy/rpython/rstr.py ============================================================================== --- pypy/dist/pypy/rpython/rstr.py (original) +++ pypy/dist/pypy/rpython/rstr.py Thu Sep 1 23:17:12 2005 @@ -153,6 +153,9 @@ if not isinstance(r_lst, ListRepr): raise TyperError("string.join of non-list: %r" % r_lst) v_str, v_lst = hop.inputargs(string_repr, r_lst) + cname = inputconst(Void, "length") + v_length = hop.genop("getfield", [v_lst, cname], + resulttype=Signed) cname = inputconst(Void, "items") v_items = hop.genop("getfield", [v_lst, cname], resulttype=r_lst.lowleveltype.TO.items) @@ -163,13 +166,13 @@ llfn = ll_join_chars else: raise TyperError("''.join() of non-string list: %r" % r_lst) - return hop.gendirectcall(llfn, v_items) + return hop.gendirectcall(llfn, v_length, v_items) else: if r_lst.item_repr == string_repr: llfn = ll_join else: raise TyperError("sep.join() of non-string list: %r" % r_lst) - return hop.gendirectcall(llfn, v_str, v_items) + return hop.gendirectcall(llfn, v_str, v_length, v_items) def rtype_method_split(_, hop): v_str, v_chr = hop.inputargs(string_repr, char_repr) @@ -362,7 +365,7 @@ hop.genop('setarrayitem', [vtemp, i, vchunk]) hop.exception_cannot_occur() # to ignore the ZeroDivisionError of '%' - return hop.gendirectcall(ll_join_strs, vtemp) + return hop.gendirectcall(ll_join_strs, size, vtemp) class __extend__(pairtype(StringRepr, TupleRepr)): @@ -780,10 +783,10 @@ i += 1 return result -def ll_join(s, items): +def ll_join(s, length, items): s_chars = s.chars s_len = len(s_chars) - num_items = len(items) + num_items = length if num_items == 0: return emptystr itemslen = 0 @@ -820,8 +823,8 @@ i += 1 return result -def ll_join_strs(items): - num_items = len(items) +def ll_join_strs(length, items): + num_items = length itemslen = 0 i = 0 while i < num_items: @@ -842,8 +845,8 @@ i += 1 return result -def ll_join_chars(chars): - num_chars = len(chars) +def ll_join_chars(length, chars): + num_chars = length result = malloc(STR, num_chars) res_chars = result.chars i = 0 From arigo at codespeak.net Fri Sep 2 13:57:47 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 2 Sep 2005 13:57:47 +0200 (CEST) Subject: [pypy-svn] r17163 - pypy/dist/pypy/rpython Message-ID: <20050902115747.9716927B5D@code1.codespeak.net> Author: arigo Date: Fri Sep 2 13:57:46 2005 New Revision: 17163 Modified: pypy/dist/pypy/rpython/rlist.py Log: Fix for failing test_rlist.py. Modified: pypy/dist/pypy/rpython/rlist.py ============================================================================== --- pypy/dist/pypy/rpython/rlist.py (original) +++ pypy/dist/pypy/rpython/rlist.py Fri Sep 2 13:57:46 2005 @@ -175,6 +175,7 @@ return rstr.ll_strconcat( rstr.list_str_open_bracket, rstr.ll_strconcat(rstr.ll_join(rstr.list_str_sep, + length, temp), rstr.list_str_close_bracket)) ll_str = staticmethod(ll_str) From arigo at codespeak.net Fri Sep 2 14:41:58 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 2 Sep 2005 14:41:58 +0200 (CEST) Subject: [pypy-svn] r17164 - pypy/dist/pypy/translator/tool/test Message-ID: <20050902124158.E4B3627B5D@code1.codespeak.net> Author: arigo Date: Fri Sep 2 14:41:58 2005 New Revision: 17164 Modified: pypy/dist/pypy/translator/tool/test/test_cbuild.py Log: Some compilers give warnings if the last byte of the .c source isn't a newline. 
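The general guard against this class of warning, when emitting C source from a string, could look like the sketch below (the helper name is made up, nothing like it exists in the tree)::

    def ensure_trailing_newline(csource):
        # several C compilers warn (and the C standard formally requires)
        # that a non-empty source file ends with a newline character
        if not csource.endswith('\n'):
            csource += '\n'
        return csource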
Modified: pypy/dist/pypy/translator/tool/test/test_cbuild.py ============================================================================== --- pypy/dist/pypy/translator/tool/test/test_cbuild.py (original) +++ pypy/dist/pypy/translator/tool/test/test_cbuild.py Fri Sep 2 14:41:58 2005 @@ -13,7 +13,7 @@ printf("hello world\n"); return 0; } - """) +""") testexec = build_executable([t]) out = py.process.cmdexec(testexec) assert out.startswith('hello world') From arigo at codespeak.net Fri Sep 2 14:42:53 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 2 Sep 2005 14:42:53 +0200 (CEST) Subject: [pypy-svn] r17165 - in pypy/dist/pypy: objspace/flow translator Message-ID: <20050902124253.7C39A27B5D@code1.codespeak.net> Author: arigo Date: Fri Sep 2 14:42:52 2005 New Revision: 17165 Modified: pypy/dist/pypy/objspace/flow/model.py pypy/dist/pypy/translator/translator.py Log: When making the flow graph of a function, grab its source code lazily. (Workaround for possibly buggy inspect.getsource()) Modified: pypy/dist/pypy/objspace/flow/model.py ============================================================================== --- pypy/dist/pypy/objspace/flow/model.py (original) +++ pypy/dist/pypy/objspace/flow/model.py Fri Sep 2 14:42:52 2005 @@ -30,6 +30,16 @@ __metaclass__ = type +class roproperty(object): + def __init__(self, getter): + self.getter = getter + def __get__(self, obj, cls=None): + if obj is None: + return self + else: + return self.getter(obj) + + class FunctionGraph(object): def __init__(self, name, startblock, return_var=None): @@ -52,6 +62,11 @@ def getreturnvar(self): return self.returnblock.inputargs[0] + def getsource(self): + from pypy.tool.sourcetools import getsource + return getsource(self.func) + source = roproperty(getsource) + ## def hasonlyexceptionreturns(self): ## try: ## return self._onlyex Modified: pypy/dist/pypy/translator/translator.py ============================================================================== --- pypy/dist/pypy/translator/translator.py (original) +++ pypy/dist/pypy/translator/translator.py Fri Sep 2 14:42:52 2005 @@ -74,11 +74,6 @@ self.flowgraphs[func] = graph self.functions.append(func) graph.func = func - try: - import inspect - graph.source = inspect.getsource(func) - except: - pass # e.g. 
when func is defined interactively if called_by: self.callgraph[called_by, func, call_tag] = called_by, func return graph From ericvrp at codespeak.net Fri Sep 2 16:10:38 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 2 Sep 2005 16:10:38 +0200 (CEST) Subject: [pypy-svn] r17166 - pypy/dist/pypy/translator/llvm Message-ID: <20050902141038.61BE627B5D@code1.codespeak.net> Author: ericvrp Date: Fri Sep 2 16:10:37 2005 New Revision: 17166 Modified: pypy/dist/pypy/translator/llvm/build_llvm_module.py Log: use -heap2stack optimization pass when available Modified: pypy/dist/pypy/translator/llvm/build_llvm_module.py ============================================================================== --- pypy/dist/pypy/translator/llvm/build_llvm_module.py (original) +++ pypy/dist/pypy/translator/llvm/build_llvm_module.py Fri Sep 2 16:10:37 2005 @@ -31,11 +31,13 @@ "-simplifycfg", ])) -# XXX: TODO: refactoring: use gccas to populate this list -# suggested by: gccas /dev/null -o /dev/null -debug-pass=Arguments -OPTIMIZATION_SWITCHES = (" ".join([ - "-verify -lowersetjmp -funcresolve -raiseallocs -simplifycfg -mem2reg -globalopt -globaldce -ipconstprop -deadargelim -instcombine -simplifycfg -prune-eh -inline -simplify-libcalls -argpromotion -raise -tailduplicate -simplifycfg -scalarrepl -instcombine -break-crit-edges -condprop -tailcallelim -simplifycfg -reassociate -loopsimplify -licm -instcombine -indvars -loop-unroll -instcombine -load-vn -gcse -sccp -instcombine -break-crit-edges -condprop -dse -mergereturn -adce -simplifycfg -deadtypeelim -constmerge -verify" - ])) +flags = os.popen("gccas /dev/null -o /dev/null -debug-pass=Arguments 2>&1").read()[17:-1].split() + +if int(os.popen("opt --help 2>&1").read().find('-heap2stack')) >= 0: + flags.insert(flags.index("-inline")+1, "-heap2stack") + +OPTIMIZATION_SWITCHES = " ".join(flags) +#print OPTIMIZATION_SWITCHES def compile_module(module, source_files, object_files, library_files): open("%s_setup.py" % module, "w").write(str(py.code.Source( From ericvrp at codespeak.net Fri Sep 2 16:11:52 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 2 Sep 2005 16:11:52 +0200 (CEST) Subject: [pypy-svn] r17167 - in pypy/dist/pypy/translator/llvm: . 
demo module Message-ID: <20050902141152.567D527B62@code1.codespeak.net> Author: ericvrp Date: Fri Sep 2 16:11:49 2005 New Revision: 17167 Modified: pypy/dist/pypy/translator/llvm/codewriter.py pypy/dist/pypy/translator/llvm/demo/richards.py pypy/dist/pypy/translator/llvm/extfuncnode.py pypy/dist/pypy/translator/llvm/funcnode.py pypy/dist/pypy/translator/llvm/genllvm.py pypy/dist/pypy/translator/llvm/module/support.py pypy/dist/pypy/translator/llvm/opwriter.py pypy/dist/pypy/translator/llvm/structnode.py pypy/dist/pypy/translator/llvm/varsize.py Log: optimization to generate the .ll file quicker Modified: pypy/dist/pypy/translator/llvm/codewriter.py ============================================================================== --- pypy/dist/pypy/translator/llvm/codewriter.py (original) +++ pypy/dist/pypy/translator/llvm/codewriter.py Fri Sep 2 16:11:49 2005 @@ -4,9 +4,8 @@ log = log.codewriter -DEFAULT_TAIL = 'tail' #/tail +DEFAULT_TAIL = '' #/tail DEFAULT_CCONV = 'fastcc' #ccc/fastcc -DEFAULT_INTERNAL = 'internal' #/internal class CodeWriter(object): def __init__(self, f, word, uword, show_line_number=False): Modified: pypy/dist/pypy/translator/llvm/demo/richards.py ============================================================================== --- pypy/dist/pypy/translator/llvm/demo/richards.py (original) +++ pypy/dist/pypy/translator/llvm/demo/richards.py Fri Sep 2 16:11:49 2005 @@ -360,7 +360,7 @@ class Richards: - iterations = 2 + iterations = 25 def run(self): for i in xrange(self.iterations): Modified: pypy/dist/pypy/translator/llvm/extfuncnode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/extfuncnode.py (original) +++ pypy/dist/pypy/translator/llvm/extfuncnode.py Fri Sep 2 16:11:49 2005 @@ -26,8 +26,5 @@ def writedecl(self, codewriter): codewriter.declare(self.getdecl()) - #def writeimpl(self, codewriter): - # self.used_external_functions[self.ref] = True - def writeglobalconstants(self, codewriter): pass Modified: pypy/dist/pypy/translator/llvm/funcnode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/funcnode.py (original) +++ pypy/dist/pypy/translator/llvm/funcnode.py Fri Sep 2 16:11:49 2005 @@ -179,7 +179,7 @@ else: last_op_index = None for op_index, op in enumerate(block.operations): - if True: # print out debug string + if False: # print out debug string codewriter.newline() codewriter.comment("** %s **" % str(op)) info = self.db.get_op2comment(op) Modified: pypy/dist/pypy/translator/llvm/genllvm.py ============================================================================== --- pypy/dist/pypy/translator/llvm/genllvm.py (original) +++ pypy/dist/pypy/translator/llvm/genllvm.py Fri Sep 2 16:11:49 2005 @@ -1,5 +1,6 @@ from os.path import exists use_boehm_gc = exists('/usr/lib/libgc.so') or exists('/usr/lib/libgc.a') +use_boehm_gc = False import os import time @@ -16,11 +17,12 @@ from pypy.rpython import lltype from pypy.tool.udir import udir from pypy.translator.llvm.codewriter import CodeWriter, \ - DEFAULT_INTERNAL, DEFAULT_TAIL, DEFAULT_CCONV + DEFAULT_TAIL, DEFAULT_CCONV from pypy.translator.llvm import extfuncnode from pypy.translator.llvm.module.extfunction import extdeclarations, \ extfunctions, gc_boehm, gc_disabled, dependencies from pypy.translator.llvm.node import LLVMNode +from pypy.translator.llvm.structnode import StructNode from pypy.translator.translator import Translator @@ -61,13 +63,13 @@ line = line[:comment] line = 
line.rstrip() - # find function names, declare them internal with fastcc calling convertion + # find function names, declare them with the default calling convertion if line[-1:] == '{': returntype, s = line.split(' ', 1) funcname , s = s.split('(', 1) funcnames[funcname] = True if line.find("internal") == -1: - line = '%s %s %s' % ("", DEFAULT_CCONV, line,) + line = '%s %s' % (DEFAULT_CCONV, line,) ll_lines.append(line) # patch calls to function that we just declared fastcc @@ -96,7 +98,7 @@ class GenLLVM(object): - def __init__(self, translator, debug=False): + def __init__(self, translator, debug=True): # reset counters LLVMNode.nodename_count = {} @@ -118,14 +120,14 @@ STATS (46, "") STATS (52, "") STATS (189, "") - STATS (816, "") - STATS (1247, "") - STATS (1747, "") - STATS (5886, "") - STATS (24003, "") - STATS (25410, "") - STATS (26206, "") - STATS (268435, "") + STATS (819, "") + STATS (1250, "") + STATS (1753, "") + STATS (5896, "") + STATS (24013, "") + STATS (25411, "") + STATS (26210, "") + STATS (268884, "") """ nodecount = {} for node in self.db.getnodes(): @@ -375,10 +377,7 @@ if dep not in depdone: try: llvm_code = extfunctions[dep][1] - except KeyError: - msg = 'primitive function %s has no implementation' % dep - codewriter.comment('XXX: Error: ' + msg) - #raise Exception('primitive function %s has no implementation' %(dep,)) + except KeyError: #external function that is shared with genc continue for extfunc in llvm_code.split('\n'): codewriter.append(extfunc) Modified: pypy/dist/pypy/translator/llvm/module/support.py ============================================================================== --- pypy/dist/pypy/translator/llvm/module/support.py (original) +++ pypy/dist/pypy/translator/llvm/module/support.py Fri Sep 2 16:11:49 2005 @@ -1,9 +1,8 @@ extdeclarations = """ -declare ccc int %puts(sbyte*) declare ccc uint %strlen(sbyte*) -declare ccc sbyte* %memset(sbyte*, int, uint) -declare ccc sbyte* %strncpy(sbyte *, sbyte *, int) +declare ccc void %llvm.memset(sbyte*, ubyte, uint, uint) +declare ccc void %llvm.memcpy(sbyte*, sbyte*, uint, uint) """ @@ -35,7 +34,7 @@ %rpystrptr = getelementptr %RPyString* %rpy, int 0, uint 1, uint 1 %rpystr = cast [0 x sbyte]* %rpystrptr to sbyte* - call ccc sbyte* %strncpy(sbyte* %rpystr, sbyte* %s, int %len) + call ccc void %llvm.memcpy(sbyte* %rpystr, sbyte* %s, uint %lenu, uint 0) ret %RPyString* %rpy } @@ -89,11 +88,10 @@ """ % locals()) -#prepare and raise exceptions +#prepare and raise exceptions (%msg not used right now!) for exc in "IOError ZeroDivisionError OverflowError ValueError".split(): #_ZER _OVF _VAL extfunctions["%%raisePyExc_%(exc)s" % locals()] = ((), """ internal fastcc void %%raisePyExc_%(exc)s(sbyte* %%msg) { - ;XXX %%msg not used right now! 
%%exception_value = call fastcc %%RPYTHON_EXCEPTION* %%pypy_instantiate_%(exc)s() %%tmp = getelementptr %%RPYTHON_EXCEPTION* %%exception_value, int 0, uint 0 %%exception_type = load %%RPYTHON_EXCEPTION_VTABLE** %%tmp Modified: pypy/dist/pypy/translator/llvm/opwriter.py ============================================================================== --- pypy/dist/pypy/translator/llvm/opwriter.py (original) +++ pypy/dist/pypy/translator/llvm/opwriter.py Fri Sep 2 16:11:49 2005 @@ -351,7 +351,7 @@ assert issubclass(link.exitcase, Exception) etype = self.db.obj2node[link.llexitcase._obj] - current_exception_type = etype.get_ref() + current_exception_type = etype.get_ref() target = self.node.block_to_name[link.target] exc_found_label = block_label + '_exception_found_branchto_' + target last_exc_type_var, last_exc_value_var = None, None Modified: pypy/dist/pypy/translator/llvm/structnode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/structnode.py (original) +++ pypy/dist/pypy/translator/llvm/structnode.py Fri Sep 2 16:11:49 2005 @@ -78,37 +78,39 @@ self.constructor_decl, current, indices_to_array) - + class StructNode(ConstantLLVMNode): """ A struct constant. Can simply contain a primitive, a struct, pointer to struct/array """ - __slots__ = "db value structtype ref".split() + __slots__ = "db value structtype ref _get_ref_cache _get_types".split() def __init__(self, db, value): self.db = db self.value = value self.structtype = self.value._TYPE self.ref = self.make_ref('%structinstance', '') - + self._get_ref_cache = None + self._get_types = self._compute_types() + def __str__(self): return "" % (self.ref,) - def _gettypes(self): + def _compute_types(self): return [(name, self.structtype._flds[name]) for name in self.structtype._names_without_voids()] def _getvalues(self): values = [] - for name, T in self._gettypes(): + for name, T in self._get_types: value = getattr(self.value, name) values.append(self.db.repr_constant(value)[1]) return values def setup(self): - for name, T in self._gettypes(): + for name, T in self._get_types: assert T is not lltype.Void value = getattr(self.value, name) self.db.prepare_constant(T, value) @@ -137,11 +139,14 @@ def get_ref(self): """ Returns a reference as used for operations in blocks. 
""" + if self._get_ref_cache: + return self._get_ref_cache p, c = lltype.parentlink(self.value) if p is None: ref = self.ref else: ref = self.db.get_childref(p, c) + self._get_ref_cache = ref return ref def get_pbcref(self, toptr): @@ -172,14 +177,14 @@ def _getvalues(self): values = [] - for name, T in self._gettypes()[:-1]: + for name, T in self._get_types[:-1]: value = getattr(self.value, name) values.append(self.db.repr_constant(value)[1]) values.append(self._get_lastnoderepr()) return values def _get_lastnode_helper(self): - lastname, LASTT = self._gettypes()[-1] + lastname, LASTT = self._get_types[-1] assert isinstance(LASTT, lltype.Array) or ( isinstance(LASTT, lltype.Struct) and LASTT._arrayfld) value = getattr(self.value, lastname) @@ -195,19 +200,26 @@ super(StructVarsizeNode, self).setup() def get_typerepr(self): - # last type is a special case and need to be worked out recursively - types = self._gettypes()[:-1] - types_repr = [self.db.repr_type(T) for name, T in types] - types_repr.append(self._get_lastnode().get_typerepr()) - - return "{%s}" % ", ".join(types_repr) + try: + return self._get_typerepr_cache + except: + # last type is a special case and need to be worked out recursively + types = self._get_types[:-1] + types_repr = [self.db.repr_type(T) for name, T in types] + types_repr.append(self._get_lastnode().get_typerepr()) + result = "{%s}" % ", ".join(types_repr) + self._get_typerepr_cache = result + return result def get_ref(self): + if self._get_ref_cache: + return self._get_ref_cache ref = super(StructVarsizeNode, self).get_ref() typeval = self.db.repr_type(lltype.typeOf(self.value)) ref = "cast (%s* %s to %s*)" % (self.get_typerepr(), ref, typeval) + self._get_ref_cache = ref return ref def get_pbcref(self, toptr): Modified: pypy/dist/pypy/translator/llvm/varsize.py ============================================================================== --- pypy/dist/pypy/translator/llvm/varsize.py (original) +++ pypy/dist/pypy/translator/llvm/varsize.py Fri Sep 2 16:11:49 2005 @@ -26,11 +26,12 @@ codewriter.malloc("%ptr", "sbyte", "%usize", atomic=ARRAY._is_atomic()) codewriter.cast("%result", "sbyte*", "%ptr", ref + "*") - #if ARRAY is STR.chars: - # #XXX instead of memset we could probably just zero the hash and string terminator - # codewriter.call('%memset_result', 'sbyte*', '%memset', ['%ptr', '0', '%usize',], ['sbyte*', word, uword], cconv='ccc') - codewriter.call('%memset_result', 'sbyte*', '%memset', ['%ptr', '0', '%usize',], ['sbyte*', word, uword], cconv='ccc') - + if ARRAY is STR.chars: + #XXX instead of memset we could probably just zero the hash and string terminator + codewriter.call_void('%llvm.memset', ['%ptr', '0', '%usize', '0'], ['sbyte*', 'ubyte', 'uint', 'uint'], cconv='ccc') + else: + codewriter.call_void('%llvm.memset', ['%ptr', '0', '%usize', '0'], ['sbyte*', 'ubyte', 'uint', 'uint'], cconv='ccc') + indices_to_arraylength = tuple(indices_to_array) + (("uint", 0),) # the following accesses the length field of the array codewriter.getelementptr("%arraylength", ref + "*", From tismer at codespeak.net Fri Sep 2 16:36:53 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Fri, 2 Sep 2005 16:36:53 +0200 (CEST) Subject: [pypy-svn] r17168 - pypy/dist/pypy/interpreter Message-ID: <20050902143653.7E79027B5D@code1.codespeak.net> Author: tismer Date: Fri Sep 2 16:36:51 2005 New Revision: 17168 Modified: pypy/dist/pypy/interpreter/eval.py pypy/dist/pypy/interpreter/pyframe.py Log: About this pseudo-memory-leak ----------------------------- dropping 
frame.last_exception after leaving it. This solves the problem that we get memory leaks on every app-level exception if we have no GC at hand. This is the only obvious cycle that we have, and it is also not necessary and not conforming to the CPython implementation, which also clears exception state from frames when they are not executing. One might argue that I should have put this into pyframe.py instead of eval; feel free to do so. My reason was: the eval loop of pyframe is already very deeply nested, and I would have to add another finally clause, which not only made it less readable, but gave a measurable performance hit. Putting it into the resume method is cheap, because we have a finally clause there anyway. Logically it does not really belong there, so I could understand if somebody's aesthetics are disturbed and he moves it to frame.eval. Modified: pypy/dist/pypy/interpreter/eval.py ============================================================================== --- pypy/dist/pypy/interpreter/eval.py (original) +++ pypy/dist/pypy/interpreter/eval.py Fri Sep 2 16:36:51 2005 @@ -144,6 +144,10 @@ try: result = self.eval(executioncontext) finally: + # on exit, we always release self.last_exception. + # this belongs into pyframe's eval, but would cost an extra + # try..except clause there which we can save. + self.last_exception = None executioncontext.leave(self) return result Modified: pypy/dist/pypy/interpreter/pyframe.py ============================================================================== --- pypy/dist/pypy/interpreter/pyframe.py (original) +++ pypy/dist/pypy/interpreter/pyframe.py Fri Sep 2 16:36:51 2005 @@ -125,10 +125,8 @@ # leave that frame w_exitvalue = e.w_exitvalue executioncontext.return_trace(self, w_exitvalue) - # on exit, always release self.last_exception in case of no GC - self.last_exception = None return w_exitvalue - + ### line numbers ### # for f*_f_* unwrapping through unwrap_spec in typedef.py From cfbolz at codespeak.net Fri Sep 2 18:31:36 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Fri, 2 Sep 2005 18:31:36 +0200 (CEST) Subject: [pypy-svn] r17169 - in pypy/dist/pypy/translator: . test Message-ID: <20050902163136.E0A7527B56@code1.codespeak.net> Author: cfbolz Date: Fri Sep 2 18:31:35 2005 New Revision: 17169 Added: pypy/dist/pypy/translator/test/test_unsimplify.py Modified: pypy/dist/pypy/translator/backendoptimization.py pypy/dist/pypy/translator/test/test_backendoptimization.py pypy/dist/pypy/translator/unsimplify.py Log: implemented a simple version of inlining that does not work for functions guarded by a try...except. In addition I implemented a split_block that splits a block in two, doing the right thing for passing variables and such. This is used by inline_function: inline_function splits the block where the call occurs into two blocks and then copies the blocks of the function to be inlined. The link between the split blocks is replaced by these copies.
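To make the splitting step concrete, here is a self-contained toy version (simplified classes, not the real ones from pypy.objspace.flow.model, and without the variable renaming that the real split_block performs)::

    class Block(object):
        def __init__(self, inputargs):
            self.inputargs = list(inputargs)
            self.operations = []          # list of (result, opname, args)
            self.exits = []

    class Link(object):
        def __init__(self, args, target):
            self.args = list(args)
            self.target = target

    def split_block(block, index):
        # operations from 'index' on move to a fresh block; variables
        # defined before the split but still needed afterwards are
        # passed along the connecting link
        defined = set(block.inputargs)
        for result, _, _ in block.operations[:index]:
            defined.add(result)
        needed = set()
        for _, _, args in block.operations[index:]:
            needed.update(a for a in args if a in defined)
        for link in block.exits:
            needed.update(a for a in link.args if a in defined)
        passon = sorted(needed)
        newblock = Block(passon)
        newblock.operations = block.operations[index:]
        newblock.exits = block.exits
        block.operations = block.operations[:index]
        block.exits = [Link(passon, newblock)]
        return newblock

    # splitting after the first of three operations: 'x', 'y' and 'z'
    # travel over the new link, 'w' is only defined after the split point
    b = Block(['x', 'y'])
    b.operations = [('z', 'add', ['x', 'y']),
                    ('w', 'mul', ['x', 'y']),
                    ('r', 'add', ['z', 'w'])]
    new = split_block(b, 1)
    assert b.exits[0].args == ['x', 'y', 'z']
    assert [op[0] for op in new.operations] == ['w', 'r']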
Modified: pypy/dist/pypy/translator/backendoptimization.py ============================================================================== --- pypy/dist/pypy/translator/backendoptimization.py (original) +++ pypy/dist/pypy/translator/backendoptimization.py Fri Sep 2 18:31:35 2005 @@ -1,8 +1,9 @@ import autopath from pypy.translator.translator import Translator -from pypy.translator.simplify import eliminate_empty_blocks +from pypy.translator.simplify import eliminate_empty_blocks, join_blocks, remove_identical_vars +from pypy.translator.unsimplify import copyvar, split_block from pypy.objspace.flow.model import Variable, Constant, Block, Link -from pypy.objspace.flow.model import SpaceOperation +from pypy.objspace.flow.model import SpaceOperation, last_exception from pypy.objspace.flow.model import traverse, mkentrymap, checkgraph from pypy.tool.unionfind import UnionFind from pypy.rpython.lltype import Void @@ -147,6 +148,113 @@ "variables called %s have mixed concretetypes: %r" % (vname, vct)) +def inline_function(translator, inline_func, graph): + callsites = [] + def find_callsites(block): + if isinstance(block, Block): + for i, op in enumerate(block.operations): + if not (op.opname == "direct_call" and + isinstance(op.args[0], Constant)): + continue + if op.args[0].value._obj._callable is inline_func: + callsites.append((block, i)) + traverse(find_callsites, graph) + for block, index_operation in callsites: + _inline_function(translator, graph, block, index_operation) + checkgraph(graph) + +def _inline_function(translator, graph, block, index_operation): + if block.exitswitch == Constant(last_exception): + assert index_operation != len(block.operations) - 1, ( + "can't handle exceptions yet") + op = block.operations[index_operation] + graph_to_inline = translator.flowgraphs[op.args[0].value._obj._callable] + entrymap = mkentrymap(graph_to_inline) + beforeblock = block + afterblock = split_block(translator, graph, block, index_operation + 1) + assert beforeblock.operations[-1] is op + #vars that need to be passed through the blocks of the inlined function + #this excludes the var resulting of the direct_call + passon_vars = {beforeblock: [arg for arg in beforeblock.exits[0].args + if isinstance(arg, Variable) and + arg != op.result]} + copied_blocks = {} + varmap = {} + def get_new_name(var): + if var is None: + return None + if isinstance(var, Constant): + return var + if var not in varmap: + varmap[var] = copyvar(translator, var) + return varmap[var] + def get_new_passon_var_names(block): + result = [copyvar(translator, var) for var in passon_vars[beforeblock]] + passon_vars[block] = result + return result + def copy_operation(op): + args = [get_new_name(arg) for arg in op.args] + return SpaceOperation(op.opname, args, get_new_name(op.result)) + def copy_block(block): + if block in copied_blocks: + "already there" + return copied_blocks[block] + args = ([get_new_name(var) for var in block.inputargs] + + get_new_passon_var_names(block)) + newblock = Block(args) + copied_blocks[block] = newblock + newblock.operations = [copy_operation(op) for op in block.operations] + newblock.exits = [copy_link(link, block) for link in block.exits] + newblock.exitswitch = get_new_name(block.exitswitch) + newblock.exc_handler = block.exc_handler + return newblock + def copy_link(link, prevblock): + newargs = [get_new_name(a) for a in link.args] + passon_vars[prevblock] + newlink = Link(newargs, copy_block(link.target), link.exitcase) + newlink.prevblock = copy_block(link.prevblock) + 
newlink.last_exception = get_new_name(link.last_exception) + newlink.last_exc_value = get_new_name(link.last_exc_value) + if hasattr(link, 'llexitcase'): + newlink.llexitcase = link.llexitcase + return newlink + linktoinlined = beforeblock.exits[0] + assert linktoinlined.target is afterblock + copiedstartblock = copy_block(graph_to_inline.startblock) + copiedstartblock.isstartblock = False + copiedreturnblock = copied_blocks[graph_to_inline.returnblock] + passon_args = [] + i = 0 + for arg in linktoinlined.args: + if isinstance(arg, Constant): + passon_args.append(arg) + elif arg == op.result: + passon_args.append(copiedreturnblock.inputargs[0]) + else: + passon_args.append(passon_vars[graph_to_inline.returnblock][i]) + i += 1 + linktoinlined.target = copiedstartblock + linktoinlined.args = op.args[1:] + passon_vars[beforeblock] + afterblock.inputargs = afterblock.inputargs + beforeblock.operations = beforeblock.operations[:-1] + linkfrominlined = Link(passon_args, afterblock) + linkfrominlined.prevblock = copiedreturnblock + copiedreturnblock.exitswitch = None + copiedreturnblock.exits = [linkfrominlined] + assert copiedreturnblock.exits[0].target == afterblock + #let links to exceptblock of the graph to inline go to graphs exceptblock + if graph_to_inline.exceptblock in entrymap: + copiedexceptblock = copied_blocks[graph_to_inline.exceptblock] + for link in entrymap[graph_to_inline.exceptblock]: + copiedblock = copied_blocks[link.prevblock] + assert len(copiedblock.exits) == 1 + copiedblock.exits[0].args = copiedblock.exits[0].args[:2] + copiedblock.exits[0].target = graph.exceptblock + #cleaning up -- makes sense to be here, because I insert quite + #some empty blocks and blocks that can be joined + eliminate_empty_blocks(graph) + join_blocks(graph) + remove_identical_vars(graph) + def backend_optimizations(graph): remove_same_as(graph) eliminate_empty_blocks(graph) Modified: pypy/dist/pypy/translator/test/test_backendoptimization.py ============================================================================== --- pypy/dist/pypy/translator/test/test_backendoptimization.py (original) +++ pypy/dist/pypy/translator/test/test_backendoptimization.py Fri Sep 2 18:31:35 2005 @@ -1,9 +1,9 @@ -from pypy.translator.backendoptimization import remove_void +from pypy.translator.backendoptimization import remove_void, inline_function from pypy.translator.translator import Translator from pypy.rpython.lltype import Void from pypy.rpython.llinterp import LLInterpreter from pypy.objspace.flow.model import checkgraph -from pypy.translator.test.snippet import simple_method +from pypy.translator.test.snippet import simple_method, is_perfect_number from pypy.translator.llvm.log import log import py @@ -42,3 +42,67 @@ #interp = LLInterpreter(t.flowgraphs, t.rtyper) #assert interp.eval_function(f, [0]) == 1 +def test_inline_simple(): + def f(x, y): + return (g(x, y) + 1) * x + def g(x, y): + if x > 0: + return x * y + else: + return -x * y + t = Translator(f) + a = t.annotate([int, int]) + a.simplify() + t.specialize() + inline_function(t, g, t.flowgraphs[f]) + interp = LLInterpreter(t.flowgraphs, t.rtyper) + result = interp.eval_function(f, [-1, 5]) + assert result == f(-1, 5) + result = interp.eval_function(f, [2, 12]) + assert result == f(2, 12) + +def test_inline_big(): + def f(x): + result = [] + for i in range(1, x+1): + if is_perfect_number(i): + result.append(i) + return result + t = Translator(f) + a = t.annotate([int]) + a.simplify() + t.specialize() + inline_function(t, is_perfect_number, 
t.flowgraphs[f]) + interp = LLInterpreter(t.flowgraphs, t.rtyper) + result = interp.eval_function(f, [10]) + assert result.length == len(f(10)) + +def test_inline_raising(): + def f(x): + if x == 1: + raise ValueError + return x + def g(x): + a = f(x) + if x == 2: + raise KeyError + def h(x): + try: + g(x) + except ValueError: + return 1 + except KeyError: + return 2 + return x + t = Translator(h) + a = t.annotate([int]) + a.simplify() + t.specialize() + inline_function(t, f, t.flowgraphs[g]) + interp = LLInterpreter(t.flowgraphs, t.rtyper) + result = interp.eval_function(h, [0]) + assert result == 0 + result = interp.eval_function(h, [1]) + assert result == 1 + result = interp.eval_function(h, [2]) + assert result == 2 Added: pypy/dist/pypy/translator/test/test_unsimplify.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/test/test_unsimplify.py Fri Sep 2 18:31:35 2005 @@ -0,0 +1,67 @@ +from pypy.rpython.llinterp import LLInterpreter +from pypy.translator.translator import Translator +from pypy.translator.unsimplify import split_block + +def test_split_blocks_simple(): + for i in range(4): + def f(x, y): + z = x + y + w = x * y + return z + w + t = Translator(f) + a = t.annotate([int, int]) + t.specialize() + graph = t.flowgraphs[f] + split_block(t, graph, graph.startblock, i) + interp = LLInterpreter(t.flowgraphs, t.rtyper) + result = interp.eval_function(f, [1, 2]) + assert result == 5 + +def test_split_blocks_conditional(): + for i in range(3): + def f(x, y): + if x + 12: + return y + 1 + else: + return y + 2 + t = Translator(f) + a = t.annotate([int, int]) + t.specialize() + graph = t.flowgraphs[f] + split_block(t, graph, graph.startblock, i) + interp = LLInterpreter(t.flowgraphs, t.rtyper) + result = interp.eval_function(f, [-12, 2]) + assert result == 4 + result = interp.eval_function(f, [0, 2]) + assert result == 3 + +def test_split_block_exceptions(): + for i in range(2): + def raises(x): + if x == 1: + raise ValueError + elif x == 2: + raise KeyError + return x + def catches(x): + try: + y = x + 1 + raises(y) + except ValueError: + return 0 + except KeyError: + return 1 + return x + t = Translator(catches) + a = t.annotate([int]) + t.specialize() + graph = t.flowgraphs[catches] + split_block(t, graph, graph.startblock, i) + interp = LLInterpreter(t.flowgraphs, t.rtyper) + result = interp.eval_function(catches, [0]) + assert result == 0 + result = interp.eval_function(catches, [1]) + assert result == 1 + result = interp.eval_function(catches, [2]) + assert result == 2 + Modified: pypy/dist/pypy/translator/unsimplify.py ============================================================================== --- pypy/dist/pypy/translator/unsimplify.py (original) +++ pypy/dist/pypy/translator/unsimplify.py Fri Sep 2 18:31:35 2005 @@ -35,6 +35,56 @@ link.target = newblock return newblock +def split_block(translator, graph, block, index): + """split a block in two, inserting a proper link between the new blocks""" + assert 0 <= index <= len(block.operations) + if block.exitswitch == Constant(last_exception): + assert index < len(block.operations) + #varmap is the map between names in the new and the old block + #but only for variables that are produced in the old block and needed in + #the new one + varmap = {} + vars_produced_in_new_block = {} + def get_new_name(var): + if var is None: + return None + if isinstance(var, Constant): + return var + if var in vars_produced_in_new_block: + return var + if var not in 
varmap: + varmap[var] = copyvar(translator, var) + return varmap[var] + moved_operations = block.operations[index:] + for op in moved_operations: + for i, arg in enumerate(op.args): + op.args[i] = get_new_name(op.args[i]) + vars_produced_in_new_block[op.result] = True + for link in block.exits: + for i, arg in enumerate(link.args): + #last_exception and last_exc_value are considered to be created + #when the link is entered + if link.args[i] not in [link.last_exception, link.last_exc_value]: + link.args[i] = get_new_name(link.args[i]) + exitswitch = get_new_name(block.exitswitch) + #the new block gets all the attributes relevant to outgoing links + #from block the old block + newblock = Block(varmap.values()) + newblock.operations = moved_operations + newblock.exits = block.exits + newblock.exitswitch = exitswitch + newblock.exc_handler = block.exc_handler + for link in newblock.exits: + link.prevblock = newblock + link = Link(varmap.keys(), newblock) + link.prevblock = block + block.operations = block.operations[:index] + block.exits = [link] + block.exitswitch = None + block.exc_handler = False + checkgraph(graph) + return newblock + def remove_direct_loops(translator, graph): """This is useful for code generators: it ensures that no link has common input and output variables, which could occur if a block's exit From cfbolz at codespeak.net Fri Sep 2 18:58:15 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Fri, 2 Sep 2005 18:58:15 +0200 (CEST) Subject: [pypy-svn] r17170 - pypy/dist/pypy/translator/test Message-ID: <20050902165815.4959027B5D@code1.codespeak.net> Author: cfbolz Date: Fri Sep 2 18:58:14 2005 New Revision: 17170 Modified: pypy/dist/pypy/translator/test/test_backendoptimization.py Log: disabled test for a constellation where inlining does not work. Modified: pypy/dist/pypy/translator/test/test_backendoptimization.py ============================================================================== --- pypy/dist/pypy/translator/test/test_backendoptimization.py (original) +++ pypy/dist/pypy/translator/test/test_backendoptimization.py Fri Sep 2 18:58:14 2005 @@ -106,3 +106,22 @@ assert result == 1 result = interp.eval_function(h, [2]) assert result == 2 + +def DONOTtest_inline_exceptions(): + def f(x): + if x: + raise ValueError + def g(x): + try: + f(x) + except ValueError: + return 1 + return 1 + t = Translator(g) + a = t.annotate([int]) + a.simplify() + t.specialize() + t.view() + inline_function(t, f, t.flowgraphs[g]) + t.view() + From cfbolz at codespeak.net Fri Sep 2 22:23:04 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Fri, 2 Sep 2005 22:23:04 +0200 (CEST) Subject: [pypy-svn] r17171 - in pypy/dist/pypy/translator: . 
test Message-ID: <20050902202304.1BBD627B6A@code1.codespeak.net> Author: cfbolz Date: Fri Sep 2 22:23:03 2005 New Revision: 17171 Modified: pypy/dist/pypy/translator/backendoptimization.py pypy/dist/pypy/translator/test/test_backendoptimization.py Log: oops, inlining could fail if the function to be inlined is called several times Modified: pypy/dist/pypy/translator/backendoptimization.py ============================================================================== --- pypy/dist/pypy/translator/backendoptimization.py (original) +++ pypy/dist/pypy/translator/backendoptimization.py Fri Sep 2 22:23:03 2005 @@ -159,8 +159,11 @@ if op.args[0].value._obj._callable is inline_func: callsites.append((block, i)) traverse(find_callsites, graph) - for block, index_operation in callsites: + while callsites != []: + block, index_operation = callsites.pop() _inline_function(translator, graph, block, index_operation) + callsites = [] + traverse(find_callsites, graph) checkgraph(graph) def _inline_function(translator, graph, block, index_operation): Modified: pypy/dist/pypy/translator/test/test_backendoptimization.py ============================================================================== --- pypy/dist/pypy/translator/test/test_backendoptimization.py (original) +++ pypy/dist/pypy/translator/test/test_backendoptimization.py Fri Sep 2 22:23:03 2005 @@ -107,6 +107,21 @@ result = interp.eval_function(h, [2]) assert result == 2 +def test_inline_several_times(): + def f(x): + return (x + 1) * 2 + def g(x): + if x: + a = f(x) + f(x) + else: + a = f(x) + 1 + return a + f(x) + t = Translator(g) + a = t.annotate([int]) + a.simplify() + t.specialize() + inline_function(t, f, t.flowgraphs[g]) + def DONOTtest_inline_exceptions(): def f(x): if x: @@ -124,4 +139,3 @@ t.view() inline_function(t, f, t.flowgraphs[g]) t.view() - From tismer at codespeak.net Sat Sep 3 15:25:05 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Sat, 3 Sep 2005 15:25:05 +0200 (CEST) Subject: [pypy-svn] r17173 - pypy/dist/pypy/translator/c/src Message-ID: <20050903132505.D744227B56@code1.codespeak.net> Author: tismer Date: Sat Sep 3 15:25:03 2005 New Revision: 17173 Modified: pypy/dist/pypy/translator/c/src/ll_math.h (props changed) pypy/dist/pypy/translator/c/src/main.h (props changed) pypy/dist/pypy/translator/c/src/mem.h (props changed) pypy/dist/pypy/translator/c/src/standalone.h (props changed) Log: eol style From tismer at codespeak.net Sat Sep 3 16:44:26 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Sat, 3 Sep 2005 16:44:26 +0200 (CEST) Subject: [pypy-svn] r17174 - pypy/dist/pypy/translator/c/src Message-ID: <20050903144426.9E3B327B53@code1.codespeak.net> Author: tismer Date: Sat Sep 3 16:44:25 2005 New Revision: 17174 Modified: pypy/dist/pypy/translator/c/src/mem.h Log: added support for USING_NO_GC Modified: pypy/dist/pypy/translator/c/src/mem.h ============================================================================== --- pypy/dist/pypy/translator/c/src/mem.h (original) +++ pypy/dist/pypy/translator/c/src/mem.h Sat Sep 3 16:44:25 2005 @@ -66,3 +66,20 @@ #define PUSH_ALIVE(obj) #endif /* USING_BOEHM_GC */ + +/* for no GC */ +#ifdef USING_NO_GC + +#undef OP_ZERO_MALLOC + +#define OP_ZERO_MALLOC(size, r, err) { \ + r = (void*) malloc(size); \ + if (r == NULL) FAIL_EXCEPTION(err, PyExc_MemoryError, "out of memory");\ + memset((void*) r, 0, size); \ + COUNT_MALLOC; \ + } + +#undef PUSH_ALIVE +#define PUSH_ALIVE(obj) + +#endif /* USING_NO_GC */ From tismer at codespeak.net Sat Sep 3 16:45:20 
2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Sat, 3 Sep 2005 16:45:20 +0200 (CEST) Subject: [pypy-svn] r17175 - pypy/dist/pypy/translator/c Message-ID: <20050903144520.2C8BF27B53@code1.codespeak.net> Author: tismer Date: Sat Sep 3 16:45:19 2005 New Revision: 17175 Modified: pypy/dist/pypy/translator/c/node.py Log: added survival kit for empty structs Modified: pypy/dist/pypy/translator/c/node.py ============================================================================== --- pypy/dist/pypy/translator/c/node.py (original) +++ pypy/dist/pypy/translator/c/node.py Sat Sep 3 16:45:19 2005 @@ -100,6 +100,7 @@ def definition(self, phase): gcpolicy = self.db.gcpolicy + is_empty = True if phase == 1: yield 'struct %s {' % self.name # gcheader @@ -107,12 +108,17 @@ line = gcpolicy.struct_gcheader_definition(self) if line: yield '\t' + line + is_empty = False for name, typename in self.fields: line = '%s;' % cdecl(typename, name) if typename == PrimitiveType[Void]: line = '/* %s */' % line + else: + is_empty = False yield '\t' + line + if is_empty: + yield '\t' + 'int _dummy; /* this struct is empty */' yield '};' for line in gcpolicy.struct_after_definition(self): From tismer at codespeak.net Sat Sep 3 18:35:15 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Sat, 3 Sep 2005 18:35:15 +0200 (CEST) Subject: [pypy-svn] r17176 - pypy/dist/pypy/translator/c Message-ID: <20050903163515.65F9227B56@code1.codespeak.net> Author: tismer Date: Sat Sep 3 18:35:14 2005 New Revision: 17176 Modified: pypy/dist/pypy/translator/c/node.py Log: re: added survival kit for empty structs forgot to augment initialization as well. Modified: pypy/dist/pypy/translator/c/node.py ============================================================================== --- pypy/dist/pypy/translator/c/node.py (original) +++ pypy/dist/pypy/translator/c/node.py Sat Sep 3 18:35:14 2005 @@ -100,10 +100,10 @@ def definition(self, phase): gcpolicy = self.db.gcpolicy - is_empty = True if phase == 1: yield 'struct %s {' % self.name # gcheader + is_empty = True if needs_gcheader(self.STRUCT): line = gcpolicy.struct_gcheader_definition(self) if line: @@ -336,11 +336,13 @@ return len(array.items) def initializationexpr(self, decoration=''): + is_empty = True yield '{' if needs_gcheader(self.T): line = self.db.gcpolicy.struct_gcheader_initializationexpr(self) if line: yield '\t' + line + is_empty = False defnode = self.db.gettypedefnode(self.T) for name in self.T._names: value = getattr(self.obj, name) @@ -349,6 +351,10 @@ '%s.%s' % (self.name, c_name), decoration + name) yield '\t%s' % expr + if not expr.startswith('/*'): + is_empty = False + if is_empty: + yield '\t' + '0,' yield '}' assert not USESLOTS or '__dict__' not in dir(StructNode) From tismer at codespeak.net Sat Sep 3 20:20:58 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Sat, 3 Sep 2005 20:20:58 +0200 (CEST) Subject: [pypy-svn] r17182 - in pypy/dist/pypy/translator: c goal Message-ID: <20050903182058.7E4A227B56@code1.codespeak.net> Author: tismer Date: Sat Sep 3 20:20:56 2005 New Revision: 17182 Modified: pypy/dist/pypy/translator/c/gc.py pypy/dist/pypy/translator/goal/translate_pypy.py Log: experimental feature: -no-gc This feature is just for comparison how much memory management costs. I disabled any care about object deallocation. 
The result is quite promising: D:\pypy\dist\pypy\translator\goal>pypy-c-ref.exe -c "from test.pystone import main;main(500)" Pystone(1.1) time for 500 passes = 3.05089 This machine benchmarks at 163.887 pystones/second D:\pypy\dist\pypy\translator\goal>pypy-c.exe -c "from test.pystone import main;main(500)" Pystone(1.1) time for 500 passes = 2.08992 This machine benchmarks at 239.243 pystones/second The limit of 500 was chosen because the process then runs in about 200 MB of memory. New objects are allocated all the time. The result might be misleading, because no deallocation was ever called. But I believe the 50% extra speed comes from the fact that we lose continuous incref/decref operations from the inner loops. This happens for instance intensively in nextop() and nextarg() which are called very frequently. Since about 80-90 percent of reference operations can be saved by tracking trusted references, I hope to be able to gather a speedup of 30-40 percent by saving avoidable refcounting. It would be interesting to do a boehm comparison after this optimization is done, to see whether boehm is still faster!? Modified: pypy/dist/pypy/translator/c/gc.py ============================================================================== --- pypy/dist/pypy/translator/c/gc.py (original) +++ pypy/dist/pypy/translator/c/gc.py Sat Sep 3 20:20:56 2005 @@ -161,7 +161,7 @@ getRuntimeTypeInfo(structdefnode.STRUCT) getRuntimeTypeInfo(INNER) - def struct_setup(self, structdefnode, rtti): + def struct_setup(self, structdefnode, rtti): if structdefnode.gcheader: db = self.db gcinfo = structdefnode.gcinfo = RefcountingInfo() @@ -372,6 +372,7 @@ self.T = T self.obj = obj defnode = db.gettypedefnode(obj.about) + self.implementationtypename = self.typename self.name = self.db.namespace.uniquename('g_rtti_v_'+ defnode.barename) self.ptrname = '(&%s)' % (self.name,) @@ -380,3 +381,19 @@ def implementation(self): yield 'char %s /* uninitialized */;' % self.name + +# to get an idea how it looks like with no refcount/gc at all + +class NoneGcPolicy(BoehmGcPolicy): + + zero_malloc = RefcountingGcPolicy.zero_malloc.im_func + gc_libraries = RefcountingGcPolicy.gc_libraries.im_func + gc_startup_code = RefcountingGcPolicy.gc_startup_code.im_func + rtti_type = RefcountingGcPolicy.rtti_type.im_func + + def pre_pre_gc_code(self): + yield '#define USING_NO_GC' + + def struct_implementationcode(self, structdefnode): + return [] + array_implementationcode = struct_implementationcode Modified: pypy/dist/pypy/translator/goal/translate_pypy.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy.py Sat Sep 3 20:20:56 2005 @@ -22,6 +22,7 @@ -llvm Use LLVM instead of C -c Generate the C code, but don't compile it -boehm Use the Boehm collector when generating C code + -no-gc Experimental: use no GC and no refcounting at all -o Generate and compile the C code, but don't run it -tcc Equivalent to the envvar PYPY_CC='tcc -shared -o "%s.so" "%s.c"' -- http://fabrice.bellard.free.fr/tcc/ @@ -346,6 +347,7 @@ '-no-c': False, '-c': False, '-boehm': False, + '-no-gc': False, '-o': False, '-llvm': False, '-no-mark-some-objects': False, @@ -631,6 +633,9 @@ if options['-boehm']: from pypy.translator.c import gc gcpolicy = gc.BoehmGcPolicy + if options['-no-gc']: + from pypy.translator.c import gc + gcpolicy = gc.NoneGcPolicy if options['-llinterpret']: def interpret(): From tismer at codespeak.net Sun Sep 4
01:36:11 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Sun, 4 Sep 2005 01:36:11 +0200 (CEST) Subject: [pypy-svn] r17194 - pypy/dist/pypy/translator/c Message-ID: <20050903233611.4D18727B56@code1.codespeak.net> Author: tismer Date: Sun Sep 4 01:36:10 2005 New Revision: 17194 Modified: pypy/dist/pypy/translator/c/node.py Log: irrelevant cosmetics Modified: pypy/dist/pypy/translator/c/node.py ============================================================================== --- pypy/dist/pypy/translator/c/node.py (original) +++ pypy/dist/pypy/translator/c/node.py Sun Sep 4 01:36:10 2005 @@ -354,7 +354,7 @@ if not expr.startswith('/*'): is_empty = False if is_empty: - yield '\t' + '0,' + yield '\t%s' % '0,' yield '}' assert not USESLOTS or '__dict__' not in dir(StructNode) From tismer at codespeak.net Sun Sep 4 01:48:51 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Sun, 4 Sep 2005 01:48:51 +0200 (CEST) Subject: [pypy-svn] r17195 - pypy/dist/pypy/translator/c Message-ID: <20050903234851.8F2EE27B5F@code1.codespeak.net> Author: tismer Date: Sun Sep 4 01:48:50 2005 New Revision: 17195 Modified: pypy/dist/pypy/translator/c/gc.py Log: corrected an unintended change (zero_malloc does not yield but return), so boehm was broken for a while. Seems to be a crossing of changes; return had already been introduced in rev. 17041. Modified: pypy/dist/pypy/translator/c/gc.py ============================================================================== --- pypy/dist/pypy/translator/c/gc.py (original) +++ pypy/dist/pypy/translator/c/gc.py Sun Sep 4 01:48:50 2005 @@ -338,13 +338,13 @@ gcinfo = self.db.gettypedefnode(TYPE).gcinfo atomic = ['','_ATOMIC'][TYPE._is_atomic()] if gcinfo and gcinfo.finalizer: - yield 'OP_BOEHM_ZERO_MALLOC_FINALIZER(%s, %s, %s, %s, %s);' % (esize, + return 'OP_BOEHM_ZERO_MALLOC_FINALIZER(%s, %s, %s, %s, %s);' % (esize, eresult, atomic, gcinfo.finalizer, err) else: - yield 'OP_BOEHM_ZERO_MALLOC(%s, %s, %s, %s);' % (esize, + return 'OP_BOEHM_ZERO_MALLOC(%s, %s, %s, %s);' % (esize, eresult, atomic, err) From arigo at codespeak.net Sun Sep 4 15:50:54 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 4 Sep 2005 15:50:54 +0200 (CEST) Subject: [pypy-svn] r17205 - pypy/dist/pypy/module/_sre Message-ID: <20050904135054.7C21127B4D@code1.codespeak.net> Author: arigo Date: Sun Sep 4 15:50:53 2005 New Revision: 17205 Modified: pypy/dist/pypy/module/_sre/app_sre.py pypy/dist/pypy/module/_sre/interp_sre.py Log: Two details in _sre: * implement SRE_Pattern.sub() by calling SRE_Pattern.subn() directly instead of having them both call a common helper * use the unwrap_spec to get information from the interp-level entry point functions, and use space.interpclass_w() to unwrap the W_State object.
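The first of the two changes is a plain delegation refactoring: subn() keeps the real substitution loop and sub() is expressed in terms of it, instead of both going through a private _subx() helper. A simplified, self-contained sketch of the resulting shape (the class name and the placeholder body are illustrative only; the actual code is in the app_sre.py diff below):

    class PatternSketch(object):
        # subn() does the actual work and returns the pair
        # (new_string, number_of_substitutions).
        def subn(self, repl, string, count=0):
            new_string, n = string, 0   # placeholder for the real scan loop
            return new_string, n

        # sub() no longer duplicates any logic: it delegates to subn()
        # and simply drops the substitution count.
        def sub(self, repl, string, count=0):
            item, n = self.subn(repl, string, count)
            return item
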
Modified: pypy/dist/pypy/module/_sre/app_sre.py ============================================================================== --- pypy/dist/pypy/module/_sre/app_sre.py (original) +++ pypy/dist/pypy/module/_sre/app_sre.py Sun Sep 4 15:50:53 2005 @@ -69,12 +69,15 @@ state.start = state.string_position return matchlist - def _subx(self, template, string, count=0, subn=False): - filter = template - if not callable(template) and "\\" in template: + def subn(self, repl, string, count=0): + """Return the tuple (new_string, number_of_subs_made) found by replacing + the leftmost non-overlapping occurrences of pattern with the replacement + repl.""" + filter = repl + if not callable(repl) and "\\" in repl: # handle non-literal strings ; hand it over to the template compiler import sre - filter = sre._subx(self, template) + filter = sre._subx(self, repl) state = _sre._State(string, 0, sys.maxint, self.flags) sublist = [] @@ -103,21 +106,13 @@ if last_pos < state.end: sublist.append(string[last_pos:state.end]) item = "".join(sublist) - if subn: - return item, n - else: - return item + return item, n def sub(self, repl, string, count=0): """Return the string obtained by replacing the leftmost non-overlapping occurrences of pattern in string by the replacement repl.""" - return self._subx(repl, string, count, False) - - def subn(self, repl, string, count=0): - """Return the tuple (new_string, number_of_subs_made) found by replacing - the leftmost non-overlapping occurrences of pattern with the replacement - repl.""" - return self._subx(repl, string, count, True) + item, n = self.subn(repl, string, count) + return item def split(self, string, maxsplit=0): """Split string by the occurrences of pattern.""" Modified: pypy/dist/pypy/module/_sre/interp_sre.py ============================================================================== --- pypy/dist/pypy/module/_sre/interp_sre.py (original) +++ pypy/dist/pypy/module/_sre/interp_sre.py Sun Sep 4 15:50:53 2005 @@ -1,7 +1,8 @@ from pypy.interpreter.baseobjspace import Wrappable from pypy.interpreter.typedef import GetSetProperty, TypeDef from pypy.interpreter.typedef import interp_attrproperty, interp_attrproperty_w -from pypy.interpreter.gateway import interp2app +from pypy.interpreter.gateway import interp2app, ObjSpace, W_Root +from pypy.interpreter.error import OperationError from pypy.rpython.rarithmetic import intmask import sys @@ -34,8 +35,9 @@ OPCODE_LITERAL = 19 MAXREPEAT = 65535 -def w_getlower(space, w_char_ord, w_flags): - return space.wrap(getlower(space, space.int_w(w_char_ord), space.int_w(w_flags))) +def w_getlower(space, char_ord, flags): + return space.wrap(getlower(space, char_ord, flags)) +w_getlower.unwrap_spec = [ObjSpace, int, int] def getlower(space, char_ord, flags): if (char_ord < 128) or (flags & SRE_FLAG_UNICODE) \ @@ -51,17 +53,16 @@ #### Core classes -def make_state(space, w_string, w_start, w_end, w_flags): +def make_state(space, w_string, start, end, flags): # XXX maybe turn this into a __new__ method of W_State - return space.wrap(W_State(space, w_string, w_start, w_end, w_flags)) + return space.wrap(W_State(space, w_string, start, end, flags)) +make_state.unwrap_spec = [ObjSpace, W_Root, int, int, int] class W_State(Wrappable): - def __init__(self, space, w_string, w_start, w_end, w_flags): + def __init__(self, space, w_string, start, end, flags): self.space = space self.w_string = w_string - start = space.int_w(w_start) - end = space.int_w(w_end) if start < 0: start = 0 if end > space.int_w(space.len(w_string)): @@ -70,7 
+71,7 @@ self.string_position = start self.end = end self.pos = start - self.flags = space.int_w(w_flags) + self.flags = flags self.w_reset() def w_reset(self): @@ -250,10 +251,13 @@ #### Main opcode dispatch loop def w_search(space, w_state, w_pattern_codes): - assert isinstance(w_state, W_State) + state = space.interpclass_w(w_state) + if not isinstance(state, W_State): + raise OperationError(space.w_TypeError, + space.wrap("State object expected")) pattern_codes = [intmask(space.uint_w(code)) for code in space.unpackiterable(w_pattern_codes)] - return space.newbool(search(space, w_state, pattern_codes)) + return space.newbool(search(space, state, pattern_codes)) def search(space, state, pattern_codes): flags = 0 @@ -323,10 +327,13 @@ return False def w_match(space, w_state, w_pattern_codes): - assert isinstance(w_state, W_State) + state = space.interpclass_w(w_state) + if not isinstance(state, W_State): + raise OperationError(space.w_TypeError, + space.wrap("State object expected")) pattern_codes = [intmask(space.uint_w(code)) for code in space.unpackiterable(w_pattern_codes)] - return space.newbool(match(space, w_state, pattern_codes)) + return space.newbool(match(space, state, pattern_codes)) def match(space, state, pattern_codes): # Optimization: Check string length. pattern_codes[3] contains the From cfbolz at codespeak.net Sun Sep 4 21:43:40 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Sun, 4 Sep 2005 21:43:40 +0200 (CEST) Subject: [pypy-svn] r17211 - in pypy/dist/pypy/translator: . test Message-ID: <20050904194340.21E5427B46@code1.codespeak.net> Author: cfbolz Date: Sun Sep 4 21:43:36 2005 New Revision: 17211 Modified: pypy/dist/pypy/translator/backendoptimization.py pypy/dist/pypy/translator/test/test_backendoptimization.py Log: added tests with for loop (I want to get there :-). 
reworked _inline_function a bit (not really less messy) to make exceptions easier to handle Modified: pypy/dist/pypy/translator/backendoptimization.py ============================================================================== --- pypy/dist/pypy/translator/backendoptimization.py (original) +++ pypy/dist/pypy/translator/backendoptimization.py Sun Sep 4 21:43:36 2005 @@ -147,6 +147,15 @@ assert vct == vct[:1] * len(vct), ( "variables called %s have mixed concretetypes: %r" % (vname, vct)) +def collect_called_functions(graph): + funcs = {} + def visit(obj): + if not isinstance(obj, Block): + return + for op in obj.operations: + if op.opname == "direct_call": + funcs[op.args[0]] = True + return funcs def inline_function(translator, inline_func, graph): callsites = [] @@ -165,7 +174,7 @@ callsites = [] traverse(find_callsites, graph) checkgraph(graph) - + def _inline_function(translator, graph, block, index_operation): if block.exitswitch == Constant(last_exception): assert index_operation != len(block.operations) - 1, ( @@ -174,13 +183,11 @@ graph_to_inline = translator.flowgraphs[op.args[0].value._obj._callable] entrymap = mkentrymap(graph_to_inline) beforeblock = block - afterblock = split_block(translator, graph, block, index_operation + 1) - assert beforeblock.operations[-1] is op + afterblock = split_block(translator, graph, block, index_operation) + assert afterblock.operations[0] is op #vars that need to be passed through the blocks of the inlined function - #this excludes the var resulting of the direct_call passon_vars = {beforeblock: [arg for arg in beforeblock.exits[0].args - if isinstance(arg, Variable) and - arg != op.result]} + if isinstance(arg, Variable)]} copied_blocks = {} varmap = {} def get_new_name(var): @@ -225,21 +232,21 @@ copiedstartblock = copy_block(graph_to_inline.startblock) copiedstartblock.isstartblock = False copiedreturnblock = copied_blocks[graph_to_inline.returnblock] + #find args passed to startblock of inlined function passon_args = [] - i = 0 - for arg in linktoinlined.args: + for arg in op.args[1:]: if isinstance(arg, Constant): passon_args.append(arg) - elif arg == op.result: - passon_args.append(copiedreturnblock.inputargs[0]) else: - passon_args.append(passon_vars[graph_to_inline.returnblock][i]) - i += 1 + index = afterblock.inputargs.index(arg) + passon_args.append(linktoinlined.args[index]) + passon_args += passon_vars[beforeblock] + #rewire blocks linktoinlined.target = copiedstartblock - linktoinlined.args = op.args[1:] + passon_vars[beforeblock] - afterblock.inputargs = afterblock.inputargs - beforeblock.operations = beforeblock.operations[:-1] - linkfrominlined = Link(passon_args, afterblock) + linktoinlined.args = passon_args + afterblock.inputargs = [op.result] + afterblock.inputargs + afterblock.operations = afterblock.operations[1:] + linkfrominlined = Link([copiedreturnblock.inputargs[0]] + passon_vars[graph_to_inline.returnblock], afterblock) linkfrominlined.prevblock = copiedreturnblock copiedreturnblock.exitswitch = None copiedreturnblock.exits = [linkfrominlined] Modified: pypy/dist/pypy/translator/test/test_backendoptimization.py ============================================================================== --- pypy/dist/pypy/translator/test/test_backendoptimization.py (original) +++ pypy/dist/pypy/translator/test/test_backendoptimization.py Sun Sep 4 21:43:36 2005 @@ -139,3 +139,16 @@ t.view() inline_function(t, f, t.flowgraphs[g]) t.view() + +def DONOTtest_for_loop(): + def f(x): + result = 0 + for i in range(0, x): + result += i 
+ return result + t = Translator(f) + a = t.annotate([int]) + a.simplify() + t.specialize() + t.view() + From tismer at codespeak.net Sun Sep 4 22:11:34 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Sun, 4 Sep 2005 22:11:34 +0200 (CEST) Subject: [pypy-svn] r17212 - pypy/dist/pypy/translator/c Message-ID: <20050904201134.70B9427B47@code1.codespeak.net> Author: tismer Date: Sun Sep 4 22:11:33 2005 New Revision: 17212 Modified: pypy/dist/pypy/translator/c/genc.py Log: hackish patch to make the boehm includes be found under windows. No idea how to make distutils happy another way, yet. I will undo this when I know. Now the assumption is that gc6.5 is living in translator/c Modified: pypy/dist/pypy/translator/c/genc.py ============================================================================== --- pypy/dist/pypy/translator/c/genc.py (original) +++ pypy/dist/pypy/translator/c/genc.py Sun Sep 4 22:11:33 2005 @@ -104,7 +104,8 @@ python_inc = sysconfig.get_python_inc() self.executable_name = build_executable([self.c_source_filename], include_dirs = [autopath.this_dir, - python_inc], + python_inc, + autopath.this_dir+'/gc6.5/include',], libraries=self.libraries) self._compiled = True return self.executable_name From tismer at codespeak.net Sun Sep 4 22:15:30 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Sun, 4 Sep 2005 22:15:30 +0200 (CEST) Subject: [pypy-svn] r17213 - pypy/dist/pypy/rpython Message-ID: <20050904201530.AD09A27B47@code1.codespeak.net> Author: tismer Date: Sun Sep 4 22:15:29 2005 New Revision: 17213 Modified: pypy/dist/pypy/rpython/rlist.py Log: overallocation of low-level lists done quite right I guess. There is still the possibility to gain a few cycles if we introduce a low-level move function that shifts references around without counting; this nevertheless doesn't apply to the boehm case. On Windows, using Boehm, this gets us a pystone acceleration from 576 to 671 pystones, most probably because we are using push/pop on the frame stack, so this will probably vanish again when we do stacks the right way. Going to remove over-allocation from listobject.py, doubling this makes no sense at all.
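The over-allocation scheme referred to above is the one implemented by _ll_list_resize in the rlist.py patch below. As a rough standalone sketch (the helper name and the little demo loop are invented for illustration and are not part of the patch), the growth rule and the allocation pattern it produces look like this:

    # Same growth formula as in _ll_list_resize (and CPython's listobject.c):
    # mild over-allocation, proportional to the requested size.
    def overallocated(newsize):
        if newsize == 0:
            return 0
        if newsize < 9:
            some = 3
        else:
            some = 6
        return (newsize >> 3) + some + newsize

    # Appending one element at a time only triggers a resize when the new
    # size exceeds the current allocation, which yields the documented
    # pattern 0, 4, 8, 16, 25, 35, 46, 58, 72, 88, ...
    allocated = 0
    growth = []
    for newsize in range(1, 89):
        if newsize > allocated:
            allocated = overallocated(newsize)
            growth.append(allocated)
    print(growth)   # [4, 8, 16, 25, 35, 46, 58, 72, 88]
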
If the newsize falls lower than half + # the allocated size, then proceed with the realloc() to shrink the list. + if allocated >= newsize and newsize >= (allocated >> 1): + # assert l.ob_item != NULL or newsize == 0 + l.length = newsize + return + + # This over-allocates proportional to the list size, making room + # for additional growth. The over-allocation is mild, but is + # enough to give linear-time amortized behavior over a long + # sequence of appends() in the presence of a poorly-performing + # system realloc(). + # The growth pattern is: 0, 4, 8, 16, 25, 35, 46, 58, 72, 88, ... + ## (newsize < 9 ? 3 : 6) + if newsize < 9: + some = 3 + else: + some = 6 + new_allocated = (newsize >> 3) + some + newsize + if newsize == 0: + new_allocated = 0 + # XXX consider to have a real realloc + items = l.items + newitems = malloc(typeOf(l).TO.items.TO, new_allocated) + if allocated < new_allocated: + p = allocated - 1 + else: + p = new_allocated - 1 + while p >= 0: + newitems[p] = items[p] + ITEM = typeOf(l).TO.items.TO.OF + if isinstance(ITEM, Ptr): + items[p] = nullptr(ITEM.TO) + p -= 1 + l.length = newsize + l.items = newitems + def ll_copy(l): items = l.items length = l.length @@ -327,41 +384,29 @@ def ll_append(l, newitem): length = l.length - newitems = malloc(typeOf(l).TO.items.TO, length+1) - i = 0 - while i < length: - newitems[i] = l.items[i] - i += 1 - newitems[length] = newitem - l.length = length + 1 - l.items = newitems + _ll_list_resize(l, length+1) + l.items[length] = newitem # this one is for the special case of insert(0, x) def ll_prepend(l, newitem): length = l.length - newitems = malloc(typeOf(l).TO.items.TO, length+1) - i = 0 - while i < length: - newitems[i+1] = l.items[i] - i += 1 - newitems[0] = newitem - l.length = length + 1 - l.items = newitems + _ll_list_resize(l, length+1) + i = length + items = l.items + while i > 0: + items[i] = items[i+1] + i -= 1 + items[0] = newitem def ll_insert_nonneg(l, index, newitem): length = l.length - newitems = malloc(typeOf(l).TO.items.TO, length+1) - i = 0 - while i < index: - newitems[i] = l.items[i] - i += 1 - newitems[i] = newitem - i += 1 - while i <= length: - newitems[i] = l.items[i-1] - i += 1 - l.length = length + 1 - l.items = newitems + _ll_list_resize(l, length+1) + items = l.items + i = length + while i > index: + items[i] = items[i+1] + i -= 1 + items[i] = newitem def ll_insert(l, index, newitem): if index < 0: @@ -375,28 +420,27 @@ def ll_pop_default(l): index = l.length - 1 - res = l.items[index] newlength = index - newitems = malloc(typeOf(l).TO.items.TO, newlength) - j = 0 - while j < newlength: - newitems[j] = l.items[j] - j += 1 - l.length = newlength - l.items = newitems + items = l.items + res = items[index] + ITEM = typeOf(l).TO.items.TO.OF + if isinstance(ITEM, Ptr): + items[index] = nullptr(ITEM.TO) + _ll_list_resize(l, newlength) return res def ll_pop_zero(l): - index = l.length - 1 + newlength = l.length - 1 res = l.items[0] - newlength = index - newitems = malloc(typeOf(l).TO.items.TO, newlength) j = 0 + items = l.items while j < newlength: - newitems[j] = l.items[j+1] + items[j] = items[j+1] j += 1 - l.length = newlength - l.items = newitems + ITEM = typeOf(l).TO.items.TO.OF + if isinstance(ITEM, Ptr): + items[newlength] = nullptr(ITEM.TO) + _ll_list_resize(l, newlength) return res def ll_pop(l, index): @@ -408,12 +452,13 @@ def ll_reverse(l): length = l.length - len2 = length // 2 # moved this out of the loop + len2 = length // 2 i = 0 + items = l.items while i < len2: tmp = l.items[i] - l.items[i] = 
l.items[length-1-i] - l.items[length-1-i] = tmp + items[i] = items[length-1-i] + items[length-1-i] = tmp i += 1 def ll_getitem_nonneg(l, i): @@ -466,16 +511,15 @@ def ll_delitem_nonneg(l, i): newlength = l.length - 1 - newitems = malloc(typeOf(l).TO.items.TO, newlength) - j = 0 - while j < i: - newitems[j] = l.items[j] - j += 1 + j = i + items = l.items while j < newlength: - newitems[j] = l.items[j+1] + items[j] = items[j+1] j += 1 - l.length = newlength - l.items = newitems + ITEM = typeOf(l).TO.items.TO.OF + if isinstance(ITEM, Ptr): + items[newlength] = nullptr(ITEM.TO) + _ll_list_resize(l, newlength) def ll_delitem(l, i): if i < 0: @@ -495,12 +539,14 @@ newlength = len1 + len2 newitems = malloc(typeOf(l1).TO.items.TO, newlength) j = 0 + source = l1.items while j < len1: - newitems[j] = l1.items[j] + newitems[j] = source[j] j += 1 i = 0 + source = l2.items while i < len2: - newitems[j] = l2.items[i] + newitems[j] = source[i] i += 1 j += 1 l = malloc(typeOf(l1).TO) @@ -512,26 +558,24 @@ len1 = l1.length len2 = l2.length newlength = len1 + len2 - newitems = malloc(typeOf(l1).TO.items.TO, newlength) - j = 0 - while j < len1: - newitems[j] = l1.items[j] - j += 1 + _ll_list_resize(l1, newlength) + items = l1.items + source =l2.items i = 0 + j = len1 while i < len2: - newitems[j] = l2.items[i] + items[j] = source[i] i += 1 j += 1 - l1.length = newlength - l1.items = newitems def ll_listslice_startonly(l1, start): len1 = l1.length newlength = len1 - start newitems = malloc(typeOf(l1).TO.items.TO, newlength) j = 0 + source= l1.items while start < len1: - newitems[j] = l1.items[start] + newitems[j] = source[start] start += 1 j += 1 l = malloc(typeOf(l1).TO) @@ -547,8 +591,9 @@ newlength = stop - start newitems = malloc(typeOf(l1).TO.items.TO, newlength) j = 0 + source = l1.items while start < stop: - newitems[j] = l1.items[start] + newitems[j] = source[start] start += 1 j += 1 l = malloc(typeOf(l1).TO) @@ -561,49 +606,57 @@ assert newlength >= 0 newitems = malloc(typeOf(l1).TO.items.TO, newlength) j = 0 + source = l1.items while j < newlength: - newitems[j] = l1.items[j] + newitems[j] = source[j] j += 1 l = malloc(typeOf(l1).TO) l.length = newlength l.items = newitems return l -def ll_listdelslice_startonly(l1, start): - newitems = malloc(typeOf(l1).TO.items.TO, start) - j = 0 - while j < start: - newitems[j] = l1.items[j] - j += 1 - l1.length = start - l1.items = newitems +def ll_listdelslice_startonly(l, start): + newlength = start + ITEM = typeOf(l).TO.items.TO.OF + if isinstance(ITEM, Ptr): + j = l.length - 1 + items = l.items + while j >= newlength: + items[j] = nullptr(ITEM.TO) + j -= 1 + _ll_list_resize(l, newlength) -def ll_listdelslice(l1, slice): +def ll_listdelslice(l, slice): start = slice.start stop = slice.stop - if stop > l1.length: - stop = l1.length - newlength = l1.length - (stop-start) - newitems = malloc(typeOf(l1).TO.items.TO, newlength) - j = 0 - while j < start: - newitems[j] = l1.items[j] - j += 1 + if stop > l.length: + stop = l.length + newlength = l.length - (stop-start) + j = start + items = l.items while j < newlength: - newitems[j] = l1.items[stop] + items[j] = items[stop] stop += 1 j += 1 - l1.length = newlength - l1.items = newitems + ITEM = typeOf(l).TO.items.TO.OF + if isinstance(ITEM, Ptr): + j = l.length - 1 + while j >= newlength: + items[j] = nullptr(ITEM.TO) + j -= 1 + _ll_list_resize(l, newlength) def ll_listsetslice(l1, slice, l2): count = l2.length assert count == slice.stop - slice.start, ( "setslice cannot resize lists in RPython") + # XXX but it 
should be easy enough to support, soon start = slice.start j = 0 + items1 = l1.items + items2 = l2.items while j < count: - l1.items[start+j] = l2.items[j] + items1[start+j] = items2[j] j += 1 # ____________________________________________________________ From cfbolz at codespeak.net Mon Sep 5 00:41:56 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Mon, 5 Sep 2005 00:41:56 +0200 (CEST) Subject: [pypy-svn] r17214 - in pypy/dist/pypy/translator: . test Message-ID: <20050904224156.89F4927B47@code1.codespeak.net> Author: cfbolz Date: Mon Sep 5 00:41:51 2005 New Revision: 17214 Modified: pypy/dist/pypy/translator/backendoptimization.py pypy/dist/pypy/translator/test/test_backendoptimization.py Log: slightly insane (not only because of its spaghettiness) addition to the inlining code: now functions that don't contain calls to other functions can be inlined even though they are guarded by an exception handler. The code inserts exception matching blocks. This can be done better of course (will do that soon), because in the interesting cases the exception matching can be done at compile time. Modified: pypy/dist/pypy/translator/backendoptimization.py ============================================================================== --- pypy/dist/pypy/translator/backendoptimization.py (original) +++ pypy/dist/pypy/translator/backendoptimization.py Mon Sep 5 00:41:51 2005 @@ -5,8 +5,10 @@ from pypy.objspace.flow.model import Variable, Constant, Block, Link from pypy.objspace.flow.model import SpaceOperation, last_exception from pypy.objspace.flow.model import traverse, mkentrymap, checkgraph +from pypy.annotation import model as annmodel from pypy.tool.unionfind import UnionFind -from pypy.rpython.lltype import Void +from pypy.rpython.lltype import Void, Bool +from pypy.rpython import rmodel def remove_same_as(graph): """Remove all 'same_as' operations. 
@@ -176,8 +178,11 @@ checkgraph(graph) def _inline_function(translator, graph, block, index_operation): - if block.exitswitch == Constant(last_exception): - assert index_operation != len(block.operations) - 1, ( + exception_guarded = False + if (block.exitswitch == Constant(last_exception) and + index_operation == len(block.operations) - 1): + exception_guarded = True + assert len(collect_called_functions(graph)) == 0, ( "can't handle exceptions yet") op = block.operations[index_operation] graph_to_inline = translator.flowgraphs[op.args[0].value._obj._callable] @@ -251,14 +256,61 @@ copiedreturnblock.exitswitch = None copiedreturnblock.exits = [linkfrominlined] assert copiedreturnblock.exits[0].target == afterblock - #let links to exceptblock of the graph to inline go to graphs exceptblock if graph_to_inline.exceptblock in entrymap: + #let links to exceptblock of the graph to inline go to graphs exceptblock copiedexceptblock = copied_blocks[graph_to_inline.exceptblock] - for link in entrymap[graph_to_inline.exceptblock]: - copiedblock = copied_blocks[link.prevblock] - assert len(copiedblock.exits) == 1 - copiedblock.exits[0].args = copiedblock.exits[0].args[:2] - copiedblock.exits[0].target = graph.exceptblock + if not exception_guarded: + copiedexceptblock = copied_blocks[graph_to_inline.exceptblock] + for link in entrymap[graph_to_inline.exceptblock]: + copiedblock = copied_blocks[link.prevblock] + assert len(copiedblock.exits) == 1 + copiedblock.exits[0].args = copiedblock.exits[0].args[:2] + copiedblock.exits[0].target = graph.exceptblock + else: + #XXXXX don't look: insert blocks that do exception matching + #XXXXXX should do exception matching as far as possible before runtime + blocks = [] + exc_match = Constant(rmodel.getfunctionptr( + translator, + translator.rtyper.getexceptiondata().ll_exception_match)) + for i, link in enumerate(afterblock.exits[1:]): + etype = copyvar(translator, copiedexceptblock.inputargs[0]) + evalue = copyvar(translator, copiedexceptblock.inputargs[1]) + block = Block([etype, evalue] + get_new_passon_var_names(link.target)) + res = Variable() + res.concretetype = Bool + translator.annotator.bindings[res] = annmodel.SomeBool() + args = [exc_match, etype, Constant(link.llexitcase)] + block.operations.append(SpaceOperation("direct_call", args, res)) + block.exitswitch = res + linkargs = [] + for arg in afterblock.exits[i + 1].args: + if arg == afterblock.exits[i + 1].last_exception: + linkargs.append(etype) + elif arg == afterblock.exits[i + 1].last_exc_value: + linkargs.append(evalue) + elif isinstance(arg, Constant): + linkargs.append(arg) + else: + index = afterblock.inputargs.index(arg) + linkargs.append(passon_vars[link.target][index - 1]) + l = Link(linkargs, afterblock.exits[i + 1].target) + l.prevblock = block + l.exitcase = True + block.exits.append(l) + if i > 0: + l = Link(blocks[-1].inputargs, block) + l.prevblock = blocks[-1] + l.exitcase = False + blocks[-1].exits.insert(0, l) + blocks.append(block) + blocks[-1].exits = blocks[-1].exits[:1] + blocks[-1].operations = [] + blocks[-1].exitswitch = None + linkargs = copiedexceptblock.inputargs + copiedexceptblock.closeblock(Link(linkargs, blocks[0])) + afterblock.exits = [afterblock.exits[0]] + afterblock.exitswitch = None #cleaning up -- makes sense to be here, because I insert quite #some empty blocks and blocks that can be joined eliminate_empty_blocks(graph) Modified: pypy/dist/pypy/translator/test/test_backendoptimization.py 
============================================================================== --- pypy/dist/pypy/translator/test/test_backendoptimization.py (original) +++ pypy/dist/pypy/translator/test/test_backendoptimization.py Mon Sep 5 00:41:51 2005 @@ -121,24 +121,69 @@ a.simplify() t.specialize() inline_function(t, f, t.flowgraphs[g]) + interp = LLInterpreter(t.flowgraphs, t.rtyper) + result = interp.eval_function(g, [0]) + assert result == g(0) + result = interp.eval_function(g, [42]) + assert result == g(42) -def DONOTtest_inline_exceptions(): +def test_inline_exceptions(): def f(x): - if x: + if x == 0: raise ValueError + if x == 1: + raise KeyError def g(x): try: f(x) except ValueError: - return 1 + return 2 + except KeyError: + return 3 return 1 t = Translator(g) a = t.annotate([int]) a.simplify() t.specialize() - t.view() inline_function(t, f, t.flowgraphs[g]) - t.view() + interp = LLInterpreter(t.flowgraphs, t.rtyper) + result = interp.eval_function(g, [0]) + assert result == 2 + result = interp.eval_function(g, [1]) + assert result == 3 + result = interp.eval_function(g, [42]) + assert result == 1 + +def test_inline_var_exception(): + def f(x): + e = None + if x == 0: + e = ValueError() + elif x == 1: + e = KeyError() + if x == 0 or x == 1: + raise e + def g(x): + try: + f(x) + except ValueError: + return 2 + except KeyError: + return 3 + return 1 + t = Translator(g) + a = t.annotate([int]) + a.simplify() + t.specialize() + inline_function(t, f, t.flowgraphs[g]) + interp = LLInterpreter(t.flowgraphs, t.rtyper) + result = interp.eval_function(g, [0]) + assert result == 2 + result = interp.eval_function(g, [1]) + assert result == 3 + result = interp.eval_function(g, [42]) + assert result == 1 + def DONOTtest_for_loop(): def f(x): From tismer at codespeak.net Mon Sep 5 02:32:50 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Mon, 5 Sep 2005 02:32:50 +0200 (CEST) Subject: [pypy-svn] r17215 - pypy/dist/pypy/rpython Message-ID: <20050905003250.BF42C27B47@code1.codespeak.net> Author: tismer Date: Mon Sep 5 02:32:49 2005 New Revision: 17215 Modified: pypy/dist/pypy/rpython/rlist.py Log: - some cosmetic cleanups - saved a few operations which were hidden in index expressions with operators - tried to make the code a little more consistent about variable usage - stopped modifying variables like "stop" which is not their meaning. additional local variables most probably don't hurt. 
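The "operations hidden in index expressions" are the repeated j+1 and length-1-i computations inside the copy loops; the cleanup keeps those neighbouring indices in locals of their own, so the loop bodies only do plain increments. A schematic before/after of that pattern (function names invented for illustration, not taken from rlist.py):

    # Before: the neighbouring index j+1 is recomputed on every iteration.
    def shift_left(items, newlength):
        j = 0
        while j < newlength:
            items[j] = items[j + 1]
            j += 1

    # After: the neighbouring index is carried along in its own local,
    # which is the shape the rlist.py diff below gives to ll_pop_zero,
    # ll_delitem_nonneg and friends.
    def shift_left_carried(items, newlength):
        j = 0
        j1 = 1
        while j < newlength:
            items[j] = items[j1]
            j = j1
            j1 += 1
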
Modified: pypy/dist/pypy/rpython/rlist.py ============================================================================== --- pypy/dist/pypy/rpython/rlist.py (original) +++ pypy/dist/pypy/rpython/rlist.py Mon Sep 5 02:32:49 2005 @@ -393,8 +393,10 @@ _ll_list_resize(l, length+1) i = length items = l.items + i1 = i+1 while i > 0: - items[i] = items[i+1] + items[i] = items[i1] + i1 = i i -= 1 items[0] = newitem @@ -403,8 +405,10 @@ _ll_list_resize(l, length+1) items = l.items i = length + i1 = i+1 while i > index: - items[i] = items[i+1] + items[i] = items[i1] + i1 = i i -= 1 items[i] = newitem @@ -434,9 +438,11 @@ res = l.items[0] j = 0 items = l.items + j1 = j+1 while j < newlength: - items[j] = items[j+1] - j += 1 + items[j] = items[j1] + j = j1 + j1 += 1 ITEM = typeOf(l).TO.items.TO.OF if isinstance(ITEM, Ptr): items[newlength] = nullptr(ITEM.TO) @@ -455,11 +461,13 @@ len2 = length // 2 i = 0 items = l.items + length_1_i = length-1-i while i < len2: tmp = l.items[i] - items[i] = items[length-1-i] - items[length-1-i] = tmp + items[i] = items[length_1_i] + items[length_1_i] = tmp i += 1 + length_1_i -= 1 def ll_getitem_nonneg(l, i): return l.items[i] @@ -513,9 +521,11 @@ newlength = l.length - 1 j = i items = l.items + j1 = j+1 while j < newlength: - items[j] = items[j+1] - j += 1 + items[j] = items[j1] + j = j1 + j1 += 1 ITEM = typeOf(l).TO.items.TO.OF if isinstance(ITEM, Ptr): items[newlength] = nullptr(ITEM.TO) @@ -560,7 +570,7 @@ newlength = len1 + len2 _ll_list_resize(l1, newlength) items = l1.items - source =l2.items + source = l2.items i = 0 j = len1 while i < len2: @@ -573,10 +583,11 @@ newlength = len1 - start newitems = malloc(typeOf(l1).TO.items.TO, newlength) j = 0 - source= l1.items - while start < len1: - newitems[j] = source[start] - start += 1 + source = l1.items + i = start + while i < len1: + newitems[j] = source[i] + i += 1 j += 1 l = malloc(typeOf(l1).TO) l.length = newlength @@ -592,9 +603,10 @@ newitems = malloc(typeOf(l1).TO.items.TO, newlength) j = 0 source = l1.items - while start < stop: - newitems[j] = source[start] - start += 1 + i = start + while i < stop: + newitems[j] = source[i] + i += 1 j += 1 l = malloc(typeOf(l1).TO) l.length = newlength @@ -634,9 +646,10 @@ newlength = l.length - (stop-start) j = start items = l.items + i = stop while j < newlength: - items[j] = items[stop] - stop += 1 + items[j] = items[i] + i += 1 j += 1 ITEM = typeOf(l).TO.items.TO.OF if isinstance(ITEM, Ptr): @@ -652,11 +665,13 @@ "setslice cannot resize lists in RPython") # XXX but it should be easy enough to support, soon start = slice.start - j = 0 + j = start items1 = l1.items items2 = l2.items - while j < count: - items1[start+j] = items2[j] + i = 0 + while i < count: + items1[j] = items2[i] + i += 1 j += 1 # ____________________________________________________________ From pedronis at codespeak.net Mon Sep 5 10:07:54 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 5 Sep 2005 10:07:54 +0200 (CEST) Subject: [pypy-svn] r17216 - pypy/dist/pypy/translator/goal Message-ID: <20050905080754.D5DBA27B46@code1.codespeak.net> Author: pedronis Date: Mon Sep 5 10:07:53 2005 New Revision: 17216 Modified: pypy/dist/pypy/translator/goal/targetcompiler.py Log: use the PyPyAnnotationPolicy for this target, the space is used Modified: pypy/dist/pypy/translator/goal/targetcompiler.py ============================================================================== --- pypy/dist/pypy/translator/goal/targetcompiler.py (original) +++ 
pypy/dist/pypy/translator/goal/targetcompiler.py Mon Sep 5 10:07:53 2005 @@ -12,6 +12,8 @@ # __________ Entry point __________ # entry_point = annotateme +from pypy.translator.ann_override import PyPyAnnotatorPolicy + from pypy.interpreter.pyparser.pythonutil import target_ast_compile from pypy.objspace.std.objspace import StdObjSpace @@ -32,7 +34,7 @@ translating=True, #usemodules=['marhsal', '_sre'], geninterp=geninterp) - return entry_point, [str, str] + return entry_point, [str, str], PyPyAnnotatorPolicy() # _____ Run translated _____ def run(c_entry_point): From pedronis at codespeak.net Mon Sep 5 10:30:24 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 5 Sep 2005 10:30:24 +0200 (CEST) Subject: [pypy-svn] r17217 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050905083024.33A5027B46@code1.codespeak.net> Author: pedronis Date: Mon Sep 5 10:30:22 2005 New Revision: 17217 Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py Log: some getChildNodes were still returning tuples Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.py Mon Sep 5 10:30:22 2005 @@ -332,7 +332,7 @@ return () def getChildNodes(self): - return () + return [] def __repr__(self): return "Break()" @@ -449,7 +449,7 @@ return () def getChildNodes(self): - return () + return [] def __repr__(self): return "Continue()" @@ -538,7 +538,7 @@ return () def getChildNodes(self): - return () + return [] def __repr__(self): return "Ellipsis()" @@ -1107,7 +1107,7 @@ return () def getChildNodes(self): - return () + return [] def __repr__(self): return "NoneConst()" @@ -1176,7 +1176,7 @@ return () def getChildNodes(self): - return () + return [] def __repr__(self): return "Pass()" From adim at codespeak.net Mon Sep 5 10:38:29 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Mon, 5 Sep 2005 10:38:29 +0200 (CEST) Subject: [pypy-svn] r17218 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050905083829.A082427B4B@code1.codespeak.net> Author: adim Date: Mon Sep 5 10:38:28 2005 New Revision: 17218 Modified: pypy/dist/pypy/interpreter/astcompiler/future.py Log: use pyparser's SyntaxError Modified: pypy/dist/pypy/interpreter/astcompiler/future.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/future.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/future.py Mon Sep 5 10:38:28 2005 @@ -1,7 +1,7 @@ """Parser for future statements """ - +from pypy.interpreter.pyparser.error import SyntaxError from pypy.interpreter.astcompiler import ast, walk def is_future(stmt): From adim at codespeak.net Mon Sep 5 11:36:30 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Mon, 5 Sep 2005 11:36:30 +0200 (CEST) Subject: [pypy-svn] r17219 - pypy/dist/pypy/interpreter/pyparser Message-ID: <20050905093630.C55AD27B46@code1.codespeak.net> Author: adim Date: Mon Sep 5 11:36:29 2005 New Revision: 17219 Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py Log: small annotation fixes Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Mon Sep 5 11:36:29 2005 @@ -529,7 +529,9 @@ left = atoms[0] for i in range(2,l,2): right 
= atoms[i] - op = atoms[i-1].name + op_node = atoms[i-1] + assert isinstance(op_node, TokenObject) + op = op_node.name if op == tok.STAR: left = ast.Mul( [ left, right ] ) elif op == tok.SLASH: @@ -749,7 +751,7 @@ builder.push(atoms[0]) return items = [] - if atoms[1].name == tok.COMMA: + if isinstance(atoms[1], TokenObject) and atoms[1].name == tok.COMMA: for i in range(0, l, 2): # this is atoms not 1 items.append(atoms[i]) else: @@ -775,13 +777,13 @@ """ atoms = get_atoms(builder, nb) # Case 1 : '(' ... - if atoms[0].name == tok.LPAR: + if isinstance(atoms[0], TokenObject) and atoms[0].name == tok.LPAR: if len(atoms) == 2: # and atoms[1].token == tok.RPAR: builder.push(ArglistObject([], None, None)) elif len(atoms) == 3: # '(' Arglist ')' # push arglist on the stack builder.push(atoms[1]) - elif atoms[0].name == tok.LSQB: + elif isinstance(atoms[0], TokenObject) and atoms[0].name == tok.LSQB: if isinstance(atoms[1], SlicelistObject): builder.push(atoms[1]) else: From ludal at codespeak.net Mon Sep 5 11:53:34 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Mon, 5 Sep 2005 11:53:34 +0200 (CEST) Subject: [pypy-svn] r17220 - pypy/extradoc/sprintinfo Message-ID: <20050905095334.270B827B44@code1.codespeak.net> Author: ludal Date: Mon Sep 5 11:53:32 2005 New Revision: 17220 Added: pypy/extradoc/sprintinfo/Paris-2005-sprint.txt (contents, props changed) Log: draft Added: pypy/extradoc/sprintinfo/Paris-2005-sprint.txt ============================================================================== --- (empty file) +++ pypy/extradoc/sprintinfo/Paris-2005-sprint.txt Mon Sep 5 11:53:32 2005 @@ -0,0 +1,100 @@ +PyPy Sprint in Heidelberg 10th - 16th October 2005 +================================================== + +The next PyPy sprint will take place in Paris at Logilab's office +in France from the 10th to the 16th of October 2005 (both days included). + +To learn more about the new PyPy Python implementation +look here: + + http://codespeak.net/pypy + + +Goals and topics of the sprint +------------------------------ + +Now that version 0.7 has been released, it is time for refactoring and planning +of the next features. + +For these reasons the main topics of this sprint will be on + - threading and GC + - refactoring and translation features + - discussing about the upcoming JIT, optimizations and stackless features + + +Location & Accomodation +------------------------ + +The sprint will be held in Logilab's offices in Paris. The exact address is + + 10 rue Louis Vicat + 75015 Paris + +We'll provide WLAN connection and there is a small kitchen to make tea and coffee. +`See here`_ for a map of the area nearby. + +There are two metro lines and three bus lines that stop close to Logilab's: + - Metro line 12 : stop "Porte de Versailles" + - Metro line 13 : stop "Malakoff plateau de Vanves" + - Bus 89 or 58 : stop "Carrefour Albert Legris" + - Bus PC1 : stop "Porte de la Plaine" +Those are more or less within 5 minutes walk. + +There is a huge number of hotels_ in Paris, all of which are +unfortunately rather expensive. It should be no problem to reach the sprint +venue from anywhere in Paris using public transportation. +Other links to _`hotels en France` sites : + +However it would be better to find accomodation in the southern districts (14th, 15th) +or in the nearby towns (Malakoff, Vanves) not far from Paris. Another good choice is +to find an hotel near one of the metro line above. (avoid long bus trips as it takes +a long time.) 
+Looking at the map here: _hotels, the area you should look at is between (3) and (6), +(5) is still ok. + + +As an alternative you could also try to group (4 or 5 people) and try to rent +a furnished appartment for the week. + +For transportation we strongly recommend the public transportation (metro and buses). +There are weekly tickets (monday to sunday) from 15.70 euros (Zone 1-2). You need a +picture (ID format) and ask for a "Carte Orange". +Don't get the tourist's cards which are more expensive and valid only for 5 days. + +.. _`See here`: http://www.logilab.fr/contact.html +.. _hotels: http://www.hotel-paris.net/index-hotels-English.htm +.. _`hotels en France`: http://www.hotels.fr/ + +Exact times +----------- + +The Pypy sprint is held Monday 10th October - Sunday 16th October 2005. +Hours will be from 09:00 until 20:00 or later. It's a good +idea to arrive a day before the sprint starts. + +Network, Food +------------- + +There are plenty of restaurants and sandwich places around and a big supermarket for those +who would rather make their own food. + +You will probably need a wireless network card to access the network but we can provide +ethernet cables for those who don't have one. + + +Registration etc.pp. +-------------------- + +Please subscribe to the `PyPy sprint mailing list`_, introduce +yourself and post a note that you want to come. + +Feel free to ask any questions there! There also is a separate +`Paris people`_ page tracking who is already thought +to come. If you have commit rights on codespeak then +you can modify yourself a checkout of + + http://codespeak.net/svn/pypy/extradoc/sprintinfo/paris-people.txt + +.. _`Paris people`: http://codespeak.net/pypy/index.cgi?extradoc/sprintinfo/paris-people.html +.. _`PyPy sprint mailing list`: http://codespeak.net/mailman/listinfo/pypy-sprint + From ludal at codespeak.net Mon Sep 5 11:56:53 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Mon, 5 Sep 2005 11:56:53 +0200 (CEST) Subject: [pypy-svn] r17221 - pypy/extradoc/sprintinfo Message-ID: <20050905095653.26C9C27B4E@code1.codespeak.net> Author: ludal Date: Mon Sep 5 11:56:51 2005 New Revision: 17221 Modified: pypy/extradoc/sprintinfo/Paris-2005-sprint.txt Log: replace hotel url with one having cheaper rates Modified: pypy/extradoc/sprintinfo/Paris-2005-sprint.txt ============================================================================== --- pypy/extradoc/sprintinfo/Paris-2005-sprint.txt (original) +++ pypy/extradoc/sprintinfo/Paris-2005-sprint.txt Mon Sep 5 11:56:51 2005 @@ -63,7 +63,7 @@ .. _`See here`: http://www.logilab.fr/contact.html .. _hotels: http://www.hotel-paris.net/index-hotels-English.htm -.. _`hotels en France`: http://www.hotels.fr/ +.. 
_`hotels en France`: http://www.0800paris-hotels.com/ Exact times ----------- From ludal at codespeak.net Mon Sep 5 12:15:16 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Mon, 5 Sep 2005 12:15:16 +0200 (CEST) Subject: [pypy-svn] r17222 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050905101516.DD20227B45@code1.codespeak.net> Author: ludal Date: Mon Sep 5 12:15:15 2005 New Revision: 17222 Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py pypy/dist/pypy/interpreter/astcompiler/astgen.py Log: - modify astgen.py to get rid of the last return () - regenerate ast.py Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.py Mon Sep 5 12:15:15 2005 @@ -3,6 +3,7 @@ This file is automatically generated by Tools/compiler/astgen.py """ from consts import CO_VARARGS, CO_VARKEYWORDS, OP_ASSIGN +from pypy.interpreter.baseobjspace import Wrappable def flatten(list): l = [] @@ -20,7 +21,7 @@ nodes = {} -class Node: +class Node(Wrappable): """Abstract base class for ast nodes.""" def __init__(self, lineno = None): self.lineno = lineno @@ -47,6 +48,7 @@ res.append( self ) return res + class EmptyNode(Node): def accept(self, visitor): return visitor.visitEmptyNode(self) @@ -329,7 +331,7 @@ Node.__init__(self, lineno) def getChildren(self): - return () + return [] def getChildNodes(self): return [] @@ -446,7 +448,7 @@ Node.__init__(self, lineno) def getChildren(self): - return () + return [] def getChildNodes(self): return [] @@ -535,7 +537,7 @@ Node.__init__(self, lineno) def getChildren(self): - return () + return [] def getChildNodes(self): return [] @@ -1104,7 +1106,7 @@ Node.__init__(self, lineno) def getChildren(self): - return () + return [] def getChildNodes(self): return [] @@ -1173,7 +1175,7 @@ Node.__init__(self, lineno) def getChildren(self): - return () + return [] def getChildNodes(self): return [] Modified: pypy/dist/pypy/interpreter/astcompiler/astgen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/astgen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/astgen.py Mon Sep 5 12:15:15 2005 @@ -137,7 +137,7 @@ def _gen_getChildren(self, buf): print >> buf, " def getChildren(self):" if len(self.argnames) == 0: - print >> buf, " return ()" + print >> buf, " return []" else: if self.hardest_arg < P_NESTED: clist = COMMA.join(["self.%s" % c @@ -163,7 +163,7 @@ def _gen_getChildNodes(self, buf): print >> buf, " def getChildNodes(self):" if len(self.argnames) == 0: - print >> buf, " return ()" + print >> buf, " return []" else: if self.hardest_arg < P_NESTED: clist = ["self.%s" % c From adim at codespeak.net Mon Sep 5 12:16:15 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Mon, 5 Sep 2005 12:16:15 +0200 (CEST) Subject: [pypy-svn] r17223 - pypy/dist/pypy/interpreter/pyparser Message-ID: <20050905101615.1508C27B46@code1.codespeak.net> Author: adim Date: Mon Sep 5 12:16:13 2005 New Revision: 17223 Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py Log: this should fix most of '.name'-related SomeObjects Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Mon Sep 5 12:16:13 2005 
@@ -389,7 +389,7 @@ def reduce_slice(obj, sliceobj): """generic factory for Slice nodes""" assert isinstance(sliceobj, SlicelistObject) - if sliceobj.name == 'slice': + if sliceobj.fake_rulename == 'slice': start = sliceobj.value[0] end = sliceobj.value[1] return ast.Slice(obj, consts.OP_APPLY, start, end) @@ -531,14 +531,13 @@ right = atoms[i] op_node = atoms[i-1] assert isinstance(op_node, TokenObject) - op = op_node.name - if op == tok.STAR: + if op_node.name == tok.STAR: left = ast.Mul( [ left, right ] ) - elif op == tok.SLASH: + elif op_node.name == tok.SLASH: left = ast.Div( [ left, right ] ) - elif op == tok.PERCENT: + elif op_node.name == tok.PERCENT: left = ast.Mod( [ left, right ] ) - elif op == tok.DOUBLESLASH: + elif op_node.name == tok.DOUBLESLASH: left = ast.FloorDiv( [ left, right ] ) else: raise ValueError, "unexpected token: %s" % atoms[i-1] @@ -550,10 +549,11 @@ left = atoms[0] for i in range(2,l,2): right = atoms[i] - op = atoms[i-1].name - if op == tok.PLUS: + op_node = atoms[i-1] + assert isinstance(op_node, TokenObject) + if op_node.name == tok.PLUS: left = ast.Add( [ left, right ] ) - elif op == tok.MINUS: + elif op_node.name == tok.MINUS: left = ast.Sub( [ left, right ] ) else: raise ValueError, "unexpected token: %s : %s" % atoms[i-1] @@ -670,6 +670,7 @@ builder.push(ast.Discard(atoms[0])) return op = atoms[1] + assert isinstance(op, TokenObject) if op.name == tok.EQUAL: nodes = [] for i in range(0,l-2,2): @@ -1085,7 +1086,8 @@ index += 2 names.append((name, as_name)) # move forward until next ',' - while index < l and atoms[index].name != tok.COMMA: + while index < l and isinstance(atoms[index], TokenObject) and \ + atoms[index].name != tok.COMMA: index += 1 index += 1 builder.push(ast.Import(names)) @@ -1102,6 +1104,7 @@ index = 1 incr, from_name = parse_dotted_names(atoms[index:]) index += (incr + 1) # skip 'import' + assert isinstance(atoms[index], TokenObject) # XXX if atoms[index].name == tok.STAR: names = [('*', None)] else: @@ -1388,7 +1391,7 @@ FIXME: think about a more appropriate name """ def __init__(self, name, value, src): - self.name = name + self.fake_rulename = name self.value = value self.count = 0 self.line = 0 # src.getline() @@ -1400,7 +1403,7 @@ self.value is the 3-tuple (names, defaults, flags) """ def __init__(self, arguments, stararg, dstararg): - self.name = 'arglist' + self.fake_rulename = 'arglist' self.arguments = arguments self.stararg = stararg self.dstararg = dstararg From ludal at codespeak.net Mon Sep 5 13:01:57 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Mon, 5 Sep 2005 13:01:57 +0200 (CEST) Subject: [pypy-svn] r17224 - pypy/extradoc/sprintinfo Message-ID: <20050905110157.937AF27B46@code1.codespeak.net> Author: ludal Date: Mon Sep 5 13:01:56 2005 New Revision: 17224 Modified: pypy/extradoc/sprintinfo/Paris-2005-sprint.txt Log: added info abot appartment rental Modified: pypy/extradoc/sprintinfo/Paris-2005-sprint.txt ============================================================================== --- pypy/extradoc/sprintinfo/Paris-2005-sprint.txt (original) +++ pypy/extradoc/sprintinfo/Paris-2005-sprint.txt Mon Sep 5 13:01:56 2005 @@ -38,23 +38,27 @@ - Metro line 13 : stop "Malakoff plateau de Vanves" - Bus 89 or 58 : stop "Carrefour Albert Legris" - Bus PC1 : stop "Porte de la Plaine" + Those are more or less within 5 minutes walk. There is a huge number of hotels_ in Paris, all of which are unfortunately rather expensive. 
It should be no problem to reach the sprint venue from anywhere in Paris using public transportation. -Other links to _`hotels en France` sites : +Other links to `hotels en France`_ sites : However it would be better to find accomodation in the southern districts (14th, 15th) or in the nearby towns (Malakoff, Vanves) not far from Paris. Another good choice is to find an hotel near one of the metro line above. (avoid long bus trips as it takes a long time.) -Looking at the map here: _hotels, the area you should look at is between (3) and (6), +Looking at the map here: hotels_, the area you should look at is between (3) and (6), (5) is still ok. As an alternative you could also try to group (4 or 5 people) and try to rent -a furnished appartment for the week. +a furnished appartment for the week. Almost all announces from French websites come +from `lodgis agency`_. They are rather expensive but most of the time it can be cheaper +than hotels. For the more adventurous you can look on the french site pap_. (select +appartements, d?partement=75, arrondissements=15,14,6) For transportation we strongly recommend the public transportation (metro and buses). There are weekly tickets (monday to sunday) from 15.70 euros (Zone 1-2). You need a @@ -64,6 +68,8 @@ .. _`See here`: http://www.logilab.fr/contact.html .. _hotels: http://www.hotel-paris.net/index-hotels-English.htm .. _`hotels en France`: http://www.0800paris-hotels.com/ +.. _pap: http://www.pap.fr/immobilier/offres/offre-location-vacances.asp +.. _`lodgis agency`: http://www.lodgis.com/ Exact times ----------- From cfbolz at codespeak.net Mon Sep 5 13:40:00 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Mon, 5 Sep 2005 13:40:00 +0200 (CEST) Subject: [pypy-svn] r17225 - pypy/dist/pypy/translator Message-ID: <20050905114000.0A45327B46@code1.codespeak.net> Author: cfbolz Date: Mon Sep 5 13:40:00 2005 New Revision: 17225 Modified: pypy/dist/pypy/translator/backendoptimization.py Log: try to match exceptions during inlining by doing simple pattern matching. Modified: pypy/dist/pypy/translator/backendoptimization.py ============================================================================== --- pypy/dist/pypy/translator/backendoptimization.py (original) +++ pypy/dist/pypy/translator/backendoptimization.py Mon Sep 5 13:40:00 2005 @@ -177,6 +177,21 @@ traverse(find_callsites, graph) checkgraph(graph) +def _find_exception_type(block): + #XXX slightly brittle: find the exception type for simple cases + #(e.g. 
if you do only raise XXXError) by doing pattern matching + ops = block.operations + if (len(ops) < 6 or + ops[-6].opname != "malloc" or ops[-5].opname != "cast_pointer" or + ops[-4].opname != "setfield" or ops[-3].opname != "cast_pointer" or + ops[-2].opname != "getfield" or ops[-1].opname != "cast_pointer" or + len(block.exits) != 1 or block.exits[0].args[0] != ops[-2].result or + block.exits[0].args[1] != ops[-1].result or + not isinstance(ops[-4].args[1], Constant) or + ops[-4].args[1].value != "typeptr"): + return None + return ops[-4].args[2].value + def _inline_function(translator, graph, block, index_operation): exception_guarded = False if (block.exitswitch == Constant(last_exception) and @@ -267,12 +282,43 @@ copiedblock.exits[0].args = copiedblock.exits[0].args[:2] copiedblock.exits[0].target = graph.exceptblock else: - #XXXXX don't look: insert blocks that do exception matching - #XXXXXX should do exception matching as far as possible before runtime - blocks = [] + def find_args_in_exceptional_case(link, block, etype, evalue): + linkargs = [] + for arg in link.args: + if arg == link.last_exception: + linkargs.append(etype) + elif arg == link.last_exc_value: + linkargs.append(evalue) + elif isinstance(arg, Constant): + linkargs.append(arg) + else: + index = afterblock.inputargs.index(arg) + linkargs.append(passon_vars[block][index - 1]) + return linkargs exc_match = Constant(rmodel.getfunctionptr( translator, translator.rtyper.getexceptiondata().ll_exception_match)) + #try to match the exceptions for simple cases + for link in entrymap[graph_to_inline.exceptblock]: + copiedblock = copied_blocks[link.prevblock] + copiedlink = copiedblock.exits[0] + eclass = _find_exception_type(copiedblock) + print copiedblock.operations + if eclass is None: + continue + etype = copiedlink.args[0] + evalue = copiedlink.args[1] + for exceptionlink in afterblock.exits[1:]: + if exc_match.value(eclass, exceptionlink.llexitcase): + copiedlink.target = exceptionlink.target + linkargs = find_args_in_exceptional_case(exceptionlink, + copiedblock, + etype, evalue) + copiedlink.args = linkargs + break + #XXXXX don't look: insert blocks that do exception matching + #for the cases where direct matching did not work + blocks = [] for i, link in enumerate(afterblock.exits[1:]): etype = copyvar(translator, copiedexceptblock.inputargs[0]) evalue = copyvar(translator, copiedexceptblock.inputargs[1]) @@ -283,18 +329,9 @@ args = [exc_match, etype, Constant(link.llexitcase)] block.operations.append(SpaceOperation("direct_call", args, res)) block.exitswitch = res - linkargs = [] - for arg in afterblock.exits[i + 1].args: - if arg == afterblock.exits[i + 1].last_exception: - linkargs.append(etype) - elif arg == afterblock.exits[i + 1].last_exc_value: - linkargs.append(evalue) - elif isinstance(arg, Constant): - linkargs.append(arg) - else: - index = afterblock.inputargs.index(arg) - linkargs.append(passon_vars[link.target][index - 1]) - l = Link(linkargs, afterblock.exits[i + 1].target) + linkargs = find_args_in_exceptional_case(link, link.target, + etype, evalue) + l = Link(linkargs, link.target) l.prevblock = block l.exitcase = True block.exits.append(l) From pedronis at codespeak.net Mon Sep 5 14:30:44 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 5 Sep 2005 14:30:44 +0200 (CEST) Subject: [pypy-svn] r17226 - in pypy/dist/pypy: annotation translator Message-ID: <20050905123044.171B627B46@code1.codespeak.net> Author: pedronis Date: Mon Sep 5 14:30:42 2005 New Revision: 17226 Modified: 
pypy/dist/pypy/annotation/bookkeeper.py pypy/dist/pypy/annotation/classdef.py pypy/dist/pypy/annotation/listdef.py pypy/dist/pypy/annotation/model.py pypy/dist/pypy/translator/annrpython.py Log: try at better crash early on SomeObject logic. we check on setbinding, attr and dict/list generalisation Modified: pypy/dist/pypy/annotation/bookkeeper.py ============================================================================== --- pypy/dist/pypy/annotation/bookkeeper.py (original) +++ pypy/dist/pypy/annotation/bookkeeper.py Mon Sep 5 14:30:42 2005 @@ -684,8 +684,10 @@ self.spec_callsite_keys_results[occurence] = prev_key, r return r - + def ondegenerated(self, what, s_value, where=None, called_from=None): + self.annotator.ondegenerated(what, s_value, where=where, called_from=called_from) + def whereami(self): return self.annotator.whereami(self.position_key) Modified: pypy/dist/pypy/annotation/classdef.py ============================================================================== --- pypy/dist/pypy/annotation/classdef.py (original) +++ pypy/dist/pypy/annotation/classdef.py Mon Sep 5 14:30:42 2005 @@ -4,8 +4,8 @@ from __future__ import generators from types import FunctionType -from pypy.annotation.model import SomeImpossibleValue, SomePBC, tracking_unionof -from pypy.annotation.model import SomeInteger +from pypy.annotation.model import SomeImpossibleValue, SomePBC, unionof +from pypy.annotation.model import SomeInteger, isdegenerated # The main purpose of a ClassDef is to collect information about class/instance @@ -88,15 +88,28 @@ else: # a prebuilt instance source forces readonly=False, see above self.readonly = False - self.s_value = tracking_unionof(self, self.s_value, s_value) + s_new_value = unionof(self.s_value, s_value) + if isdegenerated(s_new_value): + self.bookkeeper.ondegenerated("source %r attr %s" % (source, self.name), + s_new_value) + + self.s_value = s_new_value def getvalue(self): # Same as 'self.s_value' for historical reasons. return self.s_value - def merge(self, other): + def merge(self, other, classdef=None): assert self.name == other.name - self.s_value = tracking_unionof(self, self.s_value, other.s_value) + s_new_value = unionof(self.s_value, other.s_value) + if isdegenerated(s_new_value): + if classdef is None: + what = "? 
attr %s" % self.name + else: + what = "%r attr %s" % (classdef, self.name) + self.bookkeeper.ondegenerated(what, s_new_value) + + self.s_value = s_new_value self.readonly = self.readonly and other.readonly self.read_locations.update(other.read_locations) @@ -294,7 +307,7 @@ # keep all subattributes' values for subattr in subclass_attrs: - newattr.merge(subattr) + newattr.merge(subattr, classdef=self) # store this new Attribute, generalizing the previous ones from # subclasses -- invariant (A) Modified: pypy/dist/pypy/annotation/listdef.py ============================================================================== --- pypy/dist/pypy/annotation/listdef.py (original) +++ pypy/dist/pypy/annotation/listdef.py Mon Sep 5 14:30:42 2005 @@ -1,5 +1,5 @@ from pypy.annotation.model import SomeObject, SomeImpossibleValue -from pypy.annotation.model import tracking_unionof, TLS, UnionError +from pypy.annotation.model import unionof, TLS, UnionError, isdegenerated class ListItem: @@ -28,7 +28,9 @@ self.patch() # which should patch all refs to 'other' s_value = self.s_value s_other_value = other.s_value - s_new_value = tracking_unionof(self.__class__.__name__, s_value, s_other_value) + s_new_value = unionof(s_value, s_other_value) + if isdegenerated(s_new_value): + self.bookkeeper.ondegenerated(self, s_new_value) if s_new_value != s_value: self.s_value = s_new_value # reflow from reading points @@ -44,7 +46,9 @@ listdef.listitem = self def generalize(self, s_other_value): - s_new_value = tracking_unionof(self.__class__.__name__, self.s_value, s_other_value) + s_new_value = unionof(self.s_value, s_other_value) + if isdegenerated(s_new_value): + self.bookkeeper.ondegenerated(self, s_new_value) if s_new_value != self.s_value: self.s_value = s_new_value # reflow from all reading points Modified: pypy/dist/pypy/annotation/model.py ============================================================================== --- pypy/dist/pypy/annotation/model.py (original) +++ pypy/dist/pypy/annotation/model.py Mon Sep 5 14:30:42 2005 @@ -493,10 +493,13 @@ s1.caused_by_merge = somevalues return s1 +def isdegenerated(s_value): + return s_value.__class__ is SomeObject and s_value.knowntype is not type + def tracking_unionof(ctxt, *somevalues): s1 = unionof(*somevalues) - if not s1.origin and type(ctxt) is tuple: - s1.origin = ctxt+(0,) + #if not s1.origin and type(ctxt) is tuple: + # s1.origin = ctxt+(0,) return s1 # make knowntypedata dictionary Modified: pypy/dist/pypy/translator/annrpython.py ============================================================================== --- pypy/dist/pypy/translator/annrpython.py (original) +++ pypy/dist/pypy/translator/annrpython.py Mon Sep 5 14:30:42 2005 @@ -217,7 +217,26 @@ else: raise TypeError, 'Variable or Constant expected, got %r' % (arg,) - def setbinding(self, arg, s_value, called_from=None): + def ondegenerated(self, what, s_value, where=None, called_from=None): + if self.policy.allow_someobjects: + return + msglines = ["annotation of %r degenerated to SomeObject()" % (what,)] + try: + position_key = where or self.bookkeeper.position_key + except AttributeError: + pass + else: + msglines.append(".. position: %s" % (self.whereami(position_key),)) + if called_from is not None: + msglines.append(".. called from %r" % (called_from,)) + if hasattr(called_from, '__module__'): + msglines[-1] += " from module %r"% (called_from.__module__,) + if s_value.origin is not None: + msglines.append(".. 
SomeObject() origin: %s" % ( + self.whereami(s_value.origin),)) + raise AnnotatorError('\n'.join(msglines)) + + def setbinding(self, arg, s_value, called_from=None, where=None): if arg in self.bindings: assert s_value.contains(self.bindings[arg]) # for debugging purposes, record the history of bindings that @@ -227,25 +246,12 @@ history.append(self.bindings[arg]) cause_history = self.binding_cause_history.setdefault(arg, []) cause_history.append(self.binding_caused_by[arg]) - degenerated = (s_value.__class__ is annmodel.SomeObject and - s_value.knowntype is not type) - if degenerated and not self.policy.allow_someobjects: - msglines = ["annotation of %r degenerated to SomeObject()" % (arg,)] - try: - position_key = self.bookkeeper.position_key - except AttributeError: - pass - else: - msglines.append(".. %r position: %s" % ( - arg, self.whereami(position_key),)) - if called_from is not None: - msglines.append(".. called from %r" % (called_from,)) - if hasattr(called_from, '__module__'): - msglines[-1] += " from module %r"% (called_from.__module__,) - if s_value.origin is not None: - msglines.append(".. SomeObject() origin: %s" % ( - self.whereami(s_value.origin),)) - raise AnnotatorError('\n'.join(msglines)) + + degenerated = annmodel.isdegenerated(s_value) + + if degenerated: + self.ondegenerated(arg, s_value, where=where, called_from=called_from) + self.bindings[arg] = s_value if annmodel.DEBUG: if arg in self.return_bindings: @@ -373,21 +379,21 @@ assert block in self.annotated self.annotated[block] = False # must re-flow - def bindinputargs(self, fn, block, inputcells, called_from=None): + def bindinputargs(self, fn, block, inputcells, called_from=None, where=None): # Create the initial bindings for the input args of a block. assert len(block.inputargs) == len(inputcells) for a, cell in zip(block.inputargs, inputcells): - self.setbinding(a, cell, called_from) + self.setbinding(a, cell, called_from, where=where) self.annotated[block] = False # must flowin. def mergeinputargs(self, fn, block, inputcells, called_from=None): # Merge the new 'cells' with each of the block's existing input # variables. 
oldcells = [self.binding(a) for a in block.inputargs] - unions = [annmodel.tracking_unionof((fn, block), c1,c2) for c1, c2 in zip(oldcells,inputcells)] + unions = [annmodel.unionof(c1,c2) for c1, c2 in zip(oldcells,inputcells)] # if the merged cells changed, we must redo the analysis if unions != oldcells: - self.bindinputargs(fn, block, unions, called_from) + self.bindinputargs(fn, block, unions, called_from, where=(fn, block, None)) def whereami(self, position_key): fn, block, i = position_key From pedronis at codespeak.net Mon Sep 5 14:32:58 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 5 Sep 2005 14:32:58 +0200 (CEST) Subject: [pypy-svn] r17227 - pypy/dist/pypy/annotation Message-ID: <20050905123258.2F59727B46@code1.codespeak.net> Author: pedronis Date: Mon Sep 5 14:32:57 2005 New Revision: 17227 Modified: pypy/dist/pypy/annotation/model.py Log: tracking_unionof is no longer needed Modified: pypy/dist/pypy/annotation/model.py ============================================================================== --- pypy/dist/pypy/annotation/model.py (original) +++ pypy/dist/pypy/annotation/model.py Mon Sep 5 14:32:57 2005 @@ -496,12 +496,6 @@ def isdegenerated(s_value): return s_value.__class__ is SomeObject and s_value.knowntype is not type -def tracking_unionof(ctxt, *somevalues): - s1 = unionof(*somevalues) - #if not s1.origin and type(ctxt) is tuple: - # s1.origin = ctxt+(0,) - return s1 - # make knowntypedata dictionary def add_knowntypedata(ktd, truth, vars, s_obj): From pedronis at codespeak.net Mon Sep 5 14:42:40 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 5 Sep 2005 14:42:40 +0200 (CEST) Subject: [pypy-svn] r17229 - pypy/dist/pypy/translator/goal Message-ID: <20050905124240.B12DD27B46@code1.codespeak.net> Author: pedronis Date: Mon Sep 5 14:42:33 2005 New Revision: 17229 Added: pypy/dist/pypy/translator/goal/targetastbuilder.py - copied, changed from r17227, pypy/dist/pypy/translator/goal/targetparser.py Log: target only for the astbuilder part. 
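The r17223 and r17230 changesets in this batch rely on one recurring RPython idiom: check or assert the exact class of a parser-stack object before reading an attribute that only that class defines, so the annotator can keep a precise annotation for the attribute instead of widening it to SomeObject (which r17226 above now turns into an early AnnotatorError). The sketch below is a standalone illustration of that idiom, not the real pyparser code: BaseGrammarObject, token_kind, skip_to_comma and the COMMA value are invented stand-ins, while TokenObject, fake_rulename and the isinstance/assert pattern mirror the diffs above::

    class BaseGrammarObject(object):
        """Illustrative common base for objects on the builder stack."""

    class TokenObject(BaseGrammarObject):
        def __init__(self, name, value):
            self.name = name        # integer token id, e.g. tok.COMMA
            self.value = value

    class RuleObject(BaseGrammarObject):
        def __init__(self, rulename, value):
            # r17223 renames this attribute from 'name' to 'fake_rulename',
            # presumably so that a string rule name is never unioned with the
            # integer TokenObject.name under the same attribute annotation
            self.fake_rulename = rulename
            self.value = value

    COMMA = 12   # placeholder token id; the real value comes from pyparser's tok

    def token_kind(op_node):
        # the assert is a no-op at run time for valid input, but it lets the
        # annotator narrow op_node to TokenObject, so the .name read stays a
        # precisely annotated integer instead of SomeObject
        assert isinstance(op_node, TokenObject)
        return op_node.name

    def skip_to_comma(atoms, index):
        # same narrowing via isinstance(), in the spirit of the import-name
        # loop reworked in r17230 below
        while index < len(atoms):
            atom = atoms[index]
            if isinstance(atom, TokenObject) and atom.name == COMMA:
                break
            index += 1
        return index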
From adim at codespeak.net Mon Sep 5 15:00:58 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Mon, 5 Sep 2005 15:00:58 +0200 (CEST) Subject: [pypy-svn] r17230 - pypy/dist/pypy/interpreter/pyparser Message-ID: <20050905130058.8416E27B46@code1.codespeak.net> Author: adim Date: Mon Sep 5 15:00:57 2005 New Revision: 17230 Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py Log: fix ast.Node.name / TokenObject.name annotation conflicts Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Mon Sep 5 15:00:57 2005 @@ -163,6 +163,7 @@ index += 1 else: raise ValueError("FIXME: SyntaxError (incomplete varags) ?") + assert isinstance(cur_token, TokenObject) if cur_token.name != tok.DOUBLESTAR: raise ValueError("Unexpected token: %s" % cur_token) cur_token = tokens[index] @@ -504,7 +505,8 @@ if len(atoms) == 1: builder.push(atoms[0]) else: - if isinstance(atoms[-2], TokenObject) and atoms[-2].name == tok.DOUBLESTAR: + token = atoms[-2] + if isinstance(token, TokenObject) and token.name == tok.DOUBLESTAR: obj = parse_attraccess(atoms[:-2]) builder.push(ast.Power([obj, atoms[-1]])) else: @@ -515,13 +517,15 @@ atoms = get_atoms( builder, nb ) if len(atoms) == 1: builder.push( atoms[0] ) - elif len(atoms) == 2 and isinstance(atoms[0],TokenObject): - if atoms[0].name == tok.PLUS: - builder.push( ast.UnaryAdd( atoms[1] ) ) - if atoms[0].name == tok.MINUS: - builder.push( ast.UnarySub( atoms[1] ) ) - if atoms[0].name == tok.TILDE: - builder.push( ast.Invert( atoms[1] ) ) + elif len(atoms) == 2: + token = atoms[0] + if isinstance(token, TokenObject): + if token.name == tok.PLUS: + builder.push( ast.UnaryAdd( atoms[1] ) ) + if token.name == tok.MINUS: + builder.push( ast.UnarySub( atoms[1] ) ) + if token.name == tok.TILDE: + builder.push( ast.Invert( atoms[1] ) ) def build_term( builder, nb ): atoms = get_atoms( builder, nb ) @@ -565,10 +569,11 @@ left = atoms[0] for i in range(2,l,2): right = atoms[i] - op = atoms[i-1].name - if op == tok.LEFTSHIFT: + op_node = atoms[i-1] + assert isinstance(op_node, TokenObject) + if op_node.name == tok.LEFTSHIFT: left = ast.LeftShift( [ left, right ] ) - elif op == tok.RIGHTSHIFT: + elif op_node.name == tok.RIGHTSHIFT: left = ast.RightShift( [ left, right ] ) else: raise ValueError, "unexpected token: %s : %s" % atoms[i-1] @@ -752,7 +757,8 @@ builder.push(atoms[0]) return items = [] - if isinstance(atoms[1], TokenObject) and atoms[1].name == tok.COMMA: + token = atoms[1] + if isinstance(token, TokenObject) and token.name == tok.COMMA: for i in range(0, l, 2): # this is atoms not 1 items.append(atoms[i]) else: @@ -777,14 +783,15 @@ """trailer: '(' ')' | '(' arglist ')' | '[' subscriptlist ']' | '.' NAME """ atoms = get_atoms(builder, nb) + first_token = atoms[0] # Case 1 : '(' ... 
- if isinstance(atoms[0], TokenObject) and atoms[0].name == tok.LPAR: + if isinstance(first_token, TokenObject) and first_token.name == tok.LPAR: if len(atoms) == 2: # and atoms[1].token == tok.RPAR: builder.push(ArglistObject([], None, None)) elif len(atoms) == 3: # '(' Arglist ')' # push arglist on the stack builder.push(atoms[1]) - elif isinstance(atoms[0], TokenObject) and atoms[0].name == tok.LSQB: + elif isinstance(first_token, TokenObject) and first_token.name == tok.LSQB: if isinstance(atoms[1], SlicelistObject): builder.push(atoms[1]) else: @@ -815,17 +822,17 @@ def build_subscript(builder, nb): """'.' '.' '.' | [test] ':' [test] [':' [test]] | test""" atoms = get_atoms(builder, nb) - if isinstance(atoms[0], TokenObject) and atoms[0].name == tok.DOT: + token = atoms[0] + if isinstance(token, TokenObject) and token.name == tok.DOT: # Ellipsis: builder.push(ast.Ellipsis()) elif len(atoms) == 1: - token = atoms[0] if isinstance(token, TokenObject) and token.name == tok.COLON: sliceinfos = [None, None, None] builder.push(SlicelistObject('slice', sliceinfos, None)) else: # test - builder.push(atoms[0]) + builder.push(token) else: # elif len(atoms) > 1: items = [] sliceinfos = [None, None, None] @@ -862,15 +869,15 @@ def build_listmaker(builder, nb): """listmaker: test ( list_for | (',' test)* [','] )""" atoms = get_atoms(builder, nb) - if len(atoms) >= 2 and isinstance(atoms[1], TokenObject): + if len(atoms) >= 2: token = atoms[1] - assert isinstance(token, TokenObject) # rtyper info - if token.get_value() == 'for': - # list comp - expr = atoms[0] - list_for = parse_listcomp(atoms[1:]) - builder.push(ast.ListComp(expr, list_for)) - return + if isinstance(token, TokenObject): + if token.get_value() == 'for': + # list comp + expr = atoms[0] + list_for = parse_listcomp(atoms[1:]) + builder.push(ast.ListComp(expr, list_for)) + return # regular list building (like in [1, 2, 3,]) index = 0 nodes = [] @@ -1086,9 +1093,14 @@ index += 2 names.append((name, as_name)) # move forward until next ',' - while index < l and isinstance(atoms[index], TokenObject) and \ - atoms[index].name != tok.COMMA: + # XXX: what is it supposed to do ? 
+ for atom in atoms[index:]: + if isinstance(atom, TokenObject) and atom.name == tok.COMMA: + break index += 1 +## while index < l and isinstance(atoms[index], TokenObject) and \ +## atoms[index].name != tok.COMMA: +## index += 1 index += 1 builder.push(ast.Import(names)) @@ -1104,11 +1116,12 @@ index = 1 incr, from_name = parse_dotted_names(atoms[index:]) index += (incr + 1) # skip 'import' - assert isinstance(atoms[index], TokenObject) # XXX - if atoms[index].name == tok.STAR: + token = atoms[index] + assert isinstance(token, TokenObject) # XXX + if token.name == tok.STAR: names = [('*', None)] else: - if atoms[index].name == tok.LPAR: + if token.name == tok.LPAR: # mutli-line imports tokens = atoms[index+1:-1] else: @@ -1181,13 +1194,15 @@ dest = None start = 1 if l > 1: - if isinstance(atoms[1], TokenObject) and atoms[1].name == tok.RIGHTSHIFT: + token = atoms[1] + if isinstance(token, TokenObject) and token.name == tok.RIGHTSHIFT: dest = atoms[2] # skip following comma start = 4 for index in range(start, l, 2): items.append(atoms[index]) - if isinstance(atoms[-1], TokenObject) and atoms[-1].name == tok.COMMA: + last_token = atoms[-1] + if isinstance(last_token, TokenObject) and last_token.name == tok.COMMA: builder.push(ast.Print(items, dest)) else: builder.push(ast.Printnl(items, dest)) From ludal at codespeak.net Mon Sep 5 15:06:43 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Mon, 5 Sep 2005 15:06:43 +0200 (CEST) Subject: [pypy-svn] r17231 - pypy/extradoc/sprintinfo Message-ID: <20050905130643.5874227B46@code1.codespeak.net> Author: ludal Date: Mon Sep 5 15:06:42 2005 New Revision: 17231 Modified: pypy/extradoc/sprintinfo/Paris-2005-sprint.txt Log: more tweaking Modified: pypy/extradoc/sprintinfo/Paris-2005-sprint.txt ============================================================================== --- pypy/extradoc/sprintinfo/Paris-2005-sprint.txt (original) +++ pypy/extradoc/sprintinfo/Paris-2005-sprint.txt Mon Sep 5 15:06:42 2005 @@ -25,7 +25,7 @@ Location & Accomodation ------------------------ -The sprint will be held in Logilab's offices in Paris. The exact address is +The sprint will be held in Logilab's offices in Paris. The exact address is:: 10 rue Louis Vicat 75015 Paris @@ -42,10 +42,9 @@ Those are more or less within 5 minutes walk. There is a huge number of hotels_ in Paris, all of which are -unfortunately rather expensive. It should be no problem to reach the sprint -venue from anywhere in Paris using public transportation. -Other links to `hotels en France`_ sites : +unfortunately rather expensive. Here is another site for `hotels en France`_. +It should be no problem to reach the sprint venue from anywhere in Paris using public transportation. However it would be better to find accomodation in the southern districts (14th, 15th) or in the nearby towns (Malakoff, Vanves) not far from Paris. Another good choice is to find an hotel near one of the metro line above. (avoid long bus trips as it takes @@ -56,20 +55,22 @@ As an alternative you could also try to group (4 or 5 people) and try to rent a furnished appartment for the week. Almost all announces from French websites come -from `lodgis agency`_. They are rather expensive but most of the time it can be cheaper +from `Lodgis agency`_. They are rather expensive but most of the time it can be cheaper than hotels. For the more adventurous you can look on the french site pap_. 
(select appartements, d?partement=75, arrondissements=15,14,6) For transportation we strongly recommend the public transportation (metro and buses). There are weekly tickets (monday to sunday) from 15.70 euros (Zone 1-2). You need a -picture (ID format) and ask for a "Carte Orange". +picture (ID format) and ask for a *Carte Orange*. Don't get the tourist's cards which are more expensive and valid only for 5 days. +Also note that zone 1-2 won't pay for your ticket from the airport and is valid only (mostly) +inside Paris. .. _`See here`: http://www.logilab.fr/contact.html .. _hotels: http://www.hotel-paris.net/index-hotels-English.htm .. _`hotels en France`: http://www.0800paris-hotels.com/ .. _pap: http://www.pap.fr/immobilier/offres/offre-location-vacances.asp -.. _`lodgis agency`: http://www.lodgis.com/ +.. _`Lodgis agency`: http://www.lodgis.com/ Exact times ----------- @@ -88,8 +89,8 @@ ethernet cables for those who don't have one. -Registration etc.pp. --------------------- +Registration etc. +----------------- Please subscribe to the `PyPy sprint mailing list`_, introduce yourself and post a note that you want to come. From ale at codespeak.net Mon Sep 5 15:07:39 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Mon, 5 Sep 2005 15:07:39 +0200 (CEST) Subject: [pypy-svn] r17232 - pypy/dist/pypy/module/_codecs Message-ID: <20050905130739.76C0127B4D@code1.codespeak.net> Author: ale Date: Mon Sep 5 15:07:38 2005 New Revision: 17232 Modified: pypy/dist/pypy/module/_codecs/app_codecs.py Log: test_codecs passes again Modified: pypy/dist/pypy/module/_codecs/app_codecs.py ============================================================================== --- pypy/dist/pypy/module/_codecs/app_codecs.py (original) +++ pypy/dist/pypy/module/_codecs/app_codecs.py Mon Sep 5 15:07:38 2005 @@ -296,7 +296,7 @@ consumed = len(data) if final: consumed = 0 - res, consumed, byteorder = PyUnicode_DecodeUTF16Stateful(data, len(data), errors, bm, consumed) + res, consumed, byteorder = PyUnicode_DecodeUTF16Stateful(data, len(data), errors, bm, final) res = ''.join(res) return res, consumed, byteorder @@ -431,7 +431,7 @@ consumed = len(data) if final: consumed = 0 - res, consumed, byteorder = PyUnicode_DecodeUTF16Stateful(data, len(data), errors, 'little', consumed) + res, consumed, byteorder = PyUnicode_DecodeUTF16Stateful(data, len(data), errors, 'little', final) res = u''.join(res) return res, consumed @@ -441,7 +441,7 @@ consumed = len(data) if final: consumed = 0 - res, consumed, byteorder = PyUnicode_DecodeUTF16Stateful(data, len(data), errors, 'big', consumed) + res, consumed, byteorder = PyUnicode_DecodeUTF16Stateful(data, len(data), errors, 'big', final) res = u''.join(res) return res, consumed From ludal at codespeak.net Mon Sep 5 15:14:17 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Mon, 5 Sep 2005 15:14:17 +0200 (CEST) Subject: [pypy-svn] r17234 - pypy/extradoc/sprintinfo Message-ID: <20050905131417.8591427B59@code1.codespeak.net> Author: ludal Date: Mon Sep 5 15:14:15 2005 New Revision: 17234 Added: pypy/extradoc/sprintinfo/paris-people.txt (contents, props changed) Log: took the list of previous sprint as a starting point Added: pypy/extradoc/sprintinfo/paris-people.txt ============================================================================== --- (empty file) +++ pypy/extradoc/sprintinfo/paris-people.txt Mon Sep 5 15:14:15 2005 @@ -0,0 +1,36 @@ + +.. 
_`Paris Sprint`: http://codespeak.net/pypy/index.cgi?extradoc/sprintinfo/Paris-2005-sprint.html + +People coming to the `Paris Sprint`_ +================================================== + +People who have a ``?`` in their arrive/depart or accomodation +column are known to be coming but there are no details +available yet from them. + +=================== ============== ===================== + Name Arrive/Depart Accomodation +=================== ============== ===================== +Ludovic Aubry 10/10 - 16/10 Private +Adrien Di Mascio 10/10 - 16/10 Private +Jacob Hallen ? ? +Laura Creighton ? ? +Beatrice Duering ? ? +Armin Rigo ? ? +Samuele Pedroni ? ? +Holger Krekel ? ? +=================== ============= ===================== + +People on the following list are likely to come and were +present at the previous sprint +=================== ============== ===================== + Name Arrive/Depart Accomodation +=================== ============= ===================== +Carl Friedrich Bolz ? ? +Niklaus Haldimann ? ? +Eric van Riet Paap ? ? +Richard Emslie ? ? +Anders Chrigstroem ? ? +Christian Tismer ? ? +Anders Lehmann ? ? +=================== ============== ===================== From pedronis at codespeak.net Mon Sep 5 15:30:16 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 5 Sep 2005 15:30:16 +0200 (CEST) Subject: [pypy-svn] r17235 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050905133016.EB66B27B5A@code1.codespeak.net> Author: pedronis Date: Mon Sep 5 15:30:15 2005 New Revision: 17235 Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py pypy/dist/pypy/interpreter/astcompiler/ast.txt Log: changed info, use 0/1 type consistently for varargs and kwargs Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.py Mon Sep 5 15:30:15 2005 @@ -654,7 +654,7 @@ self.flags = flags self.doc = doc self.code = code - self.varargs = self.kwargs = None + self.varargs = self.kwargs = 0 if flags & CO_VARARGS: self.varargs = 1 if flags & CO_VARKEYWORDS: @@ -692,7 +692,7 @@ Node.__init__(self, lineno) self.code = code self.argnames = [AssName('[outmost-iterable]', OP_ASSIGN)] - self.varargs = self.kwargs = None + self.varargs = self.kwargs = 0 @@ -897,7 +897,7 @@ self.defaults = defaults self.flags = flags self.code = code - self.varargs = self.kwargs = None + self.varargs = self.kwargs = 0 if flags & CO_VARARGS: self.varargs = 1 if flags & CO_VARKEYWORDS: Modified: pypy/dist/pypy/interpreter/astcompiler/ast.txt ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.txt (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.txt Mon Sep 5 15:30:15 2005 @@ -81,14 +81,14 @@ Invert: expr init(Function): - self.varargs = self.kwargs = None + self.varargs = self.kwargs = 0 if flags & CO_VARARGS: self.varargs = 1 if flags & CO_VARKEYWORDS: self.kwargs = 1 init(Lambda): - self.varargs = self.kwargs = None + self.varargs = self.kwargs = 0 if flags & CO_VARARGS: self.varargs = 1 if flags & CO_VARKEYWORDS: @@ -96,7 +96,7 @@ init(GenExpr): self.argnames = [AssName('[outmost-iterable]', OP_ASSIGN)] - self.varargs = self.kwargs = None + self.varargs = self.kwargs = 0 init(GenExprFor): self.is_outmost = False From adim at codespeak.net Mon Sep 5 15:42:04 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Mon, 5 Sep 
2005 15:42:04 +0200 (CEST) Subject: [pypy-svn] r17236 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050905134204.1800C27B60@code1.codespeak.net> Author: adim Date: Mon Sep 5 15:42:02 2005 New Revision: 17236 Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py pypy/dist/pypy/interpreter/astcompiler/astgen.py pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Log: set default lineno value to -1 rather than None because union of None and numbers is not supported Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.py Mon Sep 5 15:42:02 2005 @@ -23,7 +23,7 @@ class Node(Wrappable): """Abstract base class for ast nodes.""" - def __init__(self, lineno = None): + def __init__(self, lineno=-1): self.lineno = lineno self.filename = "" @@ -73,7 +73,7 @@ return visitor.visitExpression(self) class Add(Node): - def __init__(self, (left, right), lineno=None): + def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) self.left = left self.right = right @@ -91,7 +91,7 @@ return visitor.visitAdd(self) class And(Node): - def __init__(self, nodes, lineno=None): + def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) self.nodes = nodes @@ -110,7 +110,7 @@ return visitor.visitAnd(self) class AssAttr(Node): - def __init__(self, expr, attrname, flags, lineno=None): + def __init__(self, expr, attrname, flags, lineno=-1): Node.__init__(self, lineno) self.expr = expr self.attrname = attrname @@ -129,7 +129,7 @@ return visitor.visitAssAttr(self) class AssList(Node): - def __init__(self, nodes, lineno=None): + def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) self.nodes = nodes @@ -148,7 +148,7 @@ return visitor.visitAssList(self) class AssName(Node): - def __init__(self, name, flags, lineno=None): + def __init__(self, name, flags, lineno=-1): Node.__init__(self, lineno) self.name = name self.flags = flags @@ -166,7 +166,7 @@ return visitor.visitAssName(self) class AssTuple(Node): - def __init__(self, nodes, lineno=None): + def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) self.nodes = nodes @@ -185,7 +185,7 @@ return visitor.visitAssTuple(self) class Assert(Node): - def __init__(self, test, fail, lineno=None): + def __init__(self, test, fail, lineno=-1): Node.__init__(self, lineno) self.test = test self.fail = fail @@ -210,7 +210,7 @@ return visitor.visitAssert(self) class Assign(Node): - def __init__(self, nodes, expr, lineno=None): + def __init__(self, nodes, expr, lineno=-1): Node.__init__(self, lineno) self.nodes = nodes self.expr = expr @@ -234,7 +234,7 @@ return visitor.visitAssign(self) class AugAssign(Node): - def __init__(self, node, op, expr, lineno=None): + def __init__(self, node, op, expr, lineno=-1): Node.__init__(self, lineno) self.node = node self.op = op @@ -253,7 +253,7 @@ return visitor.visitAugAssign(self) class Backquote(Node): - def __init__(self, expr, lineno=None): + def __init__(self, expr, lineno=-1): Node.__init__(self, lineno) self.expr = expr @@ -270,7 +270,7 @@ return visitor.visitBackquote(self) class Bitand(Node): - def __init__(self, nodes, lineno=None): + def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) self.nodes = nodes @@ -289,7 +289,7 @@ return visitor.visitBitand(self) class Bitor(Node): - def __init__(self, nodes, lineno=None): + def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) 
self.nodes = nodes @@ -308,7 +308,7 @@ return visitor.visitBitor(self) class Bitxor(Node): - def __init__(self, nodes, lineno=None): + def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) self.nodes = nodes @@ -327,7 +327,7 @@ return visitor.visitBitxor(self) class Break(Node): - def __init__(self, lineno=None): + def __init__(self, lineno=-1): Node.__init__(self, lineno) def getChildren(self): @@ -343,7 +343,7 @@ return visitor.visitBreak(self) class CallFunc(Node): - def __init__(self, node, args, star_args = None, dstar_args = None, lineno=None): + def __init__(self, node, args, star_args = None, dstar_args = None, lineno=-1): Node.__init__(self, lineno) self.node = node self.args = args @@ -375,7 +375,7 @@ return visitor.visitCallFunc(self) class Class(Node): - def __init__(self, name, bases, doc, code, lineno=None): + def __init__(self, name, bases, doc, code, lineno=-1): Node.__init__(self, lineno) self.name = name self.bases = bases @@ -403,7 +403,7 @@ return visitor.visitClass(self) class Compare(Node): - def __init__(self, expr, ops, lineno=None): + def __init__(self, expr, ops, lineno=-1): Node.__init__(self, lineno) self.expr = expr self.ops = ops @@ -427,7 +427,7 @@ return visitor.visitCompare(self) class Const(Node): - def __init__(self, value, lineno=None): + def __init__(self, value, lineno=-1): Node.__init__(self, lineno) self.value = value @@ -444,7 +444,7 @@ return visitor.visitConst(self) class Continue(Node): - def __init__(self, lineno=None): + def __init__(self, lineno=-1): Node.__init__(self, lineno) def getChildren(self): @@ -460,7 +460,7 @@ return visitor.visitContinue(self) class Decorators(Node): - def __init__(self, nodes, lineno=None): + def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) self.nodes = nodes @@ -479,7 +479,7 @@ return visitor.visitDecorators(self) class Dict(Node): - def __init__(self, items, lineno=None): + def __init__(self, items, lineno=-1): Node.__init__(self, lineno) self.items = items @@ -498,7 +498,7 @@ return visitor.visitDict(self) class Discard(Node): - def __init__(self, expr, lineno=None): + def __init__(self, expr, lineno=-1): Node.__init__(self, lineno) self.expr = expr @@ -515,7 +515,7 @@ return visitor.visitDiscard(self) class Div(Node): - def __init__(self, (left, right), lineno=None): + def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) self.left = left self.right = right @@ -533,7 +533,7 @@ return visitor.visitDiv(self) class Ellipsis(Node): - def __init__(self, lineno=None): + def __init__(self, lineno=-1): Node.__init__(self, lineno) def getChildren(self): @@ -549,7 +549,7 @@ return visitor.visitEllipsis(self) class Exec(Node): - def __init__(self, expr, locals, globals, lineno=None): + def __init__(self, expr, locals, globals, lineno=-1): Node.__init__(self, lineno) self.expr = expr self.locals = locals @@ -578,7 +578,7 @@ return visitor.visitExec(self) class FloorDiv(Node): - def __init__(self, (left, right), lineno=None): + def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) self.left = left self.right = right @@ -596,7 +596,7 @@ return visitor.visitFloorDiv(self) class For(Node): - def __init__(self, assign, list, body, else_, lineno=None): + def __init__(self, assign, list, body, else_, lineno=-1): Node.__init__(self, lineno) self.assign = assign self.list = list @@ -627,7 +627,7 @@ return visitor.visitFor(self) class From(Node): - def __init__(self, modname, names, lineno=None): + def __init__(self, modname, names, lineno=-1): Node.__init__(self, 
lineno) self.modname = modname self.names = names @@ -645,7 +645,7 @@ return visitor.visitFrom(self) class Function(Node): - def __init__(self, decorators, name, argnames, defaults, flags, doc, code, lineno=None): + def __init__(self, decorators, name, argnames, defaults, flags, doc, code, lineno=-1): Node.__init__(self, lineno) self.decorators = decorators self.name = name @@ -688,7 +688,7 @@ return visitor.visitFunction(self) class GenExpr(Node): - def __init__(self, code, lineno=None): + def __init__(self, code, lineno=-1): Node.__init__(self, lineno) self.code = code self.argnames = [AssName('[outmost-iterable]', OP_ASSIGN)] @@ -709,7 +709,7 @@ return visitor.visitGenExpr(self) class GenExprFor(Node): - def __init__(self, assign, iter, ifs, lineno=None): + def __init__(self, assign, iter, ifs, lineno=-1): Node.__init__(self, lineno) self.assign = assign self.iter = iter @@ -738,7 +738,7 @@ return visitor.visitGenExprFor(self) class GenExprIf(Node): - def __init__(self, test, lineno=None): + def __init__(self, test, lineno=-1): Node.__init__(self, lineno) self.test = test @@ -755,7 +755,7 @@ return visitor.visitGenExprIf(self) class GenExprInner(Node): - def __init__(self, expr, quals, lineno=None): + def __init__(self, expr, quals, lineno=-1): Node.__init__(self, lineno) self.expr = expr self.quals = quals @@ -779,7 +779,7 @@ return visitor.visitGenExprInner(self) class Getattr(Node): - def __init__(self, expr, attrname, lineno=None): + def __init__(self, expr, attrname, lineno=-1): Node.__init__(self, lineno) self.expr = expr self.attrname = attrname @@ -797,7 +797,7 @@ return visitor.visitGetattr(self) class Global(Node): - def __init__(self, names, lineno=None): + def __init__(self, names, lineno=-1): Node.__init__(self, lineno) self.names = names @@ -814,7 +814,7 @@ return visitor.visitGlobal(self) class If(Node): - def __init__(self, tests, else_, lineno=None): + def __init__(self, tests, else_, lineno=-1): Node.__init__(self, lineno) self.tests = tests self.else_ = else_ @@ -839,7 +839,7 @@ return visitor.visitIf(self) class Import(Node): - def __init__(self, names, lineno=None): + def __init__(self, names, lineno=-1): Node.__init__(self, lineno) self.names = names @@ -856,7 +856,7 @@ return visitor.visitImport(self) class Invert(Node): - def __init__(self, expr, lineno=None): + def __init__(self, expr, lineno=-1): Node.__init__(self, lineno) self.expr = expr @@ -873,7 +873,7 @@ return visitor.visitInvert(self) class Keyword(Node): - def __init__(self, name, expr, lineno=None): + def __init__(self, name, expr, lineno=-1): Node.__init__(self, lineno) self.name = name self.expr = expr @@ -891,7 +891,7 @@ return visitor.visitKeyword(self) class Lambda(Node): - def __init__(self, argnames, defaults, flags, code, lineno=None): + def __init__(self, argnames, defaults, flags, code, lineno=-1): Node.__init__(self, lineno) self.argnames = argnames self.defaults = defaults @@ -926,7 +926,7 @@ return visitor.visitLambda(self) class LeftShift(Node): - def __init__(self, (left, right), lineno=None): + def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) self.left = left self.right = right @@ -944,7 +944,7 @@ return visitor.visitLeftShift(self) class List(Node): - def __init__(self, nodes, lineno=None): + def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) self.nodes = nodes @@ -963,7 +963,7 @@ return visitor.visitList(self) class ListComp(Node): - def __init__(self, expr, quals, lineno=None): + def __init__(self, expr, quals, lineno=-1): Node.__init__(self, 
lineno) self.expr = expr self.quals = quals @@ -987,7 +987,7 @@ return visitor.visitListComp(self) class ListCompFor(Node): - def __init__(self, assign, list, ifs, lineno=None): + def __init__(self, assign, list, ifs, lineno=-1): Node.__init__(self, lineno) self.assign = assign self.list = list @@ -1014,7 +1014,7 @@ return visitor.visitListCompFor(self) class ListCompIf(Node): - def __init__(self, test, lineno=None): + def __init__(self, test, lineno=-1): Node.__init__(self, lineno) self.test = test @@ -1031,7 +1031,7 @@ return visitor.visitListCompIf(self) class Mod(Node): - def __init__(self, (left, right), lineno=None): + def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) self.left = left self.right = right @@ -1049,7 +1049,7 @@ return visitor.visitMod(self) class Module(Node): - def __init__(self, doc, node, lineno=None): + def __init__(self, doc, node, lineno=-1): Node.__init__(self, lineno) self.doc = doc self.node = node @@ -1067,7 +1067,7 @@ return visitor.visitModule(self) class Mul(Node): - def __init__(self, (left, right), lineno=None): + def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) self.left = left self.right = right @@ -1085,7 +1085,7 @@ return visitor.visitMul(self) class Name(Node): - def __init__(self, varname, lineno=None): + def __init__(self, varname, lineno=-1): Node.__init__(self, lineno) self.varname = varname @@ -1102,7 +1102,7 @@ return visitor.visitName(self) class NoneConst(Node): - def __init__(self, lineno=None): + def __init__(self, lineno=-1): Node.__init__(self, lineno) def getChildren(self): @@ -1118,7 +1118,7 @@ return visitor.visitNoneConst(self) class Not(Node): - def __init__(self, expr, lineno=None): + def __init__(self, expr, lineno=-1): Node.__init__(self, lineno) self.expr = expr @@ -1135,7 +1135,7 @@ return visitor.visitNot(self) class NumberConst(Node): - def __init__(self, number_value, lineno=None): + def __init__(self, number_value, lineno=-1): Node.__init__(self, lineno) self.number_value = number_value @@ -1152,7 +1152,7 @@ return visitor.visitNumberConst(self) class Or(Node): - def __init__(self, nodes, lineno=None): + def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) self.nodes = nodes @@ -1171,7 +1171,7 @@ return visitor.visitOr(self) class Pass(Node): - def __init__(self, lineno=None): + def __init__(self, lineno=-1): Node.__init__(self, lineno) def getChildren(self): @@ -1187,7 +1187,7 @@ return visitor.visitPass(self) class Power(Node): - def __init__(self, (left, right), lineno=None): + def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) self.left = left self.right = right @@ -1205,7 +1205,7 @@ return visitor.visitPower(self) class Print(Node): - def __init__(self, nodes, dest, lineno=None): + def __init__(self, nodes, dest, lineno=-1): Node.__init__(self, lineno) self.nodes = nodes self.dest = dest @@ -1230,7 +1230,7 @@ return visitor.visitPrint(self) class Printnl(Node): - def __init__(self, nodes, dest, lineno=None): + def __init__(self, nodes, dest, lineno=-1): Node.__init__(self, lineno) self.nodes = nodes self.dest = dest @@ -1255,7 +1255,7 @@ return visitor.visitPrintnl(self) class Raise(Node): - def __init__(self, expr1, expr2, expr3, lineno=None): + def __init__(self, expr1, expr2, expr3, lineno=-1): Node.__init__(self, lineno) self.expr1 = expr1 self.expr2 = expr2 @@ -1285,7 +1285,7 @@ return visitor.visitRaise(self) class Return(Node): - def __init__(self, value, lineno=None): + def __init__(self, value, lineno=-1): Node.__init__(self, 
lineno) self.value = value @@ -1302,7 +1302,7 @@ return visitor.visitReturn(self) class RightShift(Node): - def __init__(self, (left, right), lineno=None): + def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) self.left = left self.right = right @@ -1320,7 +1320,7 @@ return visitor.visitRightShift(self) class Slice(Node): - def __init__(self, expr, flags, lower, upper, lineno=None): + def __init__(self, expr, flags, lower, upper, lineno=-1): Node.__init__(self, lineno) self.expr = expr self.flags = flags @@ -1351,7 +1351,7 @@ return visitor.visitSlice(self) class Sliceobj(Node): - def __init__(self, nodes, lineno=None): + def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) self.nodes = nodes @@ -1370,7 +1370,7 @@ return visitor.visitSliceobj(self) class Stmt(Node): - def __init__(self, nodes, lineno=None): + def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) self.nodes = nodes @@ -1389,7 +1389,7 @@ return visitor.visitStmt(self) class StringConst(Node): - def __init__(self, string_value, lineno=None): + def __init__(self, string_value, lineno=-1): Node.__init__(self, lineno) self.string_value = string_value @@ -1406,7 +1406,7 @@ return visitor.visitStringConst(self) class Sub(Node): - def __init__(self, (left, right), lineno=None): + def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) self.left = left self.right = right @@ -1424,7 +1424,7 @@ return visitor.visitSub(self) class Subscript(Node): - def __init__(self, expr, flags, subs, lineno=None): + def __init__(self, expr, flags, subs, lineno=-1): Node.__init__(self, lineno) self.expr = expr self.flags = flags @@ -1450,7 +1450,7 @@ return visitor.visitSubscript(self) class TryExcept(Node): - def __init__(self, body, handlers, else_, lineno=None): + def __init__(self, body, handlers, else_, lineno=-1): Node.__init__(self, lineno) self.body = body self.handlers = handlers @@ -1478,7 +1478,7 @@ return visitor.visitTryExcept(self) class TryFinally(Node): - def __init__(self, body, final, lineno=None): + def __init__(self, body, final, lineno=-1): Node.__init__(self, lineno) self.body = body self.final = final @@ -1496,7 +1496,7 @@ return visitor.visitTryFinally(self) class Tuple(Node): - def __init__(self, nodes, lineno=None): + def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) self.nodes = nodes @@ -1515,7 +1515,7 @@ return visitor.visitTuple(self) class UnaryAdd(Node): - def __init__(self, expr, lineno=None): + def __init__(self, expr, lineno=-1): Node.__init__(self, lineno) self.expr = expr @@ -1532,7 +1532,7 @@ return visitor.visitUnaryAdd(self) class UnarySub(Node): - def __init__(self, expr, lineno=None): + def __init__(self, expr, lineno=-1): Node.__init__(self, lineno) self.expr = expr @@ -1549,7 +1549,7 @@ return visitor.visitUnarySub(self) class While(Node): - def __init__(self, test, body, else_, lineno=None): + def __init__(self, test, body, else_, lineno=-1): Node.__init__(self, lineno) self.test = test self.body = body @@ -1577,7 +1577,7 @@ return visitor.visitWhile(self) class Yield(Node): - def __init__(self, value, lineno=None): + def __init__(self, value, lineno=-1): Node.__init__(self, lineno) self.value = value @@ -1599,7 +1599,7 @@ method in replacement of the former visitor.visit = walker.dispatch It could also use to identify base type for visit arguments of AST nodes """ - + def default(self, node): for child in node.getChildNodes(): child.accept(self) Modified: pypy/dist/pypy/interpreter/astcompiler/astgen.py 
============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/astgen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/astgen.py Mon Sep 5 15:42:02 2005 @@ -121,9 +121,9 @@ else: args = self.parent.args or self.args if args: - print >> buf, " def __init__(self, %s, lineno=None):" % args + print >> buf, " def __init__(self, %s, lineno=-1):" % args else: - print >> buf, " def __init__(self, lineno=None):" + print >> buf, " def __init__(self, lineno=-1):" if self.parent.args: print >> buf, " %s.__init__(self, %s, lineno)" % self.parent.args else: @@ -307,7 +307,7 @@ class Node(Wrappable): """Abstract base class for ast nodes.""" - def __init__(self, lineno = None): + def __init__(self, lineno = -1): self.lineno = lineno self.filename = "" Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Mon Sep 5 15:42:02 2005 @@ -194,7 +194,7 @@ self.checkClass() self.locals = misc.Stack() self.setups = misc.Stack() - self.last_lineno = None + self.last_lineno = -1 self._div_op = "BINARY_DIVIDE" self.genexpr_cont_stack = [] @@ -335,8 +335,8 @@ if node is None: return False lineno = node.lineno - if lineno is not None and (lineno != self.last_lineno - or force): + if lineno != -1 and (lineno != self.last_lineno + or force): self.emitop_int('SET_LINENO', lineno) self.last_lineno = lineno return True From adim at codespeak.net Mon Sep 5 15:53:38 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Mon, 5 Sep 2005 15:53:38 +0200 (CEST) Subject: [pypy-svn] r17237 - pypy/dist/pypy/interpreter/pyparser Message-ID: <20050905135338.B72B227B66@code1.codespeak.net> Author: adim Date: Mon Sep 5 15:53:37 2005 New Revision: 17237 Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py Log: let Node constructor use the default lineno value Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Mon Sep 5 15:53:37 2005 @@ -714,9 +714,9 @@ if len(atoms) > 2: assert False, "return several stmts not implemented" elif len(atoms) == 1: - builder.push(ast.Return(ast.Const(builder.wrap_none()), None)) # XXX lineno + builder.push(ast.Return(ast.Const(builder.wrap_none()))) # XXX lineno else: - builder.push(ast.Return(atoms[1], None)) # XXX lineno + builder.push(ast.Return(atoms[1])) # XXX lineno def build_file_input(builder, nb): stmts = [] From ludal at codespeak.net Mon Sep 5 15:55:58 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Mon, 5 Sep 2005 15:55:58 +0200 (CEST) Subject: [pypy-svn] r17238 - pypy/extradoc/sprintinfo Message-ID: <20050905135558.E3E7227B68@code1.codespeak.net> Author: ludal Date: Mon Sep 5 15:55:57 2005 New Revision: 17238 Modified: pypy/extradoc/sprintinfo/Paris-2005-sprint.txt Log: few last bits, and formatting Modified: pypy/extradoc/sprintinfo/Paris-2005-sprint.txt ============================================================================== --- pypy/extradoc/sprintinfo/Paris-2005-sprint.txt (original) +++ pypy/extradoc/sprintinfo/Paris-2005-sprint.txt Mon Sep 5 15:55:57 2005 @@ -30,8 +30,8 @@ 10 rue Louis Vicat 75015 Paris -We'll provide WLAN connection and there is a small kitchen to make tea and 
coffee. -`See here`_ for a map of the area nearby. +We'll provide WLAN connection and there is a small kitchen to make tea +and coffee. `See here`_ for a map of the area nearby. There are two metro lines and three bus lines that stop close to Logilab's: - Metro line 12 : stop "Porte de Versailles" @@ -44,29 +44,36 @@ There is a huge number of hotels_ in Paris, all of which are unfortunately rather expensive. Here is another site for `hotels en France`_. -It should be no problem to reach the sprint venue from anywhere in Paris using public transportation. -However it would be better to find accomodation in the southern districts (14th, 15th) -or in the nearby towns (Malakoff, Vanves) not far from Paris. Another good choice is -to find an hotel near one of the metro line above. (avoid long bus trips as it takes -a long time.) -Looking at the map here: hotels_, the area you should look at is between (3) and (6), -(5) is still ok. - - -As an alternative you could also try to group (4 or 5 people) and try to rent -a furnished appartment for the week. Almost all announces from French websites come -from `Lodgis agency`_. They are rather expensive but most of the time it can be cheaper -than hotels. For the more adventurous you can look on the french site pap_. (select -appartements, d?partement=75, arrondissements=15,14,6) - -For transportation we strongly recommend the public transportation (metro and buses). -There are weekly tickets (monday to sunday) from 15.70 euros (Zone 1-2). You need a -picture (ID format) and ask for a *Carte Orange*. -Don't get the tourist's cards which are more expensive and valid only for 5 days. -Also note that zone 1-2 won't pay for your ticket from the airport and is valid only (mostly) -inside Paris. +It should be no problem to reach the sprint venue from anywhere in +Paris using public transportation. However it would be better to find +accomodation in the southern districts (14th, 15th) or in the nearby +towns (Malakoff, Vanves) not far from Paris. Another good choice is to +find an hotel near one of the metro line above. (avoid long bus trips +as it takes more time.) + +Looking at the map here: hotels_, the area you should look at is +between (3) and (6), (5) is still ok. + + +As an alternative you could also try to group (4 or 5 people) and try +to rent a furnished appartment for the week. Almost all announces from +French websites come from `Lodgis agency`_. They are rather expensive +but most of the time it can be cheaper than hotels. For the more +adventurous you can look on the french site pap_ which lists rental +ads. (select appartements, d?partement=75, arrondissements=15,14,13) + +For transportation we strongly recommend the public transportation +(metro and buses). There are weekly tickets (monday to sunday) from +15.70 euros (Zone 1-2). You need a picture (ID format) and ask for a +*Carte Orange*. + +Don't get the tourist's cards which are more expensive and valid only +for 5 days. Also note that zone 1-2 won't pay for your ticket from +the airport and is valid only (mostly) inside Paris. If you arrive by +plane this will be most probably at Roissy airport and from there you +can take the RER line B to Paris for about 8 euros. -.. _`See here`: http://www.logilab.fr/contact.html +.. _`See here`: http://www.logilab.com/contact .. _hotels: http://www.hotel-paris.net/index-hotels-English.htm .. _`hotels en France`: http://www.0800paris-hotels.com/ .. 
_pap: http://www.pap.fr/immobilier/offres/offre-location-vacances.asp @@ -75,9 +82,9 @@ Exact times ----------- -The Pypy sprint is held Monday 10th October - Sunday 16th October 2005. -Hours will be from 09:00 until 20:00 or later. It's a good -idea to arrive a day before the sprint starts. +The Pypy sprint is held Monday 10th October - Sunday 16th October +2005. Hours will be from 09:00 until 20:00. It's a good idea to +arrive a day before the sprint starts. Network, Food ------------- From hpk at codespeak.net Mon Sep 5 16:10:47 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Mon, 5 Sep 2005 16:10:47 +0200 (CEST) Subject: [pypy-svn] r17240 - pypy/extradoc/sprintinfo Message-ID: <20050905141047.728CF27B68@code1.codespeak.net> Author: hpk Date: Mon Sep 5 16:10:46 2005 New Revision: 17240 Added: pypy/extradoc/sprintinfo/paris-2005-people.txt - copied, changed from r17239, pypy/extradoc/sprintinfo/paris-people.txt pypy/extradoc/sprintinfo/paris-2005-sprint.txt - copied unchanged from r17239, pypy/extradoc/sprintinfo/Paris-2005-sprint.txt Removed: pypy/extradoc/sprintinfo/Paris-2005-sprint.txt pypy/extradoc/sprintinfo/paris-people.txt Log: unify filenames and fix ReST problems From hpk at codespeak.net Mon Sep 5 16:25:03 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Mon, 5 Sep 2005 16:25:03 +0200 (CEST) Subject: [pypy-svn] r17241 - pypy/extradoc/sprintinfo Message-ID: <20050905142503.596E527B7D@code1.codespeak.net> Author: hpk Date: Mon Sep 5 16:25:02 2005 New Revision: 17241 Modified: pypy/extradoc/sprintinfo/paris-2005-sprint.txt Log: small enhancements/adjustments and links. Modified: pypy/extradoc/sprintinfo/paris-2005-sprint.txt ============================================================================== --- pypy/extradoc/sprintinfo/paris-2005-sprint.txt (original) +++ pypy/extradoc/sprintinfo/paris-2005-sprint.txt Mon Sep 5 16:25:02 2005 @@ -1,11 +1,11 @@ -PyPy Sprint in Heidelberg 10th - 16th October 2005 -================================================== +PyPy Sprint in Paris 10th - 16th October 2005 +========================================================== The next PyPy sprint will take place in Paris at Logilab's office -in France from the 10th to the 16th of October 2005 (both days included). +in France from the 10th to the 16th of October 2005 +(both days fully included). -To learn more about the new PyPy Python implementation -look here: +To learn more about the new PyPy Python implementation look here: http://codespeak.net/pypy @@ -13,14 +13,18 @@ Goals and topics of the sprint ------------------------------ -Now that version 0.7 has been released, it is time for refactoring and planning +Now that pypy-0.7.0_ has been released, it is time for refactoring and planning of the next features. -For these reasons the main topics of this sprint will be on +For these reasons the currently scheduled main topics of this +sprint will be: + - threading and GC - refactoring and translation features - discussing about the upcoming JIT, optimizations and stackless features +Moreover, we are open to work in other areas that participants are +interested in. Please raise topics on the mailinglist_. Location & Accomodation ------------------------ @@ -34,6 +38,7 @@ and coffee. `See here`_ for a map of the area nearby. 
There are two metro lines and three bus lines that stop close to Logilab's: + - Metro line 12 : stop "Porte de Versailles" - Metro line 13 : stop "Malakoff plateau de Vanves" - Bus 89 or 58 : stop "Carrefour Albert Legris" @@ -54,7 +59,6 @@ Looking at the map here: hotels_, the area you should look at is between (3) and (6), (5) is still ok. - As an alternative you could also try to group (4 or 5 people) and try to rent a furnished appartment for the week. Almost all announces from French websites come from `Lodgis agency`_. They are rather expensive @@ -82,33 +86,36 @@ Exact times ----------- -The Pypy sprint is held Monday 10th October - Sunday 16th October +The PyPy sprint is held Monday 10th October - Sunday 16th October 2005. Hours will be from 09:00 until 20:00. It's a good idea to -arrive a day before the sprint starts. +arrive a day before the sprint starts and leave a day later. +In the middle of the sprint there usually is a break day +and it's usually ok to take half-days off if you feel like it. Network, Food ------------- -There are plenty of restaurants and sandwich places around and a big supermarket for those -who would rather make their own food. +There are plenty of restaurants and sandwich places around and a big supermarket +for those who would rather make their own food. -You will probably need a wireless network card to access the network but we can provide -ethernet cables for those who don't have one. +You will probably need a wireless network card to access the network but we can +provide ethernet cables for those who don't have one. Registration etc. ----------------- Please subscribe to the `PyPy sprint mailing list`_, introduce -yourself and post a note that you want to come. +yourself and post a note that you want to come. Feel free to ask any questions there! There also is a separate `Paris people`_ page tracking who is already thought to come. If you have commit rights on codespeak then you can modify yourself a checkout of - http://codespeak.net/svn/pypy/extradoc/sprintinfo/paris-people.txt + http://codespeak.net/svn/pypy/extradoc/sprintinfo/paris-2005-people.txt .. _`Paris people`: http://codespeak.net/pypy/index.cgi?extradoc/sprintinfo/paris-people.html +.. _`mailinglist`: .. _`PyPy sprint mailing list`: http://codespeak.net/mailman/listinfo/pypy-sprint - +.. _`pypy-0.7.0`: http://codespeak.net/pypy/dist/pypy/doc/release-0.7.0.html From hpk at codespeak.net Mon Sep 5 16:43:12 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Mon, 5 Sep 2005 16:43:12 +0200 (CEST) Subject: [pypy-svn] r17242 - pypy/extradoc/sprintinfo Message-ID: <20050905144312.642EE27B73@code1.codespeak.net> Author: hpk Date: Mon Sep 5 16:43:11 2005 New Revision: 17242 Modified: pypy/extradoc/sprintinfo/paris-2005-sprint.txt Log: added two links, converted to UTF8 Modified: pypy/extradoc/sprintinfo/paris-2005-sprint.txt ============================================================================== --- pypy/extradoc/sprintinfo/paris-2005-sprint.txt (original) +++ pypy/extradoc/sprintinfo/paris-2005-sprint.txt Mon Sep 5 16:43:11 2005 @@ -21,7 +21,8 @@ - threading and GC - refactoring and translation features - - discussing about the upcoming JIT, optimizations and stackless features + - discussing/experimenting towards JIT_ and `continuation-passing`_ + style translation Moreover, we are open to work in other areas that participants are interested in. Please raise topics on the mailinglist_. @@ -64,7 +65,7 @@ French websites come from `Lodgis agency`_. 
They are rather expensive but most of the time it can be cheaper than hotels. For the more adventurous you can look on the french site pap_ which lists rental -ads. (select appartements, d?partement=75, arrondissements=15,14,13) +ads. (select appartements, d??partement=75, arrondissements=15,14,13) For transportation we strongly recommend the public transportation (metro and buses). There are weekly tickets (monday to sunday) from @@ -119,3 +120,5 @@ .. _`mailinglist`: .. _`PyPy sprint mailing list`: http://codespeak.net/mailman/listinfo/pypy-sprint .. _`pypy-0.7.0`: http://codespeak.net/pypy/dist/pypy/doc/release-0.7.0.html +.. _JIT: http://en.wikipedia.org/wiki/Just-in-time_compilation +.. _`continuation-passing`: http://en.wikipedia.org/wiki/Continuation_passing_style From hpk at codespeak.net Mon Sep 5 16:44:25 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Mon, 5 Sep 2005 16:44:25 +0200 (CEST) Subject: [pypy-svn] r17243 - pypy/extradoc/sprintinfo Message-ID: <20050905144425.1FE4A27B85@code1.codespeak.net> Author: hpk Date: Mon Sep 5 16:44:24 2005 New Revision: 17243 Modified: pypy/extradoc/sprintinfo/template_sprintreports_20050827.txt Log: converted to UTF8 encoding Modified: pypy/extradoc/sprintinfo/template_sprintreports_20050827.txt ============================================================================== --- pypy/extradoc/sprintinfo/template_sprintreports_20050827.txt (original) +++ pypy/extradoc/sprintinfo/template_sprintreports_20050827.txt Mon Sep 5 16:44:24 2005 @@ -1,5 +1,5 @@ Template for sprint reports -Author: Beatrice D?ring +Author: Beatrice D??ring Date:20050827 Version:1.0 From hpk at codespeak.net Mon Sep 5 16:45:42 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Mon, 5 Sep 2005 16:45:42 +0200 (CEST) Subject: [pypy-svn] r17244 - pypy/extradoc/sprintinfo Message-ID: <20050905144542.4720B27B8C@code1.codespeak.net> Author: hpk Date: Mon Sep 5 16:45:41 2005 New Revision: 17244 Modified: pypy/extradoc/sprintinfo/pycon_sprint_report.txt Log: converted to UTF8 encoding Modified: pypy/extradoc/sprintinfo/pycon_sprint_report.txt ============================================================================== --- pypy/extradoc/sprintinfo/pycon_sprint_report.txt (original) +++ pypy/extradoc/sprintinfo/pycon_sprint_report.txt Mon Sep 5 16:45:41 2005 @@ -10,14 +10,14 @@ Participants at the Pypy sprint were: Michael Chermside -Anders Chrigstr?m +Anders Chrigstr??m Brian Dorsey Richard Emslie -Jacob Hall?n +Jacob Hall??n Holger Krekel Alex Martelli Alan Mcintyre -Lutz P?like +Lutz P??like Samuele Pedroni Jonathan Riehl Armin Rigo @@ -117,11 +117,11 @@ Python ======================================================================= Holger Krekel -Jacob Hall?n +Jacob Hall??n Armin Rigo Samuele Pedroni Christian Tismer -Anders Chrigstr?m +Anders Chrigstr??m Brian Dorsey Guido van Rossum (CPython) Jim Hugunin (IronPython) @@ -156,7 +156,7 @@ the next few weeks. We should go ahead with the Oxford sprint. Holger wants to do a "private" -sprint in G?teborg a week before Europython. +sprint in G??teborg a week before Europython. 
Armin - will evaluate different alternatives for the translator and write a report From hpk at codespeak.net Mon Sep 5 16:47:08 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Mon, 5 Sep 2005 16:47:08 +0200 (CEST) Subject: [pypy-svn] r17245 - pypy/dist/pypy/doc Message-ID: <20050905144708.961BD27B8D@code1.codespeak.net> Author: hpk Date: Mon Sep 5 16:47:07 2005 New Revision: 17245 Modified: pypy/dist/pypy/doc/news.txt Log: - updated the Heidelberg sprint item, point it to my heidelberg sprint report (reviewers of that report welcome) - inserted a news item for the Paris sprint. Modified: pypy/dist/pypy/doc/news.txt ============================================================================== --- pypy/dist/pypy/doc/news.txt (original) +++ pypy/dist/pypy/doc/news.txt Mon Sep 5 16:47:07 2005 @@ -9,6 +9,20 @@ .. _Python: http://www.python.org/doc/current/ref/ref.html .. _`more...`: http://codespeak.net/pypy/dist/pypy/doc/architecture.html#mission-statement +Next PyPy Sprint in Paris 10th-16th October 2005 +========================================================== + +Our next sprint is going to focus on threads/GC, various +refactorings as well as JIT_ and `continuation-passing`_ style +translation of PyPy's Python interpreter. The `Paris Announcement`_ +details times and logistics as well as the registration procedure. +The number of people who can attend is somewhat limited so it's +good if you signal your interest as early as possible. + +.. _`Paris Announcement`: http://codespeak.net/pypy/extradoc/sprintinfo/paris-2005-sprint.html +.. _JIT: http://en.wikipedia.org/wiki/Just-in-time_compilation +.. _`continuation-passing`: http://en.wikipedia.org/wiki/Continuation_passing_style + PyPy release 0.7.0 =================== @@ -19,21 +33,25 @@ the `getting started`_ document for instructions about downloading it and trying it out. We also have the beginning of a FAQ_. *(08/28/2005)* +.. _`pypy-0.7.0`: .. _`release announcement`: http://codespeak.net/pypy/dist/pypy/doc/release-0.7.0.html .. _`getting started`: http://codespeak.net/pypy/dist/pypy/doc/getting-started.html .. _FAQ: http://codespeak.net/pypy/dist/pypy/doc/faq.html -Ongoing: PyPy Sprint in Heidelberg 22nd-29th August 2005 +PyPy Sprint in Heidelberg 22nd-29th August 2005 ========================================================== -The current `PyPy sprint`_ takes place at the Heidelberg University +The last `PyPy sprint`_ took place at the Heidelberg University in Germany from 22nd August to 29th August (both days included). Its main focus is translation of the whole PyPy interpreter to a low level language and reaching 2.4.1 Python compliancy. The goal of the sprint is to release a first self-contained -PyPy-0.7 version. Carl has written a report about `day 1 - 3`_ -and there are also some pictures_ online. +PyPy-0.7 version. Carl has written a report about `day 1 - 3`_, +there are some pictures_ online and a `heidelberg summary report`_ +detailing some of the works that led to the successful release +of `pypy-0.7.0`_! +.. _`heidelberg summary report`: http://codespeak.net/pypy/extradoc/sprintinfo/Heidelberg-report.html .. _`PyPy sprint`: http://codespeak.net/pypy/extradoc/sprintinfo/Heidelberg-sprint.html .. _`day 1 - 3`: http://codespeak.net/pipermail/pypy-dev/2005q3/002287.html .. 
_pictures: http://codespeak.net/~hpk/heidelberg-sprint/ From adim at codespeak.net Mon Sep 5 16:48:14 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Mon, 5 Sep 2005 16:48:14 +0200 (CEST) Subject: [pypy-svn] r17246 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050905144814.488EE27B93@code1.codespeak.net> Author: adim Date: Mon Sep 5 16:48:13 2005 New Revision: 17246 Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py Log: first step to remove flatten_nodes() calls Still to do : - the same kind of transformations for .tests - change nodelist.extend(flatten_nodes(xxx)) into nodelist.extend(xxx) - make appopriate changes in astgen.py / ast.txt to reflect these changes Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.py Mon Sep 5 16:48:13 2005 @@ -406,6 +406,7 @@ def __init__(self, expr, ops, lineno=-1): Node.__init__(self, lineno) self.expr = expr + # ops is a list of couples (op_name, node) self.ops = ops def getChildren(self): @@ -417,7 +418,10 @@ def getChildNodes(self): nodelist = [] nodelist.append(self.expr) - nodelist.extend(flatten_nodes(self.ops)) + # this is a replacement of flatten_nodes + for op_name, node in self.ops: + nodelist.append(node) + # nodelist.extend(flatten_nodes(self.ops)) return nodelist def __repr__(self): @@ -481,6 +485,7 @@ class Dict(Node): def __init__(self, items, lineno=-1): Node.__init__(self, lineno) + # items is a list of couples (node (key), node (value)) self.items = items def getChildren(self): @@ -488,7 +493,11 @@ def getChildNodes(self): nodelist = [] - nodelist.extend(flatten_nodes(self.items)) + # replacement for flatten_nodes() + for key, value in self.items: + nodelist.append(key) + nodelist.append(value) + # nodelist.extend(flatten_nodes(self.items)) return nodelist def __repr__(self): @@ -1453,6 +1462,7 @@ def __init__(self, body, handlers, else_, lineno=-1): Node.__init__(self, lineno) self.body = body + # handlers is a list of triplets (expr1, expr2, body) self.handlers = handlers self.else_ = else_ @@ -1466,7 +1476,14 @@ def getChildNodes(self): nodelist = [] nodelist.append(self.body) - nodelist.extend(flatten_nodes(self.handlers)) + # replacement for flatten_nodes(self.handlers) + for expr1, expr2, body in self.handlers: + if expr1 is not None: + nodelist.append(expr1) + if expr2 is not None: + nodelist.append(expr2) + if body is not None: + nodelist.append(body) if self.else_ is not None: nodelist.append(self.else_) return nodelist From pedronis at codespeak.net Mon Sep 5 18:40:36 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 5 Sep 2005 18:40:36 +0200 (CEST) Subject: [pypy-svn] r17248 - in pypy/dist/pypy/interpreter: astcompiler pyparser/test Message-ID: <20050905164036.293E027B51@code1.codespeak.net> Author: pedronis Date: Mon Sep 5 18:40:34 2005 New Revision: 17248 Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py pypy/dist/pypy/interpreter/astcompiler/ast.txt pypy/dist/pypy/interpreter/astcompiler/astgen.py pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py Log: don't use a generic type-problematic flatten_nodes for getChildNodes, put special cases treatment specs into ast.txt itself, flatten_nodes(CLASS.attr): add to nodelist added code in astgen.py to support this. 
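(In other words: ast.txt can now carry a per-attribute spec of the form

    flatten_nodes(Compare.ops):
        # ops is a list of couples (op_name, node)
        for op_name, node in self.ops:
            nodelist.append(node)

and astgen.py inlines that body into the generated Compare.getChildNodes()
in place of the old flatten_nodes() call; the full set of spec entries is
in the ast.txt and astgen.py hunks below.)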
Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.py Mon Sep 5 18:40:34 2005 @@ -16,14 +16,14 @@ l.append(elt) return l -def flatten_nodes(list): - return [n for n in flatten(list) if isinstance(n, Node)] +#def flatten_nodes(list): +# return [n for n in flatten(list) if isinstance(n, Node)] nodes = {} class Node(Wrappable): """Abstract base class for ast nodes.""" - def __init__(self, lineno=-1): + def __init__(self, lineno = -1): self.lineno = lineno self.filename = "" @@ -79,6 +79,7 @@ self.right = right def getChildren(self): + "NOT_RPYTHON" return self.left, self.right def getChildNodes(self): @@ -96,11 +97,12 @@ self.nodes = nodes def getChildren(self): + "NOT_RPYTHON" return tuple(flatten(self.nodes)) def getChildNodes(self): nodelist = [] - nodelist.extend(flatten_nodes(self.nodes)) + nodelist.extend(self.nodes) return nodelist def __repr__(self): @@ -117,6 +119,7 @@ self.flags = flags def getChildren(self): + "NOT_RPYTHON" return self.expr, self.attrname, self.flags def getChildNodes(self): @@ -134,11 +137,12 @@ self.nodes = nodes def getChildren(self): + "NOT_RPYTHON" return tuple(flatten(self.nodes)) def getChildNodes(self): nodelist = [] - nodelist.extend(flatten_nodes(self.nodes)) + nodelist.extend(self.nodes) return nodelist def __repr__(self): @@ -154,6 +158,7 @@ self.flags = flags def getChildren(self): + "NOT_RPYTHON" return self.name, self.flags def getChildNodes(self): @@ -171,11 +176,12 @@ self.nodes = nodes def getChildren(self): + "NOT_RPYTHON" return tuple(flatten(self.nodes)) def getChildNodes(self): nodelist = [] - nodelist.extend(flatten_nodes(self.nodes)) + nodelist.extend(self.nodes) return nodelist def __repr__(self): @@ -191,6 +197,7 @@ self.fail = fail def getChildren(self): + "NOT_RPYTHON" children = [] children.append(self.test) children.append(self.fail) @@ -216,6 +223,7 @@ self.expr = expr def getChildren(self): + "NOT_RPYTHON" children = [] children.extend(flatten(self.nodes)) children.append(self.expr) @@ -223,7 +231,7 @@ def getChildNodes(self): nodelist = [] - nodelist.extend(flatten_nodes(self.nodes)) + nodelist.extend(self.nodes) nodelist.append(self.expr) return nodelist @@ -241,6 +249,7 @@ self.expr = expr def getChildren(self): + "NOT_RPYTHON" return self.node, self.op, self.expr def getChildNodes(self): @@ -258,6 +267,7 @@ self.expr = expr def getChildren(self): + "NOT_RPYTHON" return self.expr, def getChildNodes(self): @@ -275,11 +285,12 @@ self.nodes = nodes def getChildren(self): + "NOT_RPYTHON" return tuple(flatten(self.nodes)) def getChildNodes(self): nodelist = [] - nodelist.extend(flatten_nodes(self.nodes)) + nodelist.extend(self.nodes) return nodelist def __repr__(self): @@ -294,11 +305,12 @@ self.nodes = nodes def getChildren(self): + "NOT_RPYTHON" return tuple(flatten(self.nodes)) def getChildNodes(self): nodelist = [] - nodelist.extend(flatten_nodes(self.nodes)) + nodelist.extend(self.nodes) return nodelist def __repr__(self): @@ -313,11 +325,12 @@ self.nodes = nodes def getChildren(self): + "NOT_RPYTHON" return tuple(flatten(self.nodes)) def getChildNodes(self): nodelist = [] - nodelist.extend(flatten_nodes(self.nodes)) + nodelist.extend(self.nodes) return nodelist def __repr__(self): @@ -331,6 +344,7 @@ Node.__init__(self, lineno) def getChildren(self): + "NOT_RPYTHON" return [] def getChildNodes(self): @@ -351,6 +365,7 @@ self.dstar_args = 
dstar_args def getChildren(self): + "NOT_RPYTHON" children = [] children.append(self.node) children.extend(flatten(self.args)) @@ -361,7 +376,7 @@ def getChildNodes(self): nodelist = [] nodelist.append(self.node) - nodelist.extend(flatten_nodes(self.args)) + nodelist.extend(self.args) if self.star_args is not None: nodelist.append(self.star_args) if self.dstar_args is not None: @@ -383,6 +398,7 @@ self.code = code def getChildren(self): + "NOT_RPYTHON" children = [] children.append(self.name) children.extend(flatten(self.bases)) @@ -392,7 +408,7 @@ def getChildNodes(self): nodelist = [] - nodelist.extend(flatten_nodes(self.bases)) + nodelist.extend(self.bases) nodelist.append(self.code) return nodelist @@ -410,6 +426,7 @@ self.ops = ops def getChildren(self): + "NOT_RPYTHON" children = [] children.append(self.expr) children.extend(flatten(self.ops)) @@ -418,10 +435,9 @@ def getChildNodes(self): nodelist = [] nodelist.append(self.expr) - # this is a replacement of flatten_nodes + # ops is a list of couples (op_name, node) for op_name, node in self.ops: nodelist.append(node) - # nodelist.extend(flatten_nodes(self.ops)) return nodelist def __repr__(self): @@ -436,6 +452,7 @@ self.value = value def getChildren(self): + "NOT_RPYTHON" return self.value, def getChildNodes(self): @@ -452,6 +469,7 @@ Node.__init__(self, lineno) def getChildren(self): + "NOT_RPYTHON" return [] def getChildNodes(self): @@ -469,11 +487,12 @@ self.nodes = nodes def getChildren(self): + "NOT_RPYTHON" return tuple(flatten(self.nodes)) def getChildNodes(self): nodelist = [] - nodelist.extend(flatten_nodes(self.nodes)) + nodelist.extend(self.nodes) return nodelist def __repr__(self): @@ -489,15 +508,15 @@ self.items = items def getChildren(self): + "NOT_RPYTHON" return tuple(flatten(self.items)) def getChildNodes(self): nodelist = [] - # replacement for flatten_nodes() + # items is a list of couples (node (key), node (value)) for key, value in self.items: nodelist.append(key) nodelist.append(value) - # nodelist.extend(flatten_nodes(self.items)) return nodelist def __repr__(self): @@ -512,6 +531,7 @@ self.expr = expr def getChildren(self): + "NOT_RPYTHON" return self.expr, def getChildNodes(self): @@ -530,6 +550,7 @@ self.right = right def getChildren(self): + "NOT_RPYTHON" return self.left, self.right def getChildNodes(self): @@ -546,6 +567,7 @@ Node.__init__(self, lineno) def getChildren(self): + "NOT_RPYTHON" return [] def getChildNodes(self): @@ -565,6 +587,7 @@ self.globals = globals def getChildren(self): + "NOT_RPYTHON" children = [] children.append(self.expr) children.append(self.locals) @@ -593,6 +616,7 @@ self.right = right def getChildren(self): + "NOT_RPYTHON" return self.left, self.right def getChildNodes(self): @@ -613,6 +637,7 @@ self.else_ = else_ def getChildren(self): + "NOT_RPYTHON" children = [] children.append(self.assign) children.append(self.list) @@ -642,6 +667,7 @@ self.names = names def getChildren(self): + "NOT_RPYTHON" return self.modname, self.names def getChildNodes(self): @@ -672,6 +698,7 @@ def getChildren(self): + "NOT_RPYTHON" children = [] children.append(self.decorators) children.append(self.name) @@ -686,7 +713,7 @@ nodelist = [] if self.decorators is not None: nodelist.append(self.decorators) - nodelist.extend(flatten_nodes(self.defaults)) + nodelist.extend(self.defaults) nodelist.append(self.code) return nodelist @@ -706,6 +733,7 @@ def getChildren(self): + "NOT_RPYTHON" return self.code, def getChildNodes(self): @@ -724,9 +752,11 @@ self.iter = iter self.ifs = ifs self.is_outmost = 
False + def getChildren(self): + "NOT_RPYTHON" children = [] children.append(self.assign) children.append(self.iter) @@ -737,7 +767,7 @@ nodelist = [] nodelist.append(self.assign) nodelist.append(self.iter) - nodelist.extend(flatten_nodes(self.ifs)) + nodelist.extend(self.ifs) return nodelist def __repr__(self): @@ -752,6 +782,7 @@ self.test = test def getChildren(self): + "NOT_RPYTHON" return self.test, def getChildNodes(self): @@ -770,6 +801,7 @@ self.quals = quals def getChildren(self): + "NOT_RPYTHON" children = [] children.append(self.expr) children.extend(flatten(self.quals)) @@ -778,7 +810,7 @@ def getChildNodes(self): nodelist = [] nodelist.append(self.expr) - nodelist.extend(flatten_nodes(self.quals)) + nodelist.extend(self.quals) return nodelist def __repr__(self): @@ -794,6 +826,7 @@ self.attrname = attrname def getChildren(self): + "NOT_RPYTHON" return self.expr, self.attrname def getChildNodes(self): @@ -811,6 +844,7 @@ self.names = names def getChildren(self): + "NOT_RPYTHON" return self.names, def getChildNodes(self): @@ -825,10 +859,12 @@ class If(Node): def __init__(self, tests, else_, lineno=-1): Node.__init__(self, lineno) + # tests is a list of couples (node (test), node (suite)) self.tests = tests self.else_ = else_ def getChildren(self): + "NOT_RPYTHON" children = [] children.extend(flatten(self.tests)) children.append(self.else_) @@ -836,7 +872,10 @@ def getChildNodes(self): nodelist = [] - nodelist.extend(flatten_nodes(self.tests)) + # tests is a list of couples (node (test), node (suite)) + for test, suite in self.items: + nodelist.append(test) + nodelist.append(suite) if self.else_ is not None: nodelist.append(self.else_) return nodelist @@ -853,6 +892,7 @@ self.names = names def getChildren(self): + "NOT_RPYTHON" return self.names, def getChildNodes(self): @@ -870,6 +910,7 @@ self.expr = expr def getChildren(self): + "NOT_RPYTHON" return self.expr, def getChildNodes(self): @@ -888,6 +929,7 @@ self.expr = expr def getChildren(self): + "NOT_RPYTHON" return self.name, self.expr def getChildNodes(self): @@ -915,6 +957,7 @@ def getChildren(self): + "NOT_RPYTHON" children = [] children.append(self.argnames) children.extend(flatten(self.defaults)) @@ -924,7 +967,7 @@ def getChildNodes(self): nodelist = [] - nodelist.extend(flatten_nodes(self.defaults)) + nodelist.extend(self.defaults) nodelist.append(self.code) return nodelist @@ -941,6 +984,7 @@ self.right = right def getChildren(self): + "NOT_RPYTHON" return self.left, self.right def getChildNodes(self): @@ -958,11 +1002,12 @@ self.nodes = nodes def getChildren(self): + "NOT_RPYTHON" return tuple(flatten(self.nodes)) def getChildNodes(self): nodelist = [] - nodelist.extend(flatten_nodes(self.nodes)) + nodelist.extend(self.nodes) return nodelist def __repr__(self): @@ -978,6 +1023,7 @@ self.quals = quals def getChildren(self): + "NOT_RPYTHON" children = [] children.append(self.expr) children.extend(flatten(self.quals)) @@ -986,7 +1032,7 @@ def getChildNodes(self): nodelist = [] nodelist.append(self.expr) - nodelist.extend(flatten_nodes(self.quals)) + nodelist.extend(self.quals) return nodelist def __repr__(self): @@ -1003,6 +1049,7 @@ self.ifs = ifs def getChildren(self): + "NOT_RPYTHON" children = [] children.append(self.assign) children.append(self.list) @@ -1013,7 +1060,7 @@ nodelist = [] nodelist.append(self.assign) nodelist.append(self.list) - nodelist.extend(flatten_nodes(self.ifs)) + nodelist.extend(self.ifs) return nodelist def __repr__(self): @@ -1028,6 +1075,7 @@ self.test = test def getChildren(self): + 
"NOT_RPYTHON" return self.test, def getChildNodes(self): @@ -1046,6 +1094,7 @@ self.right = right def getChildren(self): + "NOT_RPYTHON" return self.left, self.right def getChildNodes(self): @@ -1064,6 +1113,7 @@ self.node = node def getChildren(self): + "NOT_RPYTHON" return self.doc, self.node def getChildNodes(self): @@ -1082,6 +1132,7 @@ self.right = right def getChildren(self): + "NOT_RPYTHON" return self.left, self.right def getChildNodes(self): @@ -1099,6 +1150,7 @@ self.varname = varname def getChildren(self): + "NOT_RPYTHON" return self.varname, def getChildNodes(self): @@ -1115,6 +1167,7 @@ Node.__init__(self, lineno) def getChildren(self): + "NOT_RPYTHON" return [] def getChildNodes(self): @@ -1132,6 +1185,7 @@ self.expr = expr def getChildren(self): + "NOT_RPYTHON" return self.expr, def getChildNodes(self): @@ -1149,6 +1203,7 @@ self.number_value = number_value def getChildren(self): + "NOT_RPYTHON" return self.number_value, def getChildNodes(self): @@ -1166,11 +1221,12 @@ self.nodes = nodes def getChildren(self): + "NOT_RPYTHON" return tuple(flatten(self.nodes)) def getChildNodes(self): nodelist = [] - nodelist.extend(flatten_nodes(self.nodes)) + nodelist.extend(self.nodes) return nodelist def __repr__(self): @@ -1184,6 +1240,7 @@ Node.__init__(self, lineno) def getChildren(self): + "NOT_RPYTHON" return [] def getChildNodes(self): @@ -1202,6 +1259,7 @@ self.right = right def getChildren(self): + "NOT_RPYTHON" return self.left, self.right def getChildNodes(self): @@ -1220,6 +1278,7 @@ self.dest = dest def getChildren(self): + "NOT_RPYTHON" children = [] children.extend(flatten(self.nodes)) children.append(self.dest) @@ -1227,7 +1286,7 @@ def getChildNodes(self): nodelist = [] - nodelist.extend(flatten_nodes(self.nodes)) + nodelist.extend(self.nodes) if self.dest is not None: nodelist.append(self.dest) return nodelist @@ -1245,6 +1304,7 @@ self.dest = dest def getChildren(self): + "NOT_RPYTHON" children = [] children.extend(flatten(self.nodes)) children.append(self.dest) @@ -1252,7 +1312,7 @@ def getChildNodes(self): nodelist = [] - nodelist.extend(flatten_nodes(self.nodes)) + nodelist.extend(self.nodes) if self.dest is not None: nodelist.append(self.dest) return nodelist @@ -1271,6 +1331,7 @@ self.expr3 = expr3 def getChildren(self): + "NOT_RPYTHON" children = [] children.append(self.expr1) children.append(self.expr2) @@ -1299,6 +1360,7 @@ self.value = value def getChildren(self): + "NOT_RPYTHON" return self.value, def getChildNodes(self): @@ -1317,6 +1379,7 @@ self.right = right def getChildren(self): + "NOT_RPYTHON" return self.left, self.right def getChildNodes(self): @@ -1337,6 +1400,7 @@ self.upper = upper def getChildren(self): + "NOT_RPYTHON" children = [] children.append(self.expr) children.append(self.flags) @@ -1365,11 +1429,12 @@ self.nodes = nodes def getChildren(self): + "NOT_RPYTHON" return tuple(flatten(self.nodes)) def getChildNodes(self): nodelist = [] - nodelist.extend(flatten_nodes(self.nodes)) + nodelist.extend(self.nodes) return nodelist def __repr__(self): @@ -1384,11 +1449,12 @@ self.nodes = nodes def getChildren(self): + "NOT_RPYTHON" return tuple(flatten(self.nodes)) def getChildNodes(self): nodelist = [] - nodelist.extend(flatten_nodes(self.nodes)) + nodelist.extend(self.nodes) return nodelist def __repr__(self): @@ -1403,6 +1469,7 @@ self.string_value = string_value def getChildren(self): + "NOT_RPYTHON" return self.string_value, def getChildNodes(self): @@ -1421,6 +1488,7 @@ self.right = right def getChildren(self): + "NOT_RPYTHON" return self.left, 
self.right def getChildNodes(self): @@ -1440,6 +1508,7 @@ self.subs = subs def getChildren(self): + "NOT_RPYTHON" children = [] children.append(self.expr) children.append(self.flags) @@ -1449,7 +1518,7 @@ def getChildNodes(self): nodelist = [] nodelist.append(self.expr) - nodelist.extend(flatten_nodes(self.subs)) + nodelist.extend(self.subs) return nodelist def __repr__(self): @@ -1467,6 +1536,7 @@ self.else_ = else_ def getChildren(self): + "NOT_RPYTHON" children = [] children.append(self.body) children.extend(flatten(self.handlers)) @@ -1476,7 +1546,7 @@ def getChildNodes(self): nodelist = [] nodelist.append(self.body) - # replacement for flatten_nodes(self.handlers) + # handlers is a list of triplets (expr1, expr2, body) for expr1, expr2, body in self.handlers: if expr1 is not None: nodelist.append(expr1) @@ -1501,6 +1571,7 @@ self.final = final def getChildren(self): + "NOT_RPYTHON" return self.body, self.final def getChildNodes(self): @@ -1518,11 +1589,12 @@ self.nodes = nodes def getChildren(self): + "NOT_RPYTHON" return tuple(flatten(self.nodes)) def getChildNodes(self): nodelist = [] - nodelist.extend(flatten_nodes(self.nodes)) + nodelist.extend(self.nodes) return nodelist def __repr__(self): @@ -1537,6 +1609,7 @@ self.expr = expr def getChildren(self): + "NOT_RPYTHON" return self.expr, def getChildNodes(self): @@ -1554,6 +1627,7 @@ self.expr = expr def getChildren(self): + "NOT_RPYTHON" return self.expr, def getChildNodes(self): @@ -1573,6 +1647,7 @@ self.else_ = else_ def getChildren(self): + "NOT_RPYTHON" children = [] children.append(self.test) children.append(self.body) @@ -1599,6 +1674,7 @@ self.value = value def getChildren(self): + "NOT_RPYTHON" return self.value, def getChildNodes(self): @@ -1616,7 +1692,7 @@ method in replacement of the former visitor.visit = walker.dispatch It could also use to identify base type for visit arguments of AST nodes """ - + def default(self, node): for child in node.getChildNodes(): child.accept(self) Modified: pypy/dist/pypy/interpreter/astcompiler/ast.txt ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.txt (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.txt Mon Sep 5 18:40:34 2005 @@ -100,3 +100,30 @@ init(GenExprFor): self.is_outmost = False + +flatten_nodes(Compare.ops): + # ops is a list of couples (op_name, node) + for op_name, node in self.ops: + nodelist.append(node) + +flatten_nodes(TryExcept.handlers): + # handlers is a list of triplets (expr1, expr2, body) + for expr1, expr2, body in self.handlers: + if expr1 is not None: + nodelist.append(expr1) + if expr2 is not None: + nodelist.append(expr2) + if body is not None: + nodelist.append(body) + +flatten_nodes(Dict.items): + # items is a list of couples (node (key), node (value)) + for key, value in self.items: + nodelist.append(key) + nodelist.append(value) + +flatten_nodes(If.tests): + # tests is a list of couples (node (test), node (suite)) + for test, suite in self.items: + nodelist.append(test) + nodelist.append(suite) Modified: pypy/dist/pypy/interpreter/astcompiler/astgen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/astgen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/astgen.py Mon Sep 5 18:40:34 2005 @@ -50,6 +50,7 @@ self.argprops = self.get_argprops() self.nargs = len(self.argnames) self.init = [] + self.flatten_nodes = {} self.parent = parent def setup_parent(self, classes): @@ -130,12 +131,15 @@ 
print >> buf, " Node.__init__(self, lineno)" if self.argnames: for name in self.argnames: + if name in self.flatten_nodes: + print >>buf, " %s" % self.flatten_nodes[name][0].rstrip() print >> buf, " self.%s = %s" % (name, name) if self.init: print >> buf, "".join([" " + line for line in self.init]) def _gen_getChildren(self, buf): print >> buf, " def getChildren(self):" + print >> buf, ' "NOT_RPYTHON"' if len(self.argnames) == 0: print >> buf, " return []" else: @@ -184,8 +188,15 @@ " nodelist.append(self.%s)") print >> buf, tmp % (name, name) elif self.argprops[name] == P_NESTED: - print >> buf, template % ("extend", "flatten_nodes(", - name, ")") + if name not in self.flatten_nodes: + print >> buf, template % ("extend", "", + name, "") + else: + flat_logic = self.flatten_nodes[name] + while not flat_logic[-1].strip(): + flat_logic.pop() + flat_logic[-1] = flat_logic[-1].rstrip() + print >> buf, "".join([" " + line for line in flat_logic]) elif self.argprops[name] == P_NODE: print >> buf, template % ("append", "", name, "") print >> buf, " return nodelist" @@ -214,16 +225,27 @@ print >> buf, " return self.default( node )" rx_init = re.compile('init\((.*)\):') +rx_flatten_nodes = re.compile('flatten_nodes\((.*)\.(.*)\):') def parse_spec(file): classes = {} cur = None + kind = None for line in fileinput.input(file): - if line.strip().startswith('#'): - continue - mo = rx_init.search(line) + mo = None + comment = line.strip().startswith('#') + if not comment: + mo = rx_init.search(line) + if mo: + kind = 'init' + else: + mo = rx_flatten_nodes.search(line) + if mo: + kind = 'flatten_nodes' if mo is None: if cur is None: + if comment: + continue # a normal entry try: name, args = line.split(':') @@ -236,13 +258,22 @@ parent = None classes[name] = NodeInfo(name, args, parent) cur = None - else: + elif kind == 'init': # some code for the __init__ method cur.init.append(line) - else: + elif kind == 'flatten_nodes': + cur.flatten_nodes['_cur_'].append(line) + elif kind == 'init': # some extra code for a Node's __init__ method name = mo.group(1) cur = classes[name] + elif kind == 'flatten_nodes': + # special case for getChildNodes flattening + name = mo.group(1) + attr = mo.group(2) + cur = classes[name] + cur.flatten_nodes[attr] = cur.flatten_nodes['_cur_'] = [] + for node in classes.values(): node.setup_parent(classes) return sorted(classes.values(), key=lambda n: n.name) @@ -300,8 +331,8 @@ l.append(elt) return l -def flatten_nodes(list): - return [n for n in flatten(list) if isinstance(n, Node)] +#def flatten_nodes(list): +# return [n for n in flatten(list) if isinstance(n, Node)] nodes = {} Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py Mon Sep 5 18:40:34 2005 @@ -32,10 +32,10 @@ def nodes_equal(left, right): - if type(left)!=type(right): - return False if not isinstance(left,stable_ast.Node) or not isinstance(right,ast_ast.Node): return left==right + if left.__class__.__name__ != right.__class__.__name__: + return False if isinstance(left,stable_ast.Function) and isinstance(right,ast_ast.Function): left_nodes = list(left.getChildren()) right_nodes = list(right.getChildren()) From bea at codespeak.net Mon Sep 5 22:46:44 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Mon, 5 Sep 2005 22:46:44 +0200 (CEST) Subject: [pypy-svn] r17251 - 
pypy/extradoc/talk Message-ID: <20050905204644.CA3D227B45@code1.codespeak.net> Author: bea Date: Mon Sep 5 22:46:42 2005 New Revision: 17251 Added: pypy/extradoc/talk/pypy_sprinttalk_ep2005bd.sxi (contents, props changed) Log: the sprint talk held at Europython 2005 Added: pypy/extradoc/talk/pypy_sprinttalk_ep2005bd.sxi ============================================================================== Binary file. No diff available. From bea at codespeak.net Mon Sep 5 22:47:45 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Mon, 5 Sep 2005 22:47:45 +0200 (CEST) Subject: [pypy-svn] r17252 - pypy/extradoc/talk Message-ID: <20050905204745.9559327B51@code1.codespeak.net> Author: bea Date: Mon Sep 5 22:47:42 2005 New Revision: 17252 Added: pypy/extradoc/talk/pypy_sprinttalk_calibre20050909.sxi (contents, props changed) Log: the talk I am holding at the Calibre conference (2 international conference) in Limerick 20050909. Added: pypy/extradoc/talk/pypy_sprinttalk_calibre20050909.sxi ============================================================================== Binary file. No diff available. From pedronis at codespeak.net Tue Sep 6 00:34:32 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 6 Sep 2005 00:34:32 +0200 (CEST) Subject: [pypy-svn] r17254 - pypy/dist/pypy/translator/goal Message-ID: <20050905223432.5137527B4D@code1.codespeak.net> Author: pedronis Date: Tue Sep 6 00:34:30 2005 New Revision: 17254 Modified: pypy/dist/pypy/translator/goal/translate_pypy.py Log: some more convenience commands in translate_pypy to give fast access to the common producedures used when tracking annotation problems: help ann_other -> "other annotation related commands are: find, findclasses, findfuncs, attrs, attrsann, readpos" summarized: find object using pypy prefixes find(annotated)classes meeting some criteria find(flow-graphed)func(tion)s meeting some criteria find command store results in a variable given with "as var" modifier or _. attrs: list annotated class(es) atttributes, possibly filtering with some criteria attrsann: same, print annotation s_value too readpos: read locations for a annotated class, attrname pair, possibly filtering result as a list of functions containing read locs is stored in "as var" or _. examples: # assuming we have annotated astbuilder, find the annotation for .nodes attributes for Node subclasses that have such a thing # first retrieve Node with prefixes (just a convenience): find astcompiler.ast.Node as N findclasses issubclass(cand, N) attrsann _ match cand.name == 'nodes' # read locations for TokenObject name, just functions starting with 'build_' readpos pyparser.astbuilder.TokenObject name match cand.func.__name__.startswith('build_') Modified: pypy/dist/pypy/translator/goal/translate_pypy.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy.py Tue Sep 6 00:34:30 2005 @@ -453,6 +453,21 @@ TRYPREFIXES = ['','pypy.','pypy.objspace.','pypy.interpreter.', 'pypy.objspace.std.' 
] + def _mygetval(self, arg, errmsg): + try: + return eval(arg, self.curframe.f_globals, + self.curframe.f_locals) + except: + t, v = sys.exc_info()[:2] + if isinstance(t, str): + exc_type_name = t + else: exc_type_name = t.__name__ + if not isinstance(arg, str): + print '*** %s' % errmsg, "\t[%s: %s]" % (exc_type_name, v) + else: + print '*** %s:' % errmsg, arg, "\t[%s: %s]" % (exc_type_name, v) + raise + def _getobj(self, name): if '.' in name: for pfx in self.TRYPREFIXES: @@ -461,11 +476,98 @@ except NameError: pass try: - return self._getval(name) - except (NameError, AttributeError, LookupError): - print "*** Not found:", name + return self._mygetval(name, "Not found") + except (KeyboardInterrupt, SystemExit, MemoryError): + raise + except: + pass return None + def do_find(self, arg): + """find obj [as var] +find dotted named obj, possibly using prefixing with some packages +in pypy (see help pypyprefixes); the result is assigned to var or _.""" + objarg, var = self._parse_modif(arg) + obj = self._getobj(objarg) + if obj is None: + return + print obj + self._setvar(var, obj) + + def _parse_modif(self, arg, modif='as'): + var = '_' + aspos = arg.rfind(modif+' ') + if aspos != -1: + objarg = arg[:aspos].strip() + var = arg[aspos+(1+len(modif)):].strip() + else: + objarg = arg + return objarg, var + + def _setvar(self, var, obj): + self.curframe.f_locals[var] = obj + + class GiveUp(Exception): + pass + + def _make_flt(self, expr): + try: + expr = compile(expr, '', 'eval') + except SyntaxError: + print "*** syntax: %s" % expr + return None + def flt(c): + marker = object() + try: + old = self.curframe.f_locals.get('cand', marker) + self.curframe.f_locals['cand'] = c + try: + return self._mygetval(expr, "oops") + except (KeyboardInterrupt, SystemExit, MemoryError): + raise + except: + raise self.GiveUp + finally: + if old is not marker: + self.curframe.f_locals['cand'] = old + else: + del self.curframe.f_locals['cand'] + return flt + + def do_findclasses(self, arg): + """findclasses expr [as var] +find annotated classes for which expr is true, cand in it referes to +the candidate class; the result list is assigned to var or _.""" + expr, var = self._parse_modif(arg) + flt = self._make_flt(expr) + if flt is None: + return + cls = [] + try: + for c in t.annotator.getuserclasses(): + if flt(c): + cls.append(c) + except self.GiveUp: + return + self._setvar(var, cls) + + def do_findfuncs(self, arg): + """findfuncs expr [as var] +find flow-graphed functions for which expr is true, cand in it referes to +the candidate function; the result list is assigned to var or _.""" + expr, var = self._parse_modif(arg) + flt = self._make_flt(expr) + if flt is None: + return + funcs = [] + try: + for f in t.flowgraphs: + if flt(f): + funcs.append(f) + except self.GiveUp: + return + self._setvar(var, funcs) + def do_showg(self, arg): """showg obj show graph for obj, obj can be an expression or a dotted name @@ -490,6 +592,108 @@ return self._show(page) + def _attrs(self, arg, pr): + arg, expr = self._parse_modif(arg, 'match') + if expr == '_': + expr = 'True' + obj = self._getobj(arg) + if obj is None: + return + import types + if isinstance(obj, (type, types.ClassType)): + obj = [obj] + else: + obj = list(obj) + def longname(c): + return "%s.%s" % (c.__module__, c.__name__) + obj.sort(lambda x,y: cmp(longname(x), longname(y))) + cls = t.annotator.getuserclasses() + flt = self._make_flt(expr) + if flt is None: + return + for c in obj: + if c in cls: + try: + attrs = [a for a in cls[c].attrs.itervalues() if 
flt(a)] + except self.GiveUp: + return + if attrs: + print "%s:" % longname(c) + pr(attrs) + + def do_attrs(self, arg): + """attrs obj [match expr] +list annotated attrs of class obj or list of classes obj, +obj can be an expression or a dotted name +(in which case prefixing with some packages in pypy is tried (see help pypyprefixes)); +expr is an optional filtering expression; cand in it refer to the candidate Attribute +information object, which has a .name and .s_value.""" + def pr(attrs): + print " " + ' '.join([a.name for a in attrs]) + self._attrs(arg, pr) + + def do_attrsann(self, arg): + """attrsann obj [match expr] +list with their annotation annotated attrs of class obj or list of classes obj, +obj can be an expression or a dotted name +(in which case prefixing with some packages in pypy is tried (see help pypyprefixes)); +expr is an optional filtering expression; cand in it refer to the candidate Attribute +information object, which has a .name and .s_value.""" + def pr(attrs): + for a in attrs: + print ' %s %s' % (a.name, a.s_value) + self._attrs(arg, pr) + + def do_readpos(self, arg): + """readpos obj attrname [match expr] [as var] +list the read positions of annotated attr with attrname of class obj, +obj can be an expression or a dotted name +(in which case prefixing with some packages in pypy is tried (see help pypyprefixes)); +expr is an optional filtering expression; cand in it refer to the candidate read +position information, which has a .func and .block and .i; +the list of the read positions functions is set to var or _.""" + class Pos: + def __init__(self, func, block, i): + self.func = func + self.block = block + self.i = i + arg, var = self._parse_modif(arg, 'as') + arg, expr = self._parse_modif(arg, 'match') + if expr == '_': + expr = 'True' + args = arg.split() + if len(args) != 2: + print "*** expected obj attrname:", arg + return + arg, attrname = args + obj = self._getobj(arg) + if obj is None: + return + cls = t.annotator.getuserclasses() + if obj not in cls: + return + attrs = cls[obj].attrs + if attrname not in attrs: + print "*** bogus:", attrname + return + pos = attrs[attrname].read_locations + if not pos: + return + flt = self._make_flt(expr) + if flt is None: + return + r = {} + try: + for p in pos: + func, block, i = p + if flt(Pos(func, block, i)): + print func.__module__ or '?', func.__name__, block, i + r[func] = True + except self.GiveUp: + return + self._setvar(var, r.keys()) + + def do_flowg(self, arg): """callg obj show flow graph for function obj, obj can be an expression or a dotted name @@ -531,6 +735,9 @@ def help_graphs(self): print "graph commands are: showg, flowg, callg, classhier" + def help_ann_other(self): + print "other annotation related commands are: find, findclasses, findfuncs, attrs, attrsann, readpos" + def help_pypyprefixes(self): print "these prefixes are tried for dotted names in graph commands:" print self.TRYPREFIXES From pedronis at codespeak.net Tue Sep 6 00:41:41 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 6 Sep 2005 00:41:41 +0200 (CEST) Subject: [pypy-svn] r17255 - pypy/dist/pypy/translator/goal Message-ID: <20050905224141.4A42427B4D@code1.codespeak.net> Author: pedronis Date: Tue Sep 6 00:41:39 2005 New Revision: 17255 Modified: pypy/dist/pypy/translator/goal/translate_pypy.py Log: allow quote around attrname in readpos command. 
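For instance, reusing the readpos example from the r17254 log above, both
spellings should now be accepted at the debugger prompt:

    readpos pyparser.astbuilder.TokenObject name
    readpos pyparser.astbuilder.TokenObject 'name'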
Modified: pypy/dist/pypy/translator/goal/translate_pypy.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy.py Tue Sep 6 00:41:39 2005 @@ -666,6 +666,11 @@ print "*** expected obj attrname:", arg return arg, attrname = args + # allow quotes around attrname + if (attrname.startswith("'") and attrname.endswith("'") + or attrname.startswith('"') and attrname.endswith('"')): + attrname = attrname[1:-1] + obj = self._getobj(arg) if obj is None: return From ale at codespeak.net Tue Sep 6 09:03:59 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Tue, 6 Sep 2005 09:03:59 +0200 (CEST) Subject: [pypy-svn] r17257 - pypy/dist/pypy/module/_codecs Message-ID: <20050906070359.CEE0927B46@code1.codespeak.net> Author: ale Date: Tue Sep 6 09:03:58 2005 New Revision: 17257 Modified: pypy/dist/pypy/module/_codecs/app_codecs.py Log: test_codeccallbacks passes again Modified: pypy/dist/pypy/module/_codecs/app_codecs.py ============================================================================== --- pypy/dist/pypy/module/_codecs/app_codecs.py (original) +++ pypy/dist/pypy/module/_codecs/app_codecs.py Tue Sep 6 09:03:58 2005 @@ -476,7 +476,7 @@ res += '&#' res += str(ord(ch)) res += ';' - return ''.join(res), exc.end + return u''.join(res), exc.end else: raise TypeError("don't know how to handle %.400s in error callback"%type(exc)) @@ -495,7 +495,7 @@ else: p += 'x' p += "%.2x" % ord(c) - return ''.join(p), exc.end + return u''.join(p), exc.end else: raise TypeError("don't know how to handle %.400s in error callback"%type(exc)) From adim at codespeak.net Tue Sep 6 09:16:13 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Tue, 6 Sep 2005 09:16:13 +0200 (CEST) Subject: [pypy-svn] r17258 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050906071613.788DD27B44@code1.codespeak.net> Author: adim Date: Tue Sep 6 09:16:11 2005 New Revision: 17258 Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py pypy/dist/pypy/interpreter/astcompiler/ast.txt Log: fixed little attribute name error in class If Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.py Tue Sep 6 09:16:11 2005 @@ -873,7 +873,7 @@ def getChildNodes(self): nodelist = [] # tests is a list of couples (node (test), node (suite)) - for test, suite in self.items: + for test, suite in self.tests: nodelist.append(test) nodelist.append(suite) if self.else_ is not None: Modified: pypy/dist/pypy/interpreter/astcompiler/ast.txt ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.txt (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.txt Tue Sep 6 09:16:11 2005 @@ -124,6 +124,6 @@ flatten_nodes(If.tests): # tests is a list of couples (node (test), node (suite)) - for test, suite in self.items: + for test, suite in self.tests: nodelist.append(test) nodelist.append(suite) From ericvrp at codespeak.net Tue Sep 6 10:47:05 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Tue, 6 Sep 2005 10:47:05 +0200 (CEST) Subject: [pypy-svn] r17261 - pypy/dist/pypy/translator/llvm Message-ID: <20050906084705.35B6027B45@code1.codespeak.net> Author: ericvrp Date: Tue Sep 6 10:47:04 2005 New Revision: 17261 Modified: 
pypy/dist/pypy/translator/llvm/database.py Log: fix because underscores seem to be no longer allowed in log.HERE Modified: pypy/dist/pypy/translator/llvm/database.py ============================================================================== --- pypy/dist/pypy/translator/llvm/database.py (original) +++ pypy/dist/pypy/translator/llvm/database.py Tue Sep 6 10:47:04 2005 @@ -162,7 +162,7 @@ def prepare_constant(self, type_, value): if isinstance(type_, lltype.Primitive): - #log.prepare_constant(value, "(is primitive)") + #log.prepareconstant(value, "(is primitive)") return if isinstance(type_, lltype.Ptr): @@ -170,7 +170,7 @@ type_ = type_.TO value = value._obj - log.prepare_constant("preparing ptr", value) + log.prepareconstant("preparing ptr", value) # we dont need a node for nulls if value is None: @@ -200,7 +200,7 @@ if isinstance(ct, lltype.Array) or isinstance(ct, lltype.Struct): p, c = lltype.parentlink(value) if p is None: - log.prepare_arg_value("skipping preparing non root", value) + log.prepareargvalue("skipping preparing non root", value) return if value is not None and value not in self.obj2node: From cfbolz at codespeak.net Tue Sep 6 11:28:58 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Tue, 6 Sep 2005 11:28:58 +0200 (CEST) Subject: [pypy-svn] r17262 - in pypy/dist/pypy/rpython: . memory Message-ID: <20050906092858.C007827B45@code1.codespeak.net> Author: cfbolz Date: Tue Sep 6 11:28:57 2005 New Revision: 17262 Modified: pypy/dist/pypy/rpython/llinterp.py pypy/dist/pypy/rpython/memory/gc.py Log: fix test_gc: llinterp was using py.log with keywords containing underscores incref was not checking for NULL pointers Modified: pypy/dist/pypy/rpython/llinterp.py ============================================================================== --- pypy/dist/pypy/rpython/llinterp.py (original) +++ pypy/dist/pypy/rpython/llinterp.py Tue Sep 6 11:28:57 2005 @@ -67,11 +67,11 @@ print operation def find_roots(self): - log.find_roots("starting") + log.findroots("starting") frame = self.active_frame roots = [] while frame is not None: - log.find_roots("graph", frame.graph.name) + log.findroots("graph", frame.graph.name) frame.find_roots(roots) frame = frame.f_back return roots @@ -235,7 +235,7 @@ self.make_llexception(e) def find_roots(self, roots): - log.find_roots(self.curr_block.inputargs) + log.findroots(self.curr_block.inputargs) for arg in self.curr_block.inputargs: if (isinstance(arg, Variable) and isinstance(self.getval(arg), self.llt._ptr)): Modified: pypy/dist/pypy/rpython/memory/gc.py ============================================================================== --- pypy/dist/pypy/rpython/memory/gc.py (original) +++ pypy/dist/pypy/rpython/memory/gc.py Tue Sep 6 11:28:57 2005 @@ -405,6 +405,8 @@ raw_free(gc_info) def incref(self, addr): + if addr == NULL: + return (addr - self.size_gc_header()).signed[0] += 1 def decref(self, addr): From adim at codespeak.net Tue Sep 6 11:50:46 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Tue, 6 Sep 2005 11:50:46 +0200 (CEST) Subject: [pypy-svn] r17263 - pypy/dist/pypy/interpreter/pyparser/test Message-ID: <20050906095046.9B52F27B45@code1.codespeak.net> Author: adim Date: Tue Sep 6 11:50:44 2005 New Revision: 17263 Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py Log: tests now compare stablecompiler bytecode and astcompiler bytecode Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py ============================================================================== --- 
pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py Tue Sep 6 11:50:44 2005 @@ -7,27 +7,36 @@ from test_astbuilder import expressions, comparisons, funccalls, backtrackings,\ listmakers, genexps, dictmakers, multiexpr, attraccess, slices, imports,\ - asserts, execs, prints, globs, raises_, imports_newstyle, augassigns + asserts, execs, prints, globs, raises_, imports_newstyle, augassigns, \ + if_stmts, one_stmt_classdefs, one_stmt_funcdefs, tryexcepts, docstrings, \ + returns from test_astbuilder import FakeSpace TESTS = [ expressions, - augassigns, - comparisons, - funccalls, - backtrackings, - listmakers, - dictmakers, - multiexpr, - attraccess, - slices, - imports, - execs, - prints, - globs, - raises_, +## augassigns, +## comparisons, +## funccalls, +## backtrackings, +## listmakers, +## dictmakers, +## multiexpr, +## attraccess, +## slices, +## imports, +## execs, +## prints, +## globs, +## raises_, +## # EXEC INPUTS +## # one_stmt_classdefs, +## one_stmt_funcdefs, +## if_stmts, +## tryexcepts, +## # docstrings, +## # returns, ] import sys @@ -51,30 +60,38 @@ ### Note: builtin compile and compiler.compile behave differently -def compile_expr( expr, target="exec" ): +def compile_with_builtin_comiple( expr, target="exec" ): return compile( expr, "", target ) -def ast_compile( expr, target="exec" ): - from compiler import compile - return compile( expr, "", target ) - +def compile_with_astcompiler(expr, target='exec', space=FakeSpace()): + ast = ast_parse_expr(epxr, target='exec', space=space) + misc.set_filename('', ast) + codegen = pycodegen.ModuleCodeGenerator(space, ast) + rcode = codegenerator.getCode() + return to_code(rcode) -def compare_code( code1, code2 ): - #print "Filename", code1.co_filename, code2.co_filename - assert code1.co_filename == code2.co_filename - #print repr(code1.co_code) - #print repr(code2.co_code) - if code1.co_code != code2.co_code: +def compile_with_stablecompiler(expr, target='exec'): + from pypy.interpreter.stablecompiler import compile + # from compiler import compile + return compile(expr, '', target) + + +def compare_code(ac_code, sc_code): + #print "Filename", ac_code.co_filename, sc_code.co_filename + assert ac_code.co_filename == sc_code.co_filename + #print repr(ac_code.co_code) + #print repr(sc_code.co_code) + if ac_code.co_code != sc_code.co_code: import dis print "Code from pypy:" - dis.dis(code1) + dis.dis(ac_code) print "Code from python", sys.version - dis.dis(code2) - assert code1.co_code == code2.co_code - assert code1.co_varnames == code2.co_varnames + dis.dis(sc_code) + assert ac_code.co_code == sc_code.co_code + assert ac_code.co_varnames == sc_code.co_varnames - assert len(code1.co_consts) == len(code2.co_consts) - for c1, c2 in zip( code1.co_consts, code2.co_consts ): + assert len(ac_code.co_consts) == len(sc_code.co_consts) + for c1, c2 in zip( ac_code.co_consts, sc_code.co_consts ): if type(c1)==PyCode: c1 = to_code(c1) return compare_code( c1, c2 ) @@ -99,25 +116,33 @@ tuple(rcode.co_cellvars) ) return code -def check_compile( expr ): - space = FakeSpace() - ast_tree = ast_parse_expr( expr, target='exec', space=space ) - misc.set_filename("", ast_tree) +def check_compile(expr): print "Compiling:", expr - print ast_tree - codegenerator = pycodegen.ModuleCodeGenerator(space,ast_tree) - rcode = codegenerator.getCode() - code1 = to_code( rcode ) - code2 = ast_compile( expr ) - compare_code(code1,code2) + sc_code = compile_with_stablecompiler(expr, 
target='exec') + as_code = compile_with_astcompiler(expr, target='exec') + compare_code(ac_code, sc_code) + +## def check_compile( expr ): +## space = FakeSpace() +## ast_tree = ast_parse_expr( expr, target='exec', space=space ) +## misc.set_filename("", ast_tree) +## print "Compiling:", expr +## print ast_tree +## codegenerator = pycodegen.ModuleCodeGenerator(space,ast_tree) +## rcode = codegenerator.getCode() +## code1 = to_code( rcode ) +## code2 = ast_compile( expr ) +## compare_code(code1,code2) def test_compile_argtuple_1(): + py.test.skip('will be tested when more basic stuff will work') code = """def f( x, (y,z) ): print x,y,z """ check_compile( code ) def test_compile_argtuple_2(): + py.test.skip('will be tested when more basic stuff will work') code = """def f( x, (y,(z,t)) ): print x,y,z,t """ @@ -125,6 +150,7 @@ def test_compile_argtuple_3(): + py.test.skip('will be tested when more basic stuff will work') code = """def f( x, (y,(z,(t,u))) ): print x,y,z,t,u """ From adim at codespeak.net Tue Sep 6 11:52:03 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Tue, 6 Sep 2005 11:52:03 +0200 (CEST) Subject: [pypy-svn] r17265 - pypy/dist/pypy/interpreter/stablecompiler Message-ID: <20050906095203.7C5D427B45@code1.codespeak.net> Author: adim Date: Tue Sep 6 11:52:01 2005 New Revision: 17265 Modified: pypy/dist/pypy/interpreter/stablecompiler/pycodegen.py pypy/dist/pypy/interpreter/stablecompiler/transformer.py Log: added a filename default argument to parse() to match Transformer's new API Modified: pypy/dist/pypy/interpreter/stablecompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/stablecompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/stablecompiler/pycodegen.py Tue Sep 6 11:52:01 2005 @@ -75,7 +75,7 @@ self.code = None def _get_tree(self): - tree = parse(self.source, self.mode) + tree = parse(self.source, self.mode, self.filename) misc.set_filename(self.filename, tree) syntax.check(tree) return tree Modified: pypy/dist/pypy/interpreter/stablecompiler/transformer.py ============================================================================== --- pypy/dist/pypy/interpreter/stablecompiler/transformer.py (original) +++ pypy/dist/pypy/interpreter/stablecompiler/transformer.py Tue Sep 6 11:52:01 2005 @@ -54,11 +54,12 @@ f.close() return parse(src) -def parse(buf, mode="exec"): +# added a filename keyword argument to improve SyntaxErrors' messages +def parse(buf, mode="exec", filename=''): if mode == "exec" or mode == "single": - return Transformer().parsesuite(buf) + return Transformer(filename).parsesuite(buf) elif mode == "eval": - return Transformer().parseexpr(buf) + return Transformer(filename).parseexpr(buf) else: raise ValueError("compile() arg 3 must be" " 'exec' or 'eval' or 'single'") @@ -109,7 +110,7 @@ tree = parsefile(fileob | filename) """ - def __init__(self, filename): + def __init__(self, filename=''): self._dispatch = {} self.filename = filename for value, name in symbol.sym_name.items(): From ale at codespeak.net Tue Sep 6 11:53:06 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Tue, 6 Sep 2005 11:53:06 +0200 (CEST) Subject: [pypy-svn] r17266 - in pypy/dist/pypy/translator: goal tool Message-ID: <20050906095306.ED1B827B45@code1.codespeak.net> Author: ale Date: Tue Sep 6 11:53:05 2005 New Revision: 17266 Added: pypy/dist/pypy/translator/goal/translate_pypy_new.py - copied, changed from r17255, pypy/dist/pypy/translator/goal/translate_pypy.py 
pypy/dist/pypy/translator/tool/pdbplus.py pypy/dist/pypy/translator/tool/util.py Log: Start of cleanup of translate_pypy. Copied translate_pypy to translate_pypy_new inorder not to interfere with other work going on. Moved the extensions of pdb to tool/pdbplus Moved some helper functions to tool/util Added: pypy/dist/pypy/translator/tool/pdbplus.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/tool/pdbplus.py Tue Sep 6 11:53:05 2005 @@ -0,0 +1,342 @@ +import threading, pdb + +def run_debugger_in_thread(fn, args, cleanup=None, cleanup_args=()): + def _run_in_thread(): + try: + try: + fn(*args) + pass # for debugger to land + except pdb.bdb.BdbQuit: + pass + finally: + if cleanup is not None: + cleanup(*cleanup_args) + return threading.Thread(target=_run_in_thread, args=()) + + +class PdbPlusShow(pdb.Pdb): + + def post_mortem(self, t): + self.reset() + while t.tb_next is not None: + t = t.tb_next + self.interaction(t.tb_frame, t) + + show = None + + def _show(self, page): + if not self.show: + print "*** No display" + return + self.show(page) + + def _importobj(self, fullname): + obj = None + name = '' + for comp in fullname.split('.'): + name += comp + obj = getattr(obj, comp, None) + if obj is None: + try: + obj = __import__(name, {}, {}, ['*']) + except ImportError: + raise NameError + name += '.' + return obj + + TRYPREFIXES = ['','pypy.','pypy.objspace.','pypy.interpreter.', 'pypy.objspace.std.' ] + + def _mygetval(self, arg, errmsg): + try: + return eval(arg, self.curframe.f_globals, + self.curframe.f_locals) + except: + t, v = sys.exc_info()[:2] + if isinstance(t, str): + exc_type_name = t + else: exc_type_name = t.__name__ + if not isinstance(arg, str): + print '*** %s' % errmsg, "\t[%s: %s]" % (exc_type_name, v) + else: + print '*** %s:' % errmsg, arg, "\t[%s: %s]" % (exc_type_name, v) + raise + + def _getobj(self, name): + if '.' 
in name: + for pfx in self.TRYPREFIXES: + try: + return self._importobj(pfx+name) + except NameError: + pass + try: + return self._mygetval(name, "Not found") + except (KeyboardInterrupt, SystemExit, MemoryError): + raise + except: + pass + return None + + def do_find(self, arg): + """find obj [as var] +find dotted named obj, possibly using prefixing with some packages +in pypy (see help pypyprefixes); the result is assigned to var or _.""" + objarg, var = self._parse_modif(arg) + obj = self._getobj(objarg) + if obj is None: + return + print obj + self._setvar(var, obj) + + def _parse_modif(self, arg, modif='as'): + var = '_' + aspos = arg.rfind(modif+' ') + if aspos != -1: + objarg = arg[:aspos].strip() + var = arg[aspos+(1+len(modif)):].strip() + else: + objarg = arg + return objarg, var + + def _setvar(self, var, obj): + self.curframe.f_locals[var] = obj + + class GiveUp(Exception): + pass + + def _make_flt(self, expr): + try: + expr = compile(expr, '', 'eval') + except SyntaxError: + print "*** syntax: %s" % expr + return None + def flt(c): + marker = object() + try: + old = self.curframe.f_locals.get('cand', marker) + self.curframe.f_locals['cand'] = c + try: + return self._mygetval(expr, "oops") + except (KeyboardInterrupt, SystemExit, MemoryError): + raise + except: + raise self.GiveUp + finally: + if old is not marker: + self.curframe.f_locals['cand'] = old + else: + del self.curframe.f_locals['cand'] + return flt + + def do_findclasses(self, arg): + """findclasses expr [as var] +find annotated classes for which expr is true, cand in it referes to +the candidate class; the result list is assigned to var or _.""" + expr, var = self._parse_modif(arg) + flt = self._make_flt(expr) + if flt is None: + return + cls = [] + try: + for c in t.annotator.getuserclasses(): + if flt(c): + cls.append(c) + except self.GiveUp: + return + self._setvar(var, cls) + + def do_findfuncs(self, arg): + """findfuncs expr [as var] +find flow-graphed functions for which expr is true, cand in it referes to +the candidate function; the result list is assigned to var or _.""" + expr, var = self._parse_modif(arg) + flt = self._make_flt(expr) + if flt is None: + return + funcs = [] + try: + for f in t.flowgraphs: + if flt(f): + funcs.append(f) + except self.GiveUp: + return + self._setvar(var, funcs) + + def do_showg(self, arg): + """showg obj +show graph for obj, obj can be an expression or a dotted name +(in which case prefixing with some packages in pypy is tried (see help pypyprefixes)). 
+if obj is a function or method, the localized call graph is shown; +if obj is a class or ClassDef the class definition graph is shown""" + from pypy.annotation.classdef import ClassDef + from pypy.translator.tool import graphpage + obj = self._getobj(arg) + if obj is None: + return + if hasattr(obj, 'im_func'): + obj = obj.im_func + if obj in t.flowgraphs: + page = graphpage.LocalizedCallGraphPage(t, obj) + elif obj in getattr(t.annotator, 'getuserclasses', lambda: {})(): + page = graphpage.ClassDefPage(t, t.annotator.getuserclasses()[obj]) + elif isinstance(obj, ClassDef): + page = graphpage.ClassDefPage(t, obj) + else: + print "*** Nothing to do" + return + self._show(page) + + def _attrs(self, arg, pr): + arg, expr = self._parse_modif(arg, 'match') + if expr == '_': + expr = 'True' + obj = self._getobj(arg) + if obj is None: + return + import types + if isinstance(obj, (type, types.ClassType)): + obj = [obj] + else: + obj = list(obj) + def longname(c): + return "%s.%s" % (c.__module__, c.__name__) + obj.sort(lambda x,y: cmp(longname(x), longname(y))) + cls = t.annotator.getuserclasses() + flt = self._make_flt(expr) + if flt is None: + return + for c in obj: + if c in cls: + try: + attrs = [a for a in cls[c].attrs.itervalues() if flt(a)] + except self.GiveUp: + return + if attrs: + print "%s:" % longname(c) + pr(attrs) + + def do_attrs(self, arg): + """attrs obj [match expr] +list annotated attrs of class obj or list of classes obj, +obj can be an expression or a dotted name +(in which case prefixing with some packages in pypy is tried (see help pypyprefixes)); +expr is an optional filtering expression; cand in it refer to the candidate Attribute +information object, which has a .name and .s_value.""" + def pr(attrs): + print " " + ' '.join([a.name for a in attrs]) + self._attrs(arg, pr) + + def do_attrsann(self, arg): + """attrsann obj [match expr] +list with their annotation annotated attrs of class obj or list of classes obj, +obj can be an expression or a dotted name +(in which case prefixing with some packages in pypy is tried (see help pypyprefixes)); +expr is an optional filtering expression; cand in it refer to the candidate Attribute +information object, which has a .name and .s_value.""" + def pr(attrs): + for a in attrs: + print ' %s %s' % (a.name, a.s_value) + self._attrs(arg, pr) + + def do_readpos(self, arg): + """readpos obj attrname [match expr] [as var] +list the read positions of annotated attr with attrname of class obj, +obj can be an expression or a dotted name +(in which case prefixing with some packages in pypy is tried (see help pypyprefixes)); +expr is an optional filtering expression; cand in it refer to the candidate read +position information, which has a .func and .block and .i; +the list of the read positions functions is set to var or _.""" + class Pos: + def __init__(self, func, block, i): + self.func = func + self.block = block + self.i = i + arg, var = self._parse_modif(arg, 'as') + arg, expr = self._parse_modif(arg, 'match') + if expr == '_': + expr = 'True' + args = arg.split() + if len(args) != 2: + print "*** expected obj attrname:", arg + return + arg, attrname = args + # allow quotes around attrname + if (attrname.startswith("'") and attrname.endswith("'") + or attrname.startswith('"') and attrname.endswith('"')): + attrname = attrname[1:-1] + + obj = self._getobj(arg) + if obj is None: + return + cls = t.annotator.getuserclasses() + if obj not in cls: + return + attrs = cls[obj].attrs + if attrname not in attrs: + print "*** bogus:", attrname + 
return + pos = attrs[attrname].read_locations + if not pos: + return + flt = self._make_flt(expr) + if flt is None: + return + r = {} + try: + for p in pos: + func, block, i = p + if flt(Pos(func, block, i)): + print func.__module__ or '?', func.__name__, block, i + r[func] = True + except self.GiveUp: + return + self._setvar(var, r.keys()) + + + def do_flowg(self, arg): + """callg obj +show flow graph for function obj, obj can be an expression or a dotted name +(in which case prefixing with some packages in pypy is tried (see help pypyprefixes))""" + import types + from pypy.translator.tool import graphpage + obj = self._getobj(arg) + if obj is None: + return + if hasattr(obj, 'im_func'): + obj = obj.im_func + if not isinstance(obj, types.FunctionType): + print "*** Not a function" + return + self._show(graphpage.FlowGraphPage(t, [obj])) + + def do_callg(self, arg): + """callg obj +show localized call-graph for function obj, obj can be an expression or a dotted name +(in which case prefixing with some packages in pypy is tried (see help pypyprefixes))""" + import types + from pypy.translator.tool import graphpage + obj = self._getobj(arg) + if obj is None: + return + if hasattr(obj, 'im_func'): + obj = obj.im_func + if not isinstance(obj, types.FunctionType): + print "*** Not a function" + return + self._show(graphpage.LocalizedCallGraphPage(t, obj)) + + def do_classhier(self, arg): + """classhier +show class hierarchy graph""" + from pypy.translator.tool import graphpage + self._show(graphpage.ClassHierarchyPage(t)) + + def help_graphs(self): + print "graph commands are: showg, flowg, callg, classhier" + + def help_ann_other(self): + print "other annotation related commands are: find, findclasses, findfuncs, attrs, attrsann, readpos" + + def help_pypyprefixes(self): + print "these prefixes are tried for dotted names in graph commands:" + print self.TRYPREFIXES + Added: pypy/dist/pypy/translator/tool/util.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/tool/util.py Tue Sep 6 11:53:05 2005 @@ -0,0 +1,96 @@ +from pypy.annotation.model import SomeObject + +def sanity_check_exceptblocks(translator): + annotator = translator.annotator + irreg = 0 + for graph in translator.flowgraphs.itervalues(): + et, ev = graph.exceptblock.inputargs + s_et = annotator.binding(et, extquery=True) + s_ev = annotator.binding(ev, extquery=True) + if s_et: + if s_et.knowntype == type: + if s_et.__class__ == SomeObject: + if hasattr(s_et, 'is_type_of') and s_et.is_type_of == [ev]: + continue + else: + if s_et.__class__ == annmodel.SomePBC: + continue + print "*****", graph.name, "exceptblock is not completely sane" + irreg += 1 + if irreg == 0: + print "*** All exceptblocks seem sane." + +def find_someobjects(translator, quiet=False): + """Find all functions in that have SomeObject in their signature.""" + annotator = translator.annotator + if not annotator: + return # no annotations available + + translator.highlight_functions = {} + + def is_someobject(var): + try: + return annotator.binding(var).__class__ == SomeObject + except KeyError: + return False + + def short_binding(var): + try: + binding = annotator.binding(var) + except KeyError: + return "?" 
+ if binding.is_constant(): + return 'const %s' % binding.__class__.__name__ + else: + return binding.__class__.__name__ + + header = True + items = [(graph.name, func, graph) + for func, graph in translator.flowgraphs.items()] + items.sort() + num = someobjnum = 0 + for graphname, func, graph in items: + unknown_input_args = len(filter(is_someobject, graph.getargs())) + unknown_return_value = is_someobject(graph.getreturnvar()) + if unknown_input_args or unknown_return_value: + someobjnum += 1 + translator.highlight_functions[func] = True + if not quiet: + if header: + header = False + print "=" * 70 + print "Functions that have SomeObject in their signature" + print "=" * 70 + print ("%(name)s(%(args)s) -> %(result)s\n" + "%(filename)s:%(lineno)s\n" + % {'name': graph.name, + 'filename': func.func_globals.get('__name__', '?'), + 'lineno': func.func_code.co_firstlineno, + 'args': ', '.join(map(short_binding, + graph.getargs())), + 'result': short_binding(graph.getreturnvar())}) + num += 1 + if not quiet: + print "=" * 70 + percent = int(num and (100.0*someobjnum / num) or 0) + print "someobjectness: %2d percent" % (percent) + print "(%d out of %d functions get or return SomeObjects" % ( + someobjnum, num) + print "=" * 70 + +def worstblocks_topten(ann, n=10): + h = [(count, block) for block, count in ann.reflowcounter.iteritems()] + h.sort() + if not h: + return + print + ansi_print(',----------------------- Top %d Most Reflown Blocks -----------------------.' % n, 36) + for i in range(n): + if not h: + break + count, block = h.pop() + ansi_print(' #%3d: reflown %d times |' % (i+1, count), 36) + about(block) + ansi_print("`----------------------------------------------------------------------------'", 36) + print + From ericvrp at codespeak.net Tue Sep 6 13:26:06 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Tue, 6 Sep 2005 13:26:06 +0200 (CEST) Subject: [pypy-svn] r17273 - pypy/dist/pypy/translator/c/test Message-ID: <20050906112606.3601C27B44@code1.codespeak.net> Author: ericvrp Date: Tue Sep 6 13:26:05 2005 New Revision: 17273 Modified: pypy/dist/pypy/translator/c/test/test_typed.py Log: Added basic list operations test (from llvm backend) that used to pass, but fails since last optz on lists. 
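For reference, tracing the body of the new test by hand in plain Python for i=0, j=0 (ordinary CPython semantics, nothing backend-specific) gives the value the C-compiled function is expected to return:

l = [1, 2, 3]
l.insert(0, 42)      # [42, 1, 2, 3]
del l[1]             # [42, 2, 3]
l.append(0)          # [42, 2, 3, 0]   (i == 0)
listlen = len(l)     # 4
l.extend(l)          # [42, 2, 3, 0, 42, 2, 3, 0]
del l[listlen:]      # [42, 2, 3, 0]
l += [5, 6]          # [42, 2, 3, 0, 5, 6]
l[1] = 0             # [42, 0, 3, 0, 5, 6]
assert l[0] == 42    # l[j] with j == 0; the compiled version must agree
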
Modified: pypy/dist/pypy/translator/c/test/test_typed.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_typed.py (original) +++ pypy/dist/pypy/translator/c/test/test_typed.py Tue Sep 6 13:26:05 2005 @@ -332,3 +332,20 @@ assert res[0] == True assert res[1] == intmask(hash(c)+hash(d)) + + def test_list_basic_ops(self): + def list_basic_ops(i=int, j=int): + l = [1,2,3] + l.insert(0, 42) + del l[1] + l.append(i) + listlen = len(l) + l.extend(l) + del l[listlen:] + l += [5,6] + l[1] = i + return l[j] + f = self.getcompiled(list_basic_ops) + for i in range(6): + for j in range(6): + assert f(i,j) == list_basic_ops(i,j) From ericvrp at codespeak.net Tue Sep 6 13:28:48 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Tue, 6 Sep 2005 13:28:48 +0200 (CEST) Subject: [pypy-svn] r17274 - pypy/dist/pypy/translator/llvm Message-ID: <20050906112848.AFC3C27B44@code1.codespeak.net> Author: ericvrp Date: Tue Sep 6 13:28:47 2005 New Revision: 17274 Modified: pypy/dist/pypy/translator/llvm/build_llvm_module.py Log: disables usage of llvm stack2heap optz pass until it's debugged Modified: pypy/dist/pypy/translator/llvm/build_llvm_module.py ============================================================================== --- pypy/dist/pypy/translator/llvm/build_llvm_module.py (original) +++ pypy/dist/pypy/translator/llvm/build_llvm_module.py Tue Sep 6 13:28:47 2005 @@ -33,8 +33,8 @@ flags = os.popen("gccas /dev/null -o /dev/null -debug-pass=Arguments 2>&1").read()[17:-1].split() -if int(os.popen("opt --help 2>&1").read().find('-heap2stack')) >= 0: - flags.insert(flags.index("-inline")+1, "-heap2stack") +#if int(os.popen("opt --help 2>&1").read().find('-heap2stack')) >= 0: +# flags.insert(flags.index("-inline")+1, "-heap2stack -debug") OPTIMIZATION_SWITCHES = " ".join(flags) #print OPTIMIZATION_SWITCHES From ericvrp at codespeak.net Tue Sep 6 13:29:40 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Tue, 6 Sep 2005 13:29:40 +0200 (CEST) Subject: [pypy-svn] r17275 - pypy/dist/pypy/translator/llvm Message-ID: <20050906112940.2C56C27B44@code1.codespeak.net> Author: ericvrp Date: Tue Sep 6 13:29:39 2005 New Revision: 17275 Added: pypy/dist/pypy/translator/llvm/externs2ll.py (contents, props changed) Modified: pypy/dist/pypy/translator/llvm/genllvm.py Log: Refactoring to clean up gen_llvm_source Added: pypy/dist/pypy/translator/llvm/externs2ll.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/llvm/externs2ll.py Tue Sep 6 13:29:39 2005 @@ -0,0 +1,134 @@ +import os +import types +import urllib + +from pypy.rpython.rmodel import inputconst, getfunctionptr +from pypy.rpython import lltype +from pypy.translator.llvm.codewriter import CodeWriter, \ + DEFAULT_TAIL, DEFAULT_CCONV + +from pypy.tool.udir import udir + + +def get_ll(ccode, function_names): + + # goto codespeak and compile our c code + request = urllib.urlencode({'ccode':ccode}) + llcode = urllib.urlopen('http://codespeak.net/pypy/llvm-gcc.cgi', request).read() + + # strip lines + ll_lines = [] + funcnames = dict([(k, True) for k in function_names]) + + # strip declares that are in funcnames + for line in llcode.split('\n'): + + # get rid of any of the structs that llvm-gcc introduces to struct types + line = line.replace("%struct.", "%") + + # strip comments + comment = line.find(';') + if comment >= 0: + line = line[:comment] + line = line.rstrip() + + # find function names, 
declare them with the default calling convertion + if line[-1:] == '{': + returntype, s = line.split(' ', 1) + funcname , s = s.split('(', 1) + funcnames[funcname] = True + if line.find("internal") == -1: + line = '%s %s' % (DEFAULT_CCONV, line,) + ll_lines.append(line) + + # patch calls to function that we just declared fastcc + ll_lines2, calltag, declaretag = [], 'call ', 'declare ' + for line in ll_lines: + i = line.find(calltag) + if i >= 0: + cconv = 'ccc' + for funcname in funcnames.iterkeys(): + if line.find(funcname) >= 0: + cconv = DEFAULT_CCONV + break + line = "%scall %s %s" % (line[:i], cconv, line[i+len(calltag):]) + if line[:len(declaretag)] == declaretag: + cconv = 'ccc' + for funcname in funcnames.keys(): + if line.find(funcname) >= 0: + cconv = DEFAULT_CCONV + break + line = "declare %s %s" % (cconv, line[len(declaretag):]) + ll_lines2.append(line) + + llcode = '\n'.join(ll_lines2) + return llcode.split('implementation') + + +def post_setup_externs(db): + rtyper = db._translator.rtyper + from pypy.translator.c.extfunc import predeclare_all + + # hacks to make predeclare_all work + db.standalone = True + db.externalfuncs = {} + decls = list(predeclare_all(db, rtyper)) + + for c_name, obj in decls: + if isinstance(obj, lltype.LowLevelType): + db.prepare_type(obj) + elif isinstance(obj, types.FunctionType): + funcptr = getfunctionptr(db._translator, obj) + c = inputconst(lltype.typeOf(funcptr), funcptr) + db.prepare_arg_value(c) + elif isinstance(lltype.typeOf(obj), lltype.Ptr): + db.prepare_constant(lltype.typeOf(obj), obj) + else: + assert False, "unhandled predeclare %s %s %s" % (c_name, type(obj), obj) + + return decls + + +def generate_llfile(db, extern_decls, support_functions, debug=False): + ccode = [] + function_names = [] + + def predeclarefn(c_name, llname): + function_names.append(llname) + assert llname[0] == "%" + llname = llname[1:] + assert '\n' not in llname + ccode.append('#define\t%s\t%s\n' % (c_name, llname)) + + for c_name, obj in extern_decls: + if isinstance(obj, lltype.LowLevelType): + s = "#define %s struct %s\n%s;\n" % (c_name, c_name, c_name) + ccode.append(s) + elif isinstance(obj, types.FunctionType): + funcptr = getfunctionptr(db._translator, obj) + c = inputconst(lltype.typeOf(funcptr), funcptr) + predeclarefn(c_name, db.repr_arg(c)) + elif isinstance(lltype.typeOf(obj), lltype.Ptr): + if isinstance(lltype.typeOf(obj._obj), lltype.FuncType): + predeclarefn(c_name, db.repr_name(obj._obj)) + + include_files = [] + # append local file + j = os.path.join + include_files.append(j(j(os.path.dirname(__file__), "module"), "genexterns.c")) + + from pypy.translator.c import extfunc + for f in ["ll_os", "ll_math", "ll_time", "ll_strtod"]: + include_files.append(j(j(os.path.dirname(extfunc.__file__), "src"), f + ".h")) + + for f in include_files: + ccode.append(open(f).read()) + + if debug: + ccode = "".join(ccode) + filename = udir.join("ccode.c") + f = open(str(filename), "w") + f.write(ccode) + f.close() + + return get_ll(ccode, function_names + support_functions) Modified: pypy/dist/pypy/translator/llvm/genllvm.py ============================================================================== --- pypy/dist/pypy/translator/llvm/genllvm.py (original) +++ pypy/dist/pypy/translator/llvm/genllvm.py Tue Sep 6 13:29:39 2005 @@ -1,6 +1,6 @@ from os.path import exists use_boehm_gc = exists('/usr/lib/libgc.so') or exists('/usr/lib/libgc.a') -use_boehm_gc = False +#use_boehm_gc = False import os import time @@ -23,79 +23,15 @@ extfunctions, gc_boehm, gc_disabled, 
dependencies from pypy.translator.llvm.node import LLVMNode from pypy.translator.llvm.structnode import StructNode +from pypy.translator.llvm.externs2ll import post_setup_externs, generate_llfile from pypy.translator.translator import Translator -from py.process import cmdexec function_count = {} -llcode_header = ll_functions = None +llexterns_header = llexterns_functions = None + -ll_func_names = [ - "%raisePyExc_IOError", - "%raisePyExc_ValueError", - "%raisePyExc_OverflowError", - "%raisePyExc_ZeroDivisionError", - "%RPyString_AsString", - "%RPyString_FromString", - "%RPyString_Size"] - -def get_ll(ccode, function_names): - - # goto codespeak and compile our c code - request = urllib.urlencode({'ccode':ccode}) - llcode = urllib.urlopen('http://codespeak.net/pypy/llvm-gcc.cgi', request).read() - - # strip lines - ll_lines = [] - function_names = list(function_names) + ll_func_names - funcnames = dict([(k, True) for k in function_names]) - - # strip declares tjat in ll_func_names - for line in llcode.split('\n'): - - # get rid of any of the structs that llvm-gcc introduces to struct types - line = line.replace("%struct.", "%") - - # strip comments - comment = line.find(';') - if comment >= 0: - line = line[:comment] - line = line.rstrip() - - # find function names, declare them with the default calling convertion - if line[-1:] == '{': - returntype, s = line.split(' ', 1) - funcname , s = s.split('(', 1) - funcnames[funcname] = True - if line.find("internal") == -1: - line = '%s %s' % (DEFAULT_CCONV, line,) - ll_lines.append(line) - - # patch calls to function that we just declared fastcc - ll_lines2, calltag, declaretag = [], 'call ', 'declare ' - for line in ll_lines: - i = line.find(calltag) - if i >= 0: - cconv = 'ccc' - for funcname in funcnames.iterkeys(): - if line.find(funcname) >= 0: - cconv = DEFAULT_CCONV - break - line = "%scall %s %s" % (line[:i], cconv, line[i+len(calltag):]) - if line[:len(declaretag)] == declaretag: - cconv = 'ccc' - for funcname in funcnames.keys(): - if line.find(funcname) >= 0: - cconv = DEFAULT_CCONV - break - line = "declare %s %s" % (cconv, line[len(declaretag):]) - ll_lines2.append(line) - - llcode = '\n'.join(ll_lines2) - global llcode_header, ll_functions - llcode_header, ll_functions = llcode.split('implementation') - class GenLLVM(object): def __init__(self, translator, debug=True): @@ -110,6 +46,15 @@ # for debug we create comments of every operation that may be executed self.debug = debug + def _checkpoint(self, msg=None): + if self.debug: + if msg: + t = (time.time() - self.starttime) + print '\t%s took %02dm%02ds' % (msg, t/60, t%60) + else: + print 'GenLLVM:' + self.starttime = time.time() + def _print_node_stats(self): """run_pypy-llvm.sh [aug 29th 2005] before slotifying: 350Mb @@ -129,6 +74,9 @@ STATS (26210, "") STATS (268884, "") """ + return #disable node stats output + if not self.debug: + return nodecount = {} for node in self.db.getnodes(): typ = type(node) @@ -141,76 +89,9 @@ for s in stats: print 'STATS', s - def post_setup_externs(self): - - rtyper = self.db._translator.rtyper - from pypy.translator.c.extfunc import predeclare_all - - # hacks to make predeclare_all work - self.db.standalone = True - self.db.externalfuncs = {} - decls = list(predeclare_all(self.db, rtyper)) - - for c_name, obj in decls: - if isinstance(obj, lltype.LowLevelType): - self.db.prepare_type(obj) - elif isinstance(obj, types.FunctionType): - funcptr = getfunctionptr(self.translator, obj) - c = inputconst(lltype.typeOf(funcptr), funcptr) - 
self.db.prepare_arg_value(c) - elif isinstance(lltype.typeOf(obj), lltype.Ptr): - self.db.prepare_constant(lltype.typeOf(obj), obj) - else: - assert False, "unhandled predeclare %s %s %s" % (c_name, type(obj), obj) - - return decls - - def generate_llfile(self, extern_decls): - ccode = [] - function_names = [] - - def predeclarefn(c_name, llname): - function_names.append(llname) - assert llname[0] == "%" - llname = llname[1:] - assert '\n' not in llname - ccode.append('#define\t%s\t%s\n' % (c_name, llname)) - - for c_name, obj in extern_decls: - if isinstance(obj, lltype.LowLevelType): - s = "#define %s struct %s\n%s;\n" % (c_name, c_name, c_name) - ccode.append(s) - elif isinstance(obj, types.FunctionType): - funcptr = getfunctionptr(self.translator, obj) - c = inputconst(lltype.typeOf(funcptr), funcptr) - predeclarefn(c_name, self.db.repr_arg(c)) - elif isinstance(lltype.typeOf(obj), lltype.Ptr): - if isinstance(lltype.typeOf(obj._obj), lltype.FuncType): - predeclarefn(c_name, self.db.repr_name(obj._obj)) - - include_files = [] - # append local file - j = os.path.join - include_files.append(j(j(os.path.dirname(__file__), "module"), "genexterns.c")) - - from pypy.translator.c import extfunc - for f in ["ll_os", "ll_math", "ll_time", "ll_strtod"]: - include_files.append(j(j(os.path.dirname(extfunc.__file__), "src"), f + ".h")) - - for f in include_files: - ccode.append(open(f).read()) - - # for debugging - ccode = "".join(ccode) - filename = udir.join("ccode.c") - f = open(str(filename), "w") - f.write(ccode) - f.close() - - get_ll(ccode, function_names) - def gen_llvm_source(self, func=None): - if self.debug: print 'gen_llvm_source begin) ' + time.ctime() + self._checkpoint() + if func is None: func = self.translator.entrypoint self.entrypoint = func @@ -219,33 +100,31 @@ c = inputconst(lltype.typeOf(ptr), ptr) entry_point = c.value._obj self.db.prepare_arg_value(c) - - #if self.debug: print 'gen_llvm_source db.setup_all) ' + time.ctime() - #7 minutes + self._checkpoint('init') # set up all nodes self.db.setup_all() self.entrynode = self.db.set_entrynode(entry_point) + self._checkpoint('setup_all') # post set up externs - extern_decls = self.post_setup_externs() + extern_decls = post_setup_externs(self.db) self.translator.rtyper.specialize_more_blocks() self.db.setup_all() - using_external_functions = extfuncnode.ExternalFuncNode.used_external_functions.keys() != [] + self._print_node_stats() + self._checkpoint('setup_all externs') - if self.debug: - self._print_node_stats() - - if llcode_header is None and using_external_functions: - self.generate_llfile(extern_decls) + support_functions = "%raisePyExc_IOError %raisePyExc_ValueError "\ + "%raisePyExc_OverflowError %raisePyExc_ZeroDivisionError "\ + "%prepare_ZeroDivisionError %prepare_OverflowError %prepare_ValueError "\ + "%RPyString_FromString %RPyString_AsString %RPyString_Size".split() + + global llexterns_header, llexterns_functions + if llexterns_header is None and using_external_functions: + llexterns_header, llexterns_functions = generate_llfile(self.db, extern_decls, support_functions, self.debug) + self._checkpoint('generate_ll') - #if self.debug: print 'gen_llvm_source typ_decl.writedatatypedecl) ' + time.ctime() - #if self.debug: print 'gen_llvm_source n_nodes) %d' % len(self.db.getnodes()) - #3 seconds - #if self.debug: - # log.gen_llvm_source(self.db.dump_pbcs()) - # prevent running the same function twice in a test if func.func_name in function_count: postfix = '_%d' % function_count[func.func_name] @@ -260,55 +139,37 @@ 
nl = codewriter.newline if using_external_functions: - nl(); comment("EXTERNAL FUNCTION DECLARATIONS") ; nl() - for s in llcode_header.split('\n'): + nl(); comment("External Function Declarations") ; nl() + for s in llexterns_header.split('\n'): codewriter.append(s) nl(); comment("Type Declarations"); nl() - for c_name, obj in extern_decls: - if isinstance(obj, lltype.LowLevelType): if isinstance(obj, lltype.Ptr): obj = obj.TO l = "%%%s = type %s" % (c_name, self.db.repr_type(obj)) codewriter.append(l) - - #XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX - #elif isinstance(obj, types.FunctionType): - # #c.value._obj.graph.name = c_name - # print "XXX predeclare" , c_name, type(obj), obj + self._checkpoint('write externs type declarations') for typ_decl in self.db.getnodes(): typ_decl.writedatatypedecl(codewriter) + self._checkpoint('write data type declarations') - if self.debug: print 'gen_llvm_source typ_decl.writeglobalconstants) ' + time.ctime() - #20 minutes nl(); comment("Global Data") ; nl() for typ_decl in self.db.getnodes(): typ_decl.writeglobalconstants(codewriter) + self._checkpoint('write global constants') - if self.debug: print 'gen_llvm_source typ_decl.writecomments) ' + time.ctime() - #0 minutes - #if self.debug: - # nl(); comment("Comments") ; nl() - # for typ_decl in self.db.getnodes(): - # typ_decl.writecomments(codewriter) - - if self.debug: print 'gen_llvm_source extdeclarations) ' + time.ctime() nl(); comment("Function Prototypes") ; nl() for extdecl in extdeclarations.split('\n'): codewriter.append(extdecl) + self._checkpoint('write function prototypes') - if self.debug: print 'gen_llvm_source self._debug_prototype) ' + time.ctime() - #if self.debug: - # self._debug_prototype(codewriter) - - if self.debug: print 'gen_llvm_source typ_decl.writedecl) ' + time.ctime() for typ_decl in self.db.getnodes(): typ_decl.writedecl(codewriter) + self._checkpoint('write declarations') - if self.debug: print 'gen_llvm_source boehm_gc) ' + time.ctime() nl(); comment("Function Implementation") codewriter.startimpl() if use_boehm_gc: @@ -318,12 +179,10 @@ for gc_func in gc_funcs.split('\n'): codewriter.append(gc_func) - if self.debug: print 'gen_llvm_source typ_decl.writeimpl) ' + time.ctime() - #XXX ? 
minutes for typ_decl in self.db.getnodes(): typ_decl.writeimpl(codewriter) + self._checkpoint('write implementations') - if self.debug: print 'gen_llvm_source entrypoint) ' + time.ctime() #XXX use codewriter methods here decl = self.entrynode.getdecl() t = decl.split('%', 1) @@ -360,15 +219,9 @@ elif entryfunc_name == 'pypy_main_noargs': #XXX just to get on with bpnn & richards extfuncnode.ExternalFuncNode.used_external_functions['%main_noargs'] = True + for f in support_functions: + extfuncnode.ExternalFuncNode.used_external_functions[f] = True - - for f in "raisePyExc_IOError raisePyExc_ValueError "\ - "raisePyExc_OverflowError raisePyExc_ZeroDivisionError "\ - "prepare_ZeroDivisionError prepare_OverflowError prepare_ValueError "\ - "RPyString_FromString RPyString_AsString RPyString_Size".split(): - extfuncnode.ExternalFuncNode.used_external_functions["%" + f] = True - - if self.debug: print 'gen_llvm_source used_external_functions) ' + time.ctime() depdone = {} for funcname,value in extfuncnode.ExternalFuncNode.used_external_functions.iteritems(): deps = dependencies(funcname,[]) @@ -382,14 +235,15 @@ for extfunc in llvm_code.split('\n'): codewriter.append(extfunc) depdone[dep] = True - + self._checkpoint('write support functions') + if using_external_functions: - nl(); comment("EXTERNAL FUNCTION IMPLEMENTATION") ; nl() - for s in ll_functions.split('\n'): + nl(); comment("External Function Implementation") ; nl() + for s in llexterns_functions.split('\n'): codewriter.append(s) + self._checkpoint('write external functions') comment("End of file") ; nl() - if self.debug: print 'gen_llvm_source return) ' + time.ctime() return filename def create_module(self, From nico at codespeak.net Tue Sep 6 13:50:40 2005 From: nico at codespeak.net (nico at codespeak.net) Date: Tue, 6 Sep 2005 13:50:40 +0200 (CEST) Subject: [pypy-svn] r17276 - pypy/extradoc/sprintinfo Message-ID: <20050906115040.CEDD627B44@code1.codespeak.net> Author: nico Date: Tue Sep 6 13:50:39 2005 New Revision: 17276 Modified: pypy/extradoc/sprintinfo/paris-2005-sprint.txt Log: fix broken link Modified: pypy/extradoc/sprintinfo/paris-2005-sprint.txt ============================================================================== --- pypy/extradoc/sprintinfo/paris-2005-sprint.txt (original) +++ pypy/extradoc/sprintinfo/paris-2005-sprint.txt Tue Sep 6 13:50:39 2005 @@ -116,7 +116,7 @@ http://codespeak.net/svn/pypy/extradoc/sprintinfo/paris-2005-people.txt -.. _`Paris people`: http://codespeak.net/pypy/index.cgi?extradoc/sprintinfo/paris-people.html +.. _`Paris people`: http://codespeak.net/pypy/index.cgi?extradoc/sprintinfo/paris-2005-people.html .. _`mailinglist`: .. _`PyPy sprint mailing list`: http://codespeak.net/mailman/listinfo/pypy-sprint .. 
_`pypy-0.7.0`: http://codespeak.net/pypy/dist/pypy/doc/release-0.7.0.html From pedronis at codespeak.net Tue Sep 6 14:17:06 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 6 Sep 2005 14:17:06 +0200 (CEST) Subject: [pypy-svn] r17281 - in pypy/dist/pypy/interpreter: pyparser/test testcompiler Message-ID: <20050906121706.B436E27B51@code1.codespeak.net> Author: pedronis Date: Tue Sep 6 14:17:04 2005 New Revision: 17281 Added: pypy/dist/pypy/interpreter/testcompiler/ - copied from r17267, pypy/dist/pypy/interpreter/stablecompiler/ Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py pypy/dist/pypy/interpreter/testcompiler/future.py pypy/dist/pypy/interpreter/testcompiler/pyassem.py pypy/dist/pypy/interpreter/testcompiler/pycodegen.py pypy/dist/pypy/interpreter/testcompiler/symbols.py pypy/dist/pypy/interpreter/testcompiler/syntax.py pypy/dist/pypy/interpreter/testcompiler/transformer.py pypy/dist/pypy/interpreter/testcompiler/visitor.py Log: yet another compiler :(, this one cannot be used with PyPy itself, is meant to be used by test_astcompiler and should be short lived (that's the hope). PS: pedronis in this very moments hates globals, non-overridable imports and wished module and packages could be a bit more like ML functors. Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py Tue Sep 6 14:17:04 2005 @@ -3,6 +3,11 @@ from pypy.interpreter.pycode import PyCode import py.test +def setup_module(mod): + import sys + if sys.version[:3] != "2.4": + py.test.skip("expected to work only on 2.4") + from pypy.interpreter.astcompiler import ast, misc, pycodegen from test_astbuilder import expressions, comparisons, funccalls, backtrackings,\ @@ -16,27 +21,27 @@ TESTS = [ expressions, -## augassigns, -## comparisons, -## funccalls, -## backtrackings, -## listmakers, -## dictmakers, -## multiexpr, -## attraccess, -## slices, -## imports, -## execs, -## prints, -## globs, -## raises_, -## # EXEC INPUTS -## # one_stmt_classdefs, -## one_stmt_funcdefs, -## if_stmts, -## tryexcepts, -## # docstrings, -## # returns, + augassigns, + comparisons, + funccalls, + backtrackings, + listmakers, + dictmakers, + multiexpr, + attraccess, + slices, + imports, + execs, + prints, + globs, + raises_, +# EXEC_INPUTS + one_stmt_classdefs, + one_stmt_funcdefs, + if_stmts, + tryexcepts, + docstrings, + returns, ] import sys @@ -59,19 +64,15 @@ return builder.rule_stack[-1] -### Note: builtin compile and compiler.compile behave differently -def compile_with_builtin_comiple( expr, target="exec" ): - return compile( expr, "", target ) - def compile_with_astcompiler(expr, target='exec', space=FakeSpace()): - ast = ast_parse_expr(epxr, target='exec', space=space) + ast = ast_parse_expr(expr, target='exec', space=space) misc.set_filename('', ast) codegen = pycodegen.ModuleCodeGenerator(space, ast) - rcode = codegenerator.getCode() + rcode = codegen.getCode() return to_code(rcode) def compile_with_stablecompiler(expr, target='exec'): - from pypy.interpreter.stablecompiler import compile + from pypy.interpreter.testcompiler import compile # from compiler import compile return compile(expr, '', target) @@ -119,7 +120,7 @@ def check_compile(expr): print "Compiling:", expr sc_code = compile_with_stablecompiler(expr, target='exec') - as_code = 
compile_with_astcompiler(expr, target='exec') + ac_code = compile_with_astcompiler(expr, target='exec') compare_code(ac_code, sc_code) ## def check_compile( expr ): @@ -135,14 +136,14 @@ ## compare_code(code1,code2) def test_compile_argtuple_1(): - py.test.skip('will be tested when more basic stuff will work') + #py.test.skip('will be tested when more basic stuff will work') code = """def f( x, (y,z) ): print x,y,z """ check_compile( code ) def test_compile_argtuple_2(): - py.test.skip('will be tested when more basic stuff will work') + #py.test.skip('will be tested when more basic stuff will work') code = """def f( x, (y,(z,t)) ): print x,y,z,t """ @@ -150,7 +151,7 @@ def test_compile_argtuple_3(): - py.test.skip('will be tested when more basic stuff will work') + #py.test.skip('will be tested when more basic stuff will work') code = """def f( x, (y,(z,(t,u))) ): print x,y,z,t,u """ Modified: pypy/dist/pypy/interpreter/testcompiler/future.py ============================================================================== --- pypy/dist/pypy/interpreter/stablecompiler/future.py (original) +++ pypy/dist/pypy/interpreter/testcompiler/future.py Tue Sep 6 14:17:04 2005 @@ -2,7 +2,7 @@ """ -from pypy.interpreter.stablecompiler import ast, walk +from pypy.interpreter.testcompiler import ast, walk def is_future(stmt): """Return true if statement is a well-formed future statement""" @@ -69,7 +69,7 @@ if __name__ == "__main__": import sys - from pypy.interpreter.stablecompiler import parseFile, walk + from pypy.interpreter.testcompiler import parseFile, walk for file in sys.argv[1:]: print file Modified: pypy/dist/pypy/interpreter/testcompiler/pyassem.py ============================================================================== --- pypy/dist/pypy/interpreter/stablecompiler/pyassem.py (original) +++ pypy/dist/pypy/interpreter/testcompiler/pyassem.py Tue Sep 6 14:17:04 2005 @@ -5,8 +5,8 @@ import sys import types -from pypy.interpreter.stablecompiler import misc -from pypy.interpreter.stablecompiler.consts \ +from pypy.interpreter.testcompiler import misc +from pypy.interpreter.testcompiler.consts \ import CO_OPTIMIZED, CO_NEWLOCALS, CO_VARARGS, CO_VARKEYWORDS class FlowGraph: Modified: pypy/dist/pypy/interpreter/testcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/stablecompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/testcompiler/pycodegen.py Tue Sep 6 14:17:04 2005 @@ -6,13 +6,13 @@ import types from cStringIO import StringIO -from pypy.interpreter.stablecompiler import ast, parse, walk, syntax -from pypy.interpreter.stablecompiler import pyassem, misc, future, symbols -from pypy.interpreter.stablecompiler.consts import SC_LOCAL, SC_GLOBAL, \ +from pypy.interpreter.testcompiler import ast, parse, walk, syntax +from pypy.interpreter.testcompiler import pyassem, misc, future, symbols +from pypy.interpreter.testcompiler.consts import SC_LOCAL, SC_GLOBAL, \ SC_FREE, SC_CELL, SC_DEFAULT -from pypy.interpreter.stablecompiler.consts import CO_VARARGS, CO_VARKEYWORDS, \ +from pypy.interpreter.testcompiler.consts import CO_VARARGS, CO_VARKEYWORDS, \ CO_NEWLOCALS, CO_NESTED, CO_GENERATOR, CO_GENERATOR_ALLOWED, CO_FUTURE_DIVISION -from pypy.interpreter.stablecompiler.pyassem import TupleArg +from pypy.interpreter.testcompiler.pyassem import TupleArg # XXX The version-specific code can go, since this code only works with 2.x. # Do we have Python 1.x or Python 2.x? 
Modified: pypy/dist/pypy/interpreter/testcompiler/symbols.py ============================================================================== --- pypy/dist/pypy/interpreter/stablecompiler/symbols.py (original) +++ pypy/dist/pypy/interpreter/testcompiler/symbols.py Tue Sep 6 14:17:04 2005 @@ -1,9 +1,9 @@ """Module symbol-table generator""" -from pypy.interpreter.stablecompiler import ast -from pypy.interpreter.stablecompiler.consts import SC_LOCAL, SC_GLOBAL, \ +from pypy.interpreter.testcompiler import ast +from pypy.interpreter.testcompiler.consts import SC_LOCAL, SC_GLOBAL, \ SC_FREE, SC_CELL, SC_UNKNOWN, SC_DEFAULT -from pypy.interpreter.stablecompiler.misc import mangle +from pypy.interpreter.testcompiler.misc import mangle import types @@ -431,7 +431,7 @@ if __name__ == "__main__": import sys - from pypy.interpreter.stablecompiler import parseFile, walk + from pypy.interpreter.testcompiler import parseFile, walk import symtable def get_names(syms): Modified: pypy/dist/pypy/interpreter/testcompiler/syntax.py ============================================================================== --- pypy/dist/pypy/interpreter/stablecompiler/syntax.py (original) +++ pypy/dist/pypy/interpreter/testcompiler/syntax.py Tue Sep 6 14:17:04 2005 @@ -9,7 +9,7 @@ errors. """ -from pypy.interpreter.stablecompiler import ast, walk +from pypy.interpreter.testcompiler import ast, walk def check(tree, multi=None): v = SyntaxErrorChecker(multi) Modified: pypy/dist/pypy/interpreter/testcompiler/transformer.py ============================================================================== --- pypy/dist/pypy/interpreter/stablecompiler/transformer.py (original) +++ pypy/dist/pypy/interpreter/testcompiler/transformer.py Tue Sep 6 14:17:04 2005 @@ -27,10 +27,10 @@ # make sure we import the parser with the correct grammar import pypy.interpreter.pyparser.pythonparse -from pypy.interpreter.stablecompiler.ast import * +from pypy.interpreter.testcompiler.ast import * import parser -import pypy.interpreter.pyparser.pysymbol as symbol -import pypy.interpreter.pyparser.pytoken as token +import symbol as symbol +import token as token import sys # transforming is requiring a lot of recursion depth so make sure we have enough Modified: pypy/dist/pypy/interpreter/testcompiler/visitor.py ============================================================================== --- pypy/dist/pypy/interpreter/stablecompiler/visitor.py (original) +++ pypy/dist/pypy/interpreter/testcompiler/visitor.py Tue Sep 6 14:17:04 2005 @@ -1,4 +1,4 @@ -from pypy.interpreter.stablecompiler import ast +from pypy.interpreter.testcompiler import ast # XXX should probably rename ASTVisitor to ASTWalker # XXX can it be made even more generic? 
From pedronis at codespeak.net Tue Sep 6 15:15:17 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 6 Sep 2005 15:15:17 +0200 (CEST) Subject: [pypy-svn] r17287 - in pypy/dist/pypy: annotation translator/test Message-ID: <20050906131517.2A7AD27B70@code1.codespeak.net> Author: pedronis Date: Tue Sep 6 15:15:16 2005 New Revision: 17287 Modified: pypy/dist/pypy/annotation/binaryop.py pypy/dist/pypy/translator/test/test_annrpython.py Log: support mixing dicts and None (in annotation) Modified: pypy/dist/pypy/annotation/binaryop.py ============================================================================== --- pypy/dist/pypy/annotation/binaryop.py (original) +++ pypy/dist/pypy/annotation/binaryop.py Tue Sep 6 15:15:16 2005 @@ -584,6 +584,17 @@ def union((pbc, lst)): return pair(lst, pbc).union() +# let mix dicts and None +class __extend__(pairtype(SomeDict, SomePBC)): + def union((dct, pbc)): + if pbc.isNone(): + return dct + return SomeObject() + +class __extend__(pairtype(SomePBC, SomeDict )): + def union((pbc, dct)): + return pair(dct, pbc).union() + # mixing strings and None class __extend__(pairtype(SomeString, SomePBC)): Modified: pypy/dist/pypy/translator/test/test_annrpython.py ============================================================================== --- pypy/dist/pypy/translator/test/test_annrpython.py (original) +++ pypy/dist/pypy/translator/test/test_annrpython.py Tue Sep 6 15:15:16 2005 @@ -1524,6 +1524,17 @@ assert s.items[0].can_be_None assert s.items[1] == a.bookkeeper.immutablevalue(A.hello) + def test_dict_and_none(self): + def f(i): + if i: + return {} + else: + return None + a = self.RPythonAnnotator() + s = a.build_types(f, [int]) + assert s.knowntype == dict + + def g(n): return [0,1,2,n] From arigo at codespeak.net Tue Sep 6 15:19:24 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 6 Sep 2005 15:19:24 +0200 (CEST) Subject: [pypy-svn] r17288 - pypy/dist/pypy/annotation Message-ID: <20050906131924.E9F7027B80@code1.codespeak.net> Author: arigo Date: Tue Sep 6 15:19:24 2005 New Revision: 17288 Modified: pypy/dist/pypy/annotation/model.py Log: Dicts can also be None now. Modified: pypy/dist/pypy/annotation/model.py ============================================================================== --- pypy/dist/pypy/annotation/model.py (original) +++ pypy/dist/pypy/annotation/model.py Tue Sep 6 15:19:24 2005 @@ -258,7 +258,7 @@ return selfdic == otherdic def can_be_none(self): - return False + return True def fmt_const(self, const): if len(const) < 20: From arigo at codespeak.net Tue Sep 6 15:34:35 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 6 Sep 2005 15:34:35 +0200 (CEST) Subject: [pypy-svn] r17291 - in pypy/dist/pypy/rpython: . test Message-ID: <20050906133435.08D6127B82@code1.codespeak.net> Author: arigo Date: Tue Sep 6 15:34:35 2005 New Revision: 17291 Modified: pypy/dist/pypy/rpython/rdict.py pypy/dist/pypy/rpython/test/test_rdict.py Log: Support dict-or-None in the rtyper. 
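For reference, a minimal sketch of the RPython pattern that r17287, r17288 and r17291 together are meant to allow (the function names below are made up for illustration; only the dict-or-None shape matters). The annotator now unifies the two return paths into a dict annotation instead of falling back to SomeObject, and the rtyper can represent the None case as a null dict pointer, with truth-testing going through the new ll_strdict_is_true (see the diff below):

def make_cache(n):
    if n > 0:
        return {str(n): n}   # a string-keyed dict on one path ...
    return None              # ... or None on the other path

def cache_is_empty(cache):
    # works whether 'cache' is None, an empty dict or a filled dict
    return not cache
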
Modified: pypy/dist/pypy/rpython/rdict.py ============================================================================== --- pypy/dist/pypy/rpython/rdict.py (original) +++ pypy/dist/pypy/rpython/rdict.py Tue Sep 6 15:34:35 2005 @@ -84,6 +84,8 @@ def convert_const(self, dictobj): # get object from bound dict methods #dictobj = getattr(dictobj, '__self__', dictobj) + if dictobj is None: + return nullptr(self.STRDICT) if not isinstance(dictobj, dict): raise TyperError("expected a dict: %r" % (dictobj,)) try: @@ -105,6 +107,10 @@ v_dict, = hop.inputargs(self) return hop.gendirectcall(ll_strdict_len, v_dict) + def rtype_is_true(self, hop): + v_dict, = hop.inputargs(self) + return hop.gendirectcall(ll_strdict_is_true, v_dict) + def make_iterator_repr(self): return StrDictIteratorRepr(self) @@ -193,6 +199,10 @@ def ll_strdict_len(d): return d.num_items +def ll_strdict_is_true(d): + # check if a dict is True, allowing for None + return bool(d) and d.num_items != 0 + def ll_strdict_getitem(d, key): entry = ll_strdict_lookup(d, key) if entry.key and entry.key != deleted_entry_marker: Modified: pypy/dist/pypy/rpython/test/test_rdict.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rdict.py (original) +++ pypy/dist/pypy/rpython/test/test_rdict.py Tue Sep 6 15:34:35 2005 @@ -254,3 +254,21 @@ res = interpret(func, [1]) assert res is False +def dict_or_none(): + class A: + pass + def negate(d): + return not d + def func(n): + a = A() + a.d = None + if n > 0: + a.d = {str(n): 1, "42": 2} + del a.d["42"] + return negate(a.d) + res = interpret(func, [10]) + assert res is False + res = interpret(func, [0]) + assert res is True + res = interpret(func, [42]) + assert res is True From adim at codespeak.net Tue Sep 6 16:07:19 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Tue, 6 Sep 2005 16:07:19 +0200 (CEST) Subject: [pypy-svn] r17292 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050906140719.0EA9327B43@code1.codespeak.net> Author: adim Date: Tue Sep 6 16:07:13 2005 New Revision: 17292 Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py pypy/dist/pypy/interpreter/astcompiler/visitor.py Log: small changes to let the annotation go a litlle bit further in astcompiler. 
Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Tue Sep 6 16:07:13 2005 @@ -397,9 +397,11 @@ for name in frees: self.emitop('LOAD_CLOSURE', name) self.emitop_obj('LOAD_CONST', gen) + # self.emitop_obj('LOAD_CONST', gen.getCode()) self.emitop_int('MAKE_CLOSURE', len(node.defaults)) else: self.emitop_obj('LOAD_CONST', gen) + # self.emitop_obj('LOAD_CONST', gen.getCode()) self.emitop_int('MAKE_FUNCTION', len(node.defaults)) for i in range(ndecorators): @@ -419,6 +421,7 @@ for name in frees: self.emitop('LOAD_CLOSURE', name) self.emitop_obj('LOAD_CONST', gen) + # self.emitop_obj('LOAD_CONST', gen.getCode()) if frees: self.emitop_int('MAKE_CLOSURE', 0) else: @@ -656,9 +659,11 @@ for name in frees: self.emitop('LOAD_CLOSURE', name) self.emitop_obj('LOAD_CONST', gen) + # self.emitop_obj('LOAD_CONST', gen.getCode()) self.emitop_int('MAKE_CLOSURE', 0) else: self.emitop_obj('LOAD_CONST', gen) + # self.emitop_obj('LOAD_CONST', gen.getCode()) self.emitop_int('MAKE_FUNCTION', 0) # precomputation of outmost iterable @@ -1143,7 +1148,7 @@ # object constructors def visitEllipsis(self, node): - self.emitop_obj('LOAD_CONST', self.space.wrap(Ellipsis) ) + return self.emitop_obj('LOAD_CONST', self.space.w_Ellipsis) def visitTuple(self, node): self.set_lineno(node) @@ -1371,7 +1376,8 @@ def findOp(node): """Find the op (DELETE, LOAD, STORE) in an AssTuple tree""" v = OpFinder() - walk(node, v, verbose=0) + # walk(node, v, verbose=0) + node.accept(v) return v.op class OpFinder(ast.ASTVisitor): Modified: pypy/dist/pypy/interpreter/astcompiler/visitor.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/visitor.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/visitor.py Tue Sep 6 16:07:13 2005 @@ -106,8 +106,8 @@ walker.preorder(tree, visitor) return walker.visitor -def walk(tree, visitor, verbose=None): - tree.accept( visitor ) +def walk(tree, visitor, verbose=-1): + tree.accept(visitor) return visitor def dumpNode(node): From arigo at codespeak.net Tue Sep 6 17:12:33 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 6 Sep 2005 17:12:33 +0200 (CEST) Subject: [pypy-svn] r17294 - pypy/dist/pypy/annotation Message-ID: <20050906151233.6893D27B51@code1.codespeak.net> Author: arigo Date: Tue Sep 6 17:12:32 2005 New Revision: 17294 Modified: pypy/dist/pypy/annotation/model.py Log: Removed a few lines that were apparently accidentally checked in, in rev 12254, along with a block of comments. 
Modified: pypy/dist/pypy/annotation/model.py ============================================================================== --- pypy/dist/pypy/annotation/model.py (original) +++ pypy/dist/pypy/annotation/model.py Tue Sep 6 17:12:32 2005 @@ -34,7 +34,7 @@ from pypy.objspace.flow.model import Constant from pypy.tool.tls import tlsobject import inspect -import copy + DEBUG = True # set to False to disable recording of debugging information TLS = tlsobject() @@ -119,7 +119,6 @@ def __new__(cls, *args, **kw): self = super(SomeObject, cls).__new__(cls, *args, **kw) if DEBUG: - so = SomeObject try: bookkeeper = pypy.annotation.bookkeeper.getbookkeeper() position_key = bookkeeper.position_key @@ -143,9 +142,6 @@ caused_by_merge = property(caused_by_merge, set_caused_by_merge) del set_caused_by_merge - def __setattr__(self, key, value): - object.__setattr__(self, key, value) - def can_be_none(self): return True From tismer at codespeak.net Tue Sep 6 18:06:21 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Tue, 6 Sep 2005 18:06:21 +0200 (CEST) Subject: [pypy-svn] r17295 - in pypy/dist/pypy/rpython: . test Message-ID: <20050906160621.6D75A27B5B@code1.codespeak.net> Author: tismer Date: Tue Sep 6 18:06:18 2005 New Revision: 17295 Modified: pypy/dist/pypy/rpython/rlist.py pypy/dist/pypy/rpython/test/test_rlist.py Log: fixed some bad bugs in ll_insert, added more tests Modified: pypy/dist/pypy/rpython/rlist.py ============================================================================== --- pypy/dist/pypy/rpython/rlist.py (original) +++ pypy/dist/pypy/rpython/rlist.py Tue Sep 6 18:06:18 2005 @@ -394,8 +394,8 @@ i = length items = l.items i1 = i+1 - while i > 0: - items[i] = items[i1] + while i >= 0: + items[i1] = items[i] i1 = i i -= 1 items[0] = newitem @@ -406,11 +406,11 @@ items = l.items i = length i1 = i+1 - while i > index: - items[i] = items[i1] + while i >= index: + items[i1] = items[i] i1 = i i -= 1 - items[i] = newitem + items[index] = newitem def ll_insert(l, index, newitem): if index < 0: Modified: pypy/dist/pypy/rpython/test/test_rlist.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rlist.py (original) +++ pypy/dist/pypy/rpython/test/test_rlist.py Tue Sep 6 18:06:18 2005 @@ -394,3 +394,20 @@ assert res._obj.value == 3 res = interpret(fn, [-2]) assert res._obj.value == "oups" + +def test_list_basic_ops(): + def list_basic_ops(i=int, j=int): + l = [1,2,3] + l.insert(0, 42) + del l[1] + l.append(i) + listlen = len(l) + l.extend(l) + del l[listlen:] + l += [5,6] + l[1] = i + return l[j] + for i in range(6): + for j in range(6): + res = interpret(list_basic_ops, [i, j]) + assert res == list_basic_ops(i, j) From arigo at codespeak.net Tue Sep 6 19:01:15 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 6 Sep 2005 19:01:15 +0200 (CEST) Subject: [pypy-svn] r17297 - pypy/dist/pypy/doc Message-ID: <20050906170115.2A36D27B5B@code1.codespeak.net> Author: arigo Date: Tue Sep 6 19:01:14 2005 New Revision: 17297 Modified: pypy/dist/pypy/doc/translation.txt Log: Reworded description of hop.exception_cannot_occur(). Modified: pypy/dist/pypy/doc/translation.txt ============================================================================== --- pypy/dist/pypy/doc/translation.txt (original) +++ pypy/dist/pypy/doc/translation.txt Tue Sep 6 19:01:14 2005 @@ -914,11 +914,15 @@ 'direct_call' and other operations that can just raise any exception. 
``hop.exception_cannot_occur()`` - A special case for ``override:ignore`` meaning that there is, after - all, no operation left that could raise an exception. (The RTyper - class normally verifies that exception_is_here() was really called - once for each high-level operation that is in the scope of - exception-catching links.) + The RTyper normally verifies that exception_is_here() was really + called once for each high-level operation that is in the scope of + exception-catching links. By saying exception_cannot_occur(), + you say that after all this particular operation cannot raise + anything. (It can be the case that unexpected exception links are + attached to flow graphs; e.g. any method call within a + ``try:finally:`` block will have an Exception branch to the finally + part, which only the RTyper can remove if exception_cannot_occur() + is called.) .. _LLInterpreter: From tismer at codespeak.net Tue Sep 6 19:32:52 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Tue, 6 Sep 2005 19:32:52 +0200 (CEST) Subject: [pypy-svn] r17298 - in pypy/dist/pypy: objspace/std objspace/std/test rpython rpython/test translator/c Message-ID: <20050906173252.1712827B5B@code1.codespeak.net> Author: tismer Date: Tue Sep 6 19:32:48 2005 New Revision: 17298 Modified: pypy/dist/pypy/objspace/std/listobject.py pypy/dist/pypy/objspace/std/marshal_impl.py pypy/dist/pypy/objspace/std/test/test_listobject.py pypy/dist/pypy/rpython/rlist.py pypy/dist/pypy/rpython/rtuple.py pypy/dist/pypy/rpython/test/test_rlist.py pypy/dist/pypy/translator/c/extfunc.py Log: quite a big check-in, this was not my intent. I removed overallocation from listobject.py and tried to map as much as possible directly to rlist.py Also did some simplifications and additions to rlist, which unfortunately affected quite some code. new: rlist supports inplace multiplication. rlist.pop specializes on error checking, now. The latter causes the rtyper to break. I'm checking in, since this looks like a bug in rtyper. 
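As a rough sketch of the listobject.py side of this change (illustrative classes only, not the real W_ListObject): the CPython-style ob_item/ob_size buffer with overallocation is dropped, and the wrapped items live in a single plain wrappeditems list, so growing and shrinking are delegated to rlist:

class OldLayout(object):                # before: manual buffer management
    def __init__(self, items):
        self.ob_item = list(items)      # storage, possibly larger than needed
        self.ob_size = len(items)       # number of valid entries

class NewLayout(object):                # after: one plain list, rlist does the rest
    def __init__(self, items):
        self.wrappeditems = list(items)

    def length(self):
        return len(self.wrappeditems)   # e.g. len__List now reduces to this
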
Modified: pypy/dist/pypy/objspace/std/listobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/listobject.py (original) +++ pypy/dist/pypy/objspace/std/listobject.py Tue Sep 6 19:32:48 2005 @@ -14,31 +14,18 @@ def __init__(w_self, space, wrappeditems): W_Object.__init__(w_self, space) - w_self.ob_item = [] - w_self.ob_size = 0 - newlen = len(wrappeditems) - _list_resize(w_self, newlen) - w_self.ob_size = newlen - items = w_self.ob_item - p = newlen - while p: - p -= 1 - items[p] = wrappeditems[p] + w_self.wrappeditems = wrappeditems def __repr__(w_self): """ representation for debugging purposes """ - reprlist = [repr(w_item) for w_item in w_self.ob_item[:w_self.ob_size]] + reprlist = [repr(w_item) for w_item in w_self.wrappeditems] return "%s(%s)" % (w_self.__class__.__name__, ', '.join(reprlist)) def unwrap(w_list): space = w_list.space - items = [space.unwrap(w_item) for w_item in w_list.ob_item[:w_list.ob_size]]# XXX generic mixed types unwrap + items = [space.unwrap(w_item) for w_item in w_list.wrappeditems]# XXX generic mixed types unwrap return list(items) - def clear(w_list): - w_list.ob_item = [] - w_list.ob_size = 0 - registerimplementation(W_ListObject) @@ -47,60 +34,45 @@ w_iterable, = __args__.parse('list', (['sequence'], None, None), # signature [W_ListObject(space, [])]) # default argument - w_list.clear() - - length = 0 - try: - length = space.int_w(space.len(w_iterable)) - if length < 0: - length = 8 - except OperationError, e: - pass # for now - _list_resize(w_list, length) - w_iterator = space.iter(w_iterable) - while True: - try: - w_item = space.next(w_iterator) - except OperationError, e: - if not e.match(space, space.w_StopIteration): - raise - break # done - _ins1(w_list, w_list.ob_size, w_item) - + w_list.wrappeditems = space.unpackiterable(w_iterable) def len__List(space, w_list): - result = w_list.ob_size + result = len(w_list.wrappeditems) return W_IntObject(space, result) def getitem__List_ANY(space, w_list, w_index): idx = space.int_w(w_index) - if idx < 0: - idx += w_list.ob_size - if idx < 0 or idx >= w_list.ob_size: + try: + w_ret = w_list.wrappeditems[idx] + except IndexError: raise OperationError(space.w_IndexError, space.wrap("list index out of range")) - w_item = w_list.ob_item[idx] - return w_item + return w_ret def getitem__List_Slice(space, w_list, w_slice): - length = w_list.ob_size + # XXX consider to extend rlist's functionality? 
+ length = len(w_list.wrappeditems) start, stop, step, slicelength = slicetype.indices4(space, w_slice, length) assert slicelength >= 0 - w_res = W_ListObject(space, []) - _list_resize(w_res, slicelength) - items = w_list.ob_item - subitems = w_res.ob_item + if step == 1 and stop >= start >= 0: + assert stop >= 0 + assert start >= 0 + return W_ListObject(space, w_list.wrappeditems[start:stop]) + w_res = W_ListObject(space, [None] * slicelength) + items = w_list.wrappeditems + subitems = w_res.wrappeditems for i in range(slicelength): subitems[i] = items[start] start += step - w_res.ob_size = slicelength return w_res def contains__List_ANY(space, w_list, w_obj): # needs to be safe against eq_w() mutating the w_list behind our back i = 0 - while i < w_list.ob_size: - if space.eq_w(w_list.ob_item[i], w_obj): + items_w = w_list.wrappeditems + length = len(items_w) + while i < length: + if space.eq_w(items_w[i], w_obj): return space.w_True i += 1 return space.w_False @@ -110,21 +82,7 @@ return iterobject.W_SeqIterObject(space, w_list) def add__List_List(space, w_list1, w_list2): - w_res = W_ListObject(space, []) - newlen = w_list1.ob_size + w_list2.ob_size - _list_resize(w_res, newlen) - p = 0 - items = w_res.ob_item - src = w_list1.ob_item - for i in range(w_list1.ob_size): - items[p] = src[i] - p += 1 - src = w_list2.ob_item - for i in range(w_list2.ob_size): - items[p] = src[i] - p += 1 - w_res.ob_size = p - return w_res + return W_ListObject(space, w_list1.wrappeditems + w_list2.wrappeditems) def inplace_add__List_ANY(space, w_list1, w_iterable2): list_extend__List_ANY(space, w_list1, w_iterable2) @@ -137,19 +95,7 @@ if e.match(space, space.w_TypeError): raise FailedToImplement raise - w_res = W_ListObject(space, []) - size = w_list.ob_size - newlen = size * times # XXX check overflow - _list_resize(w_res, newlen) - src = w_list.ob_item - items = w_res.ob_item - p = 0 - for _ in range(times): - for i in range(size): - items[p] = src[i] - p += 1 - w_res.ob_size = p - return w_res + return W_ListObject(space, w_list.wrappeditems * times) def mul__List_ANY(space, w_list, w_times): return mul_list_times(space, w_list, w_times) @@ -164,31 +110,19 @@ if e.match(space, space.w_TypeError): raise FailedToImplement raise - if times <= 0: - w_list.clear() - return w_list - size = w_list.ob_size - newlen = size * times # XXX check overflow - _list_resize(w_list, newlen) - items = w_list.ob_item - p = size - for _ in range(1, times): - for i in range(size): - items[p] = items[i] - p += 1 - w_list.ob_size = newlen + w_list.wrappeditems *= times return w_list def eq__List_List(space, w_list1, w_list2): # needs to be safe against eq_w() mutating the w_lists behind our back - if w_list1.ob_size != w_list2.ob_size: + if len(w_list1.wrappeditems) != len(w_list2.wrappeditems): return space.w_False i = 0 - while i < w_list1.ob_size and i < w_list2.ob_size: - if not space.eq_w(w_list1.ob_item[i], w_list2.ob_item[i]): + while i < len(w_list1.wrappeditems) and i < len(w_list2.wrappeditems): + if not space.eq_w(w_list1.wrappeditems[i], w_list2.wrappeditems[i]): return space.w_False i += 1 - return space.newbool(w_list1.ob_size == w_list2.ob_size) + return space.newbool(len(w_list1.wrappeditems) == len(w_list2.wrappeditems)) def _min(a, b): if a < b: @@ -199,57 +133,55 @@ # needs to be safe against eq_w() mutating the w_lists behind our back # Search for the first index where items are different i = 0 - while i < w_list1.ob_size and i < w_list2.ob_size: - w_item1 = w_list1.ob_item[i] - w_item2 = w_list2.ob_item[i] + 
while i < len(w_list1.wrappeditems) and i < len(w_list2.wrappeditems): + w_item1 = w_list1.wrappeditems[i] + w_item2 = w_list2.wrappeditems[i] if not space.eq_w(w_item1, w_item2): return space.lt(w_item1, w_item2) i += 1 # No more items to compare -- compare sizes - return space.newbool(w_list1.ob_size < w_list2.ob_size) + return space.newbool(len(w_list1.wrappeditems) < len(w_list2.wrappeditems)) def gt__List_List(space, w_list1, w_list2): # needs to be safe against eq_w() mutating the w_lists behind our back # Search for the first index where items are different i = 0 - while i < w_list1.ob_size and i < w_list2.ob_size: - w_item1 = w_list1.ob_item[i] - w_item2 = w_list2.ob_item[i] + while i < len(w_list1.wrappeditems) and i < len(w_list2.wrappeditems): + w_item1 = w_list1.wrappeditems[i] + w_item2 = w_list2.wrappeditems[i] if not space.eq_w(w_item1, w_item2): return space.gt(w_item1, w_item2) i += 1 # No more items to compare -- compare sizes - return space.newbool(w_list1.ob_size > w_list2.ob_size) + return space.newbool(len(w_list1.wrappeditems) > len(w_list2.wrappeditems)) -# upto here, lists are nearly identical to tuples, despite the -# fact that we now support over-allocation! def delitem__List_ANY(space, w_list, w_idx): - i = space.int_w(w_idx) - if i < 0: - i += w_list.ob_size - if i < 0 or i >= w_list.ob_size: + idx = space.int_w(w_idx) + try: + del w_list.wrappeditems[idx] + except IndexError: raise OperationError(space.w_IndexError, space.wrap("list deletion index out of range")) - _del_slice(w_list, i, i+1) return space.w_None def delitem__List_Slice(space, w_list, w_slice): - start, stop, step, slicelength = slicetype.indices4(space, w_slice, w_list.ob_size) + start, stop, step, slicelength = slicetype.indices4(space, w_slice, + len(w_list.wrappeditems)) if slicelength==0: return - if step<0: - start = start+step*(slicelength-1) + if step < 0: + start = start + step * (slicelength-1) step = -step # stop is invalid if step == 1: _del_slice(w_list, start, start+slicelength) else: - items = w_list.ob_item - n = w_list.ob_size + items = w_list.wrappeditems + n = len(items) recycle = [None] * slicelength i = start @@ -267,15 +199,13 @@ recycle[discard] = items[i] j = i+1 - while j= 0 # annotator hint - w_list.ob_size = n - for i in range(w_list.ob_size, n): - items[i] = None + start = n - slicelength + assert start >= 0 # annotator hint + # XXX allow negative indices in rlist + del items[start:] # now we can destruct recycle safely, regardless of # side-effects to the list del recycle[:] @@ -284,43 +214,38 @@ def setitem__List_ANY_ANY(space, w_list, w_index, w_any): idx = space.int_w(w_index) - if idx < 0: - idx += w_list.ob_size - if idx < 0 or idx >= w_list.ob_size: + try: + w_list.wrappeditems[idx] = w_any + except IndexError: raise OperationError(space.w_IndexError, space.wrap("list index out of range")) - w_list.ob_item[idx] = w_any return space.w_None def setitem__List_Slice_List(space, w_list, w_slice, w_list2): - return _setitem_slice_helper(space, w_list, w_slice, w_list2.ob_item, w_list2.ob_size) + l = w_list2.wrappeditems + return _setitem_slice_helper(space, w_list, w_slice, l, len(l)) def setitem__List_Slice_Tuple(space, w_list, w_slice, w_tuple): t = w_tuple.wrappeditems return _setitem_slice_helper(space, w_list, w_slice, t, len(t)) def setitem__List_Slice_ANY(space, w_list, w_slice, w_iterable): -## if isinstance(w_iterable, W_ListObject): -## return _setitem_slice_helper(space, w_list, w_slice, -## w_iterable.ob_item, w_iterable.ob_size) -## if 
isinstance(w_iterable, W_TupleObject): -## t = w_iterable.wrappeditems -## else: - t = space.unpackiterable(w_iterable) - return _setitem_slice_helper(space, w_list, w_slice, t, len(t)) + l = space.unpackiterable(w_iterable) + return _setitem_slice_helper(space, w_list, w_slice, l, len(l)) def _setitem_slice_helper(space, w_list, w_slice, sequence2, len2): - start, stop, step, slicelength = slicetype.indices4(space, w_slice, w_list.ob_size) + oldsize = len(w_list.wrappeditems) + start, stop, step, slicelength = slicetype.indices4(space, w_slice, + oldsize) assert slicelength >= 0 + items = w_list.wrappeditems if step == 1: # Support list resizing for non-extended slices - oldsize = w_list.ob_size delta = len2 - slicelength if delta >= 0: newsize = oldsize + delta - _list_resize(w_list, newsize) - w_list.ob_size = newsize - items = w_list.ob_item + # XXX support this in rlist! + items += [None] * delta for i in range(newsize-1, start+len2-1, -1): items[i] = items[i-delta] else: @@ -331,7 +256,6 @@ "assign sequence of size %d to extended slice of size %d" % (len2,slicelength))) - items = w_list.ob_item if sequence2 is items: if step > 0: # Always copy starting from the right to avoid @@ -366,7 +290,7 @@ listrepr = app.interphook("listrepr") def repr__List(space, w_list): - if w_list.ob_size == 0: + if len(w_list.wrappeditems) == 0: return space.wrap('[]') w_currently_in_repr = space.getexecutioncontext()._py_repr return listrepr(space, w_currently_in_repr, w_list) @@ -374,101 +298,22 @@ def hash__List(space,w_list): raise OperationError(space.w_TypeError,space.wrap("list objects are unhashable")) -# adapted C code -def _roundupsize(n): - nbits = r_uint(0) - n2 = n >> 5 - -## /* Round up: -## * If n < 256, to a multiple of 8. -## * If n < 2048, to a multiple of 64. -## * If n < 16384, to a multiple of 512. -## * If n < 131072, to a multiple of 4096. -## * If n < 1048576, to a multiple of 32768. -## * If n < 8388608, to a multiple of 262144. -## * If n < 67108864, to a multiple of 2097152. -## * If n < 536870912, to a multiple of 16777216. -## * ... -## * If n < 2**(5+3*i), to a multiple of 2**(3*i). -## * -## * This over-allocates proportional to the list size, making room -## * for additional growth. The over-allocation is mild, but is -## * enough to give linear-time amortized behavior over a long -## * sequence of appends() in the presence of a poorly-performing -## * system realloc() (which is a reality, e.g., across all flavors -## * of Windows, with Win9x behavior being particularly bad -- and -## * we've still got address space fragmentation problems on Win9x -## * even with this scheme, although it requires much longer lists to -## * provoke them than it used to). -## */ - while 1: - n2 >>= 3 - nbits += 3 - if not n2 : - break - return ((n >> nbits) + 1) << nbits - -# before we have real arrays, -# we use lists, allocated to fixed size. -# XXX memory overflow is ignored here. -# See listobject.c for reference. 
- -for_later = """ -#define NRESIZE(var, type, nitems) \ -do { \ - size_t _new_size = _roundupsize(nitems); \ - if (_new_size <= ((~(size_t)0) / sizeof(type))) \ - PyMem_RESIZE(var, type, _new_size); \ - else \ - var = NULL; \ -} while (0) -""" - -def _list_resize(w_list, newlen): - if newlen > len(w_list.ob_item): - true_size = _roundupsize(newlen) - old_items = w_list.ob_item - w_list.ob_item = items = [None] * true_size - for p in range(len(old_items)): - items[p] = old_items[p] - -def _ins1(w_list, where, w_any): - _list_resize(w_list, w_list.ob_size+1) - size = w_list.ob_size - items = w_list.ob_item - if where < 0: - where += size - if where < 0: - where = 0 - if (where > size): - where = size - for i in range(size, where, -1): - items[i] = items[i-1] - items[where] = w_any - w_list.ob_size += 1 - def list_insert__List_ANY_ANY(space, w_list, w_where, w_any): - _ins1(w_list, space.int_w(w_where), w_any) + w_list.wrappeditems.insert(space.int_w(w_where), w_any) return space.w_None def list_append__List_ANY(space, w_list, w_any): - _ins1(w_list, w_list.ob_size, w_any) + w_list.wrappeditems.append(w_any) return space.w_None def list_extend__List_ANY(space, w_list, w_any): - lis = space.unpackiterable(w_any) - newlen = w_list.ob_size + len(lis) - _list_resize(w_list, newlen) - d = w_list.ob_size - items = w_list.ob_item - for i in range(len(lis)): - items[d+i] = lis[i] - w_list.ob_size = newlen + w_list.wrappeditems += space.unpackiterable(w_any) return space.w_None def _del_slice(w_list, ilow, ihigh): """ similar to the deletion part of list_ass_slice in CPython """ - n = w_list.ob_size + items = w_list.wrappeditems + n = len(items) if ilow < 0: ilow = 0 elif ilow > n: @@ -477,46 +322,36 @@ ihigh = ilow elif ihigh > n: ihigh = n - items = w_list.ob_item - d = ihigh-ilow # keep a reference to the objects to be removed, # preventing side effects during destruction - recycle = [items[i] for i in range(ilow, ihigh)] - for i in range(ilow, n - d): - items[i] = items[i+d] - items[i+d] = None - # make sure entries after ob_size-d are None, to avoid keeping references - # (the above loop already set to None all items[ilow+d:old_style]) - n -= d - assert n >= 0 # annotator hint - w_list.ob_size = n - for i in range(n, ilow + d): - items[i] = None + recycle = items[ilow:ihigh] + del items[ilow:ihigh] # now we can destruct recycle safely, regardless of # side-effects to the list del recycle[:] # note that the default value will come back wrapped!!! 
def list_pop__List_ANY(space, w_list, w_idx=-1): - if w_list.ob_size == 0: + items = w_list.wrappeditems + if len(items)== 0: raise OperationError(space.w_IndexError, space.wrap("pop from empty list")) - i = space.int_w(w_idx) - if i < 0: - i += w_list.ob_size - if i < 0 or i >= w_list.ob_size: + idx = space.int_w(w_idx) + try: + w_ret = items.pop(idx) + except IndexError: raise OperationError(space.w_IndexError, space.wrap("pop index out of range")) - w_res = w_list.ob_item[i] - _del_slice(w_list, i, i+1) - return w_res + return w_ret def list_remove__List_ANY(space, w_list, w_any): # needs to be safe against eq_w() mutating the w_list behind our back i = 0 - while i < w_list.ob_size: - if space.eq_w(w_list.ob_item[i], w_any): - _del_slice(w_list, i, i+1) + items = w_list.wrappeditems + length = len(items) + while i < length: + if space.eq_w(items[i], w_any): + del items[i] return space.w_None i += 1 raise OperationError(space.w_ValueError, @@ -524,13 +359,14 @@ def list_index__List_ANY_ANY_ANY(space, w_list, w_any, w_start, w_stop): # needs to be safe against eq_w() mutating the w_list behind our back - size = w_list.ob_size + items = w_list.wrappeditems + size = len(items) w_start = slicetype.adapt_bound(space, w_start, space.wrap(size)) w_stop = slicetype.adapt_bound(space, w_stop, space.wrap(size)) i = space.int_w(w_start) stop = space.int_w(w_stop) - while i < stop and i < w_list.ob_size: - if space.eq_w(w_list.ob_item[i], w_any): + while i < stop and i < len(items): + if space.eq_w(items[i], w_any): return space.wrap(i) i += 1 raise OperationError(space.w_ValueError, @@ -540,14 +376,22 @@ # needs to be safe against eq_w() mutating the w_list behind our back count = 0 i = 0 - while i < w_list.ob_size: - if space.eq_w(w_list.ob_item[i], w_any): + items = w_list.wrappeditems + while i < len(items): + if space.eq_w(items[i], w_any): count += 1 i += 1 return space.wrap(count) +def list_reverse__List(space, w_list): + w_list.wrappeditems.reverse() + return space.w_None + +# ____________________________________________________________ +# Sorting + # Reverse a slice of a list in place, from lo up to (exclusive) hi. -# (also used in sort, later) +# (used in sort) def _reverse_slice(lis, lo, hi): hi -= 1 @@ -558,14 +402,6 @@ lo += 1 hi -= 1 -def list_reverse__List(space, w_list): - if w_list.ob_size > 1: - _reverse_slice(w_list.ob_item, 0, w_list.ob_size) - return space.w_None - -# ____________________________________________________________ -# Sorting - class KeyContainer(baseobjspace.W_Root): def __init__(self, w_key, w_item): self.w_key = w_key @@ -623,8 +459,8 @@ sorterclass = CustomKeySort else: sorterclass = SimpleSort - - sorter = sorterclass(w_list.ob_item, w_list.ob_size) + items = w_list.wrappeditems + sorter = sorterclass(items, len(items)) sorter.space = space sorter.w_cmp = w_cmp @@ -632,8 +468,8 @@ # The list is temporarily made empty, so that mutations performed # by comparison functions can't affect the slice of memory we're # sorting (allowing mutations during sorting is an IndexError or - # core-dump factory, since ob_item may change). - w_list.clear() + # core-dump factory, since wrappeditems may change). 
+ w_list.wrappeditems = [] # wrap each item in a KeyContainer if needed if has_key: @@ -663,11 +499,10 @@ sorter.list[i] = w_obj.w_item # check if the user mucked with the list during the sort - mucked = len(w_list.ob_item) > 0 + mucked = len(w_list.wrappeditems) > 0 # put the items back into the list - w_list.ob_item = sorter.list - w_list.ob_size = sorter.listlength + w_list.wrappeditems = sorter.list if mucked: raise OperationError(space.w_ValueError, Modified: pypy/dist/pypy/objspace/std/marshal_impl.py ============================================================================== --- pypy/dist/pypy/objspace/std/marshal_impl.py (original) +++ pypy/dist/pypy/objspace/std/marshal_impl.py Tue Sep 6 19:32:48 2005 @@ -318,7 +318,8 @@ def marshal_w__Tuple(space, w_tuple, m): m.start(TYPE_TUPLE) - m.put_list_w(w_tuple.wrappeditems, len(w_tuple.wrappeditems)) + items = w_tuple.wrappeditems + m.put_list_w(items, len(items)) def unmarshal_Tuple(space, u, tc): items_w = u.get_list_w() @@ -327,8 +328,8 @@ def marshal_w__List(space, w_list, m): m.start(TYPE_LIST) - n = w_list.ob_size - m.put_list_w(w_list.ob_item, w_list.ob_size) + items = w_list.wrappeditems + m.put_list_w(items, len(items)) def unmarshal_List(space, u, tc): items_w = u.get_list_w() @@ -488,7 +489,6 @@ w_lis = set_to_list(space, w_set) # cannot access this list directly, because it's # type is not exactly known through applevel. - # otherwise, I would access ob_item and ob_size, directly. lis_w = space.unpackiterable(w_lis) m.start(TYPE_SET) m.put_list_w(lis_w, len(lis_w)) Modified: pypy/dist/pypy/objspace/std/test/test_listobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/test/test_listobject.py (original) +++ pypy/dist/pypy/objspace/std/test/test_listobject.py Tue Sep 6 19:32:48 2005 @@ -341,27 +341,6 @@ assert self.space.eq_w(self.space.le(w_list4, w_list3), self.space.w_True) - def test_no_unneeded_refs(self): - space = self.space - w = space.wrap - w_empty = W_ListObject(space, []) - - w_list = W_ListObject(space, [w(5), w(3), w(99)]) - space.setitem(w_list, space.newslice(w(0), w(3), w(None)), w_empty) - assert w_list.ob_item == [None]*len(w_list.ob_item) - - w_list = W_ListObject(space, [w(5), w(3), w(99)]) - space.delitem(w_list, space.newslice(w(0), w(3), w(None))) - assert w_list.ob_item == [None]*len(w_list.ob_item) - - w_list = W_ListObject(space, [w(5), w(3), w(99)]) - space.call_method(w_list, 'pop') - assert w_list.ob_item[2] is None - space.call_method(w_list, 'pop') - assert w_list.ob_item[1] is None - space.call_method(w_list, 'pop') - assert w_list.ob_item == [None]*len(w_list.ob_item) - class AppTestW_ListObject: def test_call_list(self): assert list('') == [] Modified: pypy/dist/pypy/rpython/rlist.py ============================================================================== --- pypy/dist/pypy/rpython/rlist.py (original) +++ pypy/dist/pypy/rpython/rlist.py Tue Sep 6 19:32:48 2005 @@ -93,7 +93,7 @@ def get_eqfunc(self): return inputconst(Void, self.item_repr.get_ll_eq_function()) - def rtype_bltn_list(self,hop): + def rtype_bltn_list(self, hop): v_lst = hop.inputarg(self, 0) return hop.gendirectcall(ll_copy, v_lst) @@ -141,6 +141,11 @@ hop.gendirectcall(ll_reverse,v_lst) def rtype_method_pop(self, hop): + if hop.has_implicit_exception(IndexError): + spec = dum_checkidx + else: + spec = dum_nocheck + v_func = hop.inputconst(Void, spec) if hop.nb_args == 2: args = hop.inputargs(self, Signed) assert hasattr(args[1], 'concretetype') @@ -155,8 
+160,8 @@ else: args = hop.inputargs(self) llfn = ll_pop_default - hop.exception_cannot_occur() # no IndexError support (yet?) - return hop.gendirectcall(llfn, *args) + hop.exception_is_here() + return hop.gendirectcall(llfn, v_func, *args) def make_iterator_repr(self): return ListIteratorRepr(self) @@ -191,54 +196,57 @@ class __extend__(pairtype(ListRepr, IntegerRepr)): def rtype_getitem((r_lst, r_int), hop): - v_lst, v_index = hop.inputargs(r_lst, Signed) if hop.has_implicit_exception(IndexError): - if hop.args_s[1].nonneg: - llfn = ll_getitem_nonneg_checked - else: - llfn = ll_getitem_checked + spec = dum_checkidx else: - if hop.args_s[1].nonneg: - llfn = ll_getitem_nonneg - else: - llfn = ll_getitem + spec = dum_nocheck + v_func = hop.inputconst(Void, spec) + v_lst, v_index = hop.inputargs(r_lst, Signed) + if hop.args_s[1].nonneg: + llfn = ll_getitem_nonneg + else: + llfn = ll_getitem hop.exception_is_here() - return hop.gendirectcall(llfn, v_lst, v_index) - + return hop.gendirectcall(llfn, v_func, v_lst, v_index) + def rtype_setitem((r_lst, r_int), hop): - v_lst, v_index, v_item = hop.inputargs(r_lst, Signed, r_lst.item_repr) if hop.has_implicit_exception(IndexError): - if hop.args_s[1].nonneg: - llfn = ll_setitem_nonneg_checked - else: - llfn = ll_setitem_checked + spec = dum_checkidx else: - if hop.args_s[1].nonneg: - llfn = ll_setitem_nonneg - else: - llfn = ll_setitem + spec = dum_nocheck + v_func = hop.inputconst(Void, spec) + v_lst, v_index, v_item = hop.inputargs(r_lst, Signed, r_lst.item_repr) + if hop.args_s[1].nonneg: + llfn = ll_setitem_nonneg + else: + llfn = ll_setitem hop.exception_is_here() - return hop.gendirectcall(llfn, v_lst, v_index, v_item) + return hop.gendirectcall(llfn, v_func, v_lst, v_index, v_item) def rtype_delitem((r_lst, r_int), hop): - v_lst, v_index = hop.inputargs(r_lst, Signed) if hop.has_implicit_exception(IndexError): - if hop.args_s[1].nonneg: - llfn = ll_delitem_nonneg_checked - else: - llfn = ll_delitem_checked + spec = dum_checkidx else: - if hop.args_s[1].nonneg: - llfn = ll_delitem_nonneg - else: - llfn = ll_delitem + spec = dum_nocheck + v_func = hop.inputconst(Void, spec) + v_lst, v_index = hop.inputargs(r_lst, Signed) + if hop.args_s[1].nonneg: + llfn = ll_delitem_nonneg + else: + llfn = ll_delitem hop.exception_is_here() - return hop.gendirectcall(llfn, v_lst, v_index) + return hop.gendirectcall(llfn, v_func, v_lst, v_index) def rtype_mul((r_lst, r_int), hop): + v_func = hop.inputconst(Void, dum_newlist) v_lst, v_factor = hop.inputargs(r_lst, Signed) - return hop.gendirectcall(ll_mul, v_lst, v_factor) - + return hop.gendirectcall(ll_mul, v_func, v_lst, v_factor) + + def rtype_inplace_mul((r_lst, r_int), hop): + v_func = hop.inputconst(Void, dum_inplace) + v_lst, v_factor = hop.inputargs(r_lst, Signed) + return hop.gendirectcall(ll_mul, v_func, v_lst, v_factor) + class __extend__(pairtype(ListRepr, SliceRepr)): def rtype_getitem((r_lst, r_slic), hop): @@ -417,13 +425,23 @@ index += l.length ll_insert_nonneg(l, index, newitem) -def ll_pop_nonneg(l, index): +def dum_checkidx(): pass +def dum_nocheck(): pass +def dum_inplace():pass +def dum_newlist():pass + +def ll_pop_nonneg(func, l, index): + if func is dum_checkidx and (index >= l.length): + raise IndexError res = l.items[index] - ll_delitem_nonneg(l, index) + ll_delitem_nonneg(dum_nocheck, l, index) return res -def ll_pop_default(l): - index = l.length - 1 +def ll_pop_default(func, l): + length = l.length + if func is dum_checkidx and (length == 0): + raise IndexError + index = length - 1 
newlength = index items = l.items res = items[index] @@ -433,8 +451,11 @@ _ll_list_resize(l, newlength) return res -def ll_pop_zero(l): - newlength = l.length - 1 +def ll_pop_zero(func, l): + length = l.length + if func is dum_checkidx and (length == 0): + raise IndexError + newlength = length - 1 res = l.items[0] j = 0 items = l.items @@ -449,77 +470,60 @@ _ll_list_resize(l, newlength) return res -def ll_pop(l, index): +def ll_pop(func, l, index): + length = l.length if index < 0: - index += l.length + index += length + if func is dum_checkidx and (index < 0 or index >= length): + raise IndexError res = l.items[index] - ll_delitem_nonneg(l, index) + ll_delitem_nonneg(dum_nocheck, l, index) return res def ll_reverse(l): length = l.length - len2 = length // 2 i = 0 items = l.items length_1_i = length-1-i - while i < len2: + while i < length_1_i: tmp = l.items[i] items[i] = items[length_1_i] items[length_1_i] = tmp i += 1 length_1_i -= 1 -def ll_getitem_nonneg(l, i): - return l.items[i] - -def ll_getitem(l, i): - if i < 0: - i += l.length - return l.items[i] - -def ll_getitem_nonneg_checked(l, i): - if i >= l.length: +def ll_getitem_nonneg(func, l, index): + if func is dum_checkidx and (index >= l.length): raise IndexError - else: - return l.items[i] + return l.items[index] -def ll_getitem_checked(l, i): - if i < 0: - i += l.length - if i >= l.length or i < 0: +def ll_getitem(func, l, index): + length = l.length + if index < 0: + index += length + if func is dum_checkidx and (index < 0 or index >= length): raise IndexError - else: - return l.items[i] - -def ll_setitem_nonneg(l, i, newitem): - l.items[i] = newitem + return l.items[index] -def ll_setitem_nonneg_checked(l, i, newitem): - if i >= l.length: +def ll_setitem_nonneg(func, l, index, newitem): + if func is dum_checkidx and (index >= l.length): raise IndexError - l.items[i] = newitem + l.items[index] = newitem -def ll_setitem(l, i, newitem): - if i < 0: - i += l.length - l.items[i] = newitem - -def ll_setitem_checked(l, i, newitem): - if i < 0: - i += l.length - if i >= l.length or i < 0: +def ll_setitem(func, l, index, newitem): + length = l.length + if index < 0: + index += length + if func is dum_checkidx and (index < 0 or index >= length): raise IndexError - else: - l.items[i] = newitem + l.items[index] = newitem -def ll_delitem_nonneg_checked(l, i): - if i >= l.length: +def ll_delitem_nonneg(func, l, index): + length = l.length + if func is dum_checkidx and (index < 0 or index >= length): raise IndexError - ll_delitem_nonneg(l, i) - -def ll_delitem_nonneg(l, i): - newlength = l.length - 1 - j = i + newlength = length - 1 + j = index items = l.items j1 = j+1 while j < newlength: @@ -531,17 +535,10 @@ items[newlength] = nullptr(ITEM.TO) _ll_list_resize(l, newlength) -def ll_delitem(l, i): +def ll_delitem(func, l, i): if i < 0: i += l.length - ll_delitem_nonneg(l, i) - -def ll_delitem_checked(l, i): - if i < 0: - i += l.length - if i >= l.length or i < 0: - raise IndexError - ll_delitem_nonneg(l, i) + ll_delitem_nonneg(func, l, i) def ll_concat(l1, l2): len1 = l1.length @@ -730,25 +727,30 @@ TEMP = GcArray(Ptr(rstr.STR)) -def ll_mul(l, f): - items = l.items +def ll_mul(func, l, factor): + source = l.items length = l.length - if length == 0 or f <= 0: - return ll_newlist(typeOf(l), 0) - - resultlen = length * f - new_lst = ll_newlist(typeOf(l), resultlen) + if factor < 0: + factor = 0 + resultlen = length * factor + if func is dum_inplace: + res = l + _ll_list_resize(res, resultlen) + j = length + else: + res = ll_newlist(typeOf(l), 
resultlen) + j = 0 i = 0 - new_items = new_lst.items - j = 0 + target = res.items while j < resultlen: while i < length: - new_items[i + j] = items[i] + p = j + i + target[p] = source[i] i += 1 j += length - return new_lst - + return res + # ____________________________________________________________ # # Irregular operations. @@ -775,10 +777,11 @@ c1 = hop.inputconst(Void, r_list.lowleveltype) c2 = hop.inputconst(Signed, nb_args) v_result = hop.gendirectcall(ll_newlist, c1, c2) + v_func = hop.inputconst(Void, dum_nocheck) for i in range(nb_args): ci = hop.inputconst(Signed, i) v_item = hop.inputarg(r_listitem, arg=i) - hop.gendirectcall(ll_setitem_nonneg, v_result, ci, v_item) + hop.gendirectcall(ll_setitem_nonneg, v_func, v_result, ci, v_item) return v_result def ll_alloc_and_set(LISTPTR, count, item): Modified: pypy/dist/pypy/rpython/rtuple.py ============================================================================== --- pypy/dist/pypy/rpython/rtuple.py (original) +++ pypy/dist/pypy/rpython/rtuple.py Tue Sep 6 19:32:48 2005 @@ -55,6 +55,7 @@ c1 = inputconst(Void, hop.r_result.lowleveltype) c2 = inputconst(Signed, nitems) vlist = hop.gendirectcall(rlist.ll_newlist, c1, c2) + v_func = hop.inputconst(Void, rlist.dum_nocheck) for index in range(nitems): name = self.fieldnames[index] ritem = self.items_r[index] @@ -62,7 +63,7 @@ vitem = hop.genop('getfield', [vtup, cname], resulttype = ritem) vitem = hop.llops.convertvar(vitem, ritem, hop.r_result.item_repr) cindex = inputconst(Signed, index) - hop.gendirectcall(rlist.ll_setitem_nonneg, vlist, cindex, vitem) + hop.gendirectcall(rlist.ll_setitem_nonneg, v_func, vlist, cindex, vitem) return vlist def make_iterator_repr(self): Modified: pypy/dist/pypy/rpython/test/test_rlist.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rlist.py (original) +++ pypy/dist/pypy/rpython/test/test_rlist.py Tue Sep 6 19:32:48 2005 @@ -6,6 +6,16 @@ from pypy.rpython.rint import signed_repr from pypy.rpython.test.test_llinterp import interpret, interpret_raises +# undo the specialization parameter +for n1 in 'get set del'.split(): + for n2 in '','_nonneg': + name = 'll_%sitem%s' % (n1, n2) + globals()['_'+name] = globals()[name] + exec """if 1: + def %s(*args): + return _%s(dum_checkidx, *args) +""" % (name, name) +del n1, n2, name def sample_list(): # [42, 43, 44, 45] rlist = ListRepr(signed_repr) @@ -379,6 +389,22 @@ res = interpret(fn, [arg]) assert res == fn(arg) +def test_list_inplace_multiply(): + def fn(i): + lst = [i] + lst *= i + return len(lst) + for arg in (1, 9, 0, -1, -27): + res = interpret(fn, [arg]) + assert res == fn(arg) + def fn(i): + lst = [i, i + 1] + lst *= i + return len(lst) + for arg in (1, 9, 0, -1, -27): + res = interpret(fn, [arg]) + assert res == fn(arg) + def test_indexerror(): def fn(i): l = [5, 8, 3] @@ -395,6 +421,33 @@ res = interpret(fn, [-2]) assert res._obj.value == "oups" +def list_is_clear(lis, idx): + items = lis._obj.items._obj.items + for i in range(idx, len(items)): + if items[i]._obj is not None: + return False + return True + +def test_no_unneeded_refs(): + def fndel(p, q): + lis = ["5", "3", "99"] + assert q >= 0 + assert p >= 0 + del lis[p:q] + return lis + def fnpop(n): + lis = ["5", "3", "99"] + while n: + lis.pop() + n -=1 + return lis + for i in range(2, 3+1): + lis = interpret(fndel, [0, i]) + assert list_is_clear(lis, 3-i) + for i in range(3): + lis = interpret(fnpop, [i]) + assert list_is_clear(lis, 3-i) + def test_list_basic_ops(): def 
list_basic_ops(i=int, j=int): l = [1,2,3] Modified: pypy/dist/pypy/translator/c/extfunc.py ============================================================================== --- pypy/dist/pypy/translator/c/extfunc.py (original) +++ pypy/dist/pypy/translator/c/extfunc.py Tue Sep 6 19:32:48 2005 @@ -93,7 +93,7 @@ def _RPyListOfString_SetItem(l=p, index=lltype.Signed, newstring=lltype.Ptr(STR)): - rlist.ll_setitem_nonneg(l, index, newstring) + rlist.ll_setitem_nonneg(rlist.dum_nocheck, l, index, newstring) for fname, f in locals().items(): if isinstance(f, types.FunctionType): From arigo at codespeak.net Tue Sep 6 19:36:24 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 6 Sep 2005 19:36:24 +0200 (CEST) Subject: [pypy-svn] r17299 - in pypy/dist/pypy/rpython: . test Message-ID: <20050906173624.6D0A327B5B@code1.codespeak.net> Author: arigo Date: Tue Sep 6 19:36:23 2005 New Revision: 17299 Modified: pypy/dist/pypy/rpython/rtyper.py pypy/dist/pypy/rpython/test/test_rlist.py Log: Fix in the logic of has_implicit_exception(), as found by Christian. Added a test that shows the problem (and that should have been written a long time ago anyway, it's not some dark corner). Modified: pypy/dist/pypy/rpython/rtyper.py ============================================================================== --- pypy/dist/pypy/rpython/rtyper.py (original) +++ pypy/dist/pypy/rpython/rtyper.py Tue Sep 6 19:36:23 2005 @@ -630,11 +630,14 @@ # high-level ops before the last one in the block if self.llops.implicit_exceptions_checked is None: self.llops.implicit_exceptions_checked = [] + result = False for link in self.exceptionlinks: if issubclass(exc_cls, link.exitcase): self.llops.implicit_exceptions_checked.append(link.exitcase) - return True - return False + result = True + # go on looping to add possibly more exceptions to the list + # (e.g. Exception itself - see test_rlist.test_valueerror) + return result def exception_is_here(self): if self.llops.llop_raising_exceptions is not None: Modified: pypy/dist/pypy/rpython/test/test_rlist.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rlist.py (original) +++ pypy/dist/pypy/rpython/test/test_rlist.py Tue Sep 6 19:36:23 2005 @@ -464,3 +464,20 @@ for j in range(6): res = interpret(list_basic_ops, [i, j]) assert res == list_basic_ops(i, j) + +def test_valueerror(): + def fn(i): + l = [4, 7, 3] + try: + j = l.index(i) + except ValueError: + j = 100 + return j + res = interpret(fn, [4]) + assert res == 0 + res = interpret(fn, [7]) + assert res == 1 + res = interpret(fn, [3]) + assert res == 2 + res = interpret(fn, [6]) + assert res == 100 From tismer at codespeak.net Tue Sep 6 22:12:34 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Tue, 6 Sep 2005 22:12:34 +0200 (CEST) Subject: [pypy-svn] r17301 - pypy/dist/pypy/objspace/std Message-ID: <20050906201234.B682727B5E@code1.codespeak.net> Author: tismer Date: Tue Sep 6 22:12:33 2005 New Revision: 17301 Modified: pypy/dist/pypy/objspace/std/listobject.py Log: simplified lists a bit more. turned all range loops into regular ones, since ranges support is no good, yet. optimized corrected some operations which have to be aware of list mutation during comparison. renamed some items to items_w to be consistent. going to do some single change in rlist as well: give a gap argument to _ll_list_resize, to save a few copies. After that, this chapter is closed for me. 
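The "safe against eq_w() mutating the list behind our back" comments that recur in
the diff below boil down to two things: use an explicit while loop instead of a
range() loop (whose rtyper support is described above as "no good, yet"), and
re-read len() on every iteration rather than freezing the bound up front. A small
stand-alone sketch, with an ordinary Python callable standing in for space.eq_w::

    def contains(items_w, w_obj, eq_w):
        # needs to be safe against eq_w() mutating items_w behind our back:
        # the while loop re-checks len(items_w) every time around
        i = 0
        while i < len(items_w):
            if eq_w(items_w[i], w_obj):
                return True
            i += 1
        return False

    lst = [1, 2, 3, 4]
    def nasty_eq(w_a, w_b):
        if lst:
            lst.pop()        # comparison shrinks the list as a side effect
        return w_a == w_b

    assert contains(lst, 1, nasty_eq)         # found before the list shrinks away
    assert not contains(lst, 99, nasty_eq)    # no IndexError despite the mutation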
Modified: pypy/dist/pypy/objspace/std/listobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/listobject.py (original) +++ pypy/dist/pypy/objspace/std/listobject.py Tue Sep 6 22:12:33 2005 @@ -43,11 +43,10 @@ def getitem__List_ANY(space, w_list, w_index): idx = space.int_w(w_index) try: - w_ret = w_list.wrappeditems[idx] + return w_list.wrappeditems[idx] except IndexError: raise OperationError(space.w_IndexError, space.wrap("list index out of range")) - return w_ret def getitem__List_Slice(space, w_list, w_slice): # XXX consider to extend rlist's functionality? @@ -59,19 +58,20 @@ assert start >= 0 return W_ListObject(space, w_list.wrappeditems[start:stop]) w_res = W_ListObject(space, [None] * slicelength) - items = w_list.wrappeditems - subitems = w_res.wrappeditems - for i in range(slicelength): - subitems[i] = items[start] + items_w = w_list.wrappeditems + subitems_w = w_res.wrappeditems + i = 0 + while i < slicelength: + subitems_w[i] = items_w[start] start += step + i += 1 return w_res def contains__List_ANY(space, w_list, w_obj): # needs to be safe against eq_w() mutating the w_list behind our back i = 0 items_w = w_list.wrappeditems - length = len(items_w) - while i < length: + while i < len(items_w): # intentionally always calling len! if space.eq_w(items_w[i], w_obj): return space.w_True i += 1 @@ -133,27 +133,31 @@ # needs to be safe against eq_w() mutating the w_lists behind our back # Search for the first index where items are different i = 0 - while i < len(w_list1.wrappeditems) and i < len(w_list2.wrappeditems): - w_item1 = w_list1.wrappeditems[i] - w_item2 = w_list2.wrappeditems[i] + items1_w = w_list1.wrappeditems + items2_w = w_list2.wrappeditems + while i < len(items1_w) and i < len(items2_w): + w_item1 = items1_w[i] + w_item2 = items2_w[i] if not space.eq_w(w_item1, w_item2): return space.lt(w_item1, w_item2) i += 1 # No more items to compare -- compare sizes - return space.newbool(len(w_list1.wrappeditems) < len(w_list2.wrappeditems)) + return space.newbool(len(items1_w) < len(items2_w)) def gt__List_List(space, w_list1, w_list2): # needs to be safe against eq_w() mutating the w_lists behind our back # Search for the first index where items are different i = 0 + items1_w = w_list1.wrappeditems + items2_w = w_list2.wrappeditems while i < len(w_list1.wrappeditems) and i < len(w_list2.wrappeditems): - w_item1 = w_list1.wrappeditems[i] - w_item2 = w_list2.wrappeditems[i] + w_item1 = items1_w[i] + w_item2 = items2_w[i] if not space.eq_w(w_item1, w_item2): return space.gt(w_item1, w_item2) i += 1 # No more items to compare -- compare sizes - return space.newbool(len(w_list1.wrappeditems) > len(w_list2.wrappeditems)) + return space.newbool(len(items1_w) > len(items2_w)) def delitem__List_ANY(space, w_list, w_idx): @@ -189,14 +193,16 @@ # keep a reference to the objects to be removed, # preventing side effects during destruction recycle[0] = items[i] - - for discard in range(1, slicelength): + + discard = 1 + while discard < slicelength: j = i+1 i += step - while j= lim: items[i] = items[i-delta] + i -= 1 else: # shrinking requires the careful memory management of _del_slice() _del_slice(w_list, start, start-delta) @@ -261,14 +270,21 @@ # Always copy starting from the right to avoid # having to make a shallow copy in the case where # the source and destination lists are the same list. 
- for i in range(len2 - 1, -1, -1): - items[start+i*step] = sequence2[i] + i = len2 - 1 + start += i*step + while i >= 0: + items[start] = sequence2[i] + start -= step + i -= 1 return space.w_None else: # Make a shallow copy to more easily handle the reversal case sequence2 = list(sequence2) - for i in range(len2): - items[start+i*step] = sequence2[i] + i = 0 + while i < len2: + items[start] = sequence2[i] + start += step + i += 1 return space.w_None app = gateway.applevel(""" @@ -338,11 +354,10 @@ space.wrap("pop from empty list")) idx = space.int_w(w_idx) try: - w_ret = items.pop(idx) + return items.pop(idx) except IndexError: raise OperationError(space.w_IndexError, space.wrap("pop index out of range")) - return w_ret def list_remove__List_ANY(space, w_list, w_any): # needs to be safe against eq_w() mutating the w_list behind our back @@ -473,10 +488,12 @@ # wrap each item in a KeyContainer if needed if has_key: - for i in range(sorter.listlength): + i = 0 + while i < sorter.listlength: w_item = sorter.list[i] w_key = space.call_function(w_keyfunc, w_item) sorter.list[i] = KeyContainer(w_key, w_item) + i += 1 # Reverse sort stability achieved by initially reversing the list, # applying a stable forward sort, then reversing the final result. @@ -493,10 +510,12 @@ finally: # unwrap each item if needed if has_key: - for i in range(sorter.listlength): + i = 0 + while i < sorter.listlength: w_obj = sorter.list[i] if isinstance(w_obj, KeyContainer): sorter.list[i] = w_obj.w_item + i += 1 # check if the user mucked with the list during the sort mucked = len(w_list.wrappeditems) > 0 From tismer at codespeak.net Tue Sep 6 22:19:27 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Tue, 6 Sep 2005 22:19:27 +0200 (CEST) Subject: [pypy-svn] r17303 - pypy/dist/pypy/module/marshal Message-ID: <20050906201927.7266327B57@code1.codespeak.net> Author: tismer Date: Tue Sep 6 22:19:26 2005 New Revision: 17303 Modified: pypy/dist/pypy/module/marshal/interp_marshal.py Log: making use of the fact that rlist supports in-place multiplication Modified: pypy/dist/pypy/module/marshal/interp_marshal.py ============================================================================== --- pypy/dist/pypy/module/marshal/interp_marshal.py (original) +++ pypy/dist/pypy/module/marshal/interp_marshal.py Tue Sep 6 22:19:26 2005 @@ -217,7 +217,7 @@ lng = len(s) newpos = pos + lng while len(self.buflis) < newpos: - self.buflis = self.buflis + self.buflis + self.buflis *= 2 idx = 0 while idx < lng: self.buflis[pos + idx] = s[idx] @@ -228,7 +228,7 @@ pos = self.bufpos newpos = pos + 1 if len(self.buflis) < newpos: - self.buflis = self.buflis + self.buflis + self.buflis *= 2 self.buflis[pos] = c self.bufpos = newpos @@ -243,7 +243,7 @@ pos = self.bufpos newpos = pos + 5 if len(self.buflis) < newpos: - self.buflis = self.buflis + self.buflis + self.buflis *= 2 self.buflis[pos] = typecode self.buflis[pos+1] = a self.buflis[pos+2] = b @@ -258,7 +258,7 @@ pos = self.bufpos newpos = pos + 2 if len(self.buflis) < newpos: - self.buflis = self.buflis + self.buflis + self.buflis *= 2 self.buflis[pos] = a self.buflis[pos+1] = b self.bufpos = newpos @@ -274,7 +274,7 @@ pos = self.bufpos newpos = pos + 4 if len(self.buflis) < newpos: - self.buflis = self.buflis + self.buflis + self.buflis *= 2 self.buflis[pos] = a self.buflis[pos+1] = b self.buflis[pos+2] = c From pedronis at codespeak.net Tue Sep 6 23:07:24 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 6 Sep 2005 23:07:24 +0200 (CEST) Subject: 
[pypy-svn] r17305 - pypy/dist/pypy/objspace/std Message-ID: <20050906210724.0DDDF27B59@code1.codespeak.net> Author: pedronis Date: Tue Sep 6 23:07:23 2005 New Revision: 17305 Modified: pypy/dist/pypy/objspace/std/listobject.py Log: reinstate range where is the natural thing. Modified: pypy/dist/pypy/objspace/std/listobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/listobject.py (original) +++ pypy/dist/pypy/objspace/std/listobject.py Tue Sep 6 23:07:23 2005 @@ -60,11 +60,9 @@ w_res = W_ListObject(space, [None] * slicelength) items_w = w_list.wrappeditems subitems_w = w_res.wrappeditems - i = 0 - while i < slicelength: + for i in range(slicelength): subitems_w[i] = items_w[start] start += step - i += 1 return w_res def contains__List_ANY(space, w_list, w_obj): @@ -194,15 +192,13 @@ # preventing side effects during destruction recycle[0] = items[i] - discard = 1 - while discard < slicelength: + for discard in range(1, slicelength): j = i+1 i += step while j < i: items[j-discard] = items[j] j += 1 recycle[discard] = items[i] - discard += 1 j = i+1 while j < n: @@ -280,11 +276,9 @@ else: # Make a shallow copy to more easily handle the reversal case sequence2 = list(sequence2) - i = 0 - while i < len2: + for i in range(len2): items[start] = sequence2[i] start += step - i += 1 return space.w_None app = gateway.applevel(""" @@ -364,11 +358,10 @@ i = 0 items = w_list.wrappeditems length = len(items) - while i < length: + for i in range(length): if space.eq_w(items[i], w_any): del items[i] return space.w_None - i += 1 raise OperationError(space.w_ValueError, space.wrap("list.remove(x): x not in list")) @@ -488,12 +481,10 @@ # wrap each item in a KeyContainer if needed if has_key: - i = 0 - while i < sorter.listlength: + for i in range(sorter.listlength): w_item = sorter.list[i] w_key = space.call_function(w_keyfunc, w_item) sorter.list[i] = KeyContainer(w_key, w_item) - i += 1 # Reverse sort stability achieved by initially reversing the list, # applying a stable forward sort, then reversing the final result. 
@@ -510,12 +501,10 @@ finally: # unwrap each item if needed if has_key: - i = 0 - while i < sorter.listlength: + for i in range(sorter.listlength): w_obj = sorter.list[i] if isinstance(w_obj, KeyContainer): sorter.list[i] = w_obj.w_item - i += 1 # check if the user mucked with the list during the sort mucked = len(w_list.wrappeditems) > 0 From hpk at codespeak.net Tue Sep 6 23:11:25 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Tue, 6 Sep 2005 23:11:25 +0200 (CEST) Subject: [pypy-svn] r17306 - pypy/dist/pypy/objspace/std Message-ID: <20050906211125.C07CE27B59@code1.codespeak.net> Author: hpk Date: Tue Sep 6 23:11:25 2005 New Revision: 17306 Modified: pypy/dist/pypy/objspace/std/listobject.py Log: remove leftover 'i' Modified: pypy/dist/pypy/objspace/std/listobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/listobject.py (original) +++ pypy/dist/pypy/objspace/std/listobject.py Tue Sep 6 23:11:25 2005 @@ -355,7 +355,6 @@ def list_remove__List_ANY(space, w_list, w_any): # needs to be safe against eq_w() mutating the w_list behind our back - i = 0 items = w_list.wrappeditems length = len(items) for i in range(length): From xoraxax at codespeak.net Wed Sep 7 00:44:47 2005 From: xoraxax at codespeak.net (xoraxax at codespeak.net) Date: Wed, 7 Sep 2005 00:44:47 +0200 (CEST) Subject: [pypy-svn] r17307 - pypy/dist/pypy/bin Message-ID: <20050906224447.75FAA27B55@code1.codespeak.net> Author: xoraxax Date: Wed Sep 7 00:44:46 2005 New Revision: 17307 Modified: pypy/dist/pypy/bin/translator.py Log: Removed duplicate readline code. Modified: pypy/dist/pypy/bin/translator.py ============================================================================== --- pypy/dist/pypy/bin/translator.py (original) +++ pypy/dist/pypy/bin/translator.py Wed Sep 7 00:44:46 2005 @@ -236,10 +236,7 @@ from pypy.translator.test import snippet as test #from pypy.translator.llvm.test import llvmsnippet as test2 from pypy.rpython.rtyper import RPythonTyper - try: - setup_readline() - except ImportError, err: - print "Disabling readline support (%s)" % err + if (os.getcwd() not in sys.path and os.path.curdir not in sys.path): sys.path.insert(0, os.getcwd()) From pedronis at codespeak.net Wed Sep 7 00:47:37 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Wed, 7 Sep 2005 00:47:37 +0200 (CEST) Subject: [pypy-svn] r17308 - in pypy/dist/pypy/interpreter/pyparser: . 
test Message-ID: <20050906224737.47F4C27B55@code1.codespeak.net> Author: pedronis Date: Wed Sep 7 00:47:35 2005 New Revision: 17308 Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py Log: make ast_builder behave like the transformer on single_input, with some tests (also tweaked some snippets for test_astcompiler which ends up using parser.suite which seems more picky) Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Wed Sep 7 00:47:35 2005 @@ -746,7 +746,12 @@ atoms = get_atoms( builder, nb ) l = len(atoms) if l == 1 or l==2: - builder.push(ast.Module(None, atoms[0])) + atom0 = atoms[0] + if isinstance(atom0, TokenObject) and atom0.name == tok.NEWLINE: + atom0 = ast.Pass() + elif not isinstance(atom0, ast.Stmt): + atom0 = ast.Stmt([atom0]) + builder.push(ast.Module(None, atom0)) else: assert False, "Forbidden path" Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py Wed Sep 7 00:47:35 2005 @@ -396,17 +396,17 @@ '''def foo(): """foo docstring""" return 1 - ''', +''', '''def foo(): """foo docstring""" a = 1 """bar""" return a - ''', +''', '''def foo(): """doc"""; print 1 a=1 - ''', +''', '''"""Docstring""";print 1''', ] @@ -503,6 +503,18 @@ returns, ] +SINGLE_INPUTS = [ + one_stmt_funcdefs, + ['\t # hello\n ', + 'print 6*7', + 'if 1: x\n', + 'x = 5', + 'x = 5 ', + '''"""Docstring""";print 1''', + '''"Docstring"''', + ] +] + TARGET_DICT = { 'single' : 'single_input', 'exec' : 'file_input', @@ -558,7 +570,6 @@ for expr in family: yield check_expression, expr, 'exec' - SNIPPETS = [ 'snippet_1.py', 'snippet_several_statements.py', @@ -626,3 +637,7 @@ for data in test: assert eval_string(data) == eval(data) +def test_single_inputs(): + for family in SINGLE_INPUTS: + for expr in family: + yield check_expression, expr, 'single' From pedronis at codespeak.net Wed Sep 7 00:49:21 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Wed, 7 Sep 2005 00:49:21 +0200 (CEST) Subject: [pypy-svn] r17309 - in pypy/dist/pypy/interpreter: astcompiler test Message-ID: <20050906224921.45C4627B55@code1.codespeak.net> Author: pedronis Date: Wed Sep 7 00:49:19 2005 New Revision: 17309 Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py pypy/dist/pypy/interpreter/test/test_compiler.py Log: port docstring fix from stable/_stable. activated testing astcompiler (through cheating) in test_compiler. tests to the scope related tests whose fix have not been ported yet, marked skipped as INPROGRESS Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Wed Sep 7 00:49:19 2005 @@ -833,6 +833,11 @@ # misc def visitDiscard(self, node): + # Important: this function is overridden in InteractiveCodeGenerator, + # which also has the effect that the following test only occurs in + # non-'single' modes. + if isinstance(node.expr, ast.Const): + return # skip LOAD_CONST/POP_TOP pairs (for e.g. 
docstrings) self.set_lineno(node) node.expr.accept( self ) self.emit('POP_TOP') Modified: pypy/dist/pypy/interpreter/test/test_compiler.py ============================================================================== --- pypy/dist/pypy/interpreter/test/test_compiler.py (original) +++ pypy/dist/pypy/interpreter/test/test_compiler.py Wed Sep 7 00:49:19 2005 @@ -1,7 +1,7 @@ import __future__ import autopath import py -from pypy.interpreter.pycompiler import CPythonCompiler, PythonCompiler +from pypy.interpreter.pycompiler import CPythonCompiler, PythonCompiler, PythonAstCompiler from pypy.interpreter.pycode import PyCode from pypy.interpreter.error import OperationError @@ -113,7 +113,7 @@ ex = e.value assert ex.match(self.space, self.space.w_SyntaxError) - def XXXtest_scope_importstar_with_nested_free(self): + def test_scope_importstar_with_nested_free(self): e = py.test.raises(OperationError, self.compiler.compile, """if 1: def clash(x): from string import * @@ -163,6 +163,25 @@ def setup_method(self, method): self.compiler = PythonCompiler(self.space) +class TestPythonAstCompiler(BaseTestCompiler): + def setup_method(self, method): + self.compiler = PythonAstCompiler(self.space) + + def test_scope_unoptimized_clash1(self): + py.test.skip("INPROGESS") + + def test_scope_unoptimized_clash1_b(self): + py.test.skip("INPROGESS") + + def test_scope_exec_in_nested(self): + py.test.skip("INPROGESS") + + def test_scope_importstar_in_nested(self): + py.test.skip("INPROGESS") + + def test_scope_importstar_with_nested_free(self): + py.test.skip("INPROGESS") + class SkippedForNowTestPyPyCompiler(BaseTestCompiler): def setup_method(self, method): self.compiler = PyPyCompiler(self.space) From tismer at codespeak.net Wed Sep 7 02:24:48 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Wed, 7 Sep 2005 02:24:48 +0200 (CEST) Subject: [pypy-svn] r17310 - pypy/dist/pypy/rpython Message-ID: <20050907002448.A0B4427B59@code1.codespeak.net> Author: tismer Date: Wed Sep 7 02:24:47 2005 New Revision: 17310 Modified: pypy/dist/pypy/rpython/rlist.py Log: bug in ll_mul,assigning sourcetoo early. _ll_list_resize might change the list items and nullify the old list we use as source. Unfortunately, no test seems to uncover this problem. using py.py and the other tests, everything appears to be fine. Probably the isinstance enquiries do not work properly if we are not compiling? Well, using py.py, rlist isn't even used at all. But I'm wondering what all the tests are good for if such things can only be found by compiling? I thought llinterp would behave almost like compiled code? 
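The aliasing problem described above can be reproduced with a tiny toy model.
LLListModel and its resize() method below are hypothetical stand-ins for the
lltype list and _ll_list_resize; clearing the old backing array is only a way of
making stale reads visible. The point is that if the source array is fetched
before the in-place resize, it may refer to storage the resize has already
abandoned::

    class LLListModel:
        # toy stand-in for an rlist: 'items' is the backing array
        def __init__(self, values):
            self.items = list(values)
            self.length = len(values)
        def resize(self, newlen):
            # models _ll_list_resize reallocating: copy into a fresh array
            # and clear the old one, so stale references read nothing useful
            old = self.items
            self.items = old[:self.length] + [None] * (newlen - self.length)
            for k in range(len(old)):
                old[k] = None
            self.length = newlen

    def inplace_mul(l, factor, capture_early):
        length = l.length
        resultlen = length * factor
        source = l.items if capture_early else None   # buggy order
        l.resize(resultlen)
        if source is None:
            source = l.items                           # fixed order
        target = l.items
        j = length
        while j < resultlen:
            i = 0
            while i < length:
                target[j + i] = source[i]
                i += 1
            j += length
        return l

    good = inplace_mul(LLListModel([1, 2]), 3, False)
    bad = inplace_mul(LLListModel([1, 2]), 3, True)
    assert good.items == [1, 2, 1, 2, 1, 2]
    assert bad.items == [1, 2, None, None, None, None]   # stale source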
Modified: pypy/dist/pypy/rpython/rlist.py ============================================================================== --- pypy/dist/pypy/rpython/rlist.py (original) +++ pypy/dist/pypy/rpython/rlist.py Wed Sep 7 02:24:47 2005 @@ -728,7 +728,6 @@ TEMP = GcArray(Ptr(rstr.STR)) def ll_mul(func, l, factor): - source = l.items length = l.length if factor < 0: factor = 0 @@ -741,6 +740,7 @@ res = ll_newlist(typeOf(l), resultlen) j = 0 i = 0 + source = l.items target = res.items while j < resultlen: while i < length: From tismer at codespeak.net Wed Sep 7 02:33:55 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Wed, 7 Sep 2005 02:33:55 +0200 (CEST) Subject: [pypy-svn] r17311 - pypy/dist/pypy/rpython/test Message-ID: <20050907003355.03FF027B59@code1.codespeak.net> Author: tismer Date: Wed Sep 7 02:33:54 2005 New Revision: 17311 Modified: pypy/dist/pypy/rpython/test/test_rlist.py Log: now I know how the malicious rlist mul could make it through the tests.We only tested the length of the result list and didn'tlook into the result at all. changed the tests to check the last list element as well. Modified: pypy/dist/pypy/rpython/test/test_rlist.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rlist.py (original) +++ pypy/dist/pypy/rpython/test/test_rlist.py Wed Sep 7 02:33:54 2005 @@ -378,13 +378,19 @@ def test_list_multiply(): def fn(i): lst = [i] * i - return len(lst) + ret = len(lst) + if ret: + ret *= lst[-1] + return ret for arg in (1, 9, 0, -1, -27): res = interpret(fn, [arg]) assert res == fn(arg) def fn(i): lst = [i, i + 1] * i - return len(lst) + ret = len(lst) + if ret: + ret *= lst[-1] + return ret for arg in (1, 9, 0, -1, -27): res = interpret(fn, [arg]) assert res == fn(arg) @@ -393,14 +399,20 @@ def fn(i): lst = [i] lst *= i - return len(lst) + ret = len(lst) + if ret: + ret *= lst[-1] + return ret for arg in (1, 9, 0, -1, -27): res = interpret(fn, [arg]) assert res == fn(arg) def fn(i): lst = [i, i + 1] lst *= i - return len(lst) + ret = len(lst) + if ret: + ret *= lst[-1] + return ret for arg in (1, 9, 0, -1, -27): res = interpret(fn, [arg]) assert res == fn(arg) From tismer at codespeak.net Wed Sep 7 02:42:22 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Wed, 7 Sep 2005 02:42:22 +0200 (CEST) Subject: [pypy-svn] r17312 - pypy/dist/pypy/rpython Message-ID: <20050907004222.8E67627B59@code1.codespeak.net> Author: tismer Date: Wed Sep 7 02:42:21 2005 New Revision: 17312 Modified: pypy/dist/pypy/rpython/rlist.py Log: yet another small bug in rlist mul, found by the enhanced tests. 
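For context: the bug fixed below is that the inner copy index was not reset for
each repetition of the source list, and the reason the earlier tests let it
through is that they only compared result lengths. A plain-Python caricature
(broken_mul is a made-up name, not the lltype helper) shows why the enhanced
tests also look at the last element::

    def broken_mul(lst, factor):
        length = len(lst)
        result = [None] * (length * factor)
        j = 0
        i = 0                  # bug: should be reset inside the outer loop
        while j < length * factor:
            while i < length:
                result[j + i] = lst[i]
                i += 1
            j += length
        return result

    lst = [7, 8]
    res = broken_mul(lst, 3)
    assert len(res) == len(lst * 3)     # a length-only check still passes
    assert res != lst * 3               # comparing contents exposes the bug
    assert (lst * 3)[-1] == 8 and res[-1] is None   # hence checking lst[-1]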
Modified: pypy/dist/pypy/rpython/rlist.py ============================================================================== --- pypy/dist/pypy/rpython/rlist.py (original) +++ pypy/dist/pypy/rpython/rlist.py Wed Sep 7 02:42:21 2005 @@ -739,10 +739,10 @@ else: res = ll_newlist(typeOf(l), resultlen) j = 0 - i = 0 source = l.items target = res.items while j < resultlen: + i = 0 while i < length: p = j + i target[p] = source[i] From pedronis at codespeak.net Wed Sep 7 10:54:37 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Wed, 7 Sep 2005 10:54:37 +0200 (CEST) Subject: [pypy-svn] r17314 - in pypy/dist/pypy/translator: goal tool Message-ID: <20050907085437.C5A7827B7A@code1.codespeak.net> Author: pedronis Date: Wed Sep 7 10:54:36 2005 New Revision: 17314 Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py pypy/dist/pypy/translator/tool/pdbplus.py Log: PdbPlusShow was referring to the global translator t in the extended commands, attach it as a translator attribute at construction time. make attaching a show into a method. change translate_pypy_new acccordingly. Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Wed Sep 7 10:54:36 2005 @@ -265,7 +265,7 @@ from pypy.translator.tool.pdbplus import PdbPlusShow from pypy.translator.tool.pdbplus import run_debugger_in_thread - pdb_plus_show = PdbPlusShow() + pdb_plus_show = PdbPlusShow(t) # need a translator to support extended commands if got_error: import traceback @@ -300,7 +300,7 @@ start, show, stop, cleanup = serv_start, serv_show, serv_stop, serv_cleanup else: start, show, stop, cleanup = run_server() - pdb_plus_show.show = show + pdb_plus_show.install_show(show) debugger = run_debugger_in_thread(func, args, stop) debugger.start() start() Modified: pypy/dist/pypy/translator/tool/pdbplus.py ============================================================================== --- pypy/dist/pypy/translator/tool/pdbplus.py (original) +++ pypy/dist/pypy/translator/tool/pdbplus.py Wed Sep 7 10:54:36 2005 @@ -16,6 +16,10 @@ class PdbPlusShow(pdb.Pdb): + def __init__(self, translator): + pdb.Pdb.__init__(self) + self.translator = translator + def post_mortem(self, t): self.reset() while t.tb_next is not None: @@ -24,6 +28,9 @@ show = None + def install_show(self, show): + self.show = show + def _show(self, page): if not self.show: print "*** No display" @@ -137,7 +144,7 @@ return cls = [] try: - for c in t.annotator.getuserclasses(): + for c in self.translator.annotator.getuserclasses(): if flt(c): cls.append(c) except self.GiveUp: @@ -154,7 +161,7 @@ return funcs = [] try: - for f in t.flowgraphs: + for f in self.translator.flowgraphs: if flt(f): funcs.append(f) except self.GiveUp: @@ -168,18 +175,19 @@ if obj is a function or method, the localized call graph is shown; if obj is a class or ClassDef the class definition graph is shown""" from pypy.annotation.classdef import ClassDef - from pypy.translator.tool import graphpage + from pypy.translator.tool import graphpage + translator = self.translator obj = self._getobj(arg) if obj is None: return if hasattr(obj, 'im_func'): obj = obj.im_func - if obj in t.flowgraphs: - page = graphpage.LocalizedCallGraphPage(t, obj) - elif obj in getattr(t.annotator, 'getuserclasses', lambda: {})(): - page = graphpage.ClassDefPage(t, t.annotator.getuserclasses()[obj]) + if obj in 
translator.flowgraphs: + page = graphpage.LocalizedCallGraphPage(translator, obj) + elif obj in getattr(translator.annotator, 'getuserclasses', lambda: {})(): + page = graphpage.ClassDefPage(translator, translator.annotator.getuserclasses()[obj]) elif isinstance(obj, ClassDef): - page = graphpage.ClassDefPage(t, obj) + page = graphpage.ClassDefPage(translator, obj) else: print "*** Nothing to do" return @@ -200,7 +208,7 @@ def longname(c): return "%s.%s" % (c.__module__, c.__name__) obj.sort(lambda x,y: cmp(longname(x), longname(y))) - cls = t.annotator.getuserclasses() + cls = self.translator.annotator.getuserclasses() flt = self._make_flt(expr) if flt is None: return @@ -267,7 +275,7 @@ obj = self._getobj(arg) if obj is None: return - cls = t.annotator.getuserclasses() + cls = self.translator.annotator.getuserclasses() if obj not in cls: return attrs = cls[obj].attrs From ale at codespeak.net Wed Sep 7 11:09:27 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Wed, 7 Sep 2005 11:09:27 +0200 (CEST) Subject: [pypy-svn] r17316 - pypy/dist/pypy/translator/tool Message-ID: <20050907090927.38B9B27B84@code1.codespeak.net> Author: ale Date: Wed Sep 7 11:09:26 2005 New Revision: 17316 Modified: pypy/dist/pypy/translator/tool/pdbplus.py Log: Changed the use of the (previously) global t to self.translator Modified: pypy/dist/pypy/translator/tool/pdbplus.py ============================================================================== --- pypy/dist/pypy/translator/tool/pdbplus.py (original) +++ pypy/dist/pypy/translator/tool/pdbplus.py Wed Sep 7 11:09:26 2005 @@ -314,7 +314,7 @@ if not isinstance(obj, types.FunctionType): print "*** Not a function" return - self._show(graphpage.FlowGraphPage(t, [obj])) + self._show(graphpage.FlowGraphPage(self.translator, [obj])) def do_callg(self, arg): """callg obj @@ -330,13 +330,13 @@ if not isinstance(obj, types.FunctionType): print "*** Not a function" return - self._show(graphpage.LocalizedCallGraphPage(t, obj)) + self._show(graphpage.LocalizedCallGraphPage(self.translator, obj)) def do_classhier(self, arg): """classhier show class hierarchy graph""" from pypy.translator.tool import graphpage - self._show(graphpage.ClassHierarchyPage(t)) + self._show(graphpage.ClassHierarchyPage(self.translator)) def help_graphs(self): print "graph commands are: showg, flowg, callg, classhier" From ale at codespeak.net Wed Sep 7 11:18:19 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Wed, 7 Sep 2005 11:18:19 +0200 (CEST) Subject: [pypy-svn] r17317 - in pypy/dist/pypy/translator: . goal Message-ID: <20050907091819.2C6D927B86@code1.codespeak.net> Author: ale Date: Wed Sep 7 11:18:17 2005 New Revision: 17317 Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py pypy/dist/pypy/translator/translator.py Log: work in progress. Shufling things around. trying to group options, using optparse. added possibility to choose different gc's by a single option and different backendsby another options. The option names are a bit random (should be worked on) added a compile method to translator to try to accomododate more backends Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Wed Sep 7 11:18:17 2005 @@ -25,11 +25,13 @@ we are still lacking annotation of applevel code. 
-d Enable recording of annotator debugging information - -no-c Don't generate the C code - -llvm Use LLVM instead of C + -b backend is one 'ccompiler, llvmcompiler' + -gc can be 'boehm' or 'no-gc' or 'ref' (default) + -no-c Don't generate the code + #-llvm Use LLVM instead of C -c Generate the C code, but don't compile it - -boehm Use the Boehm collector when generating C code - -no-gc Experimental: use no GC and no refcounting at all + #-boehm Use the Boehm collector when generating C code + #-no-gc Experimental: use no GC and no refcounting at all -o Generate and compile the C code, but don't run it -tcc Equivalent to the envvar PYPY_CC='tcc -shared -o "%s.so" "%s.c"' -- http://fabrice.bellard.free.fr/tcc/ @@ -111,7 +113,7 @@ policy = AnnotatorPolicy() if target: - spec = target(not options['-t-lowmem']) + spec = target(not options1.lowmem) try: entry_point, inputtypes, policy = spec except ValueError: @@ -130,7 +132,7 @@ if listen_port: run_async_server() - if not options['-no-a']: + if not options1.no_a: print 'Annotating...' print 'with policy: %s.%s' % (policy.__class__.__module__, policy.__class__.__name__) a = t.annotate(inputtypes, policy=policy) @@ -143,18 +145,18 @@ if a: #and not options['-no-s']: print 'Simplifying...' a.simplify() - if a and options['-fork']: - from pypy.translator.goal import unixcheckpoint - assert_rpython_mostly_not_imported() - unixcheckpoint.restartable_point(auto='run') - if a and not options['-no-t']: + if 'fork1' in options1.fork: + from pypy.translator.goal import unixcheckpoint + assert_rpython_mostly_not_imported() + unixcheckpoint.restartable_point(auto='run') + if a and not options1.specialize: print 'Specializing...' t.specialize(dont_simplify_again=True, - crash_on_first_typeerror=not options['-t-insist']) - if not options['-no-o'] and not options['-llvm']: + crash_on_first_typeerror=not options1.insist) + if not options1.optimize and not options1.backend =='-llvm': print 'Back-end optimizations...' t.backend_optimizations() - if a and options['-fork2']: + if a and 'fork2' in options1.fork: from pypy.translator.goal import unixcheckpoint unixcheckpoint.restartable_point(auto='run') if a: @@ -214,7 +216,7 @@ from pypy.translator.tool.pygame.graphclient import get_layout from pypy.translator.tool.pygame.graphdisplay import GraphDisplay - if len(t.functions) <= huge: + if len(t.functions) <= options1.huge: page = graphpage.TranslatorPage(t) else: page = graphpage.LocalizedCallGraphPage(t, entry_point) @@ -232,33 +234,6 @@ if __name__ == '__main__': targetspec = 'targetpypystandalone' - huge = 100 - load_file = None - save_file = None - - options = {'-text': False, - '-no-c': False, - '-c': False, - '-boehm': False, - '-no-gc': False, - '-o': False, - '-llvm': False, - '-no-mark-some-objects': False, - '-no-a': False, - '-no-t': False, - '-t-insist': False, - '-no-o': False, - '-tcc': False, - '-d': False, - '-use-snapshot' : False, - '-load': False, - '-save': False, - '-fork': False, - '-fork2': False, - '-llinterpret': False, - '-t-lowmem': False, - '-batch': False, - } listen_port = None def debug(got_error): @@ -287,13 +262,13 @@ print 'Done.' 
print func, args = pdb_plus_show.set_trace, () - if options['-text']: - if options['-batch']: + if not options1.pygame: + if options1.batch: print >>sys.stderr, "batch mode, not calling interactive helpers" else: func(*args) else: - if options['-batch']: + if options1.batch: print >>sys.stderr, "batch mode, not calling interactive helpers" else: if serv_start: @@ -332,11 +307,50 @@ return print "don't know about", x - argiter = iter(sys.argv[1:]) + from optparse import OptionParser + parser = OptionParser() + parser.add_option("-u", "--usesnapshot", dest="snapshot", default=False, + action="store_true",help="use snapshot") + + parser.add_option("-f", "--fork", dest="fork", default=[], + action="append",help="(UNIX) Create restartable checkpoint after annotation,specialization") + parser.add_option("-m", "--lowmem", dest="lowmem", default=False, + action="store_true",help="Try to save memory") + parser.add_option("-t", "--specialize", dest="specialize", default=True, + action="store_false",help="Don't specialize") + parser.add_option("-o", "--optimize", dest="optimize", default=True, + action="store_false",help="Don't do backend optimizations") + parser.add_option("-n", "--no_annotations", dest="no_a", default=False, + action="store_true", help="Don't infer annotations") + parser.add_option("-l", "--load", dest="loadfile", + help="load translator from file") + parser.add_option("-s", "--save", dest="savefile", + help="save translator to file") + parser.add_option("-i", "--insist", dest="insist", default=True, + action="store_true", help="Don't stop on first error") + parser.add_option("-d", "--debug", dest="debug", default=False, + action="store_true", help="record debug information") + + parser.add_option("-g", "--gc", dest="gc", default="ref", + help="choose garbage collector (ref, boehm, none)") + parser.add_option("-b", "--backend", dest="backend", default='c', + help="choose backend (c, llvm, llinterpret)") + parser.add_option("-c", "--gencode", dest="really_compile", default=True, + action="store_false",help="Don't generate C code") + + parser.add_option("-r", "--no_run", dest="run", default=True, + action="store_false",help="compile but don't run") + parser.add_option("-H", "--huge", dest="huge", type="int", + help="Threshold in the number of functions after which only a local call\ + graph and not a full one is displayed") + parser.add_option("-p", "--pygame", dest="pygame", default=True, + action="store_false", help="Don't start Pygame viewer") + parser.add_option("-x", "--batch", dest="batch", default=False, + action="store_true",help="Don't use interactive helpers, like pdb") + (options1, args) = parser.parse_args() + print options1,args + argiter = iter(args) #sys.argv[1:]) for arg in argiter: - if arg in ('-h', '--help'): - print __doc__.strip() - sys.exit() try: listen_port = int(arg) except ValueError: @@ -346,30 +360,25 @@ targetspec = arg elif os.path.isfile(arg) and arg.endswith('.py'): targetspec = arg[:-3] - elif arg.startswith('-huge='): - huge = int(arg[6:]) - else: - assert arg in options, "unknown option %r" % (arg,) - options[arg] = True - if arg == '-load': - load_file = argiter.next() - loaded_dic = load(load_file) - if arg == '-save': - save_file = argiter.next() - if options['-tcc']: - os.environ['PYPY_CC'] = 'tcc -shared -o "%s.so" "%s.c"' - if options['-d']: + + options = {} + for opt in parser.option_list[1:]: + options[opt.dest] = getattr(options1,opt.dest) +## if options['-tcc']: +## os.environ['PYPY_CC'] = 'tcc -shared -o "%s.so" "%s.c"' + if 
options1.debug: annmodel.DEBUG = True try: err = None - if load_file: + if options1.loadfile: + loaded_dic = load(options1.filename) t = loaded_dic['trans'] entry_point = t.entrypoint inputtypes = loaded_dic['inputtypes'] targetspec_dic = loaded_dic['targetspec_dic'] targetspec = loaded_dic['targetspec'] old_options = loaded_dic['options'] - for name in '-no-a -no-t -no-o'.split(): + for name in 'no_a specialize optimize'.split(): # if one of these options has not been set, before, # then the action has been done and must be prevented, now. if not old_options[name]: @@ -377,13 +386,8 @@ print 'option %s is implied by the load' % name options[name] = True print "continuing Analysis as defined by %s, loaded from %s" %( - targetspec, load_file) + targetspec, options1.loadname) targetspec_dic['target'] = None -## print 'options in effect:', options -## try: -## analyse(None) -## except TyperError: -## err = sys.exc_info() else: targetspec_dic = {} sys.path.insert(0, os.path.dirname(targetspec)) @@ -400,11 +404,11 @@ except TyperError: err = sys.exc_info() print '-'*60 - if save_file: + if options1.savefile: print 'saving state to %s' % save_file if err: print '*** this save is done after errors occured ***' - save(t, save_file, + save(t, savefile, trans=t, inputtypes=inputtypes, targetspec=targetspec, @@ -414,14 +418,14 @@ if err: raise err[0], err[1], err[2] gcpolicy = None - if options['-boehm']: + if options1.gc =='boehm': from pypy.translator.c import gc gcpolicy = gc.BoehmGcPolicy - if options['-no-gc']: + if options1.gc == 'none': from pypy.translator.c import gc gcpolicy = gc.NoneGcPolicy - if options['-llinterpret']: + if options1.backend == 'llinterpret': def interpret(): import py from pypy.rpython.llinterp import LLInterpreter @@ -430,34 +434,24 @@ interp.eval_function(entry_point, targetspec_dic['get_llinterp_args']()) interpret() - elif options['-no-c']: + elif options1.gencode: print 'Not generating C code.' - elif options['-c']: - if options['-llvm']: - print 'Generating LLVM code without compiling it...' - filename = t.llvmcompile(really_compile=False, - standalone=standalone) - else: - print 'Generating C code without compiling it...' - filename = t.ccompile(really_compile=False, - standalone=standalone, gcpolicy=gcpolicy) - update_usession_dir() - print 'Written %s.' % (filename,) else: - if options['-llvm']: - print 'Generating and compiling LLVM code...' - c_entry_point = t.llvmcompile(standalone=standalone, exe_name='pypy-llvm') - else: - print 'Generating and compiling C code...' - c_entry_point = t.ccompile(standalone=standalone, gcpolicy=gcpolicy) - if standalone: # xxx fragile and messy - import shutil - exename = mkexename(c_entry_point) - newexename = mkexename('./pypy-c') - shutil.copy(exename, newexename) - c_entry_point = newexename + print 'Generating %s %s code...' %("and compiling " and options1.really_compile or "",options1.backend) + keywords = {'really_compile' : options1.really_compile, + 'standalone' : standalone, + 'gcpolicy' : gcpolicy} + c_entry_point = t.compile(options1.backend,keywords) + + if standalone and options1.backend == 'c': # xxx fragile and messy + import shutil + exename = mkexename(c_entry_point) + newexename = mkexename('./pypy-c') + shutil.copy(exename, newexename) + c_entry_point = newexename update_usession_dir() - if not options['-o']: + print 'Written %s.' % (c_entry_point,) + if not options.run: print 'Running!' 
if standalone: os.system(c_entry_point) @@ -466,7 +460,7 @@ except SystemExit: raise except: - debug(True) + if t: debug(True) raise SystemExit(1) else: - debug(False) + if t: debug(False) Modified: pypy/dist/pypy/translator/translator.py ============================================================================== --- pypy/dist/pypy/translator/translator.py (original) +++ pypy/dist/pypy/translator/translator.py Wed Sep 7 11:18:17 2005 @@ -251,6 +251,14 @@ mod = make_module_from_pyxstring(name, udir, pyxcode) return getattr(mod, name) + def compile(self, compiler='c', **kw): + compiler += 'compile' + if hasattr(self, compiler): + compiler = getattr(self,compiler) + return compiler(**kw) + else: + raise NotImplementedError, "Compiler not known", compiler + def ccompile(self, really_compile=True, standalone=False, gcpolicy=None): """Returns compiled function (living in a new C-extension module), compiled using the C generator. From ericvrp at codespeak.net Wed Sep 7 14:15:23 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Wed, 7 Sep 2005 14:15:23 +0200 (CEST) Subject: [pypy-svn] r17321 - pypy/dist/pypy/translator/llvm Message-ID: <20050907121523.070CA27B76@code1.codespeak.net> Author: ericvrp Date: Wed Sep 7 14:15:22 2005 New Revision: 17321 Modified: pypy/dist/pypy/translator/llvm/externs2ll.py pypy/dist/pypy/translator/llvm/funcnode.py pypy/dist/pypy/translator/llvm/opwriter.py Log: cosmetics Modified: pypy/dist/pypy/translator/llvm/externs2ll.py ============================================================================== --- pypy/dist/pypy/translator/llvm/externs2ll.py (original) +++ pypy/dist/pypy/translator/llvm/externs2ll.py Wed Sep 7 14:15:22 2005 @@ -66,7 +66,7 @@ def post_setup_externs(db): - rtyper = db._translator.rtyper + rtyper = db.translator.rtyper from pypy.translator.c.extfunc import predeclare_all # hacks to make predeclare_all work @@ -78,7 +78,7 @@ if isinstance(obj, lltype.LowLevelType): db.prepare_type(obj) elif isinstance(obj, types.FunctionType): - funcptr = getfunctionptr(db._translator, obj) + funcptr = getfunctionptr(db.translator, obj) c = inputconst(lltype.typeOf(funcptr), funcptr) db.prepare_arg_value(c) elif isinstance(lltype.typeOf(obj), lltype.Ptr): @@ -105,7 +105,7 @@ s = "#define %s struct %s\n%s;\n" % (c_name, c_name, c_name) ccode.append(s) elif isinstance(obj, types.FunctionType): - funcptr = getfunctionptr(db._translator, obj) + funcptr = getfunctionptr(db.translator, obj) c = inputconst(lltype.typeOf(funcptr), funcptr) predeclarefn(c_name, db.repr_arg(c)) elif isinstance(lltype.typeOf(obj), lltype.Ptr): Modified: pypy/dist/pypy/translator/llvm/funcnode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/funcnode.py (original) +++ pypy/dist/pypy/translator/llvm/funcnode.py Wed Sep 7 14:15:22 2005 @@ -39,7 +39,7 @@ self.ref = self.make_ref('%pypy_', value.graph.name) self.graph = value.graph remove_same_as(self.graph) - remove_double_links(self.db._translator, self.graph) + remove_double_links(self.db.translator, self.graph) def __str__(self): return "" %(self.ref,) Modified: pypy/dist/pypy/translator/llvm/opwriter.py ============================================================================== --- pypy/dist/pypy/translator/llvm/opwriter.py (original) +++ pypy/dist/pypy/translator/llvm/opwriter.py Wed Sep 7 14:15:22 2005 @@ -335,7 +335,7 @@ else: self.codewriter.invoke_void(functionref, argrefs, argtypes, none_label, exc_label) - e = 
self.db._translator.rtyper.getexceptiondata() + e = self.db.translator.rtyper.getexceptiondata() ll_exception_match = '%pypy_' + e.ll_exception_match.__name__ lltype_of_exception_type = ('%structtype.' + e.lltype_of_exception_type.TO.__name__ From ericvrp at codespeak.net Wed Sep 7 14:15:51 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Wed, 7 Sep 2005 14:15:51 +0200 (CEST) Subject: [pypy-svn] r17322 - pypy/dist/pypy/translator/llvm Message-ID: <20050907121551.167C227B76@code1.codespeak.net> Author: ericvrp Date: Wed Sep 7 14:15:50 2005 New Revision: 17322 Modified: pypy/dist/pypy/translator/llvm/database.py Log: more cosmetics Modified: pypy/dist/pypy/translator/llvm/database.py ============================================================================== --- pypy/dist/pypy/translator/llvm/database.py (original) +++ pypy/dist/pypy/translator/llvm/database.py Wed Sep 7 14:15:50 2005 @@ -17,7 +17,7 @@ class Database(object): def __init__(self, translator): - self._translator = translator + self.translator = translator self.obj2node = {} self._pendingsetup = [] self._tmpcount = 1 From ericvrp at codespeak.net Wed Sep 7 15:11:14 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Wed, 7 Sep 2005 15:11:14 +0200 (CEST) Subject: [pypy-svn] r17325 - in pypy/dist/pypy/translator/llvm: . module test Message-ID: <20050907131114.AB80527B68@code1.codespeak.net> Author: ericvrp Date: Wed Sep 7 15:11:12 2005 New Revision: 17325 Added: pypy/dist/pypy/translator/llvm/gc.py (contents, props changed) Modified: pypy/dist/pypy/translator/llvm/build_llvm_module.py pypy/dist/pypy/translator/llvm/genllvm.py pypy/dist/pypy/translator/llvm/module/extfunction.py pypy/dist/pypy/translator/llvm/pyxwrapper.py pypy/dist/pypy/translator/llvm/test/test_gc.py Log: Refactored garbage collection into gc.py (similar to genC) Modified: pypy/dist/pypy/translator/llvm/build_llvm_module.py ============================================================================== --- pypy/dist/pypy/translator/llvm/build_llvm_module.py (original) +++ pypy/dist/pypy/translator/llvm/build_llvm_module.py Wed Sep 7 15:11:12 2005 @@ -10,7 +10,6 @@ from pypy.translator.tool.cbuild import make_c_from_pyxfile from pypy.translator.tool import stdoutcapture -from pypy.translator.llvm.genllvm import use_boehm_gc from pypy.translator.llvm.log import log EXCEPTIONS_SWITCHES = "-enable-correct-eh-support" @@ -55,7 +54,7 @@ log.build(cmd) cmdexec(cmd) -def make_module_from_llvm(llvmfile, pyxfile=None, optimize=True, exe_name=None): +def make_module_from_llvm(genllvm, llvmfile, pyxfile=None, optimize=True, exe_name=None): include_dir = py.magic.autopath().dirpath() dirpath = llvmfile.dirpath() lastdir = path.local() @@ -67,12 +66,8 @@ else: source_files = [] object_files = [] - library_files = [] - if use_boehm_gc: - gc_libs = '-lgc -lpthread' - library_files.append('gc') - else: - gc_libs = '' + library_files = genllvm.gcpolicy.gc_libraries() + gc_libs = ' '.join(['-l' + lib for lib in library_files]) if optimize: optimization_switches = OPTIMIZATION_SWITCHES Added: pypy/dist/pypy/translator/llvm/gc.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/llvm/gc.py Wed Sep 7 15:11:12 2005 @@ -0,0 +1,85 @@ +class GcPolicy: + def __init__(self): + raise Exception, 'GcPolicy should not be used directly' + + def gc_libraries(self): + return [] + + def llvm_code(self): + return ''' +internal fastcc sbyte* %gc_malloc(uint %n) { + %nn = cast uint %n to 
uint + %ptr = malloc sbyte, uint %nn + ret sbyte* %ptr +} + +internal fastcc sbyte* %gc_malloc_atomic(uint %n) { + %nn = cast uint %n to uint + %ptr = malloc sbyte, uint %nn + ret sbyte* %ptr +} +''' + + def pyrex_code(self): + return '' + + def new(gcpolicy=None): #factory + if gcpolicy is None or gcpolicy == 'boehm': + from os.path import exists + boehm_on_path = exists('/usr/lib/libgc.so') or exists('/usr/lib/libgc.a') + if not boehm_on_path: + raise Exception, 'Boehm GC libary not found in /usr/lib' + from pypy.translator.llvm.gc import BoehmGcPolicy + gcpolicy = BoehmGcPolicy() + elif gcpolicy == 'ref': + from pypy.translator.llvm.gc import RefcountingGcPolicy + gcpolicy = RefcountingGcPolicy() + elif gcpolicy == 'none': + from pypy.translator.llvm.gc import NoneGcPolicy + gcpolicy = NoneGcPolicy() + else: + raise Exception, 'unknown gcpolicy: ' + str(gcpolicy) + return gcpolicy + new = staticmethod(new) + + +class NoneGcPolicy(GcPolicy): + def __init__(self): + pass + + +class BoehmGcPolicy(GcPolicy): + def __init__(self): + pass + + def gc_libraries(self): + return ['gc'] # xxx on windows? + + def llvm_code(self): + return ''' +declare ccc sbyte* %GC_malloc(uint) +declare ccc sbyte* %GC_malloc_atomic(uint) + +internal fastcc sbyte* %gc_malloc(uint %n) { + %ptr = call ccc sbyte* %GC_malloc(uint %n) + ret sbyte* %ptr +} + +internal fastcc sbyte* %gc_malloc_atomic(uint %n) { + %ptr = call ccc sbyte* %GC_malloc_atomic(uint %n) + ret sbyte* %ptr +} +''' + + def pyrex_code(self): + return ''' +cdef extern int GC_get_heap_size() + +def GC_get_heap_size_wrapper(): + return GC_get_heap_size() +''' + + +class RefcountingGcPolicy(GcPolicy): + def __init__(self): + raise NotImplementedError, 'RefcountingGcPolicy' Modified: pypy/dist/pypy/translator/llvm/genllvm.py ============================================================================== --- pypy/dist/pypy/translator/llvm/genllvm.py (original) +++ pypy/dist/pypy/translator/llvm/genllvm.py Wed Sep 7 15:11:12 2005 @@ -1,7 +1,3 @@ -from os.path import exists -use_boehm_gc = exists('/usr/lib/libgc.so') or exists('/usr/lib/libgc.a') -#use_boehm_gc = False - import os import time import types @@ -20,10 +16,11 @@ DEFAULT_TAIL, DEFAULT_CCONV from pypy.translator.llvm import extfuncnode from pypy.translator.llvm.module.extfunction import extdeclarations, \ - extfunctions, gc_boehm, gc_disabled, dependencies + extfunctions, dependencies from pypy.translator.llvm.node import LLVMNode from pypy.translator.llvm.structnode import StructNode from pypy.translator.llvm.externs2ll import post_setup_externs, generate_llfile +from pypy.translator.llvm.gc import GcPolicy from pypy.translator.translator import Translator @@ -34,12 +31,13 @@ class GenLLVM(object): - def __init__(self, translator, debug=True): + def __init__(self, translator, gcpolicy=None, debug=True): # reset counters LLVMNode.nodename_count = {} self.db = Database(translator) self.translator = translator + self.gcpolicy = gcpolicy translator.checkgraphs() extfuncnode.ExternalFuncNode.used_external_functions = {} @@ -90,6 +88,20 @@ print 'STATS', s def gen_llvm_source(self, func=None): + """ + init took 00m00s + setup_all took 08m14s + setup_all externs took 00m00s + generate_ll took 00m02s + write externs type declarations took 00m00s + write data type declarations took 00m02s + write global constants took 09m49s + write function prototypes took 00m00s + write declarations took 00m03s + write implementations took 01m54s + write support functions took 00m00s + write external functions took 
00m00s + """ self._checkpoint() if func is None: @@ -172,12 +184,8 @@ nl(); comment("Function Implementation") codewriter.startimpl() - if use_boehm_gc: - gc_funcs = gc_boehm - else: - gc_funcs = gc_disabled - for gc_func in gc_funcs.split('\n'): - codewriter.append(gc_func) + + codewriter.append(self.gcpolicy.llvm_code()) for typ_decl in self.db.getnodes(): typ_decl.writeimpl(codewriter) @@ -254,23 +262,23 @@ exe_name=None): if standalone: - return build_llvm_module.make_module_from_llvm(filename, + return build_llvm_module.make_module_from_llvm(self, filename, optimize=optimize, exe_name=exe_name) else: postfix = '' basename = filename.purebasename + '_wrapper' + postfix + '.pyx' pyxfile = filename.new(basename = basename) - write_pyx_wrapper(self.entrynode, pyxfile) - return build_llvm_module.make_module_from_llvm(filename, + write_pyx_wrapper(self, pyxfile) + return build_llvm_module.make_module_from_llvm(self, filename, pyxfile=pyxfile, optimize=optimize) def _debug_prototype(self, codewriter): codewriter.append("declare int %printf(sbyte*, ...)") -def genllvm(translator, log_source=False, **kwds): - gen = GenLLVM(translator) +def genllvm(translator, gcpolicy=None, log_source=False, **kwds): + gen = GenLLVM(translator, GcPolicy.new(gcpolicy)) filename = gen.gen_llvm_source() if log_source: log.genllvm(open(filename).read()) Modified: pypy/dist/pypy/translator/llvm/module/extfunction.py ============================================================================== --- pypy/dist/pypy/translator/llvm/module/extfunction.py (original) +++ pypy/dist/pypy/translator/llvm/module/extfunction.py Wed Sep 7 15:11:12 2005 @@ -9,33 +9,6 @@ %last_exception_value = global %RPYTHON_EXCEPTION* null """ -gc_boehm = """declare ccc sbyte* %GC_malloc(uint) -declare ccc sbyte* %GC_malloc_atomic(uint) - -internal fastcc sbyte* %gc_malloc(uint %n) { - %ptr = call ccc sbyte* %GC_malloc(uint %n) - ret sbyte* %ptr -} - -internal fastcc sbyte* %gc_malloc_atomic(uint %n) { - %ptr = call ccc sbyte* %GC_malloc_atomic(uint %n) - ret sbyte* %ptr -} -""" - -gc_disabled = """internal fastcc sbyte* %gc_malloc(uint %n) { - %nn = cast uint %n to uint - %ptr = malloc sbyte, uint %nn - ret sbyte* %ptr -} - -internal fastcc sbyte* %gc_malloc_atomic(uint %n) { - %nn = cast uint %n to uint - %ptr = malloc sbyte, uint %nn - ret sbyte* %ptr -} -""" - extfunctions = {} #dependencies, llvm-code import support Modified: pypy/dist/pypy/translator/llvm/pyxwrapper.py ============================================================================== --- pypy/dist/pypy/translator/llvm/pyxwrapper.py (original) +++ pypy/dist/pypy/translator/llvm/pyxwrapper.py Wed Sep 7 15:11:12 2005 @@ -1,7 +1,6 @@ import sys from pypy.translator.llvm.log import log from pypy.rpython import lltype -from pypy.translator.llvm.genllvm import use_boehm_gc log = log.pyrex PRIMITIVES_TO_C = {lltype.Bool: "char", @@ -24,7 +23,8 @@ else: assert False, "Unsupported platform" -def write_pyx_wrapper(funcgen, targetpath): +def write_pyx_wrapper(genllvm, targetpath): + funcgen = genllvm.entrynode def c_declaration(): returntype = PRIMITIVES_TO_C[ funcgen.graph.returnblock.inputargs[0].concretetype] @@ -43,12 +43,7 @@ append("class LLVMException(Exception):") append(" pass") append("") - if use_boehm_gc: - append("cdef extern int GC_get_heap_size()") - append("") - append("def GC_get_heap_size_wrapper():") - append(" return GC_get_heap_size()") - append("") + append(genllvm.gcpolicy.pyrex_code()) append("def %s_wrapper(%s):" % (funcgen.ref.strip("%"), ", ".join(inputargs))) 
append(" result = __entrypoint__%s(%s)" % (funcgen.ref.strip("%"), ", ".join(inputargs))) append(" if __entrypoint__raised_LLVMException(): #not caught by the LLVM code itself") Modified: pypy/dist/pypy/translator/llvm/test/test_gc.py ============================================================================== --- pypy/dist/pypy/translator/llvm/test/test_gc.py (original) +++ pypy/dist/pypy/translator/llvm/test/test_gc.py Wed Sep 7 15:11:12 2005 @@ -1,16 +1,16 @@ import sys import py -from pypy.translator.llvm.genllvm import use_boehm_gc from pypy.translator.llvm.test.runtest import compile_module_function py.log.setconsumer("genllvm", py.log.STDOUT) py.log.setconsumer("genllvm database prepare", None) def test_GC_malloc(): - if not use_boehm_gc: - py.test.skip("test_GC_malloc skipped because Boehm collector library was not found") - return + #XXX how to get to gcpolicy? + #if not use_boehm_gc: + # py.test.skip("test_GC_malloc skipped because Boehm collector library was not found") + # return def tuple_getitem(n): x = 666 i = 0 From ericvrp at codespeak.net Wed Sep 7 16:03:30 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Wed, 7 Sep 2005 16:03:30 +0200 (CEST) Subject: [pypy-svn] r17327 - pypy/dist/pypy/translator/goal Message-ID: <20050907140330.026D127B66@code1.codespeak.net> Author: ericvrp Date: Wed Sep 7 16:03:29 2005 New Revision: 17327 Modified: pypy/dist/pypy/translator/goal/run_pypy-llvm.sh pypy/dist/pypy/translator/goal/runtranslate.sh (contents, props changed) pypy/dist/pypy/translator/goal/translate_pypy_new.py Log: fixes to make more llvm compatible Modified: pypy/dist/pypy/translator/goal/run_pypy-llvm.sh ============================================================================== --- pypy/dist/pypy/translator/goal/run_pypy-llvm.sh (original) +++ pypy/dist/pypy/translator/goal/run_pypy-llvm.sh Wed Sep 7 16:03:29 2005 @@ -4,6 +4,7 @@ #python translate_pypy.py -no-c -no-o -text -fork2 # running it all python translate_pypy.py target_pypy-llvm -text -llvm $* +#python translate_pypy_new.py targetpypystandalone --backend=llvm --gc=boehm --pygame $* # How to work in parallel: Modified: pypy/dist/pypy/translator/goal/runtranslate.sh ============================================================================== --- pypy/dist/pypy/translator/goal/runtranslate.sh (original) +++ pypy/dist/pypy/translator/goal/runtranslate.sh Wed Sep 7 16:03:29 2005 @@ -1,3 +1,4 @@ +#!/bin/sh export RTYPERORDER=order,module-list.pedronis # stopping on the first error #python translate_pypy.py -no-c -no-o -text -fork Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Wed Sep 7 16:03:29 2005 @@ -417,13 +417,16 @@ ) if err: raise err[0], err[1], err[2] - gcpolicy = None - if options1.gc =='boehm': - from pypy.translator.c import gc - gcpolicy = gc.BoehmGcPolicy - if options1.gc == 'none': - from pypy.translator.c import gc - gcpolicy = gc.NoneGcPolicy + if options1.backend == 'c': #XXX probably better to supply gcpolicy as string to the backends + gcpolicy = None + if options1.gc =='boehm': + from pypy.translator.c import gc + gcpolicy = gc.BoehmGcPolicy + if options1.gc == 'none': + from pypy.translator.c import gc + gcpolicy = gc.NoneGcPolicy + elif options1.backend == 'llvm': + gcpolicy = options1.gc if options1.backend == 'llinterpret': def interpret(): @@ -434,14 
+437,14 @@ interp.eval_function(entry_point, targetspec_dic['get_llinterp_args']()) interpret() - elif options1.gencode: - print 'Not generating C code.' + #elif options1.gencode: + # print 'Not generating C code.' else: - print 'Generating %s %s code...' %("and compiling " and options1.really_compile or "",options1.backend) + print 'Generating %s %s code...' %(options1.really_compile and "and compiling " or "",options1.backend) keywords = {'really_compile' : options1.really_compile, 'standalone' : standalone, 'gcpolicy' : gcpolicy} - c_entry_point = t.compile(options1.backend,keywords) + c_entry_point = t.compile(options1.backend,**keywords) if standalone and options1.backend == 'c': # xxx fragile and messy import shutil From ericvrp at codespeak.net Wed Sep 7 16:03:54 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Wed, 7 Sep 2005 16:03:54 +0200 (CEST) Subject: [pypy-svn] r17328 - pypy/dist/pypy/translator Message-ID: <20050907140354.72FCC27B68@code1.codespeak.net> Author: ericvrp Date: Wed Sep 7 16:03:53 2005 New Revision: 17328 Modified: pypy/dist/pypy/translator/translator.py Log: fix for llvm gcpolicy Modified: pypy/dist/pypy/translator/translator.py ============================================================================== --- pypy/dist/pypy/translator/translator.py (original) +++ pypy/dist/pypy/translator/translator.py Wed Sep 7 16:03:53 2005 @@ -282,7 +282,7 @@ else: return genc.CExtModuleBuilder(self, gcpolicy=gcpolicy) - def llvmcompile(self, really_compile=True, standalone=False, optimize=True, exe_name=None): + def llvmcompile(self, really_compile=True, standalone=False, optimize=True, exe_name=None, gcpolicy=None): """llvmcompile(self, really_compile=True, standalone=False, optimize=True) -> LLVM translation Returns LLVM translation with or without optimization. 
@@ -296,7 +296,7 @@ else: exe_name = None self.frozen = True - return genllvm.genllvm(self, really_compile=really_compile, standalone=standalone, optimize=optimize, exe_name=exe_name) + return genllvm.genllvm(self, really_compile=really_compile, standalone=standalone, optimize=optimize, exe_name=exe_name, gcpolicy=gcpolicy) def call(self, *args): """Calls underlying Python function.""" From pedronis at codespeak.net Wed Sep 7 16:37:47 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Wed, 7 Sep 2005 16:37:47 +0200 (CEST) Subject: [pypy-svn] r17331 - in pypy/dist/pypy/interpreter: astcompiler pyparser/test Message-ID: <20050907143747.97D3A27B66@code1.codespeak.net> Author: pedronis Date: Wed Sep 7 16:37:45 2005 New Revision: 17331 Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py pypy/dist/pypy/interpreter/astcompiler/pycodegen.py pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py Log: modified mostly astcompiler/pyassem to avoid mixed-type lists/tuples, using small class instances to represent instructions in the blocks instead Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pyassem.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pyassem.py Wed Sep 7 16:37:45 2005 @@ -10,6 +10,61 @@ from pypy.interpreter.pycode import PyCode from pypy.interpreter.baseobjspace import W_Root + +class Instr: + has_arg = False + + def __init__(self, op): + self.op = op + +class InstrWithArg(Instr): + has_arg = True + +class InstrName(InstrWithArg): + def __init__(self, inst, name): + Instr.__init__(self, inst) + self.name = name + + def getArg(self): + "NOT_RPYTHON" + return self.name + +class InstrInt(InstrWithArg): + def __init__(self, inst, intval): + Instr.__init__(self, inst) + self.intval = intval + + def getArg(self): + "NOT_RPYTHON" + return self.intval + +class InstrBlock(InstrWithArg): + def __init__(self, inst, block): + Instr.__init__(self, inst) + self.block = block + + def getArg(self): + "NOT_RPYTHON" + return self.block + +class InstrObj(InstrWithArg): + def __init__(self, inst, obj): + Instr.__init__(self, inst) + self.obj = obj + + def getArg(self): + "NOT_RPYTHON" + return self.obj + +class InstrCode(InstrWithArg): + def __init__(self, inst, gen): + Instr.__init__(self, inst) + self.gen = gen + + def getArg(self): + "NOT_RPYTHON" + return self.gen + class FlowGraph: def __init__(self, space): self.space = space @@ -75,36 +130,41 @@ print "\t", inst if inst in ['RETURN_VALUE', 'YIELD_VALUE']: self.current.addOutEdge(self.exit) - self.current.emit( (inst,) ) + self.current.emit( Instr(inst) ) - def emitop(self, inst, arg ): - if self._debug: - print "\t", inst, arg - self.current.emit( (inst,arg) ) + #def emitop(self, inst, arg ): + # if self._debug: + # print "\t", inst, arg + # self.current.emit( (inst,arg) ) def emitop_obj(self, inst, obj ): if self._debug: print "\t", inst, repr(obj) - self.current.emit( (inst,obj) ) + self.current.emit( InstrObj(inst,obj) ) + + def emitop_code(self, inst, obj ): + if self._debug: + print "\t", inst, repr(obj) + self.current.emit( InstrCode(inst, obj) ) def emitop_int(self, inst, intval ): if self._debug: print "\t", inst, intval assert type(intval)==int - self.current.emit( (inst,intval) ) + self.current.emit( InstrInt(inst,intval) ) def emitop_block(self, inst, block): if self._debug: print "\t", inst, block assert 
isinstance(block, Block) self.current.addOutEdge( block ) - self.current.emit( (inst,block) ) + self.current.emit( InstrBlock(inst,block) ) def emitop_name(self, inst, name ): if self._debug: print "\t", inst, name assert type(name)==str - self.current.emit( (inst,name) ) + self.current.emit( InstrName(inst,name) ) def getBlocksInOrder(self): """Return the blocks in reverse postorder @@ -203,8 +263,9 @@ if index[c] < i: forward_p = 0 for inst in b.insts: - if inst[0] == 'JUMP_FORWARD': - if inst[1] == c: + if inst.op == 'JUMP_FORWARD': + assert isinstance(inst, InstrBlock) + if inst.block == c: forward_p = 1 if not forward_p: continue @@ -285,12 +346,13 @@ '\n'.join(insts)) def emit(self, inst): - op = inst[0] + op = inst.op if op[:4] == 'JUMP': - self.outEdges.add(inst[1]) + assert isinstance(inst, InstrBlock) + self.outEdges.add(inst.block) ## if op=="LOAD_CONST": ## assert isinstance( inst[1], W_Root ) or hasattr( inst[1], 'getCode') - self.insts.append( list(inst) ) + self.insts.append( inst ) def getInstructions(self): return self.insts @@ -323,10 +385,10 @@ transfer. """ try: - op, arg = self.insts[-1] + inst = self.insts[-1] except (IndexError, ValueError): return - if op in self._uncond_transfer: + if inst.op in self._uncond_transfer: self.next = [] def get_children(self): @@ -342,11 +404,10 @@ """ contained = [] for inst in self.insts: - if len(inst) == 1: - continue - op = inst[1] - if hasattr(op, 'graph'): - contained.append(op.graph) + if isinstance(inst, InstrCode): + gen = inst.gen + if gen: + contained.append(gen) return contained # flags for code objects @@ -438,14 +499,14 @@ sys.stdout = io pc = 0 for t in self.insts: - opname = t[0] + opname = t.op if opname == "SET_LINENO": print - if len(t) == 1: + if not t.has_arg: print "\t", "%3d" % pc, opname pc = pc + 1 else: - print "\t", "%3d" % pc, opname, t[1] + print "\t", "%3d" % pc, opname, t.getArg() pc = pc + 3 if io: sys.stdout = save @@ -491,39 +552,45 @@ for b in self.orderedblocks: begin[b] = pc for inst in b.getInstructions(): - if len(inst) == 1: + if not inst.has_arg: insts.append(inst) pc = pc + 1 - elif inst[0] != "SET_LINENO": - opname, arg = inst - if self.hasjrel.has_elt(opname): + elif inst.op != "SET_LINENO": + if self.hasjrel.has_elt(inst.op): + assert isinstance(inst, InstrBlock) # relative jump - no extended arg - forward_refs.append( (arg, inst, pc ) ) + block = inst.block + inst = InstrInt(inst.op, 0) + forward_refs.append( (block, inst, pc) ) insts.append(inst) pc = pc + 3 - elif self.hasjabs.has_elt(opname): + elif self.hasjabs.has_elt(inst.op): # absolute jump - can be extended if backward + assert isinstance(inst, InstrBlock) + arg = inst.block if arg in begin: # can only extend argument if backward offset = begin[arg] hi, lo = divmod(offset,65536) if hi>0: # extended argument - insts.append( ["EXTENDED_ARG", hi ] ) + insts.append( InstrInt("EXTENDED_ARG", hi) ) pc = pc + 3 - inst[1] = lo + inst = InstrInt(inst.op, lo) else: + inst = InstrInt(inst.op, 0) forward_refs.append( (arg, inst, pc ) ) insts.append(inst) pc = pc + 3 else: + assert isinstance(inst, InstrInt) + arg = inst.intval # numerical arg - assert type(arg)==int hi,lo = divmod(arg,65536) if hi>0: # extended argument - insts.append( ["EXTENDED_ARG", hi ] ) - inst[1] = lo + insts.append( InstrInt("EXTENDED_ARG", hi) ) + inst.intval = lo pc = pc + 3 insts.append(inst) pc = pc + 3 @@ -532,14 +599,14 @@ end[b] = pc pc = 0 - for arg, inst, pc in forward_refs: - opname, block = inst + for block, inst, pc in forward_refs: + opname = inst.op 
abspos = begin[block] if self.hasjrel.has_elt(opname): offset = abspos - pc - 3 - inst[1] = offset + inst.intval = offset else: - inst[1] = abspos + inst.intval = abspos self.stage = FLAT hasjrel = misc.Set() @@ -557,12 +624,14 @@ self.sort_cellvars() for b in self.orderedblocks: - for inst in b.getInstructions(): - if len(inst) == 2: - opname, oparg = inst + insts = b.getInstructions() + for i in range(len(insts)): + inst = insts[i] + if inst.has_arg: + opname = inst.op conv = self._converters.get(opname, None) if conv: - inst[1] = conv(self, oparg) + insts[i] = conv(self, inst) self.stage = CONV def sort_cellvars(self): @@ -587,36 +656,54 @@ must treat these two separately, so it does an explicit type comparison before comparing the values. """ - t = type(name) + assert isinstance(name, str) for i in range(len(list)): - if t == type(list[i]) and list[i] == name: + if list[i] == name: return i end = len(list) list.append(name) return end + def _lookupConst(self, w_obj, list_w): + space = self.space + w_obj_type = space.type(w_obj) + for i in range(len(list_w)): + cand_w = list_w[i] + if space.is_w(space.type(cand_w), w_obj_type) and space.eq_w(list_w[i], w_obj): + return i + end = len(list_w) + list_w.append(w_obj) + return end + _converters = {} - def _convert_LOAD_CONST(self, arg): - if hasattr(arg, 'getCode'): - arg = arg.getCode() -## assert arg is not None - return self._lookupName(arg, self.consts) - def _convert_LOAD_FAST(self, arg): + def _convert_LOAD_CONST(self, inst): + if isinstance(inst, InstrCode): + w_obj = inst.gen.getCode() + else: + assert isinstance(inst, InstrObj) + w_obj = inst.obj + #assert w_obj is not None + index = self._lookupConst(w_obj, self.consts) + return InstrInt(inst.op, index) + + def _convert_LOAD_FAST(self, inst): + assert isinstance(inst, InstrName) + arg = inst.name self._lookupName(arg, self.names) - return self._lookupName(arg, self.varnames) + index= self._lookupName(arg, self.varnames) + return InstrInt(inst.op, index) _convert_STORE_FAST = _convert_LOAD_FAST _convert_DELETE_FAST = _convert_LOAD_FAST - def _convert_LOAD_NAME(self, arg): - if self.klass is None: - self._lookupName(arg, self.varnames) - return self._lookupName(arg, self.names) - - def _convert_NAME(self, arg): + def _convert_NAME(self, inst): + assert isinstance(inst, InstrName) + arg = inst.name if self.klass is None: self._lookupName(arg, self.varnames) - return self._lookupName(arg, self.names) + index = self._lookupName(arg, self.names) + return InstrInt(inst.op, index) + _convert_LOAD_NAME = _convert_NAME _convert_STORE_NAME = _convert_NAME _convert_DELETE_NAME = _convert_NAME _convert_IMPORT_NAME = _convert_NAME @@ -628,20 +715,30 @@ _convert_STORE_GLOBAL = _convert_NAME _convert_DELETE_GLOBAL = _convert_NAME - def _convert_DEREF(self, arg): + def _convert_DEREF(self, inst): + assert isinstance(inst, InstrName) + arg = inst.name self._lookupName(arg, self.names) self._lookupName(arg, self.varnames) - return self._lookupName(arg, self.closure) + index = self._lookupName(arg, self.closure) + return InstrInt(inst.op, index) _convert_LOAD_DEREF = _convert_DEREF _convert_STORE_DEREF = _convert_DEREF - def _convert_LOAD_CLOSURE(self, arg): + def _convert_LOAD_CLOSURE(self, inst): + assert isinstance(inst, InstrName) + arg = inst.name self._lookupName(arg, self.varnames) - return self._lookupName(arg, self.closure) - + index = self._lookupName(arg, self.closure) + return InstrInt(inst.op, index) + _cmp = list(dis.cmp_op) - def _convert_COMPARE_OP(self, arg): - return 
self._cmp.index(arg) + def _convert_COMPARE_OP(self, inst): + assert isinstance(inst, InstrName) + arg = inst.name + index = self._cmp.index(arg) + return InstrInt(inst.op, index) + # similarly for other opcodes... @@ -655,16 +752,17 @@ assert self.stage == FLAT self.lnotab = lnotab = LineAddrTable() for t in self.insts: - opname = t[0] + opname = t.op if self._debug: - if len(t)==1: + if not t.has_arg: print "x",opname else: - print "x",opname, t[1] - if len(t) == 1: + print "x",opname, t.getArg() + if not t.has_arg: lnotab.addCode(self.opnum[opname]) else: - oparg = t[1] + assert isinstance(t, InstrInt) + oparg = t.intval if opname == "SET_LINENO": lnotab.nextLine(oparg) continue @@ -672,8 +770,9 @@ try: lnotab.addCode(self.opnum[opname], lo, hi) except ValueError: - print opname, oparg - print self.opnum[opname], lo, hi + if self._debug: + print opname, oparg + print self.opnum[opname], lo, hi raise self.stage = DONE @@ -693,13 +792,18 @@ argcount = argcount - 1 # was return new.code, now we just return the parameters and let # the caller create the code object + # XXX _code_new_w itself is not really annotable return PyCode(self.space)._code_new_w( argcount, nlocals, self.stacksize, self.flags, - self.lnotab.getCode(), self.getConsts(), - tuple(self.names), tuple(self.varnames), + self.lnotab.getCode(), + tuple(self.getConsts()), + tuple(self.names), + tuple(self.varnames), self.filename, self.name, self.lnotab.firstline, - self.lnotab.getTable(), tuple(self.freevars), - tuple(self.cellvars)) + self.lnotab.getTable(), + tuple(self.freevars), + tuple(self.cellvars) + ) def getConsts(self): """Return a tuple for the const slot of the code object @@ -707,12 +811,7 @@ Must convert references to code (MAKE_FUNCTION) to code objects recursively. """ - l = [] - for elt in self.consts: - if isinstance(elt, PyFlowGraph): - elt = elt.getCode() - l.append(elt) - return tuple(l) + return self.consts[:] def isJump(opname): if opname[:4] == 'JUMP': @@ -860,7 +959,7 @@ depth = 0 maxDepth = 0 for i in insts: - opname = i[0] + opname = i.op if debug: print i, delta = self.effect.get(opname, None) @@ -877,7 +976,8 @@ if delta is None: meth = DEPTH_OP_TRACKER.get( opname, None ) if meth is not None: - depth = depth + meth(i[1]) + assert isinstance(i, InstrInt) + depth = depth + meth(i.intval) if depth > maxDepth: maxDepth = depth if debug: Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Wed Sep 7 16:37:45 2005 @@ -225,6 +225,9 @@ def emitop_obj(self, inst, obj): return self.graph.emitop_obj( inst, obj ) + def emitop_code(self, inst, gen): + return self.graph.emitop_code( inst, gen ) + def emitop_int(self, inst, op): assert type(op) == int return self.graph.emitop_int( inst, op ) @@ -396,12 +399,10 @@ if frees: for name in frees: self.emitop('LOAD_CLOSURE', name) - self.emitop_obj('LOAD_CONST', gen) - # self.emitop_obj('LOAD_CONST', gen.getCode()) + self.emitop_code('LOAD_CONST', gen) self.emitop_int('MAKE_CLOSURE', len(node.defaults)) else: - self.emitop_obj('LOAD_CONST', gen) - # self.emitop_obj('LOAD_CONST', gen.getCode()) + self.emitop_code('LOAD_CONST', gen) self.emitop_int('MAKE_FUNCTION', len(node.defaults)) for i in range(ndecorators): @@ -420,8 +421,7 @@ frees = gen.scope.get_free_vars() for name in frees: self.emitop('LOAD_CLOSURE', name) - self.emitop_obj('LOAD_CONST', gen) - # 
self.emitop_obj('LOAD_CONST', gen.getCode()) + self.emitop_code('LOAD_CONST', gen) if frees: self.emitop_int('MAKE_CLOSURE', 0) else: @@ -658,12 +658,10 @@ if frees: for name in frees: self.emitop('LOAD_CLOSURE', name) - self.emitop_obj('LOAD_CONST', gen) - # self.emitop_obj('LOAD_CONST', gen.getCode()) + self.emitop_code('LOAD_CONST', gen) self.emitop_int('MAKE_CLOSURE', 0) else: - self.emitop_obj('LOAD_CONST', gen) - # self.emitop_obj('LOAD_CONST', gen.getCode()) + self.emitop_code('LOAD_CONST', gen) self.emitop_int('MAKE_FUNCTION', 0) # precomputation of outmost iterable Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py Wed Sep 7 16:37:45 2005 @@ -505,7 +505,7 @@ SINGLE_INPUTS = [ one_stmt_funcdefs, - ['\t # hello\n ', + ['\t # hello\n', 'print 6*7', 'if 1: x\n', 'x = 5', @@ -536,6 +536,15 @@ def is_true(self, obj): return obj + def eq_w(self, obj1, obj2): + return obj1 == obj2 + + def is_w(self, obj1, obj2): + return obj1 is obj2 + + def type(self, obj): + return type(obj) + def newtuple(self, lst): return tuple(lst) Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py Wed Sep 7 16:37:45 2005 @@ -1,3 +1,4 @@ +import os from pypy.interpreter.pyparser.pythonparse import PYTHON_PARSER from pypy.interpreter.pyparser.astbuilder import AstBuilder from pypy.interpreter.pycode import PyCode @@ -14,7 +15,7 @@ listmakers, genexps, dictmakers, multiexpr, attraccess, slices, imports,\ asserts, execs, prints, globs, raises_, imports_newstyle, augassigns, \ if_stmts, one_stmt_classdefs, one_stmt_funcdefs, tryexcepts, docstrings, \ - returns + returns, SNIPPETS, SINGLE_INPUTS from test_astbuilder import FakeSpace @@ -67,7 +68,13 @@ def compile_with_astcompiler(expr, target='exec', space=FakeSpace()): ast = ast_parse_expr(expr, target='exec', space=space) misc.set_filename('', ast) - codegen = pycodegen.ModuleCodeGenerator(space, ast) + if target == 'exec': + Generator = pycodegen.ModuleCodeGenerator + elif target == 'single': + Generator = pycodegen.InteractiveCodeGenerator + elif target == 'eval': + Generator = pycodegen.ExpressionCodeGenerator + codegen = Generator(space, ast) rcode = codegen.getCode() return to_code(rcode) @@ -117,10 +124,10 @@ tuple(rcode.co_cellvars) ) return code -def check_compile(expr): +def check_compile(expr, target='exec'): print "Compiling:", expr - sc_code = compile_with_stablecompiler(expr, target='exec') - ac_code = compile_with_astcompiler(expr, target='exec') + sc_code = compile_with_stablecompiler(expr, target=target) + ac_code = compile_with_astcompiler(expr, target=target) compare_code(ac_code, sc_code) ## def check_compile( expr ): @@ -163,3 +170,15 @@ for family in TESTS: for expr in family: yield check_compile, expr + +def test_snippets(): + for snippet_name in SNIPPETS: + filepath = os.path.join(os.path.dirname(__file__), 'samples', snippet_name) + source = file(filepath).read() + yield check_compile, source, 'exec' + + +def test_single_inputs(): + for family in SINGLE_INPUTS: + for expr in family: + yield check_compile, expr, 'single' From pedronis at codespeak.net Wed Sep 7 18:08:31 2005 
From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Wed, 7 Sep 2005 18:08:31 +0200 (CEST) Subject: [pypy-svn] r17333 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050907160831.CA78027B71@code1.codespeak.net> Author: pedronis Date: Wed Sep 7 18:08:26 2005 New Revision: 17333 Modified: pypy/dist/pypy/interpreter/astcompiler/misc.py pypy/dist/pypy/interpreter/astcompiler/pyassem.py pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Log: killed some other SomeObject sources Modified: pypy/dist/pypy/interpreter/astcompiler/misc.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/misc.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/misc.py Wed Sep 7 18:08:26 2005 @@ -11,18 +11,20 @@ return elts class Set: + _annspecialcase_ = "specialize:ctr_location" # polymorphic + def __init__(self): self.elts = {} def __len__(self): return len(self.elts) def __contains__(self, elt): - return self.elts.has_key(elt) + return elt in self.elts def add(self, elt): self.elts[elt] = elt def elements(self): return self.elts.keys() def has_elt(self, elt): - return self.elts.has_key(elt) + return elt in self.elts def remove(self, elt): del self.elts[elt] def copy(self): @@ -31,6 +33,8 @@ return c class Stack: + _annspecialcase_ = "specialize:ctr_location" # polymorphic + def __init__(self): self.stack = [] self.pop = self.stack.pop Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pyassem.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pyassem.py Wed Sep 7 18:08:26 2005 @@ -420,7 +420,7 @@ class PyFlowGraph(FlowGraph): - def __init__(self, space, name, filename, args=(), optimized=0, klass=None): + def __init__(self, space, name, filename, args=(), optimized=0, klass=0): FlowGraph.__init__(self, space) self.name = name self.filename = filename @@ -556,7 +556,7 @@ insts.append(inst) pc = pc + 1 elif inst.op != "SET_LINENO": - if self.hasjrel.has_elt(inst.op): + if inst.op in self.hasjrel: assert isinstance(inst, InstrBlock) # relative jump - no extended arg block = inst.block @@ -564,7 +564,7 @@ forward_refs.append( (block, inst, pc) ) insts.append(inst) pc = pc + 3 - elif self.hasjabs.has_elt(inst.op): + elif inst.op in self.hasjabs: # absolute jump - can be extended if backward assert isinstance(inst, InstrBlock) arg = inst.block @@ -602,19 +602,19 @@ for block, inst, pc in forward_refs: opname = inst.op abspos = begin[block] - if self.hasjrel.has_elt(opname): + if opname in self.hasjrel: offset = abspos - pc - 3 inst.intval = offset else: inst.intval = abspos self.stage = FLAT - hasjrel = misc.Set() + hasjrel = {} for i in dis.hasjrel: - hasjrel.add(dis.opname[i]) - hasjabs = misc.Set() + hasjrel[dis.opname[i]] = True + hasjabs = {} for i in dis.hasjabs: - hasjabs.add(dis.opname[i]) + hasjabs[dis.opname[i]] = True def convertArgs(self): """Convert arguments from symbolic to concrete form""" @@ -699,7 +699,7 @@ def _convert_NAME(self, inst): assert isinstance(inst, InstrName) arg = inst.name - if self.klass is None: + if not self.klass: self._lookupName(arg, self.varnames) index = self._lookupName(arg, self.names) return InstrInt(inst.op, index) Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ 
pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Wed Sep 7 18:08:26 2005 @@ -919,7 +919,7 @@ self.set_lineno(node) self.delName(node.name) else: - print "oops", node.flags + assert False, "visitAssName unexpected flags: %s" % node.flags def visitAssAttr(self, node): node.expr.accept( self ) @@ -928,8 +928,7 @@ elif node.flags == 'OP_DELETE': self.emitop('DELETE_ATTR', self.mangle(node.attrname)) else: - print "warning: unexpected flags:", node.flags - print node + assert False, "visitAssAttr unexpected flags: %s" % node.flags def _visitAssSequence(self, node, op='UNPACK_SEQUENCE'): if findOp(node) != 'OP_DELETE': @@ -1031,7 +1030,7 @@ # slice and subscript stuff - def visitSlice(self, node, aug_flag=None): + def visitSlice(self, node, aug_flag=0): # aug_flag is used by visitAugSlice node.expr.accept( self ) slice = 0 @@ -1058,7 +1057,7 @@ print "weird slice", node.flags raise - def visitSubscript(self, node, aug_flag=None): + def visitSubscript(self, node, aug_flag=0): node.expr.accept( self ) for sub in node.subs: sub.accept( self ) @@ -1386,14 +1385,22 @@ class OpFinder(ast.ASTVisitor): def __init__(self): self.op = None + def visitAssName(self, node): if self.op is None: self.op = node.flags elif self.op != node.flags: raise ValueError, "mixed ops in stmt" - visitAssAttr = visitAssName - visitSubscript = visitAssName - + def visitAssAttr(self, node): + if self.op is None: + self.op = node.flags + elif self.op != node.flags: + raise ValueError, "mixed ops in stmt" + def visitSubscript(self, node): + if self.op is None: + self.op = node.flags + elif self.op != node.flags: + raise ValueError, "mixed ops in stmt" class AugLoadVisitor(ast.ASTVisitor): From pedronis at codespeak.net Wed Sep 7 19:41:14 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Wed, 7 Sep 2005 19:41:14 +0200 (CEST) Subject: [pypy-svn] r17336 - in pypy/dist/pypy/interpreter: astcompiler pyparser Message-ID: <20050907174114.C433C27B71@code1.codespeak.net> Author: pedronis Date: Wed Sep 7 19:41:12 2005 New Revision: 17336 Modified: pypy/dist/pypy/interpreter/astcompiler/future.py pypy/dist/pypy/interpreter/astcompiler/pyassem.py pypy/dist/pypy/interpreter/astcompiler/pycodegen.py pypy/dist/pypy/interpreter/astcompiler/symbols.py pypy/dist/pypy/interpreter/astcompiler/transformer.py pypy/dist/pypy/interpreter/astcompiler/visitor.py pypy/dist/pypy/interpreter/pyparser/error.py Log: has_key -> in solved some other cases of mixed None and ints don't use hasattr on valid_future Modified: pypy/dist/pypy/interpreter/astcompiler/future.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/future.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/future.py Wed Sep 7 19:41:12 2005 @@ -22,25 +22,33 @@ def visitModule(self, node): stmt = node.node + invalid = False for s in stmt.nodes: - if not self.check_stmt(s): - break + if not self.check_stmt(s, invalid): + invalid = True - def check_stmt(self, stmt): - if is_future(stmt): - for name, asname in stmt.names: - if name in self.features: - self.found[name] = 1 - elif name=="*": - raise SyntaxError( - "future statement does not support import *", - ( stmt.filename, stmt.lineno, 0, "" ) ) - else: - raise SyntaxError( - "future feature %s is not defined" % name, - ( stmt.filename, stmt.lineno, 0, "" ) ) - stmt.valid_future = 1 - return 1 + def check_stmt(self, stmt, invalid): + if isinstance(stmt, ast.From): + stmt.valid_future = 0 + if invalid: + return 0 + if is_future(stmt): + assert 
isinstance(stmt, ast.From) + for name, asname in stmt.names: + if name in self.features: + self.found[name] = 1 + elif name=="*": + raise SyntaxError( + "future statement does not support import *", + filename = stmt.filename, + lineno = stmt.lineno) + else: + raise SyntaxError( + "future feature %s is not defined" % name, + filename = stmt.filename, + lineno = stmt.lineno) + stmt.valid_future = 1 + return 1 return 0 def get_features(self): @@ -52,13 +60,22 @@ Those not marked valid are appearing after other statements """ + def visitModule(self, node): + stmt = node.node + for s in stmt.nodes: + if isinstance(s, ast.From): + if s.valid_future: + continue + if s.modname != "__future__": + continue + self.visitFrom(s) + else: + self.default(s) + def visitFrom(self, node): - if hasattr(node, 'valid_future'): - return - if node.modname != "__future__": - return raise SyntaxError( "from __future__ imports must occur at the beginning of the file", - ( node.filename, node.lineno, 0, "" ) ) + filename=node.filename, + lineno=node.lineno) def find_futures(node): p1 = FutureParser() Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pyassem.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pyassem.py Wed Sep 7 19:41:12 2005 @@ -305,7 +305,7 @@ order = [] seen[b] = b for c in b.get_children(): - if seen.has_key(c): + if c in seen: continue order = order + dfs_postorder(c, seen) order.append(b) @@ -512,7 +512,7 @@ sys.stdout = save def _max_depth(self, depth, seen, b, d): - if seen.has_key(b): + if b in seen: return d seen[b] = 1 d = d + depth[b] @@ -641,7 +641,7 @@ for name in self.cellvars: cells[name] = 1 self.cellvars = [name for name in self.varnames - if cells.has_key(name)] + if name in cells] for name in self.cellvars: del cells[name] self.cellvars = self.cellvars + cells.keys() Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Wed Sep 7 19:41:12 2005 @@ -1242,6 +1242,7 @@ name = "" % klass.lambdaCount klass.lambdaCount = klass.lambdaCount + 1 else: + assert isinstance(func, ast.Function) name = func.name args, hasTupleArg = generateArgList(func.argnames) Modified: pypy/dist/pypy/interpreter/astcompiler/symbols.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/symbols.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/symbols.py Wed Sep 7 19:41:12 2005 @@ -26,8 +26,8 @@ self.children = [] # nested is true if the class could contain free variables, # i.e. if it is nested within another function. - self.nested = None - self.generator = None + self.nested = 0 + self.generator = 0 self.klass = None if klass is not None: for i in range(len(klass)): @@ -51,9 +51,9 @@ def add_global(self, name): name = self.mangle(name) - if self.uses.has_key(name) or self.defs.has_key(name): + if name in self.uses or name in self.defs: pass # XXX warn about global following def/use - if self.params.has_key(name): + if name in self.params: raise SyntaxError, "%s in %s is global and parameter" % \ (name, self.name) self.globals[name] = 1 @@ -90,14 +90,14 @@ The scope of a name could be LOCAL, GLOBAL, FREE, or CELL. 
""" - if self.globals.has_key(name): + if name in self.globals: return SC_REALLY_GLOBAL - if self.cells.has_key(name): + if name in self.cells: return SC_CELL - if self.defs.has_key(name): + if name in self.defs: return SC_LOCAL - if self.nested and (self.frees.has_key(name) or - self.uses.has_key(name)): + if self.nested and (name in self.frees or + name in self.uses): return SC_FREE if self.nested: return SC_UNKNOWN @@ -110,8 +110,8 @@ free = {} free.update(self.frees) for name in self.uses.keys(): - if not (self.defs.has_key(name) or - self.globals.has_key(name)): + if not (name in self.defs or + name in self.globals): free[name] = 1 return free.keys() @@ -136,7 +136,7 @@ free. """ self.globals[name] = 1 - if self.frees.has_key(name): + if name in self.frees: del self.frees[name] for child in self.children: if child.check_name(name) == SC_FREE: Modified: pypy/dist/pypy/interpreter/astcompiler/transformer.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/transformer.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/transformer.py Wed Sep 7 19:41:12 2005 @@ -86,7 +86,7 @@ def Node(*args): kind = args[0] - if nodes.has_key(kind): + if kind in nodes: try: return nodes[kind](*args[1:]) except TypeError: Modified: pypy/dist/pypy/interpreter/astcompiler/visitor.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/visitor.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/visitor.py Wed Sep 7 19:41:12 2005 @@ -84,7 +84,7 @@ meth(node, *args) elif self.VERBOSE > 0: klass = node.__class__ - if not self.examples.has_key(klass): + if klass not in self.examples: self.examples[klass] = klass print print self.visitor Modified: pypy/dist/pypy/interpreter/pyparser/error.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/error.py (original) +++ pypy/dist/pypy/interpreter/pyparser/error.py Wed Sep 7 19:41:12 2005 @@ -27,12 +27,12 @@ class SyntaxError(Exception): """Base class for exceptions raised by the parser.""" - def __init__(self, msg, lineno=0, offset=0, text=0): + def __init__(self, msg, lineno=0, offset=0, text="", filename=""): self.msg = msg self.lineno = lineno self.offset = offset self.text = text - self.filename = "" + self.filename = filename self.print_file_and_line = False def wrap_info(self, space, filename): From pedronis at codespeak.net Wed Sep 7 19:59:41 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Wed, 7 Sep 2005 19:59:41 +0200 (CEST) Subject: [pypy-svn] r17339 - in pypy/dist/pypy: interpreter/astcompiler module/__builtin__/test/impsubdir/compiled Message-ID: <20050907175941.275CB27B66@code1.codespeak.net> Author: pedronis Date: Wed Sep 7 19:59:39 2005 New Revision: 17339 Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py pypy/dist/pypy/interpreter/astcompiler/pycodegen.py pypy/dist/pypy/module/__builtin__/test/impsubdir/compiled/x.pyc Log: fix some mixing of () and lists through defaults call directly and isolate visit functions with return values Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pyassem.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pyassem.py Wed Sep 7 19:59:39 2005 @@ -420,8 +420,10 @@ class PyFlowGraph(FlowGraph): - def __init__(self, space, name, filename, 
args=(), optimized=0, klass=0): + def __init__(self, space, name, filename, args=None, optimized=0, klass=0): FlowGraph.__init__(self, space) + if args is None: + args = [] self.name = name self.filename = filename self.docstring = space.w_None Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Wed Sep 7 19:59:39 2005 @@ -130,7 +130,7 @@ class LocalNameFinder(ast.ASTVisitor): """Find local names in scope""" - def __init__(self, names=()): + def __init__(self, names=[]): self.names = misc.Set() self.globals = misc.Set() for name in names: @@ -600,7 +600,7 @@ stack = [] for i, for_ in zip(range(len(node.quals)), node.quals): - start, anchor = for_.accept( self ) + start, anchor = self._visitListCompFor(for_) self.genexpr_cont_stack.append( None ) for if_ in for_.ifs: if self.genexpr_cont_stack[-1] is None: @@ -627,7 +627,8 @@ self.__list_count = self.__list_count - 1 - def visitListCompFor(self, node): + def _visitListCompFor(self, node): + assert isinstance(node, ast.ListCompFor) start = self.newBlock() anchor = self.newBlock() @@ -675,7 +676,7 @@ stack = [] for i, for_ in zip(range(len(node.quals)), node.quals): - start, anchor = for_.accept( self ) + start, anchor = self._visitGenExprFor(for_) self.genexpr_cont_stack.append( None ) for if_ in for_.ifs: if self.genexpr_cont_stack[-1] is None: @@ -698,7 +699,8 @@ self.startBlock(anchor) self.emitop_obj('LOAD_CONST', self.space.w_None) - def visitGenExprFor(self, node): + def _visitGenExprFor(self, node): + assert isinstance(node, ast.GenExprFor) start = self.newBlock() anchor = self.newBlock() Modified: pypy/dist/pypy/module/__builtin__/test/impsubdir/compiled/x.pyc ============================================================================== Binary files. No diff available. From hpk at codespeak.net Thu Sep 8 09:33:56 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 8 Sep 2005 09:33:56 +0200 (CEST) Subject: [pypy-svn] r17344 - pypy/extradoc/sprintinfo Message-ID: <20050908073356.D3F2527B57@code1.codespeak.net> Author: hpk Date: Thu Sep 8 09:33:54 2005 New Revision: 17344 Modified: pypy/extradoc/sprintinfo/Heidelberg-report.txt pypy/extradoc/sprintinfo/hildesheim2-planning.txt Log: small reformattings/extensions Modified: pypy/extradoc/sprintinfo/Heidelberg-report.txt ============================================================================== --- pypy/extradoc/sprintinfo/Heidelberg-report.txt (original) +++ pypy/extradoc/sprintinfo/Heidelberg-report.txt Thu Sep 8 09:33:54 2005 @@ -4,14 +4,15 @@ The heidelberg sprint was announced_ late July and around `13 people registered`_ and showed up -at the nice Heidelberg physics institute which -Carl Friedrich Bolz had organized for us. - -The sprint was focused on getting a `0.7.0 release out`_ -and improve and refine crucial areas like -threading, GC and CPython compliancy. Here -is what people worked on in a somewhat chronological -particular order: +at the nice Heidelberg physics institute where +Carl Friedrich Bolz had organized sprint facilities +for us. + +The sprint was focused on getting a `0.7.0 release out`_ and +improve and refine crucial areas like threading, GC and +CPython compliancy. 
Here is what people worked on in a +somewhat chronological non-particular order and certainly +not complete: - Samuele and Carl worked on refactoring the parts of genc that are responsible for the use of refcounting in the translation to make @@ -104,9 +105,13 @@ Jacob and Stephan on monday to talk about management responsibilities in the near future. There now is the "3rd amendment" to the EU contract scheduled for - 7th September. Holger and Bea listed the steps - required for getting Michael Hudson on the project - through the University of Bristol. + 7th September. + +- Holger and Bea listed the steps required for getting + Michael Hudson on the project through the University + of Bristol. Later on the management team met + and evaluated the results of the EU-workshop organized + by Changemaker in Goteborg. On sunday afternoon (basically the last day where mostly everbody was there) we had a kind of sprint-conclusion Modified: pypy/extradoc/sprintinfo/hildesheim2-planning.txt ============================================================================== --- pypy/extradoc/sprintinfo/hildesheim2-planning.txt (original) +++ pypy/extradoc/sprintinfo/hildesheim2-planning.txt Thu Sep 8 09:33:54 2005 @@ -24,8 +24,6 @@ Christian Tismer whole-time Samuele Pedroni whole-time -Misc --------- Current Main task: RTYPER issues ----------------------------------- From ac at codespeak.net Thu Sep 8 10:06:45 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Thu, 8 Sep 2005 10:06:45 +0200 (CEST) Subject: [pypy-svn] r17345 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050908080645.BDBB527B5D@code1.codespeak.net> Author: ac Date: Thu Sep 8 10:06:45 2005 New Revision: 17345 Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Log: Another SomeObjeect() gone! Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Thu Sep 8 10:06:45 2005 @@ -1056,8 +1056,7 @@ elif node.flags == 'OP_DELETE': self.emit('DELETE_SLICE+%d' % slice) else: - print "weird slice", node.flags - raise + assert False, "weird slice %s" % node.flags def visitSubscript(self, node, aug_flag=0): node.expr.accept( self ) From arigo at codespeak.net Thu Sep 8 10:47:20 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 8 Sep 2005 10:47:20 +0200 (CEST) Subject: [pypy-svn] r17346 - in pypy/dist/pypy/module/__builtin__/test: . impsubdir Message-ID: <20050908084720.EFF7D27B5E@code1.codespeak.net> Author: arigo Date: Thu Sep 8 10:47:20 2005 New Revision: 17346 Removed: pypy/dist/pypy/module/__builtin__/test/impsubdir/ Modified: pypy/dist/pypy/module/__builtin__/test/test_import.py Log: Replaced the large directory substructure 'impsubdir' used in test_import with a couple of pages of py.path code that creates the same structure temporarily. Cleaner, especially because of this .pyc file that was checked in. 
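The replacement pattern, in miniature: build the package tree under a temporary directory at test-setup time instead of keeping it (and its .pyc files) in the repository. A simplified sketch of the idea with illustrative names; the committed helper is the setuppkg function visible in the diff below:

    import tempfile
    import py

    def make_temp_package(pkgname, **modules):
        # build <tmpdir>/<pkgname>/__init__.py plus one module per keyword
        # argument, so no test data (and no stale .pyc) lives in the repository
        root = py.path.local(tempfile.mkdtemp())
        pkg = root.ensure(pkgname, dir=1)
        pkg.join('__init__.py').write('# package\n')
        for name, source in modules.items():
            pkg.join(name + '.py').write(source + '\n')
        return root

    # e.g.  sys.path.insert(0, str(make_temp_package('pkg', a='imamodule = 1')))
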
Modified: pypy/dist/pypy/module/__builtin__/test/test_import.py ============================================================================== --- pypy/dist/pypy/module/__builtin__/test/test_import.py (original) +++ pypy/dist/pypy/module/__builtin__/test/test_import.py Thu Sep 8 10:47:20 2005 @@ -9,13 +9,71 @@ from pypy.module.__builtin__ import importing -def get_import_path(): - j = os.path.join - p = os.path.abspath(j(os.path.dirname(__file__), 'impsubdir')) +def setuppkg(pkgname, **entries): + p = udir.join('impsubdir') + if pkgname: + p = p.join(*pkgname.split('.')) + p.ensure(dir=1) + f = p.join("__init__.py").open('w') + print >> f, "# package" + f.close() + for filename, content in entries.items(): + filename += '.py' + f = p.join(filename).open('w') + print >> f, '#', filename + print >> f, content + f.close() return p +def setup_directory_structure(space): + root = setuppkg("", + a = "imamodule = 1\ninpackage = 0", + b = "imamodule = 1\ninpackage = 0", + ambig = "imamodule = 1", + ) + root.ensure("notapackage", dir=1) # empty, no __init__.py + setuppkg("pkg", + a = "imamodule = 1\ninpackage = 1", + relative_a = "import a", + abs_b = "import b", + abs_x_y = "import x.y", + ) + setuppkg("pkg.pkg1", a='') + setuppkg("pkg.pkg2", a='', b='') + setuppkg("pkg_r", inpkg = "import x.y") + setuppkg("pkg_r.x") + setuppkg("x", y='') + setuppkg("ambig", __init__ = "imapackage = 1") + setuppkg("pkg_relative_a", + __init__ = "import a", + a = "imamodule = 1\ninpackage = 1", + ) + setuppkg("pkg_substituting", + __init__ = "import sys, pkg_substituted\n" + "sys.modules[__name__] = pkg_substituted") + setuppkg("pkg_substituted", mod='') + + # create compiled/x.py and a corresponding pyc file + p = setuppkg("compiled", x = "x = 84") + w = space.wrap + w_modname = w("compiled.x") + filename = str(p.join("x.py")) + fd = os.open(filename, os.O_RDONLY, 0666) + osfile = importing.OsFileWrapper(fd) + try: + importing.load_source_module(space, + w_modname, + w(importing.Module(space, w_modname)), + filename, + osfile) + finally: + osfile.close() + + return str(root) + + def _setup(space): - dn = get_import_path() + dn = setup_directory_structure(space) return space.appexec([space.wrap(dn)], """ (dn): import sys From ac at codespeak.net Thu Sep 8 10:51:32 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Thu, 8 Sep 2005 10:51:32 +0200 (CEST) Subject: [pypy-svn] r17347 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050908085132.F037327B5E@code1.codespeak.net> Author: ac Date: Thu Sep 8 10:51:32 2005 New Revision: 17347 Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py pypy/dist/pypy/interpreter/astcompiler/symbols.py Log: Loose some SomeObjects(). 
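The recurring trick in the diff below: the annotator widens a value that can be either None or an int to SomeObject, so an out-of-band integer sentinel keeps the variable uniformly an integer (similarly, returning an empty list rather than an empty tuple keeps the return type consistent with the other branch). A minimal illustration, not the committed code:

    import sys

    UNKNOWN = sys.maxint      # sentinel value no real stack effect can take

    def stack_effect(table, op):
        # table maps opcode names to ints; mixing None into the values would
        # force the annotator to SomeObject, so an impossible int stands in
        # for "not found".
        delta = table.get(op, UNKNOWN)
        if delta != UNKNOWN:
            return delta
        return 0              # fall back to "no effect" for this sketch
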
Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pyassem.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pyassem.py Thu Sep 8 10:51:32 2005 @@ -964,8 +964,8 @@ opname = i.op if debug: print i, - delta = self.effect.get(opname, None) - if delta is not None: + delta = self.effect.get(opname, sys.maxint) + if delta != sys.maxint: depth = depth + delta else: # now check patterns @@ -975,7 +975,7 @@ depth = depth + delta break # if we still haven't found a match - if delta is None: + if delta == sys.maxint: meth = DEPTH_OP_TRACKER.get( opname, None ) if meth is not None: assert isinstance(i, InstrInt) Modified: pypy/dist/pypy/interpreter/astcompiler/symbols.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/symbols.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/symbols.py Thu Sep 8 10:51:32 2005 @@ -106,7 +106,7 @@ def get_free_vars(self): if not self.nested: - return () + return [] free = {} free.update(self.frees) for name in self.uses.keys(): From arigo at codespeak.net Thu Sep 8 10:55:38 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 8 Sep 2005 10:55:38 +0200 (CEST) Subject: [pypy-svn] r17348 - pypy/dist/pypy/interpreter Message-ID: <20050908085538.0ECEE27B61@code1.codespeak.net> Author: arigo Date: Thu Sep 8 10:55:37 2005 New Revision: 17348 Modified: pypy/dist/pypy/interpreter/eval.py pypy/dist/pypy/interpreter/pyframe.py Log: "self.last_exception=None" does not belong to eval.py not only for aesthetical reasons: eval.py is about the base Frame class, while pyframe.py contains the PyFrame subclass. The last_exception attribute should only exist on the subclass. Don't care about try:finally: here; it's unlikely that this line is not executed, and not really damageable anyway if it really isn't. Modified: pypy/dist/pypy/interpreter/eval.py ============================================================================== --- pypy/dist/pypy/interpreter/eval.py (original) +++ pypy/dist/pypy/interpreter/eval.py Thu Sep 8 10:55:37 2005 @@ -144,10 +144,6 @@ try: result = self.eval(executioncontext) finally: - # on exit, we always release self.last_exception. - # this belongs into pyframe's eval, but would cost an extra - # try..except clause there which we can save. - self.last_exception = None executioncontext.leave(self) return result Modified: pypy/dist/pypy/interpreter/pyframe.py ============================================================================== --- pypy/dist/pypy/interpreter/pyframe.py (original) +++ pypy/dist/pypy/interpreter/pyframe.py Thu Sep 8 10:55:37 2005 @@ -125,6 +125,9 @@ # leave that frame w_exitvalue = e.w_exitvalue executioncontext.return_trace(self, w_exitvalue) + # on exit, we try to release self.last_exception -- breaks an + # obvious reference cycle, so it helps refcounting implementations + self.last_exception = None return w_exitvalue ### line numbers ### From arigo at codespeak.net Thu Sep 8 10:58:34 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 8 Sep 2005 10:58:34 +0200 (CEST) Subject: [pypy-svn] r17349 - pypy/dist/pypy/objspace/std Message-ID: <20050908085834.E1DEE27B61@code1.codespeak.net> Author: arigo Date: Thu Sep 8 10:58:34 2005 New Revision: 17349 Modified: pypy/dist/pypy/objspace/std/model.py Log: Replying to comment... 
Modified: pypy/dist/pypy/objspace/std/model.py ============================================================================== --- pypy/dist/pypy/objspace/std/model.py (original) +++ pypy/dist/pypy/objspace/std/model.py Thu Sep 8 10:58:34 2005 @@ -121,8 +121,9 @@ def __init__(w_self, space): w_self.space = space # XXX not sure this is ever used any more - # YYY I think we need it for calling hash() from an ll dicts impl. - # without explicitly passing the space. + # Note that it is wrong to depend on a .space attribute for a random + # wrapped object anyway, because not all wrapped objects inherit from + # W_Object. (They inherit from W_Root.) def __repr__(self): s = '%s(%s)' % ( From ac at codespeak.net Thu Sep 8 11:01:44 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Thu, 8 Sep 2005 11:01:44 +0200 (CEST) Subject: [pypy-svn] r17350 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050908090144.4EB9A27B61@code1.codespeak.net> Author: ac Date: Thu Sep 8 11:01:44 2005 New Revision: 17350 Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Log: Remove some unused code. Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Thu Sep 8 11:01:44 2005 @@ -128,49 +128,6 @@ mtime = struct.pack(' Author: ale Date: Thu Sep 8 11:17:47 2005 New Revision: 17351 Modified: pypy/dist/pypy/translator/goal/targetpypystandalone.py pypy/dist/pypy/translator/goal/translate_pypy_new.py Log: Corrected some misunderstandings. It seems to work now (I havent tried all combinations of options). It still needs more work though ( a way to tell which options depend on/exclude other options, propagation of options to backend compiler, ...) Modified: pypy/dist/pypy/translator/goal/targetpypystandalone.py ============================================================================== --- pypy/dist/pypy/translator/goal/targetpypystandalone.py (original) +++ pypy/dist/pypy/translator/goal/targetpypystandalone.py Thu Sep 8 11:17:47 2005 @@ -57,7 +57,7 @@ # disable translation of the whole of classobjinterp.py StdObjSpace.setup_old_style_classes = lambda self: None - if __main__.options.get('-boehm'): + if __main__.options1.gc == 'boehm': print "disabling thread with boehm for stabilitiy (combination not tested)" usemodules = [] else: Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Thu Sep 8 11:17:47 2005 @@ -4,52 +4,43 @@ """ Command-line options for translate_pypy: - port Listen on the given port number for connexions - (see pypy/translator/tool/pygame/graphclient.py) - targetspec - targetspec.py is a python file defining - what is the translation target and setting things up for it, - it should have a target function returning an entry_point ...; - defaults to targetpypystandalone. The .py suffix is optional. 
- - -no-a Don't infer annotations, just translate everything - -no-t Don't type-specialize the graph operations with the C typer - -t-insist Specialize should not stop at the first error - -no-o Don't do backend-oriented optimizations - -fork (UNIX) Create a restartable checkpoint after annotation - -fork2 (UNIX) Create a restartable checkpoint after specializing - -t-lowmem try to save as much memory as possible, since many computers - tend to have less than a gigabyte of memory (512 MB is typical). - Currently, we avoid to use geninterplevel, which creates a lot - of extra blocks, but gains only some 10-20 % of speed, because - we are still lacking annotation of applevel code. - -d Enable recording of annotator debugging information - - -b backend is one 'ccompiler, llvmcompiler' - -gc can be 'boehm' or 'no-gc' or 'ref' (default) - -no-c Don't generate the code - #-llvm Use LLVM instead of C - -c Generate the C code, but don't compile it - #-boehm Use the Boehm collector when generating C code - #-no-gc Experimental: use no GC and no refcounting at all - -o Generate and compile the C code, but don't run it - -tcc Equivalent to the envvar PYPY_CC='tcc -shared -o "%s.so" "%s.c"' - -- http://fabrice.bellard.free.fr/tcc/ - -llinterpret - interprets the flow graph after rtyping it - - -text Don't start the Pygame viewer - -huge=% Threshold in the number of functions after which only a local call - graph and not a full one is displayed - -use-snapshot - Redirect imports to the translation snapshot - -save filename - saves the translator to a file. The file type can either - be .py or .zip (recommended). - -load filename - restores the translator from a file. The file type must - be either .py or .zip . - -batch don't use interactive helpers, like pdb + Option groups: + Annotation: + -m --lowmem Try to save memory + -n --no_annotations Don't infer annotations + -d --debug record debug information + -i --insist Dont't stop on first error + + Specialization: + -t --specialize Don't specialize + + Backend optimisation + -o --optimize Don't optimize (should have + different name) + + Process options: + -f fork1[fork2] --fork fork1[fork2] (UNIX) Create restartable + checkpoint after annotation + [,specialization] + -l file --load file load translator from file + -s file --save file save translator to file + + Codegeneration options: + -g gc --gc gc Garbage collector + -b be --backend be Backend selector + -c --gencode Don't generate code + + Compilation options: + + Run options: + -r --no_run Don't run the compiled code + -x --batch Dont run interactive helpers + Pygame options: + -p --pygame Dont run pygame + -H number --huge number Threshold in the number of + functions after which only a + local call graph and not a full + one is displayed """ import autopath, sys, os @@ -149,11 +140,11 @@ from pypy.translator.goal import unixcheckpoint assert_rpython_mostly_not_imported() unixcheckpoint.restartable_point(auto='run') - if a and not options1.specialize: + if a and options1.specialize: print 'Specializing...' t.specialize(dont_simplify_again=True, crash_on_first_typeerror=not options1.insist) - if not options1.optimize and not options1.backend =='-llvm': + if options1.optimize and options1.backend != 'llvm': print 'Back-end optimizations...' 
t.backend_optimizations() if a and 'fork2' in options1.fork: @@ -348,7 +339,7 @@ parser.add_option("-x", "--batch", dest="batch", default=False, action="store_true",help="Don't use interactive helpers, like pdb") (options1, args) = parser.parse_args() - print options1,args + argiter = iter(args) #sys.argv[1:]) for arg in argiter: try: @@ -405,10 +396,10 @@ err = sys.exc_info() print '-'*60 if options1.savefile: - print 'saving state to %s' % save_file + print 'saving state to %s' % options1.savefile if err: print '*** this save is done after errors occured ***' - save(t, savefile, + save(t, options1.savefile, trans=t, inputtypes=inputtypes, targetspec=targetspec, @@ -444,7 +435,7 @@ keywords = {'really_compile' : options1.really_compile, 'standalone' : standalone, 'gcpolicy' : gcpolicy} - c_entry_point = t.compile(options1.backend,**keywords) + c_entry_point = t.compile(options1.backend, **keywords) if standalone and options1.backend == 'c': # xxx fragile and messy import shutil From ac at codespeak.net Thu Sep 8 11:20:02 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Thu, 8 Sep 2005 11:20:02 +0200 (CEST) Subject: [pypy-svn] r17352 - in pypy/dist/pypy/interpreter: astcompiler pyparser Message-ID: <20050908092002.29B7E27B5D@code1.codespeak.net> Author: ac Date: Thu Sep 8 11:20:01 2005 New Revision: 17352 Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py pypy/dist/pypy/interpreter/pyparser/astbuilder.py Log: Fix some more SomeObject(). Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pyassem.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pyassem.py Thu Sep 8 11:20:01 2005 @@ -935,6 +935,8 @@ return -1 elif argc == 3: return -2 + assert False, 'Unexpected argument %s to depth_BUILD_SLICE' % argc + def depth_DUP_TOPX(argc): return argc Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Thu Sep 8 11:20:01 2005 @@ -285,6 +285,7 @@ expr = first_child.expr if builder.is_string_const(expr): # This *is* a docstring, remove it from stmt list + assert isinstance(expr, ast.Const) del stmt.nodes[0] doc = expr.value return doc @@ -1569,8 +1570,7 @@ value = value[:-1] return string_to_w_long( space, value, base=base ) try: - value = string_to_int(value, base=base) - return space.wrap(value) + return space.wrap(string_to_int(value, base=base)) except ParseStringError: return space.wrap(interp_string_to_float(space,value)) From adim at codespeak.net Thu Sep 8 11:28:05 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Thu, 8 Sep 2005 11:28:05 +0200 (CEST) Subject: [pypy-svn] r17353 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050908092805.F0F5727B5D@code1.codespeak.net> Author: adim Date: Thu Sep 8 11:28:04 2005 New Revision: 17353 Modified: pypy/dist/pypy/interpreter/astcompiler/misc.py pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Log: get rid of annotation warning when using stack.__getitem__ (we could also probably simply remove the Stack class) Modified: pypy/dist/pypy/interpreter/astcompiler/misc.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/misc.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/misc.py Thu Sep 8 11:28:04 
2005 @@ -46,6 +46,8 @@ return self.stack[-1] def __getitem__(self, index): # needed by visitContinue() return self.stack[index] + def elementAtIndex(self, index): + return self.stack[index] MANGLE_LEN = 256 # magic constant from compile.c Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Thu Sep 8 11:28:04 2005 @@ -481,7 +481,7 @@ loop_block = None while top > 0: top = top - 1 - kind, loop_block = self.setups[top] + kind, loop_block = self.setups.elementAtIndex(top) if kind == LOOP: break if kind != LOOP: From adim at codespeak.net Thu Sep 8 12:57:42 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Thu, 8 Sep 2005 12:57:42 +0200 (CEST) Subject: [pypy-svn] r17355 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050908105742.12AC327B5E@code1.codespeak.net> Author: adim Date: Thu Sep 8 12:57:40 2005 New Revision: 17355 Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py pypy/dist/pypy/interpreter/astcompiler/ast.txt pypy/dist/pypy/interpreter/astcompiler/astgen.py Log: added ability to add specific methods to ast.py's classes Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.py Thu Sep 8 12:57:40 2005 @@ -184,6 +184,19 @@ nodelist.extend(self.nodes) return nodelist + def getArgNames(self): + argnames = [] + for node in self.nodes: + if isinstance(node, AssTuple): + argnames.extend(node.getArgNames()) + elif isinstance(node, AssName): + name = node.name + assert isinstance(name, str) + argnames.append(name) + else: + assert False, "should only have AssName and AssTuple as children" + return argnames + def __repr__(self): return "AssTuple(%s)" % (repr(self.nodes),) Modified: pypy/dist/pypy/interpreter/astcompiler/ast.txt ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.txt (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.txt Thu Sep 8 12:57:40 2005 @@ -127,3 +127,16 @@ for test, suite in self.tests: nodelist.append(test) nodelist.append(suite) + +AssTuple.getArgNames(self): + argnames = [] + for node in self.nodes: + if isinstance(node, AssTuple): + argnames.extend(node.getArgNames()) + elif isinstance(node, AssName): + name = node.name + assert isinstance(name, str) + argnames.append(name) + else: + assert False, "should only have AssName and AssTuple as children" + return argnames Modified: pypy/dist/pypy/interpreter/astcompiler/astgen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/astgen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/astgen.py Thu Sep 8 12:57:40 2005 @@ -51,6 +51,7 @@ self.nargs = len(self.argnames) self.init = [] self.flatten_nodes = {} + self.additional_methods = {} self.parent = parent def setup_parent(self, classes): @@ -110,6 +111,7 @@ print >> buf self._gen_getChildNodes(buf) print >> buf + self._gen_additional_methods(buf) self._gen_repr(buf) print >> buf self._gen_visit(buf) @@ -220,12 +222,19 @@ print >> buf, " def accept(self, visitor):" print >> buf, " return visitor.visit%s(self)" % self.name + def _gen_additional_methods(self, buf): + for key, value in 
self.additional_methods.iteritems(): + if key not in '_cur_': + print >> buf, ''.join(value) + # print >> buf, '\n\n' + def gen_base_visit(self, buf): print >> buf, " def visit%s(self, node):" % self.name print >> buf, " return self.default( node )" rx_init = re.compile('init\((.*)\):') rx_flatten_nodes = re.compile('flatten_nodes\((.*)\.(.*)\):') +rx_additional_methods = re.compile('(.*)\.(.*)\((.*?)\):') def parse_spec(file): classes = {} @@ -242,6 +251,10 @@ mo = rx_flatten_nodes.search(line) if mo: kind = 'flatten_nodes' + else: + mo = rx_additional_methods.search(line) + if mo: + kind = 'additional_method' if mo is None: if cur is None: if comment: @@ -263,6 +276,8 @@ cur.init.append(line) elif kind == 'flatten_nodes': cur.flatten_nodes['_cur_'].append(line) + elif kind == 'additional_method': + cur.additional_methods['_cur_'].append(' '*4 + line) elif kind == 'init': # some extra code for a Node's __init__ method name = mo.group(1) @@ -273,6 +288,13 @@ attr = mo.group(2) cur = classes[name] cur.flatten_nodes[attr] = cur.flatten_nodes['_cur_'] = [] + elif kind == 'additional_method': + name = mo.group(1) + methname = mo.group(2) + params = mo.group(3) + cur = classes[name] + cur.additional_methods['_cur_'] = [' def %s(%s):\n' % (methname, params)] + cur.additional_methods[methname] = cur.additional_methods['_cur_'] for node in classes.values(): node.setup_parent(classes) From adim at codespeak.net Thu Sep 8 12:58:07 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Thu, 8 Sep 2005 12:58:07 +0200 (CEST) Subject: [pypy-svn] r17356 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050908105807.D3CEB27B6C@code1.codespeak.net> Author: adim Date: Thu Sep 8 12:58:06 2005 New Revision: 17356 Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Log: use getArgNames and getChildNodes rather than flatten methods Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pyassem.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pyassem.py Thu Sep 8 12:58:06 2005 @@ -834,7 +834,8 @@ if args: for arg in args: if isinstance(arg, TupleArg): - numNames = len(misc.flatten(arg.names)) + numNames = len(arg.names.getArgNames()) + # numNames = len(misc.flatten(arg.names)) argcount = argcount - numNames return argcount Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Thu Sep 8 12:58:06 2005 @@ -1315,7 +1315,7 @@ args.append(elt) elif isinstance(elt, ast.AssTuple): args.append(TupleArg(i * 2, elt)) - extra.extend(ast.flatten(elt)) + extra.extend(elt.getChildNodes()) count = count + 1 else: raise ValueError( "unexpect argument type: %s" % elt ) From arigo at codespeak.net Thu Sep 8 13:50:17 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 8 Sep 2005 13:50:17 +0200 (CEST) Subject: [pypy-svn] r17358 - pypy/dist/pypy/translator/c Message-ID: <20050908115017.AF7F427B61@code1.codespeak.net> Author: arigo Date: Thu Sep 8 13:50:16 2005 New Revision: 17358 Modified: pypy/dist/pypy/translator/c/gc.py Log: Shut off a warning caused by a change of type (long->char) that we forgot to change here too. 
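The declaration in gc.py still said 'long @' while the opaque node it describes had moved from long to char; the fix reuses the node's own typename so the two declarations cannot drift apart again. The idiom, reduced to a toy example with illustrative names:

    class OpaqueNode:
        # single place that knows the C-level type of the container
        typename = 'char @'

    def rtti_type():
        # refer to the node's declaration instead of repeating the literal,
        # so a later change of the underlying type cannot be forgotten here
        return OpaqueNode.typename
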
Modified: pypy/dist/pypy/translator/c/gc.py ============================================================================== --- pypy/dist/pypy/translator/c/gc.py (original) +++ pypy/dist/pypy/translator/c/gc.py Thu Sep 8 13:50:16 2005 @@ -327,7 +327,7 @@ # for rtti node def rtti_type(self): - return 'long @' + return BoehmGcRuntimeTypeInfo_OpaqueNode.typename def rtti_node_factory(self): return BoehmGcRuntimeTypeInfo_OpaqueNode From arigo at codespeak.net Thu Sep 8 14:58:54 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 8 Sep 2005 14:58:54 +0200 (CEST) Subject: [pypy-svn] r17364 - pypy/dist/pypy Message-ID: <20050908125854.17B4427B61@code1.codespeak.net> Author: arigo Date: Thu Sep 8 14:58:53 2005 New Revision: 17364 Modified: pypy/dist/pypy/conftest.py Log: Don't overwrite a global 'raises' from the conftest, if the module already has such a name. Modified: pypy/dist/pypy/conftest.py ============================================================================== --- pypy/dist/pypy/conftest.py (original) +++ pypy/dist/pypy/conftest.py Thu Sep 8 14:58:53 2005 @@ -98,8 +98,9 @@ return name.startswith('Test') or name.startswith('AppTest') def setup(self): - # stick py.test raise in module globals - self.obj.raises = py.test.raises + # stick py.test raise in module globals -- carefully + if not hasattr(self.obj, 'raises'): + self.obj.raises = py.test.raises super(Module, self).setup() # if hasattr(mod, 'objspacename'): # mod.space = getttestobjspace(mod.objspacename) From ac at codespeak.net Thu Sep 8 14:59:27 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Thu, 8 Sep 2005 14:59:27 +0200 (CEST) Subject: [pypy-svn] r17365 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050908125927.2582027B61@code1.codespeak.net> Author: ac Date: Thu Sep 8 14:59:26 2005 New Revision: 17365 Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py Log: Do not use unsupported methods on lists. Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pyassem.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pyassem.py Thu Sep 8 14:59:26 2005 @@ -207,13 +207,15 @@ block isn't next to the right block for implicit control transfers. """ - index = {} - for i in range(len(blocks)): - index[blocks[i]] = i - - for i in range(0, len(blocks) - 1): + new_blocks = blocks + blocks = blocks[:] + del new_blocks[:] + i = 0 + while i < len(blocks) - 1: b = blocks[i] n = blocks[i + 1] + i += 1 + new_blocks.append(b) if not b.next or b.next[0] == default_next or b.next[0] == n: continue # The blocks are in the wrong order. Find the chain of @@ -226,20 +228,17 @@ elt = elt.next[0] # Now remove the blocks in the chain from the current # block list, so that they can be re-inserted. 
- l = [] for b in chain: - assert index[b] > i - l.append((index[b], b)) - l.sort() - l.reverse() - for j, b in l: - del blocks[index[b]] - # Insert the chain in the proper location - blocks[i:i + 1] = [cur] + chain - # Finally, re-compute the block indexes - for i in range(len(blocks)): - index[blocks[i]] = i - + for j in range(i + 1, len(blocks)): + if blocks[i] == b: + del blocks[i] + else: + assert False, "Can't find block" + + new_blocks.extend(chain) + if i == len(blocks) - 1: + new_blocks.append(blocks[i]) + def fixupOrderForward(self, blocks, default_next): """Make sure all JUMP_FORWARDs jump forward""" index = {} @@ -279,7 +278,7 @@ goes_before, a_chain = constraints[0] assert a_chain > goes_before c = chains[a_chain] - chains.remove(c) + del chains[a_chain] chains.insert(goes_before, c) del blocks[:] From pedronis at codespeak.net Thu Sep 8 15:35:09 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Thu, 8 Sep 2005 15:35:09 +0200 (CEST) Subject: [pypy-svn] r17366 - in pypy/dist/pypy/translator: c goal Message-ID: <20050908133509.0E19627B6C@code1.codespeak.net> Author: pedronis Date: Thu Sep 8 15:35:02 2005 New Revision: 17366 Modified: pypy/dist/pypy/translator/c/gc.py pypy/dist/pypy/translator/goal/targetpypystandalone.py Log: - defines for Boehm on linux with threads - let translate standalone versions with boehm and threads Modified: pypy/dist/pypy/translator/c/gc.py ============================================================================== --- pypy/dist/pypy/translator/c/gc.py (original) +++ pypy/dist/pypy/translator/c/gc.py Thu Sep 8 15:35:02 2005 @@ -1,3 +1,4 @@ +import sys from pypy.translator.c.support import cdecl from pypy.translator.c.node import ContainerNode from pypy.rpython.lltype import typeOf, Ptr, PyObject, ContainerType @@ -353,6 +354,9 @@ return ['gc'] # xxx on windows? 
def pre_pre_gc_code(self): + if sys.platform == "linux2": + yield "#define _REENTRANT 1" + yield "#define GC_LINUX_THREADS 1" yield '#include ' yield '#define USING_BOEHM_GC' Modified: pypy/dist/pypy/translator/goal/targetpypystandalone.py ============================================================================== --- pypy/dist/pypy/translator/goal/targetpypystandalone.py (original) +++ pypy/dist/pypy/translator/goal/targetpypystandalone.py Thu Sep 8 15:35:02 2005 @@ -58,8 +58,9 @@ # disable translation of the whole of classobjinterp.py StdObjSpace.setup_old_style_classes = lambda self: None if __main__.options1.gc == 'boehm': - print "disabling thread with boehm for stabilitiy (combination not tested)" - usemodules = [] + #print "disabling thread with boehm for stabilitiy (combination not tested)" + print "trying threads and boehm" + usemodules = ['thread'] else: usemodules = ['thread'] space = StdObjSpace(nofaking=True, From ale at codespeak.net Thu Sep 8 15:43:03 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Thu, 8 Sep 2005 15:43:03 +0200 (CEST) Subject: [pypy-svn] r17367 - pypy/dist/pypy/translator/goal Message-ID: <20050908134303.AE6B827B61@code1.codespeak.net> Author: ale Date: Thu Sep 8 15:43:02 2005 New Revision: 17367 Modified: pypy/dist/pypy/translator/goal/targetpypystandalone.py pypy/dist/pypy/translator/goal/translate_pypy_new.py Log: redone targetpypystandalone fixed and error in translatepypy_new Modified: pypy/dist/pypy/translator/goal/targetpypystandalone.py ============================================================================== --- pypy/dist/pypy/translator/goal/targetpypystandalone.py (original) +++ pypy/dist/pypy/translator/goal/targetpypystandalone.py Thu Sep 8 15:43:02 2005 @@ -57,7 +57,7 @@ # disable translation of the whole of classobjinterp.py StdObjSpace.setup_old_style_classes = lambda self: None - if __main__.options1.gc == 'boehm': + if __main__.options.get('-boehm'): #print "disabling thread with boehm for stabilitiy (combination not tested)" print "trying threads and boehm" usemodules = ['thread'] Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Thu Sep 8 15:43:02 2005 @@ -445,7 +445,7 @@ c_entry_point = newexename update_usession_dir() print 'Written %s.' % (c_entry_point,) - if not options.run: + if options1.run: print 'Running!' if standalone: os.system(c_entry_point) From ac at codespeak.net Thu Sep 8 15:43:06 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Thu, 8 Sep 2005 15:43:06 +0200 (CEST) Subject: [pypy-svn] r17368 - in pypy/dist/pypy/interpreter: . astcompiler pyparser pyparser/test Message-ID: <20050908134306.4CE8127B6C@code1.codespeak.net> Author: ac Date: Thu Sep 8 15:43:04 2005 New Revision: 17368 Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py pypy/dist/pypy/interpreter/pycode.py pypy/dist/pypy/interpreter/pyparser/grammar.py pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py Log: Fix code-object generation. 
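One small change riding along below, in grammar.py, replaces the `rules or {}` default-argument shortcut with an explicit None test; unlike the shortcut, it never swaps a caller's empty dict for a fresh object, and every path binds a plain dict. The pattern on its own, with an illustrative class name:

    class Builder:
        def __init__(self, rules=None):
            # explicit default instead of "self.rules = rules or {}":
            # a caller-supplied empty dict is kept as-is, and self.rules
            # is always bound to a dict object, never to None
            if rules is not None:
                self.rules = rules
            else:
                self.rules = {}
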
Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pyassem.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pyassem.py Thu Sep 8 15:43:04 2005 @@ -797,13 +797,14 @@ return PyCode(self.space)._code_new_w( argcount, nlocals, self.stacksize, self.flags, self.lnotab.getCode(), - tuple(self.getConsts()), - tuple(self.names), - tuple(self.varnames), - self.filename, self.name, self.lnotab.firstline, + self.getConsts(), + self.names, + self.varnames, + self.filename, self.name, + self.lnotab.firstline, self.lnotab.getTable(), - tuple(self.freevars), - tuple(self.cellvars) + self.freevars, + self.cellvars ) def getConsts(self): Modified: pypy/dist/pypy/interpreter/pycode.py ============================================================================== --- pypy/dist/pypy/interpreter/pycode.py (original) +++ pypy/dist/pypy/interpreter/pycode.py Thu Sep 8 15:43:04 2005 @@ -186,35 +186,22 @@ # with a lot of boring asserts to enforce type knowledge # XXX get rid of that ASAP with a real compiler! import types - x = argcount; assert isinstance(x, int) - self.co_argcount = x - x = nlocals; assert isinstance(x, int) - self.co_nlocals = x - x = stacksize; assert isinstance(x, int) - self.co_stacksize = x - x = flags; assert isinstance(x, int) - self.co_flags = x - x = code; assert isinstance(x, str) - self.co_code = x + self.co_argcount = argcount + self.co_nlocals = nlocals + self.co_stacksize = stacksize + self.co_flags = flags + self.co_code = code ## for w in consts: ## assert isinstance(w,W_Root) self.co_consts_w = consts - x = names; assert isinstance(x, tuple) - self.co_names = [ str(n) for n in x ] - x = varnames; assert isinstance(x, tuple) - self.co_varnames = [ str(n) for n in x ] - x = freevars; assert isinstance(x, tuple) - self.co_freevars = [ str(n) for n in x ] - x = cellvars; assert isinstance(x, tuple) - self.co_cellvars = [ str(n) for n in x ] - x = filename; assert isinstance(x, str) - self.co_filename = x - x = name; assert isinstance(x, str) - self.co_name = x - x = firstlineno; assert isinstance(x, int) - self.co_firstlineno = x - x = lnotab; assert isinstance(x, str) - self.co_lnotab = x + self.co_names = names + self.co_varnames = varnames + self.co_freevars = freevars + self.co_cellvars = cellvars + self.co_filename = filename + self.co_name = name + self.co_firstlineno = firstlineno + self.co_lnotab = lnotab # recursively _from_code()-ify the code objects in code.co_consts space = self.space return self Modified: pypy/dist/pypy/interpreter/pyparser/grammar.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/grammar.py (original) +++ pypy/dist/pypy/interpreter/pyparser/grammar.py Thu Sep 8 15:43:04 2005 @@ -93,7 +93,10 @@ """Abstract base class for builder objects""" def __init__(self, rules=None, debug=0, symbols={} ): # a dictionary of grammar rules for debug/reference - self.rules = rules or {} + if rules is not None: + self.rules = rules + else: + self.rules = {} # This attribute is here for convenience self.debug = debug self.symbols = symbols # mapping from codename to symbols Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py Thu Sep 
8 15:43:04 2005 @@ -113,7 +113,7 @@ rcode.co_stacksize, rcode.co_flags, rcode.co_code, - rcode.co_consts_w, + tuple(rcode.co_consts_w), tuple(rcode.co_names), tuple(rcode.co_varnames), rcode.co_filename, From ac at codespeak.net Thu Sep 8 16:26:20 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Thu, 8 Sep 2005 16:26:20 +0200 (CEST) Subject: [pypy-svn] r17369 - in pypy/dist/pypy: annotation translator/test Message-ID: <20050908142620.DEDF527B61@code1.codespeak.net> Author: ac Date: Thu Sep 8 16:26:20 2005 New Revision: 17369 Modified: pypy/dist/pypy/annotation/binaryop.py pypy/dist/pypy/translator/test/test_annrpython.py Log: Fix bug when mixing constant list/dict with None. Modified: pypy/dist/pypy/annotation/binaryop.py ============================================================================== --- pypy/dist/pypy/annotation/binaryop.py (original) +++ pypy/dist/pypy/annotation/binaryop.py Thu Sep 8 16:26:20 2005 @@ -577,7 +577,7 @@ class __extend__(pairtype(SomeList, SomePBC)): def union((lst, pbc)): if pbc.isNone(): - return lst + return SomeList(lst.listdef) return SomeObject() class __extend__(pairtype(SomePBC, SomeList )): @@ -588,7 +588,7 @@ class __extend__(pairtype(SomeDict, SomePBC)): def union((dct, pbc)): if pbc.isNone(): - return dct + return SomeDict(dct.dictdef) return SomeObject() class __extend__(pairtype(SomePBC, SomeDict )): Modified: pypy/dist/pypy/translator/test/test_annrpython.py ============================================================================== --- pypy/dist/pypy/translator/test/test_annrpython.py (original) +++ pypy/dist/pypy/translator/test/test_annrpython.py Thu Sep 8 16:26:20 2005 @@ -1533,6 +1533,30 @@ a = self.RPythonAnnotator() s = a.build_types(f, [int]) assert s.knowntype == dict + + def test_const_list_and_none(self): + def g(l=None): + return l is None + L = [1,2] + def f(): + g() + return g(L) + a = self.RPythonAnnotator() + s = a.build_types(f, []) + assert s.knowntype == bool + assert not s.is_constant() + + def test_const_dict_and_none(self): + def g(d=None): + return d is None + D = {1:2} + def f(): + g(D) + return g() + a = self.RPythonAnnotator() + s = a.build_types(f, []) + assert s.knowntype == bool + assert not s.is_constant() def g(n): From arigo at codespeak.net Thu Sep 8 16:27:08 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 8 Sep 2005 16:27:08 +0200 (CEST) Subject: [pypy-svn] r17370 - in pypy/dist/pypy: lib module/__builtin__ module/__builtin__/test Message-ID: <20050908142708.3AB2227B61@code1.codespeak.net> Author: arigo Date: Thu Sep 8 16:27:07 2005 New Revision: 17370 Modified: pypy/dist/pypy/lib/_osfilewrapper.py pypy/dist/pypy/module/__builtin__/importing.py pypy/dist/pypy/module/__builtin__/test/test_import.py Log: Don't crash trying to write .pyc files, just ignore the error and try hard not to leave a broken .pyc file behind. Test. 
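The scheme used below for writing the .pyc: first a zeroed-out header, then the marshalled body, and only once everything else is on disk are the real magic number and mtime patched in at offset 0, so an interrupted write leaves a file that no importer will mistake for a valid .pyc. In outline (simplified; the committed code goes through OsFileWrapper and loops on short writes):

    import os, struct

    def write_pyc_carefully(path, magic, mtime, body):
        fd = os.open(path, os.O_CREAT | os.O_WRONLY | os.O_TRUNC, 0666)
        try:
            os.write(fd, struct.pack('<ii', 0, 0))   # placeholder header
            os.write(fd, body)                       # a crash here leaves a
                                                     # zero magic: ignored
            os.lseek(fd, 0, 0)
            os.write(fd, struct.pack('<ii', magic, mtime))
        finally:
            os.close(fd)
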
Modified: pypy/dist/pypy/lib/_osfilewrapper.py ============================================================================== --- pypy/dist/pypy/lib/_osfilewrapper.py (original) +++ pypy/dist/pypy/lib/_osfilewrapper.py Thu Sep 8 16:27:07 2005 @@ -27,6 +27,9 @@ # os.write will raise an error itself writecount += os.write(self.fd, buf[writecount:]) + def seek(self, position): + os.lseek(self.fd, position, 0) + def close(self): os.close(self.fd) Modified: pypy/dist/pypy/module/__builtin__/importing.py ============================================================================== --- pypy/dist/pypy/module/__builtin__/importing.py (original) +++ pypy/dist/pypy/module/__builtin__/importing.py Thu Sep 8 16:27:07 2005 @@ -488,13 +488,35 @@ #print "indeed writing", cpathname try: w_str = space.call_method(w_marshal, 'dumps', space.wrap(co)) + strbuf = space.str_w(w_str) except OperationError: #print "Problem while marshalling %s, skipping" % cpathname return - fd = os.open(cpathname, BIN_WRITEMASK, 0666) - osfile = OsFileWrapper(fd) - _w_long(osfile, pyc_magic) - _w_long(osfile, mtime) - strbuf = space.str_w(w_str) - osfile.write(strbuf) - os.close(fd) + # + # Careful here: we must not crash nor leave behind something that looks + # too much like a valid pyc file but really isn't one. + # + try: + fd = os.open(cpathname, BIN_WRITEMASK, 0666) + except OSError: + return # cannot create file + try: + osfile = OsFileWrapper(fd) + try: + # will patch the header later; write zeroes until we are sure that + # the rest of the file is valid + _w_long(osfile, 0) # pyc_magic + _w_long(osfile, 0) # mtime + osfile.write(strbuf) + + # should be ok (XXX or should call os.fsync() to be sure?) + osfile.seek(0) + _w_long(osfile, pyc_magic) + _w_long(osfile, mtime) + finally: + osfile.close() + except OSError: + try: + os.unlink(pathname) + except OSError: + pass Modified: pypy/dist/pypy/module/__builtin__/test/test_import.py ============================================================================== --- pypy/dist/pypy/module/__builtin__/test/test_import.py (original) +++ pypy/dist/pypy/module/__builtin__/test/test_import.py Thu Sep 8 16:27:07 2005 @@ -52,6 +52,7 @@ __init__ = "import sys, pkg_substituted\n" "sys.modules[__name__] = pkg_substituted") setuppkg("pkg_substituted", mod='') + p = setuppkg("readonly", x='') # create compiled/x.py and a corresponding pyc file p = setuppkg("compiled", x = "x = 84") @@ -229,6 +230,18 @@ import compiled.x assert compiled.x == sys.modules.get('compiled.x') + def test_cannot_write_pyc(self): + import sys, os + p = os.path.join(sys.path[-1], 'readonly') + try: + os.chmod(p, 0555) + except: + skip("cannot chmod() the test directory to read-only") + try: + import readonly.x # cannot write x.pyc, but should not crash + finally: + os.chmod(p, 0775) + def _getlong(data): x = marshal.dumps(data) return x[-4:] From arigo at codespeak.net Thu Sep 8 16:33:54 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 8 Sep 2005 16:33:54 +0200 (CEST) Subject: [pypy-svn] r17371 - pypy/dist/pypy/module/__builtin__ Message-ID: <20050908143354.BADE327B61@code1.codespeak.net> Author: arigo Date: Thu Sep 8 16:33:54 2005 New Revision: 17371 Modified: pypy/dist/pypy/module/__builtin__/importing.py Log: Oups. 
Modified: pypy/dist/pypy/module/__builtin__/importing.py ============================================================================== --- pypy/dist/pypy/module/__builtin__/importing.py (original) +++ pypy/dist/pypy/module/__builtin__/importing.py Thu Sep 8 16:33:54 2005 @@ -517,6 +517,6 @@ osfile.close() except OSError: try: - os.unlink(pathname) + os.unlink(cpathname) except OSError: pass From pedronis at codespeak.net Thu Sep 8 17:45:17 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Thu, 8 Sep 2005 17:45:17 +0200 (CEST) Subject: [pypy-svn] r17374 - pypy/dist/pypy/translator/goal Message-ID: <20050908154517.0FCD127B6E@code1.codespeak.net> Author: pedronis Date: Thu Sep 8 17:45:16 2005 New Revision: 17374 Modified: pypy/dist/pypy/translator/goal/targetcompiler.py Log: don't try compiler="ast" for now Modified: pypy/dist/pypy/translator/goal/targetcompiler.py ============================================================================== --- pypy/dist/pypy/translator/goal/targetcompiler.py (original) +++ pypy/dist/pypy/translator/goal/targetcompiler.py Thu Sep 8 17:45:16 2005 @@ -30,7 +30,7 @@ # for the poor translator already # XXX why can't I enable this? crashes the annotator! space = StdObjSpace(nofaking=True, - compiler="ast", + compiler="_stable", translating=True, #usemodules=['marhsal', '_sre'], geninterp=geninterp) From pedronis at codespeak.net Thu Sep 8 17:46:35 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Thu, 8 Sep 2005 17:46:35 +0200 (CEST) Subject: [pypy-svn] r17375 - pypy/dist/pypy/interpreter/pyparser/test Message-ID: <20050908154635.AD77527B6E@code1.codespeak.net> Author: pedronis Date: Thu Sep 8 17:46:34 2005 New Revision: 17375 Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py Log: test if const Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py Thu Sep 8 17:46:34 2005 @@ -261,6 +261,7 @@ """, "if a and not b == c: pass", "if a and not not not b == c: pass", + "if 0: print 'foo'" ] asserts = [ From pedronis at codespeak.net Thu Sep 8 17:51:40 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Thu, 8 Sep 2005 17:51:40 +0200 (CEST) Subject: [pypy-svn] r17376 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050908155140.B054127B6E@code1.codespeak.net> Author: pedronis Date: Thu Sep 8 17:51:39 2005 New Revision: 17376 Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py pypy/dist/pypy/interpreter/astcompiler/astgen.py Log: visitNode is not really implemted causing blocked blocks. 
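The base Node.accept() used to dispatch to visitor.visitNode(), a method that no visitor actually defines, which is what the annotator flagged as blocked blocks; making the base method raise turns the implicit requirement (every concrete node overrides accept) into an explicit one. Schematically, with illustrative classes rather than the generated ast.py:

    class Node:
        def accept(self, visitor):
            # no generic fallback: every concrete subclass must override this
            raise NotImplementedError

    class Const(Node):
        def __init__(self, value):
            self.value = value
        def accept(self, visitor):
            return visitor.visitConst(self)
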
Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.py Thu Sep 8 17:51:39 2005 @@ -37,7 +37,7 @@ def getChildNodes(self): return [] # implemented by subclasses def accept(self, visitor): - return visitor.visitNode(self) + raise NotImplentedError def flatten(self): res = [] nodes = self.getChildNodes() Modified: pypy/dist/pypy/interpreter/astcompiler/astgen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/astgen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/astgen.py Thu Sep 8 17:51:39 2005 @@ -374,7 +374,7 @@ def getChildNodes(self): return [] # implemented by subclasses def accept(self, visitor): - return visitor.visitNode(self) + raise NotImplentedError def flatten(self): res = [] nodes = self.getChildNodes() From pedronis at codespeak.net Thu Sep 8 17:52:10 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Thu, 8 Sep 2005 17:52:10 +0200 (CEST) Subject: [pypy-svn] r17377 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050908155210.15E0327B6E@code1.codespeak.net> Author: pedronis Date: Thu Sep 8 17:52:09 2005 New Revision: 17377 Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Log: make this RPython Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Thu Sep 8 17:52:09 2005 @@ -142,6 +142,8 @@ __init__() defined in this class. """ + graph = None + optimized = 0 # is namespace access optimized? __initialized = None class_name = None # provide default for instance variable @@ -165,12 +167,7 @@ def checkClass(self): """Verify that class is constructed correctly""" - try: - assert hasattr(self, 'graph') - except AssertionError, msg: - intro = "Bad class construction for %s" % self.__class__.__name__ - raise AssertionError, intro - + assert self.graph is not None, "bad class construction for %r" % self def emit(self, inst ): return self.graph.emit( inst ) From ac at codespeak.net Thu Sep 8 17:53:28 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Thu, 8 Sep 2005 17:53:28 +0200 (CEST) Subject: [pypy-svn] r17378 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050908155328.8ACDF27B71@code1.codespeak.net> Author: ac Date: Thu Sep 8 17:53:28 2005 New Revision: 17378 Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py Log: Fix some typos. Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pyassem.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pyassem.py Thu Sep 8 17:53:28 2005 @@ -230,8 +230,9 @@ # block list, so that they can be re-inserted. 
for b in chain: for j in range(i + 1, len(blocks)): - if blocks[i] == b: - del blocks[i] + if blocks[j] == b: + del blocks[j] + break else: assert False, "Can't find block" From pedronis at codespeak.net Thu Sep 8 18:40:50 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Thu, 8 Sep 2005 18:40:50 +0200 (CEST) Subject: [pypy-svn] r17383 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050908164050.040A327B57@code1.codespeak.net> Author: pedronis Date: Thu Sep 8 18:40:49 2005 New Revision: 17383 Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py pypy/dist/pypy/interpreter/astcompiler/astgen.py Log: oopsy oops Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.py Thu Sep 8 18:40:49 2005 @@ -37,7 +37,7 @@ def getChildNodes(self): return [] # implemented by subclasses def accept(self, visitor): - raise NotImplentedError + raise NotImplementedError def flatten(self): res = [] nodes = self.getChildNodes() Modified: pypy/dist/pypy/interpreter/astcompiler/astgen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/astgen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/astgen.py Thu Sep 8 18:40:49 2005 @@ -374,7 +374,7 @@ def getChildNodes(self): return [] # implemented by subclasses def accept(self, visitor): - raise NotImplentedError + raise NotImplementedError def flatten(self): res = [] nodes = self.getChildNodes() From pedronis at codespeak.net Thu Sep 8 18:43:39 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Thu, 8 Sep 2005 18:43:39 +0200 (CEST) Subject: [pypy-svn] r17384 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050908164339.0E54127B57@code1.codespeak.net> Author: pedronis Date: Thu Sep 8 18:43:38 2005 New Revision: 17384 Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py pypy/dist/pypy/interpreter/astcompiler/symbols.py Log: use the space for checking for false if tests. targetastcompiler annotation finishes! Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Thu Sep 8 18:43:38 2005 @@ -128,9 +128,9 @@ mtime = struct.pack(' Author: arigo Date: Thu Sep 8 18:52:03 2005 New Revision: 17385 Modified: pypy/dist/pypy/module/posix/test/test_posix2.py pypy/dist/pypy/rpython/extfunctable.py pypy/dist/pypy/rpython/module/test/test_ll_os.py pypy/dist/pypy/rpython/rbuiltin.py pypy/dist/pypy/translator/c/test/test_extfunc.py Log: * os.unsetenv() may be present but a fake (e.g. there is no nt.unsetenv()). Fix the detection code. * rbuiltin.py was not tracking updates to the extfunctables, causing errors if you ran in the same process an rpython test followed by translator/c/test/test_extfunc. 
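Two small idioms are worth spelling out before the diff. First, ``os.unsetenv`` can be present on the ``os`` module and still be a pure-Python fake (there is no ``nt.unsetenv`` on Windows), so the reliable feature test goes through the underlying platform module, ``__import__(os.name)``. Second, ``declare()`` calls may run after ``rbuiltin`` has been imported, so ``rbuiltin`` now registers a hook in ``extfunctable.table_callbacks`` and refreshes its own typer table whenever a new entry is declared. A minimal sketch of that callback pattern, with simplified names rather than the real extfunctable API::

    # a registry whose consumers may be imported before all entries exist
    table = {}
    table_callbacks = []        # consumer modules hang refresh hooks here

    def declare(func, info):
        table[func] = info
        for callback in table_callbacks:
            callback()          # keep every derived table in sync

    # consumer side (think rbuiltin.py)
    SPECIFIC_TABLE = {}

    def update_specific_table():
        for func, info in table.items():
            if func not in SPECIFIC_TABLE:
                SPECIFIC_TABLE[func] = 'typer for %s' % (info,)

    table_callbacks.append(update_specific_table)  # track future declare() calls
    update_specific_table()                        # and pick up existing entries

    declare('os.ftruncate', 'll_os/ftruncate')
    assert 'os.ftruncate' in SPECIFIC_TABLE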
Modified: pypy/dist/pypy/module/posix/test/test_posix2.py ============================================================================== --- pypy/dist/pypy/module/posix/test/test_posix2.py (original) +++ pypy/dist/pypy/module/posix/test/test_posix2.py Thu Sep 8 18:52:03 2005 @@ -85,20 +85,19 @@ posix = self.posix os = self.os - def test_unsetenv_nonexisting(self): - os = self.os - os.unsetenv("XYZABC") #does not raise - try: - os.environ["ABCABC"] - except KeyError: - pass - else: - raise AssertionError("did not raise KeyError") - os.environ["ABCABC"] = "1" - assert os.environ["ABCABC"] == "1" - if hasattr(os, "unsetenv"): + if hasattr(__import__(os.name), "unsetenv"): + def test_unsetenv_nonexisting(self): + os = self.os + os.unsetenv("XYZABC") #does not raise + try: + os.environ["ABCABC"] + except KeyError: + pass + else: + raise AssertionError("did not raise KeyError") + os.environ["ABCABC"] = "1" + assert os.environ["ABCABC"] == "1" os.unsetenv("ABCABC") cmd = '''python -c "import os, sys; sys.exit(int('ABCABC' in os.environ))" ''' res = os.system(cmd) assert res == 0 - Modified: pypy/dist/pypy/rpython/extfunctable.py ============================================================================== --- pypy/dist/pypy/rpython/extfunctable.py (original) +++ pypy/dist/pypy/rpython/extfunctable.py Thu Sep 8 18:52:03 2005 @@ -133,6 +133,7 @@ return SomeString(can_be_None=True) # external function declarations +posix = __import__(os.name) declare(os.open , int , 'll_os/open') declare(os.read , str , 'll_os/read') declare(os.write , posannotation , 'll_os/write') @@ -140,7 +141,7 @@ declare(os.dup , int , 'll_os/dup') declare(os.lseek , int , 'll_os/lseek') declare(os.isatty , bool , 'll_os/isatty') -if hasattr(os, 'ftruncate'): +if hasattr(posix, 'ftruncate'): declare(os.ftruncate, noneannotation, 'll_os/ftruncate') declare(os.fstat , statannotation, 'll_os/fstat') declare(os.stat , statannotation, 'll_os/stat') @@ -151,7 +152,8 @@ declare(os.chdir , noneannotation, 'll_os/chdir') declare(os.mkdir , noneannotation, 'll_os/mkdir') declare(os.rmdir , noneannotation, 'll_os/rmdir') -declare(os.unsetenv , noneannotation, 'll_os/unsetenv') +if hasattr(posix, 'unsetenv'): # note: faked in os + declare(os.unsetenv , noneannotation, 'll_os/unsetenv') declare(os.path.exists, bool , 'll_os_path/exists') declare(os.path.isdir, bool , 'll_os_path/isdir') declare(time.time , float , 'll_time/time') Modified: pypy/dist/pypy/rpython/module/test/test_ll_os.py ============================================================================== --- pypy/dist/pypy/rpython/module/test/test_ll_os.py (original) +++ pypy/dist/pypy/rpython/module/test/test_ll_os.py Thu Sep 8 18:52:03 2005 @@ -46,14 +46,16 @@ assert result == '12345678' f.close() os.unlink(filename) - ll_os_unsetenv(to_rstr("abcdefgh")) - cmd = '''python -c "import os; print repr(os.getenv('abcdefgh'))" > %s''' % filename - os.system(cmd) - f = file(filename) - result = f.read().strip() - assert result == 'None' - f.close() - os.unlink(filename) + posix = __import__(os.name) + if hasattr(posix, "unsetenv"): + ll_os_unsetenv(to_rstr("abcdefgh")) + cmd = '''python -c "import os; print repr(os.getenv('abcdefgh'))" > %s''' % filename + os.system(cmd) + f = file(filename) + result = f.read().strip() + assert result == 'None' + f.close() + os.unlink(filename) test_src = """ import os Modified: pypy/dist/pypy/rpython/rbuiltin.py ============================================================================== --- pypy/dist/pypy/rpython/rbuiltin.py (original) +++ 
pypy/dist/pypy/rpython/rbuiltin.py Thu Sep 8 18:52:03 2005 @@ -275,14 +275,15 @@ from pypy.rpython import extfunctable def make_rtype_extfunc(extfuncinfo): - ll_function = extfuncinfo.ll_function if extfuncinfo.ll_annotable: def rtype_extfunc(hop): + ll_function = extfuncinfo.ll_function vars = hop.inputargs(*hop.args_r) hop.exception_is_here() return hop.gendirectcall(ll_function, *vars) else: def rtype_extfunc(hop): + ll_function = extfuncinfo.ll_function resulttype = hop.r_result vars = hop.inputargs(*hop.args_r) hop.exception_is_here() @@ -294,10 +295,20 @@ "rtype_extfunc_%s" % extfuncinfo.func.__name__) return rtype_extfunc -# import rtyping information for external functions -# from the extfunctable.table into our own specific table -for func, extfuncinfo in extfunctable.table.iteritems(): - BUILTIN_TYPER[func] = make_rtype_extfunc(extfuncinfo) + +def update_exttable(): + """import rtyping information for external functions + from the extfunctable.table into our own specific table + """ + for func, extfuncinfo in extfunctable.table.iteritems(): + if func not in BUILTIN_TYPER: + BUILTIN_TYPER[func] = make_rtype_extfunc(extfuncinfo) + +# Note: calls to declare() may occur after rbuiltin.py is first imported. +# We must track future changes to the extfunctable. +extfunctable.table_callbacks.append(update_exttable) +update_exttable() + # _________________________________________________________________ # memory addresses Modified: pypy/dist/pypy/translator/c/test/test_extfunc.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_extfunc.py (original) +++ pypy/dist/pypy/translator/c/test/test_extfunc.py Thu Sep 8 18:52:03 2005 @@ -435,8 +435,6 @@ assert file(filename).read().strip() == '12345678' os.unlink(filename) -# XXX missing test for unsetenv, please do this on Linux - # aaargh: bad idea: the above test updates the environment directly, so the # os.environ dict is not updated, which makes the following test fail if not run @@ -478,17 +476,17 @@ assert res chan.close() -def test_unsetenv(): - if not hasattr(os, "unsetenv"): - py.test.skip("missing unsetenv on this architecture") - def unsetenv(): - os.unsetenv("ABCDEF") - f = compile(unsetenv, []) - os.putenv("ABCDEF", "a") - cmd = '''python -c "import os, sys; sys.exit(os.getenv('ABCDEF') != 'a')"''' - assert os.system(cmd) == 0 - f() - cmd = '''python -c "import os, sys; sys.exit(os.getenv('ABCDEF') != None)"''' - assert os.system(cmd) == 0 - f() - assert os.system(cmd) == 0 +posix = __import__(os.name) +if hasattr(posix, "unsetenv"): + def test_unsetenv(): + def unsetenv(): + os.unsetenv("ABCDEF") + f = compile(unsetenv, []) + os.putenv("ABCDEF", "a") + cmd = '''python -c "import os, sys; sys.exit(os.getenv('ABCDEF') != 'a')"''' + assert os.system(cmd) == 0 + f() + cmd = '''python -c "import os, sys; sys.exit(os.getenv('ABCDEF') != None)"''' + assert os.system(cmd) == 0 + f() + assert os.system(cmd) == 0 From arigo at codespeak.net Thu Sep 8 19:28:15 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 8 Sep 2005 19:28:15 +0200 (CEST) Subject: [pypy-svn] r17386 - pypy/extradoc/minute Message-ID: <20050908172815.2F1B527B5E@code1.codespeak.net> Author: arigo Date: Thu Sep 8 19:28:14 2005 New Revision: 17386 Added: pypy/extradoc/minute/pypy-sync-09-08-2005.txt Log: pypy-sync meeting minutes. 
Added: pypy/extradoc/minute/pypy-sync-09-08-2005.txt ============================================================================== --- (empty file) +++ pypy/extradoc/minute/pypy-sync-09-08-2005.txt Thu Sep 8 19:28:14 2005 @@ -0,0 +1,243 @@ +============================================= +pypy-sync developer meeting 8th September +============================================= + +Time & location: 1pm (30 minutes) at #pypy-sync + +Attendees:: + + Holger Krekel, + Christian Tismer, + Anders Lehmann, + Adrien Di Mascio, + Samuele Pedroni, + Anders Chrigstroem, + Ludovic Aubrien, + Armin Rigo (minutes/moderation), + Carl Friedrich Bolz, + Eric van Riet Paap, + Michael Hudson (lurking) + +Regular Topics +==================== + +- activity reports (3 prepared lines of info). + All Attendees submitted activity reports (see `IRC-Log`_ + at the end and 'LAST/NEXT/BLOCKERS' entries in particular) + +- resolve conflicts/blockers + No conflicts were discovered. + +Topics of the week +=================== + +Documentation and reports: status, assignment +---------------------------------------------- + +Samuele and Armin drafted some content for the EU reports (related +to WP5) in eu-tracking/deliverable/D05-reports-planning.txt. + +Holger and Carl want to help with these reports, but not before, +respectively, the 3rd part of the month, and October. Holger also +suggested that the four of us could meet in Paris one day before the +sprint starts. + +We'll coordinate a bit on #pypy. + +Setting up another PyPy server +------------------------------- + +We will soon have a Linux PC to be used as a general computation +server for PyPy (snake.cs.uni-duesseldorf.de). Armin will set it up, +also to do things like automated testing and PyPy translation. Holger +proposed some help. We will probably copy all codespeak accounts to +snake. The machine is a 2x 3.2GHz, 1GB RAM (soon 2GB), and it +compiles C/C++ or -- more interestingly -- translates PyPy extremely +fast. + +Meeting time +------------------ + +A number of people are not too happy about the time (1pm) of the +pypy-sync meetings, but discussions for a different time show more +problems than solutions. Next meeting is still at 1pm. + + +Closing +------------------ + +Armin closes the meeting in time at 1:30pm. + +.. _`IRC-log`: + +Here is the full IRC log:: + + Session Start (pypy:#pypy-sync): Thu Sep 08 12:59:52 2005 + [12:59] *** #pypy-sync was created on Thu Sep 08 12:39:51 2005. + [13:02] *** ludal has joined #pypy-sync. + [13:05] hpk: hum, i guess armin is busy with some HHU admin or so + [13:06] hpk: shall we just start anyway? + [13:06] stakkars: so let's play the PyPy rock, until we start. (proposals for the melody?) 
+ [13:06] aleale: I guess + [13:06] adim: +1 for me + [13:07] stakkars: DONE: boehm under Windows, lots of benchmarks, rewrote rlist and listobject + [13:07] stakkars: BLOCK: rtyper behavior which didn't match the implementation + [13:07] stakkars: NEXT: maybe a single dict implementation, removing it from interp-level, report templates, open for other tasks + [13:07] aleale: prev: trying to clean up translate_pypy + [13:07] aleale: next: first draft of plan for WP9, more translate_pypy + [13:07] aleale: blockers: - + [13:07] pedronis: LAST: planning report with Armin, compiler with Logilan and Arre, annotator err reporting, a bit of translate_pypy, more convenience commands in its debugger + [13:07] hpk: last: EU issues, website/codespeak stuff, mentoring + [13:07] hpk: next: non-pypy stuff, codespeak migration second go + [13:07] hpk: blockers: - + [13:07] pedronis: NEXT: reports, more of the rest + [13:07] pedronis: BLOCKERS: - + [13:08] adim: LAST: astcompiler + [13:08] adim: NEXT: astcompiler + [13:08] adim: BLOCKERS: none + [13:08] arre: LAST: Vacation and working on astcompiler with Samuele. + [13:08] arre: NEXT: More work in astcompiler. + [13:08] arre: BLOCKERS: No known. + [13:08] ludal: this week: some pairing with Adrien, also played a bit with compile options + [13:08] ludal: next week: same + [13:08] ludal: blockers: none + [13:08] *** arigo has joined #pypy-sync. + [13:08] arigo: oups, failed to speed up my eating + [13:08] arigo: sorry + [13:08] cfbolz: LAST: inlining functions + [13:08] cfbolz: NEXT: pypy unrelated + [13:08] cfbolz: BLOCKERS: None + [13:08] *** cfbolz has left #pypy-sync. + [13:08] ericvrp: last: talk with llvm team about their escape analysis + [13:08] ericvrp: next: prepare usage of heap2stack + [13:08] *** cfbolz has joined #pypy-sync. + [13:08] ericvrp: blockers: am a bit unfocused at the moment + [13:08] hpk: arigo: we are in activity reports, still lots to go :-) + [13:09] stakkars: should we paste it all again forArmin? It's quick + [13:09] hpk: i send him the log later + [13:09] arigo: yes + [13:09] ludal: we could also consider moving next meetings out of lunchtime + [13:10] arigo: LAST: drafted EU reports, not much else (minor illness) + [13:10] arigo: NEXT: EU reports + [13:10] arigo: BLOCKERS: - + [13:10] cfbolz: -1 + [13:10] arigo: any status lines missing? + [13:10] hpk: don't think so + [13:10] arigo: any blockers/conflicts ? + [13:11] hpk: apparently not :-) + [13:11] arigo: ok :-) + [13:11] stakkars: I only saw a little unfocusedness and my problems which are solved. + [13:11] arigo: ok + [13:11] arigo: we'll talk about the time later, I should manage to eat a bit earlier + [13:11] arigo: let's move to topics of the week + [13:12] arigo: Documentation and reports: status, assignments + [13:12] arigo: pedronis: do you want to give a quick summary? + [13:12] ludal: well, It's not about you, I'm personnally fed up with eating sandwiches every thursday + [13:13] arigo: ludal: ok, let's move it as topic 3 then + [13:13] pedronis: we drafted the some outline of the contents + [13:13] *** mwh has joined #pypy-sync. 
+ [13:14] pedronis: it's in D05-reports-planning.txt + [13:14] pedronis: in eu-tracking/deliverables + [13:14] arigo: the question at the moment is if this is going to be written by Samuele and me only, or not + [13:14] arigo: the D05.1 "report about translating a very-high-level description + [13:14] arigo: of a language into low level code by building on "abstract interpretation" and + [13:14] arigo: PyPy's separation of the interpreter and the object space in particular") is probably going to be + [13:15] hpk: i am willing to help but probably only in the last third of the month + [13:15] hpk: (can't say which parts at the moment) + [13:15] cfbolz: me as well from beginning of october + [13:16] hpk: pedronis, cfbolz, arigo: what about meeting one day earlier in paris and trying to work on that together? + [13:16] cfbolz: fine with me! + [13:16] arigo: and always open, the question of what kind of format we should aim for + [13:16] hpk: first please ReST + [13:16] arigo: I mean contentwise + [13:16] hpk: otherwise it's a pain to collaborate + [13:16] hpk: ah ok, sorry + [13:17] pedronis: arigo: there was some info about that + [13:17] pedronis: in the minutes + [13:17] pedronis: of the management meeting + [13:17] * hpk hopes to get examples of reports next week + [13:17] pedronis: in Heildeberg + [13:17] arigo: ah ok + [13:18] hpk: idea: we should generally try to think of stuff we could link from Wikipedia to the pypy pages + [13:19] hpk: wikipedia really has nice terms and there are other projects like Scheme interweaved and mentioned a lot + [13:19] hpk: (this is regarindg the "publish" part of the deliverable) + [13:19] stakkars: and btw., WIkipedia already has an idea about Stackless, but PyPy is missing... + [13:20] arigo: ok (I can't drive this any more right now, as I didn't see the minutes in question) + [13:21] arigo: next topic is: I'm setting up a server here at HHU + [13:21] hpk: arigo, pedronis, cfbolz, hpk: we should try to remain in contact/discussion about this + [13:21] cfbolz: hpk: indeed + [13:21] arigo: (yes, I'll try to catch up and let's continue this on #pypy) + [13:21] arigo: the HHU server should be open to any of us + [13:22] hpk: that's good, there actually is an easy way to setup all pypy/codspeak users up on that machine + [13:22] arigo: as I said on #pypy it's a good but normal PC, 2x3.2GHz, 1GB RAM (soon 2GB) + [13:22] arigo: it compiles very quickly so we could use it for background testing/translating + [13:22] stakkars: dual processor, great + [13:22] hpk: sounds good. + [13:23] hpk: i have some scripts to regularly run tests. maybe we can hack a bit on this together some time soon. + [13:23] arigo: hpk: yes + [13:23] arigo: also, what is the status of the automated translation-runner, doing a binary search? + [13:23] hpk: nothin, only a design draft was done in Goetheborg (from Anders Q. IIRC) + [13:24] arigo: I thought there was some code + [13:24] hpk: code in the sense of classes and methods but all unimplemented + [13:24] arigo: ah ok + [13:25] arigo: fine, I guess that's it for this meeting -- next meeting: + [13:25] arigo: change the time? what about 1:30? + [13:26] * hpk would prefer earlier but is fine with 01:30 + [13:26] arigo: it could be anything actually, e.g. 1:20-1:50 + [13:26] cfbolz: 1:30 is ok but not later + [13:26] arigo: or of course 1:12-1:42 + [13:26] aleale: ok by me - would prefer 11.30 + [13:26] hpk: ludal: what is your opinion? 
+ [13:26] ludal: is there a particular reason to be between 12h-14h + [13:26] stakkars: bad for me,I'dprefer earlier as well + [13:26] ludal: 11h30 or 13h30 is fine + [13:26] ericvrp: I would prefer earlier then 13:00 + [13:27] hpk: ok, so what about 11:30 ? + [13:27] ericvrp: + + [13:27] ludal: i'm just wondering maybe someone has obligations outside + [13:27] cfbolz: earlier is bad for me --> lectures + [13:27] ludal: hpk: +1 + [13:27] arigo: 12:00 ? + [13:27] cfbolz: they end at 13.00 :-( + [13:27] pedronis: 11:30 -1 + [13:28] ludal: no up yet ;) + [13:28] ludal: s/no/not/ + [13:28] arigo: hum, difficult + [13:28] hpk: well, let's all send arigo times we prefer per mail and he can figure it out and invite accordingly :-) + [13:28] arre: 11:00 would be even better for me. + [13:29] ludal: ok so constraints are ericvrp < 13:00; cfbolz >= 13:00 ... + [13:29] hpk: ludal: or you use your constraint solver or whatever :-) + [13:29] aleale: we could leave to the moderator of the week to decide + [13:29] ludal: I don't think it can manage that :) + [13:29] aleale: s/leave/leave it + [13:29] arigo: Christian, you are next week + [13:29] hpk: i don't think changing the time each week is a great idea + [13:29] arigo: (at least you proposed yourself as such, last time) + [13:30] adim: hpk: +1 + [13:30] ericvrp: if we go for 13:30, I will leave at 13:55 + [13:30] arigo: I suggest that 1pm is still the proposed time (least conflicting) + [13:30] aleale: arigo:+1 + [13:30] ericvrp: fine + [13:30] stakkars: so Ido the meeting at 1pmnext week. + [13:30] arigo: but we could all send constrains to Christian (precise constrains, e.g. <= 13:25 is fine) + [13:30] hpk: arigo: +1 ( until next week where we can resolve this maybe) + [13:30] stakkars: and I'll bring some more spaces + [13:30] arigo: ok + [13:31] arigo: fine, let's keep 1pm for next Thursday as well + [13:31] arigo: stakkars: :-) + [13:31] arigo: thanks all :-) + [13:31] cfbolz: see you + [13:31] ludal: bye + [13:31] adim: see ou + [13:31] *** cfbolz has left #pypy-sync. + [13:31] aleale: see you + [13:31] hpk: bye + [13:31] *** adim has left #pypy-sync. + [13:31] *** hpk has left #pypy-sync. + [13:31] stakkars: I wished I could send my notebook in for repair, but then I don't have any + [13:31] *** ludal has left #pypy-sync. + [13:32] *** aleale has left #pypy-sync. + [13:32] arigo: stakkars: do you have a log? + [13:32] stakkars: yes + [13:32] arigo: thanks + Session Close (#pypy-sync): Thu Sep 08 13:32:32 2005 From pedronis at codespeak.net Thu Sep 8 19:31:06 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Thu, 8 Sep 2005 19:31:06 +0200 (CEST) Subject: [pypy-svn] r17388 - pypy/dist/pypy/translator/goal Message-ID: <20050908173106.E2D8927B61@code1.codespeak.net> Author: pedronis Date: Thu Sep 8 19:31:04 2005 New Revision: 17388 Modified: pypy/dist/pypy/translator/goal/query.py Log: this sanity check should consider that the annotator always normalize unbound methods to the underlying function. 
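The fix below hinges on a Python 2 detail: when a class body re-exports another class's method (for instance ``meth = Base.meth``), the class ``__dict__`` holds an *unbound method* object, whereas the annotator only ever tracks the underlying function, so the sanity check has to normalize with ``im_func`` before comparing. A small Python 2-only illustration (unbound methods no longer exist in Python 3); the classes are made up for the example::

    class A(object):
        def meth(self):
            return 'A.meth'

    class B(A):
        meth = A.meth               # stored as an unbound method, not a function

    f = B.__dict__['meth']
    assert f.im_self is None        # unbound: no instance attached
    f = f.im_func                   # normalize to the plain function object
    assert f is A.__dict__['meth']  # which is what the annotator records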
Modified: pypy/dist/pypy/translator/goal/query.py ============================================================================== --- pypy/dist/pypy/translator/goal/query.py (original) +++ pypy/dist/pypy/translator/goal/query.py Thu Sep 8 19:31:04 2005 @@ -475,6 +475,8 @@ funcs = dict.fromkeys(meth.s_value.prebuiltinstances.iterkeys()) for subcls in subclasses: f = subcls.cls.__dict__.get(name) + if hasattr(f, 'im_self') and f.im_self is None: + f = f.im_func if f: if f not in funcs: print "Lost method!", name, subcls.cls, cls, subcls.attrs.keys() From pedronis at codespeak.net Thu Sep 8 19:46:32 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Thu, 8 Sep 2005 19:46:32 +0200 (CEST) Subject: [pypy-svn] r17389 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050908174632.AED6127B61@code1.codespeak.net> Author: pedronis Date: Thu Sep 8 19:46:31 2005 New Revision: 17389 Modified: pypy/dist/pypy/interpreter/astcompiler/symbols.py Log: removed the __super_init idiom in symbols.py too Modified: pypy/dist/pypy/interpreter/astcompiler/symbols.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/symbols.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/symbols.py Thu Sep 8 19:46:31 2005 @@ -173,23 +173,20 @@ return self.cells.keys() class ModuleScope(Scope): - __super_init = Scope.__init__ def __init__(self): - self.__super_init("global", self) + Scope.__init__(self, "global", self) class FunctionScope(Scope): pass class GenExprScope(Scope): - __super_init = Scope.__init__ - __counter = 1 def __init__(self, module, klass=None): i = self.__counter self.__counter += 1 - self.__super_init("generator expression<%d>"%i, module, klass) + Scope.__init__(self, "generator expression<%d>"%i, module, klass) self.add_param('[outmost-iterable]') def get_names(self): @@ -197,20 +194,17 @@ return keys class LambdaScope(FunctionScope): - __super_init = Scope.__init__ - __counter = 1 def __init__(self, module, klass=None): i = self.__counter self.__counter += 1 - self.__super_init("lambda.%d" % i, module, klass) + Scope.__init__(self, "lambda.%d" % i, module, klass) class ClassScope(Scope): - __super_init = Scope.__init__ def __init__(self, name, module): - self.__super_init(name, module, name) + Scope.__init__(self, name, module, name) class SymbolVisitor(ast.ASTVisitor): def __init__(self, space): From arigo at codespeak.net Thu Sep 8 20:39:20 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 8 Sep 2005 20:39:20 +0200 (CEST) Subject: [pypy-svn] r17390 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050908183920.CC91127B61@code1.codespeak.net> Author: arigo Date: Thu Sep 8 20:39:19 2005 New Revision: 17390 Modified: pypy/dist/pypy/interpreter/astcompiler/future.py Log: This logic was complaining on any "from import" statement that was not at top-level. 
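In other words, the unconditional SyntaxError in ``visitFrom`` was being reached for every ``from ... import ...`` that the recursive walk found below the top level, not only for misplaced ``__future__`` imports; moving the ``modname`` test into ``visitFrom`` itself restores the intended behaviour. A module like the following, which is perfectly legal Python, is presumably the kind of input that used to be rejected with "from __future__ imports must occur at the beginning of the file"::

    from __future__ import generators      # fine: at the top of the module

    def load_path():
        from os import path                # a plain nested "from import";
        return path.sep                    # this used to trip the checker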
Modified: pypy/dist/pypy/interpreter/astcompiler/future.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/future.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/future.py Thu Sep 8 20:39:19 2005 @@ -66,13 +66,13 @@ if isinstance(s, ast.From): if s.valid_future: continue - if s.modname != "__future__": - continue self.visitFrom(s) else: self.default(s) def visitFrom(self, node): + if node.modname != "__future__": + return raise SyntaxError( "from __future__ imports must occur at the beginning of the file", filename=node.filename, lineno=node.lineno) From ericvrp at codespeak.net Thu Sep 8 21:16:48 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Thu, 8 Sep 2005 21:16:48 +0200 (CEST) Subject: [pypy-svn] r17391 - pypy/dist/pypy/translator/goal Message-ID: <20050908191648.B1EBA27B61@code1.codespeak.net> Author: ericvrp Date: Thu Sep 8 21:16:48 2005 New Revision: 17391 Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py Log: fix to generate correct .exe for non-c backends also Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Thu Sep 8 21:16:48 2005 @@ -437,10 +437,10 @@ 'gcpolicy' : gcpolicy} c_entry_point = t.compile(options1.backend, **keywords) - if standalone and options1.backend == 'c': # xxx fragile and messy + if standalone: # xxx fragile and messy import shutil exename = mkexename(c_entry_point) - newexename = mkexename('./pypy-c') + newexename = mkexename('./pypy-' + options1.backend) shutil.copy(exename, newexename) c_entry_point = newexename update_usession_dir() From arigo at codespeak.net Thu Sep 8 21:21:18 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 8 Sep 2005 21:21:18 +0200 (CEST) Subject: [pypy-svn] r17392 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050908192118.43E1C27B61@code1.codespeak.net> Author: arigo Date: Thu Sep 8 21:21:17 2005 New Revision: 17392 Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py pypy/dist/pypy/interpreter/astcompiler/symbols.py Log: Bug left by checking "... is not None" on a flag whose value is now 0/1 instead of the (insane) original None/1. Using False/True looks like a good move anyway. Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Thu Sep 8 21:21:17 2005 @@ -1251,7 +1251,7 @@ AbstractFunctionCode.__init__(self, space, func, scopes, isLambda, class_name, mod) self.graph.setFreeVars(self.scope.get_free_vars()) self.graph.setCellVars(self.scope.get_cell_vars()) - if self.scope.generator is not None: + if self.scope.generator: self.graph.setFlag(CO_GENERATOR) class GenExprCodeGenerator(AbstractFunctionCode): Modified: pypy/dist/pypy/interpreter/astcompiler/symbols.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/symbols.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/symbols.py Thu Sep 8 21:21:17 2005 @@ -27,7 +27,7 @@ # nested is true if the class could contain free variables, # i.e. if it is nested within another function. 
self.nested = 0 - self.generator = 0 + self.generator = False self.klass = None if klass is not None: for i in range(len(klass)): @@ -456,7 +456,7 @@ def visitYield(self, node ): scope = self.cur_scope() - scope.generator = 1 + scope.generator = True node.value.accept( self ) def sort(l): From pedronis at codespeak.net Thu Sep 8 21:23:27 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Thu, 8 Sep 2005 21:23:27 +0200 (CEST) Subject: [pypy-svn] r17393 - pypy/dist/pypy/interpreter/pyparser/test Message-ID: <20050908192327.3061727B61@code1.codespeak.net> Author: pedronis Date: Thu Sep 8 21:23:26 2005 New Revision: 17393 Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py Log: check for flags in test_astcompiler, try to parse and compile _file and _sio parsing is giving different results, need to investigate Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py Thu Sep 8 21:23:26 2005 @@ -616,12 +616,24 @@ 'snippet_decorators.py', ] +LIBSTUFF = [ + '_file.py', + '_sio.py' + ] + def test_snippets(): for snippet_name in SNIPPETS: filepath = os.path.join(os.path.dirname(__file__), 'samples', snippet_name) source = file(filepath).read() yield check_expression, source, 'exec' +def test_libstuff(): + py.test.skip("failing, need to investigate") + for snippet_name in LIBSTUFF: + filepath = os.path.join(os.path.dirname(__file__), '../../../lib', snippet_name) + source = file(filepath).read() + yield check_expression, source, 'exec' + # FIXME: find the sys' attribute that define this STDLIB_PATH = os.path.dirname(os.__file__) def test_on_stdlib(): Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py Thu Sep 8 21:23:26 2005 @@ -15,7 +15,7 @@ listmakers, genexps, dictmakers, multiexpr, attraccess, slices, imports,\ asserts, execs, prints, globs, raises_, imports_newstyle, augassigns, \ if_stmts, one_stmt_classdefs, one_stmt_funcdefs, tryexcepts, docstrings, \ - returns, SNIPPETS, SINGLE_INPUTS + returns, SNIPPETS, SINGLE_INPUTS, LIBSTUFF from test_astbuilder import FakeSpace @@ -97,6 +97,7 @@ dis.dis(sc_code) assert ac_code.co_code == sc_code.co_code assert ac_code.co_varnames == sc_code.co_varnames + assert ac_code.co_flags == sc_code.co_flags assert len(ac_code.co_consts) == len(sc_code.co_consts) for c1, c2 in zip( ac_code.co_consts, sc_code.co_consts ): @@ -177,6 +178,12 @@ source = file(filepath).read() yield check_compile, source, 'exec' +def test_libstuff(): + for snippet_name in LIBSTUFF: + filepath = os.path.join(os.path.dirname(__file__), '../../../lib', snippet_name) + source = file(filepath).read() + yield check_compile, source, 'exec' + def test_single_inputs(): for family in SINGLE_INPUTS: From tismer at codespeak.net Thu Sep 8 21:41:22 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Thu, 8 Sep 2005 21:41:22 +0200 (CEST) Subject: [pypy-svn] r17394 - pypy/dist/pypy/translator/tool Message-ID: <20050908194122.D1FC727B61@code1.codespeak.net> Author: tismer Date: Thu Sep 8 21:41:21 2005 New Revision: 17394 
Modified: pypy/dist/pypy/translator/tool/cbuild.py Log: added a windows option /Op which makes math precision best possible. This way, two tests (exp(0.3), log(-1.0)) behave correctly. Modified: pypy/dist/pypy/translator/tool/cbuild.py ============================================================================== --- pypy/dist/pypy/translator/tool/cbuild.py (original) +++ pypy/dist/pypy/translator/tool/cbuild.py Thu Sep 8 21:41:21 2005 @@ -27,13 +27,27 @@ return os.getenv('PYPY_CC') def enable_fast_compilation(): + if sys.platform == 'win32': + dash = '/' + else: + dash = '-' from distutils import sysconfig gcv = sysconfig.get_config_vars() opt = gcv.get('OPT') # not always existent if opt: - opt = re.sub('-O\d+', '-O0', opt) + opt = re.sub('%sO\d+' % dash, '%sO0' % dash, opt) else: - opt = '-O0' + opt = '%sO0' % dash + gcv['OPT'] = opt + +def ensure_correct_math(): + if sys.platform != 'win32': + return # so far + from distutils import sysconfig + gcv = sysconfig.get_config_vars() + opt = gcv.get('OPT') # not always existent + if opt and '/Op' not in opt: + opt += '/Op' gcv['OPT'] = opt def compile_c_module(cfile, modname, include_dirs=None, libraries=[]): @@ -48,6 +62,7 @@ dirpath = cfile.dirpath() lastdir = dirpath.chdir() + ensure_correct_math() try: if debug: print "modname", modname c = stdoutcapture.Capture(mixed_out_err = True) @@ -74,6 +89,9 @@ # instantiate a Distribution, which also allows us to # ignore unwanted features like config files. extra_compile_args = [] + # ensure correct math on windows + if sys.platform == 'win32': + extra_compile_args.append('/Op') if get_default_compiler() == 'unix': old_version = False try: From ericvrp at codespeak.net Thu Sep 8 22:01:52 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Thu, 8 Sep 2005 22:01:52 +0200 (CEST) Subject: [pypy-svn] r17395 - in pypy/dist/pypy/translator/llvm: . module Message-ID: <20050908200152.7D30F27B61@code1.codespeak.net> Author: ericvrp Date: Thu Sep 8 22:01:46 2005 New Revision: 17395 Added: pypy/dist/pypy/translator/llvm/exception.py (contents, props changed) Modified: pypy/dist/pypy/translator/llvm/build_llvm_module.py pypy/dist/pypy/translator/llvm/codewriter.py pypy/dist/pypy/translator/llvm/gc.py pypy/dist/pypy/translator/llvm/genllvm.py pypy/dist/pypy/translator/llvm/module/extfunction.py Log: * refactored more code into gc.py The goal of this refactoring is to make it easier to use llvm's optimization capabilities. In this case it would be good to let llvm know that our GC_malloc is basically just some allocation on the heap that can be moved to the stack if this helps speeding up the code. These kind of optimization will in future probably also be handled by pypy transformations. But it makes sence to allow llvm to show it's tricks, if only so we know what can be achived. * started to refactor exception handling into a policy. The goal here is again to keep related code in one place so experimentation is easier. To test this I would like to write a 'fast' exception policy (as opposed to the CPython one) that does not use issubclass(..) but does only direct exception class comparison. 
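The direction described in this log, pluggable policies selected by name plus an exception policy that may trade full ``issubclass`` semantics for a direct class comparison, can be illustrated independently of the LLVM code generator. The classes and the ``pick_policy`` helper below are illustrative only; the real objects live in ``gc.py`` and ``exception.py`` in the diff that follows::

    class ExceptionMatchPolicy(object):
        def matches(self, raised_cls, handler_cls):
            raise NotImplementedError

    class CPythonPolicy(ExceptionMatchPolicy):
        # full CPython semantics: "except Base:" also catches subclasses
        def matches(self, raised_cls, handler_cls):
            return issubclass(raised_cls, handler_cls)

    class FastPolicy(ExceptionMatchPolicy):
        # cheaper check, only valid when handlers always name the exact class
        def matches(self, raised_cls, handler_cls):
            return raised_cls is handler_cls

    def pick_policy(name=None):            # mirrors the new(...) factory idea
        policies = {None: CPythonPolicy, 'cpython': CPythonPolicy,
                    'fast': FastPolicy}
        return policies[name]()

    assert pick_policy('cpython').matches(ZeroDivisionError, ArithmeticError)
    assert not pick_policy('fast').matches(ZeroDivisionError, ArithmeticError)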
Modified: pypy/dist/pypy/translator/llvm/build_llvm_module.py ============================================================================== --- pypy/dist/pypy/translator/llvm/build_llvm_module.py (original) +++ pypy/dist/pypy/translator/llvm/build_llvm_module.py Thu Sep 8 22:01:46 2005 @@ -12,7 +12,6 @@ from pypy.translator.tool import stdoutcapture from pypy.translator.llvm.log import log -EXCEPTIONS_SWITCHES = "-enable-correct-eh-support" SIMPLE_OPTIMIZATION_SWITCHES = (" ".join([ # kill code - hopefully to speed things up "-globaldce -adce -deadtypeelim -simplifycfg", @@ -84,15 +83,16 @@ cmds = ["llvm-as < %s.ll | opt %s -f -o %s.bc" % (b, OPTIMIZATION_SWITCHES, b)] - if False and sys.maxint == 2147483647: #32 bit platform - cmds.append("llc %s %s.bc -f -o %s.s" % (EXCEPTIONS_SWITCHES, b, b)) + generate_s_file = False + if generate_s_file and sys.maxint == 2147483647: #32 bit platform + cmds.append("llc %s %s.bc -f -o %s.s" % (genllvm.exceptionpolicy.llc_options(), b, b)) cmds.append("as %s.s -o %s.o" % (b, b)) if exe_name: cmds.append("gcc %s.o %s -lm -ldl -pipe -o %s" % (b, gc_libs, exe_name)) object_files.append("%s.o" % b) else: #assume 64 bit platform (x86-64?) #this special case for x86-64 (called ia64 in llvm) can go as soon as llc supports ia64 assembly output! - cmds.append("llc %s %s.bc -march=c -f -o %s.c" % (EXCEPTIONS_SWITCHES, b, b)) + cmds.append("llc %s %s.bc -march=c -f -o %s.c" % (genllvm.exceptionpolicy.llc_options(), b, b)) if exe_name: #XXX TODO: use CFLAGS when available cmds.append("gcc %s.c -c -O3 -pipe" % (b,)) Modified: pypy/dist/pypy/translator/llvm/codewriter.py ============================================================================== --- pypy/dist/pypy/translator/llvm/codewriter.py (original) +++ pypy/dist/pypy/translator/llvm/codewriter.py Thu Sep 8 22:01:46 2005 @@ -8,12 +8,13 @@ DEFAULT_CCONV = 'fastcc' #ccc/fastcc class CodeWriter(object): - def __init__(self, f, word, uword, show_line_number=False): + def __init__(self, f, genllvm, show_line_number=False): self.f = f + self.genllvm = genllvm + self.word = genllvm.db.get_machine_word() + self.uword = genllvm.db.get_machine_uword() self.show_line_numbers = show_line_number self.n_lines = 0 - self.word = word - self.uword = uword def append(self, line): self.n_lines += 1 @@ -74,7 +75,6 @@ % (intty, cond, defaultdest, labels)) def openfunc(self, decl, is_entrynode=False, cconv=DEFAULT_CCONV): - self.malloc_count = count(0).next self.newline() if is_entrynode: linkage_type = '' @@ -141,18 +141,8 @@ "%(fromvar)s to %(targettype)s" % locals()) def malloc(self, targetvar, type_, size=1, atomic=False, cconv=DEFAULT_CCONV): - n = self.malloc_count() - if n: - cnt = ".%d" % n - else: - cnt = "" - postfix = ('', '_atomic')[atomic] - word = self.word - uword = self.uword - self.indent("%%malloc.Size%(cnt)s = getelementptr %(type_)s* null, %(uword)s %(size)s" % locals()) - self.indent("%%malloc.SizeU%(cnt)s = cast %(type_)s* %%malloc.Size%(cnt)s to %(uword)s" % locals()) - self.indent("%%malloc.Ptr%(cnt)s = call %(cconv)s sbyte* %%gc_malloc%(postfix)s(%(uword)s %%malloc.SizeU%(cnt)s)" % locals()) - self.indent("%(targetvar)s = cast sbyte* %%malloc.Ptr%(cnt)s to %(type_)s*" % locals()) + for s in self.genllvm.gcpolicy.malloc(targetvar, type_, size, atomic, self.word, self.uword).split('\n'): + self.indent(s) def getelementptr(self, targetvar, type, typevar, *indices): word = self.word Added: pypy/dist/pypy/translator/llvm/exception.py ============================================================================== 
--- (empty file) +++ pypy/dist/pypy/translator/llvm/exception.py Thu Sep 8 22:01:46 2005 @@ -0,0 +1,42 @@ +class ExceptionPolicy: + def __init__(self): + raise Exception, 'ExceptionPolicy should not be used directly' + + def llc_options(self): + return '' + + def new(exceptionpolicy=None): #factory + if exceptionpolicy is None or exceptionpolicy == 'cpython': + from pypy.translator.llvm.exception import CPythonExceptionPolicy + exceptionpolicy = CPythonExceptionPolicy() + elif exceptionpolicy == 'fast': + from pypy.translator.llvm.exception import FastExceptionPolicy + exceptionpolicy = FastExceptionPolicy() + elif exceptionpolicy == 'none': + from pypy.translator.llvm.exception import NoneExceptionPolicy + exceptionpolicy = NoneExceptionPolicy() + else: + raise Exception, 'unknown exceptionpolicy: ' + str(exceptionpolicy) + return exceptionpolicy + new = staticmethod(new) + + +class NoneExceptionPolicy(ExceptionPolicy): + def __init__(self): + pass + + +class CPythonExceptionPolicy(ExceptionPolicy): #uses issubclass() + def __init__(self): + pass + + def llc_options(self): + return '-enable-correct-eh-support' + + +class FastExceptionPolicy(ExceptionPolicy): #uses only 'direct' exception class comparision + def __init__(self): + pass + + def llc_options(self): + return '-enable-correct-eh-support' Modified: pypy/dist/pypy/translator/llvm/gc.py ============================================================================== --- pypy/dist/pypy/translator/llvm/gc.py (original) +++ pypy/dist/pypy/translator/llvm/gc.py Thu Sep 8 22:01:46 2005 @@ -5,20 +5,14 @@ def gc_libraries(self): return [] - def llvm_code(self): - return ''' -internal fastcc sbyte* %gc_malloc(uint %n) { - %nn = cast uint %n to uint - %ptr = malloc sbyte, uint %nn - ret sbyte* %ptr -} - -internal fastcc sbyte* %gc_malloc_atomic(uint %n) { - %nn = cast uint %n to uint - %ptr = malloc sbyte, uint %nn - ret sbyte* %ptr -} -''' + def declarations(self): + return '' + + def malloc(self, targetvar, type_, size, is_atomic, word, uword): + s = str(size) + if s == '0': + return '%(targetvar)s = cast %(type_)s* null to %(type_)s* ;was malloc 0 bytes' % locals() + return '%(targetvar)s = malloc %(type_)s, uint %(s)s' % locals() def pyrex_code(self): return '' @@ -50,27 +44,31 @@ class BoehmGcPolicy(GcPolicy): def __init__(self): - pass + self.n_malloced = 0 def gc_libraries(self): return ['gc'] # xxx on windows? 
- def llvm_code(self): + def declarations(self): return ''' declare ccc sbyte* %GC_malloc(uint) declare ccc sbyte* %GC_malloc_atomic(uint) - -internal fastcc sbyte* %gc_malloc(uint %n) { - %ptr = call ccc sbyte* %GC_malloc(uint %n) - ret sbyte* %ptr -} - -internal fastcc sbyte* %gc_malloc_atomic(uint %n) { - %ptr = call ccc sbyte* %GC_malloc_atomic(uint %n) - ret sbyte* %ptr -} ''' + def malloc(self, targetvar, type_, size, is_atomic, word, uword): + s = str(size) + if s == '0': + return '%(targetvar)s = cast %(type_)s* null to %(type_)s* ;was malloc 0 bytes' % locals() + self.n_malloced += 1 + cnt = '.%d' % self.n_malloced + atomic = is_atomic and '_atomic' or '' + return ''' +%%malloc.Size%(cnt)s = getelementptr %(type_)s* null, %(uword)s %(s)s +%%malloc.SizeU%(cnt)s = cast %(type_)s* %%malloc.Size%(cnt)s to %(uword)s +%%malloc.Ptr%(cnt)s = call ccc sbyte* %%GC_malloc%(atomic)s(%(uword)s %%malloc.SizeU%(cnt)s) +%(targetvar)s = cast sbyte* %%malloc.Ptr%(cnt)s to %(type_)s* + ''' % locals() + def pyrex_code(self): return ''' cdef extern int GC_get_heap_size() Modified: pypy/dist/pypy/translator/llvm/genllvm.py ============================================================================== --- pypy/dist/pypy/translator/llvm/genllvm.py (original) +++ pypy/dist/pypy/translator/llvm/genllvm.py Thu Sep 8 22:01:46 2005 @@ -21,6 +21,7 @@ from pypy.translator.llvm.structnode import StructNode from pypy.translator.llvm.externs2ll import post_setup_externs, generate_llfile from pypy.translator.llvm.gc import GcPolicy +from pypy.translator.llvm.exception import ExceptionPolicy from pypy.translator.translator import Translator @@ -31,13 +32,14 @@ class GenLLVM(object): - def __init__(self, translator, gcpolicy=None, debug=True): + def __init__(self, translator, gcpolicy=None, exceptionpolicy=None, debug=True): # reset counters LLVMNode.nodename_count = {} self.db = Database(translator) self.translator = translator self.gcpolicy = gcpolicy + self.exceptionpolicy = exceptionpolicy translator.checkgraphs() extfuncnode.ExternalFuncNode.used_external_functions = {} @@ -146,7 +148,7 @@ function_count[func.func_name] = 1 filename = udir.join(func.func_name + postfix).new(ext='.ll') f = open(str(filename),'w') - codewriter = CodeWriter(f, self.db.get_machine_word(), self.db.get_machine_uword()) + codewriter = CodeWriter(f, self) comment = codewriter.comment nl = codewriter.newline @@ -174,7 +176,7 @@ self._checkpoint('write global constants') nl(); comment("Function Prototypes") ; nl() - for extdecl in extdeclarations.split('\n'): + for extdecl in (extdeclarations+self.gcpolicy.declarations()).split('\n'): codewriter.append(extdecl) self._checkpoint('write function prototypes') @@ -185,8 +187,6 @@ nl(); comment("Function Implementation") codewriter.startimpl() - codewriter.append(self.gcpolicy.llvm_code()) - for typ_decl in self.db.getnodes(): typ_decl.writeimpl(codewriter) self._checkpoint('write implementations') @@ -277,8 +277,8 @@ def _debug_prototype(self, codewriter): codewriter.append("declare int %printf(sbyte*, ...)") -def genllvm(translator, gcpolicy=None, log_source=False, **kwds): - gen = GenLLVM(translator, GcPolicy.new(gcpolicy)) +def genllvm(translator, gcpolicy=None, exceptionpolicy=None, log_source=False, **kwds): + gen = GenLLVM(translator, GcPolicy.new(gcpolicy), ExceptionPolicy.new(exceptionpolicy)) filename = gen.gen_llvm_source() if log_source: log.genllvm(open(filename).read()) Modified: pypy/dist/pypy/translator/llvm/module/extfunction.py 
============================================================================== --- pypy/dist/pypy/translator/llvm/module/extfunction.py (original) +++ pypy/dist/pypy/translator/llvm/module/extfunction.py Thu Sep 8 22:01:46 2005 @@ -1,13 +1,7 @@ -extdeclarations = """;rpython stuff - -;gc-type dependent mallocs -declare fastcc sbyte* %gc_malloc(uint) -declare fastcc sbyte* %gc_malloc_atomic(uint) - -;exception handling globals +extdeclarations = ''' %last_exception_type = global %RPYTHON_EXCEPTION_VTABLE* null %last_exception_value = global %RPYTHON_EXCEPTION* null -""" +''' extfunctions = {} #dependencies, llvm-code From pedronis at codespeak.net Thu Sep 8 23:19:06 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Thu, 8 Sep 2005 23:19:06 +0200 (CEST) Subject: [pypy-svn] r17396 - in pypy/dist/pypy/interpreter/pyparser: . test Message-ID: <20050908211906.9F15327B6C@code1.codespeak.net> Author: pedronis Date: Thu Sep 8 23:19:04 2005 New Revision: 17396 Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py Log: use parsestring.parsestr in astbuilder, missing encoding support (disabled temporary eval_string) activate previously failing tests Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Thu Sep 8 23:19:04 2005 @@ -8,6 +8,7 @@ import pypy.interpreter.pyparser.pysymbol as sym import pypy.interpreter.pyparser.pytoken as tok from pypy.interpreter.pyparser.error import SyntaxError +from pypy.interpreter.pyparser.parsestring import parsestr DEBUG_MODE = 0 @@ -346,34 +347,34 @@ atoms.reverse() return atoms -def eval_string(value): - """temporary implementation - - FIXME: need to be finished (check compile.c (parsestr) and - stringobject.c (PyString_DecodeEscape()) for complete implementation) - """ - # return eval(value) - if len(value) == 2: - return '' - result = '' - length = len(value) - quotetype = value[0] - index = 1 - while index < length and value[index] == quotetype: - index += 1 - if index == 6: - # empty strings like """""" or '''''' - return '' - # XXX: is it RPYTHON to do this value[index:-index] - chars = [char for char in value[index:len(value)-index]] - result = ''.join(chars) - result = result.replace('\\\\', '\\') - d = {'\\b' : '\b', '\\f' : '\f', '\\t' : '\t', '\\n' : '\n', - '\\r' : '\r', '\\v' : '\v', '\\a' : '\a', - } - for escaped, value in d.items(): - result = result.replace(escaped, value) - return result +#def eval_string(value): +# """temporary implementation +# +# FIXME: need to be finished (check compile.c (parsestr) and +# stringobject.c (PyString_DecodeEscape()) for complete implementation) +# """ +# # return eval(value) +# if len(value) == 2: +# return '' +# result = '' +# length = len(value) +# quotetype = value[0] +# index = 1 +# while index < length and value[index] == quotetype: +# index += 1 +# if index == 6: +# # empty strings like """""" or '''''' +# return '' +# # XXX: is it RPYTHON to do this value[index:-index] +# chars = [char for char in value[index:len(value)-index]] +# result = ''.join(chars) +# result = result.replace('\\\\', '\\') +# d = {'\\b' : '\b', '\\f' : '\f', '\\t' : '\t', '\\n' : '\n', +# '\\r' : '\r', '\\v' : '\v', '\\a' : '\a', +# } +# for escaped, value in d.items(): +# result = result.replace(escaped, value) +# return result ## misc utilities, 
especially for power: rule @@ -490,10 +491,19 @@ elif top.name == tok.STRING: # need to concatenate strings in atoms s = '' - for token in atoms: + if len(atoms) == 1: + token = atoms[0] assert isinstance(token, TokenObject) - s += eval_string(token.get_value()) - builder.push(ast.Const(builder.wrap_string(s))) + builder.push(ast.Const(parsestr(builder.space, None, token.get_value()))) # XXX encoding + else: + space = builder.space + empty = space.wrap('') + accum = [] + for token in atoms: + assert isinstance(token, TokenObject) + accum.append(parsestr(builder.space, None, token.get_value())) # XXX encoding + w_s = space.call_method(empty, 'join', space.newlist(accum)) + builder.push(ast.Const(w_s)) elif top.name == tok.BACKQUOTE: builder.push(ast.Backquote(atoms[1])) else: Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py Thu Sep 8 23:19:04 2005 @@ -35,6 +35,7 @@ if not isinstance(left,stable_ast.Node) or not isinstance(right,ast_ast.Node): return left==right if left.__class__.__name__ != right.__class__.__name__: + print "Node type mismatch:", left, right return False if isinstance(left,stable_ast.Function) and isinstance(right,ast_ast.Function): left_nodes = list(left.getChildren()) @@ -56,20 +57,24 @@ return False elif isinstance(left,stable_ast.Const): if isinstance(right,ast_ast.Const): - return left.value == right.value + r = left.value == right.value elif isinstance(right,ast_ast.NoneConst): - return left.value == None + r = left.value == None elif isinstance(right, ast_ast.NumberConst): - return left.value == right.number_value + r = left.value == right.number_value elif isinstance(right, ast_ast.StringConst): - return left.value == right.string_value + r = left.value == right.string_value else: print "Not const type %s" % repr(right) return False + if not r: + print "Constant mismatch:", left, right + return True else: left_nodes = left.getChildren() right_nodes = right.getChildren() if len(left_nodes)!=len(right_nodes): + print "Number of children mismatch:", left, right return False for i,j in zip(left_nodes,right_nodes): if not nodes_equal(i,j): @@ -513,6 +518,7 @@ 'x = 5 ', '''"""Docstring""";print 1''', '''"Docstring"''', + '''"Docstring" "\\x00"''', ] ] @@ -546,9 +552,15 @@ def type(self, obj): return type(obj) + def newlist(self, lst): + return list(lst) + def newtuple(self, lst): return tuple(lst) + def call_method(self, obj, meth, *args): + return getattr(obj, meth)(*args) + def ast_parse_expr(expr, target='single'): target = TARGET_DICT[target] builder = AstBuilder(space=FakeSpace()) @@ -628,7 +640,7 @@ yield check_expression, source, 'exec' def test_libstuff(): - py.test.skip("failing, need to investigate") + #py.test.skip("failing, need to investigate") for snippet_name in LIBSTUFF: filepath = os.path.join(os.path.dirname(__file__), '../../../lib', snippet_name) source = file(filepath).read() @@ -651,13 +663,13 @@ def test_eval_string(): - from pypy.interpreter.pyparser.astbuilder import eval_string test = ['""', "''", '""""""', "''''''", "''' '''", '""" """', '"foo"', "'foo'", '"""\n"""', '"\\ "', '"\\n"', - # '"\""', + '"\\""', + '"\\x00"', ] for data in test: - assert eval_string(data) == eval(data) + yield check_expression, data, 'eval' def test_single_inputs(): for family in SINGLE_INPUTS: From ericvrp at codespeak.net Fri Sep 9 
10:56:21 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 9 Sep 2005 10:56:21 +0200 (CEST) Subject: [pypy-svn] r17397 - pypy/dist/pypy/translator/llvm Message-ID: <20050909085621.EC77127B7D@code1.codespeak.net> Author: ericvrp Date: Fri Sep 9 10:56:20 2005 New Revision: 17397 Modified: pypy/dist/pypy/translator/llvm/codewriter.py Log: removed optional linenumbering in ll file because we are using multiline output in way to many places and I don't want to fix that! Modified: pypy/dist/pypy/translator/llvm/codewriter.py ============================================================================== --- pypy/dist/pypy/translator/llvm/codewriter.py (original) +++ pypy/dist/pypy/translator/llvm/codewriter.py Fri Sep 9 10:56:20 2005 @@ -8,18 +8,13 @@ DEFAULT_CCONV = 'fastcc' #ccc/fastcc class CodeWriter(object): - def __init__(self, f, genllvm, show_line_number=False): + def __init__(self, f, genllvm): self.f = f self.genllvm = genllvm self.word = genllvm.db.get_machine_word() self.uword = genllvm.db.get_machine_uword() - self.show_line_numbers = show_line_number - self.n_lines = 0 def append(self, line): - self.n_lines += 1 - if self.show_line_numbers: - line = "%-75s; %d" % (line, self.n_lines) self.f.write(line + '\n') def comment(self, line, indent=True): From ericvrp at codespeak.net Fri Sep 9 10:57:53 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 9 Sep 2005 10:57:53 +0200 (CEST) Subject: [pypy-svn] r17398 - pypy/dist/pypy/translator/llvm Message-ID: <20050909085753.E2F7027B83@code1.codespeak.net> Author: ericvrp Date: Fri Sep 9 10:57:53 2005 New Revision: 17398 Modified: pypy/dist/pypy/translator/llvm/opwriter.py Log: Fix so we don't reraise an exception when we know there is a catch all Modified: pypy/dist/pypy/translator/llvm/opwriter.py ============================================================================== --- pypy/dist/pypy/translator/llvm/opwriter.py (original) +++ pypy/dist/pypy/translator/llvm/opwriter.py Fri Sep 9 10:57:53 2005 @@ -347,6 +347,7 @@ self.codewriter.label(exc_label) exc_found_labels, last_exception_type = [], None + catch_all = False for link in self.block.exits[1:]: assert issubclass(link.exitcase, Exception) @@ -373,6 +374,7 @@ not_this_exception_label = block_label + '_not_exception_' + etype.ref[1:] if current_exception_type.find('getelementptr') == -1: #XXX catch all (except:) + catch_all = True self.codewriter.br_uncond(exc_found_label) else: if not last_exception_type: @@ -386,11 +388,11 @@ [last_exception_type, current_exception_type], [lltype_of_exception_type, lltype_of_exception_type]) self.codewriter.br(ll_issubclass_cond, not_this_exception_label, exc_found_label) + self.codewriter.label(not_this_exception_label) - self.codewriter.label(not_this_exception_label) - - self.codewriter.comment('reraise when exception is not caught') - self.codewriter.unwind() + if not catch_all: + self.codewriter.comment('reraise when exception is not caught') + self.codewriter.unwind() for label, target, last_exc_type_var, last_exc_value_var in exc_found_labels: self.codewriter.label(label) From ericvrp at codespeak.net Fri Sep 9 10:58:49 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 9 Sep 2005 10:58:49 +0200 (CEST) Subject: [pypy-svn] r17399 - pypy/dist/pypy/translator/llvm Message-ID: <20050909085849.8AED627B83@code1.codespeak.net> Author: ericvrp Date: Fri Sep 9 10:58:48 2005 New Revision: 17399 Modified: pypy/dist/pypy/translator/llvm/exception.py 
pypy/dist/pypy/translator/llvm/genllvm.py Log: Refactored pyrex (module) entry function out to the exception policy. Modified: pypy/dist/pypy/translator/llvm/exception.py ============================================================================== --- pypy/dist/pypy/translator/llvm/exception.py (original) +++ pypy/dist/pypy/translator/llvm/exception.py Fri Sep 9 10:58:48 2005 @@ -1,7 +1,13 @@ +from pypy.translator.llvm.codewriter import DEFAULT_CCONV + + class ExceptionPolicy: def __init__(self): raise Exception, 'ExceptionPolicy should not be used directly' + def pyrex_entrypoint_code(self, entrynode): + return '' + def llc_options(self): return '' @@ -30,6 +36,34 @@ def __init__(self): pass + def pyrex_entrypoint_code(self, entrynode): + returntype, entrypointname = entrynode.getdecl().split('%', 1) + if returntype == 'double ': + noresult = '0.0' + elif returntype == 'bool ': + noresult = 'false' + else: + noresult = '0' + cconv = DEFAULT_CCONV + return ''' +ccc %(returntype)s%%__entrypoint__%(entrypointname)s { + %%result = invoke %(cconv)s %(returntype)s%%%(entrypointname)s to label %%no_exception except label %%exception + +no_exception: + store %%RPYTHON_EXCEPTION_VTABLE* null, %%RPYTHON_EXCEPTION_VTABLE** %%last_exception_type + ret %(returntype)s %%result + +exception: + ret %(returntype)s %(noresult)s +} + +ccc int %%__entrypoint__raised_LLVMException() { + %%tmp = load %%RPYTHON_EXCEPTION_VTABLE** %%last_exception_type + %%result = cast %%RPYTHON_EXCEPTION_VTABLE* %%tmp to int + ret int %%result +} +''' % locals() + def llc_options(self): return '-enable-correct-eh-support' @@ -38,5 +72,5 @@ def __init__(self): pass - def llc_options(self): - return '-enable-correct-eh-support' + pyrex_entrypoint_code = CPythonExceptionPolicy.pyrex_entrypoint_code + llc_options = CPythonExceptionPolicy.llc_options Modified: pypy/dist/pypy/translator/llvm/genllvm.py ============================================================================== --- pypy/dist/pypy/translator/llvm/genllvm.py (original) +++ pypy/dist/pypy/translator/llvm/genllvm.py Fri Sep 9 10:58:48 2005 @@ -154,8 +154,7 @@ if using_external_functions: nl(); comment("External Function Declarations") ; nl() - for s in llexterns_header.split('\n'): - codewriter.append(s) + codewriter.append(llexterns_header) nl(); comment("Type Declarations"); nl() for c_name, obj in extern_decls: @@ -176,8 +175,8 @@ self._checkpoint('write global constants') nl(); comment("Function Prototypes") ; nl() - for extdecl in (extdeclarations+self.gcpolicy.declarations()).split('\n'): - codewriter.append(extdecl) + codewriter.append(extdeclarations) + codewriter.append(self.gcpolicy.declarations()) self._checkpoint('write function prototypes') for typ_decl in self.db.getnodes(): @@ -191,35 +190,11 @@ typ_decl.writeimpl(codewriter) self._checkpoint('write implementations') - #XXX use codewriter methods here - decl = self.entrynode.getdecl() - t = decl.split('%', 1) - if t[0] == 'double ': #XXX I know, I know... refactor at will! 
- no_result = '0.0' - elif t[0] == 'bool ': - no_result = 'false' - else: - no_result = '0' - codewriter.newline() - codewriter.append("ccc %s%%__entrypoint__%s {" % (t[0], t[1])) - codewriter.append(" %%result = invoke %s %s%%%s to label %%no_exception except label %%exception" % (DEFAULT_CCONV, t[0], t[1])) - codewriter.newline() - codewriter.append("no_exception:") - codewriter.append(" store %RPYTHON_EXCEPTION_VTABLE* null, %RPYTHON_EXCEPTION_VTABLE** %last_exception_type") - codewriter.append(" ret %s%%result" % t[0]) - codewriter.newline() - codewriter.append("exception:") - codewriter.append(" ret %s%s" % (t[0], no_result)) - codewriter.append("}") - codewriter.newline() - codewriter.append("ccc int %__entrypoint__raised_LLVMException() {") - codewriter.append(" %tmp = load %RPYTHON_EXCEPTION_VTABLE** %last_exception_type") - codewriter.append(" %result = cast %RPYTHON_EXCEPTION_VTABLE* %tmp to int") - codewriter.append(" ret int %result") - codewriter.append("}") - codewriter.newline() + codewriter.append(self.exceptionpolicy.pyrex_entrypoint_code(self.entrynode)) # XXX we need to create our own main() that calls the actual entry_point function + decl = self.entrynode.getdecl() + t = decl.split('%', 1) entryfunc_name = t[1].split('(')[0] if entryfunc_name == 'pypy_entry_point': #XXX just to get on with translate_pypy extfuncnode.ExternalFuncNode.used_external_functions['%main'] = True @@ -236,19 +211,14 @@ deps.reverse() for dep in deps: if dep not in depdone: - try: - llvm_code = extfunctions[dep][1] - except KeyError: #external function that is shared with genc - continue - for extfunc in llvm_code.split('\n'): - codewriter.append(extfunc) + if dep in extfunctions: #else external function that is shared with genc + codewriter.append(extfunctions[dep][1]) depdone[dep] = True self._checkpoint('write support functions') if using_external_functions: nl(); comment("External Function Implementation") ; nl() - for s in llexterns_functions.split('\n'): - codewriter.append(s) + codewriter.append(llexterns_functions) self._checkpoint('write external functions') comment("End of file") ; nl() @@ -274,8 +244,6 @@ pyxfile=pyxfile, optimize=optimize) - def _debug_prototype(self, codewriter): - codewriter.append("declare int %printf(sbyte*, ...)") def genllvm(translator, gcpolicy=None, exceptionpolicy=None, log_source=False, **kwds): gen = GenLLVM(translator, GcPolicy.new(gcpolicy), ExceptionPolicy.new(exceptionpolicy)) From ac at codespeak.net Fri Sep 9 14:52:25 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Fri, 9 Sep 2005 14:52:25 +0200 (CEST) Subject: [pypy-svn] r17404 - in pypy/dist/pypy: interpreter interpreter/test translator/goal Message-ID: <20050909125225.857D327B7D@code1.codespeak.net> Author: ac Date: Fri Sep 9 14:52:25 2005 New Revision: 17404 Modified: pypy/dist/pypy/interpreter/pycompiler.py pypy/dist/pypy/interpreter/test/test_compiler.py pypy/dist/pypy/translator/goal/targetcompiler.py Log: Test compilation of some unicode literals. Make targetcompiler fully use the ast-compiler. 
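For context, the literals exercised by the new test already fail while being compiled in CPython 2.4, which is the behaviour the wrapped UnicodeDecodeError is meant to mirror. A small sketch of that reference behaviour (the exact error message is not asserted anywhere and is not part of the commit)::

    # CPython decodes unicode literals at compile time, so an out-of-range
    # \U escape is rejected before any code runs.
    try:
        compile(r"u'\Uffffffff'", '<test>', 'eval')
    except UnicodeError, e:        # UnicodeDecodeError is a subclass
        print 'rejected at compile time:', e
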
Modified: pypy/dist/pypy/interpreter/pycompiler.py ============================================================================== --- pypy/dist/pypy/interpreter/pycompiler.py (original) +++ pypy/dist/pypy/interpreter/pycompiler.py Fri Sep 9 14:52:25 2005 @@ -121,6 +121,12 @@ space.wrap(e.offset), space.wrap(e.text)])]) raise OperationError(space.w_SyntaxError, w_synerr) + except UnicodeDecodeError, e: + # TODO use a custom UnicodeError + raise OperationError(space.w_UnicodeDecodeError, space.newtuple([ + space.wrap(e.encoding), space.wrap(e.object), + space.wrap(e.start), + space.wrap(e.end), space.wrap(e.reason)])) except ValueError, e: raise OperationError(space.w_ValueError, space.wrap(str(e))) except TypeError, e: @@ -403,7 +409,6 @@ return self.compile_parse_result(ast_tree, filename, mode, flags) def compile_parse_result(self, ast_tree, filename, mode, flags): - """NOT_RPYTHON""" # __________ # XXX this uses the non-annotatable astcompiler at interp-level from pypy.interpreter import astcompiler @@ -429,17 +434,19 @@ space.wrap(e.offset), space.wrap(e.text)])]) raise OperationError(space.w_SyntaxError, w_synerr) - except UnicodeDecodeError, e: - # TODO use a custom UnicodeError - raise OperationError(space.w_UnicodeDecodeError, space.newtuple([ - space.wrap(e.encoding), space.wrap(e.object), space.wrap(e.start), - space.wrap(e.end), space.wrap(e.reason)])) +## except UnicodeDecodeError, e: +## # TODO use a custom UnicodeError +## import traceback +## traceback.print_exc() +## raise OperationError(space.w_UnicodeDecodeError, space.newtuple([ +## space.wrap(e.encoding), space.wrap(e.object), space.wrap(e.start), +## space.wrap(e.end), space.wrap(e.reason)])) except ValueError,e: - if e.__class__ != ValueError: - extra_msg = "(Really go %s)" % e.__class__.__name__ - else: - extra_msg = "" - raise OperationError(space.w_ValueError,space.wrap(str(e)+extra_msg)) + #if e.__class__ != ValueError: + # extra_msg = "(Really go %s)" % e.__class__.__name__ + #else: + # extra_msg = "" + raise OperationError(space.w_ValueError,space.wrap(str(e))) except TypeError,e: raise raise OperationError(space.w_TypeError,space.wrap(str(e))) Modified: pypy/dist/pypy/interpreter/test/test_compiler.py ============================================================================== --- pypy/dist/pypy/interpreter/test/test_compiler.py (original) +++ pypy/dist/pypy/interpreter/test/test_compiler.py Fri Sep 9 14:52:25 2005 @@ -10,6 +10,11 @@ def setup_method(self, method): self.compiler = self.space.createcompiler() + def eval_string(self, string): + space = self.space + code = self.compiler.compile(string, '<>', 'eval', 0) + return code.exec_code(space, space.newdict([]), space.newdict([])) + def test_compile(self): code = self.compiler.compile('6*7', '', 'eval', 0) assert isinstance(code, PyCode) @@ -150,6 +155,21 @@ assert not space.eq_w(w_const, space.wrap("b")) assert not space.eq_w(w_const, space.wrap("c")) + def test_unicodeliterals(self): + e = py.test.raises(OperationError, self.eval_string, "u'\\Ufffffffe'") + ex = e.value + ex.normalize_exception(self.space) + assert ex.match(self.space, self.space.w_UnicodeError) + + e = py.test.raises(OperationError, self.eval_string, "u'\\Uffffffff'") + ex = e.value + ex.normalize_exception(self.space) + assert ex.match(self.space, self.space.w_UnicodeError) + + e = py.test.raises(OperationError, self.eval_string, "u'\\U%08x'" % 0x110000) + ex = e.value + ex.normalize_exception(self.space) + assert ex.match(self.space, self.space.w_UnicodeError) class 
TestECCompiler(BaseTestCompiler): def setup_method(self, method): Modified: pypy/dist/pypy/translator/goal/targetcompiler.py ============================================================================== --- pypy/dist/pypy/translator/goal/targetcompiler.py (original) +++ pypy/dist/pypy/translator/goal/targetcompiler.py Fri Sep 9 14:52:25 2005 @@ -30,7 +30,7 @@ # for the poor translator already # XXX why can't I enable this? crashes the annotator! space = StdObjSpace(nofaking=True, - compiler="_stable", + compiler="ast", translating=True, #usemodules=['marhsal', '_sre'], geninterp=geninterp) From ludal at codespeak.net Fri Sep 9 16:11:47 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Fri, 9 Sep 2005 16:11:47 +0200 (CEST) Subject: [pypy-svn] r17405 - in pypy/dist/pypy/interpreter: . astcompiler Message-ID: <20050909141147.9FC8327B7D@code1.codespeak.net> Author: ludal Date: Fri Sep 9 16:11:45 2005 New Revision: 17405 Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py pypy/dist/pypy/interpreter/astcompiler/astgen.py pypy/dist/pypy/interpreter/astcompiler/pycodegen.py pypy/dist/pypy/interpreter/pycompiler.py Log: - move compile_parse_result from AstCompiler into compile the split being not needed because it should be fully annotable - correct some visitXXX signatures - added missing base methods to AstVisitor Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.py Fri Sep 9 16:11:45 2005 @@ -1710,6 +1710,12 @@ for child in node.getChildNodes(): child.accept(self) + def visitExpression(self, node): + return self.default(node) + + def visitEmptyNode(self, node): + return self.default(node) + def visitAdd(self, node): return self.default( node ) Modified: pypy/dist/pypy/interpreter/astcompiler/astgen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/astgen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/astgen.py Fri Sep 9 16:11:45 2005 @@ -311,6 +311,12 @@ for child in node.getChildNodes(): child.accept(self) + def visitExpression(self, node): + return self.default(node) + + def visitEmptyNode(self, node): + return self.default(node) + ''' def gen_ast_visitor(classes): Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Fri Sep 9 16:11:45 2005 @@ -944,7 +944,7 @@ opcode = callfunc_opcode_info[have_star, have_dstar] self.emitop_int(opcode, kw << 8 | pos) - def visitPrint(self, node, newline=0): + def visitPrint(self, node): self.set_lineno(node) if node.dest: node.dest.accept( self ) @@ -957,11 +957,22 @@ self.emit('PRINT_ITEM_TO') else: self.emit('PRINT_ITEM') - if node.dest and not newline: + if node.dest: self.emit('POP_TOP') def visitPrintnl(self, node): - self.visitPrint(node, newline=1) + self.set_lineno(node) + if node.dest: + node.dest.accept( self ) + for child in node.nodes: + if node.dest: + self.emit('DUP_TOP') + child.accept( self ) + if node.dest: + self.emit('ROT_TWO') + self.emit('PRINT_ITEM_TO') + else: + self.emit('PRINT_ITEM') if node.dest: self.emit('PRINT_NEWLINE_TO') else: @@ -978,8 +989,10 @@ self.emit('YIELD_VALUE') # slice and subscript stuff + def visitSlice(self, node): + 
return self._visitSlice(node, False) - def visitSlice(self, node, aug_flag=0): + def _visitSlice(self, node, aug_flag): # aug_flag is used by visitAugSlice node.expr.accept( self ) slice = 0 @@ -1005,7 +1018,10 @@ else: assert False, "weird slice %s" % node.flags - def visitSubscript(self, node, aug_flag=0): + def visitSubscript(self, node): + return self._visitSubscript(node, False) + + def _visitSubscript(self, node, aug_flag): node.expr.accept( self ) for sub in node.subs: sub.accept( self ) @@ -1362,12 +1378,12 @@ self.main.emitop('LOAD_ATTR', self.main.mangle(node.attrname)) def visitSlice(self, node): - self.main.visitSlice(node, 1) + self.main._visitSlice(node, True) def visitSubscript(self, node): if len(node.subs) > 1: raise SyntaxError( "augmented assignment to tuple is not possible" ) - self.main.visitSubscript(node, 1) + self.main._visitSubscript(node, True) class AugStoreVisitor(ast.ASTVisitor): Modified: pypy/dist/pypy/interpreter/pycompiler.py ============================================================================== --- pypy/dist/pypy/interpreter/pycompiler.py (original) +++ pypy/dist/pypy/interpreter/pycompiler.py Fri Sep 9 16:11:45 2005 @@ -394,7 +394,14 @@ """ def compile(self, source, filename, mode, flags): from pyparser.error import ParseError + from pyparser.error import SyntaxError + from pypy.interpreter import astcompiler + from pypy.interpreter.astcompiler.pycodegen import ModuleCodeGenerator + from pypy.interpreter.astcompiler.pycodegen import InteractiveCodeGenerator + from pypy.interpreter.astcompiler.pycodegen import ExpressionCodeGenerator from pyparser.pythonutil import AstBuilder, PYTHON_PARSER, TARGET_DICT + from pypy.interpreter.pycode import PyCode + flags |= __future__.generators.compiler_flag # always on (2.2 compat) space = self.space try: @@ -406,17 +413,7 @@ except ParseError, e: raise OperationError(space.w_SyntaxError, e.wrap_info(space, filename)) - return self.compile_parse_result(ast_tree, filename, mode, flags) - def compile_parse_result(self, ast_tree, filename, mode, flags): - # __________ - # XXX this uses the non-annotatable astcompiler at interp-level - from pypy.interpreter import astcompiler - from pyparser.error import SyntaxError - from pypy.interpreter.astcompiler.pycodegen import ModuleCodeGenerator - from pypy.interpreter.astcompiler.pycodegen import InteractiveCodeGenerator - from pypy.interpreter.astcompiler.pycodegen import ExpressionCodeGenerator - space = self.space try: astcompiler.misc.set_filename(filename, ast_tree) flag_names = get_flag_names(space, flags) @@ -443,15 +440,13 @@ ## space.wrap(e.end), space.wrap(e.reason)])) except ValueError,e: #if e.__class__ != ValueError: - # extra_msg = "(Really go %s)" % e.__class__.__name__ + # extra_msg = "(Really got %s)" % e.__class__.__name__ #else: # extra_msg = "" raise OperationError(space.w_ValueError,space.wrap(str(e))) except TypeError,e: raise raise OperationError(space.w_TypeError,space.wrap(str(e))) - # __________ end of XXX above - from pypy.interpreter.pycode import PyCode assert isinstance(c,PyCode) return c #compile_parse_result._annspecialcase_ = 'override:cpy_stablecompiler' From arigo at codespeak.net Fri Sep 9 16:35:29 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 9 Sep 2005 16:35:29 +0200 (CEST) Subject: [pypy-svn] r17406 - pypy/dist/pypy/annotation Message-ID: <20050909143529.8E84027B7D@code1.codespeak.net> Author: arigo Date: Fri Sep 9 16:35:27 2005 New Revision: 17406 Modified: pypy/dist/pypy/annotation/dictdef.py 
pypy/dist/pypy/annotation/listdef.py Log: Be a bit more explicit in the __repr__ of ListDef and DictDef. Modified: pypy/dist/pypy/annotation/dictdef.py ============================================================================== --- pypy/dist/pypy/annotation/dictdef.py (original) +++ pypy/dist/pypy/annotation/dictdef.py Fri Sep 9 16:35:27 2005 @@ -67,7 +67,7 @@ self.dictvalue.generalize(s_value) def __repr__(self): - return '<%r: %r>' % (self.dictkey.s_value, self.dictvalue.s_value) + return '<{%r: %r}>' % (self.dictkey.s_value, self.dictvalue.s_value) MOST_GENERAL_DICTDEF = DictDef(None, SomeObject(), SomeObject()) Modified: pypy/dist/pypy/annotation/listdef.py ============================================================================== --- pypy/dist/pypy/annotation/listdef.py (original) +++ pypy/dist/pypy/annotation/listdef.py Fri Sep 9 16:35:27 2005 @@ -91,7 +91,7 @@ self.listitem.generalize(s_value) def __repr__(self): - return '<%r>' % (self.listitem.s_value,) + return '<[%r]>' % (self.listitem.s_value,) def mutate(self): self.listitem.mutated = True From pedronis at codespeak.net Fri Sep 9 17:06:18 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 9 Sep 2005 17:06:18 +0200 (CEST) Subject: [pypy-svn] r17407 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050909150618.1FD8227B75@code1.codespeak.net> Author: pedronis Date: Fri Sep 9 17:06:16 2005 New Revision: 17407 Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py pypy/dist/pypy/interpreter/astcompiler/ast.txt pypy/dist/pypy/interpreter/astcompiler/astgen.py Log: introduce some more common bases in the ast hierarchy to avoid all attributes to migrate to Node. Some more work required on this. Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.py Fri Sep 9 17:06:16 2005 @@ -72,7 +72,58 @@ def accept(self, visitor): return visitor.visitExpression(self) -class Add(Node): +class AbstractFunction(Node): + def __init__(self, lineno=-1): + Node.__init__(self, lineno) + + def getChildren(self): + "NOT_RPYTHON" + return [] + + def getChildNodes(self): + return [] + + def __repr__(self): + return "AbstractFunction()" + + def accept(self, visitor): + return visitor.visitAbstractFunction(self) + +class AbstractTest(Node): + def __init__(self, lineno=-1): + Node.__init__(self, lineno) + + def getChildren(self): + "NOT_RPYTHON" + return [] + + def getChildNodes(self): + return [] + + def __repr__(self): + return "AbstractTest()" + + def accept(self, visitor): + return visitor.visitAbstractTest(self) + +class BinaryOp(Node): + def __init__(self, lineno=-1): + Node.__init__(self, lineno) + + def getChildren(self): + "NOT_RPYTHON" + return [] + + def getChildNodes(self): + return [] + + def __repr__(self): + return "BinaryOp()" + + def accept(self, visitor): + return visitor.visitBinaryOp(self) + +class Add(BinaryOp): def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) self.left = left @@ -91,7 +142,7 @@ def accept(self, visitor): return visitor.visitAdd(self) -class And(Node): +class And(AbstractTest): def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) self.nodes = nodes @@ -274,7 +325,24 @@ def accept(self, visitor): return visitor.visitAugAssign(self) -class Backquote(Node): +class UnaryOp(Node): + def __init__(self, lineno=-1): + Node.__init__(self, lineno) + + def 
getChildren(self): + "NOT_RPYTHON" + return [] + + def getChildNodes(self): + return [] + + def __repr__(self): + return "UnaryOp()" + + def accept(self, visitor): + return visitor.visitUnaryOp(self) + +class Backquote(UnaryOp): def __init__(self, expr, lineno=-1): Node.__init__(self, lineno) self.expr = expr @@ -292,7 +360,24 @@ def accept(self, visitor): return visitor.visitBackquote(self) -class Bitand(Node): +class BitOp(Node): + def __init__(self, lineno=-1): + Node.__init__(self, lineno) + + def getChildren(self): + "NOT_RPYTHON" + return [] + + def getChildNodes(self): + return [] + + def __repr__(self): + return "BitOp()" + + def accept(self, visitor): + return visitor.visitBitOp(self) + +class Bitand(BitOp): def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) self.nodes = nodes @@ -312,7 +397,7 @@ def accept(self, visitor): return visitor.visitBitand(self) -class Bitor(Node): +class Bitor(BitOp): def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) self.nodes = nodes @@ -332,7 +417,7 @@ def accept(self, visitor): return visitor.visitBitor(self) -class Bitxor(Node): +class Bitxor(BitOp): def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) self.nodes = nodes @@ -556,7 +641,7 @@ def accept(self, visitor): return visitor.visitDiscard(self) -class Div(Node): +class Div(BinaryOp): def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) self.left = left @@ -622,7 +707,7 @@ def accept(self, visitor): return visitor.visitExec(self) -class FloorDiv(Node): +class FloorDiv(BinaryOp): def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) self.left = left @@ -692,7 +777,7 @@ def accept(self, visitor): return visitor.visitFrom(self) -class Function(Node): +class Function(AbstractFunction): def __init__(self, decorators, name, argnames, defaults, flags, doc, code, lineno=-1): Node.__init__(self, lineno) self.decorators = decorators @@ -736,7 +821,7 @@ def accept(self, visitor): return visitor.visitFunction(self) -class GenExpr(Node): +class GenExpr(AbstractFunction): def __init__(self, code, lineno=-1): Node.__init__(self, lineno) self.code = code @@ -917,7 +1002,7 @@ def accept(self, visitor): return visitor.visitImport(self) -class Invert(Node): +class Invert(UnaryOp): def __init__(self, expr, lineno=-1): Node.__init__(self, lineno) self.expr = expr @@ -954,7 +1039,7 @@ def accept(self, visitor): return visitor.visitKeyword(self) -class Lambda(Node): +class Lambda(AbstractFunction): def __init__(self, argnames, defaults, flags, code, lineno=-1): Node.__init__(self, lineno) self.argnames = argnames @@ -990,7 +1075,7 @@ def accept(self, visitor): return visitor.visitLambda(self) -class LeftShift(Node): +class LeftShift(BinaryOp): def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) self.left = left @@ -1100,7 +1185,7 @@ def accept(self, visitor): return visitor.visitListCompIf(self) -class Mod(Node): +class Mod(BinaryOp): def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) self.left = left @@ -1138,7 +1223,7 @@ def accept(self, visitor): return visitor.visitModule(self) -class Mul(Node): +class Mul(BinaryOp): def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) self.left = left @@ -1192,7 +1277,7 @@ def accept(self, visitor): return visitor.visitNoneConst(self) -class Not(Node): +class Not(UnaryOp): def __init__(self, expr, lineno=-1): Node.__init__(self, lineno) self.expr = expr @@ -1228,7 +1313,7 @@ def accept(self, visitor): return 
visitor.visitNumberConst(self) -class Or(Node): +class Or(AbstractTest): def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) self.nodes = nodes @@ -1265,7 +1350,7 @@ def accept(self, visitor): return visitor.visitPass(self) -class Power(Node): +class Power(BinaryOp): def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) self.left = left @@ -1385,7 +1470,7 @@ def accept(self, visitor): return visitor.visitReturn(self) -class RightShift(Node): +class RightShift(BinaryOp): def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) self.left = left @@ -1494,7 +1579,7 @@ def accept(self, visitor): return visitor.visitStringConst(self) -class Sub(Node): +class Sub(BinaryOp): def __init__(self, (left, right), lineno=-1): Node.__init__(self, lineno) self.left = left @@ -1616,7 +1701,7 @@ def accept(self, visitor): return visitor.visitTuple(self) -class UnaryAdd(Node): +class UnaryAdd(UnaryOp): def __init__(self, expr, lineno=-1): Node.__init__(self, lineno) self.expr = expr @@ -1634,7 +1719,7 @@ def accept(self, visitor): return visitor.visitUnaryAdd(self) -class UnarySub(Node): +class UnarySub(UnaryOp): def __init__(self, expr, lineno=-1): Node.__init__(self, lineno) self.expr = expr @@ -1717,6 +1802,10 @@ return self.default(node) + def visitAbstractFunction(self, node): + return self.default( node ) + def visitAbstractTest(self, node): + return self.default( node ) def visitAdd(self, node): return self.default( node ) def visitAnd(self, node): @@ -1737,6 +1826,10 @@ return self.default( node ) def visitBackquote(self, node): return self.default( node ) + def visitBinaryOp(self, node): + return self.default( node ) + def visitBitOp(self, node): + return self.default( node ) def visitBitand(self, node): return self.default( node ) def visitBitor(self, node): @@ -1857,6 +1950,8 @@ return self.default( node ) def visitUnaryAdd(self, node): return self.default( node ) + def visitUnaryOp(self, node): + return self.default( node ) def visitUnarySub(self, node): return self.default( node ) def visitWhile(self, node): Modified: pypy/dist/pypy/interpreter/astcompiler/ast.txt ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.txt (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.txt Fri Sep 9 17:06:16 2005 @@ -9,8 +9,9 @@ Module: doc*, node Stmt: nodes! Decorators: nodes! -Function: decorators&, name*, argnames*, defaults!, flags*, doc*, code -Lambda: argnames*, defaults!, flags*, code +AbstractFunction: +Function(AbstractFunction): decorators&, name*, argnames*, defaults!, flags*, doc*, code +Lambda(AbstractFunction): argnames*, defaults!, flags*, code Class: name*, bases!, doc*, code Pass: Break: @@ -42,17 +43,18 @@ ListComp: expr, quals! ListCompFor: assign, list, ifs! ListCompIf: test -GenExpr: code +GenExpr(AbstractFunction): code GenExprInner: expr, quals! GenExprFor: assign, iter, ifs! GenExprIf: test List: nodes! Dict: items! -Not: expr +UnaryOp: +Not(UnaryOp): expr Compare: expr, ops! Name: varname* Global: names* -Backquote: expr +Backquote(UnaryOp): expr Getattr: expr, attrname* CallFunc: node, args!, star_args& = None, dstar_args& = None Keyword: name*, expr @@ -62,23 +64,26 @@ Slice: expr, flags*, lower&, upper& Assert: test, fail& Tuple: nodes! -Or: nodes! -And: nodes! -Bitor: nodes! -Bitxor: nodes! -Bitand: nodes! 
-LeftShift: (left, right) -RightShift: (left, right) -Add: (left, right) -Sub: (left, right) -Mul: (left, right) -Div: (left, right) -Mod: (left, right) -Power: (left, right) -FloorDiv: (left, right) -UnaryAdd: expr -UnarySub: expr -Invert: expr +AbstractTest: +Or(AbstractTest): nodes! +And(AbstractTest): nodes! +BitOp: +Bitor(BitOp): nodes! +Bitxor(BitOp): nodes! +Bitand(BitOp): nodes! +BinaryOp: +LeftShift(BinaryOp): (left, right) +RightShift(BinaryOp): (left, right) +Add(BinaryOp): (left, right) +Sub(BinaryOp): (left, right) +Mul(BinaryOp): (left, right) +Div(BinaryOp): (left, right) +Mod(BinaryOp): (left, right) +Power(BinaryOp): (left, right) +FloorDiv(BinaryOp): (left, right) +UnaryAdd(UnaryOp): expr +UnarySub(UnaryOp): expr +Invert(UnaryOp): expr init(Function): self.varargs = self.kwargs = 0 Modified: pypy/dist/pypy/interpreter/astcompiler/astgen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/astgen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/astgen.py Fri Sep 9 17:06:16 2005 @@ -58,7 +58,7 @@ if self.parent: self.parent = classes[self.parent] else: - self.parent = NodeInfo("Node","") + self.parent = Node_NodeInfo def get_argnames(self): if '(' in self.args: @@ -232,6 +232,8 @@ print >> buf, " def visit%s(self, node):" % self.name print >> buf, " return self.default( node )" +Node_NodeInfo = NodeInfo("Node","") + rx_init = re.compile('init\((.*)\):') rx_flatten_nodes = re.compile('flatten_nodes\((.*)\.(.*)\):') rx_additional_methods = re.compile('(.*)\.(.*)\((.*?)\):') @@ -331,8 +333,15 @@ print prologue print classes = parse_spec(SPEC) - for info in classes: + emitted = {Node_NodeInfo: True} + def emit(info): + if info in emitted: + return + emitted[info] = True + emit(info.parent) print info.gen_source() + for info in classes: + emit(info) gen_ast_visitor(classes) print epilogue From arigo at codespeak.net Fri Sep 9 17:07:01 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 9 Sep 2005 17:07:01 +0200 (CEST) Subject: [pypy-svn] r17408 - in pypy/dist/pypy/interpreter/pyparser: . test Message-ID: <20050909150701.886FE27B75@code1.codespeak.net> Author: arigo Date: Fri Sep 9 17:06:59 2005 New Revision: 17408 Modified: pypy/dist/pypy/interpreter/pyparser/parsestring.py pypy/dist/pypy/interpreter/pyparser/test/test_parsestring.py Log: Oups! Not only bugs related to invalid escape sequences crashing parsestring(), but also a genuine problem discovered by the test: in the escape sequence '\123', the last 3 was incorrectly re-inserted into the resulting string. 
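The fix in the diff below is easier to follow out of context: the octal branch must both bounds-check every lookahead and advance past the third digit it consumes, otherwise that digit is read a second time and appended literally. A standalone sketch of the corrected logic (names follow parsestring.py; `ps` is assumed to point just past the first octal digit and `end` one past the last character of the literal body)::

    def decode_octal_escape(s, ps, end, lis):
        # the first digit was already read as s[ps - 1]
        c = ord(s[ps - 1]) - ord('0')
        if ps < end and '0' <= s[ps] <= '7':
            c = (c << 3) + ord(s[ps]) - ord('0')
            ps += 1
            if ps < end and '0' <= s[ps] <= '7':
                c = (c << 3) + ord(s[ps]) - ord('0')
                ps += 1      # the missing advance: do not re-read this digit
        lis.append(chr(c & 0xff))   # CPython truncates octal escapes to a byte
        return ps

    out = []
    assert decode_octal_escape('\\123', 2, 4, out) == 4   # consumed all digits
    assert out == [chr(0123)]
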
Modified: pypy/dist/pypy/interpreter/pyparser/parsestring.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/parsestring.py (original) +++ pypy/dist/pypy/interpreter/pyparser/parsestring.py Fri Sep 9 17:06:59 2005 @@ -155,16 +155,17 @@ lis.append('\013') # VT elif ch == 'a': lis.append('\007') # BEL, not classic C - elif ch >= '0' and ch <= '7': + elif '0' <= ch <= '7': c = ord(s[ps - 1]) - ord('0') - if '0' <= s[ps] <= '7': + if ps < end and '0' <= s[ps] <= '7': c = (c << 3) + ord(s[ps]) - ord('0') ps += 1 - if '0' <= s[ps] <= '7': + if ps < end and '0' <= s[ps] <= '7': c = (c << 3) + ord(s[ps]) - ord('0') + ps += 1 lis.append(chr(c)) elif ch == 'x': - if isxdigit(s[ps]) and isxdigit(s[ps + 1]): + if ps+2 <= end and isxdigit(s[ps]) and isxdigit(s[ps + 1]): lis.append(chr(int(s[ps : ps + 2], 16))) ps += 2 else: Modified: pypy/dist/pypy/interpreter/pyparser/test/test_parsestring.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_parsestring.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_parsestring.py Fri Sep 9 17:06:59 2005 @@ -13,6 +13,24 @@ s = "'''hello\\x42 world'''" w_ret = parsestring.parsestr(space, None, s) assert space.str_w(w_ret) == 'hello\x42 world' + s = r'"\0"' + w_ret = parsestring.parsestr(space, None, s) + assert space.str_w(w_ret) == chr(0) + s = r'"\07"' + w_ret = parsestring.parsestr(space, None, s) + assert space.str_w(w_ret) == chr(7) + s = r'"\123"' + w_ret = parsestring.parsestr(space, None, s) + assert space.str_w(w_ret) == chr(0123) + s = r'"\x"' + space.raises_w(space.w_ValueError, parsestring.parsestr, space, None, s) + s = r'"\x7"' + space.raises_w(space.w_ValueError, parsestring.parsestr, space, None, s) + s = r'"\x7g"' + space.raises_w(space.w_ValueError, parsestring.parsestr, space, None, s) + s = r'"\xfF"' + w_ret = parsestring.parsestr(space, None, s) + assert space.str_w(w_ret) == chr(0xFF) def test_unicode(self): space = self.space From arigo at codespeak.net Fri Sep 9 17:20:10 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 9 Sep 2005 17:20:10 +0200 (CEST) Subject: [pypy-svn] r17411 - pypy/dist/pypy/interpreter/pyparser/test Message-ID: <20050909152010.30E5E27B7D@code1.codespeak.net> Author: arigo Date: Fri Sep 9 17:20:09 2005 New Revision: 17411 Added: pypy/dist/pypy/interpreter/pyparser/test/__init__.py - copied unchanged from r17408, pypy/dist/pypy/interpreter/test/__init__.py pypy/dist/pypy/interpreter/pyparser/test/autopath.py - copied unchanged from r17408, pypy/dist/pypy/interpreter/test/autopath.py pypy/dist/pypy/interpreter/pyparser/test/stdlib_testall.py Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py Log: Added a test module to exercices our ast builder and compiler on the complete standard library (lib-python/2.4.1). The test file is not called test_*.py to avoid it being run by default. (Takes too much time and crashes quite a bit at the moment.) 
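Because the module is deliberately not named test_*.py, it is not collected by default and has to be invoked explicitly: either by pointing py.test at the file, or with a tiny manual driver like the sketch below (assumes it runs from pypy/interpreter/pyparser/test so the import resolves)::

    import stdlib_testall

    # test_all() is a generative test: it yields (check_function, filename)
    # pairs, one per module found under lib-python/2.4.1.
    for check, filename in stdlib_testall.test_all():
        check(filename)     # compiles the file with both compilers and compares
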
Added: pypy/dist/pypy/interpreter/pyparser/test/stdlib_testall.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/interpreter/pyparser/test/stdlib_testall.py Fri Sep 9 17:20:09 2005 @@ -0,0 +1,16 @@ +import autopath +import py +from test_astcompiler import check_compile + + +def check_file_compile(filename): + print 'Compiling:', filename + source = open(filename).read() + check_compile(source, 'exec', quiet=True) + + +def test_all(): + p = py.path.local(autopath.pypydir).dirpath().join('lib-python', '2.4.1') + for s in p.listdir(): + if s.check(ext='.py'): + yield check_file_compile, str(s) Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py Fri Sep 9 17:20:09 2005 @@ -125,8 +125,9 @@ tuple(rcode.co_cellvars) ) return code -def check_compile(expr, target='exec'): - print "Compiling:", expr +def check_compile(expr, target='exec', quiet=False): + if not quiet: + print "Compiling:", expr sc_code = compile_with_stablecompiler(expr, target=target) ac_code = compile_with_astcompiler(expr, target=target) compare_code(ac_code, sc_code) From arigo at codespeak.net Fri Sep 9 17:28:10 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 9 Sep 2005 17:28:10 +0200 (CEST) Subject: [pypy-svn] r17412 - pypy/dist/pypy/rpython Message-ID: <20050909152810.6B23F27B7D@code1.codespeak.net> Author: arigo Date: Fri Sep 9 17:28:08 2005 New Revision: 17412 Modified: pypy/dist/pypy/rpython/objectmodel.py Log: A bet that __setattribute__() is a typo. Modified: pypy/dist/pypy/rpython/objectmodel.py ============================================================================== --- pypy/dist/pypy/rpython/objectmodel.py (original) +++ pypy/dist/pypy/rpython/objectmodel.py Fri Sep 9 17:28:08 2005 @@ -21,7 +21,7 @@ class FREED_OBJECT(object): def __getattribute__(self, attr): raise RuntimeError("trying to access freed object") - def __setattribute__(self, attr, value): + def __setattr__(self, attr, value): raise RuntimeError("trying to access freed object") From arigo at codespeak.net Fri Sep 9 17:52:08 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 9 Sep 2005 17:52:08 +0200 (CEST) Subject: [pypy-svn] r17414 - in pypy/dist/pypy: annotation rpython rpython/test Message-ID: <20050909155208.5CB0D27B7D@code1.codespeak.net> Author: arigo Date: Fri Sep 9 17:52:05 2005 New Revision: 17414 Modified: pypy/dist/pypy/annotation/unaryop.py pypy/dist/pypy/rpython/rdict.py pypy/dist/pypy/rpython/test/test_rdict.py Log: Support the method clear() on interp-level dicts. 
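The interesting detail in the ll_clear() helper added below is its fast path: a dictionary that still owns its untouched initial table needs no new allocation. A pure-Python model of that behaviour (field names follow rdict.py; the initial table size of 8 is assumed here purely for illustration)::

    STRDICT_INITSIZE = 8

    class DictModel(object):
        def __init__(self):
            self.entries = [None] * STRDICT_INITSIZE
            self.num_items = 0
            self.num_pristine_entries = STRDICT_INITSIZE

    def ll_clear_model(d):
        # never grown and no slot ever written: keep the existing table
        if len(d.entries) == d.num_pristine_entries == STRDICT_INITSIZE:
            return
        d.entries = [None] * STRDICT_INITSIZE
        d.num_items = 0
        d.num_pristine_entries = STRDICT_INITSIZE

    d = DictModel()
    ll_clear_model(d)      # no-op: the table is still pristine
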
Modified: pypy/dist/pypy/annotation/unaryop.py ============================================================================== --- pypy/dist/pypy/annotation/unaryop.py (original) +++ pypy/dist/pypy/annotation/unaryop.py Fri Sep 9 17:52:05 2005 @@ -301,6 +301,9 @@ return getbookkeeper().newlist(SomeTuple((dct.dictdef.read_key(), dct.dictdef.read_value()))) + def method_clear(dct): + pass + class __extend__(SomeString): Modified: pypy/dist/pypy/rpython/rdict.py ============================================================================== --- pypy/dist/pypy/rpython/rdict.py (original) +++ pypy/dist/pypy/rpython/rdict.py Fri Sep 9 17:52:05 2005 @@ -147,6 +147,11 @@ def rtype_method_items(self, hop): return self._rtype_method_kvi(hop, dum_items) + def rtype_method_clear(self, hop): + v_dict, = hop.inputargs(self) + hop.exception_cannot_occur() + return hop.gendirectcall(ll_clear, v_dict) + class __extend__(pairtype(StrDictRepr, rmodel.StringRepr)): def rtype_getitem((r_dict, r_string), hop): @@ -387,6 +392,14 @@ i += 1 return d +def ll_clear(d): + if len(d.entries) == d.num_pristine_entries == STRDICT_INITSIZE: + return + DICTPTR = lltype.typeOf(d) + d.entries = lltype.malloc(DICTPTR.TO.entries.TO, STRDICT_INITSIZE) + d.num_items = 0 + d.num_pristine_entries = STRDICT_INITSIZE + def ll_update(dic1, dic2): d2len =len(dic2.entries) entries = dic2.entries Modified: pypy/dist/pypy/rpython/test/test_rdict.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rdict.py (original) +++ pypy/dist/pypy/rpython/test/test_rdict.py Fri Sep 9 17:52:05 2005 @@ -41,6 +41,17 @@ res = interpret(func, [6]) assert res == 1 +def test_dict_clear(): + def func(i): + d = {'abc': i} + d['def'] = i+1 + d.clear() + d['ghi'] = i+2 + return ('abc' not in d and 'def' not in d + and d['ghi'] == i+2 and len(d) == 1) + res = interpret(func, [7]) + assert res == True + def test_empty_strings(): def func(i): d = {'' : i} From cfbolz at codespeak.net Fri Sep 9 18:09:42 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Fri, 9 Sep 2005 18:09:42 +0200 (CEST) Subject: [pypy-svn] r17418 - pypy/dist/pypy/rpython/test Message-ID: <20050909160942.5C31D27B86@code1.codespeak.net> Author: cfbolz Date: Fri Sep 9 18:09:41 2005 New Revision: 17418 Modified: pypy/dist/pypy/rpython/test/test_nongc.py Log: check that setting an attribute on a freed nongc object raises as well (thanks arigo). 
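The new assertion only passes thanks to the r17412 fix above: plain attribute assignment is routed through __setattr__, while a method named __setattribute__ has no special meaning to Python and is simply never called. A minimal illustration of the difference::

    class Broken(object):
        def __setattribute__(self, attr, value):   # not a special method
            raise RuntimeError("trying to access freed object")

    class Fixed(object):
        def __setattr__(self, attr, value):        # this one is consulted
            raise RuntimeError("trying to access freed object")

    Broken().x = 1              # silently succeeds: the guard never fires
    try:
        Fixed().x = 1
    except RuntimeError:
        print 'assignment blocked as intended'
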
Modified: pypy/dist/pypy/rpython/test/test_nongc.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_nongc.py (original) +++ pypy/dist/pypy/rpython/test/test_nongc.py Fri Sep 9 18:09:41 2005 @@ -24,6 +24,7 @@ py.test.raises(RuntimeError, "t.method1()") py.test.raises(RuntimeError, "t.method2()") py.test.raises(RuntimeError, "t.a") + py.test.raises(RuntimeError, "t.a = 1") py.test.raises(AssertionError, "free_non_gc_object(TestClass2())") def test_alloc_flavor(): From tismer at codespeak.net Fri Sep 9 18:14:22 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Fri, 9 Sep 2005 18:14:22 +0200 (CEST) Subject: [pypy-svn] r17419 - pypy/dist/pypy/rpython/test Message-ID: <20050909161422.D068B27B86@code1.codespeak.net> Author: tismer Date: Fri Sep 9 18:14:21 2005 New Revision: 17419 Modified: pypy/dist/pypy/rpython/test/test_rrange.py Log: added a test for len, this part was not tested with the ll interp Modified: pypy/dist/pypy/rpython/test/test_rrange.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rrange.py (original) +++ pypy/dist/pypy/rpython/test/test_rrange.py Fri Sep 9 18:14:21 2005 @@ -51,7 +51,7 @@ res = interpret(dummyfn, [10, 17, -2]) assert res == 15 -def test_range(): +def test_xrange(): def dummyfn(N): total = 0 for i in xrange(N): @@ -59,3 +59,11 @@ return total res = interpret(dummyfn, [10]) assert res == 45 + +def test_range_len(): + def dummyfn(start, stop): + r = range(start, stop) + return len(r) + start, stop = 10, 17 + res = interpret(dummyfn, [start, stop]) + assert res == len(range(start,stop)) From arigo at codespeak.net Fri Sep 9 18:16:05 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 9 Sep 2005 18:16:05 +0200 (CEST) Subject: [pypy-svn] r17420 - in pypy/dist/pypy/rpython: . test Message-ID: <20050909161605.3C48A27B86@code1.codespeak.net> Author: arigo Date: Fri Sep 9 18:16:03 2005 New Revision: 17420 Added: pypy/dist/pypy/rpython/test/test_objectmodel.py (contents, props changed) Modified: pypy/dist/pypy/rpython/objectmodel.py Log: An r_dict() class that will work a bit like a dict at interp-level, but containing keys over which we have control for hashing and comparison, via subclassing and overriding the key_eq() and key_hash() methods. The idea is to use it to replace the dictobject.py table lookup algorithms altogether, then annotate r_dict almost like regular dictionaries, in the same way that we annotate r_int almost like regular ints. Modified: pypy/dist/pypy/rpython/objectmodel.py ============================================================================== --- pypy/dist/pypy/rpython/objectmodel.py (original) +++ pypy/dist/pypy/rpython/objectmodel.py Fri Sep 9 18:16:03 2005 @@ -29,3 +29,96 @@ assert not getattr(obj.__class__, "_alloc_flavor_", 'gc').startswith('gc'), "trying to free gc object" obj.__dict__ = {} obj.__class__ = FREED_OBJECT + +# ____________________________________________________________ + + +class r_dict(object): + """An RPython dict-like object. + Only provides the interface supported by RPython. 
+ The methods key_eq() and key_hash() are used by the key comparison + algorithm and can be subclassed.""" + + def __init__(self): + self._dict = {} + + def __getitem__(self, key): + return self._dict[_r_dictkey(self, key)] + + def __setitem__(self, key, value): + self._dict[_r_dictkey(self, key)] = value + + def __delitem__(self, key): + del self._dict[_r_dictkey(self, key)] + + def __len__(self): + return len(self._dict) + + def __iter__(self): + for dk in self._dict: + yield dk.key + + def __contains__(self, key): + return _r_dictkey(self, key) in self._dict + + def get(self, key, default): + return self._dict.get(_r_dictkey(self, key), default) + + def copy(self): + result = self.__class__() + result.update(self) + return result + + def update(self, other): + for key, value in other.items(): + self[key] = value + + def keys(self): + return [dk.key for dk in self._dict] + + def values(self): + return self._dict.values() + + def items(self): + return [(dk.key, value) for dk, value in self._dict.items()] + + iterkeys = __iter__ + + def itervalues(self): + return self._dict.itervalues() + + def iteritems(self): + for dk, value in self._dict.items(): + yield dk.key, value + + def clear(self): + self._dict.clear() + + def key_eq(self, key1, key2): + "Called to compare two keys. Can be overridden in subclasses." + return key1 == key2 + + def key_hash(self, key): + "Called to compute the hash of a key. Can be overridden in subclasses." + return hash(key) + + def __repr__(self): + "Representation for debugging purposes." + return 'r_dict(%r)' % (dict(self.items()),) + + +class _r_dictkey(object): + __slots__ = ['dic', 'key'] + def __init__(self, dic, key): + self.dic = dic + self.key = key + def __eq__(self, other): + if not isinstance(other, _r_dictkey): + return NotImplemented + return self.dic.key_eq(self.key, other.key) + def __ne__(self, other): + if not isinstance(other, _r_dictkey): + return NotImplemented + return not self.dic.key_eq(self.key, other.key) + def __hash__(self): + return self.dic.key_hash(self.key) Added: pypy/dist/pypy/rpython/test/test_objectmodel.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/rpython/test/test_objectmodel.py Fri Sep 9 18:16:03 2005 @@ -0,0 +1,40 @@ +import py +from pypy.rpython.objectmodel import * +from pypy.rpython.test.test_llinterp import interpret + + +def test_we_are_translated(): + assert we_are_translated() == False + + def fn(): + return we_are_translated() + res = interpret(fn, []) + assert res is True + +def test_r_dict(): + class StrangeDict(r_dict): + def key_eq(self, key1, key2): + return key1[0] == key2[0] # only the 1st character is relevant + def key_hash(self, key): + return ord(key[0]) + d = StrangeDict() + d['hello'] = 42 + assert d['hi there'] == 42 + py.test.raises(KeyError, 'd["dumb"]') + assert len(d) == 1 + assert 'oops' not in d + assert list(d) == ['hello'] + assert d.get('hola', -1) == 42 + assert d.get('salut', -1) == -1 + d1 = d.copy() + del d['hu!'] + assert len(d) == 0 + assert d1.keys() == ['hello'] + d.update(d1) + assert d.values() == [42] + assert d.items() == [('hello', 42)] + assert list(d.iterkeys()) == ['hello'] + assert list(d.itervalues()) == [42] + assert list(d.iteritems()) == [('hello', 42)] + d.clear() + assert d.keys() == [] From pedronis at codespeak.net Fri Sep 9 18:20:53 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 9 Sep 2005 18:20:53 +0200 (CEST) Subject: [pypy-svn] r17421 - in 
pypy/dist/pypy/interpreter: astcompiler pyparser/test Message-ID: <20050909162053.57D4127B86@code1.codespeak.net> Author: pedronis Date: Fri Sep 9 18:20:51 2005 New Revision: 17421 Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py Log: more attribute migration control Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Fri Sep 9 18:20:51 2005 @@ -547,6 +547,7 @@ stack = [] for i, for_ in zip(range(len(node.quals)), node.quals): + assert isinstance(for_, ast.ListCompFor) start, anchor = self._visitListCompFor(for_) self.genexpr_cont_stack.append( None ) for if_ in for_.ifs: @@ -575,7 +576,6 @@ self.__list_count = self.__list_count - 1 def _visitListCompFor(self, node): - assert isinstance(node, ast.ListCompFor) start = self.newBlock() anchor = self.newBlock() @@ -599,7 +599,9 @@ def visitGenExpr(self, node): gen = GenExprCodeGenerator(self.space, node, self.scopes, self.class_name, self.get_module()) - walk(node.code, gen) + inner = node.code + assert isinstance(inner, ast.GenExprInner) + walk(inner, gen) gen.finish() self.set_lineno(node) frees = gen.scope.get_free_vars() @@ -613,7 +615,9 @@ self.emitop_int('MAKE_FUNCTION', 0) # precomputation of outmost iterable - node.code.quals[0].iter.accept( self ) + qual0 = inner.quals[0] + assert isinstance(qual0, ast.GenExprFor) + qual0.iter.accept( self ) self.emit('GET_ITER') self.emitop_int('CALL_FUNCTION', 1) @@ -623,6 +627,7 @@ stack = [] for i, for_ in zip(range(len(node.quals)), node.quals): + assert isinstance(for_, ast.GenExprFor) start, anchor = self._visitGenExprFor(for_) self.genexpr_cont_stack.append( None ) for if_ in for_.ifs: @@ -647,7 +652,6 @@ self.emitop_obj('LOAD_CONST', self.space.w_None) def _visitGenExprFor(self, node): - assert isinstance(node, ast.GenExprFor) start = self.newBlock() anchor = self.newBlock() Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py Fri Sep 9 18:20:51 2005 @@ -29,6 +29,7 @@ listmakers, dictmakers, multiexpr, + # genexps, investigate? attraccess, slices, imports, From tismer at codespeak.net Fri Sep 9 18:56:33 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Fri, 9 Sep 2005 18:56:33 +0200 (CEST) Subject: [pypy-svn] r17422 - pypy/dist/pypy/rpython Message-ID: <20050909165633.2A9D027B82@code1.codespeak.net> Author: tismer Date: Fri Sep 9 18:56:32 2005 New Revision: 17422 Modified: pypy/dist/pypy/rpython/rrange.py Log: harmless change that makes range_len usable for ll_range2list Modified: pypy/dist/pypy/rpython/rrange.py ============================================================================== --- pypy/dist/pypy/rpython/rrange.py (original) +++ pypy/dist/pypy/rpython/rrange.py Fri Sep 9 18:56:32 2005 @@ -50,20 +50,23 @@ # # Low-level methods. 
-def ll_rangelen(l, step): +def _ll_rangelen(start, stop, step): if step > 0: - result = (l.stop - l.start + (step-1)) // step + result = (stop - start + (step-1)) // step else: - result = (l.start - l.stop - (step+1)) // (-step) + result = (start - stop - (step+1)) // (-step) if result < 0: result = 0 return result +def ll_rangelen(l, step): + return _ll_rangelen(l.start, l.stop, step) + def ll_rangeitem_nonneg(l, i, step): return l.start + i*step def ll_rangeitem(l, i, step): - if i<0: + if i < 0: length = ll_rangelen(l, step) i += length return l.start + i*step @@ -99,6 +102,8 @@ rtype_builtin_xrange = rtype_builtin_range +def ll_range2list(start, stop, step): + pass # ____________________________________________________________ # # Iteration. From ale at codespeak.net Fri Sep 9 19:01:27 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Fri, 9 Sep 2005 19:01:27 +0200 (CEST) Subject: [pypy-svn] r17423 - pypy/dist/pypy/translator/goal Message-ID: <20050909170127.5C6C227B82@code1.codespeak.net> Author: ale Date: Fri Sep 9 19:01:26 2005 New Revision: 17423 Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py Log: refactored the options. removed some of the globals. Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Fri Sep 9 19:01:26 2005 @@ -1,47 +1,48 @@ #! /usr/bin/env python # -# """ Command-line options for translate_pypy: - Option groups: - Annotation: - -m --lowmem Try to save memory - -n --no_annotations Don't infer annotations - -d --debug record debug information - -i --insist Dont't stop on first error - - Specialization: - -t --specialize Don't specialize + See below +""" + +opts = { + 'Annotation':[ + ['-m', '--lowmem', 'Try to save memory', [True,False], False], + ['-n', '--no_annotations', "Don't infer annotations", [True,False], False], + ['-d', '--debug', 'record debug information', [True,False], False], + ['-i', '--insist', "Dont't stop on first error", [True,False], True]], - Backend optimisation - -o --optimize Don't optimize (should have - different name) + 'Specialization':[ + ['-t', '--specialize', "Don't specialize", [True,False], True]], - Process options: - -f fork1[fork2] --fork fork1[fork2] (UNIX) Create restartable - checkpoint after annotation - [,specialization] - -l file --load file load translator from file - -s file --save file save translator to file + 'Backend optimisation': [ + ['-o', '--optimize', "Don't optimize (should have different name)", + [True,False], True ]], + + 'Process options':[ + ['-f', '--fork', + "(UNIX) Create restartable checkpoint after annotation [,specialization]", + [['fork1','fork2']], [] ], + ['-l', '--load', "load translator from file", [str], ''], + ['-s', '--save', "save translator to file", [str], '']], - Codegeneration options: - -g gc --gc gc Garbage collector - -b be --backend be Backend selector - -c --gencode Don't generate code + 'Codegeneration options':[ + ['-g', '--gc', 'Garbage collector', ['ref', 'boehm','none'], 'ref'], + ['-b', '--backend', 'Backend selector', ['c','llvm'],'c'], + ['-w', '--gencode', "Don't generate code", [True,False], True], + ['-c', '--compile', "Don't compile generated code", [True,False], True]], - Compilation options: + 'Compilation options':[], - Run options: - -r --no_run Don't run the compiled code - -x --batch Dont run interactive helpers - Pygame 
options: - -p --pygame Dont run pygame - -H number --huge number Threshold in the number of - functions after which only a - local call graph and not a full - one is displayed -""" + 'Run options':[ + ['-r', '--run', "Don't run the compiled code", [True,False], True], + ['-x', '--batch', "Dont run interactive helpers", [True,False], False]], + 'Pygame options':[ + ['-p', '--pygame', "Dont run pygame", [True,False], True], + ['-H', '--huge', + "Threshold in the number of functions after which only a local call graph and not a full one is displayed", [int], 0 ]]} + import autopath, sys, os if '-use-snapshot' in sys.argv: @@ -100,7 +101,6 @@ # __________ Main __________ def analyse(target): - global t, entry_point, inputtypes, standalone policy = AnnotatorPolicy() if target: @@ -120,10 +120,10 @@ if standalone: ldef = listdef.ListDef(None, annmodel.SomeString()) inputtypes = [annmodel.SomeList(ldef)] - + if listen_port: run_async_server() - if not options1.no_a: + if not options1.no_annotations: print 'Annotating...' print 'with policy: %s.%s' % (policy.__class__.__module__, policy.__class__.__name__) a = t.annotate(inputtypes, policy=policy) @@ -152,6 +152,7 @@ unixcheckpoint.restartable_point(auto='run') if a: t.frozen = True # cannot freeze if we don't have annotations + return t, entry_point, inputtypes, standalone def assert_rpython_mostly_not_imported(): prefix = 'pypy.rpython.' @@ -300,46 +301,23 @@ from optparse import OptionParser parser = OptionParser() - parser.add_option("-u", "--usesnapshot", dest="snapshot", default=False, - action="store_true",help="use snapshot") - - parser.add_option("-f", "--fork", dest="fork", default=[], - action="append",help="(UNIX) Create restartable checkpoint after annotation,specialization") - parser.add_option("-m", "--lowmem", dest="lowmem", default=False, - action="store_true",help="Try to save memory") - parser.add_option("-t", "--specialize", dest="specialize", default=True, - action="store_false",help="Don't specialize") - parser.add_option("-o", "--optimize", dest="optimize", default=True, - action="store_false",help="Don't do backend optimizations") - parser.add_option("-n", "--no_annotations", dest="no_a", default=False, - action="store_true", help="Don't infer annotations") - parser.add_option("-l", "--load", dest="loadfile", - help="load translator from file") - parser.add_option("-s", "--save", dest="savefile", - help="save translator to file") - parser.add_option("-i", "--insist", dest="insist", default=True, - action="store_true", help="Don't stop on first error") - parser.add_option("-d", "--debug", dest="debug", default=False, - action="store_true", help="record debug information") - - parser.add_option("-g", "--gc", dest="gc", default="ref", - help="choose garbage collector (ref, boehm, none)") - parser.add_option("-b", "--backend", dest="backend", default='c', - help="choose backend (c, llvm, llinterpret)") - parser.add_option("-c", "--gencode", dest="really_compile", default=True, - action="store_false",help="Don't generate C code") - - parser.add_option("-r", "--no_run", dest="run", default=True, - action="store_false",help="compile but don't run") - parser.add_option("-H", "--huge", dest="huge", type="int", - help="Threshold in the number of functions after which only a local call\ - graph and not a full one is displayed") - parser.add_option("-p", "--pygame", dest="pygame", default=True, - action="store_false", help="Don't start Pygame viewer") - parser.add_option("-x", "--batch", dest="batch", default=False, - 
action="store_true",help="Don't use interactive helpers, like pdb") - (options1, args) = parser.parse_args() + for group in opts: + for option in opts[group]: + if option[-1] in [True,False]: + if option[-1] == True: + action = "store_false" + else: + action = "store_true" + parser.add_option(option[0],option[1], default=option[-1], + dest=option[1].lstrip('--'), help=option[2], action=action) + elif type(option[-2][0]) == list: + parser.add_option(option[0],option[1], default=option[-1], + dest=option[1].lstrip('--'), help=option[2], action="append") + else: + parser.add_option(option[0],option[1], default=option[-1], + dest=option[1].lstrip('--'), help=option[2]) + (options1, args) = parser.parse_args() argiter = iter(args) #sys.argv[1:]) for arg in argiter: try: @@ -351,18 +329,19 @@ targetspec = arg elif os.path.isfile(arg) and arg.endswith('.py'): targetspec = arg[:-3] - + t = None options = {} for opt in parser.option_list[1:]: options[opt.dest] = getattr(options1,opt.dest) + ## if options['-tcc']: ## os.environ['PYPY_CC'] = 'tcc -shared -o "%s.so" "%s.c"' if options1.debug: annmodel.DEBUG = True try: err = None - if options1.loadfile: - loaded_dic = load(options1.filename) + if options1.load: + loaded_dic = load(options1.load) t = loaded_dic['trans'] entry_point = t.entrypoint inputtypes = loaded_dic['inputtypes'] @@ -391,15 +370,15 @@ for name in optnames: print ' %25s: %s' %(name, options[name]) try: - analyse(targetspec_dic['target']) + t, entry_point, inputtypes, standalone = analyse(targetspec_dic['target']) except TyperError: err = sys.exc_info() print '-'*60 - if options1.savefile: - print 'saving state to %s' % options1.savefile + if options1.save: + print 'saving state to %s' % options1.save if err: print '*** this save is done after errors occured ***' - save(t, options1.savefile, + save(t, options1.save, trans=t, inputtypes=inputtypes, targetspec=targetspec, @@ -428,11 +407,11 @@ interp.eval_function(entry_point, targetspec_dic['get_llinterp_args']()) interpret() - #elif options1.gencode: - # print 'Not generating C code.' + elif not options1.gencode: + print 'Not generating C code.' else: - print 'Generating %s %s code...' %(options1.really_compile and "and compiling " or "",options1.backend) - keywords = {'really_compile' : options1.really_compile, + print 'Generating %s %s code...' %(options1.compile and "and compiling" or "",options1.backend) + keywords = {'really_compile' : options1.compile, 'standalone' : standalone, 'gcpolicy' : gcpolicy} c_entry_point = t.compile(options1.backend, **keywords) From arigo at codespeak.net Fri Sep 9 19:03:06 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 9 Sep 2005 19:03:06 +0200 (CEST) Subject: [pypy-svn] r17424 - in pypy/dist/pypy/rpython: . test Message-ID: <20050909170306.3EFCD27B82@code1.codespeak.net> Author: arigo Date: Fri Sep 9 19:03:04 2005 New Revision: 17424 Modified: pypy/dist/pypy/rpython/objectmodel.py pypy/dist/pypy/rpython/test/test_objectmodel.py Log: r_dict should not rely on subclassing. This would get messy with the annotator. Modified: pypy/dist/pypy/rpython/objectmodel.py ============================================================================== --- pypy/dist/pypy/rpython/objectmodel.py (original) +++ pypy/dist/pypy/rpython/objectmodel.py Fri Sep 9 19:03:04 2005 @@ -36,11 +36,13 @@ class r_dict(object): """An RPython dict-like object. Only provides the interface supported by RPython. 
- The methods key_eq() and key_hash() are used by the key comparison - algorithm and can be subclassed.""" + The functions key_eq() and key_hash() are used by the key comparison + algorithm.""" - def __init__(self): + def __init__(self, key_eq, key_hash): self._dict = {} + self.key_eq = key_eq + self.key_hash = key_hash def __getitem__(self, key): return self._dict[_r_dictkey(self, key)] @@ -65,7 +67,7 @@ return self._dict.get(_r_dictkey(self, key), default) def copy(self): - result = self.__class__() + result = r_dict(self.key_eq, self.key_hash) result.update(self) return result Modified: pypy/dist/pypy/rpython/test/test_objectmodel.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_objectmodel.py (original) +++ pypy/dist/pypy/rpython/test/test_objectmodel.py Fri Sep 9 19:03:04 2005 @@ -12,12 +12,11 @@ assert res is True def test_r_dict(): - class StrangeDict(r_dict): - def key_eq(self, key1, key2): - return key1[0] == key2[0] # only the 1st character is relevant - def key_hash(self, key): - return ord(key[0]) - d = StrangeDict() + def key_eq(key1, key2): + return key1[0] == key2[0] # only the 1st character is relevant + def key_hash(key): + return ord(key[0]) + d = r_dict(key_eq, key_hash) d['hello'] = 42 assert d['hi there'] == 42 py.test.raises(KeyError, 'd["dumb"]') From ludal at codespeak.net Fri Sep 9 19:17:10 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Fri, 9 Sep 2005 19:17:10 +0200 (CEST) Subject: [pypy-svn] r17425 - in pypy/dist/pypy/interpreter: astcompiler pyparser Message-ID: <20050909171710.7DEFB27B82@code1.codespeak.net> Author: ludal Date: Fri Sep 9 19:17:07 2005 New Revision: 17425 Modified: pypy/dist/pypy/interpreter/astcompiler/astgen.py pypy/dist/pypy/interpreter/astcompiler/pycodegen.py pypy/dist/pypy/interpreter/astcompiler/symbols.py pypy/dist/pypy/interpreter/pyparser/astbuilder.py Log: - make global objects for class attribute counters - remove scopes dict { Node:Scope } put scope attribute on nodes - try to make TokenObject.get_value annotated as SomeString(can_be_none=False) - remove some class attributes Modified: pypy/dist/pypy/interpreter/astcompiler/astgen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/astgen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/astgen.py Fri Sep 9 19:17:07 2005 @@ -378,6 +378,7 @@ def __init__(self, lineno = -1): self.lineno = lineno self.filename = "" + self.scope = None def getChildren(self): pass # implemented by subclasses Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Fri Sep 9 19:17:07 2005 @@ -63,13 +63,12 @@ return gen.code class AbstractCompileMode: - - mode = None # defined by subclass - def __init__(self, source, filename): self.source = source self.filename = filename self.code = None + # XXX: this attribute looks like unused anyway ??? 
+ self.mode = "" # defined by subclass def _get_tree(self): tree = parse(self.source, self.mode) @@ -84,17 +83,19 @@ return self.code class Expression(AbstractCompileMode): - - mode = "eval" - + def __init__(self, source, filename): + AbstractCompileMode.__init__(self, source, filename ) + self.mode = "eval" + def compile(self): tree = self._get_tree() gen = ExpressionCodeGenerator(tree) self.code = gen.getCode() class Interactive(AbstractCompileMode): - - mode = "single" + def __init__(self, source, filename): + AbstractCompileMode.__init__(self, source, filename ) + self.mode = "single" def compile(self): tree = self._get_tree() @@ -102,8 +103,9 @@ self.code = gen.getCode() class Module(AbstractCompileMode): - - mode = "exec" + def __init__(self, source, filename): + AbstractCompileMode.__init__(self, source, filename ) + self.mode = "exec" def compile(self, display=0): tree = self._get_tree() @@ -136,25 +138,19 @@ class CodeGenerator(ast.ASTVisitor): """Defines basic code generator for Python bytecode - - This class is an abstract base class. Concrete subclasses must - define an __init__() that defines self.graph and then calls the - __init__() defined in this class. """ - graph = None - optimized = 0 # is namespace access optimized? - __initialized = None - class_name = None # provide default for instance variable - def __init__(self, space): + def __init__(self, space, graph): self.space = space - self.checkClass() self.setups = misc.Stack() self.last_lineno = -1 self._div_op = "BINARY_DIVIDE" self.genexpr_cont_stack = [] + self.graph = graph + self.optimized = 0 # is namespace access optimized? + self.class_name = "" # provide default for instance variable # XXX set flags based on future features futures = self.get_module().futures @@ -165,10 +161,6 @@ elif feature == "generators": self.graph.setFlag(CO_GENERATOR_ALLOWED) - def checkClass(self): - """Verify that class is constructed correctly""" - assert self.graph is not None, "bad class construction for %r" % self - def emit(self, inst ): return self.graph.emit( inst ) @@ -209,15 +201,14 @@ return self.graph.getCode() def mangle(self, name): - if self.class_name is not None: + if self.class_name: return misc.mangle(name, self.class_name) else: return name def parseSymbols(self, tree): s = symbols.SymbolVisitor(self.space) - walk(tree, s) - return s.scopes + tree.accept(s) def get_module(self): raise RuntimeError, "should be implemented by subclasses" @@ -301,8 +292,9 @@ def visitModule(self, node): - self.scopes = self.parseSymbols(node) - self.scope = self.scopes[node] + self.parseSymbols(node) + assert node.scope is not None + self.scope = node.scope self.emitop_int('SET_LINENO', 0) if node.doc: self.emitop_obj('LOAD_CONST', node.doc) @@ -313,8 +305,9 @@ def visitExpression(self, node): self.set_lineno(node) - self.scopes = self.parseSymbols(node) - self.scope = self.scopes[node] + self.parseSymbols(node) + assert node.scope is not None + self.scope = node.scope node.node.accept( self ) self.emit('RETURN_VALUE') @@ -335,7 +328,7 @@ else: ndecorators = 0 - gen = FunctionCodeGenerator(self.space, node, self.scopes, isLambda, + gen = FunctionCodeGenerator(self.space, node, isLambda, self.class_name, self.get_module()) walk(node.code, gen) gen.finish() @@ -356,7 +349,7 @@ self.emitop_int('CALL_FUNCTION', 1) def visitClass(self, node): - gen = ClassCodeGenerator(self.space, node, self.scopes, + gen = ClassCodeGenerator(self.space, node, self.get_module()) walk(node.code, gen) gen.finish() @@ -597,7 +590,7 @@ self.emit('POP_TOP') def 
visitGenExpr(self, node): - gen = GenExprCodeGenerator(self.space, node, self.scopes, self.class_name, + gen = GenExprCodeGenerator(self.space, node, self.class_name, self.get_module()) inner = node.code assert isinstance(inner, ast.GenExprInner) @@ -1149,44 +1142,41 @@ class ModuleCodeGenerator(CodeGenerator): - scopes = None def __init__(self, space, tree, futures = []): - self.graph = pyassem.PyFlowGraph(space, "", tree.filename) + graph = pyassem.PyFlowGraph(space, "", tree.filename) self.futures = future.find_futures(tree) for f in futures: if f not in self.futures: self.futures.append(f) - CodeGenerator.__init__(self, space) - walk(tree, self) + CodeGenerator.__init__(self, space, graph) + tree.accept(self) # yuck def get_module(self): return self class ExpressionCodeGenerator(CodeGenerator): - scopes = None def __init__(self, space, tree, futures=[]): - self.graph = pyassem.PyFlowGraph(space, "", tree.filename) + graph = pyassem.PyFlowGraph(space, "", tree.filename) self.futures = futures[:] - CodeGenerator.__init__(self, space) - walk(tree, self) + CodeGenerator.__init__(self, space, graph) + tree.accept(self) # yuck def get_module(self): return self class InteractiveCodeGenerator(CodeGenerator): - scopes = None def __init__(self, space, tree, futures=[]): - self.graph = pyassem.PyFlowGraph(space, "", tree.filename) + graph = pyassem.PyFlowGraph(space, "", tree.filename) self.futures = future.find_futures(tree) for f in futures: if f not in self.futures: self.futures.append(f) - CodeGenerator.__init__(self, space) + CodeGenerator.__init__(self, space, graph) self.set_lineno(tree) - walk(tree, self) + tree.accept(self) # yuck self.emit('RETURN_VALUE') def get_module(self): @@ -1198,26 +1188,24 @@ node.expr.accept( self ) self.emit('PRINT_EXPR') -class AbstractFunctionCode(CodeGenerator): - optimized = 1 - lambdaCount = 0 +AbstractFunctionCodeLambdaCounter = symbols.Counter(0) - def __init__(self, space, func, scopes, isLambda, class_name, mod): +class AbstractFunctionCode(CodeGenerator): + def __init__(self, space, func, isLambda, class_name, mod): self.class_name = class_name self.module = mod if isLambda: - klass = FunctionCodeGenerator - name = "" % klass.lambdaCount - klass.lambdaCount = klass.lambdaCount + 1 + name = "" % AbstractFunctionCodeLambdaCounter.next() else: assert isinstance(func, ast.Function) name = func.name args, hasTupleArg = generateArgList(func.argnames) - self.graph = pyassem.PyFlowGraph(space, name, func.filename, args, + graph = pyassem.PyFlowGraph(space, name, func.filename, args, optimized=1) self.isLambda = isLambda - CodeGenerator.__init__(self, space) + CodeGenerator.__init__(self, space, graph) + self.optimized = 1 if not isLambda and func.doc: self.setDocstring(func.doc) @@ -1263,36 +1251,34 @@ unpackTuple = unpackSequence class FunctionCodeGenerator(AbstractFunctionCode): - scopes = None - def __init__(self, space, func, scopes, isLambda, class_name, mod): - self.scopes = scopes - self.scope = scopes[func] - AbstractFunctionCode.__init__(self, space, func, scopes, isLambda, class_name, mod) + def __init__(self, space, func, isLambda, class_name, mod): + assert func.scope is not None + self.scope = func.scope + AbstractFunctionCode.__init__(self, space, func, isLambda, class_name, mod) self.graph.setFreeVars(self.scope.get_free_vars()) self.graph.setCellVars(self.scope.get_cell_vars()) if self.scope.generator: self.graph.setFlag(CO_GENERATOR) class GenExprCodeGenerator(AbstractFunctionCode): - scopes = None - def __init__(self, space, gexp, scopes, 
class_name, mod): - self.scopes = scopes - self.scope = scopes[gexp] - AbstractFunctionCode.__init__(self, space, gexp, scopes, 1, class_name, mod) + def __init__(self, space, gexp, class_name, mod): + assert gexp.scope is not None + self.scope = gexp.scope + AbstractFunctionCode.__init__(self, space, gexp, 1, class_name, mod) self.graph.setFreeVars(self.scope.get_free_vars()) self.graph.setCellVars(self.scope.get_cell_vars()) self.graph.setFlag(CO_GENERATOR) class AbstractClassCode(CodeGenerator): - def __init__(self, space, klass, scopes, module): + def __init__(self, space, klass, module): self.class_name = klass.name self.module = module - self.graph = pyassem.PyFlowGraph( space, klass.name, klass.filename, + graph = pyassem.PyFlowGraph( space, klass.name, klass.filename, optimized=0, klass=1) - CodeGenerator.__init__(self, space) + CodeGenerator.__init__(self, space, graph) self.graph.setFlag(CO_NEWLOCALS) if klass.doc: self.setDocstring(klass.doc) @@ -1306,12 +1292,11 @@ self.emit('RETURN_VALUE') class ClassCodeGenerator(AbstractClassCode): - scopes = None - def __init__(self, space, klass, scopes, module): - self.scopes = scopes - self.scope = scopes[klass] - AbstractClassCode.__init__(self, space, klass, scopes, module) + def __init__(self, space, klass, module): + assert klass.scope is not None + self.scope = klass.scope + AbstractClassCode.__init__(self, space, klass, module) self.graph.setFreeVars(self.scope.get_free_vars()) self.graph.setCellVars(self.scope.get_cell_vars()) self.set_lineno(klass) Modified: pypy/dist/pypy/interpreter/astcompiler/symbols.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/symbols.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/symbols.py Fri Sep 9 19:17:07 2005 @@ -12,6 +12,15 @@ MANGLE_LEN = 256 +class Counter: + def __init__(self, initial): + self.count = initial + + def next(self): + i = self.count + self.count += 1 + return i + class Scope: # XXX how much information do I need about each name? 
def __init__(self, name, module, klass=None): @@ -180,12 +189,12 @@ class FunctionScope(Scope): pass +GenExprScopeCounter = Counter(1) + class GenExprScope(Scope): - __counter = 1 def __init__(self, module, klass=None): - i = self.__counter - self.__counter += 1 + i = GenExprScopeCounter.next() Scope.__init__(self, "generator expression<%d>"%i, module, klass) self.add_param('[outmost-iterable]') @@ -193,12 +202,12 @@ keys = Scope.get_names() return keys +LambdaScopeCounter = Counter(1) + class LambdaScope(FunctionScope): - __counter = 1 def __init__(self, module, klass=None): - i = self.__counter - self.__counter += 1 + i = LambdaScopeCounter.next() Scope.__init__(self, "lambda.%d" % i, module, klass) class ClassScope(Scope): @@ -209,7 +218,6 @@ class SymbolVisitor(ast.ASTVisitor): def __init__(self, space): self.space = space - self.scopes = {} self.klass = None self.scope_stack = [] self.assign_stack = [ False ] @@ -235,7 +243,7 @@ # node that define new scopes def visitModule(self, node): - scope = self.module = self.scopes[node] = ModuleScope() + scope = self.module = node.scope = ModuleScope() self.push_scope(scope) node.node.accept(self) self.pop_scope() @@ -252,7 +260,7 @@ scope = FunctionScope(node.name, self.module, self.klass) if parent.nested or isinstance(parent, FunctionScope): scope.nested = 1 - self.scopes[node] = scope + node.scope = scope self._do_args(scope, node.argnames) self.push_scope( scope ) node.code.accept(self ) @@ -266,7 +274,7 @@ or isinstance(parent, GenExprScope): scope.nested = 1 - self.scopes[node] = scope + node.scope = scope self.push_scope(scope) node.code.accept(self) self.pop_scope() @@ -302,7 +310,7 @@ scope = LambdaScope(self.module, self.klass) if parent.nested or isinstance(parent, FunctionScope): scope.nested = 1 - self.scopes[node] = scope + node.scope = scope self._do_args(scope, node.argnames) self.push_scope(scope) node.code.accept(self) @@ -333,7 +341,7 @@ if node.doc is not None: scope.add_def('__doc__') scope.add_def('__module__') - self.scopes[node] = scope + node.scope = scope prev = self.klass self.klass = node.name self.push_scope( scope ) @@ -467,6 +475,7 @@ def list_eq(l1, l2): return sort(l1) == sort(l2) + if __name__ == "__main__": import sys from pypy.interpreter.astcompiler import parseFile, walk @@ -488,7 +497,7 @@ walk(tree, s) # compare module-level symbols - names2 = s.scopes[tree].get_names() + names2 = tree.scope.get_names() if not list_eq(mod_names, names2): print @@ -498,6 +507,7 @@ sys.exit(-1) d = {} + # this part won't work anymore d.update(s.scopes) del d[tree] scopes = d.values() Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Fri Sep 9 19:17:07 2005 @@ -123,7 +123,8 @@ stack.pop() else: assert isinstance(token, TokenObject) - stack[-1].nodes.append(ast.AssName(token.get_value(),consts.OP_ASSIGN)) + val = token.get_value() + stack[-1].nodes.append(ast.AssName(val,consts.OP_ASSIGN)) return tokens_read, top def parse_arglist(tokens): @@ -153,7 +154,8 @@ assert isinstance(cur_token, TokenObject) index += 1 if cur_token.name == tok.NAME: - names.append( ast.AssName( cur_token.get_value(), consts.OP_ASSIGN ) ) + val = cur_token.get_value() + names.append( ast.AssName( val, consts.OP_ASSIGN ) ) flags |= consts.CO_VARARGS index += 1 if index >= l: @@ -171,7 +173,8 @@ index += 1 assert isinstance(cur_token, TokenObject) 
if cur_token.name == tok.NAME: - names.append( ast.AssName( cur_token.get_value(), consts.OP_ASSIGN ) ) + val = cur_token.get_value() + names.append( ast.AssName( val, consts.OP_ASSIGN ) ) flags |= consts.CO_VARKEYWORDS index += 1 else: @@ -179,7 +182,8 @@ if index < l: raise ValueError("unexpected token: %s" % tokens[index]) elif cur_token.name == tok.NAME: - names.append( ast.AssName( cur_token.get_value(), consts.OP_ASSIGN ) ) + val = cur_token.get_value() + names.append( ast.AssName( val, consts.OP_ASSIGN ) ) return names, defaults, flags @@ -407,7 +411,8 @@ token = tokens[0] # XXX HACK for when parse_attraccess is called from build_decorator if isinstance(token, TokenObject): - result = ast.Name(token.get_value()) + val = token.get_value() + result = ast.Name(val) else: result = token index = 1 @@ -485,7 +490,8 @@ items.append((atoms[index], atoms[index+2])) builder.push(ast.Dict(items)) # top.line)) elif top.name == tok.NAME: - builder.push( ast.Name(top.get_value()) ) + val = top.get_value() + builder.push( ast.Name(val) ) elif top.name == tok.NUMBER: builder.push(ast.Const(builder.eval_number(top.get_value()))) elif top.name == tok.STRING: @@ -948,6 +954,7 @@ funcname_token = atoms[1] assert isinstance(funcname_token, TokenObject) funcname = funcname_token.get_value() + assert funcname is not None arglist = atoms[2] code = atoms[-1] doc = get_docstring(builder, code) @@ -960,7 +967,7 @@ l = len(atoms) classname_token = atoms[1] assert isinstance(classname_token, TokenObject) - classname = classname_token.get_value() + classname = classname_token.get_string() if l == 4: basenames = [] body = atoms[3] @@ -1388,10 +1395,9 @@ tok.tok_name.get(self.name, str(self.name))) def get_value(self): - if self.value is None: + value = self.value + if value is None: value = '' - else: - value = self.value return value def __str__(self): From ale at codespeak.net Fri Sep 9 19:25:02 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Fri, 9 Sep 2005 19:25:02 +0200 (CEST) Subject: [pypy-svn] r17426 - pypy/dist/pypy/translator/goal Message-ID: <20050909172502.AA3CF27B82@code1.codespeak.net> Author: ale Date: Fri Sep 9 19:25:01 2005 New Revision: 17426 Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py Log: Should work with targetpypy Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Fri Sep 9 19:25:01 2005 @@ -333,7 +333,8 @@ options = {} for opt in parser.option_list[1:]: options[opt.dest] = getattr(options1,opt.dest) - + if options.get('gc') == 'boehm': + options['-boehm'] = True ## if options['-tcc']: ## os.environ['PYPY_CC'] = 'tcc -shared -o "%s.so" "%s.c"' if options1.debug: From ludal at codespeak.net Fri Sep 9 19:31:44 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Fri, 9 Sep 2005 19:31:44 +0200 (CEST) Subject: [pypy-svn] r17427 - pypy/dist/pypy/interpreter/pyparser Message-ID: <20050909173144.7A51C27B82@code1.codespeak.net> Author: ludal Date: Fri Sep 9 19:31:43 2005 New Revision: 17427 Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py Log: oopsee : revert a wannabe change that shouldn't have been checked in Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ 
pypy/dist/pypy/interpreter/pyparser/astbuilder.py Fri Sep 9 19:31:43 2005 @@ -967,7 +967,7 @@ l = len(atoms) classname_token = atoms[1] assert isinstance(classname_token, TokenObject) - classname = classname_token.get_string() + classname = classname_token.get_value() if l == 4: basenames = [] body = atoms[3] From tismer at codespeak.net Fri Sep 9 19:34:43 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Fri, 9 Sep 2005 19:34:43 +0200 (CEST) Subject: [pypy-svn] r17428 - pypy/dist/pypy/translator/c/test Message-ID: <20050909173443.271F027B82@code1.codespeak.net> Author: tismer Date: Fri Sep 9 19:34:41 2005 New Revision: 17428 Modified: pypy/dist/pypy/translator/c/test/test_typed.py Log: renamed a few completely ill-named tests Modified: pypy/dist/pypy/translator/c/test/test_typed.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_typed.py (original) +++ pypy/dist/pypy/translator/c/test/test_typed.py Fri Sep 9 19:34:41 2005 @@ -283,13 +283,13 @@ f = self.getcompiled(fn) assert f(1) == fn(1) - def test_str2int(self): + def test_int2str(self): def fn(i=int): return str(i) f = self.getcompiled(fn) assert f(1) == fn(1) - def test_float2int(self): + def test_float2str(self): def fn(i=float): return str(i) f = self.getcompiled(fn) From tismer at codespeak.net Fri Sep 9 19:41:16 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Fri, 9 Sep 2005 19:41:16 +0200 (CEST) Subject: [pypy-svn] r17429 - in pypy/dist/pypy: rpython rpython/test translator/c/test Message-ID: <20050909174116.D860B27B82@code1.codespeak.net> Author: tismer Date: Fri Sep 9 19:41:15 2005 New Revision: 17429 Modified: pypy/dist/pypy/rpython/rrange.py pypy/dist/pypy/rpython/test/test_rrange.py pypy/dist/pypy/translator/c/test/test_typed.py Log: implemented range to list conversion. Modified: pypy/dist/pypy/rpython/rrange.py ============================================================================== --- pypy/dist/pypy/rpython/rrange.py (original) +++ pypy/dist/pypy/rpython/rrange.py Fri Sep 9 19:41:15 2005 @@ -96,14 +96,24 @@ return hop.gendirectcall(ll_newrange, vstart, vstop) else: # cannot build a RANGE object, needs a real list - raise TyperError("range() result used as a normal list: " - "XXX not implemented") - #return hop.gendirectcall(ll_range2list, vstart, vstop, vstep) + r_list = hop.r_result + c1 = hop.inputconst(Void, r_list.lowleveltype) + return hop.gendirectcall(ll_range2list, c1, vstart, vstop, vstep) rtype_builtin_xrange = rtype_builtin_range -def ll_range2list(start, stop, step): - pass +def ll_range2list(LISTPTR, start, stop, step): + from pypy.rpython.rlist import ll_newlist + length = _ll_rangelen(start, stop, step) + l = ll_newlist(LISTPTR, length) + idx = 0 + items = l.items + while idx < length: + items[idx] = start + start += step + idx += 1 + return l + # ____________________________________________________________ # # Iteration. 
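The new ll_range2list helper materializes a lazy RANGE value as a real low-level list: it computes the length once via _ll_rangelen and then fills the freshly allocated items by stepping from start. A rough pure-Python equivalent of that conversion, covering only the positive-step case and written purely for illustration, would be::

    def range2list(start, stop, step):
        # same length computation as _ll_rangelen uses for step > 0
        assert step > 0
        length = (stop - start + (step - 1)) // step
        if length < 0:
            length = 0
        items = [0] * length        # stands in for ll_newlist(LISTPTR, length)
        idx = 0
        while idx < length:
            items[idx] = start      # same fill loop as ll_range2list
            start += step
            idx += 1
        return items

    assert range2list(10, 37, 4) == [10, 14, 18, 22, 26, 30, 34]

The test added below exercises this path: calling reverse() on the range result marks it as mutated, so the rtyper has to build a real list instead of keeping the lazy range.
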
Modified: pypy/dist/pypy/rpython/test/test_rrange.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rrange.py (original) +++ pypy/dist/pypy/rpython/test/test_rrange.py Fri Sep 9 19:41:15 2005 @@ -66,4 +66,14 @@ return len(r) start, stop = 10, 17 res = interpret(dummyfn, [start, stop]) - assert res == len(range(start,stop)) + assert res == dummyfn(start, stop) + +def test_range2list(): + def dummyfn(start, stop): + r = range(start, stop) + r.reverse() + return r[0] + start, stop = 10, 17 + res = interpret(dummyfn, [start, stop])#, view=True) + assert res == dummyfn(start, stop) + Modified: pypy/dist/pypy/translator/c/test/test_typed.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_typed.py (original) +++ pypy/dist/pypy/translator/c/test/test_typed.py Fri Sep 9 19:41:15 2005 @@ -349,3 +349,11 @@ for i in range(6): for j in range(6): assert f(i,j) == list_basic_ops(i,j) + + def test_range2list(self): + def fn(): + r = range(10, 37, 4) + r.reverse() + return r[0] + f = self.getcompiled(fn) + assert f() == fn() From pedronis at codespeak.net Fri Sep 9 19:47:09 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 9 Sep 2005 19:47:09 +0200 (CEST) Subject: [pypy-svn] r17430 - in pypy/dist/pypy/translator: goal tool Message-ID: <20050909174709.B177927B82@code1.codespeak.net> Author: pedronis Date: Fri Sep 9 19:47:08 2005 New Revision: 17430 Modified: pypy/dist/pypy/translator/goal/translate_pypy.py pypy/dist/pypy/translator/tool/pdbplus.py Log: make the output of readpos a bit more verbose but also useful Modified: pypy/dist/pypy/translator/goal/translate_pypy.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy.py Fri Sep 9 19:47:08 2005 @@ -693,6 +693,13 @@ func, block, i = p if flt(Pos(func, block, i)): print func.__module__ or '?', func.__name__, block, i + if i >= 0: + op = block.operations[i] + print " ", op + print " ", + for arg in op.args: + print "%s: %s" % (arg, t.annotator.binding(arg)), + print r[func] = True except self.GiveUp: return Modified: pypy/dist/pypy/translator/tool/pdbplus.py ============================================================================== --- pypy/dist/pypy/translator/tool/pdbplus.py (original) +++ pypy/dist/pypy/translator/tool/pdbplus.py Fri Sep 9 19:47:08 2005 @@ -294,6 +294,14 @@ func, block, i = p if flt(Pos(func, block, i)): print func.__module__ or '?', func.__name__, block, i + if i >= 0: + op = block.operations[i] + print " ", op + print " ", + for arg in op.args: + print "%s: %s" (arg, self.translator.getbinding(arg)), + print + r[func] = True except self.GiveUp: return From pedronis at codespeak.net Fri Sep 9 20:57:02 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 9 Sep 2005 20:57:02 +0200 (CEST) Subject: [pypy-svn] r17431 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050909185702.41DA127B82@code1.codespeak.net> Author: pedronis Date: Fri Sep 9 20:57:00 2005 New Revision: 17431 Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py pypy/dist/pypy/interpreter/astcompiler/ast.txt pypy/dist/pypy/interpreter/astcompiler/astgen.py pypy/dist/pypy/interpreter/astcompiler/future.py pypy/dist/pypy/interpreter/astcompiler/symbols.py Log: avoid last migration of attrs to Node. 
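The diffs below add an AssSeq base class shared by AssTuple and AssList, so code that has to treat both kinds of unpacking targets alike can test one class instead of two. A minimal, hypothetical helper (not part of the commit) built on the interface shown below::

    from pypy.interpreter.astcompiler import ast

    def count_unpackings(node):
        # count tuple/list unpacking targets anywhere below 'node';
        # isinstance(node, ast.AssSeq) covers AssTuple and AssList at once
        n = 0
        if isinstance(node, ast.AssSeq):
            n += 1
        for child in node.getChildNodes():
            n += count_unpackings(child)
        return n
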
Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.py Fri Sep 9 20:57:00 2005 @@ -26,6 +26,7 @@ def __init__(self, lineno = -1): self.lineno = lineno self.filename = "" + #self.scope = None def getChildren(self): pass # implemented by subclasses @@ -182,7 +183,24 @@ def accept(self, visitor): return visitor.visitAssAttr(self) -class AssList(Node): +class AssSeq(Node): + def __init__(self, lineno=-1): + Node.__init__(self, lineno) + + def getChildren(self): + "NOT_RPYTHON" + return [] + + def getChildNodes(self): + return [] + + def __repr__(self): + return "AssSeq()" + + def accept(self, visitor): + return visitor.visitAssSeq(self) + +class AssList(AssSeq): def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) self.nodes = nodes @@ -221,7 +239,7 @@ def accept(self, visitor): return visitor.visitAssName(self) -class AssTuple(Node): +class AssTuple(AssSeq): def __init__(self, nodes, lineno=-1): Node.__init__(self, lineno) self.nodes = nodes @@ -1816,6 +1834,8 @@ return self.default( node ) def visitAssName(self, node): return self.default( node ) + def visitAssSeq(self, node): + return self.default( node ) def visitAssTuple(self, node): return self.default( node ) def visitAssert(self, node): Modified: pypy/dist/pypy/interpreter/astcompiler/ast.txt ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.txt (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.txt Fri Sep 9 20:57:00 2005 @@ -36,8 +36,9 @@ Discard: expr AugAssign: node, op*, expr Assign: nodes!, expr -AssTuple: nodes! -AssList: nodes! +AssSeq: +AssTuple(AssSeq): nodes! +AssList(AssSeq): nodes! AssName: name*, flags* AssAttr: expr, attrname*, flags* ListComp: expr, quals! 
Modified: pypy/dist/pypy/interpreter/astcompiler/astgen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/astgen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/astgen.py Fri Sep 9 20:57:00 2005 @@ -378,7 +378,7 @@ def __init__(self, lineno = -1): self.lineno = lineno self.filename = "" - self.scope = None + #self.scope = None def getChildren(self): pass # implemented by subclasses Modified: pypy/dist/pypy/interpreter/astcompiler/future.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/future.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/future.py Fri Sep 9 20:57:00 2005 @@ -23,6 +23,7 @@ def visitModule(self, node): stmt = node.node invalid = False + assert isinstance(stmt, ast.Stmt) for s in stmt.nodes: if not self.check_stmt(s, invalid): invalid = True @@ -62,6 +63,7 @@ def visitModule(self, node): stmt = node.node + assert isinstance(stmt, ast.Stmt) for s in stmt.nodes: if isinstance(s, ast.From): if s.valid_future: Modified: pypy/dist/pypy/interpreter/astcompiler/symbols.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/symbols.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/symbols.py Fri Sep 9 20:57:00 2005 @@ -248,7 +248,11 @@ node.node.accept(self) self.pop_scope() - visitExpression = visitModule + def visitExpression(self, node): + scope = self.module = node.scope = ModuleScope() + self.push_scope(scope) + node.node.accept(self) + self.pop_scope() def visitFunction(self, node): parent = self.cur_scope() From arigo at codespeak.net Fri Sep 9 21:00:39 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 9 Sep 2005 21:00:39 +0200 (CEST) Subject: [pypy-svn] r17432 - in pypy/dist/pypy: annotation rpython/test Message-ID: <20050909190039.9B5AC27B82@code1.codespeak.net> Author: arigo Date: Fri Sep 9 21:00:36 2005 New Revision: 17432 Modified: pypy/dist/pypy/annotation/builtin.py pypy/dist/pypy/annotation/dictdef.py pypy/dist/pypy/annotation/listdef.py pypy/dist/pypy/rpython/test/test_objectmodel.py Log: Annotating r_dicts. 
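With these hooks the annotator accepts r_dict(key_eq, key_hash) and attaches the two callables to the dictionary's key: whenever the key annotation generalizes, emulate_rdict_calls re-checks that key_eq(key, key) yields a bool and key_hash(key) an integer, which also gets the callbacks themselves annotated with the actual key type. A minimal function of the shape this is meant to handle, mirroring the helpers already used in test_objectmodel.py::

    from pypy.rpython.objectmodel import r_dict

    def first_char_eq(key1, key2):
        return key1[0] == key2[0]    # annotated here as (str, str) -> bool

    def first_char_hash(key):
        return ord(key[0])           # must come out as an integer

    def demo():
        d = r_dict(first_char_eq, first_char_hash)
        d['hello'] = 42
        return d['hi there']         # 42, since only the first character matters
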
Modified: pypy/dist/pypy/annotation/builtin.py ============================================================================== --- pypy/dist/pypy/annotation/builtin.py (original) +++ pypy/dist/pypy/annotation/builtin.py Fri Sep 9 21:00:36 2005 @@ -9,7 +9,7 @@ from pypy.annotation.model import SomeList, SomeString, SomeTuple, SomeSlice from pypy.annotation.model import SomeUnicodeCodePoint, SomeAddress from pypy.annotation.model import SomeFloat, unionof -from pypy.annotation.model import SomePBC, SomeInstance +from pypy.annotation.model import SomePBC, SomeInstance, SomeDict from pypy.annotation.model import annotation_to_lltype from pypy.annotation.model import add_knowntypedata from pypy.annotation.bookkeeper import getbookkeeper @@ -236,6 +236,10 @@ def robjmodel_we_are_translated(): return immutablevalue(True) +def robjmodel_r_dict(s_eqfn, s_hashfn): + dictdef = getbookkeeper().getdictdef() + dictdef.dictkey.update_rdict_annotations(s_eqfn, s_hashfn) + return SomeDict(dictdef) ##def rarith_ovfcheck(s_obj): ## if isinstance(s_obj, SomeInteger) and s_obj.unsigned: @@ -275,6 +279,7 @@ BUILTIN_ANALYZERS[pypy.rpython.objectmodel.instantiate] = robjmodel_instantiate BUILTIN_ANALYZERS[pypy.rpython.objectmodel.we_are_translated] = ( robjmodel_we_are_translated) +BUILTIN_ANALYZERS[pypy.rpython.objectmodel.r_dict] = robjmodel_r_dict BUILTIN_ANALYZERS[Exception.__init__.im_func] = exception_init BUILTIN_ANALYZERS[OSError.__init__.im_func] = exception_init Modified: pypy/dist/pypy/annotation/dictdef.py ============================================================================== --- pypy/dist/pypy/annotation/dictdef.py (original) +++ pypy/dist/pypy/annotation/dictdef.py Fri Sep 9 21:00:36 2005 @@ -1,12 +1,52 @@ from pypy.annotation.model import SomeObject, SomeImpossibleValue +from pypy.annotation.model import SomeInteger, SomeBool, unionof from pypy.annotation.listdef import ListItem class DictKey(ListItem): + custom_eq_hash = False + def patch(self): for dictdef in self.itemof: dictdef.dictkey = self + def merge(self, other): + if self is not other: + assert self.custom_eq_hash == other.custom_eq_hash, ( + "mixing plain dictionaries with r_dict()") + ListItem.merge(self, other) + if self.custom_eq_hash: + self.update_rdict_annotations(other.s_rdict_eqfn, + other.s_rdict_hashfn) + + def generalize(self, s_other_value): + updated = ListItem.generalize(self, s_other_value) + if updated and self.custom_eq_hash: + self.emulate_rdict_calls() + return updated + + def update_rdict_annotations(self, s_eqfn, s_hashfn): + if not self.custom_eq_hash: + self.custom_eq_hash = True + else: + s_eqfn = unionof(s_eqfn, self.s_rdict_eqfn) + s_hashfn = unionof(s_hashfn, self.s_rdict_hashfn) + self.s_rdict_eqfn = s_eqfn + self.s_rdict_hashfn = s_hashfn + self.emulate_rdict_calls() + + def emulate_rdict_calls(self): + s_key = self.s_value + s1 = self.bookkeeper.emulate_pbc_call(self.s_rdict_eqfn, [s_key, s_key]) + assert SomeBool().contains(s1), ( + "the custom eq function of an r_dict must return a boolean" + " (got %r)" % (s1,)) + s2 = self.bookkeeper.emulate_pbc_call(self.s_rdict_hashfn, [s_key]) + assert SomeInteger().contains(s2), ( + "the custom hash function of an r_dict must return an integer" + " (got %r)" % (s2,)) + + class DictValue(ListItem): def patch(self): for dictdef in self.itemof: Modified: pypy/dist/pypy/annotation/listdef.py ============================================================================== --- pypy/dist/pypy/annotation/listdef.py (original) +++ pypy/dist/pypy/annotation/listdef.py Fri 
Sep 9 21:00:36 2005 @@ -49,11 +49,13 @@ s_new_value = unionof(self.s_value, s_other_value) if isdegenerated(s_new_value): self.bookkeeper.ondegenerated(self, s_new_value) - if s_new_value != self.s_value: + updated = s_new_value != self.s_value + if updated: self.s_value = s_new_value # reflow from all reading points for position_key in self.read_locations: self.bookkeeper.annotator.reflowfromposition(position_key) + return updated class ListDef: Modified: pypy/dist/pypy/rpython/test/test_objectmodel.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_objectmodel.py (original) +++ pypy/dist/pypy/rpython/test/test_objectmodel.py Fri Sep 9 21:00:36 2005 @@ -1,5 +1,5 @@ -import py from pypy.rpython.objectmodel import * +from pypy.translator.translator import Translator from pypy.rpython.test.test_llinterp import interpret @@ -11,15 +11,23 @@ res = interpret(fn, []) assert res is True + +def strange_key_eq(key1, key2): + return key1[0] == key2[0] # only the 1st character is relevant +def strange_key_hash(key): + return ord(key[0]) + def test_r_dict(): - def key_eq(key1, key2): - return key1[0] == key2[0] # only the 1st character is relevant - def key_hash(key): - return ord(key[0]) - d = r_dict(key_eq, key_hash) + # NB. this test function is also annotated/rtyped by the next tests + d = r_dict(strange_key_eq, strange_key_hash) d['hello'] = 42 assert d['hi there'] == 42 - py.test.raises(KeyError, 'd["dumb"]') + try: + d["dumb"] + except KeyError: + pass + else: + assert False, "should have raised" assert len(d) == 1 assert 'oops' not in d assert list(d) == ['hello'] @@ -37,3 +45,20 @@ assert list(d.iteritems()) == [('hello', 42)] d.clear() assert d.keys() == [] + return True # for the tests below + +def test_annotate_r_dict(): + t = Translator(test_r_dict) + a = t.annotate([]) + #t.view() + assert strange_key_eq in t.flowgraphs + assert strange_key_hash in t.flowgraphs + graph = t.flowgraphs[strange_key_eq] + assert a.binding(graph.getargs()[0]).knowntype == str + assert a.binding(graph.getargs()[1]).knowntype == str + graph = t.flowgraphs[strange_key_hash] + assert a.binding(graph.getargs()[0]).knowntype == str + +def INPROGRESS_test_rtype_r_dict(): + res = interpret(test_r_dict, []) + assert res is True From arigo at codespeak.net Fri Sep 9 21:49:27 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 9 Sep 2005 21:49:27 +0200 (CEST) Subject: [pypy-svn] r17433 - in pypy/dist/pypy: annotation rpython rpython/test Message-ID: <20050909194927.B2FF027B82@code1.codespeak.net> Author: arigo Date: Fri Sep 9 21:49:25 2005 New Revision: 17433 Modified: pypy/dist/pypy/annotation/model.py pypy/dist/pypy/annotation/unaryop.py pypy/dist/pypy/rpython/rdict.py pypy/dist/pypy/rpython/rlist.py pypy/dist/pypy/rpython/rmodel.py pypy/dist/pypy/rpython/rrange.py pypy/dist/pypy/rpython/rstr.py pypy/dist/pypy/rpython/rtuple.py pypy/dist/pypy/rpython/test/test_rdict.py Log: Support for the iterkeys(), itervalues() and iteritems() methods on RPython dicts. All three kinds of iterators are implementated with the same SomeIterator and StrDictIteratorRepr classes, using a 'variant' field to distinguish (and the dummy function trick in rdict.py to get the specialization). All xxxIteratorRepr classes now inherit from a common IteratorRepr, which implements the common functionality of returning the iterator itself when the 'iter' operation is applied. 
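At the low level one iterator structure serves all three methods: the rtyper passes one of the dummy functions dum_keys/dum_values/dum_items as a Void constant, and ll_strdictnext is specialized on it to return the key, the value, or a freshly allocated (key, value) tuple. A plain-Python sketch of that variant dispatch, illustrative only and not the rtyper code::

    class DictIter(object):
        def __init__(self, d, variant):
            # the variant is fixed when the iterator is created:
            # 'keys', 'values' or 'items'
            self.entries = d.items()
            self.index = 0
            self.variant = variant

        def next(self):
            if self.index >= len(self.entries):
                raise StopIteration
            key, value = self.entries[self.index]
            self.index += 1
            if self.variant == 'keys':
                return key
            elif self.variant == 'values':
                return value
            else:                    # 'items'
                return (key, value)

    it = DictIter({'hello': 6}, 'items')
    assert it.next() == ('hello', 6)

Because the dummy function is a constant per call site, the 'func is dum_items' tests in the real helper disappear after specialization.
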
Modified: pypy/dist/pypy/annotation/model.py ============================================================================== --- pypy/dist/pypy/annotation/model.py (original) +++ pypy/dist/pypy/annotation/model.py Fri Sep 9 21:49:25 2005 @@ -266,7 +266,8 @@ class SomeIterator(SomeObject): "Stands for an iterator returning objects from a given container." knowntype = type(iter([])) # arbitrarily chose seqiter as the type - def __init__(self, s_container): + def __init__(self, s_container, *variant): + self.variant = variant self.s_container = s_container def can_be_none(self): Modified: pypy/dist/pypy/annotation/unaryop.py ============================================================================== --- pypy/dist/pypy/annotation/unaryop.py (original) +++ pypy/dist/pypy/annotation/unaryop.py Fri Sep 9 21:49:25 2005 @@ -277,8 +277,16 @@ return SomeIterator(dct) iter.can_only_throw = [] - def getanyitem(dct): - return dct.dictdef.read_key() + def getanyitem(dct, variant='keys'): + if variant == 'keys': + return dct.dictdef.read_key() + elif variant == 'values': + return dct.dictdef.read_value() + elif variant == 'items': + return SomeTuple((dct.dictdef.read_key(), + dct.dictdef.read_value())) + else: + raise ValueError def method_get(dct, key, dfl): dct.dictdef.generalize_key(key) @@ -301,6 +309,15 @@ return getbookkeeper().newlist(SomeTuple((dct.dictdef.read_key(), dct.dictdef.read_value()))) + def method_iterkeys(dct): + return SomeIterator(dct, 'keys') + + def method_itervalues(dct): + return SomeIterator(dct, 'values') + + def method_iteritems(dct): + return SomeIterator(dct, 'items') + def method_clear(dct): pass @@ -374,7 +391,7 @@ iter.can_only_throw = [] def next(itr): - return itr.s_container.getanyitem() + return itr.s_container.getanyitem(*itr.variant) class __extend__(SomeInstance): Modified: pypy/dist/pypy/rpython/rdict.py ============================================================================== --- pypy/dist/pypy/rpython/rdict.py (original) +++ pypy/dist/pypy/rpython/rdict.py Fri Sep 9 21:49:25 2005 @@ -111,8 +111,8 @@ v_dict, = hop.inputargs(self) return hop.gendirectcall(ll_strdict_is_true, v_dict) - def make_iterator_repr(self): - return StrDictIteratorRepr(self) + def make_iterator_repr(self, *variant): + return StrDictIteratorRepr(self, *variant) def rtype_method_get(self, hop): v_dict, v_key, v_default = hop.inputargs(self, rstr.string_repr, @@ -147,6 +147,18 @@ def rtype_method_items(self, hop): return self._rtype_method_kvi(hop, dum_items) + def rtype_method_iterkeys(self, hop): + hop.exception_cannot_occur() + return StrDictIteratorRepr(self, "keys").newiter(hop) + + def rtype_method_itervalues(self, hop): + hop.exception_cannot_occur() + return StrDictIteratorRepr(self, "values").newiter(hop) + + def rtype_method_iteritems(self, hop): + hop.exception_cannot_occur() + return StrDictIteratorRepr(self, "items").newiter(hop) + def rtype_method_clear(self, hop): v_dict, = hop.inputargs(self) hop.exception_cannot_occur() @@ -200,6 +212,12 @@ # get flowed and annotated, mostly with SomePtr. deleted_entry_marker = lltype.malloc(rstr.STR, 0, immortal=True) +def dum_keys(): pass +def dum_values(): pass +def dum_items():pass +dum_variant = {"keys": dum_keys, + "values": dum_values, + "items": dum_items} def ll_strdict_len(d): return d.num_items @@ -328,10 +346,11 @@ # # Iteration. 
-class StrDictIteratorRepr(rmodel.Repr): +class StrDictIteratorRepr(rmodel.IteratorRepr): - def __init__(self, r_dict): + def __init__(self, r_dict, variant="keys"): self.r_dict = r_dict + self.variant = variant self.lowleveltype = lltype.Ptr(lltype.GcStruct('strdictiter', ('dict', r_dict.lowleveltype), ('index', lltype.Signed))) @@ -343,9 +362,12 @@ def rtype_next(self, hop): v_iter, = hop.inputargs(self) + r_list = hop.r_result + v_func = hop.inputconst(lltype.Void, dum_variant[self.variant]) + c1 = hop.inputconst(lltype.Void, r_list.lowleveltype) hop.has_implicit_exception(StopIteration) # record that we know about it hop.exception_is_here() - return hop.gendirectcall(ll_strdictnext, v_iter) + return hop.gendirectcall(ll_strdictnext, v_iter, v_func, c1) def ll_strdictiter(ITERPTR, d): iter = lltype.malloc(ITERPTR.TO) @@ -353,16 +375,25 @@ iter.index = 0 return iter -def ll_strdictnext(iter): +def ll_strdictnext(iter, func, RETURNTYPE): entries = iter.dict.entries index = iter.index entries_len = len(entries) while index < entries_len: - key = entries[index].key + entry = entries[index] + key = entry.key index = index + 1 if key and key != deleted_entry_marker: iter.index = index - return key + if func is dum_items: + r = lltype.malloc(RETURNTYPE.TO) + r.item0 = key + r.item1 = entry.value + return r + elif func is dum_keys: + return key + elif func is dum_values: + return entry.value iter.index = index raise StopIteration @@ -410,10 +441,6 @@ ll_strdict_setitem(dic1, entry.key, entry.value) i += 1 -def dum_keys(): pass -def dum_values(): pass -def dum_items():pass - # this is an implementation of keys(), values() and items() # in a single function. # note that by specialization on func, three different Modified: pypy/dist/pypy/rpython/rlist.py ============================================================================== --- pypy/dist/pypy/rpython/rlist.py (original) +++ pypy/dist/pypy/rpython/rlist.py Fri Sep 9 21:49:25 2005 @@ -2,6 +2,7 @@ from pypy.annotation import model as annmodel from pypy.objspace.flow.model import Constant from pypy.rpython.rmodel import Repr, TyperError, IntegerRepr, inputconst +from pypy.rpython.rmodel import IteratorRepr from pypy.rpython import rrange from pypy.rpython.rslice import SliceRepr from pypy.rpython.rslice import startstop_slice_repr, startonly_slice_repr @@ -806,7 +807,7 @@ # # Iteration. 
-class ListIteratorRepr(Repr): +class ListIteratorRepr(IteratorRepr): def __init__(self, r_list): self.r_list = r_list Modified: pypy/dist/pypy/rpython/rmodel.py ============================================================================== --- pypy/dist/pypy/rpython/rmodel.py (original) +++ pypy/dist/pypy/rpython/rmodel.py Fri Sep 9 21:49:25 2005 @@ -149,16 +149,24 @@ r_iter = self.make_iterator_repr() return r_iter.newiter(hop) - def make_iterator_repr(self): + def make_iterator_repr(self, *variant): raise TyperError("%s is not iterable" % (self,)) +class IteratorRepr(Repr): + """Base class of Reprs of any kind of iterator.""" + + def rtype_iter(self, hop): # iter(iter(x)) <==> iter(x) + v_iter, = hop.inputargs(self) + return v_iter + + class __extend__(annmodel.SomeIterator): # NOTE: SomeIterator is for iterators over any container, not just list def rtyper_makerepr(self, rtyper): r_container = rtyper.getrepr(self.s_container) - return r_container.make_iterator_repr() + return r_container.make_iterator_repr(*self.variant) def rtyper_makekey(self): - return self.__class__, self.s_container.rtyper_makekey() + return self.__class__, self.s_container.rtyper_makekey(), self.variant class __extend__(annmodel.SomeImpossibleValue): def rtyper_makerepr(self, rtyper): Modified: pypy/dist/pypy/rpython/rrange.py ============================================================================== --- pypy/dist/pypy/rpython/rrange.py (original) +++ pypy/dist/pypy/rpython/rrange.py Fri Sep 9 21:49:25 2005 @@ -1,6 +1,6 @@ from pypy.annotation.pairtype import pairtype from pypy.annotation import model as annmodel -from pypy.rpython.rmodel import Repr, TyperError, IntegerRepr +from pypy.rpython.rmodel import Repr, TyperError, IntegerRepr, IteratorRepr from pypy.rpython.lltype import Ptr, GcStruct, Signed, malloc, Void from pypy.objspace.flow.model import Constant @@ -118,7 +118,7 @@ # # Iteration. -class RangeIteratorRepr(Repr): +class RangeIteratorRepr(IteratorRepr): lowleveltype = Ptr(RANGEITER) def __init__(self, r_rng): Modified: pypy/dist/pypy/rpython/rstr.py ============================================================================== --- pypy/dist/pypy/rpython/rstr.py (original) +++ pypy/dist/pypy/rpython/rstr.py Fri Sep 9 21:49:25 2005 @@ -1,7 +1,7 @@ from weakref import WeakValueDictionary from pypy.annotation.pairtype import pairtype from pypy.annotation import model as annmodel -from pypy.rpython.rmodel import Repr, TyperError, IntegerRepr +from pypy.rpython.rmodel import Repr, TyperError, IntegerRepr, IteratorRepr from pypy.rpython.rmodel import StringRepr, CharRepr, inputconst, UniCharRepr from pypy.rpython.rarithmetic import intmask, _hash_string from pypy.rpython.robject import PyObjRepr, pyobj_repr @@ -994,7 +994,7 @@ # # Iteration. 
-class StringIteratorRepr(Repr): +class StringIteratorRepr(IteratorRepr): lowleveltype = Ptr(GcStruct('stringiter', ('string', string_repr.lowleveltype), ('index', Signed))) Modified: pypy/dist/pypy/rpython/rtuple.py ============================================================================== --- pypy/dist/pypy/rpython/rtuple.py (original) +++ pypy/dist/pypy/rpython/rtuple.py Fri Sep 9 21:49:25 2005 @@ -2,6 +2,7 @@ from pypy.annotation import model as annmodel from pypy.objspace.flow.model import Constant from pypy.rpython.rmodel import Repr, TyperError, IntegerRepr, inputconst +from pypy.rpython.rmodel import IteratorRepr from pypy.rpython.robject import PyObjRepr, pyobj_repr from pypy.rpython.lltype import Ptr, GcStruct, Void, Signed, malloc from pypy.rpython.lltype import typeOf, nullptr @@ -183,7 +184,7 @@ # # Iteration. -class Length1TupleIteratorRepr(Repr): +class Length1TupleIteratorRepr(IteratorRepr): def __init__(self, r_tuple): self.r_tuple = r_tuple Modified: pypy/dist/pypy/rpython/test/test_rdict.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rdict.py (original) +++ pypy/dist/pypy/rpython/test/test_rdict.py Fri Sep 9 21:49:25 2005 @@ -163,6 +163,23 @@ res = interpret(func, [6, 7]) assert res == 42 +def test_dict_itermethods(): + def func(): + d = {} + d['hello'] = 6 + d['world'] = 7 + k1 = k2 = k3 = 1 + for key in d.iterkeys(): + k1 = k1 * d[key] + for value in d.itervalues(): + k2 = k2 * value + for key, value in d.iteritems(): + assert d[key] == value + k3 = k3 * value + return k1 + k2 + k3 + res = interpret(func, []) + assert res == 42 + 42 + 42 + def test_two_dicts_with_different_value_types(): def func(i): d1 = {} From pedronis at codespeak.net Sat Sep 10 00:30:14 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sat, 10 Sep 2005 00:30:14 +0200 (CEST) Subject: [pypy-svn] r17434 - in pypy/dist/pypy: annotation translator/test Message-ID: <20050909223014.BB81D27B84@code1.codespeak.net> Author: pedronis Date: Sat Sep 10 00:30:12 2005 New Revision: 17434 Modified: pypy/dist/pypy/annotation/unaryop.py pypy/dist/pypy/translator/test/test_annrpython.py Log: the old ordering could provoke contains assert failures Modified: pypy/dist/pypy/annotation/unaryop.py ============================================================================== --- pypy/dist/pypy/annotation/unaryop.py (original) +++ pypy/dist/pypy/annotation/unaryop.py Sat Sep 10 00:30:12 2005 @@ -56,13 +56,13 @@ return r def issubtype(obj, s_cls): - if obj.is_constant() and s_cls.is_constant(): - return immutablevalue(issubclass(obj.const, s_cls.const)) if hasattr(obj, 'is_type_of'): vars = obj.is_type_of annotator = getbookkeeper().annotator return builtin.builtin_isinstance(annotator.binding(vars[0]), s_cls, vars) + if obj.is_constant() and s_cls.is_constant(): + return immutablevalue(issubclass(obj.const, s_cls.const)) return SomeBool() def len(obj): Modified: pypy/dist/pypy/translator/test/test_annrpython.py ============================================================================== --- pypy/dist/pypy/translator/test/test_annrpython.py (original) +++ pypy/dist/pypy/translator/test/test_annrpython.py Sat Sep 10 00:30:12 2005 @@ -1558,6 +1558,44 @@ assert s.knowntype == bool assert not s.is_constant() + def test_issubtype_and_const(self): + class A(object): + pass + class B(object): + pass + class C(A): + pass + b = B() + c = C() + def g(f): + if f == 1: + x = b + elif f == 2: + x = c + else: + x = C() + t = type(x) + 
return issubclass(t, A) + + def f(): + x = g(1) + y = g(0) + return x or y + a = self.RPythonAnnotator() + s = a.build_types(f, []) + assert s.knowntype == bool + assert not s.is_constant() + a = self.RPythonAnnotator() + # sanity check + x = annmodel.SomeInteger() + x.const = 1 + s = a.build_types(g, [x]) + assert s.const == False + a = self.RPythonAnnotator() + x = annmodel.SomeInteger() + x.const = 2 + s = a.build_types(g, [x]) + assert s.const == True def g(n): return [0,1,2,n] From pedronis at codespeak.net Sat Sep 10 00:45:28 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sat, 10 Sep 2005 00:45:28 +0200 (CEST) Subject: [pypy-svn] r17435 - in pypy/dist/pypy: annotation module/thread/rpython rpython/test Message-ID: <20050909224528.7EF5C27B82@code1.codespeak.net> Author: pedronis Date: Sat Sep 10 00:45:25 2005 New Revision: 17435 Modified: pypy/dist/pypy/annotation/bookkeeper.py pypy/dist/pypy/annotation/dictdef.py pypy/dist/pypy/module/thread/rpython/exttable.py pypy/dist/pypy/rpython/test/test_objectmodel.py Log: extend emulate_pbc_call such that it can work with methods. in the general case the pbc itself cannot be used as key for the emulated_pbc_calls table, let the callers of the helpers specify a suitable unique key (and possibly whether the new information is also replacing that for some previous/other set of keys) Modified: pypy/dist/pypy/annotation/bookkeeper.py ============================================================================== --- pypy/dist/pypy/annotation/bookkeeper.py (original) +++ pypy/dist/pypy/annotation/bookkeeper.py Sat Sep 10 00:45:25 2005 @@ -229,8 +229,7 @@ self.consider_pbc_call(pbc, shape, spaceop) self.pbc_call_sites = {} - for fn, shape in self.emulated_pbc_calls.iteritems(): - pbc = SomePBC({fn: True}) + for pbc, shape in self.emulated_pbc_calls.itervalues(): self.consider_pbc_call(pbc, shape) self.emulated_pbc_calls = {} @@ -546,16 +545,19 @@ return unionof(*results) - def emulate_pbc_call(self, pbc, args_s): + def emulate_pbc_call(self, unique_key, pbc, args_s, replace=[]): args = self.build_args("simple_call", args_s) shape = args.rawshape() - for func, classdef in pbc.prebuiltinstances.items(): - if func is not None: - assert not isclassdef(classdef) - if func in self.emulated_pbc_calls: - assert shape == self.emulated_pbc_calls[func] - else: - self.emulated_pbc_calls[func] = shape + emulated_pbc_calls = self.emulated_pbc_calls + prev = [unique_key] + prev.extend(replace) + for other_key in prev: + if other_key in emulated_pbc_calls: + pbc, old_shape = emulated_pbc_calls[other_key] + assert shape == old_shape + del emulated_pbc_calls[other_key] + emulated_pbc_calls[unique_key] = pbc, shape + return self.pbc_call(pbc, args, True) # decide_callable(position, func, args, mono) -> callb, key Modified: pypy/dist/pypy/annotation/dictdef.py ============================================================================== --- pypy/dist/pypy/annotation/dictdef.py (original) +++ pypy/dist/pypy/annotation/dictdef.py Sat Sep 10 00:45:25 2005 @@ -17,7 +17,8 @@ ListItem.merge(self, other) if self.custom_eq_hash: self.update_rdict_annotations(other.s_rdict_eqfn, - other.s_rdict_hashfn) + other.s_rdict_hashfn, + other=other) def generalize(self, s_other_value): updated = ListItem.generalize(self, s_other_value) @@ -25,7 +26,7 @@ self.emulate_rdict_calls() return updated - def update_rdict_annotations(self, s_eqfn, s_hashfn): + def update_rdict_annotations(self, s_eqfn, s_hashfn, other=None): if not self.custom_eq_hash: self.custom_eq_hash 
= True else: @@ -33,15 +34,25 @@ s_hashfn = unionof(s_hashfn, self.s_rdict_hashfn) self.s_rdict_eqfn = s_eqfn self.s_rdict_hashfn = s_hashfn - self.emulate_rdict_calls() + self.emulate_rdict_calls(other=other) + + def emulate_rdict_calls(self, other=None): + myeq = (self, 'eq') + myhash = (self, 'hash') + if other: + replace_othereq = [(other, 'eq')] + replace_otherhash = [(other, 'hash')] + else: + replace_othereq = replace_otherhash = () - def emulate_rdict_calls(self): s_key = self.s_value - s1 = self.bookkeeper.emulate_pbc_call(self.s_rdict_eqfn, [s_key, s_key]) + s1 = self.bookkeeper.emulate_pbc_call(myeq, self.s_rdict_eqfn, [s_key, s_key], + replace=replace_othereq) assert SomeBool().contains(s1), ( "the custom eq function of an r_dict must return a boolean" " (got %r)" % (s1,)) - s2 = self.bookkeeper.emulate_pbc_call(self.s_rdict_hashfn, [s_key]) + s2 = self.bookkeeper.emulate_pbc_call(myhash, self.s_rdict_hashfn, [s_key], + replace=replace_otherhash) assert SomeInteger().contains(s2), ( "the custom hash function of an r_dict must return an integer" " (got %r)" % (s2,)) Modified: pypy/dist/pypy/module/thread/rpython/exttable.py ============================================================================== --- pypy/dist/pypy/module/thread/rpython/exttable.py (original) +++ pypy/dist/pypy/module/thread/rpython/exttable.py Sat Sep 10 00:45:25 2005 @@ -29,7 +29,7 @@ length 1 for arg""") s_arg, = s_argument_tuple.items # XXX hack hack hack: emulate a call to s_bootstrap_function - s_result = bookkeeper.emulate_pbc_call(s_bootstrap_function, [s_arg]) + s_result = bookkeeper.emulate_pbc_call(bookkeeper.position_key, s_bootstrap_function, [s_arg]) assert bookkeeper.getpbc(None).contains(s_result), ( """thread.start_new_thread(f, arg): f() should return None""") return annmodel.SomeInteger() Modified: pypy/dist/pypy/rpython/test/test_objectmodel.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_objectmodel.py (original) +++ pypy/dist/pypy/rpython/test/test_objectmodel.py Sat Sep 10 00:45:25 2005 @@ -17,9 +17,7 @@ def strange_key_hash(key): return ord(key[0]) -def test_r_dict(): - # NB. this test function is also annotated/rtyped by the next tests - d = r_dict(strange_key_eq, strange_key_hash) +def play_with_r_dict(d): d['hello'] = 42 assert d['hi there'] == 42 try: @@ -47,6 +45,24 @@ assert d.keys() == [] return True # for the tests below + +def test_r_dict(): + # NB. this test function is also annotated/rtyped by the next tests + d = r_dict(strange_key_eq, strange_key_hash) + return play_with_r_dict(d) + +class Strange: + def key_eq(strange, key1, key2): + return key1[0] == key2[0] # only the 1st character is relevant + def key_hash(strange, key): + return ord(key[0]) + +def test_r_dict_bm(): + # NB. 
this test function is also annotated by the next tests + strange = Strange() + d = r_dict(strange.key_eq, strange.key_hash) + return play_with_r_dict(d) + def test_annotate_r_dict(): t = Translator(test_r_dict) a = t.annotate([]) @@ -59,6 +75,23 @@ graph = t.flowgraphs[strange_key_hash] assert a.binding(graph.getargs()[0]).knowntype == str +def test_annotate_r_dict_bm(): + t = Translator(test_r_dict_bm) + a = t.annotate([]) + #t.view() + strange_key_eq = Strange.key_eq.im_func + strange_key_hash = Strange.key_hash.im_func + + assert strange_key_eq in t.flowgraphs + assert strange_key_hash in t.flowgraphs + graph = t.flowgraphs[strange_key_eq] + assert a.binding(graph.getargs()[0]).knowntype == Strange + assert a.binding(graph.getargs()[1]).knowntype == str + assert a.binding(graph.getargs()[2]).knowntype == str + graph = t.flowgraphs[strange_key_hash] + assert a.binding(graph.getargs()[0]).knowntype == Strange + assert a.binding(graph.getargs()[1]).knowntype == str + def INPROGRESS_test_rtype_r_dict(): res = interpret(test_r_dict, []) assert res is True From tismer at codespeak.net Sat Sep 10 01:10:41 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Sat, 10 Sep 2005 01:10:41 +0200 (CEST) Subject: [pypy-svn] r17436 - in pypy/dist/pypy: rpython rpython/test translator/c/test Message-ID: <20050909231041.6A5D927B84@code1.codespeak.net> Author: tismer Date: Sat Sep 10 01:10:39 2005 New Revision: 17436 Modified: pypy/dist/pypy/rpython/rlist.py pypy/dist/pypy/rpython/rrange.py pypy/dist/pypy/rpython/test/test_rrange.py pypy/dist/pypy/translator/c/test/test_typed.py Log: added specialization for index checking to (x)range. reversed dependencies of imports. Modified: pypy/dist/pypy/rpython/rlist.py ============================================================================== --- pypy/dist/pypy/rpython/rlist.py (original) +++ pypy/dist/pypy/rpython/rlist.py Sat Sep 10 01:10:39 2005 @@ -3,7 +3,6 @@ from pypy.objspace.flow.model import Constant from pypy.rpython.rmodel import Repr, TyperError, IntegerRepr, inputconst from pypy.rpython.rmodel import IteratorRepr -from pypy.rpython import rrange from pypy.rpython.rslice import SliceRepr from pypy.rpython.rslice import startstop_slice_repr, startonly_slice_repr from pypy.rpython.rslice import minusone_slice_repr @@ -29,6 +28,7 @@ class __extend__(annmodel.SomeList): def rtyper_makerepr(self, rtyper): + from pypy.rpython import rrange listitem = self.listdef.listitem s_value = listitem.s_value if listitem.range_step and not listitem.mutated: Modified: pypy/dist/pypy/rpython/rrange.py ============================================================================== --- pypy/dist/pypy/rpython/rrange.py (original) +++ pypy/dist/pypy/rpython/rrange.py Sat Sep 10 01:10:39 2005 @@ -3,6 +3,7 @@ from pypy.rpython.rmodel import Repr, TyperError, IntegerRepr, IteratorRepr from pypy.rpython.lltype import Ptr, GcStruct, Signed, malloc, Void from pypy.objspace.flow.model import Constant +from pypy.rpython.rlist import ll_newlist, dum_nocheck, dum_checkidx # ____________________________________________________________ # @@ -34,9 +35,12 @@ class __extend__(pairtype(RangeRepr, IntegerRepr)): def rtype_getitem((r_rng, r_int), hop): + from pypy.rpython.rlist import dum_nocheck, dum_checkidx if hop.has_implicit_exception(IndexError): - s = "getitem on range with try, except: block not supported." 
- raise TyperError, s + spec = dum_checkidx + else: + spec = dum_nocheck + v_func = hop.inputconst(Void, spec) v_lst, v_index = hop.inputargs(r_rng, Signed) cstep = hop.inputconst(Signed, r_rng.step) if hop.args_s[1].nonneg: @@ -44,12 +48,17 @@ else: llfn = ll_rangeitem hop.exception_is_here() - return hop.gendirectcall(llfn, v_lst, v_index, cstep) + return hop.gendirectcall(llfn, v_func, v_lst, v_index, cstep) # ____________________________________________________________ # # Low-level methods. +# XXX I think range could be simplified and generalized by storing the +# range length and a current index, but no stop value at all. +# The iterator would always use indexing, which implies a multiplication +# in the range, but that's low cost. + def _ll_rangelen(start, stop, step): if step > 0: result = (stop - start + (step-1)) // step @@ -62,14 +71,23 @@ def ll_rangelen(l, step): return _ll_rangelen(l.start, l.stop, step) -def ll_rangeitem_nonneg(l, i, step): - return l.start + i*step - -def ll_rangeitem(l, i, step): - if i < 0: - length = ll_rangelen(l, step) - i += length - return l.start + i*step +def ll_rangeitem_nonneg(func, l, index, step): + if func is dum_checkidx and index >= _ll_rangelen(l.start, l.stop, step): + raise IndexError + return l.start + index * step + +def ll_rangeitem(func, l, index, step): + if func is dum_checkidx: + length = _ll_rangelen(l.start, l.stop, step) + if index < 0: + index += length + if index < 0 or index >= length: + raise IndexError + else: + if index < 0: + length = _ll_rangelen(l.start, l.stop, step) + index += length + return l.start + index * step # ____________________________________________________________ # @@ -103,7 +121,6 @@ rtype_builtin_xrange = rtype_builtin_range def ll_range2list(LISTPTR, start, stop, step): - from pypy.rpython.rlist import ll_newlist length = _ll_rangelen(start, stop, step) l = ll_newlist(LISTPTR, length) idx = 0 Modified: pypy/dist/pypy/rpython/test/test_rrange.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rrange.py (original) +++ pypy/dist/pypy/rpython/test/test_rrange.py Sat Sep 10 01:10:39 2005 @@ -8,11 +8,11 @@ length = len(expected) l = ll_newrange(start, stop) assert ll_rangelen(l, step) == length - lst = [ll_rangeitem(l, i, step) for i in range(length)] + lst = [ll_rangeitem(dum_nocheck, l, i, step) for i in range(length)] assert lst == expected - lst = [ll_rangeitem_nonneg(l, i, step) for i in range(length)] + lst = [ll_rangeitem_nonneg(dum_nocheck, l, i, step) for i in range(length)] assert lst == expected - lst = [ll_rangeitem(l, i-length, step) for i in range(length)] + lst = [ll_rangeitem(dum_nocheck, l, i-length, step) for i in range(length)] assert lst == expected for start in (-10, 0, 1, 10): @@ -74,6 +74,5 @@ r.reverse() return r[0] start, stop = 10, 17 - res = interpret(dummyfn, [start, stop])#, view=True) + res = interpret(dummyfn, [start, stop])) assert res == dummyfn(start, stop) - Modified: pypy/dist/pypy/translator/c/test/test_typed.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_typed.py (original) +++ pypy/dist/pypy/translator/c/test/test_typed.py Sat Sep 10 01:10:39 2005 @@ -357,3 +357,14 @@ return r[0] f = self.getcompiled(fn) assert f() == fn() + + def test_range_idx(self): + def fn(idx=int): + r = range(10, 37, 4) + try: + return r[idx] + except: raise + f = self.getcompiled(fn) + assert f(0) == fn(0) + assert f(-1) == fn(-1) + raises(IndexError, 
f, 42) \ No newline at end of file From tismer at codespeak.net Sat Sep 10 01:14:53 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Sat, 10 Sep 2005 01:14:53 +0200 (CEST) Subject: [pypy-svn] r17437 - pypy/dist/pypy/rpython/test Message-ID: <20050909231453.F3CBC27B84@code1.codespeak.net> Author: tismer Date: Sat Sep 10 01:14:53 2005 New Revision: 17437 Modified: pypy/dist/pypy/rpython/test/test_rrange.py Log: typo, glitch during check-in Modified: pypy/dist/pypy/rpython/test/test_rrange.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rrange.py (original) +++ pypy/dist/pypy/rpython/test/test_rrange.py Sat Sep 10 01:14:53 2005 @@ -74,5 +74,5 @@ r.reverse() return r[0] start, stop = 10, 17 - res = interpret(dummyfn, [start, stop])) + res = interpret(dummyfn, [start, stop]) assert res == dummyfn(start, stop) From tismer at codespeak.net Sat Sep 10 01:21:20 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Sat, 10 Sep 2005 01:21:20 +0200 (CEST) Subject: [pypy-svn] r17438 - pypy/dist/pypy/rpython Message-ID: <20050909232120.9AD9227B84@code1.codespeak.net> Author: tismer Date: Sat Sep 10 01:21:19 2005 New Revision: 17438 Modified: pypy/dist/pypy/rpython/rrange.py Log: not sure about this comment, would like to discuss this (but it has low priority) # XXX I think range could be simplified and generalized by storing the # range length and no stop value at all. # The iterator would always use indexing, which implies a multiplication # in the range, but that's low cost. Modified: pypy/dist/pypy/rpython/rrange.py ============================================================================== --- pypy/dist/pypy/rpython/rrange.py (original) +++ pypy/dist/pypy/rpython/rrange.py Sat Sep 10 01:21:19 2005 @@ -54,11 +54,6 @@ # # Low-level methods. -# XXX I think range could be simplified and generalized by storing the -# range length and a current index, but no stop value at all. -# The iterator would always use indexing, which implies a multiplication -# in the range, but that's low cost. - def _ll_rangelen(start, stop, step): if step > 0: result = (stop - start + (step-1)) // step From arigo at codespeak.net Sat Sep 10 12:30:22 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 10 Sep 2005 12:30:22 +0200 (CEST) Subject: [pypy-svn] r17439 - pypy/dist/pypy/annotation Message-ID: <20050910103022.7A4D427B46@code1.codespeak.net> Author: arigo Date: Sat Sep 10 12:30:21 2005 New Revision: 17439 Modified: pypy/dist/pypy/annotation/unaryop.py Log: A potential bug similar to r17369: we should avoid to return a SomeXxx instance when it designates a different object at run-time, even if it's of the same type, to avoid confusion with extra attributes like '.const'. 
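For illustration only, and not part of any checkin in this archive: a minimal sketch of the aliasing problem the log message above describes. SomeDictSketch is a simplified stand-in for the real SomeDict annotation class, and only the shared dictdef attribute is modelled::

    # Sketch of why returning the same annotation instance is a bug: an
    # annotation object can grow extra attributes (such as '.const') that
    # describe one specific run-time object, so reusing it for the copy
    # would let those attributes wrongly describe a different object.

    class SomeDictSketch(object):
        def __init__(self, dictdef):
            self.dictdef = dictdef      # shared description of keys/values

    def method_copy_buggy(dct):
        return dct                      # aliases the annotation of the original

    def method_copy_fixed(dct):
        return SomeDictSketch(dct.dictdef)  # fresh instance, same dictdef

The unaryop.py diff that follows makes exactly this one-line change: method_copy now returns SomeDict(dct.dictdef) instead of dct itself.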
Modified: pypy/dist/pypy/annotation/unaryop.py ============================================================================== --- pypy/dist/pypy/annotation/unaryop.py (original) +++ pypy/dist/pypy/annotation/unaryop.py Sat Sep 10 12:30:21 2005 @@ -294,7 +294,7 @@ return dct.dictdef.read_value() def method_copy(dct): - return dct + return SomeDict(dct.dictdef) def method_update(dct1, dct2): dct1.dictdef.union(dct2.dictdef) From arigo at codespeak.net Sat Sep 10 14:40:05 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 10 Sep 2005 14:40:05 +0200 (CEST) Subject: [pypy-svn] r17440 - in pypy/dist/pypy: annotation rpython rpython/test Message-ID: <20050910124005.4EA6B27B5D@code1.codespeak.net> Author: arigo Date: Sat Sep 10 14:40:02 2005 New Revision: 17440 Removed: pypy/dist/pypy/rpython/rconstantdict.py pypy/dist/pypy/rpython/remptydict.py Modified: pypy/dist/pypy/annotation/builtin.py pypy/dist/pypy/rpython/rbuiltin.py pypy/dist/pypy/rpython/rclass.py pypy/dist/pypy/rpython/rdict.py pypy/dist/pypy/rpython/rint.py pypy/dist/pypy/rpython/rmodel.py pypy/dist/pypy/rpython/rstr.py pypy/dist/pypy/rpython/test/test_rdict.py Log: Sanitization of the multiple low-level dict implementations. There is now only a generic DictRepr left. This is slightly less efficient e.g. for string keys, but we can think later about regaining this performance. Done by adding a get_ll_hash_function() to some Reprs, similar to the already-existing get_ll_eq_function(). The dict lookup uses these two functions generically. The tests pass, but if something in rdict.py is not tested I probably have left an oversight there. Modified: pypy/dist/pypy/annotation/builtin.py ============================================================================== --- pypy/dist/pypy/annotation/builtin.py (original) +++ pypy/dist/pypy/annotation/builtin.py Sat Sep 10 14:40:02 2005 @@ -327,6 +327,9 @@ cast_p = lltype.cast_pointer(PtrT.const, s_p.ll_ptrtype._defl()) return SomePtr(ll_ptrtype=lltype.typeOf(cast_p)) +def cast_ptr_to_int(s_ptr): # xxx + return SomeInteger() + def getRuntimeTypeInfo(T): assert T.is_constant() return immutablevalue(lltype.getRuntimeTypeInfo(T.const)) @@ -339,6 +342,7 @@ BUILTIN_ANALYZERS[lltype.typeOf] = typeOf BUILTIN_ANALYZERS[lltype.nullptr] = nullptr BUILTIN_ANALYZERS[lltype.cast_pointer] = cast_pointer +BUILTIN_ANALYZERS[lltype.cast_ptr_to_int] = cast_ptr_to_int BUILTIN_ANALYZERS[lltype.getRuntimeTypeInfo] = getRuntimeTypeInfo BUILTIN_ANALYZERS[lltype.runtime_type_info] = runtime_type_info Modified: pypy/dist/pypy/rpython/rbuiltin.py ============================================================================== --- pypy/dist/pypy/rpython/rbuiltin.py (original) +++ pypy/dist/pypy/rpython/rbuiltin.py Sat Sep 10 14:40:02 2005 @@ -254,6 +254,12 @@ return hop.genop('cast_pointer', [v_input], # v_type implicit in r_result resulttype = hop.r_result.lowleveltype) +def rtype_cast_ptr_to_int(hop): + assert isinstance(hop.args_r[0], rptr.PtrRepr) + vlist = hop.inputargs(hop.args_r[0]) + return hop.genop('cast_ptr_to_int', vlist, + resulttype = lltype.Signed) + def rtype_runtime_type_info(hop): assert isinstance(hop.args_r[0], rptr.PtrRepr) vlist = hop.inputargs(hop.args_r[0]) @@ -263,6 +269,7 @@ BUILTIN_TYPER[lltype.malloc] = rtype_malloc BUILTIN_TYPER[lltype.cast_pointer] = rtype_cast_pointer +BUILTIN_TYPER[lltype.cast_ptr_to_int] = rtype_cast_ptr_to_int BUILTIN_TYPER[lltype.typeOf] = rtype_const_result BUILTIN_TYPER[lltype.nullptr] = rtype_const_result BUILTIN_TYPER[lltype.getRuntimeTypeInfo] = 
rtype_const_result Modified: pypy/dist/pypy/rpython/rclass.py ============================================================================== --- pypy/dist/pypy/rpython/rclass.py (original) +++ pypy/dist/pypy/rpython/rclass.py Sat Sep 10 14:40:02 2005 @@ -202,6 +202,9 @@ # return getclassrepr(self.rtyper, subclassdef).getvtable() + def get_ll_eq_function(self): + return None + def getvtable(self, cast_to_typeptr=True): """Return a ptr to the vtable of this type.""" if self.vtable is None: Modified: pypy/dist/pypy/rpython/rdict.py ============================================================================== --- pypy/dist/pypy/rpython/rdict.py (original) +++ pypy/dist/pypy/rpython/rdict.py Sat Sep 10 14:40:02 2005 @@ -1,19 +1,26 @@ from pypy.annotation.pairtype import pairtype from pypy.annotation import model as annmodel from pypy.objspace.flow.model import Constant -from pypy.rpython import rmodel, lltype, rstr +from pypy.rpython import rmodel, lltype from pypy.rpython.rarithmetic import r_uint -from pypy.rpython import rlist, rconstantdict, remptydict +from pypy.rpython import rlist from pypy.rpython import robject # ____________________________________________________________ # -# pseudo implementation of RPython dictionary (this is per -# dictvalue type): +# generic implementation of RPython dictionary, with parametric DICTKEY and +# DICTVALUE types. +# +# XXX this should be re-optimized for specific types of keys; e.g. +# for string keys we don't need the two boolean flags but can use +# a NULL and a special 'dummy' keys. Similarily, for immutable dicts, +# the array should be inlined and num_pristine_entries is not needed. # # struct dictentry { -# struct STR *key; -# DICTVALUE value; +# DICTSTR key; +# bool valid; # to mark if the entry is filled +# bool everused; # to mark if the entry is or has ever been filled +# DICTVALUE value; # } # # struct dicttable { @@ -26,66 +33,70 @@ class __extend__(annmodel.SomeDict): def rtyper_makerepr(self, rtyper): - s_key = self.dictdef.dictkey.s_value - s_value = self.dictdef.dictvalue.s_value - if isinstance(s_key, annmodel.SomeString): - if s_key.can_be_none(): - raise rmodel.TyperError("cannot make repr of dict with " - "string-or-None keys") - dictvalue = self.dictdef.dictvalue - return StrDictRepr(lambda: rtyper.getrepr(dictvalue.s_value), - dictvalue) - elif isinstance(s_key, (annmodel.SomeInteger, - annmodel.SomeUnicodeCodePoint)): - dictkey = self.dictdef.dictkey - dictvalue = self.dictdef.dictvalue - return rconstantdict.ConstantDictRepr( - rtyper.getrepr(dictkey.s_value), - rtyper.getrepr(dictvalue.s_value)) - elif isinstance(s_key, annmodel.SomeImpossibleValue): - return remptydict.emptydict_repr - elif (s_key.__class__ is annmodel.SomeObject and s_key.knowntype == object and - s_value.__class__ is annmodel.SomeObject and s_value.knowntype == object): + dictkey = self.dictdef.dictkey + dictvalue = self.dictdef.dictvalue + s_key = dictkey .s_value + s_value = dictvalue.s_value + if (s_key.__class__ is annmodel.SomeObject and s_key.knowntype == object and + s_value.__class__ is annmodel.SomeObject and s_value.knowntype == object): return robject.pyobj_repr - else: - raise rmodel.TyperError("cannot make repr of %r" %(self.dictdef,)) + else: + return DictRepr(lambda: rtyper.getrepr(s_key), + lambda: rtyper.getrepr(s_value), + dictkey, + dictvalue) def rtyper_makekey(self): return (self.__class__, self.dictdef.dictkey, self.dictdef.dictvalue) -class StrDictRepr(rmodel.Repr): +class DictRepr(rmodel.Repr): - def __init__(self, value_repr, 
dictvalue=None): - self.STRDICT = lltype.GcForwardReference() - self.lowleveltype = lltype.Ptr(self.STRDICT) + def __init__(self, key_repr, value_repr, dictkey=None, dictvalue=None): + self.DICT = lltype.GcForwardReference() + self.lowleveltype = lltype.Ptr(self.DICT) + if not isinstance(key_repr, rmodel.Repr): # not computed yet, done by setup() + assert callable(key_repr) + self._key_repr_computer = key_repr + else: + self.key_repr = key_repr if not isinstance(value_repr, rmodel.Repr): # not computed yet, done by setup() assert callable(value_repr) self._value_repr_computer = value_repr else: self.value_repr = value_repr + self.dictkey = dictkey self.dictvalue = dictvalue self.dict_cache = {} # setup() needs to be called to finish this initialization def _setup_repr(self): + if 'key_repr' not in self.__dict__: + self.key_repr = self._key_repr_computer() if 'value_repr' not in self.__dict__: self.value_repr = self._value_repr_computer() - if isinstance(self.STRDICT, lltype.GcForwardReference): + if isinstance(self.DICT, lltype.GcForwardReference): + self.DICTKEY = self.key_repr.lowleveltype self.DICTVALUE = self.value_repr.lowleveltype self.DICTENTRY = lltype.Struct("dictentry", - ("key", lltype.Ptr(rstr.STR)), - ('value', self.DICTVALUE)) + ("key", self.DICTKEY), + ("valid", lltype.Bool), + ("everused", lltype.Bool), + ("value", self.DICTVALUE)) self.DICTENTRYARRAY = lltype.GcArray(self.DICTENTRY) - self.STRDICT.become(lltype.GcStruct("dicttable", + self.DICT.become(lltype.GcStruct("dicttable", ("num_items", lltype.Signed), ("num_pristine_entries", lltype.Signed), ("entries", lltype.Ptr(self.DICTENTRYARRAY)))) + if 'll_keyhash' not in self.__dict__: + # figure out which functions must be used to hash and compare keys + self.ll_keyeq = self.key_repr.get_ll_eq_function() # can be None + self.ll_keyhash = self.key_repr.get_ll_hash_function() def convert_const(self, dictobj): # get object from bound dict methods #dictobj = getattr(dictobj, '__self__', dictobj) if dictobj is None: - return nullptr(self.STRDICT) + return nullptr(self.DICT) if not isinstance(dictobj, dict): raise TyperError("expected a dict: %r" % (dictobj,)) try: @@ -93,32 +104,33 @@ return self.dict_cache[key] except KeyError: self.setup() - l_dict = ll_newstrdict(self.lowleveltype) + l_dict = ll_newdict(self.lowleveltype) self.dict_cache[key] = l_dict - r_key = rstr.string_repr + r_key = self.key_repr r_value = self.value_repr for dictkey, dictvalue in dictobj.items(): llkey = r_key.convert_const(dictkey) llvalue = r_value.convert_const(dictvalue) - ll_strdict_setitem(l_dict, llkey, llvalue) - return l_dict + ll_dict_setitem(l_dict, llkey, llvalue, self) + return l_dict def rtype_len(self, hop): v_dict, = hop.inputargs(self) - return hop.gendirectcall(ll_strdict_len, v_dict) + return hop.gendirectcall(ll_dict_len, v_dict) def rtype_is_true(self, hop): v_dict, = hop.inputargs(self) - return hop.gendirectcall(ll_strdict_is_true, v_dict) + return hop.gendirectcall(ll_dict_is_true, v_dict) def make_iterator_repr(self, *variant): - return StrDictIteratorRepr(self, *variant) + return DictIteratorRepr(self, *variant) def rtype_method_get(self, hop): - v_dict, v_key, v_default = hop.inputargs(self, rstr.string_repr, + v_dict, v_key, v_default = hop.inputargs(self, self.key_repr, self.value_repr) + crepr = hop.inputconst(lltype.Void, self) hop.exception_cannot_occur() - return hop.gendirectcall(ll_get, v_dict, v_key, v_default) + return hop.gendirectcall(ll_get, v_dict, v_key, v_default, crepr) def rtype_method_copy(self, hop): 
v_dict, = hop.inputargs(self) @@ -127,8 +139,9 @@ def rtype_method_update(self, hop): v_dic1, v_dic2 = hop.inputargs(self, self) + crepr = hop.inputconst(lltype.Void, self) hop.exception_cannot_occur() - return hop.gendirectcall(ll_update, v_dic1, v_dic2) + return hop.gendirectcall(ll_update, v_dic1, v_dic2, crepr) def _rtype_method_kvi(self, hop, spec): v_dic, = hop.inputargs(self) @@ -149,69 +162,67 @@ def rtype_method_iterkeys(self, hop): hop.exception_cannot_occur() - return StrDictIteratorRepr(self, "keys").newiter(hop) + return DictIteratorRepr(self, "keys").newiter(hop) def rtype_method_itervalues(self, hop): hop.exception_cannot_occur() - return StrDictIteratorRepr(self, "values").newiter(hop) + return DictIteratorRepr(self, "values").newiter(hop) def rtype_method_iteritems(self, hop): hop.exception_cannot_occur() - return StrDictIteratorRepr(self, "items").newiter(hop) + return DictIteratorRepr(self, "items").newiter(hop) def rtype_method_clear(self, hop): v_dict, = hop.inputargs(self) hop.exception_cannot_occur() return hop.gendirectcall(ll_clear, v_dict) -class __extend__(pairtype(StrDictRepr, rmodel.StringRepr)): +class __extend__(pairtype(DictRepr, rmodel.Repr)): - def rtype_getitem((r_dict, r_string), hop): - v_dict, v_key = hop.inputargs(r_dict, rstr.string_repr) + def rtype_getitem((r_dict, r_key), hop): + v_dict, v_key = hop.inputargs(r_dict, r_dict.key_repr) + crepr = hop.inputconst(lltype.Void, r_dict) hop.has_implicit_exception(KeyError) # record that we know about it hop.exception_is_here() - return hop.gendirectcall(ll_strdict_getitem, v_dict, v_key) + return hop.gendirectcall(ll_dict_getitem, v_dict, v_key, crepr) - def rtype_delitem((r_dict, r_string), hop): - v_dict, v_key = hop.inputargs(r_dict, rstr.string_repr) + def rtype_delitem((r_dict, r_key), hop): + v_dict, v_key = hop.inputargs(r_dict, r_dict.key_repr) + crepr = hop.inputconst(lltype.Void, r_dict) hop.has_implicit_exception(KeyError) # record that we know about it hop.exception_is_here() - return hop.gendirectcall(ll_strdict_delitem, v_dict, v_key) + return hop.gendirectcall(ll_dict_delitem, v_dict, v_key, crepr) - def rtype_setitem((r_dict, r_string), hop): - v_dict, v_key, v_value = hop.inputargs(r_dict, rstr.string_repr, r_dict.value_repr) - hop.gendirectcall(ll_strdict_setitem, v_dict, v_key, v_value) - - def rtype_contains((r_dict, r_string), hop): - v_dict, v_key = hop.inputargs(r_dict, rstr.string_repr) - return hop.gendirectcall(ll_contains, v_dict, v_key) + def rtype_setitem((r_dict, r_key), hop): + v_dict, v_key, v_value = hop.inputargs(r_dict, r_dict.key_repr, r_dict.value_repr) + crepr = hop.inputconst(lltype.Void, r_dict) + hop.gendirectcall(ll_dict_setitem, v_dict, v_key, v_value, crepr) + + def rtype_contains((r_dict, r_key), hop): + v_dict, v_key = hop.inputargs(r_dict, r_dict.key_repr) + crepr = hop.inputconst(lltype.Void, r_dict) + return hop.gendirectcall(ll_contains, v_dict, v_key, crepr) -class __extend__(pairtype(StrDictRepr, StrDictRepr)): +class __extend__(pairtype(DictRepr, DictRepr)): def convert_from_to((r_dict1, r_dict2), v, llops): - # check that we don't convert from StrDicts with - # different value types + # check that we don't convert from Dicts with + # different key/value types + if r_dict1.dictkey is None or r_dict2.dictkey is None: + return NotImplemented + if r_dict1.dictkey is not r_dict2.dictkey: + return NotImplemented if r_dict1.dictvalue is None or r_dict2.dictvalue is None: return NotImplemented if r_dict1.dictvalue is not r_dict2.dictvalue: return 
NotImplemented return v - #def rtype_add((self, _), hop): - # v_lst1, v_lst2 = hop.inputargs(self, self) - # return hop.gendirectcall(ll_concat, v_lst1, v_lst2) -# -# def rtype_inplace_add((self, _), hop): -# v_lst1, v_lst2 = hop.inputargs(self, self) -# hop.gendirectcall(ll_extend, v_lst1, v_lst2) -# return v_lst1 - # ____________________________________________________________ # # Low-level methods. These can be run for testing, but are meant to # be direct_call'ed from rtyped flow graphs, which means that they will # get flowed and annotated, mostly with SomePtr. -deleted_entry_marker = lltype.malloc(rstr.STR, 0, immortal=True) def dum_keys(): pass def dum_values(): pass def dum_items():pass @@ -219,114 +230,124 @@ "values": dum_values, "items": dum_items} -def ll_strdict_len(d): +def ll_dict_len(d): return d.num_items -def ll_strdict_is_true(d): +def ll_dict_is_true(d): # check if a dict is True, allowing for None return bool(d) and d.num_items != 0 -def ll_strdict_getitem(d, key): - entry = ll_strdict_lookup(d, key) - if entry.key and entry.key != deleted_entry_marker: +def ll_dict_getitem(d, key, dictrepr): + entry = ll_dict_lookup(d, key, dictrepr) + if entry.valid: return entry.value else: raise KeyError -def ll_strdict_setitem(d, key, value): - entry = ll_strdict_lookup(d, key) - if not entry.key: - entry.key = key - entry.value = value - d.num_items += 1 +def ll_dict_setitem(d, key, value, dictrepr): + entry = ll_dict_lookup(d, key, dictrepr) + entry.value = value + if entry.valid: + return + entry.key = key + entry.valid = True + d.num_items += 1 + if not entry.everused: + entry.everused = True d.num_pristine_entries -= 1 if d.num_pristine_entries <= len(d.entries) / 3: - ll_strdict_resize(d) - elif entry.key == deleted_entry_marker: - entry.key = key - entry.value = value - d.num_items += 1 - else: - entry.value = value + ll_dict_resize(d, dictrepr) -def ll_strdict_delitem(d, key): - entry = ll_strdict_lookup(d, key) - if not entry.key or entry.key == deleted_entry_marker: - raise KeyError - entry.key = deleted_entry_marker +def ll_dict_delitem(d, key, dictrepr): + entry = ll_dict_lookup(d, key, dictrepr) + if not entry.valid: + raise KeyError + entry.valid = False + d.num_items -= 1 + # clear the key and the value if they are pointers + keytype = lltype.typeOf(entry).TO.key + if isinstance(keytype, lltype.Ptr): + key = entry.key # careful about destructor side effects + entry.key = lltype.nullptr(keytype.TO) valuetype = lltype.typeOf(entry).TO.value if isinstance(valuetype, lltype.Ptr): entry.value = lltype.nullptr(valuetype.TO) - d.num_items -= 1 num_entries = len(d.entries) - if num_entries > STRDICT_INITSIZE and d.num_items < num_entries / 4: - ll_strdict_resize(d) + if num_entries > DICT_INITSIZE and d.num_items < num_entries / 4: + ll_dict_resize(d, dictrepr) -def ll_strdict_resize(d): +def ll_dict_resize(d, dictrepr): old_entries = d.entries old_size = len(old_entries) # make a 'new_size' estimate and shrink it if there are many # deleted entry markers new_size = old_size * 2 - while new_size > STRDICT_INITSIZE and d.num_items < new_size / 4: + while new_size > DICT_INITSIZE and d.num_items < new_size / 4: new_size /= 2 d.entries = lltype.malloc(lltype.typeOf(old_entries).TO, new_size) d.num_pristine_entries = new_size - d.num_items i = 0 while i < old_size: entry = old_entries[i] - if entry.key and entry.key != deleted_entry_marker: - new_entry = ll_strdict_lookup(d, entry.key) + if entry.valid: + new_entry = ll_dict_lookup(d, entry.key, dictrepr) new_entry.key = 
entry.key new_entry.value = entry.value + new_entry.valid = True + new_entry.everused = True i += 1 -# the below is a port of CPython's dictobject.c's lookdict implementation +# ------- a port of CPython's dictobject.c's lookdict implementation ------- PERTURB_SHIFT = 5 -def ll_strdict_lookup(d, key): - hash = rstr.ll_strhash(key) +def ll_dict_lookup(d, key, dictrepr): + hash = dictrepr.ll_keyhash(key) entries = d.entries mask = len(entries) - 1 i = r_uint(hash & mask) + """XXX MUTATION PROTECTION!""" + # do the first try before any looping entry = entries[i] - if not entry.key or entry.key == key: - return entry - if entry.key == deleted_entry_marker: - freeslot = entry - else: - if entry.key.hash == hash and rstr.ll_streq(entry.key, key): - return entry + if entry.valid: + if entry.key == key: + return entry # found the entry + if dictrepr.ll_keyeq is not None and dictrepr.ll_keyeq(entry.key, key): + return entry # found the entry freeslot = lltype.nullptr(lltype.typeOf(entry).TO) + elif entry.everused: + freeslot = entry + else: + return entry # pristine entry -- lookup failed - # In the loop, key == deleted_entry_marker is by far (factor of 100s) the - # least likely outcome, so test for that last. + # In the loop, a deleted entry (everused and not valid) is by far + # (factor of 100s) the least likely outcome, so test for that last. perturb = r_uint(hash) while 1: - i = (i << 2) + i + perturb + 1 - entry = entries[i & mask] - if not entry.key: + i = ((i << 2) + i + perturb + 1) & mask + entry = entries[i] + if not entry.everused: return freeslot or entry - if entry.key == key or (entry.key.hash == hash and - entry.key != deleted_entry_marker and - rstr.ll_streq(entry.key, key)): - return entry - if entry.key == deleted_entry_marker and not freeslot: + elif entry.valid: + if entry.key == key: + return entry + if dictrepr.ll_keyeq is not None and dictrepr.ll_keyeq(entry.key, key): + return entry + elif not freeslot: freeslot = entry perturb >>= PERTURB_SHIFT # ____________________________________________________________ # # Irregular operations. -STRDICT_INITSIZE = 8 +DICT_INITSIZE = 8 -def ll_newstrdict(DICTPTR): +def ll_newdict(DICTPTR): d = lltype.malloc(DICTPTR.TO) - d.entries = lltype.malloc(DICTPTR.TO.entries.TO, STRDICT_INITSIZE) + d.entries = lltype.malloc(DICTPTR.TO.entries.TO, DICT_INITSIZE) d.num_items = 0 # but still be explicit - d.num_pristine_entries = STRDICT_INITSIZE + d.num_pristine_entries = DICT_INITSIZE return d def rtype_newdict(hop): @@ -334,31 +355,27 @@ if r_dict == robject.pyobj_repr: # special case: SomeObject: SomeObject dicts! cdict = hop.inputconst(robject.pyobj_repr, dict) return hop.genop('simple_call', [cdict], resulttype = robject.pyobj_repr) - if r_dict == remptydict.emptydict_repr: # other special case: empty dicts - return hop.inputconst(lltype.Void, {}) - if not isinstance(r_dict, StrDictRepr): - raise rmodel.TyperError("cannot create non-StrDicts, got %r" %(r_dict,)) c1 = hop.inputconst(lltype.Void, r_dict.lowleveltype) - v_result = hop.gendirectcall(ll_newstrdict, c1) + v_result = hop.gendirectcall(ll_newdict, c1) return v_result # ____________________________________________________________ # # Iteration. 
-class StrDictIteratorRepr(rmodel.IteratorRepr): +class DictIteratorRepr(rmodel.IteratorRepr): def __init__(self, r_dict, variant="keys"): self.r_dict = r_dict self.variant = variant - self.lowleveltype = lltype.Ptr(lltype.GcStruct('strdictiter', + self.lowleveltype = lltype.Ptr(lltype.GcStruct('dictiter', ('dict', r_dict.lowleveltype), ('index', lltype.Signed))) def newiter(self, hop): v_dict, = hop.inputargs(self.r_dict) citerptr = hop.inputconst(lltype.Void, self.lowleveltype) - return hop.gendirectcall(ll_strdictiter, citerptr, v_dict) + return hop.gendirectcall(ll_dictiter, citerptr, v_dict) def rtype_next(self, hop): v_iter, = hop.inputargs(self) @@ -367,31 +384,30 @@ c1 = hop.inputconst(lltype.Void, r_list.lowleveltype) hop.has_implicit_exception(StopIteration) # record that we know about it hop.exception_is_here() - return hop.gendirectcall(ll_strdictnext, v_iter, v_func, c1) + return hop.gendirectcall(ll_dictnext, v_iter, v_func, c1) -def ll_strdictiter(ITERPTR, d): +def ll_dictiter(ITERPTR, d): iter = lltype.malloc(ITERPTR.TO) iter.dict = d iter.index = 0 return iter -def ll_strdictnext(iter, func, RETURNTYPE): +def ll_dictnext(iter, func, RETURNTYPE): entries = iter.dict.entries index = iter.index entries_len = len(entries) while index < entries_len: entry = entries[index] - key = entry.key index = index + 1 - if key and key != deleted_entry_marker: + if entry.valid: iter.index = index if func is dum_items: r = lltype.malloc(RETURNTYPE.TO) - r.item0 = key + r.item0 = entry.key r.item1 = entry.value return r elif func is dum_keys: - return key + return entry.key elif func is dum_values: return entry.value iter.index = index @@ -400,9 +416,9 @@ # _____________________________________________________________ # methods -def ll_get(dict, key, default): - entry = ll_strdict_lookup(dict, key) - if entry.key and entry.key != deleted_entry_marker: +def ll_get(dict, key, default, dictrepr): + entry = ll_dict_lookup(dict, key, dictrepr) + if entry.valid: return entry.value else: return default @@ -420,25 +436,28 @@ entry = dict.entries[i] d_entry.key = entry.key d_entry.value = entry.value + d_entry.valid = entry.valid + d_entry.everused = entry.everused i += 1 return d def ll_clear(d): - if len(d.entries) == d.num_pristine_entries == STRDICT_INITSIZE: + if len(d.entries) == d.num_pristine_entries == DICT_INITSIZE: return DICTPTR = lltype.typeOf(d) - d.entries = lltype.malloc(DICTPTR.TO.entries.TO, STRDICT_INITSIZE) + d.entries = lltype.malloc(DICTPTR.TO.entries.TO, DICT_INITSIZE) d.num_items = 0 - d.num_pristine_entries = STRDICT_INITSIZE + d.num_pristine_entries = DICT_INITSIZE -def ll_update(dic1, dic2): - d2len =len(dic2.entries) +def ll_update(dic1, dic2, dictrepr): + # XXX warning, no protection against ll_dict_setitem mutating dic2 + d2len = len(dic2.entries) entries = dic2.entries i = 0 while i < d2len: entry = entries[i] - if entry.key and entry.key != deleted_entry_marker: - ll_strdict_setitem(dic1, entry.key, entry.value) + if entry.valid: + ll_dict_setitem(dic1, entry.key, entry.value, dictrepr) i += 1 # this is an implementation of keys(), values() and items() @@ -455,23 +474,20 @@ p = 0 while i < dlen: entry = entries[i] - key = entry.key - if key and key != deleted_entry_marker: + if entry.valid: if func is dum_items: r = lltype.malloc(LISTPTR.TO.items.TO.OF.TO) - r.item0 = key + r.item0 = entry.key r.item1 = entry.value items[p] = r elif func is dum_keys: - items[p] = key + items[p] = entry.key elif func is dum_values: items[p] = entry.value p += 1 i += 1 return res -def 
ll_contains(d, key): - entry = ll_strdict_lookup(d, key) - if entry.key and entry.key != deleted_entry_marker: - return True - return False +def ll_contains(d, key, dictrepr): + entry = ll_dict_lookup(d, key, dictrepr) + return entry.valid Modified: pypy/dist/pypy/rpython/rint.py ============================================================================== --- pypy/dist/pypy/rpython/rint.py (original) +++ pypy/dist/pypy/rpython/rint.py Sat Sep 10 14:40:02 2005 @@ -209,6 +209,9 @@ def get_ll_eq_function(self): return None + def get_ll_hash_function(self): + return ll_hash_int + def rtype_chr(_, hop): vlist = hop.inputargs(Signed) return hop.genop('cast_int_to_char', vlist, resulttype=Char) @@ -399,6 +402,9 @@ j += 1 return result +def ll_hash_int(n): + return n + # # _________________________ Conversions _________________________ Modified: pypy/dist/pypy/rpython/rmodel.py ============================================================================== --- pypy/dist/pypy/rpython/rmodel.py (original) +++ pypy/dist/pypy/rpython/rmodel.py Sat Sep 10 14:40:02 2005 @@ -3,7 +3,7 @@ from pypy.objspace.flow.model import Constant from pypy.rpython.lltype import Void, Bool, Float, Signed, Char, UniChar from pypy.rpython.lltype import typeOf, LowLevelType, Ptr, PyObject -from pypy.rpython.lltype import FuncType, functionptr +from pypy.rpython.lltype import FuncType, functionptr, cast_ptr_to_int from pypy.tool.ansi_print import ansi_print from pypy.rpython.error import TyperError, MissingRTypeOperation @@ -101,6 +101,12 @@ def get_ll_eq_function(self): raise TyperError, 'no equality function for %r' % self + def get_ll_hash_function(self): + if not isinstance(self.lowleveltype, Ptr): + raise TyperError, 'no hashing function for %r' % self + # default behavior: use the pointer identity as a hash + return ll_hash_ptr + def rtype_bltn_list(self, hop): raise TyperError, 'no list() support for %r' % self @@ -152,6 +158,13 @@ def make_iterator_repr(self, *variant): raise TyperError("%s is not iterable" % (self,)) +def ll_hash_ptr(p): + return cast_ptr_to_int(p) + +def ll_hash_void(v): + return 0 + + class IteratorRepr(Repr): """Base class of Reprs of any kind of iterator.""" @@ -251,6 +264,8 @@ class VoidRepr(Repr): lowleveltype = Void + def get_ll_eq_function(self): return None + def get_ll_hash_function(self): return ll_hash_void impossible_repr = VoidRepr() # ____________________________________________________________ Modified: pypy/dist/pypy/rpython/rstr.py ============================================================================== --- pypy/dist/pypy/rpython/rstr.py (original) +++ pypy/dist/pypy/rpython/rstr.py Sat Sep 10 14:40:02 2005 @@ -76,6 +76,9 @@ def get_ll_eq_function(self): return ll_streq + def get_ll_hash_function(self): + return ll_strhash + def rtype_len(_, hop): v_str, = hop.inputargs(string_repr) return hop.gendirectcall(ll_strlen, v_str) @@ -393,6 +396,9 @@ def get_ll_eq_function(self): return None + def get_ll_hash_function(self): + return ll_char_hash + def rtype_len(_, hop): return hop.inputconst(Signed, 1) @@ -446,6 +452,9 @@ def get_ll_eq_function(self): return None + def get_ll_hash_function(self): + return ll_unichar_hash + ## def rtype_len(_, hop): ## return hop.inputconst(Signed, 1) ## @@ -536,6 +545,12 @@ j += 1 return newstr +def ll_char_hash(ch): + return ord(ch) + +def ll_unichar_hash(ch): + return ord(ch) + def ll_strlen(s): return len(s.chars) Modified: pypy/dist/pypy/rpython/test/test_rdict.py ============================================================================== 
--- pypy/dist/pypy/rpython/test/test_rdict.py (original) +++ pypy/dist/pypy/rpython/test/test_rdict.py Sat Sep 10 14:40:02 2005 @@ -90,7 +90,7 @@ return d[c2] char_by_hash = {} - base = rdict.STRDICT_INITSIZE + base = rdict.DICT_INITSIZE for y in range(0, 256): y = chr(y) y_hash = lowlevelhash(y) % base @@ -113,7 +113,7 @@ res = interpret(func2, [ord(x), ord(y)]) for i in range(len(res.entries)): - assert res.entries[i].key != rdict.deleted_entry_marker + assert not (res.entries[i].everused and not res.entries[i].valid) def func3(c0, c1, c2, c3, c4, c5, c6, c7): d = {} @@ -127,29 +127,29 @@ c7 = chr(c7) ; d[c7] = 1; del d[c7] return d - if rdict.STRDICT_INITSIZE != 8: + if rdict.DICT_INITSIZE != 8: py.test.skip("make dict tests more indepdent from initsize") res = interpret(func3, [ord(char_by_hash[i][0]) - for i in range(rdict.STRDICT_INITSIZE)]) + for i in range(rdict.DICT_INITSIZE)]) count_frees = 0 for i in range(len(res.entries)): - if not res.entries[i].key: + if not res.entries[i].everused: count_frees += 1 assert count_frees >= 3 def test_dict_resize(): def func(want_empty): d = {} - for i in range(rdict.STRDICT_INITSIZE): + for i in range(rdict.DICT_INITSIZE): d[chr(ord('a') + i)] = i if want_empty: - for i in range(rdict.STRDICT_INITSIZE): + for i in range(rdict.DICT_INITSIZE): del d[chr(ord('a') + i)] return d res = interpret(func, [0]) - assert len(res.entries) > rdict.STRDICT_INITSIZE + assert len(res.entries) > rdict.DICT_INITSIZE res = interpret(func, [1]) - assert len(res.entries) == rdict.STRDICT_INITSIZE + assert len(res.entries) == rdict.DICT_INITSIZE def test_dict_iteration(): def func(i, j): @@ -300,3 +300,19 @@ assert res is True res = interpret(func, [42]) assert res is True + +def test_int_dict(): + def func(a, b): + dic = {12: 34} + dic[a] = 1000 + return dic.get(b, -123) + res = interpret(func, [12, 12]) + assert res == 1000 + res = interpret(func, [12, 13]) + assert res == -123 + res = interpret(func, [524, 12]) + assert res == 34 + res = interpret(func, [524, 524]) + assert res == 1000 + res = interpret(func, [524, 1036]) + assert res == -123 From arigo at codespeak.net Sat Sep 10 14:41:43 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 10 Sep 2005 14:41:43 +0200 (CEST) Subject: [pypy-svn] r17441 - pypy/dist/pypy/module/thread Message-ID: <20050910124143.6FB3427B5D@code1.codespeak.net> Author: arigo Date: Sat Sep 10 14:41:42 2005 New Revision: 17441 Modified: pypy/dist/pypy/module/thread/threadlocals.py Log: We should be able to use int-keyed dicts freely now. Modified: pypy/dist/pypy/module/thread/threadlocals.py ============================================================================== --- pypy/dist/pypy/module/thread/threadlocals.py (original) +++ pypy/dist/pypy/module/thread/threadlocals.py Sat Sep 10 14:41:42 2005 @@ -11,16 +11,15 @@ os_thread.bootstrap().""" def __init__(self): - # XXX use string-keyed dicts only for now - self._valuedict = {} # {str(thread_ident): ExecutionContext()} + self._valuedict = {} # {thread_ident: ExecutionContext()} def getvalue(self): ident = thread.get_ident() - return self._valuedict.get(str(ident), None) + return self._valuedict.get(ident, None) def setvalue(self, value): ident = thread.get_ident() - self._valuedict[str(ident)] = value + self._valuedict[ident] = value def enter_thread(self, space): "Notification that the current thread is just starting." 
@@ -37,7 +36,7 @@ finally: ident = thread.get_ident() try: - del self._valuedict[str(ident)] + del self._valuedict[ident] except KeyError: pass From arigo at codespeak.net Sat Sep 10 15:06:00 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 10 Sep 2005 15:06:00 +0200 (CEST) Subject: [pypy-svn] r17442 - pypy/dist/pypy/rpython Message-ID: <20050910130600.B88C127B5E@code1.codespeak.net> Author: arigo Date: Sat Sep 10 15:05:59 2005 New Revision: 17442 Modified: pypy/dist/pypy/rpython/rmodel.py Log: A hash function that hashes any pointer by identity is more dangerous than helpful. For example the hash can be used for a prebuilt dict, when the pointer will change between compile-time and run-time. If needed, we can add logic in rclass.py to support instances as keys in dictionaries, linked to the hash-preserving code already there. Modified: pypy/dist/pypy/rpython/rmodel.py ============================================================================== --- pypy/dist/pypy/rpython/rmodel.py (original) +++ pypy/dist/pypy/rpython/rmodel.py Sat Sep 10 15:05:59 2005 @@ -102,10 +102,7 @@ raise TyperError, 'no equality function for %r' % self def get_ll_hash_function(self): - if not isinstance(self.lowleveltype, Ptr): - raise TyperError, 'no hashing function for %r' % self - # default behavior: use the pointer identity as a hash - return ll_hash_ptr + raise TyperError, 'no hashing function for %r' % self def rtype_bltn_list(self, hop): raise TyperError, 'no list() support for %r' % self @@ -158,9 +155,6 @@ def make_iterator_repr(self, *variant): raise TyperError("%s is not iterable" % (self,)) -def ll_hash_ptr(p): - return cast_ptr_to_int(p) - def ll_hash_void(v): return 0 From ale at codespeak.net Sat Sep 10 16:07:21 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Sat, 10 Sep 2005 16:07:21 +0200 (CEST) Subject: [pypy-svn] r17443 - pypy/dist/pypy/translator/goal Message-ID: <20050910140721.04EEF27B5D@code1.codespeak.net> Author: ale Date: Sat Sep 10 16:07:21 2005 New Revision: 17443 Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py Log: Made sure that a translator is defined if analyse fails Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Sat Sep 10 16:07:21 2005 @@ -100,21 +100,7 @@ # __________ Main __________ -def analyse(target): - - policy = AnnotatorPolicy() - if target: - spec = target(not options1.lowmem) - try: - entry_point, inputtypes, policy = spec - except ValueError: - entry_point, inputtypes = spec - t = Translator(entry_point, verbose=True, simplifying=True) - a = None - else: - # otherwise we have been loaded - a = t.annotator - t.frozen = False +def analyse(t, inputtypes): standalone = inputtypes is None if standalone: @@ -152,7 +138,7 @@ unixcheckpoint.restartable_point(auto='run') if a: t.frozen = True # cannot freeze if we don't have annotations - return t, entry_point, inputtypes, standalone + return standalone def assert_rpython_mostly_not_imported(): prefix = 'pypy.rpython.' 
@@ -370,8 +356,23 @@ optnames.sort() for name in optnames: print ' %25s: %s' %(name, options[name]) + + policy = AnnotatorPolicy() + target = targetspec_dic['target'] + if target: + spec = target(not options1.lowmem) + try: + entry_point, inputtypes, policy = spec + except ValueError: + entry_point, inputtypes = spec + t = Translator(entry_point, verbose=True, simplifying=True) + a = None + else: + # otherwise we have been loaded + a = t.annotator + t.frozen = False try: - t, entry_point, inputtypes, standalone = analyse(targetspec_dic['target']) + standalone = analyse(t, inputtypes) except TyperError: err = sys.exc_info() print '-'*60 From pedronis at codespeak.net Sun Sep 11 01:10:23 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sun, 11 Sep 2005 01:10:23 +0200 (CEST) Subject: [pypy-svn] r17445 - in pypy/dist/pypy: annotation rpython rpython/test Message-ID: <20050910231023.6FB0F27B5E@code1.codespeak.net> Author: pedronis Date: Sun Sep 11 01:10:15 2005 New Revision: 17445 Modified: pypy/dist/pypy/annotation/builtin.py pypy/dist/pypy/rpython/objectmodel.py pypy/dist/pypy/rpython/rbuiltin.py pypy/dist/pypy/rpython/rmodel.py pypy/dist/pypy/rpython/rpbc.py pypy/dist/pypy/rpython/test/test_rpbc.py Log: start of support for special-cased helper hlinvoke to call high-level callables from low-level helpers, given a repr and a ll value for them. with tests, which given the juggling of levels are a bit imperscrutable on their own Modified: pypy/dist/pypy/annotation/builtin.py ============================================================================== --- pypy/dist/pypy/annotation/builtin.py (original) +++ pypy/dist/pypy/annotation/builtin.py Sun Sep 11 01:10:15 2005 @@ -10,7 +10,7 @@ from pypy.annotation.model import SomeUnicodeCodePoint, SomeAddress from pypy.annotation.model import SomeFloat, unionof from pypy.annotation.model import SomePBC, SomeInstance, SomeDict -from pypy.annotation.model import annotation_to_lltype +from pypy.annotation.model import annotation_to_lltype, lltype_to_annotation from pypy.annotation.model import add_knowntypedata from pypy.annotation.bookkeeper import getbookkeeper from pypy.objspace.flow.model import Constant @@ -241,6 +241,14 @@ dictdef.dictkey.update_rdict_annotations(s_eqfn, s_hashfn) return SomeDict(dictdef) + +def robjmodel_hlinvoke(s_repr, s_llcallable, *args_s): + from pypy.rpython import rmodel + assert s_repr.is_constant() and isinstance(s_repr.const, rmodel.Repr),"hlinvoke expects a constant repr as first argument" + r_func, _ = s_repr.const.get_r_implfunc() + f, rinputs, rresult = r_func.get_signature() + return lltype_to_annotation(rresult.lowleveltype) + ##def rarith_ovfcheck(s_obj): ## if isinstance(s_obj, SomeInteger) and s_obj.unsigned: ## getbookkeeper().warning("ovfcheck on unsigned") @@ -280,6 +288,7 @@ BUILTIN_ANALYZERS[pypy.rpython.objectmodel.we_are_translated] = ( robjmodel_we_are_translated) BUILTIN_ANALYZERS[pypy.rpython.objectmodel.r_dict] = robjmodel_r_dict +BUILTIN_ANALYZERS[pypy.rpython.objectmodel.hlinvoke] = robjmodel_hlinvoke BUILTIN_ANALYZERS[Exception.__init__.im_func] = exception_init BUILTIN_ANALYZERS[OSError.__init__.im_func] = exception_init Modified: pypy/dist/pypy/rpython/objectmodel.py ============================================================================== --- pypy/dist/pypy/rpython/objectmodel.py (original) +++ pypy/dist/pypy/rpython/objectmodel.py Sun Sep 11 01:10:15 2005 @@ -30,6 +30,11 @@ obj.__dict__ = {} obj.__class__ = FREED_OBJECT +# __ invoke XXX this doesn't seem completely 
the right place for this + +def hlinvoke(repr, llcallable, *args): + raise TypeError, "invoke is meant to be rtyped and not called direclty" + # ____________________________________________________________ Modified: pypy/dist/pypy/rpython/rbuiltin.py ============================================================================== --- pypy/dist/pypy/rpython/rbuiltin.py (original) +++ pypy/dist/pypy/rpython/rbuiltin.py Sun Sep 11 01:10:15 2005 @@ -220,6 +220,35 @@ def rtype_we_are_translated(hop): return hop.inputconst(lltype.Bool, True) +def rtype_hlinvoke(hop): + _, s_repr = hop.r_s_popfirstarg() + r_callable = s_repr.const + + r_func, nimplicitarg = r_callable.get_r_implfunc() + s_callable = r_callable.get_s_callable() + + _, rinputs, rresult = r_func.get_signature() + args_s, s_ret = r_func.get_args_ret_s() + + args_s = args_s[nimplicitarg:] + rinputs = rinputs[nimplicitarg:] + + assert 1+len(args_s) == len(hop.args_s) + + new_args_r = [r_callable] + rinputs + + for i in range(len(new_args_r)): + assert hop.args_r[i].lowleveltype == new_args_r[i].lowleveltype + + hop.args_r = new_args_r + hop.args_s = [s_callable] + args_s + + hop.s_result = s_ret + assert hop.r_result.lowleveltype == rresult.lowleveltype + hop.r_result = rresult + + return hop.dispatch() + # collect all functions import __builtin__ @@ -279,6 +308,8 @@ BUILTIN_TYPER[objectmodel.instantiate] = rtype_instantiate BUILTIN_TYPER[objectmodel.we_are_translated] = rtype_we_are_translated +BUILTIN_TYPER[objectmodel.hlinvoke] = rtype_hlinvoke + from pypy.rpython import extfunctable def make_rtype_extfunc(extfuncinfo): Modified: pypy/dist/pypy/rpython/rmodel.py ============================================================================== --- pypy/dist/pypy/rpython/rmodel.py (original) +++ pypy/dist/pypy/rpython/rmodel.py Sun Sep 11 01:10:15 2005 @@ -155,6 +155,14 @@ def make_iterator_repr(self, *variant): raise TyperError("%s is not iterable" % (self,)) + # hlinvoke helpers + + def get_r_implfunc(self): + raise TyperError("%s has no corresponding implementation function representation" % (self,)) + + def get_s_callable(self): + raise TyperError("%s is not callable or cannot reconstruct a pbc annotation for itself" % (self,)) + def ll_hash_void(v): return 0 Modified: pypy/dist/pypy/rpython/rpbc.py ============================================================================== --- pypy/dist/pypy/rpython/rpbc.py (original) +++ pypy/dist/pypy/rpython/rpbc.py Sun Sep 11 01:10:15 2005 @@ -362,6 +362,21 @@ assert sig0[1:] == sig1[1:] # XXX not implemented self.lowleveltype = typeOf(sig0[0]) + def get_s_callable(self): + return self.s_pbc + + def get_r_implfunc(self): + return self, 0 + + def get_signature(self): + return self.function_signatures().itervalues().next() + + def get_args_ret_s(self): + f, _, _ = self.get_signature() + graph = f._obj.graph + rtyper = self.rtyper + return [rtyper.binding(arg) for arg in graph.getargs()], rtyper.binding(graph.getreturnvar()) + def function_signatures(self): if self._function_signatures is None: self._function_signatures = {} @@ -489,6 +504,14 @@ raise TyperError("not a bound method: %r" % method) return self.r_im_self.convert_const(method.im_self) + def get_r_implfunc(self): + r_class = self.r_im_self.rclass + mangled_name, r_func = r_class.clsfields[self.methodname] + return r_func, 1 + + def get_s_callable(self): + return self.s_pbc + def get_method_from_instance(self, r_inst, v_inst, llops): # The 'self' might have to be cast to a parent class # (as shown for example in 
test_rclass/test_method_both_A_and_B) Modified: pypy/dist/pypy/rpython/test/test_rpbc.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rpbc.py (original) +++ pypy/dist/pypy/rpython/test/test_rpbc.py Sun Sep 11 01:10:15 2005 @@ -846,3 +846,140 @@ assert res == 42 res = interpret(f, [1]) assert res == 42 + +def test_hlinvoke_simple(): + def f(a,b): + return a + b + from pypy.translator import annrpython + a = annrpython.RPythonAnnotator() + from pypy.annotation import model as annmodel + + def g(): + f(2,3) + f(4,5) + + a.build_types(g, []) + + from pypy.rpython import rtyper + rt = rtyper.RPythonTyper(a) + rt.specialize() + + def ll_h(R, f, x): + from pypy.rpython.objectmodel import hlinvoke + return hlinvoke(R, f, x, 2) + + from pypy.rpython import annlowlevel + + s_f = a.bookkeeper.immutablevalue(f) + r_f = rt.getrepr(s_f) + + s_R = a.bookkeeper.immutablevalue(r_f) + s_ll_f = annmodel.lltype_to_annotation(r_f.lowleveltype) + s, llfunction = annlowlevel.annotate_lowlevel_helper(a, ll_h, [s_R, s_ll_f, annmodel.SomeInteger()]) + assert s.knowntype == int + rt.specialize_more_blocks() + + from pypy.rpython.llinterp import LLInterpreter + interp = LLInterpreter(a.translator.flowgraphs, rt) + + c_f = r_f.convert_const(f) + #a.translator.view() + res = interp.eval_function(llfunction, [None, c_f, 3]) + assert res == 5 + +def test_hlinvoke_hltype(): + class A(object): + def __init__(self, v): + self.v = v + def f(a): + return A(a) + + from pypy.translator import annrpython + a = annrpython.RPythonAnnotator() + from pypy.annotation import model as annmodel + + def g(): + a = A(None) + f(a) + + a.build_types(g, []) + + from pypy.rpython import rtyper + from pypy.rpython import rclass + rt = rtyper.RPythonTyper(a) + rt.specialize() + + def ll_h(R, f, a): + from pypy.rpython.objectmodel import hlinvoke + return hlinvoke(R, f, a) + + from pypy.rpython import annlowlevel + + s_f = a.bookkeeper.immutablevalue(f) + r_f = rt.getrepr(s_f) + + s_R = a.bookkeeper.immutablevalue(r_f) + s_ll_f = annmodel.lltype_to_annotation(r_f.lowleveltype) + A_repr = rclass.getinstancerepr(rt, a.getuserclasses()[A]) + s, llfunction = annlowlevel.annotate_lowlevel_helper(a, ll_h, [s_R, s_ll_f, annmodel.SomePtr(A_repr.lowleveltype)]) + assert s.ll_ptrtype == A_repr.lowleveltype + rt.specialize_more_blocks() + + from pypy.rpython.llinterp import LLInterpreter + interp = LLInterpreter(a.translator.flowgraphs, rt) + + c_f = r_f.convert_const(f) + #a.translator.view() + c_a = A_repr.convert_const(A(None)) + res = interp.eval_function(llfunction, [None, c_f, c_a]) + assert typeOf(res) == A_repr.lowleveltype + +def test_hlinvoke_method_hltype(): + class A(object): + def __init__(self, v): + self.v = v + class Impl(object): + def f(self, a): + return A(a) + + from pypy.translator import annrpython + a = annrpython.RPythonAnnotator() + from pypy.annotation import model as annmodel + + def g(): + a = A(None) + i = Impl() + i.f(a) + + a.build_types(g, []) + + from pypy.rpython import rtyper + from pypy.rpython import rclass + rt = rtyper.RPythonTyper(a) + rt.specialize() + + def ll_h(R, f, a): + from pypy.rpython.objectmodel import hlinvoke + return hlinvoke(R, f, a) + + from pypy.rpython import annlowlevel + + Impl_def = a.getuserclasses()[Impl] + s_f = annmodel.SomePBC({Impl.f.im_func: Impl_def}) + r_f = rt.getrepr(s_f) + + s_R = a.bookkeeper.immutablevalue(r_f) + s_ll_f = annmodel.lltype_to_annotation(r_f.lowleveltype) + A_repr = rclass.getinstancerepr(rt, 
a.getuserclasses()[A]) + s, llfunction = annlowlevel.annotate_lowlevel_helper(a, ll_h, [s_R, s_ll_f, annmodel.SomePtr(A_repr.lowleveltype)]) + assert s.ll_ptrtype == A_repr.lowleveltype + rt.specialize_more_blocks() + + from pypy.rpython.llinterp import LLInterpreter + interp = LLInterpreter(a.translator.flowgraphs, rt) + + # low-level value is just the instance + c_f = rclass.getinstancerepr(rt, Impl_def).convert_const(Impl()) + c_a = A_repr.convert_const(A(None)) + res = interp.eval_function(llfunction, [None, c_f, c_a]) + assert typeOf(res) == A_repr.lowleveltype From pedronis at codespeak.net Sun Sep 11 01:25:40 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sun, 11 Sep 2005 01:25:40 +0200 (CEST) Subject: [pypy-svn] r17446 - pypy/dist/pypy/rpython/test Message-ID: <20050910232540.7998227B6E@code1.codespeak.net> Author: pedronis Date: Sun Sep 11 01:25:38 2005 New Revision: 17446 Modified: pypy/dist/pypy/rpython/test/test_rpbc.py Log: improve hlinvoke tests: - when there's only one callable function this is carried around as Void, no need to supply it - case with callable family with more than one element Modified: pypy/dist/pypy/rpython/test/test_rpbc.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rpbc.py (original) +++ pypy/dist/pypy/rpython/test/test_rpbc.py Sun Sep 11 01:25:38 2005 @@ -882,11 +882,58 @@ from pypy.rpython.llinterp import LLInterpreter interp = LLInterpreter(a.translator.flowgraphs, rt) - c_f = r_f.convert_const(f) #a.translator.view() - res = interp.eval_function(llfunction, [None, c_f, 3]) + res = interp.eval_function(llfunction, [None, None, 3]) assert res == 5 +def test_hlinvoke_simple2(): + def f1(a,b): + return a + b + def f2(a,b): + return a - b + from pypy.translator import annrpython + a = annrpython.RPythonAnnotator() + from pypy.annotation import model as annmodel + + def g(i): + if i: + f = f1 + else: + f = f2 + f(5,4) + f(3,2) + + a.build_types(g, [int]) + + from pypy.rpython import rtyper + rt = rtyper.RPythonTyper(a) + rt.specialize() + + def ll_h(R, f, x): + from pypy.rpython.objectmodel import hlinvoke + return hlinvoke(R, f, x, 2) + + from pypy.rpython import annlowlevel + + s_f = annmodel.SomePBC({f1: True, f2: True}) + r_f = rt.getrepr(s_f) + + s_R = a.bookkeeper.immutablevalue(r_f) + s_ll_f = annmodel.lltype_to_annotation(r_f.lowleveltype) + s, llfunction = annlowlevel.annotate_lowlevel_helper(a, ll_h, [s_R, s_ll_f, annmodel.SomeInteger()]) + assert s.knowntype == int + rt.specialize_more_blocks() + + from pypy.rpython.llinterp import LLInterpreter + interp = LLInterpreter(a.translator.flowgraphs, rt) + + #a.translator.view() + res = interp.eval_function(llfunction, [None, r_f.convert_const(f1), 3]) + assert res == 5 + res = interp.eval_function(llfunction, [None, r_f.convert_const(f2), 3]) + assert res == 1 + + def test_hlinvoke_hltype(): class A(object): def __init__(self, v): @@ -928,10 +975,9 @@ from pypy.rpython.llinterp import LLInterpreter interp = LLInterpreter(a.translator.flowgraphs, rt) - c_f = r_f.convert_const(f) #a.translator.view() c_a = A_repr.convert_const(A(None)) - res = interp.eval_function(llfunction, [None, c_f, c_a]) + res = interp.eval_function(llfunction, [None, None, c_a]) assert typeOf(res) == A_repr.lowleveltype def test_hlinvoke_method_hltype(): From pedronis at codespeak.net Sun Sep 11 01:51:02 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sun, 11 Sep 2005 01:51:02 +0200 (CEST) Subject: [pypy-svn] 
r17447 - in pypy/dist/pypy/rpython: . test Message-ID: <20050910235102.CD9A727B6E@code1.codespeak.net> Author: pedronis Date: Sun Sep 11 01:51:00 2005 New Revision: 17447 Modified: pypy/dist/pypy/rpython/rpbc.py pypy/dist/pypy/rpython/test/test_rpbc.py Log: hlinvoke support for frozen pbc methods Modified: pypy/dist/pypy/rpython/rpbc.py ============================================================================== --- pypy/dist/pypy/rpython/rpbc.py (original) +++ pypy/dist/pypy/rpython/rpbc.py Sun Sep 11 01:51:00 2005 @@ -287,6 +287,13 @@ self.r_im_self = rtyper.getrepr(self.s_im_self) self.lowleveltype = self.r_im_self.lowleveltype + def get_s_callable(self): + return annmodel.SomePBC({self.function: True}) + + def get_r_implfunc(self): + r_func = self.rtyper.getrepr(self.get_s_callable()) + return r_func, 1 + def convert_const(self, method): if getattr(method, 'im_func', None) is not self.function: raise TyperError("not a method bound on %r: %r" % (self.function, Modified: pypy/dist/pypy/rpython/test/test_rpbc.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rpbc.py (original) +++ pypy/dist/pypy/rpython/test/test_rpbc.py Sun Sep 11 01:51:00 2005 @@ -1029,3 +1029,55 @@ c_a = A_repr.convert_const(A(None)) res = interp.eval_function(llfunction, [None, c_f, c_a]) assert typeOf(res) == A_repr.lowleveltype + +def test_hlinvoke_pbc_method_hltype(): + class A(object): + def __init__(self, v): + self.v = v + class Impl(object): + def _freeze_(self): + return True + + def f(self, a): + return A(a) + + from pypy.translator import annrpython + a = annrpython.RPythonAnnotator() + from pypy.annotation import model as annmodel + + i = Impl() + + def g(): + a = A(None) + i.f(a) + + a.build_types(g, []) + + from pypy.rpython import rtyper + from pypy.rpython import rclass + rt = rtyper.RPythonTyper(a) + rt.specialize() + + def ll_h(R, f, a): + from pypy.rpython.objectmodel import hlinvoke + return hlinvoke(R, f, a) + + from pypy.rpython import annlowlevel + + s_f = a.bookkeeper.immutablevalue(i.f) + r_f = rt.getrepr(s_f) + + s_R = a.bookkeeper.immutablevalue(r_f) + s_ll_f = annmodel.lltype_to_annotation(r_f.lowleveltype) + A_repr = rclass.getinstancerepr(rt, a.getuserclasses()[A]) + s, llfunction = annlowlevel.annotate_lowlevel_helper(a, ll_h, [s_R, s_ll_f, annmodel.SomePtr(A_repr.lowleveltype)]) + assert s.ll_ptrtype == A_repr.lowleveltype + rt.specialize_more_blocks() + + from pypy.rpython.llinterp import LLInterpreter + interp = LLInterpreter(a.translator.flowgraphs, rt) + + c_f = r_f.convert_const(i.f) + c_a = A_repr.convert_const(A(None)) + res = interp.eval_function(llfunction, [None, c_f, c_a]) + assert typeOf(res) == A_repr.lowleveltype From ludal at codespeak.net Sun Sep 11 02:06:11 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Sun, 11 Sep 2005 02:06:11 +0200 (CEST) Subject: [pypy-svn] r17448 - pypy/dist/pypy/annotation Message-ID: <20050911000611.18A7D27B6E@code1.codespeak.net> Author: ludal Date: Sun Sep 11 02:06:09 2005 New Revision: 17448 Modified: pypy/dist/pypy/annotation/model.py pypy/dist/pypy/annotation/unaryop.py Log: Propagate not None info accross : if string: xxx # string is true implies string is not None else: yyy I needed to ignore knowntypedata for SomeBool comparison ( not sure it's the correct fix ) this is needed because of the assertion in setbinding and the way contains is implemented pypy still annotate and compile correctly with this We could do the same with 'if SomeInstance(): xxx' 
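For illustration only, and not part of any checkin in this archive: a minimal sketch of the RPython-level pattern this log message is about. The function name is made up; the comments state what the annotator can conclude once truth-testing a possibly-None string narrows it to not-None in the true branch::

    def first_char_or_dash(s):
        # s may come in annotated as "string or None"
        if s:               # truth of s implies s is not None (and non-empty)
            return s[0]     # here s can be treated as a plain, non-None string
        return '-'

The unaryop.py hunk below records this as knowntypedata on the SomeBool returned by is_true, using str.nonnoneify() for the true branch and the original annotation for the false branch.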
Modified: pypy/dist/pypy/annotation/model.py ============================================================================== --- pypy/dist/pypy/annotation/model.py (original) +++ pypy/dist/pypy/annotation/model.py Sun Sep 11 02:06:09 2005 @@ -171,6 +171,20 @@ unsigned = False def __init__(self): pass + def __eq__(self, other): + if self.__class__ is not other.__class__: + return False + if 'knowntypedata' in self.__dict__: + selfdic = self.__dict__.copy() + del selfdic['knowntypedata'] + else: + selfdic = self.__dict__ + if 'knowntypedata' in other.__dict__: + otherdic = other.__dict__.copy() + del otherdic['knowntypedata'] + else: + otherdic = other.__dict__ + return selfdic == otherdic class SomeString(SomeObject): "Stands for an object which is known to be a string." Modified: pypy/dist/pypy/annotation/unaryop.py ============================================================================== --- pypy/dist/pypy/annotation/unaryop.py (original) +++ pypy/dist/pypy/annotation/unaryop.py Sun Sep 11 02:06:09 2005 @@ -14,6 +14,7 @@ from pypy.annotation.model import SomeExternalObject from pypy.annotation.model import SomeTypedAddressAccess, SomeAddress from pypy.annotation.model import unionof, set, setunion, missing_operation +from pypy.annotation.model import add_knowntypedata from pypy.annotation.bookkeeper import getbookkeeper, RPythonCallsSpace from pypy.annotation.classdef import isclassdef from pypy.annotation import builtin @@ -369,6 +370,24 @@ def method_upper(str): return SomeString() + def is_true(str): + r = SomeObject.is_true(str) + if not isinstance(r, SomeBool): + return r + bk = getbookkeeper() + knowntypedata = r.knowntypedata = {} + fn, block, i = bk.position_key + + annotator = bk.annotator + op = block.operations[i] + assert op.opname == "is_true" or op.opname == "nonzero" + assert len(op.args) == 1 + arg = op.args[0] + add_knowntypedata(knowntypedata, False, [arg], str) + add_knowntypedata(knowntypedata, True, [arg], str.nonnoneify()) + return r + + class __extend__(SomeChar): From pedronis at codespeak.net Sun Sep 11 02:34:37 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sun, 11 Sep 2005 02:34:37 +0200 (CEST) Subject: [pypy-svn] r17449 - pypy/dist/pypy/annotation Message-ID: <20050911003437.0870727B6E@code1.codespeak.net> Author: pedronis Date: Sun Sep 11 02:34:36 2005 New Revision: 17449 Modified: pypy/dist/pypy/annotation/model.py pypy/dist/pypy/annotation/unaryop.py Log: disable this changes until there are tests for them, especially about why bool comparison needed to be changed sorry, but changing the annotator without a trail of tests that allow to understand what is going on is not a good idea Modified: pypy/dist/pypy/annotation/model.py ============================================================================== --- pypy/dist/pypy/annotation/model.py (original) +++ pypy/dist/pypy/annotation/model.py Sun Sep 11 02:34:36 2005 @@ -171,20 +171,20 @@ unsigned = False def __init__(self): pass - def __eq__(self, other): - if self.__class__ is not other.__class__: - return False - if 'knowntypedata' in self.__dict__: - selfdic = self.__dict__.copy() - del selfdic['knowntypedata'] - else: - selfdic = self.__dict__ - if 'knowntypedata' in other.__dict__: - otherdic = other.__dict__.copy() - del otherdic['knowntypedata'] - else: - otherdic = other.__dict__ - return selfdic == otherdic + #def __eq__(self, other): + # if self.__class__ is not other.__class__: + # return False + # if 'knowntypedata' in self.__dict__: + # selfdic = self.__dict__.copy() + 
# del selfdic['knowntypedata'] + # else: + # selfdic = self.__dict__ + # if 'knowntypedata' in other.__dict__: + # otherdic = other.__dict__.copy() + # del otherdic['knowntypedata'] + # else: + # otherdic = other.__dict__ + # return selfdic == otherdic class SomeString(SomeObject): "Stands for an object which is known to be a string." Modified: pypy/dist/pypy/annotation/unaryop.py ============================================================================== --- pypy/dist/pypy/annotation/unaryop.py (original) +++ pypy/dist/pypy/annotation/unaryop.py Sun Sep 11 02:34:36 2005 @@ -370,22 +370,22 @@ def method_upper(str): return SomeString() - def is_true(str): - r = SomeObject.is_true(str) - if not isinstance(r, SomeBool): - return r - bk = getbookkeeper() - knowntypedata = r.knowntypedata = {} - fn, block, i = bk.position_key - - annotator = bk.annotator - op = block.operations[i] - assert op.opname == "is_true" or op.opname == "nonzero" - assert len(op.args) == 1 - arg = op.args[0] - add_knowntypedata(knowntypedata, False, [arg], str) - add_knowntypedata(knowntypedata, True, [arg], str.nonnoneify()) - return r + #def is_true(str): + # r = SomeObject.is_true(str) + # if not isinstance(r, SomeBool): + # return r + # bk = getbookkeeper() + # knowntypedata = r.knowntypedata = {} + # fn, block, i = bk.position_key + # + # annotator = bk.annotator + # op = block.operations[i] + # assert op.opname == "is_true" or op.opname == "nonzero" + # assert len(op.args) == 1 + # arg = op.args[0] + # add_knowntypedata(knowntypedata, False, [arg], str) + # add_knowntypedata(knowntypedata, True, [arg], str.nonnoneify()) + # return r From ludal at codespeak.net Sun Sep 11 03:51:31 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Sun, 11 Sep 2005 03:51:31 +0200 (CEST) Subject: [pypy-svn] r17453 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050911015131.6809027B86@code1.codespeak.net> Author: ludal Date: Sun Sep 11 03:51:28 2005 New Revision: 17453 Modified: pypy/dist/pypy/interpreter/astcompiler/misc.py pypy/dist/pypy/interpreter/astcompiler/pyassem.py pypy/dist/pypy/interpreter/astcompiler/pycodegen.py pypy/dist/pypy/interpreter/astcompiler/symbols.py Log: More steps toward translation: - specialize Sets for Blocks using Block.bid - move Counter to misc.py - use Block.bid as key instead of Block() in fixupOrderForward Modified: pypy/dist/pypy/interpreter/astcompiler/misc.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/misc.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/misc.py Sun Sep 11 03:51:28 2005 @@ -10,6 +10,15 @@ elts.append(elt) return elts +class Counter: + def __init__(self, initial): + self.count = initial + + def next(self): + i = self.count + self.count += 1 + return i + class Set: _annspecialcase_ = "specialize:ctr_location" # polymorphic Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pyassem.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pyassem.py Sun Sep 11 03:51:28 2005 @@ -11,6 +11,30 @@ from pypy.interpreter.baseobjspace import W_Root + +class BlockSet: + """A Set implementation specific to Blocks + it uses Block.bid as keys to underlying dict""" + def __init__(self): + self.elts = {} + def __len__(self): + return len(self.elts) + def __contains__(self, elt): + return elt.bid in self.elts + def add(self, elt): + self.elts[elt.bid] = elt + 
def elements(self): + return self.elts.values() + def has_elt(self, elt): + return elt.bid in self.elts + def remove(self, elt): + del self.elts[elt.bid] + def copy(self): + c = BlockSet() + c.elts.update(self.elts) + return c + + class Instr: has_arg = False @@ -70,7 +94,7 @@ self.space = space self.current = self.entry = Block(space) self.exit = Block(space,"exit") - self.blocks = misc.Set() + self.blocks = BlockSet() self.blocks.add(self.entry) self.blocks.add(self.exit) @@ -246,7 +270,7 @@ chains = [] cur = [] for b in blocks: - index[b] = len(chains) + index[b.bid] = len(chains) cur.append(b) if b.next and b.next[0] == default_next: chains.append(cur) @@ -260,7 +284,7 @@ l = chains[i] for b in l: for c in b.get_children(): - if index[c] < i: + if index[c.bid] < i: forward_p = 0 for inst in b.insts: if inst.op == 'JUMP_FORWARD': @@ -269,7 +293,7 @@ forward_p = 1 if not forward_p: continue - constraints.append((index[c], i)) + constraints.append((index[c.bid], i)) if not constraints: break @@ -311,28 +335,18 @@ order.append(b) return order -class BlockCounter: - def __init__(self): - self._count = 0 - - def inc(self): - self._count += 1 +BlockCounter = misc.Counter(0) - def value(self): - return self._count - class Block: - _count = BlockCounter() def __init__(self, space, label=''): self.insts = [] - self.inEdges = misc.Set() - self.outEdges = misc.Set() + self.inEdges = BlockSet() + self.outEdges = BlockSet() self.label = label - self.bid = Block._count.value() + self.bid = BlockCounter.next() self.next = [] self.space = space - Block._count.inc() def __repr__(self): if self.label: Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Sun Sep 11 03:51:28 2005 @@ -1188,7 +1188,7 @@ node.expr.accept( self ) self.emit('PRINT_EXPR') -AbstractFunctionCodeLambdaCounter = symbols.Counter(0) +AbstractFunctionCodeLambdaCounter = misc.Counter(0) class AbstractFunctionCode(CodeGenerator): def __init__(self, space, func, isLambda, class_name, mod): @@ -1246,7 +1246,8 @@ elif isinstance(elt, ast.AssTuple): self.unpackSequence( elt ) else: - raise TypeError( "Got argument %s of type %s" % (elt,type(elt))) + #raise TypeError( "Got argument %s of type %s" % (elt,type(elt))) + raise TypeError( "Got unexpected argument" ) unpackTuple = unpackSequence Modified: pypy/dist/pypy/interpreter/astcompiler/symbols.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/symbols.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/symbols.py Sun Sep 11 03:51:28 2005 @@ -3,7 +3,7 @@ from pypy.interpreter.astcompiler import ast from pypy.interpreter.astcompiler.consts import SC_LOCAL, SC_GLOBAL, \ SC_FREE, SC_CELL, SC_UNKNOWN, SC_REALLY_GLOBAL -from pypy.interpreter.astcompiler.misc import mangle +from pypy.interpreter.astcompiler.misc import mangle, Counter from pypy.interpreter.pyparser.error import SyntaxError import types @@ -12,15 +12,6 @@ MANGLE_LEN = 256 -class Counter: - def __init__(self, initial): - self.count = initial - - def next(self): - i = self.count - self.count += 1 - return i - class Scope: # XXX how much information do I need about each name? 
def __init__(self, name, module, klass=None): From arigo at codespeak.net Sun Sep 11 13:32:58 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 11 Sep 2005 13:32:58 +0200 (CEST) Subject: [pypy-svn] r17455 - in pypy/dist/pypy/rpython: . test Message-ID: <20050911113258.3707F27B84@code1.codespeak.net> Author: arigo Date: Sun Sep 11 13:32:56 2005 New Revision: 17455 Modified: pypy/dist/pypy/rpython/llinterp.py pypy/dist/pypy/rpython/objectmodel.py pypy/dist/pypy/rpython/rbuiltin.py pypy/dist/pypy/rpython/rdict.py pypy/dist/pypy/rpython/rpbc.py pypy/dist/pypy/rpython/test/test_objectmodel.py pypy/dist/pypy/rpython/test/test_rconstantdict.py Log: Progress on test_rtype_r_dict, but not passing yet. * the r_dict becomes a GcStruct with two new fields, fnkeyeq and fnkeyhash, that are function pointers (which usually means just Voids, if the functions in question are constant). * we pass a test about a constant prebuilt r_dict (thanks pedronis for hints). * ll_dict_lookup() is slightly verbose now :-( * fix in FunctionsPBCRepr.convert_const(), which should return None for Voids. * llinterp prints more infos about the LLExceptions it got -- at least the exception's class name. * the _r_dictkey now caches the hash -- needed for rdict.convert_const(). Modified: pypy/dist/pypy/rpython/llinterp.py ============================================================================== --- pypy/dist/pypy/rpython/llinterp.py (original) +++ pypy/dist/pypy/rpython/llinterp.py Sun Sep 11 13:32:56 2005 @@ -39,6 +39,11 @@ llframe = LLFrame(graph, args, self) try: return llframe.eval() + except LLException, e: + etype, evalue = e.args + print "LLEXCEPTION:", etype.name + self.print_traceback() + raise except Exception, e: print "AN ERROR OCCURED:", e self.print_traceback() Modified: pypy/dist/pypy/rpython/objectmodel.py ============================================================================== --- pypy/dist/pypy/rpython/objectmodel.py (original) +++ pypy/dist/pypy/rpython/objectmodel.py Sun Sep 11 13:32:56 2005 @@ -101,24 +101,20 @@ def clear(self): self._dict.clear() - def key_eq(self, key1, key2): - "Called to compare two keys. Can be overridden in subclasses." - return key1 == key2 - - def key_hash(self, key): - "Called to compute the hash of a key. Can be overridden in subclasses." - return hash(key) - def __repr__(self): "Representation for debugging purposes." 
return 'r_dict(%r)' % (dict(self.items()),) + def __hash__(self): + raise TypeError("cannot hash r_dict instances") + class _r_dictkey(object): - __slots__ = ['dic', 'key'] + __slots__ = ['dic', 'key', 'hash'] def __init__(self, dic, key): self.dic = dic self.key = key + self.hash = dic.key_hash(key) def __eq__(self, other): if not isinstance(other, _r_dictkey): return NotImplemented @@ -128,4 +124,4 @@ return NotImplemented return not self.dic.key_eq(self.key, other.key) def __hash__(self): - return self.dic.key_hash(self.key) + return self.hash Modified: pypy/dist/pypy/rpython/rbuiltin.py ============================================================================== --- pypy/dist/pypy/rpython/rbuiltin.py (original) +++ pypy/dist/pypy/rpython/rbuiltin.py Sun Sep 11 13:32:56 2005 @@ -9,6 +9,7 @@ from pypy.rpython.robject import pyobj_repr from pypy.rpython.rfloat import float_repr, FloatRepr from pypy.rpython.rbool import bool_repr +from pypy.rpython.rdict import rtype_r_dict from pypy.rpython import rclass from pypy.tool import sourcetools @@ -154,6 +155,8 @@ #def rtype_builtin_xrange(hop): see rrange.py +#def rtype_r_dict(hop): see rdict.py + def rtype_intmask(hop): vlist = hop.inputargs(lltype.Signed) return vlist[0] @@ -305,6 +308,7 @@ BUILTIN_TYPER[lltype.runtime_type_info] = rtype_runtime_type_info BUILTIN_TYPER[rarithmetic.intmask] = rtype_intmask BUILTIN_TYPER[rarithmetic.r_uint] = rtype_r_uint +BUILTIN_TYPER[objectmodel.r_dict] = rtype_r_dict BUILTIN_TYPER[objectmodel.instantiate] = rtype_instantiate BUILTIN_TYPER[objectmodel.we_are_translated] = rtype_we_are_translated Modified: pypy/dist/pypy/rpython/rdict.py ============================================================================== --- pypy/dist/pypy/rpython/rdict.py (original) +++ pypy/dist/pypy/rpython/rdict.py Sun Sep 11 13:32:56 2005 @@ -3,8 +3,10 @@ from pypy.objspace.flow.model import Constant from pypy.rpython import rmodel, lltype from pypy.rpython.rarithmetic import r_uint +from pypy.rpython.objectmodel import hlinvoke from pypy.rpython import rlist from pypy.rpython import robject +from pypy.rpython import objectmodel # ____________________________________________________________ # @@ -17,7 +19,7 @@ # the array should be inlined and num_pristine_entries is not needed. 
# # struct dictentry { -# DICTSTR key; +# DICTKEY key; # bool valid; # to mark if the entry is filled # bool everused; # to mark if the entry is or has ever been filled # DICTVALUE value; @@ -26,7 +28,9 @@ # struct dicttable { # int num_items; # int num_pristine_entries; # never used entries -# Array *entries; +# Array *entries; +# (Function DICTKEY, DICTKEY -> bool) *fnkeyeq; +# (Function DICTKEY -> int) *fnkeyhash; # } # # @@ -41,17 +45,24 @@ s_value.__class__ is annmodel.SomeObject and s_value.knowntype == object): return robject.pyobj_repr else: + if dictkey.custom_eq_hash: + custom_eq_hash = lambda: (rtyper.getrepr(dictkey.s_rdict_eqfn), + rtyper.getrepr(dictkey.s_rdict_hashfn)) + else: + custom_eq_hash = None return DictRepr(lambda: rtyper.getrepr(s_key), lambda: rtyper.getrepr(s_value), dictkey, - dictvalue) + dictvalue, + custom_eq_hash) def rtyper_makekey(self): return (self.__class__, self.dictdef.dictkey, self.dictdef.dictvalue) class DictRepr(rmodel.Repr): - def __init__(self, key_repr, value_repr, dictkey=None, dictvalue=None): + def __init__(self, key_repr, value_repr, dictkey=None, dictvalue=None, + custom_eq_hash=None): self.DICT = lltype.GcForwardReference() self.lowleveltype = lltype.Ptr(self.DICT) if not isinstance(key_repr, rmodel.Repr): # not computed yet, done by setup() @@ -67,6 +78,8 @@ self.dictkey = dictkey self.dictvalue = dictvalue self.dict_cache = {} + self.custom_eq_hash = custom_eq_hash is not None + self._custom_eq_hash_repr = custom_eq_hash # setup() needs to be called to finish this initialization def _setup_repr(self): @@ -83,11 +96,15 @@ ("everused", lltype.Bool), ("value", self.DICTVALUE)) self.DICTENTRYARRAY = lltype.GcArray(self.DICTENTRY) - self.DICT.become(lltype.GcStruct("dicttable", - ("num_items", lltype.Signed), + fields = [ ("num_items", lltype.Signed), ("num_pristine_entries", lltype.Signed), - ("entries", lltype.Ptr(self.DICTENTRYARRAY)))) - if 'll_keyhash' not in self.__dict__: + ("entries", lltype.Ptr(self.DICTENTRYARRAY)) ] + if self.custom_eq_hash: + self.r_rdict_eqfn, self.r_rdict_hashfn = self._custom_eq_hash_repr() + fields.extend([ ("fnkeyeq", self.r_rdict_eqfn.lowleveltype), + ("fnkeyhash", self.r_rdict_hashfn.lowleveltype) ]) + self.DICT.become(lltype.GcStruct("dicttable", *fields)) + if 'll_keyhash' not in self.__dict__ and not self.custom_eq_hash: # figure out which functions must be used to hash and compare keys self.ll_keyeq = self.key_repr.get_ll_eq_function() # can be None self.ll_keyhash = self.key_repr.get_ll_hash_function() @@ -97,22 +114,49 @@ #dictobj = getattr(dictobj, '__self__', dictobj) if dictobj is None: return nullptr(self.DICT) - if not isinstance(dictobj, dict): + if not isinstance(dictobj, (dict, objectmodel.r_dict)): raise TyperError("expected a dict: %r" % (dictobj,)) try: key = Constant(dictobj) return self.dict_cache[key] except KeyError: self.setup() - l_dict = ll_newdict(self.lowleveltype) - self.dict_cache[key] = l_dict - r_key = self.key_repr - r_value = self.value_repr - for dictkey, dictvalue in dictobj.items(): - llkey = r_key.convert_const(dictkey) - llvalue = r_value.convert_const(dictvalue) - ll_dict_setitem(l_dict, llkey, llvalue, self) - return l_dict + if isinstance(dictobj, objectmodel.r_dict): + l_eqfn = self.r_rdict_eqfn .convert_const(dictobj.key_eq) + l_hashfn = self.r_rdict_hashfn.convert_const(dictobj.key_hash) + l_dict = ll_newdict_custom_eq_hash(l_eqfn, l_hashfn, self) + # a dummy object with ll_keyeq and ll_keyhash methods to + # pass to ll_dict_setitem() + class Dummy: + 
custom_eq_hash = False + def ll_keyeq(self, key1, key2): + # theory: ll_dict_lookup() will only see new items, + # which are never equal to any existing one + return False + def ll_keyhash(self, key): + return self.currenthash + + self.dict_cache[key] = l_dict + r_key = self.key_repr + r_value = self.value_repr + for dictkeycontainer, dictvalue in dictobj._dict.items(): + dummy = Dummy() + dummy.currenthash = dictkeycontainer.hash + llkey = r_key.convert_const(dictkeycontainer.key) + llvalue = r_value.convert_const(dictvalue) + ll_dict_setitem(l_dict, llkey, llvalue, dummy) + return l_dict + + else: + l_dict = ll_newdict(self) + self.dict_cache[key] = l_dict + r_key = self.key_repr + r_value = self.value_repr + for dictkey, dictvalue in dictobj.items(): + llkey = r_key.convert_const(dictkey) + llvalue = r_value.convert_const(dictvalue) + ll_dict_setitem(l_dict, llkey, llvalue, self) + return l_dict def rtype_len(self, hop): v_dict, = hop.inputargs(self) @@ -134,8 +178,9 @@ def rtype_method_copy(self, hop): v_dict, = hop.inputargs(self) + crepr = hop.inputconst(lltype.Void, self) hop.exception_cannot_occur() - return hop.gendirectcall(ll_copy, v_dict) + return hop.gendirectcall(ll_copy, v_dict, crepr) def rtype_method_update(self, hop): v_dic1, v_dic2 = hop.inputargs(self, self) @@ -301,7 +346,10 @@ PERTURB_SHIFT = 5 def ll_dict_lookup(d, key, dictrepr): - hash = dictrepr.ll_keyhash(key) + if dictrepr.custom_eq_hash: + hash = hlinvoke(dictrepr.r_rdict_hashfn, d.fnkeyhash, key) + else: + hash = dictrepr.ll_keyhash(key) entries = d.entries mask = len(entries) - 1 i = r_uint(hash & mask) @@ -313,7 +361,11 @@ if entry.valid: if entry.key == key: return entry # found the entry - if dictrepr.ll_keyeq is not None and dictrepr.ll_keyeq(entry.key, key): + if dictrepr.custom_eq_hash: + res = hlinvoke(dictrepr.r_rdict_eqfn, d.fnkeyeq, entry.key, key) + else: + res = dictrepr.ll_keyeq is not None and dictrepr.ll_keyeq(entry.key, key) + if res: return entry # found the entry freeslot = lltype.nullptr(lltype.typeOf(entry).TO) elif entry.everused: @@ -332,7 +384,11 @@ elif entry.valid: if entry.key == key: return entry - if dictrepr.ll_keyeq is not None and dictrepr.ll_keyeq(entry.key, key): + if dictrepr.custom_eq_hash: + res = hlinvoke(dictrepr.r_rdict_eqfn, d.fnkeyeq, entry.key, key) + else: + res = dictrepr.ll_keyeq is not None and dictrepr.ll_keyeq(entry.key, key) + if res: return entry elif not freeslot: freeslot = entry @@ -341,22 +397,52 @@ # ____________________________________________________________ # # Irregular operations. 
+ DICT_INITSIZE = 8 -def ll_newdict(DICTPTR): - d = lltype.malloc(DICTPTR.TO) - d.entries = lltype.malloc(DICTPTR.TO.entries.TO, DICT_INITSIZE) +def ll_newdict(dictrepr): + assert not dictrepr.custom_eq_hash # use ll_newdict_custom_eq_hash() instead + d = lltype.malloc(dictrepr.DICT) + d.entries = lltype.malloc(dictrepr.DICTENTRYARRAY, DICT_INITSIZE) + d.num_items = 0 # but still be explicit + d.num_pristine_entries = DICT_INITSIZE + return d + +def ll_newdict_custom_eq_hash(eqfn, hashfn, dictrepr): + assert dictrepr.custom_eq_hash + d = lltype.malloc(dictrepr.DICT) + d.entries = lltype.malloc(dictrepr.DICTENTRYARRAY, DICT_INITSIZE) d.num_items = 0 # but still be explicit - d.num_pristine_entries = DICT_INITSIZE + d.num_pristine_entries = DICT_INITSIZE + d.fnkeyeq = eqfn + d.fnkeyhash = hashfn return d +def ll_copy_extra_data(targetdict, sourcedict, dictrepr): + if dictrepr.custom_eq_hash: + targetdict.fnkeyeq = sourcedict.fnkeyeq + targetdict.fnkeyhash = sourcedict.fnkeyhash + def rtype_newdict(hop): + hop.inputargs() # no arguments expected r_dict = hop.r_result if r_dict == robject.pyobj_repr: # special case: SomeObject: SomeObject dicts! cdict = hop.inputconst(robject.pyobj_repr, dict) return hop.genop('simple_call', [cdict], resulttype = robject.pyobj_repr) - c1 = hop.inputconst(lltype.Void, r_dict.lowleveltype) - v_result = hop.gendirectcall(ll_newdict, c1) + crepr = hop.inputconst(lltype.Void, r_dict) + v_result = hop.gendirectcall(ll_newdict, crepr) + return v_result + +def rtype_r_dict(hop): + r_dict = hop.r_result + if not r_dict.custom_eq_hash: + raise TyperError("r_dict() call does not return an r_dict instance") + v_eqfn, v_hashfn = hop.inputargs(r_dict.r_rdict_eqfn, + r_dict.r_rdict_hashfn) + crepr = hop.inputconst(lltype.Void, r_dict) + hop.exception_cannot_occur() + v_result = hop.gendirectcall(ll_newdict_custom_eq_hash, + v_eqfn, v_hashfn, crepr) return v_result # ____________________________________________________________ @@ -423,15 +509,15 @@ else: return default -def ll_copy(dict): - DICTPTR = lltype.typeOf(dict) - d = lltype.malloc(DICTPTR.TO) - d.entries = lltype.malloc(DICTPTR.TO.entries.TO, len(dict.entries)) +def ll_copy(dict, dictrepr): + dictsize = len(dict.entries) + d = lltype.malloc(dictrepr.DICT) + d.entries = lltype.malloc(dictrepr.DICTENTRYARRAY, dictsize) d.num_items = dict.num_items d.num_pristine_entries = dict.num_pristine_entries + ll_copy_extra_data(d, dict, dictrepr) i = 0 - dictlen = len(d.entries) - while i < dictlen: + while i < dictsize: d_entry = d.entries[i] entry = dict.entries[i] d_entry.key = entry.key Modified: pypy/dist/pypy/rpython/rpbc.py ============================================================================== --- pypy/dist/pypy/rpython/rpbc.py (original) +++ pypy/dist/pypy/rpython/rpbc.py Sun Sep 11 13:32:56 2005 @@ -402,8 +402,11 @@ if value not in self.function_signatures(): raise TyperError("%r not in %r" % (value, self.s_pbc.prebuiltinstances)) - f, rinputs, rresult = self.function_signatures()[value] - return f + if self.lowleveltype == Void: + return None + else: + f, rinputs, rresult = self.function_signatures()[value] + return f def rtype_simple_call(self, hop): f, rinputs, rresult = self.function_signatures().itervalues().next() Modified: pypy/dist/pypy/rpython/test/test_objectmodel.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_objectmodel.py (original) +++ pypy/dist/pypy/rpython/test/test_objectmodel.py Sun Sep 11 13:32:56 2005 @@ -28,7 +28,8 @@ 
assert False, "should have raised" assert len(d) == 1 assert 'oops' not in d - assert list(d) == ['hello'] + x, = d + assert x == 'hello' assert d.get('hola', -1) == 42 assert d.get('salut', -1) == -1 d1 = d.copy() @@ -38,9 +39,12 @@ d.update(d1) assert d.values() == [42] assert d.items() == [('hello', 42)] - assert list(d.iterkeys()) == ['hello'] - assert list(d.itervalues()) == [42] - assert list(d.iteritems()) == [('hello', 42)] + x, = d.iterkeys() + assert x == 'hello' + x, = d.itervalues() + assert x == 42 + x, = d.iteritems() + assert x == ('hello', 42) d.clear() assert d.keys() == [] return True # for the tests below Modified: pypy/dist/pypy/rpython/test/test_rconstantdict.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rconstantdict.py (original) +++ pypy/dist/pypy/rpython/test/test_rconstantdict.py Sun Sep 11 13:32:56 2005 @@ -1,5 +1,6 @@ import py -from pypy.rpython.test.test_llinterp import interpret +from pypy.rpython.test.test_llinterp import interpret +from pypy.rpython.objectmodel import r_dict def test_constant_int_dict(): d = {1: 2, 2: 3, 3: 4} @@ -36,3 +37,19 @@ assert res == 123 res = interpret(func, [63]) assert res == 321 + +def test_constant_r_dict(): + def strange_key_eq(key1, key2): + return key1[0] == key2[0] # only the 1st character is relevant + def strange_key_hash(key): + return ord(key[0]) + + d = r_dict(strange_key_eq, strange_key_hash) + d['hello'] = 42 + d['world'] = 43 + def func(i): + return d[chr(i)] + res = interpret(func, [ord('h')]) + assert res == 42 + res = interpret(func, [ord('w')]) + assert res == 43 From arigo at codespeak.net Sun Sep 11 13:49:07 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 11 Sep 2005 13:49:07 +0200 (CEST) Subject: [pypy-svn] r17456 - in pypy/dist/pypy/rpython: . test Message-ID: <20050911114907.1BC9D27B88@code1.codespeak.net> Author: arigo Date: Sun Sep 11 13:49:06 2005 New Revision: 17456 Modified: pypy/dist/pypy/rpython/rdict.py pypy/dist/pypy/rpython/test/test_rconstantdict.py Log: More care is needed, because ll_dict_setitem() can resize the dict. Thanks pedronis Modified: pypy/dist/pypy/rpython/rdict.py ============================================================================== --- pypy/dist/pypy/rpython/rdict.py (original) +++ pypy/dist/pypy/rpython/rdict.py Sun Sep 11 13:49:06 2005 @@ -130,20 +130,28 @@ class Dummy: custom_eq_hash = False def ll_keyeq(self, key1, key2): - # theory: ll_dict_lookup() will only see new items, - # which are never equal to any existing one - return False + # theory: all low-level values we consider as keys + # can be compared by equality (i.e. identity for + # pointers) because the r_dict itself should have + # ensured that it does not store duplicate equal keys. 
+ return key1 == key2 def ll_keyhash(self, key): - return self.currenthash + # theoretically slow, but well (see theory above) + for llkey, hash in self.cache: + if key == llkey: + return hash + raise TyperError("hash missing in convert_const(%r)" % + (dictobj,)) + dummy = Dummy() + dummy.cache = [] self.dict_cache[key] = l_dict r_key = self.key_repr r_value = self.value_repr for dictkeycontainer, dictvalue in dictobj._dict.items(): - dummy = Dummy() - dummy.currenthash = dictkeycontainer.hash llkey = r_key.convert_const(dictkeycontainer.key) llvalue = r_value.convert_const(dictvalue) + dummy.cache.insert(0, (llkey, dictkeycontainer.hash)) ll_dict_setitem(l_dict, llkey, llvalue, dummy) return l_dict Modified: pypy/dist/pypy/rpython/test/test_rconstantdict.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rconstantdict.py (original) +++ pypy/dist/pypy/rpython/test/test_rconstantdict.py Sun Sep 11 13:49:06 2005 @@ -47,9 +47,14 @@ d = r_dict(strange_key_eq, strange_key_hash) d['hello'] = 42 d['world'] = 43 + for x in range(65, 91): + d[chr(x)] = x*x def func(i): return d[chr(i)] res = interpret(func, [ord('h')]) assert res == 42 res = interpret(func, [ord('w')]) assert res == 43 + for x in range(65, 91): + res = interpret(func, [x]) + assert res == x*x From arigo at codespeak.net Sun Sep 11 13:54:39 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 11 Sep 2005 13:54:39 +0200 (CEST) Subject: [pypy-svn] r17457 - pypy/dist/pypy/annotation Message-ID: <20050911115439.D9D1127B88@code1.codespeak.net> Author: arigo Date: Sun Sep 11 13:54:39 2005 New Revision: 17457 Modified: pypy/dist/pypy/annotation/bookkeeper.py Log: Oups, forgot to check this in. Modified: pypy/dist/pypy/annotation/bookkeeper.py ============================================================================== --- pypy/dist/pypy/annotation/bookkeeper.py (original) +++ pypy/dist/pypy/annotation/bookkeeper.py Sun Sep 11 13:54:39 2005 @@ -14,6 +14,7 @@ from pypy.interpreter.pycode import cpython_code_signature from pypy.interpreter.argument import Arguments, ArgErr from pypy.rpython.rarithmetic import r_uint +from pypy.rpython.objectmodel import r_dict from pypy.tool.unionfind import UnionFind from pypy.rpython import lltype from pypy.rpython.memory import lladdress @@ -325,7 +326,7 @@ self.immutable_cache[key] = result for e in x: result.listdef.generalize(self.immutablevalue(e)) - elif tp is dict: # exactly a dict + elif tp is dict or tp is r_dict: key = Constant(x) try: return self.immutable_cache[key] @@ -334,6 +335,11 @@ SomeImpossibleValue(), SomeImpossibleValue())) self.immutable_cache[key] = result + if tp is r_dict: + s_eqfn = self.immutablevalue(x.key_eq) + s_hashfn = self.immutablevalue(x.key_hash) + result.dictdef.dictkey.update_rdict_annotations(s_eqfn, + s_hashfn) for ek, ev in x.iteritems(): result.dictdef.generalize_key(self.immutablevalue(ek)) result.dictdef.generalize_value(self.immutablevalue(ev)) From arigo at codespeak.net Sun Sep 11 14:23:55 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 11 Sep 2005 14:23:55 +0200 (CEST) Subject: [pypy-svn] r17458 - in pypy/dist/pypy: annotation translator translator/test Message-ID: <20050911122355.142AE27B92@code1.codespeak.net> Author: arigo Date: Sun Sep 11 14:23:53 2005 New Revision: 17458 Modified: pypy/dist/pypy/annotation/binaryop.py pypy/dist/pypy/annotation/listdef.py pypy/dist/pypy/annotation/unaryop.py pypy/dist/pypy/translator/annrpython.py 
pypy/dist/pypy/translator/test/test_annrpython.py Log: Operation that provide an item for a list or a key for a dictionary should force generalization of the list or dict to the corresponding type, even operations that don't actually put the item or key into the dict. Otherwise the rtyper gets conversion problems. Modified: pypy/dist/pypy/annotation/binaryop.py ============================================================================== --- pypy/dist/pypy/annotation/binaryop.py (original) +++ pypy/dist/pypy/annotation/binaryop.py Sun Sep 11 14:23:53 2005 @@ -342,12 +342,6 @@ add = union - def inplace_add((lst1, lst2)): # force the union of lst1 and lst2 - lst1.listdef.resize() - lst1.listdef.union(lst2.listdef) - return lst1 - inplace_add.can_only_throw = [] - def eq((lst1, lst2)): lst1.listdef.union(lst2.listdef) return SomeBool() @@ -357,9 +351,7 @@ class __extend__(pairtype(SomeList, SomeObject)): def inplace_add((lst1, obj2)): - lst1.listdef.resize() - s_iter = obj2.iter() - pair(lst1, SomeInteger()).setitem(s_iter.next()) + lst1.method_extend(obj2) return lst1 inplace_add.can_only_throw = [] @@ -391,6 +383,7 @@ def getitem((dic1, obj2)): getbookkeeper().count("dict_getitem", dic1) + dic1.dictdef.generalize_key(obj2) return dic1.dictdef.read_value() getitem.can_only_throw = [KeyError] Modified: pypy/dist/pypy/annotation/listdef.py ============================================================================== --- pypy/dist/pypy/annotation/listdef.py (original) +++ pypy/dist/pypy/annotation/listdef.py Sun Sep 11 14:23:53 2005 @@ -29,7 +29,7 @@ s_value = self.s_value s_other_value = other.s_value s_new_value = unionof(s_value, s_other_value) - if isdegenerated(s_new_value): + if isdegenerated(s_new_value) and self.bookkeeper: self.bookkeeper.ondegenerated(self, s_new_value) if s_new_value != s_value: self.s_value = s_new_value @@ -47,7 +47,7 @@ def generalize(self, s_other_value): s_new_value = unionof(self.s_value, s_other_value) - if isdegenerated(s_new_value): + if isdegenerated(s_new_value) and self.bookkeeper: self.bookkeeper.ondegenerated(self, s_new_value) updated = s_new_value != self.s_value if updated: Modified: pypy/dist/pypy/annotation/unaryop.py ============================================================================== --- pypy/dist/pypy/annotation/unaryop.py (original) +++ pypy/dist/pypy/annotation/unaryop.py Sun Sep 11 14:23:53 2005 @@ -159,6 +159,9 @@ getbookkeeper().warning("cannot follow call(%r, %r)" % (obj, args)) return SomeObject() + def op_contains(obj, s_element): + return SomeBool() + class __extend__(SomeInteger): def invert(self): @@ -227,7 +230,7 @@ def method_append(lst, s_value): lst.listdef.resize() - pair(lst, SomeInteger()).setitem(s_value) + lst.listdef.generalize(s_value) def method_extend(lst, s_iterable): lst.listdef.resize() @@ -235,21 +238,21 @@ lst.listdef.union(s_iterable.listdef) else: s_iter = s_iterable.iter() - pair(lst, SomeInteger()).setitem(s_iter.next()) + self.method_append(s_iter.next()) def method_reverse(lst): lst.listdef.mutate() def method_insert(lst, s_index, s_value): - lst.listdef.resize() - pair(lst, SomeInteger()).setitem(s_value) + self.method_append(lst, s_value) def method_pop(lst, s_index=None): lst.listdef.resize() return lst.listdef.read_item() - def method_index(lst, el): + def method_index(lst, s_value): getbookkeeper().count("list_index") + lst.listdef.generalize(s_value) return SomeInteger(nonneg=True) def len(lst): @@ -265,6 +268,10 @@ def getanyitem(lst): return lst.listdef.read_item() + def op_contains(lst, 
s_element): + lst.listdef.generalize(s_element) + return SomeBool() + class __extend__(SomeDict): def len(dct): @@ -322,6 +329,10 @@ def method_clear(dct): pass + def op_contains(dct, s_element): + dct.dictdef.generalize_key(s_element) + return SomeBool() + class __extend__(SomeString): Modified: pypy/dist/pypy/translator/annrpython.py ============================================================================== --- pypy/dist/pypy/translator/annrpython.py (original) +++ pypy/dist/pypy/translator/annrpython.py Sun Sep 11 14:23:53 2005 @@ -635,7 +635,7 @@ # XXX "contains" clash with SomeObject method def consider_op_contains(self, seq, elem): self.bookkeeper.count("contains", seq) - return annmodel.SomeBool() + return seq.op_contains(elem) def consider_op_newtuple(self, *args): return annmodel.SomeTuple(items = args) Modified: pypy/dist/pypy/translator/test/test_annrpython.py ============================================================================== --- pypy/dist/pypy/translator/test/test_annrpython.py (original) +++ pypy/dist/pypy/translator/test/test_annrpython.py Sun Sep 11 14:23:53 2005 @@ -1597,6 +1597,38 @@ s = a.build_types(g, [x]) assert s.const == True + def test_reading_also_generalizes(self): + def f1(i): + d = {'c': i} + return d['not-a-char'], d + a = self.RPythonAnnotator() + s = a.build_types(f1, [int]) + assert dictkey(s.items[1]).__class__ == annmodel.SomeString + def f2(i): + d = {'c': i} + return d.get('not-a-char', i+1), d + a = self.RPythonAnnotator() + s = a.build_types(f2, [int]) + assert dictkey(s.items[1]).__class__ == annmodel.SomeString + def f3(i): + d = {'c': i} + return 'not-a-char' in d, d + a = self.RPythonAnnotator() + s = a.build_types(f3, [int]) + assert dictkey(s.items[1]).__class__ == annmodel.SomeString + def f4(): + lst = ['a', 'b', 'c'] + return 'not-a-char' in lst, lst + a = self.RPythonAnnotator() + s = a.build_types(f4, []) + assert listitem(s.items[1]).__class__ == annmodel.SomeString + def f5(): + lst = ['a', 'b', 'c'] + return lst.index('not-a-char'), lst + a = self.RPythonAnnotator() + s = a.build_types(f5, []) + assert listitem(s.items[1]).__class__ == annmodel.SomeString + def g(n): return [0,1,2,n] From arigo at codespeak.net Sun Sep 11 14:26:10 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 11 Sep 2005 14:26:10 +0200 (CEST) Subject: [pypy-svn] r17459 - in pypy/dist/pypy: annotation rpython/test Message-ID: <20050911122610.B154927B92@code1.codespeak.net> Author: arigo Date: Sun Sep 11 14:26:09 2005 New Revision: 17459 Modified: pypy/dist/pypy/annotation/unaryop.py pypy/dist/pypy/rpython/test/test_rdict.py pypy/dist/pypy/rpython/test/test_rlist.py Log: - typo - tests that the previous check-in made pass Modified: pypy/dist/pypy/annotation/unaryop.py ============================================================================== --- pypy/dist/pypy/annotation/unaryop.py (original) +++ pypy/dist/pypy/annotation/unaryop.py Sun Sep 11 14:26:09 2005 @@ -244,7 +244,7 @@ lst.listdef.mutate() def method_insert(lst, s_index, s_value): - self.method_append(lst, s_value) + lst.method_append(s_value) def method_pop(lst, s_index=None): lst.listdef.resize() Modified: pypy/dist/pypy/rpython/test/test_rdict.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rdict.py (original) +++ pypy/dist/pypy/rpython/test/test_rdict.py Sun Sep 11 14:26:09 2005 @@ -32,6 +32,16 @@ res = interpret(func, [6]) assert res == 0 +def test_dict_but_not_with_char_keys(): + def func(i): + d 
= {'h': i} + try: + return d['hello'] + except KeyError: + return 0 + res = interpret(func, [6]) + assert res == 0 + def test_dict_del_simple(): def func(i): d = {'hello' : i} Modified: pypy/dist/pypy/rpython/test/test_rlist.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rlist.py (original) +++ pypy/dist/pypy/rpython/test/test_rlist.py Sun Sep 11 14:26:09 2005 @@ -320,6 +320,13 @@ res = interpret(fn, [i, case]) assert res is fn(i, case) +def test_not_a_char_list_after_all(): + def fn(): + l = ['h', 'e', 'l', 'l', 'o'] + return 'world' in l + res = interpret(fn, []) + assert res is False + def test_list_index(): def fn(i): foo1 = Foo() From arigo at codespeak.net Sun Sep 11 14:38:41 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 11 Sep 2005 14:38:41 +0200 (CEST) Subject: [pypy-svn] r17460 - pypy/dist/pypy/annotation Message-ID: <20050911123841.2EA8D27B92@code1.codespeak.net> Author: arigo Date: Sun Sep 11 14:38:40 2005 New Revision: 17460 Modified: pypy/dist/pypy/annotation/binaryop.py Log: Forgot dict.__delitem__(). Modified: pypy/dist/pypy/annotation/binaryop.py ============================================================================== --- pypy/dist/pypy/annotation/binaryop.py (original) +++ pypy/dist/pypy/annotation/binaryop.py Sun Sep 11 14:38:40 2005 @@ -393,9 +393,9 @@ dic1.dictdef.generalize_value(s_value) setitem.can_only_throw = [KeyError] - def delitem((dic1, obj1)): + def delitem((dic1, obj2)): getbookkeeper().count("dict_delitem", dic1) - pass + dic1.dictdef.generalize_key(obj2) delitem.can_only_throw = [KeyError] From arigo at codespeak.net Sun Sep 11 14:43:50 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 11 Sep 2005 14:43:50 +0200 (CEST) Subject: [pypy-svn] r17461 - pypy/dist/pypy/rpython/test Message-ID: <20050911124350.C769C27B92@code1.codespeak.net> Author: arigo Date: Sun Sep 11 14:43:50 2005 New Revision: 17461 Modified: pypy/dist/pypy/rpython/test/test_objectmodel.py Log: This test passes eventually. (We might investigate at one point what tuple unpacking on a dictionary really does, annotator- and rtyper-wise, because this gave rather strange results...) 
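The tuple-unpacking pattern in question, ``x, = d``, simply unpacks the dict's iterator, which at plain CPython level yields the keys; presumably it is this iterator-plus-unpack route that gave the annotator and rtyper trouble. For reference, the ordinary CPython behaviour (nothing PyPy-specific) is::

    d = {'hello': 42}
    x, = d                   # iterating a dict yields its keys
    assert x == 'hello'
    k, = d.iterkeys()        # the explicit spellings behave the same way
    v, = d.itervalues()
    item, = d.iteritems()
    assert (k, v) == ('hello', 42) and item == ('hello', 42)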
Modified: pypy/dist/pypy/rpython/test/test_objectmodel.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_objectmodel.py (original) +++ pypy/dist/pypy/rpython/test/test_objectmodel.py Sun Sep 11 14:43:50 2005 @@ -18,18 +18,24 @@ return ord(key[0]) def play_with_r_dict(d): + d['hello'] = 41 d['hello'] = 42 assert d['hi there'] == 42 try: - d["dumb"] + unexpected = d["dumb"] except KeyError: pass else: - assert False, "should have raised" + assert False, "should have raised, got %s" % unexpected assert len(d) == 1 assert 'oops' not in d - x, = d - assert x == 'hello' + + count = 0 + for x in d: + assert x == 'hello' + count += 1 + assert count == 1 + assert d.get('hola', -1) == 42 assert d.get('salut', -1) == -1 d1 = d.copy() @@ -38,13 +44,28 @@ assert d1.keys() == ['hello'] d.update(d1) assert d.values() == [42] - assert d.items() == [('hello', 42)] - x, = d.iterkeys() - assert x == 'hello' - x, = d.itervalues() - assert x == 42 - x, = d.iteritems() - assert x == ('hello', 42) + lst = d.items() + assert len(lst) == 1 and len(lst[0]) == 2 + assert lst[0][0] == 'hello' and lst[0][1] == 42 + + count = 0 + for x in d.iterkeys(): + assert x == 'hello' + count += 1 + assert count == 1 + + count = 0 + for x in d.itervalues(): + assert x == 42 + count += 1 + assert count == 1 + + count = 0 + for x in d.iteritems(): + assert len(x) == 2 and x[0] == 'hello' and x[1] == 42 + count += 1 + assert count == 1 + d.clear() assert d.keys() == [] return True # for the tests below @@ -96,6 +117,6 @@ assert a.binding(graph.getargs()[0]).knowntype == Strange assert a.binding(graph.getargs()[1]).knowntype == str -def INPROGRESS_test_rtype_r_dict(): +def test_rtype_r_dict(): res = interpret(test_r_dict, []) assert res is True From pedronis at codespeak.net Sun Sep 11 17:37:59 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sun, 11 Sep 2005 17:37:59 +0200 (CEST) Subject: [pypy-svn] r17462 - in pypy/dist/pypy: annotation translator/test Message-ID: <20050911153759.B478B27B82@code1.codespeak.net> Author: pedronis Date: Sun Sep 11 17:37:56 2005 New Revision: 17462 Modified: pypy/dist/pypy/annotation/model.py pypy/dist/pypy/annotation/unaryop.py pypy/dist/pypy/translator/test/test_annrpython.py Log: implement generic if true then not None for non-numbers Modified: pypy/dist/pypy/annotation/model.py ============================================================================== --- pypy/dist/pypy/annotation/model.py (original) +++ pypy/dist/pypy/annotation/model.py Sun Sep 11 17:37:56 2005 @@ -171,20 +171,6 @@ unsigned = False def __init__(self): pass - #def __eq__(self, other): - # if self.__class__ is not other.__class__: - # return False - # if 'knowntypedata' in self.__dict__: - # selfdic = self.__dict__.copy() - # del selfdic['knowntypedata'] - # else: - # selfdic = self.__dict__ - # if 'knowntypedata' in other.__dict__: - # otherdic = other.__dict__.copy() - # del otherdic['knowntypedata'] - # else: - # otherdic = other.__dict__ - # return selfdic == otherdic class SomeString(SomeObject): "Stands for an object which is known to be a string." 
Modified: pypy/dist/pypy/annotation/unaryop.py ============================================================================== --- pypy/dist/pypy/annotation/unaryop.py (original) +++ pypy/dist/pypy/annotation/unaryop.py Sun Sep 11 17:37:56 2005 @@ -69,7 +69,7 @@ def len(obj): return SomeInteger(nonneg=True) - def is_true(obj): + def is_true_behavior(obj): if obj.is_constant(): return immutablevalue(bool(obj.const)) else: @@ -79,6 +79,24 @@ else: return SomeBool() + def is_true(s_obj): + r = s_obj.is_true_behavior() + assert isinstance(r, SomeBool) + + bk = getbookkeeper() + knowntypedata = r.knowntypedata = {} + fn, block, i = bk.position_key + op = block.operations[i] + assert op.opname == "is_true" or op.opname == "nonzero" + assert len(op.args) == 1 + arg = op.args[0] + s_nonnone_obj = s_obj + if s_obj.can_be_none(): + s_nonnone_obj = s_obj.nonnoneify() + add_knowntypedata(knowntypedata, True, [arg], s_nonnone_obj) + return r + + def nonzero(obj): return obj.is_true() @@ -162,6 +180,21 @@ def op_contains(obj, s_element): return SomeBool() +class __extend__(SomeFloat): + + def pos(flt): + return flt + + def neg(flt): + return SomeFloat() + + abs = neg + + def is_true(self): + if self.is_constant(): + return getbookkeeper().immutablevalue(bool(self.const)) + return SomeBool() + class __extend__(SomeInteger): def invert(self): @@ -195,23 +228,11 @@ abs.can_only_throw = [] abs_ovf = _clone(abs, [OverflowError]) - class __extend__(SomeBool): def is_true(self): return self -class __extend__(SomeFloat): - - def pos(flt): - return flt - - def neg(flt): - return SomeFloat() - - abs = neg - - class __extend__(SomeTuple): def len(tup): @@ -381,24 +402,6 @@ def method_upper(str): return SomeString() - #def is_true(str): - # r = SomeObject.is_true(str) - # if not isinstance(r, SomeBool): - # return r - # bk = getbookkeeper() - # knowntypedata = r.knowntypedata = {} - # fn, block, i = bk.position_key - # - # annotator = bk.annotator - # op = block.operations[i] - # assert op.opname == "is_true" or op.opname == "nonzero" - # assert len(op.args) == 1 - # arg = op.args[0] - # add_knowntypedata(knowntypedata, False, [arg], str) - # add_knowntypedata(knowntypedata, True, [arg], str.nonnoneify()) - # return r - - class __extend__(SomeChar): @@ -522,7 +525,7 @@ d[func] = value return SomePBC(d) - def is_true(pbc): + def is_true_behavior(pbc): outcome = None for c in pbc.prebuiltinstances: if c is not None and not bool(c): Modified: pypy/dist/pypy/translator/test/test_annrpython.py ============================================================================== --- pypy/dist/pypy/translator/test/test_annrpython.py (original) +++ pypy/dist/pypy/translator/test/test_annrpython.py Sun Sep 11 17:37:56 2005 @@ -1629,6 +1629,42 @@ s = a.build_types(f5, []) assert listitem(s.items[1]).__class__ == annmodel.SomeString + def test_true_str_is_not_none(self): + def f(s): + if s: + return s + else: + return '' + def g(i): + if i: + return f(None) + else: + return f('') + a = self.RPythonAnnotator() + s = a.build_types(g, [int]) + assert s.knowntype == str + assert not s.can_be_None + + def test_true_func_is_not_none(self): + def a1(): + pass + def a2(): + pass + def f(a): + if a: + return a + else: + return a2 + def g(i): + if i: + return f(None) + else: + return f(a1) + a = self.RPythonAnnotator() + s = a.build_types(g, [int]) + assert None not in s.prebuiltinstances + + def g(n): return [0,1,2,n] From arigo at codespeak.net Sun Sep 11 17:43:09 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 11 Sep 
2005 17:43:09 +0200 (CEST) Subject: [pypy-svn] r17463 - in pypy/dist/pypy/rpython: . test Message-ID: <20050911154309.C450C27B43@code1.codespeak.net> Author: arigo Date: Sun Sep 11 17:43:08 2005 New Revision: 17463 Modified: pypy/dist/pypy/rpython/rdict.py pypy/dist/pypy/rpython/rpbc.py pypy/dist/pypy/rpython/test/test_rclass.py Log: Reverted the change of rpbc.py in r17455, which broke start_new_thread() in translator/c/test/test_extfunc.py. Worked around the problem differently. Modified: pypy/dist/pypy/rpython/rdict.py ============================================================================== --- pypy/dist/pypy/rpython/rdict.py (original) +++ pypy/dist/pypy/rpython/rdict.py Sun Sep 11 17:43:08 2005 @@ -121,10 +121,17 @@ return self.dict_cache[key] except KeyError: self.setup() + l_dict = ll_newdict(self) + self.dict_cache[key] = l_dict + r_key = self.key_repr + r_value = self.value_repr if isinstance(dictobj, objectmodel.r_dict): - l_eqfn = self.r_rdict_eqfn .convert_const(dictobj.key_eq) - l_hashfn = self.r_rdict_hashfn.convert_const(dictobj.key_hash) - l_dict = ll_newdict_custom_eq_hash(l_eqfn, l_hashfn, self) + if self.r_rdict_eqfn.lowleveltype != lltype.Void: + l_fn = self.r_rdict_eqfn.convert_const(dictobj.key_eq) + l_dict.fnkeyeq = l_fn + if self.r_rdict_hashfn.lowleveltype != lltype.Void: + l_fn = self.r_rdict_hashfn.convert_const(dictobj.key_hash) + l_dict.fnkeyhash = l_fn # a dummy object with ll_keyeq and ll_keyhash methods to # pass to ll_dict_setitem() class Dummy: @@ -145,9 +152,6 @@ dummy = Dummy() dummy.cache = [] - self.dict_cache[key] = l_dict - r_key = self.key_repr - r_value = self.value_repr for dictkeycontainer, dictvalue in dictobj._dict.items(): llkey = r_key.convert_const(dictkeycontainer.key) llvalue = r_value.convert_const(dictvalue) @@ -156,10 +160,6 @@ return l_dict else: - l_dict = ll_newdict(self) - self.dict_cache[key] = l_dict - r_key = self.key_repr - r_value = self.value_repr for dictkey, dictvalue in dictobj.items(): llkey = r_key.convert_const(dictkey) llvalue = r_value.convert_const(dictvalue) @@ -409,23 +409,12 @@ DICT_INITSIZE = 8 def ll_newdict(dictrepr): - assert not dictrepr.custom_eq_hash # use ll_newdict_custom_eq_hash() instead d = lltype.malloc(dictrepr.DICT) d.entries = lltype.malloc(dictrepr.DICTENTRYARRAY, DICT_INITSIZE) d.num_items = 0 # but still be explicit d.num_pristine_entries = DICT_INITSIZE return d -def ll_newdict_custom_eq_hash(eqfn, hashfn, dictrepr): - assert dictrepr.custom_eq_hash - d = lltype.malloc(dictrepr.DICT) - d.entries = lltype.malloc(dictrepr.DICTENTRYARRAY, DICT_INITSIZE) - d.num_items = 0 # but still be explicit - d.num_pristine_entries = DICT_INITSIZE - d.fnkeyeq = eqfn - d.fnkeyhash = hashfn - return d - def ll_copy_extra_data(targetdict, sourcedict, dictrepr): if dictrepr.custom_eq_hash: targetdict.fnkeyeq = sourcedict.fnkeyeq @@ -449,8 +438,13 @@ r_dict.r_rdict_hashfn) crepr = hop.inputconst(lltype.Void, r_dict) hop.exception_cannot_occur() - v_result = hop.gendirectcall(ll_newdict_custom_eq_hash, - v_eqfn, v_hashfn, crepr) + v_result = hop.gendirectcall(ll_newdict, crepr) + if r_dict.r_rdict_eqfn.lowleveltype != lltype.Void: + cname = hop.inputconst(Void, 'fnkeyeq') + hop.genop('setfield', [v_result, cname, v_eqfn]) + if r_dict.r_rdict_hashfn.lowleveltype != lltype.Void: + cname = hop.inputconst(Void, 'fnkeyhash') + hop.genop('setfield', [v_result, cname, v_hashfn]) return v_result # ____________________________________________________________ Modified: pypy/dist/pypy/rpython/rpbc.py 
============================================================================== --- pypy/dist/pypy/rpython/rpbc.py (original) +++ pypy/dist/pypy/rpython/rpbc.py Sun Sep 11 17:43:08 2005 @@ -402,11 +402,8 @@ if value not in self.function_signatures(): raise TyperError("%r not in %r" % (value, self.s_pbc.prebuiltinstances)) - if self.lowleveltype == Void: - return None - else: - f, rinputs, rresult = self.function_signatures()[value] - return f + f, rinputs, rresult = self.function_signatures()[value] + return f def rtype_simple_call(self, hop): f, rinputs, rresult = self.function_signatures().itervalues().next() Modified: pypy/dist/pypy/rpython/test/test_rclass.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rclass.py (original) +++ pypy/dist/pypy/rpython/test/test_rclass.py Sun Sep 11 17:43:08 2005 @@ -321,3 +321,13 @@ assert res is False res = interpret(f, [0]) assert res is False + +def test_void_fnptr(): + def g(): + return 42 + def f(): + e = EmptyBase() + e.attr = g + return e.attr() + res = interpret(f, []) + assert res == 42 From pedronis at codespeak.net Sun Sep 11 17:51:20 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sun, 11 Sep 2005 17:51:20 +0200 (CEST) Subject: [pypy-svn] r17464 - in pypy/dist/pypy/rpython: . test Message-ID: <20050911155120.3F9D427B43@code1.codespeak.net> Author: pedronis Date: Sun Sep 11 17:51:18 2005 New Revision: 17464 Modified: pypy/dist/pypy/rpython/rdict.py pypy/dist/pypy/rpython/test/test_objectmodel.py Log: enable r_dict test with bound methods Modified: pypy/dist/pypy/rpython/rdict.py ============================================================================== --- pypy/dist/pypy/rpython/rdict.py (original) +++ pypy/dist/pypy/rpython/rdict.py Sun Sep 11 17:51:18 2005 @@ -440,10 +440,10 @@ hop.exception_cannot_occur() v_result = hop.gendirectcall(ll_newdict, crepr) if r_dict.r_rdict_eqfn.lowleveltype != lltype.Void: - cname = hop.inputconst(Void, 'fnkeyeq') + cname = hop.inputconst(lltype.Void, 'fnkeyeq') hop.genop('setfield', [v_result, cname, v_eqfn]) if r_dict.r_rdict_hashfn.lowleveltype != lltype.Void: - cname = hop.inputconst(Void, 'fnkeyhash') + cname = hop.inputconst(lltype.Void, 'fnkeyhash') hop.genop('setfield', [v_result, cname, v_hashfn]) return v_result Modified: pypy/dist/pypy/rpython/test/test_objectmodel.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_objectmodel.py (original) +++ pypy/dist/pypy/rpython/test/test_objectmodel.py Sun Sep 11 17:51:18 2005 @@ -120,3 +120,7 @@ def test_rtype_r_dict(): res = interpret(test_r_dict, []) assert res is True + +def test_rtype_r_dict_bm(): + res = interpret(test_r_dict_bm, []) + assert res is True From arigo at codespeak.net Sun Sep 11 18:07:48 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 11 Sep 2005 18:07:48 +0200 (CEST) Subject: [pypy-svn] r17466 - pypy/dist/pypy/rpython Message-ID: <20050911160748.56ACB27B56@code1.codespeak.net> Author: arigo Date: Sun Sep 11 18:07:47 2005 New Revision: 17466 Modified: pypy/dist/pypy/rpython/rdict.py Log: Ported from CPython the protection against custom eq functions that mutate a dict. 
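As a toy CPython-level illustration (not PyPy code, names invented here) of the hazard this guard defends against, consider a key whose comparison function mutates the very dict it is stored in while a lookup is probing it::

    class MutatingKey(object):
        def __init__(self, name):
            self.name = name
            self.victim = None
        def __hash__(self):
            return hash(self.name)
        def __eq__(self, other):
            if self.victim is not None:
                victim, self.victim = self.victim, None   # mutate only once
                for i in range(32):
                    victim[i] = i    # grows the dict: the entry array is reallocated
            return isinstance(other, MutatingKey) and other.name == self.name

    d = {}
    k = MutatingKey('a')
    d[k] = 42
    k.victim = d
    assert d.get(MutatingKey('a')) == 42

The lookup remains safe because, after calling the user-level eq function, the implementation re-checks that the entry array and the probed entry are still the ones it saw before the call, and starts the whole lookup over if not, which is exactly the check added in the diff below.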
Modified: pypy/dist/pypy/rpython/rdict.py ============================================================================== --- pypy/dist/pypy/rpython/rdict.py (original) +++ pypy/dist/pypy/rpython/rdict.py Sun Sep 11 18:07:47 2005 @@ -361,18 +361,20 @@ entries = d.entries mask = len(entries) - 1 i = r_uint(hash & mask) - - """XXX MUTATION PROTECTION!""" - # do the first try before any looping entry = entries[i] if entry.valid: - if entry.key == key: + checkingkey = entry.key + if checkingkey == key: return entry # found the entry if dictrepr.custom_eq_hash: - res = hlinvoke(dictrepr.r_rdict_eqfn, d.fnkeyeq, entry.key, key) + res = hlinvoke(dictrepr.r_rdict_eqfn, d.fnkeyeq, checkingkey, key) + if (entries != d.entries or + not entry.valid or entry.key != checkingkey): + # the compare did major nasty stuff to the dict: start over + return ll_dict_lookup(d, key, dictrepr) else: - res = dictrepr.ll_keyeq is not None and dictrepr.ll_keyeq(entry.key, key) + res = dictrepr.ll_keyeq is not None and dictrepr.ll_keyeq(checkingkey, key) if res: return entry # found the entry freeslot = lltype.nullptr(lltype.typeOf(entry).TO) @@ -390,12 +392,17 @@ if not entry.everused: return freeslot or entry elif entry.valid: - if entry.key == key: + checkingkey = entry.key + if checkingkey == key: return entry if dictrepr.custom_eq_hash: - res = hlinvoke(dictrepr.r_rdict_eqfn, d.fnkeyeq, entry.key, key) + res = hlinvoke(dictrepr.r_rdict_eqfn, d.fnkeyeq, checkingkey, key) + if (entries != d.entries or + not entry.valid or entry.key != checkingkey): + # the compare did major nasty stuff to the dict: start over + return ll_dict_lookup(d, key, dictrepr) else: - res = dictrepr.ll_keyeq is not None and dictrepr.ll_keyeq(entry.key, key) + res = dictrepr.ll_keyeq is not None and dictrepr.ll_keyeq(checkingkey, key) if res: return entry elif not freeslot: From pedronis at codespeak.net Sun Sep 11 18:28:35 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sun, 11 Sep 2005 18:28:35 +0200 (CEST) Subject: [pypy-svn] r17467 - pypy/dist/pypy/rpython Message-ID: <20050911162835.E097527B56@code1.codespeak.net> Author: pedronis Date: Sun Sep 11 18:28:34 2005 New Revision: 17467 Modified: pypy/dist/pypy/rpython/objectmodel.py Log: change the name in the message too Modified: pypy/dist/pypy/rpython/objectmodel.py ============================================================================== --- pypy/dist/pypy/rpython/objectmodel.py (original) +++ pypy/dist/pypy/rpython/objectmodel.py Sun Sep 11 18:28:34 2005 @@ -33,7 +33,7 @@ # __ invoke XXX this doesn't seem completely the right place for this def hlinvoke(repr, llcallable, *args): - raise TypeError, "invoke is meant to be rtyped and not called direclty" + raise TypeError, "hlinvoke is meant to be rtyped and not called direclty" # ____________________________________________________________ From arigo at codespeak.net Sun Sep 11 18:43:18 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 11 Sep 2005 18:43:18 +0200 (CEST) Subject: [pypy-svn] r17468 - pypy/dist/pypy/translator/goal Message-ID: <20050911164318.9FB8927B56@code1.codespeak.net> Author: arigo Date: Sun Sep 11 18:43:17 2005 New Revision: 17468 Modified: pypy/dist/pypy/translator/goal/ (props changed) Log: Ignore pypy-c and pypy-llvm files. 
From arigo at codespeak.net Sun Sep 11 19:34:42 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 11 Sep 2005 19:34:42 +0200 (CEST) Subject: [pypy-svn] r17469 - pypy/dist/pypy/rpython Message-ID: <20050911173442.B61DB27B62@code1.codespeak.net> Author: arigo Date: Sun Sep 11 19:34:41 2005 New Revision: 17469 Modified: pypy/dist/pypy/rpython/rdict.py Log: A typo, and releasing the ref to the dict once its iterator is exhausted. Modified: pypy/dist/pypy/rpython/rdict.py ============================================================================== --- pypy/dist/pypy/rpython/rdict.py (original) +++ pypy/dist/pypy/rpython/rdict.py Sun Sep 11 19:34:41 2005 @@ -113,7 +113,7 @@ # get object from bound dict methods #dictobj = getattr(dictobj, '__self__', dictobj) if dictobj is None: - return nullptr(self.DICT) + return lltype.nullptr(self.DICT) if not isinstance(dictobj, (dict, objectmodel.r_dict)): raise TyperError("expected a dict: %r" % (dictobj,)) try: @@ -488,24 +488,27 @@ return iter def ll_dictnext(iter, func, RETURNTYPE): - entries = iter.dict.entries - index = iter.index - entries_len = len(entries) - while index < entries_len: - entry = entries[index] - index = index + 1 - if entry.valid: - iter.index = index - if func is dum_items: - r = lltype.malloc(RETURNTYPE.TO) - r.item0 = entry.key - r.item1 = entry.value - return r - elif func is dum_keys: - return entry.key - elif func is dum_values: - return entry.value - iter.index = index + dict = iter.dict + if dict: + entries = dict.entries + index = iter.index + entries_len = len(entries) + while index < entries_len: + entry = entries[index] + index = index + 1 + if entry.valid: + iter.index = index + if func is dum_items: + r = lltype.malloc(RETURNTYPE.TO) + r.item0 = entry.key + r.item1 = entry.value + return r + elif func is dum_keys: + return entry.key + elif func is dum_values: + return entry.value + # clear the reference to the dict and prevent restarts + iter.dict = lltype.nullptr(lltype.typeOf(iter).TO.dict.TO) raise StopIteration # _____________________________________________________________ From arigo at codespeak.net Sun Sep 11 19:36:40 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 11 Sep 2005 19:36:40 +0200 (CEST) Subject: [pypy-svn] r17470 - in pypy/dist/pypy: annotation rpython/test Message-ID: <20050911173640.7B12927B62@code1.codespeak.net> Author: arigo Date: Sun Sep 11 19:36:39 2005 New Revision: 17470 Modified: pypy/dist/pypy/annotation/dictdef.py pypy/dist/pypy/rpython/test/test_objectmodel.py Log: Possibly slightly fragile fix for a test: we cannot use emulate_pbc_call() when there is no position_key on the bookkeeper. Lifting this restriction looks messy so I went for delaying the emulate_pbc_call()... 
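The iterator part of r17469 above is a small idiom worth spelling out: once
the iterator is exhausted it drops its reference to the dict, so the dict can
be reclaimed earlier and a later next() cannot accidentally pick up entries
added afterwards.  Roughly the same idea at plain Python level (a hand-written
sketch with invented names, without any checking for concurrent modification)::

    class DictKeyIterator(object):
        """Toy illustration of the release-on-exhaustion idiom."""

        def __init__(self, d):
            self.dict = d              # reference kept only while iterating
            self.keys = d.keys()       # snapshot of the keys
            self.index = 0

        def __iter__(self):
            return self

        def next(self):
            if self.dict is not None:
                while self.index < len(self.keys):
                    key = self.keys[self.index]
                    self.index += 1
                    if key in self.dict:       # skip keys deleted meanwhile
                        return key
                # exhausted: clear the references so the dict can be freed
                # and further next() calls stay cheap and final
                self.dict = None
                self.keys = None
            raise StopIteration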
Modified: pypy/dist/pypy/annotation/dictdef.py ============================================================================== --- pypy/dist/pypy/annotation/dictdef.py (original) +++ pypy/dist/pypy/annotation/dictdef.py Sun Sep 11 19:36:39 2005 @@ -5,6 +5,7 @@ class DictKey(ListItem): custom_eq_hash = False + pending_emulated_calls = () def patch(self): for dictdef in self.itemof: @@ -22,7 +23,7 @@ def generalize(self, s_other_value): updated = ListItem.generalize(self, s_other_value) - if updated and self.custom_eq_hash: + if self.custom_eq_hash and (updated or self.pending_emulated_calls): self.emulate_rdict_calls() return updated @@ -37,13 +38,23 @@ self.emulate_rdict_calls(other=other) def emulate_rdict_calls(self, other=None): + # hackish: cannot emulate a call if we are not currently handling + # an operation + # (e.g. a link or a prebuilt constant coming from somewhere, + # as in rpython.test.test_objectmodel.test_rtype_constant_r_dicts) + if not hasattr(self.bookkeeper, 'position_key'): + self.pending_emulated_calls += (other,) + return + myeq = (self, 'eq') myhash = (self, 'hash') - if other: - replace_othereq = [(other, 'eq')] - replace_otherhash = [(other, 'hash')] - else: - replace_othereq = replace_otherhash = () + replace_othereq = [] + replace_otherhash = [] + for other in self.pending_emulated_calls + (other,): + if other: + replace_othereq.append((other, 'eq')) + replace_otherhash.append((other, 'hash')) + self.pending_emulated_calls = () s_key = self.s_value s1 = self.bookkeeper.emulate_pbc_call(myeq, self.s_rdict_eqfn, [s_key, s_key], @@ -86,6 +97,8 @@ else: position_key = self.bookkeeper.position_key self.dictkey.read_locations[position_key] = True + if self.dictkey.pending_emulated_calls: + self.dictkey.emulate_rdict_calls() return self.dictkey.s_value def read_value(self, position_key=None): Modified: pypy/dist/pypy/rpython/test/test_objectmodel.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_objectmodel.py (original) +++ pypy/dist/pypy/rpython/test/test_objectmodel.py Sun Sep 11 19:36:39 2005 @@ -124,3 +124,20 @@ def test_rtype_r_dict_bm(): res = interpret(test_r_dict_bm, []) assert res is True + +def test_rtype_constant_r_dicts(): + d1 = r_dict(strange_key_eq, strange_key_hash) + d1['hello'] = 666 + d2 = r_dict(strange_key_eq, strange_key_hash) + d2['hello'] = 777 + d2['world'] = 888 + def fn(i): + if i == 1: + d = d1 + else: + d = d2 + return len(d) + res = interpret(fn, [1]) + assert res == 1 + res = interpret(fn, [2]) + assert res == 2 From tismer at codespeak.net Sun Sep 11 20:07:14 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Sun, 11 Sep 2005 20:07:14 +0200 (CEST) Subject: [pypy-svn] r17471 - pypy/dist/pypy/translator/goal Message-ID: <20050911180714.46CFE27B62@code1.codespeak.net> Author: tismer Date: Sun Sep 11 20:07:09 2005 New Revision: 17471 Modified: pypy/dist/pypy/translator/goal/ (props changed) Log: for windows, simply ignore all .exe files From arigo at codespeak.net Sun Sep 11 20:22:08 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 11 Sep 2005 20:22:08 +0200 (CEST) Subject: [pypy-svn] r17472 - pypy/dist/pypy/translator/tool/pygame Message-ID: <20050911182208.C8CF727B62@code1.codespeak.net> Author: arigo Date: Sun Sep 11 20:22:07 2005 New Revision: 17472 Modified: pypy/dist/pypy/translator/tool/pygame/graphdisplay.py Log: initialize the pygame submodules by hand, to avoid initializing too much (e.g. 
the sound system) Modified: pypy/dist/pypy/translator/tool/pygame/graphdisplay.py ============================================================================== --- pypy/dist/pypy/translator/tool/pygame/graphdisplay.py (original) +++ pypy/dist/pypy/translator/tool/pygame/graphdisplay.py Sun Sep 11 20:22:07 2005 @@ -46,7 +46,10 @@ class Display(object): def __init__(self, (w,h)=(800,680)): - pygame.init() + # initialize the modules by hand, to avoid initializing too much + # (e.g. the sound system) + pygame.display.init() + pygame.font.init() self.resize((w,h)) def resize(self, (w,h)): From arigo at codespeak.net Sun Sep 11 20:36:53 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 11 Sep 2005 20:36:53 +0200 (CEST) Subject: [pypy-svn] r17473 - in pypy/dist/pypy: interpreter objspace/std objspace/std/test Message-ID: <20050911183653.951AD27B62@code1.codespeak.net> Author: arigo Date: Sun Sep 11 20:36:51 2005 New Revision: 17473 Modified: pypy/dist/pypy/interpreter/baseobjspace.py pypy/dist/pypy/objspace/std/dictobject.py pypy/dist/pypy/objspace/std/dicttype.py pypy/dist/pypy/objspace/std/marshal_impl.py pypy/dist/pypy/objspace/std/objspace.py pypy/dist/pypy/objspace/std/test/test_dictobject.py Log: And eventually... Got rid of the hash table logic in dictobject.py, using r_dicts instead. Modified: pypy/dist/pypy/interpreter/baseobjspace.py ============================================================================== --- pypy/dist/pypy/interpreter/baseobjspace.py (original) +++ pypy/dist/pypy/interpreter/baseobjspace.py Sun Sep 11 20:36:51 2005 @@ -369,6 +369,10 @@ """shortcut for space.is_true(space.is_(w_obj1, w_obj2))""" return self.is_true(self.is_(w_obj1, w_obj2)) + def hash_w(self, w_obj): + """shortcut for space.int_w(space.hash(w_obj))""" + return self.int_w(self.hash(w_obj)) + def newbool(self, b): if b: return self.w_True Modified: pypy/dist/pypy/objspace/std/dictobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/dictobject.py (original) +++ pypy/dist/pypy/objspace/std/dictobject.py Sun Sep 11 20:36:51 2005 @@ -9,108 +9,33 @@ from pypy.interpreter import gateway from pypy.rpython.rarithmetic import r_uint +from pypy.rpython.objectmodel import r_dict -class Entry: - def __init__(self): - self.hash = r_uint(0) - self.w_key = None - self.w_value = None - def __repr__(self): - return ''%(self.hash, self.w_key, self.w_value) class W_DictObject(W_Object): from pypy.objspace.std.dicttype import dict_typedef as typedef - def __init__(w_self, space, list_pairs_w): + def __init__(w_self, space, w_otherdict=None): W_Object.__init__(w_self, space) - - w_self.used = 0 - w_self.data = [] - w_self.resize(len(list_pairs_w)*2) - w_self.w_dummy = space.newlist([]) + if w_otherdict is None: + w_self.content = r_dict(space.eq_w, space.hash_w) + else: + w_self.content = w_otherdict.content.copy() + + def initialize_content(w_self, list_pairs_w): for w_k, w_v in list_pairs_w: - w_self.insert(w_self.hash(w_k), w_k, w_v) - + w_self.content[w_k] = w_v + def __repr__(w_self): """ representation for debugging purposes """ - return "%s(%s)" % (w_self.__class__.__name__, w_self.data) - - def hash(w_self, w_obj): - space = w_self.space - return r_uint(space.int_w(space.hash(w_obj))) - - def insert(self, h, w_key, w_value): - entry = self.lookdict(h, w_key) - if entry.w_value is None: - self.used += 1 - entry.hash = h - entry.w_key = w_key - entry.w_value = w_value - else: - entry.w_value = w_value - - def resize(self, 
minused): - newsize = 4 - while newsize < minused: - newsize *= 2 - od = self.data - - self.used = 0 - self.data = [Entry() for i in range(newsize)] - for entry in od: - if entry.w_value is not None: - self.insert(entry.hash, entry.w_key, entry.w_value) - - def lookdict(self, lookup_hash, w_lookup): - assert isinstance(lookup_hash, r_uint) - space = self.space - data = self.data - mask = len(data) - 1 # len(data) is expected to be a power of 2 - i = lookup_hash & mask - - entry = data[i] - if entry.w_key is None or space.is_w(w_lookup, entry.w_key): - return entry - if entry.w_key is self.w_dummy: - freeslot = entry - else: - if entry.hash == lookup_hash and space.eq_w(entry.w_key, w_lookup): - if self.data is not data: - # the eq_w() modified the dict sufficiently to have it - # switch to another table. Can't return 'entry', which - # belongs to the old table. Start over... - return self.lookdict(lookup_hash, w_lookup) - return entry - freeslot = None - - perturb = lookup_hash - while 1: - i = (i << 2) + i + perturb + 1 - entry = data[i & mask] - if entry.w_key is None: - if freeslot: - return freeslot - else: - return entry - if entry.hash == lookup_hash and entry.w_key is not self.w_dummy \ - and space.eq_w(entry.w_key, w_lookup): - if self.data is not data: - # the eq_w() modified the dict sufficiently to have it - # switch to another table. Can't return 'entry', which - # belongs to the old table. Start over... - return self.lookdict(lookup_hash, w_lookup) - return entry - if entry.w_key is self.w_dummy and freeslot is None: - freeslot = entry - perturb >>= 5 + return "%s(%s)" % (w_self.__class__.__name__, w_self.content) def unwrap(w_dict): space = w_dict.space result = {} - for entry in w_dict.data: - if entry.w_value is not None: - # XXX generic mixed types unwrap - result[space.unwrap(entry.w_key)] = space.unwrap(entry.w_value) + for w_key, w_value in w_dict.content.items(): + # generic mixed types unwrap + result[space.unwrap(w_key)] = space.unwrap(w_value) return result registerimplementation(W_DictObject) @@ -119,9 +44,8 @@ def init__Dict(space, w_dict, __args__): w_src, w_kwds = __args__.parse('dict', (['seq_or_map'], None, 'kwargs'), # signature - [W_DictObject(space, [])]) # default argument - dict_clear__Dict(space, w_dict) - # XXX do dict({...}) with dict_update__Dict_Dict() + [W_DictObject(space)]) # default argument + w_dict.content.clear() try: space.getattr(w_src, space.wrap("keys")) except OperationError: @@ -132,41 +56,35 @@ raise OperationError(space.w_ValueError, space.wrap("dict() takes a sequence of pairs")) w_k, w_v = pair - setitem__Dict_ANY_ANY(space, w_dict, w_k, w_v) + w_dict.content[w_k] = w_v else: if space.is_true(w_src): from pypy.objspace.std.dicttype import dict_update__ANY_ANY dict_update__ANY_ANY(space, w_dict, w_src) if space.is_true(w_kwds): - space.call_method(w_dict, 'update', w_kwds) + from pypy.objspace.std.dicttype import dict_update__ANY_ANY + dict_update__ANY_ANY(space, w_dict, w_kwds) def getitem__Dict_ANY(space, w_dict, w_lookup): - entry = w_dict.lookdict(w_dict.hash(w_lookup), w_lookup) - if entry.w_value is not None: - return entry.w_value - else: + try: + return w_dict.content[w_lookup] + except KeyError: raise OperationError(space.w_KeyError, w_lookup) def setitem__Dict_ANY_ANY(space, w_dict, w_newkey, w_newvalue): - w_dict.insert(w_dict.hash(w_newkey), w_newkey, w_newvalue) - if 2*w_dict.used > len(w_dict.data): - w_dict.resize(2*w_dict.used) + w_dict.content[w_newkey] = w_newvalue def delitem__Dict_ANY(space, w_dict, w_lookup): - 
entry = w_dict.lookdict(w_dict.hash(w_lookup), w_lookup) - if entry.w_value is not None: - w_dict.used -= 1 - entry.w_key = w_dict.w_dummy - entry.w_value = None - else: + try: + del w_dict.content[w_lookup] + except KeyError: raise OperationError(space.w_KeyError, w_lookup) def len__Dict(space, w_dict): - return space.wrap(w_dict.used) + return space.wrap(len(w_dict.content)) def contains__Dict_ANY(space, w_dict, w_lookup): - entry = w_dict.lookdict(w_dict.hash(w_lookup), w_lookup) - return space.newbool(entry.w_value is not None) + return space.newbool(w_lookup in w_dict.content) dict_has_key__Dict_ANY = contains__Dict_ANY @@ -177,60 +95,55 @@ if space.is_true(space.is_(w_left, w_right)): return space.w_True - if w_left.used != w_right.used: + if len(w_left.content) != len(w_right.content): return space.w_False - for entry in w_left.data: - w_val = entry.w_value - if w_val is None: - continue - rightentry = w_right.lookdict(entry.hash, entry.w_key) - if rightentry.w_value is None: + for w_key, w_val in w_left.content.iteritems(): + try: + w_rightval = w_right.content[w_key] + except KeyError: return space.w_False - if not space.eq_w(w_val, rightentry.w_value): + if not space.eq_w(w_val, w_rightval): return space.w_False return space.w_True -def characterize(space, adata, w_b): +def characterize(space, acontent, bcontent): """ (similar to CPython) - returns the smallest key in adata for which b's value is different or absent and this value """ + returns the smallest key in acontent for which b's value is different or absent and this value """ w_smallest_diff_a_key = None w_its_value = None - for entry in adata: - w_val = entry.w_value - if w_val is None: - continue - w_key = entry.w_key + for w_key, w_val in acontent.iteritems(): if w_smallest_diff_a_key is None or space.is_true(space.lt(w_key, w_smallest_diff_a_key)): - b_entry = w_b.lookdict(entry.hash, w_key) - if b_entry.w_value is None: + try: + w_bvalue = bcontent[w_key] + except KeyError: w_its_value = w_val w_smallest_diff_a_key = w_key else: - if not space.eq_w(w_val, b_entry.w_value): + if not space.eq_w(w_val, w_bvalue): w_its_value = w_val w_smallest_diff_a_key = w_key return w_smallest_diff_a_key, w_its_value def lt__Dict_Dict(space, w_left, w_right): # Different sizes, no problem - if w_left.used < w_right.used: + leftcontent = w_left.content + rightcontent = w_right.content + if len(leftcontent) < len(rightcontent): return space.w_True - if w_left.used > w_right.used: + if len(leftcontent) > len(rightcontent): return space.w_False # Same size - w_leftdiff, w_leftval = characterize(space, w_left.data, w_right) + w_leftdiff, w_leftval = characterize(space, leftcontent, rightcontent) if w_leftdiff is None: return space.w_False - w_rightdiff, w_rightval = characterize(space, w_right.data, w_left) - w_res = space.w_False + w_rightdiff, w_rightval = characterize(space, rightcontent, leftcontent) if w_rightdiff is None: # w_leftdiff is not None, w_rightdiff is None return space.w_True - w_isequal = space.eq(w_leftdiff, w_rightdiff) w_res = space.lt(w_leftdiff, w_rightdiff) - if (space.is_w(w_res, space.w_False) and - space.is_true(w_isequal) and + if (not space.is_true(w_res) and + space.eq_w(w_leftdiff, w_rightdiff) and w_rightval is not None): w_res = space.lt(w_leftval, w_rightval) return w_res @@ -239,24 +152,17 @@ raise OperationError(space.w_TypeError,space.wrap("dict objects are unhashable")) def dict_copy__Dict(space, w_self): - return W_DictObject(space, [(entry.w_key,entry.w_value) - for entry in w_self.data - if 
entry.w_value is not None]) + return W_DictObject(space, w_self) def dict_items__Dict(space, w_self): - return space.newlist([ space.newtuple([entry.w_key,entry.w_value]) - for entry in w_self.data - if entry.w_value is not None]) + return space.newlist([ space.newtuple([w_key, w_value]) + for w_key, w_value in w_self.content.iteritems() ]) def dict_keys__Dict(space, w_self): - return space.newlist([ entry.w_key - for entry in w_self.data - if entry.w_value is not None]) + return space.newlist(w_self.content.keys()) def dict_values__Dict(space, w_self): - return space.newlist([ entry.w_value - for entry in w_self.data - if entry.w_value is not None]) + return space.newlist(w_self.content.values()) def dict_iteritems__Dict(space, w_self): return W_DictIter_Items(space, w_self) @@ -268,15 +174,10 @@ return W_DictIter_Values(space, w_self) def dict_clear__Dict(space, w_self): - w_self.data = [Entry()] - w_self.used = 0 + w_self.content.clear() def dict_get__Dict_ANY_ANY(space, w_dict, w_lookup, w_default): - entry = w_dict.lookdict(w_dict.hash(w_lookup), w_lookup) - if entry.w_value is not None: - return entry.w_value - else: - return w_default + return w_dict.content.get(w_lookup, w_default) app = gateway.applevel(''' def dictrepr(currently_in_repr, d): @@ -306,7 +207,7 @@ dictrepr = app.interphook("dictrepr") def repr__Dict(space, w_dict): - if w_dict.used == 0: + if len(w_dict.content) == 0: return space.wrap('{}') w_currently_in_repr = space.getexecutioncontext()._py_repr return dictrepr(space, w_currently_in_repr, w_dict) @@ -320,58 +221,72 @@ def __init__(w_self, space, w_dictobject): W_Object.__init__(w_self, space) - w_self.w_dictobject = w_dictobject - w_self.len = w_dictobject.used + w_self.content = content = w_dictobject.content + w_self.len = len(content) w_self.pos = 0 - w_self.datapos = 0 + w_self.setup_iterator() - def return_entry(w_self, entry): + def return_entry(w_self, w_key, w_value): raise NotImplementedError registerimplementation(W_DictIterObject) class W_DictIter_Keys(W_DictIterObject): - def return_entry(w_self, entry): - return entry.w_key + def setup_iterator(w_self): + w_self.iterator = w_self.content.iterkeys() + def next_entry(w_self): + # note that this 'for' loop only runs once, at most + for w_key in w_self.iterator: + return w_key + else: + return None class W_DictIter_Values(W_DictIterObject): - def return_entry(w_self, entry): - return entry.w_value + def setup_iterator(w_self): + w_self.iterator = w_self.content.itervalues() + def next_entry(w_self): + # note that this 'for' loop only runs once, at most + for w_value in w_self.iterator: + return w_value + else: + return None class W_DictIter_Items(W_DictIterObject): - def return_entry(w_self, entry): - return w_self.space.newtuple([entry.w_key, entry.w_value]) + def setup_iterator(w_self): + w_self.iterator = w_self.content.iteritems() + def next_entry(w_self): + # note that this 'for' loop only runs once, at most + for w_key, w_value in w_self.iterator: + return w_self.space.newtuple([w_key, w_value]) + else: + return None def iter__DictIterObject(space, w_dictiter): return w_dictiter def next__DictIterObject(space, w_dictiter): - w_dict = w_dictiter.w_dictobject - if w_dict is not None: - if w_dictiter.len != w_dict.used: + content = w_dictiter.content + if content is not None: + if w_dictiter.len != len(content): w_dictiter.len = -1 # Make this error state sticky raise OperationError(space.w_RuntimeError, space.wrap("dictionary changed size during iteration")) # look for the next entry - i = 
w_dictiter.datapos - data = w_dict.data - while i < len(data): - entry = data[i] - i += 1 - if entry.w_value is not None: - w_dictiter.pos += 1 - w_dictiter.datapos = i - return w_dictiter.return_entry(entry) + w_result = w_dictiter.next_entry() + if w_result is not None: + w_dictiter.pos += 1 + return w_result # no more entries - w_dictiter.w_dictobject = None + w_dictiter.content = None raise OperationError(space.w_StopIteration, space.w_None) def len__DictIterObject(space, w_dictiter): - w_dict = w_dictiter.w_dictobject - if w_dict is None or w_dictiter.len == -1 : + content = w_dictiter.content + if content is None or w_dictiter.len == -1: return space.wrap(0) return space.wrap(w_dictiter.len - w_dictiter.pos) + # ____________________________________________________________ from pypy.objspace.std import dicttype Modified: pypy/dist/pypy/objspace/std/dicttype.py ============================================================================== --- pypy/dist/pypy/objspace/std/dicttype.py (original) +++ pypy/dist/pypy/objspace/std/dicttype.py Sun Sep 11 20:36:51 2005 @@ -100,7 +100,7 @@ def descr__new__(space, w_dicttype, __args__): from pypy.objspace.std.dictobject import W_DictObject w_obj = space.allocate_instance(W_DictObject, w_dicttype) - W_DictObject.__init__(w_obj, space, []) + W_DictObject.__init__(w_obj, space) return w_obj # ____________________________________________________________ Modified: pypy/dist/pypy/objspace/std/marshal_impl.py ============================================================================== --- pypy/dist/pypy/objspace/std/marshal_impl.py (original) +++ pypy/dist/pypy/objspace/std/marshal_impl.py Sun Sep 11 20:36:51 2005 @@ -341,17 +341,16 @@ def marshal_w__Dict(space, w_dict, m): m.start(TYPE_DICT) - for entry in w_dict.data: - if entry.w_value is not None: - m.put_w_obj(entry.w_key) - m.put_w_obj(entry.w_value) + for w_key, w_value in w_dict.content.iteritems(): + m.put_w_obj(w_key) + m.put_w_obj(w_value) m.atom(TYPE_NULL) def unmarshal_Dict(space, u, tc): # since primitive lists are not optimized and we don't know # the dict size in advance, use the dict's setitem instead # of building a list of tuples. - w_dic = W_DictObject(space, []) + w_dic = W_DictObject(space) setter = dictobject.setitem__Dict_ANY_ANY while 1: w_key = u.get_w_obj(True) Modified: pypy/dist/pypy/objspace/std/objspace.py ============================================================================== --- pypy/dist/pypy/objspace/std/objspace.py (original) +++ pypy/dist/pypy/objspace/std/objspace.py Sun Sep 11 20:36:51 2005 @@ -248,7 +248,7 @@ return W_UnicodeObject(self, [unichr(ord(u)) for u in x]) # xxx if isinstance(x, dict): items_w = [(self.wrap(k), self.wrap(v)) for (k, v) in x.iteritems()] - return W_DictObject(self, items_w) + return self.newdict(items_w) if isinstance(x, float): return W_FloatObject(self, x) if isinstance(x, tuple): @@ -324,7 +324,9 @@ return W_ListObject(self, list_w) def newdict(self, list_pairs_w): - return W_DictObject(self, list_pairs_w) + w_result = W_DictObject(self) + w_result.initialize_content(list_pairs_w) + return w_result def newslice(self, w_start, w_end, w_step): return W_SliceObject(self, w_start, w_end, w_step) @@ -395,7 +397,7 @@ def is_true(self, w_obj): # XXX don't look! 
if isinstance(w_obj, W_DictObject): - return not not w_obj.used + return len(w_obj.content) != 0 else: return DescrOperation.is_true(self, w_obj) Modified: pypy/dist/pypy/objspace/std/test/test_dictobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/test/test_dictobject.py (original) +++ pypy/dist/pypy/objspace/std/test/test_dictobject.py Sun Sep 11 20:36:51 2005 @@ -7,13 +7,14 @@ def test_empty(self): space = self.space - d = W_DictObject(space, []) + d = W_DictObject(space) assert not self.space.is_true(d) def test_nonempty(self): space = self.space wNone = space.w_None - d = W_DictObject(space, [(wNone, wNone)]) + d = W_DictObject(space) + d.initialize_content([(wNone, wNone)]) assert space.is_true(d) i = space.getitem(d, wNone) equal = space.eq(i, wNone) @@ -23,7 +24,8 @@ space = self.space wk1 = space.wrap('key') wone = space.wrap(1) - d = W_DictObject(space, [(space.wrap('zero'),space.wrap(0))]) + d = W_DictObject(space) + d.initialize_content([(space.wrap('zero'),space.wrap(0))]) space.setitem(d,wk1,wone) wback = space.getitem(d,wk1) assert self.space.eq_w(wback,wone) @@ -31,8 +33,8 @@ def test_delitem(self): space = self.space wk1 = space.wrap('key') - d = W_DictObject(space, - [(space.wrap('zero'),space.wrap(0)), + d = W_DictObject(space) + d.initialize_content( [(space.wrap('zero'),space.wrap(0)), (space.wrap('one'),space.wrap(1)), (space.wrap('two'),space.wrap(2))]) space.delitem(d,space.wrap('one')) @@ -338,12 +340,10 @@ # the minimal 'space' needed to use a W_DictObject class FakeSpace: - def hash(self, obj): + def hash_w(self, obj): return hash(obj) def unwrap(self, x): return x - def int_w(self, x): - return x def is_true(self, x): return x def is_(self, x, y): @@ -364,7 +364,7 @@ def test_stressdict(self): from random import randint - d = W_DictObject(self.space, []) + d = W_DictObject(self.space) N = 10000 pydict = {} for i in range(N): From arigo at codespeak.net Sun Sep 11 21:03:11 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 11 Sep 2005 21:03:11 +0200 (CEST) Subject: [pypy-svn] r17474 - pypy/dist/pypy/rpython/test Message-ID: <20050911190311.4DAE327B5E@code1.codespeak.net> Author: arigo Date: Sun Sep 11 21:03:10 2005 New Revision: 17474 Modified: pypy/dist/pypy/rpython/test/test_rdict.py Log: A stress test for the dictionary implementation. Modified: pypy/dist/pypy/rpython/test/test_rdict.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rdict.py (original) +++ pypy/dist/pypy/rpython/test/test_rdict.py Sun Sep 11 21:03:10 2005 @@ -1,7 +1,7 @@ from pypy.rpython import lltype from pypy.rpython.test.test_llinterp import interpret -from pypy.rpython import rstr, rdict +from pypy.rpython import rstr, rint, rdict import py py.log.setconsumer("rtyper", py.log.STDOUT) @@ -326,3 +326,59 @@ assert res == 1000 res = interpret(func, [524, 1036]) assert res == -123 + +# ____________________________________________________________ + +def not_really_random(): + """A random-ish generator, which also generates nice patterns from time to time. 
+ Could be useful to detect problems associated with specific usage patterns.""" + import random + x = random.random() + for i in range(12000): + r = 3.4 + i/20000.0 + x = r*x - x*x + assert 0 <= x < 4 + yield x + +def test_stress(): + dictrepr = rdict.DictRepr(rint.signed_repr, rint.signed_repr) + dictrepr.setup() + l_dict = rdict.ll_newdict(dictrepr) + referencetable = [None] * 400 + referencelength = 0 + value = 0 + + def complete_check(): + for n, refvalue in zip(range(len(referencetable)), referencetable): + try: + gotvalue = rdict.ll_dict_getitem(l_dict, n, dictrepr) + except KeyError: + assert refvalue is None + else: + assert gotvalue == refvalue + + for x in not_really_random(): + n = int(x*100.0) # 0 <= x < 400 + op = repr(x)[-1] + if op <= '2' and referencetable[n] is not None: + rdict.ll_dict_delitem(l_dict, n, dictrepr) + referencetable[n] = None + referencelength -= 1 + elif op <= '6': + rdict.ll_dict_setitem(l_dict, n, value, dictrepr) + if referencetable[n] is None: + referencelength += 1 + referencetable[n] = value + value += 1 + else: + try: + gotvalue = rdict.ll_dict_getitem(l_dict, n, dictrepr) + except KeyError: + assert referencetable[n] is None + else: + assert gotvalue == referencetable[n] + if 1.38 <= x <= 1.39: + complete_check() + print 'current dict length:', referencelength + assert l_dict.num_items == referencelength + complete_check() From pedronis at codespeak.net Sun Sep 11 22:35:22 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sun, 11 Sep 2005 22:35:22 +0200 (CEST) Subject: [pypy-svn] r17475 - in pypy/dist/pypy: annotation rpython/test translator translator/test Message-ID: <20050911203522.6FD2627B62@code1.codespeak.net> Author: pedronis Date: Sun Sep 11 22:35:19 2005 New Revision: 17475 Modified: pypy/dist/pypy/annotation/bookkeeper.py pypy/dist/pypy/rpython/test/test_rpbc.py pypy/dist/pypy/translator/annrpython.py pypy/dist/pypy/translator/test/test_annrpython.py Log: emulate_pbc_call should not require position_key to be set support specifying a calalback that will be called with annotator, graph each time one of the involved callable result gets generalized Modified: pypy/dist/pypy/annotation/bookkeeper.py ============================================================================== --- pypy/dist/pypy/annotation/bookkeeper.py (original) +++ pypy/dist/pypy/annotation/bookkeeper.py Sun Sep 11 22:35:19 2005 @@ -522,8 +522,8 @@ callfamily.patterns.update({shape: True}) - def pbc_call(self, pbc, args, implicit_init): - if not implicit_init: + def pbc_call(self, pbc, args, implicit_init=False, emulated=None): + if not implicit_init and not emulated: fn, block, i = self.position_key assert block.operations[i].opname in ('call_args', 'simple_call') assert self.annotator.binding(block.operations[i].args[0], extquery=True) is pbc @@ -541,17 +541,25 @@ if func is not None] mono = len(nonnullcallables) == 1 + if emulated is not None: + if emulated is True: + context = None + else: + context = emulated + else: + context = 'current' + for func, classdef in nonnullcallables: if isclassdef(classdef): s_self = SomeInstance(classdef) args1 = args.prepend(s_self) else: args1 = args - results.append(self.pycall(func, args1, mono)) + results.append(self.pycall(func, args1, mono, context=context)) return unionof(*results) - def emulate_pbc_call(self, unique_key, pbc, args_s, replace=[]): + def emulate_pbc_call(self, unique_key, pbc, args_s, replace=[], callback=None): args = self.build_args("simple_call", args_s) shape = args.rawshape() 
emulated_pbc_calls = self.emulated_pbc_calls @@ -564,7 +572,12 @@ del emulated_pbc_calls[other_key] emulated_pbc_calls[unique_key] = pbc, shape - return self.pbc_call(pbc, args, True) + if callback is None: + emulated = True + else: + emulated = callback + + return self.pbc_call(pbc, args, emulated=emulated) # decide_callable(position, func, args, mono) -> callb, key # query_spaceop_callable(spaceop) -> pbc, isspecialcase @@ -644,13 +657,17 @@ return inputcells - def pycall(self, func, args, mono): + def pycall(self, func, args, mono, context='current'): if func is None: # consider None as a NULL function pointer return SomeImpossibleValue() # decide and pick if necessary a specialized version base_func = func - func, key = decide_callable(self, self.position_key, func, args, mono, unpacked=True) + if context == 'current': + position_key = self.position_key + else: + position_key = None + func, key = decide_callable(self, position_key, func, args, mono, unpacked=True) if func is None: assert isinstance(key, SomeObject) @@ -675,11 +692,15 @@ assert isinstance(func, FunctionType), "[%s] expected user-defined function, got %r" % (self.whereami(), func) inputcells = self.get_inputcells(func, args) - - r = self.annotator.recursivecall(func, self.position_key, inputcells) + if context == 'current': + whence = self.position_key + else: + whence = context + r = self.annotator.recursivecall(func, whence, inputcells) # if we got different specializations keys for a same site, mix previous results for stability if key is not None: + assert context == 'current' occurence = (base_func, self.position_key) try: prev_key, prev_r = self.spec_callsite_keys_results[occurence] Modified: pypy/dist/pypy/rpython/test/test_rpbc.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rpbc.py (original) +++ pypy/dist/pypy/rpython/test/test_rpbc.py Sun Sep 11 22:35:19 2005 @@ -850,15 +850,14 @@ def test_hlinvoke_simple(): def f(a,b): return a + b + from pypy.translator import translator from pypy.translator import annrpython - a = annrpython.RPythonAnnotator() + a = annrpython.RPythonAnnotator(translator.Translator(simplifying=True)) from pypy.annotation import model as annmodel - def g(): - f(2,3) - f(4,5) - - a.build_types(g, []) + s_f = a.bookkeeper.immutablevalue(f) + a.bookkeeper.emulate_pbc_call('f', s_f, [annmodel.SomeInteger(), annmodel.SomeInteger()]) + a.complete() from pypy.rpython import rtyper rt = rtyper.RPythonTyper(a) @@ -870,7 +869,6 @@ from pypy.rpython import annlowlevel - s_f = a.bookkeeper.immutablevalue(f) r_f = rt.getrepr(s_f) s_R = a.bookkeeper.immutablevalue(r_f) Modified: pypy/dist/pypy/translator/annrpython.py ============================================================================== --- pypy/dist/pypy/translator/annrpython.py (original) +++ pypy/dist/pypy/translator/annrpython.py Sun Sep 11 22:35:19 2005 @@ -269,14 +269,23 @@ #___ interface for annotator.bookkeeper _______ - def recursivecall(self, func, position_key, inputcells): - parent_fn, parent_block, parent_index = position_key + def recursivecall(self, func, whence, inputcells): # whence = position_key|callback taking the annotator, graph + if isinstance(whence, tuple): + parent_fn, parent_block, parent_index = position_key = whence + else: + parent_fn = position_key = None graph = self.getflowgraph(func, parent_fn, position_key) # self.notify[graph.returnblock] is a dictionary of call # points to this func which triggers a reflow whenever the # return block of 
this graph has been analysed. callpositions = self.notify.setdefault(graph.returnblock, {}) - callpositions[position_key] = True + if whence is not None: + if callable(whence): + def callback(): + whence(self, graph) + else: + callback = whence + callpositions[callback] = True # generalize the function's input arguments self.addpendingblock(func, graph.startblock, inputcells, position_key) @@ -584,8 +593,11 @@ self.addpendingblock(fn, link.target, cells) if block in self.notify: # reflow from certain positions when this block is done - for position_key in self.notify[block]: - self.reflowfromposition(position_key) + for callback in self.notify[block]: + if isinstance(callback, tuple): + self.reflowfromposition(callback) # callback is a position + else: + callback() #___ creating the annotations based on operations ______ Modified: pypy/dist/pypy/translator/test/test_annrpython.py ============================================================================== --- pypy/dist/pypy/translator/test/test_annrpython.py (original) +++ pypy/dist/pypy/translator/test/test_annrpython.py Sun Sep 11 22:35:19 2005 @@ -1664,6 +1664,44 @@ s = a.build_types(g, [int]) assert None not in s.prebuiltinstances + def test_emulated_pbc_call_simple(self): + def f(a,b): + return a + b + from pypy.translator import translator + from pypy.translator import annrpython + a = annrpython.RPythonAnnotator(translator.Translator(simplifying=True)) + from pypy.annotation import model as annmodel + + s_f = a.bookkeeper.immutablevalue(f) + a.bookkeeper.emulate_pbc_call('f', s_f, [annmodel.SomeInteger(), annmodel.SomeInteger()]) + a.complete() + + assert f in a.translator.flowgraphs + assert a.binding(a.translator.flowgraphs[f].getreturnvar()).knowntype == int + + def test_emulated_pbc_call_callback(self): + def f(a,b): + return a + b + from pypy.translator import translator + from pypy.translator import annrpython + a = annrpython.RPythonAnnotator(translator.Translator(simplifying=True)) + from pypy.annotation import model as annmodel + + memo = [] + def callb(ann, graph): + memo.append(annmodel.SomeInteger().contains(ann.binding(graph.getreturnvar()))) + + s_f = a.bookkeeper.immutablevalue(f) + s = a.bookkeeper.emulate_pbc_call('f', s_f, [annmodel.SomeInteger(), annmodel.SomeInteger()], + callback=callb) + assert s == annmodel.SomeImpossibleValue() + a.complete() + + assert f in a.translator.flowgraphs + assert a.binding(a.translator.flowgraphs[f].getreturnvar()).knowntype == int + assert len(memo) >= 1 + for t in memo: + assert t def g(n): return [0,1,2,n] From pedronis at codespeak.net Sun Sep 11 22:47:31 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sun, 11 Sep 2005 22:47:31 +0200 (CEST) Subject: [pypy-svn] r17476 - pypy/dist/pypy/translator/test Message-ID: <20050911204731.DD28527B62@code1.codespeak.net> Author: pedronis Date: Sun Sep 11 22:47:30 2005 New Revision: 17476 Modified: pypy/dist/pypy/translator/test/test_annrpython.py Log: more strict test Modified: pypy/dist/pypy/translator/test/test_annrpython.py ============================================================================== --- pypy/dist/pypy/translator/test/test_annrpython.py (original) +++ pypy/dist/pypy/translator/test/test_annrpython.py Sun Sep 11 22:47:30 2005 @@ -1689,7 +1689,7 @@ memo = [] def callb(ann, graph): - memo.append(annmodel.SomeInteger().contains(ann.binding(graph.getreturnvar()))) + memo.append(annmodel.SomeInteger() == ann.binding(graph.getreturnvar())) s_f = a.bookkeeper.immutablevalue(f) s = 
a.bookkeeper.emulate_pbc_call('f', s_f, [annmodel.SomeInteger(), annmodel.SomeInteger()], From pedronis at codespeak.net Sun Sep 11 23:02:23 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sun, 11 Sep 2005 23:02:23 +0200 (CEST) Subject: [pypy-svn] r17477 - pypy/dist/pypy/annotation Message-ID: <20050911210223.C441F27B62@code1.codespeak.net> Author: pedronis Date: Sun Sep 11 23:02:22 2005 New Revision: 17477 Removed: pypy/dist/pypy/annotation/dictdef.py Log: removed to revert before the emulate_pbc_call workaround From pedronis at codespeak.net Sun Sep 11 23:03:41 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sun, 11 Sep 2005 23:03:41 +0200 (CEST) Subject: [pypy-svn] r17478 - pypy/dist/pypy/annotation Message-ID: <20050911210341.3472E27B62@code1.codespeak.net> Author: pedronis Date: Sun Sep 11 23:03:39 2005 New Revision: 17478 Added: pypy/dist/pypy/annotation/dictdef.py - copied unchanged from r17435, pypy/dist/pypy/annotation/dictdef.py Log: version before the workaround From pedronis at codespeak.net Sun Sep 11 23:28:02 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sun, 11 Sep 2005 23:28:02 +0200 (CEST) Subject: [pypy-svn] r17479 - pypy/dist/pypy/annotation Message-ID: <20050911212802.538CC27B62@code1.codespeak.net> Author: pedronis Date: Sun Sep 11 23:28:00 2005 New Revision: 17479 Modified: pypy/dist/pypy/annotation/dictdef.py Log: use callback mechanism Modified: pypy/dist/pypy/annotation/dictdef.py ============================================================================== --- pypy/dist/pypy/annotation/dictdef.py (original) +++ pypy/dist/pypy/annotation/dictdef.py Sun Sep 11 23:28:00 2005 @@ -46,16 +46,24 @@ replace_othereq = replace_otherhash = () s_key = self.s_value - s1 = self.bookkeeper.emulate_pbc_call(myeq, self.s_rdict_eqfn, [s_key, s_key], - replace=replace_othereq) - assert SomeBool().contains(s1), ( - "the custom eq function of an r_dict must return a boolean" - " (got %r)" % (s1,)) - s2 = self.bookkeeper.emulate_pbc_call(myhash, self.s_rdict_hashfn, [s_key], - replace=replace_otherhash) - assert SomeInteger().contains(s2), ( - "the custom hash function of an r_dict must return an integer" - " (got %r)" % (s2,)) + + def check_eqfn(annotator, graph): + s = annotator.binding(graph.getreturnvar()) + assert SomeBool().contains(s), ( + "the custom eq function of an r_dict must return a boolean" + " (got %r)" % (s,)) + self.bookkeeper.emulate_pbc_call(myeq, self.s_rdict_eqfn, [s_key, s_key], + replace=replace_othereq, + callback = check_eqfn) + + def check_hashfn(annotator, graph): + s = annotator.binding(graph.getreturnvar()) + assert SomeInteger().contains(s), ( + "the custom hash function of an r_dict must return an integer" + " (got %r)" % (s,)) + self.bookkeeper.emulate_pbc_call(myhash, self.s_rdict_hashfn, [s_key], + replace=replace_otherhash, + callback = check_hashfn) class DictValue(ListItem): From pedronis at codespeak.net Mon Sep 12 00:20:19 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 12 Sep 2005 00:20:19 +0200 (CEST) Subject: [pypy-svn] r17481 - in pypy/dist/pypy/interpreter: astcompiler pyparser/test Message-ID: <20050911222019.72A8627B5C@code1.codespeak.net> Author: pedronis Date: Mon Sep 12 00:20:17 2005 New Revision: 17481 Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py Log: use the std space instead of a faked one in this test, better convers forget space.* needed op fixed some of 
those we get a failing test this, related to different decisions about consts, first, we need proper line numbering Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Mon Sep 12 00:20:17 2005 @@ -292,15 +292,16 @@ def visitModule(self, node): + space = self.space self.parseSymbols(node) assert node.scope is not None self.scope = node.scope self.emitop_int('SET_LINENO', 0) - if node.doc: + if not space.is_w(node.doc, space.w_None): self.emitop_obj('LOAD_CONST', node.doc) self.storeName('__doc__') node.node.accept( self ) - self.emitop_obj('LOAD_CONST', self.space.w_None ) + self.emitop_obj('LOAD_CONST', space.w_None ) self.emit('RETURN_VALUE') def visitExpression(self, node): @@ -313,7 +314,8 @@ def visitFunction(self, node): self._visitFuncOrLambda(node, isLambda=0) - if node.doc: + space = self.space + if not space.is_w(node.doc, space.w_None): self.setDocstring(node.doc) self.storeName(node.name) @@ -1207,7 +1209,7 @@ CodeGenerator.__init__(self, space, graph) self.optimized = 1 - if not isLambda and func.doc: + if not isLambda and not space.is_w(func.doc, space.w_None): self.setDocstring(func.doc) if func.varargs: @@ -1281,7 +1283,7 @@ optimized=0, klass=1) CodeGenerator.__init__(self, space, graph) self.graph.setFlag(CO_NEWLOCALS) - if klass.doc: + if not space.is_w(klass.doc, space.w_None): self.setDocstring(klass.doc) def get_module(self): @@ -1303,7 +1305,7 @@ self.set_lineno(klass) self.emitop("LOAD_GLOBAL", "__name__") self.storeName("__module__") - if klass.doc: + if not space.is_w(klass.doc, space.w_None): self.emitop_obj("LOAD_CONST", klass.doc) self.storeName('__doc__') Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py Mon Sep 12 00:20:17 2005 @@ -8,6 +8,8 @@ import sys if sys.version[:3] != "2.4": py.test.skip("expected to work only on 2.4") + import pypy.conftest + mod.std_space = pypy.conftest.getobjspace('std') from pypy.interpreter.astcompiler import ast, misc, pycodegen @@ -77,7 +79,7 @@ Generator = pycodegen.ExpressionCodeGenerator codegen = Generator(space, ast) rcode = codegen.getCode() - return to_code(rcode) + return rcode def compile_with_stablecompiler(expr, target='exec'): from pypy.interpreter.testcompiler import compile @@ -85,8 +87,9 @@ return compile(expr, '', target) -def compare_code(ac_code, sc_code): +def compare_code(ac_code, sc_code, space=FakeSpace()): #print "Filename", ac_code.co_filename, sc_code.co_filename + ac_code = to_code(ac_code, space) assert ac_code.co_filename == sc_code.co_filename #print repr(ac_code.co_code) #print repr(sc_code.co_code) @@ -99,23 +102,28 @@ assert ac_code.co_code == sc_code.co_code assert ac_code.co_varnames == sc_code.co_varnames assert ac_code.co_flags == sc_code.co_flags - + assert len(ac_code.co_consts) == len(sc_code.co_consts) for c1, c2 in zip( ac_code.co_consts, sc_code.co_consts ): if type(c1)==PyCode: - c1 = to_code(c1) - return compare_code( c1, c2 ) + return compare_code( c1, c2, space ) else: assert c1 == c2 -def to_code( rcode ): +def to_code( rcode, space ): import new + consts = [] + for w in rcode.co_consts_w: + if type(w)==PyCode: + consts.append(w) + else: 
+ consts.append(space.unwrap(w)) code = new.code( rcode.co_argcount, rcode.co_nlocals, rcode.co_stacksize, rcode.co_flags, rcode.co_code, - tuple(rcode.co_consts_w), + tuple(consts), tuple(rcode.co_names), tuple(rcode.co_varnames), rcode.co_filename, @@ -126,12 +134,15 @@ tuple(rcode.co_cellvars) ) return code -def check_compile(expr, target='exec', quiet=False): +def check_compile(expr, target='exec', quiet=False, space=FakeSpace()): if not quiet: print "Compiling:", expr + + space = std_space + sc_code = compile_with_stablecompiler(expr, target=target) - ac_code = compile_with_astcompiler(expr, target=target) - compare_code(ac_code, sc_code) + ac_code = compile_with_astcompiler(expr, target=target, space=space) + compare_code(ac_code, sc_code, space=space) ## def check_compile( expr ): ## space = FakeSpace() From pedronis at codespeak.net Mon Sep 12 00:44:55 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 12 Sep 2005 00:44:55 +0200 (CEST) Subject: [pypy-svn] r17482 - pypy/dist/pypy/interpreter/pyparser/test Message-ID: <20050911224455.5C73027B5C@code1.codespeak.net> Author: pedronis Date: Mon Sep 12 00:44:53 2005 New Revision: 17482 Modified: pypy/dist/pypy/interpreter/pyparser/test/stdlib_testall.py pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py Log: use std object space for stdlib_testall too Modified: pypy/dist/pypy/interpreter/pyparser/test/stdlib_testall.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/stdlib_testall.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/stdlib_testall.py Mon Sep 12 00:44:53 2005 @@ -2,11 +2,17 @@ import py from test_astcompiler import check_compile +def setup_module(mod): + import sys + if sys.version[:3] != "2.4": + py.test.skip("expected to work only on 2.4") + import pypy.conftest + mod.std_space = pypy.conftest.getobjspace('std') def check_file_compile(filename): print 'Compiling:', filename source = open(filename).read() - check_compile(source, 'exec', quiet=True) + check_compile(source, 'exec', quiet=True, space=std_space) def test_all(): Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py Mon Sep 12 00:44:53 2005 @@ -134,11 +134,12 @@ tuple(rcode.co_cellvars) ) return code -def check_compile(expr, target='exec', quiet=False, space=FakeSpace()): +def check_compile(expr, target='exec', quiet=False, space=None): if not quiet: print "Compiling:", expr - space = std_space + if space is None: + space = std_space sc_code = compile_with_stablecompiler(expr, target=target) ac_code = compile_with_astcompiler(expr, target=target, space=space) From tismer at codespeak.net Mon Sep 12 02:36:45 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Mon, 12 Sep 2005 02:36:45 +0200 (CEST) Subject: [pypy-svn] r17484 - in pypy/dist/pypy: annotation doc rpython rpython/test translator/c/test Message-ID: <20050912003645.9786A27BAE@code1.codespeak.net> Author: tismer Date: Mon Sep 12 02:36:42 2005 New Revision: 17484 Modified: pypy/dist/pypy/annotation/builtin.py pypy/dist/pypy/doc/coding-guide.txt pypy/dist/pypy/rpython/rlist.py pypy/dist/pypy/rpython/rrange.py pypy/dist/pypy/rpython/test/test_rrange.py pypy/dist/pypy/translator/c/test/test_typed.py Log: completed the implementation of rrange. 
It is true that ranges with a variable step are seldom animals. Anyway I tried to make RPython complete in this area, without adding too much overhead or slowing down the common cases. - augmented the annotator to allow for variable step in ranges. This is flagged by setting range_step to zero, a disallowed value in the constant case. - special-cased variable step in rrange.py. Only if it is variable, a three-element object is allocated for range and its iterator, no overhead created otherwise. - added a runtime-check for variable step. Range always issues ValueError if step is zero. The case is rare enough to not add special cases here. - tried to make the additional code as small as possible and did not influence the speed of the const case in any way. - added a couple of new tests that ensure correct dynamic handling of the variable step. - updated the coding guide, mostly by removing dropped restrictions. I honestly hope not to raise complaints by this, although it was mostly for my own pleasure. considering if and how to map the range/xrange implementations of StdObjSpace to it. Modified: pypy/dist/pypy/annotation/builtin.py ============================================================================== --- pypy/dist/pypy/annotation/builtin.py (original) +++ pypy/dist/pypy/annotation/builtin.py Mon Sep 12 02:36:42 2005 @@ -53,13 +53,15 @@ else: raise Exception, "range() takes 1 to 3 arguments" if not s_step.is_constant(): - raise Exception, "range() step argument should be a constant" - step = s_step.const - if step == 0: - raise Exception, "range() with step zero" - elif step > 0: - nonneg = s_start.nonneg + step = 0 # this case signals a variable step else: + step = s_step.const + if step == 0: + raise Exception, "range() with step zero" + nonneg = False # so far + if step > 0: + nonneg = s_start.nonneg + elif step < 0: nonneg = s_stop.nonneg or (s_stop.is_constant() and s_stop.const >= -1) return getbookkeeper().newlist(SomeInteger(nonneg=nonneg), range_step=step) Modified: pypy/dist/pypy/doc/coding-guide.txt ============================================================================== --- pypy/dist/pypy/doc/coding-guide.txt (original) +++ pypy/dist/pypy/doc/coding-guide.txt Mon Sep 12 02:36:42 2005 @@ -109,12 +109,14 @@ **control structures** - all allowed + all allowed but yield **range** - does not create an array. It is only allowed in for loops. The step argument - must be a constant. + ``range`` and ``xrange`` are identical. ``range`` does not necessarily create an array, + only if the result is modified. It is allowed everywhere and completely + implemented. The only visible difference to CPython is the inaccessability + of the ``xrange`` fields start, stop and step. **definitions** @@ -149,20 +151,21 @@ lists are used as an allocated array; list.append() does naive resizing, so as far as possible use list comprehensions (see below). list.extend() or the += - operator are allowed and efficient. Unless there is really a use case for it, - repetition is limited to initialization purposes: '[single_value] * length'. + operator are allowed and efficient. + Repetition via `*` or `*=` is fully supported as well. **dicts** - dicts with string keys only (preferably the kind of strings that are usually - interned in CPython, i.e. short strings that look like identifiers). The - implementation could safely decide that all dict keys should be interned. + dicts with a unique key type only, provided it is hashable. 
+ String keys have been the only allowed key types for a while, but this was generalized. + After some re-optimization, + the implementation could safely decide that all dict keys should be interned. **list comprehensions** - may be used to create allocated, initialized array. the array size must be - computable in advance, which implies that we don't allow an if clause. + may be used to create allocated, initialized arrays. + After list over-allocation was introduced, there is no longer any restriction. **functions** Modified: pypy/dist/pypy/rpython/rlist.py ============================================================================== --- pypy/dist/pypy/rpython/rlist.py (original) +++ pypy/dist/pypy/rpython/rlist.py Mon Sep 12 02:36:42 2005 @@ -31,7 +31,7 @@ from pypy.rpython import rrange listitem = self.listdef.listitem s_value = listitem.s_value - if listitem.range_step and not listitem.mutated: + if listitem.range_step is not None and not listitem.mutated: return rrange.RangeRepr(listitem.range_step) elif (s_value.__class__ is annmodel.SomeObject and s_value.knowntype == object): return robject.pyobj_repr Modified: pypy/dist/pypy/rpython/rrange.py ============================================================================== --- pypy/dist/pypy/rpython/rrange.py (original) +++ pypy/dist/pypy/rpython/rrange.py Mon Sep 12 02:36:42 2005 @@ -13,20 +13,35 @@ # struct range { # Signed start, stop; // step is always constant # } +# +# struct rangest { +# Signed start, stop, step; // rare case, for completeness +# } RANGE = GcStruct("range", ("start", Signed), ("stop", Signed)) RANGEITER = GcStruct("range", ("next", Signed), ("stop", Signed)) +RANGEST = GcStruct("range", ("start", Signed), ("stop", Signed),("step", Signed)) +RANGESTITER = GcStruct("range", ("next", Signed), ("stop", Signed), ("step", Signed)) class RangeRepr(Repr): - lowleveltype = Ptr(RANGE) - def __init__(self, step): self.step = step + if step != 0: + self.lowleveltype = Ptr(RANGE) + else: + self.lowleveltype = Ptr(RANGEST) + + def _getstep(self, v_rng, hop): + return hop.genop('getfield', [v_rng, hop.inputconst(Void, 'step')], + resulttype=Signed) def rtype_len(self, hop): v_rng, = hop.inputargs(self) - cstep = hop.inputconst(Signed, self.step) + if self.step != 0: + cstep = hop.inputconst(Signed, self.step) + else: + cstep = self._getstep(v_rng, hop) return hop.gendirectcall(ll_rangelen, v_rng, cstep) def make_iterator_repr(self): @@ -42,7 +57,10 @@ spec = dum_nocheck v_func = hop.inputconst(Void, spec) v_lst, v_index = hop.inputargs(r_rng, Signed) - cstep = hop.inputconst(Signed, r_rng.step) + if r_rng.step != 0: + cstep = hop.inputconst(Signed, r_rng.step) + else: + cstep = r_rng._getstep(v_lst, hop) if hop.args_s[1].nonneg: llfn = ll_rangeitem_nonneg else: @@ -94,6 +112,15 @@ l.stop = stop return l +def ll_newrangest(start, stop, step): + if step == 0: + raise ValueError + l = malloc(RANGEST) + l.start = start + l.stop = stop + l.step = step + return l + def rtype_builtin_range(hop): vstep = hop.inputconst(Signed, 1) if hop.nb_args == 1: @@ -103,10 +130,15 @@ vstart, vstop = hop.inputargs(Signed, Signed) else: vstart, vstop, vstep = hop.inputargs(Signed, Signed, Signed) - assert isinstance(vstep, Constant) - + const_step = isinstance(vstep, Constant) + if const_step and vstep.value == 0: + # not really needed, annotator catches it. Just in case... 
+ raise TyperError("range cannot have a const step of zero") if isinstance(hop.r_result, RangeRepr): - return hop.gendirectcall(ll_newrange, vstart, vstop) + if const_step: + return hop.gendirectcall(ll_newrange, vstart, vstop) + else: + return hop.gendirectcall(ll_newrangest, vstart, vstop, vstep) else: # cannot build a RANGE object, needs a real list r_list = hop.r_result @@ -116,6 +148,8 @@ rtype_builtin_xrange = rtype_builtin_range def ll_range2list(LISTPTR, start, stop, step): + if step == 0: + raise ValueError length = _ll_rangelen(start, stop, step) l = ll_newlist(LISTPTR, length) idx = 0 @@ -131,10 +165,12 @@ # Iteration. class RangeIteratorRepr(IteratorRepr): - lowleveltype = Ptr(RANGEITER) - def __init__(self, r_rng): self.r_rng = r_rng + if r_rng.step != 0: + self.lowleveltype = Ptr(RANGEITER) + else: + self.lowleveltype = Ptr(RANGESTITER) def newiter(self, hop): v_rng, = hop.inputargs(self.r_rng) @@ -143,19 +179,24 @@ def rtype_next(self, hop): v_iter, = hop.inputargs(self) - cstep = hop.inputconst(Signed, self.r_rng.step) + args = hop.inputconst(Signed, self.r_rng.step), if self.r_rng.step > 0: llfn = ll_rangenext_up - else: + elif self.r_rng.step < 0: llfn = ll_rangenext_down + else: + llfn = ll_rangenext_updown + args = () hop.has_implicit_exception(StopIteration) # record that we know about it hop.exception_is_here() - return hop.gendirectcall(llfn, v_iter, cstep) + return hop.gendirectcall(llfn, v_iter, *args) def ll_rangeiter(ITERPTR, rng): iter = malloc(ITERPTR.TO) iter.next = rng.start iter.stop = rng.stop + if ITERPTR.TO is RANGESTITER: + iter.step = rng.step return iter def ll_rangenext_up(iter, step): @@ -171,3 +212,10 @@ raise StopIteration iter.next = next + step return next + +def ll_rangenext_updown(iter): + step = iter.step + if step > 0: + return ll_rangenext_up(iter, step) + else: + return ll_rangenext_down(iter, step) Modified: pypy/dist/pypy/rpython/test/test_rrange.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rrange.py (original) +++ pypy/dist/pypy/rpython/test/test_rrange.py Mon Sep 12 02:36:42 2005 @@ -3,10 +3,14 @@ from pypy.rpython.test.test_llinterp import interpret def test_rlist_range(): - def test1(start, stop, step): + def test1(start, stop, step, varstep): expected = range(start, stop, step) length = len(expected) - l = ll_newrange(start, stop) + if varstep: + l = ll_newrangest(start, stop, step) + step = l.step + else: + l = ll_newrange(start,stop) assert ll_rangelen(l, step) == length lst = [ll_rangeitem(dum_nocheck, l, i, step) for i in range(length)] assert lst == expected @@ -18,7 +22,8 @@ for start in (-10, 0, 1, 10): for stop in (-8, 0, 4, 8, 25): for step in (1, 2, 3, -1, -2): - test1(start, stop, step) + for varstep in False,True: + test1(start, stop, step, varstep) # ____________________________________________________________ @@ -76,3 +81,26 @@ start, stop = 10, 17 res = interpret(dummyfn, [start, stop]) assert res == dummyfn(start, stop) + +def check_failed(func, *args): + try: + interpret(func, *args) + except: + return True + else: + return False + +def test_range_extra(): + def failingfn_const(): + r = range(10, 17, 0) + return r[-1] + assert check_failed(failingfn_const, []) + + def failingfn_var(step): + r = range(10, 17, step) + return r[-1] + step = 3 + res = interpret(failingfn_var, [step]) + assert res == failingfn_var(step) + step = 0 + assert check_failed(failingfn_var, [step]) Modified: pypy/dist/pypy/translator/c/test/test_typed.py 
============================================================================== --- pypy/dist/pypy/translator/c/test/test_typed.py (original) +++ pypy/dist/pypy/translator/c/test/test_typed.py Mon Sep 12 02:36:42 2005 @@ -11,7 +11,7 @@ class TestTypedTestCase(_TestAnnotatedTestCase): - def getcompiled(self, func): + def getcompiled(self, func, view=False): t = Translator(func, simplifying=True) # builds starting-types from func_defs argstypelist = [] @@ -24,6 +24,8 @@ a.simplify() t.specialize() t.checkgraphs() + if view: + t.view() return skip_missing_compiler(t.ccompile) def test_call_five(self): @@ -367,4 +369,14 @@ f = self.getcompiled(fn) assert f(0) == fn(0) assert f(-1) == fn(-1) - raises(IndexError, f, 42) \ No newline at end of file + raises(IndexError, f, 42) + + def test_range_step(self): + def fn(step=int): + r = range(10, 37, step) + # we always raise on step = 0 + return r[-2] + f = self.getcompiled(fn)#, view=True) + assert f(1) == fn(1) + assert f(3) == fn(3) + raises(ValueError, f, 0) From tismer at codespeak.net Mon Sep 12 02:45:07 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Mon, 12 Sep 2005 02:45:07 +0200 (CEST) Subject: [pypy-svn] r17485 - pypy/dist/pypy/doc Message-ID: <20050912004507.F37E627BAE@code1.codespeak.net> Author: tismer Date: Mon Sep 12 02:45:06 2005 New Revision: 17485 Modified: pypy/dist/pypy/doc/coding-guide.txt Log: removed inexactness Modified: pypy/dist/pypy/doc/coding-guide.txt ============================================================================== --- pypy/dist/pypy/doc/coding-guide.txt (original) +++ pypy/dist/pypy/doc/coding-guide.txt Mon Sep 12 02:45:06 2005 @@ -159,7 +159,7 @@ dicts with a unique key type only, provided it is hashable. String keys have been the only allowed key types for a while, but this was generalized. After some re-optimization, - the implementation could safely decide that all dict keys should be interned. + the implementation could safely decide that all string dict keys should be interned. **list comprehensions** From tismer at codespeak.net Mon Sep 12 02:50:07 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Mon, 12 Sep 2005 02:50:07 +0200 (CEST) Subject: [pypy-svn] r17486 - pypy/dist/pypy/rpython/test Message-ID: <20050912005007.4861527BAE@code1.codespeak.net> Author: tismer Date: Mon Sep 12 02:50:06 2005 New Revision: 17486 Modified: pypy/dist/pypy/rpython/test/test_rrange.py Log: cosmetics Modified: pypy/dist/pypy/rpython/test/test_rrange.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rrange.py (original) +++ pypy/dist/pypy/rpython/test/test_rrange.py Mon Sep 12 02:50:06 2005 @@ -22,7 +22,7 @@ for start in (-10, 0, 1, 10): for stop in (-8, 0, 4, 8, 25): for step in (1, 2, 3, -1, -2): - for varstep in False,True: + for varstep in False, True: test1(start, stop, step, varstep) # ____________________________________________________________ From tismer at codespeak.net Mon Sep 12 03:22:59 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Mon, 12 Sep 2005 03:22:59 +0200 (CEST) Subject: [pypy-svn] r17487 - pypy/dist/pypy/rpython/test Message-ID: <20050912012259.197DF27BB1@code1.codespeak.net> Author: tismer Date: Mon Sep 12 03:22:55 2005 New Revision: 17487 Modified: pypy/dist/pypy/rpython/test/test_rrange.py Log: this check-in is just to show a funny observation. When a variable is conditionally assigned two different ranges, the ranges are kept, unless the steps are different. 
Well, after all this is not surprizing. But I thing the action taken is not pleasant: the range is turned into a list in this case! I guess what we want is either special-casing the respective blocks, or at least map this to the variable step case, if this is possible. Please let me know what you think. Modified: pypy/dist/pypy/rpython/test/test_rrange.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rrange.py (original) +++ pypy/dist/pypy/rpython/test/test_rrange.py Mon Sep 12 03:22:55 2005 @@ -104,3 +104,44 @@ assert res == failingfn_var(step) step = 0 assert check_failed(failingfn_var, [step]) + +def test_range_iter(): + def fn(start, stop, step): + res = 0 + if step == 0: + if stop >= start: + r = range(start, stop, 1) + else: + r = range(start, stop, -1) + else: + r = range(start, stop, step) + for i in r: + res = res * 51 + i + return res + res = interpret(fn, [2, 7, 1])#, view=True) + # XXX not finished, stunned + +# XXX the above test works, but it always turns the range into a list!!! +# +# here another test that show that this even happens in a simple case. +# I think this is an annotator problem + +def test_range_funny(): + # this is just an example. + # making start/stop different is ok + def fn(start, stop): + if stop >= start: + r = range(start, stop, 1) + else: + r = range(start, stop-1, 1) + return r[-2] + # making step different turns the range into a list! + # I think, we should instead either specialize the blocks, + # or morph the whole thing into the variable step case??? + def fn(start, stop): + if stop >= start: + r = range(start, stop, 1) + else: + r = range(start, stop, -1) + return r[-2] + res = interpret(fn, [2, 7])#, view=True) From arigo at codespeak.net Mon Sep 12 10:28:16 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Mon, 12 Sep 2005 10:28:16 +0200 (CEST) Subject: [pypy-svn] r17489 - pypy/dist/pypy/doc Message-ID: <20050912082816.83DA127BAA@code1.codespeak.net> Author: arigo Date: Mon Sep 12 10:28:14 2005 New Revision: 17489 Added: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt (contents, props changed) Modified: pypy/dist/pypy/doc/_ref.txt Log: Draft with lengthy introduction about performing static analysis of dynamic languages. I tried to write to a reader that would not necessarily know well the particularities of dynamic languages, and I try to develop the basic reasons for which analysing Python source code is not a great idea. Modified: pypy/dist/pypy/doc/_ref.txt ============================================================================== --- pypy/dist/pypy/doc/_ref.txt (original) +++ pypy/dist/pypy/doc/_ref.txt Mon Sep 12 10:28:14 2005 @@ -1,16 +1,17 @@ .. _`demo/`: ../../demo .. _`lib-python/`: ../../lib-python .. _`lib-python/2.4.1/dis.py`: ../../lib-python/2.4.1/dis.py -.. _`pypy/annotation`: -.. _`annotation/`: ../../pypy/annotation +.. _`annotation/`: +.. _`pypy/annotation`: ../../pypy/annotation .. _`annotation/binaryop.py`: ../../pypy/annotation/binaryop.py .. _`doc/`: ../../pypy/doc .. _`doc/revreport/`: ../../pypy/doc/revreport -.. _`pypy/interpreter`: -.. _`interpreter/`: ../../pypy/interpreter +.. _`interpreter/`: +.. _`pypy/interpreter`: ../../pypy/interpreter .. _`pypy/interpreter/argument.py`: ../../pypy/interpreter/argument.py .. _`pypy/interpreter/function.py`: ../../pypy/interpreter/function.py -.. _`pypy/interpreter/gateway.py`: ../../pypy/interpreter/gateway.py +.. _`pypy/interpreter/gateway.py`: +.. 
_`interpreter/gateway.py`: ../../pypy/interpreter/gateway.py .. _`pypy/interpreter/generator.py`: ../../pypy/interpreter/generator.py .. _`pypy/interpreter/mixedmodule.py`: ../../pypy/interpreter/mixedmodule.py .. _`pypy/interpreter/nestedscope.py`: ../../pypy/interpreter/nestedscope.py @@ -28,19 +29,19 @@ .. _`module/parser/`: ../../pypy/module/parser .. _`module/recparser/`: ../../pypy/module/recparser .. _`module/sys/`: ../../pypy/module/sys -.. _`pypy/objspace`: -.. _`objspace/`: ../../pypy/objspace +.. _`objspace/`: +.. _`pypy/objspace`: ../../pypy/objspace .. _`objspace/flow/`: ../../pypy/objspace/flow -.. _`pypy/objspace/std`: -.. _`objspace/std/`: ../../pypy/objspace/std +.. _`objspace/std/`: +.. _`pypy/objspace/std`: ../../pypy/objspace/std .. _`objspace/thunk.py`: ../../pypy/objspace/thunk.py .. _`objspace/trace.py`: .. _`pypy/objspace/trace.py`: ../../pypy/objspace/trace.py -.. _`pypy/rpython`: -.. _`rpython/`: ../../pypy/rpython +.. _`rpython/`: +.. _`pypy/rpython`: ../../pypy/rpython .. _`pypy/rpython/extfunctable.py`: ../../pypy/rpython/extfunctable.py -.. _`rpython/lltype.py`: -.. _`pypy/rpython/lltype.py`: ../../pypy/rpython/lltype.py +.. _`pypy/rpython/lltype.py`: +.. _`rpython/lltype.py`: ../../pypy/rpython/lltype.py .. _`pypy/rpython/memory/gc.py`: ../../pypy/rpython/memory/gc.py .. _`pypy/rpython/memory/lladdress.py`: ../../pypy/rpython/memory/lladdress.py .. _`pypy/rpython/memory/simulator.py`: ../../pypy/rpython/memory/simulator.py @@ -58,8 +59,8 @@ .. _`tool/`: ../../pypy/tool .. _`tool/pytest/`: ../../pypy/tool/pytest .. _`tool/tb_server/`: ../../pypy/tool/tb_server -.. _`pypy/translator`: -.. _`translator/`: ../../pypy/translator +.. _`translator/`: +.. _`pypy/translator`: ../../pypy/translator .. _`pypy/translator/annrpython.py`: ../../pypy/translator/annrpython.py .. _`translator/c/`: ../../pypy/translator/c .. _`pypy/translator/c/extfunc.py`: ../../pypy/translator/c/extfunc.py Added: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt ============================================================================== --- (empty file) +++ pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Mon Sep 12 10:28:14 2005 @@ -0,0 +1,83 @@ +============================================================ + Compiling dynamic language implementations +============================================================ + + +Introduction +=============================================== + +Dynamic languages +--------------------------- + +Dynamic languages are definitely not new on the computing scene. However, new conditions like increased computing power and designs driven by larger communities have allowed the emergence of new aspects in the recent members of the family, or at least made them more practical than they previously were. The following aspects in particular are typical not only of Python but of most modern dynamic languages: + +* The driving force is not minimalistic elegance. It is a balance between elegance and practicality, and rather un-minimalistic -- the feature sets built into languages tend to be relatively large and growing (though it is still a major difference between languages where exactly they stand on this scale). + +* High abstractions and theoretically powerful low-level primitives are generally ruled out in favor of a larger number of features that try to cover the most common use cases. In this respect, one could even regard these languages as mere libraries on top of some simpler (unspecified) language. 
+ +* Implementation-wise, language design is no longer driven by a desire to enable high performance; any feature straightforward enough to achieve with an interpreter is candidate. As a result, compilation and most kinds of static inference are made impossible due to this dynamism (unless they are simply tedious due to the size of the language). + + +No Declarations +-------------------------- + +The notion declaration, central in compiled languages, is entierely missing in Python. There is no aspect of a program that must be declared; the complete program is built and run by executing statements. Some of these statements have a declarative look and feel; for example, some appear to be function or class declarations. Actually, they are merely statements that, when executed, build a function or class object and store a reference to that object at some place, under some name, where it can be retrieved from later. Units of programs -- modules, whose source is a file each -- are similarily mere objects in memory built on demand by some other module executing an ``import`` statement. Any such statement -- class construction or module import -- can be executed at any time during the execution of a program. + +This point of view should help explain why an analysis of a program is theoretically impossible: there is no declared structure. The program could for example build a class in completely different ways based on the results of NP-complete computations or external factors. This is not just a theoretical possibility but a regularly used feature: for example, the pure Python module ``os.py`` provides some OS-independent interface to OS-specific system calls, by importing OS-specific modules and defining substitute functions as needed depending on the OS on which ``os.py`` turns out to be executed. Most large Python projects use custom import mechanisms to control exactly how and from where each module is loaded, simply by tampering with import hooks or just emulating parts of the ``import`` statement manually. + +In addition, there are of course classical (and only partially true) arguments against compiling dynamic languages (there is an ``eval`` function that can execute arbitrary code, and introspection can change anything at run-time), but we consider the argument outlined above as more fundamental to the nature of dynamic languages. + + +Control flow versus data model +--------------------------------- + +Given the absence of declarations, the only preprocessing done on a Python module is the compilation of the source code into pseudo-code (bytecode). From there, the semantics can be roughly divided in two groups: the control flow semantics and the data model. In Python and other languages of its family, these two aspects are, to some extent, conceptually separated. Indeed, although it is possible, and common, to design languages in which the two aspects are more intricated, or one aspect is subsumed to the other (e.g. data structures in Lisp), programmers tend to separate the two concepts in common cases -- enough for the "practical-features-beats-obscure-primitives" language design guideline seen above. So in Python, both aspects are complex on their own. + +The control flow semantics include, clearly, all syntactic elements that influence the control flow of a program -- loops, function definitions and calls, etc. -- whereas the data model describes how the first-class objects manipulated by the program behave under some operations. 
There is a rich built-in set of object types in Python, and a rich set of operations on them, each corresponding to a syntactic element. Objects of different types react differently to the same operation, and the variables are not statically typed, which is also part of the dynamic nature of languages like Python -- operations are generally highly polymorphic and types are hard to infer in advance. + +Note that control flow and object model are not entierely separated. It is not uncommon for some control flow aspects to be manipulable as first-class objects as well, e.g. functions in Python. Conversely, almost any operation on any object could lead to a user-defined function being called back. + +The data model forms a so-called *Object Space* in PyPy. The bytecode interpreter works by delegating most operations to the object space, by invoking a well-defined abstract interface. The objects are regarded as "belonging" to the object space, where the interpreter sees them as black boxes on which it can ask for operations to be performed. + +Note that the term "object space" has already been reused for other dynamic language implementations, e.g. XXX for Perl 6. + + +The analysis of live programs +----------------------------------- + +How can we perform some static analysis on a program written in a dynamic language while keeping to the spirit of `No Declarations`_, i.e. without imposing that the program be written in a static way in which these declarative-looking statements would actually *be* declarations? + +The approach of PyPy is, first of all, to perform analysis on live programs in memory instead of dead source files. This means that the program to analyse is first fully imported and initialized, and once it has reached a state that is deemed advanced enough, we limit the amount of dynamism that is *further* allowed and we analyse the program's objects in memory. In some sense, we use the full Python as a preprocessor for a subset of the language, called RPython, which differs from Python only in ruling out some operations like creating new classes. + +More theoretically, analysing dead source files is equivalent to giving up all dynamism (in the sense of `No Declarations`_), but static analysis is still possible if we allow a *finite* amount of dynamism -- where an operation is considered dynamic or not depending on whether it is supported or not by the analysis we are performing. Of course, putting more cleverness in the tools helps too; but the point here is that we are still allowed to build dynamic programs, as long as they only ever build a bounded amount of, say, classes and functions. The source code of the PyPy interpreter, which is itself written in its style, also makes extensive use of the fact that it is possible to build new classes at any point in time, not just during an initialization phase, as long as this number of bounded (e.g. `interpreter/gateway.py`_ builds a custom class for each function that some variable can point to -- there is a finite number of functions in total, so this makes a finite number of extra classes). + +Note that this approach is natural in image-oriented environment like Smalltalk, where the program is, by default, live instead of in files. 
The Python experience forced us to allow some uncontrolled dynamism simply to be able to load the program to memory in the first place; once this was done, it was a mere (but very useful) side-effects that we could allow for some more uncontrolled dynamism at run-time, as opposed to analysing an image in a known frozen state. + + +Abstract interpretation +------------------------------ + +The analysis we perform in PyPy is global program discovery (i.e. slicing it out of all the objects in memory) and type inference. The analysis of the non-dynamic parts themelves is based on their `abstract interpretation`_. The object space separation was also designed for this purpose. PyPy has an alternate object space called the `Flow Object Space`_, whose objects are empty placeholders. The over-simplified view is that to analyse a function, we bind its input arguments to such placeholders, and execute the function -- i.e. let the interpreter follow its bytecode and invoke the object space for each operations, one by one. The Flow object space records each operation when it is issued, and returns a new placeholder as a result. At the end, the list of recorded operations, along with the involved placeholders, gives an assembler-like view of what the function performs. + +The global picture is then to run the program while switching between the flow object space for static enough functions, and a normal, concrete object space for functions or initializations requiring the full dynamism. + +If the placeholders are endowed with a bit more information, e.g. if they carry a type information that is propagated to resulting placeholders by individual operations, then our abstract interpretation simultaneously performs type inference. This is, in essence, executing the program while abstracting out some concrete values and replacing them with the set of all values that could actually be there. If the sets are broad enough, then after some time we will have seen all potential value sets along each possible code paths, and our program analysis is complete. + +This is a theoretical point of view that differs significantly from what we have implemented, for many reasons. Of course, the devil is in the details -- which the rest of this paper is all about. + + +Flow Object Space +=================================== + +XXX + +Annotator +=================================== + +XXX + + +.. _`abstract interpretation`: theory.html#abstract-interpretation +.. _`Flow Object Space`: objspace.html#flow-object-space + +.. include:: _ref.txt From tismer at codespeak.net Mon Sep 12 11:03:43 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Mon, 12 Sep 2005 11:03:43 +0200 (CEST) Subject: [pypy-svn] r17490 - in pypy/dist/pypy: annotation rpython rpython/test translator/c/test Message-ID: <20050912090343.8259627BAA@code1.codespeak.net> Author: tismer Date: Mon Sep 12 11:03:41 2005 New Revision: 17490 Modified: pypy/dist/pypy/annotation/listdef.py pypy/dist/pypy/rpython/rrange.py pypy/dist/pypy/rpython/test/test_rrange.py pypy/dist/pypy/translator/c/test/test_typed.py Log: added the missing pieces to unify different ranges, plus some tests. There is anyway a little problem: We don't handle large ranges correctly which needs an unsigned len result. Given that, range is still not absolutely compatible. No idea yet, what we want to do about it. 
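The unification rule this check-in adds is small enough to restate on its own. Here is a
minimal standalone sketch (illustration only, not the PyPy sources) of how the
``range_step`` annotations are meant to combine, with ``None`` meaning "degrade to a plain
list" and ``0`` as the variable-step sentinel that the ``RANGEST`` low-level struct then
handles::

    # range_step conventions assumed here, following the diff below:
    #   None   -> value must be a real list (no range optimization)
    #   n != 0 -> range with the known constant step n
    #   0      -> range whose step is only known at run time (RANGEST)

    def merge_range_step(step1, step2):
        """Unify the range_step info of two list values that merge."""
        if step1 == step2:
            return step1
        if step1 is None or step2 is None:
            return None    # one side is already a real list: stay a list
        return 0           # two different constant steps: keep a range,
                           # but store the step in the structure itself

    assert merge_range_step(1, 1) == 1        # same step is preserved
    assert merge_range_step(1, -1) == 0       # becomes a variable-step range
    assert merge_range_step(1, None) is None  # merged with a list -> a list
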
Modified: pypy/dist/pypy/annotation/listdef.py ============================================================================== --- pypy/dist/pypy/annotation/listdef.py (original) +++ pypy/dist/pypy/annotation/listdef.py Mon Sep 12 11:03:41 2005 @@ -7,6 +7,15 @@ resized = False # True for lists resized after creation range_step = None # the step -- only for lists only created by a range() + # what to do if range_step is different in merge. + # - if one is a list (range_step is None), unify to a list. + # - if both have a step, unify to use a variable step (indicated by 0) + _step_map = { + (type(None), int): None, + (int, type(None)): None, + (int, int) : 0, + } + def __init__(self, bookkeeper, s_value): self.s_value = s_value self.bookkeeper = bookkeeper @@ -20,7 +29,8 @@ self.mutated |= other.mutated self.resized |= other.resized if other.range_step != self.range_step: - self.range_step = None + self.range_step = self._step_map[type(self.range_step), + type(other.range_step)] self.itemof.update(other.itemof) read_locations = self.read_locations.copy() other_read_locations = other.read_locations.copy() Modified: pypy/dist/pypy/rpython/rrange.py ============================================================================== --- pypy/dist/pypy/rpython/rrange.py (original) +++ pypy/dist/pypy/rpython/rrange.py Mon Sep 12 11:03:41 2005 @@ -130,12 +130,11 @@ vstart, vstop = hop.inputargs(Signed, Signed) else: vstart, vstop, vstep = hop.inputargs(Signed, Signed, Signed) - const_step = isinstance(vstep, Constant) - if const_step and vstep.value == 0: - # not really needed, annotator catches it. Just in case... - raise TyperError("range cannot have a const step of zero") + if isinstance(vstep, Constant) and vstep.value == 0: + # not really needed, annotator catches it. Just in case... + raise TyperError("range cannot have a const step of zero") if isinstance(hop.r_result, RangeRepr): - if const_step: + if hop.r_result.step != 0: return hop.gendirectcall(ll_newrange, vstart, vstop) else: return hop.gendirectcall(ll_newrangest, vstart, vstop, vstep) Modified: pypy/dist/pypy/rpython/test/test_rrange.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rrange.py (original) +++ pypy/dist/pypy/rpython/test/test_rrange.py Mon Sep 12 11:03:41 2005 @@ -1,6 +1,7 @@ from pypy.translator.translator import Translator from pypy.rpython.rrange import * from pypy.rpython.test.test_llinterp import interpret +from pypy.rpython.rarithmetic import intmask def test_rlist_range(): def test1(start, stop, step, varstep): @@ -118,30 +119,6 @@ for i in r: res = res * 51 + i return res - res = interpret(fn, [2, 7, 1])#, view=True) - # XXX not finished, stunned - -# XXX the above test works, but it always turns the range into a list!!! -# -# here another test that show that this even happens in a simple case. -# I think this is an annotator problem - -def test_range_funny(): - # this is just an example. - # making start/stop different is ok - def fn(start, stop): - if stop >= start: - r = range(start, stop, 1) - else: - r = range(start, stop-1, 1) - return r[-2] - # making step different turns the range into a list! - # I think, we should instead either specialize the blocks, - # or morph the whole thing into the variable step case??? 
- def fn(start, stop): - if stop >= start: - r = range(start, stop, 1) - else: - r = range(start, stop, -1) - return r[-2] - res = interpret(fn, [2, 7])#, view=True) + for args in [2, 7, 0], [7, 2, 0], [10, 50, 7], [50, -10, -3]: + res = interpret(fn, args)#, view=True) + assert res == intmask(fn(*args)) Modified: pypy/dist/pypy/translator/c/test/test_typed.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_typed.py (original) +++ pypy/dist/pypy/translator/c/test/test_typed.py Mon Sep 12 11:03:41 2005 @@ -4,7 +4,7 @@ from pypy.translator.translator import Translator from pypy.translator.test import snippet from pypy.translator.tool.cbuild import skip_missing_compiler -from pypy.rpython.rarithmetic import r_uint +from pypy.rpython.rarithmetic import r_uint, intmask from pypy.translator.c.test.test_annotated import TestAnnotatedTestCase as _TestAnnotatedTestCase @@ -380,3 +380,20 @@ assert f(1) == fn(1) assert f(3) == fn(3) raises(ValueError, f, 0) + + def test_range_iter(self): + def fn(start=int, stop=int, step=int): + res = 0 + if step == 0: + if stop >= start: + r = range(start, stop, 1) + else: + r = range(start, stop, -1) + else: + r = range(start, stop, step) + for i in r: + res = res * 51 + i + return res + f = self.getcompiled(fn) + for args in [2, 7, 0], [7, 2, 0], [10, 50, 7], [50, -10, -3]: + assert f(*args) == intmask(fn(*args)) From mwh at codespeak.net Mon Sep 12 12:25:16 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Mon, 12 Sep 2005 12:25:16 +0200 (CEST) Subject: [pypy-svn] r17492 - pypy/dist/pypy/doc Message-ID: <20050912102516.9D85027BBA@code1.codespeak.net> Author: mwh Date: Mon Sep 12 12:25:15 2005 New Revision: 17492 Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Log: some light proof reading, including a couple of .. rest comments and [bracketed comments]. Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt ============================================================================== --- pypy/dist/pypy/doc/draft-dynamic-language-translation.txt (original) +++ pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Mon Sep 12 12:25:15 2005 @@ -21,9 +21,9 @@ No Declarations -------------------------- -The notion declaration, central in compiled languages, is entierely missing in Python. There is no aspect of a program that must be declared; the complete program is built and run by executing statements. Some of these statements have a declarative look and feel; for example, some appear to be function or class declarations. Actually, they are merely statements that, when executed, build a function or class object and store a reference to that object at some place, under some name, where it can be retrieved from later. Units of programs -- modules, whose source is a file each -- are similarily mere objects in memory built on demand by some other module executing an ``import`` statement. Any such statement -- class construction or module import -- can be executed at any time during the execution of a program. +The notion of "declaration", central in compiled languages, is entirely missing in Python. There is no aspect of a program that must be declared; the complete program is built and run by executing statements. Some of these statements have a declarative look and feel; for example, some appear to be function or class declarations. 
Actually, they are merely statements that, when executed, build a function or class object and store a reference to that object at some place, under some name, from where it can be retrieved later. Units of programs -- modules, whose source is a file each -- are similarily mere objects in memory built on demand by some other module executing an ``import`` statement. Any such statement -- class construction or module import -- can be executed at any time during the execution of a program. -This point of view should help explain why an analysis of a program is theoretically impossible: there is no declared structure. The program could for example build a class in completely different ways based on the results of NP-complete computations or external factors. This is not just a theoretical possibility but a regularly used feature: for example, the pure Python module ``os.py`` provides some OS-independent interface to OS-specific system calls, by importing OS-specific modules and defining substitute functions as needed depending on the OS on which ``os.py`` turns out to be executed. Most large Python projects use custom import mechanisms to control exactly how and from where each module is loaded, simply by tampering with import hooks or just emulating parts of the ``import`` statement manually. +This point of view should help explain why an analysis of a program is theoretically impossible: there is no declared structure. The program could for example build a class in completely different ways based on the results of NP-complete computations or external factors. This is not just a theoretical possibility but a regularly used feature: for example, the pure Python module ``os.py`` provides some OS-independent interface to OS-specific system calls, by importing OS-specific modules and defining substitute functions as needed depending on the OS on which ``os.py`` turns out to be executed. Many large Python projects use custom import mechanisms to control exactly how and from where each module is loaded, simply by tampering with import hooks or just emulating parts of the ``import`` statement manually. In addition, there are of course classical (and only partially true) arguments against compiling dynamic languages (there is an ``eval`` function that can execute arbitrary code, and introspection can change anything at run-time), but we consider the argument outlined above as more fundamental to the nature of dynamic languages. @@ -31,11 +31,13 @@ Control flow versus data model --------------------------------- -Given the absence of declarations, the only preprocessing done on a Python module is the compilation of the source code into pseudo-code (bytecode). From there, the semantics can be roughly divided in two groups: the control flow semantics and the data model. In Python and other languages of its family, these two aspects are, to some extent, conceptually separated. Indeed, although it is possible, and common, to design languages in which the two aspects are more intricated, or one aspect is subsumed to the other (e.g. data structures in Lisp), programmers tend to separate the two concepts in common cases -- enough for the "practical-features-beats-obscure-primitives" language design guideline seen above. So in Python, both aspects are complex on their own. +Given the absence of declarations, the only preprocessing done on a Python module is the compilation of the source code into pseudo-code (bytecode). 
From there, the semantics can be roughly divided in two groups: the control flow semantics and the data model. In Python and other languages of its family, these two aspects are to some extent conceptually separated. Indeed, although it is possible -- and common -- to design languages in which the two aspects are more intricately connected, or one aspect is subsumed to the other (e.g. data structures in Lisp), programmers tend to separate the two concepts in common cases -- enough for the "practical-features-beats-obscure-primitives" language design guideline seen above. So in Python, both aspects are complex on their own. + +.. the above paragraph doesn't make a great deal of sense. some very long sentences! :) The control flow semantics include, clearly, all syntactic elements that influence the control flow of a program -- loops, function definitions and calls, etc. -- whereas the data model describes how the first-class objects manipulated by the program behave under some operations. There is a rich built-in set of object types in Python, and a rich set of operations on them, each corresponding to a syntactic element. Objects of different types react differently to the same operation, and the variables are not statically typed, which is also part of the dynamic nature of languages like Python -- operations are generally highly polymorphic and types are hard to infer in advance. -Note that control flow and object model are not entierely separated. It is not uncommon for some control flow aspects to be manipulable as first-class objects as well, e.g. functions in Python. Conversely, almost any operation on any object could lead to a user-defined function being called back. +Note that control flow and object model are not entirely separated. It is not uncommon for some control flow aspects to be manipulable as first-class objects as well, e.g. functions in Python. Conversely, almost any operation on any object could lead to a user-defined function being called back. The data model forms a so-called *Object Space* in PyPy. The bytecode interpreter works by delegating most operations to the object space, by invoking a well-defined abstract interface. The objects are regarded as "belonging" to the object space, where the interpreter sees them as black boxes on which it can ask for operations to be performed. @@ -47,17 +49,19 @@ How can we perform some static analysis on a program written in a dynamic language while keeping to the spirit of `No Declarations`_, i.e. without imposing that the program be written in a static way in which these declarative-looking statements would actually *be* declarations? -The approach of PyPy is, first of all, to perform analysis on live programs in memory instead of dead source files. This means that the program to analyse is first fully imported and initialized, and once it has reached a state that is deemed advanced enough, we limit the amount of dynamism that is *further* allowed and we analyse the program's objects in memory. In some sense, we use the full Python as a preprocessor for a subset of the language, called RPython, which differs from Python only in ruling out some operations like creating new classes. +The approach of PyPy is, first of all, to perform analysis on live programs in memory instead of dead source files. 
This means that the program to analyse is first fully imported and initialized, and once it has reached a state that is deemed advanced enough, we limit the amount of dynamism that is allowed *after this point* and we analyse the program's objects in memory. In some sense, we use the full Python as a preprocessor for a subset of the language, called RPython, which differs from Python only in ruling out some operations like creating new classes. + +More theoretically, analysing dead source files is equivalent to giving up all dynamism (in the sense of `No Declarations`_), but static analysis is still possible if we allow a *finite* amount of dynamism -- where an operation is considered dynamic or not depending on whether it is supported or not by the analysis we are performing. Of course, putting more cleverness in the tools helps too; but the point here is that we are still allowed to build dynamic programs, as long as they only ever build a bounded amount of, say, classes and functions. The source code of the PyPy interpreter, which is itself written in its [this?] style, also makes extensive use of the fact that it is possible to build new classes at any point in time, not just during an initialization phase, as long as this number of bounded (e.g. `interpreter/gateway.py`_ builds a custom class for each function that some variable can point to -- there is a finite number of functions in total, so this makes a finite number of extra classes). -More theoretically, analysing dead source files is equivalent to giving up all dynamism (in the sense of `No Declarations`_), but static analysis is still possible if we allow a *finite* amount of dynamism -- where an operation is considered dynamic or not depending on whether it is supported or not by the analysis we are performing. Of course, putting more cleverness in the tools helps too; but the point here is that we are still allowed to build dynamic programs, as long as they only ever build a bounded amount of, say, classes and functions. The source code of the PyPy interpreter, which is itself written in its style, also makes extensive use of the fact that it is possible to build new classes at any point in time, not just during an initialization phase, as long as this number of bounded (e.g. `interpreter/gateway.py`_ builds a custom class for each function that some variable can point to -- there is a finite number of functions in total, so this makes a finite number of extra classes). +.. the above paragraph is confusing too? -Note that this approach is natural in image-oriented environment like Smalltalk, where the program is, by default, live instead of in files. The Python experience forced us to allow some uncontrolled dynamism simply to be able to load the program to memory in the first place; once this was done, it was a mere (but very useful) side-effects that we could allow for some more uncontrolled dynamism at run-time, as opposed to analysing an image in a known frozen state. +Note that this approach is natural in image-oriented environment like Smalltalk, where the program is by default live instead of in files. The Python experience forced us to allow some uncontrolled dynamism simply to be able to load the program to memory in the first place; once this was done, it was a mere (but very useful) side-effect that we could allow for some more uncontrolled dynamism at run-time, as opposed to analysing an image in a known frozen state. 
Abstract interpretation ------------------------------ -The analysis we perform in PyPy is global program discovery (i.e. slicing it out of all the objects in memory) and type inference. The analysis of the non-dynamic parts themelves is based on their `abstract interpretation`_. The object space separation was also designed for this purpose. PyPy has an alternate object space called the `Flow Object Space`_, whose objects are empty placeholders. The over-simplified view is that to analyse a function, we bind its input arguments to such placeholders, and execute the function -- i.e. let the interpreter follow its bytecode and invoke the object space for each operations, one by one. The Flow object space records each operation when it is issued, and returns a new placeholder as a result. At the end, the list of recorded operations, along with the involved placeholders, gives an assembler-like view of what the function performs. +The analysis we perform in PyPy is global program discovery (i.e. slicing it out of all the objects in memory [what?]) and type inference. The analysis of the non-dynamic parts themselves is based on their `abstract interpretation`_. The object space separation was also designed for this purpose. PyPy has an alternate object space called the `Flow Object Space`_, whose objects are empty placeholders. The over-simplified view is that to analyse a function, we bind its input arguments to such placeholders, and execute the function -- i.e. let the interpreter follow its bytecode and invoke the object space for each operations, one by one. The Flow object space records each operation when it is issued, and returns a new placeholder as a result. At the end, the list of recorded operations, along with the involved placeholders, gives an assembler-like view of what the function performs. The global picture is then to run the program while switching between the flow object space for static enough functions, and a normal, concrete object space for functions or initializations requiring the full dynamism. From pedronis at codespeak.net Mon Sep 12 15:08:00 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 12 Sep 2005 15:08:00 +0200 (CEST) Subject: [pypy-svn] r17497 - in pypy/dist/pypy/interpreter/pyparser: . 
test Message-ID: <20050912130800.9331227BBE@code1.codespeak.net> Author: pedronis Date: Mon Sep 12 15:07:59 2005 New Revision: 17497 Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py Log: Module.doc should always be wrapped Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Mon Sep 12 15:07:59 2005 @@ -768,7 +768,7 @@ atom0 = ast.Pass() elif not isinstance(atom0, ast.Stmt): atom0 = ast.Stmt([atom0]) - builder.push(ast.Module(None, atom0)) + builder.push(ast.Module(builder.wrap_none(), atom0)) else: assert False, "Forbidden path" Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py Mon Sep 12 15:07:59 2005 @@ -399,6 +399,9 @@ ] docstrings = [ + '''def foo(): return 1''', + '''class Foo: pass''', + '''class Foo: "foo"''', '''def foo(): """foo docstring""" return 1 From pedronis at codespeak.net Mon Sep 12 16:06:33 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 12 Sep 2005 16:06:33 +0200 (CEST) Subject: [pypy-svn] r17499 - pypy/dist/pypy/rpython Message-ID: <20050912140633.E768B27BB9@code1.codespeak.net> Author: pedronis Date: Mon Sep 12 16:06:28 2005 New Revision: 17499 Modified: pypy/dist/pypy/rpython/lltype.py pypy/dist/pypy/rpython/rmodel.py Log: use the short name of the lltype name for the default Repr repr. stop-gap for now, will need a different approach in annlowlevel to have more control and some more precise __repr__ for Reprs here and there. 
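The idea behind the ``_short_name()`` hook is easier to see outside the rtyper: the full
``str()`` of a low-level type can be a long struct dump, while the default ``Repr``
representation only needs a compact tag. A rough sketch of the intent, using toy classes
rather than the real ``lltype``/``rmodel`` ones::

    class ToyStruct:
        def __str__(self):            # verbose form, possibly very long
            return 'GcStruct list { length: Signed, items: Array of Signed }'
        def _short_name(self):        # compact form, good enough for reprs
            return 'GcStruct list'

    class ToyPtr:
        def __init__(self, TO):
            self.TO = TO
        def _short_name(self):
            return 'Ptr %s' % (self.TO._short_name(),)

    class ToyRepr:
        lowleveltype = ToyPtr(ToyStruct())
        def __repr__(self):
            return '<%s %s>' % (self.__class__.__name__,
                                self.lowleveltype._short_name())

    assert repr(ToyRepr()) == '<ToyRepr Ptr GcStruct list>'
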
Modified: pypy/dist/pypy/rpython/lltype.py ============================================================================== --- pypy/dist/pypy/rpython/lltype.py (original) +++ pypy/dist/pypy/rpython/lltype.py Mon Sep 12 16:06:28 2005 @@ -70,6 +70,9 @@ def __str__(self): return self.__class__.__name__ + def _short_name(self): + return str(self) + def _defl(self, parent=None, parentindex=None): raise NotImplementedError @@ -172,6 +175,9 @@ return "%s %s { %s }" % (self.__class__.__name__, self._name, self._str_fields()) + def _short_name(self): + return "%s %s" % (self.__class__.__name__, self._name) + def _defl(self, parent=None, parentindex=None): return _struct(self, parent=parent, parentindex=parentindex) @@ -236,6 +242,10 @@ return "%s of %s " % (self.__class__.__name__, self._str_fields(),) + def _short_name(self): + return "%s of %s " % (self.__class__.__name__, + self.OF._short_name(),) + def _container_example(self): return _array(self, 1) @@ -260,6 +270,10 @@ args = ', '.join(map(str, self.ARGS)) return "Func ( %s ) -> %s" % (args, self.RESULT) + def _short_name(self): + args = ', '.join([ARG._short_name() for ARG in self.ARGS]) + return "Func ( %s ) -> %s" % (args, self.RESULT._short_name) + def _container_example(self): def ex(*args): return self.RESULT._defl() @@ -357,6 +371,10 @@ def __str__(self): return '* %s' % (self.TO, ) + + def _short_name(self): + return 'Ptr to %s' % (self.TO._short_name(), ) + def _defl(self, parent=None, parentindex=None): return _ptr(self, None) Modified: pypy/dist/pypy/rpython/rmodel.py ============================================================================== --- pypy/dist/pypy/rpython/rmodel.py (original) +++ pypy/dist/pypy/rpython/rmodel.py Mon Sep 12 16:06:28 2005 @@ -27,7 +27,7 @@ _initialized = setupstate.NOTINITIALIZED def __repr__(self): - return '<%s %s>' % (self.__class__.__name__, self.lowleveltype) + return '<%s %s>' % (self.__class__.__name__, self.lowleveltype._short_name()) def setup(self): """ call _setup_repr() and keep track of the initializiation From arigo at codespeak.net Mon Sep 12 16:09:12 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Mon, 12 Sep 2005 16:09:12 +0200 (CEST) Subject: [pypy-svn] r17500 - pypy/dist/pypy/translator Message-ID: <20050912140912.6B17227BB9@code1.codespeak.net> Author: arigo Date: Mon Sep 12 16:09:11 2005 New Revision: 17500 Modified: pypy/dist/pypy/translator/gensupp.py Log: Don't use more than 50 characters to build the C identifiers. Some C compilers don't like lines with 22322 characters. Modified: pypy/dist/pypy/translator/gensupp.py ============================================================================== --- pypy/dist/pypy/translator/gensupp.py (original) +++ pypy/dist/pypy/translator/gensupp.py Mon Sep 12 16:09:11 2005 @@ -101,7 +101,7 @@ self.seennames[name] = 1 def uniquename(self, basename, with_number=None, bare=False): - basename = basename.translate(C_IDENTIFIER) + basename = basename[:50].translate(C_IDENTIFIER) n = self.seennames.get(basename, 0) self.seennames[basename] = n+1 if with_number is None: From arigo at codespeak.net Mon Sep 12 16:46:32 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Mon, 12 Sep 2005 16:46:32 +0200 (CEST) Subject: [pypy-svn] r17502 - pypy/dist/pypy/translator Message-ID: <20050912144632.AA0FC27BB1@code1.codespeak.net> Author: arigo Date: Mon Sep 12 16:46:31 2005 New Revision: 17502 Modified: pypy/dist/pypy/translator/gensupp.py Log: Oups. Fix for the previous check-in. 
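Taken together, the two check-ins amount to: truncate the user-supplied base name once, but
never truncate the numbered variants generated to keep names unique, otherwise the counter
itself could be cut off again. A simplified standalone sketch of that scheme (the real
``uniquename`` in ``translator/gensupp.py`` also deals with prefixes, the ``with_number``
form and a ``bare`` mode)::

    import sys

    class ToyNameManager:
        def __init__(self):
            self.seennames = {}

        def uniquename(self, basename, lenmax=50):
            basename = basename[:lenmax]        # keep C identifiers short
            n = self.seennames.get(basename, 0)
            self.seennames[basename] = n + 1
            if n == 0:
                return basename
            # numbered variants must not be truncated again (the r17502 fix),
            # hence the unlimited lenmax on the recursive call
            return self.uniquename('%s_%d' % (basename, n), lenmax=sys.maxint)

    nm = ToyNameManager()
    assert nm.uniquename('x' * 200) == 'x' * 50
    assert nm.uniquename('x' * 200) == 'x' * 50 + '_1'
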
Modified: pypy/dist/pypy/translator/gensupp.py ============================================================================== --- pypy/dist/pypy/translator/gensupp.py (original) +++ pypy/dist/pypy/translator/gensupp.py Mon Sep 12 16:46:31 2005 @@ -100,8 +100,8 @@ raise NameError, "%s has already been seen!" self.seennames[name] = 1 - def uniquename(self, basename, with_number=None, bare=False): - basename = basename[:50].translate(C_IDENTIFIER) + def uniquename(self, basename, with_number=None, bare=False, lenmax=50): + basename = basename[:lenmax].translate(C_IDENTIFIER) n = self.seennames.get(basename, 0) self.seennames[basename] = n+1 if with_number is None: @@ -114,14 +114,16 @@ else: return self.global_prefix + newname else: - return self.uniquename('%s%d' % (basename, n), bare=bare) + return self.uniquename('%s%d' % (basename, n), bare=bare, + lenmax=sys.maxint) if n == 0: if bare: return basename, self.global_prefix + basename else: return self.global_prefix + basename else: - return self.uniquename('%s_%d' % (basename, n), bare=bare) + return self.uniquename('%s_%d' % (basename, n), bare=bare, + lenmax=sys.maxint) def localScope(self, parent=None): ret = _LocalScope(self, parent) From pedronis at codespeak.net Mon Sep 12 17:01:47 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 12 Sep 2005 17:01:47 +0200 (CEST) Subject: [pypy-svn] r17503 - pypy/dist/pypy/rpython Message-ID: <20050912150147.B3C4627BB6@code1.codespeak.net> Author: pedronis Date: Mon Sep 12 17:01:45 2005 New Revision: 17503 Modified: pypy/dist/pypy/rpython/annlowlevel.py pypy/dist/pypy/rpython/lltype.py pypy/dist/pypy/rpython/rclass.py pypy/dist/pypy/rpython/rdict.py pypy/dist/pypy/rpython/rlist.py pypy/dist/pypy/rpython/rmodel.py pypy/dist/pypy/rpython/rtuple.py Log: more ll helpers name generation tweaks Modified: pypy/dist/pypy/rpython/annlowlevel.py ============================================================================== --- pypy/dist/pypy/rpython/annlowlevel.py (original) +++ pypy/dist/pypy/rpython/annlowlevel.py Mon Sep 12 17:01:45 2005 @@ -29,7 +29,17 @@ def __hash__(self): return hash(self.val) def __str__(self): - return getattr(self.val, '__name__', repr(self.val)) + 'Const' + val = self.val + if isinstance(val, lltype.LowLevelType): + return val._short_name() + 'LlT' + s = getattr(val, '__name__', None) + if s is None: + compact = getattr(val, 'compact_repr', None) + if compact is None: + s = repr(s) + else: + s = compact() + return s + 'Const' class LowLevelAnnotatorPolicy(AnnotatorPolicy): allow_someobjects = False Modified: pypy/dist/pypy/rpython/lltype.py ============================================================================== --- pypy/dist/pypy/rpython/lltype.py (original) +++ pypy/dist/pypy/rpython/lltype.py Mon Sep 12 17:01:45 2005 @@ -243,8 +243,8 @@ self._str_fields(),) def _short_name(self): - return "%s of %s " % (self.__class__.__name__, - self.OF._short_name(),) + return "%s %s" % (self.__class__.__name__, + self.OF._short_name(),) def _container_example(self): return _array(self, 1) @@ -272,7 +272,7 @@ def _short_name(self): args = ', '.join([ARG._short_name() for ARG in self.ARGS]) - return "Func ( %s ) -> %s" % (args, self.RESULT._short_name) + return "Func(%s)->%s" % (args, self.RESULT._short_name) def _container_example(self): def ex(*args): @@ -373,7 +373,7 @@ return '* %s' % (self.TO, ) def _short_name(self): - return 'Ptr to %s' % (self.TO._short_name(), ) + return 'Ptr %s' % (self.TO._short_name(), ) def _defl(self, parent=None, parentindex=None): 
Modified: pypy/dist/pypy/rpython/rclass.py ============================================================================== --- pypy/dist/pypy/rpython/rclass.py (original) +++ pypy/dist/pypy/rpython/rclass.py Mon Sep 12 17:01:45 2005 @@ -115,6 +115,13 @@ cls = self.classdef.cls return '' % (cls.__module__, cls.__name__) + def compact_repr(self): + if self.classdef is None: + cls = object + else: + cls = self.classdef.cls + return 'ClassR %s.%s' % (cls.__module__, cls.__name__) + def _setup_repr(self): # NOTE: don't store mutable objects like the dicts below on 'self' # before they are fully built, to avoid strange bugs in case @@ -372,6 +379,13 @@ cls = self.classdef.cls return '' % (cls.__module__, cls.__name__) + def compact_repr(self): + if self.classdef is None: + cls = object + else: + cls = self.classdef.cls + return 'InstanceR %s.%s' % (cls.__module__, cls.__name__) + def _setup_repr(self): # NOTE: don't store mutable objects like the dicts below on 'self' # before they are fully built, to avoid strange bugs in case Modified: pypy/dist/pypy/rpython/rdict.py ============================================================================== --- pypy/dist/pypy/rpython/rdict.py (original) +++ pypy/dist/pypy/rpython/rdict.py Mon Sep 12 17:01:45 2005 @@ -82,6 +82,9 @@ self._custom_eq_hash_repr = custom_eq_hash # setup() needs to be called to finish this initialization + def compact_repr(self): + return 'DictR %s %s' % (self.key_repr.compact_repr(), self.value_repr.compact_repr()) + def _setup_repr(self): if 'key_repr' not in self.__dict__: self.key_repr = self._key_repr_computer() Modified: pypy/dist/pypy/rpython/rlist.py ============================================================================== --- pypy/dist/pypy/rpython/rlist.py (original) +++ pypy/dist/pypy/rpython/rlist.py Mon Sep 12 17:01:45 2005 @@ -69,6 +69,9 @@ self.LIST.become(GcStruct("list", ("length", Signed), ("items", Ptr(ITEMARRAY)))) + def compact_repr(self): + return 'ListR %s' % (self.item_repr.compact_repr(),) + def convert_const(self, listobj): # get object from bound list method #listobj = getattr(listobj, '__self__', listobj) Modified: pypy/dist/pypy/rpython/rmodel.py ============================================================================== --- pypy/dist/pypy/rpython/rmodel.py (original) +++ pypy/dist/pypy/rpython/rmodel.py Mon Sep 12 17:01:45 2005 @@ -27,7 +27,10 @@ _initialized = setupstate.NOTINITIALIZED def __repr__(self): - return '<%s %s>' % (self.__class__.__name__, self.lowleveltype._short_name()) + return '<%s %s>' % (self.__class__.__name__, self.lowleveltype) + + def compact_repr(self): + return '%s %s' % (self.__class__.__name__.replace('Repr','R'), self.lowleveltype._short_name()) def setup(self): """ call _setup_repr() and keep track of the initializiation Modified: pypy/dist/pypy/rpython/rtuple.py ============================================================================== --- pypy/dist/pypy/rpython/rtuple.py (original) +++ pypy/dist/pypy/rpython/rtuple.py Mon Sep 12 17:01:45 2005 @@ -36,6 +36,9 @@ fields = zip(self.fieldnames, self.lltypes) self.lowleveltype = Ptr(GcStruct('tuple%d' % len(items_r), *fields)) + def compact_repr(self): + return "TupleR %s" % ' '.join([llt._short_name() for llt in self.lltypes]) + def convert_const(self, value): assert isinstance(value, tuple) and len(value) == len(self.items_r) p = malloc(self.lowleveltype.TO) From arigo at codespeak.net Mon Sep 12 17:43:37 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Mon, 12 Sep 2005 17:43:37 +0200 (CEST) 
Subject: [pypy-svn] r17505 - in pypy/dist/pypy/translator: . test Message-ID: <20050912154337.2031827BB9@code1.codespeak.net> Author: arigo Date: Mon Sep 12 17:43:35 2005 New Revision: 17505 Modified: pypy/dist/pypy/translator/backendoptimization.py pypy/dist/pypy/translator/simplify.py pypy/dist/pypy/translator/test/test_backendoptimization.py Log: Integrated a simple malloc-removing algorithm. This version is based on http://codespeak.net/svn/user/arigo/hack/pypy-hack/remove_malloc2.py. Given a graph, it divides the vars in families, where two vars are in the same family if a value can go from one to the other across links (or same_as or cast_pointer operations); the family records all usages of its variables. Then we just use this information to see which families correspond to malloc/getfield/setfield only. The family ("LifeTimes") are probably useful for other kind of analysis as well, e.g. (samuele) to property detect what exceptions a graph can raise. Modified: pypy/dist/pypy/translator/backendoptimization.py ============================================================================== --- pypy/dist/pypy/translator/backendoptimization.py (original) +++ pypy/dist/pypy/translator/backendoptimization.py Mon Sep 12 17:43:35 2005 @@ -1,6 +1,8 @@ import autopath from pypy.translator.translator import Translator -from pypy.translator.simplify import eliminate_empty_blocks, join_blocks, remove_identical_vars +from pypy.translator.simplify import eliminate_empty_blocks, join_blocks +from pypy.translator.simplify import remove_identical_vars +from pypy.translator.simplify import transform_dead_op_vars from pypy.translator.unsimplify import copyvar, split_block from pypy.objspace.flow.model import Variable, Constant, Block, Link from pypy.objspace.flow.model import SpaceOperation, last_exception @@ -8,7 +10,7 @@ from pypy.annotation import model as annmodel from pypy.tool.unionfind import UnionFind from pypy.rpython.lltype import Void, Bool -from pypy.rpython import rmodel +from pypy.rpython import rmodel, lltype def remove_same_as(graph): """Remove all 'same_as' operations. @@ -354,12 +356,231 @@ join_blocks(graph) remove_identical_vars(graph) +# ____________________________________________________________ + +class Blocked(Exception): + pass + +class LifeTime: + + def __init__(self, (block, var)): + assert isinstance(var, Variable) + self.variables = {(block, var) : True} + self.creationpoints = {} # set of ("type of creation point", ...) + self.usepoints = {} # set of ("type of use point", ...) + + def update(self, other): + self.variables.update(other.variables) + self.creationpoints.update(other.creationpoints) + self.usepoints.update(other.usepoints) + + +def compute_lifetimes(graph): + """Compute the static data flow of the graph: returns a list of LifeTime + instances, each of which corresponds to a set of Variables from the graph. + The variables are grouped in the same LifeTime if a value can pass from + one to the other by following the links. Each LifeTime also records all + places where a Variable in the set is used (read) or build (created). 
+ """ + lifetimes = UnionFind(LifeTime) + + def set_creation_point(block, var, *cp): + _, _, info = lifetimes.find((block, var)) + info.creationpoints[cp] = True + + def set_use_point(block, var, *up): + _, _, info = lifetimes.find((block, var)) + info.usepoints[up] = True + + def union(block1, var1, block2, var2): + if isinstance(var1, Variable): + lifetimes.union((block1, var1), (block2, var2)) + elif isinstance(var1, Constant): + set_creation_point(block2, var2, "constant", var1) + else: + raise TypeError(var1) + + for var in graph.startblock.inputargs: + set_creation_point(graph.startblock, var, "inputargs") + set_use_point(graph.returnblock, graph.returnblock.inputargs[0], "return") + set_use_point(graph.exceptblock, graph.exceptblock.inputargs[0], "except") + set_use_point(graph.exceptblock, graph.exceptblock.inputargs[1], "except") + + def visit(node): + if isinstance(node, Block): + for op in node.operations: + if op.opname in ("same_as", "cast_pointer"): + # special-case these operations to identify their input + # and output variables + union(node, op.args[0], node, op.result) + else: + for i in range(len(op.args)): + if isinstance(op.args[i], Variable): + set_use_point(node, op.args[i], "op", node, op, i) + set_creation_point(node, op.result, "op", node, op) + + if isinstance(node, Link): + if isinstance(node.last_exception, Variable): + set_creation_point(node.prevblock, node.last_exception, + "last_exception") + if isinstance(node.last_exc_value, Variable): + set_creation_point(node.prevblock, node.last_exc_value, + "last_exc_value") + for i in range(len(node.args)): + union(node.prevblock, node.args[i], + node.target, node.target.inputargs[i]) + + traverse(visit, graph) + return lifetimes.infos() + +def _try_inline_malloc(info): + """Try to inline the mallocs creation and manipulation of the Variables + in the given LifeTime.""" + # the values must be only ever created by a "malloc" + lltypes = {} + for cp in info.creationpoints: + if cp[0] != "op": + return False + op = cp[2] + if op.opname != "malloc": + return False + lltypes[op.result.concretetype] = True + + # there must be a single largest malloced GcStruct; + # all variables can point to it or to initial substructures + if len(lltypes) != 1: + return False + STRUCT = lltypes.keys()[0].TO + assert isinstance(STRUCT, lltype.GcStruct) + + # must be only ever accessed via getfield/setfield + for up in info.usepoints: + if up[0] != "op": + return False + if (up[2].opname, up[3]) not in [("getfield", 0), ("setfield", 0)]: + return False + + # success: replace each variable with a family of variables (one per field) + example = STRUCT._container_example() + flatnames = [] + flatconstants = {} + def flatten(S, example): + start = 0 + if S._names and isinstance(S._flds[S._names[0]], lltype.GcStruct): + flatten(S._flds[S._names[0]], getattr(example, S._names[0])) + start = 1 + for name in S._names[start:]: + flatnames.append((S, name)) + constant = Constant(getattr(example, name)) + constant.concretetype = lltype.typeOf(constant.value) + flatconstants[S, name] = constant + flatten(STRUCT, example) + + pending = info.variables.keys() + for block, var in pending: + newvarsmap = {} + + def var_comes_from_outside(): + for key in flatnames: + newvar = Variable() + newvar.concretetype = flatconstants[key].concretetype + newvarsmap[key] = newvar + + def var_is_created_here(): + newvarsmap.update(flatconstants) + + def make_newvars(): + return [newvarsmap[key] for key in flatnames] + + if var in block.inputargs: + 
var_comes_from_outside() + i = block.inputargs.index(var) + block.inputargs = (block.inputargs[:i] + make_newvars() + + block.inputargs[i+1:]) + + assert block.operations != () + newops = [] + try: + for op in block.operations: + assert var not in op.args[1:] # should be the first arg only + if op.args and var == op.args[0]: + if op.opname == "getfield": + S = var.concretetype.TO + fldname = op.args[1].value + newop = SpaceOperation("same_as", + [newvarsmap[S, fldname]], + op.result) + newops.append(newop) + elif op.opname == "setfield": + S = var.concretetype.TO + fldname = op.args[1].value + assert (S, fldname) in newvarsmap + newvarsmap[S, fldname] = op.args[2] + elif op.opname in ("same_as", "cast_pointer"): + # temporary pseudo-operation, should be removed below + newop = SpaceOperation("_tmp_same_as", + make_newvars(), + op.result) + newops.append(newop) + else: + raise AssertionError, op.opname + elif var == op.result: + assert not newvarsmap + if op.opname == "malloc": + var_is_created_here() + elif op.opname in ("same_as", "cast_pointer"): + # in a 'v2=same_as(v1)', we must analyse v1 before + # we can analyse v2. If we get them in the wrong + # order we cancel and reschedule v2. + raise Blocked + elif op.opname == "_tmp_same_as": + # pseudo-operation just introduced by the code + # some lines above. + for key, v in zip(flatnames, op.args): + newvarsmap[key] = v + else: + raise AssertionError, op.opname + else: + newops.append(op) + except Blocked: + pending.append((block, var)) + continue + block.operations[:] = newops + + for link in block.exits: + while var in link.args: + i = link.args.index(var) + link.args = link.args[:i] + make_newvars() + link.args[i+1:] + + return True + +def remove_mallocs_once(graph): + """Perform one iteration of malloc removal.""" + lifetimes = compute_lifetimes(graph) + progress = False + for info in lifetimes: + if _try_inline_malloc(info): + progress = True + return progress + +def remove_simple_mallocs(graph): + """Iteratively remove (inline) the mallocs that can be simplified away.""" + done_something = False + while remove_mallocs_once(graph): + done_something = True + return done_something + +# ____________________________________________________________ + def backend_optimizations(graph): remove_same_as(graph) eliminate_empty_blocks(graph) checkgraph(graph) SSI_to_SSA(graph) - checkgraph(graph) + #checkgraph(graph) + if remove_simple_mallocs(graph): + transform_dead_op_vars(graph) # typical after malloc removal + checkgraph(graph) # ____________________________________________________________ Modified: pypy/dist/pypy/translator/simplify.py ============================================================================== --- pypy/dist/pypy/translator/simplify.py (original) +++ pypy/dist/pypy/translator/simplify.py Mon Sep 12 17:43:35 2005 @@ -341,7 +341,7 @@ pos neg nonzero abs hex oct ord invert add sub mul truediv floordiv div mod divmod pow lshift rshift and_ or_ xor int float long lt le eq ne gt ge cmp coerce contains - iter get '''.split(): + iter get same_as cast_pointer '''.split(): CanRemove[_op] = True del _op CanRemoveBuiltins = { Modified: pypy/dist/pypy/translator/test/test_backendoptimization.py ============================================================================== --- pypy/dist/pypy/translator/test/test_backendoptimization.py (original) +++ pypy/dist/pypy/translator/test/test_backendoptimization.py Mon Sep 12 17:43:35 2005 @@ -1,8 +1,9 @@ from pypy.translator.backendoptimization import remove_void, inline_function +from 
pypy.translator.backendoptimization import remove_simple_mallocs from pypy.translator.translator import Translator from pypy.rpython.lltype import Void from pypy.rpython.llinterp import LLInterpreter -from pypy.objspace.flow.model import checkgraph +from pypy.objspace.flow.model import checkgraph, flatten, Block from pypy.translator.test.snippet import simple_method, is_perfect_number from pypy.translator.llvm.log import log @@ -196,4 +197,54 @@ a.simplify() t.specialize() t.view() - + + +def check_malloc_removed(fn, signature, expected_remaining_mallocs): + t = Translator(fn) + t.annotate(signature) + t.specialize() + graph = t.getflowgraph() + remove_simple_mallocs(graph) + checkgraph(graph) + count = 0 + for node in flatten(graph): + if isinstance(node, Block): + for op in node.operations: + if op.opname == 'malloc': + count += 1 + assert count == expected_remaining_mallocs + +def test_remove_mallocs(): + def fn1(x, y): + s, d = x+y, x-y + return s*d + yield check_malloc_removed, fn1, [int, int], 0 + # + class T: + pass + def fn2(x, y): + t = T() + t.x = x + t.y = y + if x > 0: + return t.x + t.y + else: + return t.x - t.y + yield check_malloc_removed, fn2, [int, int], 0 + # + def fn3(x): + a, ((b, c), d, e) = x+1, ((x+2, x+3), x+4, x+5) + return a+b+c+d+e + yield check_malloc_removed, fn3, [int], 0 + # + class A: + pass + class B(A): + pass + def fn4(i): + a = A() + b = B() + a.b = b + b.i = i + return a.b.i + yield check_malloc_removed, fn4, [int], 0 From arigo at codespeak.net Mon Sep 12 17:44:47 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Mon, 12 Sep 2005 17:44:47 +0200 (CEST) Subject: [pypy-svn] r17506 - pypy/dist/pypy/translator/c/test Message-ID: <20050912154447.3B6B027BB9@code1.codespeak.net> Author: arigo Date: Mon Sep 12 17:44:46 2005 New Revision: 17506 Modified: pypy/dist/pypy/translator/c/test/test_backendoptimized.py Log: Test all backend optimizations here. Modified: pypy/dist/pypy/translator/c/test/test_backendoptimized.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_backendoptimized.py (original) +++ pypy/dist/pypy/translator/c/test/test_backendoptimized.py Mon Sep 12 17:44:46 2005 @@ -21,8 +21,7 @@ a.simplify() t.specialize() for graph in t.flowgraphs.values(): - backendoptimization.remove_same_as(graph) - backendoptimization.SSI_to_SSA(graph) + backendoptimization.backend_optimizations(graph) t.checkgraphs() return skip_missing_compiler(t.ccompile) From arigo at codespeak.net Mon Sep 12 17:47:06 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Mon, 12 Sep 2005 17:47:06 +0200 (CEST) Subject: [pypy-svn] r17507 - pypy/dist/pypy/doc Message-ID: <20050912154706.4923D27BB9@code1.codespeak.net> Author: arigo Date: Mon Sep 12 17:47:04 2005 New Revision: 17507 Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Log: Added a sentence... Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt ============================================================================== --- pypy/dist/pypy/doc/draft-dynamic-language-translation.txt (original) +++ pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Mon Sep 12 17:47:04 2005 @@ -65,7 +65,7 @@ The global picture is then to run the program while switching between the flow object space for static enough functions, and a normal, concrete object space for functions or initializations requiring the full dynamism. -If the placeholders are endowed with a bit more information, e.g. 
if they carry a type information that is propagated to resulting placeholders by individual operations, then our abstract interpretation simultaneously performs type inference. This is, in essence, executing the program while abstracting out some concrete values and replacing them with the set of all values that could actually be there. If the sets are broad enough, then after some time we will have seen all potential value sets along each possible code paths, and our program analysis is complete. +If the placeholders are endowed with a bit more information, e.g. if they carry a type information that is propagated to resulting placeholders by individual operations, then our abstract interpretation simultaneously performs type inference. This is, in essence, executing the program while abstracting out some concrete values and replacing them with the set of all values that could actually be there. If the sets are broad enough, then after some time we will have seen all potential value sets along each possible code paths, and our program analysis is complete. An object space is thus an *interpretation domain*; the Flow Object Space is an *abstract interpretation domain*. This is a theoretical point of view that differs significantly from what we have implemented, for many reasons. Of course, the devil is in the details -- which the rest of this paper is all about. From arigo at codespeak.net Mon Sep 12 17:48:34 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Mon, 12 Sep 2005 17:48:34 +0200 (CEST) Subject: [pypy-svn] r17508 - pypy/dist/pypy/doc Message-ID: <20050912154834.4463C27BB9@code1.codespeak.net> Author: arigo Date: Mon Sep 12 17:48:32 2005 New Revision: 17508 Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Log: This is only a reformatting, avoiding long single-line paragraphs. Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt ============================================================================== --- pypy/dist/pypy/doc/draft-dynamic-language-translation.txt (original) +++ pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Mon Sep 12 17:48:32 2005 @@ -9,65 +9,192 @@ Dynamic languages --------------------------- -Dynamic languages are definitely not new on the computing scene. However, new conditions like increased computing power and designs driven by larger communities have allowed the emergence of new aspects in the recent members of the family, or at least made them more practical than they previously were. The following aspects in particular are typical not only of Python but of most modern dynamic languages: - -* The driving force is not minimalistic elegance. It is a balance between elegance and practicality, and rather un-minimalistic -- the feature sets built into languages tend to be relatively large and growing (though it is still a major difference between languages where exactly they stand on this scale). - -* High abstractions and theoretically powerful low-level primitives are generally ruled out in favor of a larger number of features that try to cover the most common use cases. In this respect, one could even regard these languages as mere libraries on top of some simpler (unspecified) language. - -* Implementation-wise, language design is no longer driven by a desire to enable high performance; any feature straightforward enough to achieve with an interpreter is candidate. 
As a result, compilation and most kinds of static inference are made impossible due to this dynamism (unless they are simply tedious due to the size of the language). +Dynamic languages are definitely not new on the computing scene. +However, new conditions like increased computing power and designs driven +by larger communities have allowed the emergence of new aspects in the +recent members of the family, or at least made them more practical than +they previously were. The following aspects in particular are typical not +only of Python but of most modern dynamic languages: + +* The driving force is not minimalistic elegance. It is a balance between + elegance and practicality, and rather un-minimalistic -- the feature + sets built into languages tend to be relatively large and growing + (though it is still a major difference between languages where exactly + they stand on this scale). + +* High abstractions and theoretically powerful low-level primitives are + generally ruled out in favor of a larger number of features that try to + cover the most common use cases. In this respect, one could even regard + these languages as mere libraries on top of some simpler (unspecified) + language. + +* Implementation-wise, language design is no longer driven by a desire to + enable high performance; any feature straightforward enough to achieve + with an interpreter is candidate. As a result, compilation and most + kinds of static inference are made impossible due to this dynamism + (unless they are simply tedious due to the size of the language). No Declarations -------------------------- -The notion of "declaration", central in compiled languages, is entirely missing in Python. There is no aspect of a program that must be declared; the complete program is built and run by executing statements. Some of these statements have a declarative look and feel; for example, some appear to be function or class declarations. Actually, they are merely statements that, when executed, build a function or class object and store a reference to that object at some place, under some name, from where it can be retrieved later. Units of programs -- modules, whose source is a file each -- are similarily mere objects in memory built on demand by some other module executing an ``import`` statement. Any such statement -- class construction or module import -- can be executed at any time during the execution of a program. - -This point of view should help explain why an analysis of a program is theoretically impossible: there is no declared structure. The program could for example build a class in completely different ways based on the results of NP-complete computations or external factors. This is not just a theoretical possibility but a regularly used feature: for example, the pure Python module ``os.py`` provides some OS-independent interface to OS-specific system calls, by importing OS-specific modules and defining substitute functions as needed depending on the OS on which ``os.py`` turns out to be executed. Many large Python projects use custom import mechanisms to control exactly how and from where each module is loaded, simply by tampering with import hooks or just emulating parts of the ``import`` statement manually. 
- -In addition, there are of course classical (and only partially true) arguments against compiling dynamic languages (there is an ``eval`` function that can execute arbitrary code, and introspection can change anything at run-time), but we consider the argument outlined above as more fundamental to the nature of dynamic languages. +The notion of "declaration", central in compiled languages, is entirely +missing in Python. There is no aspect of a program that must be declared; +the complete program is built and run by executing statements. Some of +these statements have a declarative look and feel; for example, some +appear to be function or class declarations. Actually, they are merely +statements that, when executed, build a function or class object and store +a reference to that object at some place, under some name, from where it +can be retrieved later. Units of programs -- modules, whose source is a +file each -- are similarily mere objects in memory built on demand by some +other module executing an ``import`` statement. Any such statement -- +class construction or module import -- can be executed at any time during +the execution of a program. + +This point of view should help explain why an analysis of a program is +theoretically impossible: there is no declared structure. The program +could for example build a class in completely different ways based on the +results of NP-complete computations or external factors. This is not just +a theoretical possibility but a regularly used feature: for example, the +pure Python module ``os.py`` provides some OS-independent interface to +OS-specific system calls, by importing OS-specific modules and defining +substitute functions as needed depending on the OS on which ``os.py`` +turns out to be executed. Many large Python projects use custom import +mechanisms to control exactly how and from where each module is loaded, +simply by tampering with import hooks or just emulating parts of the +``import`` statement manually. + +In addition, there are of course classical (and only partially true) +arguments against compiling dynamic languages (there is an ``eval`` +function that can execute arbitrary code, and introspection can change +anything at run-time), but we consider the argument outlined above as more +fundamental to the nature of dynamic languages. Control flow versus data model --------------------------------- -Given the absence of declarations, the only preprocessing done on a Python module is the compilation of the source code into pseudo-code (bytecode). From there, the semantics can be roughly divided in two groups: the control flow semantics and the data model. In Python and other languages of its family, these two aspects are to some extent conceptually separated. Indeed, although it is possible -- and common -- to design languages in which the two aspects are more intricately connected, or one aspect is subsumed to the other (e.g. data structures in Lisp), programmers tend to separate the two concepts in common cases -- enough for the "practical-features-beats-obscure-primitives" language design guideline seen above. So in Python, both aspects are complex on their own. +Given the absence of declarations, the only preprocessing done on a Python +module is the compilation of the source code into pseudo-code (bytecode). +From there, the semantics can be roughly divided in two groups: the +control flow semantics and the data model. 
In Python and other languages +of its family, these two aspects are to some extent conceptually +separated. Indeed, although it is possible -- and common -- to design +languages in which the two aspects are more intricately connected, or one +aspect is subsumed to the other (e.g. data structures in Lisp), +programmers tend to separate the two concepts in common cases -- enough +for the "practical-features-beats-obscure-primitives" language design +guideline seen above. So in Python, both aspects are complex on their +own. .. the above paragraph doesn't make a great deal of sense. some very long sentences! :) -The control flow semantics include, clearly, all syntactic elements that influence the control flow of a program -- loops, function definitions and calls, etc. -- whereas the data model describes how the first-class objects manipulated by the program behave under some operations. There is a rich built-in set of object types in Python, and a rich set of operations on them, each corresponding to a syntactic element. Objects of different types react differently to the same operation, and the variables are not statically typed, which is also part of the dynamic nature of languages like Python -- operations are generally highly polymorphic and types are hard to infer in advance. - -Note that control flow and object model are not entirely separated. It is not uncommon for some control flow aspects to be manipulable as first-class objects as well, e.g. functions in Python. Conversely, almost any operation on any object could lead to a user-defined function being called back. +The control flow semantics include, clearly, all syntactic elements that +influence the control flow of a program -- loops, function definitions and +calls, etc. -- whereas the data model describes how the first-class +objects manipulated by the program behave under some operations. There is +a rich built-in set of object types in Python, and a rich set of +operations on them, each corresponding to a syntactic element. Objects of +different types react differently to the same operation, and the variables +are not statically typed, which is also part of the dynamic nature of +languages like Python -- operations are generally highly polymorphic and +types are hard to infer in advance. + +Note that control flow and object model are not entirely separated. It is +not uncommon for some control flow aspects to be manipulable as +first-class objects as well, e.g. functions in Python. Conversely, almost +any operation on any object could lead to a user-defined function being +called back. + +The data model forms a so-called *Object Space* in PyPy. The bytecode +interpreter works by delegating most operations to the object space, by +invoking a well-defined abstract interface. The objects are regarded as +"belonging" to the object space, where the interpreter sees them as black +boxes on which it can ask for operations to be performed. -The data model forms a so-called *Object Space* in PyPy. The bytecode interpreter works by delegating most operations to the object space, by invoking a well-defined abstract interface. The objects are regarded as "belonging" to the object space, where the interpreter sees them as black boxes on which it can ask for operations to be performed. - -Note that the term "object space" has already been reused for other dynamic language implementations, e.g. XXX for Perl 6. +Note that the term "object space" has already been reused for other +dynamic language implementations, e.g. XXX for Perl 6. 
The analysis of live programs ----------------------------------- -How can we perform some static analysis on a program written in a dynamic language while keeping to the spirit of `No Declarations`_, i.e. without imposing that the program be written in a static way in which these declarative-looking statements would actually *be* declarations? - -The approach of PyPy is, first of all, to perform analysis on live programs in memory instead of dead source files. This means that the program to analyse is first fully imported and initialized, and once it has reached a state that is deemed advanced enough, we limit the amount of dynamism that is allowed *after this point* and we analyse the program's objects in memory. In some sense, we use the full Python as a preprocessor for a subset of the language, called RPython, which differs from Python only in ruling out some operations like creating new classes. - -More theoretically, analysing dead source files is equivalent to giving up all dynamism (in the sense of `No Declarations`_), but static analysis is still possible if we allow a *finite* amount of dynamism -- where an operation is considered dynamic or not depending on whether it is supported or not by the analysis we are performing. Of course, putting more cleverness in the tools helps too; but the point here is that we are still allowed to build dynamic programs, as long as they only ever build a bounded amount of, say, classes and functions. The source code of the PyPy interpreter, which is itself written in its [this?] style, also makes extensive use of the fact that it is possible to build new classes at any point in time, not just during an initialization phase, as long as this number of bounded (e.g. `interpreter/gateway.py`_ builds a custom class for each function that some variable can point to -- there is a finite number of functions in total, so this makes a finite number of extra classes). +How can we perform some static analysis on a program written in a dynamic +language while keeping to the spirit of `No Declarations`_, i.e. without +imposing that the program be written in a static way in which these +declarative-looking statements would actually *be* declarations? + +The approach of PyPy is, first of all, to perform analysis on live +programs in memory instead of dead source files. This means that the +program to analyse is first fully imported and initialized, and once it +has reached a state that is deemed advanced enough, we limit the amount of +dynamism that is allowed *after this point* and we analyse the program's +objects in memory. In some sense, we use the full Python as a +preprocessor for a subset of the language, called RPython, which differs +from Python only in ruling out some operations like creating new classes. + +More theoretically, analysing dead source files is equivalent to giving up +all dynamism (in the sense of `No Declarations`_), but static analysis is +still possible if we allow a *finite* amount of dynamism -- where an +operation is considered dynamic or not depending on whether it is +supported or not by the analysis we are performing. Of course, putting +more cleverness in the tools helps too; but the point here is that we are +still allowed to build dynamic programs, as long as they only ever build a +bounded amount of, say, classes and functions. The source code of the +PyPy interpreter, which is itself written in its [this?] 
style, also makes +extensive use of the fact that it is possible to build new classes at any +point in time, not just during an initialization phase, as long as this +number of bounded (e.g. `interpreter/gateway.py`_ builds a custom class +for each function that some variable can point to -- there is a finite +number of functions in total, so this makes a finite number of extra +classes). .. the above paragraph is confusing too? -Note that this approach is natural in image-oriented environment like Smalltalk, where the program is by default live instead of in files. The Python experience forced us to allow some uncontrolled dynamism simply to be able to load the program to memory in the first place; once this was done, it was a mere (but very useful) side-effect that we could allow for some more uncontrolled dynamism at run-time, as opposed to analysing an image in a known frozen state. +Note that this approach is natural in image-oriented environment like +Smalltalk, where the program is by default live instead of in files. The +Python experience forced us to allow some uncontrolled dynamism simply to +be able to load the program to memory in the first place; once this was +done, it was a mere (but very useful) side-effect that we could allow for +some more uncontrolled dynamism at run-time, as opposed to analysing an +image in a known frozen state. Abstract interpretation ------------------------------ -The analysis we perform in PyPy is global program discovery (i.e. slicing it out of all the objects in memory [what?]) and type inference. The analysis of the non-dynamic parts themselves is based on their `abstract interpretation`_. The object space separation was also designed for this purpose. PyPy has an alternate object space called the `Flow Object Space`_, whose objects are empty placeholders. The over-simplified view is that to analyse a function, we bind its input arguments to such placeholders, and execute the function -- i.e. let the interpreter follow its bytecode and invoke the object space for each operations, one by one. The Flow object space records each operation when it is issued, and returns a new placeholder as a result. At the end, the list of recorded operations, along with the involved placeholders, gives an assembler-like view of what the function performs. - -The global picture is then to run the program while switching between the flow object space for static enough functions, and a normal, concrete object space for functions or initializations requiring the full dynamism. - -If the placeholders are endowed with a bit more information, e.g. if they carry a type information that is propagated to resulting placeholders by individual operations, then our abstract interpretation simultaneously performs type inference. This is, in essence, executing the program while abstracting out some concrete values and replacing them with the set of all values that could actually be there. If the sets are broad enough, then after some time we will have seen all potential value sets along each possible code paths, and our program analysis is complete. An object space is thus an *interpretation domain*; the Flow Object Space is an *abstract interpretation domain*. - -This is a theoretical point of view that differs significantly from what we have implemented, for many reasons. Of course, the devil is in the details -- which the rest of this paper is all about. +The analysis we perform in PyPy is global program discovery (i.e. 
slicing +it out of all the objects in memory [what?]) and type inference. The +analysis of the non-dynamic parts themselves is based on their `abstract +interpretation`_. The object space separation was also designed for this +purpose. PyPy has an alternate object space called the `Flow Object +Space`_, whose objects are empty placeholders. The over-simplified view +is that to analyse a function, we bind its input arguments to such +placeholders, and execute the function -- i.e. let the interpreter follow +its bytecode and invoke the object space for each operations, one by one. +The Flow object space records each operation when it is issued, and +returns a new placeholder as a result. At the end, the list of recorded +operations, along with the involved placeholders, gives an assembler-like +view of what the function performs. + +The global picture is then to run the program while switching between the +flow object space for static enough functions, and a normal, concrete +object space for functions or initializations requiring the full dynamism. + +If the placeholders are endowed with a bit more information, e.g. if they +carry a type information that is propagated to resulting placeholders by +individual operations, then our abstract interpretation simultaneously +performs type inference. This is, in essence, executing the program while +abstracting out some concrete values and replacing them with the set of +all values that could actually be there. If the sets are broad enough, +then after some time we will have seen all potential value sets along each +possible code paths, and our program analysis is complete. An object +space is thus an *interpretation domain*; the Flow Object Space is an +*abstract interpretation domain*. + +This is a theoretical point of view that differs significantly from what +we have implemented, for many reasons. Of course, the devil is in the +details -- which the rest of this paper is all about. Flow Object Space From arigo at codespeak.net Mon Sep 12 18:30:56 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Mon, 12 Sep 2005 18:30:56 +0200 (CEST) Subject: [pypy-svn] r17510 - in pypy/dist/pypy/translator: . test Message-ID: <20050912163056.B33CA27B6D@code1.codespeak.net> Author: arigo Date: Mon Sep 12 18:30:55 2005 New Revision: 17510 Modified: pypy/dist/pypy/translator/backendoptimization.py pypy/dist/pypy/translator/test/test_backendoptimization.py Log: Two typos in backendoptimization. Disabled a test, and added another disabled test, about supporting catching exceptions while inlining a graph that contains further direct_calls. 
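The "can't handle exceptions yet" limitation mentioned in this log is enforced by checking that the graph chosen for inlining performs no further direct_calls of its own. A minimal sketch of that kind of check, assuming the Block, Constant and traverse helpers from pypy.objspace.flow.model and an already-built flow graph (the helper name below is illustrative, not the exact function in the patch):

    from pypy.objspace.flow.model import Block, Constant, traverse

    def called_function_constants(graph):
        # Collect the Constant callees of every direct_call operation in
        # the graph; an empty result means the graph makes no further
        # calls and is safe to inline inside an exception-guarded call.
        funcs = {}
        def visit(node):
            if isinstance(node, Block):
                for op in node.operations:
                    if (op.opname == "direct_call" and
                            isinstance(op.args[0], Constant)):
                        funcs[op.args[0]] = True
        traverse(visit, graph)
        return funcs

    # Usage, roughly what the patch below checks with an assert:
    #     if called_function_constants(graph_to_inline):
    #         ...refuse to inline under an exception guard...
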
Modified: pypy/dist/pypy/translator/backendoptimization.py ============================================================================== --- pypy/dist/pypy/translator/backendoptimization.py (original) +++ pypy/dist/pypy/translator/backendoptimization.py Mon Sep 12 18:30:55 2005 @@ -159,6 +159,7 @@ for op in obj.operations: if op.opname == "direct_call": funcs[op.args[0]] = True + traverse(visit, graph) return funcs def inline_function(translator, inline_func, graph): @@ -195,14 +196,14 @@ return ops[-4].args[2].value def _inline_function(translator, graph, block, index_operation): + op = block.operations[index_operation] + graph_to_inline = translator.flowgraphs[op.args[0].value._obj._callable] exception_guarded = False if (block.exitswitch == Constant(last_exception) and index_operation == len(block.operations) - 1): exception_guarded = True - assert len(collect_called_functions(graph)) == 0, ( + assert len(collect_called_functions(graph_to_inline)) == 0, ( "can't handle exceptions yet") - op = block.operations[index_operation] - graph_to_inline = translator.flowgraphs[op.args[0].value._obj._callable] entrymap = mkentrymap(graph_to_inline) beforeblock = block afterblock = split_block(translator, graph, block, index_operation) Modified: pypy/dist/pypy/translator/test/test_backendoptimization.py ============================================================================== --- pypy/dist/pypy/translator/test/test_backendoptimization.py (original) +++ pypy/dist/pypy/translator/test/test_backendoptimization.py Mon Sep 12 18:30:55 2005 @@ -155,7 +155,9 @@ result = interp.eval_function(g, [42]) assert result == 1 -def test_inline_var_exception(): +def DONOTtest_inline_var_exception(): + # this test is disabled for now, because f() contains a direct_call + # (at the end, to a ll helper, to get the type of the exception object) def f(x): e = None if x == 0: @@ -184,7 +186,32 @@ assert result == 3 result = interp.eval_function(g, [42]) assert result == 1 - + +def DONOTtest_call_call(): + # for reference. Just remove this test if we decide not to support + # catching exceptions while inlining a graph that contains further + # direct_calls. 
+ def e(x): + if x < 0: + raise KeyError + return x+1 + def f(x): + return e(x)+2 + def g(x): + try: + return f(x)+3 + except KeyError: + return -1 + t = Translator(g) + a = t.annotate([int]) + a.simplify() + t.specialize() + inline_function(t, f, t.flowgraphs[g]) + interp = LLInterpreter(t.flowgraphs, t.rtyper) + result = interp.eval_function(g, [100]) + assert result == 106 + result = interp.eval_function(g, [-100]) + assert result == -1 def DONOTtest_for_loop(): def f(x): From ludal at codespeak.net Mon Sep 12 23:25:52 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Mon, 12 Sep 2005 23:25:52 +0200 (CEST) Subject: [pypy-svn] r17513 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050912212552.0DA4927BA1@code1.codespeak.net> Author: ludal Date: Mon Sep 12 23:25:51 2005 New Revision: 17513 Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py Log: more annotation/rtyping fixes Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pyassem.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pyassem.py Mon Sep 12 23:25:51 2005 @@ -327,9 +327,9 @@ def dfs_postorder(b, seen): """Depth-first search of tree rooted at b, return in postorder""" order = [] - seen[b] = b + seen[b.bid] = b for c in b.get_children(): - if c in seen: + if c.bid in seen: continue order = order + dfs_postorder(c, seen) order.append(b) @@ -775,7 +775,7 @@ else: print "x",opname, t.getArg() if not t.has_arg: - lnotab.addCode(self.opnum[opname]) + lnotab.addCode1(self.opnum[opname]) else: assert isinstance(t, InstrInt) oparg = t.intval @@ -784,7 +784,7 @@ continue hi, lo = twobyte(oparg) try: - lnotab.addCode(self.opnum[opname], lo, hi) + lnotab.addCode3(self.opnum[opname], lo, hi) except ValueError: if self._debug: print opname, oparg @@ -882,10 +882,15 @@ self.lastoff = 0 self.lnotab = [] - def addCode(self, *args): - for arg in args: - self.code.append(chr(arg)) - self.codeOffset = self.codeOffset + len(args) + def addCode1(self, op ): + self.code.append(chr(op)) + self.codeOffset = self.codeOffset + 1 + + def addCode3(self, op, hi, lo): + self.code.append(chr(op)) + self.code.append(chr(hi)) + self.code.append(chr(lo)) + self.codeOffset = self.codeOffset + 3 def nextLine(self, lineno): if self.firstline == 0: From pedronis at codespeak.net Tue Sep 13 00:06:01 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 13 Sep 2005 00:06:01 +0200 (CEST) Subject: [pypy-svn] r17514 - in pypy/dist/pypy: annotation rpython rpython/test Message-ID: <20050912220601.70FC927BA4@code1.codespeak.net> Author: pedronis Date: Tue Sep 13 00:05:58 2005 New Revision: 17514 Modified: pypy/dist/pypy/annotation/dictdef.py pypy/dist/pypy/rpython/rclass.py pypy/dist/pypy/rpython/test/test_rdict.py Log: enable using instances as dict key using identity Modified: pypy/dist/pypy/annotation/dictdef.py ============================================================================== --- pypy/dist/pypy/annotation/dictdef.py (original) +++ pypy/dist/pypy/annotation/dictdef.py Tue Sep 13 00:05:58 2005 @@ -1,11 +1,16 @@ from pypy.annotation.model import SomeObject, SomeImpossibleValue from pypy.annotation.model import SomeInteger, SomeBool, unionof +from pypy.annotation.model import SomeInstance from pypy.annotation.listdef import ListItem class DictKey(ListItem): custom_eq_hash = False + def __init__(self, bookkeeper, s_value): + ListItem.__init__(self, bookkeeper, s_value) + 
self.enable_hashing() + def patch(self): for dictdef in self.itemof: dictdef.dictkey = self @@ -20,8 +25,14 @@ other.s_rdict_hashfn, other=other) + def enable_hashing(self): + if isinstance(self.s_value, SomeInstance): + self.bookkeeper.needs_hash_support[self.s_value.classdef.cls] = True + def generalize(self, s_other_value): updated = ListItem.generalize(self, s_other_value) + if updated: + self.enable_hashing() if updated and self.custom_eq_hash: self.emulate_rdict_calls() return updated Modified: pypy/dist/pypy/rpython/rclass.py ============================================================================== --- pypy/dist/pypy/rpython/rclass.py (original) +++ pypy/dist/pypy/rpython/rclass.py Tue Sep 13 00:05:58 2005 @@ -472,7 +472,22 @@ return result def get_ll_eq_function(self): - return None + return ll_inst_eq + + def get_ll_hash_function(self): + if self.classdef is None: + return None + if self.rtyper.needs_hash_support( self.classdef.cls): + try: + return self._ll_hash_function + except AttributeError: + INSPTR = self.lowleveltype + def _ll_hash_function(ins): + return ll_inst_hash(cast_pointer(INSPTR, ins)) + self._ll_hash_function = _ll_hash_function + return _ll_hash_function + else: + return self.rbase.get_ll_hash_function() def initialize_prebuilt_instance(self, value, classdef, result): if self.classdef is not None: @@ -671,6 +686,7 @@ v = rpair.rtype_eq(hop) return hop.genop("bool_not", [v], resulttype=Bool) + def ll_both_none(ins1, ins2): return not ins1 and not ins2 @@ -722,6 +738,9 @@ cached = ins.hash_cache = id(ins) return cached +def ll_inst_eq(ins1, ins2): + return ins1 == ins2 + def ll_inst_type(obj): if obj: return obj.typeptr Modified: pypy/dist/pypy/rpython/test/test_rdict.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rdict.py (original) +++ pypy/dist/pypy/rpython/test/test_rdict.py Tue Sep 13 00:05:58 2005 @@ -382,3 +382,16 @@ print 'current dict length:', referencelength assert l_dict.num_items == referencelength complete_check() + +def test_id_instances_keys(): + class A: + pass + def f(): + a = A() + b = A() + d = {} + d[a] = 1 + d[b] = 7 + return len(d) + d[a] + d[b] + res = interpret(f, []) + assert res == 10 From pedronis at codespeak.net Tue Sep 13 00:24:01 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 13 Sep 2005 00:24:01 +0200 (CEST) Subject: [pypy-svn] r17515 - pypy/dist/pypy/rpython/test Message-ID: <20050912222401.B9CC827B69@code1.codespeak.net> Author: pedronis Date: Tue Sep 13 00:24:00 2005 New Revision: 17515 Modified: pypy/dist/pypy/rpython/test/test_rdict.py Log: richer test Modified: pypy/dist/pypy/rpython/test/test_rdict.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rdict.py (original) +++ pypy/dist/pypy/rpython/test/test_rdict.py Tue Sep 13 00:24:00 2005 @@ -386,12 +386,14 @@ def test_id_instances_keys(): class A: pass + class B(A): + pass def f(): a = A() - b = A() + b = B() d = {} - d[a] = 1 d[b] = 7 + d[a] = 3 return len(d) + d[a] + d[b] res = interpret(f, []) - assert res == 10 + assert res == 12 From tismer at codespeak.net Tue Sep 13 03:19:56 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Tue, 13 Sep 2005 03:19:56 +0200 (CEST) Subject: [pypy-svn] r17516 - in pypy/dist/pypy/rpython: . 
test Message-ID: <20050913011956.ED2CC27B9A@code1.codespeak.net> Author: tismer Date: Tue Sep 13 03:19:53 2005 New Revision: 17516 Modified: pypy/dist/pypy/rpython/rdict.py pypy/dist/pypy/rpython/test/test_rdict.py Log: re-added the hash fieldto the rdict implementation. This is a waste for many simple types, but so are other fields, and we need to re-optimize, anyway. Remarkable is the dramatic effect on performance: abs.richards abs.pystone rel.richards rel.pystone pypy-c-17439: 40929 ms 637.274 47.8 56.6 pypy-c-17512: 46105 ms 658.1 53.9 54.8 pypy-current: 33937 ms 698.415 39.6 51.7 python2.3.3: 856 ms 36081.6 1.0 1.0 richards naturally benefits much more from faster dict lookup, because it is all about accessing instance variables. All measurements were done on a windows notebook with 512 MB. Threading disabled, -boehm -t-lowmem Modified: pypy/dist/pypy/rpython/rdict.py ============================================================================== --- pypy/dist/pypy/rpython/rdict.py (original) +++ pypy/dist/pypy/rpython/rdict.py Tue Sep 13 03:19:53 2005 @@ -23,6 +23,7 @@ # bool valid; # to mark if the entry is filled # bool everused; # to mark if the entry is or has ever been filled # DICTVALUE value; +# int hash; # } # # struct dicttable { @@ -95,6 +96,7 @@ self.DICTVALUE = self.value_repr.lowleveltype self.DICTENTRY = lltype.Struct("dictentry", ("key", self.DICTKEY), + ("hash", lltype.Signed), ("valid", lltype.Bool), ("everused", lltype.Bool), ("value", self.DICTVALUE)) @@ -305,7 +307,12 @@ entry.value = value if entry.valid: return + if dictrepr.custom_eq_hash: + hash = hlinvoke(dictrepr.r_rdict_hashfn, d.fnkeyhash, key) + else: + hash = dictrepr.ll_keyhash(key) entry.key = key + entry.hash = hash entry.valid = True d.num_items += 1 if not entry.everused: @@ -348,6 +355,7 @@ if entry.valid: new_entry = ll_dict_lookup(d, entry.key, dictrepr) new_entry.key = entry.key + new_entry.hash = entry.hash new_entry.value = entry.value new_entry.valid = True new_entry.everused = True @@ -370,16 +378,17 @@ checkingkey = entry.key if checkingkey == key: return entry # found the entry - if dictrepr.custom_eq_hash: - res = hlinvoke(dictrepr.r_rdict_eqfn, d.fnkeyeq, checkingkey, key) - if (entries != d.entries or - not entry.valid or entry.key != checkingkey): - # the compare did major nasty stuff to the dict: start over - return ll_dict_lookup(d, key, dictrepr) - else: - res = dictrepr.ll_keyeq is not None and dictrepr.ll_keyeq(checkingkey, key) - if res: - return entry # found the entry + if entry.hash == hash: + if dictrepr.custom_eq_hash: + res = hlinvoke(dictrepr.r_rdict_eqfn, d.fnkeyeq, checkingkey, key) + if (entries != d.entries or + not entry.valid or entry.key != checkingkey): + # the compare did major nasty stuff to the dict: start over + return ll_dict_lookup(d, key, dictrepr) + else: + res = dictrepr.ll_keyeq is not None and dictrepr.ll_keyeq(checkingkey, key) + if res: + return entry # found the entry freeslot = lltype.nullptr(lltype.typeOf(entry).TO) elif entry.everused: freeslot = entry @@ -398,16 +407,17 @@ checkingkey = entry.key if checkingkey == key: return entry - if dictrepr.custom_eq_hash: - res = hlinvoke(dictrepr.r_rdict_eqfn, d.fnkeyeq, checkingkey, key) - if (entries != d.entries or - not entry.valid or entry.key != checkingkey): - # the compare did major nasty stuff to the dict: start over - return ll_dict_lookup(d, key, dictrepr) - else: - res = dictrepr.ll_keyeq is not None and dictrepr.ll_keyeq(checkingkey, key) - if res: - return entry + if entry.hash == hash: + if 
dictrepr.custom_eq_hash: + res = hlinvoke(dictrepr.r_rdict_eqfn, d.fnkeyeq, checkingkey, key) + if (entries != d.entries or + not entry.valid or entry.key != checkingkey): + # the compare did major nasty stuff to the dict: start over + return ll_dict_lookup(d, key, dictrepr) + else: + res = dictrepr.ll_keyeq is not None and dictrepr.ll_keyeq(checkingkey, key) + if res: + return entry elif not freeslot: freeslot = entry perturb >>= PERTURB_SHIFT @@ -536,6 +546,7 @@ d_entry = d.entries[i] entry = dict.entries[i] d_entry.key = entry.key + d_entry.hash = entry.hash d_entry.value = entry.value d_entry.valid = entry.valid d_entry.everused = entry.everused Modified: pypy/dist/pypy/rpython/test/test_rdict.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rdict.py (original) +++ pypy/dist/pypy/rpython/test/test_rdict.py Tue Sep 13 03:19:53 2005 @@ -161,6 +161,20 @@ res = interpret(func, [1]) assert len(res.entries) == rdict.DICT_INITSIZE +def test_dict_valid_resize(): + # see if we find our keys after resize + def func(): + d = {} + # fill it up + for i in range(10): + d[str(i)] = 0 + # delete again + for i in range(10): + del d[str(i)] + res = 0 + # if it does not crash, we are fine. It crashes if you forget the hash field. + interpret(func, []) + def test_dict_iteration(): def func(i, j): d = {} From ericvrp at codespeak.net Tue Sep 13 10:37:36 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Tue, 13 Sep 2005 10:37:36 +0200 (CEST) Subject: [pypy-svn] r17517 - in pypy/dist/pypy/translator/llvm: . module Message-ID: <20050913083736.168C327B9C@code1.codespeak.net> Author: ericvrp Date: Tue Sep 13 10:37:35 2005 New Revision: 17517 Modified: pypy/dist/pypy/translator/llvm/codewriter.py pypy/dist/pypy/translator/llvm/module/support.py pypy/dist/pypy/translator/llvm/opwriter.py Log: update to be compatible with latest list changes Modified: pypy/dist/pypy/translator/llvm/codewriter.py ============================================================================== --- pypy/dist/pypy/translator/llvm/codewriter.py (original) +++ pypy/dist/pypy/translator/llvm/codewriter.py Tue Sep 13 10:37:35 2005 @@ -111,16 +111,20 @@ # "tail" even if they do not occur before a ret instruction. 
def call(self, targetvar, returntype, functionref, argrefs, argtypes, tail=DEFAULT_TAIL, cconv=DEFAULT_CCONV): if cconv is not 'fastcc': - tail = '' + tail_ = '' + else: + tail_ = tail + ' ' arglist = ["%s %s" % item for item in zip(argtypes, argrefs)] - self.indent("%s = %s call %s %s %s(%s)" % (targetvar, tail, cconv, returntype, functionref, + self.indent("%s = %scall %s %s %s(%s)" % (targetvar, tail_, cconv, returntype, functionref, ", ".join(arglist))) def call_void(self, functionref, argrefs, argtypes, tail=DEFAULT_TAIL, cconv=DEFAULT_CCONV): if cconv is not 'fastcc': - tail = '' + tail_ = '' + else: + tail_ = tail + ' ' arglist = ["%s %s" % item for item in zip(argtypes, argrefs)] - self.indent("%s call %s void %s(%s)" % (tail, cconv, functionref, ", ".join(arglist))) + self.indent("%scall %s void %s(%s)" % (tail_, cconv, functionref, ", ".join(arglist))) def invoke(self, targetvar, returntype, functionref, argrefs, argtypes, label, except_label, cconv=DEFAULT_CCONV): arglist = ["%s %s" % item for item in zip(argtypes, argrefs)] Modified: pypy/dist/pypy/translator/llvm/module/support.py ============================================================================== --- pypy/dist/pypy/translator/llvm/module/support.py (original) +++ pypy/dist/pypy/translator/llvm/module/support.py Tue Sep 13 10:37:35 2005 @@ -264,7 +264,7 @@ extfunctions["%main"] = [(), """ int %main(int %argc, sbyte** %argv) { entry: - %pypy_argv = call fastcc %RPyListOfString* %pypy_ll_newlist__listPtrConst_Signed.2(int 0) + %pypy_argv = call fastcc %RPyListOfString* %pypy_ll_newlist__Ptr_GcStruct_listLlT_Signed.2(int 0) br label %no_exit no_exit: Modified: pypy/dist/pypy/translator/llvm/opwriter.py ============================================================================== --- pypy/dist/pypy/translator/llvm/opwriter.py (original) +++ pypy/dist/pypy/translator/llvm/opwriter.py Tue Sep 13 10:37:35 2005 @@ -327,8 +327,13 @@ block_label = self.node.block_to_name[self.block] exc_label = block_label + '_exception_handling' + + + + + if returntype != "void": - if self.db.is_function_ptr(op.result): + if self.db.is_function_ptr(op.result): #use longhand form returntype = "%s (%s)*" % (returntype, ", ".join(argtypes)) self.codewriter.invoke(targetvar, returntype, functionref, argrefs, argtypes, none_label, exc_label) @@ -403,6 +408,13 @@ self.codewriter.br_uncond(target) + + + + + + + def malloc(self, op): arg_type = op.args[0].value targetvar = self.db.repr_arg(op.result) From arigo at codespeak.net Tue Sep 13 11:43:30 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 13 Sep 2005 11:43:30 +0200 (CEST) Subject: [pypy-svn] r17518 - pypy/dist/pypy/translator/test Message-ID: <20050913094330.1116827BB9@code1.codespeak.net> Author: arigo Date: Tue Sep 13 11:43:29 2005 New Revision: 17518 Modified: pypy/dist/pypy/translator/test/test_backendoptimization.py Log: A test showing a crash in inline_function(). 
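The crash is demonstrated with the same harness the other tests in this file use: annotate a small RPython function, specialize it to low-level graphs, apply the transformation under test, then evaluate the result with the LLInterpreter. A condensed sketch of that pattern (the helper name and sample functions are illustrative; the Translator and LLInterpreter calls mirror the ones visible in these diffs):

    from pypy.translator.translator import Translator
    from pypy.translator.backendoptimization import inline_function
    from pypy.rpython.llinterp import LLInterpreter

    def f(x):                 # sample callee
        return x + 1

    def g(x):                 # sample caller
        return f(x) * 2

    def run_with_inlining(caller, callee, args):
        t = Translator(caller)
        a = t.annotate([int] * len(args))   # type inference on the entry point
        a.simplify()
        t.specialize()                      # RTyping: build low-level graphs
        inline_function(t, callee, t.flowgraphs[caller])
        interp = LLInterpreter(t.flowgraphs, t.rtyper)
        return interp.eval_function(caller, args)

    # run_with_inlining(g, f, [20]) should still compute 42 after f has
    # been inlined into g's graph.
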
Modified: pypy/dist/pypy/translator/test/test_backendoptimization.py ============================================================================== --- pypy/dist/pypy/translator/test/test_backendoptimization.py (original) +++ pypy/dist/pypy/translator/test/test_backendoptimization.py Tue Sep 13 11:43:29 2005 @@ -213,7 +213,7 @@ result = interp.eval_function(g, [-100]) assert result == -1 -def DONOTtest_for_loop(): +def FAILING_test_for_loop(): def f(x): result = 0 for i in range(0, x): @@ -223,7 +223,14 @@ a = t.annotate([int]) a.simplify() t.specialize() + for graph in t.flowgraphs.values(): + if graph.name.startswith('ll_rangenext'): + break + inline_function(t, graph, t.flowgraphs[f]) t.view() + interp = LLInterpreter(t.flowgraphs, t.rtyper) + result = interp.eval_function(g, [10]) + assert result == 45 def check_malloc_removed(fn, signature, expected_remaining_mallocs): From ericvrp at codespeak.net Tue Sep 13 11:49:41 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Tue, 13 Sep 2005 11:49:41 +0200 (CEST) Subject: [pypy-svn] r17520 - in pypy/dist/pypy/translator: . llvm Message-ID: <20050913094941.5094827BB9@code1.codespeak.net> Author: ericvrp Date: Tue Sep 13 11:49:40 2005 New Revision: 17520 Modified: pypy/dist/pypy/translator/backendoptimization.py pypy/dist/pypy/translator/llvm/build_llvm_module.py pypy/dist/pypy/translator/llvm/funcnode.py Log: * changed gcc optimization flags, becuase -O3 crashes on some machines (snake...) * using more backendoptimizations, but SSI_to_SSA is not working for LLVM backend, so made optz optional. Hoping to use remove_simple_mallocs and inlining. Modified: pypy/dist/pypy/translator/backendoptimization.py ============================================================================== --- pypy/dist/pypy/translator/backendoptimization.py (original) +++ pypy/dist/pypy/translator/backendoptimization.py Tue Sep 13 11:49:40 2005 @@ -573,13 +573,19 @@ # ____________________________________________________________ -def backend_optimizations(graph): - remove_same_as(graph) - eliminate_empty_blocks(graph) +def backend_optimizations(graph, opt_remove_same_as=True, + opt_SSI_to_SSA=True, + opt_eliminate_empty_blocks=True, + opt_remove_simple_mallocs=True): + if opt_remove_same_as: + remove_same_as(graph) + if opt_eliminate_empty_blocks: + eliminate_empty_blocks(graph) checkgraph(graph) - SSI_to_SSA(graph) + if opt_SSI_to_SSA: + SSI_to_SSA(graph) #checkgraph(graph) - if remove_simple_mallocs(graph): + if opt_remove_simple_mallocs and remove_simple_mallocs(graph): transform_dead_op_vars(graph) # typical after malloc removal checkgraph(graph) Modified: pypy/dist/pypy/translator/llvm/build_llvm_module.py ============================================================================== --- pypy/dist/pypy/translator/llvm/build_llvm_module.py (original) +++ pypy/dist/pypy/translator/llvm/build_llvm_module.py Tue Sep 13 11:49:40 2005 @@ -95,7 +95,7 @@ cmds.append("llc %s %s.bc -march=c -f -o %s.c" % (genllvm.exceptionpolicy.llc_options(), b, b)) if exe_name: #XXX TODO: use CFLAGS when available - cmds.append("gcc %s.c -c -O3 -pipe" % (b,)) + cmds.append("gcc %s.c -c -O2 -fomit-frame-pointer -pipe" % (b,)) cmds.append("gcc %s.o %s -lm -ldl -pipe -o %s" % (b, gc_libs, exe_name)) source_files.append("%s.c" % b) Modified: pypy/dist/pypy/translator/llvm/funcnode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/funcnode.py (original) +++ pypy/dist/pypy/translator/llvm/funcnode.py Tue Sep 13 
11:49:40 2005 @@ -2,7 +2,7 @@ from pypy.objspace.flow.model import Block, Constant, Variable, Link from pypy.objspace.flow.model import flatten, mkentrymap, traverse, last_exception from pypy.rpython import lltype -from pypy.translator.backendoptimization import remove_same_as +from pypy.translator.backendoptimization import backend_optimizations from pypy.translator.unsimplify import remove_double_links from pypy.translator.llvm.node import LLVMNode, ConstantLLVMNode from pypy.translator.llvm.opwriter import OpWriter @@ -38,7 +38,7 @@ self.value = value self.ref = self.make_ref('%pypy_', value.graph.name) self.graph = value.graph - remove_same_as(self.graph) + backend_optimizations(self.graph, opt_SSI_to_SSA=False) remove_double_links(self.db.translator, self.graph) def __str__(self): From arigo at codespeak.net Tue Sep 13 11:51:41 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 13 Sep 2005 11:51:41 +0200 (CEST) Subject: [pypy-svn] r17521 - pypy/dist/pypy/translator/test Message-ID: <20050913095141.7180527BBC@code1.codespeak.net> Author: arigo Date: Tue Sep 13 11:51:40 2005 New Revision: 17521 Modified: pypy/dist/pypy/translator/test/test_backendoptimization.py Log: Isolated the failure to this single change: passing variables across exception-catching links. Modified: pypy/dist/pypy/translator/test/test_backendoptimization.py ============================================================================== --- pypy/dist/pypy/translator/test/test_backendoptimization.py (original) +++ pypy/dist/pypy/translator/test/test_backendoptimization.py Tue Sep 13 11:51:40 2005 @@ -140,7 +140,7 @@ except ValueError: return 2 except KeyError: - return 3 + return x+2 return 1 t = Translator(g) a = t.annotate([int]) From arigo at codespeak.net Tue Sep 13 12:19:58 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 13 Sep 2005 12:19:58 +0200 (CEST) Subject: [pypy-svn] r17522 - in pypy/dist/pypy/translator: . test Message-ID: <20050913101958.3536E27BB9@code1.codespeak.net> Author: arigo Date: Tue Sep 13 12:19:57 2005 New Revision: 17522 Modified: pypy/dist/pypy/translator/backendoptimization.py pypy/dist/pypy/translator/test/test_backendoptimization.py Log: Fix for the crash. REVIEW: is the other call to find_args_in_exceptional_case() also wrong in the same way? 
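Part of the fix below is to let inline_function() accept either a Python callable or an already-built flow graph, which is what allows test_for_loop (re-enabled in the same patch) to pass the ll_rangenext graph it fishes out of t.flowgraphs. The callsite test reduces to a small dispatch on the low-level function object; attribute names follow the patch, the helper name is illustrative:

    def op_calls_target(op, inline_func):
        # `op` is assumed to be a direct_call whose first argument is a
        # Constant low-level function pointer; `inline_func` may be either
        # the original Python callable or its flow graph (handy for
        # low-level helpers such as ll_rangenext_*, which are most easily
        # located as graphs in translator.flowgraphs).
        funcobj = op.args[0].value._obj
        return (getattr(funcobj, 'graph', None) is inline_func or
                getattr(funcobj, '_callable', None) is inline_func)
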
Modified: pypy/dist/pypy/translator/backendoptimization.py ============================================================================== --- pypy/dist/pypy/translator/backendoptimization.py (original) +++ pypy/dist/pypy/translator/backendoptimization.py Tue Sep 13 12:19:57 2005 @@ -163,6 +163,7 @@ return funcs def inline_function(translator, inline_func, graph): + count = 0 callsites = [] def find_callsites(block): if isinstance(block, Block): @@ -170,7 +171,10 @@ if not (op.opname == "direct_call" and isinstance(op.args[0], Constant)): continue - if op.args[0].value._obj._callable is inline_func: + funcobj = op.args[0].value._obj + # accept a function or a graph as 'inline_func' + if (getattr(funcobj, 'graph', None) is inline_func or + getattr(funcobj, '_callable', None) is inline_func): callsites.append((block, i)) traverse(find_callsites, graph) while callsites != []: @@ -179,6 +183,8 @@ callsites = [] traverse(find_callsites, graph) checkgraph(graph) + count += 1 + return count def _find_exception_type(block): #XXX slightly brittle: find the exception type for simple cases @@ -197,13 +203,13 @@ def _inline_function(translator, graph, block, index_operation): op = block.operations[index_operation] - graph_to_inline = translator.flowgraphs[op.args[0].value._obj._callable] + graph_to_inline = op.args[0].value._obj.graph exception_guarded = False if (block.exitswitch == Constant(last_exception) and index_operation == len(block.operations) - 1): exception_guarded = True - assert len(collect_called_functions(graph_to_inline)) == 0, ( - "can't handle exceptions yet") + if len(collect_called_functions(graph_to_inline)) != 0: + raise NotImplementedError("can't handle exceptions yet") entrymap = mkentrymap(graph_to_inline) beforeblock = block afterblock = split_block(translator, graph, block, index_operation) @@ -306,7 +312,7 @@ copiedblock = copied_blocks[link.prevblock] copiedlink = copiedblock.exits[0] eclass = _find_exception_type(copiedblock) - print copiedblock.operations + #print copiedblock.operations if eclass is None: continue etype = copiedlink.args[0] @@ -315,7 +321,7 @@ if exc_match.value(eclass, exceptionlink.llexitcase): copiedlink.target = exceptionlink.target linkargs = find_args_in_exceptional_case(exceptionlink, - copiedblock, + link.prevblock, etype, evalue) copiedlink.args = linkargs break Modified: pypy/dist/pypy/translator/test/test_backendoptimization.py ============================================================================== --- pypy/dist/pypy/translator/test/test_backendoptimization.py (original) +++ pypy/dist/pypy/translator/test/test_backendoptimization.py Tue Sep 13 12:19:57 2005 @@ -213,7 +213,7 @@ result = interp.eval_function(g, [-100]) assert result == -1 -def FAILING_test_for_loop(): +def test_for_loop(): def f(x): result = 0 for i in range(0, x): @@ -226,10 +226,11 @@ for graph in t.flowgraphs.values(): if graph.name.startswith('ll_rangenext'): break + else: + assert 0, "cannot find ll_rangenext_*() function" inline_function(t, graph, t.flowgraphs[f]) - t.view() interp = LLInterpreter(t.flowgraphs, t.rtyper) - result = interp.eval_function(g, [10]) + result = interp.eval_function(f, [10]) assert result == 45 From ac at codespeak.net Tue Sep 13 13:39:42 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Tue, 13 Sep 2005 13:39:42 +0200 (CEST) Subject: [pypy-svn] r17524 - in pypy/dist/pypy/interpreter: astcompiler test Message-ID: <20050913113942.9C83027BA6@code1.codespeak.net> Author: ac Date: Tue Sep 13 13:39:41 2005 New Revision: 17524 Modified: 
pypy/dist/pypy/interpreter/astcompiler/pycodegen.py pypy/dist/pypy/interpreter/test/test_compiler.py Log: Dissallow unqualified exec in a nested function. Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Tue Sep 13 13:39:41 2005 @@ -912,6 +912,8 @@ } def visitExec(self, node): + if self.scope.nested and node.locals is None and node.globals is None: + raise SyntaxError('unqualified exec is not allowed in a nested function') node.expr.accept( self ) if node.locals is None: self.emitop_obj('LOAD_CONST', self.space.w_None) Modified: pypy/dist/pypy/interpreter/test/test_compiler.py ============================================================================== --- pypy/dist/pypy/interpreter/test/test_compiler.py (original) +++ pypy/dist/pypy/interpreter/test/test_compiler.py Tue Sep 13 13:39:41 2005 @@ -193,9 +193,6 @@ def test_scope_unoptimized_clash1_b(self): py.test.skip("INPROGESS") - def test_scope_exec_in_nested(self): - py.test.skip("INPROGESS") - def test_scope_importstar_in_nested(self): py.test.skip("INPROGESS") From pedronis at codespeak.net Tue Sep 13 14:08:21 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 13 Sep 2005 14:08:21 +0200 (CEST) Subject: [pypy-svn] r17525 - in pypy/dist/pypy: annotation translator/test Message-ID: <20050913120821.820C927BA6@code1.codespeak.net> Author: pedronis Date: Tue Sep 13 14:08:19 2005 New Revision: 17525 Modified: pypy/dist/pypy/annotation/binaryop.py pypy/dist/pypy/translator/test/test_annrpython.py Log: proper union for iterators Modified: pypy/dist/pypy/annotation/binaryop.py ============================================================================== --- pypy/dist/pypy/annotation/binaryop.py (original) +++ pypy/dist/pypy/annotation/binaryop.py Tue Sep 13 14:08:19 2005 @@ -506,7 +506,10 @@ class __extend__(pairtype(SomeIterator, SomeIterator)): def union((iter1, iter2)): - return SomeIterator(unionof(iter1.s_container, iter2.s_container)) + s_cont = unionof(iter1.s_container, iter2.s_container) + if iter1.variant != iter2.variant: + raise UnionError("merging incompatible iterators variants") + return SomeIterator(s_cont, *iter1.variant) class __extend__(pairtype(SomeBuiltin, SomeBuiltin)): Modified: pypy/dist/pypy/translator/test/test_annrpython.py ============================================================================== --- pypy/dist/pypy/translator/test/test_annrpython.py (original) +++ pypy/dist/pypy/translator/test/test_annrpython.py Tue Sep 13 14:08:19 2005 @@ -1703,6 +1703,19 @@ for t in memo: assert t + def test_iterator_union(self): + def it(d): + return d.iteritems() + d0 = {1:2} + def f(): + it(d0) + return it({1:2}) + a = self.RPythonAnnotator() + s = a.build_types(f, []) + assert isinstance(s, annmodel.SomeIterator) + assert s.variant == ('items',) + + def g(n): return [0,1,2,n] From arigo at codespeak.net Tue Sep 13 14:17:19 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 13 Sep 2005 14:17:19 +0200 (CEST) Subject: [pypy-svn] r17526 - in pypy/dist/pypy/translator/backend_opt: . 
test Message-ID: <20050913121719.2EAC127BA6@code1.codespeak.net> Author: arigo Date: Tue Sep 13 14:17:16 2005 New Revision: 17526 Added: pypy/dist/pypy/translator/backend_opt/ pypy/dist/pypy/translator/backend_opt/__init__.py pypy/dist/pypy/translator/backend_opt/inline.py - copied, changed from r17523, pypy/dist/pypy/translator/backendoptimization.py pypy/dist/pypy/translator/backend_opt/malloc.py - copied, changed from r17523, pypy/dist/pypy/translator/backendoptimization.py pypy/dist/pypy/translator/backend_opt/matfunc.py - copied unchanged from r17525, user/arigo/hack/pypy-hack/matfunc.py pypy/dist/pypy/translator/backend_opt/remove_no_ops.py - copied, changed from r17523, pypy/dist/pypy/translator/backendoptimization.py pypy/dist/pypy/translator/backend_opt/ssa.py - copied, changed from r17523, pypy/dist/pypy/translator/backendoptimization.py pypy/dist/pypy/translator/backend_opt/test/ pypy/dist/pypy/translator/backend_opt/test/__init__.py - copied unchanged from r17524, pypy/dist/pypy/translator/test/__init__.py pypy/dist/pypy/translator/backend_opt/test/test_inline.py - copied unchanged from r17524, pypy/dist/pypy/translator/test/test_backendoptimization.py pypy/dist/pypy/translator/backend_opt/test/test_malloc.py - copied unchanged from r17524, pypy/dist/pypy/translator/test/test_backendoptimization.py pypy/dist/pypy/translator/backend_opt/test/test_remove_no_ops.py - copied unchanged from r17524, pypy/dist/pypy/translator/test/test_backendoptimization.py pypy/dist/pypy/translator/backend_opt/test/test_ssa.py - copied unchanged from r17524, pypy/dist/pypy/translator/test/test_backendoptimization.py Log: (In-progress) putting the back-end optimizations in their own subdir. Added: pypy/dist/pypy/translator/backend_opt/__init__.py ============================================================================== From tismer at codespeak.net Tue Sep 13 15:01:37 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Tue, 13 Sep 2005 15:01:37 +0200 (CEST) Subject: [pypy-svn] r17527 - pypy/dist/pypy/module/marshal Message-ID: <20050913130137.2FF2F27BA6@code1.codespeak.net> Author: tismer Date: Tue Sep 13 15:01:36 2005 New Revision: 17527 Removed: pypy/dist/pypy/module/marshal/stackless_snippets.py Log: trashed these musings. Will write a general solution, instead From arigo at codespeak.net Tue Sep 13 16:49:16 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 13 Sep 2005 16:49:16 +0200 (CEST) Subject: [pypy-svn] r17528 - in pypy/dist/pypy/translator: . 
backend_opt backendopt backendopt/test c/test llvm test Message-ID: <20050913144916.5913827BA6@code1.codespeak.net> Author: arigo Date: Tue Sep 13 16:49:13 2005 New Revision: 17528 Added: pypy/dist/pypy/translator/backendopt/ (props changed) - copied from r17526, pypy/dist/pypy/translator/backend_opt/ pypy/dist/pypy/translator/backendopt/all.py (contents, props changed) pypy/dist/pypy/translator/backendopt/removenoops.py - copied, changed from r17526, pypy/dist/pypy/translator/backend_opt/remove_no_ops.py pypy/dist/pypy/translator/backendopt/test/test_all.py (contents, props changed) pypy/dist/pypy/translator/backendopt/test/test_removenoops.py - copied, changed from r17526, pypy/dist/pypy/translator/backend_opt/test/test_remove_no_ops.py Removed: pypy/dist/pypy/translator/backend_opt/ pypy/dist/pypy/translator/backendopt/remove_no_ops.py pypy/dist/pypy/translator/backendopt/test/test_remove_no_ops.py pypy/dist/pypy/translator/backendoptimization.py pypy/dist/pypy/translator/test/test_backendoptimization.py Modified: pypy/dist/pypy/translator/backendopt/__init__.py (props changed) pypy/dist/pypy/translator/backendopt/inline.py pypy/dist/pypy/translator/backendopt/malloc.py pypy/dist/pypy/translator/backendopt/matfunc.py (props changed) pypy/dist/pypy/translator/backendopt/ssa.py pypy/dist/pypy/translator/backendopt/test/ (props changed) pypy/dist/pypy/translator/backendopt/test/test_inline.py pypy/dist/pypy/translator/backendopt/test/test_malloc.py pypy/dist/pypy/translator/backendopt/test/test_ssa.py pypy/dist/pypy/translator/c/test/test_backendoptimized.py pypy/dist/pypy/translator/geninterplevel.py pypy/dist/pypy/translator/llvm/funcnode.py pypy/dist/pypy/translator/translator.py Log: * Removed underscores from directory and file names. * Finished sorting the tests in their own backendopt/test/test_* files. * Changed outside code to use backendopt/ instead of backendoptimization.py, which is deleted. * Fixed some bugs here and there, there are more bugs open, both in inlining and malloc removal... * llvm seems quite broken at the moment, sorry if I caused that. I will check this too. 
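For orientation before the diffs: the new package is meant to be driven through a single
entry point, backend_optimizations() in backendopt/all.py, which Translator.backend_optimizations()
now forwards to. A minimal usage sketch (not part of the commit; the snippet function and the
argument values are invented)::

    from pypy.translator.translator import Translator

    def snippet(x, y):
        # any small RPython-friendly function will do; this one is made up
        return (x + y) * (x - y)

    t = Translator(snippet)
    t.annotate([int, int])
    t.specialize()
    # forwards to pypy.translator.backendopt.all.backend_optimizations()
    t.backend_optimizations(inline_threshold=1, ssa_form=True)
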
Added: pypy/dist/pypy/translator/backendopt/all.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/backendopt/all.py Tue Sep 13 16:49:13 2005 @@ -0,0 +1,30 @@ +from pypy.objspace.flow.model import checkgraph +from pypy.translator.backendopt.removenoops import remove_same_as +from pypy.translator.backendopt.inline import auto_inlining +from pypy.translator.backendopt.malloc import remove_simple_mallocs +from pypy.translator.backendopt.ssa import SSI_to_SSA +from pypy.translator import simplify + + +def backend_optimizations(translator, inline_threshold=1, ssa_form=True): + # remove obvious no-ops + for graph in translator.flowgraphs.values(): + remove_same_as(graph) + simplify.eliminate_empty_blocks(graph) + + # inline functions around + if inline_threshold and 0: # XXX in progress + auto_inlining(translator, inline_threshold) + + # vaporize mallocs + # XXX in progress + for graph in []:# translator.flowgraphs.values(): + if remove_simple_mallocs(graph): + # remove typical leftovers from malloc removal + remove_same_as(graph) + simplify.eliminate_empty_blocks(graph) + simplify.transform_dead_op_vars(graph) + + if ssa_form: + for graph in translator.flowgraphs.values(): + SSI_to_SSA(graph) Modified: pypy/dist/pypy/translator/backendopt/inline.py ============================================================================== --- pypy/dist/pypy/translator/backend_opt/inline.py (original) +++ pypy/dist/pypy/translator/backendopt/inline.py Tue Sep 13 16:49:13 2005 @@ -1,16 +1,20 @@ -##from pypy.translator.translator import Translator -##from pypy.translator.simplify import eliminate_empty_blocks, join_blocks -##from pypy.translator.simplify import remove_identical_vars -##from pypy.translator.simplify import transform_dead_op_vars -##from pypy.translator.unsimplify import copyvar, split_block -##from pypy.objspace.flow.model import Variable, Constant, Block, Link -##from pypy.objspace.flow.model import SpaceOperation, last_exception -##from pypy.objspace.flow.model import traverse, mkentrymap, checkgraph -##from pypy.annotation import model as annmodel -##from pypy.tool.unionfind import UnionFind -##from pypy.rpython.lltype import Void, Bool -##from pypy.rpython import rmodel, lltype -from pypy.translator.backend_opt import matfunc +import sys +from pypy.translator.simplify import eliminate_empty_blocks, join_blocks +from pypy.translator.simplify import remove_identical_vars +from pypy.translator.unsimplify import copyvar, split_block +from pypy.objspace.flow.model import Variable, Constant, Block, Link +from pypy.objspace.flow.model import SpaceOperation, last_exception +from pypy.objspace.flow.model import traverse, mkentrymap, checkgraph, flatten +from pypy.annotation import model as annmodel +from pypy.rpython.lltype import Bool +from pypy.rpython import rmodel +from pypy.translator.backendopt import matfunc + +BASE_INLINE_THRESHOLD = 17.0 # just enough to inline ll_rangeiter_next() + +class CannotInline(Exception): + pass + def collect_called_functions(graph): funcs = {} @@ -23,28 +27,34 @@ traverse(visit, graph) return funcs -def inline_function(translator, inline_func, graph): - count = 0 +def find_callsites(graph, calling_what): callsites = [] - def find_callsites(block): + def visit(block): if isinstance(block, Block): for i, op in enumerate(block.operations): if not (op.opname == "direct_call" and isinstance(op.args[0], Constant)): continue funcobj = op.args[0].value._obj + graph = getattr(funcobj, 'graph', 
None) # accept a function or a graph as 'inline_func' - if (getattr(funcobj, 'graph', None) is inline_func or - getattr(funcobj, '_callable', None) is inline_func): - callsites.append((block, i)) - traverse(find_callsites, graph) + if (graph is calling_what or + getattr(funcobj, '_callable', None) is calling_what): + callsites.append((graph, block, i)) + traverse(visit, graph) + return callsites + +def inline_function(translator, inline_func, graph): + count = 0 + callsites = find_callsites(graph, inline_func) while callsites != []: - block, index_operation = callsites.pop() + subgraph, block, index_operation = callsites.pop() + if find_callsites(subgraph, subgraph): + raise CannotInline("inlining a recursive function") _inline_function(translator, graph, block, index_operation) - callsites = [] - traverse(find_callsites, graph) checkgraph(graph) count += 1 + callsites = find_callsites(graph, inline_func) return count def _find_exception_type(block): @@ -70,7 +80,7 @@ index_operation == len(block.operations) - 1): exception_guarded = True if len(collect_called_functions(graph_to_inline)) != 0: - raise NotImplementedError("can't handle exceptions yet") + raise CannotInline("can't handle exceptions yet") entrymap = mkentrymap(graph_to_inline) beforeblock = block afterblock = split_block(translator, graph, block, index_operation) @@ -269,7 +279,7 @@ static_instruction_count(graph)) -def static_callers(translator): +def static_callers(translator, ignore_primitives=False): result = [] def build_call_graph(node): if isinstance(node, Block): @@ -279,27 +289,33 @@ funcobj = op.args[0].value._obj graph = getattr(funcobj, 'graph', None) if graph is not None: + if ignore_primitives: + if getattr(getattr(funcobj, '_callable', None), + 'suggested_primitive', False): + continue result.append((parentgraph, graph)) for parentgraph in translator.flowgraphs.itervalues(): traverse(build_call_graph, parentgraph) return result -def auto_inlining(translator, threshold=20): - from heapq import heappop, heapreplace +def auto_inlining(translator, threshold=1): + from heapq import heappush, heappop, heapreplace + threshold *= BASE_INLINE_THRESHOLD callers = {} # {graph: {graphs-that-call-it}} callees = {} # {graph: {graphs-that-it-calls}} - for graph1, graph2 in static_callers(translator): + for graph1, graph2 in static_callers(translator, ignore_primitives=True): callers.setdefault(graph2, {})[graph1] = True callees.setdefault(graph1, {})[graph2] = True fiboheap = [(0.0, graph) for graph in callers] valid_weight = {} + couldnt_inline = {} while fiboheap: weight, graph = fiboheap[0] if not valid_weight.get(graph): weight = inlining_heuristic(graph) - print ' + cost %7.2f %50s' % (weight, graph.name) + #print ' + cost %7.2f %50s' % (weight, graph.name) heapreplace(fiboheap, (weight, graph)) valid_weight[graph] = True continue @@ -312,13 +328,24 @@ for parentgraph in callers[graph]: if parentgraph == graph: continue - print '\t\t-> in %s' % parentgraph.name + print '\t\t-> in %s...' 
% parentgraph.name, + sys.stdout.flush() try: - if backendoptimization.inline_function(translator, graph, - parentgraph): - valid_weight[parentgraph] = False - for graph2 in callees.get(graph, {}): - callees[parentgraph][graph2] = True - callers[graph2][parentgraph] = True - except NotImplementedError: - pass + res = bool(inline_function(translator, graph, parentgraph)) + except CannotInline: + couldnt_inline[graph] = True + res = CannotInline + print res + if res is True: + # the parentgraph should now contain all calls that were + # done by 'graph' + for graph2 in callees.get(graph, {}): + callees[parentgraph][graph2] = True + callers[graph2][parentgraph] = True + if parentgraph in couldnt_inline: + # the parentgraph was previously uninlinable, but it has + # been modified. Maybe now we can inline it into further + # parents? + del couldnt_inline[parentgraph] + heappush(fiboheap, (0.0, parentgraph)) + valid_weight[parentgraph] = False Modified: pypy/dist/pypy/translator/backendopt/malloc.py ============================================================================== --- pypy/dist/pypy/translator/backend_opt/malloc.py (original) +++ pypy/dist/pypy/translator/backendopt/malloc.py Tue Sep 13 16:49:13 2005 @@ -1,15 +1,7 @@ -##from pypy.translator.translator import Translator -##from pypy.translator.simplify import eliminate_empty_blocks, join_blocks -##from pypy.translator.simplify import remove_identical_vars -##from pypy.translator.simplify import transform_dead_op_vars -##from pypy.translator.unsimplify import copyvar, split_block -##from pypy.objspace.flow.model import Variable, Constant, Block, Link -##from pypy.objspace.flow.model import SpaceOperation, last_exception -##from pypy.objspace.flow.model import traverse, mkentrymap, checkgraph -##from pypy.annotation import model as annmodel -##from pypy.tool.unionfind import UnionFind -##from pypy.rpython.lltype import Void, Bool -##from pypy.rpython import rmodel, lltype +from pypy.objspace.flow.model import Variable, Constant, Block, Link +from pypy.objspace.flow.model import SpaceOperation, traverse +from pypy.tool.unionfind import UnionFind +from pypy.rpython import lltype class Blocked(Exception): pass Modified: pypy/dist/pypy/translator/backendopt/ssa.py ============================================================================== --- pypy/dist/pypy/translator/backend_opt/ssa.py (original) +++ pypy/dist/pypy/translator/backendopt/ssa.py Tue Sep 13 16:49:13 2005 @@ -1,15 +1,5 @@ -##from pypy.translator.translator import Translator -##from pypy.translator.simplify import eliminate_empty_blocks, join_blocks -##from pypy.translator.simplify import remove_identical_vars -##from pypy.translator.simplify import transform_dead_op_vars -##from pypy.translator.unsimplify import copyvar, split_block -##from pypy.objspace.flow.model import Variable, Constant, Block, Link -##from pypy.objspace.flow.model import SpaceOperation, last_exception -##from pypy.objspace.flow.model import traverse, mkentrymap, checkgraph -##from pypy.annotation import model as annmodel -##from pypy.tool.unionfind import UnionFind -##from pypy.rpython.lltype import Void, Bool -##from pypy.rpython import rmodel, lltype +from pypy.objspace.flow.model import Variable, mkentrymap +from pypy.tool.unionfind import UnionFind def SSI_to_SSA(graph): """Rename the variables in a flow graph as much as possible without Added: pypy/dist/pypy/translator/backendopt/test/test_all.py ============================================================================== --- (empty file) +++ 
pypy/dist/pypy/translator/backendopt/test/test_all.py Tue Sep 13 16:49:13 2005 @@ -0,0 +1,54 @@ +import py +from pypy.translator.backendopt.all import backend_optimizations +from pypy.translator.backendopt.test.test_malloc import check_malloc_removed +from pypy.translator.translator import Translator +from pypy.objspace.flow.model import Constant +from pypy.rpython.llinterp import LLInterpreter + + +class A: + def __init__(self, x, y): + self.bounds = (x, y) + def mean(self, percentage=50): + x, y = self.bounds + total = x*percentage + y*(100-percentage) + return total//100 + +def condition(n): + return n >= 100 + +def firstthat(function, condition): + for n in range(101): + if condition(function(n)): + return n + else: + return -1 + +def myfunction(n): + a = A(117, n) + return a.mean() + +def big(): + """This example should be turned into a simple 'while' loop with no + malloc nor direct_call by back-end optimizations, given a high enough + inlining threshold. + """ + return firstthat(myfunction, condition) + + +def test_big(): + py.test.skip("in progress") + assert big() == 83 + + t = Translator(big) + t.annotate([]) + t.specialize() + backend_optimizations(t, inline_threshold=100) + + t.view() + graph = t.getflowgraph() + check_malloc_removed(graph) + + interp = LLInterpreter(t.flowgraphs, t.rtyper) + res = interp.eval_function(big, []) + assert res == 83 Modified: pypy/dist/pypy/translator/backendopt/test/test_inline.py ============================================================================== --- pypy/dist/pypy/translator/backend_opt/test/test_inline.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_inline.py Tue Sep 13 16:49:13 2005 @@ -1,47 +1,10 @@ -from pypy.translator.backendoptimization import remove_void, inline_function -from pypy.translator.backendoptimization import remove_simple_mallocs +import py +from pypy.translator.backendopt.inline import inline_function, CannotInline +from pypy.translator.backendopt.inline import auto_inlining +from pypy.translator.backendopt.inline import collect_called_functions from pypy.translator.translator import Translator -from pypy.rpython.lltype import Void from pypy.rpython.llinterp import LLInterpreter -from pypy.objspace.flow.model import checkgraph, flatten, Block -from pypy.translator.test.snippet import simple_method, is_perfect_number -from pypy.translator.llvm.log import log - -import py -log = py.log.Producer('test_backendoptimization') - -def annotate_and_remove_void(f, annotate): - t = Translator(f) - a = t.annotate(annotate) - t.specialize() - remove_void(t) - return t - -def test_remove_void_args(): - def f(i): - return [1,2,3,i][i] - t = annotate_and_remove_void(f, [int]) - for func, graph in t.flowgraphs.iteritems(): - assert checkgraph(graph) is None - for arg in graph.startblock.inputargs: - assert arg.concretetype is not Void - interp = LLInterpreter(t.flowgraphs, t.rtyper) - assert interp.eval_function(f, [0]) == 1 - -def test_remove_void_in_struct(): - t = annotate_and_remove_void(simple_method, [int]) - #t.view() - log(t.flowgraphs.iteritems()) - for func, graph in t.flowgraphs.iteritems(): - log('func : ' + str(func)) - log('graph: ' + str(graph)) - assert checkgraph(graph) is None - #for fieldname in self.struct._names: #XXX helper (in lltype?) 
should remove these voids - # type_ = getattr(struct, fieldname) - # log('fieldname=%(fieldname)s , type_=%(type_)s' % locals()) - # assert _type is not Void - #interp = LLInterpreter(t.flowgraphs, t.rtyper) - #assert interp.eval_function(f, [0]) == 1 +from pypy.translator.test.snippet import is_perfect_number def test_inline_simple(): def f(x, y): @@ -233,53 +196,64 @@ result = interp.eval_function(f, [10]) assert result == 45 +def test_inline_constructor(): + class A: + def __init__(self, x, y): + self.bounds = (x, y) + def area(self, height=10): + return height * (self.bounds[1] - self.bounds[0]) + def f(i): + a = A(117, i) + return a.area() + t = Translator(f) + a = t.annotate([int]) + a.simplify() + t.specialize() + inline_function(t, A.__init__.im_func, t.flowgraphs[f]) + interp = LLInterpreter(t.flowgraphs, t.rtyper) + result = interp.eval_function(f, [120]) + assert result == 30 -def check_malloc_removed(fn, signature, expected_remaining_mallocs): - t = Translator(fn) - t.annotate(signature) - t.specialize() - graph = t.getflowgraph() - remove_simple_mallocs(graph) - checkgraph(graph) - count = 0 - for node in flatten(graph): - if isinstance(node, Block): - for op in node.operations: - if op.opname == 'malloc': - count += 1 - assert count == expected_remaining_mallocs - -def test_remove_mallocs(): - def fn1(x, y): - s, d = x+y, x-y - return s*d - yield check_malloc_removed, fn1, [int, int], 0 - # - class T: - pass - def fn2(x, y): - t = T() - t.x = x - t.y = y - if x > 0: - return t.x + t.y +def test_cannot_inline_recursive_function(): + def factorial(n): + if n > 1: + return n * factorial(n-1) else: - return t.x - t.y - yield check_malloc_removed, fn2, [int, int], 0 - # - def fn3(x): - a, ((b, c), d, e) = x+1, ((x+2, x+3), x+4, x+5) - return a+b+c+d+e - yield check_malloc_removed, fn3, [int], 0 - # - class A: - pass - class B(A): - pass - def fn4(i): - a = A() - b = B() - a.b = b - b.i = i - return a.b.i - yield check_malloc_removed, fn4, [int], 0 + return 1 + def f(n): + return factorial(n//2) + t = Translator(f) + a = t.annotate([int]) + a.simplify() + t.specialize() + py.test.raises(CannotInline, + "inline_function(t, factorial, t.flowgraphs[f])") + +def test_auto_inlining_small_call_big(): + def leaf(n): + total = 0 + i = 0 + while i < n: + total += i + if total > 100: + raise OverflowError + i += 1 + return total + def g(n): + return leaf(n) + def f(n): + try: + return g(n) + except OverflowError: + return -1 + t = Translator(f) + a = t.annotate([int]) + a.simplify() + t.specialize() + auto_inlining(t, threshold=10) + assert len(collect_called_functions(t.getflowgraph(f))) == 0 + interp = LLInterpreter(t.flowgraphs, t.rtyper) + result = interp.eval_function(f, [10]) + assert result == 45 + result = interp.eval_function(f, [15]) + assert result == -1 Modified: pypy/dist/pypy/translator/backendopt/test/test_malloc.py ============================================================================== --- pypy/dist/pypy/translator/backend_opt/test/test_malloc.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_malloc.py Tue Sep 13 16:49:13 2005 @@ -1,260 +1,41 @@ -from pypy.translator.backendoptimization import remove_void, inline_function -from pypy.translator.backendoptimization import remove_simple_mallocs +from pypy.translator.backendopt.malloc import remove_simple_mallocs +from pypy.translator.backendopt.inline import inline_function from pypy.translator.translator import Translator -from pypy.rpython.lltype import Void -from pypy.rpython.llinterp import 
LLInterpreter from pypy.objspace.flow.model import checkgraph, flatten, Block -from pypy.translator.test.snippet import simple_method, is_perfect_number -from pypy.translator.llvm.log import log - -import py -log = py.log.Producer('test_backendoptimization') - -def annotate_and_remove_void(f, annotate): - t = Translator(f) - a = t.annotate(annotate) - t.specialize() - remove_void(t) - return t - -def test_remove_void_args(): - def f(i): - return [1,2,3,i][i] - t = annotate_and_remove_void(f, [int]) - for func, graph in t.flowgraphs.iteritems(): - assert checkgraph(graph) is None - for arg in graph.startblock.inputargs: - assert arg.concretetype is not Void - interp = LLInterpreter(t.flowgraphs, t.rtyper) - assert interp.eval_function(f, [0]) == 1 - -def test_remove_void_in_struct(): - t = annotate_and_remove_void(simple_method, [int]) - #t.view() - log(t.flowgraphs.iteritems()) - for func, graph in t.flowgraphs.iteritems(): - log('func : ' + str(func)) - log('graph: ' + str(graph)) - assert checkgraph(graph) is None - #for fieldname in self.struct._names: #XXX helper (in lltype?) should remove these voids - # type_ = getattr(struct, fieldname) - # log('fieldname=%(fieldname)s , type_=%(type_)s' % locals()) - # assert _type is not Void - #interp = LLInterpreter(t.flowgraphs, t.rtyper) - #assert interp.eval_function(f, [0]) == 1 - -def test_inline_simple(): - def f(x, y): - return (g(x, y) + 1) * x - def g(x, y): - if x > 0: - return x * y - else: - return -x * y - t = Translator(f) - a = t.annotate([int, int]) - a.simplify() - t.specialize() - inline_function(t, g, t.flowgraphs[f]) - interp = LLInterpreter(t.flowgraphs, t.rtyper) - result = interp.eval_function(f, [-1, 5]) - assert result == f(-1, 5) - result = interp.eval_function(f, [2, 12]) - assert result == f(2, 12) - -def test_inline_big(): - def f(x): - result = [] - for i in range(1, x+1): - if is_perfect_number(i): - result.append(i) - return result - t = Translator(f) - a = t.annotate([int]) - a.simplify() - t.specialize() - inline_function(t, is_perfect_number, t.flowgraphs[f]) - interp = LLInterpreter(t.flowgraphs, t.rtyper) - result = interp.eval_function(f, [10]) - assert result.length == len(f(10)) - -def test_inline_raising(): - def f(x): - if x == 1: - raise ValueError - return x - def g(x): - a = f(x) - if x == 2: - raise KeyError - def h(x): - try: - g(x) - except ValueError: - return 1 - except KeyError: - return 2 - return x - t = Translator(h) - a = t.annotate([int]) - a.simplify() - t.specialize() - inline_function(t, f, t.flowgraphs[g]) - interp = LLInterpreter(t.flowgraphs, t.rtyper) - result = interp.eval_function(h, [0]) - assert result == 0 - result = interp.eval_function(h, [1]) - assert result == 1 - result = interp.eval_function(h, [2]) - assert result == 2 - -def test_inline_several_times(): - def f(x): - return (x + 1) * 2 - def g(x): - if x: - a = f(x) + f(x) - else: - a = f(x) + 1 - return a + f(x) - t = Translator(g) - a = t.annotate([int]) - a.simplify() - t.specialize() - inline_function(t, f, t.flowgraphs[g]) - interp = LLInterpreter(t.flowgraphs, t.rtyper) - result = interp.eval_function(g, [0]) - assert result == g(0) - result = interp.eval_function(g, [42]) - assert result == g(42) - -def test_inline_exceptions(): - def f(x): - if x == 0: - raise ValueError - if x == 1: - raise KeyError - def g(x): - try: - f(x) - except ValueError: - return 2 - except KeyError: - return x+2 - return 1 - t = Translator(g) - a = t.annotate([int]) - a.simplify() - t.specialize() - inline_function(t, f, 
t.flowgraphs[g]) - interp = LLInterpreter(t.flowgraphs, t.rtyper) - result = interp.eval_function(g, [0]) - assert result == 2 - result = interp.eval_function(g, [1]) - assert result == 3 - result = interp.eval_function(g, [42]) - assert result == 1 - -def DONOTtest_inline_var_exception(): - # this test is disabled for now, because f() contains a direct_call - # (at the end, to a ll helper, to get the type of the exception object) - def f(x): - e = None - if x == 0: - e = ValueError() - elif x == 1: - e = KeyError() - if x == 0 or x == 1: - raise e - def g(x): - try: - f(x) - except ValueError: - return 2 - except KeyError: - return 3 - return 1 - t = Translator(g) - a = t.annotate([int]) - a.simplify() - t.specialize() - inline_function(t, f, t.flowgraphs[g]) - interp = LLInterpreter(t.flowgraphs, t.rtyper) - result = interp.eval_function(g, [0]) - assert result == 2 - result = interp.eval_function(g, [1]) - assert result == 3 - result = interp.eval_function(g, [42]) - assert result == 1 - -def DONOTtest_call_call(): - # for reference. Just remove this test if we decide not to support - # catching exceptions while inlining a graph that contains further - # direct_calls. - def e(x): - if x < 0: - raise KeyError - return x+1 - def f(x): - return e(x)+2 - def g(x): - try: - return f(x)+3 - except KeyError: - return -1 - t = Translator(g) - a = t.annotate([int]) - a.simplify() - t.specialize() - inline_function(t, f, t.flowgraphs[g]) - interp = LLInterpreter(t.flowgraphs, t.rtyper) - result = interp.eval_function(g, [100]) - assert result == 106 - result = interp.eval_function(g, [-100]) - assert result == -1 - -def test_for_loop(): - def f(x): - result = 0 - for i in range(0, x): - result += i - return result - t = Translator(f) - a = t.annotate([int]) - a.simplify() - t.specialize() - for graph in t.flowgraphs.values(): - if graph.name.startswith('ll_rangenext'): - break - else: - assert 0, "cannot find ll_rangenext_*() function" - inline_function(t, graph, t.flowgraphs[f]) - interp = LLInterpreter(t.flowgraphs, t.rtyper) - result = interp.eval_function(f, [10]) - assert result == 45 +from pypy.rpython.llinterp import LLInterpreter +def check_malloc_removed(graph): + checkgraph(graph) + count1 = count2 = 0 + for node in flatten(graph): + if isinstance(node, Block): + for op in node.operations: + if op.opname == 'malloc': + count1 += 1 + if op.opname == 'direct_call': + count2 += 1 + assert count1 == 0 # number of mallocs left + assert count2 == 0 # number of direct_calls left -def check_malloc_removed(fn, signature, expected_remaining_mallocs): +def check(fn, signature, args, expected_result): t = Translator(fn) t.annotate(signature) t.specialize() graph = t.getflowgraph() remove_simple_mallocs(graph) - checkgraph(graph) - count = 0 - for node in flatten(graph): - if isinstance(node, Block): - for op in node.operations: - if op.opname == 'malloc': - count += 1 - assert count == expected_remaining_mallocs + check_malloc_removed(graph) + interp = LLInterpreter(t.flowgraphs, t.rtyper) + res = interp.eval_function(fn, args) + assert res == expected_result + -def test_remove_mallocs(): +def test_fn1(): def fn1(x, y): s, d = x+y, x-y return s*d - yield check_malloc_removed, fn1, [int, int], 0 - # + check(fn1, [int, int], [15, 10], 125) + +def test_fn2(): class T: pass def fn2(x, y): @@ -265,13 +46,15 @@ return t.x + t.y else: return t.x - t.y - yield check_malloc_removed, fn2, [int, int], 0 - # + check(fn2, [int, int], [-6, 7], -13) + +def test_fn3(): def fn3(x): a, ((b, c), d, e) = x+1, ((x+2, 
x+3), x+4, x+5) return a+b+c+d+e - yield check_malloc_removed, fn3, [int], 0 - # + check(fn3, [int], [10], 65) + +def test_fn4(): class A: pass class B(A): @@ -282,4 +65,14 @@ a.b = b b.i = i return a.b.i - yield check_malloc_removed, fn4, [int], 0 + check(fn4, [int], [42], 42) + +def test_fn5(): + class A: + attr = 666 + class B(A): + attr = 42 + def fn5(): + b = B() + return b.attr + check(fn5, [], [], 42) Modified: pypy/dist/pypy/translator/backendopt/test/test_ssa.py ============================================================================== --- pypy/dist/pypy/translator/backend_opt/test/test_ssa.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_ssa.py Tue Sep 13 16:49:13 2005 @@ -1,285 +1,2 @@ -from pypy.translator.backendoptimization import remove_void, inline_function -from pypy.translator.backendoptimization import remove_simple_mallocs -from pypy.translator.translator import Translator -from pypy.rpython.lltype import Void -from pypy.rpython.llinterp import LLInterpreter -from pypy.objspace.flow.model import checkgraph, flatten, Block -from pypy.translator.test.snippet import simple_method, is_perfect_number -from pypy.translator.llvm.log import log -import py -log = py.log.Producer('test_backendoptimization') - -def annotate_and_remove_void(f, annotate): - t = Translator(f) - a = t.annotate(annotate) - t.specialize() - remove_void(t) - return t - -def test_remove_void_args(): - def f(i): - return [1,2,3,i][i] - t = annotate_and_remove_void(f, [int]) - for func, graph in t.flowgraphs.iteritems(): - assert checkgraph(graph) is None - for arg in graph.startblock.inputargs: - assert arg.concretetype is not Void - interp = LLInterpreter(t.flowgraphs, t.rtyper) - assert interp.eval_function(f, [0]) == 1 - -def test_remove_void_in_struct(): - t = annotate_and_remove_void(simple_method, [int]) - #t.view() - log(t.flowgraphs.iteritems()) - for func, graph in t.flowgraphs.iteritems(): - log('func : ' + str(func)) - log('graph: ' + str(graph)) - assert checkgraph(graph) is None - #for fieldname in self.struct._names: #XXX helper (in lltype?) 
should remove these voids - # type_ = getattr(struct, fieldname) - # log('fieldname=%(fieldname)s , type_=%(type_)s' % locals()) - # assert _type is not Void - #interp = LLInterpreter(t.flowgraphs, t.rtyper) - #assert interp.eval_function(f, [0]) == 1 - -def test_inline_simple(): - def f(x, y): - return (g(x, y) + 1) * x - def g(x, y): - if x > 0: - return x * y - else: - return -x * y - t = Translator(f) - a = t.annotate([int, int]) - a.simplify() - t.specialize() - inline_function(t, g, t.flowgraphs[f]) - interp = LLInterpreter(t.flowgraphs, t.rtyper) - result = interp.eval_function(f, [-1, 5]) - assert result == f(-1, 5) - result = interp.eval_function(f, [2, 12]) - assert result == f(2, 12) - -def test_inline_big(): - def f(x): - result = [] - for i in range(1, x+1): - if is_perfect_number(i): - result.append(i) - return result - t = Translator(f) - a = t.annotate([int]) - a.simplify() - t.specialize() - inline_function(t, is_perfect_number, t.flowgraphs[f]) - interp = LLInterpreter(t.flowgraphs, t.rtyper) - result = interp.eval_function(f, [10]) - assert result.length == len(f(10)) - -def test_inline_raising(): - def f(x): - if x == 1: - raise ValueError - return x - def g(x): - a = f(x) - if x == 2: - raise KeyError - def h(x): - try: - g(x) - except ValueError: - return 1 - except KeyError: - return 2 - return x - t = Translator(h) - a = t.annotate([int]) - a.simplify() - t.specialize() - inline_function(t, f, t.flowgraphs[g]) - interp = LLInterpreter(t.flowgraphs, t.rtyper) - result = interp.eval_function(h, [0]) - assert result == 0 - result = interp.eval_function(h, [1]) - assert result == 1 - result = interp.eval_function(h, [2]) - assert result == 2 - -def test_inline_several_times(): - def f(x): - return (x + 1) * 2 - def g(x): - if x: - a = f(x) + f(x) - else: - a = f(x) + 1 - return a + f(x) - t = Translator(g) - a = t.annotate([int]) - a.simplify() - t.specialize() - inline_function(t, f, t.flowgraphs[g]) - interp = LLInterpreter(t.flowgraphs, t.rtyper) - result = interp.eval_function(g, [0]) - assert result == g(0) - result = interp.eval_function(g, [42]) - assert result == g(42) - -def test_inline_exceptions(): - def f(x): - if x == 0: - raise ValueError - if x == 1: - raise KeyError - def g(x): - try: - f(x) - except ValueError: - return 2 - except KeyError: - return x+2 - return 1 - t = Translator(g) - a = t.annotate([int]) - a.simplify() - t.specialize() - inline_function(t, f, t.flowgraphs[g]) - interp = LLInterpreter(t.flowgraphs, t.rtyper) - result = interp.eval_function(g, [0]) - assert result == 2 - result = interp.eval_function(g, [1]) - assert result == 3 - result = interp.eval_function(g, [42]) - assert result == 1 - -def DONOTtest_inline_var_exception(): - # this test is disabled for now, because f() contains a direct_call - # (at the end, to a ll helper, to get the type of the exception object) - def f(x): - e = None - if x == 0: - e = ValueError() - elif x == 1: - e = KeyError() - if x == 0 or x == 1: - raise e - def g(x): - try: - f(x) - except ValueError: - return 2 - except KeyError: - return 3 - return 1 - t = Translator(g) - a = t.annotate([int]) - a.simplify() - t.specialize() - inline_function(t, f, t.flowgraphs[g]) - interp = LLInterpreter(t.flowgraphs, t.rtyper) - result = interp.eval_function(g, [0]) - assert result == 2 - result = interp.eval_function(g, [1]) - assert result == 3 - result = interp.eval_function(g, [42]) - assert result == 1 - -def DONOTtest_call_call(): - # for reference. 
Just remove this test if we decide not to support - # catching exceptions while inlining a graph that contains further - # direct_calls. - def e(x): - if x < 0: - raise KeyError - return x+1 - def f(x): - return e(x)+2 - def g(x): - try: - return f(x)+3 - except KeyError: - return -1 - t = Translator(g) - a = t.annotate([int]) - a.simplify() - t.specialize() - inline_function(t, f, t.flowgraphs[g]) - interp = LLInterpreter(t.flowgraphs, t.rtyper) - result = interp.eval_function(g, [100]) - assert result == 106 - result = interp.eval_function(g, [-100]) - assert result == -1 - -def test_for_loop(): - def f(x): - result = 0 - for i in range(0, x): - result += i - return result - t = Translator(f) - a = t.annotate([int]) - a.simplify() - t.specialize() - for graph in t.flowgraphs.values(): - if graph.name.startswith('ll_rangenext'): - break - else: - assert 0, "cannot find ll_rangenext_*() function" - inline_function(t, graph, t.flowgraphs[f]) - interp = LLInterpreter(t.flowgraphs, t.rtyper) - result = interp.eval_function(f, [10]) - assert result == 45 - - -def check_malloc_removed(fn, signature, expected_remaining_mallocs): - t = Translator(fn) - t.annotate(signature) - t.specialize() - graph = t.getflowgraph() - remove_simple_mallocs(graph) - checkgraph(graph) - count = 0 - for node in flatten(graph): - if isinstance(node, Block): - for op in node.operations: - if op.opname == 'malloc': - count += 1 - assert count == expected_remaining_mallocs - -def test_remove_mallocs(): - def fn1(x, y): - s, d = x+y, x-y - return s*d - yield check_malloc_removed, fn1, [int, int], 0 - # - class T: - pass - def fn2(x, y): - t = T() - t.x = x - t.y = y - if x > 0: - return t.x + t.y - else: - return t.x - t.y - yield check_malloc_removed, fn2, [int, int], 0 - # - def fn3(x): - a, ((b, c), d, e) = x+1, ((x+2, x+3), x+4, x+5) - return a+b+c+d+e - yield check_malloc_removed, fn3, [int], 0 - # - class A: - pass - class B(A): - pass - def fn4(i): - a = A() - b = B() - a.b = b - b.i = i - return a.b.i - yield check_malloc_removed, fn4, [int], 0 +# XXX write a test! 
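The new test_ssa.py above is left as a bare "XXX write a test!" placeholder; one possible shape
for such a test, sketched against the SSI_to_SSA() helper exported by backendopt/ssa.py (the test
name and sample function are invented, not part of the commit), could be::

    from pypy.translator.translator import Translator
    from pypy.translator.backendopt.ssa import SSI_to_SSA
    from pypy.objspace.flow.model import checkgraph

    def test_SSI_to_SSA_keeps_graph_valid():
        def f(x):
            if x > 0:
                y = x + 1
            else:
                y = x - 1
            return y
        t = Translator(f)
        t.annotate([int])
        t.specialize()
        graph = t.getflowgraph()
        SSI_to_SSA(graph)    # renaming variables must not break the graph
        checkgraph(graph)
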
Modified: pypy/dist/pypy/translator/c/test/test_backendoptimized.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_backendoptimized.py (original) +++ pypy/dist/pypy/translator/c/test/test_backendoptimized.py Tue Sep 13 16:49:13 2005 @@ -1,7 +1,6 @@ import autopath from pypy.translator.tool.cbuild import skip_missing_compiler from pypy.translator.translator import Translator -from pypy.translator import backendoptimization from pypy.translator.c.test.test_typed import TestTypedTestCase as _TestTypedTestCase @@ -20,8 +19,7 @@ a = t.annotate(argstypelist) a.simplify() t.specialize() - for graph in t.flowgraphs.values(): - backendoptimization.backend_optimizations(graph) + t.backend_optimizations() t.checkgraphs() return skip_missing_compiler(t.ccompile) Modified: pypy/dist/pypy/translator/geninterplevel.py ============================================================================== --- pypy/dist/pypy/translator/geninterplevel.py (original) +++ pypy/dist/pypy/translator/geninterplevel.py Tue Sep 13 16:49:13 2005 @@ -56,7 +56,7 @@ from pypy.interpreter.error import OperationError from pypy.interpreter.argument import Arguments from pypy.rpython.rarithmetic import r_int, r_uint -from pypy.translator.backendoptimization import SSI_to_SSA +from pypy.translator.backendopt.ssa import SSI_to_SSA from pypy.translator.translator import Translator from pypy.objspace.flow import FlowObjSpace Modified: pypy/dist/pypy/translator/llvm/funcnode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/funcnode.py (original) +++ pypy/dist/pypy/translator/llvm/funcnode.py Tue Sep 13 16:49:13 2005 @@ -2,7 +2,6 @@ from pypy.objspace.flow.model import Block, Constant, Variable, Link from pypy.objspace.flow.model import flatten, mkentrymap, traverse, last_exception from pypy.rpython import lltype -from pypy.translator.backendoptimization import backend_optimizations from pypy.translator.unsimplify import remove_double_links from pypy.translator.llvm.node import LLVMNode, ConstantLLVMNode from pypy.translator.llvm.opwriter import OpWriter @@ -37,8 +36,9 @@ self.db = db self.value = value self.ref = self.make_ref('%pypy_', value.graph.name) - self.graph = value.graph - backend_optimizations(self.graph, opt_SSI_to_SSA=False) + self.graph = value.graph + # XXX the following needs to be done in advance (e.g. for inlining) + #backend_optimizations(self.graph, opt_SSI_to_SSA=False) remove_double_links(self.db.translator, self.graph) def __str__(self): Modified: pypy/dist/pypy/translator/translator.py ============================================================================== --- pypy/dist/pypy/translator/translator.py (original) +++ pypy/dist/pypy/translator/translator.py Tue Sep 13 16:49:13 2005 @@ -142,10 +142,9 @@ self.rtyper = RPythonTyper(self.annotator) self.rtyper.specialize(**flags) - def backend_optimizations(self): - from pypy.translator.backendoptimization import backend_optimizations - for graph in self.flowgraphs.values(): - backend_optimizations(graph) + def backend_optimizations(self, **kwds): + from pypy.translator.backendopt.all import backend_optimizations + backend_optimizations(self, **kwds) def source(self, func=None): """Returns original Python source. 
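To summarise the interface introduced by this revision: inline_function() now returns the number
of call sites it inlined and raises CannotInline when it has to give up (for instance on recursive
callees). A hedged sketch of calling it directly, with invented functions f and g::

    from pypy.translator.translator import Translator
    from pypy.translator.backendopt.inline import inline_function, CannotInline

    def g(x):               # invented helper, trivially inlinable
        return x * 2

    def f(x):
        return g(x) + 1

    t = Translator(f)
    t.annotate([int])
    t.specialize()
    try:
        # number of call sites of g() that were inlined into f()'s graph
        count = inline_function(t, g, t.flowgraphs[f])
    except CannotInline:
        count = 0           # e.g. recursive callees are rejected
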
From pedronis at codespeak.net  Tue Sep 13 17:08:26 2005
From: pedronis at codespeak.net (pedronis at codespeak.net)
Date: Tue, 13 Sep 2005 17:08:26 +0200 (CEST)
Subject: [pypy-svn] r17529 - pypy/dist/pypy/doc
Message-ID: <20050913150826.60B7227BA6@code1.codespeak.net>

Author: pedronis
Date: Tue Sep 13 17:08:25 2005
New Revision: 17529

Modified:
   pypy/dist/pypy/doc/draft-dynamic-language-translation.txt
Log:
written some text about how the flow space plugs itself in, and an overview
of its concrete operations. Probably will need to change or be mostly
trashed, but it's a start.


Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt
==============================================================================
--- pypy/dist/pypy/doc/draft-dynamic-language-translation.txt	(original)
+++ pypy/dist/pypy/doc/draft-dynamic-language-translation.txt	Tue Sep 13 17:08:25 2005
@@ -202,7 +202,37 @@
 
     XXX
 
-Annotator
+In our bytecode-interpreter design, evaluation responsibilities are
+split between the Object Space, frames and the so-called execution
+context.  The latter two kinds of objects are properly part of the
+interpretation engine, while the object space implements all
+operations on values, which are treated as black boxes by the engine.
+
+The Object Space plays the role of a factory for execution contexts,
+whose base implementation is supplied by the engine, and exposes hooks
+triggered when frames are entered and left and before each bytecode,
+making it possible to gather a trace of the execution.
+
+Frames have run/resume methods which embed the interpretation loop.
+These methods take an execution context and invoke the appropriate
+hooks in the corresponding situations.
+
+The Flow Object Space in our current design is responsible for
+constructing a flow graph for a single function using abstract
+interpretation.
+
+Concretely, the Flow Space plugs itself into the interpreter as an
+object space and supplies a derived execution context implementation.
+It also wraps a fix-point loop around invocations of the frame resume
+method, which is forced to execute one single bytecode at a time by
+exceptions reaching this loop from the space operations' code and the
+specialised execution context.
+
+
+
+
+
+Annotator
 ===================================
 
     XXX
 


From arigo at codespeak.net  Tue Sep 13 17:16:35 2005
From: arigo at codespeak.net (arigo at codespeak.net)
Date: Tue, 13 Sep 2005 17:16:35 +0200 (CEST)
Subject: [pypy-svn] r17530 - pypy/dist/pypy/translator/backendopt
Message-ID: <20050913151635.47C5D27BB9@code1.codespeak.net>

Author: arigo
Date: Tue Sep 13 17:16:34 2005
New Revision: 17530

Modified:
   pypy/dist/pypy/translator/backendopt/malloc.py
Log:
Fixed the broken algo in _try_inline_malloc().
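The diff below rewrites the flattening pass behind _try_inline_malloc(). For context, this is the
kind of code it targets, mirroring the check() helper and fn2 from test_malloc.py earlier in this
series (the function and class below are illustrative only)::

    from pypy.translator.translator import Translator
    from pypy.translator.backendopt.malloc import remove_simple_mallocs

    class T:                        # illustrative, like fn2's helper class
        pass

    def fn(x, y):
        obj = T()                   # shows up as a malloc in the typed graph
        obj.x = x                   # setfield operations on the malloc'ed struct
        obj.y = y
        if x > 0:                   # getfields below; after the pass only
            return obj.x + obj.y    # local variables remain
        else:
            return obj.x - obj.y

    t = Translator(fn)
    t.annotate([int, int])
    t.specialize()
    graph = t.getflowgraph()
    remove_simple_mallocs(graph)    # the short-lived T instance is flattened away
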
Modified: pypy/dist/pypy/translator/backendopt/malloc.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/malloc.py (original) +++ pypy/dist/pypy/translator/backendopt/malloc.py Tue Sep 13 17:16:34 2005 @@ -3,9 +3,6 @@ from pypy.tool.unionfind import UnionFind from pypy.rpython import lltype -class Blocked(Exception): - pass - class LifeTime: def __init__(self, (block, var)): @@ -121,81 +118,86 @@ flatconstants[S, name] = constant flatten(STRUCT, example) - pending = info.variables.keys() - for block, var in pending: - newvarsmap = {} - - def var_comes_from_outside(): - for key in flatnames: - newvar = Variable() - newvar.concretetype = flatconstants[key].concretetype - newvarsmap[key] = newvar - - def var_is_created_here(): - newvarsmap.update(flatconstants) - - def make_newvars(): - return [newvarsmap[key] for key in flatnames] - - if var in block.inputargs: - var_comes_from_outside() - i = block.inputargs.index(var) - block.inputargs = (block.inputargs[:i] + make_newvars() + - block.inputargs[i+1:]) - - assert block.operations != () - newops = [] - try: + variables_by_block = {} + for block, var in info.variables: + vars = variables_by_block.setdefault(block, {}) + vars[var] = True + + for block, vars in variables_by_block.items(): + + def flowin(var, newvarsmap): + # in this 'block', follow where the 'var' goes to and replace + # it by a flattened-out family of variables. This family is given + # by newvarsmap, whose keys are the 'flatnames'. + vars = {var: True} + + def list_newvars(): + return [newvarsmap[key] for key in flatnames] + + assert block.operations != () + newops = [] for op in block.operations: - assert var not in op.args[1:] # should be the first arg only - if op.args and var == op.args[0]: + for arg in op.args[1:]: # should be the first arg only + assert arg not in vars + if op.args and op.args[0] in vars: if op.opname == "getfield": - S = var.concretetype.TO + S = op.args[0].concretetype.TO fldname = op.args[1].value newop = SpaceOperation("same_as", [newvarsmap[S, fldname]], op.result) newops.append(newop) elif op.opname == "setfield": - S = var.concretetype.TO + S = op.args[0].concretetype.TO fldname = op.args[1].value assert (S, fldname) in newvarsmap newvarsmap[S, fldname] = op.args[2] elif op.opname in ("same_as", "cast_pointer"): - # temporary pseudo-operation, should be removed below - newop = SpaceOperation("_tmp_same_as", - make_newvars(), - op.result) - newops.append(newop) - else: - raise AssertionError, op.opname - elif var == op.result: - assert not newvarsmap - if op.opname == "malloc": - var_is_created_here() - elif op.opname in ("same_as", "cast_pointer"): - # in a 'v2=same_as(v1)', we must analyse v1 before - # we can analyse v2. If we get them in the wrong - # order we cancel and reschedule v2. - raise Blocked - elif op.opname == "_tmp_same_as": - # pseudo-operation just introduced by the code - # some lines above. - for key, v in zip(flatnames, op.args): - newvarsmap[key] = v + assert op.result not in vars + vars[op.result] = True + # Consider the two pointers (input and result) as + # equivalent. We can, and indeed must, use the same + # flattened list of variables for both, as a "setfield" + # via one pointer must be reflected in the other. 
else: raise AssertionError, op.opname + elif op.result in vars: + assert op.opname == "malloc" + assert vars == {var: True} + # drop the "malloc" operation else: newops.append(op) - except Blocked: - pending.append((block, var)) - continue - block.operations[:] = newops - - for link in block.exits: - while var in link.args: - i = link.args.index(var) - link.args = link.args[:i] + make_newvars() + link.args[i+1:] + block.operations[:] = newops + + for link in block.exits: + newargs = [] + for arg in link.args: + if arg in vars: + newargs += list_newvars() + else: + newargs.append(arg) + link.args[:] = newargs + + # look for variables arriving from outside the block + for var in vars: + if var in block.inputargs: + i = block.inputargs.index(var) + newinputargs = block.inputargs[:i] + newvarsmap = {} + for key in flatnames: + newvar = Variable() + newvar.concretetype = flatconstants[key].concretetype + newvarsmap[key] = newvar + newinputargs.append(newvar) + newinputargs += block.inputargs[i+1:] + block.inputargs[:] = newinputargs + flowin(var, newvarsmap) + + # look for variables created inside the block by a malloc + for op in block.operations: + if op.opname == "malloc" and op.result in vars: + newvarsmap = flatconstants.copy() # dummy initial values + flowin(op.result, newvarsmap) return True From arigo at codespeak.net Tue Sep 13 18:00:53 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 13 Sep 2005 18:00:53 +0200 (CEST) Subject: [pypy-svn] r17531 - in pypy/dist/pypy/translator/backendopt: . test Message-ID: <20050913160053.B15FB27BBC@code1.codespeak.net> Author: arigo Date: Tue Sep 13 18:00:52 2005 New Revision: 17531 Modified: pypy/dist/pypy/translator/backendopt/inline.py pypy/dist/pypy/translator/backendopt/test/test_inline.py Log: Another small fix... 
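Like the rest of this series, the fix below is validated by running the transformed graphs on the
LLInterpreter and comparing against plain Python execution; the condensed pattern, with an invented
function f, looks like::

    from pypy.translator.translator import Translator
    from pypy.rpython.llinterp import LLInterpreter

    def f(x):
        return x * x + 1        # invented example function

    t = Translator(f)
    t.annotate([int])
    t.specialize()
    # ... apply the backendopt pass under test to t's graphs here ...
    interp = LLInterpreter(t.flowgraphs, t.rtyper)
    result = interp.eval_function(f, [6])
    assert result == f(6)       # transformed graph must compute the same result
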
Modified: pypy/dist/pypy/translator/backendopt/inline.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/inline.py (original) +++ pypy/dist/pypy/translator/backendopt/inline.py Tue Sep 13 18:00:52 2005 @@ -155,12 +155,13 @@ #let links to exceptblock of the graph to inline go to graphs exceptblock copiedexceptblock = copied_blocks[graph_to_inline.exceptblock] if not exception_guarded: - copiedexceptblock = copied_blocks[graph_to_inline.exceptblock] + # find all copied links that go to copiedexceptblock for link in entrymap[graph_to_inline.exceptblock]: copiedblock = copied_blocks[link.prevblock] - assert len(copiedblock.exits) == 1 - copiedblock.exits[0].args = copiedblock.exits[0].args[:2] - copiedblock.exits[0].target = graph.exceptblock + for copiedlink in copiedblock.exits: + if copiedlink.target is copiedexceptblock: + copiedlink.args = copiedlink.args[:2] + copiedlink.target = graph.exceptblock else: def find_args_in_exceptional_case(link, block, etype, evalue): linkargs = [] Modified: pypy/dist/pypy/translator/backendopt/test/test_inline.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_inline.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_inline.py Tue Sep 13 18:00:52 2005 @@ -1,4 +1,5 @@ import py +import os from pypy.translator.backendopt.inline import inline_function, CannotInline from pypy.translator.backendopt.inline import auto_inlining from pypy.translator.backendopt.inline import collect_called_functions @@ -257,3 +258,37 @@ assert result == 45 result = interp.eval_function(f, [15]) assert result == -1 + +def test_inline_exception_catching(): + def f3(): + raise KeyError + def f2(): + try: + f3() + except KeyError: + return True + else: + return False + def f(): + return f2() + t = Translator(f) + a = t.annotate([]) + a.simplify() + t.specialize() + inline_function(t, f2, t.flowgraphs[f]) + interp = LLInterpreter(t.flowgraphs, t.rtyper) + result = interp.eval_function(f, []) + assert result is True + +def test_auto_inline_os_path_isdir(): + directory = "./." + def f(): + return os.path.isdir(directory) + t = Translator(f) + a = t.annotate([]) + a.simplify() + t.specialize() + auto_inlining(t) + interp = LLInterpreter(t.flowgraphs, t.rtyper) + result = interp.eval_function(f, []) + assert result is True From arigo at codespeak.net Tue Sep 13 18:13:19 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 13 Sep 2005 18:13:19 +0200 (CEST) Subject: [pypy-svn] r17532 - in pypy/dist/pypy/translator: . backendopt backendopt/test Message-ID: <20050913161319.EE7E627BC1@code1.codespeak.net> Author: arigo Date: Tue Sep 13 18:13:18 2005 New Revision: 17532 Modified: pypy/dist/pypy/translator/backendopt/all.py pypy/dist/pypy/translator/backendopt/malloc.py pypy/dist/pypy/translator/backendopt/test/test_all.py pypy/dist/pypy/translator/backendopt/test/test_malloc.py pypy/dist/pypy/translator/simplify.py Log: * Enable inlining and malloc removal in backend_optimizations(): all the C tests pass now. * test_malloc.test_fn1 wasn't testing anything -- fixed. 
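The remark about test_fn1 refers to the rewrite visible in the diff below: presumably the
straight-line tuple never survives as a malloc into the typed graph, so the old test gave the pass
nothing to do. An informal before/after illustration (names invented)::

    # before: tuple built and unpacked in straight-line code; apparently no
    # malloc is left by the time remove_simple_mallocs() runs
    def fn1_old(x, y):
        s, d = x + y, x - y
        return s * d

    # after: the tuple flows across a branch, so it materialises as a real
    # malloc that the pass actually has to eliminate
    def fn1_new(x, y):
        if x > 0:
            t = x + y, x - y
        else:
            t = x - y, x + y
        s, d = t
        return s * d
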
Modified: pypy/dist/pypy/translator/backendopt/all.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/all.py (original) +++ pypy/dist/pypy/translator/backendopt/all.py Tue Sep 13 18:13:18 2005 @@ -12,13 +12,12 @@ remove_same_as(graph) simplify.eliminate_empty_blocks(graph) - # inline functions around - if inline_threshold and 0: # XXX in progress + # inline functions in each other + if inline_threshold: auto_inlining(translator, inline_threshold) # vaporize mallocs - # XXX in progress - for graph in []:# translator.flowgraphs.values(): + for graph in translator.flowgraphs.values(): if remove_simple_mallocs(graph): # remove typical leftovers from malloc removal remove_same_as(graph) Modified: pypy/dist/pypy/translator/backendopt/malloc.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/malloc.py (original) +++ pypy/dist/pypy/translator/backendopt/malloc.py Tue Sep 13 18:13:18 2005 @@ -214,5 +214,6 @@ """Iteratively remove (inline) the mallocs that can be simplified away.""" done_something = False while remove_mallocs_once(graph): + print 'simple mallocs removed in %r' % graph.name done_something = True return done_something Modified: pypy/dist/pypy/translator/backendopt/test/test_all.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_all.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_all.py Tue Sep 13 18:13:18 2005 @@ -37,7 +37,6 @@ def test_big(): - py.test.skip("in progress") assert big() == 83 t = Translator(big) @@ -45,7 +44,6 @@ t.specialize() backend_optimizations(t, inline_threshold=100) - t.view() graph = t.getflowgraph() check_malloc_removed(graph) Modified: pypy/dist/pypy/translator/backendopt/test/test_malloc.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_malloc.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_malloc.py Tue Sep 13 18:13:18 2005 @@ -31,7 +31,11 @@ def test_fn1(): def fn1(x, y): - s, d = x+y, x-y + if x > 0: + t = x+y, x-y + else: + t = x-y, x+y + s, d = t return s*d check(fn1, [int, int], [15, 10], 125) Modified: pypy/dist/pypy/translator/simplify.py ============================================================================== --- pypy/dist/pypy/translator/simplify.py (original) +++ pypy/dist/pypy/translator/simplify.py Tue Sep 13 18:13:18 2005 @@ -341,7 +341,7 @@ pos neg nonzero abs hex oct ord invert add sub mul truediv floordiv div mod divmod pow lshift rshift and_ or_ xor int float long lt le eq ne gt ge cmp coerce contains - iter get same_as cast_pointer '''.split(): + iter get same_as cast_pointer getfield '''.split(): CanRemove[_op] = True del _op CanRemoveBuiltins = { From ac at codespeak.net Tue Sep 13 18:24:18 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Tue, 13 Sep 2005 18:24:18 +0200 (CEST) Subject: [pypy-svn] r17533 - in pypy/dist/pypy/interpreter: astcompiler test Message-ID: <20050913162418.EF5B627BC6@code1.codespeak.net> Author: ac Date: Tue Sep 13 18:24:14 2005 New Revision: 17533 Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py pypy/dist/pypy/interpreter/test/test_compiler.py Log: Reject import * in nested functions. 
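Together with the unqualified-exec check added in r17524, the change below makes the AST code
generator refuse two constructs inside nested functions. Roughly, Python 2 source of the following
(invented) shape is now rejected with SyntaxError::

    def outer(x):
        def inner():
            from os import *    # 'import * is not allowed in a nested function'
            exec "y = 3"        # unqualified exec, rejected since r17524
            return x            # the free variable x makes inner() a nested scope
        return inner
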
Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Tue Sep 13 18:24:14 2005 @@ -825,6 +825,8 @@ self.emitop('IMPORT_NAME', node.modname) for name, alias in node.names: if name == '*': + if self.scope.nested: + raise SyntaxError('import * is not allowed in a nested function') self.namespace = 0 self.emit('IMPORT_STAR') # There can only be one name w/ from ... import * Modified: pypy/dist/pypy/interpreter/test/test_compiler.py ============================================================================== --- pypy/dist/pypy/interpreter/test/test_compiler.py (original) +++ pypy/dist/pypy/interpreter/test/test_compiler.py Tue Sep 13 18:24:14 2005 @@ -108,6 +108,16 @@ ex = e.value assert ex.match(self.space, self.space.w_SyntaxError) + def test_scope_exec_with_nested_free(self): + e = py.test.raises(OperationError, self.compiler.compile, """if 1: + def unoptimized_clash1(x): + exec "z=3" + def f(): + return x + return f""", '', 'exec', 0) + ex = e.value + assert ex.match(self.space, self.space.w_SyntaxError) + def test_scope_importstar_in_nested(self): e = py.test.raises(OperationError, self.compiler.compile, """if 1: def unoptimized_clash1(x): @@ -187,16 +197,10 @@ def setup_method(self, method): self.compiler = PythonAstCompiler(self.space) - def test_scope_unoptimized_clash1(self): - py.test.skip("INPROGESS") - - def test_scope_unoptimized_clash1_b(self): - py.test.skip("INPROGESS") - - def test_scope_importstar_in_nested(self): + def test_scope_importstar_with_nested_free(self): py.test.skip("INPROGESS") - def test_scope_importstar_with_nested_free(self): + def test_scope_exec_with_nested_free(self): py.test.skip("INPROGESS") class SkippedForNowTestPyPyCompiler(BaseTestCompiler): From pedronis at codespeak.net Tue Sep 13 18:51:29 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 13 Sep 2005 18:51:29 +0200 (CEST) Subject: [pypy-svn] r17534 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050913165129.266CB27BC7@code1.codespeak.net> Author: pedronis Date: Tue Sep 13 18:51:27 2005 New Revision: 17534 Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py Log: max on lists is not RPython Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pyassem.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pyassem.py Tue Sep 13 18:51:27 2005 @@ -534,7 +534,12 @@ d = d + depth[b] children = b.get_children() if children: - return max([ self._max_depth(depth, seen, c, d) for c in children]) + maxd = -1 + for c in children: + childd =self._max_depth(depth, seen, c, d) + if childd > maxd: + maxd = childd + return maxd else: if not b.label == "exit": return self._max_depth(depth, seen, self.exit, d) From pedronis at codespeak.net Tue Sep 13 18:53:17 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 13 Sep 2005 18:53:17 +0200 (CEST) Subject: [pypy-svn] r17535 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050913165317.41C9827BC9@code1.codespeak.net> Author: pedronis Date: Tue Sep 13 18:53:16 2005 New Revision: 17535 Added: pypy/dist/pypy/interpreter/astcompiler/TYPERERRORS (contents, props changed) Log: typer errors encountered so far Added: pypy/dist/pypy/interpreter/astcompiler/TYPERERRORS 
============================================================================== --- (empty file) +++ pypy/dist/pypy/interpreter/astcompiler/TYPERERRORS Tue Sep 13 18:53:16 2005 @@ -0,0 +1,124 @@ +TyperError-1: (pypy.interpreter.astcompiler.pycodegen:visitListComp) +don't know about built-in function +.. block at -1 with 1 exits +.. v179774 = simple_call((builtin_function_or_method zip), v179753, v179761) + +TyperError-2: (pypy.interpreter.pyparser.astbuilder:build_arith_expr) +unimplemented operation: 'mod' on (, ) +.. block at 212 with 1 exits +.. v93820 = mod(('unexpected token: %s : %s'), v93785) + +TyperError-3: (pypy.interpreter.astcompiler.pyassem:flattenGraph) +unimplemented operation: 'divmod' on (, ) +.. block at 536 with 1 exits +.. v1539443 = divmod(v1539254, (65536)) + +TyperError-4: (pypy.interpreter.astcompiler.symbols:_do_args) +ll_str unsupported for: +.. block at 112 with 1 exits +.. v194656 = mod(('Argument list contains %s of type %s'), v194632) + +TyperError-5: (pypy.interpreter.astcompiler.future:check_stmt) +contains() on non-const tuple +.. block at 234 with 2 exits(v165947) +.. v165938 = contains(v165923, v165912) + +TyperError-6: (pypy.interpreter.pyparser.astbuilder:build_lambdef) +slice stop must be proved non-negative +.. block at -1 with 1 exits +.. v91072 = newslice((1), (-2), (None)) + +TyperError-7: (pypy.interpreter.astcompiler.misc:mangle) +slice stop must be proved non-negative +.. block at 205 with 1 exits +.. v216816 = newslice((None), v216804, (None)) + +TyperError-8: (pypy.interpreter.pyparser.astbuilder:build_suite) +slice stop must be proved non-negative +.. block at 143 with 1 exits +.. v89498 = newslice((2), (-1), (None)) + +TyperError-9: (pypy.interpreter.pyparser.astbuilder:build_power) +slice stop must be proved non-negative +.. block at 109 with 1 exits +.. v72361 = newslice((None), (-2), (None)) + +TyperError-10: (pypy.interpreter.astcompiler.pycodegen:emitop_int) +unimplemented operation: 'type' on +.. block at -1 EH with 2 exits(v147808) +.. v147796 = type(op_147787) + +TyperError-11: (pypy.interpreter.astcompiler.symbols:visitImport) +slice stop must be proved non-negative +.. block at 68 with 1 exits +.. v140696 = newslice((None), v140691, (None)) + +TyperError-12: (pypy.interpreter.astcompiler.pyassem:twobyte) +unimplemented operation: 'type' on +.. block at -1 EH with 2 exits(v2058259) +.. v2058251 = type(val_2058244) + +TyperError-13: (pypy.interpreter.pyparser.astbuilder:parse_genexpr_for) +unimplemented operation: 'mod' on (, ) +.. block at 332 with 1 exits +.. v120242 = mod(('Unexpected token: %s'), v120221) + +TyperError-14: (pypy.interpreter.astcompiler.pycodegen:visitGenExprInner) +don't know about built-in function +.. block at -1 with 1 exits +.. v158235 = simple_call((builtin_function_or_method zip), v158216, v158223) + +TyperError-15: (pypy.interpreter.astcompiler.pyassem:emitop_int) +unimplemented operation: 'type' on +.. block at 33 EH with 2 exits(v164739) +.. v164713 = type(intval_164706) + +TyperError-16: (pypy.interpreter.astcompiler.pyassem:__init__) +unimplemented operation: 'type' on +.. block at 295 with 2 exits(v100109) +.. v100047 = type(v99980) + +TyperError-17: (pypy.interpreter.pyparser.astbuilder:to_lvalue) +unimplemented operation: 'mod' on (, ) +.. block at 377 with 1 exits +.. v116431 = mod(('cannot assign to %s'), v116400) + +TyperError-18: (pypy.interpreter.pyparser.astbuilder:build_not_test) +unimplemented operation: 'mod' on (, * GcStruct object { ... 
} } }, hash_cache: Signed } }}, inst_filename: * GcStruct rpy_string { hash: Signed, chars: Array of Char }, inst_lineno: Signed } }>) +.. block at 120 with 1 exits +.. v72744 = mod(('not_test implementation incomplete (%s)'), v72728) + +TyperError-19: (pypy.interpreter.pyparser.astbuilder:build_funcdef) +slice stop must be proved non-negative +.. block at 196 with 1 exits +.. v73769 = newslice((3), (-3), (None)) + +TyperError-20: (pypy.interpreter.astcompiler.pyassem:depth_CALL_FUNCTION) +unimplemented operation: 'divmod' on (, ) +.. block at -1 EH with 1 exits +.. v220408 = divmod(argc_220401, (256)) + +TyperError-21: (pypy.interpreter.pyparser.astbuilder:build_term) +unimplemented operation: 'mod' on (, ) +.. block at 300 with 1 exits +.. v89051 = mod(('unexpected token: %s'), v89016) + +TyperError-22: (pypy.interpreter.pyparser.astbuilder:build_shift_expr) +unimplemented operation: 'mod' on (, ) +.. block at 212 with 1 exits +.. v86494 = mod(('unexpected token: %s : %s'), v86459) + +TyperError-23: (pypy.interpreter.astcompiler.pyassem:__init__) +unimplemented operation: 'type' on +.. block at 224 with 2 exits(v99957) +.. v99901 = type(v99858) + +TyperError-24: (pypy.interpreter.astcompiler.pycodegen:generateArgList) +unimplemented operation: 'mod' on (, ) +.. block at 176 with 1 exits +.. v202392 = mod(('unexpect argument type: %s'), v202376) + +TyperError-25: (pypy.interpreter.astcompiler.pyassem:emitop_name) +unimplemented operation: 'type' on +.. block at 33 EH with 2 exits(v186456) +.. v186430 = type(name_186423) From arigo at codespeak.net Tue Sep 13 18:56:47 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 13 Sep 2005 18:56:47 +0200 (CEST) Subject: [pypy-svn] r17536 - pypy/dist/pypy/translator/backendopt Message-ID: <20050913165647.81C2627BCD@code1.codespeak.net> Author: arigo Date: Tue Sep 13 18:56:46 2005 New Revision: 17536 Modified: pypy/dist/pypy/translator/backendopt/inline.py Log: For now, don't use the matrix-based computation -- it's quite slow, and can crash :-( Modified: pypy/dist/pypy/translator/backendopt/inline.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/inline.py (original) +++ pypy/dist/pypy/translator/backendopt/inline.py Tue Sep 13 18:56:46 2005 @@ -8,9 +8,9 @@ from pypy.annotation import model as annmodel from pypy.rpython.lltype import Bool from pypy.rpython import rmodel -from pypy.translator.backendopt import matfunc +#from pypy.translator.backendopt import matfunc -BASE_INLINE_THRESHOLD = 17.0 # just enough to inline ll_rangeiter_next() +BASE_INLINE_THRESHOLD = 12 # just enough to inline ll_rangeiter_next() class CannotInline(Exception): pass @@ -276,7 +276,7 @@ def inlining_heuristic(graph): # XXX ponderation factors? - return (0.819487132 * measure_median_execution_cost(graph) + + return ( #0.819487132 * measure_median_execution_cost(graph) + static_instruction_count(graph)) From arigo at codespeak.net Tue Sep 13 19:04:27 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 13 Sep 2005 19:04:27 +0200 (CEST) Subject: [pypy-svn] r17537 - pypy/dist/pypy/translator/goal Message-ID: <20050913170427.BB2BD27BCD@code1.codespeak.net> Author: arigo Date: Tue Sep 13 19:04:26 2005 New Revision: 17537 Modified: pypy/dist/pypy/translator/goal/translate_pypy.py pypy/dist/pypy/translator/goal/translate_pypy_new.py Log: Argh argh argh -fork just crashes the compilation. Now I have to start it again. 
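Relating to the inlining heuristic change in r17536 above: with the median-execution-cost term disabled, the heuristic reduces to the static instruction count of a graph. A hypothetical convenience wrapper shows how such a size threshold could be applied (worth_inlining does not exist in the code base, and whether auto_inlining compares against the threshold in exactly this way is an assumption)::

    from pypy.translator.backendopt.inline import (BASE_INLINE_THRESHOLD,
                                                   inlining_heuristic)

    def worth_inlining(graph, threshold=BASE_INLINE_THRESHOLD):
        # hypothetical helper: inline a candidate graph only if its
        # (size-based) heuristic value stays within the threshold
        return inlining_heuristic(graph) <= threshold
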
Modified: pypy/dist/pypy/translator/goal/translate_pypy.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy.py Tue Sep 13 19:04:26 2005 @@ -160,7 +160,7 @@ def assert_rpython_mostly_not_imported(): prefix = 'pypy.rpython.' oknames = ('rarithmetic memory memory.lladdress extfunctable ' - 'lltype objectmodel error'.split()) + 'lltype objectmodel error ros'.split()) wrongimports = [] for name, module in sys.modules.items(): if module is not None and name.startswith(prefix): Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Tue Sep 13 19:04:26 2005 @@ -143,7 +143,7 @@ def assert_rpython_mostly_not_imported(): prefix = 'pypy.rpython.' oknames = ('rarithmetic memory memory.lladdress extfunctable ' - 'lltype objectmodel error'.split()) + 'lltype objectmodel error ros'.split()) wrongimports = [] for name, module in sys.modules.items(): if module is not None and name.startswith(prefix): From pedronis at codespeak.net Tue Sep 13 19:19:53 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 13 Sep 2005 19:19:53 +0200 (CEST) Subject: [pypy-svn] r17538 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050913171953.D621127B9C@code1.codespeak.net> Author: pedronis Date: Tue Sep 13 19:19:53 2005 New Revision: 17538 Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Log: remove zip usage Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Tue Sep 13 19:19:53 2005 @@ -541,7 +541,8 @@ stack = [] - for i, for_ in zip(range(len(node.quals)), node.quals): + i = 0 + for for_ in node.quals: assert isinstance(for_, ast.ListCompFor) start, anchor = self._visitListCompFor(for_) self.genexpr_cont_stack.append( None ) @@ -551,6 +552,7 @@ if_.accept( self ) stack.insert(0, (start, self.genexpr_cont_stack[-1], anchor)) self.genexpr_cont_stack.pop() + i += 1 self._implicitNameOp('LOAD', append) node.expr.accept( self ) @@ -621,7 +623,8 @@ # setup list stack = [] - for i, for_ in zip(range(len(node.quals)), node.quals): + i = 0 + for for_ in node.quals: assert isinstance(for_, ast.GenExprFor) start, anchor = self._visitGenExprFor(for_) self.genexpr_cont_stack.append( None ) @@ -631,6 +634,7 @@ if_.accept( self ) stack.insert(0, (start, self.genexpr_cont_stack[-1], anchor)) self.genexpr_cont_stack.pop() + i += 1 node.expr.accept( self ) self.emit('YIELD_VALUE') From pedronis at codespeak.net Tue Sep 13 19:23:53 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 13 Sep 2005 19:23:53 +0200 (CEST) Subject: [pypy-svn] r17539 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050913172353.F253627BA4@code1.codespeak.net> Author: pedronis Date: Tue Sep 13 19:23:52 2005 New Revision: 17539 Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py Log: remove divmod usage Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pyassem.py (original) +++ 
pypy/dist/pypy/interpreter/astcompiler/pyassem.py Tue Sep 13 19:23:52 2005 @@ -592,7 +592,8 @@ if arg in begin: # can only extend argument if backward offset = begin[arg] - hi, lo = divmod(offset,65536) + hi = offset // 65536 + lo = offset % 65536 if hi>0: # extended argument insts.append( InstrInt("EXTENDED_ARG", hi) ) From pedronis at codespeak.net Tue Sep 13 19:31:12 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 13 Sep 2005 19:31:12 +0200 (CEST) Subject: [pypy-svn] r17540 - in pypy/dist/pypy/rpython: . test Message-ID: <20050913173112.3178327BA4@code1.codespeak.net> Author: pedronis Date: Tue Sep 13 19:31:10 2005 New Revision: 17540 Modified: pypy/dist/pypy/rpython/rtuple.py pypy/dist/pypy/rpython/test/test_rtuple.py Log: extend in constant tuple, to support annotated constansts vs just inline one Modified: pypy/dist/pypy/rpython/rtuple.py ============================================================================== --- pypy/dist/pypy/rpython/rtuple.py (original) +++ pypy/dist/pypy/rpython/rtuple.py Tue Sep 13 19:31:10 2005 @@ -83,16 +83,16 @@ class __extend__(pairtype(TupleRepr, Repr)): - def rtype_contains((r_tup, r_item), hop): - v_tup = hop.args_v[0] - if not isinstance(v_tup, Constant): + def rtype_contains((r_tup, r_item), hop): + s_tup = hop.args_s[0] + if not s_tup.is_constant(): raise TyperError("contains() on non-const tuple") - t = v_tup.value + t = s_tup.const typ = type(t[0]) for x in t[1:]: if type(x) is not typ: raise TyperError("contains() on mixed-type tuple " - "constant %r" % (v_tup,)) + "constant %r" % (t,)) d = {} for x in t: d[x] = None Modified: pypy/dist/pypy/rpython/test/test_rtuple.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rtuple.py (original) +++ pypy/dist/pypy/rpython/test/test_rtuple.py Tue Sep 13 19:31:10 2005 @@ -78,6 +78,17 @@ res = interpret(f, [0]) assert res is False +def test_constant_tuple_contains2(): + def t1(): + return (1,2,3,4) + def f(i): + return i in t1() + res = interpret(f, [3], view=False, viewbefore=False) + assert res is True + res = interpret(f, [0]) + assert res is False + + def test_constant_unichar_tuple_contains(): def f(i): return unichr(i) in (u'1', u'9') From pedronis at codespeak.net Tue Sep 13 20:11:24 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 13 Sep 2005 20:11:24 +0200 (CEST) Subject: [pypy-svn] r17541 - in pypy/dist/pypy/rpython: . 
test Message-ID: <20050913181124.406A727B9C@code1.codespeak.net> Author: pedronis Date: Tue Sep 13 20:11:22 2005 New Revision: 17541 Modified: pypy/dist/pypy/rpython/rbuiltin.py pypy/dist/pypy/rpython/test/test_rdict.py Log: convert_const for builtin methods Modified: pypy/dist/pypy/rpython/rbuiltin.py ============================================================================== --- pypy/dist/pypy/rpython/rbuiltin.py (original) +++ pypy/dist/pypy/rpython/rbuiltin.py Tue Sep 13 20:11:22 2005 @@ -71,6 +71,9 @@ # methods of a known name are implemented as just their 'self' self.lowleveltype = self.self_repr.lowleveltype + def convert_const(self, obj): + return self.self_repr.convert_const(obj.__self__) + def rtype_simple_call(self, hop): # methods: look up the rtype_method_xxx() name = 'rtype_method_' + self.methodname Modified: pypy/dist/pypy/rpython/test/test_rdict.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rdict.py (original) +++ pypy/dist/pypy/rpython/test/test_rdict.py Tue Sep 13 20:11:22 2005 @@ -411,3 +411,18 @@ return len(d) + d[a] + d[b] res = interpret(f, []) assert res == 12 + +def test_captured_get(): + get = {1:2}.get + def f(): + return get(1, 3)+get(2, 4) + res = interpret(f, []) + assert res == 6 + + def g(h): + return h(1, 3) + def f(): + return g(get) + + res = interpret(f, []) + assert res == 2 From pedronis at codespeak.net Tue Sep 13 20:13:27 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 13 Sep 2005 20:13:27 +0200 (CEST) Subject: [pypy-svn] r17542 - in pypy/dist/pypy/interpreter: astcompiler pyparser Message-ID: <20050913181327.CF89E27B9C@code1.codespeak.net> Author: pedronis Date: Tue Sep 13 20:13:26 2005 New Revision: 17542 Modified: pypy/dist/pypy/interpreter/astcompiler/misc.py pypy/dist/pypy/interpreter/astcompiler/symbols.py pypy/dist/pypy/interpreter/pyparser/astbuilder.py Log: tentative removing of negative slice stops Modified: pypy/dist/pypy/interpreter/astcompiler/misc.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/misc.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/misc.py Tue Sep 13 20:13:26 2005 @@ -77,7 +77,11 @@ tlen = len(klass) + len(name) if tlen > MANGLE_LEN: - klass = klass[:MANGLE_LEN-tlen] + end = len(klass) + MANGLE_LEN-tlen + if end < 0: + klass = '' # annotator hint + else: + klass = klass[:end] return "_%s%s" % (klass, name) Modified: pypy/dist/pypy/interpreter/astcompiler/symbols.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/symbols.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/symbols.py Tue Sep 13 20:13:26 2005 @@ -380,7 +380,7 @@ scope = self.cur_scope() for name, asname in node.names: i = name.find(".") - if i > -1: + if i >= 0: name = name[:i] scope.add_def(asname or name) Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Tue Sep 13 20:13:26 2005 @@ -515,6 +515,13 @@ else: raise ValueError, "unexpected tokens (%d): %s" % (nb, [str(i) for i in atoms]) +def slicecut(lst, first, endskip): # endskip is negative + last = len(lst)+endskip + if last > first: + return lst[first:last] + else: + return [] + def build_power(builder, nb): """power: atom trailer* ['**' 
factor]""" @@ -524,7 +531,7 @@ else: token = atoms[-2] if isinstance(token, TokenObject) and token.name == tok.DOUBLESTAR: - obj = parse_attraccess(atoms[:-2]) + obj = parse_attraccess(slicecut(atoms, 0, -2)) builder.push(ast.Power([obj, atoms[-1]])) else: obj = parse_attraccess(atoms) @@ -797,7 +804,7 @@ """lambdef: 'lambda' [varargslist] ':' test""" atoms = get_atoms(builder, nb) code = atoms[-1] - names, defaults, flags = parse_arglist(atoms[1:-2]) + names, defaults, flags = parse_arglist(slicecut(atoms, 1, -2)) builder.push(ast.Lambda(names, defaults, flags, code)) @@ -949,7 +956,7 @@ funcname = atoms[1] arglist = [] index = 3 - arglist = atoms[3:-3] + arglist = slicecut(atoms, 3, -3) names, default, flags = parse_arglist(arglist) funcname_token = atoms[1] assert isinstance(funcname_token, TokenObject) @@ -998,7 +1005,7 @@ else: # several statements stmts = [] - nodes = atoms[2:-1] + nodes = slicecut(atoms, 2,-1) for node in nodes: if isinstance(node, ast.Stmt): stmts.extend(node.nodes) From ericvrp at codespeak.net Tue Sep 13 20:27:17 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Tue, 13 Sep 2005 20:27:17 +0200 (CEST) Subject: [pypy-svn] r17543 - in pypy/dist/pypy/translator/llvm: . module Message-ID: <20050913182717.5E5AB27BB9@code1.codespeak.net> Author: ericvrp Date: Tue Sep 13 20:27:16 2005 New Revision: 17543 Modified: pypy/dist/pypy/translator/llvm/codewriter.py pypy/dist/pypy/translator/llvm/module/support.py Log: fix to remove unused casts (probably introduced by remove_simple_malloc) Modified: pypy/dist/pypy/translator/llvm/codewriter.py ============================================================================== --- pypy/dist/pypy/translator/llvm/codewriter.py (original) +++ pypy/dist/pypy/translator/llvm/codewriter.py Tue Sep 13 20:27:16 2005 @@ -136,6 +136,8 @@ self.indent("invoke %s void %s(%s) to label %%%s except label %%%s" % (cconv, functionref, ", ".join(arglist), label, except_label)) def cast(self, targetvar, fromtype, fromvar, targettype): + if fromtype == 'void' and targettype == 'void': + return self.indent("%(targetvar)s = cast %(fromtype)s " "%(fromvar)s to %(targettype)s" % locals()) Modified: pypy/dist/pypy/translator/llvm/module/support.py ============================================================================== --- pypy/dist/pypy/translator/llvm/module/support.py (original) +++ pypy/dist/pypy/translator/llvm/module/support.py Tue Sep 13 20:27:16 2005 @@ -264,7 +264,7 @@ extfunctions["%main"] = [(), """ int %main(int %argc, sbyte** %argv) { entry: - %pypy_argv = call fastcc %RPyListOfString* %pypy_ll_newlist__Ptr_GcStruct_listLlT_Signed.2(int 0) + %pypy_argv = call fastcc %RPyListOfString* %pypy_ll_newlist__Ptr_GcStruct_listLlT_Signed(int 0) br label %no_exit no_exit: From pedronis at codespeak.net Tue Sep 13 20:46:30 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 13 Sep 2005 20:46:30 +0200 (CEST) Subject: [pypy-svn] r17544 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050913184630.1465C27B95@code1.codespeak.net> Author: pedronis Date: Tue Sep 13 20:46:29 2005 New Revision: 17544 Modified: pypy/dist/pypy/interpreter/astcompiler/TYPERERRORS Log: updated list, this time the rtyper didn't crash in the process, so this should be comprehensive Modified: pypy/dist/pypy/interpreter/astcompiler/TYPERERRORS ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/TYPERERRORS (original) +++ 
pypy/dist/pypy/interpreter/astcompiler/TYPERERRORS Tue Sep 13 20:46:29 2005 @@ -1,124 +1,124 @@ -TyperError-1: (pypy.interpreter.astcompiler.pycodegen:visitListComp) -don't know about built-in function -.. block at -1 with 1 exits -.. v179774 = simple_call((builtin_function_or_method zip), v179753, v179761) +TyperError-1: (pypy.interpreter.pyparser.astbuilder:build_import_from) +slice stop must be proved non-negative +.. block at 156 with 1 exits +.. v93327 = newslice(v93296, (-1), (None)) TyperError-2: (pypy.interpreter.pyparser.astbuilder:build_arith_expr) unimplemented operation: 'mod' on (, ) .. block at 212 with 1 exits -.. v93820 = mod(('unexpected token: %s : %s'), v93785) +.. v95081 = mod(('unexpected token: %s : %s'), v95046) -TyperError-3: (pypy.interpreter.astcompiler.pyassem:flattenGraph) -unimplemented operation: 'divmod' on (, ) -.. block at 536 with 1 exits -.. v1539443 = divmod(v1539254, (65536)) +TyperError-3: (pypy.interpreter.astcompiler.pycodegen:visitContinue) +unimplemented operation: 'len' on +.. block at 133 with 1 exits +.. v124518 = len(v124478) -TyperError-4: (pypy.interpreter.astcompiler.symbols:_do_args) -ll_str unsupported for: -.. block at 112 with 1 exits -.. v194656 = mod(('Argument list contains %s of type %s'), v194632) - -TyperError-5: (pypy.interpreter.astcompiler.future:check_stmt) -contains() on non-const tuple -.. block at 234 with 2 exits(v165947) -.. v165938 = contains(v165923, v165912) - -TyperError-6: (pypy.interpreter.pyparser.astbuilder:build_lambdef) -slice stop must be proved non-negative -.. block at -1 with 1 exits -.. v91072 = newslice((1), (-2), (None)) +TyperError-4: (pypy.interpreter.astcompiler.pyassem:twobyte) +unimplemented operation: 'type' on +.. block at -1 EH with 2 exits(v1882086) +.. v1882078 = type(val_1882071) -TyperError-7: (pypy.interpreter.astcompiler.misc:mangle) -slice stop must be proved non-negative -.. block at 205 with 1 exits -.. v216816 = newslice((None), v216804, (None)) +TyperError-5: (pypy.interpreter.astcompiler.pyassem:flattenGraph) +unimplemented operation: 'divmod' on (, ) +.. block at 535 with 1 exits +.. v673105 = divmod(v672924, (65536)) -TyperError-8: (pypy.interpreter.pyparser.astbuilder:build_suite) -slice stop must be proved non-negative -.. block at 143 with 1 exits -.. v89498 = newslice((2), (-1), (None)) +TyperError-6: (pypy.interpreter.astcompiler.pycodegen:generateArgList) +unimplemented operation: 'mod' on (, ) +.. block at 176 with 1 exits +.. v207318 = mod(('unexpect argument type: %s'), v207302) -TyperError-9: (pypy.interpreter.pyparser.astbuilder:build_power) -slice stop must be proved non-negative -.. block at 109 with 1 exits -.. v72361 = newslice((None), (-2), (None)) +TyperError-7: (pypy.interpreter.astcompiler.pyassem:depth_CALL_FUNCTION) +unimplemented operation: 'divmod' on (, ) +.. block at -1 EH with 1 exits +.. v297531 = divmod(argc_297524, (256)) -TyperError-10: (pypy.interpreter.astcompiler.pycodegen:emitop_int) -unimplemented operation: 'type' on -.. block at -1 EH with 2 exits(v147808) -.. v147796 = type(op_147787) +TyperError-8: (pypy.interpreter.pyparser.astbuilder:parse_listcomp) +unimplemented operation: 'mod' on (, ) +.. block at 332 with 1 exits +.. v136944 = mod(('Unexpected token: %s'), v136923) -TyperError-11: (pypy.interpreter.astcompiler.symbols:visitImport) -slice stop must be proved non-negative -.. block at 68 with 1 exits -.. 
v140696 = newslice((None), v140691, (None)) +TyperError-9: (pypy.interpreter.pyparser.astbuilder:build_term) +unimplemented operation: 'mod' on (, ) +.. block at 300 with 1 exits +.. v90369 = mod(('unexpected token: %s'), v90334) -TyperError-12: (pypy.interpreter.astcompiler.pyassem:twobyte) -unimplemented operation: 'type' on -.. block at -1 EH with 2 exits(v2058259) -.. v2058251 = type(val_2058244) +TyperError-10: (pypy.interpreter.astcompiler.pyassem:twobyte) +unimplemented operation: 'divmod' on (, ) +.. block at 38 with 1 exits +.. v1882096 = divmod(v1882093, (256)) -TyperError-13: (pypy.interpreter.pyparser.astbuilder:parse_genexpr_for) +TyperError-11: (pypy.interpreter.pyparser.astbuilder:parse_arglist) unimplemented operation: 'mod' on (, ) -.. block at 332 with 1 exits -.. v120242 = mod(('Unexpected token: %s'), v120221) +.. block at 491 with 1 exits +.. v110364 = mod(('Unexpected token: %s'), v110333) -TyperError-14: (pypy.interpreter.astcompiler.pycodegen:visitGenExprInner) -don't know about built-in function -.. block at -1 with 1 exits -.. v158235 = simple_call((builtin_function_or_method zip), v158216, v158223) +TyperError-12: (pypy.interpreter.pyparser.astbuilder:build_not_test) +unimplemented operation: 'mod' on (, * GcStruct object { ... } } }, hash_cache: Signed } }}, inst_filename: * GcStruct rpy_string { hash: Signed, chars: Array of Char }, inst_lineno: Signed } }>) +.. block at 120 with 1 exits +.. v74452 = mod(('not_test implementation incomplete (%s)'), v74436) + +TyperError-13: (pypy.interpreter.astcompiler.pyassem:get_children) +unimplemented operation: 'contains' on (, ) +.. block at 13 with 1 exits +.. v226229 = contains(v226199, v226167) -TyperError-15: (pypy.interpreter.astcompiler.pyassem:emitop_int) +TyperError-14: (pypy.interpreter.astcompiler.pycodegen:emitop_int) unimplemented operation: 'type' on -.. block at 33 EH with 2 exits(v164739) -.. v164713 = type(intval_164706) +.. block at -1 EH with 2 exits(v179678) +.. v179666 = type(op_179657) -TyperError-16: (pypy.interpreter.astcompiler.pyassem:__init__) +TyperError-15: (pypy.interpreter.astcompiler.pyassem:__init__) unimplemented operation: 'type' on -.. block at 295 with 2 exits(v100109) -.. v100047 = type(v99980) +.. block at 417 with 2 exits(v105093) +.. v105063 = type(v105043) -TyperError-17: (pypy.interpreter.pyparser.astbuilder:to_lvalue) +TyperError-16: (pypy.interpreter.pyparser.astbuilder:build_shift_expr) unimplemented operation: 'mod' on (, ) -.. block at 377 with 1 exits -.. v116431 = mod(('cannot assign to %s'), v116400) +.. block at 212 with 1 exits +.. v71560 = mod(('unexpected token: %s : %s'), v71525) -TyperError-18: (pypy.interpreter.pyparser.astbuilder:build_not_test) -unimplemented operation: 'mod' on (, * GcStruct object { ... } } }, hash_cache: Signed } }}, inst_filename: * GcStruct rpy_string { hash: Signed, chars: Array of Char }, inst_lineno: Signed } }>) -.. block at 120 with 1 exits -.. v72744 = mod(('not_test implementation incomplete (%s)'), v72728) +TyperError-17: (pypy.interpreter.astcompiler.pycodegen:visitCallFunc) +no equality function for +.. block at 228 with 1 exits +.. v186910 = getitem(({(1, 1): 'CALL_FUNCT...ION_KW'}), v186901) -TyperError-19: (pypy.interpreter.pyparser.astbuilder:build_funcdef) -slice stop must be proved non-negative -.. block at 196 with 1 exits -.. v73769 = newslice((3), (-3), (None)) +TyperError-18: (pypy.interpreter.astcompiler.pyassem:emitop_int) +unimplemented operation: 'type' on +.. block at 33 EH with 2 exits(v193854) +.. 
v193828 = type(intval_193821) -TyperError-20: (pypy.interpreter.astcompiler.pyassem:depth_CALL_FUNCTION) -unimplemented operation: 'divmod' on (, ) -.. block at -1 EH with 1 exits -.. v220408 = divmod(argc_220401, (256)) +TyperError-19: (pypy.interpreter.astcompiler.pyassem:emitop_name) +unimplemented operation: 'type' on +.. block at 33 EH with 2 exits(v196517) +.. v196491 = type(name_196484) -TyperError-21: (pypy.interpreter.pyparser.astbuilder:build_term) -unimplemented operation: 'mod' on (, ) -.. block at 300 with 1 exits -.. v89051 = mod(('unexpected token: %s'), v89016) +TyperError-20: (pypy.interpreter.astcompiler.pyassem:__init__) +unimplemented operation: 'type' on +.. block at 295 with 2 exits(v104842) +.. v104780 = type(v104713) -TyperError-22: (pypy.interpreter.pyparser.astbuilder:build_shift_expr) +TyperError-21: (pypy.interpreter.pyparser.astbuilder:parse_arglist) unimplemented operation: 'mod' on (, ) -.. block at 212 with 1 exits -.. v86494 = mod(('unexpected token: %s : %s'), v86459) +.. block at 668 with 1 exits +.. v110289 = mod(('unexpected token: %s'), v110226) -TyperError-23: (pypy.interpreter.astcompiler.pyassem:__init__) -unimplemented operation: 'type' on -.. block at 224 with 2 exits(v99957) -.. v99901 = type(v99858) +TyperError-22: (pypy.interpreter.pyparser.astbuilder:parse_genexpr_for) +unimplemented operation: 'mod' on (, ) +.. block at 332 with 1 exits +.. v113836 = mod(('Unexpected token: %s'), v113815) -TyperError-24: (pypy.interpreter.astcompiler.pycodegen:generateArgList) -unimplemented operation: 'mod' on (, ) -.. block at 176 with 1 exits -.. v202392 = mod(('unexpect argument type: %s'), v202376) +TyperError-23: (pypy.interpreter.astcompiler.symbols:_do_args) +ll_str unsupported for: +.. block at 112 with 1 exits +.. v205571 = mod(('Argument list contains %s of type %s'), v205547) -TyperError-25: (pypy.interpreter.astcompiler.pyassem:emitop_name) +TyperError-24: (pypy.interpreter.astcompiler.pyassem:__init__) unimplemented operation: 'type' on -.. block at 33 EH with 2 exits(v186456) -.. v186430 = type(name_186423) +.. block at 224 with 2 exits(v104690) +.. v104634 = type(v104591) + +TyperError-25: (pypy.interpreter.pyparser.astbuilder:to_lvalue) +unimplemented operation: 'mod' on (, ) +.. block at 377 with 1 exits +.. v112539 = mod(('cannot assign to %s'), v112508) From tismer at codespeak.net Tue Sep 13 21:38:56 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Tue, 13 Sep 2005 21:38:56 +0200 (CEST) Subject: [pypy-svn] r17545 - pypy/dist/pypy/doc Message-ID: <20050913193856.3263027BA4@code1.codespeak.net> Author: tismer Date: Tue Sep 13 21:38:50 2005 New Revision: 17545 Added: pypy/dist/pypy/doc/thoughts_string_interning.txt (contents, props changed) Log: after I spent some more time on the effects of string interning than I planned, and the effect does not break the limit to justify some real work, I considered to preserve some thoughts and experience with this for later use. Added: pypy/dist/pypy/doc/thoughts_string_interning.txt ============================================================================== --- (empty file) +++ pypy/dist/pypy/doc/thoughts_string_interning.txt Tue Sep 13 21:38:50 2005 @@ -0,0 +1,197 @@ +String Interning in PyPy +=========================== + +A few thoughts about string interning. CPython gets a remarkable +speed-up by interning strings. Interned are all builtin string +objects and all strings used as names. 
The effect is that when +a string lookup is done during instance attribute access, +the dict lookup method will find the string always by identity, +saving the need to do a string comparison. + +Interned Srings in CPython +-------------------------- + +CPython keeps an internal dictionary named ``interned`` for all of these +strings. It contains the string both as key and as value, which means +there are two extra references in principle. Upto Version 2.2, interned +strings were considered immortal. Once they entered the ``interned`` dict, +nothing could revert this memory usage. + +Starting with Python 2.3, interned strings became mortal by default. +The reason was less memory usage for strings that have no external +reference any longer. This seems to be a worthwhile enhancement. +Interned strings that are really needed always have a real reference. +Strings which are interned for temporary reasons get a big speed up +and can be freed after they are no longer in use. + +This was implemented by making the ``interned`` dictionary a weak dict, +by lowering the refcount of interned strings by 2. The string deallocator +got extra handling to look into the ``interned`` dict when a string is deallocated. +This is supported by the state variable on string objects which tells +whether the string is not interned, immortal or mortal. + +Implementation problems for PyPy +-------------------------------- + +- The CPython implementation makes explicit use of the refcount to handle + the weak-dict behavior of ``interned``. PyPy does not expose the implementation + of object aliveness. Special handling would be needed to simulate mortal + behavior. A possible but expensive solution would be to use a real + weak dictionary. Another way is to add a special interface to the backend + that allows either the two extra references to be reset, or for the + boehm collector to exclude the ``interned`` dict from reference tracking. + +- PyPy implements quite complete internal strings, as opposed to CPython + which always uses its "applevel" strings. It also supports low-level + dictionaries. This adds some complication to the issue of interning. + Additionally, the interpreter currently handles attribute access + by calling wrap(str) on the low-level attribute string when executing + frames. This implies that we have to primarily intern low-level strings + and cache the created string objects on top of them. + A possible implementation would use a dict with ll string keys and the + string objects as values. In order to save the extra dict lookup, we also + could consider to cache the string object directly on a field of the rstr, + which of course adds some extra cost. Alternatively, a fast id-indexed + extra dictionary can provide the mapping from rstr to interned string object. + But for efficiency reasons, it is anyway necessary to put an extra flag about + interning on the strings. Flagging this by putting the string object itself + as the flag might be acceptable. A dummyobject can be used if the interned + rstr is not exposed as an interned string object. + +A prototype brute-force patch +-------------------------------- + +In order to get some idea how efficient string interning is at the moment, +I implemented a quite crude version of interning. 
I patched space.wrap +to call this intern_string instead of W_StringObject:: + + def intern_string(space, str): + if we_are_translated(): + _intern_ids = W_StringObject._intern_ids + str_id = id(str) + w_ret = _intern_ids.get(str_id, None) + if w_ret is not None: + return w_ret + _intern = W_StringObject._intern + if str not in _intern: + _intern[str] = W_StringObject(space, str) + W_StringObject._intern_keep[str_id] = str + _intern_ids[str_id] = w_ret = _intern[str] + return w_ret + else: + return W_StringObject(space, str) + +This is no general solution at all, since it a) does not provide +interning of rstr and b) interns every app-level string. The +implementation is also by far not as efficient as it could be, +because it utilizes an extra dict _intern_ids which maps the +id of the rstr to the string object, and a dict _intern_keep to +keep these ids alive. + +With just a single _intern dict from rstr to string object, the +overall performance degraded slightly instead of an advantage. +The triple dict patch accelerates richards by about 12 percent. +Since it still has the overhead of handling the extra dicts, +I guess we can expect twice the acceleration if we add proper +interning support. + +The resulting estimated 24 % acceleration is still not enough +to justify an implementation right now. + +Here the results of the richards benchmark:: + + D:\pypy\dist\pypy\translator\goal>pypy-c-17516.exe -c "from richards import *;Richards.iterations=1;main()" + debug: entry point starting + debug: argv -> pypy-c-17516.exe + debug: argv -> -c + debug: argv -> from richards import *;Richards.iterations=1;main() + Richards benchmark (Python) starting... [] + finished. + Total time for 1 iterations: 38 secs + Average time for iterations: 38885 ms + + D:\pypy\dist\pypy\translator\goal>pypy-c.exe -c "from richards import *;Richards.iterations=1;main()" + debug: entry point starting + debug: argv -> pypy-c.exe + debug: argv -> -c + debug: argv -> from richards import *;Richards.iterations=1;main() + Richards benchmark (Python) starting... [] + finished. + Total time for 1 iterations: 34 secs + Average time for iterations: 34388 ms + + D:\pypy\dist\pypy\translator\goal> + + +This was just an exercize to get an idea. For sure this is not to be checked in. +Instead, I'm attaching the simple patch here for reference. 
+:: + + Index: objspace/std/objspace.py + =================================================================== + --- objspace/std/objspace.py (revision 17526) + +++ objspace/std/objspace.py (working copy) + @@ -243,6 +243,9 @@ + return self.newbool(x) + return W_IntObject(self, x) + if isinstance(x, str): + + # XXX quick speed testing hack + + from pypy.objspace.std.stringobject import intern_string + + return intern_string(self, x) + return W_StringObject(self, x) + if isinstance(x, unicode): + return W_UnicodeObject(self, [unichr(ord(u)) for u in x]) # xxx + Index: objspace/std/stringobject.py + =================================================================== + --- objspace/std/stringobject.py (revision 17526) + +++ objspace/std/stringobject.py (working copy) + @@ -18,6 +18,10 @@ + class W_StringObject(W_Object): + from pypy.objspace.std.stringtype import str_typedef as typedef + + + _intern_ids = {} + + _intern_keep = {} + + _intern = {} + + + def __init__(w_self, space, str): + W_Object.__init__(w_self, space) + w_self._value = str + @@ -32,6 +36,21 @@ + + registerimplementation(W_StringObject) + + +def intern_string(space, str): + + if we_are_translated(): + + _intern_ids = W_StringObject._intern_ids + + str_id = id(str) + + w_ret = _intern_ids.get(str_id, None) + + if w_ret is not None: + + return w_ret + + _intern = W_StringObject._intern + + if str not in _intern: + + _intern[str] = W_StringObject(space, str) + + W_StringObject._intern_keep[str_id] = str + + _intern_ids[str_id] = w_ret = _intern[str] + + return w_ret + + else: + + return W_StringObject(space, str) + + def _isspace(ch): + return ord(ch) in (9, 10, 11, 12, 13, 32) + Index: objspace/std/stringtype.py + =================================================================== + --- objspace/std/stringtype.py (revision 17526) + +++ objspace/std/stringtype.py (working copy) + @@ -47,6 +47,10 @@ + if space.is_true(space.is_(w_stringtype, space.w_str)): + return w_obj # XXX might be reworked when space.str() typechecks + value = space.str_w(w_obj) + + # XXX quick hack to check interning effect + + w_obj = W_StringObject._intern.get(value, None) + + if w_obj is not None: + + return w_obj + w_obj = space.allocate_instance(W_StringObject, w_stringtype) + W_StringObject.__init__(w_obj, space, value) + return w_obj + +ciao - chris From ludal at codespeak.net Tue Sep 13 23:21:31 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Tue, 13 Sep 2005 23:21:31 +0200 (CEST) Subject: [pypy-svn] r17546 - in pypy/dist/pypy/interpreter: astcompiler pyparser Message-ID: <20050913212131.E102C27B98@code1.codespeak.net> Author: ludal Date: Tue Sep 13 23:21:29 2005 New Revision: 17546 Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py pypy/dist/pypy/interpreter/astcompiler/pycodegen.py pypy/dist/pypy/interpreter/pyparser/astbuilder.py pypy/dist/pypy/interpreter/pyparser/error.py Log: change type(x) == y into isinstance(x,y) remove usage of raise StandardError("xxx %s" % obj) by replacing standard exceptions with user defined exceptions this will need to be done better later for cpython compliancy Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pyassem.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pyassem.py Tue Sep 13 23:21:29 2005 @@ -174,7 +174,7 @@ def emitop_int(self, inst, intval ): if self._debug: print "\t", inst, intval - assert type(intval)==int + assert 
isinstance(intval,int) self.current.emit( InstrInt(inst,intval) ) def emitop_block(self, inst, block): @@ -187,7 +187,7 @@ def emitop_name(self, inst, name ): if self._debug: print "\t", inst, name - assert type(name)==str + assert isinstance(name,str) self.current.emit( InstrName(inst,name) ) def getBlocksInOrder(self): @@ -463,17 +463,17 @@ for var in args: if isinstance(var, ast.AssName): _name = var.name - assert type(_name) == str + assert isinstance(_name,str) self.varnames.append( _name ) elif isinstance(var, TupleArg): _name = var.getName() - assert type(_name) == str + assert isinstance(_name,str) self.varnames.append( _name ) elif isinstance(var, ast.AssTuple): for n in var.flatten(): assert isinstance(n, ast.AssName) _name = n.name - assert type(_name) == str + assert isinstance(_name,str) self.varnames.append( _name ) self.stage = RAW self.orderedblocks = [] @@ -862,7 +862,7 @@ def twobyte(val): """Convert an int argument into high and low bytes""" - assert type(val) == types.IntType + assert isinstance(val,int) return divmod(val, 256) class LineAddrTable: Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Tue Sep 13 23:21:29 2005 @@ -174,7 +174,7 @@ return self.graph.emitop_code( inst, gen ) def emitop_int(self, inst, op): - assert type(op) == int + assert isinstance(op, int) return self.graph.emitop_int( inst, op ) def emitop_block(self, inst, block): @@ -1331,7 +1331,7 @@ extra.extend(elt.getChildNodes()) count = count + 1 else: - raise ValueError( "unexpect argument type: %s" % elt ) + raise ValueError( "unexpect argument type:" + str(elt) ) return args + extra, count def findOp(node): @@ -1349,17 +1349,17 @@ if self.op is None: self.op = node.flags elif self.op != node.flags: - raise ValueError, "mixed ops in stmt" + raise ValueError("mixed ops in stmt") def visitAssAttr(self, node): if self.op is None: self.op = node.flags elif self.op != node.flags: - raise ValueError, "mixed ops in stmt" + raise ValueError("mixed ops in stmt") def visitSubscript(self, node): if self.op is None: self.op = node.flags elif self.op != node.flags: - raise ValueError, "mixed ops in stmt" + raise ValueError("mixed ops in stmt") class AugLoadVisitor(ast.ASTVisitor): Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Tue Sep 13 23:21:29 2005 @@ -7,7 +7,7 @@ from pypy.interpreter.astcompiler import ast, consts import pypy.interpreter.pyparser.pysymbol as sym import pypy.interpreter.pyparser.pytoken as tok -from pypy.interpreter.pyparser.error import SyntaxError +from pypy.interpreter.pyparser.error import SyntaxError, TokenError, ASTError from pypy.interpreter.pyparser.parsestring import parsestr DEBUG_MODE = 0 @@ -168,7 +168,7 @@ raise ValueError("FIXME: SyntaxError (incomplete varags) ?") assert isinstance(cur_token, TokenObject) if cur_token.name != tok.DOUBLESTAR: - raise ValueError("Unexpected token: %s" % cur_token) + raise TokenError("Unexpected token: ", [cur_token] ) cur_token = tokens[index] index += 1 assert isinstance(cur_token, TokenObject) @@ -180,7 +180,7 @@ else: raise ValueError("FIXME: SyntaxError (incomplete varags) ?") if index < l: - raise ValueError("unexpected 
token: %s" % tokens[index]) + raise TokenError("unexpected token" , [tokens[index]] ) elif cur_token.name == tok.NAME: val = cur_token.get_value() names.append( ast.AssName( val, consts.OP_ASSIGN ) ) @@ -328,7 +328,7 @@ else: # TODO: check type of ast_node and raise according SyntaxError in case # of del f() - raise SyntaxError("cannot assign to %s" % ast_node ) + raise ASTError("cannot assign to ", ast_node ) def is_augassign( ast_node ): if ( isinstance( ast_node, ast.Name ) or @@ -513,7 +513,7 @@ elif top.name == tok.BACKQUOTE: builder.push(ast.Backquote(atoms[1])) else: - raise ValueError, "unexpected tokens (%d): %s" % (nb, [str(i) for i in atoms]) + raise TokenError("unexpected tokens", atoms) def slicecut(lst, first, endskip): # endskip is negative last = len(lst)+endskip @@ -568,7 +568,7 @@ elif op_node.name == tok.DOUBLESLASH: left = ast.FloorDiv( [ left, right ] ) else: - raise ValueError, "unexpected token: %s" % atoms[i-1] + raise TokenError("unexpected token", [atoms[i-1]]) builder.push( left ) def build_arith_expr( builder, nb ): @@ -584,7 +584,7 @@ elif op_node.name == tok.MINUS: left = ast.Sub( [ left, right ] ) else: - raise ValueError, "unexpected token: %s : %s" % atoms[i-1] + raise ValueError("unexpected token", [atoms[i-1]] ) builder.push( left ) def build_shift_expr( builder, nb ): @@ -600,7 +600,7 @@ elif op_node.name == tok.RIGHTSHIFT: left = ast.RightShift( [ left, right ] ) else: - raise ValueError, "unexpected token: %s : %s" % atoms[i-1] + raise ValueError("unexpected token", [atoms[i-1]] ) builder.push( left ) Modified: pypy/dist/pypy/interpreter/pyparser/error.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/error.py (original) +++ pypy/dist/pypy/interpreter/pyparser/error.py Tue Sep 13 23:21:29 2005 @@ -47,3 +47,16 @@ self.lineno, self.offset, self.text) + + +class ASTError(Exception): + def __init__(self, msg, ast_node ): + self.msg = msg + self.ast_node = ast_node + + +class TokenError(Exception): + def __init__(self, msg, tokens ): + self.msg = msg + self.tokens = tokens + From ludal at codespeak.net Tue Sep 13 23:52:16 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Tue, 13 Sep 2005 23:52:16 +0200 (CEST) Subject: [pypy-svn] r17547 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050913215216.2263427B9A@code1.codespeak.net> Author: ludal Date: Tue Sep 13 23:52:14 2005 New Revision: 17547 Modified: pypy/dist/pypy/interpreter/astcompiler/TYPERERRORS Log: removed some of the errors that should have been fixed. I'll put the latest list as soon as I get it. Modified: pypy/dist/pypy/interpreter/astcompiler/TYPERERRORS ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/TYPERERRORS (original) +++ pypy/dist/pypy/interpreter/astcompiler/TYPERERRORS Tue Sep 13 23:52:14 2005 @@ -3,122 +3,38 @@ .. block at 156 with 1 exits .. v93327 = newslice(v93296, (-1), (None)) -TyperError-2: (pypy.interpreter.pyparser.astbuilder:build_arith_expr) -unimplemented operation: 'mod' on (, ) -.. block at 212 with 1 exits -.. v95081 = mod(('unexpected token: %s : %s'), v95046) - TyperError-3: (pypy.interpreter.astcompiler.pycodegen:visitContinue) unimplemented operation: 'len' on .. block at 133 with 1 exits .. v124518 = len(v124478) -TyperError-4: (pypy.interpreter.astcompiler.pyassem:twobyte) -unimplemented operation: 'type' on -.. block at -1 EH with 2 exits(v1882086) -.. 
v1882078 = type(val_1882071) - TyperError-5: (pypy.interpreter.astcompiler.pyassem:flattenGraph) unimplemented operation: 'divmod' on (, ) .. block at 535 with 1 exits .. v673105 = divmod(v672924, (65536)) -TyperError-6: (pypy.interpreter.astcompiler.pycodegen:generateArgList) -unimplemented operation: 'mod' on (, ) -.. block at 176 with 1 exits -.. v207318 = mod(('unexpect argument type: %s'), v207302) - TyperError-7: (pypy.interpreter.astcompiler.pyassem:depth_CALL_FUNCTION) unimplemented operation: 'divmod' on (, ) .. block at -1 EH with 1 exits .. v297531 = divmod(argc_297524, (256)) -TyperError-8: (pypy.interpreter.pyparser.astbuilder:parse_listcomp) -unimplemented operation: 'mod' on (, ) -.. block at 332 with 1 exits -.. v136944 = mod(('Unexpected token: %s'), v136923) - -TyperError-9: (pypy.interpreter.pyparser.astbuilder:build_term) -unimplemented operation: 'mod' on (, ) -.. block at 300 with 1 exits -.. v90369 = mod(('unexpected token: %s'), v90334) - TyperError-10: (pypy.interpreter.astcompiler.pyassem:twobyte) unimplemented operation: 'divmod' on (, ) .. block at 38 with 1 exits .. v1882096 = divmod(v1882093, (256)) -TyperError-11: (pypy.interpreter.pyparser.astbuilder:parse_arglist) -unimplemented operation: 'mod' on (, ) -.. block at 491 with 1 exits -.. v110364 = mod(('Unexpected token: %s'), v110333) - -TyperError-12: (pypy.interpreter.pyparser.astbuilder:build_not_test) -unimplemented operation: 'mod' on (, * GcStruct object { ... } } }, hash_cache: Signed } }}, inst_filename: * GcStruct rpy_string { hash: Signed, chars: Array of Char }, inst_lineno: Signed } }>) -.. block at 120 with 1 exits -.. v74452 = mod(('not_test implementation incomplete (%s)'), v74436) - TyperError-13: (pypy.interpreter.astcompiler.pyassem:get_children) unimplemented operation: 'contains' on (, ) .. block at 13 with 1 exits .. v226229 = contains(v226199, v226167) -TyperError-14: (pypy.interpreter.astcompiler.pycodegen:emitop_int) -unimplemented operation: 'type' on -.. block at -1 EH with 2 exits(v179678) -.. v179666 = type(op_179657) - -TyperError-15: (pypy.interpreter.astcompiler.pyassem:__init__) -unimplemented operation: 'type' on -.. block at 417 with 2 exits(v105093) -.. v105063 = type(v105043) - -TyperError-16: (pypy.interpreter.pyparser.astbuilder:build_shift_expr) -unimplemented operation: 'mod' on (, ) -.. block at 212 with 1 exits -.. v71560 = mod(('unexpected token: %s : %s'), v71525) - TyperError-17: (pypy.interpreter.astcompiler.pycodegen:visitCallFunc) no equality function for .. block at 228 with 1 exits .. v186910 = getitem(({(1, 1): 'CALL_FUNCT...ION_KW'}), v186901) -TyperError-18: (pypy.interpreter.astcompiler.pyassem:emitop_int) -unimplemented operation: 'type' on -.. block at 33 EH with 2 exits(v193854) -.. v193828 = type(intval_193821) - -TyperError-19: (pypy.interpreter.astcompiler.pyassem:emitop_name) -unimplemented operation: 'type' on -.. block at 33 EH with 2 exits(v196517) -.. v196491 = type(name_196484) - -TyperError-20: (pypy.interpreter.astcompiler.pyassem:__init__) -unimplemented operation: 'type' on -.. block at 295 with 2 exits(v104842) -.. v104780 = type(v104713) - -TyperError-21: (pypy.interpreter.pyparser.astbuilder:parse_arglist) -unimplemented operation: 'mod' on (, ) -.. block at 668 with 1 exits -.. v110289 = mod(('unexpected token: %s'), v110226) - -TyperError-22: (pypy.interpreter.pyparser.astbuilder:parse_genexpr_for) -unimplemented operation: 'mod' on (, ) -.. block at 332 with 1 exits -.. 
v113836 = mod(('Unexpected token: %s'), v113815) - TyperError-23: (pypy.interpreter.astcompiler.symbols:_do_args) ll_str unsupported for: .. block at 112 with 1 exits .. v205571 = mod(('Argument list contains %s of type %s'), v205547) -TyperError-24: (pypy.interpreter.astcompiler.pyassem:__init__) -unimplemented operation: 'type' on -.. block at 224 with 2 exits(v104690) -.. v104634 = type(v104591) - -TyperError-25: (pypy.interpreter.pyparser.astbuilder:to_lvalue) -unimplemented operation: 'mod' on (, ) -.. block at 377 with 1 exits -.. v112539 = mod(('cannot assign to %s'), v112508) From ludal at codespeak.net Wed Sep 14 01:22:51 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Wed, 14 Sep 2005 01:22:51 +0200 (CEST) Subject: [pypy-svn] r17548 - in pypy/dist/pypy/interpreter: astcompiler pyparser Message-ID: <20050913232251.DE06627B9A@code1.codespeak.net> Author: ludal Date: Wed Sep 14 01:22:48 2005 New Revision: 17548 Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py pypy/dist/pypy/interpreter/astcompiler/pycodegen.py pypy/dist/pypy/interpreter/astcompiler/symbols.py pypy/dist/pypy/interpreter/pyparser/astbuilder.py Log: more rpython fixes for the compiler - uses of "%s" % obj - divmod - one dict[ (0,1) ] - some len(SomeInstance) and x in SomeInstance Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pyassem.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pyassem.py Wed Sep 14 01:22:48 2005 @@ -406,7 +406,7 @@ self.next = [] def get_children(self): - if self.next and self.next[0] in self.outEdges: + if self.next and self.next[0].bid in self.outEdges.elts: self.outEdges.remove(self.next[0]) return self.outEdges.elements() + self.next @@ -608,7 +608,8 @@ assert isinstance(inst, InstrInt) arg = inst.intval # numerical arg - hi,lo = divmod(arg,65536) + hi = arg // 65536 + lo = arg % 65536 if hi>0: # extended argument insts.append( InstrInt("EXTENDED_ARG", hi) ) @@ -863,7 +864,9 @@ def twobyte(val): """Convert an int argument into high and low bytes""" assert isinstance(val,int) - return divmod(val, 256) + hi = val // 256 + lo = val % 256 + return hi, lo class LineAddrTable: """lnotab @@ -944,7 +947,8 @@ def depth_BUILD_LIST(count): return -count+1 def depth_CALL_FUNCTION(argc): - hi, lo = divmod(argc, 256) + hi = argc//256 + lo = argc%256 return -(lo + hi * 2) def depth_CALL_FUNCTION_VAR(argc): return depth_CALL_FUNCTION(argc)-1 Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Wed Sep 14 01:22:48 2005 @@ -18,13 +18,13 @@ # drop VERSION dependency since it the ast transformer for 2.4 doesn't work with 2.3 anyway VERSION = 2 -callfunc_opcode_info = { +callfunc_opcode_info = [ # (Have *args, Have **args) : opcode - (0,0) : "CALL_FUNCTION", - (1,0) : "CALL_FUNCTION_VAR", - (0,1) : "CALL_FUNCTION_KW", - (1,1) : "CALL_FUNCTION_VAR_KW", -} + "CALL_FUNCTION", + "CALL_FUNCTION_KW", + "CALL_FUNCTION_VAR", + "CALL_FUNCTION_VAR_KW", +] LOOP = 1 EXCEPT = 2 @@ -469,7 +469,7 @@ elif kind == EXCEPT or kind == TRY_FINALLY: self.set_lineno(node) # find the block that starts the loop - top = len(self.setups) + top = len(self.setups.stack) loop_block = None while top > 0: top = top - 1 @@ -948,7 +948,7 @@ node.dstar_args.accept( self ) 
have_star = node.star_args is not None have_dstar = node.dstar_args is not None - opcode = callfunc_opcode_info[have_star, have_dstar] + opcode = callfunc_opcode_info[ have_star*2 + have_dstar] self.emitop_int(opcode, kw << 8 | pos) def visitPrint(self, node): Modified: pypy/dist/pypy/interpreter/astcompiler/symbols.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/symbols.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/symbols.py Wed Sep 14 01:22:48 2005 @@ -54,8 +54,8 @@ if name in self.uses or name in self.defs: pass # XXX warn about global following def/use if name in self.params: - raise SyntaxError, "%s in %s is global and parameter" % \ - (name, self.name) + msg = "%s in %s is global and parameter" % (name, self.name) + raise SyntaxError( msg ) self.globals[name] = 1 self.module.add_def(name) @@ -319,7 +319,9 @@ elif isinstance( arg, ast.AssTuple ): self._do_args( scope, arg.flatten() ) else: - raise TypeError( "Argument list contains %s of type %s" % (arg, type(arg) ) ) + #msg = "Argument list contains %s of type %s" % (arg, type(arg) ) + msg = "Argument list contains ASTNodes other than AssName or AssTuple" + raise TypeError( msg ) def handle_free_vars(self, scope, parent): parent.add_child(scope) Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Wed Sep 14 01:22:48 2005 @@ -215,7 +215,7 @@ list_fors.append(ast.ListCompFor(ass_node, iterable, ifs)) ifs = [] else: - assert False, 'Unexpected token: %s' % token + assert False, 'Unexpected token: expecting for in listcomp' # # Original implementation: # @@ -268,7 +268,7 @@ genexpr_fors.append(ast.GenExprFor(ass_node, iterable, ifs)) ifs = [] else: - assert False, 'Unexpected token: %s' % token + assert False, 'Unexpected token: expected for in genexpr' return genexpr_fors @@ -684,7 +684,7 @@ elif len(atoms) == 2: builder.push(ast.Not(atoms[1])) else: - assert False, "not_test implementation incomplete (%s)" % atoms + assert False, "not_test implementation incomplete in not_test" def build_test( builder, nb ): return build_binary_expr(builder, nb, ast.Or) @@ -1153,7 +1153,7 @@ else: if token.name == tok.LPAR: # mutli-line imports - tokens = atoms[index+1:-1] + tokens = slicecut( atoms, index+1, -1 ) else: tokens = atoms[index:] index = 0 From ludal at codespeak.net Wed Sep 14 01:49:24 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Wed, 14 Sep 2005 01:49:24 +0200 (CEST) Subject: [pypy-svn] r17549 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050913234924.C4DEC27B9A@code1.codespeak.net> Author: ludal Date: Wed Sep 14 01:49:23 2005 New Revision: 17549 Removed: pypy/dist/pypy/interpreter/astcompiler/TYPERERRORS Log: no more typer errors. 
no it crashes while inlining From arigo at codespeak.net Wed Sep 14 10:16:30 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Wed, 14 Sep 2005 10:16:30 +0200 (CEST) Subject: [pypy-svn] r17552 - in pypy/dist/pypy: rpython translator/backendopt translator/backendopt/test Message-ID: <20050914081630.E1FEE27BA6@code1.codespeak.net> Author: arigo Date: Wed Sep 14 10:16:29 2005 New Revision: 17552 Modified: pypy/dist/pypy/rpython/llinterp.py pypy/dist/pypy/translator/backendopt/inline.py pypy/dist/pypy/translator/backendopt/test/test_inline.py Log: * support inlining functions that always raise and never return * fix in llinterp's print_traceback() Modified: pypy/dist/pypy/rpython/llinterp.py ============================================================================== --- pypy/dist/pypy/rpython/llinterp.py (original) +++ pypy/dist/pypy/rpython/llinterp.py Wed Sep 14 10:16:29 2005 @@ -58,6 +58,9 @@ frames.reverse() for frame in frames: print frame.graph.name, + if frame.curr_block is None: + print "" + continue try: print self.typer.annotator.annotated[frame.curr_block].__module__ except KeyError: Modified: pypy/dist/pypy/translator/backendopt/inline.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/inline.py (original) +++ pypy/dist/pypy/translator/backendopt/inline.py Wed Sep 14 10:16:29 2005 @@ -131,7 +131,6 @@ assert linktoinlined.target is afterblock copiedstartblock = copy_block(graph_to_inline.startblock) copiedstartblock.isstartblock = False - copiedreturnblock = copied_blocks[graph_to_inline.returnblock] #find args passed to startblock of inlined function passon_args = [] for arg in op.args[1:]: @@ -146,11 +145,13 @@ linktoinlined.args = passon_args afterblock.inputargs = [op.result] + afterblock.inputargs afterblock.operations = afterblock.operations[1:] - linkfrominlined = Link([copiedreturnblock.inputargs[0]] + passon_vars[graph_to_inline.returnblock], afterblock) - linkfrominlined.prevblock = copiedreturnblock - copiedreturnblock.exitswitch = None - copiedreturnblock.exits = [linkfrominlined] - assert copiedreturnblock.exits[0].target == afterblock + if graph_to_inline.returnblock in entrymap: + copiedreturnblock = copied_blocks[graph_to_inline.returnblock] + linkfrominlined = Link([copiedreturnblock.inputargs[0]] + passon_vars[graph_to_inline.returnblock], afterblock) + linkfrominlined.prevblock = copiedreturnblock + copiedreturnblock.exitswitch = None + copiedreturnblock.exits = [linkfrominlined] + assert copiedreturnblock.exits[0].target == afterblock if graph_to_inline.exceptblock in entrymap: #let links to exceptblock of the graph to inline go to graphs exceptblock copiedexceptblock = copied_blocks[graph_to_inline.exceptblock] Modified: pypy/dist/pypy/translator/backendopt/test/test_inline.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_inline.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_inline.py Wed Sep 14 10:16:29 2005 @@ -292,3 +292,20 @@ interp = LLInterpreter(t.flowgraphs, t.rtyper) result = interp.eval_function(f, []) assert result is True + +def test_inline_raiseonly(): + def f2(x): + raise KeyError + def f(x): + try: + return f2(x) + except KeyError: + return 42 + t = Translator(f) + a = t.annotate([int]) + a.simplify() + t.specialize() + inline_function(t, f2, t.flowgraphs[f]) + interp = LLInterpreter(t.flowgraphs, t.rtyper) + result = interp.eval_function(f, [98371]) + assert 
result == 42 From arigo at codespeak.net Wed Sep 14 10:58:27 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Wed, 14 Sep 2005 10:58:27 +0200 (CEST) Subject: [pypy-svn] r17553 - pypy/dist/pypy/translator/backendopt/test Message-ID: <20050914085827.CEE6727B74@code1.codespeak.net> Author: arigo Date: Wed Sep 14 10:58:26 2005 New Revision: 17553 Modified: pypy/dist/pypy/translator/backendopt/test/test_inline.py Log: Test clean-up. Modified: pypy/dist/pypy/translator/backendopt/test/test_inline.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_inline.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_inline.py Wed Sep 14 10:58:26 2005 @@ -7,6 +7,16 @@ from pypy.rpython.llinterp import LLInterpreter from pypy.translator.test.snippet import is_perfect_number +def check_inline(func, in_func, sig): + t = Translator(in_func) + a = t.annotate(sig) + a.simplify() + t.specialize() + inline_function(t, func, t.flowgraphs[in_func]) + interp = LLInterpreter(t.flowgraphs, t.rtyper) + return interp + + def test_inline_simple(): def f(x, y): return (g(x, y) + 1) * x @@ -15,12 +25,7 @@ return x * y else: return -x * y - t = Translator(f) - a = t.annotate([int, int]) - a.simplify() - t.specialize() - inline_function(t, g, t.flowgraphs[f]) - interp = LLInterpreter(t.flowgraphs, t.rtyper) + interp = check_inline(g, f, [int, int]) result = interp.eval_function(f, [-1, 5]) assert result == f(-1, 5) result = interp.eval_function(f, [2, 12]) @@ -33,12 +38,7 @@ if is_perfect_number(i): result.append(i) return result - t = Translator(f) - a = t.annotate([int]) - a.simplify() - t.specialize() - inline_function(t, is_perfect_number, t.flowgraphs[f]) - interp = LLInterpreter(t.flowgraphs, t.rtyper) + interp = check_inline(is_perfect_number, f, [int]) result = interp.eval_function(f, [10]) assert result.length == len(f(10)) @@ -81,12 +81,7 @@ else: a = f(x) + 1 return a + f(x) - t = Translator(g) - a = t.annotate([int]) - a.simplify() - t.specialize() - inline_function(t, f, t.flowgraphs[g]) - interp = LLInterpreter(t.flowgraphs, t.rtyper) + interp = check_inline(f, g, [int]) result = interp.eval_function(g, [0]) assert result == g(0) result = interp.eval_function(g, [42]) @@ -106,12 +101,7 @@ except KeyError: return x+2 return 1 - t = Translator(g) - a = t.annotate([int]) - a.simplify() - t.specialize() - inline_function(t, f, t.flowgraphs[g]) - interp = LLInterpreter(t.flowgraphs, t.rtyper) + interp = check_inline(f, g, [int]) result = interp.eval_function(g, [0]) assert result == 2 result = interp.eval_function(g, [1]) @@ -206,12 +196,7 @@ def f(i): a = A(117, i) return a.area() - t = Translator(f) - a = t.annotate([int]) - a.simplify() - t.specialize() - inline_function(t, A.__init__.im_func, t.flowgraphs[f]) - interp = LLInterpreter(t.flowgraphs, t.rtyper) + interp = check_inline(A.__init__.im_func, f, [int]) result = interp.eval_function(f, [120]) assert result == 30 @@ -223,12 +208,7 @@ return 1 def f(n): return factorial(n//2) - t = Translator(f) - a = t.annotate([int]) - a.simplify() - t.specialize() - py.test.raises(CannotInline, - "inline_function(t, factorial, t.flowgraphs[f])") + py.test.raises(CannotInline, check_inline, factorial, f, [int]) def test_auto_inlining_small_call_big(): def leaf(n): @@ -271,12 +251,7 @@ return False def f(): return f2() - t = Translator(f) - a = t.annotate([]) - a.simplify() - t.specialize() - inline_function(t, f2, t.flowgraphs[f]) - interp = 
LLInterpreter(t.flowgraphs, t.rtyper) + interp = check_inline(f2, f, []) result = interp.eval_function(f, []) assert result is True @@ -301,11 +276,6 @@ return f2(x) except KeyError: return 42 - t = Translator(f) - a = t.annotate([int]) - a.simplify() - t.specialize() - inline_function(t, f2, t.flowgraphs[f]) - interp = LLInterpreter(t.flowgraphs, t.rtyper) + interp = check_inline(f2, f, [int]) result = interp.eval_function(f, [98371]) assert result == 42 From arigo at codespeak.net Wed Sep 14 11:13:54 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Wed, 14 Sep 2005 11:13:54 +0200 (CEST) Subject: [pypy-svn] r17554 - in pypy/dist/pypy/translator/backendopt: . test Message-ID: <20050914091354.B031827B8A@code1.codespeak.net> Author: arigo Date: Wed Sep 14 11:13:53 2005 New Revision: 17554 Modified: pypy/dist/pypy/translator/backendopt/inline.py pypy/dist/pypy/translator/backendopt/test/test_inline.py Log: YACCWI (yet another corner case with inlining) Modified: pypy/dist/pypy/translator/backendopt/inline.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/inline.py (original) +++ pypy/dist/pypy/translator/backendopt/inline.py Wed Sep 14 11:13:53 2005 @@ -6,7 +6,7 @@ from pypy.objspace.flow.model import SpaceOperation, last_exception from pypy.objspace.flow.model import traverse, mkentrymap, checkgraph, flatten from pypy.annotation import model as annmodel -from pypy.rpython.lltype import Bool +from pypy.rpython.lltype import Bool, typeOf from pypy.rpython import rmodel #from pypy.translator.backendopt import matfunc @@ -163,6 +163,13 @@ if copiedlink.target is copiedexceptblock: copiedlink.args = copiedlink.args[:2] copiedlink.target = graph.exceptblock + for a1, a2 in zip(copiedlink.args, + graph.exceptblock.inputargs): + if hasattr(a2, 'concretetype'): + assert a1.concretetype == a2.concretetype + else: + # if graph.exceptblock was never used before + a2.concretetype = a1.concretetype else: def find_args_in_exceptional_case(link, block, etype, evalue): linkargs = [] @@ -180,6 +187,7 @@ exc_match = Constant(rmodel.getfunctionptr( translator, translator.rtyper.getexceptiondata().ll_exception_match)) + exc_match.concretetype = typeOf(exc_match.value) #try to match the exceptions for simple cases for link in entrymap[graph_to_inline.exceptblock]: copiedblock = copied_blocks[link.prevblock] @@ -208,7 +216,9 @@ res = Variable() res.concretetype = Bool translator.annotator.bindings[res] = annmodel.SomeBool() - args = [exc_match, etype, Constant(link.llexitcase)] + cexitcase = Constant(link.llexitcase) + cexitcase.concretetype = typeOf(cexitcase.value) + args = [exc_match, etype, cexitcase] block.operations.append(SpaceOperation("direct_call", args, res)) block.exitswitch = res linkargs = find_args_in_exceptional_case(link, link.target, Modified: pypy/dist/pypy/translator/backendopt/test/test_inline.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_inline.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_inline.py Wed Sep 14 11:13:53 2005 @@ -1,5 +1,6 @@ import py import os +from pypy.objspace.flow.model import traverse, Block, Link, Variable, Constant from pypy.translator.backendopt.inline import inline_function, CannotInline from pypy.translator.backendopt.inline import auto_inlining from pypy.translator.backendopt.inline import collect_called_functions @@ -7,12 +8,35 @@ from pypy.rpython.llinterp import 
LLInterpreter from pypy.translator.test.snippet import is_perfect_number +def no_missing_concretetype(node): + if isinstance(node, Block): + for v in node.inputargs: + assert hasattr(v, 'concretetype') + for op in node.operations: + for v in op.args: + assert hasattr(v, 'concretetype') + assert hasattr(op.result, 'concretetype') + if isinstance(node, Link): + for v in node.args: + assert hasattr(v, 'concretetype') + if isinstance(node.last_exception, (Variable, Constant)): + assert hasattr(node.last_exception, 'concretetype') + if isinstance(node.last_exc_value, (Variable, Constant)): + assert hasattr(node.last_exc_value, 'concretetype') + def check_inline(func, in_func, sig): t = Translator(in_func) a = t.annotate(sig) a.simplify() t.specialize() + # look for missing '.concretetype' before inlining (so we don't blame it) + for graph in t.flowgraphs.values(): + traverse(no_missing_concretetype, graph) + # inline! inline_function(t, func, t.flowgraphs[in_func]) + # look for missing '.concretetype' + for graph in t.flowgraphs.values(): + traverse(no_missing_concretetype, graph) interp = LLInterpreter(t.flowgraphs, t.rtyper) return interp From arigo at codespeak.net Wed Sep 14 12:56:59 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Wed, 14 Sep 2005 12:56:59 +0200 (CEST) Subject: [pypy-svn] r17555 - in pypy/dist/pypy/translator/backendopt: . test Message-ID: <20050914105659.00E7427B95@code1.codespeak.net> Author: arigo Date: Wed Sep 14 12:56:58 2005 New Revision: 17555 Added: pypy/dist/pypy/translator/backendopt/sparsemat.py Removed: pypy/dist/pypy/translator/backendopt/matfunc.py Modified: pypy/dist/pypy/translator/backendopt/inline.py pypy/dist/pypy/translator/backendopt/test/test_all.py pypy/dist/pypy/translator/backendopt/test/test_inline.py Log: Reintroduced the matrix-based algorithm, this time with a custom sparse matrix solver. 
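The heuristic reintroduced here solves a small linear system: for each block, the
expected execution cost equals its own operation count plus the average expected
cost of its successors. A minimal illustrative sketch of how such a system can be
set up and solved with the SparseMatrix class added below (the three-block toy
graph is invented for illustration and the import assumes the new sparsemat
module is on the path; this is not code from the commit)::

    # Toy graph, not real flow-graph objects: block 0 (3 ops) jumps to block 1
    # (2 ops), which either loops back to block 0 or leaves to block 2 (0 ops).
    from pypy.translator.backendopt.sparsemat import SparseMatrix

    ops   = [3, 2, 0]                      # operation count per block
    exits = [[1], [0, 2], []]              # successor indices per block

    M = SparseMatrix(len(ops))
    for i, targets in enumerate(exits):
        M[i, i] = 1
        for j in targets:
            M[i, j] -= 1.0 / len(targets)  # each exit taken with equal probability
    cost = M.solve(ops)
    # cost[0] = 3 + cost[1], cost[1] = 2 + 0.5*cost[0] + 0.5*cost[2], cost[2] = 0
    assert abs(cost[0] - 10.0) < 1e-6      # expected cost of the start block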
Modified: pypy/dist/pypy/translator/backendopt/inline.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/inline.py (original) +++ pypy/dist/pypy/translator/backendopt/inline.py Wed Sep 14 12:56:58 2005 @@ -8,9 +8,9 @@ from pypy.annotation import model as annmodel from pypy.rpython.lltype import Bool, typeOf from pypy.rpython import rmodel -#from pypy.translator.backendopt import matfunc +from pypy.translator.backendopt import sparsemat -BASE_INLINE_THRESHOLD = 12 # just enough to inline ll_rangeiter_next() +BASE_INLINE_THRESHOLD = 18 # just enough to inline ll_rangeiter_next() class CannotInline(Exception): pass @@ -251,32 +251,29 @@ # Automatic inlining def measure_median_execution_cost(graph): - linktargets = [graph.startblock] - linkmap = {} + blocks = [] + blockmap = {} for node in flatten(graph): - if isinstance(node, Link): - linkmap[node] = len(linktargets) - linktargets.append(node.target) - matrix = [] + if isinstance(node, Block): + blockmap[node] = len(blocks) + blocks.append(node) + M = sparsemat.SparseMatrix(len(blocks)) vector = [] - for i, target in zip(range(len(linktargets)), linktargets): - vector.append(len(target.operations)) - row = [0.0] * len(linktargets) - row[i] = 1.0 - if target.exits: - f = 1.0 / len(target.exits) - for nextlink in target.exits: - row[linkmap[nextlink]] -= f - matrix.append(row) - M = matfunc.Mat(matrix) - V = matfunc.Vec(vector) - # we must solve: M * (vector x1...xn) = V + for i, block in enumerate(blocks): + vector.append(len(block.operations)) + M[i, i] = 1 + if block.exits: + f = 1.0 / len(block.exits) + for link in block.exits: + M[i, blockmap[link.target]] -= f try: - Solution = M._solve(V) - except (OverflowError, ValueError): + Solution = M.solve(vector) + except ValueError: return sys.maxint else: - return Solution[0] + res = Solution[blockmap[graph.startblock]] + assert res >= 0 + return res def static_instruction_count(graph): count = 0 @@ -287,7 +284,7 @@ def inlining_heuristic(graph): # XXX ponderation factors? - return ( #0.819487132 * measure_median_execution_cost(graph) + + return (0.9999 * measure_median_execution_cost(graph) + static_instruction_count(graph)) Added: pypy/dist/pypy/translator/backendopt/sparsemat.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/backendopt/sparsemat.py Wed Sep 14 12:56:58 2005 @@ -0,0 +1,95 @@ +from __future__ import division + +EPSILON = 1E-6 + + +class SparseMatrix: + + def __init__(self, height): + self.lines = [{} for row in range(height)] + + def __getitem__(self, (row, col)): + return self.lines[row].get(col, 0) + + def __setitem__(self, (row, col), value): + if abs(value) > EPSILON: + self.lines[row][col] = value + else: + try: + del self.lines[row][col] + except KeyError: + pass + + def copy(self): + m = SparseMatrix(len(self.lines)) + for line1, line2 in zip(self.lines, m.lines): + line2.update(line1) + return m + + def solve(self, vector): + """Solves 'self * [x1...xn] == vector'; returns the list [x1...xn]. + Raises ValueError if no solution or indeterminate. 
+ """ + vector = list(vector) + lines = [line.copy() for line in self.lines] + columns = [{} for i in range(len(vector))] + for i, line in enumerate(lines): + for j, a in line.items(): + columns[j][i] = a + lines_left = dict.fromkeys(range(len(self.lines))) + nrows = [] + for ncol in range(len(vector)): + currentcolumn = columns[ncol] + lst = [(abs(a), i) for (i, a) in currentcolumn.items() + if i in lines_left] + _, nrow = min(lst) # ValueError -> no solution + nrows.append(nrow) + del lines_left[nrow] + line1 = lines[nrow] + mina = line1[ncol] + for _, i in lst: + if i != nrow: + line2 = lines[i] + a = line2.pop(ncol) + #del currentcolumn[i] -- but currentcolumn no longer used + factor = a / mina + vector[i] -= factor*vector[nrow] + for col in line1: + if col > ncol: + value = line2.get(col, 0) - factor*line1[col] + if abs(value) > EPSILON: + line2[col] = columns[col][i] = value + else: + del line2[col] + del columns[col][i] + solution = [None] * len(vector) + for i in range(len(vector)-1, -1, -1): + row = nrows[i] + line = lines[row] + total = vector[row] + for j, a in line.items(): + if j != i: + total -= a * solution[j] + solution[i] = total / line[i] + return solution + + +def test_sparsemat1(): + import py + M = SparseMatrix(4) + M[0,0] = M[1,1] = M[2,2] = M[3,3] = 1 + M[0,1] = -1.0 + M[1,2] = M[1,3] = -0.5 + M[2,1] = -1.0 + res = M.solve([4, 5, 4, 1]) + assert res == [19, 15, 19, 1] + +def test_sparsemat2(): + import py + M = SparseMatrix(4) + M[0,0] = M[1,1] = M[2,2] = M[3,3] = 1 + M[0,1] = -1.0 + M[1,2] = M[1,3] = -0.5 + M[2,1] = M[2,3] = -0.5 + res = M.solve([6, 3, 6, 0]) + assert res == [14, 8, 10, 0] Modified: pypy/dist/pypy/translator/backendopt/test/test_all.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_all.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_all.py Wed Sep 14 12:56:58 2005 @@ -50,3 +50,24 @@ interp = LLInterpreter(t.flowgraphs, t.rtyper) res = interp.eval_function(big, []) assert res == 83 + + + +def test_for_loop(): + def f(n): + total = 0 + for i in range(n): + total += i + return total + t = Translator(f) + t.annotate([int]) + t.specialize() + t.backend_optimizations() + # this also checks that the BASE_INLINE_THRESHOLD is enough for 'for' loops + + graph = t.getflowgraph() + check_malloc_removed(graph) + + interp = LLInterpreter(t.flowgraphs, t.rtyper) + res = interp.eval_function(f, [11]) + assert res == 55 Modified: pypy/dist/pypy/translator/backendopt/test/test_inline.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_inline.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_inline.py Wed Sep 14 12:56:58 2005 @@ -4,6 +4,7 @@ from pypy.translator.backendopt.inline import inline_function, CannotInline from pypy.translator.backendopt.inline import auto_inlining from pypy.translator.backendopt.inline import collect_called_functions +from pypy.translator.backendopt.inline import measure_median_execution_cost from pypy.translator.translator import Translator from pypy.rpython.llinterp import LLInterpreter from pypy.translator.test.snippet import is_perfect_number @@ -303,3 +304,24 @@ interp = check_inline(f2, f, [int]) result = interp.eval_function(f, [98371]) assert result == 42 + +def test_measure_median_execution_cost(): + def f(x): + x += 1 + x += 1 + x += 1 + while True: + x += 1 + x += 1 + x += 1 + if x: break + x += 1 + x += 1 + x += 1 + x += 1 + x += 1 + x 
+= 1
+        return x
+    t = Translator(f)
+    res = measure_median_execution_cost(t.getflowgraph())
+    assert res == 17

From pedronis at codespeak.net Wed Sep 14 13:38:07 2005
From: pedronis at codespeak.net (pedronis at codespeak.net)
Date: Wed, 14 Sep 2005 13:38:07 +0200 (CEST)
Subject: [pypy-svn] r17556 - pypy/dist/pypy/doc
Message-ID: <20050914113807.EDAED27B95@code1.codespeak.net>

Author: pedronis
Date: Wed Sep 14 13:38:06 2005
New Revision: 17556

Modified:
   pypy/dist/pypy/doc/draft-dynamic-language-translation.txt
Log:
some overview-ish flow space op description


Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt
==============================================================================
--- pypy/dist/pypy/doc/draft-dynamic-language-translation.txt	(original)
+++ pypy/dist/pypy/doc/draft-dynamic-language-translation.txt	Wed Sep 14 13:38:06 2005
@@ -225,8 +225,47 @@
 space, and supplying a derived execution context implementation.
 It also wrap a fix-point loop around invocations of the frame resume
 method which is forced to execute one single bytecode through
-exceptions reaching this loop from the space operations'
-code and the specialised execution context.
+exceptions reaching this loop from the space operations' code and the
+specialised execution context.
+
+The domain on which the Flow Space operates comprises variables and
+constant objects. They can be stored as such in the frame objects
+without problems because, by design, the interpreter engine treats
+them neutrally.
+
+The Flow Space can synthesise so-called frame states out of a frame's
+contents. A frame state describes the execution state of the frame at
+a given point.
+
+The Flow Space constructs the flow graph by creating new blocks in it
+whenever a fresh, never-seen state is reached. During construction,
+every block in the graph has an associated frame state. The Flow
+Space starts from an empty block whose frame state corresponds to the
+setup induced by the input arguments of the analysed function, in the
+form of variables and constants.
+
+When an operation is delegated to the Flow Space by the frame
+interpretation loop, either a constant result is produced (when the
+arguments are constants and the operation has no side-effects), or
+the operation is recorded in the current block and a fresh new
+variable is returned as its result.
+
+When a new bytecode is about to be executed, as signalled by the
+bytecode hook, the Flow Space considers the frame state corresponding
+to the current frame contents. The Flow Space keeps a mapping from
+bytecode positions to (frame state, block) pairs.
+
+A union operation is defined on frame states: two equal constants
+unify to a constant of the same value, while every other combination
+unifies to a fresh new variable (for example, two occurrences of the
+constant 3 unify to the constant 3, whereas the constant 3 and the
+constant 4, or a constant and a variable, unify to a new variable).
+
+If some previously associated frame state for the next bytecode
+unifies with the current state into a more general state, i.e. an
+unequal one, the corresponding block is reused and reset. Otherwise a
+new block is used.
+
+XXX non mergeable data, details
+XXX conditionals, multiple pending blocks

From ac at codespeak.net Wed Sep 14 14:05:11 2005
From: ac at codespeak.net (ac at codespeak.net)
Date: Wed, 14 Sep 2005 14:05:11 +0200 (CEST)
Subject: [pypy-svn] r17557 - in pypy/dist/pypy: .
interpreter/astcompiler Message-ID: <20050914120511.D138227B95@code1.codespeak.net> Author: ac Date: Wed Sep 14 14:05:11 2005 New Revision: 17557 Modified: pypy/dist/pypy/conftest.py pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Log: Add --compiler option to py.test in pypy. Modified: pypy/dist/pypy/conftest.py ============================================================================== --- pypy/dist/pypy/conftest.py (original) +++ pypy/dist/pypy/conftest.py Wed Sep 14 14:05:11 2005 @@ -36,6 +36,9 @@ Option('--usemodules', action="callback", type="string", metavar="NAME", callback=usemodules_callback, default=[], help="(mixed) modules to use."), + Option('--compiler', action="store", type="string", dest="compiler", + metavar="[stable|_stable|ast|cpython]", default='stable', + help="""select compiling approach. see pypy/doc/README.compiling""") ) _SPACECACHE={} @@ -57,6 +60,7 @@ kwds.setdefault('nofaking', option.nofaking) kwds.setdefault('oldstyle', option.oldstyle) kwds.setdefault('usemodules', option.usemodules) + kwds.setdefault('compiler', option.compiler) space = Space(**kwds) except KeyboardInterrupt: raise Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Wed Sep 14 14:05:11 2005 @@ -339,6 +339,8 @@ default.accept( self ) frees = gen.scope.get_free_vars() if frees: + # We contain a func with free vars. + # Any unqualified exec or import * is a SyntaxError for name in frees: self.emitop('LOAD_CLOSURE', name) self.emitop_code('LOAD_CONST', gen) @@ -361,12 +363,15 @@ base.accept( self ) self.emitop_int('BUILD_TUPLE', len(node.bases)) frees = gen.scope.get_free_vars() - for name in frees: - self.emitop('LOAD_CLOSURE', name) - self.emitop_code('LOAD_CONST', gen) if frees: + # We contain a func with free vars. + # Any unqualified exec or import * is a SyntaxError + for name in frees: + self.emitop('LOAD_CLOSURE', name) + self.emitop_code('LOAD_CONST', gen) self.emitop_int('MAKE_CLOSURE', 0) else: + self.emitop_code('LOAD_CONST', gen) self.emitop_int('MAKE_FUNCTION', 0) self.emitop_int('CALL_FUNCTION', 0) self.emit('BUILD_CLASS') @@ -603,6 +608,8 @@ self.set_lineno(node) frees = gen.scope.get_free_vars() if frees: + # We contain a func with free vars. + # Any unqualified exec or import * is a SyntaxError for name in frees: self.emitop('LOAD_CLOSURE', name) self.emitop_code('LOAD_CONST', gen) From pedronis at codespeak.net Wed Sep 14 14:17:53 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Wed, 14 Sep 2005 14:17:53 +0200 (CEST) Subject: [pypy-svn] r17558 - pypy/dist/pypy/rpython/test Message-ID: <20050914121753.C682427B95@code1.codespeak.net> Author: pedronis Date: Wed Sep 14 14:17:52 2005 New Revision: 17558 Modified: pypy/dist/pypy/rpython/test/test_rspecialcase.py Log: fix import Modified: pypy/dist/pypy/rpython/test/test_rspecialcase.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rspecialcase.py (original) +++ pypy/dist/pypy/rpython/test/test_rspecialcase.py Wed Sep 14 14:17:52 2005 @@ -31,10 +31,11 @@ except: return "hello!" 
t, typer = gengraph(g, [int]) - from pypy.translator import backendoptimization, simplify + from pypy.translator import simplify + from pypy.translator.backendopt import removenoops from pypy.objspace.flow.model import checkgraph graph = t.getflowgraph(g) - backendoptimization.remove_same_as(graph) + removenoops.remove_same_as(graph) simplify.eliminate_empty_blocks(graph) #should not crash: checkgraph(graph) From arigo at codespeak.net Wed Sep 14 15:16:04 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Wed, 14 Sep 2005 15:16:04 +0200 (CEST) Subject: [pypy-svn] r17560 - in pypy/dist/pypy/translator/backendopt: . test Message-ID: <20050914131604.C32C327B9C@code1.codespeak.net> Author: arigo Date: Wed Sep 14 15:16:03 2005 New Revision: 17560 Modified: pypy/dist/pypy/translator/backendopt/all.py pypy/dist/pypy/translator/backendopt/test/test_all.py Log: Disable inlining and malloc removal for now -- it seems to produce strange infinite loops in 'pypy-c'. Modified: pypy/dist/pypy/translator/backendopt/all.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/all.py (original) +++ pypy/dist/pypy/translator/backendopt/all.py Wed Sep 14 15:16:03 2005 @@ -6,7 +6,9 @@ from pypy.translator import simplify -def backend_optimizations(translator, inline_threshold=1, ssa_form=True): +def backend_optimizations(translator, inline_threshold=0, # XXX in-progress, should be 1 + mallocs=False, # XXX in-progress + ssa_form=True): # remove obvious no-ops for graph in translator.flowgraphs.values(): remove_same_as(graph) @@ -17,12 +19,13 @@ auto_inlining(translator, inline_threshold) # vaporize mallocs - for graph in translator.flowgraphs.values(): - if remove_simple_mallocs(graph): - # remove typical leftovers from malloc removal - remove_same_as(graph) - simplify.eliminate_empty_blocks(graph) - simplify.transform_dead_op_vars(graph) + if mallocs: + for graph in translator.flowgraphs.values(): + if remove_simple_mallocs(graph): + # remove typical leftovers from malloc removal + remove_same_as(graph) + simplify.eliminate_empty_blocks(graph) + simplify.transform_dead_op_vars(graph) if ssa_form: for graph in translator.flowgraphs.values(): Modified: pypy/dist/pypy/translator/backendopt/test/test_all.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_all.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_all.py Wed Sep 14 15:16:03 2005 @@ -42,7 +42,7 @@ t = Translator(big) t.annotate([]) t.specialize() - backend_optimizations(t, inline_threshold=100) + backend_optimizations(t, inline_threshold=100, mallocs=True) graph = t.getflowgraph() check_malloc_removed(graph) @@ -62,7 +62,7 @@ t = Translator(f) t.annotate([int]) t.specialize() - t.backend_optimizations() + t.backend_optimizations(inline_threshold=1, mallocs=True) # this also checks that the BASE_INLINE_THRESHOLD is enough for 'for' loops graph = t.getflowgraph() From ac at codespeak.net Wed Sep 14 16:06:36 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Wed, 14 Sep 2005 16:06:36 +0200 (CEST) Subject: [pypy-svn] r17565 - in pypy/dist/pypy/interpreter: astcompiler pyparser pyparser/test test Message-ID: <20050914140636.2235127BC7@code1.codespeak.net> Author: ac Date: Wed Sep 14 16:06:35 2005 New Revision: 17565 Modified: pypy/dist/pypy/interpreter/astcompiler/consts.py pypy/dist/pypy/interpreter/astcompiler/pyassem.py pypy/dist/pypy/interpreter/astcompiler/pycodegen.py 
pypy/dist/pypy/interpreter/astcompiler/symbols.py pypy/dist/pypy/interpreter/pyparser/astbuilder.py pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py pypy/dist/pypy/interpreter/test/test_compiler.py Log: (Arre, Samuele) Port the scope ambiguity fix from stablecompiler and make gen-exprs work (allmost). Modified: pypy/dist/pypy/interpreter/astcompiler/consts.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/consts.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/consts.py Wed Sep 14 16:06:35 2005 @@ -8,7 +8,7 @@ SC_FREE = 3 SC_CELL = 4 SC_UNKNOWN = 5 -SC_REALLY_GLOBAL = 6 +SC_DEFAULT = 6 CO_OPTIMIZED = 0x0001 CO_NEWLOCALS = 0x0002 Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pyassem.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pyassem.py Wed Sep 14 16:06:35 2005 @@ -434,7 +434,8 @@ class PyFlowGraph(FlowGraph): - def __init__(self, space, name, filename, args=None, optimized=0, klass=0): + def __init__(self, space, name, filename, args=None, optimized=0, + klass=0, newlocals=0): FlowGraph.__init__(self, space) if args is None: args = [] @@ -444,10 +445,12 @@ self.args = args # XXX self.argcount = getArgCount(args) self.klass = klass + self.flags = 0 if optimized: - self.flags = CO_OPTIMIZED | CO_NEWLOCALS - else: - self.flags = 0 + self.flags |= CO_OPTIMIZED + if newlocals: + self.flags |= CO_NEWLOCALS + self.consts = [] self.names = [] # Free variables found by the symbol table scan, including Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Wed Sep 14 16:06:35 2005 @@ -9,7 +9,7 @@ from pypy.interpreter.astcompiler import ast, parse, walk, syntax from pypy.interpreter.astcompiler import pyassem, misc, future, symbols from pypy.interpreter.astcompiler.consts import SC_LOCAL, SC_GLOBAL, \ - SC_FREE, SC_CELL, SC_REALLY_GLOBAL + SC_FREE, SC_CELL, SC_DEFAULT from pypy.interpreter.astcompiler.consts import CO_VARARGS, CO_VARKEYWORDS, \ CO_NEWLOCALS, CO_NESTED, CO_GENERATOR, CO_GENERATOR_ALLOWED, CO_FUTURE_DIVISION from pypy.interpreter.astcompiler.pyassem import TupleArg @@ -140,7 +140,7 @@ """Defines basic code generator for Python bytecode """ - + scopeambiguity = False def __init__(self, space, graph): self.space = space @@ -219,9 +219,19 @@ self._nameOp('STORE', name) def loadName(self, name): + if (self.scope.nested and self.scopeambiguity and + name in self.scope.hasbeenfree): + raise SyntaxError("cannot reference variable '%s' because " + "of ambiguity between " + "scopes" % name) + self._nameOp('LOAD', name) def delName(self, name): + scope = self.scope.check_name(name) + if scope == SC_CELL: + raise SyntaxError("can not delete variable '%s' " + "referenced in nested scope" % name) self._nameOp('DELETE', name) def _nameOp(self, prefix, name): @@ -233,14 +243,14 @@ else: self.emitop(prefix + '_FAST', name) elif scope == SC_GLOBAL: - if not self.optimized: - self.emitop(prefix + '_NAME', name) - else: - self.emitop(prefix + '_GLOBAL', name) + self.emitop(prefix + '_GLOBAL', name) elif scope == SC_FREE or scope == SC_CELL: self.emitop(prefix + '_DEREF', name) - elif scope == SC_REALLY_GLOBAL: - self.emitop(prefix + '_GLOBAL', name) + elif 
scope == SC_DEFAULT: + if self.optimized and self.localsfullyknown: + self.emitop(prefix + '_GLOBAL', name) + else: + self.emitop(prefix + '_NAME', name) else: raise RuntimeError, "unsupported scope for var %s: %d" % \ (name, scope) @@ -331,7 +341,8 @@ ndecorators = 0 gen = FunctionCodeGenerator(self.space, node, isLambda, - self.class_name, self.get_module()) + self.class_name, self.get_module(), + self.scopeambiguity) walk(node.code, gen) gen.finish() self.set_lineno(node) @@ -354,7 +365,8 @@ def visitClass(self, node): gen = ClassCodeGenerator(self.space, node, - self.get_module()) + self.get_module(), + self.scopeambiguity) walk(node.code, gen) gen.finish() self.set_lineno(node) @@ -600,7 +612,7 @@ def visitGenExpr(self, node): gen = GenExprCodeGenerator(self.space, node, self.class_name, - self.get_module()) + self.get_module(), self.scopeambiguity) inner = node.code assert isinstance(inner, ast.GenExprInner) walk(inner, gen) @@ -836,8 +848,6 @@ self.emitop('IMPORT_NAME', node.modname) for name, alias in node.names: if name == '*': - if self.scope.nested: - raise SyntaxError('import * is not allowed in a nested function') self.namespace = 0 self.emit('IMPORT_STAR') # There can only be one name w/ from ... import * @@ -925,8 +935,6 @@ } def visitExec(self, node): - if self.scope.nested and node.locals is None and node.globals is None: - raise SyntaxError('unqualified exec is not allowed in a nested function') node.expr.accept( self ) if node.locals is None: self.emitop_obj('LOAD_CONST', self.space.w_None) @@ -1205,21 +1213,20 @@ node.expr.accept( self ) self.emit('PRINT_EXPR') -AbstractFunctionCodeLambdaCounter = misc.Counter(0) - class AbstractFunctionCode(CodeGenerator): def __init__(self, space, func, isLambda, class_name, mod): self.class_name = class_name self.module = mod if isLambda: - name = "" % AbstractFunctionCodeLambdaCounter.next() + name = "" else: assert isinstance(func, ast.Function) name = func.name args, hasTupleArg = generateArgList(func.argnames) graph = pyassem.PyFlowGraph(space, name, func.filename, args, - optimized=1) + optimized=self.localsfullyknown, + newlocals=1) self.isLambda = isLambda CodeGenerator.__init__(self, space, graph) self.optimized = 1 @@ -1270,10 +1277,13 @@ class FunctionCodeGenerator(AbstractFunctionCode): - def __init__(self, space, func, isLambda, class_name, mod): + def __init__(self, space, func, isLambda, class_name, mod, parentscopeambiguity): assert func.scope is not None self.scope = func.scope + self.localsfullyknown = self.scope.localsfullyknown + self.scopeambiguity = (not self.localsfullyknown or parentscopeambiguity) AbstractFunctionCode.__init__(self, space, func, isLambda, class_name, mod) + self.graph.setFreeVars(self.scope.get_free_vars()) self.graph.setCellVars(self.scope.get_cell_vars()) if self.scope.generator: @@ -1281,9 +1291,12 @@ class GenExprCodeGenerator(AbstractFunctionCode): - def __init__(self, space, gexp, class_name, mod): + def __init__(self, space, gexp, class_name, mod, parentscopeambiguity): assert gexp.scope is not None self.scope = gexp.scope + self.localsfullyknown = self.scope.localsfullyknown + self.scopeambiguity = (not self.localsfullyknown or parentscopeambiguity) + AbstractFunctionCode.__init__(self, space, gexp, 1, class_name, mod) self.graph.setFreeVars(self.scope.get_free_vars()) self.graph.setCellVars(self.scope.get_cell_vars()) @@ -1296,6 +1309,7 @@ self.module = module graph = pyassem.PyFlowGraph( space, klass.name, klass.filename, optimized=0, klass=1) + CodeGenerator.__init__(self, space, 
graph) self.graph.setFlag(CO_NEWLOCALS) if not space.is_w(klass.doc, space.w_None): @@ -1311,9 +1325,10 @@ class ClassCodeGenerator(AbstractClassCode): - def __init__(self, space, klass, module): + def __init__(self, space, klass, module, parentscopeambiguity): assert klass.scope is not None self.scope = klass.scope + self.scopeambiguity = parentscopeambiguity AbstractClassCode.__init__(self, space, klass, module) self.graph.setFreeVars(self.scope.get_free_vars()) self.graph.setCellVars(self.scope.get_cell_vars()) Modified: pypy/dist/pypy/interpreter/astcompiler/symbols.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/symbols.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/symbols.py Wed Sep 14 16:06:35 2005 @@ -2,7 +2,7 @@ from pypy.interpreter.astcompiler import ast from pypy.interpreter.astcompiler.consts import SC_LOCAL, SC_GLOBAL, \ - SC_FREE, SC_CELL, SC_UNKNOWN, SC_REALLY_GLOBAL + SC_FREE, SC_CELL, SC_UNKNOWN, SC_DEFAULT from pypy.interpreter.astcompiler.misc import mangle, Counter from pypy.interpreter.pyparser.error import SyntaxError import types @@ -13,6 +13,7 @@ MANGLE_LEN = 256 class Scope: + localsfullyknown = True # XXX how much information do I need about each name? def __init__(self, name, module, klass=None): self.name = name @@ -22,6 +23,7 @@ self.globals = {} self.params = {} self.frees = {} + self.hasbeenfree = {} self.cells = {} self.children = [] # nested is true if the class could contain free variables, @@ -91,7 +93,7 @@ The scope of a name could be LOCAL, GLOBAL, FREE, or CELL. """ if name in self.globals: - return SC_REALLY_GLOBAL + return SC_GLOBAL if name in self.cells: return SC_CELL if name in self.defs: @@ -102,7 +104,7 @@ if self.nested: return SC_UNKNOWN else: - return SC_GLOBAL + return SC_DEFAULT def get_free_vars(self): if not self.nested: @@ -113,6 +115,7 @@ if not (name in self.defs or name in self.globals): free[name] = 1 + self.hasbeenfree.update(free) return free.keys() def handle_children(self): @@ -135,7 +138,8 @@ Be careful to stop if a child does not think the name is free. 
""" - self.globals[name] = 1 + if name not in self.defs: + self.globals[name] = 1 if name in self.frees: del self.frees[name] for child in self.children: @@ -156,7 +160,7 @@ if sc == SC_UNKNOWN or sc == SC_FREE \ or isinstance(self, ClassScope): self.frees[name] = 1 - elif sc == SC_GLOBAL or sc == SC_REALLY_GLOBAL: + elif sc == SC_DEFAULT or sc == SC_GLOBAL: child_globals.append(name) elif isinstance(self, FunctionScope) and sc == SC_LOCAL: self.cells[name] = 1 @@ -262,6 +266,12 @@ self.pop_scope() self.handle_free_vars(scope, parent) + def visitExec(self, node): + if not (node.globals or node.locals): + parent = self.cur_scope() + parent.localsfullyknown = False # bare exec statement + ast.ASTVisitor.visitExec(self, node) + def visitGenExpr(self, node ): parent = self.cur_scope() scope = GenExprScope(self.module, self.klass); @@ -375,6 +385,7 @@ scope = self.cur_scope() for name, asname in node.names: if name == "*": + scope.localsfullyknown = False continue scope.add_def(asname or name) Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Wed Sep 14 16:06:35 2005 @@ -795,6 +795,7 @@ # GenExpr(GenExprInner(Name('i'), [GenExprFor(AssName('i', 'OP_ASSIGN'), Name('j'), [])])))])) expr = atoms[0] genexpr_for = parse_genexpr_for(atoms[1:]) + genexpr_for[0].is_outmost = True builder.push(ast.GenExpr(ast.GenExprInner(expr, genexpr_for))) return builder.push(ast.Tuple(items)) Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py Wed Sep 14 16:06:35 2005 @@ -31,7 +31,7 @@ listmakers, dictmakers, multiexpr, - # genexps, investigate? + genexps, attraccess, slices, imports, Modified: pypy/dist/pypy/interpreter/test/test_compiler.py ============================================================================== --- pypy/dist/pypy/interpreter/test/test_compiler.py (original) +++ pypy/dist/pypy/interpreter/test/test_compiler.py Wed Sep 14 16:06:35 2005 @@ -197,12 +197,6 @@ def setup_method(self, method): self.compiler = PythonAstCompiler(self.space) - def test_scope_importstar_with_nested_free(self): - py.test.skip("INPROGESS") - - def test_scope_exec_with_nested_free(self): - py.test.skip("INPROGESS") - class SkippedForNowTestPyPyCompiler(BaseTestCompiler): def setup_method(self, method): self.compiler = PyPyCompiler(self.space) From ac at codespeak.net Wed Sep 14 17:16:44 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Wed, 14 Sep 2005 17:16:44 +0200 (CEST) Subject: [pypy-svn] r17568 - in pypy/dist/pypy/interpreter/pyparser: . test Message-ID: <20050914151644.0379927BC7@code1.codespeak.net> Author: ac Date: Wed Sep 14 17:16:40 2005 New Revision: 17568 Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py Log: Fix parsing of generator expression as single argument to a function. 
Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Wed Sep 14 17:16:40 2005 @@ -96,6 +96,15 @@ index += 2 # Skip COMMA and DOUBLESTAR dstararg_token = tokens[index] break + elif cur_token.get_value() == 'for': + if len(arguments) != 1: + raise ValueError('SyntaxError("invalid syntax")') # xxx lineno... + expr = arguments[0] + genexpr_for = parse_genexpr_for(tokens[index-1:]) + genexpr_for[0].is_outmost = True + gexp = ast.GenExpr(ast.GenExprInner(expr, genexpr_for)) + arguments[0] = gexp + break return arguments, stararg_token, dstararg_token Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py Wed Sep 14 17:16:40 2005 @@ -170,6 +170,7 @@ "l = [i for i in ( j*2 for j in range(10) ) ]", "l = (i for i in [ j*2 for j in ( k*3 for k in range(10) ) ] )", "l = [i for j in ( j*2 for j in [ k*3 for k in range(10) ] ) ]", + "l = f(i for i in j)", ] From arigo at codespeak.net Wed Sep 14 18:10:18 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Wed, 14 Sep 2005 18:10:18 +0200 (CEST) Subject: [pypy-svn] r17570 - in pypy/dist/pypy/translator/backendopt: . test Message-ID: <20050914161018.9A82527BC7@code1.codespeak.net> Author: arigo Date: Wed Sep 14 18:10:17 2005 New Revision: 17570 Modified: pypy/dist/pypy/translator/backendopt/malloc.py pypy/dist/pypy/translator/backendopt/removenoops.py pypy/dist/pypy/translator/backendopt/test/test_removenoops.py Log: Bug, test and fix of remove_same_as(). 
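The corner case behind this fix: after inlining, a block can be left with an
operation like 'v21 = same_as(Constant(True))' whose result is the block's
exitswitch, so removing the same_as must also fold the switch down to the single
matching exit. A self-contained toy sketch of just that folding step (hypothetical
minimal Block/Link classes, not the real pypy.objspace.flow.model ones)::

    class Link:
        def __init__(self, exitcase, target):
            self.exitcase, self.target = exitcase, target

    class Block:
        def __init__(self, exits, exitswitch):
            self.exits, self.exitswitch = exits, exitswitch

    def fold_constant_exitswitch(block, const_value):
        # keep only the exit whose case matches the constant, drop the switch
        survivors = [link for link in block.exits if link.exitcase == const_value]
        assert len(survivors) == 1
        survivors[0].exitcase = None
        block.exitswitch = None
        block.exits = survivors

    block = Block([Link(False, 'return 666'), Link(True, 'return 42')], 'v21')
    fold_constant_exitswitch(block, True)   # v21 was same_as(True)
    assert block.exits[0].target == 'return 42'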
Modified: pypy/dist/pypy/translator/backendopt/malloc.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/malloc.py (original) +++ pypy/dist/pypy/translator/backendopt/malloc.py Wed Sep 14 18:10:17 2005 @@ -60,6 +60,8 @@ if isinstance(op.args[i], Variable): set_use_point(node, op.args[i], "op", node, op, i) set_creation_point(node, op.result, "op", node, op) + if isinstance(node.exitswitch, Variable): + set_use_point(node, node.exitswitch, "exitswitch", node) if isinstance(node, Link): if isinstance(node.last_exception, Variable): @@ -168,6 +170,7 @@ else: newops.append(op) block.operations[:] = newops + assert block.exitswitch not in vars for link in block.exits: newargs = [] Modified: pypy/dist/pypy/translator/backendopt/removenoops.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/removenoops.py (original) +++ pypy/dist/pypy/translator/backendopt/removenoops.py Wed Sep 14 18:10:17 2005 @@ -1,4 +1,4 @@ -from pypy.objspace.flow.model import Block +from pypy.objspace.flow.model import Block, Variable, Constant from pypy.objspace.flow.model import traverse from pypy.rpython.lltype import Void @@ -28,7 +28,18 @@ if link.args[i] == same_as_result: link.args[i] = same_as_arg if block.exitswitch == same_as_result: - block.exitswitch = same_as_arg + if isinstance(same_as_arg, Variable): + block.exitswitch = same_as_arg + else: + assert isinstance(same_as_arg, Constant) + newexits = [link for link in block.exits + if link.exitcase == same_as_arg.value] + assert len(newexits) == 1 + newexits[0].exitcase = None + if hasattr(newexits[0], 'llexitcase'): + newexits[0].llexitcase = None + block.exitswitch = None + block.recloseblock(*newexits) block.operations[index] = None # remove all same_as operations Modified: pypy/dist/pypy/translator/backendopt/test/test_removenoops.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_removenoops.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_removenoops.py Wed Sep 14 18:10:17 2005 @@ -1,7 +1,8 @@ -from pypy.translator.backendopt.removenoops import remove_void +from pypy.translator.backendopt.removenoops import remove_void, remove_same_as +from pypy.translator.backendopt.inline import inline_function from pypy.translator.translator import Translator from pypy.translator.test.snippet import simple_method -from pypy.objspace.flow.model import checkgraph +from pypy.objspace.flow.model import checkgraph, flatten, Block from pypy.rpython.lltype import Void from pypy.rpython.llinterp import LLInterpreter @@ -41,3 +42,29 @@ # assert _type is not Void #interp = LLInterpreter(t.flowgraphs, t.rtyper) #assert interp.eval_function(f, [0]) == 1 + +def test_remove_same_as(): + def nothing(x): + return x + def f(): + nothing(False) + if nothing(True): + return 42 + else: + return 666 + t = Translator(f) + a = t.annotate([]) + t.specialize() + # now we make the 'if True' appear + inline_function(t, nothing, t.flowgraphs[f]) + # here, the graph looks like v21=same_as(True); exitswitch: v21 + remove_same_as(t.flowgraphs[f]) + t.checkgraphs() + # only one path should be left + for node in flatten(t.flowgraphs[f]): + if isinstance(node, Block): + assert len(node.exits) <= 1 + + interp = LLInterpreter(t.flowgraphs, t.rtyper) + result = interp.eval_function(f, []) + assert result == 42 From arigo at codespeak.net Wed Sep 14 19:39:24 2005 From: 
arigo at codespeak.net (arigo at codespeak.net)
Date: Wed, 14 Sep 2005 19:39:24 +0200 (CEST)
Subject: [pypy-svn] r17571 - pypy/dist/pypy/translator/backendopt
Message-ID: <20050914173924.45A2F27B98@code1.codespeak.net>

Author: arigo
Date: Wed Sep 14 19:39:23 2005
New Revision: 17571

Modified:
   pypy/dist/pypy/translator/backendopt/all.py
Log:
Inlining and malloc removal seem to work now...


Modified: pypy/dist/pypy/translator/backendopt/all.py
==============================================================================
--- pypy/dist/pypy/translator/backendopt/all.py	(original)
+++ pypy/dist/pypy/translator/backendopt/all.py	Wed Sep 14 19:39:23 2005
@@ -6,8 +6,8 @@
 from pypy.translator import simplify
 
-def backend_optimizations(translator, inline_threshold=0, # XXX in-progress, should be 1
-                          mallocs=False, # XXX in-progress
+def backend_optimizations(translator, inline_threshold=1,
+                          mallocs=True,
                           ssa_form=True):
     # remove obvious no-ops
     for graph in translator.flowgraphs.values():
         remove_same_as(graph)
@@ -30,3 +30,5 @@
     if ssa_form:
         for graph in translator.flowgraphs.values():
             SSI_to_SSA(graph)
+
+    translator.checkgraphs()

From tismer at codespeak.net Thu Sep 15 01:38:21 2005
From: tismer at codespeak.net (tismer at codespeak.net)
Date: Thu, 15 Sep 2005 01:38:21 +0200 (CEST)
Subject: [pypy-svn] r17572 - pypy/extradoc/minute
Message-ID: <20050914233821.1904927BC0@code1.codespeak.net>

Author: tismer
Date: Thu Sep 15 01:38:19 2005
New Revision: 17572

Modified:
   pypy/extradoc/minute/pypy-sync-09-01-2005.txt   (props changed)
   pypy/extradoc/minute/pypy-sync-09-08-2005.txt   (props changed)
Log:
please make sure to add proper eol-style for every .txt file in the future.

From tismer at codespeak.net Thu Sep 15 04:30:38 2005
From: tismer at codespeak.net (tismer at codespeak.net)
Date: Thu, 15 Sep 2005 04:30:38 +0200 (CEST)
Subject: [pypy-svn] r17573 - pypy/dist/pypy/translator/goal
Message-ID: <20050915023038.D55D027BD2@code1.codespeak.net>

Author: tismer
Date: Thu Sep 15 04:30:37 2005
New Revision: 17573

Added:
   pypy/dist/pypy/translator/goal/bench-windows.py   (contents, props changed)
Log:
automated benchmarking under windows a little bit.
Here the current results:

executable            abs.richards  abs.pystone  rel.richards  rel.pystone
pypy-c-17439            35871 ms      634.714      42.1          62.8
pypy-c-17512            46878 ms      665.326      55.0          59.9
pypy-c-17516            38956 ms      707.266      45.7          56.3
pypy-c-17545-intern     35369 ms      763.616      41.5          52.2
pypy-c-17572            37032 ms      763.531      43.4          52.2
python 2.3.3              853 ms    39831.500       1.0           1.0

As a current result, inlining appears not to be a candidate worth optimizing, yet.


Added: pypy/dist/pypy/translator/goal/bench-windows.py
==============================================================================
--- (empty file)
+++ pypy/dist/pypy/translator/goal/bench-windows.py	Thu Sep 15 04:30:37 2005
@@ -0,0 +1,70 @@
+# benchmarks on a windows machine.
+# to be executed in the goal folder,
+# where a couple of .exe files is expected.
+ +import os, sys + +PYSTONE_CMD = 'from test import pystone;pystone.main(%s)' +PYSTONE_PATTERN = 'This machine benchmarks at' +RICHARDS_CMD = 'from richards import *;Richards.iterations=%d;main()' +RICHARDS_PATTERN = 'Average time for iterations:' + +def get_result(txt, pattern): + for line in txt.split('\n'): + if line.startswith(pattern): + break + else: + raise ValueError, 'this is no valid output' + return float(line.split()[len(pattern.split())]) + +def run_cmd(cmd): + print "running", cmd + pipe = os.popen(cmd + ' 2>&1') + result = pipe.read() + print "done" + return result + +def run_pystone(executable='python', n=0): + argstr = PYSTONE_CMD % (str(n) and n or '') + txt = run_cmd('%s -c "%s"' % (executable, argstr)) + return get_result(txt, PYSTONE_PATTERN) + +def run_richards(executable='python', n=10): + argstr = RICHARDS_CMD % n + txt = run_cmd('%s -c "%s"' % (executable, argstr)) + return get_result(txt, RICHARDS_PATTERN) + +def get_executables(): + exes = [name for name in os.listdir('.') if name.endswith('.exe')] + exes.sort() + return exes + +LAYOUT = ''' +executable abs.richards abs.pystone rel.richards rel.pystone +pypy-c-17439 40929 ms 637.274 47.8 56.6 +pypy-c-17512 46105 ms 658.1 53.9 54.8 +pypy-current 33937 ms 698.415 39.6 51.7 +python 2.3.3 856 ms 36081.6 1.0 1.0 +''' + +HEADLINE = '''\ +executable abs.richards abs.pystone rel.richards rel.pystone''' +FMT = '''\ +%-20s ''' + '%5d ms %9.3f ' + '%5.1f %5.1f' + +def main(): + print 'getting the richards reference' + ref_rich = run_richards() + print 'getting the pystone reference' + ref_stone = run_pystone() + res = [] + for exe in get_executables(): + exename = os.path.splitext(exe)[0] + res.append( (exename, run_richards(exe, 1), run_pystone(exe, 2000)) ) + res.append( ('python %s' % sys.version.split()[0], ref_rich, ref_stone) ) + print HEADLINE + for exe, rich, stone in res: + print FMT % (exe, rich, stone, rich / ref_rich, ref_stone / stone) + +if __name__ == '__main__': + main() From ludal at codespeak.net Thu Sep 15 11:59:22 2005 From: ludal at codespeak.net (ludal at codespeak.net) Date: Thu, 15 Sep 2005 11:59:22 +0200 (CEST) Subject: [pypy-svn] r17574 - pypy/dist/pypy/interpreter/pyparser Message-ID: <20050915095922.3AE8527BD2@code1.codespeak.net> Author: ludal Date: Thu Sep 15 11:59:20 2005 New Revision: 17574 Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py Log: fix slice index not proven nonnegative Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Thu Sep 15 11:59:20 2005 @@ -72,8 +72,8 @@ dstararg_token = None while index < l: cur_token = tokens[index] - index += 1 if not isinstance(cur_token, TokenObject): + index += 1 if not building_kw: arguments.append(cur_token) else: @@ -82,12 +82,16 @@ arguments.append(ast.Keyword(last_token.varname, cur_token)) building_kw = False kw_built = True + continue elif cur_token.name == tok.COMMA: + index += 1 continue elif cur_token.name == tok.EQUAL: + index += 1 building_kw = True continue elif cur_token.name == tok.STAR or cur_token.name == tok.DOUBLESTAR: + index += 1 if cur_token.name == tok.STAR: stararg_token = tokens[index] index += 1 @@ -100,7 +104,7 @@ if len(arguments) != 1: raise ValueError('SyntaxError("invalid syntax")') # xxx lineno... 
expr = arguments[0] - genexpr_for = parse_genexpr_for(tokens[index-1:]) + genexpr_for = parse_genexpr_for(tokens[index:]) genexpr_for[0].is_outmost = True gexp = ast.GenExpr(ast.GenExprInner(expr, genexpr_for)) arguments[0] = gexp From ericvrp at codespeak.net Thu Sep 15 12:52:09 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Thu, 15 Sep 2005 12:52:09 +0200 (CEST) Subject: [pypy-svn] r17575 - in pypy/dist/pypy/translator/llvm: . module Message-ID: <20050915105209.2013D27BD3@code1.codespeak.net> Author: ericvrp Date: Thu Sep 15 12:52:07 2005 New Revision: 17575 Modified: pypy/dist/pypy/translator/llvm/codewriter.py pypy/dist/pypy/translator/llvm/genllvm.py pypy/dist/pypy/translator/llvm/module/support.py Log: * Use backend_optimizations inliner and malloc remover * Little code cleanup of generated llvm code * Changed hacker-ish code for main(..) Will need to move this to the externs file. Modified: pypy/dist/pypy/translator/llvm/codewriter.py ============================================================================== --- pypy/dist/pypy/translator/llvm/codewriter.py (original) +++ pypy/dist/pypy/translator/llvm/codewriter.py Thu Sep 15 12:52:07 2005 @@ -74,7 +74,7 @@ if is_entrynode: linkage_type = '' else: - linkage_type = ' ' + linkage_type = '' #'internal ' self.append("%s%s %s {" % (linkage_type, cconv, decl,)) def closefunc(self): @@ -113,7 +113,9 @@ if cconv is not 'fastcc': tail_ = '' else: - tail_ = tail + ' ' + tail_ = tail + if tail_: + tail_ += ' ' arglist = ["%s %s" % item for item in zip(argtypes, argrefs)] self.indent("%s = %scall %s %s %s(%s)" % (targetvar, tail_, cconv, returntype, functionref, ", ".join(arglist))) @@ -122,7 +124,9 @@ if cconv is not 'fastcc': tail_ = '' else: - tail_ = tail + ' ' + tail_ = tail + if tail_: + tail_ += ' ' arglist = ["%s %s" % item for item in zip(argtypes, argrefs)] self.indent("%scall %s void %s(%s)" % (tail_, cconv, functionref, ", ".join(arglist))) Modified: pypy/dist/pypy/translator/llvm/genllvm.py ============================================================================== --- pypy/dist/pypy/translator/llvm/genllvm.py (original) +++ pypy/dist/pypy/translator/llvm/genllvm.py Thu Sep 15 12:52:07 2005 @@ -255,7 +255,10 @@ def compile_module(function, annotation, view=False, **kwds): t = Translator(function) a = t.annotate(annotation) + a.simplify() t.specialize() + t.backend_optimizations(ssa_form=False) + t.checkgraphs() if view: t.view() return genllvm(t, **kwds) Modified: pypy/dist/pypy/translator/llvm/module/support.py ============================================================================== --- pypy/dist/pypy/translator/llvm/module/support.py (original) +++ pypy/dist/pypy/translator/llvm/module/support.py Thu Sep 15 12:52:07 2005 @@ -264,7 +264,8 @@ extfunctions["%main"] = [(), """ int %main(int %argc, sbyte** %argv) { entry: - %pypy_argv = call fastcc %RPyListOfString* %pypy_ll_newlist__Ptr_GcStruct_listLlT_Signed(int 0) + ;%pypy_argv = call fastcc %RPyListOfString* %pypy_ll_newlist__Ptr_GcStruct_listLlT_Signed(int 0) + %pypy_argv = call fastcc %structtype.list* %pypy_ll_newlist__Ptr_GcStruct_listLlT_Signed(int 0) br label %no_exit no_exit: @@ -273,7 +274,8 @@ %tmp.8 = getelementptr sbyte** %argv, uint %indvar %tmp.9 = load sbyte** %tmp.8 %rpy = call fastcc %RPyString* %RPyString_FromString(sbyte* %tmp.9) - call fastcc void %pypy_ll_append__listPtr_rpy_stringPtr(%RPyListOfString* %pypy_argv, %RPyString* %rpy) + ;call fastcc void %pypy_ll_append__listPtr_rpy_stringPtr(%RPyListOfString* %pypy_argv, 
%RPyString* %rpy) + call fastcc void %pypy_ll_append__listPtr_rpy_stringPtr(%structtype.list* %pypy_argv, %RPyString* %rpy) %inc = add int %i.0.0, 1 %tmp.2 = setlt int %inc, %argc %indvar.next = add uint %indvar, 1 From tismer at codespeak.net Thu Sep 15 14:30:58 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Thu, 15 Sep 2005 14:30:58 +0200 (CEST) Subject: [pypy-svn] r17576 - pypy/extradoc/minute Message-ID: <20050915123058.3385627BDF@code1.codespeak.net> Author: tismer Date: Thu Sep 15 14:30:56 2005 New Revision: 17576 Added: pypy/extradoc/minute/pypy-sync-09-15-2005.txt (contents, props changed) Log: summary and IRC log of today's meeting Added: pypy/extradoc/minute/pypy-sync-09-15-2005.txt ============================================================================== --- (empty file) +++ pypy/extradoc/minute/pypy-sync-09-15-2005.txt Thu Sep 15 14:30:56 2005 @@ -0,0 +1,274 @@ +============================================= +pypy-sync developer meeting 15th September +============================================= + +Time & location: 1pm (30 minutes) at #pypy-sync + +Attendees:: + + Samuele Pedroni + Anders Lehmann + Anders Chrigstr?m + Christian Tismer (minutes/moderation) + Carl Friedrich Bolz + Holger Krekel + Eric van Riet Paap + Niklaus Heidimann + +Regular Topics +==================== + +- activity reports (3 prepared lines of info). + All Attendees submitted activity reports (see `IRC-Log`_ + at the end and 'LAST/NEXT/BLOCKERS' entries in particular) + +- resolve conflicts/blockers + No conflicts were discovered. + +Topics of the week +=================== + +A quick point about IRC logging +------------------------------------------------------- + +- There were no proposals for new channelsto be logged. + The current set of logged channels is + #pypy, #pypy-funding (password), #pypy-sync, #pypy-tb. + +- There was no objection to have a second serverfor eggdrop. + Holger added that he is going to work on some IRC infrastructure, + without giving details. He proposed to use codespeak as + the second server, which was agreed. + +Participation on the CCC congress in Berlin +----------------------------------------------- + +Holger would like to co-write a proposal and co-talk about PyPy. +Christian is considering to join this. We will ask Armin who was +interested but absent. + +The suggested re-structuring of the selection process was not discussed. + +Current optimization activities and allocation of work +------------------------------------------------------------- + +There was consensus about current optimization efforts. There is no +obvious target visible that gives huge improvements. +Opinions on profiling were different. We may need to have +profiling at various levels. But profiling might be unsharp, +if optimizing PyPy consists of a number of small improvements. +It was also proposed to judge optimization based upon the effect +multiplied by the speed of implementation. +It was concluded that we shouldfocus more on the reporting. +This topic will show up, again. + +Other activities, follow-up on sync meeting of last week +------------------------------------------------------------- + +Nobody showed up who wanted to do something last week, which resulted +in Armin, Samuele, Carl and Holger will work on that. It was proposed +to ask again and find a way to motivate each other to help with this. +Carl claimed that reporting actually can be fun. We are awaiting his +ideas on #pypy. 
+ +Making money out of PyPy +------------------------------ + +It was agreed that this topic is too big to be discussed on #pypy-sync. +Some interesting input about this can be found on the pypy list. See +"""Re: [pypy-dev] next pypy-sync meeting 20050915""" by Michal Wallace +michal at sabren.com + +Next pypy-sync meeting +------------------------------- + +Scheduled for next Thursday, Sept. 22nd, conducted by Anders Lehmann. +The timing issue was touched, again, but needs to be deferred. + + +Closing +------------------ + +Christian closes the meeting in time at 13:33pm. + +The IRC log can be found at http://tismerysoft.de/pypy/irc-logs/pypy-sync/%23pypy-sync.log.20050915 + +Holger asked to include it in the log, anyway, for ease of reference. + +.. _`IRC-log`: + +Here is the full IRC log. Note that time is given in UTC:: + + [11:31] aleale (n=andersle at clogs.dfki.uni-sb.de) joined #pypy-sync. + [11:48] pedronis (n=Samuele_ at c-398b70d5.022-54-67626719.cust.bredbandsbolaget.se) joined #pypy-sync. + [11:51] stakkars (i=sppspg at i577B427C.versanet.de) joined #pypy-sync. + [11:52] arre (i=ac at ratthing-b40e.strakt.com) joined #pypy-sync. + [11:55] hpk (n=hpk at merlinux.de) joined #pypy-sync. + [11:55] ericvrp (n=chatzill at ericvrp.demon.nl) joined #pypy-sync. + [11:55] cfbolz (n=carlson at merlinux.de) joined #pypy-sync. + [12:02] hello all + [12:02] Action: hpk notices it's 13:03 on his clock + [12:02] hi all! Yes, I'm somewhat waiting for Armin and Michael + [12:03] nik (n=chatzill at ratthing-b410.strakt.com) joined #pypy-sync. + [12:03] I'd say let's start with the regular part + [12:03] ok? + [12:03] armin may not come (and we should not wait too much in general i think) + [12:03] yes + [12:03] fine with me + [12:03] great + [12:03] hi all + [12:03] then I start with my three lines: + [12:04] DONE: some "ten lines for ten percent" speedups on dicts, interning would take more to do right + [12:04] NEXT: working on different approaches to reduces PyPy's stack requirements + [12:04] BLOCK: demotivation by randomness of optimization criteria + [12:04] DONE: non-pypy + [12:04] NEXT: non-pypy, vacation + [12:04] BLOCKERS: none + [12:04] prev: translate_pypy_new, internal DFKI stuff + [12:04] next: see my family, moderate #pypy-sync + [12:04] blockers: - + [12:04] PREV: astcompiler + [12:04] NEXT: astcompiler + [12:04] BLOCKERS: None + [12:04] last: refactoring + [12:04] next: profiling and dive into transformations + [12:04] blockers: - + [12:05] last: EU issues, background communication + [12:05] next: non-pypy issues, EU-issues + [12:05] blockers: - + [12:05] Last: astcompiler, rtyper ann fixes, hlinvoke support to move dict impl to low-level, + [12:05] some writing, understanding the fine point of the flow space again + [12:05] Next: astcompiler, reports, translate_pypy, ? + [12:05] Blockers: - + [12:05] last: non-pypy + [12:05] next: non-pypy + [12:05] blockers: - + [12:05] nik: blockers: non-pypy? :-) + [12:05] uhm, yep ;) + [12:06] ok, then I'd add my toothache tomy blockers :-) + [12:06] anything else? + [12:07] I don't see blockers which are not touched by other topics. + [12:07] topic 1: + [12:07] A quick point about IRC logging + [12:07] does anybody have proposals about that to log else? + [12:07] let me note that i plan to setup some IRC-logging infrastructure + [12:07] on codespeak at some point. + [12:07] do we want a second server in another network? + [12:07] ah + [12:07] but that shouldn't interfere with your nice efforts. 
+ [12:08] it probably should be the second server then. + [12:08] codespeak is of course a possible target. You can then easily attach something to thebasic logs + [12:08] the HHU server is probably more helpful for getting computing/testing power + [12:08] ok, then I can do an eggdrop setup. Some postprocessing of the logs would be really really good! + [12:09] the performance penalty of eggdrop is very low. + [12:09] the criteria should be reliability of network and server. + [12:09] yip. + [12:10] I take it that we want codespeakto be a second server, and there are no proposals of other channels? + [12:10] anything to be said, or next topic? + [12:10] next for me + [12:10] yip. next for me as well. + [12:10] next + [12:10] next. + [12:11] Participation on the CCC congress in Berlin + [12:11] i'd like to co-write a proposal and co-talk about PyPy. + [12:11] besides discussion of who wants to show what, I'd like to repeat: K can helptoget you a room + [12:11] I am interested in that + [12:11] great, but i know very many people in berlin :_) + [12:12] basically the CCC congress is much about showing off/reporting about cool technical stuff + [12:12] I was offering help, and you are overloaded, so let's put it together + [12:13] not so much about scientific musings about possible benefits :-) + [12:13] did everybody read Armin's proposal? + [12:13] which proposal? + [12:14] draft-file in doc + [12:14] you mean the report/deliverable draft? + [12:14] don't read it now,toolong + [12:14] is that much related to the CCC congress? + [12:14] draft-dynamic-language-translation.txt + [12:14] is not a talk proposal + [12:14] I think this wasmeant as a start for a talk on CCC? + [12:15] no, for a start ont he EU reporting issues + [12:15] it's the start of one of our reports + [12:15] uppsi :-) + [12:15] I even wrote stuff in it + [12:15] np + [12:15] pedronis: I know + [12:16] stakkars: you want to go and talk at CCC? + [12:16] I'm not really sure, but I thought of it,not only becaue I was very quiet, yet. + [12:17] no idea if they want more than one talk. I might offer to do something with you if you like. + [12:17] mwh (n=user at 82-33-185-193.cable.ubr01.azte.blueyonder.co.uk) joined #pypy-sync. + [12:17] stakkars: yes, that's fine + [12:17] stakkars: also we should ask armin who was interested + [12:17] i don't think it's a problem to do a one hour talk with three persons + [12:17] we have done that before + [12:17] let's try if we are able to share: + [12:17] we can also consider doing two talks (but now now i suggest) + [12:18] s/now now/not now/ + [12:18] I think we should move on with the topics. + [12:18] yip from my side. + [12:18] ok? + [12:18] Current optimization activities and allocation of work + [12:18] yes + [12:18] we have done some random improvements, where none of these but the call optimization + [12:19] came in any waynear the criteria of a2.0 speedup. + [12:19] We might think to drop optimizationj completely at this point, or change criteria. + [12:19] indeed. i think - as mwh points out on pypy-dev - we should go for profiling rather than direct optimizations at the moment + [12:20] having profiling at various levels provides better grounds for judging optimizations later on + [12:20] well, we should work on the reports more too + [12:20] I also would like to add three lines: + [12:20] we are very unlikely to hit such huge speed gains again at all. + [12:20] pedronis: yes. 
but that's a different topic :-) + [12:20] ANd if we have 10 percent speedups, it takes just 40 of these to rech the goal. + [12:21] >>> 1.1 ** 40 + [12:21] 45.259255568176101 + [12:21] :-) + [12:21] if we do one a week... + [12:21] (just kidding, of course) + [12:22] well, what about going for profiling rather than optimizations? + [12:22] other thing: multiply time effort to do the implementation by it's effect, that would be a good criteria. + [12:22] hpk: I fear that profiling will not be very sharp, as it was not yet. + [12:22] The problem is that we loose speedd in many tiny issues, and we needto go the hard wayto remove one by one. + [12:22] I'm all for profiling of various sorts. + [12:22] absolutely, anecessary companion. + [12:23] may I conclude that we need to concentrate on reporting ATM and don't waste time on optimization which needs much resources? + [12:24] anything on this, or next topic? ( 5 min left) + [12:24] at this point, we cannot all focus just on one thing + [12:24] reporting is currently thought to be brought forward by armin, samuele, carl and me + [12:24] i do think that help is much appreciated + [12:24] that's next topic + [12:24] Other activities, follow-up on sync meeting of last week + [12:25] do we realy stick with last week's conclusion, or should we contractmore people on this + [12:25] nobody said he wanted to something last week + [12:26] yes, I think that's bad. We need to structure work in a way that it is ieasy to work on something, and it is no func + [12:26] not much fun, so we need to motivate each other. + [12:26] :-) + [12:27] I actually think reporting _can_ be fun :-) + [12:27] I think this is great news, please tell this on #pypy, we are almost finished here + [12:27] Making money out of PyPy + [12:28] something interesting has been said on the list, and the topic is too big for pypy-sync, right? + [12:28] yes is too big + [12:28] yes + [12:28] i guess, it's a somewhat timeless question probably better discussed at some pub + [12:28] #pypy-pub? ;) + [12:29] my opinion was to get it a bit more pressure + [12:29] :-) + [12:29] good idea. + [12:29] we have 3 mins left, becasue we started 3 after 1pm + [12:29] or we can close now if you like + [12:30] fine for me. + [12:31] ok then, till next weak. same place, same time? + [12:31] s/weak/week/ ehum :) + [12:31] then thank you all for the nice input, closing JIT + [12:31] yip, actually that was a topic we had from last time, the timing issue + [12:31] stakkars: great, thanks + [12:31] see you then + [12:31] cfbolz (n=carlson at merlinux.de) left #pypy-sync ("Verlassend"). + [12:31] bye + [12:31] see you next week + [12:32] ericvrp (n=chatzill at ericvrp.demon.nl) left #pypy-sync. + [12:32] note that the protocol will not include the IRC log. + [12:32] see you + [12:32] nik (n=chatzill at ratthing-b410.strakt.com) left #pypy-sync. + [12:32] it can be found at tismerysoft.de:/pypy/irc-logs + [12:32] stakkars (i=sppspg at i577B427C.versanet.de) left #pypy-sync. + [12:32] arre (i=ac at ratthing-b40e.strakt.com) left #pypy-sync. + [12:32] aleale (n=andersle at clogs.dfki.uni-sb.de) left #pypy-sync. From tismer at codespeak.net Thu Sep 15 14:51:47 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Thu, 15 Sep 2005 14:51:47 +0200 (CEST) Subject: [pypy-svn] r17577 - pypy/extradoc/minute Message-ID: <20050915125147.7799427BE0@code1.codespeak.net> Author: tismer Date: Thu Sep 15 14:51:46 2005 New Revision: 17577 Modified: pypy/extradoc/minute/pypy-sync-09-15-2005.txt Log: correction. 
I misunderstood an injection by Carl. Modified: pypy/extradoc/minute/pypy-sync-09-15-2005.txt ============================================================================== --- pypy/extradoc/minute/pypy-sync-09-15-2005.txt (original) +++ pypy/extradoc/minute/pypy-sync-09-15-2005.txt Thu Sep 15 14:51:46 2005 @@ -68,8 +68,8 @@ Nobody showed up who wanted to do something last week, which resulted in Armin, Samuele, Carl and Holger will work on that. It was proposed to ask again and find a way to motivate each other to help with this. -Carl claimed that reporting actually can be fun. We are awaiting his -ideas on #pypy. +Carl threw in that reporting actually can be fun. We should try to +find ways tomotivate each other on #pypy. Making money out of PyPy ------------------------------ From ale at codespeak.net Thu Sep 15 16:09:49 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Thu, 15 Sep 2005 16:09:49 +0200 (CEST) Subject: [pypy-svn] r17578 - in pypy/dist/pypy/translator: goal tool tool/pygame Message-ID: <20050915140949.8510727BE6@code1.codespeak.net> Author: ale Date: Thu Sep 15 16:09:48 2005 New Revision: 17578 Added: pypy/dist/pypy/translator/tool/pygame/server.py Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py pypy/dist/pypy/translator/tool/graphserver.py Log: moved some code out of translate_pypy_new, concerning graphservers. Initially I had wanted to move both "run_async_server" and "run_translator_server" aka "run_server" into graphserver. That meant for "run_server" to be renamed to "run_translator_server", and since "run_async_server" imports pygame (no not pypy.translator.tool.pygame) I put it into pypy/translator/tool/pygame/server.py Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Thu Sep 15 16:09:48 2005 @@ -107,8 +107,6 @@ ldef = listdef.ListDef(None, annmodel.SomeString()) inputtypes = [annmodel.SomeList(ldef)] - if listen_port: - run_async_server() if not options1.no_annotations: print 'Annotating...' 
print 'with policy: %s.%s' % (policy.__class__.__module__, policy.__class__.__name__) @@ -180,30 +178,6 @@ serv_start, serv_show, serv_stop, serv_cleanup = None, None, None, None -def run_async_server(): - global serv_start, serv_show, serv_stop, serv_cleanup - from pypy.translator.tool import graphpage, graphserver - homepage = graphpage.TranslatorPage(t) - (serv_start, serv_show, serv_stop, serv_cleanup - )=graphserver.run_server(homepage, port=listen_port, background=True) - - -def run_server(): - from pypy.translator.tool import graphpage - import pygame - from pypy.translator.tool.pygame.graphclient import get_layout - from pypy.translator.tool.pygame.graphdisplay import GraphDisplay - - if len(t.functions) <= options1.huge: - page = graphpage.TranslatorPage(t) - else: - page = graphpage.LocalizedCallGraphPage(t, entry_point) - - layout = get_layout(page) - show, async_quit = layout.connexion.initiate_display, layout.connexion.quit - display = layout.get_display() - return display.run, show, async_quit, pygame.quit - def mkexename(name): if sys.platform == 'win32': name = os.path.normpath(name + '.exe') @@ -252,7 +226,8 @@ if serv_start: start, show, stop, cleanup = serv_start, serv_show, serv_stop, serv_cleanup else: - start, show, stop, cleanup = run_server() + from pypy.translator.tool.pygame.server import run_translator_server + start, show, stop, cleanup = run_translator_server(t, entry_point, options1) pdb_plus_show.install_show(show) debugger = run_debugger_in_thread(func, args, stop) debugger.start() @@ -371,6 +346,9 @@ # otherwise we have been loaded a = t.annotator t.frozen = False + if listen_port: + from pypy.translator.tool.graphserver import run_async_server + serv_start, serv_show, serv_stop, serv_cleanup = run_async_server(t, listen_port) try: standalone = analyse(t, inputtypes) except TyperError: Modified: pypy/dist/pypy/translator/tool/graphserver.py ============================================================================== --- pypy/dist/pypy/translator/tool/graphserver.py (original) +++ pypy/dist/pypy/translator/tool/graphserver.py Thu Sep 15 16:09:48 2005 @@ -10,6 +10,12 @@ send_msg = portutil.send_msg recv_msg = portutil.recv_msg +def run_async_server(t, listen_port): + import graphpage + homepage = graphpage.TranslatorPage(t) + print repr(t),repr(homepage) + return run_server(homepage, port=listen_port, background=True) + class GraphserverPort(portutil.Port): def __init__(self, s, homepage): Added: pypy/dist/pypy/translator/tool/pygame/server.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/tool/pygame/server.py Thu Sep 15 16:09:48 2005 @@ -0,0 +1,17 @@ + +def run_translator_server(t, entry_point, options): + from pypy.translator.tool import graphpage + import pygame + from pypy.translator.tool.pygame.graphclient import get_layout + from pypy.translator.tool.pygame.graphdisplay import GraphDisplay + + if len(t.functions) <= options.huge: + page = graphpage.TranslatorPage(t) + else: + page = graphpage.LocalizedCallGraphPage(t, entry_point) + + layout = get_layout(page) + show, async_quit = layout.connexion.initiate_display, layout.connexion.quit + display = layout.get_display() + return display.run, show, async_quit, pygame.quit + From arigo at codespeak.net Thu Sep 15 18:18:31 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 15 Sep 2005 18:18:31 +0200 (CEST) Subject: [pypy-svn] r17581 - in pypy/dist/pypy/rpython: . 
test Message-ID: <20050915161831.72E1F27BF4@code1.codespeak.net> Author: arigo Date: Thu Sep 15 18:18:28 2005 New Revision: 17581 Modified: pypy/dist/pypy/rpython/rpbc.py pypy/dist/pypy/rpython/test/test_rpbc.py Log: (cfbolz, arigo) fix and test for PBCs containing functions-or-None missing an rtype_is_true() operation. Modified: pypy/dist/pypy/rpython/rpbc.py ============================================================================== --- pypy/dist/pypy/rpython/rpbc.py (original) +++ pypy/dist/pypy/rpython/rpbc.py Thu Sep 15 18:18:28 2005 @@ -181,7 +181,20 @@ # ____________________________________________________________ -class MultipleFrozenPBCRepr(Repr): +class MultiplePBCRepr(Repr): + """Base class for PBCReprs of multiple PBCs that can include None + (represented as a NULL pointer).""" + def rtype_is_true(self, hop): + if hop.s_result.is_constant(): + assert hop.s_result.const is True # custom __nonzero__ on PBCs? + return hop.inputconst(Bool, hop.s_result.const) + else: + # None is a nullptr, which is false; everything else is true. + vlist = hop.inputargs(self) + return hop.genop('ptr_nonzero', vlist, resulttype=Bool) + + +class MultipleFrozenPBCRepr(MultiplePBCRepr): """Representation selected for multiple non-callable pre-built constants.""" def __init__(self, rtyper, access_set): self.rtyper = rtyper @@ -233,15 +246,6 @@ setattr(result, mangled_name, llvalue) return result - def rtype_is_true(self, hop): - if hop.s_result.is_constant(): - assert hop.s_result.const is True # custom __nonzero__ on PBCs? - return hop.inputconst(Bool, hop.s_result.const) - else: - # None is a nullptr, which is false; everything else is true. - vlist = hop.inputargs(self) - return hop.genop('ptr_nonzero', vlist, resulttype=Bool) - def rtype_getattr(self, hop): attr = hop.args_s[1].const vpbc, vattr = hop.inputargs(self, Void) @@ -280,6 +284,9 @@ self.function = s_pbc.prebuiltinstances.keys()[0].im_func im_selves = {} for pbc, not_a_classdef in s_pbc.prebuiltinstances.items(): + if pbc is None: + raise TyperError("unsupported: variable of type " + "method-of-frozen-PBC or None") assert pbc.im_func is self.function assert not isclassdef(not_a_classdef) im_selves[pbc.im_self] = True @@ -351,7 +358,7 @@ return True -class FunctionsPBCRepr(Repr): +class FunctionsPBCRepr(MultiplePBCRepr): """Representation selected for a PBC of function(s).""" def __init__(self, rtyper, s_pbc): @@ -482,6 +489,9 @@ def __init__(self, rtyper, s_pbc): self.rtyper = rtyper self.s_pbc = s_pbc + if None in s_pbc.prebuiltinstances: + raise TyperError("unsupported: variable of type " + "bound-method-object or None") basedef = commonbase(s_pbc.prebuiltinstances.values()) for classdef1, name in allattributenames(basedef): # don't trust the func.func_names and see if this 'name' would be @@ -577,7 +587,9 @@ def __init__(self, rtyper, s_pbc): self.rtyper = rtyper self.s_pbc = s_pbc - assert None not in s_pbc.prebuiltinstances, "XXX not implemented" + if None in s_pbc.prebuiltinstances: + raise TyperError("unsupported: variable of type " + "class-pointer or None") if s_pbc.is_constant(): self.lowleveltype = Void else: Modified: pypy/dist/pypy/rpython/test/test_rpbc.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rpbc.py (original) +++ pypy/dist/pypy/rpython/test/test_rpbc.py Thu Sep 15 18:18:28 2005 @@ -1079,3 +1079,21 @@ c_a = A_repr.convert_const(A(None)) res = interp.eval_function(llfunction, [None, c_f, c_a]) assert typeOf(res) == A_repr.lowleveltype + +def 
test_function_or_none(): + def h(y): + return y+84 + def g(y): + return y+42 + def f(x, y): + d = {1: g, 2:h} + func = d.get(x, None) + if func: + return func(y) + return -1 + res = interpret(f, [1, 100]) + assert res == 142 + res = interpret(f, [2, 100]) + assert res == 184 + res = interpret(f, [3, 100]) + assert res == -1 From arigo at codespeak.net Thu Sep 15 20:29:55 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 15 Sep 2005 20:29:55 +0200 (CEST) Subject: [pypy-svn] r17584 - in pypy/dist/pypy/translator/backendopt: . test Message-ID: <20050915182955.49A8627BF5@code1.codespeak.net> Author: arigo Date: Thu Sep 15 20:29:53 2005 New Revision: 17584 Modified: pypy/dist/pypy/translator/backendopt/inline.py pypy/dist/pypy/translator/backendopt/test/test_inline.py Log: Missing llexitcases on the Links. Re-enabled and completed a test that shows this. Modified: pypy/dist/pypy/translator/backendopt/inline.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/inline.py (original) +++ pypy/dist/pypy/translator/backendopt/inline.py Thu Sep 15 20:29:53 2005 @@ -226,11 +226,13 @@ l = Link(linkargs, link.target) l.prevblock = block l.exitcase = True + l.llexitcase = True block.exits.append(l) if i > 0: l = Link(blocks[-1].inputargs, block) l.prevblock = blocks[-1] l.exitcase = False + l.llexitcase = False blocks[-1].exits.insert(0, l) blocks.append(block) blocks[-1].exits = blocks[-1].exits[:1] Modified: pypy/dist/pypy/translator/backendopt/test/test_inline.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_inline.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_inline.py Thu Sep 15 20:29:53 2005 @@ -18,6 +18,8 @@ assert hasattr(v, 'concretetype') assert hasattr(op.result, 'concretetype') if isinstance(node, Link): + if node.exitcase is not None: + assert hasattr(node, 'llexitcase') for v in node.args: assert hasattr(v, 'concretetype') if isinstance(node.last_exception, (Variable, Constant)): @@ -134,9 +136,7 @@ result = interp.eval_function(g, [42]) assert result == 1 -def DONOTtest_inline_var_exception(): - # this test is disabled for now, because f() contains a direct_call - # (at the end, to a ll helper, to get the type of the exception object) +def test_inline_var_exception(): def f(x): e = None if x == 0: @@ -157,7 +157,9 @@ a = t.annotate([int]) a.simplify() t.specialize() - inline_function(t, f, t.flowgraphs[g]) + auto_inlining(t, threshold=10) + for graph in t.flowgraphs.values(): + traverse(no_missing_concretetype, graph) interp = LLInterpreter(t.flowgraphs, t.rtyper) result = interp.eval_function(g, [0]) assert result == 2 From arigo at codespeak.net Fri Sep 16 10:54:58 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 16 Sep 2005 10:54:58 +0200 (CEST) Subject: [pypy-svn] r17586 - in pypy/dist/pypy/translator: . 
backendopt backendopt/test Message-ID: <20050916085458.1CE2227BF7@code1.codespeak.net> Author: arigo Date: Fri Sep 16 10:54:54 2005 New Revision: 17586 Modified: pypy/dist/pypy/translator/backendopt/malloc.py pypy/dist/pypy/translator/backendopt/ssa.py pypy/dist/pypy/translator/backendopt/test/test_all.py pypy/dist/pypy/translator/backendopt/test/test_malloc.py pypy/dist/pypy/translator/simplify.py Log: Several subtle bug fixes with the graph transformations: * simplify.remove_identical_vars() was not agressive enough; it failed to do its job for pairs of identical variables passed around a loop. As SSI_to_SSA() has the logic to detect that, let remove_identical_vars() be based on the same logic. * the SSI_to_SSA() logic was not agressive enough as well. It's a fixpoint algo that wasn't reflowing enough. * a test that now passes because remove_identical_vars() does its job, which avoids getting malloc removal confused. * another test that still fails about malloc removal. * obscure bug in malloc removal about block.operations being mutated while it's iterated over. Modified: pypy/dist/pypy/translator/backendopt/malloc.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/malloc.py (original) +++ pypy/dist/pypy/translator/backendopt/malloc.py Fri Sep 16 10:54:54 2005 @@ -1,7 +1,8 @@ from pypy.objspace.flow.model import Variable, Constant, Block, Link -from pypy.objspace.flow.model import SpaceOperation, traverse +from pypy.objspace.flow.model import SpaceOperation, traverse, checkgraph from pypy.tool.unionfind import UnionFind from pypy.rpython import lltype +from pypy.translator.simplify import remove_identical_vars class LifeTime: @@ -194,18 +195,23 @@ newinputargs.append(newvar) newinputargs += block.inputargs[i+1:] block.inputargs[:] = newinputargs + assert var not in block.inputargs flowin(var, newvarsmap) # look for variables created inside the block by a malloc + vars_created_here = [] for op in block.operations: if op.opname == "malloc" and op.result in vars: - newvarsmap = flatconstants.copy() # dummy initial values - flowin(op.result, newvarsmap) + vars_created_here.append(op.result) + for var in vars_created_here: + newvarsmap = flatconstants.copy() # dummy initial values + flowin(var, newvarsmap) return True def remove_mallocs_once(graph): """Perform one iteration of malloc removal.""" + remove_identical_vars(graph) lifetimes = compute_lifetimes(graph) progress = False for info in lifetimes: Modified: pypy/dist/pypy/translator/backendopt/ssa.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/ssa.py (original) +++ pypy/dist/pypy/translator/backendopt/ssa.py Fri Sep 16 10:54:54 2005 @@ -1,30 +1,24 @@ -from pypy.objspace.flow.model import Variable, mkentrymap +from pypy.objspace.flow.model import Variable, mkentrymap, flatten, Block from pypy.tool.unionfind import UnionFind - -def SSI_to_SSA(graph): - """Rename the variables in a flow graph as much as possible without - violating the SSA rule. 'SSI' means that each Variable in a flow graph is - defined only once in the whole graph; all our graphs are SSI. This - function does not break that rule, but changes the 'name' of some - Variables to give them the same 'name' as other Variables. The result - looks like an SSA graph. 'SSA' means that each var name appears as the - result of an operation only once in the whole graph, but it can be - passed to other blocks across links. 
+ +def data_flow_families(graph): + """Follow the flow of the data in the graph. Returns a UnionFind grouping + all the variables by families: each family contains exactly one variable + where a value is stored into -- either by an operation or a merge -- and + all following variables where the value is just passed unmerged into the + next block. """ - entrymap = mkentrymap(graph) - consider_blocks = entrymap + entrymaplist = mkentrymap(graph).items() + progress = True variable_families = UnionFind() # group variables by families; a family of variables will be identified. - while consider_blocks: - blocklist = consider_blocks.keys() - consider_blocks = {} - for block in blocklist: + while progress: + progress = False + for block, links in entrymaplist: if block is graph.startblock: continue - links = entrymap[block] assert links - mapping = {} for i in range(len(block.inputargs)): # list of possible vars that can arrive in i'th position v1 = block.inputargs[i] @@ -40,10 +34,20 @@ else: if len(inputs) == 2: variable_families.union(*inputs) - # mark all the following blocks as subject to - # possible further optimization - for link in block.exits: - consider_blocks[link.target] = True + progress = True + return variable_families + +def SSI_to_SSA(graph): + """Rename the variables in a flow graph as much as possible without + violating the SSA rule. 'SSI' means that each Variable in a flow graph is + defined only once in the whole graph; all our graphs are SSI. This + function does not break that rule, but changes the 'name' of some + Variables to give them the same 'name' as other Variables. The result + looks like an SSA graph. 'SSA' means that each var name appears as the + result of an operation only once in the whole graph, but it can be + passed to other blocks across links. 
+ """ + variable_families = data_flow_families(graph) # rename variables to give them the name of their familiy representant for v in variable_families.keys(): v1 = variable_families.find_rep(v) @@ -52,7 +56,9 @@ # sanity-check that the same name is never used several times in a block variables_by_name = {} - for block in entrymap: + for block in flatten(graph): + if not isinstance(block, Block): + continue vars = [op.result for op in block.operations] for link in block.exits: vars += link.getextravars() Modified: pypy/dist/pypy/translator/backendopt/test/test_all.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_all.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_all.py Fri Sep 16 10:54:54 2005 @@ -71,3 +71,20 @@ interp = LLInterpreter(t.flowgraphs, t.rtyper) res = interp.eval_function(f, [11]) assert res == 55 + + +def test_list_comp(): + def f(n1, n2): + c = [i for i in range(n2)] + return 33 + t = Translator(f) + t.annotate([int, int]) + t.specialize() + t.backend_optimizations(inline_threshold=10, mallocs=True) + + graph = t.getflowgraph() + check_malloc_removed(graph) + + interp = LLInterpreter(t.flowgraphs, t.rtyper) + res = interp.eval_function(f, [11, 22]) + assert res == 33 Modified: pypy/dist/pypy/translator/backendopt/test/test_malloc.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_malloc.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_malloc.py Fri Sep 16 10:54:54 2005 @@ -80,3 +80,19 @@ b = B() return b.attr check(fn5, [], [], 42) + +def test_aliasing(): + class A: + pass + def fn6(n): + a1 = A() + a1.x = 5 + a2 = A() + a2.x = 6 + if n > 0: + a = a1 + else: + a = a2 + a.x = 12 + return a1.x + check(fn6, [int], [1], 12) Modified: pypy/dist/pypy/translator/simplify.py ============================================================================== --- pypy/dist/pypy/translator/simplify.py (original) +++ pypy/dist/pypy/translator/simplify.py Fri Sep 16 10:54:54 2005 @@ -453,41 +453,36 @@ which otherwise doesn't realize that tests performed on one of the copies of the variable also affect the other.""" - entrymap = mkentrymap(graph) - consider_blocks = entrymap + from pypy.translator.backendopt.ssa import data_flow_families + variable_families = data_flow_families(graph) + def merges(varlist): + d = {} + for i in range(len(varlist)): + v = variable_families.find_rep(varlist[i]) + if v in d: + yield d[v], i + d[v] = i - while consider_blocks: - blocklist = consider_blocks.keys() - consider_blocks = {} - for block in blocklist: - if not block.exits: - continue - links = entrymap[block] - entryargs = {} - for i in range(len(block.inputargs)): - # list of possible vars that can arrive in i'th position - key = tuple([link.args[i] for link in links]) - if key not in entryargs: - entryargs[key] = i - else: - j = entryargs[key] - # positions i and j receive exactly the same input vars, - # we can remove the argument i and replace it with the j. 
- argi = block.inputargs[i] - if not isinstance(argi, Variable): continue - argj = block.inputargs[j] - block.renamevariables({argi: argj}) - assert block.inputargs[i] == block.inputargs[j] == argj - del block.inputargs[i] - for link in links: - assert link.args[i] == link.args[j] - del link.args[i] - # mark this block and all the following ones as subject to - # possible further optimization - consider_blocks[block] = True - for link in block.exits: - consider_blocks[link.target] = True - break + entrymap = mkentrymap(graph) + for block, links in entrymap.items(): + if not block.exits: + continue + try: + while True: + # look for the next possible merge (restarting each time) + j, i = merges(block.inputargs).next() + # we can remove the argument i and replace it with the j. + argi = block.inputargs[i] + argj = block.inputargs[j] + block.renamevariables({argi: argj}) + assert block.inputargs[i] == block.inputargs[j] == argj + del block.inputargs[i] + for link in links: + assert (variable_families.find_rep(link.args[i]) == + variable_families.find_rep(link.args[j])) + del link.args[i] + except StopIteration: + pass def coalesce_is_true(graph): """coalesce paths that go through an is_true and a directly successive From arigo at codespeak.net Fri Sep 16 11:04:43 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 16 Sep 2005 11:04:43 +0200 (CEST) Subject: [pypy-svn] r17587 - in pypy/dist/pypy/translator/backendopt: . test Message-ID: <20050916090443.4786227BF7@code1.codespeak.net> Author: arigo Date: Fri Sep 16 11:04:41 2005 New Revision: 17587 Modified: pypy/dist/pypy/translator/backendopt/malloc.py pypy/dist/pypy/translator/backendopt/test/test_malloc.py Log: Finally, with a good remove_identical_vars(), a sufficient condition for disabling malloc removal is to see if the pointer variable is ever duplicated along a link. So the fix is easy, after all. 
(Good to be more awake than yesterday evening :-) Modified: pypy/dist/pypy/translator/backendopt/malloc.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/malloc.py (original) +++ pypy/dist/pypy/translator/backendopt/malloc.py Fri Sep 16 11:04:41 2005 @@ -71,9 +71,18 @@ if isinstance(node.last_exc_value, Variable): set_creation_point(node.prevblock, node.last_exc_value, "last_exc_value") - for i in range(len(node.args)): - union(node.prevblock, node.args[i], + d = {} + for i, arg in enumerate(node.args): + union(node.prevblock, arg, node.target, node.target.inputargs[i]) + if isinstance(arg, Variable): + if arg in d: + # same variable present several times in link.args + # consider it as a 'use' of the variable, which + # will disable malloc optimization (aliasing problems) + set_use_point(node.prevblock, arg, "dup", node, i) + else: + d[arg] = True traverse(visit, graph) return lifetimes.infos() Modified: pypy/dist/pypy/translator/backendopt/test/test_malloc.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_malloc.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_malloc.py Fri Sep 16 11:04:41 2005 @@ -17,13 +17,14 @@ assert count1 == 0 # number of mallocs left assert count2 == 0 # number of direct_calls left -def check(fn, signature, args, expected_result): +def check(fn, signature, args, expected_result, must_be_removed=True): t = Translator(fn) t.annotate(signature) t.specialize() graph = t.getflowgraph() remove_simple_mallocs(graph) - check_malloc_removed(graph) + if must_be_removed: + check_malloc_removed(graph) interp = LLInterpreter(t.flowgraphs, t.rtyper) res = interp.eval_function(fn, args) assert res == expected_result @@ -95,4 +96,4 @@ a = a2 a.x = 12 return a1.x - check(fn6, [int], [1], 12) + check(fn6, [int], [1], 12, must_be_removed=False) From ac at codespeak.net Fri Sep 16 11:21:23 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Fri, 16 Sep 2005 11:21:23 +0200 (CEST) Subject: [pypy-svn] r17588 - in pypy/dist/pypy/interpreter/pyparser: . test Message-ID: <20050916092123.C1B6827BF7@code1.codespeak.net> Author: ac Date: Fri Sep 16 11:21:23 2005 New Revision: 17588 Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py Log: Support imaginar constants (e.g. 
3.4j) Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Fri Sep 16 11:21:23 2005 @@ -1606,6 +1606,9 @@ if value.endswith('l') or value.endswith('L'): value = value[:-1] return string_to_w_long( space, value, base=base ) + if value.endswith('j') or value.endswith('J'): + c = space.builtin.get('complex') + return space.call_function(c, space.wrap(value)) try: return space.wrap(string_to_int(value, base=base)) except ParseStringError: Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py Fri Sep 16 11:21:23 2005 @@ -11,6 +11,7 @@ from pypy.interpreter.astcompiler import ast +from pypy.objspace.std import objspace def arglist_equal(left,right): """needs special case because we handle the argumentlist differently""" @@ -81,6 +82,20 @@ return False return True +constants = [ + "0", + "7", + "-3", + "053", + "0x18", + "14L", + "1.0", + "3.9", + "-3.6", + "1.8e19", + "3j" + ] + expressions = [ "x = a + 1", "x = 1 - a", @@ -565,9 +580,9 @@ def call_method(self, obj, meth, *args): return getattr(obj, meth)(*args) -def ast_parse_expr(expr, target='single'): +def ast_parse_expr(expr, target='single', space=FakeSpace): target = TARGET_DICT[target] - builder = AstBuilder(space=FakeSpace()) + builder = AstBuilder(space=space()) PYTHON_PARSER.parse_source(expr, target, builder) return builder @@ -585,7 +600,6 @@ print "-" * 30 assert nodes_equal( ast, r1.rule_stack[-1]), 'failed on %r' % (expr) - def test_basic_astgen(): for family in TESTS: for expr in family: @@ -596,6 +610,14 @@ for expr in family: yield check_expression, expr, 'exec' +def check_constant(expr): + ast_parse_expr(expr, 'single', objspace.StdObjSpace) + +def test_constants(): + for expr in constants: + yield check_constant, expr + + SNIPPETS = [ 'snippet_1.py', 'snippet_several_statements.py', From ale at codespeak.net Fri Sep 16 12:24:46 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Fri, 16 Sep 2005 12:24:46 +0200 (CEST) Subject: [pypy-svn] r17589 - in pypy/dist/pypy/translator: goal tool Message-ID: <20050916102446.8A7B327BFF@code1.codespeak.net> Author: ale Date: Fri Sep 16 12:24:45 2005 New Revision: 17589 Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py pypy/dist/pypy/translator/tool/util.py Log: moved some utility functions to pypy.translator.tool.util Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Fri Sep 16 12:24:45 2005 @@ -79,10 +79,7 @@ from pypy.translator.translator import Translator from pypy.annotation import model as annmodel from pypy.annotation import listdef -from pypy.tool.cache import Cache from pypy.annotation.policy import AnnotatorPolicy -from pypy.tool.udir import udir -from pypy.tool.ansi_print import ansi_print from pypy.translator.pickle.main import load, save # catch TyperError to allow for post-mortem dump from pypy.rpython.error import TyperError @@ -93,7 +90,9 @@ from pypy.translator.tool import cbuild 
cbuild.enable_fast_compilation() from pypy.translator.tool.util import worstblocks_topten, find_someobjects -from pypy.translator.tool.util import sanity_check_exceptblocks +from pypy.translator.tool.util import sanity_check_exceptblocks, update_usession_dir +from pypy.translator.tool.util import assert_rpython_mostly_not_imported, mkexename + annmodel.DEBUG = False @@ -138,51 +137,10 @@ t.frozen = True # cannot freeze if we don't have annotations return standalone -def assert_rpython_mostly_not_imported(): - prefix = 'pypy.rpython.' - oknames = ('rarithmetic memory memory.lladdress extfunctable ' - 'lltype objectmodel error ros'.split()) - wrongimports = [] - for name, module in sys.modules.items(): - if module is not None and name.startswith(prefix): - sname = name[len(prefix):] - for okname in oknames: - if sname.startswith(okname): - break - else: - wrongimports.append(name) - if wrongimports: - raise RuntimeError("cannot fork because improper rtyper code" - " has already been imported: %r" %(wrongimports,)) - - -def update_usession_dir(stabledir = udir.dirpath('usession')): - from py import path - try: - if stabledir.check(dir=1): - for x in udir.visit(lambda x: x.check(file=1)): - target = stabledir.join(x.relto(udir)) - if target.check(): - target.remove() - else: - target.dirpath().ensure(dir=1) - try: - target.mklinkto(x) - except path.Invalid: - x.copy(target) - except path.Invalid: - print "ignored: couldn't link or copy to %s" % stabledir - - # graph servers serv_start, serv_show, serv_stop, serv_cleanup = None, None, None, None -def mkexename(name): - if sys.platform == 'win32': - name = os.path.normpath(name + '.exe') - return name - if __name__ == '__main__': targetspec = 'targetpypystandalone' Modified: pypy/dist/pypy/translator/tool/util.py ============================================================================== --- pypy/dist/pypy/translator/tool/util.py (original) +++ pypy/dist/pypy/translator/tool/util.py Fri Sep 16 12:24:45 2005 @@ -1,4 +1,44 @@ from pypy.annotation.model import SomeObject +from pypy.tool.udir import udir + +def update_usession_dir(stabledir = udir.dirpath('usession')): + from py import path + try: + if stabledir.check(dir=1): + for x in udir.visit(lambda x: x.check(file=1)): + target = stabledir.join(x.relto(udir)) + if target.check(): + target.remove() + else: + target.dirpath().ensure(dir=1) + try: + target.mklinkto(x) + except path.Invalid: + x.copy(target) + except path.Invalid: + print "ignored: couldn't link or copy to %s" % stabledir + +def mkexename(name): + if sys.platform == 'win32': + name = os.path.normpath(name + '.exe') + return name + +def assert_rpython_mostly_not_imported(): + prefix = 'pypy.rpython.' 
+ oknames = ('rarithmetic memory memory.lladdress extfunctable ' + 'lltype objectmodel error ros'.split()) + wrongimports = [] + for name, module in sys.modules.items(): + if module is not None and name.startswith(prefix): + sname = name[len(prefix):] + for okname in oknames: + if sname.startswith(okname): + break + else: + wrongimports.append(name) + if wrongimports: + raise RuntimeError("cannot fork because improper rtyper code" + " has already been imported: %r" %(wrongimports,)) def sanity_check_exceptblocks(translator): annotator = translator.annotator From ale at codespeak.net Fri Sep 16 12:52:57 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Fri, 16 Sep 2005 12:52:57 +0200 (CEST) Subject: [pypy-svn] r17590 - pypy/dist/pypy/translator/tool Message-ID: <20050916105257.B53E527BF8@code1.codespeak.net> Author: ale Date: Fri Sep 16 12:52:56 2005 New Revision: 17590 Modified: pypy/dist/pypy/translator/tool/util.py Log: forgot to import sys Modified: pypy/dist/pypy/translator/tool/util.py ============================================================================== --- pypy/dist/pypy/translator/tool/util.py (original) +++ pypy/dist/pypy/translator/tool/util.py Fri Sep 16 12:52:56 2005 @@ -1,5 +1,6 @@ from pypy.annotation.model import SomeObject from pypy.tool.udir import udir +import sys def update_usession_dir(stabledir = udir.dirpath('usession')): from py import path From arigo at codespeak.net Fri Sep 16 12:58:03 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 16 Sep 2005 12:58:03 +0200 (CEST) Subject: [pypy-svn] r17591 - pypy/dist/pypy/translator Message-ID: <20050916105803.4793127BF8@code1.codespeak.net> Author: arigo Date: Fri Sep 16 12:58:01 2005 New Revision: 17591 Modified: pypy/dist/pypy/translator/simplify.py Log: Fix remove_identical_vars(): after all, the previous check-in didn't make it strictly more agressive, but differently so only. Combined the previous algorithm and the original one... Modified: pypy/dist/pypy/translator/simplify.py ============================================================================== --- pypy/dist/pypy/translator/simplify.py (original) +++ pypy/dist/pypy/translator/simplify.py Fri Sep 16 12:58:01 2005 @@ -454,35 +454,43 @@ of the variable also affect the other.""" from pypy.translator.backendopt.ssa import data_flow_families - variable_families = data_flow_families(graph) - def merges(varlist): - d = {} - for i in range(len(varlist)): - v = variable_families.find_rep(varlist[i]) - if v in d: - yield d[v], i - d[v] = i - - entrymap = mkentrymap(graph) - for block, links in entrymap.items(): - if not block.exits: - continue - try: - while True: - # look for the next possible merge (restarting each time) - j, i = merges(block.inputargs).next() - # we can remove the argument i and replace it with the j. 
- argi = block.inputargs[i] - argj = block.inputargs[j] - block.renamevariables({argi: argj}) - assert block.inputargs[i] == block.inputargs[j] == argj - del block.inputargs[i] + entrymapitems = mkentrymap(graph).items() + progress = True + while progress: + variable_families = data_flow_families(graph) + progress = False + for block, links in entrymapitems: + if not block.exits: + continue + entryargs = {} + for i in range(len(block.inputargs)): + # list of possible vars that can arrive in i'th position + key = [] for link in links: - assert (variable_families.find_rep(link.args[i]) == - variable_families.find_rep(link.args[j])) - del link.args[i] - except StopIteration: - pass + v = link.args[i] + if isinstance(v, Constant): + break + key.append(variable_families.find_rep(v)) + else: # if no Constant + key = tuple(key) + if key not in entryargs: + entryargs[key] = i + else: + j = entryargs[key] + # positions i and j receive exactly the same input + # vars, we can remove the argument i and replace it + # with the j. + argi = block.inputargs[i] + argj = block.inputargs[j] + block.renamevariables({argi: argj}) + assert block.inputargs[i] == block.inputargs[j]== argj + del block.inputargs[i] + for link in links: + assert (variable_families.find_rep(link.args[i])== + variable_families.find_rep(link.args[j])) + del link.args[i] + progress = True + break # block.inputargs mutated def coalesce_is_true(graph): """coalesce paths that go through an is_true and a directly successive From ac at codespeak.net Fri Sep 16 13:38:17 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Fri, 16 Sep 2005 13:38:17 +0200 (CEST) Subject: [pypy-svn] r17592 - in pypy/dist/pypy/interpreter/pyparser: . test Message-ID: <20050916113817.8FF2927BF8@code1.codespeak.net> Author: ac Date: Fri Sep 16 13:38:17 2005 New Revision: 17592 Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py Log: Change strategy for creating numeric constants in order to be more compliant. 
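As background for the strategy in the log above, here is a minimal standalone sketch of evaluating a numeric literal by delegating to the builtin constructors. It assumes plain Python 2 builtins (int, long, float, complex) rather than PyPy's wrapped object-space calls, and the helper name is purely illustrative, not the actual astbuilder code:

    def eval_number_literal(value):
        # Pick the base from the literal's prefix, as the tokenizer sees it.
        base = 10
        if value.startswith('0x') or value.startswith('0X'):
            base = 16
        elif value.startswith('0'):
            base = 8
        if value.endswith('l') or value.endswith('L'):
            # strip the suffix and let the builtin long do the parsing
            return long(value[:-1], base)
        if value.endswith('j') or value.endswith('J'):
            # imaginary constants such as 3.4j become complex numbers
            return complex(value)
        try:
            return int(value, base)
        except ValueError:
            # not an integer literal, e.g. 1.8e19 or 3.9
            return float(value)

Delegating to the constructors keeps corner cases such as the L and j/J suffixes consistent with the interpreter's own parsing rules instead of re-implementing them by hand.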
Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Fri Sep 16 13:38:17 2005 @@ -1594,9 +1594,6 @@ """temporary implementation eval_number intends to replace number = eval(value) ; return number """ - from pypy.objspace.std.strutil import string_to_int, string_to_float - from pypy.objspace.std.strutil import string_to_w_long, interp_string_to_float - from pypy.objspace.std.strutil import ParseStringError space = self.space base = 10 if value.startswith("0x") or value.startswith("0X"): @@ -1604,15 +1601,17 @@ elif value.startswith("0"): base = 8 if value.endswith('l') or value.endswith('L'): - value = value[:-1] - return string_to_w_long( space, value, base=base ) + l = space.builtin.get('long') + return space.call_function(l, space.wrap(value), space.wrap(base)) if value.endswith('j') or value.endswith('J'): c = space.builtin.get('complex') return space.call_function(c, space.wrap(value)) try: - return space.wrap(string_to_int(value, base=base)) - except ParseStringError: - return space.wrap(interp_string_to_float(space,value)) + i = space.builtin.get('int') + return space.call_function(i, space.wrap(value), space.wrap(base)) + except: + f = space.builtin.get('float') + return space.call_function(f, space.wrap(value)) def is_string_const(self, expr): if not isinstance(expr,ast.Const): Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py Fri Sep 16 13:38:17 2005 @@ -11,8 +11,6 @@ from pypy.interpreter.astcompiler import ast -from pypy.objspace.std import objspace - def arglist_equal(left,right): """needs special case because we handle the argumentlist differently""" for l,r in zip(left,right): @@ -93,6 +91,8 @@ "3.9", "-3.6", "1.8e19", + "90000000000000", + "90000000000000.", "3j" ] @@ -499,6 +499,7 @@ ] TESTS = [ + constants, expressions, augassigns, comparisons, @@ -580,9 +581,14 @@ def call_method(self, obj, meth, *args): return getattr(obj, meth)(*args) -def ast_parse_expr(expr, target='single', space=FakeSpace): + def call_function(self, func, *args): + return func(*args) + + builtin = dict(int=int, long=long, float=float, complex=complex) + +def ast_parse_expr(expr, target='single'): target = TARGET_DICT[target] - builder = AstBuilder(space=space()) + builder = AstBuilder(space=FakeSpace()) PYTHON_PARSER.parse_source(expr, target, builder) return builder @@ -610,14 +616,6 @@ for expr in family: yield check_expression, expr, 'exec' -def check_constant(expr): - ast_parse_expr(expr, 'single', objspace.StdObjSpace) - -def test_constants(): - for expr in constants: - yield check_constant, expr - - SNIPPETS = [ 'snippet_1.py', 'snippet_several_statements.py', @@ -633,9 +631,7 @@ 'snippet_2.py', 'snippet_3.py', 'snippet_4.py', - # XXX: skip snippet_comment because we don't have a replacement of - # eval for numbers and strings (eval_number('0x1L') fails) - # 'snippet_comment.py', + 'snippet_comment.py', 'snippet_encoding_declaration2.py', 'snippet_encoding_declaration3.py', 'snippet_encoding_declaration.py', Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py ============================================================================== --- 
pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astcompiler.py Fri Sep 16 13:38:17 2005 @@ -17,12 +17,13 @@ listmakers, genexps, dictmakers, multiexpr, attraccess, slices, imports,\ asserts, execs, prints, globs, raises_, imports_newstyle, augassigns, \ if_stmts, one_stmt_classdefs, one_stmt_funcdefs, tryexcepts, docstrings, \ - returns, SNIPPETS, SINGLE_INPUTS, LIBSTUFF + returns, SNIPPETS, SINGLE_INPUTS, LIBSTUFF, constants from test_astbuilder import FakeSpace TESTS = [ + constants, expressions, augassigns, comparisons, From arigo at codespeak.net Fri Sep 16 14:12:06 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 16 Sep 2005 14:12:06 +0200 (CEST) Subject: [pypy-svn] r17593 - pypy/dist/pypy/translator/c Message-ID: <20050916121206.1B5FA27C09@code1.codespeak.net> Author: arigo Date: Fri Sep 16 14:12:03 2005 New Revision: 17593 Modified: pypy/dist/pypy/translator/c/database.py pypy/dist/pypy/translator/c/gc.py pypy/dist/pypy/translator/c/node.py Log: Show progress while collecting objects in genc. Modified: pypy/dist/pypy/translator/c/database.py ============================================================================== --- pypy/dist/pypy/translator/c/database.py (original) +++ pypy/dist/pypy/translator/c/database.py Fri Sep 16 14:12:03 2005 @@ -1,3 +1,4 @@ +import sys from pypy.rpython.lltype import Primitive, Ptr, typeOf, RuntimeTypeInfo from pypy.rpython.lltype import Struct, Array, FuncType, PyObject, Void from pypy.rpython.lltype import ContainerType, pyobjectptr, OpaqueType, GcStruct @@ -19,6 +20,8 @@ self.structdefnodes = {} self.containernodes = {} self.containerlist = [] + self.completedcontainers = 0 + self.containerstats = {} self.externalfuncs = {} self.namespace = CNameManager() if not standalone: @@ -98,6 +101,8 @@ node = nodefactory(self, T, container) self.containernodes[container] = node self.containerlist.append(node) + kind = getattr(node, 'nodekind', '?') + self.containerstats[kind] = self.containerstats.get(kind, 0) + 1 return node def get(self, obj): @@ -152,8 +157,17 @@ return '' """ - def complete(self): - i = 0 + def complete(self, show_progress=True): + def dump(): + lst = ['%s: %d' % keyvalue + for keyvalue in self.containerstats.items()] + lst.sort() + print '%8d nodes [ %s ]' % (i, ' '.join(lst)) + i = self.completedcontainers + if show_progress: + show_i = (i//1000 + 1) * 1000 + else: + show_i = -1 while True: if hasattr(self, 'pyobjmaker'): self.pyobjmaker.collect_initcode() @@ -166,6 +180,12 @@ else: self.get(value) i += 1 + self.completedcontainers = i + if i == show_i: + dump() + show_i += 1000 + if show_progress: + dump() def globalcontainers(self): for node in self.containerlist: Modified: pypy/dist/pypy/translator/c/gc.py ============================================================================== --- pypy/dist/pypy/translator/c/gc.py (original) +++ pypy/dist/pypy/translator/c/gc.py Fri Sep 16 14:12:03 2005 @@ -257,6 +257,7 @@ err) class RefcountingRuntimeTypeInfo_OpaqueNode(ContainerNode): + nodekind = 'refcnt rtti' globalcontainer = True includes = () typename = 'void (@)(void *)' @@ -365,6 +366,7 @@ class BoehmGcRuntimeTypeInfo_OpaqueNode(ContainerNode): + nodekind = 'boehm rtti' globalcontainer = True includes = () typename = 'char @' Modified: pypy/dist/pypy/translator/c/node.py ============================================================================== --- pypy/dist/pypy/translator/c/node.py (original) +++ pypy/dist/pypy/translator/c/node.py 
Fri Sep 16 14:12:03 2005 @@ -318,6 +318,7 @@ assert not USESLOTS or '__dict__' not in dir(ContainerNode) class StructNode(ContainerNode): + nodekind = 'struct' if USESLOTS: __slots__ = () @@ -360,6 +361,7 @@ assert not USESLOTS or '__dict__' not in dir(StructNode) class ArrayNode(ContainerNode): + nodekind = 'array' if USESLOTS: __slots__ = () @@ -419,6 +421,7 @@ class FuncNode(ContainerNode): + nodekind = 'func' if USESLOTS: __slots__ = """funcgen""".split() @@ -526,6 +529,7 @@ raise ValueError, "don't know how to generate code for %r" % (fnobj,) class ExtType_OpaqueNode(ContainerNode): + nodekind = 'rpyopaque' def enum_dependencies(self): return [] @@ -554,6 +558,7 @@ class PyObjectNode(ContainerNode): + nodekind = 'pyobj' globalcontainer = True typename = 'PyObject @' implementationtypename = 'PyObject *@' From ale at codespeak.net Fri Sep 16 14:19:21 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Fri, 16 Sep 2005 14:19:21 +0200 (CEST) Subject: [pypy-svn] r17594 - pypy/dist/pypy/translator/goal Message-ID: <20050916121921.CCEA227C04@code1.codespeak.net> Author: ale Date: Fri Sep 16 14:19:20 2005 New Revision: 17594 Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py Log: renamed options1 to cmd_lin_opt. made it possible for the targets to define options. The options from the target will be merged with command line options into "options" (command line taking precedence). The options are not yet propagated further. Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Fri Sep 16 14:19:20 2005 @@ -106,7 +106,7 @@ ldef = listdef.ListDef(None, annmodel.SomeString()) inputtypes = [annmodel.SomeList(ldef)] - if not options1.no_annotations: + if not cmd_line_opt.no_annotations: print 'Annotating...' print 'with policy: %s.%s' % (policy.__class__.__module__, policy.__class__.__name__) a = t.annotate(inputtypes, policy=policy) @@ -119,18 +119,18 @@ if a: #and not options['-no-s']: print 'Simplifying...' a.simplify() - if 'fork1' in options1.fork: + if 'fork1' in cmd_line_opt.fork: from pypy.translator.goal import unixcheckpoint assert_rpython_mostly_not_imported() unixcheckpoint.restartable_point(auto='run') - if a and options1.specialize: + if a and cmd_line_opt.specialize: print 'Specializing...' t.specialize(dont_simplify_again=True, - crash_on_first_typeerror=not options1.insist) - if options1.optimize and options1.backend != 'llvm': + crash_on_first_typeerror=not cmd_line_opt.insist) + if cmd_line_opt.optimize and cmd_line_opt.backend != 'llvm': print 'Back-end optimizations...' t.backend_optimizations() - if a and 'fork2' in options1.fork: + if a and 'fork2' in cmd_line_opt.fork: from pypy.translator.goal import unixcheckpoint unixcheckpoint.restartable_point(auto='run') if a: @@ -172,20 +172,20 @@ print 'Done.' 
print func, args = pdb_plus_show.set_trace, () - if not options1.pygame: - if options1.batch: + if not cmd_line_opt.pygame: + if cmd_line_opt.batch: print >>sys.stderr, "batch mode, not calling interactive helpers" else: func(*args) else: - if options1.batch: + if cmd_line_opt.batch: print >>sys.stderr, "batch mode, not calling interactive helpers" else: if serv_start: start, show, stop, cleanup = serv_start, serv_show, serv_stop, serv_cleanup else: from pypy.translator.tool.pygame.server import run_translator_server - start, show, stop, cleanup = run_translator_server(t, entry_point, options1) + start, show, stop, cleanup = run_translator_server(t, entry_point, cmd_line_opt) pdb_plus_show.install_show(show) debugger = run_debugger_in_thread(func, args, stop) debugger.start() @@ -236,7 +236,7 @@ parser.add_option(option[0],option[1], default=option[-1], dest=option[1].lstrip('--'), help=option[2]) - (options1, args) = parser.parse_args() + (cmd_line_opt, args) = parser.parse_args() argiter = iter(args) #sys.argv[1:]) for arg in argiter: try: @@ -251,17 +251,17 @@ t = None options = {} for opt in parser.option_list[1:]: - options[opt.dest] = getattr(options1,opt.dest) + options[opt.dest] = getattr(cmd_line_opt,opt.dest) if options.get('gc') == 'boehm': options['-boehm'] = True ## if options['-tcc']: ## os.environ['PYPY_CC'] = 'tcc -shared -o "%s.so" "%s.c"' - if options1.debug: + if cmd_line_opt.debug: annmodel.DEBUG = True try: err = None - if options1.load: - loaded_dic = load(options1.load) + if cmd_line_opt.load: + loaded_dic = load(cmd_line_opt.load) t = loaded_dic['trans'] entry_point = t.entrypoint inputtypes = loaded_dic['inputtypes'] @@ -276,14 +276,17 @@ print 'option %s is implied by the load' % name options[name] = True print "continuing Analysis as defined by %s, loaded from %s" %( - targetspec, options1.loadname) + targetspec, cmd_line_opt.loadname) targetspec_dic['target'] = None else: targetspec_dic = {} sys.path.insert(0, os.path.dirname(targetspec)) execfile(targetspec+'.py', targetspec_dic) print "Analysing target as defined by %s" % targetspec - + if targetspec_dic.get('options', None): + targetspec_dic['options'].update(options) + options = targetspec_dic['options'] + print options,targetspec_dic['options'] print 'options in effect:' optnames = options.keys() optnames.sort() @@ -293,7 +296,7 @@ policy = AnnotatorPolicy() target = targetspec_dic['target'] if target: - spec = target(not options1.lowmem) + spec = target(not cmd_line_opt.lowmem) try: entry_point, inputtypes, policy = spec except ValueError: @@ -312,11 +315,11 @@ except TyperError: err = sys.exc_info() print '-'*60 - if options1.save: - print 'saving state to %s' % options1.save + if cmd_line_opt.save: + print 'saving state to %s' % cmd_line_opt.save if err: print '*** this save is done after errors occured ***' - save(t, options1.save, + save(t, cmd_line_opt.save, trans=t, inputtypes=inputtypes, targetspec=targetspec, @@ -325,18 +328,18 @@ ) if err: raise err[0], err[1], err[2] - if options1.backend == 'c': #XXX probably better to supply gcpolicy as string to the backends + if cmd_line_opt.backend == 'c': #XXX probably better to supply gcpolicy as string to the backends gcpolicy = None - if options1.gc =='boehm': + if cmd_line_opt.gc =='boehm': from pypy.translator.c import gc gcpolicy = gc.BoehmGcPolicy - if options1.gc == 'none': + if cmd_line_opt.gc == 'none': from pypy.translator.c import gc gcpolicy = gc.NoneGcPolicy - elif options1.backend == 'llvm': - gcpolicy = options1.gc + elif 
cmd_line_opt.backend == 'llvm': + gcpolicy = cmd_line_opt.gc - if options1.backend == 'llinterpret': + if cmd_line_opt.backend == 'llinterpret': def interpret(): import py from pypy.rpython.llinterp import LLInterpreter @@ -345,24 +348,24 @@ interp.eval_function(entry_point, targetspec_dic['get_llinterp_args']()) interpret() - elif not options1.gencode: + elif not cmd_line_opt.gencode: print 'Not generating C code.' else: - print 'Generating %s %s code...' %(options1.compile and "and compiling" or "",options1.backend) - keywords = {'really_compile' : options1.compile, + print 'Generating %s %s code...' %(cmd_line_opt.compile and "and compiling" or "",cmd_line_opt.backend) + keywords = {'really_compile' : cmd_line_opt.compile, 'standalone' : standalone, 'gcpolicy' : gcpolicy} - c_entry_point = t.compile(options1.backend, **keywords) + c_entry_point = t.compile(cmd_line_opt.backend, **keywords) if standalone: # xxx fragile and messy import shutil exename = mkexename(c_entry_point) - newexename = mkexename('./pypy-' + options1.backend) + newexename = mkexename('./pypy-' + cmd_line_opt.backend) shutil.copy(exename, newexename) c_entry_point = newexename update_usession_dir() print 'Written %s.' % (c_entry_point,) - if options1.run: + if cmd_line_opt.run: print 'Running!' if standalone: os.system(c_entry_point) From arigo at codespeak.net Fri Sep 16 19:57:31 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 16 Sep 2005 19:57:31 +0200 (CEST) Subject: [pypy-svn] r17599 - pypy/dist/pypy/objspace Message-ID: <20050916175731.65B3927BDA@code1.codespeak.net> Author: arigo Date: Fri Sep 16 19:57:30 2005 New Revision: 17599 Modified: pypy/dist/pypy/objspace/descroperation.py Log: Special-case calls to Methods for performance. Based on gprof, saying that among all calls to BuiltinFrame.setfastscope(), the most costly one is by far the one that will end up invoking Method.descr_method_call(). This gives a 20% speed-up on the example 'random.random()'. Modified: pypy/dist/pypy/objspace/descroperation.py ============================================================================== --- pypy/dist/pypy/objspace/descroperation.py (original) +++ pypy/dist/pypy/objspace/descroperation.py Fri Sep 16 19:57:30 2005 @@ -1,7 +1,7 @@ import operator from pypy.interpreter.error import OperationError from pypy.interpreter.baseobjspace import ObjSpace, W_Root, BaseWrappable -from pypy.interpreter.function import Function +from pypy.interpreter.function import Function, Method from pypy.interpreter.gateway import BuiltinCode from pypy.interpreter.argument import Arguments from pypy.tool.sourcetools import compile2, func_with_new_name @@ -106,9 +106,11 @@ return space.call_args(w_impl, args) def call_args(space, w_obj, args): - # a special case for performance + # two special cases for performance if isinstance(w_obj, Function): return w_obj.call_args(args) + if isinstance(w_obj, Method): + return w_obj.call_args(args) w_descr = space.lookup(w_obj, '__call__') if w_descr is None: raise OperationError( From arigo at codespeak.net Fri Sep 16 20:17:27 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 16 Sep 2005 20:17:27 +0200 (CEST) Subject: [pypy-svn] r17600 - pypy/dist/pypy/translator/backendopt Message-ID: <20050916181727.278DB27BF4@code1.codespeak.net> Author: arigo Date: Fri Sep 16 20:17:26 2005 New Revision: 17600 Modified: pypy/dist/pypy/translator/backendopt/inline.py Log: Adapted the inlining threshold to target a specific bound within the list of all functions of PyPy. 
Seems to give reasonable results. Modified: pypy/dist/pypy/translator/backendopt/inline.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/inline.py (original) +++ pypy/dist/pypy/translator/backendopt/inline.py Fri Sep 16 20:17:26 2005 @@ -10,7 +10,7 @@ from pypy.rpython import rmodel from pypy.translator.backendopt import sparsemat -BASE_INLINE_THRESHOLD = 18 # just enough to inline ll_rangeiter_next() +BASE_INLINE_THRESHOLD = 38.8 # just enough to inline add__Int_Int() class CannotInline(Exception): pass From ericvrp at codespeak.net Fri Sep 16 20:26:34 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 16 Sep 2005 20:26:34 +0200 (CEST) Subject: [pypy-svn] r17601 - in pypy/dist/pypy/translator: goal llvm/module Message-ID: <20050916182634.0610D27BED@code1.codespeak.net> Author: ericvrp Date: Fri Sep 16 20:26:32 2005 New Revision: 17601 Modified: pypy/dist/pypy/translator/goal/targetpypystandalone.py pypy/dist/pypy/translator/goal/translate_pypy.py pypy/dist/pypy/translator/llvm/module/support.py Log: Fix because LLVM backend was not using backendoptz. (which was really hurting it) Modified: pypy/dist/pypy/translator/goal/targetpypystandalone.py ============================================================================== --- pypy/dist/pypy/translator/goal/targetpypystandalone.py (original) +++ pypy/dist/pypy/translator/goal/targetpypystandalone.py Fri Sep 16 20:26:32 2005 @@ -60,7 +60,7 @@ if __main__.options.get('-boehm'): #print "disabling thread with boehm for stabilitiy (combination not tested)" print "trying threads and boehm" - usemodules = ['thread'] + usemodules = [] else: usemodules = ['thread'] space = StdObjSpace(nofaking=True, Modified: pypy/dist/pypy/translator/goal/translate_pypy.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy.py Fri Sep 16 20:26:32 2005 @@ -148,9 +148,9 @@ print 'Specializing...' t.specialize(dont_simplify_again=True, crash_on_first_typeerror=not options['-t-insist']) - if not options['-no-o'] and not options['-llvm']: + if not options['-no-o']: print 'Back-end optimizations...' 
- t.backend_optimizations() + t.backend_optimizations(ssa_form=not options['-llvm']) if a and options['-fork2']: from pypy.translator.goal import unixcheckpoint unixcheckpoint.restartable_point(auto='run') Modified: pypy/dist/pypy/translator/llvm/module/support.py ============================================================================== --- pypy/dist/pypy/translator/llvm/module/support.py (original) +++ pypy/dist/pypy/translator/llvm/module/support.py Fri Sep 16 20:26:32 2005 @@ -264,7 +264,6 @@ extfunctions["%main"] = [(), """ int %main(int %argc, sbyte** %argv) { entry: - ;%pypy_argv = call fastcc %RPyListOfString* %pypy_ll_newlist__Ptr_GcStruct_listLlT_Signed(int 0) %pypy_argv = call fastcc %structtype.list* %pypy_ll_newlist__Ptr_GcStruct_listLlT_Signed(int 0) br label %no_exit @@ -274,7 +273,6 @@ %tmp.8 = getelementptr sbyte** %argv, uint %indvar %tmp.9 = load sbyte** %tmp.8 %rpy = call fastcc %RPyString* %RPyString_FromString(sbyte* %tmp.9) - ;call fastcc void %pypy_ll_append__listPtr_rpy_stringPtr(%RPyListOfString* %pypy_argv, %RPyString* %rpy) call fastcc void %pypy_ll_append__listPtr_rpy_stringPtr(%structtype.list* %pypy_argv, %RPyString* %rpy) %inc = add int %i.0.0, 1 %tmp.2 = setlt int %inc, %argc From arigo at codespeak.net Fri Sep 16 20:30:00 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 16 Sep 2005 20:30:00 +0200 (CEST) Subject: [pypy-svn] r17602 - pypy/dist/pypy/translator/goal Message-ID: <20050916183000.2688727BF2@code1.codespeak.net> Author: arigo Date: Fri Sep 16 20:29:59 2005 New Revision: 17602 Modified: pypy/dist/pypy/translator/goal/targetpypystandalone.py Log: -boehm disables threads, but prints that it doesn't disable threads :-/ Modified: pypy/dist/pypy/translator/goal/targetpypystandalone.py ============================================================================== --- pypy/dist/pypy/translator/goal/targetpypystandalone.py (original) +++ pypy/dist/pypy/translator/goal/targetpypystandalone.py Fri Sep 16 20:29:59 2005 @@ -59,7 +59,7 @@ StdObjSpace.setup_old_style_classes = lambda self: None if __main__.options.get('-boehm'): #print "disabling thread with boehm for stabilitiy (combination not tested)" - print "trying threads and boehm" + #print "trying threads and boehm" usemodules = [] else: usemodules = ['thread'] From arigo at codespeak.net Fri Sep 16 21:15:30 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 16 Sep 2005 21:15:30 +0200 (CEST) Subject: [pypy-svn] r17603 - pypy/dist/pypy/rpython Message-ID: <20050916191530.ADAF227BF2@code1.codespeak.net> Author: arigo Date: Fri Sep 16 21:15:25 2005 New Revision: 17603 Modified: pypy/dist/pypy/rpython/rlist.py Log: This may be a good speed-up (untested): don't reallocate small lists too often when items are removed. This formula should make sure that e.g. popping from 2 to 1 doesn't reallocate from 4 to 4 (!). It might have the side effect that small lists with elements removed consume a tiny bit more memory, but well. Modified: pypy/dist/pypy/rpython/rlist.py ============================================================================== --- pypy/dist/pypy/rpython/rlist.py (original) +++ pypy/dist/pypy/rpython/rlist.py Fri Sep 16 21:15:25 2005 @@ -341,7 +341,7 @@ # Bypass realloc() when a previous overallocation is large enough # to accommodate the newsize. If the newsize falls lower than half # the allocated size, then proceed with the realloc() to shrink the list. 
- if allocated >= newsize and newsize >= (allocated >> 1): + if allocated >= newsize and newsize >= ((allocated >> 1) - 5): # assert l.ob_item != NULL or newsize == 0 l.length = newsize return From ericvrp at codespeak.net Fri Sep 16 22:25:59 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 16 Sep 2005 22:25:59 +0200 (CEST) Subject: [pypy-svn] r17605 - in pypy/dist/pypy/translator/llvm: . module Message-ID: <20050916202559.1DCFF27BF2@code1.codespeak.net> Author: ericvrp Date: Fri Sep 16 22:25:58 2005 New Revision: 17605 Modified: pypy/dist/pypy/translator/llvm/genllvm.py pypy/dist/pypy/translator/llvm/module/support.py Log: Fixes for using the correct predeclared RPystring functions Modified: pypy/dist/pypy/translator/llvm/genllvm.py ============================================================================== --- pypy/dist/pypy/translator/llvm/genllvm.py (original) +++ pypy/dist/pypy/translator/llvm/genllvm.py Fri Sep 16 22:25:58 2005 @@ -119,6 +119,7 @@ # set up all nodes self.db.setup_all() self.entrynode = self.db.set_entrynode(entry_point) + entryfunc_name = self.entrynode.getdecl().split('%', 1)[1].split('(')[0] self._checkpoint('setup_all') # post set up externs @@ -193,9 +194,6 @@ codewriter.append(self.exceptionpolicy.pyrex_entrypoint_code(self.entrynode)) # XXX we need to create our own main() that calls the actual entry_point function - decl = self.entrynode.getdecl() - t = decl.split('%', 1) - entryfunc_name = t[1].split('(')[0] if entryfunc_name == 'pypy_entry_point': #XXX just to get on with translate_pypy extfuncnode.ExternalFuncNode.used_external_functions['%main'] = True Modified: pypy/dist/pypy/translator/llvm/module/support.py ============================================================================== --- pypy/dist/pypy/translator/llvm/module/support.py (original) +++ pypy/dist/pypy/translator/llvm/module/support.py Fri Sep 16 22:25:58 2005 @@ -264,7 +264,7 @@ extfunctions["%main"] = [(), """ int %main(int %argc, sbyte** %argv) { entry: - %pypy_argv = call fastcc %structtype.list* %pypy_ll_newlist__Ptr_GcStruct_listLlT_Signed(int 0) + %pypy_argv = call fastcc %structtype.list* %pypy__RPyListOfString_New__Signed(int %argc) br label %no_exit no_exit: @@ -273,7 +273,7 @@ %tmp.8 = getelementptr sbyte** %argv, uint %indvar %tmp.9 = load sbyte** %tmp.8 %rpy = call fastcc %RPyString* %RPyString_FromString(sbyte* %tmp.9) - call fastcc void %pypy_ll_append__listPtr_rpy_stringPtr(%structtype.list* %pypy_argv, %RPyString* %rpy) + call fastcc void %pypy__RPyListOfString_SetItem__listPtr_Signed_rpy_stringPtr(%structtype.list* %pypy_argv, int %i.0.0, %RPyString* %rpy) %inc = add int %i.0.0, 1 %tmp.2 = setlt int %inc, %argc %indvar.next = add uint %indvar, 1 From tismer at codespeak.net Fri Sep 16 23:14:05 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Fri, 16 Sep 2005 23:14:05 +0200 (CEST) Subject: [pypy-svn] r17607 - in pypy/dist/pypy: rpython translator/goal Message-ID: <20050916211405.3B44727BF2@code1.codespeak.net> Author: tismer Date: Fri Sep 16 23:14:04 2005 New Revision: 17607 Modified: pypy/dist/pypy/rpython/rlist.py pypy/dist/pypy/translator/goal/bench-windows.py Log: changed _ll_list_resize to consist of two functions, one that handles the common case not to resize, and _ll_list_resize_really that does resize. Unfortunately, with our current threshold settings, this causes both functions to be inlined, with the effect that this code is now generated for 16 types times 10 call-sites! 
I also believe we need to thing about inlining criteria, again. What happens if I rewrite functions to consist of many tiny functions calling each other? Then inlining would suck this all in. So I think something is missing. Here are the current results: executable abs.richards abs.pystone rel.richards rel.pystone pypy-c-17439 35970 ms 646.105 42.5 56.4 pypy-c-17512 46946 ms 652.469 55.5 55.8 pypy-c-17516 38805 ms 700.740 45.9 52.0 pypy-c-17545-intern 35019 ms 751.860 41.4 48.4 pypy-c-17572 36882 ms 737.218 43.6 49.4 pypy-c-17600 27279 ms 895.572 32.2 40.7 pypy-c-17600_ll_list 27539 ms 799.185 32.6 45.6 python 2.3.3 846 ms 36419.800 1.0 1.0 Modified: pypy/dist/pypy/rpython/rlist.py ============================================================================== --- pypy/dist/pypy/rpython/rlist.py (original) +++ pypy/dist/pypy/rpython/rlist.py Fri Sep 16 23:14:04 2005 @@ -321,7 +321,7 @@ # adapted C code -def _ll_list_resize(l, newsize): +def _ll_list_resize_really(l, newsize): """ Ensure ob_item has room for at least newsize elements, and set ob_size to newsize. If newsize > ob_size on entry, the content @@ -338,14 +338,6 @@ """ allocated = len(l.items) - # Bypass realloc() when a previous overallocation is large enough - # to accommodate the newsize. If the newsize falls lower than half - # the allocated size, then proceed with the realloc() to shrink the list. - if allocated >= newsize and newsize >= ((allocated >> 1) - 5): - # assert l.ob_item != NULL or newsize == 0 - l.length = newsize - return - # This over-allocates proportional to the list size, making room # for additional growth. The over-allocation is mild, but is # enough to give linear-time amortized behavior over a long @@ -376,6 +368,20 @@ l.length = newsize l.items = newitems +# this common case was factored out of _ll_list_resize +# to see if inlining it gives some speed-up. + +def _ll_list_resize(l, newsize): + # Bypass realloc() when a previous overallocation is large enough + # to accommodate the newsize. If the newsize falls lower than half + # the allocated size, then proceed with the realloc() to shrink the list. + allocated = len(l.items) + if allocated >= newsize and newsize >= ((allocated >> 1) - 5): + # assert l.ob_item != NULL or newsize == 0 + l.length = newsize + else: + _ll_list_resize_really(l, newsize) + def ll_copy(l): items = l.items length = l.length Modified: pypy/dist/pypy/translator/goal/bench-windows.py ============================================================================== --- pypy/dist/pypy/translator/goal/bench-windows.py (original) +++ pypy/dist/pypy/translator/goal/bench-windows.py Fri Sep 16 23:14:04 2005 @@ -27,12 +27,16 @@ def run_pystone(executable='python', n=0): argstr = PYSTONE_CMD % (str(n) and n or '') txt = run_cmd('%s -c "%s"' % (executable, argstr)) - return get_result(txt, PYSTONE_PATTERN) + res = get_result(txt, PYSTONE_PATTERN) + print res + return res def run_richards(executable='python', n=10): argstr = RICHARDS_CMD % n txt = run_cmd('%s -c "%s"' % (executable, argstr)) - return get_result(txt, RICHARDS_PATTERN) + res = get_result(txt, RICHARDS_PATTERN) + print res + return res def get_executables(): exes = [name for name in os.listdir('.') if name.endswith('.exe')] From arigo at codespeak.net Sat Sep 17 11:04:19 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 17 Sep 2005 11:04:19 +0200 (CEST) Subject: [pypy-svn] r17611 - in pypy/dist/pypy/translator/c: . 
test Message-ID: <20050917090419.A8D5727B90@code1.codespeak.net> Author: arigo Date: Sat Sep 17 11:04:16 2005 New Revision: 17611 Modified: pypy/dist/pypy/translator/c/node.py pypy/dist/pypy/translator/c/support.py pypy/dist/pypy/translator/c/test/test_genc.py Log: Writing C string constants in a more compact format. I hope I didn't disrupt indentation or produce broken code. (Need to try a targetpypystandalone to be sure, of course) Modified: pypy/dist/pypy/translator/c/node.py ============================================================================== --- pypy/dist/pypy/translator/c/node.py (original) +++ pypy/dist/pypy/translator/c/node.py Sat Sep 17 11:04:16 2005 @@ -2,11 +2,11 @@ from pypy.rpython.lltype import Struct, Array, FuncType, PyObjectType, typeOf from pypy.rpython.lltype import GcStruct, GcArray, GC_CONTAINER, ContainerType from pypy.rpython.lltype import parentlink, Ptr, PyObject, Void, OpaqueType -from pypy.rpython.lltype import RuntimeTypeInfo, getRuntimeTypeInfo +from pypy.rpython.lltype import RuntimeTypeInfo, getRuntimeTypeInfo, Char from pypy.translator.c.funcgen import FunctionCodeGenerator from pypy.translator.c.external import CExternalFunctionCodeGenerator from pypy.translator.c.support import USESLOTS # set to False if necessary while refactoring -from pypy.translator.c.support import cdecl, somelettersfrom +from pypy.translator.c.support import cdecl, somelettersfrom, c_string_constant from pypy.translator.c.primitive import PrimitiveType from pypy.translator.c import extfunc from pypy.rpython.rstr import STR @@ -348,11 +348,12 @@ for name in self.T._names: value = getattr(self.obj, name) c_name = defnode.c_struct_field_name(name) - expr = generic_initializationexpr(self.db, value, - '%s.%s' % (self.name, c_name), - decoration + name) - yield '\t%s' % expr - if not expr.startswith('/*'): + lines = generic_initializationexpr(self.db, value, + '%s.%s' % (self.name, c_name), + decoration + name) + for line in lines: + yield '\t' + line + if not lines[0].startswith('/*'): is_empty = False if is_empty: yield '\t%s' % '0,' @@ -383,14 +384,19 @@ if self.T.OF == Void or len(self.obj.items) == 0: yield '\t%d' % len(self.obj.items) yield '}' + elif self.T.OF == Char: + yield '\t%d, %s' % (len(self.obj.items), + c_string_constant(''.join(self.obj.items))) + yield '}' else: yield '\t%d, {' % len(self.obj.items) for j in range(len(self.obj.items)): value = self.obj.items[j] - expr = generic_initializationexpr(self.db, value, + lines = generic_initializationexpr(self.db, value, '%s.items[%d]' % (self.name, j), '%s%d' % (decoration, j)) - yield '\t%s' % expr + for line in lines: + yield '\t' + line yield '} }' assert not USESLOTS or '__dict__' not in dir(ArrayNode) @@ -398,8 +404,9 @@ def generic_initializationexpr(db, value, access_expr, decoration): if isinstance(typeOf(value), ContainerType): node = db.getcontainernode(value) - expr = '\n'.join(node.initializationexpr(decoration+'.')) - expr += ',' + lines = list(node.initializationexpr(decoration+'.')) + lines[-1] += ',' + return lines else: comma = ',' if typeOf(value) == Ptr(PyObject) and value: @@ -415,7 +422,7 @@ i = expr.find('\n') if i<0: i = len(expr) expr = '%s\t/* %s */%s' % (expr[:i], decoration, expr[i:]) - return expr.replace('\n', '\n\t') # indentation + return expr.split('\n') # ____________________________________________________________ Modified: pypy/dist/pypy/translator/c/support.py ============================================================================== --- pypy/dist/pypy/translator/c/support.py 
(original) +++ pypy/dist/pypy/translator/c/support.py Sat Sep 17 11:04:16 2005 @@ -68,6 +68,34 @@ ''') +def c_string_constant(s): + '''Returns EITHER a " "-delimited string literal for C + OR a { }-delimited array of chars. + ''' + def char_repr(c): + if c in '\\"': return '\\' + c + if ' ' <= c < '\x7F': return c + return '\\%03o' % ord(c) + def line_repr(s): + return ''.join([char_repr(c) for c in s]) + + if len(s) < 64: + return '"%s"' % line_repr(s) + + elif len(s) < 1024: + lines = ['"'] + for i in range(0, len(s), 32): + lines.append(line_repr(s[i:i+32])) + lines[-1] += '"' + return '\\\n'.join(lines) + + else: + lines = [] + for i in range(0, len(s), 20): + lines.append(','.join([str(ord(c)) for c in s[i:i+20]])) + return '{\n%s}' % ',\n'.join(lines) + + def gen_assignments(assignments): # Generate a sequence of assignments that is possibly reordered # to avoid clashes -- i.e. do the equivalent of a tuple assignment, Modified: pypy/dist/pypy/translator/c/test/test_genc.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_genc.py (original) +++ pypy/dist/pypy/translator/c/test/test_genc.py Sat Sep 17 11:04:16 2005 @@ -224,3 +224,16 @@ return a.d['hey'] f = compile(t, []) assert f() == 2 + +def test_long_strings(): + s1 = 'hello' + s2 = ''.join([chr(i) for i in range(256)]) + s3 = 'abcd'*17 + s4 = open(__file__, 'rb').read() + choices = [s1, s2, s3, s4] + def f(i, j): + return choices[i][j] + f1 = compile(f, [int, int]) + for i, s in enumerate(choices): + for j, c in enumerate(s): + assert f1(i, j) == c From cfbolz at codespeak.net Sat Sep 17 14:03:25 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Sat, 17 Sep 2005 14:03:25 +0200 (CEST) Subject: [pypy-svn] r17612 - in pypy/dist/pypy/objspace/std: . test Message-ID: <20050917120325.D44C527B80@code1.codespeak.net> Author: cfbolz Date: Sat Sep 17 14:03:24 2005 New Revision: 17612 Modified: pypy/dist/pypy/objspace/std/slicetype.py pypy/dist/pypy/objspace/std/test/test_sliceobject.py Log: rewrite the indices implementation of slices at interplevel to make it a bit less slow. 
Following the CPython source code Modified: pypy/dist/pypy/objspace/std/slicetype.py ============================================================================== --- pypy/dist/pypy/objspace/std/slicetype.py (original) +++ pypy/dist/pypy/objspace/std/slicetype.py Sat Sep 17 14:03:24 2005 @@ -1,3 +1,4 @@ +import sys from pypy.interpreter import baseobjspace from pypy.objspace.std.stdtypedef import * from pypy.objspace.std.register_all import register_all @@ -5,82 +6,77 @@ slice_indices = MultiMethod('indices', 2) -# default application-level implementations for some operations -# gateway is imported in the stdtypedef module -app = gateway.applevel(""" - - def indices(slice, length): - # this is used internally, analogous to CPython's PySlice_GetIndicesEx - step = slice.step - if step is None: - step = 1 - elif step == 0: - raise ValueError, "slice step cannot be zero" - if step < 0: - defstart = length - 1 - defstop = -1 +def slice_indices__ANY_ANY(space, w_slice, w_length): + length = space.int_w(w_length) + start, stop, step = indices3(space, w_slice, length) + return space.newtuple([space.wrap(start), space.wrap(stop), + space.wrap(step)]) + +# utility functions +def _Eval_SliceIndex(space, w_int): + try: + x = space.int_w(w_int) + except OperationError, e: + if not e.match(space, space.w_OverflowError): + raise + cmp = space.is_true(space.ge(w_int, space.wrap(0))) + if cmp: + x = sys.maxint else: - defstart = 0 - defstop = length + x = -sys.maxint + return x - start = slice.start - if start is None: - start = defstart +def indices3(space, w_slice, length): + if space.is_true(space.is_(w_slice.w_step, space.w_None)): + step = 1 + else: + step = _Eval_SliceIndex(space, w_slice.w_step) + if step == 0: + raise OperationError(space.w_ValueError, + space.wrap("slice step cannot be zero")) + if space.is_true(space.is_(w_slice.w_start, space.w_None)): + if step < 0: + start = length - 1 else: + start = 0 + else: + start = _Eval_SliceIndex(space, w_slice.w_start) + if start < 0: + start += length if start < 0: - start += length - if start < 0: - if step < 0: - start = -1 - else: - start = 0 - elif start >= length: if step < 0: - start = length - 1 + start = -1 else: - start = length - - stop = slice.stop - if stop is None: - stop = defstop + start = 0 + elif start >= length: + if step < 0: + start = length - 1 + else: + start = length + if space.is_true(space.is_(w_slice.w_stop, space.w_None)): + if step < 0: + stop = -1 else: + stop = length + else: + stop = _Eval_SliceIndex(space, w_slice.w_stop) + if stop < 0: + stop += length if stop < 0: - stop += length - if stop < 0: - stop = -1 - elif stop > length: - stop = length - - return start, stop, step - - def slice_indices4(slice, sequencelength): - start, stop, step = indices(slice, sequencelength) - slicelength = stop - start - lengthsign = cmp(slicelength, 0) - stepsign = cmp(step, 0) - if stepsign == lengthsign: - slicelength = (slicelength - lengthsign) // step + 1 - else: - slicelength = 0 - - return start, stop, step, slicelength -""", filename=__file__) - -slice_indices__ANY_ANY = app.interphook("indices") -slice_indices3 = slice_indices__ANY_ANY -slice_indices4 = app.interphook("slice_indices4") - -# utility functions -def indices3(space, w_slice, length): - w_result = slice_indices3(space, w_slice, space.wrap(length)) - w_1, w_2, w_3 = space.unpacktuple(w_result, 3) - return space.int_w(w_1), space.int_w(w_2), space.int_w(w_3) + stop =-1 + elif stop > length: + stop = length + return start, stop, step def indices4(space, w_slice, 
length): - w_result = slice_indices4(space, w_slice, space.wrap(length)) - w_1, w_2, w_3, w_4 = space.unpacktuple(w_result, 4) - return (space.int_w(w_1), space.int_w(w_2), - space.int_w(w_3), space.int_w(w_4)) + start, stop, step = indices3(space, w_slice, length) + if (step < 0 and stop >= start) or (step > 0 and start >= stop): + slicelength = 0 + elif step < 0: + slicelength = (stop - start + 1) / step + 1 + else: + slicelength = (stop - start - 1) / step + 1 + return start, stop, step, slicelength def adapt_bound(space, w_index, w_size): if not (space.is_true(space.isinstance(w_index, space.w_int)) or Modified: pypy/dist/pypy/objspace/std/test/test_sliceobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/test/test_sliceobject.py (original) +++ pypy/dist/pypy/objspace/std/test/test_sliceobject.py Sat Sep 17 14:03:24 2005 @@ -24,6 +24,7 @@ self.space.raises_w(space.w_ValueError, slicetype.indices3, space, w_slice, 10) + class AppTest_SliceObject: def test_new(self): def cmp_slice(sl1, sl2): @@ -64,3 +65,14 @@ assert not (slice(1, 2, 3) < slice(1, 0, 0)) assert not (slice(1, 2, 3) < slice(1, 2, 0)) assert not (slice(1, 2, 3) < slice(1, 2, 3)) + + def test_long_indices(self): + assert slice(-2 ** 100, 10, 1).indices(1000) == (0, 10, 1) + assert slice(-2 ** 200, -2 ** 100, 1).indices(1000) == (0, -1, 1) + assert slice(2 ** 100, 0, -1).indices(1000) == (999, 0, -1) + assert slice(2 ** 100, -2 ** 100, -1).indices(1000) == (999, -1, -1) + start, stop, step = slice(0, 1000, 2 ** 200).indices(1000) + assert start == 0 + assert stop == 1000 + assert step >= 1000 + raises(OverflowError, "slice(0, 1000, 1).indices(2 ** 100)") From hpk at codespeak.net Sat Sep 17 14:45:23 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Sat, 17 Sep 2005 14:45:23 +0200 (CEST) Subject: [pypy-svn] r17613 - pypy/extradoc/sprintinfo Message-ID: <20050917124523.1554227B82@code1.codespeak.net> Author: hpk Date: Sat Sep 17 14:45:23 2005 New Revision: 17613 Modified: pypy/extradoc/sprintinfo/paris-2005-sprint.txt Log: fix paris people link Modified: pypy/extradoc/sprintinfo/paris-2005-sprint.txt ============================================================================== --- pypy/extradoc/sprintinfo/paris-2005-sprint.txt (original) +++ pypy/extradoc/sprintinfo/paris-2005-sprint.txt Sat Sep 17 14:45:23 2005 @@ -116,7 +116,7 @@ http://codespeak.net/svn/pypy/extradoc/sprintinfo/paris-2005-people.txt -.. _`Paris people`: http://codespeak.net/pypy/index.cgi?extradoc/sprintinfo/paris-2005-people.html +.. _`Paris people`: http://codespeak.net/pypy/extradoc/sprintinfo/paris-2005-people.html .. _`mailinglist`: .. _`PyPy sprint mailing list`: http://codespeak.net/mailman/listinfo/pypy-sprint .. _`pypy-0.7.0`: http://codespeak.net/pypy/dist/pypy/doc/release-0.7.0.html From arigo at codespeak.net Sat Sep 17 17:16:04 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 17 Sep 2005 17:16:04 +0200 (CEST) Subject: [pypy-svn] r17614 - pypy/dist/pypy/doc Message-ID: <20050917151604.302B827B80@code1.codespeak.net> Author: arigo Date: Sat Sep 17 17:16:02 2005 New Revision: 17614 Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Log: Some rewrites. More needed. 
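The interp-level clamping introduced in r17612 above can be exercised in isolation with plain integers; the following condensed sketch mirrors indices3()/indices4() but is illustrative only (the real code works on wrapped objects and relies on _Eval_SliceIndex() to clamp overflowing indices to plus or minus sys.maxint):

    def indices3(start, stop, step, length):
        # clamp start/stop into the range that is valid for the given step
        if step is None:
            step = 1
        if step == 0:
            raise ValueError("slice step cannot be zero")
        if start is None:
            start = length - 1 if step < 0 else 0
        else:
            if start < 0:
                start += length
            if start < 0:
                start = -1 if step < 0 else 0
            elif start >= length:
                start = length - 1 if step < 0 else length
        if stop is None:
            stop = -1 if step < 0 else length
        else:
            if stop < 0:
                stop += length
            if stop < 0:
                stop = -1
            elif stop > length:
                stop = length
        return start, stop, step

    def indices4(start, stop, step, length):
        # same, plus the number of items the slice actually selects
        start, stop, step = indices3(start, stop, step, length)
        if (step < 0 and stop >= start) or (step > 0 and start >= stop):
            slicelength = 0
        elif step < 0:
            slicelength = (stop - start + 1) // step + 1
        else:
            slicelength = (stop - start - 1) // step + 1
        return start, stop, step, slicelength

    assert indices4(None, None, None, 10) == (0, 10, 1, 10)
    assert indices4(None, None, -1, 10) == (9, -1, -1, 10)
    assert indices3(-2 ** 100, 10, 1, 1000) == (0, 10, 1)

The last assert corresponds to the new test_long_indices cases: hugely negative or positive bounds are simply clamped into range, so no application-level long arithmetic is needed.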
Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt ============================================================================== --- pypy/dist/pypy/doc/draft-dynamic-language-translation.txt (original) +++ pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Sat Sep 17 17:16:02 2005 @@ -3,15 +3,12 @@ ============================================================ -Introduction +The analysis of dynamic languages =============================================== -Dynamic languages ---------------------------- - Dynamic languages are definitely not new on the computing scene. However, new conditions like increased computing power and designs driven -by larger communities have allowed the emergence of new aspects in the +by larger communities have enabled the emergence of new aspects in the recent members of the family, or at least made them more practical than they previously were. The following aspects in particular are typical not only of Python but of most modern dynamic languages: @@ -19,8 +16,7 @@ * The driving force is not minimalistic elegance. It is a balance between elegance and practicality, and rather un-minimalistic -- the feature sets built into languages tend to be relatively large and growing - (though it is still a major difference between languages where exactly - they stand on this scale). + (to some extent, depending on the language). * High abstractions and theoretically powerful low-level primitives are generally ruled out in favor of a larger number of features that try to @@ -43,10 +39,10 @@ the complete program is built and run by executing statements. Some of these statements have a declarative look and feel; for example, some appear to be function or class declarations. Actually, they are merely -statements that, when executed, build a function or class object and store -a reference to that object at some place, under some name, from where it -can be retrieved later. Units of programs -- modules, whose source is a -file each -- are similarily mere objects in memory built on demand by some +statements that, when executed, build a function or class object. Then a +reference to the new object is stored at some place, under some name, from +where it can be accessed. Units of programs -- modules, whose source is one +file each -- are similarily mere objects in memory, built on demand by some other module executing an ``import`` statement. Any such statement -- class construction or module import -- can be executed at any time during the execution of a program. @@ -57,11 +53,11 @@ results of NP-complete computations or external factors. This is not just a theoretical possibility but a regularly used feature: for example, the pure Python module ``os.py`` provides some OS-independent interface to -OS-specific system calls, by importing OS-specific modules and defining -substitute functions as needed depending on the OS on which ``os.py`` -turns out to be executed. Many large Python projects use custom import -mechanisms to control exactly how and from where each module is loaded, -simply by tampering with import hooks or just emulating parts of the +OS-specific system calls, by importing internal OS-specific modules and +completing it with substitute functions, as needed by the OS on which +``os.py`` turns out to be executed. Many large Python projects use custom +import mechanisms to control exactly how and from where each module is +loaded, simply by tampering with import hooks or just emulating parts of the ``import`` statement manually. 
In addition, there are of course classical (and only partially true) @@ -71,51 +67,6 @@ fundamental to the nature of dynamic languages. -Control flow versus data model ---------------------------------- - -Given the absence of declarations, the only preprocessing done on a Python -module is the compilation of the source code into pseudo-code (bytecode). -From there, the semantics can be roughly divided in two groups: the -control flow semantics and the data model. In Python and other languages -of its family, these two aspects are to some extent conceptually -separated. Indeed, although it is possible -- and common -- to design -languages in which the two aspects are more intricately connected, or one -aspect is subsumed to the other (e.g. data structures in Lisp), -programmers tend to separate the two concepts in common cases -- enough -for the "practical-features-beats-obscure-primitives" language design -guideline seen above. So in Python, both aspects are complex on their -own. - -.. the above paragraph doesn't make a great deal of sense. some very long sentences! :) - -The control flow semantics include, clearly, all syntactic elements that -influence the control flow of a program -- loops, function definitions and -calls, etc. -- whereas the data model describes how the first-class -objects manipulated by the program behave under some operations. There is -a rich built-in set of object types in Python, and a rich set of -operations on them, each corresponding to a syntactic element. Objects of -different types react differently to the same operation, and the variables -are not statically typed, which is also part of the dynamic nature of -languages like Python -- operations are generally highly polymorphic and -types are hard to infer in advance. - -Note that control flow and object model are not entirely separated. It is -not uncommon for some control flow aspects to be manipulable as -first-class objects as well, e.g. functions in Python. Conversely, almost -any operation on any object could lead to a user-defined function being -called back. - -The data model forms a so-called *Object Space* in PyPy. The bytecode -interpreter works by delegating most operations to the object space, by -invoking a well-defined abstract interface. The objects are regarded as -"belonging" to the object space, where the interpreter sees them as black -boxes on which it can ask for operations to be performed. - -Note that the term "object space" has already been reused for other -dynamic language implementations, e.g. XXX for Perl 6. - - The analysis of live programs ----------------------------------- @@ -130,45 +81,91 @@ has reached a state that is deemed advanced enough, we limit the amount of dynamism that is allowed *after this point* and we analyse the program's objects in memory. In some sense, we use the full Python as a -preprocessor for a subset of the language, called RPython, which differs -from Python only in ruling out some operations like creating new classes. - -More theoretically, analysing dead source files is equivalent to giving up -all dynamism (in the sense of `No Declarations`_), but static analysis is -still possible if we allow a *finite* amount of dynamism -- where an -operation is considered dynamic or not depending on whether it is -supported or not by the analysis we are performing. 
Of course, putting -more cleverness in the tools helps too; but the point here is that we are -still allowed to build dynamic programs, as long as they only ever build a -bounded amount of, say, classes and functions. The source code of the -PyPy interpreter, which is itself written in its [this?] style, also makes +preprocessor for a subset of the language, called RPython. Informally, +RPython is Python without the operations and effects that are not supported +by our analysis toolchain (e.g. class creation, and most non-local effects). + +Of course, putting more efforts into the toolchain would allow us to +support a larger subset of Python. We do not claim that our toolchain -- +which we describe in the sequel of this paper -- is particularly advanced. +To make our point, let us assume a given an analysis tool, which supports +a given subset of a language. Then: + +* Analysing dead source files is equivalent to giving up all dynamism + (as far as unsupported by this tool). This is natural in the presence of + static declarations. + +* Analysing a frozen memory image of a program that we loaded and + initialized is equivalent to giving up all dynamic after a certain point + in time. This is natural in image-oriented environments like Smalltalk, + where the program resides in memory and not in files in the first place. + +Our approach goes further and analyses *live* programs in memory: +the program is allowed to contain fully dynamic sections, as long as these +sections are entered a *bounded* number of times. +For example, the source code of the PyPy +interpreter, which is itself written in this bounded-dynamism style, makes extensive use of the fact that it is possible to build new classes at any -point in time, not just during an initialization phase, as long as this -number of bounded (e.g. `interpreter/gateway.py`_ builds a custom class -for each function that some variable can point to -- there is a finite -number of functions in total, so this makes a finite number of extra -classes). - -.. the above paragraph is confusing too? - -Note that this approach is natural in image-oriented environment like -Smalltalk, where the program is by default live instead of in files. The -Python experience forced us to allow some uncontrolled dynamism simply to -be able to load the program to memory in the first place; once this was -done, it was a mere (but very useful) side-effect that we could allow for -some more uncontrolled dynamism at run-time, as opposed to analysing an -image in a known frozen state. +point in time -- not just during an initialization phase -- as long as this +number of bounded. E.g. `interpreter/gateway.py`_ builds a custom class +for each function that some variable can point to. There is a finite +number of functions in total, so this can obviously only create +a finite number of extra classes. But the precise set of functions that +need a corresponding class is difficult to manually compute in advance; +instead, the code that builds and cache a new class is invoked by the +analysis tool itself each time it discovers that a new function object can +reach the corresponding point. + +This approach is derived from dynamic analysis techniques that can support +unrestricted dynamic languages by falling back to a regular interpreter for +unsupported features (e.g. Psyco, described in +http://psyco.sourceforge.net/psyco-pepm-a.ps.gz). 
+The above argumentation should have shown why we think that being similarily +able to fall back to regular interpretation for parts that cannot be +understood is a central feature of the analysis of dynamic languages. + + +Concrete and abstract interpretation +====================================================== + +Object Spaces +--------------------------------- + +The semantics of Python can be roughly divided in two groups: the syntax of +the language, which focuses on control flow aspects, and the object semantics, +which define how various types of objects react to various operations and +methods. As it is common in all languages of the family, both the +syntactic elements and the object semantics are complex and at times +complicated (as opposed to more classical languages that tend to subsume +one aspect to the other: for example, Lisp's execution semantics are almost +trivial). + +This observation led us to the concept of *Object Space*. An interpreter can +be divided in two non-trivial parts: one for handling compilation to and +interpretation of pseudo-code (control flow aspects) and one implementing +the object library's semantics. The former, called *bytecode interpreter*, +considers objects as black boxes; any operation on objects requested by the +bytecode is handled over to the object library, called *object space*. +The point of this architecture is, precisely, that neither of these two +components is trivial; separating them explicitely, with a well-defined +interface inbetween, allows each part to be reused independently. This is +a major flexibility feature of PyPy: we can for example insert proxy object +spaces in front of the real one, like the `Thunk Object Space`_ adding lazy +evaluation of objects. + +Note that the term "object space" has already been reused for other +dynamic language implementations, e.g. XXX for Perl 6. Abstract interpretation ------------------------------ -The analysis we perform in PyPy is global program discovery (i.e. slicing -it out of all the objects in memory [what?]) and type inference. The -analysis of the non-dynamic parts themselves is based on their `abstract -interpretation`_. The object space separation was also designed for this -purpose. PyPy has an alternate object space called the `Flow Object -Space`_, whose objects are empty placeholders. The over-simplified view +In the sequel of this paper, we will consider another application +of the object space separation. The analysis we perform in PyPy +is whole-program type inference. The analysis of the non-dynamic +parts themselves is based on their `abstract interpretation`_. +PyPy has an alternate object space called the `Flow Object Space`_, +whose objects are empty placeholders. The over-simplified view is that to analyse a function, we bind its input arguments to such placeholders, and execute the function -- i.e. let the interpreter follow its bytecode and invoke the object space for each operations, one by one. @@ -178,7 +175,7 @@ view of what the function performs. The global picture is then to run the program while switching between the -flow object space for static enough functions, and a normal, concrete +flow object space for static enough functions, and a standard, concrete object space for functions or initializations requiring the full dynamism. If the placeholders are endowed with a bit more information, e.g. if they @@ -188,17 +185,36 @@ abstracting out some concrete values and replacing them with the set of all values that could actually be there. 
If the sets are broad enough, then after some time we will have seen all potential value sets along each -possible code paths, and our program analysis is complete. An object -space is thus an *interpretation domain*; the Flow Object Space is an -*abstract interpretation domain*. - -This is a theoretical point of view that differs significantly from what -we have implemented, for many reasons. Of course, the devil is in the -details -- which the rest of this paper is all about. +possible code paths, and our program analysis is complete. + +An object space is thus an *interpretation domain*; the Flow Object Space +is an *abstract interpretation domain*. We are thus interpreting the +program while switching dynamically between several abstraction levels. +This is possible because our design allows the *same* interpreter to work +with a concrete or an abstract object space. + +Following parts of the program at the abstract level allows us to deduce +general information about the program, and for parts that cannot be analysed +we switch to the concrete level. The restrictions placed on the program +to statically analyse are that to be crafted in such a way that this process +eventually terminates; from this point of view, more abstract is better (it +covers whole sets of objects in a single pass). Thus the compromize that +the author of the program to analyse faces are less strong but more subtle +than not using a specific set of dynamic features at all, but using them +sparsingly enough. + + +The PyPy analysis toolchain +=========================================== + +The previous sections have developed a theoretical point of view that +differs significantly from what we have implemented, for many reasons. +The devil is in the details. Flow Object Space -=================================== +--------------------------------- + XXX @@ -272,12 +288,16 @@ Annotator -=================================== +--------------------------------- XXX +.. _architecture: architecture.html +.. _`Thunk Object Space`: objspace.html#the-thunk-object-space .. _`abstract interpretation`: theory.html#abstract-interpretation -.. _`Flow Object Space`: objspace.html#flow-object-space +.. _`Flow Object Space`: objspace.html#the-flow-object-space +.. _`Standard Object Space`: objspace.html#the-standard-object-space +.. _Psyco: http://psyco.sourceforge.net/ .. include:: _ref.txt From ericvrp at codespeak.net Sat Sep 17 17:41:44 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Sat, 17 Sep 2005 17:41:44 +0200 (CEST) Subject: [pypy-svn] r17615 - pypy/dist/pypy/translator/llvm Message-ID: <20050917154144.7675427B80@code1.codespeak.net> Author: ericvrp Date: Sat Sep 17 17:41:43 2005 New Revision: 17615 Modified: pypy/dist/pypy/translator/llvm/codewriter.py pypy/dist/pypy/translator/llvm/funcnode.py pypy/dist/pypy/translator/llvm/opwriter.py pypy/dist/pypy/translator/llvm/varsize.py Log: Intermediate checking before implementing another exception handling policy. The LLVM invoke/unwind instructions cause plain ugly looking code! A lot of setjmp/longjmp's make the resulting assembly and c code very hard to read. 
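The r17614 draft above describes abstract interpretation as binding a function's input arguments to placeholders and letting the interpreter run while the Flow Object Space records every requested operation. A deliberately tiny stand-in for that idea, assuming nothing from the actual PyPy code (the real Flow Object Space drives the bytecode interpreter; this sketch fakes it with operator overloading and invented names):

    class Variable(object):
        # placeholder standing for "some object we know nothing about"
        counter = 0
        def __init__(self):
            Variable.counter += 1
            self.name = 'v%d' % Variable.counter
        def __add__(self, other):
            return SPACE.record('add', self, other)
        def __mul__(self, other):
            return SPACE.record('mul', self, other)

    class ToyFlowSpace(object):
        # records each operation instead of computing a result
        def __init__(self):
            self.operations = []
        def record(self, opname, *args):
            result = Variable()
            argnames = [arg.name for arg in args]
            self.operations.append((opname, argnames, result.name))
            return result

    SPACE = ToyFlowSpace()

    def f(x, y):
        return x * x + y

    f(Variable(), Variable())     # "run" f on placeholders
    assert [op[0] for op in SPACE.operations] == ['mul', 'add']

The recorded operations and the placeholders linking them give the abstract view of what f performs; in PyPy the equivalent record, together with the branches actually followed by the interpreter, is what the later analysis phases work on.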
Modified: pypy/dist/pypy/translator/llvm/codewriter.py ============================================================================== --- pypy/dist/pypy/translator/llvm/codewriter.py (original) +++ pypy/dist/pypy/translator/llvm/codewriter.py Sat Sep 17 17:41:43 2005 @@ -81,10 +81,10 @@ self.append("}") def ret(self, type_, ref): - self.indent("ret %s %s" % (type_, ref)) - - def ret_void(self): - self.indent("ret void") + if type_ == 'void': + self.indent("ret void") + else: + self.indent("ret %s %s" % (type_, ref)) def unwind(self): self.indent("unwind") @@ -109,35 +109,26 @@ # allocas or varargs in the caller. If the "tail" marker is present, the function # call is eligible for tail call optimization. Note that calls may be marked # "tail" even if they do not occur before a ret instruction. - def call(self, targetvar, returntype, functionref, argrefs, argtypes, tail=DEFAULT_TAIL, cconv=DEFAULT_CCONV): + def call(self, targetvar, returntype, functionref, argrefs, argtypes, label=None, except_label=None, tail=DEFAULT_TAIL, cconv=DEFAULT_CCONV): if cconv is not 'fastcc': tail_ = '' else: tail_ = tail if tail_: tail_ += ' ' - arglist = ["%s %s" % item for item in zip(argtypes, argrefs)] - self.indent("%s = %scall %s %s %s(%s)" % (targetvar, tail_, cconv, returntype, functionref, - ", ".join(arglist))) - - def call_void(self, functionref, argrefs, argtypes, tail=DEFAULT_TAIL, cconv=DEFAULT_CCONV): - if cconv is not 'fastcc': - tail_ = '' + args = ", ".join(["%s %s" % item for item in zip(argtypes, argrefs)]) + if except_label: + assert label + instruction = 'invoke' + optional = ' to label %%%s except label %%%s' % (label, except_label) else: - tail_ = tail - if tail_: - tail_ += ' ' - arglist = ["%s %s" % item for item in zip(argtypes, argrefs)] - self.indent("%scall %s void %s(%s)" % (tail_, cconv, functionref, ", ".join(arglist))) - - def invoke(self, targetvar, returntype, functionref, argrefs, argtypes, label, except_label, cconv=DEFAULT_CCONV): - arglist = ["%s %s" % item for item in zip(argtypes, argrefs)] - self.indent("%s = invoke %s %s %s(%s) to label %%%s except label %%%s" % (targetvar, cconv, returntype, functionref, - ", ".join(arglist), label, except_label)) - - def invoke_void(self, functionref, argrefs, argtypes, label, except_label, cconv=DEFAULT_CCONV): - arglist = ["%s %s" % item for item in zip(argtypes, argrefs)] - self.indent("invoke %s void %s(%s) to label %%%s except label %%%s" % (cconv, functionref, ", ".join(arglist), label, except_label)) + assert not label + instruction = 'call' + optional = '' + if returntype == 'void': + self.indent("%s%s %s void %s(%s)%s" % (tail_, instruction, cconv, functionref, args, optional)) + else: + self.indent("%s = %s%s %s %s %s(%s)%s" % (targetvar, tail_, instruction, cconv, returntype, functionref, args, optional)) def cast(self, targetvar, fromtype, fromvar, targettype): if fromtype == 'void' and targettype == 'void': Modified: pypy/dist/pypy/translator/llvm/funcnode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/funcnode.py (original) +++ pypy/dist/pypy/translator/llvm/funcnode.py Sat Sep 17 17:41:43 2005 @@ -205,10 +205,7 @@ self.write_block_phi_nodes(codewriter, block) inputargtype = self.db.repr_arg_type(block.inputargs[0]) inputarg = self.db.repr_arg(block.inputargs[0]) - if inputargtype != "void": - codewriter.ret(inputargtype, inputarg) - else: - codewriter.ret_void() + codewriter.ret(inputargtype, inputarg) def _is_raise_new_exception(self, block): 
is_raise_new = False Modified: pypy/dist/pypy/translator/llvm/opwriter.py ============================================================================== --- pypy/dist/pypy/translator/llvm/opwriter.py (original) +++ pypy/dist/pypy/translator/llvm/opwriter.py Sat Sep 17 17:41:43 2005 @@ -285,13 +285,9 @@ functionref = self.db.repr_arg(op_args[0]) argrefs = self.db.repr_arg_multi(op_args[1:]) argtypes = self.db.repr_arg_type_multi(op_args[1:]) - if returntype != "void": - if self.db.is_function_ptr(op.result): - returntype = "%s (%s)*" % (returntype, ", ".join(argtypes)) - self.codewriter.call(targetvar, returntype, functionref, argrefs, - argtypes) - else: - self.codewriter.call_void(functionref, argrefs, argtypes) + if self.db.is_function_ptr(op.result): + returntype = "%s (%s)*" % (returntype, ", ".join(argtypes)) + self.codewriter.call(targetvar,returntype,functionref,argrefs,argtypes) def invoke(self, op): op_args = [arg for arg in op.args @@ -332,13 +328,10 @@ - if returntype != "void": - if self.db.is_function_ptr(op.result): #use longhand form - returntype = "%s (%s)*" % (returntype, ", ".join(argtypes)) - self.codewriter.invoke(targetvar, returntype, functionref, argrefs, - argtypes, none_label, exc_label) - else: - self.codewriter.invoke_void(functionref, argrefs, argtypes, none_label, exc_label) + if self.db.is_function_ptr(op.result): #use longhand form + returntype = "%s (%s)*" % (returntype, ", ".join(argtypes)) + self.codewriter.call(targetvar, returntype, functionref, argrefs, + argtypes, none_label, exc_label) e = self.db.translator.rtyper.getexceptiondata() ll_exception_match = '%pypy_' + e.ll_exception_match.__name__ Modified: pypy/dist/pypy/translator/llvm/varsize.py ============================================================================== --- pypy/dist/pypy/translator/llvm/varsize.py (original) +++ pypy/dist/pypy/translator/llvm/varsize.py Sat Sep 17 17:41:43 2005 @@ -28,9 +28,9 @@ if ARRAY is STR.chars: #XXX instead of memset we could probably just zero the hash and string terminator - codewriter.call_void('%llvm.memset', ['%ptr', '0', '%usize', '0'], ['sbyte*', 'ubyte', 'uint', 'uint'], cconv='ccc') + codewriter.call('%dummy', 'void', '%llvm.memset', ['%ptr', '0', '%usize', '0'], ['sbyte*', 'ubyte', 'uint', 'uint'], cconv='ccc') else: - codewriter.call_void('%llvm.memset', ['%ptr', '0', '%usize', '0'], ['sbyte*', 'ubyte', 'uint', 'uint'], cconv='ccc') + codewriter.call('%dummy', 'void', '%llvm.memset', ['%ptr', '0', '%usize', '0'], ['sbyte*', 'ubyte', 'uint', 'uint'], cconv='ccc') indices_to_arraylength = tuple(indices_to_array) + (("uint", 0),) # the following accesses the length field of the array From pedronis at codespeak.net Sat Sep 17 18:13:08 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sat, 17 Sep 2005 18:13:08 +0200 (CEST) Subject: [pypy-svn] r17616 - pypy/dist/pypy/objspace/std Message-ID: <20050917161308.4D32827B80@code1.codespeak.net> Author: pedronis Date: Sat Sep 17 18:13:06 2005 New Revision: 17616 Modified: pypy/dist/pypy/objspace/std/sliceobject.py pypy/dist/pypy/objspace/std/slicetype.py Log: define indices directly on W_SliceObject Modified: pypy/dist/pypy/objspace/std/sliceobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/sliceobject.py (original) +++ pypy/dist/pypy/objspace/std/sliceobject.py Sat Sep 17 18:13:06 2005 @@ -60,5 +60,15 @@ """slices are not hashables but they must have a __hash__ method""" raise OperationError(space.w_TypeError, 
space.wrap("unhashable type")) +# indices impl -register_all(vars()) +from pypy.objspace.std import slicetype + +def slice_indices__Slice_ANY(space, w_slice, w_length): + length = space.int_w(w_length) + start, stop, step = slicetype.indices3(space, w_slice, length) + return space.newtuple([space.wrap(start), space.wrap(stop), + space.wrap(step)]) + +# register all methods +register_all(vars(), slicetype) Modified: pypy/dist/pypy/objspace/std/slicetype.py ============================================================================== --- pypy/dist/pypy/objspace/std/slicetype.py (original) +++ pypy/dist/pypy/objspace/std/slicetype.py Sat Sep 17 18:13:06 2005 @@ -4,14 +4,9 @@ from pypy.objspace.std.register_all import register_all from pypy.interpreter.error import OperationError +# indices multimehtod slice_indices = MultiMethod('indices', 2) -def slice_indices__ANY_ANY(space, w_slice, w_length): - length = space.int_w(w_length) - start, stop, step = indices3(space, w_slice, length) - return space.newtuple([space.wrap(start), space.wrap(stop), - space.wrap(step)]) - # utility functions def _Eval_SliceIndex(space, w_int): try: From cfbolz at codespeak.net Sat Sep 17 18:14:15 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Sat, 17 Sep 2005 18:14:15 +0200 (CEST) Subject: [pypy-svn] r17617 - in pypy/dist/pypy/rpython: . test Message-ID: <20050917161415.9488627B80@code1.codespeak.net> Author: cfbolz Date: Sat Sep 17 18:14:12 2005 New Revision: 17617 Modified: pypy/dist/pypy/rpython/exceptiondata.py pypy/dist/pypy/rpython/extfunctable.py pypy/dist/pypy/rpython/normalizecalls.py pypy/dist/pypy/rpython/rbuiltin.py pypy/dist/pypy/rpython/rclass.py pypy/dist/pypy/rpython/test/test_exception.py pypy/dist/pypy/rpython/test/test_rbuiltin.py Log: issue141: testing implemented issubclass checks using relative numbering including some more isinstance tests. This required quite some fixes in various places because of exceptions. 
Modified: pypy/dist/pypy/rpython/exceptiondata.py ============================================================================== --- pypy/dist/pypy/rpython/exceptiondata.py (original) +++ pypy/dist/pypy/rpython/exceptiondata.py Sat Sep 17 18:14:12 2005 @@ -4,7 +4,7 @@ from pypy.rpython.lltype import Array, malloc, Ptr, PyObject, pyobjectptr from pypy.rpython.lltype import FuncType, functionptr, Signed from pypy.rpython.extfunctable import standardexceptions - +from pypy.annotation.classdef import FORCE_ATTRIBUTES_INTO_CLASSES class ExceptionData: """Public information for the code generators to help with exceptions.""" @@ -70,7 +70,7 @@ if (clsdef and clsdef.cls is not Exception and issubclass(clsdef.cls, Exception)): cls = clsdef.cls - if cls in self.standardexceptions: + if cls in self.standardexceptions and cls not in FORCE_ATTRIBUTES_INTO_CLASSES: is_standard = True assert not clsdef.attrs, ( "%r should not have grown atributes" % (cls,)) Modified: pypy/dist/pypy/rpython/extfunctable.py ============================================================================== --- pypy/dist/pypy/rpython/extfunctable.py (original) +++ pypy/dist/pypy/rpython/extfunctable.py Sat Sep 17 18:14:12 2005 @@ -214,4 +214,9 @@ ZeroDivisionError: True, MemoryError : True, IOError : True, + OSError : True, + StopIteration : True, + KeyError : True, + IndexError : True, + AssertionError : True, } Modified: pypy/dist/pypy/rpython/normalizecalls.py ============================================================================== --- pypy/dist/pypy/rpython/normalizecalls.py (original) +++ pypy/dist/pypy/rpython/normalizecalls.py Sat Sep 17 18:14:12 2005 @@ -340,7 +340,19 @@ annotator.setbinding(graph.getreturnvar(), generalizedresult) classdef.my_instantiate = my_instantiate - +def assign_inheritance_ids(annotator): + def assign_id(classdef, nextid): + classdef.minid = nextid + nextid += 1 + for subclass in classdef.subdefs.values(): + nextid = assign_id(subclass, nextid) + classdef.maxid = nextid + return classdef.maxid + id_ = 0 + for cls, classdef in annotator.getuserclasses().items(): + if classdef.basedef is None: + id_ = assign_id(classdef, id_) + def perform_normalizations(rtyper): create_class_constructors(rtyper) rtyper.annotator.frozen += 1 @@ -348,6 +360,7 @@ normalize_function_signatures(rtyper.annotator) specialize_pbcs_by_memotables(rtyper.annotator) merge_classpbc_getattr_into_classdef(rtyper) + assign_inheritance_ids(rtyper.annotator) finally: rtyper.annotator.frozen -= 1 create_instantiate_functions(rtyper.annotator) Modified: pypy/dist/pypy/rpython/rbuiltin.py ============================================================================== --- pypy/dist/pypy/rpython/rbuiltin.py (original) +++ pypy/dist/pypy/rpython/rbuiltin.py Sat Sep 17 18:14:12 2005 @@ -151,8 +151,13 @@ instance_repr = hop.args_r[0].common_repr() v_obj, v_cls = hop.inputargs(instance_repr, class_repr) - - return hop.gendirectcall(rclass.ll_isinstance, v_obj, v_cls) + if isinstance(v_cls, Constant): + minid = hop.inputconst(lltype.Signed, v_cls.value.subclassrange_min) + maxid = hop.inputconst(lltype.Signed, v_cls.value.subclassrange_max) + return hop.gendirectcall(rclass.ll_isinstance_const, v_obj, minid, + maxid) + else: + return hop.gendirectcall(rclass.ll_isinstance, v_obj, v_cls) #def rtype_builtin_range(hop): see rrange.py Modified: pypy/dist/pypy/rpython/rclass.py ============================================================================== --- pypy/dist/pypy/rpython/rclass.py (original) +++ pypy/dist/pypy/rpython/rclass.py 
Sat Sep 17 18:14:12 2005 @@ -1,7 +1,9 @@ +import sys import types from pypy.annotation.pairtype import pairtype, pair from pypy.annotation import model as annmodel from pypy.annotation.classdef import isclassdef +from pypy.objspace.flow.model import Constant from pypy.rpython.rmodel import Repr, TyperError, inputconst, warning, needsgc from pypy.rpython.lltype import ForwardReference, GcForwardReference from pypy.rpython.lltype import Ptr, Struct, GcStruct, malloc @@ -17,6 +19,8 @@ # struct object_vtable { # struct object_vtable* parenttypeptr; # RuntimeTypeInfo * rtti; +# Signed subclassrange_min; //this is also the id of the class itself +# Signed subclassrange_max; # array { char } * name; # struct object * instantiate(); # } @@ -47,6 +51,8 @@ OBJECTPTR = Ptr(OBJECT) OBJECT_VTABLE.become(Struct('object_vtable', ('parenttypeptr', TYPEPTR), + ('subclassrange_min', Signed), + ('subclassrange_max', Signed), ('rtti', Ptr(RuntimeTypeInfo)), ('name', Ptr(Array(Char))), ('instantiate', Ptr(FuncType([], OBJECTPTR))))) @@ -230,6 +236,11 @@ # initialize the 'parenttypeptr' and 'name' fields if rsubcls.classdef is not None: vtable.parenttypeptr = rsubcls.rbase.getvtable() + vtable.subclassrange_min = rsubcls.classdef.minid + vtable.subclassrange_max = rsubcls.classdef.maxid + else: #for the root class + vtable.subclassrange_min = 0 + vtable.subclassrange_max = sys.maxint rinstance = getinstancerepr(self.rtyper, rsubcls.classdef) rinstance.setup() if rinstance.needsgc: # only gc-case @@ -338,7 +349,13 @@ def rtype_issubtype(self, hop): class_repr = get_type_repr(self.rtyper) v_cls1, v_cls2 = hop.inputargs(class_repr, class_repr) - return hop.gendirectcall(ll_issubclass, v_cls1, v_cls2) + if isinstance(v_cls2, Constant): + minid = hop.inputconst(Signed, v_cls2.value.subclassrange_min) + maxid = hop.inputconst(Signed, v_cls2.value.subclassrange_max) + return hop.gendirectcall(ll_issubclass_const, v_cls1, minid, maxid) + else: + v_cls1, v_cls2 = hop.inputargs(class_repr, class_repr) + return hop.gendirectcall(ll_issubclass, v_cls1, v_cls2) def get_type_repr(rtyper): return getclassrepr(rtyper, None) @@ -717,11 +734,11 @@ return cast_pointer(OBJECTPTR, obj).typeptr def ll_issubclass(subcls, cls): - while subcls != cls: - if not subcls: - return False - subcls = subcls.parenttypeptr - return True + return cls.subclassrange_min <= subcls.subclassrange_min < cls.subclassrange_max + +def ll_issubclass_const(subcls, minid, maxid): + return minid <= subcls.subclassrange_min < maxid + def ll_isinstance(obj, cls): # obj should be cast to OBJECT or NONGCOBJECT if not obj: @@ -729,6 +746,11 @@ obj_cls = obj.typeptr return ll_issubclass(obj_cls, cls) +def ll_isinstance_const(obj, minid, maxid): + if not obj: + return False + return ll_issubclass_const(obj.typeptr, minid, maxid) + def ll_runtime_type_info(obj): return obj.typeptr.rtti Modified: pypy/dist/pypy/rpython/test/test_exception.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_exception.py (original) +++ pypy/dist/pypy/rpython/test/test_exception.py Sat Sep 17 18:14:12 2005 @@ -42,9 +42,9 @@ classdef = a.bookkeeper.getclassdef(OverflowError) assert ovferr_inst.typeptr == t.rtyper.class_reprs[classdef].getvtable() - keyerr_inst = data.ll_pyexcclass2exc(pyobjectptr(KeyError)) + taberr_inst = data.ll_pyexcclass2exc(pyobjectptr(TabError)) classdef = a.bookkeeper.getclassdef(StandardError) # most precise class seen - assert keyerr_inst.typeptr == t.rtyper.class_reprs[classdef].getvtable() + 
assert taberr_inst.typeptr == t.rtyper.class_reprs[classdef].getvtable() myerr_inst = data.ll_pyexcclass2exc(pyobjectptr(MyException)) assert myerr_inst.typeptr == t.rtyper.class_reprs[None].getvtable() Modified: pypy/dist/pypy/rpython/test/test_rbuiltin.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rbuiltin.py (original) +++ pypy/dist/pypy/rpython/test/test_rbuiltin.py Sat Sep 17 18:14:12 2005 @@ -199,6 +199,35 @@ res = interpret(f, []) assert res is True +def test_isinstance(): + class A(object): + pass + class B(A): + pass + class C(A): + pass + def f(x, y): + if x == 1: + a = A() + elif x == 2: + a = B() + else: + a = C() + if y == 1: + res = isinstance(a, A) + cls = A + elif y == 2: + res = isinstance(a, B) + cls = B + else: + res = isinstance(a, C) + cls = C + return int(res) + 2 * isinstance(a, cls) + for x in [1, 2, 3]: + for y in [1, 2, 3]: + res = interpret(f, [x, y]) + assert res == isinstance([A(), B(), C()][x-1], [A, B, C][y-1]) * 3 + def test_isinstance_list(): def f(i): if i == 0: From arigo at codespeak.net Sat Sep 17 18:17:19 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 17 Sep 2005 18:17:19 +0200 (CEST) Subject: [pypy-svn] r17618 - pypy/dist/pypy/objspace/std Message-ID: <20050917161719.B80B127B80@code1.codespeak.net> Author: arigo Date: Sat Sep 17 18:17:17 2005 New Revision: 17618 Modified: pypy/dist/pypy/objspace/std/unicodeobject.py pypy/dist/pypy/objspace/std/unicodetype.py Log: Removed unicode.__getslice__() -- we don't have this method anywhere else. I'm prepared to blame and fix any 2.4.1 compliance test that fails because of it. Modified: pypy/dist/pypy/objspace/std/unicodeobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/unicodeobject.py (original) +++ pypy/dist/pypy/objspace/std/unicodeobject.py Sat Sep 17 18:17:17 2005 @@ -231,19 +231,14 @@ uni = w_uni._value length = len(uni) start, stop, step, sl = slicetype.indices4(space, w_slice, length) - r = [uni[start + i*step] for i in range(sl)] - return W_UnicodeObject(space, r) - -def unicode_getslice__Unicode_ANY_ANY(space, w_uni, w_start, w_end): - w_slice = space.call_function(space.w_slice, w_start, w_end) - uni = w_uni._value - length = len(uni) - start, stop, step, sl = slicetype.indices4(space, w_slice, length) - if start > stop: - return W_UnicodeObject(space, []) + if sl == 0: + r = [] + elif step == 1: + assert start >= 0 and stop >= 0 + r = uni[start:stop] else: - assert 0 <= start <= stop - return W_UnicodeObject(space, uni[start:stop]) + r = [uni[start + i*step] for i in range(sl)] + return W_UnicodeObject(space, r) def mul__Unicode_ANY(space, w_uni, w_times): chars = w_uni._value Modified: pypy/dist/pypy/objspace/std/unicodetype.py ============================================================================== --- pypy/dist/pypy/objspace/std/unicodetype.py (original) +++ pypy/dist/pypy/objspace/std/unicodetype.py Sat Sep 17 18:17:17 2005 @@ -40,7 +40,7 @@ unicode_translate = MultiMethod('translate', 2) unicode_upper = MultiMethod('upper', 1) unicode_zfill = MultiMethod('zfill', 2) -unicode_getslice = MultiMethod('__getslice__', 3) + # ____________________________________________________________ app = gateway.applevel(''' From pedronis at codespeak.net Sat Sep 17 18:21:36 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sat, 17 Sep 2005 18:21:36 +0200 (CEST) Subject: [pypy-svn] r17619 - in pypy/dist/pypy: annotation 
translator/test Message-ID: <20050917162136.A5A9C27B80@code1.codespeak.net> Author: pedronis Date: Sat Sep 17 18:21:33 2005 New Revision: 17619 Modified: pypy/dist/pypy/annotation/builtin.py pypy/dist/pypy/translator/test/test_annrpython.py Log: fix isinstance constant propagation Modified: pypy/dist/pypy/annotation/builtin.py ============================================================================== --- pypy/dist/pypy/annotation/builtin.py (original) +++ pypy/dist/pypy/annotation/builtin.py Sat Sep 17 18:21:33 2005 @@ -128,8 +128,9 @@ "for integers only isinstance(.,int|r_uint) are supported") if s_obj.is_constant(): r.const = isinstance(s_obj.const, typ) - elif our_issubclass(s_obj.knowntype, typ) and not s_obj.can_be_none(): - r.const = True + elif our_issubclass(s_obj.knowntype, typ): + if not s_obj.can_be_none(): + r.const = True elif not our_issubclass(typ, s_obj.knowntype): r.const = False elif s_obj.knowntype == int and typ == bool: # xxx this will explode in case of generalisation Modified: pypy/dist/pypy/translator/test/test_annrpython.py ============================================================================== --- pypy/dist/pypy/translator/test/test_annrpython.py (original) +++ pypy/dist/pypy/translator/test/test_annrpython.py Sat Sep 17 18:21:33 2005 @@ -1715,7 +1715,22 @@ assert isinstance(s, annmodel.SomeIterator) assert s.variant == ('items',) - + def test_non_none_and_none_with_isinstance(self): + class A(object): + pass + class B(A): + pass + def g(x): + if isinstance(x, A): + return x + return None + def f(): + g(B()) + return g(None) + a = self.RPythonAnnotator() + s = a.build_types(f, []) + assert s.knowntype == B + def g(n): return [0,1,2,n] From arigo at codespeak.net Sat Sep 17 18:28:36 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 17 Sep 2005 18:28:36 +0200 (CEST) Subject: [pypy-svn] r17620 - in pypy/dist/pypy/objspace/std: . test Message-ID: <20050917162836.4612227B80@code1.codespeak.net> Author: arigo Date: Sat Sep 17 18:28:32 2005 New Revision: 17620 Modified: pypy/dist/pypy/objspace/std/listobject.py pypy/dist/pypy/objspace/std/sliceobject.py pypy/dist/pypy/objspace/std/slicetype.py pypy/dist/pypy/objspace/std/stringobject.py pypy/dist/pypy/objspace/std/test/test_sliceobject.py pypy/dist/pypy/objspace/std/tupleobject.py pypy/dist/pypy/objspace/std/unicodeobject.py Log: Moved slice indices as methods of W_SliceObject, now that they don't belong to slicetype.py any more (they read attributes of W_SliceObjects). Modified: pypy/dist/pypy/objspace/std/listobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/listobject.py (original) +++ pypy/dist/pypy/objspace/std/listobject.py Sat Sep 17 18:28:32 2005 @@ -51,7 +51,7 @@ def getitem__List_Slice(space, w_list, w_slice): # XXX consider to extend rlist's functionality? 
length = len(w_list.wrappeditems) - start, stop, step, slicelength = slicetype.indices4(space, w_slice, length) + start, stop, step, slicelength = w_slice.indices4(length) assert slicelength >= 0 if step == 1 and stop >= start >= 0: assert stop >= 0 @@ -168,8 +168,7 @@ return space.w_None def delitem__List_Slice(space, w_list, w_slice): - start, stop, step, slicelength = slicetype.indices4(space, w_slice, - len(w_list.wrappeditems)) + start, stop, step, slicelength = w_slice.indices4(len(w_list.wrappeditems)) if slicelength==0: return @@ -237,8 +236,7 @@ def _setitem_slice_helper(space, w_list, w_slice, sequence2, len2): oldsize = len(w_list.wrappeditems) - start, stop, step, slicelength = slicetype.indices4(space, w_slice, - oldsize) + start, stop, step, slicelength = w_slice.indices4(oldsize) assert slicelength >= 0 items = w_list.wrappeditems Modified: pypy/dist/pypy/objspace/std/sliceobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/sliceobject.py (original) +++ pypy/dist/pypy/objspace/std/sliceobject.py Sat Sep 17 18:28:32 2005 @@ -7,6 +7,7 @@ from pypy.objspace.std.objspace import * from pypy.interpreter import gateway +from pypy.objspace.std.slicetype import _Eval_SliceIndex class W_SliceObject(W_Object): @@ -25,8 +26,62 @@ space = w_slice.space return slice(space.unwrap(w_slice.w_start), space.unwrap(w_slice.w_stop), space.unwrap(w_slice.w_step)) + def indices3(w_slice, length): + space = w_slice.space + if space.is_true(space.is_(w_slice.w_step, space.w_None)): + step = 1 + else: + step = _Eval_SliceIndex(space, w_slice.w_step) + if step == 0: + raise OperationError(space.w_ValueError, + space.wrap("slice step cannot be zero")) + if space.is_true(space.is_(w_slice.w_start, space.w_None)): + if step < 0: + start = length - 1 + else: + start = 0 + else: + start = _Eval_SliceIndex(space, w_slice.w_start) + if start < 0: + start += length + if start < 0: + if step < 0: + start = -1 + else: + start = 0 + elif start >= length: + if step < 0: + start = length - 1 + else: + start = length + if space.is_true(space.is_(w_slice.w_stop, space.w_None)): + if step < 0: + stop = -1 + else: + stop = length + else: + stop = _Eval_SliceIndex(space, w_slice.w_stop) + if stop < 0: + stop += length + if stop < 0: + stop =-1 + elif stop > length: + stop = length + return start, stop, step + + def indices4(w_slice, length): + start, stop, step = w_slice.indices3(length) + if (step < 0 and stop >= start) or (step > 0 and start >= stop): + slicelength = 0 + elif step < 0: + slicelength = (stop - start + 1) / step + 1 + else: + slicelength = (stop - start - 1) / step + 1 + return start, stop, step, slicelength + registerimplementation(W_SliceObject) + repr__Slice = gateway.applevel(""" def repr__Slice(aslice): return 'slice(%r, %r, %r)' % (aslice.start, aslice.stop, aslice.step) @@ -62,13 +117,12 @@ space.wrap("unhashable type")) # indices impl -from pypy.objspace.std import slicetype - def slice_indices__Slice_ANY(space, w_slice, w_length): length = space.int_w(w_length) - start, stop, step = slicetype.indices3(space, w_slice, length) + start, stop, step = w_slice.indices3(length) return space.newtuple([space.wrap(start), space.wrap(stop), space.wrap(step)]) # register all methods +from pypy.objspace.std import slicetype register_all(vars(), slicetype) Modified: pypy/dist/pypy/objspace/std/slicetype.py ============================================================================== --- pypy/dist/pypy/objspace/std/slicetype.py (original) 
+++ pypy/dist/pypy/objspace/std/slicetype.py Sat Sep 17 18:28:32 2005 @@ -21,58 +21,6 @@ x = -sys.maxint return x -def indices3(space, w_slice, length): - if space.is_true(space.is_(w_slice.w_step, space.w_None)): - step = 1 - else: - step = _Eval_SliceIndex(space, w_slice.w_step) - if step == 0: - raise OperationError(space.w_ValueError, - space.wrap("slice step cannot be zero")) - if space.is_true(space.is_(w_slice.w_start, space.w_None)): - if step < 0: - start = length - 1 - else: - start = 0 - else: - start = _Eval_SliceIndex(space, w_slice.w_start) - if start < 0: - start += length - if start < 0: - if step < 0: - start = -1 - else: - start = 0 - elif start >= length: - if step < 0: - start = length - 1 - else: - start = length - if space.is_true(space.is_(w_slice.w_stop, space.w_None)): - if step < 0: - stop = -1 - else: - stop = length - else: - stop = _Eval_SliceIndex(space, w_slice.w_stop) - if stop < 0: - stop += length - if stop < 0: - stop =-1 - elif stop > length: - stop = length - return start, stop, step - -def indices4(space, w_slice, length): - start, stop, step = indices3(space, w_slice, length) - if (step < 0 and stop >= start) or (step > 0 and start >= stop): - slicelength = 0 - elif step < 0: - slicelength = (stop - start + 1) / step + 1 - else: - slicelength = (stop - start - 1) / step + 1 - return start, stop, step, slicelength - def adapt_bound(space, w_index, w_size): if not (space.is_true(space.isinstance(w_index, space.w_int)) or space.is_true(space.isinstance(w_index, space.w_long))): Modified: pypy/dist/pypy/objspace/std/stringobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/stringobject.py (original) +++ pypy/dist/pypy/objspace/std/stringobject.py Sat Sep 17 18:28:32 2005 @@ -941,7 +941,7 @@ w = space.wrap s = w_str._value length = len(s) - start, stop, step, sl = slicetype.indices4(space, w_slice, length) + start, stop, step, sl = w_slice.indices4(length) if sl == 0: str = "" elif step == 1: Modified: pypy/dist/pypy/objspace/std/test/test_sliceobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/test/test_sliceobject.py (original) +++ pypy/dist/pypy/objspace/std/test/test_sliceobject.py Sat Sep 17 18:28:32 2005 @@ -4,25 +4,22 @@ class TestW_SliceObject: def test_indices(self): - from pypy.objspace.std import slicetype space = self.space w = space.wrap w_None = space.w_None w_slice = space.newslice(w_None, w_None, w_None) - assert slicetype.indices3(space, w_slice, 6) == (0, 6, 1) + assert w_slice.indices3(6) == (0, 6, 1) w_slice = space.newslice(w(0), w(6), w(1)) - assert slicetype.indices3(space, w_slice, 6) == (0, 6, 1) + assert w_slice.indices3(6) == (0, 6, 1) w_slice = space.newslice(w_None, w_None, w(-1)) - assert slicetype.indices3(space, w_slice, 6) == (5, -1, -1) + assert w_slice.indices3(6) == (5, -1, -1) def test_indices_fail(self): - from pypy.objspace.std import slicetype space = self.space w = space.wrap w_None = space.w_None w_slice = space.newslice(w_None, w_None, w(0)) - self.space.raises_w(space.w_ValueError, - slicetype.indices3, space, w_slice, 10) + self.space.raises_w(space.w_ValueError, w_slice.indices3, 10) class AppTest_SliceObject: Modified: pypy/dist/pypy/objspace/std/tupleobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/tupleobject.py (original) +++ pypy/dist/pypy/objspace/std/tupleobject.py Sat Sep 17 18:28:32 2005 @@ -2,7 
+2,6 @@ from pypy.objspace.std.intobject import W_IntObject from pypy.rpython.rarithmetic import intmask from pypy.objspace.std.sliceobject import W_SliceObject -from pypy.objspace.std import slicetype from pypy.interpreter import gateway class W_TupleObject(W_Object): @@ -42,7 +41,7 @@ def getitem__Tuple_Slice(space, w_tuple, w_slice): items = w_tuple.wrappeditems length = len(items) - start, stop, step, slicelength = slicetype.indices4(space, w_slice, length) + start, stop, step, slicelength = w_slice.indices4(length) assert slicelength >= 0 subitems = [None] * slicelength for i in range(slicelength): Modified: pypy/dist/pypy/objspace/std/unicodeobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/unicodeobject.py (original) +++ pypy/dist/pypy/objspace/std/unicodeobject.py Sat Sep 17 18:28:32 2005 @@ -3,7 +3,6 @@ from pypy.objspace.std.stringobject import W_StringObject from pypy.objspace.std.noneobject import W_NoneObject from pypy.objspace.std.sliceobject import W_SliceObject -from pypy.objspace.std import slicetype from pypy.rpython.rarithmetic import intmask from pypy.module.unicodedata import unicodedb @@ -230,7 +229,7 @@ def getitem__Unicode_Slice(space, w_uni, w_slice): uni = w_uni._value length = len(uni) - start, stop, step, sl = slicetype.indices4(space, w_slice, length) + start, stop, step, sl = w_slice.indices4(length) if sl == 0: r = [] elif step == 1: From cfbolz at codespeak.net Sat Sep 17 19:01:05 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Sat, 17 Sep 2005 19:01:05 +0200 (CEST) Subject: [pypy-svn] r17621 - pypy/dist/pypy/objspace/std Message-ID: <20050917170105.CF3C327B80@code1.codespeak.net> Author: cfbolz Date: Sat Sep 17 19:01:02 2005 New Revision: 17621 Modified: pypy/dist/pypy/objspace/std/default.py pypy/dist/pypy/objspace/std/dictobject.py pypy/dist/pypy/objspace/std/floattype.py pypy/dist/pypy/objspace/std/inttype.py pypy/dist/pypy/objspace/std/longtype.py pypy/dist/pypy/objspace/std/sliceobject.py pypy/dist/pypy/objspace/std/stringobject.py pypy/dist/pypy/objspace/std/stringtype.py pypy/dist/pypy/objspace/std/tupleobject.py pypy/dist/pypy/objspace/std/typeobject.py pypy/dist/pypy/objspace/std/unicodeobject.py Log: changing space.is_true(space.is_ to space.is_w and space.is_true(space.eq to space.eq_w a bit everywhere. Also remove commented out code. 
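(For illustration only: the rewrite amounts to the pattern below. The two helper functions are made-up examples, not PyPy code; they assume the usual object-space API in which is_w and eq_w return an interp-level bool directly instead of a wrapped bool that still has to go through is_true.)

def uses_default_step(space, w_slice):
    # long form used before this commit:
    #   return space.is_true(space.is_(w_slice.w_step, space.w_None))
    # shortcut: one call, interp-level bool result
    return space.is_w(w_slice.w_step, space.w_None)

def items_equal(space, w_a, w_b):
    # likewise for equality tests:
    #   return space.is_true(space.eq(w_a, w_b))
    return space.eq_w(w_a, w_b)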
Modified: pypy/dist/pypy/objspace/std/default.py ============================================================================== --- pypy/dist/pypy/objspace/std/default.py (original) +++ pypy/dist/pypy/objspace/std/default.py Sat Sep 17 19:01:02 2005 @@ -16,206 +16,6 @@ def init__ANY(space, w_obj, __args__): pass - -# __nonzero__ falls back to __len__ - -##def is_true__ANY(space, w_obj): -## w_descr = space.lookup(w_obj, '__len__') -## if w_descr is None: -## return True -## else: -## w_len = space.get_and_call_function(w_descr, w_obj) -## return space.is_true(w_len) - -### in-place operators fall back to their non-in-place counterpart - -##for _name, _symbol, _arity, _specialnames in ObjSpace.MethodTable: -## if _name.startswith('inplace_'): -## def default_inplace(space, w_1, w_2, baseop=_name[8:]): -## op = getattr(space, baseop) -## return op(w_1, w_2) -## getattr(StdObjSpace.MM, _name).register(default_inplace, -## W_Object, W_ANY) - -# '__get__(descr, inst, cls)' returns 'descr' by default - -#def get__Object_ANY_ANY(space, w_descr, w_inst, w_cls): -# return w_descr - -#def is_data_descr__Object(space, w_descr): -# return 0 - -# give objects some default attributes and a default way to complain -# about missing attributes - -##def getattribute__Object_ANY(space, w_obj, w_attr): -## # XXX build a nicer error message along these lines: -## #w_type = space.type(w_obj) -## #w_typename = space.getattr(w_type, space.wrap('__name__')) -## #... - -## w_type = space.type(w_obj) -## if space.is_true(space.eq(w_attr, space.wrap('__class__'))): -## return w_type - -## # 1) look for descriptor -## # 2) if data descriptor, call it -## # 3) check __dict__ -## # 4) if present, return that -## # 5) if descriptor found in 2), call that -## # 6) raise AttrbuteError - -## w_descr = None - -## from typeobject import W_TypeObject -## if isinstance(w_type, W_TypeObject): # XXX must always be true at some point -## try: -## w_descr = w_type.lookup(w_attr) -## except KeyError: -## pass -## else: -## if space.is_data_descr(w_descr): -## return space.get(w_descr, w_obj, w_type) # XXX 3rd arg is wrong - -## try: -## w_dict = space.getdict(w_obj) -## except OperationError, e: -## if not e.match(space, space.w_TypeError): # 'unsupported type for getdict' -## raise -## else: -## if space.is_true(space.eq(w_attr, space.wrap('__dict__'))): -## return w_dict -## try: -## w_value = space.getitem(w_dict, w_attr) -## except OperationError, e: -## if not e.match(space, space.w_KeyError): -## raise -## else: -## return w_value # got a value from 'obj.__dict__[attr]' - -## if w_descr is not None: -## return space.get(w_descr, w_obj, w_type) - -## raise OperationError(space.w_AttributeError, w_attr) - - -# set attributes, complaining about read-only ones -- -# a more declarative way to define attributes would be welcome - -##def setattr__Object_ANY_ANY(space, w_obj, w_attr, w_value): - -## # 1) look for descriptor -## # 2) if data descriptor, call it -## # 3) try to set item in __dict__ - -## w_type = space.type(w_obj) -## if space.is_true(space.eq(w_attr, space.wrap('__class__'))): -## raise OperationError(space.w_AttributeError, -## space.wrap("read-only attribute")) -## if space.is_true(space.eq(w_attr, space.wrap('__dict__'))): -## raise OperationError(space.w_AttributeError, -## space.wrap("read-only attribute")) - -## from typeobject import W_TypeObject -## if isinstance(w_type, W_TypeObject): -## try: -## w_descr = w_type.lookup(w_attr) -## except KeyError: -## pass -## else: -## if space.is_data_descr(w_descr): -## 
return space.set(w_descr, w_obj, w_value) - -## try: -## w_dict = space.getdict(w_obj) -## except OperationError, e: -## if not e.match(space, space.w_TypeError): # "unsupported type for getdict" -## raise -## raise OperationError(space.w_AttributeError, w_attr) -## else: -## space.setitem(w_dict, w_attr, w_value) - - -##def delattr__Object_ANY(space, w_obj, w_attr): -## w_type = space.type(w_obj) -## if space.is_true(space.eq(w_attr, space.wrap('__class__'))): -## raise OperationError(space.w_AttributeError, -## space.wrap("read-only attribute")) -## if space.is_true(space.eq(w_attr, space.wrap('__dict__'))): -## raise OperationError(space.w_AttributeError, -## space.wrap("read-only attribute")) - -## from typeobject import W_TypeObject -## if isinstance(w_type, W_TypeObject): -## try: -## w_descr = w_type.lookup(w_attr) -## except KeyError: -## pass -## else: -## #space.type(w_descr).lookup(space.wrap('__delete__')) -## if space.is_data_descr(w_descr): -## return space.delete(w_descr, w_obj) - -## try: -## w_dict = space.getdict(w_obj) -## except OperationError, e: -## if not e.match(space, space.w_TypeError): # "unsupported type for getdict" -## raise -## raise OperationError(space.w_AttributeError, w_attr) -## else: -## try: -## space.delitem(w_dict, w_attr) -## except OperationError, e: -## if not e.match(space, space.w_KeyError): -## raise -## raise OperationError(space.w_AttributeError, w_attr) - -# static types - -##def type__Object(space, w_obj): -## if w_obj.statictype is None: -## # XXX remove me, temporary -## return space.wrap(space.unwrap(w_obj).__class__) -## else: -## w_type = space.get_typeinstance(w_obj.statictype) -## return w_type - -# repr(), str(), hash() - -##def repr__Object(space, w_obj): -## return space.wrap('<%s object at %s>'%( -## space.type(w_obj).typename, space.unwrap(space.id(w_obj)))) - -##def str__Object(space, w_obj): -## return space.repr(w_obj) - -##def hash__ANY(space, w_obj): -## return space.id(w_obj) - - -# The following operations are fall-backs if we really cannot find -# anything else even with delegation. -# 'eq' falls back to 'is' - -##def eq__ANY_ANY(space, w_a, w_b): -## return space.is_(w_a, w_b) - -# 'contains' falls back to iteration. 
- -##def contains__ANY_ANY(space, w_iterable, w_lookfor): -## w_iter = space.iter(w_iterable) -## while 1: -## try: -## w_next = space.next(w_iter) -## except OperationError, e: -## if not e.match(space, space.w_StopIteration): -## raise -## return space.w_False -## if space.is_true(space.eq(w_next, w_lookfor)): -## return space.w_True - -# ugh - def typed_unwrap_error_msg(space, expected, w_obj): w = space.wrap type_name = space.str_w(space.getattr(space.type(w_obj),w("__name__"))) Modified: pypy/dist/pypy/objspace/std/dictobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/dictobject.py (original) +++ pypy/dist/pypy/objspace/std/dictobject.py Sat Sep 17 19:01:02 2005 @@ -92,7 +92,7 @@ return W_DictIter_Keys(space, w_dict) def eq__Dict_Dict(space, w_left, w_right): - if space.is_true(space.is_(w_left, w_right)): + if space.is_w(w_left, w_right): return space.w_True if len(w_left.content) != len(w_right.content): Modified: pypy/dist/pypy/objspace/std/floattype.py ============================================================================== --- pypy/dist/pypy/objspace/std/floattype.py (original) +++ pypy/dist/pypy/objspace/std/floattype.py Sat Sep 17 19:01:02 2005 @@ -31,7 +31,7 @@ space.wrap(e.msg)) else: w_obj = space.float(w_value) - if space.is_true(space.is_(w_floattype, space.w_float)): + if space.is_w(w_floattype, space.w_float): return w_obj # 'float(x)' should return # whatever x.__float__() returned value = space.float_w(w_obj) Modified: pypy/dist/pypy/objspace/std/inttype.py ============================================================================== --- pypy/dist/pypy/objspace/std/inttype.py (original) +++ pypy/dist/pypy/objspace/std/inttype.py Sat Sep 17 19:01:02 2005 @@ -42,7 +42,7 @@ # otherwise, use the __int__() method w_obj = space.int(w_value) # 'int(x)' should return whatever x.__int__() returned - if space.is_true(space.is_(w_inttype, space.w_int)): + if space.is_w(w_inttype, space.w_int): return w_obj # int_w is effectively what we want in this case, # we cannot construct a subclass of int instance with an @@ -76,7 +76,7 @@ w_longval = retry_to_w_long(space, e.parser, base) if w_longval is not None: - if not space.is_true(space.is_(w_inttype, space.w_int)): + if not space.is_w(w_inttype, space.w_int): raise OperationError(space.w_OverflowError, space.wrap( "long int too large to convert to int")) Modified: pypy/dist/pypy/objspace/std/longtype.py ============================================================================== --- pypy/dist/pypy/objspace/std/longtype.py (original) +++ pypy/dist/pypy/objspace/std/longtype.py Sat Sep 17 19:01:02 2005 @@ -28,7 +28,7 @@ # otherwise, use the __long__() method w_obj = space.long(w_value) # 'long(x)' should return whatever x.__long__() returned - if space.is_true(space.is_(w_longtype, space.w_long)): + if space.is_w(w_longtype, space.w_long): return w_obj if space.is_true(space.isinstance(w_obj, space.w_long)): assert isinstance(w_obj, W_LongObject) # XXX this could fail! 
Modified: pypy/dist/pypy/objspace/std/sliceobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/sliceobject.py (original) +++ pypy/dist/pypy/objspace/std/sliceobject.py Sat Sep 17 19:01:02 2005 @@ -28,14 +28,14 @@ def indices3(w_slice, length): space = w_slice.space - if space.is_true(space.is_(w_slice.w_step, space.w_None)): + if space.is_w(w_slice.w_step, space.w_None): step = 1 else: step = _Eval_SliceIndex(space, w_slice.w_step) if step == 0: raise OperationError(space.w_ValueError, space.wrap("slice step cannot be zero")) - if space.is_true(space.is_(w_slice.w_start, space.w_None)): + if space.is_w(w_slice.w_start, space.w_None): if step < 0: start = length - 1 else: @@ -54,7 +54,7 @@ start = length - 1 else: start = length - if space.is_true(space.is_(w_slice.w_stop, space.w_None)): + if space.is_w(w_slice.w_stop, space.w_None): if step < 0: stop = -1 else: Modified: pypy/dist/pypy/objspace/std/stringobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/stringobject.py (original) +++ pypy/dist/pypy/objspace/std/stringobject.py Sat Sep 17 19:01:02 2005 @@ -812,71 +812,6 @@ # the keys any more!) return W_IntObject(space, x) - -##EQ = 1 -##LE = 2 -##GE = 3 -##GT = 4 -##LT = 5 -##NE = 6 - - -##def string_richcompare(space, w_str1, w_str2, op): -## str1 = w_str1._value -## str2 = w_str2._value - -## if space.is_true(space.is_(w_str1, w_str2)): -## if op == EQ or op == LE or op == GE: -## return space.w_True -## elif op == GT or op == LT or op == NE: -## return space.w_False -## if 0: -## pass -## else: -## if op == EQ: -## if len(str1) == len(str2): -## for i in range(len(str1)): -## if ord(str1[i]) != ord(str2[i]): -## return space.w_False -## return space.w_True -## else: -## return space.w_False -## else: -## if len(str1) > len(str2): -## min_len = len(str2) -## else: -## min_len = len(str1) - -## c = 0 -## idx = 0 -## if (min_len > 0): -## while (c == 0) and (idx < min_len): -## c = ord(str1[idx]) - ord(str2[idx]) -## idx = idx + 1 -## else: -## c = 0 - -## if (c == 0): -## if len(str1) < len(str2): -## c = -1 -## elif len(str1) > len(str2): -## c = 1 -## else: -## c = 0 - -## if op == LT: -## return space.newbool(c < 0) -## elif op == LE: -## return space.newbool(c <= 0) -## elif op == NE: -## return space.newbool(c != 0) -## elif op == GT: -## return space.newbool(c > 0) -## elif op == GE: -## return space.newbool(c >= 0) -## else: -## return NotImplemented - def lt__String_String(space, w_str1, w_str2): s1 = w_str1._value s2 = w_str2._value Modified: pypy/dist/pypy/objspace/std/stringtype.py ============================================================================== --- pypy/dist/pypy/objspace/std/stringtype.py (original) +++ pypy/dist/pypy/objspace/std/stringtype.py Sat Sep 17 19:01:02 2005 @@ -44,7 +44,7 @@ def descr__new__(space, w_stringtype, w_object=''): from pypy.objspace.std.stringobject import W_StringObject w_obj = space.str(w_object) - if space.is_true(space.is_(w_stringtype, space.w_str)): + if space.is_w(w_stringtype, space.w_str): return w_obj # XXX might be reworked when space.str() typechecks value = space.str_w(w_obj) w_obj = space.allocate_instance(W_StringObject, w_stringtype) Modified: pypy/dist/pypy/objspace/std/tupleobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/tupleobject.py (original) +++ pypy/dist/pypy/objspace/std/tupleobject.py Sat Sep 17 
19:01:02 2005 @@ -88,7 +88,7 @@ for i in range(len(items1)): item1 = items1[i] item2 = items2[i] - if not space.is_true(space.eq(item1, item2)): + if not space.eq_w(item1, item2): return space.w_False return space.w_True @@ -103,7 +103,7 @@ ncmp = _min(len(items1), len(items2)) # Search for the first index where items are different for p in range(ncmp): - if not space.is_true(space.eq(items1[p], items2[p])): + if not space.eq_w(items1[p], items2[p]): return space.lt(items1[p], items2[p]) # No more items to compare -- compare sizes return space.newbool(len(items1) < len(items2)) @@ -114,7 +114,7 @@ ncmp = _min(len(items1), len(items2)) # Search for the first index where items are different for p in range(ncmp): - if not space.is_true(space.eq(items1[p], items2[p])): + if not space.eq_w(items1[p], items2[p]): return space.gt(items1[p], items2[p]) # No more items to compare -- compare sizes return space.newbool(len(items1) > len(items2)) Modified: pypy/dist/pypy/objspace/std/typeobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/typeobject.py (original) +++ pypy/dist/pypy/objspace/std/typeobject.py Sat Sep 17 19:01:02 2005 @@ -163,7 +163,7 @@ w_self.hasdict = True w_type = space.type(w_self) - if not space.is_true(space.is_(w_type, space.w_type)): + if not space.is_w(w_type, space.w_type): w_self.mro_w = [] mro_func = w_type.lookup('mro') mro_func_args = Arguments(space, [w_self]) @@ -280,7 +280,7 @@ def call__Type(space, w_type, __args__): # special case for type(x) - if space.is_true(space.is_(w_type, space.w_type)): + if space.is_w(w_type, space.w_type): try: w_obj, = __args__.fixedunpack(1) except ValueError: Modified: pypy/dist/pypy/objspace/std/unicodeobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/unicodeobject.py (original) +++ pypy/dist/pypy/objspace/std/unicodeobject.py Sat Sep 17 19:01:02 2005 @@ -158,7 +158,7 @@ if len(list) == 0: return W_UnicodeObject(space, []) if (len(list) == 1 and - space.is_true(space.is_(space.type(list[0]), space.w_unicode))): + space.is_w(space.type(list[0]), space.w_unicode)): return list[0] values_list = [None] * len(list) From arigo at codespeak.net Sat Sep 17 19:48:53 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 17 Sep 2005 19:48:53 +0200 (CEST) Subject: [pypy-svn] r17622 - pypy/dist/pypy/translator/c Message-ID: <20050917174853.DB65C27B7C@code1.codespeak.net> Author: arigo Date: Sat Sep 17 19:48:53 2005 New Revision: 17622 Modified: pypy/dist/pypy/translator/c/extfunc.py Log: Don't put dots in the macro name. 
Modified: pypy/dist/pypy/translator/c/extfunc.py ============================================================================== --- pypy/dist/pypy/translator/c/extfunc.py (original) +++ pypy/dist/pypy/translator/c/extfunc.py Sat Sep 17 19:48:53 2005 @@ -136,7 +136,7 @@ # a substring of PyExc_%s name = pyexccls.__name__ if pyexccls.__module__ != 'exceptions': - name = '%s_%s' % (pyexccls.__module__, name) + name = '%s_%s' % (pyexccls.__module__.replace('.', '__'), name) yield ('RPyExc_%s' % name, exc_llvalue) From tismer at codespeak.net Sat Sep 17 19:50:06 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Sat, 17 Sep 2005 19:50:06 +0200 (CEST) Subject: [pypy-svn] r17623 - in pypy/dist/pypy: interpreter interpreter/test module/__builtin__ objspace objspace/std tool/pytest Message-ID: <20050917175006.7C57427B7C@code1.codespeak.net> Author: tismer Date: Sat Sep 17 19:50:01 2005 New Revision: 17623 Modified: pypy/dist/pypy/interpreter/argument.py pypy/dist/pypy/interpreter/executioncontext.py pypy/dist/pypy/interpreter/pyframe.py pypy/dist/pypy/interpreter/test/test_function.py pypy/dist/pypy/interpreter/typedef.py pypy/dist/pypy/module/__builtin__/importing.py pypy/dist/pypy/objspace/descroperation.py pypy/dist/pypy/objspace/std/fake.py pypy/dist/pypy/tool/pytest/appsupport.py Log: completed replacement ofspace.is_true shortcuts Modified: pypy/dist/pypy/interpreter/argument.py ============================================================================== --- pypy/dist/pypy/interpreter/argument.py (original) +++ pypy/dist/pypy/interpreter/argument.py Sat Sep 17 19:50:01 2005 @@ -132,8 +132,7 @@ # self.match_signature() assumes that it can use it directly for # a matching *arg in the callee's signature. if self.w_stararg is not None: - if not space.is_true(space.is_(space.type(self.w_stararg), - space.w_tuple)): + if not space.is_w(space.type(self.w_stararg), space.w_tuple): self.unpack() try: return self.match_signature(signature, defaults_w) Modified: pypy/dist/pypy/interpreter/executioncontext.py ============================================================================== --- pypy/dist/pypy/interpreter/executioncontext.py (original) +++ pypy/dist/pypy/interpreter/executioncontext.py Sat Sep 17 19:50:01 2005 @@ -167,7 +167,7 @@ try: try: w_result = space.call_function(w_callback, space.wrap(frame), space.wrap(event), w_arg) - if space.is_true(space.is_(w_result, space.w_None)): + if space.is_w(w_result, space.w_None): frame.w_f_trace = None else: frame.w_f_trace = w_result Modified: pypy/dist/pypy/interpreter/pyframe.py ============================================================================== --- pypy/dist/pypy/interpreter/pyframe.py (original) +++ pypy/dist/pypy/interpreter/pyframe.py Sat Sep 17 19:50:01 2005 @@ -286,7 +286,7 @@ return self.w_f_trace def fset_f_trace(space, self, w_trace): - if space.is_true(space.is_(w_trace, space.w_None)): + if space.is_w(w_trace, space.w_None): self.w_f_trace = None else: self.w_f_trace = w_trace Modified: pypy/dist/pypy/interpreter/test/test_function.py ============================================================================== --- pypy/dist/pypy/interpreter/test/test_function.py (original) +++ pypy/dist/pypy/interpreter/test/test_function.py Sat Sep 17 19:50:01 2005 @@ -267,4 +267,4 @@ # Check method returned from unbound_method.__get__() # --- with an incompatible class w_meth5 = meth3.descr_method_get(space.wrap('hello'), space.w_str) - assert space.is_true(space.is_(w_meth5, w_meth3)) + assert space.is_w(w_meth5, w_meth3) 
Modified: pypy/dist/pypy/interpreter/typedef.py ============================================================================== --- pypy/dist/pypy/interpreter/typedef.py (original) +++ pypy/dist/pypy/interpreter/typedef.py Sat Sep 17 19:50:01 2005 @@ -202,7 +202,7 @@ """property.__get__(obj[, type]) -> value Read the value of the property of the given obj.""" # XXX HAAAAAAAAAAAACK (but possibly a good one) - if w_obj == space.w_None and not space.is_true(space.is_(w_cls, space.type(space.w_None))): + if w_obj == space.w_None and not space.is_w(w_cls, space.type(space.w_None)): #print property, w_obj, w_cls return space.wrap(property) else: Modified: pypy/dist/pypy/module/__builtin__/importing.py ============================================================================== --- pypy/dist/pypy/module/__builtin__/importing.py (original) +++ pypy/dist/pypy/module/__builtin__/importing.py Sat Sep 17 19:50:01 2005 @@ -172,7 +172,7 @@ space.wrap("__import__() argument 1 must be string" + helper)) w = space.wrap - if w_globals is not None and not space.is_true(space.is_(w_globals, space.w_None)): + if w_globals is not None and not space.is_w(w_globals, space.w_None): ctxt_w_name = try_getitem(space, w_globals, w('__name__')) ctxt_w_path = try_getitem(space, w_globals, w('__path__')) else: @@ -192,7 +192,7 @@ if rel_modulename is not None: w_mod = check_sys_modules(space, w(rel_modulename)) if (w_mod is None or - not space.is_true(space.is_(w_mod,space.w_None))): + not space.is_w(w_mod, space.w_None)): w_mod = absolute_import(space, rel_modulename, len(ctxt_name_prefix_parts), @@ -256,7 +256,7 @@ w_modulename = w(modulename) w_mod = check_sys_modules(space, w_modulename) if w_mod is not None: - if not space.is_true(space.is_(w_mod, space.w_None)): + if not space.is_w(w_mod, space.w_None): return w_mod else: w_mod = space.sys.getmodule(modulename) Modified: pypy/dist/pypy/objspace/descroperation.py ============================================================================== --- pypy/dist/pypy/objspace/descroperation.py (original) +++ pypy/dist/pypy/objspace/descroperation.py Sat Sep 17 19:50:01 2005 @@ -240,7 +240,7 @@ w_typ1 = space.type(w_obj1) w_typ2 = space.type(w_obj2) w_left_src, w_left_impl = space.lookup_where(w_obj1, '__pow__') - if space.is_true(space.is_(w_typ1, w_typ2)): + if space.is_w(w_typ1, w_typ2): w_right_impl = None else: w_right_src, w_right_impl = space.lookup_where(w_obj2, '__rpow__') @@ -286,7 +286,7 @@ if not e.match(space, space.w_StopIteration): raise return space.w_False - if space.is_true(space.eq(w_next, w_item)): + if space.eq_w(w_next, w_item): return space.w_True def hash(space, w_obj): @@ -332,7 +332,7 @@ w_typ1 = space.type(w_obj1) w_typ2 = space.type(w_obj2) w_left_src, w_left_impl = space.lookup_where(w_obj1, '__coerce__') - if space.is_true(space.is_(w_typ1, w_typ2)): + if space.is_w(w_typ1, w_typ2): w_right_impl = None else: w_right_src, w_right_impl = space.lookup_where(w_obj2, '__coerce__') @@ -366,7 +366,7 @@ # helpers def _check_notimplemented(space, w_obj): - return not space.is_true(space.is_(w_obj, space.w_NotImplemented)) + return not space.is_w(w_obj, space.w_NotImplemented) def _invoke_binop(space, w_impl, w_obj1, w_obj2): if w_impl is not None: @@ -389,7 +389,7 @@ w_left_src, w_left_impl = space.lookup_where(w_obj1, '__cmp__') do_neg1 = False do_neg2 = True - if space.is_true(space.is_(w_typ1, w_typ2)): + if space.is_w(w_typ1, w_typ2): w_right_impl = None else: w_right_src, w_right_impl = space.lookup_where(w_obj2, '__cmp__') @@ -405,13 +405,13 @@ if 
w_res is not None: return _conditional_neg(space, w_res, do_neg2) # fall back to internal rules - if space.is_true(space.is_(w_obj1, w_obj2)): + if space.is_w(w_obj1, w_obj2): return space.wrap(0) - if space.is_true(space.is_(w_obj1, space.w_None)): + if space.is_w(w_obj1, space.w_None): return space.wrap(-1) - if space.is_true(space.is_(w_obj2, space.w_None)): + if space.is_w(w_obj2, space.w_None): return space.wrap(1) - if space.is_true(space.is_(w_typ1, w_typ2)): + if space.is_w(w_typ1, w_typ2): w_id1 = space.id(w_obj1) w_id2 = space.id(w_obj2) else: @@ -430,7 +430,7 @@ w_typ1 = space.type(w_obj1) w_typ2 = space.type(w_obj2) w_left_src, w_left_impl = space.lookup_where(w_obj1, left) - if space.is_true(space.is_(w_typ1, w_typ2)): + if space.is_w(w_typ1, w_typ2): w_right_impl = None else: w_right_src, w_right_impl = space.lookup_where(w_obj2, right) @@ -458,7 +458,7 @@ w_first = w_obj1 w_second = w_obj2 - if space.is_true(space.is_(w_typ1, w_typ2)): + if space.is_w(w_typ1, w_typ2): w_right_impl = None else: w_right_src, w_right_impl = space.lookup_where(w_obj2, right) Modified: pypy/dist/pypy/objspace/std/fake.py ============================================================================== --- pypy/dist/pypy/objspace/std/fake.py (original) +++ pypy/dist/pypy/objspace/std/fake.py Sat Sep 17 19:50:01 2005 @@ -203,7 +203,7 @@ def descr_descriptor_get(space, descr, w_obj, w_cls=None): # XXX HAAAAAAAAAAAACK (but possibly a good one) - if w_obj == space.w_None and not space.is_true(space.is_(w_cls, space.type(space.w_None))): + if w_obj == space.w_None and not space.is_w(w_cls, space.type(space.w_None)): #print descr, w_obj, w_cls return space.wrap(descr) else: Modified: pypy/dist/pypy/tool/pytest/appsupport.py ============================================================================== --- pypy/dist/pypy/tool/pytest/appsupport.py (original) +++ pypy/dist/pypy/tool/pytest/appsupport.py Sat Sep 17 19:50:01 2005 @@ -196,4 +196,4 @@ def eq_w(space, w_obj1, w_obj2): """ return interp-level boolean of eq(w_obj1, w_obj2). """ - return space.is_true(space.eq(w_obj1, w_obj2)) + return space.eq_w(w_obj1, w_obj2) From arigo at codespeak.net Sat Sep 17 20:17:16 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 17 Sep 2005 20:17:16 +0200 (CEST) Subject: [pypy-svn] r17624 - pypy/dist/pypy/module/sys Message-ID: <20050917181716.B9D8E27B7C@code1.codespeak.net> Author: arigo Date: Sat Sep 17 20:17:16 2005 New Revision: 17624 Modified: pypy/dist/pypy/module/sys/__init__.py Log: We got a 2-3x speed-up in the last two days. That's worth a bump in the version number. As pypy-dev proves nothing else is worth a new version number anyway. 
Modified: pypy/dist/pypy/module/sys/__init__.py ============================================================================== --- pypy/dist/pypy/module/sys/__init__.py (original) +++ pypy/dist/pypy/module/sys/__init__.py Sat Sep 17 20:17:16 2005 @@ -50,8 +50,8 @@ 'copyright' : 'space.wrap("MIT-License")', 'api_version' : 'space.wrap(1012)', 'version_info' : 'space.wrap((2,4,1, "alpha", 42))', - 'version' : 'space.wrap("2.4.1 (pypy 0.7.0 build)")', - 'pypy_version_info' : """space.wrap((0,7,0, "alpha", + 'version' : 'space.wrap("2.4.1 (pypy 0.7.1 build)")', + 'pypy_version_info' : """space.wrap((0,7,1, "alpha", int('$Revision$'[11:-1])))""", 'pypy_svn_url' : 'space.wrap("$HeadURL$"[10:-29])', 'hexversion' : 'space.wrap(0x020401a0)', From tismer at codespeak.net Sat Sep 17 20:35:59 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Sat, 17 Sep 2005 20:35:59 +0200 (CEST) Subject: [pypy-svn] r17625 - pypy/dist/pypy/translator/backendopt Message-ID: <20050917183559.E241D27B7C@code1.codespeak.net> Author: tismer Date: Sat Sep 17 20:35:58 2005 New Revision: 17625 Modified: pypy/dist/pypy/translator/backendopt/inline.py Log: changed the threshold of inlining to 32.4 to prevend us from inlining too much of rlist. Best result, so far. But I'm not sure wether we should try the old threshold again after reverting the _ll_list_resize change? executable abs.richards abs.pystone rel.richards rel.pystone pypy-c-17439 36481 ms 638.546 42.5 58.0 pypy-c-17512 47007 ms 656.899 54.8 56.4 pypy-c-17516 39326 ms 650.730 45.8 56.9 pypy-c-17545-intern 35411 ms 749.698 41.3 49.4 pypy-c-17572 37463 ms 734.966 43.7 50.4 pypy-c-17600 27159 ms 903.215 31.7 41.0 pypy-c-17600_ll_list 27639 ms 807.148 32.2 45.9 pypy-c-17623-32_4 26278 ms 965.521 30.6 38.4 python 2.3.3 858 ms 37033.600 1.0 1.0 Modified: pypy/dist/pypy/translator/backendopt/inline.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/inline.py (original) +++ pypy/dist/pypy/translator/backendopt/inline.py Sat Sep 17 20:35:58 2005 @@ -10,7 +10,8 @@ from pypy.rpython import rmodel from pypy.translator.backendopt import sparsemat -BASE_INLINE_THRESHOLD = 38.8 # just enough to inline add__Int_Int() +BASE_INLINE_THRESHOLD = 32.4 # just enough to inline add__Int_Int() +# and just small enough to prevend inlining of some rlist functions. 
class CannotInline(Exception): pass From arigo at codespeak.net Sat Sep 17 20:39:12 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 17 Sep 2005 20:39:12 +0200 (CEST) Subject: [pypy-svn] r17626 - pypy/dist/pypy/interpreter Message-ID: <20050917183912.7B1E027B7C@code1.codespeak.net> Author: arigo Date: Sat Sep 17 20:39:11 2005 New Revision: 17626 Modified: pypy/dist/pypy/interpreter/interactive.py Log: * preserve the locals between two Ctrl-C-invoked interactive consoles * the name 'locals' was getting in the way of calling the built-in function locals() (nice exercice: figure out where this 'locals' dict entry comes from) Modified: pypy/dist/pypy/interpreter/interactive.py ============================================================================== --- pypy/dist/pypy/interpreter/interactive.py (original) +++ pypy/dist/pypy/interpreter/interactive.py Sat Sep 17 20:39:11 2005 @@ -103,6 +103,7 @@ #space.exec_("__pytrace__ = 0", self.w_globals, self.w_globals) space.setitem(self.w_globals, space.wrap('__pytrace__'),space.wrap(0)) self.tracelevel = 0 + self.console_locals = {} def enable_command_line_completer(self): try: @@ -143,7 +144,9 @@ print banner = ("Python %s on %s\n" % (sys.version, sys.platform) + "*** Entering interpreter-level console ***") - local = self.__dict__.copy() + local = self.console_locals + local.update(self.__dict__) + del local['locals'] for w_name in self.space.unpackiterable(self.w_globals): local['w_' + self.space.str_w(w_name)] = ( self.space.getitem(self.w_globals, w_name)) From cfbolz at codespeak.net Sat Sep 17 22:12:45 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Sat, 17 Sep 2005 22:12:45 +0200 (CEST) Subject: [pypy-svn] r17627 - pypy/dist/pypy/tool/pytest Message-ID: <20050917201245.BA3A327B7D@code1.codespeak.net> Author: cfbolz Date: Sat Sep 17 22:12:45 2005 New Revision: 17627 Modified: pypy/dist/pypy/tool/pytest/appsupport.py Log: reverting this -- leeds to infinite recursion because the thing in appsupport behaves like the space itself. Modified: pypy/dist/pypy/tool/pytest/appsupport.py ============================================================================== --- pypy/dist/pypy/tool/pytest/appsupport.py (original) +++ pypy/dist/pypy/tool/pytest/appsupport.py Sat Sep 17 22:12:45 2005 @@ -196,4 +196,4 @@ def eq_w(space, w_obj1, w_obj2): """ return interp-level boolean of eq(w_obj1, w_obj2). """ - return space.eq_w(w_obj1, w_obj2) + return space.is_true(space.eq(w_obj1, w_obj2)) From arigo at codespeak.net Sat Sep 17 22:29:28 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 17 Sep 2005 22:29:28 +0200 (CEST) Subject: [pypy-svn] r17631 - pypy/dist/pypy/interpreter Message-ID: <20050917202928.734E227B7D@code1.codespeak.net> Author: arigo Date: Sat Sep 17 22:29:25 2005 New Revision: 17631 Modified: pypy/dist/pypy/interpreter/baseobjspace.py pypy/dist/pypy/interpreter/pyopcode.py Log: hack. Well maybe improves performance dramatically, but who knows. Short-cut for function calls, and specially for calls to built-in functions. 
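(The shape of the short-cut, sketched with toy names rather than the real PyPy classes: when a call site passes only a small fixed number of positional arguments, try a fastcall_N entry point on the code object and fall back to building the general argument structure only if it declines. Returning interp-level None as the "no fast path" signal works in PyPy because real results are always wrapped objects; the same convention is kept in this standalone toy for illustration.)

class ToyCode:
    def __init__(self, func, argcount):
        self.func = func
        self.argcount = argcount

    def fastcall_2(self, space, w_arg1, w_arg2):
        if self.argcount != 2:
            return None   # decline: caller falls back to the general path
        return self.func(w_arg1, w_arg2)

def general_call(space, code, args_w):
    # stands in for packing an Arguments object and doing the full dispatch
    return code.func(*args_w)

def call_function(space, code, *args_w):
    if len(args_w) == 2:   # the real code also special-cases 1 and 3 args
        w_res = code.fastcall_2(space, args_w[0], args_w[1])
        if w_res is not None:
            return w_res
    return general_call(space, code, list(args_w))

# usage:
add = ToyCode(lambda a, b: a + b, 2)
assert call_function(None, add, 3, 4) == 7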
Modified: pypy/dist/pypy/interpreter/baseobjspace.py ============================================================================== --- pypy/dist/pypy/interpreter/baseobjspace.py (original) +++ pypy/dist/pypy/interpreter/baseobjspace.py Sat Sep 17 22:29:25 2005 @@ -464,6 +464,25 @@ return self.call_args(w_callable, args) def call_function(self, w_func, *args_w): + # XXX start of hack for performance + from pypy.interpreter.function import Function + if isinstance(w_func, Function): + if len(args_w) == 1: + w_res = w_func.code.fastcall_1(self, args_w[0]) + if w_res is not None: + return w_res + elif len(args_w) == 2: + w_res = w_func.code.fastcall_2(self, args_w[0], args_w[1]) + if w_res is not None: + return w_res + elif len(args_w) == 3: + w_res = w_func.code.fastcall_3(self, args_w[0], + args_w[1], args_w[2]) + if w_res is not None: + return w_res + args = Arguments(self, list(args_w)) + return w_func.call_args(args) + # XXX end of hack for performance args = Arguments(self, list(args_w)) return self.call_args(w_func, args) Modified: pypy/dist/pypy/interpreter/pyopcode.py ============================================================================== --- pypy/dist/pypy/interpreter/pyopcode.py (original) +++ pypy/dist/pypy/interpreter/pyopcode.py Sat Sep 17 22:29:25 2005 @@ -647,7 +647,29 @@ f.valuestack.push(w_result) def CALL_FUNCTION(f, oparg): - f.call_function(oparg) + # XXX start of hack for performance + if oparg == 1: # 1 arg, 0 keyword arg + w_arg = f.valuestack.pop() + w_function = f.valuestack.pop() + w_result = f.space.call_function(w_function, w_arg) + f.valuestack.push(w_result) + elif oparg == 2: # 2 args, 0 keyword arg + w_arg2 = f.valuestack.pop() + w_arg1 = f.valuestack.pop() + w_function = f.valuestack.pop() + w_result = f.space.call_function(w_function, w_arg1, w_arg2) + f.valuestack.push(w_result) + elif oparg == 3: # 3 args, 0 keyword arg + w_arg3 = f.valuestack.pop() + w_arg2 = f.valuestack.pop() + w_arg1 = f.valuestack.pop() + w_function = f.valuestack.pop() + w_result = f.space.call_function(w_function, w_arg1, w_arg2, w_arg3) + f.valuestack.push(w_result) + # XXX end of hack for performance + else: + # general case + f.call_function(oparg) def CALL_FUNCTION_VAR(f, oparg): w_varargs = f.valuestack.pop() From tismer at codespeak.net Sun Sep 18 00:07:00 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Sun, 18 Sep 2005 00:07:00 +0200 (CEST) Subject: [pypy-svn] r17634 - pypy/dist/pypy/translator/goal Message-ID: <20050917220700.CA12027B80@code1.codespeak.net> Author: tismer Date: Sun Sep 18 00:06:59 2005 New Revision: 17634 Modified: pypy/dist/pypy/translator/goal/bench-windows.py Log: did two extra tests, see the last two entries of the list. Undoing both the threshold and the rlist change makes richards a bit better, but pystone much worse. Undoing only the rlist change, keeping the new threshold leaves richards unchanged but makes pystone even worse. Conclusion: I don't revert anything and stick with this. 
executable abs.richards abs.pystone rel.richards rel.pystone pypy-c-17439 35671 ms 649.430 41.5 57.5 pypy-c-17512 46917 ms 661.685 54.6 56.4 pypy-c-17516 38585 ms 670.494 44.9 55.7 pypy-c-17545-intern 34849 ms 781.101 40.6 47.8 pypy-c-17572 36892 ms 755.969 42.9 49.4 pypy-c-17600 26828 ms 919.220 31.2 40.6 pypy-c-17600_ll_list 27138 ms 815.223 31.6 45.8 pypy-c-17623-32_4 25107 ms 994.383 29.2 37.6 pypy-c-17626-undo2 23873 ms 903.937 27.8 41.3 pypy-c-17626-unrlist 25206 ms 847.455 29.3 44.1 python 2.3.3 859 ms 37339.500 1.0 1.0 Modified: pypy/dist/pypy/translator/goal/bench-windows.py ============================================================================== --- pypy/dist/pypy/translator/goal/bench-windows.py (original) +++ pypy/dist/pypy/translator/goal/bench-windows.py Sun Sep 18 00:06:59 2005 @@ -2,6 +2,21 @@ # to be executed in the goal folder, # where a couple of .exe files is expected. +current_result = """ +executable abs.richards abs.pystone rel.richards rel.pystone +pypy-c-17439 35671 ms 649.430 41.5 57.5 +pypy-c-17512 46917 ms 661.685 54.6 56.4 +pypy-c-17516 38585 ms 670.494 44.9 55.7 +pypy-c-17545-intern 34849 ms 781.101 40.6 47.8 +pypy-c-17572 36892 ms 755.969 42.9 49.4 +pypy-c-17600 26828 ms 919.220 31.2 40.6 +pypy-c-17600_ll_list 27138 ms 815.223 31.6 45.8 +pypy-c-17623-32_4 25107 ms 994.383 29.2 37.6 +pypy-c-17626-undo2 23873 ms 903.937 27.8 41.3 +pypy-c-17626-unrlist 25206 ms 847.455 29.3 44.1 +python 2.3.3 859 ms 37339.500 1.0 1.0 +""" + import os, sys PYSTONE_CMD = 'from test import pystone;pystone.main(%s)' From pedronis at codespeak.net Sun Sep 18 00:42:18 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sun, 18 Sep 2005 00:42:18 +0200 (CEST) Subject: [pypy-svn] r17635 - in pypy/dist/pypy/objspace: . std Message-ID: <20050917224218.BDAFB27B7D@code1.codespeak.net> Author: pedronis Date: Sun Sep 18 00:42:16 2005 New Revision: 17635 Modified: pypy/dist/pypy/objspace/descroperation.py pypy/dist/pypy/objspace/std/objspace.py Log: don't end up querying the type of the objects twice for binary ops. 
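The point of the change is to compute each operand's type once and then do the method lookup starting from that type object, instead of going through a helper that re-derives the type internally. A rough sketch of the resulting shape for a binary operator, with illustrative names and the subclass-priority handling left out::

    def lookup_in_type_where(typ, name):
        # walk the mro of an already-computed type object
        for klass in typ.__mro__:
            if name in klass.__dict__:
                return klass, klass.__dict__[name]
        return None, None

    def binop_impl(obj1, obj2, left='__add__', right='__radd__'):
        # compute both types exactly once and reuse them for the lookups
        typ1 = type(obj1)
        typ2 = type(obj2)
        left_src, left_impl = lookup_in_type_where(typ1, left)
        if typ1 is typ2:
            right_impl = None
        else:
            right_src, right_impl = lookup_in_type_where(typ2, right)
        if left_impl is not None:
            result = left_impl(obj1, obj2)
            if result is not NotImplemented:
                return result
        if right_impl is not None:
            result = right_impl(obj2, obj1)
            if result is not NotImplemented:
                return result
        raise TypeError("unsupported operand types")

    assert binop_impl(2, 3) == 5
    assert binop_impl(2, 3.5) == 5.5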
Modified: pypy/dist/pypy/objspace/descroperation.py ============================================================================== --- pypy/dist/pypy/objspace/descroperation.py (original) +++ pypy/dist/pypy/objspace/descroperation.py Sun Sep 18 00:42:16 2005 @@ -239,11 +239,11 @@ def pow(space, w_obj1, w_obj2, w_obj3): w_typ1 = space.type(w_obj1) w_typ2 = space.type(w_obj2) - w_left_src, w_left_impl = space.lookup_where(w_obj1, '__pow__') + w_left_src, w_left_impl = space.lookup_in_type_where(w_typ1, '__pow__') if space.is_w(w_typ1, w_typ2): w_right_impl = None else: - w_right_src, w_right_impl = space.lookup_where(w_obj2, '__rpow__') + w_right_src, w_right_impl = space.lookup_in_type_where(w_typ2, '__rpow__') if space.is_true(space.issubtype(w_typ2, w_typ1)) and not space.is_w(w_left_src, w_right_src): w_obj1, w_obj2 = w_obj2, w_obj1 w_left_impl, w_right_impl = w_right_impl, w_left_impl @@ -331,11 +331,11 @@ def coerce(space, w_obj1, w_obj2): w_typ1 = space.type(w_obj1) w_typ2 = space.type(w_obj2) - w_left_src, w_left_impl = space.lookup_where(w_obj1, '__coerce__') + w_left_src, w_left_impl = space.lookup_in_type_where(w_typ1, '__coerce__') if space.is_w(w_typ1, w_typ2): w_right_impl = None else: - w_right_src, w_right_impl = space.lookup_where(w_obj2, '__coerce__') + w_right_src, w_right_impl = space.lookup_in_type_where(w_typ2, '__coerce__') if space.is_true(space.issubtype(w_typ2, w_typ1)) and not space.is_w(w_left_src, w_right_src): w_obj1, w_obj2 = w_obj2, w_obj1 w_left_impl, w_right_impl = w_right_impl, w_left_impl @@ -386,13 +386,13 @@ def _cmp(space, w_obj1, w_obj2): w_typ1 = space.type(w_obj1) w_typ2 = space.type(w_obj2) - w_left_src, w_left_impl = space.lookup_where(w_obj1, '__cmp__') + w_left_src, w_left_impl = space.lookup_in_type_where(w_typ1, '__cmp__') do_neg1 = False do_neg2 = True if space.is_w(w_typ1, w_typ2): w_right_impl = None else: - w_right_src, w_right_impl = space.lookup_where(w_obj2, '__cmp__') + w_right_src, w_right_impl = space.lookup_in_type_where(w_typ2, '__cmp__') if space.is_true(space.issubtype(w_typ2, w_typ1)) and not space.is_w(w_right_src, w_left_src): w_obj1, w_obj2 = w_obj2, w_obj1 w_left_impl, w_right_impl = w_right_impl, w_left_impl @@ -429,11 +429,11 @@ def binop_impl(space, w_obj1, w_obj2): w_typ1 = space.type(w_obj1) w_typ2 = space.type(w_obj2) - w_left_src, w_left_impl = space.lookup_where(w_obj1, left) + w_left_src, w_left_impl = space.lookup_in_type_where(w_typ1, left) if space.is_w(w_typ1, w_typ2): w_right_impl = None else: - w_right_src, w_right_impl = space.lookup_where(w_obj2, right) + w_right_src, w_right_impl = space.lookup_in_type_where(w_typ2, right) if space.is_true(space.issubtype(w_typ2, w_typ1)) and not space.is_w(w_right_src, w_left_src): w_obj1, w_obj2 = w_obj2, w_obj1 w_left_impl, w_right_impl = w_right_impl, w_left_impl @@ -454,14 +454,14 @@ def comparison_impl(space, w_obj1, w_obj2): w_typ1 = space.type(w_obj1) w_typ2 = space.type(w_obj2) - w_left_src, w_left_impl = space.lookup_where(w_obj1, left) + w_left_src, w_left_impl = space.lookup_in_type_where(w_typ1, left) w_first = w_obj1 w_second = w_obj2 if space.is_w(w_typ1, w_typ2): w_right_impl = None else: - w_right_src, w_right_impl = space.lookup_where(w_obj2, right) + w_right_src, w_right_impl = space.lookup_in_type_where(w_typ2, right) if space.is_true(space.issubtype(w_typ2, w_typ1)) and not space.is_w(w_right_src, w_left_src): w_obj1, w_obj2 = w_obj2, w_obj1 w_left_impl, w_right_impl = w_right_impl, w_left_impl Modified: pypy/dist/pypy/objspace/std/objspace.py 
============================================================================== --- pypy/dist/pypy/objspace/std/objspace.py (original) +++ pypy/dist/pypy/objspace/std/objspace.py Sun Sep 18 00:42:16 2005 @@ -357,8 +357,7 @@ w_type = w_obj.getclass(self) return w_type.lookup(name) - def lookup_where(self, w_obj, name): - w_type = w_obj.getclass(self) + def lookup_in_type_where(self, w_type, name): return w_type.lookup_where(name) def allocate_instance(self, cls, w_subtype): From tismer at codespeak.net Sun Sep 18 01:20:53 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Sun, 18 Sep 2005 01:20:53 +0200 (CEST) Subject: [pypy-svn] r17636 - pypy/dist/pypy/translator/goal Message-ID: <20050917232053.712BF27B7D@code1.codespeak.net> Author: tismer Date: Sun Sep 18 01:20:52 2005 New Revision: 17636 Modified: pypy/dist/pypy/translator/goal/bench-windows.py Log: updated benchmarks after Armin's new acceleration. It is in fact phantastic, around 20% for richards, small effect on pystone. Now we have an almost 200% speedup in about 5 days. Repeating this 4-5 times more, and we are there in 25 days. haha :) executable abs.richards abs.pystone rel.richards rel.pystone pypy-c-17439 35131 ms 664.546 42.1 61.3 pypy-c-17512 45805 ms 682.501 54.9 59.7 pypy-c-17516 37934 ms 727.160 45.5 56.0 pypy-c-17545-intern 34299 ms 795.603 41.1 51.2 pypy-c-17572 36121 ms 785.060 43.3 51.9 pypy-c-17600 26348 ms 932.547 31.6 43.7 pypy-c-17623-32_4 24825 ms 1010.910 29.8 40.3 pypy-c-17634 20069 ms 1018.520 24.1 40.0 python 2.3.3 834 ms 40719.600 1.0 1.0 Modified: pypy/dist/pypy/translator/goal/bench-windows.py ============================================================================== --- pypy/dist/pypy/translator/goal/bench-windows.py (original) +++ pypy/dist/pypy/translator/goal/bench-windows.py Sun Sep 18 01:20:52 2005 @@ -4,17 +4,15 @@ current_result = """ executable abs.richards abs.pystone rel.richards rel.pystone -pypy-c-17439 35671 ms 649.430 41.5 57.5 -pypy-c-17512 46917 ms 661.685 54.6 56.4 -pypy-c-17516 38585 ms 670.494 44.9 55.7 -pypy-c-17545-intern 34849 ms 781.101 40.6 47.8 -pypy-c-17572 36892 ms 755.969 42.9 49.4 -pypy-c-17600 26828 ms 919.220 31.2 40.6 -pypy-c-17600_ll_list 27138 ms 815.223 31.6 45.8 -pypy-c-17623-32_4 25107 ms 994.383 29.2 37.6 -pypy-c-17626-undo2 23873 ms 903.937 27.8 41.3 -pypy-c-17626-unrlist 25206 ms 847.455 29.3 44.1 -python 2.3.3 859 ms 37339.500 1.0 1.0 +pypy-c-17439 35131 ms 664.546 42.1 61.3 +pypy-c-17512 45805 ms 682.501 54.9 59.7 +pypy-c-17516 37934 ms 727.160 45.5 56.0 +pypy-c-17545-intern 34299 ms 795.603 41.1 51.2 +pypy-c-17572 36121 ms 785.060 43.3 51.9 +pypy-c-17600 26348 ms 932.547 31.6 43.7 +pypy-c-17623-32_4 24825 ms 1010.910 29.8 40.3 +pypy-c-17634 20069 ms 1018.520 24.1 40.0 +python 2.3.3 834 ms 40719.600 1.0 1.0 """ import os, sys From tismer at codespeak.net Sun Sep 18 01:40:09 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Sun, 18 Sep 2005 01:40:09 +0200 (CEST) Subject: [pypy-svn] r17638 - pypy/dist/pypy/doc Message-ID: <20050917234009.21D8B27B7D@code1.codespeak.net> Author: tismer Date: Sun Sep 18 01:40:08 2005 New Revision: 17638 Modified: pypy/dist/pypy/doc/thoughts_string_interning.txt Log: update: I think I found an easy way to implement interning of strings. 
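As background, the general interning pattern is a table that maps a string value to one canonical instance, so that equal interned strings can be compared by identity; the note added below goes further and proposes storing a direct slot index in the low-level string itself. A minimal generic sketch of the basic table, not the rstr scheme described in the diff::

    class InternTable(object):
        # maps a string value to one canonical instance, so equal
        # strings interned through the table compare by identity
        def __init__(self):
            self._table = {}

        def intern(self, s):
            canonical = self._table.get(s)
            if canonical is None:
                self._table[s] = s
                canonical = s
            return canonical

    table = InternTable()
    a = table.intern(''.join(['py', 'py']))
    b = table.intern('pypy')
    assert a == b and a is b    # one shared instance after interning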
Modified: pypy/dist/pypy/doc/thoughts_string_interning.txt ============================================================================== --- pypy/dist/pypy/doc/thoughts_string_interning.txt (original) +++ pypy/dist/pypy/doc/thoughts_string_interning.txt Sun Sep 18 01:40:08 2005 @@ -58,6 +58,20 @@ as the flag might be acceptable. A dummyobject can be used if the interned rstr is not exposed as an interned string object. +Update: a reasonably simple implementation +------------------------------------------- + +Instead of the complications using the stringobject as a property of an rstr +instance, I propose to special case this kind of dictionary (mapping rstr +to stringobject) and to put an integer ``interned`` field into the rstr. The +default is -1 for not interned. Non-negative values are the direct index +of this string into the interning dict. That is, we grow an extra function +that indexes the dict by slot number of the dict table and gives direct +access to its value. The dictionary gets special handling on dict_resize, +to recompute the slot numbers of the interned strings. ATM I'd say we leave +the strings immortal and support mortality later when we have a cheap +way to express this (less refcount, exclusion from Boehm, whatever). + A prototype brute-force patch -------------------------------- From tismer at codespeak.net Sun Sep 18 11:29:31 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Sun, 18 Sep 2005 11:29:31 +0200 (CEST) Subject: [pypy-svn] r17639 - pypy/dist/pypy/rpython Message-ID: <20050918092931.909B227B76@code1.codespeak.net> Author: tismer Date: Sun Sep 18 11:29:30 2005 New Revision: 17639 Modified: pypy/dist/pypy/rpython/rlist.py Log: an obvious tiny optimization. since _ll_list_resize gets inlined, it makes sense to split it up intoa shrinking and a growing version. This creates shorter code and less inlining overhead. The effect is slightly noticeable, but side effects on the inlining criteria make these results very unreliable. Modified: pypy/dist/pypy/rpython/rlist.py ============================================================================== --- pypy/dist/pypy/rpython/rlist.py (original) +++ pypy/dist/pypy/rpython/rlist.py Sun Sep 18 11:29:30 2005 @@ -377,11 +377,23 @@ # the allocated size, then proceed with the realloc() to shrink the list. 
allocated = len(l.items) if allocated >= newsize and newsize >= ((allocated >> 1) - 5): - # assert l.ob_item != NULL or newsize == 0 l.length = newsize else: _ll_list_resize_really(l, newsize) +def _ll_list_resize_ge(l, newsize): + if len(l.items) >= newsize: + l.length = newsize + else: + _ll_list_resize_really(l, newsize) + +def _ll_list_resize_le(l, newsize): + if newsize >= (len(l.items) >> 1) - 5: + l.length = newsize + else: + _ll_list_resize_really(l, newsize) + + def ll_copy(l): items = l.items length = l.length @@ -402,13 +414,13 @@ def ll_append(l, newitem): length = l.length - _ll_list_resize(l, length+1) + _ll_list_resize_ge(l, length+1) l.items[length] = newitem # this one is for the special case of insert(0, x) def ll_prepend(l, newitem): length = l.length - _ll_list_resize(l, length+1) + _ll_list_resize_ge(l, length+1) i = length items = l.items i1 = i+1 @@ -420,7 +432,7 @@ def ll_insert_nonneg(l, index, newitem): length = l.length - _ll_list_resize(l, length+1) + _ll_list_resize_ge(l, length+1) items = l.items i = length i1 = i+1 @@ -458,7 +470,7 @@ ITEM = typeOf(l).TO.items.TO.OF if isinstance(ITEM, Ptr): items[index] = nullptr(ITEM.TO) - _ll_list_resize(l, newlength) + _ll_list_resize_le(l, newlength) return res def ll_pop_zero(func, l): @@ -477,7 +489,7 @@ ITEM = typeOf(l).TO.items.TO.OF if isinstance(ITEM, Ptr): items[newlength] = nullptr(ITEM.TO) - _ll_list_resize(l, newlength) + _ll_list_resize_le(l, newlength) return res def ll_pop(func, l, index): @@ -543,7 +555,7 @@ ITEM = typeOf(l).TO.items.TO.OF if isinstance(ITEM, Ptr): items[newlength] = nullptr(ITEM.TO) - _ll_list_resize(l, newlength) + _ll_list_resize_le(l, newlength) def ll_delitem(func, l, i): if i < 0: @@ -575,7 +587,7 @@ len1 = l1.length len2 = l2.length newlength = len1 + len2 - _ll_list_resize(l1, newlength) + _ll_list_resize_ge(l1, newlength) items = l1.items source = l2.items i = 0 @@ -643,7 +655,7 @@ while j >= newlength: items[j] = nullptr(ITEM.TO) j -= 1 - _ll_list_resize(l, newlength) + _ll_list_resize_le(l, newlength) def ll_listdelslice(l, slice): start = slice.start @@ -664,7 +676,7 @@ while j >= newlength: items[j] = nullptr(ITEM.TO) j -= 1 - _ll_list_resize(l, newlength) + _ll_list_resize_le(l, newlength) def ll_listsetslice(l1, slice, l2): count = l2.length From pedronis at codespeak.net Sun Sep 18 16:31:08 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sun, 18 Sep 2005 16:31:08 +0200 (CEST) Subject: [pypy-svn] r17645 - pypy/dist/pypy/translator/goal Message-ID: <20050918143108.8B5CB27B6C@code1.codespeak.net> Author: pedronis Date: Sun Sep 18 16:31:07 2005 New Revision: 17645 Added: pypy/dist/pypy/translator/goal/bench-unix.py - copied, changed from r17640, pypy/dist/pypy/translator/goal/bench-windows.py Log: adapted for unix conventions From pedronis at codespeak.net Sun Sep 18 19:49:29 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sun, 18 Sep 2005 19:49:29 +0200 (CEST) Subject: [pypy-svn] r17647 - pypy/dist/pypy/translator/goal Message-ID: <20050918174929.43C1427B6A@code1.codespeak.net> Author: pedronis Date: Sun Sep 18 19:49:28 2005 New Revision: 17647 Added: pypy/dist/pypy/translator/goal/Ratthing-b246-benchs.txt (contents, props changed) Log: not happy :( benchmarks over the last revs with and without -t-lowmem on my office machine Added: pypy/dist/pypy/translator/goal/Ratthing-b246-benchs.txt ============================================================================== --- (empty file) +++ 
pypy/dist/pypy/translator/goal/Ratthing-b246-benchs.txt Sun Sep 18 19:49:28 2005 @@ -0,0 +1,8 @@ +executable abs.richards abs.pystone rel.richards rel.pystone +./pypy-c-BNthGeni17627 22770 ms 1075.270 34.5 46.0 +./pypy-c-BNthGeni17631 23801 ms 1156.070 36.1 42.8 +./pypy-c-BNthGeni17640 24124 ms 1142.860 36.6 43.3 +./pypy-c-BNthLowm17627 30187 ms 696.864 45.7 71.0 +./pypy-c-BNthLowm17631 29780 ms 711.744 45.1 69.6 +./pypy-c-BNthLowm17640 28996 ms 751.880 43.9 65.8 +python 2.3.5 660 ms 49505.000 1.0 1.0 From tismer at codespeak.net Sun Sep 18 21:25:06 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Sun, 18 Sep 2005 21:25:06 +0200 (CEST) Subject: [pypy-svn] r17649 - pypy/dist/pypy/interpreter Message-ID: <20050918192506.245B127B69@code1.codespeak.net> Author: tismer Date: Sun Sep 18 21:25:04 2005 New Revision: 17649 Modified: pypy/dist/pypy/interpreter/miscutils.py pypy/dist/pypy/interpreter/pyframe.py Log: a patch that makes the value stack fixed size. This implies notto use push and pop, but to control the stack pointer in interplevel. Right now a slowdown on richards and a small improvement on pystone. We will see how it performs with extra supportfrom rlist. Modified: pypy/dist/pypy/interpreter/miscutils.py ============================================================================== --- pypy/dist/pypy/interpreter/miscutils.py (original) +++ pypy/dist/pypy/interpreter/miscutils.py Sun Sep 18 21:25:04 2005 @@ -4,8 +4,12 @@ import types +from pypy.rpython.rarithmetic import r_uint -class Stack: +class RootStack: + pass + +class Stack(RootStack): """Utility class implementing a stack.""" _annspecialcase_ = "specialize:ctr_location" # polymorphic @@ -44,10 +48,52 @@ def empty(self): return not self.items -## def __iter__(self): -## # Walk the stack backwards -## for ii in self.items[::-1]: -## yield ii + +class FixedStack(RootStack): + _annspecialcase_ = "specialize:ctr_location" # polymorphic + + # unfortunately, we have to re-do everything + def __init__(self): + pass + + def setup(self, stacksize): + self.ptr = r_uint(0) # we point after the last element + self.items = [None] * stacksize + + def clone(self): + # this is only needed if we support flow space + s = self.__class__() + s.setup(len(self.items)) + for item in self.items[:self.ptr]: + try: + item = item.clone() + except AttributeError: + pass + s.push(item) + return s + + def push(self, item): + ptr = self.ptr + self.items[ptr] = item + self.ptr = ptr + 1 + + def pop(self): + ptr = self.ptr - 1 + ret = self.items[ptr] + self.items[ptr] = None + self.ptr = ptr + return ret + + def top(self, position=0): + # for a fixed stack, we assume correct indices + return self.items[self.ptr + ~position] + + def depth(self): + return self.ptr + + def empty(self): + return not self.ptr + class InitializedClass(type): """NOT_RPYTHON. 
A meta-class that allows a class to initialize itself (or Modified: pypy/dist/pypy/interpreter/pyframe.py ============================================================================== --- pypy/dist/pypy/interpreter/pyframe.py (original) +++ pypy/dist/pypy/interpreter/pyframe.py Sun Sep 18 21:25:04 2005 @@ -2,7 +2,7 @@ """ from pypy.interpreter import eval, baseobjspace, gateway -from pypy.interpreter.miscutils import Stack +from pypy.interpreter.miscutils import Stack, FixedStack from pypy.interpreter.error import OperationError from pypy.interpreter import pytraceback import opcode @@ -39,7 +39,13 @@ def __init__(self, space, code, w_globals, closure): self.pycode = code eval.Frame.__init__(self, space, w_globals, code.co_nlocals) - self.valuestack = Stack() + # XXX hack: FlowSpace directly manipulates stack + # cannot use FixedStack without rewriting framestate + if space.full_exceptions: + self.valuestack = FixedStack() + self.valuestack.setup(code.co_stacksize) + else: + self.valuestack = Stack() self.blockstack = Stack() self.last_exception = None self.next_instr = 0 From tismer at codespeak.net Sun Sep 18 22:29:34 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Sun, 18 Sep 2005 22:29:34 +0200 (CEST) Subject: [pypy-svn] r17650 - pypy/dist/pypy/translator/goal Message-ID: <20050918202934.A0D0627B6D@code1.codespeak.net> Author: tismer Date: Sun Sep 18 22:29:33 2005 New Revision: 17650 Modified: pypy/dist/pypy/translator/goal/bench-windows.py Log: timings after using a fixed valuestack. executable abs.richards abs.pystone rel.richards rel.pystone pypy-c-17439 35180 ms 661.339 41.9 59.7 pypy-c-17512 46007 ms 659.205 54.8 59.9 pypy-c-17516 37944 ms 704.839 45.2 56.0 pypy-c-17545-intern 34309 ms 764.987 40.8 51.6 pypy-c-17572 36061 ms 736.094 42.9 53.7 pypy-c-17600 26348 ms 901.957 31.4 43.8 pypy-c-17623-32_4 24734 ms 970.845 29.4 40.7 pypy-c-17634 20088 ms 1018.240 23.9 38.8 pypy-c-17649 22902 ms 1018.300 27.3 38.8 python 2.3.3 840 ms 39500.600 1.0 1.0 17649 was with explicit fixed stack. Changes after 17634 we not included. Modified: pypy/dist/pypy/translator/goal/bench-windows.py ============================================================================== --- pypy/dist/pypy/translator/goal/bench-windows.py (original) +++ pypy/dist/pypy/translator/goal/bench-windows.py Sun Sep 18 22:29:33 2005 @@ -4,15 +4,19 @@ current_result = """ executable abs.richards abs.pystone rel.richards rel.pystone -pypy-c-17439 35131 ms 664.546 42.1 61.3 -pypy-c-17512 45805 ms 682.501 54.9 59.7 -pypy-c-17516 37934 ms 727.160 45.5 56.0 -pypy-c-17545-intern 34299 ms 795.603 41.1 51.2 -pypy-c-17572 36121 ms 785.060 43.3 51.9 -pypy-c-17600 26348 ms 932.547 31.6 43.7 -pypy-c-17623-32_4 24825 ms 1010.910 29.8 40.3 -pypy-c-17634 20069 ms 1018.520 24.1 40.0 -python 2.3.3 834 ms 40719.600 1.0 1.0 +pypy-c-17439 35180 ms 661.339 41.9 59.7 +pypy-c-17512 46007 ms 659.205 54.8 59.9 +pypy-c-17516 37944 ms 704.839 45.2 56.0 +pypy-c-17545-intern 34309 ms 764.987 40.8 51.6 +pypy-c-17572 36061 ms 736.094 42.9 53.7 +pypy-c-17600 26348 ms 901.957 31.4 43.8 +pypy-c-17623-32_4 24734 ms 970.845 29.4 40.7 +pypy-c-17634 20088 ms 1018.240 23.9 38.8 +pypy-c-17649 22902 ms 1018.300 27.3 38.8 +python 2.3.3 840 ms 39500.600 1.0 1.0 + +17649 was with explicit fixed stack. +Changes after 17634 we not included. 
""" import os, sys From pedronis at codespeak.net Sun Sep 18 23:55:13 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sun, 18 Sep 2005 23:55:13 +0200 (CEST) Subject: [pypy-svn] r17651 - pypy/dist/pypy/translator/goal Message-ID: <20050918215513.C68F027B5F@code1.codespeak.net> Author: pedronis Date: Sun Sep 18 23:55:13 2005 New Revision: 17651 Modified: pypy/dist/pypy/translator/goal/Ratthing-b246-benchs.txt Log: numbers with optimisation enable for some (O), still very much muddy Modified: pypy/dist/pypy/translator/goal/Ratthing-b246-benchs.txt ============================================================================== --- pypy/dist/pypy/translator/goal/Ratthing-b246-benchs.txt (original) +++ pypy/dist/pypy/translator/goal/Ratthing-b246-benchs.txt Sun Sep 18 23:55:13 2005 @@ -1,8 +1,10 @@ executable abs.richards abs.pystone rel.richards rel.pystone -./pypy-c-BNthGeni17627 22770 ms 1075.270 34.5 46.0 -./pypy-c-BNthGeni17631 23801 ms 1156.070 36.1 42.8 -./pypy-c-BNthGeni17640 24124 ms 1142.860 36.6 43.3 -./pypy-c-BNthLowm17627 30187 ms 696.864 45.7 71.0 -./pypy-c-BNthLowm17631 29780 ms 711.744 45.1 69.6 -./pypy-c-BNthLowm17640 28996 ms 751.880 43.9 65.8 -python 2.3.5 660 ms 49505.000 1.0 1.0 +./pypy-c-BNthGeni17627 22723 ms 1075.270 35.0 46.5 +./pypy-c-BNthGeni17627O 24528 ms 1081.080 37.7 46.3 +./pypy-c-BNthGeni17631 23751 ms 1162.790 36.5 43.0 +./pypy-c-BNthGeni17640O 23912 ms 1149.430 36.8 43.5 +./pypy-c-BNthLowm17627 30100 ms 701.754 46.3 71.3 +./pypy-c-BNthLowm17627O 30184 ms 699.301 46.4 71.5 +./pypy-c-BNthLowm17631 29718 ms 714.286 45.7 70.0 +./pypy-c-BNthLowm17640O 29041 ms 751.880 44.7 66.5 +python 2.3.5 650 ms 50000.000 1.0 1.0 From ericvrp at codespeak.net Mon Sep 19 11:39:50 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Mon, 19 Sep 2005 11:39:50 +0200 (CEST) Subject: [pypy-svn] r17652 - pypy/dist/pypy/translator/backendopt Message-ID: <20050919093950.7F93527B69@code1.codespeak.net> Author: ericvrp Date: Mon Sep 19 11:39:49 2005 New Revision: 17652 Added: pypy/dist/pypy/translator/backendopt/exception.py Log: Work in progress. Checking this in because I'm moving to another machine Added: pypy/dist/pypy/translator/backendopt/exception.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/backendopt/exception.py Mon Sep 19 11:39:49 2005 @@ -0,0 +1,17 @@ +from pypy.translator.unsimplify import split_block +from pypy.objspace.flow.model import Block, flatten + + +def create_exception_handling(translator, graph): + """After an exception in a direct_call, that is not catched by an explicit + except statement, we need to reraise the exception. So after this + direct_call we need to test if an exception had occurred. If so, we return + from the current graph with an unused value (false/0/0.0/null). + Because of the added exitswitch we need an additional block. 
+ """ + blocks = [x for x in flatten(graph) if isinstance(x, Block)] + for block in blocks: + for i in range(len(block.operations)-1, -1, -1): + op = block.operations[i] + if op.opname == 'direct_call': + split_block(translator, graph, block, i) From adim at codespeak.net Mon Sep 19 15:34:50 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Mon, 19 Sep 2005 15:34:50 +0200 (CEST) Subject: [pypy-svn] r17656 - pypy/dist/pypy/interpreter/pyparser Message-ID: <20050919133450.6CA7A27B68@code1.codespeak.net> Author: adim Date: Mon Sep 19 15:34:48 2005 New Revision: 17656 Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py pypy/dist/pypy/interpreter/pyparser/pythonlexer.py Log: improved a bit linenos in AST nodes Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Mon Sep 19 15:34:48 2005 @@ -17,6 +17,7 @@ """parses 'except' [test [',' test]] ':' suite and returns a 4-tuple : (tokens_read, expr1, expr2, except_body) """ + lineno = tokens[0].lineno clause_length = 1 # Read until end of except clause (bound by following 'else', # or 'except' or end of tokens) @@ -34,7 +35,7 @@ return (4, tokens[1], None, tokens[3]) else: # case 'except Exception, exc: body' - return (6, tokens[1], to_lvalue(tokens[3], consts.OP_ASSIGN), tokens[5]) + return (6, tokens[1], to_lvalue(tokens[3], consts.OP_ASSIGN, lineno), tokens[5]) def parse_dotted_names(tokens): @@ -213,7 +214,7 @@ assert isinstance(token, TokenObject) # rtyper info + check if token.get_value() == 'for': index += 1 # skip 'for' - ass_node = to_lvalue(tokens[index], consts.OP_ASSIGN) + ass_node = to_lvalue(tokens[index], consts.OP_ASSIGN, token.lineno) index += 2 # skip 'in' iterable = tokens[index] index += 1 @@ -266,7 +267,7 @@ assert isinstance(token, TokenObject) # rtyper info + check if token.get_value() == 'for': index += 1 # skip 'for' - ass_node = to_lvalue(tokens[index], consts.OP_ASSIGN) + ass_node = to_lvalue(tokens[index], consts.OP_ASSIGN, token.lineno) index += 2 # skip 'in' iterable = tokens[index] index += 1 @@ -309,29 +310,29 @@ return doc -def to_lvalue(ast_node, flags): +def to_lvalue(ast_node, flags, lineno): if isinstance( ast_node, ast.Name ): - return ast.AssName(ast_node.varname, flags) + return ast.AssName(ast_node.varname, flags, lineno) # return ast.AssName(ast_node.name, flags) elif isinstance(ast_node, ast.Tuple): nodes = [] # FIXME: should ast_node.getChildren() but it's not annotable # because of flatten() for node in ast_node.nodes: - nodes.append(to_lvalue(node, flags)) + nodes.append(to_lvalue(node, flags, lineno)) return ast.AssTuple(nodes) elif isinstance(ast_node, ast.List): nodes = [] # FIXME: should ast_node.getChildren() but it's not annotable # because of flatten() for node in ast_node.nodes: - nodes.append(to_lvalue(node, flags)) - return ast.AssList(nodes) + nodes.append(to_lvalue(node, flags, lineno)) + return ast.AssList(nodes, lineno) elif isinstance(ast_node, ast.Getattr): expr = ast_node.expr assert isinstance(ast_node, ast.Getattr) attrname = ast_node.attrname - return ast.AssAttr(expr, attrname, flags) + return ast.AssAttr(expr, attrname, flags, lineno) elif isinstance(ast_node, ast.Subscript): ast_node.flags = flags return ast_node @@ -341,7 +342,7 @@ else: # TODO: check type of ast_node and raise according SyntaxError in case # of del f() - raise ASTError("cannot assign to ", 
ast_node ) + raise ASTError("cannot assign to ", ast_node) def is_augassign( ast_node ): if ( isinstance( ast_node, ast.Name ) or @@ -351,7 +352,7 @@ return True return False -def get_atoms( builder, nb ): +def get_atoms(builder, nb): atoms = [] i = nb while i>0: @@ -399,12 +400,12 @@ """generic factory for CallFunc nodes""" assert isinstance(arglist, ArglistObject) return ast.CallFunc(obj, arglist.arguments, - arglist.stararg, arglist.dstararg) + arglist.stararg, arglist.dstararg, arglist.lineno) def reduce_subscript(obj, subscript): """generic factory for Subscript nodes""" assert isinstance(subscript, SubscriptObject) - return ast.Subscript(obj, consts.OP_APPLY, subscript.value) + return ast.Subscript(obj, consts.OP_APPLY, subscript.value, subscript.lineno) def reduce_slice(obj, sliceobj): """generic factory for Slice nodes""" @@ -412,9 +413,10 @@ if sliceobj.fake_rulename == 'slice': start = sliceobj.value[0] end = sliceobj.value[1] - return ast.Slice(obj, consts.OP_APPLY, start, end) + return ast.Slice(obj, consts.OP_APPLY, start, end, sliceobj.lineno) else: - return ast.Subscript(obj, consts.OP_APPLY, [ast.Sliceobj(sliceobj.value)]) + return ast.Subscript(obj, consts.OP_APPLY, [ast.Sliceobj(sliceobj.value, + sliceobj.lineno)], sliceobj.lineno) def parse_attraccess(tokens): """parses token list like ['a', '.', 'b', '.', 'c', ...] @@ -425,7 +427,7 @@ # XXX HACK for when parse_attraccess is called from build_decorator if isinstance(token, TokenObject): val = token.get_value() - result = ast.Name(val) + result = ast.Name(val, token.lineno) else: result = token index = 1 @@ -435,7 +437,7 @@ index += 1 token = tokens[index] assert isinstance(token, TokenObject) - result = ast.Getattr(result, token.get_value()) + result = ast.Getattr(result, token.get_value(), token.lineno) elif isinstance(token, ArglistObject): result = reduce_callfunc(result, token) elif isinstance(token, SubscriptObject): @@ -456,7 +458,7 @@ ## ## Naming convention: ## to provide a function handler for a grammar rule name yyy -## you should provide a build_yyy( builder, nb ) function +## you should provide a build_yyy(builder, nb) function ## where builder is the AstBuilder instance used to build the ## ast tree and nb is the number of items this rule is reducing ## @@ -466,7 +468,7 @@ ## matches ## x + (2*y) + z ## build_term will be called with nb == 2 -## and get_atoms( builder, nb ) should return a list +## and get_atoms(builder, nb) should return a list ## of 5 objects : Var TokenObject('+') Expr('2*y') TokenObject('+') Expr('z') ## where Var and Expr are AST subtrees and Token is a not yet ## reduced token @@ -475,25 +477,25 @@ ## main reason why build_* functions are not methods of the AstBuilder class ## -def build_atom(builder, nb): - atoms = get_atoms( builder, nb ) +def build_atom(builder, nb, lineno): + atoms = get_atoms(builder, nb) top = atoms[0] if isinstance(top, TokenObject): # assert isinstance(top, TokenObject) # rtyper if top.name == tok.LPAR: if len(atoms) == 2: - builder.push(ast.Tuple([])) # , top.line)) + builder.push(ast.Tuple([], top.lineno)) else: builder.push( atoms[1] ) elif top.name == tok.LSQB: if len(atoms) == 2: - builder.push(ast.List([])) # , top.line)) + builder.push(ast.List([], top.lineno)) else: list_node = atoms[1] # XXX lineno is not on *every* child class of ast.Node # (will probably crash the annotator, but should be # easily fixed) - list_node.lineno = top.line + list_node.lineno = top.lineno builder.push(list_node) elif top.name == tok.LBRACE: items = [] @@ -501,19 +503,19 @@ 
# a : b , c : d # ^ +1 +2 +3 +4 items.append((atoms[index], atoms[index+2])) - builder.push(ast.Dict(items)) # top.line)) + builder.push(ast.Dict(items, top.lineno)) elif top.name == tok.NAME: val = top.get_value() - builder.push( ast.Name(val) ) + builder.push( ast.Name(val, top.lineno) ) elif top.name == tok.NUMBER: - builder.push(ast.Const(builder.eval_number(top.get_value()))) + builder.push(ast.Const(builder.eval_number(top.get_value()), top.lineno)) elif top.name == tok.STRING: # need to concatenate strings in atoms s = '' if len(atoms) == 1: token = atoms[0] assert isinstance(token, TokenObject) - builder.push(ast.Const(parsestr(builder.space, None, token.get_value()))) # XXX encoding + builder.push(ast.Const(parsestr(builder.space, None, token.get_value()), lineno)) # XXX encoding else: space = builder.space empty = space.wrap('') @@ -522,7 +524,7 @@ assert isinstance(token, TokenObject) accum.append(parsestr(builder.space, None, token.get_value())) # XXX encoding w_s = space.call_method(empty, 'join', space.newlist(accum)) - builder.push(ast.Const(w_s)) + builder.push(ast.Const(w_s, top.lineno)) elif top.name == tok.BACKQUOTE: builder.push(ast.Backquote(atoms[1])) else: @@ -536,7 +538,7 @@ return [] -def build_power(builder, nb): +def build_power(builder, nb, lineno): """power: atom trailer* ['**' factor]""" atoms = get_atoms(builder, nb) if len(atoms) == 1: @@ -545,27 +547,27 @@ token = atoms[-2] if isinstance(token, TokenObject) and token.name == tok.DOUBLESTAR: obj = parse_attraccess(slicecut(atoms, 0, -2)) - builder.push(ast.Power([obj, atoms[-1]])) + builder.push(ast.Power([obj, atoms[-1]], lineno)) else: obj = parse_attraccess(atoms) builder.push(obj) -def build_factor( builder, nb ): - atoms = get_atoms( builder, nb ) +def build_factor(builder, nb, lineno): + atoms = get_atoms(builder, nb) if len(atoms) == 1: builder.push( atoms[0] ) elif len(atoms) == 2: token = atoms[0] if isinstance(token, TokenObject): if token.name == tok.PLUS: - builder.push( ast.UnaryAdd( atoms[1] ) ) + builder.push( ast.UnaryAdd( atoms[1], lineno) ) if token.name == tok.MINUS: - builder.push( ast.UnarySub( atoms[1] ) ) + builder.push( ast.UnarySub( atoms[1], lineno) ) if token.name == tok.TILDE: - builder.push( ast.Invert( atoms[1] ) ) + builder.push( ast.Invert( atoms[1], lineno) ) -def build_term( builder, nb ): - atoms = get_atoms( builder, nb ) +def build_term(builder, nb, lineno): + atoms = get_atoms(builder, nb) l = len(atoms) left = atoms[0] for i in range(2,l,2): @@ -573,19 +575,19 @@ op_node = atoms[i-1] assert isinstance(op_node, TokenObject) if op_node.name == tok.STAR: - left = ast.Mul( [ left, right ] ) + left = ast.Mul( [ left, right ], lineno ) elif op_node.name == tok.SLASH: - left = ast.Div( [ left, right ] ) + left = ast.Div( [ left, right ], lineno ) elif op_node.name == tok.PERCENT: - left = ast.Mod( [ left, right ] ) + left = ast.Mod( [ left, right ], lineno ) elif op_node.name == tok.DOUBLESLASH: - left = ast.FloorDiv( [ left, right ] ) + left = ast.FloorDiv( [ left, right ], lineno ) else: raise TokenError("unexpected token", [atoms[i-1]]) builder.push( left ) -def build_arith_expr( builder, nb ): - atoms = get_atoms( builder, nb ) +def build_arith_expr(builder, nb, lineno): + atoms = get_atoms(builder, nb) l = len(atoms) left = atoms[0] for i in range(2,l,2): @@ -593,15 +595,15 @@ op_node = atoms[i-1] assert isinstance(op_node, TokenObject) if op_node.name == tok.PLUS: - left = ast.Add( [ left, right ] ) + left = ast.Add([ left, right ], lineno) elif op_node.name == 
tok.MINUS: - left = ast.Sub( [ left, right ] ) + left = ast.Sub([ left, right ], lineno) else: raise ValueError("unexpected token", [atoms[i-1]] ) builder.push( left ) -def build_shift_expr( builder, nb ): - atoms = get_atoms( builder, nb ) +def build_shift_expr(builder, nb, lineno): + atoms = get_atoms(builder, nb) l = len(atoms) left = atoms[0] for i in range(2,l,2): @@ -609,37 +611,37 @@ op_node = atoms[i-1] assert isinstance(op_node, TokenObject) if op_node.name == tok.LEFTSHIFT: - left = ast.LeftShift( [ left, right ] ) + left = ast.LeftShift( [left, right], lineno ) elif op_node.name == tok.RIGHTSHIFT: - left = ast.RightShift( [ left, right ] ) + left = ast.RightShift( [ left, right ], lineno ) else: raise ValueError("unexpected token", [atoms[i-1]] ) - builder.push( left ) + builder.push(left) -def build_binary_expr(builder, nb, OP): +def build_binary_expr(builder, nb, OP, lineno): atoms = get_atoms(builder, nb) l = len(atoms) if l==1: - builder.push( atoms[0] ) + builder.push(atoms[0]) return items = [] for i in range(0,l,2): # this is atoms not 1 - items.append( atoms[i] ) - builder.push( OP( items ) ) + items.append(atoms[i]) + builder.push(OP(items, lineno)) return -def build_and_expr( builder, nb ): - return build_binary_expr( builder, nb, ast.Bitand ) +def build_and_expr(builder, nb, lineno): + return build_binary_expr(builder, nb, ast.Bitand, lineno) -def build_xor_expr( builder, nb ): - return build_binary_expr( builder, nb, ast.Bitxor ) +def build_xor_expr(builder, nb, lineno): + return build_binary_expr(builder, nb, ast.Bitxor, lineno) -def build_expr( builder, nb ): - return build_binary_expr( builder, nb, ast.Bitor ) +def build_expr(builder, nb, lineno): + return build_binary_expr(builder, nb, ast.Bitor, lineno) -def build_comparison( builder, nb ): - atoms = get_atoms( builder, nb ) +def build_comparison(builder, nb, lineno): + atoms = get_atoms(builder, nb) l = len(atoms) if l == 1: builder.push( atoms[0] ) @@ -656,9 +658,9 @@ assert isinstance(token, TokenObject) op_name = tok.tok_rpunct.get(token.name, token.get_value()) ops.append((op_name, atoms[i+1])) - builder.push(ast.Compare(atoms[0], ops)) + builder.push(ast.Compare(atoms[0], ops, lineno)) -def build_comp_op(builder, nb): +def build_comp_op(builder, nb, lineno): """comp_op reducing has 2 different cases: 1. There's only one token to reduce => nothing to do, just re-push it on the stack @@ -681,83 +683,85 @@ token = atoms[0] assert isinstance(token, TokenObject) if token.get_value() == 'not': - builder.push(TokenObject(tok.NAME, 'not in', None)) + builder.push(TokenObject(tok.NAME, 'not in', lineno)) else: - builder.push(TokenObject(tok.NAME, 'is not', None)) + builder.push(TokenObject(tok.NAME, 'is not', lineno)) else: assert False, "TODO" # uh ? 
-def build_and_test( builder, nb ): - return build_binary_expr( builder, nb, ast.And ) +def build_and_test(builder, nb, lineno): + return build_binary_expr(builder, nb, ast.And, lineno) -def build_not_test(builder, nb): +def build_not_test(builder, nb, lineno): atoms = get_atoms(builder, nb) if len(atoms) == 1: builder.push(atoms[0]) elif len(atoms) == 2: - builder.push(ast.Not(atoms[1])) + builder.push(ast.Not(atoms[1], lineno)) else: assert False, "not_test implementation incomplete in not_test" -def build_test( builder, nb ): - return build_binary_expr(builder, nb, ast.Or) +def build_test(builder, nb, lineno): + return build_binary_expr(builder, nb, ast.Or, lineno) -def build_testlist( builder, nb ): - return build_binary_expr( builder, nb, ast.Tuple ) +def build_testlist(builder, nb, lineno): + return build_binary_expr(builder, nb, ast.Tuple, lineno) -def build_expr_stmt(builder, nb): +def build_expr_stmt(builder, nb, lineno): atoms = get_atoms(builder, nb) l = len(atoms) if l==1: - builder.push(ast.Discard(atoms[0])) + builder.push(ast.Discard(atoms[0], lineno)) return op = atoms[1] assert isinstance(op, TokenObject) if op.name == tok.EQUAL: nodes = [] for i in range(0,l-2,2): - lvalue = to_lvalue( atoms[i], consts.OP_ASSIGN ) - nodes.append( lvalue ) + lvalue = to_lvalue(atoms[i], consts.OP_ASSIGN, op.lineno) + nodes.append(lvalue) rvalue = atoms[-1] - builder.push( ast.Assign( nodes, rvalue ) ) + builder.push( ast.Assign(nodes, rvalue, lineno) ) pass else: assert l==3 lvalue = atoms[0] assert isinstance(op, TokenObject) - builder.push(ast.AugAssign(lvalue, op.get_name(), atoms[2])) + builder.push(ast.AugAssign(lvalue, op.get_name(), atoms[2], lineno)) -def return_one( builder, nb ): - atoms = get_atoms( builder, nb ) +def return_one(builder, nb, lineno): + atoms = get_atoms(builder, nb) l = len(atoms) assert l == 1, "missing one node in stack" builder.push( atoms[0] ) return -def build_simple_stmt( builder, nb ): - atoms = get_atoms( builder, nb ) +def build_simple_stmt(builder, nb, lineno): + atoms = get_atoms(builder, nb) l = len(atoms) nodes = [] for n in range(0,l,2): node = atoms[n] if isinstance(node, TokenObject) and node.name == tok.NEWLINE: - nodes.append(ast.Discard(ast.Const(builder.wrap_none()))) + nodes.append(ast.Discard(ast.Const(builder.wrap_none()), lineno)) else: nodes.append(node) - builder.push(ast.Stmt(nodes)) + builder.push(ast.Stmt(nodes, lineno)) -def build_return_stmt(builder, nb): +def build_return_stmt(builder, nb, lineno): atoms = get_atoms(builder, nb) if len(atoms) > 2: assert False, "return several stmts not implemented" elif len(atoms) == 1: - builder.push(ast.Return(ast.Const(builder.wrap_none()))) # XXX lineno + builder.push(ast.Return(ast.Const(builder.wrap_none(), lineno))) # XXX lineno else: - builder.push(ast.Return(atoms[1])) # XXX lineno + builder.push(ast.Return(atoms[1], lineno)) # XXX lineno -def build_file_input(builder, nb): +def build_file_input(builder, nb, lineno): stmts = [] atoms = get_atoms(builder, nb) + if atoms: + lineno = atoms[0].lineno for node in atoms: if isinstance(node, ast.Stmt): stmts.extend(node.nodes) @@ -770,30 +774,32 @@ stmts.append(node) main_stmt = ast.Stmt(stmts) doc = get_docstring(builder,main_stmt) - return builder.push(ast.Module(doc, main_stmt)) + return builder.push(ast.Module(doc, main_stmt, lineno)) -def build_eval_input(builder, nb): +def build_eval_input(builder, nb, lineno): doc = builder.wrap_none() stmts = [] atoms = get_atoms(builder, nb) assert len(atoms)>=1 return 
builder.push(ast.Expression(atoms[0])) -def build_single_input( builder, nb ): - atoms = get_atoms( builder, nb ) +def build_single_input(builder, nb, lineno): + atoms = get_atoms(builder, nb) l = len(atoms) if l == 1 or l==2: atom0 = atoms[0] if isinstance(atom0, TokenObject) and atom0.name == tok.NEWLINE: - atom0 = ast.Pass() + atom0 = ast.Pass(lineno) elif not isinstance(atom0, ast.Stmt): - atom0 = ast.Stmt([atom0]) - builder.push(ast.Module(builder.wrap_none(), atom0)) + atom0 = ast.Stmt([atom0], lineno) + builder.push(ast.Module(builder.wrap_none(), atom0, atom0.lineno)) else: assert False, "Forbidden path" -def build_testlist_gexp(builder, nb): +def build_testlist_gexp(builder, nb, lineno): atoms = get_atoms(builder, nb) + if atoms: + lineno = atoms[0].lineno l = len(atoms) if l == 1: builder.push(atoms[0]) @@ -809,20 +815,20 @@ expr = atoms[0] genexpr_for = parse_genexpr_for(atoms[1:]) genexpr_for[0].is_outmost = True - builder.push(ast.GenExpr(ast.GenExprInner(expr, genexpr_for))) + builder.push(ast.GenExpr(ast.GenExprInner(expr, genexpr_for, lineno), lineno)) return - builder.push(ast.Tuple(items)) + builder.push(ast.Tuple(items, lineno)) return -def build_lambdef(builder, nb): +def build_lambdef(builder, nb, lineno): """lambdef: 'lambda' [varargslist] ':' test""" atoms = get_atoms(builder, nb) code = atoms[-1] names, defaults, flags = parse_arglist(slicecut(atoms, 1, -2)) - builder.push(ast.Lambda(names, defaults, flags, code)) + builder.push(ast.Lambda(names, defaults, flags, code, lineno)) -def build_trailer(builder, nb): +def build_trailer(builder, nb, lineno): """trailer: '(' ')' | '(' arglist ')' | '[' subscriptlist ']' | '.' NAME """ atoms = get_atoms(builder, nb) @@ -830,7 +836,7 @@ # Case 1 : '(' ... if isinstance(first_token, TokenObject) and first_token.name == tok.LPAR: if len(atoms) == 2: # and atoms[1].token == tok.RPAR: - builder.push(ArglistObject([], None, None)) + builder.push(ArglistObject([], None, None, first_token.lineno)) elif len(atoms) == 3: # '(' Arglist ')' # push arglist on the stack builder.push(atoms[1]) @@ -841,16 +847,16 @@ subs = [] for index in range(1, len(atoms), 2): subs.append(atoms[index]) - builder.push(SubscriptObject('subscript', subs, None)) + builder.push(SubscriptObject('subscript', subs, first_token.lineno)) elif len(atoms) == 2: # Attribute access: '.' NAME builder.push(atoms[0]) builder.push(atoms[1]) - builder.push(TempRuleObject('pending-attr-access', 2, None)) + builder.push(TempRuleObject('pending-attr-access', 2, first_token.lineno)) else: assert False, "Trailer reducing implementation incomplete !" -def build_arglist(builder, nb): +def build_arglist(builder, nb, lineno): """ arglist: (argument ',')* ( '*' test [',' '**' test] | '**' test | @@ -859,20 +865,22 @@ """ atoms = get_atoms(builder, nb) arguments, stararg, dstararg = parse_argument(atoms) - builder.push(ArglistObject(arguments, stararg, dstararg)) + if atoms: + lineno = atoms[0].lineno + builder.push(ArglistObject(arguments, stararg, dstararg, lineno)) -def build_subscript(builder, nb): +def build_subscript(builder, nb, lineno): """'.' '.' '.' 
| [test] ':' [test] [':' [test]] | test""" atoms = get_atoms(builder, nb) token = atoms[0] if isinstance(token, TokenObject) and token.name == tok.DOT: # Ellipsis: - builder.push(ast.Ellipsis()) + builder.push(ast.Ellipsis(lineno)) elif len(atoms) == 1: if isinstance(token, TokenObject) and token.name == tok.COLON: sliceinfos = [None, None, None] - builder.push(SlicelistObject('slice', sliceinfos, None)) + builder.push(SlicelistObject('slice', sliceinfos, lineno)) else: # test builder.push(token) @@ -899,17 +907,17 @@ sliceobj_infos = [] for value in sliceinfos: if value is None: - sliceobj_infos.append(ast.Const(builder.wrap_none())) + sliceobj_infos.append(ast.Const(builder.wrap_none(), lineno)) else: sliceobj_infos.append(value) - builder.push(SlicelistObject('sliceobj', sliceobj_infos, None)) + builder.push(SlicelistObject('sliceobj', sliceobj_infos, lineno)) else: - builder.push(SlicelistObject('slice', sliceinfos, None)) + builder.push(SlicelistObject('slice', sliceinfos, lineno)) else: - builder.push(SubscriptObject('subscript', items, None)) + builder.push(SubscriptObject('subscript', items, lineno)) -def build_listmaker(builder, nb): +def build_listmaker(builder, nb, lineno): """listmaker: test ( list_for | (',' test)* [','] )""" atoms = get_atoms(builder, nb) if len(atoms) >= 2: @@ -919,7 +927,7 @@ # list comp expr = atoms[0] list_for = parse_listcomp(atoms[1:]) - builder.push(ast.ListComp(expr, list_for)) + builder.push(ast.ListComp(expr, list_for, lineno)) return # regular list building (like in [1, 2, 3,]) index = 0 @@ -927,10 +935,12 @@ while index < len(atoms): nodes.append(atoms[index]) index += 2 # skip comas - builder.push(ast.List(nodes)) + if atoms: + lineno = atoms[0].lineno + builder.push(ast.List(nodes, lineno)) -def build_decorator(builder, nb): +def build_decorator(builder, nb, lineno): """decorator: '@' dotted_name [ '(' [arglist] ')' ] NEWLINE""" atoms = get_atoms(builder, nb) nodes = [] @@ -945,10 +955,11 @@ obj = parse_attraccess(nodes) builder.push(obj) -def build_funcdef(builder, nb): +def build_funcdef(builder, nb, lineno): """funcdef: [decorators] 'def' NAME parameters ':' suite """ atoms = get_atoms(builder, nb) + lineno = atoms[0].lineno index = 0 decorators = [] decorator_node = None @@ -965,7 +976,7 @@ decorators.append(atoms[index]) index += 1 if decorators: - decorator_node = ast.Decorators(decorators) + decorator_node = ast.Decorators(decorators, lineno) atoms = atoms[index:] funcname = atoms[1] arglist = [] @@ -979,12 +990,13 @@ arglist = atoms[2] code = atoms[-1] doc = get_docstring(builder, code) - builder.push(ast.Function(decorator_node, funcname, names, default, flags, doc, code)) + builder.push(ast.Function(decorator_node, funcname, names, default, flags, doc, code, lineno)) -def build_classdef(builder, nb): +def build_classdef(builder, nb, lineno): """classdef: 'class' NAME ['(' testlist ')'] ':' suite""" atoms = get_atoms(builder, nb) + lineno = atoms[0].lineno l = len(atoms) classname_token = atoms[1] assert isinstance(classname_token, TokenObject) @@ -1003,9 +1015,9 @@ else: basenames.append(base) doc = get_docstring(builder,body) - builder.push(ast.Class(classname, basenames, doc, body)) + builder.push(ast.Class(classname, basenames, doc, body, lineno)) -def build_suite(builder, nb): +def build_suite(builder, nb, lineno): """suite: simple_stmt | NEWLINE INDENT stmt+ DEDENT""" atoms = get_atoms(builder, nb) if len(atoms) == 1: @@ -1014,7 +1026,7 @@ # Only one statement for (stmt+) stmt = atoms[2] if not isinstance(stmt, ast.Stmt): - stmt = 
ast.Stmt([stmt]) + stmt = ast.Stmt([stmt], atoms[0].lineno) builder.push(stmt) else: # several statements @@ -1025,10 +1037,10 @@ stmts.extend(node.nodes) else: stmts.append(node) - builder.push(ast.Stmt(stmts)) + builder.push(ast.Stmt(stmts, atoms[0].lineno)) -def build_if_stmt(builder, nb): +def build_if_stmt(builder, nb, lineno): """ if_stmt: 'if' test ':' suite ('elif' test ':' suite)* ['else' ':' suite] """ @@ -1046,28 +1058,28 @@ else: # cur_token.get_value() == 'else' else_ = atoms[index+2] break # break is not necessary - builder.push(ast.If(tests, else_)) + builder.push(ast.If(tests, else_, atoms[0].lineno)) -def build_pass_stmt(builder, nb): +def build_pass_stmt(builder, nb, lineno): """past_stmt: 'pass'""" atoms = get_atoms(builder, nb) assert len(atoms) == 1 - builder.push(ast.Pass()) + builder.push(ast.Pass(lineno)) -def build_break_stmt(builder, nb): +def build_break_stmt(builder, nb, lineno): """past_stmt: 'pass'""" atoms = get_atoms(builder, nb) assert len(atoms) == 1 - builder.push(ast.Break()) + builder.push(ast.Break(lineno)) -def build_for_stmt(builder, nb): +def build_for_stmt(builder, nb, lineno): """for_stmt: 'for' exprlist 'in' testlist ':' suite ['else' ':' suite]""" atoms = get_atoms(builder, nb) else_ = None # skip 'for' - assign = to_lvalue(atoms[1], consts.OP_ASSIGN) + assign = to_lvalue(atoms[1], consts.OP_ASSIGN, atoms[0].lineno) # skip 'in' iterable = atoms[3] # skip ':' @@ -1078,7 +1090,7 @@ else_ = atoms[8] builder.push(ast.For(assign, iterable, body, else_)) -def build_exprlist(builder, nb): +def build_exprlist(builder, nb, lineno): """exprlist: expr (',' expr)* [',']""" atoms = get_atoms(builder, nb) if len(atoms) <= 2: @@ -1087,10 +1099,10 @@ names = [] for index in range(0, len(atoms), 2): names.append(atoms[index]) - builder.push(ast.Tuple(names)) + builder.push(ast.Tuple(names, atoms[0].lineno)) -def build_while_stmt(builder, nb): +def build_while_stmt(builder, nb, lineno): """while_stmt: 'while' test ':' suite ['else' ':' suite]""" atoms = get_atoms(builder, nb) else_ = None @@ -1102,10 +1114,10 @@ if len(atoms) > 4: # skip 'else' and ':' else_ = atoms[6] - builder.push(ast.While(test, body, else_)) + builder.push(ast.While(test, body, else_, atoms[0].lineno)) -def build_import_name(builder, nb): +def build_import_name(builder, nb, lineno): """import_name: 'import' dotted_as_names dotted_as_names: dotted_as_name (',' dotted_as_name)* @@ -1146,10 +1158,10 @@ ## atoms[index].name != tok.COMMA: ## index += 1 index += 1 - builder.push(ast.Import(names)) + builder.push(ast.Import(names, lineno)) -def build_import_from(builder, nb): +def build_import_from(builder, nb, lineno): """ import_from: 'from' dotted_name 'import' ('*' | '(' import_as_names ')' | import_as_names) @@ -1190,23 +1202,23 @@ names.append((name, as_name)) if index < l: # case ',' index += 1 - builder.push(ast.From(from_name, names)) + builder.push(ast.From(from_name, names, lineno)) -def build_yield_stmt(builder, nb): +def build_yield_stmt(builder, nb, lineno): atoms = get_atoms(builder, nb) - builder.push(ast.Yield(atoms[1])) + builder.push(ast.Yield(atoms[1], lineno)) -def build_continue_stmt(builder, nb): +def build_continue_stmt(builder, nb, lineno): atoms = get_atoms(builder, nb) - builder.push(ast.Continue()) + builder.push(ast.Continue(lineno)) -def build_del_stmt(builder, nb): +def build_del_stmt(builder, nb, lineno): atoms = get_atoms(builder, nb) - builder.push(to_lvalue(atoms[1], consts.OP_DELETE)) + builder.push(to_lvalue(atoms[1], consts.OP_DELETE, lineno)) -def 
build_assert_stmt(builder, nb): +def build_assert_stmt(builder, nb, lineno): """assert_stmt: 'assert' test [',' test]""" atoms = get_atoms(builder, nb) test = atoms[1] @@ -1214,9 +1226,9 @@ fail = atoms[3] else: fail = None - builder.push(ast.Assert(test, fail)) + builder.push(ast.Assert(test, fail, atoms[0].lineno)) -def build_exec_stmt(builder, nb): +def build_exec_stmt(builder, nb, lineno): """exec_stmt: 'exec' expr ['in' test [',' test]]""" atoms = get_atoms(builder, nb) expr = atoms[1] @@ -1226,9 +1238,9 @@ loc = atoms[3] if len(atoms) > 4: glob = atoms[5] - builder.push(ast.Exec(expr, loc, glob)) + builder.push(ast.Exec(expr, loc, glob, atoms[0].lineno)) -def build_print_stmt(builder, nb): +def build_print_stmt(builder, nb, lineno): """ print_stmt: 'print' ( '>>' test [ (',' test)+ [','] ] | [ test (',' test)* [','] ] ) """ @@ -1247,11 +1259,11 @@ items.append(atoms[index]) last_token = atoms[-1] if isinstance(last_token, TokenObject) and last_token.name == tok.COMMA: - builder.push(ast.Print(items, dest)) + builder.push(ast.Print(items, dest, atoms[0].lineno)) else: - builder.push(ast.Printnl(items, dest)) + builder.push(ast.Printnl(items, dest, atoms[0].lineno)) -def build_global_stmt(builder, nb): +def build_global_stmt(builder, nb, lineno): """global_stmt: 'global' NAME (',' NAME)*""" atoms = get_atoms(builder, nb) names = [] @@ -1259,10 +1271,10 @@ token = atoms[index] assert isinstance(token, TokenObject) names.append(token.get_value()) - builder.push(ast.Global(names)) + builder.push(ast.Global(names, lineno)) -def build_raise_stmt(builder, nb): +def build_raise_stmt(builder, nb, lineno): """raise_stmt: 'raise' [test [',' test [',' test]]]""" atoms = get_atoms(builder, nb) l = len(atoms) @@ -1271,13 +1283,14 @@ expr3 = None if l >= 2: expr1 = atoms[1] + lineno = expr1.lineno if l >= 4: expr2 = atoms[3] if l == 6: expr3 = atoms[5] - builder.push(ast.Raise(expr1, expr2, expr3)) + builder.push(ast.Raise(expr1, expr2, expr3, lineno)) -def build_try_stmt(builder, nb): +def build_try_stmt(builder, nb, lineno): """ try_stmt: ('try' ':' suite (except_clause ':' suite)+ #diagram:break ['else' ':' suite] | 'try' ':' suite 'finally' ':' suite) @@ -1293,7 +1306,7 @@ token = atoms[3] assert isinstance(token, TokenObject) if token.get_value() == 'finally': - builder.push(ast.TryFinally(body, atoms[5])) + builder.push(ast.TryFinally(body, atoms[5], atoms[0].lineno)) else: # token.get_value() == 'except' index = 3 token = atoms[index] @@ -1310,7 +1323,7 @@ assert isinstance(token, TokenObject) assert token.get_value() == 'else' else_ = atoms[index+2] # skip ':' - builder.push(ast.TryExcept(body, handlers, else_)) + builder.push(ast.TryExcept(body, handlers, else_, atoms[0].lineno)) ASTRULES = { @@ -1369,16 +1382,16 @@ class BaseRuleObject(ast.Node): """Base class for unnamed rules""" - def __init__(self, count, src): + def __init__(self, count, lineno): self.count = count - self.line = 0 # src.getline() + self.lineno = lineno # src.getline() self.col = 0 # src.getcol() class RuleObject(BaseRuleObject): """A simple object used to wrap a rule or token""" - def __init__(self, name, count, src): - BaseRuleObject.__init__(self, count, src) + def __init__(self, name, count, lineno): + BaseRuleObject.__init__(self, count, lineno) self.rulename = name def __str__(self): @@ -1391,8 +1404,8 @@ class TempRuleObject(BaseRuleObject): """used to keep track of how many items get_atom() should pop""" - def __init__(self, name, count, src): - BaseRuleObject.__init__(self, count, src) + def __init__(self, 
name, count, lineno): + BaseRuleObject.__init__(self, count, lineno) self.temp_rulename = name def __str__(self): @@ -1404,13 +1417,14 @@ class TokenObject(ast.Node): """A simple object used to wrap a rule or token""" - def __init__(self, name, value, src ): + def __init__(self, name, value, lineno): self.name = name self.value = value self.count = 0 - self.line = 0 # src.getline() + # self.line = 0 # src.getline() self.col = 0 # src.getcol() - + self.lineno = lineno + def get_name(self): return tok.tok_rpunct.get(self.name, tok.tok_name.get(self.name, str(self.name))) @@ -1428,31 +1442,16 @@ return "" % (self.get_name(), self.value) -class FPListObject(ast.Node): - """store temp informations for fplist""" - def __init__(self, name, value, src): - self.name = name - self.value = value - self.count = 0 - self.line = 0 # src.getline() - self.col = 0 # src.getcol() - - def __str__(self): - return "" % (self.value,) - - def __repr__(self): - return "" % (self.value,) - class ObjectAccessor(ast.Node): """base class for ArglistObject, SubscriptObject and SlicelistObject FIXME: think about a more appropriate name """ - def __init__(self, name, value, src): + def __init__(self, name, value, lineno): self.fake_rulename = name self.value = value self.count = 0 - self.line = 0 # src.getline() + self.lineno = lineno # src.getline() self.col = 0 # src.getcol() class ArglistObject(ObjectAccessor): @@ -1460,11 +1459,12 @@ self.value is the 3-tuple (names, defaults, flags) """ - def __init__(self, arguments, stararg, dstararg): + def __init__(self, arguments, stararg, dstararg, lineno): self.fake_rulename = 'arglist' self.arguments = arguments self.stararg = stararg self.dstararg = dstararg + self.lineno = lineno def __str__(self): return "" % self.value @@ -1536,10 +1536,10 @@ # print "\t", self.rule_stack def push_tok(self, name, value, src ): - self.push( TokenObject( name, value, src ) ) + self.push( TokenObject( name, value, src._lineno ) ) def push_rule(self, name, count, src ): - self.push( RuleObject( name, count, src ) ) + self.push( RuleObject( name, count, src._lineno ) ) def alternative( self, rule, source ): # Do nothing, keep rule on top of the stack @@ -1549,7 +1549,7 @@ ## print "ALT:", sym.sym_name[rule.codename], self.rule_stack builder_func = ASTRULES.get(rule.codename, None) if builder_func: - builder_func(self, 1) + builder_func(self, 1, source._lineno) else: ## if DEBUG_MODE: ## print "No reducing implementation for %s, just push it on stack" % ( @@ -1571,7 +1571,7 @@ builder_func = ASTRULES.get(rule.codename, None) if builder_func: # print "REDUCING SEQUENCE %s" % sym.sym_name[rule.codename] - builder_func(self, elts_number) + builder_func(self, elts_number, source._lineno) else: ## if DEBUG_MODE: ## print "No reducing implementation for %s, just push it on stack" % ( Modified: pypy/dist/pypy/interpreter/pyparser/pythonlexer.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/pythonlexer.py (original) +++ pypy/dist/pypy/interpreter/pyparser/pythonlexer.py Mon Sep 19 15:34:48 2005 @@ -275,6 +275,7 @@ last_comment = '' elif initial == '\\': # continued stmt continued = 1 + lnum -= 1 else: if initial in '([{': parenlev = parenlev + 1 From adim at codespeak.net Mon Sep 19 16:10:25 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Mon, 19 Sep 2005 16:10:25 +0200 (CEST) Subject: [pypy-svn] r17659 - pypy/dist/pypy/objspace/std Message-ID: <20050919141025.B52F427B68@code1.codespeak.net> Author: adim Date: Mon 
Sep 19 16:10:24 2005 New Revision: 17659 Modified: pypy/dist/pypy/objspace/std/intobject.py Log: fixed error messages to match CPython's Modified: pypy/dist/pypy/objspace/std/intobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/intobject.py (original) +++ pypy/dist/pypy/objspace/std/intobject.py Mon Sep 19 16:10:24 2005 @@ -147,7 +147,7 @@ z = ovfcheck(x // y) except ZeroDivisionError: raise OperationError(space.w_ZeroDivisionError, - space.wrap("integer division by zero")) + space.wrap("integer division or modulo by zero")) except OverflowError: raise FailedToImplement(space.w_OverflowError, space.wrap("integer division")) @@ -168,7 +168,7 @@ z = ovfcheck(x % y) except ZeroDivisionError: raise OperationError(space.w_ZeroDivisionError, - space.wrap("integer modulo by zero")) + space.wrap("integer division or modulo by zero")) except OverflowError: raise FailedToImplement(space.w_OverflowError, space.wrap("integer modulo")) @@ -181,7 +181,7 @@ z = ovfcheck(x // y) except ZeroDivisionError: raise OperationError(space.w_ZeroDivisionError, - space.wrap("integer divmod by zero")) + space.wrap("integer division or modulo by zero")) except OverflowError: raise FailedToImplement(space.w_OverflowError, space.wrap("integer modulo")) From pedronis at codespeak.net Mon Sep 19 16:12:45 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 19 Sep 2005 16:12:45 +0200 (CEST) Subject: [pypy-svn] r17660 - in pypy/dist/pypy: annotation interpreter tool/test Message-ID: <20050919141245.D443D27B68@code1.codespeak.net> Author: pedronis Date: Mon Sep 19 16:12:43 2005 New Revision: 17660 Modified: pypy/dist/pypy/annotation/bookkeeper.py pypy/dist/pypy/interpreter/argument.py pypy/dist/pypy/interpreter/baseobjspace.py pypy/dist/pypy/interpreter/pyopcode.py pypy/dist/pypy/tool/test/test_pytestsupport.py Log: reduce the wasted creation/copy of dicts/lists for the normal call/argument parsing path. there's still some more that can be done in this area. 
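The gist of this change, taken in isolation: build the argument holder from whatever list the caller already has and leave the keyword dictionary as None until keyword arguments actually appear, instead of unconditionally copying both on every call. A rough, self-contained sketch of that pattern follows; the class and method names are simplified stand-ins, not the real pypy.interpreter.argument.Arguments code.

    class Args(object):
        """Stripped-down illustration of lazy keyword-dict handling."""
        def __init__(self, args_w, kwds_w=None):
            # the caller hands over its own list; no defensive copy on the hot path
            self.args_w = args_w
            self.kwds_w = kwds_w        # None means "no keyword arguments"

        def keyword_names(self):
            # only touch the dict if it was ever created
            if self.kwds_w:
                return sorted(self.kwds_w.keys())
            return []

    a = Args([1, 2])                    # common case: no dict allocated at all
    assert a.keyword_names() == []
    b = Args([1], {'flag': True})       # rarer case: dict built only when needed
    assert b.keyword_names() == ['flag']
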
Modified: pypy/dist/pypy/annotation/bookkeeper.py ============================================================================== --- pypy/dist/pypy/annotation/bookkeeper.py (original) +++ pypy/dist/pypy/annotation/bookkeeper.py Mon Sep 19 16:12:43 2005 @@ -625,10 +625,10 @@ def build_args(self, op, args_s): space = RPythonCallsSpace() if op == "simple_call": - return Arguments(space, args_s) + return Arguments(space, list(args_s)) elif op == "call_args": return Arguments.fromshape(space, args_s[0].const, # shape - args_s[1:]) + list(args_s[1:])) def get_s_init(self, cls): classdef = self.getclassdef(cls) Modified: pypy/dist/pypy/interpreter/argument.py ============================================================================== --- pypy/dist/pypy/interpreter/argument.py (original) +++ pypy/dist/pypy/interpreter/argument.py Mon Sep 19 16:12:43 2005 @@ -16,18 +16,18 @@ blind_arguments = 0 - def __init__(self, space, args_w=[], kwds_w={}, + def __init__(self, space, args_w=None, kwds_w=None, w_stararg=None, w_starstararg=None): self.space = space - self.arguments_w = list(args_w) - self.kwds_w = kwds_w.copy() + self.arguments_w = args_w + self.kwds_w = kwds_w self.w_stararg = w_stararg self.w_starstararg = w_starstararg def frompacked(space, w_args=None, w_kwds=None): """Convenience static method to build an Arguments from a wrapped sequence and a wrapped dictionary.""" - return Arguments(space, w_stararg=w_args, w_starstararg=w_kwds) + return Arguments(space, [], w_stararg=w_args, w_starstararg=w_kwds) frompacked = staticmethod(frompacked) def __repr__(self): @@ -55,6 +55,8 @@ self.arguments_w += self.space.unpackiterable(self.w_stararg) self.w_stararg = None # --- unpack the ** argument now --- + if self.kwds_w is None: + self.kwds_w = {} if self.w_starstararg is not None: space = self.space w_starstararg = self.w_starstararg @@ -63,8 +65,9 @@ raise OperationError(space.w_TypeError, space.wrap("argument after ** must be " "a dictionary")) - d = self.kwds_w.copy() # don't change the original yet, - # in case something goes wrong + # don't change the original yet, + # in case something goes wrong + d = self.kwds_w.copy() for w_key in space.unpackiterable(w_starstararg): try: key = space.str_w(w_key) @@ -179,9 +182,13 @@ if name in kwds_w: raise ArgErrMultipleValues(name) - remainingkwds_w = kwds_w.copy() + remainingkwds_w = self.kwds_w missing = 0 if input_argcount < co_argcount: + if remainingkwds_w is None: + remainingkwds_w = {} + else: + remainingkwds_w = remainingkwds_w.copy() # not enough args, fill in kwargs or defaults if exists def_first = co_argcount - len(defaults_w) for i in range(input_argcount, co_argcount): @@ -209,8 +216,9 @@ # collect extra keyword arguments into the **kwarg if kwargname is not None: w_kwds = self.space.newdict([]) - for key, w_value in remainingkwds_w.items(): - self.space.setitem(w_kwds, self.space.wrap(key), w_value) + if remainingkwds_w: + for key, w_value in remainingkwds_w.items(): + self.space.setitem(w_kwds, self.space.wrap(key), w_value) scope_w.append(w_kwds) elif remainingkwds_w: raise ArgErrUnknownKwds(remainingkwds_w) @@ -223,7 +231,10 @@ def rawshape(self): shape_cnt = len(self.arguments_w) # Number of positional args - shape_keys = self.kwds_w.keys() # List of keywords (strings) + if self.kwds_w: + shape_keys = self.kwds_w.keys() # List of keywords (strings) + else: + shape_keys = [] shape_star = self.w_stararg is not None # Flag: presence of *arg shape_stst = self.w_starstararg is not None # Flag: presence of **kwds shape_keys.sort() 
Modified: pypy/dist/pypy/interpreter/baseobjspace.py ============================================================================== --- pypy/dist/pypy/interpreter/baseobjspace.py (original) +++ pypy/dist/pypy/interpreter/baseobjspace.py Mon Sep 19 16:12:43 2005 @@ -575,7 +575,7 @@ ''' """ w_func = self.fromcache(AppExecCache).getorbuild(source) - args = Arguments(self, posargs_w) + args = Arguments(self, list(posargs_w)) return self.call_args(w_func, args) class AppExecCache(SpaceCache): Modified: pypy/dist/pypy/interpreter/pyopcode.py ============================================================================== --- pypy/dist/pypy/interpreter/pyopcode.py (original) +++ pypy/dist/pypy/interpreter/pyopcode.py Mon Sep 19 16:12:43 2005 @@ -633,12 +633,14 @@ def call_function(f, oparg, w_star=None, w_starstar=None): n_arguments = oparg & 0xff n_keywords = (oparg>>8) & 0xff - keywords = {} - for i in range(n_keywords): - w_value = f.valuestack.pop() - w_key = f.valuestack.pop() - key = f.space.str_w(w_key) - keywords[key] = w_value + keywords = None + if n_keywords: + keywords = {} + for i in range(n_keywords): + w_value = f.valuestack.pop() + w_key = f.valuestack.pop() + key = f.space.str_w(w_key) + keywords[key] = w_value arguments = [f.valuestack.pop() for i in range(n_arguments)] arguments.reverse() args = Arguments(f.space, arguments, keywords, w_star, w_starstar) Modified: pypy/dist/pypy/tool/test/test_pytestsupport.py ============================================================================== --- pypy/dist/pypy/tool/test/test_pytestsupport.py (original) +++ pypy/dist/pypy/tool/test/test_pytestsupport.py Mon Sep 19 16:12:43 2005 @@ -31,7 +31,7 @@ space.setitem(space.builtin.w_dict, space.wrap('AssertionError'), build_pytest_assertion(space)) try: - f.call_args(Arguments([])) + f.call_args(Arguments(None, [])) except OperationError, e: assert e.match(space, space.w_AssertionError) assert space.unwrap(space.str(e.w_value)) == 'assert 42 == 43' From ac at codespeak.net Mon Sep 19 16:19:11 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Mon, 19 Sep 2005 16:19:11 +0200 (CEST) Subject: [pypy-svn] r17661 - in pypy/dist/pypy/interpreter: astcompiler test Message-ID: <20050919141911.C799527B62@code1.codespeak.net> Author: ac Date: Mon Sep 19 16:19:11 2005 New Revision: 17661 Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py pypy/dist/pypy/interpreter/test/test_compiler.py Log: Detect duplicate arguments Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Mon Sep 19 16:19:11 2005 @@ -1212,7 +1212,7 @@ # name. node.expr.accept( self ) self.emit('PRINT_EXPR') - + class AbstractFunctionCode(CodeGenerator): def __init__(self, space, func, isLambda, class_name, mod): self.class_name = class_name @@ -1222,8 +1222,21 @@ else: assert isinstance(func, ast.Function) name = func.name + # Find duplicated arguments. 
+ argnames = {} + for arg in func.argnames: + if isinstance(arg, ast.AssName): + if arg.name in argnames: + raise SyntaxError("duplicate argument '%s' in function definition" % arg.name) + argnames[arg.name] = 1 + elif isinstance(arg, ast.AssTuple): + for name in arg.getArgNames(): + if name in argnames: + raise SyntaxError("duplicate argument '%s' in function definition" % arg.name) + argnames[name] = 1 args, hasTupleArg = generateArgList(func.argnames) + graph = pyassem.PyFlowGraph(space, name, func.filename, args, optimized=self.localsfullyknown, newlocals=1) Modified: pypy/dist/pypy/interpreter/test/test_compiler.py ============================================================================== --- pypy/dist/pypy/interpreter/test/test_compiler.py (original) +++ pypy/dist/pypy/interpreter/test/test_compiler.py Mon Sep 19 16:19:11 2005 @@ -10,9 +10,9 @@ def setup_method(self, method): self.compiler = self.space.createcompiler() - def eval_string(self, string): + def eval_string(self, string, kind='eval'): space = self.space - code = self.compiler.compile(string, '<>', 'eval', 0) + code = self.compiler.compile(string, '<>', kind, 0) return code.exec_code(space, space.newdict([]), space.newdict([])) def test_compile(self): @@ -181,6 +181,21 @@ ex.normalize_exception(self.space) assert ex.match(self.space, self.space.w_UnicodeError) + def test_argument_handling(self): + for expr in 'lambda a,a:0', 'lambda a,a=1:0', 'lambda a=1,a=1:0': + e = py.test.raises(OperationError, self.eval_string, expr) + ex = e.value + ex.normalize_exception(self.space) + assert ex.match(self.space, self.space.w_SyntaxError) + + for code in 'def f(a, a): pass', 'def f(a = 0, a = 1): pass', 'def f(a): global a; a = 1': + e = py.test.raises(OperationError, self.eval_string, code, 'exec') + ex = e.value + ex.normalize_exception(self.space) + assert ex.match(self.space, self.space.w_SyntaxError) + + + class TestECCompiler(BaseTestCompiler): def setup_method(self, method): self.compiler = self.space.getexecutioncontext().compiler From adim at codespeak.net Mon Sep 19 16:36:50 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Mon, 19 Sep 2005 16:36:50 +0200 (CEST) Subject: [pypy-svn] r17662 - pypy/dist/pypy/interpreter/pyparser Message-ID: <20050919143650.AC4EC27B68@code1.codespeak.net> Author: adim Date: Mon Sep 19 16:36:49 2005 New Revision: 17662 Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py Log: changed type and messages of some raised exceptions to match CPython's error messages. 
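The pattern behind this change, picking the error message by the kind of assignment target so the wording matches CPython, can be sketched roughly as below. The dummy node classes and the helper name are stand-ins for illustration only; the message strings are the ones used in this revision.

    class GenExpr(object): pass
    class ListComp(object): pass
    class CallFunc(object): pass

    def bad_assignment_message(node):
        # wording chosen to match what CPython reports for these cases
        if isinstance(node, GenExpr):
            return "assign to generator expression not possible"
        if isinstance(node, ListComp):
            return "can't assign to list comprehension"
        if isinstance(node, CallFunc):
            return "can't assign to function call"
        return "cannot assign to %s" % (node,)

    assert bad_assignment_message(CallFunc()) == "can't assign to function call"
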
Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Mon Sep 19 16:36:49 2005 @@ -7,7 +7,7 @@ from pypy.interpreter.astcompiler import ast, consts import pypy.interpreter.pyparser.pysymbol as sym import pypy.interpreter.pyparser.pytoken as tok -from pypy.interpreter.pyparser.error import SyntaxError, TokenError, ASTError +from pypy.interpreter.pyparser.error import SyntaxError, TokenError, ASTError, ParseError from pypy.interpreter.pyparser.parsestring import parsestr DEBUG_MODE = 0 @@ -342,7 +342,18 @@ else: # TODO: check type of ast_node and raise according SyntaxError in case # of del f() - raise ASTError("cannot assign to ", ast_node) + # #raise ASTError("cannot assign to %s" % ast_node, ast_node) + if isinstance(ast_node, ast.GenExpr): + raise ParseError("assign to generator expression not possible", + lineno, 0, '') + elif isinstance(ast_node, ast.ListComp): + raise ParseError("can't assign to list comprehension", + lineno, 0, '') + elif isinstance(ast_node, ast.CallFunc): + raise ParseError("can't assign to function call", + lineno, 0, '') + else: + raise ASTError("cannot assign to %s" % ast_node, ast_node) def is_augassign( ast_node ): if ( isinstance( ast_node, ast.Name ) or @@ -726,6 +737,9 @@ else: assert l==3 lvalue = atoms[0] + if isinstance(lvalue, (ast.GenExpr, ast.Tuple)): + raise ParseError("augmented assign to tuple literal or generator expression not possible", + lineno, 0, "") assert isinstance(op, TokenObject) builder.push(ast.AugAssign(lvalue, op.get_name(), atoms[2], lineno)) From ac at codespeak.net Mon Sep 19 18:02:30 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Mon, 19 Sep 2005 18:02:30 +0200 (CEST) Subject: [pypy-svn] r17664 - in pypy/dist/pypy/interpreter: . pyparser test Message-ID: <20050919160230.98A2227B5E@code1.codespeak.net> Author: ac Date: Mon Sep 19 18:02:30 2005 New Revision: 17664 Modified: pypy/dist/pypy/interpreter/pycompiler.py pypy/dist/pypy/interpreter/pyparser/astbuilder.py pypy/dist/pypy/interpreter/test/test_compiler.py Log: Detect non-default arguments after default arguments. 
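The detection added in this revision reduces to a counting argument: remember where the first defaulted parameter appears, then require that every parameter from that point on carries a default. The real parser code also discounts *args/**kwargs flags; the self-contained approximation below leaves that out and uses a made-up helper name rather than the actual parse_arglist implementation.

    def check_default_order(param_names, defaulted):
        """Raise if a plain parameter follows one that has a default.

        param_names -- parameter names in declaration order
        defaulted   -- set of names that carry a default value
        """
        first_with_default = -1
        num_defaults = 0
        for i, name in enumerate(param_names):
            if name in defaulted:
                if first_with_default == -1:
                    first_with_default = i
                num_defaults += 1
        if first_with_default != -1:
            expected = len(param_names) - first_with_default
            if num_defaults != expected:
                raise SyntaxError(
                    'non-default argument follows default argument')

    check_default_order(['a', 'b'], set(['b']))      # like "def f(a, b=1)": fine
    failed = False
    try:
        check_default_order(['a', 'b'], set(['a']))  # like "def f(a=1, b)"
    except SyntaxError:
        failed = True
    assert failed
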
Modified: pypy/dist/pypy/interpreter/pycompiler.py ============================================================================== --- pypy/dist/pypy/interpreter/pycompiler.py (original) +++ pypy/dist/pypy/interpreter/pycompiler.py Mon Sep 19 18:02:30 2005 @@ -413,6 +413,9 @@ except ParseError, e: raise OperationError(space.w_SyntaxError, e.wrap_info(space, filename)) + except SyntaxError, e: + raise OperationError(space.w_SyntaxError, + e.wrap_info(space, filename)) try: astcompiler.misc.set_filename(filename, ast_tree) Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Mon Sep 19 18:02:30 2005 @@ -148,12 +148,15 @@ defaults = [] names = [] flags = 0 + first_with_default = -1 while index < l: cur_token = tokens[index] index += 1 if not isinstance(cur_token, TokenObject): # XXX: think of another way to write this test defaults.append(cur_token) + if first_with_default == -1: + first_with_default = len(names) - 1 elif cur_token.name == tok.COMMA: # We could skip test COMMA by incrementing index cleverly # but we might do some experiment on the grammar at some point @@ -198,6 +201,15 @@ elif cur_token.name == tok.NAME: val = cur_token.get_value() names.append( ast.AssName( val, consts.OP_ASSIGN ) ) + + if first_with_default != -1: + num_expected_with_default = len(names) - first_with_default + if flags & consts.CO_VARKEYWORDS: + num_expected_with_default -= 1 + if flags & consts.CO_VARARGS: + num_expected_with_default -= 1 + if len(defaults) != num_expected_with_default: + raise SyntaxError('non-default argument follows default argument') return names, defaults, flags Modified: pypy/dist/pypy/interpreter/test/test_compiler.py ============================================================================== --- pypy/dist/pypy/interpreter/test/test_compiler.py (original) +++ pypy/dist/pypy/interpreter/test/test_compiler.py Mon Sep 19 18:02:30 2005 @@ -194,7 +194,12 @@ ex.normalize_exception(self.space) assert ex.match(self.space, self.space.w_SyntaxError) - + def test_argument_order(self): + code = 'def f(a=1, (b, c)): pass' + e = py.test.raises(OperationError, self.eval_string, code, 'exec') + ex = e.value + ex.normalize_exception(self.space) + assert ex.match(self.space, self.space.w_SyntaxError) class TestECCompiler(BaseTestCompiler): def setup_method(self, method): From adim at codespeak.net Mon Sep 19 18:25:21 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Mon, 19 Sep 2005 18:25:21 +0200 (CEST) Subject: [pypy-svn] r17665 - in pypy/dist/pypy/interpreter/pyparser: . 
test test/samples Message-ID: <20050919162521.BAC2E27B5E@code1.codespeak.net> Author: adim Date: Mon Sep 19 18:25:18 2005 New Revision: 17665 Added: pypy/dist/pypy/interpreter/pyparser/test/samples/snippet_listlinenos.py (contents, props changed) pypy/dist/pypy/interpreter/pyparser/test/samples/snippet_whilelineno.py (contents, props changed) Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py Log: - fixed lots of bad linenos - test_astbuilder now compares stablecompiler's linenos and astcompiler's linenos Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Mon Sep 19 18:25:18 2005 @@ -80,7 +80,7 @@ else: last_token = arguments.pop() assert isinstance(last_token, ast.Name) # used by rtyper - arguments.append(ast.Keyword(last_token.varname, cur_token)) + arguments.append(ast.Keyword(last_token.varname, cur_token, last_token.lineno)) building_kw = False kw_built = True continue @@ -107,7 +107,7 @@ expr = arguments[0] genexpr_for = parse_genexpr_for(tokens[index:]) genexpr_for[0].is_outmost = True - gexp = ast.GenExpr(ast.GenExprInner(expr, genexpr_for)) + gexp = ast.GenExpr(ast.GenExprInner(expr, genexpr_for, expr.lineno), expr.lineno) arguments[0] = gexp break return arguments, stararg_token, dstararg_token @@ -121,7 +121,11 @@ but it can't work with the default compiler. We switched to use astcompiler module now """ - top = ast.AssTuple([]) + if tokens: + lineno = tokens[0].lineno + else: + lineno = -1 + top = ast.AssTuple([], lineno) stack = [ top ] tokens_read = 0 while stack: @@ -130,7 +134,7 @@ if isinstance(token, TokenObject) and token.name == tok.COMMA: continue elif isinstance(token, TokenObject) and token.name == tok.LPAR: - new_tuple = ast.AssTuple([]) + new_tuple = ast.AssTuple([], lineno) stack[-1].nodes.append( new_tuple ) stack.append(new_tuple) elif isinstance(token, TokenObject) and token.name == tok.RPAR: @@ -138,7 +142,7 @@ else: assert isinstance(token, TokenObject) val = token.get_value() - stack[-1].nodes.append(ast.AssName(val,consts.OP_ASSIGN)) + stack[-1].nodes.append(ast.AssName(val,consts.OP_ASSIGN, token.lineno)) return tokens_read, top def parse_arglist(tokens): @@ -221,6 +225,10 @@ list_fors = [] ifs = [] index = 0 + if tokens: + lineno = tokens[0].lineno + else: + lineno = -1 while index < len(tokens): token = tokens[index] assert isinstance(token, TokenObject) # rtyper info + check @@ -234,11 +242,11 @@ token = tokens[index] assert isinstance(token, TokenObject) # rtyper info if token.get_value() == 'if': - ifs.append(ast.ListCompIf(tokens[index+1])) + ifs.append(ast.ListCompIf(tokens[index+1], token.lineno)) index += 2 else: break - list_fors.append(ast.ListCompFor(ass_node, iterable, ifs)) + list_fors.append(ast.ListCompFor(ass_node, iterable, ifs, lineno)) ifs = [] else: assert False, 'Unexpected token: expecting for in listcomp' @@ -274,6 +282,10 @@ genexpr_fors = [] ifs = [] index = 0 + if tokens: + lineno = tokens[0].lineno + else: + lineno = -1 while index < len(tokens): token = tokens[index] assert isinstance(token, TokenObject) # rtyper info + check @@ -287,11 +299,11 @@ token = tokens[index] assert isinstance(token, TokenObject) # rtyper info if token.get_value() == 'if': - ifs.append(ast.GenExprIf(tokens[index+1])) + ifs.append(ast.GenExprIf(tokens[index+1], token.lineno)) index 
+= 2 else: break - genexpr_fors.append(ast.GenExprFor(ass_node, iterable, ifs)) + genexpr_fors.append(ast.GenExprFor(ass_node, iterable, ifs, lineno)) ifs = [] else: assert False, 'Unexpected token: expected for in genexpr' @@ -332,7 +344,7 @@ # because of flatten() for node in ast_node.nodes: nodes.append(to_lvalue(node, flags, lineno)) - return ast.AssTuple(nodes) + return ast.AssTuple(nodes, lineno) elif isinstance(ast_node, ast.List): nodes = [] # FIXME: should ast_node.getChildren() but it's not annotable @@ -515,10 +527,6 @@ builder.push(ast.List([], top.lineno)) else: list_node = atoms[1] - # XXX lineno is not on *every* child class of ast.Node - # (will probably crash the annotator, but should be - # easily fixed) - list_node.lineno = top.lineno builder.push(list_node) elif top.name == tok.LBRACE: items = [] @@ -538,7 +546,7 @@ if len(atoms) == 1: token = atoms[0] assert isinstance(token, TokenObject) - builder.push(ast.Const(parsestr(builder.space, None, token.get_value()), lineno)) # XXX encoding + builder.push(ast.Const(parsestr(builder.space, None, token.get_value()), top.lineno)) # XXX encoding else: space = builder.space empty = space.wrap('') @@ -549,7 +557,7 @@ w_s = space.call_method(empty, 'join', space.newlist(accum)) builder.push(ast.Const(w_s, top.lineno)) elif top.name == tok.BACKQUOTE: - builder.push(ast.Backquote(atoms[1])) + builder.push(ast.Backquote(atoms[1], atoms[1].lineno)) else: raise TokenError("unexpected tokens", atoms) @@ -598,11 +606,11 @@ op_node = atoms[i-1] assert isinstance(op_node, TokenObject) if op_node.name == tok.STAR: - left = ast.Mul( [ left, right ], lineno ) + left = ast.Mul( [ left, right ], left.lineno ) elif op_node.name == tok.SLASH: - left = ast.Div( [ left, right ], lineno ) + left = ast.Div( [ left, right ], left.lineno ) elif op_node.name == tok.PERCENT: - left = ast.Mod( [ left, right ], lineno ) + left = ast.Mod( [ left, right ], left.lineno ) elif op_node.name == tok.DOUBLESLASH: left = ast.FloorDiv( [ left, right ], lineno ) else: @@ -618,11 +626,11 @@ op_node = atoms[i-1] assert isinstance(op_node, TokenObject) if op_node.name == tok.PLUS: - left = ast.Add([ left, right ], lineno) + left = ast.Add([ left, right ], left.lineno) elif op_node.name == tok.MINUS: - left = ast.Sub([ left, right ], lineno) + left = ast.Sub([ left, right ], left.lineno) else: - raise ValueError("unexpected token", [atoms[i-1]] ) + raise ValueError("unexpected token", [atoms[i-1]]) builder.push( left ) def build_shift_expr(builder, nb, lineno): @@ -648,7 +656,11 @@ if l==1: builder.push(atoms[0]) return + # Here, len(atoms) >= 2 items = [] + # Apparently, lineno should be set to the line where + # the first OP occurs + lineno = atoms[1].lineno for i in range(0,l,2): # this is atoms not 1 items.append(atoms[i]) builder.push(OP(items, lineno)) @@ -681,7 +693,7 @@ assert isinstance(token, TokenObject) op_name = tok.tok_rpunct.get(token.name, token.get_value()) ops.append((op_name, atoms[i+1])) - builder.push(ast.Compare(atoms[0], ops, lineno)) + builder.push(ast.Compare(atoms[0], ops, atoms[0].lineno)) def build_comp_op(builder, nb, lineno): """comp_op reducing has 2 different cases: @@ -720,7 +732,7 @@ if len(atoms) == 1: builder.push(atoms[0]) elif len(atoms) == 2: - builder.push(ast.Not(atoms[1], lineno)) + builder.push(ast.Not(atoms[1], atoms[1].lineno)) else: assert False, "not_test implementation incomplete in not_test" @@ -731,7 +743,11 @@ return build_binary_expr(builder, nb, ast.Tuple, lineno) def build_expr_stmt(builder, nb, lineno): + 
"""expr_stmt: testlist (augassign testlist | ('=' testlist)*) + """ atoms = get_atoms(builder, nb) + if atoms: + lineno = atoms[0].lineno l = len(atoms) if l==1: builder.push(ast.Discard(atoms[0], lineno)) @@ -779,9 +795,9 @@ if len(atoms) > 2: assert False, "return several stmts not implemented" elif len(atoms) == 1: - builder.push(ast.Return(ast.Const(builder.wrap_none(), lineno))) # XXX lineno + builder.push(ast.Return(ast.Const(builder.wrap_none(), lineno), lineno)) else: - builder.push(ast.Return(atoms[1], lineno)) # XXX lineno + builder.push(ast.Return(atoms[1], atoms[0].lineno)) def build_file_input(builder, nb, lineno): stmts = [] @@ -948,6 +964,7 @@ atoms = get_atoms(builder, nb) if len(atoms) >= 2: token = atoms[1] + lineno = token.lineno if isinstance(token, TokenObject): if token.get_value() == 'for': # list comp @@ -985,7 +1002,6 @@ """funcdef: [decorators] 'def' NAME parameters ':' suite """ atoms = get_atoms(builder, nb) - lineno = atoms[0].lineno index = 0 decorators = [] decorator_node = None @@ -1005,6 +1021,7 @@ decorator_node = ast.Decorators(decorators, lineno) atoms = atoms[index:] funcname = atoms[1] + lineno = funcname.lineno arglist = [] index = 3 arglist = slicecut(atoms, 3, -3) @@ -1114,7 +1131,7 @@ if len(atoms) > 6: # skip 'else' and ':' else_ = atoms[8] - builder.push(ast.For(assign, iterable, body, else_)) + builder.push(ast.For(assign, iterable, body, else_, atoms[0].lineno)) def build_exprlist(builder, nb, lineno): """exprlist: expr (',' expr)* [',']""" @@ -1309,12 +1326,11 @@ expr3 = None if l >= 2: expr1 = atoms[1] - lineno = expr1.lineno if l >= 4: expr2 = atoms[3] if l == 6: expr3 = atoms[5] - builder.push(ast.Raise(expr1, expr2, expr3, lineno)) + builder.push(ast.Raise(expr1, expr2, expr3, atoms[0].lineno)) def build_try_stmt(builder, nb, lineno): """ Added: pypy/dist/pypy/interpreter/pyparser/test/samples/snippet_listlinenos.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/interpreter/pyparser/test/samples/snippet_listlinenos.py Mon Sep 19 18:25:18 2005 @@ -0,0 +1,12 @@ +l = [ "foo", "bar", + "baz"] + +l = [ + "foo", + "bar", + "baz", + ] + +l = [] +l = [ + ] Added: pypy/dist/pypy/interpreter/pyparser/test/samples/snippet_whilelineno.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/interpreter/pyparser/test/samples/snippet_whilelineno.py Mon Sep 19 18:25:18 2005 @@ -0,0 +1,18 @@ +while (a < b and c < d + and e < f): + pass + +while (a < b and + c < d + and e < f): + pass + +while (a < b + and c < d + and e < f): + pass + +while (a < b + and c < d and + e < f): + pass Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py Mon Sep 19 18:25:18 2005 @@ -30,7 +30,7 @@ return True -def nodes_equal(left, right): +def nodes_equal(left, right, check_lineno=False): if not isinstance(left,stable_ast.Node) or not isinstance(right,ast_ast.Node): return left==right if left.__class__.__name__ != right.__class__.__name__: @@ -68,6 +68,12 @@ return False if not r: print "Constant mismatch:", left, right + if check_lineno: + # left is a stablecompiler.ast node which means and stable compiler + # doesn't set a lineno on each Node + if left.lineno is not None and left.lineno != right.lineno: + print "(0) 
(%s) left: %s, right: %s" % (left, left.lineno, right.lineno) + return False return True else: left_nodes = left.getChildren() @@ -76,7 +82,14 @@ print "Number of children mismatch:", left, right return False for i,j in zip(left_nodes,right_nodes): - if not nodes_equal(i,j): + if not nodes_equal(i,j, check_lineno): + return False + if check_lineno: + # left is a stablecompiler.ast node which means and stable compiler + # doesn't set a lineno on each Node. + # (stablecompiler.ast.Expression doesn't have a lineno attribute) + if hasattr(left, 'lineno') and left.lineno is not None and left.lineno != right.lineno: + print "(1) (%s) left: %s, right: %s" % (left, left.lineno, right.lineno) return False return True @@ -604,7 +617,7 @@ print print "BUILT:", r1.rule_stack[-1] print "-" * 30 - assert nodes_equal( ast, r1.rule_stack[-1]), 'failed on %r' % (expr) + assert nodes_equal(ast, r1.rule_stack[-1], check_lineno=True), 'failed on %r' % (expr) def test_basic_astgen(): for family in TESTS: @@ -648,6 +661,8 @@ 'snippet_whitespaces.py', 'snippet_samples.py', 'snippet_decorators.py', + 'snippet_listlinenos.py', + 'snippet_whilelineno.py', ] LIBSTUFF = [ From arigo at codespeak.net Mon Sep 19 20:38:55 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Mon, 19 Sep 2005 20:38:55 +0200 (CEST) Subject: [pypy-svn] r17670 - pypy/dist/pypy/rpython Message-ID: <20050919183855.F0D4A27B6C@code1.codespeak.net> Author: arigo Date: Mon Sep 19 20:38:52 2005 New Revision: 17670 Modified: pypy/dist/pypy/rpython/rclass.py pypy/dist/pypy/rpython/rfloat.py pypy/dist/pypy/rpython/rint.py pypy/dist/pypy/rpython/rlist.py pypy/dist/pypy/rpython/rmodel.py pypy/dist/pypy/rpython/rpbc.py pypy/dist/pypy/rpython/rstr.py pypy/dist/pypy/rpython/rtyper.py Log: Makes calling ll-helpers on Repr instances look less hackish: we can now call them as bound methods (where the 'self' plays the role of a specialization argument). Changed ll_str() to be such a bound method now. The plan is to have many more such ll-helpers on Reprs. They allow us to have nice interfaces to manipulate the low-level structures in a way that doesn't depend so much on the details of the low-level structures in question. Modified: pypy/dist/pypy/rpython/rclass.py ============================================================================== --- pypy/dist/pypy/rpython/rclass.py (original) +++ pypy/dist/pypy/rpython/rclass.py Mon Sep 19 20:38:52 2005 @@ -647,7 +647,7 @@ vinst, = hop.inputargs(self) return hop.genop('ptr_nonzero', [vinst], resulttype=Bool) - def ll_str(i, r): # doesn't work for non-gc classes! + def ll_str(self, i): # doesn't work for non-gc classes! 
instance = cast_pointer(OBJECTPTR, i) from pypy.rpython import rstr nameLen = len(instance.typeptr.name) @@ -659,7 +659,6 @@ return rstr.ll_strconcat(rstr.instance_str_prefix, rstr.ll_strconcat(nameString, rstr.instance_str_suffix)) - ll_str = staticmethod(ll_str) class __extend__(pairtype(InstanceRepr, InstanceRepr)): Modified: pypy/dist/pypy/rpython/rfloat.py ============================================================================== --- pypy/dist/pypy/rpython/rfloat.py (original) +++ pypy/dist/pypy/rpython/rfloat.py Mon Sep 19 20:38:52 2005 @@ -134,7 +134,7 @@ rtype_float = rtype_pos - def ll_str(f, repr): + def ll_str(self, f): pyfloat = pyfloat_fromdouble_ptr(f) pystring = pyobject_str_ptr(pyfloat) stringsize = pystring_size_ptr(pystring) @@ -144,8 +144,6 @@ tollchararray_ptr(pystring, ret.chars) return ret - - ll_str = staticmethod(ll_str) PyObjectPtr = Ptr(PyObject) Modified: pypy/dist/pypy/rpython/rint.py ============================================================================== --- pypy/dist/pypy/rpython/rint.py (original) +++ pypy/dist/pypy/rpython/rint.py Mon Sep 19 20:38:52 2005 @@ -289,7 +289,7 @@ vlist = hop.inputargs(Float) return vlist[0] - def ll_str(i, repr): + def ll_str(self, i): from pypy.rpython.rstr import STR temp = malloc(CHAR_ARRAY, 20) len = 0 @@ -316,7 +316,6 @@ result.chars[j] = temp[len-j-1] j += 1 return result - ll_str = staticmethod(ll_str) def rtype_hex(_, hop): varg = hop.inputarg(hop.args_r[0], 0) Modified: pypy/dist/pypy/rpython/rlist.py ============================================================================== --- pypy/dist/pypy/rpython/rlist.py (original) +++ pypy/dist/pypy/rpython/rlist.py Mon Sep 19 20:38:52 2005 @@ -170,15 +170,15 @@ def make_iterator_repr(self): return ListIteratorRepr(self) - def ll_str(l, listrepr): + def ll_str(self, l): items = l.items length = l.length - item_repr = listrepr.item_repr + item_repr = self.item_repr temp = malloc(TEMP, length) i = 0 while i < length: - temp[i] = item_repr.ll_str(items[i], item_repr) + temp[i] = item_repr.ll_str(items[i]) i += 1 return rstr.ll_strconcat( @@ -187,8 +187,7 @@ length, temp), rstr.list_str_close_bracket)) - ll_str = staticmethod(ll_str) - + class __extend__(pairtype(ListRepr, Repr)): Modified: pypy/dist/pypy/rpython/rmodel.py ============================================================================== --- pypy/dist/pypy/rpython/rmodel.py (original) +++ pypy/dist/pypy/rpython/rmodel.py Mon Sep 19 20:38:52 2005 @@ -129,8 +129,7 @@ raise TyperError("getattr() with a non-constant attribute name") def rtype_str(self, hop): - vrepr = inputconst(Void, self) - return hop.gendirectcall(self.ll_str, hop.args_v[0], vrepr) + return hop.gendirectcall(self.ll_str, hop.args_v[0]) def rtype_nonzero(self, hop): return self.rtype_is_true(hop) # can call a subclass' rtype_is_true() Modified: pypy/dist/pypy/rpython/rpbc.py ============================================================================== --- pypy/dist/pypy/rpython/rpbc.py (original) +++ pypy/dist/pypy/rpython/rpbc.py Mon Sep 19 20:38:52 2005 @@ -282,6 +282,11 @@ def __init__(self, rtyper, s_pbc): self.rtyper = rtyper self.function = s_pbc.prebuiltinstances.keys()[0].im_func + # a hack to force the underlying function to show up in call_families + # (generally not needed, as normalizecalls() should ensure this, + # but needed for bound methods that are ll helpers) + call_families = rtyper.annotator.getpbccallfamilies() + call_families.find((None, self.function)) im_selves = {} for pbc, not_a_classdef in 
s_pbc.prebuiltinstances.items(): if pbc is None: Modified: pypy/dist/pypy/rpython/rstr.py ============================================================================== --- pypy/dist/pypy/rpython/rstr.py (original) +++ pypy/dist/pypy/rpython/rstr.py Mon Sep 19 20:38:52 2005 @@ -203,13 +203,9 @@ hop.exception_is_here() return hop.gendirectcall(ll_int, v_str, v_base) - def ll_str(s, r): - if typeOf(s) == Char: - return ll_chr2str(s) - else: - return s - ll_str = staticmethod(ll_str) - + def ll_str(self, s): + return s + def make_iterator_repr(self): return string_iterator_repr @@ -341,17 +337,16 @@ if isinstance(thing, tuple): code = thing[0] vitem, r_arg = argsiter.next() - rep = inputconst(Void, r_arg) if not hasattr(r_arg, 'll_str'): raise TyperError("ll_str unsupported for: %r" % r_arg) if code == 's' or (code == 'r' and isinstance(r_arg, InstanceRepr)): - vchunk = hop.gendirectcall(r_arg.ll_str, vitem, rep) + vchunk = hop.gendirectcall(r_arg.ll_str, vitem) elif code == 'd': assert isinstance(r_arg, IntegerRepr) - vchunk = hop.gendirectcall(r_arg.ll_str, vitem, rep) + vchunk = hop.gendirectcall(r_arg.ll_str, vitem) elif code == 'f': #assert isinstance(r_arg, FloatRepr) - vchunk = hop.gendirectcall(r_arg.ll_str, vitem, rep) + vchunk = hop.gendirectcall(r_arg.ll_str, vitem) elif code == 'x': assert isinstance(r_arg, IntegerRepr) vchunk = hop.gendirectcall(rint.ll_int2hex, vitem, @@ -399,6 +394,9 @@ def get_ll_hash_function(self): return ll_char_hash + def ll_str(self, ch): + return ll_chr2str(ch) + def rtype_len(_, hop): return hop.inputconst(Signed, 1) Modified: pypy/dist/pypy/rpython/rtyper.py ============================================================================== --- pypy/dist/pypy/rpython/rtyper.py (original) +++ pypy/dist/pypy/rpython/rtyper.py Mon Sep 19 20:38:52 2005 @@ -728,6 +728,10 @@ self.rtyper.call_all_setups() # compute ForwardReferences now dontcare, spec_function = annotate_lowlevel_helper(rtyper.annotator, ll_function, args_s) + # hack for bound methods + if hasattr(ll_function, 'im_func'): + newargs_v.insert(0, inputconst(Void, ll_function.im_self)) + # build the 'direct_call' operation f = self.rtyper.getfunctionptr(spec_function) c = inputconst(typeOf(f), f) From pedronis at codespeak.net Mon Sep 19 21:19:09 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 19 Sep 2005 21:19:09 +0200 (CEST) Subject: [pypy-svn] r17672 - pypy/dist/pypy/rpython Message-ID: <20050919191909.7F43527B6C@code1.codespeak.net> Author: pedronis Date: Mon Sep 19 21:19:08 2005 New Revision: 17672 Modified: pypy/dist/pypy/rpython/rlist.py Log: avoid zeroing twice in alloc_and_set Modified: pypy/dist/pypy/rpython/rlist.py ============================================================================== --- pypy/dist/pypy/rpython/rlist.py (original) +++ pypy/dist/pypy/rpython/rlist.py Mon Sep 19 21:19:08 2005 @@ -811,10 +811,11 @@ l = malloc(LISTPTR.TO) l.length = count l.items = malloc(LISTPTR.TO.items.TO, count) - i = 0 - while i < count: - l.items[i] = item - i += 1 + if item: # as long as malloc it is known to zero the allocated memory avoid zeroing twice + i = 0 + while i < count: + l.items[i] = item + i += 1 return l def rtype_alloc_and_set(hop): From adim at codespeak.net Mon Sep 19 22:23:37 2005 From: adim at codespeak.net (adim at codespeak.net) Date: Mon, 19 Sep 2005 22:23:37 +0200 (CEST) Subject: [pypy-svn] r17673 - pypy/dist/pypy/interpreter/pyparser Message-ID: <20050919202337.244B427B84@code1.codespeak.net> Author: adim Date: Mon Sep 19 22:23:32 2005 New 
Revision: 17673 Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py Log: removed unecessary lineno parameter from build_xxx() functions Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Mon Sep 19 22:23:32 2005 @@ -35,7 +35,7 @@ return (4, tokens[1], None, tokens[3]) else: # case 'except Exception, exc: body' - return (6, tokens[1], to_lvalue(tokens[3], consts.OP_ASSIGN, lineno), tokens[5]) + return (6, tokens[1], to_lvalue(tokens[3], consts.OP_ASSIGN), tokens[5]) def parse_dotted_names(tokens): @@ -103,7 +103,7 @@ break elif cur_token.get_value() == 'for': if len(arguments) != 1: - raise ValueError('SyntaxError("invalid syntax")') # xxx lineno... + raise ValueError('SyntaxError("invalid syntax")') expr = arguments[0] genexpr_for = parse_genexpr_for(tokens[index:]) genexpr_for[0].is_outmost = True @@ -234,7 +234,7 @@ assert isinstance(token, TokenObject) # rtyper info + check if token.get_value() == 'for': index += 1 # skip 'for' - ass_node = to_lvalue(tokens[index], consts.OP_ASSIGN, token.lineno) + ass_node = to_lvalue(tokens[index], consts.OP_ASSIGN) index += 2 # skip 'in' iterable = tokens[index] index += 1 @@ -291,7 +291,7 @@ assert isinstance(token, TokenObject) # rtyper info + check if token.get_value() == 'for': index += 1 # skip 'for' - ass_node = to_lvalue(tokens[index], consts.OP_ASSIGN, token.lineno) + ass_node = to_lvalue(tokens[index], consts.OP_ASSIGN) index += 2 # skip 'in' iterable = tokens[index] index += 1 @@ -334,7 +334,8 @@ return doc -def to_lvalue(ast_node, flags, lineno): +def to_lvalue(ast_node, flags): + lineno = ast_node.lineno if isinstance( ast_node, ast.Name ): return ast.AssName(ast_node.varname, flags, lineno) # return ast.AssName(ast_node.name, flags) @@ -343,14 +344,14 @@ # FIXME: should ast_node.getChildren() but it's not annotable # because of flatten() for node in ast_node.nodes: - nodes.append(to_lvalue(node, flags, lineno)) + nodes.append(to_lvalue(node, flags)) return ast.AssTuple(nodes, lineno) elif isinstance(ast_node, ast.List): nodes = [] # FIXME: should ast_node.getChildren() but it's not annotable # because of flatten() for node in ast_node.nodes: - nodes.append(to_lvalue(node, flags, lineno)) + nodes.append(to_lvalue(node, flags)) return ast.AssList(nodes, lineno) elif isinstance(ast_node, ast.Getattr): expr = ast_node.expr @@ -512,7 +513,7 @@ ## main reason why build_* functions are not methods of the AstBuilder class ## -def build_atom(builder, nb, lineno): +def build_atom(builder, nb): atoms = get_atoms(builder, nb) top = atoms[0] if isinstance(top, TokenObject): @@ -569,12 +570,13 @@ return [] -def build_power(builder, nb, lineno): +def build_power(builder, nb): """power: atom trailer* ['**' factor]""" atoms = get_atoms(builder, nb) if len(atoms) == 1: builder.push(atoms[0]) else: + lineno = atoms[0].lineno token = atoms[-2] if isinstance(token, TokenObject) and token.name == tok.DOUBLESTAR: obj = parse_attraccess(slicecut(atoms, 0, -2)) @@ -583,12 +585,13 @@ obj = parse_attraccess(atoms) builder.push(obj) -def build_factor(builder, nb, lineno): +def build_factor(builder, nb): atoms = get_atoms(builder, nb) if len(atoms) == 1: builder.push( atoms[0] ) elif len(atoms) == 2: token = atoms[0] + lineno = token.lineno if isinstance(token, TokenObject): if token.name == tok.PLUS: builder.push( ast.UnaryAdd( atoms[1], lineno) ) @@ 
-597,7 +600,7 @@ if token.name == tok.TILDE: builder.push( ast.Invert( atoms[1], lineno) ) -def build_term(builder, nb, lineno): +def build_term(builder, nb): atoms = get_atoms(builder, nb) l = len(atoms) left = atoms[0] @@ -612,12 +615,12 @@ elif op_node.name == tok.PERCENT: left = ast.Mod( [ left, right ], left.lineno ) elif op_node.name == tok.DOUBLESLASH: - left = ast.FloorDiv( [ left, right ], lineno ) + left = ast.FloorDiv( [ left, right ], left.lineno ) else: raise TokenError("unexpected token", [atoms[i-1]]) builder.push( left ) -def build_arith_expr(builder, nb, lineno): +def build_arith_expr(builder, nb): atoms = get_atoms(builder, nb) l = len(atoms) left = atoms[0] @@ -633,10 +636,11 @@ raise ValueError("unexpected token", [atoms[i-1]]) builder.push( left ) -def build_shift_expr(builder, nb, lineno): +def build_shift_expr(builder, nb): atoms = get_atoms(builder, nb) l = len(atoms) left = atoms[0] + lineno = left.lineno for i in range(2,l,2): right = atoms[i] op_node = atoms[i-1] @@ -650,7 +654,7 @@ builder.push(left) -def build_binary_expr(builder, nb, OP, lineno): +def build_binary_expr(builder, nb, OP): atoms = get_atoms(builder, nb) l = len(atoms) if l==1: @@ -666,16 +670,16 @@ builder.push(OP(items, lineno)) return -def build_and_expr(builder, nb, lineno): - return build_binary_expr(builder, nb, ast.Bitand, lineno) +def build_and_expr(builder, nb): + return build_binary_expr(builder, nb, ast.Bitand) -def build_xor_expr(builder, nb, lineno): - return build_binary_expr(builder, nb, ast.Bitxor, lineno) +def build_xor_expr(builder, nb): + return build_binary_expr(builder, nb, ast.Bitxor) -def build_expr(builder, nb, lineno): - return build_binary_expr(builder, nb, ast.Bitor, lineno) +def build_expr(builder, nb): + return build_binary_expr(builder, nb, ast.Bitor) -def build_comparison(builder, nb, lineno): +def build_comparison(builder, nb): atoms = get_atoms(builder, nb) l = len(atoms) if l == 1: @@ -695,7 +699,7 @@ ops.append((op_name, atoms[i+1])) builder.push(ast.Compare(atoms[0], ops, atoms[0].lineno)) -def build_comp_op(builder, nb, lineno): +def build_comp_op(builder, nb): """comp_op reducing has 2 different cases: 1. There's only one token to reduce => nothing to do, just re-push it on the stack @@ -716,6 +720,7 @@ # l==2 means 'not in' or 'is not' elif l == 2: token = atoms[0] + lineno = token.lineno assert isinstance(token, TokenObject) if token.get_value() == 'not': builder.push(TokenObject(tok.NAME, 'not in', lineno)) @@ -724,10 +729,10 @@ else: assert False, "TODO" # uh ? 
-def build_and_test(builder, nb, lineno): - return build_binary_expr(builder, nb, ast.And, lineno) +def build_and_test(builder, nb): + return build_binary_expr(builder, nb, ast.And) -def build_not_test(builder, nb, lineno): +def build_not_test(builder, nb): atoms = get_atoms(builder, nb) if len(atoms) == 1: builder.push(atoms[0]) @@ -736,18 +741,20 @@ else: assert False, "not_test implementation incomplete in not_test" -def build_test(builder, nb, lineno): - return build_binary_expr(builder, nb, ast.Or, lineno) +def build_test(builder, nb): + return build_binary_expr(builder, nb, ast.Or) -def build_testlist(builder, nb, lineno): - return build_binary_expr(builder, nb, ast.Tuple, lineno) +def build_testlist(builder, nb): + return build_binary_expr(builder, nb, ast.Tuple) -def build_expr_stmt(builder, nb, lineno): +def build_expr_stmt(builder, nb): """expr_stmt: testlist (augassign testlist | ('=' testlist)*) """ atoms = get_atoms(builder, nb) if atoms: lineno = atoms[0].lineno + else: + lineno = -1 l = len(atoms) if l==1: builder.push(ast.Discard(atoms[0], lineno)) @@ -757,7 +764,7 @@ if op.name == tok.EQUAL: nodes = [] for i in range(0,l-2,2): - lvalue = to_lvalue(atoms[i], consts.OP_ASSIGN, op.lineno) + lvalue = to_lvalue(atoms[i], consts.OP_ASSIGN) nodes.append(lvalue) rvalue = atoms[-1] builder.push( ast.Assign(nodes, rvalue, lineno) ) @@ -765,45 +772,52 @@ else: assert l==3 lvalue = atoms[0] - if isinstance(lvalue, (ast.GenExpr, ast.Tuple)): + if isinstance(lvalue, ast.GenExpr) or isinstance(lvalue, ast.Tuple): raise ParseError("augmented assign to tuple literal or generator expression not possible", lineno, 0, "") assert isinstance(op, TokenObject) builder.push(ast.AugAssign(lvalue, op.get_name(), atoms[2], lineno)) -def return_one(builder, nb, lineno): +def return_one(builder, nb): atoms = get_atoms(builder, nb) l = len(atoms) assert l == 1, "missing one node in stack" builder.push( atoms[0] ) return -def build_simple_stmt(builder, nb, lineno): +def build_simple_stmt(builder, nb): atoms = get_atoms(builder, nb) l = len(atoms) nodes = [] + if atoms: + lineno = atoms[0].lineno + else: + lineno = -1 for n in range(0,l,2): node = atoms[n] if isinstance(node, TokenObject) and node.name == tok.NEWLINE: - nodes.append(ast.Discard(ast.Const(builder.wrap_none()), lineno)) + nodes.append(ast.Discard(ast.Const(builder.wrap_none()), node.lineno)) else: nodes.append(node) builder.push(ast.Stmt(nodes, lineno)) -def build_return_stmt(builder, nb, lineno): +def build_return_stmt(builder, nb): atoms = get_atoms(builder, nb) + lineno = atoms[0].lineno if len(atoms) > 2: assert False, "return several stmts not implemented" elif len(atoms) == 1: builder.push(ast.Return(ast.Const(builder.wrap_none(), lineno), lineno)) else: - builder.push(ast.Return(atoms[1], atoms[0].lineno)) + builder.push(ast.Return(atoms[1], lineno)) -def build_file_input(builder, nb, lineno): +def build_file_input(builder, nb): stmts = [] atoms = get_atoms(builder, nb) if atoms: lineno = atoms[0].lineno + else: + lineno = -1 for node in atoms: if isinstance(node, ast.Stmt): stmts.extend(node.nodes) @@ -814,34 +828,36 @@ continue else: stmts.append(node) - main_stmt = ast.Stmt(stmts) + main_stmt = ast.Stmt(stmts, lineno) doc = get_docstring(builder,main_stmt) return builder.push(ast.Module(doc, main_stmt, lineno)) -def build_eval_input(builder, nb, lineno): +def build_eval_input(builder, nb): doc = builder.wrap_none() stmts = [] atoms = get_atoms(builder, nb) assert len(atoms)>=1 return builder.push(ast.Expression(atoms[0])) -def 
build_single_input(builder, nb, lineno): +def build_single_input(builder, nb): atoms = get_atoms(builder, nb) l = len(atoms) if l == 1 or l==2: atom0 = atoms[0] if isinstance(atom0, TokenObject) and atom0.name == tok.NEWLINE: - atom0 = ast.Pass(lineno) + atom0 = ast.Pass(atom0.lineno) elif not isinstance(atom0, ast.Stmt): - atom0 = ast.Stmt([atom0], lineno) + atom0 = ast.Stmt([atom0], atom0.lineno) builder.push(ast.Module(builder.wrap_none(), atom0, atom0.lineno)) else: assert False, "Forbidden path" -def build_testlist_gexp(builder, nb, lineno): +def build_testlist_gexp(builder, nb): atoms = get_atoms(builder, nb) if atoms: lineno = atoms[0].lineno + else: + lineno = -1 l = len(atoms) if l == 1: builder.push(atoms[0]) @@ -862,15 +878,16 @@ builder.push(ast.Tuple(items, lineno)) return -def build_lambdef(builder, nb, lineno): +def build_lambdef(builder, nb): """lambdef: 'lambda' [varargslist] ':' test""" atoms = get_atoms(builder, nb) + lineno = atoms[0].lineno code = atoms[-1] names, defaults, flags = parse_arglist(slicecut(atoms, 1, -2)) builder.push(ast.Lambda(names, defaults, flags, code, lineno)) -def build_trailer(builder, nb, lineno): +def build_trailer(builder, nb): """trailer: '(' ')' | '(' arglist ')' | '[' subscriptlist ']' | '.' NAME """ atoms = get_atoms(builder, nb) @@ -898,7 +915,7 @@ else: assert False, "Trailer reducing implementation incomplete !" -def build_arglist(builder, nb, lineno): +def build_arglist(builder, nb): """ arglist: (argument ',')* ( '*' test [',' '**' test] | '**' test | @@ -909,13 +926,16 @@ arguments, stararg, dstararg = parse_argument(atoms) if atoms: lineno = atoms[0].lineno + else: + lineno = -1 builder.push(ArglistObject(arguments, stararg, dstararg, lineno)) -def build_subscript(builder, nb, lineno): +def build_subscript(builder, nb): """'.' '.' '.' 
| [test] ':' [test] [':' [test]] | test""" atoms = get_atoms(builder, nb) token = atoms[0] + lineno = token.lineno if isinstance(token, TokenObject) and token.name == tok.DOT: # Ellipsis: builder.push(ast.Ellipsis(lineno)) @@ -959,7 +979,7 @@ builder.push(SubscriptObject('subscript', items, lineno)) -def build_listmaker(builder, nb, lineno): +def build_listmaker(builder, nb): """listmaker: test ( list_for | (',' test)* [','] )""" atoms = get_atoms(builder, nb) if len(atoms) >= 2: @@ -980,10 +1000,12 @@ index += 2 # skip comas if atoms: lineno = atoms[0].lineno + else: + lineno = -1 builder.push(ast.List(nodes, lineno)) -def build_decorator(builder, nb, lineno): +def build_decorator(builder, nb): """decorator: '@' dotted_name [ '(' [arglist] ')' ] NEWLINE""" atoms = get_atoms(builder, nb) nodes = [] @@ -998,13 +1020,14 @@ obj = parse_attraccess(nodes) builder.push(obj) -def build_funcdef(builder, nb, lineno): +def build_funcdef(builder, nb): """funcdef: [decorators] 'def' NAME parameters ':' suite """ atoms = get_atoms(builder, nb) index = 0 decorators = [] decorator_node = None + lineno = atoms[0].lineno # the original loop was: # while not (isinstance(atoms[index], TokenObject) and atoms[index].get_value() == 'def'): # decorators.append(atoms[index]) @@ -1036,7 +1059,7 @@ builder.push(ast.Function(decorator_node, funcname, names, default, flags, doc, code, lineno)) -def build_classdef(builder, nb, lineno): +def build_classdef(builder, nb): """classdef: 'class' NAME ['(' testlist ')'] ':' suite""" atoms = get_atoms(builder, nb) lineno = atoms[0].lineno @@ -1060,7 +1083,7 @@ doc = get_docstring(builder,body) builder.push(ast.Class(classname, basenames, doc, body, lineno)) -def build_suite(builder, nb, lineno): +def build_suite(builder, nb): """suite: simple_stmt | NEWLINE INDENT stmt+ DEDENT""" atoms = get_atoms(builder, nb) if len(atoms) == 1: @@ -1083,7 +1106,7 @@ builder.push(ast.Stmt(stmts, atoms[0].lineno)) -def build_if_stmt(builder, nb, lineno): +def build_if_stmt(builder, nb): """ if_stmt: 'if' test ':' suite ('elif' test ':' suite)* ['else' ':' suite] """ @@ -1103,26 +1126,26 @@ break # break is not necessary builder.push(ast.If(tests, else_, atoms[0].lineno)) -def build_pass_stmt(builder, nb, lineno): +def build_pass_stmt(builder, nb): """past_stmt: 'pass'""" atoms = get_atoms(builder, nb) assert len(atoms) == 1 - builder.push(ast.Pass(lineno)) + builder.push(ast.Pass(atoms[0].lineno)) -def build_break_stmt(builder, nb, lineno): +def build_break_stmt(builder, nb): """past_stmt: 'pass'""" atoms = get_atoms(builder, nb) assert len(atoms) == 1 - builder.push(ast.Break(lineno)) + builder.push(ast.Break(atoms[0].lineno)) -def build_for_stmt(builder, nb, lineno): +def build_for_stmt(builder, nb): """for_stmt: 'for' exprlist 'in' testlist ':' suite ['else' ':' suite]""" atoms = get_atoms(builder, nb) else_ = None # skip 'for' - assign = to_lvalue(atoms[1], consts.OP_ASSIGN, atoms[0].lineno) + assign = to_lvalue(atoms[1], consts.OP_ASSIGN) # skip 'in' iterable = atoms[3] # skip ':' @@ -1133,7 +1156,7 @@ else_ = atoms[8] builder.push(ast.For(assign, iterable, body, else_, atoms[0].lineno)) -def build_exprlist(builder, nb, lineno): +def build_exprlist(builder, nb): """exprlist: expr (',' expr)* [',']""" atoms = get_atoms(builder, nb) if len(atoms) <= 2: @@ -1145,7 +1168,7 @@ builder.push(ast.Tuple(names, atoms[0].lineno)) -def build_while_stmt(builder, nb, lineno): +def build_while_stmt(builder, nb): """while_stmt: 'while' test ':' suite ['else' ':' suite]""" atoms = get_atoms(builder, nb) 
else_ = None @@ -1160,7 +1183,7 @@ builder.push(ast.While(test, body, else_, atoms[0].lineno)) -def build_import_name(builder, nb, lineno): +def build_import_name(builder, nb): """import_name: 'import' dotted_as_names dotted_as_names: dotted_as_name (',' dotted_as_name)* @@ -1201,10 +1224,10 @@ ## atoms[index].name != tok.COMMA: ## index += 1 index += 1 - builder.push(ast.Import(names, lineno)) + builder.push(ast.Import(names, atoms[0].lineno)) -def build_import_from(builder, nb, lineno): +def build_import_from(builder, nb): """ import_from: 'from' dotted_name 'import' ('*' | '(' import_as_names ')' | import_as_names) @@ -1245,23 +1268,23 @@ names.append((name, as_name)) if index < l: # case ',' index += 1 - builder.push(ast.From(from_name, names, lineno)) + builder.push(ast.From(from_name, names, atoms[0].lineno)) -def build_yield_stmt(builder, nb, lineno): +def build_yield_stmt(builder, nb): atoms = get_atoms(builder, nb) - builder.push(ast.Yield(atoms[1], lineno)) + builder.push(ast.Yield(atoms[1], atoms[0].lineno)) -def build_continue_stmt(builder, nb, lineno): +def build_continue_stmt(builder, nb): atoms = get_atoms(builder, nb) - builder.push(ast.Continue(lineno)) + builder.push(ast.Continue(atoms[0].lineno)) -def build_del_stmt(builder, nb, lineno): +def build_del_stmt(builder, nb): atoms = get_atoms(builder, nb) - builder.push(to_lvalue(atoms[1], consts.OP_DELETE, lineno)) + builder.push(to_lvalue(atoms[1], consts.OP_DELETE)) -def build_assert_stmt(builder, nb, lineno): +def build_assert_stmt(builder, nb): """assert_stmt: 'assert' test [',' test]""" atoms = get_atoms(builder, nb) test = atoms[1] @@ -1271,7 +1294,7 @@ fail = None builder.push(ast.Assert(test, fail, atoms[0].lineno)) -def build_exec_stmt(builder, nb, lineno): +def build_exec_stmt(builder, nb): """exec_stmt: 'exec' expr ['in' test [',' test]]""" atoms = get_atoms(builder, nb) expr = atoms[1] @@ -1283,7 +1306,7 @@ glob = atoms[5] builder.push(ast.Exec(expr, loc, glob, atoms[0].lineno)) -def build_print_stmt(builder, nb, lineno): +def build_print_stmt(builder, nb): """ print_stmt: 'print' ( '>>' test [ (',' test)+ [','] ] | [ test (',' test)* [','] ] ) """ @@ -1306,7 +1329,7 @@ else: builder.push(ast.Printnl(items, dest, atoms[0].lineno)) -def build_global_stmt(builder, nb, lineno): +def build_global_stmt(builder, nb): """global_stmt: 'global' NAME (',' NAME)*""" atoms = get_atoms(builder, nb) names = [] @@ -1314,10 +1337,10 @@ token = atoms[index] assert isinstance(token, TokenObject) names.append(token.get_value()) - builder.push(ast.Global(names, lineno)) + builder.push(ast.Global(names, atoms[0].lineno)) -def build_raise_stmt(builder, nb, lineno): +def build_raise_stmt(builder, nb): """raise_stmt: 'raise' [test [',' test [',' test]]]""" atoms = get_atoms(builder, nb) l = len(atoms) @@ -1332,7 +1355,7 @@ expr3 = atoms[5] builder.push(ast.Raise(expr1, expr2, expr3, atoms[0].lineno)) -def build_try_stmt(builder, nb, lineno): +def build_try_stmt(builder, nb): """ try_stmt: ('try' ':' suite (except_clause ':' suite)+ #diagram:break ['else' ':' suite] | 'try' ':' suite 'finally' ':' suite) @@ -1591,7 +1614,7 @@ ## print "ALT:", sym.sym_name[rule.codename], self.rule_stack builder_func = ASTRULES.get(rule.codename, None) if builder_func: - builder_func(self, 1, source._lineno) + builder_func(self, 1) else: ## if DEBUG_MODE: ## print "No reducing implementation for %s, just push it on stack" % ( @@ -1613,7 +1636,7 @@ builder_func = ASTRULES.get(rule.codename, None) if builder_func: # print "REDUCING SEQUENCE %s" % 
sym.sym_name[rule.codename] - builder_func(self, elts_number, source._lineno) + builder_func(self, elts_number) else: ## if DEBUG_MODE: ## print "No reducing implementation for %s, just push it on stack" % ( From ericvrp at codespeak.net Mon Sep 19 22:36:41 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Mon, 19 Sep 2005 22:36:41 +0200 (CEST) Subject: [pypy-svn] r17674 - pypy/dist/pypy/translator/backendopt Message-ID: <20050919203641.A277827B84@code1.codespeak.net> Author: ericvrp Date: Mon Sep 19 22:36:40 2005 New Revision: 17674 Modified: pypy/dist/pypy/translator/backendopt/exception.py Log: Still work in progress. Now correctly splitting blocks after a call plus inserting some low level code for detecting if an exception has occurred. If not, we continue as usual. If so , return an identity (noresult) value. Modified: pypy/dist/pypy/translator/backendopt/exception.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/exception.py (original) +++ pypy/dist/pypy/translator/backendopt/exception.py Mon Sep 19 22:36:40 2005 @@ -1,5 +1,8 @@ from pypy.translator.unsimplify import split_block -from pypy.objspace.flow.model import Block, flatten +from pypy.objspace.flow.model import Block, Constant, Variable, Link, \ + last_exception, flatten, SpaceOperation +from pypy.annotation import model as annmodel +from pypy.rpython.lltype import Bool, Ptr def create_exception_handling(translator, graph): @@ -9,9 +12,43 @@ from the current graph with an unused value (false/0/0.0/null). Because of the added exitswitch we need an additional block. """ + e = translator.rtyper.getexceptiondata() blocks = [x for x in flatten(graph) if isinstance(x, Block)] for block in blocks: - for i in range(len(block.operations)-1, -1, -1): + last_operation = len(block.operations)-1 + if block.exitswitch == Constant(last_exception): + last_operation -= 1 + for i in range(last_operation, -1, -1): op = block.operations[i] - if op.opname == 'direct_call': - split_block(translator, graph, block, i) + if op.opname != 'direct_call': + continue + called_can_raise = True #XXX maybe we even want a list of possible exceptions + if not called_can_raise: + continue + + afterblock = split_block(translator, graph, block, i+1) + + res = Variable() + res.concretetype = Bool + translator.annotator.bindings[res] = annmodel.SomeBool() + + etype = Variable('etype') + etype.concretetype = e.lltype_of_exception_type + translator.annotator.bindings[etype] = e.lltype_of_exception_type + + #XXX better use 'load()' and instantiate '%last_exception_type' (here maybe?) 
+ block.operations.append(SpaceOperation("last_exception_type_ptr", [], etype)) + block.operations.append(SpaceOperation("ptr_iszero", [etype], res)) + + block.exitswitch = res + + #non-exception case + block.exits[0].exitcase = block.exits[0].llexitcase = True + + #exception occurred case + noresulttype = graph.returnblock.inputargs[0].concretetype + noresult = Constant(noresulttype._defl(), noresulttype) + l = Link([noresult], graph.returnblock) + l.prevblock = block + l.exitcase = l.llexitcase = False + block.exits.insert(0, l) #False case needs to go first From tismer at codespeak.net Tue Sep 20 04:40:46 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Tue, 20 Sep 2005 04:40:46 +0200 (CEST) Subject: [pypy-svn] r17675 - pypy/dist/pypy/translator/goal Message-ID: <20050920024046.B527C27B80@code1.codespeak.net> Author: tismer Date: Tue Sep 20 04:40:45 2005 New Revision: 17675 Modified: pypy/dist/pypy/translator/goal/bench-windows.py Log: executable abs.richards abs.pystone rel.rich rel.pystone pypy-c-17439 35165 ms 668.586 42.4 61.1 pypy-c-17600 26388 ms 900.481 31.8 45.4 pypy-c-17634 20108 ms 1017.720 24.2 40.1 pypy-c-17649 22662 ms 1035.910 27.3 39.4 pypy-c-17674-nolowmem 15817 ms 1981.470 19.1 20.6 pypy-c-17674-t-lowmem 16834 ms 1274.650 20.3 32.1 python 2.3.3 830 ms 40861.400 1.0 1.0 17649 was with explicit fixed stack. Changes after 17634 we not included. 17674 has an outrageous effect. I cannot really find out what it was. Did Armin do the fixed stack patch already? Probably not. Was it Samuele's avoiding of duplicate zeroing? Really just that? I think so, and this is incredible. Even more incredible is the fact that not using using t-lowmem accelerates pystone so much. This is an indicator that we missed something used in pystone that still contains applevel code. I can't believe it, will find it tomorrow. Modified: pypy/dist/pypy/translator/goal/bench-windows.py ============================================================================== --- pypy/dist/pypy/translator/goal/bench-windows.py (original) +++ pypy/dist/pypy/translator/goal/bench-windows.py Tue Sep 20 04:40:45 2005 @@ -3,20 +3,26 @@ # where a couple of .exe files is expected. current_result = """ -executable abs.richards abs.pystone rel.richards rel.pystone -pypy-c-17439 35180 ms 661.339 41.9 59.7 -pypy-c-17512 46007 ms 659.205 54.8 59.9 -pypy-c-17516 37944 ms 704.839 45.2 56.0 -pypy-c-17545-intern 34309 ms 764.987 40.8 51.6 -pypy-c-17572 36061 ms 736.094 42.9 53.7 -pypy-c-17600 26348 ms 901.957 31.4 43.8 -pypy-c-17623-32_4 24734 ms 970.845 29.4 40.7 -pypy-c-17634 20088 ms 1018.240 23.9 38.8 -pypy-c-17649 22902 ms 1018.300 27.3 38.8 -python 2.3.3 840 ms 39500.600 1.0 1.0 +executable abs.richards abs.pystone rel.rich rel.pystone +pypy-c-17439 35165 ms 668.586 42.4 61.1 +pypy-c-17600 26388 ms 900.481 31.8 45.4 +pypy-c-17634 20108 ms 1017.720 24.2 40.1 +pypy-c-17649 22662 ms 1035.910 27.3 39.4 +pypy-c-17674-nolowmem 15817 ms 1981.470 19.1 20.6 +pypy-c-17674-t-lowmem 16834 ms 1274.650 20.3 32.1 +python 2.3.3 830 ms 40861.400 1.0 1.0 17649 was with explicit fixed stack. Changes after 17634 we not included. +17674 has an outrageous effect. I cannot really +find out what it was. Did Armin do the fixed stack +patch already? Probably not. Was it Samuele's avoiding +of duplicate zeroing? Really just that? I think so, and +this is incredible. +Even more incredible is the fact that not using using +t-lowmem accelerates pystone so much. 
This is an indicator +that we missed something used in pystone that still contains +applevel code. I can't believe it, will find it tomorrow. """ import os, sys @@ -48,7 +54,7 @@ print res return res -def run_richards(executable='python', n=10): +def run_richards(executable='python', n=20): argstr = RICHARDS_CMD % n txt = run_cmd('%s -c "%s"' % (executable, argstr)) res = get_result(txt, RICHARDS_PATTERN) @@ -60,18 +66,10 @@ exes.sort() return exes -LAYOUT = ''' -executable abs.richards abs.pystone rel.richards rel.pystone -pypy-c-17439 40929 ms 637.274 47.8 56.6 -pypy-c-17512 46105 ms 658.1 53.9 54.8 -pypy-current 33937 ms 698.415 39.6 51.7 -python 2.3.3 856 ms 36081.6 1.0 1.0 -''' - HEADLINE = '''\ -executable abs.richards abs.pystone rel.richards rel.pystone''' +executable abs.richards abs.pystone rel.rich rel.pystone''' FMT = '''\ -%-20s ''' + '%5d ms %9.3f ' + '%5.1f %5.1f' +%-27s ''' + '%5d ms %9.3f ' + '%5.1f %5.1f' def main(): print 'getting the richards reference' @@ -81,7 +79,7 @@ res = [] for exe in get_executables(): exename = os.path.splitext(exe)[0] - res.append( (exename, run_richards(exe, 1), run_pystone(exe, 2000)) ) + res.append( (exename, run_richards(exe, 2), run_pystone(exe, 20000)) ) res.append( ('python %s' % sys.version.split()[0], ref_rich, ref_stone) ) print HEADLINE for exe, rich, stone in res: From ericvrp at codespeak.net Tue Sep 20 08:55:27 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Tue, 20 Sep 2005 08:55:27 +0200 (CEST) Subject: [pypy-svn] r17676 - pypy/dist/pypy/translator/goal Message-ID: <20050920065527.DBD4C27B86@code1.codespeak.net> Author: ericvrp Date: Tue Sep 20 08:55:26 2005 New Revision: 17676 Modified: pypy/dist/pypy/translator/goal/bench-unix.py Log: Search for pypy-* (pypy with any backend) instead pypy-c* Modified: pypy/dist/pypy/translator/goal/bench-unix.py ============================================================================== --- pypy/dist/pypy/translator/goal/bench-unix.py (original) +++ pypy/dist/pypy/translator/goal/bench-unix.py Tue Sep 20 08:55:26 2005 @@ -1,6 +1,6 @@ # benchmarks on a unix machine. # to be executed in the goal folder, -# where a couple of pypy-c* files is expected. +# where a couple of pyp-c* files is expected. 
import os, sys @@ -39,7 +39,7 @@ return res def get_executables(): - exes = [os.path.join('.', name) for name in os.listdir('.') if name.startswith('pypy-c')] + exes = [os.path.join('.', name) for name in os.listdir('.') if name.startswith('pypy-')] exes.sort() return exes From ericvrp at codespeak.net Tue Sep 20 08:56:11 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Tue, 20 Sep 2005 08:56:11 +0200 (CEST) Subject: [pypy-svn] r17677 - pypy/dist/pypy/translator/goal Message-ID: <20050920065611.DC66127B86@code1.codespeak.net> Author: ericvrp Date: Tue Sep 20 08:56:11 2005 New Revision: 17677 Modified: pypy/dist/pypy/translator/goal/run_pypy-llvm.sh Log: Now using translate_pypy_new.py Modified: pypy/dist/pypy/translator/goal/run_pypy-llvm.sh ============================================================================== --- pypy/dist/pypy/translator/goal/run_pypy-llvm.sh (original) +++ pypy/dist/pypy/translator/goal/run_pypy-llvm.sh Tue Sep 20 08:56:11 2005 @@ -3,8 +3,8 @@ # stopping on the first error #python translate_pypy.py -no-c -no-o -text -fork2 # running it all -python translate_pypy.py target_pypy-llvm -text -llvm $* -#python translate_pypy_new.py targetpypystandalone --backend=llvm --gc=boehm --pygame $* +#python translate_pypy.py target_pypy-llvm -text -llvm $* +python translate_pypy_new.py targetpypystandalone --backend=llvm --gc=boehm --pygame --batch --fork=fork2 --lowmem $* # How to work in parallel: From ac at codespeak.net Tue Sep 20 11:55:48 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Tue, 20 Sep 2005 11:55:48 +0200 (CEST) Subject: [pypy-svn] r17682 - in pypy/dist/pypy/interpreter: astcompiler test Message-ID: <20050920095548.A510227B80@code1.codespeak.net> Author: ac Date: Tue Sep 20 11:55:48 2005 New Revision: 17682 Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py pypy/dist/pypy/interpreter/test/test_compiler.py Log: Detect manipulation of None and __debug__ Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Tue Sep 20 11:55:48 2005 @@ -214,8 +214,9 @@ raise RuntimeError, "should be implemented by subclasses" # Next five methods handle name access - def storeName(self, name): + if name in ('None', '__debug__'): + raise SyntaxError('assignment to %s is not allowed' % name) self._nameOp('STORE', name) def loadName(self, name): @@ -228,6 +229,8 @@ self._nameOp('LOAD', name) def delName(self, name): + if name in ('None', '__debug__'): + raise SyntaxError('deleting %s is not allowed' % name) scope = self.scope.check_name(name) if scope == SC_CELL: raise SyntaxError("can not delete variable '%s' " @@ -895,8 +898,12 @@ def visitAssAttr(self, node): node.expr.accept( self ) if node.flags == 'OP_ASSIGN': + if node.attrname == 'None': + raise SyntaxError('assignment to None is not allowed') self.emitop('STORE_ATTR', self.mangle(node.attrname)) elif node.flags == 'OP_DELETE': + if node.attrname == 'None': + raise SyntaxError('deleting None is not allowed') self.emitop('DELETE_ATTR', self.mangle(node.attrname)) else: assert False, "visitAssAttr unexpected flags: %s" % node.flags @@ -1234,6 +1241,8 @@ if name in argnames: raise SyntaxError("duplicate argument '%s' in function definition" % arg.name) argnames[name] = 1 + if 'None' in argnames: + raise SyntaxError('assignment to None is not allowed') args, hasTupleArg = 
generateArgList(func.argnames) @@ -1279,7 +1288,7 @@ for elt in tup.nodes: if isinstance(elt, ast.AssName): - self._nameOp('STORE', elt.name) + self.storeName(elt.name) elif isinstance(elt, ast.AssTuple): self.unpackSequence( elt ) else: Modified: pypy/dist/pypy/interpreter/test/test_compiler.py ============================================================================== --- pypy/dist/pypy/interpreter/test/test_compiler.py (original) +++ pypy/dist/pypy/interpreter/test/test_compiler.py Tue Sep 20 11:55:48 2005 @@ -5,7 +5,6 @@ from pypy.interpreter.pycode import PyCode from pypy.interpreter.error import OperationError - class BaseTestCompiler: def setup_method(self, method): self.compiler = self.space.createcompiler() @@ -200,6 +199,33 @@ ex = e.value ex.normalize_exception(self.space) assert ex.match(self.space, self.space.w_SyntaxError) + + def test_debug_assignment(self): + code = '__debug__ = 1' + e = py.test.raises(OperationError, self.compiler.compile, code, '', 'single', 0) + ex = e.value + ex.normalize_exception(self.space) + assert ex.match(self.space, self.space.w_SyntaxError) + + def test_none_assignment(self): + stmts = [ + 'None = 0', + 'None += 0', + '__builtins__.None = 0', + 'def None(): pass', + 'class None: pass', + '(a, None) = 0, 0', + 'for None in range(10): pass', + 'def f(None): pass', + ] + for stmt in stmts: + stmt += '\n' + for kind in 'single', 'exec': + e = py.test.raises(OperationError, self.compiler.compile, stmt, + '', kind, 0) + ex = e.value + ex.normalize_exception(self.space) + assert ex.match(self.space, self.space.w_SyntaxError) class TestECCompiler(BaseTestCompiler): def setup_method(self, method): From ac at codespeak.net Tue Sep 20 14:10:16 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Tue, 20 Sep 2005 14:10:16 +0200 (CEST) Subject: [pypy-svn] r17688 - in pypy/dist/pypy/interpreter: pyparser pyparser/data test Message-ID: <20050920121016.554BF27B5B@code1.codespeak.net> Author: ac Date: Tue Sep 20 14:10:15 2005 New Revision: 17688 Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py pypy/dist/pypy/interpreter/pyparser/data/Grammar2.4 pypy/dist/pypy/interpreter/test/test_compiler.py Log: Catch a problem with multi-line imports without (). 
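(Illustration only, not part of the checkin: with the corrected grammar a line break inside a from-import is accepted only between parentheses. A few of the forms the new test below exercises:)

    succeed = [
        'from sys import (stdin, stderr,\nstdout)',             # () allow the line break
        'from __future__ import (nested_scopes,\ngenerators,)',
    ]
    fail = [
        'from __future__ import nested_scopes,\ngenerators',    # no parentheses: SyntaxError
        'from sys import (stdin',                                # unbalanced parenthesis
    ]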
Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Tue Sep 20 14:10:15 2005 @@ -1229,9 +1229,9 @@ def build_import_from(builder, nb): """ - import_from: 'from' dotted_name 'import' ('*' | '(' import_as_names ')' | import_as_names) + import_from: 'from' dotted_name 'import' ('*' | '(' import_as_names [','] ')' | import_as_names) - import_as_names: import_as_name (',' import_as_name)* [','] + import_as_names: import_as_name (',' import_as_name)* import_as_name: NAME [NAME NAME] """ atoms = get_atoms(builder, nb) Modified: pypy/dist/pypy/interpreter/pyparser/data/Grammar2.4 ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/data/Grammar2.4 (original) +++ pypy/dist/pypy/interpreter/pyparser/data/Grammar2.4 Tue Sep 20 14:10:15 2005 @@ -53,10 +53,10 @@ raise_stmt: 'raise' [test [',' test [',' test]]] import_stmt: import_name | import_from import_name: 'import' dotted_as_names -import_from: 'from' dotted_name 'import' ('*' | '(' import_as_names ')' | import_as_names) +import_from: 'from' dotted_name 'import' ('*' | '(' import_as_names [','] ')' | import_as_names) import_as_name: NAME [NAME NAME] dotted_as_name: dotted_name [NAME NAME] -import_as_names: import_as_name (',' import_as_name)* [','] +import_as_names: import_as_name (',' import_as_name)* dotted_as_names: dotted_as_name (',' dotted_as_name)* dotted_name: NAME ('.' NAME)* global_stmt: 'global' NAME (',' NAME)* Modified: pypy/dist/pypy/interpreter/test/test_compiler.py ============================================================================== --- pypy/dist/pypy/interpreter/test/test_compiler.py (original) +++ pypy/dist/pypy/interpreter/test/test_compiler.py Tue Sep 20 14:10:15 2005 @@ -227,6 +227,51 @@ ex.normalize_exception(self.space) assert ex.match(self.space, self.space.w_SyntaxError) + def test_import(self): + succeed = [ + 'import sys', + 'import os, sys', + 'from __future__ import nested_scopes, generators', + 'from __future__ import (nested_scopes,\ngenerators)', + 'from __future__ import (nested_scopes,\ngenerators,)', + 'from sys import stdin, stderr, stdout', + 'from sys import (stdin, stderr,\nstdout)', + 'from sys import (stdin, stderr,\nstdout,)', + 'from sys import (stdin\n, stderr, stdout)', + 'from sys import (stdin\n, stderr, stdout,)', + 'from sys import stdin as si, stdout as so, stderr as se', + 'from sys import (stdin as si, stdout as so, stderr as se)', + 'from sys import (stdin as si, stdout as so, stderr as se,)', + ] + fail = [ + 'import (os, sys)', + 'import (os), (sys)', + 'import ((os), (sys))', + 'import (sys', + 'import sys)', + 'import (os,)', + 'from (sys) import stdin', + 'from __future__ import (nested_scopes', + 'from __future__ import nested_scopes)', + 'from __future__ import nested_scopes,\ngenerators', + 'from sys import (stdin', + 'from sys import stdin)', + 'from sys import stdin, stdout,\nstderr', + 'from sys import stdin si', + 'from sys import stdin,' + 'from sys import (*)', + 'from sys import (stdin,, stdout, stderr)', + 'from sys import (stdin, stdout),', + ] + for stmt in succeed: + self.compiler.compile(stmt, 'tmp', 'exec', 0) + for stmt in fail: + e = py.test.raises(OperationError, self.compiler.compile, + stmt, 'tmp', 'exec', 0) + ex = e.value + ex.normalize_exception(self.space) + assert ex.match(self.space, 
self.space.w_SyntaxError) + class TestECCompiler(BaseTestCompiler): def setup_method(self, method): self.compiler = self.space.getexecutioncontext().compiler From arigo at codespeak.net Tue Sep 20 14:10:29 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 20 Sep 2005 14:10:29 +0200 (CEST) Subject: [pypy-svn] r17689 - pypy/dist/pypy/objspace/std Message-ID: <20050920121029.4EEB027B70@code1.codespeak.net> Author: arigo Date: Tue Sep 20 14:10:27 2005 New Revision: 17689 Modified: pypy/dist/pypy/objspace/std/intobject.py Log: Agreement on IRC that it doesn't make much sense to stick this close to CPython's error messages when we can easily provide more precise ones. Modified: pypy/dist/pypy/objspace/std/intobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/intobject.py (original) +++ pypy/dist/pypy/objspace/std/intobject.py Tue Sep 20 14:10:27 2005 @@ -147,7 +147,7 @@ z = ovfcheck(x // y) except ZeroDivisionError: raise OperationError(space.w_ZeroDivisionError, - space.wrap("integer division or modulo by zero")) + space.wrap("integer division by zero")) except OverflowError: raise FailedToImplement(space.w_OverflowError, space.wrap("integer division")) @@ -168,7 +168,7 @@ z = ovfcheck(x % y) except ZeroDivisionError: raise OperationError(space.w_ZeroDivisionError, - space.wrap("integer division or modulo by zero")) + space.wrap("integer modulo by zero")) except OverflowError: raise FailedToImplement(space.w_OverflowError, space.wrap("integer modulo")) @@ -181,7 +181,7 @@ z = ovfcheck(x // y) except ZeroDivisionError: raise OperationError(space.w_ZeroDivisionError, - space.wrap("integer division or modulo by zero")) + space.wrap("integer divmod by zero")) except OverflowError: raise FailedToImplement(space.w_OverflowError, space.wrap("integer modulo")) From arigo at codespeak.net Tue Sep 20 14:21:26 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 20 Sep 2005 14:21:26 +0200 (CEST) Subject: [pypy-svn] r17690 - in pypy/dist/pypy/rpython: . test Message-ID: <20050920122126.99DC027B5B@code1.codespeak.net> Author: arigo Date: Tue Sep 20 14:21:23 2005 New Revision: 17690 Modified: pypy/dist/pypy/rpython/rstr.py pypy/dist/pypy/rpython/test/test_rstr.py Log: * A version of find/rfind optimized for searching for single characters. * Implemented ch.isdigit(), isalpha() etc., in addition to isspace() that we already had. Next step: fix objspace/std/stringobject.py. 
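(Sketch, illustrative only, not from the checkin: the kind of RPython code this enables. The character methods go through the new rtype_method_is*() entries, and a find()/rfind() whose argument is a single character is dispatched to the new ll_find_char()/ll_rfind_char() helpers instead of the full substring search.)

    def first_digit_position(s):
        for i in range(len(s)):
            if s[i].isdigit():      # char method, rtyped via ll_char_isdigit()
                return i
        return s.find('.')          # one-character pattern -> ll_find_char()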
Modified: pypy/dist/pypy/rpython/rstr.py ============================================================================== --- pypy/dist/pypy/rpython/rstr.py (original) +++ pypy/dist/pypy/rpython/rstr.py Tue Sep 20 14:21:23 2005 @@ -114,7 +114,12 @@ def rtype_method_find(_, hop, reverse=False): v_str = hop.inputarg(string_repr, arg=0) - v_value = hop.inputarg(string_repr, arg=1) + if hop.args_r[1] == char_repr: + v_value = hop.inputarg(char_repr, arg=1) + llfn = reverse and ll_rfind_char or ll_find_char + else: + v_value = hop.inputarg(string_repr, arg=1) + llfn = reverse and ll_rfind or ll_find if hop.nb_args > 2: v_start = hop.inputarg(Signed, arg=2) if not hop.args_s[2].nonneg: @@ -127,10 +132,6 @@ raise TyperError("str.find() end must be proven non-negative") else: v_end = hop.gendirectcall(ll_strlen, v_str) - if reverse: - llfn = ll_rfind - else: - llfn = ll_find hop.exception_cannot_occur() return hop.gendirectcall(llfn, v_str, v_value, v_start, v_end) @@ -408,10 +409,23 @@ vlist = hop.inputargs(char_repr) return hop.genop('cast_char_to_int', vlist, resulttype=Signed) - def rtype_method_isspace(_, hop): + def _rtype_method_isxxx(_, llfn, hop): vlist = hop.inputargs(char_repr) hop.exception_cannot_occur() - return hop.gendirectcall(ll_char_isspace, vlist[0]) + return hop.gendirectcall(llfn, vlist[0]) + + def rtype_method_isspace(self, hop): + return self._rtype_method_isxxx(ll_char_isspace, hop) + def rtype_method_isdigit(self, hop): + return self._rtype_method_isxxx(ll_char_isdigit, hop) + def rtype_method_isalpha(self, hop): + return self._rtype_method_isxxx(ll_char_isalpha, hop) + def rtype_method_isalnum(self, hop): + return self._rtype_method_isxxx(ll_char_isalnum, hop) + def rtype_method_isupper(self, hop): + return self._rtype_method_isxxx(ll_char_isupper, hop) + def rtype_method_islower(self, hop): + return self._rtype_method_isxxx(ll_char_islower, hop) class __extend__(pairtype(CharRepr, IntegerRepr)): @@ -530,10 +544,37 @@ # get flowed and annotated, mostly with SomePtr. 
# def ll_char_isspace(ch): - # XXX: - #return ord(ch) in (9, 10, 11, 12, 13, 32) c = ord(ch) - return 9 <= c <= 13 or c == 32 + return c == 32 or (c <= 13 and c >= 9) # c in (9, 10, 11, 12, 13, 32) + +def ll_char_isdigit(ch): + c = ord(ch) + return c <= 57 and c >= 48 + +def ll_char_isalpha(ch): + c = ord(ch) + if c >= 97: + return c <= 122 + else: + return 65 <= c <= 90 + +def ll_char_isalnum(ch): + c = ord(ch) + if c >= 65: + if c >= 97: + return c <= 122 + else: + return c <= 90 + else: + return 48 <= c <= 57 + +def ll_char_isupper(ch): + c = ord(ch) + return 65 <= c <= 90 + +def ll_char_islower(ch): + c = ord(ch) + return 97 <= c <= 122 def ll_char_mul(ch, times): newstr = malloc(STR, times) @@ -678,9 +719,27 @@ return True +def ll_find_char(s, ch, start, end): + i = start + while i < end: + if s.chars[i] == ch: + return i + i += 1 + return -1 + +def ll_rfind_char(s, ch, start, end): + i = end + while i > start: + i -= 1 + if s.chars[i] == ch: + return i + return -1 + def ll_find(s1, s2, start, end): """Knuth Morris Prath algorithm for substring match""" len2 = len(s2.chars) + if len2 == 1: + return ll_find_char(s1, s2.chars[0], start, end) if len2 == 0: return start # Construct the array of possible restarting positions @@ -723,6 +782,8 @@ def ll_rfind(s1, s2, start, end): """Reversed version of ll_find()""" len2 = len(s2.chars) + if len2 == 1: + return ll_rfind_char(s1, s2.chars[0], start, end) if len2 == 0: return end # Construct the array of possible restarting positions Modified: pypy/dist/pypy/rpython/test/test_rstr.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rstr.py (original) +++ pypy/dist/pypy/rpython/test/test_rstr.py Tue Sep 20 14:21:23 2005 @@ -109,13 +109,18 @@ assert res.chars[0] == 'x' assert res.chars[1] == '.' -def test_char_isspace(): +def test_char_isxxx(): def fn(s): - return s.isspace() - res = interpret(fn, ['x']) - assert res == False - res = interpret(fn, [' ']) - assert res == True + return (s.isspace() | + s.isdigit() << 1 | + s.isalpha() << 2 | + s.isalnum() << 3 | + s.isupper() << 4 | + s.islower() << 5) + for i in range(128): + ch = chr(i) + res = interpret(fn, [ch]) + assert res == fn(ch) def test_char_compare(): res = interpret(lambda c1, c2: c1 == c2, ['a', 'b']) @@ -248,6 +253,15 @@ res = interpret(fn, []) assert res == 2 + 2 + 1 +def test_find_char(): + def fn(ch): + pos1 = 'aiuwraz 483'.find(ch) + pos2 = 'aiuwraz 483'.rfind(ch) + return pos1 + (pos2*100) + for ch in 'a ?3': + res = interpret(fn, [ch]) + assert res == fn(ch) + def test_upper(): def fn(i): strings = ['', ' ', 'upper', 'UpPeR', ',uppEr,'] From arigo at codespeak.net Tue Sep 20 14:42:12 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 20 Sep 2005 14:42:12 +0200 (CEST) Subject: [pypy-svn] r17691 - pypy/dist/pypy/objspace/std Message-ID: <20050920124212.6B8E027B69@code1.codespeak.net> Author: arigo Date: Tue Sep 20 14:42:09 2005 New Revision: 17691 Modified: pypy/dist/pypy/objspace/std/stringobject.py Log: Using the new isxxx() methods on characters, and other clean-ups of stringobject.py. 
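(Sketch, illustrative only: the pattern of the clean-up. The module-local ord()-range helpers are dropped and the characters' own methods are used instead, e.g.:)

    def _isalpha_old(ch):                  # the removed helper
        o = ord(ch)
        return (o >= 97 and o <= 122) or (o >= 65 and o <= 90)

    def _isalpha_new(ch):                  # what the code below uses instead
        return ch.isalpha()

    assert _isalpha_old('q') == _isalpha_new('q')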
Modified: pypy/dist/pypy/objspace/std/stringobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/stringobject.py (original) +++ pypy/dist/pypy/objspace/std/stringobject.py Tue Sep 20 14:42:09 2005 @@ -33,31 +33,6 @@ registerimplementation(W_StringObject) -def _isspace(ch): - return ord(ch) in (9, 10, 11, 12, 13, 32) - -def _isdigit(ch): - o = ord(ch) - return o >= 48 and o <= 57 - -def _isalpha(ch): - o = ord(ch) - return (o>=97 and o<=122) or (o>=65 and o<=90) - -def _isalnum(ch): - o = ord(ch) - return (o>=97 and o<=122) \ - or (o>=65 and o<=90) \ - or (o>=48 and o<=57) - -def _isupper(ch): - o = ord(ch) - return (o>=65 and o<=90) - -def _islower(ch): - o = ord(ch) - return (o>=97 and o<=122) - def _is_generic(w_self, fun): space = w_self.space v = w_self._value @@ -71,32 +46,33 @@ if not fun(v[idx]): return space.w_False return space.w_True +_is_generic._annspecialcase_ = "specialize:arg1" def _upper(ch): - if _islower(ch): + if ch.islower(): o = ord(ch) - 32 return chr(o) else: return ch def _lower(ch): - if _isupper(ch): + if ch.isupper(): o = ord(ch) + 32 return chr(o) else: return ch def str_isspace__String(space, w_self): - return _is_generic(w_self, _isspace) + return _is_generic(w_self, lambda c: c.isspace()) def str_isdigit__String(space, w_self): - return _is_generic(w_self, _isdigit) + return _is_generic(w_self, lambda c: c.isdigit()) def str_isalpha__String(space, w_self): - return _is_generic(w_self, _isalpha) + return _is_generic(w_self, lambda c: c.isalpha()) def str_isalnum__String(space, w_self): - return _is_generic(w_self, _isalnum) + return _is_generic(w_self, lambda c: c.isalnum()) def str_isupper__String(space, w_self): """Return True if all cased characters in S are uppercase and there is @@ -105,12 +81,12 @@ v = w_self._value if len(v) == 1: c = v[0] - return space.newbool(_isupper(c)) + return space.newbool(c.isupper()) cased = False for idx in range(len(v)): - if _islower(v[idx]): + if v[idx].islower(): return space.w_False - elif not cased and _isupper(v[idx]): + elif not cased and v[idx].isupper(): cased = True return space.newbool(cased) @@ -121,12 +97,12 @@ v = w_self._value if len(v) == 1: c = v[0] - return space.newbool(_islower(c)) + return space.newbool(c.islower()) cased = False for idx in range(len(v)): - if _isupper(v[idx]): + if v[idx].isupper(): return space.w_False - elif not cased and _islower(v[idx]): + elif not cased and v[idx].islower(): cased = True return space.newbool(cased) @@ -141,12 +117,12 @@ for pos in range(0, len(input)): ch = input[pos] - if _isupper(ch): + if ch.isupper(): if previous_is_cased: return space.w_False previous_is_cased = True cased = True - elif _islower(ch): + elif ch.islower(): if not previous_is_cased: return space.w_False cased = True @@ -178,10 +154,10 @@ res = [' '] * len(self) for i in range(len(self)): ch = self[i] - if _isupper(ch): + if ch.isupper(): o = ord(ch) + 32 res[i] = chr(o) - elif _islower(ch): + elif ch.islower(): o = ord(ch) - 32 res[i] = chr(o) else: @@ -195,7 +171,7 @@ buffer = [' '] * len(input) if len(input) > 0: ch = input[0] - if _islower(ch): + if ch.islower(): o = ord(ch) - 32 buffer[0] = chr(o) else: @@ -203,7 +179,7 @@ for i in range(1, len(input)): ch = input[i] - if _isupper(ch): + if ch.isupper(): o = ord(ch) + 32 buffer[i] = chr(o) else: @@ -218,7 +194,7 @@ for pos in range(0, len(input)): ch = input[pos] - if not _isalpha(prev_letter): + if not prev_letter.isalpha(): buffer[pos] = _upper(ch) else: buffer[pos] = 
_lower(ch) @@ -228,36 +204,40 @@ return space.wrap("".join(buffer)) def str_split__String_None_ANY(space, w_self, w_none, w_maxsplit=-1): - res = [] - inword = 0 - value = w_self._value maxsplit = space.int_w(w_maxsplit) - pos = 0 + res_w = [] + value = w_self._value + length = len(value) + i = 0 + while True: + # find the beginning of the next word + while i < length: + if not value[i].isspace(): + break # found + i += 1 + else: + break # end of string, finished - for ch in value: - if _isspace(ch): - if inword: - inword = 0 + # find the end of the word + if maxsplit == 0: + j = length # take all the rest of the string else: - if inword: - res[-1] += ch - else: - if maxsplit > -1: - if maxsplit == 0: - res.append(value[pos:]) - break - maxsplit = maxsplit - 1 - res.append(ch) - inword = 1 - pos = pos + 1 - - res_w = [None] * len(res) - for i in range(len(res)): - res_w[i] = W_StringObject(space, res[i]) + j = i + 1 + while j < length and not value[j].isspace(): + j += 1 + maxsplit -= 1 # NB. if it's already < 0, it stays < 0 + + # the word is value[i:j] + res_w.append(W_StringObject(space, value[i:j])) + + # continue to look from the character following the space after the word + i = j + 1 return W_ListObject(space, res_w) + def str_split__String_String_ANY(space, w_self, w_by, w_maxsplit=-1): + maxsplit = space.int_w(w_maxsplit) res_w = [] start = 0 value = w_self._value @@ -265,61 +245,56 @@ bylen = len(by) if bylen == 0: raise OperationError(space.w_ValueError, space.wrap("empty separator")) - maxsplit = space.int_w(w_maxsplit) - #if maxsplit is default, then you have no limit - #of the length of the resulting array - if maxsplit == -1: - splitcount = 1 - else: - splitcount = maxsplit - - while splitcount: + while maxsplit != 0: next = value.find(by, start) if next < 0: break res_w.append(W_StringObject(space, value[start:next])) start = next + bylen - #decrese the counter only then, when - #we don't have default maxsplit - if maxsplit > -1: - splitcount = splitcount - 1 + maxsplit -= 1 # NB. if it's already < 0, it stays < 0 res_w.append(W_StringObject(space, value[start:])) return W_ListObject(w_self.space, res_w) def str_rsplit__String_None_ANY(space, w_self, w_none, w_maxsplit=-1): - res = [] - inword = 0 - value = w_self._value maxsplit = space.int_w(w_maxsplit) + res_w = [] + value = w_self._value + i = len(value)-1 + while True: + # starting from the end, find the end of the next word + while i >= 0: + if not value[i].isspace(): + break # found + i -= 1 + else: + break # end of string, finished + + # find the start of the word + # (more precisely, 'j' will be the space character before the word) + if maxsplit == 0: + j = -1 # take all the rest of the string + else: + j = i - 1 + while j >= 0 and not value[j].isspace(): + j -= 1 + maxsplit -= 1 # NB. 
if it's already < 0, it stays < 0 + + # the word is value[j+1:i+1] + j1 = j + 1 + assert j1 >= 0 + res_w.append(W_StringObject(space, value[j1:i+1])) + + # continue to look from the character before the space before the word + i = j - 1 - for i in range(len(value)-1, -1, -1): - ch = value[i] - if _isspace(ch): - if inword: - inword = 0 - else: - if inword: - ch = ch + res[-1] - res[-1] = ch - else: - if maxsplit > -1: - if maxsplit == 0: - res.append(value[:i+1]) - break - maxsplit = maxsplit - 1 - res.append(ch) - inword = 1 - - res_w = [None] * len(res) - for i in range(len(res)): - res_w[i] = W_StringObject(space, res[i]) res_w.reverse() return W_ListObject(space, res_w) def str_rsplit__String_String_ANY(space, w_self, w_by, w_maxsplit=-1): + maxsplit = space.int_w(w_maxsplit) res_w = [] value = w_self._value end = len(value) @@ -327,25 +302,14 @@ bylen = len(by) if bylen == 0: raise OperationError(space.w_ValueError, space.wrap("empty separator")) - maxsplit = space.int_w(w_maxsplit) - - #if maxsplit is default, then you have no limit - #of the length of the resulting array - if maxsplit == -1: - splitcount = 1 - else: - splitcount = maxsplit - while splitcount: + while maxsplit != 0: next = value.rfind(by, 0, end) if next < 0: break res_w.append(W_StringObject(space, value[next+bylen:end])) end = next - #decrese the counter only then, when - #we don't have default maxsplit - if maxsplit > -1: - splitcount = splitcount - 1 + maxsplit -= 1 # NB. if it's already < 0, it stays < 0 res_w.append(W_StringObject(space, value[:end])) res_w.reverse() @@ -606,11 +570,11 @@ if left: #print "while %d < %d and -%s- in -%s-:"%(lpos, rpos, u_self[lpos],w_chars) - while lpos < rpos and _isspace(u_self[lpos]): + while lpos < rpos and u_self[lpos].isspace(): lpos += 1 if right: - while rpos > lpos and _isspace(u_self[rpos - 1]): + while rpos > lpos and u_self[rpos - 1].isspace(): rpos -= 1 assert rpos >= lpos # annotator hint, don't remove From pedronis at codespeak.net Tue Sep 20 15:38:55 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 20 Sep 2005 15:38:55 +0200 (CEST) Subject: [pypy-svn] r17692 - pypy/dist/pypy/objspace/std Message-ID: <20050920133855.CE5BC27B59@code1.codespeak.net> Author: pedronis Date: Tue Sep 20 15:38:53 2005 New Revision: 17692 Modified: pypy/dist/pypy/objspace/std/stringobject.py Log: the code in that form contained nested functions => not RPython, obscure crash in the annotatator Modified: pypy/dist/pypy/objspace/std/stringobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/stringobject.py (original) +++ pypy/dist/pypy/objspace/std/stringobject.py Tue Sep 20 15:38:53 2005 @@ -62,17 +62,22 @@ else: return ch +_isspace = lambda c: c.isspace() +_isdigit = lambda c: c.isdigit() +_isalpha = lambda c: c.isalpha() +_isalnum = lambda c: c.isalnum() + def str_isspace__String(space, w_self): - return _is_generic(w_self, lambda c: c.isspace()) + return _is_generic(w_self, _isspace) def str_isdigit__String(space, w_self): - return _is_generic(w_self, lambda c: c.isdigit()) + return _is_generic(w_self, _isdigit) def str_isalpha__String(space, w_self): - return _is_generic(w_self, lambda c: c.isalpha()) + return _is_generic(w_self, _isalpha) def str_isalnum__String(space, w_self): - return _is_generic(w_self, lambda c: c.isalnum()) + return _is_generic(w_self, _isalnum) def str_isupper__String(space, w_self): """Return True if all cased characters in S are uppercase and there is From pedronis at 
codespeak.net Tue Sep 20 15:49:18 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 20 Sep 2005 15:49:18 +0200 (CEST) Subject: [pypy-svn] r17693 - pypy/dist/pypy/annotation Message-ID: <20050920134918.0D98227B59@code1.codespeak.net> Author: pedronis Date: Tue Sep 20 15:49:16 2005 New Revision: 17693 Modified: pypy/dist/pypy/annotation/unaryop.py Log: annotation for more is* methods Modified: pypy/dist/pypy/annotation/unaryop.py ============================================================================== --- pypy/dist/pypy/annotation/unaryop.py (original) +++ pypy/dist/pypy/annotation/unaryop.py Tue Sep 20 15:49:16 2005 @@ -411,6 +411,16 @@ def method_isspace(chr): return SomeBool() + def method_isdigit(chr): + return SomeBool() + + def method_isalpha(chr): + return SomeBool() + + def method_isalnum(chr): + return SomeBool() + + class __extend__(SomeUnicodeCodePoint): def ord(uchr): From pedronis at codespeak.net Tue Sep 20 15:54:43 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 20 Sep 2005 15:54:43 +0200 (CEST) Subject: [pypy-svn] r17694 - pypy/dist/pypy/annotation Message-ID: <20050920135443.426F927B61@code1.codespeak.net> Author: pedronis Date: Tue Sep 20 15:54:40 2005 New Revision: 17694 Modified: pypy/dist/pypy/annotation/unaryop.py Log: oops, more Modified: pypy/dist/pypy/annotation/unaryop.py ============================================================================== --- pypy/dist/pypy/annotation/unaryop.py (original) +++ pypy/dist/pypy/annotation/unaryop.py Tue Sep 20 15:54:40 2005 @@ -420,6 +420,11 @@ def method_isalnum(chr): return SomeBool() + def method_islower(chr): + return SomeBool() + + def method_isupper(chr): + return SomeBool() class __extend__(SomeUnicodeCodePoint): From pedronis at codespeak.net Tue Sep 20 17:04:30 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 20 Sep 2005 17:04:30 +0200 (CEST) Subject: [pypy-svn] r17695 - pypy/dist/pypy/translator/goal Message-ID: <20050920150430.B45E727B62@code1.codespeak.net> Author: pedronis Date: Tue Sep 20 17:04:29 2005 New Revision: 17695 Modified: pypy/dist/pypy/translator/goal/query.py Log: experimental query about potential duplication Modified: pypy/dist/pypy/translator/goal/query.py ============================================================================== --- pypy/dist/pypy/translator/goal/query.py (original) +++ pypy/dist/pypy/translator/goal/query.py Tue Sep 20 17:04:29 2005 @@ -482,3 +482,30 @@ print "Lost method!", name, subcls.cls, cls, subcls.attrs.keys() lost += 0 return lost + +def graph_footprint(graph): + class Counter: + blocks = 0 + links = 0 + ops = 0 + count = Counter() + def visit(block): + if isinstance(block, flowmodel.Block): + count.blocks += 1 + count.ops += len(block.operations) + elif isinstance(block, flowmodel.Link): + count.links += 1 + flowmodel.traverse(visit, graph) + return count.blocks, count.links, count.ops + +# better used before backends opts +def duplication(t): + d = {} + funcs = t.flowgraphs.keys() + print len(funcs) + for f in funcs: + fingerprint = f.func_code, graph_footprint(t.flowgraphs[f]) + d.setdefault(fingerprint ,[]).append(f) + for fingerprint, funcs in d.iteritems(): + if len(funcs) > 1: + print fingerprint[0].co_name, len(funcs) From ac at codespeak.net Tue Sep 20 17:11:19 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Tue, 20 Sep 2005 17:11:19 +0200 (CEST) Subject: [pypy-svn] r17696 - in pypy/dist/pypy/interpreter: astcompiler test Message-ID: 
<20050920151119.2581D27B69@code1.codespeak.net> Author: ac Date: Tue Sep 20 17:11:18 2005 New Revision: 17696 Modified: pypy/dist/pypy/interpreter/astcompiler/symbols.py pypy/dist/pypy/interpreter/test/test_compiler.py Log: Warn about improper use of globals. Modified: pypy/dist/pypy/interpreter/astcompiler/symbols.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/symbols.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/symbols.py Tue Sep 20 17:11:18 2005 @@ -5,6 +5,7 @@ SC_FREE, SC_CELL, SC_UNKNOWN, SC_DEFAULT from pypy.interpreter.astcompiler.misc import mangle, Counter from pypy.interpreter.pyparser.error import SyntaxError +from pypy.interpreter import gateway import types @@ -210,6 +211,21 @@ def __init__(self, name, module): Scope.__init__(self, name, module, name) +app = gateway.applevel(r''' +def issue_warning(msg, filename, lineno): + import warnings + try: + warnings.warn_explicit(msg, SyntaxWarning, filename, lineno, + None, None) + except SyntaxWarning: + raise SyntaxError(msg, filename, lineno) +''') + +_issue_warning = app.interphook('issue_warning') +def issue_warning(space, msg, filename, lineno): + _issue_warning(space, space.wrap(msg), space.wrap(filename), + space.wrap(lineno)) + class SymbolVisitor(ast.ASTVisitor): def __init__(self, space): self.space = space @@ -400,6 +416,15 @@ def visitGlobal(self, node ): scope = self.cur_scope() for name in node.names: + namescope = scope.check_name(name) + if namescope == SC_LOCAL: + issue_warning(self.space, "name '%s' is assigned to before " + "global declaration" %(name,), + node.filename, node.lineno) + elif namescope != SC_GLOBAL and name in scope.uses: + issue_warning(self.space, "name '%s' is used prior " + "to global declaration" %(name,), + node.filename, node.lineno) scope.add_global(name) def visitAssign(self, node ): Modified: pypy/dist/pypy/interpreter/test/test_compiler.py ============================================================================== --- pypy/dist/pypy/interpreter/test/test_compiler.py (original) +++ pypy/dist/pypy/interpreter/test/test_compiler.py Tue Sep 20 17:11:18 2005 @@ -4,6 +4,7 @@ from pypy.interpreter.pycompiler import CPythonCompiler, PythonCompiler, PythonAstCompiler from pypy.interpreter.pycode import PyCode from pypy.interpreter.error import OperationError +from pypy.interpreter.argument import Arguments class BaseTestCompiler: def setup_method(self, method): @@ -271,11 +272,46 @@ ex = e.value ex.normalize_exception(self.space) assert ex.match(self.space, self.space.w_SyntaxError) - + + def test_globals_warnings(self): + space = self.space + w_mod = space.appexec((), '():\n import warnings\n return warnings\n') #sys.getmodule('warnings') + w_filterwarnings = space.getattr(w_mod, space.wrap('filterwarnings')) + filter_arg = Arguments(space, [ space.wrap('error') ], + dict(module=space.wrap(''))) + + for code in (''' +def wrong1(): + a = 1 + b = 2 + global a + global b +''', ''' +def wrong2(): + print x + global x +''', ''' +def wrong3(): + print x + x = 2 + global x +'''): + + space.call_args(w_filterwarnings, filter_arg) + e = py.test.raises(OperationError, self.compiler.compile, + code, '', 'exec', 0) + space.call_method(w_mod, 'resetwarnings') + ex = e.value + ex.normalize_exception(space) + assert ex.match(space, space.w_SyntaxError) + class TestECCompiler(BaseTestCompiler): def setup_method(self, method): self.compiler = self.space.getexecutioncontext().compiler + def test_globals_warnings(self): + 
py.test.skip('INPROGRES') + class TestPyCCompiler(BaseTestCompiler): def setup_method(self, method): self.compiler = CPythonCompiler(self.space) @@ -284,6 +320,9 @@ def setup_method(self, method): self.compiler = PythonCompiler(self.space) + def test_globals_warnings(self): + py.test.skip('INPROGRES') + class TestPythonAstCompiler(BaseTestCompiler): def setup_method(self, method): self.compiler = PythonAstCompiler(self.space) From ac at codespeak.net Tue Sep 20 17:16:07 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Tue, 20 Sep 2005 17:16:07 +0200 (CEST) Subject: [pypy-svn] r17697 - pypy/dist/lib-python Message-ID: <20050920151607.3795E27B62@code1.codespeak.net> Author: ac Date: Tue Sep 20 17:16:07 2005 New Revision: 17697 Modified: pypy/dist/lib-python/conftest.py Log: Use ast compiler for test_global Modified: pypy/dist/lib-python/conftest.py ============================================================================== --- pypy/dist/lib-python/conftest.py (original) +++ pypy/dist/lib-python/conftest.py Tue Sep 20 17:16:07 2005 @@ -514,11 +514,7 @@ RegrTest('test_gl.py', enabled=False, dumbtest=1), RegrTest('test_glob.py', enabled=True, core=True), - RegrTest('test_global.py', enabled=True, core=True, compiler='_stable'), - # this fails because it relies on the warnings module - # turning a warning into an exception, but PyPy's - # interplevel doesn't call into the app-level warnings - # module + RegrTest('test_global.py', enabled=True, core=True, compiler='ast'), RegrTest('test_grammar.py', enabled=True, core=True), RegrTest('test_grp.py', enabled=False), #rev 10840: ImportError: grp @@ -876,8 +872,9 @@ pypy_options.append('--oldstyle') if regrtest.uselibfile: pypy_options.append('--uselibfile') - if regrtest.compiler: - pypy_options.append('--compiler=%s' % regrtest.compiler) + #if regrtest.compiler: + # pypy_options.append('--compiler=%s' % regrtest.compiler) + pypy_options.append('--compiler=ast') pypy_options.extend( ['--usemodules=%s' % mod for mod in regrtest.usemodules]) sopt = " ".join(pypy_options) From arigo at codespeak.net Tue Sep 20 18:26:41 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 20 Sep 2005 18:26:41 +0200 (CEST) Subject: [pypy-svn] r17702 - pypy/dist/pypy/translator/goal Message-ID: <20050920162641.B8EC327B58@code1.codespeak.net> Author: arigo Date: Tue Sep 20 18:26:40 2005 New Revision: 17702 Modified: pypy/dist/pypy/translator/goal/targetcompiler.py pypy/dist/pypy/translator/goal/unixcheckpoint.py Log: The name 'cont' is clearer than 'run' to mean 'continue running from the forked checkpoint'. targetcompiler's entry point should not return an instance which cannot be converted to a PyObject*. 
Modified: pypy/dist/pypy/translator/goal/targetcompiler.py ============================================================================== --- pypy/dist/pypy/translator/goal/targetcompiler.py (original) +++ pypy/dist/pypy/translator/goal/targetcompiler.py Tue Sep 20 18:26:40 2005 @@ -19,7 +19,8 @@ def entry_point( s1, s2 ): global space - return target_ast_compile( space, s1, s2 ) + pycode = target_ast_compile( space, s1, s2 ) + return 'target_ast_compile --> %r' % (pycode,) # _____ Define and setup target ___ def target(geninterp=True): Modified: pypy/dist/pypy/translator/goal/unixcheckpoint.py ============================================================================== --- pypy/dist/pypy/translator/goal/unixcheckpoint.py (original) +++ pypy/dist/pypy/translator/goal/unixcheckpoint.py Tue Sep 20 18:26:40 2005 @@ -7,7 +7,7 @@ def restartable_point(auto=None): while True: while True: - print '---> Checkpoint: run / restart / quit / pdb ?' + print '---> Checkpoint: cont / restart / quit / pdb ?' if auto: print 'auto-%s' % (auto,) line = auto @@ -18,7 +18,7 @@ except (KeyboardInterrupt, EOFError), e: print '(%s ignored)' % e.__class__.__name__ continue - if line == 'run': + if line in ('run', 'cont'): break if line == 'quit': raise SystemExit From pedronis at codespeak.net Tue Sep 20 19:25:30 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 20 Sep 2005 19:25:30 +0200 (CEST) Subject: [pypy-svn] r17704 - pypy/dist/pypy/translator/goal Message-ID: <20050920172530.12EDF27B5B@code1.codespeak.net> Author: pedronis Date: Tue Sep 20 19:25:28 2005 New Revision: 17704 Modified: pypy/dist/pypy/translator/goal/query.py Log: sort output in duplication Modified: pypy/dist/pypy/translator/goal/query.py ============================================================================== --- pypy/dist/pypy/translator/goal/query.py (original) +++ pypy/dist/pypy/translator/goal/query.py Tue Sep 20 19:25:28 2005 @@ -506,6 +506,10 @@ for f in funcs: fingerprint = f.func_code, graph_footprint(t.flowgraphs[f]) d.setdefault(fingerprint ,[]).append(f) + l = [] for fingerprint, funcs in d.iteritems(): if len(funcs) > 1: - print fingerprint[0].co_name, len(funcs) + l.append((fingerprint[0].co_name, len(funcs))) + l.sort() + for name, c in l: + print name, c From pedronis at codespeak.net Tue Sep 20 19:32:42 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 20 Sep 2005 19:32:42 +0200 (CEST) Subject: [pypy-svn] r17705 - pypy/dist/pypy/translator/goal Message-ID: <20050920173242.69A0927B5B@code1.codespeak.net> Author: pedronis Date: Tue Sep 20 19:32:41 2005 New Revision: 17705 Modified: pypy/dist/pypy/translator/goal/ (props changed) Log: generalize ingnoring pypy-c*/pypy-llvm* files From ac at codespeak.net Tue Sep 20 22:02:03 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Tue, 20 Sep 2005 22:02:03 +0200 (CEST) Subject: [pypy-svn] r17707 - pypy/dist/lib-python Message-ID: <20050920200203.4073B27B55@code1.codespeak.net> Author: ac Date: Tue Sep 20 22:02:03 2005 New Revision: 17707 Modified: pypy/dist/lib-python/conftest.py Log: Oops, reverting a private change. 
Modified: pypy/dist/lib-python/conftest.py ============================================================================== --- pypy/dist/lib-python/conftest.py (original) +++ pypy/dist/lib-python/conftest.py Tue Sep 20 22:02:03 2005 @@ -872,9 +872,8 @@ pypy_options.append('--oldstyle') if regrtest.uselibfile: pypy_options.append('--uselibfile') - #if regrtest.compiler: - # pypy_options.append('--compiler=%s' % regrtest.compiler) - pypy_options.append('--compiler=ast') + if regrtest.compiler: + pypy_options.append('--compiler=%s' % regrtest.compiler) pypy_options.extend( ['--usemodules=%s' % mod for mod in regrtest.usemodules]) sopt = " ".join(pypy_options) From tismer at codespeak.net Wed Sep 21 03:22:09 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Wed, 21 Sep 2005 03:22:09 +0200 (CEST) Subject: [pypy-svn] r17710 - pypy/dist/pypy/translator/goal Message-ID: <20050921012209.3717827B5B@code1.codespeak.net> Author: tismer Date: Wed Sep 21 03:22:07 2005 New Revision: 17710 Modified: pypy/dist/pypy/translator/goal/bench-windows.py Log: executable abs.richards abs.pystone rel.rich rel.pystone pypy-c-17439-hi 35415 ms 620.652 42.6 65.4 pypy-c-17439-lo 36492 ms 923.530 43.9 44.0 pypy-c-17600-lo 26542 ms 893.093 31.9 45.5 pypy-c-17634-lo 20203 ms 1001.520 24.3 40.6 pypy-c-17649-lo 22792 ms 1028.290 27.4 39.5 pypy-c-17674-hi 15927 ms 1934.000 19.1 21.0 pypy-c-17674-lo 17009 ms 1283.800 20.4 31.6 pypy-c-17707-hi 15942 ms 1971.950 19.2 20.6 python 2.3.3 832 ms 40612.100 1.0 1.0 This time, more comparisons between -t-lowmem and without it (using geninterp as much as possible) were done. It is interesting how much translation of geninterp'ed code is accelerated, now. Note that range() is still at applevel, but very efficiently translated. It will anyway be moved to interplevel next time, it is too frequently used. Modified: pypy/dist/pypy/translator/goal/bench-windows.py ============================================================================== --- pypy/dist/pypy/translator/goal/bench-windows.py (original) +++ pypy/dist/pypy/translator/goal/bench-windows.py Wed Sep 21 03:22:07 2005 @@ -4,25 +4,21 @@ current_result = """ executable abs.richards abs.pystone rel.rich rel.pystone -pypy-c-17439 35165 ms 668.586 42.4 61.1 -pypy-c-17600 26388 ms 900.481 31.8 45.4 -pypy-c-17634 20108 ms 1017.720 24.2 40.1 -pypy-c-17649 22662 ms 1035.910 27.3 39.4 -pypy-c-17674-nolowmem 15817 ms 1981.470 19.1 20.6 -pypy-c-17674-t-lowmem 16834 ms 1274.650 20.3 32.1 -python 2.3.3 830 ms 40861.400 1.0 1.0 - -17649 was with explicit fixed stack. -Changes after 17634 we not included. -17674 has an outrageous effect. I cannot really -find out what it was. Did Armin do the fixed stack -patch already? Probably not. Was it Samuele's avoiding -of duplicate zeroing? Really just that? I think so, and -this is incredible. -Even more incredible is the fact that not using using -t-lowmem accelerates pystone so much. This is an indicator -that we missed something used in pystone that still contains -applevel code. I can't believe it, will find it tomorrow. 
+pypy-c-17439-hi 35415 ms 620.652 42.6 65.4 +pypy-c-17439-lo 36492 ms 923.530 43.9 44.0 +pypy-c-17600-lo 26542 ms 893.093 31.9 45.5 +pypy-c-17634-lo 20203 ms 1001.520 24.3 40.6 +pypy-c-17649-lo 22792 ms 1028.290 27.4 39.5 +pypy-c-17674-hi 15927 ms 1934.000 19.1 21.0 +pypy-c-17674-lo 17009 ms 1283.800 20.4 31.6 +pypy-c-17707-hi 15942 ms 1971.950 19.2 20.6 +python 2.3.3 832 ms 40612.100 1.0 1.0 + +This time, more comparisons between -t-lowmem and without it (using geninterp +as much as possible) were done. It is interesting how much translation of +geninterp'ed code is accelerated, now. Note that range() is still at applevel, +but very efficiently translated. It will anyway be moved to interplevel +next time, it is too frequently used. """ import os, sys @@ -72,6 +68,13 @@ %-27s ''' + '%5d ms %9.3f ' + '%5.1f %5.1f' def main(): + import win32con, win32process + curr = win32process.GetCurrentProcess() + prio = win32con.HIGH_PRIORITY_CLASS + win32process.SetPriorityClass(curr, prio) + # unfortunately, the above doesn't help, because the process priority + # is not inherited by child process. We also cannot import WIn32 extensions + # right now, since PyPycanot handle extension modules. print 'getting the richards reference' ref_rich = run_richards() print 'getting the pystone reference' From ericvrp at codespeak.net Wed Sep 21 12:17:49 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Wed, 21 Sep 2005 12:17:49 +0200 (CEST) Subject: [pypy-svn] r17712 - pypy/dist/pypy/translator/llvm/test Message-ID: <20050921101749.A558B27B5B@code1.codespeak.net> Author: ericvrp Date: Wed Sep 21 12:17:48 2005 New Revision: 17712 Modified: pypy/dist/pypy/translator/llvm/test/test_exception.py Log: added test to detect not clearing of exception-occurred flag. (mostly for what I currently call fast exceptionpolicy) Modified: pypy/dist/pypy/translator/llvm/test/test_exception.py ============================================================================== --- pypy/dist/pypy/translator/llvm/test/test_exception.py (original) +++ pypy/dist/pypy/translator/llvm/test/test_exception.py Wed Sep 21 12:17:48 2005 @@ -194,6 +194,27 @@ for i in [-1, 0, 1, 2]: assert f(i) == i +def test_two_exceptions(): + def fn1(): + raise Exception + def fn2(): + return 10 + def two_exceptions(): + r = 50 + try: + fn1() + r += 1 + except: + r += 100 + try: + r += fn2() + except: + r += 300 + r += fn2() + return r + f = compile_function(two_exceptions, []) + assert f() == two_exceptions() + def test_raise_outside_testfn(): def raiser(n): if n < 0: From ericvrp at codespeak.net Wed Sep 21 12:20:22 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Wed, 21 Sep 2005 12:20:22 +0200 (CEST) Subject: [pypy-svn] r17713 - pypy/dist/pypy/translator/llvm Message-ID: <20050921102022.CD5B827B5C@code1.codespeak.net> Author: ericvrp Date: Wed Sep 21 12:20:21 2005 New Revision: 17713 Modified: pypy/dist/pypy/translator/llvm/codewriter.py pypy/dist/pypy/translator/llvm/exception.py pypy/dist/pypy/translator/llvm/opwriter.py Log: * Moved more code to exceptionpolicy. * Added 'fast' exceptionpolicy, which does not realy on setjmp/longjmp. 
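(Sketch, illustrative only: a rough Python rendering of the control flow that the 'fast' policy arranges around every call. The real thing is LLVM code emitted by FastExceptionPolicy.invoke() and write_exceptblock() below; the names used here are made up.)

    last_exception_type = None            # plays the role of %last_exception_type

    def callee():
        global last_exception_type
        last_exception_type = ValueError  # 'raise': set the flag...
        return 0                          # ...and return a dummy 'noresult' value

    def caller():
        result = callee()                 # plain call, no invoke/unwind
        if last_exception_type is not None:
            return 0                      # exception pending: return a dummy value,
                                          # the caller of caller() repeats the check
        return result + 1                 # no exception: continue as usual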
Modified: pypy/dist/pypy/translator/llvm/codewriter.py ============================================================================== --- pypy/dist/pypy/translator/llvm/codewriter.py (original) +++ pypy/dist/pypy/translator/llvm/codewriter.py Wed Sep 21 12:20:21 2005 @@ -86,9 +86,6 @@ else: self.indent("ret %s %s" % (type_, ref)) - def unwind(self): - self.indent("unwind") - def phi(self, targetvar, type_, refs, blocknames): assert targetvar.startswith('%') assert refs and len(refs) == len(blocknames), "phi node requires blocks" @@ -118,17 +115,12 @@ tail_ += ' ' args = ", ".join(["%s %s" % item for item in zip(argtypes, argrefs)]) if except_label: - assert label - instruction = 'invoke' - optional = ' to label %%%s except label %%%s' % (label, except_label) - else: - assert not label - instruction = 'call' - optional = '' - if returntype == 'void': - self.indent("%s%s %s void %s(%s)%s" % (tail_, instruction, cconv, functionref, args, optional)) + self.genllvm.exceptionpolicy.invoke(self, targetvar, tail_, cconv, returntype, functionref, args, label, except_label) else: - self.indent("%s = %s%s %s %s %s(%s)%s" % (targetvar, tail_, instruction, cconv, returntype, functionref, args, optional)) + if returntype == 'void': + self.indent("%scall %s void %s(%s)" % (tail_, cconv, functionref, args)) + else: + self.indent("%s = %scall %s %s %s(%s)" % (targetvar, tail_, cconv, returntype, functionref, args)) def cast(self, targetvar, fromtype, fromvar, targettype): if fromtype == 'void' and targettype == 'void': Modified: pypy/dist/pypy/translator/llvm/exception.py ============================================================================== --- pypy/dist/pypy/translator/llvm/exception.py (original) +++ pypy/dist/pypy/translator/llvm/exception.py Wed Sep 21 12:20:21 2005 @@ -5,14 +5,32 @@ def __init__(self): raise Exception, 'ExceptionPolicy should not be used directly' - def pyrex_entrypoint_code(self, entrynode): - return '' + def transform(self, translator): + return - def llc_options(self): - return '' + def _noresult(self, returntype): + r = returntype.strip() + if r == 'void': + return 'void' + elif r == 'bool': + return 'bool false' + elif r in 'float double'.split(): + return r + ' 0.0' + elif r in 'ubyte sbyte ushort short uint int ulong long'.split(): + return r + ' 0' + return r + ' null' + + def _nonoderesult(self, node): + decl = node.getdecl() + returntype, name = decl.split(' ', 1) + noresult = self._noresult(returntype) + #print 'XXX decl=%s -> returntype=%s -> noresult=ret %s' % (decl, returntype, noresult) + return noresult def new(exceptionpolicy=None): #factory - if exceptionpolicy is None or exceptionpolicy == 'cpython': + if exceptionpolicy is None: + exceptionpolicy = 'cpython' + if exceptionpolicy == 'cpython': from pypy.translator.llvm.exception import CPythonExceptionPolicy exceptionpolicy = CPythonExceptionPolicy() elif exceptionpolicy == 'fast': @@ -27,23 +45,18 @@ new = staticmethod(new) -class NoneExceptionPolicy(ExceptionPolicy): +class NoneExceptionPolicy(ExceptionPolicy): #XXX untested def __init__(self): pass -class CPythonExceptionPolicy(ExceptionPolicy): #uses issubclass() +class CPythonExceptionPolicy(ExceptionPolicy): #uses issubclass() and llvm invoke&unwind def __init__(self): pass def pyrex_entrypoint_code(self, entrynode): returntype, entrypointname = entrynode.getdecl().split('%', 1) - if returntype == 'double ': - noresult = '0.0' - elif returntype == 'bool ': - noresult = 'false' - else: - noresult = '0' + noresult = self._noresult(returntype) cconv = 
DEFAULT_CCONV return ''' ccc %(returntype)s%%__entrypoint__%(entrypointname)s { @@ -54,7 +67,7 @@ ret %(returntype)s %%result exception: - ret %(returntype)s %(noresult)s + ret %(noresult)s } ccc int %%__entrypoint__raised_LLVMException() { @@ -64,13 +77,141 @@ } ''' % locals() + def invoke(self, codewriter, targetvar, tail_, cconv, returntype, functionref, args, label, except_label): + labels = 'to label %%%s except label %%%s' % (label, except_label) + if returntype == 'void': + codewriter.indent('%sinvoke %s void %s(%s) %s' % (tail_, cconv, functionref, args, labels)) + else: + codewriter.indent('%s = %sinvoke %s %s %s(%s) %s' % (targetvar, tail_, cconv, returntype, functionref, args, labels)) + + def _is_raise_new_exception(self, db, graph, block): + from pypy.objspace.flow.model import mkentrymap + is_raise_new = False + entrylinks = mkentrymap(graph)[block] + entrylinks = [x for x in entrylinks if x.prevblock is not None] + inputargs = db.repr_arg_multi(block.inputargs) + for i, arg in enumerate(inputargs): + names = db.repr_arg_multi([link.args[i] for link in entrylinks]) + for name in names: #These tests-by-name are a bit yikes, but I don't see a better way right now + if not name.startswith('%last_exception_') and not name.startswith('%last_exc_value_'): + is_raise_new = True + return is_raise_new + + def write_exceptblock(self, funcnode, codewriter, block): + assert len(block.inputargs) == 2 + + db = funcnode.db + graph = funcnode.graph + + if self._is_raise_new_exception(db, graph, block): + funcnode.write_block_phi_nodes(codewriter, block) + + inputargs = db.repr_arg_multi(block.inputargs) + inputargtypes = db.repr_arg_type_multi(block.inputargs) + + codewriter.store(inputargtypes[0], inputargs[0], '%last_exception_type') + codewriter.store(inputargtypes[1], inputargs[1], '%last_exception_value') + else: + codewriter.comment('reraise last exception') + #Reraising last_exception. + #Which is already stored in the global variables. + #So nothing needs to happen here! 
+ + codewriter.indent('unwind') + + def fetch_exceptions(self, codewriter, exc_found_labels, lltype_of_exception_type, lltype_of_exception_value): + for label, target, last_exc_type_var, last_exc_value_var in exc_found_labels: + codewriter.label(label) + if last_exc_type_var: + codewriter.load(last_exc_type_var, lltype_of_exception_type, '%last_exception_type') + if last_exc_value_var: + codewriter.load(last_exc_value_var, lltype_of_exception_value, '%last_exception_value') + codewriter.br_uncond(target) + + def reraise(self, funcnode, codewriter): + codewriter.comment('reraise when exception is not caught') + codewriter.indent('unwind') + def llc_options(self): return '-enable-correct-eh-support' -class FastExceptionPolicy(ExceptionPolicy): #uses only 'direct' exception class comparision +class FastExceptionPolicy(ExceptionPolicy): #uses issubclass() and last_exception tests after each call def __init__(self): - pass + self.invoke_count = 0 - pyrex_entrypoint_code = CPythonExceptionPolicy.pyrex_entrypoint_code - llc_options = CPythonExceptionPolicy.llc_options + def pyrex_entrypoint_code(self, entrynode): + returntype, entrypointname = entrynode.getdecl().split('%', 1) + noresult = self._noresult(returntype) + cconv = DEFAULT_CCONV + return ''' +ccc %(returntype)s%%__entrypoint__%(entrypointname)s { + store %%RPYTHON_EXCEPTION_VTABLE* null, %%RPYTHON_EXCEPTION_VTABLE** %%last_exception_type + %%result = call %(cconv)s %(returntype)s%%%(entrypointname)s + %%tmp = load %%RPYTHON_EXCEPTION_VTABLE** %%last_exception_type + %%exc = seteq %%RPYTHON_EXCEPTION_VTABLE* %%tmp, null + br bool %%exc, label %%no_exception, label %%exception + +no_exception: + ret %(returntype)s %%result + +exception: + ret %(noresult)s +} + +ccc int %%__entrypoint__raised_LLVMException() { + %%tmp = load %%RPYTHON_EXCEPTION_VTABLE** %%last_exception_type + %%result = cast %%RPYTHON_EXCEPTION_VTABLE* %%tmp to int + ret int %%result +} +''' % locals() + + def transform(self, translator): + from pypy.translator.backendopt.exception import create_exception_handling + for graph in translator.flowgraphs.itervalues(): + create_exception_handling(translator, graph) + #translator.view() + + def invoke(self, codewriter, targetvar, tail_, cconv, returntype, functionref, args, label, except_label): + if returntype == 'void': + codewriter.indent('%scall %s void %s(%s)' % (tail_, cconv, functionref, args)) + else: + codewriter.indent('%s = %scall %s %s %s(%s)' % (targetvar, tail_, cconv, returntype, functionref, args)) + tmp = '%%invoke.tmp.%d' % self.invoke_count + exc = '%%invoke.exc.%d' % self.invoke_count + self.invoke_count += 1 + codewriter.indent('%(tmp)s = load %%RPYTHON_EXCEPTION_VTABLE** %%last_exception_type' % locals()) + codewriter.indent('%(exc)s = seteq %%RPYTHON_EXCEPTION_VTABLE* %(tmp)s, null' % locals()) + codewriter.indent('br bool %(exc)s, label %%%(label)s, label %%%(except_label)s' % locals()) + + def write_exceptblock(self, funcnode, codewriter, block): + assert len(block.inputargs) == 2 + + noresult = self._nonoderesult(funcnode) + + funcnode.write_block_phi_nodes(codewriter, block) + + inputargs = funcnode.db.repr_arg_multi(block.inputargs) + inputargtypes = funcnode.db.repr_arg_type_multi(block.inputargs) + + codewriter.store(inputargtypes[0], inputargs[0], '%last_exception_type') + codewriter.store(inputargtypes[1], inputargs[1], '%last_exception_value') + codewriter.indent('ret ' + noresult) + + def fetch_exceptions(self, codewriter, exc_found_labels, lltype_of_exception_type, lltype_of_exception_value): 
+ for label, target, last_exc_type_var, last_exc_value_var in exc_found_labels: + codewriter.label(label) + if last_exc_type_var: + codewriter.load(last_exc_type_var, lltype_of_exception_type, '%last_exception_type') + if last_exc_value_var: + codewriter.load(last_exc_value_var, lltype_of_exception_value, '%last_exception_value') + codewriter.store(lltype_of_exception_type , 'null', '%last_exception_type') + codewriter.store(lltype_of_exception_value, 'null', '%last_exception_value') + codewriter.br_uncond(target) + + def reraise(self, funcnode, codewriter): + noresult = self._nonoderesult(funcnode) + codewriter.indent('ret ' + noresult) + + def llc_options(self): + return '' Modified: pypy/dist/pypy/translator/llvm/opwriter.py ============================================================================== --- pypy/dist/pypy/translator/llvm/opwriter.py (original) +++ pypy/dist/pypy/translator/llvm/opwriter.py Wed Sep 21 12:20:21 2005 @@ -100,10 +100,7 @@ else: meth = getattr(self, op.opname, None) if not meth: - msg = "operation %s not found" %(op.opname,) - self.codewriter.comment('XXX: Error: ' + msg) - # XXX commented out for testing - #assert meth is not None, msg + raise Exception, "operation %s not found" % op.opname return meth(op) @@ -164,7 +161,7 @@ #this is really generated, don't know why # XXX rxe: Surely that cant be right? - uint_neg = int_neg + uint_neg = int_neg def float_neg(self, op): self._generic_neg(op, "0.0") @@ -275,10 +272,8 @@ "null") def direct_call(self, op): - op_args = [arg for arg in op.args if arg.concretetype is not lltype.Void] - assert len(op_args) >= 1 targetvar = self.db.repr_arg(op.result) returntype = self.db.repr_arg_type(op.result) @@ -289,6 +284,11 @@ returntype = "%s (%s)*" % (returntype, ", ".join(argtypes)) self.codewriter.call(targetvar,returntype,functionref,argrefs,argtypes) + def last_exception_type_ptr(self, op): + e = self.db.translator.rtyper.getexceptiondata() + lltype_of_exception_type = ('%structtype.' 
+ e.lltype_of_exception_type.TO.__name__ + '*') + self.codewriter.load('%'+str(op.result), lltype_of_exception_type, '%last_exception_type') + def invoke(self, op): op_args = [arg for arg in op.args if arg.concretetype is not lltype.Void] @@ -314,20 +314,15 @@ link = self.block.exits[0] assert link.exitcase is None - targetvar = self.db.repr_arg(op.result) - returntype = self.db.repr_arg_type(op.result) - argrefs = self.db.repr_arg_multi(op_args[1:]) - argtypes = self.db.repr_arg_type_multi(op_args[1:]) + targetvar = self.db.repr_arg(op.result) + returntype = self.db.repr_arg_type(op.result) + argrefs = self.db.repr_arg_multi(op_args[1:]) + argtypes = self.db.repr_arg_type_multi(op_args[1:]) none_label = self.node.block_to_name[link.target] block_label = self.node.block_to_name[self.block] exc_label = block_label + '_exception_handling' - - - - - if self.db.is_function_ptr(op.result): #use longhand form returntype = "%s (%s)*" % (returntype, ", ".join(argtypes)) self.codewriter.call(targetvar, returntype, functionref, argrefs, @@ -360,7 +355,6 @@ for name, blockname in zip(names, blocknames): if blockname != exc_found_label: continue - #XXX might want to refactor the next few lines if name.startswith('%last_exception_'): last_exc_type_var = name if name.startswith('%last_exc_value_'): @@ -371,11 +365,11 @@ not_this_exception_label = block_label + '_not_exception_' + etype.ref[1:] - if current_exception_type.find('getelementptr') == -1: #XXX catch all (except:) + if current_exception_type.find('getelementptr') == -1: #catch all (except:) catch_all = True self.codewriter.br_uncond(exc_found_label) - else: - if not last_exception_type: + else: #catch specific exception (class) type + if not last_exception_type: #load pointer only once last_exception_type = self.db.repr_tmpvar() self.codewriter.load(last_exception_type, lltype_of_exception_type, '%last_exception_type') self.codewriter.newline() @@ -388,25 +382,10 @@ self.codewriter.br(ll_issubclass_cond, not_this_exception_label, exc_found_label) self.codewriter.label(not_this_exception_label) + ep = self.codewriter.genllvm.exceptionpolicy if not catch_all: - self.codewriter.comment('reraise when exception is not caught') - self.codewriter.unwind() - - for label, target, last_exc_type_var, last_exc_value_var in exc_found_labels: - self.codewriter.label(label) - if last_exc_type_var: - self.codewriter.load(last_exc_type_var, lltype_of_exception_type, '%last_exception_type') - if last_exc_value_var: - self.codewriter.load(last_exc_value_var, lltype_of_exception_value, '%last_exception_value') - - self.codewriter.br_uncond(target) - - - - - - - + ep.reraise(self.node, self.codewriter) + ep.fetch_exceptions(self.codewriter, exc_found_labels, lltype_of_exception_type, lltype_of_exception_value) def malloc(self, op): arg_type = op.args[0].value @@ -452,8 +431,6 @@ ("uint", index)) self.codewriter.load(targetvar, targettype, tmpvar) else: - #XXX what if this the last operation of the exception block? - # XXX rxe: would a getfield() ever raise anyway??? 
self.codewriter.comment("***Skipping operation getfield()***") def getsubstruct(self, op): From ac at codespeak.net Wed Sep 21 13:25:14 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Wed, 21 Sep 2005 13:25:14 +0200 (CEST) Subject: [pypy-svn] r17714 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050921112514.D329B27B5B@code1.codespeak.net> Author: ac Date: Wed Sep 21 13:25:14 2005 New Revision: 17714 Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Log: Fix mangling of names. Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Wed Sep 21 13:25:14 2005 @@ -1222,7 +1222,6 @@ class AbstractFunctionCode(CodeGenerator): def __init__(self, space, func, isLambda, class_name, mod): - self.class_name = class_name self.module = mod if isLambda: name = "" @@ -1251,6 +1250,7 @@ newlocals=1) self.isLambda = isLambda CodeGenerator.__init__(self, space, graph) + self.class_name = class_name self.optimized = 1 if not isLambda and not space.is_w(func.doc, space.w_None): @@ -1327,12 +1327,12 @@ class AbstractClassCode(CodeGenerator): def __init__(self, space, klass, module): - self.class_name = klass.name self.module = module graph = pyassem.PyFlowGraph( space, klass.name, klass.filename, optimized=0, klass=1) CodeGenerator.__init__(self, space, graph) + self.class_name = klass.name self.graph.setFlag(CO_NEWLOCALS) if not space.is_w(klass.doc, space.w_None): self.setDocstring(klass.doc) From ericvrp at codespeak.net Wed Sep 21 14:59:17 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Wed, 21 Sep 2005 14:59:17 +0200 (CEST) Subject: [pypy-svn] r17715 - in pypy/dist/pypy/translator/llvm: . 
module Message-ID: <20050921125917.40AB927B5B@code1.codespeak.net> Author: ericvrp Date: Wed Sep 21 14:59:16 2005 New Revision: 17715 Modified: pypy/dist/pypy/translator/llvm/build_llvm_module.py pypy/dist/pypy/translator/llvm/exception.py pypy/dist/pypy/translator/llvm/funcnode.py pypy/dist/pypy/translator/llvm/genllvm.py pypy/dist/pypy/translator/llvm/module/support.py Log: remaning files for exceptionpolicy Modified: pypy/dist/pypy/translator/llvm/build_llvm_module.py ============================================================================== --- pypy/dist/pypy/translator/llvm/build_llvm_module.py (original) +++ pypy/dist/pypy/translator/llvm/build_llvm_module.py Wed Sep 21 14:59:16 2005 @@ -95,7 +95,7 @@ cmds.append("llc %s %s.bc -march=c -f -o %s.c" % (genllvm.exceptionpolicy.llc_options(), b, b)) if exe_name: #XXX TODO: use CFLAGS when available - cmds.append("gcc %s.c -c -O2 -fomit-frame-pointer -pipe" % (b,)) + cmds.append("gcc %s.c -c -O2 -fomit-frame-pointer -march=pentium4 -ffast-math -pipe" % (b,)) cmds.append("gcc %s.o %s -lm -ldl -pipe -o %s" % (b, gc_libs, exe_name)) source_files.append("%s.c" % b) Modified: pypy/dist/pypy/translator/llvm/exception.py ============================================================================== --- pypy/dist/pypy/translator/llvm/exception.py (original) +++ pypy/dist/pypy/translator/llvm/exception.py Wed Sep 21 14:59:16 2005 @@ -29,7 +29,7 @@ def new(exceptionpolicy=None): #factory if exceptionpolicy is None: - exceptionpolicy = 'cpython' + exceptionpolicy = 'fast' if exceptionpolicy == 'cpython': from pypy.translator.llvm.exception import CPythonExceptionPolicy exceptionpolicy = CPythonExceptionPolicy() Modified: pypy/dist/pypy/translator/llvm/funcnode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/funcnode.py (original) +++ pypy/dist/pypy/translator/llvm/funcnode.py Wed Sep 21 14:59:16 2005 @@ -2,10 +2,10 @@ from pypy.objspace.flow.model import Block, Constant, Variable, Link from pypy.objspace.flow.model import flatten, mkentrymap, traverse, last_exception from pypy.rpython import lltype -from pypy.translator.unsimplify import remove_double_links from pypy.translator.llvm.node import LLVMNode, ConstantLLVMNode from pypy.translator.llvm.opwriter import OpWriter from pypy.translator.llvm.log import log +from pypy.translator.unsimplify import remove_double_links log = log.funcnode class FuncTypeNode(LLVMNode): @@ -16,7 +16,7 @@ assert isinstance(type_, lltype.FuncType) self.type_ = type_ self.ref = self.make_ref('%functiontype', '') - + def __str__(self): return "" % self.ref @@ -37,9 +37,7 @@ self.value = value self.ref = self.make_ref('%pypy_', value.graph.name) self.graph = value.graph - # XXX the following needs to be done in advance (e.g. 
for inlining) - #backend_optimizations(self.graph, opt_SSI_to_SSA=False) - remove_double_links(self.db.translator, self.graph) + remove_double_links(self.db.translator, self.graph) def __str__(self): return "" %(self.ref,) @@ -207,33 +205,5 @@ inputarg = self.db.repr_arg(block.inputargs[0]) codewriter.ret(inputargtype, inputarg) - def _is_raise_new_exception(self, block): - is_raise_new = False - entrylinks = mkentrymap(self.graph)[block] - entrylinks = [x for x in entrylinks if x.prevblock is not None] - inputargs = self.db.repr_arg_multi(block.inputargs) - for i, arg in enumerate(inputargs): - names = self.db.repr_arg_multi([link.args[i] for link in entrylinks]) - for name in names: #These tests-by-name are a bit yikes, but I don't see a better way right now - if not name.startswith('%last_exception_') and not name.startswith('%last_exc_value_'): - is_raise_new = True - return is_raise_new - def write_exceptblock(self, codewriter, block): - assert len(block.inputargs) == 2 - - if self._is_raise_new_exception(block): - self.write_block_phi_nodes(codewriter, block) - - inputargs = self.db.repr_arg_multi(block.inputargs) - inputargtypes = self.db.repr_arg_type_multi(block.inputargs) - - codewriter.store(inputargtypes[0], inputargs[0], '%last_exception_type') - codewriter.store(inputargtypes[1], inputargs[1], '%last_exception_value') - else: - codewriter.comment('reraise last exception') - #Reraising last_exception. - #Which is already stored in the global variables. - #So nothing needs to happen here! - - codewriter.unwind() + codewriter.genllvm.exceptionpolicy.write_exceptblock(self, codewriter, block) Modified: pypy/dist/pypy/translator/llvm/genllvm.py ============================================================================== --- pypy/dist/pypy/translator/llvm/genllvm.py (original) +++ pypy/dist/pypy/translator/llvm/genllvm.py Wed Sep 21 14:59:16 2005 @@ -22,7 +22,6 @@ from pypy.translator.llvm.externs2ll import post_setup_externs, generate_llfile from pypy.translator.llvm.gc import GcPolicy from pypy.translator.llvm.exception import ExceptionPolicy - from pypy.translator.translator import Translator @@ -40,11 +39,11 @@ self.translator = translator self.gcpolicy = gcpolicy self.exceptionpolicy = exceptionpolicy - translator.checkgraphs() extfuncnode.ExternalFuncNode.used_external_functions = {} - - # for debug we create comments of every operation that may be executed - self.debug = debug + self.debug = debug # for debug we create comments of every operation that may be executed + exceptionpolicy.transform(translator) + if debug: + translator.checkgraphs() def _checkpoint(self, msg=None): if self.debug: @@ -256,8 +255,7 @@ a.simplify() t.specialize() t.backend_optimizations(ssa_form=False) - t.checkgraphs() - if view: + if view: #note: this is without policy transforms t.view() return genllvm(t, **kwds) Modified: pypy/dist/pypy/translator/llvm/module/support.py ============================================================================== --- pypy/dist/pypy/translator/llvm/module/support.py (original) +++ pypy/dist/pypy/translator/llvm/module/support.py Wed Sep 21 14:59:16 2005 @@ -97,7 +97,7 @@ %%exception_type = load %%RPYTHON_EXCEPTION_VTABLE** %%tmp store %%RPYTHON_EXCEPTION_VTABLE* %%exception_type, %%RPYTHON_EXCEPTION_VTABLE** %%last_exception_type store %%RPYTHON_EXCEPTION* %%exception_value, %%RPYTHON_EXCEPTION** %%last_exception_value - unwind + ret void ; XXX unwind ; (1) } """ % locals()) @@ -109,7 +109,7 @@ br bool %%cond, label %%is_0, label %%is_not_0 is_0: call fastcc 
void %%prepare_ZeroDivisionError() - unwind + br label %%is_not_0 ; XXX unwind ; (2) is_not_0: """ @@ -132,7 +132,7 @@ ; br bool %cond3, label %return_block, label %ovf3 ;ovf3: call fastcc void %prepare_OverflowError() - unwind + ret int 0 ; XXX unwind ; (3) """ From pedronis at codespeak.net Wed Sep 21 15:26:52 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Wed, 21 Sep 2005 15:26:52 +0200 (CEST) Subject: [pypy-svn] r17718 - in pypy/dist/pypy: annotation rpython Message-ID: <20050921132652.CF4C027B5B@code1.codespeak.net> Author: pedronis Date: Wed Sep 21 15:26:51 2005 New Revision: 17718 Modified: pypy/dist/pypy/annotation/bookkeeper.py pypy/dist/pypy/annotation/builtin.py pypy/dist/pypy/rpython/normalizecalls.py pypy/dist/pypy/rpython/rbuiltin.py Log: don't generate ununsed my instantiate Modified: pypy/dist/pypy/annotation/bookkeeper.py ============================================================================== --- pypy/dist/pypy/annotation/bookkeeper.py (original) +++ pypy/dist/pypy/annotation/bookkeeper.py Wed Sep 21 15:26:51 2005 @@ -197,6 +197,8 @@ self.needs_hash_support = {} + self.needs_generic_instantiate = {} + self.memo_tables = [] self.stats = Stats(self) Modified: pypy/dist/pypy/annotation/builtin.py ============================================================================== --- pypy/dist/pypy/annotation/builtin.py (original) +++ pypy/dist/pypy/annotation/builtin.py Wed Sep 21 15:26:51 2005 @@ -234,6 +234,9 @@ clsdef = getbookkeeper().getclassdef(cls) else: clsdef = clsdef.commonbase(getbookkeeper().getclassdef(cls)) + if len(s_clspbc.prebuiltinstances) > 1: + for cls in s_clspbc.prebuiltinstances: + getbookkeeper().needs_generic_instantiate[cls] = True return SomeInstance(clsdef) def robjmodel_we_are_translated(): Modified: pypy/dist/pypy/rpython/normalizecalls.py ============================================================================== --- pypy/dist/pypy/rpython/normalizecalls.py (original) +++ pypy/dist/pypy/rpython/normalizecalls.py Wed Sep 21 15:26:51 2005 @@ -324,8 +324,12 @@ def create_instantiate_functions(annotator): # build the 'instantiate() -> instance of C' functions for the vtables + + needs_generic_instantiate = annotator.bookkeeper.needs_generic_instantiate + for cls, classdef in annotator.getuserclasses().items(): - if needsgc(classdef): # only gc-case + if cls in needs_generic_instantiate: + assert needsgc(classdef) # only gc-case create_instantiate_function(annotator, cls, classdef) def create_instantiate_function(annotator, cls, classdef): Modified: pypy/dist/pypy/rpython/rbuiltin.py ============================================================================== --- pypy/dist/pypy/rpython/rbuiltin.py (original) +++ pypy/dist/pypy/rpython/rbuiltin.py Wed Sep 21 15:26:51 2005 @@ -212,9 +212,9 @@ v_errno = hop.inputarg(lltype.Signed, arg=1) r_self.setfield(v_self, 'errno', v_errno, hop.llops) -def ll_instantiate(typeptr, RESULT): +def ll_instantiate(typeptr): my_instantiate = typeptr.instantiate - return lltype.cast_pointer(RESULT, my_instantiate()) + return my_instantiate() def rtype_instantiate(hop): s_class = hop.args_s[0] @@ -222,8 +222,10 @@ if len(s_class.prebuiltinstances) != 1: # instantiate() on a variable class vtypeptr, = hop.inputargs(rclass.get_type_repr(hop.rtyper)) - cresult = hop.inputconst(lltype.Void, hop.r_result.lowleveltype) - return hop.gendirectcall(ll_instantiate, vtypeptr, cresult) + v_inst = hop.gendirectcall(ll_instantiate, vtypeptr) + return hop.genop('cast_pointer', [v_inst], # v_type implicit in 
r_result + resulttype = hop.r_result.lowleveltype) + klass = s_class.const return rclass.rtype_new_instance(hop.rtyper, klass, hop.llops) From ericvrp at codespeak.net Wed Sep 21 15:37:20 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Wed, 21 Sep 2005 15:37:20 +0200 (CEST) Subject: [pypy-svn] r17719 - pypy/dist/pypy/tool Message-ID: <20050921133720.0873D27B5C@code1.codespeak.net> Author: ericvrp Date: Wed Sep 21 15:37:15 2005 New Revision: 17719 Modified: pypy/dist/pypy/tool/udir.py Log: Renamed /tmp/usession-current to /tmp/usession- when available. Still use /tmp/usession-current symlink when username unavailable. The 'current' name was causing problems on multiuser systems where only one user owns the symlink and others can not change it. Modified: pypy/dist/pypy/tool/udir.py ============================================================================== --- pypy/dist/pypy/tool/udir.py (original) +++ pypy/dist/pypy/tool/udir.py Wed Sep 21 15:37:15 2005 @@ -5,14 +5,23 @@ # import autopath +import os from py.path import local udir = local.make_numbered_dir(prefix='usession-', keep=3) +try: + username = os.environ['LOGNAME'] #linux, et al +except: + try: + username = os.environ['USERNAME'] #windows + except: + username = 'current' + import os src = str(udir) -dest = src[:src.rfind('-')] + '-current' +dest = src[:src.rfind('-')] + '-' + username try: os.unlink(dest) except: From tismer at codespeak.net Wed Sep 21 20:00:43 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Wed, 21 Sep 2005 20:00:43 +0200 (CEST) Subject: [pypy-svn] r17741 - pypy/dist/pypy/translator/goal Message-ID: <20050921180043.6006127B62@code1.codespeak.net> Author: tismer Date: Wed Sep 21 20:00:42 2005 New Revision: 17741 Modified: pypy/dist/pypy/translator/goal/bench-windows.py Log: After implementing range at interp-level, results have changed quite dramatically. Revision 17707 runs everywhere fastest without the -t-lowmem option. This is probably different on machines with less than 2 MB of L2-cache. I was not aware that pystone uses range. This explains a lot, and after optimizing range, the world looks much more consistent. Note how much geninterp accelerated things, already. The effect of compiling has increased quite much since I added -t-lowmem. Perhaps it is time to drop the optiomn, again. The performance gap might be interpreted this way: Translation in general works quite efficiently, but the interpreter itself needs more to become equally accelerated. 
executable abs.richards abs.pystone rel.rich rel.pystone pypy-c-17439-hi 35135 ms 674.191 42.4 60.7 pypy-c-17439-lo 36062 ms 972.900 43.6 42.1 pypy-c-17600-lo 26357 ms 905.379 31.8 45.2 pypy-c-17634-lo 20098 ms 1016.890 24.3 40.3 pypy-c-17649-lo 22637 ms 1041.480 27.3 39.3 pypy-c-17674-hi 15812 ms 2114.430 19.1 19.4 pypy-c-17674-lo 19253 ms 1356.470 23.3 30.2 pypy-c-17707-hi-range 14265 ms 2906.260 17.2 14.1 pypy-c-17707-hi 14105 ms 2120.210 17.0 19.3 pypy-c-17707-lo-range 18701 ms 2834.690 22.6 14.4 pypy-c-17707-lo 19042 ms 1357.690 23.0 30.2 python 2.3.3 828 ms 40934.500 1.0 1.0 Modified: pypy/dist/pypy/translator/goal/bench-windows.py ============================================================================== --- pypy/dist/pypy/translator/goal/bench-windows.py (original) +++ pypy/dist/pypy/translator/goal/bench-windows.py Wed Sep 21 20:00:42 2005 @@ -4,21 +4,23 @@ current_result = """ executable abs.richards abs.pystone rel.rich rel.pystone -pypy-c-17439-hi 35415 ms 620.652 42.6 65.4 -pypy-c-17439-lo 36492 ms 923.530 43.9 44.0 -pypy-c-17600-lo 26542 ms 893.093 31.9 45.5 -pypy-c-17634-lo 20203 ms 1001.520 24.3 40.6 -pypy-c-17649-lo 22792 ms 1028.290 27.4 39.5 -pypy-c-17674-hi 15927 ms 1934.000 19.1 21.0 -pypy-c-17674-lo 17009 ms 1283.800 20.4 31.6 -pypy-c-17707-hi 15942 ms 1971.950 19.2 20.6 -python 2.3.3 832 ms 40612.100 1.0 1.0 - -This time, more comparisons between -t-lowmem and without it (using geninterp -as much as possible) were done. It is interesting how much translation of -geninterp'ed code is accelerated, now. Note that range() is still at applevel, -but very efficiently translated. It will anyway be moved to interplevel -next time, it is too frequently used. +pypy-c-17439-hi 35135 ms 674.191 42.4 60.7 +pypy-c-17439-lo 36062 ms 972.900 43.6 42.1 +pypy-c-17600-lo 26357 ms 905.379 31.8 45.2 +pypy-c-17634-lo 20098 ms 1016.890 24.3 40.3 +pypy-c-17649-lo 22637 ms 1041.480 27.3 39.3 +pypy-c-17674-hi 15812 ms 2114.430 19.1 19.4 +pypy-c-17674-lo 19253 ms 1356.470 23.3 30.2 +pypy-c-17707-hi-range 14265 ms 2906.260 17.2 14.1 +pypy-c-17707-hi 14105 ms 2120.210 17.0 19.3 +pypy-c-17707-lo-range 18701 ms 2834.690 22.6 14.4 +pypy-c-17707-lo 19042 ms 1357.690 23.0 30.2 +python 2.3.3 828 ms 40934.500 1.0 1.0 + +After implementing range at interp-level, results have changed +quite dramatically. Revision 17707 runs everywhere fastest +without the -t-lowmem option. This is probably different on machines +with less than 2 MB of L2-cache. """ import os, sys From tismer at codespeak.net Wed Sep 21 20:14:21 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Wed, 21 Sep 2005 20:14:21 +0200 (CEST) Subject: [pypy-svn] r17743 - pypy/dist/pypy/module/__builtin__ Message-ID: <20050921181421.808FB27B5C@code1.codespeak.net> Author: tismer Date: Wed Sep 21 20:14:19 2005 New Revision: 17743 Added: pypy/dist/pypy/module/__builtin__/functional.py (contents, props changed) Modified: pypy/dist/pypy/module/__builtin__/__init__.py pypy/dist/pypy/module/__builtin__/app_functional.py Log: added an interplevel implementation of range. The original function will stay in app_functional; it is used for non-integer and all exception cases. The speed effect on pystone is just tremendous, although I believe that this would be almost achievable by specialization of geninterp output. Not easily, of course, so I think this is worth the small effort. 
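For readers skimming the patch below, here is a minimal, self-contained sketch in plain Python of the pattern the log message describes: a fast integer-only path that redirects every awkward case to a general fallback implementation. The names range_fallback and range_int are illustrative only; this is not the interp-level code from the patch itself.

    def range_fallback(x, y=None, step=1):
        # stands in for the original app-level range(): it keeps
        # handling longs, non-integers and all the error reporting
        if y is None:
            x, y = 0, x
        out, v = [], x
        while (step > 0 and v < y) or (step < 0 and v > y):
            out.append(v)
            v += step
        return out

    def get_len_of_range(lo, hi, step):
        # number of items in range(lo, hi, step); step > 0 required
        if lo >= hi:
            return 0
        return (hi - lo - 1) // step + 1

    def range_int(x, y=None, step=1):
        try:
            if y is None:
                start, stop = 0, x
            else:
                start, stop = x, y
            for v in (start, stop, step):
                if not isinstance(v, int):
                    raise ValueError    # punt anything non-integer
            if step == 0:
                raise ValueError
            if step < 0:
                lo, hi, up = stop, start, -step
            else:
                lo, hi, up = start, stop, step
            n = get_len_of_range(lo, hi, up)
        except (ValueError, OverflowError):
            # every error is redirected to the general implementation
            return range_fallback(x, y, step)
        res = []
        for i in range(n):
            res.append(start + i * step)
        return res

    assert range_int(5) == [0, 1, 2, 3, 4]
    assert range_int(10, 0, -3) == [10, 7, 4, 1]
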
Modified: pypy/dist/pypy/module/__builtin__/__init__.py ============================================================================== --- pypy/dist/pypy/module/__builtin__/__init__.py (original) +++ pypy/dist/pypy/module/__builtin__/__init__.py Wed Sep 21 20:14:19 2005 @@ -22,7 +22,9 @@ 'filter' : 'app_functional.filter', 'zip' : 'app_functional.zip', 'reduce' : 'app_functional.reduce', - 'range' : 'app_functional.range', + #'range' : 'app_functional.range', + # redirected to functional.py, applevel version + # is still needed and should stay where it is. 'min' : 'app_functional.min', 'max' : 'app_functional.max', 'enumerate' : 'app_functional.enumerate', @@ -107,6 +109,8 @@ '__import__' : 'importing.importhook', + 'range' : 'functional.range_int', + # float->string helper '_formatd' : 'special._formatd' } Modified: pypy/dist/pypy/module/__builtin__/app_functional.py ============================================================================== --- pypy/dist/pypy/module/__builtin__/app_functional.py (original) +++ pypy/dist/pypy/module/__builtin__/app_functional.py Wed Sep 21 20:14:19 2005 @@ -140,6 +140,16 @@ # ____________________________________________________________ +""" +The following is a nice example of collaboration between +interp-level and app-level. +range is primarily implemented in functional.py for the integer case. +On every error or different data types, it redirects to the applevel +implementation below. functional.py uses this source via the inspect +module and uses gateway.applevel. This is also an alternative to +writing longer functions in strings. +""" + def range(x, y=None, step=1): """ returns a list of integers in arithmetic position from start (defaults to zero) to stop - 1 by step (defaults to 1). Use a negative step to @@ -147,11 +157,11 @@ if y is None: - start = 0 - stop = x + start = 0 + stop = x else: - start = x - stop = y + start = x + stop = y if not isinstance(start, (int, long)): raise TypeError('range() integer start argument expected, got %s' % type(start)) @@ -302,7 +312,7 @@ if not isinstance(index, int): raise TypeError, "sequence index must be integer" len = self.len - if index<0: + if index < 0: index += len if 0 <= index < len: return self.start + index * self.step Added: pypy/dist/pypy/module/__builtin__/functional.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/module/__builtin__/functional.py Wed Sep 21 20:14:19 2005 @@ -0,0 +1,94 @@ +""" +Interp-level definition of frequently used functionals. + +Candidates implemented + + range yes + zip no + min no + max no + enumerate no + xrange no + +""" + +from pypy.interpreter.error import OperationError +from pypy.interpreter.gateway import ObjSpace, W_Root, NoneNotWrapped, applevel +from pypy.rpython.rarithmetic import r_uint, intmask +from pypy.module.__builtin__.app_functional import range as app_range +from inspect import getsource, getfile + +""" +Implementation of the common integer case of range. Instead of handling +all other cases here, too, we fall back to the applevel implementation +for non-integer arguments. +Ideally this implementation could be saved, if we were able to +specialize the geninterp generated code. But I guess having this +hand-optimized is a good idea. + +Note the fun of using range inside range :-) +""" + +def get_len_of_range(lo, hi, step): + """ + Return number of items in range/xrange (lo, hi, step). step > 0 + required. 
Return a value < 0 if & only if the true value is too + large to fit in a signed long. + """ + + # If lo >= hi, the range is empty. + # Else if n values are in the range, the last one is + # lo + (n-1)*step, which must be <= hi-1. Rearranging, + # n <= (hi - lo - 1)/step + 1, so taking the floor of the RHS gives + # the proper value. Since lo < hi in this case, hi-lo-1 >= 0, so + # the RHS is non-negative and so truncation is the same as the + # floor. Letting M be the largest positive long, the worst case + # for the RHS numerator is hi=M, lo=-M-1, and then + # hi-lo-1 = M-(-M-1)-1 = 2*M. Therefore unsigned long has enough + # precision to compute the RHS exactly. + + # slight modification: we raise on everything bad and also adjust args + if step == 0: + raise ValueError + elif step < 0: + lo, hi, step = hi, lo, -step + if lo < hi: + uhi = r_uint(hi) + ulo = r_uint(lo) + diff = uhi - ulo - 1 + n = intmask(diff // r_uint(step) + 1) + if n < 0: + raise OverflowError + else: + n = 0 + return n + +def range(space, w_x, w_y=None, w_step=1): + """ returns a list of integers in arithmetic position from start (defaults + to zero) to stop - 1 by step (defaults to 1). Use a negative step to + get a list in decending order.""" + + try: + # save duplication by redirecting every error to applevel + x = space.int_w(w_x) + if w_y is space.w_None: + start, stop = 0, x + else: + start, stop = x, space.int_w(w_y) + step = space.int_w(w_step) + howmany = get_len_of_range(start, stop, step) + except (OperationError, ValueError, OverflowError): + return range_fallback(space, w_x, w_y, w_step) + + res_w = [None] * howmany + v = start + for idx in range(howmany): + res_w[idx] = space.wrap(v) + v += step + return space.newlist(res_w) +range_int = range +range_int.unwrap_spec = [ObjSpace, W_Root, W_Root, W_Root] +del range # don't hide the builtin one + +range_fallback = applevel(getsource(app_range), getfile(app_range) + ).interphook('range') From arigo at codespeak.net Wed Sep 21 20:48:57 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Wed, 21 Sep 2005 20:48:57 +0200 (CEST) Subject: [pypy-svn] r17744 - pypy/dist/pypy/translator/tool Message-ID: <20050921184857.756F127B5C@code1.codespeak.net> Author: arigo Date: Wed Sep 21 20:48:55 2005 New Revision: 17744 Modified: pypy/dist/pypy/translator/tool/make_dot.py Log: Accept \n characters in the labels and pass them to 'dot'. 
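As a quick illustration of what the one-line change below produces (generic Python, not part of the patch): both quotes and newlines have to be escaped before a label is embedded in a dot attribute string.

    label = 'entry block\n"fast" path'
    escaped = label.replace('"', '\\"').replace('\n', '\\n')
    line = 'node1 [label="%s"];' % escaped
    # line is now: node1 [label="entry block\n\"fast\" path"];
    # so 'dot' receives a single-line attribute and renders the \n itself
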
Modified: pypy/dist/pypy/translator/tool/make_dot.py ============================================================================== --- pypy/dist/pypy/translator/tool/make_dot.py (original) +++ pypy/dist/pypy/translator/tool/make_dot.py Wed Sep 21 20:48:55 2005 @@ -56,7 +56,7 @@ weight="5", ): d = locals() - attrs = [('%s="%s"' % (x, d[x].replace('"', '\\"'))) + attrs = [('%s="%s"' % (x, d[x].replace('"', '\\"').replace('\n', '\\n'))) for x in ['label', 'style', 'color', 'dir', 'weight']] self.emit('edge [%s];' % ", ".join(attrs)) self.emit('%s -> %s' % (name1, name2)) @@ -69,7 +69,7 @@ style="filled", ): d = locals() - attrs = [('%s="%s"' % (x, d[x].replace('"', '\\"'))) + attrs = [('%s="%s"' % (x, d[x].replace('"', '\\"').replace('\n', '\\n'))) for x in ['shape', 'label', 'color', 'fillcolor', 'style']] self.emit('%s [%s];' % (name, ", ".join(attrs))) From arigo at codespeak.net Wed Sep 21 21:00:44 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Wed, 21 Sep 2005 21:00:44 +0200 (CEST) Subject: [pypy-svn] r17745 - in pypy/dist/pypy: doc interpreter module/__builtin__ objspace objspace/std Message-ID: <20050921190044.23DB627B5D@code1.codespeak.net> Author: arigo Date: Wed Sep 21 21:00:39 2005 New Revision: 17745 Modified: pypy/dist/pypy/doc/coding-guide.txt pypy/dist/pypy/interpreter/function.py pypy/dist/pypy/interpreter/typedef.py pypy/dist/pypy/module/__builtin__/functional.py pypy/dist/pypy/objspace/descroperation.py pypy/dist/pypy/objspace/std/fake.py Log: * removed all remaining 'if w_obj == space.w_None'. This is definitely something that breaks the thunk space, so it is better made illegal. Removed from the coding guide too. * related details in descroperation.is_true(): trying to make the logic safe against probably uncommon patterns of usages in object spaces like the thunk. Modified: pypy/dist/pypy/doc/coding-guide.txt ============================================================================== --- pypy/dist/pypy/doc/coding-guide.txt (original) +++ pypy/dist/pypy/doc/coding-guide.txt Wed Sep 21 21:00:39 2005 @@ -436,11 +436,7 @@ writing ``if w_x:``) will give you an error reminding you of the problem. -* ``w_x == w_y`` or ``w_x is w_y``: DON'T DO THAT. The only - half-official exception is to check if ``w_x`` contains a - wrapped ``None``: you can write ``w_x == space.w_None``. - Follow this rule; the case of ``None`` is easy to fix - globally later if we find out that we need to. The +* ``w_x == w_y`` or ``w_x is w_y``: DON'T DO THAT. The rationale for this rule is that there is no reason that two wrappers are related in any way even if they contain what looks like the same object at application-level. 
To check Modified: pypy/dist/pypy/interpreter/function.py ============================================================================== --- pypy/dist/pypy/interpreter/function.py (original) +++ pypy/dist/pypy/interpreter/function.py Wed Sep 21 21:00:39 2005 @@ -176,8 +176,6 @@ not space.is_w(w_obj, space.w_None) or space.is_w(w_cls, space.type(space.w_None))) if asking_for_bound: - #if w_cls == space.w_None: - # w_cls = space.type(w_obj) return space.wrap(Method(space, w_function, w_obj, w_cls)) else: return space.wrap(Method(space, w_function, None, w_cls)) @@ -249,8 +247,6 @@ return space.wrap(self) # already bound else: # only allow binding to a more specific class than before - #if w_cls == space.w_None: - # w_cls = space.type(w_obj) if (w_cls is not None and not space.is_w(w_cls, space.w_None) and not space.is_true(space.abstract_issubclass(w_cls, self.w_class))): Modified: pypy/dist/pypy/interpreter/typedef.py ============================================================================== --- pypy/dist/pypy/interpreter/typedef.py (original) +++ pypy/dist/pypy/interpreter/typedef.py Wed Sep 21 21:00:39 2005 @@ -202,7 +202,8 @@ """property.__get__(obj[, type]) -> value Read the value of the property of the given obj.""" # XXX HAAAAAAAAAAAACK (but possibly a good one) - if w_obj == space.w_None and not space.is_w(w_cls, space.type(space.w_None)): + if (space.is_w(w_obj, space.w_None) + and not space.is_w(w_cls, space.type(space.w_None))): #print property, w_obj, w_cls return space.wrap(property) else: Modified: pypy/dist/pypy/module/__builtin__/functional.py ============================================================================== --- pypy/dist/pypy/module/__builtin__/functional.py (original) +++ pypy/dist/pypy/module/__builtin__/functional.py Wed Sep 21 21:00:39 2005 @@ -71,7 +71,7 @@ try: # save duplication by redirecting every error to applevel x = space.int_w(w_x) - if w_y is space.w_None: + if space.is_w(w_y, space.w_None): start, stop = 0, x else: start, stop = x, space.int_w(w_y) Modified: pypy/dist/pypy/objspace/descroperation.py ============================================================================== --- pypy/dist/pypy/objspace/descroperation.py (original) +++ pypy/dist/pypy/objspace/descroperation.py Wed Sep 21 21:00:39 2005 @@ -168,11 +168,12 @@ return space.get_and_call_function(w_descr, w_obj, w_name) def is_true(space, w_obj): - if w_obj == space.w_False: + # first a few shortcuts for performance + if w_obj is space.w_False: return False - if w_obj == space.w_True: + if w_obj is space.w_True: return True - if w_obj == space.w_None: + if w_obj is space.w_None: return False w_descr = space.lookup(w_obj, '__nonzero__') if w_descr is None: @@ -180,10 +181,15 @@ if w_descr is None: return True w_res = space.get_and_call_function(w_descr, w_obj) + # more shortcuts for common cases + if w_res is space.w_False: + return False + if w_res is space.w_True: + return True w_restype = space.type(w_res) if (space.is_w(w_restype, space.w_bool) or space.is_w(w_restype, space.w_int)): - return space.is_true(w_res) + return space.int_w(w_res) != 0 else: raise OperationError(space.w_TypeError, space.wrap('__nonzero__ should return ' Modified: pypy/dist/pypy/objspace/std/fake.py ============================================================================== --- pypy/dist/pypy/objspace/std/fake.py (original) +++ pypy/dist/pypy/objspace/std/fake.py Wed Sep 21 21:00:39 2005 @@ -203,7 +203,8 @@ def descr_descriptor_get(space, descr, w_obj, w_cls=None): # XXX HAAAAAAAAAAAACK (but possibly 
a good one) - if w_obj == space.w_None and not space.is_w(w_cls, space.type(space.w_None)): + if (space.is_w(w_obj, space.w_None) + and not space.is_w(w_cls, space.type(space.w_None))): #print descr, w_obj, w_cls return space.wrap(descr) else: From ac at codespeak.net Wed Sep 21 23:02:04 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Wed, 21 Sep 2005 23:02:04 +0200 (CEST) Subject: [pypy-svn] r17747 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050921210204.9388B27B5D@code1.codespeak.net> Author: ac Date: Wed Sep 21 23:02:04 2005 New Revision: 17747 Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Log: Fix typo. Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Wed Sep 21 23:02:04 2005 @@ -1238,7 +1238,7 @@ elif isinstance(arg, ast.AssTuple): for name in arg.getArgNames(): if name in argnames: - raise SyntaxError("duplicate argument '%s' in function definition" % arg.name) + raise SyntaxError("duplicate argument '%s' in function definition" % name) argnames[name] = 1 if 'None' in argnames: raise SyntaxError('assignment to None is not allowed') From ac at codespeak.net Thu Sep 22 00:26:20 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Thu, 22 Sep 2005 00:26:20 +0200 (CEST) Subject: [pypy-svn] r17750 - pypy/dist/pypy/interpreter/pyparser Message-ID: <20050921222620.922D427B86@code1.codespeak.net> Author: ac Date: Thu Sep 22 00:26:20 2005 New Revision: 17750 Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py Log: Fix some untranslatable code. Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Thu Sep 22 00:26:20 2005 @@ -376,9 +376,11 @@ lineno, 0, '') elif isinstance(ast_node, ast.CallFunc): raise ParseError("can't assign to function call", - lineno, 0, '') + lineno, 0, '') else: - raise ASTError("cannot assign to %s" % ast_node, ast_node) + raise ParseError("can't assign to non-lvalue", + lineno, 0, '') + # raise ASTError("cannot assign to %s" % ast_node, ast_node) def is_augassign( ast_node ): if ( isinstance( ast_node, ast.Name ) or From ac at codespeak.net Thu Sep 22 10:27:09 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Thu, 22 Sep 2005 10:27:09 +0200 (CEST) Subject: [pypy-svn] r17759 - pypy/dist/pypy/interpreter/pyparser Message-ID: <20050922082709.1E4F727B86@code1.codespeak.net> Author: ac Date: Thu Sep 22 10:27:09 2005 New Revision: 17759 Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py Log: Make test_syntax pass with astcompiler. 
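For context, a generic check (not part of the patch) showing the two rejected forms the change below tells apart; the patch makes the astcompiler report a delete-specific message for the second one.

    for src in ("f() = 1", "del f()"):
        try:
            compile(src, "<test>", "exec")
            raise AssertionError("%r should have been rejected" % src)
        except SyntaxError:
            pass   # rejected, with an assign- or delete-specific message
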
Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Thu Sep 22 10:27:09 2005 @@ -375,8 +375,12 @@ raise ParseError("can't assign to list comprehension", lineno, 0, '') elif isinstance(ast_node, ast.CallFunc): - raise ParseError("can't assign to function call", - lineno, 0, '') + if flags == consts.OP_DELETE: + raise ParseError("can't delete function call", + lineno, 0, '') + else: + raise ParseError("can't assign to function call", + lineno, 0, '') else: raise ParseError("can't assign to non-lvalue", lineno, 0, '') From ac at codespeak.net Thu Sep 22 10:48:19 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Thu, 22 Sep 2005 10:48:19 +0200 (CEST) Subject: [pypy-svn] r17760 - in pypy/dist/pypy/interpreter/pyparser: . test Message-ID: <20050922084819.CBC9C27B86@code1.codespeak.net> Author: ac Date: Thu Sep 22 10:48:19 2005 New Revision: 17760 Modified: pypy/dist/pypy/interpreter/pyparser/parsestring.py pypy/dist/pypy/interpreter/pyparser/test/test_parsestring.py Log: Fix bad quote quoting. Modified: pypy/dist/pypy/interpreter/pyparser/parsestring.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/parsestring.py (original) +++ pypy/dist/pypy/interpreter/pyparser/parsestring.py Thu Sep 22 10:48:19 2005 @@ -140,7 +140,7 @@ elif ch == "'": lis.append("'") elif ch == '"': - lis.append("'") + lis.append('"') elif ch == 'b': lis.append("\010") elif ch == 'f': Modified: pypy/dist/pypy/interpreter/pyparser/test/test_parsestring.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_parsestring.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_parsestring.py Thu Sep 22 10:48:19 2005 @@ -32,6 +32,15 @@ w_ret = parsestring.parsestr(space, None, s) assert space.str_w(w_ret) == chr(0xFF) + s = r'"\""' + w_ret = parsestring.parsestr(space, None, s) + assert space.str_w(w_ret) == '"' + + s = r"'\''" + w_ret = parsestring.parsestr(space, None, s) + assert space.str_w(w_ret) == "'" + + def test_unicode(self): space = self.space s = u'hello world' From ericvrp at codespeak.net Thu Sep 22 13:41:08 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Thu, 22 Sep 2005 13:41:08 +0200 (CEST) Subject: [pypy-svn] r17761 - pypy/dist/pypy/translator/goal Message-ID: <20050922114108.6967727B86@code1.codespeak.net> Author: ericvrp Date: Thu Sep 22 13:41:07 2005 New Revision: 17761 Added: pypy/dist/pypy/translator/goal/bench-cronjob.py (contents, props changed) Log: Script for generating ppy-c and pypy-llvm. This will be run nightly on the 'snake' server. Will need to decide where to put the results. Added: pypy/dist/pypy/translator/goal/bench-cronjob.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/goal/bench-cronjob.py Thu Sep 22 13:41:07 2005 @@ -0,0 +1,49 @@ +#! 
/usr/bin/env python + +import time, os, sys, stat + +homedir = os.environ['HOME'] + +def update_pypy(): + os.chdir(homedir + '/projects/pypy-dist') + os.system('svn up 2>&1') + +def update_llvm(): + os.chdir(homedir + '/projects/llvm') + os.system('cvs -q up 2>&1') + os.system('make tools-only 2>&1') + +def get_names(): + os.chdir(homedir + '/projects/pypy-dist') + try: + revision = os.popen('svn info 2>&1').readlines()[3].split()[1] + except: + revision = 'unknown' + basename = homedir + '/projects/pypy-dist/pypy/translator/goal/' + 'pypy-' + backend + realname = basename+'-'+revision() + return basename, realname + +def compile(backend): + os.chdir(homedir + '/projects/pypy-dist/pypy/translator/goal') + os.system('python translate_pypy_new.py targetpypystandalone --backend=%(backend)s --gc=boehm --pygame --batch -r 2>&1' % locals()) + basename, realname = get_names() + os.open(realname, 'wb').write( open(basename).read() ) + os.chmod(realname, stat.S_IRWXU) + os.unlink(basename) + +def benchmark(): + os.chdir(homedir + '/projects/pypy-dist/pypy/translator/goal') + os.system('python bench-unix.py 2>&1' % locals()) + +def main(): + print time.ctime() + update_pypy() + update_llvm() + for backend in 'c llvm'.split(): + compile(backend) + print time.ctime() + print 80*'-' + +if __name__ == '__main__': + #main() + benchmark() From ericvrp at codespeak.net Thu Sep 22 13:46:30 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Thu, 22 Sep 2005 13:46:30 +0200 (CEST) Subject: [pypy-svn] r17762 - in pypy/dist/pypy/translator/llvm: . module Message-ID: <20050922114630.7C69D27B96@code1.codespeak.net> Author: ericvrp Date: Thu Sep 22 13:46:29 2005 New Revision: 17762 Modified: pypy/dist/pypy/translator/llvm/arraynode.py pypy/dist/pypy/translator/llvm/module/support.py pypy/dist/pypy/translator/llvm/structnode.py Log: Changed required because some initiate_...Error functions no longer automaticly appear. (I think this is because of svn r17718) Now using the PBC instead of a 'fresh' exception. probably a little faster too. 
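The "PBC instead of a 'fresh' exception" change is easiest to see by analogy. The following is only a rough Python sketch of the idea (the real change, in the diff below, happens in the generated LLVM code): raising reuses one prebuilt constant instance instead of allocating a new one each time.

    class LLZeroDivisionError(Exception):     # illustrative name
        pass

    _prebuilt_zde = LLZeroDivisionError()     # the prebuilt constant ("PBC")

    def prepare_ZeroDivisionError_old():
        return LLZeroDivisionError()          # before: instantiate each time

    def prepare_ZeroDivisionError_new():
        return _prebuilt_zde                  # after: reuse the constant
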
Modified: pypy/dist/pypy/translator/llvm/arraynode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/arraynode.py (original) +++ pypy/dist/pypy/translator/llvm/arraynode.py Thu Sep 22 13:46:29 2005 @@ -85,7 +85,9 @@ self.db = db self.value = value self.arraytype = lltype.typeOf(value).OF - self.ref = self.make_ref('%arrayinstance', '') + prefix = '%arrayinstance' + name = '' #str(value).split()[1] + self.ref = self.make_ref(prefix, name) def __str__(self): return "" % (self.ref,) @@ -186,7 +188,9 @@ assert isinstance(lltype.typeOf(value), lltype.Array) self.db = db self.value = value - self.ref = self.make_ref('%arrayinstance', '') + prefix = '%arrayinstance' + name = '' #str(value).split()[1] + self.ref = self.make_ref(prefix, name) def constantvalue(self): return "{ %s } {%s %s}" % (self.db.get_machine_word(), Modified: pypy/dist/pypy/translator/llvm/module/support.py ============================================================================== --- pypy/dist/pypy/translator/llvm/module/support.py (original) +++ pypy/dist/pypy/translator/llvm/module/support.py Thu Sep 22 13:46:29 2005 @@ -78,7 +78,7 @@ for exc in "ZeroDivisionError OverflowError ValueError".split(): #_ZER _OVF _VAL extfunctions["%%prepare_%(exc)s" % locals()] = ((), """ internal fastcc void %%prepare_%(exc)s() { - %%exception_value = call fastcc %%RPYTHON_EXCEPTION* %%pypy_instantiate_%(exc)s() + %%exception_value = cast %%structtype.%(exc)s* %%structinstance.%(exc)s to %%RPYTHON_EXCEPTION* %%tmp = getelementptr %%RPYTHON_EXCEPTION* %%exception_value, int 0, uint 0 %%exception_type = load %%RPYTHON_EXCEPTION_VTABLE** %%tmp store %%RPYTHON_EXCEPTION_VTABLE* %%exception_type, %%RPYTHON_EXCEPTION_VTABLE** %%last_exception_type @@ -92,7 +92,7 @@ for exc in "IOError ZeroDivisionError OverflowError ValueError".split(): #_ZER _OVF _VAL extfunctions["%%raisePyExc_%(exc)s" % locals()] = ((), """ internal fastcc void %%raisePyExc_%(exc)s(sbyte* %%msg) { - %%exception_value = call fastcc %%RPYTHON_EXCEPTION* %%pypy_instantiate_%(exc)s() + %%exception_value = cast %%structtype.%(exc)s* %%structinstance.%(exc)s to %%RPYTHON_EXCEPTION* %%tmp = getelementptr %%RPYTHON_EXCEPTION* %%exception_value, int 0, uint 0 %%exception_type = load %%RPYTHON_EXCEPTION_VTABLE** %%tmp store %%RPYTHON_EXCEPTION_VTABLE* %%exception_type, %%RPYTHON_EXCEPTION_VTABLE** %%last_exception_type @@ -109,12 +109,13 @@ br bool %%cond, label %%is_0, label %%is_not_0 is_0: call fastcc void %%prepare_ZeroDivisionError() - br label %%is_not_0 ; XXX unwind ; (2) + ret %s 0 ; XXX unwind ; (2) + ;br label %%is_not_0 ; XXX unwind ; (2) is_not_0: """ -int_zer_test = zer_test % ('int',) -double_zer_test = zer_test % ('double',) +int_zer_test = zer_test % ('int' ,'int') +double_zer_test = zer_test % ('double','double') #overflow: normal operation, ...if ((x) >= 0 || (x) != -(x)) OK else _OVF() @@ -142,7 +143,7 @@ func, inst = func_inst.split(':') for prefix_type_ in "int:int uint:uint".split(): prefix, type_ = prefix_type_.split(':') - type_zer_test = zer_test % type_ + type_zer_test = zer_test % (type_, type_) extfunctions["%%%(prefix)s_%(func)s" % locals()] = ((), """ internal fastcc %(type_)s %%%(prefix)s_%(func)s(%(type_)s %%x, %(type_)s %%y) { %(type_zer_test)s Modified: pypy/dist/pypy/translator/llvm/structnode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/structnode.py (original) +++ pypy/dist/pypy/translator/llvm/structnode.py Thu 
Sep 22 13:46:29 2005 @@ -91,7 +91,9 @@ self.db = db self.value = value self.structtype = self.value._TYPE - self.ref = self.make_ref('%structinstance', '') + prefix = '%structinstance.' + name = str(value).split()[1] + self.ref = self.make_ref(prefix, name) self._get_ref_cache = None self._get_types = self._compute_types() From ac at codespeak.net Thu Sep 22 17:06:47 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Thu, 22 Sep 2005 17:06:47 +0200 (CEST) Subject: [pypy-svn] r17765 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050922150647.932D227B93@code1.codespeak.net> Author: ac Date: Thu Sep 22 17:06:47 2005 New Revision: 17765 Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Log: Fix clobbering of variable. Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Thu Sep 22 17:06:47 2005 @@ -1236,10 +1236,10 @@ raise SyntaxError("duplicate argument '%s' in function definition" % arg.name) argnames[arg.name] = 1 elif isinstance(arg, ast.AssTuple): - for name in arg.getArgNames(): - if name in argnames: - raise SyntaxError("duplicate argument '%s' in function definition" % name) - argnames[name] = 1 + for argname in arg.getArgNames(): + if argname in argnames: + raise SyntaxError("duplicate argument '%s' in function definition" % argname) + argnames[argname] = 1 if 'None' in argnames: raise SyntaxError('assignment to None is not allowed') From arigo at codespeak.net Thu Sep 22 20:36:19 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 22 Sep 2005 20:36:19 +0200 (CEST) Subject: [pypy-svn] r17772 - in pypy/dist/pypy: interpreter objspace/std Message-ID: <20050922183619.0F87C27B8E@code1.codespeak.net> Author: arigo Date: Thu Sep 22 20:36:13 2005 New Revision: 17772 Modified: pypy/dist/pypy/interpreter/baseobjspace.py pypy/dist/pypy/interpreter/executioncontext.py pypy/dist/pypy/interpreter/pyopcode.py pypy/dist/pypy/objspace/std/objspace.py Log: Some forgotten variants of space.is_true(space.is_(...)). Modified: pypy/dist/pypy/interpreter/baseobjspace.py ============================================================================== --- pypy/dist/pypy/interpreter/baseobjspace.py (original) +++ pypy/dist/pypy/interpreter/baseobjspace.py Thu Sep 22 20:36:13 2005 @@ -441,7 +441,7 @@ while check_list: w_item = check_list.pop() # Match identical items. - if self.is_true(self.is_(w_exc_type, w_item)): + if self.is_w(w_exc_type, w_item): return True try: # Match subclasses. 
Modified: pypy/dist/pypy/interpreter/executioncontext.py ============================================================================== --- pypy/dist/pypy/interpreter/executioncontext.py (original) +++ pypy/dist/pypy/interpreter/executioncontext.py Thu Sep 22 20:36:13 2005 @@ -126,14 +126,14 @@ def settrace(self, w_func): """Set the global trace function.""" - if self.space.is_true(self.space.is_(w_func, self.space.w_None)): + if self.space.is_w(w_func, self.space.w_None): self.w_tracefunc = None else: self.w_tracefunc = w_func def setprofile(self, w_func): """Set the global trace function.""" - if self.space.is_true(self.space.is_(w_func, self.space.w_None)): + if self.space.is_w(w_func, self.space.w_None): self.w_profilefunc = None else: self.w_profilefunc = w_func Modified: pypy/dist/pypy/interpreter/pyopcode.py ============================================================================== --- pypy/dist/pypy/interpreter/pyopcode.py (original) +++ pypy/dist/pypy/interpreter/pyopcode.py Thu Sep 22 20:36:13 2005 @@ -357,8 +357,7 @@ f.space.gettypeobject(PyCode.typedef)) w_prog, w_globals, w_locals = f.space.unpacktuple(w_resulttuple, 3) - plain = (f.w_locals is not None and - f.space.is_true(f.space.is_(w_locals, f.w_locals))) + plain = f.w_locals is not None and f.space.is_w(w_locals, f.w_locals) if plain: w_locals = f.getdictscope() pycode = f.space.interpclass_w(w_prog) Modified: pypy/dist/pypy/objspace/std/objspace.py ============================================================================== --- pypy/dist/pypy/objspace/std/objspace.py (original) +++ pypy/dist/pypy/objspace/std/objspace.py Thu Sep 22 20:36:13 2005 @@ -364,7 +364,7 @@ """Allocate the memory needed for an instance of an internal or user-defined type, without actually __init__ializing the instance.""" w_type = self.gettypeobject(cls.typedef) - if self.is_true(self.is_(w_type, w_subtype)): + if self.is_w(w_type, w_subtype): instance = instantiate(cls) else: w_subtype = w_type.check_user_subclass(w_subtype) From tismer at codespeak.net Fri Sep 23 03:02:11 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Fri, 23 Sep 2005 03:02:11 +0200 (CEST) Subject: [pypy-svn] r17774 - pypy/dist/pypy/translator/goal Message-ID: <20050923010211.811C427B95@code1.codespeak.net> Author: tismer Date: Fri Sep 23 03:02:09 2005 New Revision: 17774 Modified: pypy/dist/pypy/translator/goal/bench-windows.py Log: reworked bench-windows quite a bit. A local file is maintained to save computation. The reference python exe can be specified in the file 'bench_windows_exe.txt' The current state of the benchmarks is saved in the file 'bench_windows.dump'. Entries found there are not run again. Identity is obtained via MD5. The current patch does include Samuele's patch, it is just not committed as a new check-in! Modified: pypy/dist/pypy/translator/goal/bench-windows.py ============================================================================== --- pypy/dist/pypy/translator/goal/bench-windows.py (original) +++ pypy/dist/pypy/translator/goal/bench-windows.py Fri Sep 23 03:02:09 2005 @@ -2,28 +2,39 @@ # to be executed in the goal folder, # where a couple of .exe files is expected. +USE_HIGH_PRIORITY = True +# usage with high priority: +# the program will try to import subprocess. 
+# you can have this with python older than 2.4: copy +# subprocess into lib and change line 392 to use win32 + current_result = """ -executable abs.richards abs.pystone rel.rich rel.pystone -pypy-c-17439-hi 35135 ms 674.191 42.4 60.7 -pypy-c-17439-lo 36062 ms 972.900 43.6 42.1 -pypy-c-17600-lo 26357 ms 905.379 31.8 45.2 -pypy-c-17634-lo 20098 ms 1016.890 24.3 40.3 -pypy-c-17649-lo 22637 ms 1041.480 27.3 39.3 -pypy-c-17674-hi 15812 ms 2114.430 19.1 19.4 -pypy-c-17674-lo 19253 ms 1356.470 23.3 30.2 -pypy-c-17707-hi-range 14265 ms 2906.260 17.2 14.1 -pypy-c-17707-hi 14105 ms 2120.210 17.0 19.3 -pypy-c-17707-lo-range 18701 ms 2834.690 22.6 14.4 -pypy-c-17707-lo 19042 ms 1357.690 23.0 30.2 -python 2.3.3 828 ms 40934.500 1.0 1.0 - -After implementing range at interp-level, results have changed -quite dramatically. Revision 17707 runs everywhere fastest -without the -t-lowmem option. This is probably different on machines -with less than 2 MB of L2-cache. +executable a.rich a.stone r.rich r.stone size +pypy-c-17439-hi 37413 678.4 48.2 61.6 5.65 +pypy-c-17600-lo 26352 906.2 33.9 46.1 6.43 +pypy-c-17634-lo 20108 1023.5 25.9 40.9 6.42 +pypy-c-17649-lo 22612 1042.0 29.1 40.1 6.41 +pypy-c-17674-lo 19248 1358.8 24.8 30.8 6.40 +pypy-c-17674-hi 12402 1941.4 16.0 21.5 7.37 +pypy-c-17439-lo 29638 971.4 38.1 43.0 6.49 +pypy-c-17707-hi 14095 2092.7 18.1 20.0 7.37 +pypy-c-17707-lo 19102 1354.7 24.6 30.9 6.40 +pypy-c-17707-lo-range 18786 2800.8 24.2 14.9 6.40 +pypy-c-17707-hi-range 13980 2899.9 18.0 14.4 7.38 +pypy-c-17743-hi 13944 2800.3 17.9 14.9 7.30 +pypy-c-17761-hi-samuele 13243 2983.3 17.0 14.0 7.69 +python 2.5a0 777 41812.1 1.0 1.0 0.96 + +This new version also shows the size of the plain executable. +Samuele's locality patch now has a nice impact of over 5 percent. +I had even expected a bit more, but fine, yeah! 
""" -import os, sys +import os, sys, pickle, md5 +try: + from subprocess import * +except ImportError: + Popen = None PYSTONE_CMD = 'from test import pystone;pystone.main(%s)' PYSTONE_PATTERN = 'This machine benchmarks at' @@ -45,14 +56,58 @@ print "done" return result -def run_pystone(executable='python', n=0): +def run_cmd_subprocess(cmd): + print "running", cmd + result = Popen(cmd, stdout=PIPE, creationflags=CREATIONFLAGS + ).communicate()[0] + print "done" + return result + +CREATIONFLAGS = 0 +if Popen: + run_cmd = run_cmd_subprocess + try: + import win32con, win32api + except ImportError: + pass + else: + if USE_HIGH_PRIORITY: + CREATIONFLAGS = win32con.HIGH_PRIORITY_CLASS + print "configured to run under high priority" + +BENCH_EXECONFIG = 'bench_windows_exe.txt' +bench_exe = None + +def reference(progname): + global bench_exe + if not bench_exe: + if os.path.exists(BENCH_EXECONFIG): + progname = file(BENCH_EXECONFIG).read().strip() + print "using %s instead of the system default" % progname + bench_exe = progname + return bench_exe + +def run_version_size(executable=reference('python'), *args): + ver, size = run_cmd('''%s -c "import sys,os;print sys.version.split()[0],\\ + os.path.getsize(sys.executable)"''' + % executable).split() + size = int(size) + try: + sys.dllhandle + except AttributeError: + pass + else: + size += os.path.getsize(win32api.GetModuleFileName(sys.dllhandle)) + return ver, size + +def run_pystone(executable=reference('python'), n=0): argstr = PYSTONE_CMD % (str(n) and n or '') txt = run_cmd('%s -c "%s"' % (executable, argstr)) res = get_result(txt, PYSTONE_PATTERN) print res return res -def run_richards(executable='python', n=20): +def run_richards(executable=reference('python'), n=20): argstr = RICHARDS_CMD % n txt = run_cmd('%s -c "%s"' % (executable, argstr)) res = get_result(txt, RICHARDS_PATTERN) @@ -64,31 +119,52 @@ exes.sort() return exes +STAT_FILE = 'bench_windows.dump' +def load_stats(statfile=STAT_FILE): + try: + dic = pickle.load(file(statfile, 'rb')) + except IOError: + dic = {} + return dic + +def save_stats(dic, statfile=STAT_FILE): + pickle.dump(dic, file(statfile, 'wb')) + HEADLINE = '''\ -executable abs.richards abs.pystone rel.rich rel.pystone''' +executable a.rich a.stone r.rich r.stone size''' FMT = '''\ -%-27s ''' + '%5d ms %9.3f ' + '%5.1f %5.1f' +%-27s ''' + '%5d %7.1f ' + '%5.1f %5.1f %5.2f' def main(): - import win32con, win32process - curr = win32process.GetCurrentProcess() - prio = win32con.HIGH_PRIORITY_CLASS - win32process.SetPriorityClass(curr, prio) - # unfortunately, the above doesn't help, because the process priority - # is not inherited by child process. We also cannot import WIn32 extensions - # right now, since PyPycanot handle extension modules. 
print 'getting the richards reference' ref_rich = run_richards() print 'getting the pystone reference' ref_stone = run_pystone() - res = [] + resdic = {} + prior = load_stats() for exe in get_executables(): exename = os.path.splitext(exe)[0] - res.append( (exename, run_richards(exe, 2), run_pystone(exe, 20000)) ) - res.append( ('python %s' % sys.version.split()[0], ref_rich, ref_stone) ) + mtime = os.path.getmtime(exe) + size = os.path.getsize(exe) + key = md5.new(file(exe,'rb').read()).digest() + if key in prior: + print 'skipped', exename + resdic[key] = prior[key][:2] + (exename, mtime, size) + else: + resdic[key] = (run_richards(exe, 2), run_pystone(exe, 20000), + exename, mtime, size) + prior[key] = resdic[key] # save result, temporarily + save_stats(prior) + save_stats(resdic) # save cleaned result + res = [ (mtime, exe, size, rich, stone) + for rich, stone, exe, mtime, size in resdic.values()] + version, size = run_version_size() + res.append( (9e9, 'python %s' % version, size, ref_rich, ref_stone) ) + res.sort() print HEADLINE - for exe, rich, stone in res: - print FMT % (exe, rich, stone, rich / ref_rich, ref_stone / stone) + for mtime, exe, size, rich, stone in res: + print FMT % (exe, rich, stone, rich / ref_rich, ref_stone / stone, + size / float(1024 * 1024)) if __name__ == '__main__': main() From tismer at codespeak.net Fri Sep 23 05:06:02 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Fri, 23 Sep 2005 05:06:02 +0200 (CEST) Subject: [pypy-svn] r17775 - in pypy/dist/pypy: annotation objspace/std translator Message-ID: <20050923030602.D9BD027B93@code1.codespeak.net> Author: tismer Date: Fri Sep 23 05:05:52 2005 New Revision: 17775 Modified: pypy/dist/pypy/annotation/bookkeeper.py pypy/dist/pypy/annotation/policy.py pypy/dist/pypy/objspace/std/objspace.py pypy/dist/pypy/translator/ann_override.py Log: changed the benchmark to use the latest Python for comparison. added Samuele's method caching patch. It has some very visible effect of over 5% for richards and over 6% for pystone. I would have anyway expected more effect. Maybe the patch can be extended to catch even more cases, maybe I misunderstood. Anyway, using 2.5a2 raised the fence for richards, again. I have to find out, why, again. 
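
The idea behind the caching patch, in rough outline: the result of a name lookup on a built-in (non-heap) type can be remembered, because such types never change once the interpreter is built, while user-defined heap types may still be mutated and therefore must be searched each time. What follows here is only a minimal, self-contained sketch of that idea, with hypothetical names and a plain dictionary cache; it is not the checked-in change itself, which instead specializes space.lookup() per constant attribute name at annotation time (see the ann_override.py diff below).

    # Simplified illustration only -- hypothetical names, not PyPy's real classes.
    def int_add(a, b):                      # stand-in for a built-in method
        return a + b

    class W_Type(object):
        def __init__(self, name, bases=(), attrs=None, is_heaptype=False):
            self.name = name
            self.bases = bases              # tuple of W_Type instances
            self.attrs = attrs or {}
            self.is_heaptype = is_heaptype  # user-defined types may be mutated
            self._lookup_cache = {}

        def lookup(self, attr):
            if not self.is_heaptype:        # immutable built-in type: cache it
                try:
                    return self._lookup_cache[attr]
                except KeyError:
                    res = self._lookup_cache[attr] = self._lookup_slow(attr)
                    return res
            return self._lookup_slow(attr)  # mutable heap type: search each time

        def _lookup_slow(self, attr):
            for t in (self,) + self.bases:  # linear search: the type, then bases
                if attr in t.attrs:
                    return t.attrs[attr]
            return None

    w_object = W_Type('object')
    w_int = W_Type('int', bases=(w_object,), attrs={'__add__': int_add})
    assert w_int.lookup('__add__') is int_add   # first call fills the cache
    assert w_int.lookup('__add__') is int_add   # second call is served from it

The actual patch reaches the same effect without a runtime dictionary: for every constant attribute name seen during annotation it generates a dedicated lookup_<attr> function and stores the looked-up value directly on the non-heap W_TypeObject as a cached_<attr> field, as the diff below shows.
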
Modified: pypy/dist/pypy/annotation/bookkeeper.py ============================================================================== --- pypy/dist/pypy/annotation/bookkeeper.py (original) +++ pypy/dist/pypy/annotation/bookkeeper.py Fri Sep 23 05:05:52 2005 @@ -383,6 +383,7 @@ if x not in self.seen_mutable: # avoid circular reflowing, # see for example test_circular_mutable_getattr self.seen_mutable[x] = True + self.annotator.policy.event(self, 'mutable', x) for attr in x.__dict__: clsdef.add_source_for_attribute(attr, x) # can trigger reflowing result = SomeInstance(clsdef) Modified: pypy/dist/pypy/annotation/policy.py ============================================================================== --- pypy/dist/pypy/annotation/policy.py (original) +++ pypy/dist/pypy/annotation/policy.py Fri Sep 23 05:05:52 2005 @@ -9,6 +9,9 @@ class BasicAnnotatorPolicy: allow_someobjects = True + + def event(pol, bookkeeper, what, *args): + pass def specialize(pol, bookkeeper, spaceop, func, args, mono): return None, None Modified: pypy/dist/pypy/objspace/std/objspace.py ============================================================================== --- pypy/dist/pypy/objspace/std/objspace.py (original) +++ pypy/dist/pypy/objspace/std/objspace.py Fri Sep 23 05:05:52 2005 @@ -356,9 +356,11 @@ def lookup(self, w_obj, name): w_type = w_obj.getclass(self) return w_type.lookup(name) + lookup._annspecialcase_ = 'specialize:lookup' def lookup_in_type_where(self, w_type, name): return w_type.lookup_where(name) + lookup_in_type_where._annspecialcase_ = 'specialize:lookup_in_type_where' def allocate_instance(self, cls, w_subtype): """Allocate the memory needed for an instance of an internal or Modified: pypy/dist/pypy/translator/ann_override.py ============================================================================== --- pypy/dist/pypy/translator/ann_override.py (original) +++ pypy/dist/pypy/translator/ann_override.py Fri Sep 23 05:05:52 2005 @@ -9,6 +9,11 @@ class PyPyAnnotatorPolicy(AnnotatorPolicy): allow_someobjects = False + def __init__(pol): + pol.lookups = {} + pol.lookups_where = {} + pol.pypytypes = {} + def override__wrap_exception_cls(pol, space, x): import pypy.objspace.std.typeobject as typeobject clsdef = getbookkeeper().getclassdef(typeobject.W_TypeObject) @@ -28,3 +33,104 @@ from pypy.interpreter import pycode clsdef = getbookkeeper().getclassdef(pycode.PyCode) return annmodel.SomeInstance(clsdef) + + def attach_lookup(pol, t, attr): + cached = "cached_%s" % attr + if not t.is_heaptype(): + setattr(t, cached, t.lookup(attr)) + return True + return False + + def attach_lookup_in_type_where(pol, t, attr): + cached = "cached_where_%s" % attr + if not t.is_heaptype(): + setattr(t, cached, t.lookup_where(attr)) + return True + return False + + def consider_lookup(pol, bookkeeper, attr): + assert attr not in pol.lookups + from pypy.objspace.std import typeobject + cached = "cached_%s" % attr + clsdef = bookkeeper.getclassdef(typeobject.W_TypeObject) + setattr(clsdef.cls, cached, None) + clsdef.add_source_for_attribute(cached, clsdef.cls, clsdef) + for t in pol.pypytypes: + if pol.attach_lookup(t, attr): + clsdef.add_source_for_attribute(cached, t) + src = CACHED_LOOKUP % {'attr': attr} + print src + d = {} + exec src in d + fn = d["lookup_%s" % attr] + pol.lookups[attr] = fn + + + def consider_lookup_in_type_where(pol, bookkeeper, attr): + assert attr not in pol.lookups_where + from pypy.objspace.std import typeobject + cached = "cached_where_%s" % attr + clsdef = 
bookkeeper.getclassdef(typeobject.W_TypeObject) + setattr(clsdef.cls, cached, (None, None)) + clsdef.add_source_for_attribute(cached, clsdef.cls, clsdef) + for t in pol.pypytypes: + if pol.attach_lookup_in_type_where(t, attr): + clsdef.add_source_for_attribute(cached, t) + src = CACHED_LOOKUP_IN_TYPE_WHERE % {'attr': attr} + print src + d = {} + exec src in d + fn = d["lookup_in_type_where_%s" % attr] + pol.lookups_where[attr] = fn + + def specialize__lookup(pol, bookkeeper, mod, spaceop, func, args, mono): + (s_space, s_obj, s_name), _ = args.unpack() + if s_name.is_constant(): + attr = s_name.const + if attr not in pol.lookups: + print "LOOKUP", attr + pol.consider_lookup(bookkeeper, attr) + return pol.lookups[attr], args + else: + pol.lookups[None] = True + return func, args + + def specialize__lookup_in_type_where(pol, bookkeeper, mod, spaceop, func, args, mono): + (s_space, s_obj, s_name), _ = args.unpack() + if s_name.is_constant(): + attr = s_name.const + if attr not in pol.lookups_where: + print "LOOKUP_IN_TYPE_WHERE", attr + pol.consider_lookup_in_type_where(bookkeeper, attr) + return pol.lookups_where[attr], args + else: + pol.lookups_where[None] = True + return func, args + + def event(pol, bookkeeper, what, x): + from pypy.objspace.std import typeobject + if isinstance(x, typeobject.W_TypeObject): + pol.pypytypes[x] = True + print "TYPE", x + for attr in pol.lookups: + if attr: + pol.attach_lookup(x, attr) + for attr in pol.lookups_where: + if attr: + pol.attach_lookup_in_type_where(x, attr) + return + +CACHED_LOOKUP = """ +def lookup_%(attr)s(space, w_obj, name): + w_type = w_obj.getclass(space) + if not w_type.is_heaptype(): + return w_type.cached_%(attr)s + return w_type.lookup("%(attr)s") +""" + +CACHED_LOOKUP_IN_TYPE_WHERE = """ +def lookup_in_type_where_%(attr)s(space, w_type, name): + if not w_type.is_heaptype(): + return w_type.cached_where_%(attr)s + return w_type.lookup_where("%(attr)s") +""" From arigo at codespeak.net Fri Sep 23 12:04:05 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 23 Sep 2005 12:04:05 +0200 (CEST) Subject: [pypy-svn] r17783 - in pypy/dist/pypy/objspace/flow: . test Message-ID: <20050923100405.4E4D627B93@code1.codespeak.net> Author: arigo Date: Fri Sep 23 12:04:04 2005 New Revision: 17783 Modified: pypy/dist/pypy/objspace/flow/model.py pypy/dist/pypy/objspace/flow/test/test_model.py Log: Further simplification of traverse(), the full flexibility of which is never used anywhere and the speed of which is crucial during translation. Modified: pypy/dist/pypy/objspace/flow/model.py ============================================================================== --- pypy/dist/pypy/objspace/flow/model.py (original) +++ pypy/dist/pypy/objspace/flow/model.py Fri Sep 23 12:04:04 2005 @@ -344,52 +344,65 @@ #_________________________________________________________ # a visitor for easy traversal of the above model -import inspect # for getmro +##import inspect # for getmro -class traverse: +##class traverse: - def __init__(self, visitor, functiongraph): - """ send the visitor over all (reachable) nodes. - the visitor needs to have either callable attributes 'visit_typename' - or otherwise is callable itself. - """ - self.visitor = visitor - self.visitor_cache = {} - self.seen = {} - self.visit(functiongraph) - - def visit(self, node): - if id(node) in self.seen: - return +## def __init__(self, visitor, functiongraph): +## """ send the visitor over all (reachable) nodes. 
+## the visitor needs to have either callable attributes 'visit_typename' +## or otherwise is callable itself. +## """ +## self.visitor = visitor +## self.visitor_cache = {} +## self.seen = {} +## self.visit(functiongraph) + +## def visit(self, node): +## if id(node) in self.seen: +## return + +## # do the visit +## cls = node.__class__ +## try: +## consume = self.visitor_cache[cls] +## except KeyError: +## for subclass in inspect.getmro(cls): +## consume = getattr(self.visitor, "visit_" + subclass.__name__, None) +## if consume: +## break +## else: +## consume = getattr(self.visitor, 'visit', self.visitor) + +## assert callable(consume), "visitor not found for %r on %r" % (cls, self.visitor) + +## self.visitor_cache[cls] = consume + +## self.seen[id(node)] = consume(node) + +## # recurse +## if isinstance(node, Block): +## for obj in node.exits: +## self.visit(obj) +## elif isinstance(node, Link): +## self.visit(node.target) +## elif isinstance(node, FunctionGraph): +## self.visit(node.startblock) +## else: +## raise ValueError, "could not dispatch %r" % cls + +def traverse(visit, functiongraph): + pending = [functiongraph.startblock] + seen = {id(functiongraph.startblock): True} + for block in pending: + visit(block) + for link in block.exits: + visit(link) + targetid = id(link.target) + if targetid not in seen: + pending.append(link.target) + seen[targetid] = True - # do the visit - cls = node.__class__ - try: - consume = self.visitor_cache[cls] - except KeyError: - for subclass in inspect.getmro(cls): - consume = getattr(self.visitor, "visit_" + subclass.__name__, None) - if consume: - break - else: - consume = getattr(self.visitor, 'visit', self.visitor) - - assert callable(consume), "visitor not found for %r on %r" % (cls, self.visitor) - - self.visitor_cache[cls] = consume - - self.seen[id(node)] = consume(node) - - # recurse - if isinstance(node, Block): - for obj in node.exits: - self.visit(obj) - elif isinstance(node, Link): - self.visit(node.target) - elif isinstance(node, FunctionGraph): - self.visit(node.startblock) - else: - raise ValueError, "could not dispatch %r" % cls def flatten(funcgraph): l = [] Modified: pypy/dist/pypy/objspace/flow/test/test_model.py ============================================================================== --- pypy/dist/pypy/objspace/flow/test/test_model.py (original) +++ pypy/dist/pypy/objspace/flow/test/test_model.py Fri Sep 23 12:04:04 2005 @@ -28,27 +28,27 @@ graph = self.getflow(self.simplefunc) assert all_operations(graph) == {'add': 1} - def test_class(self): - graph = self.getflow(self.simplefunc) +## def test_class(self): +## graph = self.getflow(self.simplefunc) - class MyVisitor: - def __init__(self): - self.blocks = [] - self.links = [] - - def visit_FunctionGraph(self, graph): - self.graph = graph - def visit_Block(self, block): - self.blocks.append(block) - def visit_Link(self, link): - self.links.append(link) - - v = MyVisitor() - traverse(v, graph) - #assert len(v.blocks) == 2 - #assert len(v.links) == 1 - assert v.graph == graph - assert v.links[0] == graph.startblock.exits[0] +## class MyVisitor: +## def __init__(self): +## self.blocks = [] +## self.links = [] + +## def visit_FunctionGraph(self, graph): +## self.graph = graph +## def visit_Block(self, block): +## self.blocks.append(block) +## def visit_Link(self, link): +## self.links.append(link) + +## v = MyVisitor() +## traverse(v, graph) +## #assert len(v.blocks) == 2 +## #assert len(v.links) == 1 +## assert v.graph == graph +## assert v.links[0] == 
graph.startblock.exits[0] ## def test_partial_class(self): ## graph = self.getflow(self.simplefunc) From ac at codespeak.net Fri Sep 23 12:42:40 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Fri, 23 Sep 2005 12:42:40 +0200 (CEST) Subject: [pypy-svn] r17784 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050923104240.C1F2727B80@code1.codespeak.net> Author: ac Date: Fri Sep 23 12:42:40 2005 New Revision: 17784 Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Log: Refactor processing of tuple-arguments. Modified: pypy/dist/pypy/interpreter/astcompiler/pyassem.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pyassem.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pyassem.py Fri Sep 23 12:42:40 2005 @@ -442,8 +442,7 @@ self.name = name self.filename = filename self.docstring = space.w_None - self.args = args # XXX - self.argcount = getArgCount(args) + self.argcount = len(args) self.klass = klass self.flags = 0 if optimized: @@ -463,21 +462,12 @@ # kinds of variables. self.closure = [] self.varnames = [] - for var in args: + for i in range(len(args)): + var = args[i] if isinstance(var, ast.AssName): - _name = var.name - assert isinstance(_name,str) - self.varnames.append( _name ) - elif isinstance(var, TupleArg): - _name = var.getName() - assert isinstance(_name,str) - self.varnames.append( _name ) + self.varnames.append(var.name ) elif isinstance(var, ast.AssTuple): - for n in var.flatten(): - assert isinstance(n, ast.AssName) - _name = n.name - assert isinstance(_name,str) - self.varnames.append( _name ) + self.varnames.append('.%d' % (2 * i)) self.stage = RAW self.orderedblocks = [] @@ -844,26 +834,6 @@ if opname[:4] == 'JUMP': return 1 -class TupleArg(ast.Node): - """Helper for marking func defs with nested tuples in arglist""" - def __init__(self, count, names): - self.count = count - self.names = names - def __repr__(self): - return "TupleArg(%s, %s)" % (self.count, self.names) - def getName(self): - return ".%d" % self.count - -def getArgCount(args): - argcount = len(args) - if args: - for arg in args: - if isinstance(arg, TupleArg): - numNames = len(arg.names.getArgNames()) - # numNames = len(misc.flatten(arg.names)) - argcount = argcount - numNames - return argcount - def twobyte(val): """Convert an int argument into high and low bytes""" assert isinstance(val,int) Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Fri Sep 23 12:42:40 2005 @@ -12,7 +12,6 @@ SC_FREE, SC_CELL, SC_DEFAULT from pypy.interpreter.astcompiler.consts import CO_VARARGS, CO_VARKEYWORDS, \ CO_NEWLOCALS, CO_NESTED, CO_GENERATOR, CO_GENERATOR_ALLOWED, CO_FUTURE_DIVISION -from pypy.interpreter.astcompiler.pyassem import TupleArg from pypy.interpreter.pyparser.error import SyntaxError # drop VERSION dependency since it the ast transformer for 2.4 doesn't work with 2.3 anyway @@ -1243,9 +1242,7 @@ if 'None' in argnames: raise SyntaxError('assignment to None is not allowed') - args, hasTupleArg = generateArgList(func.argnames) - - graph = pyassem.PyFlowGraph(space, name, func.filename, args, + graph = pyassem.PyFlowGraph(space, name, func.filename, func.argnames, optimized=self.localsfullyknown, newlocals=1) self.isLambda = isLambda @@ -1261,8 
+1258,7 @@ if func.kwargs: self.graph.setFlag(CO_VARKEYWORDS) self.set_lineno(func) - if hasTupleArg: - self.generateArgUnpack(func.argnames) + self.generateArgUnpack(func.argnames) def get_module(self): return self.module @@ -1361,23 +1357,6 @@ self.emitop_obj("LOAD_CONST", klass.doc) self.storeName('__doc__') -def generateArgList(arglist): - """Generate an arg list marking TupleArgs""" - args = [] - extra = [] - count = 0 - for i in range(len(arglist)): - elt = arglist[i] - if isinstance(elt, ast.AssName): - args.append(elt) - elif isinstance(elt, ast.AssTuple): - args.append(TupleArg(i * 2, elt)) - extra.extend(elt.getChildNodes()) - count = count + 1 - else: - raise ValueError( "unexpect argument type:" + str(elt) ) - return args + extra, count - def findOp(node): """Find the op (DELETE, LOAD, STORE) in an AssTuple tree""" v = OpFinder() From pedronis at codespeak.net Fri Sep 23 12:51:39 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 23 Sep 2005 12:51:39 +0200 (CEST) Subject: [pypy-svn] r17785 - in pypy/dist/pypy: objspace/std translator Message-ID: <20050923105139.BC1C627B95@code1.codespeak.net> Author: pedronis Date: Fri Sep 23 12:51:38 2005 New Revision: 17785 Modified: pypy/dist/pypy/objspace/std/objspace.py pypy/dist/pypy/translator/ann_override.py Log: avoid producing manye identical versions of wrap for the BaseWrappable case Modified: pypy/dist/pypy/objspace/std/objspace.py ============================================================================== --- pypy/dist/pypy/objspace/std/objspace.py (original) +++ pypy/dist/pypy/objspace/std/objspace.py Fri Sep 23 12:51:38 2005 @@ -289,7 +289,7 @@ from fake import fake_object return fake_object(self, x) - wrap._annspecialcase_ = "specialize:argtype1" + wrap._annspecialcase_ = "specialize:wrap" def wrap_exception_cls(self, x): """NOT_RPYTHON""" Modified: pypy/dist/pypy/translator/ann_override.py ============================================================================== --- pypy/dist/pypy/translator/ann_override.py (original) +++ pypy/dist/pypy/translator/ann_override.py Fri Sep 23 12:51:38 2005 @@ -34,6 +34,14 @@ clsdef = getbookkeeper().getclassdef(pycode.PyCode) return annmodel.SomeInstance(clsdef) + def specialize__wrap(pol, bookkeeper, mod, spaceop, func, args, mono): + from pypy.interpreter.baseobjspace import BaseWrappable + ignore, args_w = args.flatten() + typ = args_w[1].knowntype + if issubclass(typ, BaseWrappable): + typ = BaseWrappable + return (func, typ), args + def attach_lookup(pol, t, attr): cached = "cached_%s" % attr if not t.is_heaptype(): From ericvrp at codespeak.net Fri Sep 23 14:23:42 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 23 Sep 2005 14:23:42 +0200 (CEST) Subject: [pypy-svn] r17789 - pypy/dist/pypy/translator/goal Message-ID: <20050923122342.985B827B97@code1.codespeak.net> Author: ericvrp Date: Fri Sep 23 14:23:41 2005 New Revision: 17789 Modified: pypy/dist/pypy/translator/goal/bench-cronjob.py pypy/dist/pypy/translator/goal/run_pypy-llvm.sh pypy/dist/pypy/translator/goal/translate_pypy_new.py Log: Make boehm the default gc (instead of refcounting) I would like to propose the following further changes to the defaults: 1. no interactive helpers (--batch to run interactive helpers) 2. don't run the exe (it's called translate, not translate&run) 3. 
don't run pygame (same reason) Modified: pypy/dist/pypy/translator/goal/bench-cronjob.py ============================================================================== --- pypy/dist/pypy/translator/goal/bench-cronjob.py (original) +++ pypy/dist/pypy/translator/goal/bench-cronjob.py Fri Sep 23 14:23:41 2005 @@ -25,7 +25,7 @@ def compile(backend): os.chdir(homedir + '/projects/pypy-dist/pypy/translator/goal') - os.system('python translate_pypy_new.py targetpypystandalone --backend=%(backend)s --gc=boehm --pygame --batch -r 2>&1' % locals()) + os.system('python translate_pypy_new.py targetpypystandalone --backend=%(backend)s --pygame --batch -r 2>&1' % locals()) basename, realname = get_names() os.open(realname, 'wb').write( open(basename).read() ) os.chmod(realname, stat.S_IRWXU) Modified: pypy/dist/pypy/translator/goal/run_pypy-llvm.sh ============================================================================== --- pypy/dist/pypy/translator/goal/run_pypy-llvm.sh (original) +++ pypy/dist/pypy/translator/goal/run_pypy-llvm.sh Fri Sep 23 14:23:41 2005 @@ -4,7 +4,7 @@ #python translate_pypy.py -no-c -no-o -text -fork2 # running it all #python translate_pypy.py target_pypy-llvm -text -llvm $* -python translate_pypy_new.py targetpypystandalone --backend=llvm --gc=boehm --pygame --batch --fork=fork2 --lowmem $* +python translate_pypy_new.py targetpypystandalone --backend=llvm --pygame --batch --fork=fork2 $* # How to work in parallel: Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Fri Sep 23 14:23:41 2005 @@ -28,7 +28,7 @@ ['-s', '--save', "save translator to file", [str], '']], 'Codegeneration options':[ - ['-g', '--gc', 'Garbage collector', ['ref', 'boehm','none'], 'ref'], + ['-g', '--gc', 'Garbage collector', ['ref', 'boehm','none'], 'boehm'], ['-b', '--backend', 'Backend selector', ['c','llvm'],'c'], ['-w', '--gencode', "Don't generate code", [True,False], True], ['-c', '--compile', "Don't compile generated code", [True,False], True]], From ericvrp at codespeak.net Fri Sep 23 14:27:15 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 23 Sep 2005 14:27:15 +0200 (CEST) Subject: [pypy-svn] r17790 - in pypy/dist/pypy/translator: backendopt llvm llvm/module Message-ID: <20050923122715.9DAF927BA7@code1.codespeak.net> Author: ericvrp Date: Fri Sep 23 14:27:14 2005 New Revision: 17790 Modified: pypy/dist/pypy/translator/backendopt/exception.py pypy/dist/pypy/translator/llvm/build_llvm_module.py pypy/dist/pypy/translator/llvm/database.py pypy/dist/pypy/translator/llvm/exception.py pypy/dist/pypy/translator/llvm/funcnode.py pypy/dist/pypy/translator/llvm/gc.py pypy/dist/pypy/translator/llvm/genllvm.py pypy/dist/pypy/translator/llvm/module/support.py Log: * refactored exceptionpolicy.transform() to funcnode as speed opt. * use llvm-ld for creating standalone exe. No longer relying on gcc. 
* strip (and optionally upx) the standalone * some stats in exception direct_call transform Modified: pypy/dist/pypy/translator/backendopt/exception.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/exception.py (original) +++ pypy/dist/pypy/translator/backendopt/exception.py Fri Sep 23 14:27:14 2005 @@ -5,6 +5,8 @@ from pypy.rpython.lltype import Bool, Ptr +n_calls = n_calls_patched = 0 + def create_exception_handling(translator, graph): """After an exception in a direct_call, that is not catched by an explicit except statement, we need to reraise the exception. So after this @@ -12,6 +14,8 @@ from the current graph with an unused value (false/0/0.0/null). Because of the added exitswitch we need an additional block. """ + global n_calls, n_calls_patched + n_calls_begin = n_calls e = translator.rtyper.getexceptiondata() blocks = [x for x in flatten(graph) if isinstance(x, Block)] for block in blocks: @@ -22,9 +26,11 @@ op = block.operations[i] if op.opname != 'direct_call': continue + n_calls += 1 called_can_raise = True #XXX maybe we even want a list of possible exceptions if not called_can_raise: continue + n_calls_patched += 1 afterblock = split_block(translator, graph, block, i+1) @@ -52,3 +58,5 @@ l.prevblock = block l.exitcase = l.llexitcase = False block.exits.insert(0, l) #False case needs to go first + if n_calls != n_calls_begin: + print 'create_exception_handling: patched %d out of %d calls' % (n_calls_patched, n_calls) Modified: pypy/dist/pypy/translator/llvm/build_llvm_module.py ============================================================================== --- pypy/dist/pypy/translator/llvm/build_llvm_module.py (original) +++ pypy/dist/pypy/translator/llvm/build_llvm_module.py Fri Sep 23 14:27:14 2005 @@ -81,23 +81,34 @@ #ball = str(dirpath.join('%s_all.bc' % b)) #cmds.append("opt %s %s -f -o %s.bc" % (OPTIMIZATION_SWITCHES, ball, b)) - cmds = ["llvm-as < %s.ll | opt %s -f -o %s.bc" % (b, OPTIMIZATION_SWITCHES, b)] + use_gcc = False + profile = False - generate_s_file = False - if generate_s_file and sys.maxint == 2147483647: #32 bit platform - cmds.append("llc %s %s.bc -f -o %s.s" % (genllvm.exceptionpolicy.llc_options(), b, b)) - cmds.append("as %s.s -o %s.o" % (b, b)) + cmds = ["llvm-as < %s.ll | opt %s -f -o %s.bc" % (b, OPTIMIZATION_SWITCHES, b)] + if not use_gcc: if exe_name: - cmds.append("gcc %s.o %s -lm -ldl -pipe -o %s" % (b, gc_libs, exe_name)) - object_files.append("%s.o" % b) - else: #assume 64 bit platform (x86-64?) - #this special case for x86-64 (called ia64 in llvm) can go as soon as llc supports ia64 assembly output! 
+ cmds.append('llvm-ld %s.bc -native -O5 -l=gc -lm -l=dl -o %s' % (b, exe_name)) + else: + cmds.append("llc %s %s.bc -f -o %s.s" % (genllvm.exceptionpolicy.llc_options(), b, b)) + cmds.append("as %s.s -o %s.o" % (b, b)) + object_files.append("%s.o" % b) + else: cmds.append("llc %s %s.bc -march=c -f -o %s.c" % (genllvm.exceptionpolicy.llc_options(), b, b)) if exe_name: - #XXX TODO: use CFLAGS when available - cmds.append("gcc %s.c -c -O2 -fomit-frame-pointer -march=pentium4 -ffast-math -pipe" % (b,)) - cmds.append("gcc %s.o %s -lm -ldl -pipe -o %s" % (b, gc_libs, exe_name)) + cmd = "gcc %s.c -c -O3 -pipe" % b + if profile: + cmd += ' -pg' + cmds.append(cmd) + cmd = "gcc %s.o %s -lm -ldl -pipe -o %s" % (b, gc_libs, exe_name) + if profile: + cmd += ' -pg' + cmds.append(cmd) source_files.append("%s.c" % b) + if exe_name and not profile: + cmds.append('strip ' + exe_name) + upx = os.popen('which upx').read() + if upx: #compress file even further + cmds.append('upx ' + exe_name) try: if pyxfile: Modified: pypy/dist/pypy/translator/llvm/database.py ============================================================================== --- pypy/dist/pypy/translator/llvm/database.py (original) +++ pypy/dist/pypy/translator/llvm/database.py Fri Sep 23 14:27:14 2005 @@ -16,7 +16,8 @@ log = log.database class Database(object): - def __init__(self, translator): + def __init__(self, genllvm, translator): + self.genllvm = genllvm self.translator = translator self.obj2node = {} self._pendingsetup = [] Modified: pypy/dist/pypy/translator/llvm/exception.py ============================================================================== --- pypy/dist/pypy/translator/llvm/exception.py (original) +++ pypy/dist/pypy/translator/llvm/exception.py Fri Sep 23 14:27:14 2005 @@ -5,7 +5,7 @@ def __init__(self): raise Exception, 'ExceptionPolicy should not be used directly' - def transform(self, translator): + def transform(self, translator, graph=None): return def _noresult(self, returntype): @@ -28,8 +28,7 @@ return noresult def new(exceptionpolicy=None): #factory - if exceptionpolicy is None: - exceptionpolicy = 'fast' + exceptionpolicy = exceptionpolicy or 'fast' if exceptionpolicy == 'cpython': from pypy.translator.llvm.exception import CPythonExceptionPolicy exceptionpolicy = CPythonExceptionPolicy() @@ -166,11 +165,14 @@ } ''' % locals() - def transform(self, translator): + def transform(self, translator, graph=None): from pypy.translator.backendopt.exception import create_exception_handling - for graph in translator.flowgraphs.itervalues(): + if graph: create_exception_handling(translator, graph) - #translator.view() + else: + for graph in translator.flowgraphs.itervalues(): + create_exception_handling(translator, graph) + #translator.view() def invoke(self, codewriter, targetvar, tail_, cconv, returntype, functionref, args, label, except_label): if returntype == 'void': Modified: pypy/dist/pypy/translator/llvm/funcnode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/funcnode.py (original) +++ pypy/dist/pypy/translator/llvm/funcnode.py Fri Sep 23 14:27:14 2005 @@ -37,6 +37,7 @@ self.value = value self.ref = self.make_ref('%pypy_', value.graph.name) self.graph = value.graph + self.db.genllvm.exceptionpolicy.transform(self.db.translator, self.graph) remove_double_links(self.db.translator, self.graph) def __str__(self): @@ -206,4 +207,4 @@ codewriter.ret(inputargtype, inputarg) def write_exceptblock(self, codewriter, block): - 
codewriter.genllvm.exceptionpolicy.write_exceptblock(self, codewriter, block) + self.db.genllvm.exceptionpolicy.write_exceptblock(self, codewriter, block) Modified: pypy/dist/pypy/translator/llvm/gc.py ============================================================================== --- pypy/dist/pypy/translator/llvm/gc.py (original) +++ pypy/dist/pypy/translator/llvm/gc.py Fri Sep 23 14:27:14 2005 @@ -18,6 +18,7 @@ return '' def new(gcpolicy=None): #factory + gcpolicy = gcpolicy or 'boehm' if gcpolicy is None or gcpolicy == 'boehm': from os.path import exists boehm_on_path = exists('/usr/lib/libgc.so') or exists('/usr/lib/libgc.a') Modified: pypy/dist/pypy/translator/llvm/genllvm.py ============================================================================== --- pypy/dist/pypy/translator/llvm/genllvm.py (original) +++ pypy/dist/pypy/translator/llvm/genllvm.py Fri Sep 23 14:27:14 2005 @@ -35,13 +35,13 @@ # reset counters LLVMNode.nodename_count = {} - self.db = Database(translator) + self.db = Database(self, translator) self.translator = translator self.gcpolicy = gcpolicy self.exceptionpolicy = exceptionpolicy extfuncnode.ExternalFuncNode.used_external_functions = {} self.debug = debug # for debug we create comments of every operation that may be executed - exceptionpolicy.transform(translator) + #exceptionpolicy.transform(translator) #now done in FuncNode (optimization) if debug: translator.checkgraphs() Modified: pypy/dist/pypy/translator/llvm/module/support.py ============================================================================== --- pypy/dist/pypy/translator/llvm/module/support.py (original) +++ pypy/dist/pypy/translator/llvm/module/support.py Fri Sep 23 14:27:14 2005 @@ -110,7 +110,6 @@ is_0: call fastcc void %%prepare_ZeroDivisionError() ret %s 0 ; XXX unwind ; (2) - ;br label %%is_not_0 ; XXX unwind ; (2) is_not_0: """ From ericvrp at codespeak.net Fri Sep 23 14:37:32 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 23 Sep 2005 14:37:32 +0200 (CEST) Subject: [pypy-svn] r17791 - pypy/dist/pypy/translator/goal Message-ID: <20050923123732.9523427BAE@code1.codespeak.net> Author: ericvrp Date: Fri Sep 23 14:37:31 2005 New Revision: 17791 Modified: pypy/dist/pypy/translator/goal/bench-cronjob.py pypy/dist/pypy/translator/goal/bench-unix.py Log: minor stuff Modified: pypy/dist/pypy/translator/goal/bench-cronjob.py ============================================================================== --- pypy/dist/pypy/translator/goal/bench-cronjob.py (original) +++ pypy/dist/pypy/translator/goal/bench-cronjob.py Fri Sep 23 14:37:31 2005 @@ -45,5 +45,4 @@ print 80*'-' if __name__ == '__main__': - #main() - benchmark() + main() Modified: pypy/dist/pypy/translator/goal/bench-unix.py ============================================================================== --- pypy/dist/pypy/translator/goal/bench-unix.py (original) +++ pypy/dist/pypy/translator/goal/bench-unix.py Fri Sep 23 14:37:31 2005 @@ -55,7 +55,7 @@ ref_stone = run_pystone() res = [] for exe in get_executables(): - exename = os.path.splitext(exe)[0] + exename = os.path.splitext(exe)[0].lstrip('./') res.append( (exename, run_richards(exe, 1), run_pystone(exe, 2000)) ) res.append( ('python %s' % sys.version.split()[0], ref_rich, ref_stone) ) print HEADLINE From ericvrp at codespeak.net Fri Sep 23 14:45:15 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 23 Sep 2005 14:45:15 +0200 (CEST) Subject: [pypy-svn] r17792 - pypy/dist/pypy/translator/goal Message-ID: 
<20050923124515.B78D827BB3@code1.codespeak.net> Author: ericvrp Date: Fri Sep 23 14:45:15 2005 New Revision: 17792 Modified: pypy/dist/pypy/translator/goal/bench-cronjob.py Log: oops Modified: pypy/dist/pypy/translator/goal/bench-cronjob.py ============================================================================== --- pypy/dist/pypy/translator/goal/bench-cronjob.py (original) +++ pypy/dist/pypy/translator/goal/bench-cronjob.py Fri Sep 23 14:45:15 2005 @@ -32,6 +32,8 @@ os.unlink(basename) def benchmark(): + os.system('cat /proc/cpuinfo') + os.system('free') os.chdir(homedir + '/projects/pypy-dist/pypy/translator/goal') os.system('python bench-unix.py 2>&1' % locals()) @@ -39,8 +41,9 @@ print time.ctime() update_pypy() update_llvm() - for backend in 'c llvm'.split(): + for backend in 'llvm c'.split(): compile(backend) + benchmark() print time.ctime() print 80*'-' From arigo at codespeak.net Fri Sep 23 15:41:54 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 23 Sep 2005 15:41:54 +0200 (CEST) Subject: [pypy-svn] r17794 - in pypy/dist/pypy: interpreter module/sys module/sys/test Message-ID: <20050923134154.1D4E127BA7@code1.codespeak.net> Author: arigo Date: Fri Sep 23 15:41:45 2005 New Revision: 17794 Added: pypy/dist/pypy/module/sys/version.py (contents, props changed) Modified: pypy/dist/pypy/interpreter/mixedmodule.py pypy/dist/pypy/module/sys/__init__.py pypy/dist/pypy/module/sys/test/test_sysmodule.py Log: * the svn revision number in sys.pypy_version_info should be computed from the last checkin of the 'pypy' directory, not of the 'pypy/module/sys/__init__.py' file. * moved version-related numbers and algorithms to module/sys/version.py. * avoid an infinite loop in the super-Evil hack of mixedmodule.py Modified: pypy/dist/pypy/interpreter/mixedmodule.py ============================================================================== --- pypy/dist/pypy/interpreter/mixedmodule.py (original) +++ pypy/dist/pypy/interpreter/mixedmodule.py Fri Sep 23 15:41:45 2005 @@ -125,8 +125,9 @@ try: value = eval(spec, d) except NameError, ex: - #assert name not in d, "huh, am i looping?" name = ex.args[0].split("'")[1] # super-Evil + if name in d: + raise # propagate the NameError try: d[name] = __import__(pkgroot+'.'+name, None, None, [name]) except ImportError: Modified: pypy/dist/pypy/module/sys/__init__.py ============================================================================== --- pypy/dist/pypy/module/sys/__init__.py (original) +++ pypy/dist/pypy/module/sys/__init__.py Fri Sep 23 15:41:45 2005 @@ -48,13 +48,12 @@ 'executable' : 'space.wrap("py.py")', 'copyright' : 'space.wrap("MIT-License")', - 'api_version' : 'space.wrap(1012)', - 'version_info' : 'space.wrap((2,4,1, "alpha", 42))', - 'version' : 'space.wrap("2.4.1 (pypy 0.7.1 build)")', - 'pypy_version_info' : """space.wrap((0,7,1, "alpha", - int('$Revision$'[11:-1])))""", + 'api_version' : 'version.get_api_version(space)', + 'version_info' : 'version.get_version_info(space)', + 'version' : 'version.get_version(space)', + 'pypy_version_info' : 'version.get_pypy_version_info(space)', 'pypy_svn_url' : 'space.wrap("$HeadURL$"[10:-29])', - 'hexversion' : 'space.wrap(0x020401a0)', + 'hexversion' : 'version.get_hexversion(space)', 'ps1' : 'space.wrap(">>>> ")', 'ps2' : 'space.wrap(".... 
")', Modified: pypy/dist/pypy/module/sys/test/test_sysmodule.py ============================================================================== --- pypy/dist/pypy/module/sys/test/test_sysmodule.py (original) +++ pypy/dist/pypy/module/sys/test/test_sysmodule.py Fri Sep 23 15:41:45 2005 @@ -18,34 +18,6 @@ space.sys.get('__stdout__') class AppTestAppSysTests: - def test_path_exists(self): - import sys - assert hasattr(sys, 'path'), "sys.path gone missing" - def test_modules_exists(self): - import sys - assert hasattr(sys, 'modules'), "sys.modules gone missing" - def test_dict_exists(self): - import sys - assert hasattr(sys, '__dict__'), "sys.__dict__ gone missing" - def test_name_exists(self): - import sys - assert hasattr(sys, '__name__'), "sys.__name__ gone missing" - def test_builtin_module_names_exists(self): - import sys - assert hasattr(sys, 'builtin_module_names'), ( - "sys.builtin_module_names gone missing") - def test_warnoptions_exists(self): - import sys - assert hasattr(sys, 'warnoptions'), ( - "sys.warnoptions gone missing") - def test_hexversion_exists(self): - import sys - assert hasattr(sys, 'hexversion'), ( - "sys.hexversion gone missing") - def test_platform_exists(self): - import sys - assert hasattr(sys, 'platform'), "sys.platform gone missing" - def test_sys_in_modules(self): import sys modules = sys.modules @@ -97,15 +69,13 @@ raise AssertionError, "ZeroDivisionError not caught" def app_test_io(): - #space.appexec([], """(): - import sys - assert isinstance(sys.stdout, file) - assert isinstance(sys.__stdout__, file) - assert isinstance(sys.stderr, file) - assert isinstance(sys.__stderr__, file) - assert isinstance(sys.stdin, file) - assert isinstance(sys.__stdin__, file) - #""") + import sys + assert isinstance(sys.stdout, file) + assert isinstance(sys.__stdout__, file) + assert isinstance(sys.stderr, file) + assert isinstance(sys.__stderr__, file) + assert isinstance(sys.stdin, file) + assert isinstance(sys.__stdin__, file) class AppTestSysModulePortedFromCPython: @@ -338,6 +308,9 @@ #) def test_attributes(self): + assert sys.__name__ == 'sys' + assert isinstance(sys.modules, dict) + assert isinstance(sys.path, list) assert isinstance(sys.api_version, int) assert isinstance(sys.argv, list) assert sys.byteorder in ("little", "big") @@ -351,6 +324,7 @@ assert isinstance(sys.platform, basestring) assert isinstance(sys.prefix, basestring) assert isinstance(sys.version, basestring) + assert isinstance(sys.warnoptions, list) vi = sys.version_info assert isinstance(vi, tuple) assert len(vi) == 5 @@ -373,3 +347,18 @@ finally: sys.settrace(None) assert len(counts) == 1 + + def test_pypy_attributes(self): + assert isinstance(sys.pypy_objspaceclass, str) + assert isinstance(sys.pypy_svn_url, str) + vi = sys.pypy_version_info + assert isinstance(vi, tuple) + assert len(vi) == 5 + assert isinstance(vi[0], int) + assert isinstance(vi[1], int) + assert isinstance(vi[2], int) + assert vi[3] in ("alpha", "beta", "candidate", "final") + assert isinstance(vi[4], int) or vi[4] == '?' + + def test_allattributes(self): + sys.__dict__ # check that we don't crash initializing any attribute Added: pypy/dist/pypy/module/sys/version.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/module/sys/version.py Fri Sep 23 15:41:45 2005 @@ -0,0 +1,70 @@ +""" +Version numbers exposed by PyPy through the 'sys' module. 
+""" +import os +from pypy.interpreter import autopath + + +CPYTHON_VERSION = (2, 4, 1, "alpha", 42) +CPYTHON_API_VERSION = 1012 + +PYPY_VERSION = (0, 7, 1, "alpha", '?') +# the last item is replaced by the svn revision ^^^ + + +# ____________________________________________________________ + +def get_api_version(space): + return space.wrap(CPYTHON_API_VERSION) + +def get_version_info(space): + return space.wrap(CPYTHON_VERSION) + +def get_version(space): + return space.wrap("%d.%d.%d (pypy %d.%d.%d build)" % ( + CPYTHON_VERSION[0], + CPYTHON_VERSION[1], + CPYTHON_VERSION[2], + PYPY_VERSION[0], + PYPY_VERSION[1], + PYPY_VERSION[2])) + +def get_hexversion(space): + return space.wrap(tuple2hex(CPYTHON_VERSION)) + +def get_pypy_version_info(space): + ver = PYPY_VERSION + ver = ver[:-1] + (svn_revision(),) + return space.wrap(ver) + +def tuple2hex(ver): + d = {'alpha': 0xA, + 'beta': 0xB, + 'candidate': 0xC, + 'final': 0xF, + } + subver = ver[4] + if not (0 <= subver <= 9): + subver = 0 + return (ver[0] << 24 | + ver[1] << 16 | + ver[2] << 8 | + d[ver[3]] << 4 | + subver) + +def svn_revision(): + "Return the last-changed svn revision number." + # NB. we hack the number directly out of the .svn directory to avoid + # to depend on an external 'svn' executable in the path. + rev = '?' + try: + f = open(os.path.join(autopath.pypydir, '.svn', 'entries'), 'r') + for line in f: + line = line.strip() + if line.startswith('committed-rev="') and line.endswith('"'): + rev = int(line[15:-1]) + break + f.close() + except (IOError, OSError): + pass + return rev From ericvrp at codespeak.net Fri Sep 23 17:08:45 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 23 Sep 2005 17:08:45 +0200 (CEST) Subject: [pypy-svn] r17795 - pypy/dist/pypy/translator/llvm Message-ID: <20050923150845.84FAC27BB2@code1.codespeak.net> Author: ericvrp Date: Fri Sep 23 17:08:44 2005 New Revision: 17795 Modified: pypy/dist/pypy/translator/llvm/gc.py Log: relax gcpolicy a little Modified: pypy/dist/pypy/translator/llvm/gc.py ============================================================================== --- pypy/dist/pypy/translator/llvm/gc.py (original) +++ pypy/dist/pypy/translator/llvm/gc.py Fri Sep 23 17:08:44 2005 @@ -19,11 +19,14 @@ def new(gcpolicy=None): #factory gcpolicy = gcpolicy or 'boehm' - if gcpolicy is None or gcpolicy == 'boehm': - from os.path import exists - boehm_on_path = exists('/usr/lib/libgc.so') or exists('/usr/lib/libgc.a') - if not boehm_on_path: - raise Exception, 'Boehm GC libary not found in /usr/lib' + + from os.path import exists + boehm_on_path = exists('/usr/lib/libgc.so') or exists('/usr/lib/libgc.a') + if gcpolicy == 'boehm' and not boehm_on_path: + print 'warning: Boehm GC libary not found in /usr/lib, falling back on no gc' + gcpolicy = 'none' + + if gcpolicy == 'boehm': from pypy.translator.llvm.gc import BoehmGcPolicy gcpolicy = BoehmGcPolicy() elif gcpolicy == 'ref': From ericvrp at codespeak.net Fri Sep 23 17:14:19 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 23 Sep 2005 17:14:19 +0200 (CEST) Subject: [pypy-svn] r17796 - in pypy/dist/pypy/translator/llvm: . 
module Message-ID: <20050923151419.F3A0327BB2@code1.codespeak.net> Author: ericvrp Date: Fri Sep 23 17:14:18 2005 New Revision: 17796 Modified: pypy/dist/pypy/translator/llvm/codewriter.py pypy/dist/pypy/translator/llvm/database.py pypy/dist/pypy/translator/llvm/externs2ll.py pypy/dist/pypy/translator/llvm/funcnode.py pypy/dist/pypy/translator/llvm/genllvm.py pypy/dist/pypy/translator/llvm/module/extfunction.py Log: Added internel to function implementations. This exposed a bug with the external functions that return a bool. Since C returns an int here there is a mismatch that I fixed by injecting a small function that calls the int version and return the bool one. Modified: pypy/dist/pypy/translator/llvm/codewriter.py ============================================================================== --- pypy/dist/pypy/translator/llvm/codewriter.py (original) +++ pypy/dist/pypy/translator/llvm/codewriter.py Fri Sep 23 17:14:18 2005 @@ -71,10 +71,11 @@ def openfunc(self, decl, is_entrynode=False, cconv=DEFAULT_CCONV): self.newline() - if is_entrynode: - linkage_type = '' - else: - linkage_type = '' #'internal ' + #if is_entrynode: + # linkage_type = '' + #else: + # linkage_type = 'internal ' + linkage_type = 'internal ' self.append("%s%s %s {" % (linkage_type, cconv, decl,)) def closefunc(self): Modified: pypy/dist/pypy/translator/llvm/database.py ============================================================================== --- pypy/dist/pypy/translator/llvm/database.py (original) +++ pypy/dist/pypy/translator/llvm/database.py Fri Sep 23 17:14:18 2005 @@ -124,7 +124,7 @@ assert key not in self.obj2node, ( "node with key %r already known!" %(key,)) - log("added to pending nodes:", type(key), node) + #log("added to pending nodes:", type(key), node) self.obj2node[key] = node self._pendingsetup.append(node) @@ -171,7 +171,7 @@ type_ = type_.TO value = value._obj - log.prepareconstant("preparing ptr", value) + #log.prepareconstant("preparing ptr", value) # we dont need a node for nulls if value is None: @@ -191,7 +191,7 @@ if isinstance(const_or_var, Constant): ct = const_or_var.concretetype if isinstance(ct, lltype.Primitive): - log.prepare(const_or_var, "(is primitive)") + #log.prepare(const_or_var, "(is primitive)") return assert isinstance(ct, lltype.Ptr), "Preparation of non primitive and non pointer" @@ -201,7 +201,7 @@ if isinstance(ct, lltype.Array) or isinstance(ct, lltype.Struct): p, c = lltype.parentlink(value) if p is None: - log.prepareargvalue("skipping preparing non root", value) + #log.prepareargvalue("skipping preparing non root", value) return if value is not None and value not in self.obj2node: @@ -211,7 +211,7 @@ def prepare_arg(self, const_or_var): - log.prepare(const_or_var) + #log.prepare(const_or_var) self.prepare_type(const_or_var.concretetype) self.prepare_arg_value(const_or_var) @@ -219,7 +219,7 @@ def setup_all(self): while self._pendingsetup: node = self._pendingsetup.pop() - log.settingup(node) + #log.settingup(node) node.setup() def set_entrynode(self, key): Modified: pypy/dist/pypy/translator/llvm/externs2ll.py ============================================================================== --- pypy/dist/pypy/translator/llvm/externs2ll.py (original) +++ pypy/dist/pypy/translator/llvm/externs2ll.py Fri Sep 23 17:14:18 2005 @@ -38,7 +38,7 @@ funcname , s = s.split('(', 1) funcnames[funcname] = True if line.find("internal") == -1: - line = '%s %s' % (DEFAULT_CCONV, line,) + line = 'internal %s %s' % (DEFAULT_CCONV, line,) ll_lines.append(line) # patch calls to function 
that we just declared fastcc @@ -62,7 +62,17 @@ ll_lines2.append(line) llcode = '\n'.join(ll_lines2) - return llcode.split('implementation') + decl, impl = llcode.split('implementation') + impl += """;functions that should return a bool according to + ; pypy/rpython/extfunctable.py , but C doesn't have bools! + +internal fastcc bool %LL_os_isatty(int %fd) { + %t = call fastcc int %LL_os_isatty(int %fd) + %b = cast int %t to bool + ret bool %b +} + """ + return decl, impl def post_setup_externs(db): @@ -123,9 +133,9 @@ for f in include_files: ccode.append(open(f).read()) + ccode = "".join(ccode) if debug: - ccode = "".join(ccode) filename = udir.join("ccode.c") f = open(str(filename), "w") f.write(ccode) Modified: pypy/dist/pypy/translator/llvm/funcnode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/funcnode.py (original) +++ pypy/dist/pypy/translator/llvm/funcnode.py Fri Sep 23 17:14:18 2005 @@ -44,7 +44,7 @@ return "" %(self.ref,) def setup(self): - log("setup", self) + #log("setup", self) def visit(node): if isinstance(node, Link): map(self.db.prepare_arg, node.args) Modified: pypy/dist/pypy/translator/llvm/genllvm.py ============================================================================== --- pypy/dist/pypy/translator/llvm/genllvm.py (original) +++ pypy/dist/pypy/translator/llvm/genllvm.py Fri Sep 23 17:14:18 2005 @@ -31,7 +31,7 @@ class GenLLVM(object): - def __init__(self, translator, gcpolicy=None, exceptionpolicy=None, debug=True): + def __init__(self, translator, gcpolicy=None, exceptionpolicy=None, debug=False): # reset counters LLVMNode.nodename_count = {} Modified: pypy/dist/pypy/translator/llvm/module/extfunction.py ============================================================================== --- pypy/dist/pypy/translator/llvm/module/extfunction.py (original) +++ pypy/dist/pypy/translator/llvm/module/extfunction.py Fri Sep 23 17:14:18 2005 @@ -1,6 +1,6 @@ extdeclarations = ''' -%last_exception_type = global %RPYTHON_EXCEPTION_VTABLE* null -%last_exception_value = global %RPYTHON_EXCEPTION* null +%last_exception_type = internal global %RPYTHON_EXCEPTION_VTABLE* null +%last_exception_value = internal global %RPYTHON_EXCEPTION* null ''' extfunctions = {} #dependencies, llvm-code From ericvrp at codespeak.net Fri Sep 23 17:16:05 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 23 Sep 2005 17:16:05 +0200 (CEST) Subject: [pypy-svn] r17797 - pypy/dist/pypy/translator/goal Message-ID: <20050923151605.4A75C27BB2@code1.codespeak.net> Author: ericvrp Date: Fri Sep 23 17:16:04 2005 New Revision: 17797 Modified: pypy/dist/pypy/translator/goal/bench-cronjob.py Log: work in progress Modified: pypy/dist/pypy/translator/goal/bench-cronjob.py ============================================================================== --- pypy/dist/pypy/translator/goal/bench-cronjob.py (original) +++ pypy/dist/pypy/translator/goal/bench-cronjob.py Fri Sep 23 17:16:04 2005 @@ -13,20 +13,18 @@ os.system('cvs -q up 2>&1') os.system('make tools-only 2>&1') -def get_names(): +def compile(backend): + os.chdir(homedir + '/projects/pypy-dist/pypy/translator/goal') + os.system('python translate_pypy_new.py targetpypystandalone --backend=%(backend)s --pygame --batch -r 2>&1' % locals()) + os.chdir(homedir + '/projects/pypy-dist') try: revision = os.popen('svn info 2>&1').readlines()[3].split()[1] except: revision = 'unknown' basename = homedir + '/projects/pypy-dist/pypy/translator/goal/' + 'pypy-' + backend - realname 
= basename+'-'+revision() - return basename, realname - -def compile(backend): - os.chdir(homedir + '/projects/pypy-dist/pypy/translator/goal') - os.system('python translate_pypy_new.py targetpypystandalone --backend=%(backend)s --pygame --batch -r 2>&1' % locals()) - basename, realname = get_names() + realname = basename + '-' + revision + os.open(realname, 'wb').write( open(basename).read() ) os.chmod(realname, stat.S_IRWXU) os.unlink(basename) From pedronis at codespeak.net Fri Sep 23 18:20:06 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 23 Sep 2005 18:20:06 +0200 (CEST) Subject: [pypy-svn] r17798 - in pypy/dist/pypy: interpreter objspace/std/test Message-ID: <20050923162006.11BC727BBC@code1.codespeak.net> Author: pedronis Date: Fri Sep 23 18:20:04 2005 New Revision: 17798 Modified: pypy/dist/pypy/interpreter/typedef.py pypy/dist/pypy/objspace/std/test/test_typeobject.py Log: avoid duplicated methods in the User* classes hierarchy Modified: pypy/dist/pypy/interpreter/typedef.py ============================================================================== --- pypy/dist/pypy/interpreter/typedef.py (original) +++ pypy/dist/pypy/interpreter/typedef.py Fri Sep 23 18:20:04 2005 @@ -63,6 +63,10 @@ def _buildusercls(cls, hasdict, wants_slots): "NOT_RPYTHON: initialization-time only" typedef = cls.typedef + + if hasdict and typedef.hasdict: + return get_unique_interplevel_subclass(cls, False, wants_slots) + name = ['User'] if not hasdict: name.append('NoDict') @@ -72,27 +76,10 @@ name = ''.join(name) - body = {} - - no_extra_dict = typedef.hasdict or not hasdict - - class User_InsertNameHere(object): - - def getclass(self, space): - return self.w__class__ - - def setclass(self, space, w_subtype): - # only used by descr_set___class__ - self.w__class__ = w_subtype - - def __del__(self): - try: - self.space.userdel(self) - except OperationError, e: - e.write_unraisable(self.space, 'method __del__ of ', self) - e.clear(self.space) # break up reference cycles + if wants_slots: + supercls = get_unique_interplevel_subclass(cls, hasdict, False) - if wants_slots: + class Proto(object): def user_setup_slots(self, nslots): self.slots_w = [None] * nslots @@ -101,17 +88,10 @@ def getslotvalue(self, index): return self.slots_w[index] - else: - def user_setup_slots(self, nslots): - assert nslots == 0 - - if no_extra_dict: - def user_setup(self, space, w_subtype, nslots): - self.space = space - self.w__class__ = w_subtype - self.user_setup_slots(nslots) + elif hasdict: + supercls = get_unique_interplevel_subclass(cls, False, False) - else: + class Proto(object): def getdict(self): return self.w__dict__ @@ -126,15 +106,38 @@ self.w__class__ = w_subtype self.w__dict__ = space.newdict([]) self.user_setup_slots(nslots) + else: + supercls = cls + + class Proto(object): + + def getclass(self, space): + return self.w__class__ + + def setclass(self, space, w_subtype): + # only used by descr_set___class__ + self.w__class__ = w_subtype + + def __del__(self): + try: + self.space.userdel(self) + except OperationError, e: + e.write_unraisable(self.space, 'method __del__ of ', self) + e.clear(self.space) # break up reference cycles + + def user_setup(self, space, w_subtype, nslots): + self.space = space + self.w__class__ = w_subtype + self.user_setup_slots(nslots) + + def user_setup_slots(self, nslots): + assert nslots == 0 body = dict([(key, value) - for key, value in User_InsertNameHere.__dict__.items() + for key, value in Proto.__dict__.items() if not key.startswith('_') or key == 
'__del__']) - if not hasdict and not wants_slots: - subcls = type(name, (cls,), body) - else: - basesubcls = get_unique_interplevel_subclass(cls, False, False) - subcls = type(name, (basesubcls,), body) + + subcls = type(name, (supercls,), body) return subcls Modified: pypy/dist/pypy/objspace/std/test/test_typeobject.py ============================================================================== --- pypy/dist/pypy/objspace/std/test/test_typeobject.py (original) +++ pypy/dist/pypy/objspace/std/test/test_typeobject.py Fri Sep 23 18:20:04 2005 @@ -440,3 +440,46 @@ Abc.__name__ = 'Def' assert Abc.__name__ == 'Def' raises(TypeError, "Abc.__name__ = 42") + + def test_class_variations(self): + class A(object): + pass + assert '__dict__' in A.__dict__ + a = A() + a.x = 3 + assert a.x == 3 + + class A(object): + __slots__ = () + assert '__dict__' not in A.__dict__ + a = A() + raises(AttributeError, setattr, a, 'x', 3) + + class B(A): + pass + assert '__dict__' in B.__dict__ + b = B() + b.x = 3 + assert b.x == 3 + + import sys + class A(type(sys)): + pass + assert '__dict__' not in A.__dict__ + a = A("a") + a.x = 3 + assert a.x == 3 + + class A(type(sys)): + __slots__ = () + assert '__dict__' not in A.__dict__ + a = A("a") + a.x = 3 + assert a.x == 3 + + class B(A): + pass + assert '__dict__' not in B.__dict__ + b = B("b") + b.x = 3 + assert b.x == 3 From arigo at codespeak.net Fri Sep 23 19:04:30 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 23 Sep 2005 19:04:30 +0200 (CEST) Subject: [pypy-svn] r17799 - pypy/dist/pypy/module/__builtin__ Message-ID: <20050923170430.1A29B27BBF@code1.codespeak.net> Author: arigo Date: Fri Sep 23 19:04:29 2005 New Revision: 17799 Modified: pypy/dist/pypy/module/__builtin__/importing.py Log: Produce the same error message as CPython. For incredibly bad reasons this is required to make help() work correctly. Modified: pypy/dist/pypy/module/__builtin__/importing.py ============================================================================== --- pypy/dist/pypy/module/__builtin__/importing.py (original) +++ pypy/dist/pypy/module/__builtin__/importing.py Fri Sep 23 19:04:29 2005 @@ -282,9 +282,8 @@ return None else: # ImportError - w_failing = w_modulename - w_exc = space.call_function(space.w_ImportError, w_failing) - raise OperationError(space.w_ImportError, w_exc) + msg = "No module named %s" % modulename + raise OperationError(space.w_ImportError, w(msg)) # __________________________________________________________________ # From ericvrp at codespeak.net Fri Sep 23 23:32:53 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 23 Sep 2005 23:32:53 +0200 (CEST) Subject: [pypy-svn] r17800 - in pypy/dist/pypy/translator/llvm: . 
module Message-ID: <20050923213253.8CE6327BC7@code1.codespeak.net> Author: ericvrp Date: Fri Sep 23 23:32:52 2005 New Revision: 17800 Modified: pypy/dist/pypy/translator/llvm/exception.py pypy/dist/pypy/translator/llvm/module/support.py Log: current best settings Modified: pypy/dist/pypy/translator/llvm/exception.py ============================================================================== --- pypy/dist/pypy/translator/llvm/exception.py (original) +++ pypy/dist/pypy/translator/llvm/exception.py Fri Sep 23 23:32:52 2005 @@ -28,7 +28,7 @@ return noresult def new(exceptionpolicy=None): #factory - exceptionpolicy = exceptionpolicy or 'fast' + exceptionpolicy = exceptionpolicy or 'cpython' if exceptionpolicy == 'cpython': from pypy.translator.llvm.exception import CPythonExceptionPolicy exceptionpolicy = CPythonExceptionPolicy() Modified: pypy/dist/pypy/translator/llvm/module/support.py ============================================================================== --- pypy/dist/pypy/translator/llvm/module/support.py (original) +++ pypy/dist/pypy/translator/llvm/module/support.py Fri Sep 23 23:32:52 2005 @@ -97,7 +97,8 @@ %%exception_type = load %%RPYTHON_EXCEPTION_VTABLE** %%tmp store %%RPYTHON_EXCEPTION_VTABLE* %%exception_type, %%RPYTHON_EXCEPTION_VTABLE** %%last_exception_type store %%RPYTHON_EXCEPTION* %%exception_value, %%RPYTHON_EXCEPTION** %%last_exception_value - ret void ; XXX unwind ; (1) + unwind ; XXX (1) if exceptionpolicy == 'boehm' + ret void } """ % locals()) @@ -109,7 +110,8 @@ br bool %%cond, label %%is_0, label %%is_not_0 is_0: call fastcc void %%prepare_ZeroDivisionError() - ret %s 0 ; XXX unwind ; (2) + unwind ; XXX (2) if exceptionpolicy == 'boehm' + ret %s 0 is_not_0: """ @@ -132,7 +134,8 @@ ; br bool %cond3, label %return_block, label %ovf3 ;ovf3: call fastcc void %prepare_OverflowError() - ret int 0 ; XXX unwind ; (3) + unwind ; XXX (3) if exceptionpolicy == 'boehm' + ret int 0 """ From ericvrp at codespeak.net Fri Sep 23 23:41:49 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 23 Sep 2005 23:41:49 +0200 (CEST) Subject: [pypy-svn] r17801 - pypy/dist/pypy/translator/goal Message-ID: <20050923214149.7834227BC7@code1.codespeak.net> Author: ericvrp Date: Fri Sep 23 23:41:48 2005 New Revision: 17801 Modified: pypy/dist/pypy/translator/goal/bench-cronjob.py pypy/dist/pypy/translator/goal/bench-unix.py pypy/dist/pypy/translator/goal/translate_pypy_new.py Log: Please have a good look at bench-unix.py ! I think we have been optimizing with the wrong numbers in our mind. There were two bugs in the formula's used: 1. We did not take into account that we run less iterations in Richards, causing our numbers to be 10x too optimistic 2. The formula for number of pystones was reversed The good news is that it looks like PyPy is beating CPython handsdown with regards to number of Pystones. The other news is that Richards probably has a bottleneck. (note: bench-windows.py still uses the incorrect formulas!) but don't worry: There always is a change that I am completely wrong! 
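The two suspected pitfalls come down to how a slowdown factor is derived from each benchmark's output: richards prints a time (lower is better), pystone prints a rate in pystones per second (higher is better), and a run made with fewer iterations has to be scaled before it is compared with the reference run. A minimal sketch of those relationships, using illustrative helper names rather than the actual bench-unix.py code:

    def slowdown_from_time(time_ms, ref_time_ms):
        # richards-style output is a time: lower is better,
        # so the slowdown factor is measured / reference
        return time_ms / float(ref_time_ms)

    def slowdown_from_rate(rate, ref_rate):
        # pystone-style output is a rate (pystones/second): higher is
        # better, so the slowdown factor is reference / measured
        return ref_rate / float(rate)

    def scale_time(time_ms, iterations, ref_iterations):
        # a run with fewer iterations reports a smaller time; scale it
        # up to the reference iteration count before comparing
        return time_ms * ref_iterations / float(iterations)

Whether the checked-in formulas really needed these corrections is revisited in r17806 below.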
Modified: pypy/dist/pypy/translator/goal/bench-cronjob.py ============================================================================== --- pypy/dist/pypy/translator/goal/bench-cronjob.py (original) +++ pypy/dist/pypy/translator/goal/bench-cronjob.py Fri Sep 23 23:41:48 2005 @@ -25,7 +25,7 @@ basename = homedir + '/projects/pypy-dist/pypy/translator/goal/' + 'pypy-' + backend realname = basename + '-' + revision - os.open(realname, 'wb').write( open(basename).read() ) + open(realname, 'wb').write( open(basename).read() ) os.chmod(realname, stat.S_IRWXU) os.unlink(basename) Modified: pypy/dist/pypy/translator/goal/bench-unix.py ============================================================================== --- pypy/dist/pypy/translator/goal/bench-unix.py (original) +++ pypy/dist/pypy/translator/goal/bench-unix.py Fri Sep 23 23:41:48 2005 @@ -1,9 +1,20 @@ # benchmarks on a unix machine. # to be executed in the goal folder, -# where a couple of pyp-c* files is expected. +# where a couple of pypy-* files is expected. import os, sys +current_result = ''' +executable richards pystone +pypy-c-17758 416740ms (326.85x) 916ms ( 0.03x) +pypy-c-17797 394070ms (309.07x) 99999ms ( 3.40x) +pypy-llvm-17758 343870ms (269.70x) 1131ms ( 0.04x) +pypy-llvm-17792 277630ms (217.75x) 1418ms ( 0.05x) +pypy-llvm-17797 274470ms (215.27x) 1434ms ( 0.05x) +pypy-llvm-17799 999990ms (784.31x) 99999ms ( 3.40x) +python 2.4.2c1 1275ms ( 1.00x) 29411ms ( 1.00x) +''' + PYSTONE_CMD = 'from test import pystone;pystone.main(%s)' PYSTONE_PATTERN = 'This machine benchmarks at' RICHARDS_CMD = 'from richards import *;Richards.iterations=%d;main()' @@ -14,28 +25,29 @@ if line.startswith(pattern): break else: - raise ValueError, 'this is no valid output' + print 'warning: this is no valid output' + return 99999.0 return float(line.split()[len(pattern.split())]) def run_cmd(cmd): print "running", cmd pipe = os.popen(cmd + ' 2>&1') result = pipe.read() - print "done" + #print "done" return result def run_pystone(executable='python', n=0): argstr = PYSTONE_CMD % (str(n) and n or '') txt = run_cmd('%s -c "%s"' % (executable, argstr)) res = get_result(txt, PYSTONE_PATTERN) - print res + #print res return res def run_richards(executable='python', n=10): argstr = RICHARDS_CMD % n txt = run_cmd('%s -c "%s"' % (executable, argstr)) - res = get_result(txt, RICHARDS_PATTERN) - print res + res = get_result(txt, RICHARDS_PATTERN) * 10 / n + #print res return res def get_executables(): @@ -43,24 +55,22 @@ exes.sort() return exes -HEADLINE = '''\ -executable abs.richards abs.pystone rel.richards rel.pystone''' -FMT = '''\ -%-30s ''' + '%5d ms %9.3f ' + '%5.1f %5.1f' +HEADLINE = '''executable richards pystone''' +FMT = '''%-30s %6dms (%6.2fx) %6dms (%6.2fx)''' def main(): - print 'getting the richards reference' + #print 'getting the richards reference' ref_rich = run_richards() - print 'getting the pystone reference' + #print 'getting the pystone reference' ref_stone = run_pystone() res = [] for exe in get_executables(): exename = os.path.splitext(exe)[0].lstrip('./') - res.append( (exename, run_richards(exe, 1), run_pystone(exe, 2000)) ) + res.append( (exename, run_richards(exe, 1), run_pystone(exe)) ) res.append( ('python %s' % sys.version.split()[0], ref_rich, ref_stone) ) print HEADLINE for exe, rich, stone in res: - print FMT % (exe, rich, stone, rich / ref_rich, ref_stone / stone) + print FMT % (exe, rich, rich / ref_rich, stone, stone / ref_stone) if __name__ == '__main__': main() Modified: 
pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Fri Sep 23 23:41:48 2005 @@ -127,9 +127,9 @@ print 'Specializing...' t.specialize(dont_simplify_again=True, crash_on_first_typeerror=not cmd_line_opt.insist) - if cmd_line_opt.optimize and cmd_line_opt.backend != 'llvm': + if cmd_line_opt.optimize: print 'Back-end optimizations...' - t.backend_optimizations() + t.backend_optimizations(ssa_form=cmd_line_opt.backend != 'llvm') if a and 'fork2' in cmd_line_opt.fork: from pypy.translator.goal import unixcheckpoint unixcheckpoint.restartable_point(auto='run') From pedronis at codespeak.net Sat Sep 24 01:38:03 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sat, 24 Sep 2005 01:38:03 +0200 (CEST) Subject: [pypy-svn] r17804 - pypy/dist/pypy/interpreter Message-ID: <20050923233803.0B16527BC7@code1.codespeak.net> Author: pedronis Date: Sat Sep 24 01:38:02 2005 New Revision: 17804 Modified: pypy/dist/pypy/interpreter/gateway.py Log: avoid creating a BuiltinFrame subclass per builtin, use an indirection through a BuiltinFrameFactory per unwrap spec kind. Modified: pypy/dist/pypy/interpreter/gateway.py ============================================================================== --- pypy/dist/pypy/interpreter/gateway.py (original) +++ pypy/dist/pypy/interpreter/gateway.py Sat Sep 24 01:38:02 2005 @@ -268,6 +268,9 @@ """Subclasses with behavior specific for an unwrap spec are generated""" raise TypeError, "abstract" +class BuiltinFrameFactory(object): + pass + class BuiltinCodeSignature(Signature): "NOT_RPYTHON" @@ -280,12 +283,12 @@ self.through_scope_w = 0 self.miniglobals = {} - def _make_unwrap_frame_class(self, cache={}): + def _make_unwrap_frame_factory_class(self, cache={}): try: key = tuple(self.unwrap_spec) - frame_cls, run_args = cache[key] + frame_factory_cls, run_args = cache[key] assert run_args == self.run_args,"unexpected: same spec, different run_args" - return frame_cls + return frame_factory_cls except KeyError: parts = [] for el in self.unwrap_spec: @@ -318,23 +321,35 @@ exec compile2(source) in self.miniglobals, d d['_run'] = d['_run_UWS_%s' % label] del d['_run_UWS_%s' % label] - frame_cls = type("BuiltinFrame_UWS_%s" % label, (BuiltinFrame,), d) - cache[key] = frame_cls, self.run_args - return frame_cls - - def make_frame_class(self, func, cache={}): - frame_uw_cls = self._make_unwrap_frame_class() - return type("BuiltinFrame_for_%s" % self.name, - (frame_uw_cls,),{'behavior': staticmethod(func)}) + + frame_cls = type("BuiltinFrame_UwS_%s" % label, (BuiltinFrame,), d) + + class MyBuiltinFrameFactory(BuiltinFrameFactory): + + def create(self, space, code, w_globals): + newframe = frame_cls(space, code, w_globals) + newframe.behavior = self.behavior + return newframe + + MyBuiltinFrameFactory.__name__ = 'BuiltinFrameFactory_UwS_%s' % label + + cache[key] = MyBuiltinFrameFactory, self.run_args + return MyBuiltinFrameFactory + + def make_frame_factory(self, func): + frame_uw_factory_cls = self._make_unwrap_frame_factory_class() -def make_builtin_frame_class(func, orig_sig, unwrap_spec): + factory = frame_uw_factory_cls() + factory.behavior = func + + return factory + +def make_builtin_frame_factory(func, orig_sig, unwrap_spec): "NOT_RPYTHON" name = (getattr(func, '__module__', None) or '')+'_'+func.__name__ emit_sig = orig_sig.apply_unwrap_spec(unwrap_spec, 
UnwrapSpecRecipe().emit, BuiltinCodeSignature(name=name, unwrap_spec=unwrap_spec)) - cls = emit_sig.make_frame_class(func) - return cls - + return emit_sig.make_frame_factory(func) class BuiltinCode(eval.Code): @@ -394,7 +409,7 @@ else: self.maxargs = self.minargs - self.framecls = make_builtin_frame_class(func, orig_sig, unwrap_spec) + self.framefactory = make_builtin_frame_factory(func, orig_sig, unwrap_spec) # speed hack if unwrap_spec == [ObjSpace, W_Root]: @@ -408,7 +423,7 @@ self.fastfunc_3 = func def create_frame(self, space, w_globals, closure=None): - return self.framecls(space, self, w_globals) + return self.framefactory.create(space, self, w_globals) def signature(self): return self.sig From hpk at codespeak.net Sat Sep 24 07:57:42 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Sat, 24 Sep 2005 07:57:42 +0200 (CEST) Subject: [pypy-svn] r17805 - pypy/dist/pypy/tool Message-ID: <20050924055742.AD00727BC4@code1.codespeak.net> Author: hpk Date: Sat Sep 24 07:57:41 2005 New Revision: 17805 Modified: pypy/dist/pypy/tool/fixeol Log: don't use py.path.checker anymore Modified: pypy/dist/pypy/tool/fixeol ============================================================================== --- pypy/dist/pypy/tool/fixeol (original) +++ pypy/dist/pypy/tool/fixeol Sat Sep 24 07:57:41 2005 @@ -69,7 +69,7 @@ for fn in fns: fixfile(fn) - for x in path.listdir(py.path.checker(dir=1, versioned=True)): + for x in path.listdir(lambda x: x.check(dir=1, versioned=True)): fixdirectory(x) def fixfile(path): From ericvrp at codespeak.net Sat Sep 24 10:05:02 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Sat, 24 Sep 2005 10:05:02 +0200 (CEST) Subject: [pypy-svn] r17806 - pypy/dist/pypy/translator/goal Message-ID: <20050924080502.0F96A27BBF@code1.codespeak.net> Author: ericvrp Date: Sat Sep 24 10:05:02 2005 New Revision: 17806 Modified: pypy/dist/pypy/translator/goal/bench-unix.py Log: Excuse my stupidity! At was late and I should not have drawn conclusions at that time. I was wrong on both points. Benchmark results are just fine! 
Modified: pypy/dist/pypy/translator/goal/bench-unix.py ============================================================================== --- pypy/dist/pypy/translator/goal/bench-unix.py (original) +++ pypy/dist/pypy/translator/goal/bench-unix.py Sat Sep 24 10:05:02 2005 @@ -6,13 +6,14 @@ current_result = ''' executable richards pystone -pypy-c-17758 416740ms (326.85x) 916ms ( 0.03x) -pypy-c-17797 394070ms (309.07x) 99999ms ( 3.40x) -pypy-llvm-17758 343870ms (269.70x) 1131ms ( 0.04x) -pypy-llvm-17792 277630ms (217.75x) 1418ms ( 0.05x) -pypy-llvm-17797 274470ms (215.27x) 1434ms ( 0.05x) -pypy-llvm-17799 999990ms (784.31x) 99999ms ( 3.40x) -python 2.4.2c1 1275ms ( 1.00x) 29411ms ( 1.00x) +pypy-c-17758 30626ms ( 35.74x) 1268 ( 33.98x) +pypy-c-17797 29657ms ( 34.61x) error +pypy-c-17799 29184ms ( 34.05x) error +pypy-llvm-17758 25361ms ( 29.59x) 1525 ( 28.26x) +pypy-llvm-17792 20775ms ( 24.24x) 1912 ( 22.53x) +pypy-llvm-17797 20423ms ( 23.83x) 1943 ( 22.18x) +pypy-llvm-17799 error error +python 2.4.2c1 857ms ( 1.00x) 43103 ( 1.00x) ''' PYSTONE_CMD = 'from test import pystone;pystone.main(%s)' @@ -46,7 +47,7 @@ def run_richards(executable='python', n=10): argstr = RICHARDS_CMD % n txt = run_cmd('%s -c "%s"' % (executable, argstr)) - res = get_result(txt, RICHARDS_PATTERN) * 10 / n + res = get_result(txt, RICHARDS_PATTERN) #print res return res @@ -56,7 +57,7 @@ return exes HEADLINE = '''executable richards pystone''' -FMT = '''%-30s %6dms (%6.2fx) %6dms (%6.2fx)''' +FMT = '''%-30s %6dms (%6.2fx) %6d (%6.2fx)''' def main(): #print 'getting the richards reference' @@ -70,7 +71,7 @@ res.append( ('python %s' % sys.version.split()[0], ref_rich, ref_stone) ) print HEADLINE for exe, rich, stone in res: - print FMT % (exe, rich, rich / ref_rich, stone, stone / ref_stone) + print FMT % (exe, rich, rich / ref_rich, stone, ref_stone / stone) if __name__ == '__main__': main() From arigo at codespeak.net Sat Sep 24 11:14:11 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 24 Sep 2005 11:14:11 +0200 (CEST) Subject: [pypy-svn] r17811 - pypy/dist/pypy/doc Message-ID: <20050924091411.9323427BC6@code1.codespeak.net> Author: arigo Date: Sat Sep 24 11:14:07 2005 New Revision: 17811 Modified: pypy/dist/pypy/doc/svn-help.txt Log: Removed unused reference to a page that has gone missing. Modified: pypy/dist/pypy/doc/svn-help.txt ============================================================================== --- pypy/dist/pypy/doc/svn-help.txt (original) +++ pypy/dist/pypy/doc/svn-help.txt Sat Sep 24 11:14:07 2005 @@ -153,7 +153,6 @@ .. _MacOS: http://codespeak.net/~jum/svn-1.1.3-darwin-ppc.tar.gz .. _versions: http://subversion.tigris.org/project_packages.html .. _Win: http://www.microsoft.com/downloads/details.aspx?displaylang=en&FamilyID=4B6140F9-2D36-4977-8FA1-6F8A0F5DCA8F -.. _HowToInstallServer: http://codespeak.net/moin/pypy/moin.cgi/HowToInstallServer .. _guide: http://svnbook.red-bean.com/book.html#svn-ch-1 .. _backports: http://www.backports.org .. 
_online: http://codespeak.net/svn/pypy/dist/ From arigo at codespeak.net Sat Sep 24 11:57:03 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 24 Sep 2005 11:57:03 +0200 (CEST) Subject: [pypy-svn] r17813 - in pypy/dist/pypy: rpython rpython/test translator Message-ID: <20050924095703.E278B27BCC@code1.codespeak.net> Author: arigo Date: Sat Sep 24 11:56:57 2005 New Revision: 17813 Modified: pypy/dist/pypy/rpython/lltype.py pypy/dist/pypy/rpython/test/test_lltype.py pypy/dist/pypy/translator/transform.py Log: Optimizations of translate_pypy, motivated by a profiled run which you can see at the end of /tmp/PROFILE-out on the snake server: * LowLevelType.__hash__ is called 99 million times. Let's make it cache its result as best as possible. * transform.checkgraphs() calls checkgraph() too many times -- each graph was fully rechecked the same number of times as the number of blocks it contains. Modified: pypy/dist/pypy/rpython/lltype.py ============================================================================== --- pypy/dist/pypy/rpython/lltype.py (original) +++ pypy/dist/pypy/rpython/lltype.py Sat Sep 24 11:56:57 2005 @@ -35,6 +35,10 @@ class LowLevelType(object): + # the following line prevents '__cached_hash' to be in the __dict__ of + # the instance, which is needed for __eq__() and __hash__() to work. + __slots__ = ['__dict__', '__cached_hash'] + def __eq__(self, other): return self.__class__ is other.__class__ and ( self is other or safe_equal(self.__dict__, other.__dict__)) @@ -44,20 +48,28 @@ def __hash__(self): # cannot use saferecursive() -- see test_lltype.test_hash(). - # this version uses a compromize between computation time and - # collision-avoidance that can be customized if needed. + # NB. the __cached_hash should neither be used nor updated + # if we enter with hash_level > 0, because the computed + # __hash__ can be different in this situation. + hash_level = 0 try: - if TLS.nested_hash_level >= 3: - return 0 + hash_level = TLS.nested_hash_level + if hash_level == 0: + return self.__cached_hash except AttributeError: - TLS.nested_hash_level = 0 + pass + if hash_level >= 3: + return 0 items = self.__dict__.items() items.sort() - TLS.nested_hash_level += 1 + TLS.nested_hash_level = hash_level + 1 try: - return hash((self.__class__,) + tuple(items)) + result = hash((self.__class__,) + tuple(items)) finally: - TLS.nested_hash_level -= 1 + TLS.nested_hash_level = hash_level + if hash_level == 0: + self.__cached_hash = result + return result # due to this dynamic hash value, we should forbid # pickling, until we have an algorithm for that. 
Modified: pypy/dist/pypy/rpython/test/test_lltype.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_lltype.py (original) +++ pypy/dist/pypy/rpython/test/test_lltype.py Sat Sep 24 11:56:57 2005 @@ -305,6 +305,7 @@ S = ForwardReference() S.become(Struct('S', ('p', Ptr(S)))) assert S == S + hash(S) # assert no crash, and force the __cached_hash computation S1 = Struct('S', ('p', Ptr(S))) assert S1 == S assert S == S1 Modified: pypy/dist/pypy/translator/transform.py ============================================================================== --- pypy/dist/pypy/translator/transform.py (original) +++ pypy/dist/pypy/translator/transform.py Sat Sep 24 11:56:57 2005 @@ -17,10 +17,13 @@ def checkgraphs(self, blocks): + seen = {} for block in blocks: fn = self.annotated[block] graph = self.translator.flowgraphs[fn] - checkgraph(graph) + if graph not in seen: + checkgraph(graph) + seen[graph] = True def fully_annotated_blocks(self): """Ignore blocked blocks.""" From arigo at codespeak.net Sat Sep 24 12:15:01 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 24 Sep 2005 12:15:01 +0200 (CEST) Subject: [pypy-svn] r17814 - pypy/dist/pypy/tool Message-ID: <20050924101501.E246427BD2@code1.codespeak.net> Author: arigo Date: Sat Sep 24 12:14:58 2005 New Revision: 17814 Modified: pypy/dist/pypy/tool/unionfind.py Log: More profiling-driven optimizations of the translation process. Modified: pypy/dist/pypy/tool/unionfind.py ============================================================================== --- pypy/dist/pypy/tool/unionfind.py (original) +++ pypy/dist/pypy/tool/unionfind.py Sat Sep 24 12:14:58 2005 @@ -32,8 +32,15 @@ return self.root_info.values() def find_rep(self, obj): - ignore, rep, info = self.find(obj) - return rep + try: + # fast path (shortcut for performance reasons) + parent = self.link_to_parent[obj] + self.root_info[parent] # may raise KeyError + return parent + except KeyError: + # general case + ignore, rep, info = self.find(obj) + return rep def find(self, obj): # -> new_root, obj, info if obj not in self.link_to_parent: From arigo at codespeak.net Sat Sep 24 12:34:11 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 24 Sep 2005 12:34:11 +0200 (CEST) Subject: [pypy-svn] r17815 - in pypy/dist/pypy: rpython translator Message-ID: <20050924103411.07B4027BD2@code1.codespeak.net> Author: arigo Date: Sat Sep 24 12:34:06 2005 New Revision: 17815 Modified: pypy/dist/pypy/rpython/annlowlevel.py pypy/dist/pypy/translator/annrpython.py pypy/dist/pypy/translator/transform.py Log: Profiling pretends that a huge amount of time is spend within annotate_lowlevel_helper() (not sub-calls, directly the function itself). I can only guess that this list comprehension kills us. Trying to remove it. 
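The change below replaces the "snapshot the dict, then diff it" idiom with explicit bookkeeping of new entries while they are created. A rough sketch of the idea, with made-up names rather than the annotator's real attributes:

    class Registry(object):
        # made-up stand-in for the annotator: 'annotated' can grow very
        # large, so new entries are recorded on the side while they
        # happen instead of being recovered later by copying and
        # scanning the whole dict
        def __init__(self):
            self.annotated = {}
            self.added = None          # a dict while recording, else None

        def add(self, block):
            self.annotated[block] = True
            if self.added is not None:
                self.added[block] = True

    reg = Registry()
    reg.added = {}                     # start recording new blocks
    reg.add('some block')
    assert list(reg.added) == ['some block']

With this in place, the list comprehension over the full set of annotated blocks disappears entirely, as the diff shows.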
Modified: pypy/dist/pypy/rpython/annlowlevel.py ============================================================================== --- pypy/dist/pypy/rpython/annlowlevel.py (original) +++ pypy/dist/pypy/rpython/annlowlevel.py Sat Sep 24 12:34:06 2005 @@ -82,18 +82,17 @@ def annotate_lowlevel_helper(annotator, ll_function, args_s): - saved = annotator.policy + saved = annotator.policy, annotator.added_blocks annotator.policy = LowLevelAnnotatorPolicy() try: args = annotator.bookkeeper.build_args('simple_call', args_s) (ll_function, args), key = decide_callable(annotator.bookkeeper, None, ll_function, args, mono=True, unpacked=True) args_s, kwds_s = args.unpack() assert not kwds_s - oldblocks = annotator.annotated.copy() + annotator.added_blocks = {} s = annotator.build_types(ll_function, args_s) - newblocks = [block for block in annotator.annotated.iterkeys() if block not in oldblocks] # invoke annotation simplifications for the new blocks - annotator.simplify(block_subset=newblocks) + annotator.simplify(block_subset=annotator.added_blocks) finally: - annotator.policy = saved + annotator.policy, annotator.added_blocks = saved return s, ll_function Modified: pypy/dist/pypy/translator/annrpython.py ============================================================================== --- pypy/dist/pypy/translator/annrpython.py (original) +++ pypy/dist/pypy/translator/annrpython.py Sat Sep 24 12:34:06 2005 @@ -25,6 +25,7 @@ self.pendingblocks = {} # map {block: function} self.bindings = {} # map Variables to SomeValues self.annotated = {} # set of blocks already seen + self.added_blocks = None # see processblock() below self.links_followed = {} # set of links that have ever been followed self.notify = {} # {block: {positions-to-reflow-from-when-done}} # --- the following information is recorded for debugging only --- @@ -382,6 +383,12 @@ setattr(e, '__annotator_block', block) raise + # The dict 'added_blocks' is used by rpython.annlowlevel to + # detect which are the new blocks that annotating an additional + # small helper creates. + if self.added_blocks is not None: + self.added_blocks[block] = True + def reflowpendingblock(self, fn, block): assert not self.frozen self.pendingblocks[block] = fn Modified: pypy/dist/pypy/translator/transform.py ============================================================================== --- pypy/dist/pypy/translator/transform.py (original) +++ pypy/dist/pypy/translator/transform.py Sat Sep 24 12:34:06 2005 @@ -198,10 +198,8 @@ # modified by t.simplify() after it had been annotated. if block_subset is None: block_subset = fully_annotated_blocks(ann) - d = {} - for block in block_subset: - d[block] = True - block_subset = d + if not isinstance(block_subset, dict): + block_subset = dict.fromkeys(block_subset) if ann.translator: checkgraphs(ann, block_subset) transform_dead_code(ann, block_subset) From arigo at codespeak.net Sat Sep 24 12:44:06 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 24 Sep 2005 12:44:06 +0200 (CEST) Subject: [pypy-svn] r17816 - pypy/dist/pypy/translator Message-ID: <20050924104406.255EF27BD2@code1.codespeak.net> Author: arigo Date: Sat Sep 24 12:44:02 2005 New Revision: 17816 Modified: pypy/dist/pypy/translator/annrpython.py pypy/dist/pypy/translator/transform.py Log: Who would have guessed that annotator.complete() would call getreturnvar() 39'439'040 times? Plus, let's use dict.fromkeys() more freely. 
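Two independent savings are applied below: the scan is restricted to the graphs of the newly annotated blocks, and dict.fromkeys() is used so that each graph is visited once even when many blocks map to the same graph. The de-duplication part in isolation, with illustrative values only:

    items = ['f', 'g', 'f', 'h', 'g', 'f']
    unique = dict.fromkeys(items)      # one key per distinct item
    assert sorted(unique) == ['f', 'g', 'h']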
Modified: pypy/dist/pypy/translator/annrpython.py ============================================================================== --- pypy/dist/pypy/translator/annrpython.py (original) +++ pypy/dist/pypy/translator/annrpython.py Sat Sep 24 12:44:02 2005 @@ -193,7 +193,13 @@ self.annotated.values().count(False)) # make sure that the return variables of all graphs is annotated if self.translator is not None: - for graph in self.translator.flowgraphs.values(): + if self.added_blocks is not None: + newgraphs = [self.translator.flowgraphs[self.annotated[block]] + for block in self.added_blocks] + newgraphs = dict.fromkeys(newgraphs) + else: + newgraphs = self.translator.flowgraphs.itervalues() #all of them + for graph in newgraphs: v = graph.getreturnvar() if v not in self.bindings: self.setbinding(v, annmodel.SomeImpossibleValue()) Modified: pypy/dist/pypy/translator/transform.py ============================================================================== --- pypy/dist/pypy/translator/transform.py (original) +++ pypy/dist/pypy/translator/transform.py Sat Sep 24 12:44:02 2005 @@ -17,13 +17,10 @@ def checkgraphs(self, blocks): - seen = {} - for block in blocks: - fn = self.annotated[block] - graph = self.translator.flowgraphs[fn] - if graph not in seen: - checkgraph(graph) - seen[graph] = True + all_graphs = [self.translator.flowgraphs[fn] + for fn in self.annotated.itervalues()] + for graph in dict.fromkeys(all_graphs): + checkgraph(graph) def fully_annotated_blocks(self): """Ignore blocked blocks.""" From arigo at codespeak.net Sat Sep 24 12:48:55 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 24 Sep 2005 12:48:55 +0200 (CEST) Subject: [pypy-svn] r17817 - pypy/dist/pypy/rpython Message-ID: <20050924104855.8A87B27BD2@code1.codespeak.net> Author: arigo Date: Sat Sep 24 12:48:52 2005 New Revision: 17817 Modified: pypy/dist/pypy/rpython/lltype.py Log: A specialized version of safe_equal() for performance. Modified: pypy/dist/pypy/rpython/lltype.py ============================================================================== --- pypy/dist/pypy/rpython/lltype.py (original) +++ pypy/dist/pypy/rpython/lltype.py Sat Sep 24 12:48:52 2005 @@ -24,7 +24,22 @@ del seeing[seeingkey] return safe -safe_equal = saferecursive(operator.eq, True) +#safe_equal = saferecursive(operator.eq, True) +def safe_equal(x, y): + # a specialized version for performance + try: + seeing = TLS.seeing_eq + except AttributeError: + seeing = TLS.seeing_eq = {} + seeingkey = (id(x), id(y)) + if seeingkey in seeing: + return True + seeing[seeingkey] = True + try: + return x == y + finally: + del seeing[seeingkey] + class frozendict(dict): From arigo at codespeak.net Sat Sep 24 13:19:22 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 24 Sep 2005 13:19:22 +0200 (CEST) Subject: [pypy-svn] r17821 - pypy/dist/pypy/translator/tool Message-ID: <20050924111922.8654E27BD2@code1.codespeak.net> Author: arigo Date: Sat Sep 24 13:19:19 2005 New Revision: 17821 Modified: pypy/dist/pypy/translator/tool/make_dot.py Log: Oups. The pygame viewer *did* use the class-based version of objspace.flow.model.traverse(). 
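For readers who have not met the two flavours of traverse(): the function-based form applies a plain callable to every node, while the class-based form picks a visit_<ClassName> method from the visitor object according to the node's type. A hedged sketch of that dispatch, not the real objspace.flow.model code, and with an invented class name:

    class NodeVisitor(object):          # invented name, not the real API
        def visit(self, node):
            # class-based dispatch: pick a visit_<ClassName> handler, if any
            method = getattr(self, 'visit_' + node.__class__.__name__, None)
            if method is not None:
                method(node)
            # a function-based traverse() instead just calls a plain
            # callable on every node, with no per-type method lookup

The fix below therefore calls visit_FunctionGraph() explicitly and passes a bound visit method that does its own Block dispatch.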
Modified: pypy/dist/pypy/translator/tool/make_dot.py ============================================================================== --- pypy/dist/pypy/translator/tool/make_dot.py (original) +++ pypy/dist/pypy/translator/tool/make_dot.py Sat Sep 24 13:19:19 2005 @@ -85,7 +85,8 @@ self.func = None self.prefix = name self.enter_subgraph(name) - traverse(self, node) + self.visit_FunctionGraph(node) + traverse(self.visit, node) self.leave_subgraph() def blockname(self, block): @@ -97,8 +98,8 @@ return name def visit(self, obj): - # ignore for now - return + if isinstance(obj, Block): + self.visit_Block(obj) def visit_FunctionGraph(self, funcgraph): name = self.prefix # +'_'+funcgraph.name From arigo at codespeak.net Sat Sep 24 13:36:31 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 24 Sep 2005 13:36:31 +0200 (CEST) Subject: [pypy-svn] r17822 - in pypy/dist/pypy/translator/backendopt: . test Message-ID: <20050924113631.3AC4B27BD2@code1.codespeak.net> Author: arigo Date: Sat Sep 24 13:36:27 2005 New Revision: 17822 Modified: pypy/dist/pypy/translator/backendopt/ssa.py pypy/dist/pypy/translator/backendopt/test/test_ssa.py Log: Rewrote the data_flow_families() algorithm, which seems to consume a lot of time. Made up a couple of tests for ssa.py. Modified: pypy/dist/pypy/translator/backendopt/ssa.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/ssa.py (original) +++ pypy/dist/pypy/translator/backendopt/ssa.py Sat Sep 24 13:36:27 2005 @@ -8,33 +8,47 @@ all following variables where the value is just passed unmerged into the next block. """ - entrymaplist = mkentrymap(graph).items() + + # Build a list of "unification opportunities": for each block and each 'n', + # an "opportunity" is the list of the block's nth input variable plus + # the nth output variable from each of the incoming links. + opportunities = [] + for block, links in mkentrymap(graph).items(): + if block is graph.startblock: + continue + assert links + for n, inputvar in enumerate(block.inputargs): + vars = [inputvar] + for link in links: + var = link.args[n] + if not isinstance(var, Variable): + break + vars.append(var) + else: + # if no Constant found in the incoming links + opportunities.append(vars) + + # An "opportunitiy" that lists exactly two distinct variables means that + # the two variables can be unified. We maintain the unification status in + # 'variable_families'. When variables are unified, it might reduce the + # number of distinct variables and thus open other "opportunities" for + # unification. progress = True variable_families = UnionFind() - - # group variables by families; a family of variables will be identified. while progress: progress = False - for block, links in entrymaplist: - if block is graph.startblock: - continue - assert links - for i in range(len(block.inputargs)): - # list of possible vars that can arrive in i'th position - v1 = block.inputargs[i] - v1 = variable_families.find_rep(v1) - inputs = {v1: True} - key = [] - for link in links: - v = link.args[i] - if not isinstance(v, Variable): - break - v = variable_families.find_rep(v) - inputs[v] = True - else: - if len(inputs) == 2: - variable_families.union(*inputs) - progress = True + pending_opportunities = [] + for vars in opportunities: + repvars = [variable_families.find_rep(v1) for v1 in vars] + repvars = dict.fromkeys(repvars).keys() + if len(repvars) > 2: + # cannot unify now, but maybe later? 
+ pending_opportunities.append(repvars) + elif len(repvars) == 2: + # unify! + variable_families.union(*repvars) + progress = True + opportunities = pending_opportunities return variable_families def SSI_to_SSA(graph): Modified: pypy/dist/pypy/translator/backendopt/test/test_ssa.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_ssa.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_ssa.py Sat Sep 24 13:36:27 2005 @@ -1,2 +1,52 @@ +from pypy.translator.backendopt.ssa import * +from pypy.translator.translator import Translator +from pypy.objspace.flow.model import flatten, Block -# XXX write a test! + +def test_data_flow_families(): + def snippet_fn(xx, yy): + while yy > 0: + if 0 < xx: + yy = yy - xx + else: + yy = yy + xx + return yy + t = Translator(snippet_fn) + graph = t.getflowgraph() + operations = [] + for block in flatten(graph): + if isinstance(block, Block): + operations += block.operations + + variable_families = data_flow_families(graph) + + # we expect to find xx only once: + v_xx = variable_families.find_rep(graph.getargs()[0]) + found = 0 + for op in operations: + if op.opname in ('add', 'sub', 'lt'): + assert variable_families.find_rep(op.args[1]) == v_xx + found += 1 + assert found == 3 + + +def test_SSI_to_SSA(): + def snippet_fn(v1, v2, v3): + if v1: # v4 = is_true(v1) + while v3: # v5 = is_true(v3) + pass + passed_over = 0 + else: + v6 = snippet_fn(v3, v2, v1) # v6 = simple_call(v3, v2, v1) + passed_over = v6 + v7 = passed_over # v7 = inputarg + return v7+v1 # v8 = add(v7, v1) + + t = Translator(snippet_fn) + SSI_to_SSA(t.getflowgraph()) + allvars = [] + for block in flatten(t.getflowgraph()): + if isinstance(block, Block): + allvars += [v.name for v in block.getvariables()] + # see comments above for where the 8 remaining variables are expected to be + assert len(dict.fromkeys(allvars)) == 8 From arigo at codespeak.net Sat Sep 24 13:42:35 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 24 Sep 2005 13:42:35 +0200 (CEST) Subject: [pypy-svn] r17823 - pypy/dist/pypy/rpython Message-ID: <20050924114235.61A4B27BD2@code1.codespeak.net> Author: arigo Date: Sat Sep 24 13:42:32 2005 New Revision: 17823 Modified: pypy/dist/pypy/rpython/lltype.py Log: Micro-optimization for typeOf(), which is called veeery often. 
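The micro-optimization has the usual "fast path first" shape: assume the common case (a value carrying a _TYPE attribute) and only fall back to explicit checks on exact types, where 'type(x) is T' is cheaper than a cascade of isinstance() calls. A stripped-down sketch with an invented function name and invented result values:

    def kind_of(val):
        try:
            return val._TYPE            # common case: low-level values
        except AttributeError:
            tp = type(val)              # exact checks; order does not matter
            if tp is bool:
                return 'Bool'
            if tp is int:
                return 'Signed'
            if tp is float:
                return 'Float'
            raise TypeError("unsupported %r" % (tp,))

    assert kind_of(True) == 'Bool'
    assert kind_of(3) == 'Signed'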
Modified: pypy/dist/pypy/rpython/lltype.py ============================================================================== --- pypy/dist/pypy/rpython/lltype.py (original) +++ pypy/dist/pypy/rpython/lltype.py Sat Sep 24 13:42:32 2005 @@ -3,6 +3,7 @@ from pypy.rpython.rarithmetic import r_uint from pypy.tool.uid import Hashable from pypy.tool.tls import tlsobject +from types import NoneType log = py.log.Producer('lltype') @@ -415,23 +416,27 @@ def typeOf(val): - if isinstance(val, bool): - return Bool - if isinstance(val, r_uint): - return Unsigned - if isinstance(val, int): - return Signed - if isinstance(val, float): - return Float - if isinstance(val, str): - assert len(val) == 1 - return Char - if isinstance(val, unicode): - assert len(val) == 1 - return UniChar - if val is None: - return Void # maybe - return val._TYPE + try: + return val._TYPE + except AttributeError: + tp = type(val) + if tp is NoneType: + return Void # maybe + if tp is int: + return Signed + if tp is bool: + return Bool + if tp is r_uint: + return Unsigned + if tp is float: + return Float + if tp is str: + assert len(val) == 1 + return Char + if tp is unicode: + assert len(val) == 1 + return UniChar + raise TypeError("typeOf(%r object)" % (tp.__name__,)) class InvalidCast(TypeError): pass From arigo at codespeak.net Sat Sep 24 13:54:38 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 24 Sep 2005 13:54:38 +0200 (CEST) Subject: [pypy-svn] r17824 - pypy/dist/pypy/interpreter/test Message-ID: <20050924115438.BBBDE27BD2@code1.codespeak.net> Author: arigo Date: Sat Sep 24 13:54:34 2005 New Revision: 17824 Removed: pypy/dist/pypy/interpreter/test/test_synerr.py Modified: pypy/dist/pypy/interpreter/test/test_exec.py Log: Moved the test of test_synerr.py into test_exec.py. Modified: pypy/dist/pypy/interpreter/test/test_exec.py ============================================================================== --- pypy/dist/pypy/interpreter/test/test_exec.py (original) +++ pypy/dist/pypy/interpreter/test/test_exec.py Sat Sep 24 13:54:34 2005 @@ -187,3 +187,8 @@ assert m.result == {'x': 'm'} exec "y=n" in m # NOTE: this doesn't work in CPython 2.4 assert m.result == {'x': 'm', 'y': 'n'} + + def test_synerr(self): + def x(): + exec "1 2" + raises(SyntaxError, x) From arigo at codespeak.net Sat Sep 24 15:01:18 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 24 Sep 2005 15:01:18 +0200 (CEST) Subject: [pypy-svn] r17825 - pypy/dist/pypy/translator Message-ID: <20050924130118.4D95527BD0@code1.codespeak.net> Author: arigo Date: Sat Sep 24 15:01:17 2005 New Revision: 17825 Modified: pypy/dist/pypy/translator/transform.py Log: Ooooups. A typo in here, and rtyping was 2-3x slower. Additionally, let's revert to the previous version instead of using dict.fromkeys() on a huge list. 
Modified: pypy/dist/pypy/translator/transform.py ============================================================================== --- pypy/dist/pypy/translator/transform.py (original) +++ pypy/dist/pypy/translator/transform.py Sat Sep 24 15:01:17 2005 @@ -17,10 +17,13 @@ def checkgraphs(self, blocks): - all_graphs = [self.translator.flowgraphs[fn] - for fn in self.annotated.itervalues()] - for graph in dict.fromkeys(all_graphs): - checkgraph(graph) + seen = {} + for block in blocks: + fn = self.annotated[block] + graph = self.translator.flowgraphs[fn] + if graph not in seen: + checkgraph(graph) + seen[graph] = True def fully_annotated_blocks(self): """Ignore blocked blocks.""" From pedronis at codespeak.net Sat Sep 24 16:04:53 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sat, 24 Sep 2005 16:04:53 +0200 (CEST) Subject: [pypy-svn] r17826 - pypy/dist/pypy/interpreter Message-ID: <20050924140453.82C4A27BD3@code1.codespeak.net> Author: pedronis Date: Sat Sep 24 16:04:52 2005 New Revision: 17826 Modified: pypy/dist/pypy/interpreter/gateway.py Log: create abstract method to avoid annotator demotion warnings Modified: pypy/dist/pypy/interpreter/gateway.py ============================================================================== --- pypy/dist/pypy/interpreter/gateway.py (original) +++ pypy/dist/pypy/interpreter/gateway.py Sat Sep 24 16:04:52 2005 @@ -269,7 +269,10 @@ raise TypeError, "abstract" class BuiltinFrameFactory(object): - pass + """Subclasses can create builtin frames for a associated builtin""" + + def create(self, space, code, w_globals): + raise TypeError, "abstract" class BuiltinCodeSignature(Signature): "NOT_RPYTHON" From arigo at codespeak.net Sat Sep 24 16:45:57 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 24 Sep 2005 16:45:57 +0200 (CEST) Subject: [pypy-svn] r17827 - pypy/dist/pypy/doc Message-ID: <20050924144557.5356627BCE@code1.codespeak.net> Author: arigo Date: Sat Sep 24 16:45:54 2005 New Revision: 17827 Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Log: (pedronis, arigo) Drafted the subsection organization. Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt ============================================================================== --- pypy/dist/pypy/doc/draft-dynamic-language-translation.txt (original) +++ pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Sat Sep 24 16:45:54 2005 @@ -39,9 +39,9 @@ the complete program is built and run by executing statements. Some of these statements have a declarative look and feel; for example, some appear to be function or class declarations. Actually, they are merely -statements that, when executed, build a function or class object. Then a -reference to the new object is stored at some place, under some name, from -where it can be accessed. Units of programs -- modules, whose source is one +statements that, when executed, build a function or class object. A +reference to the new object is then stored in a namespace from where it +can be accessed. Units of programs -- modules, whose source is one file each -- are similarily mere objects in memory, built on demand by some other module executing an ``import`` statement. Any such statement -- class construction or module import -- can be executed at any time during @@ -57,7 +57,7 @@ completing it with substitute functions, as needed by the OS on which ``os.py`` turns out to be executed. 
Many large Python projects use custom import mechanisms to control exactly how and from where each module is -loaded, simply by tampering with import hooks or just emulating parts of the +loaded, by tampering with import hooks or just emulating parts of the ``import`` statement manually. In addition, there are of course classical (and only partially true) @@ -283,15 +283,134 @@ XXX non mergeable data, details XXX conditionals, multiple pending blocks +XXX termination for "reasonable" terminating programs +Geninterp +~~~~~~~~~ +YYY + +YYY dynamic merging good for geninterp Annotator --------------------------------- +The annotator is the type inference part of our toolchain. The +annotator infers *types* in the following sense: given a program +considered as a family of control flow graphs, it assigns to each +variable of each graph a so-called *annotation*, which describes what +are the possible run-time objects that this variable will contain. Note +that in the literature such an annotation is usually called a type, but +we prefer to avoid this terminology to avoid confusion with the Python +notion of the concrete type of an object. Annotations are sets of +possible values that is not always exactly the set of all objects of a +specific Python type. + +We will first expose a simplified, static model of how the annotator +works, and then hint at some differences between the model and the +reality. + + +Static model +~~~~~~~~~~~~ + +The annotator can be considered as taking as input a finite family of +functions calling each other, and working mainly on the control flow +graphs of each of these functions as built by the `Flow Object Space`_. +Additionally, for a particular "entry point" function, each input +argument is given a user-specified annotation. + +The goal of the annotator is to find the most precise annotation that +can be given to each variable of all control flow graphs while +respecting constrains imposed by the operations in which these variables +are involved. + +More precisely, it is usually possible to deduce information about the +result variable of an operation given information about its arguments. +For example, we can say that the addition of two integers must be an +integer. Most programming languages have this property. However, +Python -- like many languages not specifically designed with type +inference in mind -- does not possess a type system that allows much +useful information to be derived about variables based on how they are +*used*; only on how they were *produced*. For example, a number of very +different built-in types can be involved in an addition; the meaning of +the addition and the type of the result depends on the type of the input +arguments. Merely knowing that a variable will be used in an addition +does not give much information per se. For this reason, our annotator +works by flowing annotations forward, operation after operation, i.e. by +performing abstract interpretation of the flow graphs. By contrast, the +well-known `Hindley-Milner`_ type inference algorithm works in an +inside-out direction, by starting from individual operations and +propagating type constrains outwards. 
+ +We use a fixpoint algorithm XXX + +XXX only generalizing and no infinite chain + +Annotation model +~~~~~~~~~~~~~~~~ + +XXX model and rules + +XXX constant propagation + +Prebuilt constants +~~~~~~~~~~~~~~~~~~ + +Mutable objects +~~~~~~~~~~~~~~~ + +XXX + +Classes and instances +~~~~~~~~~~~~~~~~~~~~~ + XXX +Termination +~~~~~~~~~~~ + +XXX termination + soundness + most-precise-fixpoint-ness + complexity + + +Non-static aspects +~~~~~~~~~~~~~~~~~~ + +XXX specialization (tons of fun) + +XXX executing more user program code (idem) + +XXX constant propagation to remove bootstrap-only code + +XXX termination even with non-static aspects + + +Code generation: rewriting to low-level operations +-------------------------------------------------- + +XXX introduction, repr + +Low-level type system for C +~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +XXX + +Implementing operations as helpers +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +XXX XXX reusing the annotator and specialization + +Generating C code +~~~~~~~~~~~~~~~~~ + +XXX collecting functions and data structures recursively + +XXX inserting hand-written C functions for suggested_primitives + +XXX messy + + .. _architecture: architecture.html .. _`Thunk Object Space`: objspace.html#the-thunk-object-space @@ -299,5 +418,6 @@ .. _`Flow Object Space`: objspace.html#the-flow-object-space .. _`Standard Object Space`: objspace.html#the-standard-object-space .. _Psyco: http://psyco.sourceforge.net/ +.. _`Hindley-Milner`: http://en.wikipedia.org/wiki/Hindley-Milner_type_inference .. include:: _ref.txt From arigo at codespeak.net Sat Sep 24 17:12:19 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 24 Sep 2005 17:12:19 +0200 (CEST) Subject: [pypy-svn] r17829 - pypy/dist/pypy/objspace/flow Message-ID: <20050924151219.8AF2027BD0@code1.codespeak.net> Author: arigo Date: Sat Sep 24 17:12:16 2005 New Revision: 17829 Modified: pypy/dist/pypy/objspace/flow/model.py Log: Speed up Variable.rename(). 
Modified: pypy/dist/pypy/objspace/flow/model.py ============================================================================== --- pypy/dist/pypy/objspace/flow/model.py (original) +++ pypy/dist/pypy/objspace/flow/model.py Sat Sep 24 17:12:16 2005 @@ -5,6 +5,7 @@ # a discussion in Berlin, 4th of october 2003 from __future__ import generators from pypy.tool.uid import Hashable +from pypy.tool.sourcetools import PY_IDENTIFIER """ memory size before and after introduction of __slots__ @@ -217,7 +218,7 @@ name = property(name) def renamed(self): - return isinstance(self._name, str) + return type(self._name) is not int renamed = property(renamed) def __init__(self, name=None): @@ -237,19 +238,23 @@ return '%s' % self.name def rename(self, name): - if self.renamed: + my_number = self._name + if type(my_number) is not int: # don't rename several times return - if isinstance(name, Variable): - if not name.renamed: + if type(name) is not str: + #assert isinstance(name, Variable) -- disabled for speed reasons + name = name._name + if type(name) is int: # the other Variable wasn't renamed either return - name = name.name[:name.name.rfind('_')] - # remove strange characters in the name - name = ''.join([c for c in name if c.isalnum() or c == '_']) - if not name: - return - if '0' <= name[0] <= '9': - name = '_' + name - self._name = name + '_' + self.name[1:] + name = name[:name.rfind('_')] + else: + # remove strange characters in the name + name = name.translate(PY_IDENTIFIER) + if not name: + return + if name[0] <= '9': # skipped the '0' <= which is always true + name = '_' + name + self._name = '%s_%d' % (name, my_number) def __reduce_ex__(self, *args): if hasattr(self, 'concretetype'): From arigo at codespeak.net Sat Sep 24 17:25:46 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 24 Sep 2005 17:25:46 +0200 (CEST) Subject: [pypy-svn] r17830 - in pypy/dist/pypy: rpython translator/c Message-ID: <20050924152546.ABD5C27BD0@code1.codespeak.net> Author: arigo Date: Sat Sep 24 17:25:35 2005 New Revision: 17830 Modified: pypy/dist/pypy/rpython/rclass.py pypy/dist/pypy/rpython/rmodel.py pypy/dist/pypy/rpython/rpbc.py pypy/dist/pypy/rpython/rtyper.py pypy/dist/pypy/translator/c/database.py pypy/dist/pypy/translator/c/external.py pypy/dist/pypy/translator/c/funcgen.py pypy/dist/pypy/translator/c/node.py pypy/dist/pypy/translator/c/symboltable.py Log: Mindless replacement of '== Void' with 'is Void' and '!= Void' with 'is not Void'. As it's a primitive low-level type it should be a singleton. 
Modified: pypy/dist/pypy/rpython/rclass.py ============================================================================== --- pypy/dist/pypy/rpython/rclass.py (original) +++ pypy/dist/pypy/rpython/rclass.py Sat Sep 24 17:25:35 2005 @@ -270,7 +270,7 @@ mro = list(rsubcls.classdef.getmro()) for fldname in self.clsfields: mangled_name, r = self.clsfields[fldname] - if r.lowleveltype == Void: + if r.lowleveltype is Void: continue for clsdef in mro: if fldname in clsdef.cls.__dict__: @@ -279,7 +279,7 @@ break # extra PBC attributes for (access_set, attr), (mangled_name, r) in self.pbcfields.items(): - if r.lowleveltype == Void: + if r.lowleveltype is Void: continue for clsdef in mro: try: @@ -513,7 +513,7 @@ result.super) # then add instance attributes from this level for name, (mangled_name, r) in self.fields.items(): - if r.lowleveltype == Void: + if r.lowleveltype is Void: llattrvalue = None elif name == '_hash_cache_': # hash() support llattrvalue = hash(value) @@ -596,7 +596,7 @@ if fldname == '__class__': continue mangled_name, r = self.allinstancefields[fldname] - if r.lowleveltype == Void: + if r.lowleveltype is Void: continue for clsdef in mro: if fldname in clsdef.cls.__dict__: Modified: pypy/dist/pypy/rpython/rmodel.py ============================================================================== --- pypy/dist/pypy/rpython/rmodel.py (original) +++ pypy/dist/pypy/rpython/rmodel.py Sat Sep 24 17:25:35 2005 @@ -91,7 +91,7 @@ def convert_const(self, value): "Convert the given constant value to the low-level repr of 'self'." - if self.lowleveltype != Void: + if self.lowleveltype is not Void: try: realtype = typeOf(value) except (AssertionError, AttributeError): @@ -199,9 +199,9 @@ def rtype_is_((robj1, robj2), hop): if hop.s_result.is_constant(): return inputconst(Bool, hop.s_result.const) - if robj1.lowleveltype == Void: + if robj1.lowleveltype is Void: robj1 = robj2 - elif robj2.lowleveltype == Void: + elif robj2.lowleveltype is Void: robj2 = robj1 if (not isinstance(robj1.lowleveltype, Ptr) or not isinstance(robj2.lowleveltype, Ptr)): @@ -287,7 +287,7 @@ raise TypeError(repr(reqtype)) # Void Constants can hold any value; # non-Void Constants must hold a correctly ll-typed value - if lltype != Void: + if lltype is not Void: try: realtype = typeOf(value) except (AssertionError, AttributeError): Modified: pypy/dist/pypy/rpython/rpbc.py ============================================================================== --- pypy/dist/pypy/rpython/rpbc.py (original) +++ pypy/dist/pypy/rpython/rpbc.py Sat Sep 24 17:25:35 2005 @@ -232,7 +232,7 @@ result = malloc(self.pbc_type, immortal=True) self.pbc_cache[pbc] = result for attr, (mangled_name, r_value) in self.llfieldmap.items(): - if r_value.lowleveltype == Void: + if r_value.lowleveltype is Void: continue try: thisattrvalue = self.access_set.values[(pbc, attr)] @@ -436,7 +436,7 @@ return self.call(hop, f, vlist, rresult) def call(self, hop, f, vlist, rresult): - if self.lowleveltype == Void: + if self.lowleveltype is Void: assert len(self.function_signatures()) == 1 vlist[0] = hop.inputconst(typeOf(f), f) hop.exception_is_here() @@ -466,7 +466,7 @@ # this check makes sense because both source and dest repr are FunctionsPBCRepr if r_fpbc1.lowleveltype == r_fpbc2.lowleveltype: return v - if r_fpbc1.lowleveltype == Void: + if r_fpbc1.lowleveltype is Void: return inputconst(r_fpbc2, r_fpbc1.s_pbc.const) return NotImplemented @@ -622,7 +622,7 @@ def convert_const(self, cls): if cls not in self.s_pbc.prebuiltinstances: raise TyperError("%r not in %r" 
% (cls, self)) - if self.lowleveltype == Void: + if self.lowleveltype is Void: return cls return rclass.get_type_repr(self.rtyper).convert_const(cls) @@ -633,7 +633,7 @@ return self.redispatch_call(hop, call_args=True) def redispatch_call(self, hop, call_args): - if self.lowleveltype != Void: + if self.lowleveltype is not Void: # instantiating a class from multiple possible classes vcls = hop.inputarg(self, arg=0) access_set = self.get_access_set() @@ -696,7 +696,7 @@ # this check makes sense because both source and dest repr are ClassesPBCRepr if r_clspbc1.lowleveltype == r_clspbc2.lowleveltype: return v - if r_clspbc1.lowleveltype == Void: + if r_clspbc1.lowleveltype is Void: return inputconst(r_clspbc2, r_clspbc1.s_pbc.const) return NotImplemented Modified: pypy/dist/pypy/rpython/rtyper.py ============================================================================== --- pypy/dist/pypy/rpython/rtyper.py (original) +++ pypy/dist/pypy/rpython/rtyper.py Sat Sep 24 17:25:35 2005 @@ -212,7 +212,7 @@ if not hasattr(c, 'concretetype'): c = inputconst(using_repr, c.value) else: - if c.concretetype != Void: + if c.concretetype is not Void: assert typeOf(c.value) == using_repr.lowleveltype return c @@ -393,7 +393,7 @@ if hop.s_result.is_constant(): if isinstance(resultvar, Constant) and \ isinstance(hop.r_result.lowleveltype, Primitive) and \ - hop.r_result.lowleveltype != Void: + hop.r_result.lowleveltype is not Void: assert resultvar.value == hop.s_result.const resulttype = resultvar.concretetype op.result.concretetype = hop.r_result.lowleveltype @@ -707,7 +707,7 @@ args_s = [] newargs_v = [] for v in args_v: - if v.concretetype == Void: + if v.concretetype is Void: s_value = rtyper.binding(v) if not s_value.is_constant(): raise TyperError("non-constant variable of type Void") Modified: pypy/dist/pypy/translator/c/database.py ============================================================================== --- pypy/dist/pypy/translator/c/database.py (original) +++ pypy/dist/pypy/translator/c/database.py Sat Sep 24 17:25:35 2005 @@ -69,7 +69,7 @@ resulttype = self.gettype(T.RESULT) argtypes = [] for i in range(len(T.ARGS)): - if T.ARGS[i] != Void: + if T.ARGS[i] is not Void: argtype = self.gettype(T.ARGS[i]) try: argname = argnames[i] Modified: pypy/dist/pypy/translator/c/external.py ============================================================================== --- pypy/dist/pypy/translator/c/external.py (original) +++ pypy/dist/pypy/translator/c/external.py Sat Sep 24 17:25:35 2005 @@ -27,12 +27,12 @@ pass def cfunction_declarations(self): - if self.FUNCTYPE.RESULT != Void: + if self.FUNCTYPE.RESULT is not Void: yield '%s;' % cdecl(self.resulttypename, 'result') def cfunction_body(self): call = '%s(%s)' % (self.fnptr._name, ', '.join(self.argnames())) - if self.FUNCTYPE.RESULT != Void: + if self.FUNCTYPE.RESULT is not Void: yield 'result = %s;' % call yield 'if (PyErr_Occurred()) RPyConvertExceptionFromCPython();' yield 'return result;' Modified: pypy/dist/pypy/translator/c/funcgen.py ============================================================================== --- pypy/dist/pypy/translator/c/funcgen.py (original) +++ pypy/dist/pypy/translator/c/funcgen.py Sat Sep 24 17:25:35 2005 @@ -100,7 +100,7 @@ def expr(self, v, special_case_void=True): if isinstance(v, Variable): - if self.lltypemap(v) == Void and special_case_void: + if self.lltypemap(v) is Void and special_case_void: return '/* nothing */' else: return LOCALVAR % v.name @@ -143,7 +143,7 @@ if name not in seen: seen[name] = True result = 
cdecl(self.lltypename(v), LOCALVAR % name) + ';' - if self.lltypemap(v) == Void: + if self.lltypemap(v) is Void: result = '/*%s*/' % result result_by_name.append((v._name, result)) result_by_name.sort() @@ -174,7 +174,7 @@ multiple_times_alive = [] for a1, a2 in zip(link.args, link.target.inputargs): a2type, a2typename = self.lltypes[id(a2)] - if a2type == Void: + if a2type is Void: continue if a1 in linklocalvars: src = linklocalvars[a1] @@ -237,7 +237,7 @@ to_release.append(op.result) T = self.lltypemap(op.result) - if T != Void: + if T is not Void: res = LOCALVAR % op.result.name line = push_alive_op_result(op.opname, res, T) if line: @@ -395,8 +395,8 @@ def OP_DIRECT_CALL(self, op, err): # skip 'void' arguments - args = [self.expr(v) for v in op.args if self.lltypemap(v) != Void] - if self.lltypemap(op.result) == Void: + args = [self.expr(v) for v in op.args if self.lltypemap(v) is not Void] + if self.lltypemap(op.result) is Void: # skip assignment of 'void' return value return '%s(%s); if (RPyExceptionOccurred()) FAIL(%s);' % ( args[0], ', '.join(args[1:]), err) @@ -414,7 +414,7 @@ if T == PyObjPtr: result.append(self.pyobj_incref_expr(newvalue, T)) result = '\t'.join(result) - if T == Void: + if T is Void: result = '/* %s */' % result return result @@ -425,7 +425,7 @@ T = self.lltypemap(op.args[2]) self.gcpolicy.write_barrier(result, newvalue, T, targetexpr) result = '\t'.join(result) - if T == Void: + if T is Void: result = '/* %s */' % result return result @@ -506,7 +506,7 @@ itemtypename = self.db.gettype(VARPART.OF) elength = self.expr(op.args[1]) eresult = self.expr(op.result) - if VARPART.OF == Void: # strange + if VARPART.OF is Void: # strange esize = 'sizeof(%s)' % (cdecl(typename, ''),) else: esize = 'sizeof(%s)+((%s-1)*sizeof(%s))' % (cdecl(typename, ''), @@ -532,7 +532,7 @@ result = [] TYPE = self.lltypemap(op.result) assert self.lltypemap(op.args[0]) == TYPE - if TYPE != Void: + if TYPE is not Void: result.append('%s = %s;' % (self.expr(op.result), self.expr(op.args[0]))) if TYPE == PyObjPtr: Modified: pypy/dist/pypy/translator/c/node.py ============================================================================== --- pypy/dist/pypy/translator/c/node.py (original) +++ pypy/dist/pypy/translator/c/node.py Sat Sep 24 17:25:35 2005 @@ -142,7 +142,7 @@ STRUCT = self.STRUCT for name in STRUCT._names: FIELD_T = self.c_struct_field_type(name) - if FIELD_T == Void: + if FIELD_T is Void: yield '-1' else: cname = self.c_struct_field_name(name) @@ -201,7 +201,7 @@ yield '\t' + line yield '\tlong length;' line = '%s;' % cdecl(self.itemtypename, 'items[%d]'% self.varlength) - if self.ARRAY.OF == Void: # strange + if self.ARRAY.OF is Void: # strange line = '/* %s */' % line yield '\t' + line yield '};' @@ -241,7 +241,7 @@ def debug_offsets(self): # generate three offsets for debugging inspection yield 'offsetof(struct %s, length)' % (self.name,) - if self.ARRAY.OF != Void: + if self.ARRAY.OF is not Void: yield 'offsetof(struct %s, items[0])' % (self.name,) yield 'offsetof(struct %s, items[1])' % (self.name,) else: @@ -381,7 +381,7 @@ line = self.db.gcpolicy.array_gcheader_initializationexpr(self) if line: yield '\t' + line - if self.T.OF == Void or len(self.obj.items) == 0: + if self.T.OF is Void or len(self.obj.items) == 0: yield '\t%d' % len(self.obj.items) yield '}' elif self.T.OF == Char: @@ -416,7 +416,7 @@ node.where_to_copy_me.append('&%s' % access_expr) else: expr = db.get(value) - if typeOf(value) == Void: + if typeOf(value) is Void: comma = '' expr += comma i = 
expr.find('\n') Modified: pypy/dist/pypy/translator/c/symboltable.py ============================================================================== --- pypy/dist/pypy/translator/c/symboltable.py (original) +++ pypy/dist/pypy/translator/c/symboltable.py Sat Sep 24 17:25:35 2005 @@ -107,7 +107,7 @@ if isinstance(FIELD_TYPE, ContainerType): return debugptr(Ptr(FIELD_TYPE), address, self._symtable) elif isinstance(FIELD_TYPE, Primitive): - if FIELD_TYPE == Void: + if FIELD_TYPE is Void: return None tag = PrimitiveTag[FIELD_TYPE] size = struct.calcsize(tag) From arigo at codespeak.net Sat Sep 24 17:40:05 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 24 Sep 2005 17:40:05 +0200 (CEST) Subject: [pypy-svn] r17831 - pypy/dist/pypy/interpreter Message-ID: <20050924154005.3EB0227BD0@code1.codespeak.net> Author: arigo Date: Sat Sep 24 17:40:00 2005 New Revision: 17831 Modified: pypy/dist/pypy/interpreter/pyopcode.py Log: (pedronis, arigo) Overlooked yet another usage of '== space.w_None'. This one was spelled 'is not f.space.w_None'... But actually using space.is_w() here makes the flow space very unhappy, so care is neeeded. Obscure stuff. Modified: pypy/dist/pypy/interpreter/pyopcode.py ============================================================================== --- pypy/dist/pypy/interpreter/pyopcode.py (original) +++ pypy/dist/pypy/interpreter/pyopcode.py Sat Sep 24 17:40:00 2005 @@ -379,9 +379,10 @@ f.valuestack.pop() # ignore the exception type f.valuestack.pop() # ignore the exception value w_unroller = f.valuestack.pop() - if w_unroller is not f.space.w_None: + unroller = f.space.interpclass_w(w_unroller) + if unroller is not None: # re-raise the unroller, if any - raise f.space.interpclass_w(w_unroller) + raise unroller def BUILD_CLASS(f): w_methodsdict = f.valuestack.pop() From arigo at codespeak.net Sat Sep 24 17:57:01 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 24 Sep 2005 17:57:01 +0200 (CEST) Subject: [pypy-svn] r17832 - in pypy/dist/pypy: annotation rpython Message-ID: <20050924155701.5064D27BD0@code1.codespeak.net> Author: arigo Date: Sat Sep 24 17:56:57 2005 New Revision: 17832 Modified: pypy/dist/pypy/annotation/model.py pypy/dist/pypy/rpython/lltype.py Log: * Minor optimizations. * Safety-check that the annotator is not running on 'python -O'. Modified: pypy/dist/pypy/annotation/model.py ============================================================================== --- pypy/dist/pypy/annotation/model.py (original) +++ pypy/dist/pypy/annotation/model.py Sat Sep 24 17:56:57 2005 @@ -482,12 +482,20 @@ def unionof(*somevalues): "The most precise SomeValue instance that contains all the values." - s1 = SomeImpossibleValue() - for s2 in somevalues: + try: + s1, s2 = somevalues + except ValueError: + s1 = SomeImpossibleValue() + for s2 in somevalues: + if s1 != s2: + s1 = pair(s1, s2).union() + else: + # this is just a performance shortcut if s1 != s2: s1 = pair(s1, s2).union() - if DEBUG and s1.caused_by_merge is None and len(somevalues) > 1: - s1.caused_by_merge = somevalues + if DEBUG: + if s1.caused_by_merge is None and len(somevalues) > 1: + s1.caused_by_merge = somevalues return s1 def isdegenerated(s_value): @@ -548,6 +556,17 @@ return SomeImpossibleValue() setattr(cls, name, default_op) +# +# safety check that no-one is trying to make annotation and translation +# faster by providing the -O option to Python. 
+try: + assert False +except AssertionError: + pass # fine +else: + raise RuntimeError("The annotator relies on 'assert' statements from the\n" + "\tannotated program: you cannot run it with 'python -O'.") + # this has the side-effect of registering the unary and binary operations from pypy.annotation.unaryop import UNARY_OPERATIONS from pypy.annotation.binaryop import BINARY_OPERATIONS Modified: pypy/dist/pypy/rpython/lltype.py ============================================================================== --- pypy/dist/pypy/rpython/lltype.py (original) +++ pypy/dist/pypy/rpython/lltype.py Sat Sep 24 17:56:57 2005 @@ -622,11 +622,12 @@ if field_name in self._T._flds: T1 = self._T._flds[field_name] T2 = typeOf(val) - if T1 != T2: + if T1 == T2: + setattr(self._obj, field_name, val) + else: raise TypeError("%r instance field %r:\n" "expects %r\n" " got %r" % (self._T, field_name, T1, T2)) - setattr(self._obj, field_name, val) return raise AttributeError("%r instance has no field %r" % (self._T, field_name)) From arigo at codespeak.net Sat Sep 24 18:09:25 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 24 Sep 2005 18:09:25 +0200 (CEST) Subject: [pypy-svn] r17833 - pypy/dist/pypy/objspace/test Message-ID: <20050924160925.3626927BD0@code1.codespeak.net> Author: arigo Date: Sat Sep 24 18:09:24 2005 New Revision: 17833 Modified: pypy/dist/pypy/objspace/test/test_thunkobjspace.py Log: Should not use become() on ints. Now that we try to reuse a bit W_IntObject, this is not a good idea at all. Modified: pypy/dist/pypy/objspace/test/test_thunkobjspace.py ============================================================================== --- pypy/dist/pypy/objspace/test/test_thunkobjspace.py (original) +++ pypy/dist/pypy/objspace/test/test_thunkobjspace.py Sat Sep 24 18:09:24 2005 @@ -37,24 +37,24 @@ assert d[7] == [43] def test_become(self): - x = 5 - y = 6 + x = [] + y = [] assert x is not y become(x, y) assert x is y def test_id(self): # these are the Smalltalk semantics of become(). - x = 5; idx = id(x) - y = 6; idy = id(y) + x = []; idx = id(x) + y = []; idy = id(y) assert idx != idy become(x, y) assert id(x) == id(y) == idy def test_double_become(self): - x = 5 - y = 6 - z = 7 + x = [] + y = [] + z = [] become(x, y) become(y, z) assert x is y is z From arigo at codespeak.net Sat Sep 24 19:59:59 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 24 Sep 2005 19:59:59 +0200 (CEST) Subject: [pypy-svn] r17837 - in pypy/dist/pypy: objspace/flow translator translator/backendopt Message-ID: <20050924175959.93DF227B71@code1.codespeak.net> Author: arigo Date: Sat Sep 24 19:59:53 2005 New Revision: 17837 Modified: pypy/dist/pypy/objspace/flow/model.py pypy/dist/pypy/translator/backendopt/ssa.py pypy/dist/pypy/translator/simplify.py Log: (pedronis, arigo) * rewrote remove_identical_vars(). The algorithm is somehow getting clearer, though the code doesn't necessary shows this. Well, the comments help, maybe. * added FunctionGraph.iterblocks() and FunctionGraph.iterlinks(), which are actually not used by the rest of this check-in. 
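The two new helpers mentioned above, FunctionGraph.iterblocks() and FunctionGraph.iterlinks(), walk every block (respectively every link) reachable from the start block exactly once; their real implementation is in the model.py diff below. As a rough self-contained sketch of the same traversal, using simplified stand-in Block/Link classes rather than the actual pypy.objspace.flow.model ones::

    # Simplified stand-ins: the real classes carry operations, input
    # arguments, exit switches and so on.
    class Block:
        def __init__(self, name):
            self.name = name
            self.exits = []              # outgoing Links

    class Link:
        def __init__(self, target):
            self.target = target         # the Block this link jumps to

    def iterblocks(startblock):
        """Yield each reachable block exactly once."""
        pending = [startblock]
        seen = {id(startblock): True}
        for block in pending:            # 'pending' grows while we loop over it
            yield block
            for link in block.exits:
                targetid = id(link.target)
                if targetid not in seen:
                    pending.append(link.target)
                    seen[targetid] = True

    # usage sketch: a small diamond-shaped graph a -> b, a -> c, b -> d, c -> d
    a, b, c, d = Block('a'), Block('b'), Block('c'), Block('d')
    a.exits = [Link(b), Link(c)]
    b.exits = [Link(d)]
    c.exits = [Link(d)]
    assert [blk.name for blk in iterblocks(a)] == ['a', 'b', 'c', 'd']

Iterating over the 'pending' list while appending to it gives a simple breadth-first walk without an explicit queue index, which is the same trick the methods in the diff below rely on.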
Modified: pypy/dist/pypy/objspace/flow/model.py ============================================================================== --- pypy/dist/pypy/objspace/flow/model.py (original) +++ pypy/dist/pypy/objspace/flow/model.py Sat Sep 24 19:59:53 2005 @@ -68,21 +68,27 @@ return getsource(self.func) source = roproperty(getsource) -## def hasonlyexceptionreturns(self): -## try: -## return self._onlyex -## except AttributeError: -## def visit(link): -## if isinstance(link, Link): -## if link.target == self.returnblock: -## raise ValueError(link) -## try: -## traverse(visit, self) -## except ValueError: -## self._onlyex = False -## else: -## self._onlyex = True -## return self._onlyex + def iterblocks(self): + pending = [self.startblock] + seen = {id(self.startblock): True} + for block in pending: + yield block + for link in block.exits: + targetid = id(link.target) + if targetid not in seen: + pending.append(link.target) + seen[targetid] = True + + def iterlinks(self): + pending = [self.startblock] + seen = {id(self.startblock): True} + for block in pending: + for link in block.exits: + yield link + targetid = id(link.target) + if targetid not in seen: + pending.append(link.target) + seen[targetid] = True def show(self): from pypy.translator.tool.graphpage import SingleGraphPage Modified: pypy/dist/pypy/translator/backendopt/ssa.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/ssa.py (original) +++ pypy/dist/pypy/translator/backendopt/ssa.py Sat Sep 24 19:59:53 2005 @@ -1,55 +1,91 @@ from pypy.objspace.flow.model import Variable, mkentrymap, flatten, Block from pypy.tool.unionfind import UnionFind -def data_flow_families(graph): - """Follow the flow of the data in the graph. Returns a UnionFind grouping +class DataFlowFamilyBuilder: + """Follow the flow of the data in the graph. Builds a UnionFind grouping all the variables by families: each family contains exactly one variable where a value is stored into -- either by an operation or a merge -- and all following variables where the value is just passed unmerged into the next block. """ - # Build a list of "unification opportunities": for each block and each 'n', - # an "opportunity" is the list of the block's nth input variable plus - # the nth output variable from each of the incoming links. - opportunities = [] - for block, links in mkentrymap(graph).items(): - if block is graph.startblock: - continue - assert links - for n, inputvar in enumerate(block.inputargs): - vars = [inputvar] - for link in links: - var = link.args[n] - if not isinstance(var, Variable): - break - vars.append(var) - else: - # if no Constant found in the incoming links - opportunities.append(vars) - - # An "opportunitiy" that lists exactly two distinct variables means that - # the two variables can be unified. We maintain the unification status in - # 'variable_families'. When variables are unified, it might reduce the - # number of distinct variables and thus open other "opportunities" for - # unification. - progress = True - variable_families = UnionFind() - while progress: - progress = False - pending_opportunities = [] - for vars in opportunities: - repvars = [variable_families.find_rep(v1) for v1 in vars] - repvars = dict.fromkeys(repvars).keys() - if len(repvars) > 2: - # cannot unify now, but maybe later? - pending_opportunities.append(repvars) - elif len(repvars) == 2: - # unify! 
- variable_families.union(*repvars) - progress = True - opportunities = pending_opportunities - return variable_families + def __init__(self, graph): + # Build a list of "unification opportunities": for each block and each + # 'n', an "opportunity" groups the block's nth input variable with + # the nth output variable from each of the incoming links, in a list: + # [Block, blockvar, linkvar, linkvar, linkvar...] + opportunities = [] + for block, links in mkentrymap(graph).items(): + if block is graph.startblock: + continue + assert links + for n, inputvar in enumerate(block.inputargs): + vars = [block, inputvar] + for link in links: + var = link.args[n] + if not isinstance(var, Variable): + break + vars.append(var) + else: + # if no Constant found in the incoming links + opportunities.append(vars) + self.opportunities = opportunities + self.variable_families = UnionFind() + + def complete(self): + # An "opportunitiy" that lists exactly two distinct variables means that + # the two variables can be unified. We maintain the unification status + # in 'variable_families'. When variables are unified, it might reduce + # the number of distinct variables and thus open other "opportunities" + # for unification. + variable_families = self.variable_families + any_progress_at_all = False + progress = True + while progress: + progress = False + pending_opportunities = [] + for vars in self.opportunities: + repvars = [variable_families.find_rep(v1) for v1 in vars[1:]] + repvars_without_duplicates = dict.fromkeys(repvars) + count = len(repvars_without_duplicates) + if count > 2: + # cannot unify now, but maybe later? + pending_opportunities.append(vars[:1] + repvars) + elif count == 2: + # unify! + variable_families.union(*repvars_without_duplicates) + progress = True + self.opportunities = pending_opportunities + any_progress_at_all |= progress + return any_progress_at_all + + def merge_identical_phi_nodes(self): + variable_families = self.variable_families + any_progress_at_all = False + progress = True + while progress: + progress = False + block_phi_nodes = {} # in the SSA sense + for vars in self.opportunities: + block, blockvar = vars[:2] + linksvars = vars[2:] # from the incoming links + linksvars = [variable_families.find_rep(v) for v in linksvars] + phi_node = (block,) + tuple(linksvars) # ignoring n and blockvar + if phi_node in block_phi_nodes: + # already seen: we have two phi nodes in the same block that + # get exactly the same incoming vars. Identify the results. + blockvar1 = block_phi_nodes[phi_node] + if variable_families.union(blockvar1, blockvar)[0]: + progress = True + else: + block_phi_nodes[phi_node] = blockvar + any_progress_at_all |= progress + return any_progress_at_all + + def get_variable_families(self): + self.complete() + return self.variable_families + def SSI_to_SSA(graph): """Rename the variables in a flow graph as much as possible without @@ -61,7 +97,7 @@ result of an operation only once in the whole graph, but it can be passed to other blocks across links. 
""" - variable_families = data_flow_families(graph) + variable_families = DataFlowFamilyBuilder(graph).get_variable_families() # rename variables to give them the name of their familiy representant for v in variable_families.keys(): v1 = variable_families.find_rep(v) Modified: pypy/dist/pypy/translator/simplify.py ============================================================================== --- pypy/dist/pypy/translator/simplify.py (original) +++ pypy/dist/pypy/translator/simplify.py Sat Sep 24 19:59:53 2005 @@ -453,44 +453,59 @@ which otherwise doesn't realize that tests performed on one of the copies of the variable also affect the other.""" - from pypy.translator.backendopt.ssa import data_flow_families - entrymapitems = mkentrymap(graph).items() - progress = True - while progress: - variable_families = data_flow_families(graph) - progress = False - for block, links in entrymapitems: - if not block.exits: - continue - entryargs = {} - for i in range(len(block.inputargs)): - # list of possible vars that can arrive in i'th position - key = [] + # This algorithm is based on DataFlowFamilyBuilder, used as a + # "phi node remover" (in the SSA sense). 'variable_families' is a + # UnionFind object that groups variables by families; variables from the + # same family can be identified, and if two input arguments of a block + # end up in the same family, then we really remove one of them in favor + # of the other. + # + # The idea is to identify as much variables as possible by trying + # iteratively two kinds of phi node removal: + # + # * "vertical", by identifying variables from different blocks, when + # we see that a value just flows unmodified into the next block without + # needing any merge (this is what backendopt.ssa.SSI_to_SSA() would do + # as well); + # + # * "horizontal", by identifying two input variables of the same block, + # when these two variables' phi nodes have the same argument -- i.e. + # when for all possible incoming paths they would get twice the same + # value (this is really the purpose of remove_identical_vars()). + # + from pypy.translator.backendopt.ssa import DataFlowFamilyBuilder + builder = DataFlowFamilyBuilder(graph) + variable_families = builder.get_variable_families() # vertical removal + while True: + if not builder.merge_identical_phi_nodes(): # horizontal removal + break + if not builder.complete(): # vertical removal + break + + for block, links in mkentrymap(graph).items(): + if block is graph.startblock: + continue + renaming = {} + family2blockvar = {} + kills = [] + for i, v in enumerate(block.inputargs): + v1 = variable_families.find_rep(v) + if v1 in family2blockvar: + # already seen -- this variable can be shared with the + # previous one + renaming[v] = family2blockvar[v1] + kills.append(i) + else: + family2blockvar[v1] = v + if renaming: + block.renamevariables(renaming) + # remove the now-duplicate input variables + kills.reverse() # starting from the end + for i in kills: + del block.inputargs[i] for link in links: - v = link.args[i] - if isinstance(v, Constant): - break - key.append(variable_families.find_rep(v)) - else: # if no Constant - key = tuple(key) - if key not in entryargs: - entryargs[key] = i - else: - j = entryargs[key] - # positions i and j receive exactly the same input - # vars, we can remove the argument i and replace it - # with the j. 
- argi = block.inputargs[i] - argj = block.inputargs[j] - block.renamevariables({argi: argj}) - assert block.inputargs[i] == block.inputargs[j]== argj - del block.inputargs[i] - for link in links: - assert (variable_families.find_rep(link.args[i])== - variable_families.find_rep(link.args[j])) - del link.args[i] - progress = True - break # block.inputargs mutated + del link.args[i] + def coalesce_is_true(graph): """coalesce paths that go through an is_true and a directly successive From arigo at codespeak.net Sat Sep 24 20:10:24 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 24 Sep 2005 20:10:24 +0200 (CEST) Subject: [pypy-svn] r17838 - pypy/dist/pypy/translator/backendopt/test Message-ID: <20050924181024.DB87C27B83@code1.codespeak.net> Author: arigo Date: Sat Sep 24 20:10:22 2005 New Revision: 17838 Modified: pypy/dist/pypy/translator/backendopt/test/test_ssa.py Log: Forgotten update. Modified: pypy/dist/pypy/translator/backendopt/test/test_ssa.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_ssa.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_ssa.py Sat Sep 24 20:10:22 2005 @@ -18,7 +18,7 @@ if isinstance(block, Block): operations += block.operations - variable_families = data_flow_families(graph) + variable_families = DataFlowFamilyBuilder(graph).get_variable_families() # we expect to find xx only once: v_xx = variable_families.find_rep(graph.getargs()[0]) From arigo at codespeak.net Sat Sep 24 20:21:18 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sat, 24 Sep 2005 20:21:18 +0200 (CEST) Subject: [pypy-svn] r17839 - in pypy/dist/pypy/module/sys: . test Message-ID: <20050924182118.6DD1027B83@code1.codespeak.net> Author: arigo Date: Sat Sep 24 20:21:14 2005 New Revision: 17839 Modified: pypy/dist/pypy/module/sys/__init__.py pypy/dist/pypy/module/sys/test/test_sysmodule.py pypy/dist/pypy/module/sys/version.py Log: Playing around with version numbers in the sys.pypy_* attributes. The pypy_svn_url attribute now also contains the revision number, to make it clear that it's a revision number and that both informations are useful mostly only if they are together. Modified: pypy/dist/pypy/module/sys/__init__.py ============================================================================== --- pypy/dist/pypy/module/sys/__init__.py (original) +++ pypy/dist/pypy/module/sys/__init__.py Sat Sep 24 20:21:14 2005 @@ -52,7 +52,7 @@ 'version_info' : 'version.get_version_info(space)', 'version' : 'version.get_version(space)', 'pypy_version_info' : 'version.get_pypy_version_info(space)', - 'pypy_svn_url' : 'space.wrap("$HeadURL$"[10:-29])', + 'pypy_svn_url' : 'version.get_svn_url(space)', 'hexversion' : 'version.get_hexversion(space)', 'ps1' : 'space.wrap(">>>> ")', 'ps2' : 'space.wrap(".... ")', Modified: pypy/dist/pypy/module/sys/test/test_sysmodule.py ============================================================================== --- pypy/dist/pypy/module/sys/test/test_sysmodule.py (original) +++ pypy/dist/pypy/module/sys/test/test_sysmodule.py Sat Sep 24 20:21:14 2005 @@ -358,7 +358,7 @@ assert isinstance(vi[1], int) assert isinstance(vi[2], int) assert vi[3] in ("alpha", "beta", "candidate", "final") - assert isinstance(vi[4], int) or vi[4] == '?' 
+ assert isinstance(vi[4], int) def test_allattributes(self): sys.__dict__ # check that we don't crash initializing any attribute Modified: pypy/dist/pypy/module/sys/version.py ============================================================================== --- pypy/dist/pypy/module/sys/version.py (original) +++ pypy/dist/pypy/module/sys/version.py Sat Sep 24 20:21:14 2005 @@ -11,6 +11,8 @@ PYPY_VERSION = (0, 7, 1, "alpha", '?') # the last item is replaced by the svn revision ^^^ +SVN_URL = "$HeadURL: http://codespeak.net/svn/pypy/dist/pypy/module/sys/version.py $"[10:-28] + # ____________________________________________________________ @@ -37,6 +39,9 @@ ver = ver[:-1] + (svn_revision(),) return space.wrap(ver) +def get_svn_url(space): + return space.wrap((SVN_URL, svn_revision())) + def tuple2hex(ver): d = {'alpha': 0xA, 'beta': 0xB, @@ -56,7 +61,7 @@ "Return the last-changed svn revision number." # NB. we hack the number directly out of the .svn directory to avoid # to depend on an external 'svn' executable in the path. - rev = '?' + rev = 0 try: f = open(os.path.join(autopath.pypydir, '.svn', 'entries'), 'r') for line in f: From pedronis at codespeak.net Sun Sep 25 15:13:58 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Sun, 25 Sep 2005 15:13:58 +0200 (CEST) Subject: [pypy-svn] r17851 - pypy/dist/pypy/translator/goal Message-ID: <20050925131358.B1B8D27B8D@code1.codespeak.net> Author: pedronis Date: Sun Sep 25 15:13:57 2005 New Revision: 17851 Modified: pypy/dist/pypy/translator/goal/query.py Log: query to find backedgges in the callgraph Modified: pypy/dist/pypy/translator/goal/query.py ============================================================================== --- pypy/dist/pypy/translator/goal/query.py (original) +++ pypy/dist/pypy/translator/goal/query.py Sun Sep 25 15:13:57 2005 @@ -513,3 +513,27 @@ l.sort() for name, c in l: print name, c + +def backcalls(t): + g = {} + for caller, callee in t.callgraph.itervalues(): + g.setdefault(caller,[]).append(callee) + + back = [] + color = {} + WHITE, GRAY, BLACK = 0,1,2 + + def visit(fcur,witness=[]): + color[fcur] = GRAY + for f in dict.fromkeys(g.get(fcur, [])): + fcolor = color.get(f, WHITE) + if fcolor == WHITE: + visit(f,witness+[f]) + elif fcolor == GRAY: + print "*", witness, f + back.append((fcur, f)) + color[fcur] = BLACK + + visit(t.entrypoint, [t.entrypoint]) + + return back From arigo at codespeak.net Sun Sep 25 15:16:47 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 25 Sep 2005 15:16:47 +0200 (CEST) Subject: [pypy-svn] r17852 - pypy/dist/pypy/doc Message-ID: <20050925131647.6B93727B85@code1.codespeak.net> Author: arigo Date: Sun Sep 25 15:16:44 2005 New Revision: 17852 Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Log: Writing. Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt ============================================================================== --- pypy/dist/pypy/doc/draft-dynamic-language-translation.txt (original) +++ pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Sun Sep 25 15:16:44 2005 @@ -339,14 +339,34 @@ arguments. Merely knowing that a variable will be used in an addition does not give much information per se. For this reason, our annotator works by flowing annotations forward, operation after operation, i.e. by -performing abstract interpretation of the flow graphs. 
By contrast, the -well-known `Hindley-Milner`_ type inference algorithm works in an -inside-out direction, by starting from individual operations and -propagating type constrains outwards. +performing abstract interpretation of the flow graphs. In a sense, it +is a more naive approach than the one taken by type systems specifically +designed to enable more advanced inference algorithms. For example, +`Hindley-Milner`_ type inference works in an inside-out direction, by +starting from individual operations and propagating type constrains +outwards. + +Naturally, simply propagating annotations forward requires the use of a +fixpoint algorithm in the presence of loops in the flow graphs or in the +inter-procedural call graph. Indeed, we flow annotations forward from +the beginning of the entry point function into each block, operation +after operation, and follow all calls recursively. During this process, +each variable along the way gets an annotation. In various cases, +e.g. when we close a loop, the previously assigned annotations can be +found to be too restrictive. In this case, we generalize them to allow +for a larger set of possible run-time values, and schedule the block +where they appear for reflowing. The more general annotations can +generalize the annotations of the results of the variables in the block, +which in turn can generalize the annotations that flow into the +following blocks, and so on. This process continues until a fixpoint is +reached. + +We can consider that all variables are initially assigned the "bottom" +annotation corresponding to an empty set of possible run-time values. +Annotations can only ever be generalized, and the model is simple enough +to show that there is no infinite chain of generalization, so that this +process necessarily terminates, as we will show in the sequel. -We use a fixpoint algorithm XXX - -XXX only generalizing and no infinite chain Annotation model ~~~~~~~~~~~~~~~~ @@ -358,6 +378,8 @@ Prebuilt constants ~~~~~~~~~~~~~~~~~~ +XXX + Mutable objects ~~~~~~~~~~~~~~~ From arigo at codespeak.net Sun Sep 25 19:07:40 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 25 Sep 2005 19:07:40 +0200 (CEST) Subject: [pypy-svn] r17863 - pypy/dist/pypy/doc Message-ID: <20050925170740.B313B27B8D@code1.codespeak.net> Author: arigo Date: Sun Sep 25 19:07:38 2005 New Revision: 17863 Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Log: Drafting the formal notations and rules for the annotation model. Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt ============================================================================== --- pypy/dist/pypy/doc/draft-dynamic-language-translation.txt (original) +++ pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Sun Sep 25 19:07:38 2005 @@ -371,6 +371,93 @@ Annotation model ~~~~~~~~~~~~~~~~ +:: + + Bot + + Top + + Int + + NonNegInt + + UnsignedInt + + Bool + + Float + + Str + + NullableStr + + Char + + Inst(class) + + List(x) + + Dict(x, y) + + Tup(ann_1, ..., ann_n) + + Pbc({... a finite set ...}) + + with: None + f + class + class.f + + v_n = op(v_n1, ...) 
| v_n', v_n'' + + v_class.attr + + v_n: Annotation + + + E: eq rel on V + b: V->A + + + + E(x,y) + ---------------------------------------- + merge_into(x,y) + merge_into(y,x) + + + z=add(x,y), b(x)=List(v), b(y)=List(w) + -------------------------------------------- + E' = E union (v~w) + b' = b with (z->List(v)) + + + z=add(x,y), b(x)<=NullableStr, b(y)<=NullableStr + ------------------------------------------------------ + b' = b with (z->Str) + + + merge_into(x,y), b(x)=List(v) + --------------------------------------------------------- + E' = E union (v~w) if b(y)=List(w) + b' = b with (y->b(x)\/b(y)) otherwise + + + z=new_list() | z' + ------------------------------------- + b' = b with (z->List(z')) + + + z=getitem(x,y), b(x)=List(v), b(y)=Int + -------------------------------------------- + E' = E union (z~v) + + + setitem(x,y,z), b(x)=List(v), b(y)=Int + -------------------------------------------- + merge_into(z,v) + + XXX model and rules XXX constant propagation From arigo at codespeak.net Sun Sep 25 21:02:52 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Sun, 25 Sep 2005 21:02:52 +0200 (CEST) Subject: [pypy-svn] r17866 - pypy/dist/pypy/doc Message-ID: <20050925190252.71C9C27B86@code1.codespeak.net> Author: arigo Date: Sun Sep 25 21:02:49 2005 New Revision: 17866 Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Log: More drafting. Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt ============================================================================== --- pypy/dist/pypy/doc/draft-dynamic-language-translation.txt (original) +++ pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Sun Sep 25 21:02:49 2005 @@ -414,9 +414,15 @@ v_n: Annotation + for each function f: + arg_f_1 ... arg_f_n + returnvar_f + E: eq rel on V b: V->A + V: set of variables + A: fixed lattice of the above annotation terms @@ -437,10 +443,14 @@ b' = b with (z->Str) - merge_into(x,y), b(x)=List(v) - --------------------------------------------------------- - E' = E union (v~w) if b(y)=List(w) - b' = b with (y->b(x)\/b(y)) otherwise + merge_into(x,y), b(x)=List(v), b(y)=List(w) + ------------------------------------------------- + E' = E union (v~w) + + + merge_into(x,y), b(x) and b(y) not both Lists + --------------------------------------------------- + b' = b with (y->b(x)\/b(y)) z=new_list() | z' @@ -448,16 +458,63 @@ b' = b with (z->List(z')) - z=getitem(x,y), b(x)=List(v), b(y)=Int + z=getitem(x,y) | z', b(x)=List(v) -------------------------------------------- - E' = E union (z~v) + E' = E union (z'~v) + b' = b with (z->b(z')) - setitem(x,y,z), b(x)=List(v), b(y)=Int + setitem(x,y,z), b(x)=List(v) -------------------------------------------- merge_into(z,v) + z=getattr(x,attr) | z', b(x)=Inst(A) + --------------------------------------------------------------------- + E' = E union (A.attr ~ A'.attr) for all A' subclass of A + E' = E union (z' ~ A.attr) + b' = b with (z->lookup_filter(b(z'), A)) + + + setattr(x,attr,z), b(x)=Inst(A) + --------------------------------------------------------------------- + assert b(z) is not a Pbc containing methods + E' = E union (A.attr ~ A'.attr) for all A' subclass of A + merge_into(z, A.attr) + + + z=simplecall(x,y1,...,yn), b(x)=Pbc(set) + --------------------------------------------------------------------- + for each c in set: + if c is a function: + E' = E union (z~returnvar_c) + merge_into(y1, arg_c_1) + ... 
+ merge_into(yn, arg_c_n) + if c is a class: + let f = c.__init__ + b' = b with (z->b(z)\/Inst(c)) + b' = b with (arg_f_1->b(arg_f_1)\/Inst(c)) + merge_into(y1, arg_f_2) + ... + merge_into(yn, arg_f_(n+1)) + if c is a method: + let class.f = c + E' = E union (z~returnvar_f) + b' = b with (arg_f_1->b(arg_f_1)\/Inst(class)) + merge_into(y1, arg_f_2) + ... + merge_into(yn, arg_f_(n+1)) + + + lookup_filter(Pbc(set), class) = Pbc(newset) where + we only keep in newset the non-methods, and the following methods: + * the ones bound to a strict subclass of 'class', and + * among the methods bound the 'class' or superclasses, only the + one from the most derived class. + lookup_filter(NonPbcAnnotation, class) = NonPbcAnnotation + + XXX model and rules XXX constant propagation From ac at codespeak.net Mon Sep 26 11:31:54 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Mon, 26 Sep 2005 11:31:54 +0200 (CEST) Subject: [pypy-svn] r17870 - pypy/dist/pypy/interpreter/pyparser Message-ID: <20050926093154.BE9B427B7C@code1.codespeak.net> Author: ac Date: Mon Sep 26 11:31:54 2005 New Revision: 17870 Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py Log: Fix some error messages. Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Mon Sep 26 11:31:54 2005 @@ -103,7 +103,7 @@ break elif cur_token.get_value() == 'for': if len(arguments) != 1: - raise ValueError('SyntaxError("invalid syntax")') + raise SyntaxError("invalid syntax") expr = arguments[0] genexpr_for = parse_genexpr_for(tokens[index:]) genexpr_for[0].is_outmost = True @@ -306,7 +306,7 @@ genexpr_fors.append(ast.GenExprFor(ass_node, iterable, ifs, lineno)) ifs = [] else: - assert False, 'Unexpected token: expected for in genexpr' + raise SyntaxError('invalid syntax') return genexpr_fors From mwh at codespeak.net Mon Sep 26 13:42:35 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Mon, 26 Sep 2005 13:42:35 +0200 (CEST) Subject: [pypy-svn] r17871 - pypy/extradoc/sprintinfo Message-ID: <20050926114235.B7E4227B84@code1.codespeak.net> Author: mwh Date: Mon Sep 26 13:42:35 2005 New Revision: 17871 Modified: pypy/extradoc/sprintinfo/paris-2005-people.txt Log: add myself to paris-2005-people.txt I guess I need to think about accomodation soon... Modified: pypy/extradoc/sprintinfo/paris-2005-people.txt ============================================================================== --- pypy/extradoc/sprintinfo/paris-2005-people.txt (original) +++ pypy/extradoc/sprintinfo/paris-2005-people.txt Mon Sep 26 13:42:35 2005 @@ -19,6 +19,7 @@ Armin Rigo ? ? Samuele Pedroni ? ? Holger Krekel ? ? +Michael Hudson ? ? =================== ============== ===================== People on the following list are likely to come and were From ac at codespeak.net Mon Sep 26 13:50:30 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Mon, 26 Sep 2005 13:50:30 +0200 (CEST) Subject: [pypy-svn] r17872 - pypy/dist/lib-python/modified-2.4.1/test Message-ID: <20050926115030.752FE27B7C@code1.codespeak.net> Author: ac Date: Mon Sep 26 13:50:30 2005 New Revision: 17872 Modified: pypy/dist/lib-python/modified-2.4.1/test/test_genexps.py Log: Adjust tests to account for implementation details. 
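The implementation details in question are visible in the diff below: the tuple-id doctest is disabled, presumably because it relies on CPython happening to hand back each result tuple at the same memory address (the previous tuple is already freed when the next one is built), and the expected wording of the integer-division error message is changed to PyPy's. Neither outcome is guaranteed by the language; a small illustration, for what it shows on CPython 2.4 only::

    # Whether these ids coincide depends on the implementation's memory
    # reuse; CPython 2.4 prints 0 here, but nothing requires that.
    tupleids = map(id, ((i, i) for i in xrange(10)))
    print max(tupleids) - min(tupleids)

    # The exception text is likewise unspecified:
    try:
        10 // 0
    except ZeroDivisionError, e:
        # CPython says "integer division or modulo by zero",
        # PyPy at this revision says "integer division by zero".
        print str(e)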
Modified: pypy/dist/lib-python/modified-2.4.1/test/test_genexps.py ============================================================================== --- pypy/dist/lib-python/modified-2.4.1/test/test_genexps.py (original) +++ pypy/dist/lib-python/modified-2.4.1/test/test_genexps.py Mon Sep 26 13:50:30 2005 @@ -116,9 +116,9 @@ Verify re-use of tuples (a side benefit of using genexps over listcomps) - >>> tupleids = map(id, ((i,i) for i in xrange(10))) - >>> max(tupleids) - min(tupleids) - 0 +## >>> tupleids = map(id, ((i,i) for i in xrange(10))) +## >>> max(tupleids) - min(tupleids) +## 0 Verify that syntax error's are raised for genexps used as lvalues @@ -191,7 +191,7 @@ g.next() File "", line 1, in g = (10 // i for i in (5, 0, 2)) - ZeroDivisionError: integer division or modulo by zero + ZeroDivisionError: integer division by zero >>> g.next() Traceback (most recent call last): File "", line 1, in -toplevel- From ericvrp at codespeak.net Mon Sep 26 13:53:56 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Mon, 26 Sep 2005 13:53:56 +0200 (CEST) Subject: [pypy-svn] r17873 - in pypy/dist/pypy/translator: goal llvm llvm/module Message-ID: <20050926115356.2E1AA27B7C@code1.codespeak.net> Author: ericvrp Date: Mon Sep 26 13:53:50 2005 New Revision: 17873 Modified: pypy/dist/pypy/translator/goal/bench-cronjob.py pypy/dist/pypy/translator/goal/bench-unix.py pypy/dist/pypy/translator/llvm/build_llvm_module.py pypy/dist/pypy/translator/llvm/exception.py pypy/dist/pypy/translator/llvm/externs2ll.py pypy/dist/pypy/translator/llvm/genllvm.py pypy/dist/pypy/translator/llvm/module/support.py Log: Ironed out minor shortcomings. pypy-llvm is (on the snake server) about a factor 14-15 slower then CPython. pypy-c is over a factor 25 slower that CPython here. 
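The slowdown factors quoted above can be recomputed from the 'current_result' table in the bench-unix.py diff below: richards reports a time in milliseconds, so its factor is the pypy time divided by the CPython time, while pystone reports a score where higher is better, so its factor is the CPython score divided by the pypy score. A quick check using, for instance, the pypy-llvm-17870 and pypy-c-17758 rows of that table::

    # figures copied from the benchmark table below
    ref_rich, ref_stone = 864.0, 43103.0          # python 2.4.2c1
    llvm_rich, llvm_stone = 12574.0, 3069.0       # pypy-llvm-17870
    c_rich, c_stone = 23500.0, 1570.0             # pypy-c-17758

    print "pypy-llvm: %.2fx richards, %.2fx pystone" % (
        llvm_rich / ref_rich, ref_stone / llvm_stone)   # 14.55x, 14.04x
    print "pypy-c:    %.2fx richards, %.2fx pystone" % (
        c_rich / ref_rich, ref_stone / c_stone)         # 27.20x, 27.45x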
Modified: pypy/dist/pypy/translator/goal/bench-cronjob.py ============================================================================== --- pypy/dist/pypy/translator/goal/bench-cronjob.py (original) +++ pypy/dist/pypy/translator/goal/bench-cronjob.py Mon Sep 26 13:53:50 2005 @@ -2,7 +2,8 @@ import time, os, sys, stat -homedir = os.environ['HOME'] +homedir = os.getenv('HOME') +os.putenv('PATH','~/bin:/usr/local/bin:/usr/bin:/bin:/opt/bin:/usr/i686-pc-linux-gnu/gcc-bin/3.3.6') def update_pypy(): os.chdir(homedir + '/projects/pypy-dist') @@ -25,7 +26,9 @@ basename = homedir + '/projects/pypy-dist/pypy/translator/goal/' + 'pypy-' + backend realname = basename + '-' + revision - open(realname, 'wb').write( open(basename).read() ) + pypy = open(basename, 'rb').read() + if len(pypy) > 0: + open(realname, 'wb').write(pypy) os.chmod(realname, stat.S_IRWXU) os.unlink(basename) @@ -35,15 +38,17 @@ os.chdir(homedir + '/projects/pypy-dist/pypy/translator/goal') os.system('python bench-unix.py 2>&1' % locals()) -def main(): +def main(backends=[]): + if backends == []: + backends = 'llvm c'.split() print time.ctime() update_pypy() update_llvm() - for backend in 'llvm c'.split(): + for backend in backends: compile(backend) benchmark() print time.ctime() print 80*'-' if __name__ == '__main__': - main() + main(sys.argv[1:]) Modified: pypy/dist/pypy/translator/goal/bench-unix.py ============================================================================== --- pypy/dist/pypy/translator/goal/bench-unix.py (original) +++ pypy/dist/pypy/translator/goal/bench-unix.py Mon Sep 26 13:53:50 2005 @@ -5,15 +5,16 @@ import os, sys current_result = ''' -executable richards pystone -pypy-c-17758 30626ms ( 35.74x) 1268 ( 33.98x) -pypy-c-17797 29657ms ( 34.61x) error -pypy-c-17799 29184ms ( 34.05x) error -pypy-llvm-17758 25361ms ( 29.59x) 1525 ( 28.26x) -pypy-llvm-17792 20775ms ( 24.24x) 1912 ( 22.53x) -pypy-llvm-17797 20423ms ( 23.83x) 1943 ( 22.18x) -pypy-llvm-17799 error error -python 2.4.2c1 857ms ( 1.00x) 43103 ( 1.00x) +executable richards pystone +python 2.4.2c1 864ms ( 1.00x) 43103 ( 1.00x) +pypy-llvm-17870 12574ms ( 14.55x) 3069 ( 14.04x) +pypy-llvm-17862 12980ms ( 15.02x) 3041 ( 14.17x) +pypy-llvm-17797 13473ms ( 15.59x) 2824 ( 15.26x) +pypy-llvm-17792 13755ms ( 15.92x) 2823 ( 15.27x) +pypy-llvm-17758 17057ms ( 19.74x) 2229 ( 19.34x) +pypy-c-17853 22411ms ( 25.94x) 1653 ( 26.07x) +pypy-c-17806 22315ms ( 25.83x) 1656 ( 26.03x) +pypy-c-17758 23500ms ( 27.20x) 1570 ( 27.45x) ''' PYSTONE_CMD = 'from test import pystone;pystone.main(%s)' @@ -31,47 +32,39 @@ return float(line.split()[len(pattern.split())]) def run_cmd(cmd): - print "running", cmd + #print "running", cmd pipe = os.popen(cmd + ' 2>&1') - result = pipe.read() - #print "done" - return result + return pipe.read() def run_pystone(executable='python', n=0): argstr = PYSTONE_CMD % (str(n) and n or '') txt = run_cmd('%s -c "%s"' % (executable, argstr)) - res = get_result(txt, PYSTONE_PATTERN) - #print res - return res + return get_result(txt, PYSTONE_PATTERN) def run_richards(executable='python', n=10): argstr = RICHARDS_CMD % n txt = run_cmd('%s -c "%s"' % (executable, argstr)) - res = get_result(txt, RICHARDS_PATTERN) - #print res - return res + return get_result(txt, RICHARDS_PATTERN) def get_executables(): exes = [os.path.join('.', name) for name in os.listdir('.') if name.startswith('pypy-')] exes.sort() + exes.reverse() return exes -HEADLINE = '''executable richards pystone''' -FMT = '''%-30s %6dms (%6.2fx) %6d (%6.2fx)''' +HEADLINE = 'executable 
richards pystone' +FMT = '%-30s %6dms (%6.2fx) %6d (%6.2fx)' def main(): - #print 'getting the richards reference' + print HEADLINE ref_rich = run_richards() - #print 'getting the pystone reference' ref_stone = run_pystone() - res = [] + print FMT % ('python %s' % sys.version.split()[0], ref_rich, 1.0, ref_stone, 1.0) for exe in get_executables(): exename = os.path.splitext(exe)[0].lstrip('./') - res.append( (exename, run_richards(exe, 1), run_pystone(exe)) ) - res.append( ('python %s' % sys.version.split()[0], ref_rich, ref_stone) ) - print HEADLINE - for exe, rich, stone in res: - print FMT % (exe, rich, rich / ref_rich, stone, ref_stone / stone) + rich = run_richards(exe, 1) + stone = run_pystone(exe) + print FMT % (exename, rich, rich / ref_rich, stone, ref_stone / stone) if __name__ == '__main__': main() Modified: pypy/dist/pypy/translator/llvm/build_llvm_module.py ============================================================================== --- pypy/dist/pypy/translator/llvm/build_llvm_module.py (original) +++ pypy/dist/pypy/translator/llvm/build_llvm_module.py Mon Sep 26 13:53:50 2005 @@ -81,29 +81,29 @@ #ball = str(dirpath.join('%s_all.bc' % b)) #cmds.append("opt %s %s -f -o %s.bc" % (OPTIMIZATION_SWITCHES, ball, b)) - use_gcc = False + use_gcc = True profile = False cmds = ["llvm-as < %s.ll | opt %s -f -o %s.bc" % (b, OPTIMIZATION_SWITCHES, b)] if not use_gcc: - if exe_name: - cmds.append('llvm-ld %s.bc -native -O5 -l=gc -lm -l=dl -o %s' % (b, exe_name)) - else: - cmds.append("llc %s %s.bc -f -o %s.s" % (genllvm.exceptionpolicy.llc_options(), b, b)) - cmds.append("as %s.s -o %s.o" % (b, b)) - object_files.append("%s.o" % b) + cmds.append("llc %s %s.bc -f -o %s.s" % (genllvm.exceptionpolicy.llc_options(), b, b)) + cmds.append("as %s.s -o %s.o" % (b, b)) + object_files.append("%s.o" % b) else: cmds.append("llc %s %s.bc -march=c -f -o %s.c" % (genllvm.exceptionpolicy.llc_options(), b, b)) if exe_name: - cmd = "gcc %s.c -c -O3 -pipe" % b + cmd = "gcc %s.c -c -O2 -pipe" % b if profile: cmd += ' -pg' + else: + cmd += ' -fomit-frame-pointer' cmds.append(cmd) cmd = "gcc %s.o %s -lm -ldl -pipe -o %s" % (b, gc_libs, exe_name) if profile: cmd += ' -pg' cmds.append(cmd) source_files.append("%s.c" % b) + if exe_name and not profile: cmds.append('strip ' + exe_name) upx = os.popen('which upx').read() Modified: pypy/dist/pypy/translator/llvm/exception.py ============================================================================== --- pypy/dist/pypy/translator/llvm/exception.py (original) +++ pypy/dist/pypy/translator/llvm/exception.py Mon Sep 26 13:53:50 2005 @@ -53,7 +53,7 @@ def __init__(self): pass - def pyrex_entrypoint_code(self, entrynode): + def llvmcode(self, entrynode): returntype, entrypointname = entrynode.getdecl().split('%', 1) noresult = self._noresult(returntype) cconv = DEFAULT_CCONV @@ -74,6 +74,10 @@ %%result = cast %%RPYTHON_EXCEPTION_VTABLE* %%tmp to int ret int %%result } + +internal fastcc void %%unwind() { + unwind +} ''' % locals() def invoke(self, codewriter, targetvar, tail_, cconv, returntype, functionref, args, label, except_label): @@ -139,7 +143,7 @@ def __init__(self): self.invoke_count = 0 - def pyrex_entrypoint_code(self, entrynode): + def llvmcode(self, entrynode): returntype, entrypointname = entrynode.getdecl().split('%', 1) noresult = self._noresult(returntype) cconv = DEFAULT_CCONV @@ -163,6 +167,10 @@ %%result = cast %%RPYTHON_EXCEPTION_VTABLE* %%tmp to int ret int %%result } + +internal fastcc void %%unwind() { + ret void +} ''' % locals() def 
transform(self, translator, graph=None): Modified: pypy/dist/pypy/translator/llvm/externs2ll.py ============================================================================== --- pypy/dist/pypy/translator/llvm/externs2ll.py (original) +++ pypy/dist/pypy/translator/llvm/externs2ll.py Mon Sep 26 13:53:50 2005 @@ -38,7 +38,9 @@ funcname , s = s.split('(', 1) funcnames[funcname] = True if line.find("internal") == -1: - line = 'internal %s %s' % (DEFAULT_CCONV, line,) + #internal = '' + internal = 'internal ' + line = '%s%s %s' % (internal, DEFAULT_CCONV, line,) ll_lines.append(line) # patch calls to function that we just declared fastcc Modified: pypy/dist/pypy/translator/llvm/genllvm.py ============================================================================== --- pypy/dist/pypy/translator/llvm/genllvm.py (original) +++ pypy/dist/pypy/translator/llvm/genllvm.py Mon Sep 26 13:53:50 2005 @@ -41,7 +41,6 @@ self.exceptionpolicy = exceptionpolicy extfuncnode.ExternalFuncNode.used_external_functions = {} self.debug = debug # for debug we create comments of every operation that may be executed - #exceptionpolicy.transform(translator) #now done in FuncNode (optimization) if debug: translator.checkgraphs() @@ -190,7 +189,7 @@ typ_decl.writeimpl(codewriter) self._checkpoint('write implementations') - codewriter.append(self.exceptionpolicy.pyrex_entrypoint_code(self.entrynode)) + codewriter.append(self.exceptionpolicy.llvmcode(self.entrynode)) # XXX we need to create our own main() that calls the actual entry_point function if entryfunc_name == 'pypy_entry_point': #XXX just to get on with translate_pypy Modified: pypy/dist/pypy/translator/llvm/module/support.py ============================================================================== --- pypy/dist/pypy/translator/llvm/module/support.py (original) +++ pypy/dist/pypy/translator/llvm/module/support.py Mon Sep 26 13:53:50 2005 @@ -97,7 +97,7 @@ %%exception_type = load %%RPYTHON_EXCEPTION_VTABLE** %%tmp store %%RPYTHON_EXCEPTION_VTABLE* %%exception_type, %%RPYTHON_EXCEPTION_VTABLE** %%last_exception_type store %%RPYTHON_EXCEPTION* %%exception_value, %%RPYTHON_EXCEPTION** %%last_exception_value - unwind ; XXX (1) if exceptionpolicy == 'boehm' + call fastcc void %%unwind() ret void } """ % locals()) @@ -110,7 +110,7 @@ br bool %%cond, label %%is_0, label %%is_not_0 is_0: call fastcc void %%prepare_ZeroDivisionError() - unwind ; XXX (2) if exceptionpolicy == 'boehm' + call fastcc void %%unwind() ret %s 0 is_not_0: @@ -134,7 +134,7 @@ ; br bool %cond3, label %return_block, label %ovf3 ;ovf3: call fastcc void %prepare_OverflowError() - unwind ; XXX (3) if exceptionpolicy == 'boehm' + call fastcc void %unwind() ret int 0 """ From pedronis at codespeak.net Mon Sep 26 14:13:48 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 26 Sep 2005 14:13:48 +0200 (CEST) Subject: [pypy-svn] r17874 - pypy/dist/pypy/module/sys/test Message-ID: <20050926121348.7B85D27B7C@code1.codespeak.net> Author: pedronis Date: Mon Sep 26 14:13:47 2005 New Revision: 17874 Modified: pypy/dist/pypy/module/sys/test/test_sysmodule.py Log: update test to match occured changes Modified: pypy/dist/pypy/module/sys/test/test_sysmodule.py ============================================================================== --- pypy/dist/pypy/module/sys/test/test_sysmodule.py (original) +++ pypy/dist/pypy/module/sys/test/test_sysmodule.py Mon Sep 26 14:13:47 2005 @@ -350,7 +350,9 @@ def test_pypy_attributes(self): assert isinstance(sys.pypy_objspaceclass, str) - assert 
isinstance(sys.pypy_svn_url, str) + assert isinstance(sys.pypy_svn_url, tuple) + url = sys.pypy_svn_url + assert isinstance(url[0], str) vi = sys.pypy_version_info assert isinstance(vi, tuple) assert len(vi) == 5 @@ -359,6 +361,7 @@ assert isinstance(vi[2], int) assert vi[3] in ("alpha", "beta", "candidate", "final") assert isinstance(vi[4], int) + assert url[1] == vi[4] def test_allattributes(self): sys.__dict__ # check that we don't crash initializing any attribute From pedronis at codespeak.net Mon Sep 26 14:21:42 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 26 Sep 2005 14:21:42 +0200 (CEST) Subject: [pypy-svn] r17875 - pypy/dist/pypy/doc Message-ID: <20050926122142.D33F827B7C@code1.codespeak.net> Author: pedronis Date: Mon Sep 26 14:21:42 2005 New Revision: 17875 Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Log: plural, typos Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt ============================================================================== --- pypy/dist/pypy/doc/draft-dynamic-language-translation.txt (original) +++ pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Mon Sep 26 14:21:42 2005 @@ -215,9 +215,6 @@ Flow Object Space --------------------------------- - -XXX - In our bytecode-interpreter design evaluation responsibilities are split between the Object Space, frames and the so-called execution context. The latter two object kinds are properly part of the @@ -254,7 +251,7 @@ given point. The Flow Space constructs the flow graph by creating new blocks in it, -when fresh never-seen state is reached. During construction block in +when fresh never-seen state is reached. During construction, blocks in the graph all have an associated frame state. The Flow Space start from an empty block with an a frame state corresponding to setup induced but input arguments in the form of variables and constants to @@ -272,7 +269,7 @@ byecode instructions, as their position, and frame state, block pairs. A union operation is defined on frame states, only two equal constants -unify to a constant of the same value, all other combination unify +unify to a constant of the same value, all other combinations unify to a fresh new variable. If some previously associated frame state for the next byecode unifies From pedronis at codespeak.net Mon Sep 26 15:12:44 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 26 Sep 2005 15:12:44 +0200 (CEST) Subject: [pypy-svn] r17876 - pypy/dist/pypy/doc Message-ID: <20050926131244.D316A27B82@code1.codespeak.net> Author: pedronis Date: Mon Sep 26 15:12:42 2005 New Revision: 17876 Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Log: start of a description of how branching flow paths are captured Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt ============================================================================== --- pypy/dist/pypy/doc/draft-dynamic-language-translation.txt (original) +++ pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Mon Sep 26 15:12:42 2005 @@ -277,9 +277,47 @@ one, the corresponding block will be reused and reset. Otherwise a new block is used. -XXX non mergeable data, details + +Branching on conditions by the engine usually involves querying the +truth value of a object through the is_true space operation. This +needs special treatment to be able to capture all possible flow paths. 
+ +Multiple pending blocks can scheduled for abstract interpretation by +the flow space, which proceeds picking one and reconstructing the +abstract execution frame from the frame state associated with the +block. The frame is what will be operated on, its setup is delegated +to the block and based on the state, the frame setup by the block also +returns a so called recorder through which, and not directly the +block, appending of new space operations to the block will be +delegated. What to do when an is_true operation is about to be +executed is also responsability to the recorder. + +The normal recorder when an is_true operation is encountered will +create and schedule special blocks which don't have an associated +frame state, but the previous block ending in the is_true operation +and an outcome, either True or False. + +The special branching blocks when about to be executed, will use the +chain of previous blocks, and consider all of them up to the first +non-special branching block included, the state of this one block will +be used to setup the frame for execution and a chain of so called +replaying recorders setup except for the scheduled branching block +which gets a normal recorder. The outcome registered in each special +block in the chain will be associated with the replayer for the +previous block. + +The replaying recorders will sanity check that the same operations are +appended by comparing the previous contents of the blocks +re-encountered by execution and on is_true operation will deliver the +outcome associated with them on construction. + +All this mechanism ensures that all flow paths are considered. + + + XXX conditionals, multiple pending blocks +XXX non mergeable data, details XXX termination for "reasonable" terminating programs Geninterp From pedronis at codespeak.net Mon Sep 26 15:15:33 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 26 Sep 2005 15:15:33 +0200 (CEST) Subject: [pypy-svn] r17877 - pypy/dist/pypy/doc Message-ID: <20050926131533.8D84227B82@code1.codespeak.net> Author: pedronis Date: Mon Sep 26 15:15:32 2005 New Revision: 17877 Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Log: that part is addressed by the added content (at least in draftish way) Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt ============================================================================== --- pypy/dist/pypy/doc/draft-dynamic-language-translation.txt (original) +++ pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Mon Sep 26 15:15:32 2005 @@ -314,9 +314,6 @@ All this mechanism ensures that all flow paths are considered. - -XXX conditionals, multiple pending blocks - XXX non mergeable data, details XXX termination for "reasonable" terminating programs From ac at codespeak.net Mon Sep 26 16:54:09 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Mon, 26 Sep 2005 16:54:09 +0200 (CEST) Subject: [pypy-svn] r17878 - pypy/dist/lib-python Message-ID: <20050926145409.2572B27B82@code1.codespeak.net> Author: ac Date: Mon Sep 26 16:54:09 2005 New Revision: 17878 Modified: pypy/dist/lib-python/conftest.py Log: Allow --compiler option to py.test for compliance tests. 
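The diff below turns the test declaration's compiler attribute into a read-only property: an explicitly given per-test compiler still wins, but when none was supplied the value of py.test's global --compiler option is used instead. The same fall-back idiom in a minimal self-contained form (the class and the option values here are invented stand-ins, not the real conftest objects)::

    class _Options(object):
        compiler = 'compiler-from-option'   # placeholder for the --compiler value

    pypy_option = _Options()

    class TestDecl(object):                 # stand-in for the test declaration class
        def __init__(self, compiler=None):
            self._compiler = compiler       # per-test override, usually None

        def compiler(self):
            # an explicit per-test value wins, otherwise fall back to the option
            return self._compiler or pypy_option.compiler
        compiler = property(compiler)

    assert TestDecl().compiler == 'compiler-from-option'
    assert TestDecl('per-test-compiler').compiler == 'per-test-compiler'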
Modified: pypy/dist/lib-python/conftest.py ============================================================================== --- pypy/dist/lib-python/conftest.py (original) +++ pypy/dist/lib-python/conftest.py Mon Sep 26 16:54:09 2005 @@ -295,7 +295,7 @@ self._oldstyle = oldstyle self._uselibfile = uselibfile self._usemodules = usemodules.split() - self.compiler = compiler + self._compiler = compiler self.core = core def oldstyle(self): @@ -310,6 +310,10 @@ return self._usemodules + pypy_option.usemodules usemodules = property(usemodules) + def compiler(self): + return self._compiler or pypy_option.compiler + compiler = property(compiler) + def getoptions(self): l = [] for name in 'oldstyle', 'core', 'uselibfile': From pedronis at codespeak.net Mon Sep 26 16:59:43 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 26 Sep 2005 16:59:43 +0200 (CEST) Subject: [pypy-svn] r17879 - pypy/dist/pypy/translator/tool Message-ID: <20050926145943.10BD927B82@code1.codespeak.net> Author: pedronis Date: Mon Sep 26 16:59:42 2005 New Revision: 17879 Added: pypy/dist/pypy/translator/tool/taskengine.py (contents, props changed) Log: start of a very very simple "make-like" task engine to use to make translation/translator more goal oriented Added: pypy/dist/pypy/translator/tool/taskengine.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/tool/taskengine.py Mon Sep 26 16:59:42 2005 @@ -0,0 +1,108 @@ + + + + +class SimpleTaskEngine: + + + def __init__(self): + + self.tasks = tasks = {} + + for name in dir(self): + if name.startswith('task_'): + task_name = name[len('task_'):] + task = getattr(self, name) + assert callable(task) + task_deps = getattr(task, 'task_deps', []) + + tasks[task_name] = task, task_deps + + + def _plan(self, goal, skip=[]): + + constraints = [] + + def subgoals(task_name): + taskcallable, deps = self.tasks[task_name] + for dep in deps: + if dep.startswith('?'): + dep = dep[1:] + if dep in skip: + continue + yield dep + + + seen = {} + + def consider(subgoal): + if subgoal in seen: + return + else: + seen[subgoal] = True + constraints.append([subgoal]) + deps = subgoals(subgoal) + for dep in deps: + constraints.append([subgoal, dep]) + consider(dep) + + consider(goal) + + #sort + + plan = [] + + while True: + cands = dict.fromkeys([constr[0] for constr in constraints if constr]) + if not cands: + break + + for cand in cands: + for constr in constraints: + if cand in constr[1:]: + break + else: + break + else: + raise RuntimeError, "circular dependecy" + + plan.append(cand) + for constr in constraints: + if constr and constr[0] == cand: + del constr[0] + + plan.reverse() + + return plan + + + +def test_simple(): + + class ABC(SimpleTaskEngine): + + def task_A(self): + pass + + task_A.task_deps = ['B', '?C'] + + def task_B(self): + pass + + def task_C(self): + pass + + task_C.task_deps = ['B'] + + abc = ABC() + + assert abc._plan('B') == ['B'] + assert abc._plan('C') == ['B', 'C'] + assert abc._plan('A') == ['B', 'C', 'A'] + assert abc._plan('A', skip=['C']) == ['B', 'A'] + + + + + + From pedronis at codespeak.net Mon Sep 26 17:58:33 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 26 Sep 2005 17:58:33 +0200 (CEST) Subject: [pypy-svn] r17880 - pypy/dist/pypy/translator/tool Message-ID: <20050926155833.89CE127B82@code1.codespeak.net> Author: pedronis Date: Mon Sep 26 17:58:32 2005 New Revision: 17880 Modified: pypy/dist/pypy/translator/tool/taskengine.py Log: added a 
string with a draft of tasks for translation, ?tasks are optional there will be probably a global option to select between llvm/c versions of tasks optional tasks can have a --dont-xxx option Modified: pypy/dist/pypy/translator/tool/taskengine.py ============================================================================== --- pypy/dist/pypy/translator/tool/taskengine.py (original) +++ pypy/dist/pypy/translator/tool/taskengine.py Mon Sep 26 17:58:32 2005 @@ -102,7 +102,25 @@ assert abc._plan('A', skip=['C']) == ['B', 'A'] - +""" sketch of tasks for translation: + +annotate: # includes annotation and annotatation simplifications + +rtype: annotate + +source_llvm: backendoptimisations, rtype, annotate + +source_c: ?backendoptimisations, ?rtype, ?annotate + +compile_c : source_c + +compile_llvm: source_llvm + +run_c: compile_c + +run_llvm: compile_llvm + +""" From pedronis at codespeak.net Mon Sep 26 18:08:42 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 26 Sep 2005 18:08:42 +0200 (CEST) Subject: [pypy-svn] r17881 - pypy/dist/pypy/translator/tool Message-ID: <20050926160842.060F527B85@code1.codespeak.net> Author: pedronis Date: Mon Sep 26 18:08:41 2005 New Revision: 17881 Modified: pypy/dist/pypy/translator/tool/taskengine.py Log: tweak Modified: pypy/dist/pypy/translator/tool/taskengine.py ============================================================================== --- pypy/dist/pypy/translator/tool/taskengine.py (original) +++ pypy/dist/pypy/translator/tool/taskengine.py Mon Sep 26 18:08:41 2005 @@ -108,6 +108,8 @@ rtype: annotate +backendoptimisations: rtype # make little sense otherwise + source_llvm: backendoptimisations, rtype, annotate source_c: ?backendoptimisations, ?rtype, ?annotate From pedronis at codespeak.net Mon Sep 26 21:24:05 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 26 Sep 2005 21:24:05 +0200 (CEST) Subject: [pypy-svn] r17882 - in pypy/dist/pypy/translator/tool: . 
test Message-ID: <20050926192405.C7CE227B82@code1.codespeak.net> Author: pedronis Date: Mon Sep 26 21:24:03 2005 New Revision: 17882 Added: pypy/dist/pypy/translator/tool/test/test_taskengine.py (contents, props changed) Modified: pypy/dist/pypy/translator/tool/taskengine.py Log: some more work on taskengine, proper test file Modified: pypy/dist/pypy/translator/tool/taskengine.py ============================================================================== --- pypy/dist/pypy/translator/tool/taskengine.py (original) +++ pypy/dist/pypy/translator/tool/taskengine.py Mon Sep 26 21:24:03 2005 @@ -1,11 +1,9 @@ - - class SimpleTaskEngine: - def __init__(self): + self._plan_cache = {} self.tasks = tasks = {} @@ -18,9 +16,14 @@ tasks[task_name] = task, task_deps + def _plan(self, goals, skip=[]): + skip = [toskip for toskip in skip if toskip not in goals] - def _plan(self, goal, skip=[]): - + key = (tuple(goals), tuple(skip)) + try: + return self._plan_cache[key] + except KeyError: + pass constraints = [] def subgoals(task_name): @@ -32,7 +35,6 @@ continue yield dep - seen = {} def consider(subgoal): @@ -46,7 +48,8 @@ constraints.append([subgoal, dep]) consider(dep) - consider(goal) + for goal in goals: + consider(goal) #sort @@ -73,34 +76,33 @@ plan.reverse() - return plan - - - -def test_simple(): - - class ABC(SimpleTaskEngine): + self._plan_cache[key] = plan - def task_A(self): - pass - - task_A.task_deps = ['B', '?C'] - - def task_B(self): - pass - - def task_C(self): - pass + return plan - task_C.task_deps = ['B'] + def _execute(self, goals, *args, **kwds): + task_skip = kwds.get('task_skip', []) + for goal in self._plan(goals, skip=task_skip): + taskcallable, _ = self.tasks[goal] + self._event('pre', goal, taskcallable) + try: + self._do(goal, taskcallable, *args, **kwds) + except (SystemExit, KeyboardInterrupt): + raise + except: + self._error(goal) + raise + self._event('post', goal, taskcallable) + + def _do(self, goal, func, *args, **kwds): + func() - abc = ABC() + def _event(self, kind, goal, func): + pass + + def _error(self, goal): + pass - assert abc._plan('B') == ['B'] - assert abc._plan('C') == ['B', 'C'] - assert abc._plan('A') == ['B', 'C', 'A'] - assert abc._plan('A', skip=['C']) == ['B', 'A'] - """ sketch of tasks for translation: Added: pypy/dist/pypy/translator/tool/test/test_taskengine.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/tool/test/test_taskengine.py Mon Sep 26 21:24:03 2005 @@ -0,0 +1,73 @@ +from pypy.translator.tool.taskengine import SimpleTaskEngine + +def test_simple(): + + class ABC(SimpleTaskEngine): + + def task_A(self): + pass + + task_A.task_deps = ['B', '?C'] + + def task_B(self): + pass + + def task_C(self): + pass + + task_C.task_deps = ['B'] + + abc = ABC() + + assert abc._plan('B') == ['B'] + assert abc._plan('C') == ['B', 'C'] + assert abc._plan('A') == ['B', 'C', 'A'] + assert abc._plan('A', skip=['C']) == ['B', 'A'] + +def test_execute(): + + class ABC(SimpleTaskEngine): + + def __init__(self): + SimpleTaskEngine.__init__(self) + self.done = [] + + def task_A(self): + self.done.append('A') + + task_A.task_deps = ['B', '?C'] + + def task_B(self): + self.done.append('B') + + def task_C(self): + self.done.append('C') + + task_C.task_deps = ['B'] + + def _event(self, kind, goal, taskcallable): + self.done.append((kind, goal)) + + def test(goals, task_skip=[]): + if isinstance(goals, str): + gaols = [goals] + abc = ABC() + abc._execute(goals, task_skip=task_skip) + return 
abc.done + + def trace(goals): + t = [] + for goal in goals: + t.extend([('pre', goal), goal, ('post', goal)]) + return t + + assert test('B') == trace('B') + assert test('C') == trace(['B', 'C']) + assert test('A') == trace(['B', 'C', 'A']) + assert test('A', ['C']) == trace(['B', 'A']) + assert test(['B', 'C']) == trace(['B', 'C']) + assert test(['C', 'B']) == trace(['B', 'C']) + assert test(['B', 'A']) == trace(['B', 'C', 'A']) + assert test(['B', 'A'], ['C']) == trace(['B', 'A']) + assert test(['B', 'A', 'C']) == trace(['B', 'C', 'A']) + assert test(['B', 'A', 'C'], ['C']) == trace(['B', 'C', 'A']) From pedronis at codespeak.net Mon Sep 26 22:09:37 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 26 Sep 2005 22:09:37 +0200 (CEST) Subject: [pypy-svn] r17883 - in pypy/dist/pypy: annotation translator Message-ID: <20050926200937.10E5C27B7C@code1.codespeak.net> Author: pedronis Date: Mon Sep 26 22:09:35 2005 New Revision: 17883 Modified: pypy/dist/pypy/annotation/bookkeeper.py pypy/dist/pypy/annotation/builtin.py pypy/dist/pypy/translator/annrpython.py Log: delegate warning printing to RPythonAnnotator no printing in annotation/ Modified: pypy/dist/pypy/annotation/bookkeeper.py ============================================================================== --- pypy/dist/pypy/annotation/bookkeeper.py (original) +++ pypy/dist/pypy/annotation/bookkeeper.py Mon Sep 26 22:09:35 2005 @@ -6,7 +6,6 @@ import sys from types import FunctionType, ClassType, MethodType from types import BuiltinMethodType, NoneType -from pypy.tool.ansi_print import ansi_print from pypy.annotation.model import * from pypy.annotation.classdef import ClassDef, isclassdef from pypy.annotation.listdef import ListDef, MOST_GENERAL_LISTDEF @@ -378,7 +377,6 @@ else: clsdef = self.getclassdef(x.__class__) if x.__class__.__dict__.get('_annspecialcase_', '').endswith('ctr_location'): - print "encountered a pre-built mutable instance of a class needing specialization: %s" % x.__class__.__name__ raise Exception, "encountered a pre-built mutable instance of a class needing specialization: %s" % x.__class__.__name__ if x not in self.seen_mutable: # avoid circular reflowing, # see for example test_circular_mutable_getattr @@ -724,12 +722,7 @@ return self.annotator.whereami(self.position_key) def warning(self, msg): - try: - pos = self.whereami() - except AttributeError: - pos = '?' 
- ansi_print("*** WARNING: [%s] %s" % (pos, msg), esc="31") # RED - + return self.annotator.warning(msg) def ishashable(x): try: Modified: pypy/dist/pypy/annotation/builtin.py ============================================================================== --- pypy/dist/pypy/annotation/builtin.py (original) +++ pypy/dist/pypy/annotation/builtin.py Mon Sep 26 22:09:35 2005 @@ -4,7 +4,6 @@ import types import sys, math, os, time -from pypy.tool.ansi_print import ansi_print from pypy.annotation.model import SomeInteger, SomeObject, SomeChar, SomeBool from pypy.annotation.model import SomeList, SomeString, SomeTuple, SomeSlice from pypy.annotation.model import SomeUnicodeCodePoint, SomeAddress @@ -324,7 +323,6 @@ n = 1 p = lltype.malloc(T.const, n) r = SomePtr(lltype.typeOf(p)) - #print "MALLOC", r return r def typeOf(s_val): Modified: pypy/dist/pypy/translator/annrpython.py ============================================================================== --- pypy/dist/pypy/translator/annrpython.py (original) +++ pypy/dist/pypy/translator/annrpython.py Mon Sep 26 22:09:35 2005 @@ -267,13 +267,26 @@ s_value)) if arg in self.return_bindings and degenerated: - log.red("*** WARNING: %s result degenerated to SomeObject" % - self.whereami((self.return_bindings[arg],None, None))) + self.warning("result degenerated to SomeObject", + (self.return_bindings[arg],None, None)) self.binding_caused_by[arg] = called_from # XXX make this line available as a debugging option ##assert not (s_value.__class__ == annmodel.SomeObject and s_value.knowntype == object) ## debug + + def warning(self, msg, pos=None): + if pos is None: + try: + pos = self.bookkeeper.position_key + except AttributeError: + pos = '?' + if pos != '?': + pos = self.whereami(pos) + + log.red("*** WARNING: %s/ %s" % (pos, msg)) + + #___ interface for annotator.bookkeeper _______ def recursivecall(self, func, whence, inputcells): # whence = position_key|callback taking the annotator, graph From pedronis at codespeak.net Mon Sep 26 22:30:17 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 26 Sep 2005 22:30:17 +0200 (CEST) Subject: [pypy-svn] r17884 - pypy/dist/pypy/translator Message-ID: <20050926203017.997C027B7C@code1.codespeak.net> Author: pedronis Date: Mon Sep 26 22:30:16 2005 New Revision: 17884 Modified: pypy/dist/pypy/translator/translator.py Log: use logging for getflowgraph info (we can get back the old behavior with a taylored consumer) Modified: pypy/dist/pypy/translator/translator.py ============================================================================== --- pypy/dist/pypy/translator/translator.py (original) +++ pypy/dist/pypy/translator/translator.py Mon Sep 26 22:30:16 2005 @@ -12,7 +12,10 @@ from pypy.translator.tool.cbuild import make_module_from_pyxstring from pypy.translator.tool.cbuild import make_module_from_c from pypy.objspace.flow import FlowObjSpace - +from pypy.tool.ansi_print import ansi_log +import py +log = py.log.Producer("getflowgraph") +py.log.setconsumer("getflowgraph", ansi_log) class Translator: @@ -57,11 +60,11 @@ graph = self.flowgraphs[func] except KeyError: if self.verbose: - print 'getflowgraph (%s:%d) %s' % ( + descr = '(%s:%d) %s' % ( func.func_globals.get('__name__', '?'), func.func_code.co_firstlineno, - func.__name__), - sys.stdout.flush() + func.__name__) + log(descr) assert not self.frozen space = FlowObjSpace() space.builtins_can_raise_exceptions = self.builtins_can_raise_exceptions @@ -70,7 +73,7 @@ if self.simplifying: simplify_graph(graph, self.simplifying) if 
self.verbose: - print + log.done(func.__name__) self.flowgraphs[func] = graph self.functions.append(func) graph.func = func From pedronis at codespeak.net Mon Sep 26 23:09:27 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Mon, 26 Sep 2005 23:09:27 +0200 (CEST) Subject: [pypy-svn] r17885 - in pypy/dist/pypy: tool translator Message-ID: <20050926210927.AD16E27B83@code1.codespeak.net> Author: pedronis Date: Mon Sep 26 23:09:24 2005 New Revision: 17885 Modified: pypy/dist/pypy/tool/ansi_print.py pypy/dist/pypy/translator/translator.py Log: tweak ansi_log to suppot the hold behavior on ttys for getflowgraph info support mapping from keyword to color in ansi_log, now class based -> AnsiLog Modified: pypy/dist/pypy/tool/ansi_print.py ============================================================================== --- pypy/dist/pypy/tool/ansi_print.py (original) +++ pypy/dist/pypy/tool/ansi_print.py Mon Sep 26 23:09:24 2005 @@ -4,7 +4,7 @@ import sys -def ansi_print(text, esc, file=None, newline=True): +def ansi_print(text, esc, file=None, newline=True, flush=False): if file is None: file = sys.stderr text = text.rstrip() if esc and sys.platform != "win32" and file.isatty(): @@ -14,16 +14,44 @@ if newline: text += '\n' file.write(text) + if flush: + file.flush() -def ansi_log(msg): - keywords = list(msg.keywords) - if 'bold' in keywords: - keywords.remove('bold') - esc = "1" - elif 'red' in keywords: - keywords.remove('red') - esc = "31" - else: - esc = None - ansi_print("[%s] %s" %(":".join(keywords), msg.content()), esc) + +class AnsiLog: + + def __init__(self, kw_to_color={}, file=None): + self.kw_to_color = kw_to_color + self.file = file + + def __call__(self, msg): + tty = getattr(sys.stderr, 'isatty', lambda: False)() + flush = False + newline = True + keywords = [] + for kw in msg.keywords: + color = self.kw_to_color.get(kw) + if color and color not in keywords: + keywords.append(color) + keywords.append(kw) + if 'start' in keywords: + if tty: + newline = False + flush = True + keywords.remove('start') + elif 'done' in keywords: + if tty: + print >> sys.stderr + return + if 'bold' in keywords: + keywords.remove('bold') + esc = "1" + elif 'red' in keywords: + keywords.remove('red') + esc = "31" + else: + esc = None + ansi_print("[%s] %s" %(":".join(keywords), msg.content()), esc, + file=self.file, newline=newline, flush=flush) +ansi_log = AnsiLog() Modified: pypy/dist/pypy/translator/translator.py ============================================================================== --- pypy/dist/pypy/translator/translator.py (original) +++ pypy/dist/pypy/translator/translator.py Mon Sep 26 23:09:24 2005 @@ -64,7 +64,7 @@ func.func_globals.get('__name__', '?'), func.func_code.co_firstlineno, func.__name__) - log(descr) + log.start(descr) assert not self.frozen space = FlowObjSpace() space.builtins_can_raise_exceptions = self.builtins_can_raise_exceptions From ericvrp at codespeak.net Mon Sep 26 23:34:10 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Mon, 26 Sep 2005 23:34:10 +0200 (CEST) Subject: [pypy-svn] r17886 - in pypy/dist/pypy/translator: backendopt goal llvm llvm/backendopt Message-ID: <20050926213410.D70B727B41@code1.codespeak.net> Author: ericvrp Date: Mon Sep 26 23:34:09 2005 New Revision: 17886 Added: pypy/dist/pypy/translator/llvm/backendopt/ pypy/dist/pypy/translator/llvm/backendopt/exception.py - copied unchanged from r17884, pypy/dist/pypy/translator/backendopt/exception.py Removed: pypy/dist/pypy/translator/backendopt/exception.py 
Modified: pypy/dist/pypy/translator/goal/bench-cronjob.py pypy/dist/pypy/translator/llvm/exception.py pypy/dist/pypy/translator/llvm/opwriter.py Log: * About 13.3x slower then CPython by eliminating malloc's that clearly allocate zero bytes (which were 500 of about 16000 mallocs) * Using 'fast' exceptionpolicy by default now * Moving backendopt transformations that are used by genllvm only, to a seperate directory. Modified: pypy/dist/pypy/translator/goal/bench-cronjob.py ============================================================================== --- pypy/dist/pypy/translator/goal/bench-cronjob.py (original) +++ pypy/dist/pypy/translator/goal/bench-cronjob.py Mon Sep 26 23:34:09 2005 @@ -2,6 +2,21 @@ import time, os, sys, stat +current_result = ''' +executable richards pystone +python 2.4.2c1 855ms ( 1.00x) 44642 ( 1.00x) +pypy-llvm-17884 11034ms ( 12.91x) 3362 ( 13.28x) +pypy-llvm-17881 11702ms ( 13.69x) 3240 ( 13.78x) +pypy-llvm-17870 12683ms ( 14.83x) 3073 ( 14.53x) +pypy-llvm-17862 13053ms ( 15.27x) 3017 ( 14.79x) +pypy-llvm-17797 13497ms ( 15.79x) 2832 ( 15.76x) +pypy-llvm-17792 13808ms ( 16.15x) 2818 ( 15.84x) +pypy-llvm-17758 16998ms ( 19.88x) 2237 ( 19.96x) +pypy-c-17853 22389ms ( 26.19x) 1651 ( 27.04x) +pypy-c-17806 22328ms ( 26.11x) 1660 ( 26.88x) +pypy-c-17758 23485ms ( 27.47x) 1598 ( 27.92x) +''' + homedir = os.getenv('HOME') os.putenv('PATH','~/bin:/usr/local/bin:/usr/bin:/bin:/opt/bin:/usr/i686-pc-linux-gnu/gcc-bin/3.3.6') Modified: pypy/dist/pypy/translator/llvm/exception.py ============================================================================== --- pypy/dist/pypy/translator/llvm/exception.py (original) +++ pypy/dist/pypy/translator/llvm/exception.py Mon Sep 26 23:34:09 2005 @@ -28,7 +28,7 @@ return noresult def new(exceptionpolicy=None): #factory - exceptionpolicy = exceptionpolicy or 'cpython' + exceptionpolicy = exceptionpolicy or 'fast' if exceptionpolicy == 'cpython': from pypy.translator.llvm.exception import CPythonExceptionPolicy exceptionpolicy = CPythonExceptionPolicy() @@ -174,7 +174,7 @@ ''' % locals() def transform(self, translator, graph=None): - from pypy.translator.backendopt.exception import create_exception_handling + from pypy.translator.llvm.backendopt.exception import create_exception_handling if graph: create_exception_handling(translator, graph) else: Modified: pypy/dist/pypy/translator/llvm/opwriter.py ============================================================================== --- pypy/dist/pypy/translator/llvm/opwriter.py (original) +++ pypy/dist/pypy/translator/llvm/opwriter.py Mon Sep 26 23:34:09 2005 @@ -390,7 +390,11 @@ def malloc(self, op): arg_type = op.args[0].value targetvar = self.db.repr_arg(op.result) - + if isinstance(arg_type, lltype.Struct) and arg_type._names_without_voids() == []: + t = self.db.repr_arg_type(op.result) + self.codewriter.cast(targetvar, t, 'null', t) + self.codewriter.comment('removed malloc(%s) from previous line' % t) + return type_ = self.db.repr_type(arg_type) self.codewriter.malloc(targetvar, type_, atomic=arg_type._is_atomic()) From pedronis at codespeak.net Tue Sep 27 00:20:19 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 27 Sep 2005 00:20:19 +0200 (CEST) Subject: [pypy-svn] r17887 - in pypy/dist/pypy: tool translator Message-ID: <20050926222019.36B5227B6A@code1.codespeak.net> Author: pedronis Date: Tue Sep 27 00:20:17 2005 New Revision: 17887 Modified: pypy/dist/pypy/tool/ansi_print.py pypy/dist/pypy/translator/annrpython.py Log: more conversion of prints into using 
logging Modified: pypy/dist/pypy/tool/ansi_print.py ============================================================================== --- pypy/dist/pypy/tool/ansi_print.py (original) +++ pypy/dist/pypy/tool/ansi_print.py Tue Sep 27 00:20:17 2005 @@ -8,7 +8,9 @@ if file is None: file = sys.stderr text = text.rstrip() if esc and sys.platform != "win32" and file.isatty(): - text = ('\x1b[%sm' % esc + + if not isinstance(esc, tuple): + esc = (esc,) + text = (''.join(['\x1b[%sm' % cod for cod in esc]) + text + '\x1b[0m') # ANSI color code "reset" if newline: @@ -20,8 +22,18 @@ class AnsiLog: + KW_TO_COLOR = { + # color supress + 'red': ((31,), True), + 'bold': ((1,), True), + 'WARNING': ((31,), False), + 'event': ((1,), True), + 'ERROR': ((1, 31), False), + } + def __init__(self, kw_to_color={}, file=None): - self.kw_to_color = kw_to_color + self.kw_to_color = self.KW_TO_COLOR.copy() + self.kw_to_color.update(kw_to_color) self.file = file def __call__(self, msg): @@ -29,11 +41,13 @@ flush = False newline = True keywords = [] + esc = [] for kw in msg.keywords: - color = self.kw_to_color.get(kw) - if color and color not in keywords: - keywords.append(color) - keywords.append(kw) + color, supress = self.kw_to_color.get(kw, (None, False)) + if color: + esc.extend(color) + if not supress: + keywords.append(kw) if 'start' in keywords: if tty: newline = False @@ -43,15 +57,9 @@ if tty: print >> sys.stderr return - if 'bold' in keywords: - keywords.remove('bold') - esc = "1" - elif 'red' in keywords: - keywords.remove('red') - esc = "31" - else: - esc = None - ansi_print("[%s] %s" %(":".join(keywords), msg.content()), esc, - file=self.file, newline=newline, flush=flush) + esc = tuple(esc) + for line in msg.content().splitlines(): + ansi_print("[%s] %s" %(":".join(keywords), line), esc, + file=self.file, newline=newline, flush=flush) ansi_log = AnsiLog() Modified: pypy/dist/pypy/translator/annrpython.py ============================================================================== --- pypy/dist/pypy/translator/annrpython.py (original) +++ pypy/dist/pypy/translator/annrpython.py Tue Sep 27 00:20:17 2005 @@ -181,14 +181,14 @@ fn = self.why_not_annotated[block][1].break_at[0] self.blocked_functions[fn] = True import traceback - print '-+' * 30 - print 'BLOCKED block at:', - print self.whereami(self.why_not_annotated[block][1].break_at) - print 'because of:' - traceback.print_exception(*self.why_not_annotated[block]) - print '-+' * 30 - print - print "++-" * 20 + log.ERROR('-+' * 30) + log.ERROR('BLOCKED block at :' + + self.whereami(self.why_not_annotated[block][1].break_at)) + log.ERROR('because of:') + for line in traceback.format_exception(*self.why_not_annotated[block]): + log.ERROR(line) + log.ERROR('-+' * 30) + raise AnnotatorError('%d blocks are still blocked' % self.annotated.values().count(False)) # make sure that the return variables of all graphs is annotated @@ -262,7 +262,7 @@ self.bindings[arg] = s_value if annmodel.DEBUG: if arg in self.return_bindings: - log.bold("%s -> %s" % + log.event("%s -> %s" % (self.whereami((self.return_bindings[arg], None, None)), s_value)) @@ -284,7 +284,7 @@ if pos != '?': pos = self.whereami(pos) - log.red("*** WARNING: %s/ %s" % (pos, msg)) + log.WARNING("%s/ %s" % (pos, msg)) #___ interface for annotator.bookkeeper _______ @@ -391,10 +391,6 @@ try: self.flowin(fn, block) except BlockedInference, e: - #print '_'*60 - #print 'Blocked at %r:' % (e.break_at,) - #import traceback, sys - #traceback.print_tb(sys.exc_info()[2]) self.annotated[block] = False # failed, 
hopefully temporarily except Exception, e: # hack for debug tools only From pedronis at codespeak.net Tue Sep 27 00:29:27 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 27 Sep 2005 00:29:27 +0200 (CEST) Subject: [pypy-svn] r17888 - pypy/dist/pypy/translator Message-ID: <20050926222927.5C8CF27B6A@code1.codespeak.net> Author: pedronis Date: Tue Sep 27 00:29:26 2005 New Revision: 17888 Modified: pypy/dist/pypy/translator/annrpython.py Log: just on log event Modified: pypy/dist/pypy/translator/annrpython.py ============================================================================== --- pypy/dist/pypy/translator/annrpython.py (original) +++ pypy/dist/pypy/translator/annrpython.py Tue Sep 27 00:29:26 2005 @@ -181,13 +181,15 @@ fn = self.why_not_annotated[block][1].break_at[0] self.blocked_functions[fn] = True import traceback - log.ERROR('-+' * 30) - log.ERROR('BLOCKED block at :' + - self.whereami(self.why_not_annotated[block][1].break_at)) - log.ERROR('because of:') - for line in traceback.format_exception(*self.why_not_annotated[block]): - log.ERROR(line) - log.ERROR('-+' * 30) + blocked_err = [] + blocked_err.append('-+' * 30 +'\n') + blocked_err.append('BLOCKED block at :' + + self.whereami(self.why_not_annotated[block][1].break_at) + + '\n') + blocked_err.append('because of:\n') + blocked_err.extend(traceback.format_exception(*self.why_not_annotated[block])) + blocked_err.append('-+' * 30 +'\n') + log.ERROR(''.join(blocked_err)) raise AnnotatorError('%d blocks are still blocked' % self.annotated.values().count(False)) From cfbolz at codespeak.net Tue Sep 27 00:40:12 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Tue, 27 Sep 2005 00:40:12 +0200 (CEST) Subject: [pypy-svn] r17889 - in pypy/dist/pypy/translator/backendopt: . test Message-ID: <20050926224012.D270027B6C@code1.codespeak.net> Author: cfbolz Date: Tue Sep 27 00:40:11 2005 New Revision: 17889 Added: pypy/dist/pypy/translator/backendopt/tailrecursion.py pypy/dist/pypy/translator/backendopt/test/test_tailrecursion.py Log: a very academical transformation: tail recursion elimination for some very specific cases (I was bored). 
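The idea behind the rewrite can be pictured at the RPython source level: a recursive call in tail position is replaced by a jump back to the start of the graph with the call's arguments. The snippet below is a hand-written sketch of the loop that the rewritten gcd graph from the new test amounts to; it is only an illustration of the effect, not output of the pass, and the function name is made up.

    # Hand-written illustration (not produced by remove_tail_calls_to_self):
    # the rewired graph of the test's gcd behaves like this explicit loop,
    # so no new frame is needed per "recursive" step.
    def gcd_as_loop(a, b):
        while True:
            if a == 1 or a == 0:
                return b
            if a > b:
                a, b = b, a          # was: return gcd(b, a)
            else:
                a, b = b % a, a      # was: return gcd(b % a, a)

    assert gcd_as_loop(15, 25) == 5  # same result the test below checks
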
Added: pypy/dist/pypy/translator/backendopt/tailrecursion.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/backendopt/tailrecursion.py Tue Sep 27 00:40:11 2005 @@ -0,0 +1,61 @@ +import sys +from pypy.translator.unsimplify import copyvar, split_block +from pypy.objspace.flow.model import Variable, Constant, Block, Link +from pypy.objspace.flow.model import SpaceOperation, last_exception +from pypy.objspace.flow.model import traverse, mkentrymap, checkgraph, flatten +from pypy.annotation import model as annmodel +from pypy.rpython.lltype import Bool, typeOf, FuncType, _ptr +from pypy.rpython import rmodel + +# this transformation is very academical -- I had too much time + +def get_graph(arg, translator): + if isinstance(arg, Variable): + return None + f = arg.value + if not isinstance(f, _ptr): + return None + try: + callable = f._obj._callable + #external function calls don't have a real graph + if getattr(callable, "suggested_primitive", False): + return None + if callable in translator.flowgraphs: + return translator.flowgraphs[callable] + except AttributeError, KeyError: + pass + try: + return f._obj.graph + except AttributeError: + return None + +def _remove_tail_call(translator, graph, block): + print "removing tail call" + assert len(block.exits) == 1 + assert block.exits[0].target is graph.returnblock + assert block.operations[-1].result == block.exits[0].args[0] + op = block.operations[-1] + block.operations = block.operations[:-1] + block.exits[0].args = op.args[1:] + block.exits[0].target = graph.startblock + +def remove_tail_calls_to_self(translator, graph): + entrymap = mkentrymap(graph) + changed = False + for link in entrymap[graph.returnblock]: + block = link.prevblock + if (len(block.exits) == 1 and + len(block.operations) > 0 and + block.operations[-1].opname == 'direct_call' and + block.operations[-1].result == link.args[0]): + call = get_graph(block.operations[-1].args[0], translator) + print "getgraph", graph + if graph is graph: + _remove_tail_call(translator, graph, block) + changed = True + if changed: + from pypy.translator import simplify + checkgraph(graph) + simplify.remove_identical_vars(graph) + simplify.eliminate_empty_blocks(graph) + simplify.join_blocks(graph) Added: pypy/dist/pypy/translator/backendopt/test/test_tailrecursion.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/backendopt/test/test_tailrecursion.py Tue Sep 27 00:40:11 2005 @@ -0,0 +1,20 @@ +from pypy.objspace.flow.model import traverse, Block, Link, Variable, Constant +from pypy.translator.backendopt.tailrecursion import remove_tail_calls_to_self +from pypy.translator.translator import Translator +from pypy.rpython.llinterp import LLInterpreter +from pypy.translator.test.snippet import is_perfect_number + +def test_recursive_gcd(): + def gcd(a, b): + if a == 1 or a == 0: + return b + if a > b: + return gcd(b, a) + return gcd(b % a, a) + t = Translator(gcd) + a = t.annotate([int, int]) + t.specialize() + remove_tail_calls_to_self(t, t.flowgraphs[gcd]) + lli = LLInterpreter(t.flowgraphs, t.rtyper) + res = lli.eval_function(gcd, (15, 25)) + assert res == 5 From cfbolz at codespeak.net Tue Sep 27 00:43:41 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Tue, 27 Sep 2005 00:43:41 +0200 (CEST) Subject: [pypy-svn] r17890 - in pypy/dist/pypy/translator: . 
test Message-ID: <20050926224341.86C2127B6C@code1.codespeak.net> Author: cfbolz Date: Tue Sep 27 00:43:40 2005 New Revision: 17890 Added: pypy/dist/pypy/translator/test/test_simplify.py Modified: pypy/dist/pypy/translator/simplify.py Log: try to find functions without side-effects and remove them in transform_dead_op_vars. Modified: pypy/dist/pypy/translator/simplify.py ============================================================================== --- pypy/dist/pypy/translator/simplify.py (original) +++ pypy/dist/pypy/translator/simplify.py Tue Sep 27 00:43:40 2005 @@ -9,7 +9,7 @@ from pypy.objspace.flow.model import Variable, Constant, Block, Link from pypy.objspace.flow.model import last_exception from pypy.objspace.flow.model import checkgraph, traverse, mkentrymap - +from pypy.translator.backendopt.tailrecursion import get_graph # ____________________________________________________________ def eliminate_empty_blocks(graph): @@ -322,7 +322,57 @@ block.exits = tuple(lst) traverse(visit, graph) -def transform_dead_op_vars(graph): + +# _____________________________________________________________________ +# decide whether a function has side effects +lloperations_with_side_effects = {"setfield": True, + "setarrayitem": True, + } + +class HasSideEffects(Exception): + pass + +# XXX: this could even be improved: +# if setfield and setarrayitem only occur on things that are malloced +# in this function then the function still does not have side effects + +def has_no_side_effects(translator, graph, seen=None): + #is the graph specialized? if no we can't say anything + #don't cache the result though + if translator.rtyper is None: + return False + else: + if graph.startblock not in translator.rtyper.already_seen: + return False + if seen is None: + seen = [] + elif graph in seen: + return True + try: + def visit(block): + if not isinstance(block, Block): + return + for op in block.operations: + if op.opname in lloperations_with_side_effects: + raise HasSideEffects + if op.opname == "direct_call": + if isinstance(op.args[0], Variable): + raise HasSideEffects + g = get_graph(op.args[0], translator) + if g is None: + raise HasSideEffects + if not has_no_side_effects(translator, g, seen + [graph]): + raise HasSideEffects + traverse(visit, graph) + except HasSideEffects: + return False + else: + return True + +# ___________________________________________________________________________ +# remove operations if their result is not used and they have no side effects + +def transform_dead_op_vars(graph, translator=None): """Remove dead operations and variables that are passed over a link but not used in the target block. Input is a graph.""" blocks = {} @@ -330,7 +380,7 @@ if isinstance(block, Block): blocks[block] = True traverse(visit, graph) - return transform_dead_op_vars_in_blocks(blocks) + return transform_dead_op_vars_in_blocks(blocks, translator) # the set of operations that can safely be removed # (they have no side effects, at least in R-Python) @@ -349,7 +399,7 @@ hasattr: True, } -def transform_dead_op_vars_in_blocks(blocks): +def transform_dead_op_vars_in_blocks(blocks, translator=None): """Remove dead operations and variables that are passed over a link but not used in the target block. 
Input is a set of blocks""" read_vars = {} # set of variables really used @@ -427,7 +477,14 @@ del block.operations[i] except TypeError: # func is not hashable pass - + elif op.opname == 'direct_call': + if translator is not None: + graph = get_graph(op.args[0], translator) + if (graph is not None and + has_no_side_effects(translator, graph) and + (block.exitswitch != Constant(last_exception) or + i != len(block.operations)- 1)): + del block.operations[i] # look for output variables never used # warning: this must be completely done *before* we attempt to # remove the corresponding variables from block.inputargs! Added: pypy/dist/pypy/translator/test/test_simplify.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/test/test_simplify.py Tue Sep 27 00:43:40 2005 @@ -0,0 +1,59 @@ +from pypy.translator.translator import Translator +from pypy.objspace.flow.model import traverse, Block + +def test_remove_direct_call_without_side_effects(): + def f(x): + return x + 123 + def g(x): + a = f(x) + return x * 12 + t = Translator(g) + a = t.annotate([int]) + t.specialize() + t.backend_optimizations() + assert len(t.flowgraphs[g].startblock.operations) == 1 + +def test_dont_remove_external_calls(): + import os + def f(x): + os.close(x) + t = Translator(f) + a = t.annotate([int]) + t.specialize() + t.backend_optimizations() + assert len(t.flowgraphs[f].startblock.operations) == 1 + +def test_remove_recursive_call(): + def rec(a): + if a <= 1:block.exitswitch != Constant(last_exception): + return 0 + else: + return rec(a - 1) + 1 + def f(x): + a = rec(x) + return x + 12 + t = Translator(f) + a = t.annotate([int]) + t.specialize() + t.backend_optimizations() + assert len(t.flowgraphs[f].startblock.operations) + +def test_dont_remove_if_exception_guarded(): + def f(x): + a = {} #do some stuff to prevent inlining + a['123'] = 123 + a['1123'] = 1234 + return x + 1 + def g(x): + try: + a = f(x) + except OverflowError: + raise + else: + return 1 + t = Translator(g) + a = t.annotate([int]) + t.specialize() + t.backend_optimizations() + assert t.flowgraphs[g].startblock.operations[-1].opname == 'direct_call' + From cfbolz at codespeak.net Tue Sep 27 00:55:56 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Tue, 27 Sep 2005 00:55:56 +0200 (CEST) Subject: [pypy-svn] r17891 - in pypy/dist/pypy/translator/backendopt: . test Message-ID: <20050926225556.9022227B6C@code1.codespeak.net> Author: cfbolz Date: Tue Sep 27 00:55:54 2005 New Revision: 17891 Added: pypy/dist/pypy/translator/backendopt/propagate.py pypy/dist/pypy/translator/backendopt/test/test_propagate.py Modified: pypy/dist/pypy/translator/backendopt/all.py Log: Some more "I am on vacation and have way too much time" graph transformations: these try to do some amount of constant folding and constant propagation. They are nice in theory although I think rather unimportant in practice (they yield 8% speedup on targetrpystone). 
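As a rough sketch of what the new pass buys at the source level, consider the helper from test_constant_fold_call below: both calls to s have constant arguments and s has no side effects, so repeated constant folding reduces the body of g to a single addition. The driver below mirrors the test's setup and assumes the Translator methods and constant_folding behave as shown in the diff; it is meant as an illustration, not a recipe.

    # Sketch mirroring test_constant_fold_call from the diff below.
    from pypy.translator.translator import Translator
    from pypy.translator.backendopt.propagate import constant_folding

    def s(x):                      # side-effect free, so calls with constant
        res = 0                    # arguments are candidates for folding
        for i in range(1, x + 1):
            res += i
        return res

    def g(x):
        return s(100) + s(1) + x

    t = Translator(g)
    t.annotate([int])
    t.specialize()
    t.backend_optimizations(ssa_form=False, propagate=False)
    graph = t.getflowgraph()
    while constant_folding(graph, t):   # repeat until nothing folds any more
        pass
    # conceptually g is now equivalent to: return 5051 + x   (5050 + 1 + x)
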
Modified: pypy/dist/pypy/translator/backendopt/all.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/all.py (original) +++ pypy/dist/pypy/translator/backendopt/all.py Tue Sep 27 00:55:54 2005 @@ -3,21 +3,24 @@ from pypy.translator.backendopt.inline import auto_inlining from pypy.translator.backendopt.malloc import remove_simple_mallocs from pypy.translator.backendopt.ssa import SSI_to_SSA +from pypy.translator.backendopt.propagate import propagate_all from pypy.translator import simplify def backend_optimizations(translator, inline_threshold=1, mallocs=True, - ssa_form=True): + ssa_form=True, + propagate=True): # remove obvious no-ops for graph in translator.flowgraphs.values(): remove_same_as(graph) simplify.eliminate_empty_blocks(graph) - + simplify.transform_dead_op_vars(graph, translator) # inline functions in each other + if propagate: + propagate_all(translator) if inline_threshold: auto_inlining(translator, inline_threshold) - # vaporize mallocs if mallocs: for graph in translator.flowgraphs.values(): @@ -25,8 +28,10 @@ # remove typical leftovers from malloc removal remove_same_as(graph) simplify.eliminate_empty_blocks(graph) - simplify.transform_dead_op_vars(graph) - + simplify.transform_dead_op_vars(graph, translator) + if propagate: + propagate_all(translator) + if ssa_form: for graph in translator.flowgraphs.values(): SSI_to_SSA(graph) Added: pypy/dist/pypy/translator/backendopt/propagate.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/backendopt/propagate.py Tue Sep 27 00:55:54 2005 @@ -0,0 +1,276 @@ +from pypy.objspace.flow.model import Block, Variable, Constant, last_exception +from pypy.objspace.flow.model import traverse, mkentrymap, checkgraph +from pypy.rpython.lltype import Void, Bool +from pypy.rpython.llinterp import LLInterpreter, LLFrame +from pypy.translator import simplify +from pypy.translator.backendopt.tailrecursion import get_graph +from pypy.translator.backendopt.removenoops import remove_same_as + +def do_atmost(n, f, *args): + i = 0 + while f(*args): + i += 1 + if i > n: + break + return i > 0 + +def rewire_links(graph): + """This function changes the target of certain links: this happens + if the exitswitch is passed along the link to another block where + it is again used as the exitswitch. 
This situation occurs after the + inlining of functions that return a bool.""" + entrymap = mkentrymap(graph) + candidates = {} + def visit(block): + if not isinstance(block, Block): + return + if (isinstance(block.exitswitch, Variable) and + block.exitswitch.concretetype is Bool): + for val, link in enumerate(block.exits): + val = bool(val) + if block.exitswitch not in link.args: + continue + if len(link.target.operations) > 0: + continue + index = link.args.index(block.exitswitch) + var = link.target.inputargs[index] + if link.target.exitswitch is var: + candidates[block] = val + traverse(visit, graph) + for block, val in candidates.iteritems(): + link = block.exits[val] + args = [] + for arg in link.target.exits[val].args: + if isinstance(arg, Constant): + args.append(arg) + else: + index = link.target.inputargs.index(arg) + args.append(link.args[index]) + link.target = link.target.exits[val].target + link.args = args + if candidates: + print "rewiring links in graph", graph.name + checkgraph(graph) + return True + return False + +def coalesce_links(graph): + candidates = {} + def visit(block): + if not isinstance(block, Block): + return + if len(block.exits) != 2: + return + if (block.exits[0].args == block.exits[1].args and + block.exits[0].target is block.exits[1].target): + candidates[block] = True + traverse(visit, graph) + for block in candidates: + block.exitswitch = None + block.exits = block.exits[:1] + block.exits[0].exitcase = None + if candidates: + print "coalescing links in graph", graph.name + return True + else: + return False + +def propagate_consts(graph): + """replace a variable of the inputargs of a block by a constant + if all blocks leading to it have the same constant in that position""" + entrymap = mkentrymap(graph) + candidates = [] + changed = [False] + for block, ingoing in entrymap.iteritems(): + if block in [graph.returnblock, graph.exceptblock]: + continue + for i in range(len(ingoing[0].args) - 1, -1, -1): + vals = {} + withvar = True + for link in ingoing: + if isinstance(link.args[i], Variable): + break + else: + vals[link.args[i]] = True + else: + withvar = False + if len(vals) != 1 or withvar: + continue + print "propagating constants in graph", graph.name + const = vals.keys()[0] + for link in ingoing: + del link.args[i] + var = block.inputargs[i] + del block.inputargs[i] + block.renamevariables({var: const}) + changed[0] = True + if changed[0]: + checkgraph(graph) + return True + return False + +_op = """getarrayitem setarrayitem malloc malloc_varsize flavored_malloc + flavored_free getfield setfield getsubstruct getarraysubstruct + getarraysize raw_malloc raw_free raw_memcopy raw_load + raw_store direct_call cast_pointer""".split() +from pypy.objspace.flow.operation import FunctionByName +_op += FunctionByName.keys() #operations with PyObjects are dangerous +cannot_constant_fold = {} +for opname in _op: + cannot_constant_fold[opname] = True +del _op +del FunctionByName + +class TooManyOperations(Exception): + pass + +class CountingLLFrame(LLFrame): + def __init__(self, graph, args, llinterpreter, f_back=None, maxcount=1000): + super(CountingLLFrame, self).__init__(graph, args, llinterpreter, f_back) + self.count = 0 + self.maxcount = maxcount + + def eval_operation(self, operation): + if operation is None: #can happen in the middle of constant folding + return + self.count += 1 + if self.count > self.maxcount: + raise TooManyOperations + return super(CountingLLFrame, self).eval_operation(operation) + +def constant_folding(graph, translator): + 
"""do constant folding if the arguments of an operations are constants""" + lli = LLInterpreter(translator.flowgraphs, translator.rtyper) + llframe = LLFrame(graph, None, lli) + changed = [False] + def visit(block): + if not isinstance(block, Block): + return + for i, op in enumerate(block.operations): + if sum([isinstance(arg, Variable) for arg in op.args]): + continue + if op.opname not in cannot_constant_fold: + print "folding operation", op, "in graph", graph.name + try: + llframe.eval_operation(op) + except: + print "did not work" + else: + res = Constant(llframe.getval(op.result)) + print "result", res.value + res.concretetype = op.result.concretetype + block.operations[i].opname = "same_as" + block.operations[i].args = [res] + changed[0] = True + elif op.opname == "direct_call": + called_graph = get_graph(op.args[0], translator) + if (called_graph is not None and + simplify.has_no_side_effects(translator, called_graph) and + (block.exitswitch != Constant(last_exception) or + i != len(block.operations) - 1)): + args = [arg.value for arg in op.args[1:]] + countingframe = CountingLLFrame(called_graph, args, lli) + print "folding call", op, "in graph", graph.name + try: + res = countingframe.eval() + except: + print "did not work" + pass + else: + print "result", res + res = Constant(res) + res.concretetype = op.result.concretetype + block.operations[i].opname = "same_as" + block.operations[i].args = [res] + changed[0] = True + block.operations = [op for op in block.operations if op is not None] + traverse(visit, graph) + if changed[0]: + remove_same_as(graph) + propagate_consts(graph) + checkgraph(graph) + return True + return False + +def partial_folding_once(graph, translator): + lli = LLInterpreter(translator.flowgraphs, translator.rtyper) + entrymap = mkentrymap(graph) + def visit(block): + if (not isinstance(block, Block) or block is graph.startblock or + block is graph.returnblock or block is graph.exceptblock): + return + usedvars = {} + for op in block.operations: + if op.opname in cannot_constant_fold: + return + for arg in op.args: + if (isinstance(arg, Variable) and arg in block.inputargs): + usedvars[arg] = True + if isinstance(block.exitswitch, Variable): + usedvars[block.exitswitch] = True + pattern = [arg in usedvars for arg in block.inputargs] + for link in entrymap[block]: + s = sum([isinstance(arg, Constant) or not p + for arg, p in zip(link.args, pattern)]) + if s != len(link.args): + continue + args = [] + for i, arg in enumerate(link.args): + if isinstance(arg, Constant): + args.append(arg.value) + else: + assert not pattern[i] + args.append(arg.concretetype._example()) + llframe = LLFrame(graph, None, lli) + llframe.fillvars(block, args) + nextblock, forwardargs = llframe.eval_block(block) + if nextblock is not None: + newargs = [] + for i, arg in enumerate(nextblock.inputargs): + try: + index = [l.target for l in block.exits].index(nextblock) + index = block.inputargs.index(block.exits[index].args[i]) + except ValueError: + c = Constant(forwardargs[i]) + c.concretetype = arg.concretetype + newargs.append(c) + else: + newargs.append(link.args[index]) + else: + assert 0, "this should not occur" + unchanged = link.target == nextblock and link.args == newargs + link.target = nextblock + link.args = newargs + checkgraph(graph) + if not unchanged: + raise ValueError + try: + traverse(visit, graph) + except ValueError: + return True + else: + return False + +def partial_folding(graph, translator): + """this function does constant folding in the following situation: + a 
block has a link that leads to it that has only constant args. Then all + the operations of this block are evaluated and the link leading to the + block is adjusted according to the resulting value of the exitswitch""" + if do_atmost(1000, partial_folding_once, graph, translator): + propagate_consts(graph) + simplify.join_blocks(graph) + return True + else: + return False + +def propagate_all(translator): + for graph in translator.flowgraphs.itervalues(): + def prop(): + changed = rewire_links(graph) + changed = changed or propagate_consts(graph) + changed = changed or coalesce_links(graph) + changed = changed or do_atmost(100, constant_folding, graph, + translator) + changed = changed or partial_folding(graph, translator) + return changed + do_atmost(10, prop) Added: pypy/dist/pypy/translator/backendopt/test/test_propagate.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/backendopt/test/test_propagate.py Tue Sep 27 00:55:54 2005 @@ -0,0 +1,87 @@ +from pypy.translator.translator import Translator +from pypy.translator.backendopt.propagate import * +from pypy.rpython.llinterp import LLInterpreter + + +def get_graph(fn, signature): + t = Translator(fn) + t.annotate(signature) + t.specialize() + t.backend_optimizations(ssa_form=False, propagate=False) + graph = t.getflowgraph() + return graph, t + +def check_graph(graph, args, expected_result, t): + interp = LLInterpreter(t.flowgraphs, t.rtyper) + res = interp.eval_function(None, args, graph=graph) + assert res == expected_result + +def check_get_graph(fn, signature, args, expected_result): + graph, t = get_graph(fn, signature) + check_graph(graph, args, expected_result, t) + return graph + +def test_inline_and(): + def f(x): + return x != 1 and x != 5 and x != 42 + def g(x): + ret = 0 + for i in range(x): + if f(x): + ret += x + else: + ret += x + 1 + return ret + graph, t = get_graph(g, [int]) + propagate_consts(graph) + assert len(graph.startblock.exits[0].args) == 4 + check_graph(graph, [100], g(100), t) + +def test_dont_fold_return(): + def f(x): + return + graph, t = get_graph(f, [int]) + propagate_consts(graph) + assert len(graph.startblock.exits[0].args) == 1 + check_graph(graph, [1], None, t) + +def test_constant_fold(): + def f(x): + return 1 + def g(x): + return 1 + f(x) + graph, t = get_graph(g, [int]) + constant_folding(graph, t) + assert len(graph.startblock.operations) == 0 + check_graph(graph, [1], g(1), t) + +def test_constant_fold_call(): + def s(x): + res = 0 + for i in range(1, x + 1): + res += i + return res + def g(x): + return s(100) + s(1) + x + graph, t = get_graph(g, [int]) + while constant_folding(graph, t): + pass + assert len(graph.startblock.operations) == 1 + check_graph(graph, [10], g(10), t) + +def test_fold_const_blocks(): + def s(x): + res = 0 + i = 1 + while i < x: + res += i + i += 1 + return res + def g(x): + return s(100) + s(99) + x + graph, t = get_graph(g, [int]) + partial_folding(graph, t) + constant_folding(graph, t) + assert len(graph.startblock.operations) == 1 + check_graph(graph, [10], g(10), t) + From ale at codespeak.net Tue Sep 27 10:08:49 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Tue, 27 Sep 2005 10:08:49 +0200 (CEST) Subject: [pypy-svn] r17892 - pypy/extradoc/minute Message-ID: <20050927080849.856D627B6A@code1.codespeak.net> Author: ale Date: Tue Sep 27 10:08:48 2005 New Revision: 17892 Added: pypy/extradoc/minute/pypy-sync-09-22-2005.txt Log: Added: 
pypy/extradoc/minute/pypy-sync-09-22-2005.txt ============================================================================== --- (empty file) +++ pypy/extradoc/minute/pypy-sync-09-22-2005.txt Tue Sep 27 10:08:48 2005 @@ -0,0 +1,229 @@ +============================================= +pypy-sync developer meeting 22th September +============================================= + +Time & location: 1pm (30 minutes) at #pypy-sync + +Attendees:: + + Samuele Pedroni + Anders Lehmann (minutes/moderation) + Anders Chrigstr?m + Christian Tismer + Holger Krekel + Eric van Riet Paap + Michael Hudson + Adrien Di Mascio + +Regular Topics +==================== + +- activity reports (3 prepared lines of info). + All Attendees submitted activity reports (see `IRC-Log`_ + at the end and 'LAST/NEXT/BLOCKERS' entries in particular) + +- resolve conflicts/blockers + No conflicts were discovered (which we could address). + +Topics of the week +=================== + +1. translate_pypy_new +------------------------------ + +We should try to make translate_pypy_new be the default way of translating pypy this week (ale and pedronis). There are wishes as to be able to specify options in the target, that translate_pypy_new should propagate to the translation process. + +Better naming and better choice of default values should be done first. + +Batch mode is another wish + +2. A pypy scrap-book ? +------------------------------ + +It was agreed that it was a nice idea, and that we should setup a file for +that purpose. (ale will do that) + +3. Candidates for Refactoring +-------------------------------- + +Two candidates were mentioned. The backendoptimizations and the that we should separate inlining and malloc removal. Both topics was deemed as hard and as good topics for a sprint. + + +Next pypy-sync meeting +------------------------------- + +Scheduled for next Thursday, Sept. 29nd 1300h CET, conducted by Anders Lehmann. + + +Closing +------------------ + +Anders closes the meeting at 13:34pm. + + + Hi everyone, my watch says 1 o'clock. Have anyone heard from pedronis, ludal, stakkars and arigo + Should we ait for them ? + not sure + aleale: Hi + Samuele is not in the office. + armin flew to goetheborg today + ludal won't be here today + let's start the meeting anyway, i'd say + Ok + Welcome + hello + Lets start with what we have prepared + Prev: a little on translate_pypy_new + Next: translate_pypy_new ? + Blockers: - + PREV: Getting the astcompiler is shape + NEXT: Even more work on the astcompiler + BLOCKERS: None + LAST: astcompiler + NEXT: none + BLOCKERS: none + last: dealt with UoB stuff + next: continue to attempt to resolve UoB situation + blockers: UoB administrative delays +??? stakkars [i=oauxesw at i577B5C64.versanet.de] has joined #pypy-sync + last: EU issues, py lib/test stuff + next: relaxing, pytest related things + blockers: - + welcome stakkars, we just doing the round of prepared stuff + ok + DONE: optimization and benchmarking + NEXT: more of this, splitting generated code + BLOCK: None + ericvp: Do you have something? + probably he is not actually here at the moment + The only block seems to be mwh's wrestling with UoB bureaucraci + I dont know if we can help, anyone ? + not really +??? bertf [n=bertf at pD9514462.dip0.t-ipconnect.de] has joined #pypy-sync + we talked about it in the consortium meeting yesterday + yes + if the uob doesn't work out, it looks like i'll join armin at hhud + (administratively more than physically :) + Hi bertf. 
we are just about moving to the weekly topics + Ok, mwh good luck + yea, sorry, was in the wrong channel. Hi + aleale: thanks + let's not waste time on this one today :) + The weekly topics are: + 1. translate_pypy_new + 2. A pypy scrap-book ? + 3. Candidates for Refactoring + bertf: hi bert + sorry, was asleep. will paste now + last: transformation for faster exception handling in llvm + next: nightly cronjob benchmarking of pypy-c and pypy-llvm + blockers: - + first I would like if we could give a hint on in what direction we are heading with translate_pypy_new + Thanks ericvp + But since armin and pedronis are not here we better defer it to #pypy ? + can someone quickly summarize the point of translate_pypy_new ? + yes. I heared Samuele saying he would help with this + is it just to tidy up the various translation target/option/... stuff + ? + mwh: yes + right, thanks + It was/is an attempt to clean up te mess and maybe add some flexibility, I think + Defer ? + yes, the idea is a more unified model for options regarding our pypy entry points + aleale: can one currently use translate_pypy_new already? +? hpk/#pypy-sync hasn't tried so far + I have tried it on targetrichards and standalone. seems to work + Actually I think all the checked in versions have work (with different naming of options) + it probably makes sense to switch it in sometime next week, after some more people have tried to use it + I was a bit puzzled about new option names. I know the old ones by heart. that made me lazy to try it. + It is working, some options do the opposite of what you expect and default values need to be choosen better + but yes, it is working + the naming had to change due to optparse - maybe we dont want to use that + Hi pedronis +??? pedronis [n=Samuele_ at c-278b70d5.022-54-67626719.cust.bredbandsbolaget.se] has joined #pypy-sync + Hi pedronis +? pedronis/#pypy-sync sorry + ha + no, we should use optparse + Ok, we were discussing the state of translate_pypy_new + but keep the option meanings and default values as close as sensible to the current script + like Eric said + sure + what was the expected changeof the refactoring? + there's an issue about that + less mess/ more flexibility + just better layout, or adding features easily? + sorry, I didn't look last time. Is it considered ready? + well, one thing we may want is pass --usemodulus to targertpypy. + yip + other is to run it unattended with a failed/successed exit value and writing to logs (maybe) + yip, but we can add that + i think it's good to head for actually using it some time soon + a complete batch mode: compile with options, run tests, do next one. + Ok, I'll conclude that we want to have translate_pypy_new brougth into a usable state soon + at the moment if used with -d option on targetrichads it crashed + --usemodules is indeed a thing where I'd like to have the defaults in the target. It would be nice if the + translate_pypy was able to read the options which a target has and to support them + and that we want to add more features. + a way to specify specific things in a target without cluttering translate + well, but also to take them from the command line + target specific options + Who have time to look at it ( I wont be able to do anything before tuesday)? 
+ I want to provide them in the commandline, for the target, without defining them in translate, necessarily + I can look at it a bit + I have prepared for target specific options (add a dictionary called "options" in the target) + as an aside: I'd like to renew the idea of saving state to a file which can be postprocessed, right before the + backend + I think we should go to the next topic (8 mins left) + a scrap book sounds like a nice idea, but is there that much to go in it? + mwh: what to you mean? + maybe in misunderstood the idea then + ^I + I think we should start thinking about it and collecting stuff. + Then we can later decide how to present it + well, just start a .txt file + makes sense + i certainly don't know where to find all our conference presentations + what about pictures and video? + should it be a directory instead + we don't have much in that area (apart from the sprint pictures i took) + they can be referenced along with the sprint reports + ok I wil start the scrapbook then + topic 3 : Candidates for refactoring + well, the backendoptimizations are still pretty scary + don't know if that's fixable, though + It seems that we have enough tasks for the next week so I think we can postpone it ? + we should try to seperate inlining and malloc removal + mwh: do you have time to look at that ? + not really + stakkars: will that be done as part of the current optimisations + I hope so. But it is difficult. + Ok the candidates will be recorded in the minutes. Anyway the time is up + might be a better sprint topic +??? mwh [n=user at 82-33-185-193.cable.ubr01.azte.blueyonder.co.uk] has left #pypy-sync ["ERC Version 5.0 (CVS) $Revision: + 1.771 $ (IRC client for Emacs)"] + Have we choesen a moderator for next week ? + yes, the list is completely incomplete :) + yes, who is doing it next week? + Any takers? + Ok I can do it again ? + I think of a number, you guess it. who is closes takes it +? stakkars/#pypy-sync thinks this didn't work + I would like to add : "Incredible job You've done with optimisations. Great job" + stakkars, aleale: what about the two of you doing it the next two times? + after that we already have the paris sprint and can plan further + I will do it next week . See you all then + no problem + ok, great. + and we are 4 minutes late + no idea why exactly we, but it's fine +??? stakkars [i=oauxesw at i577B5C64.versanet.de] has left #pypy-sync [] + see you + see you +??? adim [n=adim at logilab.net2.nerim.net] has left #pypy-sync [] + bye. I have to prepare my sons 18 birthday now - so the minutes will first be done tomorrow + goodbye + bye + [01:40pm][aleale] [#pypy-sync(+ns)] + [Lag 0] [O/1 N/6 I/0 V/0 F/0] [U:a:S:b:h] +[#pypy-sync] From ac at codespeak.net Tue Sep 27 10:46:13 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Tue, 27 Sep 2005 10:46:13 +0200 (CEST) Subject: [pypy-svn] r17893 - in pypy/dist/pypy/module/__builtin__: . test Message-ID: <20050927084613.CD8A927B6C@code1.codespeak.net> Author: ac Date: Tue Sep 27 10:46:13 2005 New Revision: 17893 Modified: pypy/dist/pypy/module/__builtin__/compiling.py pypy/dist/pypy/module/__builtin__/test/test_builtin.py Log: Don't alter lineumbers when compiling unicode. 
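The point of the change is that the old workaround prepended a whole extra source line, so every line number reported for a unicode source was off by one, whereas a UTF-8 BOM marks the encoding without adding a line. A minimal standalone illustration (not part of the commit; the source string is an arbitrary example):

    # Count physical lines under the two prefixing strategies.
    src = u'-'.encode('utf-8')
    old = "# -*- coding: utf-8 -*-\n" + src   # user's line 1 becomes line 2
    new = '\xEF\xBB\xBF' + src                # BOM adds no newline at all
    assert old.count('\n') == src.count('\n') + 1
    assert new.count('\n') == src.count('\n')
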
Modified: pypy/dist/pypy/module/__builtin__/compiling.py ============================================================================== --- pypy/dist/pypy/module/__builtin__/compiling.py (original) +++ pypy/dist/pypy/module/__builtin__/compiling.py Tue Sep 27 10:46:13 2005 @@ -9,11 +9,11 @@ def compile(space, w_source, filename, mode, flags=0, dont_inherit=0): if space.is_true(space.isinstance(w_source, space.w_unicode)): - # hack: encode the unicode string as UTF-8 and attach a 'coding' - # declaration at the start + # hack: encode the unicode string as UTF-8 and attach + # a BOM at the start w_source = space.call_method(w_source, 'encode', space.wrap('utf-8')) str_ = space.str_w(w_source) - str_ = "# -*- coding: utf-8 -*-\n" + str_ + str_ = '\xEF\xBB\xBF' + str_ else: str_ = space.str_w(w_source) Modified: pypy/dist/pypy/module/__builtin__/test/test_builtin.py ============================================================================== --- pypy/dist/pypy/module/__builtin__/test/test_builtin.py (original) +++ pypy/dist/pypy/module/__builtin__/test/test_builtin.py Tue Sep 27 10:46:13 2005 @@ -360,6 +360,12 @@ raises(ValueError, compile, '1+2', '?', 'maybenot') raises(TypeError, compile, '1+2', 12, 34) + def test_unicode_compile(self): + try: + compile(u'-', '?', 'eval') + except SyntaxError, e: + assert e.lineno == 1 + def test_isinstance(self): assert isinstance(5, int) assert isinstance(5, object) From ac at codespeak.net Tue Sep 27 11:09:33 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Tue, 27 Sep 2005 11:09:33 +0200 (CEST) Subject: [pypy-svn] r17894 - pypy/dist/lib-python/modified-2.4.1/test Message-ID: <20050927090933.3BC2927B70@code1.codespeak.net> Author: ac Date: Tue Sep 27 11:09:28 2005 New Revision: 17894 Added: pypy/dist/lib-python/modified-2.4.1/test/test_builtin.py - copied, changed from r17877, pypy/dist/lib-python/2.4.1/test/test_builtin.py Log: Make a test less time and memory consuming. 
From ericvrp at codespeak.net Tue Sep 27 11:32:12 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Tue, 27 Sep 2005 11:32:12 +0200 (CEST) Subject: [pypy-svn] r17895 - pypy/dist/pypy/translator/llvm/backendopt Message-ID: <20050927093212.9153327B70@code1.codespeak.net> Author: ericvrp Date: Tue Sep 27 11:32:11 2005 New Revision: 17895 Added: pypy/dist/pypy/translator/llvm/backendopt/__init__.py Log: forgot to check this in Added: pypy/dist/pypy/translator/llvm/backendopt/__init__.py ============================================================================== From cfbolz at codespeak.net Tue Sep 27 11:36:20 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Tue, 27 Sep 2005 11:36:20 +0200 (CEST) Subject: [pypy-svn] r17896 - pypy/dist/pypy/translator/test Message-ID: <20050927093620.C3A6927B70@code1.codespeak.net> Author: cfbolz Date: Tue Sep 27 11:36:20 2005 New Revision: 17896 Modified: pypy/dist/pypy/translator/test/test_simplify.py Log: no idea what happened here Modified: pypy/dist/pypy/translator/test/test_simplify.py ============================================================================== --- pypy/dist/pypy/translator/test/test_simplify.py (original) +++ pypy/dist/pypy/translator/test/test_simplify.py Tue Sep 27 11:36:20 2005 @@ -25,7 +25,7 @@ def test_remove_recursive_call(): def rec(a): - if a <= 1:block.exitswitch != Constant(last_exception): + if a <= 1: return 0 else: return rec(a - 1) + 1 From cfbolz at codespeak.net Tue Sep 27 11:37:17 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Tue, 27 Sep 2005 11:37:17 +0200 (CEST) Subject: [pypy-svn] r17897 - pypy/dist/pypy/translator/backendopt Message-ID: <20050927093717.BBF4827B70@code1.codespeak.net> Author: cfbolz Date: Tue Sep 27 11:37:17 2005 New Revision: 17897 Modified: pypy/dist/pypy/translator/backendopt/propagate.py Log: it's cleverer to let remove_same_as handle all the problems of var->const replacement Modified: pypy/dist/pypy/translator/backendopt/propagate.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/propagate.py (original) +++ pypy/dist/pypy/translator/backendopt/propagate.py Tue Sep 27 11:37:17 2005 @@ -1,5 +1,6 @@ from pypy.objspace.flow.model import Block, Variable, Constant, last_exception from pypy.objspace.flow.model import traverse, mkentrymap, checkgraph +from pypy.objspace.flow.model import SpaceOperation from pypy.rpython.lltype import Void, Bool from pypy.rpython.llinterp import LLInterpreter, LLFrame from pypy.translator import simplify @@ -102,9 +103,11 @@ del link.args[i] var = block.inputargs[i] del block.inputargs[i] - block.renamevariables({var: const}) + op = SpaceOperation("same_as", [const], var) + block.operations.insert(0, op) changed[0] = True if changed[0]: + remove_same_as(graph) checkgraph(graph) return True return False From cfbolz at codespeak.net Tue Sep 27 11:46:39 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Tue, 27 Sep 2005 11:46:39 +0200 (CEST) Subject: [pypy-svn] r17898 - pypy/dist/pypy/translator/backendopt Message-ID: <20050927094639.739D327B6A@code1.codespeak.net> Author: cfbolz Date: Tue Sep 27 11:46:38 2005 New Revision: 17898 Modified: pypy/dist/pypy/translator/backendopt/all.py Log: disabling propagate it seems to really break graphs :-( Modified: pypy/dist/pypy/translator/backendopt/all.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/all.py 
(original) +++ pypy/dist/pypy/translator/backendopt/all.py Tue Sep 27 11:46:38 2005 @@ -10,7 +10,7 @@ def backend_optimizations(translator, inline_threshold=1, mallocs=True, ssa_form=True, - propagate=True): + propagate=False): # remove obvious no-ops for graph in translator.flowgraphs.values(): remove_same_as(graph) From cfbolz at codespeak.net Tue Sep 27 12:40:01 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Tue, 27 Sep 2005 12:40:01 +0200 (CEST) Subject: [pypy-svn] r17899 - in pypy/dist/pypy/translator/backendopt: . test Message-ID: <20050927104001.0DB6027B6E@code1.codespeak.net> Author: cfbolz Date: Tue Sep 27 12:40:01 2005 New Revision: 17899 Modified: pypy/dist/pypy/translator/backendopt/propagate.py pypy/dist/pypy/translator/backendopt/test/test_propagate.py Log: fix a few issues of propagate: coalesce_links did the wrong thing with except link: fix + test (thanks eric) cast_ptr_to_int should of course _not_ be folded (thanks samuele) one more test that fails in llvm, but seems to pass for me Modified: pypy/dist/pypy/translator/backendopt/propagate.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/propagate.py (original) +++ pypy/dist/pypy/translator/backendopt/propagate.py Tue Sep 27 12:40:01 2005 @@ -62,6 +62,8 @@ return if len(block.exits) != 2: return + if block.exitswitch == Constant(last_exception): + return if (block.exits[0].args == block.exits[1].args and block.exits[0].target is block.exits[1].target): candidates[block] = True @@ -103,11 +105,11 @@ del link.args[i] var = block.inputargs[i] del block.inputargs[i] - op = SpaceOperation("same_as", [const], var) - block.operations.insert(0, op) + op = SpaceOperation("same_as", [const], var) + block.operations.insert(0, op) changed[0] = True if changed[0]: - remove_same_as(graph) + remove_same_as(graph) checkgraph(graph) return True return False @@ -115,7 +117,7 @@ _op = """getarrayitem setarrayitem malloc malloc_varsize flavored_malloc flavored_free getfield setfield getsubstruct getarraysubstruct getarraysize raw_malloc raw_free raw_memcopy raw_load - raw_store direct_call cast_pointer""".split() + raw_store direct_call cast_pointer cast_ptr_to_int""".split() from pypy.objspace.flow.operation import FunctionByName _op += FunctionByName.keys() #operations with PyObjects are dangerous cannot_constant_fold = {} Modified: pypy/dist/pypy/translator/backendopt/test/test_propagate.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/test/test_propagate.py (original) +++ pypy/dist/pypy/translator/backendopt/test/test_propagate.py Tue Sep 27 12:40:01 2005 @@ -85,3 +85,36 @@ assert len(graph.startblock.operations) == 1 check_graph(graph, [10], g(10), t) +def getitem(l, i): #LookupError, KeyError + if not isinstance(i, int): + raise TypeError + if i < 0: + i = len(l) - i + if i>= len(l): + raise IndexError + return l[i] + +def test_dont_coalesce_except(): + def fn(n): + lst = range(10) + try: + getitem(lst,n) + except: + pass + return 4 + graph, t = get_graph(fn, [int]) + coalesce_links(graph) + check_graph(graph, [-1], fn(-1), t) + +def list_default_argument(i1, l1=[0]): + l1.append(i1) + return len(l1) + l1[-2] + +def call_list_default_argument(i1): + return list_default_argument(i1) + +def test_call_list_default_argument(): + graph, t = get_graph(call_list_default_argument, [int]) + t.backend_optimizations(propagate=True, ssa_form=False) + for i in range(10): + 
check_graph(graph, [i], call_list_default_argument(i), t) From ac at codespeak.net Tue Sep 27 14:20:36 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Tue, 27 Sep 2005 14:20:36 +0200 (CEST) Subject: [pypy-svn] r17900 - pypy/dist/pypy/interpreter/pyparser Message-ID: <20050927122036.0F29B27B6E@code1.codespeak.net> Author: ac Date: Tue Sep 27 14:20:36 2005 New Revision: 17900 Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py pypy/dist/pypy/interpreter/pyparser/pythonparse.py Log: Use proper encoding when interpreting strings. Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Tue Sep 27 14:20:36 2005 @@ -553,14 +553,14 @@ if len(atoms) == 1: token = atoms[0] assert isinstance(token, TokenObject) - builder.push(ast.Const(parsestr(builder.space, None, token.get_value()), top.lineno)) # XXX encoding + builder.push(ast.Const(parsestr(builder.space, builder.source_encoding, token.get_value()), top.lineno)) else: space = builder.space empty = space.wrap('') accum = [] for token in atoms: assert isinstance(token, TokenObject) - accum.append(parsestr(builder.space, None, token.get_value())) # XXX encoding + accum.append(parsestr(builder.space, builder.source_encoding, token.get_value())) w_s = space.call_method(empty, 'join', space.newlist(accum)) builder.push(ast.Const(w_s, top.lineno)) elif top.name == tok.BACKQUOTE: @@ -1581,7 +1581,8 @@ BaseGrammarBuilder.__init__(self, rules, debug) self.rule_stack = [] self.space = space - + self.source_encoding = None + def context(self): return AstBuilderContext(self.rule_stack) Modified: pypy/dist/pypy/interpreter/pyparser/pythonparse.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/pythonparse.py (original) +++ pypy/dist/pypy/interpreter/pyparser/pythonparse.py Tue Sep 27 14:20:36 2005 @@ -37,11 +37,9 @@ goalnumber = pysymbol.sym_values[goal] target = self.rules[goalnumber] src = Source(lines, flags) - - result = target.match(src, builder) - # XXX find a clean way to process encoding declarations builder.source_encoding = src.encoding - # + + result = target.match(src, builder) if not result: line, lineno = src.debug() # XXX needs better error messages From arigo at codespeak.net Tue Sep 27 15:01:48 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 27 Sep 2005 15:01:48 +0200 (CEST) Subject: [pypy-svn] r17901 - pypy/dist/pypy/doc Message-ID: <20050927130148.A495527B71@code1.codespeak.net> Author: arigo Date: Tue Sep 27 15:01:45 2005 New Revision: 17901 Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Log: Expanded and subsectionned and motivated a bit the text about the flow object space. Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt ============================================================================== --- pypy/dist/pypy/doc/draft-dynamic-language-translation.txt (original) +++ pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Tue Sep 27 15:01:45 2005 @@ -232,97 +232,267 @@ The Flow Object Space in our current design is responsible of constructing a flow graph for a single function using abstract -interpretation. +interpretation. The domain on which the Flow Space operates comprises +variables and constant objects. 
They are stored as such in the frame +objects without problems because by design the interpreter engine treat +them as black boxes. -Concretely the Flow Space plugs itself in the interpreter as an object -space, and supplying a derived execution context implementation. It -also wrap a fix-point loop around invocations of the frame resume -method which is forced to execute one single bytecode through -exceptions reaching this loop from the space operations' code and the -specialised execution context. - -The domain on which the Flow Space operates comprises variables and -constant objects. They are stored as such in the frame objects without -problems because by design the interpreter engine treat them -neutrally. - -The Flow Space can synthesise out of a frame content so called frame -states. Frame states described the execution state for the frame at a -given point. - -The Flow Space constructs the flow graph by creating new blocks in it, -when fresh never-seen state is reached. During construction, blocks in -the graph all have an associated frame state. The Flow Space start -from an empty block with an a frame state corresponding to setup -induced but input arguments in the form of variables and constants to -the analysed function. - -When an operation is delegated to the Flow Space by the frame -interpretation loop, either a constant result is produced, in the case -the arguments are constant and the operation doesn't have -side-effects, otherwise the operation is recorded in the current block -and a fresh new variable is returned as result. + +Construction of flow graphs +~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +Concretely, the Flow Space plugs itself in the interpreter as an object +space and supplies a derived execution context implementation. It also +wraps a fix-point loop around invocations of the frame resume method. +In our current design, this fix-point searching is implemented by +interrupting the normal interpreter loop in the frame after every +bytecode, and comparing the state with previously-seen states. These +states describe the execution state for the frame at a given point. +They are synthesised out of the frame by the Flow Space; they contain +position-dependent data (current bytecode index, current exception +handlers stack) as well as a flattened list of all variables and +constants currently handled by the frame. + +The Flow Space constructs the flow graph, operation after operation, as +a side effect of seeing these operations performed by the interpretation +of the bytecode. During construction, blocks in the graph all have an +associated frame state. The Flow Space start from an empty block with an +a frame state corresponding to a frame freshly initialized, with a new +variables for each input argument of the analysed function. It proceeds +by recording the operations in this block, as follows: when an operation +is delegated to the Flow Space by the frame interpretation loop, either +a constant result is produced -- in the case of constant arguments to an +operation with no side-effects -- or a fresh new variable is produced. +In the latter case, the operation (together with its input variables and +constant arguments, and its output variable) is recorded in the current +block and the new variable is returned as result to the frame +interpretation loop. When a new bytecode is about to be executed, as signalled by the -bytecode hook, the Flow Space considers the frame state corresponding -to the current frame contents. 
The Flow Space keeps a mapping between -byecode instructions, as their position, and frame state, block pairs. - -A union operation is defined on frame states, only two equal constants -unify to a constant of the same value, all other combinations unify -to a fresh new variable. - -If some previously associated frame state for the next byecode unifies -with the current state giving some more general state, i.e. an unequal -one, the corresponding block will be reused and reset. Otherwise a new -block is used. +bytecode hook, the Flow Space considers the frame state corresponding to +the current frame contents. This state is compared with the existing +states attached to the blocks produced so far. If the state was not +seen before, the Flow Space creates a new block in the graph. If the +same state was already seen before, then a backlink to the previous +block is inserted, and the abstract interpretation stops here. If only +a "similar enough" state was seen so far, then the current and the +previous states are merged to produce a more general state. + +In more details, "similar enough" is defined as having the same +position-dependant part, the so-called "non-mergeable frame state", +which mostly means that only frame states corresponding to the same +bytecode position can ever be merged. This process thus produces blocks +that are generally in one-to-one correspondance with the bytecode +positions seen so far. The exception to this rule is in the rare cases +where frames from the same bytecode position have a different +non-mergeable state, which typically occurs during the "finally" part of +a "try: finally:" construct, where the details of the exception handler +stack differs according to whether the "finally" part was entered +normally or as a result of an exception. + +If two states have the same non-mergeable part, they can be merged using +a "union" operation: only two equal constants unify to a constant of the +same value; all other combinations (variable-variable or +variable-constant) unify to a fresh new variable. + +In summary, if some previously associated frame state for the next +byecode can be unified with the current state, then a backlink to the +corresponding existing block is inserted; additionally, if the unified +state is strictly more general than the existing one, then the existing +block is cleared, and we proceed with the generalized state, reusing the +block. (Reusing the block avoids the proliferation of over-specific +blocks. Ror example, without this, all loops would typically have their +first pass unrolled with the first value of the counter as a constant; +instead, the second pass through the loop that the Flow Space does with +the counter generalized as a variable will reuse the same entry point +block, and any further blocks from the first pass are simply +garbage-collected.) +Branching +~~~~~~~~~ + Branching on conditions by the engine usually involves querying the -truth value of a object through the is_true space operation. This -needs special treatment to be able to capture all possible flow paths. +truth value of a object through the ``is_true`` space operation. When +this object is a variable, the result is not statically known; this +needs special treatment to be able to capture both possible flow paths. +In theory, this would require continuation support at the language level +so that we can pretend that ``is_true`` returns twice into the engine, +once for each possible answer, so that the Flow Space can record both +outcomes. 
Without proper continuations in Python, we have implemented a +more explicit scheme that we describe below. (The approach is related +to the one used in Psyco_, where continuations would be entierely +inpractical, as described in the `ACM SIGPLAN 2004 paper`_.) + +At any point in time, multiple pending blocks can be scheduled for +abstract interpretation by the Flow Space, which proceeds by picking one +of them and reconstructing a frame from the frame state associated with +the block. This frame reconstruction is actually delegated to the +block, which also returns a so-called "recorder" through which the Flow +Space will append new space operations to the block. The recorder is +also responsible for handling the ``is_true`` operation. + +A normal recorder simply appends the space operations to the block from +which it comes from. However, when it sees an ``is_true`` operation, it +creates and schedules two special blocks (one for the outcome ``True`` +and one for the outcome ``False``) which don't have an associated frame +state. The previous block is linked to the two new blocks with +conditional exits. At this point, abstract interpretation stops (i.e. +an exception is raised to interrupt the engine). + +The special blocks have no frame state, and cannot be used to setup a +frame: indeed, unlike normal blocks, which correspond to the state of +the engine between the execution of two bytecode, special blocks +correspond to a call to ``is_true`` issued the engine. The details of +the engine state (internal call stack and local variables) are not +available at this point. + +However, it is still possible to put the engine back into the state +where it was calling ``is_true``. This is what occurs later on, when +one of the special block is scheduled for further execution: the block +considers its previous block, and possibly its previous block's previous +block, and so on up to the first normal block. As we can see, these +blocks form a binary tree of special blocks with a normal block at the +root. A special block thus corresponds to a branch in the tree, whose +path is described by a list of outcomes -- a list of boolean values. We +can thus restore the state of any block by starting from the root and +asking the engine to replay the execution from there; intermediate +``is_true`` calls issued by the engine are answered according to the +list of outcomes until we reach the desired state. + +This is implemented by having a special blocks (called ``EggBlocks`` +internally, whereas normal blocks are ``SpamBlocks``) return a chain of +recorders: one so-called "replaying" recorder for each of the parent +blocks in the tree, followed by a normal recorder for the block itself. +When the engine replays the execution from the root of the tree, the +intermediate recorders check (for consistency) that the same operations +as the ones already recorded are issued again, ending in a call to +``is_true``; at this point, the replaying recorder gives the answer +corresponding to the branch to follow, and switch to the next recorder +in the chain. + +This mechanism ensures that all flow paths are considered, including +different flow paths inside the engine and not only flow paths that are +explicit in the bytecode. 
For example, an ``UNPACK_SEQUENCE`` bytecode +in the engine iterates over a sequence object and checks that it +produces exactly the expected number of values; so the single bytecode +``UNPACK_SEQUENCE n`` generates a tree with ``n+1`` branches +corresponding to the ``n+1`` times the engine asks the iterator if it +has more elements to produce. A simpler example is a conditional jump, +which will generate a pair of special blocks for the ``is_true``, each +of which consisting only in a jump to the normal block corresponding to +the next bytecode -- either the one following the conditional jump, or +the target of the jump, depending on whether the replayer answered +``False`` or ``True`` to the ``is_true``. + +Note a limitation of this mechanism: the engine cannot use an unbounded +loop to implement a single bytecode. All *loops* must still be +explicitly present in the bytecodes. The reason is that the Flow Space +can only insert backlinks between bytecodes. -Multiple pending blocks can scheduled for abstract interpretation by -the flow space, which proceeds picking one and reconstructing the -abstract execution frame from the frame state associated with the -block. The frame is what will be operated on, its setup is delegated -to the block and based on the state, the frame setup by the block also -returns a so called recorder through which, and not directly the -block, appending of new space operations to the block will be -delegated. What to do when an is_true operation is about to be -executed is also responsability to the recorder. - -The normal recorder when an is_true operation is encountered will -create and schedule special blocks which don't have an associated -frame state, but the previous block ending in the is_true operation -and an outcome, either True or False. - -The special branching blocks when about to be executed, will use the -chain of previous blocks, and consider all of them up to the first -non-special branching block included, the state of this one block will -be used to setup the frame for execution and a chain of so called -replaying recorders setup except for the scheduled branching block -which gets a normal recorder. The outcome registered in each special -block in the chain will be associated with the replayer for the -previous block. - -The replaying recorders will sanity check that the same operations are -appended by comparing the previous contents of the blocks -re-encountered by execution and on is_true operation will deliver the -outcome associated with them on construction. -All this mechanism ensures that all flow paths are considered. +Dynamic merging +~~~~~~~~~~~~~~~ +For simplicity, we have so far omitted a point in the description of how +frame states are associated to blocks. In our implementation, there is +not necessarily a block corresponding to each bytecode position (or more +precisely each non-mergeable state): we avoid creating blocks at all if +they would stay empty. This is done by tentatively running the engine +on a given frame state and seeing if it creates at least one operation; +if it does not, then we simply continue with the new frame state without +having created a block for the previous frame state. The previous frame +state is discarded without having even tried to compare it with +already-seen state to see if it merges. + +The effect of this is that merging only occurs at the beginning of a +bytecode that actually produces an operation. 
This allows some amount +of constant-folding: for example, the two functions below produce the +same flow graph:: + + def f(n): def g(n): + if n < 0: if n < 0: + n = 0 return 1 + return n+1 else: + return n+1 + +because the two branches of the condition are not merged where the +``if`` statement syntactically ends: the ``True`` branch propagates a +constant zero in the local variable ``n``, and the following addition is +constant-folded and does not generate a residual operation. + +Note that this feature means that the Flow Space is not guaranteed to +terminate. The analysed function can contain arbitrary computations on +constant values (with loops) that will be entierely constant-folded by +the Flow Space. A function with an obvious infinite loop will send the +Flow Space following the loop ad infinitum. This means that it is +difficult to give precise conditions for when the Flow Space terminates +and which complexity it has. Informally, "reasonable" functions should +not create problems: it is uncommon for a function to perform +non-trivial constant computations at run-time; and the complexity of the +Flow Space can more or less be bound by the run-time complexity of the +constant parts of the function itself, if we ignore pathological cases +where a part of a function contains infinite loops but cannot be entered +at run-time for some reasons unknown to the Flow Space. -XXX non mergeable data, details -XXX termination for "reasonable" terminating programs Geninterp ~~~~~~~~~ -YYY - -YYY dynamic merging good for geninterp +Introducing `Dynamic merging`_ can be seen as a practical move: it does +not, in practice, prevent even large functions to be analysed reasonably +quickly, and it is useful to simplify the flow graphs of some functions. +This is specially true for functions that are themselves automatically +generated. + +In the PyPy interpreter, for convenience, some of the core functionality +has been written as application-level Python code, which means that the +interpreter will consider some core operations as calls to further +application-level code. This has, of course, a performance hit due to +the interpretation overhead. To minimize this overhead, we +automatically turn some of this application-level code into +interpreter-level code, as follows. Consider the following trivial +example function at application-level:: + + def f_app(n): + return n+1 + +Interpreting it, the engine just issues an ``add`` operation on the +object space, which means that it is mostly equivalent to the following +interpreter-level function:: + + def f_interp(space, wrapped_n): + return space.add(wrapped_n, wrapped_1) + +The translation from ``f_app`` to ``f_interp`` can be done automatically +by using the Flow Space as well: we produce the flow graph of ``f_app`` +using the techniques described above, and then we turn the resulting +flow graph into ``f_interp`` by generating for each operation a call to +the corresponding method of ``space``. + +This process looses the original syntactic structure of ``f_app``, +though; the flow graph is merely a collection of blocks that jump to +each other. It is not always easy to reconstruct the structure from the +graph (or even possible at all, in some cases where the flow graph does +not exactly follow the bytecode). So, as is common for code generators, +we use a workaround to the absence of explicit gotos:: + + def f_interp(...): + next_block = 0 + while True: + + if next_block == 0: + ... + next_block = 1 + + if next_block == 1: + ... 
+ +This produces Python code that is particularly sub-efficient when it is +interpreted; however, if it is further re-analysed by the Flow Space, +dynamic merging will ensure that ``next_block`` will always be +constant-folded away, instead of having the various possible values of +``next_block`` be merged at the beginning of the loop. Annotator @@ -405,6 +575,59 @@ :: + ____________ Top ___________ + / / | \ \ + / / | \ \ + / / | | \ + / NullableStr | | | + Int / \ | (lists) | + / Str \ (instances) | (pbcs) + NonNegInt \ \ \ | | + \ Char \ \ / / + Bool \ \ \ / / + \ \ `----- None -----' + \ \ / + \ \ / + `--------`-- Bottom + + + Top + | + | + | + NuInst(object) + / / \ + Inst(object) / \ + / \ / \ + / \/ \ + / /\ \ + / / \ \ + / / \ \ + / NuInst(cls2) \ NuInst(cls1) + / / \ \ / / + Inst(cls2) \ Inst(cls1) / + \ \ / / + \ \ / / + \ \/ / + \ /\ / + \ / None + \ / / + Bottom + + + + __________________ Top __________________ + / / / \ \ \ + / / / \ \ \ + / / / \ \ \ + List(v_1) ... ... ... List(v_n) + \ \ \ / / / + \ \ \ / / / + \ \ \ / / / + '------------'--- None ----'------------' + + + Bot Top @@ -412,13 +635,9 @@ Int NonNegInt - - UnsignedInt Bool - - Float - + Str NullableStr @@ -616,6 +835,7 @@ .. _`Flow Object Space`: objspace.html#the-flow-object-space .. _`Standard Object Space`: objspace.html#the-standard-object-space .. _Psyco: http://psyco.sourceforge.net/ +.. _`ACM SIGPLAN 2004 paper`: http://psyco.sourceforge.net/psyco-pepm-a.ps.gz .. _`Hindley-Milner`: http://en.wikipedia.org/wiki/Hindley-Milner_type_inference .. include:: _ref.txt From ericvrp at codespeak.net Tue Sep 27 15:20:42 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Tue, 27 Sep 2005 15:20:42 +0200 (CEST) Subject: [pypy-svn] r17902 - pypy/dist/pypy/translator/c Message-ID: <20050927132042.193CB27B82@code1.codespeak.net> Author: ericvrp Date: Tue Sep 27 15:20:41 2005 New Revision: 17902 Modified: pypy/dist/pypy/translator/c/gc.py Log: This should only be set when using Boehm with threads. Modified: pypy/dist/pypy/translator/c/gc.py ============================================================================== --- pypy/dist/pypy/translator/c/gc.py (original) +++ pypy/dist/pypy/translator/c/gc.py Tue Sep 27 15:20:41 2005 @@ -355,9 +355,9 @@ return ['gc'] # xxx on windows? def pre_pre_gc_code(self): - if sys.platform == "linux2": - yield "#define _REENTRANT 1" - yield "#define GC_LINUX_THREADS 1" + #if sys.platform == "linux2": + # yield "#define _REENTRANT 1" + # yield "#define GC_LINUX_THREADS 1" yield '#include ' yield '#define USING_BOEHM_GC' From ericvrp at codespeak.net Tue Sep 27 15:33:12 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Tue, 27 Sep 2005 15:33:12 +0200 (CEST) Subject: [pypy-svn] r17903 - in pypy/dist/pypy/translator: backendopt backendopt/test llvm llvm/backendopt llvm/module Message-ID: <20050927133312.47CC827B88@code1.codespeak.net> Author: ericvrp Date: Tue Sep 27 15:33:10 2005 New Revision: 17903 Added: pypy/dist/pypy/translator/backendopt/removezerobytemalloc.py pypy/dist/pypy/translator/backendopt/test/test_removezerobytemalloc.py pypy/dist/pypy/translator/llvm/backendopt/mergemallocs.py pypy/dist/pypy/translator/llvm/backendopt/removeexcmallocs.py Modified: pypy/dist/pypy/translator/backendopt/all.py pypy/dist/pypy/translator/llvm/exception.py pypy/dist/pypy/translator/llvm/funcnode.py pypy/dist/pypy/translator/llvm/genllvm.py pypy/dist/pypy/translator/llvm/module/extfunction.py Log: working on experimental transformations. disabled for now! 
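The most interesting of the (for now disabled) transformations below is the exception-malloc removal: short-lived exception instances are meant to be carved out of a fixed-size ring buffer instead of being heap-allocated, but the patch only sketches the pointer arithmetic in commented-out pseudo-LLVM. The following is a rough Python model of that bump-and-wrap allocation, purely for illustration: the constants mirror the ones added to exception.py, and the real code would mask the ring-buffer pointer itself rather than an integer offset::

    ENTRY_MAXSIZE = 16        # cf. RINGBUFFER_ENTRY_MAXSIZE in exception.py
    N_ENTRIES = 1024          # cf. RINGBUGGER_N_ENTRIES in exception.py
    BUFFER_SIZE = ENTRY_MAXSIZE * N_ENTRIES   # a power of two

    _next_offset = 0          # cursor into the ring buffer

    def allocate_exception_slot():
        # Hand out the next fixed-size slot; wrapping around is a single
        # bit-mask because BUFFER_SIZE is a power of two.  Slots are
        # recycled, which is why the docstring below warns against keeping
        # an exception instance alive long after it was raised.
        global _next_offset
        offset = _next_offset
        _next_offset = (offset + ENTRY_MAXSIZE) & (BUFFER_SIZE - 1)
        return offset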
Modified: pypy/dist/pypy/translator/backendopt/all.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/all.py (original) +++ pypy/dist/pypy/translator/backendopt/all.py Tue Sep 27 15:33:10 2005 @@ -4,23 +4,34 @@ from pypy.translator.backendopt.malloc import remove_simple_mallocs from pypy.translator.backendopt.ssa import SSI_to_SSA from pypy.translator.backendopt.propagate import propagate_all +from pypy.translator.backendopt.removezerobytemalloc import remove_zero_byte_mallocs from pypy.translator import simplify def backend_optimizations(translator, inline_threshold=1, mallocs=True, ssa_form=True, - propagate=False): + propagate=False, + removezerobytemallocs=False): # remove obvious no-ops for graph in translator.flowgraphs.values(): remove_same_as(graph) simplify.eliminate_empty_blocks(graph) simplify.transform_dead_op_vars(graph, translator) - # inline functions in each other + + # remove allocation of empty structs + if removezerobytemallocs: + for graph in translator.flowgraphs.values(): + remove_zero_byte_mallocs(graph) + + # ... if propagate: propagate_all(translator) + + # inline functions in each other if inline_threshold: auto_inlining(translator, inline_threshold) + # vaporize mallocs if mallocs: for graph in translator.flowgraphs.values(): Added: pypy/dist/pypy/translator/backendopt/removezerobytemalloc.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/backendopt/removezerobytemalloc.py Tue Sep 27 15:33:10 2005 @@ -0,0 +1,16 @@ +from pypy.objspace.flow.model import Constant, Block, flatten +from pypy.objspace.flow.model import SpaceOperation +from pypy.rpython import lltype + +def remove_zero_byte_mallocs(graph): + blocks = [x for x in flatten(graph) if isinstance(x, Block)] + for block in blocks: + for i, op in enumerate(block.operations): + if op.opname != 'malloc': + continue + arg = op.args[0].value + if True: #isinstance(arg, lltype.Struct) and arg._names_without_voids() == []: + print 'remove_zero_byte_mallocs: removed malloc(%s) from previous line' % arg + nullresulttype = op.result.concretetype + nullresult = Constant(nullresulttype._defl(), nullresulttype) + block.operations[i] = SpaceOperation('cast_null_to_ptr', [nullresult], op.result) Added: pypy/dist/pypy/translator/backendopt/test/test_removezerobytemalloc.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/backendopt/test/test_removezerobytemalloc.py Tue Sep 27 15:33:10 2005 @@ -0,0 +1,21 @@ +from pypy.objspace.flow.model import traverse, Block, Link, Variable, Constant +from pypy.translator.backendopt.removezerobytemalloc import remove_zero_byte_mallocs +from pypy.translator.translator import Translator +from pypy.rpython.llinterp import LLInterpreter +from pypy.translator.test.snippet import is_perfect_number + +def test_removezerobytemalloc(): + x = () + def func2(q): + return q + def zerobytemalloc(): + y = func2(x) + return len(x) + t = Translator(zerobytemalloc) + a = t.annotate([]) + t.specialize() + remove_zero_byte_mallocs(t.flowgraphs[zerobytemalloc]) + #t.view() + lli = LLInterpreter(t.flowgraphs, t.rtyper) + res = lli.eval_function(zerobytemalloc, ()) + assert res == 0 Added: pypy/dist/pypy/translator/llvm/backendopt/mergemallocs.py ============================================================================== --- (empty file) +++ 
pypy/dist/pypy/translator/llvm/backendopt/mergemallocs.py Tue Sep 27 15:33:10 2005 @@ -0,0 +1,22 @@ +from pypy.objspace.flow.model import Block, flatten, SpaceOperation + + +def merge_mallocs(translator, graph): + """Merge all mallocs of identical in a block into one. + Thus all mallocs of atomic data are merged and all mallocs of + non-atomic data are also merged into one. This reasoning behind this is + that data allocated in the same block will probably have about the same + livespan. So we hope that this does not increase the memory appetite + of your program by much. + + warning: some will consider this a dirty hack, that's ok! :) + """ + blocks = [x for x in flatten(graph) if isinstance(x, Block)] + for block in blocks: + n_mallocs_in_block = 0 + for op in block.operations: + if op.opname != 'malloc': + continue + n_mallocs_in_block += 1 + if n_mallocs_in_block >= 2: + print 'merge_mallocs: n_mallocs_in_block=%d' % n_mallocs_in_block Added: pypy/dist/pypy/translator/llvm/backendopt/removeexcmallocs.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/translator/llvm/backendopt/removeexcmallocs.py Tue Sep 27 15:33:10 2005 @@ -0,0 +1,37 @@ +from pypy.objspace.flow.model import Block, Constant, flatten, SpaceOperation +from pypy.translator.backendopt.inline import _find_exception_type + + +def _llvm_structsize(struct): + #XXX TODO take a save guess + return 16 + +def remove_exception_mallocs(translator, graph, ringbuffer_entry_maxsize=16, ringbuffer_n_entries=1024): + """Remove mallocs that occur because an exception is raised. + Typically this data is shortlived and occuring often in highlevel + languages like Python. So it would be preferable if we would not need + to call a malloc function. We can not allocate the data on the stack + because a global pointer (last_exception_type) is pointing to it. + + Here we use a ringbuffer of fixed size to contain exception instances. + Our ringbuffer entries have fixed (maximum)size so all malloc over that + amount are not affected by this code. + + warning: this code will not work when your code references + an exception instance 'long' after it has been raised. 
+ """ + blocks = [x for x in flatten(graph) if isinstance(x, Block)] + for block in blocks: + ops = block.operations + if (len(ops) < 3 or + ops[0].opname != 'malloc' or ops[1].opname != 'cast_pointer' or + ops[2].opname != 'setfield' or ops[2].args[1].value != 'typeptr' or + not isinstance(ops[2].args[1], Constant) or + _llvm_structsize(ops[0].args[0]) > ringbuffer_entry_maxsize): #todo: ops[2].args[2] to vtable + continue + print 'remove_exception_malloc: ', str(ops[0].args[0]), ops[2].args[2] + #ops = [SpaceOperation('ops[0].result = load sbyte** %exception_ringbuffer'), + # SpaceOperation('%tmpptr.0 = add sbyte* ops[0].result, ringbuffer_entry_maxsize'), + # SpaceOperation('%tmpptr.1 = and sbyte* tmpptr.0, ~(ringbuffer_n_entries*ringbuffer_entry_maxsize)'), + # SpaceOperation('store sbyte* %tmpptr.1, sbyte** %exception_ringbuffer), + # ops[1:]] Modified: pypy/dist/pypy/translator/llvm/exception.py ============================================================================== --- pypy/dist/pypy/translator/llvm/exception.py (original) +++ pypy/dist/pypy/translator/llvm/exception.py Tue Sep 27 15:33:10 2005 @@ -2,6 +2,9 @@ class ExceptionPolicy: + RINGBUFFER_ENTRY_MAXSIZE = 16 + RINGBUGGER_N_ENTRIES = 1024 + def __init__(self): raise Exception, 'ExceptionPolicy should not be used directly' Modified: pypy/dist/pypy/translator/llvm/funcnode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/funcnode.py (original) +++ pypy/dist/pypy/translator/llvm/funcnode.py Tue Sep 27 15:33:10 2005 @@ -5,6 +5,8 @@ from pypy.translator.llvm.node import LLVMNode, ConstantLLVMNode from pypy.translator.llvm.opwriter import OpWriter from pypy.translator.llvm.log import log +from pypy.translator.llvm.backendopt.removeexcmallocs import remove_exception_mallocs +from pypy.translator.llvm.backendopt.mergemallocs import merge_mallocs from pypy.translator.unsimplify import remove_double_links log = log.funcnode @@ -38,6 +40,8 @@ self.ref = self.make_ref('%pypy_', value.graph.name) self.graph = value.graph self.db.genllvm.exceptionpolicy.transform(self.db.translator, self.graph) + #remove_exception_mallocs(self.db.translator, self.graph) + #merge_mallocs(self.db.translator, self.graph) remove_double_links(self.db.translator, self.graph) def __str__(self): Modified: pypy/dist/pypy/translator/llvm/genllvm.py ============================================================================== --- pypy/dist/pypy/translator/llvm/genllvm.py (original) +++ pypy/dist/pypy/translator/llvm/genllvm.py Tue Sep 27 15:33:10 2005 @@ -253,7 +253,7 @@ a = t.annotate(annotation) a.simplify() t.specialize() - t.backend_optimizations(ssa_form=False) + t.backend_optimizations(ssa_form=False, propagate=False, removezerobytemallocs=False) if view: #note: this is without policy transforms t.view() return genllvm(t, **kwds) Modified: pypy/dist/pypy/translator/llvm/module/extfunction.py ============================================================================== --- pypy/dist/pypy/translator/llvm/module/extfunction.py (original) +++ pypy/dist/pypy/translator/llvm/module/extfunction.py Tue Sep 27 15:33:10 2005 @@ -1,6 +1,9 @@ extdeclarations = ''' %last_exception_type = internal global %RPYTHON_EXCEPTION_VTABLE* null %last_exception_value = internal global %RPYTHON_EXCEPTION* null + +%exception_ringbuffer = internal global [8192 x sbyte] zeroinitializer +%exception_ringbuffer_index = internal global int 0 ''' extfunctions = {} #dependencies, llvm-code From ericvrp at codespeak.net 
Tue Sep 27 15:48:49 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Tue, 27 Sep 2005 15:48:49 +0200 (CEST) Subject: [pypy-svn] r17904 - pypy/dist/pypy/translator/backendopt Message-ID: <20050927134849.B1DEC27B8E@code1.codespeak.net> Author: ericvrp Date: Tue Sep 27 15:48:49 2005 New Revision: 17904 Modified: pypy/dist/pypy/translator/backendopt/removezerobytemalloc.py Log: use cast_pointer which is used more often already Modified: pypy/dist/pypy/translator/backendopt/removezerobytemalloc.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/removezerobytemalloc.py (original) +++ pypy/dist/pypy/translator/backendopt/removezerobytemalloc.py Tue Sep 27 15:48:49 2005 @@ -9,8 +9,8 @@ if op.opname != 'malloc': continue arg = op.args[0].value - if True: #isinstance(arg, lltype.Struct) and arg._names_without_voids() == []: + if isinstance(arg, lltype.Struct) and arg._names_without_voids() == []: print 'remove_zero_byte_mallocs: removed malloc(%s) from previous line' % arg nullresulttype = op.result.concretetype nullresult = Constant(nullresulttype._defl(), nullresulttype) - block.operations[i] = SpaceOperation('cast_null_to_ptr', [nullresult], op.result) + block.operations[i] = SpaceOperation('cast_pointer', [nullresult], op.result) From pedronis at codespeak.net Tue Sep 27 16:40:20 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 27 Sep 2005 16:40:20 +0200 (CEST) Subject: [pypy-svn] r17905 - in pypy/dist/pypy: rpython tool Message-ID: <20050927144020.D05B527B97@code1.codespeak.net> Author: pedronis Date: Tue Sep 27 16:40:18 2005 New Revision: 17905 Modified: pypy/dist/pypy/rpython/rbool.py pypy/dist/pypy/rpython/rfloat.py pypy/dist/pypy/rpython/rint.py pypy/dist/pypy/rpython/rmodel.py pypy/dist/pypy/rpython/rtyper.py pypy/dist/pypy/tool/ansi_print.py Log: convert print in rtyper to use logging Modified: pypy/dist/pypy/rpython/rbool.py ============================================================================== --- pypy/dist/pypy/rpython/rbool.py (original) +++ pypy/dist/pypy/rpython/rbool.py Tue Sep 27 16:40:18 2005 @@ -3,10 +3,9 @@ from pypy.rpython.lltype import Signed, Unsigned, Bool, Float, pyobjectptr from pypy.rpython.rmodel import Repr, TyperError, IntegerRepr, BoolRepr from pypy.rpython.robject import PyObjRepr, pyobj_repr +from pypy.rpython.rmodel import log -debug = False - class __extend__(annmodel.SomeBool): def rtyper_makerepr(self, rtyper): return bool_repr @@ -44,10 +43,10 @@ class __extend__(pairtype(BoolRepr, IntegerRepr)): def convert_from_to((r_from, r_to), v, llops): if r_from.lowleveltype == Bool and r_to.lowleveltype == Unsigned: - if debug: print 'explicit cast_bool_to_uint' + log.debug('explicit cast_bool_to_uint') return llops.genop('cast_bool_to_uint', [v], resulttype=Unsigned) if r_from.lowleveltype == Bool and r_to.lowleveltype == Signed: - if debug: print 'explicit cast_bool_to_int' + log.debug('explicit cast_bool_to_int') return llops.genop('cast_bool_to_int', [v], resulttype=Signed) return NotImplemented Modified: pypy/dist/pypy/rpython/rfloat.py ============================================================================== --- pypy/dist/pypy/rpython/rfloat.py (original) +++ pypy/dist/pypy/rpython/rfloat.py Tue Sep 27 16:40:18 2005 @@ -8,8 +8,8 @@ from pypy.rpython.rstr import STR from pypy.rpython.lltype import functionptr, FuncType, malloc from pypy.rpython import rstr +from pypy.rpython.rmodel import log -debug = False class 
__extend__(annmodel.SomeFloat): def rtyper_makerepr(self, rtyper): @@ -166,17 +166,17 @@ class __extend__(pairtype(IntegerRepr, FloatRepr)): def convert_from_to((r_from, r_to), v, llops): if r_from.lowleveltype == Unsigned and r_to.lowleveltype == Float: - if debug: print 'explicit cast_uint_to_float' + log.debug('explicit cast_uint_to_float') return llops.genop('cast_uint_to_float', [v], resulttype=Float) if r_from.lowleveltype == Signed and r_to.lowleveltype == Float: - if debug: print 'explicit cast_int_to_float' + log.debug('explicit cast_int_to_float') return llops.genop('cast_int_to_float', [v], resulttype=Float) return NotImplemented class __extend__(pairtype(BoolRepr, FloatRepr)): def convert_from_to((r_from, r_to), v, llops): if r_from.lowleveltype == Bool and r_to.lowleveltype == Float: - if debug: print 'explicit cast_bool_to_float' + log.debug('explicit cast_bool_to_float') return llops.genop('cast_bool_to_float', [v], resulttype=Float) return NotImplemented Modified: pypy/dist/pypy/rpython/rint.py ============================================================================== --- pypy/dist/pypy/rpython/rint.py (original) +++ pypy/dist/pypy/rpython/rint.py Tue Sep 27 16:40:18 2005 @@ -7,10 +7,9 @@ inputconst from pypy.rpython.robject import PyObjRepr, pyobj_repr from pypy.rpython.rarithmetic import intmask, r_uint +from pypy.rpython.rmodel import log -debug = False - class __extend__(annmodel.SomeInteger): def rtyper_makerepr(self, rtyper): if self.unsigned: @@ -29,10 +28,10 @@ def convert_from_to((r_from, r_to), v, llops): if r_from.lowleveltype == Signed and r_to.lowleveltype == Unsigned: - if debug: print 'explicit cast_int_to_uint' + log.debug('explicit cast_int_to_uint') return llops.genop('cast_int_to_uint', [v], resulttype=Unsigned) if r_from.lowleveltype == Unsigned and r_to.lowleveltype == Signed: - if debug: print 'explicit cast_uint_to_int' + log.debug('explicit cast_uint_to_int') return llops.genop('cast_uint_to_int', [v], resulttype=Signed) return NotImplemented Modified: pypy/dist/pypy/rpython/rmodel.py ============================================================================== --- pypy/dist/pypy/rpython/rmodel.py (original) +++ pypy/dist/pypy/rpython/rmodel.py Tue Sep 27 16:40:18 2005 @@ -4,7 +4,6 @@ from pypy.rpython.lltype import Void, Bool, Float, Signed, Char, UniChar from pypy.rpython.lltype import typeOf, LowLevelType, Ptr, PyObject from pypy.rpython.lltype import FuncType, functionptr, cast_ptr_to_int -from pypy.tool.ansi_print import ansi_print from pypy.rpython.error import TyperError, MissingRTypeOperation # initialization states for Repr instances @@ -322,12 +321,24 @@ _callable = getattr(graphfunc, '_specializedversionof_', graphfunc) return functionptr(FT, graphfunc.func_name, graph = graph, _callable = _callable) -def warning(msg): - ansi_print("*** WARNING: %s" % (msg,), esc="31") # RED - - def needsgc(classdef, nogc=False): if classdef is None: return not nogc else: return getattr(classdef.cls, '_alloc_flavor_', 'gc').startswith('gc') + +# logging/warning + +import py +from pypy.tool.ansi_print import ansi_log + +log = py.log.Producer("rtyper") +py.log.setconsumer("rtyper", ansi_log) +#py.log.setconsumer("rtyper translating", None) +#py.log.setconsumer("rtyper debug", None) + +def warning(msg): + log.WARNING(msg) + + + Modified: pypy/dist/pypy/rpython/rtyper.py ============================================================================== --- pypy/dist/pypy/rpython/rtyper.py (original) +++ pypy/dist/pypy/rpython/rtyper.py Tue Sep 27 16:40:18 
2005 @@ -31,8 +31,7 @@ from pypy.rpython.annlowlevel import annotate_lowlevel_helper from pypy.rpython.exceptiondata import ExceptionData -log = py.log.Producer("rtyper") -py.log.setconsumer("rtyper", None) +from pypy.rpython.rmodel import log class RPythonTyper: @@ -58,9 +57,7 @@ try: self.seed = int(os.getenv('RTYPERSEED')) s = 'Using %d as seed for block shuffling' % self.seed - print '*' * len(s) - print s - print '*' * len(s) + log.info(s) except: self.seed = 0 self.order = None @@ -69,9 +66,7 @@ order_module = RTYPERORDER.split(',')[0] self.order = __import__(order_module, {}, {}, ['*']).order s = 'Using %s.%s for order' % (self.order.__module__, self.order.__name__) - print '*' * len(s) - print s - print '*' * len(s) + log.info(s) self.crash_on_first_typeerror = True def add_pendingsetup(self, repr): @@ -157,20 +152,20 @@ error_report = " but %d errors" % self.typererror_count else: error_report = '' - print 'specializing: %d / %d blocks (%d%%)%s' % ( - n, total, 100 * n // total, error_report) + log.event('specializing: %d / %d blocks (%d%%)%s' % ( + n, total, 100 * n // total, error_report)) # make sure all reprs so far have had their setup() called self.call_all_setups() if self.typererrors: - self.dump_typererrors() + self.dump_typererrors(to_log=True) raise TyperError("there were %d error" % len(self.typererrors)) # make sure that the return variables of all graphs are concretetype'd for graph in self.annotator.translator.flowgraphs.values(): v = graph.getreturnvar() self.setconcretetype(v) - def dump_typererrors(self, num=None, minimize=True): + def dump_typererrors(self, num=None, minimize=True, to_log=False): c = 0 bc = 0 for err in self.typererrors[:num]: @@ -184,11 +179,19 @@ func = "(%s:%s)" %(func.__module__ or '?', func.__name__) else: func = "(?:?)" - print "TyperError-%d: %s" % (c, func) - print str(err) - print "" - if bc: - print "(minimized %d errors away for this dump)" % (bc,) + errmsg = ("TyperError-%d: %s" % (c, func) + + str(err) + + "\n") + if to_log: + log.ERROR(errmsg) + else: + print errmsg + if bc: + minmsg = "(minimized %d errors away for this dump)" % (bc,) + if to_log: + log.ERROR(minmsg) + else: + print minmsg def call_all_setups(self): # make sure all reprs so far have had their setup() called Modified: pypy/dist/pypy/tool/ansi_print.py ============================================================================== --- pypy/dist/pypy/tool/ansi_print.py (original) +++ pypy/dist/pypy/tool/ansi_print.py Tue Sep 27 16:40:18 2005 @@ -29,6 +29,7 @@ 'WARNING': ((31,), False), 'event': ((1,), True), 'ERROR': ((1, 31), False), + 'info': ((32,), False), } def __init__(self, kw_to_color={}, file=None): From pedronis at codespeak.net Tue Sep 27 16:42:50 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 27 Sep 2005 16:42:50 +0200 (CEST) Subject: [pypy-svn] r17906 - pypy/dist/pypy/rpython Message-ID: <20050927144250.E718B27B98@code1.codespeak.net> Author: pedronis Date: Tue Sep 27 16:42:50 2005 New Revision: 17906 Modified: pypy/dist/pypy/rpython/rmodel.py Log: these were not meant to stay commented out Modified: pypy/dist/pypy/rpython/rmodel.py ============================================================================== --- pypy/dist/pypy/rpython/rmodel.py (original) +++ pypy/dist/pypy/rpython/rmodel.py Tue Sep 27 16:42:50 2005 @@ -334,8 +334,8 @@ log = py.log.Producer("rtyper") py.log.setconsumer("rtyper", ansi_log) -#py.log.setconsumer("rtyper translating", None) -#py.log.setconsumer("rtyper debug", None) +py.log.setconsumer("rtyper 
translating", None) +py.log.setconsumer("rtyper debug", None) def warning(msg): log.WARNING(msg) From bert at codespeak.net Tue Sep 27 16:54:27 2005 From: bert at codespeak.net (bert at codespeak.net) Date: Tue, 27 Sep 2005 16:54:27 +0200 (CEST) Subject: [pypy-svn] r17907 - pypy/extradoc/sprintinfo Message-ID: <20050927145427.AEEC627B93@code1.codespeak.net> Author: bert Date: Tue Sep 27 16:54:23 2005 New Revision: 17907 Modified: pypy/extradoc/sprintinfo/paris-2005-people.txt Log: added myself to list Modified: pypy/extradoc/sprintinfo/paris-2005-people.txt ============================================================================== --- pypy/extradoc/sprintinfo/paris-2005-people.txt (original) +++ pypy/extradoc/sprintinfo/paris-2005-people.txt Tue Sep 27 16:54:23 2005 @@ -20,6 +20,7 @@ Samuele Pedroni ? ? Holger Krekel ? ? Michael Hudson ? ? +Bert Freudenberg 09/10 - 17/10 ? =================== ============== ===================== People on the following list are likely to come and were From jum at codespeak.net Tue Sep 27 18:22:26 2005 From: jum at codespeak.net (jum at codespeak.net) Date: Tue, 27 Sep 2005 18:22:26 +0200 (CEST) Subject: [pypy-svn] r17912 - pypy/dist/pypy/doc Message-ID: <20050927162226.09FAD27B93@code1.codespeak.net> Author: jum Date: Tue Sep 27 18:22:26 2005 New Revision: 17912 Modified: pypy/dist/pypy/doc/svn-help.txt Log: Updated the local build of subversion for OS X to the latest version. Modified: pypy/dist/pypy/doc/svn-help.txt ============================================================================== --- pypy/dist/pypy/doc/svn-help.txt (original) +++ pypy/dist/pypy/doc/svn-help.txt Tue Sep 27 18:22:26 2005 @@ -150,7 +150,7 @@ .. _website: http://support.microsoft.com/default.aspx?scid=kb%3Ben-us%3B259403 .. _GUI: http://tortoisesvn.tigris.org/servlets/ProjectDocumentList?folderID=616 -.. _MacOS: http://codespeak.net/~jum/svn-1.1.3-darwin-ppc.tar.gz +.. _MacOS: http://codespeak.net/~jum/svn-1.2.3-darwin-ppc.tar.gz .. _versions: http://subversion.tigris.org/project_packages.html .. _Win: http://www.microsoft.com/downloads/details.aspx?displaylang=en&FamilyID=4B6140F9-2D36-4977-8FA1-6F8A0F5DCA8F .. _guide: http://svnbook.red-bean.com/book.html#svn-ch-1 From bea at codespeak.net Tue Sep 27 18:30:08 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Tue, 27 Sep 2005 18:30:08 +0200 (CEST) Subject: [pypy-svn] r17913 - pypy/extradoc/sprintinfo Message-ID: <20050927163008.15CD627B97@code1.codespeak.net> Author: bea Date: Tue Sep 27 18:30:06 2005 New Revision: 17913 Modified: pypy/extradoc/sprintinfo/paris-2005-people.txt Log: My travel dates - will bring Sten and Simone as well Modified: pypy/extradoc/sprintinfo/paris-2005-people.txt ============================================================================== --- pypy/extradoc/sprintinfo/paris-2005-people.txt (original) +++ pypy/extradoc/sprintinfo/paris-2005-people.txt Tue Sep 27 18:30:06 2005 @@ -15,7 +15,7 @@ Adrien Di Mascio 10/10 - 16/10 Private Jacob Hallen ? ? Laura Creighton ? ? -Beatrice Duering ? ? +Beatrice Duering 9/10 - 17/10 flat Armin Rigo ? ? Samuele Pedroni ? ? Holger Krekel ? ? 
From cfbolz at codespeak.net Tue Sep 27 18:53:51 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Tue, 27 Sep 2005 18:53:51 +0200 (CEST) Subject: [pypy-svn] r17915 - pypy/extradoc/sprintinfo Message-ID: <20050927165351.8995527B95@code1.codespeak.net> Author: cfbolz Date: Tue Sep 27 18:53:50 2005 New Revision: 17915 Modified: pypy/extradoc/sprintinfo/paris-2005-people.txt Log: add my info Modified: pypy/extradoc/sprintinfo/paris-2005-people.txt ============================================================================== --- pypy/extradoc/sprintinfo/paris-2005-people.txt (original) +++ pypy/extradoc/sprintinfo/paris-2005-people.txt Tue Sep 27 18:53:50 2005 @@ -20,6 +20,7 @@ Samuele Pedroni ? ? Holger Krekel ? ? Michael Hudson ? ? +Carl Friedrich Bolz 7/10? - 16/10 flat? Bert Freudenberg 09/10 - 17/10 ? =================== ============== ===================== @@ -29,7 +30,6 @@ =================== ============== ===================== Name Arrive/Depart Accomodation =================== ============== ===================== -Carl Friedrich Bolz ? ? Niklaus Haldimann ? ? Eric van Riet Paap ? ? Richard Emslie ? ? From hpk at codespeak.net Tue Sep 27 19:04:31 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Tue, 27 Sep 2005 19:04:31 +0200 (CEST) Subject: [pypy-svn] r17916 - pypy/extradoc/sprintinfo Message-ID: <20050927170431.D8C4327B88@code1.codespeak.net> Author: hpk Date: Tue Sep 27 19:04:30 2005 New Revision: 17916 Modified: pypy/extradoc/sprintinfo/paris-2005-people.txt Log: my and lene's dates Modified: pypy/extradoc/sprintinfo/paris-2005-people.txt ============================================================================== --- pypy/extradoc/sprintinfo/paris-2005-people.txt (original) +++ pypy/extradoc/sprintinfo/paris-2005-people.txt Tue Sep 27 19:04:30 2005 @@ -18,7 +18,8 @@ Beatrice Duering 9/10 - 17/10 flat Armin Rigo ? ? Samuele Pedroni ? ? -Holger Krekel ? ? +Holger Krekel 6th-17th Oct flat +Lene Wagner 12th-13th Oct flat Michael Hudson ? ? Carl Friedrich Bolz 7/10? - 16/10 flat? Bert Freudenberg 09/10 - 17/10 ? From arigo at codespeak.net Tue Sep 27 19:38:44 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 27 Sep 2005 19:38:44 +0200 (CEST) Subject: [pypy-svn] r17917 - pypy/dist/pypy/doc Message-ID: <20050927173844.9ED0927B9A@code1.codespeak.net> Author: arigo Date: Tue Sep 27 19:38:41 2005 New Revision: 17917 Modified: pypy/dist/pypy/doc/_ref.txt pypy/dist/pypy/doc/faq.txt Log: * Added a FAQ entry from mwh's mail on pypy-dev (27 Sep 2005). * Added How do I compile my own RPython programs. * Reformatted the FAQ. Modified: pypy/dist/pypy/doc/_ref.txt ============================================================================== --- pypy/dist/pypy/doc/_ref.txt (original) +++ pypy/dist/pypy/doc/_ref.txt Tue Sep 27 19:38:41 2005 @@ -67,6 +67,7 @@ .. _`pypy/translator/c/src/`: ../../pypy/translator/c/src .. _`pypy/translator/c/src/ll_os.h`: ../../pypy/translator/c/src/ll_os.h .. _`pypy/translator/c/test/test_extfunc.py`: ../../pypy/translator/c/test/test_extfunc.py +.. _`pypy/translator/goal/targetnopstandalone.py`: ../../pypy/translator/goal/targetnopstandalone.py .. _`translator/java/`: ../../pypy/translator/java .. _`translator/llvm/`: ../../pypy/translator/llvm .. 
_`translator/tool/`: ../../pypy/translator/tool \ No newline at end of file Modified: pypy/dist/pypy/doc/faq.txt ============================================================================== --- pypy/dist/pypy/doc/faq.txt (original) +++ pypy/dist/pypy/doc/faq.txt Tue Sep 27 19:38:41 2005 @@ -1,55 +1,107 @@ - +========================== Frequently Asked Questions ========================== -How fast is PyPy? +.. contents:: - As of August 2005, PyPy was succesfully translated to C. - The version of PyPy that still runs on top of CPython - is slower by a factor of 2000 compared to CPython. The translated - versions seems to be roughly 300 times slower than CPython. - On the other hand, the really interesting question is: Why is - PyPy so slow? -.. _whysoslow: +General +============================================================ -Why is PyPy so slow? +Do I have to rewrite my programs in RPython? +-------------------------------------------- - Our translation process does not try to optimize the produced code - very much. So far the project has been focused on getting a well - tested very compliant self-contained static Python implementation. - During end 2005 and 2006 we are targetting optimizations at various - levels. If you then still think that PyPy is slow then we will - have to find a better answer :-) - -What is RPython? - - RPython is the restricted subset of the Python language that - we are able to translate to machine level code. A major part of - PyPy's interpreter and type implementations is written - in this restricted style. For a more exhaustive definitions - please refer to `RPython`_. +No. PyPy always runs your code in its own interpreter, which is a full +and compliant Python 2.4 interpreter. RPython_ is only the language in +which parts of PyPy itself are written. -Do I have to rewrite my programs in RPython? +I am getting strange errors while playing with PyPy, what should I do? +---------------------------------------------------------------------- + +It seems that a lot of strange, unexplainable problems can be magically +solved by removing all the \*.pyc files from the PyPy source +tree. Another thing you can do is removing the pypy/_cache +completely. If the error is persistent and still annoys you after this +treatment please send us a bug report (or even better, a fix :-) - No. PyPy always runs your code in its own interpreter, which is - a full and compliant Python 2.4 interpreter. RPython is only - the language in which parts of PyPy itself are written. + +Using the PyPy translator +============================================================ How do I compile PyPy? +---------------------- - See the `getting-started`_ guide. Note that at the moment this - produces an executable that contains a lot of things that are - hard-coded for your particular system (including paths), so it's - not really suitable for being installed or redistributed. +See the `getting-started`_ guide. Note that at the moment this produces +an executable that contains a lot of things that are hard-coded for your +particular system (including paths and other stuff), so it's not +suitable for being installed or redistributed. + +How do I compile my own programs? +--------------------------------- + +Start from the example of +`pypy/translator/goal/targetnopstandalone.py`_, which you compile by +typing:: + + python translate_pypy.py targetnopstandalone + +You can have a look at intermediate C source code, which is (at the +moment) put in ``/tmp/usession-*/testing_1/testing_1.c``. 
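As a rough sketch (not the actual contents of targetnopstandalone.py), such a
target module boils down to an ``entry_point()`` function plus a ``target()``
hook that the translation driver uses to find it; judging from the driver code
elsewhere in these diffs, returning None as the second value is what selects
standalone mode::

    import os

    def entry_point(argv):
        # this code runs inside the translated, standalone executable
        os.write(1, "hello from a translated RPython program\n")
        return 0

    def target(*args):
        # hook looked up by translate_pypy; the None means "standalone",
        # i.e. entry_point() receives the command-line arguments at run-time
        return entry_point, None
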
Of course, +all the function and stuff indirectly used by your ``entry_point()`` +function has to be RPython_. + + +Compiling to other languages +============================================================ + +Couldn't we simply take a Python syntax tree and turn it into Lisp? +------------------------------------------------------------------- + +It's not necessarily nonsense, but it's not really The PyPy Way. It's +pretty hard, without some kind of type inference, to translate, say this +Python:: + + a + b + +into anything significantly more efficient than this Common Lisp:: + + (py:add a b) + +And making type inference possible is what RPython is all about. + +You could make ``#'py:add`` a generic function and see if a given CLOS +implementation is fast enough to give a useful speed (but I think the +coercion rules would probably drive you insane first). -- mwh + + +Speed +============================================================ + +How fast is PyPy? +----------------- + +As of August 2005, PyPy was successfully translated to C. The version +of PyPy that still runs on top of CPython is slower by a factor of 2000 +compared to CPython. This translated version was roughly 300 times +slower than CPython, and is now about 20-30 times slower than CPython. +On the other hand, the really interesting question is: Why is PyPy so +slow? + +.. _whysoslow: + +Why is PyPy so slow? +-------------------- + +Our translation process does not try to optimize the produced code very +much. So far the project has been focused on getting a well tested very +compliant self-contained static Python implementation. During end 2005 +and 2006 we are targetting optimizations at various levels. If you then +still think that PyPy is slow then we will have to find a better answer +:-) -I am getting strange errors while playing with PyPy, what should I do? - It seems that a lot of strange, unexplainable problems can be magically - solved by removing all the \*.pyc files from the PyPy source tree. Another - thing you can do is removing the pypy/_cache completely. If the error is - persistent and still annoys you after this treatment please send us a bug - report (or even better, a fix :-) .. _`RPython`: coding-guide.html#rpython .. _`getting-started`: getting-started.html + +.. include:: _ref.txt From pedronis at codespeak.net Tue Sep 27 21:18:19 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 27 Sep 2005 21:18:19 +0200 (CEST) Subject: [pypy-svn] r17918 - in pypy/dist/pypy/translator: . 
goal tool Message-ID: <20050927191819.999A527B9F@code1.codespeak.net> Author: pedronis Date: Tue Sep 27 21:18:17 2005 New Revision: 17918 Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py pypy/dist/pypy/translator/tool/util.py pypy/dist/pypy/translator/translator.py Log: reorganize: move about from translate_pypy_new to translator.Translator as method (parallel to the various show functionality) Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Tue Sep 27 21:18:17 2005 @@ -162,7 +162,7 @@ block = getattr(val, '__annotator_block', None) if block: print '-'*60 - about(block) + t.about(block) print '-'*60 print >> sys.stderr @@ -193,31 +193,6 @@ debugger.join() cleanup() - def about(x): - """ interactive debugging helper """ - from pypy.objspace.flow.model import Block, flatten - if isinstance(x, Block): - for func, graph in t.flowgraphs.items(): - if x in flatten(graph): - funcname = func.func_name - cls = getattr(func, 'class_', None) - if cls: - funcname = '%s.%s' % (cls.__name__, funcname) - print '%s is a %s in the graph of %s' % (x, - x.__class__.__name__, funcname) - print 'at %s:%d' % (func.func_globals.get('__name__', '?'), - func.func_code.co_firstlineno) - break - else: - print '%s is a %s at some unknown location' % (x, - x.__class__.__name__) - print 'containing the following operations:' - for op in x.operations: - print op - print '--end--' - return - print "don't know about", x - from optparse import OptionParser parser = OptionParser() for group in opts: Modified: pypy/dist/pypy/translator/tool/util.py ============================================================================== --- pypy/dist/pypy/translator/tool/util.py (original) +++ pypy/dist/pypy/translator/tool/util.py Tue Sep 27 21:18:17 2005 @@ -120,6 +120,7 @@ print "=" * 70 def worstblocks_topten(ann, n=10): + from pypy.tool.ansi_print import ansi_print h = [(count, block) for block, count in ann.reflowcounter.iteritems()] h.sort() if not h: @@ -131,7 +132,7 @@ break count, block = h.pop() ansi_print(' #%3d: reflown %d times |' % (i+1, count), 36) - about(block) + ann.translator.about(block) ansi_print("`----------------------------------------------------------------------------'", 36) print Modified: pypy/dist/pypy/translator/translator.py ============================================================================== --- pypy/dist/pypy/translator/translator.py (original) +++ pypy/dist/pypy/translator/translator.py Tue Sep 27 21:18:17 2005 @@ -132,6 +132,31 @@ self.annotator.build_types(graph, input_args_types, func) return self.annotator + def about(self, x): + """Interactive debugging helper """ + from pypy.objspace.flow.model import Block, flatten + if isinstance(x, Block): + for func, graph in self.flowgraphs.items(): + if x in graph.iterblocks(): + funcname = func.func_name + cls = getattr(func, 'class_', None) + if cls: + funcname = '%s.%s' % (cls.__name__, funcname) + print '%s is a %s in the graph of %s' % (x, + x.__class__.__name__, funcname) + print 'at %s:%d' % (func.func_globals.get('__name__', '?'), + func.func_code.co_firstlineno) + break + else: + print '%s is a %s at some unknown location' % (x, + x.__class__.__name__) + print 'containing the following operations:' + for op in x.operations: + print " ",op + print '--end--' + return + raise TypeError, "don't know about %r" % 
x + def checkgraphs(self): for graph in self.flowgraphs.itervalues(): checkgraph(graph) From pedronis at codespeak.net Tue Sep 27 21:26:27 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 27 Sep 2005 21:26:27 +0200 (CEST) Subject: [pypy-svn] r17919 - pypy/dist/pypy/translator/goal Message-ID: <20050927192627.13B5827B9F@code1.codespeak.net> Author: pedronis Date: Tue Sep 27 21:26:25 2005 New Revision: 17919 Modified: pypy/dist/pypy/translator/goal/translate_pypy.py Log: don't define about here anymore Modified: pypy/dist/pypy/translator/goal/translate_pypy.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy.py Tue Sep 27 21:26:25 2005 @@ -264,7 +264,7 @@ break count, block = h.pop() ansi_print(' #%3d: reflown %d times |' % (i+1, count), 36) - about(block) + ann.translator.about(block) ansi_print("`----------------------------------------------------------------------------'", 36) print @@ -396,31 +396,6 @@ if options['-d']: annmodel.DEBUG = True - def about(x): - """ interactive debugging helper """ - from pypy.objspace.flow.model import Block, flatten - if isinstance(x, Block): - for func, graph in t.flowgraphs.items(): - if x in flatten(graph): - funcname = func.func_name - cls = getattr(func, 'class_', None) - if cls: - funcname = '%s.%s' % (cls.__name__, funcname) - print '%s is a %s in the graph of %s' % (x, - x.__class__.__name__, funcname) - print 'at %s:%d' % (func.func_globals.get('__name__', '?'), - func.func_code.co_firstlineno) - break - else: - print '%s is a %s at some unknown location' % (x, - x.__class__.__name__) - print 'containing the following operations:' - for op in x.operations: - print op - print '--end--' - return - print "don't know about", x - class PdbPlusShow(pdb.Pdb): def post_mortem(self, t): @@ -767,7 +742,7 @@ block = getattr(val, '__annotator_block', None) if block: print '-'*60 - about(block) + t.about(block) print '-'*60 print >> sys.stderr From pedronis at codespeak.net Tue Sep 27 21:34:33 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 27 Sep 2005 21:34:33 +0200 (CEST) Subject: [pypy-svn] r17920 - in pypy/dist/pypy/translator: goal tool Message-ID: <20050927193433.EB35E27B9F@code1.codespeak.net> Author: pedronis Date: Tue Sep 27 21:34:31 2005 New Revision: 17920 Modified: pypy/dist/pypy/translator/goal/query.py pypy/dist/pypy/translator/goal/translate_pypy.py pypy/dist/pypy/translator/goal/translate_pypy_new.py pypy/dist/pypy/translator/tool/util.py Log: make worstblocks_topten a user-invoked query Modified: pypy/dist/pypy/translator/goal/query.py ============================================================================== --- pypy/dist/pypy/translator/goal/query.py (original) +++ pypy/dist/pypy/translator/goal/query.py Tue Sep 27 21:34:31 2005 @@ -537,3 +537,24 @@ visit(t.entrypoint, [t.entrypoint]) return back + +# + +def worstblocks_topten(t, n=10): + from pypy.tool.ansi_print import ansi_print + ann = t.annotator + h = [(count, block) for block, count in ann.reflowcounter.iteritems()] + h.sort() + if not h: + print "annotator should have been run with debug collecting enabled" + return + print + ansi_print(',----------------------- Top %d Most Reflown Blocks -----------------------.' 
% n, 36) + for i in range(n): + if not h: + break + count, block = h.pop() + ansi_print(' #%3d: reflown %d times |' % (i+1, count), 36) + t.about(block) + ansi_print("`----------------------------------------------------------------------------'", 36) + print Modified: pypy/dist/pypy/translator/goal/translate_pypy.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy.py Tue Sep 27 21:34:31 2005 @@ -135,7 +135,6 @@ lost = query.sanity_check_methods(t) assert not lost, "lost methods, something gone wrong with the annotation of method defs" print "*** No lost method defs." - worstblocks_topten(a, 3) find_someobjects(t) if a: #and not options['-no-s']: print 'Simplifying...' @@ -252,23 +251,6 @@ someobjnum, num) print "=" * 70 -def worstblocks_topten(ann, n=10): - h = [(count, block) for block, count in ann.reflowcounter.iteritems()] - h.sort() - if not h: - return - print - ansi_print(',----------------------- Top %d Most Reflown Blocks -----------------------.' % n, 36) - for i in range(n): - if not h: - break - count, block = h.pop() - ansi_print(' #%3d: reflown %d times |' % (i+1, count), 36) - ann.translator.about(block) - ansi_print("`----------------------------------------------------------------------------'", 36) - print - - def update_usession_dir(stabledir = udir.dirpath('usession')): from py import path try: Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Tue Sep 27 21:34:31 2005 @@ -89,7 +89,7 @@ # XXX this tries to make compiling faster from pypy.translator.tool import cbuild cbuild.enable_fast_compilation() -from pypy.translator.tool.util import worstblocks_topten, find_someobjects +from pypy.translator.tool.util import find_someobjects from pypy.translator.tool.util import sanity_check_exceptblocks, update_usession_dir from pypy.translator.tool.util import assert_rpython_mostly_not_imported, mkexename @@ -114,7 +114,6 @@ lost = query.sanity_check_methods(t) assert not lost, "lost methods, something gone wrong with the annotation of method defs" print "*** No lost method defs." - worstblocks_topten(a, 3) find_someobjects(t) if a: #and not options['-no-s']: print 'Simplifying...' Modified: pypy/dist/pypy/translator/tool/util.py ============================================================================== --- pypy/dist/pypy/translator/tool/util.py (original) +++ pypy/dist/pypy/translator/tool/util.py Tue Sep 27 21:34:31 2005 @@ -119,20 +119,4 @@ someobjnum, num) print "=" * 70 -def worstblocks_topten(ann, n=10): - from pypy.tool.ansi_print import ansi_print - h = [(count, block) for block, count in ann.reflowcounter.iteritems()] - h.sort() - if not h: - return - print - ansi_print(',----------------------- Top %d Most Reflown Blocks -----------------------.' 
% n, 36) - for i in range(n): - if not h: - break - count, block = h.pop() - ansi_print(' #%3d: reflown %d times |' % (i+1, count), 36) - ann.translator.about(block) - ansi_print("`----------------------------------------------------------------------------'", 36) - print From arigo at codespeak.net Tue Sep 27 22:25:28 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Tue, 27 Sep 2005 22:25:28 +0200 (CEST) Subject: [pypy-svn] r17921 - in pypy/dist/pypy/rpython: . test Message-ID: <20050927202528.6FB1727BAB@code1.codespeak.net> Author: arigo Date: Tue Sep 27 22:25:24 2005 New Revision: 17921 Modified: pypy/dist/pypy/rpython/rtuple.py pypy/dist/pypy/rpython/test/test_rtuple.py Log: * make constant tuples prebuilt and shared. * never malloc() an empty tuple, always use the shared one. * fixed tuple concatenation, which was crashing the rtyper -- never tested, because the tests were constant-folded away :-) Modified: pypy/dist/pypy/rpython/rtuple.py ============================================================================== --- pypy/dist/pypy/rpython/rtuple.py (original) +++ pypy/dist/pypy/rpython/rtuple.py Tue Sep 27 22:25:24 2005 @@ -35,16 +35,22 @@ self.lltypes = [r.lowleveltype for r in items_r] fields = zip(self.fieldnames, self.lltypes) self.lowleveltype = Ptr(GcStruct('tuple%d' % len(items_r), *fields)) + self.tuple_cache = {} def compact_repr(self): return "TupleR %s" % ' '.join([llt._short_name() for llt in self.lltypes]) def convert_const(self, value): assert isinstance(value, tuple) and len(value) == len(self.items_r) - p = malloc(self.lowleveltype.TO) - for obj, r, name in zip(value, self.items_r, self.fieldnames): - setattr(p, name, r.convert_const(obj)) - return p + key = tuple([Constant(item) for item in value]) + try: + return self.tuple_cache[key] + except KeyError: + p = malloc(self.lowleveltype.TO) + self.tuple_cache[key] = p + for obj, r, name in zip(value, self.items_r, self.fieldnames): + setattr(p, name, r.convert_const(obj)) + return p #def get_eqfunc(self): # return inputconst(Void, self.item_repr.get_ll_eq_function()) @@ -115,11 +121,13 @@ class __extend__(pairtype(TupleRepr, TupleRepr)): def rtype_add((r_tup1, r_tup2), hop): - v_tuple, v_tuple1 = hop.inputargs(r_tup1.items_r, r_tup2.items_r) - items_r = r_tup1.items_r + r_tup2.items_r - res = TupleRepr(items_r) - vlist = v_tuple + v_tuple1 - return newtuple(hop.llops, res, vlist) + v_tuple1, v_tuple2 = hop.inputargs(r_tup1, r_tup2) + vlist = [] + for i in range(len(r_tup1.items_r)): + vlist.append(r_tup1.getitem(hop.llops, v_tuple1, i)) + for i in range(len(r_tup2.items_r)): + vlist.append(r_tup2.getitem(hop.llops, v_tuple2, i)) + return newtuple_cached(hop, vlist) rtype_inplace_add = rtype_add def convert_from_to((r_from, r_to), v, llops): @@ -140,6 +148,8 @@ # Irregular operations. 
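The ``convert_const`` change above is a standard memoization: the freshly
malloc'ed structure is entered into the cache *before* its fields are filled
in, and the cache is keyed on the Constants wrapping the tuple items, so equal
prebuilt tuples end up sharing a single low-level structure (which is what the
new ``test_constant_tuples_shared`` test further down checks).  Stripped of the
rtyper details, the pattern looks roughly like this (illustrative sketch only,
not part of TupleRepr)::

    class ConstCache(object):
        # illustrative helper, not rtyper code
        def __init__(self, build):
            self.build = build      # allocates an empty low-level struct
            self.cache = {}         # key -> previously converted struct

        def convert(self, key, fill):
            try:
                return self.cache[key]
            except KeyError:
                p = self.cache[key] = self.build()
                fill(p)             # fill the fields only after caching
                return p
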
def newtuple(llops, r_tuple, items_v): + if len(r_tuple.items_r) == 0: + return inputconst(r_tuple, ()) # always the same empty tuple c1 = inputconst(Void, r_tuple.lowleveltype.TO) v_result = llops.genop('malloc', [c1], resulttype = r_tuple.lowleveltype) for i in range(len(r_tuple.items_r)): @@ -147,10 +157,17 @@ llops.genop('setfield', [v_result, cname, items_v[i]]) return v_result +def newtuple_cached(hop, items_v): + r_tuple = hop.r_result + if hop.s_result.is_constant(): + return inputconst(r_tuple, hop.s_result.const) + else: + return newtuple(hop.llops, r_tuple, items_v) + def rtype_newtuple(hop): r_tuple = hop.r_result vlist = hop.inputargs(*r_tuple.items_r) - return newtuple(hop.llops, r_tuple, vlist) + return newtuple_cached(hop, vlist) # # _________________________ Conversions _________________________ Modified: pypy/dist/pypy/rpython/test/test_rtuple.py ============================================================================== --- pypy/dist/pypy/rpython/test/test_rtuple.py (original) +++ pypy/dist/pypy/rpython/test/test_rtuple.py Tue Sep 27 22:25:24 2005 @@ -52,21 +52,21 @@ rtype(dummyfn, [int, int]) def test_tuple_concatenation(): - def f(): - tup = (1,2) + def f(n): + tup = (1,n) tup1 = (3,) res = tup + tup1 + () return res[0]*100 + res[1]*10 + res[2] - res = interpret(f, []) + res = interpret(f, [2]) assert res == 123 def test_tuple_concatenation_mix(): - def f(): - tup = (1,2) + def f(n): + tup = (1,n) tup1 = ('3',) res = tup + tup1 return res[0]*100 + res[1]*10 + ord(res[2]) - ord('0') - res = interpret(f, []) + res = interpret(f, [2]) assert res == 123 def test_constant_tuple_contains(): @@ -123,3 +123,14 @@ res = interpret(f, [0]) assert res.item0 == 3 assert isinstance(typeOf(res.item2), Ptr) and not res.item2 + +def test_constant_tuples_shared(): + def g(n): + x = (n, 42) # constant (5, 42) detected by the annotator + y = (5, 42) # another one, built by the flow space + z = x + () # yet another + return id(x) == id(y) == id(z) + def f(): + return g(5) + res = interpret(f, []) + assert res is True From pedronis at codespeak.net Tue Sep 27 23:29:04 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 27 Sep 2005 23:29:04 +0200 (CEST) Subject: [pypy-svn] r17923 - in pypy/dist/pypy/translator: goal tool Message-ID: <20050927212904.C840F27BB9@code1.codespeak.net> Author: pedronis Date: Tue Sep 27 23:29:02 2005 New Revision: 17923 Modified: pypy/dist/pypy/translator/goal/query.py pypy/dist/pypy/translator/goal/translate_pypy_new.py pypy/dist/pypy/translator/tool/util.py Log: * in translate_pypy_new use polluted logic instead of find_somebojects, complete the formet * move check_exceptblocks logic to query too * started reorganizing the queries to be used for sanity checks by translate_pypy to be generator based with a helper function to write the produced bits of output, in preparatio to use logging with them Modified: pypy/dist/pypy/translator/goal/query.py ============================================================================== --- pypy/dist/pypy/translator/goal/query.py (original) +++ pypy/dist/pypy/translator/goal/query.py Tue Sep 27 23:29:02 2005 @@ -5,48 +5,6 @@ import pypy.annotation.model as annmodel import pypy.objspace.flow.model as flowmodel -#def sources(translator): -# annotator = translator.annotator -# d = {} -# for v, s in annotator.bindings.iteritems(): -# if s.__class__ == annmodel.SomeObject and s.knowntype != type: -# if s.origin: -# d[s.origin[0]] = 1 -# for func in d: -# print func.__module__ or '?', func.__name__ -# 
print len(d) -# return d.keys() - -class Found(Exception): - pass - -def sovars(translator, g): - annotator = translator.annotator - def visit(block): - if isinstance(block, flowmodel.Block): - for v in block.getvariables(): - s = annotator.binding(v, extquery=True) - if s and s.__class__ == annmodel.SomeObject and s.knowntype != type: - print v,s - flowmodel.traverse(visit, g) - -def polluted(translator): - """list functions with still real SomeObject variables""" - annotator = translator.annotator - def visit(block): - if isinstance(block, flowmodel.Block): - for v in block.getvariables(): - s = annotator.binding(v, extquery=True) - if s and s.__class__ == annmodel.SomeObject and s.knowntype != type: - raise Found - c = 0 - for f,g in translator.flowgraphs.iteritems(): - try: - flowmodel.traverse(visit, g) - except Found: - print prettycallable((None, f)) - c += 1 - return c class typerep(object): @@ -122,6 +80,14 @@ classes = [clsdef and clsdef.cls for clsdef, func in bunch] return roots(classes), tuple(typereps(callables)) +def prettyfunc(func): + descr = "(%s:%s)" % (getattr(func, '__module__', None) or '?', func.func_code.co_firstlineno) + funcname = getattr(func, '__name__', None) or 'UNKNOWN' + cls = getattr(func, 'class_', None) + if cls: + funcname = "%s.%s" % (cls.__name__, funcname) + return descr+funcname + def prettycallable((cls, obj)): if cls is None or cls == (True, ()): cls = None @@ -137,7 +103,7 @@ cls = "_|%s" % cls if isinstance(obj, types.FunctionType): - obj = "(%s)%s" % (getattr(obj, '__module__', None) or '?', getattr(obj, '__name__', None) or 'UNKNOWN') + obj = prettyfunc(obj) elif isinstance(obj, tuple): obj = "[%s]" % '|'.join([str(x) for x in obj]) else: @@ -558,3 +524,73 @@ t.about(block) ansi_print("`----------------------------------------------------------------------------'", 36) print + +# query used for sanity checks by translate_pypy + +def short_binding(annotator, var): + try: + binding = annotator.binding(var) + except KeyError: + return "?" 
+ if binding.is_constant(): + return 'const %s' % binding.__class__.__name__ + else: + return binding.__class__.__name__ + +def graph_sig(t, g): + ann = t.annotator + hbinding = lambda v: short_binding(ann, v) + return "%s -> %s" % ( + ', '.join(map(hbinding, g.getargs())), + hbinding(g.getreturnvar())) + +class Found(Exception): + pass + +def polluted_qgen(translator): + """list functions with still real SomeObject variables""" + annotator = translator.annotator + def visit(block): + if isinstance(block, flowmodel.Block): + for v in block.getvariables(): + s = annotator.binding(v, extquery=True) + if s and s.__class__ == annmodel.SomeObject and s.knowntype != type: + raise Found + for f,g in translator.flowgraphs.iteritems(): + try: + flowmodel.traverse(visit, g) + except Found: + line = "%s: %s" % (prettyfunc(f), graph_sig(translator, g)) + yield line + +def check_exceptblocks_qgen(translator): + annotator = translator.annotator + for graph in translator.flowgraphs.itervalues(): + et, ev = graph.exceptblock.inputargs + s_et = annotator.binding(et, extquery=True) + s_ev = annotator.binding(ev, extquery=True) + if s_et: + if s_et.knowntype == type: + if s_et.__class__ == annmodel.SomeObject: + if hasattr(s_et, 'is_type_of') and s_et.is_type_of == [ev]: + continue + else: + if s_et.__class__ == annmodel.SomePBC: + continue + yield "%s exceptblock is not completely sane" % graph.name + + +def qoutput(queryg, write=None): + if write is None: + def write(s): + print s + c = 0 + for bit in queryg: + write(bit) + c += 1 + return c + +def polluted(translator): + c = qoutput(polluted_qgen(translator)) + print c + Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Tue Sep 27 23:29:02 2005 @@ -89,8 +89,7 @@ # XXX this tries to make compiling faster from pypy.translator.tool import cbuild cbuild.enable_fast_compilation() -from pypy.translator.tool.util import find_someobjects -from pypy.translator.tool.util import sanity_check_exceptblocks, update_usession_dir +from pypy.translator.tool.util import update_usession_dir from pypy.translator.tool.util import assert_rpython_mostly_not_imported, mkexename annmodel.DEBUG = False @@ -99,6 +98,20 @@ # __________ Main __________ +def sanity_check_annotation(t): + irreg = query.qoutput(query.check_exceptblocks_qgen(t)) + if not irreg: + print "++ All exceptblocks seem sane" + + lost = query.sanity_check_methods(t) + assert not lost, "lost methods, something gone wrong with the annotation of method defs" + print "*** No lost method defs." + + so = query.qoutput(query.polluted_qgen(t)) + tot = len(t.flowgraphs) + percent = int(tot and (100.0*so / tot) or 0) + print "-- someobjectness %2d (%d of %d functions polluted by SomeObjects)" % (percent, so, tot) + def analyse(t, inputtypes): standalone = inputtypes is None @@ -110,11 +123,8 @@ print 'Annotating...' print 'with policy: %s.%s' % (policy.__class__.__module__, policy.__class__.__name__) a = t.annotate(inputtypes, policy=policy) - sanity_check_exceptblocks(t) - lost = query.sanity_check_methods(t) - assert not lost, "lost methods, something gone wrong with the annotation of method defs" - print "*** No lost method defs." - find_someobjects(t) + sanity_check_annotation(t) + if a: #and not options['-no-s']: print 'Simplifying...' 
a.simplify() Modified: pypy/dist/pypy/translator/tool/util.py ============================================================================== --- pypy/dist/pypy/translator/tool/util.py (original) +++ pypy/dist/pypy/translator/tool/util.py Tue Sep 27 23:29:02 2005 @@ -41,82 +41,5 @@ raise RuntimeError("cannot fork because improper rtyper code" " has already been imported: %r" %(wrongimports,)) -def sanity_check_exceptblocks(translator): - annotator = translator.annotator - irreg = 0 - for graph in translator.flowgraphs.itervalues(): - et, ev = graph.exceptblock.inputargs - s_et = annotator.binding(et, extquery=True) - s_ev = annotator.binding(ev, extquery=True) - if s_et: - if s_et.knowntype == type: - if s_et.__class__ == SomeObject: - if hasattr(s_et, 'is_type_of') and s_et.is_type_of == [ev]: - continue - else: - if s_et.__class__ == annmodel.SomePBC: - continue - print "*****", graph.name, "exceptblock is not completely sane" - irreg += 1 - if irreg == 0: - print "*** All exceptblocks seem sane." - -def find_someobjects(translator, quiet=False): - """Find all functions in that have SomeObject in their signature.""" - annotator = translator.annotator - if not annotator: - return # no annotations available - - translator.highlight_functions = {} - - def is_someobject(var): - try: - return annotator.binding(var).__class__ == SomeObject - except KeyError: - return False - - def short_binding(var): - try: - binding = annotator.binding(var) - except KeyError: - return "?" - if binding.is_constant(): - return 'const %s' % binding.__class__.__name__ - else: - return binding.__class__.__name__ - - header = True - items = [(graph.name, func, graph) - for func, graph in translator.flowgraphs.items()] - items.sort() - num = someobjnum = 0 - for graphname, func, graph in items: - unknown_input_args = len(filter(is_someobject, graph.getargs())) - unknown_return_value = is_someobject(graph.getreturnvar()) - if unknown_input_args or unknown_return_value: - someobjnum += 1 - translator.highlight_functions[func] = True - if not quiet: - if header: - header = False - print "=" * 70 - print "Functions that have SomeObject in their signature" - print "=" * 70 - print ("%(name)s(%(args)s) -> %(result)s\n" - "%(filename)s:%(lineno)s\n" - % {'name': graph.name, - 'filename': func.func_globals.get('__name__', '?'), - 'lineno': func.func_code.co_firstlineno, - 'args': ', '.join(map(short_binding, - graph.getargs())), - 'result': short_binding(graph.getreturnvar())}) - num += 1 - if not quiet: - print "=" * 70 - percent = int(num and (100.0*someobjnum / num) or 0) - print "someobjectness: %2d percent" % (percent) - print "(%d out of %d functions get or return SomeObjects" % ( - someobjnum, num) - print "=" * 70 From pedronis at codespeak.net Tue Sep 27 23:37:08 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Tue, 27 Sep 2005 23:37:08 +0200 (CEST) Subject: [pypy-svn] r17924 - pypy/dist/pypy/translator/goal Message-ID: <20050927213708.56BF327BB9@code1.codespeak.net> Author: pedronis Date: Tue Sep 27 23:37:06 2005 New Revision: 17924 Modified: pypy/dist/pypy/translator/goal/query.py pypy/dist/pypy/translator/goal/translate_pypy_new.py Log: reorganized sanity_check_methods as a generator for translate_pypy_new Modified: pypy/dist/pypy/translator/goal/query.py ============================================================================== --- pypy/dist/pypy/translator/goal/query.py (original) +++ pypy/dist/pypy/translator/goal/query.py Tue Sep 27 23:37:06 2005 @@ -408,47 +408,6 @@ import traceback 
traceback.print_exc() -def sanity_check_methods(translator): - from pypy.annotation.classdef import ClassDef - def ismeth(s_val): - if not isinstance(s_val, annmodel.SomePBC): - return False - s_pbc = s_val - c = 0 - for f, clsdef in s_pbc.prebuiltinstances.iteritems(): - if callable(f) and isinstance(clsdef, ClassDef): - c += 1 - return c == len(s_pbc.prebuiltinstances) - usercls = translator.annotator.getuserclasses() - withmeths = [] - for clsdef in usercls.itervalues(): - meths = [] - for attr in clsdef.attrs.values(): - if ismeth(attr.s_value): - meths.append(attr) - if meths: - withmeths.append((clsdef, meths)) - lost = 0 - for clsdef, meths in withmeths: - cls = clsdef.cls - n = 0 - subclasses = [] - for clsdef1 in usercls.itervalues(): - if issubclass(clsdef1.cls, cls): - subclasses.append(clsdef1) - for meth in meths: - name = meth.name - funcs = dict.fromkeys(meth.s_value.prebuiltinstances.iterkeys()) - for subcls in subclasses: - f = subcls.cls.__dict__.get(name) - if hasattr(f, 'im_self') and f.im_self is None: - f = f.im_func - if f: - if f not in funcs: - print "Lost method!", name, subcls.cls, cls, subcls.attrs.keys() - lost += 0 - return lost - def graph_footprint(graph): class Counter: blocks = 0 @@ -579,6 +538,43 @@ continue yield "%s exceptblock is not completely sane" % graph.name +def check_methods_qgen(translator): + from pypy.annotation.classdef import ClassDef + def ismeth(s_val): + if not isinstance(s_val, annmodel.SomePBC): + return False + s_pbc = s_val + c = 0 + for f, clsdef in s_pbc.prebuiltinstances.iteritems(): + if callable(f) and isinstance(clsdef, ClassDef): + c += 1 + return c == len(s_pbc.prebuiltinstances) + usercls = translator.annotator.getuserclasses() + withmeths = [] + for clsdef in usercls.itervalues(): + meths = [] + for attr in clsdef.attrs.values(): + if ismeth(attr.s_value): + meths.append(attr) + if meths: + withmeths.append((clsdef, meths)) + for clsdef, meths in withmeths: + cls = clsdef.cls + n = 0 + subclasses = [] + for clsdef1 in usercls.itervalues(): + if issubclass(clsdef1.cls, cls): + subclasses.append(clsdef1) + for meth in meths: + name = meth.name + funcs = dict.fromkeys(meth.s_value.prebuiltinstances.iterkeys()) + for subcls in subclasses: + f = subcls.cls.__dict__.get(name) + if hasattr(f, 'im_self') and f.im_self is None: + f = f.im_func + if f: + if f not in funcs: + yield "lost method: %s %s %s %s" % (name, subcls.cls, cls, subcls.attrs.keys() ) def qoutput(queryg, write=None): if write is None: @@ -593,4 +589,7 @@ def polluted(translator): c = qoutput(polluted_qgen(translator)) print c - + +def sanity_check_methods(translator): + lost = qoutput(check_methods_qgen(translator)) + print lost Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Tue Sep 27 23:37:06 2005 @@ -103,9 +103,9 @@ if not irreg: print "++ All exceptblocks seem sane" - lost = query.sanity_check_methods(t) + lost = query.qoutput(query.check_methods_qgen(t)) assert not lost, "lost methods, something gone wrong with the annotation of method defs" - print "*** No lost method defs." 
+ print "++ No lost method defs" so = query.qoutput(query.polluted_qgen(t)) tot = len(t.flowgraphs) From pedronis at codespeak.net Wed Sep 28 00:18:47 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Wed, 28 Sep 2005 00:18:47 +0200 (CEST) Subject: [pypy-svn] r17925 - pypy/dist/pypy/translator Message-ID: <20050927221847.9414B27BC4@code1.codespeak.net> Author: pedronis Date: Wed Sep 28 00:18:45 2005 New Revision: 17925 Modified: pypy/dist/pypy/translator/annrpython.py Log: use class_ in whereami Modified: pypy/dist/pypy/translator/annrpython.py ============================================================================== --- pypy/dist/pypy/translator/annrpython.py (original) +++ pypy/dist/pypy/translator/annrpython.py Wed Sep 28 00:18:45 2005 @@ -439,6 +439,9 @@ else: name = 'UNKNOWN' firstlineno = -1 + cls = getattr(fn, 'class_', None) + if cls is not None: + name = "%s.%s" % (cls.__name__, name) blk = "" if block: at = block.at() @@ -447,7 +450,7 @@ opid="" if i is not None: opid = " op=%d" % i - return "(%s:%d) %s%s%s" % (mod, firstlineno, name, blk, opid) + return "(%s:%d)%s%s%s" % (mod, firstlineno, name, blk, opid) def flowin(self, fn, block): #print 'Flowing', block, [self.binding(a) for a in block.inputargs] From cfbolz at codespeak.net Wed Sep 28 01:39:45 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Wed, 28 Sep 2005 01:39:45 +0200 (CEST) Subject: [pypy-svn] r17926 - pypy/dist/pypy/doc Message-ID: <20050927233945.2B79227BC4@code1.codespeak.net> Author: cfbolz Date: Wed Sep 28 01:39:43 2005 New Revision: 17926 Added: pypy/dist/pypy/doc/dev_method.txt - copied unchanged from r17914, pypy/extradoc/sprintinfo/dev_meth_20050827.txt Log: copy the sprint document to the regular documentation directory. From pedronis at codespeak.net Wed Sep 28 02:54:09 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Wed, 28 Sep 2005 02:54:09 +0200 (CEST) Subject: [pypy-svn] r17928 - in pypy/dist/pypy/translator: goal tool Message-ID: <20050928005409.B6D3027B92@code1.codespeak.net> Author: pedronis Date: Wed Sep 28 02:54:07 2005 New Revision: 17928 Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py pypy/dist/pypy/translator/tool/pdbplus.py Log: issue132 in-progress - move some debugger startup logic to pdbplus - command enable_graphic to activate pygame display even if non-graphic mode was specified originally Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Wed Sep 28 02:54:07 2005 @@ -157,10 +157,10 @@ def debug(got_error): from pypy.translator.tool.pdbplus import PdbPlusShow - from pypy.translator.tool.pdbplus import run_debugger_in_thread pdb_plus_show = PdbPlusShow(t) # need a translator to support extended commands - + + tb = None if got_error: import traceback exc, val, tb = sys.exc_info() @@ -173,34 +173,25 @@ print '-'*60 t.about(block) print '-'*60 - - print >> sys.stderr - func, args = pdb_plus_show.post_mortem, (tb,) + print else: print '-'*60 print 'Done.' 
print - func, args = pdb_plus_show.set_trace, () - if not cmd_line_opt.pygame: - if cmd_line_opt.batch: - print >>sys.stderr, "batch mode, not calling interactive helpers" - else: - func(*args) - else: - if cmd_line_opt.batch: - print >>sys.stderr, "batch mode, not calling interactive helpers" + + if cmd_line_opt.batch: + print >>sys.stderr, "batch mode, not calling interactive helpers" + return + + def server_setup(): + if serv_start: + return serv_start, serv_show, serv_stop, serv_cleanup else: - if serv_start: - start, show, stop, cleanup = serv_start, serv_show, serv_stop, serv_cleanup - else: - from pypy.translator.tool.pygame.server import run_translator_server - start, show, stop, cleanup = run_translator_server(t, entry_point, cmd_line_opt) - pdb_plus_show.install_show(show) - debugger = run_debugger_in_thread(func, args, stop) - debugger.start() - start() - debugger.join() - cleanup() + from pypy.translator.tool.pygame.server import run_translator_server + return run_translator_server(t, entry_point, cmd_line_opt) + + pdb_plus_show.start(tb, server_setup, graphic=cmd_line_opt.pygame) + from optparse import OptionParser parser = OptionParser() Modified: pypy/dist/pypy/translator/tool/pdbplus.py ============================================================================== --- pypy/dist/pypy/translator/tool/pdbplus.py (original) +++ pypy/dist/pypy/translator/tool/pdbplus.py Wed Sep 28 02:54:07 2005 @@ -1,18 +1,7 @@ import threading, pdb -def run_debugger_in_thread(fn, args, cleanup=None, cleanup_args=()): - def _run_in_thread(): - try: - try: - fn(*args) - pass # for debugger to land - except pdb.bdb.BdbQuit: - pass - finally: - if cleanup is not None: - cleanup(*cleanup_args) - return threading.Thread(target=_run_in_thread, args=()) - +class _EnableGraphic: + pass class PdbPlusShow(pdb.Pdb): @@ -346,8 +335,15 @@ from pypy.translator.tool import graphpage self._show(graphpage.ClassHierarchyPage(self.translator)) + def do_enable_graphic(self, arg): + """enable_graphic +enable pygame graph display even from non-graphic mode""" + if self.show: + return + raise _EnableGraphic + def help_graphs(self): - print "graph commands are: showg, flowg, callg, classhier" + print "graph commands are: showg, flowg, callg, classhier, enable_graphic" def help_ann_other(self): print "other annotation related commands are: find, findclasses, findfuncs, attrs, attrsann, readpos" @@ -356,3 +352,40 @@ print "these prefixes are tried for dotted names in graph commands:" print self.TRYPREFIXES + # start helpers + def _run_debugger(self, tb): + if tb is None: + fn, args = self.set_trace, () + else: + fn, args = self.post_mortem, (tb,) + try: + t = self.translator # define enviroments, xxx more stuff + fn(*args) + pass # for debugger to land + except pdb.bdb.BdbQuit: + pass + + def _run_debugger_in_thread(self, tb, cleanup=None, cleanup_args=()): + def _run_in_thread(): + try: + self._run_debugger(tb) + finally: + if cleanup is not None: + cleanup(*cleanup_args) + return threading.Thread(target=_run_in_thread, args=()) + + def start(self, tb, server_setup, graphic=False): + if not graphic: + try: + self._run_debugger(tb) + except _EnableGraphic: + pass + else: + return + start, show, stop, cleanup = server_setup() + self.install_show(show) + debugger = self._run_debugger_in_thread(tb, stop) + debugger.start() + start() + debugger.join() + cleanup() From pedronis at codespeak.net Wed Sep 28 03:01:23 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Wed, 28 Sep 2005 03:01:23 +0200 (CEST) 
Subject: [pypy-svn] r17930 - pypy/dist/pypy/translator/goal Message-ID: <20050928010123.7CFA727B92@code1.codespeak.net> Author: pedronis Date: Wed Sep 28 03:01:22 2005 New Revision: 17930 Modified: pypy/dist/pypy/translator/goal/translate_pypy.py Log: change to preserve old output for now Modified: pypy/dist/pypy/translator/goal/translate_pypy.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy.py Wed Sep 28 03:01:22 2005 @@ -132,7 +132,7 @@ print 'with policy: %s.%s' % (policy.__class__.__module__, policy.__class__.__name__) a = t.annotate(inputtypes, policy=policy) sanity_check_exceptblocks(t) - lost = query.sanity_check_methods(t) + lost = query.qoutput(query.check_methods_qgen(t)) assert not lost, "lost methods, something gone wrong with the annotation of method defs" print "*** No lost method defs." find_someobjects(t) From hpk at codespeak.net Wed Sep 28 08:34:37 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Wed, 28 Sep 2005 08:34:37 +0200 (CEST) Subject: [pypy-svn] r17934 - pypy/dist/pypy/doc Message-ID: <20050928063437.1285827B97@code1.codespeak.net> Author: hpk Date: Wed Sep 28 08:34:36 2005 New Revision: 17934 Modified: pypy/dist/pypy/doc/index.txt Log: link to development methodology doc Modified: pypy/dist/pypy/doc/index.txt ============================================================================== --- pypy/dist/pypy/doc/index.txt (original) +++ pypy/dist/pypy/doc/index.txt Wed Sep 28 08:34:36 2005 @@ -11,6 +11,8 @@ `coding guide`_ helps you to write code for PyPy. +`development methodology`_ describe our sprint-driven approach. + `object spaces`_ discusses the object space interface and several implementations. @@ -41,6 +43,7 @@ .. _`FAQ`: faq.html .. _parser: parser.html +.. _`development methodology`: dev_method.html .. _`talks and related projects`: extradoc.html .. _`license`: ../../LICENSE .. _`compliance test status`: http://codespeak.net/~hpk/pypy-testresult/ From ale at codespeak.net Wed Sep 28 09:50:21 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Wed, 28 Sep 2005 09:50:21 +0200 (CEST) Subject: [pypy-svn] r17935 - pypy/dist/pypy/translator/tool/test Message-ID: <20050928075021.2BC5527B9E@code1.codespeak.net> Author: ale Date: Wed Sep 28 09:50:20 2005 New Revision: 17935 Modified: pypy/dist/pypy/translator/tool/test/test_taskengine.py Log: typo Modified: pypy/dist/pypy/translator/tool/test/test_taskengine.py ============================================================================== --- pypy/dist/pypy/translator/tool/test/test_taskengine.py (original) +++ pypy/dist/pypy/translator/tool/test/test_taskengine.py Wed Sep 28 09:50:20 2005 @@ -50,7 +50,7 @@ def test(goals, task_skip=[]): if isinstance(goals, str): - gaols = [goals] + goals = [goals] abc = ABC() abc._execute(goals, task_skip=task_skip) return abc.done From ac at codespeak.net Wed Sep 28 11:19:38 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Wed, 28 Sep 2005 11:19:38 +0200 (CEST) Subject: [pypy-svn] r17936 - pypy/dist/pypy/module/_codecs Message-ID: <20050928091938.251E427BA1@code1.codespeak.net> Author: ac Date: Wed Sep 28 11:19:38 2005 New Revision: 17936 Modified: pypy/dist/pypy/module/_codecs/app_codecs.py Log: Fix errors in decoding unicode escapes. 
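The main change in the diff below is to the octal escapes (``\ooo``): such an
escape consumes the digit that follows the backslash plus at most two more
octal digits, and must stop early at a non-octal character or at the end of
the string.  The second hunk also simplifies the fallback for an unrecognized
escape, which now simply keeps the backslash and the character after it.  As a
standalone illustration of the intended octal behaviour (not the module code
itself)::

    def decode_octal_escape(s, pos):
        # 'pos' points at the first octal digit, just after the backslash;
        # returns the decoded character and the new position
        x = 0
        for _ in range(3):                     # at most three octal digits
            if pos >= len(s) or not ('0' <= s[pos] <= '7'):
                break
            x = (x << 3) + ord(s[pos]) - ord('0')
            pos += 1
        return unichr(x), pos

For example ``decode_octal_escape(u"101", 0)`` gives ``(u'A', 3)``, while
``decode_octal_escape(u"7x", 0)`` stops after the single digit and gives
``(u'\x07', 1)``.
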
Modified: pypy/dist/pypy/module/_codecs/app_codecs.py ============================================================================== --- pypy/dist/pypy/module/_codecs/app_codecs.py (original) +++ pypy/dist/pypy/module/_codecs/app_codecs.py Wed Sep 28 11:19:38 2005 @@ -1404,17 +1404,18 @@ elif ch == 'r' : p += u'\r' elif ch == 'v': p += u'\013' #break; /* VT */ elif ch == 'a': p += u'\007' # break; /* BEL, not classic C */ - elif ch in [ '0', '1', '2', '3', '4', '5', '6', '7']: - x = int(s[pos, pos+3], 8) - # x = ord(ch) - ord('0') - # ch = s[pos] - # if ('0' <= ch and ch <= '7'): - # x = (x<<3) + ord(ch) - ord('0') - # ch = s[pos+1] - # if ('0' <= ch and ch <= '7'): - # x = (x<<3) + ord(ch) - ord('0') - # pos += 2 - pos += 3 + elif '0' <= ch <= '7': + x = ord(ch) - ord('0') + if pos < size: + ch = s[pos] + if '0' <= ch <= '7': + pos += 1 + x = (x<<3) + ord(ch) - ord('0') + if pos < size: + ch = s[pos] + if '0' <= ch <= '7': + pos += 1 + x = (x<<3) + ord(ch) - ord('0') p += unichr(x) ## /* hex escapes */ ## /* \xXX */ @@ -1471,16 +1472,8 @@ else: x = unicode_call_errorhandler(errors, "unicodeescape", message, s, pos-1, look+1) else: - if (pos > size): - message = "\\ at end of string" - handler = lookup_error(errors) - x = handler(UnicodeDecodeError("unicodeescape", s, pos, - size, message)) - p += x[0] - pos = x[1] - else: - p += '\\' - p += s[pos] + p += '\\' + p += ch return p def PyUnicode_EncodeRawUnicodeEscape(s, size): From arigo at codespeak.net Wed Sep 28 11:55:01 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Wed, 28 Sep 2005 11:55:01 +0200 (CEST) Subject: [pypy-svn] r17937 - pypy/dist/pypy/doc Message-ID: <20050928095501.0420927BA1@code1.codespeak.net> Author: arigo Date: Wed Sep 28 11:54:52 2005 New Revision: 17937 Modified: pypy/dist/pypy/doc/_ref.txt pypy/dist/pypy/doc/draft-dynamic-language-translation.txt pypy/dist/pypy/doc/theory.txt Log: * More work on the annotator section. * Added a reference to Wikipedia in theory.txt. Modified: pypy/dist/pypy/doc/_ref.txt ============================================================================== --- pypy/dist/pypy/doc/_ref.txt (original) +++ pypy/dist/pypy/doc/_ref.txt Wed Sep 28 11:54:52 2005 @@ -4,6 +4,7 @@ .. _`annotation/`: .. _`pypy/annotation`: ../../pypy/annotation .. _`annotation/binaryop.py`: ../../pypy/annotation/binaryop.py +.. _`pypy/annotation/model.py`: ../../pypy/annotation/model.py .. _`doc/`: ../../pypy/doc .. _`doc/revreport/`: ../../pypy/doc/revreport .. _`interpreter/`: Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt ============================================================================== --- pypy/dist/pypy/doc/draft-dynamic-language-translation.txt (original) +++ pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Wed Sep 28 11:54:52 2005 @@ -2,6 +2,9 @@ Compiling dynamic language implementations ============================================================ +.. contents:: +.. sectnum:: + The analysis of dynamic languages =============================================== @@ -494,6 +497,12 @@ constant-folded away, instead of having the various possible values of ``next_block`` be merged at the beginning of the loop. +For more information see `The Interplevel Back-End`_ in the reference +documentation. + +.. _`The Interplevel Back-End`: translation.html#the-interplevel-back-end + + Annotator --------------------------------- @@ -570,10 +579,122 @@ process necessarily terminates, as we will show in the sequel. 
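Schematically, the fixpoint process referred to here can be pictured as a
generic worklist computation over a finite lattice; the following sketch is
only illustrative and does not use the annotator's actual data structures::

    def fixpoint(blocks, transfer, join, bottom):
        # generic worklist algorithm: 'transfer' propagates a block's state
        # to its successors, 'join' is the lattice union, 'bottom' the
        # initial (least) element
        state = dict([(block, bottom) for block in blocks])
        pending = list(blocks)
        while pending:
            block = pending.pop()
            for succ, value in transfer(block, state[block]):
                merged = join(state[succ], value)
                if merged != state[succ]:
                    state[succ] = merged
                    pending.append(succ)       # re-flow only what changed
        return state

Termination follows from the finiteness argument made in the text: each
``state`` entry can only grow, and there is no infinite ascending chain in the
lattice.
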
+Flow graph model +~~~~~~~~~~~~~~~~ + +For the purpose of the sequel, an informal description of the data model +used to represent flow graphs will suffice (a `precise description`_ can +be found in the reference documentation). + +The flow graphs are in Static Single Information (SSI) form, an +extension of Static Single Assignment (SSA_): each variable is only used +in exactly one basic block. All variables that are not dead at the end +of a basic block are explicitely carried over to the next block and +renamed. Instead of the traditional phi functions of SSA we use a minor +variant, parameter-passing style: each block declares a number of *input +variables* playing the role of input arguments to the block; each link +going out of a block carries a matching number of variables and +constants from the previous block into the target block's input +arguments. + +We use the following notation for an *operation* recorded in a block of +the flow graph of a function:: + + z = opname(x_1, ..., x_n) | z' + +where *x_1, ..., x_n* are the arguments of the operation (either +variables defined earlier in the block, or constants), *z* is the +variable into which the result is stored (each operation introduces a +new fresh variable as its result), and *z'* is a fresh extra variable +which we will use in particular cases (which we omit from the notation +when it is irrelevant). + +Let us assume that we are given a user program, which for the purpose of +the model we assume to be fully known in advance. Let us define the set +*V* of all variables as follows: + +* *V* contains all the variables that appear in operations, in any flow + graph of any function of the program, as described above; + +* in addition, for each class ``C`` of the user program and each + possible attribute name ``attr``, we add to *V* a variable called + *v_C.attr*. + +For a function ``f`` of the user program, we call *arg_f_1, ..., +arg_f_n* the variables bound to the input arguments of ``f`` (which are +actually the input variables of the first block in the flow graph of +``f``) and *return_f* the variable bound to the return value of ``f`` +(which is the single input variable of a special empty "return" block +ending the flow graph). + +Note that the complete knowledge of the operations and classes that +appear in the user program allow us to bound the size of *V*. Indeed, +the set of possible attribute names can be defined as all names that +appear in a ``getattr`` or ``setattr`` operation; no other name will +play a role during annotation. + +.. _`precise description`: objspace.html#the-flow-model +.. _`SSA`: http://en.wikipedia.org/wiki/Static_single_assignment_form + + Annotation model ~~~~~~~~~~~~~~~~ -:: +As in the `formal definition`_ of Abstract Interpretation, the model for +our annotation forms a *lattice_*, although we only use its structure of +*`join-semilattice`_*. 
+ +The set *A* of annotations is defined as the following formal terms: + +* Bot, Top -- the minimum and maximum elements (corresponding to + "impossible value" and "most general value"); + +* Int, NonNegInt, Bool -- integers, known-non-negative integers, booleans; + +* Str, Char -- strings, characters (which are strings of length 1); + +* Inst(*class*) -- instance of *class* or a subclass thereof (there is + one such term per *class*); + +* List(*v*) -- list; *v* is a variable summarizing the items of the list + (there is one such term per variable); + +* Callable(*set*) -- where the *set* is a subset of the (finite) set of + all functions, all classes, and all pairs of a class and a function + (written ``class.f``). + +* None -- stands for the singleton ``None`` object of Python. + +More details about the annotations will be introduced in due time. In +addition, some of the annotations have a corresponding "nullable" twin, +which stands for "either the object described or ``None``". We use it +to propagate knowledge about which variable, after translation to C, +could ever contain a NULL pointer. (More precisely, there are a +NullableStr, nullable instances, and nulllable callables, and all lists +are implicitely assumed to be nullable). + +Each annotation corresponds to a family of run-time Python object; the +ordering of the lattice is essentially the subset order. Formally, it +is the partial order generated by: + +* Bot <= a <= Top -- for any annotation *a*; + +* Bool <= NonNegInt <= Int; + +* Char <= Str; + +* Inst(*subclass*) <= Inst(*class*) -- for any class and subclass; + +* Callable(*subset*) <= Callable(*set*); + +* a <= b -- for any annotation *a* with a nullable twin *b*; + +* None <= b -- for any nullable annotation *b*. + +It is left as an exercice to show that this partial order makes *A* a +lattice. + +Graphically:: ____________ Top ___________ / / | \ \ @@ -582,14 +703,16 @@ / NullableStr | | | Int / \ | (lists) | / Str \ (instances) | (pbcs) - NonNegInt \ \ \ | | - \ Char \ \ / / - Bool \ \ \ / / - \ \ `----- None -----' - \ \ / - \ \ / - `--------`-- Bottom + NonNegInt \ \ | | | + \ Char \ |\ / / + Bool \ \ | \ / / + \ \ `----- None -----/ + \ \ | / / + \ \ | / / + `--------`-- Bottom ------' +Here is the part about instances and nullable instances, assuming a +simple class hierarchy with only two direct subclasses of ``object``:: Top | @@ -614,7 +737,7 @@ \ / / Bottom - +All list terms for all variables are unordered:: __________________ Top __________________ / / / \ \ \ @@ -626,22 +749,27 @@ \ \ \ / / / '------------'--- None ----'------------' +The callables form a classical finite set-of-subsets lattice. In +practice, we consider ``None`` as a degenerated callable, so the None +annotation is actually Callable({None}). + +We should mention (but ignore for the sequel) that all annotations also +have a variant where they stand for a single known object; this +information is used in constant propagation. In addition, we have left +out a number of other annotations that are irrelevant for the basic +description of the annotator, and straightforward to handle. The +complete list is defined and documented in `pypy/annotation/model.py`_ +and described in more practical terms in `The Annotation Pass`_ in the +reference documentation. +.. 
_`The Annotation Pass`: translation.html#annotator - Bot - Top - - Int - - NonNegInt +Draft +~~~~~ - Bool +:: - Str - - NullableStr - Char Inst(class) @@ -783,6 +911,22 @@ Classes and instances ~~~~~~~~~~~~~~~~~~~~~ +We assume that the classes in the user program are organized in a single +inheritance tree rooted at the ``object`` base class. (Python supports +multiple inheritance, but the annotator is limited to single inheritance +plus simple mix-ins.) + +Remember that Python has no notion of classes declaring attributes and +methods. Classes are merely hierarchical namespaces: an expression like +``obj.attr`` means that the ``attr`` attribute is looked up in the class +that ``obj`` is an instance of at run-time, and all parent classes (a +``getattr`` operation). Expressions like ``obj.meth()`` that look like +method calls are actually grouped as ``(obj.meth)()``: they correspond +to two operations, a ``getattr`` followed by a ``call``. + +So it is down to the annotator to reconstruct a static structure for +each class in the hierarchy XXX. + XXX Termination @@ -791,6 +935,22 @@ XXX termination + soundness + most-precise-fixpoint-ness + complexity +The lattice is finite, although its size depends on the size of the +program. The List part has the same size as *V*, and the Callable part +is exponential on the number of callables. However, in this model a +chain of annotations (where each one is larger than the previous) cannot +be longer than:: + + max(5, number-of-callables + 3, depth-of-class-hierarchy + 3). + +In the extended lattice used in practice it is more difficult to compute +an upper bound. Such a bound exists -- some considerations can even +show that a finite subset of the extended lattice suffices -- but it +does not reflect any partical complexity considerations. It is simpler +to prove that there is no infinite ascending chain, which is enough to +guarantee termination. + + Non-static aspects ~~~~~~~~~~~~~~~~~~ @@ -832,6 +992,9 @@ .. _architecture: architecture.html .. _`Thunk Object Space`: objspace.html#the-thunk-object-space .. _`abstract interpretation`: theory.html#abstract-interpretation +.. _`formal definition`: http://en.wikipedia.org/wiki/Abstract_interpretation +.. _lattice: http://en.wikipedia.org/wiki/Lattice_%28order%29 +.. _`join-semilattice`: http://en.wikipedia.org/wiki/Lattice_%28order%29 .. _`Flow Object Space`: objspace.html#the-flow-object-space .. _`Standard Object Space`: objspace.html#the-standard-object-space .. _Psyco: http://psyco.sourceforge.net/ Modified: pypy/dist/pypy/doc/theory.txt ============================================================================== --- pypy/dist/pypy/doc/theory.txt (original) +++ pypy/dist/pypy/doc/theory.txt Wed Sep 28 11:54:52 2005 @@ -32,8 +32,18 @@ In PyPy, the FlowObjSpace_ uses the abstract interpretation technique to generate a control flow graph of the functions of RPython_ programs. +In its `more formal definition`_, Abstract Interpretation typically +considers abstract objects that are organized in a lattice_: some of +these objects are more (or less) abstract than others, in the sense that +they represent less (or more) known information; to say that this forms +a lattice essentially means that any two abstract objects have +well-defined unions and intersections (which are again abstract +objects). + .. _FlowObjSpace: objspace.html#the-flow-object-space .. _RPython: coding-guide.html#restricted-python +.. _`more formal definition`: http://en.wikipedia.org/wiki/Abstract_interpretation +.. 
_lattice: http://en.wikipedia.org/wiki/Lattice_%28order%29 Multimethods From arigo at codespeak.net Wed Sep 28 12:19:47 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Wed, 28 Sep 2005 12:19:47 +0200 (CEST) Subject: [pypy-svn] r17938 - pypy/dist/pypy/doc Message-ID: <20050928101947.8BDC927BB2@code1.codespeak.net> Author: arigo Date: Wed Sep 28 12:19:44 2005 New Revision: 17938 Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Log: Fix links. Modified: pypy/dist/pypy/doc/draft-dynamic-language-translation.txt ============================================================================== --- pypy/dist/pypy/doc/draft-dynamic-language-translation.txt (original) +++ pypy/dist/pypy/doc/draft-dynamic-language-translation.txt Wed Sep 28 12:19:44 2005 @@ -641,8 +641,8 @@ ~~~~~~~~~~~~~~~~ As in the `formal definition`_ of Abstract Interpretation, the model for -our annotation forms a *lattice_*, although we only use its structure of -*`join-semilattice`_*. +our annotation forms a lattice_, although we only use its structure of +`join-semilattice`_. The set *A* of annotations is defined as the following formal terms: @@ -994,7 +994,7 @@ .. _`abstract interpretation`: theory.html#abstract-interpretation .. _`formal definition`: http://en.wikipedia.org/wiki/Abstract_interpretation .. _lattice: http://en.wikipedia.org/wiki/Lattice_%28order%29 -.. _`join-semilattice`: http://en.wikipedia.org/wiki/Lattice_%28order%29 +.. _`join-semilattice`: http://en.wikipedia.org/wiki/Semilattice .. _`Flow Object Space`: objspace.html#the-flow-object-space .. _`Standard Object Space`: objspace.html#the-standard-object-space .. _Psyco: http://psyco.sourceforge.net/ From ericvrp at codespeak.net Wed Sep 28 13:18:00 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Wed, 28 Sep 2005 13:18:00 +0200 (CEST) Subject: [pypy-svn] r17940 - in pypy/dist/pypy/translator: backendopt backendopt/test llvm Message-ID: <20050928111800.4866627BA9@code1.codespeak.net> Author: ericvrp Date: Wed Sep 28 13:17:58 2005 New Revision: 17940 Removed: pypy/dist/pypy/translator/backendopt/removezerobytemalloc.py pypy/dist/pypy/translator/backendopt/test/test_removezerobytemalloc.py Modified: pypy/dist/pypy/translator/backendopt/all.py pypy/dist/pypy/translator/llvm/build_llvm_module.py pypy/dist/pypy/translator/llvm/genllvm.py pypy/dist/pypy/translator/llvm/opwriter.py Log: removal of unused code like remove zero byte malloc transformation Modified: pypy/dist/pypy/translator/backendopt/all.py ============================================================================== --- pypy/dist/pypy/translator/backendopt/all.py (original) +++ pypy/dist/pypy/translator/backendopt/all.py Wed Sep 28 13:17:58 2005 @@ -4,26 +4,19 @@ from pypy.translator.backendopt.malloc import remove_simple_mallocs from pypy.translator.backendopt.ssa import SSI_to_SSA from pypy.translator.backendopt.propagate import propagate_all -from pypy.translator.backendopt.removezerobytemalloc import remove_zero_byte_mallocs from pypy.translator import simplify def backend_optimizations(translator, inline_threshold=1, mallocs=True, ssa_form=True, - propagate=False, - removezerobytemallocs=False): + propagate=False): # remove obvious no-ops for graph in translator.flowgraphs.values(): remove_same_as(graph) simplify.eliminate_empty_blocks(graph) simplify.transform_dead_op_vars(graph, translator) - # remove allocation of empty structs - if removezerobytemallocs: - for graph in translator.flowgraphs.values(): - remove_zero_byte_mallocs(graph) 
- # ... if propagate: propagate_all(translator) Modified: pypy/dist/pypy/translator/llvm/build_llvm_module.py ============================================================================== --- pypy/dist/pypy/translator/llvm/build_llvm_module.py (original) +++ pypy/dist/pypy/translator/llvm/build_llvm_module.py Wed Sep 28 13:17:58 2005 @@ -88,6 +88,9 @@ if not use_gcc: cmds.append("llc %s %s.bc -f -o %s.s" % (genllvm.exceptionpolicy.llc_options(), b, b)) cmds.append("as %s.s -o %s.o" % (b, b)) + if exe_name: + cmd = "gcc %s.o %s -lm -ldl -pipe -o %s" % (b, gc_libs, exe_name) + cmds.append(cmd) object_files.append("%s.o" % b) else: cmds.append("llc %s %s.bc -march=c -f -o %s.c" % (genllvm.exceptionpolicy.llc_options(), b, b)) Modified: pypy/dist/pypy/translator/llvm/genllvm.py ============================================================================== --- pypy/dist/pypy/translator/llvm/genllvm.py (original) +++ pypy/dist/pypy/translator/llvm/genllvm.py Wed Sep 28 13:17:58 2005 @@ -253,7 +253,7 @@ a = t.annotate(annotation) a.simplify() t.specialize() - t.backend_optimizations(ssa_form=False, propagate=False, removezerobytemallocs=False) + t.backend_optimizations(ssa_form=False) if view: #note: this is without policy transforms t.view() return genllvm(t, **kwds) Modified: pypy/dist/pypy/translator/llvm/opwriter.py ============================================================================== --- pypy/dist/pypy/translator/llvm/opwriter.py (original) +++ pypy/dist/pypy/translator/llvm/opwriter.py Wed Sep 28 13:17:58 2005 @@ -390,11 +390,6 @@ def malloc(self, op): arg_type = op.args[0].value targetvar = self.db.repr_arg(op.result) - if isinstance(arg_type, lltype.Struct) and arg_type._names_without_voids() == []: - t = self.db.repr_arg_type(op.result) - self.codewriter.cast(targetvar, t, 'null', t) - self.codewriter.comment('removed malloc(%s) from previous line' % t) - return type_ = self.db.repr_type(arg_type) self.codewriter.malloc(targetvar, type_, atomic=arg_type._is_atomic()) From ale at codespeak.net Wed Sep 28 14:48:40 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Wed, 28 Sep 2005 14:48:40 +0200 (CEST) Subject: [pypy-svn] r17941 - pypy/extradoc/sprintinfo Message-ID: <20050928124840.EC24A27B8E@code1.codespeak.net> Author: ale Date: Wed Sep 28 14:48:40 2005 New Revision: 17941 Modified: pypy/extradoc/sprintinfo/paris-2005-people.txt Log: my info Modified: pypy/extradoc/sprintinfo/paris-2005-people.txt ============================================================================== --- pypy/extradoc/sprintinfo/paris-2005-people.txt (original) +++ pypy/extradoc/sprintinfo/paris-2005-people.txt Wed Sep 28 14:48:40 2005 @@ -23,6 +23,7 @@ Michael Hudson ? ? Carl Friedrich Bolz 7/10? - 16/10 flat? Bert Freudenberg 09/10 - 17/10 ? +Anders Lehmann 09/10 - 14/10 Hotel Porte de Versailles =================== ============== ===================== People on the following list are likely to come and were @@ -36,5 +37,4 @@ Richard Emslie ? ? Anders Chrigstroem ? ? Christian Tismer ? ? -Anders Lehmann ? ? =================== ============== ===================== From ericvrp at codespeak.net Wed Sep 28 15:51:56 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Wed, 28 Sep 2005 15:51:56 +0200 (CEST) Subject: [pypy-svn] r17943 - in pypy/dist/pypy/translator/llvm: . 
backendopt module Message-ID: <20050928135156.E00D327B93@code1.codespeak.net> Author: ericvrp Date: Wed Sep 28 15:51:55 2005 New Revision: 17943 Modified: pypy/dist/pypy/translator/llvm/backendopt/exception.py pypy/dist/pypy/translator/llvm/backendopt/mergemallocs.py pypy/dist/pypy/translator/llvm/backendopt/removeexcmallocs.py pypy/dist/pypy/translator/llvm/exception.py pypy/dist/pypy/translator/llvm/funcnode.py pypy/dist/pypy/translator/llvm/gc.py pypy/dist/pypy/translator/llvm/module/extfunction.py pypy/dist/pypy/translator/llvm/opwriter.py Log: Added ringbuffer for data that looks like an Exception or Error. It seems to work. Moving to another machine now to see how fast it actually is. Modified: pypy/dist/pypy/translator/llvm/backendopt/exception.py ============================================================================== --- pypy/dist/pypy/translator/llvm/backendopt/exception.py (original) +++ pypy/dist/pypy/translator/llvm/backendopt/exception.py Wed Sep 28 15:51:55 2005 @@ -15,7 +15,7 @@ Because of the added exitswitch we need an additional block. """ global n_calls, n_calls_patched - n_calls_begin = n_calls + n_calls_patched_begin = n_calls_patched e = translator.rtyper.getexceptiondata() blocks = [x for x in flatten(graph) if isinstance(x, Block)] for block in blocks: @@ -58,5 +58,6 @@ l.prevblock = block l.exitcase = l.llexitcase = False block.exits.insert(0, l) #False case needs to go first - if n_calls != n_calls_begin: - print 'create_exception_handling: patched %d out of %d calls' % (n_calls_patched, n_calls) + #if n_calls_patched != n_calls_patched_begin: + # print 'create_exception_handling: patched %d out of %d calls' % (n_calls_patched, n_calls) + return n_calls_patched_begin - n_calls_patched Modified: pypy/dist/pypy/translator/llvm/backendopt/mergemallocs.py ============================================================================== --- pypy/dist/pypy/translator/llvm/backendopt/mergemallocs.py (original) +++ pypy/dist/pypy/translator/llvm/backendopt/mergemallocs.py Wed Sep 28 15:51:55 2005 @@ -11,6 +11,7 @@ warning: some will consider this a dirty hack, that's ok! :) """ + n_times_merged = 0 blocks = [x for x in flatten(graph) if isinstance(x, Block)] for block in blocks: n_mallocs_in_block = 0 @@ -20,3 +21,5 @@ n_mallocs_in_block += 1 if n_mallocs_in_block >= 2: print 'merge_mallocs: n_mallocs_in_block=%d' % n_mallocs_in_block + n_times_merged += 1 + return n_times_merged Modified: pypy/dist/pypy/translator/llvm/backendopt/removeexcmallocs.py ============================================================================== --- pypy/dist/pypy/translator/llvm/backendopt/removeexcmallocs.py (original) +++ pypy/dist/pypy/translator/llvm/backendopt/removeexcmallocs.py Wed Sep 28 15:51:55 2005 @@ -2,10 +2,6 @@ from pypy.translator.backendopt.inline import _find_exception_type -def _llvm_structsize(struct): - #XXX TODO take a save guess - return 16 - def remove_exception_mallocs(translator, graph, ringbuffer_entry_maxsize=16, ringbuffer_n_entries=1024): """Remove mallocs that occur because an exception is raised. Typically this data is shortlived and occuring often in highlevel @@ -20,18 +16,19 @@ warning: this code will not work when your code references an exception instance 'long' after it has been raised. 
""" + n_removed = 0 blocks = [x for x in flatten(graph) if isinstance(x, Block)] for block in blocks: ops = block.operations - if (len(ops) < 3 or - ops[0].opname != 'malloc' or ops[1].opname != 'cast_pointer' or - ops[2].opname != 'setfield' or ops[2].args[1].value != 'typeptr' or - not isinstance(ops[2].args[1], Constant) or - _llvm_structsize(ops[0].args[0]) > ringbuffer_entry_maxsize): #todo: ops[2].args[2] to vtable + if len(ops) < 3 or \ + ops[0].opname != 'malloc' or ops[1].opname != 'cast_pointer' or \ + ops[2].opname != 'setfield' or ops[2].args[1].value != 'typeptr' or \ + not isinstance(ops[2].args[1], Constant): + continue + name = str(ops[0].args[0]) + if 'Exception' not in name and 'Error' not in name: #XXX better to look at the actual structure continue - print 'remove_exception_malloc: ', str(ops[0].args[0]), ops[2].args[2] - #ops = [SpaceOperation('ops[0].result = load sbyte** %exception_ringbuffer'), - # SpaceOperation('%tmpptr.0 = add sbyte* ops[0].result, ringbuffer_entry_maxsize'), - # SpaceOperation('%tmpptr.1 = and sbyte* tmpptr.0, ~(ringbuffer_n_entries*ringbuffer_entry_maxsize)'), - # SpaceOperation('store sbyte* %tmpptr.1, sbyte** %exception_ringbuffer), - # ops[1:]] + print 'remove_exception_malloc: ', name + ops[0].opname = 'malloc_exception' #XXX refactor later to not use a new operationtype + n_removed += 1 + return n_removed Modified: pypy/dist/pypy/translator/llvm/exception.py ============================================================================== --- pypy/dist/pypy/translator/llvm/exception.py (original) +++ pypy/dist/pypy/translator/llvm/exception.py Wed Sep 28 15:51:55 2005 @@ -2,8 +2,27 @@ class ExceptionPolicy: + RINGBUGGER_SIZE = 8192 RINGBUFFER_ENTRY_MAXSIZE = 16 - RINGBUGGER_N_ENTRIES = 1024 + RINGBUGGER_OVERSIZE = RINGBUGGER_SIZE + RINGBUFFER_ENTRY_MAXSIZE + RINGBUFFER_LLVMCODE = ''' +internal fastcc sbyte* %%malloc_exception(uint %%nbytes) { + %%cond = setle uint %%nbytes, %d + br bool %%cond, label %%then, label %%else + +then: + %%tmp.3 = load uint* %%exception_ringbuffer_index + %%tmp.4 = getelementptr [%d x sbyte]* %%exception_ringbuffer, int 0, uint %%tmp.3 + %%tmp.6 = add uint %%tmp.3, %%nbytes + %%tmp.7 = and uint %%tmp.6, %d + store uint %%tmp.7, uint* %%exception_ringbuffer_index + ret sbyte* %%tmp.4 + +else: + %%tmp.8 = call ccc sbyte* %%GC_malloc(uint %%nbytes) + ret sbyte* %%tmp.8 +} +''' % (RINGBUFFER_ENTRY_MAXSIZE, RINGBUGGER_OVERSIZE, RINGBUGGER_SIZE-1) def __init__(self): raise Exception, 'ExceptionPolicy should not be used directly' @@ -81,7 +100,7 @@ internal fastcc void %%unwind() { unwind } -''' % locals() +''' % locals() + self.RINGBUFFER_LLVMCODE def invoke(self, codewriter, targetvar, tail_, cconv, returntype, functionref, args, label, except_label): labels = 'to label %%%s except label %%%s' % (label, except_label) @@ -174,7 +193,7 @@ internal fastcc void %%unwind() { ret void } -''' % locals() +''' % locals() + self.RINGBUFFER_LLVMCODE def transform(self, translator, graph=None): from pypy.translator.llvm.backendopt.exception import create_exception_handling Modified: pypy/dist/pypy/translator/llvm/funcnode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/funcnode.py (original) +++ pypy/dist/pypy/translator/llvm/funcnode.py Wed Sep 28 15:51:55 2005 @@ -40,7 +40,12 @@ self.ref = self.make_ref('%pypy_', value.graph.name) self.graph = value.graph self.db.genllvm.exceptionpolicy.transform(self.db.translator, self.graph) - 
#remove_exception_mallocs(self.db.translator, self.graph) + if remove_exception_mallocs(self.db.translator, self.graph): + print ' from function', self.ref + import sys + sys.stdout.flush() + #if self.ref not in ('%pypy_ll_raise_OSError__Signed', '%pypy_getitem'): + # self.db.translator.view() #merge_mallocs(self.db.translator, self.graph) remove_double_links(self.db.translator, self.graph) Modified: pypy/dist/pypy/translator/llvm/gc.py ============================================================================== --- pypy/dist/pypy/translator/llvm/gc.py (original) +++ pypy/dist/pypy/translator/llvm/gc.py Wed Sep 28 15:51:55 2005 @@ -10,8 +10,6 @@ def malloc(self, targetvar, type_, size, is_atomic, word, uword): s = str(size) - if s == '0': - return '%(targetvar)s = cast %(type_)s* null to %(type_)s* ;was malloc 0 bytes' % locals() return '%(targetvar)s = malloc %(type_)s, uint %(s)s' % locals() def pyrex_code(self): @@ -61,8 +59,6 @@ def malloc(self, targetvar, type_, size, is_atomic, word, uword): s = str(size) - if s == '0': - return '%(targetvar)s = cast %(type_)s* null to %(type_)s* ;was malloc 0 bytes' % locals() self.n_malloced += 1 cnt = '.%d' % self.n_malloced atomic = is_atomic and '_atomic' or '' Modified: pypy/dist/pypy/translator/llvm/module/extfunction.py ============================================================================== --- pypy/dist/pypy/translator/llvm/module/extfunction.py (original) +++ pypy/dist/pypy/translator/llvm/module/extfunction.py Wed Sep 28 15:51:55 2005 @@ -2,8 +2,9 @@ %last_exception_type = internal global %RPYTHON_EXCEPTION_VTABLE* null %last_exception_value = internal global %RPYTHON_EXCEPTION* null -%exception_ringbuffer = internal global [8192 x sbyte] zeroinitializer -%exception_ringbuffer_index = internal global int 0 +;8208=8192+16 in the next line because the last one (16 bytes maxsize) might start at 8190 for instance. 
+%exception_ringbuffer = internal global [8208 x sbyte] zeroinitializer +%exception_ringbuffer_index = internal global uint 0 ''' extfunctions = {} #dependencies, llvm-code Modified: pypy/dist/pypy/translator/llvm/opwriter.py ============================================================================== --- pypy/dist/pypy/translator/llvm/opwriter.py (original) +++ pypy/dist/pypy/translator/llvm/opwriter.py Wed Sep 28 15:51:55 2005 @@ -387,6 +387,18 @@ ep.reraise(self.node, self.codewriter) ep.fetch_exceptions(self.codewriter, exc_found_labels, lltype_of_exception_type, lltype_of_exception_value) + def malloc_exception(self, op): + arg_type = op.args[0].value + targetvar = self.db.repr_arg(op.result) + type_ = self.db.repr_type(arg_type) + tmpvar1 = self.db.repr_tmpvar() + tmpvar2 = self.db.repr_tmpvar() + tmpvar3 = self.db.repr_tmpvar() + self.codewriter.indent('%(tmpvar1)s = getelementptr %(type_)s* null, int 1' % locals()) + self.codewriter.cast(tmpvar2, type_+'*', tmpvar1, 'uint') + self.codewriter.call(tmpvar3, 'sbyte*', '%malloc_exception', [tmpvar2], ['uint']) + self.codewriter.cast(targetvar, 'sbyte*', tmpvar3, type_+'*') + def malloc(self, op): arg_type = op.args[0].value targetvar = self.db.repr_arg(op.result) @@ -397,7 +409,7 @@ arg_type = op.args[0].value if isinstance(arg_type, lltype.Array) and arg_type.OF is lltype.Void: # This is a backend decision to NOT represent a void array with - # anything and save space - therefore not varsizeda anymore + # anything and save space - therefore not varsized anymore self.malloc(op) return From pedronis at codespeak.net Wed Sep 28 19:55:34 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Wed, 28 Sep 2005 19:55:34 +0200 (CEST) Subject: [pypy-svn] r17946 - in pypy/dist/pypy/translator/tool: . 
test Message-ID: <20050928175534.40F9E27B7B@code1.codespeak.net> Author: pedronis Date: Wed Sep 28 19:55:31 2005 New Revision: 17946 Modified: pypy/dist/pypy/translator/tool/taskengine.py pypy/dist/pypy/translator/tool/test/test_taskengine.py Log: added helper to find strong dependecies from a task Modified: pypy/dist/pypy/translator/tool/taskengine.py ============================================================================== --- pypy/dist/pypy/translator/tool/taskengine.py (original) +++ pypy/dist/pypy/translator/tool/taskengine.py Wed Sep 28 19:55:31 2005 @@ -80,6 +80,13 @@ return plan + def _depending_on(self, goal): + l = [] + for task_name, (task, task_deps) in self.tasks.iteritems(): + if goal in task_deps: + l.append(task_name) + return l + def _execute(self, goals, *args, **kwds): task_skip = kwds.get('task_skip', []) for goal in self._plan(goals, skip=task_skip): @@ -104,27 +111,5 @@ pass -""" sketch of tasks for translation: - -annotate: # includes annotation and annotatation simplifications - -rtype: annotate - -backendoptimisations: rtype # make little sense otherwise - -source_llvm: backendoptimisations, rtype, annotate - -source_c: ?backendoptimisations, ?rtype, ?annotate - -compile_c : source_c - -compile_llvm: source_llvm - -run_c: compile_c - -run_llvm: compile_llvm - -""" - Modified: pypy/dist/pypy/translator/tool/test/test_taskengine.py ============================================================================== --- pypy/dist/pypy/translator/tool/test/test_taskengine.py (original) +++ pypy/dist/pypy/translator/tool/test/test_taskengine.py Wed Sep 28 19:55:31 2005 @@ -23,6 +23,10 @@ assert abc._plan('C') == ['B', 'C'] assert abc._plan('A') == ['B', 'C', 'A'] assert abc._plan('A', skip=['C']) == ['B', 'A'] + + assert abc._depending_on('C') == [] + assert dict.fromkeys(abc._depending_on('B'), True) == {'A':True, 'C':True} + assert abc._depending_on('A') == [] def test_execute(): From mwh at codespeak.net Wed Sep 28 21:49:10 2005 From: mwh at codespeak.net (mwh at codespeak.net) Date: Wed, 28 Sep 2005 21:49:10 +0200 (CEST) Subject: [pypy-svn] r17949 - pypy/dist/pypy/translator/c Message-ID: <20050928194910.5F3AD27B7B@code1.codespeak.net> Author: mwh Date: Wed Sep 28 21:49:09 2005 New Revision: 17949 Modified: pypy/dist/pypy/translator/c/funcgen.py Log: tyop Modified: pypy/dist/pypy/translator/c/funcgen.py ============================================================================== --- pypy/dist/pypy/translator/c/funcgen.py (original) +++ pypy/dist/pypy/translator/c/funcgen.py Wed Sep 28 21:49:09 2005 @@ -33,7 +33,7 @@ # collect all variables and constants used in the body, # and get their types now # - # NOTE: cannot use dictionaries with Constants has keys, because + # NOTE: cannot use dictionaries with Constants as keys, because # Constants may hash and compare equal but have different lltypes mix = [] self.more_ll_values = [] From tismer at codespeak.net Wed Sep 28 22:34:25 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Wed, 28 Sep 2005 22:34:25 +0200 (CEST) Subject: [pypy-svn] r17950 - pypy/dist/pypy/translator/goal Message-ID: <20050928203425.2215C27B7B@code1.codespeak.net> Author: tismer Date: Wed Sep 28 22:34:24 2005 New Revision: 17950 Modified: pypy/dist/pypy/translator/goal/bench-windows.py Log: smal improvement in size calculation of the CPython executable Modified: pypy/dist/pypy/translator/goal/bench-windows.py ============================================================================== --- 
pypy/dist/pypy/translator/goal/bench-windows.py (original) +++ pypy/dist/pypy/translator/goal/bench-windows.py Wed Sep 28 22:34:24 2005 @@ -88,16 +88,16 @@ return bench_exe def run_version_size(executable=reference('python'), *args): - ver, size = run_cmd('''%s -c "import sys,os;print sys.version.split()[0],\\ - os.path.getsize(sys.executable)"''' - % executable).split() + ver, size, dll = run_cmd('%s -c "import sys, os; print sys.version.split()[0], ' + 'os.path.getsize(sys.executable), sys.dllhandle"' + % executable).split() size = int(size) try: - sys.dllhandle - except AttributeError: + import win32api + except ImportError: pass else: - size += os.path.getsize(win32api.GetModuleFileName(sys.dllhandle)) + size += os.path.getsize(win32api.GetModuleFileName(int(dll))) return ver, size def run_pystone(executable=reference('python'), n=0): From arigo at codespeak.net Wed Sep 28 23:32:14 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Wed, 28 Sep 2005 23:32:14 +0200 (CEST) Subject: [pypy-svn] r17952 - in pypy/dist/pypy: interpreter objspace/std Message-ID: <20050928213214.0C60227B7B@code1.codespeak.net> Author: arigo Date: Wed Sep 28 23:32:14 2005 New Revision: 17952 Modified: pypy/dist/pypy/interpreter/baseobjspace.py pypy/dist/pypy/interpreter/mixedmodule.py pypy/dist/pypy/objspace/std/objspace.py Log: Shortcut for performance: getdictvalue() can return None faster, using a space.finditem() that is a non-raising version of space.getitem(). finditem() is implemented as a try:except: around a space.getitem(), with a shortcut for the common case of W_DictObject. Modified: pypy/dist/pypy/interpreter/baseobjspace.py ============================================================================== --- pypy/dist/pypy/interpreter/baseobjspace.py (original) +++ pypy/dist/pypy/interpreter/baseobjspace.py Wed Sep 28 23:32:14 2005 @@ -21,11 +21,7 @@ def getdictvalue(self, space, attr): w_dict = self.getdict() if w_dict is not None: - try: - return space.getitem(w_dict, space.wrap(attr)) - except OperationError, e: - if not e.match(space, space.w_KeyError): - raise + return space.finditem(w_dict, space.wrap(attr)) return None def setdict(self, space, w_dict): @@ -373,6 +369,14 @@ """shortcut for space.int_w(space.hash(w_obj))""" return self.int_w(self.hash(w_obj)) + def finditem(self, w_obj, w_key): + try: + return self.getitem(w_obj, w_key) + except OperationError, e: + if e.match(self, self.w_KeyError): + return None + raise + def newbool(self, b): if b: return self.w_True Modified: pypy/dist/pypy/interpreter/mixedmodule.py ============================================================================== --- pypy/dist/pypy/interpreter/mixedmodule.py (original) +++ pypy/dist/pypy/interpreter/mixedmodule.py Wed Sep 28 23:32:14 2005 @@ -31,13 +31,8 @@ return self.space.call_function(w_builtin, *args_w) def getdictvalue(self, space, name): - try: - return space.getitem(self.w_dict, space.wrap(name)) - except OperationError, e: - if not e.match(space, space.w_KeyError): - raise - if not self.lazy: - return None + w_value = space.finditem(self.w_dict, space.wrap(name)) + if self.lazy and w_value is None: try: loader = self.loaders[name] except KeyError: @@ -57,7 +52,7 @@ func._builtinversion_ = bltin w_value = space.wrap(bltin) space.setitem(self.w_dict, space.wrap(name), w_value) - return w_value + return w_value def getdict(self): if self.lazy: Modified: pypy/dist/pypy/objspace/std/objspace.py ============================================================================== --- 
pypy/dist/pypy/objspace/std/objspace.py (original) +++ pypy/dist/pypy/objspace/std/objspace.py Wed Sep 28 23:32:14 2005 @@ -397,11 +397,18 @@ def is_true(self, w_obj): # XXX don't look! - if isinstance(w_obj, W_DictObject): + if type(w_obj) is W_DictObject: return len(w_obj.content) != 0 else: return DescrOperation.is_true(self, w_obj) + def finditem(self, w_obj, w_key): + # performance shortcut to avoid creating the OperationError(KeyError) + if type(w_obj) is W_DictObject: + return w_obj.content.get(w_key, None) + else: + return ObjSpace.finditem(self, w_obj, w_key) + # support for the deprecated __getslice__, __setslice__, __delslice__ def getslice(self, w_obj, w_start, w_stop): From arigo at codespeak.net Thu Sep 29 00:06:00 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 29 Sep 2005 00:06:00 +0200 (CEST) Subject: [pypy-svn] r17953 - pypy/dist/pypy/translator/c/test Message-ID: <20050928220600.D8B8427B7B@code1.codespeak.net> Author: arigo Date: Thu Sep 29 00:05:57 2005 New Revision: 17953 Modified: pypy/dist/pypy/translator/c/test/test_extfunc.py Log: Difficult to test properly the complex interactions between the refcounting of the parent and in the child thread of the object passed as an argument between them. The problem is that the only clean solution is like os_thread.py: the ownership of 'arg' must be passed to the child thread, which eventually releases it when it terminates. But for the tests the main thread has no clean way to wait until the child thread is entierely finished and has released 'arg'. So sometimes, an 'assert mallocs == frees' triggers. Modified: pypy/dist/pypy/translator/c/test/test_extfunc.py ============================================================================== --- pypy/dist/pypy/translator/c/test/test_extfunc.py (original) +++ pypy/dist/pypy/translator/c/test/test_extfunc.py Thu Sep 29 00:05:57 2005 @@ -332,11 +332,11 @@ assert arg.value == 42 def myotherthreadedfunction(arg): assert arg.value == 43 + a42 = Arg() + a42.value = 42 + a43 = Arg() + a43.value = 43 def fn(i): - a42 = Arg() - a42.value = 42 - a43 = Arg() - a43.value = 43 thread.start_new_thread(mythreadedfunction, (a42,)) thread.start_new_thread(myotherthreadedfunction, (a43,)) if i == 1: @@ -356,14 +356,14 @@ import pypy.module.thread.rpython.exttable # for declare()/declaretype() class Arg: pass + a = Arg() + a.x = 5 + a.lock = thread.allocate_lock() def mythreadedfunction(arg): arg.x += 37 arg.myident = thread.get_ident() arg.lock.release() def fn(): - a = Arg() - a.x = 5 - a.lock = thread.allocate_lock() a.lock.acquire(True) ident = thread.start_new_thread(mythreadedfunction, (a,)) assert ident != thread.get_ident() From pedronis at codespeak.net Thu Sep 29 00:06:40 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Thu, 29 Sep 2005 00:06:40 +0200 (CEST) Subject: [pypy-svn] r17954 - in pypy/dist/pypy/translator: goal tool Message-ID: <20050928220640.C4F5527B71@code1.codespeak.net> Author: pedronis Date: Thu Sep 29 00:06:37 2005 New Revision: 17954 Added: pypy/dist/pypy/translator/goal/translation.py - copied, changed from r17944, pypy/dist/pypy/translator/goal/translate_pypy_new.py Modified: pypy/dist/pypy/translator/tool/taskengine.py Log: intermediate checkin for rename :( Modified: pypy/dist/pypy/translator/tool/taskengine.py ============================================================================== --- pypy/dist/pypy/translator/tool/taskengine.py (original) +++ pypy/dist/pypy/translator/tool/taskengine.py Thu Sep 29 00:06:37 2005 @@ 
-1,6 +1,6 @@ -class SimpleTaskEngine: +class SimpleTaskEngine(object): def __init__(self): self._plan_cache = {} From pedronis at codespeak.net Thu Sep 29 00:07:12 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Thu, 29 Sep 2005 00:07:12 +0200 (CEST) Subject: [pypy-svn] r17955 - pypy/dist/pypy/translator/goal Message-ID: <20050928220712.A132627B7C@code1.codespeak.net> Author: pedronis Date: Thu Sep 29 00:07:10 2005 New Revision: 17955 Added: pypy/dist/pypy/translator/goal/driver.py - copied unchanged from r17954, pypy/dist/pypy/translator/goal/translation.py Removed: pypy/dist/pypy/translator/goal/translation.py Log: rename translation.py to driver.py for now From tismer at codespeak.net Thu Sep 29 00:25:39 2005 From: tismer at codespeak.net (tismer at codespeak.net) Date: Thu, 29 Sep 2005 00:25:39 +0200 (CEST) Subject: [pypy-svn] r17957 - pypy/dist/pypy/translator/goal Message-ID: <20050928222539.0882627B7C@code1.codespeak.net> Author: tismer Date: Thu Sep 29 00:25:38 2005 New Revision: 17957 Modified: pypy/dist/pypy/translator/goal/bench-windows.py Log: current results, now compared against true python 2.4.1 be aware: python 2.5a2 is faster. Let's study what they did. Modified: pypy/dist/pypy/translator/goal/bench-windows.py ============================================================================== --- pypy/dist/pypy/translator/goal/bench-windows.py (original) +++ pypy/dist/pypy/translator/goal/bench-windows.py Thu Sep 29 00:25:38 2005 @@ -9,25 +9,23 @@ # subprocess into lib and change line 392 to use win32 current_result = """ -executable a.rich a.stone r.rich r.stone size -pypy-c-17439-hi 37413 678.4 48.2 61.6 5.65 -pypy-c-17600-lo 26352 906.2 33.9 46.1 6.43 -pypy-c-17634-lo 20108 1023.5 25.9 40.9 6.42 -pypy-c-17649-lo 22612 1042.0 29.1 40.1 6.41 -pypy-c-17674-lo 19248 1358.8 24.8 30.8 6.40 -pypy-c-17674-hi 12402 1941.4 16.0 21.5 7.37 -pypy-c-17439-lo 29638 971.4 38.1 43.0 6.49 -pypy-c-17707-hi 14095 2092.7 18.1 20.0 7.37 -pypy-c-17707-lo 19102 1354.7 24.6 30.9 6.40 -pypy-c-17707-lo-range 18786 2800.8 24.2 14.9 6.40 -pypy-c-17707-hi-range 13980 2899.9 18.0 14.4 7.38 -pypy-c-17743-hi 13944 2800.3 17.9 14.9 7.30 -pypy-c-17761-hi-samuele 13243 2983.3 17.0 14.0 7.69 -python 2.5a0 777 41812.1 1.0 1.0 0.96 - -This new version also shows the size of the plain executable. -Samuele's locality patch now has a nice impact of over 5 percent. -I had even expected a bit more, but fine, yeah! 
+executable richards pystone size (MB) +pypy-c-17439-hi 37413 47.8x 678.4 60.5x 5.65 +pypy-c-17600-lo 26352 33.7x 906.2 45.3x 6.43 +pypy-c-17634-lo 20108 25.7x 1023.5 40.1x 6.42 +pypy-c-17649-lo 22612 28.9x 1042.0 39.4x 6.41 +pypy-c-17674-lo 19248 24.6x 1358.8 30.2x 6.40 +pypy-c-17674-hi 12402 15.9x 1941.4 21.1x 7.37 +pypy-c-17439-lo 29638 37.9x 971.4 42.3x 6.49 +pypy-c-17707-hi 14095 18.0x 2092.7 19.6x 7.37 +pypy-c-17707-lo 19102 24.4x 1354.7 30.3x 6.40 +pypy-c-17707-lo-range 18786 24.0x 2800.8 14.7x 6.40 +pypy-c-17707-hi-range 13980 17.9x 2899.9 14.2x 7.38 +pypy-c-17743-hi 13944 17.8x 2800.3 14.7x 7.30 +pypy-c-17761-hi-samuele 13243 16.9x 2983.3 13.8x 7.69 +pypy-c-17794-ref-crash 41088 52.5x 1084.5 37.9x 14.62 +pypy-c-17950-hi 12888 16.5x 3203.0 12.8x 5.49 +python 2.4.1 782 1.0x 41058.3 1.0x 0.96 """ import os, sys, pickle, md5 @@ -131,9 +129,9 @@ pickle.dump(dic, file(statfile, 'wb')) HEADLINE = '''\ -executable a.rich a.stone r.rich r.stone size''' +executable richards pystone size (MB)''' FMT = '''\ -%-27s ''' + '%5d %7.1f ' + '%5.1f %5.1f %5.2f' +%-27s''' + '%5d %5.1fx %7.1f %5.1fx %5.2f' def main(): print 'getting the richards reference' @@ -163,7 +161,7 @@ res.sort() print HEADLINE for mtime, exe, size, rich, stone in res: - print FMT % (exe, rich, stone, rich / ref_rich, ref_stone / stone, + print FMT % (exe, rich, rich / ref_rich, stone, ref_stone / stone, size / float(1024 * 1024)) if __name__ == '__main__': From hpk at codespeak.net Thu Sep 29 11:44:54 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 29 Sep 2005 11:44:54 +0200 (CEST) Subject: [pypy-svn] r17967 - pypy/extradoc/talk/22c3 Message-ID: <20050929094454.E00E827B72@code1.codespeak.net> Author: hpk Date: Thu Sep 29 11:44:53 2005 New Revision: 17967 Added: pypy/extradoc/talk/22c3/ (props changed) pypy/extradoc/talk/22c3/pypy-dev-talk.txt (contents, props changed) pypy/extradoc/talk/22c3/speaker-holgerkrekel.txt (contents, props changed) Log: a first draft of the 22C3 proposal, everybody who wants to participate: coordinate with me and Carl and most important: add your speaker information! comments on the talk scope welcome. I know that it's pretty stuffed but i also think that it doesn't make sense (as opposed to EuroPython conferences) to go into too many details. Rather i'd like to touch areas and allow people to have questions. (The talk is 60 minutes overall, but 45 minutes and then questions is recommended) Added: pypy/extradoc/talk/22c3/pypy-dev-talk.txt ============================================================================== --- (empty file) +++ pypy/extradoc/talk/22c3/pypy-dev-talk.txt Thu Sep 29 11:44:53 2005 @@ -0,0 +1,73 @@ +Reference/Call For Papers: http://www.ccc.de/congress/2005/cfp.html +DEADLINE: 31st September 2005 (friday) + +Title: PyPy - the new Python implementation on the block + +Subtitle: Language/VM R&D, whole program type inference, + translation to low level backends, fun. + +Section: Hacking + +Talkers: ... + +Abstract (max 250 letters): + + We'll present results of our first self-contained translated + Python virtual machine using parts of itself ("the muenchhausen + approach"). We are going to relate our research/technical results + to other language-research aspects found with Perl 6 and Haskell. + + Development of PyPy is partly funded by the European Union + during the 6th Research Framework programme. + +Description (250-500 words): + + PyPy is a reimplementation of Python written in Python + itself, flexible and easy to experiment with. 
Our + long-term goals are to target a large variety of + platforms, small and large, by providing a compiler + toolsuite that can produce custom Python versions. + Platform, Memory and Threading models are becoming + aspects of the translation process - as opposed to + encoding low level details into a language implementation + itself. + + We are going to briefly describe the concepts of object spaces, + abstract interpretation and translation aspects and + how they lead us to producing a first self-contained + very compliant Python implementation August 2005, + completely independent from the current mainstream + CPython implementation. We go through a translation + example of a Python program with control-flow-graphs + and the according translated lowlevel C and + LLVM (Low level Virtual Machine) code. + + We'll also try to relate PyPy's architectural concepts + (known roughly for 2-3 years) to similar upcoming concepts + in e.g. pugs/Perl 6 development and we'll give an + outlook on our starting Just-In-Time Compiler efforts + and approaches. + + Lastly, we intend to discuss experimental new language/interpreter-level + solutions to long-standing problems such as distributed computing, + persistence and security/sandboxing. + +Statement: We intend to submit a paper (PDF) for the 22C3 proceedings. +Statement: We intend to submit a slides PDF as well. + +Duration of your talk: 45 minutes + questions + +Language of your talk: english + +Links to background information on the talk: http://codespeak.net/pypy + +Target Group: Advanced Users, Pros + +Resources you need for your talk: digital projector, internet + +Related talks at 22C3 you know of: ... + +A lecture logo, square format, min. 128x128 pixels (optional): + http://codespeak.net/pypy/img/py-web1.png + (please scale it down a bit :-) + Added: pypy/extradoc/talk/22c3/speaker-holgerkrekel.txt ============================================================================== --- (empty file) +++ pypy/extradoc/talk/22c3/speaker-holgerkrekel.txt Thu Sep 29 11:44:53 2005 @@ -0,0 +1,43 @@ +Reference/Call For Papers: http://www.ccc.de/congress/2005/cfp.html +DEADLINE: 31st September 2005 (friday) + +Name: holger krekel + +Public Name: holger krekel + +Other Names: hpk on irc.freenode.org + +Primary E-Mail address: krekel at merlinux.de + +Phone number(s): +49 171 464 8622 + +A photo, square format, min. 128x128 pixels (optional): + http://merlinux.de/~hpk/holger.jpg + + (maybe i can find a better one, this is the one from + last 21C3 conference) + +Statement: publishing contact info except for the phone number + is fine with me. + +Public home page, weblog and other speaker-related websites: + + http://codespeak.net/pypy + +Short Info: + + Holger Krekel, founder and core developer of PyPy, technical lead merlinux GmbH. + +Bio: + + Holger started with game and general computer hacking ages ago, went through + university, some bank/insurance contracting until finally arriving at working + in free software environments fulltime now. He founded merlinux in 2004, which + is operating with and employing a world-wide set of open-source specialists. + Other merits are a well known python testing framework (http://codespeak.net/py) + and operating the http://codespeak.net open-source hosting site itself. + +Postal address: merlinux GmbH, Holger Krekel, Steinbergstr. 42, 31139 Hildesheim +Bank information: +Expected day of arrival and departure: 27th-30th December. 
+ From ac at codespeak.net Thu Sep 29 13:40:02 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Thu, 29 Sep 2005 13:40:02 +0200 (CEST) Subject: [pypy-svn] r17969 - pypy/dist/pypy/module/_codecs Message-ID: <20050929114002.9DCD227B72@code1.codespeak.net> Author: ac Date: Thu Sep 29 13:40:02 2005 New Revision: 17969 Modified: pypy/dist/pypy/module/_codecs/app_codecs.py Log: Fix bug with mixing char and unichar in result. Modified: pypy/dist/pypy/module/_codecs/app_codecs.py ============================================================================== --- pypy/dist/pypy/module/_codecs/app_codecs.py (original) +++ pypy/dist/pypy/module/_codecs/app_codecs.py Thu Sep 29 13:40:02 2005 @@ -1382,7 +1382,7 @@ while (pos < size): ## /* Non-escape characters are interpreted as Unicode ordinals */ if (s[pos] != '\\') : - p += s[pos] + p += unichr(ord(s[pos])) pos += 1 continue ## /* \ - Escapes */ @@ -1604,7 +1604,7 @@ if (((pos - bs) & 1) == 0 or pos >= size or (s[pos] != 'u' and s[pos] != 'U')) : - p += s[pos] + p += unichr(ord(s[pos])) pos += 1 continue From ac at codespeak.net Thu Sep 29 14:01:15 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Thu, 29 Sep 2005 14:01:15 +0200 (CEST) Subject: [pypy-svn] r17971 - pypy/dist/pypy/interpreter/pyparser Message-ID: <20050929120115.3D8A427B72@code1.codespeak.net> Author: ac Date: Thu Sep 29 14:01:14 2005 New Revision: 17971 Modified: pypy/dist/pypy/interpreter/pyparser/pythonlexer.py pypy/dist/pypy/interpreter/pyparser/pythonparse.py Log: Refactor sourcecode encoding processing. Modified: pypy/dist/pypy/interpreter/pyparser/pythonlexer.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/pythonlexer.py (original) +++ pypy/dist/pypy/interpreter/pyparser/pythonlexer.py Thu Sep 29 14:01:14 2005 @@ -47,24 +47,6 @@ if encoding != '': return encoding return None - -def _normalize_encoding(encoding): - """returns normalized name for - - see dist/src/Parser/tokenizer.c 'get_normal_name()' - for implementation details / reference - - NOTE: for now, parser.suite() raises a MemoryError when - a bad encoding is used. 
(SF bug #979739) - """ - # lower() + '_' / '-' conversion - encoding = encoding.replace('_', '-').lower() - if encoding.startswith('utf-8'): - return 'utf-8' - for variant in ['latin-1', 'iso-latin-1', 'iso-8859-1']: - if encoding.startswith(variant): - return 'iso-8859-1' - return encoding ################################################################################ from pypy.interpreter.pyparser import pytoken @@ -112,16 +94,11 @@ contline = None indents = [0] last_comment = '' - encoding = None # make the annotator happy pos = -1 lines.append('') # XXX HACK probably not needed # look for the bom (byte-order marker) for utf-8 - # XXX encoding support is incomplete at the moment - if lines[0].startswith('\xEF\xBB\xBF'): - lines[0] = lines[0][3:] - encoding = 'utf-8' # make the annotator happy endDFA = automata.DFA([], []) @@ -175,10 +152,6 @@ if line[pos] == '#': tok = Token(pytoken.COMMENT, line[pos:]) last_comment = line[pos:] - if lnum <= 2 and encoding is None: - encoding = match_encoding_declaration(last_comment) - if encoding is not None: - encoding = _normalize_encoding(encoding) else: tok = Token(pytoken.NL, line[pos:]) last_comment = '' @@ -237,10 +210,6 @@ elif initial == '#': tok = Token(pytoken.COMMENT, token) last_comment = token - if lnum <= 2 and encoding is None: - encoding = match_encoding_declaration(last_comment) - if encoding is not None: - encoding = _normalize_encoding(encoding) # XXX Skip # token_list.append((tok, line, lnum, pos)) # token_list.append((COMMENT, token, spos, epos, line)) elif token in triple_quoted: @@ -317,15 +286,14 @@ #for t in token_list: # print '%20s %-25s %d' % (pytoken.tok_name.get(t[0].codename, '?'), t[0], t[-2]) #print '----------------------------------------- pyparser/pythonlexer.py' - return token_list, encoding + return token_list class PythonSource(TokenSource): """This source uses Jonathan's tokenizer""" def __init__(self, strings, flags=0): # TokenSource.__init__(self) - tokens, encoding = generate_tokens(strings, flags) + tokens = generate_tokens(strings, flags) self.token_stack = tokens - self.encoding = encoding self._current_line = '' # the current line (as a string) self._lineno = -1 self._offset = 0 Modified: pypy/dist/pypy/interpreter/pyparser/pythonparse.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/pythonparse.py (original) +++ pypy/dist/pypy/interpreter/pyparser/pythonparse.py Thu Sep 29 14:01:14 2005 @@ -6,9 +6,10 @@ using file_input, single_input and eval_input targets """ from pypy.interpreter.error import OperationError, debug_print +from pypy.interpreter import gateway from pypy.interpreter.pyparser.error import ParseError from pypy.tool.option import Options -from pythonlexer import Source +from pythonlexer import Source, match_encoding_declaration import pysymbol import ebnfparse import sys @@ -27,8 +28,18 @@ def parse_source(self, textsrc, goal, builder, flags=0): """Parse a python source according to goal""" + # Detect source encoding. 
+ if textsrc[:3] == '\xEF\xBB\xBF': + textsrc = textsrc[3:] + enc = 'utf-8' + else: + enc = _normalize_encoding(_check_for_encoding(textsrc)) + if enc is not None and enc not in ('utf-8', 'iso-8859-1'): + textsrc = _recode_in_utf8(builder.space, textsrc, enc) + lines = [line + '\n' for line in textsrc.split('\n')] - if textsrc.endswith('\n'): + builder.source_encoding = enc + if textsrc[-1] =='\n': lines.pop() flags &= ~PyCF_DONT_IMPLY_DEDENT return self.parse_lines(lines, goal, builder, flags) @@ -37,8 +48,7 @@ goalnumber = pysymbol.sym_values[goal] target = self.rules[goalnumber] src = Source(lines, flags) - builder.source_encoding = src.encoding - + result = target.match(src, builder) if not result: line, lineno = src.debug() @@ -46,6 +56,54 @@ raise ParseError("invalid syntax", lineno, -1, line) # return None return builder + +app_recode_to_utf8 = gateway.applevel(r''' + def app_recode_to_utf8(text, encoding): + return unicode(text, encoding).encode("utf-8") +''').interphook('app_recode_to_utf8') + +def _recode_to_utf8(space, text, encoding): + return space.str_w(app_recode_to_utf8(space, space.wrap(text), + space.wrap(encoding))) +def _normalize_encoding(encoding): + """returns normalized name for + + see dist/src/Parser/tokenizer.c 'get_normal_name()' + for implementation details / reference + + NOTE: for now, parser.suite() raises a MemoryError when + a bad encoding is used. (SF bug #979739) + """ + if encoding is None: + return None + # lower() + '_' / '-' conversion + encoding = encoding.replace('_', '-').lower() + if encoding.startswith('utf-8'): + return 'utf-8' + for variant in ['latin-1', 'iso-latin-1', 'iso-8859-1']: + if encoding.startswith(variant): + return 'iso-8859-1' + return encoding + +def _check_for_encoding(s): + eol = s.find('\n') + if eol == -1: + return _check_line_for_encoding(s) + enc = _check_line_for_encoding(s[:eol]) + eol2 = s.find('\n', eol + 1) + if eol2 == -1: + return _check_line_for_encoding(s[eol + 1:]) + return _check_line_for_encoding(s[eol + 1:eol2]) + +def _check_line_for_encoding(line): + """returns the declared encoding or None""" + i = 0 + for i in range(len(line)): + if line[i] == '#': + break + if line[i] not in ' \t\014': + return None + return match_encoding_declaration(line[i:]) PYTHON_VERSION = ".".join([str(i) for i in sys.version_info[:2]]) def get_grammar_file( version ): From hpk at codespeak.net Thu Sep 29 14:07:15 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Thu, 29 Sep 2005 14:07:15 +0200 (CEST) Subject: [pypy-svn] r17972 - pypy/extradoc/talk Message-ID: <20050929120715.D48C327B72@code1.codespeak.net> Author: hpk Date: Thu Sep 29 14:07:15 2005 New Revision: 17972 Added: pypy/extradoc/talk/conference-attendance.txt Log: let's try to be more systematic about our conference attendance plannings. Here is a start with CCC and Pycon, please insert more conferences/workshops that you deem interesting! Added: pypy/extradoc/talk/conference-attendance.txt ============================================================================== --- (empty file) +++ pypy/extradoc/talk/conference-attendance.txt Thu Sep 29 14:07:15 2005 @@ -0,0 +1,30 @@ + +Conferences Planning for PyPy talks +--------------------------------------- + +CCC 2005 +--------------- + +Chaos Communication Conference 2005, a 2000-3000 people +conference focusing on all things hacking (including +cultural events). Non-Python Audience. 
+ +Call for Papers: http://www.ccc.de/congress/2005/cfp.html +DEADLINE: 31st September 2005 + +see http://codespeak.net/pypy/extradoc/talk/22c3 for +proposal preparations! + +Time & Location: 27th-30th December, Berlin. + + +Pycon 2006 +--------------- + +Annual US Python conference, strong Python audience. + +Call for Papers: http://www.python.org/pycon/2006/cfp +DEADLINE: 31st October 2005 + +Time & location: 24-26 Feburary, 2006, Addison, Texas (near Dallas) + From cfbolz at codespeak.net Thu Sep 29 14:12:45 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Thu, 29 Sep 2005 14:12:45 +0200 (CEST) Subject: [pypy-svn] r17973 - pypy/extradoc/talk/22c3 Message-ID: <20050929121245.72CDF27B99@code1.codespeak.net> Author: cfbolz Date: Thu Sep 29 14:12:43 2005 New Revision: 17973 Added: pypy/extradoc/talk/22c3/speaker-carlfriedrichbolz.txt Modified: pypy/extradoc/talk/22c3/pypy-dev-talk.txt Log: small fixes to description. added my draft of my speaker info. Modified: pypy/extradoc/talk/22c3/pypy-dev-talk.txt ============================================================================== --- pypy/extradoc/talk/22c3/pypy-dev-talk.txt (original) +++ pypy/extradoc/talk/22c3/pypy-dev-talk.txt Thu Sep 29 14:12:43 2005 @@ -12,10 +12,11 @@ Abstract (max 250 letters): - We'll present results of our first self-contained translated - Python virtual machine using parts of itself ("the muenchhausen - approach"). We are going to relate our research/technical results - to other language-research aspects found with Perl 6 and Haskell. + We'll present results of our first self-contained Python virtual + machine that uses parts of itself while translation itself + to a low level language ("the muenchhausen approach"). We are + going to relate our research/technical results to other + language-research aspects found with Perl 6 and Haskell. Development of PyPy is partly funded by the European Union during the 6th Research Framework programme. @@ -27,17 +28,17 @@ long-term goals are to target a large variety of platforms, small and large, by providing a compiler toolsuite that can produce custom Python versions. - Platform, Memory and Threading models are becoming + Platform, Memory and Threading models will become aspects of the translation process - as opposed to encoding low level details into a language implementation - itself. + itself. We are going to briefly describe the concepts of object spaces, abstract interpretation and translation aspects and how they lead us to producing a first self-contained very compliant Python implementation August 2005, completely independent from the current mainstream - CPython implementation. We go through a translation + CPython implementation. We go through a translation example of a Python program with control-flow-graphs and the according translated lowlevel C and LLVM (Low level Virtual Machine) code. Added: pypy/extradoc/talk/22c3/speaker-carlfriedrichbolz.txt ============================================================================== --- (empty file) +++ pypy/extradoc/talk/22c3/speaker-carlfriedrichbolz.txt Thu Sep 29 14:12:43 2005 @@ -0,0 +1,33 @@ +Reference/Call For Papers: http://www.ccc.de/congress/2005/cfp.html +DEADLINE: 31st September 2005 (friday) + +Name: Carl Friedrich Bolz + +Public Name: Carl Friedrich Bolz + +Other Names: cfbolz on irc.freenode.org + +Primary E-Mail address: cfbolz at gmx.de + +Phone number(s): +49 6221 7183177 + +Statement: publishing contact info except for the phone number + is fine with me. 
+
+Short Info:
+
+    Carl Friedrich Bolz, student and developer of PyPy
+
+Bio:
+
+    Carl Friedrich started to program C++ when he was 16, couldn't stand it
+    and turned to Python soon afterwards. He is a PyPy developer since the
+    beginning of 2005. At the moment he is supposed to study mathematics,
+    computer science and sometimes physics at the University of Heidelberg.
+    Since programming is more fun he is taking a semester off to work for
+    merlinux on PyPy and other projects.
+
+Postal address: Carl Friedrich Bolz, Albert-Fritz-Str. 9, 69124 Heidelberg
+Bank information:
+Expected day of arrival and departure: 27th-30th December.
+

From ac at codespeak.net  Thu Sep 29 14:44:50 2005
From: ac at codespeak.net (ac at codespeak.net)
Date: Thu, 29 Sep 2005 14:44:50 +0200 (CEST)
Subject: [pypy-svn] r17974 - pypy/dist/pypy/interpreter/pyparser
Message-ID: <20050929124450.85B5627B7B@code1.codespeak.net>

Author: ac
Date: Thu Sep 29 14:44:50 2005
New Revision: 17974

Modified:
   pypy/dist/pypy/interpreter/pyparser/pythonparse.py
Log:
Fix typo.

Modified: pypy/dist/pypy/interpreter/pyparser/pythonparse.py
==============================================================================
--- pypy/dist/pypy/interpreter/pyparser/pythonparse.py (original)
+++ pypy/dist/pypy/interpreter/pyparser/pythonparse.py Thu Sep 29 14:44:50 2005
@@ -35,7 +35,7 @@
         else:
             enc = _normalize_encoding(_check_for_encoding(textsrc))
             if enc is not None and enc not in ('utf-8', 'iso-8859-1'):
-                textsrc = _recode_in_utf8(builder.space, textsrc, enc)
+                textsrc = _recode_to_utf8(builder.space, textsrc, enc)

         lines = [line + '\n' for line in textsrc.split('\n')]
         builder.source_encoding = enc

From ale at codespeak.net  Thu Sep 29 14:45:11 2005
From: ale at codespeak.net (ale at codespeak.net)
Date: Thu, 29 Sep 2005 14:45:11 +0200 (CEST)
Subject: [pypy-svn] r17975 - pypy/extradoc/minute
Message-ID: <20050929124511.E4FEF27B9E@code1.codespeak.net>

Author: ale
Date: Thu Sep 29 14:45:11 2005
New Revision: 17975

Added:
   pypy/extradoc/minute/pypy-sync-09-29-2005.txt
Log:
Minutes for #pypy-sync meeting 29 of september

Added: pypy/extradoc/minute/pypy-sync-09-29-2005.txt
==============================================================================
--- (empty file)
+++ pypy/extradoc/minute/pypy-sync-09-29-2005.txt Thu Sep 29 14:45:11 2005
@@ -0,0 +1,226 @@
+=============================================
+pypy-sync developer meeting 29th September
+=============================================
+
+Time & location: 1pm (30 minutes) at #pypy-sync
+
+Attendees::
+
+    Samuele Pedroni
+    Anders Lehmann (minutes/moderation)
+    Anders Chrigström
+    Christian Tismer
+    Holger Krekel
+    Eric van Riet Paap
+    Michael Hudson
+    Carl Friedrich Bolz
+    Bert Freudenberg
+
+Regular Topics
+====================
+
+- activity reports (3 prepared lines of info).
+  All Attendees submitted activity reports (see `IRC-Log`_
+  at the end and 'LAST/NEXT/BLOCKERS' entries in particular)
+
+- resolve conflicts/blockers
+  Christian had a problem with debugging of pypy-c (Samuele and Christian will try to solve this).
+  Carl Friedrich has trouble finding a bug (proposed deferred to the Sprint).
+  Holger had a problem finding accommodation in Paris (Holger resolves this).
+
+Topics of the week
+===================
+
+
+1. Pre christmas sprint
+
+   There are possibilities for a sprint in the last week of november/ first week of december in Barcelona or Ireland (Bea) or Bruxelles or Istanbul (Holger) or Finland (Christian).
Nothing firm at this point, but it was agreed that the possibilities should be investigated before Paris so the dates and venue could be fixed at the sprint. + +2. Compliance + + It seems that pypy-c cant run py.test and that several tests in the lib-python testsuite will make pypy-c segfault. It was chosen to not invest time into solving these problems at this point since pypy-c is in such a flux anyway. + +3.Marketing + + The topic was deferred due to lack of time. + + +Next pypy-sync meeting +------------------------------- + +Scheduled for next Thursday, oct. 6th 1300h CET, conducted by Christian Tismer. + +Closing +------------------ + +Anders closes the meeting at 13:31pm. + + +Log: +------------ + + DONE: other work/vacation + NEXT: multi-source compile, usability, stacklessdesign paper + BLOCK: None + so we are starting with our reports ? +??? willingsl is now known as ericvrp + aleale: you are the leader + I'd suggest to start pasting at 13:00 without asking :-) + ok the meeting is open - please go ahead with the reports + LAST: killing duplication, worked on reports, worked on goal + oriented translate_pypy, moving away from print to logging in our + toolchain + NEXT: finish translate_pypy work, more report work, ... + BLOCKERS: - + Prev: DFKI stuff + Next: WP 9 stuff + Blockers: - + last: llvm hacky transforms to see effect of mallocs for exceptions + next: lurk #pypy for paris activities + blockers: - +??? hpk [n=hpk at merlinux.de] has joined #pypy-sync + LAST: astcompiler, making sure we pass compilancy tests with it. + NEXT: More astcompiler, sourcecode encoding and cleaning up. + BLCOKERS: None. + LAST: vacation, optimization fun (constant folding) + NEXT: probably PyPy unrelated? + BLOCKER: can't find my bug in constant folding :-( + last: a bit py lib work, 22C3 proposal, paris-org, emails and + communication + next: EU reports, some dev coordination stuff + blockers: paris appartment planning +??? Topic (#pypy-sync): changed by pedronis: log at + http://tismerysoft.de/pypy/irc-logs/pypy-sync/ +??? mwh [n=user at 82-33-185-193.cable.ubr01.azte.blueyonder.co.uk] has joined + #pypy-sync + I see two blockers + a bug and housing in Paris + ok, three: BLOCKER: can't debug the PyPy crash, yet. + stakkars: sorry + it's probably not possible to do much about mine + aleale: the housing problem cannot be solved here (and it's not anyone + else blocking me ASFAIK :) + mine has a simple solution: implement multi-source, so that's + doable by me + ok, maybe we can look at cfbolz's bug in Paris, make sense? + stakkars: you probaly need info lost from previous stages, I can + help with that + Before we start withe the weekly topics + pedronis: for the multi-source? ok, we'll talk on #pypy + hpk has asked for a status on the ast-compiler, can anyone give a + short report ? + where are the logilab people? + arre: ? + can we maybe raise a topic somewhere about importance of pypy-sync + and attendance? + cfbolz: having lunch ? havent heard from them (or arigo) + I think the last problem to sort out is sourcecode encoding stuff. + stakkars: I agree + arre: that's the only thing left? cool! + After that i think we can start using it as the default. + stakkars: and consortium meetings... + cfbolz: That is unless something ugly turns up. + mwh: yes, I missed that one because I was in vacation and didn't + read enough + arre: sounds good + It's passing the compliancytests as good as sablecompiler. 
+ it would be great if we could get rid of our various compilers and just + have astcompiler (especially for newcomers) + before the paris sprint + and it seems very close to it already (good work, btw!) + I'll do my best! + Any suggestions where we can raise the issue, pypy-dev? + we have raised it here, it will be in the minutes which you can reply + to your invitation + ok,lets move on the announced topics + 1 Pre christmas sprint. I think it would be nice to get this planned + 2 Compliance. Are we able to run the compliance test with + pypy-c/pypy-llvm? + 3 Marketing. It would be nice to haave a press kit aimed at + non-python people + Bea talk about a sprint in Barcelona, any other suggestions ? + barcelona in december sounds nice :) + are there locals, though? + ad 1: at some point, I'd like topropose a sprint in Finland. But + maybe Winter isn't perfect for that. Opinions? + i thinkt he only _concrete_ efforts currently are from bea (Ireland, + Barcelona) and me (Bruxelles, Istanbul) + bruxelles: no answer yet, though + stakkars: if the venue includes a sauna it should be ok + istanbul: unlikely in the december time frame, need to re-check + aleale: sure, there is no Finish house without it + to be honest, we increasingly need more than ideas: concrete + contacts/opportunities, no? + it is the question whether we want it at all, because the travel + is a bit expensive. Sure I have a contact + hpk: yes, I need to plan my christmas vacation + so date are important too + the thing is that i might not be at a december sprint myself (at least + not fully), which makes it dubious if i organize it + We ahve talked about first week of december, right? + right + crosing Nov/Dec would be fine + yes + 5th-9th is a Calibre conference in Bruxelles + (which is why i asked someone in bruxelles) + Ok, I want to conclude that we work on the opportunities in order to + fix date and location in Paris. + time is running + barcelona sounds cosy. + sounds good. everyone can check around. it's usually a matter of + douing the first contact and sorting out possibilities with the + local person + ad 2. I havent been able to run the test suite, Am I the only one? + what is the problem? + ad 2: didn't try that exactly, but we have some serious problems + with memory management + until we support things like dectecting C level stack overflow we + have few chances + It seems that pytest needs something in os, which isnt there + something is quite broken with Boehm, I need to discuss on #pypy + stakkars: Boehm is known to give warning/errors when data is not + cleared + test_str segfaults like several other tests + there is more to it + ok, #pypy later. Will have to step back from this computer in 6 + minutes anyway + Well, time is running. I raised the issue, sounds like something for + #pypy or Paris + is making tests posible a valuable target that I shouldconcentrate + on? + stakkars + ? + so do we need to think about a way to run tests so that we don't crash + the test process on SEG-faults? + (apart from fixing the underlying problem, of course) + could be nice, is it hard to do ? 
+ capturing the crash is probably not too hard to do + somewhat hard, but not impossible + basically it means that we have to run tests (maybe per module or so) + in a separate process and catch such crashes + well, but is what we do for py.py without -E + apart the capturing crashes + I don't think it makese to run both py.test and the tests on pypy + itself at the moment + doesnt sound like something that will before Paris + Ok, we will defer until pypy-c is more stable? + last topic 3. Marketing +? ericvrp/#pypy-sync has to leave in two minutes + we should work on it but it something to discuss in paris + pedronis: i don't fully understand, but we can clarify that on #pypy + that's what I was asking: should I push myself into making it more + stable right now? + stakkars:one thing, it's about raising RuntimeErrors on stack + overflows but it should be done in a clean way +??? SignOff ericvrp: #pypy-sync (Read error: 104 (Connection reset by peer)) + I put thiss topic on because trying to get a journalist made me + realise that what we have for apress kit may be to technical. + sure + But I see that the time is up . So I propose to defer it + Next week Christian wil do the moderating ? + yes, putting the last topic as the first, maybe + Ok, the meeting is ajourned. Thank You all + bye + bye + bye + bye From ac at codespeak.net Thu Sep 29 14:54:41 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Thu, 29 Sep 2005 14:54:41 +0200 (CEST) Subject: [pypy-svn] r17976 - pypy/dist/pypy/interpreter/pyparser Message-ID: <20050929125441.35E2D27BA2@code1.codespeak.net> Author: ac Date: Thu Sep 29 14:54:41 2005 New Revision: 17976 Modified: pypy/dist/pypy/interpreter/pyparser/pythonparse.py Log: Fix some naming issues. Modified: pypy/dist/pypy/interpreter/pyparser/pythonparse.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/pythonparse.py (original) +++ pypy/dist/pypy/interpreter/pyparser/pythonparse.py Thu Sep 29 14:54:41 2005 @@ -35,7 +35,7 @@ else: enc = _normalize_encoding(_check_for_encoding(textsrc)) if enc is not None and enc not in ('utf-8', 'iso-8859-1'): - textsrc = _recode_to_utf8(builder.space, textsrc, enc) + textsrc = recode_to_utf8(builder.space, textsrc, enc) lines = [line + '\n' for line in textsrc.split('\n')] builder.source_encoding = enc @@ -57,13 +57,13 @@ # return None return builder -app_recode_to_utf8 = gateway.applevel(r''' - def app_recode_to_utf8(text, encoding): +_recode_to_utf8 = gateway.applevel(r''' + def _recode_to_utf8(text, encoding): return unicode(text, encoding).encode("utf-8") -''').interphook('app_recode_to_utf8') +''').interphook('_recode_to_utf8') -def _recode_to_utf8(space, text, encoding): - return space.str_w(app_recode_to_utf8(space, space.wrap(text), +def recode_to_utf8(space, text, encoding): + return space.str_w(_recode_to_utf8(space, space.wrap(text), space.wrap(encoding))) def _normalize_encoding(encoding): """returns normalized name for From pedronis at codespeak.net Thu Sep 29 15:21:30 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Thu, 29 Sep 2005 15:21:30 +0200 (CEST) Subject: [pypy-svn] r17977 - pypy/dist/pypy/translator/tool Message-ID: <20050929132130.7851527BA2@code1.codespeak.net> Author: pedronis Date: Thu Sep 29 15:21:29 2005 New Revision: 17977 Modified: pypy/dist/pypy/translator/tool/util.py Log: unneeded import Modified: pypy/dist/pypy/translator/tool/util.py 
============================================================================== --- pypy/dist/pypy/translator/tool/util.py (original) +++ pypy/dist/pypy/translator/tool/util.py Thu Sep 29 15:21:29 2005 @@ -1,4 +1,3 @@ -from pypy.annotation.model import SomeObject from pypy.tool.udir import udir import sys From pedronis at codespeak.net Thu Sep 29 15:22:14 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Thu, 29 Sep 2005 15:22:14 +0200 (CEST) Subject: [pypy-svn] r17978 - pypy/dist/pypy/translator/goal Message-ID: <20050929132214.E05A827B9E@code1.codespeak.net> Author: pedronis Date: Thu Sep 29 15:22:10 2005 New Revision: 17978 Modified: pypy/dist/pypy/translator/goal/driver.py pypy/dist/pypy/translator/goal/target_pypy-llvm.py pypy/dist/pypy/translator/goal/targetcompiler.py pypy/dist/pypy/translator/goal/targetnopstandalone.py pypy/dist/pypy/translator/goal/targetpypymain.py pypy/dist/pypy/translator/goal/targetpypystandalone.py pypy/dist/pypy/translator/goal/translate_pypy.py pypy/dist/pypy/translator/goal/translate_pypy_new.py Log: - refactor target() signature - change targets - some reorg in driver Modified: pypy/dist/pypy/translator/goal/driver.py ============================================================================== --- pypy/dist/pypy/translator/goal/driver.py (original) +++ pypy/dist/pypy/translator/goal/driver.py Thu Sep 29 15:22:10 2005 @@ -8,6 +8,13 @@ from pypy.annotation import policy as annpolicy import optparse +DEFAULT_OPTIONS = optparse.Values(defaults={ + 'gc': 'ref', + 'insist': False, + 'backend': 'c', + 'lowmem': False, +}) + def taskdef(taskfunc, deps, title, new_state=None, expected_states=[]): taskfunc.task_deps = deps taskfunc.task_title = title @@ -17,7 +24,8 @@ class TranslationDriver(SimpleTaskEngine): - def __init__(self, translator, inputtypes, policy, options, runner, disable=[]): + def __init__(self, translator, inputtypes, policy=None, options=None, + runner=None, disable=[]): SimpleTaskEngine.__init__(self) self.translator = translator @@ -28,10 +36,17 @@ inputtypes = [annmodel.SomeList(ldef)] self.inputtypes = inputtypes + if policy is None: + policy = annpolicy.AnnotatorPolicy() self.policy = policy + if options is None: + options = DEFAULT_OPTIONS self.options = options self.standalone = standalone + if runner is None: + def runner(f): + f() self.runner = runner self.done = {} @@ -121,7 +136,6 @@ newexename = mkexename('./pypy-' + backend) shutil.copy(exename, newexename) c_entryp = newexename - update_usession_dir() if standalone: os.system(c_entryp) else: @@ -162,13 +176,31 @@ def proceed(self, goal): self._execute([goal]) - def __getattr__(self, name): + def __getattr__(self, name): # xxx if name in self.tasks: def proceed_with_task(): self.proceed(name) return proceed_with_task raise AttribueError, name + def from_targetspec(targetspec_dic, options=None): + target = targetspec_dic['target'] + spec = target(not options.lowmem) + try: + entry_point, inputtypes, policy = spec + except ValueError: + entry_point, inputtypes = spec + policy = None + + translator = Translator(entry_point, verbose=True, simplifying=True) + + driver = TranslationDriver(translator, inputtypes, + policy, options, targetspec_dic['run']) + + return translation + + from_targetspec = staticmethod(from_targetspec) + # xxx reorg/move around @@ -181,30 +213,6 @@ execfile(targetspec, targetspec_dic) return targetspec_dic -DEFAULT_OPTIONS = optparse.Values(defaults={ - 'gc': 'ref', - 'insist': False, - 'backend': 'c', - 'lowmem': False, -}) - -def 
make_translation(targetspec_dic, options=DEFAULT_OPTIONS): - policy = annpolicy.AnnotatorPolicy() - target = targetspec_dic['target'] - spec = target(not options.lowmem) - try: - entry_point, inputtypes, policy = spec - except ValueError: - entry_point, inputtypes = spec - - translator = Translator(entry_point, verbose=True, simplifying=True) - - translation = TranslationDriver(translator, inputtypes, policy, options, targetspec_dic['run']) - - return translation - - - # __________ helpers @@ -222,4 +230,4 @@ percent = int(tot and (100.0*so / tot) or 0) print "-- someobjectness %2d (%d of %d functions polluted by SomeObjects)" % (percent, so, tot) -from pypy.translator.tool.util import update_usession_dir, mkexename +from pypy.translator.tool.util import mkexename Modified: pypy/dist/pypy/translator/goal/target_pypy-llvm.py ============================================================================== --- pypy/dist/pypy/translator/goal/target_pypy-llvm.py (original) +++ pypy/dist/pypy/translator/goal/target_pypy-llvm.py Thu Sep 29 15:22:10 2005 @@ -44,8 +44,11 @@ # _____ Define and setup target ___ -def target(geninterp=True): +def target(options, args): global space, w_entry_point + + geninterp = not getattr(options, 'lowmem', False) + # disable translation of the whole of classobjinterp.py StdObjSpace.setup_old_style_classes = lambda self: None space = StdObjSpace(nofaking=True, Modified: pypy/dist/pypy/translator/goal/targetcompiler.py ============================================================================== --- pypy/dist/pypy/translator/goal/targetcompiler.py (original) +++ pypy/dist/pypy/translator/goal/targetcompiler.py Thu Sep 29 15:22:10 2005 @@ -23,8 +23,11 @@ return 'target_ast_compile --> %r' % (pycode,) # _____ Define and setup target ___ -def target(geninterp=True): +def target(options, args): global space, w_entry_point + + geniterp = not getattr(options, 'lowmem', False) + # disable translation of the whole of classobjinterp.py StdObjSpace.setup_old_style_classes = lambda self: None # disable geninterp for now -- we have faaar toooo much interp-level code Modified: pypy/dist/pypy/translator/goal/targetnopstandalone.py ============================================================================== --- pypy/dist/pypy/translator/goal/targetnopstandalone.py (original) +++ pypy/dist/pypy/translator/goal/targetnopstandalone.py Thu Sep 29 15:22:10 2005 @@ -11,6 +11,6 @@ # _____ Define and setup target ___ -def target(geninterp=True): +def target(*args): return entry_point, None Modified: pypy/dist/pypy/translator/goal/targetpypymain.py ============================================================================== --- pypy/dist/pypy/translator/goal/targetpypymain.py (original) +++ pypy/dist/pypy/translator/goal/targetpypymain.py Thu Sep 29 15:22:10 2005 @@ -49,8 +49,11 @@ # _____ Define and setup target ___ -def target(geninterp=True): +def target(options, args): global space, w_entry_point + + geninterp = not getattr(options, 'lowmem', False) + # disable translation of the whole of classobjinterp.py StdObjSpace.setup_old_style_classes = lambda self: None space = StdObjSpace(nofaking=True, Modified: pypy/dist/pypy/translator/goal/targetpypystandalone.py ============================================================================== --- pypy/dist/pypy/translator/goal/targetpypystandalone.py (original) +++ pypy/dist/pypy/translator/goal/targetpypystandalone.py Thu Sep 29 15:22:10 2005 @@ -44,9 +44,11 @@ # _____ Define and setup target ___ -def target(geninterp=True): +def 
target(options, args): global space, w_entry_point + geninterp = not getattr(options, 'lowmem', False) + # obscure hack to stuff the translation options into the translated PyPy import __main__, pypy.module.sys options = {} Modified: pypy/dist/pypy/translator/goal/translate_pypy.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy.py Thu Sep 29 15:22:10 2005 @@ -95,7 +95,7 @@ # XXX this tries to make compiling faster from pypy.translator.tool import cbuild -cbuild.enable_fast_compilation() +#cbuild.enable_fast_compilation() annmodel.DEBUG = False @@ -108,7 +108,10 @@ policy = AnnotatorPolicy() if target: - spec = target(not options['-t-lowmem']) + # forward compatibility + import optparse + opts = optparse.Values({'lowmem': options['-t-lowmem']}) + spec = target(opts, []) try: entry_point, inputtypes, policy = spec except ValueError: Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Thu Sep 29 15:22:10 2005 @@ -271,7 +271,7 @@ policy = AnnotatorPolicy() target = targetspec_dic['target'] if target: - spec = target(not cmd_line_opt.lowmem) + spec = target(cmd_line_opt, []) # xxx rest args try: entry_point, inputtypes, policy = spec except ValueError: From pedronis at codespeak.net Thu Sep 29 16:12:29 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Thu, 29 Sep 2005 16:12:29 +0200 (CEST) Subject: [pypy-svn] r17979 - pypy/dist/pypy/translator/goal Message-ID: <20050929141229.68A4827B9E@code1.codespeak.net> Author: pedronis Date: Thu Sep 29 16:12:28 2005 New Revision: 17979 Modified: pypy/dist/pypy/translator/goal/targetcompiler.py Log: fix typo Modified: pypy/dist/pypy/translator/goal/targetcompiler.py ============================================================================== --- pypy/dist/pypy/translator/goal/targetcompiler.py (original) +++ pypy/dist/pypy/translator/goal/targetcompiler.py Thu Sep 29 16:12:28 2005 @@ -26,7 +26,7 @@ def target(options, args): global space, w_entry_point - geniterp = not getattr(options, 'lowmem', False) + geninterp = not getattr(options, 'lowmem', False) # disable translation of the whole of classobjinterp.py StdObjSpace.setup_old_style_classes = lambda self: None From ac at codespeak.net Thu Sep 29 17:03:28 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Thu, 29 Sep 2005 17:03:28 +0200 (CEST) Subject: [pypy-svn] r17980 - in pypy/dist/pypy: interpreter interpreter/astcompiler interpreter/pyparser module/recparser Message-ID: <20050929150328.612B227B9E@code1.codespeak.net> Author: ac Date: Thu Sep 29 17:03:27 2005 New Revision: 17980 Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py pypy/dist/pypy/interpreter/pycompiler.py pypy/dist/pypy/interpreter/pyparser/astbuilder.py pypy/dist/pypy/interpreter/pyparser/error.py pypy/dist/pypy/interpreter/pyparser/pythonlexer.py pypy/dist/pypy/interpreter/pyparser/pythonparse.py pypy/dist/pypy/interpreter/pyparser/pythonutil.py pypy/dist/pypy/module/recparser/pyparser.py Log: Use SyntaxError consistently. Add linenumber info to SyntaxErrors. 
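For orientation before the diffs below, a condensed sketch of the convention this change settles on: the parser and compiler raise a single SyntaxError class that always carries a line number (and, where known, a column), and wrap_info() produces the (msg, (filename, lineno, offset, text)) tuple expected at application level. The real class lives in pypy/interpreter/pyparser/error.py; the default values shown here are an assumption, the rest follows the call sites in the diffs.

    class SyntaxError(Exception):
        """Base class for exceptions raised by the parser (deliberately shadows
        the builtin, as in pypy/interpreter/pyparser/error.py)."""

        def __init__(self, msg, lineno=0, offset=0, text=None):
            self.msg = msg
            self.lineno = lineno    # every raise site now passes the line number
            self.offset = offset    # column, where the tokenizer/builder knows it
            self.text = text

        def wrap_info(self, space, filename):
            # same shape as the arguments of the app-level SyntaxError
            return space.newtuple([space.wrap(self.msg),
                                   space.newtuple([space.wrap(filename),
                                                   space.wrap(self.lineno),
                                                   space.wrap(self.offset),
                                                   space.wrap(self.text)])])

    # typical raise sites after this change:
    #     raise SyntaxError("'break' outside loop", node.lineno)
    #     raise SyntaxError("unexpected token", token.lineno, token.col)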
Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Thu Sep 29 17:03:27 2005 @@ -213,27 +213,27 @@ raise RuntimeError, "should be implemented by subclasses" # Next five methods handle name access - def storeName(self, name): + def storeName(self, name, lineno): if name in ('None', '__debug__'): - raise SyntaxError('assignment to %s is not allowed' % name) + raise SyntaxError('assignment to %s is not allowed' % name, lineno) self._nameOp('STORE', name) - def loadName(self, name): + def loadName(self, name, lineno): if (self.scope.nested and self.scopeambiguity and name in self.scope.hasbeenfree): raise SyntaxError("cannot reference variable '%s' because " "of ambiguity between " - "scopes" % name) + "scopes" % name, lineno) self._nameOp('LOAD', name) - def delName(self, name): + def delName(self, name, lineno): if name in ('None', '__debug__'): - raise SyntaxError('deleting %s is not allowed' % name) + raise SyntaxError('deleting %s is not allowed' % name, lineno) scope = self.scope.check_name(name) if scope == SC_CELL: raise SyntaxError("can not delete variable '%s' " - "referenced in nested scope" % name) + "referenced in nested scope" % name, lineno) self._nameOp('DELETE', name) def _nameOp(self, prefix, name): @@ -311,7 +311,7 @@ self.emitop_int('SET_LINENO', 0) if not space.is_w(node.doc, space.w_None): self.emitop_obj('LOAD_CONST', node.doc) - self.storeName('__doc__') + self.storeName('__doc__', node.lineno) node.node.accept( self ) self.emitop_obj('LOAD_CONST', space.w_None ) self.emit('RETURN_VALUE') @@ -329,7 +329,7 @@ space = self.space if not space.is_w(node.doc, space.w_None): self.setDocstring(node.doc) - self.storeName(node.name) + self.storeName(node.name, node.lineno) def visitLambda(self, node): self._visitFuncOrLambda(node, isLambda=1) @@ -389,7 +389,7 @@ self.emitop_int('MAKE_FUNCTION', 0) self.emitop_int('CALL_FUNCTION', 0) self.emit('BUILD_CLASS') - self.storeName(node.name) + self.storeName(node.name, node.lineno) # The rest are standard visitor methods @@ -470,16 +470,13 @@ def visitBreak(self, node): if not self.setups: - raise SyntaxError( "'break' outside loop (%s, %d)" % - (node.filename, node.lineno) ) + raise SyntaxError( "'break' outside loop", node.lineno) self.set_lineno(node) self.emit('BREAK_LOOP') def visitContinue(self, node): if not self.setups: - raise SyntaxError( "'continue' not properly in loop" - # (%s, %d)" % (node.filename, node.lineno) - ) + raise SyntaxError( "'continue' not properly in loop", node.lineno) kind, block = self.setups.top() if kind == LOOP: self.set_lineno(node) @@ -496,14 +493,12 @@ if kind == LOOP: break if kind != LOOP: - raise SyntaxError( "'continue' not properly in loop" - # (%s, %d)" % (node.filename, node.lineno) - ) + raise SyntaxError( "'continue' not properly in loop", node.lineno) self.emitop_block('CONTINUE_LOOP', loop_block) self.nextBlock() elif kind == END_FINALLY: - msg = "'continue' not supported inside 'finally' clause" # " (%s, %d)" - raise SyntaxError( msg ) # % (node.filename, node.lineno) + msg = "'continue' not supported inside 'finally' clause" + raise SyntaxError( msg, node.lineno ) def _visitTest(self, node, jump): end = self.newBlock() @@ -676,7 +671,7 @@ anchor = self.newBlock() if node.is_outmost: - self.loadName('[outmost-iterable]') + self.loadName('[outmost-iterable]', node.lineno) else: 
node.iter.accept( self ) self.emit('GET_ITER') @@ -826,7 +821,7 @@ def visitName(self, node): self.set_lineno(node) - self.loadName(node.varname) + self.loadName(node.varname, node.lineno) def visitPass(self, node): self.set_lineno(node) @@ -839,9 +834,9 @@ mod = name.split(".")[0] if alias: self._resolveDots(name) - self.storeName(alias) + self.storeName(alias, node.lineno) else: - self.storeName(mod) + self.storeName(mod, node.lineno) def visitFrom(self, node): self.set_lineno(node) @@ -858,7 +853,7 @@ else: self.emitop('IMPORT_FROM', name) self._resolveDots(name) - self.storeName(alias or name) + self.storeName(alias or name, node.lineno) self.emit('POP_TOP') def _resolveDots(self, name): @@ -887,10 +882,10 @@ def visitAssName(self, node): if node.flags == 'OP_ASSIGN': - self.storeName(node.name) + self.storeName(node.name, node.lineno) elif node.flags == 'OP_DELETE': self.set_lineno(node) - self.delName(node.name) + self.delName(node.name, node.lineno) else: assert False, "visitAssName unexpected flags: %s" % node.flags @@ -898,14 +893,14 @@ node.expr.accept( self ) if node.flags == 'OP_ASSIGN': if node.attrname == 'None': - raise SyntaxError('assignment to None is not allowed') + raise SyntaxError('assignment to None is not allowed', node.lineno) self.emitop('STORE_ATTR', self.mangle(node.attrname)) elif node.flags == 'OP_DELETE': if node.attrname == 'None': - raise SyntaxError('deleting None is not allowed') + raise SyntaxError('deleting None is not allowed', node.lineno) self.emitop('DELETE_ATTR', self.mangle(node.attrname)) else: - assert False, "visitAssAttr unexpected flags: %s" % node.flags + assert False, "visitAssAttr unexpected flags: %s" % node.flags def _visitAssSequence(self, node, op='UNPACK_SEQUENCE'): if findOp(node) != 'OP_DELETE': @@ -1232,15 +1227,15 @@ for arg in func.argnames: if isinstance(arg, ast.AssName): if arg.name in argnames: - raise SyntaxError("duplicate argument '%s' in function definition" % arg.name) + raise SyntaxError("duplicate argument '%s' in function definition" % arg.name, func.lineno) argnames[arg.name] = 1 elif isinstance(arg, ast.AssTuple): for argname in arg.getArgNames(): if argname in argnames: - raise SyntaxError("duplicate argument '%s' in function definition" % argname) + raise SyntaxError("duplicate argument '%s' in function definition" % argname, func.lineno) argnames[argname] = 1 if 'None' in argnames: - raise SyntaxError('assignment to None is not allowed') + raise SyntaxError('assignment to None is not allowed', func.lineno) graph = pyassem.PyFlowGraph(space, name, func.filename, func.argnames, optimized=self.localsfullyknown, @@ -1284,7 +1279,7 @@ for elt in tup.nodes: if isinstance(elt, ast.AssName): - self.storeName(elt.name) + self.storeName(elt.name, elt.lineno) elif isinstance(elt, ast.AssTuple): self.unpackSequence( elt ) else: @@ -1352,10 +1347,10 @@ self.graph.setCellVars(self.scope.get_cell_vars()) self.set_lineno(klass) self.emitop("LOAD_GLOBAL", "__name__") - self.storeName("__module__") + self.storeName("__module__", klass.lineno) if not space.is_w(klass.doc, space.w_None): self.emitop_obj("LOAD_CONST", klass.doc) - self.storeName('__doc__') + self.storeName('__doc__', klass.lineno) def findOp(node): """Find the op (DELETE, LOAD, STORE) in an AssTuple tree""" @@ -1393,7 +1388,7 @@ raise RuntimeError("shouldn't arrive here!") def visitName(self, node ): - self.main.loadName(node.varname) + self.main.loadName(node.varname, node.lineno) def visitGetattr(self, node): node.expr.accept( self ) @@ -1405,7 +1400,8 @@ def 
visitSubscript(self, node): if len(node.subs) > 1: - raise SyntaxError( "augmented assignment to tuple is not possible" ) + raise SyntaxError( "augmented assignment to tuple is not possible", + node.lineno) self.main._visitSubscript(node, True) @@ -1417,7 +1413,7 @@ raise RuntimeError("shouldn't arrive here!") def visitName(self, node): - self.main.storeName(node.varname) + self.main.storeName(node.varname, node.lineno) def visitGetattr(self, node): self.main.emit('ROT_TWO') @@ -1439,7 +1435,8 @@ def visitSubscript(self, node): if len(node.subs) > 1: - raise SyntaxError( "augmented assignment to tuple is not possible" ) + raise SyntaxError("augmented assignment to tuple is not possible", + node.lineno) self.main.emit('ROT_THREE') self.main.emit('STORE_SUBSCR') Modified: pypy/dist/pypy/interpreter/pycompiler.py ============================================================================== --- pypy/dist/pypy/interpreter/pycompiler.py (original) +++ pypy/dist/pypy/interpreter/pycompiler.py Thu Sep 29 17:03:27 2005 @@ -196,14 +196,14 @@ the whole source after having only added a new '\n') """ def compile(self, source, filename, mode, flags): - from pyparser.error import ParseError + from pyparser.error import SyntaxError from pyparser.pythonutil import internal_pypy_parse flags |= __future__.generators.compiler_flag # always on (2.2 compat) # XXX use 'flags' space = self.space try: parse_result = internal_pypy_parse(source, mode, True, flags) - except ParseError, e: + except SyntaxError, e: w_synerr = space.newtuple([space.wrap(e.msg), space.newtuple([space.wrap(filename), space.wrap(e.lineno), @@ -393,7 +393,6 @@ the whole source after having only added a new '\n') """ def compile(self, source, filename, mode, flags): - from pyparser.error import ParseError from pyparser.error import SyntaxError from pypy.interpreter import astcompiler from pypy.interpreter.astcompiler.pycodegen import ModuleCodeGenerator @@ -410,9 +409,6 @@ PYTHON_PARSER.parse_source(source, target_rule, builder, flags) ast_tree = builder.rule_stack[-1] encoding = builder.source_encoding - except ParseError, e: - raise OperationError(space.w_SyntaxError, - e.wrap_info(space, filename)) except SyntaxError, e: raise OperationError(space.w_SyntaxError, e.wrap_info(space, filename)) Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Thu Sep 29 17:03:27 2005 @@ -7,7 +7,7 @@ from pypy.interpreter.astcompiler import ast, consts import pypy.interpreter.pyparser.pysymbol as sym import pypy.interpreter.pyparser.pytoken as tok -from pypy.interpreter.pyparser.error import SyntaxError, TokenError, ASTError, ParseError +from pypy.interpreter.pyparser.error import SyntaxError from pypy.interpreter.pyparser.parsestring import parsestr DEBUG_MODE = 0 @@ -103,7 +103,8 @@ break elif cur_token.get_value() == 'for': if len(arguments) != 1: - raise SyntaxError("invalid syntax") + raise SyntaxError("invalid syntax", cur_token.lineno, + cur_token.col) expr = arguments[0] genexpr_for = parse_genexpr_for(tokens[index:]) genexpr_for[0].is_outmost = True @@ -186,10 +187,12 @@ cur_token = tokens[index] index += 1 else: - raise ValueError("FIXME: SyntaxError (incomplete varags) ?") + raise SyntaxError("incomplete varags", cur_token.lineno, + cur_token.col) assert isinstance(cur_token, TokenObject) if cur_token.name != tok.DOUBLESTAR: - raise 
TokenError("Unexpected token: ", [cur_token] ) + raise SyntaxError("Unexpected token", cur_token.lineno, + cur_token.col) cur_token = tokens[index] index += 1 assert isinstance(cur_token, TokenObject) @@ -199,9 +202,12 @@ flags |= consts.CO_VARKEYWORDS index += 1 else: - raise ValueError("FIXME: SyntaxError (incomplete varags) ?") + raise SyntaxError("incomplete varags", cur_token.lineno, + cur_token.col) if index < l: - raise TokenError("unexpected token" , [tokens[index]] ) + token = tokens[index] + raise SyntaxError("unexpected token" , token.lineno, + token.col) elif cur_token.name == tok.NAME: val = cur_token.get_value() names.append( ast.AssName( val, consts.OP_ASSIGN ) ) @@ -213,7 +219,8 @@ if flags & consts.CO_VARARGS: num_expected_with_default -= 1 if len(defaults) != num_expected_with_default: - raise SyntaxError('non-default argument follows default argument') + raise SyntaxError('non-default argument follows default argument', + tokens[0].lineno, tokens[0].col) return names, defaults, flags @@ -306,7 +313,8 @@ genexpr_fors.append(ast.GenExprFor(ass_node, iterable, ifs, lineno)) ifs = [] else: - raise SyntaxError('invalid syntax') + raise SyntaxError('invalid syntax', + token.lineno, token.col) return genexpr_fors @@ -365,26 +373,22 @@ ast_node.flags = flags return ast_node else: - # TODO: check type of ast_node and raise according SyntaxError in case - # of del f() - # #raise ASTError("cannot assign to %s" % ast_node, ast_node) if isinstance(ast_node, ast.GenExpr): - raise ParseError("assign to generator expression not possible", + raise SyntaxError("assign to generator expression not possible", lineno, 0, '') elif isinstance(ast_node, ast.ListComp): - raise ParseError("can't assign to list comprehension", + raise SyntaxError("can't assign to list comprehension", lineno, 0, '') elif isinstance(ast_node, ast.CallFunc): if flags == consts.OP_DELETE: - raise ParseError("can't delete function call", + raise SyntaxError("can't delete function call", lineno, 0, '') else: - raise ParseError("can't assign to function call", + raise SyntaxError("can't assign to function call", lineno, 0, '') else: - raise ParseError("can't assign to non-lvalue", + raise SyntaxError("can't assign to non-lvalue", lineno, 0, '') - # raise ASTError("cannot assign to %s" % ast_node, ast_node) def is_augassign( ast_node ): if ( isinstance( ast_node, ast.Name ) or @@ -566,7 +570,7 @@ elif top.name == tok.BACKQUOTE: builder.push(ast.Backquote(atoms[1], atoms[1].lineno)) else: - raise TokenError("unexpected tokens", atoms) + raise SyntaxError("unexpected tokens", top.lineno, top.col) def slicecut(lst, first, endskip): # endskip is negative last = len(lst)+endskip @@ -623,7 +627,8 @@ elif op_node.name == tok.DOUBLESLASH: left = ast.FloorDiv( [ left, right ], left.lineno ) else: - raise TokenError("unexpected token", [atoms[i-1]]) + token = atoms[i-1] + raise SyntaxError("unexpected token", token.lineno, token.col) builder.push( left ) def build_arith_expr(builder, nb): @@ -639,7 +644,8 @@ elif op_node.name == tok.MINUS: left = ast.Sub([ left, right ], left.lineno) else: - raise ValueError("unexpected token", [atoms[i-1]]) + token = atoms[i-1] + raise SyntaxError("unexpected token", token.lineno, token.col) builder.push( left ) def build_shift_expr(builder, nb): @@ -656,7 +662,8 @@ elif op_node.name == tok.RIGHTSHIFT: left = ast.RightShift( [ left, right ], lineno ) else: - raise ValueError("unexpected token", [atoms[i-1]] ) + token = atoms[i-1] + raise SyntaxError("unexpected token", token.lineno, token.col) 
builder.push(left) @@ -779,7 +786,7 @@ assert l==3 lvalue = atoms[0] if isinstance(lvalue, ast.GenExpr) or isinstance(lvalue, ast.Tuple): - raise ParseError("augmented assign to tuple literal or generator expression not possible", + raise SyntaxError("augmented assign to tuple literal or generator expression not possible", lineno, 0, "") assert isinstance(op, TokenObject) builder.push(ast.AugAssign(lvalue, op.get_name(), atoms[2], lineno)) Modified: pypy/dist/pypy/interpreter/pyparser/error.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/error.py (original) +++ pypy/dist/pypy/interpreter/pyparser/error.py Thu Sep 29 17:03:27 2005 @@ -1,29 +1,5 @@ from pypy.interpreter.error import OperationError - -class ParseError(Exception): - """Base class for exceptions raised by the parser.""" - - def __init__(self, msg, lineno, offset, text): - self.msg = msg - self.lineno = lineno - self.offset = offset - self.text = text - - def wrap_info(self, space, filename): - return space.newtuple([space.wrap(self.msg), - space.newtuple([space.wrap(filename), - space.wrap(self.lineno), - space.wrap(self.offset), - space.wrap(self.text)])]) - - def __str__(self): - return "%s at pos (%d, %d) in %r" % (self.__class__.__name__, - self.lineno, - self.offset, - self.text) - - class SyntaxError(Exception): """Base class for exceptions raised by the parser.""" Modified: pypy/dist/pypy/interpreter/pyparser/pythonlexer.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/pythonlexer.py (original) +++ pypy/dist/pypy/interpreter/pyparser/pythonlexer.py Thu Sep 29 17:03:27 2005 @@ -6,7 +6,7 @@ from codeop import PyCF_DONT_IMPLY_DEDENT from pypy.interpreter.pyparser.grammar import TokenSource, Token -from pypy.interpreter.pyparser.error import ParseError +from pypy.interpreter.pyparser.error import SyntaxError import pytoken from pytoken import NEWLINE @@ -55,11 +55,11 @@ import automata -class TokenError(ParseError): +class TokenError(SyntaxError): """Raised for lexer errors, e.g. 
when EOF is found prematurely""" def __init__(self, msg, line, strstart, token_stack): lineno, offset = strstart - ParseError.__init__(self, msg, lineno, offset, line) + SyntaxError.__init__(self, msg, lineno, offset, line) self.token_stack = token_stack def generate_tokens(lines, flags): Modified: pypy/dist/pypy/interpreter/pyparser/pythonparse.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/pythonparse.py (original) +++ pypy/dist/pypy/interpreter/pyparser/pythonparse.py Thu Sep 29 17:03:27 2005 @@ -7,7 +7,7 @@ """ from pypy.interpreter.error import OperationError, debug_print from pypy.interpreter import gateway -from pypy.interpreter.pyparser.error import ParseError +from pypy.interpreter.pyparser.error import SyntaxError from pypy.tool.option import Options from pythonlexer import Source, match_encoding_declaration import pysymbol @@ -53,7 +53,7 @@ if not result: line, lineno = src.debug() # XXX needs better error messages - raise ParseError("invalid syntax", lineno, -1, line) + raise SyntaxError("invalid syntax", lineno, -1, line) # return None return builder Modified: pypy/dist/pypy/interpreter/pyparser/pythonutil.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/pythonutil.py (original) +++ pypy/dist/pypy/interpreter/pyparser/pythonutil.py Thu Sep 29 17:03:27 2005 @@ -5,7 +5,6 @@ import pythonparse from tuplebuilder import TupleBuilder from astbuilder import AstBuilder -from pypy.interpreter.pyparser.error import ParseError from pypy.interpreter.pyparser import pysymbol PYTHON_PARSER = pythonparse.PYTHON_PARSER Modified: pypy/dist/pypy/module/recparser/pyparser.py ============================================================================== --- pypy/dist/pypy/module/recparser/pyparser.py (original) +++ pypy/dist/pypy/module/recparser/pyparser.py Thu Sep 29 17:03:27 2005 @@ -8,7 +8,8 @@ from pypy.interpreter.typedef import interp_attrproperty, GetSetProperty from pypy.interpreter.pycode import PyCode from pypy.interpreter.pyparser.syntaxtree import TokenNode, SyntaxNode, AbstractSyntaxVisitor -from pypy.interpreter.pyparser.pythonutil import PYTHON_PARSER, ParseError +from pypy.interpreter.pyparser.pythonutil import PYTHON_PARSER +from pypy.interpreter.pyparser.error import SyntaxError from pypy.interpreter.pyparser import grammar, pysymbol, pytoken __all__ = [ "ASTType", "STType", "suite", "expr" ] @@ -147,7 +148,7 @@ try: PYTHON_PARSER.parse_source(source, goal, builder ) return builder.stack[-1] - except ParseError, e: + except SyntaxError, e: raise OperationError(space.w_SyntaxError, e.wrap_info(space, '')) From ale at codespeak.net Thu Sep 29 18:14:01 2005 From: ale at codespeak.net (ale at codespeak.net) Date: Thu, 29 Sep 2005 18:14:01 +0200 (CEST) Subject: [pypy-svn] r17981 - pypy/extradoc/minute Message-ID: <20050929161401.A7FAF27B96@code1.codespeak.net> Author: ale Date: Thu Sep 29 18:14:00 2005 New Revision: 17981 Modified: pypy/extradoc/minute/pypy-sync-09-29-2005.txt Log: addendum Modified: pypy/extradoc/minute/pypy-sync-09-29-2005.txt ============================================================================== --- pypy/extradoc/minute/pypy-sync-09-29-2005.txt (original) +++ pypy/extradoc/minute/pypy-sync-09-29-2005.txt Thu Sep 29 18:14:00 2005 @@ -16,6 +16,11 @@ Carl Friedrich Bolz Bert Freudenberg +Absentees: + Ludovic Aubry + Adrien Di Mascio + Armin Rigo + Regular Topics ==================== @@ -40,10 +45,13 @@ It seems 
that pypy-c cant run py.test and that several tests in the lib-python testsuite will make pypy-c segfault. It was chosen to not invest time into solving these problems at this point since pypy-c is in such a flux anyway.
 
-3.Marketing
+3. Marketing
 
   The topic was deferred due to lack of time.
 
+4. Not announced topic - Status of AST-compiler.
+
+  It seems that there is only one remaining problem left. There is hope that the parse/compiler multitude will be resolved before the Paris sprint.
 
 Next pypy-sync meeting
 -------------------------------

From bea at codespeak.net Thu Sep 29 19:56:52 2005
From: bea at codespeak.net (bea at codespeak.net)
Date: Thu, 29 Sep 2005 19:56:52 +0200 (CEST)
Subject: [pypy-svn] r17982 - pypy/extradoc/talk/22c3
Message-ID: <20050929175652.91CB927B9E@code1.codespeak.net>

Author: bea
Date: Thu Sep 29 19:56:50 2005
New Revision: 17982

Added:
   pypy/extradoc/talk/22c3/pypy-sprint-talk.txt
Log:
A suggestion for a proposal to the CCC conference, a talk by Holger and me around PyPy and sprint driven development. Feel free to comment....

Added: pypy/extradoc/talk/22c3/pypy-sprint-talk.txt
==============================================================================
--- (empty file)
+++ pypy/extradoc/talk/22c3/pypy-sprint-talk.txt	Thu Sep 29 19:56:50 2005
@@ -0,0 +1,72 @@
+Reference/Call For Papers: http://www.ccc.de/congress/2005/cfp.html
+DEADLINE: 31st September 2005 (friday)
+
+Title: PyPy - agility through diversity
+
+Subtitle: Sprint driven development in an Open Source Python project - funded
+ by the European Commission (FP6)
+
+Section: Hackerethik
+
+Talkers: Holger Krekel, Beatrice Düring
+
+Abstract (max 250 letters):
+
+ We'll present the experiences from managing an agile and distributed OSS
+ development process in the Pypy project, "sprint driven", during the last year
+ and also connect this to the challenges of integrating this process with the requirements
+ from the European Commission who is partly funding the project through the
+ 6th Research Framework programme.
+ We will also reflect on the aspect of diversity, combining technical
+ and non technical people and skills in the PyPy project and learnings
+ from this.
+
+ PyPy is a reimplementation of Python written in Python
+ itself, flexible and easy to experiment with.
+
+Description (250-500 words):
+
+ We are going to briefly describe the organisation of the project,
+ showing how formal stakeholders and OSS Python community interact
+ through agile practices like sprinting.
+
+ We will relate the various agile techniques used in PyPy to the
+ agile practices known from the work in the Agile Alliance (XP, Scrum,
+ Crystal Clear) and show how the PyPy project is developing an ongoing
+ customization of several of the known practices.
+
+ Lastly we will also share our experience of various challenges and
+ possibilities when integrating the different cultures and skills from
+ the OSS perspective, EU perspective and the Chaos Pilot/process management
+ perspective - managing diversities.
+
+ PyPy is a reimplementation of Python written in Python
+ itself, flexible and easy to experiment with. Our
+ long-term goals are to target a large variety of
+ platforms, small and large, by providing a compiler
+ toolsuite that can produce custom Python versions.
+ Platform, Memory and Threading models will become
+ aspects of the translation process - as opposed to
+ encoding low level details into a language implementation
+ itself.
+
+
+Statement: We intend to submit a paper (PDF) for the 22C3 proceedings.
+Statement: We intend to submit a slides PDF as well.
+
+Duration of your talk: 45 minutes + questions
+
+Language of your talk: english
+
+Links to background information on the talk: http://codespeak.net/pypy
+
+Target Group: Advanced Users, Pros
+
+Resources you need for your talk: digital projector, internet
+
+Related talks at 22C3 you know of: ...
+
+A lecture logo, square format, min. 128x128 pixels (optional):
+ http://codespeak.net/pypy/img/py-web1.png
+ (please scale it down a bit :-)
+

From bea at codespeak.net Thu Sep 29 20:04:46 2005
From: bea at codespeak.net (bea at codespeak.net)
Date: Thu, 29 Sep 2005 20:04:46 +0200 (CEST)
Subject: [pypy-svn] r17983 - pypy/extradoc/talk/22c3
Message-ID: <20050929180446.BCE0227BA7@code1.codespeak.net>

Author: bea
Date: Thu Sep 29 20:04:44 2005
New Revision: 17983

Added:
   pypy/extradoc/talk/22c3/speaker-beatriced??ring.txt
Log:
speaker info for the CCC

Added: pypy/extradoc/talk/22c3/speaker-beatriced??ring.txt
==============================================================================
--- (empty file)
+++ pypy/extradoc/talk/22c3/speaker-beatriced??ring.txt	Thu Sep 29 20:04:44 2005
@@ -0,0 +1,45 @@
+Reference/Call For Papers: http://www.ccc.de/congress/2005/cfp.html
+DEADLINE: 31st September 2005 (friday)
+
+Name: Beatrice Düring
+
+Public Name: Beatrice Düring
+
+Other Names: bd on irc.freenode.org, bea
+
+Primary E-Mail address: bea at changemaker.nu
+
+Phone number(s): +46 (0)734 22 89 06
+
+A photo, square format, min. 128x128 pixels (optional):
+ Holger - can I send you a photo that we can put somewhere on codespeak?
+ I will bring a digital camera so that we can take photos of everyone in Paris for the website.
+
+Statement: publishing contact info except for the phone number
+ is fine with me.
+
+Public home page, weblog and other speaker-related websites:
+
+ http://codespeak.net/pypy
+
+Short Info:
+
+ Beatrice During, consultant in the project management field, assistant project manager in PyPy
+Bio:
+
+ Beatrice Düring studied teaching/pedagogy at the University of Karlstad in
+ Sweden. She was recruited into the IT-industry to work as a project manager
+ for large scale education projects for the company NetGuide Scandinavia,
+ Gothenburg. Since 1998 she has been working with education and development
+ project management and management of education and consultant departments,
+ implementing Open Source strategies and Agile development methods. Beatrice
+ also teaches project management, requirements and communication courses for
+ Learning Tree International through the Chaos Pilot company Change Maker.
+ Currently she is involved in the project management team of the PyPy project
+ and tries to fit in some maternity leave (;-)
+
+
+Postal address: Change Maker, Järntorget 3, 41304 Gothenburg, Sweden
+Bank information:
+Expected day of arrival and departure: 27th-30th December.
+ From pedronis at codespeak.net Thu Sep 29 20:11:32 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Thu, 29 Sep 2005 20:11:32 +0200 (CEST) Subject: [pypy-svn] r17984 - pypy/dist/pypy/translator/goal Message-ID: <20050929181132.60A0A27BA7@code1.codespeak.net> Author: pedronis Date: Thu Sep 29 20:11:31 2005 New Revision: 17984 Modified: pypy/dist/pypy/translator/goal/driver.py pypy/dist/pypy/translator/goal/translate_pypy_new.py Log: work-in-progress, intermediate check-in: translate_pypy_new refactoring, reorganized option parsing in preparation to use driver.py translate_pypy_new is functionality is disabled by this checking but with -h you can see the new opts organisation Modified: pypy/dist/pypy/translator/goal/driver.py ============================================================================== --- pypy/dist/pypy/translator/goal/driver.py (original) +++ pypy/dist/pypy/translator/goal/driver.py Thu Sep 29 20:11:31 2005 @@ -10,6 +10,7 @@ DEFAULT_OPTIONS = optparse.Values(defaults={ 'gc': 'ref', + 'debug': True, 'insist': False, 'backend': 'c', 'lowmem': False, @@ -71,6 +72,7 @@ policy = self.policy self.info('with policy: %s.%s' % (policy.__class__.__module__, policy.__class__.__name__)) + annmodel.DEBUG = self.options.debug annotator = translator.annotate(self.inputtypes, policy=policy) sanity_check_annotation(translator) annotator.simplify() Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Thu Sep 29 20:11:31 2005 @@ -1,355 +1,553 @@ #! /usr/bin/env python -# """ Command-line options for translate_pypy: See below """ +import autopath + +# dict are not ordered, cheat with #_xyz keys and bunchiter +def OPT(*args): + return args + +def bunchiter(d): + purify = lambda name: name.split('_',1)[1] + items = d.items() + items.sort() + for name, val in items: + yield purify(name), val + +GOAL = object() +SKIP_GOAL = object() opts = { - 'Annotation':[ - ['-m', '--lowmem', 'Try to save memory', [True,False], False], - ['-n', '--no_annotations', "Don't infer annotations", [True,False], False], - ['-d', '--debug', 'record debug information', [True,False], False], - ['-i', '--insist', "Dont't stop on first error", [True,False], True]], + + + '0_Annotation': { + '0_annotate': [OPT(('-a', '--annotate'), "Annotate", GOAL), + OPT(('--no-annotate',), "Don't annotate", SKIP_GOAL)], + '1_debug': [OPT(('-d', '--debug'), "Record annotation debug info", True)] + }, + + '1_RTyping': { + '0_rtype': [OPT(('-t', '--rtype'), "RType", GOAL), + OPT(('--no-rtype',), "Don't rtype", SKIP_GOAL)], + '1_insist': [OPT(('--insist',), "Dont' stop on first rtyper error", True)] + }, + + '2_Backend optimisations': { + '_backopt': [OPT(('-o', '--backopt'), "Do backend optimisations", GOAL), + OPT(('--no-backopt',), "Don't do backend optimisations", SKIP_GOAL)], + }, + + + '3_Code generation options': { + '0_source': [OPT(('-s', '--source'), "Generate source code", GOAL), + OPT(('--no-source',), "Don't generate source code", SKIP_GOAL)], + + '1_backend': [OPT(('-b', '--backend'), "Backend", ['c', 'llvm'])], + + '2_gc': [OPT(('--gc',), "Garbage collector", ['ref', 'boehm', 'none'])], + }, + + + '4_Compilation options':{ + '_compile': [OPT(('-c', '--compile'), "Compile generated source", GOAL), + OPT(('--no-compile',), "Don't compile", SKIP_GOAL)], + }, + + '5_Run options': { + '_run': 
[OPT(('-r', '--run'), "Run compiled code", GOAL), + OPT(('--no-run',), "Don't run compiled code", SKIP_GOAL)], + }, + + + '6_General&other options': { + '0_batch': [OPT(('--batch',), "Don't run interactive helpers", True)], + '1_lowmem': [OPT(('--lowmem',), "Target should try to save memory", True)], + + + '2_huge': [OPT(('--huge',), "Threshold in the number of functions after which only a local call graph and not a full one is displayed", int)], + + '3_text': [OPT(('--text',), "Don't start the pygame viewer", True)], + + '4_graphserve': [OPT(('--graphserve',), """Serve analysis graphs on port number +(see pypy/translator/tool/pygame/graphclient.py)""", int)], + + }, + + + #'Process options':[ + # ['-f', '--fork', + # "(UNIX) Create restartable checkpoint after annotation [,specialization]", + # [['fork1','fork2']], [] ], + +} + +defaults = { + 'targetspec': 'targetpypystandalone', + + 'goals': [], + + 'default_goals': ['annotate', 'rtype', 'backopt', 'source', 'compile'], + 'skipped_goals': ['run'], + +## 'annotate': True, +## 'rtype': True, +## 'backopt': True, +## 'source': True, +## 'compile': True, +## 'run': False, + + 'lowmem': False, + + 'debug': False, + 'insist': False, + + 'gc': 'boehm', + 'backend': 'c', + + 'batch': False, + 'text': False, + 'graphserve': None, + 'huge': 100, +} + +import py +# we want 2.4 expand_default functionality +optparse = py.compat.optparse + + +class OptHelpFormatter(optparse.IndentedHelpFormatter): + + def expand_default(self, option): + assert self.parser + dfls = self.parser.defaults + defl = "" + if option.action == 'callback' and option.callback == goal_cb: + enable, goal = option.callback_args + if enable == (goal in dfls['default_goals']): + defl = "[default]" + else: + val = dfls.get(option.dest) + if val is None: + pass + elif isinstance(val, bool): + if bool(val) == (option.action=="store_true"): + defl = "[default]" + else: + defl = "[default: %s]" % val + + return option.help.replace("%defl", defl) + +def goal_cb(option, opt, value, parser, enable, goal): + if enable: + parser.values.goals = parser.values.goals + [goal] + else: + parser.values.goals = parser.values.skipped_goals + [goal] + +def parse_options(): + opt_parser = optparse.OptionParser(prog="translate_pypy", + formatter=OptHelpFormatter()) + for group_name, grp_opts in bunchiter(opts): + grp = opt_parser.add_option_group(group_name) + for dest, dest_opts in bunchiter(grp_opts): + for names, descr, choice in dest_opts: + opt_setup = {'action': 'store', + 'dest': dest, + 'help': descr+" %defl"} + if choice in (GOAL, SKIP_GOAL): + del opt_setup['dest'] + opt_setup['action'] = 'callback' + opt_setup['nargs'] = 0 + opt_setup['callback'] = goal_cb + opt_setup['callback_args'] = (choice is GOAL, dest,) + elif isinstance(choice, list): + opt_setup['type'] = 'choice' + opt_setup['choices'] = choice + elif isinstance(choice, bool): + opt_setup['action'] = ['store_false', 'store_true'][choice] + elif choice is int: + opt_setup['type'] = 'int' + elif choice is str: + opt_setup['type'] = 'string' + + grp.add_option(*names, **opt_setup) + opt_parser.set_defaults(**defaults) + + options, args = opt_parser.parse_args() + + if args: + arg = args[0] + args = args[1:] + if os.path.isfile(arg+'.py'): + assert not os.path.isfile(arg), ( + "ambiguous file naming, please rename %s" % arg) + options.targetspec = arg + elif os.path.isfile(arg) and arg.endswith('.py'): + options.targetspec = arg[:-3] + + return options, args + +def main(): + options, args = parse_options() + + print options, args + +if 
__name__ == '__main__': + main() + +# +## """ +## Command-line options for translate_pypy: + +## See below +## """ + +## opts = { +## 'Annotation':[ +## ['-m', '--lowmem', 'Try to save memory', [True,False], False], +## ['-n', '--no_annotations', "Don't infer annotations", [True,False], False], +## ['-d', '--debug', 'record debug information', [True,False], False], +## ['-i', '--insist', "Dont't stop on first error", [True,False], True]], - 'Specialization':[ - ['-t', '--specialize', "Don't specialize", [True,False], True]], +## 'Specialization':[ +## ['-t', '--specialize', "Don't specialize", [True,False], True]], - 'Backend optimisation': [ - ['-o', '--optimize', "Don't optimize (should have different name)", - [True,False], True ]], +## 'Backend optimisation': [ +## ['-o', '--optimize', "Don't optimize (should have different name)", +## [True,False], True ]], - 'Process options':[ - ['-f', '--fork', - "(UNIX) Create restartable checkpoint after annotation [,specialization]", - [['fork1','fork2']], [] ], - ['-l', '--load', "load translator from file", [str], ''], - ['-s', '--save', "save translator to file", [str], '']], +## 'Process options':[ +## ['-f', '--fork', +## "(UNIX) Create restartable checkpoint after annotation [,specialization]", +## [['fork1','fork2']], [] ], +## ['-l', '--load', "load translator from file", [str], ''], +## ['-s', '--save', "save translator to file", [str], '']], - 'Codegeneration options':[ - ['-g', '--gc', 'Garbage collector', ['ref', 'boehm','none'], 'boehm'], - ['-b', '--backend', 'Backend selector', ['c','llvm'],'c'], - ['-w', '--gencode', "Don't generate code", [True,False], True], - ['-c', '--compile', "Don't compile generated code", [True,False], True]], +## 'Codegeneration options':[ +## ['-g', '--gc', 'Garbage collector', ['ref', 'boehm','none'], 'boehm'], +## ['-b', '--backend', 'Backend selector', ['c','llvm'],'c'], +## ['-w', '--gencode', "Don't generate code", [True,False], True], +## ['-c', '--compile', "Don't compile generated code", [True,False], True]], - 'Compilation options':[], +## 'Compilation options':[], - 'Run options':[ - ['-r', '--run', "Don't run the compiled code", [True,False], True], - ['-x', '--batch', "Dont run interactive helpers", [True,False], False]], - 'Pygame options':[ - ['-p', '--pygame', "Dont run pygame", [True,False], True], - ['-H', '--huge', - "Threshold in the number of functions after which only a local call graph and not a full one is displayed", [int], 0 ]]} +## 'Run options':[ +## ['-r', '--run', "Don't run the compiled code", [True,False], True], +## ['-x', '--batch', "Dont run interactive helpers", [True,False], False]], +## 'Pygame options':[ +## ['-p', '--pygame', "Dont run pygame", [True,False], True], +## ['-H', '--huge', +## "Threshold in the number of functions after which only a local call graph and not a full one is displayed", [int], 0 ]]} -import autopath, sys, os +## import autopath, sys, os -if '-use-snapshot' in sys.argv: - # xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx - basedir = autopath.this_dir - - pypy_translation_snapshot_dir = os.path.join(basedir, 'pypy-translation-snapshot') - - if not os.path.isdir(pypy_translation_snapshot_dir): - print """ - Translation will be performed on a specific revision of PyPy which lives on - a branch. 
This needs to be checked out into translator/goal with: - - svn co http://codespeak.net/svn/pypy/branch/pypy-translation-snapshot - """[1:] - sys.exit(2) - - # override imports from pypy head with imports from pypy-translation-snapshot - import pypy - pypy.__path__.insert(0, pypy_translation_snapshot_dir) - - # complement imports from pypy.objspace (from pypy-translation-snapshot) - # with pypy head objspace/ - import pypy.objspace - pypy.objspace.__path__.append(os.path.join(autopath.pypydir, 'objspace')) - - print "imports redirected to pypy-translation-snapshot." - - # xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx - - -import threading, pdb - -from pypy.translator.translator import Translator -from pypy.annotation import model as annmodel -from pypy.annotation import listdef -from pypy.annotation.policy import AnnotatorPolicy -from pypy.translator.pickle.main import load, save -# catch TyperError to allow for post-mortem dump -from pypy.rpython.error import TyperError - -from pypy.translator.goal import query - -# XXX this tries to make compiling faster -from pypy.translator.tool import cbuild -cbuild.enable_fast_compilation() -from pypy.translator.tool.util import update_usession_dir -from pypy.translator.tool.util import assert_rpython_mostly_not_imported, mkexename - -annmodel.DEBUG = False - - - -# __________ Main __________ - -def sanity_check_annotation(t): - irreg = query.qoutput(query.check_exceptblocks_qgen(t)) - if not irreg: - print "++ All exceptblocks seem sane" - - lost = query.qoutput(query.check_methods_qgen(t)) - assert not lost, "lost methods, something gone wrong with the annotation of method defs" - print "++ No lost method defs" - - so = query.qoutput(query.polluted_qgen(t)) - tot = len(t.flowgraphs) - percent = int(tot and (100.0*so / tot) or 0) - print "-- someobjectness %2d (%d of %d functions polluted by SomeObjects)" % (percent, so, tot) - -def analyse(t, inputtypes): - - standalone = inputtypes is None - if standalone: - ldef = listdef.ListDef(None, annmodel.SomeString()) - inputtypes = [annmodel.SomeList(ldef)] - - if not cmd_line_opt.no_annotations: - print 'Annotating...' - print 'with policy: %s.%s' % (policy.__class__.__module__, policy.__class__.__name__) - a = t.annotate(inputtypes, policy=policy) - sanity_check_annotation(t) - - if a: #and not options['-no-s']: - print 'Simplifying...' - a.simplify() - if 'fork1' in cmd_line_opt.fork: - from pypy.translator.goal import unixcheckpoint - assert_rpython_mostly_not_imported() - unixcheckpoint.restartable_point(auto='run') - if a and cmd_line_opt.specialize: - print 'Specializing...' - t.specialize(dont_simplify_again=True, - crash_on_first_typeerror=not cmd_line_opt.insist) - if cmd_line_opt.optimize: - print 'Back-end optimizations...' 
- t.backend_optimizations(ssa_form=cmd_line_opt.backend != 'llvm') - if a and 'fork2' in cmd_line_opt.fork: - from pypy.translator.goal import unixcheckpoint - unixcheckpoint.restartable_point(auto='run') - if a: - t.frozen = True # cannot freeze if we don't have annotations - return standalone +## if '-use-snapshot' in sys.argv: +## # xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx +## basedir = autopath.this_dir -# graph servers +## pypy_translation_snapshot_dir = os.path.join(basedir, 'pypy-translation-snapshot') -serv_start, serv_show, serv_stop, serv_cleanup = None, None, None, None +## if not os.path.isdir(pypy_translation_snapshot_dir): +## print """ +## Translation will be performed on a specific revision of PyPy which lives on +## a branch. This needs to be checked out into translator/goal with: + +## svn co http://codespeak.net/svn/pypy/branch/pypy-translation-snapshot +## """[1:] +## sys.exit(2) + +## # override imports from pypy head with imports from pypy-translation-snapshot +## import pypy +## pypy.__path__.insert(0, pypy_translation_snapshot_dir) + +## # complement imports from pypy.objspace (from pypy-translation-snapshot) +## # with pypy head objspace/ +## import pypy.objspace +## pypy.objspace.__path__.append(os.path.join(autopath.pypydir, 'objspace')) + +## print "imports redirected to pypy-translation-snapshot." + +## # xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx + + +## import threading, pdb + +## from pypy.translator.translator import Translator +## from pypy.annotation import model as annmodel +## from pypy.annotation import listdef +## from pypy.annotation.policy import AnnotatorPolicy +## from pypy.translator.pickle.main import load, save +## # catch TyperError to allow for post-mortem dump +## from pypy.rpython.error import TyperError + +## from pypy.translator.goal import query + +## # XXX this tries to make compiling faster +## from pypy.translator.tool import cbuild +## cbuild.enable_fast_compilation() +## from pypy.translator.tool.util import update_usession_dir +## from pypy.translator.tool.util import assert_rpython_mostly_not_imported, mkexename + +## annmodel.DEBUG = False -if __name__ == '__main__': - targetspec = 'targetpypystandalone' - listen_port = None + +## # __________ Main __________ + +## def sanity_check_annotation(t): +## irreg = query.qoutput(query.check_exceptblocks_qgen(t)) +## if not irreg: +## print "++ All exceptblocks seem sane" + +## lost = query.qoutput(query.check_methods_qgen(t)) +## assert not lost, "lost methods, something gone wrong with the annotation of method defs" +## print "++ No lost method defs" + +## so = query.qoutput(query.polluted_qgen(t)) +## tot = len(t.flowgraphs) +## percent = int(tot and (100.0*so / tot) or 0) +## print "-- someobjectness %2d (%d of %d functions polluted by SomeObjects)" % (percent, so, tot) + +## def analyse(t, inputtypes): + +## standalone = inputtypes is None +## if standalone: +## ldef = listdef.ListDef(None, annmodel.SomeString()) +## inputtypes = [annmodel.SomeList(ldef)] - def debug(got_error): - from pypy.translator.tool.pdbplus import PdbPlusShow - - pdb_plus_show = PdbPlusShow(t) # need a translator to support extended commands +## if not cmd_line_opt.no_annotations: +## print 'Annotating...' +## print 'with policy: %s.%s' % (policy.__class__.__module__, policy.__class__.__name__) +## a = t.annotate(inputtypes, policy=policy) +## sanity_check_annotation(t) + +## if a: #and not options['-no-s']: +## print 'Simplifying...' 
+## a.simplify() +## if 'fork1' in cmd_line_opt.fork: +## from pypy.translator.goal import unixcheckpoint +## assert_rpython_mostly_not_imported() +## unixcheckpoint.restartable_point(auto='run') +## if a and cmd_line_opt.specialize: +## print 'Specializing...' +## t.specialize(dont_simplify_again=True, +## crash_on_first_typeerror=not cmd_line_opt.insist) +## if cmd_line_opt.optimize: +## print 'Back-end optimizations...' +## t.backend_optimizations(ssa_form=cmd_line_opt.backend != 'llvm') +## if a and 'fork2' in cmd_line_opt.fork: +## from pypy.translator.goal import unixcheckpoint +## unixcheckpoint.restartable_point(auto='run') +## if a: +## t.frozen = True # cannot freeze if we don't have annotations +## return standalone - tb = None - if got_error: - import traceback - exc, val, tb = sys.exc_info() - print >> sys.stderr - traceback.print_exception(exc, val, tb) - print >> sys.stderr - - block = getattr(val, '__annotator_block', None) - if block: - print '-'*60 - t.about(block) - print '-'*60 - print - else: - print '-'*60 - print 'Done.' - print - - if cmd_line_opt.batch: - print >>sys.stderr, "batch mode, not calling interactive helpers" - return - - def server_setup(): - if serv_start: - return serv_start, serv_show, serv_stop, serv_cleanup - else: - from pypy.translator.tool.pygame.server import run_translator_server - return run_translator_server(t, entry_point, cmd_line_opt) +## # graph servers - pdb_plus_show.start(tb, server_setup, graphic=cmd_line_opt.pygame) +## serv_start, serv_show, serv_stop, serv_cleanup = None, None, None, None +## if __name__ == '__main__': - from optparse import OptionParser - parser = OptionParser() - for group in opts: - for option in opts[group]: - if option[-1] in [True,False]: - if option[-1] == True: - action = "store_false" - else: - action = "store_true" - parser.add_option(option[0],option[1], default=option[-1], - dest=option[1].lstrip('--'), help=option[2], action=action) - elif type(option[-2][0]) == list: - parser.add_option(option[0],option[1], default=option[-1], - dest=option[1].lstrip('--'), help=option[2], action="append") - else: - parser.add_option(option[0],option[1], default=option[-1], - dest=option[1].lstrip('--'), help=option[2]) +## targetspec = 'targetpypystandalone' +## listen_port = None - (cmd_line_opt, args) = parser.parse_args() - argiter = iter(args) #sys.argv[1:]) - for arg in argiter: - try: - listen_port = int(arg) - except ValueError: - if os.path.isfile(arg+'.py'): - assert not os.path.isfile(arg), ( - "ambiguous file naming, please rename %s" % arg) - targetspec = arg - elif os.path.isfile(arg) and arg.endswith('.py'): - targetspec = arg[:-3] - t = None - options = {} - for opt in parser.option_list[1:]: - options[opt.dest] = getattr(cmd_line_opt,opt.dest) - if options.get('gc') == 'boehm': - options['-boehm'] = True -## if options['-tcc']: -## os.environ['PYPY_CC'] = 'tcc -shared -o "%s.so" "%s.c"' - if cmd_line_opt.debug: - annmodel.DEBUG = True - try: - err = None - if cmd_line_opt.load: - loaded_dic = load(cmd_line_opt.load) - t = loaded_dic['trans'] - entry_point = t.entrypoint - inputtypes = loaded_dic['inputtypes'] - targetspec_dic = loaded_dic['targetspec_dic'] - targetspec = loaded_dic['targetspec'] - old_options = loaded_dic['options'] - for name in 'no_a specialize optimize'.split(): - # if one of these options has not been set, before, - # then the action has been done and must be prevented, now. 
- if not old_options[name]: - if options[name]: - print 'option %s is implied by the load' % name - options[name] = True - print "continuing Analysis as defined by %s, loaded from %s" %( - targetspec, cmd_line_opt.loadname) - targetspec_dic['target'] = None - else: - targetspec_dic = {} - sys.path.insert(0, os.path.dirname(targetspec)) - execfile(targetspec+'.py', targetspec_dic) - print "Analysing target as defined by %s" % targetspec - if targetspec_dic.get('options', None): - targetspec_dic['options'].update(options) - options = targetspec_dic['options'] - print options,targetspec_dic['options'] - print 'options in effect:' - optnames = options.keys() - optnames.sort() - for name in optnames: - print ' %25s: %s' %(name, options[name]) - - policy = AnnotatorPolicy() - target = targetspec_dic['target'] - if target: - spec = target(cmd_line_opt, []) # xxx rest args - try: - entry_point, inputtypes, policy = spec - except ValueError: - entry_point, inputtypes = spec - t = Translator(entry_point, verbose=True, simplifying=True) - a = None - else: - # otherwise we have been loaded - a = t.annotator - t.frozen = False - if listen_port: - from pypy.translator.tool.graphserver import run_async_server - serv_start, serv_show, serv_stop, serv_cleanup = run_async_server(t, listen_port) - try: - standalone = analyse(t, inputtypes) - except TyperError: - err = sys.exc_info() - print '-'*60 - if cmd_line_opt.save: - print 'saving state to %s' % cmd_line_opt.save - if err: - print '*** this save is done after errors occured ***' - save(t, cmd_line_opt.save, - trans=t, - inputtypes=inputtypes, - targetspec=targetspec, - targetspec_dic=targetspec_dic, - options=options, - ) - if err: - raise err[0], err[1], err[2] - if cmd_line_opt.backend == 'c': #XXX probably better to supply gcpolicy as string to the backends - gcpolicy = None - if cmd_line_opt.gc =='boehm': - from pypy.translator.c import gc - gcpolicy = gc.BoehmGcPolicy - if cmd_line_opt.gc == 'none': - from pypy.translator.c import gc - gcpolicy = gc.NoneGcPolicy - elif cmd_line_opt.backend == 'llvm': - gcpolicy = cmd_line_opt.gc - - if cmd_line_opt.backend == 'llinterpret': - def interpret(): - import py - from pypy.rpython.llinterp import LLInterpreter - py.log.setconsumer("llinterp operation", None) - interp = LLInterpreter(t.flowgraphs, t.rtyper) - interp.eval_function(entry_point, - targetspec_dic['get_llinterp_args']()) - interpret() - elif not cmd_line_opt.gencode: - print 'Not generating C code.' - else: - print 'Generating %s %s code...' %(cmd_line_opt.compile and "and compiling" or "",cmd_line_opt.backend) - keywords = {'really_compile' : cmd_line_opt.compile, - 'standalone' : standalone, - 'gcpolicy' : gcpolicy} - c_entry_point = t.compile(cmd_line_opt.backend, **keywords) +## def debug(got_error): +## from pypy.translator.tool.pdbplus import PdbPlusShow + +## pdb_plus_show = PdbPlusShow(t) # need a translator to support extended commands + +## tb = None +## if got_error: +## import traceback +## exc, val, tb = sys.exc_info() +## print >> sys.stderr +## traceback.print_exception(exc, val, tb) +## print >> sys.stderr + +## block = getattr(val, '__annotator_block', None) +## if block: +## print '-'*60 +## t.about(block) +## print '-'*60 +## print +## else: +## print '-'*60 +## print 'Done.' 
+## print + +## if cmd_line_opt.batch: +## print >>sys.stderr, "batch mode, not calling interactive helpers" +## return + +## def server_setup(): +## if serv_start: +## return serv_start, serv_show, serv_stop, serv_cleanup +## else: +## from pypy.translator.tool.pygame.server import run_translator_server +## return run_translator_server(t, entry_point, cmd_line_opt) + +## pdb_plus_show.start(tb, server_setup, graphic=cmd_line_opt.pygame) + + +## from optparse import OptionParser +## parser = OptionParser() +## for group in opts: +## for option in opts[group]: +## if option[-1] in [True,False]: +## if option[-1] == True: +## action = "store_false" +## else: +## action = "store_true" +## parser.add_option(option[0],option[1], default=option[-1], +## dest=option[1].lstrip('--'), help=option[2], action=action) +## elif type(option[-2][0]) == list: +## parser.add_option(option[0],option[1], default=option[-1], +## dest=option[1].lstrip('--'), help=option[2], action="append") +## else: +## parser.add_option(option[0],option[1], default=option[-1], +## dest=option[1].lstrip('--'), help=option[2]) + +## (cmd_line_opt, args) = parser.parse_args() +## argiter = iter(args) #sys.argv[1:]) +## for arg in argiter: +## try: +## listen_port = int(arg) +## except ValueError: +## if os.path.isfile(arg+'.py'): +## assert not os.path.isfile(arg), ( +## "ambiguous file naming, please rename %s" % arg) +## targetspec = arg +## elif os.path.isfile(arg) and arg.endswith('.py'): +## targetspec = arg[:-3] +## t = None +## options = {} +## for opt in parser.option_list[1:]: +## options[opt.dest] = getattr(cmd_line_opt,opt.dest) +## if options.get('gc') == 'boehm': +## options['-boehm'] = True +## ## if options['-tcc']: +## ## os.environ['PYPY_CC'] = 'tcc -shared -o "%s.so" "%s.c"' +## if cmd_line_opt.debug: +## annmodel.DEBUG = True +## try: +## err = None +## if cmd_line_opt.load: +## loaded_dic = load(cmd_line_opt.load) +## t = loaded_dic['trans'] +## entry_point = t.entrypoint +## inputtypes = loaded_dic['inputtypes'] +## targetspec_dic = loaded_dic['targetspec_dic'] +## targetspec = loaded_dic['targetspec'] +## old_options = loaded_dic['options'] +## for name in 'no_a specialize optimize'.split(): +## # if one of these options has not been set, before, +## # then the action has been done and must be prevented, now. 
+## if not old_options[name]: +## if options[name]: +## print 'option %s is implied by the load' % name +## options[name] = True +## print "continuing Analysis as defined by %s, loaded from %s" %( +## targetspec, cmd_line_opt.loadname) +## targetspec_dic['target'] = None +## else: +## targetspec_dic = {} +## sys.path.insert(0, os.path.dirname(targetspec)) +## execfile(targetspec+'.py', targetspec_dic) +## print "Analysing target as defined by %s" % targetspec +## if targetspec_dic.get('options', None): +## targetspec_dic['options'].update(options) +## options = targetspec_dic['options'] +## print options,targetspec_dic['options'] +## print 'options in effect:' +## optnames = options.keys() +## optnames.sort() +## for name in optnames: +## print ' %25s: %s' %(name, options[name]) + +## policy = AnnotatorPolicy() +## target = targetspec_dic['target'] +## if target: +## spec = target(cmd_line_opt, []) # xxx rest args +## try: +## entry_point, inputtypes, policy = spec +## except ValueError: +## entry_point, inputtypes = spec +## t = Translator(entry_point, verbose=True, simplifying=True) +## a = None +## else: +## # otherwise we have been loaded +## a = t.annotator +## t.frozen = False +## if listen_port: +## from pypy.translator.tool.graphserver import run_async_server +## serv_start, serv_show, serv_stop, serv_cleanup = run_async_server(t, listen_port) +## try: +## standalone = analyse(t, inputtypes) +## except TyperError: +## err = sys.exc_info() +## print '-'*60 +## if cmd_line_opt.save: +## print 'saving state to %s' % cmd_line_opt.save +## if err: +## print '*** this save is done after errors occured ***' +## save(t, cmd_line_opt.save, +## trans=t, +## inputtypes=inputtypes, +## targetspec=targetspec, +## targetspec_dic=targetspec_dic, +## options=options, +## ) +## if err: +## raise err[0], err[1], err[2] +## if cmd_line_opt.backend == 'c': #XXX probably better to supply gcpolicy as string to the backends +## gcpolicy = None +## if cmd_line_opt.gc =='boehm': +## from pypy.translator.c import gc +## gcpolicy = gc.BoehmGcPolicy +## if cmd_line_opt.gc == 'none': +## from pypy.translator.c import gc +## gcpolicy = gc.NoneGcPolicy +## elif cmd_line_opt.backend == 'llvm': +## gcpolicy = cmd_line_opt.gc + +## if cmd_line_opt.backend == 'llinterpret': +## def interpret(): +## import py +## from pypy.rpython.llinterp import LLInterpreter +## py.log.setconsumer("llinterp operation", None) +## interp = LLInterpreter(t.flowgraphs, t.rtyper) +## interp.eval_function(entry_point, +## targetspec_dic['get_llinterp_args']()) +## interpret() +## elif not cmd_line_opt.gencode: +## print 'Not generating C code.' +## else: +## print 'Generating %s %s code...' %(cmd_line_opt.compile and "and compiling" or "",cmd_line_opt.backend) +## keywords = {'really_compile' : cmd_line_opt.compile, +## 'standalone' : standalone, +## 'gcpolicy' : gcpolicy} +## c_entry_point = t.compile(cmd_line_opt.backend, **keywords) - if standalone: # xxx fragile and messy - import shutil - exename = mkexename(c_entry_point) - newexename = mkexename('./pypy-' + cmd_line_opt.backend) - shutil.copy(exename, newexename) - c_entry_point = newexename - update_usession_dir() - print 'Written %s.' % (c_entry_point,) - if cmd_line_opt.run: - print 'Running!' 
- if standalone: - os.system(c_entry_point) - else: - targetspec_dic['run'](c_entry_point) - except SystemExit: - raise - except: - if t: debug(True) - raise SystemExit(1) - else: - if t: debug(False) +## if standalone: # xxx fragile and messy +## import shutil +## exename = mkexename(c_entry_point) +## newexename = mkexename('./pypy-' + cmd_line_opt.backend) +## shutil.copy(exename, newexename) +## c_entry_point = newexename +## update_usession_dir() +## print 'Written %s.' % (c_entry_point,) +## if cmd_line_opt.run: +## print 'Running!' +## if standalone: +## os.system(c_entry_point) +## else: +## targetspec_dic['run'](c_entry_point) +## except SystemExit: +## raise +## except: +## if t: debug(True) +## raise SystemExit(1) +## else: +## if t: debug(False) From pedronis at codespeak.net Thu Sep 29 20:48:04 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Thu, 29 Sep 2005 20:48:04 +0200 (CEST) Subject: [pypy-svn] r17985 - pypy/dist/pypy/translator/goal Message-ID: <20050929184804.E2DDF27BA7@code1.codespeak.net> Author: pedronis Date: Thu Sep 29 20:48:04 2005 New Revision: 17985 Modified: pypy/dist/pypy/translator/goal/driver.py Log: work-in-progress Modified: pypy/dist/pypy/translator/goal/driver.py ============================================================================== --- pypy/dist/pypy/translator/goal/driver.py (original) +++ pypy/dist/pypy/translator/goal/driver.py Thu Sep 29 20:48:04 2005 @@ -185,9 +185,14 @@ return proceed_with_task raise AttribueError, name - def from_targetspec(targetspec_dic, options=None): - target = targetspec_dic['target'] - spec = target(not options.lowmem) + def from_targetspec(targetspec_dic, options=None, args=None): + if args is None: + args = [] + if options is None: + options = DEFAULT_OPTIONS.copy() + + target = targetspec_dic['target'] + spec = target(options, arg) try: entry_point, inputtypes, policy = spec except ValueError: From arigo at codespeak.net Thu Sep 29 23:53:13 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Thu, 29 Sep 2005 23:53:13 +0200 (CEST) Subject: [pypy-svn] r17986 - in pypy/dist/pypy/objspace/flow: . test Message-ID: <20050929215313.C0E2027BA7@code1.codespeak.net> Author: arigo Date: Thu Sep 29 23:53:07 2005 New Revision: 17986 Modified: pypy/dist/pypy/objspace/flow/model.py pypy/dist/pypy/objspace/flow/test/test_model.py pypy/dist/pypy/objspace/flow/test/test_objspace.py Log: * restored the depth-first behavior of traverse(). * cleaned up the tests and put in test_model some tests that really test model.py. 
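For illustration only (this sketch is not part of the r17986 diff below): the revision switches iterblocks(), iterlinks() and traverse() to a stack-based, depth-first walk of the flow graph. A minimal standalone version of that pattern, assuming blocks expose an "exits" list of links and each link exposes a "target" block as in pypy.objspace.flow.model, could look like this:

    def iter_blocks_depth_first(startblock):
        # Visit every reachable block exactly once, exploring a block's
        # successors before its siblings (depth-first order).
        seen = {id(startblock): True}
        stack = [startblock]
        while stack:
            block = stack.pop()
            yield block
            # Push exits in reverse so that exits[0] is popped (and thus
            # explored) first, matching the [::-1] trick in the diff.
            for link in block.exits[::-1]:
                if id(link.target) not in seen:
                    seen[id(link.target)] = True
                    stack.append(link.target)

The actual change below follows the same idea but keeps Link objects on the stack as well, so that traverse() can visit each Link in between the Blocks it connects.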
Modified: pypy/dist/pypy/objspace/flow/model.py ============================================================================== --- pypy/dist/pypy/objspace/flow/model.py (original) +++ pypy/dist/pypy/objspace/flow/model.py Thu Sep 29 23:53:07 2005 @@ -69,26 +69,28 @@ source = roproperty(getsource) def iterblocks(self): - pending = [self.startblock] - seen = {id(self.startblock): True} - for block in pending: - yield block - for link in block.exits: - targetid = id(link.target) - if targetid not in seen: - pending.append(link.target) - seen[targetid] = True + block = self.startblock + yield block + seen = {id(block): True} + stack = list(block.exits[::-1]) + while stack: + block = stack.pop().target + if id(block) not in seen: + yield block + seen[id(block)] = True + stack += block.exits[::-1] def iterlinks(self): - pending = [self.startblock] - seen = {id(self.startblock): True} - for block in pending: - for link in block.exits: - yield link - targetid = id(link.target) - if targetid not in seen: - pending.append(link.target) - seen[targetid] = True + block = self.startblock + seen = {id(block): True} + stack = list(block.exits[::-1]) + while stack: + link = stack.pop() + yield link + block = link.target + if id(block) not in seen: + seen[id(block)] = True + stack += block.exits[::-1] def show(self): from pypy.translator.tool.graphpage import SingleGraphPage @@ -403,16 +405,18 @@ ## raise ValueError, "could not dispatch %r" % cls def traverse(visit, functiongraph): - pending = [functiongraph.startblock] - seen = {id(functiongraph.startblock): True} - for block in pending: - visit(block) - for link in block.exits: - visit(link) - targetid = id(link.target) - if targetid not in seen: - pending.append(link.target) - seen[targetid] = True + block = functiongraph.startblock + visit(block) + seen = {id(block): True} + stack = list(block.exits[::-1]) + while stack: + link = stack.pop() + visit(link) + block = link.target + if id(block) not in seen: + visit(block) + seen[id(block)] = True + stack += block.exits[::-1] def flatten(funcgraph): @@ -431,11 +435,9 @@ "Returns a dict mapping Blocks to lists of Links." 
startlink = Link(funcgraph.getargs(), funcgraph.startblock) result = {funcgraph.startblock: [startlink]} - def visit(link): - if isinstance(link, Link): - lst = result.setdefault(link.target, []) - lst.append(link) - traverse(visit, funcgraph) + for link in funcgraph.iterlinks(): + lst = result.setdefault(link.target, []) + lst.append(link) return result def checkgraph(graph): Modified: pypy/dist/pypy/objspace/flow/test/test_model.py ============================================================================== --- pypy/dist/pypy/objspace/flow/test/test_model.py (original) +++ pypy/dist/pypy/objspace/flow/test/test_model.py Thu Sep 29 23:53:07 2005 @@ -1,95 +1,128 @@ -import autopath +import autopath, inspect from pypy.objspace.flow.model import * -from pypy.objspace.flow import FlowObjSpace -class TestModel: - def setup_class(cls): - cls.space = FlowObjSpace() - - def getflow(self, func): - import inspect - try: - func = func.im_func - except AttributeError: - pass - # disable implicit exceptions to keep the graphs simple and checkable - self.space.handle_implicit_exceptions = lambda exceptions: None - try: - return self.space.build_flow(func) - finally: - del self.space.handle_implicit_exceptions - - #_____________________________________________ - def simplefunc(x): - return x+1 - - def test_simplefunc(self): - graph = self.getflow(self.simplefunc) - assert all_operations(graph) == {'add': 1} - -## def test_class(self): -## graph = self.getflow(self.simplefunc) - -## class MyVisitor: -## def __init__(self): -## self.blocks = [] -## self.links = [] - -## def visit_FunctionGraph(self, graph): -## self.graph = graph -## def visit_Block(self, block): -## self.blocks.append(block) -## def visit_Link(self, link): -## self.links.append(link) - -## v = MyVisitor() -## traverse(v, graph) -## #assert len(v.blocks) == 2 -## #assert len(v.links) == 1 -## assert v.graph == graph -## assert v.links[0] == graph.startblock.exits[0] - -## def test_partial_class(self): -## graph = self.getflow(self.simplefunc) - -## class MyVisitor: -## def __init__(self): -## self.blocks = [] -## self.links = [] - -## def visit_FunctionGraph(self, graph): -## self.graph = graph -## def visit_Block(self, block): -## self.blocks.append(block) -## def visit(self, link): -## self.links.append(link) - -## v = MyVisitor() -## traverse(v, graph) -## assert len(v.blocks) == 2 -## assert len(v.links) == 1 -## assert v.graph == graph -## assert v.links[0] == graph.startblock.exits[0] - - def loop(x): - x = abs(x) - while x: - x = x - 1 - - def test_loop(self): - graph = self.getflow(self.loop) - assert all_operations(graph) == {'abs': 1, - 'is_true': 1, - 'sub': 1} - - -def all_operations(graph): - result = {} - def visit(node): - if isinstance(node, Block): - for op in node.operations: - result.setdefault(op.opname, 0) - result[op.opname] += 1 - traverse(visit, graph) - return result + +def sample_function(i): + sum = 0 + while i > 0: + sum = sum + i + i = i - 1 + return sum + +class pieces: + """ The manually-built graph corresponding to the sample_function(). 
+ """ + i = Variable("i") + i1 = Variable("i1") + i2 = Variable("i2") + i3 = Variable("i3") + sum1 = Variable("sum1") + sum2 = Variable("sum2") + sum3 = Variable("sum3") + + conditionres = Variable("conditionres") + conditionop = SpaceOperation("gt", [i1, Constant(0)], conditionres) + addop = SpaceOperation("add", [sum2, i2], sum3) + decop = SpaceOperation("sub", [i2, Constant(1)], i3) + startblock = Block([i]) + headerblock = Block([i1, sum1]) + whileblock = Block([i2, sum2]) + + graph = FunctionGraph("f", startblock) + startblock.closeblock(Link([i, Constant(0)], headerblock)) + headerblock.operations.append(conditionop) + headerblock.exitswitch = conditionres + headerblock.closeblock(Link([sum1], graph.returnblock, False), + Link([i1, sum1], whileblock, True)) + whileblock.operations.append(addop) + whileblock.operations.append(decop) + whileblock.closeblock(Link([i3, sum3], headerblock)) + + graph.func = sample_function + +graph = pieces.graph + +# ____________________________________________________________ + +def test_checkgraph(): + checkgraph(graph) + +def test_graphattributes(): + assert graph.startblock is pieces.startblock + assert graph.returnblock is pieces.headerblock.exits[0].target + assert graph.getargs() == [pieces.i] + assert [graph.getreturnvar()] == graph.returnblock.inputargs + assert graph.getsource() == inspect.getsource(sample_function) + +def test_iterblocks(): + assert list(graph.iterblocks()) == [pieces.startblock, + pieces.headerblock, + graph.returnblock, + pieces.whileblock] + +def test_iterlinks(): + assert list(graph.iterlinks()) == [pieces.startblock.exits[0], + pieces.headerblock.exits[0], + pieces.headerblock.exits[1], + pieces.whileblock.exits[0]] + +def test_traverse(): + lst = [] + traverse(lst.append, graph) + assert lst == [pieces.startblock, + pieces.startblock.exits[0], + pieces.headerblock, + pieces.headerblock.exits[0], + graph.returnblock, + pieces.headerblock.exits[1], + pieces.whileblock, + pieces.whileblock.exits[0]] + assert flatten(graph) == lst + +def test_mkentrymap(): + entrymap = mkentrymap(graph) + startlink = entrymap[graph.startblock][0] + assert entrymap == { + pieces.startblock: [startlink], + pieces.headerblock: [pieces.startblock.exits[0], + pieces.whileblock.exits[0]], + graph.returnblock: [pieces.headerblock.exits[0]], + pieces.whileblock: [pieces.headerblock.exits[1]], + } + +def test_blockattributes(): + block = pieces.whileblock + assert block.getvariables() == [pieces.i2, + pieces.sum2, + pieces.sum3, + pieces.i3] + assert block.getconstants() == [Constant(1)] + +def test_renamevariables(): + block = pieces.whileblock + v = Variable() + block.renamevariables({pieces.sum2: v}) + assert block.getvariables() == [pieces.i2, + v, + pieces.sum3, + pieces.i3] + block.renamevariables({v: pieces.sum2}) + assert block.getvariables() == [pieces.i2, + pieces.sum2, + pieces.sum3, + pieces.i3] + +def test_variable(): + v = Variable() + assert v.name[0] == 'v' and v.name[1:].isdigit() + assert not v.renamed + num = int(v.name[1:]) + v.rename("foobar") + assert v.name == "foobar_%d" % num + assert v.renamed + v.rename("not again") + assert v.name == "foobar_%d" % num + v2 = Variable(v) + assert v2.renamed + assert v2.name.startswith("foobar_") and v2.name != v.name Modified: pypy/dist/pypy/objspace/flow/test/test_objspace.py ============================================================================== --- pypy/dist/pypy/objspace/flow/test/test_objspace.py (original) +++ pypy/dist/pypy/objspace/flow/test/test_objspace.py Thu Sep 29 
23:53:07 2005 @@ -34,6 +34,16 @@ def show(self, x): pass # or self.reallyshow(x) + def all_operations(self, graph): + result = {} + def visit(node): + if isinstance(node, Block): + for op in node.operations: + result.setdefault(op.opname, 0) + result[op.opname] += 1 + traverse(visit, graph) + return result + #__________________________________________________________ def nothing(): pass @@ -46,6 +56,14 @@ self.show(x) #__________________________________________________________ + def simplefunc(x): + return x+1 + + def test_simplefunc(self): + graph = self.codetest(self.simplefunc) + assert self.all_operations(graph) == {'add': 1} + + #__________________________________________________________ def simplebranch(i, j): if i < 0: return i @@ -66,6 +84,18 @@ self.show(x) #__________________________________________________________ + def loop(x): + x = abs(x) + while x: + x = x - 1 + + def test_loop(self): + graph = self.codetest(self.loop) + assert self.all_operations(graph) == {'abs': 1, + 'is_true': 1, + 'sub': 1} + + #__________________________________________________________ def print_(i): print i From pedronis at codespeak.net Fri Sep 30 00:35:18 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 30 Sep 2005 00:35:18 +0200 (CEST) Subject: [pypy-svn] r17987 - pypy/dist/pypy/translator/goal Message-ID: <20050929223518.D62EC27BA7@code1.codespeak.net> Author: pedronis Date: Fri Sep 30 00:35:17 2005 New Revision: 17987 Modified: pypy/dist/pypy/translator/goal/targetpypystandalone.py pypy/dist/pypy/translator/goal/translate_pypy.py Log: more forward compatibility Modified: pypy/dist/pypy/translator/goal/targetpypystandalone.py ============================================================================== --- pypy/dist/pypy/translator/goal/targetpypystandalone.py (original) +++ pypy/dist/pypy/translator/goal/targetpypystandalone.py Fri Sep 30 00:35:17 2005 @@ -50,16 +50,16 @@ geninterp = not getattr(options, 'lowmem', False) # obscure hack to stuff the translation options into the translated PyPy - import __main__, pypy.module.sys - options = {} - for key, value in __main__.options.items(): - options[key.lstrip('-')] = value - wrapstr = 'space.wrap(%r)' % (options,) + import pypy.module.sys + d = {} + for key, value in options.__dict__.items(): + d[key.lstrip('-')] = value + wrapstr = 'space.wrap(%r)' % (d,) pypy.module.sys.Module.interpleveldefs['pypy_translation_info'] = wrapstr # disable translation of the whole of classobjinterp.py StdObjSpace.setup_old_style_classes = lambda self: None - if __main__.options.get('-boehm'): + if options.gc == 'boehm': #print "disabling thread with boehm for stabilitiy (combination not tested)" #print "trying threads and boehm" usemodules = [] Modified: pypy/dist/pypy/translator/goal/translate_pypy.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy.py Fri Sep 30 00:35:17 2005 @@ -110,8 +110,17 @@ if target: # forward compatibility import optparse - opts = optparse.Values({'lowmem': options['-t-lowmem']}) - spec = target(opts, []) + fw_opts = options.copy() + if options['-boehm']: + gc_ = 'boehm' + elif options['-no-gc']: + gc_ = 'none' + else: + gc_ = 'ref' + fw_opts['gc'] = gc_ + fw_opts['lowmem'] = options['-t-lowmem'] + fw_opts = optparse.Values(fw_opts) + spec = target(fw_opts, []) try: entry_point, inputtypes, policy = spec except ValueError: From pedronis at codespeak.net Fri Sep 30 
00:38:38 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 30 Sep 2005 00:38:38 +0200 (CEST) Subject: [pypy-svn] r17989 - in pypy/dist/pypy/translator: goal tool tool/pygame Message-ID: <20050929223838.1146E27BA9@code1.codespeak.net> Author: pedronis Date: Fri Sep 30 00:38:35 2005 New Revision: 17989 Modified: pypy/dist/pypy/translator/goal/driver.py pypy/dist/pypy/translator/goal/translate_pypy_new.py pypy/dist/pypy/translator/tool/pdbplus.py pypy/dist/pypy/translator/tool/pygame/server.py Log: recomposed the translate_pypy_new puzzle still do to: - renable forking - finish: print -> logging - display options - some more polish Modified: pypy/dist/pypy/translator/goal/driver.py ============================================================================== --- pypy/dist/pypy/translator/goal/driver.py (original) +++ pypy/dist/pypy/translator/goal/driver.py Fri Sep 30 00:38:35 2005 @@ -26,7 +26,7 @@ class TranslationDriver(SimpleTaskEngine): def __init__(self, translator, inputtypes, policy=None, options=None, - runner=None, disable=[]): + runner=None, disable=[], default_goal = None): SimpleTaskEngine.__init__(self) self.translator = translator @@ -45,12 +45,54 @@ self.options = options self.standalone = standalone - if runner is None: + if runner is None and not standalone: def runner(f): f() self.runner = runner self.done = {} + + maybe_skip = {} + def add_maybe_skip(goal): + if goal in maybe_skip: + return + maybe_skip[goal] = True + for depending in self._depending_on(goal): + add_maybe_skip(depending) + + for goal in self.backend_select_goals(disable): + add_maybe_skip(goal) + self.maybe_skip = maybe_skip.keys() + + if default_goal: + default_goal, = self.backend_select_goals([default_goal]) + if default_goal in self.maybe_skip: + default_goal = None + + self.default_goal = default_goal + + # expose tasks + def expose_task(task): + backend_goal, = self.backend_select_goals([task]) + def proc(): + self.proceed(backend_goal) + setattr(self, task, proc) + + for task in ('annotate', 'rtype', 'backopt', 'source', 'compile', 'run'): + expose_task(task) + + def backend_select_goals(self, goals): + backend = self.options.backend + assert backend + l = [] + for goal in goals: + if goal in self.tasks: + l.append(goal) + else: + goal = "%s_%s" % (goal, backend) + assert goal in self.tasks + l.append(goal) + return l def info(self, msg): print msg @@ -86,11 +128,11 @@ # task_rtype = taskdef(task_rtype, ['annotate'], "RTyping") - def task_backendoptimisations(self): + def task_backopt(self): opt = self.options self.translator.backend_optimizations(ssa_form=opt.backend != 'llvm') # - task_backendoptimisations = taskdef(task_backendoptimisations, + task_backopt = taskdef(task_backopt, ['rtype'], "Back-end optimisations") def task_source_c(self): # xxx messy @@ -113,7 +155,7 @@ self.cbuilder = cbuilder # task_source_c = taskdef(task_source_c, - ['?backendoptimisations', '?rtype', '?annotate'], + ['?backopt', '?rtype', '?annotate'], "Generating c source") def task_compile_c(self): # xxx messy @@ -122,10 +164,10 @@ if self.standalone: self.c_entryp = cbuilder.executable_name + self.info("written: %s" % (c_entryp,)) else: cbuilder.import_module() self.c_entryp = cbuilder.get_entry_point() - #print 'Written %s.' 
% (c_entryp,) # task_compile_c = taskdef(task_compile_c, ['source_c'], "Compiling c source") @@ -152,21 +194,21 @@ raise NotImplementedError, "llinterpret" # xxx # task_llinterpret = taskdef(task_llinterpret, - ['?backendoptimisations', 'rtype'], + ['?backopt', 'rtype'], "LLInterpeting") def task_source_llvm(self): raise NotImplementedError, "source_llvm" # xxx # task_source_llvm = taskdef(task_source_llvm, - ['backendoptimisations', 'rtype'], + ['backopt', 'rtype'], "Generating llvm source") def task_compile_llvm(self): raise NotImplementedError, "compile_llvm" # xxx # task_compile_llvm = taskdef(task_compile_llvm, - ['backendoptimisations', 'rtype'], + ['backopt', 'rtype'], "Compiling llvm source") def task_run_llvm(self): @@ -175,50 +217,49 @@ task_run_llvm = taskdef(task_run_llvm, ['compile_llvm'], "Running compiled llvm source") - def proceed(self, goal): - self._execute([goal]) - - def __getattr__(self, name): # xxx - if name in self.tasks: - def proceed_with_task(): - self.proceed(name) - return proceed_with_task - raise AttribueError, name - - def from_targetspec(targetspec_dic, options=None, args=None): + def proceed(self, goals): + if not goals: + if self.default_goal: + goals = [self.default_goal] + else: + self.info("nothing to do") + return + elif isinstance(goals, str): + goals = [goals] + goals = self.backend_select_goals(goals) + self._execute(goals, task_skip = self.maybe_skip) + + def from_targetspec(targetspec_dic, options=None, args=None, empty_translator=None, + disable=[], + default_goal=None): if args is None: args = [] if options is None: options = DEFAULT_OPTIONS.copy() target = targetspec_dic['target'] - spec = target(options, arg) + spec = target(options, args) try: entry_point, inputtypes, policy = spec except ValueError: entry_point, inputtypes = spec policy = None - translator = Translator(entry_point, verbose=True, simplifying=True) + if empty_translator: + # re-initialize it + empty_translator.__init__(entry_point, verbose=True, simplifying=True) + translator = empty_translator + else: + translator = Translator(entry_point, verbose=True, simplifying=True) driver = TranslationDriver(translator, inputtypes, - policy, options, targetspec_dic['run']) + policy, options, targetspec_dic.get('run'), + disable=disable, + default_goal = default_goal) - return translation + return driver from_targetspec = staticmethod(from_targetspec) - - -# xxx reorg/move around - -def load_target(targetspec): - if not targetspec.endswith('.py'): - targetspec += '.py' - targetspec_dic = {} - sys.path.insert(0, os.path.dirname(targetspec)) - #xxx print - execfile(targetspec, targetspec_dic) - return targetspec_dic # __________ helpers Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Fri Sep 30 00:38:35 2005 @@ -4,8 +4,11 @@ See below """ +import sys, os + import autopath + # dict are not ordered, cheat with #_xyz keys and bunchiter def OPT(*args): return args @@ -22,7 +25,6 @@ opts = { - '0_Annotation': { '0_annotate': [OPT(('-a', '--annotate'), "Annotate", GOAL), OPT(('--no-annotate',), "Don't annotate", SKIP_GOAL)], @@ -40,7 +42,6 @@ OPT(('--no-backopt',), "Don't do backend optimisations", SKIP_GOAL)], }, - '3_Code generation options': { '0_source': [OPT(('-s', '--source'), "Generate source code", GOAL), OPT(('--no-source',), "Don't generate source code", SKIP_GOAL)], 
@@ -60,13 +61,11 @@ '_run': [OPT(('-r', '--run'), "Run compiled code", GOAL), OPT(('--no-run',), "Don't run compiled code", SKIP_GOAL)], }, - '6_General&other options': { '0_batch': [OPT(('--batch',), "Don't run interactive helpers", True)], '1_lowmem': [OPT(('--lowmem',), "Target should try to save memory", True)], - '2_huge': [OPT(('--huge',), "Threshold in the number of functions after which only a local call graph and not a full one is displayed", int)], '3_text': [OPT(('--text',), "Don't start the pygame viewer", True)], @@ -92,13 +91,6 @@ 'default_goals': ['annotate', 'rtype', 'backopt', 'source', 'compile'], 'skipped_goals': ['run'], -## 'annotate': True, -## 'rtype': True, -## 'backopt': True, -## 'source': True, -## 'compile': True, -## 'run': False, - 'lowmem': False, 'debug': False, @@ -144,9 +136,18 @@ if enable: parser.values.goals = parser.values.goals + [goal] else: - parser.values.goals = parser.values.skipped_goals + [goal] + parser.values.skipped_goals = parser.values.skipped_goals + [goal] -def parse_options(): +def load_target(targetspec): + if not targetspec.endswith('.py'): + targetspec += '.py' + targetspec_dic = {} + sys.path.insert(0, os.path.dirname(targetspec)) + #xxx print + execfile(targetspec, targetspec_dic) + return targetspec_dic + +def parse_options_and_load_target(): opt_parser = optparse.OptionParser(prog="translate_pypy", formatter=OptHelpFormatter()) for group_name, grp_opts in bunchiter(opts): @@ -173,6 +174,7 @@ opt_setup['type'] = 'string' grp.add_option(*names, **opt_setup) + opt_parser.set_defaults(**defaults) options, args = opt_parser.parse_args() @@ -186,368 +188,76 @@ options.targetspec = arg elif os.path.isfile(arg) and arg.endswith('.py'): options.targetspec = arg[:-3] + + targespec_dic = load_target(options.targetspec) - return options, args + return targespec_dic, options, args def main(): - options, args = parse_options() - - print options, args - -if __name__ == '__main__': - main() + targetspec_dic, options, args = parse_options_and_load_target() -# -## """ -## Command-line options for translate_pypy: - -## See below -## """ - -## opts = { -## 'Annotation':[ -## ['-m', '--lowmem', 'Try to save memory', [True,False], False], -## ['-n', '--no_annotations', "Don't infer annotations", [True,False], False], -## ['-d', '--debug', 'record debug information', [True,False], False], -## ['-i', '--insist', "Dont't stop on first error", [True,False], True]], - -## 'Specialization':[ -## ['-t', '--specialize', "Don't specialize", [True,False], True]], - -## 'Backend optimisation': [ -## ['-o', '--optimize', "Don't optimize (should have different name)", -## [True,False], True ]], - -## 'Process options':[ -## ['-f', '--fork', -## "(UNIX) Create restartable checkpoint after annotation [,specialization]", -## [['fork1','fork2']], [] ], -## ['-l', '--load', "load translator from file", [str], ''], -## ['-s', '--save', "save translator to file", [str], '']], - -## 'Codegeneration options':[ -## ['-g', '--gc', 'Garbage collector', ['ref', 'boehm','none'], 'boehm'], -## ['-b', '--backend', 'Backend selector', ['c','llvm'],'c'], -## ['-w', '--gencode', "Don't generate code", [True,False], True], -## ['-c', '--compile', "Don't compile generated code", [True,False], True]], - -## 'Compilation options':[], + from pypy.translator import translator + from pypy.translator.goal import driver + from pypy.translator.tool.pdbplus import PdbPlusShow + from pypy.translator.tool.graphserver import run_async_server + + t = translator.Translator() + + if 
options.graphserve: + serv_start, serv_show, serv_stop, serv_cleanup = run_async_server(t, options.graphserve) + def server_setup(): + return serv_start, serv_show, serv_stop, serv_cleanup + else: + def server_setup(): + from pypy.translator.tool.pygame.server import run_translator_server + return run_translator_server(t, options) + + pdb_plus_show = PdbPlusShow(t) # need a translator to support extended commands + + def debug(got_error): + tb = None + if got_error: + import traceback + exc, val, tb = sys.exc_info() + print >> sys.stderr + traceback.print_exception(exc, val, tb) + print >> sys.stderr -## 'Run options':[ -## ['-r', '--run', "Don't run the compiled code", [True,False], True], -## ['-x', '--batch', "Dont run interactive helpers", [True,False], False]], -## 'Pygame options':[ -## ['-p', '--pygame', "Dont run pygame", [True,False], True], -## ['-H', '--huge', -## "Threshold in the number of functions after which only a local call graph and not a full one is displayed", [int], 0 ]]} - -## import autopath, sys, os - -## if '-use-snapshot' in sys.argv: -## # xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx -## basedir = autopath.this_dir - -## pypy_translation_snapshot_dir = os.path.join(basedir, 'pypy-translation-snapshot') - -## if not os.path.isdir(pypy_translation_snapshot_dir): -## print """ -## Translation will be performed on a specific revision of PyPy which lives on -## a branch. This needs to be checked out into translator/goal with: - -## svn co http://codespeak.net/svn/pypy/branch/pypy-translation-snapshot -## """[1:] -## sys.exit(2) - -## # override imports from pypy head with imports from pypy-translation-snapshot -## import pypy -## pypy.__path__.insert(0, pypy_translation_snapshot_dir) - -## # complement imports from pypy.objspace (from pypy-translation-snapshot) -## # with pypy head objspace/ -## import pypy.objspace -## pypy.objspace.__path__.append(os.path.join(autopath.pypydir, 'objspace')) - -## print "imports redirected to pypy-translation-snapshot." 
- -## # xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx - - -## import threading, pdb - -## from pypy.translator.translator import Translator -## from pypy.annotation import model as annmodel -## from pypy.annotation import listdef -## from pypy.annotation.policy import AnnotatorPolicy -## from pypy.translator.pickle.main import load, save -## # catch TyperError to allow for post-mortem dump -## from pypy.rpython.error import TyperError - -## from pypy.translator.goal import query - -## # XXX this tries to make compiling faster -## from pypy.translator.tool import cbuild -## cbuild.enable_fast_compilation() -## from pypy.translator.tool.util import update_usession_dir -## from pypy.translator.tool.util import assert_rpython_mostly_not_imported, mkexename - -## annmodel.DEBUG = False - - - -## # __________ Main __________ - -## def sanity_check_annotation(t): -## irreg = query.qoutput(query.check_exceptblocks_qgen(t)) -## if not irreg: -## print "++ All exceptblocks seem sane" - -## lost = query.qoutput(query.check_methods_qgen(t)) -## assert not lost, "lost methods, something gone wrong with the annotation of method defs" -## print "++ No lost method defs" - -## so = query.qoutput(query.polluted_qgen(t)) -## tot = len(t.flowgraphs) -## percent = int(tot and (100.0*so / tot) or 0) -## print "-- someobjectness %2d (%d of %d functions polluted by SomeObjects)" % (percent, so, tot) - -## def analyse(t, inputtypes): - -## standalone = inputtypes is None -## if standalone: -## ldef = listdef.ListDef(None, annmodel.SomeString()) -## inputtypes = [annmodel.SomeList(ldef)] - -## if not cmd_line_opt.no_annotations: -## print 'Annotating...' -## print 'with policy: %s.%s' % (policy.__class__.__module__, policy.__class__.__name__) -## a = t.annotate(inputtypes, policy=policy) -## sanity_check_annotation(t) - -## if a: #and not options['-no-s']: -## print 'Simplifying...' -## a.simplify() -## if 'fork1' in cmd_line_opt.fork: -## from pypy.translator.goal import unixcheckpoint -## assert_rpython_mostly_not_imported() -## unixcheckpoint.restartable_point(auto='run') -## if a and cmd_line_opt.specialize: -## print 'Specializing...' -## t.specialize(dont_simplify_again=True, -## crash_on_first_typeerror=not cmd_line_opt.insist) -## if cmd_line_opt.optimize: -## print 'Back-end optimizations...' -## t.backend_optimizations(ssa_form=cmd_line_opt.backend != 'llvm') -## if a and 'fork2' in cmd_line_opt.fork: -## from pypy.translator.goal import unixcheckpoint -## unixcheckpoint.restartable_point(auto='run') -## if a: -## t.frozen = True # cannot freeze if we don't have annotations -## return standalone - -## # graph servers - -## serv_start, serv_show, serv_stop, serv_cleanup = None, None, None, None - -## if __name__ == '__main__': + block = getattr(val, '__annotator_block', None) + if block: + print '-'*60 + t.about(block) + print '-'*60 + print + else: + print '-'*60 + print 'Done.' 
+ print + + if options.batch: + print >>sys.stderr, "batch mode, not calling interactive helpers" + return + + pdb_plus_show.start(tb, server_setup, graphic=not options.text) + + try: + drv = driver.TranslationDriver.from_targetspec(targetspec_dic, options, args, + empty_translator=t, + disable=options.skipped_goals, + default_goal='compile') + pdb_plus_show.expose({'drv': drv}) -## targetspec = 'targetpypystandalone' -## listen_port = None - -## def debug(got_error): -## from pypy.translator.tool.pdbplus import PdbPlusShow + goals = options.goals + drv.proceed(goals) -## pdb_plus_show = PdbPlusShow(t) # need a translator to support extended commands + except SystemExit: + raise + except: + debug(True) + raise SystemExit(1) + else: + debug(False) -## tb = None -## if got_error: -## import traceback -## exc, val, tb = sys.exc_info() -## print >> sys.stderr -## traceback.print_exception(exc, val, tb) -## print >> sys.stderr - -## block = getattr(val, '__annotator_block', None) -## if block: -## print '-'*60 -## t.about(block) -## print '-'*60 -## print -## else: -## print '-'*60 -## print 'Done.' -## print - -## if cmd_line_opt.batch: -## print >>sys.stderr, "batch mode, not calling interactive helpers" -## return - -## def server_setup(): -## if serv_start: -## return serv_start, serv_show, serv_stop, serv_cleanup -## else: -## from pypy.translator.tool.pygame.server import run_translator_server -## return run_translator_server(t, entry_point, cmd_line_opt) - -## pdb_plus_show.start(tb, server_setup, graphic=cmd_line_opt.pygame) - - -## from optparse import OptionParser -## parser = OptionParser() -## for group in opts: -## for option in opts[group]: -## if option[-1] in [True,False]: -## if option[-1] == True: -## action = "store_false" -## else: -## action = "store_true" -## parser.add_option(option[0],option[1], default=option[-1], -## dest=option[1].lstrip('--'), help=option[2], action=action) -## elif type(option[-2][0]) == list: -## parser.add_option(option[0],option[1], default=option[-1], -## dest=option[1].lstrip('--'), help=option[2], action="append") -## else: -## parser.add_option(option[0],option[1], default=option[-1], -## dest=option[1].lstrip('--'), help=option[2]) - -## (cmd_line_opt, args) = parser.parse_args() -## argiter = iter(args) #sys.argv[1:]) -## for arg in argiter: -## try: -## listen_port = int(arg) -## except ValueError: -## if os.path.isfile(arg+'.py'): -## assert not os.path.isfile(arg), ( -## "ambiguous file naming, please rename %s" % arg) -## targetspec = arg -## elif os.path.isfile(arg) and arg.endswith('.py'): -## targetspec = arg[:-3] -## t = None -## options = {} -## for opt in parser.option_list[1:]: -## options[opt.dest] = getattr(cmd_line_opt,opt.dest) -## if options.get('gc') == 'boehm': -## options['-boehm'] = True -## ## if options['-tcc']: -## ## os.environ['PYPY_CC'] = 'tcc -shared -o "%s.so" "%s.c"' -## if cmd_line_opt.debug: -## annmodel.DEBUG = True -## try: -## err = None -## if cmd_line_opt.load: -## loaded_dic = load(cmd_line_opt.load) -## t = loaded_dic['trans'] -## entry_point = t.entrypoint -## inputtypes = loaded_dic['inputtypes'] -## targetspec_dic = loaded_dic['targetspec_dic'] -## targetspec = loaded_dic['targetspec'] -## old_options = loaded_dic['options'] -## for name in 'no_a specialize optimize'.split(): -## # if one of these options has not been set, before, -## # then the action has been done and must be prevented, now. 
-## if not old_options[name]: -## if options[name]: -## print 'option %s is implied by the load' % name -## options[name] = True -## print "continuing Analysis as defined by %s, loaded from %s" %( -## targetspec, cmd_line_opt.loadname) -## targetspec_dic['target'] = None -## else: -## targetspec_dic = {} -## sys.path.insert(0, os.path.dirname(targetspec)) -## execfile(targetspec+'.py', targetspec_dic) -## print "Analysing target as defined by %s" % targetspec -## if targetspec_dic.get('options', None): -## targetspec_dic['options'].update(options) -## options = targetspec_dic['options'] -## print options,targetspec_dic['options'] -## print 'options in effect:' -## optnames = options.keys() -## optnames.sort() -## for name in optnames: -## print ' %25s: %s' %(name, options[name]) - -## policy = AnnotatorPolicy() -## target = targetspec_dic['target'] -## if target: -## spec = target(cmd_line_opt, []) # xxx rest args -## try: -## entry_point, inputtypes, policy = spec -## except ValueError: -## entry_point, inputtypes = spec -## t = Translator(entry_point, verbose=True, simplifying=True) -## a = None -## else: -## # otherwise we have been loaded -## a = t.annotator -## t.frozen = False -## if listen_port: -## from pypy.translator.tool.graphserver import run_async_server -## serv_start, serv_show, serv_stop, serv_cleanup = run_async_server(t, listen_port) -## try: -## standalone = analyse(t, inputtypes) -## except TyperError: -## err = sys.exc_info() -## print '-'*60 -## if cmd_line_opt.save: -## print 'saving state to %s' % cmd_line_opt.save -## if err: -## print '*** this save is done after errors occured ***' -## save(t, cmd_line_opt.save, -## trans=t, -## inputtypes=inputtypes, -## targetspec=targetspec, -## targetspec_dic=targetspec_dic, -## options=options, -## ) -## if err: -## raise err[0], err[1], err[2] -## if cmd_line_opt.backend == 'c': #XXX probably better to supply gcpolicy as string to the backends -## gcpolicy = None -## if cmd_line_opt.gc =='boehm': -## from pypy.translator.c import gc -## gcpolicy = gc.BoehmGcPolicy -## if cmd_line_opt.gc == 'none': -## from pypy.translator.c import gc -## gcpolicy = gc.NoneGcPolicy -## elif cmd_line_opt.backend == 'llvm': -## gcpolicy = cmd_line_opt.gc - -## if cmd_line_opt.backend == 'llinterpret': -## def interpret(): -## import py -## from pypy.rpython.llinterp import LLInterpreter -## py.log.setconsumer("llinterp operation", None) -## interp = LLInterpreter(t.flowgraphs, t.rtyper) -## interp.eval_function(entry_point, -## targetspec_dic['get_llinterp_args']()) -## interpret() -## elif not cmd_line_opt.gencode: -## print 'Not generating C code.' -## else: -## print 'Generating %s %s code...' %(cmd_line_opt.compile and "and compiling" or "",cmd_line_opt.backend) -## keywords = {'really_compile' : cmd_line_opt.compile, -## 'standalone' : standalone, -## 'gcpolicy' : gcpolicy} -## c_entry_point = t.compile(cmd_line_opt.backend, **keywords) - -## if standalone: # xxx fragile and messy -## import shutil -## exename = mkexename(c_entry_point) -## newexename = mkexename('./pypy-' + cmd_line_opt.backend) -## shutil.copy(exename, newexename) -## c_entry_point = newexename -## update_usession_dir() -## print 'Written %s.' % (c_entry_point,) -## if cmd_line_opt.run: -## print 'Running!' 
-## if standalone: -## os.system(c_entry_point) -## else: -## targetspec_dic['run'](c_entry_point) -## except SystemExit: -## raise -## except: -## if t: debug(True) -## raise SystemExit(1) -## else: -## if t: debug(False) + +if __name__ == '__main__': + main() Modified: pypy/dist/pypy/translator/tool/pdbplus.py ============================================================================== --- pypy/dist/pypy/translator/tool/pdbplus.py (original) +++ pypy/dist/pypy/translator/tool/pdbplus.py Fri Sep 30 00:38:35 2005 @@ -8,6 +8,7 @@ def __init__(self, translator): pdb.Pdb.__init__(self) self.translator = translator + self.exposed = {} def post_mortem(self, t): self.reset() @@ -15,6 +16,9 @@ t = t.tb_next self.interaction(t.tb_frame, t) + def expose(self, d): + self.exposed.update(d) + show = None def install_show(self, show): @@ -360,6 +364,8 @@ fn, args = self.post_mortem, (tb,) try: t = self.translator # define enviroments, xxx more stuff + exec "" + locals().update(self.exposed) fn(*args) pass # for debugger to land except pdb.bdb.BdbQuit: Modified: pypy/dist/pypy/translator/tool/pygame/server.py ============================================================================== --- pypy/dist/pypy/translator/tool/pygame/server.py (original) +++ pypy/dist/pypy/translator/tool/pygame/server.py Fri Sep 30 00:38:35 2005 @@ -1,5 +1,5 @@ -def run_translator_server(t, entry_point, options): +def run_translator_server(t, options): from pypy.translator.tool import graphpage import pygame from pypy.translator.tool.pygame.graphclient import get_layout From pedronis at codespeak.net Fri Sep 30 00:45:01 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 30 Sep 2005 00:45:01 +0200 (CEST) Subject: [pypy-svn] r17990 - pypy/dist/pypy/translator/goal Message-ID: <20050929224501.45AA527BA7@code1.codespeak.net> Author: pedronis Date: Fri Sep 30 00:44:59 2005 New Revision: 17990 Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py Log: usage should include [target] Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Fri Sep 30 00:44:59 2005 @@ -148,7 +148,7 @@ return targetspec_dic def parse_options_and_load_target(): - opt_parser = optparse.OptionParser(prog="translate_pypy", + opt_parser = optparse.OptionParser(usage="%prog [options] [target]", prog="translate_pypy", formatter=OptHelpFormatter()) for group_name, grp_opts in bunchiter(opts): grp = opt_parser.add_option_group(group_name) From pedronis at codespeak.net Fri Sep 30 01:02:07 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 30 Sep 2005 01:02:07 +0200 (CEST) Subject: [pypy-svn] r17991 - pypy/dist/pypy/translator/goal Message-ID: <20050929230207.B438127BA7@code1.codespeak.net> Author: pedronis Date: Fri Sep 30 01:02:06 2005 New Revision: 17991 Modified: pypy/dist/pypy/translator/goal/driver.py Log: fixes Modified: pypy/dist/pypy/translator/goal/driver.py ============================================================================== --- pypy/dist/pypy/translator/goal/driver.py (original) +++ pypy/dist/pypy/translator/goal/driver.py Fri Sep 30 01:02:06 2005 @@ -164,7 +164,7 @@ if self.standalone: self.c_entryp = cbuilder.executable_name - self.info("written: %s" % (c_entryp,)) + self.info("written: %s" % (self.c_entryp,)) else: cbuilder.import_module() self.c_entryp = 
cbuilder.get_entry_point() @@ -276,6 +276,6 @@ so = query.qoutput(query.polluted_qgen(t)) tot = len(t.flowgraphs) percent = int(tot and (100.0*so / tot) or 0) - print "-- someobjectness %2d (%d of %d functions polluted by SomeObjects)" % (percent, so, tot) + print "-- someobjectness %2d%% (%d of %d functions polluted by SomeObjects)" % (percent, so, tot) from pypy.translator.tool.util import mkexename From pedronis at codespeak.net Fri Sep 30 01:07:36 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 30 Sep 2005 01:07:36 +0200 (CEST) Subject: [pypy-svn] r17992 - pypy/dist/pypy/translator/tool Message-ID: <20050929230736.014D527BA7@code1.codespeak.net> Author: pedronis Date: Fri Sep 30 01:07:35 2005 New Revision: 17992 Modified: pypy/dist/pypy/translator/tool/cbuild.py Log: mmh Modified: pypy/dist/pypy/translator/tool/cbuild.py ============================================================================== --- pypy/dist/pypy/translator/tool/cbuild.py (original) +++ pypy/dist/pypy/translator/tool/cbuild.py Fri Sep 30 01:07:35 2005 @@ -134,7 +134,7 @@ # XXX do we need to do some check on fout/ferr? # XXX not a nice way to import a module except: - print data + print >>sys.stderr, data raise finally: lastdir.chdir() From pedronis at codespeak.net Fri Sep 30 01:14:15 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 30 Sep 2005 01:14:15 +0200 (CEST) Subject: [pypy-svn] r17993 - pypy/dist/pypy/translator/goal Message-ID: <20050929231415.2D37527BA7@code1.codespeak.net> Author: pedronis Date: Fri Sep 30 01:14:13 2005 New Revision: 17993 Modified: pypy/dist/pypy/translator/goal/driver.py Log: TODO comments Modified: pypy/dist/pypy/translator/goal/driver.py ============================================================================== --- pypy/dist/pypy/translator/goal/driver.py (original) +++ pypy/dist/pypy/translator/goal/driver.py Fri Sep 30 01:14:13 2005 @@ -23,6 +23,10 @@ taskfunc.task_expected_states = expected_states return taskfunc +# TODO: +# run is idempotent +# sanity-checks using states + class TranslationDriver(SimpleTaskEngine): def __init__(self, translator, inputtypes, policy=None, options=None, From pedronis at codespeak.net Fri Sep 30 02:04:49 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 30 Sep 2005 02:04:49 +0200 (CEST) Subject: [pypy-svn] r17994 - in pypy/dist/pypy/translator: goal tool tool/test Message-ID: <20050930000449.C2A7927BA7@code1.codespeak.net> Author: pedronis Date: Fri Sep 30 02:04:47 2005 New Revision: 17994 Modified: pypy/dist/pypy/translator/goal/driver.py pypy/dist/pypy/translator/tool/taskengine.py pypy/dist/pypy/translator/tool/test/test_taskengine.py Log: closure for _depending_on and use it to compute optinally skipped goal based on skipped suggestions Modified: pypy/dist/pypy/translator/goal/driver.py ============================================================================== --- pypy/dist/pypy/translator/goal/driver.py (original) +++ pypy/dist/pypy/translator/goal/driver.py Fri Sep 30 02:04:47 2005 @@ -55,18 +55,11 @@ self.runner = runner self.done = {} - - maybe_skip = {} - def add_maybe_skip(goal): - if goal in maybe_skip: - return - maybe_skip[goal] = True - for depending in self._depending_on(goal): - add_maybe_skip(depending) + maybe_skip = [] for goal in self.backend_select_goals(disable): - add_maybe_skip(goal) - self.maybe_skip = maybe_skip.keys() + maybe_skip.extend(self._depending_on_closure(goal)) + self.maybe_skip = dict.fromkeys(maybe_skip).keys() if 
default_goal: default_goal, = self.backend_select_goals([default_goal]) Modified: pypy/dist/pypy/translator/tool/taskengine.py ============================================================================== --- pypy/dist/pypy/translator/tool/taskengine.py (original) +++ pypy/dist/pypy/translator/tool/taskengine.py Fri Sep 30 02:04:47 2005 @@ -87,6 +87,17 @@ l.append(task_name) return l + def _depending_on_closure(self, goal): + d = {} + def track(goal): + if goal in d: + return + d[goal] = True + for depending in self._depending_on(goal): + track(depending) + track(goal) + return d.keys() + def _execute(self, goals, *args, **kwds): task_skip = kwds.get('task_skip', []) for goal in self._plan(goals, skip=task_skip): Modified: pypy/dist/pypy/translator/tool/test/test_taskengine.py ============================================================================== --- pypy/dist/pypy/translator/tool/test/test_taskengine.py (original) +++ pypy/dist/pypy/translator/tool/test/test_taskengine.py Fri Sep 30 02:04:47 2005 @@ -17,6 +17,17 @@ task_C.task_deps = ['B'] + def task_D(self): + pass + task_D.task_deps = ['E'] + + def task_E(self): + pass + task_E.task_deps = ['F'] + + def task_F(self): + pass + abc = ABC() assert abc._plan('B') == ['B'] @@ -28,6 +39,19 @@ assert dict.fromkeys(abc._depending_on('B'), True) == {'A':True, 'C':True} assert abc._depending_on('A') == [] + assert abc._depending_on('F') == ['E'] + assert abc._depending_on('E') == ['D'] + assert abc._depending_on('D') == [] + + assert abc._depending_on_closure('C') == ['C'] + assert dict.fromkeys(abc._depending_on_closure('B'), True) == {'A':True, 'C':True, 'B': True} + assert abc._depending_on_closure('A') == ['A'] + + assert dict.fromkeys(abc._depending_on_closure('F'), True) == {'D':True, 'E':True, 'F': True} + assert dict.fromkeys(abc._depending_on_closure('E'), True) == {'D':True, 'E':True} + assert abc._depending_on_closure('D') == ['D'] + + def test_execute(): class ABC(SimpleTaskEngine): From pedronis at codespeak.net Fri Sep 30 02:11:53 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 30 Sep 2005 02:11:53 +0200 (CEST) Subject: [pypy-svn] r17995 - pypy/dist/pypy/translator/goal Message-ID: <20050930001153.1914E27BA7@code1.codespeak.net> Author: pedronis Date: Fri Sep 30 02:11:51 2005 New Revision: 17995 Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py Log: don't accumalte a goal option twice Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Fri Sep 30 02:11:51 2005 @@ -134,9 +134,11 @@ def goal_cb(option, opt, value, parser, enable, goal): if enable: - parser.values.goals = parser.values.goals + [goal] + if goal not in parser.values.goals: + parser.values.goals = parser.values.goals + [goal] else: - parser.values.skipped_goals = parser.values.skipped_goals + [goal] + if goal not in parser.values.skipped_goals: + parser.values.skipped_goals = parser.values.skipped_goals + [goal] def load_target(targetspec): if not targetspec.endswith('.py'): From pedronis at codespeak.net Fri Sep 30 02:18:00 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 30 Sep 2005 02:18:00 +0200 (CEST) Subject: [pypy-svn] r17996 - pypy/dist/pypy/translator/goal Message-ID: <20050930001800.1D9EC27BA7@code1.codespeak.net> Author: pedronis Date: Fri Sep 30 02:17:57 2005 New Revision: 
17996 Modified: pypy/dist/pypy/translator/goal/driver.py pypy/dist/pypy/translator/goal/translate_pypy_new.py Log: backopt -> backendop (consistent with the directory name but still reasonably short) Modified: pypy/dist/pypy/translator/goal/driver.py ============================================================================== --- pypy/dist/pypy/translator/goal/driver.py (original) +++ pypy/dist/pypy/translator/goal/driver.py Fri Sep 30 02:17:57 2005 @@ -75,7 +75,7 @@ self.proceed(backend_goal) setattr(self, task, proc) - for task in ('annotate', 'rtype', 'backopt', 'source', 'compile', 'run'): + for task in ('annotate', 'rtype', 'backendopt', 'source', 'compile', 'run'): expose_task(task) def backend_select_goals(self, goals): @@ -125,11 +125,11 @@ # task_rtype = taskdef(task_rtype, ['annotate'], "RTyping") - def task_backopt(self): + def task_backendopt(self): opt = self.options self.translator.backend_optimizations(ssa_form=opt.backend != 'llvm') # - task_backopt = taskdef(task_backopt, + task_backendopt = taskdef(task_backendopt, ['rtype'], "Back-end optimisations") def task_source_c(self): # xxx messy @@ -152,7 +152,7 @@ self.cbuilder = cbuilder # task_source_c = taskdef(task_source_c, - ['?backopt', '?rtype', '?annotate'], + ['?backendopt', '?rtype', '?annotate'], "Generating c source") def task_compile_c(self): # xxx messy @@ -191,21 +191,21 @@ raise NotImplementedError, "llinterpret" # xxx # task_llinterpret = taskdef(task_llinterpret, - ['?backopt', 'rtype'], + ['?backendopt', 'rtype'], "LLInterpeting") def task_source_llvm(self): raise NotImplementedError, "source_llvm" # xxx # task_source_llvm = taskdef(task_source_llvm, - ['backopt', 'rtype'], + ['backendopt', 'rtype'], "Generating llvm source") def task_compile_llvm(self): raise NotImplementedError, "compile_llvm" # xxx # task_compile_llvm = taskdef(task_compile_llvm, - ['backopt', 'rtype'], + ['backendopt', 'rtype'], "Compiling llvm source") def task_run_llvm(self): Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ pypy/dist/pypy/translator/goal/translate_pypy_new.py Fri Sep 30 02:17:57 2005 @@ -38,8 +38,8 @@ }, '2_Backend optimisations': { - '_backopt': [OPT(('-o', '--backopt'), "Do backend optimisations", GOAL), - OPT(('--no-backopt',), "Don't do backend optimisations", SKIP_GOAL)], + '_backendopt': [OPT(('-o', '--backendopt'), "Do backend optimisations", GOAL), + OPT(('--no-backendopt',), "Don't do backend optimisations", SKIP_GOAL)], }, '3_Code generation options': { @@ -88,7 +88,7 @@ 'goals': [], - 'default_goals': ['annotate', 'rtype', 'backopt', 'source', 'compile'], + 'default_goals': ['annotate', 'rtype', 'backendopt', 'source', 'compile'], 'skipped_goals': ['run'], 'lowmem': False, From pedronis at codespeak.net Fri Sep 30 02:50:03 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 30 Sep 2005 02:50:03 +0200 (CEST) Subject: [pypy-svn] r17997 - pypy/dist/pypy/translator/goal Message-ID: <20050930005003.37E1C27BA7@code1.codespeak.net> Author: pedronis Date: Fri Sep 30 02:50:01 2005 New Revision: 17997 Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py Log: manage to list choices in help Modified: pypy/dist/pypy/translator/goal/translate_pypy_new.py ============================================================================== --- pypy/dist/pypy/translator/goal/translate_pypy_new.py (original) +++ 
pypy/dist/pypy/translator/goal/translate_pypy_new.py Fri Sep 30 02:50:01 2005 @@ -48,7 +48,7 @@ '1_backend': [OPT(('-b', '--backend'), "Backend", ['c', 'llvm'])], - '2_gc': [OPT(('--gc',), "Garbage collector", ['ref', 'boehm', 'none'])], + '2_gc': [OPT(('--gc',), "Garbage collector", ['boehm', 'ref', 'none'])], }, @@ -168,6 +168,7 @@ elif isinstance(choice, list): opt_setup['type'] = 'choice' opt_setup['choices'] = choice + opt_setup['metavar'] = "[%s]" % '|'.join(choice) elif isinstance(choice, bool): opt_setup['action'] = ['store_false', 'store_true'][choice] elif choice is int: From ericvrp at codespeak.net Fri Sep 30 09:43:05 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 30 Sep 2005 09:43:05 +0200 (CEST) Subject: [pypy-svn] r17998 - pypy/dist/pypy/translator/goal Message-ID: <20050930074305.3C24927B96@code1.codespeak.net> Author: ericvrp Date: Fri Sep 30 09:43:02 2005 New Revision: 17998 Removed: pypy/dist/pypy/translator/goal/target_pypy-llvm.py Modified: pypy/dist/pypy/translator/goal/run_pypy-llvm.sh pypy/dist/pypy/translator/goal/runtranslate.sh (props changed) Log: cleaning up Modified: pypy/dist/pypy/translator/goal/run_pypy-llvm.sh ============================================================================== --- pypy/dist/pypy/translator/goal/run_pypy-llvm.sh (original) +++ pypy/dist/pypy/translator/goal/run_pypy-llvm.sh Fri Sep 30 09:43:02 2005 @@ -1,30 +1,3 @@ #!/bin/sh export RTYPERORDER=order,module-list.pedronis -# stopping on the first error -#python translate_pypy.py -no-c -no-o -text -fork2 -# running it all -#python translate_pypy.py target_pypy-llvm -text -llvm $* -python translate_pypy_new.py targetpypystandalone --backend=llvm --pygame --batch --fork=fork2 $* - - -# How to work in parallel: -# There is an environment variable to be set with your personal random seed. -# Seeds taken so far are -# Armin: 42, Samuele: 46, Chris: 49, Arre: 97, hpk/rxe: 23 -# Under Windows, use -# SET RTYPERSEED=xx -# where xx is your seed. When you run translate_pypy, you will get a message -# with your seed, if everything is fine. The purpose of the seed is to -# shuffle the annotated blocks, in order to create different errors. - -# To get the above RTYPER problems, do:: - -# RTYPERORDER=order,SOMEFILE -# # stopping on the first error -# python translate_pypy.py -no-c -no-o -fork -text -t-insist - -# # seeing things in the graph -# python translate_pypy.py -no-c -no-o - -# In the SOMEFILE you put: -# pypy.rpython.rarithmetic.ovfcheck_float_to_int +python translate_pypy_new.py targetpypystandalone --backend=llvm --text --batch --no-run $* From ac at codespeak.net Fri Sep 30 11:59:03 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Fri, 30 Sep 2005 11:59:03 +0200 (CEST) Subject: [pypy-svn] r18000 - pypy/dist/lib-python/modified-2.4.1/test Message-ID: <20050930095903.68B3B27BB2@code1.codespeak.net> Author: ac Date: Fri Sep 30 11:59:03 2005 New Revision: 18000 Added: pypy/dist/lib-python/modified-2.4.1/test/test_generators.py - copied, changed from r17998, pypy/dist/lib-python/2.4.1/test/test_generators.py Log: Adjust for difference in error-message. From ac at codespeak.net Fri Sep 30 12:05:45 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Fri, 30 Sep 2005 12:05:45 +0200 (CEST) Subject: [pypy-svn] r18001 - in pypy/dist/pypy/interpreter: . 
astcompiler pyparser test Message-ID: <20050930100545.82FD027BB2@code1.codespeak.net> Author: ac Date: Fri Sep 30 12:05:44 2005 New Revision: 18001 Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py pypy/dist/pypy/interpreter/astcompiler/ast.txt pypy/dist/pypy/interpreter/astcompiler/pycodegen.py pypy/dist/pypy/interpreter/pycompiler.py pypy/dist/pypy/interpreter/pyparser/astbuilder.py pypy/dist/pypy/interpreter/test/test_compiler.py Log: Reject return-statement with an argument inside a generator. Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.py Fri Sep 30 12:05:44 2005 @@ -1477,10 +1477,13 @@ def getChildren(self): "NOT_RPYTHON" - return self.value, + return tuple(flatten(self.value)) def getChildNodes(self): - return [self.value,] + nodelist = [] + if self.value is not None: + nodelist.append(self.value) + return nodelist def __repr__(self): return "Return(%s)" % (repr(self.value),) Modified: pypy/dist/pypy/interpreter/astcompiler/ast.txt ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.txt (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.txt Fri Sep 30 12:05:44 2005 @@ -25,7 +25,7 @@ Raise: expr1&, expr2&, expr3& TryFinally: body, final TryExcept: body, handlers!, else_& -Return: value +Return: value& Yield: value Const: value* NoneConst: Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Fri Sep 30 12:05:44 2005 @@ -1003,7 +1003,12 @@ def visitReturn(self, node): self.set_lineno(node) - node.value.accept( self ) + if node.value is None: + self.emitop_obj('LOAD_CONST', self.space.w_None) + else: + if self.scope.generator: + raise SyntaxError("'return' with argument inside generator", node.lineno) + node.value.accept( self ) self.emit('RETURN_VALUE') def visitYield(self, node): Modified: pypy/dist/pypy/interpreter/pycompiler.py ============================================================================== --- pypy/dist/pypy/interpreter/pycompiler.py (original) +++ pypy/dist/pypy/interpreter/pycompiler.py Fri Sep 30 12:05:44 2005 @@ -424,24 +424,9 @@ codegenerator = ExpressionCodeGenerator(space, ast_tree, flag_names) c = codegenerator.getCode() except SyntaxError, e: - w_synerr = space.newtuple([space.wrap(e.msg), - space.newtuple([space.wrap(e.filename), - space.wrap(e.lineno), - space.wrap(e.offset), - space.wrap(e.text)])]) - raise OperationError(space.w_SyntaxError, w_synerr) -## except UnicodeDecodeError, e: -## # TODO use a custom UnicodeError -## import traceback -## traceback.print_exc() -## raise OperationError(space.w_UnicodeDecodeError, space.newtuple([ -## space.wrap(e.encoding), space.wrap(e.object), space.wrap(e.start), -## space.wrap(e.end), space.wrap(e.reason)])) + raise OperationError(space.w_SyntaxError, + e.wrap_info(space, filename)) except ValueError,e: - #if e.__class__ != ValueError: - # extra_msg = "(Really got %s)" % e.__class__.__name__ - #else: - # extra_msg = "" raise OperationError(space.w_ValueError,space.wrap(str(e))) except TypeError,e: raise Modified: pypy/dist/pypy/interpreter/pyparser/astbuilder.py 
============================================================================== --- pypy/dist/pypy/interpreter/pyparser/astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/astbuilder.py Fri Sep 30 12:05:44 2005 @@ -820,7 +820,7 @@ if len(atoms) > 2: assert False, "return several stmts not implemented" elif len(atoms) == 1: - builder.push(ast.Return(ast.Const(builder.wrap_none(), lineno), lineno)) + builder.push(ast.Return(None, lineno)) else: builder.push(ast.Return(atoms[1], lineno)) Modified: pypy/dist/pypy/interpreter/test/test_compiler.py ============================================================================== --- pypy/dist/pypy/interpreter/test/test_compiler.py (original) +++ pypy/dist/pypy/interpreter/test/test_compiler.py Fri Sep 30 12:05:44 2005 @@ -208,6 +208,13 @@ ex.normalize_exception(self.space) assert ex.match(self.space, self.space.w_SyntaxError) + def test_return_in_generator(self): + code = 'def f():\n return None\n yield 19\n' + e = py.test.raises(OperationError, self.compiler.compile, code, '', 'single', 0) + ex = e.value + ex.normalize_exception(self.space) + assert ex.match(self.space, self.space.w_SyntaxError) + def test_none_assignment(self): stmts = [ 'None = 0', @@ -312,6 +319,9 @@ def test_globals_warnings(self): py.test.skip('INPROGRES') + def test_return_in_generator(self): + py.test.skip('INPROGRES') + class TestPyCCompiler(BaseTestCompiler): def setup_method(self, method): self.compiler = CPythonCompiler(self.space) @@ -322,7 +332,10 @@ def test_globals_warnings(self): py.test.skip('INPROGRES') - + + def test_return_in_generator(self): + py.test.skip('INPROGRES') + class TestPythonAstCompiler(BaseTestCompiler): def setup_method(self, method): self.compiler = PythonAstCompiler(self.space) From cfbolz at codespeak.net Fri Sep 30 13:30:43 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Fri, 30 Sep 2005 13:30:43 +0200 (CEST) Subject: [pypy-svn] r18002 - pypy/extradoc/talk/22c3 Message-ID: <20050930113043.4F6C327BB6@code1.codespeak.net> Author: cfbolz Date: Fri Sep 30 13:30:41 2005 New Revision: 18002 Added: pypy/extradoc/talk/22c3/speaker-beatriceduering.txt - copied unchanged from r18001, pypy/extradoc/talk/22c3/speaker-beatriced?ring.txt Removed: pypy/extradoc/talk/22c3/speaker-beatriced?ring.txt Log: make the file name ascii From ericvrp at codespeak.net Fri Sep 30 13:46:20 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 30 Sep 2005 13:46:20 +0200 (CEST) Subject: [pypy-svn] r18004 - in pypy/dist/pypy/translator/llvm: . backendopt Message-ID: <20050930114620.B3A0E27BB2@code1.codespeak.net> Author: ericvrp Date: Fri Sep 30 13:46:18 2005 New Revision: 18004 Modified: pypy/dist/pypy/translator/llvm/backendopt/mergemallocs.py pypy/dist/pypy/translator/llvm/backendopt/removeexcmallocs.py pypy/dist/pypy/translator/llvm/funcnode.py Log: Working on mergemallocs, switch machine. 
Modified: pypy/dist/pypy/translator/llvm/backendopt/mergemallocs.py ============================================================================== --- pypy/dist/pypy/translator/llvm/backendopt/mergemallocs.py (original) +++ pypy/dist/pypy/translator/llvm/backendopt/mergemallocs.py Fri Sep 30 13:46:18 2005 @@ -1,4 +1,5 @@ -from pypy.objspace.flow.model import Block, flatten, SpaceOperation +from pypy.objspace.flow.model import Block, flatten, SpaceOperation, Constant +from pypy.rpython.lltype import GcStruct, Void def merge_mallocs(translator, graph): @@ -14,12 +15,29 @@ n_times_merged = 0 blocks = [x for x in flatten(graph) if isinstance(x, Block)] for block in blocks: - n_mallocs_in_block = 0 - for op in block.operations: - if op.opname != 'malloc': + mallocs = [[], []] + for i, op in enumerate(block.operations): + if op.opname == 'malloc' and op.args[0].value._arrayfld: + print 'merge_mallocs: skip varsize', op.args[0] + if op.opname != 'malloc' or op.args[0].value._arrayfld: continue - n_mallocs_in_block += 1 - if n_mallocs_in_block >= 2: - print 'merge_mallocs: n_mallocs_in_block=%d' % n_mallocs_in_block - n_times_merged += 1 + is_atomic = op.args[0].value._is_atomic() + mallocs[is_atomic].append( (i,op.args[0].value) ) + print 'merge_malloc: OLD %d, %s, %s, %s' % (i, type(op.args[0]), op.args[0], op.args[0].concretetype) + for a in range(2): + if len(mallocs[a]) >= 2: + indices = [m[0] for m in mallocs[a]] + structs = [m[1] for m in mallocs[a]] + merged_name = 'merged' + for m in mallocs[a]: + merged_name += '_' + m[1]._name + merged = GcStruct(merged_name, + ('field1', super(GcStruct, structs[0])), + ('field2', super(GcStruct, structs[1])) + ) + print 'merge_mallocs: %s {%s} [%s]' % (indices, structs, merged) + c = Constant(merged, Void) + print 'merge_malloc: NEW %s, %s' % (c, c.concretetype) + block.operations[indices[0]].args[0] = c + n_times_merged += 1 return n_times_merged Modified: pypy/dist/pypy/translator/llvm/backendopt/removeexcmallocs.py ============================================================================== --- pypy/dist/pypy/translator/llvm/backendopt/removeexcmallocs.py (original) +++ pypy/dist/pypy/translator/llvm/backendopt/removeexcmallocs.py Fri Sep 30 13:46:18 2005 @@ -2,7 +2,7 @@ from pypy.translator.backendopt.inline import _find_exception_type -def remove_exception_mallocs(translator, graph, ringbuffer_entry_maxsize=16, ringbuffer_n_entries=1024): +def remove_exception_mallocs(translator, graph): """Remove mallocs that occur because an exception is raised. Typically this data is shortlived and occuring often in highlevel languages like Python. 
So it would be preferable if we would not need Modified: pypy/dist/pypy/translator/llvm/funcnode.py ============================================================================== --- pypy/dist/pypy/translator/llvm/funcnode.py (original) +++ pypy/dist/pypy/translator/llvm/funcnode.py Fri Sep 30 13:46:18 2005 @@ -1,4 +1,5 @@ import py +import sys from pypy.objspace.flow.model import Block, Constant, Variable, Link from pypy.objspace.flow.model import flatten, mkentrymap, traverse, last_exception from pypy.rpython import lltype @@ -42,11 +43,14 @@ self.db.genllvm.exceptionpolicy.transform(self.db.translator, self.graph) if remove_exception_mallocs(self.db.translator, self.graph): print ' from function', self.ref - import sys sys.stdout.flush() #if self.ref not in ('%pypy_ll_raise_OSError__Signed', '%pypy_getitem'): # self.db.translator.view() - #merge_mallocs(self.db.translator, self.graph) + #if merge_mallocs(self.db.translator, self.graph): + # print ' in function', self.ref + # sys.stdout.flush() + # #self.db.translator.view() + remove_double_links(self.db.translator, self.graph) def __str__(self): From hpk at codespeak.net Fri Sep 30 14:17:13 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Fri, 30 Sep 2005 14:17:13 +0200 (CEST) Subject: [pypy-svn] r18007 - pypy/extradoc/talk/22c3 Message-ID: <20050930121713.E8F6E27BB6@code1.codespeak.net> Author: hpk Date: Fri Sep 30 14:17:11 2005 New Revision: 18007 Added: pypy/extradoc/talk/22c3/bea.jpg (contents, props changed) pypy/extradoc/talk/22c3/holger.jpg (contents, props changed) Modified: pypy/extradoc/talk/22c3/pypy-dev-talk.txt pypy/extradoc/talk/22c3/pypy-sprint-talk.txt pypy/extradoc/talk/22c3/speaker-beatriceduering.txt pypy/extradoc/talk/22c3/speaker-carlfriedrichbolz.txt pypy/extradoc/talk/22c3/speaker-holgerkrekel.txt Log: Bea, Carl, others: i updated all talks and speaker infos for CCC (to be submitted by midnight, i think) - modified the sprint talk quite a bit, please check/discuss this evening (i am probably around 10PM on IRC). It's not so much solely focused on PyPy anymore which i believe is not _so_ interesting to CCC people by itself (might be wrong, of course). - modified the speaker bios, please check - added pictures of bea and me. - anybody else, especially Christian: if you want to chime in regarding CCC talks please get involved ASAP. Added: pypy/extradoc/talk/22c3/bea.jpg ============================================================================== Binary file. No diff available. Added: pypy/extradoc/talk/22c3/holger.jpg ============================================================================== Binary file. No diff available. Modified: pypy/extradoc/talk/22c3/pypy-dev-talk.txt ============================================================================== --- pypy/extradoc/talk/22c3/pypy-dev-talk.txt (original) +++ pypy/extradoc/talk/22c3/pypy-dev-talk.txt Fri Sep 30 14:17:11 2005 @@ -8,7 +8,7 @@ Section: Hacking -Talkers: ... +Talkers: Holger Krekel, Carl Friedrich Bolz Abstract (max 250 letters): @@ -18,6 +18,10 @@ going to relate our research/technical results to other language-research aspects found with Perl 6 and Haskell. + We'll provide examples on how PyPy could solve problems + at language/interpreter-level that formerly required + complex frameworkish solutions at user-level. + Development of PyPy is partly funded by the European Union during the 6th Research Framework programme. 
@@ -38,16 +42,16 @@ how they lead us to producing a first self-contained very compliant Python implementation August 2005, completely independent from the current mainstream - CPython implementation. We go through a translation + CPython implementation. We go through a translation example of a Python program with control-flow-graphs and the according translated lowlevel C and LLVM (Low level Virtual Machine) code. - We'll also try to relate PyPy's architectural concepts - (known roughly for 2-3 years) to similar upcoming concepts - in e.g. pugs/Perl 6 development and we'll give an - outlook on our starting Just-In-Time Compiler efforts - and approaches. + We'll also try to relate PyPy's architectural concepts + (known roughly for 2-3 years now) to similar upcoming + concepts in e.g. pugs/Perl 6 development and we'll give an + outlook on our starting Just-In-Time Compiler efforts and + approaches. Lastly, we intend to discuss experimental new language/interpreter-level solutions to long-standing problems such as distributed computing, Modified: pypy/extradoc/talk/22c3/pypy-sprint-talk.txt ============================================================================== --- pypy/extradoc/talk/22c3/pypy-sprint-talk.txt (original) +++ pypy/extradoc/talk/22c3/pypy-sprint-talk.txt Fri Sep 30 14:17:11 2005 @@ -1,56 +1,54 @@ Reference/Call For Papers: http://www.ccc.de/congress/2005/cfp.html DEADLINE: 31st September 2005 (friday) -Title: PyPy - agility through diversity +Title: agile open-source methods <-> business/EU-funding -Subtitle: Sprint driven development in an Open Source Python project - funded - by the European Commission (FP6) +Subtitle: Sprint driven development in Open Source projects - + agile methods in open-source related companies -Section: Hackerethik +Section: Hacker(ethik) Talkers: Holger Krekel, Beatrice D?ring Abstract (max 250 letters): - We?ll present the experiences from managing an agile and distributed OSS - development process in the Pypy project, "sprint driven", during the last year - and also connect this to the challenges of integrating this process with the requirements - from the European Commission who is partly funding the project through the - 6th Research Framework programme. - We will also reflect on the aspect of diversity, combining technical - and non technical people and skills in the PyPy project and learnings - from this. - - PyPy is a reimplementation of Python written in Python - itself, flexible and easy to experiment with. + There is a growing number of open-source developers organized + and connected to company and business related work. We + report our experiences from the first year of the PyPy project which + has a 7 company/university consortium and a 1.3 Million Euro research + grant from the European Union. Moreover, we carry together + experiences from other projects and companies such as + Canonical (doing infrastructure for the Ubuntu distribution). + + We?ll present our own experiences from managing an agile + and distributed OSS development process in the PyPy + project, XP-style "sprint driven", during the last year and + relate this to the requirements from the European Union. + + We'd like to discuss models and experiences for connecting + open-source culture driven development to business related + projects and goals with the audience. 
Description (250-500 words): - We are going to briefly describe the organisation of the project, + We are going to briefly describe the organisation of the PyPy project, showing how formal stakeholders and OSS Python community interact through agile practices like sprinting. + We will also reflect on the aspect of diversity, combining technical + and non technical people and skills in the PyPy project and learnings + from this. + We will relate the various agile techniques used in PyPy to the agile practices known from the work in the Agile Alliance (XP, Scrum, - Crystal Clear) and show how the PyPy project is developing an ongoing - customization of several of the known practices. + Crystal Clear) and tell you what we know of how other projects + are doing it. Lastly we will also share our experience of various challenges and possibilities when integrating the different cultures and skills from the OSS perspective, EU perspective and the Chaos Pilot/process management perspective - managing diversities. - PyPy is a reimplementation of Python written in Python - itself, flexible and easy to experiment with. Our - long-term goals are to target a large variety of - platforms, small and large, by providing a compiler - toolsuite that can produce custom Python versions. - Platform, Memory and Threading models will become - aspects of the translation process - as opposed to - encoding low level details into a language implementation - itself. - - Statement: We intend to submit a paper (PDF) for the 22C3 proceedings. Statement: We intend to submit a slides PDF as well. @@ -58,13 +56,15 @@ Language of your talk: english -Links to background information on the talk: http://codespeak.net/pypy +Links to background information on the talk: + http://codespeak.net/pypy/dist/pypy/doc/dev_method.html + http://codespeak.net/pypy/dist/pypy Target Group: Advanced Users, Pros Resources you need for your talk: digital projector, internet -Related talks at 22C3 you know of: ... +Related talks at 22C3 you know of: PyPy - the new Python implementation on the block A lecture logo, square format, min. 128x128 pixels (optional): http://codespeak.net/pypy/img/py-web1.png Modified: pypy/extradoc/talk/22c3/speaker-beatriceduering.txt ============================================================================== --- pypy/extradoc/talk/22c3/speaker-beatriceduering.txt (original) +++ pypy/extradoc/talk/22c3/speaker-beatriceduering.txt Fri Sep 30 14:17:11 2005 @@ -12,8 +12,8 @@ Phone number(s): +46 (0)734 22 89 06 A photo, square format, min. 128x128 pixels (optional): - Holger - can I send you a photo that we can put somewhere on codespeak? - I will bring a digital camera so that we can take phoes of everyone in Paris for the website. + (is also attached in the proposals zip file) + see http://codespeak.net/pypy/extradoc/talk/22C3/bea.jpg Statement: publishing contact info except for the phone number is fine with me. 
@@ -24,7 +24,7 @@ Short Info: - Beatrice During, consultant in the project management field, assistant project manager in PyPY + Beatrice During, consultant in the project management field, assistant project manager in PyPy Bio: Beatrice D?ring studied teaching/pedagogy at the University of Karlstad in Modified: pypy/extradoc/talk/22c3/speaker-carlfriedrichbolz.txt ============================================================================== --- pypy/extradoc/talk/22c3/speaker-carlfriedrichbolz.txt (original) +++ pypy/extradoc/talk/22c3/speaker-carlfriedrichbolz.txt Fri Sep 30 14:17:11 2005 @@ -16,13 +16,13 @@ Short Info: - Carl Friedrich Bolz, student and developer of PyPy + Carl Friedrich Bolz, developer of PyPy, works with merlinux Bio: - Carl Friedrich started to program C++ when he was 16, couldn't stant it + Carl Friedrich started to program C++ when he was 16, couldn't stand it and turned to Python soon afterwards. He is a PyPy developer since the - beginning of 2005. At the moment he is supposed to study mathematics, + beginning of 2005. At the moment he is supposed to study mathematics, computer science and sometimes physics at the University of Heidelberg. Since programming is more fun he is taking a semester off to work for merlinux on PyPy and other projects. Modified: pypy/extradoc/talk/22c3/speaker-holgerkrekel.txt ============================================================================== --- pypy/extradoc/talk/22c3/speaker-holgerkrekel.txt (original) +++ pypy/extradoc/talk/22c3/speaker-holgerkrekel.txt Fri Sep 30 14:17:11 2005 @@ -13,7 +13,6 @@ A photo, square format, min. 128x128 pixels (optional): http://merlinux.de/~hpk/holger.jpg - (maybe i can find a better one, this is the one from last 21C3 conference) From hpk at codespeak.net Fri Sep 30 14:48:55 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Fri, 30 Sep 2005 14:48:55 +0200 (CEST) Subject: [pypy-svn] r18008 - pypy/extradoc/talk/22c3 Message-ID: <20050930124855.08BA727BB6@code1.codespeak.net> Author: hpk Date: Fri Sep 30 14:48:54 2005 New Revision: 18008 Modified: pypy/extradoc/talk/22c3/pypy-dev-talk.txt Log: relate talks to each other Modified: pypy/extradoc/talk/22c3/pypy-dev-talk.txt ============================================================================== --- pypy/extradoc/talk/22c3/pypy-dev-talk.txt (original) +++ pypy/extradoc/talk/22c3/pypy-dev-talk.txt Fri Sep 30 14:48:54 2005 @@ -70,7 +70,7 @@ Resources you need for your talk: digital projector, internet -Related talks at 22C3 you know of: ... +Related talks at 22C3 you know of: agile open-source methods <-> business/EU-funding A lecture logo, square format, min. 128x128 pixels (optional): http://codespeak.net/pypy/img/py-web1.png From ac at codespeak.net Fri Sep 30 17:17:19 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Fri, 30 Sep 2005 17:17:19 +0200 (CEST) Subject: [pypy-svn] r18009 - in pypy/dist/pypy/interpreter: astcompiler test Message-ID: <20050930151719.0A56B27BA9@code1.codespeak.net> Author: ac Date: Fri Sep 30 17:17:19 2005 New Revision: 18009 Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py pypy/dist/pypy/interpreter/test/test_compiler.py Log: Reject 'yield' inside a 'try: ... 
finally:' block Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Fri Sep 30 17:17:19 2005 @@ -1012,6 +1012,11 @@ self.emit('RETURN_VALUE') def visitYield(self, node): + kind, block = self.setups.top() + if kind == TRY_FINALLY: + raise SyntaxError("'yield' not allowed in a 'try' block " + "with a 'finally' clause", + node.lineno) self.set_lineno(node) node.value.accept( self ) self.emit('YIELD_VALUE') Modified: pypy/dist/pypy/interpreter/test/test_compiler.py ============================================================================== --- pypy/dist/pypy/interpreter/test/test_compiler.py (original) +++ pypy/dist/pypy/interpreter/test/test_compiler.py Fri Sep 30 17:17:19 2005 @@ -215,6 +215,13 @@ ex.normalize_exception(self.space) assert ex.match(self.space, self.space.w_SyntaxError) + def test_yield_in_finally(self): + code ='def f():\n try:\n yield 19\n finally:\n pass\n' + e = py.test.raises(OperationError, self.compiler.compile, code, '', 'single', 0) + ex = e.value + ex.normalize_exception(self.space) + assert ex.match(self.space, self.space.w_SyntaxError) + def test_none_assignment(self): stmts = [ 'None = 0', @@ -322,6 +329,10 @@ def test_return_in_generator(self): py.test.skip('INPROGRES') + def test_yield_in_finally(self): + py.test.skip('INPROGRES') + + class TestPyCCompiler(BaseTestCompiler): def setup_method(self, method): self.compiler = CPythonCompiler(self.space) @@ -336,6 +347,9 @@ def test_return_in_generator(self): py.test.skip('INPROGRES') + def test_yield_in_finally(self): + py.test.skip('INPROGRES') + class TestPythonAstCompiler(BaseTestCompiler): def setup_method(self, method): self.compiler = PythonAstCompiler(self.space) From ac at codespeak.net Fri Sep 30 17:26:25 2005 From: ac at codespeak.net (ac at codespeak.net) Date: Fri, 30 Sep 2005 17:26:25 +0200 (CEST) Subject: [pypy-svn] r18010 - pypy/dist/pypy/interpreter/astcompiler Message-ID: <20050930152625.9616A27BA9@code1.codespeak.net> Author: ac Date: Fri Sep 30 17:26:25 2005 New Revision: 18010 Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Log: Oops! 
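The two generator restrictions enforced in r18001 and r18009 (with the empty-setups guard from this "Oops!" follow-up, shown just below) are easiest to see from the user's point of view. A small illustration, assuming a 2.4-era compiler that implements both checks; on later Python versions the try/finally restriction was lifted, so the snippets are hedged rather than guaranteed to fail everywhere.

    # Rejected by r18001: a 'return' with a value inside a generator.
    bad_return = (
        "def f():\n"
        "    return 5\n"
        "    yield 19\n"
    )

    # Rejected by r18009: a 'yield' in the protected part of a
    # try/finally.  A plain 'yield' outside any try block is fine --
    # that case is what the r18010 guard (empty self.setups) handles.
    bad_yield = (
        "def g():\n"
        "    try:\n"
        "        yield 19\n"
        "    finally:\n"
        "        pass\n"
    )

    for src in (bad_return, bad_yield):
        try:
            compile(src, '<test>', 'exec')
        except SyntaxError:
            pass    # both snippets are expected to fail to compile
                    # on a compiler that enforces the rules above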
Modified: pypy/dist/pypy/interpreter/astcompiler/pycodegen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/pycodegen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/pycodegen.py Fri Sep 30 17:26:25 2005 @@ -1012,11 +1012,12 @@ self.emit('RETURN_VALUE') def visitYield(self, node): - kind, block = self.setups.top() - if kind == TRY_FINALLY: - raise SyntaxError("'yield' not allowed in a 'try' block " - "with a 'finally' clause", - node.lineno) + if self.setups: + kind, block = self.setups.top() + if kind == TRY_FINALLY: + raise SyntaxError("'yield' not allowed in a 'try' block " + "with a 'finally' clause", + node.lineno) self.set_lineno(node) node.value.accept( self ) self.emit('YIELD_VALUE') From pedronis at codespeak.net Fri Sep 30 19:42:00 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 30 Sep 2005 19:42:00 +0200 (CEST) Subject: [pypy-svn] r18011 - in pypy/dist/pypy/interpreter: astcompiler pyparser/test Message-ID: <20050930174200.1F35D27BA9@code1.codespeak.net> Author: pedronis Date: Fri Sep 30 19:41:58 2005 New Revision: 18011 Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py pypy/dist/pypy/interpreter/astcompiler/astgen.py pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py Log: fixing 18001 breakage to test_astbuilder: - special case Return because now the behavior is different from transformer - generated getChildren was broken for '&' children :( Modified: pypy/dist/pypy/interpreter/astcompiler/ast.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/ast.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/ast.py Fri Sep 30 19:41:58 2005 @@ -1477,7 +1477,7 @@ def getChildren(self): "NOT_RPYTHON" - return tuple(flatten(self.value)) + return (self.value,) def getChildNodes(self): nodelist = [] Modified: pypy/dist/pypy/interpreter/astcompiler/astgen.py ============================================================================== --- pypy/dist/pypy/interpreter/astcompiler/astgen.py (original) +++ pypy/dist/pypy/interpreter/astcompiler/astgen.py Fri Sep 30 19:41:58 2005 @@ -154,7 +154,11 @@ print >> buf, " return %s" % clist else: if len(self.argnames) == 1: - print >> buf, " return tuple(flatten(self.%s))" % self.argnames[0] + name = self.argnames[0] + if self.argprops[name] == P_NESTED: + print >> buf, " return tuple(flatten(self.%s))" % name + else: + print >> buf, " return (self.%s,)" % name else: print >> buf, " children = []" template = " children.%s(%sself.%s%s)" Modified: pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py (original) +++ pypy/dist/pypy/interpreter/pyparser/test/test_astbuilder.py Fri Sep 30 19:41:58 2005 @@ -75,6 +75,12 @@ print "(0) (%s) left: %s, right: %s" % (left, left.lineno, right.lineno) return False return True + elif isinstance(right, ast_ast.Return) and isinstance(left, stable_ast.Return): + left_nodes = left.getChildren() + if right.value is None: + right_nodes = (ast_ast.Const(None),) + else: + right_nodes = right.getChildren() else: left_nodes = left.getChildren() right_nodes = right.getChildren() From bea at codespeak.net Fri Sep 30 20:24:16 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Fri, 30 Sep 2005 20:24:16 +0200 (CEST) Subject: [pypy-svn] r18012 - pypy/extradoc/talk/22c3 
Message-ID: <20050930182416.A911027BA9@code1.codespeak.net> Author: bea Date: Fri Sep 30 20:24:14 2005 New Revision: 18012 Modified: pypy/extradoc/talk/22c3/pypy-sprint-talk.txt Log: small change - removed "clear" Modified: pypy/extradoc/talk/22c3/pypy-sprint-talk.txt ============================================================================== --- pypy/extradoc/talk/22c3/pypy-sprint-talk.txt (original) +++ pypy/extradoc/talk/22c3/pypy-sprint-talk.txt Fri Sep 30 20:24:14 2005 @@ -41,7 +41,7 @@ We will relate the various agile techniques used in PyPy to the agile practices known from the work in the Agile Alliance (XP, Scrum, - Crystal Clear) and tell you what we know of how other projects + Crystal) and tell you what we know of how other projects are doing it. Lastly we will also share our experience of various challenges and From bea at codespeak.net Fri Sep 30 20:53:42 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Fri, 30 Sep 2005 20:53:42 +0200 (CEST) Subject: [pypy-svn] r18013 - pypy/extradoc/sprintinfo Message-ID: <20050930185342.190E827BB9@code1.codespeak.net> Author: bea Date: Fri Sep 30 20:53:39 2005 New Revision: 18013 Modified: pypy/extradoc/sprintinfo/paris-2005-people.txt Log: boris details Modified: pypy/extradoc/sprintinfo/paris-2005-people.txt ============================================================================== --- pypy/extradoc/sprintinfo/paris-2005-people.txt (original) +++ pypy/extradoc/sprintinfo/paris-2005-people.txt Fri Sep 30 20:53:39 2005 @@ -24,6 +24,7 @@ Carl Friedrich Bolz 7/10? - 16/10 flat? Bert Freudenberg 09/10 - 17/10 ? Anders Lehmann 09/10 - 14/10 Hotel Porte de Versailles +Boris Feigin 09/10 - 16/10 ?/looking for flatmates =================== ============== ===================== People on the following list are likely to come and were From bea at codespeak.net Fri Sep 30 20:56:27 2005 From: bea at codespeak.net (bea at codespeak.net) Date: Fri, 30 Sep 2005 20:56:27 +0200 (CEST) Subject: [pypy-svn] r18014 - pypy/extradoc/sprintinfo Message-ID: <20050930185627.52D2027BB9@code1.codespeak.net> Author: bea Date: Fri Sep 30 20:56:20 2005 New Revision: 18014 Modified: pypy/extradoc/sprintinfo/paris-2005-people.txt Log: Amaury?s details Modified: pypy/extradoc/sprintinfo/paris-2005-people.txt ============================================================================== --- pypy/extradoc/sprintinfo/paris-2005-people.txt (original) +++ pypy/extradoc/sprintinfo/paris-2005-people.txt Fri Sep 30 20:56:20 2005 @@ -25,6 +25,7 @@ Bert Freudenberg 09/10 - 17/10 ? Anders Lehmann 09/10 - 14/10 Hotel Porte de Versailles Boris Feigin 09/10 - 16/10 ?/looking for flatmates +Amaury Forgeot d'Arc10/10 - 16/10 Private =================== ============== ===================== People on the following list are likely to come and were From pedronis at codespeak.net Fri Sep 30 21:11:11 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 30 Sep 2005 21:11:11 +0200 (CEST) Subject: [pypy-svn] r18015 - in pypy/dist/pypy/module/recparser: . 
test Message-ID: <20050930191111.10A4827BB6@code1.codespeak.net> Author: pedronis Date: Fri Sep 30 21:11:09 2005 New Revision: 18015 Added: pypy/dist/pypy/module/recparser/test/ pypy/dist/pypy/module/recparser/test/test_parser.py (contents, props changed) Modified: pypy/dist/pypy/module/recparser/pyparser.py Log: some very minimal sanity check tests for recparser fix to have them pass (enc_minimal was crashing on missing space attr on the builder) Modified: pypy/dist/pypy/module/recparser/pyparser.py ============================================================================== --- pypy/dist/pypy/module/recparser/pyparser.py (original) +++ pypy/dist/pypy/module/recparser/pyparser.py Fri Sep 30 21:11:09 2005 @@ -145,6 +145,7 @@ def parse_python_source(space, source, goal): builder = grammar.BaseGrammarBuilder(debug=False, rules=PYTHON_PARSER.rules) + builder.space = space try: PYTHON_PARSER.parse_source(source, goal, builder ) return builder.stack[-1] Added: pypy/dist/pypy/module/recparser/test/test_parser.py ============================================================================== --- (empty file) +++ pypy/dist/pypy/module/recparser/test/test_parser.py Fri Sep 30 21:11:09 2005 @@ -0,0 +1,17 @@ +from pypy.objspace.std import StdObjSpace + +def setup_module(mod): + mod.space = StdObjSpace(usemodules=['recparser']) + + +class AppTestRecparser: + def setup_class(cls): + cls.space = space + + def test_simple(self): + import parser + parser.suite("great()") + + def test_enc_minimal(self): + import parser + parser.suite("# -*- coding: koi8-u -*-*\ngreat()") From pedronis at codespeak.net Fri Sep 30 21:17:01 2005 From: pedronis at codespeak.net (pedronis at codespeak.net) Date: Fri, 30 Sep 2005 21:17:01 +0200 (CEST) Subject: [pypy-svn] r18016 - in pypy/dist/pypy/interpreter: . pyparser Message-ID: <20050930191701.64CF327BB6@code1.codespeak.net> Author: pedronis Date: Fri Sep 30 21:17:00 2005 New Revision: 18016 Modified: pypy/dist/pypy/interpreter/pycompiler.py pypy/dist/pypy/interpreter/pyparser/pythonparse.py pypy/dist/pypy/interpreter/pyparser/pythonutil.py Log: fix translation problems (blocked block on builder.space...) add a simingly missing return case in _check_for_encoding (arre?) 
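The `_check_for_encoding` part of this checkin makes the parser honour a PEP 263 coding cookie on either of the first two source lines and adds the previously missing early return for the first-line case, as the diff below shows. Here is a standalone sketch of that control flow; the simplified `_check_line` helper and cookie pattern are stand-ins for the parser's real `_check_line_for_encoding`, not the actual implementation.

    import re

    # simplified PEP 263 cookie pattern; the real helper is stricter
    # about where on the line the cookie may appear
    _cookie_re = re.compile(r"coding[:=]\s*([-\w.]+)")

    def _check_line(line):
        if not line.lstrip().startswith('#'):
            return None                     # cookies only count in comments
        match = _cookie_re.search(line)
        if match:
            return match.group(1)
        return None

    def check_for_encoding(s):
        """Return the encoding declared in the first two lines, or None."""
        eol = s.find('\n')
        if eol < 0:                         # single-line source
            return _check_line(s)
        enc = _check_line(s[:eol])
        if enc:                             # the early return added here
            return enc
        eol2 = s.find('\n', eol + 1)
        if eol2 < 0:
            return _check_line(s[eol + 1:])
        return _check_line(s[eol + 1:eol2])

    # check_for_encoding("# -*- coding: koi8-u -*-\nprint 1\n") == 'koi8-u'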
Modified: pypy/dist/pypy/interpreter/pycompiler.py ============================================================================== --- pypy/dist/pypy/interpreter/pycompiler.py (original) +++ pypy/dist/pypy/interpreter/pycompiler.py Fri Sep 30 21:17:00 2005 @@ -202,7 +202,7 @@ # XXX use 'flags' space = self.space try: - parse_result = internal_pypy_parse(source, mode, True, flags) + parse_result = internal_pypy_parse(source, mode, True, flags, space) except SyntaxError, e: w_synerr = space.newtuple([space.wrap(e.msg), space.newtuple([space.wrap(filename), Modified: pypy/dist/pypy/interpreter/pyparser/pythonparse.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/pythonparse.py (original) +++ pypy/dist/pypy/interpreter/pyparser/pythonparse.py Fri Sep 30 21:17:00 2005 @@ -87,11 +87,13 @@ def _check_for_encoding(s): eol = s.find('\n') - if eol == -1: + if eol < 0: return _check_line_for_encoding(s) enc = _check_line_for_encoding(s[:eol]) + if enc: + return enc eol2 = s.find('\n', eol + 1) - if eol2 == -1: + if eol2 < 0: return _check_line_for_encoding(s[eol + 1:]) return _check_line_for_encoding(s[eol + 1:eol2]) Modified: pypy/dist/pypy/interpreter/pyparser/pythonutil.py ============================================================================== --- pypy/dist/pypy/interpreter/pyparser/pythonutil.py (original) +++ pypy/dist/pypy/interpreter/pyparser/pythonutil.py Fri Sep 30 21:17:00 2005 @@ -50,7 +50,7 @@ pyf.close() return pypy_parse(source, 'exec', lineno) -def internal_pypy_parse(source, mode='exec', lineno=False, flags=0): +def internal_pypy_parse(source, mode='exec', lineno=False, flags=0, space=None): """This function has no other role than testing the parser's annotation annotateme() is basically the same code that pypy_parse(), but with the @@ -61,6 +61,8 @@ """ builder = TupleBuilder(PYTHON_PARSER.rules, lineno=False) + if space is not None: + builder.space = space target_rule = TARGET_DICT[mode] PYTHON_PARSER.parse_source(source, target_rule, builder, flags) stack_element = builder.stack[-1] From hpk at codespeak.net Fri Sep 30 21:31:09 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Fri, 30 Sep 2005 21:31:09 +0200 (CEST) Subject: [pypy-svn] r18018 - pypy/extradoc/talk/22c3 Message-ID: <20050930193109.903B027BB9@code1.codespeak.net> Author: hpk Date: Fri Sep 30 21:31:08 2005 New Revision: 18018 Added: pypy/extradoc/talk/22c3/os-business.txt - copied, changed from r18012, pypy/extradoc/talk/22c3/pypy-sprint-talk.txt Removed: pypy/extradoc/talk/22c3/pypy-sprint-talk.txt Log: rename and reshuffle the os/business talk to have the abstract fit into 250 chars (thanks to arigo who pointed this out) From hpk at codespeak.net Fri Sep 30 21:38:44 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Fri, 30 Sep 2005 21:38:44 +0200 (CEST) Subject: [pypy-svn] r18019 - pypy/extradoc/talk/22c3 Message-ID: <20050930193844.900C527BB9@code1.codespeak.net> Author: hpk Date: Fri Sep 30 21:38:43 2005 New Revision: 18019 Modified: pypy/extradoc/talk/22c3/os-business.txt Log: remove duplication Modified: pypy/extradoc/talk/22c3/os-business.txt ============================================================================== --- pypy/extradoc/talk/22c3/os-business.txt (original) +++ pypy/extradoc/talk/22c3/os-business.txt Fri Sep 30 21:38:43 2005 @@ -31,11 +31,9 @@ skills and learnings from this. 
We will relate the various agile techniques used in PyPy - and other projects to the agile practices known from the - work in the Agile Alliance (XP, Scrum, Crystal) and tell - you what we know of how other projects are doing it. We - carry together related experiences from some other projects - and companies. + and other projects/companies to the agile practices known from + the work in the Agile Alliance (XP, Scrum, Crystal) and tell + you what we know of how other projects are doing it. Lastly we will also share our experience of various challenges and possibilities when integrating the different cultures and skills from From arigo at codespeak.net Fri Sep 30 21:43:14 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 30 Sep 2005 21:43:14 +0200 (CEST) Subject: [pypy-svn] r18020 - pypy/extradoc/talk/22c3 Message-ID: <20050930194314.7AFD427BB6@code1.codespeak.net> Author: arigo Date: Fri Sep 30 21:43:11 2005 New Revision: 18020 Modified: pypy/extradoc/talk/22c3/pypy-dev-talk.txt Log: Shortened the abstract to 248 letters. Sneaked myself in as a speaker. Modified: pypy/extradoc/talk/22c3/pypy-dev-talk.txt ============================================================================== --- pypy/extradoc/talk/22c3/pypy-dev-talk.txt (original) +++ pypy/extradoc/talk/22c3/pypy-dev-talk.txt Fri Sep 30 21:43:11 2005 @@ -1,5 +1,5 @@ Reference/Call For Papers: http://www.ccc.de/congress/2005/cfp.html -DEADLINE: 31st September 2005 (friday) +DEADLINE: 30th September 2005 (friday) Title: PyPy - the new Python implementation on the block @@ -8,22 +8,15 @@ Section: Hacking -Talkers: Holger Krekel, Carl Friedrich Bolz +Talkers: Holger Krekel, Carl Friedrich Bolz, Armin Rigo Abstract (max 250 letters): - We'll present results of our first self-contained Python virtual - machine that uses parts of itself while translation itself - to a low level language ("the muenchhausen approach"). We are - going to relate our research/technical results to other - language-research aspects found with Perl 6 and Haskell. - - We'll provide examples on how PyPy could solve problems - at language/interpreter-level that formerly required - complex frameworkish solutions at user-level. - - Development of PyPy is partly funded by the European Union - during the 6th Research Framework programme. + We present our first self-contained Python virtual machine that uses + parts of itself to translate itself to low level languages ("the + Muenchhausen approach"). The PyPy approach could solve problems at + language/interpreter-level that formerly required complex + frameworkish solutions at user-level. Description (250-500 words): @@ -37,10 +30,10 @@ encoding low level details into a language implementation itself. - We are going to briefly describe the concepts of object spaces, - abstract interpretation and translation aspects and - how they lead us to producing a first self-contained - very compliant Python implementation August 2005, + We are going to briefly describe the concepts of + object spaces, abstract interpretation and translation + aspects and how they led us to a first self-contained + very compliant Python implementation in August 2005, completely independent from the current mainstream CPython implementation. We go through a translation example of a Python program with control-flow-graphs @@ -57,6 +50,9 @@ solutions to long-standing problems such as distributed computing, persistence and security/sandboxing. 
+ Development of PyPy is partly funded by the European Union + during the 6th Research Framework programme. + Statement: We intend to submit a paper (PDF) for the 22C3 proceedings. Statement: We intend to submit a slides PDF as well. From arigo at codespeak.net Fri Sep 30 22:11:25 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 30 Sep 2005 22:11:25 +0200 (CEST) Subject: [pypy-svn] r18022 - pypy/extradoc/talk/22c3 Message-ID: <20050930201125.7EB3F27BB9@code1.codespeak.net> Author: arigo Date: Fri Sep 30 22:11:22 2005 New Revision: 18022 Modified: pypy/extradoc/talk/22c3/pypy-dev-talk.txt Log: Added a paragraph... Should we mention that some design decisions (e.g. be Stackless) are painful to write and maintain by hand? Modified: pypy/extradoc/talk/22c3/pypy-dev-talk.txt ============================================================================== --- pypy/extradoc/talk/22c3/pypy-dev-talk.txt (original) +++ pypy/extradoc/talk/22c3/pypy-dev-talk.txt Fri Sep 30 22:11:22 2005 @@ -23,13 +23,18 @@ PyPy is a reimplementation of Python written in Python itself, flexible and easy to experiment with. Our long-term goals are to target a large variety of - platforms, small and large, by providing a compiler - toolsuite that can produce custom Python versions. + platforms, small and large, by adapting the compiler + toolsuite we developed to produce custom Python versions. Platform, Memory and Threading models will become aspects of the translation process - as opposed to encoding low level details into a language implementation itself. + Basically, we think it's a good way to avoid writing + n x m x o interpreters for n dynamic languages and m + platforms with o crucial design decisions. In PyPy + any one of these can be changed independently. + We are going to briefly describe the concepts of object spaces, abstract interpretation and translation aspects and how they led us to a first self-contained From ericvrp at codespeak.net Fri Sep 30 22:34:32 2005 From: ericvrp at codespeak.net (ericvrp at codespeak.net) Date: Fri, 30 Sep 2005 22:34:32 +0200 (CEST) Subject: [pypy-svn] r18023 - in pypy/dist/pypy/translator: goal llvm/backendopt Message-ID: <20050930203432.5709527BB6@code1.codespeak.net> Author: ericvrp Date: Fri Sep 30 22:34:31 2005 New Revision: 18023 Modified: pypy/dist/pypy/translator/goal/run_pypy-llvm.sh pypy/dist/pypy/translator/llvm/backendopt/mergemallocs.py pypy/dist/pypy/translator/llvm/backendopt/removeexcmallocs.py Log: added mergemallocs (experimental) transform. 
(disabled by default) Modified: pypy/dist/pypy/translator/goal/run_pypy-llvm.sh ============================================================================== --- pypy/dist/pypy/translator/goal/run_pypy-llvm.sh (original) +++ pypy/dist/pypy/translator/goal/run_pypy-llvm.sh Fri Sep 30 22:34:31 2005 @@ -1,3 +1,3 @@ #!/bin/sh export RTYPERORDER=order,module-list.pedronis -python translate_pypy_new.py targetpypystandalone --backend=llvm --text --batch --no-run $* +python translate_pypy.py targetpypystandalone -o -llvm -boehm -text -batch -fork2 $* Modified: pypy/dist/pypy/translator/llvm/backendopt/mergemallocs.py ============================================================================== --- pypy/dist/pypy/translator/llvm/backendopt/mergemallocs.py (original) +++ pypy/dist/pypy/translator/llvm/backendopt/mergemallocs.py Fri Sep 30 22:34:31 2005 @@ -1,5 +1,5 @@ -from pypy.objspace.flow.model import Block, flatten, SpaceOperation, Constant -from pypy.rpython.lltype import GcStruct, Void +from pypy.objspace.flow.model import Block, flatten, SpaceOperation, Constant, Variable +from pypy.rpython.lltype import Struct, GcStruct, Void, Ptr def merge_mallocs(translator, graph): @@ -23,21 +23,30 @@ continue is_atomic = op.args[0].value._is_atomic() mallocs[is_atomic].append( (i,op.args[0].value) ) - print 'merge_malloc: OLD %d, %s, %s, %s' % (i, type(op.args[0]), op.args[0], op.args[0].concretetype) + for a in range(2): if len(mallocs[a]) >= 2: indices = [m[0] for m in mallocs[a]] - structs = [m[1] for m in mallocs[a]] - merged_name = 'merged' + gcstructs = [m[1] for m in mallocs[a]] + merged_name = 'mergedstructs__' + '_'.join([s._name+str(n) for n, s in enumerate(gcstructs)]) + + x = [(gcstruct._name+str(n), gcstruct) for n, gcstruct in enumerate(gcstructs)] + mergedstruct= GcStruct(merged_name, *x) + c = Constant(mergedstruct, Void) + ptr_merged = Variable('ptr_mergedstructs') + ptr_merged.concretetype = Ptr(c.value) + merged_op = SpaceOperation('malloc', [c], ptr_merged) + block.operations.insert(0, merged_op) + + for n, i in enumerate(indices): + op = block.operations[i+1] + field = Constant(x[n][0], Void) + block.operations[i+1] = SpaceOperation('getsubstruct', [ptr_merged, field], op.result) + for m in mallocs[a]: - merged_name += '_' + m[1]._name - merged = GcStruct(merged_name, - ('field1', super(GcStruct, structs[0])), - ('field2', super(GcStruct, structs[1])) - ) - print 'merge_mallocs: %s {%s} [%s]' % (indices, structs, merged) - c = Constant(merged, Void) - print 'merge_malloc: NEW %s, %s' % (c, c.concretetype) - block.operations[indices[0]].args[0] = c + index, type_ = m + print 'merge_malloc: OLD %d, %s' % (index, type(type_)) + print 'merge_mallocs: NEW %s, %s' % (c, c.concretetype) n_times_merged += 1 + return n_times_merged Modified: pypy/dist/pypy/translator/llvm/backendopt/removeexcmallocs.py ============================================================================== --- pypy/dist/pypy/translator/llvm/backendopt/removeexcmallocs.py (original) +++ pypy/dist/pypy/translator/llvm/backendopt/removeexcmallocs.py Fri Sep 30 22:34:31 2005 @@ -1,5 +1,4 @@ -from pypy.objspace.flow.model import Block, Constant, flatten, SpaceOperation -from pypy.translator.backendopt.inline import _find_exception_type +from pypy.objspace.flow.model import Block, Constant, flatten def remove_exception_mallocs(translator, graph): From hpk at codespeak.net Fri Sep 30 22:50:36 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Fri, 30 Sep 2005 22:50:36 +0200 (CEST) Subject: [pypy-svn] r18024 - 
pypy/extradoc/talk/22c3 Message-ID: <20050930205036.EE57A27BB9@code1.codespeak.net> Author: hpk Date: Fri Sep 30 22:50:36 2005 New Revision: 18024 Modified: pypy/extradoc/talk/22c3/pypy-dev-talk.txt Log: result from joint cfbolz/arigo/hpk session with shped! Modified: pypy/extradoc/talk/22c3/pypy-dev-talk.txt ============================================================================== --- pypy/extradoc/talk/22c3/pypy-dev-talk.txt (original) +++ pypy/extradoc/talk/22c3/pypy-dev-talk.txt Fri Sep 30 22:50:36 2005 @@ -25,16 +25,15 @@ long-term goals are to target a large variety of platforms, small and large, by adapting the compiler toolsuite we developed to produce custom Python versions. - Platform, Memory and Threading models will become - aspects of the translation process - as opposed to - encoding low level details into a language implementation - itself. - - Basically, we think it's a good way to avoid writing - n x m x o interpreters for n dynamic languages and m - platforms with o crucial design decisions. In PyPy - any one of these can be changed independently. + Basically, we are implementing strategies to avoid writing + N x M x O interpreters for N dynamic languages, M platforms + and O crucial design decisions (e.g. memory and threading + models). Moreover, hand-coding particular combinations can + be hairy at best. With PyPy, we aim to change any of these + parameters independently. They become aspects of the + translation process. + We are going to briefly describe the concepts of object spaces, abstract interpretation and translation aspects and how they led us to a first self-contained From hpk at codespeak.net Fri Sep 30 22:56:45 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Fri, 30 Sep 2005 22:56:45 +0200 (CEST) Subject: [pypy-svn] r18025 - pypy/extradoc/talk/22c3 Message-ID: <20050930205645.693D827BB6@code1.codespeak.net> Author: hpk Date: Fri Sep 30 22:56:44 2005 New Revision: 18025 Modified: pypy/extradoc/talk/22c3/os-business.txt pypy/extradoc/talk/22c3/speaker-beatriceduering.txt pypy/extradoc/talk/22c3/speaker-carlfriedrichbolz.txt pypy/extradoc/talk/22c3/speaker-holgerkrekel.txt Log: minor corrections Modified: pypy/extradoc/talk/22c3/os-business.txt ============================================================================== --- pypy/extradoc/talk/22c3/os-business.txt (original) +++ pypy/extradoc/talk/22c3/os-business.txt Fri Sep 30 22:56:44 2005 @@ -1,5 +1,5 @@ Reference/Call For Papers: http://www.ccc.de/congress/2005/cfp.html -DEADLINE: 31st September 2005 (friday) +DEADLINE: 30th September 2005 (friday) Title: agile open-source methods <-> business/EU-funding Modified: pypy/extradoc/talk/22c3/speaker-beatriceduering.txt ============================================================================== --- pypy/extradoc/talk/22c3/speaker-beatriceduering.txt (original) +++ pypy/extradoc/talk/22c3/speaker-beatriceduering.txt Fri Sep 30 22:56:44 2005 @@ -1,5 +1,5 @@ Reference/Call For Papers: http://www.ccc.de/congress/2005/cfp.html -DEADLINE: 31st September 2005 (friday) +DEADLINE: 30th September 2005 (friday) Name: Beatrice D?ring @@ -24,7 +24,9 @@ Short Info: - Beatrice During, consultant in the project management field, assistant project manager in PyPy + Beatrice During (Sweden), consultant in the project management field, + assistant project manager in PyPy + Bio: Beatrice D?ring studied teaching/pedagogy at the University of Karlstad in Modified: pypy/extradoc/talk/22c3/speaker-carlfriedrichbolz.txt 
============================================================================== --- pypy/extradoc/talk/22c3/speaker-carlfriedrichbolz.txt (original) +++ pypy/extradoc/talk/22c3/speaker-carlfriedrichbolz.txt Fri Sep 30 22:56:44 2005 @@ -1,5 +1,5 @@ Reference/Call For Papers: http://www.ccc.de/congress/2005/cfp.html -DEADLINE: 31st September 2005 (friday) +DEADLINE: 30th September 2005 (friday) Name: Carl Friedrich Bolz Modified: pypy/extradoc/talk/22c3/speaker-holgerkrekel.txt ============================================================================== --- pypy/extradoc/talk/22c3/speaker-holgerkrekel.txt (original) +++ pypy/extradoc/talk/22c3/speaker-holgerkrekel.txt Fri Sep 30 22:56:44 2005 @@ -1,5 +1,5 @@ Reference/Call For Papers: http://www.ccc.de/congress/2005/cfp.html -DEADLINE: 31st September 2005 (friday) +DEADLINE: 30th September 2005 (friday) Name: holger krekel From hpk at codespeak.net Fri Sep 30 22:58:09 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Fri, 30 Sep 2005 22:58:09 +0200 (CEST) Subject: [pypy-svn] r18026 - pypy/extradoc/talk/22c3 Message-ID: <20050930205809.835DC27BB9@code1.codespeak.net> Author: hpk Date: Fri Sep 30 22:58:09 2005 New Revision: 18026 Added: pypy/extradoc/talk/22c3/talk-os-business.txt - copied unchanged from r18022, pypy/extradoc/talk/22c3/os-business.txt pypy/extradoc/talk/22c3/talk-pypy-dev.txt - copied unchanged from r18022, pypy/extradoc/talk/22c3/pypy-dev-talk.txt Removed: pypy/extradoc/talk/22c3/os-business.txt pypy/extradoc/talk/22c3/pypy-dev-talk.txt Log: made the talks more obvious (i am going to send them a zip file of all this) From arigo at codespeak.net Fri Sep 30 23:09:20 2005 From: arigo at codespeak.net (arigo at codespeak.net) Date: Fri, 30 Sep 2005 23:09:20 +0200 (CEST) Subject: [pypy-svn] r18027 - pypy/extradoc/talk/22c3 Message-ID: <20050930210920.D554727BB9@code1.codespeak.net> Author: arigo Date: Fri Sep 30 23:09:17 2005 New Revision: 18027 Added: pypy/extradoc/talk/22c3/speaker-arminrigo.txt (contents, props changed) Log: my bio Added: pypy/extradoc/talk/22c3/speaker-arminrigo.txt ============================================================================== --- (empty file) +++ pypy/extradoc/talk/22c3/speaker-arminrigo.txt Fri Sep 30 23:09:17 2005 @@ -0,0 +1,36 @@ +Reference/Call For Papers: http://www.ccc.de/congress/2005/cfp.html +DEADLINE: 30th September 2005 (friday) + +Name: Armin Rigo + +Public Name: Armin Rigo + +Other Names: arigo on irc.freenode.org + +Primary E-Mail address: arigo at tunes.org + +Phone number(s): - + +Statement: publishing contact info is fine with me. + +Public home page, weblog and other speaker-related websites: + + http://codespeak.net/pypy + +Short Info: + + Armin Rigo, developer of PyPy, Heinrich-Heine Universit?t D?sseldorf + +Bio: + + Armin was born in 1976 in Lausanne, Switzerland. He is officially a + researcher at the University of D?sseldorf but works from various + places around Europe. He studied and Ph.D.ed in Mathematics. At + that time, he was the author of QuArK, a Quake map editor. + Afterwards, he worked on various programs and hacks that are more + specifically oriented towards programming languages, mostly Python. + +Postal address: Armin Rigo, App 22, c/o Hall?n/Creighton, + Linn?gatan 55, 413 08 G?teborg, Sweden + +Expected day of arrival and departure: 27th-30th December. 
From hpk at codespeak.net Fri Sep 30 23:11:11 2005 From: hpk at codespeak.net (hpk at codespeak.net) Date: Fri, 30 Sep 2005 23:11:11 +0200 (CEST) Subject: [pypy-svn] r18028 - pypy/extradoc/talk/22c3 Message-ID: <20050930211111.6124927BB9@code1.codespeak.net> Author: hpk Date: Fri Sep 30 23:11:10 2005 New Revision: 18028 Modified: pypy/extradoc/talk/22c3/speaker-arminrigo.txt Log: i see no contradiction there :) Modified: pypy/extradoc/talk/22c3/speaker-arminrigo.txt ============================================================================== --- pypy/extradoc/talk/22c3/speaker-arminrigo.txt (original) +++ pypy/extradoc/talk/22c3/speaker-arminrigo.txt Fri Sep 30 23:11:10 2005 @@ -24,7 +24,7 @@ Bio: Armin was born in 1976 in Lausanne, Switzerland. He is officially a - researcher at the University of D?sseldorf but works from various + researcher at the University of D?sseldorf and works from various places around Europe. He studied and Ph.D.ed in Mathematics. At that time, he was the author of QuArK, a Quake map editor. Afterwards, he worked on various programs and hacks that are more From cfbolz at codespeak.net Fri Sep 30 23:12:14 2005 From: cfbolz at codespeak.net (cfbolz at codespeak.net) Date: Fri, 30 Sep 2005 23:12:14 +0200 (CEST) Subject: [pypy-svn] r18029 - pypy/extradoc/talk/22c3 Message-ID: <20050930211214.2228927BB9@code1.codespeak.net> Author: cfbolz Date: Fri Sep 30 23:12:12 2005 New Revision: 18029 Modified: pypy/extradoc/talk/22c3/speaker-carlfriedrichbolz.txt Log: changed bio. added link to image Modified: pypy/extradoc/talk/22c3/speaker-carlfriedrichbolz.txt ============================================================================== --- pypy/extradoc/talk/22c3/speaker-carlfriedrichbolz.txt (original) +++ pypy/extradoc/talk/22c3/speaker-carlfriedrichbolz.txt Fri Sep 30 23:12:12 2005 @@ -11,6 +11,10 @@ Phone number(s): +49 6221 7183177 +A photo, square format, min. 128x128 pixels (optional): +http://codespeak.net/~cfbolz/carl_friedrich_bolz.jpg + + Statement: publishing contact info except for the phone number is fine with me. @@ -22,10 +26,9 @@ Carl Friedrich started to program C++ when he was 16, couldn't stand it and turned to Python soon afterwards. He is a PyPy developer since the - beginning of 2005. At the moment he is supposed to study mathematics, - computer science and sometimes physics at the University of Heidelberg. - Since programming is more fun he is taking a semester off to work for - merlinux on PyPy and other projects. + beginning of 2005. He studies mathematics, computer science and physics + at the University of Heidelberg and works for merlinux on PyPy and other + projects. Postal address: Carl Friedrich Bolz, Albert-Fritz-Str. 9, 69124 Heidelberg Bank information: