From Moshe Zadka Wed Nov 1 14:38:38 2000 From: Moshe Zadka (Moshe Zadka) Date: Wed, 1 Nov 2000 16:38:38 +0200 (IST) Subject: [Python-Dev] Bug Changes Message-ID: I've noticed the SF-FAQ still has the old "Use Jitterbug" thing about bugs, even though we've moved to the SourceForge bug manager. Attached is a patch to correct everything. I haven't checked it in, because I'm not sure my explanations are clear at all. I'd be happy if someone went over it and saw if it's all right. -- Moshe Zadka -- 95855124 http://advogato.org/person/moshez

[Attachment: a base64-encoded context diff against nondist/sf-html/sf-faq.html, replacing the FAQ's Jitterbug instructions with SourceForge bug-manager instructions; the encoded blob is omitted here.]
From mwh21@cam.ac.uk Wed Nov 1 18:13:07 2000 From: mwh21@cam.ac.uk (Michael Hudson) Date: 01 Nov 2000 18:13:07 +0000 Subject: [Python-Dev] Python Call Mechanism In-Reply-To: Jeremy Hylton's message of "Mon, 30 Oct 2000 09:59:00 -0500 (EST)" References: <200010301715.JAA32564@slayer.i.sourceforge.net> <39FDB5EA.B60EA39A@lemburg.com> <14845.36020.17063.951147@bitdiddle.concentric.net> Message-ID: Jeremy Hylton writes: > >>>>> "MAL" == M -A Lemburg writes: > > MAL> Jeremy Hylton wrote: > >> > >> Update of /cvsroot/python/python/dist/src/Python In directory > >> slayer.i.sourceforge.net:/tmp/cvs-serv32349/Python > >> > >> Modified Files: ceval.c ... N.B. The CALL_FUNCTION > >> implementation is getting really hairy; should review it to see > >> if it can be simplified. > > MAL> How about a complete redesign of the whole call mechanism ?! > [chomp] > > I'd be interested in looking at it. Random idea that occurred while answering a post on comp.lang.python: How about dumping the CALL_FUNCTION* opcodes, and replacing them with two non-argumented opcodes, called for the sake of argument NCALL_FUNC and NCALL_FUNC_KW. NCALL_FUNC would pop a function object and a tuple off the stack and apply the function to the tuple. NCALL_FUNC_KW would do the same, then pop a dictionary and then do the moral equivalent of f(*args,**kw). As a preliminary it would be sensible to rework BUILD_MAP so that it built dictionaries off the stack (a bit like BUILD_LIST, and like CALL_FUNCTION now does with keyword arguments...) (and extend the compiler to use this for literal dictionaries). This would add an opcode or so per function call, but would probably make life much simpler. No time for implementation tonight, but could probably knock something up tomorrow (depending how hard it turns out to be). Thoughts? Is that like what you did, Marc? M.
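A rough model of the proposed semantics, written as plain Python over an explicit value stack (NCALL_FUNC and NCALL_FUNC_KW are Michael's proposed names, not real CPython opcodes, and the exact stack layout assumed here is a guess):

```python
# Toy model of the proposed NCALL_FUNC / NCALL_FUNC_KW opcodes: each one
# pops its operands off an explicit value stack and pushes the call result.
# Assumed layout: function object on top, argument tuple beneath it,
# keyword dictionary (if any) beneath that.

def ncall_func(stack):
    func = stack.pop()    # function object
    args = stack.pop()    # tuple of positional arguments
    stack.append(func(*args))

def ncall_func_kw(stack):
    func = stack.pop()
    args = stack.pop()
    kwargs = stack.pop()  # dictionary of keyword arguments
    stack.append(func(*args, **kwargs))

# The moral equivalent of len("abc"):
stack = [("abc",), len]
ncall_func(stack)

# The moral equivalent of sorted([3, 1, 2], reverse=True):
stack2 = [{"reverse": True}, ([3, 1, 2],), sorted]
ncall_func_kw(stack2)
```

The point of the comparison below is the extra tuple: today's CALL_FUNCTION reads its arguments in place on the stack, while this scheme requires materializing them as a tuple first.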
-- Those who have deviant punctuation desires should take care of their own perverted needs. -- Erik Naggum, comp.lang.lisp From jeremy@alum.mit.edu Wed Nov 1 19:06:46 2000 From: jeremy@alum.mit.edu (Jeremy Hylton) Date: Wed, 1 Nov 2000 14:06:46 -0500 (EST) Subject: [Python-Dev] Python Call Mechanism In-Reply-To: References: <200010301715.JAA32564@slayer.i.sourceforge.net> <39FDB5EA.B60EA39A@lemburg.com> <14845.36020.17063.951147@bitdiddle.concentric.net> Message-ID: <14848.27078.923932.758419@bitdiddle.concentric.net> My first impression is that this sounds like a nice simplification. One question is how expensive this is for the common case. Right now arguments are pushed on the interpreter stack before CALL_FUNCTION is executed, which is just a pointer assignment. The pointers on the stack are then assigned into the fast locals of the function after the call. Your scheme sounds like it would increase all function calls by the cost of a tuple allocation. It certainly wouldn't hurt to implement this, as it would provide some practical implementation experience that would inform a PEP on the subject. On a related note, I have proposed a PEP to add nested lexical scopes for Python 2.1. Barry's away for the moment, so it hasn't been assigned a number yet. It's just a proposal, not sure what Guido will say in the end, but it also involves revising the function call architecture. I'll send a copy of the current draft (just notes) under a separate subject. Jeremy From jeremy@alum.mit.edu Wed Nov 1 19:07:10 2000 From: jeremy@alum.mit.edu (Jeremy Hylton) Date: Wed, 1 Nov 2000 14:07:10 -0500 (EST) Subject: [Python-Dev] statically nested scopes Message-ID: <14848.27102.223001.369662@bitdiddle.concentric.net>

Title: Statically Nested Scopes
Author: Jeremy Hylton
Status: Draft
Type: Standards Track
Created: 01-Nov-2000

Abstract

This PEP proposes the addition of statically nested scoping (lexical scoping) for Python 2.1.
The current language definition defines exactly three namespaces that are used to resolve names -- the local, global, and built-in namespaces. The addition of nested scopes would allow resolution of unbound local names in enclosing functions' namespaces. The consequence of this change that will be most visible to Python programs is that lambda expressions could reference variables in the namespaces where the lambda is defined. Currently, a lambda expression must use default arguments to explicitly create bindings in the lambda's namespace.

Notes

This section describes several issues that will be fleshed out and addressed in the final draft of the PEP. Until that draft is ready, please direct comments to the author.

This change has been proposed many times in the past. It has always been stymied by the possibility of creating cycles that could not be collected by Python's reference-counting garbage collector. The addition of the cycle collector in Python 2.0 eliminates this concern.

Guido once explained that his original reservation about nested scopes was a reaction to their overuse in Pascal. In large Pascal programs he was familiar with, block structure was overused as an organizing principle for the program, leading to hard-to-read code.

Greg Ewing developed a proposal "Python Nested Lexical Scoping Enhancement" in Aug. 1999. It is available from http://www.cosc.canterbury.ac.nz/~greg/python/lexscope.html

Michael Hudson's bytecodehacks project at http://sourceforge.net/projects/bytecodehacks/ provides facilities to support nested scopes using the closure module.

Examples:

    def make_adder(n):
        def adder(x):
            return x + n
        return adder

    add2 = make_adder(2)
    add2(5) == 7

    from Tkinter import *
    root = Tk()
    Button(root, text="Click here",
           command=lambda: root.test.configure(text="..."))

One controversial issue is whether it should be possible to modify the value of variables defined in an enclosing scope.
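For contrast, the default-argument workaround mentioned in the abstract looks like this (a sketch that runs under the pre-2.1 rules; without the explicit capture, the bare reference to n inside adder would raise NameError at call time, since n is neither local nor global there):

```python
def make_adder(n):
    # Pre-2.1 rules: `adder` cannot see the enclosing function's `n`,
    # so its value is captured explicitly as a default argument.
    def adder(x, n=n):
        return x + n
    return adder

add2 = make_adder(2)
assert add2(5) == 7
```

The capture happens once, when adder is defined, which is also why this idiom only simulates lexical scoping: later rebindings of n in the enclosing function are not seen.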
One part of the issue is how to specify that an assignment in the local scope should refer to the definition of the variable in an enclosing scope. Assignment to a variable in the current scope creates a local variable in the scope. If the assignment is supposed to refer to a global variable, the global statement must be used to prevent a local name from being created. Presumably, another keyword would be required to specify "nearest enclosing scope."

Guido is opposed to allowing modifications (need to clarify exactly why). If you are modifying variables bound in enclosing scopes, you should be using a class, he says.

The problem occurs only when a program attempts to rebind the name in the enclosing scope. A mutable object, e.g. a list or dictionary, can be modified by a reference in a nested scope; this is an obvious consequence of Python's reference semantics. The ability to change mutable objects leads to an inelegant workaround: if a program needs to rebind an immutable object, e.g. a number or tuple, store the object in a list and have all references to the object use this list:

    def bank_account(initial_balance):
        balance = [initial_balance]
        def deposit(amount):
            balance[0] = balance[0] + amount
        def withdraw(amount):
            balance[0] = balance[0] - amount
        return deposit, withdraw

I would prefer for the language to support this style of programming directly rather than encouraging programs to use this somewhat obfuscated style. Of course, an instance would probably be clearer in this case.

One implementation issue is how to represent the environment that stores variables that are referenced by nested scopes. One possibility is to add a pointer to each frame's statically enclosing frame and walk the chain of links each time a non-local variable is accessed. This implementation has some problems, because access to nonlocal variables is slow and causes garbage to accumulate unnecessarily.
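The static-link approach in the previous paragraph can be modelled in a few lines of Python (a toy sketch; the Frame class and its attributes are illustrative, not CPython internals):

```python
class Frame:
    """Toy frame with a pointer to its statically enclosing frame."""

    def __init__(self, names, enclosing=None):
        self.names = names            # this frame's local bindings
        self.enclosing = enclosing    # static link to the defining frame

    def lookup(self, name):
        # Walk the chain of static links. Doing this walk on every
        # access is exactly the slowness the paragraph above notes.
        frame = self
        while frame is not None:
            if name in frame.names:
                return frame.names[name]
            frame = frame.enclosing
        raise NameError(name)

module = Frame({"x": 1})                  # module scope
f1 = Frame({"x": 2}, enclosing=module)    # enclosing function scope
inner = Frame({}, enclosing=f1)           # nested function scope

assert inner.lookup("x") == 2  # nearest enclosing binding wins
```

The garbage problem mentioned above also shows up in this model: as long as inner is reachable, the whole chain of frames behind it stays alive.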
Another possibility is to construct an environment for each function that provides access to only the non-local variables. This environment would be explicitly passed to nested functions. From mal@lemburg.com Wed Nov 1 20:31:08 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Wed, 01 Nov 2000 21:31:08 +0100 Subject: [Python-Dev] Python Call Mechanism References: <200010301715.JAA32564@slayer.i.sourceforge.net> <39FDB5EA.B60EA39A@lemburg.com> <14845.36020.17063.951147@bitdiddle.concentric.net> Message-ID: <3A007D8C.A9943D90@lemburg.com> Michael Hudson wrote: > > Jeremy Hylton writes: > > > >>>>> "MAL" == M -A Lemburg writes: > > > > MAL> Jeremy Hylton wrote: > > >> > > >> Update of /cvsroot/python/python/dist/src/Python In directory > > >> slayer.i.sourceforge.net:/tmp/cvs-serv32349/Python > > >> > > >> Modified Files: ceval.c ... N.B. The CALL_FUNCTION > > >> implementation is getting really hairy; should review it to see > > >> if it can be simplified. > > > > MAL> How about a complete redesign of the whole call mechanism ?! > > > [chomp] > > > > I'd be interested in looking at it. > > Random idea that occurred while answering a post on comp.lang.python: > > How about dumping the CALL_FUNCTION* opcodes, and replacing them with > two non-argumented opcodes, called for the sake of argument NCALL_FUNC > and NCALL_FUNC_KW. > > NCALL_FUNC would pop a function object and a tuple off the stack and > apply the function to the tuple. > > NCALL_FUNC_KW would do the same, then pop a dictionary and then do > the moral equivalent of f(*args,**kw). > > As a preliminary it would be sensible to rework BUILD_MAP so that it > built dictionaries off the stack (a bit like BUILD_LIST, and like > CALL_FUNCTION now does with keyword arguments...) (and extend the > compiler to use this for literal dictionaries). > > This would add an opcode or so per function call, but would probably > make life much simpler. 
> > No time for implementation tonight, but could probably knock something > up tomorrow (depending how hard it turns out to be). > > Thoughts? Is that like what you did, Marc? No, I just cleaned up the intertwined calling scheme currently implemented in ceval.c. This allows a few improvements, one of them being the possibility to inline C function calls in the main loop (anyone ever trace the path Python takes when calling a builtin function or method... you'd be surprised). About your idea with the new opcodes: you could be touching a performance-relevant section there -- a ceval round-trip may cost more than the added if()s in the CALL_FUNCTION opcode. -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From mal@lemburg.com Wed Nov 1 20:37:12 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Wed, 01 Nov 2000 21:37:12 +0100 Subject: [Python-Dev] statically nested scopes References: <14848.27102.223001.369662@bitdiddle.concentric.net> Message-ID: <3A007EF8.2D9BDCF5@lemburg.com> [pre-PEP] This will break code... I'm not sure whether it's worth going down this path just for the sake of being able to define functions within functions. Wouldn't it be a better idea to somehow add native acquisition to Python's objects? We already have a slot which implements the "contains" relationship. All we'd need is a way for a contained object to register itself with the container in a way that doesn't produce cycles.
-- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From jeremy@alum.mit.edu Wed Nov 1 20:48:53 2000 From: jeremy@alum.mit.edu (Jeremy Hylton) Date: Wed, 1 Nov 2000 15:48:53 -0500 (EST) Subject: [Python-Dev] statically nested scopes In-Reply-To: <3A007EF8.2D9BDCF5@lemburg.com> References: <14848.27102.223001.369662@bitdiddle.concentric.net> <3A007EF8.2D9BDCF5@lemburg.com> Message-ID: <14848.33205.361821.679508@bitdiddle.concentric.net> >>>>> "MAL" == M -A Lemburg writes: MAL> [pre-PEP] This will break code... I'm not sure whether it's MAL> worth going down this path just for the sake of being able to MAL> define functions within functions. How will this break code? Any code written to use the scoping rules will not work today. Python already allows programs to define functions within functions. That's not at issue. The issue is how hard it is to use nested functions, including lambdas. MAL> Wouldn't it be a better idea to somehow add native acquisition MAL> to Python's objects ? No. Seriously, I don't see how acquisition addresses the same issues at all. Feel free to explain what you mean. MAL> We already have a slot which implements the "contains" MAL> relationship. All we'd need is a way for a contained object to MAL> register itself with the container in a way that doesn't MAL> produce cycles. The contains relationship has to do with container objects and their contents. A function's environment is not a container in the same sense, so I don't see how this is related. As I noted in the PEP, I don't see a compelling reason to avoid cycles. Jeremy From mal@lemburg.com Wed Nov 1 20:51:16 2000 From: mal@lemburg.com (M.-A.
Lemburg) Date: Wed, 01 Nov 2000 21:51:16 +0100 Subject: [Python-Dev] statically nested scopes References: <14848.27102.223001.369662@bitdiddle.concentric.net> <3A007EF8.2D9BDCF5@lemburg.com> <14848.33205.361821.679508@bitdiddle.concentric.net> Message-ID: <3A008244.8F558C8B@lemburg.com> Jeremy Hylton wrote: > > >>>>> "MAL" == M -A Lemburg writes: > > MAL> [pre-PEP] This will break code... I'm not sure whether it's > MAL> worth going down this path just for the sake of being able to > MAL> define functions within functions. > > How will this break code? Any code written to use the scoping rules > will not work today. > > Python already allows programs to define functions within functions. > That's not at issue. The issue is how hard it is to use nested > functions, including lambdas. The problem is that with nested scoping, a function defined within another function will suddenly reference the variables of the enclosing function as globals and not the module globals... this could break code. Another problem is that you can't reach out for the defining module globals anymore (at least not in an easy way like today). > MAL> Wouldn't it be a better idea to somehow add native acquisition > MAL> to Python's objects ? > > No. > > Seriously, I don't see how acquisition addresses the same issues at > all. Feel free to explain what you mean. It's not related to *statically* nested scopes, but is to dynamically nested ones. Acquisition is basically about the same thing: you acquire attributes from containers. The only difference here is that the containment relationships are defined at run-time. > MAL> We already have a slot which implements the "contains" > MAL> relationship. All we'd need is a way for a contained object to > MAL> register itself with the container in a way that doesn't > MAL> produce cycles. > > The contains relationship has to do with container objects and their > contents.
A function's environment is not a container in the same > sense, so I don't see how this is related. > > As I noted in the PEP, I don't see a compelling reason to avoid > cycles. Ok, we have cycle GC, but why create cycles when you don't have to (Python can also be run without GC and then you'd run into problems...) ? -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From skip@mojam.com (Skip Montanaro) Wed Nov 1 21:57:17 2000 From: skip@mojam.com (Skip Montanaro) Date: Wed, 1 Nov 2000 15:57:17 -0600 (CST) Subject: [Python-Dev] statically nested scopes In-Reply-To: <14848.33205.361821.679508@bitdiddle.concentric.net> References: <14848.27102.223001.369662@bitdiddle.concentric.net> <3A007EF8.2D9BDCF5@lemburg.com> <14848.33205.361821.679508@bitdiddle.concentric.net> Message-ID: <14848.37309.3616.710295@beluga.mojam.com> MAL> [pre-PEP] This will break code... Jeremy> How will this break code? Suppose you have

    x = 1
    def f1():
        x = 2
        def inner():
            print x
        inner()

Today, calling f1() prints "1". After your proposed changes I suspect it would print "2". Skip From jeremy@alum.mit.edu Wed Nov 1 21:18:49 2000 From: jeremy@alum.mit.edu (Jeremy Hylton) Date: Wed, 1 Nov 2000 16:18:49 -0500 (EST) Subject: [Python-Dev] statically nested scopes In-Reply-To: <3A008244.8F558C8B@lemburg.com> References: <14848.27102.223001.369662@bitdiddle.concentric.net> <3A007EF8.2D9BDCF5@lemburg.com> <14848.33205.361821.679508@bitdiddle.concentric.net> <3A008244.8F558C8B@lemburg.com> Message-ID: <14848.35001.109767.606384@bitdiddle.concentric.net> >>>>> "MAL" == M -A Lemburg writes: MAL> [pre-PEP] This will break code... I'm not sure whether it's MAL> worth going down this path just for the sake of being able to MAL> define functions within functions. >> >> How will this break code? Any code written to use the scoping >> rules will not work today.
>> >> Python already allows programs to define functions within >> functions. That's not at issue. The issue is how hard it is to >> use nested functions, including lambdas. MAL> The problem is that with nested scoping, a function defined MAL> within another function will suddenly reference the variables MAL> of the enclosing function as globals and not the module MAL> globals... this could break code. That's right, it could, in the unlikely case that someone has existing code today using nested functions where an intermediate function defines a local variable that shadows a variable defined in an enclosing scope. It should be straightforward to build a tool that would detect this case. It would be pretty poor programming style, so I think it would be fine to break backwards compatibility here. MAL> Another problem is that you can't reach out for the defining MAL> module globals anymore (at least not in an easy way like MAL> today). I think we would want to keep globals implemented just the way they are. The compiler would need to determine exactly which variables are accessed from enclosing scopes and which are globals. MAL> Wouldn't it be a better idea to somehow add native acquisition MAL> to Python's objects ? >> >> Seriously, I don't see how acquisition addresses the same issues >> at all. Feel free to explain what you mean. MAL> It's not related to *statically* nested scopes, but is to MAL> dynamically nested ones. Acquisition is basically about the MAL> same thing: you acquire attributes from containers. The only MAL> difference here is that the containment relationships are MAL> defined at run-time. Static scoping and dynamic scoping are entirely different beasts, useful for different things. I want to fix, among other things, lambdas. That's a static issue. MAL> We already have a slot which implements the "contains" MAL> relationship.
All we'd need is a way for a contained object to MAL> register itself with the container in a way that doesn't MAL> produce cycles. >> >> The contains relationship has to do with container objects and >> their contents. A function's environment is not a container in >> the same sense, so I don't see how this is related. >> >> As I noted in the PEP, I don't see a compelling reason to avoid >> cycles. MAL> Ok, we have cycle GC, but why create cycles when you don't have MAL> to (Python can also be run without GC and then you'd run into MAL> problems...) ? If we can avoid cycles, sure. I would prefer a simple design that allowed cycles to a complicated design that avoided them. Exactly where to draw the line between simple and complex is a matter of taste, of course. Jeremy From jeremy@alum.mit.edu Wed Nov 1 21:20:08 2000 From: jeremy@alum.mit.edu (Jeremy Hylton) Date: Wed, 1 Nov 2000 16:20:08 -0500 (EST) Subject: [Python-Dev] statically nested scopes In-Reply-To: <14848.37309.3616.710295@beluga.mojam.com> References: <14848.27102.223001.369662@bitdiddle.concentric.net> <3A007EF8.2D9BDCF5@lemburg.com> <14848.33205.361821.679508@bitdiddle.concentric.net> <14848.37309.3616.710295@beluga.mojam.com> Message-ID: <14848.35080.73182.888834@bitdiddle.concentric.net> Thanks. I expect there is very little code that depends on this sort of behavior, since it is confusing to read. Many readers, particularly novices, could reasonably expect Python to print 2 now. As I explained to MAL, I think we would need to provide a code analysis tool that identified these problems. It's probably helpful to generate a warning about this right now, since it's rather obfuscated. Jeremy From jeremy@alum.mit.edu Wed Nov 1 22:35:14 2000 From: jeremy@alum.mit.edu (Jeremy Hylton) Date: Wed, 1 Nov 2000 17:35:14 -0500 (EST) Subject: [Python-Dev] Modules/makesetup loop Message-ID: <14848.39586.832800.139182@bitdiddle.concentric.net> I just did a clean configure and make from the latest CVS tree.
It seems to get stuck in a loop calling makesetup over and over again. (Output below.) Any idea what's going wrong? Jeremy (cd Modules; make -f Makefile.pre Makefile) make[1]: Entering directory `/home/jeremy/src/python/dist/src/build-O3/Modules' ----------------------------------------------- Modules/Setup.dist is newer than Modules/Setup; check to make sure you have all the updates you need in your Modules/Setup file. ----------------------------------------------- rm -rf ../libpython2.0.a /bin/sh ../../Modules/makesetup Setup.config Setup.local Setup cat: Setup.local: No such file or directory cat: Setup: No such file or directory make[1]: Leaving directory `/home/jeremy/src/python/dist/src/build-O3/Modules' making Makefile in subdirectory . make[1]: Entering directory `/home/jeremy/src/python/dist/src/build-O3' make[1]: `Makefile' is up to date. make[1]: Leaving directory `/home/jeremy/src/python/dist/src/build-O3' making Makefile in subdirectory Parser make[1]: Entering directory `/home/jeremy/src/python/dist/src/build-O3/Parser' make[1]: `Makefile' is up to date. make[1]: Leaving directory `/home/jeremy/src/python/dist/src/build-O3/Parser' making Makefile in subdirectory Grammar make[1]: Entering directory `/home/jeremy/src/python/dist/src/build-O3/Grammar' make[1]: Nothing to be done for `Makefile'. make[1]: Leaving directory `/home/jeremy/src/python/dist/src/build-O3/Grammar' making Makefile in subdirectory Objects make[1]: Entering directory `/home/jeremy/src/python/dist/src/build-O3/Objects' make[1]: `Makefile' is up to date. make[1]: Leaving directory `/home/jeremy/src/python/dist/src/build-O3/Objects' making Makefile in subdirectory Python make[1]: Entering directory `/home/jeremy/src/python/dist/src/build-O3/Python' make[1]: `Makefile' is up to date. 
make[1]: Leaving directory `/home/jeremy/src/python/dist/src/build-O3/Python' making Makefile in subdirectory Modules make[1]: Entering directory `/home/jeremy/src/python/dist/src/build-O3/Modules' ----------------------------------------------- Modules/Setup.dist is newer than Modules/Setup; check to make sure you have all the updates you need in your Modules/Setup file. ----------------------------------------------- rm -rf ../libpython2.0.a /bin/sh ../../Modules/makesetup Setup.config Setup.local Setup cat: Setup.local: No such file or directory cat: Setup: No such file or directory make[1]: Leaving directory `/home/jeremy/src/python/dist/src/build-O3/Modules' make[1]: Entering directory `/home/jeremy/src/python/dist/src/build-O3/Modules' ----------------------------------------------- Modules/Setup.dist is newer than Modules/Setup; check to make sure you have all the updates you need in your Modules/Setup file. ----------------------------------------------- rm -rf ../libpython2.0.a /bin/sh ../../Modules/makesetup Setup.config Setup.local Setup cat: Setup.local: No such file or directory cat: Setup: No such file or directory make[1]: Leaving directory `/home/jeremy/src/python/dist/src/build-O3/Modules' make[1]: Entering directory `/home/jeremy/src/python/dist/src/build-O3/Modules' ----------------------------------------------- Modules/Setup.dist is newer than Modules/Setup; check to make sure you have all the updates you need in your Modules/Setup file. 
----------------------------------------------- rm -rf ../libpython2.0.a /bin/sh ../../Modules/makesetup Setup.config Setup.local Setup cat: Setup.local: No such file or directory cat: Setup: No such file or directory make[1]: Leaving directory `/home/jeremy/src/python/dist/src/build-O3/Modules' make[1]: Entering directory `/home/jeremy/src/python/dist/src/build-O3/Modules' ----------------------------------------------- Modules/Setup.dist is newer than Modules/Setup; check to make sure you have all the updates you need in your Modules/Setup file. ----------------------------------------------- [etc.] From thomas@xs4all.net Wed Nov 1 22:41:34 2000 From: thomas@xs4all.net (Thomas Wouters) Date: Wed, 1 Nov 2000 23:41:34 +0100 Subject: [Python-Dev] Re: [Python-checkins] CVS: python/nondist/peps pep-0204.txt,1.4,1.5 In-Reply-To: <200011012237.OAA06642@slayer.i.sourceforge.net>; from twouters@users.sourceforge.net on Wed, Nov 01, 2000 at 02:37:39PM -0800 References: <200011012237.OAA06642@slayer.i.sourceforge.net> Message-ID: <20001101234134.O12812@xs4all.nl> On Wed, Nov 01, 2000 at 02:37:39PM -0800, Thomas Wouters wrote: > Modified Files: > pep-0204.txt > Log Message: > Update this PEP to current, harsh, reality. It's been rejected :) If at all > possible, the reasoning should be extended to include the real reasons it > was rejected -- this is just guesswork from my side. (This means you, Guido, > or anyone who can channel Guido enough to call himself Guido.) In addition to that, PEP 0 also needs to be updated. Shall I do that myself, now that Barry is apparently away ? While I was at it, I also noticed PEP 0200 still says 'Incomplete', though that might be by design. Yay-another-first-for-me---first-rejected-PEP-ly y'rs, ;) -- Thomas Wouters Hi! I'm a .signature virus! copy me into your .signature file to help me spread! 
From jeremy@alum.mit.edu Wed Nov 1 22:54:07 2000 From: jeremy@alum.mit.edu (Jeremy Hylton) Date: Wed, 1 Nov 2000 17:54:07 -0500 (EST) Subject: [Python-Dev] Move idle PEPs to status deferred Message-ID: <14848.40719.914227.606929@bitdiddle.concentric.net> Barry, We should start working on any PEPs that are going to be considered for Python 2.1. There are a bunch of old PEPs that have been assigned and then ignored. You have marked many of them as deferred. We should assign new authors for the PEPs we care about and move all the other deferred PEPs somewhere else in the index. The following PEPs have been inactive for at least two months:

I   2   pep-0002.txt  Procedure for Adding New Modules        Raymond
S   202 pep-0202.txt  List Comprehensions                     Peters
SD  205 pep-0205.txt  Weak References                         Drake
I   206 pep-0206.txt  2.0 Batteries Included                  Zadka
SD  207 pep-0207.txt  Rich Comparisons                        Ascher
SD  208 pep-0208.txt  Reworking the Coercion Model            Ascher
SD  209 pep-0209.txt  Adding Multidimensional Arrays          Ascher
SD  210 pep-0210.txt  Decoupling the Interpreter Loop         Ascher
SD  211 pep-0211.txt  Adding New Linear Algebra Operators     Wilson
SD  212 pep-0212.txt  Loop Counter Iteration                  Schneider-Kamp
SD  213 pep-0213.txt  Attribute Access Handlers               Prescod
SD  215 pep-0215.txt  String Interpolation                    Yee
I   216 pep-0216.txt  Docstring Format                        Zadka
SD  217 pep-0217.txt  Display Hook for Interactive Use        Zadka
SD  218 pep-0218.txt  Adding a Built-In Set Object Type       Wilson
SD  219 pep-0219.txt  Stackless Python                        McMillan
I   220 pep-0220.txt  Coroutines, Generators, Continuations   McMillan
S   224 pep-0224.txt  Attribute Docstrings                    Lemburg

I think we should find new authors for PEPs 207 and 208 and work on them for 2.1. I imagine David would be happy to pass the torch on these issues. I assume Gordon will be working on the stackless PEPs, but we should confirm this. For all of the other PEPs, authors who want to have them considered for 2.1 should provide some updates to their PEPs.
We should also simplify the PEP index so that deferred PEPs are collected at the bottom or something like that. Jeremy From greg@cosc.canterbury.ac.nz Thu Nov 2 00:34:04 2000 From: greg@cosc.canterbury.ac.nz (Greg Ewing) Date: Thu, 02 Nov 2000 13:34:04 +1300 (NZDT) Subject: [Python-Dev] statically nested scopes In-Reply-To: <14848.33205.361821.679508@bitdiddle.concentric.net> Message-ID: <200011020034.NAA29444@s454.cosc.canterbury.ac.nz> Jeremy Hylton : > Seriously, I don't see how acquisition addresses the same issues at > all. My proposal for nested scopes was actually an acquisition-like mechanism. The idea was to avoid unbreakable cycles by deferring the creation of a closure from when the function is defined to when it is used. Guido rejected my implementation for various good reasons. It could be modified to overcome most of those objections, however. Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg@cosc.canterbury.ac.nz +--------------------------------------+ From greg@cosc.canterbury.ac.nz Thu Nov 2 00:37:00 2000 From: greg@cosc.canterbury.ac.nz (Greg Ewing) Date: Thu, 02 Nov 2000 13:37:00 +1300 (NZDT) Subject: [Python-Dev] statically nested scopes In-Reply-To: <3A008244.8F558C8B@lemburg.com> Message-ID: <200011020037.NAA29447@s454.cosc.canterbury.ac.nz> "M.-A. Lemburg" : > The problem is that with nested scoping, a function defined > within another function will suddenly reference the variables > of the enclosing function This could be avoided by requiring that variables which are to be visible in an inner scope be marked somehow in the scope where they are defined. I don't think it's a serious enough problem to be worth fixing that way, though.
Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg@cosc.canterbury.ac.nz +--------------------------------------+ From gmcm@hypernet.com Thu Nov 2 01:37:09 2000 From: gmcm@hypernet.com (Gordon McMillan) Date: Wed, 1 Nov 2000 20:37:09 -0500 Subject: [Python-Dev] Move idle PEPs to status deferred In-Reply-To: <14848.40719.914227.606929@bitdiddle.concentric.net> Message-ID: <3A007EF5.21540.6BA02C9@localhost> Jeremy wrote: > I assume Gordon will be working on the stackless PEPs, but we should > confirm this. Yes, I will. - Gordon From tim_one@email.msn.com Thu Nov 2 07:02:04 2000 From: tim_one@email.msn.com (Tim Peters) Date: Thu, 2 Nov 2000 02:02:04 -0500 Subject: [Python-Dev] statically nested scopes In-Reply-To: <14848.27102.223001.369662@bitdiddle.concentric.net> Message-ID: [Jeremy Hylton] > ... > Guido once explained that his original reservation about nested > scopes was a reaction to their overuse in Pascal. In large Pascal > programs he was familiar with, block structure was overused as an > organizing principle for the program, leading to hard-to-read > code. Note that this problem will be much worse in Python: in Pascal, you could always "look up" for the closest-containing func/proc that explicitly declares a referenced vrbl. In Python, you have to indirectly *deduce* which vrbls are local to a def, by searching the entire body for an appearance as a binding target. So you have to "look up" and "look down" from the reference point, and it's easy to miss a binding target.

i = 6
def f(x):
    def g():
        print i
    # ...
    # skip to the next page
    # ...
    for i in x:  # ah, i *is* local to f, so this is what g sees
        pass
    g()

> def bank_account(initial_balance):
>     balance = [initial_balance]
>     def deposit(amount):
>         balance[0] = balance[0] + amount
>     def withdraw(amount):
>         balance[0] = balance[0] - amount
>     return deposit, withdraw

Unfortunately for proponents, this is exactly the kind of SICP example that is much better done via a class. Not only is the closure version strained by comparison, but as is usual it manages to create a bank account with a write-only balance <0.9 wink>.

def deposit(amount):
    global bank_account.balance
    balance += amount

is one old suggested way to explicitly declare non-local names and the enclosing block to which they are local (and in analogy with current "global", mandatory if you want to rebind the non-local name, optional if you only want to reference it). There are subtleties here, but explicit is better than implicit, and the subtleties are only subtler if you refuse (like Scheme) to make the intent explicit. for-real-fun-think-about-"exec"-abuses-ly y'rs - tim From Moshe Zadka Thu Nov 2 08:32:11 2000 From: Moshe Zadka (Moshe Zadka) Date: Thu, 2 Nov 2000 10:32:11 +0200 (IST) Subject: [Python-Dev] statically nested scopes In-Reply-To: Message-ID: [Jeremy Hylton] > ... > Guido once explained that his original reservation about nested > scopes was a reaction to their overuse in Pascal. In large Pascal > programs he was familiar with, block structure was overused as an > organizing principle for the program, leading to hard-to-read > code. [Tim Peters] > Note that this problem will be much worse in Python: in Pascal, you could > always "look up" for the closest-containing func/proc that explicitly > declares a referenced vrbl. In Python, you have to indirectly *deduce* > which vrbls are local to a def, by searching the entire body for an > appearance as a binding target. So you have to "look up" and "look down" > from the reference point, and it's easy to miss a binding target.
This is a tool problem, and should be solved with good tools. Of course, installing the correct tools in people's minds will require some technological discoveries. -- Moshe Zadka -- 95855124 http://advogato.org/person/moshez From gstein@lyra.org Thu Nov 2 08:46:04 2000 From: gstein@lyra.org (Greg Stein) Date: Thu, 2 Nov 2000 00:46:04 -0800 (PST) Subject: [Python-Dev] statically nested scopes In-Reply-To: Message-ID: On Thu, 2 Nov 2000, Moshe Zadka wrote: > [Tim Peters] > > Note that this problem will be much worse in Python: in Pascal, you could > > always "look up" for the closest-containing func/proc that explicitly > > declares a referenced vrbl. In Python, you have to indirectly *deduce* > > which vrbls are local to a def, by searching the entire body for an > > appearance as a binding target. So you have to "look up" and "look down" > > from the reference point, and it's easy to miss a binding target. > > This is a tool problem, and should be solved with good tools. > Of course, installing the correct tools in people's minds will require > some technological discoveries. Bleck. Those tools are a crutch to deal with a poor language design / feature. And are those tools portable? Are they part of everybody's standard tool set? Will vi, emacs, and MS DevStudio all have those capabilities? Not a chance. Personally, I'll take Guido's point of view and say they are inherently hard to deal with; therefore, punt them. Cheers, -g -- Greg Stein, http://www.lyra.org/ From mal@lemburg.com Thu Nov 2 12:12:42 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Thu, 02 Nov 2000 13:12:42 +0100 Subject: [Python-Dev] statically nested scopes References: <200011020037.NAA29447@s454.cosc.canterbury.ac.nz> Message-ID: <3A015A3A.4EC7DBC6@lemburg.com> Greg Ewing wrote: > > "M.-A.
Lemburg" : > > > The problem is that with nested scoping, a function defined > > within another function will suddenly reference the variables > > of the enclosing function > > This could be avoided by requiring that variables which are > to be visible in an inner scope be marked somehow in the > scope where they are defined. > > I don't think it's a serious enough problem to be worth > fixing that way, though. It may not look serious, but changing the Python lookup scheme is, since many inspection tools rely and reimplement exactly that scheme. With nested scopes, there would be next to no way to emulate the lookups using these tools. To be honest, I don't think static nested scopes buy us all that much. You can do the same now, by using keyword arguments which isn't all that nice, but works great and makes the scope clearly visible. Dynamic nested scopes is another topic... those are *very* useful; especially when it comes to avoiding global variables and implementing programs which work using control objects instead of global function calls. -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From barry@wooz.org Thu Nov 2 14:30:26 2000 From: barry@wooz.org (barry@wooz.org) Date: Thu, 2 Nov 2000 09:30:26 -0500 (EST) Subject: [Python-Dev] Re: Move idle PEPs to status deferred References: <14848.40719.914227.606929@bitdiddle.concentric.net> Message-ID: <14849.31362.805142.553781@anthem.concentric.net> I was away from email for most of the day yesterday. I'll do a swing through all the outstanding PEP updates later today -- after I finish catching up on email. :( -Barry From Moshe Zadka Thu Nov 2 14:49:37 2000 From: Moshe Zadka (Moshe Zadka) Date: Thu, 2 Nov 2000 16:49:37 +0200 (IST) Subject: [Python-Dev] PEP-0217 Message-ID: I need some help: 1) BDFL pronouncement 2) someone to see about the Jython issue. Thank you for your co-operation. 
-- Moshe Zadka -- 95855124 http://advogato.org/person/moshez From jeremy@alum.mit.edu Thu Nov 2 15:18:47 2000 From: jeremy@alum.mit.edu (Jeremy Hylton) Date: Thu, 2 Nov 2000 10:18:47 -0500 (EST) Subject: [Python-Dev] statically nested scopes In-Reply-To: References: Message-ID: <14849.34263.310260.404940@bitdiddle.concentric.net> >>>>> "GS" == Greg Stein writes: GS> On Thu, 2 Nov 2000, Moshe Zadka wrote: >> This is a tool problem, and should be solved with good tools. Of >> course, installing the correct tools in people's minds will >> require some technological discoveries. GS> Bleck. Those tools are a crutch to deal with a poor language GS> design / feature. And are those tools portable? Are they part of GS> everybody's standard tool set? Will vi, emacs, and MS DevStudio GS> all have those capabilities? Are you saying that compilers are a crutch and we should get rid of them? I don't think you intend that, but this is a completely straightforward tool to build. It is needed only for backwards compatibility -- to identify scripts that depend on the changed behavior. There is no need for vi, emacs, or devstudio to understand what's going on. Jeremy From Moshe Zadka Thu Nov 2 15:16:41 2000 From: Moshe Zadka (Moshe Zadka) Date: Thu, 2 Nov 2000 17:16:41 +0200 (IST) Subject: [Python-Dev] statically nested scopes In-Reply-To: <14849.34263.310260.404940@bitdiddle.concentric.net> Message-ID: On Thu, 2 Nov 2000, Jeremy Hylton wrote: > >>>>> "GS" == Greg Stein writes: > > GS> On Thu, 2 Nov 2000, Moshe Zadka wrote: > >> This is a tool problem, and should be solved with good tools. Of > >> course, installing the correct tools in people's minds will > >> require some technological discoveries. > > GS> Bleck. Those tools are a crutch to deal with a poor language > GS> design / feature. And are those tools portable? Are they part of > GS> everybody's standard tool set? Will vi, emacs, and MS DevStudio > GS> all have those capabilities?
> > Are you saying that compilers are a crutch and we should get rid of > them? I don't think you intend that, but this is a completely > straightforward tool to build. It is needed only for backwards > compatibility -- to identify scripts that depend on the changed > behavior. There is no need for vi, emacs, or devstudio to understand > what's going on. you guys are talking about different things. Jeremy is talking about a tool to warn against incompatible changes Greg is talking about a tool to identify, for each variable, what scope it belongs to. as-usual-the-answer-is-"you're-both-right"-ly y'rs, Z. -- Moshe Zadka -- 95855124 http://advogato.org/person/moshez From guido@python.org Thu Nov 2 03:26:14 2000 From: guido@python.org (Guido van Rossum) Date: Wed, 01 Nov 2000 22:26:14 -0500 Subject: [Python-Dev] Re: Dynamic nested scopes In-Reply-To: Your message of "Thu, 02 Nov 2000 13:12:42 +0100." <3A015A3A.4EC7DBC6@lemburg.com> References: <200011020037.NAA29447@s454.cosc.canterbury.ac.nz> <3A015A3A.4EC7DBC6@lemburg.com> Message-ID: <200011020326.WAA07307@cj20424-a.reston1.va.home.com> [MAL] > Dynamic nested scopes is another topic... those are *very* > useful; especially when it comes to avoiding global variables > and implementing programs which work using control objects > instead of global function calls. Marc-Andre, what are Dynamic nested scopes? --Guido van Rossum (home page: http://www.python.org/~guido/) From Moshe Zadka Thu Nov 2 15:26:20 2000 From: Moshe Zadka (Moshe Zadka) Date: Thu, 2 Nov 2000 17:26:20 +0200 (IST) Subject: [Python-Dev] Re: Dynamic nested scopes In-Reply-To: <200011020326.WAA07307@cj20424-a.reston1.va.home.com> Message-ID: On Wed, 1 Nov 2000, Guido van Rossum wrote: > [MAL] > > Dynamic nested scopes is another topic... those are *very* > > useful; especially when it comes to avoiding global variables > > and implementing programs which work using control objects > > instead of global function calls. 
> > Marc-Andre, what are Dynamic nested scopes? If MAL means dynamic scoping (which I understood he does), then this simply means: when looking for a variable "foo", you first search for it in the local namespace. If not there, the *caller's* namespace, and so on. In the end, the caller is the __main__ module, and if not found there, it is a NameError. -- Moshe Zadka -- 95855124 http://advogato.org/person/moshez From guido@python.org Thu Nov 2 03:29:03 2000 From: guido@python.org (Guido van Rossum) Date: Wed, 01 Nov 2000 22:29:03 -0500 Subject: [Python-Dev] statically nested scopes In-Reply-To: Your message of "Thu, 02 Nov 2000 13:12:42 +0100." <3A015A3A.4EC7DBC6@lemburg.com> References: <200011020037.NAA29447@s454.cosc.canterbury.ac.nz> <3A015A3A.4EC7DBC6@lemburg.com> Message-ID: <200011020329.WAA07330@cj20424-a.reston1.va.home.com> > It may not look serious, but changing the Python lookup scheme > is, since many inspection tools rely and reimplement exactly > that scheme. With nested scopes, there would be next to no > way to emulate the lookups using these tools. So fix the tools. > To be honest, I don't think static nested scopes buy us all that > much. You can do the same now, by using keyword arguments which > isn't all that nice, but works great and makes the scope clearly > visible. Yes. It's a hack that gets employed over and over. And it has certain problems. We added 'import as' to get rid of a common practice that was perceived unclean. Maybe we should support nested scopes to get rid of another unclean common practice? I'm not saying that we definitely should add this to 2.1 (there's enough on our plate already) but we should at least consider it, and now that we have cycle GC, the major argument against it (that it causes cycles) is gone... 
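[The "hack that gets employed over and over" can be made concrete. Without nested scopes, an inner def cannot see the enclosing function's locals, so the value is smuggled in as a keyword/default argument — the trick MAL and Guido refer to. A minimal sketch; `make_adder` and the other names are illustrative, not from the thread:]

```python
def make_adder(n):
    # Pre-nested-scopes workaround: capture n as a default argument,
    # because add() cannot otherwise see make_adder's local n.
    def add(x, n=n):
        return x + n
    return add

add3 = make_adder(3)
assert add3(7) == 10
```

[Note the limitation: the default argument freezes the binding at def time, so a later rebinding of `n` inside `make_adder` would go unseen — one reason the trick is only a partial substitute for real nested scopes.]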
--Guido van Rossum (home page: http://www.python.org/~guido/) From Moshe Zadka Thu Nov 2 15:29:55 2000 From: Moshe Zadka (Moshe Zadka) Date: Thu, 2 Nov 2000 17:29:55 +0200 (IST) Subject: [Python-Dev] statically nested scopes In-Reply-To: <200011020329.WAA07330@cj20424-a.reston1.va.home.com> Message-ID: On Wed, 1 Nov 2000, Guido van Rossum wrote: > I'm not saying that we definitely should add this to 2.1 (there's > enough on our plate already) but we should at least consider it, and > now that we have cycle GC, the major argument against it (that it > causes cycles) is gone... This is the perfect moment to ask: what do we have on our plates for 2.1? Shouldn't we have a list of goals for it or something? As a first-order approximation, what PEPs are expected to be included? And, most the conspiracy-theory question, what are Digital Creations' goals for Python? We now return you to our regularly scheduled bug fixing. -- Moshe Zadka -- 95855124 http://advogato.org/person/moshez From akuchlin@mems-exchange.org Thu Nov 2 15:34:42 2000 From: akuchlin@mems-exchange.org (Andrew Kuchling) Date: Thu, 2 Nov 2000 10:34:42 -0500 Subject: [Python-Dev] Python 2.1 tasks In-Reply-To: ; from moshez@math.huji.ac.il on Thu, Nov 02, 2000 at 05:29:55PM +0200 References: <200011020329.WAA07330@cj20424-a.reston1.va.home.com> Message-ID: <20001102103442.B5027@kronos.cnri.reston.va.us> On Thu, Nov 02, 2000 at 05:29:55PM +0200, Moshe Zadka wrote: >Shouldn't we have a list of goals for [Python 2.1] or something? As a >first-order approximation, what PEPs are expected to be included? And, Stuff I personally want to get done: * Finish PEP 222, "Web Programming Improvements" and implement whatever emerges from it. * Write a PEP on using Distutils to build the modules that come with Python, and implement it if accepted. * Work on something CPAN-like. This may or may not have repercussions for the core; I don't know.
--amk From thomas@xs4all.net Thu Nov 2 15:42:39 2000 From: thomas@xs4all.net (Thomas Wouters) Date: Thu, 2 Nov 2000 16:42:39 +0100 Subject: [Python-Dev] Python 2.1 tasks In-Reply-To: <20001102103442.B5027@kronos.cnri.reston.va.us>; from akuchlin@mems-exchange.org on Thu, Nov 02, 2000 at 10:34:42AM -0500 References: <200011020329.WAA07330@cj20424-a.reston1.va.home.com> <20001102103442.B5027@kronos.cnri.reston.va.us> Message-ID: <20001102164239.R12812@xs4all.nl> On Thu, Nov 02, 2000 at 10:34:42AM -0500, Andrew Kuchling wrote: > * Work on something CPAN-like. This may or may not have repercussions for > the core; I don't know. Probably not, though perhaps a new module would be nice. As for the CPAN-like thing, I really got a kick out of Greg S's WebDAV session on Apachecon, and I think it would be suited extremely well as the transmission protocol for SPAM (or however you want to call the Python CPAN ;). You can do the uploading, downloading and searching for modules using WebDAV without too much pain, and there's excellent WebDAV support for Apache ;) Is anyone working on something like this, or even thinking about it ? I'm not deep enough into distutils to join that SIG, but I definitely would join a CPyAN SIG ;) -- Thomas Wouters Hi! I'm a .signature virus! copy me into your .signature file to help me spread! From bckfnn@worldonline.dk Thu Nov 2 15:37:43 2000 From: bckfnn@worldonline.dk (Finn Bock) Date: Thu, 02 Nov 2000 15:37:43 GMT Subject: [Python-Dev] PEP-0217 In-Reply-To: References: Message-ID: <3a0186d5.25979536@smtp.worldonline.dk> [Moshe Zadka] >I need some help: > >1) BDFL pronouncement >2) someone to see about the Jython issue. I don't see any problems with this. This is already handled by a method in the jython runtime (Py.printResult).
However, I think your example implementation should be:

def displayhook(o):
    if o is None:
        return
    __builtin__._ = None
    print `o`
    __builtin__._ = o

I don't know why, but that is the implementation currently used by Jython (and I think CPython too). regards, finn From Moshe Zadka Thu Nov 2 16:05:20 2000 From: Moshe Zadka (Moshe Zadka) Date: Thu, 2 Nov 2000 18:05:20 +0200 (IST) Subject: [Python-Dev] PEP-0217 In-Reply-To: <3a0186d5.25979536@smtp.worldonline.dk> Message-ID: On Thu, 2 Nov 2000, Finn Bock wrote: > However, I think your example implementation should be: >
> def displayhook(o):
>     if o is None:
>         return
>     __builtin__._ = None
>     print `o`
>     __builtin__._ = o
> You're right. I'll just add the necessary Jython changes to PEP-0217. Thanks a lot. I don't like this either, but the good news is that in Py2.1, you'll be able to change this in site.py -- Moshe Zadka -- 95855124 http://advogato.org/person/moshez From mal@lemburg.com Thu Nov 2 16:05:14 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Thu, 02 Nov 2000 17:05:14 +0100 Subject: [Python-Dev] Re: Dynamic nested scopes References: Message-ID: <3A0190BA.940FFADA@lemburg.com> Moshe Zadka wrote: > > On Wed, 1 Nov 2000, Guido van Rossum wrote: > > > [MAL] > > > Dynamic nested scopes is another topic... those are *very* > > > useful; especially when it comes to avoiding global variables > > > and implementing programs which work using control objects > > > instead of global function calls. > > > > Marc-Andre, what are Dynamic nested scopes? > > If MAL means dynamic scoping (which I understood he does), then this > simply means: > > when looking for a variable "foo", you first search for it in the local > namespace. If not there, the *caller's* namespace, and so on. In the > end, the caller is the __main__ module, and if not found there, it is > a NameError. That would be one application, yes.
With dynamic scoping I meant that the context of a lookup is defined at run-time and by explicitly or implicitly hooking together objects which then define the nesting. Environment acquisition is an example of such dynamic scoping: attribute lookups are passed on to the outer scope in case they don't resolve on the inner scope, e.g. say you have object a with a.x = 1; all other objects don't define .x. Then a.b.c.d.x will result in lookups

1. a.b.c.d.x
2. a.b.c.x
3. a.b.x
4. a.x -> 1

This example uses attribute lookup -- the same can be done for other nested objects by explicitly specifying the nesting relationship. Jim's ExtensionClasses allow the above by using a lot of wrappers around objects -- would be nice if we could come up with a more general scheme which then also works for explicit nesting relationships (e.g. dictionaries which get hooked together -- Jim's MultiMapping does this). -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From mal@lemburg.com Thu Nov 2 16:06:38 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Thu, 02 Nov 2000 17:06:38 +0100 Subject: [Python-Dev] Python 2.1 tasks References: <200011020329.WAA07330@cj20424-a.reston1.va.home.com> <20001102103442.B5027@kronos.cnri.reston.va.us> Message-ID: <3A01910E.A19FFBB2@lemburg.com> Andrew Kuchling wrote: > > On Thu, Nov 02, 2000 at 05:29:55PM +0200, Moshe Zadka wrote: > >Shouldn't we have a list of goals for [Python 2.1] or something? As a > >first-order approximation, what PEPs are expected to be included? And, > > Stuff I personally want to get done: > * Finish PEP 222, "Web Programming Improvements" and implement whatever > emerges from it. > > * Write a PEP on using Distutils to build the modules that come with > Python, and implement it if accepted. > > * Work on something CPAN-like. This may or may not have repercussions for > the core; I don't know.
Most important for 2.1 are probably:

1. new C level coercion scheme
2. rich comparisons
3. making the std lib Unicode compatible

-- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From jeremy@alum.mit.edu Thu Nov 2 16:11:30 2000 From: jeremy@alum.mit.edu (Jeremy Hylton) Date: Thu, 2 Nov 2000 11:11:30 -0500 (EST) Subject: [Python-Dev] statically nested scopes In-Reply-To: References: <14848.27102.223001.369662@bitdiddle.concentric.net> Message-ID: <14849.37426.860007.989619@bitdiddle.concentric.net> >>>>> "TP" == Tim Peters writes: TP> [Jeremy Hylton] >> ... Guido once explained that his original reservation about >> nested scopes was a reaction to their overuse in Pascal. In >> large Pascal programs he was familiar with, block structure was >> overused as an organizing principle for the program, leading to >> hard-to-read code. TP> Note that this problem will be much worse in Python: in Pascal, TP> you could always "look up" for the closest-containing func/proc TP> that explicitly declares a referenced vrbl. In Python, you have TP> to indirectly *deduce* which vrbls are local to a def, by TP> searching the entire body for an appearance as a binding target. TP> So you have to "look up" and "look down" from the reference TP> point, and it's easy to miss a binding target. I agree that visual inspection is a tad harder, but I contend that existing programs that use the same name for a global variable and a local variable -- and intend for the global to be visible within a function nested in the local variable's region -- are confusing. It's too hard for a first-time reader of the code to figure out what is going on. Incidentally, I have yet to see an example of this problem occurring in anyone's code. All the examples seem a bit contrived. I wonder if anyone has an example in existing code.
[My SICP example omitted] TP> Unfortunately for proponents, this is exactly the kind of SICP TP> example that is much better done via a class. Indeed, the PEP says exactly that: This kind of program is better done via a class. My intent was not to show a compelling use of mutable state. Instead it was to show that with read-only access, people could still modify values on enclosing scopes. The issue is whether the language allows the programmer to express this intent clearly or if she has to jump through some hoops to accomplish it. TP> Not only is the TP> closure version strained by comparison, but as is usual it TP> manages to create a bank account with a write-only balance <0.9 TP> wink>.

TP> def deposit(amount):
TP>     global bank_account.balance
TP>     balance += amount

TP> is one old suggested way to explicitly declare non-local names TP> and the enclosing block to which they are local (and in analogy TP> with current "global", mandatory if you want to rebind the TP> non-local name, optional if you only want to reference it). TP> There are subtleties here, but explicit is better than implicit, TP> and the subtleties are only subtler if you refuse (like Scheme) TP> to make the intent explicit. I'm still not sure I like it, because it mixes local variables of a function with attribute access on objects. I'll add it to the discussion in the PEP (if Barry approves the PEP ), though. Do you have any opinion on the subtleties? The two that immediately come to mind are: 1) whether the function's locals are available as attributes anywhere or only in nested scopes and 2) whether you can create new local variables using this notation. Jeremy From Moshe Zadka Thu Nov 2 16:09:08 2000 From: Moshe Zadka (Moshe Zadka) Date: Thu, 2 Nov 2000 18:09:08 +0200 (IST) Subject: [Python-Dev] Python 2.1 tasks In-Reply-To: <20001102164239.R12812@xs4all.nl> Message-ID: On Thu, 2 Nov 2000, Thomas Wouters wrote: > Is anyone working on something like this, or even thinking about it ?
I'm > not deep enough into distutils to join that SIG, but I definately would join > a CPyAN SIG ;) Cries for this sig have been already made in c.l.py. I'm moving this discussion to meta-sig. Please discuss it there. I'm willing to champion it, but I'll defer if Andrew or Greg want to do it. -- Moshe Zadka -- 95855124 http://advogato.org/person/moshez From jeremy@alum.mit.edu Thu Nov 2 16:13:52 2000 From: jeremy@alum.mit.edu (Jeremy Hylton) Date: Thu, 2 Nov 2000 11:13:52 -0500 (EST) Subject: [Python-Dev] statically nested scopes In-Reply-To: <3A015A3A.4EC7DBC6@lemburg.com> References: <200011020037.NAA29447@s454.cosc.canterbury.ac.nz> <3A015A3A.4EC7DBC6@lemburg.com> Message-ID: <14849.37568.510427.834971@bitdiddle.concentric.net> >>>>> "MAL" == M -A Lemburg writes: MAL> It may not look serious, but changing the Python lookup scheme MAL> is, since many inspection tools rely and reimplement exactly MAL> that scheme. With nested scopes, there would be next to no way MAL> to emulate the lookups using these tools. Can you say more about this issue? It sounds like it is worth discussing in the PEP, but I can't get a handle on exactly what the problem is. Any tool needs to implement or model Python's name resolution algorithm, call it algorithm A. If we change name resolution to use algorithm B, then the tools need to implement or model a new algorithm. I don't see where the impossibility of emulation comes in. Jeremy From guido@python.org Thu Nov 2 04:24:29 2000 From: guido@python.org (Guido van Rossum) Date: Wed, 01 Nov 2000 23:24:29 -0500 Subject: [Python-Dev] Python Call Mechanism In-Reply-To: Your message of "01 Nov 2000 18:13:07 GMT." 
References: <200010301715.JAA32564@slayer.i.sourceforge.net> <39FDB5EA.B60EA39A@lemburg.com> <14845.36020.17063.951147@bitdiddle.concentric.net> Message-ID: <200011020424.XAA07810@cj20424-a.reston1.va.home.com> > How about dumping the CALL_FUNCTION* opcodes, and replacing them with > two non-argumented opcodes, called for the sake of argument NCALL_FUNC > and NCALL_FUNC_KW. > > NCALL_FUNC would pop a function object and a tuple off the stack and > apply the function to the tuple. > > NCALL_FUNC_KW would do the same, then pop a dictionary and then do > the moral equivalent of f(*args,**kw). No, this is a bad idea. Long, long ago, all calls required building a tuple for the arguments first. This tuple creation turned out to be a major bottleneck. That's why the current call opcode exists. --Guido van Rossum (home page: http://www.python.org/~guido/) From barry@wooz.org Thu Nov 2 16:22:32 2000 From: barry@wooz.org (Barry A. Warsaw) Date: Thu, 2 Nov 2000 11:22:32 -0500 (EST) Subject: [Python-Dev] statically nested scopes References: <14848.27102.223001.369662@bitdiddle.concentric.net> Message-ID: <14849.38088.903104.936944@anthem.concentric.net> If we get lexical scoping, there should be a fast (built-in) way to get at all the accessible names from Python. I.e. currently I can do

d = globals().copy()
d.update(locals())

and know that `d' contains a dictionary of available names, with the right overloading semantics. (PEP 42 now includes a feature request to make vars() do this by default.) -Barry From barry@wooz.org Thu Nov 2 16:23:56 2000 From: barry@wooz.org (Barry A. Warsaw) Date: Thu, 2 Nov 2000 11:23:56 -0500 (EST) Subject: [Python-Dev] statically nested scopes References: <14848.27102.223001.369662@bitdiddle.concentric.net> Message-ID: <14849.38172.908316.107381@anthem.concentric.net> This has been added as PEP 227.
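[Barry's two-liner generalizes into the vars()-like helper he is asking for. A sketch only: the frame walk via sys._getframe is an assumption of this example (it is not something Barry's mail specifies), and the names below are illustrative.]

```python
import sys

def accessible_names():
    # Merge the calling frame's globals and locals; locals shadow
    # globals, matching the overloading semantics of Barry's
    # d = globals().copy(); d.update(locals()) snippet.
    caller = sys._getframe(1)
    d = caller.f_globals.copy()
    d.update(caller.f_locals)
    return d

GREETING = "hello"

def demo():
    GREETING = "shadowed"  # the local wins over the module global
    count = 3
    return accessible_names()

names = demo()
assert names["GREETING"] == "shadowed" and names["count"] == 3
```

[The helper only sees one enclosing frame; with the nested scopes of PEP 227 a complete version would also have to walk the chain of enclosing function scopes, which is exactly the extra complexity the thread is debating.]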
From guido@python.org Thu Nov 2 04:28:00 2000 From: guido@python.org (Guido van Rossum) Date: Wed, 01 Nov 2000 23:28:00 -0500 Subject: [Python-Dev] Python Call Mechanism In-Reply-To: Your message of "Wed, 01 Nov 2000 14:06:46 EST." <14848.27078.923932.758419@bitdiddle.concentric.net> References: <200010301715.JAA32564@slayer.i.sourceforge.net> <39FDB5EA.B60EA39A@lemburg.com> <14845.36020.17063.951147@bitdiddle.concentric.net> <14848.27078.923932.758419@bitdiddle.concentric.net> Message-ID: <200011020428.XAA07841@cj20424-a.reston1.va.home.com> > It certainly wouldn't hurt to implement this, as it would provide some > practical implementation experience that would inform a PEP on the > subject. If it solves the mess with supporting extended call syntax, adding these opcodes might be a good idea. But as I said, for the normal (not extended) case, the existing CALL_FUNCTION opcode is the right thing to use unless you want things to slow down significantly. --Guido van Rossum (home page: http://www.python.org/~guido/) From barry@wooz.org Thu Nov 2 16:31:35 2000 From: barry@wooz.org (Barry A. Warsaw) Date: Thu, 2 Nov 2000 11:31:35 -0500 (EST) Subject: [Python-Dev] Re: Dynamic nested scopes References: <200011020326.WAA07307@cj20424-a.reston1.va.home.com> Message-ID: <14849.38631.997377.600214@anthem.concentric.net> >>>>> "MZ" == Moshe Zadka writes: MZ> If MAL means dynamic scoping (which I understood he does), MZ> then this simply means: MZ> when looking for a variable "foo", you first search for it in MZ> the local namespace. If not there, the *caller's* namespace, MZ> and so on. In the end, the caller is the __main__ module, and MZ> if not found there, it is a NameError. This is how Emacs Lisp behaves, and it's used all the time in ELisp programs. On the one hand it's quite convenient for customizing the behavior of functions. 
On the other hand, it can make documenting the interface of functions quite difficult because all those dynamically scoped variables are now part of the function's API. It's interesting to note that many ELispers really hate dynamic scoping and pine for a move toward lexical scoping. I'm not one of them. I'm not as concerned about "fixing" nested functions because I hardly ever use them, and rarely see them much in Python code. Fixing lambdas would be nice, but since Guido considers lambdas themselves a mistake, and given that lambda use /can/ be a performance hit in some situations, does it make sense to change something as fundamental as Python's scoping rules to fix this eddy of the language? -Barry From jeremy@alum.mit.edu Thu Nov 2 16:36:45 2000 From: jeremy@alum.mit.edu (Jeremy Hylton) Date: Thu, 2 Nov 2000 11:36:45 -0500 (EST) Subject: [Python-Dev] Re: Dynamic nested scopes In-Reply-To: <3A0190BA.940FFADA@lemburg.com> References: <3A0190BA.940FFADA@lemburg.com> Message-ID: <14849.38941.59576.682495@bitdiddle.concentric.net> Moshe's explanation of "dynamic scope" is the definition I've seen in every programming language text I've ever read. The essence of the definition, I believe, is that a free variable is resolved in the environment created by the current procedure call stack. I think it muddles the discussion to use "dynamic scope" to describe acquisition, though it is a dynamic feature. Python uses dynamic scope for exceptions. If any exception is raised, the exception handler that is triggered is determined by the environment in which the procedure was called. There are few languages that use dynamic scoping for normal name resolution. Many early Lisp implementations did, but I think all the modern ones use lexical scoping instead. It is hard to write modular code using dynamic scope, because the behavior of a function with free variables cannot be determined by the module that defines it.
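Jeremy's definition — a free variable resolved in the environment created by the current call stack — can be sketched in Python itself with an explicit stack of environments (env_stack and lookup are invented names; this is a toy model of dynamic scoping, not a proposal):

```python
# Toy dynamic scoping: a free variable resolves against the environments
# of the functions currently on the call stack, innermost caller first.
env_stack = [{"i": 1}]          # the outermost ("__main__") environment

def lookup(name):
    for frame in reversed(env_stack):
        if name in frame:
            return frame[name]
    raise NameError(name)

def g():
    return lookup("i")          # free variable: found in a *caller's* frame

def f():
    env_stack.append({"i": 2})  # f's environment rebinds i
    try:
        return g()
    finally:
        env_stack.pop()

assert g() == 1                 # called directly: finds the outermost i
assert f() == 2                 # called via f: finds f's rebinding of i
```

The same call to g returns different values depending on who called it — exactly the property that makes modular reasoning hard.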
Not saying it isn't useful, just that it makes it much harder to reason about how a particular module or function works in isolation from the rest of the system. Jeremy From jeremy@alum.mit.edu Thu Nov 2 16:38:04 2000 From: jeremy@alum.mit.edu (Jeremy Hylton) Date: Thu, 2 Nov 2000 11:38:04 -0500 (EST) Subject: [Python-Dev] statically nested scopes In-Reply-To: References: <14849.34263.310260.404940@bitdiddle.concentric.net> Message-ID: <14849.39020.532855.494649@bitdiddle.concentric.net> >>>>> "MZ" == Moshe Zadka writes: MZ> you guys are talking about different things. Jeremy is talking MZ> about a tool to warn against incompatible changes Greg is MZ> talking about a tool to identify, for each variable, what scope MZ> it belongs to. Not sure we're talking about different things. The compiler will need to determine the scope of each variable. It's a tool. If it implements the specification for name binding, other tools can too. Jeremy From barry@wooz.org Thu Nov 2 16:36:11 2000 From: barry@wooz.org (Barry A. Warsaw) Date: Thu, 2 Nov 2000 11:36:11 -0500 (EST) Subject: [Python-Dev] statically nested scopes References: <14849.34263.310260.404940@bitdiddle.concentric.net> Message-ID: <14849.38907.808752.186414@anthem.concentric.net> >>>>> "MZ" == Moshe Zadka writes: MZ> you guys are talking about different things. Jeremy is MZ> talking about a tool to warn against incompatible changes Greg MZ> is talking about a tool to identify, for each variable, what MZ> scope it belongs to. And Greg's point is well taken, because it /will/ be harder to tell at a glance where a name is coming from, so programming tools will have to find ways to help with this. -Barry From barry@wooz.org Thu Nov 2 16:44:06 2000 From: barry@wooz.org (Barry A.
Warsaw) Date: Thu, 2 Nov 2000 11:44:06 -0500 (EST) Subject: [Python-Dev] Re: [Python-checkins] CVS: python/nondist/peps pep-0204.txt,1.4,1.5 References: <200011012237.OAA06642@slayer.i.sourceforge.net> <20001101234134.O12812@xs4all.nl> Message-ID: <14849.39382.960359.909365@anthem.concentric.net> >>>>> "TW" == Thomas Wouters writes: >> Update this PEP to current, harsh, reality. It's been rejected >> :) If at all possible, the reasoning should be extended to >> include the real reasons it was rejected -- this is just >> guesswork from my side. (This means you, Guido, or anyone who >> can channel Guido enough to call himself Guido.) TW> In addition to that, PEP 0 also needs to be updated. Shall I TW> do that myself, now that Barry is apparently away? I've just done it. TW> While I was at it, I also noticed PEP 0200 still says TW> 'Incomplete', though that might be by design. I've updated both these too, thanks. -Barry From mal@lemburg.com Thu Nov 2 16:45:12 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Thu, 02 Nov 2000 17:45:12 +0100 Subject: [Python-Dev] statically nested scopes References: <200011020037.NAA29447@s454.cosc.canterbury.ac.nz> <3A015A3A.4EC7DBC6@lemburg.com> <14849.37568.510427.834971@bitdiddle.concentric.net> Message-ID: <3A019A18.20D12FE1@lemburg.com> Jeremy Hylton wrote: > > >>>>> "MAL" == M -A Lemburg writes: > > MAL> It may not look serious, but changing the Python lookup scheme > MAL> is, since many inspection tools rely on and reimplement exactly > MAL> that scheme. With nested scopes, there would be next to no way > MAL> to emulate the lookups using these tools. > > Can you say more about this issue? It sounds like it is worth > discussing in the PEP, but I can't get a handle on exactly what the > problem is. Any tool needs to implement or model Python's name > resolution algorithm, call it algorithm A. If we change name > resolution to use algorithm B, then the tools need to implement or > model a new algorithm.
I don't see where the impossibility of > emulation comes in. Well, first you'd have to change all tools to use the new scheme (this includes debuggers, inspection tools, reflection kits, etc.). This certainly is not a smart thing to do since Python IDEs are just starting to appear -- you wouldn't want to break all those. What gets harder with the nested scheme is that you can no longer be certain that globals() reaches out to the module namespace. But this is needed by some lazy evaluation tools. Writing to globals() would not be defined anymore -- where should you bind the new variable? Another problem is that there probably won't be a way to access all the different nesting levels on a per-level basis (could be that I'm missing something here, but debugging tools would need some sort of scope() builtin to access the different scopes). I'm not sure whether this is possible to do without some sort of link between the scopes. We currently don't need such links. -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From barry@wooz.org Thu Nov 2 16:52:50 2000 From: barry@wooz.org (Barry A. Warsaw) Date: Thu, 2 Nov 2000 11:52:50 -0500 (EST) Subject: [Python-Dev] PEP-0217 References: <3a0186d5.25979536@smtp.worldonline.dk> Message-ID: <14849.39906.374949.489679@anthem.concentric.net> >>>>> "FB" == Finn Bock writes: FB> I don't see any problems with this. This is already handled by FB> a method in the jython runtime (Py.printResult). Yep, should be easy to add to Jython.
-Barry From mwh21@cam.ac.uk Thu Nov 2 17:00:37 2000 From: mwh21@cam.ac.uk (Michael Hudson) Date: 02 Nov 2000 17:00:37 +0000 Subject: [Python-Dev] Python Call Mechanism In-Reply-To: Guido van Rossum's message of "Wed, 01 Nov 2000 23:24:29 -0500" References: <200010301715.JAA32564@slayer.i.sourceforge.net> <39FDB5EA.B60EA39A@lemburg.com> <14845.36020.17063.951147@bitdiddle.concentric.net> <200011020424.XAA07810@cj20424-a.reston1.va.home.com> Message-ID: Guido van Rossum writes: > > How about dumping the CALL_FUNCTION* opcodes, and replacing them with > > two non-argumented opcodes, called for the sake of argument NCALL_FUNC > > and NCALL_FUNC_KW. > > > > NCALL_FUNC would pop a function object and a tuple off the stack and > > apply the function to the tuple. > > > > NCALL_FUNC_KW would do the same, then pop a dictionary and then do > > the moral equivalent of f(*args,**kw). > > No, this is a bad idea. Long, long ago, all calls required building a > tuple for the arguments first. This tuple creation turned out to be a > major bottleneck. That's why the current call opcode exists. Yes, I realize this now. I made my suggestion having not actually looked at the code or thought about it very much. I still think there is (or at least might be) value in rewriting the code for the more complex cases, and moving the dictionary creation out of the implementation of CALL_FUNCTION. Then the following could be made essentially equivalent:

f(a,b,c=d,e=f)

dict = {c:d,e:f}
f(a,b,**dict)

(modulo evaluation order). (I've made some changes to get BUILD_MAP using its argument and construct literal dictionaries using it, which I'll whack up onto sf just as soon as I get round to it... ah, it's at https://sourceforge.net/patch/index.php?func=detailpatch&patch_id=102227&group_id=5470 ). Cheers, M. -- Those who have deviant punctuation desires should take care of their own perverted needs.
-- Erik Naggum, comp.lang.lisp From gstein@lyra.org Thu Nov 2 17:07:29 2000 From: gstein@lyra.org (Greg Stein) Date: Thu, 2 Nov 2000 09:07:29 -0800 Subject: [Python-Dev] Re: [Python-checkins] CVS: python/nondist/peps pep-0204.txt,1.4,1.5 In-Reply-To: <14849.39382.960359.909365@anthem.concentric.net>; from barry@wooz.org on Thu, Nov 02, 2000 at 11:44:06AM -0500 References: <200011012237.OAA06642@slayer.i.sourceforge.net> <20001101234134.O12812@xs4all.nl> <14849.39382.960359.909365@anthem.concentric.net> Message-ID: <20001102090729.A1874@lyra.org> On Thu, Nov 02, 2000 at 11:44:06AM -0500, Barry A. Warsaw wrote: > > >>>>> "TW" == Thomas Wouters writes: > > >> Update this PEP to current, harsh, reality. It's been rejected > >> :) If at all possible, the reasoning should be extended to > >> include the real reasons it was rejected -- this is just > >> guesswork from my side. (This means you, Guido, or anyone who > >> can channel Guido enough to call himself Guido.) > > TW> In addition to that, PEP 0 also needs to be updated. Shall I > TW> do that myself, now that Barry is apparently away ? > > I've just done it. Shouldn't we allow other people to tweak PEP 0? It would certainly lighten Barry's administrative overload. I mean, geez... this is what source control is about. Let a lot of people in there, but be able to back up in case somebody totally goofs it. This goes for adding new PEPs, too. I'm not as convinced here, since some level of "good enough for a PEP" filtering is probably desirable, but then again, it would seem that the people with commit access probably have that filter in their head anyways. Just a thought... how can we grease things up a bit more... 
Cheers, -g -- Greg Stein, http://www.lyra.org/ From pf@artcom-gmbh.de Thu Nov 2 16:59:21 2000 From: pf@artcom-gmbh.de (Peter Funk) Date: Thu, 2 Nov 2000 17:59:21 +0100 (MET) Subject: [Python-Dev] Re: [Python-checkins] CVS: python/nondist/peps pep-0227.txt,NONE,1.1 In-Reply-To: <200011021618.IAA15298@slayer.i.sourceforge.net> from Barry Warsaw at "Nov 2, 2000 8:18:27 am" Message-ID: Barry Warsaw wrote: > Update of /cvsroot/python/python/nondist/peps > In directory slayer.i.sourceforge.net:/tmp/cvs-serv15290 > > Added Files: > pep-0227.txt > Log Message: > PEP 227, Statically Nested Scopes, Jeremy Hylton > > > ***** Error reading new file: (2, 'No such file or directory') It was obviously not intended to be mailed out that way again. Problem with pathname and/or current directory? Barry got this right once, now it is broken again. Regards, Peter -- Peter Funk, Oldenburger Str.86, D-27777 Ganderkesee, Germany, Fax:+49 4222950260 office: +49 421 20419-0 (ArtCom GmbH, Grazer Str.8, D-28359 Bremen) From mal@lemburg.com Thu Nov 2 17:07:15 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Thu, 02 Nov 2000 18:07:15 +0100 Subject: [Python-Dev] statically nested scopes References: <200011020037.NAA29447@s454.cosc.canterbury.ac.nz> <3A015A3A.4EC7DBC6@lemburg.com> <200011020329.WAA07330@cj20424-a.reston1.va.home.com> Message-ID: <3A019F43.E13604BA@lemburg.com> Guido van Rossum wrote: > > > It may not look serious, but changing the Python lookup scheme > > is, since many inspection tools rely on and reimplement exactly > > that scheme. With nested scopes, there would be next to no > > way to emulate the lookups using these tools. > > So fix the tools. Eek. Are you proposing to break all the Python IDEs that are just appearing out there? > > To be honest, I don't think static nested scopes buy us all that > > much. You can do the same now, by using keyword arguments which > > isn't all that nice, but works great and makes the scope clearly > > visible. > > Yes.
It's a hack that gets employed over and over. And it has > certain problems. We added 'import as' to get rid of a common > practice that was perceived unclean. Maybe we should support nested > scopes to get rid of another unclean common practice? I think the common practice mainly comes from the fact that by making globals locals which can benefit from LOAD_FAST you get a noticeable performance boost. So the "right" solution to these weird looking hacks would be to come up with a smart way by which the Python compiler itself can do the localizing. Nested scopes won't help eliminate the current keyword practice. > I'm not saying that we definitely should add this to 2.1 (there's > enough on our plate already) but we should at least consider it, and > now that we have cycle GC, the major argument against it (that it > causes cycles) is gone... Hmm, so far the only argument for changing Python lookups was to allow writing lambdas without keyword hacks. Does this really warrant breaking code? What other advantages would statically nested scopes have? -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From guido@python.org Thu Nov 2 05:16:43 2000 From: guido@python.org (Guido van Rossum) Date: Thu, 02 Nov 2000 00:16:43 -0500 Subject: [Python-Dev] PEP-0217 In-Reply-To: Your message of "Thu, 02 Nov 2000 16:49:37 +0200." References: Message-ID: <200011020516.AAA08400@cj20424-a.reston1.va.home.com> > 1) BDFL pronouncement I believe Ping has also proposed such a display hook. I'm not against the idea, but I'm also not much in favor -- so I'm kind of +/- 0... I've always thought that the interpreter mainloop should be rewritten in Python. That would be another PEP, and this would be a good place to add a display hook as a feature. Note that the example currently in the PEP has a fatal flaw: it's got a recursive reference to print.
This is one of the things to consider when proposing such a feature. Sorry I can't be of more help... --Guido van Rossum (home page: http://www.python.org/~guido/) From jeremy@alum.mit.edu Thu Nov 2 17:20:12 2000 From: jeremy@alum.mit.edu (Jeremy Hylton) Date: Thu, 2 Nov 2000 12:20:12 -0500 (EST) Subject: [Python-Dev] PEP-0217 In-Reply-To: <200011020516.AAA08400@cj20424-a.reston1.va.home.com> References: <200011020516.AAA08400@cj20424-a.reston1.va.home.com> Message-ID: <14849.41548.308591.119778@bitdiddle.concentric.net> I think the current draft of PEP 217 is far too thin and vague to be acceptable. You ought to re-read the PEP guidelines and make sure you've covered all the points. You must address at least:

- motivation
- specification of feature independent of implementation

Jeremy From claird@starbase.neosoft.com Thu Nov 2 17:18:20 2000 From: claird@starbase.neosoft.com (Cameron Laird) Date: Thu, 2 Nov 2000 11:18:20 -0600 (CST) Subject: [Python-Dev] Tk news you'll want to read Message-ID: <200011021718.LAA70431@starbase.neosoft.com> Note the clear intention to co-operate with Perl and Python, the ambition to be considerably more portable and superior in performance to GTK+, and so on. I encourage you to e-mail this on to others it might interest. A lot of the Perl people don't receive anything from me because my ISP is ORBS-blacklisted (don't ask). From guido@python.org Thu Nov 2 05:29:27 2000 From: guido@python.org (Guido van Rossum) Date: Thu, 02 Nov 2000 00:29:27 -0500 Subject: [Python-Dev] CPyAN In-Reply-To: Your message of "Thu, 02 Nov 2000 16:42:39 +0100." <20001102164239.R12812@xs4all.nl> References: <200011020329.WAA07330@cj20424-a.reston1.va.home.com> <20001102103442.B5027@kronos.cnri.reston.va.us> <20001102164239.R12812@xs4all.nl> Message-ID: <200011020529.AAA08549@cj20424-a.reston1.va.home.com> [Andrew] > > * Work on something CPAN-like. This may or may not have repercussions for > > the core; I don't know.
[Thomas] > Probably not, though perhaps a new module would be nice. As for the > CPAN-like thing, I really got a kick out of Greg S's WebDAV session on > Apachecon, and I think it would be suited extremely well as the transmission > protocol for SPAM (or however you want to call the Python CPAN ;). You can > do the uploading, downloading and searching for modules using WebDAV without > too much pain, and there's excellent WebDAV support for Apache ;) > > Is anyone working on something like this, or even thinking about it? I'm > not deep enough into distutils to join that SIG, but I definitely would join > a CPyAN SIG ;) This is a nice thing to have, but I don't see why it should be tied to the 2.1 effort. Let's not couple projects that can be carried out independently! --Guido van Rossum (home page: http://www.python.org/~guido/) From mwh21@cam.ac.uk Thu Nov 2 17:29:52 2000 From: mwh21@cam.ac.uk (Michael Hudson) Date: 02 Nov 2000 17:29:52 +0000 Subject: [Python-Dev] Python Call Mechanism In-Reply-To: Michael Hudson's message of "02 Nov 2000 17:00:37 +0000" References: <200010301715.JAA32564@slayer.i.sourceforge.net> <39FDB5EA.B60EA39A@lemburg.com> <14845.36020.17063.951147@bitdiddle.concentric.net> <200011020424.XAA07810@cj20424-a.reston1.va.home.com> Message-ID: Michael Hudson writes: > Guido van Rossum writes: > > Then the following could be made essentially equivalent: > > f(a,b,c=d,e=f) > > dict = {c:d,e:f} > f(a,b,**dict) ... except when you have stuff like f(a,b=c,**kw) Gruntle. Maybe Python function calls are just complicated! (Has anyone looked at bytecodehacks.xapply? - and that doesn't even handle *-ed and **-ed arguments...). Hmm - the interaction of "a=b" style args & *-ed args is a bit counter-intuitive, particularly as the "a=b" args syntactically have to come before the *-ed args:

>>> def f(a,b,c,d):
...     return a,b,c,d
...
>>> f(1,c=3,*(2,4))
Traceback (most recent call last):
  File "", line 1, in ?
TypeError: keyword parameter 'c' redefined in call to f()
>>> f(1,b=3,*(2,4))
Traceback (most recent call last):
  File "", line 1, in ?
TypeError: keyword parameter 'b' redefined in call to f()
>>> f(1,d=4,*(2,3))
(1, 2, 3, 4)

I humbly submit that This Is Wrong. I haven't seen anybody complain about it, which suggests to me that no one is using this combination, and I propose either:

1) banning it
2) demanding that the *-ed arg precede the "a=b" args

Of course, if no one is using this "feature", then maybe this dusty little corner of the language should be left undisturbed. And I haven't even thought about default arguments yet... about-to-make-an-opcode-called-BODGE_KEYWORD_ARGUMENTS-ly y'rs M. -- nonono, while we're making wild conjectures about the behavior of completely irrelevant tasks, we must not also make serious mistakes, or the data might suddenly become statistically valid. -- Erik Naggum, comp.lang.lisp From barry@wooz.org Thu Nov 2 17:32:13 2000 From: barry@wooz.org (Barry A. Warsaw) Date: Thu, 2 Nov 2000 12:32:13 -0500 (EST) Subject: [Python-Dev] Re: [Python-checkins] CVS: python/nondist/peps pep-0204.txt,1.4,1.5 References: <200011012237.OAA06642@slayer.i.sourceforge.net> <20001101234134.O12812@xs4all.nl> <14849.39382.960359.909365@anthem.concentric.net> <20001102090729.A1874@lyra.org> Message-ID: <14849.42269.297360.615404@anthem.concentric.net> >>>>> "GS" == Greg Stein writes: GS> Shouldn't we allow other people to tweak PEP 0? It would GS> certainly lighten Barry's administrative overload. I certainly don't mind, at the very least, people modifying PEP 0 when the status of their own peps change. GS> I mean, geez... this is what source control is about. Let a GS> lot of people in there, but be able to back up in case GS> somebody totally goofs it. GS> This goes for adding new PEPs, too.
I'm not as convinced here, GS> since some level of "good enough for a PEP" filtering is GS> probably desirable, but then again, it would seem that the GS> people with commit access probably have that filter in their GS> head anyways. GS> Just a thought... how can we grease things up a bit more... I do like to make a sanity pass through the text before approving it, just to make sure we've got consistent format throughout the peps. Also, I know we're all on "internet time" here, but one day isn't too much time to let pass before taking action on things. :) I'd also prefer it if there's /some/ limited editorial review before these things get added. That having been said, I'm very happy if someone wants to co-edit the peps. The pep 0 re-organization and slacker scolding would definitely benefit from more than one watchdog. Volunteers? :) -Barry From barry@wooz.org Thu Nov 2 17:33:11 2000 From: barry@wooz.org (Barry A. Warsaw) Date: Thu, 2 Nov 2000 12:33:11 -0500 (EST) Subject: [Python-Dev] Re: [Python-checkins] CVS: python/nondist/peps pep-0227.txt,NONE,1.1 References: <200011021618.IAA15298@slayer.i.sourceforge.net> Message-ID: <14849.42327.23020.553510@anthem.concentric.net> >>>>> "PF" == Peter Funk writes: PF> It was obviously not intended to be mailed out that way again. PF> Problem with pathname and/or current directory? Barry got PF> this right once, now it is broken again. Except that AFAIK, I didn't do anything to fix it, or to break it again. -Barry From guido@python.org Thu Nov 2 05:45:09 2000 From: guido@python.org (Guido van Rossum) Date: Thu, 02 Nov 2000 00:45:09 -0500 Subject: [Python-Dev] Re: Dynamic nested scopes In-Reply-To: Your message of "Thu, 02 Nov 2000 17:05:14 +0100." <3A0190BA.940FFADA@lemburg.com> References: <3A0190BA.940FFADA@lemburg.com> Message-ID: <200011020545.AAA08743@cj20424-a.reston1.va.home.com> > > > [MAL] > > > > Dynamic nested scopes is another topic...
those are *very* > > > > useful; especially when it comes to avoiding global variables > > > > and implementing programs which work using control objects > > > > instead of global function calls. > > > > > > Marc-Andre, what are Dynamic nested scopes? [Moshe] > > If MAL means dynamic scoping (which I understood he does), then this > > simply means: > > > > when looking for a variable "foo", you first search for it in the local > > namespace. If not there, the *caller's* namespace, and so on. In the > > end, the caller is the __main__ module, and if not found there, it is > > a NameError. Ah, yuck. For variable namespace, this is a really bad idea. For certain other things (e.g. try/except blocks, Jeremy's example) it is of course OK. [MAL] > That would be one application, yes. > > With dynamic scoping I meant that the context of a lookup is > defined at run-time and by explicitly or implicitly > hooking together objects which then define the nesting. But certainly you're not thinking of doing this to something as basic to Python's semantics as local/global variable lookup! > Environment acquisition is an example of such dynamic scoping: > attribute lookups are passed on to the outer scope in case they > don't resolve on the inner scope, e.g. say you have > object a with a.x = 1; all other objects don't define .x. > Then a.b.c.d.x will result in lookups > 1. a.b.c.d.x > 2. a.b.c.x > 3. a.b.x > 4. a.x -> 1 This seems only remotely related to dynamic scopes. There are namespaces here, but not "scopes" as I think of them: a scope defines the validity for an unadorned variable. Static scopes mean that this is determined by the program source structure. Dynamic scopes mean that this is determined by run-time mechanisms. Python uses static scoping for local variables, but a dynamic scoping mechanism for non-local variable references: first it does a module-global lookup, then it looks for a built-in name.
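The non-local search order described above (module global first, then builtin) is easy to observe by shadowing a builtin — a toy illustration, not a proposal:

```python
# Shadow the builtin len with a module global, then with a local.
len = lambda seq: "global len"      # module-global shadows the builtin

def from_global():
    return len([1, 2, 3])           # no local binding: finds the module global

def from_local():
    len = lambda seq: "local len"   # local binding wins inside this scope
    return len([1, 2, 3])

assert from_global() == "global len"
assert from_local() == "local len"

del len                             # remove the shadow ...
assert len([1, 2, 3]) == 3          # ... and the builtin is visible again
```

After the del, from_global() would find the builtin instead, which is the "dynamic" flavor of the global/builtin lookup: it depends on what is bound at call time, not at compile time.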
When we're not talking about simple name lookup, we should speak of namespaces (which can also have lifetimes). > This example uses attribute lookup -- the same can be done for > other nested objects by explicitly specifying the nesting > relationship. > > Jim's ExtensionClasses allow the above by using a lot of > wrappers around objects -- would be nice if we could come > up with a more general scheme which then also works for > explicit nesting relationships (e.g. dictionaries which > get hooked together -- Jim's MultiMapping does this). I think we're getting way off track here. --Guido van Rossum (home page: http://www.python.org/~guido/) From tim_one@email.msn.com Thu Nov 2 17:43:03 2000 From: tim_one@email.msn.com (Tim Peters) Date: Thu, 2 Nov 2000 12:43:03 -0500 Subject: [Python-Dev] Re: Dynamic nested scopes In-Reply-To: <14849.38941.59576.682495@bitdiddle.concentric.net> Message-ID: [Jeremy Hylton] > ... > There are few languages that use dynamic scoping for normal name > resolution. Many early Lisp implementations did, but I think all the > modern ones use lexical scoping instead. I believe all early Lisps were dynamically scoped. Scheme changed that, and Common Lisp followed. Many interpreted languages *start* life with dynamic scoping because it's easy to hack together, but that never lasts. REBOL went thru this a couple years ago, switching entirely from dynamic to lexical before its first public release, and breaking most existing programs in the process. Perl also started with dynamic scoping, but, in Perl-like fashion, Perl5 *added* lexical scoping on top of dynamic ("local" vars use dynamic scoping; "my" vars lexical; and all Perl5 gotcha guides stridently recommend never using "local" anymore).
Here's some Perl5:

$i = 1;

sub f {
    local($i) = 2;
    &g();
}

sub g {
    return $i;
}

print "i at start is $i\n";
print "i in g called directly is ", &g(), "\n";
print "i in g called indirectly via f is ", &f(), "\n";
print "i at end is $i\n";

Here's what it prints:

i at start is 1
i in g called directly is 1
i in g called indirectly via f is 2
i at end is 1

> It is hard to write modular code using dynamic scope, because the > behavior of a function with free variables cannot be determined by > the module that defines it. As shown above, dynamic scoping is a nightmare even in the absence of nested functions. > Not saying it isn't useful, just that it makes it much harder to reason > about how a particular module or function works in isolation from the > rest of the system. People who spend too much time writing meta-systems overestimate its usefulness. Most programmers who need this kind of effect would be much better off passing an explicitly manipulated dict of name->value mappings, or simply passing the values of interest (if I want a function that sucks the value of "i" out of its caller, what clearer way than for the caller to pass i in the arglist?! dynamic scoping is much subtler, of course -- it sucks the value of "i" out of *whatever* function up the call chain happened to define an i "most recently"). Lexical scoping is much tamer, and, indeed, Python is one of the few modern languages that doesn't support it. Last time Guido and I hand-wrung over this, that was the chief reason to implement it: newcomers are endlessly surprised that when they write functions that visually nest, their scopes nevertheless don't nest. There's certainly nothing novel or unexpected about lexical scoping anymore. The problem unique to Python is its rule for determining which vrbls are local; in Lisps, REBOL and Perl, you have to explicitly name all vrbls local to a given scope, which renders "how do you rebind a non-local name?" a non-issue.
has-the-feel-of-inevitability-ly y'rs - tim From guido@python.org Thu Nov 2 05:56:03 2000 From: guido@python.org (Guido van Rossum) Date: Thu, 02 Nov 2000 00:56:03 -0500 Subject: [Python-Dev] statically nested scopes In-Reply-To: Your message of "Thu, 02 Nov 2000 11:11:30 EST." <14849.37426.860007.989619@bitdiddle.concentric.net> References: <14848.27102.223001.369662@bitdiddle.concentric.net> <14849.37426.860007.989619@bitdiddle.concentric.net> Message-ID: <200011020556.AAA08847@cj20424-a.reston1.va.home.com> [Jeremy and Tim argue about what to do about write access for variables at intermediate levels of nesting, neither local nor module-global.] I'll risk doing a pronouncement, even though I know that Jeremy (and maybe also Tim?) disagree. You don't need "write access" (in the sense of being able to assign) for variables at the intermediate scopes, so there is no need for a syntax to express this. Assignments are to local variables (normally) or to module-globals (when 'global' is used). Name references search for a local, then for a local of the containing function definition, then for a local in its container, and so forth, until it hits the module globals, and then finally it looks for a builtin. We can argue over which part of this is done statically and which part is done dynamically: currently, locals are done dynamically and everything else is done statically. --Guido van Rossum (home page: http://www.python.org/~guido/) From jeremy@alum.mit.edu Thu Nov 2 17:57:09 2000 From: jeremy@alum.mit.edu (Jeremy Hylton) Date: Thu, 2 Nov 2000 12:57:09 -0500 (EST) Subject: [Python-Dev] Re: Dynamic nested scopes In-Reply-To: <200011020545.AAA08743@cj20424-a.reston1.va.home.com> References: <3A0190BA.940FFADA@lemburg.com> <200011020545.AAA08743@cj20424-a.reston1.va.home.com> Message-ID: <14849.43765.419161.105395@bitdiddle.concentric.net> I don't think I buy your explanation that Python uses dynamic scope for resolving globals.
As I understand the mechanism, the module namespace and builtins namespace are those for the module in which the function was defined. If so, this is still static scope. Here's a quick example that illustrates the difference:

module foo:
-----------------------
a = 12

def add(b):
    return a + b
-----------------------

module bar:
-----------------------
from foo import add

a = -1
print add(1)
-----------------------

If Python used static scope, "python bar.py" should print 13 (it does). If it used dynamic scope, I would expect the answer to be 0. Jeremy From fdrake@acm.org Thu Nov 2 17:55:11 2000 From: fdrake@acm.org (Fred L. Drake, Jr.) Date: Thu, 2 Nov 2000 12:55:11 -0500 (EST) Subject: [Python-Dev] Modules/makesetup loop In-Reply-To: <14848.39586.832800.139182@bitdiddle.concentric.net> References: <14848.39586.832800.139182@bitdiddle.concentric.net> Message-ID: <14849.43647.550092.678278@cj42289-a.reston1.va.home.com> Jeremy Hylton writes: > I just did a clean configure and make from the latest CVS tree. It > seems to get stuck in a loop calling makesetup over and over again. Please do an update and run autoconf, then try again. I didn't check in the autoconf output; Guido thinks that we want to stick with autoconf 2.13, and I've got 2.14.1. The reason is that there are enough differences in the m4 expansions that seeing the real effects of the configure.in changes is hard, just because there are so many changed lines which are not related to the actual change. (If you do have autoconf 2.13, please check in the new configure.) -Fred -- Fred L. Drake, Jr. PythonLabs at Digital Creations From guido@python.org Thu Nov 2 06:02:10 2000 From: guido@python.org (Guido van Rossum) Date: Thu, 02 Nov 2000 01:02:10 -0500 Subject: [Python-Dev] statically nested scopes In-Reply-To: Your message of "Thu, 02 Nov 2000 11:22:32 EST."
<14849.38088.903104.936944@anthem.concentric.net> References: <14848.27102.223001.369662@bitdiddle.concentric.net> <14849.38088.903104.936944@anthem.concentric.net> Message-ID: <200011020602.BAA08974@cj20424-a.reston1.va.home.com> > If we get lexical scoping, there should be a fast (built-in) way to > get at all the accessible names from Python. I.e. currently I can do > > d = globals().copy() > d.update(locals()) > > and know that `d' contains a dictionary of available names, with the > right overloading semantics. (PEP 42 now includes a feature request > to make vars() do this by default.) Note that I just deleted that feature request from PEP 42 -- vars() or locals() returns the dictionary containing the variables, and you can't just change the semantics to return a newly copied dictionary (which could be quite expensive too!). I don't think you need to have a mechanism to find all accessible names; I don't see a common use for that. It's sufficient to have a mechanism to look up any specific name according to whatever mechanism we decide upon. This is needed for internal use of course; it can also be useful for e.g. variable substitution mechanisms like the one you recently proposed or Ping's Itmpl. --Guido van Rossum (home page: http://www.python.org/~guido/) From jeremy@alum.mit.edu Thu Nov 2 18:04:10 2000 From: jeremy@alum.mit.edu (Jeremy Hylton) Date: Thu, 2 Nov 2000 13:04:10 -0500 (EST) Subject: [Python-Dev] Modules/makesetup loop In-Reply-To: <14849.43647.550092.678278@cj42289-a.reston1.va.home.com> References: <14848.39586.832800.139182@bitdiddle.concentric.net> <14849.43647.550092.678278@cj42289-a.reston1.va.home.com> Message-ID: <14849.44186.494602.297323@bitdiddle.concentric.net> I created a new directory. I executed 'OPT="-O3" ../configure'. It placed three files in Modules: Makefile.pre, Setup, and Setup.config. 
I ran "make" and it immediately reported an error: "../../Modules/makesetup Setup.config Setup.local Setup" reports "cat: Setup.local: No such file or directory" Jeremy From guido@python.org Thu Nov 2 06:07:04 2000 From: guido@python.org (Guido van Rossum) Date: Thu, 02 Nov 2000 01:07:04 -0500 Subject: [Python-Dev] Re: Dynamic nested scopes In-Reply-To: Your message of "Thu, 02 Nov 2000 11:31:35 EST." <14849.38631.997377.600214@anthem.concentric.net> References: <200011020326.WAA07307@cj20424-a.reston1.va.home.com> <14849.38631.997377.600214@anthem.concentric.net> Message-ID: <200011020607.BAA09013@cj20424-a.reston1.va.home.com> [Barry, on dynamic scopes] > This is how Emacs Lisp behaves, and it's used all the time in ELisp > programs. On the one hand it's quite convenient for customizing the > behavior of functions. On the other hand, it can make documenting the > interface of functions quite difficult because all those dynamically > scoped variables are now part of the function's API. > > It's interesting to note that many ELispers really hate dynamic > scoping and pine for a move toward lexical scoping. I'm not one of > them. I don't care what you pine for in ELisp, but for Python this would be a bad idea. > I'm not as concerned about "fixing" nested functions because I hardly > ever use them, and rarely see them much in Python code. Fixing > lambdas would be nice, but since Guido considers lambdas themselves a > mistake, and given that lamda use /can/ be a performance hit in some > situations, does it make sense to change something as fundamental as > Python's scoping rules to fix this eddy of the language? I referred to this in our group meeting as "fixing lambda" because that's where others seem to need it most often. But it is a real problem that exists for all nested functions. 
So let me rephrase that: "fixing nested function definitions" is useful, if it can be done without leaking memory, without breaking too much existing code, and without slowing down code that doesn't use the feature. --Guido van Rossum (home page: http://www.python.org/~guido/) From jeremy@alum.mit.edu Thu Nov 2 18:06:12 2000 From: jeremy@alum.mit.edu (Jeremy Hylton) Date: Thu, 2 Nov 2000 13:06:12 -0500 (EST) Subject: [Python-Dev] statically nested scopes In-Reply-To: <200011020556.AAA08847@cj20424-a.reston1.va.home.com> References: <14848.27102.223001.369662@bitdiddle.concentric.net> <14849.37426.860007.989619@bitdiddle.concentric.net> <200011020556.AAA08847@cj20424-a.reston1.va.home.com> Message-ID: <14849.44308.333212.37111@bitdiddle.concentric.net> Don't know if you saw the discussion in the PEP or not. I made two arguments for being able to assign to variables bound in enclosing scopes. 1. Every other language that supports nested lexical scoping allows this. To the extent that programmers have seen these other languages, they will expect it to work. 2. It is possible to work around this limitation by using containers. If you want to have an integer that can be updated by nested functions, you wrap the integer in a list and make all assignments and references refer to list[0]. It would be unfortunate if programmers used this style, because it is obscure. I'd rather see the language provide a way to support this style of programming directly. Jeremy From esr@thyrsus.com Thu Nov 2 22:10:16 2000 From: esr@thyrsus.com (Eric S.
Raymond) Date: Thu, 2 Nov 2000 14:10:16 -0800 Subject: [Python-Dev] statically nested scopes In-Reply-To: <200011020329.WAA07330@cj20424-a.reston1.va.home.com>; from guido@python.org on Wed, Nov 01, 2000 at 10:29:03PM -0500 References: <200011020037.NAA29447@s454.cosc.canterbury.ac.nz> <3A015A3A.4EC7DBC6@lemburg.com> <200011020329.WAA07330@cj20424-a.reston1.va.home.com> Message-ID: <20001102141016.B1838@thyrsus.com> Guido van Rossum : > > To be honest, I don't think static nested scopes buy us all that > > much. You can do the same now, by using keyword arguments which > > isn't all that nice, but works great and makes the scope clearly > > visible. > > Yes. It's a hack that gets employed over and over. And it has > certain problems. We added 'import as' to get rid of a common > practice that was perceived unclean. Maybe we should support nested > scopes to get rid of another unclean common practice? > > I'm not saying that we definitely should add this to 2.1 (there's > enough on our plate already) but we should at least consider it, and > now that we have cycle GC, the major argument against it (that it > causes cycles) is gone... For whatever it's worth, I agree with both these arguments. -- Eric S. Raymond "The power to tax involves the power to destroy;...the power to destroy may defeat and render useless the power to create...." -- Chief Justice John Marshall, 1819. From fdrake@acm.org Thu Nov 2 18:04:34 2000 From: fdrake@acm.org (Fred L. Drake, Jr.) Date: Thu, 2 Nov 2000 13:04:34 -0500 (EST) Subject: [Python-Dev] Modules/makesetup loop In-Reply-To: <14849.44186.494602.297323@bitdiddle.concentric.net> References: <14848.39586.832800.139182@bitdiddle.concentric.net> <14849.43647.550092.678278@cj42289-a.reston1.va.home.com> <14849.44186.494602.297323@bitdiddle.concentric.net> Message-ID: <14849.44210.93217.431262@cj42289-a.reston1.va.home.com> Jeremy Hylton writes: > I created a new directory. I executed 'OPT="-O3" ../configure'. 
It > placed three files in Modules: Makefile.pre, Setup, and Setup.config. > > I ran "make" and it immediately reported an error: > "../../Modules/makesetup Setup.config Setup.local Setup" reports "cat: > Setup.local: No such file or directory" Very interesting! The first thing I get is this:

(cd Modules; make -f Makefile.pre Makefile)
make[1]: Entering directory `/home/fdrake/projects/python/temp/Modules'
echo "# Edit this file for local setup changes" >Setup.local
rm -rf ../libpython2.0.a
/bin/sh ../../Modules/makesetup Setup.config Setup.local Setup
make[1]: Leaving directory `/home/fdrake/projects/python/temp/Modules'

Can you capture stdout & stderr from a clean configure & make and mail it to me? Thanks! -Fred -- Fred L. Drake, Jr. PythonLabs at Digital Creations From guido@python.org Thu Nov 2 06:10:57 2000 From: guido@python.org (Guido van Rossum) Date: Thu, 02 Nov 2000 01:10:57 -0500 Subject: [Python-Dev] Re: Dynamic nested scopes In-Reply-To: Your message of "Thu, 02 Nov 2000 11:36:45 EST." <14849.38941.59576.682495@bitdiddle.concentric.net> References: <3A0190BA.940FFADA@lemburg.com> <14849.38941.59576.682495@bitdiddle.concentric.net> Message-ID: <200011020610.BAA09073@cj20424-a.reston1.va.home.com> > Moshe's explanation of "dynamic scope" is the definition I've seen in > every programming language text I've ever read. The essence of the > definition, I believe, is that a free variable is resolved in the > environment created by the current procedure call stack. Ah. The term "free variable" makes sense here. > I think it muddles the discussion to use "dynamic scope" to describe > acquisition, though it is a dynamic feature. > > Python uses dynamic scope for exceptions. If any exception is > raised, the exception handler that is triggered is determined by the > environment in which the procedure was called. Then I think this also muddles the discussion, since the search for exception handlers has nothing to do with free variable lookup.
> There are few languages that use dynamic scoping for normal name > resolution. Many early Lisp implementations did, but I think all the > modern ones use lexical scoping instead. It is hard to write modular > code using dynamic scope, because the behavior of a function with free > variables can not be determined by the module that defines it. Not > saying it isn't useful, just that it makes it much harder to reason > about how a particular module or function works in isolation from the > rest of the system. I think Python 3000 ought to use totally static scoping. That will make it possible to optimize code using built-in names! --Guido van Rossum (home page: http://www.python.org/~guido/) From esr@thyrsus.com Thu Nov 2 22:12:57 2000 From: esr@thyrsus.com (Eric S. Raymond) Date: Thu, 2 Nov 2000 14:12:57 -0800 Subject: [Python-Dev] Python 2.1 tasks In-Reply-To: <3A01910E.A19FFBB2@lemburg.com>; from mal@lemburg.com on Thu, Nov 02, 2000 at 05:06:38PM +0100 References: <200011020329.WAA07330@cj20424-a.reston1.va.home.com> <20001102103442.B5027@kronos.cnri.reston.va.us> <3A01910E.A19FFBB2@lemburg.com> Message-ID: <20001102141257.C1838@thyrsus.com> M.-A. Lemburg : > Most important for 2.1 are probably: > > 1. new C level coercion scheme > 2. rich comparisons > 3. making the std lib Unicode compatible I'd certainly like to see rich comparisons go in. I have a "Set" class all ready for addition to the standard library except that it's waiting on this feature in order to do partial ordering properly. -- Eric S. Raymond Every election is a sort of advance auction sale of stolen goods. -- H.L. Mencken From esr@thyrsus.com Thu Nov 2 22:14:24 2000 From: esr@thyrsus.com (Eric S.
Raymond) Date: Thu, 2 Nov 2000 14:14:24 -0800 Subject: [Python-Dev] Re: Dynamic nested scopes In-Reply-To: <14849.38941.59576.682495@bitdiddle.concentric.net>; from jeremy@alum.mit.edu on Thu, Nov 02, 2000 at 11:36:45AM -0500 References: <3A0190BA.940FFADA@lemburg.com> <14849.38941.59576.682495@bitdiddle.concentric.net> Message-ID: <20001102141424.D1838@thyrsus.com> Jeremy Hylton : > There are few languages that use dynamic scoping for normal name > resolution. Many early Lisp implementations did, but I think all the > modern ones use lexical scoping instead. It is hard to write modular > code using dynamic scope, because the behavior of a function with free > variables can not be determined by the module that defines it. Correct. Based on my LISP experience, I would be strongly opposed to dynamic scoping. That path has been tried and found wanting. -- Eric S. Raymond "Guard with jealous attention the public liberty. Suspect every one who approaches that jewel. Unfortunately, nothing will preserve it but downright force. Whenever you give up that force, you are inevitably ruined." -- Patrick Henry, speech of June 5 1788 From jeremy@alum.mit.edu Thu Nov 2 18:14:08 2000 From: jeremy@alum.mit.edu (Jeremy Hylton) Date: Thu, 2 Nov 2000 13:14:08 -0500 (EST) Subject: [Python-Dev] Modules/makesetup loop In-Reply-To: <14849.44210.93217.431262@cj42289-a.reston1.va.home.com> References: <14848.39586.832800.139182@bitdiddle.concentric.net> <14849.43647.550092.678278@cj42289-a.reston1.va.home.com> <14849.44186.494602.297323@bitdiddle.concentric.net> <14849.44210.93217.431262@cj42289-a.reston1.va.home.com> Message-ID: <14849.44784.479281.162333@bitdiddle.concentric.net> Despite the error, I did get a successful build this time. I guess your configure change worked.
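The lexical-vs-dynamic distinction Jeremy and Eric make above can be illustrated in a few lines. This is a hedged sketch with invented names; the dynamic rule is only *simulated*, by temporarily patching the function's globals, since Python itself resolves free variables lexically:

```python
greeting = "module"

def hello():
    return greeting        # lexical: always the defining module's 'greeting'

def simulate_dynamic(func, env):
    # Dynamic scope would resolve free variables in the *caller's*
    # environment; we fake that by patching globals (don't do this for real).
    saved = func.__globals__["greeting"]
    func.__globals__["greeting"] = env["greeting"]
    try:
        return func()
    finally:
        func.__globals__["greeting"] = saved

a = hello()                                          # lexical answer
b = simulate_dynamic(hello, {"greeting": "caller"})  # dynamic answer
```

The point of the thread: `a` is determined by where `hello` was written, while `b` depends on who called it, which is exactly why dynamic scope makes modular reasoning hard.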
Jeremy From jeremy@alum.mit.edu Thu Nov 2 18:16:25 2000 From: jeremy@alum.mit.edu (Jeremy Hylton) Date: Thu, 2 Nov 2000 13:16:25 -0500 (EST) Subject: [Python-Dev] Larry Wall talk on Perl 6 Message-ID: <14849.44921.765397.205628@bitdiddle.concentric.net> http://dev.perl.org/~ask/als/ Interesting reading. Jeremy From fdrake@acm.org Thu Nov 2 18:12:38 2000 From: fdrake@acm.org (Fred L. Drake, Jr.) Date: Thu, 2 Nov 2000 13:12:38 -0500 (EST) Subject: [Python-Dev] Modules/makesetup loop In-Reply-To: <14849.44784.479281.162333@bitdiddle.concentric.net> References: <14848.39586.832800.139182@bitdiddle.concentric.net> <14849.43647.550092.678278@cj42289-a.reston1.va.home.com> <14849.44186.494602.297323@bitdiddle.concentric.net> <14849.44210.93217.431262@cj42289-a.reston1.va.home.com> <14849.44784.479281.162333@bitdiddle.concentric.net> Message-ID: <14849.44694.303665.854116@cj42289-a.reston1.va.home.com> Jeremy Hylton writes: > Despite the error, I did get a successful build this time. I guess > your configure change worked. I don't think the bug I fixed and the bug you reported were tightly related. I don't understand why you got the error before or after my change. If you're running autoconf 2.13, though, please do check in the new configure script! (If anyone else is, you can check it in too! The winner gets one free documentation download!) -Fred -- Fred L. Drake, Jr. PythonLabs at Digital Creations From barry@wooz.org Thu Nov 2 18:20:06 2000 From: barry@wooz.org (barry@wooz.org) Date: Thu, 2 Nov 2000 13:20:06 -0500 (EST) Subject: [Python-Dev] statically nested scopes References: <14848.27102.223001.369662@bitdiddle.concentric.net> <14849.38088.903104.936944@anthem.concentric.net> <200011020602.BAA08974@cj20424-a.reston1.va.home.com> Message-ID: <14849.45142.860806.358489@anthem.concentric.net> >>>>> "GvR" == Guido van Rossum writes: >> If we get lexical scoping, there should be a fast (built-in) >> way to get at all the accessible names from Python. >> I.e.
currently I can do d = globals().copy() d.update(locals()) >> and know that `d' contains a dictionary of available names, >> with the right overloading semantics. (PEP 42 now includes a >> feature request to make vars() do this by default.) GvR> Note that I just deleted that feature request from PEP 42 -- GvR> vars() or locals() returns the dictionary containing the GvR> variables, and you can't just change the semantics to return GvR> a newly copied dictionary (which could be quite expensive GvR> too!). Saw that. I was just thinking that locals() already does what vars()-no-args does, so why have two ways to do the same thing? GvR> I don't think you need to have a mechanism to find all GvR> accessible names; I don't see a common use for that. It's GvR> sufficient to have a mechanism to look up any specific name GvR> according to whatever mechanism we decide upon. This is GvR> needed for internal use of course; it can also be useful for GvR> e.g. variable substitution mechanisms like the one you GvR> recently proposed or Ping's Itmpl. Ah, something like this then:

-------------------- snip snip --------------------
import sys
from UserDict import UserDict

class NamesDict(UserDict):
    def __init__(self, frame):
        self.__frame = frame
        UserDict.__init__(self)

    def __getitem__(self, key):
        if self.data.has_key(key):
            return self.data[key]
        locals = self.__frame.f_locals
        if locals.has_key(key):
            return locals[key]
        globals = self.__frame.f_globals
        if globals.has_key(key):
            return globals[key]
        raise KeyError, key

def _(s):
    try:
        raise 'oops'
    except:
        frame = sys.exc_info()[2].tb_frame.f_back
    return s % NamesDict(frame)

theirs = 'theirs'

def give(mine, yours):
    print _('mine=%(mine)s, yours=%(yours)s, theirs=%(theirs)s')
-------------------- snip snip --------------------

Python 2.0 (#128, Oct 18 2000, 04:48:44) [GCC egcs-2.91.66 19990314/Linux (egcs-1.1.2 release)] on linux2 Type "copyright", "credits" or "license" for more information.
>>> import dict
>>> dict.give('mine', 'yours')
mine=mine, yours=yours, theirs=theirs
>>>

-Barry From guido@python.org Thu Nov 2 06:28:26 2000 From: guido@python.org (Guido van Rossum) Date: Thu, 02 Nov 2000 01:28:26 -0500 Subject: [Python-Dev] Re: [Python-checkins] CVS: python/nondist/peps pep-0204.txt,1.4,1.5 In-Reply-To: Your message of "Thu, 02 Nov 2000 09:07:29 PST." <20001102090729.A1874@lyra.org> References: <200011012237.OAA06642@slayer.i.sourceforge.net> <20001101234134.O12812@xs4all.nl> <14849.39382.960359.909365@anthem.concentric.net> <20001102090729.A1874@lyra.org> Message-ID: <200011020628.BAA09282@cj20424-a.reston1.va.home.com> > Shouldn't we allow other people to tweak PEP 0? It would certainly lighten > Barry's administrative overload. > > I mean, geez... this is what source control is about. Let a lot of people in > there, but be able to back up in case somebody totally goofs it. Agreed. > This goes for adding new PEPs, too. I'm not as convinced here, since some > level of "good enough for a PEP" filtering is probably desirable, but then > again, it would seem that the people with commit access probably have that > filter in their head anyways. Here, common sense and good judgement should be applied. If there seems to be consensus that a PEP is needed, there's no need to wait for Barry. The update to PEP-0000 commits the assignment of the new PEP number. But the new PEP should follow all the rules for a new PEP! Having Barry in the loop makes sense for those who aren't sure they can comply with all the rules, and for those outside the python-dev community.
--Guido van Rossum (home page: http://www.python.org/~guido/) From tim_one@email.msn.com Thu Nov 2 18:29:41 2000 From: tim_one@email.msn.com (Tim Peters) Date: Thu, 2 Nov 2000 13:29:41 -0500 Subject: [Python-Dev] statically nested scopes In-Reply-To: <200011020556.AAA08847@cj20424-a.reston1.va.home.com> Message-ID: [Guido] > [Jeremy and Tim argue about what to do about write access for > variables at intermediate levels of nesting, neither local nor > module-global.] > > I'll risk doing a pronouncement, even though I know that Jeremy (and > maybe also Tim?) disagree. > > You don't need "write access" (in the sense of being able to assign) > for variables at the intermediate scopes, so there is no need for a > syntax to express this. I can live with that! Reference-only access to intermediate scopes would address 99% of current gripes. Of course, future gripes will shift to that there's no rebinding access. If we have to support that someday too, my preferred way of spelling it in Python requires explicit new syntax, so adding that later would not break anything. > Assignments are to local variables (normally) or to module-globals (when > 'global' is used). Variable references search for a local, then for a local of > the containing function definition, then for a local in its container, The Pascal standard coined "closest-containing scope" to describe this succinctly, and I recommend it for clarity and brevity. > and so forth, until it hits the module globals, and then finally it looks > for a builtin. > > We can argue over which part of this is done statically and which part > is done dynamically: currently, locals are done dynamically and > everything else is done statically.
I'm not sure what you're trying to say there, but to the extent that I think I grasp it, I believe it's backwards: locals are static today but everything else is dynamic (and not in the sense of "dynamic scoping", but in the operational sense of "requires runtime search to resolve non-local names", while local names are fully resolved at compile-time today (but in the absence of "exec" and "import *")). what's-in-a-name?-a-rose-in-any-other-scope-may-not-smell-as- sweet-ly y'rs - tim From guido@python.org Thu Nov 2 06:35:00 2000 From: guido@python.org (Guido van Rossum) Date: Thu, 02 Nov 2000 01:35:00 -0500 Subject: [Python-Dev] statically nested scopes In-Reply-To: Your message of "Thu, 02 Nov 2000 18:07:15 +0100." <3A019F43.E13604BA@lemburg.com> References: <200011020037.NAA29447@s454.cosc.canterbury.ac.nz> <3A015A3A.4EC7DBC6@lemburg.com> <200011020329.WAA07330@cj20424-a.reston1.va.home.com> <3A019F43.E13604BA@lemburg.com> Message-ID: <200011020635.BAA09321@cj20424-a.reston1.va.home.com> [MAL] > > > It may not look serious, but changing the Python lookup scheme > > > is, since many inspection tools rely and reimplement exactly > > > that scheme. With nested scopes, there would be next to no > > > way to emulate the lookups using these tools. [GvR] > > So fix the tools. [MAL] > Eek. Are you proposing to break all the Python IDE that are > just appearing out there ? Yes. If a tool does variable scope analysis, it should be prepared for changes in the rules. Otherwise we might as well have refused the syntax changes in 2.0 because they required changes to tools! > > > To be honest, I don't think static nested scopes buy us all that > > > much. You can do the same now, by using keyword arguments which > > > isn't all that nice, but works great and makes the scope clearly > > > visible. > > > > Yes. It's a hack that gets employed over and over. And it has > > certain problems. We added 'import as' to get rid of a common > > practice that was perceived unclean. 
Maybe we should support nested > scopes to get rid of another unclean common practice? > > I think the common practice mainly comes from the fact, > that by making globals locals which can benefit from LOAD_FAST > you get a noticable performance boost. > > So the "right" solution to these weird looking hacks would > be to come up with a smart way by which the Python compiler > itself can do the localizing. Can you elaborate? I don't understand what you are proposing here. > Nested scopes won't help eliminating the current keyword > practice. Why not? I'd say that

    def create_adder(n):
        def adder(x, n=n):
            return x+n
        return adder

is a hack and that nested scopes can fix this by allowing you to write

    def create_adder(n):
        def adder(x):
            return x+n
        return adder

like one would expect. (Don't tell me that it isn't a FAQ why this doesn't work!) > > I'm not saying that we definitely should add this to 2.1 (there's > > enough on our plate already) but we should at least consider it, and > > now that we have cycle GC, the major argument against it (that it > > causes cycles) is gone... > > Hmm, so far the only argument for changing Python lookups > was to allow writing lambdas without keyword hacks. Does this > really warrant breaking code ? > > What other advantages would statically nested scopes have ? Doing what's proper. Nested scopes are not a bad idea. They weren't implemented because they were hard to get right (perhaps impossible without creating cycles), and I was okay with that because I didn't like them; but I've been convinced that examples like the second create_adder() above should really work. Just like float+int works (in Python 0.1, this was a type-error -- you had to cast the int arg to a float to get a float result).
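Guido's two create_adder variants can be compared directly. In an interpreter with nested scopes (the behavior later specified by PEP 227, mentioned elsewhere in this digest), both forms now behave identically; this sketch only restates his example:

```python
def create_adder_hack(n):
    def adder(x, n=n):   # the keyword-argument hack: snapshot n at def time
        return x + n
    return adder

def create_adder(n):
    def adder(x):        # relies on nested scopes: n is free in adder
        return x + n
    return adder

add5 = create_adder(5)
```

Before nested scopes, `create_adder` without the `n=n` default raised a NameError when the returned adder was called, which is exactly the FAQ Guido alludes to.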
--Guido van Rossum (home page: http://www.python.org/~guido/) From guido@python.org Thu Nov 2 06:40:19 2000 From: guido@python.org (Guido van Rossum) Date: Thu, 02 Nov 2000 01:40:19 -0500 Subject: [Python-Dev] statically nested scopes In-Reply-To: Your message of "Thu, 02 Nov 2000 17:45:12 +0100." <3A019A18.20D12FE1@lemburg.com> References: <200011020037.NAA29447@s454.cosc.canterbury.ac.nz> <3A015A3A.4EC7DBC6@lemburg.com> <14849.37568.510427.834971@bitdiddle.concentric.net> <3A019A18.20D12FE1@lemburg.com> Message-ID: <200011020640.BAA09370@cj20424-a.reston1.va.home.com> > Well first you'd have to change all tools to use the new > scheme (this includes debuggers, inspection tools, reflection > kits, etc.). This certainly is not a smart thing to do since > Python IDEs are just starting to appear -- you wouldn't want > to break all those. I've seen a Komodo demo. Yes, it does this. But it's soooooooo far from being done that adding this wouldn't really slow them down much, I think. More likely, the toolmakers will have fun competing with each other to be the first to support this! :-) > What get's harder with the nested scheme is that > you can no longer be certain that globals() reaches out to > the module namespace. But this is needed by some lazy evaluation > tools. Writing to globals() would not be defined anymore -- > where should you bind the new variable ? I think globals() should return the module's __dict__, and the global statement should cause the variable to reach directly into there. We'll need to introduce a new builtin to do a name *lookup* in the nested scopes. > Another problem is that there probably won't be a way to access > all the different nesting levels on a per-level basis (could be > that I'm missing something here, but debugging tools would need > some sort of scope() builtin to access the different scopes). > I'm not sure whether this is possible to do without some sort > of link between the scopes. We currently don't need such links. 
Correct. That link may have to be added to the frame object. --Guido van Rossum (home page: http://www.python.org/~guido/) From fredrik@effbot.org Thu Nov 2 18:56:40 2000 From: fredrik@effbot.org (Fredrik Lundh) Date: Thu, 2 Nov 2000 19:56:40 +0100 Subject: [Python-Dev] statically nested scopes References: <200011020037.NAA29447@s454.cosc.canterbury.ac.nz> <3A015A3A.4EC7DBC6@lemburg.com> <200011020329.WAA07330@cj20424-a.reston1.va.home.com> <3A019F43.E13604BA@lemburg.com> <200011020635.BAA09321@cj20424-a.reston1.va.home.com> Message-ID: <000f01c044ff$47b72cb0$3c6340d5@hagrid> Guido van Rossum wrote:> [MAL] > > Eek. Are you proposing to break all the Python IDE that are > > just appearing out there ? > > Yes. If a tool does variable scope analysis, it should be prepared > for changes in the rules. we can live with that... From guido@python.org Thu Nov 2 06:50:45 2000 From: guido@python.org (Guido van Rossum) Date: Thu, 02 Nov 2000 01:50:45 -0500 Subject: [Python-Dev] Python Call Mechanism In-Reply-To: Your message of "02 Nov 2000 17:29:52 GMT." References: <200010301715.JAA32564@slayer.i.sourceforge.net> <39FDB5EA.B60EA39A@lemburg.com> <14845.36020.17063.951147@bitdiddle.concentric.net> <200011020424.XAA07810@cj20424-a.reston1.va.home.com> Message-ID: <200011020650.BAA09505@cj20424-a.reston1.va.home.com> > Hmm - the interaction of "a=b" style args & *-ed args is a bit > counter-intuitive, particularly as the "a=b" args syntatically have to > come before the *-ed args: > > >>> def f(a,b,c,d): > ... return a,b,c,d > ... > >>> f(1,c=3,*(2,4)) > Traceback (most recent call last): > File "", line 1, in ? > TypeError: keyword parameter 'c' redefined in call to f() > >>> f(1,b=3,*(2,4)) > Traceback (most recent call last): > File "", line 1, in ? > TypeError: keyword parameter 'b' redefined in call to f() > >>> f(1,d=4,*(2,3)) > (1, 2, 3, 4) > > I humbly submit that This Is Wrong. 
I haven't seen anybody complain > about it, which suggests to me that noone is using this combination, > and I propose either: > > 1) banning it > 2) demanding that the *-ed arg precede the "a=b" args Interesting. The *-ed arg cannot precede the a=b args currently, but I agree that it would have made more sense to do it that way. I'm not sure that it's worth allowing the combination of kw args and *tuple, but I'm also not sure that it *isn't* worth it. > Of course, if noone is using this "feature", then maybe this dusty > little corner of the language should be left undisturbed. I think it may be best to leave it alone. > And I haven't even thought about default arguments yet... That's unrelated. --Guido van Rossum (home page: http://www.python.org/~guido/) From cgw@fnal.gov Thu Nov 2 18:56:55 2000 From: cgw@fnal.gov (Charles G Waldman) Date: Thu, 2 Nov 2000 12:56:55 -0600 (CST) Subject: [Python-Dev] PythonLabs -> Digital Creations Message-ID: <14849.47351.835291.365708@buffalo.fnal.gov> So, when are the relevant websites going to be updated to reflect the new reality? Pythonlabs.com and BeOpen.com still reflect the old regime... I'd think the Digital Creations folks would be eager to announce the changing of the guard... From gstein@lyra.org Thu Nov 2 18:57:07 2000 From: gstein@lyra.org (Greg Stein) Date: Thu, 2 Nov 2000 10:57:07 -0800 Subject: [Python-Dev] Modules/makesetup loop In-Reply-To: <14849.44694.303665.854116@cj42289-a.reston1.va.home.com>; from fdrake@acm.org on Thu, Nov 02, 2000 at 01:12:38PM -0500 References: <14848.39586.832800.139182@bitdiddle.concentric.net> <14849.43647.550092.678278@cj42289-a.reston1.va.home.com> <14849.44186.494602.297323@bitdiddle.concentric.net> <14849.44210.93217.431262@cj42289-a.reston1.va.home.com> <14849.44784.479281.162333@bitdiddle.concentric.net> <14849.44694.303665.854116@cj42289-a.reston1.va.home.com> Message-ID: <20001102105707.H2037@lyra.org> On Thu, Nov 02, 2000 at 01:12:38PM -0500, Fred L. Drake, Jr.
wrote: > > Jeremy Hylton writes: > > Despite the error, I did get a successful build this time. I guess > > your configure change worked. > > I don't think the bug I fixed and the bug you reported were tightly > related. I don't understand why you got the error before or > after my change. > If you're running autoconf 2.13, though, please do check in the new > configure script! Wouldn't it make sense to do the upgrade to take advantage of the autoconf fixes? If there are any problems with the change, then now is the time to do it and to sort them out! Cheers, -g -- Greg Stein, http://www.lyra.org/ From tim_one@email.msn.com Thu Nov 2 19:07:18 2000 From: tim_one@email.msn.com (Tim Peters) Date: Thu, 2 Nov 2000 14:07:18 -0500 Subject: [Python-Dev] statically nested scopes In-Reply-To: <14849.37426.860007.989619@bitdiddle.concentric.net> Message-ID: [Jeremy] > I agree that visual inspection is a tad harder, but I contend that > existing programs that use the same name for a global variable and a > local variable -- and intend for the global to be visible within a > function nested in the local variable's region -- are confusing. > It's too hard for a first-time reader of the code to figure out what > is going on. > > Incidentally, I have yet to see an example of this problem occurring > in anyone's code. All the examples seem a bit contrived. I wonder if > anyone has an example in existing code. I wasn't the one making the "will break code" argument (I'm sure it will, but very little, and not at all for most people since most people never nest functions in Python today apart from default-abusing lambdas). Visual inspection stands on its own as a potential problem.
My intent was not to show a compelling use of > mutable state. Instead it was to show that with read-only access, > people could still modify values on enclosing scopes. The issue is > whether the language allows the programmer to express this intent > clearly or if she has to jump through some hoops to accomplish it. Guido said "jump through some hoops", so I'm dropping it, but first noting that "the container" in *idiomatic* Python will most often be "self": def yadda(self, ...): def whatever(amount): self.balance += amount return whatever will work to rebind self.balance. I don't think Guido will ever be interested in supporting idiomatic Scheme. > TP> def deposit(amount): > TP> global bank_account.balance > TP> balance += amount > I'm still not sure I like it, because it mixes local variables of a > function with attribute access on objects. I'll add it to the > discussion in the PEP (if Barry approves the PEP ), though. Actually, no connection to attribute access was intended there: it was just a backward-compatible way to spell the pair (name of containing scope, name of vrbl in that scope). global back_account:balance or global balance from bank_account would do as well (and maybe better as they don't imply attribute access; but maybe worse as they don't bring to mind attribute access ). > Do you have any opinion on the subtleties? The two that immediately > come to mind are: 1) whether the function's local are available as > attributes anywhere or only in nested scopes I didn't intend attribute-like access at all (although we had earlier talked about that wrt JavaScript, I didn't have that in mind here). > and 2) whether you can create new local variable using this notation. Yes, that's the primary one I was thinking about. 
From tim_one@email.msn.com Thu Nov 2 19:10:52 2000 From: tim_one@email.msn.com (Tim Peters) Date: Thu, 2 Nov 2000 14:10:52 -0500 Subject: [Python-Dev] statically nested scopes In-Reply-To: <14849.38088.903104.936944@anthem.concentric.net> Message-ID: [Barry A. Warsaw] > If we get lexical scoping, there should be a fast (built-in) way to > get at all the accessible names from Python. I.e. currently I can do > > d = globals().copy() > d.update(locals()) > > and know that `d' contains a dictionary of available names, with the > right overloading semantics. It was long ago agreed (don't you love how I pull this stuff out of thin historical air ?) that if nested lexical scoping was added, we would need also to supply a new mapping object that mimicked the full Python lookup rules (including builtins). Not necessarily a dictionary, though. From guido@python.org Thu Nov 2 07:15:50 2000 From: guido@python.org (Guido van Rossum) Date: Thu, 02 Nov 2000 02:15:50 -0500 Subject: [Python-Dev] Re: [Python-checkins] CVS: python/nondist/peps pep-0227.txt,NONE,1.1 In-Reply-To: Your message of "Thu, 02 Nov 2000 12:33:11 EST." <14849.42327.23020.553510@anthem.concentric.net> References: <200011021618.IAA15298@slayer.i.sourceforge.net> <14849.42327.23020.553510@anthem.concentric.net> Message-ID: <200011020715.CAA09616@cj20424-a.reston1.va.home.com> > PF> It was obviously not intended to be mailed out that way again. > PF> Problem with pathname and/or current directory? Barry got > PF> this right once, now it is broken again. > > Except that AFAIK, I didn't do anything to fix it, or to break it > again. We have too much to do to try and fix this now. --Guido van Rossum (home page: http://www.python.org/~guido/) From tim_one@email.msn.com Thu Nov 2 19:14:50 2000 From: tim_one@email.msn.com (Tim Peters) Date: Thu, 2 Nov 2000 14:14:50 -0500 Subject: [Python-Dev] statically nested scopes In-Reply-To: <3A019F43.E13604BA@lemburg.com> Message-ID: [MAL] > ... 
> Hmm, so far the only argument for changing Python lookups > was to allow writing lambdas without keyword hacks. Does this > really warrant breaking code ? *Some* amount of code, sure. Hard to quantify, but hard to believe there's much code at risk. > What other advantages would statically nested scopes have ? Pythonic obviousness. Virtually everyone coming to Python from other languages *expects* visually nested functions to work this way. That they don't today is a very frequent source of surprises and complaints. From barry@wooz.org Thu Nov 2 19:31:37 2000 From: barry@wooz.org (Barry A. Warsaw) Date: Thu, 2 Nov 2000 14:31:37 -0500 (EST) Subject: [Python-Dev] statically nested scopes References: <14849.38088.903104.936944@anthem.concentric.net> Message-ID: <14849.49433.197319.973848@anthem.concentric.net> >>>>> "TP" == Tim Peters writes: TP> It was long ago agreed (don't you love how I pull this stuff TP> out of thin historical air ?) that if nested lexical TP> scoping was added, we would need also to supply a new mapping TP> object that mimicked the full Python lookup rules (including TP> builtins). Not necessarily a dictionary, though. Oh yeah, I vaguely remember now <0.5 scratch-head>. Works for me, although I'll point out that we needn't wait for lexical scoping to provide such an object! -Barry From guido@python.org Thu Nov 2 07:34:53 2000 From: guido@python.org (Guido van Rossum) Date: Thu, 02 Nov 2000 02:34:53 -0500 Subject: [Python-Dev] Re: Dynamic nested scopes In-Reply-To: Your message of "Thu, 02 Nov 2000 12:57:09 EST." <14849.43765.419161.105395@bitdiddle.concentric.net> References: <3A0190BA.940FFADA@lemburg.com> <200011020545.AAA08743@cj20424-a.reston1.va.home.com> <14849.43765.419161.105395@bitdiddle.concentric.net> Message-ID: <200011020734.CAA09751@cj20424-a.reston1.va.home.com> > I don't think I buy your explanation that Python uses dynamic scope > for resolving globals. That's not what I meant, but I expressed it clumsily. 
Perhaps the terminology is just inadequate. I simply meant that builtins can be overridden by module-globals. But only in the module whose globals are searched for globals -- that part is still static. Let's drop this thread... --Guido van Rossum (home page: http://www.python.org/~guido/) From fdrake@acm.org Thu Nov 2 19:31:59 2000 From: fdrake@acm.org (Fred L. Drake, Jr.) Date: Thu, 2 Nov 2000 14:31:59 -0500 (EST) Subject: [Python-Dev] Modules/makesetup loop In-Reply-To: <14849.49013.363302.823824@bitdiddle.concentric.net> References: <14848.39586.832800.139182@bitdiddle.concentric.net> <14849.43647.550092.678278@cj42289-a.reston1.va.home.com> <14849.44186.494602.297323@bitdiddle.concentric.net> <14849.44210.93217.431262@cj42289-a.reston1.va.home.com> <14849.45483.304866.556547@bitdiddle.concentric.net> <14849.45714.207143.196945@cj42289-a.reston1.va.home.com> <14849.47007.862290.364694@bitdiddle.concentric.net> <14849.48464.183713.819894@cj42289-a.reston1.va.home.com> <14849.49013.363302.823824@bitdiddle.concentric.net> Message-ID: <14849.49455.97877.379601@cj42289-a.reston1.va.home.com> [We've uncovered the root of Jeremy's problem here -- this is a problem/limitation of make with VPATH.] Jeremy Hylton writes: > There is a Setup.local in the source directory. That's why it isn't being built. Since it's on the VPATH, make thinks it already exists in a usable form. Since the command that uses it doesn't look for the "most local" version but assumes it exists in the right place, the command fails. This is a problem that exists with VPATH; the built files cannot already exist in the source directories (hence, the build directories can't refer to source dirs which have been used as build dirs; "make clobber" isn't good enough for the Setup* files). Remove (or move) the existing file, and it should work fine. -Fred -- Fred L. Drake, Jr. 
PythonLabs at Digital Creations From tim_one@email.msn.com Thu Nov 2 19:41:12 2000 From: tim_one@email.msn.com (Tim Peters) Date: Thu, 2 Nov 2000 14:41:12 -0500 Subject: [Python-Dev] PythonLabs -> Digital Creations In-Reply-To: <14849.47351.835291.365708@buffalo.fnal.gov> Message-ID: [Charles G Waldman] > So, when are the relevant websites going to be updated to reflect the > new reality? Sorry, we have no idea. > Pythonlabs.com and BeOpen.com still reflect the old regime... And they're running on BeOpen.com machines, which we can no longer fiddle with. The people remaining at BeOpen.com probably don't view updating those sites as a priority anymore (you all saw how long it took them to get pythonlabs.com and Starship merely running again, although at the time we couldn't tell you *why* we weren't able to fix them ourselves). > I'd think the Digital Creations folks would be eager to announce the > changing of the guard... They're getting the word out in their own way. Hope you saw the cover story on LWN today! http://www.lwn.net/ There's a lot of good info there, including useful interviews with Paul Everitt and Guido. From mwh21@cam.ac.uk Thu Nov 2 19:45:47 2000 From: mwh21@cam.ac.uk (Michael Hudson) Date: 02 Nov 2000 19:45:47 +0000 Subject: [Python-Dev] Python Call Mechanism In-Reply-To: Guido van Rossum's message of "Thu, 02 Nov 2000 01:50:45 -0500" References: <200010301715.JAA32564@slayer.i.sourceforge.net> <39FDB5EA.B60EA39A@lemburg.com> <14845.36020.17063.951147@bitdiddle.concentric.net> <200011020424.XAA07810@cj20424-a.reston1.va.home.com> <200011020650.BAA09505@cj20424-a.reston1.va.home.com> Message-ID: Guido van Rossum writes:

> > Hmm - the interaction of "a=b" style args & *-ed args is a bit
> > counter-intuitive, particularly as the "a=b" args syntactically have to
> > come before the *-ed args:
> >
> > >>> def f(a,b,c,d):
> > ...     return a,b,c,d
> > ...
> > >>> f(1,c=3,*(2,4))
> > Traceback (most recent call last):
> >   File "<stdin>", line 1, in ?
> > TypeError: keyword parameter 'c' redefined in call to f()
> > >>> f(1,b=3,*(2,4))
> > Traceback (most recent call last):
> >   File "<stdin>", line 1, in ?
> > TypeError: keyword parameter 'b' redefined in call to f()
> > >>> f(1,d=4,*(2,3))
> > (1, 2, 3, 4)
> >
> > I humbly submit that This Is Wrong. I haven't seen anybody complain
> > about it, which suggests to me that no one is using this combination,
> > and I propose either:
> >
> > 1) banning it
> > 2) demanding that the *-ed arg precede the "a=b" args
>
> Interesting. The *-ed arg cannot precede the a=b args currently, but
> I agree that it would have made more sense to do it that way.
>
> I'm not sure that it's worth allowing the combination of kw args and
> *tuple, but I'm also not sure that it *isn't* worth it.
>
> > Of course, if no one is using this "feature", then maybe this dusty
> > little corner of the language should be left undisturbed.
>
> I think it may be best to leave it alone.

Oh, I mostly agree and am not going to put any effort into this area - but if I or someone else manages to make striking simplifications to the function call code that has the side effect of banning/changing the syntax of this combination, then this probably wouldn't be a good enough reason for its rejection.

> > And I haven't even thought about default arguments yet...
>
> That's unrelated.

I know, it was just something that occurred to me as adding more complexity. Hypothetical exercise: Write a description of Python's current function call behaviour in sufficient detail to allow reimplementation of it without playing with an existing interpreter in less than 1000 words. It's just hairy, and it amazes me that it feels so natural most of the time... Cheers, M. -- Not only does the English Language borrow words from other languages, it sometimes chases them down dark alleys, hits them over the head, and goes through their pockets.
-- Eddy Peters From guido@python.org Thu Nov 2 07:51:15 2000 From: guido@python.org (Guido van Rossum) Date: Thu, 02 Nov 2000 02:51:15 -0500 Subject: [Python-Dev] statically nested scopes In-Reply-To: Your message of "Thu, 02 Nov 2000 13:20:06 EST." <14849.45142.860806.358489@anthem.concentric.net> References: <14848.27102.223001.369662@bitdiddle.concentric.net> <14849.38088.903104.936944@anthem.concentric.net> <200011020602.BAA08974@cj20424-a.reston1.va.home.com> <14849.45142.860806.358489@anthem.concentric.net> Message-ID: <200011020751.CAA12539@cj20424-a.reston1.va.home.com> [Barry] > Saw that. I was just thinking that locals() already does what > vars()-no-args does, so why have two ways to do the same thing? Not clear -- maybe one of them needs to be made obsolete. > GvR> I don't think you need to have a mechanism to find all > GvR> accessible names; I don't see a common use for that. It's > GvR> sufficient to have a mechanism to look up any specific name > GvR> according to whatever mechanism we decide upon. This is > GvR> needed for internal use of course; it can also be useful for > GvR> e.g. variable substitution mechanisms like the one you > GvR> recently proposed or Ping's Itmpl. > > Ah, something like this then: [Example deleted] I'm not sure that the mechanism provided should follow the mapping API. *All* it needs to do is provide lookup capability. Having .keys(), .items(), .has_key() etc. just slows it down. --Guido van Rossum (home page: http://www.python.org/~guido/) From tim_one@email.msn.com Thu Nov 2 19:54:05 2000 From: tim_one@email.msn.com (Tim Peters) Date: Thu, 2 Nov 2000 14:54:05 -0500 Subject: [Python-Dev] Re: Dynamic nested scopes In-Reply-To: <14849.43765.419161.105395@bitdiddle.concentric.net> Message-ID: [Jeremy] > I don't think I buy your explanation that Python uses dynamic scope > for resolving globals. It's dynamic in the shallow (but real!)
sense that it can't be determined to which of {module scope, builtin scope} a global name resolves today until runtime. Indeed, in

    def f():
        print len

the resolving scope for "len" may even change with each invocation of f(), depending on who's playing games with the containing module's __dict__. That's not "dynamic scoping" in the proper sense of the term, but it's sure dynamic! words-lead-to-more-words-so-become-one-with-the-essence-instead-ly y'rs - tim From guido@python.org Thu Nov 2 07:59:47 2000 From: guido@python.org (Guido van Rossum) Date: Thu, 02 Nov 2000 02:59:47 -0500 Subject: [Python-Dev] statically nested scopes In-Reply-To: Your message of "Thu, 02 Nov 2000 13:29:41 EST." References: Message-ID: <200011020759.CAA12600@cj20424-a.reston1.va.home.com> > [Guido] > > [Jeremy and Tim argue about what to do about write access for > > variables at intermediate levels of nesting, neither local nor > > module-global.] > > > > I'll risk doing a pronouncement, even though I know that Jeremy (and > > maybe also Tim?) disagree. > > > > You don't need "write access" (in the sense of being able to assign) > > for variables at the intermediate scopes, so there is no need for a > > syntax to express this. [Tim] > I can live with that! Reference-only access to intermediate scopes would > address 99% of current gripes. Of course, future gripes will shift to that > there's no rebinding access. If we have to support that someday too, my > preferred way of spelling it in Python requires explicit new syntax, so > adding that later would not break anything. Exactly my point. > > Assignments are to local variables (normally) or to module-globals (when > > 'global' is used). Use references search for a local, then for a local of > > the containing function definition, then for a local in its container, > The Pascal standard coined "closest-containing scope" to describe this > succinctly, and I recommend it for clarity and brevity. Thanks, that's a good term.
And the principle is totally uncontroversial. > > and so forth, until it hits the module globals, and then finally it looks > > for a builtin. > > > > We can argue over which part of this is done statically and which part > > is done dynamically: currently, locals are done dynamically and > > everything else is done statically. > > I'm not sure what you're trying to say there, but to the extent that I think > I grasp it, I believe it's backwards: locals are static today but > everything else is dynamic (and not in the sense of "dynamic scoping", but > in the operational sense of "requires runtime search to resolve non-local > names", while local names are fully resolved at compile-time today (but in > the absence of "exec" and "import *")). Oops, yes, I had it backwards. As I said elsewhere, in Python 3000 I'd like to do it all more statically. So perhaps we should look up nested locals based on static information too. Thus:

    x = "global-x"
    def foo():
        if 0:
            x = "x-in-foo"
        def bar():
            return x
        return bar
    print foo()()

should raise UnboundLocalError, not print "global-x". --Guido van Rossum (home page: http://www.python.org/~guido/) From guido@python.org Thu Nov 2 07:59:55 2000 From: guido@python.org (Guido van Rossum) Date: Thu, 02 Nov 2000 02:59:55 -0500 Subject: [Python-Dev] statically nested scopes In-Reply-To: Your message of "Thu, 02 Nov 2000 13:06:12 EST." <14849.44308.333212.37111@bitdiddle.concentric.net> References: <14848.27102.223001.369662@bitdiddle.concentric.net> <14849.37426.860007.989619@bitdiddle.concentric.net> <200011020556.AAA08847@cj20424-a.reston1.va.home.com> <14849.44308.333212.37111@bitdiddle.concentric.net> Message-ID: <200011020759.CAA12609@cj20424-a.reston1.va.home.com> > Don't know if you saw the discussion in the PEP or not. Sorry, I had no time. I have read it now, but it doesn't change my point of view. > I made two > arguments for being able to assign to variables bound in enclosing > scopes. > > 1.
Every other language that supports nested lexical scoping allows > this. To the extent that programmers have seen these other > languages, they will expect it to work. But we have a unique way of declaring variables, which makes the issues different. Your PEP wonders why I am against allowing assignment to intermediate levels. Here's my answer: all the syntaxes that have been proposed to spell this have problems. So let's not provide a way to spell it. I predict that it won't be a problem. If it becomes a problem, we can add a way to spell it later. I expect that the mechanism that will be used to find variables at intermediate levels can also be used to set them, so it won't affect that part of the implementation much. > 2. It is possible to work around this limitation by using containers. > If you want to have an integer that can be updated by nested > functions, you wrap the integer in a list and make all assignments > and references refer to list[0]. It would be unfortunate if > programmers used this style, because it is obscure. I'd rather see > the language provide a way to support this style of programming > directly. I don't expect that programmers will use this style. When they have this need, they will more likely use a class. --Guido van Rossum (home page: http://www.python.org/~guido/) From guido@python.org Thu Nov 2 08:08:14 2000 From: guido@python.org (Guido van Rossum) Date: Thu, 02 Nov 2000 03:08:14 -0500 Subject: [Python-Dev] PythonLabs -> Digital Creations In-Reply-To: Your message of "Thu, 02 Nov 2000 12:56:55 CST." <14849.47351.835291.365708@buffalo.fnal.gov> References: <14849.47351.835291.365708@buffalo.fnal.gov> Message-ID: <200011020808.DAA16162@cj20424-a.reston1.va.home.com> > So, when are the relevant websites going to be updated to reflect the > new reality? Pythonlabs.com and BeOpen.com still reflect the old > regime... I'd think the Digital Creations folks would be eager to > announce the changing of the guard...
While technically I have the capability to update the pythonlabs.com website, ethically I feel I cannot do this. The site belongs to BeOpen. I'll ask them to make some kind of announcement. --Guido van Rossum (home page: http://www.python.org/~guido/) From guido@python.org Thu Nov 2 08:16:53 2000 From: guido@python.org (Guido van Rossum) Date: Thu, 02 Nov 2000 03:16:53 -0500 Subject: [Python-Dev] Modules/makesetup loop In-Reply-To: Your message of "Thu, 02 Nov 2000 10:57:07 PST." <20001102105707.H2037@lyra.org> References: <14848.39586.832800.139182@bitdiddle.concentric.net> <14849.43647.550092.678278@cj42289-a.reston1.va.home.com> <14849.44186.494602.297323@bitdiddle.concentric.net> <14849.44210.93217.431262@cj42289-a.reston1.va.home.com> <14849.44784.479281.162333@bitdiddle.concentric.net> <14849.44694.303665.854116@cj42289-a.reston1.va.home.com> <20001102105707.H2037@lyra.org> Message-ID: <200011020816.DAA16227@cj20424-a.reston1.va.home.com> > Wouldn't it make sense to do the upgrade to take advantages of the autoconf > fixes? If there are any problems with the change, then now is the time to do > it and to sort them out! Yeah, but I have too much to do to get to this any time soon -- it just isn't important enough. --Guido van Rossum (home page: http://www.python.org/~guido/) From skip@mojam.com (Skip Montanaro) Thu Nov 2 21:19:11 2000 From: skip@mojam.com (Skip Montanaro) (Skip Montanaro) Date: Thu, 2 Nov 2000 15:19:11 -0600 (CST) Subject: [Python-Dev] statically nested scopes In-Reply-To: References: <14849.37426.860007.989619@bitdiddle.concentric.net> Message-ID: <14849.55887.283432.505910@beluga.mojam.com> >>>>> "Tim" == Tim Peters writes: Tim> [Jeremy] >> All the examples seem a bit contrived. I wonder if anyone has an >> example in existing code. Tim> I wasn't the one making the "will break code" argument ... Nor was I. Jeremy (I think) asked MAL how it could break code. I posted a (simple, but obviously contrived) example. 
I have no particular opinion on this subject. I was just trying to answer Jeremy's question. Skip From jeremy@alum.mit.edu Thu Nov 2 20:25:36 2000 From: jeremy@alum.mit.edu (Jeremy Hylton) Date: Thu, 2 Nov 2000 15:25:36 -0500 (EST) Subject: [Python-Dev] statically nested scopes In-Reply-To: <14849.55887.283432.505910@beluga.mojam.com> References: <14849.37426.860007.989619@bitdiddle.concentric.net> <14849.55887.283432.505910@beluga.mojam.com> Message-ID: <14849.52672.943439.978785@bitdiddle.concentric.net> Looks like we need to rehash this thread at least enough to determine who is responsible for causing us to rehash it. MAL said it would break code. I asked how. Skip and Tim obliged with examples. I said their examples exhibited bad style; neither of them claimed they were good style. In the end, I observed that while it could break code in theory, I doubted it really would break much code. Furthermore, I believe that the code it will break is already obscure so we needn't worry about it. Jeremy From tim_one@email.msn.com Thu Nov 2 21:13:47 2000 From: tim_one@email.msn.com (Tim Peters) Date: Thu, 2 Nov 2000 16:13:47 -0500 Subject: [Python-Dev] statically nested scopes In-Reply-To: <14849.52672.943439.978785@bitdiddle.concentric.net> Message-ID: [Jeremy] > Looks like we need to rehash this thread at least enough to determine > who is responsible for causing us to rehash it. > > MAL said it would break code. I asked how. Skip and Tim obliged with > examples. I said their examples exhibited bad style; neither of them > claimed they were good style. Ah! I wasn't trying to give an example of code that would break, but, ya, now that you mention it, it would. I was just giving an example of why visual inspection will be harder in Python than in Pascal. I expect that the kind of code I showed *will* be common: putting all the nested "helper functions" at the top of a function, just as was also done in Pascal. 
The killer difference is that in Pascal, the containing function's locals referenced by the nested helpers can be found declared at the top of the containing function; but in Python you'll have to search all over the place, and if you don't find a non-local var at once, you'll be left uneasy, wondering whether you missed a binding target, or whether the var is truly global, or what. From guido@python.org Thu Nov 2 09:41:01 2000 From: guido@python.org (Guido van Rossum) Date: Thu, 02 Nov 2000 04:41:01 -0500 Subject: [Python-Dev] statically nested scopes In-Reply-To: Your message of "Thu, 02 Nov 2000 15:25:36 EST." <14849.52672.943439.978785@bitdiddle.concentric.net> References: <14849.37426.860007.989619@bitdiddle.concentric.net> <14849.55887.283432.505910@beluga.mojam.com> <14849.52672.943439.978785@bitdiddle.concentric.net> Message-ID: <200011020941.EAA18905@cj20424-a.reston1.va.home.com> [Jeremy] > Looks like we need to rehash this thread at least enough to determine > who is responsible for causing us to rehash it. > > MAL said it would break code. I asked how. Skip and Tim obliged with > examples. I said their examples exhibited bad style; neither of them > claimed they were good style. > > In the end, I observed that while it could break code in theory, I > doubted it really would break much code. Furthermore, I believe that > the code it will break is already obscure so we needn't worry about > it. That's quite enough rehashing. I don't think we'll have to worry about breaking much code.
--Guido van Rossum (home page: http://www.python.org/~guido/) From thomas@xs4all.net Thu Nov 2 21:47:35 2000 From: thomas@xs4all.net (Thomas Wouters) Date: Thu, 2 Nov 2000 22:47:35 +0100 Subject: [Python-Dev] Re: [Python-checkins] CVS: python/nondist/peps pep-0227.txt,NONE,1.1 In-Reply-To: <200011020715.CAA09616@cj20424-a.reston1.va.home.com>; from guido@python.org on Thu, Nov 02, 2000 at 02:15:50AM -0500 References: <200011021618.IAA15298@slayer.i.sourceforge.net> <14849.42327.23020.553510@anthem.concentric.net> <200011020715.CAA09616@cj20424-a.reston1.va.home.com> Message-ID: <20001102224734.U12812@xs4all.nl> On Thu, Nov 02, 2000 at 02:15:50AM -0500, Guido van Rossum wrote: > > PF> It was obviously not intended to be mailed out that way again. > > PF> Problem with pathname and/or current directory? Barry got > > PF> this right once, now it is broken again. > We have too much to do to try and fix this now. But I haven't ;) I've made syncmail give some more information if it can't find the file, to compensate for the fact that cvs.python.sourceforge.net runs a Python before 1.5.2, and thus doesn't show the file it couldn't find. I won't bother with creating files just for the hell of it, so we'll just wait until Barry triggers it again. (Not that I'm bored (not much, anyway), but this was a no-brainer.) -- Thomas Wouters Hi! I'm a .signature virus! copy me into your .signature file to help me spread! From MarkH@ActiveState.com Thu Nov 2 21:55:07 2000 From: MarkH@ActiveState.com (Mark Hammond) Date: Fri, 3 Nov 2000 08:55:07 +1100 Subject: [Python-Dev] please subscribe me ;-) Message-ID: Hi all, My skippinet.com.au address is temporarily bouncing, and thus I am no longer getting python-dev mail. I re-subscribed on Monday, but am still waiting moderator approval. Can someone approve me please? Thanks, Mark. 
From MarkH@ActiveState.com Thu Nov 2 21:57:58 2000 From: MarkH@ActiveState.com (Mark Hammond) Date: Fri, 3 Nov 2000 08:57:58 +1100 Subject: [Python-Dev] Dont bother subscribing me! Message-ID: All done already - sorry about the noise. Thanks, Mark. From guido@python.org Thu Nov 2 10:06:16 2000 From: guido@python.org (Guido van Rossum) Date: Thu, 02 Nov 2000 05:06:16 -0500 Subject: [Python-Dev] Web Programming Improvements In-Reply-To: Your message of "Thu, 02 Nov 2000 10:34:42 EST." <20001102103442.B5027@kronos.cnri.reston.va.us> References: <200011020329.WAA07330@cj20424-a.reston1.va.home.com> <20001102103442.B5027@kronos.cnri.reston.va.us> Message-ID: <200011021006.FAA19152@cj20424-a.reston1.va.home.com> [Andrew] > Stuff I personally want to get done: > * Finish PEP 222, "Web Programming Improvements" and implement whatever > emerges from it. I just skimmed PEP 222. I agree that the classes defined by cgi.py are unnecessarily arcane. I wonder if it isn't better to just start over rather than trying to add yet another new class to the already top-heavy CGI module??? Regarding file uploads: you *seem* to be proposing that uploaded files should be loaded into memory only. I've got complaints from people who are uploading 10 Mb files and don't appreciate their process growing by that much. Perhaps there should be more options, so that the caller can control the disposition of uploads more carefully? What's wrong with subclassing? Maybe two example subclasses should be provided? Regarding templating -- what's wrong with HTMLgen as a starting point? Just that it's too big? I've never used it myself, but I've always been impressed with its appearance. 
:-) --Guido van Rossum (home page: http://www.python.org/~guido/) From akuchlin@mems-exchange.org Thu Nov 2 22:12:26 2000 From: akuchlin@mems-exchange.org (Andrew Kuchling) Date: Thu, 2 Nov 2000 17:12:26 -0500 Subject: [Python-Dev] Web Programming Improvements In-Reply-To: <200011021006.FAA19152@cj20424-a.reston1.va.home.com>; from guido@python.org on Thu, Nov 02, 2000 at 05:06:16AM -0500 References: <200011020329.WAA07330@cj20424-a.reston1.va.home.com> <20001102103442.B5027@kronos.cnri.reston.va.us> <200011021006.FAA19152@cj20424-a.reston1.va.home.com> Message-ID: <20001102171226.A21364@kronos.cnri.reston.va.us> On Thu, Nov 02, 2000 at 05:06:16AM -0500, Guido van Rossum wrote: >I just skimmed PEP 222. I agree that the classes defined by cgi.py >are unnecessarily arcane. I wonder if it isn't better to just start >over rather than trying to add yet another new class to the already >top-heavy CGI module??? I've wondered about that, too; writing a neat request class that wraps up field values, cookies, and environment variables, and provides convenience functions for re-creating the current URL. Something like the request classes in Zope and Webware. >Regarding file uploads: you *seem* to be proposing that uploaded files >should be loaded into memory only. I've got complaints from people Mrr...? The only file upload reference simply says you shouldn't have to subclass in order to use them; it's not implying files have to be read into memory. (We have to deal with whackingly large mask layout files at work, after all.) >Regarding templating -- what's wrong with HTMLgen as a starting point? >Just that it's too big? I've never used it myself, but I've always >been impressed with its appearance. :-) I, personally, am against including templating, but it was suggested. I'm against it because there are too many solutions with different tradeoffs. Do you want a simple regex search-and-replace, constructing HTML pages as Python objects, or a full-blown minilanguage?
HTML/XML-compatible syntax, ASP-compatible syntax, Python-compatible syntax? Much better just to move templating into the "Rejected" category and give the above rationale. --amk From jeremy@alum.mit.edu Thu Nov 2 22:07:37 2000 From: jeremy@alum.mit.edu (Jeremy Hylton) Date: Thu, 2 Nov 2000 17:07:37 -0500 (EST) Subject: [Python-Dev] Web Programming Improvements In-Reply-To: <20001102171226.A21364@kronos.cnri.reston.va.us> References: <200011020329.WAA07330@cj20424-a.reston1.va.home.com> <20001102103442.B5027@kronos.cnri.reston.va.us> <200011021006.FAA19152@cj20424-a.reston1.va.home.com> <20001102171226.A21364@kronos.cnri.reston.va.us> Message-ID: <14849.58793.132399.810370@bitdiddle.concentric.net> Since today has been busy on the meta-sig, I wonder if we should create a web-sig to thrash out these issues. Jeremy From thomas@xs4all.net Thu Nov 2 22:51:52 2000 From: thomas@xs4all.net (Thomas Wouters) Date: Thu, 2 Nov 2000 23:51:52 +0100 Subject: [Python-Dev] Re: [Patch #102170] move getopt() to Py_GetOpt() and use it unconditionally In-Reply-To: <200011020500.VAA11872@sf-web2.i.sourceforge.net>; from noreply@sourceforge.net on Wed, Nov 01, 2000 at 09:00:43PM -0800 References: <200011020500.VAA11872@sf-web2.i.sourceforge.net> Message-ID: <20001102235152.N12776@xs4all.nl> On Wed, Nov 01, 2000 at 09:00:43PM -0800, noreply@sourceforge.net wrote: > Comment: > Accepted and assigned back to Thomas. > Guido approved of this "in theory" before, so go for it! Well, I changed everything you wanted changed (use _PyOS_ rather than Py_ as prefix, etc) and I was about to check it in, when I got cold feet. This change is going to break programs embedding or extending python, that rely on getopt() (the C function) being available, one way or another. After this move, that is no longer necessarily true. I got cold feet because of Demo/pysrv/pysrv.c, which just adds the necessary externs for getopt(), optind and optarg, and doesn't bother with #include or some such.
Before, that would probably always work, depending on which library/archive was searched first, for the getopt() function. It would only break if system-libraries were searched first, for some reason, and the system libraries provide a broken getopt. In all other cases, like there not being a getopt, getopt needing separate includes or defines, or separate link or compile arguments, the Python getopt would be used, and it would do what was expected (probably, anyway ;) After the change, people need to use the system getopt(), and do their own hoop-jumping to find out if it's there, how to call it, etc. Now I myself have never written anything that would depend on Python providing getopt(), but that doesn't say anything at all. Is this acceptable breakage ? Or should we go the whole nine yards, make it an official API, document it, and provide backwards-compatible symbols on platforms where getopt used to be used ? -- Thomas Wouters Hi! I'm a .signature virus! copy me into your .signature file to help me spread! From guido@python.org Thu Nov 2 13:41:53 2000 From: guido@python.org (Guido van Rossum) Date: Thu, 02 Nov 2000 08:41:53 -0500 Subject: [Python-Dev] Web Programming Improvements In-Reply-To: Your message of "Thu, 02 Nov 2000 17:12:26 EST." <20001102171226.A21364@kronos.cnri.reston.va.us> References: <200011020329.WAA07330@cj20424-a.reston1.va.home.com> <20001102103442.B5027@kronos.cnri.reston.va.us> <200011021006.FAA19152@cj20424-a.reston1.va.home.com> <20001102171226.A21364@kronos.cnri.reston.va.us> Message-ID: <200011021341.IAA19469@cj20424-a.reston1.va.home.com> > >Regarding file uploads: you *seem* to be proposing that uploaded files > >should be loaded into memory only. I've got complaints from people > > Mrr...? The only file upload reference simply says you shouldn't have > to subclass in order to use them; it's not implying files have to be > read into memory. (We have to deal with whackingly large mask layout > files at work, after all.) Mrrauw...?
Do you really have to subclass? I thought it just says that you can subclass if you're not happy with the given make_file() implementation? > >Regarding templating -- what's wrong with HTMLgen as a starting point? > >Just that it's too big? I've never used it myself, but I've always > >been impressed with its appearance. :-) > > I, personally, am against including templating, but it was suggested. > I'm against it because there are too many solutions with different > tradeoffs. Do you want a simple regex search-and-replace, > constructing HTML pages as Python objects, or a full-blown > minilanguage? HTML/XML-compatible syntax, ASP-compatible syntax, > Python-compatible syntax? Much better just to move templating into > the "Rejected" category and give the above rationale. Sure -- I'm perfectly happy with ad-hoc templating solutions myself (see my FAQ wizard). I've also heard Tom Christiansen complain that Perl is slower to start up than Python for CGI work -- because the templating classes are so big, and are all loaded at startup! I do see a use for a helper or helpers to create tables though -- tables are notoriously tag-intensive and hard to get right. --Guido van Rossum (home page: http://www.python.org/~guido/) From guido@python.org Thu Nov 2 13:51:14 2000 From: guido@python.org (Guido van Rossum) Date: Thu, 02 Nov 2000 08:51:14 -0500 Subject: [Python-Dev] Re: [Patch #102170] move getopt() to Py_GetOpt() and use it unconditionally In-Reply-To: Your message of "Thu, 02 Nov 2000 23:51:52 +0100." <20001102235152.N12776@xs4all.nl> References: <200011020500.VAA11872@sf-web2.i.sourceforge.net> <20001102235152.N12776@xs4all.nl> Message-ID: <200011021351.IAA19544@cj20424-a.reston1.va.home.com> > Well, I changed everything you wanted changed (use _PyOS_ rather than Py_ as > prefix, etc) and I was about to check it in, when I got cold feet.
This > change is going to break programs embedding or extending python, that rely > on getopt() (the C function) being available, one way or another. After this > move, that is no longer necessarily true. I got cold feet because of > Demo/pysrv/pysrv.c, which just adds the necessary externs for getopt(), > optind and optarg, and doesn't bother with #include or some such. > > Before, that would probably always work, depending on which library/archive > was searched first, for the getopt() function. It would only break if > system-libraries were searched first, for some reason, and the system > libraries provide a broken getopt. In all other cases, like there not being > a getopt, getopt needing separate includes or defines, or separate link or > compile arguments, the Python getopt would be used, and it would do what was > expected (probably, anyway ;) > > After the change, people need to use the system getopt(), and do their own > hoop-jumping to find out if it's there, how to call it, etc. Now I myself > have never written anything that would depend on Python providing getopt(), > but that doesn't say anything at all. Is this acceptable breakage ? Or > should we go the whole nine yards, make it an official API, document it, and > provide backwards-compatible symbols on platforms where getopt used to be > used ? Don't worry -- getopt has never been something offered by Python. It's something assumed to be in the system library, and if it isn't we provide our own -- but we don't provide it for the benefit of those who embed us. (In fact, if Python is embedded, it will not *use* getopt.) We can't even guarantee to provide it, because it's likely already in the system library. (That's why I like that the prefix begins with an underscore -- those are not provided for use by others, only for internal use.)
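One point worth separating out: only C-level callers are affected. Python-level scripts go through the pure-Python getopt module, which is entirely independent of the C symbol being renamed. A minimal sketch (standard library only; the option string and arguments are made up for illustration):

```python
import getopt

# Parse a typical argument vector the way a small driver script might;
# '-v' takes no argument, '-o' takes one, the rest are positional.
opts, args = getopt.getopt(['-v', '-o', 'out.txt', 'file1'], 'vo:')

assert opts == [('-v', ''), ('-o', 'out.txt')]
assert args == ['file1']
```

So whatever happens to the C-level getopt(), scripts parsing their own sys.argv this way are untouched.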
--Guido van Rossum (home page: http://www.python.org/~guido/) From akuchlin@mems-exchange.org Fri Nov 3 03:57:26 2000 From: akuchlin@mems-exchange.org (Andrew Kuchling) Date: Thu, 2 Nov 2000 22:57:26 -0500 Subject: [Python-Dev] Web Programming Improvements In-Reply-To: <14849.58793.132399.810370@bitdiddle.concentric.net>; from jeremy@alum.mit.edu on Thu, Nov 02, 2000 at 05:07:37PM -0500 References: <200011020329.WAA07330@cj20424-a.reston1.va.home.com> <20001102103442.B5027@kronos.cnri.reston.va.us> <200011021006.FAA19152@cj20424-a.reston1.va.home.com> <20001102171226.A21364@kronos.cnri.reston.va.us> <14849.58793.132399.810370@bitdiddle.concentric.net> Message-ID: <20001102225726.A21509@kronos.cnri.reston.va.us> On Thu, Nov 02, 2000 at 05:07:37PM -0500, Jeremy Hylton wrote: >Since today has been busy on the meta-sig, I wonder if we should >create a web-sig to thrash out these issues. There's already a python-web-modules@egroups.com list, for authors of Python Web frameworks; fairly on-topic for that list, I should think. --amk From Moshe Zadka Fri Nov 3 10:18:26 2000 From: Moshe Zadka (Moshe Zadka) Date: Fri, 3 Nov 2000 12:18:26 +0200 (IST) Subject: [Python-Dev] Re: Dynamic nested scopes In-Reply-To: <200011020610.BAA09073@cj20424-a.reston1.va.home.com> Message-ID: On Thu, 2 Nov 2000, Guido van Rossum wrote: > I think Python 3000 ought to use totally static scoping. That will > make it possible to optimize code using built-in names! Isn't that another way of saying you want the builtin names to be part of the language definition? Part of today's method advantages is that new builtins can be added without any problems. no-clear-win-either-way-ly y'rs, Z. -- Moshe Zadka -- 95855124 http://advogato.org/person/moshez From mal@lemburg.com Fri Nov 3 10:02:13 2000 From: mal@lemburg.com (M.-A.
Lemburg) Date: Fri, 03 Nov 2000 11:02:13 +0100 Subject: [Python-Dev] statically nested scopes References: <14849.37426.860007.989619@bitdiddle.concentric.net> <14849.55887.283432.505910@beluga.mojam.com> <14849.52672.943439.978785@bitdiddle.concentric.net> Message-ID: <3A028D25.B6A227F2@lemburg.com> Jeremy Hylton wrote: > > Looks like we need to rehash this thread at least enough to determine > who is responsible for causing us to rehash it. > > MAL said it would break code. I asked how. Skip and Tim obliged with > examples. I said their examples exhibited bad style; neither of them > claimed they were good style. > > In the end, I observed that while it could break code in theory, I > doubted it really would break much code. Furthermore, I believe that > the code it will break is already obscure so we needn't worry about > it. That's just what I was trying to say all along: statically nested scopes don't buy you anything except maybe for lambdas and nested functions (which is bad style programming, IMHO too). The only true argument for changing scoping I see is that of gained purity in language design... without much practical use. Other issues that need sorting out: x = 2 class C: x = 1 C = 'some string' def a(self): print x def b(self): global x x = 3 class D(C): C = 'some string' def a(self): C.a(self) print C o = C() o.a() o.b() o.a() o = D() o.a() What would the output look like under your proposal ? -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From mal@lemburg.com Fri Nov 3 10:46:59 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Fri, 03 Nov 2000 11:46:59 +0100 Subject: [Python-Dev] Re: Dynamic nested scopes References: Message-ID: <3A0297A3.A778F4FC@lemburg.com> Moshe Zadka wrote: > > On Thu, 2 Nov 2000, Guido van Rossum wrote: > > > I think Python 3000 ought to use totally static scoping. 
That will > > make it possible to do optimize code using built-in names! > > Isn't that another way of saying you want the builtin names to be > part of the language definition? Part of today's method advantages > is that new builtins can be added without any problems. +1. Wouldn't it be more Python-like to provide the compiler with a set of known-to-be-static global name bindings ? A simple way of avoiding optimizations like these: def f(x, str=str): return str(x) + '!' would then be to have the compiler lookup "str" in the globals() passed to it and assign the found value to the constants of the function, provided that "str" appears in the list of known-to-be-static global name bindings (perhaps as optional addition parameter to compile() with some reasonable default in sys.staticsymbols). -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From mal@lemburg.com Fri Nov 3 10:58:10 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Fri, 03 Nov 2000 11:58:10 +0100 Subject: [Python-Dev] statically nested scopes References: <200011020037.NAA29447@s454.cosc.canterbury.ac.nz> <3A015A3A.4EC7DBC6@lemburg.com> <200011020329.WAA07330@cj20424-a.reston1.va.home.com> <3A019F43.E13604BA@lemburg.com> <200011020635.BAA09321@cj20424-a.reston1.va.home.com> Message-ID: <3A029A42.3C11C774@lemburg.com> Guido van Rossum wrote: > > > > > To be honest, I don't think static nested scopes buy us all that > > > > much. You can do the same now, by using keyword arguments which > > > > isn't all that nice, but works great and makes the scope clearly > > > > visible. > > > > > > Yes. It's a hack that gets employed over and over. And it has > > > certain problems. We added 'import as' to get rid of a common > > > practice that was perceived unclean. Maybe we should support nested > > > scopes to get rid of another unclean common practice? 
> > I think the common practice mainly comes from the fact, > > that by making globals locals which can benefit from LOAD_FAST > > you get a noticeable performance boost. > > > > So the "right" solution to these weird looking hacks would > > be to come up with a smart way by which the Python compiler > > itself can do the localizing. > > Can you elaborate? I don't understand what you are proposing here. See my other post... I would like to have the compiler do the localization for me in case it sees a global which has been "defined" static. > > Nested scopes won't help eliminating the current keyword > > practice. > > Why not? I'd say that > > def create_adder(n): > def adder(x, n=n): return x+n > return adder > > is a hack and that nested scopes can fix this by allowing you to write > > def create_adder(n): > def adder(x): return x+n > return adder > > like one would expect. (Don't tell me that it isn't a FAQ why this > doesn't work!) I know... still, I consider function definitions within a function bad style. Maybe just me, though ;-) > > > I'm not saying that we definitely should add this to 2.1 (there's > > > enough on our plate already) but we should at least consider it, and > > > now that we have cycle GC, the major argument against it (that it > > > causes cycles) is gone... > > > > Hmm, so far the only argument for changing Python lookups > > was to allow writing lambdas without keyword hacks. Does this > > really warrant breaking code ? > > > > What other advantages would statically nested scopes have ? > > Doing what's proper. Nested scopes are not a bad idea. They weren't > implemented because they were hard to get right (perhaps impossible > without creating cycles), and I was okay with that because I didn't > like them; but I've been convinced that examples like the second > create_adder() above should really work. Just like float+int works > (in Python 0.1, this was a type-error -- you had to cast the int arg > to a float to get a float result).
Ok, but how does nested scoping mix with class definitions and global lookups then ? -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From mal@lemburg.com Fri Nov 3 12:50:17 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Fri, 03 Nov 2000 13:50:17 +0100 Subject: [Python-Dev] statically nested scopes References: <14849.37426.860007.989619@bitdiddle.concentric.net> <14849.55887.283432.505910@beluga.mojam.com> <14849.52672.943439.978785@bitdiddle.concentric.net> <3A028D25.B6A227F2@lemburg.com> Message-ID: <3A02B489.89EF108C@lemburg.com> "M.-A. Lemburg" wrote: > > Jeremy Hylton wrote: > > > > Looks like we need to rehash this thread at least enough to determine > > who is responsible for causing us to rehash it. > > > > MAL said it would break code. I asked how. Skip and Tim obliged with > > examples. I said their examples exhibited bad style; neither of them > > claimed they were good style. > > > > In the end, I observed that while it could break code in theory, I > > doubted it really would break much code. Furthermore, I believe that > > the code it will break is already obscure so we needn't worry about > > it. > > That's just what I was trying to say all along: statically > nested scopes don't buy you anything except maybe for lambdas > and nested functions (which is bad style programming, IMHO too). > > The only true argument for changing scoping I see is that > of gained purity in language design... without much practical > use. > > Other issues that need sorting out: > > x = 2 > class C: > x = 1 > C = 'some string' > def a(self): > print x > def b(self): > global x > x = 3 > > class D(C): > C = 'some string' > def a(self): > C.a(self) > print C > > o = C() > o.a() > o.b() > o.a() > > o = D() > o.a() > > What would the output look like under your proposal ? 
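A reduced, runnable version of MAL's example shows the behavior everyone is relying on today: the class body is not an enclosing scope for methods, so a bare name inside a method skips the class attributes and goes straight to the module globals. A sketch with simplified names:

```python
x = 'module'

class C:
    x = 'class'
    def get(self):
        # Under Python's scoping rules the class body is NOT an
        # enclosing scope for methods: this lookup skips C.x and
        # finds the module-level x instead.
        return x

assert C().get() == 'module'   # not 'class'
assert C.x == 'class'          # the class attribute still exists
```

This is the behavior the nested-scopes proposal would have to preserve to avoid breaking code like the example above.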
[Moshe pointed out to me in private mail that the above would continue to work as it does now due to a difference being made between class and function scoping categories] More questions: How are you going to explain the different scoping categories to a newbie ? What if you define a class within a method ? How can you explicitly attach a dynamically defined class to a certain scope ? More problems (?!): Nested scopes will introduce cycles in all frame objects. This means that with GC turned off, frame objects will live forever -- Python will eat up memory at a very fast pace. BTW, Python's GC only works for a few builtin types (frames are not among the supported types): what if a frame finds its way into a user defined type ? Will GC still be able to clean up the cycles ? Perhaps I'm just being silly, but I still don't see the benefits of breaking today's easy-to-grasp three level scoping rules... -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From guido@python.org Fri Nov 3 13:05:08 2000 From: guido@python.org (Guido van Rossum) Date: Fri, 03 Nov 2000 08:05:08 -0500 Subject: [Python-Dev] Re: Dynamic nested scopes In-Reply-To: Your message of "Fri, 03 Nov 2000 12:18:26 +0200." References: Message-ID: <200011031305.IAA22103@cj20424-a.reston1.va.home.com> > On Thu, 2 Nov 2000, Guido van Rossum wrote: > > > I think Python 3000 ought to use totally static scoping. That will > > make it possible to optimize code using built-in names! [Moshe] > Isn't that another way of saying you want the builtin names to be > part of the language definition? Part of today's method advantages > is that new builtins can be added without any problems. The built-in names have always been part of the language definition in my mind. The way they are implemented doesn't reflect this, but that's just an implementation detail.
How would you like it if something claimed to be Python but didn't support len()? Or map()? That doesn't mean you can't add new built-ins, and I don't think that the new implementation will prevent that -- but it *will* assume that you don't mess with the definitions of the existing built-ins. Of course you still will be able to define functions whose name overrides a built-in -- in that case the compiler can see that you're doing that (because it knows the scope rules and can see what you are doing). But you won't be able to confuse someone else's module by secretly sticking a replacement built-in into their module's __dict__. --Guido van Rossum (home page: http://www.python.org/~guido/) From guido@python.org Fri Nov 3 13:12:11 2000 From: guido@python.org (Guido van Rossum) Date: Fri, 03 Nov 2000 08:12:11 -0500 Subject: [Python-Dev] statically nested scopes In-Reply-To: Your message of "Fri, 03 Nov 2000 11:02:13 +0100." <3A028D25.B6A227F2@lemburg.com> References: <14849.37426.860007.989619@bitdiddle.concentric.net> <14849.55887.283432.505910@beluga.mojam.com> <14849.52672.943439.978785@bitdiddle.concentric.net> <3A028D25.B6A227F2@lemburg.com> Message-ID: <200011031312.IAA22141@cj20424-a.reston1.va.home.com> [MAL] > That's just what I was trying to say all along: statically > nested scopes don't buy you anything except maybe for lambdas > and nested functions That's a tautology -- that's what nested scopes are FOR! > (which is bad style programming, IMHO too). Not always. The keyword argument hack is so common that it must serve a purpose, and that's what we're trying to fix -- for lambda's *and* nested functions (which are semantically equivalent anyway). > The only true argument for changing scoping I see is that > of gained purity in language design... without much practical > use. But there is practical use: get rid of the unintuitive, unobvious and fragile keyword argument hack. 
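The two create_adder() variants quoted in this thread can be put side by side. Under the statically nested scopes being proposed (what later became PEP 227), both behave identically; before that, only the default-argument version worked. A sketch, runnable on any Python with nested scopes:

```python
def create_adder_hack(n):
    # The keyword-argument hack: capture n via a default argument,
    # which is evaluated once, at function-definition time.
    def adder(x, n=n):
        return x + n
    return adder

def create_adder(n):
    # Relies on statically nested scopes: adder reads n directly
    # from the enclosing function's scope.
    def adder(x):
        return x + n
    return adder

assert create_adder_hack(5)(2) == 7
assert create_adder(5)(2) == 7   # works once nested scopes are in
```

The second form is exactly the FAQ case Guido mentions: without nested scopes, the inner n is neither local nor global, and the lookup fails.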
> Other issues that need sorting out: > > x = 2 > class C: > x = 1 > C = 'some string' > def a(self): > print x > def b(self): > global x > x = 3 > > class D(C): > C = 'some string' > def a(self): > C.a(self) > print C > > o = C() > o.a() > o.b() > o.a() > > o = D() > o.a() > > What would the output look like under your proposal ? This is a good point! If we considered the class as a nested scope here, I think it might break too much code, plus it would allow a new coding style where you could reference class variables without a self or prefix. I don't like that prospect, so I'm in favor of ruling this out. --Guido van Rossum (home page: http://www.python.org/~guido/) From barry@wooz.org Fri Nov 3 14:16:20 2000 From: barry@wooz.org (Barry A. Warsaw) Date: Fri, 3 Nov 2000 09:16:20 -0500 (EST) Subject: [Python-Dev] Re: Dynamic nested scopes References: <200011031305.IAA22103@cj20424-a.reston1.va.home.com> Message-ID: <14850.51380.7327.977719@anthem.concentric.net> >>>>> "GvR" == Guido van Rossum writes: GvR> The built-in names have always been part of the language GvR> definition in my mind. The way they are implemented doesn't GvR> reflect this, but that's just an implementation detail. How GvR> would you like it if something claimed to be Python but GvR> didn't support len()? Or map()? GvR> That doesn't mean you can't add new built-ins, and I don't GvR> think that the new implementation will prevent that -- but it GvR> *will* assume that you don't mess with the definitions of the GvR> existing built-ins. GvR> Of course you still will be able to define functions whose GvR> name overrides a built-in -- in that case the compiler can GvR> see that you're doing that (because it knows the scope rules GvR> and can see what you are doing). But you won't be able to GvR> confuse someone else's module by secretly sticking a GvR> replacement built-in into their module's __dict__. I'm a little confused.
I've occasionally done the following within an application:

----------driver.py
# need to override built-in open() to do extra debugging
def debugging_open(filename, mode, bufsize):
    # ...

if EXTRA_DEBUGGING:
    import __builtin__
    __builtin__.__dict__['open'] = debugging_open
-------------------- snip snip --------------------

Would this be illegal? Would other modules in my application (even if imported from the standard library!) automatically get debugging_open() for open() like they do now? -Barry From gward@mems-exchange.org Fri Nov 3 14:33:03 2000 From: gward@mems-exchange.org (Greg Ward) Date: Fri, 3 Nov 2000 09:33:03 -0500 Subject: [Python-Dev] build problems under Solaris In-Reply-To: <200010281051.MAA01290@loewis.home.cs.tu-berlin.de>; from martin@loewis.home.cs.tu-berlin.de on Sat, Oct 28, 2000 at 12:51:20PM +0200 References: <200010281051.MAA01290@loewis.home.cs.tu-berlin.de> Message-ID: <20001103093303.A2683@ludwig.cnri.reston.va.us> On 28 October 2000, Martin v. Loewis said: > > I don't have access to a Solaris machine, so I can't do anything to > > help these users. > > The patch in 117606 looks right to me: gcc on Solaris (and on any > other platform) needs -shared to build a shared library; configure > currently passes -G. I haven't actually tried the patch, since it is a > pain to extract it from the SF bug report page. What happens is that > gcc passes -G to the linker, which correctly produces a shared > library. However, gcc also links crt1/crti into the library, which > causes the reference to main. Well, I do have access to a Solaris machine -- I try not to use it if I don't have to, and about the only purpose it has these days is occasionally building Python to make sure it still works.
Incidentally, I'm the one who changed "ld -G" to "$(CC) -G" -- see revision 1.124 of configure.in: revision 1.124 date: 2000/05/26 12:22:54; author: gward; state: Exp; lines: +6 -2 When building on Solaris and the compiler is GCC, use '$(CC) -G' to create shared extensions rather than 'ld -G'. This ensures that shared extensions link against libgcc.a, in case there are any functions in the GCC runtime not already in the Python core. I think the checkin message there is fairly clear; the context was in using Robin Dunn's extension for BSDDB 2.x, which does some 64-bit arithmetic deep down inside. Turned out that GCC compiled a 64-bit divide into a function call, and that function is in GCC's own runtime library. Using "ld -G" -- that's Sun's linker, which knows nothing about GCC's runtime library -- the function in question wasn't available, so loading the extension failed. I assume that if *Python* did a 64-bit divide somewhere in *its* guts, that function would have been available (linked into the python binary), which is why this problem doesn't come up very often -- Python probably does use most of GCC's runtime. ;-) Anyways, I thought the patch in bug #117606 looked fine, so I tried it out. Not so good; here's what happens when I try to build arraymodule.so (the first extension, alphabetically) with "gcc -shared": Text relocation remains referenced against symbol offset in file _PyObject_NewVar 0x654 arraymodule.o 0x26cc arraymodule.o 0x26c8 arraymodule.o [...many many symbols...] PyErr_Occurred 0x1274 arraymodule.o PyErr_Occurred 0x4a0 arraymodule.o PyErr_Occurred 0x22d8 arraymodule.o PyErr_Occurred 0x115c arraymodule.o PyErr_Occurred 0x368 arraymodule.o PyErr_Occurred 0x1074 arraymodule.o PyErr_Occurred 0x1f50 arraymodule.o PyInt_FromLong 0x3f4 arraymodule.o [...] 
_Py_NoneStruct 0x19d4 arraymodule.o Py_InitModule4 0x26b8 arraymodule.o ld: fatal: relocations remain against allocatable but non-writable sections collect2: ld returned 1 exit status All told, there were 500+ relocation errors in that one extension. About half were "Py" or "_Py" symbols; a bunch were "", and the rest were 'malloc', 'memcpy', 'sprintf', and other standard library functions. So apparently "-shared" tells GCC to forget everything it knows about linking C code and be as stupid as it can. Hmmm. I tried adding "../libpython2.0.a", and "-L.. -lpython2.0", and instead got 20,000 relocation errors, since of course libpython2.0.a needs a lot more symbols than arraymodule.o. Hmmm. I have no idea what's going on here. I've updated the bug report, and I am definitely -1 on "gcc -shared" for GCC on Solaris! Unless, of course, there are other linker options that make it work right... Greg From gmcm@hypernet.com Fri Nov 3 14:48:29 2000 From: gmcm@hypernet.com (Gordon McMillan) Date: Fri, 3 Nov 2000 09:48:29 -0500 Subject: [Python-Dev] Re: Dynamic nested scopes In-Reply-To: <14850.51380.7327.977719@anthem.concentric.net> Message-ID: <3A0289ED.6247.EB4DD0E@localhost> > > >>>>> "GvR" == Guido van Rossum writes: > GvR> Of course you still will be able to define functions whose > GvR> name overrides a built-in -- in that case the compiler can > GvR> see that you're doing that (because it knows the scope rules > GvR> and can see what you are doing). But you won't be able to > GvR> confuse someone else's module by secretly sticking a > GvR> replacement built-in into their module's __dict__. [A Craven Dog overrides builtin open...] > Would this be illegal? Would other modules in my application (even if > imported from the standard library!) automatically get > debugging_open() for open() like they do now? I have always found it very elegant & powerful that keyword "import" calls builtin __import__, and that well-behaved C goes through the same hook.
In a similar vein, I have for quite awhile wanted to experiment with mountable virtual file systems so that I can "mount" a (for example) MetaKit database at /cheesewhiz and other modules in my app, when navigating into /cheesewhiz will, unbeknownst to them, be reading from the MetaKit DB (these ideas leaked from tcl-land by Jean-Claude). I most definitely appreciate that these facilities are dangerous and this type of stuff tends to be abused gratuitously (eg, most import hacks), but disallowing them might be considered a gratuitous limitation (heck - Barry knows how to bypass the governor on every cheezewhizzer ever manufactured). - Gordon From guido@python.org Fri Nov 3 15:36:22 2000 From: guido@python.org (Guido van Rossum) Date: Fri, 03 Nov 2000 10:36:22 -0500 Subject: [Python-Dev] Re: Dynamic nested scopes In-Reply-To: Your message of "Fri, 03 Nov 2000 09:16:20 EST." <14850.51380.7327.977719@anthem.concentric.net> References: <200011031305.IAA22103@cj20424-a.reston1.va.home.com> <14850.51380.7327.977719@anthem.concentric.net> Message-ID: <200011031536.KAA22620@cj20424-a.reston1.va.home.com> [Barry] > I'm a little confused. I've occasionally done the following within an > application: > > ----------driver.py > # need to override built-in open() to do extra debugging > def debuggin_open(filename, mode, bufsize): > # ... > if EXTRA_DEBUGGING: > import __builtin__.__dict__['open'] = debugging_open > -------------------- snip snip -------------------- > > Would this be illegal? Would other modules in my application (even if > imported from the standard library!) automatically get > debugging_open() for open() like they do now? That's up for discussion. Note that the open() function is special in this respect -- I don't see you doing the same to range() or hash(). If this is deemed a useful feature (for open()), we can make a rule about which built-ins you cannot override like this and which ones you can. 
--Guido van Rossum (home page: http://www.python.org/~guido/) From barry@wooz.org Fri Nov 3 15:45:05 2000 From: barry@wooz.org (Barry A. Warsaw) Date: Fri, 3 Nov 2000 10:45:05 -0500 (EST) Subject: [Python-Dev] Re: Dynamic nested scopes References: <200011031305.IAA22103@cj20424-a.reston1.va.home.com> <14850.51380.7327.977719@anthem.concentric.net> <200011031536.KAA22620@cj20424-a.reston1.va.home.com> Message-ID: <14850.56705.120643.110016@anthem.concentric.net> >>>>> "GvR" == Guido van Rossum writes: GvR> That's up for discussion. Note that the open() function is GvR> special in this respect -- I don't see you doing the same to GvR> range() or hash(). Me neither, but special casing overridability seems like a fragile hack. GvR> If this is deemed a useful feature (for open()), we can make GvR> a rule about which built-ins you cannot override like this GvR> and which ones you can. Hmm, maybe we need __open__() and an open-hook? ;) -Barry From guido@python.org Fri Nov 3 16:07:13 2000 From: guido@python.org (Guido van Rossum) Date: Fri, 03 Nov 2000 11:07:13 -0500 Subject: [Python-Dev] New Wiki-based Python 2.0 Info Area Message-ID: <200011031607.LAA22863@cj20424-a.reston1.va.home.com> At our PythonLabs group meeting last Tuesday we decided that we needed to set up a Patches page and a FAQ for Python 2.0. (This because we don't see a reason yet to issue a bugfix release, but we do need to answer some common questions and point people to some patches.) I figured that we could do ourselves a favor by making this a set of dynamic pages maintained wiki-style, rather than static pages (especially since I've managed to personally become the bottleneck for editing the static pages, until the move of python.org to a DC hosted machine is complete :-). So, when I saw the announcement of MoinMoin, a Wiki clone written in Python as a single CGI script, I decided to try it -- instead of ZWiki, which might be the obvious choice given my new employer. 
This is not because I don't like Zope or ZWiki, but because we can learn from using different implementations of the same idea. So, I humbly present the Python 2.0 Info Area: http://www.python.org/cgi-bin/moinmoin I've added one critical patch, three non-critical (misc) patches, answers to two frequent bug reports to the FAQ page, and a bunch of links to the front page. Note that to generate patches, I use SourceForge cvsweb's diff feature. The URLs are ugly, but only the page editors see these, and it saves having to store copies of the patches. I'd like to get some feedback from the python-dev group before I link it into the 2.0 home page. --Guido van Rossum (home page: http://www.python.org/~guido/) From thomas@xs4all.net Fri Nov 3 16:12:04 2000 From: thomas@xs4all.net (Thomas Wouters) Date: Fri, 3 Nov 2000 17:12:04 +0100 Subject: [Python-Dev] Re: Dynamic nested scopes In-Reply-To: <200011031536.KAA22620@cj20424-a.reston1.va.home.com>; from guido@python.org on Fri, Nov 03, 2000 at 10:36:22AM -0500 References: <200011031305.IAA22103@cj20424-a.reston1.va.home.com> <14850.51380.7327.977719@anthem.concentric.net> <200011031536.KAA22620@cj20424-a.reston1.va.home.com> Message-ID: <20001103171203.G28658@xs4all.nl> On Fri, Nov 03, 2000 at 10:36:22AM -0500, Guido van Rossum wrote: > [Barry] > > Would this [replacing builtins] be illegal? > That's up for discussion. Note that the open() function is special in > this respect -- I don't see you doing the same to range() or hash(). Eep. I don't care much about scoping, so I'm not going to say much about that, but I certainly do care about Python's flexibility. One of the great things is that there are so few special cases, that (nearly) everything is delightfully consistent. Being able to override open() but not hash() or range() sounds directly contrary to that, at least to me.
Being able to change __builtins__ *just like any other dict* strikes me as terribly Pythonic, though I realize this is probably contrary to Guido's view of Python :-) It also makes for instance 'exec' and 'eval' terribly obvious. I understand where the wish to restrict replacement and shadowing of builtins comes from, but I'm hoping here that this is going to be 'optional', like Perl's -w and 'use strict'. Defaulting to maximum simplicity and maximum flexibility (in that order:) but with optional warnings (when shadowing/modifying a builtin) and optimizations (using constants for builtins, when not modifying or shadowing them, for instance.) Just-my-fl.0,04-ly y'rs, -- Thomas Wouters Hi! I'm a .signature virus! copy me into your .signature file to help me spread! From thomas@xs4all.net Fri Nov 3 16:31:07 2000 From: thomas@xs4all.net (Thomas Wouters) Date: Fri, 3 Nov 2000 17:31:07 +0100 Subject: [Python-Dev] New Wiki-based Python 2.0 Info Area In-Reply-To: <200011031607.LAA22863@cj20424-a.reston1.va.home.com>; from guido@python.org on Fri, Nov 03, 2000 at 11:07:13AM -0500 References: <200011031607.LAA22863@cj20424-a.reston1.va.home.com> Message-ID: <20001103173107.H28658@xs4all.nl> On Fri, Nov 03, 2000 at 11:07:13AM -0500, Guido van Rossum wrote: > So, I humbly present the Python 2.0 Info Area: > http://www.python.org/cgi-bin/moinmoin > I've added one critical patch, three non-critical (misc) patches, Looks good. I'd never seen a Wiki thing before, and the naming requirements take a bit of getting used to, but I think it looks great, regardless :) It also reminds me that we still need to fix the erroneous conclusion by configure that BSDI has large file support, just because it has an off_t type that is 64 bit. (It does have that, it just doesn't use that in any of the seek/tell functions available.) Trent, you were making noises about looking at it, when I left for ApacheCon. Did you ever get to look at it ?
If not, I'll see if I can figure it out ;P Once it's fixed, I think it should be added to CriticalPatches, but it might not be as straightforward as pointing to a cvsweb URL ;P -- Thomas Wouters Hi! I'm a .signature virus! copy me into your .signature file to help me spread! From loewis@informatik.hu-berlin.de Fri Nov 3 18:51:49 2000 From: loewis@informatik.hu-berlin.de (Martin von Loewis) Date: Fri, 3 Nov 2000 19:51:49 +0100 (MET) Subject: [Python-Dev] build problems under Solaris Message-ID: <200011031851.TAA10703@pandora.informatik.hu-berlin.de> > Text relocation remains referenced > against symbol offset in file [...] > I have no idea what's going on here. I've updated the bug report, > and I am definitely -1 on "gcc -shared" for GCC on Solaris! Unless, > of course, there are other linker options that make it work right... That happens only when using the system linker (/usr/ccs/bin/ld). The GNU linker won't complain, and the resulting executables will run fine. To make the system linker happy, you also need to compile the modules with -fPIC - which, according to Sun, we should have done all the time when building shared libraries. The linker complains about relocations in the text segment, which shouldn't be there - shared libraries should be position independent. Alternatively, you can use 'gcc -shared -mimpure-text'; that does not request that remaining relocations cause errors. Regards, Martin From mal@lemburg.com Fri Nov 3 19:08:17 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Fri, 03 Nov 2000 20:08:17 +0100 Subject: [Python-Dev] zipfile.py and InfoZIP files Message-ID: <3A030D21.413B8C8A@lemburg.com> I'm having trouble opening ZIP files created using InfoZIP's zip utility (which uses zlib) with zipfile.py: >>> x = z.read('README') Traceback (innermost last): File "", line 1, in ? 
File "/home/lemburg/lib/zipfile.py", line 242, in read bytes = dc.decompress(bytes) zlib.error: Error -3 while decompressing: incomplete dynamic bit lengths tree Is this due to the installed zlib on my system being incompatible, or is this a bug in zipfile.py ? I have libz version 1.1.3 and zip version 2.2. Also, I wonder why zipfile forces the mode flag to be 'r', 'w' and 'a' -- wouldn't it make more sense to only add 'b', etc. to the mode flag instead ?! The ZipFile is also missing some kind of method which extracts files in the ZIP archive to a file-like object. This would be very useful for extracting large files from a ZIP archive without having to first read in the whole file into memory. Thanks, -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From skip@mojam.com (Skip Montanaro) Fri Nov 3 21:04:44 2000 From: skip@mojam.com (Skip Montanaro) (Skip Montanaro) Date: Fri, 3 Nov 2000 15:04:44 -0600 (CST) Subject: [Python-Dev] Re: Dynamic nested scopes In-Reply-To: <200011031536.KAA22620@cj20424-a.reston1.va.home.com> References: <200011031305.IAA22103@cj20424-a.reston1.va.home.com> <14850.51380.7327.977719@anthem.concentric.net> <200011031536.KAA22620@cj20424-a.reston1.va.home.com> Message-ID: <14851.10348.715861.488845@beluga.mojam.com> Guido> If this is deemed a useful feature (for open()), we can make a Guido> rule about which built-ins you cannot override like this and Guido> which ones you can. I thought we were all adults... For Py3k I think it should be sufficient to define the semantics of the builtin functions so that if people want to override them they can, but that overriding them in incompatible ways is likely to create some problems. (They might have to run with a "no optimize" flag to keep the compiler from assuming semantics, for instance.) 
I see no particular reason to remove the current behavior unless there are clear instances where something important is not going to work properly. Modifying builtins seems to me to be akin to linking a C program with a different version of malloc. As long as the semantics of the new functions remain the same as the definition, everyone's happy. You can have malloc leave a logfile behind or keep histograms of allocation sizes. If someone links in a malloc library that only returns a pointer to a region that's only half the requested size though, you're likely to run into problems. Skip From skip@mojam.com (Skip Montanaro) Fri Nov 3 21:17:12 2000 From: skip@mojam.com (Skip Montanaro) (Skip Montanaro) Date: Fri, 3 Nov 2000 15:17:12 -0600 (CST) Subject: [Python-Dev] New Wiki-based Python 2.0 Info Area In-Reply-To: <200011031607.LAA22863@cj20424-a.reston1.va.home.com> References: <200011031607.LAA22863@cj20424-a.reston1.va.home.com> Message-ID: <14851.11096.337480.434853@beluga.mojam.com> Guido> I'd like to get some feedback from the python-dev group before I Guido> link it into the 2.0 home page. Looks good to me. Skip From mal@lemburg.com Fri Nov 3 20:14:17 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Fri, 03 Nov 2000 21:14:17 +0100 Subject: [Python-Dev] Re: Dynamic nested scopes References: <200011031305.IAA22103@cj20424-a.reston1.va.home.com> <14850.51380.7327.977719@anthem.concentric.net> <200011031536.KAA22620@cj20424-a.reston1.va.home.com> <14851.10348.715861.488845@beluga.mojam.com> Message-ID: <3A031C99.7F7DD8E2@lemburg.com> [Making builtins static...] What about the idea to have the compiler make the decision whether a global symbol may be considered static based on a dictionary ? In non-optimizing mode this dictionary would be empty, with -O it would include all builtins which should never be overloaded and with -OO even ones which can be overloaded such as open() in addition to some standard modules which are known to only contain static symbols. 
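A sketch of how such a dictionary-driven decision might look (all names here are hypothetical illustrations, not an actual compiler API):

```python
# Hypothetical sketch of the idea above: the compiler consults a table,
# keyed by optimization level, of global names it may treat as static.
STATIC_SYMBOLS = {
    0: set(),                            # non-optimizing mode: empty dict
    1: {"len", "range", "id", "hash"},   # -O: builtins never overloaded
    2: {"len", "range", "id", "hash",
        "open"},                         # -OO: even overloadable ones
}

def may_treat_as_static(name, opt_level):
    """Return True if the compiler may assume `name` is the builtin."""
    return name in STATIC_SYMBOLS.get(opt_level, set())
```

With a table like this, deciding whether a name lookup can be folded to a constant is a single set-membership test at compile time.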
Perhaps this needs some additional help of the "define" statement we planned as dynamic compiler interface ?! ... define static_symbols = * # all globals in this module define static_symbols = func1, func2 # only these two etc. -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From guido@python.org Fri Nov 3 20:31:51 2000 From: guido@python.org (Guido van Rossum) Date: Fri, 03 Nov 2000 15:31:51 -0500 Subject: [Python-Dev] Re: Dynamic nested scopes In-Reply-To: Your message of "Fri, 03 Nov 2000 15:04:44 CST." <14851.10348.715861.488845@beluga.mojam.com> References: <200011031305.IAA22103@cj20424-a.reston1.va.home.com> <14850.51380.7327.977719@anthem.concentric.net> <200011031536.KAA22620@cj20424-a.reston1.va.home.com> <14851.10348.715861.488845@beluga.mojam.com> Message-ID: <200011032031.PAA23630@cj20424-a.reston1.va.home.com> > Guido> If this is deemed a useful feature (for open()), we can make a > Guido> rule about which built-ins you cannot override like this and > Guido> which ones you can. [Skip] > I thought we were all adults... And consenting as well... :-) > For Py3k I think it should be sufficient to define the semantics of the > builtin functions so that if people want to override them they can, but that > overriding them in incompatible ways is likely to create some problems. > (They might have to run with a "no optimize" flag to keep the compiler from > assuming semantics, for instance.) I see no particular reason to remove the > current behavior unless there are clear instances where something important > is not going to work properly. > > Modifying builtins seems to me to be akin to linking a C program with a > different version of malloc. As long as the semantics of the new functions > remain the same as the definition, everyone's happy. You can have malloc > leave a logfile behind or keep histograms of allocation sizes. 
If someone > links in a malloc library that only returns a pointer to a region that's > only half the requested size though, you're likely to run into problems. Actually, the C standard specifically says you are *not* allowed to override standard library functions like malloc(). I'm thinking of the example of the rules in Fortran for intrinsic functions (Fortran's name for built-ins). Based on what Tim has told me, I believe that Fortran by default assumes that if you're not doing anything funky with intrinsics (like sin, cos, tan) it can use a shortcut, e.g. inline them. But there are also real functions by these names in the Fortran standard library, and you can call those by declaring e.g. "external function sin". (There may also be an explicit way to say that you're happy with the intrinsic one.) I believe that when you use the external variant, they may be overridden by the user. I'm thinking of something similar here for Python. If the bytecode compiler knows that the builtins are vanilla, it can generate better (== more efficient) code for e.g. for i in range(10): ... Ditto for expressions like len(x) -- the len() operation is typically so fast that the cost is dominated by the two dict lookup operations (first in globals(), then in __builtins__). Why am I interested in this? People interested in speed routinely use hacks that copy a built-in function into a local variable so that they don't have dict lookups in their inner loop; it's really silly to have to do this, and if certain built-ins were recognized by the compiler it wouldn't be necessary. There are other cases where this is not so easy without much more analysis; but the built-ins (to me) seem low-hanging fruit. (Search the archives for that term, I've used it before in this context.) I assume that it's *really* unlikely that there are people patching the __builtin__ module to replace the functions that are good inline candidates (range, len, id, hash and so on).
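The speed hack described above is the familiar default-argument idiom: bind the builtin to a local name once, so each call in the inner loop is a fast local lookup instead of a globals-then-__builtins__ dict lookup. A small illustration (the helper name is my own, not from the thread):

```python
def total_length(items, _len=len):
    # _len is bound once, at function definition time; each _len(item)
    # call is a local lookup, avoiding the globals() and __builtins__
    # dict lookups that a bare len(item) would perform per iteration.
    total = 0
    for item in items:
        total += _len(item)
    return total
```

If the compiler could safely assume len() is the builtin, the plain spelling would be just as fast and this trick would be unnecessary.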
So I'm interested in complicating the rules here. I'd be happy to make an explicit list of those builtins that should not be messed with, as part of the language definition. Programs that *do* mess with these have undefined semantics. --Guido van Rossum (home page: http://www.python.org/~guido/) From mal@lemburg.com Fri Nov 3 20:50:48 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Fri, 03 Nov 2000 21:50:48 +0100 Subject: [Python-Dev] Re: [Python-checkins] CVS: python/dist/src/Lib/test/output test_unicodedata,1.3,1.4 References: <200011032024.MAA26242@slayer.i.sourceforge.net> Message-ID: <3A032528.D743FC0A@lemburg.com> Fredrik Lundh wrote: > > Update of /cvsroot/python/python/dist/src/Lib/test/output > In directory slayer.i.sourceforge.net:/tmp/cvs-serv25791/lib/test/output > > Modified Files: > test_unicodedata > Log Message: > > Added 38,642 missing characters to the Unicode database (first-last > ranges) -- but thanks to the 2.0 compression scheme, this doesn't add > a single byte to the resulting binaries (!) > > Closes bug #117524 Cool :-) -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From gmcm@hypernet.com Fri Nov 3 20:56:56 2000 From: gmcm@hypernet.com (Gordon McMillan) Date: Fri, 3 Nov 2000 15:56:56 -0500 Subject: [Python-Dev] Re: Dynamic nested scopes In-Reply-To: <200011032031.PAA23630@cj20424-a.reston1.va.home.com> References: Your message of "Fri, 03 Nov 2000 15:04:44 CST." <14851.10348.715861.488845@beluga.mojam.com> Message-ID: <3A02E048.17812.10062D77@localhost> > > Guido> If this is deemed a useful feature (for open()), we can make a > > Guido> rule about which built-ins you cannot override like this and > > Guido> which ones you can. I think I would be satisfied with just those builtins which involve interfaces to the external world.
Where Java allows such deviance, they tend to provide an API whereby you can supply or register a factory to override or extend the default behavior. In principle this seems less hackish than stomping on builtins; in practice I doubt it makes much difference ;-). > [Skip] > > I thought we were all adults... Snort. One of my kids will be voting on Tuesday, but I *still* don't know what I want to be when I grow up. - Gordon From trentm@ActiveState.com Fri Nov 3 20:51:41 2000 From: trentm@ActiveState.com (Trent Mick) Date: Fri, 3 Nov 2000 12:51:41 -0800 Subject: 64-bit stuff on BSDI (was: Re: [Python-Dev] New Wiki-based Python 2.0 Info Area) In-Reply-To: <20001103173107.H28658@xs4all.nl>; from thomas@xs4all.net on Fri, Nov 03, 2000 at 05:31:07PM +0100 References: <200011031607.LAA22863@cj20424-a.reston1.va.home.com> <20001103173107.H28658@xs4all.nl> Message-ID: <20001103125141.I20329@ActiveState.com> On Fri, Nov 03, 2000 at 05:31:07PM +0100, Thomas Wouters wrote: > It also reminds me that we still need to fix the erroneous conclusion by > configure that BSDI has large file support, just because it has an off_t > type that is 64 bit. (It does have that, it just doesn't use that in any of > the seek/tell functions available.) Trent, you were making noises about > looking at it, when I left for ApacheCon. Did you ever get to look at it ? No. I was in Washington D.C. for the past week at SD 2000 (and getting occasionally verbally abused by Tim about 64-bit stuff). > If not, I'll see if I can figure it out ;P I can answer questions about fixing it but you can test it, so better for you to submit the patch. > Once it's fixed, I think it > should be added to CriticalPatches, but it might not be as straightforward > as pointing to a cvsweb URL ;P > Sure. Yes, it will probably be more than one file (configure.in, configure, and fileobject.c). Whatever, I don't think it should be a problem to put up the actual checkin patch rather than a link. Thanks!
Trent -- Trent Mick TrentM@ActiveState.com From gward@mems-exchange.org Fri Nov 3 22:45:00 2000 From: gward@mems-exchange.org (Greg Ward) Date: Fri, 3 Nov 2000 17:45:00 -0500 Subject: [Python-Dev] build problems under Solaris In-Reply-To: <200011031851.TAA10703@pandora.informatik.hu-berlin.de>; from loewis@informatik.hu-berlin.de on Fri, Nov 03, 2000 at 07:51:49PM +0100 References: <200011031851.TAA10703@pandora.informatik.hu-berlin.de> Message-ID: <20001103174500.B6683@ludwig.cnri.reston.va.us> On 03 November 2000, Martin von Loewis said: > That happens only when using the system linker (/usr/ccs/bin/ld). The > GNU linker won't complain, and the resulting executables will run > fine. I'm not sure which linker my GCC on Solaris is using. (Even though, umm, I built that GCC. Errr...) But ISTR that the choice of linker is made when you build GCC, so it's not really an option here. Bad enough to tell people that they need to reinstall Python because they need some extension; requiring them to rebuild GCC -- ! But it's not necessary at all, because... > To make the system linker happy, you also need to compile the modules > with -fPIC - which, according to Sun, we should have done all the time > when building shared libraries. The linker complains about relocations > in the text segment, which shouldn't be there - shared libraries > should be position independent. ...compiling everything with -fPIC is exactly what the doctor ordered. I added "-fPIC" to OPT in the Makefile and rebuilt, and everything went smoothly when linking the extensions with "gcc -shared". No problems compiling or linking, and the tests are running right now without a hitch. Yaayyyyh! So here's an update of the patch: this changes LDSHARED to "$(CC) -shared" when using GCC under Solaris, and it adds "-fPIC" to OPT when using GCC *anywhere*. This seems like a good thing to me when building shared objects, but if anyone is aware of a platform where "gcc ... -fPIC" is a bad idea, speak up now! 
*** configure.in 2000/11/03 08:18:36 1.177 --- configure.in 2000/11/03 22:43:50 *************** *** 308,315 **** case $GCC in yes) case $ac_cv_prog_cc_g in ! yes) OPT="-g -O2 -Wall -Wstrict-prototypes";; ! *) OPT="-O2 -Wall -Wstrict-prototypes";; esac ;; *) OPT="-O";; --- 308,315 ---- case $GCC in yes) case $ac_cv_prog_cc_g in ! yes) OPT="-g -O2 -Wall -Wstrict-prototypes -fPIC";; ! *) OPT="-O2 -Wall -Wstrict-prototypes -fPIC";; esac ;; *) OPT="-O";; *************** *** 564,570 **** SunOS/4*) LDSHARED="ld";; SunOS/5*) if test "$GCC" = "yes" ! then LDSHARED='$(CC) -G' else LDSHARED="ld -G"; fi ;; hp*|HP*) LDSHARED="ld -b";; --- 564,570 ---- SunOS/4*) LDSHARED="ld";; SunOS/5*) if test "$GCC" = "yes" ! then LDSHARED='$(CC) -shared' else LDSHARED="ld -G"; fi ;; hp*|HP*) LDSHARED="ld -b";; I'll go update the bug report now. Thanks, Martin! Greg From guido@python.org Fri Nov 3 23:01:27 2000 From: guido@python.org (Guido van Rossum) Date: Fri, 03 Nov 2000 18:01:27 -0500 Subject: [Python-Dev] Gathering Python 2.1 goals and non-goals Message-ID: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> Moshe and Andrew (the impatient youth :-) have asked what's on our plate for Python 2.1. Schedule -------- The current plan for the 2.1 release is in PEP 226: http://python.sourceforge.net/peps/pep-0226.html According to that PEP, the tentative schedule is: 16-Dec-2000: 2.1 PEPs ready for review 01-Feb-2001: First 2.1 beta release 16-Mar-2001: 2.1 final release But the PEP doesn't really say much about which PEPs we're going to work on. 
Overview of the PEPs -------------------- In my recent reworking of PEP 0 (the PEP index) I created a category of Active PEPs -- these are PEPs that I definitely want to consider in Python 2.1: I 42 pep-0042.txt Small Feature Requests Hylton S 207 pep-0207.txt Rich Comparisons Ascher S 208 pep-0208.txt Reworking the Coercion Model Ascher S 217 pep-0217.txt Display Hook for Interactive Use Zadka S 222 pep-0222.txt Web Library Enhancements Kuchling I 226 pep-0226.txt Python 2.1 Release Schedule Hylton S 227 pep-0227.txt Statically Nested Scopes Hylton Note that I said *consider*: I'm sure that not all feature requests from PEP 42 (a grab-bag of stuff that may or may not be feasible to implement) will be implemented, and I'm not sure that all the other PEPs on the list will be (especially nested scopes still seems iffy). Two of these (207, 208) haven't been written yet -- but I know roughly what they will contain, so they are included in this list, and not in the lists of vaporware PEPs later in PEP 0 (Incomplete and Empty PEPs). These may be reconsidered for Python 2.1 if their authors care to follow the PEP guidelines. There are a bunch of other PEPs that I moved to the Pie-in-the-sky category -- these are form-complete PEPs, but they are controversial (e.g. there are two competing matrix op PEPs); some others I think are not important; for yet others I think it's unrealistic to expect them to be implemented by Python 2.1. (There are also other things I'd like to do that fit in the pie-in-the-sky category, like breaking up the interpreter in several replaceable pieces.) Other issues to work on ----------------------- These aren't PEPs yet, but I think they need to become PEPs soon -- I'd like to see work on them go into Python 2.1: - The buffer interface needs a revamping or simplification; this was discussed here previously. - A warning framework. I've got some ideas already. - Integer division. 
If we want to play by Paul Prescod's Language Evolution rules (PEP 5), we better get started on the first stage. E.g. we could introduce a // operator in 2.1 for integer division, and issue a warning when / is used for integers. Then a year later (i.e., March 2002!) we could change / so that it returns a floating point number. - Case sensitivity??? - Class/type dichotomy??? - Weak references. This *is* a PEP, but there's no contents yet. We could also try to implement (just) weak dictionaries. - Internationalization. Barry knows what he wants here; I bet Martin von Loewis and Marc-Andre Lemburg have ideas too. - Arbitrary attributes on functions? This would be a generalization of docstrings; with the intent that you don't have to put semantics in docstrings (like SPARK and Zope). Issue: what syntax to choose? This could possibly lead to implementing private, protected, public attributes too. - Whatever you want to work on. If you have an idea for a PEP that you think should be implemented in Python 2.1, or if you want to revive a PEP that's currently listed in one of the "unattainable" categories, now's the time to make a plea! --Guido van Rossum (home page: http://www.python.org/~guido/) From gstein@lyra.org Fri Nov 3 23:13:50 2000 From: gstein@lyra.org (Greg Stein) Date: Fri, 3 Nov 2000 15:13:50 -0800 Subject: [Python-Dev] Gathering Python 2.1 goals and non-goals In-Reply-To: <200011032301.SAA26346@cj20424-a.reston1.va.home.com>; from guido@python.org on Fri, Nov 03, 2000 at 06:01:27PM -0500 References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> Message-ID: <20001103151350.C12266@lyra.org> On Fri, Nov 03, 2000 at 06:01:27PM -0500, Guido van Rossum wrote: >... > - Whatever you want to work on. If you have an idea for a PEP that > you think should be implemented in Python 2.1, or if you want to > revive a PEP that's currently listed in one of the "unattainable" > categories, now's the time to make a plea!
I'd like to write up a free-threading PEP, but with all of my work on Subversion right now, I'm not sure when that writing will occur or the coding. I'll certainly do the PEP first because it could also be used as a development roadmap for somebody do some of the work. Cheers, -g -- Greg Stein, http://www.lyra.org/ From gward@mems-exchange.org Fri Nov 3 23:17:35 2000 From: gward@mems-exchange.org (Greg Ward) Date: Fri, 3 Nov 2000 18:17:35 -0500 Subject: [Python-Dev] Compiler warnings on Solaris Message-ID: <20001103181734.A6809@ludwig.cnri.reston.va.us> Hi all -- since "-Wall -Wstrict-prototypes" were added to OPT by default, a bunch of warnings are showing up with GCC on Solaris. (I haven't seen them on Linux, so I assume this is OS-specific.) See PR#121479 for a complete list: https://sourceforge.net/bugs/?func=detailbug&bug_id=121479&group_id=5470) Is anyone else looking into these? Some of them are easy to fix, eg. cast char to int when using 'isalnum()' and friends. Some of them are easy but unnecessary -- eg. sham variable initializations to shut up the "might be used uninitialized" warning. Others seem to require #define'ing XOPEN_SOURCE or __EXTENSIONS__, which was a bit controversial when it was done on Linux... but may be needed on Solaris too. I'm not sure yet; I've just started looking at it. And I have to run now. Anyways, if anyone has guidelines/hints/suggestions for the best way to fix these warnings, I'm all ears. Also, if there are warnings we're not going to worry about (eg. incorrect "might be used uninitialized"), lemme know. 
Thanks -- Greg From DavidA@ActiveState.com Fri Nov 3 23:38:13 2000 From: DavidA@ActiveState.com (David Ascher) Date: Fri, 3 Nov 2000 15:38:13 -0800 (Pacific Standard Time) Subject: [Python-Dev] Gathering Python 2.1 goals and non-goals In-Reply-To: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> Message-ID: > S 207 pep-0207.txt Rich Comparisons Ascher > S 208 pep-0208.txt Reworking the Coercion Model Ascher > Two of these (207, 208) haven't been written yet -- but I know roughly > what they will contain, so they are included in this list, and not in > the lists of vaporware PEPs later in PEP 0 (Incomplete and Empty > PEPs). These may be reconsidered for Python 2.1 if their authors care > to follow the PEP guidelines. I would _love_ to find someone to take over PEP ownership of these. I feel terrible that I haven't been able to find the time to do those right (although I only feel moral ownership of 207, not 208, where others have much more in-depth knowledge). If someone wants to take over, please speak up and grab them. I'll try to find the time to share the information I have, would gladly give the very early and now obsolete patches I wrote up. --david From akuchlin@mems-exchange.org Sat Nov 4 01:31:05 2000 From: akuchlin@mems-exchange.org (Andrew Kuchling) Date: Fri, 3 Nov 2000 20:31:05 -0500 Subject: [Python-Dev] Gathering Python 2.1 goals and non-goals In-Reply-To: <200011032301.SAA26346@cj20424-a.reston1.va.home.com>; from guido@python.org on Fri, Nov 03, 2000 at 06:01:27PM -0500 References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> Message-ID: <20001103203105.A25097@kronos.cnri.reston.va.us> On Fri, Nov 03, 2000 at 06:01:27PM -0500, Guido van Rossum wrote: >- Whatever you want to work on. If you have an idea for a PEP that > you think should be implemented in Python 2.1, or if you want to Using Distutils to compile most of the extension modules, taking the first step to eliminating the Setup/Makefile.pre.in scheme. 
I'll begin drafting a PEP. --amk From guido@python.org Sat Nov 4 03:38:17 2000 From: guido@python.org (Guido van Rossum) Date: Fri, 03 Nov 2000 22:38:17 -0500 Subject: [Python-Dev] Gathering Python 2.1 goals and non-goals In-Reply-To: Your message of "Fri, 03 Nov 2000 15:38:13 PST." References: Message-ID: <200011040338.WAA26703@cj20424-a.reston1.va.home.com> > > S 207 pep-0207.txt Rich Comparisons Ascher > > S 208 pep-0208.txt Reworking the Coercion Model Ascher [David] > I would _love_ to find someone to take over PEP ownership of these. I > feel terrible that I haven't been able to find the time to do those right > (although I only feel moral ownership of 207, not 208, where others have > much more in-depth knowledge). I will take care of these myself if noone else picks them up. --Guido van Rossum (home page: http://www.python.org/~guido/) From tim_one@email.msn.com Sat Nov 4 07:07:43 2000 From: tim_one@email.msn.com (Tim Peters) Date: Sat, 4 Nov 2000 02:07:43 -0500 Subject: [Python-Dev] Compiler warnings on Solaris In-Reply-To: <20001103181734.A6809@ludwig.cnri.reston.va.us> Message-ID: [Greg Ward] > ... > Also, if there are warnings we're not going to worry about (eg. > incorrect "might be used uninitialized"), lemme know. If a compiler is afraid something might be uninitialized, the code is too clever for people to be sure it's correct at a glance too. Note that right before 2.0 was released, a bogus "uninitalized" msg got repaired, which turned up a *legitimate* "uninitialized" msg that the bogus msg was covering up. The effort needed to fix one of these is too minor to be worth even considering not fixing. I'm not sure what gcc is complaining about in many of the cases; others are quite clear (e.g., "confstr" apparently has no prototype in scope by the time it's called in posixmodule.c, and that is indeed not good). 
unix-should-be-shot-not-that-windows-shouldn't-ly y'rs - tim From py-dev@zadka.site.co.il Sat Nov 4 18:00:49 2000 From: py-dev@zadka.site.co.il (Moshe Zadka) Date: Sat, 04 Nov 2000 20:00:49 +0200 Subject: [Python-Dev] Revamping Python's Numeric Model Message-ID: Here's a draft PEP. Have I already mentioned how much it irks me to cater for other editors in the PEP text itself? PEP: Unassigned Title: Reworking Python's Numeric Model Version: $Revision$ Author: pep@zadka.site.co.il (Moshe Zadka) Status: Draft Type: Standards Track Created: 4-Nov-2000 Post-History: Abstract Today, Python's numerical model is similar to the C numeric model: there are several unrelated numerical types, and when operations between numerical types are requested, coercions happen. While the C rationale for the numerical model is that it is very similar to what happens on the hardware level, that rationale does not apply to Python. So, while it is acceptable to C programmers that 2/3 == 0, it is very surprising to Python programmers. Rationale In usability studies, one of Python's hardest-to-learn features was the fact that integer division returns the floor of the division. This makes it hard to program correctly, requiring casts to float() in various places throughout the code. Python's numerical model stems from C, while an easier numerical model would stem from the mathematical understanding of numbers. Other Numerical Models Perl's numerical model is that there is one type of numbers -- floating point numbers. While it is consistent and superficially non-surprising, it tends to have subtle gotchas. One of these is that printing numbers is very tricky, and requires correct rounding. In Perl, there is also a mode where all numbers are integers. This mode also has its share of problems, which arise from the fact that there is not even an approximate way of dividing numbers and getting meaningful answers.
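The surprise the Rationale points to is easy to demonstrate; in today's Python the flooring behaviour is spelled //, which is exactly how plain / behaved on two integers in the 2.x era:

```python
# Floor division vs. true division: the behaviour the Rationale above
# calls surprising.  In Python 2.x, 2 / 3 gave the first result.
floor_q = 2 // 3        # 0 -- the floor of the division
true_q = float(2) / 3   # 0.666..., what newcomers usually expect
```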
Suggested Interface For Python Numerical Model While coercion rules will remain for add-on types and classes, the built in type system will have exactly one Python type -- a number. There are several things which can be considered "number methods": 1. isnatural() 2. isintegral() 3. isrational() 4. isreal() 5. iscomplex() a. isexact() Obviously, a number which answers m as true, also answers m+k as true. If "isexact()" is not true, then any answer might be wrong. (But not horribly wrong: it's close to the truth). Now, there are two things the model promises for the field operations (+, -, /, *): If both operands satisfy isexact(), the result satisfies isexact() All field rules are true, except that for not-isexact() numbers, they might be only approximately true. There is one important operation, inexact(), which takes a number and returns an inexact number which is a good approximation. Several of the classical Python operations will return exact numbers when given inexact numbers: e.g., int(). Inexact Operations The functions in the "math" module will be allowed to return inexact results for exact values. However, they will never return a non-real number. The functions in the "cmath" module will return the correct mathematical result. Numerical Python Issues People using Numerical Python do that for high-performance vector operations. Therefore, NumPy should keep its hardware-based numeric model. Copyright This document has been placed in the public domain. Local Variables: mode: indented-text indent-tabs-mode: nil End: -- Moshe Zadka From mal@lemburg.com Sat Nov 4 09:50:10 2000 From: mal@lemburg.com (M.-A.
Lemburg) Date: Sat, 04 Nov 2000 10:50:10 +0100 Subject: [Python-Dev] Gathering Python 2.1 goals and non-goals References: Message-ID: <3A03DBD2.7E024978@lemburg.com> David Ascher wrote: > > > S 207 pep-0207.txt Rich Comparisons Ascher > > S 208 pep-0208.txt Reworking the Coercion Model Ascher > > > Two of these (207, 208) haven't been written yet -- but I know roughly > > what they will contain, so they are included in this list, and not in > > the lists of vaporware PEPs later in PEP 0 (Incomplete and Empty > > PEPs). These may be reconsidered for Python 2.1 if their authors care > > to follow the PEP guidelines. > > I would _love_ to find someone to take over PEP ownership of these. I > feel terrible that I haven't been able to find the time to do those right > (although I only feel moral ownership of 207, not 208, where others have > much more in-depth knowledge). > > If someone wants to take over, please speak up and grab them. I'll try to > find the time to share the information I have, would gladly give the very > early and now obsolete patches I wrote up. I can take over the coercion PEP: I've been working on this before (see the proposal on my Python Pages). I would also like to know whether the PEP-0224 will be considered for 2.1 if I update the patch to make it a little more robust w/r to the problems mentioned in that PEP -- I'd really like to see this in Python soon, since it makes documenting Python programs so much easier. Note that I won't get around to do much work on these before January... 
way too busy at the moment :-/ -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From py-dev@zadka.site.co.il Sat Nov 4 17:42:48 2000 From: py-dev@zadka.site.co.il (Moshe Zadka) Date: Sat, 04 Nov 2000 19:42:48 +0200 Subject: [Python-Dev] Gathering Python 2.1 goals and non-goals In-Reply-To: Message from Guido van Rossum of "Fri, 03 Nov 2000 18:01:27 EST." <200011032301.SAA26346@cj20424-a.reston1.va.home.com> References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> Message-ID: > Moshe and Andrew (the impatient youth :-) have asked what's on our > plate for Python 2.1. Yeah, sure. They're not making enough fun of my age where I work, . > - Integer division. If we want to play by Paul Prescod's Language > Evolution rules (PEP 5), we better get started on the first stage. > E.g. we could introduce a // operator in 2.1 for integer division, > and issue a warning when / is used for integers. Then a year later > (i.e., March 2002!) we could change / so that it returns a floating > point number. I'm working on that one, and will submit a PEP draft ASAP. I do want to know what kind of changes are acceptable: it seems you have no problem with just patching over the numerical model, while I think that solving this problem without recreating a host of others needs total reworking of the numerical model. > - Case sensitivity??? Writing a PEP for this seems trivial, but I thought the consensus was that this should be solved by tools, not the language. > - Whatever you want to work on. If you have an idea for a PEP that > you think should be implemented in Python 2.1, or if you want to > revive a PEP that's currently listed in one of the "unattainable" > categories, now's the time to make a plea! I'm not sure I want to work on it, but in the past, we threw around ideas for pluggable nanny architecture.
This is related to both the case-sensitivity discussion above, and we also had lots of FAQ-stopping nannies. (self-nanny, e.g., was designed to be a FAQ stopper) -- Moshe Zadka From mal@lemburg.com Sat Nov 4 09:58:49 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Sat, 04 Nov 2000 10:58:49 +0100 Subject: [Python-Dev] Revamping Python's Numeric Model References: Message-ID: <3A03DDD9.8527F7BE@lemburg.com> Moshe Zadka wrote: > > Here's a draft PEP. > Have I already mentioned how much it irks me to cater for other > editors in the PEP text itself? > > PEP: Unassigned > Title: Reworking Python's Numeric Model > Version: $Revision$ > Author: pep@zadka.site.co.il (Moshe Zadka) > Status: Draft > Type: Standards Track > Created: 4-Nov-2000 > Post-History: > > Abstract > > Today, Python's numerical model is similar to the C numeric model: > there are several unrelated numerical types, and when operations > between numerical types are requested, coercions happen. While the C > rational for the numerical model is that it is very similar to what > happens on the hardware level, that rational does not apply to Python. > So, while it is acceptable to C programmers that 2/3 == 0, it is very > surprising to Python programmers. > > Rationale > > In usability studies, one of Python hardest to learn features was > the fact integer division returns the floor of the division. This > makes it hard to program correctly, requiring casts to float() in > various parts through the code. Python numerical model stems from > C, while an easier numerical model would stem from the mathematical > understanding of numbers. > > Other Numerical Models > > Perl's numerical model is that there is one type of numbers -- floating > point numbers. While it is consistent and superficially non-suprising, > it tends to have subtle gotchas. One of these is that printing numbers > is very tricky, and requires correct rounding. In Perl, there is also > a mode where all numbers are integers. 
> This mode also has its share of problems, which arise from the fact
> that there is not even an approximate way of dividing numbers and
> getting meaningful answers.
> 
> Suggested Interface For Python Numerical Model
> 
> While coercion rules will remain for add-on types and classes, the
> built-in type system will have exactly one Python type -- a number.

While I like the idea of having the numeric model in Python based on a solid class hierarchy, I don't think that this model is implementable in Python 2.x without giving away performance.

> There are several things which can be considered "number methods":
> 
> 1. isnatural()
> 2. isintegral()
> 3. isrational()
> 4. isreal()
> 5. iscomplex()

+1. I would like to see methods on Python numbers too (after having made some really good experiences with methods on strings ;-). There's one problem though: how would you call these on numeric literals ? ... 1.2.isreal() ?!

> a. isexact()
> 
> Obviously, a number which answers m as true, also answers m+k as true.
> If "isexact()" is not true, then any answer might be wrong. (But not
> horribly wrong: it's close to the truth).

Not sure what you mean here: perhaps .isexact() <=> can be represented in IEEE ? -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/

From py-dev@zadka.site.co.il Sat Nov 4 18:19:13 2000
From: py-dev@zadka.site.co.il (Moshe Zadka)
Date: Sat, 04 Nov 2000 20:19:13 +0200
Subject: [Python-Dev] Revamping Python's Numeric Model
In-Reply-To: Message from "M.-A. Lemburg" of "Sat, 04 Nov 2000 10:58:49 +0100." <3A03DDD9.8527F7BE@lemburg.com>
References: <3A03DDD9.8527F7BE@lemburg.com>
Message-ID: 

> While I like the idea of having the numeric model in Python
> based on a solid class hierarchy, I don't think that this model
> is implementable in Python 2.x without giving away performance.
I think they are, using a similar trick to Fred's automorphing dictionaries. > +1. > > I would like to see methods on Python numbers too (after having > made some really good experiences with methods on strings ;-). > There's one problem though: how would you call these on > numeric literals ? ... 1.2.isreal() ?! Ummmm....how would you say you want to add 3 and 4, and multiply the result by 5? 3+4*5? No, you use parens: (3+4)*5 (1.2).isreal() > > a. isexact() > > > > Obviously, a number which answers m as true, also answers m+k as true. > > If "isexact()" is not true, then any answer might be wrong. (But not > > horribly wrong: it's close the truth). > > Not sure what you mean here: perhaps .isexact() <=> can be > represented in IEEE ? No, I meant "not represented exactly". The real meaning for that (one that we might or might not promise) is that it's a float. It's a place where the numeric model takes the easy way out . -- Moshe Zadka From mal@lemburg.com Sat Nov 4 10:21:52 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Sat, 04 Nov 2000 11:21:52 +0100 Subject: [Python-Dev] Gathering Python 2.1 goals and non-goals References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> Message-ID: <3A03E340.38F78FA2@lemburg.com> Guido van Rossum wrote: > > Other issues to work on > ----------------------- > > These aren't PEPs yet, but I think they need to become PEPs soon -- > I'd like to see work on them go into Python 2.1: > > - The buffer interface needs a revamping or simplification; this was > discussed here previously. > > - A warning framework. I've got some ideas already. This would be a *cool* thing. I have a need for such a framework in mx.ODBC where ODBC often issues warnings. Currently these turn out as exceptions which is not really all that useful because some warnings can safely be ignored. > - Integer division. If we want to play by Paul Prescod's Language > Evolution rules (PEP 5), we better get started on the first stage. > E.g. 
we could introduce a // operator in 2.1 for integer division,
> and issue a warning when / is used for integers. Then a year later
> (i.e., March 2002!) we could change / so that it returns a floating
> point number.

+0... and then only if there will be a tool to check Python source code for integer divides.

> - Case sensitivity???

Should be left to Py3k. It could then be implemented by using a special dictionary subclass as instance dictionary.

> - Class/type dichotomy???

One thing that would probably be implementable is a way to maintain "instance" dictionaries for types (which are created on-demand whenever an assignment is made). This would enable extending types with new methods and attributes. "Subclassing" could then be emulated by using new constructors which add the new or changed methods to each created type instance, e.g.

class myclose:
    def __init__(self, object, basemethod):
        self.object = object
        self.basemethod = basemethod
    def __call__(self):
        print 'Closed file %s' % self.object
        self.basemethod()

def myfile(filename):
    f = open(filename)
    # add/override attributes
    f.newattribute = 1
    # add/override methods
    f.close = myclose(f, f.close)
    return f

Types would have to be made aware of this possibility. Python could provide some helping APIs to make life easier for the programmer.

> - Weak references. This *is* a PEP, but there's no contents yet. We
> could also try to implement (just) weak dictionaries.

These already exist... http://www.handshake.de/~dieter/weakdict.html mx.Proxy also has an implementation which supports weak references. BTW, are these still needed now that we have GC ?

> - Internationalization. Barry knows what he wants here; I bet Martin
> von Loewis and Marc-Andre Lemburg have ideas too.

We'd need a few more codecs, support for the Unicode compression, normalization and collation algorithms.

> - Arbitrary attributes on functions?
This would be a generalization > of docstrings; with the intent that you don't have to put semantics > in docstrings (like SPARK and Zope). Issue: what syntax to choose? > This could possibly lead to implementing private, protected, public > attributes too. Perhaps the above "trick" could help with this ?! -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From mal@lemburg.com Sat Nov 4 10:31:01 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Sat, 04 Nov 2000 11:31:01 +0100 Subject: [Python-Dev] Revamping Python's Numeric Model References: <3A03DDD9.8527F7BE@lemburg.com> Message-ID: <3A03E565.9D23559D@lemburg.com> Moshe Zadka wrote: > > > While I like the idea of having the numeric model in Python > > based on a solid class hierarchy, I don't think that this model > > is implementable in Python 2.x without giving away performance. > > I think they are, using a similar trick to Fred's automorphing dictionaries. You mean numbers "morph" to become floats, complex numbers, etc. on demand ? E.g. math.sqrt(-1) would return 1j ?! > > +1. > > > > I would like to see methods on Python numbers too (after having > > made some really good experiences with methods on strings ;-). > > There's one problem though: how would you call these on > > numeric literals ? ... 1.2.isreal() ?! > > Ummmm....how would you say you want to add 3 and 4, and multiply the result > by 5? 3+4*5? > > No, you use parens: > > (3+4)*5 > (1.2).isreal() Ah. Of course :-) Cool ! > > > a. isexact() > > > > > > Obviously, a number which answers m as true, also answers m+k as true. > > > If "isexact()" is not true, then any answer might be wrong. (But not > > > horribly wrong: it's close the truth). > > > > Not sure what you mean here: perhaps .isexact() <=> can be > > represented in IEEE ? > > No, I meant "not represented exactly". 
The real meaning for that (one > that we might or might not promise) is that it's a float. It's a place > where the numeric model takes the easy way out . Uhm, that's what I meant. I don't see much use for this though: the whole meaning of "exact" is void w/r to floats. It should be replaced by "accurate to n digits". Note that there is a whole mathematical theory that tries to deal with this problem: interval calculus. A package to support this would probably make sense... a nice side-effect of interval calculus is that it allows "automatic" optimization of functions within certain bounds. Numbers are replaced with intervals and calculus is then done on the interval bounds. This is just about as close as you can get to floating point values with computer machinery ;-) -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From moshez@zadka.site.co.il Sat Nov 4 18:52:35 2000 From: moshez@zadka.site.co.il (Moshe Zadka) Date: Sat, 04 Nov 2000 20:52:35 +0200 Subject: [Python-Dev] Revamping Python's Numeric Model In-Reply-To: Message from "M.-A. Lemburg" of "Sat, 04 Nov 2000 11:31:01 +0100." <3A03E565.9D23559D@lemburg.com> References: <3A03DDD9.8527F7BE@lemburg.com> <3A03E565.9D23559D@lemburg.com> Message-ID: [Moshe Zadka, about efficient and honest numbers] > I think they are, using a similar trick to Fred's automorphing dictionaries [MAL] > You mean numbers "morph" to become floats, complex numbers, etc. > on demand ? E.g. math.sqrt(-1) would return 1j ?! math.sqrt has been dealt elsewhere in the PEP. It has been suggested that math.foo will accept and return real numbers, and that cmath.foo will accept and return complex numbers. If you want to always deal with complex numbers, put this in your site.py import cmath import sys sys.modules['math']=cmath > > No, I meant "not represented exactly". 
The real meaning for that (one
> that we might or might not promise) is that it's a float. It's a place
> where the numeric model takes the easy way out .
> 
> Uhm, that's what I meant. I don't see much use for this though:
> the whole meaning of "exact" is void w/r to floats. It should
> be replaced by "accurate to n digits".

I'm just promising that floats are inexact. I don't see a need for "accuracy to n digits" (interval mathematics, etc.) in core Python. This should be a new module if anyone needs it. Since inexact numbers will only come about via external modules, you can just use:

import imath  # interval math module
import sys
sys.modules['math'] = imath

I'm not repeating myself.

> This is just about as close as you can get to floating point
> values with computer machinery ;-)

I thought floats were the way to get to floating point values with computer machinery? -- Moshe Zadka

From martin@loewis.home.cs.tu-berlin.de Sat Nov 4 11:12:22 2000
From: martin@loewis.home.cs.tu-berlin.de (Martin v. Loewis)
Date: Sat, 4 Nov 2000 12:12:22 +0100
Subject: [Python-Dev] 2.1 tasks (Was: statically nested scopes)
Message-ID: <200011041112.MAA01135@loewis.home.cs.tu-berlin.de>

> * Work on something CPAN-like. This may or may not have repercussions for
> the core; I don't know.

At a minimum, I think we need to include somethingCPANlike.py into the core (library) to make something CPAN-like useful. Regards, Martin

From thomas@xs4all.net Sat Nov 4 13:54:17 2000
From: thomas@xs4all.net (Thomas Wouters)
Date: Sat, 4 Nov 2000 14:54:17 +0100
Subject: [Python-Dev] Compiler warnings on Solaris
In-Reply-To: ; from tim_one@email.msn.com on Sat, Nov 04, 2000 at 02:07:43AM -0500
References: <20001103181734.A6809@ludwig.cnri.reston.va.us>
Message-ID: <20001104145417.I28658@xs4all.nl>

On Sat, Nov 04, 2000 at 02:07:43AM -0500, Tim Peters wrote:
> [Greg Ward]
> > ...
> > Also, if there are warnings we're not going to worry about (eg.
> > incorrect "might be used uninitialized"), lemme know.
> If a compiler is afraid something might be uninitialized, the code is too
> clever for people to be sure it's correct at a glance too.
> I'm not sure what gcc is complaining about in many of the cases; others are
> quite clear (e.g., "confstr" apparently has no prototype in scope by the
> time it's called in posixmodule.c, and that is indeed not good).

There are a few messages that certainly should be looked at. The 'uninitialized usage' messages, for instance, might be serious problems. In this case, though, the "'ord' might be used uninitialized" warning isn't a real problem. 'ord' is indeed only set when 'size == 1' is true, but it's also only used if 'size == 1', and size isn't changed in between those checks. Whether it should be fixed or not is another issue, but at least it isn't causing problems.

The 'subscript has type char' message I'm not so sure about -- what is the problem with those ? I assume it has something to do with char's signedness being undefined, but I'm not sure. I also thought 'up'casting, such as in function calls (function needs int, you give it char) was done automatically, as part of the language, and thus shouldn't be a problem.

But -Wstrict-prototypes seems to detect a lot more 'errors' in system header files than in Python. For instance, all the "function declaration isn't a prototype" warnings in signalmodule.c and intrcheck.c seem to be caused by the SIG_ERR, SIG_DFL and SIG_IGN #defines, which Python can do nothing about. (those SIG_ #defines are apparently defined as function declarations without prototypes.) I've seen the same effect on BSDI 4.0.1, where a few system include files define or declare functions without prototypes. We can't fix those errors, except by complaining to the OS vendor. Maybe we should just disable -Wstrict-prototypes (but not -Wall) for releases, to avoid confusion.
(Developers should still use -Wstrict-prototypes, to catch warnings Python *can* do something about, me thinks.) And the 'confstr' message, well, ewww ;) The manpage on Linux seems to indicate you need to define either _POSIX_C_SOURCE as 2, or define _XOPEN_SOURCE, for it to work. But I've never understood how those defines are supposed to interact, so I'm hesitant to go there ;P -- Thomas Wouters Hi! I'm a .signature virus! copy me into your .signature file to help me spread!

From thomas@xs4all.net Sat Nov 4 14:08:43 2000
From: thomas@xs4all.net (Thomas Wouters)
Date: Sat, 4 Nov 2000 15:08:43 +0100
Subject: [Python-Dev] Revamping Python's Numeric Model
In-Reply-To: ; from py-dev@zadka.site.co.il on Sat, Nov 04, 2000 at 08:19:13PM +0200
References: <3A03DDD9.8527F7BE@lemburg.com>
Message-ID: <20001104150842.J28658@xs4all.nl>

On Sat, Nov 04, 2000 at 08:19:13PM +0200, Moshe Zadka wrote:
> > There's one problem though: how would you call these on
> > numeric literals ? ... 1.2.isreal() ?!
> you use parens:
> (1.2).isreal()

Why ? There is exactly no problem with this example :)

>>> 1.2.isreal()
Traceback (most recent call last):
  File "", line 1, in ?
AttributeError: 'float' object has no attribute 'isreal'

If float objects had attributes, it would already work. The real problem isn't with floats, but with nonfloats:

>>> 1.isreal()
  File "", line 1
    1.isreal()
    ^
SyntaxError: invalid syntax

And the limitation is just the parser, currently. Whether we want to allow that syntax is something that will have to be figured out. And the parser would have to be fixed (either rewritten into at least an LL(2) parser, or the metagrammar hacked so that 'NUMBER' doesn't eat the . after a number if it isn't followed by another number or whitespace.) (At least IMHO, this isn't an insurmountable problem, or even a medium-sized problem. It was just never necessary to fix it.) -- Thomas Wouters Hi! I'm a .signature virus! copy me into your .signature file to help me spread!
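The tokenizer behaviour Thomas describes can be checked directly with the standard library's `tokenize` module (shown here with the modern Python 3 API; `isreal` is only a name to the tokenizer, so the hypothetical method never needs to exist for the tokenization to be visible):

```python
import io
import tokenize

def toks(source):
    """Return the non-empty token strings of a one-line snippet."""
    return [tok.string
            for tok in tokenize.generate_tokens(io.StringIO(source).readline)
            if tok.string.strip()]

# "1.2.isreal()" tokenizes fine: the NUMBER token stops after "1.2",
# leaving ".isreal" as an ordinary attribute access.
assert toks("1.2.isreal()\n") == ["1.2", ".", "isreal", "(", ")"]

# An extra space likewise keeps the dot out of the NUMBER token.
assert toks("1 .isreal()\n") == ["1", ".", "isreal", "(", ")"]

# "1.isreal()" is the real problem: the tokenizer greedily reads "1."
# as a float literal, so the line cannot parse as attribute access.
try:
    compile("1.isreal()", "<example>", "eval")
    parsed = True
except SyntaxError:
    parsed = False
assert not parsed
```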
From thomas@xs4all.net Sat Nov 4 14:46:00 2000
From: thomas@xs4all.net (Thomas Wouters)
Date: Sat, 4 Nov 2000 15:46:00 +0100
Subject: [Python-Dev] Revamping Python's Numeric Model
In-Reply-To: <20001104150842.J28658@xs4all.nl>; from thomas@xs4all.net on Sat, Nov 04, 2000 at 03:08:43PM +0100
References: <3A03DDD9.8527F7BE@lemburg.com> <20001104150842.J28658@xs4all.nl>
Message-ID: <20001104154559.L28658@xs4all.nl>

On Sat, Nov 04, 2000 at 03:08:43PM +0100, Thomas Wouters wrote:
> And the parser would have to be fixed (either rewritten into at least an
> LL(2) parser, or the metagrammar hacked so that 'NUMBER' doesn't eat the .
> after a number if it isn't followed by another number or whitespace.) (At
> least IMHO, this isn't an insurmountable problem, or even a medium-sized
> problem. It was just never necessary to fix it.)

Actually, no, it isn't easily fixable, if at all. The problem is mostly the scientific notation:

1.e5

Even if it were parsed properly, it is definitely going to confuse people. They wouldn't be able to say, for instance,

1.e()

;P -- Thomas Wouters Hi! I'm a .signature virus! copy me into your .signature file to help me spread!

From cgw@fnal.gov Sat Nov 4 15:21:02 2000
From: cgw@fnal.gov (Charles G Waldman)
Date: Sat, 4 Nov 2000 09:21:02 -0600 (CST)
Subject: [Python-Dev] Revamping Python's Numeric Model
In-Reply-To: 
References: <3A03DDD9.8527F7BE@lemburg.com>
Message-ID: <14852.10590.837218.382614@buffalo.fnal.gov>

Moshe Zadka writes:
> MAL > Not sure what you mean here: perhaps .isexact() <=> can be
> MAL > represented in IEEE ?
> 
> No, I meant "not represented exactly". The real meaning for that (one
> that we might or might not promise) is that it's a float. It's a place
> where the numeric model takes the easy way out .

Hmm, I like almost everything about your proposal. The above point bothers me slightly. Are you saying (1.0).isexact() == 0? Also, how about long integers?
Will they, under your new proposal, be indistinguishable from regular ints? While this has some appeal to it, it would be problematic for C extension modules. Finally, although I'm no Schemer, the hierarchy you suggest sounds very Schemish to me - I know they have a similar hierarchy of numeric types with some of these same predicates to test for integrality, rationality, reality, exactness - maybe there is something to be learned by studying the Scheme model?

From cgw@fnal.gov Sat Nov 4 15:25:37 2000
From: cgw@fnal.gov (Charles G Waldman)
Date: Sat, 4 Nov 2000 09:25:37 -0600 (CST)
Subject: [Python-Dev] Revamping Python's Numeric Model
In-Reply-To: <20001104154559.L28658@xs4all.nl>
References: <3A03DDD9.8527F7BE@lemburg.com> <20001104150842.J28658@xs4all.nl> <20001104154559.L28658@xs4all.nl>
Message-ID: <14852.10865.96590.740569@buffalo.fnal.gov>

Thomas Wouters writes:
> Actually, no, it isn't easily fixable, if at all. The problem is mostly the
> scientific notation:
> 
> 1.e5

You could strongly encourage people to spell this 1.0e5

From bckfnn@worldonline.dk Sat Nov 4 16:21:48 2000
From: bckfnn@worldonline.dk (Finn Bock)
Date: Sat, 04 Nov 2000 16:21:48 GMT
Subject: [Python-Dev] Three argument slices.
Message-ID: <3a04376a.28016074@smtp.worldonline.dk>

Hi, While updating the difference page in the Jython documentation, I came across this:

- JPython sequences support three argument slices. i.e. range(3)[::-1] == [2,1,0]. CPython should be fixed.

Is this actually true? Should (and will cpython) change in this respect? regards, finn

From py-dev@zadka.site.co.il Sun Nov 5 00:57:12 2000
From: py-dev@zadka.site.co.il (Moshe Zadka)
Date: Sun, 05 Nov 2000 02:57:12 +0200
Subject: [Python-Dev] Revamping Python's Numeric Model
In-Reply-To: Message from Charles G Waldman of "Sat, 04 Nov 2000 09:21:02 CST."
<14852.10590.837218.382614@buffalo.fnal.gov> References: <3A03DDD9.8527F7BE@lemburg.com> <14852.10590.837218.382614@buffalo.fnal.gov> Message-ID: > Hmm, I like almost everything about your proposal. The above point > bothers me slightly. Are you saying (1.0).isexact() == 0? Yes. 1.0 is not an exact number. What's wrong with that? (Consider stuff like 0.333333333*3: this shouldn't be exact!) > Also, how about long integers? Will they, under your new proposal, be > indistinguisable from regular ints? Yes. > While this has some appeal to it > it would be problematic for C extension modules. I haven't mentioned anything about implementation, so I haven't dealt with the C level at all. Currently, a Python-level API is under consideration. I believe I can keep current day C API almost unchanged. > Finally, although I'm no Schemer, the hierarchy you suggest sounds > very Schemish to me I shamelessly stole it from Scheme, with only minor changes -- most of them about hardening some things Scheme left for implementations to decide. -- Moshe Zadka From mwh21@cam.ac.uk Sat Nov 4 16:47:31 2000 From: mwh21@cam.ac.uk (Michael Hudson) Date: 04 Nov 2000 16:47:31 +0000 Subject: [Python-Dev] Three argument slices. In-Reply-To: bckfnn@worldonline.dk's message of "Sat, 04 Nov 2000 16:21:48 GMT" References: <3a04376a.28016074@smtp.worldonline.dk> Message-ID: bckfnn@worldonline.dk (Finn Bock) writes: > Hi, > > While updating the difference page in the Jython documentation, I > came across this: > > - JPython sequences support three argument slices. i.e. > range(3)[::-1] == [2,1,0]. > CPython should be fixed. > > Is this actually true? Should (and will cpython) change in this > respect? Well, there's a patch I wrote on sf to add this to CPython, but it was too late for 2.0 and it got postponed. Does J[P]ython allow l = range(10) l[::3] = range(4) ? That's one of the odder bits of the behaviour of my patch. awaiting-pronouncement-(or-even-consensus)-ly y'rs m. -- 58. 
Fools ignore complexity. Pragmatists suffer it. Some can avoid it. Geniuses remove it. -- Alan Perlis, http://www.cs.yale.edu/homes/perlis-alan/quotes.html

From bckfnn@worldonline.dk Sat Nov 4 18:41:59 2000
From: bckfnn@worldonline.dk (Finn Bock)
Date: Sat, 04 Nov 2000 18:41:59 GMT
Subject: [Python-Dev] Three argument slices.
In-Reply-To: 
References: <3a04376a.28016074@smtp.worldonline.dk>
Message-ID: <3a045808.36365811@smtp.worldonline.dk>

>> - JPython sequences support three argument slices. i.e.
>> range(3)[::-1] == [2,1,0].
>> CPython should be fixed.
>>
>> Is this actually true? Should (and will cpython) change in this
>> respect?
>
>Well, there's a patch I wrote on sf to add this to CPython, but it was
>too late for 2.0 and it got postponed.
>
>Does J[P]ython allow
>
>l = range(10)
>l[::3] = range(4)
>
>? That's one of the odder bits of the behaviour of my patch.

No:

Jython 2.0 pre-alpha on java1.3.0 (JIT: null)
Type "copyright", "credits" or "license" for more information.
>>> l = range(10)
>>> l[::3] = range(4)
Traceback (innermost last):
  File "", line 1, in ?
ValueError: step size must be 1 for setting list slice

regards, finn

From gstein@lyra.org Sat Nov 4 18:56:08 2000
From: gstein@lyra.org (Greg Stein)
Date: Sat, 4 Nov 2000 10:56:08 -0800
Subject: [Python-Dev] Revamping Python's Numeric Model
In-Reply-To: <20001104154559.L28658@xs4all.nl>; from thomas@xs4all.net on Sat, Nov 04, 2000 at 03:46:00PM +0100
References: <3A03DDD9.8527F7BE@lemburg.com> <20001104150842.J28658@xs4all.nl> <20001104154559.L28658@xs4all.nl>
Message-ID: <20001104105608.A10135@lyra.org>

Oh, this is just simple:

(1.2).isreal()
1.2 .isreal()

As Thomas said, fixing the grammar/parser would be rather difficult, so just expect people to use parens or an extra space if they want to use it on a constant.
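Greg's two workarounds can be tried today with methods that numbers actually grew later on (`float.is_integer` and `int.bit_length`, used here only as stand-ins for the hypothetical `isreal()`):

```python
# Parentheses always work, for floats and ints alike:
assert (1.2).is_integer() is False
assert (1).bit_length() == 1

# The extra-space trick also works; for ints it is the only way to
# avoid the parens, since the bare form is a syntax error:
assert 1.2 .is_integer() is False
assert 1 .bit_length() == 1

# Floats never had the problem in the first place -- the NUMBER token
# stops at "1.2", so the bare attribute access parses fine:
assert 1.2.is_integer() is False

# The bare int form fails exactly as described in the thread:
try:
    compile("1.bit_length()", "<example>", "eval")
    parsed = True
except SyntaxError:
    parsed = False
assert not parsed
```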
[ of course, it is very silly to make *any* changes to the grammar just to allow people to use these on a constant; that is quite a silly "usage" that we don't need to pander to; the above "workarounds", if you will, are sufficient for the bozos who use it on a constant. ] Cheers, -g

On Sat, Nov 04, 2000 at 03:46:00PM +0100, Thomas Wouters wrote:
> On Sat, Nov 04, 2000 at 03:08:43PM +0100, Thomas Wouters wrote:
> 
> > And the parser would have to be fixed (either rewritten into at least an
> > LL(2) parser, or the metagrammar hacked so that 'NUMBER' doesn't eat the .
> > after a number if it isn't followed by another number or whitespace.) (At
> > least IMHO, this isn't an insurmountable problem, or even a medium-sized
> > problem. It was just never necessary to fix it.)
> 
> Actually, no, it isn't easily fixable, if at all. The problem is mostly the
> scientific notation:
> 
> 1.e5
> 
> Even if it were parsed properly, it is definitely going to confuse people.
> They wouldn't be able to say, for instance,
> 
> 1.e()
> 
> ;P
> 
> --
> Thomas Wouters
> 
> Hi! I'm a .signature virus! copy me into your .signature file to help me spread!
> 
> _______________________________________________
> Python-Dev mailing list
> Python-Dev@python.org
> http://www.python.org/mailman/listinfo/python-dev

-- Greg Stein, http://www.lyra.org/

From thomas@xs4all.net Sat Nov 4 21:29:28 2000
From: thomas@xs4all.net (Thomas Wouters)
Date: Sat, 4 Nov 2000 22:29:28 +0100
Subject: [Python-Dev] Revamping Python's Numeric Model
In-Reply-To: <14852.10865.96590.740569@buffalo.fnal.gov>; from cgw@fnal.gov on Sat, Nov 04, 2000 at 09:25:37AM -0600
References: <3A03DDD9.8527F7BE@lemburg.com> <20001104150842.J28658@xs4all.nl> <20001104154559.L28658@xs4all.nl> <14852.10865.96590.740569@buffalo.fnal.gov>
Message-ID: <20001104222928.M28658@xs4all.nl>

On Sat, Nov 04, 2000 at 09:25:37AM -0600, Charles G Waldman wrote:
> Thomas Wouters writes:
> > Actually, no, it isn't easily fixable, if at all.
The problem is mostly the
> > scientific notation:
> > 
> > 1.e5
> 
> You could strongly encourage people to spell this 1.0e5

Oh, sure, but that isn't going to solve anything, unless you are proposing to break the common practice of not requiring zeros before or after decimal points entirely, and thus breaking gobs of code. The problem is simply that the syntax is truly ambiguous, and there is no way to tell whether the statement

x = 1.e5

is meant to assign 100000 (as a float) to 'x', or assign the 'e5' attribute of the object '1' to 'x'. Not even full context-based parsing is going to solve that. This is an edge case, and not likely to happen in real life, but I don't think it's really worth the trouble, all in all. We'd have to rewrite the parser into something other than a look-ahead parser to be able to correctly parse the cases where the syntax isn't really ambiguous, such as

x = 1.e42e

or some such, and that would still leave unparseable syntax. And all that just to avoid forcing people to use parentheses around 'normal' integer literals when directly following them with an attribute dereference. After all,

t = 1
x = t.e5

is perfectly valid, and probably a lot more common. Who needs to run a method on a literal anyway [don't mention the .join() war!] -- Thomas Wouters Hi! I'm a .signature virus! copy me into your .signature file to help me spread!

From thomas@xs4all.net Sat Nov 4 21:32:39 2000
From: thomas@xs4all.net (Thomas Wouters)
Date: Sat, 4 Nov 2000 22:32:39 +0100
Subject: [Python-Dev] Revamping Python's Numeric Model
In-Reply-To: <20001104105608.A10135@lyra.org>; from gstein@lyra.org on Sat, Nov 04, 2000 at 10:56:08AM -0800
References: <3A03DDD9.8527F7BE@lemburg.com> <20001104150842.J28658@xs4all.nl> <20001104154559.L28658@xs4all.nl> <20001104105608.A10135@lyra.org>
Message-ID: <20001104223239.N28658@xs4all.nl>

On Sat, Nov 04, 2000 at 10:56:08AM -0800, Greg Stein wrote:
> Oh, this is just simple:
> (1.2).isreal()
> 1.2 .isreal()
I hadn't even realized that, but it makes perfect sense if you think about it ;) '1.e5' is a single token, to the parser (a NUMBER), so it can't have whitespace inbetween. But '1 .e5' is naturally broken up into at least two tokens (three, in this case), and thus is 'correctly' parsed even in todays parser: >>> 1 .e5 Traceback (most recent call last): File "", line 1, in ? AttributeError: 'int' object has no attribute 'e5' Y'all just need to quit thinking about floats as the problem... the problem is ints, not floats ;) I-shut-up-now-ly y'rs, -- Thomas Wouters Hi! I'm a .signature virus! copy me into your .signature file to help me spread! From martin@loewis.home.cs.tu-berlin.de Sat Nov 4 21:15:13 2000 From: martin@loewis.home.cs.tu-berlin.de (Martin v. Loewis) Date: Sat, 4 Nov 2000 22:15:13 +0100 Subject: [Python-Dev] Revamping Python's Numeric Model Message-ID: <200011042115.WAA01162@loewis.home.cs.tu-berlin.de> > Yes. 1.0 is not an exact number. What's wrong with that? > (Consider stuff like 0.333333333*3: this shouldn't be exact!) It's not inherently wrong. It just points out an omission in the PEP: it doesn't talk about the meaning of number literals. Since the new model is going to follow algebraic principles more closely, I had expected that 0.333333333 == 333333333 / 1000000000 where, as I understand the proposal, the right-hand side is an exact number (so 0.333333333*3 would be 999999999 / 1000000000). One of the more-frequent questions on python-help is confusion about floating-point numbers, e.g. why is the result of 1.99+4.99 printed as 6.9800000000000004; users often report that as a bug. Of course, spelling the number you had in mind as inexact(0.333333333) is hardly acceptable, either. Regards, Martin From skip@mojam.com (Skip Montanaro) Sat Nov 4 22:55:44 2000 From: skip@mojam.com (Skip Montanaro) (Skip Montanaro) Date: Sat, 4 Nov 2000 16:55:44 -0600 (CST) Subject: [Python-Dev] Accessing RH 7.0 on SF? 
Message-ID: <14852.37872.843069.514782@beluga.mojam.com> Someone posted a bug about the bsddb config stuff related to RH7.0, which I don't have direct access to. I've seen others access different Linux dialects on the SF site. Can someone explain how I can access RH7.0 there? All I really need to do at the moment is peruse the /usr/include/db3 stuff. Thx, Skip From thomas@xs4all.net Sat Nov 4 22:10:32 2000 From: thomas@xs4all.net (Thomas Wouters) Date: Sat, 4 Nov 2000 23:10:32 +0100 Subject: [Python-Dev] Accessing RH 7.0 on SF? In-Reply-To: <14852.37872.843069.514782@beluga.mojam.com>; from skip@mojam.com on Sat, Nov 04, 2000 at 04:55:44PM -0600 References: <14852.37872.843069.514782@beluga.mojam.com> Message-ID: <20001104231031.P28658@xs4all.nl> On Sat, Nov 04, 2000 at 04:55:44PM -0600, Skip Montanaro wrote: > Someone posted a bug about the bsddb config stuff related to RH7.0, which I > don't have direct access to. I've seen others access different Linux > dialects on the SF site. Can someone explain how I can access RH7.0 there? > All I really need to do at the moment is peruse the /usr/include/db3 stuff. I don't think they have it, yet. RH7 is pretty new after all. I can give you an account on my home machine if you want, though, it's RH7. Only accessible through ssh currently, but if you want I can turn on telnet. Just send me an email with login/pw you want ;) Or if you'd rather I peruse the db3 stuff for you, that's fine too. -- Thomas Wouters Hi! I'm a .signature virus! copy me into your .signature file to help me spread! From moshez@zadka.site.co.il Sun Nov 5 08:20:45 2000 From: moshez@zadka.site.co.il (Moshe Zadka) Date: Sun, 05 Nov 2000 10:20:45 +0200 Subject: [Python-Dev] Re: Revamping Python's Numeric Model In-Reply-To: Message from "Martin v. Loewis" of "Sat, 04 Nov 2000 22:15:13 +0100." <200011042115.WAA01162@loewis.home.cs.tu-berlin.de> References: <200011042115.WAA01162@loewis.home.cs.tu-berlin.de> Message-ID: [Martin v. 
Loewis]
> It's not inherently wrong. It just points out an omission in the PEP:
> it doesn't talk about the meaning of number literals.

That's right -- but I did mean that floating-point literals will be inexact.

> Since the new
> model is going to follow algebraic principles more closely, I had
> expected that
> 
> 0.333333333 == 333333333 / 1000000000
> 
> where, as I understand the proposal, the right-hand side is an exact
> number (so 0.333333333*3 would be 999999999 / 1000000000).
> 
> One of the more-frequent questions on python-help is confusion about
> floating-point numbers, e.g. why is the result of 1.99+4.99 printed
> as 6.9800000000000004; users often report that as a bug.

That's one thing my PEP is not meant to help with -- floating point numbers will remain hard. Hopefully, people will use them less often once they have rational arithmetic. -- Moshe Zadka

From martin@loewis.home.cs.tu-berlin.de Sun Nov 5 08:28:06 2000
From: martin@loewis.home.cs.tu-berlin.de (Martin v. Loewis)
Date: Sun, 5 Nov 2000 09:28:06 +0100
Subject: [Python-Dev] Re: Revamping Python's Numeric Model
In-Reply-To: (message from Moshe Zadka on Sun, 05 Nov 2000 10:20:45 +0200)
References: <200011042115.WAA01162@loewis.home.cs.tu-berlin.de>
Message-ID: <200011050828.JAA00672@loewis.home.cs.tu-berlin.de>

> > It's not inherently wrong. It just points out an omission in the PEP:
> > it doesn't talk about the meaning of number literals.
> 
> That's right -- but I did mean that floating-point literals will be
> inexact.

Remember that your model has no notion of floating-point numbers - then what the heck are floating-point literals? The numbers that you can write in a base-10 notation are all rational numbers, and the point doesn't really float in them...
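The exact-rational reading of literals that Martin describes can be tried with the later `fractions` module, which accepts decimal strings (used here only as a stand-in for the PEP's exact literals):

```python
from fractions import Fraction

# Decimal literals read as exact rationals behave the way users expect:
assert Fraction('1.99') + Fraction('4.99') == Fraction('6.98')

# Martin's example: the literal is exactly the rational he writes down,
assert Fraction('0.333333333') == Fraction(333333333, 1000000000)
# and multiplying by 3 gives the exact product, not a rounded float:
assert Fraction('0.333333333') * 3 == Fraction(999999999, 1000000000)
```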
Regards, Martin From py-dev@zadka.site.co.il Sun Nov 5 17:25:33 2000 From: py-dev@zadka.site.co.il (Moshe Zadka) Date: Sun, 05 Nov 2000 19:25:33 +0200 Subject: [Python-Dev] Re: Revamping Python's Numeric Model In-Reply-To: Message from "Martin v. Loewis" of "Sun, 05 Nov 2000 09:28:06 +0100." <200011050828.JAA00672@loewis.home.cs.tu-berlin.de> References: <200011042115.WAA01162@loewis.home.cs.tu-berlin.de> <200011050828.JAA00672@loewis.home.cs.tu-berlin.de> Message-ID: [Martin v. Loewis] > Remind you that your model has no notion of floating-point numbers - > then what the heck are floating-point literals? The numbers that you > can write in a base-10 notation are all rational numbers, and the > point doesn't really float in them... Well, of course they are rational numbers. The only question is whether 1.0 should be inexact or exact. While that is not specified in the PEP (which was meant for Barry to assign me a PEP number primarily...), I think the principle of least surprise would be to treat 1.0 as inexact. (IOW, not to promise that (1.0/3.0)*3.0 == 1.0) -- Moshe Zadka From martin@loewis.home.cs.tu-berlin.de Sun Nov 5 10:24:47 2000 From: martin@loewis.home.cs.tu-berlin.de (Martin v. Loewis) Date: Sun, 5 Nov 2000 11:24:47 +0100 Subject: [Python-Dev] Re: Revamping Python's Numeric Model In-Reply-To: (message from Moshe Zadka on Sun, 05 Nov 2000 19:25:33 +0200) References: <200011042115.WAA01162@loewis.home.cs.tu-berlin.de> <200011050828.JAA00672@loewis.home.cs.tu-berlin.de> Message-ID: <200011051024.LAA01015@loewis.home.cs.tu-berlin.de> > Well, of course they are rational numbers. The only question is whether 1.0 > should be inexact or exact. While that is not specified in the PEP (which > was meant for Barry to assign me a PEP number primarily...), I think > the principle of least surprise would be to treat 1.0 as inexact. To long-term Python users, that would be the least surprise. 
To new users, the entire notion of inexact numbers is surprising; all the more so that something as simple as 1.0 is inexact. To computer numerics fanatics, it is surprising that 1.0 is inexact, since the common representations of floating point numbers are well capable of representing it exactly. Regards, Martin From py-dev@zadka.site.co.il Sun Nov 5 19:04:10 2000 From: py-dev@zadka.site.co.il (Moshe Zadka) Date: Sun, 05 Nov 2000 21:04:10 +0200 Subject: [Python-Dev] Re: Revamping Python's Numeric Model In-Reply-To: Message from "Martin v. Loewis" of "Sun, 05 Nov 2000 11:24:47 +0100." <200011051024.LAA01015@loewis.home.cs.tu-berlin.de> References: <200011042115.WAA01162@loewis.home.cs.tu-berlin.de> <200011050828.JAA00672@loewis.home.cs.tu-berlin.de> <200011051024.LAA01015@loewis.home.cs.tu-berlin.de> Message-ID: [Moshe Zadka] > Well, of course they are rational numbers. The only question is whether 1.0 > should be inexact or exact. While that is not specified in the PEP (which > was meant for Barry to assign me a PEP number primarily...), I think > the principle of least surprise would be to treat 1.0 as inexact. [Martin v. Loewis] > To long-term Python users, that would be the least surprise. And to long-term users of C/Perl/etc., once they map the numerical concepts correctly. But I hardly think we should be arguing about this at this stage: I'm willing to leave this as an open issue in the PEP, if this is all you find wrong with it... And a request to all Python-Devvers: please direct comments directly to me, and I promise I'll summarize them all in the PEP. As soon as I get a PEP number, I'll publish an updated version, with all objections and open issues summarized. -- Moshe Zadka From martin@loewis.home.cs.tu-berlin.de Sun Nov 5 11:21:14 2000 From: martin@loewis.home.cs.tu-berlin.de (Martin v. 
Loewis) Date: Sun, 5 Nov 2000 12:21:14 +0100 Subject: [Python-Dev] Re: Revamping Python's Numeric Model In-Reply-To: (message from Moshe Zadka on Sun, 05 Nov 2000 21:04:10 +0200) References: <200011042115.WAA01162@loewis.home.cs.tu-berlin.de> <200011050828.JAA00672@loewis.home.cs.tu-berlin.de> <200011051024.LAA01015@loewis.home.cs.tu-berlin.de> Message-ID: <200011051121.MAA01234@loewis.home.cs.tu-berlin.de> > And to long-term users of C/Perl/etc., once they map the numerical > concepts correctly. But I hardly think we should be arguing about this > at this stage: I'm willing to leave this as an open issue in the PEP, > if this is all you find wrong with it... Sorry I didn't mention it: Overall, I like your proposal very well. I'm missing the section on implementation strategies, though. > And a request to all Python-Devvers: please direct comments directly > to me, and I promise I'll summarize them all in the PEP. As soon as > I get a PEP number, I'll publish an updated version, with all > objections and open issues summarized. Yes, that is a tricky part of the PEP procedure: not commenting in the public initially. I think PEP authors can contribute by not posting the text of their PEP publicly. Regards, Martin From gstein@lyra.org Sun Nov 5 11:40:44 2000 From: gstein@lyra.org (Greg Stein) Date: Sun, 5 Nov 2000 03:40:44 -0800 Subject: [Python-Dev] Re: Revamping Python's Numeric Model In-Reply-To: <200011051121.MAA01234@loewis.home.cs.tu-berlin.de>; from martin@loewis.home.cs.tu-berlin.de on Sun, Nov 05, 2000 at 12:21:14PM +0100 References: <200011042115.WAA01162@loewis.home.cs.tu-berlin.de> <200011050828.JAA00672@loewis.home.cs.tu-berlin.de> <200011051024.LAA01015@loewis.home.cs.tu-berlin.de> <200011051121.MAA01234@loewis.home.cs.tu-berlin.de> Message-ID: <20001105034044.M10135@lyra.org> On Sun, Nov 05, 2000 at 12:21:14PM +0100, Martin v. Loewis wrote: >... 
> > And a request to all Python-Devvers: please direct comments directly > > to me, and I promise I'll summarize them all in the PEP. As soon as > > I get a PEP number, Euh... is process getting in the way of progress? Allocate yourself a PEP number and publish the darn thing. If you don't feel comfortable grabbing a PEP number, then just post it to the list or something. One of the worst things that can happen is to feel locked in by a cumbersome process. Open Source works because people can flow at their own speed, independent of what is happening with others. If Guido and company are too busy to update syncmail... no problem! Thomas has time and inclination and jumps in to fix it. Somebody too busy to revamp the headers for ANSI C? No worries... we have a lot of volunteers just ready and able to do that. But you throw in the gates? The locks? The process? It halts. > > I'll publish an updated version, with all > > objections and open issues summarized. > > Yes, that is a tricky part of the PEP procedure: not commenting in the > public initially. I think PEP authors can contribute by not posting > the text of their PEP publicly. I'm not sure what you're saying here. That a PEP author should develop the PEP entirely in private? Only when it is fully cooked should it be published? Bleh. A PEP should be a work-in-progress. Publish an empty version. Publish a first draft. Revise. Revise. Revise. Comments on PEPs are generated when people feel they are listened to. If a PEP author develops a PEP entirely in secret, then the feedback is going to be diminished because it is hard for people to really know if their comments and ideas are being captured and considered. When you have that feedback loop and the positive reinforcement, then you will engender more commentary. 
Cheers, -g -- Greg Stein, http://www.lyra.org/ From nhodgson@bigpond.net.au Sun Nov 5 11:59:03 2000 From: nhodgson@bigpond.net.au (Neil Hodgson) Date: Sun, 5 Nov 2000 22:59:03 +1100 Subject: [Python-Dev] What to choose to replace Tkinter? Message-ID: <049d01c0471f$d7899450$8119fea9@neil> Andrew Kuchling: > I believe the GNOME (not GTk's, but GNOME's) canvas widget began as a > fork of the Tk widget that was then substantially enhanced to be a > general-purpose display engine, with antialiasing, alpha compositing, > more attention to performance, and other fancy features. I don't know > if the corresponding text widget (which is Pango, www.pango.org, I > think) is equally featureful. There is a port of the Tk text widget to GTK+ by Havoc Pennington which doesn't require Pango. Its aims are a little muddled as a high quality Pango based text widget is also under long term development. Pango isn't just a text widget but a layered set of capabilities allowing development of internationalised layout and rendering (the equivalent of Microsoft's Uniscribe). Scintilla for GTK+ will use Pango to implement Unicode once Pango is released. Neil From Moshe Zadka Sun Nov 5 12:12:09 2000 From: Moshe Zadka (Moshe Zadka) Date: Sun, 5 Nov 2000 14:12:09 +0200 (IST) Subject: [Python-Dev] Re: Revamping Python's Numeric Model In-Reply-To: <200011051121.MAA01234@loewis.home.cs.tu-berlin.de> Message-ID: On Sun, 5 Nov 2000, Martin v. Loewis wrote: > Yes, that is a tricky part of the PEP procedure: not commenting in the > public initially. I think PEP authors can contribute by not posting > the text of their PEP publically. Perhaps, but that is part of the PEP flow: public post->number Well, thanks a lot for the feedback. 
-- Moshe Zadka -- 95855124 http://advogato.org/person/moshez From Moshe Zadka Sun Nov 5 12:16:46 2000 From: Moshe Zadka (Moshe Zadka) Date: Sun, 5 Nov 2000 14:16:46 +0200 (IST) Subject: [Python-Dev] Re: Revamping Python's Numeric Model In-Reply-To: <20001105034044.M10135@lyra.org> Message-ID: On Sun, 5 Nov 2000, Greg Stein wrote: > Euh... is process getting in the way of progress? Perhaps. Well, I won't let it get in the way more than a few more hours -- I'll allocate myself a PEP. It's easier to get forgiveness than permission. > I'm not sure what you're saying here. That a PEP author should develop the > PEP entirely in private? Only when it is fully cooked should it be > published? Far from it. Only that the discussion tends to clutter up Python-Dev too much, so I want to moderate it by way of private mail to me -> check-ins to the PEP. I hope everyone here trusts me to be honest enough not to censor competing points. > Bleh. A PEP should be a work-in-progress. Publish an empty version. Publish > a first draft. Revise. Revise. Revise. I intend to. > Comments on PEPs are generated when people feel they are listened to. If a > PEP author develops a PEP entirely in secret All future revisions will be in the Python CVS. Only security through obscurity can keep me secret there. -- Moshe Zadka -- 95855124 http://advogato.org/person/moshez From fdrake@acm.org Sun Nov 5 16:45:55 2000 From: fdrake@acm.org (Fred L. Drake, Jr.) Date: Sun, 5 Nov 2000 11:45:55 -0500 (EST) Subject: [Python-Dev] Re: Revamping Python's Numeric Model In-Reply-To: References: <200011051121.MAA01234@loewis.home.cs.tu-berlin.de> Message-ID: <14853.36547.748584.450976@cj42289-a.reston1.va.home.com> Moshe Zadka writes: > Perhaps, but that is part of the PEP flow: public post->number > Well, thanks a lot for the feedback. Don't hesitate to publish an updated version just because Barry hasn't assigned a number. 
If you don't want to publish the full text too often, assign a number to yourself in the PEP index and check that in, then add the actual PEP with the right number. (Make sure you run "./pep2html.py -i 0 " to update the index and your PEP on python.sourceforge.net once you've made the checkins.) -Fred -- Fred L. Drake, Jr. PythonLabs at Digital Creations From thomas@xs4all.net Sun Nov 5 19:18:51 2000 From: thomas@xs4all.net (Thomas Wouters) Date: Sun, 5 Nov 2000 20:18:51 +0100 Subject: [Python-Dev] Re: [Python-checkins] CVS: python/nondist/peps pep-0228.txt,NONE,1.1 pep-0000.txt,1.44,1.45 In-Reply-To: <200011051655.IAA13536@slayer.i.sourceforge.net>; from moshez@users.sourceforge.net on Sun, Nov 05, 2000 at 08:55:27AM -0800 References: <200011051655.IAA13536@slayer.i.sourceforge.net> Message-ID: <20001105201851.C27208@xs4all.nl> On Sun, Nov 05, 2000 at 08:55:27AM -0800, Moshe Zadka wrote: > Added Files: > pep-0228.txt > Log Message: > Added first revision of numerical model pep. > ***** Error reading new file: (2, 'No such file or directory') > ***** file: pep-0228.txt cwd: /tmp/cvs-serv13461 Thank you, Moshe ;) Now to figure out why it does work for some people, or in some cases ;P I'm guessing it's a race condition of some sort, but damned if I know what triggers it. -- Thomas Wouters Hi! I'm a .signature virus! copy me into your .signature file to help me spread! 
From gstein@lyra.org Sun Nov 5 21:54:39 2000 From: gstein@lyra.org (Greg Stein) Date: Sun, 5 Nov 2000 13:54:39 -0800 Subject: [Python-Dev] Re: [Python-checkins] CVS: python/nondist/peps pep-0228.txt,NONE,1.1 pep-0000.txt,1.44,1.45 In-Reply-To: <20001105201851.C27208@xs4all.nl>; from thomas@xs4all.net on Sun, Nov 05, 2000 at 08:18:51PM +0100 References: <200011051655.IAA13536@slayer.i.sourceforge.net> <20001105201851.C27208@xs4all.nl> Message-ID: <20001105135439.V10135@lyra.org> On Sun, Nov 05, 2000 at 08:18:51PM +0100, Thomas Wouters wrote: > On Sun, Nov 05, 2000 at 08:55:27AM -0800, Moshe Zadka wrote: > > > Added Files: > > pep-0228.txt > > Log Message: > > Added first revision of numerical model pep. > > > ***** Error reading new file: (2, 'No such file or directory') > > ***** file: pep-0228.txt cwd: /tmp/cvs-serv13461 > > Thank you, Moshe ;) Now to figure out why it does work for some people, or > in some cases ;P I'm guessing it's a race condition of some sort, but damned > if I know what triggers it. It certainly could be a race condition. Just look at blast_mail(). It forks off the operation, and it might run before the file arrives in the repository. (??) Or is it possible that the script looks for pep-0228.txt rather than getting a copy from pep-0228.txt,v ?? My script does "cvs -Qn update -p -r1.1 FILE" for new files, and pipes that to the output. Note that the log_accum.pl script that I use for my CVS repository (which came from Apache, which came from BSD) does not fork. It seems to work quite fine with added files. 
[ and the point in blast_mail() about holding the CVS lock too long is rather silly given the horsepower of the SourceForge boxes ] Cheers, -g -- Greg Stein, http://www.lyra.org/ From thomas@xs4all.net Sun Nov 5 22:12:35 2000 From: thomas@xs4all.net (Thomas Wouters) Date: Sun, 5 Nov 2000 23:12:35 +0100 Subject: [Python-Dev] Re: [Python-checkins] CVS: python/nondist/peps pep-0228.txt,NONE,1.1 pep-0000.txt,1.44,1.45 In-Reply-To: <20001105135439.V10135@lyra.org>; from gstein@lyra.org on Sun, Nov 05, 2000 at 01:54:39PM -0800 References: <200011051655.IAA13536@slayer.i.sourceforge.net> <20001105201851.C27208@xs4all.nl> <20001105135439.V10135@lyra.org> Message-ID: <20001105231235.X12776@xs4all.nl> On Sun, Nov 05, 2000 at 01:54:39PM -0800, Greg Stein wrote: > On Sun, Nov 05, 2000 at 08:18:51PM +0100, Thomas Wouters wrote: > It certainly could be a race condition. Just look at blast_mail(). It forks > off the operation, and it might run before the file arrives in the > repository. (??) No... that's not it. > Or is it possible that the script looks for pep-0228.txt rather than getting > a copy from pep-0228.txt,v ?? Almost correct :) The problem is that the loginfo process is run in the server-specific /tmp dir, and the filename is not prefixed by the path to the CVSROOT or some such. I guess it's pure coincidence that the file is still there when the syncmail script arrives at the 'open()' call. > My script does "cvs -Qn update -p -r1.1 FILE" for new files, and pipes > that to the output. Yes... I just wrote a small patch to syncmail which does exactly that (actually, it uses 'newrev' rather than a hardcoded '1.1', and it uses -f and -n but not -Q -- if we add -Q, we should add it to the diff commands as well, and it might confuse developers that are used to reading the debug info ;) The main reason I'm delaying the checkin is to test it on the only CVS repository I can play with, which is over a slow link to an american highschool. 
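In outline, the fallback Thomas describes (check the file first, pipe `cvs update -p` when it has already vanished) looks like this. The function name and surrounding structure are illustrative, not syncmail's actual code:

```python
# Sketch of the syncmail fallback for newly added files: if the file
# is already gone from the server's /tmp checkout dir (the race),
# regenerate its contents with "cvs update -p" instead of open().
import os

def read_new_file(file, newrev):
    if os.path.exists(file):
        fp = open(file)
    else:
        # -f: ignore ~/.cvsrc, -n: don't touch the checkout,
        # -p: print the requested revision to stdout
        fp = os.popen('cvs -fn update -r %s -p %s' % (newrev, file))
    lines = fp.readlines()
    fp.close()
    lines.insert(0, '--- NEW FILE ---\n')
    return lines
```

Only the `cvs` branch is subject to the tiny remaining race mentioned later in the thread; the common case still reads the file directly.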
Now if only you had mailed an hour earlier, Greg, I wouldn't have had to go through that trouble ;) > [ and the point in blast_mail() about holding the CVS lock too long is > rather silly given the horsepower of the SourceForge boxes ] Well, syncmail was written to manage the Python CVS tree on a slow Sun (I believe) and did an rsync-over-ssh to another machine as well. That can definitely take long ;) If we just remove the fork, the rest of syncmail might just work, even with new files. In the meantime, I'll check in my change. It might be the best thing to do anyway, since it shouldn't interfere unless the file isn't there. -- Thomas Wouters Hi! I'm a .signature virus! copy me into your .signature file to help me spread! From jeremy@alum.mit.edu Sun Nov 5 21:08:55 2000 From: jeremy@alum.mit.edu (Jeremy Hylton) Date: Sun, 5 Nov 2000 16:08:55 -0500 (EST) Subject: [Python-Dev] Re: Revamping Python's Numeric Model In-Reply-To: <20001105034044.M10135@lyra.org> References: <200011042115.WAA01162@loewis.home.cs.tu-berlin.de> <200011050828.JAA00672@loewis.home.cs.tu-berlin.de> <200011051024.LAA01015@loewis.home.cs.tu-berlin.de> <200011051121.MAA01234@loewis.home.cs.tu-berlin.de> <20001105034044.M10135@lyra.org> Message-ID: <14853.52327.355764.528782@bitdiddle.concentric.net> [Comments from Greg and Moshe on the PEP process.] I don't see the PEP process creating any impediments here. There is almost no process -- Barry assigns numbers to PEPs and Guido rules on them. We've got some rules about what must be in the PEP before it is approved, but almost none about how the PEP is created. There is nothing about the PEP process that prevents a healthy discussion of issues, in private mail or on a mailing list (python-dev or otherwise). We had lots of comments on the statically nested scopes PEP before Barry assigned it a number. An entire PEP could be created and discussed before it gets a number. 
Someone may want to work on a PEP in privacy and wait to share it until the entire document is complete; that's fine too, provided that revisions are made based on feedback. One goal we had when setting up the PEP process was to limit the amount of repeated discussion on an issue. It's not too uncommon for email discussions to go in circles or to endlessly rehash the same few issues. We hoped that PEP authors would incorporate a discussion of such issues in the PEP and reduce the amount of wasted bandwidth on repetitive discussion. Let's not waste time discussing how to create PEPs and instead actually create them. The clock is ticking for new features in 2.1; the tentative deadline for new PEPs is mid-December. Jeremy From jeremy@alum.mit.edu Sun Nov 5 21:14:36 2000 From: jeremy@alum.mit.edu (Jeremy Hylton) Date: Sun, 5 Nov 2000 16:14:36 -0500 (EST) Subject: [Python-Dev] Re: Revamping Python's Numeric Model In-Reply-To: <14853.36547.748584.450976@cj42289-a.reston1.va.home.com> References: <200011051121.MAA01234@loewis.home.cs.tu-berlin.de> <14853.36547.748584.450976@cj42289-a.reston1.va.home.com> Message-ID: <14853.52668.117844.28459@bitdiddle.concentric.net> >>>>> "FLD" == Fred L Drake, writes: FLD> Moshe Zadka writes: >> Perhaps, but that is part of the PEP flow: public post->number >> Well, thanks a lot for the feedback. FLD> Don't hesitate to publish an updated version just because FLD> Barry FLD> hasn't assigned a number. If you don't want to publish the FLD> full text too often, assign a number to yourself in the PEP FLD> index and check that in, then add the actual PEP with the right FLD> number. I thought we discussed this earlier and agreed that a little bit of control over the process was healthy. I would prefer to see all PEP creation go through Barry. We can circulate drafts in email before that. 
Jeremy From thomas@xs4all.net Sun Nov 5 22:29:21 2000 From: thomas@xs4all.net (Thomas Wouters) Date: Sun, 5 Nov 2000 23:29:21 +0100 Subject: [Python-Dev] Re: [Python-checkins] CVS: python/nondist/peps pep-0228.txt,NONE,1.1 pep-0000.txt,1.44,1.45 In-Reply-To: <20001105231235.X12776@xs4all.nl>; from thomas@xs4all.net on Sun, Nov 05, 2000 at 11:12:35PM +0100 References: <200011051655.IAA13536@slayer.i.sourceforge.net> <20001105201851.C27208@xs4all.nl> <20001105135439.V10135@lyra.org> <20001105231235.X12776@xs4all.nl> Message-ID: <20001105232921.E27208@xs4all.nl> On Sun, Nov 05, 2000 at 11:12:35PM +0100, Thomas Wouters wrote: > In the mean time, I'll check in my change. It might be the best thing to > do anyway, since it shouldn't interfere unless the file isn't there. Since changes to files in CVSROOT go to python-checkins-admin rather than python-checkins, here's the diff I just checked in: Index: syncmail =================================================================== RCS file: /cvsroot/python/CVSROOT/syncmail,v retrieving revision 3.14 retrieving revision 3.15 diff -c -c -r3.14 -r3.15 *** syncmail 2000/11/02 21:44:32 3.14 --- syncmail 2000/11/05 22:24:29 3.15 *************** *** 85,91 **** return '***** Bogus filespec: %s' % filespec if oldrev == 'NONE': try: ! fp = open(file) lines = fp.readlines() fp.close() lines.insert(0, '--- NEW FILE ---\n') --- 85,95 ---- return '***** Bogus filespec: %s' % filespec if oldrev == 'NONE': try: ! if os.path.exists(file): ! fp = open(file) ! else: ! update_cmd = 'cvs -fn update -r %s -p %s' % (newrev, file) ! fp = os.popen(update_cmd) lines = fp.readlines() fp.close() lines.insert(0, '--- NEW FILE ---\n') See the manpage for 'cvs' for an explanation of the options ;) This should fix 99.95% or so of the problem (there is still a tiny window for the file being removed inbetween the os.path.exists and the actual open) and is probably best even if we do remove the fork() from syncmail. -- Thomas Wouters Hi! 
I'm a .signature virus! copy me into your .signature file to help me spread! From greg@cosc.canterbury.ac.nz Mon Nov 6 00:56:16 2000 From: greg@cosc.canterbury.ac.nz (Greg Ewing) Date: Mon, 06 Nov 2000 13:56:16 +1300 (NZDT) Subject: [Python-Dev] Re: Revamping Python's Numeric Model In-Reply-To: <200011051024.LAA01015@loewis.home.cs.tu-berlin.de> Message-ID: <200011060056.NAA29965@s454.cosc.canterbury.ac.nz> "Martin v. Loewis" : > To computer numerics fanatics, it is surprising that 1.0 is inexact, > since the common representations of floating point numbers are well > capable of representing it exactly. I suppose in principle one could meticulously keep track of which floating point numbers in a calculation were exact and which weren't. But you'd lose the property that any arithmetic operation on exact operands produces an exact result. Also, it would be very tedious and inefficient to have to keep track of exactness so exactly! Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg@cosc.canterbury.ac.nz +--------------------------------------+ From greg@cosc.canterbury.ac.nz Mon Nov 6 01:03:06 2000 From: greg@cosc.canterbury.ac.nz (Greg Ewing) Date: Mon, 06 Nov 2000 14:03:06 +1300 (NZDT) Subject: [Python-Dev] Static scoping of builtins (Re: Dynamic nested scopes) In-Reply-To: <200011032031.PAA23630@cj20424-a.reston1.va.home.com> Message-ID: <200011060103.OAA29969@s454.cosc.canterbury.ac.nz> Guido: > I'd be happy to make an explicit list of > those builtins that should not be messed with There's a precedent for this in Scheme, which has a notion of "integrable procedures". 
As for the rest, with static scoping it will be possible to make access to builtins just as efficient as locals, while still allowing them to be rebound, so there's no reason why __builtin__.__dict__.open = foo can't continue to work, if so desired. Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg@cosc.canterbury.ac.nz +--------------------------------------+ From greg@cosc.canterbury.ac.nz Mon Nov 6 01:16:27 2000 From: greg@cosc.canterbury.ac.nz (Greg Ewing) Date: Mon, 06 Nov 2000 14:16:27 +1300 (NZDT) Subject: [Python-Dev] statically nested scopes In-Reply-To: <3A02B489.89EF108C@lemburg.com> Message-ID: <200011060116.OAA29972@s454.cosc.canterbury.ac.nz> "M.-A. Lemburg" : > Nested scopes will introduce cycles in all frame objects. It doesn't have to be that way. A static link is only needed if a function actually uses any variables from an outer scope. In the majority of cases, it won't. And it's possible to do even better than that. You can separate out variables referred to in an inner scope and store them separately from the rests of the frame, so you only keep what's really needed alive. > This means that with GC turned off, frame objects will live > forever Don't allow GC to be turned off, then! (Presumably this feature would only be considered once GC has become a permanent feature of Python.) > BTW, Python's GC only works for a few builtin types (frames > are not among the supported types) But they could become so if necessary, surely? Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. 
| greg@cosc.canterbury.ac.nz +--------------------------------------+ From guido@python.org Mon Nov 6 01:19:02 2000 From: guido@python.org (Guido van Rossum) Date: Sun, 05 Nov 2000 20:19:02 -0500 Subject: [Python-Dev] Revamping Python's Numeric Model In-Reply-To: Your message of "Sat, 04 Nov 2000 20:00:49 +0200." References: Message-ID: <200011060119.UAA03952@cj20424-a.reston1.va.home.com> So I go offline for a couple of days to entertain guests and have my body kicked around in a dance class, and I have 25 messages discussing Python's numeric model waiting for me... I was hoping that Tim would chime in, but he's apparently taken the weekend off -- very much out of character. :-) I like the idea of a PEP to rework the numeric model. I think that Moshe, being a mathematician by training, will make a good editor. I think that this has a snowball's chance in hell to make it into Python 2.1 however -- there are many deep issues, and it's a bad idea to experiment too much in the released version of the language. Some of the issues are more usability issues than of a mathematical nature. For example, Tim has conjectured that using binary floating point will always be a problem for the "unwashed masses" -- the only thing they might understand is decimal floating point, not rational numbers. There are at least two issues here: (1) As long as the internal representation is not the same as what is commonly printed, there will be surprises -- with rationals just as much as with floating point. There are issues with decimal floating point too, but they are only the issues having to do with loss of precision in the calculation (e.g. 1.0 - 1e-20 yielding 1.0) and not with loss of precision in the printing, where most of the "bug reports" we get seem to concentrate. 
(2) Rational numbers have the unpleasant property of growing unboundedly during many innocent calculations, thereby using up exorbitant amounts of memory and slowing down the calculation -- often mysteriously, because what is displayed is truncated. Another issue that I might bring up is that there are no inexact numbers (each floating point number is perfectly exact and rational) -- there are only inexact operations. I'm not sure what to do with this though. If we take its meaning literally, the isreal() function should only return true for numbers for which isrational() is also true: mathematically speaking, real numbers that aren't also rational don't have an easy finite representation, since they are numbers like sqrt(2) or pi. I'll leave it to Tim to explain why inexact results may not be close to the truth. Tim may also break a lance for IEEE 754. Finally, the PEP doesn't talk about how the numeric model can be extended, and how other numeric types can be blended in. E.g. I've heard of wild floating point representations that make multiplication and division really cheap but addition a pain, rather than the other way around; some people may want to have long ints implemented using the GNU mp library, and so on. Such custom types should be supported as well as native types -- like they are now. --Guido van Rossum (home page: http://www.python.org/~guido/) PS. The "1.isreal()" problem is a non-problem. This question is only interesting to ask about variables. From guido@python.org Mon Nov 6 01:25:08 2000 From: guido@python.org (Guido van Rossum) Date: Sun, 05 Nov 2000 20:25:08 -0500 Subject: [Python-Dev] PEP 224 (Attribute Docstrings) In-Reply-To: Your message of "Sat, 04 Nov 2000 10:50:10 +0100." <3A03DBD2.7E024978@lemburg.com> References: <3A03DBD2.7E024978@lemburg.com> Message-ID: <200011060125.UAA03986@cj20424-a.reston1.va.home.com> Marc-Andre: > I can take over the coercion PEP: I've been working > on this before (see the proposal on my Python Pages). 
Thanks, excellent (although I haven't seen your proposal yet). > I would also like to know whether the PEP-0224 will be considered > for 2.1 if I update the patch to make it a little more robust > w/r to the problems mentioned in that PEP -- I'd really like > to see this in Python soon, since it makes documenting Python > programs so much easier. I "kinda" like the idea of having attribute docstrings (meaning it's not of great importance to me) but there are two things I don't like in your current proposal: 1. The syntax you propose is too ambiguous: as you say, stand-alone string literal are used for other purposes and could suddenly become attribute docstrings. 2. I don't like the access method either (__doc___). > Note that I won't get around to do much work on these before > January... way too busy at the moment :-/ That's a problem -- we really want to have the PEPs ready for review by mid December. This will also be a problem for the coercion PEP -- if you think you won't be able to work on it before then, I'd prefer to find another (co-)author. --Guido van Rossum (home page: http://www.python.org/~guido/) From guido@python.org Mon Nov 6 02:05:20 2000 From: guido@python.org (Guido van Rossum) Date: Sun, 05 Nov 2000 21:05:20 -0500 Subject: [Python-Dev] Static scoping of builtins (Re: Dynamic nested scopes) In-Reply-To: Your message of "Mon, 06 Nov 2000 14:03:06 +1300." <200011060103.OAA29969@s454.cosc.canterbury.ac.nz> References: <200011060103.OAA29969@s454.cosc.canterbury.ac.nz> Message-ID: <200011060205.VAA04176@cj20424-a.reston1.va.home.com> > Guido: > > I'd be happy to make an explicit list of > > those builtins that should not be messed with [Greg Ewing] > There's a precedent for this in Scheme, which has a notion > of "integrable procedures". Good! 
> As for the rest, with static scoping it will be possible to > make access to builtins just as efficient as locals, while > still allowing them to be rebound, so there's no reason why > __builtin__.__dict__.open = foo can't continue to work, > if so desired. I'm not sure what you mean. With integrable procedures (whatever they may be :-) I believe this is possible. Without them, the lookup in globals() can be skipped for builtins, but a local is accessed with *zero* dict lookups -- how would you do this while still supporting __builtin__.__dict__.open = foo? have "hookable" dictionaries? (Those would solve a bunch of problems, but are not under consideration at the moment.) --Guido van Rossum (home page: http://www.python.org/~guido/) From gmcm@hypernet.com Mon Nov 6 02:21:47 2000 From: gmcm@hypernet.com (Gordon McMillan) Date: Sun, 5 Nov 2000 21:21:47 -0500 Subject: [Python-Dev] Stackless pages Message-ID: <3A05CF6B.32107.2CAF777@localhost> I have put up 6 pages of information about stackless at http://www.mcmillan-inc.com/stackless.html The first page attempts to give a conceptual overview of stackless. Notice I said "conceptual" - I make no attempt to be technically accurate! Next follow 4 pages of tutorial. Mainly this is a discussion of implementing generators and coroutines through the continuation module. It includes rewrites of 2 samples that Tim used to demonstrate his coroutines-implemented-via- threads madness. Finally, the last page is about SelectDispatcher, which is kind of Medusa using coroutines. Included as a demonstration is a full FTPServer that will run on Windows. This is not just demo quality code - it's at the core of a couple commercial apps I'm doing for clients, at least one of which will make something of a splash in its (large, prosperous) industry. SelectDispatcher and friends are released under the McMillan Enterprises 4 line license (do what thou wilt; maintain the copyright notice; no warranty). 
While these are not the PEPs I owe on stackless, they are part of the
background material for those PEPs, particularly in demonstrating why
some of us are so interested in seeing these facilities within core
Python.

I apologize in advance to Christian for any misunderstandings or
misinformation these pages may contain.

Enjoy!

- Gordon

From guido@python.org  Mon Nov 6 02:34:40 2000
From: guido@python.org (Guido van Rossum)
Date: Sun, 05 Nov 2000 21:34:40 -0500
Subject: [Python-Dev] Integer division transition
In-Reply-To: Your message of "Sat, 04 Nov 2000 11:21:52 +0100." <3A03E340.38F78FA2@lemburg.com>
References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> <3A03E340.38F78FA2@lemburg.com>
Message-ID: <200011060234.VAA04271@cj20424-a.reston1.va.home.com>

[Guido]
> > - Integer division.  If we want to play by Paul Prescod's Language
> > Evolution rules (PEP 5), we better get started on the first stage.
> > E.g. we could introduce a // operator in 2.1 for integer division,
> > and issue a warning when / is used for integers.  Then a year later
> > (i.e., March 2002!) we could change / so that it returns a floating
> > point number.

[MAL]
> +0... and then only, if there will be a tool to check Python
> source code for integer divides.

Hm.  I don't believe it's possible to write a tool to check for
integer divides by inspection of the source code only -- you have to
actually execute the code (with specific input, etc.).  However, with
the right warnings framework in place (I'll post some ideas about this
under a separate subject), the Python interpreter itself can be the
perfect tool to do the checking.

Given that it's pretty uncontroversial that 1/2 in Py3K should equal
0.5, I'd rather get this started sooner than later.  Let me state some
requirements:

- We don't want to break code in 2.1 that works in 2.0.

- It's okay to issue warnings though (my warnings proposal will limit
  the warnings to once per source line).
- In Py3K, 1/2 will yield 0.5 and users must use a different way to
  spell floor(x/y).

- Starting in 2.1, we want to issue warnings that encourage users to
  make their code Py3K-ready.

- We want (almost) all users to have converted their code to using
  1//2 instead of 1/2 by the time 2.n (the last 2.x version before
  Py3K is released) comes out, because unchanged code will silently
  change its meaning at the Py3K transition.

- Code that is Py3K-ready (in this respect) should trigger no warnings
  in Python 2.1.

Note: it is open for debate whether the result of x/y for integer (or
long integer) arguments should yield an integer (or long integer) in
those cases where the result *is* representable as such (e.g. 4/2).
It is possible that the numeric tower will render this problem moot --
but that depends on what happens to Moshe's PEP 228, and that's a much
longer story.  However, I think we can decide on the transition path
from / to // independent from the outcome of that discussion, since in
all cases it is clear that 1/2 will change in meaning.

Here's a concrete proposal (could be PEPped pretty easily):

- In Py3K, there will be two division operators:

  - x/y will always yield the mathematically expected result (possibly
    inexact, depending on how the numeric model is changed).

  - x//y will always yield the floor of the mathematical result, even
    for float arguments.  For complex arguments, this should raise an
    exception, just as int(1j) does today.

- Starting with Python 2.1, x//y will do the right thing (1//2 == 0).

- In 2.1, using x/y for ints and longs will issue a warning.

- In Py3K, there *might* be a special "backward incompatibility
  warning mode" that issues warnings when x/y is used for ints; 1/2
  will still yield 0.5 in this mode, but a warning will be issued.

This is a minimal proposal.
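The proposed operator semantics can be sketched concretely; the behavior below is what Python eventually adopted (a hindsight illustration, shown with Python 3 semantics):

```python
# Hindsight sketch of the proposed semantics (Python 3 behavior):
assert 1 / 2 == 0.5       # x/y: mathematically expected result
assert 1 // 2 == 0        # x//y: floor division for ints
assert 7.0 // 2 == 3.0    # // floors even for float arguments
assert -7 // 2 == -4      # floor, not truncation toward zero

# For complex arguments // raises, just as int(1j) does:
try:
    1j // 2
except TypeError:
    pass
```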
If Python were to grow a syntax for pragmas, it would be nice to have
a pragma saying "I want int/int to do float division" in code that is
Py3K ready; otherwise, this always has to be spelled as float(x)/y to
ensure proper working in 2.x as well as in Py3K.

David Scherer proposed to spell this pragma as a "magical import"
(http://www.python.org/pipermail/idle-dev/2000-April/000138.html).
This is an OK idea (+0) because it can be assumed to fail in pre-2.1
installations and doesn't require new syntax.  I don't give it a +1
because it's a hack -- "import" doesn't quite convey the intention.
(Perl's "use" is better for this purpose!)

Tim didn't seem to like this idea much
(http://www.python.org/pipermail/python-dev/2000-April/010029.html).
His dislike seems based on the assumption that such annotations would
mention specific language (or interpreter) version numbers, which
could be interpreted as resisting progress (one moment 1.7 is a
forward looking version, but the next moment it is backward looking).
However if we use directives (I don't want to call them pragmas
because pragmas are supposed to be ignorable) to select specific
features, especially features for which there are only two versions
(the old way and the new way), then it seems okay to use such a
mechanism -- if we can agree on a syntax for directives.

Hm, reading Tim's post again it seems he's mostly objecting against
defaulting to an old version.  I have to agree with him there.
However what I'm proposing here is defaulting to the current version,
and allowing a way to select a "future version" as an alternative.

If we don't adopt directives, all we need to do (in Python 2.1) is add
a new opcode for //, keeping the opcode for / unchanged.  If we do
adopt directives, we'll need to introduce two new opcodes: one for the
new (always float) /, one for the new //, still keeping the old /
opcode with the 2.0 meaning.  The latter is what David Scherer
proposes (and what he needs for his students).
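Scherer's "magical import" is essentially the spelling that later shipped as a __future__ directive in Python 2.2; a hindsight sketch of how it reads in source (under Python 3 the import is accepted but redundant, since true division is the default):

```python
# Hindsight sketch: the "magical import" spelling that was eventually
# adopted as a __future__ directive.  Under Python 3 the import is
# accepted but redundant -- true division is already the default.
from __future__ import division

assert 1 / 2 == 0.5       # the new meaning of /
assert 1 // 2 == 0        # floor division spelled //
```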
Note that it would be okay to introduce directives in a later 2.x
version -- this won't break any previous code.

--Guido van Rossum (home page: http://www.python.org/~guido/)

From guido@python.org  Mon Nov 6 02:44:57 2000
From: guido@python.org (Guido van Rossum)
Date: Sun, 05 Nov 2000 21:44:57 -0500
Subject: [Python-Dev] Class/type dichotomy thoughts
In-Reply-To: Your message of "Sat, 04 Nov 2000 11:21:52 +0100." <3A03E340.38F78FA2@lemburg.com>
References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> <3A03E340.38F78FA2@lemburg.com>
Message-ID: <200011060244.VAA04289@cj20424-a.reston1.va.home.com>

[me]
> > - Class/type dichotomy???

[MAL]
> One thing that would probably be implementable is a way to
> maintain "instance" dictionaries for types (which are created
> on-demand whenever an assignment is made).
>
> This would enable extending types with new methods and attributes.
> "Subclassing" could then be emulated by using new constructors which
> add the new or changed methods to each created type instance, e.g.
>
>     class myclose:
>
>         def __init__(self, object, basemethod):
>             self.object = object
>             self.basemethod = basemethod
>
>         def __call__(self):
>             print 'Closed file %s' % self.object
>             self.basemethod()
>
>     def myfile(filename):
>         f = open(filename)
>         # add/override attributes
>         f.newattribute = 1
>         # add/override methods
>         f.close = myclose(f, f.close)
>         return f
>
> Types would have to be made aware of this possibility. Python
> could provide some helping APIs to make life easier for the
> programmer.

But this would require an extra pointer field for *all* built-in
types.  That would seriously impact the space requirements for ints
and floats!

As long as we're proposing hacks like this that don't allow smooth
subclassing yet but let you get at least some of the desired effects,
I'd rather propose to introduce some kind of metaclass that will allow
you to use a class statement to define this.
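In hindsight, the effect MAL and Guido are sketching here -- extra attributes and overridden methods on a built-in type -- became plain subclassing after the 2.2 type/class unification; a minimal editorial sketch, with io.StringIO standing in for a file object and MyFile as an illustrative name:

```python
import io

# Hindsight sketch: after the 2.2 type/class unification, the effect
# sketched above is plain subclassing of a built-in type.  MyFile is
# an illustrative name; io.StringIO stands in for a file object.
class MyFile(io.StringIO):
    def __init__(self, text):
        super().__init__(text)
        self.newattribute = 1          # instance dict on a built-in

    def close(self):                   # overridden method ("myclose")
        self.was_closed = True
        super().close()

f = MyFile("spam")
assert f.newattribute == 1
assert f.read() == "spam"
f.close()
assert f.was_closed and f.closed
```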
Thinking aloud:

    import types

    filemetaclass = metaclass(types.FileType)

    class myfile(filemetaclass):

        def __init__(self, filename):
            filemetaclass.__init__(filename)
            self.newattribute = 1

        def close(self):
            myclose(self)
            filemetaclass.close(self)

I'm not quite sure what kind of object "filemetaclass" here should be
or what exactly "metaclass()" should do, but it could create a new
type that has the lay-out of an existing file object, with an instance
dictionary (for newattribute) tacked on the end.

Later maybe (I'm going to brainstorm with Jim Fulton about types and
classes).

--Guido van Rossum (home page: http://www.python.org/~guido/)

From guido@python.org  Mon Nov 6 02:48:47 2000
From: guido@python.org (Guido van Rossum)
Date: Sun, 05 Nov 2000 21:48:47 -0500
Subject: [Python-Dev] Weak references
In-Reply-To: Your message of "Sat, 04 Nov 2000 11:21:52 +0100." <3A03E340.38F78FA2@lemburg.com>
References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> <3A03E340.38F78FA2@lemburg.com>
Message-ID: <200011060248.VAA04306@cj20424-a.reston1.va.home.com>

[me]
> > - Weak references.  This *is* a PEP, but there's no contents yet.  We
> > could also try to implement (just) weak dictionaries.

[MAL]
> These already exist...

http://www.handshake.de/~dieter/weakdict.html
>
> mx.Proxy also has an implementation which support weak references.

Thanks.  For Fred to read...

> BTW, are these still needed now that we have GC ?

Yes, definitely.  Weak dicts are sometimes needed for situations where
a regular dict would keep objects alive forever.  E.g. we were made
aware of a "leak" in JPython that could only be fixed with weak dicts:
the Swing wrapper code has a global dict mapping widgets to callback
functions, and this keeps all widgets alive forever.  The Java GC
doesn't destroy the widgets, because they are still referenced from
the dict.  A weak dict solves this problem nicely (if it weren't that
JDK 1.1 doesn't support weak dicts).
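The JPython/Swing leak described above is the canonical use case for a weak-keyed mapping; a sketch using the weakref module that later entered the standard library (class and function names here are illustrative):

```python
import weakref

# Illustrative names: a widget->callback table that cannot leak,
# because the keys are only weakly referenced.
class Widget:
    pass

def on_click():
    pass

callbacks = weakref.WeakKeyDictionary()

w = Widget()
callbacks[w] = on_click
assert len(callbacks) == 1

# Under CPython's reference counting the entry disappears as soon as
# the last strong reference to the widget goes away.
del w
assert len(callbacks) == 0
```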
--Guido van Rossum (home page: http://www.python.org/~guido/)

From fdrake@acm.org  Mon Nov 6 02:49:25 2000
From: fdrake@acm.org (Fred L. Drake, Jr.)
Date: Sun, 5 Nov 2000 21:49:25 -0500 (EST)
Subject: [Python-Dev] Re: Revamping Python's Numeric Model
In-Reply-To: <14853.52668.117844.28459@bitdiddle.concentric.net>
References: <200011051121.MAA01234@loewis.home.cs.tu-berlin.de> <14853.36547.748584.450976@cj42289-a.reston1.va.home.com> <14853.52668.117844.28459@bitdiddle.concentric.net>
Message-ID: <14854.7221.546916.848838@cj42289-a.reston1.va.home.com>

Jeremy Hylton writes:
> I thought we discussed this earlier and agreed that a little bit of
> control over the process was healthy.  I would prefer to see all PEP
> creation go through Barry.  We can circulate drafts in email before

I think I hadn't actually noticed some of that email, or perhaps there
was a conversation I've forgotten.  Fine.  I still don't see a problem
for people creating PEPs; there's always email and the files can be
published at alternate locations before a number has been assigned.

-Fred

--
Fred L. Drake, Jr.
PythonLabs at Digital Creations

From fdrake@acm.org  Mon Nov 6 03:06:29 2000
From: fdrake@acm.org (Fred L. Drake, Jr.)
Date: Sun, 5 Nov 2000 22:06:29 -0500 (EST)
Subject: [Python-Dev] Weak references
In-Reply-To: <200011060248.VAA04306@cj20424-a.reston1.va.home.com>
References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> <3A03E340.38F78FA2@lemburg.com> <200011060248.VAA04306@cj20424-a.reston1.va.home.com>
Message-ID: <14854.8245.959258.340132@cj42289-a.reston1.va.home.com>

Guido van Rossum writes:
> Yes, definitely.  Weak dicts are sometimes needed for situations where
> a regular dict would keep objects alive forever.  E.g. we were made
> aware of a "leak" in JPython that could only be fixed with weak dicts:
> the Swing wrapper code has a global dict mapping widgets to callback

That's a perfect example.
I've started working on some text describing the motivation; hopefully I'll have that fleshed out and checked in later this week. -Fred -- Fred L. Drake, Jr. PythonLabs at Digital Creations From greg@cosc.canterbury.ac.nz Mon Nov 6 03:19:28 2000 From: greg@cosc.canterbury.ac.nz (Greg Ewing) Date: Mon, 06 Nov 2000 16:19:28 +1300 (NZDT) Subject: [Python-Dev] Class/type dichotomy thoughts In-Reply-To: <200011060244.VAA04289@cj20424-a.reston1.va.home.com> Message-ID: <200011060319.QAA00004@s454.cosc.canterbury.ac.nz> Guido: > [MAL] > > One thing that would probably be implementable is a way to > > maintain "instance" dictionaries for types > But this would require an extra pointer field for *all* built-in > types. Ruby has an interesting solution to this. It keeps such "extra" instance variables in a global data structure. The Python version of this would be to have a special global dict which maps instances of built-in types to dicts holding their extra instance variables. The keys in this dict would have to be weak references, so that they wouldn't keep the objects alive. A flag would be set in the object header so that, when the object was deleted, the corresponding entry in the global dict could be cleaned up. The overhead would then be one bit in each object, and one extra test when deleting an object. Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. 
| greg@cosc.canterbury.ac.nz +--------------------------------------+

From greg@cosc.canterbury.ac.nz  Mon Nov 6 03:20:37 2000
From: greg@cosc.canterbury.ac.nz (Greg Ewing)
Date: Mon, 06 Nov 2000 16:20:37 +1300 (NZDT)
Subject: [Python-Dev] Integer division transition
In-Reply-To: <200011060234.VAA04271@cj20424-a.reston1.va.home.com>
Message-ID: <200011060320.QAA00007@s454.cosc.canterbury.ac.nz>

Guido:
> Here's a concrete proposal (could be PEPped pretty easily):

Looks good to me.

Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,          | A citizen of NewZealandCorp, a       |
Christchurch, New Zealand          | wholly-owned subsidiary of USA Inc.  |
| greg@cosc.canterbury.ac.nz       +--------------------------------------+

From guido@python.org  Mon Nov 6 03:35:26 2000
From: guido@python.org (Guido van Rossum)
Date: Sun, 05 Nov 2000 22:35:26 -0500
Subject: [Python-Dev] Warning framework
In-Reply-To: Your message of "Sat, 04 Nov 2000 11:21:52 +0100." <3A03E340.38F78FA2@lemburg.com>
References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> <3A03E340.38F78FA2@lemburg.com>
Message-ID: <200011060335.WAA04452@cj20424-a.reston1.va.home.com>

Before I fall asleep let me write up my ideas about the warning
framework.

Requirements:

- A C-level API that lets C code issue a warning with a single call
  taking one or two arguments, e.g. Py_Warning(level, message).  (The
  'level' argument is an example only; I'm not sure what if any we
  need.)

- After the first time a specific warning is issued for a given source
  code location, the overhead of calling Py_Warning() should be
  minimal.

- An equivalent Python level API, e.g. sys.warning(level, message).

- Flexible control over which warnings are printed or not; there
  should be a way to set this up from within the Python program but
  also from the command line or possibly using an environment
  variable.
- Control over the disposition of warnings; by default they should be
  printed to sys.stderr but an alternative disposition should be
  supported (the mechanism could be either a different file or a
  different callback function).

- By default, a warning is printed once (the first time it is issued)
  for each source line where it is issued.

- For a specific warning at a specific source code location, it should
  be possible to specify the following alternatives:

  - Turn it into an exception
  - Don't print it at all
  - Print it each time it is issued

- It should also be possible to specify these alternatives:

  - For all warnings
  - For all warnings in a specific module or file
  - For all warnings at a specific source code location
  - For a specific warning everywhere in a specific module or file
  - For a specific warning everywhere in the program
  - For all warnings at/above/below (?) a specific level, if we use
    warning levels

Possible implementation:

- Each module has a dictionary __warnings__ in its global __dict__,
  which records the state of warnings.  It is created as an empty dict
  if it doesn't exist when it is needed.  The keys are (message,
  linenumber) tuples (the module or file is implicit through the use
  of the module's __dict__).  The value is None if no more action is
  needed for this particular warning and location.  Some other values
  may indicate the options "always print warning" (1?) and "raise an
  exception" (-1?).

- There's a list of "filters" in the sys module
  (e.g. sys.warningfilters) that is checked whenever a warning doesn't
  have a hit in the __warnings__ dict.  Entries in the filter list are
  (file, line, message, action) tuples.  (If we decide to implement
  warning levels, these must also be represented here somewhere.)

- The file must be None or a shell matching pattern, e.g. "*foo"; the
  ".py" suffix is optional; a partial pathname may be given too.  So
  "foo/bar" matches "/usr/lib/python2.0/foo/bar.py" but also
  "/home/guido/libp/tralala/foo/bar.py".
  If the file is None or "*" the filter applies regardless of the
  file.

- The line must be None or an integer.  If the file is None or "*"
  (indicating all files) the line must be None and is ignored.

- The message must be None or a string.  If it is None, the filter
  applies to all messages.  The message string may end in "*" to match
  all messages with the given text (up to the "*").

- The action must be one of the following strings:

  - "ignore" -- the warning is never printed
  - "always" -- the warning is always printed
  - "once" -- the warning is printed for the first occurrence matching
    the filter
  - "module" -- the warning is printed for the first occurrence in
    each module matching the filter
  - "location" -- the warning is printed for the first occurrence at
    each source code location (module + line) matching the filter
  - "exception" -- the warning is turned into an exception whenever it
    matches the filter

  Note: implementation of "once" and "module" requires additional
  state per filter entry; I'm not sure if that's worth the effort.

- When the warning system decides to print a warning, it is given to
  sys.displaywarning(file, line, message), which by default does
  something like

      print >>sys.stderr, file, ":", line, ":", message

- There should be a function sys.addwarningfilter(file, line, message,
  action) that appends items to sys.warningfilters after some sanity
  checking.
- There should be command line options to specify the most common
  filtering actions, which I expect to include at least:

  - suppress all warnings
  - suppress a particular warning message everywhere
  - suppress all warnings in a particular module
  - turn all warnings into exceptions

--Guido van Rossum (home page: http://www.python.org/~guido/)

From greg@cosc.canterbury.ac.nz  Mon Nov 6 03:34:26 2000
From: greg@cosc.canterbury.ac.nz (Greg Ewing)
Date: Mon, 06 Nov 2000 16:34:26 +1300 (NZDT)
Subject: [Python-Dev] Static scoping of builtins (Re: Dynamic nested scopes)
In-Reply-To: <200011060205.VAA04176@cj20424-a.reston1.va.home.com>
Message-ID: <200011060334.QAA00011@s454.cosc.canterbury.ac.nz>

Guido:
> the lookup in
> globals() can be skipped for builtins, but a local is accessed with
> *zero* dict lookups -- how would you do this while still supporting
> __builtin__.__dict__.open = foo? have "hookable" dictionaries?

With fully static scoping, I envisage that all three kinds of scope
(local, module and builtin) would be implemented in essentially the
same way, i.e. as arrays indexed by integers.

That being the case, all you need to do is arrange for the __builtin__
module and the global scope to be one and the same thing, and
__builtin__.open = foo will work just fine (assuming open() isn't one
of the special inlinable functions).

Getting __builtin__.__dict__['open'] = foo to work as well may require
some kind of special dictionary-like object.  But then you're going to
need that anyway if you want to continue to support accessing module
namespaces as if they are dictionaries.  Whether it's worth continuing
to support that in Py3k is something that can be debated separately.

> integrable procedures (whatever they may be :-)

In the Revised^n Report, some builtin procedures are declared to be
"integrable", meaning that the compiler is allowed to assume that they
have their usual definitions and optimise accordingly.
(This is quite important in Scheme, even more so than in Python, when you consider that almost every operation in Scheme, including '+', is a procedure call!) Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg@cosc.canterbury.ac.nz +--------------------------------------+ From guido@python.org Mon Nov 6 03:40:33 2000 From: guido@python.org (Guido van Rossum) Date: Sun, 05 Nov 2000 22:40:33 -0500 Subject: [Python-Dev] More Unicode support In-Reply-To: Your message of "Sat, 04 Nov 2000 11:21:52 +0100." <3A03E340.38F78FA2@lemburg.com> References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> <3A03E340.38F78FA2@lemburg.com> Message-ID: <200011060340.WAA04479@cj20424-a.reston1.va.home.com> [me] > > - Internationalization. Barry knows what he wants here; I bet Martin > > von Loewis and Marc-Andre Lemburg have ideas too. [MAL] > We'd need a few more codecs, support for the Unicode compression, > normalization and collation algorithms. Hm... There's also the problem that there's no easy way to do Unicode I/O. I'd like to have a way to turn a particular file into a Unicode output device (where the actual encoding might be UTF-8 or UTF-16 or a local encoding), which should mean that writing Unicode objects to the file should "do the right thing" (in particular should not try to coerce it to an 8-bit string using the default encoding first, like print and str() currently do) and that writing 8-bit string objects to it should first convert them to Unicode using the default encoding (meaning that at least ASCII strings can be written to a Unicode file without having to specify a conversion). I support that reading from a "Unicode file" should always return a Unicode string object (even if the actual characters read all happen to fall in the ASCII range). 
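What Guido describes in the Unicode message above -- a file that encodes on output, decodes on input, and always returns Unicode strings from reads -- is essentially the design that later became the io module's text layer; a hindsight sketch:

```python
import io

# Hindsight sketch: a byte stream wrapped as a "Unicode output
# device" -- text is encoded on the way out, decoded on the way in,
# and reads always return a text (Unicode) string.
raw = io.BytesIO()
f = io.TextIOWrapper(raw, encoding="utf-8")
f.write("caf\u00e9")
f.flush()
assert raw.getvalue() == b"caf\xc3\xa9"   # encoded on output

g = io.TextIOWrapper(io.BytesIO(b"caf\xc3\xa9"), encoding="utf-8")
s = g.read()
assert s == "caf\u00e9"                    # decoded on input
assert isinstance(s, str)                  # always a Unicode string
```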
This requires some serious changes to the current I/O mechanisms; in
particular str() needs to be fixed, or perhaps a ustr() needs to be
added that is used in certain cases.

Tricky, tricky!

--Guido van Rossum (home page: http://www.python.org/~guido/)

From guido@python.org  Mon Nov 6 03:48:22 2000
From: guido@python.org (Guido van Rossum)
Date: Sun, 05 Nov 2000 22:48:22 -0500
Subject: [Python-Dev] Stackless pages
In-Reply-To: Your message of "Sun, 05 Nov 2000 21:21:47 EST." <3A05CF6B.32107.2CAF777@localhost>
References: <3A05CF6B.32107.2CAF777@localhost>
Message-ID: <200011060348.WAA04560@cj20424-a.reston1.va.home.com>

> I have put up 6 pages of information about stackless at
>
> http://www.mcmillan-inc.com/stackless.html

Gordon, thanks for doing this.  I still have a review of Stackless on
my TODO list.  It takes a serious chunk of my time to do it justice,
and this continues to be a problem, but the existence of your overview
certainly helps.  I still think that the current Stackless
implementation is too complex, and that continuations aren't worth the
insanity they seem to require (or cause :-), but that microthreads and
coroutines *are* worth having and that something not completely unlike
Stackless will be one day the way to get there...

--Guido van Rossum (home page: http://www.python.org/~guido/)

From jeremy@alum.mit.edu  Mon Nov 6 03:55:17 2000
From: jeremy@alum.mit.edu (Jeremy Hylton)
Date: Sun, 5 Nov 2000 22:55:17 -0500 (EST)
Subject: [Python-Dev] Stackless pages
In-Reply-To: <200011060348.WAA04560@cj20424-a.reston1.va.home.com>
References: <3A05CF6B.32107.2CAF777@localhost> <200011060348.WAA04560@cj20424-a.reston1.va.home.com>
Message-ID: <14854.11173.601039.883893@bitdiddle.concentric.net>

[Changed discussion list from general python-list to specific
stackless.]

>>>>> "GvR" == Guido van Rossum writes:

  >> I have put up 6 pages of information about stackless at
  >> http://www.mcmillan-inc.com/stackless.html

  GvR> Gordon, thanks for doing this.
  GvR> I still have a review of Stackless on my TODO list.  It takes a
  GvR> serious chunk of my time to do it justice, and this continues
  GvR> to be a problem, but the existence of your overview certainly
  GvR> helps.  I still think that the current Stackless implementation
  GvR> is too complex, and that continuations aren't worth the
  GvR> insanity they seem to require (or cause :-), but that
  GvR> microthreads and coroutines *are* worth having and that
  GvR> something not completely unlike Stackless will be one day the
  GvR> way to get there...

I tend to agree with you, Guido.  I think we would do well to
purposefully omit continuations from the Python language.  There seems
to be little need for a facility to implement arbitrary control
structures in Python.  If Python supports coroutines and microthreads,
I am not sure what else would be needed.

It would be very helpful if the PEPs on Stackless could address this
issue.  One way to address it is to ask these questions: What new
control structures do users want in Python?  How best can they be
implemented?  Are continuations necessary to implement them or are
there other options?

The sort of implementation complexity that I worry about with
Stackless is, e.g. clean interaction with the C stack.  If a Python C
API call is made that pushes a C stack frame, e.g. PyObject_Compare,
then a continuation stored before that call can no longer be invoked.
The problem is that continuations break the notion that a C API call
will always return an answer; they create a situation in which the C
call that is made should never return, because control is transferred
to the continuation.  I assume Stackless raises an error in this case,
but this seems pretty messy: How do we write a language spec that
explains when an error will occur without appealing to the language
implementation?
Jeremy

From petrilli@amber.org  Mon Nov 6 04:06:35 2000
From: petrilli@amber.org (Christopher Petrilli)
Date: Sun, 5 Nov 2000 23:06:35 -0500
Subject: [Python-Dev] Weak references
In-Reply-To: <14854.8245.959258.340132@cj42289-a.reston1.va.home.com>; from fdrake@acm.org on Sun, Nov 05, 2000 at 10:06:29PM -0500
References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> <3A03E340.38F78FA2@lemburg.com> <200011060248.VAA04306@cj20424-a.reston1.va.home.com> <14854.8245.959258.340132@cj42289-a.reston1.va.home.com>
Message-ID: <20001105230635.A18694@trump.amber.org>

Fred L. Drake, Jr. [fdrake@acm.org] wrote:
>
> Guido van Rossum writes:
> > Yes, definitely.  Weak dicts are sometimes needed for situations where
> > a regular dict would keep objects alive forever.  E.g. we were made
> > aware of a "leak" in JPython that could only be fixed with weak dicts:
> > the Swing wrapper code has a global dict mapping widgets to callback
>
> That's a perfect example.  I've started working on some text
> describing the motivation; hopefully I'll have that fleshed out and
> checked in later this week.

Another example: some of the things in Zope use back-references for
ease of traversability (or worse, keep weird counts hanging around).
A lot of these are negated by ZODB's ability to break cycles, but... a
lot of data structures would be hugely better from an architecture
perspective if we had a native weak reference.

Chris
--
| Christopher Petrilli
| petrilli@amber.org

From est@hyperreal.org  Mon Nov 6 04:27:52 2000
From: est@hyperreal.org (est@hyperreal.org)
Date: Sun, 5 Nov 2000 20:27:52 -0800 (PST)
Subject: [Python-Dev] Weak references
In-Reply-To: <200011060248.VAA04306@cj20424-a.reston1.va.home.com> from Guido van Rossum at "Nov 5, 2000 09:48:47 pm"
Message-ID: <20001106042752.25360.qmail@hyperreal.org>

Guido van Rossum discourseth:
> [MAL]
> > These already exist...
http://www.handshake.de/~dieter/weakdict.html > > > > mx.Proxy also has an implementation which support weak references. > > Thanks. For Fred to read... He may want to also see my http://www.hyperreal.org/~est/python/weak (also registered in the vaults), an implementation of weak dicts and proxy maps that doesn't require the contained objects to be subclassed. It even has documentation in standard format (!) including some words about motivation. One simple implementation approach to make it work for all objects might be to have: int PyObject_PushWeakHandler(PyObject *o, void (*handler)(PyObject *o, PyObject *data), PyObject *data); When an object is deallocated all its handlers would be called on the object and the data that was registered with the handler. I believe this would make weak dicts a very simple extension module. With this approach I suspect DATA should be incref'd by the call to PyObject_PushWeakHandler() and decref'd after the associated handler is called. Best, Eric From est@hyperreal.org Mon Nov 6 04:39:29 2000 From: est@hyperreal.org (est@hyperreal.org) Date: Sun, 5 Nov 2000 20:39:29 -0800 (PST) Subject: [Python-Dev] Stackless pages In-Reply-To: <14854.11173.601039.883893@bitdiddle.concentric.net> from Jeremy Hylton at "Nov 5, 2000 10:55:17 pm" Message-ID: <20001106043929.2515.qmail@hyperreal.org> Jeremy Hylton discourseth: > > The sort of implementation complexity that I worry about with > Stackless is, e.g. clean interaction with the C stack. If a Python C > API call is made that pushes a C stack frame, e.g. PyObject_Compare, > then a continuation stored before that call can no longer be invokved. > The problem is that continuations break the notion a C API call will > always return an answer; they create a situation in which the C call > that is made should never return, because control is transferred to > the continuation. 
I assume Stackless raises an error in this case, > but this seems pretty messy: How do we right a language spec that > explains when an error will occur without appealing to the > language implementation? This point is particularly worrisome to me because of a common pattern I see in my own Python development work. I'll define a class which is parameterized with some callbacks. Sometimes, when profiling reveals the need, I'll move these classes to C. If the client of the class is using continuations via its callbacks, it may suddenly break. This seems a step back in the modularity I can count on in my program components. ..and I say this as a long-time, die-hard Schemer. :) I definitely pine for micro-threads in some of my application domains though. Eric From greg@cosc.canterbury.ac.nz Mon Nov 6 04:50:09 2000 From: greg@cosc.canterbury.ac.nz (Greg Ewing) Date: Mon, 06 Nov 2000 17:50:09 +1300 (NZDT) Subject: [Python-Dev] Stackless pages In-Reply-To: <14854.11173.601039.883893@bitdiddle.concentric.net> Message-ID: <200011060450.RAA00019@s454.cosc.canterbury.ac.nz> Jeremy Hylton : > Are continuations necessary to implement them or are > there other options? I think you'll find that any implementation of microthreads or coroutines or whatever you want to call them, that doesn't rely on playing nonportable tricks with the C stack, will be just as mindbending as Stackless. > The problem is that continuations break the notion a C API call will > always return an answer; So do threads or coroutines. As soon as you have multiple threads of control, you have the chance that one of them will switch to another and never get back. > I assume Stackless raises an error in this case, > but this seems pretty messy This messiness isn't the fault of Stackless itself, but of the large amount of code which *hasn't* been converted to the Stackless Way. If the whole of Python and everything that it calls were made truly stackless, the problem would not arise. 
Doing so, however, would not be fun. It wouldn't be fun for any future extension writers, either. I can't see any way out of this. Continuations/coroutines/ microthreads are all basically the same thing underneath, and they imply an execution model that just doesn't fit well with C. Maybe we need to reimplement Python in Scheme, and then feed it through a good Scheme compiler. SPython, anyone? Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg@cosc.canterbury.ac.nz +--------------------------------------+ From moshez@zadka.site.co.il Mon Nov 6 13:52:55 2000 From: moshez@zadka.site.co.il (Moshe Zadka) Date: Mon, 06 Nov 2000 15:52:55 +0200 Subject: [Python-Dev] Revamping Python's Numeric Model In-Reply-To: Message from Guido van Rossum of "Sun, 05 Nov 2000 20:19:02 EST." <200011060119.UAA03952@cj20424-a.reston1.va.home.com> References: <200011060119.UAA03952@cj20424-a.reston1.va.home.com> Message-ID: [GvR] > So I go offline for a couple of days to entertain guests and have my > body kicked around in a dance class, and I have 25 messages discussing > Python's numeric model waiting for me... I think the solution is obvious -- stop going offline to entertain guests. [GvR] > I was hoping that Tim would chime in Me too. I even tried to drag him in by mentioning 754. [GvR] > I like the idea of a PEP to rework the numeric model. I think that > Moshe, being a mathematician by training, will make a good editor. I > think that this has a snowball's chance in hell to make it into Python > 2.1 however -- there are many deep issues, and it's a bad idea to > experiment too much in the released version of the language. The reason the PEP was written now was because you started making sounds of causing 1/2 to be 0.5, which is dangerous. 
I tried to outline the PEP to show a less dangerous (and less innovative) way of getting similar usability. [GvR] > (1) As long as the internal representation is not the same as what is > commonly printed, there will be surprises -- with rationals just > as much as with floating point. There are issues with decimal > floating point too, but they are only the issues having to do with > loss of precision in the calculation (e.g. 1.0 - 1e-20 yielding > 1.0) and not with loss of precision in the printing, where most of > the "bug reports" we get seem to concentrate. My PEP does not yet deal with either written or input representation. [GvR] > (2) Rational numbers have the unpleasant property of growing > unboundedly during many innocent calculations, thereby using up > exorbitant amounts of memory and slowing down the calculation -- > often mysteriously, because what is displayed is truncated. Those calculations, if performed with floating points, would often build up inaccuracy. I prefer to go slow+memory hog (which the user feels), than to be wildly inaccurate. [GvR] > Another issue that I might bring up is that there are no inexact > numbers (each floating point number is perfectly exact and rational) You'll note that my PEP does not mention floating point explicitly -- and once again I mention that my PEP does not yet deal with number literals. All it allows (not requires) is for things like math.sqrt() to return inexact results. Naive implementations (which we might use) would make math.sqrt(1) an *inexact* 1, tentatively called 1.0. Of course, good implementations of math.sqrt() would realize that 1 has an exact root, but that might be too hard to do for not enough gain. [GvR] > If we take its meaning literally, the isreal() function should only > return true for numbers for which isrational() is also true: Correct -- in my current model. If you later add things like constructive reals, that is no longer true: if I have a constructive Pi, it's not rational.
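For illustration with today's Python: the exact/inexact distinction sketched above survives only crudely, as the int/float type split. math.sqrt() unconditionally returns a float, the "inexact 1, tentatively called 1.0" case, even when the argument has an exact root:

```python
import math

# math.sqrt() always returns a float, even for perfect squares --
# a naive "inexact" result in the sense described above.
root = math.sqrt(1)
print(root)                      # -> 1.0
print(isinstance(root, float))   # -> True
print(math.sqrt(4))              # -> 2.0
```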
[GvR] > mathematically speaking, real numbers that aren't also rational don't > have an easy finite representation, since they are numbers like > sqrt(2) or pi. But numbers don't have to have a finite (periodic really) representation to be representable in the computer: what about infinite continued fractions, for example? [GvR] > Finally, the PEP doesn't talk about how the numeric model can be > extended That's because it's rich enough not to need it. Let me explain exactly what I mean: as long as all field operations between Python numbers give honest to god Python numbers, then everything else can be solved with Python's current model of coercions, and can be solved well when the coercion PEP will be written. [GvR] > I've > heard of wild floating point representations that make multiplication > and division really cheap but addition a pain, rather than the other > way around; Well, no problems: write wild.inexact() and wildmath.{sqrt,...} and use that instead of inexact() and math.{...}. How these numbers interact with builtin Python numbers is your responsibility -- and that's what the coercions are for. Same goes for gmp: as long as you're not expecting to be able to change 10000000000+10000000000 to be a gmp long rather than a Python long, then there shouldn't be a problem. -- Moshe Zadka From paulp@ActiveState.com Mon Nov 6 06:24:27 2000 From: paulp@ActiveState.com (Paul Prescod) Date: Sun, 5 Nov 2000 22:24:27 -0800 (PST) Subject: [Python-Dev] Warnings PEP Message-ID: Abstract This PEP describes a generalized warning mechanism for Python 2.1. The primary purpose of this mechanism is to alert the programmer or user of a program to potential or likely errors which, for whatever reason, are not considered exception-worthy. For example, this might be done to keep old code working during a transitional period or to alert the programmer or user of a recoverable error. Syntax assert >> cls, test[[[, arg], arg]...]
"cls" may be any callable object that takes a list as a single argument argument list and returns an object with the required attributes "get_action" and "format_message" * get_action() -> "warn"|"error"|"suppress" * format_message() -> string A provided base class implements these methods in a reusable fashion. Warning creators are encouraged to merely subclass. This extended form of the assertion statement calls the assertion handler code in the new "assertions" module. The semantics of the built-in assertion handler are defined by the following code. It should be exposed in a new "assertions" module. def handle_assertion(cls, message = ""): "This code is called when an assertion fails and cls is not None" obj = cls(message) action = obj.get_action() if action=="error": *** existing assertion code *** elif action=="warn": sys.stderr.write(obj.format_message()) elif action=="suppress": pass else: assert action in ["warn","error","suppress"] Even if handle_assertion is implemented in C, it should be exposed as assertions.handle_assertion so that it may be overriden. The generic warning base class is defined below: class Assertion: def __init__(self, *args): if len(args) == 1: self.args = args[0] else: self.args = args def format_message(self): sys.stderr.write("%s: %s" %(obj.__name__, self.args)) def get_action(self): return (self.get_user_request(self.__class__) or sys.default_warning_action) def get_user_request(self, cls): if cls.__name__ in sys.errors: return "error" elif cls.__name__ in sys.warnings: return "warn" elif cls.__name__ in sys.disabled_warnings: return "suppress" for base in cls.__bases__: rc = self.get_user_request(base) if rc: return rc else: return None The Assertion class should be implemented in Python so that it can be used as a base class. Because this code inherently implements "warning state inheritance", it would be rare to override any of the methods, but this is possible in exceptional circumstances. 
Command line

By default the special variables have the following contents:

  sys.warnings = []
  sys.errors = []
  sys.suppress = []
  sys.default_warning_action = "warn"

These variables may be changed from the command line. The command line arguments are interpreted as described below:

  -w XXX    => sys.warnings.append("XXX")
  -e XXX    => sys.errors.append("XXX")
  -no-w XXX => sys.suppress.append("XXX")
  -wall     => sys.default_warning_action = "warn"
  -eall     => sys.default_warning_action = "error"
  -no-wall  => sys.default_warning_action = "suppress"

As per the code above, errors take precedence over warnings and warnings over suppressions unless a particular assertion class specifies otherwise.

Built-in warning objects:

  class exception_in_del(Assertion):
      "An exception was ignored in an __del__ method"

  class deprecation(Assertion):
      "This feature may be removed in a future version of Python."

  class dubious(Assertion):
      "There is a common error associated with this feature."

These class definitions are part of the "assertions" module. They should only ever be used when there exists a way for the programmer to accomplish the same thing without triggering the warning. For instance the way to suppress the deletion exception is to trap exceptions in __del__ methods with a try/except block.

From tim_one@email.msn.com Mon Nov 6 06:34:38 2000 From: tim_one@email.msn.com (Tim Peters) Date: Mon, 6 Nov 2000 01:34:38 -0500 Subject: [Python-Dev] Revamping Python's Numeric Model In-Reply-To: <200011060119.UAA03952@cj20424-a.reston1.va.home.com> Message-ID: [Guido] > So I go offline for a couple of days to entertain guests and have my > body kicked around in a dance class, and I have 25 messages discussing > Python's numeric model waiting for me... The scary thing is which one of those you clearly enjoyed more . > I was hoping that Tim would chime in, but he's apparently taken the > weekend off -- very much out of character.
:-) Exactly in character, alas: I was obsessed with my new cable modem connection. I had years of stuff to learn about firewalls in two days -- not to mention years of pornography to download in one . Some quickies for now: + Konrad Hinsen needs to be sucked in. He's been arguing for a "unified" numeric model forever. + Everyone has IEEE-754 fp hardware today; some people actually want to use it; Moshe doesn't, but whatever revamping we get needs to allow others their delusions too. > ... > For example, Tim has conjectured that using binary floating point will > always be a problem for the "unwashed masses" -- the only thing they > might understand is decimal floating point, At first glance, yes. Complaints traced to the "binary" part of "binary fp" vastly outnumber complaints due to the "fp" part *and* integer division combined, on both Python-Help and the Tutor list. So if we want to know what actually trips up newbies, they've been telling us for years. Decimal fp would silence most of those complaints; but rationals would silence them too (provided they're *displayed* in rounded decimal fp notation (restart "str" vs "repr" rant, and that the interactive prompt uses the wrong one, and ditto str(container))), plus a few more (non-obvious example: (1/49)*49 does not equal 1 in either decimal or IEEE-754 binary double fp, but does equal 1 using rationals). Note that Mike Cowlishaw (REXX's dad) has been working on a scheme to merge REXX's decimal fp with IEEE-854 (the decimal variant of IEEE-754): http://www2.hursley.ibm.com/decimal/ I'll talk to Jim Fulton about that, since Cowlishaw is pushing a BCD variant and Jim was wondering about that (around the change of the year, for use-- I presume --in Zope). Note also IBM's enhanced BigDecimal class for Java: http://www2.hursley.ibm.com/decimalj/ > ... 
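Tim's (1/49)*49 example is easy to verify today, both with binary doubles and with exact rationals (using the fractions module of later Pythons, purely for illustration):

```python
from fractions import Fraction

# Binary doubles: 1/49 is not exactly representable, and the error
# survives multiplying back by 49.
print((1 / 49) * 49 == 1)         # -> False

# Rationals are exact by construction, so the round trip is perfect.
print(Fraction(1, 49) * 49 == 1)  # -> True

# Displayed in rounded decimal notation, the rational looks harmless:
print(float(Fraction(1, 49)))     # shortest decimal that round-trips
```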
> Another issue that I might bring up is that there are no inexact > numbers (each floating point number is perfectly exact and rational) > -- there are only inexact operations. I'm not sure what to do with > this though. IEEE-754 defines exactly what to do with this, for binary floats (and your hardware has an "inexact result" flag set or not after every fp operation). Conversion of the string "1.0" to float must not set it; conversion of "0.1" must set it; and similarly for + - * / sqrt: "inexact result" gets set whenever the infinitely precise result differs from the computed result. So inexactness there is neither a property of types nor of numbers, but of specific computations. Extreme example: x = 1./3. # inexact y = x-x # exact result (from inexact inputs!) I know that this version (operation-based) of inexactness can be useful. I see almost no earthly use for calling every number of a given type inexact. Tagging individual numbers with an exact/inexact bit is an extremely crude form of interval arithmetic (where the intervals are single points or infinite). > ... > I'll leave it to Tim to explain why inexact results may not be close > to the truth. > Tim may also break a lance for IEEE 754. Somebody else on c.l.py offered to write a 754 PEP; delighted to let them have it. if-you-ever-approximate-people-will-get-confused- but-if-you-don't-they'll-run-out-of-time-or-memory-ly y'rs - tim From paul@prescod.net Mon Nov 6 07:17:52 2000 From: paul@prescod.net (Paul Prescod) Date: Sun, 05 Nov 2000 23:17:52 -0800 Subject: [Python-Dev] Warning framework References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> <3A03E340.38F78FA2@lemburg.com> <200011060335.WAA04452@cj20424-a.reston1.va.home.com> Message-ID: <3A065B20.BBD1C1E3@prescod.net> It's just coincidence that I was working on warnings at the same time you were. Our proposals seem to have almost no overlap. I think mine does less, but is also much simpler. 
I'm always nervous about over-engineering rather than incremental development. -- Paul Prescod Simplicity does not precede complexity, but follows it. - http://www.cs.yale.edu/homes/perlis-alan/quotes.html From pf@artcom-gmbh.de Mon Nov 6 09:05:12 2000 From: pf@artcom-gmbh.de (Peter Funk) Date: Mon, 6 Nov 2000 10:05:12 +0100 (MET) Subject: [Python-Dev] Integer division transition In-Reply-To: <200011060234.VAA04271@cj20424-a.reston1.va.home.com> from Guido van Rossum at "Nov 5, 2000 9:34:40 pm" Message-ID: Hi, [Guido]: > David Scherer proposed to spell this pragma as a "magical import" > (http://www.python.org/pipermail/idle-dev/2000-April/000138.html). Huh? AFAIR David Scherer and Bruce Sherwood used the 'global'-statement at module level as a backward compatible method to introduce module level pragmas. (http://www.python.org/pipermail/idle-dev/2000-April/000140.html) I still like David Scherers proposal very much. [David]: > I actually implemented 1/2==0.5 in Python 1.5.2, complete with a > module-level backward compatibility flag. The flag made an unusual use of > the "global" statement, which appears to be accepted at toplevel by 1.5.2 > without any effect. Therefore a statement like "global olddivision" will be > silently ignored by 1.5.2 and earlier, and will result in the old behavior > under my patch. "global" even has the right sort of sound for module-level > options :) > > An outline of what I did: > > 1. Add new opcode BINARY_FRACTION to opcode.h and dis.py > 2. Add new flag "int c_olddivision" to struct compiling in compile.c > 3. Set c_olddivision to base->c_olddivision or 0 in jcompile > 4. Check for "global olddivision" outside a function definition in > com_global_stmt, and set c_olddivision=1 > 5. Check c_olddivision in com_term, and generate either BINARY_DIVISION or > BINARY_FRACTION > 6. Add PyNumber_Fraction to abstract.h, and define it in abstract.c to > explicitly check for a pair of integers and do float division > 7. 
Add a BINARY_FRACTION case to ceval.c, which calls PyNumber_Fraction > instead of PyNumber_Divide. BTW: I think the "symbol" '//' is incredibly ugly, and starting with IBM's JCL years ago, all languages I encountered that used this symbol for something did suck in some way or another. I would appreciate it very much if adding a symbol '//' to Python could be avoided altogether. '//' will look like a comment delimiter to most people today. Using a new keyword like 'div' in the tradition of languages like Modula-2 looks far more attractive to me. Regards, Peter -- Peter Funk, Oldenburger Str.86, D-27777 Ganderkesee, Germany, Fax:+49 4222950260 office: +49 421 20419-0 (ArtCom GmbH, Grazer Str.8, D-28359 Bremen) From mal@lemburg.com Mon Nov 6 09:14:12 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Mon, 06 Nov 2000 10:14:12 +0100 Subject: [Python-Dev] More Unicode support References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> <3A03E340.38F78FA2@lemburg.com> <200011060340.WAA04479@cj20424-a.reston1.va.home.com> Message-ID: <3A067664.22D09D03@lemburg.com> Guido van Rossum wrote: > > [me] > > > - Internationalization. Barry knows what he wants here; I bet Martin > > > von Loewis and Marc-Andre Lemburg have ideas too. > > [MAL] > > We'd need a few more codecs, support for the Unicode compression, > > normalization and collation algorithms. > > Hm... There's also the problem that there's no easy way to do Unicode I/O.
I'd like to have a way to turn a particular file into a Unicode > output device (where the actual encoding might be UTF-8 or UTF-16 or a > local encoding), which should mean that writing Unicode objects to the > file should "do the right thing" (in particular should not try to > coerce it to an 8-bit string using the default encoding first, like > print and str() currently do) and that writing 8-bit string objects to > it should first convert them to Unicode using the default encoding > (meaning that at least ASCII strings can be written to a Unicode file > without having to specify a conversion). I support that reading from > a "Unicode file" should always return a Unicode string object (even if > the actual characters read all happen to fall in the ASCII range). > > This requires some serious changes to the current I/O mechanisms; in > particular str() needs to be fixed, or perhaps a ustr() needs to be > added that it used in certain cases. Tricky, tricky! It's not all that tricky since you can write a StreamRecoder subclass which implements this. AFAIR, I posted such an implementation on i18n-sig. BTW, one of my patches on SF adds unistr(). Could be that it's time to apply it :-) -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From Moshe Zadka Mon Nov 6 09:45:21 2000 From: Moshe Zadka (Moshe Zadka) Date: Mon, 6 Nov 2000 11:45:21 +0200 (IST) Subject: [Python-Dev] Warning framework In-Reply-To: <200011060335.WAA04452@cj20424-a.reston1.va.home.com> Message-ID: On Sun, 5 Nov 2000, Guido van Rossum wrote: > - The file must be None or a shell matching pattern, e.g. "*foo"; > the ".py" suffix is optional; a partial pathname may be given too. > So "foo/bar" matches "/usr/lib/python2.0/foo/bar.py" but also > "/home/guido/libp/tralala/foo/bar.py". If the file is None or "*" > the filter applies regardless of the file. 
How about "file" must be None or a callable, and if it's a callable, it will be called to check whether to print? If I'll want fnmatch, I know where to find it. > - The message must be a None or a string. If it is None, the filter > applies to all messages. The message string may end in "*" to > match all messages with the given text (up to the "*"). Same remark. If I want re, I know where to find it. -- Moshe Zadka -- 95855124 http://advogato.org/person/moshez From Moshe Zadka Mon Nov 6 10:42:34 2000 From: Moshe Zadka (Moshe Zadka) Date: Mon, 6 Nov 2000 12:42:34 +0200 (IST) Subject: [Python-Dev] Revamping Python's Numeric Model In-Reply-To: Message-ID: [Tim Peters] > Some quickies for now: > > + Konrad Hinsen needs to be sucked in. He's been arguing for a "unified" > numeric model forever. OK, I hope this e-mail address reaches him: I got it off his webpage. [Tim Peters] > + Everyone has IEEE-754 fp hardware today; some people actually want to use > it; Moshe doesn't, but whatever revamping we get needs to allow others their > delusions too. My proposal has nothing *against* 754 either. For example, "all inexact operations are done compliant to 754" is a perfectly acceptable addition. [Tim Peters, about rationals] > (provided they're *displayed* in rounded decimal fp notation (restart > "str" vs "repr" rant, and that the interactive prompt uses the wrong one, > and ditto str(container))), Tim, that's the other PEP [Tim Peters] > IEEE-754 defines exactly what to do with this, for binary floats (and your > hardware has an "inexact result" flag set or not after every fp operation). Cool! I didn't know about that. [Tim Peters] > Conversion of the string "1.0" to float must not set it; conversion of "0.1" > must set it; and similarly for + - * / sqrt: "inexact result" gets set > whenever the infinitely precise result differs from the computed result. Is there some API for it in C? If not, we might as well assume that any floating point number is inexact. 
[Tim Peters] > So > inexactness there is neither a property of types nor of numbers, but of > specific computations. Extreme example: > > x = 1./3. # inexact > y = x-x # exact result (from inexact inputs!) > > I know that this version (operation-based) of inexactness can be useful. I > see almost no earthly use for calling every number of a given type inexact. > Tagging individual numbers with an exact/inexact bit is an extremely crude > form of interval arithmetic (where the intervals are single points or > infinite). The tagging is whether you *want* exact operations or inexact operations. I.e.: 1.0/3 --> inexact 1/3 --> exact. Well, this is the classical definition of "type": what do the operations mean? Which is why I need the inexact() function in my PEP. -- Moshe Zadka -- 95855124 http://advogato.org/person/moshez From mal@lemburg.com Mon Nov 6 12:13:06 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Mon, 06 Nov 2000 13:13:06 +0100 Subject: [Python-Dev] PEP 224 (Attribute Docstrings) References: <3A03DBD2.7E024978@lemburg.com> <200011060125.UAA03986@cj20424-a.reston1.va.home.com> Message-ID: <3A06A052.1990A512@lemburg.com> Guido van Rossum wrote: > > Marc-Andre: > > I can take over the coercion PEP: I've been working > > on this before (see the proposal on my Python Pages). > > Thanks, excellent (although I haven't seen your proposal yet). > > > I would also like to know whether the PEP-0224 will be considered > > for 2.1 if I update the patch to make it a little more robust > > w/r to the problems mentioned in that PEP -- I'd really like > > to see this in Python soon, since it makes documenting Python > > programs so much easier. > > I "kinda" like the idea of having attribute docstrings (meaning it's > not of great importance to me) but there are two things I don't like > in your current proposal: > > 1. 
The syntax you propose is too ambiguous: as you say, stand-alone > string literals are used for other purposes and could suddenly > become attribute docstrings. This can be fixed by introducing some extra checks in the compiler to reset the "doc attribute" flag in the compiler struct. > 2. I don't like the access method either (__doc___). Any other name will do. It will only have to match these criteria: * must start with two underscores (to match __doc__) * must be extractable using some form of inspection (e.g. by using a naming convention which includes some fixed name part) * must be compatible with class inheritance (i.e. should be stored as an attribute) > > Note that I won't get around to do much work on these before > > January... way too busy at the moment :-/ > That's a problem -- we really want to have the PEPs ready for review > by mid December. This will also be a problem for the coercion PEP -- > if you think you won't be able to work on it before then, I'd prefer > to find another (co-)author. I think I'll need a co-author for the PEPs -- I have a high-priority project running which has a deadline in mid-December too. Much food-for-thought is already available (see PEP 224 and the coercion proposal on my Python Pages) and I should find time for some review. -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From mal@lemburg.com Mon Nov 6 12:25:26 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Mon, 06 Nov 2000 13:25:26 +0100 Subject: [Python-Dev] Re: Class/type dichotomy thoughts References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> <3A03E340.38F78FA2@lemburg.com> <200011060244.VAA04289@cj20424-a.reston1.va.home.com> Message-ID: <3A06A336.5FD1EBC7@lemburg.com> Guido van Rossum wrote: > > [me] > > > - Class/type dichotomy???
> > [MAL] > > One thing that would probably be implementable is a way to > > maintain "instance" dictionaries for types (which are created > > on-demand whenever an assignment is made). > > > > This would enable > > extending types with new methods and attributes. "Subclassing" > > could then be emulated by using new constructors which add the > > new or changed methods to each created type instance, e.g. > > > > class myclose: > > > > def __init__(self, object, basemethod): > > self.object = object > > self.basemethod = basemethod > > > > def __call__(self): > > print 'Closed file %s' % self.object > > self.basemethod() > > > > def myfile(filename): > > f = open(filename) > > # add/override attributes > > f.newattribute = 1 > > # add/override methods > > f.close = myclose(f, f.close) > > return f > > > > Types would have to be made aware of this possibility. Python > > could provide some helping APIs to make life easier for the > > programmer. > > But this would require an extra pointer field for *all* built-in > types. That would seriously impact the space requirements for ints > and floats! True. > As long as we're proposing hacks like this that don't allow smooth > subclassing yet but let you get at least some of the desired effects, > I'd rather propose to introduce some kind of metaclass that will allow > you to use a class statement to define this. Thinking aloud: > > import types > filemetaclass = metaclass(types.FileType) > > class myfile(filemetaclass): > > def __init__(self, filename): > filemetaclass.__init__(filename) > self.newattribute = 1 > > def close(self): > myclose(self) > filemetaclass.close(self) > > I'm not quite sure what kind of object "filemetaclass" here should be > or what exactly "metaclass()" should do, but it could create a new > type that has the lay-out of an existing file object, with an instance > dictionary (for newattribute) tacked on the end. Later maybe (I'm > going to brainstorm with Jim Fulton about types and classes).
I think the problem we currently have with subclassing types is strongly related to the fact that all Py_Check() macros only work on an address compare basis. If we could find a way to change this to some other kind of (very) fast lookup scheme we'd open a door which could lead to making subclassing of types a whole lot easier. Perhaps a simple indirection could help... instead of obj->ob_type == PyInteger_Type we'd write obj->ob_type->base_type == PyInteger_Type_ID. A subclass could then identify itself as an integer subclass by setting the base_type id to PyInteger_Type_ID. It would of course have to publish the same internal structure in order to remain compatible with the PyInteger_*() API, but there would be a possibility to extend the object struct and slots could also be replaced with new ones. -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From jim@interet.com Mon Nov 6 13:11:25 2000 From: jim@interet.com (James C. Ahlstrom) Date: Mon, 06 Nov 2000 08:11:25 -0500 Subject: [Python-Dev] zipfile.py and InfoZIP files References: <3A030D21.413B8C8A@lemburg.com> Message-ID: <3A06ADFD.5C735BBA@interet.com> "M.-A. Lemburg" wrote: I am happy to modify zipfile.py as desired by this group. > I'm having trouble opening ZIP files created using InfoZIP's > zip utility (which uses zlib) with zipfile.py: > > >>> x = z.read('README') > Traceback (innermost last): > File "", line 1, in ? > File "/home/lemburg/lib/zipfile.py", line 242, in read > bytes = dc.decompress(bytes) > zlib.error: Error -3 while decompressing: incomplete dynamic bit lengths tree > > Is this due to the installed zlib on my system being incompatible, > or is this a bug in zipfile.py ? I have libz version 1.1.3 and > zip version 2.2. I "borrowed" the zlib code from someone else, and it uses undocumented features. Since WinZip seems happy with the compression I left it alone.
I wouldn't be surprised if other utilities have problems, especially if they use zlib. I am not sure what to do about this. My starting view is that there is only one compression method, and if WinZip accepts it, then InfoZip is wrong. Is there perhaps a version difference in zlib in Python vs InfoZip? Could you send me an InfoZip zip file? > Also, I wonder why zipfile forces the mode flag to be 'r', > 'w' and 'a' -- wouldn't it make more sense to only add 'b', etc. > to the mode flag instead ?! The 'b' is not necessary, as it is always added by zipfile.py when the file is opened. The [rwa] each do something different, and aren't really the same as a file open. I am not sure what you mean here. > The ZipFile is also missing some kind of method which > extracts files in the ZIP archive to a file-like object. This > would be very useful for extracting large files from a ZIP > archive without having to first read in the whole file into > memory. The "read(self, name)" method returns the bytes, which you can write to a file if you want. What new method would you like? JimA From gvwilson@nevex.com Mon Nov 6 13:39:37 2000 From: gvwilson@nevex.com (Greg Wilson) Date: Mon, 6 Nov 2000 08:39:37 -0500 (EST) Subject: [Python-Dev] Stackless pages In-Reply-To: <14854.11173.601039.883893@bitdiddle.concentric.net> Message-ID: > Jeremy wrote: > I tend to agree with you, Guido. I think we would do well to purposefully > omit continuations from the Python language. There seems to be little need > for a facility to implement arbitrary control structures in Python. If > Python supports coroutines and microthreads, I am not sure what else would be > needed. I just finished reading Thomas and Hunt's "Programming Ruby" (the first book in English on the language). It's pretty clear that their favorite language feature in Ruby is the block, which is any group of statements inside either braces or do...end.
Blocks are invoked using the 'yield' construct, and can take any number of arguments (enclosed in bars):

  def fibUpTo(max)
    i1, i2 = 1, 1
    while i1 <= max
      yield i1              # 'call' the block
      i1, i2 = i2, i1 + i2
    end
  end

  fibUpTo(1000) { |f| print f, " " }

Most built-in types have iterators that understand blocks:

  [1, 3, 5].each { |i| puts i }   # prints 1, 3, and 5 on separate lines

Programmers can use blocks and 'yield' to create new control structures, subject to the limitation that a statement can only be given one block (which means that a multi-way interleaving loop can't be built out of blocks). It would be interesting to see how many of their examples can be done (easily) with stackless... Thanks, Greg From mal@lemburg.com Mon Nov 6 14:00:33 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Mon, 06 Nov 2000 15:00:33 +0100 Subject: [Python-Dev] zipfile.py and InfoZIP files References: <3A030D21.413B8C8A@lemburg.com> <3A06ADFD.5C735BBA@interet.com> Message-ID: <3A06B981.B3D73091@lemburg.com> "James C. Ahlstrom" wrote: > > "M.-A. Lemburg" wrote: > > I am happy to modify zipfile.py as desired by this group. > > > I'm having trouble opening ZIP files created using InfoZIP's > > zip utility (which uses zlib) with zipfile.py: > > > > >>> x = z.read('README') > > Traceback (innermost last): > > File "", line 1, in ? > > File "/home/lemburg/lib/zipfile.py", line 242, in read > > bytes = dc.decompress(bytes) > > zlib.error: Error -3 while decompressing: incomplete dynamic bit lengths tree > > > > Is this due to the installed zlib on my system being incompatible, > > or is this a bug in zipfile.py ? I have libz version 1.1.3 and > > zip version 2.2. > > I "borrowed" the zlib code from someone else, and it uses > undocumented features. Since WinZip seems happy with the compression > I left it alone. I wouldn't be surprised if other utilities have > problems, especially if they use zlib. I am not sure what to > do about this.
My starting view is that there is only one > compression method, and if WinZip accepts it, then InfoZip > is wrong. Is there perhaps a version difference in zlib > in Python vs InfoZip? > > Could you send me an InfoZip zip file? As it turned out it was a false alarm: Python must have picked up an old zlibmodule.so from somewhere which caused the problem. A clean rebuild made zipfile work with my InfoZIP files (all my Python Pages ZIP-archives are built using InfoZIP 2.2). Now I get this error after working in interactive Python mode with zipfile: Exception exceptions.AttributeError: "ZipFile instance has no attribute 'fp'" in ignored I guess the __del__ method has to be a bit more careful about what it expects to find... sometimes the builtins may have already been garbage collected. > > Also, I wonder why zipfile forces the mode flag to be 'r', > > 'w' and 'a' -- wouldn't it make more sense to only add 'b', etc. > > to the mode flag instead ?! > > The 'b' is not necessary, as it is always added by zipfile.py > when the file is opened. The [rwa] each do something > different, and aren't really the same as a file open. I > am not sure what you mean here. Sorry, I was only reading the code and got the impression that mode testing was done on the complete string you pass to the constructor ... I missed the line "key = mode[0]". > > The ZipFile is also missing some kind of method which > > extracts files in the ZIP archive to a file-like object. This > > would be very useful for extracting large files from a ZIP > > archive without having to first read in the whole file into > > memory. > > The "read(self, name)" method returns the bytes, which you can > write to a file if you want. What new method would you like? I would like a method .copy(self, name, output) which reads the file name from the ZIP archive and writes it directly to the file-like object output. This should copy the file in chunks of say 64kB in order to reduce memory load. 
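A sketch of the kind of .copy() method described above, written against today's zipfile module (ZipFile.open(), which returns a file-like member object, did not exist at the time); shutil.copyfileobj does the 64kB-chunk loop:

```python
import shutil
import zipfile

def copy_member(archive, name, output, chunk_size=64 * 1024):
    """Write the archive member `name` to the file-like `output`
    in fixed-size chunks, never holding the whole file in memory."""
    with archive.open(name) as member:
        shutil.copyfileobj(member, output, chunk_size)

# Usage sketch (hypothetical file names):
# with zipfile.ZipFile("archive.zip") as zf, open("README", "wb") as out:
#     copy_member(zf, "README", out)
```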
-- 
Marc-Andre Lemburg
______________________________________________________________________
Business:                                      http://www.lemburg.com/
Python Pages:                           http://www.lemburg.com/python/

From nas@arctrix.com Mon Nov 6 07:44:53 2000
From: nas@arctrix.com (Neil Schemenauer)
Date: Sun, 5 Nov 2000 23:44:53 -0800
Subject: [Python-Dev] Re: Class/type dichotomy thoughts
In-Reply-To: <3A06A336.5FD1EBC7@lemburg.com>; from mal@lemburg.com on Mon, Nov 06, 2000 at 01:25:26PM +0100
References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> <3A03E340.38F78FA2@lemburg.com> <200011060244.VAA04289@cj20424-a.reston1.va.home.com> <3A06A336.5FD1EBC7@lemburg.com>
Message-ID: <20001105234453.A8255@glacier.fnational.com>

On Mon, Nov 06, 2000 at 01:25:26PM +0100, M.-A. Lemburg wrote:
> I think the problem we currently have with subclassing types
> is strongly related to the fact that all Py_Check()
> macros only work on an address compare basis.

I don't think this is the problem, although it is closely related.
The problem is that the interpreter uses these type checks to special
case the handling of certain types. PyInstance_Check() is a big
offender.

Behavior should be based solely on the type structure. Extension
types would then be able to behave exactly like any other builtin
type. Your coercion proposal and David's rich comparisons both
remove some of this special casing based on type.

  Neil

From gmcm@hypernet.com Mon Nov 6 14:54:57 2000
From: gmcm@hypernet.com (Gordon McMillan)
Date: Mon, 6 Nov 2000 09:54:57 -0500
Subject: [Python-Dev] Stackless pages
In-Reply-To: <14854.11173.601039.883893@bitdiddle.concentric.net>
References: <200011060348.WAA04560@cj20424-a.reston1.va.home.com>
Message-ID: <3A067FF1.10498.57C83EB@localhost>

Jeremy wrote:

> ... I think we would do well to
> purposefully omit continuations from the Python language. There seems
> to be little need for a facility to implement arbitrary control
> structures in Python. If Python supports coroutines and microthreads,
> I am not sure what else would be needed.

I'm not sure what you mean here. So far, no one has asked for
anything to get added to the *language* (although generators and
coroutines could be made nicer with a bit of keyword support, it's
not necessary).

> It would be very helpful if the PEPs on Stackless could address this
> issue. One way to address it is to ask these questions: What new
> control structures do users want in Python? How best can they be
> implemented? Are continuations necessary to implement them or are
> there other options?

If by "control structure" you mean for / while / do type things,
well, Python's not a functional language, and stackless /
continuations won't grow you any new ones. OTOH, if threads added new
"control structures", then yes, continuations add new ones (although
from a language viewpoint, they don't look like anything new).

> The sort of implementation complexity that I worry about with
> Stackless is, e.g. clean interaction with the C stack. If a Python C
> API call is made that pushes a C stack frame, e.g. PyObject_Compare,
> then a continuation stored before that call can no longer be invoked.

You mean when C code (called by Python) ends up calling eval_code2,
and the Python code so invoked wants to use a continuation which
"resides" in some other invocation of eval_code2? Yes, Christian
gives you an error at that point (I believe; I've never stumbled on
this personally).

> The problem is that continuations break the notion that a C API call
> will always return an answer; they create a situation in which the C
> call that is made should never return, because control is transferred
> to the continuation.

Well, sys.exit breaks that notion too :-). If an API says that a
return is required, it's a programmer error not to return something.

I think it's universally agreed that there should be a "coroutine"
API. Perhaps it is best done from scratch, or perhaps it is to the
continuation module as threading is to thread. But I think Greg Ewing
is right - it doesn't make much difference as far as stackless is
concerned.

> I assume Stackless raises an error in this case,
> but this seems pretty messy: How do we write a language spec that
> explains when an error will occur without appealing to the
> language implementation?

I'm not understanding. In practical terms, I toasted my machine more
times and more thoroughly playing with the thread module than I have
with the continuation module. In that respect, continuations are a
good deal safer - they don't have any effect on the state of your OS.

If you're playing with the raw "go-to" features of the continuation
module, it's very easy to screw up. That's simply because code has to
"balance", somehow. Call and return are the universally accepted way
of doing that in the procedural community. But they're just a
protocol based on go-to. The go-to is still there, even if it's in
the chip's microcode.

Christian's continuation module exposes a bunch of primitives. Most
of the material on the pages I put up is concerned with how to mix
and match those to get useful results (one example is coded 6
different ways). No one thinks this is a good "end state", just like
the thread module was a lousy "end state". But we can't steal Java's
API here, and we can't steal from functional languages either, 'cause
Python just ain't functional. Everybody wants to see safer ways of
using this stuff, but there's still a lot of experimenting to do. And
I, for one, don't want to slam the door after one or two safe ways
have been found. (And personally, I've found most of the attempts at,
e.g. writing a "coroutine" class, less comprehensible than using the
primitives directly.)
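[Editorial aside: the "multi-way interleaving loop" mentioned earlier in
this thread can be illustrated with the generators that eventually
landed in Python 2.2 -- a sketch of the kind of control structure under
discussion, not Stackless API code:]

```python
def counter(name, n):
    # Each generator behaves like a tiny thread: it runs until it yields.
    for i in range(n):
        yield "%s:%d" % (name, i)

def round_robin(*gens):
    """Interleave several generators fairly -- the multi-way
    interleaving loop that a single-block construct cannot express."""
    queue = list(gens)
    results = []
    while queue:
        gen = queue.pop(0)
        try:
            results.append(next(gen))
            queue.append(gen)       # still alive: reschedule at the back
        except StopIteration:
            pass                    # exhausted: drop it
    return results

print(round_robin(counter("a", 2), counter("b", 3)))
# ['a:0', 'b:0', 'a:1', 'b:1', 'b:2']
```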
- Gordon

From tismer@tismer.com Mon Nov 6 14:23:48 2000
From: tismer@tismer.com (Christian Tismer)
Date: Mon, 06 Nov 2000 16:23:48 +0200
Subject: [Stackless] Re: [Python-Dev] Stackless pages
References: <200011060450.RAA00019@s454.cosc.canterbury.ac.nz>
Message-ID: <3A06BEF4.95B773BD@tismer.com>

Greg Ewing wrote:
> 
> Jeremy Hylton :
> 
> > Are continuations necessary to implement them or are
> > there other options?
> 
> I think you'll find that any implementation of microthreads
> or coroutines or whatever you want to call them, that
> doesn't rely on playing nonportable tricks with the C
> stack, will be just as mindbending as Stackless.
> 
> > The problem is that continuations break the notion that a C API call
> > will always return an answer;
> 
> So do threads or coroutines. As soon as you have multiple
> threads of control, you have the chance that one of them
> will switch to another and never get back.

This is correct! The only "special" thing with my continuation
implementation is that frames are armed to be able to accept a return
from a callee multiple times. This little piece on top of frames
turns them into full continuations. Without doing this, the API
doesn't change, just the implementation gets a little simpler.

What remains is still what Greg says: The guarantee of stack-like
frame execution no longer holds, and every stackless C extension must
either provide a way to handle calls in a tail-recursive manner, *or*
it must enforce stack-like execution, like it is done today.

The *only* thing out of (coroutines, generators, uthreads) which can
be built without breaking this assumption are the simple ICON-style
generators. But those can be implemented without becoming stackless
at all. If you want full coroutines, or uthreads, the non-trivial
change of execution-order which Stackless permits *is* necessary. The
step from there to supporting full continuations is tiny, can be done
easily or left completely. The order of complexity of the Python
implementation is not increased by continuations. In fact, supporting
uthreads and coroutines and not only stack-based generators poses the
real problem. That's one of my reasons to support continuations:
Making Python completely coroutine aware, without tricking the C
stack, is 90 percent of the problem. But after walking that far,
there is no reason to leave the other 10 percent alone.

...

Greg:
> I can't see any way out of this. Continuations/coroutines/
> microthreads are all basically the same thing underneath, and
> they imply an execution model that just doesn't fit well
> with C.

Yes, you understood it completely.

> Maybe we need to reimplement Python in Scheme, and then
> feed it through a good Scheme compiler. SPython, anyone?

I believe the current implementation is quite pleasant for a lot of
people, while it isn't perfectly stackless. A lot of internal
operations could be made stackless by introduction of a couple more
opcodes, turning these operations into bytecode. This would probably
not cost much speed, since only those code branches are a problem
which cause an interpreter recursion today, anyway.

ciao - chris

-- 
Christian Tismer             :^)
Mission Impossible 5oftware  :     Have a break! Take a ride on Python's
Kaunstr. 26                  :    *Starship* http://starship.python.net
14163 Berlin                 :     PGP key -> http://wwwkeys.pgp.net
PGP Fingerprint       E182 71C7 1A9D 66E9 9D15 D3CC D4D7 93E2 1FAE F6DF
     where do you want to jump today?   http://www.stackless.com

From guido@python.org Mon Nov 6 15:41:02 2000
From: guido@python.org (Guido van Rossum)
Date: Mon, 06 Nov 2000 10:41:02 -0500
Subject: [Python-Dev] Three argument slices.
In-Reply-To: Your message of "Sat, 04 Nov 2000 16:21:48 GMT."
             <3a04376a.28016074@smtp.worldonline.dk>
References: <3a04376a.28016074@smtp.worldonline.dk>
Message-ID: <200011061541.KAA07351@cj20424-a.reston1.va.home.com>

> While updating the difference page in the Jython documentation, I came
> across this:
> 
> - JPython sequences support three argument slices. i.e.
>   range(3)[::-1] == [2,1,0].
>   CPython should be fixed.
> 
> Is this actually true? Should (and will CPython) change in this respect?

Good question. I haven't pronounced on Michael's patch
http://sourceforge.net/patch/?func=detailpatch&patch_id=100998&group_id=5470
because I don't know whether this is really a good idea. I don't know
that I ever *will* pronounce unless someone points out how useful it
is for some practical application.

--Guido van Rossum (home page: http://www.python.org/~guido/)

From barry@wooz.org Mon Nov 6 15:50:35 2000
From: barry@wooz.org (Barry A. Warsaw)
Date: Mon, 6 Nov 2000 10:50:35 -0500 (EST)
Subject: [Python-Dev] Re: Revamping Python's Numeric Model
References: <200011042115.WAA01162@loewis.home.cs.tu-berlin.de> <200011050828.JAA00672@loewis.home.cs.tu-berlin.de>
Message-ID: <14854.54091.14275.140381@anthem.concentric.net>

>>>>> "MZ" == Moshe Zadka writes:

  MZ> (which was meant for Barry to assign me a PEP number
  MZ> primarily...)

Looks like Guido beat me to it this morning and assigned 228 to your
PEP. It really pisses me off though, him stepping on my toes there,
because clearly the PEP should have been assigned PEP .3/1.e-3 :)

-Barry

From barry@wooz.org Mon Nov 6 15:59:40 2000
From: barry@wooz.org (Barry A. Warsaw)
Date: Mon, 6 Nov 2000 10:59:40 -0500 (EST)
Subject: [Python-Dev] Re: Revamping Python's Numeric Model
References: <200011042115.WAA01162@loewis.home.cs.tu-berlin.de> <200011050828.JAA00672@loewis.home.cs.tu-berlin.de> <200011051024.LAA01015@loewis.home.cs.tu-berlin.de> <200011051121.MAA01234@loewis.home.cs.tu-berlin.de> <20001105034044.M10135@lyra.org>
Message-ID: <14854.54636.49888.781754@anthem.concentric.net>

>>>>> "GS" == Greg Stein writes:

  GS> Allocate yourself a PEP number and publish the darn thing. If
  GS> you don't feel comfortable grabbing a PEP number, then just
  GS> post it to the list or something.

A general rule is that meta-peps usually get assigned numbers < 100
and standards track peps just get the next free number above 200.
Pretty simple really. However, I'm trying to encourage two things:
only serious proposals get pep'd, and peps ought to have a consistent
style. I want to avoid approving peps for incomplete proposals (such
as 207, not to purposely pick on David). There are enough good peps
that you should largely be able to figure out the style to use. I can
clean up any minor issues after the fact.

I think the way Moshe did this one was fine. He wrote it up and
posted it, and got assigned a pep number fairly quickly.

-Barry

From rrdn60@email.sps.mot.com Mon Nov 6 16:04:41 2000
From: rrdn60@email.sps.mot.com (Norman Shelley (rrdn60))
Date: Mon, 06 Nov 2000 09:04:41 -0700
Subject: [Stackless] Re: [Python-Dev] Stackless pages
References: <14854.11173.601039.883893@bitdiddle.concentric.net>
Message-ID: <3A06D699.D3AFAC0@email.sps.mot.com>

Jeremy Hylton wrote:

> [Changed discussion list from general python-list to specific
> stackless.]
> 
> >>>>> "GvR" == Guido van Rossum writes:
> 
> >> I have put up 6 pages of information about stackless at
> >> http://www.mcmillan-inc.com/stackless.html
> 
> GvR> Gordon, thanks for doing this. I still have a review of
> GvR> Stackless on my TODO list.
> GvR> It takes a serious chunk of my time
> GvR> to do it justice, and this continues to be a problem, but the
> GvR> existence of your overview certainly helps. I still think
> GvR> that the current Stackless implementation is too complex, and
> GvR> that continuations aren't worth the insanity they seem to
> GvR> require (or cause :-), but that microthreads and coroutines
> GvR> *are* worth having and that something not completely unlike
> GvR> Stackless will be one day the way to get there...
> 
> I tend to agree with you, Guido. I think we would do well to
> purposefully omit continuations from the Python language. There seems
> to be little need for a facility to implement arbitrary control
> structures in Python. If Python supports coroutines and microthreads,
> I am not sure what else would be needed.
> 
> It would be very helpful if the PEPs on Stackless could address this
> issue. One way to address it is to ask these questions: What new
> control structures do users want in Python?

This kind of question/thought disturbs me. It presumes we can
determine a priori all the ways one might wish to use the features
that Stackless provides. Closing off or bounding innovation just
because we can't answer the question as to how it will be used will
just cause future forks in Python or promote non-Python choices.

> How best can they be
> implemented? Are continuations necessary to implement them or are
> there other options?
> 
> The sort of implementation complexity that I worry about with
> Stackless is, e.g. clean interaction with the C stack. If a Python C
> API call is made that pushes a C stack frame, e.g. PyObject_Compare,
> then a continuation stored before that call can no longer be invoked.
> The problem is that continuations break the notion that a C API call
> will always return an answer; they create a situation in which the C
> call that is made should never return, because control is transferred
> to the continuation. I assume Stackless raises an error in this case,
> but this seems pretty messy: How do we write a language spec that
> explains when an error will occur without appealing to the
> language implementation?
> 
> Jeremy
> 
> _______________________________________________
> Stackless mailing list
> Stackless@starship.python.net
> http://starship.python.net/mailman/listinfo/stackless

From barry@wooz.org Mon Nov 6 16:05:51 2000
From: barry@wooz.org (Barry A. Warsaw)
Date: Mon, 6 Nov 2000 11:05:51 -0500 (EST)
Subject: [Python-Dev] Re: Revamping Python's Numeric Model
References: <200011051121.MAA01234@loewis.home.cs.tu-berlin.de> <14853.36547.748584.450976@cj42289-a.reston1.va.home.com> <14853.52668.117844.28459@bitdiddle.concentric.net> <14854.7221.546916.848838@cj42289-a.reston1.va.home.com>
Message-ID: <14854.55007.188953.545215@anthem.concentric.net>

BTW, Moshe posted his PEP on Saturday, and I don't think it's at all
unreasonable that he'd have to wait until Monday to get a PEP number.
I reserve the right to apply Warsaw's 2nd Law to these cases. :)

-Barry

Warsaw's Second Law: Unbending Law of Commit Scheduling

    Never change anything after 3pm on a Friday.

From hinsen@cnrs-orleans.fr Mon Nov 6 16:12:39 2000
From: hinsen@cnrs-orleans.fr (Konrad Hinsen)
Date: Mon, 6 Nov 2000 17:12:39 +0100
Subject: [Python-Dev] Revamping Python's Numeric Model
In-Reply-To: (message from Moshe Zadka on Mon, 6 Nov 2000 12:42:34 +0200 (IST))
References: 
Message-ID: <200011061612.RAA05582@chinon.cnrs-orleans.fr>

> > + Konrad Hinsen needs to be sucked in. He's been arguing for a "unified"

I'd like to know what I am about to be sucked into here ;-)

> OK, I hope this e-mail address reaches him: I got it off his webpage.

That's fine, my web page always knows how to reach me. From your mail
I get the impression that the discussion is about some PEP. If you
tell me which one I'll try to have a look at it.

Konrad.
-- 
-------------------------------------------------------------------------------
Konrad Hinsen                            | E-Mail: hinsen@cnrs-orleans.fr
Centre de Biophysique Moleculaire (CNRS) | Tel.: +33-2.38.25.56.24
Rue Charles Sadron                       | Fax: +33-2.38.63.15.17
45071 Orleans Cedex 2                    | Deutsch/Esperanto/English/
France                                   | Nederlands/Francais
-------------------------------------------------------------------------------

From barry@wooz.org Mon Nov 6 16:23:43 2000
From: barry@wooz.org (Barry A. Warsaw)
Date: Mon, 6 Nov 2000 11:23:43 -0500 (EST)
Subject: [Python-Dev] Re: [Python-checkins] CVS: python/nondist/peps pep-0228.txt,NONE,1.1 pep-0000.txt,1.44,1.45
References: <200011051655.IAA13536@slayer.i.sourceforge.net> <20001105201851.C27208@xs4all.nl> <20001105135439.V10135@lyra.org> <20001105231235.X12776@xs4all.nl>
Message-ID: <14854.56079.737563.729305@anthem.concentric.net>

>>>>> "TW" == Thomas Wouters writes:

  TW> Well, syncmail was written to manage the Python CVS tree on a
  TW> slow Sun (I believe)

Correct.

  TW> and did an rsync-over-ssh to another machine as well. That can
  TW> definitely take long ;) If we just remove the fork, the rest
  TW> of syncmail might just work, even with new files. In the mean
  TW> time, I'll check in my change. It might be the best thing to
  TW> do anyway, since it shouldn't interfere unless the file isn't
  TW> there.

Your patch looks good Thomas, thanks for tracking this down. IIRC,
without the fork, the checkin would actually deadlock trying to get
the diff.

-Barry

From guido@python.org Mon Nov 6 16:28:50 2000
From: guido@python.org (Guido van Rossum)
Date: Mon, 06 Nov 2000 11:28:50 -0500
Subject: [Python-Dev] Revamping Python's Numeric Model
In-Reply-To: Your message of "Mon, 06 Nov 2000 15:52:55 +0200."
References: <200011060119.UAA03952@cj20424-a.reston1.va.home.com>
Message-ID: <200011061628.LAA07574@cj20424-a.reston1.va.home.com>

[Moshe]
> The reason the PEP was written now was because you started making sounds
> of causing 1/2 to be 0.5, which is dangerous. I tried to outline the PEP
> to show a less dangerous (and less innovative) way of getting similar
> usability.

1/2 yielding 0.5 is innovative? Give me a break. Pascal did this.
Algol-60 did this. Fortran does this. And rational numbers are less
innovative?

And what's dangerous about 1/2 yielding 0.5?

> My PEP does not yet deal with either written or inputted representation.

These affect the numeric model though (see Tim's posts) so should be
considered.

> [GvR]
> > (2) Rational numbers have the unpleasant property of growing
> > unboundedly during many innocent calculations, thereby using up
> > exorbitant amounts of memory and slowing down the calculation --
> > often mysteriously, because what is displayed is truncated.
> 
> Those calculations, if performed with floating points, would often build
> up inaccuracy. I prefer to go slow+memory hog (which the user feels),
> than to be wildly inaccurate.

That's your choice. Wildly inaccurate is sometimes unavoidable. This
is clearly an issue open for debate, but note that I speak from
experience: ABC used rationals unless you forced it to use reals, and
the rationals *did* cause real users to complain about how slow ABC
was.

> [GvR]
> > Another issue that I might bring up is that there are no inexact
> > numbers (each floating point number is perfectly exact and rational)
> 
> You'll note that my PEP does not mention floating point explicitly --
> and once again I mention that my PEP does not yet deal with number
> literals. All it allows (not requires) is for things like math.sqrt()
> to return inexact results. Naive implementations (which we might use)
> would make math.sqrt(1) an *inexact* 1, tentatively called 1.0. Of
> course, good implementations of math.sqrt() would realize that 1 has
> an exact root, but that might be too hard to do for not enough gain.

Without a clear implementation plan your PEP is incomplete, PEP
guidelines notwithstanding. Quality of implementation is important
for such a basic feature!

> [GvR]
> > If we take its meaning literally, the isreal() function should only
> > return true for numbers for which isrational() is also true:
> 
> Correct -- in my current model. If you later add things like constructive
> reals, that is no longer true: if I have a constructive Pi, it's not
> rational.
> 
> [GvR]
> > mathematically speaking, real numbers that aren't also rational don't
> > have an easy finite representation, since they are numbers like
> > sqrt(2) or pi.
> 
> But numbers don't have to have a finite (periodic really) representation
> to be representable in the computer: what about infinite continued
> fractions, for example?

So the question is, what implementation do you have in mind? You
can't just go prescribe idealistic semantics and hope it gets
implemented by magic (even Tim can't do that :-).

> [GvR]
> > Finally, the PEP doesn't talk about how the numeric model can be
> > extended
> 
> That's because it's rich enough not to need it.
> Let me explain exactly what I mean: as long as all field operations
> between Python numbers give honest to god Python numbers, then
> everything else can be solved with Python's current model of coercions,
> and can be solved well when the coercion PEP will be written.

I think this deserves very explicit mention in your PEP. An example
of how I would go about implementing my own Rational class or
extension type would help.

Also, the coercions PEP is still in need of an author. Maybe you want
to take this on too? It will help your numeric proposal if you can
write down how you think coercions should work.
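[Editorial illustration of the kind of example being asked for: a
minimal user-defined Rational that interoperates with ints purely
through the ordinary operator protocol, with no interpreter
special-casing. Names and structure here are assumptions for the
sketch, not anything from the PEP; today's stdlib ships this idea as
fractions.Fraction.]

```python
from math import gcd

class Rational:
    """Minimal exact rational; coerces ints via __add__/__radd__."""
    def __init__(self, num, den=1):
        if den == 0:
            raise ZeroDivisionError("rational with zero denominator")
        if den < 0:
            num, den = -num, -den
        g = gcd(num, den)
        self.num, self.den = num // g, den // g

    def _coerce(self, other):
        # The only coercion rule: ints become Rationals; anything
        # else is refused, letting the other operand have a try.
        if isinstance(other, Rational):
            return other
        if isinstance(other, int):
            return Rational(other)
        return None

    def __add__(self, other):
        other = self._coerce(other)
        if other is None:
            return NotImplemented
        return Rational(self.num * other.den + other.num * self.den,
                        self.den * other.den)

    __radd__ = __add__          # addition is commutative

    def __eq__(self, other):
        other = self._coerce(other)
        return (other is not None and
                (self.num, self.den) == (other.num, other.den))

    def __repr__(self):
        return "Rational(%d, %d)" % (self.num, self.den)
```

The point of the sketch: because `1 + Rational(1, 2)` falls back to
`__radd__`, the builtin types need know nothing about the new type.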
> [GvR]
> > I've heard of wild floating point representations that make
> > multiplication and division really cheap but addition a pain, rather
> > than the other way around;
> 
> Well, no problems: write wild.inexact() and wildmath.{sqrt,...} and use
> that instead of inexact() and math.{...}. How these numbers interact
> with builtin Python numbers is your responsibility -- and that's what
> the coercions are for.

So explain how to do the coercions. This will force you to be
explicit about implementation details. (Both from Python and from C.)

> Same goes for gmp: as long as you're not expecting to be able to change
> 10000000000+10000000000 to be a gmp long rather than a Python long, then
> there shouldn't be a problem.

Fair enough.

--Guido van Rossum (home page: http://www.python.org/~guido/)

From mal@lemburg.com Mon Nov 6 17:02:51 2000
From: mal@lemburg.com (M.-A. Lemburg)
Date: Mon, 06 Nov 2000 18:02:51 +0100
Subject: [Python-Dev] Re: Class/type dichotomy thoughts
References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> <3A03E340.38F78FA2@lemburg.com> <200011060244.VAA04289@cj20424-a.reston1.va.home.com> <3A06A336.5FD1EBC7@lemburg.com> <20001105234453.A8255@glacier.fnational.com>
Message-ID: <3A06E43B.D95EC267@lemburg.com>

Neil Schemenauer wrote:
> 
> On Mon, Nov 06, 2000 at 01:25:26PM +0100, M.-A. Lemburg wrote:
> > I think the problem we currently have with subclassing types
> > is strongly related to the fact that all Py_Check()
> > macros only work on an address compare basis.
> 
> I don't think this is the problem, although it is closely
> related. The problem is that the interpreter uses these type
> checks to special case the handling of certain types.
> PyInstance_Check() is a big offender.
> 
> Behavior should be based solely on the type structure. Extension
> types would then be able to behave exactly like any other builtin
> type. Your coercion proposal and David's rich comparisons both
> remove some of this special casing based on type.
Even though this could remove some of the problems, it doesn't help
much with a common use of Py_Check(): that of using fast access
macros and native Py_*() APIs once an object has been classified as
being of a certain type. This usually improves performance.

By changing the simple address compare to a type handle system, we
might be able to add some more flexibility to the system while
keeping b/w compatibility.

-- 
Marc-Andre Lemburg
______________________________________________________________________
Business:                                      http://www.lemburg.com/
Python Pages:                           http://www.lemburg.com/python/

From guido@python.org Mon Nov 6 17:17:22 2000
From: guido@python.org (Guido van Rossum)
Date: Mon, 06 Nov 2000 12:17:22 -0500
Subject: [Python-Dev] Integer division transition
In-Reply-To: Your message of "Mon, 06 Nov 2000 10:05:12 +0100."
References: 
Message-ID: <200011061717.MAA07847@cj20424-a.reston1.va.home.com>

> [Guido]:
> > David Scherer proposed to spell this pragma as a "magical import"
> > (http://www.python.org/pipermail/idle-dev/2000-April/000138.html).

[Peter Funk]
> Huh? AFAIR David Scherer and Bruce Sherwood used the 'global'-statement
> at module level as a backward compatible method to introduce module
> level pragmas.
> (http://www.python.org/pipermail/idle-dev/2000-April/000140.html)
> I still like David Scherer's proposal very much.

Oops, you're right. That URL mentions "global olddivision". The
"import floatdivision" proposal came from a more recent private mail
from Bruce Sherwood.

I maintain that neither seems the right way to spell "directive".

I think "import" is slightly better if the import is supposed to
enable a feature that is not supported by previous versions, because
the import will cause a clear failure on systems that don't have the
new feature (rather than silently giving wrong results sometimes).
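[Historical footnote for the reader: the "magical import" directive
eventually took exactly this shape as a __future__ import (PEP 236,
PEP 238), which fails loudly on interpreters that predate the feature,
just as described above:]

```python
# In Python 2.2+ this switches '/' to true division for this module;
# on older interpreters the import fails loudly, as a directive should.
# In Python 3 it is a no-op, since true division became the default.
from __future__ import division

print(1 / 2)     # 0.5 -- true division
print(1 // 2)    # 0   -- floor division, the explicit integer spelling
```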
> BTW: I think the "symbol" '//' is incredibly ugly and starting with
> IBMs JCL years ago all languages I encountered, that used this symbol
> for something, did suck in some way or another. I would appreciate
> very much, if it could be avoided altogether to add a symbol '//'
> to Python. '//' will look like a comment delimiter to most people
> today.

Good point. I've just used // as a placeholder for a new way to spell
integer division.

> Using a new keyword like 'div' in the tradition of languages like
> Modula-2 looks far more attractive to me.

That's a possibility too. It's a new keyword though, which has a much
higher threshold for acceptance than a new two-character operator
symbol. We could spell it as a built-in function: div(a, b),
(analogous to divmod(a, b)) but that's not very user-friendly either.

Keep looking...

--Guido van Rossum (home page: http://www.python.org/~guido/)

From guido@python.org Mon Nov 6 17:20:03 2000
From: guido@python.org (Guido van Rossum)
Date: Mon, 06 Nov 2000 12:20:03 -0500
Subject: [Python-Dev] More Unicode support
In-Reply-To: Your message of "Mon, 06 Nov 2000 10:14:12 +0100." <3A067664.22D09D03@lemburg.com>
References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> <3A03E340.38F78FA2@lemburg.com> <200011060340.WAA04479@cj20424-a.reston1.va.home.com> <3A067664.22D09D03@lemburg.com>
Message-ID: <200011061720.MAA07862@cj20424-a.reston1.va.home.com>

[GvR]
> > Hm... There's also the problem that there's no easy way to do Unicode
> > I/O. I'd like to have a way to turn a particular file into a Unicode
> > output device (where the actual encoding might be UTF-8 or UTF-16 or a
I'd like to have a way to turn a particular file into a Unicode > > output device (where the actual encoding might be UTF-8 or UTF-16 or a > > local encoding), which should mean that writing Unicode objects to the > > file should "do the right thing" (in particular should not try to > > coerce it to an 8-bit string using the default encoding first, like > > print and str() currently do) and that writing 8-bit string objects to > > it should first convert them to Unicode using the default encoding > > (meaning that at least ASCII strings can be written to a Unicode file > > without having to specify a conversion). I support that reading from > > a "Unicode file" should always return a Unicode string object (even if > > the actual characters read all happen to fall in the ASCII range). > > > > This requires some serious changes to the current I/O mechanisms; in > > particular str() needs to be fixed, or perhaps a ustr() needs to be > > added that it used in certain cases. Tricky, tricky! [MAL] > It's not all that tricky since you can write a StreamRecoder > subclass which implements this. AFAIR, I posted such an implementation > on i18n-sig. > > BTW, one of my patches on SF adds unistr(). Could be that it's > time to apply it :-) Adding unistr() and StreamRecoder isn't enough. The problem is that when you set sys.stdout to a StreamRecoder, the print statement doesn't do the right thing! Try it. print u"foo" will work, but print u"\u1234" will fail because print always applies the default encoding. The required changes to print are what's tricky. Whether we even need unistr() depends on the solution we find there. 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From tismer@tismer.com Mon Nov 6 16:19:21 2000
From: tismer@tismer.com (Christian Tismer)
Date: Mon, 06 Nov 2000 18:19:21 +0200
Subject: [Python-Dev] cgi.py and huge uploads problem
Message-ID: <3A06DA09.143D2D9@tismer.com>

Howdy,

there is a problem with the cgi.py implementation of Python 1.5.2 and
uploading of huge files. (found out by Joachim Rudolph, Virtual
Photonics)

Class FieldStorage of cgi.py has a couple of methods which add
accumulated lines to a self.lines array. This array fills and fills
until the whole upload is done, with the side effect of loading the
whole file into memory. The memory is freed after the whole upload is
done.

This is no problem, until a company like VPI uses cgi.py to upload
whole distributions of 100 MB and more, via Zope. :-)

Looking into cgi.py, I can't find a reason why this happens. Is this
possibly just a debugging feature which is no longer needed? While
cgi.py was modified for cosmetic reasons, I didn't find changes for
Python 2.0 on this topic.

Does it make sense to use a debug flag for this, or should the
feature vanish completely? Do you want a patch?

cheers - chris

-- 
Christian Tismer             :^)
Mission Impossible 5oftware  :     Have a break! Take a ride on Python's
Kaunstr. 26                  :    *Starship* http://starship.python.net
14163 Berlin                 :     PGP key -> http://wwwkeys.pgp.net
PGP Fingerprint       E182 71C7 1A9D 66E9 9D15 D3CC D4D7 93E2 1FAE F6DF
     where do you want to jump today?   http://www.stackless.com

From moshez@zadka.site.co.il Tue Nov 7 01:30:05 2000
From: moshez@zadka.site.co.il (Moshe Zadka)
Date: Tue, 07 Nov 2000 03:30:05 +0200
Subject: [Python-Dev] Revamping Python's Numeric Model
In-Reply-To: Message from Guido van Rossum of "Mon, 06 Nov 2000 11:28:50 EST."
<200011061628.LAA07574@cj20424-a.reston1.va.home.com> References: <200011060119.UAA03952@cj20424-a.reston1.va.home.com> <200011061628.LAA07574@cj20424-a.reston1.va.home.com> Message-ID: [Moshe] > The reason the PEP was written now was because you started making sounds > of causing 1/2 to be 0.5, which is dangerous. I tried to outline the PEP > to show a less dangerous (and less innovative) way of getting similar > usability. [Guido] > 1/2 yielding 0.5 is innovative? Give me a break. Pascal did this. > Algol-60 did this. Fortran does this. And rational numbers are less > innovative? OK, I take back the innovation claim. [Guido] > And what's dangerous about 1/2 yielding 0.5? Nothing, but then people would expect 1/3 to yield 0.333333333...., while simultaneusly expecting (1/3)*3 == 1. IOW, shoving floating points in people's faces is not user-friendly. [Moshe] > My PEP does not yet deal with either written or inputted representation. [Guido] > These affect the numeric model though (see Tim's posts) so should be > considered. I agree. That's why they are in the "unresolved issues" of the PEP. [Guido] > (2) Rational numbers have the unpleasant property of growing > unboundedly during many innocent calculations, thereby using up > exorbitant amounts of memory and slowing down the calculation -- > often mysteriously, because what is displayed is truncated. [Moshe] > Those calculations, if performed with floating points, would often build > up inaccuracy. I prefer to go slow+memory hog (which the user feels), then > to be wildly inaccurate. [Guido] > That's your choice. Wildly inaccurate is sometimes unavoidable. This > is clearly an issue open for debate, but note that I speak from > experience: ABC used rationals unless you forced it to use reals, and > the rationals *did* cause real users to complain about how slow ABC > was. Well, experiences differ. I'm on the Scheme48 (which uses rationals) mailing list, and nobody ever complained. 
Besides, if "forcing" is simply saying either inexact() (and note that inexact-exact operations are performed inexactly) or mixing in an inexact literal (assuming that's what 1.0 will be), it might be easier. I don't know ABC well enough to say how the forcing mechanism worked. [Guido] > Another issue that I might bring up is that there are no inexact > numbers (each floating point number is perfectly exact and rational) [Moshe] > You'll note that my PEP does not mention floating point explicitly -- > and once again I mention that my PEP does not yet deal with number > literals. All it allows (not requires) is for things like math.sqrt() > to return inexact results. Naive implementations (which we might use) > would make math.sqrt(1) an *inexact* 1, tentatively called 1.0. Of > course, good implementations of math.sqrt() would realize that 1 has > an exact root, but that might be too hard to do for not enough gain. [Guido] > Without a clear implementation plan your PEP is incomplete, PEP > guidelines notwithstanding. Quality of implementation is important > for such a basic feature! Well, I agree. When I sit down to complete the PEP, I'll go over all the math/cmath functions and remark how the implementation should change. I also plan to draft an implementation design. I just wanted to throw the idea out into the open, to get some feedback -- in no way can the PEP be considered complete. [Guido] > So the question is, what implementation do you have in mind? You > can't just go prescribe idealistic semantics and hope it gets > implemented by magic (even Tim can't do that :-). Well, for constructive reals, none. It's too much of a pain to implement, and too little to gain. That wouldn't preclude a later implementation, in case it turns out not to be the case. Besides, the tower wouldn't look clean without it. [Guido] > I think this deserves very explicit mention in your PEP. 
An example > of how I would go about implementing my own Rational class or > extension type would help. Well, I don't see why there should be a difference from what happens currently. The thing is, the model will not support you telling it at runtime what the results for operations on types it already knows should be: that would make us Scheme, not Python. [Guido] > Also, the coercions PEP is still in need of an author. Maybe you want > to take this on too? It will help your numeric proposal if you can > write down how you think coercions should work. Smooth, real smooth. OK, sold to the highest bidder -- I take it. I'll update it and 0000, and start reading MAL's pages. [Guido] > So explain how to do the coercions. This will force you to be > explicit about implementation details. (Both from Python and from C.) Again this is no different from current-day Python, modulo other PEPs. -- Moshe Zadka From gmcm@hypernet.com Mon Nov 6 17:27:22 2000 From: gmcm@hypernet.com (Gordon McMillan) Date: Mon, 6 Nov 2000 12:27:22 -0500 Subject: [Python-Dev] Integer division transition In-Reply-To: <200011061717.MAA07847@cj20424-a.reston1.va.home.com> References: Your message of "Mon, 06 Nov 2000 10:05:12 +0100." Message-ID: <3A06A3AA.1401.6080F51@localhost> [Peter Funk] > > Using a new keyword like 'div' in the tradition of languages like > > Modula-2 looks far more attractive to me. [Guido]: > That's a possibility too. It's a new keyword though, which has a much > higher threshold for acceptance than a new two-character operator > symbol. We could spell it as a built-in function: div(a, b), > (analogous to divmod(a, b)) but that's not very user-friendly either. > > Keep looking... FWIW, I remember as a newbie being sure that integer division was spelled "a div b". In fact when it didn't work, I went digging through PP, IPWP and the docs in dumbfounded disbelief. 
this-is-your-brain-on-Modula-2-ly y'rs - Gordon From nas@arctrix.com Mon Nov 6 10:35:29 2000 From: nas@arctrix.com (Neil Schemenauer) Date: Mon, 6 Nov 2000 02:35:29 -0800 Subject: [Python-Dev] Re: Class/type dichotomy thoughts In-Reply-To: <3A06E43B.D95EC267@lemburg.com>; from mal@lemburg.com on Mon, Nov 06, 2000 at 06:02:51PM +0100 References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> <3A03E340.38F78FA2@lemburg.com> <200011060244.VAA04289@cj20424-a.reston1.va.home.com> <3A06A336.5FD1EBC7@lemburg.com> <20001105234453.A8255@glacier.fnational.com> <3A06E43B.D95EC267@lemburg.com> Message-ID: <20001106023529.B8639@glacier.fnational.com> On Mon, Nov 06, 2000 at 06:02:51PM +0100, M.-A. Lemburg wrote: > Neil Schemenauer wrote: > > Behavior should be based solely on the type structure. Extension > > types would then be able to behave exactly like any other builtin > > type. Your coercion proposal and David's rich comparisons both > > remove some of this special casing based on type. > > Even though this could remove some of the problems, it doesn't > help much with a common use of Py_Check(): that of > using fast access macros and native Py_*() APIs once an > object has been classified as being of a certain type. > This usually improves performance. Can you clarify what you mean by "it doesn't help much"? Do you mean that extension types will not be able to perform as well as types that get special treatment by the interpreter? I think the major problem is that extension types _cannot_ behave the same as the builtin types. > By changing the simple address compare to a type handle > system, we might be able to add some more flexibility to > the system while keeping b/w compatibility. I don't see what this buys us. The Python interpreter shouldn't care about which type object it is dealing with. Can you give an example of where you think this would be useful? 
Neil From moshez@zadka.site.co.il Tue Nov 7 01:40:30 2000 From: moshez@zadka.site.co.il (Moshe Zadka) Date: Tue, 07 Nov 2000 03:40:30 +0200 Subject: [Python-Dev] Re: Revamping Python's Numeric Model In-Reply-To: Message from barry@wooz.org (Barry A. Warsaw) of "Mon, 06 Nov 2000 11:05:51 EST." <14854.55007.188953.545215@anthem.concentric.net> References: <200011051121.MAA01234@loewis.home.cs.tu-berlin.de> <14853.36547.748584.450976@cj42289-a.reston1.va.home.com> <14853.52668.117844.28459@bitdiddle.concentric.net> <14854.7221.546916.848838@cj42289-a.reston1.va.home.com> <14854.55007.188953.545215@anthem.concentric.net> Message-ID: [Barry] > BTW, Moshe posted his PEP on Saturday, and I don't think it's at all > unreasonable that he'd have to wait until Monday to get a PEP number. It is to me. If I work on Sunday, I might as well do some Python work too... Problems of truly international open source projects -- Moshe Zadka From guido@python.org Mon Nov 6 17:43:07 2000 From: guido@python.org (Guido van Rossum) Date: Mon, 06 Nov 2000 12:43:07 -0500 Subject: [Python-Dev] cgi.py and huge uploads problem In-Reply-To: Your message of "Mon, 06 Nov 2000 18:19:21 +0200." <3A06DA09.143D2D9@tismer.com> References: <3A06DA09.143D2D9@tismer.com> Message-ID: <200011061743.MAA08031@cj20424-a.reston1.va.home.com> > there is a problem with the cgi.py implementation of Python 1.5.2 > and uploading of huge files. > (found out by Joachim Rudolph, Virtual Photonics) > > Class FieldStorage of cgi.py has a couple of methods which > add accumulated lines to a self.lines array. This array fills and > fills until the whole upload is done, with the side effect of > loading the whole file into memory. The memory is freed > after the whole upload is done. > > This is no problem, until a company like VPI uses cgi.py to > upload whole distributions of 100 MB and more, via Zope. :-) > > Looking into cgi.py, I can't find a reason why this happens. 
> Is this possibly just a debugging feature which is no longer > needed? > While cgi.py was modified for cosmetic reasons, I didn't find > changes for Python 2.0 on this topic. > > Does it make sense to use a debug flag for this, or should > the feature vanish completely? > Do you want a patch? You know, I have *no* idea why this is. I have looked through various revisions (this feature is as old as cgi.py:1.8) and cannot find any use of or need for self.lines! It just gets appended to. So I think it's safe to toss all the references to self.lines and see who complains. --Guido van Rossum (home page: http://www.python.org/~guido/) From mal@lemburg.com Mon Nov 6 17:36:54 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Mon, 06 Nov 2000 18:36:54 +0100 Subject: [Python-Dev] More Unicode support References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> <3A03E340.38F78FA2@lemburg.com> <200011060340.WAA04479@cj20424-a.reston1.va.home.com> <3A067664.22D09D03@lemburg.com> <200011061720.MAA07862@cj20424-a.reston1.va.home.com> Message-ID: <3A06EC36.61B32B4D@lemburg.com> Guido van Rossum wrote: > > [GvR] > > > Hm... There's also the problem that there's no easy way to do Unicode > > > I/O. I'd like to have a way to turn a particular file into a Unicode > > > output device (where the actual encoding might be UTF-8 or UTF-16 or a > > > local encoding), which should mean that writing Unicode objects to the > > > file should "do the right thing" (in particular should not try to > > > coerce it to an 8-bit string using the default encoding first, like > > > print and str() currently do) and that writing 8-bit string objects to > > > it should first convert them to Unicode using the default encoding > > > (meaning that at least ASCII strings can be written to a Unicode file > > > without having to specify a conversion). 
I support that reading from > > > a "Unicode file" should always return a Unicode string object (even if > > > the actual characters read all happen to fall in the ASCII range). > > > This requires some serious changes to the current I/O mechanisms; in > > > particular str() needs to be fixed, or perhaps a ustr() needs to be > > > added that is used in certain cases. Tricky, tricky! > > [MAL] > > It's not all that tricky since you can write a StreamRecoder > > subclass which implements this. AFAIR, I posted such an implementation > > on i18n-sig. > > > > BTW, one of my patches on SF adds unistr(). Could be that it's > > time to apply it :-) > > Adding unistr() and StreamRecoder isn't enough. The problem is that > when you set sys.stdout to a StreamRecoder, the print statement > doesn't do the right thing! Try it. print u"foo" will work, but > print u"\u1234" will fail because print always applies the default > encoding. Hmm, that's due to PyFile_WriteObject() calling PyObject_Str(). Perhaps we ought to let it call PyObject_Unicode() (which you find in the patch on SF) instead for Unicode objects. That way the file-like .write() method will be given a Unicode object and StreamRecoder could then do the trick. Haven't tried this, but it could work (the paths objects take through Python to get printed are somewhat strange at times -- there are just so many different possibilities and special cases that it becomes hard to tell from just looking at the code). > The required changes to print are what's tricky. Whether we even need > unistr() depends on the solution we find there. I think we'll need PyObject_Unicode() and unistr() one way or another. Those two APIs simply complement PyObject_Str() and str() in that they always return Unicode objects and do the necessary conversion based on the input object type. 
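The "Unicode file" being discussed — encode on write, always hand back Unicode on read — is essentially what io.TextIOWrapper later became. A minimal after-the-fact sketch of the idea (the class and names here are illustrative stand-ins for StreamRecoder, not the 2.0-era API):

```python
import io

class UnicodeFile:
    """Sketch of a 'Unicode file': a text-only wrapper over a byte stream.

    Writes encode with the stream's own encoding (never a process-wide
    default); reads always return a text (Unicode) object.
    """

    def __init__(self, raw, encoding="utf-8"):
        self.raw = raw            # underlying binary stream
        self.encoding = encoding

    def write(self, text):
        # Encode with this stream's encoding -- the behaviour Guido
        # wanted print to have for such files.
        self.raw.write(text.encode(self.encoding))

    def read(self):
        # Reading hands back Unicode even for pure-ASCII content.
        return self.raw.getvalue().decode(self.encoding)

buf = io.BytesIO()
f = UnicodeFile(buf, "utf-8")
f.write("foo \u1234")                        # U+1234 -> 3 UTF-8 bytes
assert buf.getvalue() == b"foo \xe1\x88\xb4"
assert f.read() == "foo \u1234"
```

The sticking point in the thread is exactly the part this sketch glosses over: getting the print machinery to hand such a wrapper a Unicode object in the first place, instead of str()-ifying it with the default encoding on the way through.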
-- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From guido@python.org Mon Nov 6 17:49:35 2000 From: guido@python.org (Guido van Rossum) Date: Mon, 06 Nov 2000 12:49:35 -0500 Subject: [Python-Dev] Revamping Python's Numeric Model In-Reply-To: Your message of "Tue, 07 Nov 2000 03:30:05 +0200." References: <200011060119.UAA03952@cj20424-a.reston1.va.home.com> <200011061628.LAA07574@cj20424-a.reston1.va.home.com> Message-ID: <200011061749.MAA08063@cj20424-a.reston1.va.home.com> > [Guido] > > That's your choice. Wildly inaccurate is sometimes unavoidable. This > > is clearly an issue open for debate, but note that I speak from > > experience: ABC used rationals unless you forced it to use reals, and > > the rationals *did* cause real users to complain about how slow ABC > > was. [Moshe] > Well, experiences differ. I'm on the Scheme48 (which uses rationals) mailing > list, and nobody ever complained. Besides, if "forcing" is simply saying > either inexact() (and note that inexact-exact operations are performed > inexactly) or mixing in an inexact literal (assuming that's what 1.0 will > be), it might be easier. I don't know ABC well enough to say how the > forcing mechanism worked. Scheme has no real users -- only CS types. :-) ABC's forcing was as simple as writing ~x or mixing inexact numbers. We did have the notion that 1.0 was an exact literal (to me it looks exact!) so maybe that was an additional problem. > [Guido] > > I think this deserves very explicit mention in your PEP. An example > > of how I would go about implementing my own Rational class or > > extension type would help. > > Well, I don't see why there should be a difference from what happens > currently. The thing is, the model will not support you telling it > at runtime what the results for operations on types it already knows > should be: that would make us Scheme, not Python. 
Agreed. I think what makes me feel uneasy is that your proposal assumes that there is One True Numeric Type, and all the rest are second-class numbers. Python currently makes no such claim: there are many numeric types built in and you can add your own types that play along. The only thing that makes a difference is that there are literals for the built-in types and not for the extension types; but apart from that they have all the same privileges, and the coercion rules work in everybody's favor (except when they don't :-). Placing more emphasis on One True Numeric Type runs a risk of discriminating against the others. > [Guido] > > Also, the coercions PEP is still in need of an author. Maybe you want > > to take this on too? It will help your numeric proposal if you can > > write down how you think coercions should work. > > Smooth, real smooth. OK, sold to the highest bidder -- I take it. > I'll update it and 0000, and start reading MAL's pages. OK, it's a deal. But make yourself a co-author with MAL. --Guido van Rossum (home page: http://www.python.org/~guido/) From guido@python.org Mon Nov 6 17:53:36 2000 From: guido@python.org (Guido van Rossum) Date: Mon, 06 Nov 2000 12:53:36 -0500 Subject: [Python-Dev] More Unicode support In-Reply-To: Your message of "Mon, 06 Nov 2000 18:36:54 +0100." <3A06EC36.61B32B4D@lemburg.com> References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> <3A03E340.38F78FA2@lemburg.com> <200011060340.WAA04479@cj20424-a.reston1.va.home.com> <3A067664.22D09D03@lemburg.com> <200011061720.MAA07862@cj20424-a.reston1.va.home.com> <3A06EC36.61B32B4D@lemburg.com> Message-ID: <200011061753.MAA08112@cj20424-a.reston1.va.home.com> [Guido] > > Adding unistr() and StreamRecoder isn't enough. The problem is that > > when you set sys.stdout to a StreamRecoder, the print statement > > doesn't do the right thing! Try it. print u"foo" will work, but > > print u"\u1234" will fail because print always applies the default > > encoding. 
[MAL] > Hmm, that's due to PyFile_WriteObject() calling PyObject_Str(). > Perhaps we ought to let it call PyObject_Unicode() (which you > find in the patch on SF) instead for Unicode objects. That way > the file-like .write() method will be given a Unicode object > and StreamRecoder could then do the trick. That's still not enough. Classes and types should be able to have a __str__ (or tp_str) that yields Unicode too. --Guido van Rossum (home page: http://www.python.org/~guido/) From moshez@zadka.site.co.il Tue Nov 7 02:08:05 2000 From: moshez@zadka.site.co.il (Moshe Zadka) Date: Tue, 07 Nov 2000 04:08:05 +0200 Subject: [Python-Dev] Revamping Python's Numeric Model In-Reply-To: Message from Guido van Rossum of "Mon, 06 Nov 2000 12:49:35 EST." <200011061749.MAA08063@cj20424-a.reston1.va.home.com> References: <200011060119.UAA03952@cj20424-a.reston1.va.home.com> <200011061628.LAA07574@cj20424-a.reston1.va.home.com> <200011061749.MAA08063@cj20424-a.reston1.va.home.com> Message-ID: [Guido] > Scheme has no real users -- only CS types. :-) I resent that! I'm not a CS type: I never studied CS, and I refuse to stand for that kind of mud slinging. [Guido] > ABC's forcing was as simple as writing ~x or mixing inexact numbers. > We did have the notion that 1.0 was an exact literal (to me it looks > exact!) so maybe that was an additional problem. Let me clarify: exactness *is* in the operations. Since Python is an OO language, an object decides semantics of operations on it. "1.0" decides to have inexact operations on it, no matter how exact it looks to you. [Guido] > Agreed. I think what makes me feel uneasy is that your proposal > assumes that there is One True Numeric Type, and all the rest are > second-class numbers. Python currently makes no such claim: there are > many numeric types built in and you can add your own types that play > along. My proposal doesn't change that, as far as it's true. 
(And it isn't really true: I can't have my new types as literals, or as results of existing operations). It changes the number of built-in numeric types to 1. -- Moshe Zadka This is a signature anti-virus. Please stop the spread of signature viruses! From mal@lemburg.com Mon Nov 6 18:15:27 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Mon, 06 Nov 2000 19:15:27 +0100 Subject: [Python-Dev] More Unicode support References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> <3A03E340.38F78FA2@lemburg.com> <200011060340.WAA04479@cj20424-a.reston1.va.home.com> <3A067664.22D09D03@lemburg.com> <200011061720.MAA07862@cj20424-a.reston1.va.home.com> <3A06EC36.61B32B4D@lemburg.com> <200011061753.MAA08112@cj20424-a.reston1.va.home.com> Message-ID: <3A06F53F.2CF6760@lemburg.com> Guido van Rossum wrote: > > [Guido] > > > Adding unistr() and StreamRecoder isn't enough. The problem is that > > > when you set sys.stdout to a StreamRecoder, the print statement > > > doesn't do the right thing! Try it. print u"foo" will work, but > > > print u"\u1234" will fail because print always applies the default > > > encoding. > > [MAL] > > Hmm, that's due to PyFile_WriteObject() calling PyObject_Str(). > > Perhaps we ought to let it call PyObject_Unicode() (which you > > find in the patch on SF) instead for Unicode objects. That way > > the file-like .write() method will be given a Unicode object > > and StreamRecoder could then do the trick. > > That's still not enough. Classes and types should be able to have a > __str__ (or tp_str) that yields Unicode too. Instances are allowed to return Unicode through their __str__ method and PyObject_Unicode() will pass it along. PyObject_Str() will still convert it to an 8-bit string though because there's too much code out there which expects a string object (without checking!) ... even the Python core. So if you print an instance which returns Unicode through __str__, the wrapper should see a real Unicode object at its end... 
at least I think we're getting closer ;-) -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From barry@wooz.org Mon Nov 6 18:32:44 2000 From: barry@wooz.org (Barry A. Warsaw) Date: Mon, 6 Nov 2000 13:32:44 -0500 (EST) Subject: [Python-Dev] cgi.py and huge uploads problem References: <3A06DA09.143D2D9@tismer.com> <200011061743.MAA08031@cj20424-a.reston1.va.home.com> Message-ID: <14854.63820.201069.434450@anthem.concentric.net> >>>>> "GvR" == Guido van Rossum writes: GvR> You know, I have *no* idea why this is. I have looked GvR> through various revisions (this feature is as old as GvR> cgi.py:1.8) and cannot find any use of or need for GvR> self.lines! It just gets appended to. GvR> So I think it's safe to toss all the references to self.lines GvR> and see who complains. There are two bug reports related to this in the SF database. The first one was 110674, which we closed after adding the feature request to PEP 42. The second is 119806, which looks fairly new, but wasn't submitted by Chris or Joachim. I came to the same conclusion Guido does above when I looked at 110674 in the Python 2.0 time frame, but didn't feel comfortable making that change for 2.0. I think it's correct to make the change now. I will do the following: - remove self.lines from cgi.py - close bug #119806 - update pep 42 Attached below is the patch. 
-Barry -------------------- snip snip --------------------
Index: cgi.py
===================================================================
RCS file: /cvsroot/python/python/dist/src/Lib/cgi.py,v
retrieving revision 1.55
diff -u -r1.55 cgi.py
--- cgi.py	2000/10/03 13:51:09	1.55
+++ cgi.py	2000/11/06 18:32:18
@@ -497,7 +497,6 @@
         self.list = self.file = None
         self.done = 0
-        self.lines = []
         if ctype == 'application/x-www-form-urlencoded':
             self.read_urlencoded()
         elif ctype[:10] == 'multipart/':
@@ -633,7 +632,6 @@
             if not line:
                 self.done = -1
                 break
-            self.lines.append(line)
             self.file.write(line)
 
     def read_lines_to_outerboundary(self):
@@ -646,7 +644,6 @@
             if not line:
                 self.done = -1
                 break
-            self.lines.append(line)
             if line[:2] == "--":
                 strippedline = string.strip(line)
                 if strippedline == next:
@@ -676,7 +673,6 @@
             if not line:
                 self.done = -1
                 break
-            self.lines.append(line)
             if line[:2] == "--":
                 strippedline = string.strip(line)
                 if strippedline == next:
From guido@python.org Mon Nov 6 19:09:26 2000 From: guido@python.org (Guido van Rossum) Date: Mon, 06 Nov 2000 14:09:26 -0500 Subject: [Python-Dev] cgi.py and huge uploads problem In-Reply-To: Your message of "Mon, 06 Nov 2000 13:32:44 EST." <14854.63820.201069.434450@anthem.concentric.net> References: <3A06DA09.143D2D9@tismer.com> <200011061743.MAA08031@cj20424-a.reston1.va.home.com> <14854.63820.201069.434450@anthem.concentric.net> Message-ID: <200011061909.OAA08466@cj20424-a.reston1.va.home.com> > GvR> You know, I have *no* idea why this is. I have looked > GvR> through various revisions (this feature is as old as > GvR> cgi.py:1.8) and cannot find any use of or need for > GvR> self.lines! It just gets appended to. > > GvR> So I think it's safe to toss all the references to self.lines > GvR> and see who complains. [Barry] > There are two bug reports related to this in the SF database. The > first one was 110674, which we closed after adding the feature request > to PEP 42. 
The second is 119806, which looks fairly new, but wasn't > submitted by Chris or Joachim. > > I came to the same conclusion Guido does above when I looked at 110674 > in the Python 2.0 time frame, but didn't feel comfortable making that > change for 2.0. I think it's correct to make the change now. > > I will do the following: > > - remove self.lines from cgi.py > - close bug #119806 > - update pep 42 > > Attached below is the patch. Patch looks good. Go for it. Somehow when I skimmed the original bug reports I never really understood what was going on -- thanks to Christian for pointing out *exactly* what was the problem in words I could understand. :-) --Guido van Rossum (home page: http://www.python.org/~guido/) From mal@lemburg.com Mon Nov 6 20:01:40 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Mon, 06 Nov 2000 21:01:40 +0100 Subject: [Python-Dev] Re: Class/type dichotomy thoughts References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> <3A03E340.38F78FA2@lemburg.com> <200011060244.VAA04289@cj20424-a.reston1.va.home.com> <3A06A336.5FD1EBC7@lemburg.com> <20001105234453.A8255@glacier.fnational.com> <3A06E43B.D95EC267@lemburg.com> <20001106023529.B8639@glacier.fnational.com> Message-ID: <3A070E24.B07FA7D7@lemburg.com> Neil Schemenauer wrote: > > On Mon, Nov 06, 2000 at 06:02:51PM +0100, M.-A. Lemburg wrote: > > Neil Schemenauer wrote: > > > Behavior should be based solely on the type structure. Extension > > > types would then be able to behave exactly like any other builtin > > > type. Your coercion proposal and David's rich comparisons both > > > remove some of this special casing based on type. > > > > Even though this could remove some of the problems, it doesn't > > help much with a common use of Py_Check(): that of > > using fast access macros and native Py_*() APIs once an > > object has been classified as being of a certain type. > > This usually improves performance. > > Can you clarify what you mean by "it doesn't help much"? 
Do you > mean that extension types will not be able to perform as well as > types that get special treatment by the interpreter? I think the > major problem is that extension types _cannot_ behave the same as > the builtin types. If you only define types by interface, the interpreter will have to apply the interface availability checks every time it calls a slot. This would cause a major performance hit which would not be acceptable. The "doesn't help much" refers to the fact that once an object has been identified as, say, Integer you are free to use whatever access macro or function you choose. > > By changing the simple address compare to a type handle > > system, we might be able to add some more flexibility to > > the system while keeping b/w compatibility. > > I don't see what this buys us. The Python interpreter shouldn't > care about which type object it is dealing with. Can you give an > example of where you think this would be useful? Take e.g. dictionaries: you could easily add a new dictionary type which uses case-insensitive string keys by extending the existing dictionary type. The new type would reuse most of the slots of the original type and only replace the ones needed for lookup with the new logic for case-insensitivity. Then it sets the type id to PyDict_TypeID and Python will use it as if it were an original dictionary object. The underlying type objects would be different though (and also the type object address which is currently used to identify a builtin type). -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From guido@python.org Mon Nov 6 21:02:27 2000 From: guido@python.org (Guido van Rossum) Date: Mon, 06 Nov 2000 16:02:27 -0500 Subject: [Python-Dev] Warnings PEP In-Reply-To: Your message of "Sun, 05 Nov 2000 22:24:27 PST." 
References: Message-ID: <200011062102.QAA09294@cj20424-a.reston1.va.home.com> Paul, thanks for submitting a warnings framework. I'd like to give some feedback, comparing it to my own proposal. Please also give explicit feedback on mine! [Paul Prescod] > Abstract > > This PEP describes a generalized warning mechanism for Python 2.1. The > primary purpose of this mechanism is to alert the programmer or user > of a program to potential or likely errors which, for whatever reason, > are not considered exception-worthy. For example, this might be done > to keep old code working during a transitional period or to alert the > programmer or user of a recoverable error. What's missing here is a set of requirements. Comparing your proposal to my requirements, I find that you are not addressing two requirements that are important in my mind: a convenient and efficient C API, and a mechanism that prints a warning message only the first time that the warning is issued. These are important to me, because I expect that most warnings will probably be generated by C code, and in order to be useful we must avoid mindless repetition. If a function divides two integers using the / operator, this is being detected by C code (the int or long division implementation) and we only want to print the warning once per program execution and per source location. My expectation is that if a single occurrence (in the program) of a warning condition caused an endless sequence of warnings to be spit out, people would quickly grow a habit of disabling all warnings, thus defeating the purpose. Warnings are more a human engineering issue than a technical issue! That's also why I am emphasizing a C API -- I want to make it real easy to issue quality warnings in the runtime. It's also why I specify a rich (probably too rich!) filtering mechanism. > Syntax > > assert >> cls, test[[[, arg], arg]...] I have several problems with this. 
First of all, using "assert" means that in "optimizing" mode (python -O) you won't get *any* warnings. I think that the decision to disable all warnings should be independent from the decision to "optimize". Second, you're hypergeneralizing the extended print syntax. Just because I think it's okay to add >>file to the print syntax doesn't mean that it's now okay to add >>object syntax to all statements! I also don't see what warnings have to do with assertions. Assertions are a mechanism to check for error conditions. What happens if the error is detected is of less importance -- it could raise an exception (Python), issue a fatal error (C), or do nothing (in -O mode). With warnings I believe the issue is not so much the detection of the condition (for which a regular 'if' statement does just fine) but the reporting. Again, this is motivated by the fact that I expect that flexible filtering is essential for a successful warning mechanism. > "cls" may be any callable object that takes a list as a single > argument and returns an object with the required > attributes "get_action" and "format_message" > > * get_action() -> "warn"|"error"|"suppress" > * format_message() -> string > > A provided base class implements these methods in a reusable > fashion. Warning creators are encouraged to merely subclass. This is just a matter of exposition, but when I first read your PEP I had a hard time figuring out the purpose of the cls object. It wasn't until I got to the very end where I saw your example classes that I realized what it is: it represents a specific warning condition or a group of related warning conditions. > This extended form of the assertion statement calls the assertion > handler code in the new "assertions" module. I won't harp on this each time, but I'd like to point out once more that "assertion" is the wrong name for a warning feature. 
Although it isn't part of the Zen of Python (by Tim Peters), it should be: a suggestive name for a feature is worth half a spec! > The semantics of the built-in assertion handler are defined by the > following code. It should be exposed in a new "assertions" module. > > def handle_assertion(cls, message = ""): > "This code is called when an assertion fails and cls is not None" > > obj = cls(message) > action = obj.get_action() > > if action=="error": > *** existing assertion code *** That's just raise AssertionError, message Right? > elif action=="warn": > sys.stderr.write(obj.format_message()) > elif action=="suppress": > pass > else: > assert action in ["warn","error","suppress"] > > Even if handle_assertion is implemented in C, it should be exposed as > assertions.handle_assertion so that it may be overridden. Suggestion: have separate warning and error handlers, so that if I want to override these branches of the if statement I don't have to repeat the entire handler. > The generic warning base class is defined below: > > class Assertion: > def __init__(self, *args): > if len(args) == 1: > self.args = args[0] > else: > self.args = args > > def format_message(self): > sys.stderr.write("%s: %s" %(self.__class__.__name__, self.args)) > > def get_action(self): > return (self.get_user_request(self.__class__) > or sys.default_warning_action) > > def get_user_request(self, cls): > if cls.__name__ in sys.errors: > return "error" > elif cls.__name__ in sys.warnings: > return "warn" > elif cls.__name__ in sys.disabled_warnings: I see no definition of sys.disabled_warnings. Instead of sys.disabled_warnings you meant sys.suppress, right? > return "suppress" > for base in cls.__bases__: > rc = self.get_user_request(base) > if rc: > return rc > else: > return None This approach (searching for the class name or the name of one of its base classes in a list) doesn't look very object-oriented. It would make more sense to store the desired return value as a class or instance attribute. 
The default warning action could be stored on the base class. > The Assertion class should be implemented in Python so that it can be > used as a base class. > > Because this code inherently implements "warning state inheritance", > it would be rare to override any of the methods, but this is possible > in exceptional circumstances. > > Command line > > By default the special variables have the following contents: > > sys.warnings = [] > sys.errors = [] > sys.suppress = [] > sys.default_warning_action = "warn" > > These variables may be changed from the command line. The command line > arguments are interpreted as described below: > > -w XXX => sys.warnings.append("XXX") > -e XXX => sys.errors.append("XXX") > -no-w XXX => sys.suppress.append("XXX") > -wall => sys.default_warning_action => "warn" > -eall => sys.default_warning_action => "error" > -no-wall => sys.default_warning_action => "suppress" Python doesn't support long options (I don't *like* long options so I doubt that this is a good occasion to start lobbying for them :-). We can come up with different options though. > As per the code above, errors take precedence over warnings and > warnings over suppressions unless a particular assertion class > specifies otherwise. I would use a different precedence scheme: a more specific filter takes precedence over a more general filter. So -eall -wdubious would mean that "dubious" class warnings are warnings but all others are errors, and -wall -edubious would mean the opposite. > Built-in warning objects: > > class exception_in_del(Assertion): > "An exception was ignored in an __del__ method" > > class deprecation(Assertion): > "This feature may be removed in a future version of Python." > > class dubious(Assertion): > "There is a common error associated with this feature." > > These class definitions are part of the "Assertion" module. 
They > should only ever be used when there exists a way for the programmer to > accomplish the same thing without triggering the warning. For instance > the way to suppress the deletion exception is to trap exceptions in > __del__ methods with a try/except block. --Guido van Rossum (home page: http://www.python.org/~guido/) From tim_one@email.msn.com Mon Nov 6 21:36:49 2000 From: tim_one@email.msn.com (Tim Peters) Date: Mon, 6 Nov 2000 16:36:49 -0500 Subject: [Python-Dev] Revamping Python's Numeric Model In-Reply-To: <200011061628.LAA07574@cj20424-a.reston1.va.home.com> Message-ID: [Guido] > 1/2 yielding 0.5 is innovative? Give me a break. Pascal did this. > Algol-60 did this. Fortran does this. And rational numbers are less > innovative? Small correction: Fortran does not -- 1/2 is 0 in Fortran (same as C99's new rules, int div always truncates). So far as innovation goes, no choice on the table so far is innovative (neither mathematically nor in programming languages), so there's no basis for choosing there. Guido, *why* did ABC use rationals by default? Was that driven by usability studies? From tim_one@email.msn.com Mon Nov 6 21:36:50 2000 From: tim_one@email.msn.com (Tim Peters) Date: Mon, 6 Nov 2000 16:36:50 -0500 Subject: [Python-Dev] Revamping Python's Numeric Model In-Reply-To: <200011061749.MAA08063@cj20424-a.reston1.va.home.com> Message-ID: [Guido] > ABC's forcing was as simple as writing ~x or mixing inexact numbers. > We did have the notion that 1.0 was an exact literal (to me it looks > exact!) so maybe that was an additional problem. I seriously wonder whether that was *the* problem with ABC: not only was 1.0 treated as an exact rational in ABC, so was 6.02e23 and 3.14159e-314 etc. At least for me, this caused rationals to get used in many places I *intended* to use floats. I assume many others got burned by this too, as I'd say it's impossible for users coming from other languages not to see 6.02e23 etc as float literals. 
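The trap Tim describes -- scientific-notation literals silently becoming exact rationals -- is easy to reproduce with the `fractions` module from today's standard library. That module did not exist in 2000; it stands in here purely as an illustration of ABC-style exact arithmetic:

```python
from fractions import Fraction

# Reading 6.02e23 as an exact rational (as ABC did) keeps every digit
# of the decimal literal; reading it as a binary float keeps only the
# nearest representable double.
exact = Fraction("6.02e23")       # exactly 602 * 10**21
approx = 6.02e23                  # nearest double, already inexact

print(exact == 602 * 10**21)      # True: no information lost
print(Fraction(approx) == exact)  # False: the float is slightly off

# The usability win the ABC designers wanted: exact decimal arithmetic
# has none of the familiar floating-point surprises.
print(Fraction("0.1") + Fraction("0.2") == Fraction("0.3"))  # True
print(0.1 + 0.2 == 0.3)                                      # False
```

The flip side, as Tim notes, is that a user who *meant* a float gets rationals with enormous numerators instead, and every subsequent computation pays for the exactness.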
From paulp@ActiveState.com Mon Nov 6 22:01:27 2000 From: paulp@ActiveState.com (Paul Prescod) Date: Mon, 06 Nov 2000 14:01:27 -0800 Subject: [Python-Dev] Warnings PEP References: <200011062102.QAA09294@cj20424-a.reston1.va.home.com> Message-ID: <3A072A37.C2CAFED1@activestate.com> Guido van Rossum wrote: > > Paul, thanks for submitting a warnings framework. I'd like to give > some feedback, comparing it to my own proposal. Please also give > explicit feedback on mine! Okay, but I'm a little bit worried about getting into a science project -- in terms of my time to work on it. I'm perfectly happy with the two requirements you mentioned but I haven't thought enough about filters to be able to whip off ideas about them quickly. > What's missing here is a set of requirements. Comparing your proposal > to my requirements, I find that you are not addressing two > requirements that are important in my mind: a convenient and efficient > C API, and a mechanism that prints a warning message only the first > time that the warning is issued. I agree that these are both important things and I will add them to my proposal. Also, I like your requirement that the target for warnings should be easily customizable. > Warnings are more a human engineering issue than a technical issue! > That's also why I am emphasizing a C API -- I want to make it real > easy to issue quality warnings in the runtime. It's also why I > specify a rich (probably too rich!) filtering mechanism. Let me put out the strawman proposal that "grep" is a nicely orthogonal filtering mechanism. I guess my thinking is that you turn off warnings at the source. If you get a warning about __del__ exceptions then you put in a try/except. If you get a warning about an unused variable then you assign it to itself. If you get a warning about integer division then you should pass in a float and so forth. > > Syntax > > > > assert >> cls, test[[[, arg], arg]...] > > I have several problems with this. 
First of all, using "assert" means > that in "optimizing" mode (python -O) you won't get *any* warnings. I > think that the decision to disable all warnings should be independent > from the decision to "optimize". Arguable. I see it as "release" and "debug" configurations. python and python_d. > Second, you're hypergeneralizing the > extended print syntax. Just because I think it's okay to add >>file > to the print syntax doesn't mean that it's now okay to add >>object > syntax to all statements! Well getting new syntax into Python is really, really hard so we've got to squeeze as much value out of what we have as possible. But anyhow, assertions are not allowed to You and I agree that there is an important sociological dimension to this. We can't require: import warnings warnings.warn("foo", "bar") I prefer: warn Foo, "Bar" just like: raise Foo, "Bar" > I also don't see what warnings have to do with assertions. Assertions > are a mechanism to check for error conditions. What happens if the > error is detected is of less importance -- it could raise an exception > (Python), issue a fatal error (C), or do nothing (in -O mode). Warnings are issued when an error or dubious construct is detected. Assertions are "fatal warnings". You agree that it is appropriate for some "warnings" to kill the app in some circumstances. Isn't it just a hop-skip-and-a-jump to say that warnings and errors are just points on the spectrum: Report Once Report Always Report and Die Python trained me to think of function calls and object constructions as being basically the same thing -- and keyword arguments and variable names being basically the same thing. etc. > With warnings I believe the issue is not so much the detection of the > condition (for which a regular 'if' statement does just fine) but the > reporting. Again, this is motivated by the fact that I expect that > flexible filtering is essential for a successful warning mechanism. 
I don't see why assertions need special syntax but warnings do not! I would have been happy for a "warn" keyword with assert-like syntax but I don't think I'll see that in my lifetime. > This is just a matter of exposition, but when I first read your PEP I > had a hard time figuring out the purpose of the cls object. It wasn't > until I got to the very end where I saw your example classes that I > realized what it is: it represents a specific warning condition or a > group of related warning conditions. It sort of evolved into an exception-like mechanism in that the class is instantiated with arguments just as exception classes are. > > if action=="error": > > *** existing assertion code *** > > That's just > > raise AssertionError, message > > Right? Well I'd like the traceback to emanate from the caller's position not the warning handler's position. Python doesn't really have a way to say that simply. This may well be implemented in C so it might not matter. > Suggestion: have separate warning and error handlers, so that if I > want to override these branches of the if statement I don't have to > repeat the entire handler. Good idea. > I see no definition of sys.disabled_warnings. Instead of > sys.disabled_warnings you meant sys.suppress, right? Right. > > return "suppress" > > for base in cls.__bases__: > > rc = self.get_user_request(base) > > if rc: > > return rc > > else: > > return None > > This approach (searching for the class name or the name of one of its > base classes in a list) doesn't look very object-oriented. It would > make more sense to store the desired return value as a class or > instance attribute. The default warning action could be stored on the > base class. 
The way I came to this odd structure is I wanted most subclasses to be just no-ops as many exceptions are: class ActiveStateLintWarning(Warning): pass class CodeLooksLikePerl(ActiveStateLintWarning): pass class TooMuchOnOneLine(CodeLooksLikePerl): pass So what __init__ could I write that would look up -wTooMuchOnOneLine and then if it failed that, look up -wCodeLooksLikePerl and so forth? It gets pretty mind-bending because you sort of want one method to call another and yet you want it to be the *same* inherited method (i.e. don't have to code it each time). E.g. you want to run the *same method* at each level of the hierarchy. So I just do that directly. > > -w XXX => sys.warnings.append("XXX") > > -e XXX => sys.errors.append("XXX") > > -no-w XXX => sys.suppress.append("XXX") > > -wall => sys.default_warning_action => "warn" > > -eall => sys.default_warning_action => "error" > > -no-wall => sys.default_warning_action => "suppress" > > Python doesn't support long options (I don't *like* long options so I > doubt that this is a good occasion to start lobbying for them :-). We > can come up with different options though. Are these long options? Or just overloaded behavior on -w, -e, -n . Think of "all" as the base class for warnings or something. > > As per the code above, errors take precedence over warnings and > > warnings over suppressions unless a particular assertion class > > specifies otherwise. > > I would use a different precedence scheme: a more specific filter > takes precedence over a more general filter. So -eall -wdubious would > mean that "dubious" class warnings are warnings but all others are > errors, and -wall -edubious would mean the opposite. Good idea. I think the code may work that way already. 
:) Paul Prescod From guido@python.org Mon Nov 6 22:16:29 2000 From: guido@python.org (Guido van Rossum) Date: Mon, 06 Nov 2000 17:16:29 -0500 Subject: [Python-Dev] Revamping Python's Numeric Model In-Reply-To: Your message of "Mon, 06 Nov 2000 16:36:49 EST." References: Message-ID: <200011062216.RAA09669@cj20424-a.reston1.va.home.com> > [Guido] > > 1/2 yielding 0.5 is innovative? Give me a break. Pascal did this. > > Algol-60 did this. Fortran does this. And rational numbers are less > > innovative? [Tim] > Small correction: Fortran does not -- 1/2 is 0 in Fortran (same as C99's > new rules, int div always truncates). I stand corrected -- the idea is only 40 years old, not 44. :-) > So far as innovation goes, no choice on the table so far is innovative > (neither mathematically nor in programming languages), so there's no basis > for choosing there. > > Guido, *why* did ABC use rationals by default? Was that driven by usability > studies? I assume that the use of rationals for exact numbers was driven by usability studies -- like us, the ABC designers were tired of explaining the vagaries of floating point to novices. I remember that I pushed for using rationals for 1E1000 and 1E-1000, probably out of a mistaken sense of consistency. I don't think I'm responsible for 1.0 being exact -- in "The B Programmer's Handbook" (CWI, 1985) 1.0 is exact and 1E10 is approximate. In "The ABC Programmer's Handbook" (Prentice Hall, 1990) these are all exact. --Guido van Rossum (home page: http://www.python.org/~guido/) From martin@loewis.home.cs.tu-berlin.de Tue Nov 7 00:49:34 2000 From: martin@loewis.home.cs.tu-berlin.de (Martin v. Loewis) Date: Tue, 7 Nov 2000 01:49:34 +0100 Subject: [Python-Dev] Revamping Python's Numeric Model Message-ID: <200011070049.BAA01485@loewis.home.cs.tu-berlin.de> > Is there some API for it in C? In C99, you have access to the floating-point environment: #include <fenv.h> /* ... 
*/ { #pragma STDC FENV_ACCESS ON int set_excepts; feclearexcept(FE_INEXACT | FE_OVERFLOW); // maybe raise exceptions set_excepts = fetestexcept(FE_INEXACT | FE_OVERFLOW); if (set_excepts & FE_INEXACT) f(); if (set_excepts & FE_OVERFLOW) g(); /* ... */ } It defines the following symbolic exception constants: FE_DIVBYZERO FE_INEXACT FE_INVALID FE_OVERFLOW FE_UNDERFLOW Few compilers support that, though. Regards, Martin From greg@cosc.canterbury.ac.nz Tue Nov 7 01:06:04 2000 From: greg@cosc.canterbury.ac.nz (Greg Ewing) Date: Tue, 07 Nov 2000 14:06:04 +1300 (NZDT) Subject: [Python-Dev] Integer division transition In-Reply-To: <200011061717.MAA07847@cj20424-a.reston1.va.home.com> Message-ID: <200011070106.OAA00139@s454.cosc.canterbury.ac.nz> > It's a new keyword though, which has a much > higher threshold for acceptance than a new two-character operator > symbol. It could be non-reserved, since a div b is currently a syntax error. Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg@cosc.canterbury.ac.nz +--------------------------------------+ From martin@loewis.home.cs.tu-berlin.de Tue Nov 7 01:13:04 2000 From: martin@loewis.home.cs.tu-berlin.de (Martin v. Loewis) Date: Tue, 7 Nov 2000 02:13:04 +0100 Subject: [Python-Dev] Revamping Python's Numeric Model Message-ID: <200011070113.CAA01767@loewis.home.cs.tu-berlin.de> > I seriously wonder whether that was *the* problem with ABC: not only > was 1.0 treated as an exact rational in ABC, so was 6.02e23 and > 3.14159e-314 etc. At least for me, this caused rationals to get > used in many places I *intended* to use floats. I assume many > others got burned by this too, as I'd say it's impossible for users > coming from other languages not to see 6.02e23 etc as float > literals. 
There seems to be a long tradition in Python of annotating literals to get them interpreted in a different way; I think it would be reasonable to tell apart floating point literals and rational literals (with a power-of-ten denominator). Specifically, the "scientific notation" could be used: 1.1 would be exactly the same as 11/10, 1.1e0 would be binary floating point, and only approximately equal to 11/10. Regards, Martin From guido@python.org Tue Nov 7 03:07:49 2000 From: guido@python.org (Guido van Rossum) Date: Mon, 06 Nov 2000 22:07:49 -0500 Subject: [Python-Dev] Warnings PEP In-Reply-To: Your message of "Mon, 06 Nov 2000 14:01:27 PST." <3A072A37.C2CAFED1@activestate.com> References: <200011062102.QAA09294@cj20424-a.reston1.va.home.com> <3A072A37.C2CAFED1@activestate.com> Message-ID: <200011070307.WAA10555@cj20424-a.reston1.va.home.com> [Guido] > > Warnings are more a human engineering issue than a technical issue! > > That's also why I am emphasizing a C API -- I want to make it real > > easy to issue quality warnings in the runtime. It's also why I > > specify a rich (probably too rich!) filtering mechanism. [Paul] > Let me put out the strawman proposal that "grep" is a nicely orthogonal > filtering mechanism. > > I guess my thinking is that you turn off warnings at the source. If you > get a warning about __del__ exceptions then you put in a try/except. If > you get a warning about an unused variable then you assign it to itself. > If you get a warning about integer division then you should pass in a > float and so forth. I'm thinking of a different situation, where the user isn't very Unix or Python sophisticated or the code generating warnings isn't easily accessible for editing, or the fix isn't obvious (which I expect to be often the case for __del__ and int division warnings). 
Just as with exception handlers, where unqualified except clauses are bad because of the risk that they mask real errors, I want to avoid the likelihood that end users (those who can't or shouldn't fix the errors) have to turn off all warnings. > > > Syntax > > > > > > assert >> cls, test[[[, arg], arg]...] > > > > I have several problems with this. First of all, using "assert" means > > that in "optimizing" mode (python -O) you won't get *any* warnings. I > > think that the decision to disable all warnings should be independent > > from the decision to "optimize". > > Arguable. I see it as "release" and "debug" configurations. python and > python_d. > > > Second, you're hypergeneralizing the > > extended print syntax. Just because I think it's okay to add >>file > > to the print syntax doesn't mean that it's now okay to add >>object > > syntax to all statements! > > Well getting new syntax into Python is really, really hard so we've got > to squeeze out as much value out of what we have as possible. But > anyhow, assertions are not allowed to (Did you forget to complete a sentence there?) I say there's no need for new syntax, this can just be a function. > You and I agree that there is an important sociological dimension to > this. We can't require: > > import warnings > > warnings.warn("foo", "bar") > > I prefer: > > warn Foo, "Bar" > > just like: > > raise Foo, "Bar" Hm. Raise was made a statement because (1) Modula-3 did so, and (2) it is a flow control modifier. You haven't made a good case for why warn() can't be a function. > > I also don't see what warnings have to do with assertions. Assertions > > are a mechanism to check for error conditions. What happens if the > > error is detected is of less importance -- it could raise an exception > > (Python), issue a fatal error (C), or do nothing (in -O mode). > > Warnings are issued when an error or dubious construct is detected. > Assertions are "fatal warnings". 
You agree that it is appropriate for > some "warnings" to kill the app in some circumstances. Isn't it just a > hop-skip-and-a-jump to say that warnings and errors are just points on > the spectrum: > > Report Once > Report Always > Report and Die Possibly -- and then 'assert' is a poor choice of name for the feature. Objection denied. > Python trained me to think of function calls and object constructions as > being basically the same thing -- and keyword arguments and variable > names being basically the same thing. etc. > > > With warnings I believe the issue is not so much the detection of the > > condition (for which a regular 'if' statement does just fine) but the > > reporting. Again, this is motivated by the fact that I expect that > > flexible filtering is essential for a successful warning mechanism. > > I don't see why assertions need special syntax but warnings do not! I > would have been happy for a "warn" keyword with assert-like syntax but I > don't think I'll see that in my lifetime. Indeed. But binding arbitrary unrelated semantics to an existing statement with a carefully chosen name is poor design too. You might as well propose print< > This is just a matter of exposition, but when I first read your PEP I > > had a hard time figuring out the purpose of the cls object. It wasn't > > until I got to the very end where I saw your example classes that I > > realized what it is: it represents a specific warning condition or a > > group of related warning conditions. > > It sort of evolved into an exception-like mechanism in that the class is > instantiated with arguments just as exception classes are. > > > > if action=="error": > > > *** existing assertion code *** > > > > That's just > > > > raise AssertionError, message > > > > Right? > > Well I'd like the traceback to emanate from the caller's position not > the warning handler's position. Python doesn't really have a way to say > that simply. 
This may well be implemented in C so it might not matter. OK. > > Suggestion: have separate warning and error handlers, so that if I > > want to override these branches of the if statement I don't have to > > repeat the entire handler. > > Good idea. > > > I see no definition of sys.disabled_warnings. Instead of > > sys.disabled_warnings you meant sys.suppress, right? > > Right. > > > > return "suppress" > > > for base in cls.__bases__: > > > rc = self.get_user_request(base) > > > if rc: > > > return rc > > > else: > > > return None > > > > This approach (searching for the class name or the name of one of its > > base classes in a list) doesn't look very object-oriented. It would > > make more sense to store the desired return value as a class or > > instance attribute. The default warning action could be stored on the > > base class. > > The way I came to this odd structure is I wanted most subclasses to be > just no-ops as many exceptions are: > > class ActiveStateLintWarning(Warning): pass > > class CodeLooksLikePerl(ActiveStateLintWarning): pass > > class TooMuchOnOneLine(CodeLooksLikePerl): pass The idea of using a class hierarchy to classify warnings is definitely a good one. > So what __init__ could I write that would look up -wTooMuchOnOneLine and > then if it failed that, look up -wCodeLooksLikePerl and so forth? Yeah, it's not clear what to do. You would like the -w option to poke a value in a class variable, but the problem there is that it doesn't know in which module the class is defined. (On the other hand, if that problem could be solved, it would be the preferred solution -- since it would solve the problem of typos in the -w argument neatly.) > It gets pretty mind-bending because you sort of want one method to call > another and yet you want it to be the *same* inherited method (i.e. > don't have to code it each time). E.g. you want to run the *same method* > at each level of the hierarchy. So I just do that directly. 
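Paul's "same method at each level of the hierarchy" lookup is easier to see as a runnable sketch. This is not the PEP's actual code: the proposed sys.errors/sys.warnings/sys.suppress globals are modelled as plain lists, and a walk over `__mro__` (new-style classes only, so unavailable in 2000) stands in for the recursive `__bases__` search while giving the same most-specific-class-first behavior:

```python
# Stand-ins for the PEP's proposed sys.errors / sys.warnings /
# sys.suppress command-line lists (hypothetical names).
errors = ["CodeLooksLikePerl"]
warnings_on = []
suppress = []

class WarningBase:
    """One classmethod runs the same lookup at every level of the
    hierarchy, so subclasses can be empty no-op declarations."""

    @classmethod
    def get_action(cls):
        for klass in cls.__mro__:       # most specific class first
            name = klass.__name__
            if name in errors:
                return "error"
            if name in warnings_on:
                return "warn"
            if name in suppress:
                return "suppress"
        return "warn"                   # default action

class ActiveStateLintWarning(WarningBase): pass
class CodeLooksLikePerl(ActiveStateLintWarning): pass
class TooMuchOnOneLine(CodeLooksLikePerl): pass

# An -eCodeLooksLikePerl option turns that whole subtree into errors,
# while its siblings keep the default:
print(TooMuchOnOneLine.get_action())        # "error" (inherited)
print(ActiveStateLintWarning.get_action())  # "warn" (default)
```

Because the walk stops at the first class with a configured action, the most specific filter wins automatically, which is exactly the precedence Guido asks for above.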
> > > > -w XXX => sys.warnings.append("XXX") > > > -e XXX => sys.errors.append("XXX") > > > -no-w XXX => sys.suppress.append("XXX") > > > -wall => sys.default_warning_action => "warn" > > > -eall => sys.default_warning_action => "error" > > > -no-wall => sys.default_warning_action => "suppress" > > > > Python doesn't support long options (I don't *like* long options so I > > doubt that this is a good occasion to start lobbying for them :-). We > > can come up with different options though. > > Are these long options? Or just overloaded behavior on -w, -e, -n . > Think of "all" as the base class for warnings or something. Yes: if you change -no-w into -n, they can all be short options. Note that no matter what syntax we choose, we'll always be deviants compared to Perl and GCC: those require a -w or -W option to *enable* warnings. (I found some great quotes in the Perl man page: Whenever you get mysterious behavior, try the -w switch!!! Whenever you don't get mysterious behavior, try using -w anyway. The -w switch produces some lovely diagnostics. Did we mention that you should definitely consider using the -w switch? Bugs The -w switch is not mandatory. ) > > > As per the code above, errors take precedence over warnings and > > > warnings over suppressions unless a particular assertion class > > > specifies otherwise. > > > > I would use a different precedence scheme: a more specific filter > > takes precedence over a more general filter. So -eall -wdubious would > > mean that "dubious" class warnings are warnings but all others are > > errors, and -wall -edubious would mean the opposite. > > Good idea. I think the code may work that way already. :) Indeed it does. :-) Then what did you mean by your remark about the precedence? 
--Guido van Rossum (home page: http://www.python.org/~guido/) From greg@cosc.canterbury.ac.nz Tue Nov 7 04:03:59 2000 From: greg@cosc.canterbury.ac.nz (Greg Ewing) Date: Tue, 07 Nov 2000 17:03:59 +1300 (NZDT) Subject: [Python-Dev] Warnings PEP In-Reply-To: <3A072A37.C2CAFED1@activestate.com> Message-ID: <200011070403.RAA00158@s454.cosc.canterbury.ac.nz> Paul Prescod : > Assertions are "fatal warnings". No, the failure of an assertion *causes* a fatal warning. Assertions themselves are tests, not warnings. I agree with Guido -- "assert" is the wrong verb, "warn" is the right verb. Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg@cosc.canterbury.ac.nz +--------------------------------------+ From fdrake@users.sourceforge.net Tue Nov 7 04:05:34 2000 From: fdrake@users.sourceforge.net (Fred L. Drake) Date: Mon, 6 Nov 2000 20:05:34 -0800 Subject: [Python-Dev] [development doc updates] Message-ID: <200011070405.UAA32052@orbital.p.sourceforge.net> The development version of the documentation has been updated: http://python.sourceforge.net/devel-docs/ From py-dev@zadka.site.co.il Tue Nov 7 12:24:22 2000 From: py-dev@zadka.site.co.il (Moshe Zadka) Date: Tue, 07 Nov 2000 14:24:22 +0200 Subject: [Python-Dev] Re: Class/type dichotomy thoughts In-Reply-To: Message from "M.-A. Lemburg" of "Mon, 06 Nov 2000 21:01:40 +0100." <3A070E24.B07FA7D7@lemburg.com> References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> <3A03E340.38F78FA2@lemburg.com> <200011060244.VAA04289@cj20424-a.reston1.va.home.com> <3A06A336.5FD1EBC7@lemburg.com> <20001105234453.A8255@glacier.fnational.com> <3A06E43B.D95EC267@lemburg.com> <20001106023529.B8639@glacier.fnational.com> <3A070E24.B07FA7D7@lemburg.com> Message-ID: [MAL] > Take e.g. 
dictionaries: you could easily add a new dictionary > type which uses case-insensitive string keys by extending the > existing dictionary type. > > The new type would reuse most of the slots of the original > type and only replace the ones needed for lookup with the > new logic for case-insensitivity. > > Then it sets the type id to PyDict_TypeID and Python will > use it as if it were an original dictionary object. The > underlying type objects would be different though (and also > the type object address which is currently used to identify > a builtin type). This kind of thing will also help a numeric system like PEP-0228 be implemented more easily. -- Moshe Zadka This is a signature anti-virus. Please stop the spread of signature viruses! From greg@cosc.canterbury.ac.nz Tue Nov 7 04:19:46 2000 From: greg@cosc.canterbury.ac.nz (Greg Ewing) Date: Tue, 07 Nov 2000 17:19:46 +1300 (NZDT) Subject: [Python-Dev] Re: Class/type dichotomy thoughts In-Reply-To: <3A070E24.B07FA7D7@lemburg.com> Message-ID: <200011070419.RAA00162@s454.cosc.canterbury.ac.nz> "M.-A. Lemburg" : > Then it sets the type id to PyDict_TypeID and Python will > use it as if it were an original dictionary object. Hang on a minute. What sort of assumptions is the interpreter going to be making based on the fact that the type id is PyDict_TypeID? Can it be sure that this new case-insensitive dictionary doesn't break them somehow? In other words, what does this new type_id thing actually *mean*? Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. 
| greg@cosc.canterbury.ac.nz +--------------------------------------+ From paul@prescod.net Tue Nov 7 05:57:40 2000 From: paul@prescod.net (Paul Prescod) Date: Mon, 06 Nov 2000 21:57:40 -0800 Subject: [Python-Dev] Warnings PEP References: <200011070403.RAA00158@s454.cosc.canterbury.ac.nz> Message-ID: <3A0799D4.86FC8148@prescod.net> Greg Ewing wrote: > > Paul Prescod : > > > Assertions are "fatal warnings". > > No, the failure of an assertion *causes* a fatal warning. > Assertions themselves are tests, not warnings. Fine, some assertions are tests for fatal errors and some assertions are tests for non-fatal mistakes. -- Paul Prescod Simplicity does not precede complexity, but follows it. - http://www.cs.yale.edu/homes/perlis-alan/quotes.html From Moshe Zadka Tue Nov 7 06:37:01 2000 From: Moshe Zadka (Moshe Zadka) Date: Tue, 7 Nov 2000 08:37:01 +0200 (IST) Subject: [Python-Dev] Re: python-gdk-imlib and Delinquent Maintainers In-Reply-To: <87ofzs39lo.fsf@indy.progeny.com> Message-ID: On 6 Nov 2000, Eric Gillespie, Jr. wrote: > There has been a problem in Debian's python-gdk-imlib package for > quite some time: Transparent PNGs do not display properly (see > attached example script). According to upstream > (http://www.daa.com.au/pipermail/pygtk/2000-September/000336.html), > the proper solution is either to have Python use RTLD_GLOBAL in > dlopen calls when loading extension modules, Or, possibly, using dlmodule to dlopen things ourselves rather than using Python's import facilities for that. -- Moshe Zadka -- 95855124 http://advogato.org/person/moshez From Moshe Zadka Tue Nov 7 06:41:56 2000 From: Moshe Zadka (Moshe Zadka) Date: Tue, 7 Nov 2000 08:41:56 +0200 (IST) Subject: [Python-Dev] Re: python-gdk-imlib and Delinquent Maintainers In-Reply-To: Message-ID: On Tue, 7 Nov 2000, Moshe Zadka wrote: > On 6 Nov 2000, Eric Gillespie, Jr. 
wrote: > > > There has been a problem in Debian's python-gdk-imlib package for > > quite some time: Transparent PNGs do not display properly (see > > attached example script). According to upstream > > (http://www.daa.com.au/pipermail/pygtk/2000-September/000336.html), > > the proper solution is either to have Python use RTLD_GLOBAL in > > dlopen calls when loading extension modules, > > Or, possibly, using dlmodule to dlopen things ourselves rather than > using Python's import facilities for that. Oooops....I didn't mean to send it here (well, I did, then I changed my mind but forgot to tell my e-mail program about that) -- Moshe Zadka -- 95855124 http://advogato.org/person/moshez From paul@prescod.net Tue Nov 7 06:54:14 2000 From: paul@prescod.net (Paul Prescod) Date: Mon, 06 Nov 2000 22:54:14 -0800 Subject: [Python-Dev] Warning framework References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> <3A03E340.38F78FA2@lemburg.com> <200011060335.WAA04452@cj20424-a.reston1.va.home.com> Message-ID: <3A07A716.BDF9658B@prescod.net> Guido van Rossum wrote: > > Before I fall asleep let me write up my ideas about the warning > framework. > > Requirements: > > - A C-level API that lets C code issue a warning with a single call > taking one or two arguments, e.g. Py_Warning(level, message). (The > 'level' argument is an example only; I'm not sure what if any we > need.) I like the C-level API. I think the "level" argument should be some sort of characterization, not a number or enumeration. I think of it as being like an exception -- just another Python object. Or it could be just a string. The main point is that filtering on a number or enum is not flexible enough. > - An equivalent Python level API, e.g. sys.warning(level, message). I would prefer something easier to spell and with more of a central "you should use this a lot" feeling. 
> Possible implementation: > > - Each module can have a dictionary __warnings__ in its global > __dict__, which records the state of warnings. It is created as an > empty dict if it doesn't exist when it is needed. The keys are > (message, linenumber) tuples (the module or file is implicit through > the use of the module's __dict__). The value is None if no more > action is needed for this particular warning and location. Some > other values may indicate the options "always print warning" (1?) > and "raise an exception" (-1?). The problem with filtering based on line number is that it can't really be done in a static manner because it is too fragile to code changes. In my proposal every warning was assigned a "type" which would be the key for filtering. A string would also be fine. In general, I'm uncomfortable that I don't understand the requirements enough. Are warnings something that the average Python programmer sees rarely and then immediately goes in to fix the code so they don't see it anymore (as compiler warnings are often handled)? Or do we expect most Python programs to issue hundreds of warnings unless they are filtered. Is filtering something you do constantly or as a partial workaround for half-broken code that you can't fix right now? -- Paul Prescod Simplicity does not precede complexity, but follows it. - http://www.cs.yale.edu/homes/perlis-alan/quotes.html From Moshe Zadka Tue Nov 7 06:58:40 2000 From: Moshe Zadka (Moshe Zadka) Date: Tue, 7 Nov 2000 08:58:40 +0200 (IST) Subject: [Python-Dev] Warning framework In-Reply-To: <3A07A716.BDF9658B@prescod.net> Message-ID: On Mon, 6 Nov 2000, Paul Prescod wrote: > I like the C-level API. > > I think the "level" argument should be some sort of characterization, > not a number or enumeration. I think of it as being like an exception -- > just another Python object. These are completely different things -- both should be there. 
An exception doesn't need a level -- it's the highest level possible, saying "get out of here, fast!". Warnings need both a level and characterization. Having it be a Python class is a nice touch, and keeps it consistent with the way exceptions use classes for characterization. > In general, I'm uncomfortable that I don't understand the requirements > enough. Are warnings something that the average Python programmer sees > rarely and then immediately goes in to fix the code so they don't see it > anymore (as compiler warnings are often handled)? Or do we expect most > Python programs to issue hundreds of warnings unless they are filtered. > Is filtering something you do constantly or as a partial workaround for > half-broken code that you can't fix right now? There are two main sources to copy from here: gcc: You have -Wall, -Wadd-something, etc. Any warning you do see you either fix, or surround with a pragma so you don't see this. You also have -Werror to turn all warnings into errors. Perl: -w gives runtime warnings for things any saner language would raise exceptions for. "dereferencing NULL", accessing non-existing elements in an array, etc. Warnings are serious bugs, and you must always use them. Perl: "use strict" and friends: die because of some otherwise perfectly legal Perl if it's not declared properly. I'd go for a more gcc-like approach: if you see a warning, you should either 1. Silence it or 2. Fix it. Silencing warnings is a serious issue: sometimes the programmer does no better than the interpreter, and should have the ability to silence any warning permanently -- otherwise he'll work with -silence_all, and the benefit is lost.
-- Moshe Zadka -- 95855124 http://advogato.org/person/moshez From tim_one@email.msn.com Tue Nov 7 08:38:35 2000 From: tim_one@email.msn.com (Tim Peters) Date: Tue, 7 Nov 2000 03:38:35 -0500 Subject: [Python-Dev] Integer division transition In-Reply-To: <200011060234.VAA04271@cj20424-a.reston1.va.home.com> Message-ID: [Guido] > ... > Note: it is open for debate whether the result of x/y for integer (or > long integer) arguments should yield an integer (or long integer) in > those cases where the result *is* representable as such (e.g. 4/2). > It is possible that the numeric tower will render this problem moot -- > but that depends on what happens to Moshe's PEP 228, and that's a much > longer story. Note that for long ints i and j, i/j may (& easily so) be too large to represent as a native double. Rationals (of course) have no problem with that. It would certainly be curious if i*j didn't overflow but i/j did. Just an observation -- I'm a fan of unintended consequences . From mal@lemburg.com Tue Nov 7 09:17:56 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Tue, 07 Nov 2000 10:17:56 +0100 Subject: [Python-Dev] Re: Class/type dichotomy thoughts References: <200011070419.RAA00162@s454.cosc.canterbury.ac.nz> Message-ID: <3A07C8C4.1F14985F@lemburg.com> Greg Ewing wrote: > > "M.-A. Lemburg" : > > > Then it sets the type id to PyDict_TypeID and Python will > > use it as if it were an original dictionary object. > > Hang on a minute. What sort of assumptions is the > interpreter going to be making based on the fact that > the type id is PyDict_TypeID? Can it be sure that this > new case-insensitive dictionary doesn't break them > somehow? > > In other words, what does this new type_id thing > actually *mean*? For the interpreter it means that it can assume the type interface to be binary compatible to the "original" type, e.g. 
by setting the flag to say PyDict_TypeID the type assures that all PyDict_*() APIs will work on the type -- basically the same thing as PyDict_Check() does now except that the type object needn't be the same anymore. -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From fredrik@pythonware.com Tue Nov 7 09:36:42 2000 From: fredrik@pythonware.com (Fredrik Lundh) Date: Tue, 7 Nov 2000 10:36:42 +0100 Subject: [Python-Dev] Re: [Python-checkins] CVS: python/nondist/peps pep-0216.txt,1.3,1.4 References: <200011070911.BAA13249@slayer.i.sourceforge.net> Message-ID: <01e301c0489e$3e349540$0900a8c0@SPIFF> Moshe Zadka wrote: > Modified Files: > pep-0216.txt > Log Message: > Added structured text consensus. when/where was this discussed? From Moshe Zadka Tue Nov 7 09:49:54 2000 From: Moshe Zadka (Moshe Zadka) Date: Tue, 7 Nov 2000 11:49:54 +0200 (IST) Subject: [Python-Dev] Re: [Python-checkins] CVS: python/nondist/peps pep-0216.txt,1.3,1.4 In-Reply-To: <01e301c0489e$3e349540$0900a8c0@SPIFF> Message-ID: On Tue, 7 Nov 2000, Fredrik Lundh wrote: > Moshe Zadka wrote: > > Modified Files: > > pep-0216.txt > > Log Message: > > Added structured text consensus. > > when/where was this discussed? ummmm....doc-sig, where all things documentation-related are discussed? -- Moshe Zadka -- 95855124 http://advogato.org/person/moshez From Moshe Zadka Tue Nov 7 09:58:41 2000 From: Moshe Zadka (Moshe Zadka) Date: Tue, 7 Nov 2000 11:58:41 +0200 (IST) Subject: PEP-0216 (was Re: [Python-Dev] Re: [Python-checkins] CVS: python/nondist/peps pep-0216.txt,1.3,1.4) In-Reply-To: Message-ID: On Tue, 7 Nov 2000, Moshe Zadka wrote: > ummmm....doc-sig, where all things documentation-related are discussed? Just to clarify my answer: I'm happy to receive comments/complaints by private e-mail, or you can share them with all of doc-sig.
I prefer to be CCed on doc-sig e-mail, but if not, I'll read it in the doc-sig. Python-Dev does not seem the place for documentation-related discussion, since that's exactly what doc-sig is for. It's quite low-volume, so please join, or browse the archives. -- Moshe Zadka -- 95855124 http://advogato.org/person/moshez From fredrik@pythonware.com Tue Nov 7 10:59:38 2000 From: fredrik@pythonware.com (Fredrik Lundh) Date: Tue, 7 Nov 2000 11:59:38 +0100 Subject: PEP-0216 (was Re: [Python-Dev] Re: [Python-checkins] CVS:python/nondist/peps pep-0216.txt,1.3,1.4) References: Message-ID: <027f01c048a9$d324efa0$0900a8c0@SPIFF> moshe wrote: > Python-Dev does not seem the place for documentation-related discussion, > since that's exactly what doc-sig is for. It's quite low-volume, so > please join, or browse the archives. oh, I thought I was subscribed. guess I wasn't. I'll check the archives. From guido@python.org Tue Nov 7 12:55:26 2000 From: guido@python.org (Guido van Rossum) Date: Tue, 07 Nov 2000 07:55:26 -0500 Subject: [Python-Dev] Warning framework In-Reply-To: Your message of "Mon, 06 Nov 2000 22:54:14 PST." <3A07A716.BDF9658B@prescod.net> References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> <3A03E340.38F78FA2@lemburg.com> <200011060335.WAA04452@cj20424-a.reston1.va.home.com> <3A07A716.BDF9658B@prescod.net> Message-ID: <200011071255.HAA13355@cj20424-a.reston1.va.home.com> > I think the "level" argument should be some sort of characterization, > not a number or enumeration. I think of it as being like an exception -- > just another Python object. > > Or it could be just a string. The main point is that filtering on a > number or enum is not flexible enough. OK, let's make this a class then. Just keep exceptions out of it -- this is a separate, disjoint set of classes. Let's call this "warning category". There will be standard categories and user code can add categories. > > - An equivalent Python level API, e.g. sys.warning(level, message). 
> > I would prefer something easier to spell and with more of a central "you > should use this alot" feeling. OK, let's make this a built-in: warning(category, message). > > Possible implementation: > > > > - Each module can has a dictionary __warnings__ in its global > > __dict__, which records the state of warnings. It is created as an > > emprt dict if it doesn't exist when it is needed. The keys are > > (message, linenumber) tuples (the module or file is implicit through > > the use of the module's __dict__). The value is None if no more > > action is needed for this particular warning and location. Some > > other values may indicate the options "always print warning" (1?) > > and "raise an exception" (-1?). > > The problem with filtering based on line number is that it can't really > be done in a static manner because it is too fragile to code changes. In > my proposal every warning was assigned a "type" which would be the key > for filtering. A string would also be fine. > > In general, I'm uncomfortable that I don't understand the requirements > enough. Are warnings something that the average Python programmer sees > rarely and then immediately goes in to fix the code so they don't see it > anymore (as compiler warnings are often handled)? Or do we expect most > Python programs to issue hundreds of warnings unless they are filtered. > Is filtering something you do constantly or as a partial workaround for > half-broken code that you can't fix right now? All of the above. If I'm a programmer maintaining a piece of code, of course I turn all warnings into errors and fix them as they occur. But if I'm using someone else's code that happens to generate warnings, it's better to disable warnings in that code until its author has released a fixed version. I want to be able to be very specific, so that turning off warnings in 3rd party code doesn't disable them in my own code. 
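The design being converged on in this thread -- categories as plain classes disjoint from exceptions, a warn() entry point, and per-category filters -- can be sketched in a few lines. Every name below is hypothetical, an illustration of the idea rather than any actual Python API:

```python
import sys

# Hypothetical sketch of category-based warning filtering.  Categories
# are plain classes, deliberately disjoint from exceptions; filters map
# a category to an action.  None of these names come from a real API.

class WarningCategory:
    """Root of the warning-category hierarchy."""

class IntegerDivisionWarning(WarningCategory):
    pass

_filters = []  # (category, action) pairs; most recent registration wins

def filter_warnings(category, action):
    """Register an action ('ignore', 'print' or 'error') for a category."""
    _filters.insert(0, (category, action))

def warn(category, message):
    action = "print"  # default behaviour with no matching filter
    for cat, act in _filters:
        if issubclass(category, cat):
            action = act
            break
    if action == "error":
        raise RuntimeError("%s: %s" % (category.__name__, message))
    if action == "print":
        sys.stderr.write("warning: %s\n" % message)
    # 'ignore' falls through silently
```

With something like this, a user could silence a third-party module's categories while promoting their own to errors -- the kind of selectivity asked for above.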
If the 3rd party code generates a single warning during normal use at a single location (e.g. if there's one unconverted integer divide somewhere) then it's best to turn it off just at that location, so that when I feed it other data which may trigger other warnings elsewhere I will still get the benefit of the warnings -- which *may* mean there's something wrong with my data, not with the 3rd party code. --Guido van Rossum (home page: http://www.python.org/~guido/) From thomas@xs4all.net Tue Nov 7 13:17:02 2000 From: thomas@xs4all.net (Thomas Wouters) Date: Tue, 7 Nov 2000 14:17:02 +0100 Subject: [Python-Dev] Integer division transition In-Reply-To: <200011070106.OAA00139@s454.cosc.canterbury.ac.nz>; from greg@cosc.canterbury.ac.nz on Tue, Nov 07, 2000 at 02:06:04PM +1300 References: <200011061717.MAA07847@cj20424-a.reston1.va.home.com> <200011070106.OAA00139@s454.cosc.canterbury.ac.nz> Message-ID: <20001107141702.H27208@xs4all.nl> On Tue, Nov 07, 2000 at 02:06:04PM +1300, Greg Ewing wrote: > > It's a new keyword though, which has a much > > higher threshold for acceptance than a new two-character operator > > symbol. > It could be non-reserved, since a div b is currently > a syntax error. Except for the fact our current parser can't handle the 'a div b' syntax without making 'div' a reserved word, which also makes 'x.div', 'class div:' and 'def div():' invalid syntax. It might be work around-able, but .... :P -- Thomas Wouters Hi! I'm a .signature virus! copy me into your .signature file to help me spread! From jim@interet.com Tue Nov 7 13:28:16 2000 From: jim@interet.com (James C. Ahlstrom) Date: Tue, 07 Nov 2000 08:28:16 -0500 Subject: [Python-Dev] zipfile.py and InfoZIP files References: <3A030D21.413B8C8A@lemburg.com> <3A06ADFD.5C735BBA@interet.com> <3A06B981.B3D73091@lemburg.com> Message-ID: <3A080370.30C36339@interet.com> "M.-A.
Lemburg" wrote: > Now I get this error after working in interactive Python > mode with zipfile: > > Exception exceptions.AttributeError: > "ZipFile instance has no attribute 'fp'" in ignored Reading the code, I don't see how this could have happened unless __init__ has already raised an exception. > I would like a method .copy(self, name, output) which > reads the file name from the ZIP archive and writes it directly to > the file-like object output. This should copy the file in chunks > of say 64kB in order to reduce memory load. This is only a few lines of Python, and I generally omit any methods which are not absolutely necessary. Does anyone else think this should be added? JimA From guido@python.org Tue Nov 7 13:33:46 2000 From: guido@python.org (Guido van Rossum) Date: Tue, 07 Nov 2000 08:33:46 -0500 Subject: [Python-Dev] Integer division transition In-Reply-To: Your message of "Tue, 07 Nov 2000 14:17:02 +0100." <20001107141702.H27208@xs4all.nl> References: <200011061717.MAA07847@cj20424-a.reston1.va.home.com> <200011070106.OAA00139@s454.cosc.canterbury.ac.nz> <20001107141702.H27208@xs4all.nl> Message-ID: <200011071333.IAA13497@cj20424-a.reston1.va.home.com> > > > It's a new keyword though, which has a much > > > higher threshold for acceptance than a new two-character operator > > > symbol. > > > It could be non-reserved, since a div b is currently > > a syntax error. > > Except for the fact our current parser can't handle the 'a div b' syntax > without making 'div' a reserved word, which also makes 'x.div', 'class div:' > and 'def div():' invalid syntax. It might be work around-able, but .... :P No, we *could* use the 'import as' trick: define the syntax as term: factor (('*'|'/'|'%'|NAME) factor)* and add a check that NAME is "div" in the compiler. But I don't know how comfy I am with a proliferation of hacks like this -- and it's likely to cause more confusing error messages.
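The contextual-keyword trick being described can be illustrated with a toy parser: the tokenizer never reserves 'div', and only the parser's binary-operator slot gives that NAME special meaning, so 'div = 3' or 'def div():' would still parse. This is purely illustrative Python, not CPython's pgen machinery:

```python
# Toy illustration of a non-reserved 'div' keyword: NAME tokens are
# ordinary identifiers everywhere except in operator position.  The
# '/' branch uses true division, the end state this thread debates.
import re

TOKEN = re.compile(r"\s*(\d+|[A-Za-z_]\w*|[*/%])")

def parse_term(text):
    """Evaluate NUMBER (op NUMBER)* input; return (value, operators used)."""
    tokens = TOKEN.findall(text)
    value = int(tokens[0])
    ops = []
    for op, num in zip(tokens[1::2], tokens[2::2]):
        if op == "div":                 # NAME accepted only in operator slot
            value = value // int(num)
        elif op == "/":
            value = value / int(num)
        elif op == "*":
            value = value * int(num)
        elif op == "%":
            value = value % int(num)
        else:
            raise SyntaxError("unexpected NAME %r as operator" % op)
        ops.append(op)
    return value, ops
```

The check "is this NAME the string 'div'?" lives entirely in the parse loop, which is the essence of the 'import as' trick -- and also why the error messages get confusing: a misspelled operator surfaces as an unexpected NAME.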
--Guido van Rossum (home page: http://www.python.org/~guido/) From mal@lemburg.com Tue Nov 7 14:46:15 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Tue, 07 Nov 2000 15:46:15 +0100 Subject: [Python-Dev] zipfile.py and InfoZIP files References: <3A030D21.413B8C8A@lemburg.com> <3A06ADFD.5C735BBA@interet.com> <3A06B981.B3D73091@lemburg.com> <3A080370.30C36339@interet.com> Message-ID: <3A0815B7.BD0D9C4C@lemburg.com> "James C. Ahlstrom" wrote: > > "M.-A. Lemburg" wrote: > > > Now I get this error after working in interactive Python > > mode with zipfile: > > > > Exception exceptions.AttributeError: > > "ZipFile instance has no attribute 'fp'" in ignored > > Reading the code, I don't see how this could have happened > unless __init__ has already raised an exception. Probably has something to do with GC and cleaning up instances -- don't know. The error only shows up sometimes... Instead of seeing the __del__ exception I now get the previous error again: zlib.error: Error -3 while decompressing: incomplete dynamic bit lengths tree Nevermind... my setup must be broken in more ways than I have time to figure out now :-( > > I would like a method .copy(self, name, output) which > > reads the file name from the ZIP archive and writes it directly to > > the file-like object output. This should copy the file in chunks > > of say 64kB in order to reduce memory load. > > This is only a few lines of Python, and I generally omit > any methods which are not absoultely necessary. Does > anyone else think this should be added?
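The requested helper really is only a few lines. A sketch, written against the modern zipfile interface (ZipFile.open() did not exist when this was posted; the 64 kB chunk size is the one from the request):

```python
# Sketch of the proposed .copy(name, output): stream one archive member
# to a file-like object in 64 kB chunks to keep memory use bounded.
# Illustrative only -- this uses today's zipfile.ZipFile.open(), which
# the module did not have at the time of this thread.
import zipfile

def copy_member(archive, name, output, chunk_size=64 * 1024):
    """Copy member 'name' from an open zipfile.ZipFile to 'output'."""
    with archive.open(name) as member:
        while True:
            chunk = member.read(chunk_size)
            if not chunk:
                break
            output.write(chunk)
```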
Thanks, -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From pf@artcom-gmbh.de Tue Nov 7 14:59:01 2000 From: pf@artcom-gmbh.de (Peter Funk) Date: Tue, 7 Nov 2000 15:59:01 +0100 (MET) Subject: [Python-Dev] Integer division transition In-Reply-To: <200011071333.IAA13497@cj20424-a.reston1.va.home.com> from Guido van Rossum at "Nov 7, 2000 8:33:46 am" Message-ID: Hi, Guido van Rossum: > No, we *could* use the 'import as' trick: define the syntax as > > term: factor (('*'|'/'|'%'|NAME) factor)* > > and add a check that NAME is "div" in the compiler. > > But I don't know how comfy I am with a proliferation of hacks like > this -- and it's likely to cause more confusing error messages. In Modula-2 it is forbidden to declare a variable or procedure called 'DIV' or 'MOD', since both were reserved words in this language from the very beginning. But in Python the situation is different and people might have used 'div' as an identifier. So unless Guido is able to fix this using the famous time machine ...sigh... for the sake of backward compatibility using this "hack" seems to be the best available choice. I believe confusing error messages can be avoided and I see no "proliferation of hacks" in these two attempts to avoid defining new keywords. Keeping backward compatibility always had its price. But it is often a price worth paying.
Just my $0.02, Peter -- Peter Funk, Oldenburger Str.86, D-27777 Ganderkesee, Germany, Fax:+49 4222950260 office: +49 421 20419-0 (ArtCom GmbH, Grazer Str.8, D-28359 Bremen) From gward@mems-exchange.org Tue Nov 7 15:22:51 2000 From: gward@mems-exchange.org (Greg Ward) Date: Tue, 7 Nov 2000 10:22:51 -0500 Subject: [Python-Dev] Warning framework In-Reply-To: <200011071255.HAA13355@cj20424-a.reston1.va.home.com>; from guido@python.org on Tue, Nov 07, 2000 at 07:55:26AM -0500 References: <200011032301.SAA26346@cj20424-a.reston1.va.home.com> <3A03E340.38F78FA2@lemburg.com> <200011060335.WAA04452@cj20424-a.reston1.va.home.com> <3A07A716.BDF9658B@prescod.net> <200011071255.HAA13355@cj20424-a.reston1.va.home.com> Message-ID: <20001107102251.A15674@ludwig.cnri.reston.va.us> On 07 November 2000, Guido van Rossum said: > > Or it could be just a string. The main point is that filtering on a > > number or enum is not flexible enough. > > OK, let's make this a class then. Just keep exceptions out of it > -- this is a separate, disjoint set of classes. Let's call this > "warning category". There will be standard categories and user code > can add categories. This sounds right -- I was going to suggest "warning class" instead of "level", but "category" might be a better word. My main rationale was filtering: show me "integer divide" problems, but don't bother me with "function argument not used". (Hmm, those two sound more like specific warnings rather than warning categories. Probably the categories there would be "arithmetic" and "dataflow".) > > I would prefer something easier to spell and with more of a central "you > > should use this alot" feeling. > > OK, let's make this a built-in: warning(category, message). Minor spelling nit: I would call it 'warn()' (or 'sys.warn()', or 'Py_Warn()', etc.) since that's a verb. 
More importantly: if 'warn(ing)' is meant to be used mainly for compiler-style warnings -- you're using this language or library feature inappropriately -- then it should be left in sys. But if it's meant to also be used for printing some message to stderr (like Perl's 'warn'), then there's a good case for making it a builtin. Almost every Python script I write features def warn (msg): sys.stderr.write("warning: " + msg + "\n") That might be a clue that something (albeit a tiny thing) is missing from the language. ;-) Greg From fdrake@acm.org Tue Nov 7 15:30:00 2000 From: fdrake@acm.org (Fred L. Drake, Jr.) Date: Tue, 7 Nov 2000 10:30:00 -0500 (EST) Subject: [Python-Dev] Integer division transition In-Reply-To: <200011071333.IAA13497@cj20424-a.reston1.va.home.com> References: <200011061717.MAA07847@cj20424-a.reston1.va.home.com> <200011070106.OAA00139@s454.cosc.canterbury.ac.nz> <20001107141702.H27208@xs4all.nl> <200011071333.IAA13497@cj20424-a.reston1.va.home.com> Message-ID: <14856.8184.854926.597397@cj42289-a.reston1.va.home.com> Guido van Rossum writes: > But I don't know how comfy I am with a proliferation of hacks like > this -- and it's likely to cause more confusing error messages. If "div" is it, I'd rather see it made a keyword and a warning published to the community soon so that people have a chance to check their code and either make it compatible with the change or scream ahead of time. A tool to help them out wouldn't hurt, either, and that could be written before any actual changes are made or even final decisions are made -- it could search everything on sys.path and report on uses that would be affected by each candidate change. -Fred -- Fred L. Drake, Jr. PythonLabs at Digital Creations From guido@python.org Tue Nov 7 16:14:21 2000 From: guido@python.org (Guido van Rossum) Date: Tue, 07 Nov 2000 11:14:21 -0500 Subject: [Python-Dev] uthread strawman In-Reply-To: Your message of "Mon, 06 Nov 2000 16:23:48 +0200." 
<3A06BEF4.95B773BD@tismer.com> References: <200011060450.RAA00019@s454.cosc.canterbury.ac.nz> <3A06BEF4.95B773BD@tismer.com> Message-ID: <200011071614.LAA18276@cj20424-a.reston1.va.home.com> I've thought about it a little more, and drawn some pictures in my head. I still have to disagree with Christian when he says: > Making Python completely coroutine aware, without > tricking the C stack, is 90 percent of the problem. > But after walking that far, there is no reason > to leave the other 10 percent alone. Without continuations, but with microthreads (uthreads) or coroutines, each (Python) stack frame can simply be "paused" at a specific point and continued later. The semantics here are completely clear (except perhaps for end cases such as unhandled exceptions and intervening C stack frames). With continuations, you have to decide how much state to save for a future continuation. It would seem easy enough: save all state kept in the frame except for the local variables. But now consider this: the "evaluation stack" contained in each frame could easily be replaced by a bunch of temporary variables, if we had a slightly different instruction set (3-address opcodes instead of stack-based opcodes). Then would we save those temporary variables or not? It can make a difference! Since the "save continuation" operation is a function call, you can easily save a continuation while there are some items on the value stack. I believe the current implementation saves these so they are restored when you jump to the continuation. But if the compiler were to use temporary variables instead of the evaluation stack, they might not have been restored! Here's another example. Suppose you set up a for loop. After three iterations through the loop you save a continuation. Then you finish three more iterations. Then you return to the saved continuation. Where does the loop continue: at 3 or at 6 iterations? Try to answer this without trying it.
My guess: it gets restarted at 3 iterations, because the loop index is saved on the value stack. If you rewrite this using a while loop, however, it would get restarted at 6 iterations, because then your loop index is an unsaved local variable. Ditto if you changed the bytecode compiler so for loops use an anonymous local variable instead of an entry on the evaluation stack. This semantic imprecision is one of the reasons why I don't like the concept of continuations. (I've been told that the exact semantics of continuations in Scheme differ greatly between Scheme implementations.) Now let's look at Jython. In Jython, we can simulate "paused frames" well enough by using real Java threads. However full continuations would require modifications to the JVM -- which is unacceptable to a language boasting "100% Pure Java". Another reason against allowing continuations. So, all in all, I don't think of continuations as "the last 10% that we might as well add to finish the job." I see it as an ill-specified hypergeneralization. What *do* I want to see in a stackless PEP? Not surprisingly, I'd like to see generators, coroutines, and uthreads. These all share a mechanism that pauses one frame and resumes another. I propose to make the concept of uthreads fundamental -- this will simplify the emulation in Jython. A strawman proposal: The uthread module provides the new functionality at the lowest level. Uthread objects represent microthreads. An uthread has a chain of stack frames linked by back pointers just like a regular thread. Pause/resume operations are methods on uthread objects. Pause/resume operations do not address specific frames but specific uthreads; within an uthread the normal call/return mechanisms can be used, and only the top frame in the uthread's stack of call frames can be paused/resumed (the ones below it are paused implicitly by the call to the next frame, and resumed when that call returns). - u = uthread.new(func) creates a new uthread object, u. 
The new uthread is poised to call func() but doesn't execute yet. - u = uthread.current() returns the uthread object for the current frame. - u.yield() pauses the current uthread and resumes the uthread u where it was paused. The current uthread is resumed when some other uthread calls its yield() method. Calling uthread.current().yield() is a no-op. - When func() returns, the uthread that was executing it ceases to be runnable. The uthread that most recently yielded to it is resumed, unless that is no longer runnable, in which case the uthread that most recently yielded to *it* is resumed, and so on until a runnable uthread is found or until no runnable uthreads are left, in which case the program terminates. (XXX I need a proof here that this works.) - When func() raises an unhandled exception, the exception gets propagated using the same rules as when it returns, and similarly its uthread ceases to be runnable. - u.kill(exc) causes the yield() call that paused u to raise the exception exc. (This can be caught in a try/except of course.) - Calling u.yield() or u.kill() for a non-runnable uthread is an error and raises an exception. I think this API should be enough to implement Gordon's SelectDispatcher code. In general, it's easy to create a scheduler uthread that schedules other uthreads. Open issues: - I'm not sure that I got the start conditions right. Should func() be allowed to run until its first yield() when uthread.new(func) is called? - I'm not sure that the rules for returning and raising exceptions from func() are the right ones. - Should it be possible to pass a value to another uthread by passing an argument to u.yield(), which then gets returned by the resumed yield() call in that uthread? - How do uthreads interact with real threads? Uthreads are explicitly scheduled through yield() calls; real threads use preemptive scheduling. I suppose we could create a new "main" uthread for each real thread.
But what if we yield() to an uthread that's already executing in another thread? How is that error detected? Please help! --Guido van Rossum (home page: http://www.python.org/~guido/) From cgw@fnal.gov Tue Nov 7 16:30:16 2000 From: cgw@fnal.gov (Charles G Waldman) Date: Tue, 7 Nov 2000 10:30:16 -0600 (CST) Subject: [Python-Dev] Integer division transition In-Reply-To: References: <200011071333.IAA13497@cj20424-a.reston1.va.home.com> Message-ID: <14856.11800.444274.851825@buffalo.fnal.gov> > Guido van Rossum: > > No, we *could* use the 'import as' trick: define the syntax as > > > > term: factor (('*'|'/'|'%'|NAME) factor)* > > > > and add a check that NAME is "div" in the compiler. > > > > But I don't know how comfy I am with a proliferation of hacks like > > this -- and it's likely to cause more confusing error messages. And what is the compelling reason for going through all this instead of just using the '//' symbol? Because it might be confused for a C++ comment? This is a weak argument AFAIAC. Python is not C++ and everybody knows that. I think that making "div" an infix operator would be setting a horrible precedent. Currently, all infix operators "look like" operators, i.e. they are non-alphabetic characters, and things that look like words are either functions or reserved words. There's already a "divmod" builtin which is a function, not an infix operator. I think it would be rather inconsistent to write, on the one hand: divmod(10, 2) and on the other: 10 div 2 Long before the creation of python-dev, this issue had been discussed numerous times on c.l.py, and the '//' operator was suggested several times, and I don't think anybody ever had a problem with it... 
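Concretely, the operator being campaigned for in this thread pairs with the existing divmod() builtin: for nonzero y, x == (x // y) * y + (x % y), with the quotient rounded toward negative infinity. The '//' spelling is used below as proposed; it was not yet part of the language when these messages were sent.

```python
# Floor division and the divmod() identity the thread keeps circling:
# the quotient rounds toward negative infinity, and quotient/remainder
# always reassemble the dividend.  '//' is written here as proposed in
# the thread, not as an operator Python had at the time.
def check_identity(x, y):
    q, r = divmod(x, y)
    assert q == x // y and r == x % y
    assert x == q * y + r
    return q, r

print(check_identity(10, 3))   # (3, 1)
print(check_identity(-7, 2))   # (-4, 1): rounds down, not toward zero
```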
From gvwilson@nevex.com Tue Nov 7 16:41:25 2000 From: gvwilson@nevex.com (Greg Wilson) Date: Tue, 7 Nov 2000 11:41:25 -0500 Subject: [Python-Dev] Integer division transition In-Reply-To: <14856.11800.444274.851825@buffalo.fnal.gov> Message-ID: > > Guido van Rossum: > > > No, we *could* use the 'import as' trick: define the syntax as > > > term: factor (('*'|'/'|'%'|NAME) factor)* > > > and add a check that NAME is "div" in the compiler. > Charles G. Waldman: > And what is the compelling reason for going through all this instead > of just using the '//' symbol? Because it might be confused for a C++ > comment? This is a weak argument AFAIAC. Python is not C++ and > everybody knows that. > Long before the creation of python-dev, this issue had been discussed > numerous times on c.l.py, and the '//' operator was suggested several > times, and I don't think anybody ever had a problem with it... Greg Wilson: As someone who teaches Python, I'm strongly opposed to using '//' in the same language as '/', purely on readability grounds: 1. Every C/C++ book includes a warning about "=" vs. "==", because it's a common hard-to-spot error. 2. What mark would you give a student who had variables IO and I0 in the same module? Greg p.s. I was very disappointed to discover that Ruby uses both '..' and '...' One means "up to but not including", the other means "up to and including". It would be interesting to estimate the number of programmer-hours this will cost... 
:-) From thomas@xs4all.net Tue Nov 7 17:26:21 2000 From: thomas@xs4all.net (Thomas Wouters) Date: Tue, 7 Nov 2000 18:26:21 +0100 Subject: [Python-Dev] Re: [Python-checkins] CVS: python/dist/src configure.in,1.177,1.178 configure,1.169,1.170 In-Reply-To: <200011071544.HAA31147@slayer.i.sourceforge.net>; from gward@users.sourceforge.net on Tue, Nov 07, 2000 at 07:44:24AM -0800 References: <200011071544.HAA31147@slayer.i.sourceforge.net> Message-ID: <20001107182621.I27208@xs4all.nl> On Tue, Nov 07, 2000 at 07:44:24AM -0800, Greg Ward wrote: > - when compiling with GCC on any platform, add "-fPIC" to OPT > (without this, "$(CC) -shared" dies horribly) Sorry for the late remark (I did see your earlier message) but after reading the patch I realized 'OPT' isn't the right place for this. 'OPT' should be for non-essential stuff: warnings, debug-info and optimizations. Removing things from OPT shouldn't break anything, and neither should adding options that fit in the categories above. (Barring broken compilers, of course.) Instead, the -fPIC option should be added to CFLAGS, I think. The Python autoconf setup is slightly less versatile than most, though, since it's doggone hard to near impossible to change things like OPT, CC, CFLAGS, etc, without editing configure(.in) :P If noone else does it before me, I'll see about fixing at least the -fPIC thing later, when I find some time ;P -- Thomas Wouters Hi! I'm a .signature virus! copy me into your .signature file to help me spread! From paulp@ActiveState.com Tue Nov 7 17:29:10 2000 From: paulp@ActiveState.com (Paul Prescod) Date: Tue, 07 Nov 2000 09:29:10 -0800 Subject: [Python-Dev] Integer division transition References: <200011061717.MAA07847@cj20424-a.reston1.va.home.com> <200011070106.OAA00139@s454.cosc.canterbury.ac.nz> <20001107141702.H27208@xs4all.nl> <200011071333.IAA13497@cj20424-a.reston1.va.home.com> <14856.8184.854926.597397@cj42289-a.reston1.va.home.com> Message-ID: <3A083BE6.4A502210@activestate.com> "Fred L. 
Drake, Jr." wrote: > > ... > A tool to help them out wouldn't hurt, either, and that could be > written before any actual changes are made or even final decisions are > made -- it could search everything on sys.path and report on uses that > would be affected by each candidate change. I think that the standard Python compiler is the appropriate tool for this sort of thing. Anything that can be caught "statically" might as well be implemented right in the compiler (at least from the user's point of view) rather than in a separate "deprecation nanny." Paul Prescod From nas@arctrix.com Tue Nov 7 11:45:18 2000 From: nas@arctrix.com (Neil Schemenauer) Date: Tue, 7 Nov 2000 03:45:18 -0800 Subject: [Python-Dev] Integer division transition In-Reply-To: ; from gvwilson@nevex.com on Tue, Nov 07, 2000 at 11:41:25AM -0500 References: <14856.11800.444274.851825@buffalo.fnal.gov> Message-ID: <20001107034518.A12431@glacier.fnational.com> On Tue, Nov 07, 2000 at 11:41:25AM -0500, Greg Wilson wrote: > As someone who teaches Python, I'm strongly opposed to using '//' in the > same language as '/', purely on readability grounds: How do you feel about div(x, y)? Neil From cgw@fnal.gov Tue Nov 7 18:50:18 2000 From: cgw@fnal.gov (Charles G Waldman) Date: Tue, 7 Nov 2000 12:50:18 -0600 (CST) Subject: [Python-Dev] Integer division transition In-Reply-To: References: <14856.11800.444274.851825@buffalo.fnal.gov> Message-ID: <14856.20202.174151.647981@buffalo.fnal.gov> Greg Wilson writes: > > 2. What mark would you give a student who had variables IO and I0 in the > same module? > I think this is a bit of a stretch - IO and I0 look almost identical typographically (depending on the font) whereas // and / look pretty different. It would be a better analogy to say "What mark would you give a student who used variables X and XX in the same program". And, I wouldn't have a problem with that. How about URL's? 
'/' and '//' have different meanings there and I don't think people have a big problem with this. The other point - "=" vs "==" - is a bit harder to answer. Both of these symbols are used in Python, but not in the same context. All-the-good-symbols-are-already-taken-ly y'rs, //C From gstein@lyra.org Tue Nov 7 18:49:12 2000 From: gstein@lyra.org (Greg Stein) Date: Tue, 7 Nov 2000 10:49:12 -0800 Subject: [Python-Dev] Re: python-gdk-imlib and Delinquent Maintainers In-Reply-To: ; from moshez@math.huji.ac.il on Tue, Nov 07, 2000 at 08:37:01AM +0200 References: <87ofzs39lo.fsf@indy.progeny.com> Message-ID: <20001107104912.Q14054@lyra.org> On Tue, Nov 07, 2000 at 08:37:01AM +0200, Moshe Zadka wrote: > On 6 Nov 2000, Eric Gillespie, Jr. wrote: > > There has been a problem in Debian's python-gdk-imlib package for > > quite some time: Transparent PNGs do not display properly (see > > attached example script). According to upstream (http://www.daa.com.au/pipermail/pygtk/2000-September/000336.html), > > the proper solution is either to have Python use RTLD_GLOBAL in > > dlopen calls when loading extension modules, > > Or, possibly, using dlmodule to dlopen things ourselves rather than > using Python's import facilities for that. There was quite a long conversation [on python-dev] a while back (geez, a year ago? more?) about RTLD_GLOBAL and whether Python should use it. There were pros and cons for both directions, and I believe some compatibility issues. You may be able to find the conversation, then figure out why Python chose its current mechanism. Heck...
maybe it should change :-) Cheers, -g -- Greg Stein, http://www.lyra.org/ From akuchlin@mems-exchange.org Tue Nov 7 19:11:06 2000 From: akuchlin@mems-exchange.org (Andrew Kuchling) Date: Tue, 7 Nov 2000 14:11:06 -0500 Subject: [Python-Dev] Re: python-gdk-imlib and Delinquent Maintainers In-Reply-To: <20001107104912.Q14054@lyra.org>; from gstein@lyra.org on Tue, Nov 07, 2000 at 10:49:12AM -0800 References: <87ofzs39lo.fsf@indy.progeny.com> <20001107104912.Q14054@lyra.org> Message-ID: <20001107141106.A10897@kronos.cnri.reston.va.us> On Tue, Nov 07, 2000 at 10:49:12AM -0800, Greg Stein wrote: >There was quite a long conversation [on python-dev] a while back (geez, a >year ago? more?) about RTLD_GLOBAL and whether Python should use it. There >were pros and cons for both directions, and I believe some compatibility >issues. kronos Python>cvs log importdl.c ... revision 2.47 date: 1998/05/18 13:42:45; author: guido; state: Exp; lines: +4 -6 Remove use of RTLD_GLOBAL. ---------------------------- ... revision 2.41 date: 1997/12/02 20:43:18; author: guido; state: Exp; lines: +7 -3 Add the flag RTLD_GLOBAL to the dlopen() options. This exports symbols defined by the loaded extension to other extensions (loaded later). (I'm not quite sure about this but suppose it can't hurt...) ---------------------------- Adding RTLD_GLOBAL in one version, removing it in the next: a new Python tradition! --amk From est@hyperreal.org Tue Nov 7 19:04:33 2000 From: est@hyperreal.org (est@hyperreal.org) Date: Tue, 7 Nov 2000 11:04:33 -0800 (PST) Subject: [Python-Dev] uthread strawman In-Reply-To: <200011071614.LAA18276@cj20424-a.reston1.va.home.com> from Guido van Rossum at "Nov 7, 2000 11:14:21 am" Message-ID: <20001107190433.26578.qmail@hyperreal.org> Guido van Rossum discourseth: > > A strawman proposal: > > The uthread module provides the new functionality at the lowest level. I really like this as a primitive appropriate for Python's evolution. 
> - When func() returns, the uthread that was executing it ceases to be > runnable. The uthread that most recently yielded to it is resumed, > unless that is no longer runnable, in which case the uthread that > most recently yielded to *it* is resumed, and so on until a runnable > uthread is found or until no runnable uthreads are left, in which > case the program terminates. (XXX I need a proof here that this > works.) I'd like it added that when a uthread chains to its yielder it drops (i.e., nulls and decrefs) the reference to that yielder. I want uthreads in some of the same applications in which i disable gc for real-time purposes, and I don't want circular structures of unrunnable uthreads leaking my memory. > - Calling u.yield() or u.kill() for a non-runnable uthread is an error > and raises an exception. A runnable() predicate might be nice. > - I'm not sure that I got the start conditions right. Should func() be > be allowed to run until its first yield() when uthread.new(func) is > called? +1 for no on this. > - I'm not sure that the rules for returning and raising exceptions > from func() are the right ones. I'm particularly unsure about the exception propagation. It could always be disabled by a universal exception handler in the uthread, but I'm not sure it's even worth the implementation effort. > - Should it be possible to pass a value to another uthread by passing > an argument to u.yield(), which then gets returned by the resumed > yield() call in that uthread? Certainly! :) As written, the strawman seems to require that the dread intervening C stack frames are handled transparently (since it doesn't say anything about them). This seems pretty important to me. An instance method may well know that it should yield, yet not know that it's being called as a callback from a class/type that's just been moved to C. OTOH..not handling this transparently would increase my market value as a Python programmer. 
Handling it right might get me some unpaid work implementing some of the low-level details for Linux. Hmm! :D Eric From paulp@ActiveState.com Tue Nov 7 20:33:19 2000 From: paulp@ActiveState.com (Paul Prescod) Date: Tue, 07 Nov 2000 12:33:19 -0800 Subject: [Python-Dev] Integer division transition References: <200011071333.IAA13497@cj20424-a.reston1.va.home.com> <14856.11800.444274.851825@buffalo.fnal.gov> Message-ID: <3A08670F.A4703F32@activestate.com> Charles G Waldman wrote: > > ... > > > I think that making "div" an infix operator would be setting a > horrible precedent. I think it would be a good precedent because it is a cleaner upgrade path to things like matrixdiv, matrixmul, ... Paul Prescod From tismer@tismer.com Tue Nov 7 19:54:14 2000 From: tismer@tismer.com (Christian Tismer) Date: Tue, 07 Nov 2000 21:54:14 +0200 Subject: [Python-Dev] Re: uthread strawman References: <200011060450.RAA00019@s454.cosc.canterbury.ac.nz> <3A06BEF4.95B773BD@tismer.com> <200011071614.LAA18276@cj20424-a.reston1.va.home.com> Message-ID: <3A085DE6.28593202@tismer.com> Just answering/clarifying a few bits, since I can't change your opinion about continuations, anyway. Guido van Rossum wrote: > > I've thought about it a little more, and drawn some pictures in my > head. I have to agree with Guido when he says: > I still have to disagree with Christian when he says: > > > Making Python completely coroutine aware, without > > tricking the C stack, is 90 percent of the problem. > > But after walking that far, there is no reason > > to leave the other 10 percent alone. ... since I meant implementability. Of course there are other reasons against continuations. I just did it since they were in reach. > Without continuations, but with microthreads (uthreads) or coroutines, > each (Python) stack frame can simply be "paused" at a specific point > and continued later.
The semantics here are completely clear (except > perhaps for end cases such as unhandled exceptions and intervening C > stack frames). I agree. But also with continuations, the situation is identical, as long as you don't try anything else where continuations would be needed. Note that they will not need to be created when the mentioned structures are implemented well. We don't have to implement them, but providing support for them in the interpreter framework is simple. (that's the 10% issue). > With continuations, you have to decide how much state to save for a > future continuation. It would seem easy enough: save all state kept > in the frame except for the local variables. But now consider this: > the "evaluation stack" contained in each frame could easily be > replaced by a bunch of temporary variables, if we had a slightly > different instruction set (3-address opcodes instead of stack-based > opcodes). Then would we save those temporary variables or not? It > can make a difference! Since the "save continuation" operation is a > function call, you can easily save a continuation while there are some > items on the value stack. I believe the current implementation saves > these so they are restored when you jump to the continuation. But if > the compiler were to use temporary variables instead of the evaluation > stack, they might not have been restored! I would consider these temporary variables registers which must be preserved. They are immutable objects as part of the immutable continuation, treated as values. Stack or registers, this is part of an expression evaluation. Temporary results must conceptually be read-only, whatever way I implement this. > Here's another example. Suppose you set up a for loop. After three > iterations through the loop you save a continuation. Then you finish > three more iterations. Then you return to the saved continuation. > Where does the loop continue: at 3 or at 6 iterations? Try to answer > this without trying it.
My guess: it gets restarted at 3 iterations, because the loop index is saved on the value stack. If you rewrite this using a while loop, however, it would get restarted at 6 iterations, because then your loop index is an unsaved local variable. Ditto if you changed the bytecode compiler so for loops use an anonymous local variable instead of an entry on the evaluation stack. Wrong guess! Exactly for that reason I changed the loop code to put a mutable loopcounter object on the stack. The loop works perfectly. > This semantic imprecision is one of the reasons why I don't like the > concept of continuations. (I've been told that the exact semantics of > continuations in Scheme differ greatly between Scheme implementations.) In a sense, you have continuations already, also with the restriction to gen/co/uthread structures. The only difference is to treat a frame as exactly one continuation and to disallow to have more than one at any time. This saves the decision about the ambiguities you mentioned. I agree that going to this point and not further for the Python core is a good idea. A PEP doesn't need to name continuations at all. On the other hand, I don't see a reason why this should mean that Python *must not* support them. What I'd like to keep is the possibility to still write such an extension module. Enabling this for educational purposes is a great value that comes at a cheap price and no impact for the core. > Now let's look at Jython. In Jython, we can simulate "paused frames" > well enough by using real Java threads. However full continuations > would require modifications to the JVM -- which is unacceptable to a > language boasting "100% Pure Java". Another reason against allowing > continuations. Not quite true, after I heard of a paper that shows how to implement continuations in Java, using threads. But I'll come back to that when I have the paper.
> So, all in all, I don't think of continuations as "the last 10% that > we might as well add to finish the job." I see it as an ill-specified > hypergeneralization. Can we agree to not support them without forbidding them? ... > A strawman proposal: Ok, this looks all very well to me. More on that later. One question: Why do you want an explicit u.yield() ? Uthreads are scheduled automatically now, like real threads. Do you see a major drawback in supporting this, maybe as an option? Or do you see automatic scheduling as an extra construct on top with a special "scheduler" uthread? cheers - chris -- Christian Tismer :^) Mission Impossible 5oftware : Have a break! Take a ride on Python's Kaunstr. 26 : *Starship* http://starship.python.net 14163 Berlin : PGP key -> http://wwwkeys.pgp.net PGP Fingerprint E182 71C7 1A9D 66E9 9D15 D3CC D4D7 93E2 1FAE F6DF where do you want to jump today? http://www.stackless.com From jack@oratrix.nl Tue Nov 7 22:57:53 2000 From: jack@oratrix.nl (Jack Jansen) Date: Tue, 07 Nov 2000 23:57:53 +0100 Subject: [Python-Dev] Re: Class/type dichotomy thoughts In-Reply-To: Message by "M.-A. Lemburg" , Tue, 07 Nov 2000 10:17:56 +0100 , <3A07C8C4.1F14985F@lemburg.com> Message-ID: <20001107225758.B77651301D9@oratrix.oratrix.nl> > > In other words, what does this new type_id thing > > actually *mean*? > > For the interpreter it means that it can assume the type > interface to be binary compatible to the "original" > type, e.g. by setting the flag to say PyDict_TypeID > the type assures that all PyDict_*() APIs will work > on the type -- basically the same thing as PyDict_Check() > does now except that the type object needn't be the same > anymore. I would be _very_ happy if this single type_id could somehow be replaced by an array, or a bitset. I have a lot of types in MacPython that are acceptable to the APIs of other types, a sort of poor-mans-inheritance scheme. 
For instance, all operating system calls that accept a MacOS WindowPtr will also happily accept a DialogPtr. Major magic is needed to get this to work reasonably in Python, and the Python user can still accidentally mess up the refcounting scheme and free things s/he isn't aware of. As the number of types in a given run of the interpreter appears to be limited (am I right here?) and type-identity-tests are valid within a single interpreter run only (am I right here?) an API like typeindex = Py_TypeToTypeIndex(typeobject); which would use a dictionary as storage for the mapping and generate the index numbers on the fly would do the trick. Call it once during module initialization and the Py_ISOBJECTCOMPATIBLEWITH(object, typeindex) macro would be a oneliner to test the bit in the set. A few hundred bits in the set would get us a long way, I guess. -- Jack Jansen | ++++ stop the execution of Mumia Abu-Jamal ++++ Jack.Jansen@oratrix.com | ++++ if you agree copy these lines to your sig ++++ www.oratrix.nl/~jack | see http://www.xs4all.nl/~tank/spg-l/sigaction.htm From fdrake@acm.org Tue Nov 7 23:06:25 2000 From: fdrake@acm.org (Fred L. Drake, Jr.) Date: Tue, 7 Nov 2000 18:06:25 -0500 (EST) Subject: [Python-Dev] Integer division transition In-Reply-To: <14856.11800.444274.851825@buffalo.fnal.gov> References: <200011071333.IAA13497@cj20424-a.reston1.va.home.com> <14856.11800.444274.851825@buffalo.fnal.gov> Message-ID: <14856.35569.55094.245631@cj42289-a.reston1.va.home.com> Charles G Waldman writes: > I think that making "div" an infix operator would be setting a > horrible precedent. Currently, all infix operators "look like" > operators, i.e. they are non-alphabetic characters, and things that > look like words are either functions or reserved words. Like "is", "in", "is not", and "not in"?
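Jack Jansen's Py_TypeToTypeIndex()/Py_ISOBJECTCOMPATIBLEWITH() idea above can be sketched in Python. The names come from his message; the dictionary-backed index registry and the bitset layout are invented here purely for illustration.

```python
# Sketch of Jack's proposal: hand out a small integer index per type
# on demand (backed by a dictionary), and test an object's API
# compatibility with a one-line bit test.  All names are illustrative.
_type_indexes = {}

def type_to_type_index(tp):
    # Py_TypeToTypeIndex: stable small index per type, created on the fly
    return _type_indexes.setdefault(tp, len(_type_indexes))

class CompatPtr:
    # An object records every type whose APIs it can satisfy.
    def __init__(self, *compatible_types):
        self.compat_bits = 0
        for tp in compatible_types:
            self.compat_bits |= 1 << type_to_type_index(tp)

def is_object_compatible_with(obj, type_index):
    # Py_ISOBJECTCOMPATIBLEWITH: the one-liner bit test
    return bool(obj.compat_bits & (1 << type_index))

class WindowPtr: pass
class DialogPtr: pass

# A DialogPtr is acceptable to all APIs that take a WindowPtr:
dialog = CompatPtr(DialogPtr, WindowPtr)
print(is_object_compatible_with(dialog, type_to_type_index(WindowPtr)))  # True
```

A few hundred bits per object, as Jack estimates, would cover this poor-man's-inheritance scheme without touching the type objects themselves.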
> Long before the creation of python-dev, this issue had been discussed > numerous times on c.l.py, and the '//' operator was suggested several > times, and I don't think anybody ever had a problem with it... I don't have a strong preference for either // or div, but definitely want this to be an operator. -Fred -- Fred L. Drake, Jr. PythonLabs at Digital Creations From fdrake@acm.org Tue Nov 7 23:17:10 2000 From: fdrake@acm.org (Fred L. Drake, Jr.) Date: Tue, 7 Nov 2000 18:17:10 -0500 (EST) Subject: [Python-Dev] Integer division transition In-Reply-To: <3A083BE6.4A502210@activestate.com> References: <200011061717.MAA07847@cj20424-a.reston1.va.home.com> <200011070106.OAA00139@s454.cosc.canterbury.ac.nz> <20001107141702.H27208@xs4all.nl> <200011071333.IAA13497@cj20424-a.reston1.va.home.com> <14856.8184.854926.597397@cj42289-a.reston1.va.home.com> <3A083BE6.4A502210@activestate.com> Message-ID: <14856.36214.267784.220958@cj42289-a.reston1.va.home.com> Paul Prescod writes: > I think that the standard Python compiler is the appropriate tool for > this sort of thing. Anything that can be caught "statically" might as > well be implemented right in the compiler (at least from the user's > point of view) rather than in a separate "deprecation nanny." For linting programs using the final specification, this is fine. I'm thinking that a tool to read over people's sources and say "'div' is used in 120 places out of 56K lines of code." would be helpful because we could determine the extent of the effect of using "div" instead of "//". If there are a limited number of projects affected, it may be entirely reasonable to find out that there aren't enough uses to worry about, and it becomes acceptable to make it a keyword (like the other textual infix operators). -Fred -- Fred L. Drake, Jr.
PythonLabs at Digital Creations From greg@cosc.canterbury.ac.nz Wed Nov 8 00:42:40 2000 From: greg@cosc.canterbury.ac.nz (Greg Ewing) Date: Wed, 08 Nov 2000 13:42:40 +1300 (NZDT) Subject: [Python-Dev] Integer division transition In-Reply-To: <14856.20202.174151.647981@buffalo.fnal.gov> Message-ID: <200011080042.NAA00284@s454.cosc.canterbury.ac.nz> Charles G Waldman : > The other point - "=" vs "==" - is a bit harder to answer. I think the reason this causes so much trouble is that many programming languages, not to mention mainstream mathematics, use "=" to mean what C uses "==" for. Other such pairs in C, e.g. "&" vs "&&" and "+" vs "++", don't seem to cause anywhere near as much difficulty, so I don't think the problem is one of visual confusion, but of semantic confusion. Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg@cosc.canterbury.ac.nz +--------------------------------------+ From guido@python.org Wed Nov 8 01:34:49 2000 From: guido@python.org (Guido van Rossum) Date: Tue, 07 Nov 2000 20:34:49 -0500 Subject: [Python-Dev] uthread strawman In-Reply-To: Your message of "Tue, 07 Nov 2000 11:04:33 PST." <20001107190433.26578.qmail@hyperreal.org> References: <20001107190433.26578.qmail@hyperreal.org> Message-ID: <200011080134.UAA19392@cj20424-a.reston1.va.home.com> [Note: the stackless@starship.python.net list seems to have trouble again. BeOpen may have crashed the machine again. :-( ] [Note: response to Christian at end.] [Guido] > > > > A strawman proposal: > > > > The uthread module provides the new functionality at the lowest level. Eric (Tiedemann, right? There are two Erics here -- it would help if you signed your full name :-) writes: > I really like this as a primitive appropriate for Python's evolution. Cool. 
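The counting tool Fred describes in the division thread above (scan sources, report how often "div" is used) could be prototyped in a few lines with the standard tokenize module. This is only a sketch; the report format and helper name are invented.

```python
# A sketch of Fred's "div counter": count how many times a given name
# appears as an identifier token in a piece of Python source.
import io
import tokenize

def count_name_uses(source, name):
    tokens = tokenize.generate_tokens(io.StringIO(source).readline)
    return sum(1 for tok in tokens
               if tok.type == tokenize.NAME and tok.string == name)

sample = "q = div(7, 2)\nr = 7 - q * 2\nprint(div(9, 4))\n"
print(count_name_uses(sample, "div"))  # 2
```

Walking sys.path and summing the counts per file would turn this into the "how many projects are affected?" survey Fred wants.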
I think I spoke too soon when I called it uthreads -- these are really more like coroutines. I also forgot to mention that I am assuming with this strawman that we can use an essentially stackless implementation of the PVM. I hope that it will be possible to make it a lot simpler than current stackless though, by not doing continuations. Freezing a frame in place is a lot simpler than freezing for multiple uses, which requires one to decide what to copy and what to share! > > - When func() returns, the uthread that was executing it ceases to be > > runnable. The uthread that most recently yielded to it is resumed, > > unless that is no longer runnable, in which case the uthread that > > most recently yielded to *it* is resumed, and so on until a runnable > > uthread is found or until no runnable uthreads are left, in which > > case the program terminates. (XXX I need a proof here that this > > works.) > > I'd like it added that when a uthread chains to its yielder it drops > (i.e., nulls and decrefs) the reference to that yielder. I want > uthreads in some of the same applications in which i disable gc for > real-time purposes, and I don't want circular structures of unrunnable > uthreads leaking my memory. Good point. > > - Calling u.yield() or u.kill() for a non-runnable uthread is an error > > and raises an exception. > > A runnable() predicate might be nice. Yes. > > - I'm not sure that I got the start conditions right. Should func() be > > be allowed to run until its first yield() when uthread.new(func) is > > called? > > +1 for no on this. You're being unnecessarily cryptic. "Yes for no"? So you're for the original proposal (which doesn't start func() at all until it is yielded to for the first time). > > - I'm not sure that the rules for returning and raising exceptions > > from func() are the right ones. > > I'm particularly unsure about the exception propagation. 
It could > always be disabled by a universal exception handler in the uthread, > but I'm not sure it's even worth the implementation effort. Agreed. We may have to experiment. > > - Should it be possible to pass a value to another uthread by passing > > an argument to u.yield(), which then gets returned by the resumed > > yield() call in that uthread? > > Certainly! :) This affects the initial condition. If u hasn't called func() yet, and I call u.yield(42), where does the 42 go? Does it call func(42)? That may make it make it unnecessarily hard to get the end conditions right for func(). Again, we'll have to write some sample code to see how this turns out in practice. > As written, the strawman seems to require that the dread intervening C > stack frames are handled transparently (since it doesn't say anything > about them). This seems pretty important to me. An instance method > may well know that it should yield, yet not know that it's being > called as a callback from a class/type that's just been moved to C. Not sure what you meant by intervening. I certainly intended Python to Python calls to be handled without creating extra C stack frames. When Python calls C which calls back into Python, this is considered all part of the same uthread. Where it gets tricky is when this spawns a new uthread, which also calls C which calls Python. Now the second uthread has a C stack frame above the C stack frame that's part of the first uthread. This means that the second uthread must return from its C code before the first uthread can return to its C code! A little extra bookkeeping will be necessary to check for this -- so that when it is attempted an exception is raised, rather than a return attempted from the wrong uthread back into C. This is the same as for current stackless. The solution is simply that the application "shouldn't do that." > OTOH..not handling this transparently would increase my market value > as a Python programmer. 
Handling it right might get me some unpaid > work implementing some of the low-level details for Linux. Hmm! :D If you want to hack continuations in C, be my guest -- as long as you stay 10,000 kilometers away from core Python. :-) [Now replying to Christian:] > Just answering/clarifying a few bits, > since I can't change your opinion about > continuations, anyway. Right! > Guido van Rossum wrote: > > > > I've thought about it a little more, and drawn some pictures in my > > head. > > I have to agree with Guido when he says: > > > I still have to disagree with Christian when he says: > > > > > Making Python completely coroutine aware, without > > > tricking the C stack, is 90 percent of the problem. > > > But after walking that far, there is no reason > > > to leave the other 10 percent alone. > > ... since I meant implementability. Of course there are > other reasons against continuations. I just did it since they > were in reach. Hm. Having seen a few fragments of your implementation today (just a very little bit, since we were having an all-day meeting) I feel that there are a lot of extra hacks needed to make the reuse of continuations necessary. This shouldn't be needed in my version. > > Without continuations, but with microthreads (uthreads) or coroutines, > > each (Python) stack frame can simply be "paused" at a specific point > > and continued later. The semantics here are completely clear (except > > perhaps for end cases such as unhandled exceptions and intervening C > > stack frames). > > I agree. But also with continuations, the situation is identical, > as long as you don't try anything else where continuations > would be needed. But the complexity in the code still exists because a continuation *could* be reused and you don't know if it will ever happen so you must be prepared. > Note that they will not need to be created when the mentioned > structures are implemented well.
We don't have to implement them, > but providing support for them in the interpreter framework > is simple. (that's the 10% issue). Having seen your code (frankly, a mess!) I don't believe that it's only 10% at all. > > With continuations, you have to decide how much state to save for a > > future continuation. It would seem easy enough: save all state kept > > in the frame except for the local variables. But now consider this: > > the "evaluation stack" contained in each frame could easily be > > replaced by a bunch of temporary variables, if we had a slightly > > different instruction set (3-address opcodes instead of stack-based > > opcodes). Then would we save those temporary variables or not? It > > can make a difference! Since the "save continuation" operation is a > > function call, you can easily save a continuation while there are some > > items on the value stack. I believe the current implementation saves > > these so they are restored when you jump to the continuation. But if > > the compiler were to use temporary variables instead of the evaluation > > stack, they might not have been restored! > > I would consider these temporary variables registers which must > be preserved. They are immutable objects as part of the immutable > continuation, treated as values. Stack or registers, this is part > of an expression evaluation. Temporary results must conceptually > be read-only, whatever way I implement this. I heard from Tim that he helped you get this right. The fact that it is so hard to know the requirements for a practical implementation makes me very worried that continuations may have hidden bugs. > > Here's another example. Suppose you set up a for loop. After three > > iterations through the loop you save a continuation. Then you finish > > three more iterations. Then you return to the saved continuation. > > Where does the loop continue: at 3 or at 6 iterations? Try to answer > > this without trying it.
My guess: it gets restarted at 3 iterations, > > because the loop index is saved on the value stack. If you rewrite > > this using a while loop, however, it would get restarted at 6 > > iterations, because then your loop index is an unsaved local variable. > > Ditto if you changed the bytecode compiler so for loops use an > > anonymous local variable instead of an entry on the evaluation > > stack. > > Wrong guess! > Exactly for that reason I changed the loop code to put a mutable > loopcounter object on the stack. > The loop works perfectly. Wow. I'm impressed. You must have borrowed my time machine. :-) Still, I believe there was a time when the loop *didn't* work perfectly yet. It is really hard to know what is needed. Are you *sure* that it now *always* does the right thing? What if I save a continuation in the middle of a shortcut Boolean expression (and/or stuff)? Or in cases like a > > This semantic imprecision is one of the reasons why I don't like the > > concept of continuations. (I've been told that the exact semantics of > > continuations in Scheme differ greatly between Scheme implementations.) > > In a sense, you have continuations already, also with the restriction > to gen/co/uthread structures. The only difference is to treat a > frame as exactly one continuation and to disallow to have more > than one at any time. > This saves the decision about the ambiguities you mentioned. Yeah, right. You can call pausable/resumable frames use-once continuations if you want to. And if that gives you the happy feeling that I "support" continuations, fine. > I agree that going to this point and not further for the > Python core is a good idea. > A PEP doesn't need to name continuations at all. Good. > On the other hand, I don't see a reason why this should mean > that Python *must not* support them. What I'd like to keep > is the possibility to still write such an extension module.
> > Enabling this for educational purposes is a great value > that comes at a cheap price and no impact for the core. I doubt it. I'm not going to allow any compromises just to make it easier to reuse continuations. (Such as using a mutable counter in the for-loop code.) > > Now let's look at Jython. In Jython, we can simulate "paused frames" > > well enough by using real Java threads. However full continuations > > would require modifications to the JVM -- which is unacceptable to a > > language boasting "100% Pure Java". Another reason against allowing > > continuations. > > Not quite true, after I heard of a paper that shows how > to implement continuations in Java, using threads. > But I'll come back to that when I have the paper. I've heard that his margin was too small to contain the proof. I expect that it will be a disappointment from a practical point of view: perhaps he emulates the JVM in Java. > > So, all in all, I don't think of continuations as "the last 10% that > > we might as well add to finish the job." I see it as an ill-specified > > hypergeneralization. > > Can we agree to not support them without forbidding them? I won't forbid them, but I won't make compromises to the core PVM that would make them easier to implement. Your patch set would still be a heck of a lot smaller of course. > ... > > A strawman proposal: > > Ok, this looks all very well to me. More on that later. > One question: Why do you want an explicit u.yield() ? > Uthreads are scheduled automatically now, like real > threads. Do you see a major drawback in supporting > this, maybe as an option? Or do you see automatic > scheduling as an extra construct on top with a special > "scheduler" uthread? See my response near the top to Eric about this. I was thinking of a lower-level concept, like coroutines. I might consider automatic scheduling of uthreads too. But I've noticed that there are some ugly hacks in that code, too. 
:-) I've lived for years with a system (early Amoeba) that had threads with only explicit scheduling: other threads would only run when you were blocked for I/O or waiting for a semaphore. It made for very easy coding in some cases, since you didn't need to protect critical sections with mutexes. Unless, that is, you invoke stuff that might do I/O (maybe for debugging :-). --Guido van Rossum (home page: http://www.python.org/~guido/) From greg@cosc.canterbury.ac.nz Wed Nov 8 02:12:09 2000 From: greg@cosc.canterbury.ac.nz (Greg Ewing) Date: Wed, 08 Nov 2000 15:12:09 +1300 (NZDT) Subject: [Python-Dev] uthread strawman In-Reply-To: <200011080134.UAA19392@cj20424-a.reston1.va.home.com> Message-ID: <200011080212.PAA00313@s454.cosc.canterbury.ac.nz> Guido: > I hope that it will be possible to make it a lot simpler than current > stackless though, by not doing continuations. My feeling is that this won't be the case. The fundamental change of structure needed to make it stackless will be much the same, as will the thought processes necessary to understand how it works. > Where it gets tricky is when this spawns a new uthread, which also > calls C which calls Python... The solution is simply that the > application "shouldn't do that." I worry that this is going to be a rather severe restriction. For instance, it may make it impossible to spawn a uthread from within a callback from a GUI framework. Since with many GUI frameworks the entire application gets executed in callbacks, you wouldn't be able to use uthreads at all with such a framework. Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc.
| greg@cosc.canterbury.ac.nz +--------------------------------------+ From guido@python.org Wed Nov 8 02:58:20 2000 From: guido@python.org (Guido van Rossum) Date: Tue, 07 Nov 2000 21:58:20 -0500 Subject: [Python-Dev] uthread strawman In-Reply-To: Your message of "Wed, 08 Nov 2000 15:12:09 +1300." <200011080212.PAA00313@s454.cosc.canterbury.ac.nz> References: <200011080212.PAA00313@s454.cosc.canterbury.ac.nz> Message-ID: <200011080258.VAA19690@cj20424-a.reston1.va.home.com> > Guido: > > > I hope that it will be possible to make it a lot simpler than current > > stackless though, by not doing continuations. [Greg Ewing] > My feeling is that this won't be the case. The fundamental > change of structure needed to make it stackless will be > much the same, as will the thought processes necessary > to understand how it works. I hope you are wrong but you may be right. I'll have to have a good look -- or someone else (not Christian! With all due respect his code is unreadable :-). > > Where it gets tricky is when this spawns a new uthread, which also > > calls C which calls Python... The solution is simply that the > > application "shouldn't do that." > > I worry that this is going to be a rather severe restriction. > For instance, it may make it impossible to spawn a uthread > from within a callback from a GUI framework. Since with many > GUI frameworks the entire application gets executed in > callbacks, you wouldn't be able to use uthreads at all with > such a framework. But that's the same problem that current stackless has. I take it that you don't see the point of stackless then? That's fine. Maybe this is not an application that could use uthreads. They seem more something for servers anyway. 
--Guido van Rossum (home page: http://www.python.org/~guido/) From greg@cosc.canterbury.ac.nz Wed Nov 8 03:18:10 2000 From: greg@cosc.canterbury.ac.nz (Greg Ewing) Date: Wed, 08 Nov 2000 16:18:10 +1300 (NZDT) Subject: [Python-Dev] uthread strawman In-Reply-To: <200011080258.VAA19690@cj20424-a.reston1.va.home.com> Message-ID: <200011080318.QAA00324@s454.cosc.canterbury.ac.nz> Guido: > I take it that you don't see the point of stackless then? I have mixed feelings about it. I really like the idea of uthreads, but I get the impression that Stackless as it stands is only a partial implementation of the idea, with no easy way in sight to evolve it into a full implementation. > Maybe this is not an application that could use uthreads. The point is that the user can be mixing Python and C stack frames without even realising it. I was just giving one example of how that can come about. Saying "don't do that" isn't very helpful when "that" is something which people often do quite unconsciously. So, while I'd love to see uthreads as a core Python feature one day, I think I'm in agreement with you that Stackless isn't yet ready to be made into the standard Python implementation. Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg@cosc.canterbury.ac.nz +--------------------------------------+ From pf@artcom-gmbh.de Wed Nov 8 07:33:00 2000 From: pf@artcom-gmbh.de (Peter Funk) Date: Wed, 8 Nov 2000 08:33:00 +0100 (MET) Subject: [Python-Dev] Integer division transition In-Reply-To: <14856.35569.55094.245631@cj42289-a.reston1.va.home.com> from "Fred L. Drake, Jr." at "Nov 7, 2000 6: 6:25 pm" Message-ID: > Charles G Waldman writes: > > I think that making "div" an infix operator would be setting a > > horrible precedent. Currently, all infix operators "look like" > > operators, i.e. 
they are non-alphabetic characters, and things that > > look like words are either functions or reserved words. Fred L. Drake, Jr.: > Like "is", "in", "is not", and "not in"? And not to forget "and", "or" which were also infix operators from the very beginning. So "div" is no precedent at all. IMHO the term "horrible" applies to operator symbols composed out of non-alphabetic characters, where the meaning of these operators is hard to guess. counter-example: Using "><" as a vector cross product operator might still make some sense. But what would be the meaning of all those other arbitrary combinations like ".+", ".%", ".*", "//", "@.", "~*" or what else has been proposed to extend Python in the numeric area? As long as the meaning of such an operator isn't obvious from basic math knowledge, I clearly prefer keyword operators. Regards, Peter -- Peter Funk, Oldenburger Str.86, D-27777 Ganderkesee, Germany, Fax:+49 4222950260 office: +49 421 20419-0 (ArtCom GmbH, Grazer Str.8, D-28359 Bremen) From mal@lemburg.com Wed Nov 8 10:13:51 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Wed, 08 Nov 2000 11:13:51 +0100 Subject: [Python-Dev] Starship down again ?! References: <200011080620.WAA20659@slayer.i.sourceforge.net> Message-ID: <3A09275F.CE4B01FA@lemburg.com> > > + Marc-André Lemburg's mx.Proxy package. These Web pages appear to > + be unavailable at the moment. > + > + http://starship.python.net/crew/lemburg/ > + Looks like Starship is down again. Is this due to the move from BeOpen to DC or has someone pulled the plug on that ADSL line ? ... 
9 sl-bb1-rly-0-0-0.sprintlink.net (144.232.14.6) 120 ms 117 ms 113 ms 10 beth1sr2-2-0-0.md.us.prserv.net (165.87.97.226) 114 ms 116 ms 114 ms 11 beth1br2-ge-6-0-0-0.md.us.prserv.net (165.87.29.182) 122 ms 121 ms 116 ms 12 sfra1br1-so-2-1-0-0.ca.us.prserv.net (165.87.233.42) 193 ms 192 ms 191 ms 13 sfra1sr3-ge-2-0-0-0.ca.us.prserv.net (165.87.33.121) 191 ms 189 ms 190 ms 14 165.87.161.13 (165.87.161.13) 191 ms 191 ms 189 ms 15 core4-g2-0.snfc21.pbi.net (209.232.130.77) 197 ms 192 ms 190 ms 16 rback26-fe2-0.snfc21.pbi.net (216.102.187.153) 212 ms 197 ms 197 ms 17 adsl-63-202-160-65.dsl.snfc21.pacbell.net (63.202.160.65) 206 ms 212 ms 204 ms 18 * * * 19 * * * -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From tismer@tismer.com Wed Nov 8 11:23:46 2000 From: tismer@tismer.com (Christian Tismer) Date: Wed, 08 Nov 2000 13:23:46 +0200 Subject: [Python-Dev] uthread strawman References: <200011080212.PAA00313@s454.cosc.canterbury.ac.nz> <200011080258.VAA19690@cj20424-a.reston1.va.home.com> Message-ID: <3A0937C2.903D42@tismer.com> Guido van Rossum wrote: > > > Guido: > > > > > I hope that it will be possible to make it a lot simpler than current > > > stackless though, by not doing continuations. > > [Greg Ewing] > > My feeling is that this won't be the case. The fundamental > > change of structure needed to make it stackless will be > > much the same, as will the thought processes necessary > > to understand how it works. > > I hope you are wrong but you may be right. I'll have to have a good > look -- or someone else (not Christian! With all due respect his code > is unreadable :-). Are you talking of my changes to ceval.c or the continuationmodule? I think it can't be the latter, since that does not matter at all if we talk about Stackless. Stackless was written to make continuations possible. It does not implement them. 
My changes to ceval.c are written in the same style as your original code, and they use the same level of commenting as yours: Nothing at all. :-) With all due respect, I consider both versions equally unreadable, unless one understands what the intent of the code is. Until last October, I tried to keep everything as readable and understandable as possible. Then it became clear that this implementation would never make it into the core. Then I gave up my efforts, and I also added a lot of optimizations to the interpreter, by some systematic use of macros. Surely this doesn't increase readability. Forgetting about these optimizations, the code doesn't do much more than the following: eval_code2_setup is split off of the original eval_code2 function. It prepares a new frame for execution and puts it on top of the frame stack. PyEval_Frame_Dispatch is a new function. It controls the execution of frames. Every initial or recursive interpreter call starts such a dispatcher. The nested dispatchers control the remaining "stackful" execution of Python. In its central loop, it runs the topmost frame of the frame stack, receives its return value and runs the next frame, until it sees the frame appear that spawned this dispatcher. Then it returns. eval_code2_loop is the "real" part of the original eval_code2 function. It is not much different from the original. Major changes have been done to the entry code, the periodic checks in the loop, and the handling of function calls. The "big switch" has been simplified in the sense that errors are no longer treated with various variables which have to be checked outside the switch. Instead, errors are directly mapped on a pseudo-opcode that allows exceptions to be handled as just another case of the big switch. Every function call has got additional code that checks for the so-called unwind token, which tells us to leave this frame and to return to the scheduler. 
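Forgetting the C details, the shape of that dispatcher loop can be modeled in a few lines of Python. This is purely illustrative (hypothetical names, nothing like the real ceval.c code, which also deals with exceptions and unwinding): a "frame" is a list of steps, running a step may push a callee frame on the frame stack, a finished frame pops itself, and the dispatcher runs the topmost frame until the frame that spawned it shows up on top again.

```python
# Toy model of the frame dispatcher -- illustrative only, not the real C code.
class Frame:
    def __init__(self, steps):
        self.steps = steps   # one callable per "run of opcodes"
        self.ip = 0          # saved instruction pointer

    def run(self, stack):
        step = self.steps[self.ip]
        self.ip = self.ip + 1
        step(stack)                    # may push a callee frame
        if self.ip >= len(self.steps):
            stack.pop()                # finished: leave the frame stack

def dispatch(stack, entry_frame):
    # Central loop: run the topmost frame until the frame that
    # spawned this dispatcher reappears on top, then return.
    while stack[-1] is not entry_frame:
        stack[-1].run(stack)

trace = []
inner = Frame([lambda s: trace.append('inner.0'),
               lambda s: trace.append('inner.1')])
outer = Frame([lambda s: s.append(inner),           # a "call": push callee
               lambda s: trace.append('outer.back')])
entry = Frame([])            # stands for the frame that spawned the dispatcher
stack = [entry, outer]
dispatch(stack, entry)
print(trace)                 # ['inner.0', 'inner.1', 'outer.back']
```

Note how the "call" is just a push: control returns to outer only when inner has popped itself, exactly the behaviour the nested dispatchers provide for the real frames.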
On entry to the frame, on every trip through the main loop, and after every function call, a callback f_callguard is checked for existence. If it exists, it is called, and if it returns -42, again the frame is left and we return to the scheduler. Entry into a frame has become a bit difficult, since we no longer know in advance whether a frame is expected to return a value or not. Due to uthread scheduling, switches occur between opcodes, and no values are transferred. When switching in the context of a function call, there *are* return values expected. This is all handled via some flags, in the frame entry code, line 948ff. Then, there are some very simple changes to the loop construct. Generally, more state variables are in the frames and kept up-to-date, like the instruction pointer. I'm omitting the extra code for uthread support here. Some functions were pulled out of the main loop, in order to make it smaller and easier to read. I would undo this today, since it makes comparison to the old version quite impossible, and it didn't yield more speed. This is about all of it. As you can see, there is no explicit support for co-anything in the code. There are some generalisations to frame calling and some callback hooks which actually do all co operations. An implementation targeted for core integration would look quite different. It would provide more functionality directly, without using callbacks. A pure coroutine based implementation as you proposed would not need the generalization of the frame parameter passing, since switches can only occur in the context of a function call. Supporting auto-scheduled uthreads needs to distinguish explicit and implicit switching, since implicit switching occurs between opcodes, not *in* opcodes. The techniques for this can be written in quite a different manner than I did. Again, this code is not intended for inclusion in the core, and not for drawing conclusions for the feasibility of Stackless at all. 
The latter has been shown by applications like the uthreads, and by its central use in the EVE game. We need to judge the principle, not the implementation. ciao - chris -- Christian Tismer :^) Mission Impossible 5oftware : Have a break! Take a ride on Python's Kaunstr. 26 : *Starship* http://starship.python.net 14163 Berlin : PGP key -> http://wwwkeys.pgp.net PGP Fingerprint E182 71C7 1A9D 66E9 9D15 D3CC D4D7 93E2 1FAE F6DF where do you want to jump today? http://www.stackless.com From phil@river-bank.demon.co.uk Wed Nov 8 12:26:54 2000 From: phil@river-bank.demon.co.uk (Phil Thompson) Date: Wed, 08 Nov 2000 12:26:54 +0000 Subject: [Python-Dev] What to choose to replace Tkinter? References: <049d01c0471f$d7899450$8119fea9@neil> Message-ID: <3A09468E.9E767720@river-bank.demon.co.uk> I've come to this discussion rather late... Most modern GUI toolkits have (or will soon have) the widgets to compete with Tk's strengths. The difficult question with a Tkinter replacement is the complete fragmentation of the GUI toolkit "market". I don't believe that you can, today, identify a toolkit that you are sure is going to have widespread support and the longevity needed (in 5 years time you don't want to be in the position you are in today with Tk). I see two alternatives... - make the Tkinter replacement an abstraction layer between Python and the *user's* choice of toolkit. The developer gets a consistent API, and toolkits can be adopted and dropped as fashions change. This is the approach taken by VeePee (http://www.thekompany.com/projects/vp/). - don't bundle Tkinter with Python. At least you then make people think a bit more about what they want from a toolkit and make an appropriate choice - let Tkinter's replacement be found by natural selection. At the very least let's have a more up-front presentation of the different options, strengths/weaknesses etc on the web site. Cameron must be getting bored of pointing people to his toolkit summary. 
For the record, Qt has a good Canvas widget, Unicode support, user selectable Windows/Mac/Unix look & feel etc, etc. Phil From guido@python.org Wed Nov 8 13:20:02 2000 From: guido@python.org (Guido van Rossum) Date: Wed, 08 Nov 2000 08:20:02 -0500 Subject: [Python-Dev] uthread strawman In-Reply-To: Your message of "Wed, 08 Nov 2000 13:23:46 +0200." <3A0937C2.903D42@tismer.com> References: <200011080212.PAA00313@s454.cosc.canterbury.ac.nz> <200011080258.VAA19690@cj20424-a.reston1.va.home.com> <3A0937C2.903D42@tismer.com> Message-ID: <200011081320.IAA21990@cj20424-a.reston1.va.home.com> > Again, this code is not intended for inclusion in the core, > and not for drawing conclusions for the feasibility of > Stackless at all. The latter has been shown by applications > like the uthreads, and by its central use in the EVE game. > We need to judge the priciple, not the implementaiton. Of course. Thanks by the way for the clear explanation of what needs to be done! --Guido van Rossum (home page: http://www.python.org/~guido/) From gward@mems-exchange.org Wed Nov 8 14:02:49 2000 From: gward@mems-exchange.org (Greg Ward) Date: Wed, 8 Nov 2000 09:02:49 -0500 Subject: [Python-Dev] Re: [Python-checkins] CVS: python/dist/src configure.in,1.177,1.178 configure,1.169,1.170 In-Reply-To: <20001107182621.I27208@xs4all.nl>; from thomas@xs4all.net on Tue, Nov 07, 2000 at 06:26:21PM +0100 References: <200011071544.HAA31147@slayer.i.sourceforge.net> <20001107182621.I27208@xs4all.nl> Message-ID: <20001108090249.A28202@ludwig.cnri.reston.va.us> On 07 November 2000, Thomas Wouters said: > Sorry for the late remark (I did see your earlier message) but after reading > the patch I realized 'OPT' isn't the right place for this. 'OPT' should be > for non-essential stuff: warnings, debug-info and optimizations. Removing > things from OPT shouldn't break anything, and neither should adding options > that fit in the categories above. (Barring broken compilers, of course.) 
> > Instead, the -fPIC option should be added to CFLAGS, I think. The Python > autoconf setup is slightly less versatile than most, though, since it's > doggone hard to near impossible to change things like OPT, CC, CFLAGS, etc, > without editing configure(.in) :P If noone else does it before me, I'll see > about fixing at least the -fPIC thing later, when I find some time ;P Good point -- fixing CFLAGS instead of OPT sounds right to me. I'm not really sure on where to do this, though. Ooh, I just noticed this in configure.in: # DG/UX requires some fancy ld contortions to produce a .so from an .a case $MACHDEP in dguxR4) LDLIBRARY='libpython$(VERSION).so' OPT="$OPT -pic" ;; No prize for guessing that "-pic" on the DG/UX compiler has a similar effect to GCC's -fPIC, and based on the comment this is required. I'm guessing this should be in CFLAGS as well. Oh wait: CFLAGS is not exported from configure.in -- it's *only* defined in the Makefile. From Makefile.in: CFLAGS= $(OPT) -I. $(DEFS) IOW, it looks like OPT is used for all non-preprocessor compiler flags, whether they're "frills" like optimization/debugging or not. Conclusion: my patch (add "-fPIC" to OPT instead of CFLAGS) does the right thing, but for the wrong reason. Fixing it would require a little more involved surgery on configure.in and the Makefiles. And it would also require reexamining every use of OPT in configure.in (not too hard, "grep -c OPT" only finds 16 matches). IMHO this would be a good thing: if we do it right, it should make it easier to tweak OPT, CC, CFLAGS and so forth at config time or at make time. I'm willing to spend some time on this; does anyone think it's a pointless waste of time? Greg From fdrake@acm.org Wed Nov 8 14:04:32 2000 From: fdrake@acm.org (Fred L. Drake, Jr.) 
Date: Wed, 8 Nov 2000 09:04:32 -0500 (EST) Subject: [Python-Dev] Re: [Python-checkins] CVS: python/dist/src configure.in,1.177,1.178 configure,1.169,1.170 In-Reply-To: <20001108090249.A28202@ludwig.cnri.reston.va.us> References: <200011071544.HAA31147@slayer.i.sourceforge.net> <20001107182621.I27208@xs4all.nl> <20001108090249.A28202@ludwig.cnri.reston.va.us> Message-ID: <14857.23920.646760.779849@cj42289-a.reston1.va.home.com> Greg Ward writes: > I'm willing to spend some time on this; does anyone think it's a > pointless waste of time? I'm certainly willing to allocate some of your cycles to this. ;) I'll even help test it once you've checked it in. Seriously, I think you're right -- there needs to be a separation of what's needed and optional stuff added from the make command line. -Fred -- Fred L. Drake, Jr. PythonLabs at Digital Creations From guido@python.org Wed Nov 8 14:21:20 2000 From: guido@python.org (Guido van Rossum) Date: Wed, 08 Nov 2000 09:21:20 -0500 Subject: [Python-Dev] Re: [Python-checkins] CVS: python/dist/src configure.in,1.177,1.178 configure,1.169,1.170 In-Reply-To: Your message of "Wed, 08 Nov 2000 09:02:49 EST." <20001108090249.A28202@ludwig.cnri.reston.va.us> References: <200011071544.HAA31147@slayer.i.sourceforge.net> <20001107182621.I27208@xs4all.nl> <20001108090249.A28202@ludwig.cnri.reston.va.us> Message-ID: <200011081421.JAA22160@cj20424-a.reston1.va.home.com> > Conclusion: my patch (add "-fPIC" to OPT instead of CFLAGS) does the > right thing, but for the wrong reason. Fixing it would require a little > more involved surgery on configure.in and the Makefiles. And it would > also require reexamining every use of OPT in configure.in (not too hard, > "grep -c OPT" only finds 16 matches). IMHO this would be a good thing: > if we do it right, it should make it easier to tweak OPT, CC, CFLAGS and > so forth at config time or at make time. > > I'm willing to spend some time on this; does anyone think it's a > pointless waste of time? 
No, please fix it right! --Guido van Rossum (home page: http://www.python.org/~guido/) From cgw@fnal.gov Wed Nov 8 14:23:47 2000 From: cgw@fnal.gov (Charles G Waldman) Date: Wed, 8 Nov 2000 08:23:47 -0600 (CST) Subject: [Python-Dev] Integer division transition In-Reply-To: References: <14856.35569.55094.245631@cj42289-a.reston1.va.home.com> Message-ID: <14857.25075.482728.276846@buffalo.fnal.gov> > > Charles G Waldman writes: > > > I think that making "div" an infix operator would be setting a > > > horrible precedent. Currently, all infix operators "look like" > > > operators, i.e. they are non-alphabetic characters, and things that > > > look like words are either functions or reserved words. > > Fred L. Drake, Jr.: > > Like "is", "in", "is not", and "not in"? > > Peter Funk writes: > And not to forget "and", "or" which were also infix operators from > the very beginning. So "div" is no precedent at all. OK, I stand corrected and feel suitably foolish. However I still think it's quite inconsistent to have divmod(a,b) but a div b. From fdrake@acm.org Wed Nov 8 14:38:51 2000 From: fdrake@acm.org (Fred L. Drake, Jr.) Date: Wed, 8 Nov 2000 09:38:51 -0500 (EST) Subject: [Python-Dev] Integer division transition In-Reply-To: <14857.25075.482728.276846@buffalo.fnal.gov> References: <14856.35569.55094.245631@cj42289-a.reston1.va.home.com> <14857.25075.482728.276846@buffalo.fnal.gov> Message-ID: <14857.25979.244131.879387@cj42289-a.reston1.va.home.com> Charles G Waldman writes: > OK, I stand corrected and feel suitably foolish. However I still > think it's quite inconsistent to have divmod(a,b) but a div b. I suspect div would be much more widely used than divmod(), which is essentially a performance optimization when you need both results. 
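For concreteness, a tiny example of the pair that divmod() bundles into one call (nothing new here, just the defining identity):

```python
# divmod(a, b) returns the quotient and remainder together,
# saving a second division when you need both.
q, r = divmod(17, 5)
print(q, r)                  # 3 2
assert q * 5 + r == 17       # the defining identity: a == q*b + r
assert 0 <= r < 5            # remainder takes the sign of the divisor
```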
One reason *not* to make divmod() an operator, aside from issues of legacy code, is that it really returns two results (never mind that it returns exactly one tuple); I can't think of another operator that conceptually returns two values. -Fred -- Fred L. Drake, Jr. PythonLabs at Digital Creations From pf@artcom-gmbh.de Wed Nov 8 15:32:22 2000 From: pf@artcom-gmbh.de (Peter Funk) Date: Wed, 8 Nov 2000 16:32:22 +0100 (MET) Subject: [Python-Dev] Re: [Python-checkins] CVS: python/dist/src/Demo/threads Coroutine.py,NONE,1.1 fcmp.py,NONE,1.1 squasher.py,NONE,1.1 README,1.7,1.8 In-Reply-To: <200011081517.HAA16919@slayer.i.sourceforge.net> from Guido van Rossum at "Nov 8, 2000 7:17:51 am" Message-ID: Hi, this rather old module still contains string exceptions. Since string exceptions are deprecated in favour of class based exceptions wouldn't it be better to tweak those few lines into class based exceptions now? > Add 1994 Coroutine module by Tim Peters [...] > Killed = 'Coroutine.Killed' > EarlyExit = 'Coroutine.EarlyExit' Regards, Peter From guido@python.org Wed Nov 8 15:42:41 2000 From: guido@python.org (Guido van Rossum) Date: Wed, 08 Nov 2000 10:42:41 -0500 Subject: [Python-Dev] Re: [Python-checkins] CVS: python/dist/src/Demo/threads Coroutine.py,NONE,1.1 fcmp.py,NONE,1.1 squasher.py,NONE,1.1 README,1.7,1.8 In-Reply-To: Your message of "Wed, 08 Nov 2000 16:32:22 +0100." References: Message-ID: <200011081542.KAA22738@cj20424-a.reston1.va.home.com> > this rather old module still contains string exceptions. > Since string exceptions are deprecated in favour of class based > exceptions wouldn't it be better to tweak those few lines into > class based exceptions now? > > > Add 1994 Coroutine module by Tim Peters > [...] > > Killed = 'Coroutine.Killed' > > EarlyExit = 'Coroutine.EarlyExit' No. This code is of historic interest only. Don't touch it please! 
--Guido van Rossum (home page: http://www.python.org/~guido/) From pf@artcom-gmbh.de Wed Nov 8 16:06:56 2000 From: pf@artcom-gmbh.de (Peter Funk) Date: Wed, 8 Nov 2000 17:06:56 +0100 (MET) Subject: [Python-Dev] Re: [Python-checkins] CVS: python/dist/src/Demo/threads Coroutine.py,NONE,1.1 fcmp.py,NONE,1.1 squasher.py,NONE In-Reply-To: <200011081542.KAA22738@cj20424-a.reston1.va.home.com> from Guido van Rossum at "Nov 8, 2000 10:42:41 am" Message-ID: I was nitpicking: > > this rather old module still contains string exceptions. > > Since string exceptions are deprecated in favour of class based > > exceptions wouldn't it be better to tweak those few lines into > > class based exceptions now? > > > > > Add 1994 Coroutine module by Tim Peters > > [...] > > > Killed = 'Coroutine.Killed' > > > EarlyExit = 'Coroutine.EarlyExit' Guido van Rossum answered: > No. This code is of historic interest only. Don't touch it please! Hmmmm.... I always thought of the Demo directory as a repository for example code, which may be used to teach Python programming to beginners. I know that some pieces are way out of date. But I think it would be a worthwhile goal to update at least some of those pieces step by step to reflect current Python coding habits. The README in Demo says: """This directory contains various demonstrations of what you can do with Python. [...]""" If you want to turn the Demo directory into a museum of code snippets of historic interest, at least the README should say so. ;-) Regards, Peter -- Peter Funk, Oldenburger Str.86, D-27777 Ganderkesee, Germany, Fax:+49 4222950260 office: +49 421 20419-0 (ArtCom GmbH, Grazer Str.8, D-28359 Bremen) From gmcm@hypernet.com Wed Nov 8 17:09:48 2000 From: gmcm@hypernet.com (Gordon McMillan) Date: Wed, 8 Nov 2000 12:09:48 -0500 Subject: [Python-Dev] uthread strawman In-Reply-To: <200011071614.LAA18276@cj20424-a.reston1.va.home.com> References: Your message of "Mon, 06 Nov 2000 16:23:48 +0200." 
<3A06BEF4.95B773BD@tismer.com> Message-ID: <3A09428C.12529.1044B070@localhost> [Guido] > Without continuations, but with microthreads (uthreads) or > coroutines, each (Python) stack frame can simply be "paused" at a > specific point and continued later. The semantics here are > completely clear (except perhaps for end cases such as unhandled > exceptions and intervening C stack frames). Exceptions require some thought, particularly because the "for" protocol uses an IndexError as a signal. In my own stuff I've found I need to catch all exceptions in the coroutine, primarily because I've always got resources to clean up, but clearly the implementation has to do the right thing when an exception crosses the boundary. > A strawman proposal: > > The uthread module provides the new functionality at the lowest > level. Uthread objects represent microthreads. An uthread has a > chain of stack frames linked by back pointers just like a regular > thread. Pause/resume operations are methods on uthread objects. > Pause/resume operations do not address specific frames but > specific uthreads; within an uthread the normal call/return > mechanisms can be used, and only the top frame in the uthread's > stack of call frames can be paused/resumed (the ones below it are > paused implicitly by the call to the next frame, and resumed when > that call returns). I'm not convinced (though I'm not asking you to convince me - I need to ponder some more) that this is the right approach. My worry is that to do coroutines, we end up with a bunch of machinery on top of uthreads, just like Tim's old coroutine stuff, or implementations of coroutines in Java. My mental test case is using coroutines to solve the impedance mismatch problem. SelectDispatcher is a simple example (write "client" code that looks like it's using blocking sockets, but multiplex them behind the curtain). Using a "pull" parser as a "push" parser is another case, (that is, letting it think it's doing its own reads). 
But what about using a "pull" lexer and a "pull" parser, but tricking them with coroutines so you can "push" text into them? Tim's implementation of the Dahl & Hoare example (which I rewrote in mcmillan-inc.com/tutorial4.html) shows you *can* do this kind of thing on top of a thread primitive, but might it not be much better done on a different primitive? Again, I'm not really asking for an answer, but I think this type of problem is not uncommon, and a wonderful use of coroutines; so I'm wondering if this is a good trade-off. > - u.yield() pauses the current uthread and resume the uthread > u where it was paused. The current uthread is resumed when > some other uthread calls its yield() method. Calling > uthread.current().yield() is a no-op. This doesn't seem like enough: sort of as though you designed a language in which "call" and "return" were spelled the same way. Certainly for coroutines and generators, people gravitate towards paired operations (eg. "suspend" and "resume"). Again, Tim's demonstrated you can do that on top of threads, but it sure seems to me like they should be primitives. > I think this API should enough to implement Gordon's > SelectDispatcher code. In general, it's easy to create a > scheduler uthread that schedules other uthreads. Thank you :-). > Open issues: > > - I'm not sure that I got the start conditions right. Should > func() be > be allowed to run until its first yield() when > uthread.new(func) is called? For coroutine stuff, that doesn't bother me. For uthreads, I'd think (like real threads) that creation and starting are different things. > - I'm not sure that the rules for returning and raising > exceptions > from func() are the right ones. > > - Should it be possible to pass a value to another uthread by > passing > an argument to u.yield(), which then gets returned by the > resumed yield() call in that uthread? Argument (in and / or out) passing is a necessity for generators and coroutines. 
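To make that asymmetry concrete, here is a toy sketch. (This is a latter-day illustration using generators as a stand-in for suspendable frames -- an assumption well beyond anything the interpreter offered at the time, not the proposed uthread API.) A uthread suspends *to* the scheduler, and only the scheduler resumes a chosen uthread; the two operations are distinct, unlike a single symmetric yield():

```python
# Toy round-robin scheduler: "suspend" is spelled yield (back to the
# scheduler); "resume" is the scheduler advancing a chosen uthread.
def worker(name, trace, steps):
    for i in range(steps):
        trace.append('%s.%d' % (name, i))
        yield                       # suspend back to the scheduler

trace = []
ready = [worker('a', trace, 2), worker('b', trace, 2)]
while ready:
    u = ready.pop(0)
    try:
        next(u)                     # resume: run until the next suspend
        ready.append(u)             # still alive -- requeue it
    except StopIteration:
        pass                        # uthread ran to completion
print(trace)                        # ['a.0', 'b.0', 'a.1', 'b.1']
```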
But, as mentioned above, I don't think a symmetrical "yield" is the right answer. > - How do uthreads interact with real threads? Uthreads are > explicitly > scheduled through yield() calls; real threads use preemptive > scheduling. I suppose we could create a new "main" uthread for > each real thread. But what if we yield() to an uthread that's > already executing in another thread? How is that error > detected? I think I would (perhaps naively) expect that I could create a uthread in one (real) thread, and then pass it off to another (real) thread to execute. Another post brings up GUIs and uthreads. I already expect that I'm going to have to dedicate a (real) thread to the GUI and ponder very carefully how that thread interacts with others. Of course, that's from lessons learned the hard way; but personally I'm not expecting uthreads / coroutines to make that any easier. [About continuations: while I love the fact that Christian has made these available for playing, I have so far not found them productive. I wrote a simple minded backtracking parser using them, but found it no better than a coroutine based one. But I am interested in how a *real* pervert (eg, Tim) feels about it - and no "Gee, that sounds like a *good* idea, boss", please.] - Gordon From akuchlin@mems-exchange.org Wed Nov 8 18:11:26 2000 From: akuchlin@mems-exchange.org (Andrew Kuchling) Date: Wed, 08 Nov 2000 13:11:26 -0500 Subject: [Python-Dev] Catalog-SIG created Message-ID: A mailing list for the Catalog SIG has been created, to discuss the design and construction of a Vaults/CPAN/LSM-like index for Python. Web pages for the SIG don't exist yet, but will be created soon. The SIG's charter: The Python Catalog SIG aims at producing a master index of Python software and other resources. It will begin by figuring out what the requirements are, converging on a design for the data schema, and producing an implementation. 
("Implementation" will almost certainly mean a set of CGI scripts for browsing the catalog, and may also contain a standard library module for automatically fetching & installing modules, if the SIG decides that's a worthwhile feature.) --amk From guido@python.org Wed Nov 8 18:21:39 2000 From: guido@python.org (Guido van Rossum) Date: Wed, 08 Nov 2000 13:21:39 -0500 Subject: [Python-Dev] Re: [Python-checkins] CVS: python/dist/src/Demo/threads Coroutine.py,NONE,1.1 fcmp.py,NONE,1.1 squasher.py,NONE In-Reply-To: Your message of "Wed, 08 Nov 2000 17:06:56 +0100." References: Message-ID: <200011081821.NAA23572@cj20424-a.reston1.va.home.com> > Hmmmm.... I always thought of the Demo directory as a repository for > example code, which may be used to teach Python programming to > beginners. I know that some pieces are way out of date. > > But I think it would be a worthwhile goal to update at least some of > those pieces step by step to reflect current Python coding habits. > The README in Demo says: > > """This directory contains various demonstrations of what you can do with > Python. [...]""" > > If you want to turn the Demo directory into a museum of code snippets of > historic interest, at least the README should say so. ;-) If you want to make a proposal for reorganizing the Demo directory, please do so. There are more important problems with the Demo directory than the fact that some code still uses string exceptions. Please don't start fixing the small nits without seeing the big picture. (That's all I have time for now.) --Guido van Rossum (home page: http://www.python.org/~guido/) From tismer@tismer.com Wed Nov 8 17:20:43 2000 From: tismer@tismer.com (Christian Tismer) Date: Wed, 08 Nov 2000 19:20:43 +0200 Subject: [Python-Dev] uthread strawman References: Your message of "Mon, 06 Nov 2000 16:23:48 +0200." 
<3A06BEF4.95B773BD@tismer.com> <3A09428C.12529.1044B070@localhost> Message-ID: <3A098B6B.522EA5E8@tismer.com> Gordon McMillan wrote: [snipped all the good stuff away for bigger brains than mine ] > [About continuations: while I love the fact that Christian has made > these available for playing, I have so far not found them productive. I > wrote a simple minded backtracking parser using them, but found it no > better than a coroutine based one. But I am interested in how a *real* > pervert (eg, Tim) feels about it - and no "Gee, that sounds like a *good* > idea, boss", please.] Yes, I saw Gordon making heavy use of naked continuations, but actually they were not really what he needed. I believe this since he made much use of co.update(), which moves a continuation to the most current state of the frame. In fact, what Gordon would need (and most probably most of us as well) is just the handle to a frame, and the ability to switch to it. In Gordon's case, these would probably be "continuations" which are not frozen, but simply track the frame as it is. I'm not absolutely sure, but quite. I'm happy to toss continuations for core Python, if we can find the right building blocks for coro/gen/uthreads. I think Guido comes quite near this, already. ciao - chris -- Christian Tismer :^) Mission Impossible 5oftware : Have a break! Take a ride on Python's Kaunstr. 26 : *Starship* http://starship.python.net 14163 Berlin : PGP key -> http://wwwkeys.pgp.net PGP Fingerprint E182 71C7 1A9D 66E9 9D15 D3CC D4D7 93E2 1FAE F6DF where do you want to jump today? http://www.stackless.com From fdrake@acm.org Wed Nov 8 19:50:50 2000 From: fdrake@acm.org (Fred L. Drake, Jr.) Date: Wed, 8 Nov 2000 14:50:50 -0500 (EST) Subject: [Python-Dev] Re: [Distutils] Catalog-SIG created In-Reply-To: References: Message-ID: <14857.44698.672928.206695@cj42289-a.reston1.va.home.com> Mark W. Alexander writes: > Is there another way to subscribe, or did I just jump the gun?
You can get to the Mailman interface at: http://www.python.org/mailman/listinfo/catelog-sig/ The Web pages aren't actually there yet; Andrew will get to this when he can, I'm sure. ;) -Fred -- Fred L. Drake, Jr. PythonLabs at Digital Creations From mal@lemburg.com Wed Nov 8 20:34:49 2000 From: mal@lemburg.com (M.-A. Lemburg) Date: Wed, 08 Nov 2000 21:34:49 +0100 Subject: [Python-Dev] Re: Class/type dichotomy thoughts References: <20001107225758.B77651301D9@oratrix.oratrix.nl> Message-ID: <3A09B8E9.E247125F@lemburg.com> Jack Jansen wrote: > > > > In other words, what does this new type_id thing > > > actually *mean*? > > > > For the interpreter it means that it can assume the type > > interface to be binary compatible to the "original" > > type, e.g. by setting the flag to say PyDict_TypeID > > the type assures that all PyDict_*() APIs will work > > on the type -- basically the same thing as PyDict_Check() > > does now except that the type object needn't be the same > > anymore. > > I would be _very_ happy if this single type_id could somehow be > replaced by an array, or a bitset. I guess a bit array would be a possibility... #define PyList_Check(obj) ((obj)->ob_type->\ capabilities[Py_ListType->cap_slot]) cap_slot could be set at type object creation time using some Python slot id generator (a function which outputs integers up to the maximum length of the capabilities array and raises a Py_FatalError() if this limit is exceeded). > I have a lot of types in MacPython that are acceptable to the APIs of > other types, a sort of poor-mans-inheritance scheme. For instance, all > operating system calls that accept a MacOS WindowPtr will also happily > accept a DialogPtr. Major magic is needed to get this to work > reasonably in Python, and the Python user can still accidentally mess > up the refcounting scheme and free things s/he isn't aware of. > > As the number of types in a given run of the interpreter appears to be > limited (am I right here?)
and type-identity-tests are valid within a > single interpreter run only (am I right here?) Right * 2 > an API like > typeindex = Py_TypeToTypeIndex(typeobject); > which would use a dictionary as storage for the mapping and generate > the index numbers on the fly would do the trick. Call it once during > module initialization and the > Py_ISOBJECTCOMPATIBLEWITH(object, typeindex) > macro would be a oneliner to test the bit in the set. > > A few hundred bits in the set would get us a long way, I guess. One thing I'm unsure about is whether changing cap_slot ids between runs of the interpreter is a good idea. Also, I think that the basic types should be given constant cap_slot ids to enhance performance (the id generator could be made to start at say 10 and the basic types be fixed in the range 0-9). -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/ From trentm@ActiveState.com Wed Nov 8 21:07:39 2000 From: trentm@ActiveState.com (Trent Mick) Date: Wed, 8 Nov 2000 13:07:39 -0800 Subject: [Python-Dev] Re: [Distutils] Catalog-SIG created In-Reply-To: <14857.44698.672928.206695@cj42289-a.reston1.va.home.com>; from fdrake@acm.org on Wed, Nov 08, 2000 at 02:50:50PM -0500 References: <14857.44698.672928.206695@cj42289-a.reston1.va.home.com> Message-ID: <20001108130739.H27185@ActiveState.com> On Wed, Nov 08, 2000 at 02:50:50PM -0500, Fred L. Drake, Jr.
wrote: > http://www.python.org/mailman/listinfo/catelog-sig/ .replace("catelog", "catalog") :) -- Trent Mick TrentM@ActiveState.com From jeremy@alum.mit.edu Thu Nov 9 00:14:45 2000 From: jeremy@alum.mit.edu (Jeremy Hylton) Date: Wed, 8 Nov 2000 19:14:45 -0500 (EST) Subject: [Python-Dev] Re: [Patch #102337] revised CALL_FUNCTION implementation In-Reply-To: <200011090008.QAA22011@sf-web2.i.sourceforge.net> References: <200011090008.QAA22011@sf-web2.i.sourceforge.net> Message-ID: <14857.60533.766291.786182@bitdiddle.concentric.net> --QTUWfOmTKZ Content-Type: text/plain; charset=us-ascii Content-Description: message body text Content-Transfer-Encoding: 7bit I just uploaded a patch to sourceforge that revises the CALL_FUNCTION implementation to use a bunch of functions and avoid the long block of inline code in eval_code2. The overall performance of the patch is about the same. The current patch causes a big slowdown for the use of keyword arguments, but is otherwise as fast or faster than the old version. The keyword slowdown should be avoidable without too much effort. I wrote a short benchmark that demonstrates the effect on many variations of function calls. The output lists the test name, and the median, min, and max time of 10 executions. (The benchmark script is attached to this message.)
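The scheme itself is easy to sketch: time each call variant over many iterations, repeat the measurement, and report the median along with the min and max. A minimal sketch of such a harness (illustrative names only, not the attached script):

```python
import time

def time_call(func, n=10000):
    # time n calls of func(i) and return the elapsed seconds
    t0 = time.time()
    for i in range(n):
        func(i)
    t1 = time.time()
    return t1 - t0

def summarize(func, runs=5, n=10000):
    # repeat the measurement and report (median, min, max)
    times = sorted(time_call(func, n) for _ in range(runs))
    return times[len(times) // 2], times[0], times[-1]

def arg1(x):
    return x

med, lo, hi = summarize(arg1)
assert lo <= med <= hi
```

Reporting the median rather than the mean keeps a single slow run (e.g. a scheduling hiccup) from skewing the result.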
The vanilla CVS tree produces these results:

time_arg1 0.09 0.09 0.11
time_arg2 0.1 0.09 0.12
time_arg3 0.11 0.1 0.12
time_fact 0.12 0.1 0.14
time_meth 0.1 0.09 0.11
time_umeth 0.11 0.1 0.12
time_builtin 0.1 0.08 0.11
time_callable 0.37 0.33 0.38
time_keyword 0.14 0.13 0.18
time_star 0.12 0.12 0.14
time_kw 0.25 0.24 0.27
time_kw2 0.69 0.66 0.73
time_starkw 0.24 0.23 0.26
time_init 0.64 0.63 0.68
total 3.18

The CVS tree with the CALL_FUNCTION patch applied produces these results:

time_arg1 0.09 0.09 0.1
time_arg2 0.1 0.09 0.1
time_arg3 0.11 0.09 0.13
time_fact 0.11 0.11 0.14
time_meth 0.09 0.09 0.1
time_umeth 0.1 0.1 0.11
time_builtin 0.08 0.07 0.09
time_callable 0.35 0.34 0.38
time_keyword 0.42 0.4 0.44 (*** big slowdown ***)
time_star 0.13 0.13 0.15
time_kw 0.25 0.23 0.29
time_kw2 0.66 0.61 0.79
time_starkw 0.24 0.22 0.27
time_init 0.66 0.6 0.72
total 3.39

Jeremy

--QTUWfOmTKZ
Content-Type: text/plain
Content-Disposition: inline; filename="callbench.py"
Content-Transfer-Encoding: 7bit

import time

MANY = 40000
AFEW = 2000

def time_arg1(iters=range(MANY)):
    def arg1(x):
        return x
    t0 = time.clock()
    for i in iters:
        arg1(i)
    t1 = time.clock()
    return t1 - t0

def time_arg2(iters=range(MANY)):
    def arg2(x, y):
        return y
    t0 = time.clock()
    for i in iters:
        arg2(i, i)
    t1 = time.clock()
    return t1 - t0

def time_arg3(iters=range(MANY)):
    def arg3(x, y, z):
        return z
    t0 = time.clock()
    for i in iters:
        arg3(i, i, i)
    t1 = time.clock()
    return t1 - t0

def fact(n):
    if n == 0:
        return 1L
    else:
        return n * fact(n - 1)

def time_fact(iters=range(AFEW)):
    t0 = time.clock()
    for i in iters:
        fact(10)
    t1 = time.clock()
    return t1 - t0

class Foo:
    def method(self, x):
        return x

def time_meth(iters=range(MANY)):
    inst = Foo()
    meth = inst.method
    t0 = time.clock()
    for i in iters:
        meth(i)
    t1 = time.clock()
    return t1 - t0

def time_umeth(iters=range(MANY)):
    inst = Foo()
    meth = Foo.method
    t0 = time.clock()
    for i in iters:
        meth(inst, i)
    t1 = time.clock()
    return t1 - t0

def time_builtin(iters=range(MANY)):
    l = []
    func = l.count
    t0 = time.clock()
    for i in iters:
        func(i)
    t1 = time.clock()
    return t1 - t0

class Bar:
    def __call__(self, x):
        return x

def time_callable(iters=range(MANY)):
    inst = Bar()
    t0 = time.clock()
    for i in iters:
        inst(i)
    t1 = time.clock()
    return t1 - t0

def time_keyword(iters=range(MANY)):
    def kwfunc(a=None, b=None):
        return a + b
    t0 = time.clock()
    for i in iters:
        kwfunc(a=i, b=i)
    t1 = time.clock()
    return t1 - t0

def time_star(iters=range(MANY)):
    def star(a, b, c):
        return b
    arg = 1, 2, 3
    t0 = time.clock()
    for i in iters:
        star(*arg)
    t1 = time.clock()
    return t1 - t0

def time_kw(iters=range(MANY)):
    def kw(a=0, b=0):
        return a * b
    dict = {'a': 1, 'b': -1}
    t0 = time.clock()
    for i in iters:
        kw(**dict)
    t1 = time.clock()
    return t1 - t0

def time_kw2(iters=range(MANY)):
    def kw(a=0, b=0, **d):
        return d.values()[a]
    d = {'a':1, 'c':2, 'd':3}
    t0 = time.clock()
    for i in iters:
        kw(**d)
    t1 = time.clock()
    return t1 - t0

def time_starkw(iters=range(MANY)):
    def func(a, b, c=None):
        return b
    t = (1,)
    d = {'c':1}
    t0 = time.clock()
    for i in iters:
        func(i, *t, **d)
    t1 = time.clock()
    return t1 - t0

class Test:
    def __init__(self, arg=None):
        pass

def time_init(iters=range(MANY)):
    constructor = Test
    t0 = time.clock()
    for i in iters:
        constructor(arg=i)
    t1 = time.clock()
    return t1 - t0

def median(x):
    x.sort()
    return x[len(x)/2]

sum = 0.0
for func in (time_arg1, time_arg2, time_arg3, time_fact,
             time_meth, time_umeth, time_builtin, time_callable,
             time_keyword, time_star, time_kw, time_kw2,
             time_starkw, time_init,
             ):
    times = [func() for i in range(9)]
    med = median(times)
    print func.func_name, med, min(times), max(times)
    sum += med
print "total", sum

--QTUWfOmTKZ--

From jeremy@alum.mit.edu Thu Nov 9 00:25:11 2000 From: jeremy@alum.mit.edu (Jeremy Hylton) Date: Wed, 8 Nov 2000 19:25:11 -0500 (EST) Subject: [Python-Dev] Re: [Patch #102337] revised CALL_FUNCTION implementation In-Reply-To: <14857.60533.766291.786182@bitdiddle.concentric.net> References: <200011090008.QAA22011@sf-web2.i.sourceforge.net>
<14857.60533.766291.786182@bitdiddle.concentric.net> Message-ID: <14857.61159.339908.704603@bitdiddle.concentric.net> Looks like I jumped the gun with my last message. A trivial change to the logic prevented the keyword arguments slowdown. I've revised the SF patch. The new numbers show that the patch is just a tad faster on the benchmark. (And the difference between gcc -O2 and -O3 makes a big difference for this ceval-intensive benchmark.) Jeremy

time_arg1 0.1 0.09 0.12
time_arg2 0.1 0.1 0.12
time_arg3 0.1 0.1 0.12
time_fact 0.12 0.11 0.13
time_meth 0.1 0.09 0.11
time_umeth 0.11 0.1 0.12
time_builtin 0.09 0.08 0.1
time_callable 0.36 0.34 0.41
time_keyword 0.13 0.13 0.14
time_star 0.12 0.12 0.13
time_kw 0.23 0.22 0.24
time_kw2 0.64 0.61 0.66
time_starkw 0.24 0.23 0.26
time_init 0.64 0.64 0.69
total 3.08

From tim.one@home.com Thu Nov 9 08:44:07 2000 From: tim.one@home.com (Tim Peters) Date: Thu, 9 Nov 2000 03:44:07 -0500 Subject: [Python-Dev] uthread strawman In-Reply-To: <200011071614.LAA18276@cj20424-a.reston1.va.home.com> Message-ID: [Guido] [examples of presumably fuzzy semantics in the presence of continuations] > ... > But now consider this: the "evaluation stack" contained in each > frame could easily be replaced by a bunch of temporary variables, > if we had a slightly different instruction set ... > Then would we save those temporary variables or not? Yes, provided they remained anonymous (to the user) they're simply internal implementation details, and continuations preserve all things "like that". It's only visible name bindings that continuations let change across resumptions. > it can make a difference! Indeed yes. > ... > But if the compiler were to use temporary variables instead of the > evaluation stack, they might not have been restored! The technical term for that would be "bug" <0.5 wink>. Christian already covered the for-loop example.
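To make the for-loop point concrete: when a suspended frame is resumed, a *named* loop variable is part of the restored frame state, while anonymous stack temporaries are implementation details the user never sees. A small illustration of resumable loop state, using a yield-based suspension in the generator style discussed in this thread (a later-Python sketch, not Stackless continuations):

```python
def counter(seq):
    # the named binding 'x' lives in this frame; each yield
    # suspends the frame, and each resumption restores it,
    # loop index included
    for x in seq:
        yield x

it = counter([1, 2, 3])
first = next(it)   # frame suspends with x bound to 1
rest = list(it)    # resuming restores x and continues the loop
assert first == 1 and rest == [2, 3]
```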
A more interesting variation is a for loop without a named "indexing vrbl": for x in sequence: etc Scheme has nothing direct to say about this because Scheme has no loops. Writing it as a recursive function instead leads to the same kind of result, though. > ... > This semantic imprecision is one of the reasons why I don't like the > concept of continuations. It's clearer in Scheme because Scheme has fewer "primitive concepts" than Python. > (I've been told that the exact semantics of continuations in Scheme > differ greatly between Scheme implementations.) I believe you've been told that, but not by a clueful Scheme user! Continuations are rigorously well-defined in Scheme. What *isn't* well-defined in Scheme is order of evaluation in many cases, so an expression like (+ (f) (g)) can display wildly different behavior across implementations if one or both of the functions {f, g}, directly or indirectly, creates or invokes a continuation (say that g does: it's not defined whether f has been invoked by the time g is -- indeed, it's even OK to invoke f and g in parallel). Note: the Python eye sees something like that and leaps to the rule "OK, chowderhead, so don't muck with continuations in the middle of expressions!". What that misses is that *everything* in Scheme is "an expression". Scheme implementations do "differ greatly" in *this* respect. BTW, note that Scheme implementations can also display wildly different behavior if f and g merely have side effects (i.e., this really has nothing to do with continuations specifically: they're just another form of side-effect you can't always predict without knowing the order of evaluation first). [skipping the proposal because we talked about it instead] [Gordon McMillan] > ... > [About continuations: while I love the fact that Christian has made > these available for playing, I have so far not found them productive. 
I > wrote a simple minded backtracking parser using them, but found it no > better than a coroutine based one. But I am interested in how a *real* > pervert (eg, Tim) feels about it - and no "Gee, that sounds like a *good* > idea, boss", please.] I don't know of any comprehensible application of continuations that can't be done without them. The real appeal of continuations is in their theoretical elegance (they're a single mechanism that can be used to build all sorts of stuff, much as all of "if" and "while" and "for" can be built out of compares and gotos -- continuations can be used to implement all of calls, non-resumable and resumable exceptions, generators, coroutines, non-deterministic evaluation, "thread like" stuff, ...). In any specific case, though, you want to live at the higher level, not at the raw continuation level. WRT a backtracking parser, note that this is what Icon *lives for*, and generators alone suffice (and are indeed very pleasant) for natural expression of that task. Icon became what it is when Griswold decided to open up the backtracking pattern-matching engine underlying SNOBOL4, and make it the basis for all expression evaluation. It took him 3 full languages (SL5 and Rebus came before Icon) and 20 years to get this right. Building a backtracking parser directly out of continuations sounds to me mostly painful. Building generators out of continuations *is* painful (I've done it). Curiously, the symmetry of coroutines appears to make building them out of continuations easier (than building generators). I'm not in love w/ continuations: I *could* be if Guido got Continuation Religion and wanted to redo exceptions and calls on top of continuations too, but given that whatever happens here is destined (Christian will say "doomed" ) to co-exist with everything that's already here, the appeal of continuations is minimal. 
I've got no desire to play with novel new control structures in Python (note that I don't consider generators-- or even coroutines --to be "novel", not after they've been in multiple languages for more than 30 years), and Python doesn't have the syntactic flexibility that makes such experiments *pleasant* in Scheme anyway. So it's enough for me if Python supports the handful of new control-flow gimmicks (generators, coroutines, maybe uthreads) tasteful *users* ask for; if we don't need continuations for those, fine by me. BTW, I'd also like to pickle a pure-Python computation in mid-stream, save it to disk, and resume it later after a reboot (or on another machine!); we don't need continuations for that either (although we do need much of what Stackless does). switching-from-redefining-truth-to-redefining-falsehood-ly y'rs - tim From tismer@tismer.com Thu Nov 9 11:39:48 2000 From: tismer@tismer.com (Christian Tismer) Date: Thu, 09 Nov 2000 13:39:48 +0200 Subject: [Python-Dev] uthread strawman References: Message-ID: <3A0A8D04.3881E4FC@tismer.com> Tim Peters wrote: ... > Building a backtracking parser directly out of continuations sounds to me > mostly painful. Building generators out of continuations *is* painful (I've > done it). Curiously, the symmetry of coroutines appears to make building > them out of continuations easier (than building generators). Some things work very well, built with plain continuations. See the attached ICON-style generator/backtrack framework (going to post the 8 queens puzzle, soon). > I'm not in love w/ continuations: I *could* be if Guido got Continuation > Religion and wanted to redo exceptions and calls on top of continuations > too, but given that whatever happens here is destined (Christian will say > "doomed" ) to co-exist with everything that's already here, the appeal > of continuations is minimal. 
I've got no desire to play with novel new > control structures in Python (note that I don't consider generators-- or > even coroutines --to be "novel", not after they've been in multiple > languages for more than 30 years), and Python doesn't have the syntactic > flexibility that makes such experiments *pleasant* in Scheme anyway. That's a very good point. Tricking Python to make continuations useful is a pain in the a** and has led me to a quite weird API. After the "sane" constructs are defined well, there is not much reason to support continuations in the first place. > So it's enough for me if Python supports the handful of new control-flow > gimmicks (generators, coroutines, maybe uthreads) tasteful *users* ask for; > if we don't need continuations for those, fine by me. BTW, I'd also like to > pickle a pure-Python computation in mid-stream, save it to disk, and resume > it later after a reboot (or on another machine!); we don't need > continuations for that either (although we do need much of what Stackless > does). There is one application of continuations which I still consider worthy. I'm sure that many people find it incredibly ugly. Using continuations, I can build method-like functions without classes and instances, which perform incredibly fast. This cannot be done with simple one-shot continuations; of course a class method would do the same, but slower: This is a function with expensive initialization and many local variables involved. After initializing, the continuation of *** is returned as a callable object. All initialization is done, all locals are set, and now we can pull out many results by repeatedly calling this continuation. This cannot be modelled as efficiently today with classes. ciao - chris p.s.: Here the simple ICON-like generator/backtrack framework.
---------------------------------------------------------------------
import continuation

class control:
    """ ICON style generators """

    def __init__(self):
        # the chain of alternatives is a stack of tuples
        self.more = None

    def suspend(self, value):
        """ return a value, but keep the caller for re-use """
        # push the caller on the alternatives stack
        self.more = (continuation.caller(), self.more)
        # simulate a return of the caller with the current value
        continuation.caller(2)(value)

    def fail(self):
        """ restart an alternative, if there is some.
            Otherwise do nothing """
        if self.more:
            back, self.more = self.more
            back()

    def clear(self):
        """ clear alternatives stack """
        self.more = None

    def asn(self, old, val):
        """ an undoable assignment. Usage:
                var = c.asn(var, value)
            Like the ICON operator "<-" """
        self.suspend(val)
        print "asn returning"
        return(old)

    def choice(self, *args):
        """ iterator over a fixed sequence of values
            Like the ICON operator "|" """
        if len(args) == 1:
            args = args[0]
        for val in args[:-1]:
            self.suspend(val)
        return args[-1]

    # the above works only for sequences of known size.
    # The version below is better since it does not need to
    # know the size, but it has to do a look-ahead.

    def choice(self, *args):
        """ iterator over a general sequence of values
            Like the ICON operator "|" """
        if len(args) == 1:
            args = args[0]
        # problem: how do we *return* the last element for any sequence?
        # solution: do a look-ahead by 1
        first = 1
        for nextval in args:
            if first:
                val = nextval
                first = 0
                continue
            self.suspend(val)
            val = nextval
        return val

    def to_by(self, a, b=None, c=None):
        """ a loop construct, modelled after the ICON
            for .. to .. by expression, but using xrange style """
        if c is None:
            if b is None:
                iterator = xrange(a)
            else:
                iterator = xrange(a,b)
        else:
            iterator = xrange(a,b,c)
        # problem: how do we *return* the last element?
        # solution: splitting the xrange is handy!
        for i in iterator[:-1]:
            self.suspend(i)
        return iterator[-1]

# trying to get *slightly* more structure, here an
# attempt to introduce something like "every".
---------------------------------------------------------------------
-- Christian Tismer :^) Mission Impossible 5oftware : Have a break! Take a ride on Python's Kaunstr. 26 : *Starship* http://starship.python.net 14163 Berlin : PGP key -> http://wwwkeys.pgp.net PGP Fingerprint E182 71C7 1A9D 66E9 9D15 D3CC D4D7 93E2 1FAE F6DF where do you want to jump today? http://www.stackless.com From tismer@tismer.com Thu Nov 9 13:04:00 2000 From: tismer@tismer.com (Christian Tismer) Date: Thu, 09 Nov 2000 15:04:00 +0200 Subject: [Python-Dev] uthread strawman References: <3A0A8D04.3881E4FC@tismer.com> Message-ID: <3A0AA0C0.EA9C137B@tismer.com> Christian Tismer wrote: I wrote: > There is one application of continuations which I still consider > worthy. I'm shure that many people find it incredibly ugly. > Using continuations, I can build method-like functions > without classes and instances, which perform incredibly > fast. This cannot be done with simple one-shot continuations; > of course a class method would do the same, but slower: > > This is a function with expensive initialization and many local variables involved. After initializing, the continuation of *** is returned as a callable object. All initialization is done, all locals are set, and now we can pull out many results by repeatedly calling this continuation. This cannot be modelled as efficiently today with classes. ciao - chris p.s.: Here the simple ICON-like generator/backtrack framework.
---------------------------------------------------------------------
import continuation

class control:
    """ ICON style generators """

    def __init__(self):
        # the chain of alternatives is a stack of tuples
        self.more = None

    def suspend(self, value):
        """ return a value, but keep the caller for re-use """
        # push the caller on the alternatives stack
        self.more = (continuation.caller(), self.more)
        # simulate a return of the caller with the current value
        continuation.caller(2)(value)

    def fail(self):
        """ restart an alternative, if there is some.
            Otherwise do nothing """
        if self.more:
            back, self.more = self.more
            back()

    def clear(self):
        """ clear alternatives stack """
        self.more = None

    def asn(self, old, val):
        """ an undoable assignment. Usage:
                var = c.asn(var, value)
            Like the ICON operator "<-" """
        self.suspend(val)
        print "asn returning"
        return(old)

    def choice(self, *args):
        """ iterator over a fixed sequence of values
            Like the ICON operator "|" """
        if len(args) == 1:
            args = args[0]
        for val in args[:-1]:
            self.suspend(val)
        return args[-1]

    # the above works only for sequences of known size.
    # The version below is better since it does not need to
    # know the size, but it has to do a look-ahead.

    def choice(self, *args):
        """ iterator over a general sequence of values
            Like the ICON operator "|" """
        if len(args) == 1:
            args = args[0]
        # problem: how do we *return* the last element for any sequence?
        # solution: do a look-ahead by 1
        first = 1
        for nextval in args:
            if first:
                val = nextval
                first = 0
                continue
            self.suspend(val)
            val = nextval
        return val

    def to_by(self, a, b=None, c=None):
        """ a loop construct, modelled after the ICON
            for .. to .. by expression, but using xrange style """
        if c is None:
            if b is None:
                iterator = xrange(a)
            else:
                iterator = xrange(a,b)
        else:
            iterator = xrange(a,b,c)
        # problem: how do we *return* the last element?
        # solution: splitting the xrange is handy!
        for i in iterator[:-1]:
            self.suspend(i)
        return iterator[-1]

# trying to get *slightly* more structure, here an
# attempt to introduce something like "every".
---------------------------------------------------------------------
-- Christian Tismer :^) Mission Impossible 5oftware : Have a break! Take a ride on Python's Kaunstr. 26 : *Starship* http://starship.python.net 14163 Berlin : PGP key -> http://wwwkeys.pgp.net PGP Fingerprint E182 71C7 1A9D 66E9 9D15 D3CC D4D7 93E2 1FAE F6DF where do you want to jump today? http://www.stackless.com