From hpk at trillke.net  Sun Jan  2 14:00:02 2005
From: hpk at trillke.net (holger krekel)
Date: Sun, 2 Jan 2005 14:00:02 +0100
Subject: [py-dev] revising exception handling / help appreciated
Message-ID: <20050102130002.GA32271@solar.trillke.net>

Hi py-dev,

i was wondering if anybody has helpful opinions or suggestions on the
following so far rather undocumented feature regarding exception
handling.  It's one of the fundamental things that prevents releasing
the py lib and py.test.

Currently, the local path implementation in particular converts
IOErrors/OSErrors to specific new classes derived from the "errno"
attribute of those errors.  This makes it possible to do something
like:

    try:
        do_something_with(path)
    except py.path.NotFound:
        ...
    except py.path.PermissionDenied:
        ...

and so on.  Other path implementations (especially the subversion
ones) try to raise such exceptions as well.

Now the question is whether the above exceptions shouldn't better
have their names derived from what is in the "errno" module.
Referring to these posix-standard names avoids having to learn new
names and their semantics.  I would hope these names also make sense
in non-unix environments.  The above example would probably become:

    try:
        do_something_with(path)
    except py.error.ENOENT:
        ...
    except py.error.EPERM:
        ...

This error conversion would/should then more easily become pervasive
throughout the library, e.g. py.execnet
(http://codespeak.net/py/current/doc/execnet.html) would probably
also raise such exceptions if applicable.  Also the svn
implementations would try to fit exceptions as well as possible into
this naming.

Exceptions arising in path implementations may even derive from
something like py.path.Error in order to allow easier
"catch-all-path-errors" handling.  Not sure about this, though, as
the distinction may be too vague.

Any opinions?
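[Editorial note: for concreteness, the errno-to-exception-class conversion proposed above could be sketched roughly as follows. This is a hedged sketch in modern Python, not the py lib's actual code; `Error`, `error_class` and `checked_call` are hypothetical names.]

```python
import errno


class Error(OSError):
    """Hypothetical base class for all errno-derived exceptions."""


_errno2class = {}


def error_class(eno):
    """Return (creating on demand) an exception class named after the
    symbolic errno name, e.g. ENOENT for errno 2."""
    cls = _errno2class.get(eno)
    if cls is None:
        name = errno.errorcode.get(eno, "UnknownErrno%d" % eno)
        cls = type(name, (Error,), {})
        _errno2class[eno] = cls
    return cls


def checked_call(func, *args):
    """Invoke func and re-raise any OSError as its errno-specific
    class, so callers can write errno-named except clauses."""
    try:
        return func(*args)
    except OSError as e:
        raise error_class(e.errno)(e.errno, e.strerror) from e
```

With such a scheme the exception classes need no hand-written catalogue; they are derived mechanically from errno.errorcode, which is the main attraction of the errno-name proposal.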
cheers and a happy new year,

    holger

From florian.proff.schulze at gmx.net  Sun Jan  2 14:16:47 2005
From: florian.proff.schulze at gmx.net (Florian Schulze)
Date: Sun, 02 Jan 2005 14:16:47 +0100
Subject: [py-dev] revising exception handling / help appreciated
In-Reply-To: <20050102130002.GA32271@solar.trillke.net>
References: <20050102130002.GA32271@solar.trillke.net>
Message-ID: 

On Sun, 2 Jan 2005 14:00:02 +0100, holger krekel wrote:

>     try:
>         do_something_with(path)
>     except py.path.NotFound:
>         ...
>     except py.path.PermissionDenied:
>         ...
>
> [or better]
>
>     try:
>         do_something_with(path)
>     except py.error.ENOENT:
>         ...
>     except py.error.EPERM:
>         ...

I like the first much better; not everyone knows the constants from C
and they aren't very intuitive.  I think readability is more valuable
than the small amount of learning needed by people who know C.

Maybe the names should have Error at the end, like the other Python
exceptions:

    try:
        do_something_with(path)
    except py.path.NotFoundError:
        ...
    except py.path.PermissionDeniedError:
        ...

Regards,
Florian Schulze

From hpk at trillke.net  Sun Jan  2 15:18:26 2005
From: hpk at trillke.net (holger krekel)
Date: Sun, 2 Jan 2005 15:18:26 +0100
Subject: [py-dev] revising exception handling / help appreciated
In-Reply-To: 
References: <20050102130002.GA32271@solar.trillke.net>
Message-ID: <20050102141826.GO28060@solar.trillke.net>

Hi Florian,

On Sun, Jan 02, 2005 at 14:16 +0100, Florian Schulze wrote:
> On Sun, 2 Jan 2005 14:00:02 +0100, holger krekel wrote:
>
> >     try:
> >         do_something_with(path)
> >     except py.path.NotFound:
> >         ...
> >     except py.path.PermissionDenied:
> >         ...
> >
> > [or better]
> >
> >     try:
> >         do_something_with(path)
> >     except py.error.ENOENT:
> >         ...
> >     except py.error.EPERM:
> >         ...
>
> I like the first much better; not everyone knows the constants from
> C and they aren't very intuitive.  I think readability is more
> valuable than the small amount of learning needed by people who
> know C.
I agree that the names in the first example appear more readable.
OTOH, it's harder to guess the exact meaning of these "nice" names.
It at least causes thinking and documentation effort.  Whereas with
"errno classes" we can luckily just say: google it, since these weird
abbreviations are easily found as the first hit.  Note e.g. that
PermissionDenied may mean either "EACCES" or "EPERM"; see e.g.
http://www.wlug.org.nz/EPERM for the difference.

Either way, the current exception/errno conversion hacks are too much
tied to "py.path", so something has to happen.  Maybe we can retain
the "nice names" and let them be a tuple of py.error.*'s or so.

> Maybe the names should have Error at the end, like the other Python
> exceptions.

This is a problem that py.error.* would not have :-)

cheers,

    holger

From lac at strakt.com  Sun Jan  2 16:22:01 2005
From: lac at strakt.com (Laura Creighton)
Date: Sun, 02 Jan 2005 16:22:01 +0100
Subject: [py-dev] revising exception handling / help appreciated
In-Reply-To: Message from holger krekel of "Sun, 02 Jan 2005 15:18:26 +0100." <20050102141826.GO28060@solar.trillke.net>
References: <20050102130002.GA32271@solar.trillke.net> <20050102141826.GO28060@solar.trillke.net>
Message-ID: <200501021522.j02FM1WF019312@ratthing-b246.strakt.com>

What about a scheme:

    try:
        do_something_with(path)
    except py.error.NotFound:         # ENOENT
        ...
    except py.error.PermissionDenied: # EPERM
        ...

> OTOH, it's harder to guess the exact meaning of these "nice" names.
> It at least causes thinking and documentation effort.

The place we need to spend the effort is on the error message that is
seen when the user has an error.  Whose messages will our users be
seeing?
If they all look like this:

http://docs.python.org/lib/module-errno.html

then using Google will be mandatory for figuring out what your
program isn't doing.  I'd rather save our users this time.

Laura

From arigo at tunes.org  Sun Jan  2 16:40:17 2005
From: arigo at tunes.org (Armin Rigo)
Date: Sun, 2 Jan 2005 15:40:17 +0000
Subject: [py-dev] revising exception handling / help appreciated
In-Reply-To: <200501021522.j02FM1WF019312@ratthing-b246.strakt.com>
References: <20050102130002.GA32271@solar.trillke.net> <20050102141826.GO28060@solar.trillke.net> <200501021522.j02FM1WF019312@ratthing-b246.strakt.com>
Message-ID: <20050102154017.GA20216@vicky.ecs.soton.ac.uk>

Hi,

Would some mixed scheme make sense?  Something with some inheritance,
for example:

    py.error.PathError
        py.error.NotFound
            py.error.ENOENT
        py.error.PermissionDenied
            py.error.EACCES
            py.error.EPERM

The actual raised exceptions would be the EXXX ones, but with an
error message that gives the readable category:

    >>> p = mydir.join('filename').open()
    Traceback:
    ...
    py.error.EACCES: Permission Denied

Then programmers can catch the precise exception or the broader
category (whose name is obvious from the error message).

A bientot,

Armin.

From hpk at trillke.net  Sun Jan  2 17:05:49 2005
From: hpk at trillke.net (holger krekel)
Date: Sun, 2 Jan 2005 17:05:49 +0100
Subject: [py-dev] revising exception handling / help appreciated
In-Reply-To: <200501021522.j02FM1WF019312@ratthing-b246.strakt.com>
References: <20050102130002.GA32271@solar.trillke.net> <20050102141826.GO28060@solar.trillke.net> <200501021522.j02FM1WF019312@ratthing-b246.strakt.com>
Message-ID: <20050102160549.GP28060@solar.trillke.net>

On Sun, Jan 02, 2005 at 16:22 +0100, Laura Creighton wrote:
>     try:
>         do_something_with(path)
>     except py.error.NotFound:         # ENOENT
>         ...
>     except py.error.PermissionDenied: # EPERM
>         ...
>
> > OTOH, it's harder to guess the exact meaning of these "nice"
> > names.  It at least causes thinking and documentation effort.
> > Whereas with "errno classes" we can luckily just say: google it,
> > since these weird abbreviations are easily found as the first
> > hit.  Note e.g. that PermissionDenied may mean either "EACCES" or
> > "EPERM"; see e.g. http://www.wlug.org.nz/EPERM for the
> > difference.
>
> The place we need to spend the effort is on the error message that
> is seen when the user has an error.

Well, the py lib targets developers and programmers, and they/we
should be able to reference error conditions in a consistent,
reliable and exact way.  This is not necessarily related to what a
user will or should see.

Also, i've done the current error-conversion hacks and i can tell you
they may appear simple but they are full of ambiguity and hard to get
right, if "right" can even be defined unambiguously.  They are also
not complete, i.e. some errors will just propagate in raw form, as
IOErrors/OSErrors.  This is all messy and i'd rather not put more
effort there but go for a simpler and deterministic scheme, reusing
common standard error definitions.  POSIX error specifications have
such a clear definition and they can be looked up easily.  Also, the
underlying modules (socket, os, file-IO, etc.pp.) do the hard work of
preserving errnos, and the 'errno' module provides the mapping to the
google-able names.

> If they all look like this:
>
> http://docs.python.org/lib/module-errno.html
>
> then using Google will be mandatory for figuring out what your
> program isn't doing.  I'd rather save our users this time.

it's a shallow and deterministic learning curve and many programmers
can just reuse their existing knowledge.  The third time you
encounter e.g. "EPERM: /etc/passwd" or
"ENOENT: /etc/asldkajsdlkasjdl" you will probably not have to google
for it anymore.
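[Editorial note: the mapping the 'errno' module provides, from numeric errors to the google-able POSIX names and back to human-readable messages, is a one-liner to use. A small illustrative sketch; `errno_name` and `describe` are invented helper names:]

```python
import errno
import os


def errno_name(eno):
    """Map a numeric errno to its symbolic POSIX name,
    e.g. 2 -> 'ENOENT'."""
    return errno.errorcode.get(eno, "E???")


def describe(eno):
    """Combine the symbolic name with the human-readable message,
    e.g. 'ENOENT: No such file or directory'."""
    return "%s: %s" % (errno_name(eno), os.strerror(eno))
```

So the "shallow learning curve" argument rests on infrastructure the stdlib already ships: errno.errorcode for the names and os.strerror for the prose.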
cheers,

    holger

From hpk at trillke.net  Sun Jan  2 17:35:13 2005
From: hpk at trillke.net (holger krekel)
Date: Sun, 2 Jan 2005 17:35:13 +0100
Subject: [py-dev] revising exception handling / help appreciated
In-Reply-To: <20050102154017.GA20216@vicky.ecs.soton.ac.uk>
References: <20050102130002.GA32271@solar.trillke.net> <20050102141826.GO28060@solar.trillke.net> <200501021522.j02FM1WF019312@ratthing-b246.strakt.com> <20050102154017.GA20216@vicky.ecs.soton.ac.uk>
Message-ID: <20050102163513.GQ28060@solar.trillke.net>

Hi Armin,

On Sun, Jan 02, 2005 at 15:40 +0000, Armin Rigo wrote:
> Would some mixed scheme make sense?  Something with some
> inheritance, for example:
>
>     py.error.PathError
>         py.error.NotFound
>             py.error.ENOENT
>         py.error.PermissionDenied
>             py.error.EACCES
>             py.error.EPERM
>
> The actual raised exceptions would be the EXXX ones, but with an
> error message that gives the readable category:
>
>     >>> p = mydir.join('filename').open()
>     Traceback:
>     ...
>     py.error.EACCES: Permission Denied

yes, that may make sense, but i would probably try to keep the number
of "ornamented" errors as low as possible.  Actually, for this case
i'd rather see something like:

    >>> p = mydir.join('filename').open()
    Traceback:
    ...
    py.error.EACCES: .../filename (Permission Denied)

as the actual file item that caused the error is the most interesting
piece of information, i guess.

> Then programmers can catch the precise exception or the broader
> category (whose name is obvious from the error message).

This is true for PermissionDenied and NotFound but maybe not so much
for "IsNotADirectory", let alone ELOOP aka "NestedLink" etc.pp.

All in all, i am wary of inventing new names and having to maintain
fragile code, tests and documentation for them, so i lean more and
more towards pure, clean, unambiguous posix error names despite their
ugliness.  If posix names really prove to be too painful then we can
later add exception hierarchies like the one you proposed.
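[Editorial note: such a later-added hierarchy, i.e. Armin's mixed scheme quoted above, would be plain class inheritance. A hypothetical sketch with invented names, not the py lib's API:]

```python
class PathError(Exception):
    """Catch-all for path-related errors."""

class NotFound(PathError):
    pass

class ENOENT(NotFound):
    pass

class PermissionDenied(PathError):
    pass

class EACCES(PermissionDenied):
    pass

class EPERM(PermissionDenied):
    pass


def category_of(exc):
    """The raised exception is always a precise EXXX instance, but
    callers may catch the broader readable category instead."""
    try:
        raise exc
    except PermissionDenied:
        return "PermissionDenied"
    except NotFound:
        return "NotFound"
```

Because the EXXX classes subclass the readable categories, `except PermissionDenied:` catches both EACCES and EPERM, which is exactly the "catch the precise exception or the broader category" behaviour the proposal describes.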
This could be done in a backward-compatible way, whereas letting the
current names continue to exist places a burden on us of questionable
value.  Hehe, and i guess you all know the nice feeling of being able
to replace messy and fragile code with something easy and obviously
correct :-)

I take it that no one is using py.path.* errors much so far anyway,
as most people currently just use the py.test part of the library.
Well, i am myself using py.path.* errors in a number of my
applications, so i am probably the developer most affected by this
change.

cheers,

    holger

From lac at strakt.com  Sun Jan  2 18:30:39 2005
From: lac at strakt.com (Laura Creighton)
Date: Sun, 02 Jan 2005 18:30:39 +0100
Subject: [py-dev] revising exception handling / help appreciated
In-Reply-To: Message from holger krekel of "Sun, 02 Jan 2005 17:35:13 +0100." <20050102163513.GQ28060@solar.trillke.net>
References: <20050102130002.GA32271@solar.trillke.net> <20050102141826.GO28060@solar.trillke.net> <200501021522.j02FM1WF019312@ratthing-b246.strakt.com> <20050102154017.GA20216@vicky.ecs.soton.ac.uk> <20050102163513.GQ28060@solar.trillke.net>
Message-ID: <200501021730.j02HUd7O019671@ratthing-b246.strakt.com>

Something that is germane to this discussion:

Most users (and we tested them; somewhere I have hard evidence for
this) do not see the E_WHATEVER part of their error message at all.
They parse all error messages that show up as:

    ERROR: .

So when faced with:

    EPERM: /etc/hedgehog   or
    ENOENT: /etc/hedgehog

there is no part of their brain going 'what is an EPERM?'  Their
brain thinks it already knows all it needs to know about EPERM: it
means there is an error.  So what they do is go take a look at
/etc/hedgehog and see if they can reason out what must be wrong.  If
they cannot find the file, they figure it out.  If they can, and it
is owned by the wrong person, they can figure that out too, assuming
they know about permissions.  But my job was to see what confused
them.
When we mounted the filesystem read-only, they _couldn't_ figure it
out.  They didn't know that this was possible, and so didn't imagine
it.  Making the ETEXT_THEY_READ longer only had a limited effect.
_Telling_ them that they should look these up in the manuals we
provided had some effect.  But the only real fix was long error
messages.

If you look at messages in newbie forums all over, you will see that
things have not changed.  Newbies are still asking 'What does
"Operation not permitted" mean?' rather than 'What does "EPERM:
Operation not permitted" mean?'  Bug reports still exclude the
perfectly vital piece of information the programmer needs to figure
out exactly what went wrong, 'because I didn't think that was
important'.

We had better take special care with errors new users can get trying
to install and use py.test for the very first time.  We don't want
them to get frustrated and decide that _unit testing_ is too hard for
them.

Laura

From hpk at trillke.net  Sun Jan  2 19:29:05 2005
From: hpk at trillke.net (holger krekel)
Date: Sun, 2 Jan 2005 19:29:05 +0100
Subject: [py-dev] revising exception handling / help appreciated
In-Reply-To: <200501021730.j02HUd7O019671@ratthing-b246.strakt.com>
References: <20050102130002.GA32271@solar.trillke.net> <20050102141826.GO28060@solar.trillke.net> <200501021522.j02FM1WF019312@ratthing-b246.strakt.com> <20050102154017.GA20216@vicky.ecs.soton.ac.uk> <20050102163513.GQ28060@solar.trillke.net> <200501021730.j02HUd7O019671@ratthing-b246.strakt.com>
Message-ID: <20050102182905.GR28060@solar.trillke.net>

On Sun, Jan 02, 2005 at 18:30 +0100, Laura Creighton wrote:
> Something that is germane to this discussion:
>
> Most users (and we tested them; somewhere I have hard evidence for
> this)

I wonder how you could provide real hard evidence for this, but it
probably doesn't matter so much ...

> do not see the E_WHATEVER part of their error message at all.  They
> parse all error messages that show up as:
>
>     ERROR: .
> So when faced with:
>
>     EPERM: /etc/hedgehog   or
>     ENOENT: /etc/hedgehog

Actually, with the new scheme they would look like e.g.:

    EPERM: [Operation not Permitted]: chmod('/tmp', 500, 500)

together with a traceback that shows how it came to this call.  This
seems to be the best of both worlds: a precise error description for
a specific call to underlying os-level services, and the name
("EPERM") which allows catching this precise error.  This is at least
a lot better than what there is now.

On a side note, please don't forget the burden of implementing code
and writing docs and tests for all this.  It's not _only_ the usage
side that i worry about but also the ease, simplicity and
predictability of the implementation.

cheers,

    holger

From lac at strakt.com  Sun Jan  2 20:48:05 2005
From: lac at strakt.com (Laura Creighton)
Date: Sun, 02 Jan 2005 20:48:05 +0100
Subject: [py-dev] revising exception handling / help appreciated
In-Reply-To: Message from holger krekel of "Sun, 02 Jan 2005 19:29:05 +0100." <20050102182905.GR28060@solar.trillke.net>
References: <20050102130002.GA32271@solar.trillke.net> <20050102141826.GO28060@solar.trillke.net> <200501021522.j02FM1WF019312@ratthing-b246.strakt.com> <20050102154017.GA20216@vicky.ecs.soton.ac.uk> <20050102163513.GQ28060@solar.trillke.net> <200501021730.j02HUd7O019671@ratthing-b246.strakt.com> <20050102182905.GR28060@solar.trillke.net>
Message-ID: <200501021948.j02Jm5jg019997@ratthing-b246.strakt.com>

In a message of Sun, 02 Jan 2005 19:29:05 +0100, holger krekel writes:
> On Sun, Jan 02, 2005 at 18:30 +0100, Laura Creighton wrote:
> > Something that is germane to this discussion:
> >
> > Most users (and we tested them; somewhere I have hard evidence
> > for this)
>
> I wonder how you could provide real hard evidence for this, but it
> probably doesn't matter so much ...

Oh, it was fun.  We filmed them constantly, and had access to all
their communications.  This is one nice thing about testing on
soldiers rather than on customers.
The soldiers can be ordered not to leave, and put up with whatever
foolishness you want to do to them.  We did some pretty foolish
things, all in the name of seeing how they would react.

> Actually, with the new scheme they would look like e.g.:
>
>     EPERM: [Operation not Permitted]: chmod('/tmp', 500, 500)
>
> together with a traceback that shows how it came to this call.
> This seems to be the best of both worlds: a precise error
> description for a specific call to underlying os-level services,
> and the name ("EPERM") which allows catching this precise error.
> This is at least a lot better than what there is now.
>
> On a side note, please don't forget the burden of implementing
> code and writing docs and tests for all this.  It's not _only_ the
> usage side that i worry about but also the ease, simplicity and
> predictability of the implementation.

Sounds to me as if you have found another area where people who want
to contribute, but don't want to do so at the implementation level,
can.  Now we need a way to conveniently break out such work so we can
see if it tempts volunteers.

Laura

> cheers,
>
>     holger

From hpk at trillke.net  Mon Jan  3 15:13:45 2005
From: hpk at trillke.net (holger krekel)
Date: Mon, 3 Jan 2005 15:13:45 +0100
Subject: [py-dev] example svn-move script
Message-ID: <20050103141345.GY28060@solar.trillke.net>

hi,

i just moved all the test files of the py lib into their own
"testing" directory.  This was done using a throw-away script which
extensively uses py.path.svnwc() and performs a lot of systematic and
some explicit moves in a working copy.

Maybe someone finds this ad-hoc script a useful example for
renaming/moving purposes in other svn-controlled projects.  It allows
you to repeatedly and iteratively rework the "transformation" until
everything is right, after which you can just
"svn ci /tmp/renaming/py".
cheers,

    holger

# move_to_testing.py
import py

wc = py.path.svnwc('py')
assert wc.check(versioned=True)

print "cleaning pyc and pyo"
for x in wc.localpath.visit('*.py?'):
    x.remove()

workarea = py.path.local('/tmp/renaming').ensure(dir=1).join("py")
if workarea.check():
    print "removing", workarea
    workarea.remove()
print "copying to", workarea
wc.localpath.copy(workarea)

workarea = py.path.svnwc(workarea)
print "visiting", workarea
for x in workarea.visit(py.path.checker(versioned=True, file=True, ext='.py'),
                        py.path.checker(dotfile=0)):
    if x.relto(workarea / 'magic' / 'greenlet'):
        continue
    dp = x.dirpath()
    if dp.basename == 'testing':
        continue
    if dp.basename == 'test' and dp != workarea / 'test':
        if dp == workarea / 'path' / 'test':
            continue
        print "renaming", dp
        dp.rename(dp.new(basename='testing'))
        continue
    pbn = x.purebasename
    if pbn.startswith('test_') or pbn.endswith('_test'):
        testing = dp.ensure('testing', dir=1)
        testing.ensure('__init__.py').write('#')
        print "moving to", testing.join(x.basename)
        x.move(testing.join(x.basename))

# extra moves
extpy = workarea.join('path', 'extpy')
extpy.join('inc_pseudofs.py').move(extpy.join('testing', 'inc_pseudofs.py'))
extpy.join('inc_test_extpy.py').move(extpy.join('testing', 'inc_test_extpy.py'))
extpy.join('test_data').move(extpy.join('testing', 'test_data'))
svn = workarea.join('path', 'svn')
svn.join('svntestbase.py').move(svn.join('testing', 'svntestbase.py'))
execnet = workarea.join('execnet')
execnet.join('sshtesting.py').move(execnet / 'testing' / 'sshtesting.py')

From hpk at trillke.net  Mon Jan  3 18:31:48 2005
From: hpk at trillke.net (holger krekel)
Date: Mon, 3 Jan 2005 18:31:48 +0100
Subject: [py-dev] testing with multiple python executables
Message-ID: <20050103173148.GD28060@solar.trillke.net>

Hi,

i just checked in support for a new cmdline option.  You can now
basically do something like

    py.test --exec=python2.2

and your tests will run with the given executable (looked up on
$PATH).
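[Editorial note: the lookup-on-$PATH behaviour just described can be sketched in a few lines of modern Python. This is an illustration, not py.test's implementation; `run_with` is a hypothetical helper.]

```python
import shutil
import subprocess


def run_with(executable, argv):
    """Look up 'executable' on $PATH and run it with the given
    argument list, returning the exit status."""
    exe = shutil.which(executable)
    if exe is None:
        raise LookupError("%s: not found on $PATH" % executable)
    return subprocess.call([exe] + list(argv))
```

On a machine without the requested interpreter, e.g. run_with("python2.2", [...]), the helper fails early with a LookupError instead of an opaque exec failure.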
At some point i want to allow specifying multiple executables and, if
you also specify --session, you could then step through the errors
specific to particular python versions until you have fixed them all.

Additionally, this should at some point be extended to support
running through arbitrary gateways, so that you can run the tests
over an ssh connection on some other machine.  The code is basically
there and uses the right abstractions (py.execnet), but we don't have
a "string representation" for specifying gateway connections yet.

Lastly, it becomes more urgent to better manage pyc/.o/.so etc.pp.
files, as these files are constantly overwritten in-place if you use
multiple python versions concurrently.  The idea here is to have a
per-user, per-python and possibly per-host local storage where we can
put created pyc/so files.

cheers,

    holger

From kkowalczyk at gmail.com  Tue Jan  4 05:24:08 2005
From: kkowalczyk at gmail.com (Krzysztof Kowalczyk)
Date: Mon, 3 Jan 2005 20:24:08 -0800
Subject: [py-dev] How to re-run just failed tests
Message-ID: <7ce338ad05010320246e7a0d89@mail.gmail.com>

I'm quite new to py.test and looking for some guidance.  I have the
same problem with py.test as I had with the built-in unittest module:
I can't seem to find an easy way to re-run just the tests that
failed.

Let's assume that you have a lot of tests and each of the tests takes
some time.  Let's further assume that tests are independent and only
a few tests fail.  It would be most desirable if there was a way to
easily re-run the tests that just failed.  Otherwise, I have to
re-run all the tests after trying to fix the code for the failed
tests, which is a waste of time.

I know two work-arounds, but they don't meet the 'easy' criteria:
a) move the failing tests to a separate file and proceed as usual
b) write my own test logger which would save the failures from the
   last run e.g. to a file and provide an option to just re-run
   those tests.
It could be as simple as writing the names of the tests that failed
to a file and then, if, say, a --rerun-failed option is given to
py.test, only those tests are re-run.

Even though option b) seems simple, I'm not sure if it's easy to
implement.  Is some such functionality provided with py.test
out-of-the-box, by any chance?  Or anything else that might meet my
needs, e.g. the ability to specify a list of test functions for
py.test to run (as opposed to executing them all)?

Krzysztof Kowalczyk | http://blog.kowalczyk.info

From hpk at trillke.net  Tue Jan  4 10:09:07 2005
From: hpk at trillke.net (holger krekel)
Date: Tue, 4 Jan 2005 10:09:07 +0100
Subject: [py-dev] How to re-run just failed tests
In-Reply-To: <7ce338ad05010320246e7a0d89@mail.gmail.com>
References: <7ce338ad05010320246e7a0d89@mail.gmail.com>
Message-ID: <20050104090907.GF28060@solar.trillke.net>

Hi Krzysztof,

On Mon, Jan 03, 2005 at 20:24 -0800, Krzysztof Kowalczyk wrote:
> I'm quite new to py.test and looking for some guidance.  I have the
> same problem with py.test as I had with the built-in unittest
> module: I can't seem to find an easy way to re-run just the tests
> that failed.

py.test --session should go a long way in this direction.  It
actually does a bit more than just re-running failed tests: after
running the first time it sits there and waits for file changes
(tests and code) and reruns failing tests repeatedly.

> I know two work-arounds, but they don't meet the 'easy' criteria:
> a) move the failing tests to a separate file and proceed as usual
> b) write my own test logger which would save the failures from the
>    last run e.g. to a file and provide an option to just re-run
>    those tests.

b) is more or less what "--session" does.
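[Editorial note: workaround b), persisting the failing test names and re-running only those, can be sketched in a few lines. A hypothetical sketch; the file name and helper names are invented, not a py.test API.]

```python
import json
import os

FAILED_FILE = ".py.test-lastfailed"   # hypothetical storage location


def record_failures(failed_names, path=FAILED_FILE):
    """Persist the names of failing tests after a run."""
    with open(path, "w") as f:
        json.dump(sorted(failed_names), f)


def select_tests(all_names, rerun_failed_only, path=FAILED_FILE):
    """Return the tests to run; with rerun_failed_only set, restrict
    to the previously recorded failures (or run everything if there
    is no record yet)."""
    if not rerun_failed_only or not os.path.exists(path):
        return list(all_names)
    with open(path) as f:
        failed = set(json.load(f))
    return [n for n in all_names if n in failed] or list(all_names)
```

The interesting design question is the one the thread raises next: a name list like this only works if re-resolving a name reruns the full collection machinery, customizations included.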
But it is actually not enough, as it doesn't take deeper
customization (providing your own test Items) into account.  I have
been discussing with Armin, and thinking some more, about an
appropriate refactoring to make this all clean and predictable.
Nothing definite yet but, unless you are providing your own test
Items, you should not be affected by such a refactoring.

Also note that once there is a per-user storage (e.g. "~/.py/") we
can easily add an option that reruns failing tests across separate
invocations, via e.g. py.test --failingonly or '-f' or so.  This
means you could run "py.test" on your whole project and then change
to a directory and enter "py.test --failingonly" to run all the
failed tests in this directory.

HTH & looking forward to your feedback,

    holger

From hpk at trillke.net  Wed Jan  5 00:49:29 2005
From: hpk at trillke.net (holger krekel)
Date: Wed, 5 Jan 2005 00:49:29 +0100
Subject: [py-dev] core design of py.test broken?
Message-ID: <20050104234929.GD7439@solar.trillke.net>

Hi everyone,

lately, i have been thinking (and discussing with Armin somewhat)
about some of the underlying implementation design challenges for
py.test.  Now, as a user you don't usually see much of what is going
on behind the scenes, in line with the motto "The best API is no
API".  But it would still be great to get some implementation
feedback from the very experienced python developers currently
subscribed to py-dev.  Sorry if i am asking too much here.

This is one of those situations where it obviously takes a lot more
time to get to the right ideas than to actually implement them (not
uncommon with Python, of course).  And of course, writing about the
current problems forces me to think more clearly.  So even if you
stop reading, i have written this under the illusion that you are
going to read it all :-)

So with py.test we currently have Collectors and Items.
Collectors are responsible for traversing (possibly remote)
directories, modules and classes to discover test methods, which are
wrapped in Items.  Items are responsible for setting up and executing
the actual test methods.  Test state management, aka setup/teardown
hooks, is encoded at user level with
setup_module/setup_class/setup_method (& teardown counterparts) at
the appropriate places.  From user level you can also provide your
own Items at various levels, e.g. per directory or per class.

Here seems to be the first problem: Collectors don't take part in the
setup/teardown procedures, and this is especially true for
"generative tests", i.e. tests that "yield" more sub-tests and as
such actually behave like a Collector.  Such yielding "generative
tests" are located somewhere in user-level test code, and as such you
might expect them to fully participate in the setup/teardown state
management.  (Actually "GenItems" currently do take part, but this is
a hack and has side effects.)

Then we also execute test methods across process boundaries (with
--session and lately --exec) and across python versions and
platforms (and soon across networks).  And we want to "memorize"
especially failing tests across those barriers, but maybe also other
kinds of tests.  Currently, we basically just memorize a filesystem
location (possibly remote, yadda yadda) and a module path, i.e. a
list of names leading to the test method.

Problem: when re-instating the test method in order to re-execute it,
we don't redo the collection process leading up to that Item, and
thus we don't get user-supplied Items anymore!  (The collection
process looks at per-directory, per-class or otherwise custom
configuration in order to determine a custom Item.  The collection
process also allows itself to be customized, and we use that already
for PyPy ...)

IOW, the interaction between test memorizing/sessions and customized
Items and Collectors is broken.
Now throw in another bunch of upcoming features: doctest Items, maybe
spawning multiple threads to run tests concurrently for
MT-applications, and what not.  Btw, we already have a "ReST-Item" in
py/documentation/rest_test.py which encodes - in like 10 easy lines
of code - how to integrate ReST-to-HTML-generates-no-warnings
checking with py.test.

There are nice aspects to the separation of Collectors and Items,
e.g. you usually get very short tracebacks, even when using
--fulltrace.  And conceptually, Collectors and Items really appear to
be different things, although i sometimes ponder viewing each
collection step as a test in itself; e.g. if you can't import a
module (due to a syntax error), it could be treated as a somewhat
special failing test.  In practice, though, i feel there is more to
lose than to gain by mixing collecting and executing tests.

So one of the ideas to remedy the problems and have py.test's design
ready for future growth is to introduce "test paths".  They would
basically encode a root location (say, always some kind of directory)
and a list of names.  The names would be traversed to redo any
collection logic, including getting at custom Items and using custom
collectors.  Basically we could think of a Collector offering two
concepts:

    listdir()   # list all the possible Items (test methods) and
                # "recursion" points, i.e. Collectors, each of
                # which has a specific associated name

    join(name)  # get a contained Collector/Item specified by such
                # a name

Concrete example:

    # module test_something.py

    def setup_module(mod):
        mod.answer = 42

    class TestSomething:
        def setup_class(cls):
            cls.question = "6*7"

        def test_method(self):
            assert eval(self.question) == 42

    def test_global_function():
        assert answer == 42

Here listdir() on the Module-Collector would give something like

    [ClassCollector('TestSomething'), Item('test_global_function')]

and each entry would contain a full test path, for example

    ['test_something.py', 'TestSomething']

and

    ['test_something.py', 'test_global_function']

respectively.  Such test paths would allow us to later invoke the
very same collectors (which can all be custom) and reconstruct the
appropriate Item instance by invoking join(), given the same starting
point (a root directory).  Also, if we then decide to let collectors
themselves participate in setup/teardown semantics, there shouldn't
be problems anymore.

Of course, there are some other side issues to consider, like the
fact that py.test manages to run tests in the same order as they
appear in a file.  But this probably affects the implementation, not
the idea so much.

Another idea i think of: being able to just scan all tests (maybe
with the current --collectonly) and memorize them at some per-user
location.  For projects with thousands of tests it would then be
possible to quickly run tests with e.g.

    py.test --match=database

to instantly run only tests that have "database" in their test paths,
using the "index database" of previously scanned and memorized test
paths.  This would turn the actual names of test directories, files,
classes and methods into something meaningful for distinguishing
distinct sets of tests.

Another somewhat unexplored area: having very interactive testing
frontends, think pygame, wxPython or web-driven.
For this the underlying design choices need to be accessible and manipulable in a user-event-driven and very custom way.

OK, enough for now, i guess. I am interested in any kind of feedback, including criticism and other proposals, on the ideas, problems and vague solutions i tried to present.

cheers & thanks for listening (if you actually got here :-),
holger

From lac at strakt.com Wed Jan 5 01:04:42 2005
From: lac at strakt.com (Laura Creighton)
Date: Wed, 05 Jan 2005 01:04:42 +0100
Subject: [py-dev] core design of py.test broken?
In-Reply-To: Message from hpk@trillke.net (holger krekel) of "Wed, 05 Jan 2005 00:49:29 +0100." <20050104234929.GD7439@solar.trillke.net>
References: <20050104234929.GD7439@solar.trillke.net>
Message-ID: <200501050004.j0504gIc003723@ratthing-b246.strakt.com>

>Of course, there are some other side issues to consider, like
>that py.test manages to run tests in the same order they
>appear in a file. But this probably affects implementation,
>not the idea so much.

Sometimes, of course, what you really want is a way to make sure that the tests are _not_ run in the same order that they appear in a file, because that ordering hides a bug, which you don't know about, because you always do them in that order. Being able to specify that would be neat.

>Another somewhat unexplored area: having very interactive
>testing frontends, think pygame, wxPython or web-driven. For
>this the underlying design choices need to be accessible
>and manipulable in a user-event-driven and very custom way.

I think that the people with a huge amount of tests want, among other things, a way to run a nightly build that goes through all the tests and see if something inadvertently broke. I worry about how well that will play with being user-event driven.

Just something that i thought of while I was reading. I will go back to thinking more...
Laura

From hpk at trillke.net Wed Jan 5 01:14:29 2005
From: hpk at trillke.net (holger krekel)
Date: Wed, 5 Jan 2005 01:14:29 +0100
Subject: [py-dev] core design of py.test broken?
In-Reply-To: <200501050004.j0504gIc003723@ratthing-b246.strakt.com>
References: <20050104234929.GD7439@solar.trillke.net> <200501050004.j0504gIc003723@ratthing-b246.strakt.com>
Message-ID: <20050105001429.GE7439@solar.trillke.net>

On Wed, Jan 05, 2005 at 01:04 +0100, Laura Creighton wrote:
> >Of course, there are some other side issues to consider, like
> >that py.test manages to run tests in the same order they
> >appear in a file. But this probably affects implementation,
> >not the idea so much.
>
> Sometimes, of course, what you really want is a way to make sure that
> the tests are _not_ run in the same order that they appear in a file,
> because that ordering hides a bug, which you don't know about, because
> you always do them in that order. Being able to specify that would be neat.

Sure, it shouldn't be a problem to optionally not care about ordering, maybe by specifying "--random" or so.

> >Another somewhat unexplored area: having very interactive
> >testing frontends, think pygame, wxPython or web-driven. For
> >this the underlying design choices need to be accessible
> >and manipulable in a user-event-driven and very custom way.
>
> I think that the people with a huge amount of tests want, among other
> things, a way to run a nightly build that goes through all the tests
> and see if something inadvertently broke. I worry about how well that
> will play with being user-event driven.

Running in linear batch mode is easier on the implementation than allowing full interaction with crazy users in a highly customized environment. After all, you could see a batch job as something with very low user interaction :-)

Regarding batch mode, it is probably more interesting to look at the "Reporter" side, which is a somewhat different can of worms.
cheers, holger From lac at strakt.com Wed Jan 5 01:42:35 2005 From: lac at strakt.com (Laura Creighton) Date: Wed, 05 Jan 2005 01:42:35 +0100 Subject: [py-dev] core design of py.test broken? In-Reply-To: Message from holger krekel of "Wed, 05 Jan 2005 01:14:29 +0100." <20050105001429.GE7439@solar.trillke.net> References: <20050104234929.GD7439@solar.trillke.net> <200501050004.j0504gIc003723@ratthing-b246.strakt.com> <20050105001429.GE7439@solar.trillke.net> Message-ID: <200501050042.j050gZd3003841@ratthing-b246.strakt.com> I am thinking about 2 test sessions running simultaneously, one checking that all tests pass with python2.3 and one with python2.4 We would like if a running test, trying to do the 'build this in SetUp', found that the thing was _already there_, waited until that particular area was free to use. Else they could collide with each other. Do we have a way to communicate: BUSY: stay out until I am done? Laura From bob at redivi.com Wed Jan 5 01:45:27 2005 From: bob at redivi.com (Bob Ippolito) Date: Tue, 4 Jan 2005 19:45:27 -0500 Subject: [py-dev] core design of py.test broken? In-Reply-To: <200501050042.j050gZd3003841@ratthing-b246.strakt.com> References: <20050104234929.GD7439@solar.trillke.net> <200501050004.j0504gIc003723@ratthing-b246.strakt.com> <20050105001429.GE7439@solar.trillke.net> <200501050042.j050gZd3003841@ratthing-b246.strakt.com> Message-ID: <178FE566-5EB3-11D9-9DC0-000A9567635C@redivi.com> On Jan 4, 2005, at 7:42 PM, Laura Creighton wrote: > I am thinking about 2 test sessions running simultaneously, one > checking that all tests pass > with python2.3 and one with python2.4 > > We would like if a running test, trying to do the 'build this in > SetUp', found > that the thing was _already there_, waited until that particular area > was free > to use. Else they could collide with each other. Do we have a way to > communicate: > BUSY: stay out until I am done? Why not use a platform-specific directory for temp files like distutils does? 
Seems to make a lot more sense than "this file might be used by some other process, or it might be garbage that will never go away so I will just sit here and wait for eternity".

-bob

From hpk at trillke.net Wed Jan 5 01:50:35 2005
From: hpk at trillke.net (holger krekel)
Date: Wed, 5 Jan 2005 01:50:35 +0100
Subject: [py-dev] core design of py.test broken?
In-Reply-To: <178FE566-5EB3-11D9-9DC0-000A9567635C@redivi.com>
References: <20050104234929.GD7439@solar.trillke.net> <200501050004.j0504gIc003723@ratthing-b246.strakt.com> <20050105001429.GE7439@solar.trillke.net> <200501050042.j050gZd3003841@ratthing-b246.strakt.com> <178FE566-5EB3-11D9-9DC0-000A9567635C@redivi.com>
Message-ID: <20050105005035.GF7439@solar.trillke.net>

On Tue, Jan 04, 2005 at 19:45 -0500, Bob Ippolito wrote:
> On Jan 4, 2005, at 7:42 PM, Laura Creighton wrote:
> >I am thinking about 2 test sessions running simultaneously, one
> >checking that all tests pass with python2.3 and one with python2.4
> >
> >We would like if a running test, trying to do the 'build this in
> >SetUp', found that the thing was _already there_, waited until that
> >particular area was free to use. Else they could collide with each
> >other. Do we have a way to communicate:
> >BUSY: stay out until I am done?
>
> Why not use a platform-specific directory for temp files like distutils
> does? Seems to make a lot more sense than "this file might be used by
> some other process, or it might be garbage that will never go away so I
> will just sit here and wait for eternity".

Yes, indeed this may be a good idea for filesystem-based setups/teardowns. And in-process states are obviously not a problem for e.g. running tests simultaneously with Python2.3 and Python2.4, as this can only happen in separate processes anyway.

holger

From lac at strakt.com Wed Jan 5 02:05:37 2005
From: lac at strakt.com (Laura Creighton)
Date: Wed, 05 Jan 2005 02:05:37 +0100
Subject: [py-dev] core design of py.test broken?
In-Reply-To: Message from holger krekel of "Wed, 05 Jan 2005 01:50:35 +0100." <20050105005035.GF7439@solar.trillke.net> References: <20050104234929.GD7439@solar.trillke.net> <200501050004.j0504gIc003723@ratthing-b246.strakt.com> <20050105001429.GE7439@solar.trillke.net> <200501050042.j050gZd3003841@ratthing-b246.strakt.com> <178FE566-5EB3-11D9-9DC0-000A9567635C@redivi.com> <20050105005035.GF7439@solar.trillke.net> Message-ID: <200501050105.j0515bTi003949@ratthing-b246.strakt.com> In a message of Wed, 05 Jan 2005 01:50:35 +0100, holger krekel writes: >On Tue, Jan 04, 2005 at 19:45 -0500, Bob Ippolito wrote: >> On Jan 4, 2005, at 7:42 PM, Laura Creighton wrote: >> >I am thinking about 2 test sessions running simultaneously, one >> >checking that all tests pass >> >with python2.3 and one with python2.4 >> > >> >We would like if a running test, trying to do the 'build this in >> >SetUp', found >> >that the thing was _already there_, waited until that particular area >> >was free >> >to use. Else they could collide with each other. Do we have a way to > >> >communicate: >> >BUSY: stay out until I am done? >> >> Why not use a platform-specific directory for temp files like distutils > >> does? Seems to make a lot more sense than "this file might be used by >> some other process, or it might be garbage that will never go away so I > >> will just sit here and wait for eternity". > >Yes, indeed this may be a good idea for filesystem-based setups/teardowns >. >And in-process states are obviously not a problem for e.g. running >tests simultanously with Python2.3 and Python2.4 as this can only >happen in separate processes, anyway. > > holger What do we want to happen when the python versions we are testing against each other are 'holger's before lunch' and 'holger's after lunch version'? In the longest run, we will want to be able to run the same test suite with different object spaces. What is going to be the most convenient way to do that? 
Laura From hpk at trillke.net Wed Jan 5 02:32:15 2005 From: hpk at trillke.net (holger krekel) Date: Wed, 5 Jan 2005 02:32:15 +0100 Subject: [py-dev] core design of py.test broken? In-Reply-To: <200501050105.j0515bTi003949@ratthing-b246.strakt.com> References: <20050104234929.GD7439@solar.trillke.net> <200501050004.j0504gIc003723@ratthing-b246.strakt.com> <20050105001429.GE7439@solar.trillke.net> <200501050042.j050gZd3003841@ratthing-b246.strakt.com> <178FE566-5EB3-11D9-9DC0-000A9567635C@redivi.com> <20050105005035.GF7439@solar.trillke.net> <200501050105.j0515bTi003949@ratthing-b246.strakt.com> Message-ID: <20050105013215.GG7439@solar.trillke.net> On Wed, Jan 05, 2005 at 02:05 +0100, Laura Creighton wrote: > What do we want to happen when the python versions we are testing against > each other are 'holger's before lunch' and 'holger's after lunch version'? I am not sure what you mean exactly. py.test needs a way to reference a python executable ... > In the longest run, we will want to be able to run the same test suite with > different object spaces. What is going to be the most convenient way to do that? ... and with PyPy we need a way to specify different object spaces for testing. Already possible in the pypy-branch. But this all gets a bit off-topic to the original issues raised in this thread (regarding Items, Collectors, state management and external test addressability). holger From dialtone at divmod.com Sun Jan 9 03:44:57 2005 From: dialtone at divmod.com (Valentino Volonghi aka Dialtone) Date: Sun, 09 Jan 2005 02:44:57 GMT Subject: [py-dev] Problems compiling greenlets under MacOSX 10.3.7 In-Reply-To: 0 Message-ID: <20050109024457.32125.1389610875.divmod.quotient.2956@ohm> Hi all. I would like to compile greenlets on darwin but unfortunately I only get an error from the assembler. 
running build
running build_ext
building 'greenlet' extension
gcc -fno-strict-aliasing -Wno-long-double -no-cpp-precomp -mno-fused-madd -fno-common -dynamic -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -I/Library/Frameworks/Python.framework/Versions/2.4/include/python2.4 -c greenlet.c -o build/temp.darwin-7.7.0-Power_Macintosh-2.4/greenlet.o
/var/tmp//cc5zHKvY.s:2506:non-relocatable subtraction expression, "_ts_current" minus "L00000000044$pb"
/var/tmp//cc5zHKvY.s:2506:symbol: "L00000000044$pb" can't be undefined in a subtraction expression
/var/tmp//cc5zHKvY.s:2504:non-relocatable subtraction expression, "_ts_current" minus "L00000000044$pb"
/var/tmp//cc5zHKvY.s:2504:symbol: "L00000000044$pb" can't be undefined in a subtraction expression
/var/tmp//cc5zHKvY.s:2475:non-relocatable subtraction expression, "_ts_target" minus "L00000000044$pb"
/var/tmp//cc5zHKvY.s:2475:symbol: "L00000000044$pb" can't be undefined in a subtraction expression
/var/tmp//cc5zHKvY.s:2473:non-relocatable subtraction expression, "_ts_target" minus "L00000000044$pb"
/var/tmp//cc5zHKvY.s:2473:symbol: "L00000000044$pb" can't be undefined in a subtraction expression
/var/tmp//cc5zHKvY.s:unknown:Undefined local symbol L00000000044$pb
error: command 'gcc' failed with exit status 1

Is there any flag I should use to compile them or is this just an error in the code?

Great work though :).
> > running build > running build_ext > building 'greenlet' extension > gcc -fno-strict-aliasing -Wno-long-double -no-cpp-precomp > -mno-fused-madd -fno-common -dynamic -DNDEBUG -g -O3 -Wall > -Wstrict-prototypes > -I/Library/Frameworks/Python.framework/Versions/2.4/include/python2.4 > -c greenlet.c -o > build/temp.darwin-7.7.0-Power_Macintosh-2.4/greenlet.o > /var/tmp//cc5zHKvY.s:2506:non-relocatable subtraction expression, > "_ts_current" minus "L00000000044$pb" > /var/tmp//cc5zHKvY.s:2506:symbol: "L00000000044$pb" can't be undefined > in a subtraction expression > /var/tmp//cc5zHKvY.s:2504:non-relocatable subtraction expression, > "_ts_current" minus "L00000000044$pb" > /var/tmp//cc5zHKvY.s:2504:symbol: "L00000000044$pb" can't be undefined > in a subtraction expression > /var/tmp//cc5zHKvY.s:2475:non-relocatable subtraction expression, > "_ts_target" minus "L00000000044$pb" > /var/tmp//cc5zHKvY.s:2475:symbol: "L00000000044$pb" can't be undefined > in a subtraction expression > /var/tmp//cc5zHKvY.s:2473:non-relocatable subtraction expression, > "_ts_target" minus "L00000000044$pb" > /var/tmp//cc5zHKvY.s:2473:symbol: "L00000000044$pb" can't be undefined > in a subtraction expression > /var/tmp//cc5zHKvY.s:unknown:Undefined local symbol L00000000044$pb > error: command 'gcc' failed with exit status 1 > > Is there any flag I should use to compile them or is this just an > error in the code? It's a GCC bug, but there is a workaround for it in Stackless. It sounds like the greenlet code you are compiling is using really old stackless code. I fixed this bug in stackless CVS some time last March I think. Make sure you're using the latest Xcode and whatnot too, of course. This is what I have: % cc --version cc (GCC) 3.3 20030304 (Apple Computer, Inc. 
build 1666) -bob From hpk at trillke.net Sun Jan 9 10:18:30 2005 From: hpk at trillke.net (holger krekel) Date: Sun, 9 Jan 2005 10:18:30 +0100 Subject: [py-dev] Problems compiling greenlets under MacOSX 10.3.7 In-Reply-To: <20050109024457.32125.1389610875.divmod.quotient.2956@ohm> References: <20050109024457.32125.1389610875.divmod.quotient.2956@ohm> Message-ID: <20050109091830.GH7439@solar.trillke.net> Hi Valentino, first: do i remember correctly that it was us two who had these nice discussions about e.g. reload-on-the-fly at EP 2004? I am going to incorporate the mentioned code into the py lib at some point :-) On Sun, Jan 09, 2005 at 02:44 +0000, Valentino Volonghi aka Dialtone wrote: > I would like to compile greenlets on darwin but > unfortunately I only get an error from the assembler. I nagged Armin about this yesterday :-) And Bob even seems to know the underlying cause. Great. I guess we need to provide Armin online-access to an OSX machine with recent tools. Apparently he has no problems compiling greenlets on an older version of OSX. Can anyone provide an account on an OSX machine that is online all the time? This would also be great when we (soon) start to extend py.test to support interactively running tests on multiple platforms. The same goes for a windows-machine: if anyone can provide 24-hour online access to a win32 box with an sshd that would be really helpful. cheers, holger From dialtone at divmod.com Sun Jan 9 12:33:18 2005 From: dialtone at divmod.com (Valentino Volonghi aka Dialtone) Date: Sun, 09 Jan 2005 11:33:18 GMT Subject: [py-dev] Problems compiling greenlets under MacOSX 10.3.7 In-Reply-To: Message-ID: <20050109113318.32125.926734110.divmod.quotient.3368@ohm> On Sun, 9 Jan 2005 02:19:49 -0500, Bob Ippolito wrote: > Make sure you're using the latest Xcode and whatnot too, of course. > This is what I have: > > % cc --version > cc (GCC) 3.3 20030304 (Apple Computer, Inc. build 1666) The same for me. 
From dialtone at divmod.com Sun Jan 9 13:40:23 2005
From: dialtone at divmod.com (Valentino Volonghi aka Dialtone)
Date: Sun, 09 Jan 2005 12:40:23 GMT
Subject: [py-dev] Problems compiling greenlets under MacOSX 10.3.7
In-Reply-To: 0
Message-ID: <20050109124023.32125.112823069.divmod.quotient.3423@ohm>

On Sun, 9 Jan 2005 10:18:30 +0100, holger krekel wrote:
>Hi Valentino,
>
> first: do i remember correctly that it was us two who had these nice
> discussions about e.g. reload-on-the-fly at EP 2004? I am going
> to incorporate the mentioned code into the py lib at some point :-)

Yep, you actually remember correctly :). Pleased to talk to you again. I'm eagerly awaiting that code, then. My current job would benefit a lot from that functionality, and I would really like a more robust one (than twisted's).

> I nagged Armin about this yesterday :-) And Bob even seems to know the
> underlying cause. Great. I guess we need to provide Armin online-access
> to an OSX machine with recent tools. Apparently he has no problems
> compiling greenlets on an older version of OSX.

This is very good news.

> Can anyone provide an account on an OSX machine that is online
> all the time? This would also be great when we (soon) start
> to extend py.test to support interactively running tests on
> multiple platforms. The same goes for a windows-machine: if
> anyone can provide 24-hour online access to a win32 box with
> an sshd that would be really helpful.

Unfortunately my OSX is a laptop. But I can probably arrange an account on my win32 desktop. It is online almost 24h a day and shouldn't be a problem, just tell me what you need :).

Yesterday I read almost all the documentation about the py lib... It's great :), I especially like the py.std import hack, so few lines and yet so wonderful.
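[For readers who haven't seen it: the py.std hack praised above can be sketched in a few lines. This is a hypothetical reconstruction of the idea (lazy, attribute-based importing of stdlib modules), not the py lib's actual code; it uses the modern importlib module for brevity, where the original would have used __import__.]

```python
import importlib


class LazyStd(object):
    """Sketch of a "py.std"-style object: attribute access imports
    the named stdlib module on demand.  Hypothetical reconstruction,
    not the py lib's actual implementation."""

    def __getattr__(self, name):
        try:
            module = importlib.import_module(name)
        except ImportError:
            raise AttributeError("no importable module named %r" % name)
        # Cache on the instance so __getattr__ only fires once per module.
        setattr(self, name, module)
        return module


std = LazyStd()
```

With this in place, `std.os.path.basename("/tmp/x")` works without any explicit `import os` at the top of the file, and each module is imported at most once.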
I hope to be able to help with development From bob at redivi.com Sun Jan 9 16:06:08 2005 From: bob at redivi.com (Bob Ippolito) Date: Sun, 9 Jan 2005 10:06:08 -0500 Subject: [py-dev] Problems compiling greenlets under MacOSX 10.3.7 In-Reply-To: <20050109091830.GH7439@solar.trillke.net> References: <20050109024457.32125.1389610875.divmod.quotient.2956@ohm> <20050109091830.GH7439@solar.trillke.net> Message-ID: On Jan 9, 2005, at 4:18, holger krekel wrote: > On Sun, Jan 09, 2005 at 02:44 +0000, Valentino Volonghi aka Dialtone > wrote: >> I would like to compile greenlets on darwin but >> unfortunately I only get an error from the assembler. > > I nagged Armin about this yesterday :-) And Bob even seems to know the > underlying cause. Great. I guess we need to provide Armin > online-access > to an OSX machine with recent tools. Apparently he has no problems > compiling greenlets on an older version of OSX. Older version of GCC. As I said before, this is a GCC bug :) > Can anyone provide an account on an OSX machine that is online > all the time? This would also be great when we (soon) start > to extend py.test to support interactively running tests on > multiple platforms. The same goes for a windows-machine: if > anyone can provide 24-hour online access to a win32 box with > an sshd that would be really helpful. I'm almost certain that the greenlets code is using an old header file from Stackless, and it needs to be replaced with a newer one. 
It doesn't really take access to OS X to make sure that these are in sync :) -bob From hpk at trillke.net Sun Jan 9 16:27:40 2005 From: hpk at trillke.net (holger krekel) Date: Sun, 9 Jan 2005 16:27:40 +0100 Subject: [py-dev] Problems compiling greenlets under MacOSX 10.3.7 In-Reply-To: References: <20050109024457.32125.1389610875.divmod.quotient.2956@ohm> <20050109091830.GH7439@solar.trillke.net> Message-ID: <20050109152740.GJ7439@solar.trillke.net> On Sun, Jan 09, 2005 at 10:06 -0500, Bob Ippolito wrote: > On Jan 9, 2005, at 4:18, holger krekel wrote: > > >On Sun, Jan 09, 2005 at 02:44 +0000, Valentino Volonghi aka Dialtone > >wrote: > >>I would like to compile greenlets on darwin but > >>unfortunately I only get an error from the assembler. > > > >I nagged Armin about this yesterday :-) And Bob even seems to know the > >underlying cause. Great. I guess we need to provide Armin > >online-access > >to an OSX machine with recent tools. Apparently he has no problems > >compiling greenlets on an older version of OSX. > > Older version of GCC. As I said before, this is a GCC bug :) Yah, well, whatever, version problems :-) > >Can anyone provide an account on an OSX machine that is online > >all the time? This would also be great when we (soon) start > >to extend py.test to support interactively running tests on > >multiple platforms. The same goes for a windows-machine: if > >anyone can provide 24-hour online access to a win32 box with > >an sshd that would be really helpful. > > I'm almost certain that the greenlets code is using an old header file > from Stackless, and it needs to be replaced with a newer one. It > doesn't really take access to OS X to make sure that these are in sync > :) I don't know how closely greenlets mirror stackless-header files. I guess i stop and let Armin get to the issue when he has the time ... 
cheers, holger From bob at redivi.com Sun Jan 9 18:03:11 2005 From: bob at redivi.com (Bob Ippolito) Date: Sun, 9 Jan 2005 12:03:11 -0500 Subject: [py-dev] Problems compiling greenlets under MacOSX 10.3.7 In-Reply-To: <20050109152740.GJ7439@solar.trillke.net> References: <20050109024457.32125.1389610875.divmod.quotient.2956@ohm> <20050109091830.GH7439@solar.trillke.net> <20050109152740.GJ7439@solar.trillke.net> Message-ID: <579DAEF4-6260-11D9-9C18-000A95BA5446@redivi.com> On Jan 9, 2005, at 10:27, holger krekel wrote: > On Sun, Jan 09, 2005 at 10:06 -0500, Bob Ippolito wrote: >> On Jan 9, 2005, at 4:18, holger krekel wrote: >> >>> On Sun, Jan 09, 2005 at 02:44 +0000, Valentino Volonghi aka Dialtone >>> wrote: >>>> I would like to compile greenlets on darwin but >>>> unfortunately I only get an error from the assembler. >>> >>> I nagged Armin about this yesterday :-) And Bob even seems to know >>> the >>> underlying cause. Great. I guess we need to provide Armin >>> online-access >>> to an OSX machine with recent tools. Apparently he has no problems >>> compiling greenlets on an older version of OSX. >> >> Older version of GCC. As I said before, this is a GCC bug :) > > Yah, well, whatever, version problems :-) > >>> Can anyone provide an account on an OSX machine that is online >>> all the time? This would also be great when we (soon) start >>> to extend py.test to support interactively running tests on >>> multiple platforms. The same goes for a windows-machine: if >>> anyone can provide 24-hour online access to a win32 box with >>> an sshd that would be really helpful. >> >> I'm almost certain that the greenlets code is using an old header file >> from Stackless, and it needs to be replaced with a newer one. It >> doesn't really take access to OS X to make sure that these are in sync >> :) > > I don't know how closely greenlets mirror stackless-header files. > I guess i stop and let Armin get to the issue when he has the time ... They should be identical. 
-bob From arigo at tunes.org Sun Jan 9 19:18:47 2005 From: arigo at tunes.org (Armin Rigo) Date: Sun, 9 Jan 2005 18:18:47 +0000 Subject: [py-dev] Problems compiling greenlets under MacOSX 10.3.7 In-Reply-To: <579DAEF4-6260-11D9-9C18-000A95BA5446@redivi.com> References: <20050109024457.32125.1389610875.divmod.quotient.2956@ohm> <20050109091830.GH7439@solar.trillke.net> <20050109152740.GJ7439@solar.trillke.net> <579DAEF4-6260-11D9-9C18-000A95BA5446@redivi.com> Message-ID: <20050109181847.GA978@vicky.ecs.soton.ac.uk> Hi Bob, On Sun, Jan 09, 2005 at 12:03:11PM -0500, Bob Ippolito wrote: > >I don't know how closely greenlets mirror stackless-header files. > >I guess i stop and let Armin get to the issue when he has the time ... > > They should be identical. Done importing the latest Stackless switch_*.h files. I hope it fixes the problem. Armin From hpk at trillke.net Sun Jan 9 21:05:46 2005 From: hpk at trillke.net (holger krekel) Date: Sun, 9 Jan 2005 21:05:46 +0100 Subject: [py-dev] Re: [py-svn] r8187 - py/dist/py In-Reply-To: <20050109182838.EA1C027B75@code1.codespeak.net> References: <20050109182838.EA1C027B75@code1.codespeak.net> Message-ID: <20050109200546.GL7439@solar.trillke.net> Hi Armin, On Sun, Jan 09, 2005 at 19:28 +0100, arigo at codespeak.net wrote: > Author: arigo > Date: Sun Jan 9 19:28:38 2005 > New Revision: 8187 > > Modified: > py/dist/py/initpkg.py > Log: > Temporary fix for "import py; py.magic.greenlet" raising an ImportError if the > 'py' lib is not in your PYTHONPATH: make the package __path__ absolute. This > prevents os.chdir() from breaking imports. I see. And also saw your IRC ramblings as i just got back. Indeed, os.chdir() can break things horribly for relatively imported packages. As you hinted at, it is likely a problem with Python itself that it doesn't try to convert import paths to absolute locations as soon as possible. 
Of course, the py lib's import hook may very well obscure things further, although (arguably) the errors from Python when just using plain imports are very similar if you don't realise that your import problem happens while executing in some os.chdir()ed execution scope.

OTOH, the specific problem with make_module_from_c() might not have arisen because we might have imported py.code.Source during module initialization. OTOH, it might have occurred within a C-file that imports python-modules during module initialization :-)

Anyway, thanks for figuring this out and fixing it. I hope it wasn't as hard as fixing some Psyco problems :-)

cheers,
holger

From hpk at trillke.net Sun Jan 9 21:41:41 2005
From: hpk at trillke.net (holger krekel)
Date: Sun, 9 Jan 2005 21:41:41 +0100
Subject: [py-dev] Re: [py-svn] r8187 - py/dist/py
In-Reply-To: <20050109200546.GL7439@solar.trillke.net>
References: <20050109182838.EA1C027B75@code1.codespeak.net> <20050109200546.GL7439@solar.trillke.net>
Message-ID: <20050109204141.GN7439@solar.trillke.net>

On Sun, Jan 09, 2005 at 21:05 +0100, holger krekel wrote:
> Of course, the py lib's import hook ...

well, in this case the custom import hook, used for importing test files or generally modules from path objects, was not involved (and it actually uses absolute paths and so would have avoided the whole problem!). But our second hack, which just delegates to python's import hacks for importing the py lib itself, was involved.

Should we, after all, try to substitute these hacks with a clean custom import hook as well? Sometimes i think this would be easier even in the medium term, especially considering that we want to have remote lazy imports over py.execnet.SshGateway()s working as well. Python's piled import hacks are just way too painful.

cheers,
holger

From arigo at tunes.org Tue Jan 11 16:37:05 2005
From: arigo at tunes.org (Armin Rigo)
Date: Tue, 11 Jan 2005 15:37:05 +0000
Subject: [py-dev] core design of py.test broken?
In-Reply-To: <20050104234929.GD7439@solar.trillke.net>
References: <20050104234929.GD7439@solar.trillke.net>
Message-ID: <20050111153705.GB12085@vicky.ecs.soton.ac.uk>

Hi Holger,

On Wed, Jan 05, 2005 at 12:49:29AM +0100, holger krekel wrote:
> Then we also execute test methods across process boundaries
> (with --session and lately --exec) and across python versions
> and platforms (and soon across networks). And we want to
> "memorize" especially failing tests across those barriers

When reading py/test/item.py, I feel uneasy about this global object SetupItem.state which encapsulates the notion of which collectors and items are currently "up". This is a bit strange because there is another parallel mechanism that looks cleaner, which is already used for reporting: naively, it seems that it would just work if a collector or item were set up when the method Driver.run_collector_or_item() starts, and torn down when it ends. This would work independently of any notion of extpy path to name each item.

Note that this is only about setup/teardown. For purposes like --session or remote execution of tests, we clearly need a persistent notion of "name" or "path" of test items:

> Basically we could think of a Collector offering two concepts:
>
>    listdir()   # list all the possible Items (test methods) and
>                # "recursion" points, i.e. Collectors, each
>                # of which has a specific associated name
>
>    join(name)  # get a contained Collector/Item specified
>                # by such a name

In this model generative tests are themselves collectors, not items, and the listdir() should enumerate all the sub-tests by enumerating the user-defined generator. The easiest would be to use integer indices as the names of the sub-items (the first one gets the name 0, the next one the name 1, and so on). It means that we shouldn't add or remove sub-tests in the middle of the list if we are e.g. running in --session, otherwise py.test will mess up the various sub-tests... but I don't see any obvious workaround.
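[The scheme described above - a generative test acting as a collector whose sub-items are named by integer indices - can be sketched as follows. The class and helper names here are hypothetical illustrations, not py.test's actual API.]

```python
def test_squares():
    # a "generative test": yields (callable, arg) pairs, each a sub-test
    for i in range(3):
        yield check_square, i

def check_square(i):
    assert i * i == i ** 2

class GeneratorCollector:
    """Sketch: a generative test as a collector.  The position of a
    sub-test in the enumerated generator serves as its persistent
    "name", as proposed above.  Hypothetical class, not py.test code."""

    def __init__(self, genfunc):
        self.genfunc = genfunc

    def listdir(self):
        # enumerate the user-defined generator; each index is a name
        return list(range(len(list(self.genfunc()))))

    def join(self, name):
        # re-run the generator and pick the sub-test at index `name`
        for index, item in enumerate(self.genfunc()):
            if index == name:
                return item
        raise LookupError(name)

collector = GeneratorCollector(test_squares)
assert collector.listdir() == [0, 1, 2]
func, arg = collector.join(1)   # reconstruct sub-test "1" by name
func(arg)                       # runs that sub-test
```

The sketch also makes the fragility visible: inserting a sub-test in the middle of the generator shifts every index after it, which is exactly the --session bookkeeping problem noted above.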
In general I think that using just lists of names or numbers as paths to individual test items looks good. The semantics of setup/teardown could then be cleanly integrated in this model, by putting the related knowledge on the individual Collector classes, e.g. with two new methods on all collectors and items: enter() and leave(). The Module collector would call setup_module() in its enter() method, etc. A bientot, Armin. From ianb at colorstudy.com Tue Jan 11 18:00:07 2005 From: ianb at colorstudy.com (Ian Bicking) Date: Tue, 11 Jan 2005 11:00:07 -0600 Subject: [py-dev] core design of py.test broken? In-Reply-To: <20050111153705.GB12085@vicky.ecs.soton.ac.uk> References: <20050104234929.GD7439@solar.trillke.net> <20050111153705.GB12085@vicky.ecs.soton.ac.uk> Message-ID: <41E40617.70203@colorstudy.com> Armin Rigo wrote: >>Basically we could think of a Collector offering two concepts: >> >> listdir() # list all the possible Items (test methods) and >> # "recursion" points, i.e. Collectors, each >> # of which has a specific associated name >> >> join(name) # get a contained Collector/Item specified >> # by such a name > > > In this model generative tests are themselves collectors, not items, and the > listdir() should enumerate all the sub-tests by enumerating the user-defined > generator. The easier would be to use integer indices as the name of the > sub-items (the first one get the name 0, the next one the name 1, and so on). > It means that we shouldn't add or remove sub-tests in the middle of the list > if we are e.g. running in --session, otherwise py.test will mess up the > various sub-tests... but I don't see any obvious workaround. In general I > think that using just lists of names or numbers as paths to individual test > items looks good. 
Another idea I proposed earlier was determining an ID from the argument list; e.g., taking the first argument for each subtest, str() it, remove non-alphanumeric characters, truncate it if it's too long, then add an integer if that name is not unique. This way users can create explicit IDs if they want (by adding an argument which won't be used for anything but the ID), or at least the IDs should be a little more stable and meaningful. -- Ian Bicking / ianb at colorstudy.com / http://blog.ianbicking.org From arigo at tunes.org Tue Jan 11 23:24:52 2005 From: arigo at tunes.org (Armin Rigo) Date: Tue, 11 Jan 2005 22:24:52 +0000 Subject: [py-dev] relto() Message-ID: <20050111222452.GA24033@vicky.ecs.soton.ac.uk> Hi Holger, >>> py.path.local('bcde').relto(py.path.local('b')) 'de' looks quite unexpected to me... Armin From grig at gheorghiu.net Fri Jan 14 20:56:48 2005 From: grig at gheorghiu.net (Grig Gheorghiu) Date: Fri, 14 Jan 2005 11:56:48 -0800 (PST) Subject: [py-dev] Issue with py.test and stderr Message-ID: <20050114195649.73273.qmail@web54508.mail.yahoo.com> Hi, Holger I started to play with py.test after seeing it mentioned on a lot of Python-related blogs. I find it really easy to use and powerful at the same time, so congrats for a fine piece of work. I did run into a slight issue though. At some point I was instantiating a class which was redirecting stderr to stdout under some circumstances. The __del__ method of my class was resetting sys.stderr to sys.__stderr__. I think this caused a problem with py.test, in that it interfered with py.test's own manipulation of sys.stderr. 
The end result was that I was getting a rather cryptic error message from py.test:

    # py.test -v test_clients.py
    ============================= test process starts =============================
    testing-mode: inprocess
    executable : /usr/bin/python (2.2.3-final-0)
    using py lib: /local/dist-py/py
    initial testconfig 0: /local/dist-py/py/test/defaultconfig
    ===============================================================================
    Traceback (most recent call last):
      File "/local/dist-py/py/bin/py.test", line 5, in ?
        main()
      File "/local/dist-py/py/test/cmdline.py", line 32, in main
        run.inprocess(args, filenames)
      File "/local/dist-py/py/test/run.py", line 141, in inprocess
        driver.run(fncollectors)
      File "/local/dist-py/py/test/drive.py", line 43, in run
        self.run_collector_or_item(x)
      File "/local/dist-py/py/test/drive.py", line 61, in run_collector_or_item
        self.runcollector(obj)
      File "/local/dist-py/py/test/drive.py", line 83, in runcollector
        self.run_collector_or_item(obj)
      File "/local/dist-py/py/test/drive.py", line 61, in run_collector_or_item
        self.runcollector(obj)
      File "/local/dist-py/py/test/drive.py", line 83, in runcollector
        self.run_collector_or_item(obj)
      File "/local/dist-py/py/test/drive.py", line 59, in run_collector_or_item
        self.runitem(obj)
      File "/local/dist-py/py/test/drive.py", line 103, in runitem
        self.reporter.enditem(res)
      File "/local/dist-py/py/test/report/text/reporter.py", line 141, in enditem
        result.out, result.err = item.iocapture.reset()
      File "/local/dist-py/py/test/tool/outerrcapture.py", line 29, in reset
        err = e.getvalue()
    AttributeError: 'file' object has no attribute 'getvalue'

The only way I was able to get around this issue without modifying my class was to edit py/test/tool/outerrcapture.py and replace:

    err = e.getvalue()

with

    try:
        err = e.getvalue()
    except:
        err = None

Then things seemed to work fine.
The real solution I found was to modify my class and save sys.stderr into a temp and restore it in __del__ to that temp, instead of restoring it to sys.__stderr__. Then I could get rid of my modifications of outerrcapture.py.

Anyway, I guess this is a pretty rare case, but maybe you can insert a try/except in outerrcapture.py to prevent things like this from happening.

All the best,

Grig

P.S. I'm really interested in both Python and testing, so if you guys need some help with py.test in the future, I'd be glad to pitch in. I have a blog with some of the stuff I'm doing at http://agiletesting.blogspot.com . I've been using unittest quite heavily, but I'm also trying to look into alternatives, mainly py.test and doctest. I released a JUnitPerf port to Python which I called pyUnitPerf, so now I'm thinking about doing something similar and more "pythonic" with py.test. Maybe py.test.perf?!!

From hpk at trillke.net Fri Jan 14 21:22:08 2005
From: hpk at trillke.net (holger krekel)
Date: Fri, 14 Jan 2005 21:22:08 +0100
Subject: [py-dev] Issue with py.test and stderr
In-Reply-To: <20050114195649.73273.qmail@web54508.mail.yahoo.com>
References: <20050114195649.73273.qmail@web54508.mail.yahoo.com>
Message-ID: <20050114202208.GY7439@solar.trillke.net>

Hi Grig!

On Fri, Jan 14, 2005 at 11:56 -0800, Grig Gheorghiu wrote:
> I started to play with py.test after seeing it mentioned on a lot of
> Python-related blogs. I find it really easy to use and powerful at the
> same time, so congrats for a fine piece of work.

great to hear!

> I did run into a slight issue though. At some point I was instantiating
> a class which was redirecting stderr to stdout under some
> circumstances. The __del__ method of my class was resetting sys.stderr
> to sys.__stderr__. I think this caused a problem with py.test, in that
> it interfered with py.test's own manipulation of sys.stderr.

Yes, certainly. The stdout/stderr capturing is a bit naive and not careful enough yet.
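A more careful capture could notice that a test replaced the streams and still restore the originals afterwards; here is a minimal sketch (hypothetical names, not the actual outerrcapture.py):

```python
import sys
from io import StringIO

class OutErrCapture:
    """Capture sys.stdout/sys.stderr and, on reset(), report whether
    the test swapped the streams instead of crashing on getvalue()."""
    def __init__(self):
        self.oldout, self.olderr = sys.stdout, sys.stderr
        self.newout, self.newerr = StringIO(), StringIO()
        sys.stdout, sys.stderr = self.newout, self.newerr

    def reset(self):
        # Read from the StringIO objects we installed, not from
        # whatever sys.stdout/sys.stderr point at right now.
        tampered = (sys.stdout is not self.newout or
                    sys.stderr is not self.newerr)
        out, err = self.newout.getvalue(), self.newerr.getvalue()
        # Always restore the original streams, whatever the test did.
        sys.stdout, sys.stderr = self.oldout, self.olderr
        return out, err, tampered
```

A reporter could then print a note like "[test changed stdout/err]" when the tampered flag is set, while still showing the partial captured output.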
Note that if you suspect that py.test messes up stdout/stderr somehow you can pass

    py.test -S

to inhibit the redirecting logic.

> The end result was that I was getting a rather cryptic error message
> from py.test:
> ... traceback too ugly to cite here ...

uh indeed :-)

> The only way I was able to get around this issue without modifying my
> class was to edit py/test/tool/outerrcapture.py and replace:
>
>     err = e.getvalue()
>
> with
>
>     try:
>         err = e.getvalue()
>     except:
>         err = None
>
> Then things seemed to work fine.

"Do the simplest fix that possibly works" ... aah these testing guys :-)

Well, in this case we should probably do better and detect that a test did something to stdout/stderr. The test reporter could display something like

    [test changed stdout/err -> partial output only]

and still display what we captured. It may generally be a good idea to always reset stdout/stderr after a test ran.

> P.S. I'm really interested in both Python and testing, so if you guys
> need some help with py.test in the future, I'd be glad to pitch in. I
> have a blog with some of the stuff I'm doing at
> http://agiletesting.blogspot.com .

Interesting.

> I've been using unittest quite heavily, but I'm also trying to look
> into alternatives, mainly py.test and doctest.

doctests are to be integrated into py.test real soon now (this is a programmer saying this!).

Actually the Schevo guys did a hack in their tree and i have been doing some experiments with Armin. I think the main issues are how to collect and report doctests. As you noticed, py.test is so far collecting tests automatically. For doctests this may not be completely feasible.

> I released a JUnitPerf
> port to Python which I called pyUnitPerf, so now I'm thinking about
> doing something similar and more "pythonic" with py.test. Maybe
> py.test.perf?!!

Profiling certainly makes sense in the testing context. It's not too high on my list, though. But if you want to give it a try, that would be great.
I could certainly help with integrational issues. cheers, holger From grig at gheorghiu.net Fri Jan 14 22:58:07 2005 From: grig at gheorghiu.net (Grig Gheorghiu) Date: Fri, 14 Jan 2005 13:58:07 -0800 (PST) Subject: [py-dev] Issue with py.test and stderr In-Reply-To: <20050114202208.GY7439@solar.trillke.net> Message-ID: <20050114215807.19636.qmail@web54508.mail.yahoo.com> Holger, Thanks for the detailed response. I'm still very much green at py.test, but I'm working on it :-) pyUnitPerf doesn't actually do profiling, it does load/time performance testing. You can start by profiling your code for bottlenecks via the profile module for example, and then wrap those bottleneck functions/methods in unitperf tests. You can set a desired user load and an expected time to completion, and have the tests fail if the time is exceeded. There's a good article by Mike Clark at http://javapronews.com/javapronews-47-20030721ContinuousPerformanceTestingwithJUnitPerf.html I think the notion of "continuous performance testing" is worth investigating, so that's what I was referring to when I said I'll give some thought to a potential py.test.perf module. Grig --- holger krekel wrote: > Hi Grig! > > On Fri, Jan 14, 2005 at 11:56 -0800, Grig Gheorghiu wrote: > > I started to play with py.test after seeing it mentioned on a lot > of > > Python-related blogs. I find it really easy to use and powerful at > the > > same time, so congrats for a fine piece of work. > > great to hear! > > > I did run into a slight issue though. At some point I was > instantiating > > a class which was redirecting stderr to stdout under some > > circumstances. The __del__ method of my class was resetting > sys.stderr > > to sys.__stderr__. I think this caused a problem with py.test, in > that > > it interfered with py.test's own manipulation of sys.stderr. > > Yes, certainly. The stdout/stderr capturing is a bit naive and not > careful enough yet. 
Note that if you suspect that py.test messes up > stdout/stderr somehow you can pass > > py.test -S > > to inhibit the redirecting logic. > > > The end result was that I was getting a rather cryptic error > message > > from py.test: > > > ... traceback too ugly to cite here ... > > uh indeed :-) > > > The only way I was able to get around this issue without modifying > my > > class was to edit py/test/tool/outerrcapture.py and replace: > > > > err = e.getvalue() > > > > with > > > > try: > > err = e.getvalue() > > except: > > err = None > > > > Then things seemed to work fine. > > "Do the simplest fix that possibly works" .. . > aah these testing guys :-) > > Well, in this case we should probably do better and detect > that a test did something to stdout/stderr. The test reporter > could display something like > > [test changed stdout/err -> partial output only] > > and still display what we captured. It may generally > be a good idea to always reset stdout/stderr after a > test ran. > > > P.S. I'm really interested in both Python and testing, so if you > guys > > need some help with py.test in the future, I'd be glad to pitch in. > I > > have a blog with some of the stuff I'm doing at > > http://agiletesting.blogspot.com . > > Interesting. > > > I've been using unittest quite heavily, but I'm also trying to look > > into alternatives, mainly py.test and doctest. > > doctests are to be integrated into py.test real soon now > (this is a programmer saying this!). > > Actually the Schevo guys did a hack in their tree > and i have doing some tries with Armin. I think the > main issues are how to collect and report doctests. > As you noticed, py.test is so far collecting test > automatically. For doctests this may not be completly > feasible. > > > I released a JUnitPerf > > port to Python which I called pyUnitPerf, so now I'm thinking about > > doing something similar and more "pythonic" with py.test. Maybe > > py.test.perf?!! 
> Profiling certainly makes sense in the testing context. It's not
> too high on my list, though. But if you want to give it a try,
> that would be great. I could certainly help with integrational
> issues.
>
> cheers,
>
> holger

From walter at livinglogic.de Sat Jan 15 01:34:00 2005
From: walter at livinglogic.de (walter at livinglogic.de)
Date: Sat, 15 Jan 2005 01:34:00 +0100 (CET)
Subject: [py-dev] Issue with py.test and stderr
In-Reply-To: <20050114195649.73273.qmail@web54508.mail.yahoo.com>
References: <20050114195649.73273.qmail@web54508.mail.yahoo.com>
Message-ID: <1125.193.196.118.51.1105749240.squirrel@isar.livinglogic.de>

> Hi, Holger
>
> I started to play with py.test after seeing it mentioned on a lot of
> Python-related blogs. I find it really easy to use and powerful at the
> same time, so congrats for a fine piece of work.
>
> I did run into a slight issue though. At some point I was instantiating
> a class which was redirecting stderr to stdout under some
> circumstances. The __del__ method of my class was resetting sys.stderr
> to sys.__stderr__.

Don't do that. Never use sys.__stderr__. Instead store the old value in a temporary variable and restore sys.stderr afterwards. See:

http://groups.google.com/groups?lr=&selm=just-E1C26C.10084603012005%40news1.news.xs4all.nl

Bye,

Walter Dörwald

From hpk at trillke.net Sat Jan 15 12:11:55 2005
From: hpk at trillke.net (holger krekel)
Date: Sat, 15 Jan 2005 12:11:55 +0100
Subject: [py-dev] improved reporting ...
Message-ID: <20050115111155.GB14660@solar.trillke.net>

hi py.test.onistas,

i have cleaned up and improved the text-reporting for failing tests (i think :-). You can see the results with e.g.

    py.test py/documentation/example/pytest/failure_demo.py

which gives you a list of 40 reported failing tests.
have fun (and report back glitches),

holger

From hpk at trillke.net Sat Jan 15 14:13:20 2005
From: hpk at trillke.net (holger krekel)
Date: Sat, 15 Jan 2005 14:13:20 +0100
Subject: [py-dev] relto()
In-Reply-To: <20050111222452.GA24033@vicky.ecs.soton.ac.uk>
References: <20050111222452.GA24033@vicky.ecs.soton.ac.uk>
Message-ID: <20050115131320.GE14660@solar.trillke.net>

Hi Armin,

On Tue, Jan 11, 2005 at 22:24 +0000, Armin Rigo wrote:
> Hi Holger,
>
>     >>> py.path.local('bcde').relto(py.path.local('b'))
>     'de'
>
> looks quite unexpected to me...

indeed. this should now be fixed: it gives an empty string.

cheers,

holger

From hpk at trillke.net Sat Jan 15 14:34:30 2005
From: hpk at trillke.net (holger krekel)
Date: Sat, 15 Jan 2005 14:34:30 +0100
Subject: [py-dev] core design of py.test broken?
In-Reply-To: <20050111153705.GB12085@vicky.ecs.soton.ac.uk>
References: <20050104234929.GD7439@solar.trillke.net> <20050111153705.GB12085@vicky.ecs.soton.ac.uk>
Message-ID: <20050115133430.GF14660@solar.trillke.net>

Hi Armin,

On Tue, Jan 11, 2005 at 15:37 +0000, Armin Rigo wrote:
> On Wed, Jan 05, 2005 at 12:49:29AM +0100, holger krekel wrote:
> > Basically we could think of a Collector offering two concepts:
> >
> >     listdir()   # list all the possible Items (test methods) and
> >                 # "recursion" points, i.e. Collectors, each
> >                 # of which has a specific associated name
> >
> >     join(name)  # get a contained Collector/Item specified
> >                 # by such a name
>
> In this model generative tests are themselves collectors, not items, and the
> listdir() should enumerate all the sub-tests by enumerating the user-defined
> generator. The easiest would be to use integer indices as the names of the
> sub-items [...] In general I think that using just lists of names or numbers
> as paths to individual test items looks good.

OK, good! I am not sure about naming/indexing of subtests from generative tests yet. Ian's proposal also looks interesting.
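Ian's scheme (str() the first argument, strip non-alphanumeric characters, truncate, then uniquify with an integer suffix) could be sketched like this; a hypothetical helper for illustration, not py.test code:

```python
import re

def subtest_id(args, used, maxlen=20):
    """Derive a stable name for a generated sub-test from its first
    argument, adding an integer suffix when the name is already taken."""
    raw = str(args[0]) if args else ''
    base = re.sub(r'[^A-Za-z0-9]', '', raw) or 'subtest'
    base = base[:maxlen]
    name, i = base, 0
    while name in used:
        i += 1
        name = '%s%d' % (base, i)
    used.add(name)
    return name
```

A generative-test collector could call this once per yielded call tuple, so adding a distinctive first argument gives the user an explicit, stable ID.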
Generative tests have other issues as well, for example the tracebacks don't look nice, yet.

> The semantics of setup/teardown could then be cleanly integrated in this
> model, by putting the related knowledge on the individual Collector classes,
> e.g. with two new methods on all collectors and items: enter() and leave().
> The Module collector would call setup_module() in its enter() method, etc.

We could put enter/leave methods on collectors i guess. But i'd rather keep setup_X/teardown_X on the item for the list-of-names refactoring.

From previous experience and refactoring attempts i am still hesitant to mix the collecting and running parts (setup/execute/teardown) too much. Currently, if you want to stick an extra parameter on all TestClasses in a directory you can provide a custom Item class which defines "setup_class" or "setup_method" to stick it in. If these methods were only available on the collectors you would have to provide a lot more machinery.

cheers,

holger

From hpk at trillke.net Sat Jan 15 15:21:18 2005
From: hpk at trillke.net (holger krekel)
Date: Sat, 15 Jan 2005 15:21:18 +0100
Subject: performance testing [was: Re: [py-dev] Issue with py.test and stderr]
In-Reply-To: <20050114215807.19636.qmail@web54508.mail.yahoo.com>
References: <20050114202208.GY7439@solar.trillke.net> <20050114215807.19636.qmail@web54508.mail.yahoo.com>
Message-ID: <20050115142118.GG14660@solar.trillke.net>

On Fri, Jan 14, 2005 at 13:58 -0800, Grig Gheorghiu wrote:
> Thanks for the detailed response. I'm still very much green at py.test,
> but I'm working on it :-)

I have actually fixed the stdout/stderr issues somewhat by now.

> pyUnitPerf doesn't actually do profiling, it does load/time performance
> testing. You can start by profiling your code for bottlenecks via the
> profile module for example, and then wrap those bottleneck
> functions/methods in unitperf tests.

OK, but i don't see a reason why py.test shouldn't help with the first step as well.
For example, i'd like to know where the bottle-necks are when i run tests in a certain directory.

> You can set a desired user load and an expected time to
> completion, and have the tests fail if the time is exceeded.
> There's a good article by Mike Clark at
> http://javapronews.com/javapronews-47-20030721ContinuousPerformanceTestingwithJUnitPerf.html

I see. I guess with py.test you could do something similar today without even modifying the py lib itself (although proper integration would be nice). There are actually two ways to integrate such performance testing. First the simple example:

    def do_something(arg):
        # the stuff you want to performance test

    def test_some_performance_stuff():
        yield run_test_in_one_second, do_something, 42

    def run_test_in_one_second(call, *args):
        # XXX async dispatch running "call(*args)"
        if not wait_test_finish(timeout):
            # XXX kill asynchronous test (if possible,
            # otherwise wait :-)
            raise py.test.fail("eek! test took too long")

and that's it. Well, you need to implement the async-dispatching/synchronization parts but it might be hard for the generic py.test tool to do that properly without knowledge about the application, anyway.

There also is another way to integrate such a performance testing facility on a more general level. For example, you might want to allow "PerfTest" classes everywhere in your test code base, e.g.:

    class PerfTestCompletesInOneSecond:
        timeout = 1.0

        def test_something_that_might_be_too_slow(self):
            # ....

and all test methods of that class would need to complete within the specified timeout. Therefore you could have a conftest.py file (in a directory where you want to run performance tests) and provide a custom Module-level test collector which provides "PerfTestItems".
    import py

    class Module(py.test.collect.Class):
        def collect_perftest(self, extpy):
            if extpy.check(class_=True, basename='PerfTest'):
                yield PerfTestItem(extpy)

    class PerfTestItem(py.test.Item):
        def execute(self, target):
            timeout = target.im_self.timeout
            # XXX dispatch asynchronously
            target()
            if not wait_test_finish(timeout):
                # XXX kill asynchronous test (if possible,
                # otherwise wait :-)
                raise self.Failed("test took too long")

that's basically it. Note that the latter method of customization is likely to change slightly. (see http://codespeak.net/pipermail/py-dev/2005-January/000122.html for more details if interested, don't worry too much).

take care,

holger

From arigo at tunes.org Mon Jan 17 11:16:30 2005
From: arigo at tunes.org (Armin Rigo)
Date: Mon, 17 Jan 2005 10:16:30 +0000
Subject: [py-dev] Re: [py-svn] r8187 - py/dist/py
In-Reply-To: <20050109204141.GN7439@solar.trillke.net>
References: <20050109182838.EA1C027B75@code1.codespeak.net> <20050109200546.GL7439@solar.trillke.net> <20050109204141.GN7439@solar.trillke.net>
Message-ID: <20050117101630.GA20323@vicky.ecs.soton.ac.uk>

Hi Holger,

On Sun, Jan 09, 2005 at 09:41:41PM +0100, holger krekel wrote:
> hacks for importing the py lib itself was involved. Should we,
> after all, try to substitute these hacks with a clean custom
> import hook as well?

I don't know. So far I believe that we have found a reasonable compromise, though I'm not sure how painful it will become in the future. I think that the py lib (and most "reasonable" packages) can already be imported both with Python's __import__ and with the py lib's system, so it's fine to use one or the other depending on the situation. Plus, bootstrapping is always messy...
Armin

From arigo at tunes.org Mon Jan 17 11:21:19 2005
From: arigo at tunes.org (Armin Rigo)
Date: Mon, 17 Jan 2005 10:21:19 +0000
Subject: [py-dev] Issue with py.test and stderr
In-Reply-To: <20050114195649.73273.qmail@web54508.mail.yahoo.com>
References: <20050114195649.73273.qmail@web54508.mail.yahoo.com>
Message-ID: <20050117102119.GB20323@vicky.ecs.soton.ac.uk>

Hi Grig,

On Fri, Jan 14, 2005 at 11:56:48AM -0800, Grig Gheorghiu wrote:
> I did run into a slight issue though. At some point I was instantiating
> a class which was redirecting stderr to stdout under some
> circumstances. The __del__ method of my class was resetting sys.stderr
> to sys.__stderr__.

Note that there is support in py.test for clean setup/teardown semantics, i.e. instead of using __del__ (which is not guaranteed to run timely) you should save and restore sys.stderr in setup/teardown methods; e.g.:

    def setup_module(mod):
        mod.saved_stderr = sys.stderr
        sys.stderr = ...something else...

    def teardown_module(mod):
        sys.stderr = mod.saved_stderr

Armin

From arigo at tunes.org Mon Jan 17 11:34:48 2005
From: arigo at tunes.org (Armin Rigo)
Date: Mon, 17 Jan 2005 10:34:48 +0000
Subject: [py-dev] improved reporting ...
In-Reply-To: <20050115111155.GB14660@solar.trillke.net>
References: <20050115111155.GB14660@solar.trillke.net>
Message-ID: <20050117103448.GC20323@vicky.ecs.soton.ac.uk>

Hi Holger,

On Sat, Jan 15, 2005 at 12:11:55PM +0100, holger krekel wrote:
> py.test py/documentation/example/pytest/failure_demo.py
>
> which gives you a list of 40 reported failing tests.

I only get 20 tests (all failing)...
Armin

From hpk at trillke.net Mon Jan 17 11:53:51 2005
From: hpk at trillke.net (holger krekel)
Date: Mon, 17 Jan 2005 11:53:51 +0100
Subject: [py-dev] Issue with py.test and stderr
In-Reply-To: <20050117102119.GB20323@vicky.ecs.soton.ac.uk>
References: <20050114195649.73273.qmail@web54508.mail.yahoo.com> <20050117102119.GB20323@vicky.ecs.soton.ac.uk>
Message-ID: <20050117105351.GP14660@solar.trillke.net>

On Mon, Jan 17, 2005 at 10:21 +0000, Armin Rigo wrote:
> On Fri, Jan 14, 2005 at 11:56:48AM -0800, Grig Gheorghiu wrote:
> > I did run into a slight issue though. At some point I was instantiating
> > a class which was redirecting stderr to stdout under some
> > circumstances. The __del__ method of my class was resetting sys.stderr
> > to sys.__stderr__.
>
> Note that there is support in py.test for clean setup/teardown semantics, i.e.
> instead of using __del__ (which is not guaranteed to run timely) you should
> save and restore sys.stderr in setup/teardown methods; e.g.:
>
>     def setup_module(mod):
>         mod.saved_stderr = sys.stderr
>         sys.stderr = ...something else...
>
>     def teardown_module(mod):
>         sys.stderr = mod.saved_stderr

but then you may want to run with the "-S" or "--nocapture" option so that py.test doesn't itself do stdout/stderr mangling. There is - unfortunately - no easy way yet to switch to "nocapture" just for a specific module, class or method. Although py.test is quite customizable, it is still lacking when it comes to programmatically modifying the options given via cmdline-switches.

cheers,

holger

From hpk at trillke.net Mon Jan 17 11:55:01 2005
From: hpk at trillke.net (holger krekel)
Date: Mon, 17 Jan 2005 11:55:01 +0100
Subject: [py-dev] improved reporting ...
In-Reply-To: <20050117103448.GC20323@vicky.ecs.soton.ac.uk>
References: <20050115111155.GB14660@solar.trillke.net> <20050117103448.GC20323@vicky.ecs.soton.ac.uk>
Message-ID: <20050117105501.GQ14660@solar.trillke.net>

On Mon, Jan 17, 2005 at 10:34 +0000, Armin Rigo wrote:
> Hi Holger,
>
> On Sat, Jan 15, 2005 at 12:11:55PM +0100, holger krekel wrote:
> > py.test py/documentation/example/pytest/failure_demo.py
> >
> > which gives you a list of 40 reported failing tests.
>
> I only get 20 tests (all failing)...

um, yes, 20 actually :-)

morning'ly yours,

holger

From arigo at tunes.org Mon Jan 17 12:27:26 2005
From: arigo at tunes.org (Armin Rigo)
Date: Mon, 17 Jan 2005 11:27:26 +0000
Subject: [py-dev] Re: [Python-Dev] Getting rid of unbound methods: patch available
In-Reply-To:
References:
Message-ID: <20050117112726.GA10827@vicky.ecs.soton.ac.uk>

Hi Holger & py.testers,

On Sun, Jan 16, 2005 at 10:12:37PM -0800, Guido van Rossum wrote:
> https://sourceforge.net/tracker/index.php?func=detail&aid=1103689&group_id=5470&atid=305470
>
> Here's a patch that gets rid of unbound methods, as
> discussed here before. A function's __get__ method
> now returns the function unchanged when called without
> an instance, instead of returning an unbound method object.

This patch breaks py.test quite extensively, which is no surprise because it contains a number of tests along the lines of hasattr(func, 'im_self'). After a quick scan over py.test I'm not sure how easily this could be fixed, and more importantly how easy it would be to fix it while still preserving compatibility with existing Python versions. The "hard" bit with removed unbound methods is that it's no longer possible to recover the class C from which a function C.f was read (i.e. there is no im_class any more).

I would suggest that we (I) give the issue some serious thought, and if it really makes the life of the py lib difficult, it will be a good "real life" example of breakage.
I'm sure it would be a good argument to postpone the patch to the faraway ages usually called "Python 3000".

Armin

From hpk at trillke.net Mon Jan 17 12:56:52 2005
From: hpk at trillke.net (holger krekel)
Date: Mon, 17 Jan 2005 12:56:52 +0100
Subject: [py-dev] Re: [Python-Dev] Getting rid of unbound methods: patch available
In-Reply-To: <20050117112726.GA10827@vicky.ecs.soton.ac.uk>
References: <20050117112726.GA10827@vicky.ecs.soton.ac.uk>
Message-ID: <20050117115652.GS14660@solar.trillke.net>

Hi Armin,

On Mon, Jan 17, 2005 at 11:27 +0000, Armin Rigo wrote:
> On Sun, Jan 16, 2005 at 10:12:37PM -0800, Guido van Rossum wrote:
> > https://sourceforge.net/tracker/index.php?func=detail&aid=1103689&group_id=5470&atid=305470
> >
> > Here's a patch that gets rid of unbound methods, as
> > discussed here before. A function's __get__ method
> > now returns the function unchanged when called without
> > an instance, instead of returning an unbound method object.
>
> This patch breaks py.test quite extensively, which is no surprise because it
> contains a number of tests along the lines of hasattr(func, 'im_self').

pydoc and other inspection-related modules and packages will likely break as well. Especially adding im_self/im_func to plain functions breaks assumptions that have been valid with Python for many years.

> After a quick scan over py.test I'm not sure how easily this could be fixed,
> and more importantly how easily it would be to fix it while still preserving
> compatibility with existing Python versions.

I guess we would need to keep track of class information ourselves. So a test item would need to grow a class attribute so that we know which class a function belongs to. Maybe it's more difficult than that.

> I would suggest that we (I) give the issue some serious thoughts, and if it
> really makes the life of the py lib difficult, it will be a good "real life"
> example of breakage. I'm sure it would be a good argument to postpone the
> patch to the faraway ages usually called "Python 3000".

Yes, i agree. I am pretty sure that py.test isn't the only real-life example that breaks by changing the introspection model for functions/methods. The gain is too small to warrant such a change before Python 3000 IMHO. After all, where is the "real need" to introduce the change? (this question is often thrown at change-ideas by Guido :-)

holger

From arigo at tunes.org Mon Jan 17 14:14:17 2005
From: arigo at tunes.org (Armin Rigo)
Date: Mon, 17 Jan 2005 13:14:17 +0000
Subject: [py-dev] Re: [Python-Dev] Getting rid of unbound methods: patch available
In-Reply-To: <20050117115652.GS14660@solar.trillke.net>
References: <20050117112726.GA10827@vicky.ecs.soton.ac.uk> <20050117115652.GS14660@solar.trillke.net>
Message-ID: <20050117131417.GA2380@vicky.ecs.soton.ac.uk>

Hi,

On Mon, Jan 17, 2005 at 12:56:52PM +0100, holger krekel wrote:
> > This patch breaks py.test quite extensively, which is no surprise because it
> > contains a number of tests along the lines of hasattr(func, 'im_self').
>
> pydoc and other inspection-related modules and packages will likely
> break as well. Especially adding im_self/im_func to plain functions
> breaks assumptions that have been valid with Python for many years.

I agree. It turned out that fixing the py lib for the patch wasn't too complicated, because at the time it is needed we still know the extpy paths from which a method is resolved, so that we can figure out the parent object along the path, and check if it is a class. However, I can easily imagine that it could have been more involved.

For reference, the diff is attached to this message. Here is a summary that I intend to post as a comment on Guido's patch:

-+-

Here are the issues that had to be fixed in the case of the py lib:

* no im_class, so we need another way to figure out which class a method comes from.
  As it turned out, and in part by chance, it wasn't too complicated to know it in the case of the py lib, but I can easily imagine cases where the "path" along which the method is found would be long lost. This kind of usage of the 'im_class' attribute is arguably an accident, but it is convenient and I can imagine that it's a trick relatively widely used.

* a number of places break because they expect plain functions *not* to have an 'im_self' attribute.

* the trick of 'f.im_func==f' for plain functions fooled the code usually in the right way.

On the other hand, I believe that the resulting (fixed) code is a bit clearer; by necessity, it checks explicitly if an object comes from a class, instead of, for example, "if hasattr(gen, 'im_self') and not gen.im_self:". It was still a breakage, and I believe that a lot of projects would break. Adding 'im_self' and 'im_func' to function objects only helps a bit; based on this experience I'd still expect any program doing any kind of introspection on function/method objects to require some maintenance. To me it looks like a nice simplification, but it also looks like it should not go into Python right now :-(

-+-

Armin

From arigo at tunes.org Mon Jan 17 14:17:21 2005
From: arigo at tunes.org (Armin Rigo)
Date: Mon, 17 Jan 2005 13:17:21 +0000
Subject: [py-dev] Re: [Python-Dev] Getting rid of unbound methods: patch available
In-Reply-To: <20050117131417.GA2380@vicky.ecs.soton.ac.uk>
References: <20050117112726.GA10827@vicky.ecs.soton.ac.uk> <20050117115652.GS14660@solar.trillke.net> <20050117131417.GA2380@vicky.ecs.soton.ac.uk>
Message-ID: <20050117131721.GA30096@vicky.ecs.soton.ac.uk>

Hi,

On Mon, Jan 17, 2005 at 01:14:17PM +0000, Armin Rigo wrote:
> For reference, the diff is attached to this message.

Or this one.
Armin

-------------- next part --------------
Index: test/item.py
===================================================================
--- test/item.py	(revision 8327)
+++ test/item.py	(working copy)
@@ -84,13 +84,15 @@
         if x is not None:
             x(function)
 
-    def make_callable(self, method):
+    def resolve_callable(self, extpy):
+        container = extpy.dirpath().resolve()
+        method = extpy.resolve()
         assert callable(method)
-        if not hasattr(method, 'im_class'):
+        if not isclass(container):
             return method
-        if self.state._instance.__class__ != method.im_class:
-            self.state._instance = method.im_class()
-        return method.__get__(self.state._instance, method.im_class)
+        if self.state._instance.__class__ != container:
+            self.state._instance = container()
+        return method.__get__(self.state._instance, container)
 
 class Item(SetupItem):
     """ an Item is responsible for locating and executing
@@ -104,8 +106,8 @@
 
     def run(self, driver):
         self.setup_path(self.extpy)
-        method = self.make_callable(self.extpy.resolve())
-        if hasattr(method, 'im_self'):
+        method = self.resolve_callable(self.extpy)
+        if getattr(method, 'im_self', None) is not None:
             self.setup_method(method)
         else:
             self.setup_function(method)
Index: test/compat.py
===================================================================
--- test/compat.py	(revision 8327)
+++ test/compat.py	(working copy)
@@ -7,7 +7,7 @@
     """
     def execute(self, driver):
         unboundmethod = self.extpy.resolve()
-        cls = unboundmethod.im_class
+        cls = self.extpy.dirpath().resolve()
         instance = cls()
         instance.setUp()
         try:
Index: test/collect.py
===================================================================
--- test/collect.py	(revision 8327)
+++ test/collect.py	(working copy)
@@ -172,11 +172,15 @@
             if extpy.check(genfunc=1):
                 yield Generator(extpy)
             else:
+                container = extpy.dirpath().resolve()
                 func = extpy.resolve()
-                try:
-                    yield getattr(func.im_class, 'Item')(extpy)
-                except AttributeError:
-                    yield self.Item(extpy)
+                Item = self.Item
+                if inspect.isclass(container):
+                    try:
+                        Item = container.Item
+                    except AttributeError:
+                        pass
+                yield Item(extpy)
 
 class Generator(PyCollector):
     def builditem(self, obj):
@@ -197,9 +201,10 @@
             #sm.setup_path(self.extpy)
             #gen, teardown = sm.setup_method(self.extpy)
             #assert not teardown, "%r not processoable in Generator-Collector (XXX)"
+            container = self.extpy.dirpath().resolve()
             gen = self.extpy.resolve()
-            if hasattr(gen, 'im_self') and not gen.im_self:
-                gen = gen.__get__(gen.im_class(), gen.im_class)
+            if inspect.isclass(container):
+                gen = gen.__get__(container(), container)
             for call in gen():
                 yield self.builditem(call)
         except:

From grig at gheorghiu.net Mon Jan 17 15:33:13 2005
From: grig at gheorghiu.net (Grig Gheorghiu)
Date: Mon, 17 Jan 2005 06:33:13 -0800 (PST)
Subject: [py-dev] Issue with py.test and stderr
In-Reply-To: <20050117102119.GB20323@vicky.ecs.soton.ac.uk>
Message-ID: <20050117143313.22914.qmail@web54508.mail.yahoo.com>

Armin,

Thanks for the response. I've noticed that messing with sys.__stderr__ is risky, so from now on I'll follow the "save and restore sys.stderr" recipe.

BTW, would you guys be interested in some performance/scalability tests for py.test? I was thinking about timing py.test on various configurations that would involve things like:

- large number of files in a directory
- large number of sub-directories in a directory
- deeply nested directory tree

This would probably exercise the Collector part of py.test more than other parts, but it would still be interesting in my opinion. What do you think?

Thanks,

Grig

--- Armin Rigo wrote:
> Hi Grig,
>
> On Fri, Jan 14, 2005 at 11:56:48AM -0800, Grig Gheorghiu wrote:
> > I did run into a slight issue though. At some point I was instantiating
> > a class which was redirecting stderr to stdout under some
> > circumstances. The __del__ method of my class was resetting sys.stderr
> > to sys.__stderr__.
>
> Note that there is support in py.test for clean setup/teardown
> semantics, i.e.
> instead of using __del__ (which is not guaranteed to run timely) you
> should
> save and restore sys.stderr in setup/teardown methods; e.g.:
>
>     def setup_module(mod):
>         mod.saved_stderr = sys.stderr
>         sys.stderr = ...something else...
>
>     def teardown_module(mod):
>         sys.stderr = mod.saved_stderr
>
>
> Armin
>

From hpk at trillke.net Mon Jan 17 23:38:11 2005
From: hpk at trillke.net (holger krekel)
Date: Mon, 17 Jan 2005 23:38:11 +0100
Subject: [py-dev] Issue with py.test and stderr
In-Reply-To: <20050117143313.22914.qmail@web54508.mail.yahoo.com>
References: <20050117102119.GB20323@vicky.ecs.soton.ac.uk>
	<20050117143313.22914.qmail@web54508.mail.yahoo.com>
Message-ID: <20050117223810.GG14660@solar.trillke.net>

Hi Grig,

On Mon, Jan 17, 2005 at 06:33 -0800, Grig Gheorghiu wrote:
> Armin,
>
> Thanks for the response. I've noticed that messing with sys.__stderr__
> is risky, so from now on I'll follow the "save and restore sys.stderr"
> recipe.
>
> BTW, would you guys be interested in some performance/scalability tests
> for py.test?

Up front, I have to admit i am not particularly worried about py.test's
performance (yet). On a side note, when converting PyPy's tests we found
that tests ran some 20% faster with py.test than with our previous
unittest.py hacks. But this is largely related to the relative messiness
of our previous code (which may be regarded as a result of trying to
fit our requirements into unittest.py).

>I was thinking about timing py.test on various
> configurations that would involve things like:
>
> - large number of files in a directory
> - large number of sub-directories in a directory
> - deeply nested directory tree
>
> This would probably exercise the Collector part of py.test more than
> other parts, but it would still be interesting in my opinion. What do
> you think?

py.test is iteratively walking directory trees.
It basically uses "listdir()" returning a list which - for sufficiently
large directories - may give a performance penalty compared to
opendir()/readdir()/closedir() wrapped in some nice Iterable. But in
practice you might have to merge Zope3, twisted and 100 other packages
to see an impact of using the non-lazy listdir() :-)

cheers,

    holger

From mscott at goldenspud.com Tue Jan 18 02:00:24 2005
From: mscott at goldenspud.com (Matthew Scott)
Date: Mon, 17 Jan 2005 19:00:24 -0600
Subject: [py-dev] Problems with DOS files
In-Reply-To: <200412022140.iB2LeQ5M003024@host13.apollohosting.com>
References: <200412022140.iB2LeQ5M003024@host13.apollohosting.com>
Message-ID: <41EC5FA8.1020209@goldenspud.com>

Patrick K. O'Brien wrote:
>>Is there
>>anything similarly simple that can be done under windows?
>
>
> Not that I'm aware of. :-(
>
> Take that back. Courtesy of Matthew Scott, here is a clever, two-line
> solution (concatenate that second line if it breaks):
>
> pytest.bat
> ==========
> @echo off
> python -c "import py; from py.__impl__.test.cmdline import main; main()" %1
> %2 %3 %4 %5 %6 %7 %8 %9

Just had to do something similar with a Schevo script, so here's a new
version that can be put into py\bin as pytest.cmd:

@echo off
python %~dp0\py.test %*

Not sure if it will work on non-NT-based systems, or with a .bat
extension instead of .cmd, but this style seems to work great for the
other script I had to write. Above py.test script is untested by me
though since I avoid Windows whenever possible. :)

Opening this URL in Internet Explorer will shed some light on the
"advanced" batch file syntax, at least on Windows XP:
ms-its:C:\WINDOWS\Help\ntcmds.chm::/percent.htm

Basically, %~dp0 resolves to the drive and directory of the pytest.cmd
file that was executed, and %* is the rest of the command-line arguments
given to the batch file.
The py\bin directory needs to be in the PATH environment variable but everything else is done via Py's magic :) - Matthew From hpk at trillke.net Tue Jan 18 23:19:24 2005 From: hpk at trillke.net ('holger krekel') Date: Tue, 18 Jan 2005 23:19:24 +0100 Subject: [py-dev] Problems with DOS files In-Reply-To: <41EC5FA8.1020209@goldenspud.com> References: <200412022140.iB2LeQ5M003024@host13.apollohosting.com> <41EC5FA8.1020209@goldenspud.com> Message-ID: <20050118221924.GV14660@solar.trillke.net> Hi Matthew, On Mon, Jan 17, 2005 at 19:00 -0600, Matthew Scott wrote: > Patrick K. O'Brien wrote: > >>Is there > >>anything similarly simple that can be done under windows? > > > > > >Not that I'm aware of. :-( > > > >Take that back. Courtesy of Matthew Scott, here is a clever, two-line > >solution (concatenate that second line if it breaks): > > > >pytest.bat > >========== > >@echo off > >python -c "import py; from py.__impl__.test.cmdline import main; main()" %1 > >%2 %3 %4 %5 %6 %7 %8 %9 > > Just had to do something similar with a Schevo script, so here's a new > version that can be put into py\bin as pytest.cmd: > > @echo off > python %~dp0\py.test %* much nicer! Now if only the thing could be named "pytest" :-) Btw, i played around a bit and when i just start the plain (unixish) 'py.test' i got a dialogue asking me for the application i want to execute with, I chose "python" from the menu and said yes to "always open with this application ...". And now (after i added the 'bin' directory to the Path environment) it just works if i type "py.test". Hum, i guess that now all files ending with ".test" will get executed by python so it's probably not a recommendable solution. Alternatively, py.test.cmd could probably be copied to some other directory (contained in the Path env) along with _findpy.py and other scripts and it would just work. 
(or the windows script really gets a different name but i'd like to avoid it if possible) Hum, we are getting closer to good solutions either way i think :-) cheers, holger From florian.proff.schulze at gmx.net Wed Jan 19 12:19:56 2005 From: florian.proff.schulze at gmx.net (Florian Schulze) Date: Wed, 19 Jan 2005 12:19:56 +0100 Subject: [py-dev] Problems with DOS files In-Reply-To: <20050118221924.GV14660@solar.trillke.net> References: <200412022140.iB2LeQ5M003024@host13.apollohosting.com> <41EC5FA8.1020209@goldenspud.com> <20050118221924.GV14660@solar.trillke.net> Message-ID: On Tue, 18 Jan 2005 23:19:24 +0100, 'holger krekel' wrote: > Hi Matthew, > > On Mon, Jan 17, 2005 at 19:00 -0600, Matthew Scott wrote: >> Patrick K. O'Brien wrote: >> >>Is there >> >>anything similarly simple that can be done under windows? >> > >> > >> >Not that I'm aware of. :-( >> > >> >Take that back. Courtesy of Matthew Scott, here is a clever, two-line >> >solution (concatenate that second line if it breaks): >> > >> >pytest.bat >> >========== >> >@echo off >> >python -c "import py; from py.__impl__.test.cmdline import main; >> main()" %1 >> >%2 %3 %4 %5 %6 %7 %8 %9 >> >> Just had to do something similar with a Schevo script, so here's a new >> version that can be put into py\bin as pytest.cmd: >> >> @echo off >> python %~dp0\py.test %* > > much nicer! Now if only the thing could be named "pytest" :-) I put a py.test.bat into my C:\Python23\Scripts directory and added that to my path. Now I can just use it by typing "py.test ...". So where is the problem? Does it really need to be called py.test? You could write two files, one names py.test for unix and one py.test.bat/py.test.cmd and just install both into a directory which is in the path. Then it works the same on both unix and windows. 
Regards, Florian Schulze From mscott at goldenspud.com Tue Jan 25 19:43:36 2005 From: mscott at goldenspud.com (Matthew Scott) Date: Tue, 25 Jan 2005 12:43:36 -0600 Subject: [py-dev] thoughts on embedding files Message-ID: <41F69358.8050000@goldenspud.com> (cross-posting this to both the Schevo and Py mailing lists, to get the input of both sets of people) Extending evo ============= The next task I am embarking upon for the Schevo project is to add two new actions to evo, a tool that was just added that is a central place to perform actions to create, run, and generally work with Schevo applications: * "evo py2exe", which will perform all necessary steps to turn your application into a Windows executable using Py2exe. * "evo innosetup", which will perform the py2exe action, then wrap the executable inside a Windows installer using the free InnoSetup program. (see http://lists.orbtech.com/pipermail/schevo-devel/2005-January/000060.html for more information) These actions will be built to satisfy the requirements of an application that Patrick and I are working on professionally, but will be useful for packaging any Schevo application for deployment on Windows platforms. In the future, expect to see "evo cxfreeze", and perhaps one day "evo deb" and "evo rpm". Embedding files --------------- One of the requirements of turning a Schevo application into a Py2exe application is that of file embedding. If you have looked at Schevo prior to the latest merge, you might have noticed that each application had its own "build.py" step. This step did the following things: * Embedded the contents of the schema directory in a single Python module. * Determined the icons that the application used, and the icon collections that are made available by the application, and embedded those icons in a single Python module. * For some apps, embedded arbitrary files that the user interface would use, such as additional graphics other than icons. 
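For illustration, a build step along those lines can be sketched in a few
lines of Python. (The helper name `embed_directory` and the generated
`files` dictionary are invented for this sketch; this is not Schevo's
actual build.py.)

```python
import os

def embed_directory(src_dir, module_path):
    # Walk src_dir and write a Python module containing a single
    # dictionary that maps slash-separated relative paths to file contents.
    data = {}
    for dirpath, dirnames, filenames in os.walk(src_dir):
        for name in filenames:
            full = os.path.join(dirpath, name)
            rel = os.path.relpath(full, src_dir).replace(os.sep, '/')
            with open(full, 'rb') as f:
                data[rel] = f.read()
    with open(module_path, 'w') as out:
        out.write('# generated module -- do not edit\n')
        out.write('files = %r\n' % (data,))
    return data
```

An application can then import the generated module and read from its
`files` dictionary instead of touching the filesystem, which is what lets
the data travel inside a py2exe library ZIP.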
The reason all of this was necessary is that we wanted to make sure that the entirety of the schema, and all of the icons an app would use, would be embedded within a Py2exe application's library ZIP file, and not just copied into plain files that could be easily modified by a user of the application. The easy way to do this during development, of course, is to always build the generated Python modules, and to always use those modules. This worked in the short term, but made application startup during development time-consuming, and added much boilerplate necessary for creating new applications. What we want to do instead is to make sure that when developing a Schevo application, the developer does NOT worry about how or when to embed files in this manner. The developer should just be able to create the application, efficiently run it in a development state using normal files, and then run "evo py2exe" to create an executable. The "evo py2exe" action should embody all of the logic needed to determine what needs to be embedded and where, and the final application that is generated should have the necessary logic to know that it should read file-like objects from the embedded Python modules rather than real files on disk. So, it becomes necessary to create an API to allow this transparency to occur, so that the application developer must only make minimal changes to his/her way of thinking but may still take advantage of files embedded within Python modules. The py.path API --------------- Early on in the Schevo project after it was split from Pypersyst, we made the decision to use py.test from the "py" library to run our unit tests. We were happy with this decision, but this meant that the user would need to keep up-to-date with the py library, which is under heavy development. So we kept our usage of py within Schevo optional, and minimal. Now that we are including py within Schevo as a dependency, we can control which revision of it we use. 
So if some major API shift occurs in the main py distribution, we can just stick with a "known good" revision for Schevo. Because of this, we are moving from tapping the water lightly with our toes to walking into it knee deep. :) The py.path API is the first major step toward using py for more than just unit testing. py.path provides a very well-designed object model for dealing with paths of all sorts, both local, remote, and virtual. I've started using it in Schevo whenever I refactor a piece of code that works with filesystem paths. So, what I am thinking is that a py.path.embed package could be created that would embody the aspects of embedding files within a Python module, along with providing a single API that could be used to access those files both when they are still in the local filesystem, and when they have been embedded into Python modules. I will be writing more as I discover more about py.path's innards, and determine whether this would be a good choice or not for embedding files. Any feedback from the Py and Schevo teams is most appreciated :) From bob at redivi.com Tue Jan 25 20:29:10 2005 From: bob at redivi.com (Bob Ippolito) Date: Tue, 25 Jan 2005 14:29:10 -0500 Subject: [py-dev] thoughts on embedding files In-Reply-To: <41F69358.8050000@goldenspud.com> References: <41F69358.8050000@goldenspud.com> Message-ID: <638643C2-6F07-11D9-A261-000A95BA5446@redivi.com> On Jan 25, 2005, at 13:43, Matthew Scott wrote: > The next task I am embarking upon for the Schevo project is to add two > new actions to evo, a tool that was just added that is a central place > to perform actions to create, run, and generally work with Schevo > applications: > > * "evo py2exe", which will perform all necessary steps to turn your > application into a Windows executable using Py2exe. > > * "evo innosetup", which will perform the py2exe action, then wrap the > executable inside a Windows installer using the free InnoSetup > program. 
> > (see > http://lists.orbtech.com/pipermail/schevo-devel/2005-January/ > 000060.html for more information) > > These actions will be built to satisfy the requirements of an > application that Patrick and I are working on professionally, but will > be useful for packaging any Schevo application for deployment on > Windows platforms. > > In the future, expect to see "evo cxfreeze", and perhaps one day "evo > deb" and "evo rpm". I really don't like the idea of having some script churn the distutils wheels behind the scenes with no way to hook into it or support packaging commands that you don't support specifically already. Why not have an "evo makesetup" that spits out an appropriate setup.py for packaging the application? This setup.py would have enough information to support nearly any distutils packaging command (py2exe, cxfreeze, py2app, rpm, deb, etc), though it may initially only be wired to support py2exe specifically, a user on another platform would be able to have a good starting point to (a) make their application work (b) submit a patch to you to support their preferred packaging method (because they know what they needed to add to the setup.py "template"). In a lot of scenarios, supporting py2app is a matter of changing the setup(...) option from "windows" to "app". 
A portable setup file can be written like this (though obviously will
require a lot more machinery to support a complex application):

from distutils.core import setup
import sys

if sys.platform == 'darwin':
    import py2app
    buildstyle = 'app'
elif sys.platform == 'win32':
    import py2exe
    # buildstyle = 'console'
    buildstyle = 'windows'

setup(
    name="application",
    **{buildstyle : ['application.py']}
)

-bob

From mscott at goldenspud.com Thu Jan 27 07:33:53 2005
From: mscott at goldenspud.com (Matthew Scott)
Date: Thu, 27 Jan 2005 00:33:53 -0600
Subject: [py-dev] thoughts on embedding files
In-Reply-To: <41F69358.8050000@goldenspud.com>
References: <41F69358.8050000@goldenspud.com>
Message-ID: <41F88B51.6080209@goldenspud.com>

Matthew Scott wrote:
> (cross-posting this to both the Schevo and Py mailing lists, to get the
> input of both sets of people)
>
> I will be writing more as I discover more about py.path's innards, and
> determine whether this would be a good choice or not for embedding files.

As promised, here is more. I've begun working on a py.path.embed
implementation. Once I've completed a working first pass at it, I'll
post a patch to the list for review.

So far, I've skimmed over the py.path.local API, and have also written
down some thoughts and "forward-looking documentation" on how I think
this could work.

Before I paste that in though, I'll preface it by saying that the first
implementation of embedding files within Schevo was very specific. That
is, for schema, icons, and general files, there were (well... still are)
three different ways of transforming files into Python modules, and
three different sets of loaders that each had "filesystem loaders" and
"python object loaders". In short, lots of duplication and
specialization with little gain.
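One way to avoid that duplication is a single loader interface with
interchangeable back-ends, which is effectively what py.path.embed aims
for. A minimal sketch of the idea (class and attribute names invented
for illustration, not Schevo's actual loaders):

```python
import os

class FileLoader:
    # back-end that reads from real files under a base directory
    def __init__(self, base):
        self.base = base

    def load(self, relpath):
        with open(os.path.join(self.base, relpath), 'rb') as f:
            return f.read()

class EmbeddedLoader:
    # back-end that reads from a dict embedded in a generated module
    def __init__(self, files):
        self.files = files

    def load(self, relpath):
        return self.files[relpath]
```

Code that only calls `load()` no longer needs to know whether it is
talking to the filesystem or to an embedded module.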
What I plan to do once py.path.embed is implemented is to convert Schevo's schema and icon loaders to use py.path.embed exclusively, so that it can transparently use either local files and directories, or embedded "virtual" files and directories. And with that out of the way, here are my notes, straight from the py.__impl__.path.embed docstring :-) """Embedded path implementation. Introduction ============ This path implementation, `py.path.embed`, lets you use a single API to access either a set of files that are within a common directory, or a Python object that contains 'virtual' files and directories that can be embedded in a Python module. No matter which mode you're in, you can do nearly all typical path/file operations that you would use if you were using `py.path.local` objects. You can also transform a set of real files into a dictionary suitable for embedding within a Python module, and vice versa. Such embedding of data using a file/directory metaphor allows you to develop an application's external data (such as image files for use in a user interface) using standard files and directories, which are easy to use with external tools such as editors and source code repositories. When packaging these files in a form that will be distributed to end-users, it is often desirable to not expose these files and directories to those users. As long as you are using `py.path.embed` to work with sets of files located in your local file system, you can embed those same files into a Python module, which can in turn be included within your distributed application alongside the other required modules. Contexts ======== `py.path.embed` is not a general-purpose `py.path` implementation that can work with any local path. It works within *contexts*, which are self-contained sets of files and directories. There are two types of contexts: - A *Local context* is a collection of all files and directories that are within a single base directory. 
A local context is created based on a `py.path.local` object, which
  becomes the *root* of the context.

  When using `py.path.embed` to access paths, files, and directories
  within a local context, it essentially acts as a proxy to the
  `py.path.local` root path of the context and the real file objects
  that `py.path.local` makes available.

- A *Virtual context* is a collection of virtual files and directories
  that are stored as strings and dictionaries within a Python
  dictionary. A virtual context is created based on either an existing
  dictionary created by `py.path.embed`, or an empty dictionary. The
  dictionary becomes the *root* of the context.

  When using `py.path.embed` to access paths, files, and directories
  within a virtual context, it provides an API that acts as if you were
  accessing paths via `py.path.local` and provides file-like objects
  that act similarly to actual file objects.

Transformation
==============

The transformation of files and directories between local and virtual
contexts is perhaps the primary reason that `py.path.embed` exists.
Thankfully, this becomes an easy process since `py.path.embed` closely
mirrors the `py.path.local` API.

To transform from a local directory to a module containing a
dictionary, follow these steps:

1. Create a local context `localCtx` whose root is the local directory.
   Create a corresponding `py.path.embed` instance called `local` using
   that context::

       root = py.path.local('/path/to/the/files/to/embed')
       localCtx = py.path.embed.LocalContext(root)
       local = py.path.embed(localCtx)

2. Create a virtual context `virtualCtx` whose root is an empty
   dictionary. Create a corresponding `py.path.embed` instance called
   `virtual` using that context::

       virtualCtx = py.path.embed.VirtualContext()
       virtual = py.path.embed(virtualCtx)

3. Populate the virtual context with copies of local files::

       local.copy(target=virtual)

4.
   Open a local file for output which will be your Python module
   containing the embedded files::

       foo = open('foo.py', 'w')

5. Write a representation of the virtual context's root dictionary to
   the Python module::

       foo.write('root = %r\n' % virtualCtx.root)
       foo.close()

Of course, if you want to populate the virtual context from scratch
using a more complex algorithm than simply copying files from the local
filesystem, you can simply create the virtual context and associated
`py.path.embed` object, and use it as if you were creating files using
`py.path.local`.

Once you have this module, you can now use code similar to the
following to attempt to load the embedded version that you distribute,
then fall back to the local version that you use during development if
the embedded version couldn't be loaded::

    try:
        import foo
    except ImportError:
        ctx = py.path.embed.LocalContext(root=py.path.local(...))
    else:
        ctx = py.path.embed.VirtualContext(root=foo.root)
    rootPath = py.path.embed(ctx)
"""

From mscott at goldenspud.com Fri Jan 28 01:52:32 2005
From: mscott at goldenspud.com (Matthew Scott)
Date: Thu, 27 Jan 2005 18:52:32 -0600
Subject: [py-dev] py.path.embed patch
Message-ID: <41F98CD0.5090508@goldenspud.com>

Here is the first stab at py.path.embed. Let me know what you all
think. If it passes muster, feel free to merge it into trunk.

- Matthew

-------------- next part --------------
A non-text attachment was scrubbed...
Name: embed1.patch
Type: text/x-patch
Size: 30716 bytes
Desc: not available
URL:

From lac at strakt.com Fri Jan 28 16:01:56 2005
From: lac at strakt.com (Laura Creighton)
Date: Fri, 28 Jan 2005 16:01:56 +0100
Subject: [py-dev] we are about to get press:
Message-ID: <200501281501.j0SF1uIb003013@ratthing-b246.strakt.com>

http://agiletesting.blogspot.com/2005/01/python-unit-testing-part-1-unittest.html

which got a pointer in the daily python url.

woo woo woo!
laura From grig at gheorghiu.net Fri Jan 28 16:14:11 2005 From: grig at gheorghiu.net (Grig Gheorghiu) Date: Fri, 28 Jan 2005 07:14:11 -0800 (PST) Subject: [py-dev] we are about to get press: In-Reply-To: <200501281501.j0SF1uIb003013@ratthing-b246.strakt.com> Message-ID: <20050128151411.13694.qmail@web54510.mail.yahoo.com> I know that Holger will give a PyCon presentation on py.test, so I hope he won't mind if I cover some py.test stuff myself. I'm just a beginner at using py.test, so I'll only cover the basics -- which are already more than the other 2 frameworks offer anyway... Grig http://agiletesting.blogspot.com --- Laura Creighton wrote: > http://agiletesting.blogspot.com/2005/01/python-unit-testing-part-1-unittest.html > > which got a pointer in the daily python url. > > woo woo woo! > > laura > _______________________________________________ > py-dev mailing list > py-dev at codespeak.net > http://codespeak.net/mailman/listinfo/py-dev > From p.f.moore at gmail.com Fri Jan 28 19:13:08 2005 From: p.f.moore at gmail.com (Paul Moore) Date: Fri, 28 Jan 2005 18:13:08 +0000 Subject: [py-dev] Strange failure with py.test - too much magic? Message-ID: <79990c6b05012810135595ed62@mail.gmail.com> I thought I'd give py.test a try. So I grabbed the code from the Subversion repository and wrote a little test file, which just did an assert 1=0. I don't expect the test to succeed :-) I added the dist-py directory to PYTHONPATH, and ran dist-py\py\bin\pytest.cmd test_pytest.py. And got the attached result. What does all that mean?!?! I expected something a bit more straightforward. I'm running on Windows XP Pro, with Python 2.4 and py.test from Subversion. Paul. PS Is it possible to use py.test apart from the rest of the py library? I'm not interested in the XML stuff, or execnet, and while greenlets sound cool, I don't want to package them up with my testing framework... -------------- next part -------------- A non-text attachment was scrubbed... 
Name: pytest.output Type: application/octet-stream Size: 5015 bytes Desc: not available URL: From ianb at colorstudy.com Fri Jan 28 19:14:09 2005 From: ianb at colorstudy.com (Ian Bicking) Date: Fri, 28 Jan 2005 12:14:09 -0600 Subject: [py-dev] Strange failure with py.test - too much magic? In-Reply-To: <79990c6b05012810135595ed62@mail.gmail.com> References: <79990c6b05012810135595ed62@mail.gmail.com> Message-ID: <41FA80F1.5020708@colorstudy.com> Paul Moore wrote: > I thought I'd give py.test a try. So I grabbed the code from the > Subversion repository and wrote a little test file, which just did an > assert 1=0. I don't expect the test to succeed :-) This is invalid syntax, you meant: assert 1 == 0 Though it's not the best error message that you got. -- Ian Bicking / ianb at colorstudy.com / http://blog.ianbicking.org From hpk at trillke.net Fri Jan 28 19:23:55 2005 From: hpk at trillke.net (holger krekel) Date: Fri, 28 Jan 2005 19:23:55 +0100 Subject: [py-dev] Strange failure with py.test - too much magic? In-Reply-To: <41FA80F1.5020708@colorstudy.com> References: <79990c6b05012810135595ed62@mail.gmail.com> <41FA80F1.5020708@colorstudy.com> Message-ID: <20050128182355.GW467@solar.trillke.net> On Fri, Jan 28, 2005 at 12:14 -0600, Ian Bicking wrote: > Paul Moore wrote: > >I thought I'd give py.test a try. So I grabbed the code from the > >Subversion repository and wrote a little test file, which just did an > >assert 1=0. I don't expect the test to succeed :-) > > This is invalid syntax, you meant: > > assert 1 == 0 > > Though it's not the best error message that you got. Yes, that is indeed true. SyntaxErrors and ImportErrors of test files are totally utterly ugly at the moment. The thing is that there is an upcoming refactoring of the "collection process" on the todo list which will change reporting details anyway ... and i am a bit caught up in other activities at the moment. 
holger From hpk at trillke.net Fri Jan 28 19:28:52 2005 From: hpk at trillke.net (holger krekel) Date: Fri, 28 Jan 2005 19:28:52 +0100 Subject: [py-dev] Strange failure with py.test - too much magic? In-Reply-To: <79990c6b05012810135595ed62@mail.gmail.com> References: <79990c6b05012810135595ed62@mail.gmail.com> Message-ID: <20050128182852.GX467@solar.trillke.net> On Fri, Jan 28, 2005 at 18:13 +0000, Paul Moore wrote: > PS Is it possible to use py.test apart from the rest of the py > library? I'm not interested in the XML stuff, or execnet, and while > greenlets sound cool, I don't want to package them up with my testing > framework... py.test uses itself at least execnet (for distributing tests, running on different python interpreters via --exec=XXX) and the py.path implementations and soon also the 100-line xml-stuff in order to do html-reporting. So i am afraid there is not much use in trying to separate py.test out. cheers, holger From p.f.moore at gmail.com Fri Jan 28 19:54:41 2005 From: p.f.moore at gmail.com (Paul Moore) Date: Fri, 28 Jan 2005 18:54:41 +0000 Subject: [py-dev] Strange failure with py.test - too much magic? In-Reply-To: <20050128182355.GW467@solar.trillke.net> References: <79990c6b05012810135595ed62@mail.gmail.com> <41FA80F1.5020708@colorstudy.com> <20050128182355.GW467@solar.trillke.net> Message-ID: <79990c6b050128105411a7e17a@mail.gmail.com> On Fri, 28 Jan 2005 19:23:55 +0100, holger krekel wrote: > On Fri, Jan 28, 2005 at 12:14 -0600, Ian Bicking wrote: > > Paul Moore wrote: > > >I thought I'd give py.test a try. So I grabbed the code from the > > >Subversion repository and wrote a little test file, which just did an > > >assert 1=0. I don't expect the test to succeed :-) > > > > This is invalid syntax, you meant: > > > > assert 1 == 0 Doh. Braindead at the end of a long day, but did I have to let the whole world know??? Thanks for being kind enough to explain gently :-) > > Though it's not the best error message that you got. 
> > Yes, that is indeed true. SyntaxErrors and ImportErrors of > test files are totally utterly ugly at the moment. In some mitigation for my stupidity, I would say that although I saw "SyntaxError" in the traceback, it didn't point me at the error in my code, and the overall error terrified me :-) I suppose it's one of those "who tests the tests" sort of questions... Regards, Paul. From p.f.moore at gmail.com Fri Jan 28 20:28:30 2005 From: p.f.moore at gmail.com (Paul Moore) Date: Fri, 28 Jan 2005 19:28:30 +0000 Subject: [py-dev] Strange failure with py.test - too much magic? In-Reply-To: <20050128182852.GX467@solar.trillke.net> References: <79990c6b05012810135595ed62@mail.gmail.com> <20050128182852.GX467@solar.trillke.net> Message-ID: <79990c6b050128112864630b78@mail.gmail.com> On Fri, 28 Jan 2005 19:28:52 +0100, holger krekel wrote: > On Fri, Jan 28, 2005 at 18:13 +0000, Paul Moore wrote: > > PS Is it possible to use py.test apart from the rest of the py > > library? I'm not interested in the XML stuff, or execnet, and while > > greenlets sound cool, I don't want to package them up with my testing > > framework... > > py.test uses itself at least execnet (for distributing tests, running > on different python interpreters via --exec=XXX) and the py.path > implementations and soon also the 100-line xml-stuff in order to do > html-reporting. So i am afraid there is not much use in > trying to separate py.test out. Ah. I've got the wrong impression of py.test. Because its principle is simplicity of use, I took that as implying a certain simplicity of implementation, whereas in actual fact (my apologies if I misrepresent things) it is doing some fairly sophisticated things at the implementation level, to allow things to "just happen" as the user expects. It's an entirely fair tradeoff, but probably not what I'm after. Too much "magic" for me, I suspect, as well as including a lot of things (distributing tests, html reporting) that I don't need/want. 
I don't want to dismiss py on the basis of superficial impressions, but at least my *new* superficial impression stands a better chance of matching reality than my previous one :-) I'll go back to playing, and look at things with a more realistic view. Thanks for taking the time to explain. Paul. From pobrien at orbtech.com Fri Jan 28 20:45:04 2005 From: pobrien at orbtech.com (Patrick K. O'Brien) Date: Fri, 28 Jan 2005 13:45:04 -0600 Subject: [py-dev] Strange failure with py.test - too much magic? In-Reply-To: <79990c6b050128112864630b78@mail.gmail.com> References: <79990c6b05012810135595ed62@mail.gmail.com> <20050128182852.GX467@solar.trillke.net> <79990c6b050128112864630b78@mail.gmail.com> Message-ID: <41FA9640.4090407@orbtech.com> Paul Moore wrote: > Ah. I've got the wrong impression of py.test. Because its principle is > simplicity of use, I took that as implying a certain simplicity of > implementation, whereas in actual fact (my apologies if I misrepresent > things) it is doing some fairly sophisticated things at the > implementation level, to allow things to "just happen" as the user > expects. > > It's an entirely fair tradeoff, but probably not what I'm after. Too > much "magic" for me, I suspect, as well as including a lot of things > (distributing tests, html reporting) that I don't need/want. As someone who is usually opposed to "magic" in Python code, I thought I'd at least point out that I find the magic in py.test to be quite welcome and necessary. In fact, we've recently decided to make the entire py package an integral part of Schevo, which is itself acquiring a fair amount of "magical" qualities. Suffice it to say that I don't think it is a trivial decision to add a core dependency like that. But we had enough confidence in py and the team maintaining it to give it a shot. So maybe one needs to assess whether the magic is "good magic" or "bad magic". 
In my mind, "bad magic" is code that could have accomplished the same results using straightforward Python code. Likewise, "good magic" gets the job done in spite of the fact that the Python language doesn't provide a straightforward solution. IMO, the py package is very much "good magic". Just my two cents worth. -- Patrick K. O'Brien Orbtech http://www.orbtech.com Schevo http://www.schevo.org Pypersyst http://www.pypersyst.org From hpk at trillke.net Fri Jan 28 23:08:56 2005 From: hpk at trillke.net (holger krekel) Date: Fri, 28 Jan 2005 23:08:56 +0100 Subject: [py-dev] Strange failure with py.test - too much magic? In-Reply-To: <79990c6b050128112864630b78@mail.gmail.com> References: <79990c6b05012810135595ed62@mail.gmail.com> <20050128182852.GX467@solar.trillke.net> <79990c6b050128112864630b78@mail.gmail.com> Message-ID: <20050128220856.GY467@solar.trillke.net> On Fri, Jan 28, 2005 at 19:28 +0000, Paul Moore wrote: > On Fri, 28 Jan 2005 19:28:52 +0100, holger krekel wrote: > > On Fri, Jan 28, 2005 at 18:13 +0000, Paul Moore wrote: > > > PS Is it possible to use py.test apart from the rest of the py > > > library? I'm not interested in the XML stuff, or execnet, and while > > > greenlets sound cool, I don't want to package them up with my testing > > > framework... > > > > py.test uses itself at least execnet (for distributing tests, running > > on different python interpreters via --exec=XXX) and the py.path > > implementations and soon also the 100-line xml-stuff in order to do > > html-reporting. So i am afraid there is not much use in > > trying to separate py.test out. > > Ah. I've got the wrong impression of py.test. Because its principle is > simplicity of use, I took that as implying a certain simplicity of > implementation, whereas in actual fact (my apologies if I misrepresent > things) it is doing some fairly sophisticated things at the > implementation level, to allow things to "just happen" as the user > expects. That is a fair description. 
Quite arguably the py lib tries to provide more capabilities than are strictly necessary for py.test. For example, the py.code.Traceback/ExceptionInfo/Code/Frame classes thinly wrap their raw python counterpart objects. They offer somewhat higher-level introspection facilities and thus make the life of the test reporter (who has to display tracebacks etc.) easier and cleaner. At the same time, those classes offer e.g. a much nicer interface to implement against for PyPy, which now completely integrates with py.test and uses it to run tests against the current PyPy interpreter (without requiring any changes on py.test's part of course, including new command line options etc.). > It's an entirely fair tradeoff, but probably not what I'm after. Too > much "magic" for me, I suspect, as well as including a lot of things > (distributing tests, html reporting) that I don't need/want. Fair enough. In some ways the py lib tries to be supplemental to the std lib, which offers some hundred modules, many of which both of us probably do not use, although they may be used indirectly :-) > I don't want to dismiss py on the basis of superficial impressions, > but at least my *new* superficial impression stands a better chance of > matching reality than my previous one :-) Feel free to ask any questions and complain about ugliness :-) > I'll go back to playing, and look at things with a more realistic > view. Thanks for taking the time to explain. I'd be interested to hear if you can manage to just use py.test without having to worry much about the rest of the py lib.
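[Editor's note: the "thin wrapper over raw traceback objects" idea described above can be illustrated with a small stand-alone sketch. The class and method names below are made up for illustration; they are not the actual py.code API.]

```python
import sys

class ExceptionInfo:
    """Illustrative wrapper around sys.exc_info(): a thin layer over the
    raw (type, value, traceback) tuple offering higher-level introspection,
    roughly in the spirit described in the thread (not py.code itself)."""

    def __init__(self):
        self.type, self.value, self.tb = sys.exc_info()

    @property
    def typename(self):
        return self.type.__name__

    def frames(self):
        """Yield (filename, lineno, funcname) for each traceback entry."""
        tb = self.tb
        while tb is not None:
            code = tb.tb_frame.f_code
            yield (code.co_filename, tb.tb_lineno, code.co_name)
            tb = tb.tb_next

def failing():
    raise ValueError("boom")

try:
    failing()
except ValueError:
    info = ExceptionInfo()
```

A reporter can then iterate over `info.frames()` and format the traceback however it likes, which is exactly the convenience the raw interpreter objects do not give directly.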
holger From grig at gheorghiu.net Fri Jan 28 23:27:39 2005 From: grig at gheorghiu.net (Grig Gheorghiu) Date: Fri, 28 Jan 2005 14:27:39 -0800 (PST) Subject: [py-dev] we are about to get press: In-Reply-To: <200501281501.j0SF1uIb003013@ratthing-b246.strakt.com> Message-ID: <20050128222739.88628.qmail@web54503.mail.yahoo.com> Here goes: http://agiletesting.blogspot.com/2005/01/python-unit-testing-part-3-pytest-tool.html Feedback and comments are appreciated. Grig --- Laura Creighton wrote: > http://agiletesting.blogspot.com/2005/01/python-unit-testing-part-1-unittest.html > > which got a pointer in the daily python url. > > woo woo woo! > > laura > _______________________________________________ > py-dev mailing list > py-dev at codespeak.net > http://codespeak.net/mailman/listinfo/py-dev > From mscott at goldenspud.com Fri Jan 28 23:30:43 2005 From: mscott at goldenspud.com (Matthew Scott) Date: Fri, 28 Jan 2005 16:30:43 -0600 Subject: [py-dev] Strange failure with py.test - too much magic? In-Reply-To: <20050128220856.GY467@solar.trillke.net> References: <79990c6b05012810135595ed62@mail.gmail.com> <20050128182852.GX467@solar.trillke.net> <79990c6b050128112864630b78@mail.gmail.com> <20050128220856.GY467@solar.trillke.net> Message-ID: <41FABD13.8090709@goldenspud.com> holger krekel wrote: > On Fri, Jan 28, 2005 at 19:28 +0000, Paul Moore wrote: >>I'll go back to playing, and look at things with a more realistic >>view. Thanks for taking the time to explain. > > > I'd be interested to hear if you can manage to just use py.test > without having to worry much about the rest of the py lib. Newcomers to py.test -- be careful when you delve into the rest of the py lib. Despite its reputation for being "bleeding edge" or having an "unstable" API, once you start peeking under the hood and trying things out, you might just get hooked :) One thing I'd like to see more of, though, is docstrings for public methods.
It took me a little while to get a handle on py.path.local when I was writing py.path.embed. So maybe one of these days I'll work on that and submit a patch that cleans up existing docstrings and adds new ones where there aren't any. - Matthew From hpk at trillke.net Fri Jan 28 23:56:19 2005 From: hpk at trillke.net (holger krekel) Date: Fri, 28 Jan 2005 23:56:19 +0100 Subject: [py-dev] we are about to get press: In-Reply-To: <20050128222739.88628.qmail@web54503.mail.yahoo.com> References: <200501281501.j0SF1uIb003013@ratthing-b246.strakt.com> <20050128222739.88628.qmail@web54503.mail.yahoo.com> Message-ID: <20050128225619.GZ467@solar.trillke.net> Hi Grig! On Fri, Jan 28, 2005 at 14:27 -0800, Grig Gheorghiu wrote: > Here goes: > > http://agiletesting.blogspot.com/2005/01/python-unit-testing-part-3-pytest-tool.html > > Feedback and comments are appreciated. I just read through your blog entry and enjoyed it. All of it, especially the criticism, seems pretty sensible. Of course, it motivates me to finally get rid of remaining ugly tracebacks. I'd actually like to do it by really improving the collection logic ... Personally, btw, i find the assert statement <-> self.assertEquals difference one of the primary points for using py.test. But reading your blog entry reminded me that we should probably focus a bit more on getting the current features more solid (especially tracebacks). However, integrating doctests is still high on the list ... hum, didn't I say that already lately? :-) Btw, another interesting development may be that we spent some time at the Switzerland PyPy sprint to make CPython's unittest-based regression tests run _unmodified_ when invoked with py.test. It's easier than i thought but i am not sure that i want to offer this "officially" from py.test. Maybe i'll put it into the py/documentation/example directory to show off what you can do from "conftest.py" files. So i am very much looking forward to meeting and discussing things with you at Pycon!
cheers, holger From mscott at goldenspud.com Sat Jan 29 00:13:57 2005 From: mscott at goldenspud.com (Matthew Scott) Date: Fri, 28 Jan 2005 17:13:57 -0600 Subject: [py-dev] py.path.embed patch In-Reply-To: <41F98CD0.5090508@goldenspud.com> References: <41F98CD0.5090508@goldenspud.com> Message-ID: <41FAC735.3020202@goldenspud.com> Matthew Scott wrote: > > Here is the first stab at py.path.embed > > Let me know what you all think. If it passes muster, feel free to merge > it into trunk. > Speaking of passing muster -- this currently fails miserably on Windows. :-( But passes quite nicely on Linux :-) So I'll eventually get it working better on Windows, since that is one of the target platforms I need it to work on. - Matthew From ianb at colorstudy.com Sat Jan 29 00:38:41 2005 From: ianb at colorstudy.com (Ian Bicking) Date: Fri, 28 Jan 2005 17:38:41 -0600 Subject: [py-dev] we are about to get press: In-Reply-To: <20050128225619.GZ467@solar.trillke.net> References: <200501281501.j0SF1uIb003013@ratthing-b246.strakt.com> <20050128222739.88628.qmail@web54503.mail.yahoo.com> <20050128225619.GZ467@solar.trillke.net> Message-ID: <41FACD01.7000506@colorstudy.com> holger krekel wrote: > I just read through your blog entry and enjoyed it. All of it, especially > the criticism seems pretty sensible. Of course, it motiviates me to > finally get rid of remaining ugly tracebacks. I'd actually like to > do it by really improving the collection logic ... I don't know if this is related, but I was thinking recently about tracebacks in other contexts. The Zope exception reporter looks for __traceback_info__ local variables in the frames and uses them to add extra information... I thought it might be possible to use another local variable like that to signal that a frame (or maybe all further frames from where the variable was found) should be left out of the traceback. It seems like a simple way to handle the problem of long tracebacks. 
> Personally, btw, i find the assert statement <-> self.assertEquals > difference one of the primary points for using py.test. But reading your > blog entry reminded me that we should probably focus a bit more on getting > the current features more solid (especially tracebacks). > However, integrating doctests is still high on the list ... hum, > didn't I say that already lately? :-) > > Btw, another interesting development may be that we spent some time > at the Switzerland PyPy sprint to run CPython's unittest-using > regression tests to work _unmodified_ from invoking them with > py.test. It's easier than i thought but i am not sure that > i want to offer this "officially" from py.test. Maybe i put > it into the py/documentation/example directory to show off > what you can do from "conftest.py" files. You mean, py.test can automatically find and run unittest-based tests? That would be great, certainly not something to hide away -- it makes it more viable to adopt py.test incrementally, or as a developer to use it in projects that already use unittest (and where you might not have the authority to change that). Tests written for py.test are still compelling because they are simple -- making it unittest-compatible just means there's less initial investment required to start using py.test. -- Ian Bicking / ianb at colorstudy.com / http://blog.ianbicking.org From hpk at trillke.net Sat Jan 29 00:45:00 2005 From: hpk at trillke.net (holger krekel) Date: Sat, 29 Jan 2005 00:45:00 +0100 Subject: [py-dev] py.path.embed patch In-Reply-To: <41F98CD0.5090508@goldenspud.com> References: <41F98CD0.5090508@goldenspud.com> Message-ID: <20050128234500.GA467@solar.trillke.net> On Thu, Jan 27, 2005 at 18:52 -0600, Matthew Scott wrote: > > Here is the first stab at py.path.embed > > Let me know what you all think. If it passes muster, feel free to merge > it into trunk. Hi Matthew, thanks for the patch, first of all. 
i couldn't yet look more carefully into your patch but i am wondering why you didn't just provide the virtual path based on dictionaries so that you can use py.path.local and py.path.dict (or some such) quite interchangeably? On a side note i believe that py.path.extpy uses some ideas similar to "embed" to represent files/hierarchies from within python modules, offering a filesystem-path like interface. Generally i think the virtualization idea is worthwhile to integrate but i am not sure yet how it best fits in, and it's obviously good to avoid code duplication. cheers, holger From mscott at goldenspud.com Sat Jan 29 00:47:52 2005 From: mscott at goldenspud.com (Matthew Scott) Date: Fri, 28 Jan 2005 17:47:52 -0600 Subject: [py-dev] py.path.embed patch In-Reply-To: <41FAC735.3020202@goldenspud.com> References: <41F98CD0.5090508@goldenspud.com> <41FAC735.3020202@goldenspud.com> Message-ID: <41FACF28.5060609@goldenspud.com> Matthew Scott wrote: > > Speaking of passing muster -- this currently fails miserably on Windows. > :-( But passes quite nicely on Linux :-) > Thankfully, only minor modifications needed to be made to "make it go" on Windows. Attached is a patch to be applied after the patch I posted yesterday. - Matthew -------------- next part -------------- A non-text attachment was scrubbed... Name: embed1-1.patch Type: text/x-patch Size: 1435 bytes Desc: not available URL: From hpk at trillke.net Sat Jan 29 00:57:02 2005 From: hpk at trillke.net (holger krekel) Date: Sat, 29 Jan 2005 00:57:02 +0100 Subject: [py-dev] we are about to get press: In-Reply-To: <41FACD01.7000506@colorstudy.com> References: <200501281501.j0SF1uIb003013@ratthing-b246.strakt.com> <20050128222739.88628.qmail@web54503.mail.yahoo.com> <20050128225619.GZ467@solar.trillke.net> <41FACD01.7000506@colorstudy.com> Message-ID: <20050128235702.GB467@solar.trillke.net> On Fri, Jan 28, 2005 at 17:38 -0600, Ian Bicking wrote: > holger krekel wrote: > >I just read through your blog entry and enjoyed it.
All of it, especially > >the criticism seems pretty sensible. Of course, it motivates me to > >finally get rid of remaining ugly tracebacks. I'd actually like to > >do it by really improving the collection logic ... > > I don't know if this is related, but I was thinking recently about > tracebacks in other contexts. The Zope exception reporter looks for > __traceback_info__ local variables in the frames and uses them to add > extra information... I thought it might be possible to use another local > variable like that to signal that a frame (or maybe all further frames > from where the variable was found) should be left out of the traceback. > It seems like a simple way to handle the problem of long tracebacks. Hehe, actually there is already support for that with py.test: you can say __tracebackhide__ = True in a function and a py.test traceback will leave out that frame. You can also modify this value during the run of the function, of course. This is nice for proxy or decorating functions that want to hide themselves when they dispatch to their underlying object. In a custom import hook, for example, you can set __tracebackhide__ initially to False and when you dispatch to cpython's __import__ you set it to True, because showing your hook becomes uninteresting. It may be worthwhile to extend this idea to be able to have more general interactions at traceback-show time. With something like:

    __traceback__ = """
    def repr(frame):
        # text representation for this frame
    """

This would only add a minimal runtime penalty, and the speed at traceback-show time (when we need to compile the above) is irrelevant. This would allow for rather custom tracebacks. > >Btw, another interesting development may be that we spent some time > >at the Switzerland PyPy sprint to run CPython's unittest-using > >regression tests to work _unmodified_ from invoking them with > >py.test. It's easier than i thought but i am not sure that > >i want to offer this "officially" from py.test. Maybe i put > >it into the py/documentation/example directory to show off > >what you can do from "conftest.py" files. > > You mean, py.test can automatically find and run unittest-based tests? > That would be great, certainly not something to hide away -- it makes it > more viable to adopt py.test incrementally, or as a developer to use it > in projects that already use unittest (and where you might not have the > authority to change that). Tests written for py.test are still > compelling because they are simple -- making it unittest-compatible just > means there's less initial investment required to start using py.test. Yes, i agree. But I am afraid that there may be more and more demand to support more and more of unittest's uglinesses :-) cheers, holger From mscott at goldenspud.com Sat Jan 29 01:07:35 2005 From: mscott at goldenspud.com (Matthew Scott) Date: Fri, 28 Jan 2005 18:07:35 -0600 Subject: [py-dev] py.path.embed patch In-Reply-To: <20050128234500.GA467@solar.trillke.net> References: <41F98CD0.5090508@goldenspud.com> <20050128234500.GA467@solar.trillke.net> Message-ID: <41FAD3C7.30508@goldenspud.com> holger krekel wrote: > > thanks for the patch, first of all. i couldn't yet look more > carefully into your patch but i am wondering why you didn't > just provide the virtual path based on dictionaries so that > you can use py.path.local and py.path.dict (or some such) > quite interchangeably? > Well, I had originally considered that, but I realized that what I mainly wanted was a way to have a self-contained "sandbox" of files. That sandbox could either be based on a local path, or embedded into a Python module (or even a pickle, or whatever) as a dictionary. Then, the code using those sandboxes would initialize them based on what was available (either a local path or a dictionary) and after that, not care whether the files were on disk or in a dictionary.
The use of a "context" also allows virtual chdir() operations to occur. That means you can do the following:

>>> import py
>>> ctx = py.path.embed.virtualContext()
>>> path = py.path.embed(context=ctx)
>>> print repr(path)
embed('/', virtual(406cddfc))
>>> subdir = path.join('foo').ensure(dir=True)
>>> subdir.chdir()
>>> path2 = py.path.embed(context=ctx)
>>> print repr(path2)
embed('/foo', virtual(406cddfc))

> On a side note i believe that py.path.extpy uses some "embed" similar ideas to represent files/hierarchies from within python modules offering a filesystem-path like interface. Generally i think the virtualization idea is worthwhile to integrate but i am not sure yet how it bests fits in and it's obviously good to avoid code duplication.

Hmm... Perhaps the "local context" in my code could be refactored into some sort of "sandbox" path class that can use *any* path object as its "context" or "virtual root". That would satisfy my desire to take a py.path.local object pointing to a directory, then use that as the root of a sandbox of files that I want to manage. I may look at py.path.extpy and see how it works. I think it might make sense in this case to use a simpler dictionary-based virtual filesystem. But then we have the conundrum that I solved with the concept of contexts in py.path.embed - what is the "root" of that virtual filesystem? Some sort of top-level object would still be needed to allow navigation through parents of virtual paths. Well those are my ramblings for now. :) I may charge forward with using py.path.embed but I think its actual use will be simple enough that if we decide to use some other techniques to accomplish the same thing, it won't be a problem at all to refactor. And speaking of paths, contexts, and virtual filesystems... Schevo has some support for ZIP files for loading icons. py.path.zip??
:) - Matthew From hpk at trillke.net Sat Jan 29 01:43:39 2005 From: hpk at trillke.net (holger krekel) Date: Sat, 29 Jan 2005 01:43:39 +0100 Subject: [py-dev] py.path.embed patch In-Reply-To: <41FAD3C7.30508@goldenspud.com> References: <41F98CD0.5090508@goldenspud.com> <20050128234500.GA467@solar.trillke.net> <41FAD3C7.30508@goldenspud.com> Message-ID: <20050129004339.GC467@solar.trillke.net> On Fri, Jan 28, 2005 at 18:07 -0600, Matthew Scott wrote: > ... > The use of a "context" also allows virtual chdir() operations to occur. > That means you can do the following:
>
> >>> import py
> >>> ctx = py.path.embed.virtualContext()
> >>> path = py.path.embed(context=ctx)
> >>> print repr(path)
> embed('/', virtual(406cddfc))
> >>> subdir = path.join('foo').ensure(dir=True)
> >>> subdir.chdir()
> >>> path2 = py.path.embed(context=ctx)
> >>> print repr(path2)
> embed('/foo', virtual(406cddfc))

makes some sense ... > >On a side note i believe that py.path.extpy uses some "embed" similar > >ideas to represent files/hierarchies from within python modules > >offering a filesystem-path like interface. Generally i think > >the virtualization idea is worthwhile to integrate but i am not > >sure yet how it bests fits in and it's obviously good to > >avoid code duplication. > > Hmm... > > Perhaps the "local context" in my code could be refactored into some > sort of "sandbox" path class that can use *any* path object as its > "context" or "virtual root". I wonder why we can not simply provide a virtual path like

    py.path.virtual()          # for a new one
    py.path.virtual(dict=...)  # for instantiating from a dict

and use those quite interchangeably with other paths. I don't like long instantiation incantations and the py.path.virtual() seems to be able to provide the same semantics by creating a context under the hood that gets passed on when creating subpaths. So chdir() could still be supported.
> That would satisfy my desire to take a py.path.local object pointing to > a directory, then use that as the root of a sandbox of files that I want > to manage. I guess we would need some kind of transfer-operations between the path objects. > I may look at py.path.extpy and see how it works. I think it might make > sense in this case to use a simpler dictionary-based virtual filesystem. i think i agree but py.path.extpy() could at least work on top of a virtual path :-) > But then we have the conundrum that I solved with the concept of > contexts in py.path.embed - what is the "root" of that virtual > filesystem? Some sort of top-level object would still be needed to > allow navigation through parents of virtual paths. yip. > And speaking of paths, contexts, and virtual filesystems... Schevo has > some support for ZIP files for loading icons. py.path.zip?? :) definitely, py.path.zip and py.path.tar (which may both work from path objects i think) are very worthwhile targets. Btw, generally it's interesting to make it easy to implement new filesystem-path objects by simply providing a few operations like

    listdir()
    join()
    dirpath()
    read(), write() (and possibly open() ...)
    size()
    mtime()

and then getting the full interface for free. This more or less works already (by inheriting from common.FSPathBase) but the more-or-lessness needs to be determined :-) To mention another direction i have been discussing with Armin: there is the problem of being able to determine the common set of operations between path objects (or objects in general). Instead of creating new names (for IIIIInterfances :-) we thought that it may be worth a try to define an intersection operation between e.g. classes: you would get back a proxy object where you can toss in one of the intersecting classes but you would only see the operations that are supported by all classes.
For example:

    commonpath = py.code.intersect(py.path.local, py.path.virtual)

and then

    commonpath.__init__local('/tmp')
    commonpath.__init__virtual(dict=....)

would both return an object that supports exactly the same methods (those that both implementations support). Again, the goal here is to avoid adding any name complexity and still get stability in using objects/classes. This is basically an ad hoc interface-on-the-fly approach and seems more flexible than creating static and limited interface names that you have to teach to people ... cheers, holger From mscott at goldenspud.com Sat Jan 29 01:58:21 2005 From: mscott at goldenspud.com (Matthew Scott) Date: Fri, 28 Jan 2005 18:58:21 -0600 Subject: [py-dev] py.path.embed patch In-Reply-To: <20050129004339.GC467@solar.trillke.net> References: <41F98CD0.5090508@goldenspud.com> <20050128234500.GA467@solar.trillke.net> <41FAD3C7.30508@goldenspud.com> <20050129004339.GC467@solar.trillke.net> Message-ID: <41FADFAD.2040601@goldenspud.com> holger krekel wrote: >>Hmm... >> >>Perhaps the "local context" in my code could be refactored into some >>sort of "sandbox" path class that can use *any* path object as its >>"context" or "virtual root". > > I wonder why we can not simply provide a virtual path like > > py.path.virtual() # for a new one > py.path.virtual(dict=...) # for instantiating from a dict > > and use those quite interchangeably with other paths. I don't > like long instantiation incantations and the py.path.virtual() > seems to be able to provide the same semantics by creating > a context under the hood that gets passed on when creating > subpaths. So chdir() could still be supported. I quite like short instantiation incantations myself as well. The syntax you propose seems very reasonable. If you wanted to get the root path of the virtual path context though (or for any path object I suppose), what would you use? Would something like a rootpath() method (similar to dirpath()) be appropriate?
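[Editor's note: `py.code.intersect` was, as far as this thread shows, only a proposal. A rough prototype of "a proxy exposing only the operations all classes support" might look like the sketch below; all names here are hypothetical, and the two toy path classes exist only for the example.]

```python
def common_methods(*classes):
    """Return the set of public method names implemented by every class."""
    sets = []
    for cls in classes:
        names = {n for n in dir(cls)
                 if not n.startswith("_") and callable(getattr(cls, n))}
        sets.append(names)
    return set.intersection(*sets)

class Intersection:
    """Wrap an instance of one of the intersecting classes, exposing only
    the methods common to all of them (an ad hoc interface-on-the-fly)."""
    def __init__(self, obj, allowed):
        self._obj = obj
        self._allowed = allowed
    def __getattr__(self, name):
        if name in self._allowed:
            return getattr(self._obj, name)
        raise AttributeError("%r is not in the common interface" % name)

def intersect(*classes):
    """Hypothetical stand-in for the proposed py.code.intersect."""
    allowed = common_methods(*classes)
    def make(obj):
        return Intersection(obj, allowed)
    return make

# Two toy path-like classes for demonstration:
class DiskPath:
    def listdir(self): return ["a", "b"]
    def read(self): return "disk"
    def chmod(self, mode): pass   # only DiskPath has this

class DictPath:
    def listdir(self): return []
    def read(self): return "dict"

make = intersect(DiskPath, DictPath)
p = make(DiskPath())
p.read()     # allowed: both classes implement it
# p.chmod(0o755) would raise AttributeError: not common to both classes
```

The attraction of this design, as holger notes, is that no new interface names need to be invented or taught: the common surface is computed from the classes themselves.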
> I guess we would need some kind of transfer-operations between > the path objects. What about "source.copy(target=dest)" ? :) Or are you referring to something else? > would both return an object that support exactly the same methods > (that both implementations support). Again, the goal here is to > avoid adding any name complexity and still get stability in using > objects/classes. This is basically an ad hoc interface-on-the-fly > approach and seems more flexible than creating static and limited > interface names that you have to teach to people ... Sounds interesting. I'll re-read this snippet again tomorrow to grok it further, and will look forward to your implementation *grin* - Matthew From mscott at goldenspud.com Sat Jan 29 06:04:43 2005 From: mscott at goldenspud.com (Matthew Scott) Date: Fri, 28 Jan 2005 23:04:43 -0600 Subject: [py-dev] py.path.dict and py.path.proxy patch Message-ID: <41FB196B.4040400@goldenspud.com> Round 2! The attached patch provides the new classes py.path.dict and py.path.proxy, as well as py.__impl__.path.virtual.virtual.Path which is the basis for them. This is an attempt at taking into account some of the suggestions that holger made regarding generalization of the py.path.embed patch I posted in the last couple of days. This patch supersedes the embed1.patch and embed1-1.patch files, so don't use those anymore. What py.path.proxy allows you to do is to take an arbitrary path and create a proxy filesystem into a limited area of that path's filesystem. Maybe a better word would be sandbox, but I found proxy to be easier to type. Naming suggestions are welcome. This code is set in fresh mud, not in stone. :) I haven't yet tried py.path.proxy using anything but py.path.local root paths. 
Here's an example of using py.path.proxy to wrap a directory specified by py.path.local, and of using py.path.dict to create a dictionary-based virtual file system that contains a copy of that directory:

I started IPython in a directory underneath my home directory that contains pictures of a model of laptop that I owned years ago, the Toshiba T1200XE. I created a 'here' variable to contain that path.

>>> import py
>>> here = py.path.local()
>>> here
--- local('/home/gldnspud/personal/www/crap/t1200xe')

Listing the directory, you can see that it has three files inside it:

>>> here.listdir()
--- [local('/home/gldnspud/personal/www/crap/t1200xe/1.jpg'),
local('/home/gldnspud/personal/www/crap/t1200xe/2.jpg'),
local('/home/gldnspud/personal/www/crap/t1200xe/3.jpg')]

Now for the fun part. I created a 'prox' variable which is a py.path.proxy whose root is the 'here' path.

>>> prox = py.path.proxy(root=here)
>>> prox
--- proxypath('/', local('/home/gldnspud/personal/www/crap/t1200xe'))

If you list its directory, it obviously contains the same files:

>>> prox.listdir()
--- [proxypath('/1.jpg', local('/home/gldnspud/personal/www/crap/t1200xe')),
proxypath('/2.jpg', local('/home/gldnspud/personal/www/crap/t1200xe')),
proxypath('/3.jpg', local('/home/gldnspud/personal/www/crap/t1200xe'))]

If you traverse to parent directories with 'here', it goes up and up and up.

>>> here.join('..', '..')
--- local('/home/gldnspud/personal/www')

However, if you attempt the same thing using 'prox', you run into the boundary that was defined when prox's virtual filesystem was created.

>>> prox.join('..', '..')
--- proxypath('/', local('/home/gldnspud/personal/www/crap/t1200xe'))

Now I'll create a 'virt' variable which will contain a py.path.dict starting from an empty dictionary.

>>> virt = py.path.dict()
>>> virt
--- dictpath('/', )
>>> virt.listdir()
--- []

I'll populate it with the contents of 'prox'.
>>> prox.copy(target=virt)
>>> virt.listdir()
--- [dictpath('/2.jpg', ), dictpath('/3.jpg', ), dictpath('/1.jpg', )]

What's fun (but maybe pointless other than to show that when it isn't pointless to do so, it can be done) is creating a proxy for a dict path, and then a proxy for that proxy.

>>> virt.join('a', 'b', 'c').ensure(dir=True)
--- dictpath('/a/b/c', )
>>> virt.listdir()
--- [dictpath('/2.jpg', ), dictpath('/a', ), dictpath('/3.jpg', ), dictpath('/1.jpg', )]
>>> prox2 = py.path.proxy(root=virt.join('a'))
>>> prox2.listdir()
--- [proxypath('/b', dictpath('/a', ))]
>>> prox3 = py.path.proxy(root=prox2.join('b'))
>>> prox3.listdir()
--- [proxypath('/c', proxypath('/b', dictpath('/a', )))]

Removing the "c" directory in prox3 of course will result in a/b/c being removed from virt.

>>> prox3.join('c').remove()
>>> virt.join('a', 'b').listdir()
--- []

Let's delete the jpg files from virt and create a small text file, and see what the underlying dictionary representation of virt is. (Hint: a 'c' key stands for contents. A directory's contents is a dictionary, and a file's contents is a string).

>>> for path in virt.listdir(fil='*.jpg'): path.remove()
...
>>> virt.listdir()
--- [dictpath('/a', )]
>>> virt.join('test.txt').ensure().write('hello world')
>>> virt.join('test.txt').read()
--- 'hello world'
>>> virt.fs.dict
--- {'c': {'a': {'c': {'b': {'c': {}}}}, 'test.txt': {'c': 'hello world'}}}

Fun stuff! - Matthew -------------- next part -------------- A non-text attachment was scrubbed... Name: virtual1.patch Type: text/x-patch Size: 31082 bytes Desc: not available URL:
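[Editor's note: the nested-dict representation shown at the end of the session above ('c' holding either a directory dict or a file's string contents) is compact enough to sketch independently. The toy class below is not Matthew's actual patch; it just implements a few operations over that same structure to make the convention concrete.]

```python
class DictFS:
    """Toy dictionary-backed filesystem using the {'c': ...} convention
    from the thread: a directory's 'c' is a dict mapping names to nodes,
    and a file's 'c' is its string contents. Illustrative only."""

    def __init__(self):
        self.root = {"c": {}}

    def _walk(self, parts, create=False):
        """Follow a sequence of path components, optionally creating
        intermediate directories along the way."""
        node = self.root
        for part in parts:
            children = node["c"]
            if part not in children:
                if not create:
                    raise KeyError(part)
                children[part] = {"c": {}}
            node = children[part]
        return node

    def ensure_dir(self, *parts):
        self._walk(parts, create=True)

    def write(self, path_parts, data):
        *dirs, name = path_parts
        parent = self._walk(dirs, create=True)
        parent["c"][name] = {"c": data}

    def read(self, path_parts):
        return self._walk(path_parts)["c"]

    def listdir(self, *parts):
        return sorted(self._walk(parts)["c"])

fs = DictFS()
fs.ensure_dir("a", "b")
fs.write(["test.txt"], "hello world")
fs.read(["test.txt"])   # 'hello world'
fs.listdir()            # ['a', 'test.txt']
```

Because the entire state is one plain dictionary, such a filesystem can be embedded in a Python module or pickled wholesale, which is exactly the portability Matthew was after with py.path.embed.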