From holger at merlinux.eu Mon Oct 3 05:36:53 2011 From: holger at merlinux.eu (holger krekel) Date: Mon, 3 Oct 2011 03:36:53 +0000 Subject: [py-dev] plugin scoping or control In-Reply-To: References: Message-ID: <20111003033653.GC1684@merlinux.eu> Hi Jason, On Thu, Sep 29, 2011 at 14:39 -0700, Jason R. Coombs wrote: > I've run into an issue with pytest_assertrepr_compare - I want the hook to > apply to only a subset of tests (tests/unit/**), so I defined > tests/unit/conftest.py and defined my pytest_assertrepr_compare there, but > still tests under tests/other/** get that behavior (when I run py.test on > tests/). > > > > Is there a way to cause the hook to be loaded only for tests/unit/** (or > some other subset of tests)? Alternatively, is there a way to detect from > within the pytest_assertrepr_compare function which module/function is under > test and select that way? In principle it should be possible to have a conftest pytest_assertrepr_compare apply only to a subdirectory. The basic mechanism exists but this particular hook is implemented in a special way and thus it requires some thinking/hacking i am afraid. Could you open an issue about this? thanks, holger > > > Thanks, > > Jason > > _______________________________________________ > py-dev mailing list > py-dev at codespeak.net > http://codespeak.net/mailman/listinfo/py-dev From jaraco at jaraco.com Mon Oct 3 15:39:06 2011 From: jaraco at jaraco.com (Jason R. Coombs) Date: Mon, 3 Oct 2011 06:39:06 -0700 Subject: [py-dev] plugin scoping or control In-Reply-To: <20111003033653.GC1684@merlinux.eu> References: <20111003033653.GC1684@merlinux.eu> Message-ID: > -----Original Message----- > From: holger krekel [mailto:holger at merlinux.eu] > Sent: Sunday, 02 October, 2011 23:37 > > Could you open an issue about this? I've opened an issue as https://bitbucket.org/hpk42/pytest/issue/77/allow-assertrepr_compare-hook-to -apply. Thanks! 
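[Editor's note] To picture the setup Jason describes: a subdirectory conftest.py carrying the hook would look roughly like the sketch below. This is a hypothetical example (the hook name and signature follow pytest's documented hook spec; the string-comparison logic is invented for illustration), and per holger's reply, directory-local scoping did not actually work for this particular hook at the time.

```python
# Hypothetical tests/unit/conftest.py -- a sketch, not actual code from
# the thread. Returning a list of lines customizes the assertion report;
# returning None falls back to pytest's default output.

def pytest_assertrepr_compare(config, op, left, right):
    if op == "==" and isinstance(left, str) and isinstance(right, str):
        return [
            "comparing strings:",
            "  left:  %r" % left,
            "  right: %r" % right,
        ]
    return None  # let pytest produce its default comparison report
```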
From holger at merlinux.eu Tue Oct 18 21:11:27 2011 From: holger at merlinux.eu (holger krekel) Date: Tue, 18 Oct 2011 19:11:27 +0000 Subject: [py-dev] pytest-2.1.3: just some fixes Message-ID: <20111018191127.GP27936@merlinux.eu> py.test 2.1.3: just some fixes =========================================================================== pytest-2.1.3 is a minor backward-compatible maintenance release of the popular py.test testing tool. It is commonly used for unit, functional and integration testing. See the extensive docs with examples here: http://pytest.org/ The release contains a fix to the perfected assertions introduced with the 2.1 series, as well as the new possibility to customize the detailed reporting for assertion expressions on a per-directory level. If you want to install or upgrade pytest, just type one of:: pip install -U pytest # or easy_install -U pytest Thanks to the bug reporters and to Ronny Pfannschmidt, Benjamin Peterson and Floris Bruynooghe who implemented the fixes. best, holger krekel Changes between 2.1.2 and 2.1.3 ---------------------------------------- - fix issue79: assertion rewriting failed on some comparisons in boolops - correctly handle zero-length arguments (a la pytest '') - fix issue67 / junitxml now contains correct test durations - fix issue75 / skipping test failure on jython - fix issue77 / allow assertrepr_compare hook to apply to a subset of tests From pere.martir4 at gmail.com Wed Oct 26 02:17:04 2011 From: pere.martir4 at gmail.com (Pere Martir) Date: Wed, 26 Oct 2011 02:17:04 +0200 Subject: [py-dev] Status of py.test's unit tests now ? Message-ID: Hello. I am new to this mailing list. I've just cloned the source to my Mac OS X 10.6.8 machine and ran the unit tests. I found that a lot of them failed (attached: result.txt). Is this normal? 
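[Editor's note] The feature request Pere goes on to mention (issue 56, reporting exceptions in threads) stems from a general CPython behavior: an exception raised in a worker thread only gets printed to stderr and never propagates to the thread that started it, so a runner watching only the main thread misses the failure. A minimal stdlib-only illustration follows; the `record_errors` wrapper is an invented helper, not pytest code.

```python
# An uncaught exception in a thread does not fail the program; it is
# merely printed to stderr. Capturing it explicitly makes it visible
# to the main thread.
import threading

errors = []  # exceptions collected from worker threads

def record_errors(fn):
    # Invented helper: wrap a thread target so its exception is stored
    # instead of disappearing inside the thread machinery.
    def wrapper(*args, **kwargs):
        try:
            fn(*args, **kwargs)
        except Exception as exc:
            errors.append(exc)
    return wrapper

@record_errors
def worker():
    raise ValueError("boom in a thread")

t = threading.Thread(target=worker)
t.start()
t.join()
assert isinstance(errors[0], ValueError)  # the failure is now observable
```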
I'd like to fix an issue: https://bitbucket.org/hpk42/pytest/issue/56/feature-report-exceptions-in-threads-as But I am not sure whether I should proceed while these tests are already failing. -------------- next part -------------- ============================= test session starts ============================== platform darwin -- Python 2.6.1 -- pytest-2.1.3 collecting ... collected 678 items doc/example/assertion/test_failures.py . doc/example/assertion/test_setup_flow_example.py .. doc/example/assertion/global_testmodule_config/test_hello.py . doc/example/costlysetup/sub1/test_quick.py . doc/example/costlysetup/sub2/test_two.py .. testing/acceptance_test.py ....................................x. testing/test_assertinterpret.py .......................... testing/test_assertion.py ........................ testing/test_assertrewrite.py ...................... testing/test_capture.py ............x................. testing/test_collection.py ...........x.....FFF..FF...F.... testing/test_config.py ........x.............. testing/test_conftest.py .......................F..... testing/test_core.py .................................................. testing/test_doctest.py ......... testing/test_genscript.py s..sssss testing/test_helpconfig.py ........ testing/test_junitxml.py .................... testing/test_mark.py ........................ testing/test_monkeypatch.py ......... testing/test_nose.py sssssssss testing/test_parseopt.py ............ testing/test_pastebin.py .. testing/test_pdb.py ....ssssss. testing/test_pytester.py x...... testing/test_python.py ..FEEEEEEEEEEEEEEE....FFFEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEE......FEEEEEFEEEEEFEEEEEFFEEEFEEEEEEEEEEEEEEEEEEEEEEEEE testing/test_recwarn.py EEEEEEEEEEEEEEEE testing/test_resultlog.py F.FEEEEEFEEE testing/test_runner.py EEFEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEExxEEEE....F testing/test_runner_xunit.py .......... 
testing/test_session.py .............FEEE testing/test_skipping.py EEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEXXEEEE testing/test_terminal.py EEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEXX testing/test_tmpdir.py EEEEEEEEEEEEEE testing/test_unittest.py EEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEE From benjamin at python.org Wed Oct 26 02:44:16 2011 From: benjamin at python.org (Benjamin Peterson) Date: Tue, 25 Oct 2011 20:44:16 -0400 Subject: [py-dev] Status of py.test's unit tests now ? In-Reply-To: References: Message-ID: 2011/10/25 Pere Martir : > Hello. I am new to this mailing list. > > I've just cloned the source to my Mac OS X 10.6,8 and ran the unit > tests. I found that a lot of them failed (attached: result.txt). Is it > normal ? Perhaps you could share what the details of the failures are. -- Regards, Benjamin From holger at merlinux.eu Wed Oct 26 09:07:56 2011 From: holger at merlinux.eu (holger krekel) Date: Wed, 26 Oct 2011 07:07:56 +0000 Subject: [py-dev] Status of py.test's unit tests now ? In-Reply-To: References: Message-ID: <20111026070756.GB27936@merlinux.eu> On Tue, Oct 25, 2011 at 20:44 -0400, Benjamin Peterson wrote: > 2011/10/25 Pere Martir : > > Hello. I am new to this mailing list. > > > > I've just cloned the source to my Mac OS X 10.6,8 and ran the unit > > tests. I found that a lot of them failed (attached: result.txt). Is it > > normal ? > > Perhaps you could share what the details of the failures are. in addition you might also install "tox" and run "tox -e py26" or "tox -e py27" and show the output. thanks, holger From holger at merlinux.eu Thu Oct 27 13:12:42 2011 From: holger at merlinux.eu (holger krekel) Date: Thu, 27 Oct 2011 11:12:42 +0000 Subject: [py-dev] Status of py.test's unit tests now ? 
In-Reply-To: References: <20111026070756.GB27936@merlinux.eu> Message-ID: <20111027111242.GC27936@merlinux.eu> On Thu, Oct 27, 2011 at 00:31 +0200, Pere Martir wrote: > On Wed, Oct 26, 2011 at 9:07 AM, holger krekel wrote: > > On Tue, Oct 25, 2011 at 20:44 -0400, Benjamin Peterson wrote: > >> Perhaps you could share what the details of the failures are. > > in addition you might also install "tox" and run "tox -e py26" or "tox -e py27" and show the output. > > "tox -e py26" blocks forever (for hours). I attached the output at the > end of this post (tox_py26_output.txt). > > I also attached the output of the command "python pytest.py -v" > (pytest_v_output.txt). As you can see, there are a lot of ERRORs. > > My environment: Mac OS X 10.6.8, Python 2.6.1 Thanks, Pere, for your efforts. I guess some more testing work on OS X is needed (which is unfortunately not part of the CI setup - anyone able to offer an OS X build host?). However, one issue sticks out - could you meanwhile try changing the last line of _pytest/tmpdir.py to "return x" instead of "return x.realpath()" and see if this helps with some of the currently failing tests? 
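[Editor's note] For background on why that one-line change might matter: on Mac OS X the system temp directories live under /var, which is a symlink to /private/var, so a realpath()-resolved tmpdir no longer compares equal to the unresolved path held elsewhere in the code (exactly the local('/private/var/...') != local('/var/...') mismatches in the failures below). A small stdlib-only sketch of the same effect using a throwaway symlink, since not every machine has the OS X layout:

```python
# Show how realpath() makes an otherwise-identical path compare unequal,
# analogous to /var -> /private/var on Mac OS X.
import os
import tempfile

base = os.path.realpath(tempfile.mkdtemp())  # pre-resolve the base itself
real = os.path.join(base, "private_var")     # stands in for /private/var
link = os.path.join(base, "var")             # stands in for /var
os.mkdir(real)
os.symlink(real, link)

sub = os.path.join(link, "sub")
resolved = os.path.realpath(sub)
assert resolved == os.path.join(real, "sub")  # the symlink got resolved
assert resolved != sub                        # so naive path equality fails
```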
best, holger > [TOX] ***copying new sdistfile to '/Users/tzuchien/.tox/distshare/pytest-2.1.3.zip' > __________________________________________________ [tox testenv:py26] __________________________________________________ > [TOX] ***creating virtualenv py26 > [TOX] /Users/tzuchien/pytest/.tox$ /System/Library/Frameworks/Python.framework/Versions/2.6/bin/python2.6 ../../../../Library/Python/2.6/site-packages/tox/virtualenv.py --distribute --no-site-packages py26 >py26/log/0.log > [TOX] ***installing dependencies: :pypi:pexpect, :pypi:nose > [TOX] /Users/tzuchien/pytest/.tox/py26/log$ ../bin/pip install -i http://pypi.python.org/simple --download-cache=/Users/tzuchien/pytest/.tox/_download pexpect nose >1.log > [TOX] ***installing sdist > [TOX] /Users/tzuchien/pytest/.tox/py26/log$ ../bin/pip install -i http://pypi.testrun.org --download-cache=/Users/tzuchien/pytest/.tox/_download ../../dist/pytest-2.1.3.zip >2.log > [TOX] /Users/tzuchien/pytest/testing$ ../.tox/py26/bin/py.test -rfsxX --junitxml=/Users/tzuchien/pytest/.tox/py26/log/junit-py26.xml > ================================================= test session starts ================================================== > platform darwin -- Python 2.6.1 -- pytest-2.1.3 > collected 671 items > > acceptance_test.py ....................................x. > test_assertinterpret.py .......................... > test_assertion.py ........................ > test_assertrewrite.py ...................... > test_capture.py ............x................. > test_collection.py ...........x.....FFF..FF...F.... > test_config.py ........x.............. > test_conftest.py .......................F..... > test_core.py .................................................. > test_doctest.py ......... > test_genscript.py s..sssss > test_helpconfig.py ........ > test_junitxml.py .................... > test_mark.py ........................ > test_monkeypatch.py ......... > test_nose.py ......... > test_parseopt.py ............ 
> test_pastebin.py .. > test_pdb.py ....FFFF^CKEYBOARDINTERRUPT > > > ======================================================= FAILURES ======================================================= > ______________________________________________ TestSession.test_parsearg _______________________________________________ > > self = > testdir = > > def test_parsearg(self, testdir): > p = testdir.makepyfile("def test_func(): pass") > subdir = testdir.mkdir("sub") > subdir.ensure("__init__.py") > target = subdir.join(p.basename) > p.move(target) > testdir.chdir() > subdir.chdir() > config = testdir.parseconfig(p.basename) > rcol = Session(config=config) > > assert rcol.fspath == subdir > E assert local('/private/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_parsearg0/test_parsearg/sub') == local('/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_parsearg0/test_parsearg/sub') > E + where local('/private/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_parsearg0/test_parsearg/sub') = .fspath > > /Users/tzuchien/pytest/testing/test_collection.py:303: AssertionError > ___________________________________________ TestSession.test_collect_topdir ____________________________________________ > > self = > testdir = > > def test_collect_topdir(self, testdir): > p = testdir.makepyfile("def test_func(): pass") > id = "::".join([p.basename, "test_func"]) > config = testdir.parseconfigure(id) > topdir = testdir.tmpdir > rcol = Session(config) > > assert topdir == rcol.fspath > E assert local('/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_collect_topdir0/test_collect_topdir') == local('/private/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_collect_topdir0/test_collect_topdir') > E + where local('/private/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_collect_topdir0/test_collect_topdir') = .fspath > > 
/Users/tzuchien/pytest/testing/test_collection.py:319: AssertionError > __________________________________ TestSession.test_collect_protocol_single_function ___________________________________ > > self = > testdir = > > def test_collect_protocol_single_function(self, testdir): > p = testdir.makepyfile("def test_func(): pass") > id = "::".join([p.basename, "test_func"]) > config = testdir.parseconfigure(id) > topdir = testdir.tmpdir > rcol = Session(config) > > assert topdir == rcol.fspath > E assert local('/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_collect_protocol_single_function0/test_collect_protocol_single_function') == local('/private/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_collect_protocol_single_function0/test_collect_protocol_single_function') > E + where local('/private/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_collect_protocol_single_function0/test_collect_protocol_single_function') = .fspath > > /Users/tzuchien/pytest/testing/test_collection.py:334: AssertionError > ____________________________________ TestSession.test_collect_subdir_event_ordering ____________________________________ > > self = > testdir = > > def test_collect_subdir_event_ordering(self, testdir): > p = testdir.makepyfile("def test_func(): pass") > aaa = testdir.mkpydir("aaa") > test_aaa = aaa.join("test_aaa.py") > p.move(test_aaa) > config = testdir.parseconfigure() > rcol = Session(config) > hookrec = testdir.getreportrecorder(config) > rcol.perform_collect() > items = rcol.items > assert len(items) == 1 > py.std.pprint.pprint(hookrec.hookrecorder.calls) > hookrec.hookrecorder.contains([ > ("pytest_collectstart", "collector.fspath == test_aaa"), > ("pytest_pycollect_makeitem", "name == 'test_func'"), > ("pytest_collectreport", > > "report.nodeid.startswith('aaa/test_aaa.py')"), > E Failed: could not find 'pytest_collectreport' check "report.nodeid.startswith('aaa/test_aaa.py')" > > 
/Users/tzuchien/pytest/testing/test_collection.py:427: Failed > --------------------------------------------------- Captured stdout ---------------------------------------------------- > collected 1 items > [})>, > , > })>, > })>, > })>, > })>, > })>, > })>, > })>, > })>, > })>, > })>, > })>, > })>, > })>, > })>, > , 'obj': , 'name': 'test_func'})>, > })>, > , 'obj': , 'name': '@pytest_ar'})>, > , 'obj': , 'name': '@py_builtins'})>, > })>, > })>, > ], 'session': , 'config': <_pytest.config.Config object at 0x10473b210>})>, > })>] > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NAMEMATCH pytest_collectstart })> > NOCHECKERMATCH 'collector.fspath == test_aaa' - })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NAMEMATCH pytest_collectstart })> > CHECKERMATCH 'collector.fspath == test_aaa' -> })> > NONAMEMATCH pytest_pycollect_makeitem with })> > NAMEMATCH pytest_pycollect_makeitem , 'obj': , 'name': 'test_func'})> > CHECKERMATCH "name == 'test_func'" -> , 'obj': , 'name': 'test_func'})> > NONAMEMATCH pytest_collectreport with })> > NONAMEMATCH pytest_collectreport with , 'obj': , 'name': '@pytest_ar'})> > NONAMEMATCH pytest_collectreport with , 'obj': , 'name': '@py_builtins'})> > NONAMEMATCH pytest_collectreport with })> > NAMEMATCH pytest_collectreport })> > NOCHECKERMATCH "report.nodeid.startswith('aaa/test_aaa.py')" - })> > NONAMEMATCH pytest_collectreport with ], 'session': , 'config': <_pytest.config.Config object at 0x10473b210>})> > NONAMEMATCH pytest_collectreport with })> > 
____________________________________ TestSession.test_collect_two_commandline_args _____________________________________ > > self = > testdir = > > def test_collect_two_commandline_args(self, testdir): > p = testdir.makepyfile("def test_func(): pass") > aaa = testdir.mkpydir("aaa") > bbb = testdir.mkpydir("bbb") > test_aaa = aaa.join("test_aaa.py") > p.copy(test_aaa) > test_bbb = bbb.join("test_bbb.py") > p.move(test_bbb) > > id = "." > config = testdir.parseconfigure(id) > rcol = Session(config) > hookrec = testdir.getreportrecorder(config) > rcol.perform_collect() > items = rcol.items > assert len(items) == 2 > py.std.pprint.pprint(hookrec.hookrecorder.calls) > hookrec.hookrecorder.contains([ > ("pytest_collectstart", "collector.fspath == test_aaa"), > ("pytest_pycollect_makeitem", "name == 'test_func'"), > ("pytest_collectreport", "report.nodeid == 'aaa/test_aaa.py'"), > ("pytest_collectstart", "collector.fspath == test_bbb"), > ("pytest_pycollect_makeitem", "name == 'test_func'"), > > ("pytest_collectreport", "report.nodeid == 'bbb/test_bbb.py'"), > E Failed: could not find 'pytest_collectstart' check 'collector.fspath == test_aaa' > > /Users/tzuchien/pytest/testing/test_collection.py:453: Failed > --------------------------------------------------- Captured stdout ---------------------------------------------------- > collected 2 items > [})>, > , > })>, > })>, > })>, > })>, > })>, > })>, > })>, > })>, > })>, > })>, > })>, > })>, > })>, > })>, > })>, > })>, > })>, > })>, > })>, > })>, > })>, > , 'obj': , 'name': 'test_func'})>, > })>, > , 'obj': , 'name': '@pytest_ar'})>, > , 'obj': , 'name': '@py_builtins'})>, > })>, > })>, > })>, > })>, > , 'obj': , 'name': 'test_func'})>, > })>, > , 'obj': , 'name': '@pytest_ar'})>, > , 'obj': , 'name': '@py_builtins'})>, > })>, > })>, > , ], 'session': , 'config': <_pytest.config.Config object at 0x1045a7710>})>, > })>] > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with > NONAMEMATCH 
pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NAMEMATCH pytest_collectstart })> > NOCHECKERMATCH 'collector.fspath == test_aaa' - })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NAMEMATCH pytest_collectstart })> > NOCHECKERMATCH 'collector.fspath == test_aaa' - })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with , 'obj': , 'name': 'test_func'})> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with , 'obj': , 'name': '@pytest_ar'})> > NONAMEMATCH pytest_collectstart with , 'obj': , 'name': '@py_builtins'})> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NAMEMATCH pytest_collectstart })> > NOCHECKERMATCH 'collector.fspath == test_aaa' - })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with , 'obj': , 'name': 'test_func'})> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with , 'obj': , 'name': '@pytest_ar'})> > NONAMEMATCH pytest_collectstart with , 'obj': , 'name': '@py_builtins'})> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with })> > NONAMEMATCH pytest_collectstart with , ], 'session': , 'config': <_pytest.config.Config object at 0x1045a7710>})> > NONAMEMATCH pytest_collectstart with })> > 
__________________________________________ Test_getinitialnodes.test_pkgfile ___________________________________________ > > self = > testdir = > > def test_pkgfile(self, testdir): > testdir.chdir() > tmpdir = testdir.tmpdir > subdir = tmpdir.join("subdir") > x = subdir.ensure("x.py") > subdir.ensure("__init__.py") > config = testdir.reparseconfig([x]) > col = testdir.getnode(config, x) > assert isinstance(col, pytest.Module) > > assert col.name == 'subdir/x.py' > E assert 'x.py' == 'subdir/x.py' > E - x.py > E + subdir/x.py > > /Users/tzuchien/pytest/testing/test_collection.py:508: AssertionError > --------------------------------------------------- Captured stdout ---------------------------------------------------- > ============================= test session starts ============================== > platform darwin -- Python 2.6.1 -- pytest-2.1.3 > collected 0 items > > =============================== in 0.00 seconds =============================== > _______________________________________________ test_setinitial_confcut ________________________________________________ > > testdir = > > def test_setinitial_confcut(testdir): > conf = testdir.makeconftest("") > sub = testdir.mkdir("sub") > sub.chdir() > for opts in (["--confcutdir=%s" % sub, sub], > [sub, "--confcutdir=%s" % sub], > ["--confcutdir=.", sub], > [sub, "--confcutdir", sub], > [str(sub), "--confcutdir", "."], > ): > conftest = Conftest() > conftest.setinitial(opts) > > assert conftest._confcutdir == sub > E assert local('/private/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_setinitial_confcut0/test_setinitial_confcut/sub') == local('/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_setinitial_confcut0/test_setinitial_confcut/sub') > E + where local('/private/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_setinitial_confcut0/test_setinitial_confcut/sub') = <_pytest.config.Conftest object at 0x1065754d0>._confcutdir > > 
/Users/tzuchien/pytest/testing/test_conftest.py:169: AssertionError > _____________________________________________ TestPDB.test_pdb_interaction _____________________________________________ > > self = > testdir = > > def test_pdb_interaction(self, testdir): > p1 = testdir.makepyfile(""" > def test_1(): > i = 0 > assert i == 1 > """) > child = testdir.spawn_pytest("--pdb %s" % p1) > child.expect(".*def test_1") > child.expect(".*i = 0") > child.expect("(Pdb)") > child.sendeof() > > rest = child.read() > > /Users/tzuchien/pytest/testing/test_pdb.py:63: > _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > > self = , size = -1 > > def read (self, size = -1): # File-like object. > > """This reads at most "size" bytes from the file (less if the read hits > EOF before obtaining size bytes). If the size argument is negative or > omitted, read all data until EOF is reached. The bytes are returned as > a string object. An empty string is returned when EOF is encountered > immediately. """ > > if size == 0: > return '' > if size < 0: > > self.expect (self.delimiter) # delimiter default is EOF > > /Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:863: > _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > > self = , pattern = , timeout = -1, searchwindowsize = -1 > > def expect(self, pattern, timeout = -1, searchwindowsize=-1): > > """This seeks through the stream until a pattern is matched. The > pattern is overloaded and may take several types. The pattern can be a > StringType, EOF, a compiled re, or a list of any of those types. > Strings will be compiled to re types. This returns the index into the > pattern list. If the pattern was not a list this returns index 0 on a > successful match. This may raise exceptions for EOF or TIMEOUT. 
To > avoid the EOF or TIMEOUT exceptions add EOF or TIMEOUT to the pattern > list. That will cause expect to match an EOF or TIMEOUT condition > instead of raising an exception. > > If you pass a list of patterns and more than one matches, the first match > in the stream is chosen. If more than one pattern matches at that point, > the leftmost in the pattern list is chosen. For example:: > > # the input is 'foobar' > index = p.expect (['bar', 'foo', 'foobar']) > # returns 1 ('foo') even though 'foobar' is a "better" match > > Please note, however, that buffering can affect this behavior, since > input arrives in unpredictable chunks. For example:: > > # the input is 'foobar' > index = p.expect (['foobar', 'foo']) > # returns 0 ('foobar') if all input is available at once, > # but returs 1 ('foo') if parts of the final 'bar' arrive late > > After a match is found the instance attributes 'before', 'after' and > 'match' will be set. You can see all the data read before the match in > 'before'. You can see the data that was matched in 'after'. The > re.MatchObject used in the re match will be in 'match'. If an error > occurred then 'before' will be set to all the data read so far and > 'after' and 'match' will be None. > > If timeout is -1 then timeout will be set to the self.timeout value. > > A list entry may be EOF or TIMEOUT instead of a string. This will > catch these exceptions and return the index of the list entry instead > of raising the exception. The attribute 'after' will be set to the > exception type. The attribute 'match' will be None. 
This allows you to > write code like this:: > > index = p.expect (['good', 'bad', pexpect.EOF, pexpect.TIMEOUT]) > if index == 0: > do_something() > elif index == 1: > do_something_else() > elif index == 2: > do_some_other_thing() > elif index == 3: > do_something_completely_different() > > instead of code like this:: > > try: > index = p.expect (['good', 'bad']) > if index == 0: > do_something() > elif index == 1: > do_something_else() > except EOF: > do_some_other_thing() > except TIMEOUT: > do_something_completely_different() > > These two forms are equivalent. It all depends on what you want. You > can also just expect the EOF if you are waiting for all output of a > child to finish. For example:: > > p = pexpect.spawn('/bin/ls') > p.expect (pexpect.EOF) > print p.before > > If you are trying to optimize for speed then see expect_list(). > """ > > compiled_pattern_list = self.compile_pattern_list(pattern) > > return self.expect_list(compiled_pattern_list, timeout, searchwindowsize) > > /Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1316: > _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > > self = , pattern_list = [], timeout = -1 > searchwindowsize = -1 > > def expect_list(self, pattern_list, timeout = -1, searchwindowsize = -1): > > """This takes a list of compiled regular expressions and returns the > index into the pattern_list that matched the child output. The list may > also contain EOF or TIMEOUT (which are not compiled regular > expressions). This method is similar to the expect() method except that > expect_list() does not recompile the pattern list on every call. This > may help if you are trying to optimize for speed, otherwise just use > the expect() method. This is called by expect(). If timeout==-1 then > the self.timeout value is used. If searchwindowsize==-1 then the > self.searchwindowsize value is used. 
""" > > > return self.expect_loop(searcher_re(pattern_list), timeout, searchwindowsize) > > /Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1330: > _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > > self = , searcher = , timeout = 10.0 > searchwindowsize = None > > def expect_loop(self, searcher, timeout = -1, searchwindowsize = -1): > > """This is the common loop used inside expect. The 'searcher' should be > an instance of searcher_re or searcher_string, which describes how and what > to search for in the input. > > See expect() for other arguments, return value and exceptions. """ > > self.searcher = searcher > > if timeout == -1: > timeout = self.timeout > if timeout is not None: > end_time = time.time() + timeout > if searchwindowsize == -1: > searchwindowsize = self.searchwindowsize > > try: > incoming = self.buffer > freshlen = len(incoming) > while True: # Keep reading until exception or return. 
> index = searcher.search(incoming, freshlen, searchwindowsize) > if index >= 0: > self.buffer = incoming[searcher.end : ] > self.before = incoming[ : searcher.start] > self.after = incoming[searcher.start : searcher.end] > self.match = searcher.match > self.match_index = index > return self.match_index > # No match at this point > if timeout < 0 and timeout is not None: > raise TIMEOUT ('Timeout exceeded in expect_any().') > # Still have time left, so read more data > c = self.read_nonblocking (self.maxread, timeout) > freshlen = len(c) > time.sleep (0.0001) > incoming = incoming + c > if timeout is not None: > timeout = end_time - time.time() > except EOF, e: > self.buffer = '' > self.before = incoming > self.after = EOF > index = searcher.eof_index > if index >= 0: > self.match = EOF > self.match_index = index > return self.match_index > else: > self.match = None > self.match_index = None > raise EOF (str(e) + '\n' + str(self)) > except TIMEOUT, e: > self.buffer = incoming > self.before = incoming > self.after = TIMEOUT > index = searcher.timeout_index > if index >= 0: > self.match = TIMEOUT > self.match_index = index > return self.match_index > else: > self.match = None > self.match_index = None > > raise TIMEOUT (str(e) + '\n' + str(self)) > E TIMEOUT: Timeout exceeded in read_nonblocking(). 
> E > E version: 2.4 ($Revision: 516 $) > E command: /Users/tzuchien/pytest/.tox/py26/bin/python2.6 > E args: ['/Users/tzuchien/pytest/.tox/py26/bin/python2.6', '/Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pytest.py', '--basetemp=/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_pdb_interaction0/test_pdb_interaction/pexpect', '--pdb', '/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_pdb_interaction0/test_pdb_interaction/test_pdb_interaction.py'] > E searcher: searcher_re: > E 0: EOF > E buffer (last 100 chars): ) > E before (last 100 chars): ) > E after: > E match: None > E match_index: None > E exitstatus: None > E flag_eof: False > E pid: 31071 > E child_fd: 53 > E closed: False > E timeout: 10.0 > E delimiter: > E logfile: > E logfile_read: None > E logfile_send: None > E maxread: 2000 > E ignorecase: False > E searchwindowsize: None > E delaybeforesend: 0.05 > E delayafterclose: 0.1 > E delayafterterminate: 0.1 > > /Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1414: TIMEOUT > ________________________________________ TestPDB.test_pdb_interaction_exception ________________________________________ > > self = > testdir = > > def test_pdb_interaction_exception(self, testdir): > p1 = testdir.makepyfile(""" > import pytest > def globalfunc(): > pass > def test_1(): > pytest.raises(ValueError, globalfunc) > """) > child = testdir.spawn_pytest("--pdb %s" % p1) > child.expect(".*def test_1") > child.expect(".*pytest.raises.*globalfunc") > child.expect("(Pdb)") > child.sendline("globalfunc") > child.expect(".*function") > child.sendeof() > > child.expect("1 failed") > > /Users/tzuchien/pytest/testing/test_pdb.py:84: > _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > > self = , pattern = '1 failed', timeout = -1, searchwindowsize = -1 > > def expect(self, pattern, timeout = -1, searchwindowsize=-1): > > """This 
seeks through the stream until a pattern is matched. The > pattern is overloaded and may take several types. The pattern can be a > StringType, EOF, a compiled re, or a list of any of those types. > Strings will be compiled to re types. This returns the index into the > pattern list. If the pattern was not a list this returns index 0 on a > successful match. This may raise exceptions for EOF or TIMEOUT. To > avoid the EOF or TIMEOUT exceptions add EOF or TIMEOUT to the pattern > list. That will cause expect to match an EOF or TIMEOUT condition > instead of raising an exception. > > If you pass a list of patterns and more than one matches, the first match > in the stream is chosen. If more than one pattern matches at that point, > the leftmost in the pattern list is chosen. For example:: > > # the input is 'foobar' > index = p.expect (['bar', 'foo', 'foobar']) > # returns 1 ('foo') even though 'foobar' is a "better" match > > Please note, however, that buffering can affect this behavior, since > input arrives in unpredictable chunks. For example:: > > # the input is 'foobar' > index = p.expect (['foobar', 'foo']) > # returns 0 ('foobar') if all input is available at once, > # but returs 1 ('foo') if parts of the final 'bar' arrive late > > After a match is found the instance attributes 'before', 'after' and > 'match' will be set. You can see all the data read before the match in > 'before'. You can see the data that was matched in 'after'. The > re.MatchObject used in the re match will be in 'match'. If an error > occurred then 'before' will be set to all the data read so far and > 'after' and 'match' will be None. > > If timeout is -1 then timeout will be set to the self.timeout value. > > A list entry may be EOF or TIMEOUT instead of a string. This will > catch these exceptions and return the index of the list entry instead > of raising the exception. The attribute 'after' will be set to the > exception type. The attribute 'match' will be None. 
This allows you to > write code like this:: > > index = p.expect (['good', 'bad', pexpect.EOF, pexpect.TIMEOUT]) > if index == 0: > do_something() > elif index == 1: > do_something_else() > elif index == 2: > do_some_other_thing() > elif index == 3: > do_something_completely_different() > > instead of code like this:: > > try: > index = p.expect (['good', 'bad']) > if index == 0: > do_something() > elif index == 1: > do_something_else() > except EOF: > do_some_other_thing() > except TIMEOUT: > do_something_completely_different() > > These two forms are equivalent. It all depends on what you want. You > can also just expect the EOF if you are waiting for all output of a > child to finish. For example:: > > p = pexpect.spawn('/bin/ls') > p.expect (pexpect.EOF) > print p.before > > If you are trying to optimize for speed then see expect_list(). > """ > > compiled_pattern_list = self.compile_pattern_list(pattern) > > return self.expect_list(compiled_pattern_list, timeout, searchwindowsize) > > /Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1316: > _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > > self = , pattern_list = [<_sre.SRE_Pattern object at 0x106e05378>], timeout = -1 > searchwindowsize = -1 > > def expect_list(self, pattern_list, timeout = -1, searchwindowsize = -1): > > """This takes a list of compiled regular expressions and returns the > index into the pattern_list that matched the child output. The list may > also contain EOF or TIMEOUT (which are not compiled regular > expressions). This method is similar to the expect() method except that > expect_list() does not recompile the pattern list on every call. This > may help if you are trying to optimize for speed, otherwise just use > the expect() method. This is called by expect(). If timeout==-1 then > the self.timeout value is used. If searchwindowsize==-1 then the > self.searchwindowsize value is used. 
""" > > > return self.expect_loop(searcher_re(pattern_list), timeout, searchwindowsize) > > /Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1330: > _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > > self = , searcher = > timeout = 9.9997730255126953, searchwindowsize = None > > def expect_loop(self, searcher, timeout = -1, searchwindowsize = -1): > > """This is the common loop used inside expect. The 'searcher' should be > an instance of searcher_re or searcher_string, which describes how and what > to search for in the input. > > See expect() for other arguments, return value and exceptions. """ > > self.searcher = searcher > > if timeout == -1: > timeout = self.timeout > if timeout is not None: > end_time = time.time() + timeout > if searchwindowsize == -1: > searchwindowsize = self.searchwindowsize > > try: > incoming = self.buffer > freshlen = len(incoming) > while True: # Keep reading until exception or return. 
> [...]
> E TIMEOUT: Timeout exceeded in read_nonblocking(). 
> E > E version: 2.4 ($Revision: 516 $) > E command: /Users/tzuchien/pytest/.tox/py26/bin/python2.6 > E args: ['/Users/tzuchien/pytest/.tox/py26/bin/python2.6', '/Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pytest.py', '--basetemp=/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_pdb_interaction_exception0/test_pdb_interaction_exception/pexpect', '--pdb', '/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_pdb_interaction_exception0/test_pdb_interaction_exception/test_pdb_interaction_exception.py'] > E searcher: searcher_re: > E 0: re.compile("1 failed") > E buffer (last 100 chars): globalfunc at 0x102354c08> > E (Pdb) > E before (last 100 chars): globalfunc at 0x102354c08> > E (Pdb) > E after: > E match: None > E match_index: None > E exitstatus: None > E flag_eof: False > E pid: 31072 > E child_fd: 150 > E closed: False > E timeout: 10.0 > E delimiter: > E logfile: > E logfile_read: None > E logfile_send: None > E maxread: 2000 > E ignorecase: False > E searchwindowsize: None > E delaybeforesend: 0.05 > E delayafterclose: 0.1 > E delayafterterminate: 0.1 > > /Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1414: TIMEOUT > ____________________________________ TestPDB.test_pdb_interaction_capturing_simple _____________________________________ > > self = > testdir = > > def test_pdb_interaction_capturing_simple(self, testdir): > p1 = testdir.makepyfile(""" > import pytest > def test_1(): > i = 0 > print ("hello17") > pytest.set_trace() > x = 3 > """) > child = testdir.spawn_pytest(str(p1)) > child.expect("test_1") > child.expect("x = 3") > child.expect("(Pdb)") > child.sendeof() > > rest = child.read() > > /Users/tzuchien/pytest/testing/test_pdb.py:102: > _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > > self = , size = -1 > > def read (self, size = -1): # File-like object. 
> > """This reads at most "size" bytes from the file (less if the read hits > EOF before obtaining size bytes). If the size argument is negative or > omitted, read all data until EOF is reached. The bytes are returned as > a string object. An empty string is returned when EOF is encountered > immediately. """ > > if size == 0: > return '' > if size < 0: > > self.expect (self.delimiter) # delimiter default is EOF > > /Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:863: > _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > > self = , pattern = , timeout = -1, searchwindowsize = -1 > > def expect(self, pattern, timeout = -1, searchwindowsize=-1): > > """This seeks through the stream until a pattern is matched. The > pattern is overloaded and may take several types. The pattern can be a > StringType, EOF, a compiled re, or a list of any of those types. > Strings will be compiled to re types. This returns the index into the > pattern list. If the pattern was not a list this returns index 0 on a > successful match. This may raise exceptions for EOF or TIMEOUT. To > avoid the EOF or TIMEOUT exceptions add EOF or TIMEOUT to the pattern > list. That will cause expect to match an EOF or TIMEOUT condition > instead of raising an exception. > > If you pass a list of patterns and more than one matches, the first match > in the stream is chosen. If more than one pattern matches at that point, > the leftmost in the pattern list is chosen. For example:: > > # the input is 'foobar' > index = p.expect (['bar', 'foo', 'foobar']) > # returns 1 ('foo') even though 'foobar' is a "better" match > > Please note, however, that buffering can affect this behavior, since > input arrives in unpredictable chunks. 
For example:: > > # the input is 'foobar' > index = p.expect (['foobar', 'foo']) > # returns 0 ('foobar') if all input is available at once, > # but returs 1 ('foo') if parts of the final 'bar' arrive late > > After a match is found the instance attributes 'before', 'after' and > 'match' will be set. You can see all the data read before the match in > 'before'. You can see the data that was matched in 'after'. The > re.MatchObject used in the re match will be in 'match'. If an error > occurred then 'before' will be set to all the data read so far and > 'after' and 'match' will be None. > > If timeout is -1 then timeout will be set to the self.timeout value. > > A list entry may be EOF or TIMEOUT instead of a string. This will > catch these exceptions and return the index of the list entry instead > of raising the exception. The attribute 'after' will be set to the > exception type. The attribute 'match' will be None. This allows you to > write code like this:: > > index = p.expect (['good', 'bad', pexpect.EOF, pexpect.TIMEOUT]) > if index == 0: > do_something() > elif index == 1: > do_something_else() > elif index == 2: > do_some_other_thing() > elif index == 3: > do_something_completely_different() > > instead of code like this:: > > try: > index = p.expect (['good', 'bad']) > if index == 0: > do_something() > elif index == 1: > do_something_else() > except EOF: > do_some_other_thing() > except TIMEOUT: > do_something_completely_different() > > These two forms are equivalent. It all depends on what you want. You > can also just expect the EOF if you are waiting for all output of a > child to finish. For example:: > > p = pexpect.spawn('/bin/ls') > p.expect (pexpect.EOF) > print p.before > > If you are trying to optimize for speed then see expect_list(). 
> """ > > compiled_pattern_list = self.compile_pattern_list(pattern) > > return self.expect_list(compiled_pattern_list, timeout, searchwindowsize) > > /Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1316: > _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > > self = , pattern_list = [], timeout = -1 > searchwindowsize = -1 > > def expect_list(self, pattern_list, timeout = -1, searchwindowsize = -1): > > """This takes a list of compiled regular expressions and returns the > index into the pattern_list that matched the child output. The list may > also contain EOF or TIMEOUT (which are not compiled regular > expressions). This method is similar to the expect() method except that > expect_list() does not recompile the pattern list on every call. This > may help if you are trying to optimize for speed, otherwise just use > the expect() method. This is called by expect(). If timeout==-1 then > the self.timeout value is used. If searchwindowsize==-1 then the > self.searchwindowsize value is used. """ > > > return self.expect_loop(searcher_re(pattern_list), timeout, searchwindowsize) > > /Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1330: > _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > > self = , searcher = , timeout = 10.0 > searchwindowsize = None > > def expect_loop(self, searcher, timeout = -1, searchwindowsize = -1): > > """This is the common loop used inside expect. The 'searcher' should be > an instance of searcher_re or searcher_string, which describes how and what > to search for in the input. > > See expect() for other arguments, return value and exceptions. 
""" > > self.searcher = searcher > > if timeout == -1: > timeout = self.timeout > if timeout is not None: > end_time = time.time() + timeout > if searchwindowsize == -1: > searchwindowsize = self.searchwindowsize > > try: > incoming = self.buffer > freshlen = len(incoming) > while True: # Keep reading until exception or return. > index = searcher.search(incoming, freshlen, searchwindowsize) > if index >= 0: > self.buffer = incoming[searcher.end : ] > self.before = incoming[ : searcher.start] > self.after = incoming[searcher.start : searcher.end] > self.match = searcher.match > self.match_index = index > return self.match_index > # No match at this point > if timeout < 0 and timeout is not None: > raise TIMEOUT ('Timeout exceeded in expect_any().') > # Still have time left, so read more data > c = self.read_nonblocking (self.maxread, timeout) > freshlen = len(c) > time.sleep (0.0001) > incoming = incoming + c > if timeout is not None: > timeout = end_time - time.time() > except EOF, e: > self.buffer = '' > self.before = incoming > self.after = EOF > index = searcher.eof_index > if index >= 0: > self.match = EOF > self.match_index = index > return self.match_index > else: > self.match = None > self.match_index = None > raise EOF (str(e) + '\n' + str(self)) > except TIMEOUT, e: > self.buffer = incoming > self.before = incoming > self.after = TIMEOUT > index = searcher.timeout_index > if index >= 0: > self.match = TIMEOUT > self.match_index = index > return self.match_index > else: > self.match = None > self.match_index = None > > raise TIMEOUT (str(e) + '\n' + str(self)) > E TIMEOUT: Timeout exceeded in read_nonblocking(). 
> E > E version: 2.4 ($Revision: 516 $) > E command: /Users/tzuchien/pytest/.tox/py26/bin/python2.6 > E args: ['/Users/tzuchien/pytest/.tox/py26/bin/python2.6', '/Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pytest.py', '--basetemp=/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_pdb_interaction_capturing_simple0/test_pdb_interaction_capturing_simple/pexpect', '/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_pdb_interaction_capturing_simple0/test_pdb_interaction_capturing_simple/test_pdb_interaction_capturing_simple.py'] > E searcher: searcher_re: > E 0: EOF > E buffer (last 100 chars): ) > E before (last 100 chars): ) > E after: > E match: None > E match_index: None > E exitstatus: None > E flag_eof: False > E pid: 31073 > E child_fd: 194 > E closed: False > E timeout: 10.0 > E delimiter: > E logfile: > E logfile_read: None > E logfile_send: None > E maxread: 2000 > E ignorecase: False > E searchwindowsize: None > E delaybeforesend: 0.05 > E delayafterclose: 0.1 > E delayafterterminate: 0.1 > > /Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1414: TIMEOUT > _____________________________________ TestPDB.test_pdb_interaction_capturing_twice _____________________________________ > > self = > testdir = > > def test_pdb_interaction_capturing_twice(self, testdir): > p1 = testdir.makepyfile(""" > import pytest > def test_1(): > i = 0 > print ("hello17") > pytest.set_trace() > x = 3 > print ("hello18") > pytest.set_trace() > x = 4 > """) > child = testdir.spawn_pytest(str(p1)) > child.expect("test_1") > child.expect("x = 3") > child.expect("(Pdb)") > child.sendline('c') > child.expect("x = 4") > child.sendeof() > > rest = child.read() > > /Users/tzuchien/pytest/testing/test_pdb.py:128: > _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > > self = , size = -1 > > def read (self, size = -1): # File-like object. 
> 
> [...]
> """ > > compiled_pattern_list = self.compile_pattern_list(pattern) > > return self.expect_list(compiled_pattern_list, timeout, searchwindowsize) > > /Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1316: > _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > > self = , pattern_list = [], timeout = -1 > searchwindowsize = -1 > > def expect_list(self, pattern_list, timeout = -1, searchwindowsize = -1): > > """This takes a list of compiled regular expressions and returns the > index into the pattern_list that matched the child output. The list may > also contain EOF or TIMEOUT (which are not compiled regular > expressions). This method is similar to the expect() method except that > expect_list() does not recompile the pattern list on every call. This > may help if you are trying to optimize for speed, otherwise just use > the expect() method. This is called by expect(). If timeout==-1 then > the self.timeout value is used. If searchwindowsize==-1 then the > self.searchwindowsize value is used. """ > > > return self.expect_loop(searcher_re(pattern_list), timeout, searchwindowsize) > > /Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1330: > _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > > self = , searcher = > timeout = 9.9995019435882568, searchwindowsize = None > > def expect_loop(self, searcher, timeout = -1, searchwindowsize = -1): > > """This is the common loop used inside expect. The 'searcher' should be > an instance of searcher_re or searcher_string, which describes how and what > to search for in the input. > > See expect() for other arguments, return value and exceptions. 
""" > > self.searcher = searcher > > if timeout == -1: > timeout = self.timeout > if timeout is not None: > end_time = time.time() + timeout > if searchwindowsize == -1: > searchwindowsize = self.searchwindowsize > > try: > incoming = self.buffer > freshlen = len(incoming) > while True: # Keep reading until exception or return. > index = searcher.search(incoming, freshlen, searchwindowsize) > if index >= 0: > self.buffer = incoming[searcher.end : ] > self.before = incoming[ : searcher.start] > self.after = incoming[searcher.start : searcher.end] > self.match = searcher.match > self.match_index = index > return self.match_index > # No match at this point > if timeout < 0 and timeout is not None: > raise TIMEOUT ('Timeout exceeded in expect_any().') > # Still have time left, so read more data > c = self.read_nonblocking (self.maxread, timeout) > freshlen = len(c) > time.sleep (0.0001) > incoming = incoming + c > if timeout is not None: > timeout = end_time - time.time() > except EOF, e: > self.buffer = '' > self.before = incoming > self.after = EOF > index = searcher.eof_index > if index >= 0: > self.match = EOF > self.match_index = index > return self.match_index > else: > self.match = None > self.match_index = None > raise EOF (str(e) + '\n' + str(self)) > except TIMEOUT, e: > self.buffer = incoming > self.before = incoming > self.after = TIMEOUT > index = searcher.timeout_index > if index >= 0: > self.match = TIMEOUT > self.match_index = index > return self.match_index > else: > self.match = None > self.match_index = None > > raise TIMEOUT (str(e) + '\n' + str(self)) > E TIMEOUT: Timeout exceeded in read_nonblocking(). 
> E > E version: 2.4 ($Revision: 516 $) > E command: /Users/tzuchien/pytest/.tox/py26/bin/python2.6 > E args: ['/Users/tzuchien/pytest/.tox/py26/bin/python2.6', '/Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pytest.py', '--basetemp=/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_pdb_interaction_capturing_twice0/test_pdb_interaction_capturing_twice/pexpect', '/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_pdb_interaction_capturing_twice0/test_pdb_interaction_capturing_twice/test_pdb_interaction_capturing_twice.py'] > E searcher: searcher_re: > E 0: EOF > E buffer (last 100 chars): > E (Pdb) > E before (last 100 chars): > E (Pdb) > E after: > E match: None > E match_index: None > E exitstatus: None > E flag_eof: False > E pid: 31086 > E child_fd: 150 > E closed: False > E timeout: 10.0 > E delimiter: > E logfile: > E logfile_read: None > E logfile_send: None > E maxread: 2000 > E ignorecase: False > E searchwindowsize: None > E delaybeforesend: 0.05 > E delayafterclose: 0.1 > E delayafterterminate: 0.1 > > /Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1414: TIMEOUT > ----------------------- generated xml file: /Users/tzuchien/pytest/.tox/py26/log/junit-py26.xml ------------------------ > =============================================== short test summary info ================================================ > FAIL test_collection.py::TestSession::()::test_parsearg > FAIL test_collection.py::TestSession::()::test_collect_topdir > FAIL test_collection.py::TestSession::()::test_collect_protocol_single_function > FAIL test_collection.py::TestSession::()::test_collect_subdir_event_ordering > FAIL test_collection.py::TestSession::()::test_collect_two_commandline_args > FAIL test_collection.py::Test_getinitialnodes::()::test_pkgfile > FAIL test_conftest.py::test_setinitial_confcut > FAIL test_pdb.py::TestPDB::()::test_pdb_interaction > FAIL 
test_pdb.py::TestPDB::()::test_pdb_interaction_exception > FAIL test_pdb.py::TestPDB::()::test_pdb_interaction_capturing_simple > FAIL test_pdb.py::TestPDB::()::test_pdb_interaction_capturing_twice > SKIP [1] /Users/tzuchien/pytest/testing/conftest.py:96: no suitable python2.7 found > SKIP [1] /Users/tzuchien/pytest/testing/conftest.py:96: no suitable jython found > SKIP [1] /Users/tzuchien/pytest/testing/conftest.py:96: no suitable pypy found > SKIP [1] /Users/tzuchien/pytest/testing/conftest.py:96: no suitable python2.4 found > SKIP [1] /Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/_pytest/core.py:120: plugin 'xdist' is missing > SKIP [1] /Users/tzuchien/pytest/testing/conftest.py:96: no suitable python3.1 found > XFAIL acceptance_test.py::TestInvocationVariants::()::test_noclass_discovery_if_not_testcase > decide: feature or bug > XFAIL test_capture.py::TestPerTestCapturing::()::test_capture_scope_cache > XFAIL test_collection.py::TestPrunetraceback::()::test_collect_report_postprocessing > other mechanism for adding to reporting needed > XFAIL test_config.py::TestParseIni::()::test_confcutdir > probably not needed > !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! KeyboardInterrupt !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! 
> /Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1099: KeyboardInterrupt > ============================ 11 failed, 362 passed, 6 skipped, 4 xfailed in 464.77 seconds ============================= > ~/pytest $ > ~/pytest $ python pytest.py -v > =================================================== test session starts ==================================================== > platform darwin -- Python 2.6.1 -- pytest-2.1.3 -- /usr/bin/python > collected 678 items > > doc/example/assertion/test_failures.py:6: test_failure_demo_fails_properly PASSED > doc/example/assertion/test_setup_flow_example.py:14: TestStateFullThing.test_42 PASSED > doc/example/assertion/test_setup_flow_example.py:18: TestStateFullThing.test_23 PASSED > doc/example/assertion/global_testmodule_config/test_hello.py:4: test_func PASSED > doc/example/costlysetup/sub1/test_quick.py:2: test_quick PASSED > doc/example/costlysetup/sub2/test_two.py:1: test_something PASSED > doc/example/costlysetup/sub2/test_two.py:4: test_something_more PASSED > testing/acceptance_test.py:4: TestGeneralUsage.test_config_error PASSED > testing/acceptance_test.py:16: TestGeneralUsage.test_early_hook_error_issue38_1 PASSED > testing/acceptance_test.py:36: TestGeneralUsage.test_early_hook_configure_error_issue38 PASSED > testing/acceptance_test.py:49: TestGeneralUsage.test_file_not_found PASSED > testing/acceptance_test.py:54: TestGeneralUsage.test_config_preparse_plugin_option PASSED > testing/acceptance_test.py:69: TestGeneralUsage.test_assertion_magic PASSED > testing/acceptance_test.py:82: TestGeneralUsage.test_nested_import_error PASSED > testing/acceptance_test.py:96: TestGeneralUsage.test_not_collectable_arguments PASSED > testing/acceptance_test.py:107: TestGeneralUsage.test_early_skip PASSED > testing/acceptance_test.py:120: TestGeneralUsage.test_issue88_initial_file_multinodes PASSED > testing/acceptance_test.py:138: TestGeneralUsage.test_issue93_initialnode_importing_capturing PASSED > 
testing/acceptance_test.py:149: TestGeneralUsage.test_conftest_printing_shows_if_error PASSED
testing/acceptance_test.py:158: TestGeneralUsage.test_chdir PASSED
testing/acceptance_test.py:174: TestGeneralUsage.test_issue109_sibling_conftests_not_loaded PASSED
testing/acceptance_test.py:187: TestGeneralUsage.test_directory_skipped PASSED
testing/acceptance_test.py:200: TestGeneralUsage.test_multiple_items_per_collector_byid PASSED
testing/acceptance_test.py:219: TestGeneralUsage.test_skip_on_generated_funcarg_id PASSED
testing/acceptance_test.py:235: TestGeneralUsage.test_direct_addressing_selects PASSED
testing/acceptance_test.py:247: TestGeneralUsage.test_direct_addressing_notfound PASSED
testing/acceptance_test.py:256: TestGeneralUsage.test_docstring_on_hookspec PASSED
testing/acceptance_test.py:262: TestGeneralUsage.test_initialization_error_issue49 PASSED
testing/acceptance_test.py:276: TestInvocationVariants.test_earlyinit PASSED
testing/acceptance_test.py:284: TestInvocationVariants.test_pydoc PASSED
testing/acceptance_test.py:292: TestInvocationVariants.test_import_star_py_dot_test PASSED
testing/acceptance_test.py:307: TestInvocationVariants.test_import_star_pytest PASSED
testing/acceptance_test.py:319: TestInvocationVariants.test_double_pytestcmdline PASSED
testing/acceptance_test.py:335: TestInvocationVariants.test_python_minus_m_invocation_ok PASSED
testing/acceptance_test.py:341: TestInvocationVariants.test_python_minus_m_invocation_fail PASSED
testing/acceptance_test.py:347: TestInvocationVariants.test_python_pytest_package PASSED
testing/acceptance_test.py:354: TestInvocationVariants.test_equivalence_pytest_pytest PASSED
testing/acceptance_test.py:357: TestInvocationVariants.test_invoke_with_string PASSED
testing/acceptance_test.py:364: TestInvocationVariants.test_invoke_with_path PASSED
testing/acceptance_test.py:369: TestInvocationVariants.test_invoke_plugin_api PASSED
testing/acceptance_test.py:378: TestInvocationVariants.test_pyargs_importerror PASSED
testing/acceptance_test.py:391: TestInvocationVariants.test_cmdline_python_package PASSED
testing/acceptance_test.py:418: TestInvocationVariants.test_cmdline_python_package_not_exists PASSED
testing/acceptance_test.py:425: TestInvocationVariants.test_noclass_discovery_if_not_testcase xfail
testing/acceptance_test.py:439: TestInvocationVariants.test_doctest_id PASSED
testing/test_assertinterpret.py:12: test_not_being_rewritten PASSED
testing/test_assertinterpret.py:15: test_assert PASSED
testing/test_assertinterpret.py:23: test_assert_with_explicit_message PASSED
testing/test_assertinterpret.py:30: test_assert_within_finally PASSED
testing/test_assertinterpret.py:50: test_assert_multiline_1 PASSED
testing/test_assertinterpret.py:59: test_assert_multiline_2 PASSED
testing/test_assertinterpret.py:68: test_in PASSED
testing/test_assertinterpret.py:76: test_is PASSED
testing/test_assertinterpret.py:85: test_attrib PASSED
testing/test_assertinterpret.py:97: test_attrib_inst PASSED
testing/test_assertinterpret.py:108: test_len PASSED
testing/test_assertinterpret.py:118: test_assert_non_string_message PASSED
testing/test_assertinterpret.py:128: test_assert_keyword_arg PASSED
testing/test_assertinterpret.py:152: test_assert_non_string PASSED
testing/test_assertinterpret.py:159: test_assert_implicit_multiline PASSED
testing/test_assertinterpret.py:169: test_assert_with_brokenrepr_arg PASSED
testing/test_assertinterpret.py:176: test_multiple_statements_per_line PASSED
testing/test_assertinterpret.py:183: test_power PASSED
testing/test_assertinterpret.py:196: TestView.test_class_dispatch PASSED
testing/test_assertinterpret.py:222: TestView.test_viewtype_class_hierarchy PASSED
testing/test_assertinterpret.py:250: test_assert_customizable_reprcompare PASSED
testing/test_assertinterpret.py:260: test_assert_long_source_1 PASSED
testing/test_assertinterpret.py:271: test_assert_long_source_2 PASSED
testing/test_assertinterpret.py:282: test_assert_raise_alias PASSED
testing/test_assertinterpret.py:300: test_assert_raise_subclass PASSED
testing/test_assertinterpret.py:312: test_assert_raises_in_nonzero_of_object_pytest_issue10 PASSED
testing/test_assertion.py:32: TestBinReprIntegration.test_pytest_assertrepr_compare_called PASSED
testing/test_assertion.py:37: TestBinReprIntegration.test_pytest_assertrepr_compare_args PASSED
testing/test_assertion.py:47: TestAssert_reprcompare.test_different_types PASSED
testing/test_assertion.py:50: TestAssert_reprcompare.test_summary PASSED
testing/test_assertion.py:54: TestAssert_reprcompare.test_text_diff PASSED
testing/test_assertion.py:59: TestAssert_reprcompare.test_multiline_text_diff PASSED
testing/test_assertion.py:66: TestAssert_reprcompare.test_list PASSED
testing/test_assertion.py:70: TestAssert_reprcompare.test_list_different_lenghts PASSED
testing/test_assertion.py:76: TestAssert_reprcompare.test_dict PASSED
testing/test_assertion.py:80: TestAssert_reprcompare.test_set PASSED
testing/test_assertion.py:84: TestAssert_reprcompare.test_list_tuples PASSED
testing/test_assertion.py:90: TestAssert_reprcompare.test_list_bad_repr PASSED
testing/test_assertion.py:99: TestAssert_reprcompare.test_one_repr_empty PASSED
testing/test_assertion.py:110: TestAssert_reprcompare.test_repr_no_exc PASSED
testing/test_assertion.py:114: test_rewritten PASSED
testing/test_assertion.py:122: test_reprcompare_notin PASSED
testing/test_assertion.py:126: test_pytest_assertrepr_compare_integration PASSED
testing/test_assertion.py:143: test_sequence_comparison_uses_repr PASSED
testing/test_assertion.py:161: test_assertrepr_loaded_per_dir PASSED
testing/test_assertion.py:184: test_assertion_options PASSED
testing/test_assertion.py:203: test_old_assert_mode PASSED
testing/test_assertion.py:211: test_triple_quoted_string_issue113 PASSED
testing/test_assertion.py:222: test_traceback_failure PASSED
testing/test_assertion.py:251: test_warn_missing PASSED
testing/test_assertrewrite.py:55: TestAssertionRewrite.test_place_initial_imports PASSED
testing/test_assertrewrite.py:91: TestAssertionRewrite.test_dont_rewrite PASSED
testing/test_assertrewrite.py:99: TestAssertionRewrite.test_name PASSED
testing/test_assertrewrite.py:111: TestAssertionRewrite.test_assert_already_has_message PASSED
testing/test_assertrewrite.py:116: TestAssertionRewrite.test_boolop PASSED
testing/test_assertrewrite.py:165: TestAssertionRewrite.test_short_circut_evaluation PASSED
testing/test_assertrewrite.py:174: TestAssertionRewrite.test_unary_op PASSED
testing/test_assertrewrite.py:192: TestAssertionRewrite.test_binary_op PASSED
testing/test_assertrewrite.py:199: TestAssertionRewrite.test_call PASSED
testing/test_assertrewrite.py:227: TestAssertionRewrite.test_attribute PASSED
testing/test_assertrewrite.py:240: TestAssertionRewrite.test_comparisons PASSED
testing/test_assertrewrite.py:263: TestAssertionRewrite.test_len PASSED
testing/test_assertrewrite.py:270: TestAssertionRewrite.test_custom_reprcompare PASSED
testing/test_assertrewrite.py:284: TestAssertionRewrite.test_assert_raising_nonzero_in_comparison PASSED
testing/test_assertrewrite.py:298: TestAssertionRewrite.test_formatchar PASSED
testing/test_assertrewrite.py:306: TestRewriteOnImport.test_pycache_is_a_file PASSED
testing/test_assertrewrite.py:313: TestRewriteOnImport.test_zipfile PASSED
testing/test_assertrewrite.py:329: TestRewriteOnImport.test_readonly PASSED
testing/test_assertrewrite.py:339: TestRewriteOnImport.test_dont_write_bytecode PASSED
testing/test_assertrewrite.py:349: TestRewriteOnImport.test_pyc_vs_pyo PASSED
testing/test_assertrewrite.py:363: TestRewriteOnImport.test_package PASSED
testing/test_assertrewrite.py:372: TestRewriteOnImport.test_translate_newlines PASSED
testing/test_capture.py:7: TestCaptureManager.test_getmethod_default_no_fd PASSED
testing/test_capture.py:17: TestCaptureManager.test_configure_per_fspath PASSED
testing/test_capture.py:35: TestCaptureManager.test_capturing_basic_api[0] PASSED
testing/test_capture.py:35: TestCaptureManager.test_capturing_basic_api[1] PASSED
testing/test_capture.py:35: TestCaptureManager.test_capturing_basic_api[2] PASSED
testing/test_capture.py:59: TestCaptureManager.test_juggle_capturings PASSED
testing/test_capture.py:80: test_capturing_unicode[0] PASSED
testing/test_capture.py:80: test_capturing_unicode[1] PASSED
testing/test_capture.py:100: test_capturing_bytes_in_utf8_encoding[0] PASSED
testing/test_capture.py:100: test_capturing_bytes_in_utf8_encoding[1] PASSED
testing/test_capture.py:111: test_collect_capturing PASSED
testing/test_capture.py:123: TestPerTestCapturing.test_capture_and_fixtures PASSED
testing/test_capture.py:145: TestPerTestCapturing.test_capture_scope_cache xfail
testing/test_capture.py:170: TestPerTestCapturing.test_no_carry_over PASSED
testing/test_capture.py:184: TestPerTestCapturing.test_teardown_capturing PASSED
testing/test_capture.py:205: TestPerTestCapturing.test_teardown_final_capturing PASSED
testing/test_capture.py:221: TestPerTestCapturing.test_capturing_outerr PASSED
testing/test_capture.py:245: TestLoggingInteraction.test_logging_stream_ownership PASSED
testing/test_capture.py:257: TestLoggingInteraction.test_logging_and_immediate_setupteardown PASSED
testing/test_capture.py:283: TestLoggingInteraction.test_logging_and_crossscope_fixtures PASSED
testing/test_capture.py:309: TestLoggingInteraction.test_logging_initialized_in_test PASSED
testing/test_capture.py:328: TestLoggingInteraction.test_conftestlogging_is_shown PASSED
testing/test_capture.py:342: TestLoggingInteraction.test_conftestlogging_and_test_logging PASSED
testing/test_capture.py:364: TestCaptureFuncarg.test_std_functional PASSED
testing/test_capture.py:373: TestCaptureFuncarg.test_stdfd_functional PASSED
testing/test_capture.py:385: TestCaptureFuncarg.test_partial_setup_failure PASSED
testing/test_capture.py:396: TestCaptureFuncarg.test_keyboardinterrupt_disables_capturing PASSED
testing/test_capture.py:410: test_setup_failure_does_not_kill_capturing PASSED
testing/test_capture.py:423: test_fdfuncarg_skips_on_no_osdup PASSED
testing/test_capture.py:436: test_capture_conftest_runtest_setup PASSED
testing/test_collection.py:6: TestCollector.test_collect_versus_item PASSED
testing/test_collection.py:11: TestCollector.test_compat_attributes PASSED
testing/test_collection.py:23: TestCollector.test_check_equality PASSED
testing/test_collection.py:51: TestCollector.test_getparent PASSED
testing/test_collection.py:71: TestCollector.test_getcustomfile_roundtrip PASSED
testing/test_collection.py:89: TestCollectFS.test_ignored_certain_directories PASSED
testing/test_collection.py:105: TestCollectFS.test_custom_norecursedirs PASSED
testing/test_collection.py:120: TestCollectPluginHookRelay.test_pytest_collect_file PASSED
testing/test_collection.py:130: TestCollectPluginHookRelay.test_pytest_collect_directory PASSED
testing/test_collection.py:142: TestPrunetraceback.test_collection_error PASSED
testing/test_collection.py:153: TestPrunetraceback.test_custom_repr_failure PASSED
testing/test_collection.py:178: TestPrunetraceback.test_collect_report_postprocessing xfail
testing/test_collection.py:198: TestCustomConftests.test_ignore_collect_path PASSED
testing/test_collection.py:213: TestCustomConftests.test_ignore_collect_not_called_on_argument PASSED
testing/test_collection.py:226: TestCustomConftests.test_collectignore_exclude_on_option PASSED
testing/test_collection.py:244: TestCustomConftests.test_pytest_fs_collect_hooks_are_seen PASSED
testing/test_collection.py:261: TestCustomConftests.test_pytest_collect_file_from_sister_dir PASSED
testing/test_collection.py:293: TestSession.test_parsearg FAILED
testing/test_collection.py:313: TestSession.test_collect_topdir FAILED
testing/test_collection.py:328: TestSession.test_collect_protocol_single_function FAILED
testing/test_collection.py:354: TestSession.test_collect_protocol_method PASSED
testing/test_collection.py:375: TestSession.test_collect_custom_nodes_multi_id PASSED
testing/test_collection.py:411: TestSession.test_collect_subdir_event_ordering FAILED
testing/test_collection.py:430: TestSession.test_collect_two_commandline_args FAILED
testing/test_collection.py:456: TestSession.test_serialization_byid PASSED
testing/test_collection.py:472: TestSession.test_find_byid_without_instance_parents PASSED
testing/test_collection.py:488: Test_getinitialnodes.test_global_file PASSED
testing/test_collection.py:499: Test_getinitialnodes.test_pkgfile FAILED
testing/test_collection.py:514: Test_genitems.test_check_collect_hashes PASSED
testing/test_collection.py:531: Test_genitems.test_root_conftest_syntax_error PASSED
testing/test_collection.py:537: Test_genitems.test_example_items1 PASSED
testing/test_collection.py:564: test_matchnodes_two_collections_same_file PASSED
testing/test_config.py:6: TestParseIni.test_getcfg_and_config PASSED
testing/test_config.py:19: TestParseIni.test_getcfg_empty_path PASSED
testing/test_config.py:22: TestParseIni.test_append_parse_args PASSED
testing/test_config.py:35: TestParseIni.test_tox_ini_wrong_version PASSED
testing/test_config.py:46: TestParseIni.test_ini_names[0] PASSED
testing/test_config.py:46: TestParseIni.test_ini_names[1] PASSED
testing/test_config.py:46: TestParseIni.test_ini_names[2] PASSED
testing/test_config.py:56: TestParseIni.test_toxini_before_lower_pytestini PASSED
testing/test_config.py:70: TestParseIni.test_confcutdir xfail
testing/test_config.py:82: TestConfigCmdlineParsing.test_parsing_again_fails PASSED
testing/test_config.py:88: TestConfigAPI.test_config_trace PASSED
testing/test_config.py:96: TestConfigAPI.test_config_getvalue_honours_conftest PASSED
testing/test_config.py:110: TestConfigAPI.test_config_getvalueorskip PASSED
testing/test_config.py:127: TestConfigAPI.test_config_overwrite PASSED
testing/test_config.py:137: TestConfigAPI.test_getconftest_pathlist PASSED
testing/test_config.py:149: TestConfigAPI.test_addini PASSED
testing/test_config.py:163: TestConfigAPI.test_addini_pathlist PASSED
testing/test_config.py:180: TestConfigAPI.test_addini_args PASSED
testing/test_config.py:197: TestConfigAPI.test_addini_linelist PASSED
testing/test_config.py:215: test_options_on_small_file_do_not_blow_up PASSED
testing/test_config.py:231: test_preparse_ordering_with_setuptools PASSED
testing/test_config.py:253: test_plugin_preparse_prevents_setuptools_loading PASSED
testing/test_config.py:267: test_cmdline_processargs_simple PASSED
testing/test_conftest.py:26: TestConftestValueAccessGlobal.test_basic_init[0] PASSED
testing/test_conftest.py:26: TestConftestValueAccessGlobal.test_basic_init[1] PASSED
testing/test_conftest.py:31: TestConftestValueAccessGlobal.test_onimport[0] PASSED
testing/test_conftest.py:31: TestConftestValueAccessGlobal.test_onimport[1] PASSED
testing/test_conftest.py:41: TestConftestValueAccessGlobal.test_immediate_initialiation_and_incremental_are_the_same[0] PASSED
testing/test_conftest.py:41: TestConftestValueAccessGlobal.test_immediate_initialiation_and_incremental_are_the_same[1] PASSED
testing/test_conftest.py:52: TestConftestValueAccessGlobal.test_default_has_lower_prio[0] PASSED
testing/test_conftest.py:52: TestConftestValueAccessGlobal.test_default_has_lower_prio[1] PASSED
testing/test_conftest.py:57: TestConftestValueAccessGlobal.test_value_access_not_existing[0] PASSED
testing/test_conftest.py:57: TestConftestValueAccessGlobal.test_value_access_not_existing[1] PASSED
testing/test_conftest.py:62: TestConftestValueAccessGlobal.test_value_access_by_path[0] PASSED
testing/test_conftest.py:62: TestConftestValueAccessGlobal.test_value_access_by_path[1] PASSED
testing/test_conftest.py:73: TestConftestValueAccessGlobal.test_value_access_with_init_one_conftest[0] PASSED
testing/test_conftest.py:73: TestConftestValueAccessGlobal.test_value_access_with_init_one_conftest[1] PASSED
testing/test_conftest.py:78: TestConftestValueAccessGlobal.test_value_access_with_init_two_conftests[0] PASSED
testing/test_conftest.py:78: TestConftestValueAccessGlobal.test_value_access_with_init_two_conftests[1] PASSED
testing/test_conftest.py:84: TestConftestValueAccessGlobal.test_value_access_with_confmod[0] PASSED
testing/test_conftest.py:84: TestConftestValueAccessGlobal.test_value_access_with_confmod[1] PASSED
testing/test_conftest.py:94: test_conftest_in_nonpkg_with_init PASSED
testing/test_conftest.py:101: test_doubledash_not_considered PASSED
testing/test_conftest.py:109: test_conftest_global_import PASSED
testing/test_conftest.py:130: test_conftestcutdir PASSED
testing/test_conftest.py:149: test_conftestcutdir_inplace_considered PASSED
testing/test_conftest.py:157: test_setinitial_confcut FAILED
testing/test_conftest.py:173: test_setinitial_conftest_subdirs[0] PASSED
testing/test_conftest.py:173: test_setinitial_conftest_subdirs[1] PASSED
testing/test_conftest.py:173: test_setinitial_conftest_subdirs[2] PASSED
testing/test_conftest.py:173: test_setinitial_conftest_subdirs[3] PASSED
testing/test_conftest.py:186: test_conftest_confcutdir PASSED
testing/test_core.py:7: TestBootstrapping.test_consider_env_fails_to_import PASSED
testing/test_core.py:12: TestBootstrapping.test_preparse_args PASSED
testing/test_core.py:18: TestBootstrapping.test_plugin_prevent_register PASSED
testing/test_core.py:26: TestBootstrapping.test_plugin_prevent_register_unregistered_alredy_registered PASSED
testing/test_core.py:35: TestBootstrapping.test_plugin_skip PASSED
testing/test_core.py:49: TestBootstrapping.test_consider_env_plugin_instantiation PASSED
testing/test_core.py:63: TestBootstrapping.test_consider_setuptools_instantiation PASSED
testing/test_core.py:82: TestBootstrapping.test_consider_setuptools_not_installed PASSED
testing/test_core.py:89: TestBootstrapping.test_pluginmanager_ENV_startup PASSED
testing/test_core.py:102: TestBootstrapping.test_import_plugin_importname PASSED
testing/test_core.py:120: TestBootstrapping.test_import_plugin_dotted_name PASSED
testing/test_core.py:132: TestBootstrapping.test_consider_module PASSED
testing/test_core.py:143: TestBootstrapping.test_consider_module_import_module PASSED
testing/test_core.py:160: TestBootstrapping.test_config_sets_conftesthandle_onimport PASSED
testing/test_core.py:164: TestBootstrapping.test_consider_conftest_deps PASSED
testing/test_core.py:169: TestBootstrapping.test_pm PASSED
testing/test_core.py:186: TestBootstrapping.test_pm_ordering PASSED
testing/test_core.py:199: TestBootstrapping.test_register_imported_modules PASSED
testing/test_core.py:213: TestBootstrapping.test_canonical_import PASSED
testing/test_core.py:221: TestBootstrapping.test_register_mismatch_method PASSED
testing/test_core.py:228: TestBootstrapping.test_register_mismatch_arg PASSED
testing/test_core.py:236: TestBootstrapping.test_notify_exception PASSED
testing/test_core.py:250: TestBootstrapping.test_register PASSED
testing/test_core.py:267: TestBootstrapping.test_listattr PASSED
testing/test_core.py:281: TestBootstrapping.test_hook_tracing PASSED
testing/test_core.py:304: TestPytestPluginInteractions.test_addhooks_conftestplugin PASSED
testing/test_core.py:323: TestPytestPluginInteractions.test_addhooks_nohooks PASSED
testing/test_core.py:335: TestPytestPluginInteractions.test_do_option_conftestplugin PASSED
testing/test_core.py:346: TestPytestPluginInteractions.test_namespace_early_from_import PASSED
testing/test_core.py:355: TestPytestPluginInteractions.test_do_ext_namespace PASSED
testing/test_core.py:372: TestPytestPluginInteractions.test_do_option_postinitialize PASSED
testing/test_core.py:385: TestPytestPluginInteractions.test_configure PASSED
testing/test_core.py:406: TestPytestPluginInteractions.test_listattr PASSED
testing/test_core.py:414: TestPytestPluginInteractions.test_listattr_tryfirst PASSED
testing/test_core.py:446: test_namespace_has_default_and_env_plugins PASSED
testing/test_core.py:454: test_varnames PASSED
testing/test_core.py:468: TestMultiCall.test_uses_copy_of_methods PASSED
testing/test_core.py:476: TestMultiCall.test_call_passing PASSED
testing/test_core.py:498: TestMultiCall.test_keyword_args PASSED
testing/test_core.py:511: TestMultiCall.test_keyword_args_with_defaultargs PASSED
testing/test_core.py:519: TestMultiCall.test_tags_call_error PASSED
testing/test_core.py:523: TestMultiCall.test_call_subexecute PASSED
testing/test_core.py:535: TestMultiCall.test_call_none_is_no_result PASSED
testing/test_core.py:546: TestHookRelay.test_happypath PASSED
testing/test_core.py:563: TestHookRelay.test_only_kwargs PASSED
testing/test_core.py:571: TestHookRelay.test_firstresult_definition PASSED
testing/test_core.py:587: TestTracer.test_simple PASSED
testing/test_core.py:601: TestTracer.test_indent PASSED
testing/test_core.py:623: TestTracer.test_setprocessor PASSED
testing/test_core.py:645: TestTracer.test_setmyprocessor PASSED
testing/test_doctest.py:6: TestDoctests.test_collect_testtextfile PASSED
testing/test_doctest.py:22: TestDoctests.test_collect_module PASSED
testing/test_doctest.py:30: TestDoctests.test_simple_doctestfile PASSED
testing/test_doctest.py:39: TestDoctests.test_new_pattern PASSED
testing/test_doctest.py:48: TestDoctests.test_doctest_unexpected_exception PASSED
testing/test_doctest.py:62: TestDoctests.test_doctest_unex_importerror PASSED
testing/test_doctest.py:77: TestDoctests.test_doctestmodule PASSED
testing/test_doctest.py:89: TestDoctests.test_doctestmodule_external_and_issue116 PASSED
testing/test_doctest.py:111: TestDoctests.test_txtfile_failing PASSED
testing/test_genscript.py:21: test_gen[python2.4] SKIPPED
testing/test_genscript.py:21: test_gen[python2.5] PASSED
testing/test_genscript.py:21: test_gen[python2.6] PASSED
testing/test_genscript.py:21: test_gen[python2.7] SKIPPED
testing/test_genscript.py:21: test_gen[python3.1] SKIPPED
testing/test_genscript.py:21: test_gen[pypy] SKIPPED
testing/test_genscript.py:21: test_gen[jython] SKIPPED
testing/test_genscript.py:31: test_rundist SKIPPED
testing/test_helpconfig.py:4: test_version PASSED
testing/test_helpconfig.py:17: test_help PASSED
testing/test_helpconfig.py:26: test_collectattr PASSED
testing/test_helpconfig.py:38: test_hookvalidation_unknown PASSED
testing/test_helpconfig.py:49: test_hookvalidation_optional PASSED
testing/test_helpconfig.py:59: test_traceconfig PASSED
testing/test_helpconfig.py:66: test_debug PASSED
testing/test_helpconfig.py:72: test_PYTEST_DEBUG PASSED
testing/test_junitxml.py:21: TestPython.test_summing_simple PASSED
testing/test_junitxml.py:42: TestPython.test_timing_function PASSED
testing/test_junitxml.py:54: TestPython.test_setup_error PASSED
testing/test_junitxml.py:73: TestPython.test_skip_contains_name_reason PASSED
testing/test_junitxml.py:93: TestPython.test_classname_instance PASSED
testing/test_junitxml.py:108: TestPython.test_classname_nested_dir PASSED
testing/test_junitxml.py:120: TestPython.test_internal_error PASSED
testing/test_junitxml.py:133: TestPython.test_failure_function PASSED
testing/test_junitxml.py:160: TestPython.test_failure_escape PASSED
testing/test_junitxml.py:181: TestPython.test_junit_prefixing PASSED
testing/test_junitxml.py:203: TestPython.test_xfailure_function PASSED
testing/test_junitxml.py:221: TestPython.test_xfailure_xpass PASSED
testing/test_junitxml.py:240: TestPython.test_collect_error PASSED
testing/test_junitxml.py:254: TestPython.test_collect_skipped PASSED
testing/test_junitxml.py:267: TestPython.test_unicode PASSED
testing/test_junitxml.py:283: TestNonPython.test_summing_simple PASSED
testing/test_junitxml.py:312: test_nullbyte PASSED
testing/test_junitxml.py:328: test_nullbyte_replace PASSED
testing/test_junitxml.py:343: test_invalid_xml_escape PASSED
testing/test_junitxml.py:380: test_logxml_path_expansion PASSED
testing/test_mark.py:5: TestMark.test_markinfo_repr PASSED
testing/test_mark.py:10: TestMark.test_pytest_exists_in_namespace_all PASSED
testing/test_mark.py:14: TestMark.test_pytest_mark_notcallable PASSED
testing/test_mark.py:18: TestMark.test_pytest_mark_bare PASSED
testing/test_mark.py:25: TestMark.test_pytest_mark_keywords PASSED
testing/test_mark.py:34: TestMark.test_apply_multiple_and_merge PASSED
testing/test_mark.py:48: TestMark.test_pytest_mark_positional PASSED
testing/test_mark.py:56: TestMark.test_pytest_mark_reuse PASSED
testing/test_mark.py:72: TestFunctional.test_mark_per_function PASSED
testing/test_mark.py:82: TestFunctional.test_mark_per_module PASSED
testing/test_mark.py:92: TestFunctional.test_marklist_per_class PASSED
testing/test_mark.py:104: TestFunctional.test_marklist_per_module PASSED
testing/test_mark.py:117: TestFunctional.test_mark_per_class_decorator PASSED
testing/test_mark.py:129: TestFunctional.test_mark_per_class_decorator_plus_existing_dec PASSED
testing/test_mark.py:144: TestFunctional.test_merging_markers PASSED
testing/test_mark.py:162: TestFunctional.test_mark_other PASSED
testing/test_mark.py:173: TestFunctional.test_mark_dynamically_in_funcarg PASSED
testing/test_mark.py:193: Test_genitems.test_check_collect_hashes PASSED
testing/test_mark.py:210: Test_genitems.test_root_conftest_syntax_error PASSED
testing/test_mark.py:216: Test_genitems.test_example_items1 PASSED
testing/test_mark.py:245: TestKeywordSelection.test_select_simple PASSED
testing/test_mark.py:265: TestKeywordSelection.test_select_extra_keywords PASSED
testing/test_mark.py:291: TestKeywordSelection.test_select_starton PASSED
testing/test_mark.py:307: TestKeywordSelection.test_keyword_extra PASSED
testing/test_monkeypatch.py:5: test_setattr PASSED
testing/test_monkeypatch.py:27: test_delattr PASSED
testing/test_monkeypatch.py:45: test_setitem PASSED
testing/test_monkeypatch.py:62: test_delitem PASSED
testing/test_monkeypatch.py:78: test_setenv PASSED
testing/test_monkeypatch.py:86: test_delenv PASSED
testing/test_monkeypatch.py:106: test_setenv_prepend PASSED
testing/test_monkeypatch.py:116: test_monkeypatch_plugin PASSED
testing/test_monkeypatch.py:124: test_syspath_prepend PASSED
testing/test_nose.py:6: test_nose_setup SKIPPED
testing/test_nose.py:27: test_nose_setup_func SKIPPED
testing/test_nose.py:57: test_nose_setup_func_failure SKIPPED
testing/test_nose.py:81: test_nose_setup_func_failure_2 SKIPPED
testing/test_nose.py:105: test_nose_setup_partial SKIPPED
testing/test_nose.py:140: test_nose_test_generator_fixtures SKIPPED
testing/test_nose.py:207: test_module_level_setup SKIPPED
testing/test_nose.py:238: test_nose_style_setup_teardown SKIPPED
testing/test_nose.py:259: test_nose_setup_ordering SKIPPED
testing/test_parseopt.py:6: TestParser.test_no_help_by_default PASSED
testing/test_parseopt.py:12: TestParser.test_group_add_and_get PASSED
testing/test_parseopt.py:18: TestParser.test_getgroup_simple PASSED
testing/test_parseopt.py:26: TestParser.test_group_ordering PASSED
testing/test_parseopt.py:35: TestParser.test_group_addoption PASSED
testing/test_parseopt.py:41: TestParser.test_group_shortopt_lowercase PASSED
testing/test_parseopt.py:51: TestParser.test_parser_addoption PASSED
testing/test_parseopt.py:65: TestParser.test_parse PASSED
testing/test_parseopt.py:70: TestParser.test_parse_will_set_default PASSED
testing/test_parseopt.py:79: TestParser.test_parse_setoption PASSED
testing/test_parseopt.py:90: TestParser.test_parse_defaultgetter PASSED
testing/test_parseopt.py:105: test_addoption_parser_epilog PASSED
testing/test_pastebin.py:13: TestPasting.test_failed PASSED
testing/test_pastebin.py:30: TestPasting.test_all PASSED
testing/test_pdb.py:14: TestPDB.test_pdb_on_fail PASSED
testing/test_pdb.py:24: TestPDB.test_pdb_on_xfail PASSED
testing/test_pdb.py:34: TestPDB.test_pdb_on_skip PASSED
testing/test_pdb.py:43: TestPDB.test_pdb_on_BdbQuit PASSED
testing/test_pdb.py:52: TestPDB.test_pdb_interaction SKIPPED
testing/test_pdb.py:69: TestPDB.test_pdb_interaction_exception SKIPPED
testing/test_pdb.py:88: TestPDB.test_pdb_interaction_capturing_simple SKIPPED
testing/test_pdb.py:109: TestPDB.test_pdb_interaction_capturing_twice SKIPPED
testing/test_pdb.py:136: TestPDB.test_pdb_used_outside_test SKIPPED
testing/test_pdb.py:147: TestPDB.test_pdb_used_in_generate_tests SKIPPED
testing/test_pdb.py:160: TestPDB.test_pdb_collection_failure_is_shown PASSED
testing/test_pytester.py:7: test_reportrecorder xfail
testing/test_pytester.py:58: test_parseconfig PASSED
testing/test_pytester.py:65: test_testdir_runs_with_plugin PASSED
testing/test_pytester.py:76: test_hookrecorder_basic PASSED
testing/test_pytester.py:88: test_hookrecorder_basic_no_args_hook PASSED
testing/test_pytester.py:99: test_functional PASSED
testing/test_pytester.py:119: test_makepyfile_unicode PASSED
testing/test_python.py:5: TestModule.test_failing_import PASSED
testing/test_python.py:10: TestModule.test_import_duplicate PASSED
testing/test_python.py:27: TestModule.test_syntax_error_in_module PASSED
testing/test_python.py:32: TestModule.test_module_considers_pluginmanager_at_import PASSED
testing/test_python.py:37: TestClass.test_class_with_init_not_collected FAILED
testing/test_python.py:37: TestClass.test_class_with_init_not_collected ERROR
testing/test_python.py:49: TestClass.test_class_subclassobject ERROR
testing/test_python.py:49: TestClass.test_class_subclassobject ERROR
testing/test_python.py:60: TestGenerator.test_generative_functions ERROR
testing/test_python.py:60: TestGenerator.test_generative_functions ERROR
testing/test_python.py:80: TestGenerator.test_generative_methods ERROR
testing/test_python.py:80: TestGenerator.test_generative_methods ERROR
testing/test_python.py:98: TestGenerator.test_generative_functions_with_explicit_names ERROR
testing/test_python.py:98: TestGenerator.test_generative_functions_with_explicit_names ERROR
testing/test_python.py:120: TestGenerator.test_generative_functions_unique_explicit_names ERROR
testing/test_python.py:120: TestGenerator.test_generative_functions_unique_explicit_names ERROR
testing/test_python.py:134: TestGenerator.test_generative_methods_with_explicit_names ERROR
testing/test_python.py:134: TestGenerator.test_generative_methods_with_explicit_names ERROR
testing/test_python.py:154: TestGenerator.test_order_of_execution_generator_same_codeline ERROR
testing/test_python.py:154: TestGenerator.test_order_of_execution_generator_same_codeline ERROR
testing/test_python.py:178: TestGenerator.test_order_of_execution_generator_different_codeline ERROR
testing/test_python.py:178: TestGenerator.test_order_of_execution_generator_different_codeline ERROR
testing/test_python.py:209: TestGenerator.test_setupstate_is_preserved_134 ERROR
testing/test_python.py:209: TestGenerator.test_setupstate_is_preserved_134 ERROR
testing/test_python.py:253: TestFunction.test_getmodulecollector ERROR
testing/test_python.py:253: TestFunction.test_getmodulecollector ERROR
testing/test_python.py:259: TestFunction.test_function_equality ERROR
testing/test_python.py:259: TestFunction.test_function_equality ERROR
testing/test_python.py:281: TestFunction.test_function_equality_with_callspec ERROR
testing/test_python.py:281: TestFunction.test_function_equality_with_callspec ERROR
testing/test_python.py:299: TestFunction.test_pyfunc_call ERROR
testing/test_python.py:299: TestFunction.test_pyfunc_call ERROR
testing/test_python.py:313: TestSorting.test_check_equality ERROR
testing/test_python.py:313: TestSorting.test_check_equality ERROR
testing/test_python.py:341: TestSorting.test_allow_sane_sorting_for_decorators ERROR
testing/test_python.py:341: TestSorting.test_allow_sane_sorting_for_decorators ERROR
testing/test_python.py:363: TestConftestCustomization.test_pytest_pycollect_module ERROR
testing/test_python.py:363: TestConftestCustomization.test_pytest_pycollect_module ERROR
testing/test_python.py:380: TestConftestCustomization.test_pytest_pycollect_makeitem ERROR
testing/test_python.py:380: TestConftestCustomization.test_pytest_pycollect_makeitem ERROR
testing/test_python.py:395: TestConftestCustomization.test_makeitem_non_underscore ERROR
testing/test_python.py:395: TestConftestCustomization.test_makeitem_non_underscore ERROR
testing/test_python.py:403: test_setup_only_available_in_subdir ERROR
testing/test_python.py:403: test_setup_only_available_in_subdir ERROR
testing/test_python.py:431: test_generate_tests_only_done_in_subdir ERROR
testing/test_python.py:431: test_generate_tests_only_done_in_subdir ERROR
testing/test_python.py:449: test_modulecol_roundtrip ERROR
testing/test_python.py:449: test_modulecol_roundtrip ERROR
testing/test_python.py:457: TestTracebackCutting.test_skip_simple PASSED
testing/test_python.py:462: TestTracebackCutting.test_traceback_argsetup PASSED
testing/test_python.py:482: TestTracebackCutting.test_traceback_error_during_import PASSED
testing/test_python.py:507: test_getfuncargnames PASSED
testing/test_python.py:523: test_callspec_repr PASSED
testing/test_python.py:534: TestFillFuncArgs.test_funcarg_lookupfails PASSED
testing/test_python.py:545: TestFillFuncArgs.test_funcarg_lookup_default FAILED
testing/test_python.py:545: TestFillFuncArgs.test_funcarg_lookup_default ERROR
testing/test_python.py:554: TestFillFuncArgs.test_funcarg_basic ERROR
testing/test_python.py:554: TestFillFuncArgs.test_funcarg_basic ERROR
testing/test_python.py:567: TestFillFuncArgs.test_funcarg_lookup_modulelevel ERROR
testing/test_python.py:567: TestFillFuncArgs.test_funcarg_lookup_modulelevel ERROR
testing/test_python.py:584: TestFillFuncArgs.test_funcarg_lookup_classlevel ERROR
testing/test_python.py:584: TestFillFuncArgs.test_funcarg_lookup_classlevel ERROR
testing/test_python.py:597: TestFillFuncArgs.test_fillfuncargs_exposed ERROR
testing/test_python.py:597: TestFillFuncArgs.test_fillfuncargs_exposed ERROR
testing/test_python.py:610: TestRequest.test_request_attributes ERROR
testing/test_python.py:610: TestRequest.test_request_attributes ERROR
testing/test_python.py:624: TestRequest.test_request_attributes_method ERROR
testing/test_python.py:624: TestRequest.test_request_attributes_method ERROR
testing/test_python.py:648: TestRequest.test_getfuncargvalue_recursive ERROR
testing/test_python.py:648: TestRequest.test_getfuncargvalue_recursive ERROR
testing/test_python.py:663: TestRequest.test_getfuncargvalue ERROR
testing/test_python.py:663: TestRequest.test_getfuncargvalue ERROR
testing/test_python.py:684: TestRequest.test_request_addfinalizer ERROR
testing/test_python.py:684: TestRequest.test_request_addfinalizer ERROR
testing/test_python.py:702: TestRequest.test_request_addfinalizer_partial_setup_failure ERROR
testing/test_python.py:702: TestRequest.test_request_addfinalizer_partial_setup_failure ERROR
testing/test_python.py:717: TestRequest.test_request_getmodulepath ERROR
testing/test_python.py:717: TestRequest.test_request_getmodulepath ERROR
testing/test_python.py:723: test_applymarker ERROR
testing/test_python.py:723: test_applymarker ERROR
testing/test_python.py:741: TestRequestCachedSetup.test_request_cachedsetup ERROR
testing/test_python.py:741: TestRequestCachedSetup.test_request_cachedsetup ERROR
testing/test_python.py:762: TestRequestCachedSetup.test_request_cachedsetup_class ERROR
testing/test_python.py:762: TestRequestCachedSetup.test_request_cachedsetup_class ERROR
testing/test_python.py:795: TestRequestCachedSetup.test_request_cachedsetup_extrakey ERROR
testing/test_python.py:795: TestRequestCachedSetup.test_request_cachedsetup_extrakey ERROR
testing/test_python.py:810: TestRequestCachedSetup.test_request_cachedsetup_cache_deletion ERROR
testing/test_python.py:810: TestRequestCachedSetup.test_request_cachedsetup_cache_deletion ERROR
testing/test_python.py:828: TestRequestCachedSetup.test_request_cached_setup_two_args ERROR
testing/test_python.py:828: TestRequestCachedSetup.test_request_cached_setup_two_args ERROR
testing/test_python.py:842: TestRequestCachedSetup.test_request_cached_setup_getfuncargvalue ERROR
testing/test_python.py:842: TestRequestCachedSetup.test_request_cached_setup_getfuncargvalue ERROR
testing/test_python.py:857: TestRequestCachedSetup.test_request_cached_setup_functional ERROR
testing/test_python.py:857: TestRequestCachedSetup.test_request_cached_setup_functional ERROR
testing/test_python.py:885: TestMetafunc.test_no_funcargs ERROR
testing/test_python.py:885: TestMetafunc.test_no_funcargs ERROR
testing/test_python.py:890: TestMetafunc.test_function_basic ERROR
testing/test_python.py:890: TestMetafunc.test_function_basic ERROR
testing/test_python.py:898: TestMetafunc.test_addcall_no_args ERROR
testing/test_python.py:898: TestMetafunc.test_addcall_no_args ERROR
testing/test_python.py:907: TestMetafunc.test_addcall_id ERROR
testing/test_python.py:907: TestMetafunc.test_addcall_id ERROR
testing/test_python.py:920: TestMetafunc.test_addcall_param ERROR
testing/test_python.py:920: TestMetafunc.test_addcall_param ERROR
testing/test_python.py:932: TestMetafunc.test_addcall_funcargs ERROR
testing/test_python.py:932: TestMetafunc.test_addcall_funcargs ERROR
testing/test_python.py:945: TestGenfuncFunctional.test_attributes ERROR
testing/test_python.py:945: TestGenfuncFunctional.test_attributes ERROR
testing/test_python.py:979: TestGenfuncFunctional.test_addcall_with_two_funcargs_generators ERROR
testing/test_python.py:979: TestGenfuncFunctional.test_addcall_with_two_funcargs_generators ERROR
testing/test_python.py:1000: TestGenfuncFunctional.test_two_functions ERROR
testing/test_python.py:1000: TestGenfuncFunctional.test_two_functions ERROR
testing/test_python.py:1022: TestGenfuncFunctional.test_noself_in_method ERROR
testing/test_python.py:1022: TestGenfuncFunctional.test_noself_in_method ERROR
testing/test_python.py:1037: TestGenfuncFunctional.test_generate_plugin_and_module PASSED
testing/test_python.py:1063: TestGenfuncFunctional.test_generate_tests_in_class PASSED
testing/test_python.py:1078: TestGenfuncFunctional.test_two_functions_not_same_instance PASSED
testing/test_python.py:1096: TestGenfuncFunctional.test_issue28_setup_method_in_generate_tests PASSED
testing/test_python.py:1112: test_conftest_funcargs_only_available_in_subdir PASSED
testing/test_python.py:1133: test_funcarg_non_pycollectobj FAILED
testing/test_python.py:1133: test_funcarg_non_pycollectobj ERROR
testing/test_python.py:1156: test_funcarg_lookup_error ERROR
testing/test_python.py:1156: test_funcarg_lookup_error ERROR
testing/test_python.py:1172: TestReportInfo.test_itemreport_reportinfo ERROR
testing/test_python.py:1172: TestReportInfo.test_itemreport_reportinfo ERROR
testing/test_python.py:1186: TestReportInfo.test_func_reportinfo ERROR
testing/test_python.py:1186: TestReportInfo.test_func_reportinfo ERROR
testing/test_python.py:1193: TestReportInfo.test_class_reportinfo ERROR
testing/test_python.py:1193: TestReportInfo.test_class_reportinfo ERROR
testing/test_python.py:1205: TestReportInfo.test_generator_reportinfo ERROR
testing/test_python.py:1205: TestReportInfo.test_generator_reportinfo ERROR
testing/test_python.py:1236: test_show_funcarg ERROR
testing/test_python.py:1236: test_show_funcarg ERROR
testing/test_python.py:1245: TestRaises.test_raises ERROR
testing/test_python.py:1245: TestRaises.test_raises ERROR
testing/test_python.py:1252: TestRaises.test_raises_exec ERROR
testing/test_python.py:1252: TestRaises.test_raises_exec ERROR
testing/test_python.py:1255: TestRaises.test_raises_syntax_error ERROR
testing/test_python.py:1255: TestRaises.test_raises_syntax_error ERROR
testing/test_python.py:1258: TestRaises.test_raises_function ERROR
testing/test_python.py:1258: TestRaises.test_raises_function ERROR
testing/test_python.py:1261: TestRaises.test_raises_callable_no_exception ERROR
testing/test_python.py:1261: TestRaises.test_raises_callable_no_exception ERROR
testing/test_python.py:1270: TestRaises.test_raises_as_contextmanager ERROR
testing/test_python.py:1270: TestRaises.test_raises_as_contextmanager ERROR
testing/test_python.py:1300: test_customized_python_discovery ERROR
testing/test_python.py:1300: test_customized_python_discovery ERROR
testing/test_python.py:1330: test_collector_attributes ERROR
testing/test_python.py:1330: test_collector_attributes ERROR
testing/test_python.py:1348: test_customize_through_attributes ERROR
testing/test_python.py:1348: test_customize_through_attributes ERROR
testing/test_recwarn.py:4: test_WarningRecorder ERROR
testing/test_recwarn.py:4: test_WarningRecorder ERROR
testing/test_recwarn.py:23: test_recwarn_functional ERROR
testing/test_recwarn.py:23: test_recwarn_functional ERROR
testing/test_recwarn.py:54: test_deprecated_call_raises ERROR
testing/test_recwarn.py:54: test_deprecated_call_raises ERROR
testing/test_recwarn.py:59: test_deprecated_call ERROR
testing/test_recwarn.py:59: test_deprecated_call ERROR
testing/test_recwarn.py:62: test_deprecated_call_ret ERROR
testing/test_recwarn.py:62: test_deprecated_call_ret ERROR > testing/test_recwarn.py:66: test_deprecated_call_preserves ERROR > testing/test_recwarn.py:66: test_deprecated_call_preserves ERROR > testing/test_recwarn.py:74: test_deprecated_explicit_call_raises ERROR > testing/test_recwarn.py:74: test_deprecated_explicit_call_raises ERROR > testing/test_recwarn.py:78: test_deprecated_explicit_call ERROR > testing/test_recwarn.py:78: test_deprecated_explicit_call ERROR > testing/test_resultlog.py:7: test_generic_path ERROR > testing/test_resultlog.py:7: test_generic_path ERROR > testing/test_resultlog.py:30: test_write_log_entry ERROR > testing/test_resultlog.py:30: test_write_log_entry ERROR > testing/test_resultlog.py:80: TestWithFunctionIntegration.test_collection_report ERROR > testing/test_resultlog.py:80: TestWithFunctionIntegration.test_collection_report ERROR > testing/test_resultlog.py:103: TestWithFunctionIntegration.test_log_test_outcomes ERROR > testing/test_resultlog.py:103: TestWithFunctionIntegration.test_log_test_outcomes ERROR > testing/test_resultlog.py:136: TestWithFunctionIntegration.test_internal_exception ERROR > testing/test_resultlog.py:136: TestWithFunctionIntegration.test_internal_exception ERROR > testing/test_resultlog.py:153: test_generic ERROR > testing/test_resultlog.py:153: test_generic ERROR > testing/test_resultlog.py:180: test_no_resultlog_on_slaves ERROR > testing/test_resultlog.py:180: test_no_resultlog_on_slaves ERROR > testing/test_runner.py:6: TestSetupState.test_setup ERROR > testing/test_runner.py:6: TestSetupState.test_setup ERROR > testing/test_runner.py:16: TestSetupState.test_setup_scope_None ERROR > testing/test_runner.py:16: TestSetupState.test_setup_scope_None ERROR > testing/test_runner.py:30: TestSetupState.test_teardown_exact_stack_empty ERROR > testing/test_runner.py:30: TestSetupState.test_teardown_exact_stack_empty ERROR > testing/test_runner.py:37: TestSetupState.test_setup_fails_and_failure_is_cached ERROR > 
testing/test_runner.py:37: TestSetupState.test_setup_fails_and_failure_is_cached ERROR > testing/test_runner.py:48: TestExecutionNonForked.test_passfunction ERROR > testing/test_runner.py:48: TestExecutionNonForked.test_passfunction ERROR > testing/test_runner.py:59: TestExecutionNonForked.test_failfunction ERROR > testing/test_runner.py:59: TestExecutionNonForked.test_failfunction ERROR > testing/test_runner.py:72: TestExecutionNonForked.test_skipfunction FAILED > testing/test_runner.py:90: TestExecutionNonForked.test_skip_in_setup_function FAILED > testing/test_runner.py:90: TestExecutionNonForked.test_skip_in_setup_function ERROR > testing/test_runner.py:109: TestExecutionNonForked.test_failure_in_setup_function ERROR > testing/test_runner.py:109: TestExecutionNonForked.test_failure_in_setup_function ERROR > testing/test_runner.py:124: TestExecutionNonForked.test_failure_in_teardown_function ERROR > testing/test_runner.py:124: TestExecutionNonForked.test_failure_in_teardown_function ERROR > testing/test_runner.py:142: TestExecutionNonForked.test_custom_failure_repr ERROR > testing/test_runner.py:142: TestExecutionNonForked.test_custom_failure_repr ERROR > testing/test_runner.py:163: TestExecutionNonForked.test_failure_in_setup_function_ignores_custom_repr ERROR > testing/test_runner.py:163: TestExecutionNonForked.test_failure_in_setup_function_ignores_custom_repr ERROR > testing/test_runner.py:187: TestExecutionNonForked.test_systemexit_does_not_bail_out ERROR > testing/test_runner.py:187: TestExecutionNonForked.test_systemexit_does_not_bail_out ERROR > testing/test_runner.py:199: TestExecutionNonForked.test_exit_propagates ERROR > testing/test_runner.py:199: TestExecutionNonForked.test_exit_propagates ERROR > testing/test_runner.py:217: TestExecutionNonForked.test_keyboardinterrupt_propagates ERROR > testing/test_runner.py:217: TestExecutionNonForked.test_keyboardinterrupt_propagates ERROR > testing/test_runner.py:48: TestExecutionForked.test_passfunction ERROR 
> testing/test_runner.py:48: TestExecutionForked.test_passfunction ERROR > testing/test_runner.py:59: TestExecutionForked.test_failfunction ERROR > testing/test_runner.py:59: TestExecutionForked.test_failfunction ERROR > testing/test_runner.py:72: TestExecutionForked.test_skipfunction ERROR > testing/test_runner.py:72: TestExecutionForked.test_skipfunction ERROR > testing/test_runner.py:90: TestExecutionForked.test_skip_in_setup_function ERROR > testing/test_runner.py:90: TestExecutionForked.test_skip_in_setup_function ERROR > testing/test_runner.py:109: TestExecutionForked.test_failure_in_setup_function ERROR > testing/test_runner.py:109: TestExecutionForked.test_failure_in_setup_function ERROR > testing/test_runner.py:124: TestExecutionForked.test_failure_in_teardown_function ERROR > testing/test_runner.py:124: TestExecutionForked.test_failure_in_teardown_function ERROR > testing/test_runner.py:142: TestExecutionForked.test_custom_failure_repr ERROR > testing/test_runner.py:142: TestExecutionForked.test_custom_failure_repr ERROR > testing/test_runner.py:163: TestExecutionForked.test_failure_in_setup_function_ignores_custom_repr ERROR > testing/test_runner.py:163: TestExecutionForked.test_failure_in_setup_function_ignores_custom_repr ERROR > testing/test_runner.py:187: TestExecutionForked.test_systemexit_does_not_bail_out ERROR > testing/test_runner.py:187: TestExecutionForked.test_systemexit_does_not_bail_out ERROR > testing/test_runner.py:199: TestExecutionForked.test_exit_propagates ERROR > testing/test_runner.py:199: TestExecutionForked.test_exit_propagates ERROR > testing/test_runner.py:236: TestExecutionForked.test_suicide ERROR > testing/test_runner.py:236: TestExecutionForked.test_suicide ERROR > testing/test_runner.py:247: TestSessionReports.test_collect_result ERROR > testing/test_runner.py:247: TestSessionReports.test_collect_result ERROR > testing/test_runner.py:267: TestSessionReports.test_skip_at_module_scope ERROR > testing/test_runner.py:267: 
TestSessionReports.test_skip_at_module_scope ERROR > testing/test_runner.py:279: test_callinfo ERROR > testing/test_runner.py:279: test_callinfo ERROR > testing/test_runner.py:292: test_runtest_in_module_ordering XPASS > testing/test_runner.py:292: test_runtest_in_module_ordering XPASS > testing/test_runner.py:322: test_pytest_exit ERROR > testing/test_runner.py:322: test_pytest_exit ERROR > testing/test_runner.py:329: test_pytest_fail ERROR > testing/test_runner.py:329: test_pytest_fail ERROR > testing/test_runner.py:337: test_pytest_fail_notrace ERROR > testing/test_runner.py:337: test_pytest_fail_notrace ERROR > testing/test_runner.py:352: test_exception_printing_skip ERROR > testing/test_runner.py:352: test_exception_printing_skip ERROR > testing/test_runner.py:360: test_importorskip ERROR > testing/test_runner.py:360: test_importorskip ERROR > testing/test_runner.py:386: test_importorskip_imports_last_module_part ERROR > testing/test_runner.py:386: test_importorskip_imports_last_module_part ERROR > testing/test_runner.py:392: test_pytest_cmdline_main ERROR > testing/test_runner.py:392: test_pytest_cmdline_main ERROR > testing/test_runner_xunit.py:5: test_module_and_function_setup ERROR > testing/test_runner_xunit.py:5: test_module_and_function_setup ERROR > testing/test_runner_xunit.py:35: test_class_setup ERROR > testing/test_runner_xunit.py:35: test_class_setup ERROR > testing/test_runner_xunit.py:59: test_method_setup ERROR > testing/test_runner_xunit.py:59: test_method_setup ERROR > testing/test_runner_xunit.py:75: test_method_generator_setup ERROR > testing/test_runner_xunit.py:75: test_method_generator_setup ERROR > testing/test_runner_xunit.py:97: test_func_generator_setup FAILED > testing/test_runner_xunit.py:125: test_method_setup_uses_fresh_instances FAILED > testing/test_runner_xunit.py:137: test_failing_setup_calls_teardown FAILED > testing/test_runner_xunit.py:137: test_failing_setup_calls_teardown ERROR > testing/test_runner_xunit.py:153: 
test_setup_that_skips_calledagain_and_teardown ERROR > testing/test_runner_xunit.py:153: test_setup_that_skips_calledagain_and_teardown ERROR > testing/test_runner_xunit.py:171: test_setup_fails_again_on_all_tests FAILED > testing/test_runner_xunit.py:171: test_setup_fails_again_on_all_tests ERROR > testing/test_runner_xunit.py:189: test_setup_funcarg_setup_when_outer_scope_fails ERROR > testing/test_runner_xunit.py:189: test_setup_funcarg_setup_when_outer_scope_fails ERROR > testing/test_session.py:4: TestNewSession.test_basic_testitem_events ERROR > testing/test_session.py:4: TestNewSession.test_basic_testitem_events ERROR > testing/test_session.py:31: TestNewSession.test_nested_import_error FAILED > testing/test_session.py:46: TestNewSession.test_raises_output FAILED > testing/test_session.py:59: TestNewSession.test_generator_yields_None FAILED > testing/test_session.py:69: TestNewSession.test_syntax_error_module FAILED > testing/test_session.py:76: TestNewSession.test_exit_first_problem FAILED > testing/test_session.py:85: TestNewSession.test_maxfail FAILED > testing/test_session.py:95: TestNewSession.test_broken_repr FAILED > testing/test_session.py:119: TestNewSession.test_skip_file_by_conftest PASSED > testing/test_session.py:137: TestNewSession.test_order_of_execution FAILED > testing/test_session.py:162: TestNewSession.test_collect_only_with_various_situations PASSED > testing/test_session.py:196: TestNewSession.test_minus_x_import_error PASSED > testing/test_session.py:205: test_plugin_specify PASSED > testing/test_session.py:214: test_plugin_already_exists FAILED > testing/test_session.py:219: test_exclude FAILED > testing/test_session.py:219: test_exclude ERROR > testing/test_skipping.py:9: TestEvaluator.test_no_marker FAILED > testing/test_skipping.py:9: TestEvaluator.test_no_marker ERROR > testing/test_skipping.py:15: TestEvaluator.test_marked_no_args ERROR > testing/test_skipping.py:15: TestEvaluator.test_marked_no_args ERROR > 
testing/test_skipping.py:29: TestEvaluator.test_marked_one_arg ERROR > testing/test_skipping.py:29: TestEvaluator.test_marked_one_arg ERROR > testing/test_skipping.py:42: TestEvaluator.test_marked_one_arg_with_reason ERROR > testing/test_skipping.py:42: TestEvaluator.test_marked_one_arg_with_reason ERROR > testing/test_skipping.py:56: TestEvaluator.test_marked_one_arg_twice ERROR > testing/test_skipping.py:56: TestEvaluator.test_marked_one_arg_twice ERROR > testing/test_skipping.py:75: TestEvaluator.test_marked_one_arg_twice2 ERROR > testing/test_skipping.py:75: TestEvaluator.test_marked_one_arg_twice2 ERROR > testing/test_skipping.py:89: TestEvaluator.test_skipif_class ERROR > testing/test_skipping.py:89: TestEvaluator.test_skipif_class ERROR > testing/test_skipping.py:105: TestXFail.test_xfail_simple ERROR > testing/test_skipping.py:105: TestXFail.test_xfail_simple ERROR > testing/test_skipping.py:119: TestXFail.test_xfail_xpassed ERROR > testing/test_skipping.py:119: TestXFail.test_xfail_xpassed ERROR > testing/test_skipping.py:133: TestXFail.test_xfail_run_anyway ERROR > testing/test_skipping.py:133: TestXFail.test_xfail_run_anyway ERROR > testing/test_skipping.py:148: TestXFail.test_xfail_evalfalse_but_fails ERROR > testing/test_skipping.py:148: TestXFail.test_xfail_evalfalse_but_fails ERROR > testing/test_skipping.py:160: TestXFail.test_xfail_not_report_default ERROR > testing/test_skipping.py:160: TestXFail.test_xfail_not_report_default ERROR > testing/test_skipping.py:172: TestXFail.test_xfail_not_run_xfail_reporting ERROR > testing/test_skipping.py:172: TestXFail.test_xfail_not_run_xfail_reporting ERROR > testing/test_skipping.py:194: TestXFail.test_xfail_not_run_no_setup_run ERROR > testing/test_skipping.py:194: TestXFail.test_xfail_not_run_no_setup_run ERROR > testing/test_skipping.py:210: TestXFail.test_xfail_xpass ERROR > testing/test_skipping.py:210: TestXFail.test_xfail_xpass ERROR > testing/test_skipping.py:224: TestXFail.test_xfail_imperative ERROR 
> testing/test_skipping.py:224: TestXFail.test_xfail_imperative ERROR > testing/test_skipping.py:245: TestXFail.test_xfail_imperative_in_setup_function ERROR > testing/test_skipping.py:245: TestXFail.test_xfail_imperative_in_setup_function ERROR > testing/test_skipping.py:285: TestXFail.test_dynamic_xfail_no_run ERROR > testing/test_skipping.py:285: TestXFail.test_dynamic_xfail_no_run ERROR > testing/test_skipping.py:299: TestXFail.test_dynamic_xfail_set_during_funcarg_setup ERROR > testing/test_skipping.py:299: TestXFail.test_dynamic_xfail_set_during_funcarg_setup ERROR > testing/test_skipping.py:313: TestXFailwithSetupTeardown.test_failing_setup_issue9 ERROR > testing/test_skipping.py:313: TestXFailwithSetupTeardown.test_failing_setup_issue9 ERROR > testing/test_skipping.py:328: TestXFailwithSetupTeardown.test_failing_teardown_issue9 ERROR > testing/test_skipping.py:328: TestXFailwithSetupTeardown.test_failing_teardown_issue9 ERROR > testing/test_skipping.py:345: TestSkipif.test_skipif_conditional ERROR > testing/test_skipping.py:345: TestSkipif.test_skipif_conditional ERROR > testing/test_skipping.py:356: TestSkipif.test_skipif_reporting ERROR > testing/test_skipping.py:356: TestSkipif.test_skipif_reporting ERROR > testing/test_skipping.py:370: test_skip_not_report_default ERROR > testing/test_skipping.py:370: test_skip_not_report_default ERROR > testing/test_skipping.py:383: test_skipif_class ERROR > testing/test_skipping.py:383: test_skipif_class ERROR > testing/test_skipping.py:400: test_skip_reasons_folding ERROR > testing/test_skipping.py:400: test_skip_reasons_folding ERROR > testing/test_skipping.py:425: test_skipped_reasons_functional ERROR > testing/test_skipping.py:425: test_skipped_reasons_functional ERROR > testing/test_skipping.py:453: test_reportchars ERROR > testing/test_skipping.py:453: test_reportchars ERROR > testing/test_skipping.py:475: test_reportchars_error ERROR > testing/test_skipping.py:475: test_reportchars_error ERROR > 
testing/test_skipping.py:490: test_errors_in_xfail_skip_expressions XPASS > testing/test_skipping.py:490: test_errors_in_xfail_skip_expressions XPASS > testing/test_skipping.py:521: test_xfail_skipif_with_globals ERROR > testing/test_skipping.py:521: test_xfail_skipif_with_globals ERROR > testing/test_skipping.py:539: test_direct_gives_error ERROR > testing/test_skipping.py:539: test_direct_gives_error ERROR > testing/test_terminal.py:41: TestTerminal.test_pass_skip_fail[default] FAILED > testing/test_terminal.py:41: TestTerminal.test_pass_skip_fail[default] ERROR > testing/test_terminal.py:41: TestTerminal.test_pass_skip_fail[verbose] ERROR > testing/test_terminal.py:41: TestTerminal.test_pass_skip_fail[verbose] ERROR > testing/test_terminal.py:41: TestTerminal.test_pass_skip_fail[quiet] ERROR > testing/test_terminal.py:41: TestTerminal.test_pass_skip_fail[quiet] ERROR > testing/test_terminal.py:41: TestTerminal.test_pass_skip_fail[fulltrace] ERROR > testing/test_terminal.py:41: TestTerminal.test_pass_skip_fail[fulltrace] ERROR > testing/test_terminal.py:68: TestTerminal.test_internalerror FAILED > testing/test_terminal.py:77: TestTerminal.test_writeline FAILED > testing/test_terminal.py:77: TestTerminal.test_writeline ERROR > testing/test_terminal.py:88: TestTerminal.test_show_runtest_logstart ERROR > testing/test_terminal.py:88: TestTerminal.test_show_runtest_logstart ERROR > testing/test_terminal.py:99: TestTerminal.test_runtest_location_shown_before_test_starts ERROR > testing/test_terminal.py:99: TestTerminal.test_runtest_location_shown_before_test_starts ERROR > testing/test_terminal.py:110: TestTerminal.test_itemreport_subclasses_show_subclassed_file ERROR > testing/test_terminal.py:110: TestTerminal.test_itemreport_subclasses_show_subclassed_file ERROR > testing/test_terminal.py:133: TestTerminal.test_itemreport_directclasses_not_shown_as_subclasses ERROR > testing/test_terminal.py:133: TestTerminal.test_itemreport_directclasses_not_shown_as_subclasses 
ERROR > testing/test_terminal.py:147: TestTerminal.test_keyboard_interrupt[default] ERROR > testing/test_terminal.py:147: TestTerminal.test_keyboard_interrupt[default] ERROR > testing/test_terminal.py:147: TestTerminal.test_keyboard_interrupt[verbose] ERROR > testing/test_terminal.py:147: TestTerminal.test_keyboard_interrupt[verbose] ERROR > testing/test_terminal.py:147: TestTerminal.test_keyboard_interrupt[quiet] ERROR > testing/test_terminal.py:147: TestTerminal.test_keyboard_interrupt[quiet] ERROR > testing/test_terminal.py:147: TestTerminal.test_keyboard_interrupt[fulltrace] ERROR > testing/test_terminal.py:147: TestTerminal.test_keyboard_interrupt[fulltrace] ERROR > testing/test_terminal.py:170: TestTerminal.test_keyboard_in_sessionstart ERROR > testing/test_terminal.py:170: TestTerminal.test_keyboard_in_sessionstart ERROR > testing/test_terminal.py:186: TestCollectonly.test_collectonly_basic ERROR > testing/test_terminal.py:186: TestCollectonly.test_collectonly_basic ERROR > testing/test_terminal.py:197: TestCollectonly.test_collectonly_skipped_module ERROR > testing/test_terminal.py:197: TestCollectonly.test_collectonly_skipped_module ERROR > testing/test_terminal.py:208: TestCollectonly.test_collectonly_failed_module ERROR > testing/test_terminal.py:208: TestCollectonly.test_collectonly_failed_module ERROR > testing/test_terminal.py:216: TestCollectonly.test_collectonly_fatal ERROR > testing/test_terminal.py:216: TestCollectonly.test_collectonly_fatal ERROR > testing/test_terminal.py:227: TestCollectonly.test_collectonly_simple ERROR > testing/test_terminal.py:227: TestCollectonly.test_collectonly_simple ERROR > testing/test_terminal.py:247: TestCollectonly.test_collectonly_error ERROR > testing/test_terminal.py:247: TestCollectonly.test_collectonly_error ERROR > testing/test_terminal.py:260: test_repr_python_version ERROR > testing/test_terminal.py:260: test_repr_python_version ERROR > testing/test_terminal.py:270: 
TestFixtureReporting.test_setup_fixture_error ERROR > testing/test_terminal.py:270: TestFixtureReporting.test_setup_fixture_error ERROR > testing/test_terminal.py:288: TestFixtureReporting.test_teardown_fixture_error ERROR > testing/test_terminal.py:288: TestFixtureReporting.test_teardown_fixture_error ERROR > testing/test_terminal.py:306: TestFixtureReporting.test_teardown_fixture_error_and_test_failure ERROR > testing/test_terminal.py:306: TestFixtureReporting.test_teardown_fixture_error_and_test_failure ERROR > testing/test_terminal.py:330: TestTerminalFunctional.test_deselected ERROR > testing/test_terminal.py:330: TestTerminalFunctional.test_deselected ERROR > testing/test_terminal.py:347: TestTerminalFunctional.test_no_skip_summary_if_failure ERROR > testing/test_terminal.py:347: TestTerminalFunctional.test_no_skip_summary_if_failure ERROR > testing/test_terminal.py:361: TestTerminalFunctional.test_passes ERROR > testing/test_terminal.py:361: TestTerminalFunctional.test_passes ERROR > testing/test_terminal.py:380: TestTerminalFunctional.test_header_trailer_info ERROR > testing/test_terminal.py:380: TestTerminalFunctional.test_header_trailer_info ERROR > testing/test_terminal.py:395: TestTerminalFunctional.test_showlocals ERROR > testing/test_terminal.py:395: TestTerminalFunctional.test_showlocals ERROR > testing/test_terminal.py:409: TestTerminalFunctional.test_verbose_reporting ERROR > testing/test_terminal.py:409: TestTerminalFunctional.test_verbose_reporting ERROR > testing/test_terminal.py:440: TestTerminalFunctional.test_quiet_reporting ERROR > testing/test_terminal.py:440: TestTerminalFunctional.test_quiet_reporting ERROR > testing/test_terminal.py:448: test_fail_extra_reporting ERROR > testing/test_terminal.py:448: test_fail_extra_reporting ERROR > testing/test_terminal.py:458: test_fail_reporting_on_pass ERROR > testing/test_terminal.py:458: test_fail_reporting_on_pass ERROR > testing/test_terminal.py:463: test_getreportopt ERROR > 
testing/test_terminal.py:463: test_getreportopt ERROR > testing/test_terminal.py:483: test_terminalreporter_reportopt_addopts ERROR > testing/test_terminal.py:483: test_terminalreporter_reportopt_addopts ERROR > testing/test_terminal.py:498: test_tbstyle_short ERROR > testing/test_terminal.py:498: test_tbstyle_short ERROR > testing/test_terminal.py:520: test_traceconfig FAILED > testing/test_terminal.py:520: test_traceconfig ERROR > testing/test_terminal.py:532: TestGenericReporting.test_collect_fail[default] ERROR > testing/test_terminal.py:532: TestGenericReporting.test_collect_fail[default] ERROR > testing/test_terminal.py:532: TestGenericReporting.test_collect_fail[verbose] FAILED > testing/test_terminal.py:532: TestGenericReporting.test_collect_fail[verbose] ERROR > testing/test_terminal.py:532: TestGenericReporting.test_collect_fail[quiet] ERROR > testing/test_terminal.py:532: TestGenericReporting.test_collect_fail[quiet] ERROR > testing/test_terminal.py:532: TestGenericReporting.test_collect_fail[fulltrace] ERROR > testing/test_terminal.py:532: TestGenericReporting.test_collect_fail[fulltrace] ERROR > testing/test_terminal.py:541: TestGenericReporting.test_maxfailures[default] FAILED > testing/test_terminal.py:541: TestGenericReporting.test_maxfailures[default] ERROR > testing/test_terminal.py:541: TestGenericReporting.test_maxfailures[verbose] ERROR > testing/test_terminal.py:541: TestGenericReporting.test_maxfailures[verbose] ERROR > testing/test_terminal.py:541: TestGenericReporting.test_maxfailures[quiet] ERROR > testing/test_terminal.py:541: TestGenericReporting.test_maxfailures[quiet] ERROR > testing/test_terminal.py:541: TestGenericReporting.test_maxfailures[fulltrace] FAILED > testing/test_terminal.py:541: TestGenericReporting.test_maxfailures[fulltrace] ERROR > testing/test_terminal.py:559: TestGenericReporting.test_tb_option[default] ERROR > testing/test_terminal.py:559: TestGenericReporting.test_tb_option[default] ERROR > 
testing/test_terminal.py:559: TestGenericReporting.test_tb_option[verbose] ERROR > testing/test_terminal.py:559: TestGenericReporting.test_tb_option[verbose] ERROR > testing/test_terminal.py:559: TestGenericReporting.test_tb_option[quiet] FAILED > testing/test_terminal.py:559: TestGenericReporting.test_tb_option[quiet] ERROR > testing/test_terminal.py:559: TestGenericReporting.test_tb_option[fulltrace] ERROR > testing/test_terminal.py:559: TestGenericReporting.test_tb_option[fulltrace] ERROR > testing/test_terminal.py:584: TestGenericReporting.test_tb_crashline[default] ERROR > testing/test_terminal.py:584: TestGenericReporting.test_tb_crashline[default] ERROR > testing/test_terminal.py:584: TestGenericReporting.test_tb_crashline[verbose] FAILED > testing/test_terminal.py:584: TestGenericReporting.test_tb_crashline[verbose] ERROR > testing/test_terminal.py:584: TestGenericReporting.test_tb_crashline[quiet] ERROR > testing/test_terminal.py:584: TestGenericReporting.test_tb_crashline[quiet] ERROR > testing/test_terminal.py:584: TestGenericReporting.test_tb_crashline[fulltrace] ERROR > testing/test_terminal.py:584: TestGenericReporting.test_tb_crashline[fulltrace] ERROR > testing/test_terminal.py:604: TestGenericReporting.test_pytest_report_header[default] FAILED > testing/test_terminal.py:604: TestGenericReporting.test_pytest_report_header[default] ERROR > testing/test_terminal.py:604: TestGenericReporting.test_pytest_report_header[verbose] ERROR > testing/test_terminal.py:604: TestGenericReporting.test_pytest_report_header[verbose] ERROR > testing/test_terminal.py:604: TestGenericReporting.test_pytest_report_header[quiet] ERROR > testing/test_terminal.py:604: TestGenericReporting.test_pytest_report_header[quiet] ERROR > testing/test_terminal.py:604: TestGenericReporting.test_pytest_report_header[fulltrace] FAILED > testing/test_terminal.py:604: TestGenericReporting.test_pytest_report_header[fulltrace] ERROR > testing/test_terminal.py:619: 
test_fdopen_kept_alive_issue124 XPASS > testing/test_terminal.py:619: test_fdopen_kept_alive_issue124 XPASS > testing/test_tmpdir.py:6: test_funcarg ERROR > testing/test_tmpdir.py:6: test_funcarg ERROR > testing/test_tmpdir.py:23: test_ensuretemp PASSED > testing/test_tmpdir.py:31: TestTempdirHandler.test_mktemp PASSED > testing/test_tmpdir.py:44: TestConfigTmpdir.test_getbasetemp_custom_removes_old FAILED > testing/test_tmpdir.py:57: TestConfigTmpdir.test_reparse FAILED > testing/test_tmpdir.py:57: TestConfigTmpdir.test_reparse ERROR > testing/test_tmpdir.py:64: TestConfigTmpdir.test_reparse_filename_too_long ERROR > testing/test_tmpdir.py:64: TestConfigTmpdir.test_reparse_filename_too_long ERROR > testing/test_tmpdir.py:68: test_basetemp ERROR > testing/test_tmpdir.py:68: test_basetemp ERROR > testing/test_unittest.py:3: test_simple_unittest ERROR > testing/test_unittest.py:3: test_simple_unittest ERROR > testing/test_unittest.py:17: test_isclasscheck_issue53 FAILED > testing/test_unittest.py:17: test_isclasscheck_issue53 ERROR > testing/test_unittest.py:28: test_setup ERROR > testing/test_unittest.py:28: test_setup ERROR > testing/test_unittest.py:48: test_new_instances ERROR > testing/test_unittest.py:48: test_new_instances ERROR > testing/test_unittest.py:60: test_teardown ERROR > testing/test_unittest.py:60: test_teardown ERROR > testing/test_unittest.py:79: test_method_and_teardown_failing_reporting FAILED > testing/test_unittest.py:79: test_method_and_teardown_failing_reporting ERROR > testing/test_unittest.py:98: test_setup_failure_is_shown ERROR > testing/test_unittest.py:98: test_setup_failure_is_shown ERROR > testing/test_unittest.py:118: test_setup_setUpClass ERROR > testing/test_unittest.py:118: test_setup_setUpClass ERROR > testing/test_unittest.py:140: test_setup_class FAILED > testing/test_unittest.py:161: test_testcase_adderrorandfailure_defers[0] FAILED > testing/test_unittest.py:161: test_testcase_adderrorandfailure_defers[0] ERROR > 
testing/test_unittest.py:161: test_testcase_adderrorandfailure_defers[1] ERROR > testing/test_unittest.py:161: test_testcase_adderrorandfailure_defers[1] ERROR > testing/test_unittest.py:181: test_testcase_custom_exception_info[0] FAILED > testing/test_unittest.py:181: test_testcase_custom_exception_info[0] ERROR > testing/test_unittest.py:181: test_testcase_custom_exception_info[1] ERROR > testing/test_unittest.py:181: test_testcase_custom_exception_info[1] ERROR > testing/test_unittest.py:211: test_testcase_totally_incompatible_exception_info ERROR > testing/test_unittest.py:211: test_testcase_totally_incompatible_exception_info ERROR > testing/test_unittest.py:222: test_module_level_pytestmark FAILED > testing/test_unittest.py:239: TestTrialUnittest.test_trial_exceptions_with_skips SKIPPED > testing/test_unittest.py:285: TestTrialUnittest.test_trial_error SKIPPED > testing/test_unittest.py:335: TestTrialUnittest.test_trial_pdb SKIPPED > testing/test_unittest.py:347: test_djangolike_testcase FAILED > testing/test_unittest.py:347: test_djangolike_testcase ERROR > testing/test_unittest.py:401: test_unittest_not_shown_in_traceback ERROR > testing/test_unittest.py:401: test_unittest_not_shown_in_traceback ERROR > Traceback (most recent call last): > File "pytest.py", line 11, in <module> > raise SystemExit(main()) > File "/Users/tzuchien/pytest/_pytest/core.py", line 457, in main > exitstatus = hook.pytest_cmdline_main(config=config) > File "/Users/tzuchien/pytest/_pytest/core.py", line 411, in __call__ > return self._docall(methods, kwargs) > File "/Users/tzuchien/pytest/_pytest/core.py", line 422, in _docall > res = mc.execute() > File "/Users/tzuchien/pytest/_pytest/core.py", line 340, in execute > res = method(**kwargs) > File "/Users/tzuchien/pytest/_pytest/main.py", line 90, in pytest_cmdline_main > return wrap_session(config, _main) > File "/Users/tzuchien/pytest/_pytest/main.py", line 84, in wrap_session > exitstatus=session.exitstatus) > File 
"/Users/tzuchien/pytest/_pytest/core.py", line 411, in __call__ > return self._docall(methods, kwargs) > File "/Users/tzuchien/pytest/_pytest/core.py", line 422, in _docall > res = mc.execute() > File "/Users/tzuchien/pytest/_pytest/core.py", line 340, in execute > res = method(**kwargs) > File "/Users/tzuchien/pytest/_pytest/terminal.py", line 313, in pytest_sessionfinish > __multicall__.execute() > File "/Users/tzuchien/pytest/_pytest/core.py", line 340, in execute > res = method(**kwargs) > File "/Users/tzuchien/pytest/_pytest/runner.py", line 22, in pytest_sessionfinish > rep = hook.pytest__teardown_final(session=session) > File "/Users/tzuchien/pytest/_pytest/core.py", line 411, in __call__ > return self._docall(methods, kwargs) > File "/Users/tzuchien/pytest/_pytest/core.py", line 422, in _docall > res = mc.execute() > File "/Users/tzuchien/pytest/_pytest/core.py", line 340, in execute > res = method(**kwargs) > File "/Users/tzuchien/pytest/_pytest/capture.py", line 166, in pytest__teardown_final > self.resumecapture(method) > File "/Users/tzuchien/pytest/_pytest/capture.py", line 101, in resumecapture > cap.resume() > File "/Library/Python/2.6/site-packages/py-1.4.5-py2.6.egg/py/_io/capture.py", line 237, in resume > self.startall() > File "/Library/Python/2.6/site-packages/py-1.4.5-py2.6.egg/py/_io/capture.py", line 229, in startall > self.in_.start() > File "/Library/Python/2.6/site-packages/py-1.4.5-py2.6.egg/py/_io/capture.py", line 59, in start > fd = os.open(devnullpath, os.O_RDONLY) > OSError: [Errno 24] Too many open files: '/dev/null' > ~/pytest $ From pere.martir4 at gmail.com Thu Oct 27 00:31:44 2011 From: pere.martir4 at gmail.com (Pere Martir) Date: Thu, 27 Oct 2011 00:31:44 +0200 Subject: [py-dev] Status of py.test's unit tests now ? 
In-Reply-To: <20111026070756.GB27936@merlinux.eu>
References: <20111026070756.GB27936@merlinux.eu>
Message-ID: 

On Wed, Oct 26, 2011 at 9:07 AM, holger krekel wrote:
> On Tue, Oct 25, 2011 at 20:44 -0400, Benjamin Peterson wrote:
>> Perhaps you could share what the details of the failures are.
> in addition you might also install "tox" and run "tox -e py26" or
> "tox -e py27" and show the output.

"tox -e py26" blocks forever (for hours). I've attached its output at the
end of this post (tox_py26_output.txt). I've also attached the output of
the command "python pytest.py -v" (pytest_v_output.txt). As you can see,
there are a lot of ERRORs.

My environment: Mac OS X 10.6.8, Python 2.6.1
-------------- next part --------------
~/pytest $ tox -e py26
_____________________________________________________ [tox sdist] ______________________________________________________
[TOX] ***creating sdist package
[TOX] /Users/tzuchien/pytest$ /usr/bin/python setup.py sdist --formats=zip --dist-dir .tox/dist >.tox/log/0.log
[TOX] ***copying new sdistfile to '/Users/tzuchien/.tox/distshare/pytest-2.1.3.zip'
__________________________________________________ [tox testenv:py26] __________________________________________________
[TOX] ***creating virtualenv py26
[TOX] /Users/tzuchien/pytest/.tox$ /System/Library/Frameworks/Python.framework/Versions/2.6/bin/python2.6 ../../../../Library/Python/2.6/site-packages/tox/virtualenv.py --distribute --no-site-packages py26 >py26/log/0.log
[TOX] ***installing dependencies: :pypi:pexpect, :pypi:nose
[TOX] /Users/tzuchien/pytest/.tox/py26/log$ ../bin/pip install -i http://pypi.python.org/simple --download-cache=/Users/tzuchien/pytest/.tox/_download pexpect nose >1.log
[TOX] ***installing sdist
[TOX] /Users/tzuchien/pytest/.tox/py26/log$ ../bin/pip install -i http://pypi.testrun.org --download-cache=/Users/tzuchien/pytest/.tox/_download ../../dist/pytest-2.1.3.zip >2.log
[TOX] /Users/tzuchien/pytest/testing$ ../.tox/py26/bin/py.test -rfsxX
--junitxml=/Users/tzuchien/pytest/.tox/py26/log/junit-py26.xml ================================================= test session starts ================================================== platform darwin -- Python 2.6.1 -- pytest-2.1.3 collected 671 items acceptance_test.py ....................................x. test_assertinterpret.py .......................... test_assertion.py ........................ test_assertrewrite.py ...................... test_capture.py ............x................. test_collection.py ...........x.....FFF..FF...F.... test_config.py ........x.............. test_conftest.py .......................F..... test_core.py .................................................. test_doctest.py ......... test_genscript.py s..sssss test_helpconfig.py ........ test_junitxml.py .................... test_mark.py ........................ test_monkeypatch.py ......... test_nose.py ......... test_parseopt.py ............ test_pastebin.py .. test_pdb.py ....FFFF^CKEYBOARDINTERRUPT ======================================================= FAILURES ======================================================= ______________________________________________ TestSession.test_parsearg _______________________________________________ self = testdir = def test_parsearg(self, testdir): p = testdir.makepyfile("def test_func(): pass") subdir = testdir.mkdir("sub") subdir.ensure("__init__.py") target = subdir.join(p.basename) p.move(target) testdir.chdir() subdir.chdir() config = testdir.parseconfig(p.basename) rcol = Session(config=config) > assert rcol.fspath == subdir E assert local('/private/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_parsearg0/test_parsearg/sub') == local('/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_parsearg0/test_parsearg/sub') E + where local('/private/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_parsearg0/test_parsearg/sub') = .fspath 
/Users/tzuchien/pytest/testing/test_collection.py:303: AssertionError ___________________________________________ TestSession.test_collect_topdir ____________________________________________ self = testdir = def test_collect_topdir(self, testdir): p = testdir.makepyfile("def test_func(): pass") id = "::".join([p.basename, "test_func"]) config = testdir.parseconfigure(id) topdir = testdir.tmpdir rcol = Session(config) > assert topdir == rcol.fspath E assert local('/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_collect_topdir0/test_collect_topdir') == local('/private/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_collect_topdir0/test_collect_topdir') E + where local('/private/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_collect_topdir0/test_collect_topdir') = .fspath /Users/tzuchien/pytest/testing/test_collection.py:319: AssertionError __________________________________ TestSession.test_collect_protocol_single_function ___________________________________ self = testdir = def test_collect_protocol_single_function(self, testdir): p = testdir.makepyfile("def test_func(): pass") id = "::".join([p.basename, "test_func"]) config = testdir.parseconfigure(id) topdir = testdir.tmpdir rcol = Session(config) > assert topdir == rcol.fspath E assert local('/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_collect_protocol_single_function0/test_collect_protocol_single_function') == local('/private/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_collect_protocol_single_function0/test_collect_protocol_single_function') E + where local('/private/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_collect_protocol_single_function0/test_collect_protocol_single_function') = .fspath /Users/tzuchien/pytest/testing/test_collection.py:334: AssertionError ____________________________________ TestSession.test_collect_subdir_event_ordering 
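[Editor's note] The collection failures above all compare local('/var/folders/...') against local('/private/var/folders/...'): on Mac OS X, /var is a symlink to /private/var, so the temp directory as spelled by testdir.tmpdir and the resolved spelling obtained after chdir() name the same directory but compare unequal. A minimal sketch reproducing the effect with a local symlink (all paths here are illustrative, not the ones from the log):

```python
import os
import shutil
import tempfile

# Build our own symlinked directory to mimic macOS's /var -> /private/var.
base = tempfile.mkdtemp()
real = os.path.join(base, "real")
os.mkdir(real)
link = os.path.join(base, "link")
os.symlink(real, link)

sub = os.path.join(link, "sub")    # the symlinked spelling
resolved = os.path.realpath(sub)   # the resolved spelling

# The two spellings name the same directory, yet plain comparison fails --
# exactly the shape of the assertion errors above.
assert sub != resolved
# Resolving both sides with os.path.realpath() gives a canonical form.
assert os.path.realpath(resolved) == resolved

shutil.rmtree(base)
```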
____________________________________

self = [repr stripped by the mailing-list HTML filter]
testdir = [repr stripped]

    def test_collect_subdir_event_ordering(self, testdir):
        p = testdir.makepyfile("def test_func(): pass")
        aaa = testdir.mkpydir("aaa")
        test_aaa = aaa.join("test_aaa.py")
        p.move(test_aaa)
        config = testdir.parseconfigure()
        rcol = Session(config)
        hookrec = testdir.getreportrecorder(config)
        rcol.perform_collect()
        items = rcol.items
        assert len(items) == 1
        py.std.pprint.pprint(hookrec.hookrecorder.calls)
        hookrec.hookrecorder.contains([
            ("pytest_collectstart", "collector.fspath == test_aaa"),
            ("pytest_pycollect_makeitem", "name == 'test_func'"),
            ("pytest_collectreport",
>               "report.nodeid.startswith('aaa/test_aaa.py')"),
E       Failed: could not find 'pytest_collectreport' check "report.nodeid.startswith('aaa/test_aaa.py')"

/Users/tzuchien/pytest/testing/test_collection.py:427: Failed
--------------------------------------------------- Captured stdout ----------------------------------------------------
collected 1 items
[hookrecorder call list and NAMEMATCH/NOCHECKERMATCH trace elided: the
object reprs in this captured output were stripped by the mailing-list
HTML filter and are unrecoverable]
____________________________________ TestSession.test_collect_two_commandline_args _____________________________________

self = [repr stripped]
testdir = [repr stripped]

    def test_collect_two_commandline_args(self, testdir):
        p = testdir.makepyfile("def test_func(): pass")
        aaa = testdir.mkpydir("aaa")
        bbb = testdir.mkpydir("bbb")
        test_aaa = aaa.join("test_aaa.py")
        p.copy(test_aaa)
        test_bbb = bbb.join("test_bbb.py")
        p.move(test_bbb)
        id = "."
        config = testdir.parseconfigure(id)
        rcol = Session(config)
        hookrec = testdir.getreportrecorder(config)
        rcol.perform_collect()
        items = rcol.items
        assert len(items) == 2
        py.std.pprint.pprint(hookrec.hookrecorder.calls)
        hookrec.hookrecorder.contains([
            ("pytest_collectstart", "collector.fspath == test_aaa"),
            ("pytest_pycollect_makeitem", "name == 'test_func'"),
            ("pytest_collectreport", "report.nodeid == 'aaa/test_aaa.py'"),
            ("pytest_collectstart", "collector.fspath == test_bbb"),
            ("pytest_pycollect_makeitem", "name == 'test_func'"),
>           ("pytest_collectreport", "report.nodeid == 'bbb/test_bbb.py'"),
E       Failed: could not find 'pytest_collectstart' check 'collector.fspath == test_aaa'

/Users/tzuchien/pytest/testing/test_collection.py:453: Failed
--------------------------------------------------- Captured stdout ----------------------------------------------------
collected 2 items
[hookrecorder call list and NAMEMATCH/NOCHECKERMATCH trace elided: object
reprs stripped by the mailing-list HTML filter]
__________________________________________ Test_getinitialnodes.test_pkgfile ___________________________________________

self = [repr stripped]
testdir = [repr stripped]

    def test_pkgfile(self, testdir):
        testdir.chdir()
        tmpdir = testdir.tmpdir
        subdir = tmpdir.join("subdir")
        x = subdir.ensure("x.py")
        subdir.ensure("__init__.py")
        config = testdir.reparseconfig([x])
        col = testdir.getnode(config, x)
        assert isinstance(col, pytest.Module)
>       assert col.name == 'subdir/x.py'
E       assert 'x.py' == 'subdir/x.py'
E         - x.py
E         + subdir/x.py

/Users/tzuchien/pytest/testing/test_collection.py:508: AssertionError
--------------------------------------------------- Captured stdout
---------------------------------------------------- ============================= test session starts ============================== platform darwin -- Python 2.6.1 -- pytest-2.1.3 collected 0 items =============================== in 0.00 seconds =============================== _______________________________________________ test_setinitial_confcut ________________________________________________ testdir = def test_setinitial_confcut(testdir): conf = testdir.makeconftest("") sub = testdir.mkdir("sub") sub.chdir() for opts in (["--confcutdir=%s" % sub, sub], [sub, "--confcutdir=%s" % sub], ["--confcutdir=.", sub], [sub, "--confcutdir", sub], [str(sub), "--confcutdir", "."], ): conftest = Conftest() conftest.setinitial(opts) > assert conftest._confcutdir == sub E assert local('/private/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_setinitial_confcut0/test_setinitial_confcut/sub') == local('/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_setinitial_confcut0/test_setinitial_confcut/sub') E + where local('/private/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_setinitial_confcut0/test_setinitial_confcut/sub') = <_pytest.config.Conftest object at 0x1065754d0>._confcutdir /Users/tzuchien/pytest/testing/test_conftest.py:169: AssertionError _____________________________________________ TestPDB.test_pdb_interaction _____________________________________________ self = testdir = def test_pdb_interaction(self, testdir): p1 = testdir.makepyfile(""" def test_1(): i = 0 assert i == 1 """) child = testdir.spawn_pytest("--pdb %s" % p1) child.expect(".*def test_1") child.expect(".*i = 0") child.expect("(Pdb)") child.sendeof() > rest = child.read() /Users/tzuchien/pytest/testing/test_pdb.py:63: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , size = -1 def read (self, size = -1): # File-like object. 
"""This reads at most "size" bytes from the file (less if the read hits EOF before obtaining size bytes). If the size argument is negative or omitted, read all data until EOF is reached. The bytes are returned as a string object. An empty string is returned when EOF is encountered immediately. """ if size == 0: return '' if size < 0: > self.expect (self.delimiter) # delimiter default is EOF /Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:863: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , pattern = , timeout = -1, searchwindowsize = -1 def expect(self, pattern, timeout = -1, searchwindowsize=-1): """This seeks through the stream until a pattern is matched. The pattern is overloaded and may take several types. The pattern can be a StringType, EOF, a compiled re, or a list of any of those types. Strings will be compiled to re types. This returns the index into the pattern list. If the pattern was not a list this returns index 0 on a successful match. This may raise exceptions for EOF or TIMEOUT. To avoid the EOF or TIMEOUT exceptions add EOF or TIMEOUT to the pattern list. That will cause expect to match an EOF or TIMEOUT condition instead of raising an exception. If you pass a list of patterns and more than one matches, the first match in the stream is chosen. If more than one pattern matches at that point, the leftmost in the pattern list is chosen. For example:: # the input is 'foobar' index = p.expect (['bar', 'foo', 'foobar']) # returns 1 ('foo') even though 'foobar' is a "better" match Please note, however, that buffering can affect this behavior, since input arrives in unpredictable chunks. 
For example:: # the input is 'foobar' index = p.expect (['foobar', 'foo']) # returns 0 ('foobar') if all input is available at once, # but returs 1 ('foo') if parts of the final 'bar' arrive late After a match is found the instance attributes 'before', 'after' and 'match' will be set. You can see all the data read before the match in 'before'. You can see the data that was matched in 'after'. The re.MatchObject used in the re match will be in 'match'. If an error occurred then 'before' will be set to all the data read so far and 'after' and 'match' will be None. If timeout is -1 then timeout will be set to the self.timeout value. A list entry may be EOF or TIMEOUT instead of a string. This will catch these exceptions and return the index of the list entry instead of raising the exception. The attribute 'after' will be set to the exception type. The attribute 'match' will be None. This allows you to write code like this:: index = p.expect (['good', 'bad', pexpect.EOF, pexpect.TIMEOUT]) if index == 0: do_something() elif index == 1: do_something_else() elif index == 2: do_some_other_thing() elif index == 3: do_something_completely_different() instead of code like this:: try: index = p.expect (['good', 'bad']) if index == 0: do_something() elif index == 1: do_something_else() except EOF: do_some_other_thing() except TIMEOUT: do_something_completely_different() These two forms are equivalent. It all depends on what you want. You can also just expect the EOF if you are waiting for all output of a child to finish. For example:: p = pexpect.spawn('/bin/ls') p.expect (pexpect.EOF) print p.before If you are trying to optimize for speed then see expect_list(). 
""" compiled_pattern_list = self.compile_pattern_list(pattern) > return self.expect_list(compiled_pattern_list, timeout, searchwindowsize) /Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1316: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , pattern_list = [], timeout = -1 searchwindowsize = -1 def expect_list(self, pattern_list, timeout = -1, searchwindowsize = -1): """This takes a list of compiled regular expressions and returns the index into the pattern_list that matched the child output. The list may also contain EOF or TIMEOUT (which are not compiled regular expressions). This method is similar to the expect() method except that expect_list() does not recompile the pattern list on every call. This may help if you are trying to optimize for speed, otherwise just use the expect() method. This is called by expect(). If timeout==-1 then the self.timeout value is used. If searchwindowsize==-1 then the self.searchwindowsize value is used. """ > return self.expect_loop(searcher_re(pattern_list), timeout, searchwindowsize) /Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1330: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , searcher = , timeout = 10.0 searchwindowsize = None def expect_loop(self, searcher, timeout = -1, searchwindowsize = -1): """This is the common loop used inside expect. The 'searcher' should be an instance of searcher_re or searcher_string, which describes how and what to search for in the input. See expect() for other arguments, return value and exceptions. 
""" self.searcher = searcher if timeout == -1: timeout = self.timeout if timeout is not None: end_time = time.time() + timeout if searchwindowsize == -1: searchwindowsize = self.searchwindowsize try: incoming = self.buffer freshlen = len(incoming) while True: # Keep reading until exception or return. index = searcher.search(incoming, freshlen, searchwindowsize) if index >= 0: self.buffer = incoming[searcher.end : ] self.before = incoming[ : searcher.start] self.after = incoming[searcher.start : searcher.end] self.match = searcher.match self.match_index = index return self.match_index # No match at this point if timeout < 0 and timeout is not None: raise TIMEOUT ('Timeout exceeded in expect_any().') # Still have time left, so read more data c = self.read_nonblocking (self.maxread, timeout) freshlen = len(c) time.sleep (0.0001) incoming = incoming + c if timeout is not None: timeout = end_time - time.time() except EOF, e: self.buffer = '' self.before = incoming self.after = EOF index = searcher.eof_index if index >= 0: self.match = EOF self.match_index = index return self.match_index else: self.match = None self.match_index = None raise EOF (str(e) + '\n' + str(self)) except TIMEOUT, e: self.buffer = incoming self.before = incoming self.after = TIMEOUT index = searcher.timeout_index if index >= 0: self.match = TIMEOUT self.match_index = index return self.match_index else: self.match = None self.match_index = None > raise TIMEOUT (str(e) + '\n' + str(self)) E TIMEOUT: Timeout exceeded in read_nonblocking(). 
E E version: 2.4 ($Revision: 516 $) E command: /Users/tzuchien/pytest/.tox/py26/bin/python2.6 E args: ['/Users/tzuchien/pytest/.tox/py26/bin/python2.6', '/Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pytest.py', '--basetemp=/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_pdb_interaction0/test_pdb_interaction/pexpect', '--pdb', '/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_pdb_interaction0/test_pdb_interaction/test_pdb_interaction.py'] E searcher: searcher_re: E 0: EOF E buffer (last 100 chars): ) E before (last 100 chars): ) E after: E match: None E match_index: None E exitstatus: None E flag_eof: False E pid: 31071 E child_fd: 53 E closed: False E timeout: 10.0 E delimiter: E logfile: E logfile_read: None E logfile_send: None E maxread: 2000 E ignorecase: False E searchwindowsize: None E delaybeforesend: 0.05 E delayafterclose: 0.1 E delayafterterminate: 0.1 /Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1414: TIMEOUT ________________________________________ TestPDB.test_pdb_interaction_exception ________________________________________ self = testdir = def test_pdb_interaction_exception(self, testdir): p1 = testdir.makepyfile(""" import pytest def globalfunc(): pass def test_1(): pytest.raises(ValueError, globalfunc) """) child = testdir.spawn_pytest("--pdb %s" % p1) child.expect(".*def test_1") child.expect(".*pytest.raises.*globalfunc") child.expect("(Pdb)") child.sendline("globalfunc") child.expect(".*function") child.sendeof() > child.expect("1 failed") /Users/tzuchien/pytest/testing/test_pdb.py:84: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , pattern = '1 failed', timeout = -1, searchwindowsize = -1 def expect(self, pattern, timeout = -1, searchwindowsize=-1): """This seeks through the stream until a pattern is matched. The pattern is overloaded and may take several types. 
The pattern can be a StringType, EOF, a compiled re, or a list of any of those types. Strings will be compiled to re types. This returns the index into the pattern list. If the pattern was not a list this returns index 0 on a successful match. This may raise exceptions for EOF or TIMEOUT. To avoid the EOF or TIMEOUT exceptions add EOF or TIMEOUT to the pattern list. That will cause expect to match an EOF or TIMEOUT condition instead of raising an exception. If you pass a list of patterns and more than one matches, the first match in the stream is chosen. If more than one pattern matches at that point, the leftmost in the pattern list is chosen. For example:: # the input is 'foobar' index = p.expect (['bar', 'foo', 'foobar']) # returns 1 ('foo') even though 'foobar' is a "better" match Please note, however, that buffering can affect this behavior, since input arrives in unpredictable chunks. For example:: # the input is 'foobar' index = p.expect (['foobar', 'foo']) # returns 0 ('foobar') if all input is available at once, # but returs 1 ('foo') if parts of the final 'bar' arrive late After a match is found the instance attributes 'before', 'after' and 'match' will be set. You can see all the data read before the match in 'before'. You can see the data that was matched in 'after'. The re.MatchObject used in the re match will be in 'match'. If an error occurred then 'before' will be set to all the data read so far and 'after' and 'match' will be None. If timeout is -1 then timeout will be set to the self.timeout value. A list entry may be EOF or TIMEOUT instead of a string. This will catch these exceptions and return the index of the list entry instead of raising the exception. The attribute 'after' will be set to the exception type. The attribute 'match' will be None. 
This allows you to write code like this:: index = p.expect (['good', 'bad', pexpect.EOF, pexpect.TIMEOUT]) if index == 0: do_something() elif index == 1: do_something_else() elif index == 2: do_some_other_thing() elif index == 3: do_something_completely_different() instead of code like this:: try: index = p.expect (['good', 'bad']) if index == 0: do_something() elif index == 1: do_something_else() except EOF: do_some_other_thing() except TIMEOUT: do_something_completely_different() These two forms are equivalent. It all depends on what you want. You can also just expect the EOF if you are waiting for all output of a child to finish. For example:: p = pexpect.spawn('/bin/ls') p.expect (pexpect.EOF) print p.before If you are trying to optimize for speed then see expect_list(). """ compiled_pattern_list = self.compile_pattern_list(pattern) > return self.expect_list(compiled_pattern_list, timeout, searchwindowsize) /Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1316: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , pattern_list = [<_sre.SRE_Pattern object at 0x106e05378>], timeout = -1 searchwindowsize = -1 def expect_list(self, pattern_list, timeout = -1, searchwindowsize = -1): """This takes a list of compiled regular expressions and returns the index into the pattern_list that matched the child output. The list may also contain EOF or TIMEOUT (which are not compiled regular expressions). This method is similar to the expect() method except that expect_list() does not recompile the pattern list on every call. This may help if you are trying to optimize for speed, otherwise just use the expect() method. This is called by expect(). If timeout==-1 then the self.timeout value is used. If searchwindowsize==-1 then the self.searchwindowsize value is used. 
""" > return self.expect_loop(searcher_re(pattern_list), timeout, searchwindowsize) /Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1330: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , searcher = timeout = 9.9997730255126953, searchwindowsize = None def expect_loop(self, searcher, timeout = -1, searchwindowsize = -1): """This is the common loop used inside expect. The 'searcher' should be an instance of searcher_re or searcher_string, which describes how and what to search for in the input. See expect() for other arguments, return value and exceptions. """ self.searcher = searcher if timeout == -1: timeout = self.timeout if timeout is not None: end_time = time.time() + timeout if searchwindowsize == -1: searchwindowsize = self.searchwindowsize try: incoming = self.buffer freshlen = len(incoming) while True: # Keep reading until exception or return. index = searcher.search(incoming, freshlen, searchwindowsize) if index >= 0: self.buffer = incoming[searcher.end : ] self.before = incoming[ : searcher.start] self.after = incoming[searcher.start : searcher.end] self.match = searcher.match self.match_index = index return self.match_index # No match at this point if timeout < 0 and timeout is not None: raise TIMEOUT ('Timeout exceeded in expect_any().') # Still have time left, so read more data c = self.read_nonblocking (self.maxread, timeout) freshlen = len(c) time.sleep (0.0001) incoming = incoming + c if timeout is not None: timeout = end_time - time.time() except EOF, e: self.buffer = '' self.before = incoming self.after = EOF index = searcher.eof_index if index >= 0: self.match = EOF self.match_index = index return self.match_index else: self.match = None self.match_index = None raise EOF (str(e) + '\n' + str(self)) except TIMEOUT, e: self.buffer = incoming self.before = incoming self.after = TIMEOUT index = searcher.timeout_index if index >= 0: self.match = 
TIMEOUT
                self.match_index = index
                return self.match_index
            else:
                self.match = None
                self.match_index = None
>               raise TIMEOUT (str(e) + '\n' + str(self))
E               TIMEOUT: Timeout exceeded in read_nonblocking().
E
E               version: 2.4 ($Revision: 516 $)
E               command: /Users/tzuchien/pytest/.tox/py26/bin/python2.6
E               args: ['/Users/tzuchien/pytest/.tox/py26/bin/python2.6', '/Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pytest.py', '--basetemp=/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_pdb_interaction_exception0/test_pdb_interaction_exception/pexpect', '--pdb', '/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_pdb_interaction_exception0/test_pdb_interaction_exception/test_pdb_interaction_exception.py']
E               searcher: searcher_re:
E                   0: re.compile("1 failed")
E               buffer (last 100 chars): globalfunc at 0x102354c08>
E               (Pdb)
E               before (last 100 chars): globalfunc at 0x102354c08>
E               (Pdb)
E               after:
E               match: None
E               match_index: None
E               exitstatus: None
E               flag_eof: False
E               pid: 31072
E               child_fd: 150
E               closed: False
E               timeout: 10.0
E               delimiter:
E               logfile:
E               logfile_read: None
E               logfile_send: None
E               maxread: 2000
E               ignorecase: False
E               searchwindowsize: None
E               delaybeforesend: 0.05
E               delayafterclose: 0.1
E               delayafterterminate: 0.1

/Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1414: TIMEOUT
____________________________________ TestPDB.test_pdb_interaction_capturing_simple _____________________________________

self =
testdir =

    def test_pdb_interaction_capturing_simple(self, testdir):
        p1 = testdir.makepyfile("""
            import pytest
            def test_1():
                i = 0
                print ("hello17")
                pytest.set_trace()
                x = 3
        """)
        child = testdir.spawn_pytest(str(p1))
        child.expect("test_1")
        child.expect("x = 3")
        child.expect("(Pdb)")
        child.sendeof()
>       rest = child.read()

/Users/tzuchien/pytest/testing/test_pdb.py:102:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , size = -1

    def read (self, size = -1):   # File-like object.
        """This reads at most "size" bytes from the file (less if the read hits
        EOF before obtaining size bytes). If the size argument is negative or
        omitted, read all data until EOF is reached. The bytes are returned as
        a string object. An empty string is returned when EOF is encountered
        immediately. """

        if size == 0:
            return ''
        if size < 0:
>           self.expect (self.delimiter) # delimiter default is EOF

/Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:863:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , pattern = , timeout = -1, searchwindowsize = -1

    def expect(self, pattern, timeout = -1, searchwindowsize=-1):
        """This seeks through the stream until a pattern is matched. The
        pattern is overloaded and may take several types. The pattern can be a
        StringType, EOF, a compiled re, or a list of any of those types.
        Strings will be compiled to re types. This returns the index into the
        pattern list. If the pattern was not a list this returns index 0 on a
        successful match. This may raise exceptions for EOF or TIMEOUT. To
        avoid the EOF or TIMEOUT exceptions add EOF or TIMEOUT to the pattern
        list. That will cause expect to match an EOF or TIMEOUT condition
        instead of raising an exception.

        If you pass a list of patterns and more than one matches, the first
        match in the stream is chosen. If more than one pattern matches at
        that point, the leftmost in the pattern list is chosen. For example::

            # the input is 'foobar'
            index = p.expect (['bar', 'foo', 'foobar'])
            # returns 1 ('foo') even though 'foobar' is a "better" match

        Please note, however, that buffering can affect this behavior, since
        input arrives in unpredictable chunks. For example::

            # the input is 'foobar'
            index = p.expect (['foobar', 'foo'])
            # returns 0 ('foobar') if all input is available at once,
            # but returs 1 ('foo') if parts of the final 'bar' arrive late

        After a match is found the instance attributes 'before', 'after' and
        'match' will be set. You can see all the data read before the match in
        'before'. You can see the data that was matched in 'after'. The
        re.MatchObject used in the re match will be in 'match'. If an error
        occurred then 'before' will be set to all the data read so far and
        'after' and 'match' will be None.

        If timeout is -1 then timeout will be set to the self.timeout value.

        A list entry may be EOF or TIMEOUT instead of a string. This will
        catch these exceptions and return the index of the list entry instead
        of raising the exception. The attribute 'after' will be set to the
        exception type. The attribute 'match' will be None. This allows you to
        write code like this::

            index = p.expect (['good', 'bad', pexpect.EOF, pexpect.TIMEOUT])
            if index == 0:
                do_something()
            elif index == 1:
                do_something_else()
            elif index == 2:
                do_some_other_thing()
            elif index == 3:
                do_something_completely_different()

        instead of code like this::

            try:
                index = p.expect (['good', 'bad'])
                if index == 0:
                    do_something()
                elif index == 1:
                    do_something_else()
            except EOF:
                do_some_other_thing()
            except TIMEOUT:
                do_something_completely_different()

        These two forms are equivalent. It all depends on what you want. You
        can also just expect the EOF if you are waiting for all output of a
        child to finish. For example::

            p = pexpect.spawn('/bin/ls')
            p.expect (pexpect.EOF)
            print p.before

        If you are trying to optimize for speed then see expect_list(). """

        compiled_pattern_list = self.compile_pattern_list(pattern)
>       return self.expect_list(compiled_pattern_list, timeout, searchwindowsize)

/Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1316:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , pattern_list = [], timeout = -1
searchwindowsize = -1

    def expect_list(self, pattern_list, timeout = -1, searchwindowsize = -1):
        """This takes a list of compiled regular expressions and returns the
        index into the pattern_list that matched the child output. The list
        may also contain EOF or TIMEOUT (which are not compiled regular
        expressions). This method is similar to the expect() method except
        that expect_list() does not recompile the pattern list on every call.
        This may help if you are trying to optimize for speed, otherwise just
        use the expect() method. This is called by expect(). If timeout==-1
        then the self.timeout value is used. If searchwindowsize==-1 then the
        self.searchwindowsize value is used. """

>       return self.expect_loop(searcher_re(pattern_list), timeout, searchwindowsize)

/Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1330:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , searcher = , timeout = 10.0
searchwindowsize = None

    def expect_loop(self, searcher, timeout = -1, searchwindowsize = -1):
        """This is the common loop used inside expect. The 'searcher' should be
        an instance of searcher_re or searcher_string, which describes how and
        what to search for in the input.

        See expect() for other arguments, return value and exceptions. """

        self.searcher = searcher

        if timeout == -1:
            timeout = self.timeout
        if timeout is not None:
            end_time = time.time() + timeout
        if searchwindowsize == -1:
            searchwindowsize = self.searchwindowsize

        try:
            incoming = self.buffer
            freshlen = len(incoming)
            while True: # Keep reading until exception or return.
                index = searcher.search(incoming, freshlen, searchwindowsize)
                if index >= 0:
                    self.buffer = incoming[searcher.end : ]
                    self.before = incoming[ : searcher.start]
                    self.after = incoming[searcher.start : searcher.end]
                    self.match = searcher.match
                    self.match_index = index
                    return self.match_index
                # No match at this point
                if timeout < 0 and timeout is not None:
                    raise TIMEOUT ('Timeout exceeded in expect_any().')
                # Still have time left, so read more data
                c = self.read_nonblocking (self.maxread, timeout)
                freshlen = len(c)
                time.sleep (0.0001)
                incoming = incoming + c
                if timeout is not None:
                    timeout = end_time - time.time()
        except EOF, e:
            self.buffer = ''
            self.before = incoming
            self.after = EOF
            index = searcher.eof_index
            if index >= 0:
                self.match = EOF
                self.match_index = index
                return self.match_index
            else:
                self.match = None
                self.match_index = None
                raise EOF (str(e) + '\n' + str(self))
        except TIMEOUT, e:
            self.buffer = incoming
            self.before = incoming
            self.after = TIMEOUT
            index = searcher.timeout_index
            if index >= 0:
                self.match = TIMEOUT
                self.match_index = index
                return self.match_index
            else:
                self.match = None
                self.match_index = None
>               raise TIMEOUT (str(e) + '\n' + str(self))
E               TIMEOUT: Timeout exceeded in read_nonblocking().
E
E               version: 2.4 ($Revision: 516 $)
E               command: /Users/tzuchien/pytest/.tox/py26/bin/python2.6
E               args: ['/Users/tzuchien/pytest/.tox/py26/bin/python2.6', '/Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pytest.py', '--basetemp=/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_pdb_interaction_capturing_simple0/test_pdb_interaction_capturing_simple/pexpect', '/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_pdb_interaction_capturing_simple0/test_pdb_interaction_capturing_simple/test_pdb_interaction_capturing_simple.py']
E               searcher: searcher_re:
E                   0: EOF
E               buffer (last 100 chars): )
E               before (last 100 chars): )
E               after:
E               match: None
E               match_index: None
E               exitstatus: None
E               flag_eof: False
E               pid: 31073
E               child_fd: 194
E               closed: False
E               timeout: 10.0
E               delimiter:
E               logfile:
E               logfile_read: None
E               logfile_send: None
E               maxread: 2000
E               ignorecase: False
E               searchwindowsize: None
E               delaybeforesend: 0.05
E               delayafterclose: 0.1
E               delayafterterminate: 0.1

/Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1414: TIMEOUT
_____________________________________ TestPDB.test_pdb_interaction_capturing_twice _____________________________________

self =
testdir =

    def test_pdb_interaction_capturing_twice(self, testdir):
        p1 = testdir.makepyfile("""
            import pytest
            def test_1():
                i = 0
                print ("hello17")
                pytest.set_trace()
                x = 3
                print ("hello18")
                pytest.set_trace()
                x = 4
        """)
        child = testdir.spawn_pytest(str(p1))
        child.expect("test_1")
        child.expect("x = 3")
        child.expect("(Pdb)")
        child.sendline('c')
        child.expect("x = 4")
        child.sendeof()
>       rest = child.read()

/Users/tzuchien/pytest/testing/test_pdb.py:128:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , size = -1

    def read (self, size = -1):   # File-like object.
        """This reads at most "size" bytes from the file (less if the read hits
        EOF before obtaining size bytes). If the size argument is negative or
        omitted, read all data until EOF is reached. The bytes are returned as
        a string object. An empty string is returned when EOF is encountered
        immediately. """

        if size == 0:
            return ''
        if size < 0:
>           self.expect (self.delimiter) # delimiter default is EOF

/Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:863:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , pattern = , timeout = -1, searchwindowsize = -1

    def expect(self, pattern, timeout = -1, searchwindowsize=-1):
        """This seeks through the stream until a pattern is matched. The
        pattern is overloaded and may take several types. The pattern can be a
        StringType, EOF, a compiled re, or a list of any of those types.
        Strings will be compiled to re types. This returns the index into the
        pattern list. If the pattern was not a list this returns index 0 on a
        successful match. This may raise exceptions for EOF or TIMEOUT. To
        avoid the EOF or TIMEOUT exceptions add EOF or TIMEOUT to the pattern
        list. That will cause expect to match an EOF or TIMEOUT condition
        instead of raising an exception.

        If you pass a list of patterns and more than one matches, the first
        match in the stream is chosen. If more than one pattern matches at
        that point, the leftmost in the pattern list is chosen. For example::

            # the input is 'foobar'
            index = p.expect (['bar', 'foo', 'foobar'])
            # returns 1 ('foo') even though 'foobar' is a "better" match

        Please note, however, that buffering can affect this behavior, since
        input arrives in unpredictable chunks. For example::

            # the input is 'foobar'
            index = p.expect (['foobar', 'foo'])
            # returns 0 ('foobar') if all input is available at once,
            # but returs 1 ('foo') if parts of the final 'bar' arrive late

        After a match is found the instance attributes 'before', 'after' and
        'match' will be set. You can see all the data read before the match in
        'before'. You can see the data that was matched in 'after'. The
        re.MatchObject used in the re match will be in 'match'. If an error
        occurred then 'before' will be set to all the data read so far and
        'after' and 'match' will be None.

        If timeout is -1 then timeout will be set to the self.timeout value.

        A list entry may be EOF or TIMEOUT instead of a string. This will
        catch these exceptions and return the index of the list entry instead
        of raising the exception. The attribute 'after' will be set to the
        exception type. The attribute 'match' will be None. This allows you to
        write code like this::

            index = p.expect (['good', 'bad', pexpect.EOF, pexpect.TIMEOUT])
            if index == 0:
                do_something()
            elif index == 1:
                do_something_else()
            elif index == 2:
                do_some_other_thing()
            elif index == 3:
                do_something_completely_different()

        instead of code like this::

            try:
                index = p.expect (['good', 'bad'])
                if index == 0:
                    do_something()
                elif index == 1:
                    do_something_else()
            except EOF:
                do_some_other_thing()
            except TIMEOUT:
                do_something_completely_different()

        These two forms are equivalent. It all depends on what you want. You
        can also just expect the EOF if you are waiting for all output of a
        child to finish. For example::

            p = pexpect.spawn('/bin/ls')
            p.expect (pexpect.EOF)
            print p.before

        If you are trying to optimize for speed then see expect_list(). """

        compiled_pattern_list = self.compile_pattern_list(pattern)
>       return self.expect_list(compiled_pattern_list, timeout, searchwindowsize)

/Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1316:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , pattern_list = [], timeout = -1
searchwindowsize = -1

    def expect_list(self, pattern_list, timeout = -1, searchwindowsize = -1):
        """This takes a list of compiled regular expressions and returns the
        index into the pattern_list that matched the child output. The list
        may also contain EOF or TIMEOUT (which are not compiled regular
        expressions). This method is similar to the expect() method except
        that expect_list() does not recompile the pattern list on every call.
        This may help if you are trying to optimize for speed, otherwise just
        use the expect() method. This is called by expect(). If timeout==-1
        then the self.timeout value is used. If searchwindowsize==-1 then the
        self.searchwindowsize value is used. """

>       return self.expect_loop(searcher_re(pattern_list), timeout, searchwindowsize)

/Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1330:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , searcher =
timeout = 9.9995019435882568, searchwindowsize = None

    def expect_loop(self, searcher, timeout = -1, searchwindowsize = -1):
        """This is the common loop used inside expect. The 'searcher' should be
        an instance of searcher_re or searcher_string, which describes how and
        what to search for in the input.

        See expect() for other arguments, return value and exceptions. """

        self.searcher = searcher

        if timeout == -1:
            timeout = self.timeout
        if timeout is not None:
            end_time = time.time() + timeout
        if searchwindowsize == -1:
            searchwindowsize = self.searchwindowsize

        try:
            incoming = self.buffer
            freshlen = len(incoming)
            while True: # Keep reading until exception or return.
                index = searcher.search(incoming, freshlen, searchwindowsize)
                if index >= 0:
                    self.buffer = incoming[searcher.end : ]
                    self.before = incoming[ : searcher.start]
                    self.after = incoming[searcher.start : searcher.end]
                    self.match = searcher.match
                    self.match_index = index
                    return self.match_index
                # No match at this point
                if timeout < 0 and timeout is not None:
                    raise TIMEOUT ('Timeout exceeded in expect_any().')
                # Still have time left, so read more data
                c = self.read_nonblocking (self.maxread, timeout)
                freshlen = len(c)
                time.sleep (0.0001)
                incoming = incoming + c
                if timeout is not None:
                    timeout = end_time - time.time()
        except EOF, e:
            self.buffer = ''
            self.before = incoming
            self.after = EOF
            index = searcher.eof_index
            if index >= 0:
                self.match = EOF
                self.match_index = index
                return self.match_index
            else:
                self.match = None
                self.match_index = None
                raise EOF (str(e) + '\n' + str(self))
        except TIMEOUT, e:
            self.buffer = incoming
            self.before = incoming
            self.after = TIMEOUT
            index = searcher.timeout_index
            if index >= 0:
                self.match = TIMEOUT
                self.match_index = index
                return self.match_index
            else:
                self.match = None
                self.match_index = None
>               raise TIMEOUT (str(e) + '\n' + str(self))
E               TIMEOUT: Timeout exceeded in read_nonblocking().
E
E               version: 2.4 ($Revision: 516 $)
E               command: /Users/tzuchien/pytest/.tox/py26/bin/python2.6
E               args: ['/Users/tzuchien/pytest/.tox/py26/bin/python2.6', '/Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pytest.py', '--basetemp=/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_pdb_interaction_capturing_twice0/test_pdb_interaction_capturing_twice/pexpect', '/var/folders/tu/tuNRuAb+GwGVrjEwBwc6KE+++TI/-Tmp-/pytest-5/testdir/test_pdb_interaction_capturing_twice0/test_pdb_interaction_capturing_twice/test_pdb_interaction_capturing_twice.py']
E               searcher: searcher_re:
E                   0: EOF
E               buffer (last 100 chars):
E               (Pdb)
E               before (last 100 chars):
E               (Pdb)
E               after:
E               match: None
E               match_index: None
E               exitstatus: None
E               flag_eof: False
E               pid: 31086
E               child_fd: 150
E               closed: False
E               timeout: 10.0
E               delimiter:
E               logfile:
E               logfile_read: None
E               logfile_send: None
E               maxread: 2000
E               ignorecase: False
E               searchwindowsize: None
E               delaybeforesend: 0.05
E               delayafterclose: 0.1
E               delayafterterminate: 0.1

/Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1414: TIMEOUT
----------------------- generated xml file: /Users/tzuchien/pytest/.tox/py26/log/junit-py26.xml ------------------------
=============================================== short test summary info ================================================
FAIL test_collection.py::TestSession::()::test_parsearg
FAIL test_collection.py::TestSession::()::test_collect_topdir
FAIL test_collection.py::TestSession::()::test_collect_protocol_single_function
FAIL test_collection.py::TestSession::()::test_collect_subdir_event_ordering
FAIL test_collection.py::TestSession::()::test_collect_two_commandline_args
FAIL test_collection.py::Test_getinitialnodes::()::test_pkgfile
FAIL test_conftest.py::test_setinitial_confcut
FAIL test_pdb.py::TestPDB::()::test_pdb_interaction
FAIL test_pdb.py::TestPDB::()::test_pdb_interaction_exception
FAIL test_pdb.py::TestPDB::()::test_pdb_interaction_capturing_simple
FAIL test_pdb.py::TestPDB::()::test_pdb_interaction_capturing_twice
SKIP [1] /Users/tzuchien/pytest/testing/conftest.py:96: no suitable python2.7 found
SKIP [1] /Users/tzuchien/pytest/testing/conftest.py:96: no suitable jython found
SKIP [1] /Users/tzuchien/pytest/testing/conftest.py:96: no suitable pypy found
SKIP [1] /Users/tzuchien/pytest/testing/conftest.py:96: no suitable python2.4 found
SKIP [1] /Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/_pytest/core.py:120: plugin 'xdist' is missing
SKIP [1] /Users/tzuchien/pytest/testing/conftest.py:96: no suitable python3.1 found
XFAIL acceptance_test.py::TestInvocationVariants::()::test_noclass_discovery_if_not_testcase  decide: feature or bug
XFAIL test_capture.py::TestPerTestCapturing::()::test_capture_scope_cache
XFAIL test_collection.py::TestPrunetraceback::()::test_collect_report_postprocessing  other mechanism for adding to reporting needed
XFAIL test_config.py::TestParseIni::()::test_confcutdir  probably not needed
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! KeyboardInterrupt !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
/Users/tzuchien/pytest/.tox/py26/lib/python2.6/site-packages/pexpect.py:1099: KeyboardInterrupt
============================ 11 failed, 362 passed, 6 skipped, 4 xfailed in 464.77 seconds =============================
~/pytest $
-------------- next part --------------
~/pytest $ python pytest.py -v
=================================================== test session starts ====================================================
platform darwin -- Python 2.6.1 -- pytest-2.1.3 -- /usr/bin/python
collected 678 items

doc/example/assertion/test_failures.py:6: test_failure_demo_fails_properly PASSED
doc/example/assertion/test_setup_flow_example.py:14: TestStateFullThing.test_42 PASSED
doc/example/assertion/test_setup_flow_example.py:18: TestStateFullThing.test_23 PASSED
doc/example/assertion/global_testmodule_config/test_hello.py:4: test_func PASSED
doc/example/costlysetup/sub1/test_quick.py:2: test_quick PASSED
doc/example/costlysetup/sub2/test_two.py:1: test_something PASSED
doc/example/costlysetup/sub2/test_two.py:4: test_something_more PASSED
testing/acceptance_test.py:4: TestGeneralUsage.test_config_error PASSED
testing/acceptance_test.py:16: TestGeneralUsage.test_early_hook_error_issue38_1 PASSED
testing/acceptance_test.py:36: TestGeneralUsage.test_early_hook_configure_error_issue38 PASSED
testing/acceptance_test.py:49: TestGeneralUsage.test_file_not_found PASSED
testing/acceptance_test.py:54: TestGeneralUsage.test_config_preparse_plugin_option PASSED
testing/acceptance_test.py:69: TestGeneralUsage.test_assertion_magic PASSED
testing/acceptance_test.py:82: TestGeneralUsage.test_nested_import_error PASSED
testing/acceptance_test.py:96: TestGeneralUsage.test_not_collectable_arguments PASSED
testing/acceptance_test.py:107: TestGeneralUsage.test_early_skip PASSED
testing/acceptance_test.py:120: TestGeneralUsage.test_issue88_initial_file_multinodes PASSED
testing/acceptance_test.py:138: TestGeneralUsage.test_issue93_initialnode_importing_capturing PASSED
testing/acceptance_test.py:149: TestGeneralUsage.test_conftest_printing_shows_if_error PASSED
testing/acceptance_test.py:158: TestGeneralUsage.test_chdir PASSED
testing/acceptance_test.py:174: TestGeneralUsage.test_issue109_sibling_conftests_not_loaded PASSED
testing/acceptance_test.py:187: TestGeneralUsage.test_directory_skipped PASSED
testing/acceptance_test.py:200: TestGeneralUsage.test_multiple_items_per_collector_byid PASSED
testing/acceptance_test.py:219: TestGeneralUsage.test_skip_on_generated_funcarg_id PASSED
testing/acceptance_test.py:235: TestGeneralUsage.test_direct_addressing_selects PASSED
testing/acceptance_test.py:247: TestGeneralUsage.test_direct_addressing_notfound PASSED
testing/acceptance_test.py:256: TestGeneralUsage.test_docstring_on_hookspec PASSED
testing/acceptance_test.py:262: TestGeneralUsage.test_initialization_error_issue49 PASSED
testing/acceptance_test.py:276: TestInvocationVariants.test_earlyinit PASSED
testing/acceptance_test.py:284: TestInvocationVariants.test_pydoc PASSED
testing/acceptance_test.py:292: TestInvocationVariants.test_import_star_py_dot_test PASSED
testing/acceptance_test.py:307: TestInvocationVariants.test_import_star_pytest PASSED
testing/acceptance_test.py:319: TestInvocationVariants.test_double_pytestcmdline PASSED
testing/acceptance_test.py:335: TestInvocationVariants.test_python_minus_m_invocation_ok PASSED
testing/acceptance_test.py:341: TestInvocationVariants.test_python_minus_m_invocation_fail PASSED
testing/acceptance_test.py:347: TestInvocationVariants.test_python_pytest_package PASSED
testing/acceptance_test.py:354: TestInvocationVariants.test_equivalence_pytest_pytest PASSED
testing/acceptance_test.py:357: TestInvocationVariants.test_invoke_with_string PASSED
testing/acceptance_test.py:364: TestInvocationVariants.test_invoke_with_path PASSED
testing/acceptance_test.py:369: TestInvocationVariants.test_invoke_plugin_api PASSED
testing/acceptance_test.py:378: TestInvocationVariants.test_pyargs_importerror PASSED
testing/acceptance_test.py:391: TestInvocationVariants.test_cmdline_python_package PASSED
testing/acceptance_test.py:418: TestInvocationVariants.test_cmdline_python_package_not_exists PASSED
testing/acceptance_test.py:425: TestInvocationVariants.test_noclass_discovery_if_not_testcase xfail
testing/acceptance_test.py:439: TestInvocationVariants.test_doctest_id PASSED
testing/test_assertinterpret.py:12: test_not_being_rewritten PASSED
testing/test_assertinterpret.py:15: test_assert PASSED
testing/test_assertinterpret.py:23: test_assert_with_explicit_message PASSED
testing/test_assertinterpret.py:30: test_assert_within_finally PASSED
testing/test_assertinterpret.py:50: test_assert_multiline_1 PASSED
testing/test_assertinterpret.py:59: test_assert_multiline_2 PASSED
testing/test_assertinterpret.py:68: test_in PASSED
testing/test_assertinterpret.py:76: test_is PASSED
testing/test_assertinterpret.py:85: test_attrib PASSED
testing/test_assertinterpret.py:97: test_attrib_inst PASSED
testing/test_assertinterpret.py:108: test_len PASSED
testing/test_assertinterpret.py:118: test_assert_non_string_message PASSED
testing/test_assertinterpret.py:128: test_assert_keyword_arg PASSED
testing/test_assertinterpret.py:152: test_assert_non_string PASSED
testing/test_assertinterpret.py:159: test_assert_implicit_multiline PASSED
testing/test_assertinterpret.py:169: test_assert_with_brokenrepr_arg PASSED
testing/test_assertinterpret.py:176: test_multiple_statements_per_line PASSED
testing/test_assertinterpret.py:183: test_power PASSED
testing/test_assertinterpret.py:196: TestView.test_class_dispatch PASSED
testing/test_assertinterpret.py:222: TestView.test_viewtype_class_hierarchy PASSED
testing/test_assertinterpret.py:250: test_assert_customizable_reprcompare PASSED
testing/test_assertinterpret.py:260: test_assert_long_source_1 PASSED
testing/test_assertinterpret.py:271: test_assert_long_source_2 PASSED
testing/test_assertinterpret.py:282: test_assert_raise_alias PASSED
testing/test_assertinterpret.py:300: test_assert_raise_subclass PASSED
testing/test_assertinterpret.py:312: test_assert_raises_in_nonzero_of_object_pytest_issue10 PASSED
testing/test_assertion.py:32: TestBinReprIntegration.test_pytest_assertrepr_compare_called PASSED
testing/test_assertion.py:37: TestBinReprIntegration.test_pytest_assertrepr_compare_args PASSED
testing/test_assertion.py:47: TestAssert_reprcompare.test_different_types PASSED
testing/test_assertion.py:50: TestAssert_reprcompare.test_summary PASSED
testing/test_assertion.py:54: TestAssert_reprcompare.test_text_diff PASSED
testing/test_assertion.py:59: TestAssert_reprcompare.test_multiline_text_diff PASSED
testing/test_assertion.py:66: TestAssert_reprcompare.test_list PASSED
testing/test_assertion.py:70: TestAssert_reprcompare.test_list_different_lenghts PASSED
testing/test_assertion.py:76: TestAssert_reprcompare.test_dict PASSED
testing/test_assertion.py:80: TestAssert_reprcompare.test_set PASSED
testing/test_assertion.py:84: TestAssert_reprcompare.test_list_tuples PASSED
testing/test_assertion.py:90: TestAssert_reprcompare.test_list_bad_repr PASSED
testing/test_assertion.py:99: TestAssert_reprcompare.test_one_repr_empty PASSED
testing/test_assertion.py:110: TestAssert_reprcompare.test_repr_no_exc PASSED
testing/test_assertion.py:114: test_rewritten PASSED
testing/test_assertion.py:122: test_reprcompare_notin PASSED
testing/test_assertion.py:126: test_pytest_assertrepr_compare_integration PASSED
testing/test_assertion.py:143: test_sequence_comparison_uses_repr PASSED
testing/test_assertion.py:161: test_assertrepr_loaded_per_dir PASSED
testing/test_assertion.py:184: test_assertion_options PASSED
testing/test_assertion.py:203: test_old_assert_mode PASSED
testing/test_assertion.py:211: test_triple_quoted_string_issue113 PASSED
testing/test_assertion.py:222: test_traceback_failure PASSED
testing/test_assertion.py:251: test_warn_missing PASSED
testing/test_assertrewrite.py:55: TestAssertionRewrite.test_place_initial_imports PASSED
testing/test_assertrewrite.py:91: TestAssertionRewrite.test_dont_rewrite PASSED
testing/test_assertrewrite.py:99: TestAssertionRewrite.test_name PASSED
testing/test_assertrewrite.py:111: TestAssertionRewrite.test_assert_already_has_message PASSED
testing/test_assertrewrite.py:116: TestAssertionRewrite.test_boolop PASSED
testing/test_assertrewrite.py:165: TestAssertionRewrite.test_short_circut_evaluation PASSED
testing/test_assertrewrite.py:174: TestAssertionRewrite.test_unary_op PASSED
testing/test_assertrewrite.py:192: TestAssertionRewrite.test_binary_op PASSED
testing/test_assertrewrite.py:199: TestAssertionRewrite.test_call PASSED
testing/test_assertrewrite.py:227: TestAssertionRewrite.test_attribute PASSED
testing/test_assertrewrite.py:240: TestAssertionRewrite.test_comparisons PASSED
testing/test_assertrewrite.py:263: TestAssertionRewrite.test_len PASSED
testing/test_assertrewrite.py:270: TestAssertionRewrite.test_custom_reprcompare PASSED
testing/test_assertrewrite.py:284: TestAssertionRewrite.test_assert_raising_nonzero_in_comparison PASSED
testing/test_assertrewrite.py:298: TestAssertionRewrite.test_formatchar PASSED
testing/test_assertrewrite.py:306: TestRewriteOnImport.test_pycache_is_a_file PASSED
testing/test_assertrewrite.py:313: TestRewriteOnImport.test_zipfile PASSED
testing/test_assertrewrite.py:329: TestRewriteOnImport.test_readonly PASSED
testing/test_assertrewrite.py:339: TestRewriteOnImport.test_dont_write_bytecode PASSED
testing/test_assertrewrite.py:349: TestRewriteOnImport.test_pyc_vs_pyo PASSED
testing/test_assertrewrite.py:363: TestRewriteOnImport.test_package PASSED
testing/test_assertrewrite.py:372: TestRewriteOnImport.test_translate_newlines PASSED
testing/test_capture.py:7: TestCaptureManager.test_getmethod_default_no_fd PASSED
testing/test_capture.py:17: TestCaptureManager.test_configure_per_fspath PASSED
testing/test_capture.py:35: TestCaptureManager.test_capturing_basic_api[0] PASSED
testing/test_capture.py:35: TestCaptureManager.test_capturing_basic_api[1] PASSED
testing/test_capture.py:35: TestCaptureManager.test_capturing_basic_api[2] PASSED
testing/test_capture.py:59: TestCaptureManager.test_juggle_capturings PASSED
testing/test_capture.py:80: test_capturing_unicode[0] PASSED
testing/test_capture.py:80: test_capturing_unicode[1] PASSED
testing/test_capture.py:100: test_capturing_bytes_in_utf8_encoding[0] PASSED
testing/test_capture.py:100: test_capturing_bytes_in_utf8_encoding[1] PASSED
testing/test_capture.py:111: test_collect_capturing PASSED
testing/test_capture.py:123: TestPerTestCapturing.test_capture_and_fixtures PASSED
testing/test_capture.py:145: TestPerTestCapturing.test_capture_scope_cache xfail
testing/test_capture.py:170: TestPerTestCapturing.test_no_carry_over PASSED
testing/test_capture.py:184: TestPerTestCapturing.test_teardown_capturing PASSED
testing/test_capture.py:205: TestPerTestCapturing.test_teardown_final_capturing PASSED
testing/test_capture.py:221: TestPerTestCapturing.test_capturing_outerr PASSED
testing/test_capture.py:245: TestLoggingInteraction.test_logging_stream_ownership PASSED
testing/test_capture.py:257: TestLoggingInteraction.test_logging_and_immediate_setupteardown PASSED
testing/test_capture.py:283: TestLoggingInteraction.test_logging_and_crossscope_fixtures PASSED
testing/test_capture.py:309: TestLoggingInteraction.test_logging_initialized_in_test PASSED
testing/test_capture.py:328: TestLoggingInteraction.test_conftestlogging_is_shown PASSED
testing/test_capture.py:342: TestLoggingInteraction.test_conftestlogging_and_test_logging PASSED
testing/test_capture.py:364: TestCaptureFuncarg.test_std_functional PASSED
testing/test_capture.py:373: TestCaptureFuncarg.test_stdfd_functional PASSED
testing/test_capture.py:385: TestCaptureFuncarg.test_partial_setup_failure PASSED
testing/test_capture.py:396: TestCaptureFuncarg.test_keyboardinterrupt_disables_capturing PASSED
testing/test_capture.py:410: test_setup_failure_does_not_kill_capturing PASSED
testing/test_capture.py:423: test_fdfuncarg_skips_on_no_osdup PASSED
testing/test_capture.py:436: test_capture_conftest_runtest_setup PASSED
testing/test_collection.py:6: TestCollector.test_collect_versus_item PASSED
testing/test_collection.py:11: TestCollector.test_compat_attributes PASSED
testing/test_collection.py:23: TestCollector.test_check_equality PASSED
testing/test_collection.py:51: TestCollector.test_getparent PASSED
testing/test_collection.py:71: TestCollector.test_getcustomfile_roundtrip PASSED
testing/test_collection.py:89: TestCollectFS.test_ignored_certain_directories PASSED
testing/test_collection.py:105: TestCollectFS.test_custom_norecursedirs PASSED
testing/test_collection.py:120: TestCollectPluginHookRelay.test_pytest_collect_file PASSED
testing/test_collection.py:130: TestCollectPluginHookRelay.test_pytest_collect_directory PASSED
testing/test_collection.py:142: TestPrunetraceback.test_collection_error PASSED
testing/test_collection.py:153: TestPrunetraceback.test_custom_repr_failure PASSED
testing/test_collection.py:178: TestPrunetraceback.test_collect_report_postprocessing xfail
testing/test_collection.py:198: TestCustomConftests.test_ignore_collect_path PASSED
testing/test_collection.py:213: TestCustomConftests.test_ignore_collect_not_called_on_argument PASSED
testing/test_collection.py:226: TestCustomConftests.test_collectignore_exclude_on_option PASSED
testing/test_collection.py:244: TestCustomConftests.test_pytest_fs_collect_hooks_are_seen PASSED
testing/test_collection.py:261: TestCustomConftests.test_pytest_collect_file_from_sister_dir PASSED
testing/test_collection.py:293: TestSession.test_parsearg FAILED
testing/test_collection.py:313: TestSession.test_collect_topdir FAILED
testing/test_collection.py:328: TestSession.test_collect_protocol_single_function FAILED
testing/test_collection.py:354: TestSession.test_collect_protocol_method PASSED
testing/test_collection.py:375: TestSession.test_collect_custom_nodes_multi_id PASSED
testing/test_collection.py:411: TestSession.test_collect_subdir_event_ordering FAILED
testing/test_collection.py:430: TestSession.test_collect_two_commandline_args FAILED
testing/test_collection.py:456: TestSession.test_serialization_byid PASSED
testing/test_collection.py:472: TestSession.test_find_byid_without_instance_parents PASSED
testing/test_collection.py:488: Test_getinitialnodes.test_global_file PASSED
testing/test_collection.py:499: Test_getinitialnodes.test_pkgfile FAILED
testing/test_collection.py:514: Test_genitems.test_check_collect_hashes PASSED
testing/test_collection.py:531: Test_genitems.test_root_conftest_syntax_error PASSED
testing/test_collection.py:537: Test_genitems.test_example_items1 PASSED
testing/test_collection.py:564: test_matchnodes_two_collections_same_file PASSED
testing/test_config.py:6: TestParseIni.test_getcfg_and_config PASSED
testing/test_config.py:19: TestParseIni.test_getcfg_empty_path PASSED
testing/test_config.py:22: TestParseIni.test_append_parse_args PASSED
testing/test_config.py:35: TestParseIni.test_tox_ini_wrong_version PASSED
testing/test_config.py:46: TestParseIni.test_ini_names[0] PASSED
testing/test_config.py:46: TestParseIni.test_ini_names[1] PASSED
testing/test_config.py:46: TestParseIni.test_ini_names[2] PASSED
testing/test_config.py:56: TestParseIni.test_toxini_before_lower_pytestini PASSED
testing/test_config.py:70: TestParseIni.test_confcutdir xfail
testing/test_config.py:82: TestConfigCmdlineParsing.test_parsing_again_fails PASSED
testing/test_config.py:88: TestConfigAPI.test_config_trace PASSED
testing/test_config.py:96: TestConfigAPI.test_config_getvalue_honours_conftest PASSED
testing/test_config.py:110: TestConfigAPI.test_config_getvalueorskip PASSED
testing/test_config.py:127: TestConfigAPI.test_config_overwrite PASSED
testing/test_config.py:137: TestConfigAPI.test_getconftest_pathlist PASSED
testing/test_config.py:149: TestConfigAPI.test_addini PASSED
testing/test_config.py:163: TestConfigAPI.test_addini_pathlist PASSED testing/test_config.py:180: TestConfigAPI.test_addini_args PASSED testing/test_config.py:197: TestConfigAPI.test_addini_linelist PASSED testing/test_config.py:215: test_options_on_small_file_do_not_blow_up PASSED testing/test_config.py:231: test_preparse_ordering_with_setuptools PASSED testing/test_config.py:253: test_plugin_preparse_prevents_setuptools_loading PASSED testing/test_config.py:267: test_cmdline_processargs_simple PASSED testing/test_conftest.py:26: TestConftestValueAccessGlobal.test_basic_init[0] PASSED testing/test_conftest.py:26: TestConftestValueAccessGlobal.test_basic_init[1] PASSED testing/test_conftest.py:31: TestConftestValueAccessGlobal.test_onimport[0] PASSED testing/test_conftest.py:31: TestConftestValueAccessGlobal.test_onimport[1] PASSED testing/test_conftest.py:41: TestConftestValueAccessGlobal.test_immediate_initialiation_and_incremental_are_the_same[0] PASSED testing/test_conftest.py:41: TestConftestValueAccessGlobal.test_immediate_initialiation_and_incremental_are_the_same[1] PASSED testing/test_conftest.py:52: TestConftestValueAccessGlobal.test_default_has_lower_prio[0] PASSED testing/test_conftest.py:52: TestConftestValueAccessGlobal.test_default_has_lower_prio[1] PASSED testing/test_conftest.py:57: TestConftestValueAccessGlobal.test_value_access_not_existing[0] PASSED testing/test_conftest.py:57: TestConftestValueAccessGlobal.test_value_access_not_existing[1] PASSED testing/test_conftest.py:62: TestConftestValueAccessGlobal.test_value_access_by_path[0] PASSED testing/test_conftest.py:62: TestConftestValueAccessGlobal.test_value_access_by_path[1] PASSED testing/test_conftest.py:73: TestConftestValueAccessGlobal.test_value_access_with_init_one_conftest[0] PASSED testing/test_conftest.py:73: TestConftestValueAccessGlobal.test_value_access_with_init_one_conftest[1] PASSED testing/test_conftest.py:78: 
TestConftestValueAccessGlobal.test_value_access_with_init_two_conftests[0] PASSED testing/test_conftest.py:78: TestConftestValueAccessGlobal.test_value_access_with_init_two_conftests[1] PASSED testing/test_conftest.py:84: TestConftestValueAccessGlobal.test_value_access_with_confmod[0] PASSED testing/test_conftest.py:84: TestConftestValueAccessGlobal.test_value_access_with_confmod[1] PASSED testing/test_conftest.py:94: test_conftest_in_nonpkg_with_init PASSED testing/test_conftest.py:101: test_doubledash_not_considered PASSED testing/test_conftest.py:109: test_conftest_global_import PASSED testing/test_conftest.py:130: test_conftestcutdir PASSED testing/test_conftest.py:149: test_conftestcutdir_inplace_considered PASSED testing/test_conftest.py:157: test_setinitial_confcut FAILED testing/test_conftest.py:173: test_setinitial_conftest_subdirs[0] PASSED testing/test_conftest.py:173: test_setinitial_conftest_subdirs[1] PASSED testing/test_conftest.py:173: test_setinitial_conftest_subdirs[2] PASSED testing/test_conftest.py:173: test_setinitial_conftest_subdirs[3] PASSED testing/test_conftest.py:186: test_conftest_confcutdir PASSED testing/test_core.py:7: TestBootstrapping.test_consider_env_fails_to_import PASSED testing/test_core.py:12: TestBootstrapping.test_preparse_args PASSED testing/test_core.py:18: TestBootstrapping.test_plugin_prevent_register PASSED testing/test_core.py:26: TestBootstrapping.test_plugin_prevent_register_unregistered_alredy_registered PASSED testing/test_core.py:35: TestBootstrapping.test_plugin_skip PASSED testing/test_core.py:49: TestBootstrapping.test_consider_env_plugin_instantiation PASSED testing/test_core.py:63: TestBootstrapping.test_consider_setuptools_instantiation PASSED testing/test_core.py:82: TestBootstrapping.test_consider_setuptools_not_installed PASSED testing/test_core.py:89: TestBootstrapping.test_pluginmanager_ENV_startup PASSED testing/test_core.py:102: TestBootstrapping.test_import_plugin_importname PASSED 
testing/test_core.py:120: TestBootstrapping.test_import_plugin_dotted_name PASSED testing/test_core.py:132: TestBootstrapping.test_consider_module PASSED testing/test_core.py:143: TestBootstrapping.test_consider_module_import_module PASSED testing/test_core.py:160: TestBootstrapping.test_config_sets_conftesthandle_onimport PASSED testing/test_core.py:164: TestBootstrapping.test_consider_conftest_deps PASSED testing/test_core.py:169: TestBootstrapping.test_pm PASSED testing/test_core.py:186: TestBootstrapping.test_pm_ordering PASSED testing/test_core.py:199: TestBootstrapping.test_register_imported_modules PASSED testing/test_core.py:213: TestBootstrapping.test_canonical_import PASSED testing/test_core.py:221: TestBootstrapping.test_register_mismatch_method PASSED testing/test_core.py:228: TestBootstrapping.test_register_mismatch_arg PASSED testing/test_core.py:236: TestBootstrapping.test_notify_exception PASSED testing/test_core.py:250: TestBootstrapping.test_register PASSED testing/test_core.py:267: TestBootstrapping.test_listattr PASSED testing/test_core.py:281: TestBootstrapping.test_hook_tracing PASSED testing/test_core.py:304: TestPytestPluginInteractions.test_addhooks_conftestplugin PASSED testing/test_core.py:323: TestPytestPluginInteractions.test_addhooks_nohooks PASSED testing/test_core.py:335: TestPytestPluginInteractions.test_do_option_conftestplugin PASSED testing/test_core.py:346: TestPytestPluginInteractions.test_namespace_early_from_import PASSED testing/test_core.py:355: TestPytestPluginInteractions.test_do_ext_namespace PASSED testing/test_core.py:372: TestPytestPluginInteractions.test_do_option_postinitialize PASSED testing/test_core.py:385: TestPytestPluginInteractions.test_configure PASSED testing/test_core.py:406: TestPytestPluginInteractions.test_listattr PASSED testing/test_core.py:414: TestPytestPluginInteractions.test_listattr_tryfirst PASSED testing/test_core.py:446: test_namespace_has_default_and_env_plugins PASSED 
testing/test_core.py:454: test_varnames PASSED testing/test_core.py:468: TestMultiCall.test_uses_copy_of_methods PASSED testing/test_core.py:476: TestMultiCall.test_call_passing PASSED testing/test_core.py:498: TestMultiCall.test_keyword_args PASSED testing/test_core.py:511: TestMultiCall.test_keyword_args_with_defaultargs PASSED testing/test_core.py:519: TestMultiCall.test_tags_call_error PASSED testing/test_core.py:523: TestMultiCall.test_call_subexecute PASSED testing/test_core.py:535: TestMultiCall.test_call_none_is_no_result PASSED testing/test_core.py:546: TestHookRelay.test_happypath PASSED testing/test_core.py:563: TestHookRelay.test_only_kwargs PASSED testing/test_core.py:571: TestHookRelay.test_firstresult_definition PASSED testing/test_core.py:587: TestTracer.test_simple PASSED testing/test_core.py:601: TestTracer.test_indent PASSED testing/test_core.py:623: TestTracer.test_setprocessor PASSED testing/test_core.py:645: TestTracer.test_setmyprocessor PASSED testing/test_doctest.py:6: TestDoctests.test_collect_testtextfile PASSED testing/test_doctest.py:22: TestDoctests.test_collect_module PASSED testing/test_doctest.py:30: TestDoctests.test_simple_doctestfile PASSED testing/test_doctest.py:39: TestDoctests.test_new_pattern PASSED testing/test_doctest.py:48: TestDoctests.test_doctest_unexpected_exception PASSED testing/test_doctest.py:62: TestDoctests.test_doctest_unex_importerror PASSED testing/test_doctest.py:77: TestDoctests.test_doctestmodule PASSED testing/test_doctest.py:89: TestDoctests.test_doctestmodule_external_and_issue116 PASSED testing/test_doctest.py:111: TestDoctests.test_txtfile_failing PASSED testing/test_genscript.py:21: test_gen[python2.4] SKIPPED testing/test_genscript.py:21: test_gen[python2.5] PASSED testing/test_genscript.py:21: test_gen[python2.6] PASSED testing/test_genscript.py:21: test_gen[python2.7] SKIPPED testing/test_genscript.py:21: test_gen[python3.1] SKIPPED testing/test_genscript.py:21: test_gen[pypy] SKIPPED 
testing/test_genscript.py:21: test_gen[jython] SKIPPED testing/test_genscript.py:31: test_rundist SKIPPED testing/test_helpconfig.py:4: test_version PASSED testing/test_helpconfig.py:17: test_help PASSED testing/test_helpconfig.py:26: test_collectattr PASSED testing/test_helpconfig.py:38: test_hookvalidation_unknown PASSED testing/test_helpconfig.py:49: test_hookvalidation_optional PASSED testing/test_helpconfig.py:59: test_traceconfig PASSED testing/test_helpconfig.py:66: test_debug PASSED testing/test_helpconfig.py:72: test_PYTEST_DEBUG PASSED testing/test_junitxml.py:21: TestPython.test_summing_simple PASSED testing/test_junitxml.py:42: TestPython.test_timing_function PASSED testing/test_junitxml.py:54: TestPython.test_setup_error PASSED testing/test_junitxml.py:73: TestPython.test_skip_contains_name_reason PASSED testing/test_junitxml.py:93: TestPython.test_classname_instance PASSED testing/test_junitxml.py:108: TestPython.test_classname_nested_dir PASSED testing/test_junitxml.py:120: TestPython.test_internal_error PASSED testing/test_junitxml.py:133: TestPython.test_failure_function PASSED testing/test_junitxml.py:160: TestPython.test_failure_escape PASSED testing/test_junitxml.py:181: TestPython.test_junit_prefixing PASSED testing/test_junitxml.py:203: TestPython.test_xfailure_function PASSED testing/test_junitxml.py:221: TestPython.test_xfailure_xpass PASSED testing/test_junitxml.py:240: TestPython.test_collect_error PASSED testing/test_junitxml.py:254: TestPython.test_collect_skipped PASSED testing/test_junitxml.py:267: TestPython.test_unicode PASSED testing/test_junitxml.py:283: TestNonPython.test_summing_simple PASSED testing/test_junitxml.py:312: test_nullbyte PASSED testing/test_junitxml.py:328: test_nullbyte_replace PASSED testing/test_junitxml.py:343: test_invalid_xml_escape PASSED testing/test_junitxml.py:380: test_logxml_path_expansion PASSED testing/test_mark.py:5: TestMark.test_markinfo_repr PASSED testing/test_mark.py:10: 
TestMark.test_pytest_exists_in_namespace_all PASSED testing/test_mark.py:14: TestMark.test_pytest_mark_notcallable PASSED testing/test_mark.py:18: TestMark.test_pytest_mark_bare PASSED testing/test_mark.py:25: TestMark.test_pytest_mark_keywords PASSED testing/test_mark.py:34: TestMark.test_apply_multiple_and_merge PASSED testing/test_mark.py:48: TestMark.test_pytest_mark_positional PASSED testing/test_mark.py:56: TestMark.test_pytest_mark_reuse PASSED testing/test_mark.py:72: TestFunctional.test_mark_per_function PASSED testing/test_mark.py:82: TestFunctional.test_mark_per_module PASSED testing/test_mark.py:92: TestFunctional.test_marklist_per_class PASSED testing/test_mark.py:104: TestFunctional.test_marklist_per_module PASSED testing/test_mark.py:117: TestFunctional.test_mark_per_class_decorator PASSED testing/test_mark.py:129: TestFunctional.test_mark_per_class_decorator_plus_existing_dec PASSED testing/test_mark.py:144: TestFunctional.test_merging_markers PASSED testing/test_mark.py:162: TestFunctional.test_mark_other PASSED testing/test_mark.py:173: TestFunctional.test_mark_dynamically_in_funcarg PASSED testing/test_mark.py:193: Test_genitems.test_check_collect_hashes PASSED testing/test_mark.py:210: Test_genitems.test_root_conftest_syntax_error PASSED testing/test_mark.py:216: Test_genitems.test_example_items1 PASSED testing/test_mark.py:245: TestKeywordSelection.test_select_simple PASSED testing/test_mark.py:265: TestKeywordSelection.test_select_extra_keywords PASSED testing/test_mark.py:291: TestKeywordSelection.test_select_starton PASSED testing/test_mark.py:307: TestKeywordSelection.test_keyword_extra PASSED testing/test_monkeypatch.py:5: test_setattr PASSED testing/test_monkeypatch.py:27: test_delattr PASSED testing/test_monkeypatch.py:45: test_setitem PASSED testing/test_monkeypatch.py:62: test_delitem PASSED testing/test_monkeypatch.py:78: test_setenv PASSED testing/test_monkeypatch.py:86: test_delenv PASSED testing/test_monkeypatch.py:106: 
test_setenv_prepend PASSED testing/test_monkeypatch.py:116: test_monkeypatch_plugin PASSED testing/test_monkeypatch.py:124: test_syspath_prepend PASSED testing/test_nose.py:6: test_nose_setup SKIPPED testing/test_nose.py:27: test_nose_setup_func SKIPPED testing/test_nose.py:57: test_nose_setup_func_failure SKIPPED testing/test_nose.py:81: test_nose_setup_func_failure_2 SKIPPED testing/test_nose.py:105: test_nose_setup_partial SKIPPED testing/test_nose.py:140: test_nose_test_generator_fixtures SKIPPED testing/test_nose.py:207: test_module_level_setup SKIPPED testing/test_nose.py:238: test_nose_style_setup_teardown SKIPPED testing/test_nose.py:259: test_nose_setup_ordering SKIPPED testing/test_parseopt.py:6: TestParser.test_no_help_by_default PASSED testing/test_parseopt.py:12: TestParser.test_group_add_and_get PASSED testing/test_parseopt.py:18: TestParser.test_getgroup_simple PASSED testing/test_parseopt.py:26: TestParser.test_group_ordering PASSED testing/test_parseopt.py:35: TestParser.test_group_addoption PASSED testing/test_parseopt.py:41: TestParser.test_group_shortopt_lowercase PASSED testing/test_parseopt.py:51: TestParser.test_parser_addoption PASSED testing/test_parseopt.py:65: TestParser.test_parse PASSED testing/test_parseopt.py:70: TestParser.test_parse_will_set_default PASSED testing/test_parseopt.py:79: TestParser.test_parse_setoption PASSED testing/test_parseopt.py:90: TestParser.test_parse_defaultgetter PASSED testing/test_parseopt.py:105: test_addoption_parser_epilog PASSED testing/test_pastebin.py:13: TestPasting.test_failed PASSED testing/test_pastebin.py:30: TestPasting.test_all PASSED testing/test_pdb.py:14: TestPDB.test_pdb_on_fail PASSED testing/test_pdb.py:24: TestPDB.test_pdb_on_xfail PASSED testing/test_pdb.py:34: TestPDB.test_pdb_on_skip PASSED testing/test_pdb.py:43: TestPDB.test_pdb_on_BdbQuit PASSED testing/test_pdb.py:52: TestPDB.test_pdb_interaction SKIPPED testing/test_pdb.py:69: TestPDB.test_pdb_interaction_exception SKIPPED 
testing/test_pdb.py:88: TestPDB.test_pdb_interaction_capturing_simple SKIPPED testing/test_pdb.py:109: TestPDB.test_pdb_interaction_capturing_twice SKIPPED testing/test_pdb.py:136: TestPDB.test_pdb_used_outside_test SKIPPED testing/test_pdb.py:147: TestPDB.test_pdb_used_in_generate_tests SKIPPED testing/test_pdb.py:160: TestPDB.test_pdb_collection_failure_is_shown PASSED testing/test_pytester.py:7: test_reportrecorder xfail testing/test_pytester.py:58: test_parseconfig PASSED testing/test_pytester.py:65: test_testdir_runs_with_plugin PASSED testing/test_pytester.py:76: test_hookrecorder_basic PASSED testing/test_pytester.py:88: test_hookrecorder_basic_no_args_hook PASSED testing/test_pytester.py:99: test_functional PASSED testing/test_pytester.py:119: test_makepyfile_unicode PASSED testing/test_python.py:5: TestModule.test_failing_import PASSED testing/test_python.py:10: TestModule.test_import_duplicate PASSED testing/test_python.py:27: TestModule.test_syntax_error_in_module PASSED testing/test_python.py:32: TestModule.test_module_considers_pluginmanager_at_import PASSED testing/test_python.py:37: TestClass.test_class_with_init_not_collected FAILED testing/test_python.py:37: TestClass.test_class_with_init_not_collected ERROR testing/test_python.py:49: TestClass.test_class_subclassobject ERROR testing/test_python.py:49: TestClass.test_class_subclassobject ERROR testing/test_python.py:60: TestGenerator.test_generative_functions ERROR testing/test_python.py:60: TestGenerator.test_generative_functions ERROR testing/test_python.py:80: TestGenerator.test_generative_methods ERROR testing/test_python.py:80: TestGenerator.test_generative_methods ERROR testing/test_python.py:98: TestGenerator.test_generative_functions_with_explicit_names ERROR testing/test_python.py:98: TestGenerator.test_generative_functions_with_explicit_names ERROR testing/test_python.py:120: TestGenerator.test_generative_functions_unique_explicit_names ERROR testing/test_python.py:120: 
TestGenerator.test_generative_functions_unique_explicit_names ERROR testing/test_python.py:134: TestGenerator.test_generative_methods_with_explicit_names ERROR testing/test_python.py:134: TestGenerator.test_generative_methods_with_explicit_names ERROR testing/test_python.py:154: TestGenerator.test_order_of_execution_generator_same_codeline ERROR testing/test_python.py:154: TestGenerator.test_order_of_execution_generator_same_codeline ERROR testing/test_python.py:178: TestGenerator.test_order_of_execution_generator_different_codeline ERROR testing/test_python.py:178: TestGenerator.test_order_of_execution_generator_different_codeline ERROR testing/test_python.py:209: TestGenerator.test_setupstate_is_preserved_134 ERROR testing/test_python.py:209: TestGenerator.test_setupstate_is_preserved_134 ERROR testing/test_python.py:253: TestFunction.test_getmodulecollector ERROR testing/test_python.py:253: TestFunction.test_getmodulecollector ERROR testing/test_python.py:259: TestFunction.test_function_equality ERROR testing/test_python.py:259: TestFunction.test_function_equality ERROR testing/test_python.py:281: TestFunction.test_function_equality_with_callspec ERROR testing/test_python.py:281: TestFunction.test_function_equality_with_callspec ERROR testing/test_python.py:299: TestFunction.test_pyfunc_call ERROR testing/test_python.py:299: TestFunction.test_pyfunc_call ERROR testing/test_python.py:313: TestSorting.test_check_equality ERROR testing/test_python.py:313: TestSorting.test_check_equality ERROR testing/test_python.py:341: TestSorting.test_allow_sane_sorting_for_decorators ERROR testing/test_python.py:341: TestSorting.test_allow_sane_sorting_for_decorators ERROR testing/test_python.py:363: TestConftestCustomization.test_pytest_pycollect_module ERROR testing/test_python.py:363: TestConftestCustomization.test_pytest_pycollect_module ERROR testing/test_python.py:380: TestConftestCustomization.test_pytest_pycollect_makeitem ERROR testing/test_python.py:380: 
TestConftestCustomization.test_pytest_pycollect_makeitem ERROR testing/test_python.py:395: TestConftestCustomization.test_makeitem_non_underscore ERROR testing/test_python.py:395: TestConftestCustomization.test_makeitem_non_underscore ERROR testing/test_python.py:403: test_setup_only_available_in_subdir ERROR testing/test_python.py:403: test_setup_only_available_in_subdir ERROR testing/test_python.py:431: test_generate_tests_only_done_in_subdir ERROR testing/test_python.py:431: test_generate_tests_only_done_in_subdir ERROR testing/test_python.py:449: test_modulecol_roundtrip ERROR testing/test_python.py:449: test_modulecol_roundtrip ERROR testing/test_python.py:457: TestTracebackCutting.test_skip_simple PASSED testing/test_python.py:462: TestTracebackCutting.test_traceback_argsetup PASSED testing/test_python.py:482: TestTracebackCutting.test_traceback_error_during_import PASSED testing/test_python.py:507: test_getfuncargnames PASSED testing/test_python.py:523: test_callspec_repr PASSED testing/test_python.py:534: TestFillFuncArgs.test_funcarg_lookupfails PASSED testing/test_python.py:545: TestFillFuncArgs.test_funcarg_lookup_default FAILED testing/test_python.py:545: TestFillFuncArgs.test_funcarg_lookup_default ERROR testing/test_python.py:554: TestFillFuncArgs.test_funcarg_basic ERROR testing/test_python.py:554: TestFillFuncArgs.test_funcarg_basic ERROR testing/test_python.py:567: TestFillFuncArgs.test_funcarg_lookup_modulelevel ERROR testing/test_python.py:567: TestFillFuncArgs.test_funcarg_lookup_modulelevel ERROR testing/test_python.py:584: TestFillFuncArgs.test_funcarg_lookup_classlevel ERROR testing/test_python.py:584: TestFillFuncArgs.test_funcarg_lookup_classlevel ERROR testing/test_python.py:597: TestFillFuncArgs.test_fillfuncargs_exposed ERROR testing/test_python.py:597: TestFillFuncArgs.test_fillfuncargs_exposed ERROR testing/test_python.py:610: TestRequest.test_request_attributes ERROR testing/test_python.py:610: TestRequest.test_request_attributes 
ERROR testing/test_python.py:624: TestRequest.test_request_attributes_method ERROR testing/test_python.py:624: TestRequest.test_request_attributes_method ERROR testing/test_python.py:648: TestRequest.test_getfuncargvalue_recursive ERROR testing/test_python.py:648: TestRequest.test_getfuncargvalue_recursive ERROR testing/test_python.py:663: TestRequest.test_getfuncargvalue ERROR testing/test_python.py:663: TestRequest.test_getfuncargvalue ERROR testing/test_python.py:684: TestRequest.test_request_addfinalizer ERROR testing/test_python.py:684: TestRequest.test_request_addfinalizer ERROR testing/test_python.py:702: TestRequest.test_request_addfinalizer_partial_setup_failure ERROR testing/test_python.py:702: TestRequest.test_request_addfinalizer_partial_setup_failure ERROR testing/test_python.py:717: TestRequest.test_request_getmodulepath ERROR testing/test_python.py:717: TestRequest.test_request_getmodulepath ERROR testing/test_python.py:723: test_applymarker ERROR testing/test_python.py:723: test_applymarker ERROR testing/test_python.py:741: TestRequestCachedSetup.test_request_cachedsetup ERROR testing/test_python.py:741: TestRequestCachedSetup.test_request_cachedsetup ERROR testing/test_python.py:762: TestRequestCachedSetup.test_request_cachedsetup_class ERROR testing/test_python.py:762: TestRequestCachedSetup.test_request_cachedsetup_class ERROR testing/test_python.py:795: TestRequestCachedSetup.test_request_cachedsetup_extrakey ERROR testing/test_python.py:795: TestRequestCachedSetup.test_request_cachedsetup_extrakey ERROR testing/test_python.py:810: TestRequestCachedSetup.test_request_cachedsetup_cache_deletion ERROR testing/test_python.py:810: TestRequestCachedSetup.test_request_cachedsetup_cache_deletion ERROR testing/test_python.py:828: TestRequestCachedSetup.test_request_cached_setup_two_args ERROR testing/test_python.py:828: TestRequestCachedSetup.test_request_cached_setup_two_args ERROR testing/test_python.py:842: 
TestRequestCachedSetup.test_request_cached_setup_getfuncargvalue ERROR testing/test_python.py:842: TestRequestCachedSetup.test_request_cached_setup_getfuncargvalue ERROR testing/test_python.py:857: TestRequestCachedSetup.test_request_cached_setup_functional ERROR testing/test_python.py:857: TestRequestCachedSetup.test_request_cached_setup_functional ERROR testing/test_python.py:885: TestMetafunc.test_no_funcargs ERROR testing/test_python.py:885: TestMetafunc.test_no_funcargs ERROR testing/test_python.py:890: TestMetafunc.test_function_basic ERROR testing/test_python.py:890: TestMetafunc.test_function_basic ERROR testing/test_python.py:898: TestMetafunc.test_addcall_no_args ERROR testing/test_python.py:898: TestMetafunc.test_addcall_no_args ERROR testing/test_python.py:907: TestMetafunc.test_addcall_id ERROR testing/test_python.py:907: TestMetafunc.test_addcall_id ERROR testing/test_python.py:920: TestMetafunc.test_addcall_param ERROR testing/test_python.py:920: TestMetafunc.test_addcall_param ERROR testing/test_python.py:932: TestMetafunc.test_addcall_funcargs ERROR testing/test_python.py:932: TestMetafunc.test_addcall_funcargs ERROR testing/test_python.py:945: TestGenfuncFunctional.test_attributes ERROR testing/test_python.py:945: TestGenfuncFunctional.test_attributes ERROR testing/test_python.py:979: TestGenfuncFunctional.test_addcall_with_two_funcargs_generators ERROR testing/test_python.py:979: TestGenfuncFunctional.test_addcall_with_two_funcargs_generators ERROR testing/test_python.py:1000: TestGenfuncFunctional.test_two_functions ERROR testing/test_python.py:1000: TestGenfuncFunctional.test_two_functions ERROR testing/test_python.py:1022: TestGenfuncFunctional.test_noself_in_method ERROR testing/test_python.py:1022: TestGenfuncFunctional.test_noself_in_method ERROR testing/test_python.py:1037: TestGenfuncFunctional.test_generate_plugin_and_module PASSED testing/test_python.py:1063: TestGenfuncFunctional.test_generate_tests_in_class PASSED 
testing/test_python.py:1078: TestGenfuncFunctional.test_two_functions_not_same_instance PASSED testing/test_python.py:1096: TestGenfuncFunctional.test_issue28_setup_method_in_generate_tests PASSED testing/test_python.py:1112: test_conftest_funcargs_only_available_in_subdir PASSED testing/test_python.py:1133: test_funcarg_non_pycollectobj FAILED testing/test_python.py:1133: test_funcarg_non_pycollectobj ERROR testing/test_python.py:1156: test_funcarg_lookup_error ERROR testing/test_python.py:1156: test_funcarg_lookup_error ERROR testing/test_python.py:1172: TestReportInfo.test_itemreport_reportinfo ERROR testing/test_python.py:1172: TestReportInfo.test_itemreport_reportinfo ERROR testing/test_python.py:1186: TestReportInfo.test_func_reportinfo ERROR testing/test_python.py:1186: TestReportInfo.test_func_reportinfo ERROR testing/test_python.py:1193: TestReportInfo.test_class_reportinfo ERROR testing/test_python.py:1193: TestReportInfo.test_class_reportinfo ERROR testing/test_python.py:1205: TestReportInfo.test_generator_reportinfo ERROR testing/test_python.py:1205: TestReportInfo.test_generator_reportinfo ERROR testing/test_python.py:1236: test_show_funcarg ERROR testing/test_python.py:1236: test_show_funcarg ERROR testing/test_python.py:1245: TestRaises.test_raises ERROR testing/test_python.py:1245: TestRaises.test_raises ERROR testing/test_python.py:1252: TestRaises.test_raises_exec ERROR testing/test_python.py:1252: TestRaises.test_raises_exec ERROR testing/test_python.py:1255: TestRaises.test_raises_syntax_error ERROR testing/test_python.py:1255: TestRaises.test_raises_syntax_error ERROR testing/test_python.py:1258: TestRaises.test_raises_function ERROR testing/test_python.py:1258: TestRaises.test_raises_function ERROR testing/test_python.py:1261: TestRaises.test_raises_callable_no_exception ERROR testing/test_python.py:1261: TestRaises.test_raises_callable_no_exception ERROR testing/test_python.py:1270: TestRaises.test_raises_as_contextmanager ERROR 
testing/test_python.py:1270: TestRaises.test_raises_as_contextmanager ERROR testing/test_python.py:1300: test_customized_python_discovery ERROR testing/test_python.py:1300: test_customized_python_discovery ERROR testing/test_python.py:1330: test_collector_attributes ERROR testing/test_python.py:1330: test_collector_attributes ERROR testing/test_python.py:1348: test_customize_through_attributes ERROR testing/test_python.py:1348: test_customize_through_attributes ERROR testing/test_recwarn.py:4: test_WarningRecorder ERROR testing/test_recwarn.py:4: test_WarningRecorder ERROR testing/test_recwarn.py:23: test_recwarn_functional ERROR testing/test_recwarn.py:23: test_recwarn_functional ERROR testing/test_recwarn.py:54: test_deprecated_call_raises ERROR testing/test_recwarn.py:54: test_deprecated_call_raises ERROR testing/test_recwarn.py:59: test_deprecated_call ERROR testing/test_recwarn.py:59: test_deprecated_call ERROR testing/test_recwarn.py:62: test_deprecated_call_ret ERROR testing/test_recwarn.py:62: test_deprecated_call_ret ERROR testing/test_recwarn.py:66: test_deprecated_call_preserves ERROR testing/test_recwarn.py:66: test_deprecated_call_preserves ERROR testing/test_recwarn.py:74: test_deprecated_explicit_call_raises ERROR testing/test_recwarn.py:74: test_deprecated_explicit_call_raises ERROR testing/test_recwarn.py:78: test_deprecated_explicit_call ERROR testing/test_recwarn.py:78: test_deprecated_explicit_call ERROR testing/test_resultlog.py:7: test_generic_path ERROR testing/test_resultlog.py:7: test_generic_path ERROR testing/test_resultlog.py:30: test_write_log_entry ERROR testing/test_resultlog.py:30: test_write_log_entry ERROR testing/test_resultlog.py:80: TestWithFunctionIntegration.test_collection_report ERROR testing/test_resultlog.py:80: TestWithFunctionIntegration.test_collection_report ERROR testing/test_resultlog.py:103: TestWithFunctionIntegration.test_log_test_outcomes ERROR testing/test_resultlog.py:103: 
TestWithFunctionIntegration.test_log_test_outcomes ERROR testing/test_resultlog.py:136: TestWithFunctionIntegration.test_internal_exception ERROR testing/test_resultlog.py:136: TestWithFunctionIntegration.test_internal_exception ERROR testing/test_resultlog.py:153: test_generic ERROR testing/test_resultlog.py:153: test_generic ERROR testing/test_resultlog.py:180: test_no_resultlog_on_slaves ERROR testing/test_resultlog.py:180: test_no_resultlog_on_slaves ERROR testing/test_runner.py:6: TestSetupState.test_setup ERROR testing/test_runner.py:6: TestSetupState.test_setup ERROR testing/test_runner.py:16: TestSetupState.test_setup_scope_None ERROR testing/test_runner.py:16: TestSetupState.test_setup_scope_None ERROR testing/test_runner.py:30: TestSetupState.test_teardown_exact_stack_empty ERROR testing/test_runner.py:30: TestSetupState.test_teardown_exact_stack_empty ERROR testing/test_runner.py:37: TestSetupState.test_setup_fails_and_failure_is_cached ERROR testing/test_runner.py:37: TestSetupState.test_setup_fails_and_failure_is_cached ERROR testing/test_runner.py:48: TestExecutionNonForked.test_passfunction ERROR testing/test_runner.py:48: TestExecutionNonForked.test_passfunction ERROR testing/test_runner.py:59: TestExecutionNonForked.test_failfunction ERROR testing/test_runner.py:59: TestExecutionNonForked.test_failfunction ERROR testing/test_runner.py:72: TestExecutionNonForked.test_skipfunction FAILED testing/test_runner.py:90: TestExecutionNonForked.test_skip_in_setup_function FAILED testing/test_runner.py:90: TestExecutionNonForked.test_skip_in_setup_function ERROR testing/test_runner.py:109: TestExecutionNonForked.test_failure_in_setup_function ERROR testing/test_runner.py:109: TestExecutionNonForked.test_failure_in_setup_function ERROR testing/test_runner.py:124: TestExecutionNonForked.test_failure_in_teardown_function ERROR testing/test_runner.py:124: TestExecutionNonForked.test_failure_in_teardown_function ERROR testing/test_runner.py:142: 
TestExecutionNonForked.test_custom_failure_repr ERROR testing/test_runner.py:142: TestExecutionNonForked.test_custom_failure_repr ERROR testing/test_runner.py:163: TestExecutionNonForked.test_failure_in_setup_function_ignores_custom_repr ERROR testing/test_runner.py:163: TestExecutionNonForked.test_failure_in_setup_function_ignores_custom_repr ERROR testing/test_runner.py:187: TestExecutionNonForked.test_systemexit_does_not_bail_out ERROR testing/test_runner.py:187: TestExecutionNonForked.test_systemexit_does_not_bail_out ERROR testing/test_runner.py:199: TestExecutionNonForked.test_exit_propagates ERROR testing/test_runner.py:199: TestExecutionNonForked.test_exit_propagates ERROR testing/test_runner.py:217: TestExecutionNonForked.test_keyboardinterrupt_propagates ERROR testing/test_runner.py:217: TestExecutionNonForked.test_keyboardinterrupt_propagates ERROR testing/test_runner.py:48: TestExecutionForked.test_passfunction ERROR testing/test_runner.py:48: TestExecutionForked.test_passfunction ERROR testing/test_runner.py:59: TestExecutionForked.test_failfunction ERROR testing/test_runner.py:59: TestExecutionForked.test_failfunction ERROR testing/test_runner.py:72: TestExecutionForked.test_skipfunction ERROR testing/test_runner.py:72: TestExecutionForked.test_skipfunction ERROR testing/test_runner.py:90: TestExecutionForked.test_skip_in_setup_function ERROR testing/test_runner.py:90: TestExecutionForked.test_skip_in_setup_function ERROR testing/test_runner.py:109: TestExecutionForked.test_failure_in_setup_function ERROR testing/test_runner.py:109: TestExecutionForked.test_failure_in_setup_function ERROR testing/test_runner.py:124: TestExecutionForked.test_failure_in_teardown_function ERROR testing/test_runner.py:124: TestExecutionForked.test_failure_in_teardown_function ERROR testing/test_runner.py:142: TestExecutionForked.test_custom_failure_repr ERROR testing/test_runner.py:142: TestExecutionForked.test_custom_failure_repr ERROR testing/test_runner.py:163: 
TestExecutionForked.test_failure_in_setup_function_ignores_custom_repr ERROR testing/test_runner.py:163: TestExecutionForked.test_failure_in_setup_function_ignores_custom_repr ERROR testing/test_runner.py:187: TestExecutionForked.test_systemexit_does_not_bail_out ERROR testing/test_runner.py:187: TestExecutionForked.test_systemexit_does_not_bail_out ERROR testing/test_runner.py:199: TestExecutionForked.test_exit_propagates ERROR testing/test_runner.py:199: TestExecutionForked.test_exit_propagates ERROR testing/test_runner.py:236: TestExecutionForked.test_suicide ERROR testing/test_runner.py:236: TestExecutionForked.test_suicide ERROR testing/test_runner.py:247: TestSessionReports.test_collect_result ERROR testing/test_runner.py:247: TestSessionReports.test_collect_result ERROR testing/test_runner.py:267: TestSessionReports.test_skip_at_module_scope ERROR testing/test_runner.py:267: TestSessionReports.test_skip_at_module_scope ERROR testing/test_runner.py:279: test_callinfo ERROR testing/test_runner.py:279: test_callinfo ERROR testing/test_runner.py:292: test_runtest_in_module_ordering XPASS testing/test_runner.py:292: test_runtest_in_module_ordering XPASS testing/test_runner.py:322: test_pytest_exit ERROR testing/test_runner.py:322: test_pytest_exit ERROR testing/test_runner.py:329: test_pytest_fail ERROR testing/test_runner.py:329: test_pytest_fail ERROR testing/test_runner.py:337: test_pytest_fail_notrace ERROR testing/test_runner.py:337: test_pytest_fail_notrace ERROR testing/test_runner.py:352: test_exception_printing_skip ERROR testing/test_runner.py:352: test_exception_printing_skip ERROR testing/test_runner.py:360: test_importorskip ERROR testing/test_runner.py:360: test_importorskip ERROR testing/test_runner.py:386: test_importorskip_imports_last_module_part ERROR testing/test_runner.py:386: test_importorskip_imports_last_module_part ERROR testing/test_runner.py:392: test_pytest_cmdline_main ERROR testing/test_runner.py:392: test_pytest_cmdline_main ERROR 
testing/test_runner_xunit.py:5: test_module_and_function_setup ERROR testing/test_runner_xunit.py:5: test_module_and_function_setup ERROR testing/test_runner_xunit.py:35: test_class_setup ERROR testing/test_runner_xunit.py:35: test_class_setup ERROR testing/test_runner_xunit.py:59: test_method_setup ERROR testing/test_runner_xunit.py:59: test_method_setup ERROR testing/test_runner_xunit.py:75: test_method_generator_setup ERROR testing/test_runner_xunit.py:75: test_method_generator_setup ERROR testing/test_runner_xunit.py:97: test_func_generator_setup FAILED testing/test_runner_xunit.py:125: test_method_setup_uses_fresh_instances FAILED testing/test_runner_xunit.py:137: test_failing_setup_calls_teardown FAILED testing/test_runner_xunit.py:137: test_failing_setup_calls_teardown ERROR testing/test_runner_xunit.py:153: test_setup_that_skips_calledagain_and_teardown ERROR testing/test_runner_xunit.py:153: test_setup_that_skips_calledagain_and_teardown ERROR testing/test_runner_xunit.py:171: test_setup_fails_again_on_all_tests FAILED testing/test_runner_xunit.py:171: test_setup_fails_again_on_all_tests ERROR testing/test_runner_xunit.py:189: test_setup_funcarg_setup_when_outer_scope_fails ERROR testing/test_runner_xunit.py:189: test_setup_funcarg_setup_when_outer_scope_fails ERROR testing/test_session.py:4: TestNewSession.test_basic_testitem_events ERROR testing/test_session.py:4: TestNewSession.test_basic_testitem_events ERROR testing/test_session.py:31: TestNewSession.test_nested_import_error FAILED testing/test_session.py:46: TestNewSession.test_raises_output FAILED testing/test_session.py:59: TestNewSession.test_generator_yields_None FAILED testing/test_session.py:69: TestNewSession.test_syntax_error_module FAILED testing/test_session.py:76: TestNewSession.test_exit_first_problem FAILED testing/test_session.py:85: TestNewSession.test_maxfail FAILED testing/test_session.py:95: TestNewSession.test_broken_repr FAILED testing/test_session.py:119: 
TestNewSession.test_skip_file_by_conftest PASSED testing/test_session.py:137: TestNewSession.test_order_of_execution FAILED testing/test_session.py:162: TestNewSession.test_collect_only_with_various_situations PASSED testing/test_session.py:196: TestNewSession.test_minus_x_import_error PASSED testing/test_session.py:205: test_plugin_specify PASSED testing/test_session.py:214: test_plugin_already_exists FAILED testing/test_session.py:219: test_exclude FAILED testing/test_session.py:219: test_exclude ERROR testing/test_skipping.py:9: TestEvaluator.test_no_marker FAILED testing/test_skipping.py:9: TestEvaluator.test_no_marker ERROR testing/test_skipping.py:15: TestEvaluator.test_marked_no_args ERROR testing/test_skipping.py:15: TestEvaluator.test_marked_no_args ERROR testing/test_skipping.py:29: TestEvaluator.test_marked_one_arg ERROR testing/test_skipping.py:29: TestEvaluator.test_marked_one_arg ERROR testing/test_skipping.py:42: TestEvaluator.test_marked_one_arg_with_reason ERROR testing/test_skipping.py:42: TestEvaluator.test_marked_one_arg_with_reason ERROR testing/test_skipping.py:56: TestEvaluator.test_marked_one_arg_twice ERROR testing/test_skipping.py:56: TestEvaluator.test_marked_one_arg_twice ERROR testing/test_skipping.py:75: TestEvaluator.test_marked_one_arg_twice2 ERROR testing/test_skipping.py:75: TestEvaluator.test_marked_one_arg_twice2 ERROR testing/test_skipping.py:89: TestEvaluator.test_skipif_class ERROR testing/test_skipping.py:89: TestEvaluator.test_skipif_class ERROR testing/test_skipping.py:105: TestXFail.test_xfail_simple ERROR testing/test_skipping.py:105: TestXFail.test_xfail_simple ERROR testing/test_skipping.py:119: TestXFail.test_xfail_xpassed ERROR testing/test_skipping.py:119: TestXFail.test_xfail_xpassed ERROR testing/test_skipping.py:133: TestXFail.test_xfail_run_anyway ERROR testing/test_skipping.py:133: TestXFail.test_xfail_run_anyway ERROR testing/test_skipping.py:148: TestXFail.test_xfail_evalfalse_but_fails ERROR 
testing/test_skipping.py:148: TestXFail.test_xfail_evalfalse_but_fails ERROR testing/test_skipping.py:160: TestXFail.test_xfail_not_report_default ERROR testing/test_skipping.py:160: TestXFail.test_xfail_not_report_default ERROR testing/test_skipping.py:172: TestXFail.test_xfail_not_run_xfail_reporting ERROR testing/test_skipping.py:172: TestXFail.test_xfail_not_run_xfail_reporting ERROR testing/test_skipping.py:194: TestXFail.test_xfail_not_run_no_setup_run ERROR testing/test_skipping.py:194: TestXFail.test_xfail_not_run_no_setup_run ERROR testing/test_skipping.py:210: TestXFail.test_xfail_xpass ERROR testing/test_skipping.py:210: TestXFail.test_xfail_xpass ERROR testing/test_skipping.py:224: TestXFail.test_xfail_imperative ERROR testing/test_skipping.py:224: TestXFail.test_xfail_imperative ERROR testing/test_skipping.py:245: TestXFail.test_xfail_imperative_in_setup_function ERROR testing/test_skipping.py:245: TestXFail.test_xfail_imperative_in_setup_function ERROR testing/test_skipping.py:285: TestXFail.test_dynamic_xfail_no_run ERROR testing/test_skipping.py:285: TestXFail.test_dynamic_xfail_no_run ERROR testing/test_skipping.py:299: TestXFail.test_dynamic_xfail_set_during_funcarg_setup ERROR testing/test_skipping.py:299: TestXFail.test_dynamic_xfail_set_during_funcarg_setup ERROR testing/test_skipping.py:313: TestXFailwithSetupTeardown.test_failing_setup_issue9 ERROR testing/test_skipping.py:313: TestXFailwithSetupTeardown.test_failing_setup_issue9 ERROR testing/test_skipping.py:328: TestXFailwithSetupTeardown.test_failing_teardown_issue9 ERROR testing/test_skipping.py:328: TestXFailwithSetupTeardown.test_failing_teardown_issue9 ERROR testing/test_skipping.py:345: TestSkipif.test_skipif_conditional ERROR testing/test_skipping.py:345: TestSkipif.test_skipif_conditional ERROR testing/test_skipping.py:356: TestSkipif.test_skipif_reporting ERROR testing/test_skipping.py:356: TestSkipif.test_skipif_reporting ERROR testing/test_skipping.py:370: 
test_skip_not_report_default ERROR testing/test_skipping.py:370: test_skip_not_report_default ERROR testing/test_skipping.py:383: test_skipif_class ERROR testing/test_skipping.py:383: test_skipif_class ERROR testing/test_skipping.py:400: test_skip_reasons_folding ERROR testing/test_skipping.py:400: test_skip_reasons_folding ERROR testing/test_skipping.py:425: test_skipped_reasons_functional ERROR testing/test_skipping.py:425: test_skipped_reasons_functional ERROR testing/test_skipping.py:453: test_reportchars ERROR testing/test_skipping.py:453: test_reportchars ERROR testing/test_skipping.py:475: test_reportchars_error ERROR testing/test_skipping.py:475: test_reportchars_error ERROR testing/test_skipping.py:490: test_errors_in_xfail_skip_expressions XPASS testing/test_skipping.py:490: test_errors_in_xfail_skip_expressions XPASS testing/test_skipping.py:521: test_xfail_skipif_with_globals ERROR testing/test_skipping.py:521: test_xfail_skipif_with_globals ERROR testing/test_skipping.py:539: test_direct_gives_error ERROR testing/test_skipping.py:539: test_direct_gives_error ERROR testing/test_terminal.py:41: TestTerminal.test_pass_skip_fail[default] FAILED testing/test_terminal.py:41: TestTerminal.test_pass_skip_fail[default] ERROR testing/test_terminal.py:41: TestTerminal.test_pass_skip_fail[verbose] ERROR testing/test_terminal.py:41: TestTerminal.test_pass_skip_fail[verbose] ERROR testing/test_terminal.py:41: TestTerminal.test_pass_skip_fail[quiet] ERROR testing/test_terminal.py:41: TestTerminal.test_pass_skip_fail[quiet] ERROR testing/test_terminal.py:41: TestTerminal.test_pass_skip_fail[fulltrace] ERROR testing/test_terminal.py:41: TestTerminal.test_pass_skip_fail[fulltrace] ERROR testing/test_terminal.py:68: TestTerminal.test_internalerror FAILED testing/test_terminal.py:77: TestTerminal.test_writeline FAILED testing/test_terminal.py:77: TestTerminal.test_writeline ERROR testing/test_terminal.py:88: TestTerminal.test_show_runtest_logstart ERROR 
testing/test_terminal.py:88: TestTerminal.test_show_runtest_logstart ERROR testing/test_terminal.py:99: TestTerminal.test_runtest_location_shown_before_test_starts ERROR testing/test_terminal.py:99: TestTerminal.test_runtest_location_shown_before_test_starts ERROR testing/test_terminal.py:110: TestTerminal.test_itemreport_subclasses_show_subclassed_file ERROR testing/test_terminal.py:110: TestTerminal.test_itemreport_subclasses_show_subclassed_file ERROR testing/test_terminal.py:133: TestTerminal.test_itemreport_directclasses_not_shown_as_subclasses ERROR testing/test_terminal.py:133: TestTerminal.test_itemreport_directclasses_not_shown_as_subclasses ERROR testing/test_terminal.py:147: TestTerminal.test_keyboard_interrupt[default] ERROR testing/test_terminal.py:147: TestTerminal.test_keyboard_interrupt[default] ERROR testing/test_terminal.py:147: TestTerminal.test_keyboard_interrupt[verbose] ERROR testing/test_terminal.py:147: TestTerminal.test_keyboard_interrupt[verbose] ERROR testing/test_terminal.py:147: TestTerminal.test_keyboard_interrupt[quiet] ERROR testing/test_terminal.py:147: TestTerminal.test_keyboard_interrupt[quiet] ERROR testing/test_terminal.py:147: TestTerminal.test_keyboard_interrupt[fulltrace] ERROR testing/test_terminal.py:147: TestTerminal.test_keyboard_interrupt[fulltrace] ERROR testing/test_terminal.py:170: TestTerminal.test_keyboard_in_sessionstart ERROR testing/test_terminal.py:170: TestTerminal.test_keyboard_in_sessionstart ERROR testing/test_terminal.py:186: TestCollectonly.test_collectonly_basic ERROR testing/test_terminal.py:186: TestCollectonly.test_collectonly_basic ERROR testing/test_terminal.py:197: TestCollectonly.test_collectonly_skipped_module ERROR testing/test_terminal.py:197: TestCollectonly.test_collectonly_skipped_module ERROR testing/test_terminal.py:208: TestCollectonly.test_collectonly_failed_module ERROR testing/test_terminal.py:208: TestCollectonly.test_collectonly_failed_module ERROR testing/test_terminal.py:216: 
TestCollectonly.test_collectonly_fatal ERROR testing/test_terminal.py:216: TestCollectonly.test_collectonly_fatal ERROR testing/test_terminal.py:227: TestCollectonly.test_collectonly_simple ERROR testing/test_terminal.py:227: TestCollectonly.test_collectonly_simple ERROR testing/test_terminal.py:247: TestCollectonly.test_collectonly_error ERROR testing/test_terminal.py:247: TestCollectonly.test_collectonly_error ERROR testing/test_terminal.py:260: test_repr_python_version ERROR testing/test_terminal.py:260: test_repr_python_version ERROR testing/test_terminal.py:270: TestFixtureReporting.test_setup_fixture_error ERROR testing/test_terminal.py:270: TestFixtureReporting.test_setup_fixture_error ERROR testing/test_terminal.py:288: TestFixtureReporting.test_teardown_fixture_error ERROR testing/test_terminal.py:288: TestFixtureReporting.test_teardown_fixture_error ERROR testing/test_terminal.py:306: TestFixtureReporting.test_teardown_fixture_error_and_test_failure ERROR testing/test_terminal.py:306: TestFixtureReporting.test_teardown_fixture_error_and_test_failure ERROR testing/test_terminal.py:330: TestTerminalFunctional.test_deselected ERROR testing/test_terminal.py:330: TestTerminalFunctional.test_deselected ERROR testing/test_terminal.py:347: TestTerminalFunctional.test_no_skip_summary_if_failure ERROR testing/test_terminal.py:347: TestTerminalFunctional.test_no_skip_summary_if_failure ERROR testing/test_terminal.py:361: TestTerminalFunctional.test_passes ERROR testing/test_terminal.py:361: TestTerminalFunctional.test_passes ERROR testing/test_terminal.py:380: TestTerminalFunctional.test_header_trailer_info ERROR testing/test_terminal.py:380: TestTerminalFunctional.test_header_trailer_info ERROR testing/test_terminal.py:395: TestTerminalFunctional.test_showlocals ERROR testing/test_terminal.py:395: TestTerminalFunctional.test_showlocals ERROR testing/test_terminal.py:409: TestTerminalFunctional.test_verbose_reporting ERROR testing/test_terminal.py:409: 
TestTerminalFunctional.test_verbose_reporting ERROR testing/test_terminal.py:440: TestTerminalFunctional.test_quiet_reporting ERROR testing/test_terminal.py:440: TestTerminalFunctional.test_quiet_reporting ERROR testing/test_terminal.py:448: test_fail_extra_reporting ERROR testing/test_terminal.py:448: test_fail_extra_reporting ERROR testing/test_terminal.py:458: test_fail_reporting_on_pass ERROR testing/test_terminal.py:458: test_fail_reporting_on_pass ERROR testing/test_terminal.py:463: test_getreportopt ERROR testing/test_terminal.py:463: test_getreportopt ERROR testing/test_terminal.py:483: test_terminalreporter_reportopt_addopts ERROR testing/test_terminal.py:483: test_terminalreporter_reportopt_addopts ERROR testing/test_terminal.py:498: test_tbstyle_short ERROR testing/test_terminal.py:498: test_tbstyle_short ERROR testing/test_terminal.py:520: test_traceconfig FAILED testing/test_terminal.py:520: test_traceconfig ERROR testing/test_terminal.py:532: TestGenericReporting.test_collect_fail[default] ERROR testing/test_terminal.py:532: TestGenericReporting.test_collect_fail[default] ERROR testing/test_terminal.py:532: TestGenericReporting.test_collect_fail[verbose] FAILED testing/test_terminal.py:532: TestGenericReporting.test_collect_fail[verbose] ERROR testing/test_terminal.py:532: TestGenericReporting.test_collect_fail[quiet] ERROR testing/test_terminal.py:532: TestGenericReporting.test_collect_fail[quiet] ERROR testing/test_terminal.py:532: TestGenericReporting.test_collect_fail[fulltrace] ERROR testing/test_terminal.py:532: TestGenericReporting.test_collect_fail[fulltrace] ERROR testing/test_terminal.py:541: TestGenericReporting.test_maxfailures[default] FAILED testing/test_terminal.py:541: TestGenericReporting.test_maxfailures[default] ERROR testing/test_terminal.py:541: TestGenericReporting.test_maxfailures[verbose] ERROR testing/test_terminal.py:541: TestGenericReporting.test_maxfailures[verbose] ERROR testing/test_terminal.py:541: 
TestGenericReporting.test_maxfailures[quiet] ERROR testing/test_terminal.py:541: TestGenericReporting.test_maxfailures[quiet] ERROR testing/test_terminal.py:541: TestGenericReporting.test_maxfailures[fulltrace] FAILED testing/test_terminal.py:541: TestGenericReporting.test_maxfailures[fulltrace] ERROR testing/test_terminal.py:559: TestGenericReporting.test_tb_option[default] ERROR testing/test_terminal.py:559: TestGenericReporting.test_tb_option[default] ERROR testing/test_terminal.py:559: TestGenericReporting.test_tb_option[verbose] ERROR testing/test_terminal.py:559: TestGenericReporting.test_tb_option[verbose] ERROR testing/test_terminal.py:559: TestGenericReporting.test_tb_option[quiet] FAILED testing/test_terminal.py:559: TestGenericReporting.test_tb_option[quiet] ERROR testing/test_terminal.py:559: TestGenericReporting.test_tb_option[fulltrace] ERROR testing/test_terminal.py:559: TestGenericReporting.test_tb_option[fulltrace] ERROR testing/test_terminal.py:584: TestGenericReporting.test_tb_crashline[default] ERROR testing/test_terminal.py:584: TestGenericReporting.test_tb_crashline[default] ERROR testing/test_terminal.py:584: TestGenericReporting.test_tb_crashline[verbose] FAILED testing/test_terminal.py:584: TestGenericReporting.test_tb_crashline[verbose] ERROR testing/test_terminal.py:584: TestGenericReporting.test_tb_crashline[quiet] ERROR testing/test_terminal.py:584: TestGenericReporting.test_tb_crashline[quiet] ERROR testing/test_terminal.py:584: TestGenericReporting.test_tb_crashline[fulltrace] ERROR testing/test_terminal.py:584: TestGenericReporting.test_tb_crashline[fulltrace] ERROR testing/test_terminal.py:604: TestGenericReporting.test_pytest_report_header[default] FAILED testing/test_terminal.py:604: TestGenericReporting.test_pytest_report_header[default] ERROR testing/test_terminal.py:604: TestGenericReporting.test_pytest_report_header[verbose] ERROR testing/test_terminal.py:604: TestGenericReporting.test_pytest_report_header[verbose] ERROR 
testing/test_terminal.py:604: TestGenericReporting.test_pytest_report_header[quiet] ERROR testing/test_terminal.py:604: TestGenericReporting.test_pytest_report_header[quiet] ERROR testing/test_terminal.py:604: TestGenericReporting.test_pytest_report_header[fulltrace] FAILED testing/test_terminal.py:604: TestGenericReporting.test_pytest_report_header[fulltrace] ERROR testing/test_terminal.py:619: test_fdopen_kept_alive_issue124 XPASS testing/test_terminal.py:619: test_fdopen_kept_alive_issue124 XPASS testing/test_tmpdir.py:6: test_funcarg ERROR testing/test_tmpdir.py:6: test_funcarg ERROR testing/test_tmpdir.py:23: test_ensuretemp PASSED testing/test_tmpdir.py:31: TestTempdirHandler.test_mktemp PASSED testing/test_tmpdir.py:44: TestConfigTmpdir.test_getbasetemp_custom_removes_old FAILED testing/test_tmpdir.py:57: TestConfigTmpdir.test_reparse FAILED testing/test_tmpdir.py:57: TestConfigTmpdir.test_reparse ERROR testing/test_tmpdir.py:64: TestConfigTmpdir.test_reparse_filename_too_long ERROR testing/test_tmpdir.py:64: TestConfigTmpdir.test_reparse_filename_too_long ERROR testing/test_tmpdir.py:68: test_basetemp ERROR testing/test_tmpdir.py:68: test_basetemp ERROR testing/test_unittest.py:3: test_simple_unittest ERROR testing/test_unittest.py:3: test_simple_unittest ERROR testing/test_unittest.py:17: test_isclasscheck_issue53 FAILED testing/test_unittest.py:17: test_isclasscheck_issue53 ERROR testing/test_unittest.py:28: test_setup ERROR testing/test_unittest.py:28: test_setup ERROR testing/test_unittest.py:48: test_new_instances ERROR testing/test_unittest.py:48: test_new_instances ERROR testing/test_unittest.py:60: test_teardown ERROR testing/test_unittest.py:60: test_teardown ERROR testing/test_unittest.py:79: test_method_and_teardown_failing_reporting FAILED testing/test_unittest.py:79: test_method_and_teardown_failing_reporting ERROR testing/test_unittest.py:98: test_setup_failure_is_shown ERROR testing/test_unittest.py:98: test_setup_failure_is_shown ERROR 
testing/test_unittest.py:118: test_setup_setUpClass ERROR testing/test_unittest.py:118: test_setup_setUpClass ERROR testing/test_unittest.py:140: test_setup_class FAILED testing/test_unittest.py:161: test_testcase_adderrorandfailure_defers[0] FAILED testing/test_unittest.py:161: test_testcase_adderrorandfailure_defers[0] ERROR testing/test_unittest.py:161: test_testcase_adderrorandfailure_defers[1] ERROR testing/test_unittest.py:161: test_testcase_adderrorandfailure_defers[1] ERROR testing/test_unittest.py:181: test_testcase_custom_exception_info[0] FAILED testing/test_unittest.py:181: test_testcase_custom_exception_info[0] ERROR testing/test_unittest.py:181: test_testcase_custom_exception_info[1] ERROR testing/test_unittest.py:181: test_testcase_custom_exception_info[1] ERROR testing/test_unittest.py:211: test_testcase_totally_incompatible_exception_info ERROR testing/test_unittest.py:211: test_testcase_totally_incompatible_exception_info ERROR testing/test_unittest.py:222: test_module_level_pytestmark FAILED testing/test_unittest.py:239: TestTrialUnittest.test_trial_exceptions_with_skips SKIPPED testing/test_unittest.py:285: TestTrialUnittest.test_trial_error SKIPPED testing/test_unittest.py:335: TestTrialUnittest.test_trial_pdb SKIPPED testing/test_unittest.py:347: test_djangolike_testcase FAILED testing/test_unittest.py:347: test_djangolike_testcase ERROR testing/test_unittest.py:401: test_unittest_not_shown_in_traceback ERROR testing/test_unittest.py:401: test_unittest_not_shown_in_traceback ERROR

Traceback (most recent call last):
  File "pytest.py", line 11, in <module>
    raise SystemExit(main())
  File "/Users/tzuchien/pytest/_pytest/core.py", line 457, in main
    exitstatus = hook.pytest_cmdline_main(config=config)
  File "/Users/tzuchien/pytest/_pytest/core.py", line 411, in __call__
    return self._docall(methods, kwargs)
  File "/Users/tzuchien/pytest/_pytest/core.py", line 422, in _docall
    res = mc.execute()
  File "/Users/tzuchien/pytest/_pytest/core.py", line 340, in execute
    res = method(**kwargs)
  File "/Users/tzuchien/pytest/_pytest/main.py", line 90, in pytest_cmdline_main
    return wrap_session(config, _main)
  File "/Users/tzuchien/pytest/_pytest/main.py", line 84, in wrap_session
    exitstatus=session.exitstatus)
  File "/Users/tzuchien/pytest/_pytest/core.py", line 411, in __call__
    return self._docall(methods, kwargs)
  File "/Users/tzuchien/pytest/_pytest/core.py", line 422, in _docall
    res = mc.execute()
  File "/Users/tzuchien/pytest/_pytest/core.py", line 340, in execute
    res = method(**kwargs)
  File "/Users/tzuchien/pytest/_pytest/terminal.py", line 313, in pytest_sessionfinish
    __multicall__.execute()
  File "/Users/tzuchien/pytest/_pytest/core.py", line 340, in execute
    res = method(**kwargs)
  File "/Users/tzuchien/pytest/_pytest/runner.py", line 22, in pytest_sessionfinish
    rep = hook.pytest__teardown_final(session=session)
  File "/Users/tzuchien/pytest/_pytest/core.py", line 411, in __call__
    return self._docall(methods, kwargs)
  File "/Users/tzuchien/pytest/_pytest/core.py", line 422, in _docall
    res = mc.execute()
  File "/Users/tzuchien/pytest/_pytest/core.py", line 340, in execute
    res = method(**kwargs)
  File "/Users/tzuchien/pytest/_pytest/capture.py", line 166, in pytest__teardown_final
    self.resumecapture(method)
  File "/Users/tzuchien/pytest/_pytest/capture.py", line 101, in resumecapture
    cap.resume()
  File "/Library/Python/2.6/site-packages/py-1.4.5-py2.6.egg/py/_io/capture.py", line 237, in resume
    self.startall()
  File "/Library/Python/2.6/site-packages/py-1.4.5-py2.6.egg/py/_io/capture.py", line 229, in startall
    self.in_.start()
  File "/Library/Python/2.6/site-packages/py-1.4.5-py2.6.egg/py/_io/capture.py", line 59, in start
    fd = os.open(devnullpath, os.O_RDONLY)
OSError: [Errno 24] Too many open files: '/dev/null'
~/pytest $

From pere.martir4 at gmail.com  Thu Oct 27 13:54:41 2011
From: pere.martir4 at gmail.com (Pere Martir)
Date: Thu, 27 Oct 2011 13:54:41 +0200
Subject: [py-dev] Status of py.test's unit tests now ?
In-Reply-To: <20111027111242.GC27936@merlinux.eu>
References: <20111026070756.GB27936@merlinux.eu>
	<20111027111242.GC27936@merlinux.eu>
Message-ID:

By the way, where is the CI server please ?

On Thu, Oct 27, 2011 at 1:12 PM, holger krekel
> Thanks Pere for your efforts.  I guess some more work testing with OSX
> is needed (which is unfortunately not part of the CI setup - anyone able
> to offer an OSX build host?).  However, one issue sticks out - could you
> meanwhile try changing the last line of _pytest/tmpdir.py to "return x"
> instead of "return x.realpath()" and see if this helps with some of
> the currently failing tests?

From Ronny.Pfannschmidt at gmx.de  Thu Oct 27 14:15:00 2011
From: Ronny.Pfannschmidt at gmx.de (Ronny Pfannschmidt)
Date: Thu, 27 Oct 2011 14:15:00 +0200
Subject: [py-dev] Status of py.test's unit tests now ?
In-Reply-To:
References: <20111026070756.GB27936@merlinux.eu>
	<20111027111242.GC27936@merlinux.eu>
Message-ID: <4EA94B44.50200@gmx.de>

On 10/27/2011 01:54 PM, Pere Martir wrote:
> By the way, where is the CI server please ?

http://hudson.testrun.org/job/pytest/

>
> On Thu, Oct 27, 2011 at 1:12 PM, holger krekel
>> Thanks Pere for your efforts.  I guess some more work testing with OSX
>> is needed (which is unfortunately not part of the CI setup - anyone able
>> to offer an OSX build host?).  However, one issue sticks out - could you
>> meanwhile try changing the last line of _pytest/tmpdir.py to "return x"
>> instead of "return x.realpath()" and see if this helps with some of
>> the currently failing tests?
> _______________________________________________
> py-dev mailing list
> py-dev at codespeak.net
> http://codespeak.net/mailman/listinfo/py-dev

-------------- next part --------------
A non-text attachment was scrubbed...
Name: signature.asc
Type: application/pgp-signature
Size: 900 bytes
Desc: OpenPGP digital signature
URL:

From pere.martir4 at gmail.com  Fri Oct 28 01:32:36 2011
From: pere.martir4 at gmail.com (Pere Martir)
Date: Fri, 28 Oct 2011 01:32:36 +0200
Subject: [py-dev] Status of py.test's unit tests now ?
In-Reply-To: <4EA94B44.50200@gmx.de>
References: <20111026070756.GB27936@merlinux.eu>
	<20111027111242.GC27936@merlinux.eu>
	<4EA94B44.50200@gmx.de>
Message-ID:

On Thu, Oct 27, 2011 at 2:15 PM, Ronny Pfannschmidt wrote:
> On 10/27/2011 01:54 PM, Pere Martir wrote:
>> By the way, where is the CI server please ?
> http://hudson.testrun.org/job/pytest/

It seems to be a problem specific to Mac OS X. Since there is no Mac
OS X slave, I suppose the unit tests on this platform have not
received much attention?

It's strange that if I execute only a subset of the unit tests with -k,
they don't fail. For example:

python pytest.py -k TestGenerator

But as you can see in the attachment of my previous post, they failed
in the full run. The same is true for many other test suites - any clue?

By the way, I fixed one problem. The failure of
TestSession.test_parsearg is because on Mac OS X /var is actually a
symbolic link to /private/var, as are many other paths under /. Patching
TmpDir to return the realpath fixes the problem. This doesn't fix many
of the other failures, though; "tox -e py26" still blocks forever at
test_pdb, for example.

I suspect the other failures are due to other bugs. Is it
worthwhile spending time looking at the code? Or is it more likely a
problem with my configuration/environment (not the source code)?

From holger at merlinux.eu  Fri Oct 28 23:08:03 2011
From: holger at merlinux.eu (holger krekel)
Date: Fri, 28 Oct 2011 21:08:03 +0000
Subject: [py-dev] Status of py.test's unit tests now ?
In-Reply-To:
References: <20111026070756.GB27936@merlinux.eu>
	<20111027111242.GC27936@merlinux.eu>
	<4EA94B44.50200@gmx.de>
Message-ID: <20111028210803.GF27936@merlinux.eu>

On Fri, Oct 28, 2011 at 01:32 +0200, Pere Martir wrote:
> On Thu, Oct 27, 2011 at 2:15 PM, Ronny Pfannschmidt
> wrote:
> > On 10/27/2011 01:54 PM, Pere Martir wrote:
> >> By the way, where is the CI server please ?
> > http://hudson.testrun.org/job/pytest/
>
> It seems to be a problem specific to Mac OS X. Since there is no Mac
> OS X slave, I suppose that the unit tests on this platform has not
> been paid much attention ?

right, that is the reason I asked for a build host in my last mail.

> It's strange that if I only executed a subset of unit tests with -k,
> they don't fail. For example:
>
> python pytest.py -k TestGenerator
>
> But as you can see in the attachment of my previous post, they failed.
> It's also true for many other test suites, any clue ?

on my OSX 10.8 all tests pass FWIW.

> By the way, I fixed a problem. The failure of
> TestSession.test_parsearg is because on Mac OS X /var is actually a
> symbolic link of /private/var, and many other path under /. Patching
> TmpDir to return realpath fixes the problem. This doesn't fix many
> failures. "tox -e py26" still blocks at test_pdb forever, for example.
>
> I feel like that the other failures are due to the other bugs. Is it
> worthwhile spending time looking at the code ? Or it's probably the
> problem of my configuration/environment (not the source code) ?

I don't know. Could you send a new "py.test testing" log with the
x.realpath() patch applied? Is there a way to log into your OSX
machine by chance?

thanks,
holger
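[Editor's note: the symlink behavior discussed in this thread can be illustrated with a small standalone script. This is not pytest code - it is a minimal sketch of why a path under /var (a symlink to /private/var on Mac OS X) compares unequal to its resolved form unless both sides go through os.path.realpath(); the concrete paths in the comments are just examples of a typical OS X layout.]

```python
import os
import tempfile

# On Mac OS X, tempfile often hands back a path under /var (or /var/folders),
# where /var is a symlink to /private/var. A naive string comparison between
# that path and its resolved form then fails, even though both name the same
# directory on disk - which is the class of failure Pere saw in
# TestSession.test_parsearg.
base = tempfile.mkdtemp()            # e.g. /var/folders/... on OS X
resolved = os.path.realpath(base)    # e.g. /private/var/folders/...

# realpath() is idempotent: resolving an already-resolved path is a no-op,
# so normalizing *both* sides makes comparisons stable.
assert os.path.realpath(resolved) == resolved

# Both names refer to the same directory on disk, even when the
# string values differ (on Linux they are usually identical strings).
assert os.path.samefile(base, resolved)
```

On a system without the symlink, `base` and `resolved` are simply equal, which is why the failure only showed up on OS X.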