This feels like something upstream, but I cannot really debug it (it only shows up on Windows). Maybe @pllim you can have a look at it next week and see whether we need to report it upstream?
py314-test-pytestdev: commands[1] D:\a\pytest-doctestplus\pytest-doctestplus\.tmp\py314-test-pytestdev> pytest D:\a\pytest-doctestplus\pytest-doctestplus/tests --ignore=D:\a\pytest-doctestplus\pytest-doctestplus/tests/docs/skip_some_remote_data.rst --doctest-plus --doctest-rst --remote-data
============================= test session starts =============================
platform win32 -- Python 3.14.0, pytest-8.5.0.dev201+gc97a40140, pluggy-1.6.0
cachedir: .tox\py314-test-pytestdev\.pytest_cache
rootdir: D:\a\pytest-doctestplus\pytest-doctestplus
configfile: setup.cfg
plugins: doctestplus-1.4.1.dev37+g2add52568, remotedata-0.4.1
collected 75 items
..\..\tests\docs\skip_all.rst s [ 1%]
..\..\tests\docs\skip_some.rst . [ 2%]
..\..\tests\python\doctests.py ... [ 6%]
..\..\tests\test_doctestplus.py ...................x.................... [ 60%]
............ [ 76%]
..\..\tests\test_encoding.py ........ssssssss [ 97%]
..\..\tests\test_utils.py .. [100%]
================== 65 passed, 9 skipped, 1 xfailed in 15.51s ==================
py314-test-pytestdev: exit 0 (16.21 seconds) D:\a\pytest-doctestplus\pytest-doctestplus\.tmp\py314-test-pytestdev> pytest D:\a\pytest-doctestplus\pytest-doctestplus/tests --ignore=D:\a\pytest-doctestplus\pytest-doctestplus/tests/docs/skip_some_remote_data.rst --doctest-plus --doctest-rst --remote-data pid=7040
py314-test-pytestdev: commands[2] D:\a\pytest-doctestplus\pytest-doctestplus\.tmp\py314-test-pytestdev> pytest D:\a\pytest-doctestplus\pytest-doctestplus/tests
============================= test session starts =============================
platform win32 -- Python 3.14.0, pytest-8.5.0.dev201+gc97a40140, pluggy-1.6.0
cachedir: .tox\py314-test-pytestdev\.pytest_cache
rootdir: D:\a\pytest-doctestplus\pytest-doctestplus
configfile: setup.cfg
plugins: doctestplus-1.4.1.dev37+g2add52568, remotedata-0.4.1
collected 70 items
..\..\tests\test_doctestplus.py ...................x...F................ [ 57%]
............ [ 74%]
..\..\tests\test_encoding.py ........ssssssss [ 97%]
..\..\tests\test_utils.py .. [100%]
================================== FAILURES ===================================
_____________________________ test_ignore_option ______________________________
testdir = <Testdir local('C:\\Users\\runneradmin\\AppData\\Local\\Temp\\pytest-of-unknown\\pytest-1\\test_ignore_option0')>

    def test_ignore_option(testdir):
        testdir.makepyfile(foo="""
            def f():
                '''
                >>> 1+1
                2
                '''
                pass
        """)
        testdir.makepyfile(bar="""
            def f():
                '''
                >>> 1+1
                2
                '''
                pass
        """)
        testdir.makefile('.rst', foo='>>> 1+1\n2')

config = <_pytest.config.Config object at 0x0000018A2069D090>

    def collect_unraisable(config: Config) -> None:
        pop_unraisable = config.stash[unraisable_exceptions].pop
        errors: list[pytest.PytestUnraisableExceptionWarning | RuntimeError] = []
        meta = None
        hook_error = None
        try:
            while True:
                try:
                    meta = pop_unraisable()
                except IndexError:
                    break
                if isinstance(meta, BaseException):
                    hook_error = RuntimeError("Failed to process unraisable exception")
                    hook_error.__cause__ = meta
                    errors.append(hook_error)
                    continue
                msg = meta.msg
                try:
>                   warnings.warn(pytest.PytestUnraisableExceptionWarning(msg))
E                   pytest.PytestUnraisableExceptionWarning: Exception ignored while finalizing file <_io.FileIO name='D:\\a\\pytest-doctestplus\\pytest-doctestplus\\.tox\\py314-test-pytestdev\\Scripts\\pytest.EXE' mode='rb' closefd=True>: None

D:\a\pytest-doctestplus\pytest-doctestplus\.tox\py314-test-pytestdev\Lib\site-packages\_pytest\unraisableexception.py:67: PytestUnraisableExceptionWarning
=========================== short test summary info ===========================
ERROR bar.py::bar.f - pytest.PytestUnraisableExceptionWarning: Exception ignored while finalizing file <_io.FileIO name='D:\\a\\pytest-doctestplus\\pytest-doctestplus\\.tox\\py314-test-pytestdev\\Scripts\\pytest.EXE' mode='rb' closefd=True>: None
========================= 2 passed, 1 error in 0.12s ==========================
=========================== short test summary info ===========================
FAILED ..\..\tests\test_doctestplus.py::test_ignore_option - AssertionError: ([<TestReport 'foo.py::foo.f' when='call' outcome='passed'>, <TestReport 'foo.rst::foo.rst' when='call' outcome='passed'>], [], [<TestReport 'bar.py::bar.f' when='setup' outcome='failed'>])
assert {'failed': 1,... 'skipped': 0} == {'failed': 0,... 'skipped': 0}
Omitting 1 identical items, use -vv to show
Differing items:
{'failed': 1} != {'failed': 0}
{'passed': 2} != {'passed': 3}
Full diff:
{
- 'failed': 0,
? ^
+ 'failed': 1,
? ^
- 'passed': 3,
? ^
+ 'passed': 2,
? ^
'skipped': 0,
}
============= 1 failed, 60 passed, 8 skipped, 1 xfailed in 8.64s ==============
py314-test-pytestdev: exit 1 (9.20 seconds) D:\a\pytest-doctestplus\pytest-doctestplus\.tmp\py314-test-pytestdev> pytest D:\a\pytest-doctestplus\pytest-doctestplus/tests pid=6964
py314-test-pytestdev: FAIL code 1 (70.95=setup[45.19]+cmd[0.35,16.21,9.20] seconds)
evaluation failed :( (71.03 seconds)
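For context on what pytest is reporting here: on CPython, an exception raised while an object is being finalized (e.g. in `__del__`) cannot propagate to any caller, so the interpreter routes it to `sys.unraisablehook`. Pytest's `unraisableexception` plugin installs such a hook and re-surfaces each captured entry as a `PytestUnraisableExceptionWarning`, which is what becomes the `ERROR bar.py::bar.f` above. A minimal sketch of that mechanism (illustrative names only, not pytest's actual plugin code):

```python
import sys

# Collect unraisable exceptions instead of letting CPython print them to
# stderr -- the same hook point pytest's unraisableexception plugin uses.
captured = []

def capture_unraisable(unraisable):
    # `unraisable` carries exc_type/exc_value/exc_traceback plus the object
    # whose finalization failed; record just what we need for inspection.
    captured.append((type(unraisable.exc_value).__name__, str(unraisable.exc_value)))

sys.unraisablehook = capture_unraisable

class Leaky:
    def __del__(self):
        # An exception raised during finalization cannot propagate, so
        # CPython reports it through sys.unraisablehook instead.
        raise RuntimeError("boom during finalization")

obj = Leaky()
del obj  # in CPython, refcount hits zero and __del__ runs immediately

print(captured)  # → [('RuntimeError', 'boom during finalization')]
```

If the underlying finalizer error turns out to be benign or purely upstream, pytest's `filterwarnings` ini option can ignore `pytest.PytestUnraisableExceptionWarning` as a stopgap, though that would mask whatever CPython 3.14 is actually doing with that `_io.FileIO` handle on Windows.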