>>> py3-pytest-benchmark: Building community/py3-pytest-benchmark 3.4.1-r1 (using abuild 3.10.0_rc1-r2) started Wed, 26 Oct 2022 17:26:45 +0000
>>> py3-pytest-benchmark: Checking sanity of /home/buildozer/aports/community/py3-pytest-benchmark/APKBUILD...
>>> py3-pytest-benchmark: Analyzing dependencies...
>>> py3-pytest-benchmark: Installing for build: build-base python3 py3-pytest py3-py-cpuinfo py3-setuptools py3-pytest-xdist py3-freezegun py3-pygal py3-elasticsearch
(1/29) Installing libbz2 (1.0.8-r3)
(2/29) Installing libffi (3.4.3-r0)
(3/29) Installing gdbm (1.23-r0)
(4/29) Installing xz-libs (5.2.7-r0)
(5/29) Installing mpdecimal (2.5.1-r1)
(6/29) Installing readline (8.2.0-r0)
(7/29) Installing sqlite-libs (3.39.4-r0)
(8/29) Installing python3 (3.10.8-r3)
(9/29) Installing py3-attrs (22.1.0-r0)
(10/29) Installing py3-iniconfig (1.1.1-r3)
(11/29) Installing py3-parsing (3.0.9-r0)
(12/29) Installing py3-packaging (21.3-r2)
(13/29) Installing py3-pluggy (1.0.0-r1)
(14/29) Installing py3-py (1.11.0-r0)
(15/29) Installing py3-tomli (2.0.1-r1)
(16/29) Installing py3-pytest (7.1.3-r1)
(17/29) Installing py3-py-cpuinfo (8.0.0-r0)
(18/29) Installing py3-setuptools (65.5.0-r0)
(19/29) Installing py3-apipkg (2.1.0-r0)
(20/29) Installing py3-execnet (1.9.0-r0)
(21/29) Installing py3-pytest-forked (1.4.0-r1)
(22/29) Installing py3-pytest-xdist (2.5.0-r1)
(23/29) Installing py3-six (1.16.0-r3)
(24/29) Installing py3-dateutil (2.8.2-r1)
(25/29) Installing py3-freezegun (1.2.2-r0)
(26/29) Installing py3-pygal (3.0.0-r1)
(27/29) Installing py3-urllib3 (1.26.12-r0)
(28/29) Installing py3-elasticsearch (7.11.0-r1)
(29/29) Installing .makedepends-py3-pytest-benchmark (20221026.172653)
Executing busybox-1.35.0-r27.trigger
OK: 339 MiB in 121 packages
>>> py3-pytest-benchmark: Cleaning up srcdir
>>> py3-pytest-benchmark: Cleaning up pkgdir
>>> py3-pytest-benchmark: Fetching https://distfiles.alpinelinux.org/distfiles/v3.17/pytest-benchmark-3.4.1.tar.gz
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0   146    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
curl: (22) The requested URL returned error: 404
>>> py3-pytest-benchmark: Fetching https://github.com/ionelmc/pytest-benchmark/archive/v3.4.1/pytest-benchmark-3.4.1.tar.gz
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  315k    0  315k    0     0   165k      0 --:--:--  0:00:01 --:--:--  330k
>>> py3-pytest-benchmark: Fetching https://distfiles.alpinelinux.org/distfiles/v3.17/pytest-benchmark-3.4.1.tar.gz
>>> py3-pytest-benchmark: Checking sha512sums...
pytest-benchmark-3.4.1.tar.gz: OK
python-3.10.patch: OK
>>> py3-pytest-benchmark: Unpacking /var/cache/distfiles/v3.17/pytest-benchmark-3.4.1.tar.gz...
>>> py3-pytest-benchmark: python-3.10.patch
patching file tests/test_cli.py
running build
running build_py
creating build
creating build/lib
creating build/lib/pytest_benchmark
copying src/pytest_benchmark/compat.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/utils.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/session.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/fixture.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/__main__.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/csv.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/stats.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/timers.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/logger.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/histogram.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/__init__.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/table.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/cli.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/hookspec.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/plugin.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/pep418.py -> build/lib/pytest_benchmark
creating build/lib/pytest_benchmark/storage
copying src/pytest_benchmark/storage/elasticsearch.py -> build/lib/pytest_benchmark/storage
copying src/pytest_benchmark/storage/file.py -> build/lib/pytest_benchmark/storage
copying src/pytest_benchmark/storage/__init__.py -> build/lib/pytest_benchmark/storage
running egg_info
creating src/pytest_benchmark.egg-info
writing src/pytest_benchmark.egg-info/PKG-INFO
writing dependency_links to src/pytest_benchmark.egg-info/dependency_links.txt
writing entry points to src/pytest_benchmark.egg-info/entry_points.txt
writing requirements to src/pytest_benchmark.egg-info/requires.txt
writing top-level names to src/pytest_benchmark.egg-info/top_level.txt
writing manifest file 'src/pytest_benchmark.egg-info/SOURCES.txt'
reading manifest file 'src/pytest_benchmark.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no previously-included files matching '*.py[cod]' found anywhere in distribution
warning: no previously-included files matching '__pycache__/*' found anywhere in distribution
warning: no previously-included files matching '*.so' found anywhere in distribution
warning: no previously-included files matching '*.dylib' found anywhere in distribution
adding license file 'LICENSE'
adding license file 'AUTHORS.rst'
writing manifest file 'src/pytest_benchmark.egg-info/SOURCES.txt'
running install
/usr/lib/python3.10/site-packages/setuptools/command/install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
  warnings.warn(
running build
running build_py
running egg_info
writing src/pytest_benchmark.egg-info/PKG-INFO
writing dependency_links to src/pytest_benchmark.egg-info/dependency_links.txt
writing entry points to src/pytest_benchmark.egg-info/entry_points.txt
writing requirements to src/pytest_benchmark.egg-info/requires.txt
writing top-level names to src/pytest_benchmark.egg-info/top_level.txt
reading manifest file 'src/pytest_benchmark.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no previously-included files matching '*.py[cod]' found anywhere in distribution
warning: no previously-included files matching '__pycache__/*' found anywhere in distribution
warning: no previously-included files matching '*.so' found anywhere in distribution
warning: no previously-included files matching '*.dylib' found anywhere in distribution
adding license file 'LICENSE'
adding license file 'AUTHORS.rst'
writing manifest file 'src/pytest_benchmark.egg-info/SOURCES.txt'
running install_lib
creating /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install
creating /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr
creating /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib
creating /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10
creating /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages
creating /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/compat.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/utils.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/session.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/fixture.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/__main__.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/csv.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/stats.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/timers.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/logger.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/histogram.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/__init__.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/table.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
creating /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/storage
copying build/lib/pytest_benchmark/storage/elasticsearch.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/storage
copying build/lib/pytest_benchmark/storage/file.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/storage
copying build/lib/pytest_benchmark/storage/__init__.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/storage
copying build/lib/pytest_benchmark/cli.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/hookspec.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/plugin.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/pep418.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/compat.py to compat.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/utils.py to utils.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/session.py to session.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/fixture.py to fixture.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/__main__.py to __main__.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/csv.py to csv.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/stats.py to stats.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/timers.py to timers.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/logger.py to logger.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/histogram.py to histogram.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/__init__.py to __init__.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/table.py to table.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/storage/elasticsearch.py to elasticsearch.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/storage/file.py to file.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/storage/__init__.py to __init__.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/cli.py to cli.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/hookspec.py to hookspec.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/plugin.py to plugin.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/pep418.py to pep418.cpython-310.pyc
running install_egg_info
Copying src/pytest_benchmark.egg-info to /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark-3.4.1-py3.10.egg-info
running install_scripts
Installing py.test-benchmark script to /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/bin
Installing pytest-benchmark script to /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/bin
============================= test session starts ==============================
platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1, configfile: setup.cfg, testpaths: tests
plugins: benchmark-3.4.1, forked-1.4.0, xdist-2.5.0
collecting ... collected 232 items / 10 deselected / 222 selected

tests/test_benchmark.py::test_help PASSED [ 0%]
tests/test_benchmark.py::test_groups PASSED [ 0%]
tests/test_benchmark.py::test_group_by_name PASSED [ 1%]
tests/test_benchmark.py::test_group_by_func PASSED [ 1%]
tests/test_benchmark.py::test_group_by_fullfunc PASSED [ 2%]
tests/test_benchmark.py::test_group_by_param_all PASSED [ 2%]
tests/test_benchmark.py::test_group_by_param_select PASSED [ 3%]
tests/test_benchmark.py::test_group_by_param_select_multiple PASSED [ 3%]
tests/test_benchmark.py::test_group_by_fullname PASSED [ 4%]
tests/test_benchmark.py::test_double_use PASSED [ 4%]
tests/test_benchmark.py::test_only_override_skip PASSED [ 4%]
tests/test_benchmark.py::test_fixtures_also_skipped PASSED [ 5%]
tests/test_benchmark.py::test_conflict_between_only_and_disable PASSED [ 5%]
tests/test_benchmark.py::test_max_time_min_rounds PASSED [ 6%]
tests/test_benchmark.py::test_max_time PASSED [ 6%]
tests/test_benchmark.py::test_bogus_max_time PASSED [ 7%]
tests/test_benchmark.py::test_pep418_timer PASSED [ 7%]
tests/test_benchmark.py::test_bad_save PASSED [ 8%]
tests/test_benchmark.py::test_bad_save_2 PASSED [ 8%]
tests/test_benchmark.py::test_bad_compare_fail PASSED [ 9%]
tests/test_benchmark.py::test_bad_rounds PASSED [ 9%]
tests/test_benchmark.py::test_bad_rounds_2 PASSED [ 9%]
tests/test_benchmark.py::test_compare PASSED [ 10%]
tests/test_benchmark.py::test_compare_last PASSED [ 10%]
tests/test_benchmark.py::test_compare_non_existing PASSED [ 11%]
tests/test_benchmark.py::test_compare_non_existing_verbose PASSED [ 11%]
tests/test_benchmark.py::test_compare_no_files PASSED [ 12%]
tests/test_benchmark.py::test_compare_no_files_verbose PASSED [ 12%]
tests/test_benchmark.py::test_compare_no_files_match PASSED [ 13%]
tests/test_benchmark.py::test_compare_no_files_match_verbose PASSED [ 13%]
tests/test_benchmark.py::test_verbose PASSED [ 13%]
tests/test_benchmark.py::test_save PASSED [ 14%]
tests/test_benchmark.py::test_save_extra_info PASSED [ 14%]
tests/test_benchmark.py::test_update_machine_info_hook_detection PASSED [ 15%]
tests/test_benchmark.py::test_histogram PASSED [ 15%]
tests/test_benchmark.py::test_autosave PASSED [ 16%]
tests/test_benchmark.py::test_bogus_min_time PASSED [ 16%]
tests/test_benchmark.py::test_disable_gc PASSED [ 17%]
tests/test_benchmark.py::test_custom_timer PASSED [ 17%]
tests/test_benchmark.py::test_bogus_timer PASSED [ 18%]
tests/test_benchmark.py::test_sort_by_mean PASSED [ 18%]
tests/test_benchmark.py::test_bogus_sort PASSED [ 18%]
tests/test_benchmark.py::test_xdist PASSED [ 19%]
tests/test_benchmark.py::test_xdist_verbose PASSED [ 19%]
tests/test_benchmark.py::test_cprofile PASSED [ 20%]
tests/test_benchmark.py::test_disabled_and_cprofile PASSED [ 20%]
tests/test_benchmark.py::test_abort_broken PASSED [ 21%]
tests/test_benchmark.py::test_basic FAILED [ 21%]
tests/test_benchmark.py::test_skip FAILED [ 22%]
tests/test_benchmark.py::test_disable FAILED [ 22%]
tests/test_benchmark.py::test_mark_selection PASSED [ 22%]
tests/test_benchmark.py::test_only_benchmarks FAILED [ 23%]
tests/test_benchmark.py::test_columns PASSED [ 23%]
tests/test_calibration.py::test_calibrate PASSED [ 24%]
tests/test_calibration.py::test_calibrate_fast PASSED [ 24%]
tests/test_calibration.py::test_calibrate_xfast PASSED [ 25%]
tests/test_calibration.py::test_calibrate_slow PASSED [ 25%]
tests/test_calibration.py::test_calibrate_stuck[True-0-1] PASSED [ 26%]
tests/test_calibration.py::test_calibrate_stuck[True-0-0.01] PASSED [ 26%]
tests/test_calibration.py::test_calibrate_stuck[True-0-1e-09] PASSED [ 27%]
tests/test_calibration.py::test_calibrate_stuck[True-0-1e-10] PASSED [ 27%]
tests/test_calibration.py::test_calibrate_stuck[True-0-1.000000000000001] PASSED [ 27%]
tests/test_calibration.py::test_calibrate_stuck[True-1-1] PASSED [ 28%]
tests/test_calibration.py::test_calibrate_stuck[True-1-0.01] PASSED [ 28%]
tests/test_calibration.py::test_calibrate_stuck[True-1-1e-09] PASSED [ 29%]
tests/test_calibration.py::test_calibrate_stuck[True-1-1e-10] PASSED [ 29%]
tests/test_calibration.py::test_calibrate_stuck[True-1-1.000000000000001] PASSED [ 30%]
tests/test_calibration.py::test_calibrate_stuck[True--1-1] PASSED [ 30%]
tests/test_calibration.py::test_calibrate_stuck[True--1-0.01] PASSED [ 31%]
tests/test_calibration.py::test_calibrate_stuck[True--1-1e-09] PASSED [ 31%]
tests/test_calibration.py::test_calibrate_stuck[True--1-1e-10] PASSED [ 31%]
tests/test_calibration.py::test_calibrate_stuck[True--1-1.000000000000001] PASSED [ 32%]
tests/test_calibration.py::test_calibrate_stuck[False-0-1] PASSED [ 32%]
tests/test_calibration.py::test_calibrate_stuck[False-0-0.01] PASSED [ 33%]
tests/test_calibration.py::test_calibrate_stuck[False-0-1e-09] PASSED [ 33%]
tests/test_calibration.py::test_calibrate_stuck[False-0-1e-10] PASSED [ 34%]
tests/test_calibration.py::test_calibrate_stuck[False-0-1.000000000000001] PASSED [ 34%]
tests/test_calibration.py::test_calibrate_stuck[False-1-1] PASSED [ 35%]
tests/test_calibration.py::test_calibrate_stuck[False-1-0.01] PASSED [ 35%]
tests/test_calibration.py::test_calibrate_stuck[False-1-1e-09] PASSED [ 36%]
tests/test_calibration.py::test_calibrate_stuck[False-1-1e-10] PASSED [ 36%]
tests/test_calibration.py::test_calibrate_stuck[False-1-1.000000000000001] PASSED [ 36%]
tests/test_calibration.py::test_calibrate_stuck[False--1-1] PASSED [ 37%]
tests/test_calibration.py::test_calibrate_stuck[False--1-0.01] PASSED [ 37%]
tests/test_calibration.py::test_calibrate_stuck[False--1-1e-09] PASSED [ 38%]
tests/test_calibration.py::test_calibrate_stuck[False--1-1e-10] PASSED [ 38%]
tests/test_calibration.py::test_calibrate_stuck[False--1-1.000000000000001] PASSED [ 39%]
tests/test_cli.py::test_list PASSED [ 39%]
tests/test_cli.py::test_compare[short-] PASSED [ 40%]
tests/test_cli.py::test_compare[long-] PASSED [ 40%]
tests/test_cli.py::test_compare[normal-] PASSED [ 40%]
tests/test_cli.py::test_compare[trial-] PASSED [ 41%]
tests/test_doctest.rst::test_doctest.rst PASSED [ 41%]
tests/test_elasticsearch_storage.py::test_handle_saving PASSED [ 42%]
tests/test_elasticsearch_storage.py::test_parse_with_no_creds PASSED [ 42%]
tests/test_elasticsearch_storage.py::test_parse_with_creds_in_first_host_of_url PASSED [ 43%]
tests/test_elasticsearch_storage.py::test_parse_with_creds_in_second_host_of_url PASSED [ 43%]
tests/test_elasticsearch_storage.py::test_parse_with_creds_in_netrc PASSED [ 44%]
tests/test_elasticsearch_storage.py::test_parse_url_creds_supersedes_netrc_creds PASSED [ 44%]
tests/test_elasticsearch_storage.py::test__mask_hosts PASSED [ 45%]
tests/test_normal.py::test_normal PASSED [ 45%]
tests/test_normal.py::test_fast PASSED [ 45%]
tests/test_normal.py::test_slow PASSED [ 46%]
tests/test_normal.py::test_slower PASSED [ 46%]
tests/test_normal.py::test_xfast PASSED [ 47%]
tests/test_normal.py::test_parametrized[0] PASSED [ 47%]
tests/test_normal.py::test_parametrized[1] PASSED [ 48%]
tests/test_normal.py::test_parametrized[2] PASSED [ 48%]
tests/test_normal.py::test_parametrized[3] PASSED [ 49%]
tests/test_normal.py::test_parametrized[4] PASSED [ 49%]
tests/test_pedantic.py::test_single PASSED [ 50%]
tests/test_pedantic.py::test_setup PASSED [ 50%]
tests/test_pedantic.py::test_setup_cprofile PASSED [ 50%]
tests/test_pedantic.py::test_args_kwargs PASSED [ 51%]
tests/test_pedantic.py::test_iterations PASSED [ 51%]
tests/test_pedantic.py::test_rounds_iterations PASSED [ 52%]
tests/test_pedantic.py::test_rounds PASSED [ 52%]
tests/test_pedantic.py::test_warmup_rounds PASSED [ 53%]
tests/test_pedantic.py::test_rounds_must_be_int[0] PASSED [ 53%]
tests/test_pedantic.py::test_rounds_must_be_int[x] PASSED [ 54%]
tests/test_pedantic.py::test_warmup_rounds_must_be_int[-15] PASSED [ 54%]
tests/test_pedantic.py::test_warmup_rounds_must_be_int[x] PASSED [ 54%]
tests/test_pedantic.py::test_setup_many_rounds PASSED [ 55%]
tests/test_pedantic.py::test_cant_use_both_args_and_setup_with_return PASSED [ 55%]
tests/test_pedantic.py::test_can_use_both_args_and_setup_without_return PASSED [ 56%]
tests/test_pedantic.py::test_cant_use_setup_with_many_iterations PASSED [ 56%]
tests/test_pedantic.py::test_iterations_must_be_positive_int[0] PASSED [ 57%]
tests/test_pedantic.py::test_iterations_must_be_positive_int[-1] PASSED [ 57%]
tests/test_pedantic.py::test_iterations_must_be_positive_int[asdf] PASSED [ 58%]
tests/test_sample.py::test_proto[SimpleProxy] PASSED [ 58%]
tests/test_sample.py::test_proto[CachedPropertyProxy] PASSED [ 59%]
tests/test_sample.py::test_proto[LocalsSimpleProxy] PASSED [ 59%]
tests/test_sample.py::test_proto[LocalsCachedPropertyProxy] PASSED [ 59%]
tests/test_skip.py::test_skip SKIPPED (bla) [ 60%]
tests/test_stats.py::test_1 PASSED [ 60%]
tests/test_stats.py::test_2 PASSED [ 61%]
tests/test_stats.py::test_single_item PASSED [ 61%]
tests/test_stats.py::test_length[1] PASSED [ 62%]
tests/test_stats.py::test_length[2] PASSED [ 62%]
tests/test_stats.py::test_length[3] PASSED [ 63%]
tests/test_stats.py::test_length[4] PASSED [ 63%]
tests/test_stats.py::test_length[5] PASSED [ 63%]
tests/test_stats.py::test_length[6] PASSED [ 64%]
tests/test_stats.py::test_length[7] PASSED [ 64%]
tests/test_stats.py::test_length[8] PASSED [ 65%]
tests/test_stats.py::test_length[9] PASSED [ 65%]
tests/test_stats.py::test_iqr PASSED [ 66%]
tests/test_stats.py::test_ops PASSED [ 66%]
tests/test_storage.py::test_rendering[short] PASSED [ 67%]
tests/test_storage.py::test_rendering[normal] PASSED [ 67%]
tests/test_storage.py::test_rendering[long] PASSED [ 68%]
tests/test_storage.py::test_rendering[trial] PASSED [ 68%]
tests/test_storage.py::test_regression_checks[short] PASSED [ 68%]
tests/test_storage.py::test_regression_checks[normal] PASSED [ 69%]
tests/test_storage.py::test_regression_checks[long] PASSED [ 69%]
tests/test_storage.py::test_regression_checks[trial] PASSED [ 70%]
tests/test_storage.py::test_regression_checks_inf[short] PASSED [ 70%]
tests/test_storage.py::test_regression_checks_inf[normal] PASSED [ 71%]
tests/test_storage.py::test_regression_checks_inf[long] PASSED [ 71%]
tests/test_storage.py::test_regression_checks_inf[trial] PASSED [ 72%]
tests/test_storage.py::test_compare_1[short] PASSED [ 72%]
tests/test_storage.py::test_compare_1[normal] PASSED [ 72%]
tests/test_storage.py::test_compare_1[long] PASSED [ 73%]
tests/test_storage.py::test_compare_1[trial] PASSED [ 73%]
tests/test_storage.py::test_compare_2[short] PASSED [ 74%]
tests/test_storage.py::test_compare_2[normal] PASSED [ 74%]
tests/test_storage.py::test_compare_2[long] PASSED [ 75%]
tests/test_storage.py::test_compare_2[trial] PASSED [ 75%]
tests/test_storage.py::test_save_json[short] PASSED [ 76%]
tests/test_storage.py::test_save_json[normal] PASSED [ 76%]
tests/test_storage.py::test_save_json[long] PASSED [ 77%]
tests/test_storage.py::test_save_json[trial] PASSED [ 77%]
tests/test_storage.py::test_save_with_name[short] PASSED [ 77%]
tests/test_storage.py::test_save_with_name[normal] PASSED [ 78%]
tests/test_storage.py::test_save_with_name[long] PASSED [ 78%]
tests/test_storage.py::test_save_with_name[trial] PASSED [ 79%]
tests/test_storage.py::test_save_no_name[short] PASSED [ 79%]
tests/test_storage.py::test_save_no_name[normal] PASSED [ 80%]
tests/test_storage.py::test_save_no_name[long] PASSED [ 80%]
tests/test_storage.py::test_save_no_name[trial] PASSED [ 81%]
tests/test_storage.py::test_save_with_error[short] PASSED [ 81%]
tests/test_storage.py::test_save_with_error[normal] PASSED [ 81%]
tests/test_storage.py::test_save_with_error[long] PASSED [ 82%]
tests/test_storage.py::test_save_with_error[trial] PASSED [ 82%]
tests/test_storage.py::test_autosave[short] PASSED [ 83%]
tests/test_storage.py::test_autosave[normal] PASSED [ 83%]
tests/test_storage.py::test_autosave[long] PASSED [ 84%]
tests/test_storage.py::test_autosave[trial] PASSED [ 84%]
tests/test_utils.py::test_clonefunc[] PASSED [ 85%]
tests/test_utils.py::test_clonefunc[f2] PASSED [ 85%]
tests/test_utils.py::test_clonefunc_not_function PASSED [ 86%]
tests/test_utils.py::test_get_commit_info[git-True] PASSED [ 86%]
tests/test_utils.py::test_get_commit_info[git-False] PASSED [ 86%]
tests/test_utils.py::test_get_commit_info[hg-True] SKIPPED (%r not a...) [ 87%]
tests/test_utils.py::test_get_commit_info[hg-False] SKIPPED (%r not ...) [ 87%]
tests/test_utils.py::test_missing_scm_bins[git-True] PASSED [ 88%]
tests/test_utils.py::test_missing_scm_bins[git-False] PASSED [ 88%]
tests/test_utils.py::test_missing_scm_bins[hg-True] SKIPPED (%r not ...) [ 89%]
tests/test_utils.py::test_missing_scm_bins[hg-False] SKIPPED (%r not...) [ 89%]
tests/test_utils.py::test_get_branch_info[git] PASSED [ 90%]
tests/test_utils.py::test_get_branch_info[hg] SKIPPED (%r not availa...) [ 90%]
tests/test_utils.py::test_no_branch_info PASSED [ 90%]
tests/test_utils.py::test_commit_info_error PASSED [ 91%]
tests/test_utils.py::test_parse_warmup PASSED [ 91%]
tests/test_utils.py::test_parse_columns PASSED [ 92%]
tests/test_utils.py::test_get_project_name[False-None] PASSED [ 92%]
tests/test_utils.py::test_get_project_name[False-git] PASSED [ 93%]
tests/test_utils.py::test_get_project_name[False-hg] SKIPPED (%r not...) [ 93%]
tests/test_utils.py::test_get_project_name[https://example.com/pytest_benchmark_repo-None] PASSED [ 94%]
tests/test_utils.py::test_get_project_name[https://example.com/pytest_benchmark_repo-git] PASSED [ 94%]
tests/test_utils.py::test_get_project_name[https://example.com/pytest_benchmark_repo-hg] SKIPPED [ 95%]
tests/test_utils.py::test_get_project_name[https://example.com/pytest_benchmark_repo.git-None] PASSED [ 95%]
tests/test_utils.py::test_get_project_name[https://example.com/pytest_benchmark_repo.git-git] PASSED [ 95%]
tests/test_utils.py::test_get_project_name[https://example.com/pytest_benchmark_repo.git-hg] SKIPPED [ 96%]
tests/test_utils.py::test_get_project_name[c:\\foo\\bar\\pytest_benchmark_repo.gitfoo@example.com:pytest_benchmark_repo.git-None] PASSED [ 96%]
tests/test_utils.py::test_get_project_name[c:\\foo\\bar\\pytest_benchmark_repo.gitfoo@example.com:pytest_benchmark_repo.git-git] PASSED [ 97%]
tests/test_utils.py::test_get_project_name[c:\\foo\\bar\\pytest_benchmark_repo.gitfoo@example.com:pytest_benchmark_repo.git-hg] SKIPPED [ 97%]
tests/test_utils.py::test_get_project_name_broken[git] PASSED [ 98%]
tests/test_utils.py::test_get_project_name_broken[hg] PASSED [ 98%]
tests/test_utils.py::test_get_project_name_fallback PASSED [ 99%]
tests/test_utils.py::test_get_project_name_fallback_broken_hgrc PASSED [ 99%]
tests/test_with_testcase.py::TerribleTerribleWayToWriteTests::test_foo PASSED [100%]

=================================== FAILURES ===================================
__________________________________ test_basic __________________________________
/home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/tests/test_benchmark.py:1028: in test_basic
    result.stdout.fnmatch_lines([
E   Failed: nomatch: '*collected 5 items'
E       and: '============================= test session starts =============================='
E       and: 'platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3'
E       and: 'cachedir: .pytest_cache'
E       and: 'benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)'
E       and: 'rootdir: /tmp/pytest-of-buildozer/pytest-226/test_basic0'
E       and: 'plugins: benchmark-3.4.1, forked-1.4.0, xdist-2.5.0'
E   fnmatch: '*collected 5 items'
E      with: 'collecting ... collected 5 items'
E   nomatch: 'test_basic.py::*test_basic PASSED*'
E       and: ''
E   fnmatch: 'test_basic.py::*test_basic PASSED*'
E      with: 'test_basic.py::test_basic PASSED [ 20%]'
E   nomatch: 'test_basic.py::test_slow PASSED*'
E       and: 'test_basic.py::test_fast PASSED [ 40%]'
E   fnmatch: 'test_basic.py::test_slow PASSED*'
E      with: 'test_basic.py::test_slow PASSED [ 60%]'
E   fnmatch: 'test_basic.py::test_slower PASSED*'
E      with: 'test_basic.py::test_slower PASSED [ 80%]'
E   fnmatch: 'test_basic.py::test_xfast PASSED*'
E      with: 'test_basic.py::test_xfast PASSED [100%]'
E   nomatch: 'test_basic.py::test_fast PASSED*'
E       and: ''
E       and: ''
E       and: '--------------------------------------------------------------------------------------------------------- benchmark: 4 tests --------------------------------------------------------------------------------------------------------'
E       and: 'Name (time in ns) Min Max Mean StdDev Median IQR Outliers OPS Rounds Iterations'
E       and: '-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------'
E       and: 'test_xfast 76.9994 (1.0) 3,717.8211 (1.0) 79.4478 (1.0) 22.9301 (1.0) 78.8039 (1.0) 0.6007 (1.29) 200;5896 12,586,882.8224 (1.0) 60097 200'
E       and: 'test_fast 159.7218 (2.07) 57,159.9230 (15.37) 196.5291 (2.47) 158.4659 (6.91) 199.7687 (2.54) 0.4657 (1.0) 29;16206 5,088,304.0594 (0.40) 137360 1'
E       and: 'test_slow 1,008,165.0689 (>1000.0) 1,106,884.7962 (297.72) 1,052,816.8227 (>1000.0) 4,406.4005 (192.17) 1,052,405.2195 (>1000.0) 2,199.7839 (>1000.0) 25;22 949.8328 (0.00) 934 1'
E       and: 'test_slower 10,012,168.9029 (>1000.0) 10,062,609.3335 (>1000.0) 10,055,252.2438 (>1000.0) 4,645.8381 (202.61) 10,055,029.0663 (>1000.0) 1,800.0137 (>1000.0) 4;5 99.4505 (0.00) 100 1'
E       and: '-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------'
E       and: ''
E       and: 'Legend:'
E       and: '  Outliers: 1 Standard Deviation from Mean; 1.5 IQR (InterQuartile Range) from 1st Quartile and 3rd Quartile.'
E       and: '  OPS: Operations Per Second, computed as 1 / Mean'
E       and: '============================== 5 passed in 4.61s ==============================='
E   remains unmatched: 'test_basic.py::test_fast PASSED*'
----------------------------- Captured stdout call -----------------------------
running: /usr/bin/python3 -mpytest --basetemp=/tmp/pytest-of-buildozer/pytest-226/test_basic0/runpytest-0 -vv --doctest-modules /tmp/pytest-of-buildozer/pytest-226/test_basic0/test_basic.py
     in: /tmp/pytest-of-buildozer/pytest-226/test_basic0
============================= test session starts ==============================
platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /tmp/pytest-of-buildozer/pytest-226/test_basic0
plugins: benchmark-3.4.1, forked-1.4.0, xdist-2.5.0
collecting ... collected 5 items

test_basic.py::test_basic PASSED [ 20%]
test_basic.py::test_fast PASSED [ 40%]
test_basic.py::test_slow PASSED [ 60%]
test_basic.py::test_slower PASSED [ 80%]
test_basic.py::test_xfast PASSED [100%]


--------------------------------------------------------------------------------------------------------- benchmark: 4 tests --------------------------------------------------------------------------------------------------------
Name (time in ns)  Min  Max  Mean  StdDev  Median  IQR  Outliers  OPS  Rounds  Iterations
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
test_xfast  76.9994 (1.0)  3,717.8211 (1.0)  79.4478 (1.0)  22.9301 (1.0)  78.8039 (1.0)  0.6007 (1.29)  200;5896  12,586,882.8224 (1.0)  60097  200
test_fast  159.7218 (2.07)  57,159.9230 (15.37)  196.5291 (2.47)  158.4659 (6.91)  199.7687 (2.54)  0.4657 (1.0)  29;16206  5,088,304.0594 (0.40)  137360  1
test_slow  1,008,165.0689 (>1000.0)  1,106,884.7962 (297.72)  1,052,816.8227 (>1000.0)  4,406.4005 (192.17)  1,052,405.2195 (>1000.0)  2,199.7839 (>1000.0)  25;22  949.8328 (0.00)  934  1
test_slower  10,012,168.9029 (>1000.0)  10,062,609.3335 (>1000.0)  10,055,252.2438 (>1000.0)  4,645.8381 (202.61)  10,055,029.0663 (>1000.0)  1,800.0137 (>1000.0)  4;5  99.4505 (0.00)  100  1
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Legend:
  Outliers: 1 Standard Deviation from Mean; 1.5 IQR (InterQuartile Range) from 1st Quartile and 3rd Quartile.
 OPS: Operations Per Second, computed as 1 / Mean
============================== 5 passed in 4.61s ===============================
__________________________________ test_skip ___________________________________
/home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/tests/test_benchmark.py:1052: in test_skip
    result.stdout.fnmatch_lines([
E Failed: nomatch: '*collected 5 items'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)'
E and: 'rootdir: /tmp/pytest-of-buildozer/pytest-226/test_skip0'
E and: 'plugins: benchmark-3.4.1, forked-1.4.0, xdist-2.5.0'
E fnmatch: '*collected 5 items'
E with: 'collecting ... collected 5 items'
E nomatch: 'test_skip.py::*test_skip PASSED*'
E and: ''
E fnmatch: 'test_skip.py::*test_skip PASSED*'
E with: 'test_skip.py::test_skip PASSED [ 20%]'
E nomatch: 'test_skip.py::test_slow SKIPPED*'
E and: 'test_skip.py::test_fast SKIPPED (Skipping benchmark (--benchmark-skip active).) [ 40%]'
E fnmatch: 'test_skip.py::test_slow SKIPPED*'
E with: 'test_skip.py::test_slow SKIPPED (Skipping benchmark (--benchmark-skip active).) [ 60%]'
E fnmatch: 'test_skip.py::test_slower SKIPPED*'
E with: 'test_skip.py::test_slower SKIPPED (Skipping benchmark (--benchmark-skip active).) [ 80%]'
E fnmatch: 'test_skip.py::test_xfast SKIPPED*'
E with: 'test_skip.py::test_xfast SKIPPED (Skipping benchmark (--benchmark-skip active).) [100%]'
E nomatch: 'test_skip.py::test_fast SKIPPED*'
E and: ''
E and: '========================= 1 passed, 4 skipped in 0.04s ========================='
E remains unmatched: 'test_skip.py::test_fast SKIPPED*'
----------------------------- Captured stdout call -----------------------------
running: /usr/bin/python3 -mpytest --basetemp=/tmp/pytest-of-buildozer/pytest-226/test_skip0/runpytest-0 -vv --doctest-modules --benchmark-skip /tmp/pytest-of-buildozer/pytest-226/test_skip0/test_skip.py
in: /tmp/pytest-of-buildozer/pytest-226/test_skip0
============================= test session starts ==============================
platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /tmp/pytest-of-buildozer/pytest-226/test_skip0
plugins: benchmark-3.4.1, forked-1.4.0, xdist-2.5.0
collecting ... collected 5 items

test_skip.py::test_skip PASSED [ 20%]
test_skip.py::test_fast SKIPPED (Skipping benchmark (--benchmark-skip active).) [ 40%]
test_skip.py::test_slow SKIPPED (Skipping benchmark (--benchmark-skip active).) [ 60%]
test_skip.py::test_slower SKIPPED (Skipping benchmark (--benchmark-skip active).) [ 80%]
test_skip.py::test_xfast SKIPPED (Skipping benchmark (--benchmark-skip active).) [100%]

========================= 1 passed, 4 skipped in 0.04s =========================
_________________________________ test_disable _________________________________
/home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/tests/test_benchmark.py:1066: in test_disable
    result.stdout.fnmatch_lines([
E Failed: nomatch: '*collected 5 items'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)'
E and: 'rootdir: /tmp/pytest-of-buildozer/pytest-226/test_disable0'
E and: 'plugins: benchmark-3.4.1, forked-1.4.0, xdist-2.5.0'
E fnmatch: '*collected 5 items'
E with: 'collecting ... collected 5 items'
E nomatch: 'test_disable.py::*test_disable PASSED*'
E and: ''
E fnmatch: 'test_disable.py::*test_disable PASSED*'
E with: 'test_disable.py::test_disable PASSED [ 20%]'
E nomatch: 'test_disable.py::test_slow PASSED*'
E and: 'test_disable.py::test_fast PASSED [ 40%]'
E fnmatch: 'test_disable.py::test_slow PASSED*'
E with: 'test_disable.py::test_slow PASSED [ 60%]'
E fnmatch: 'test_disable.py::test_slower PASSED*'
E with: 'test_disable.py::test_slower PASSED [ 80%]'
E fnmatch: 'test_disable.py::test_xfast PASSED*'
E with: 'test_disable.py::test_xfast PASSED [100%]'
E nomatch: 'test_disable.py::test_fast PASSED*'
E and: ''
E and: '============================== 5 passed in 0.05s ==============================='
E remains unmatched: 'test_disable.py::test_fast PASSED*'
----------------------------- Captured stdout call -----------------------------
running: /usr/bin/python3 -mpytest --basetemp=/tmp/pytest-of-buildozer/pytest-226/test_disable0/runpytest-0 -vv --doctest-modules --benchmark-disable /tmp/pytest-of-buildozer/pytest-226/test_disable0/test_disable.py
in: /tmp/pytest-of-buildozer/pytest-226/test_disable0
============================= test session starts ==============================
platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /tmp/pytest-of-buildozer/pytest-226/test_disable0
plugins: benchmark-3.4.1, forked-1.4.0, xdist-2.5.0
collecting ... collected 5 items

test_disable.py::test_disable PASSED [ 20%]
test_disable.py::test_fast PASSED [ 40%]
test_disable.py::test_slow PASSED [ 60%]
test_disable.py::test_slower PASSED [ 80%]
test_disable.py::test_xfast PASSED [100%]

============================== 5 passed in 0.05s ===============================
_____________________________ test_only_benchmarks _____________________________
/home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/tests/test_benchmark.py:1095: in test_only_benchmarks
    result.stdout.fnmatch_lines([
E Failed: nomatch: '*collected 5 items'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)'
E and: 'rootdir: /tmp/pytest-of-buildozer/pytest-226/test_only_benchmarks0'
E and: 'plugins: benchmark-3.4.1, forked-1.4.0, xdist-2.5.0'
E fnmatch: '*collected 5 items'
E with: 'collecting ... collected 5 items'
E nomatch: 'test_only_benchmarks.py::*test_only_benchmarks SKIPPED*'
E and: ''
E fnmatch: 'test_only_benchmarks.py::*test_only_benchmarks SKIPPED*'
E with: 'test_only_benchmarks.py::test_only_benchmarks SKIPPED (Skipping non-benchmark (--benchmark-only active).) [ 20%]'
E nomatch: 'test_only_benchmarks.py::test_slow PASSED*'
E and: 'test_only_benchmarks.py::test_fast PASSED [ 40%]'
E fnmatch: 'test_only_benchmarks.py::test_slow PASSED*'
E with: 'test_only_benchmarks.py::test_slow PASSED [ 60%]'
E fnmatch: 'test_only_benchmarks.py::test_slower PASSED*'
E with: 'test_only_benchmarks.py::test_slower PASSED [ 80%]'
E fnmatch: 'test_only_benchmarks.py::test_xfast PASSED*'
E with: 'test_only_benchmarks.py::test_xfast PASSED [100%]'
E nomatch: 'test_only_benchmarks.py::test_fast PASSED*'
E and: ''
E and: ''
E and: '--------------------------------------------------------------------------------------------------------- benchmark: 4 tests --------------------------------------------------------------------------------------------------------'
E and: 'Name (time in ns) Min Max Mean StdDev Median IQR Outliers OPS Rounds Iterations'
E and: '-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------'
E and: 'test_xfast 76.7992 (1.0) 2,413.8112 (1.0) 78.2197 (1.0) 13.2033 (1.0) 77.6001 (1.0) 0.4051 (1.0) 282;6768 12,784,502.2593 (1.0) 61275 200'
E and: 'test_fast 159.7218 (2.08) 44,280.1975 (18.34) 200.9367 (2.57) 141.0621 (10.68) 199.7687 (2.57) 0.4657 (1.15) 128;10388 4,976,690.7399 (0.39) 189406 1'
E and: 'test_slow 1,013,244.9679 (>1000.0) 1,082,684.8447 (448.54) 1,053,120.8323 (>1000.0) 3,758.3119 (284.65) 1,053,045.0381 (>1000.0) 1,919.9215 (>1000.0) 33;27 949.5587 (0.00) 939 1'
E and: 'test_slower 10,053,810.1979 (>1000.0) 10,090,969.0373 (>1000.0) 10,056,043.0577 (>1000.0) 4,294.2850 (325.24) 10,055,148.9741 (>1000.0) 1,440.2904 (>1000.0) 3;5 99.4427 (0.00) 100 1'
E and: '-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------'
E and: ''
E and: 'Legend:'
E and: ' Outliers: 1 Standard Deviation from Mean; 1.5 IQR (InterQuartile Range) from 1st Quartile and 3rd Quartile.'
E and: ' OPS: Operations Per Second, computed as 1 / Mean'
E and: '========================= 4 passed, 1 skipped in 4.78s ========================='
E remains unmatched: 'test_only_benchmarks.py::test_fast PASSED*'
----------------------------- Captured stdout call -----------------------------
running: /usr/bin/python3 -mpytest --basetemp=/tmp/pytest-of-buildozer/pytest-226/test_only_benchmarks0/runpytest-0 -vv --doctest-modules --benchmark-only /tmp/pytest-of-buildozer/pytest-226/test_only_benchmarks0/test_only_benchmarks.py
in: /tmp/pytest-of-buildozer/pytest-226/test_only_benchmarks0
============================= test session starts ==============================
platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /tmp/pytest-of-buildozer/pytest-226/test_only_benchmarks0
plugins: benchmark-3.4.1, forked-1.4.0, xdist-2.5.0
collecting ... collected 5 items

test_only_benchmarks.py::test_only_benchmarks SKIPPED (Skipping non-benchmark (--benchmark-only active).) [ 20%]
test_only_benchmarks.py::test_fast PASSED [ 40%]
test_only_benchmarks.py::test_slow PASSED [ 60%]
test_only_benchmarks.py::test_slower PASSED [ 80%]
test_only_benchmarks.py::test_xfast PASSED [100%]

--------------------------------------------------------------------------------------------------------- benchmark: 4 tests --------------------------------------------------------------------------------------------------------
Name (time in ns) Min Max Mean StdDev Median IQR Outliers OPS Rounds Iterations
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
test_xfast 76.7992 (1.0) 2,413.8112 (1.0) 78.2197 (1.0) 13.2033 (1.0) 77.6001 (1.0) 0.4051 (1.0) 282;6768 12,784,502.2593 (1.0) 61275 200
test_fast 159.7218 (2.08) 44,280.1975 (18.34) 200.9367 (2.57) 141.0621 (10.68) 199.7687 (2.57) 0.4657 (1.15) 128;10388 4,976,690.7399 (0.39) 189406 1
test_slow 1,013,244.9679 (>1000.0) 1,082,684.8447 (448.54) 1,053,120.8323 (>1000.0) 3,758.3119 (284.65) 1,053,045.0381 (>1000.0) 1,919.9215 (>1000.0) 33;27 949.5587 (0.00) 939 1
test_slower 10,053,810.1979 (>1000.0) 10,090,969.0373 (>1000.0) 10,056,043.0577 (>1000.0) 4,294.2850 (325.24) 10,055,148.9741 (>1000.0) 1,440.2904 (>1000.0) 3;5 99.4427 (0.00) 100 1
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Legend:
 Outliers: 1 Standard Deviation from Mean; 1.5 IQR (InterQuartile Range) from 1st Quartile and 3rd Quartile.
 OPS: Operations Per Second, computed as 1 / Mean
========================= 4 passed, 1 skipped in 4.78s =========================
=============================== warnings summary ===============================
../../../../../../../usr/lib/python3.10/site-packages/_pytest/config/__init__.py:1199
  /usr/lib/python3.10/site-packages/_pytest/config/__init__.py:1199: PytestRemovedIn8Warning: The --strict option is deprecated, use --strict-markers instead.
    self.issue_config_time_warning(
tests/test_utils.py:35
  /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/tests/test_utils.py:35: PytestDeprecationWarning: @pytest.yield_fixture is deprecated. Use @pytest.fixture instead; they are the same.
    @pytest.yield_fixture(params=(True, False))
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
---------------------------------------------------------------------------------------------------------------------------------- benchmark: 58 tests -----------------------------------------------------------------------------------------------------------------------------------
Name (time in ns) Min Max Mean StdDev Median IQR Outliers OPS Rounds Iterations
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
test_xfast 0.0000 (1.0) 476.8372 (94.42) 194.9334 (38.60) 120.3175 (inf) 238.4186 (47.21) 0.0000 (1.0) 366;366 5,129,956.4308 (0.03) 1272 1
test_calibrate_stuck[False--1-1e-10] 5.0500 (inf) 5.0500 (1.0) 5.0500 (1.0) 0.0000 (1.0) 5.0500 (1.0) 0.0000 (1.0) 0;0 198,019,801.9796 (1.0) 1 2
test_calibrate_stuck[False-0-1e-10] 5.0500 (inf) 5.0500 (1.0) 5.0500 (1.0) 0.0000 (1.0) 5.0500 (1.0) 0.0000 (1.0) 0;0 198,019,801.9796 (1.0) 1 2
test_calibrate_stuck[False-1-1e-10] 5.0500 (inf) 5.0500 (1.0) 5.0500 (1.0) 0.0000 (1.0) 5.0500 (1.0) 0.0000 (1.0) 0;0 198,019,801.9796 (1.0) 1 2
test_calibrate_stuck[True--1-1e-10] 10.0000 (inf) 10.0000 (1.98) 10.0000 (1.98) 0.0000 (1.0) 10.0000 (1.98) 0.0000 (1.0) 0;0 99,999,999.9995 (0.50) 1 1
test_calibrate_stuck[True-0-1e-10] 10.0000 (inf) 10.0000 (1.98) 10.0000 (1.98) 0.0000 (1.0) 10.0000 (1.98) 0.0000 (1.0) 0;0 99,999,999.9995 (0.50) 1 1
test_calibrate_stuck[True-1-1e-10] 10.0000 (inf) 10.0000 (1.98) 10.0000 (1.98) 0.0000 (1.0) 10.0000 (1.98) 0.0000 (1.0) 0;0 99,999,999.9995 (0.50) 1 1
test_calibrate_stuck[False--1-1e-09] 50.5000 (inf) 50.5000 (10.00) 50.5000 (10.00) 0.0000 (1.0) 50.5000 (10.00) 0.0000 (1.0) 0;0 19,801,980.1980 (0.10) 1 2
test_calibrate_stuck[False-0-1e-09] 50.5000 (inf) 50.5000 (10.00) 50.5000 (10.00) 0.0000 (1.0) 50.5000 (10.00) 0.0000 (1.0) 0;0 19,801,980.1980 (0.10) 1 2
test_calibrate_stuck[False-1-1e-09] 50.5000 (inf) 50.5000 (10.00) 50.5000 (10.00) 0.0000 (1.0) 50.5000 (10.00) 0.0000 (1.0) 0;0 19,801,980.1980 (0.10) 1 2
test_calibrate_xfast 94.7993 (inf) 25,478.1265 (>1000.0) 97.0704 (19.22) 71.0362 (inf) 96.0100 (19.01) 0.4005 (inf) 2189;64238 10,301,805.5669 (0.05) 1054860 100
test_rounds_iterations 99.9775 (inf) 195.9968 (38.81) 138.1338 (27.35) 30.7646 (inf) 132.0150 (26.14) 39.0341 (inf) 6;0 7,239,359.6548 (0.04) 15 10
test_calibrate_stuck[True--1-1e-09] 100.0000 (inf) 100.0000 (19.80) 100.0000 (19.80) 0.0000 (1.0) 100.0000 (19.80) 0.0000 (1.0) 0;0 10,000,000.0000 (0.05) 1 1
test_calibrate_stuck[True-0-1e-09] 100.0000 (inf) 100.0000 (19.80) 100.0000 (19.80) 0.0000 (1.0) 100.0000 (19.80) 0.0000 (1.0) 0;0 10,000,000.0000 (0.05) 1 1
test_calibrate_stuck[True-1-1e-09] 100.0000 (inf) 100.0000 (19.80) 100.0000 (19.80) 0.0000 (1.0) 100.0000 (19.80) 0.0000 (1.0) 0;0 10,000,000.0000 (0.05) 1 1
test_rounds 119.6750 (inf) 320.3750 (63.44) 189.3068 (37.49) 65.1655 (inf) 159.7218 (31.63) 40.3961 (inf) 5;3 5,282,429.4392 (0.03) 15 1
test_warmup_rounds 120.1406 (inf) 239.8156 (47.49) 168.0106 (33.27) 43.6995 (inf) 160.1875 (31.72) 30.2680 (inf) 2;1 5,952,005.6763 (0.03) 5 1
test_iterations 176.0200 (inf) 176.0200 (34.86) 176.0200 (34.86) 0.0000 (1.0) 176.0200 (34.86) 0.0000 (1.0) 0;0 5,681,173.6720 (0.03) 1 10
test_proto[LocalsSimpleProxy] 262.0043 (inf) 154,358.7539 (>1000.0) 269.1948 (53.31) 389.6635 (inf) 266.0090 (52.68) 2.0023 (inf) 163;658 3,714,782.4111 (0.02) 159238 20
test_proto[SimpleProxy] 267.9881 (inf) 145,622.6921 (>1000.0) 275.4586 (54.55) 379.1176 (inf) 272.0160 (53.86) 2.0023 (inf) 161;885 3,630,309.8217 (0.02) 149703 20
test_proto[CachedPropertyProxy] 359.9562 (inf) 60,280.3193 (>1000.0) 412.4187 (81.67) 333.8941 (inf) 400.0030 (79.21) 0.0000 (1.0) 48;26500 2,424,720.5668 (0.01) 119617 1
test_proto[LocalsCachedPropertyProxy] 359.9562 (inf) 7,640.1047 (>1000.0) 413.6279 (81.91) 62.8625 (inf) 400.0030 (79.21) 40.0469 (inf) 107;41 2,417,631.8555 (0.01) 24201 1
test_setup_many_rounds 359.9562 (inf) 1,200.0091 (237.63) 491.9712 (97.42) 253.7300 (inf) 400.0030 (79.21) 79.6281 (inf) 1;1 2,032,639.5154 (0.01) 10 1
test_calibrate_fast 426.6622 (inf) 157,740.7820 (>1000.0) 439.1576 (86.96) 282.3361 (inf) 436.6351 (86.46) 3.3372 (inf) 2953;34586 2,277,086.7211 (0.01) 1953146 12
test_can_use_both_args_and_setup_without_return 800.0061 (inf) 800.0061 (158.42) 800.0061 (158.42) 0.0000 (1.0) 800.0061 (158.42) 0.0000 (1.0) 0;0 1,249,990.4820 (0.01) 1 1
test_single 800.0061 (inf) 800.0061 (158.42) 800.0061 (158.42) 0.0000 (1.0) 800.0061 (158.42) 0.0000 (1.0) 0;0 1,249,990.4820 (0.01) 1 1
test_setup_cprofile 960.1936 (inf) 960.1936 (190.14) 960.1936 (190.14) 0.0000 (1.0) 960.1936 (190.14) 0.0000 (1.0) 0;0 1,041,456.6673 (0.01) 1 1
test_args_kwargs 999.7748 (inf) 999.7748 (197.98) 999.7748 (197.98) 0.0000 (1.0) 999.7748 (197.98) 0.0000 (1.0) 0;0 1,000,225.2669 (0.01) 1 1
test_setup 1,399.7778 (inf) 1,399.7778 (277.18) 1,399.7778 (277.18) 0.0000 (1.0) 1,399.7778 (277.18) 0.0000 (1.0) 0;0 714,399.0845 (0.00) 1 1
test_fast 3,800.2618 (inf) 161,961.1867 (>1000.0) 53,635.7423 (>1000.0) 2,040.5187 (inf) 53,560.3613 (>1000.0) 120.1406 (inf) 231;1426 18,644.2838 (0.00) 13228 1
test_foo 4,440.0804 (inf) 126,759.9873 (>1000.0) 53,481.6660 (>1000.0) 1,721.3229 (inf) 53,440.2207 (>1000.0) 158.7905 (inf) 283;820 18,697.9964 (0.00) 14724 1
test_calibrate_slow 13,200.1005 (inf) 6,078,349.8921 (>1000.0) 62,763.3389 (>1000.0) 8,459.7051 (inf) 62,521.0814 (>1000.0) 119.6750 (inf) 5598;54437 15,932.8681 (0.00) 642671 1
test_calibrate 39,480.1609 (inf) 791,803.9337 (>1000.0) 41,714.6863 (>1000.0) 5,634.8251 (inf) 41,080.1731 (>1000.0) 240.7469 (inf) 3087;33729 23,972.3725 (0.00) 253036 1
test_parametrized[1] 62,279.8689 (inf) 122,560.1882 (>1000.0) 62,630.9151 (>1000.0) 1,492.5594 (inf) 62,479.6376 (>1000.0) 119.2093 (inf) 270;1003 15,966.5558 (0.00) 12827 1
test_parametrized[4] 62,279.8689 (inf) 120,760.8730 (>1000.0) 62,647.1626 (>1000.0) 1,584.1989 (inf) 62,480.1032 (>1000.0) 80.5594 (inf) 283;1227 15,962.4149 (0.00) 12834 1
test_parametrized[2] 62,280.3345 (inf) 86,959.9171 (>1000.0) 62,610.4785 (>1000.0) 1,215.2726 (inf) 62,440.9877 (>1000.0) 119.2093 (inf) 303;1015 15,971.7674 (0.00) 13008 1
test_parametrized[3] 62,280.3345 (inf) 124,760.9034 (>1000.0) 62,628.8212 (>1000.0) 1,561.3588 (inf) 62,479.6376 (>1000.0) 119.2093 (inf) 274;1008 15,967.0896 (0.00) 13015 1
test_parametrized[0] 62,319.9157 (inf) 90,599.9914 (>1000.0) 62,631.5072 (>1000.0) 1,327.3192 (inf) 62,480.1032 (>1000.0) 81.0251 (inf) 279;1172 15,966.4048 (0.00) 12788 1
test_slow 1,007,764.6002 (inf) 1,066,046.3013 (>1000.0) 1,052,693.9557 (>1000.0) 4,125.5815 (inf) 1,052,446.1977 (>1000.0) 1,880.3403 (inf) 35;37 949.9437 (0.00) 942 1
test_slower 10,041,409.1721 (inf) 10,057,769.7158 (>1000.0) 10,054,769.8466 (>1000.0) 1,725.4100 (inf) 10,054,489.1320 (>1000.0) 1,339.9404 (inf) 14;3 99.4553 (0.00) 100 1
test_calibrate_stuck[False--1-0.01] 505,000,000.0010 (inf) 505,000,000.0010 (>1000.0) 505,000,000.0010 (>1000.0) 0.0000 (1.0) 505,000,000.0010 (>1000.0) 0.0000 (1.0) 0;0 1.9802 (0.00) 1 2
test_calibrate_stuck[False-0-0.01] 505,000,000.0010 (inf) 505,000,000.0010 (>1000.0) 505,000,000.0010 (>1000.0) 0.0000 (1.0) 505,000,000.0010 (>1000.0) 0.0000 (1.0) 0;0 1.9802 (0.00) 1 2
test_calibrate_stuck[False-1-0.01] 505,000,000.0010 (inf) 505,000,000.0010 (>1000.0) 505,000,000.0010 (>1000.0) 0.0000 (1.0) 505,000,000.0010 (>1000.0) 0.0000 (1.0) 0;0 1.9802 (0.00) 1 2
test_calibrate_stuck[True--1-0.01] 1,000,000,000.0000 (inf) 1,000,000,000.0000 (>1000.0) 1,000,000,000.0000 (>1000.0) 0.0000 (1.0) 1,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 1.0000 (0.00) 1 1
test_calibrate_stuck[True-0-0.01] 1,000,000,000.0000 (inf) 1,000,000,000.0000 (>1000.0) 1,000,000,000.0000 (>1000.0) 0.0000 (1.0) 1,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 1.0000 (0.00) 1 1
test_calibrate_stuck[True-1-0.01] 1,000,000,000.0000 (inf) 1,000,000,000.0000 (>1000.0) 1,000,000,000.0000 (>1000.0) 0.0000 (1.0) 1,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 1.0000 (0.00) 1 1
test_calibrate_stuck[False--1-1.000000000000001] 50,500,000,000.0000 (inf) 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0198 (0.00) 1 2
test_calibrate_stuck[False--1-1] 50,500,000,000.0000 (inf) 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0198 (0.00) 1 2
test_calibrate_stuck[False-0-1.000000000000001] 50,500,000,000.0000 (inf) 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0198 (0.00) 1 2
test_calibrate_stuck[False-0-1] 50,500,000,000.0000 (inf) 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0198 (0.00) 1 2
test_calibrate_stuck[False-1-1.000000000000001] 50,500,000,000.0000 (inf) 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0198 (0.00) 1 2
test_calibrate_stuck[False-1-1] 50,500,000,000.0000 (inf) 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0198 (0.00) 1 2
test_calibrate_stuck[True--1-1.000000000000001] 100,000,000,000.0000 (inf) 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0100 (0.00) 1 1
test_calibrate_stuck[True--1-1] 100,000,000,000.0000 (inf) 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0100 (0.00) 1 1
test_calibrate_stuck[True-0-1.000000000000001] 100,000,000,000.0000 (inf) 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0100 (0.00) 1 1
test_calibrate_stuck[True-0-1] 100,000,000,000.0000 (inf) 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0100 (0.00) 1 1
test_calibrate_stuck[True-1-1.000000000000001] 100,000,000,000.0000 (inf) 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0100 (0.00) 1 1
test_calibrate_stuck[True-1-1] 100,000,000,000.0000 (inf) 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0100 (0.00) 1 1
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Legend:
 Outliers: 1 Standard Deviation from Mean; 1.5 IQR (InterQuartile Range) from 1st Quartile and 3rd Quartile.
 OPS: Operations Per Second, computed as 1 / Mean
----------------------------- cProfile (time in s) -----------------------------
tests/test_pedantic.py::test_setup_cprofile
ncalls tottime percall cumtime percall filename:lineno(function)
1 0.0000 0.0000 0.0000 0.0000 pytest-benchmark-3.4.1/tests/test_pedantic.py:29(stuff)
1 0.0000 0.0000 0.0000 0.0000 ~:0()
1 0.0000 0.0000 0.0000 0.0000 ~:0()
=========================== short test summary info ============================
SKIPPED [1] tests/test_skip.py:5: bla
SKIPPED [5] tests/test_utils.py:47: %r not availabe on $PATH
SKIPPED [4] tests/test_utils.py:160: %r not availabe on $PATH
FAILED tests/test_benchmark.py::test_basic - Failed: nomatch: '*collected 5 i...
FAILED tests/test_benchmark.py::test_skip - Failed: nomatch: '*collected 5 it...
FAILED tests/test_benchmark.py::test_disable - Failed: nomatch: '*collected 5...
FAILED tests/test_benchmark.py::test_only_benchmarks - Failed: nomatch: '*col...
= 4 failed, 208 passed, 10 skipped, 10 deselected, 2 warnings in 406.21s (0:06:46) =
>>> ERROR: py3-pytest-benchmark: check failed
>>> py3-pytest-benchmark: Uninstalling dependencies...
(1/29) Purging .makedepends-py3-pytest-benchmark (20221026.172653)
(2/29) Purging py3-py-cpuinfo (8.0.0-r0)
(3/29) Purging py3-setuptools (65.5.0-r0)
(4/29) Purging py3-pytest-xdist (2.5.0-r1)
(5/29) Purging py3-execnet (1.9.0-r0)
(6/29) Purging py3-apipkg (2.1.0-r0)
(7/29) Purging py3-pytest-forked (1.4.0-r1)
(8/29) Purging py3-pytest (7.1.3-r1)
(9/29) Purging py3-attrs (22.1.0-r0)
(10/29) Purging py3-iniconfig (1.1.1-r3)
(11/29) Purging py3-packaging (21.3-r2)
(12/29) Purging py3-parsing (3.0.9-r0)
(13/29) Purging py3-pluggy (1.0.0-r1)
(14/29) Purging py3-py (1.11.0-r0)
(15/29) Purging py3-tomli (2.0.1-r1)
(16/29) Purging py3-freezegun (1.2.2-r0)
(17/29) Purging py3-dateutil (2.8.2-r1)
(18/29) Purging py3-six (1.16.0-r3)
(19/29) Purging py3-pygal (3.0.0-r1)
(20/29) Purging py3-elasticsearch (7.11.0-r1)
(21/29) Purging py3-urllib3 (1.26.12-r0)
(22/29) Purging python3 (3.10.8-r3)
(23/29) Purging libbz2 (1.0.8-r3)
(24/29) Purging libffi (3.4.3-r0)
(25/29) Purging gdbm (1.23-r0)
(26/29) Purging xz-libs (5.2.7-r0)
(27/29) Purging mpdecimal (2.5.1-r1)
(28/29) Purging readline (8.2.0-r0)
(29/29) Purging sqlite-libs (3.39.4-r0)
Executing busybox-1.35.0-r27.trigger
OK: 268 MiB in 92 packages
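Editor's note: the four test_benchmark.py failures share one mechanism. pytest's `fnmatch_lines` consumes the captured output in order, matching each glob pattern against the lines *after* the previous match; the plugin's pattern lists still expect `test_fast` last, while in the captured stdout above pytest 7.1 reports it immediately after the first test, so by the time the final `test_fast PASSED*` pattern is tried the matcher has already advanced past that line and it "remains unmatched". A minimal stdlib sketch of that order-sensitive behaviour (the `fnmatch_lines` helper below is illustrative, not pytest's actual `LineMatcher` implementation):

```python
from fnmatch import fnmatch

def fnmatch_lines(lines, patterns):
    """Match each glob pattern against the remaining lines, in order.

    Illustrative sketch of pytest's order-sensitive matching: once a line
    has been consumed by a match, earlier lines can never satisfy a later
    pattern. Returns False as soon as a pattern remains unmatched.
    """
    it = iter(lines)
    for pattern in patterns:
        for line in it:
            if fnmatch(line, pattern):
                break
        else:
            return False  # iterator exhausted: pattern remains unmatched
    return True

# Actual output order from the captured stdout (test_fast runs second).
output = [
    "test_basic.py::test_basic PASSED [ 20%]",
    "test_basic.py::test_fast PASSED [ 40%]",
    "test_basic.py::test_slow PASSED [ 60%]",
    "test_basic.py::test_slower PASSED [ 80%]",
    "test_basic.py::test_xfast PASSED [100%]",
]

# Pattern order the plugin's test asserts (test_fast last) ...
expected_old = [
    "test_basic.py::*test_basic PASSED*",
    "test_basic.py::test_slow PASSED*",
    "test_basic.py::test_slower PASSED*",
    "test_basic.py::test_xfast PASSED*",
    "test_basic.py::test_fast PASSED*",
]
# ... versus patterns reordered to follow the actual output.
expected_new = [
    "test_basic.py::*test_basic PASSED*",
    "test_basic.py::test_fast PASSED*",
    "test_basic.py::test_slow PASSED*",
    "test_basic.py::test_slower PASSED*",
    "test_basic.py::test_xfast PASSED*",
]

print(fnmatch_lines(output, expected_old))  # False: 'test_fast PASSED*' unmatched
print(fnmatch_lines(output, expected_new))  # True
```

Under this reading, the fix is not in the benchmark code at all but in the expectations: reordering the patterns (or relaxing the match) in tests/test_benchmark.py to follow the order pytest 7.1 actually reports.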