>>> py3-pytest-benchmark: Building community/py3-pytest-benchmark 3.4.1-r1 (using abuild 3.10.0_rc1-r2) started Wed, 26 Oct 2022 09:52:45 +0000
>>> py3-pytest-benchmark: Checking sanity of /home/buildozer/aports/community/py3-pytest-benchmark/APKBUILD...
>>> py3-pytest-benchmark: Analyzing dependencies...
>>> py3-pytest-benchmark: Installing for build: build-base python3 py3-pytest py3-py-cpuinfo py3-setuptools py3-pytest-xdist py3-freezegun py3-pygal py3-elasticsearch
(1/29) Installing libbz2 (1.0.8-r3)
(2/29) Installing libffi (3.4.3-r0)
(3/29) Installing gdbm (1.23-r0)
(4/29) Installing xz-libs (5.2.7-r0)
(5/29) Installing mpdecimal (2.5.1-r1)
(6/29) Installing readline (8.2.0-r0)
(7/29) Installing sqlite-libs (3.39.4-r0)
(8/29) Installing python3 (3.10.8-r1)
(9/29) Installing py3-attrs (22.1.0-r0)
(10/29) Installing py3-iniconfig (1.1.1-r3)
(11/29) Installing py3-parsing (3.0.9-r0)
(12/29) Installing py3-packaging (21.3-r2)
(13/29) Installing py3-pluggy (1.0.0-r1)
(14/29) Installing py3-py (1.11.0-r0)
(15/29) Installing py3-tomli (2.0.1-r1)
(16/29) Installing py3-pytest (7.1.3-r1)
(17/29) Installing py3-py-cpuinfo (8.0.0-r0)
(18/29) Installing py3-setuptools (65.5.0-r0)
(19/29) Installing py3-apipkg (2.1.0-r0)
(20/29) Installing py3-execnet (1.9.0-r0)
(21/29) Installing py3-pytest-forked (1.4.0-r1)
(22/29) Installing py3-pytest-xdist (2.5.0-r1)
(23/29) Installing py3-six (1.16.0-r3)
(24/29) Installing py3-dateutil (2.8.2-r1)
(25/29) Installing py3-freezegun (1.2.2-r0)
(26/29) Installing py3-pygal (3.0.0-r1)
(27/29) Installing py3-urllib3 (1.26.12-r0)
(28/29) Installing py3-elasticsearch (7.11.0-r1)
(29/29) Installing .makedepends-py3-pytest-benchmark (20221026.095255)
Executing busybox-1.35.0-r27.trigger
OK: 308 MiB in 121 packages
>>> py3-pytest-benchmark: Cleaning up srcdir
>>> py3-pytest-benchmark: Cleaning up pkgdir
>>> py3-pytest-benchmark: Fetching https://distfiles.alpinelinux.org/distfiles/v3.17/pytest-benchmark-3.4.1.tar.gz
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
  0   146    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
curl: (22) The requested URL returned error: 404
>>> py3-pytest-benchmark: Fetching https://github.com/ionelmc/pytest-benchmark/archive/v3.4.1/pytest-benchmark-3.4.1.tar.gz
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100  277k    0  277k    0     0   198k      0 --:--:--  0:00:01 --:--:--  198k
100  315k    0  315k    0     0   211k      0 --:--:--  0:00:01 --:--:--  403k
>>> py3-pytest-benchmark: Fetching https://distfiles.alpinelinux.org/distfiles/v3.17/pytest-benchmark-3.4.1.tar.gz
>>> py3-pytest-benchmark: Checking sha512sums...
pytest-benchmark-3.4.1.tar.gz: OK
python-3.10.patch: OK
>>> py3-pytest-benchmark: Unpacking /var/cache/distfiles/v3.17/pytest-benchmark-3.4.1.tar.gz...
>>> py3-pytest-benchmark: python-3.10.patch
patching file tests/test_cli.py
running build
running build_py
creating build
creating build/lib
creating build/lib/pytest_benchmark
copying src/pytest_benchmark/compat.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/__init__.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/table.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/histogram.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/pep418.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/__main__.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/plugin.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/session.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/fixture.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/logger.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/timers.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/csv.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/stats.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/hookspec.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/utils.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/cli.py -> build/lib/pytest_benchmark
creating build/lib/pytest_benchmark/storage
copying src/pytest_benchmark/storage/__init__.py -> build/lib/pytest_benchmark/storage
copying src/pytest_benchmark/storage/elasticsearch.py -> build/lib/pytest_benchmark/storage
copying src/pytest_benchmark/storage/file.py -> build/lib/pytest_benchmark/storage
running egg_info
creating src/pytest_benchmark.egg-info
writing src/pytest_benchmark.egg-info/PKG-INFO
writing dependency_links to src/pytest_benchmark.egg-info/dependency_links.txt
writing entry points to src/pytest_benchmark.egg-info/entry_points.txt
writing requirements to src/pytest_benchmark.egg-info/requires.txt
writing top-level names to src/pytest_benchmark.egg-info/top_level.txt
writing manifest file 'src/pytest_benchmark.egg-info/SOURCES.txt'
reading manifest file 'src/pytest_benchmark.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no previously-included files matching '*.py[cod]' found anywhere in distribution
warning: no previously-included files matching '__pycache__/*' found anywhere in distribution
warning: no previously-included files matching '*.so' found anywhere in distribution
warning: no previously-included files matching '*.dylib' found anywhere in distribution
adding license file 'LICENSE'
adding license file 'AUTHORS.rst'
writing manifest file 'src/pytest_benchmark.egg-info/SOURCES.txt'
running install
/usr/lib/python3.10/site-packages/setuptools/command/install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
  warnings.warn(
running build
running build_py
running egg_info
writing src/pytest_benchmark.egg-info/PKG-INFO
writing dependency_links to src/pytest_benchmark.egg-info/dependency_links.txt
writing entry points to src/pytest_benchmark.egg-info/entry_points.txt
writing requirements to src/pytest_benchmark.egg-info/requires.txt
writing top-level names to src/pytest_benchmark.egg-info/top_level.txt
reading manifest file 'src/pytest_benchmark.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no previously-included files matching '*.py[cod]' found anywhere in distribution
warning: no previously-included files matching '__pycache__/*' found anywhere in distribution
warning: no previously-included files matching '*.so' found anywhere in distribution
warning: no previously-included files matching '*.dylib' found anywhere in distribution
adding license file 'LICENSE'
adding license file 'AUTHORS.rst'
writing manifest file 'src/pytest_benchmark.egg-info/SOURCES.txt'
running install_lib
creating /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install
creating /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr
creating /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib
creating /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10
creating /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages
creating /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/compat.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/__init__.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/table.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/histogram.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/pep418.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/__main__.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/plugin.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/session.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/fixture.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/logger.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/timers.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/csv.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/stats.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
creating /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/storage
copying build/lib/pytest_benchmark/storage/__init__.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/storage
copying build/lib/pytest_benchmark/storage/elasticsearch.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/storage
copying build/lib/pytest_benchmark/storage/file.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/storage
copying build/lib/pytest_benchmark/hookspec.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/utils.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/cli.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/compat.py to compat.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/__init__.py to __init__.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/table.py to table.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/histogram.py to histogram.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/pep418.py to pep418.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/__main__.py to __main__.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/plugin.py to plugin.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/session.py to session.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/fixture.py to fixture.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/logger.py to logger.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/timers.py to timers.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/csv.py to csv.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/stats.py to stats.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/storage/__init__.py to __init__.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/storage/elasticsearch.py to elasticsearch.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/storage/file.py to file.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/hookspec.py to hookspec.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/utils.py to utils.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/cli.py to cli.cpython-310.pyc
running install_egg_info
Copying src/pytest_benchmark.egg-info to /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark-3.4.1-py3.10.egg-info
running install_scripts
Installing py.test-benchmark script to /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/bin
Installing pytest-benchmark script to /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/bin
============================= test session starts ==============================
platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1, configfile: setup.cfg, testpaths: tests
plugins: benchmark-3.4.1, xdist-2.5.0, forked-1.4.0
collecting ...
collected 232 items / 10 deselected / 222 selected
tests/test_benchmark.py::test_help PASSED [ 0%]
tests/test_benchmark.py::test_groups PASSED [ 0%]
tests/test_benchmark.py::test_group_by_name PASSED [ 1%]
tests/test_benchmark.py::test_group_by_func PASSED [ 1%]
tests/test_benchmark.py::test_group_by_fullfunc PASSED [ 2%]
tests/test_benchmark.py::test_group_by_param_all PASSED [ 2%]
tests/test_benchmark.py::test_group_by_param_select PASSED [ 3%]
tests/test_benchmark.py::test_group_by_param_select_multiple PASSED [ 3%]
tests/test_benchmark.py::test_group_by_fullname PASSED [ 4%]
tests/test_benchmark.py::test_double_use PASSED [ 4%]
tests/test_benchmark.py::test_only_override_skip PASSED [ 4%]
tests/test_benchmark.py::test_fixtures_also_skipped PASSED [ 5%]
tests/test_benchmark.py::test_conflict_between_only_and_disable PASSED [ 5%]
tests/test_benchmark.py::test_max_time_min_rounds PASSED [ 6%]
tests/test_benchmark.py::test_max_time PASSED [ 6%]
tests/test_benchmark.py::test_bogus_max_time PASSED [ 7%]
tests/test_benchmark.py::test_pep418_timer PASSED [ 7%]
tests/test_benchmark.py::test_bad_save PASSED [ 8%]
tests/test_benchmark.py::test_bad_save_2 PASSED [ 8%]
tests/test_benchmark.py::test_bad_compare_fail PASSED [ 9%]
tests/test_benchmark.py::test_bad_rounds PASSED [ 9%]
tests/test_benchmark.py::test_bad_rounds_2 PASSED [ 9%]
tests/test_benchmark.py::test_compare PASSED [ 10%]
tests/test_benchmark.py::test_compare_last PASSED [ 10%]
tests/test_benchmark.py::test_compare_non_existing PASSED [ 11%]
tests/test_benchmark.py::test_compare_non_existing_verbose PASSED [ 11%]
tests/test_benchmark.py::test_compare_no_files PASSED [ 12%]
tests/test_benchmark.py::test_compare_no_files_verbose PASSED [ 12%]
tests/test_benchmark.py::test_compare_no_files_match PASSED [ 13%]
tests/test_benchmark.py::test_compare_no_files_match_verbose PASSED [ 13%]
tests/test_benchmark.py::test_verbose PASSED [ 13%]
tests/test_benchmark.py::test_save PASSED [ 14%]
tests/test_benchmark.py::test_save_extra_info PASSED [ 14%]
tests/test_benchmark.py::test_update_machine_info_hook_detection PASSED [ 15%]
tests/test_benchmark.py::test_histogram PASSED [ 15%]
tests/test_benchmark.py::test_autosave PASSED [ 16%]
tests/test_benchmark.py::test_bogus_min_time PASSED [ 16%]
tests/test_benchmark.py::test_disable_gc PASSED [ 17%]
tests/test_benchmark.py::test_custom_timer PASSED [ 17%]
tests/test_benchmark.py::test_bogus_timer PASSED [ 18%]
tests/test_benchmark.py::test_sort_by_mean PASSED [ 18%]
tests/test_benchmark.py::test_bogus_sort PASSED [ 18%]
tests/test_benchmark.py::test_xdist PASSED [ 19%]
tests/test_benchmark.py::test_xdist_verbose PASSED [ 19%]
tests/test_benchmark.py::test_cprofile PASSED [ 20%]
tests/test_benchmark.py::test_disabled_and_cprofile PASSED [ 20%]
tests/test_benchmark.py::test_abort_broken PASSED [ 21%]
tests/test_benchmark.py::test_basic FAILED [ 21%]
tests/test_benchmark.py::test_skip FAILED [ 22%]
tests/test_benchmark.py::test_disable FAILED [ 22%]
tests/test_benchmark.py::test_mark_selection PASSED [ 22%]
tests/test_benchmark.py::test_only_benchmarks FAILED [ 23%]
tests/test_benchmark.py::test_columns PASSED [ 23%]
tests/test_calibration.py::test_calibrate PASSED [ 24%]
tests/test_calibration.py::test_calibrate_fast PASSED [ 24%]
tests/test_calibration.py::test_calibrate_xfast PASSED [ 25%]
tests/test_calibration.py::test_calibrate_slow PASSED [ 25%]
tests/test_calibration.py::test_calibrate_stuck[True-0-1] PASSED [ 26%]
tests/test_calibration.py::test_calibrate_stuck[True-0-0.01] PASSED [ 26%]
tests/test_calibration.py::test_calibrate_stuck[True-0-1e-09] PASSED [ 27%]
tests/test_calibration.py::test_calibrate_stuck[True-0-1e-10] PASSED [ 27%]
tests/test_calibration.py::test_calibrate_stuck[True-0-1.000000000000001] PASSED [ 27%]
tests/test_calibration.py::test_calibrate_stuck[True-1-1] PASSED [ 28%]
tests/test_calibration.py::test_calibrate_stuck[True-1-0.01] PASSED [ 28%]
tests/test_calibration.py::test_calibrate_stuck[True-1-1e-09] PASSED [ 29%]
tests/test_calibration.py::test_calibrate_stuck[True-1-1e-10] PASSED [ 29%]
tests/test_calibration.py::test_calibrate_stuck[True-1-1.000000000000001] PASSED [ 30%]
tests/test_calibration.py::test_calibrate_stuck[True--1-1] PASSED [ 30%]
tests/test_calibration.py::test_calibrate_stuck[True--1-0.01] PASSED [ 31%]
tests/test_calibration.py::test_calibrate_stuck[True--1-1e-09] PASSED [ 31%]
tests/test_calibration.py::test_calibrate_stuck[True--1-1e-10] PASSED [ 31%]
tests/test_calibration.py::test_calibrate_stuck[True--1-1.000000000000001] PASSED [ 32%]
tests/test_calibration.py::test_calibrate_stuck[False-0-1] PASSED [ 32%]
tests/test_calibration.py::test_calibrate_stuck[False-0-0.01] PASSED [ 33%]
tests/test_calibration.py::test_calibrate_stuck[False-0-1e-09] PASSED [ 33%]
tests/test_calibration.py::test_calibrate_stuck[False-0-1e-10] PASSED [ 34%]
tests/test_calibration.py::test_calibrate_stuck[False-0-1.000000000000001] PASSED [ 34%]
tests/test_calibration.py::test_calibrate_stuck[False-1-1] PASSED [ 35%]
tests/test_calibration.py::test_calibrate_stuck[False-1-0.01] PASSED [ 35%]
tests/test_calibration.py::test_calibrate_stuck[False-1-1e-09] PASSED [ 36%]
tests/test_calibration.py::test_calibrate_stuck[False-1-1e-10] PASSED [ 36%]
tests/test_calibration.py::test_calibrate_stuck[False-1-1.000000000000001] PASSED [ 36%]
tests/test_calibration.py::test_calibrate_stuck[False--1-1] PASSED [ 37%]
tests/test_calibration.py::test_calibrate_stuck[False--1-0.01] PASSED [ 37%]
tests/test_calibration.py::test_calibrate_stuck[False--1-1e-09] PASSED [ 38%]
tests/test_calibration.py::test_calibrate_stuck[False--1-1e-10] PASSED [ 38%]
tests/test_calibration.py::test_calibrate_stuck[False--1-1.000000000000001] PASSED [ 39%]
tests/test_cli.py::test_list PASSED [ 39%]
tests/test_cli.py::test_compare[short-] PASSED [ 40%]
tests/test_cli.py::test_compare[long-] PASSED [ 40%]
tests/test_cli.py::test_compare[normal-] PASSED [ 40%]
tests/test_cli.py::test_compare[trial-] PASSED [ 41%]
tests/test_doctest.rst::test_doctest.rst PASSED [ 41%]
tests/test_elasticsearch_storage.py::test_handle_saving PASSED [ 42%]
tests/test_elasticsearch_storage.py::test_parse_with_no_creds PASSED [ 42%]
tests/test_elasticsearch_storage.py::test_parse_with_creds_in_first_host_of_url PASSED [ 43%]
tests/test_elasticsearch_storage.py::test_parse_with_creds_in_second_host_of_url PASSED [ 43%]
tests/test_elasticsearch_storage.py::test_parse_with_creds_in_netrc PASSED [ 44%]
tests/test_elasticsearch_storage.py::test_parse_url_creds_supersedes_netrc_creds PASSED [ 44%]
tests/test_elasticsearch_storage.py::test__mask_hosts PASSED [ 45%]
tests/test_normal.py::test_normal PASSED [ 45%]
tests/test_normal.py::test_fast PASSED [ 45%]
tests/test_normal.py::test_slow PASSED [ 46%]
tests/test_normal.py::test_slower PASSED [ 46%]
tests/test_normal.py::test_xfast PASSED [ 47%]
tests/test_normal.py::test_parametrized[0] PASSED [ 47%]
tests/test_normal.py::test_parametrized[1] PASSED [ 48%]
tests/test_normal.py::test_parametrized[2] PASSED [ 48%]
tests/test_normal.py::test_parametrized[3] PASSED [ 49%]
tests/test_normal.py::test_parametrized[4] PASSED [ 49%]
tests/test_pedantic.py::test_single PASSED [ 50%]
tests/test_pedantic.py::test_setup PASSED [ 50%]
tests/test_pedantic.py::test_setup_cprofile PASSED [ 50%]
tests/test_pedantic.py::test_args_kwargs PASSED [ 51%]
tests/test_pedantic.py::test_iterations PASSED [ 51%]
tests/test_pedantic.py::test_rounds_iterations PASSED [ 52%]
tests/test_pedantic.py::test_rounds PASSED [ 52%]
tests/test_pedantic.py::test_warmup_rounds PASSED [ 53%]
tests/test_pedantic.py::test_rounds_must_be_int[0] PASSED [ 53%]
tests/test_pedantic.py::test_rounds_must_be_int[x] PASSED [ 54%]
tests/test_pedantic.py::test_warmup_rounds_must_be_int[-15] PASSED [ 54%]
tests/test_pedantic.py::test_warmup_rounds_must_be_int[x] PASSED [ 54%]
tests/test_pedantic.py::test_setup_many_rounds PASSED [ 55%]
tests/test_pedantic.py::test_cant_use_both_args_and_setup_with_return PASSED [ 55%]
tests/test_pedantic.py::test_can_use_both_args_and_setup_without_return PASSED [ 56%]
tests/test_pedantic.py::test_cant_use_setup_with_many_iterations PASSED [ 56%]
tests/test_pedantic.py::test_iterations_must_be_positive_int[0] PASSED [ 57%]
tests/test_pedantic.py::test_iterations_must_be_positive_int[-1] PASSED [ 57%]
tests/test_pedantic.py::test_iterations_must_be_positive_int[asdf] PASSED [ 58%]
tests/test_sample.py::test_proto[SimpleProxy] PASSED [ 58%]
tests/test_sample.py::test_proto[CachedPropertyProxy] PASSED [ 59%]
tests/test_sample.py::test_proto[LocalsSimpleProxy] PASSED [ 59%]
tests/test_sample.py::test_proto[LocalsCachedPropertyProxy] PASSED [ 59%]
tests/test_skip.py::test_skip SKIPPED (bla) [ 60%]
tests/test_stats.py::test_1 PASSED [ 60%]
tests/test_stats.py::test_2 PASSED [ 61%]
tests/test_stats.py::test_single_item PASSED [ 61%]
tests/test_stats.py::test_length[1] PASSED [ 62%]
tests/test_stats.py::test_length[2] PASSED [ 62%]
tests/test_stats.py::test_length[3] PASSED [ 63%]
tests/test_stats.py::test_length[4] PASSED [ 63%]
tests/test_stats.py::test_length[5] PASSED [ 63%]
tests/test_stats.py::test_length[6] PASSED [ 64%]
tests/test_stats.py::test_length[7] PASSED [ 64%]
tests/test_stats.py::test_length[8] PASSED [ 65%]
tests/test_stats.py::test_length[9] PASSED [ 65%]
tests/test_stats.py::test_iqr PASSED [ 66%]
tests/test_stats.py::test_ops PASSED [ 66%]
tests/test_storage.py::test_rendering[short] PASSED [ 67%]
tests/test_storage.py::test_rendering[normal] PASSED [ 67%]
tests/test_storage.py::test_rendering[long] PASSED [ 68%]
tests/test_storage.py::test_rendering[trial] PASSED [ 68%]
tests/test_storage.py::test_regression_checks[short] PASSED [ 68%]
tests/test_storage.py::test_regression_checks[normal] PASSED [ 69%]
tests/test_storage.py::test_regression_checks[long] PASSED [ 69%]
tests/test_storage.py::test_regression_checks[trial] PASSED [ 70%]
tests/test_storage.py::test_regression_checks_inf[short] PASSED [ 70%]
tests/test_storage.py::test_regression_checks_inf[normal] PASSED [ 71%]
tests/test_storage.py::test_regression_checks_inf[long] PASSED [ 71%]
tests/test_storage.py::test_regression_checks_inf[trial] PASSED [ 72%]
tests/test_storage.py::test_compare_1[short] PASSED [ 72%]
tests/test_storage.py::test_compare_1[normal] PASSED [ 72%]
tests/test_storage.py::test_compare_1[long] PASSED [ 73%]
tests/test_storage.py::test_compare_1[trial] PASSED [ 73%]
tests/test_storage.py::test_compare_2[short] PASSED [ 74%]
tests/test_storage.py::test_compare_2[normal] PASSED [ 74%]
tests/test_storage.py::test_compare_2[long] PASSED [ 75%]
tests/test_storage.py::test_compare_2[trial] PASSED [ 75%]
tests/test_storage.py::test_save_json[short] PASSED [ 76%]
tests/test_storage.py::test_save_json[normal] PASSED [ 76%]
tests/test_storage.py::test_save_json[long] PASSED [ 77%]
tests/test_storage.py::test_save_json[trial] PASSED [ 77%]
tests/test_storage.py::test_save_with_name[short] PASSED [ 77%]
tests/test_storage.py::test_save_with_name[normal] PASSED [ 78%]
tests/test_storage.py::test_save_with_name[long] PASSED [ 78%]
tests/test_storage.py::test_save_with_name[trial] PASSED [ 79%]
tests/test_storage.py::test_save_no_name[short] PASSED [ 79%]
tests/test_storage.py::test_save_no_name[normal] PASSED [ 80%]
tests/test_storage.py::test_save_no_name[long] PASSED [ 80%]
tests/test_storage.py::test_save_no_name[trial] PASSED [ 81%]
tests/test_storage.py::test_save_with_error[short] PASSED [ 81%]
tests/test_storage.py::test_save_with_error[normal] PASSED [ 81%]
tests/test_storage.py::test_save_with_error[long] PASSED [ 82%]
tests/test_storage.py::test_save_with_error[trial] PASSED [ 82%]
tests/test_storage.py::test_autosave[short] PASSED [ 83%]
tests/test_storage.py::test_autosave[normal] PASSED [ 83%]
tests/test_storage.py::test_autosave[long] PASSED [ 84%]
tests/test_storage.py::test_autosave[trial] PASSED [ 84%]
tests/test_utils.py::test_clonefunc[] PASSED [ 85%]
tests/test_utils.py::test_clonefunc[f2] PASSED [ 85%]
tests/test_utils.py::test_clonefunc_not_function PASSED [ 86%]
tests/test_utils.py::test_get_commit_info[git-True] PASSED [ 86%]
tests/test_utils.py::test_get_commit_info[git-False] PASSED [ 86%]
tests/test_utils.py::test_get_commit_info[hg-True] SKIPPED (%r not a...) [ 87%]
tests/test_utils.py::test_get_commit_info[hg-False] SKIPPED (%r not ...) [ 87%]
tests/test_utils.py::test_missing_scm_bins[git-True] PASSED [ 88%]
tests/test_utils.py::test_missing_scm_bins[git-False] PASSED [ 88%]
tests/test_utils.py::test_missing_scm_bins[hg-True] SKIPPED (%r not ...) [ 89%]
tests/test_utils.py::test_missing_scm_bins[hg-False] SKIPPED (%r not...) [ 89%]
tests/test_utils.py::test_get_branch_info[git] PASSED [ 90%]
tests/test_utils.py::test_get_branch_info[hg] SKIPPED (%r not availa...) [ 90%]
tests/test_utils.py::test_no_branch_info PASSED [ 90%]
tests/test_utils.py::test_commit_info_error PASSED [ 91%]
tests/test_utils.py::test_parse_warmup PASSED [ 91%]
tests/test_utils.py::test_parse_columns PASSED [ 92%]
tests/test_utils.py::test_get_project_name[False-None] PASSED [ 92%]
tests/test_utils.py::test_get_project_name[False-git] PASSED [ 93%]
tests/test_utils.py::test_get_project_name[False-hg] SKIPPED (%r not...) [ 93%]
tests/test_utils.py::test_get_project_name[https://example.com/pytest_benchmark_repo-None] PASSED [ 94%]
tests/test_utils.py::test_get_project_name[https://example.com/pytest_benchmark_repo-git] PASSED [ 94%]
tests/test_utils.py::test_get_project_name[https://example.com/pytest_benchmark_repo-hg] SKIPPED [ 95%]
tests/test_utils.py::test_get_project_name[https://example.com/pytest_benchmark_repo.git-None] PASSED [ 95%]
tests/test_utils.py::test_get_project_name[https://example.com/pytest_benchmark_repo.git-git] PASSED [ 95%]
tests/test_utils.py::test_get_project_name[https://example.com/pytest_benchmark_repo.git-hg] SKIPPED [ 96%]
tests/test_utils.py::test_get_project_name[c:\\foo\\bar\\pytest_benchmark_repo.gitfoo@example.com:pytest_benchmark_repo.git-None] PASSED [ 96%]
tests/test_utils.py::test_get_project_name[c:\\foo\\bar\\pytest_benchmark_repo.gitfoo@example.com:pytest_benchmark_repo.git-git] PASSED [ 97%]
tests/test_utils.py::test_get_project_name[c:\\foo\\bar\\pytest_benchmark_repo.gitfoo@example.com:pytest_benchmark_repo.git-hg] SKIPPED [ 97%]
tests/test_utils.py::test_get_project_name_broken[git] PASSED [ 98%]
tests/test_utils.py::test_get_project_name_broken[hg] PASSED [ 98%]
tests/test_utils.py::test_get_project_name_fallback PASSED [ 99%]
tests/test_utils.py::test_get_project_name_fallback_broken_hgrc PASSED [ 99%]
tests/test_with_testcase.py::TerribleTerribleWayToWriteTests::test_foo PASSED [100%]
=================================== FAILURES ===================================
__________________________________ test_basic __________________________________
/home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/tests/test_benchmark.py:1028: in test_basic
    result.stdout.fnmatch_lines([
E   Failed: nomatch: '*collected 5 items'
E       and: '============================= test session starts =============================='
E       and: 'platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3'
E       and: 'cachedir: .pytest_cache'
E       and: 'benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)'
E       and: 'rootdir: /tmp/pytest-of-buildozer/pytest-218/test_basic0'
E       and: 'plugins: benchmark-3.4.1, xdist-2.5.0, forked-1.4.0'
E   fnmatch: '*collected 5 items'
E      with: 'collecting ... collected 5 items'
E   nomatch: 'test_basic.py::*test_basic PASSED*'
E       and: ''
E   fnmatch: 'test_basic.py::*test_basic PASSED*'
E      with: 'test_basic.py::test_basic PASSED [ 20%]'
E   nomatch: 'test_basic.py::test_slow PASSED*'
E       and: 'test_basic.py::test_fast PASSED [ 40%]'
E   fnmatch: 'test_basic.py::test_slow PASSED*'
E      with: 'test_basic.py::test_slow PASSED [ 60%]'
E   fnmatch: 'test_basic.py::test_slower PASSED*'
E      with: 'test_basic.py::test_slower PASSED [ 80%]'
E   fnmatch: 'test_basic.py::test_xfast PASSED*'
E      with: 'test_basic.py::test_xfast PASSED [100%]'
E   nomatch: 'test_basic.py::test_fast PASSED*'
E       and: ''
E       and: ''
E       and: '------------------------------------------------------------------------------------------------------------ benchmark: 4 tests ------------------------------------------------------------------------------------------------------------'
E       and: 'Name (time in ns) Min Max Mean StdDev Median IQR Outliers OPS Rounds Iterations'
E       and: '--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------'
E       and: 'test_xfast 101.2487 (1.0) 72,771.9115 (1.0) 110.9902 (1.0) 475.3265 (1.0) 102.8180 (1.0) 1.4203 (1.0) 65;2690 9,009,806.7004 (1.0) 46634 200'
E       and: 'test_fast 371.5977 (3.67) 3,685,037.6055 (50.64) 443.6367 (4.00) 11,568.3552 (24.34) 387.4302 (3.77) 8.3819 (5.90) 13;9298 2,254,096.7182 (0.25) 149007 1'
E       and: 'test_slow 1,358,009.8748 (>1000.0) 18,754,065.0368 (257.71) 4,119,647.3024 (>1000.0) 3,527,473.6761 (>1000.0) 3,602,618.3516 (>1000.0) 3,259,829.4783 (>1000.0) 1;1 242.7392 (0.00) 23 1'
E       and: 'test_slower 10,055,615.5667 (>1000.0) 25,591,122.9178 (351.66) 11,425,354.2572 (>1000.0) 2,559,855.6776 (>1000.0) 10,455,658.6593 (>1000.0) 1,505,138.8182 (>1000.0) 11;11 87.5246 (0.00) 100 1'
E       and: '--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------'
E       and: ''
E       and: 'Legend:'
E       and: '  Outliers: 1 Standard Deviation from Mean; 1.5 IQR (InterQuartile Range) from 1st Quartile and 3rd Quartile.'
E       and: '  OPS: Operations Per Second, computed as 1 / Mean'
E       and: '============================== 5 passed in 4.79s ==============================='
E   remains unmatched: 'test_basic.py::test_fast PASSED*'
----------------------------- Captured stdout call -----------------------------
running: /usr/bin/python3 -mpytest --basetemp=/tmp/pytest-of-buildozer/pytest-218/test_basic0/runpytest-0 -vv --doctest-modules /tmp/pytest-of-buildozer/pytest-218/test_basic0/test_basic.py
     in: /tmp/pytest-of-buildozer/pytest-218/test_basic0
============================= test session starts ==============================
platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /tmp/pytest-of-buildozer/pytest-218/test_basic0
plugins: benchmark-3.4.1, xdist-2.5.0, forked-1.4.0
collecting ...
collected 5 items
test_basic.py::test_basic PASSED [ 20%]
test_basic.py::test_fast PASSED [ 40%]
test_basic.py::test_slow PASSED [ 60%]
test_basic.py::test_slower PASSED [ 80%]
test_basic.py::test_xfast PASSED [100%]
------------------------------------------------------------------------------------------------------------ benchmark: 4 tests ------------------------------------------------------------------------------------------------------------
Name (time in ns) Min Max Mean StdDev Median IQR Outliers OPS Rounds Iterations
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
test_xfast 101.2487 (1.0) 72,771.9115 (1.0) 110.9902 (1.0) 475.3265 (1.0) 102.8180 (1.0) 1.4203 (1.0) 65;2690 9,009,806.7004 (1.0) 46634 200
test_fast 371.5977 (3.67) 3,685,037.6055 (50.64) 443.6367 (4.00) 11,568.3552 (24.34) 387.4302 (3.77) 8.3819 (5.90) 13;9298 2,254,096.7182 (0.25) 149007 1
test_slow 1,358,009.8748 (>1000.0) 18,754,065.0368 (257.71) 4,119,647.3024 (>1000.0) 3,527,473.6761 (>1000.0) 3,602,618.3516 (>1000.0) 3,259,829.4783 (>1000.0) 1;1 242.7392 (0.00) 23 1
test_slower 10,055,615.5667 (>1000.0) 25,591,122.9178 (351.66) 11,425,354.2572 (>1000.0) 2,559,855.6776 (>1000.0) 10,455,658.6593 (>1000.0) 1,505,138.8182 (>1000.0) 11;11 87.5246 (0.00) 100 1
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Legend:
  Outliers: 1 Standard Deviation from Mean; 1.5 IQR (InterQuartile Range) from 1st Quartile and 3rd Quartile.
  OPS: Operations Per Second, computed as 1 / Mean
============================== 5 passed in 4.79s ===============================
__________________________________ test_skip ___________________________________
/home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/tests/test_benchmark.py:1052: in test_skip
    result.stdout.fnmatch_lines([
E   Failed: nomatch: '*collected 5 items'
E       and: '============================= test session starts =============================='
E       and: 'platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3'
E       and: 'cachedir: .pytest_cache'
E       and: 'benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)'
E       and: 'rootdir: /tmp/pytest-of-buildozer/pytest-218/test_skip0'
E       and: 'plugins: benchmark-3.4.1, xdist-2.5.0, forked-1.4.0'
E   fnmatch: '*collected 5 items'
E      with: 'collecting ... collected 5 items'
E   nomatch: 'test_skip.py::*test_skip PASSED*'
E       and: ''
E   fnmatch: 'test_skip.py::*test_skip PASSED*'
E      with: 'test_skip.py::test_skip PASSED [ 20%]'
E   nomatch: 'test_skip.py::test_slow SKIPPED*'
E       and: 'test_skip.py::test_fast SKIPPED (Skipping benchmark (--benchmark-skip active).) [ 40%]'
E   fnmatch: 'test_skip.py::test_slow SKIPPED*'
E      with: 'test_skip.py::test_slow SKIPPED (Skipping benchmark (--benchmark-skip active).) [ 60%]'
E   fnmatch: 'test_skip.py::test_slower SKIPPED*'
E      with: 'test_skip.py::test_slower SKIPPED (Skipping benchmark (--benchmark-skip active).) [ 80%]'
E   fnmatch: 'test_skip.py::test_xfast SKIPPED*'
E      with: 'test_skip.py::test_xfast SKIPPED (Skipping benchmark (--benchmark-skip active).) [100%]'
E   nomatch: 'test_skip.py::test_fast SKIPPED*'
E       and: ''
E       and: '========================= 1 passed, 4 skipped in 0.04s ========================='
E   remains unmatched: 'test_skip.py::test_fast SKIPPED*'
----------------------------- Captured stdout call -----------------------------
running: /usr/bin/python3 -mpytest --basetemp=/tmp/pytest-of-buildozer/pytest-218/test_skip0/runpytest-0 -vv --doctest-modules --benchmark-skip /tmp/pytest-of-buildozer/pytest-218/test_skip0/test_skip.py
     in: /tmp/pytest-of-buildozer/pytest-218/test_skip0
============================= test session starts ==============================
platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /tmp/pytest-of-buildozer/pytest-218/test_skip0
plugins: benchmark-3.4.1, xdist-2.5.0, forked-1.4.0
collecting ... collected 5 items
test_skip.py::test_skip PASSED [ 20%]
test_skip.py::test_fast SKIPPED (Skipping benchmark (--benchmark-skip active).) [ 40%]
test_skip.py::test_slow SKIPPED (Skipping benchmark (--benchmark-skip active).) [ 60%]
test_skip.py::test_slower SKIPPED (Skipping benchmark (--benchmark-skip active).) [ 80%]
test_skip.py::test_xfast SKIPPED (Skipping benchmark (--benchmark-skip active).) [100%]
========================= 1 passed, 4 skipped in 0.04s =========================
_________________________________ test_disable _________________________________
/home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/tests/test_benchmark.py:1066: in test_disable
    result.stdout.fnmatch_lines([
E   Failed: nomatch: '*collected 5 items'
E       and: '============================= test session starts =============================='
E       and: 'platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3'
E       and: 'cachedir: .pytest_cache'
E       and: 'benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)'
E       and: 'rootdir: /tmp/pytest-of-buildozer/pytest-218/test_disable0'
E       and: 'plugins: benchmark-3.4.1, xdist-2.5.0, forked-1.4.0'
E   fnmatch: '*collected 5 items'
E      with: 'collecting ... collected 5 items'
E   nomatch: 'test_disable.py::*test_disable PASSED*'
E       and: ''
E   fnmatch: 'test_disable.py::*test_disable PASSED*'
E      with: 'test_disable.py::test_disable PASSED [ 20%]'
E   nomatch: 'test_disable.py::test_slow PASSED*'
E       and: 'test_disable.py::test_fast PASSED [ 40%]'
E   fnmatch: 'test_disable.py::test_slow PASSED*'
E      with: 'test_disable.py::test_slow PASSED [ 60%]'
E   fnmatch: 'test_disable.py::test_slower PASSED*'
E      with: 'test_disable.py::test_slower PASSED [ 80%]'
E   fnmatch: 'test_disable.py::test_xfast PASSED*'
E      with: 'test_disable.py::test_xfast PASSED [100%]'
E   nomatch: 'test_disable.py::test_fast PASSED*'
E       and: ''
E       and: '============================== 5 passed in 0.05s ==============================='
E   remains unmatched: 'test_disable.py::test_fast PASSED*'
----------------------------- Captured stdout call -----------------------------
running: /usr/bin/python3 -mpytest --basetemp=/tmp/pytest-of-buildozer/pytest-218/test_disable0/runpytest-0 -vv --doctest-modules --benchmark-disable /tmp/pytest-of-buildozer/pytest-218/test_disable0/test_disable.py
     in: /tmp/pytest-of-buildozer/pytest-218/test_disable0
============================= test session starts ==============================
platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /tmp/pytest-of-buildozer/pytest-218/test_disable0
plugins: benchmark-3.4.1, xdist-2.5.0, forked-1.4.0
collecting ... collected 5 items
test_disable.py::test_disable PASSED [ 20%]
test_disable.py::test_fast PASSED [ 40%]
test_disable.py::test_slow PASSED [ 60%]
test_disable.py::test_slower PASSED [ 80%]
test_disable.py::test_xfast PASSED [100%]
============================== 5 passed in 0.05s ===============================
_____________________________ test_only_benchmarks _____________________________
/home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/tests/test_benchmark.py:1095: in test_only_benchmarks
    result.stdout.fnmatch_lines([
E   Failed: nomatch: '*collected 5 items'
E       and: '============================= test session starts =============================='
E       and: 'platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3'
E       and: 'cachedir: .pytest_cache'
E       and: 'benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)'
E       and: 'rootdir: /tmp/pytest-of-buildozer/pytest-218/test_only_benchmarks0'
E       and: 'plugins: benchmark-3.4.1, xdist-2.5.0, forked-1.4.0'
E   fnmatch: '*collected 5 items'
E      with: 'collecting ... collected 5 items'
E   nomatch: 'test_only_benchmarks.py::*test_only_benchmarks SKIPPED*'
E       and: ''
E   fnmatch: 'test_only_benchmarks.py::*test_only_benchmarks SKIPPED*'
E      with: 'test_only_benchmarks.py::test_only_benchmarks SKIPPED (Skipping non-benchmark (--benchmark-only active).) [ 20%]'
E   nomatch: 'test_only_benchmarks.py::test_slow PASSED*'
E       and: 'test_only_benchmarks.py::test_fast PASSED [ 40%]'
E   fnmatch: 'test_only_benchmarks.py::test_slow PASSED*'
E      with: 'test_only_benchmarks.py::test_slow PASSED [ 60%]'
E   fnmatch: 'test_only_benchmarks.py::test_slower PASSED*'
E      with: 'test_only_benchmarks.py::test_slower PASSED [ 80%]'
E   fnmatch: 'test_only_benchmarks.py::test_xfast PASSED*'
E      with: 'test_only_benchmarks.py::test_xfast PASSED [100%]'
E   nomatch: 'test_only_benchmarks.py::test_fast PASSED*'
E       and: ''
E       and: ''
E       and: '------------------------------------------------------------------------------------------------------------ benchmark: 4 tests -----------------------------------------------------------------------------------------------------------'
E       and: 'Name (time in ns) Min Max Mean StdDev Median IQR Outliers OPS Rounds Iterations'
E       and: '-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------'
E       and: 'test_xfast 99.5444 (1.0) 827,060.4443 (1.0) 191.9483 (1.48) 6,559.0760 (2.06) 101.6539 (1.0) 1.4808 (1.0) 31;3185 5,209,735.2919 (0.67) 45386 200'
E       and: 'test_fast 108.2028 (1.09) 1,263,189.8492 (1.53) 129.5133 (1.0) 3,177.9198 (1.0) 110.7427 (1.09) 2.2013 (1.49) 59;14654 7,721,214.2016 (1.0) 196046 44'
E       and: 'test_slow 1,004,992.0529 (>1000.0) 111,956,682.9875 (135.37) 1,707,145.2634 (>1000.0) 5,368,223.2608 (>1000.0) 1,053,699.2922 (>1000.0) 7,107.3882 (>1000.0) 13;211 585.7732 (0.00) 777 1'
E       and: 'test_slower 10,010,620.5791 (>1000.0) 62,167,905.2711 (75.17) 11,761,309.3834 (>1000.0) 6,171,483.4433 (>1000.0) 10,057,442.3559 (>1000.0) 670,063.3094 (>1000.0) 6;17 85.0245 (0.00) 100 1'
E       and: '-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------'
E       and: ''
E       and: 'Legend:'
E       and: '  Outliers: 1 Standard Deviation from Mean; 1.5 IQR (InterQuartile Range) from 1st Quartile and 3rd Quartile.'
E       and: '  OPS: Operations Per Second, computed as 1 / Mean'
E       and: '========================= 4 passed, 1 skipped in 8.35s ========================='
E   remains unmatched: 'test_only_benchmarks.py::test_fast PASSED*'
----------------------------- Captured stdout call -----------------------------
running: /usr/bin/python3 -mpytest --basetemp=/tmp/pytest-of-buildozer/pytest-218/test_only_benchmarks0/runpytest-0 -vv --doctest-modules --benchmark-only /tmp/pytest-of-buildozer/pytest-218/test_only_benchmarks0/test_only_benchmarks.py
     in: /tmp/pytest-of-buildozer/pytest-218/test_only_benchmarks0
============================= test session starts ==============================
platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /tmp/pytest-of-buildozer/pytest-218/test_only_benchmarks0
plugins: benchmark-3.4.1, xdist-2.5.0, forked-1.4.0
collecting ... collected 5 items
test_only_benchmarks.py::test_only_benchmarks SKIPPED (Skipping non-benchmark (--benchmark-only active).) [ 20%]
test_only_benchmarks.py::test_fast PASSED [ 40%]
test_only_benchmarks.py::test_slow PASSED [ 60%]
test_only_benchmarks.py::test_slower PASSED [ 80%]
test_only_benchmarks.py::test_xfast PASSED [100%]
------------------------------------------------------------------------------------------------------------ benchmark: 4 tests -----------------------------------------------------------------------------------------------------------
Name (time in ns) Min Max Mean StdDev Median IQR Outliers OPS Rounds Iterations
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
test_xfast 99.5444 (1.0) 827,060.4443 (1.0) 191.9483 (1.48) 6,559.0760 (2.06) 101.6539 (1.0) 1.4808 (1.0) 31;3185 5,209,735.2919 (0.67) 45386 200
test_fast 108.2028 (1.09) 1,263,189.8492 (1.53) 129.5133 (1.0) 3,177.9198 (1.0) 110.7427 (1.09) 2.2013 (1.49) 59;14654 7,721,214.2016 (1.0) 196046 44
test_slow 1,004,992.0529 (>1000.0) 111,956,682.9875 (135.37) 1,707,145.2634 (>1000.0) 5,368,223.2608 (>1000.0) 1,053,699.2922 (>1000.0) 7,107.3882 (>1000.0) 13;211 585.7732 (0.00) 777 1
test_slower 10,010,620.5791 (>1000.0) 62,167,905.2711 (75.17) 11,761,309.3834 (>1000.0) 6,171,483.4433 (>1000.0) 10,057,442.3559 (>1000.0) 670,063.3094 (>1000.0) 6;17 85.0245 (0.00) 100 1
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Legend:
  Outliers: 1 Standard Deviation from Mean; 1.5 IQR (InterQuartile Range) from 1st Quartile and 3rd Quartile.
  OPS: Operations Per Second, computed as 1 / Mean
========================= 4 passed, 1 skipped in 8.35s =========================
=============================== warnings summary ===============================
../../../../../../../usr/lib/python3.10/site-packages/_pytest/config/__init__.py:1199
  /usr/lib/python3.10/site-packages/_pytest/config/__init__.py:1199: PytestRemovedIn8Warning: The --strict option is deprecated, use --strict-markers instead.
    self.issue_config_time_warning(
tests/test_utils.py:35
  /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/tests/test_utils.py:35: PytestDeprecationWarning: @pytest.yield_fixture is deprecated. Use @pytest.fixture instead; they are the same.
    @pytest.yield_fixture(params=(True, False))
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
------------------------------------------------------------------------------------------------------------------------------------- benchmark: 58 tests --------------------------------------------------------------------------------------------------------------------------------------
Name (time in ns) Min Max Mean StdDev Median IQR Outliers OPS Rounds Iterations
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
test_calibrate_stuck[False--1-1e-10] 5.0500 (1.0) 5.0500 (1.0) 5.0500 (1.0) 0.0000 (1.0) 5.0500 (1.0) 0.0000 (1.0) 0;0 198,019,801.9802 (1.0) 1 2
test_calibrate_stuck[False-0-1e-10] 5.0500 (1.0) 5.0500 (1.0) 5.0500 (1.0) 0.0000 (1.0) 5.0500 (1.0) 0.0000 (1.0) 0;0 198,019,801.9802 (1.0) 1 2
test_calibrate_stuck[False-1-1e-10] 5.0500 (1.0) 5.0500 (1.0) 5.0500 (1.0) 0.0000 (1.0) 5.0500 (1.0) 0.0000 (1.0) 0;0 198,019,801.9802 (1.0) 1 2
test_calibrate_stuck[True--1-1e-10] 10.0000 (1.98) 10.0000 (1.98) 10.0000 (1.98) 0.0000 (1.0) 10.0000 (1.98) 0.0000 (1.0) 0;0 100,000,000.0001 (0.51) 1 1
test_calibrate_stuck[True-0-1e-10] 10.0000 (1.98) 10.0000 (1.98) 10.0000 (1.98) 0.0000 (1.0) 10.0000 (1.98) 0.0000 (1.0) 0;0 100,000,000.0001 (0.51) 1 1
test_calibrate_stuck[True-1-1e-10] 10.0000 (1.98) 10.0000 (1.98) 10.0000 (1.98) 0.0000 (1.0) 10.0000 (1.98) 0.0000 (1.0) 0;0 100,000,000.0001 (0.51) 1 1
test_calibrate_stuck[False--1-1e-09] 50.5000 (10.00) 50.5000 (10.00) 50.5000 (10.00) 0.0000 (1.0) 50.5000 (10.00) 0.0000 (1.0) 0;0 19,801,980.1981 (0.10) 1 2
test_calibrate_stuck[False-0-1e-09] 50.5000 (10.00) 50.5000 (10.00) 50.5000 (10.00) 0.0000 (1.0) 50.5000 (10.00) 0.0000 (1.0) 0;0 19,801,980.1981 (0.10) 1 2
test_calibrate_stuck[False-1-1e-09] 50.5000 (10.00) 50.5000 (10.00) 50.5000 (10.00) 0.0000 (1.0) 50.5000 (10.00) 0.0000 (1.0) 0;0 19,801,980.1981 (0.10) 1 2
test_calibrate_stuck[True--1-1e-09] 100.0000 (19.80) 100.0000 (19.80) 100.0000 (19.80) 0.0000 (1.0) 100.0000 (19.80) 0.0000 (1.0) 0;0 10,000,000.0000 (0.05) 1 1
test_calibrate_stuck[True-0-1e-09] 100.0000 (19.80) 100.0000 (19.80) 100.0000 (19.80) 0.0000 (1.0) 100.0000 (19.80) 0.0000 (1.0) 0;0 10,000,000.0000 (0.05) 1 1
test_calibrate_stuck[True-1-1e-09] 100.0000 (19.80) 100.0000 (19.80) 100.0000 (19.80) 0.0000 (1.0) 100.0000 (19.80) 0.0000 (1.0) 0;0 10,000,000.0000 (0.05) 1 1
test_calibrate_xfast 128.3642 (25.42) 486,802.9431 (>1000.0) 146.8057 (29.07) 1,537.7446 (inf) 130.4224 (25.83) 0.5309 (inf) 552;132907 6,811,725.3645 (0.03) 783868 100
test_rounds_iterations 165.5892 (32.79) 318.4192 (63.05) 217.0851 (42.99) 47.5633 (inf) 214.5767 (42.49) 54.9713 (inf) 6;1 4,606,488.7770 (0.02) 15 10
test_iterations 236.7422 (46.88) 236.7422 (46.88) 236.7422 (46.88) 0.0000 (1.0) 236.7422 (46.88) 0.0000 (1.0) 0;0 4,224,004.0283 (0.02) 1 10
test_xfast 238.4186 (47.21) 715.2557 (141.63) 388.1973 (76.87) 150.0327 (inf) 476.8372 (94.42) 238.4186 (inf) 85;0 2,576,009.8923 (0.01) 1049 1
test_warmup_rounds 355.7652 (70.45) 451.6914 (89.44) 377.3719 (74.73) 41.6677 (inf) 357.6279 (70.82) 28.8710 (inf) 1;1 2,649,905.7848 (0.01) 5 1
test_rounds 385.5675 (76.35) 643.5439 (127.43) 433.6859 (85.88) 81.2386 (inf) 396.7434 (78.56) 46.3333 (inf) 2;2 2,305,816.3722 (0.01) 15 1
test_proto[SimpleProxy] 431.7146 (85.49) 2,157,331.6306 (>1000.0) 533.0952 (105.56) 10,683.6882 (inf) 443.8683 (87.89) 7.1246 (inf) 23;3588 1,875,837.7028 (0.01) 90550 20
test_proto[LocalsSimpleProxy] 438.0010 (86.73) 877,130.7766 (>1000.0) 489.1465 (96.86) 3,835.8440 (inf) 448.9440 (88.90) 6.9384 (inf) 45;3019 2,044,377.1343 (0.01) 92413 20
test_single 654.7198 (129.65) 654.7198 (129.65) 654.7198 (129.65) 0.0000 (1.0) 654.7198 (129.65) 0.0000 (1.0) 0;0 1,527,371.0156 (0.01) 1 1
test_calibrate_fast 669.9003 (132.65) 35,956,313.3679 (>1000.0) 1,053.8904 (208.69) 52,751.2492 (inf) 708.9227 (140.38) 18.0677 (inf) 602;71014 948,865.3092 (0.00) 1538753 10
test_proto[CachedPropertyProxy] 744.1267 (147.35) 617,737.8818 (>1000.0) 800.9057 (158.60) 1,945.2740 (inf) 784.1736 (155.28) 13.9698 (inf) 38;9757 1,248,586.4323 (0.01) 125482 1
test_proto[LocalsCachedPropertyProxy] 745.0581 (147.54) 174,229.0333 (>1000.0) 798.3995 (158.10) 849.0511 (inf) 782.3110 (154.91) 18.6265 (inf) 90;13798 1,252,505.8084 (0.01) 145632 1
test_setup_many_rounds 893.1383 (176.86) 1,524.5751 (301.90) 988.3195 (195.71) 190.7854 (inf) 928.0629 (183.77) 40.0469 (inf) 1;1 1,011,818.5300 (0.01) 10 1
test_can_use_both_args_and_setup_without_return 1,170.6725 (231.82) 1,170.6725 (231.82) 1,170.6725 (231.82) 0.0000 (1.0) 1,170.6725 (231.82) 0.0000 (1.0) 0;0 854,209.8839 (0.00) 1 1
test_args_kwargs 1,527.3690 (302.45) 1,527.3690 (302.45) 1,527.3690 (302.45) 0.0000 (1.0) 1,527.3690 (302.45) 0.0000 (1.0) 0;0 654,720.6244 (0.00) 1 1
test_setup_cprofile 1,532.9570 (303.56) 1,532.9570 (303.56) 1,532.9570 (303.56) 0.0000 (1.0) 1,532.9570 (303.56) 0.0000 (1.0) 0;0 652,334.0365 (0.00) 1 1
test_setup 2,266.8391 (448.88) 2,266.8391 (448.88) 2,266.8391 (448.88) 0.0000 (1.0) 2,266.8391 (448.88) 0.0000 (1.0) 0;0 441,142.9022 (0.00) 1 1
test_foo 6,530.4339 (>1000.0) 37,072,069.0116 (>1000.0) 105,637.9184 (>1000.0) 566,469.9793 (inf) 53,945.9288 (>1000.0) 1,463.1078 (inf) 243;1718 9,466.2978 (0.00) 17212 1
test_calibrate_slow 12,847.5949 (>1000.0) 240,938,390.6052 (>1000.0) 110,037.6472 (>1000.0) 1,018,899.1334 (inf) 62,624.9239 (>1000.0) 932.2539 (inf) 5257;77342 9,087.7988 (0.00) 665433 1
test_parametrized[3] 13,084.1509 (>1000.0) 21,048,039.1979 (>1000.0) 122,306.4257 (>1000.0) 511,744.2822 (inf) 63,445.4191 (>1000.0) 641.6813 (inf) 295;2075 8,176.1853 (0.00) 15141 1
test_parametrized[4] 13,424.0836 (>1000.0) 9,670,732.5429 (>1000.0) 119,966.6467 (>1000.0) 410,565.8459 (inf) 63,402.5782 (>1000.0) 628.6427 (inf) 370;2445 8,335.6502 (0.00) 15565 1
test_parametrized[2] 13,878.5690 (>1000.0) 8,206,707.4254 (>1000.0) 147,864.9663 (>1000.0) 539,991.3564 (inf) 63,396.0590 (>1000.0) 595.1151 (inf) 419;2043 6,762.9272 (0.00) 15229 1
test_parametrized[1] 15,335.1575 (>1000.0) 5,700,932.8157 (>1000.0) 135,042.2770 (>1000.0) 451,041.3025 (inf) 63,429.1209 (>1000.0) 776.2574 (inf) 427;2737 7,405.0884 (0.00) 15252 1
test_parametrized[0] 17,127.9535 (>1000.0) 62,253,041.1929 (>1000.0) 210,652.1633 (>1000.0) 1,302,927.0951 (inf) 63,400.7156 (>1000.0) 798.6091 (inf) 138;1016 4,747.1623 (0.00) 5624 1
test_fast 42,543.7465 (>1000.0) 27,101,828.7167 (>1000.0) 126,269.2932 (>1000.0) 615,287.8062 (inf) 55,540.8187 (>1000.0) 6,203.5397 (inf) 113;790 7,919.5818 (0.00) 5950 1
test_calibrate 45,863.9115 (>1000.0) 415,705,394.9311 (>1000.0) 73,743.8374 (>1000.0) 1,799,594.3365 (inf) 48,461.3702 (>1000.0) 1,196.7495 (inf) 207;12662 13,560.4552 (0.00) 213604 1
test_slow 1,053,364.0161 (>1000.0) 14,269,577.3393 (>1000.0) 1,865,430.2846 (>1000.0) 1,482,467.6011 (inf) 1,096,403.2263 (>1000.0) 763,445.1613 (inf) 120;131 536.0693 (0.00) 767 1
test_slower 10,019,188.7468 (>1000.0) 14,878,639.9513 (>1000.0) 10,657,403.9522 (>1000.0) 1,034,108.3551 (inf) 10,153,917.1301 (>1000.0) 734,513.1598 (inf) 11;9 93.8315 (0.00) 92 1
test_calibrate_stuck[False--1-0.01] 505,000,000.0001 (>1000.0) 505,000,000.0001 (>1000.0) 505,000,000.0001 (>1000.0) 0.0000 (1.0) 505,000,000.0001 (>1000.0) 0.0000 (1.0) 0;0 1.9802 (0.00) 1 2
test_calibrate_stuck[False-0-0.01] 505,000,000.0001 (>1000.0) 505,000,000.0001 (>1000.0) 505,000,000.0001 (>1000.0) 0.0000 (1.0) 505,000,000.0001 (>1000.0) 0.0000 (1.0) 0;0 1.9802 (0.00) 1 2
test_calibrate_stuck[False-1-0.01] 505,000,000.0001 (>1000.0) 505,000,000.0001 (>1000.0) 505,000,000.0001 (>1000.0) 0.0000 (1.0) 505,000,000.0001 (>1000.0) 0.0000 (1.0) 0;0 1.9802 (0.00) 1 2
test_calibrate_stuck[True--1-0.01] 1,000,000,000.0000 (>1000.0) 1,000,000,000.0000 (>1000.0) 1,000,000,000.0000 (>1000.0) 0.0000 (1.0) 1,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 1.0000 (0.00) 1 1
test_calibrate_stuck[True-0-0.01] 1,000,000,000.0000 (>1000.0) 1,000,000,000.0000 (>1000.0) 1,000,000,000.0000 (>1000.0) 0.0000 (1.0) 1,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 1.0000 (0.00) 1 1
test_calibrate_stuck[True-1-0.01] 1,000,000,000.0000 (>1000.0) 1,000,000,000.0000 (>1000.0) 1,000,000,000.0000 (>1000.0) 0.0000 (1.0) 1,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 1.0000 (0.00) 1 1
test_calibrate_stuck[False--1-1.000000000000001] 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0198 (0.00) 1 2
test_calibrate_stuck[False--1-1] 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0198 (0.00) 1 2
test_calibrate_stuck[False-0-1.000000000000001] 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0198 (0.00) 1 2
test_calibrate_stuck[False-0-1] 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0198 (0.00) 1 2
test_calibrate_stuck[False-1-1.000000000000001] 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0198 (0.00) 1 2
test_calibrate_stuck[False-1-1] 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0198 (0.00) 1 2
test_calibrate_stuck[True--1-1.000000000000001] 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0100 (0.00) 1 1
test_calibrate_stuck[True--1-1] 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0100 (0.00) 1 1
test_calibrate_stuck[True-0-1.000000000000001] 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0100 (0.00) 1 1
test_calibrate_stuck[True-0-1] 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0100 (0.00) 1 1
test_calibrate_stuck[True-1-1.000000000000001] 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0100 (0.00) 1 1
test_calibrate_stuck[True-1-1] 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0100 (0.00) 1 1
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Legend:
  Outliers: 1 Standard Deviation from Mean; 1.5 IQR (InterQuartile Range) from 1st Quartile and 3rd Quartile.
  OPS: Operations Per Second, computed as 1 / Mean
----------------------------- cProfile (time in s) -----------------------------
tests/test_pedantic.py::test_setup_cprofile
ncalls tottime percall cumtime percall filename:lineno(function)
1 0.0000 0.0000 0.0000 0.0000 pytest-benchmark-3.4.1/tests/test_pedantic.py:29(stuff)
1 0.0000 0.0000 0.0000 0.0000 ~:0()
1 0.0000 0.0000 0.0000 0.0000 ~:0()
=========================== short test summary info ============================
SKIPPED [1] tests/test_skip.py:5: bla
SKIPPED [5] tests/test_utils.py:47: %r not availabe on $PATH
SKIPPED [4] tests/test_utils.py:160: %r not availabe on $PATH
FAILED tests/test_benchmark.py::test_basic - Failed: nomatch: '*collected 5 i...
FAILED tests/test_benchmark.py::test_skip - Failed: nomatch: '*collected 5 it...
FAILED tests/test_benchmark.py::test_disable - Failed: nomatch: '*collected 5...
FAILED tests/test_benchmark.py::test_only_benchmarks - Failed: nomatch: '*col...
= 4 failed, 208 passed, 10 skipped, 10 deselected, 2 warnings in 518.49s (0:08:38) =
>>> ERROR: py3-pytest-benchmark: check failed
>>> py3-pytest-benchmark: Uninstalling dependencies...
(1/29) Purging .makedepends-py3-pytest-benchmark (20221026.095255)
(2/29) Purging py3-py-cpuinfo (8.0.0-r0)
(3/29) Purging py3-setuptools (65.5.0-r0)
(4/29) Purging py3-pytest-xdist (2.5.0-r1)
(5/29) Purging py3-execnet (1.9.0-r0)
(6/29) Purging py3-apipkg (2.1.0-r0)
(7/29) Purging py3-pytest-forked (1.4.0-r1)
(8/29) Purging py3-pytest (7.1.3-r1)
(9/29) Purging py3-attrs (22.1.0-r0)
(10/29) Purging py3-iniconfig (1.1.1-r3)
(11/29) Purging py3-packaging (21.3-r2)
(12/29) Purging py3-parsing (3.0.9-r0)
(13/29) Purging py3-pluggy (1.0.0-r1)
(14/29) Purging py3-py (1.11.0-r0)
(15/29) Purging py3-tomli (2.0.1-r1)
(16/29) Purging py3-freezegun (1.2.2-r0)
(17/29) Purging py3-dateutil (2.8.2-r1)
(18/29) Purging py3-six (1.16.0-r3)
(19/29) Purging py3-pygal (3.0.0-r1)
(20/29) Purging py3-elasticsearch (7.11.0-r1)
(21/29) Purging py3-urllib3 (1.26.12-r0)
(22/29) Purging python3 (3.10.8-r1)
(23/29) Purging libbz2 (1.0.8-r3)
(24/29) Purging libffi (3.4.3-r0)
(25/29) Purging gdbm (1.23-r0)
(26/29) Purging xz-libs (5.2.7-r0)
(27/29) Purging mpdecimal (2.5.1-r1)
(28/29) Purging readline (8.2.0-r0)
(29/29) Purging sqlite-libs (3.39.4-r0)
Executing busybox-1.35.0-r27.trigger
OK: 240 MiB in 92 packages