>>> py3-pytest-benchmark: Building community/py3-pytest-benchmark 3.4.1-r1 (using abuild 3.10.0_rc1-r2) started Wed, 26 Oct 2022 15:24:11 +0000
>>> py3-pytest-benchmark: Checking sanity of /home/buildozer/aports/community/py3-pytest-benchmark/APKBUILD...
>>> py3-pytest-benchmark: Analyzing dependencies...
>>> py3-pytest-benchmark: Installing for build: build-base python3 py3-pytest py3-py-cpuinfo py3-setuptools py3-pytest-xdist py3-freezegun py3-pygal py3-elasticsearch
(1/29) Installing libbz2 (1.0.8-r3)
(2/29) Installing libffi (3.4.3-r0)
(3/29) Installing gdbm (1.23-r0)
(4/29) Installing xz-libs (5.2.7-r0)
(5/29) Installing mpdecimal (2.5.1-r1)
(6/29) Installing readline (8.2.0-r0)
(7/29) Installing sqlite-libs (3.39.4-r0)
(8/29) Installing python3 (3.10.8-r1)
(9/29) Installing py3-attrs (22.1.0-r0)
(10/29) Installing py3-iniconfig (1.1.1-r3)
(11/29) Installing py3-parsing (3.0.9-r0)
(12/29) Installing py3-packaging (21.3-r2)
(13/29) Installing py3-pluggy (1.0.0-r1)
(14/29) Installing py3-py (1.11.0-r0)
(15/29) Installing py3-tomli (2.0.1-r1)
(16/29) Installing py3-pytest (7.1.3-r1)
(17/29) Installing py3-py-cpuinfo (8.0.0-r0)
(18/29) Installing py3-setuptools (65.5.0-r0)
(19/29) Installing py3-apipkg (2.1.0-r0)
(20/29) Installing py3-execnet (1.9.0-r0)
(21/29) Installing py3-pytest-forked (1.4.0-r1)
(22/29) Installing py3-pytest-xdist (2.5.0-r1)
(23/29) Installing py3-six (1.16.0-r3)
(24/29) Installing py3-dateutil (2.8.2-r1)
(25/29) Installing py3-freezegun (1.2.2-r0)
(26/29) Installing py3-pygal (3.0.0-r1)
(27/29) Installing py3-urllib3 (1.26.12-r0)
(28/29) Installing py3-elasticsearch (7.11.0-r1)
(29/29) Installing .makedepends-py3-pytest-benchmark (20221026.152429)
Executing busybox-1.35.0-r27.trigger
OK: 334 MiB in 121 packages
>>> py3-pytest-benchmark: Cleaning up srcdir
>>> py3-pytest-benchmark: Cleaning up pkgdir
>>> py3-pytest-benchmark: Fetching https://distfiles.alpinelinux.org/distfiles/v3.17/pytest-benchmark-3.4.1.tar.gz
curl: (22) The requested URL returned error: 404
>>> py3-pytest-benchmark: Fetching https://github.com/ionelmc/pytest-benchmark/archive/v3.4.1/pytest-benchmark-3.4.1.tar.gz
100  315k    0  315k    0     0   497k      0 --:--:-- --:--:-- --:--:-- 2317k
>>> py3-pytest-benchmark: Fetching https://distfiles.alpinelinux.org/distfiles/v3.17/pytest-benchmark-3.4.1.tar.gz
>>> py3-pytest-benchmark: Checking sha512sums...
pytest-benchmark-3.4.1.tar.gz: OK
python-3.10.patch: OK
>>> py3-pytest-benchmark: Unpacking /var/cache/distfiles/v3.17/pytest-benchmark-3.4.1.tar.gz...
>>> py3-pytest-benchmark: python-3.10.patch
patching file tests/test_cli.py
running build
running build_py
creating build
creating build/lib
creating build/lib/pytest_benchmark
copying src/pytest_benchmark/histogram.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/pep418.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/utils.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/logger.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/timers.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/fixture.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/hookspec.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/__init__.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/stats.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/plugin.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/table.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/csv.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/session.py ->
build/lib/pytest_benchmark
copying src/pytest_benchmark/cli.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/__main__.py -> build/lib/pytest_benchmark
copying src/pytest_benchmark/compat.py -> build/lib/pytest_benchmark
creating build/lib/pytest_benchmark/storage
copying src/pytest_benchmark/storage/elasticsearch.py -> build/lib/pytest_benchmark/storage
copying src/pytest_benchmark/storage/__init__.py -> build/lib/pytest_benchmark/storage
copying src/pytest_benchmark/storage/file.py -> build/lib/pytest_benchmark/storage
running egg_info
creating src/pytest_benchmark.egg-info
writing src/pytest_benchmark.egg-info/PKG-INFO
writing dependency_links to src/pytest_benchmark.egg-info/dependency_links.txt
writing entry points to src/pytest_benchmark.egg-info/entry_points.txt
writing requirements to src/pytest_benchmark.egg-info/requires.txt
writing top-level names to src/pytest_benchmark.egg-info/top_level.txt
writing manifest file 'src/pytest_benchmark.egg-info/SOURCES.txt'
reading manifest file 'src/pytest_benchmark.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no previously-included files matching '*.py[cod]' found anywhere in distribution
warning: no previously-included files matching '__pycache__/*' found anywhere in distribution
warning: no previously-included files matching '*.so' found anywhere in distribution
warning: no previously-included files matching '*.dylib' found anywhere in distribution
adding license file 'LICENSE'
adding license file 'AUTHORS.rst'
writing manifest file 'src/pytest_benchmark.egg-info/SOURCES.txt'
running install
/usr/lib/python3.10/site-packages/setuptools/command/install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
  warnings.warn(
running build
running build_py
running egg_info
writing src/pytest_benchmark.egg-info/PKG-INFO
writing dependency_links to src/pytest_benchmark.egg-info/dependency_links.txt
writing entry points to src/pytest_benchmark.egg-info/entry_points.txt
writing requirements to src/pytest_benchmark.egg-info/requires.txt
writing top-level names to src/pytest_benchmark.egg-info/top_level.txt
reading manifest file 'src/pytest_benchmark.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no previously-included files matching '*.py[cod]' found anywhere in distribution
warning: no previously-included files matching '__pycache__/*' found anywhere in distribution
warning: no previously-included files matching '*.so' found anywhere in distribution
warning: no previously-included files matching '*.dylib' found anywhere in distribution
adding license file 'LICENSE'
adding license file 'AUTHORS.rst'
writing manifest file 'src/pytest_benchmark.egg-info/SOURCES.txt'
running install_lib
creating /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install
creating /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr
creating /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib
creating /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10
creating /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages
creating /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/histogram.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/pep418.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/utils.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/logger.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/timers.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/fixture.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/hookspec.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/__init__.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/stats.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/plugin.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/table.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
creating /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/storage
copying build/lib/pytest_benchmark/storage/elasticsearch.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/storage
copying build/lib/pytest_benchmark/storage/__init__.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/storage
copying build/lib/pytest_benchmark/storage/file.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/storage
copying build/lib/pytest_benchmark/csv.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/session.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/cli.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/__main__.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
copying build/lib/pytest_benchmark/compat.py -> /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/histogram.py to histogram.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/pep418.py to pep418.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/utils.py to utils.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/logger.py to logger.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/timers.py to timers.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/fixture.py to fixture.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/hookspec.py to hookspec.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/__init__.py to __init__.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/stats.py to stats.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/plugin.py to plugin.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/table.py to table.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/storage/elasticsearch.py to elasticsearch.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/storage/__init__.py to __init__.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/storage/file.py to file.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/csv.py to csv.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/session.py to session.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/cli.py to cli.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/__main__.py to __main__.cpython-310.pyc
byte-compiling /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark/compat.py to compat.cpython-310.pyc
running install_egg_info
Copying src/pytest_benchmark.egg-info to /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/lib/python3.10/site-packages/pytest_benchmark-3.4.1-py3.10.egg-info
running install_scripts
Installing py.test-benchmark script to /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/bin
Installing
pytest-benchmark script to /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/test_install/usr/bin ============================= test session starts ============================== platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3 cachedir: .pytest_cache benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000) rootdir: /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1, configfile: setup.cfg, testpaths: tests plugins: benchmark-3.4.1, xdist-2.5.0, forked-1.4.0 collecting ... collected 232 items / 10 deselected / 222 selected tests/test_benchmark.py::test_help PASSED [ 0%] tests/test_benchmark.py::test_groups PASSED [ 0%] tests/test_benchmark.py::test_group_by_name PASSED [ 1%] tests/test_benchmark.py::test_group_by_func PASSED [ 1%] tests/test_benchmark.py::test_group_by_fullfunc PASSED [ 2%] tests/test_benchmark.py::test_group_by_param_all PASSED [ 2%] tests/test_benchmark.py::test_group_by_param_select PASSED [ 3%] tests/test_benchmark.py::test_group_by_param_select_multiple PASSED [ 3%] tests/test_benchmark.py::test_group_by_fullname PASSED [ 4%] tests/test_benchmark.py::test_double_use PASSED [ 4%] tests/test_benchmark.py::test_only_override_skip PASSED [ 4%] tests/test_benchmark.py::test_fixtures_also_skipped PASSED [ 5%] tests/test_benchmark.py::test_conflict_between_only_and_disable PASSED [ 5%] tests/test_benchmark.py::test_max_time_min_rounds PASSED [ 6%] tests/test_benchmark.py::test_max_time PASSED [ 6%] tests/test_benchmark.py::test_bogus_max_time PASSED [ 7%] tests/test_benchmark.py::test_pep418_timer PASSED [ 7%] tests/test_benchmark.py::test_bad_save PASSED [ 8%] tests/test_benchmark.py::test_bad_save_2 PASSED [ 8%] tests/test_benchmark.py::test_bad_compare_fail PASSED [ 9%] tests/test_benchmark.py::test_bad_rounds PASSED [ 9%] 
tests/test_benchmark.py::test_bad_rounds_2 PASSED [ 9%] tests/test_benchmark.py::test_compare PASSED [ 10%] tests/test_benchmark.py::test_compare_last PASSED [ 10%] tests/test_benchmark.py::test_compare_non_existing PASSED [ 11%] tests/test_benchmark.py::test_compare_non_existing_verbose PASSED [ 11%] tests/test_benchmark.py::test_compare_no_files PASSED [ 12%] tests/test_benchmark.py::test_compare_no_files_verbose PASSED [ 12%] tests/test_benchmark.py::test_compare_no_files_match PASSED [ 13%] tests/test_benchmark.py::test_compare_no_files_match_verbose PASSED [ 13%] tests/test_benchmark.py::test_verbose PASSED [ 13%] tests/test_benchmark.py::test_save PASSED [ 14%] tests/test_benchmark.py::test_save_extra_info PASSED [ 14%] tests/test_benchmark.py::test_update_machine_info_hook_detection PASSED [ 15%] tests/test_benchmark.py::test_histogram PASSED [ 15%] tests/test_benchmark.py::test_autosave PASSED [ 16%] tests/test_benchmark.py::test_bogus_min_time PASSED [ 16%] tests/test_benchmark.py::test_disable_gc PASSED [ 17%] tests/test_benchmark.py::test_custom_timer PASSED [ 17%] tests/test_benchmark.py::test_bogus_timer PASSED [ 18%] tests/test_benchmark.py::test_sort_by_mean PASSED [ 18%] tests/test_benchmark.py::test_bogus_sort PASSED [ 18%] tests/test_benchmark.py::test_xdist PASSED [ 19%] tests/test_benchmark.py::test_xdist_verbose PASSED [ 19%] tests/test_benchmark.py::test_cprofile PASSED [ 20%] tests/test_benchmark.py::test_disabled_and_cprofile PASSED [ 20%] tests/test_benchmark.py::test_abort_broken PASSED [ 21%] tests/test_benchmark.py::test_basic FAILED [ 21%] tests/test_benchmark.py::test_skip FAILED [ 22%] tests/test_benchmark.py::test_disable FAILED [ 22%] tests/test_benchmark.py::test_mark_selection PASSED [ 22%] tests/test_benchmark.py::test_only_benchmarks FAILED [ 23%] tests/test_benchmark.py::test_columns PASSED [ 23%] tests/test_calibration.py::test_calibrate PASSED [ 24%] tests/test_calibration.py::test_calibrate_fast PASSED [ 24%] 
tests/test_calibration.py::test_calibrate_xfast PASSED [ 25%] tests/test_calibration.py::test_calibrate_slow PASSED [ 25%] tests/test_calibration.py::test_calibrate_stuck[True-0-1] PASSED [ 26%] tests/test_calibration.py::test_calibrate_stuck[True-0-0.01] PASSED [ 26%] tests/test_calibration.py::test_calibrate_stuck[True-0-1e-09] PASSED [ 27%] tests/test_calibration.py::test_calibrate_stuck[True-0-1e-10] PASSED [ 27%] tests/test_calibration.py::test_calibrate_stuck[True-0-1.000000000000001] PASSED [ 27%] tests/test_calibration.py::test_calibrate_stuck[True-1-1] PASSED [ 28%] tests/test_calibration.py::test_calibrate_stuck[True-1-0.01] PASSED [ 28%] tests/test_calibration.py::test_calibrate_stuck[True-1-1e-09] PASSED [ 29%] tests/test_calibration.py::test_calibrate_stuck[True-1-1e-10] PASSED [ 29%] tests/test_calibration.py::test_calibrate_stuck[True-1-1.000000000000001] PASSED [ 30%] tests/test_calibration.py::test_calibrate_stuck[True--1-1] PASSED [ 30%] tests/test_calibration.py::test_calibrate_stuck[True--1-0.01] PASSED [ 31%] tests/test_calibration.py::test_calibrate_stuck[True--1-1e-09] PASSED [ 31%] tests/test_calibration.py::test_calibrate_stuck[True--1-1e-10] PASSED [ 31%] tests/test_calibration.py::test_calibrate_stuck[True--1-1.000000000000001] PASSED [ 32%] tests/test_calibration.py::test_calibrate_stuck[False-0-1] PASSED [ 32%] tests/test_calibration.py::test_calibrate_stuck[False-0-0.01] PASSED [ 33%] tests/test_calibration.py::test_calibrate_stuck[False-0-1e-09] PASSED [ 33%] tests/test_calibration.py::test_calibrate_stuck[False-0-1e-10] PASSED [ 34%] tests/test_calibration.py::test_calibrate_stuck[False-0-1.000000000000001] PASSED [ 34%] tests/test_calibration.py::test_calibrate_stuck[False-1-1] PASSED [ 35%] tests/test_calibration.py::test_calibrate_stuck[False-1-0.01] PASSED [ 35%] tests/test_calibration.py::test_calibrate_stuck[False-1-1e-09] PASSED [ 36%] tests/test_calibration.py::test_calibrate_stuck[False-1-1e-10] PASSED [ 36%] 
tests/test_calibration.py::test_calibrate_stuck[False-1-1.000000000000001] PASSED [ 36%] tests/test_calibration.py::test_calibrate_stuck[False--1-1] PASSED [ 37%] tests/test_calibration.py::test_calibrate_stuck[False--1-0.01] PASSED [ 37%] tests/test_calibration.py::test_calibrate_stuck[False--1-1e-09] PASSED [ 38%] tests/test_calibration.py::test_calibrate_stuck[False--1-1e-10] PASSED [ 38%] tests/test_calibration.py::test_calibrate_stuck[False--1-1.000000000000001] PASSED [ 39%] tests/test_cli.py::test_list PASSED [ 39%] tests/test_cli.py::test_compare[short-] PASSED [ 40%] tests/test_cli.py::test_compare[long-] PASSED [ 40%] tests/test_cli.py::test_compare[normal-] PASSED [ 40%] tests/test_cli.py::test_compare[trial-] PASSED [ 41%] tests/test_doctest.rst::test_doctest.rst PASSED [ 41%] tests/test_elasticsearch_storage.py::test_handle_saving PASSED [ 42%] tests/test_elasticsearch_storage.py::test_parse_with_no_creds PASSED [ 42%] tests/test_elasticsearch_storage.py::test_parse_with_creds_in_first_host_of_url PASSED [ 43%] tests/test_elasticsearch_storage.py::test_parse_with_creds_in_second_host_of_url PASSED [ 43%] tests/test_elasticsearch_storage.py::test_parse_with_creds_in_netrc PASSED [ 44%] tests/test_elasticsearch_storage.py::test_parse_url_creds_supersedes_netrc_creds PASSED [ 44%] tests/test_elasticsearch_storage.py::test__mask_hosts PASSED [ 45%] tests/test_normal.py::test_normal PASSED [ 45%] tests/test_normal.py::test_fast PASSED [ 45%] tests/test_normal.py::test_slow PASSED [ 46%] tests/test_normal.py::test_slower PASSED [ 46%] tests/test_normal.py::test_xfast PASSED [ 47%] tests/test_normal.py::test_parametrized[0] PASSED [ 47%] tests/test_normal.py::test_parametrized[1] PASSED [ 48%] tests/test_normal.py::test_parametrized[2] PASSED [ 48%] tests/test_normal.py::test_parametrized[3] PASSED [ 49%] tests/test_normal.py::test_parametrized[4] PASSED [ 49%] tests/test_pedantic.py::test_single PASSED [ 50%] tests/test_pedantic.py::test_setup PASSED [ 50%] 
tests/test_pedantic.py::test_setup_cprofile PASSED [ 50%] tests/test_pedantic.py::test_args_kwargs PASSED [ 51%] tests/test_pedantic.py::test_iterations PASSED [ 51%] tests/test_pedantic.py::test_rounds_iterations PASSED [ 52%] tests/test_pedantic.py::test_rounds PASSED [ 52%] tests/test_pedantic.py::test_warmup_rounds PASSED [ 53%] tests/test_pedantic.py::test_rounds_must_be_int[0] PASSED [ 53%] tests/test_pedantic.py::test_rounds_must_be_int[x] PASSED [ 54%] tests/test_pedantic.py::test_warmup_rounds_must_be_int[-15] PASSED [ 54%] tests/test_pedantic.py::test_warmup_rounds_must_be_int[x] PASSED [ 54%] tests/test_pedantic.py::test_setup_many_rounds PASSED [ 55%] tests/test_pedantic.py::test_cant_use_both_args_and_setup_with_return PASSED [ 55%] tests/test_pedantic.py::test_can_use_both_args_and_setup_without_return PASSED [ 56%] tests/test_pedantic.py::test_cant_use_setup_with_many_iterations PASSED [ 56%] tests/test_pedantic.py::test_iterations_must_be_positive_int[0] PASSED [ 57%] tests/test_pedantic.py::test_iterations_must_be_positive_int[-1] PASSED [ 57%] tests/test_pedantic.py::test_iterations_must_be_positive_int[asdf] PASSED [ 58%] tests/test_sample.py::test_proto[SimpleProxy] PASSED [ 58%] tests/test_sample.py::test_proto[CachedPropertyProxy] PASSED [ 59%] tests/test_sample.py::test_proto[LocalsSimpleProxy] PASSED [ 59%] tests/test_sample.py::test_proto[LocalsCachedPropertyProxy] PASSED [ 59%] tests/test_skip.py::test_skip SKIPPED (bla) [ 60%] tests/test_stats.py::test_1 PASSED [ 60%] tests/test_stats.py::test_2 PASSED [ 61%] tests/test_stats.py::test_single_item PASSED [ 61%] tests/test_stats.py::test_length[1] PASSED [ 62%] tests/test_stats.py::test_length[2] PASSED [ 62%] tests/test_stats.py::test_length[3] PASSED [ 63%] tests/test_stats.py::test_length[4] PASSED [ 63%] tests/test_stats.py::test_length[5] PASSED [ 63%] tests/test_stats.py::test_length[6] PASSED [ 64%] tests/test_stats.py::test_length[7] PASSED [ 64%] tests/test_stats.py::test_length[8] 
PASSED [ 65%] tests/test_stats.py::test_length[9] PASSED [ 65%] tests/test_stats.py::test_iqr PASSED [ 66%] tests/test_stats.py::test_ops PASSED [ 66%] tests/test_storage.py::test_rendering[short] PASSED [ 67%] tests/test_storage.py::test_rendering[normal] PASSED [ 67%] tests/test_storage.py::test_rendering[long] PASSED [ 68%] tests/test_storage.py::test_rendering[trial] PASSED [ 68%] tests/test_storage.py::test_regression_checks[short] PASSED [ 68%] tests/test_storage.py::test_regression_checks[normal] PASSED [ 69%] tests/test_storage.py::test_regression_checks[long] PASSED [ 69%] tests/test_storage.py::test_regression_checks[trial] PASSED [ 70%] tests/test_storage.py::test_regression_checks_inf[short] PASSED [ 70%] tests/test_storage.py::test_regression_checks_inf[normal] PASSED [ 71%] tests/test_storage.py::test_regression_checks_inf[long] PASSED [ 71%] tests/test_storage.py::test_regression_checks_inf[trial] PASSED [ 72%] tests/test_storage.py::test_compare_1[short] PASSED [ 72%] tests/test_storage.py::test_compare_1[normal] PASSED [ 72%] tests/test_storage.py::test_compare_1[long] PASSED [ 73%] tests/test_storage.py::test_compare_1[trial] PASSED [ 73%] tests/test_storage.py::test_compare_2[short] PASSED [ 74%] tests/test_storage.py::test_compare_2[normal] PASSED [ 74%] tests/test_storage.py::test_compare_2[long] PASSED [ 75%] tests/test_storage.py::test_compare_2[trial] PASSED [ 75%] tests/test_storage.py::test_save_json[short] PASSED [ 76%] tests/test_storage.py::test_save_json[normal] PASSED [ 76%] tests/test_storage.py::test_save_json[long] PASSED [ 77%] tests/test_storage.py::test_save_json[trial] PASSED [ 77%] tests/test_storage.py::test_save_with_name[short] PASSED [ 77%] tests/test_storage.py::test_save_with_name[normal] PASSED [ 78%] tests/test_storage.py::test_save_with_name[long] PASSED [ 78%] tests/test_storage.py::test_save_with_name[trial] PASSED [ 79%] tests/test_storage.py::test_save_no_name[short] PASSED [ 79%] 
tests/test_storage.py::test_save_no_name[normal] PASSED [ 80%] tests/test_storage.py::test_save_no_name[long] PASSED [ 80%] tests/test_storage.py::test_save_no_name[trial] PASSED [ 81%] tests/test_storage.py::test_save_with_error[short] PASSED [ 81%] tests/test_storage.py::test_save_with_error[normal] PASSED [ 81%] tests/test_storage.py::test_save_with_error[long] PASSED [ 82%] tests/test_storage.py::test_save_with_error[trial] PASSED [ 82%] tests/test_storage.py::test_autosave[short] PASSED [ 83%] tests/test_storage.py::test_autosave[normal] PASSED [ 83%] tests/test_storage.py::test_autosave[long] PASSED [ 84%] tests/test_storage.py::test_autosave[trial] PASSED [ 84%] tests/test_utils.py::test_clonefunc[] PASSED [ 85%] tests/test_utils.py::test_clonefunc[f2] PASSED [ 85%] tests/test_utils.py::test_clonefunc_not_function PASSED [ 86%] tests/test_utils.py::test_get_commit_info[git-True] PASSED [ 86%] tests/test_utils.py::test_get_commit_info[git-False] PASSED [ 86%] tests/test_utils.py::test_get_commit_info[hg-True] SKIPPED (%r not a...) [ 87%] tests/test_utils.py::test_get_commit_info[hg-False] SKIPPED (%r not ...) [ 87%] tests/test_utils.py::test_missing_scm_bins[git-True] PASSED [ 88%] tests/test_utils.py::test_missing_scm_bins[git-False] PASSED [ 88%] tests/test_utils.py::test_missing_scm_bins[hg-True] SKIPPED (%r not ...) [ 89%] tests/test_utils.py::test_missing_scm_bins[hg-False] SKIPPED (%r not...) [ 89%] tests/test_utils.py::test_get_branch_info[git] PASSED [ 90%] tests/test_utils.py::test_get_branch_info[hg] SKIPPED (%r not availa...) 
[ 90%]
tests/test_utils.py::test_no_branch_info PASSED [ 90%]
tests/test_utils.py::test_commit_info_error PASSED [ 91%]
tests/test_utils.py::test_parse_warmup PASSED [ 91%]
tests/test_utils.py::test_parse_columns PASSED [ 92%]
tests/test_utils.py::test_get_project_name[False-None] PASSED [ 92%]
tests/test_utils.py::test_get_project_name[False-git] PASSED [ 93%]
tests/test_utils.py::test_get_project_name[False-hg] SKIPPED (%r not...) [ 93%]
tests/test_utils.py::test_get_project_name[https://example.com/pytest_benchmark_repo-None] PASSED [ 94%]
tests/test_utils.py::test_get_project_name[https://example.com/pytest_benchmark_repo-git] PASSED [ 94%]
tests/test_utils.py::test_get_project_name[https://example.com/pytest_benchmark_repo-hg] SKIPPED [ 95%]
tests/test_utils.py::test_get_project_name[https://example.com/pytest_benchmark_repo.git-None] PASSED [ 95%]
tests/test_utils.py::test_get_project_name[https://example.com/pytest_benchmark_repo.git-git] PASSED [ 95%]
tests/test_utils.py::test_get_project_name[https://example.com/pytest_benchmark_repo.git-hg] SKIPPED [ 96%]
tests/test_utils.py::test_get_project_name[c:\\foo\\bar\\pytest_benchmark_repo.gitfoo@example.com:pytest_benchmark_repo.git-None] PASSED [ 96%]
tests/test_utils.py::test_get_project_name[c:\\foo\\bar\\pytest_benchmark_repo.gitfoo@example.com:pytest_benchmark_repo.git-git] PASSED [ 97%]
tests/test_utils.py::test_get_project_name[c:\\foo\\bar\\pytest_benchmark_repo.gitfoo@example.com:pytest_benchmark_repo.git-hg] SKIPPED [ 97%]
tests/test_utils.py::test_get_project_name_broken[git] PASSED [ 98%]
tests/test_utils.py::test_get_project_name_broken[hg] PASSED [ 98%]
tests/test_utils.py::test_get_project_name_fallback PASSED [ 99%]
tests/test_utils.py::test_get_project_name_fallback_broken_hgrc PASSED [ 99%]
tests/test_with_testcase.py::TerribleTerribleWayToWriteTests::test_foo PASSED [100%]

=================================== FAILURES ===================================
__________________________________ test_basic __________________________________
/home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/tests/test_benchmark.py:1028: in test_basic
    result.stdout.fnmatch_lines([
E Failed: nomatch: '*collected 5 items'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)'
E and: 'rootdir: /tmp/pytest-of-buildozer/pytest-10/test_basic0'
E and: 'plugins: benchmark-3.4.1, xdist-2.5.0, forked-1.4.0'
E fnmatch: '*collected 5 items'
E with: 'collecting ... collected 5 items'
E nomatch: 'test_basic.py::*test_basic PASSED*'
E and: ''
E fnmatch: 'test_basic.py::*test_basic PASSED*'
E with: 'test_basic.py::test_basic PASSED [ 20%]'
E nomatch: 'test_basic.py::test_slow PASSED*'
E and: 'test_basic.py::test_fast PASSED [ 40%]'
E fnmatch: 'test_basic.py::test_slow PASSED*'
E with: 'test_basic.py::test_slow PASSED [ 60%]'
E fnmatch: 'test_basic.py::test_slower PASSED*'
E with: 'test_basic.py::test_slower PASSED [ 80%]'
E fnmatch: 'test_basic.py::test_xfast PASSED*'
E with: 'test_basic.py::test_xfast PASSED [100%]'
E nomatch: 'test_basic.py::test_fast PASSED*'
E and: ''
E and: ''
E and: '--------------------------------------------------------------------------------------------------------- benchmark: 4 tests --------------------------------------------------------------------------------------------------------'
E and: 'Name (time in ns) Min Max Mean StdDev Median IQR Outliers OPS Rounds Iterations'
E and: '-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------'
E and: 'test_xfast 473.1119 (1.0) 282,544.6427 (1.00) 936.3836 (1.0) 984.3714 (1.30) 901.5203 (1.0) 145.2863 (1.0) 598;7700 1,067,938.3889 (1.0) 168404 1'
E and: 'test_fast 517.8154 (1.09) 282,466.4116 (1.0) 963.2344 (1.03) 759.0103 (1.0) 935.0479 (1.04) 175.0886 (1.21) 597;3544 1,038,168.9547 (0.97) 145889 1'
E and: 'test_slow 1,022,193.5809 (>1000.0) 1,317,918.3006 (4.67) 1,063,110.9384 (>1000.0) 8,961.9597 (11.81) 1,062,896.1027 (>1000.0) 2,274.2897 (15.65) 12;28 940.6356 (0.00) 935 1'
E and: 'test_slower 10,067,664.0868 (>1000.0) 10,272,413.4922 (36.37) 10,075,907.7072 (>1000.0) 20,468.4478 (26.97) 10,072,611.2723 (>1000.0) 3,863.1260 (26.59) 2;10 99.2466 (0.00) 100 1'
E and: '-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------'
E and: ''
E and: 'Legend:'
E and: ' Outliers: 1 Standard Deviation from Mean; 1.5 IQR (InterQuartile Range) from 1st Quartile and 3rd Quartile.'
E and: ' OPS: Operations Per Second, computed as 1 / Mean'
E and: '============================== 5 passed in 7.53s ==============================='
E remains unmatched: 'test_basic.py::test_fast PASSED*'
----------------------------- Captured stdout call -----------------------------
running: /usr/bin/python3 -mpytest --basetemp=/tmp/pytest-of-buildozer/pytest-10/test_basic0/runpytest-0 -vv --doctest-modules /tmp/pytest-of-buildozer/pytest-10/test_basic0/test_basic.py
in: /tmp/pytest-of-buildozer/pytest-10/test_basic0
============================= test session starts ==============================
platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /tmp/pytest-of-buildozer/pytest-10/test_basic0
plugins: benchmark-3.4.1, xdist-2.5.0, forked-1.4.0
collecting ... collected 5 items

test_basic.py::test_basic PASSED [ 20%]
test_basic.py::test_fast PASSED [ 40%]
test_basic.py::test_slow PASSED [ 60%]
test_basic.py::test_slower PASSED [ 80%]
test_basic.py::test_xfast PASSED [100%]

--------------------------------------------------------------------------------------------------------- benchmark: 4 tests --------------------------------------------------------------------------------------------------------
Name (time in ns) Min Max Mean StdDev Median IQR Outliers OPS Rounds Iterations
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
test_xfast 473.1119 (1.0) 282,544.6427 (1.00) 936.3836 (1.0) 984.3714 (1.30) 901.5203 (1.0) 145.2863 (1.0) 598;7700 1,067,938.3889 (1.0) 168404 1
test_fast 517.8154 (1.09) 282,466.4116 (1.0) 963.2344 (1.03) 759.0103 (1.0) 935.0479 (1.04) 175.0886 (1.21) 597;3544 1,038,168.9547 (0.97) 145889 1
test_slow 1,022,193.5809 (>1000.0) 1,317,918.3006 (4.67) 1,063,110.9384 (>1000.0) 8,961.9597 (11.81) 1,062,896.1027 (>1000.0) 2,274.2897 (15.65) 12;28 940.6356 (0.00) 935 1
test_slower 10,067,664.0868 (>1000.0) 10,272,413.4922 (36.37) 10,075,907.7072 (>1000.0) 20,468.4478 (26.97) 10,072,611.2723 (>1000.0) 3,863.1260 (26.59) 2;10 99.2466 (0.00) 100 1
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Legend:
  Outliers: 1 Standard Deviation from Mean; 1.5 IQR (InterQuartile Range) from 1st Quartile and 3rd Quartile.
  OPS: Operations Per Second, computed as 1 / Mean
============================== 5 passed in 7.53s ===============================
__________________________________ test_skip ___________________________________
/home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/tests/test_benchmark.py:1052: in test_skip
    result.stdout.fnmatch_lines([
E Failed: nomatch: '*collected 5 items'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)'
E and: 'rootdir: /tmp/pytest-of-buildozer/pytest-10/test_skip0'
E and: 'plugins: benchmark-3.4.1, xdist-2.5.0, forked-1.4.0'
E fnmatch: '*collected 5 items'
E with: 'collecting ... collected 5 items'
E nomatch: 'test_skip.py::*test_skip PASSED*'
E and: ''
E fnmatch: 'test_skip.py::*test_skip PASSED*'
E with: 'test_skip.py::test_skip PASSED [ 20%]'
E nomatch: 'test_skip.py::test_slow SKIPPED*'
E and: 'test_skip.py::test_fast SKIPPED (Skipping benchmark (--benchmark-skip active).) [ 40%]'
E fnmatch: 'test_skip.py::test_slow SKIPPED*'
E with: 'test_skip.py::test_slow SKIPPED (Skipping benchmark (--benchmark-skip active).) [ 60%]'
E fnmatch: 'test_skip.py::test_slower SKIPPED*'
E with: 'test_skip.py::test_slower SKIPPED (Skipping benchmark (--benchmark-skip active).) [ 80%]'
E fnmatch: 'test_skip.py::test_xfast SKIPPED*'
E with: 'test_skip.py::test_xfast SKIPPED (Skipping benchmark (--benchmark-skip active).) [100%]'
E nomatch: 'test_skip.py::test_fast SKIPPED*'
E and: ''
E and: '========================= 1 passed, 4 skipped in 0.08s ========================='
E remains unmatched: 'test_skip.py::test_fast SKIPPED*'
----------------------------- Captured stdout call -----------------------------
running: /usr/bin/python3 -mpytest --basetemp=/tmp/pytest-of-buildozer/pytest-10/test_skip0/runpytest-0 -vv --doctest-modules --benchmark-skip /tmp/pytest-of-buildozer/pytest-10/test_skip0/test_skip.py
in: /tmp/pytest-of-buildozer/pytest-10/test_skip0
============================= test session starts ==============================
platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /tmp/pytest-of-buildozer/pytest-10/test_skip0
plugins: benchmark-3.4.1, xdist-2.5.0, forked-1.4.0
collecting ... collected 5 items

test_skip.py::test_skip PASSED [ 20%]
test_skip.py::test_fast SKIPPED (Skipping benchmark (--benchmark-skip active).) [ 40%]
test_skip.py::test_slow SKIPPED (Skipping benchmark (--benchmark-skip active).) [ 60%]
test_skip.py::test_slower SKIPPED (Skipping benchmark (--benchmark-skip active).) [ 80%]
test_skip.py::test_xfast SKIPPED (Skipping benchmark (--benchmark-skip active).) [100%]

========================= 1 passed, 4 skipped in 0.08s =========================
_________________________________ test_disable _________________________________
/home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/tests/test_benchmark.py:1066: in test_disable
    result.stdout.fnmatch_lines([
E Failed: nomatch: '*collected 5 items'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)'
E and: 'rootdir: /tmp/pytest-of-buildozer/pytest-10/test_disable0'
E and: 'plugins: benchmark-3.4.1, xdist-2.5.0, forked-1.4.0'
E fnmatch: '*collected 5 items'
E with: 'collecting ... collected 5 items'
E nomatch: 'test_disable.py::*test_disable PASSED*'
E and: ''
E fnmatch: 'test_disable.py::*test_disable PASSED*'
E with: 'test_disable.py::test_disable PASSED [ 20%]'
E nomatch: 'test_disable.py::test_slow PASSED*'
E and: 'test_disable.py::test_fast PASSED [ 40%]'
E fnmatch: 'test_disable.py::test_slow PASSED*'
E with: 'test_disable.py::test_slow PASSED [ 60%]'
E fnmatch: 'test_disable.py::test_slower PASSED*'
E with: 'test_disable.py::test_slower PASSED [ 80%]'
E fnmatch: 'test_disable.py::test_xfast PASSED*'
E with: 'test_disable.py::test_xfast PASSED [100%]'
E nomatch: 'test_disable.py::test_fast PASSED*'
E and: ''
E and: '============================== 5 passed in 0.09s ==============================='
E remains unmatched: 'test_disable.py::test_fast PASSED*'
----------------------------- Captured stdout call -----------------------------
running: /usr/bin/python3 -mpytest --basetemp=/tmp/pytest-of-buildozer/pytest-10/test_disable0/runpytest-0 -vv --doctest-modules --benchmark-disable /tmp/pytest-of-buildozer/pytest-10/test_disable0/test_disable.py
in: /tmp/pytest-of-buildozer/pytest-10/test_disable0
============================= test session starts ==============================
platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /tmp/pytest-of-buildozer/pytest-10/test_disable0
plugins: benchmark-3.4.1, xdist-2.5.0, forked-1.4.0
collecting ... collected 5 items

test_disable.py::test_disable PASSED [ 20%]
test_disable.py::test_fast PASSED [ 40%]
test_disable.py::test_slow PASSED [ 60%]
test_disable.py::test_slower PASSED [ 80%]
test_disable.py::test_xfast PASSED [100%]

============================== 5 passed in 0.09s ===============================
_____________________________ test_only_benchmarks _____________________________
/home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/tests/test_benchmark.py:1095: in test_only_benchmarks
    result.stdout.fnmatch_lines([
E Failed: nomatch: '*collected 5 items'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)'
E and: 'rootdir: /tmp/pytest-of-buildozer/pytest-10/test_only_benchmarks0'
E and: 'plugins: benchmark-3.4.1, xdist-2.5.0, forked-1.4.0'
E fnmatch: '*collected 5 items'
E with: 'collecting ... collected 5 items'
E nomatch: 'test_only_benchmarks.py::*test_only_benchmarks SKIPPED*'
E and: ''
E fnmatch: 'test_only_benchmarks.py::*test_only_benchmarks SKIPPED*'
E with: 'test_only_benchmarks.py::test_only_benchmarks SKIPPED (Skipping non-benchmark (--benchmark-only active).) [ 20%]'
E nomatch: 'test_only_benchmarks.py::test_slow PASSED*'
E and: 'test_only_benchmarks.py::test_fast PASSED [ 40%]'
E fnmatch: 'test_only_benchmarks.py::test_slow PASSED*'
E with: 'test_only_benchmarks.py::test_slow PASSED [ 60%]'
E fnmatch: 'test_only_benchmarks.py::test_slower PASSED*'
E with: 'test_only_benchmarks.py::test_slower PASSED [ 80%]'
E fnmatch: 'test_only_benchmarks.py::test_xfast PASSED*'
E with: 'test_only_benchmarks.py::test_xfast PASSED [100%]'
E nomatch: 'test_only_benchmarks.py::test_fast PASSED*'
E and: ''
E and: ''
E and: '--------------------------------------------------------------------------------------------------------- benchmark: 4 tests --------------------------------------------------------------------------------------------------------'
E and: 'Name (time in ns) Min Max Mean StdDev Median IQR Outliers OPS Rounds Iterations'
E and: '-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------'
E and: 'test_fast 191.1074 (1.0) 14,749.1693 (1.0) 318.6258 (1.0) 109.7579 (1.0) 316.6497 (1.0) 21.4204 (1.0) 428;4493 3,138,478.1948 (1.0) 138727 20'
E and: 'test_xfast 473.1119 (2.48) 281,903.8928 (19.11) 925.5703 (2.90) 1,116.1280 (10.17) 879.1685 (2.78) 171.3634 (8.00) 884;8694 1,080,414.9398 (0.34) 197962 1'
E and: 'test_slow 1,018,103.2121 (>1000.0) 1,347,497.1056 (91.36) 1,065,711.6278 (>1000.0) 12,227.4627 (111.40) 1,065,589.4876 (>1000.0) 3,490.5970 (162.96) 15;19 938.3401 (0.00) 938 1'
E and: 'test_slower 10,068,185.6275 (>1000.0) 10,080,650.4488 (683.47) 10,073,681.0640 (>1000.0) 1,921.6008 (17.51) 10,073,596.6116 (>1000.0) 2,371.1473 (110.70) 30;2 99.2686 (0.00) 100 1'
E and: '-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------'
E and: ''
E and: 'Legend:'
E and: ' Outliers: 1 Standard Deviation from Mean; 1.5 IQR (InterQuartile Range) from 1st Quartile and 3rd Quartile.'
E and: ' OPS: Operations Per Second, computed as 1 / Mean'
E and: '========================= 4 passed, 1 skipped in 8.95s ========================='
E remains unmatched: 'test_only_benchmarks.py::test_fast PASSED*'
----------------------------- Captured stdout call -----------------------------
running: /usr/bin/python3 -mpytest --basetemp=/tmp/pytest-of-buildozer/pytest-10/test_only_benchmarks0/runpytest-0 -vv --doctest-modules --benchmark-only /tmp/pytest-of-buildozer/pytest-10/test_only_benchmarks0/test_only_benchmarks.py
in: /tmp/pytest-of-buildozer/pytest-10/test_only_benchmarks0
============================= test session starts ==============================
platform linux -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /tmp/pytest-of-buildozer/pytest-10/test_only_benchmarks0
plugins: benchmark-3.4.1, xdist-2.5.0, forked-1.4.0
collecting ... collected 5 items

test_only_benchmarks.py::test_only_benchmarks SKIPPED (Skipping non-benchmark (--benchmark-only active).) [ 20%]
test_only_benchmarks.py::test_fast PASSED [ 40%]
test_only_benchmarks.py::test_slow PASSED [ 60%]
test_only_benchmarks.py::test_slower PASSED [ 80%]
test_only_benchmarks.py::test_xfast PASSED [100%]

--------------------------------------------------------------------------------------------------------- benchmark: 4 tests --------------------------------------------------------------------------------------------------------
Name (time in ns) Min Max Mean StdDev Median IQR Outliers OPS Rounds Iterations
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
test_fast 191.1074 (1.0) 14,749.1693 (1.0) 318.6258 (1.0) 109.7579 (1.0) 316.6497 (1.0) 21.4204 (1.0) 428;4493 3,138,478.1948 (1.0) 138727 20
test_xfast 473.1119 (2.48) 281,903.8928 (19.11) 925.5703 (2.90) 1,116.1280 (10.17) 879.1685 (2.78) 171.3634 (8.00) 884;8694 1,080,414.9398 (0.34) 197962 1
test_slow 1,018,103.2121 (>1000.0) 1,347,497.1056 (91.36) 1,065,711.6278 (>1000.0) 12,227.4627 (111.40) 1,065,589.4876 (>1000.0) 3,490.5970 (162.96) 15;19 938.3401 (0.00) 938 1
test_slower 10,068,185.6275 (>1000.0) 10,080,650.4488 (683.47) 10,073,681.0640 (>1000.0) 1,921.6008 (17.51) 10,073,596.6116 (>1000.0) 2,371.1473 (110.70) 30;2 99.2686 (0.00) 100 1
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Legend:
  Outliers: 1 Standard Deviation from Mean; 1.5 IQR (InterQuartile Range) from 1st Quartile and 3rd Quartile.
  OPS: Operations Per Second, computed as 1 / Mean
========================= 4 passed, 1 skipped in 8.95s =========================
=============================== warnings summary ===============================
../../../../../../../usr/lib/python3.10/site-packages/_pytest/config/__init__.py:1199
  /usr/lib/python3.10/site-packages/_pytest/config/__init__.py:1199: PytestRemovedIn8Warning: The --strict option is deprecated, use --strict-markers instead.
    self.issue_config_time_warning(
tests/test_utils.py:35
  /home/buildozer/aports/community/py3-pytest-benchmark/src/pytest-benchmark-3.4.1/tests/test_utils.py:35: PytestDeprecationWarning: @pytest.yield_fixture is deprecated. Use @pytest.fixture instead; they are the same.
    @pytest.yield_fixture(params=(True, False))

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
----------------------------------------------------------------------------------------------------------------------------------- benchmark: 58 tests ------------------------------------------------------------------------------------------------------------------------------------
Name (time in ns) Min Max Mean StdDev Median IQR Outliers OPS Rounds Iterations
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
test_calibrate_stuck[False--1-1e-10] 5.0500 (1.0) 5.0500 (1.0) 5.0500 (1.0) 0.0000 (1.0) 5.0500 (1.0) 0.0000 (1.0) 0;0 198,019,801.9804 (1.0) 1 2
test_calibrate_stuck[False-0-1e-10] 5.0500 (1.0) 5.0500 (1.0) 5.0500 (1.0) 0.0000 (1.0) 5.0500 (1.0) 0.0000 (1.0) 0;0 198,019,801.9804 (1.0) 1 2
test_calibrate_stuck[False-1-1e-10] 5.0500 (1.0) 5.0500 (1.0) 5.0500 (1.0) 0.0000 (1.0) 5.0500 (1.0) 0.0000 (1.0) 0;0 198,019,801.9804 (1.0) 1 2
test_calibrate_stuck[True--1-1e-10] 10.0000 (1.98) 10.0000 (1.98) 10.0000 (1.98) 0.0000 (1.0) 10.0000 (1.98) 0.0000 (1.0) 0;0 100,000,000.0001 (0.50) 1 1
test_calibrate_stuck[True-0-1e-10] 10.0000 (1.98) 10.0000 (1.98) 10.0000 (1.98) 0.0000 (1.0) 10.0000 (1.98) 0.0000 (1.0) 0;0 100,000,000.0001 (0.50) 1 1
test_calibrate_stuck[True-1-1e-10] 10.0000 (1.98) 10.0000 (1.98) 10.0000 (1.98) 0.0000 (1.0) 10.0000 (1.98) 0.0000 (1.0) 0;0 100,000,000.0001 (0.50) 1 1
test_calibrate_stuck[False--1-1e-09] 50.5000 (10.00) 50.5000 (10.00) 50.5000 (10.00) 0.0000 (1.0) 50.5000 (10.00) 0.0000 (1.0) 0;0 19,801,980.1980 (0.10) 1 2
test_calibrate_stuck[False-0-1e-09] 50.5000 (10.00) 50.5000 (10.00) 50.5000 (10.00) 0.0000 (1.0) 50.5000 (10.00) 0.0000 (1.0) 0;0 19,801,980.1980 (0.10) 1 2
test_calibrate_stuck[False-1-1e-09] 50.5000 (10.00) 50.5000 (10.00) 50.5000 (10.00) 0.0000 (1.0) 50.5000 (10.00) 0.0000 (1.0) 0;0 19,801,980.1980 (0.10) 1 2
test_calibrate_stuck[True--1-1e-09] 100.0000 (19.80) 100.0000 (19.80) 100.0000 (19.80) 0.0000 (1.0) 100.0000 (19.80) 0.0000 (1.0) 0;0 10,000,000.0000 (0.05) 1 1
test_calibrate_stuck[True-0-1e-09] 100.0000 (19.80) 100.0000 (19.80) 100.0000 (19.80) 0.0000 (1.0) 100.0000 (19.80) 0.0000 (1.0) 0;0 10,000,000.0000 (0.05) 1 1
test_calibrate_stuck[True-1-1e-09] 100.0000 (19.80) 100.0000 (19.80) 100.0000 (19.80) 0.0000 (1.0) 100.0000 (19.80) 0.0000 (1.0) 0;0 10,000,000.0000 (0.05) 1 1
test_calibrate_xfast 232.3226 (46.00) 13,886.3583 (>1000.0) 240.2366 (47.57) 83.3048 (inf) 238.0799 (47.14) 3.3866 (inf) 11479;51169 4,162,562.9581 (0.02) 1929802 22
test_rounds_iterations 397.8610 (78.78) 622.1235 (123.19) 496.0597 (98.23) 52.5400 (inf) 490.6207 (97.15) 61.7467 (inf) 3;1 2,015,886.5725 (0.01) 15 10
test_xfast 476.8372 (94.42) 2,622.6044 (519.33) 929.1279 (183.99) 232.5311 (inf) 953.6743 (188.85) 238.4186 (inf) 235;48 1,076,278.1057 (0.01) 1049 1
test_iterations 544.6374 (107.85) 544.6374 (107.85) 544.6374 (107.85) 0.0000 (1.0) 544.6374 (107.85) 0.0000 (1.0) 0;0 1,836,083.8304 (0.01) 1 10
test_rounds 592.3212 (117.29) 1,877.5463 (371.79) 902.2653 (178.67) 398.8726 (inf) 797.2121 (157.86) 213.2729 (inf) 2;2 1,108,321.4533 (0.01) 15 1
test_warmup_rounds 685.4534 (135.73) 1,307.5769 (258.93) 824.7793 (163.32) 271.3408 (inf) 689.1787 (136.47) 203.0283 (inf) 1;1 1,212,445.6007 (0.01) 5 1
test_proto[LocalsCachedPropertyProxy] 1,035.6307 (205.08) 283,651.0539 (>1000.0) 2,021.2160 (400.24) 1,751.2253 (inf) 1,944.6015 (385.07) 257.0450 (inf) 136;2717 494,751.6822 (0.00) 53135 1
test_proto[CachedPropertyProxy] 1,039.3560 (205.81) 284,712.7616 (>1000.0) 2,024.6515 (400.92) 1,065.5878 (inf) 1,952.0521 (386.54) 301.7485 (inf) 638;2769 493,912.1598 (0.00) 77966 1
test_proto[LocalsSimpleProxy] 1,061.7077 (210.24) 287,421.0477 (>1000.0) 2,024.6592 (400.92) 878.2931 (inf) 1,966.9533 (389.50) 238.4186 (inf) 1978;5939 493,910.2856 (0.00) 121355 1
test_proto[SimpleProxy] 1,065.4330 (210.98) 283,189.1179 (>1000.0) 2,042.6182 (404.48) 1,575.4461 (inf) 1,966.9533 (389.50) 268.2209 (inf) 543;6249 489,567.7537 (0.00) 133153 1
test_calibrate_fast 1,072.5111 (212.38) 30,858.8147 (>1000.0) 1,102.1257 (218.24) 273.4282 (inf) 1,095.6079 (216.95) 9.6858 (inf) 5380;22840 907,337.5428 (0.00) 924683 10
test_single 1,367.1815 (270.73) 1,367.1815 (270.73) 1,367.1815 (270.73) 0.0000 (1.0) 1,367.1815 (270.73) 0.0000 (1.0) 0;0 731,431.7602 (0.00) 1 1
test_setup_many_rounds 1,642.8530 (325.32) 4,328.7873 (857.19) 2,090.6329 (413.99) 792.8109 (inf) 1,857.0572 (367.73) 182.5392 (inf) 1;1 478,324.0485 (0.00) 10 1
test_can_use_both_args_and_setup_without_return 2,503.3951 (495.72) 2,503.3951 (495.72) 2,503.3951 (495.72) 0.0000 (1.0) 2,503.3951 (495.72) 0.0000 (1.0) 0;0 399,457.5238 (0.00) 1 1
test_setup_cprofile 3,200.0244 (633.67) 3,200.0244 (633.67) 3,200.0244 (633.67) 0.0000 (1.0) 3,200.0244 (633.67) 0.0000 (1.0) 0;0 312,497.6205 (0.00) 1 1
test_args_kwargs 3,267.0796 (646.95) 3,267.0796 (646.95) 3,267.0796 (646.95) 0.0000 (1.0) 3,267.0796 (646.95) 0.0000 (1.0) 0;0 306,083.7583 (0.00) 1 1
test_setup 3,691.7627 (731.04) 3,691.7627 (731.04) 3,691.7627 (731.04) 0.0000 (1.0) 3,691.7627 (731.04) 0.0000 (1.0) 0;0 270,873.3158 (0.00) 1 1
test_foo 9,939.0745 (>1000.0) 358,052.5517 (>1000.0) 58,943.0250 (>1000.0) 7,322.3306 (inf) 58,777.6303 (>1000.0) 1,154.8400 (inf) 197;1084 16,965.5358 (0.00) 15076 1
test_fast 12,390.3155 (>1000.0) 340,811.9082 (>1000.0) 60,492.2398 (>1000.0) 6,557.5232 (inf) 60,304.9994 (>1000.0) 2,175.5695 (inf) 212;538 16,531.0460 (0.00) 13906 1
test_calibrate_slow 16,856.9386 (>1000.0) 359,833.2405 (>1000.0) 68,564.2021 (>1000.0) 6,970.2067 (inf) 68,351.6264 (>1000.0) 1,814.2164 (inf) 7451;25860 14,584.8704 (0.00) 558311 1
test_parametrized[4] 17,892.5693 (>1000.0) 361,863.5237 (>1000.0) 68,690.8288 (>1000.0) 7,531.9349 (inf) 68,455.9345 (>1000.0) 2,070.3301 (inf) 201;473 14,557.9842 (0.00) 13607 1
test_parametrized[2] 18,492.3410 (>1000.0) 670,060.5154 (>1000.0) 68,708.5183 (>1000.0) 8,857.0783 (inf) 68,418.6816 (>1000.0) 1,994.8930 (inf) 170;587 14,554.2361 (0.00) 13080 1
test_parametrized[3] 18,943.1012 (>1000.0) 349,733.9785 (>1000.0) 68,812.5765 (>1000.0) 6,742.5907 (inf) 68,666.4134 (>1000.0) 2,186.7454 (inf) 195;406 14,532.2273 (0.00) 13580 1
test_parametrized[1] 19,602.4776 (>1000.0) 669,881.7015 (>1000.0) 68,679.5163 (>1000.0) 8,996.9867 (inf) 68,463.3851 (>1000.0) 2,104.7890 (inf) 181;456 14,560.3821 (0.00) 13488 1
test_parametrized[0] 22,221.3566 (>1000.0) 347,830.3552 (>1000.0) 68,841.0324 (>1000.0) 7,112.7715 (inf) 68,571.4185 (>1000.0) 2,175.5695 (inf) 182;454 14,526.2203 (0.00) 13209 1
test_calibrate 77,489.7635 (>1000.0) 375,024.9743 (>1000.0) 78,485.0532 (>1000.0) 7,324.9560 (inf) 78,272.0745 (>1000.0) 651.9258 (inf) 268;3272 12,741.2795 (0.00) 88701 1
test_slow 1,017,820.0901 (>1000.0) 1,074,872.9110 (>1000.0) 1,065,782.6023 (>1000.0) 3,659.0879 (inf) 1,066,222.7869 (>1000.0) 2,339.4823 (inf) 45;25 938.2777 (0.00) 937 1
test_slower 10,070,189.8336 (>1000.0) 10,080,177.3369 (>1000.0) 10,074,123.0324 (>1000.0) 1,842.5845 (inf) 10,074,110.7017 (>1000.0) 2,739.9510 (inf) 38;1 99.2642 (0.00) 100 1
test_calibrate_stuck[False--1-0.01] 505,000,000.0001 (>1000.0) 505,000,000.0001 (>1000.0) 505,000,000.0001 (>1000.0) 0.0000 (1.0) 505,000,000.0001 (>1000.0) 0.0000 (1.0) 0;0 1.9802 (0.00) 1 2
test_calibrate_stuck[False-0-0.01] 505,000,000.0001 (>1000.0) 505,000,000.0001 (>1000.0) 505,000,000.0001 (>1000.0) 0.0000 (1.0) 505,000,000.0001 (>1000.0) 0.0000 (1.0) 0;0 1.9802 (0.00) 1 2
test_calibrate_stuck[False-1-0.01] 505,000,000.0001 (>1000.0) 505,000,000.0001 (>1000.0) 505,000,000.0001 (>1000.0) 0.0000 (1.0) 505,000,000.0001 (>1000.0) 0.0000 (1.0) 0;0 1.9802 (0.00) 1 2
test_calibrate_stuck[True--1-0.01] 1,000,000,000.0000 (>1000.0) 1,000,000,000.0000 (>1000.0) 1,000,000,000.0000 (>1000.0) 0.0000 (1.0) 1,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 1.0000 (0.00) 1 1
test_calibrate_stuck[True-0-0.01] 1,000,000,000.0000 (>1000.0) 1,000,000,000.0000 (>1000.0) 1,000,000,000.0000 (>1000.0) 0.0000 (1.0) 1,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 1.0000 (0.00) 1 1
test_calibrate_stuck[True-1-0.01] 1,000,000,000.0000 (>1000.0) 1,000,000,000.0000 (>1000.0) 1,000,000,000.0000 (>1000.0) 0.0000 (1.0) 1,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 1.0000 (0.00) 1 1
test_calibrate_stuck[False--1-1.000000000000001] 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0198 (0.00) 1 2
test_calibrate_stuck[False--1-1] 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0198 (0.00) 1 2
test_calibrate_stuck[False-0-1.000000000000001] 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0198 (0.00) 1 2
test_calibrate_stuck[False-0-1] 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0198 (0.00) 1 2
test_calibrate_stuck[False-1-1.000000000000001] 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0198 (0.00) 1 2
test_calibrate_stuck[False-1-1] 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 50,500,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0198 (0.00) 1 2
test_calibrate_stuck[True--1-1.000000000000001] 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0100 (0.00) 1 1
test_calibrate_stuck[True--1-1] 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0100 (0.00) 1 1
test_calibrate_stuck[True-0-1.000000000000001] 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0100 (0.00) 1 1
test_calibrate_stuck[True-0-1] 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0100 (0.00) 1 1
test_calibrate_stuck[True-1-1.000000000000001] 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0100 (0.00) 1 1
test_calibrate_stuck[True-1-1] 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 100,000,000,000.0000 (>1000.0) 0.0000 (1.0) 0;0 0.0100 (0.00) 1 1
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Legend:
  Outliers: 1 Standard Deviation from Mean; 1.5 IQR (InterQuartile Range) from 1st Quartile and 3rd Quartile.
  OPS: Operations Per Second, computed as 1 / Mean
----------------------------- cProfile (time in s) -----------------------------
tests/test_pedantic.py::test_setup_cprofile
ncalls tottime percall cumtime percall filename:lineno(function)
1 0.0000 0.0000 0.0000 0.0000 pytest-benchmark-3.4.1/tests/test_pedantic.py:29(stuff)
1 0.0000 0.0000 0.0000 0.0000 ~:0()
1 0.0000 0.0000 0.0000 0.0000 ~:0()
=========================== short test summary info ============================
SKIPPED [1] tests/test_skip.py:5: bla
SKIPPED [5] tests/test_utils.py:47: %r not availabe on $PATH
SKIPPED [4] tests/test_utils.py:160: %r not availabe on $PATH
FAILED tests/test_benchmark.py::test_basic - Failed: nomatch: '*collected 5 i...
FAILED tests/test_benchmark.py::test_skip - Failed: nomatch: '*collected 5 it...
FAILED tests/test_benchmark.py::test_disable - Failed: nomatch: '*collected 5...
FAILED tests/test_benchmark.py::test_only_benchmarks - Failed: nomatch: '*col...
= 4 failed, 208 passed, 10 skipped, 10 deselected, 2 warnings in 507.57s (0:08:27) =
>>> ERROR: py3-pytest-benchmark: check failed
>>> py3-pytest-benchmark: Uninstalling dependencies...
(1/29) Purging .makedepends-py3-pytest-benchmark (20221026.152429)
(2/29) Purging py3-py-cpuinfo (8.0.0-r0)
(3/29) Purging py3-setuptools (65.5.0-r0)
(4/29) Purging py3-pytest-xdist (2.5.0-r1)
(5/29) Purging py3-execnet (1.9.0-r0)
(6/29) Purging py3-apipkg (2.1.0-r0)
(7/29) Purging py3-pytest-forked (1.4.0-r1)
(8/29) Purging py3-pytest (7.1.3-r1)
(9/29) Purging py3-attrs (22.1.0-r0)
(10/29) Purging py3-iniconfig (1.1.1-r3)
(11/29) Purging py3-packaging (21.3-r2)
(12/29) Purging py3-parsing (3.0.9-r0)
(13/29) Purging py3-pluggy (1.0.0-r1)
(14/29) Purging py3-py (1.11.0-r0)
(15/29) Purging py3-tomli (2.0.1-r1)
(16/29) Purging py3-freezegun (1.2.2-r0)
(17/29) Purging py3-dateutil (2.8.2-r1)
(18/29) Purging py3-six (1.16.0-r3)
(19/29) Purging py3-pygal (3.0.0-r1)
(20/29) Purging py3-elasticsearch (7.11.0-r1)
(21/29) Purging py3-urllib3 (1.26.12-r0)
(22/29) Purging python3 (3.10.8-r1)
(23/29) Purging libbz2 (1.0.8-r3)
(24/29) Purging libffi (3.4.3-r0)
(25/29) Purging gdbm (1.23-r0)
(26/29) Purging xz-libs (5.2.7-r0)
(27/29) Purging mpdecimal (2.5.1-r1)
(28/29) Purging readline (8.2.0-r0)
(29/29) Purging sqlite-libs (3.39.4-r0)
Executing busybox-1.35.0-r27.trigger
OK: 266 MiB in 92 packages
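Note on the four check failures above: each one has the same shape. `fnmatch_lines` consumes the captured output sequentially, the expected pattern list places `test_fast` last, but the session actually reports `test_fast` right after `test_basic`, so by the time the matcher reaches the `test_fast` pattern the output is exhausted and it "remains unmatched". The sketch below is a simplified model of that order-sensitive matching (it is not pytest's actual `LineMatcher` implementation), using the expected and actual lines from the `test_basic` failure in this log:

```python
from fnmatch import fnmatch

def match_lines_sequentially(patterns, lines):
    """Return the first pattern left unmatched when patterns must match
    the lines in order; return None if all patterns match."""
    it = iter(lines)
    for pat in patterns:
        for line in it:
            if fnmatch(line, pat):
                break  # pattern matched; continue with remaining lines
        else:
            return pat  # ran out of output lines before this pattern matched
    return None

# Order the test expects (test_fast last) vs. order the session reports
# (test_fast immediately after test_basic), both taken from the log above.
expected = [
    "test_basic.py::*test_basic PASSED*",
    "test_basic.py::test_slow PASSED*",
    "test_basic.py::test_slower PASSED*",
    "test_basic.py::test_xfast PASSED*",
    "test_basic.py::test_fast PASSED*",
]
actual = [
    "test_basic.py::test_basic PASSED [ 20%]",
    "test_basic.py::test_fast PASSED [ 40%]",
    "test_basic.py::test_slow PASSED [ 60%]",
    "test_basic.py::test_slower PASSED [ 80%]",
    "test_basic.py::test_xfast PASSED [100%]",
]
print(match_lines_sequentially(expected, actual))
# prints: test_basic.py::test_fast PASSED*
```

Every line actually appears in the output, only in a different order, which is why the run shows all five tests PASSED yet the check still fails.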