>>> py3-zimscraperlib: Building testing/py3-zimscraperlib 3.2.0-r0 (using abuild 3.13.0-r3) started Thu, 11 Jul 2024 13:29:27 +0000
>>> py3-zimscraperlib: Checking sanity of /home/buildozer/aports/testing/py3-zimscraperlib/APKBUILD...
>>> py3-zimscraperlib: Analyzing dependencies...
>>> py3-zimscraperlib: Installing for build: build-base ffmpeg gifsicle py3-pillow py3-six wget py3-gpep517 py3-setuptools py3-wheel py3-babel py3-beautifulsoup4 py3-colorthief py3-iso639 py3-libzim py3-lxml py3-magic py3-optimize-images py3-piexif py3-pytest py3-pytest-cov py3-pytest-httpbin py3-resizeimage py3-requests py3-wsgiprox yt-dlp-core
(1/247) Installing libSvtAv1Enc (2.1.2-r0)
(2/247) Installing aom-libs (3.9.1-r0)
(3/247) Installing libxau (1.0.11-r4)
(4/247) Installing libmd (1.1.0-r0)
(5/247) Installing libbsd (0.12.2-r0)
(6/247) Installing libxdmcp (1.1.5-r1)
(7/247) Installing libxcb (1.16.1-r0)
(8/247) Installing libx11 (1.8.9-r1)
(9/247) Installing hwdata-pci (0.382-r0)
(10/247) Installing libpciaccess (0.18.1-r0)
(11/247) Installing libdrm (2.4.122-r0)
(12/247) Installing libxext (1.3.6-r2)
(13/247) Installing libxfixes (6.0.1-r4)
(14/247) Installing libffi (3.4.6-r0)
(15/247) Installing wayland-libs-client (1.23.0-r0)
(16/247) Installing libva (2.21.0-r0)
(17/247) Installing libvdpau (1.5-r3)
(18/247) Installing ffmpeg-libavutil (6.1.1-r9)
(19/247) Installing libdav1d (1.4.3-r0)
(20/247) Installing openexr-libiex (3.1.13-r1)
(21/247) Installing openexr-libilmthread (3.1.13-r1)
(22/247) Installing imath (3.1.11-r2)
(23/247) Installing openexr-libopenexr (3.1.13-r1)
(24/247) Installing giflib (5.2.2-r0)
(25/247) Installing libhwy (1.0.7-r0)
(26/247) Installing libjpeg-turbo (3.0.3-r0)
(27/247) Installing lcms2 (2.16-r0)
(28/247) Installing libpng (1.6.43-r0)
(29/247) Installing libjxl (0.10.2-r0)
(30/247) Installing lame-libs (3.100-r5)
(31/247) Installing opus (1.5.2-r0)
(32/247) Installing rav1e-libs (0.7.1-r0)
(33/247) Installing soxr (0.1.3-r7)
(34/247) Installing ffmpeg-libswresample (6.1.1-r9)
(35/247) Installing libogg (1.3.5-r5)
(36/247) Installing libtheora (1.1.1-r18)
(37/247) Installing libvorbis (1.3.7-r2)
(38/247) Installing libvpx (1.14.1-r0)
(39/247) Installing libsharpyuv (1.3.2-r0)
(40/247) Installing libwebp (1.3.2-r0)
(41/247) Installing libwebpmux (1.3.2-r0)
(42/247) Installing x264-libs (0.164_git20231001-r0)
(43/247) Installing numactl (2.0.18-r0)
(44/247) Installing x265-libs (3.6-r0)
(45/247) Installing xvidcore (1.3.7-r2)
(46/247) Installing ffmpeg-libavcodec (6.1.1-r9)
(47/247) Installing sdl2 (2.28.5-r1)
(48/247) Installing alsa-lib (1.2.12-r0)
(49/247) Installing libbz2 (1.0.8-r6)
(50/247) Installing freetype (2.13.2-r0)
(51/247) Installing fontconfig (2.15.0-r1)
(52/247) Installing fribidi (1.0.15-r0)
(53/247) Installing libintl (0.22.5-r0)
(54/247) Installing libeconf (0.6.3-r0)
(55/247) Installing libblkid (2.40.1-r1)
(56/247) Installing libmount (2.40.1-r1)
(57/247) Installing glib (2.80.3-r0)
(58/247) Installing graphite2 (1.3.14-r6)
(59/247) Installing harfbuzz (9.0.0-r0)
(60/247) Installing libunibreak (6.1-r0)
(61/247) Installing libass (0.17.3-r0)
(62/247) Installing libbluray (1.3.4-r1)
(63/247) Installing mpg123-libs (1.32.6-r0)
(64/247) Installing libopenmpt (0.7.7-r0)
(65/247) Installing mbedtls (3.6.0-r0)
(66/247) Installing librist (0.2.10-r1)
(67/247) Installing libsrt (1.5.3-r0)
(68/247) Installing libssh (0.10.6-r0)
(69/247) Installing xz-libs (5.6.2-r0)
(70/247) Installing libxml2 (2.12.8-r0)
(71/247) Installing libsodium (1.0.20-r0)
(72/247) Installing libzmq (4.3.5-r2)
(73/247) Installing ffmpeg-libavformat (6.1.1-r9)
(74/247) Installing serd-libs (0.32.2-r0)
(75/247) Installing zix-libs (0.4.2-r0)
(76/247) Installing sord-libs (0.16.16-r0)
(77/247) Installing sratom (0.6.16-r0)
(78/247) Installing lilv-libs (0.24.24-r1)
(79/247) Installing spirv-tools (1.3.261.1-r0)
(80/247) Installing glslang-libs (1.3.283.0-r0)
(81/247) Installing libdovi (3.3.0-r0)
(82/247) Installing shaderc (2024.0-r1)
(83/247) Installing vulkan-loader (1.3.261.1-r0)
(84/247) Installing libplacebo (6.338.2-r2)
(85/247) Installing ffmpeg-libpostproc (6.1.1-r9)
(86/247) Installing ffmpeg-libswscale (6.1.1-r9)
(87/247) Installing vidstab (1.1.1-r0)
(88/247) Installing zimg (3.0.5-r2)
(89/247) Installing ffmpeg-libavfilter (6.1.1-r9)
(90/247) Installing libasyncns (0.8-r3)
(91/247) Installing dbus-libs (1.14.10-r3)
(92/247) Installing libltdl (2.4.7-r3)
(93/247) Installing orc (0.4.38-r0)
(94/247) Installing libflac (1.4.3-r1)
(95/247) Installing libsndfile (1.2.2-r0)
(96/247) Installing speexdsp (1.2.1-r2)
(97/247) Installing tdb-libs (1.4.10-r0)
(98/247) Installing libpulse (17.0-r1)
(99/247) Installing v4l-utils-libs (1.26.1-r0)
(100/247) Installing ffmpeg-libavdevice (6.1.1-r9)
(101/247) Installing ffmpeg (6.1.1-r9)
(102/247) Installing gifsicle (1.95-r0)
(103/247) Installing gdbm (1.24-r0)
(104/247) Installing mpdecimal (4.0.0-r0)
(105/247) Installing libpanelw (6.5_p20240601-r0)
(106/247) Installing readline (8.2.10-r0)
(107/247) Installing sqlite-libs (3.46.0-r0)
(108/247) Installing python3 (3.12.3-r1)
(109/247) Installing python3-pycache-pyc0 (3.12.3-r1)
(110/247) Installing pyc (3.12.3-r1)
(111/247) Installing py3-pillow-pyc (10.4.0-r0)
(112/247) Installing python3-pyc (3.12.3-r1)
(113/247) Installing libimagequant (4.2.2-r0)
(114/247) Installing openjpeg (2.5.2-r0)
(115/247) Installing tiff (4.6.0t-r0)
(116/247) Installing libwebpdemux (1.3.2-r0)
(117/247) Installing py3-pillow (10.4.0-r0)
(118/247) Installing py3-six (1.16.0-r9)
(119/247) Installing py3-six-pyc (1.16.0-r9)
(120/247) Installing wget (1.24.5-r0)
(121/247) Installing py3-installer (0.7.0-r2)
(122/247) Installing py3-installer-pyc (0.7.0-r2)
(123/247) Installing py3-gpep517 (16-r0)
(124/247) Installing py3-gpep517-pyc (16-r0)
(125/247) Installing py3-parsing (3.1.2-r1)
(126/247) Installing py3-parsing-pyc (3.1.2-r1)
(127/247) Installing py3-packaging (24.1-r0)
(128/247) Installing py3-packaging-pyc (24.1-r0)
(129/247) Installing py3-setuptools (70.3.0-r0)
(130/247) Installing py3-setuptools-pyc (70.3.0-r0)
(131/247) Installing py3-wheel (0.42.0-r1)
(132/247) Installing py3-wheel-pyc (0.42.0-r1)
(133/247) Installing py3-tz (2024.1-r1)
(134/247) Installing py3-tz-pyc (2024.1-r1)
(135/247) Installing py3-babel (2.14.0-r2)
(136/247) Installing py3-babel-pyc (2.14.0-r2)
(137/247) Installing py3-soupsieve (2.5-r1)
(138/247) Installing py3-soupsieve-pyc (2.5-r1)
(139/247) Installing py3-beautifulsoup4 (4.12.3-r2)
(140/247) Installing py3-beautifulsoup4-pyc (4.12.3-r2)
(141/247) Installing py3-colorthief (0.2.1-r1)
(142/247) Installing py3-colorthief-pyc (0.2.1-r1)
(143/247) Installing py3-iso639 (0.4.5-r1)
(144/247) Installing py3-iso639-pyc (0.4.5-r1)
(145/247) Installing icu-data-en (74.2-r0)
Executing icu-data-en-74.2-r0.post-install
*
* If you need ICU with non-English locales and legacy charset support, install
* package icu-data-full.
*
(146/247) Installing icu-libs (74.2-r0)
(147/247) Installing libuuid (2.40.1-r1)
(148/247) Installing libxapian (1.4.25-r0)
(149/247) Installing libzim (9.1.0-r1)
(150/247) Installing py3-libzim (3.4.0-r1)
(151/247) Installing libgpg-error (1.49-r0)
(152/247) Installing libgcrypt (1.10.3-r0)
(153/247) Installing libxslt (1.1.39-r1)
(154/247) Installing py3-lxml (5.1.0-r0)
(155/247) Installing py3-lxml-pyc (5.1.0-r0)
(156/247) Installing py3-magic (0.4.27-r3)
(157/247) Installing py3-magic-pyc (0.4.27-r3)
(158/247) Installing py3-piexif (1.1.3-r7)
(159/247) Installing py3-piexif-pyc (1.1.3-r7)
(160/247) Installing yaml (0.2.5-r2)
(161/247) Installing py3-yaml (6.0.1-r3)
(162/247) Installing py3-yaml-pyc (6.0.1-r3)
(163/247) Installing py3-watchdog (4.0.0-r1)
(164/247) Installing py3-watchdog-pyc (4.0.0-r1)
(165/247) Installing py3-optimize-images (1.5.1-r1)
(166/247) Installing py3-optimize-images-pyc (1.5.1-r1)
(167/247) Installing py3-iniconfig (2.0.0-r1)
(168/247) Installing py3-iniconfig-pyc (2.0.0-r1)
(169/247) Installing py3-pluggy (1.5.0-r0)
(170/247) Installing py3-pluggy-pyc (1.5.0-r0)
(171/247) Installing py3-py (1.11.0-r3)
(172/247) Installing py3-py-pyc (1.11.0-r3)
(173/247) Installing py3-pytest (8.2.2-r1)
(174/247) Installing py3-pytest-pyc (8.2.2-r1)
(175/247) Installing py3-coverage (7.5.1-r0)
(176/247) Installing py3-coverage-pyc (7.5.1-r0)
(177/247) Installing py3-pytest-cov (5.0.0-r0)
(178/247) Installing py3-pytest-cov-pyc (5.0.0-r0)
(179/247) Installing py3-blinker (1.7.0-r1)
(180/247) Installing py3-blinker-pyc (1.7.0-r1)
(181/247) Installing py3-click (8.1.7-r2)
(182/247) Installing py3-click-pyc (8.1.7-r2)
(183/247) Installing py3-itsdangerous (2.1.2-r4)
(184/247) Installing py3-itsdangerous-pyc (2.1.2-r4)
(185/247) Installing py3-markupsafe (2.1.5-r1)
(186/247) Installing py3-markupsafe-pyc (2.1.5-r1)
(187/247) Installing py3-jinja2 (3.1.4-r0)
(188/247) Installing py3-jinja2-pyc (3.1.4-r0)
(189/247) Installing py3-werkzeug (3.0.3-r0)
(190/247) Installing py3-werkzeug-pyc (3.0.3-r0)
(191/247) Installing py3-flask (3.0.3-r0)
(192/247) Installing py3-flask-pyc (3.0.3-r0)
(193/247) Installing py3-raven (6.10.0-r7)
(194/247) Installing py3-raven-pyc (6.10.0-r7)
(195/247) Installing py3-brotli (1.1.0-r2)
(196/247) Installing py3-brotli-pyc (1.1.0-r2)
(197/247) Installing py3-decorator (5.1.1-r4)
(198/247) Installing py3-decorator-pyc (5.1.1-r4)
(199/247) Installing py3-httpbin (0.10.2-r3)
(200/247) Installing py3-httpbin-pyc (0.10.2-r3)
(201/247) Installing py3-pytest-httpbin (2.0.0-r1)
(202/247) Installing py3-pytest-httpbin-pyc (2.0.0-r1)
(203/247) Installing py3-certifi (2024.2.2-r1)
(204/247) Installing py3-certifi-pyc (2024.2.2-r1)
(205/247) Installing py3-charset-normalizer (3.3.2-r1)
(206/247) Installing py3-charset-normalizer-pyc (3.3.2-r1)
(207/247) Installing py3-idna (3.7-r0)
(208/247) Installing py3-idna-pyc (3.7-r0)
(209/247) Installing py3-urllib3 (1.26.18-r1)
(210/247) Installing py3-urllib3-pyc (1.26.18-r1)
(211/247) Installing py3-requests (2.32.3-r0)
(212/247) Installing py3-requests-pyc (2.32.3-r0)
(213/247) Installing py3-resizeimage (1.1.20-r1)
(214/247) Installing py3-resizeimage-pyc (1.1.20-r1)
(215/247) Installing py3-cparser (2.22-r1)
(216/247) Installing py3-cparser-pyc (2.22-r1)
(217/247) Installing py3-cffi (1.16.0-r1)
(218/247) Installing py3-cffi-pyc (1.16.0-r1)
(219/247) Installing py3-cryptography (42.0.8-r0)
(220/247) Installing py3-cryptography-pyc (42.0.8-r0)
(221/247) Installing py3-openssl (24.1.0-r1)
(222/247) Installing py3-openssl-pyc (24.1.0-r1)
(223/247) Installing py3-requests-file (2.1.0-r0)
(224/247) Installing py3-requests-file-pyc (2.1.0-r0)
(225/247) Installing py3-filelock (3.13.1-r1)
(226/247) Installing py3-filelock-pyc (3.13.1-r1)
(227/247) Installing py3-tldextract (5.1.2-r1)
(228/247) Installing py3-tldextract-pyc (5.1.2-r1)
(229/247) Installing py3-certauth (1.3.0-r1)
(230/247) Installing py3-certauth-pyc (1.3.0-r1)
(231/247) Installing py3-greenlet (3.0.3-r1)
(232/247) Installing py3-greenlet-pyc (3.0.3-r1)
(233/247) Installing py3-zope-event (5.0-r1)
(234/247) Installing py3-zope-event-pyc (5.0-r1)
(235/247) Installing py3-zope-interface (6.0-r1)
(236/247) Installing py3-zope-interface-pyc (6.0-r1)
(237/247) Installing libev (4.33-r1)
(238/247) Installing libuv (1.48.0-r0)
(239/247) Installing py3-gevent (23.9.1-r1)
(240/247) Installing py3-gevent-pyc (23.9.1-r1)
(241/247) Installing py3-gevent-websocket (0.10.1-r7)
(242/247) Installing py3-gevent-websocket-pyc (0.10.1-r7)
(243/247) Installing py3-wsgiprox (1.5.2-r1)
(244/247) Installing py3-wsgiprox-pyc (1.5.2-r1)
(245/247) Installing yt-dlp-core (2024.07.09-r0)
(246/247) Installing yt-dlp-core-pyc (2024.07.09-r0)
(247/247) Installing .makedepends-py3-zimscraperlib (20240711.132933)
Executing busybox-1.36.1-r31.trigger
Executing glib-2.80.3-r0.trigger
OK: 485 MiB in 346 packages
>>> py3-zimscraperlib: Cleaning up srcdir
>>> py3-zimscraperlib: Cleaning up pkgdir
>>> py3-zimscraperlib: Cleaning up tmpdir
>>> py3-zimscraperlib: Fetching https://distfiles.alpinelinux.org/distfiles/edge/py3-zimscraperlib-3.2.0.tar.gz
--2024-07-11 13:29:37--  https://distfiles.alpinelinux.org/distfiles/edge/py3-zimscraperlib-3.2.0.tar.gz
Resolving distfiles.alpinelinux.org (distfiles.alpinelinux.org)... 172.105.82.32
Connecting to distfiles.alpinelinux.org (distfiles.alpinelinux.org)|172.105.82.32|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 3193608 (3.0M) [application/octet-stream]
Saving to: '/var/cache/distfiles/py3-zimscraperlib-3.2.0.tar.gz.part'

     0K .......... .......... .......... .......... ..........  1%  226K 14s
    50K .......... .......... .......... .......... ..........  3%  227K 13s
   100K .......... .......... .......... .......... ..........  4%  200K 14s
   150K .......... .......... .......... .......... ..........  6% 52.3M 10s
   200K .......... .......... .......... .......... ..........  8%  115K 13s
   250K .......... .......... .......... .......... ..........  9% 52.1M 11s
   300K .......... .......... .......... .......... .......... 11% 52.8M 9s
   350K .......... .......... .......... .......... .......... 12%  231K 9s
   400K .......... .......... .......... .......... .......... 14%  252K 9s
   450K .......... .......... .......... .......... .......... 16%  803K 8s
   500K .......... .......... .......... .......... .......... 17%  282K 8s
   550K .......... .......... .......... .......... .......... 19%  248K 8s
   600K .......... .......... .......... .......... .......... 20%  856K 8s
   650K .......... .......... .......... .......... .......... 22%  279K 8s
   700K .......... .......... .......... .......... .......... 24%  264K 8s
   750K .......... .......... .......... .......... .......... 25%  704K 7s
   800K .......... .......... .......... .......... .......... 27%  277K 7s
   850K .......... .......... .......... .......... .......... 28%  250K 7s
   900K .......... .......... .......... .......... .......... 30%  948K 7s
   950K .......... .......... .......... .......... .......... 32%  296K 7s
  1000K .......... .......... .......... .......... .......... 33%  894K 6s
  1050K .......... .......... .......... .......... .......... 35%  273K 6s
  1100K .......... .......... .......... .......... .......... 36%  249K 6s
  1150K .......... .......... .......... .......... .......... 38%  956K 6s
  1200K .......... .......... .......... .......... .......... 40%  272K 6s
  1250K .......... .......... .......... .......... .......... 41%  245K 6s
  1300K .......... .......... .......... .......... .......... 43%  195K 6s
  1350K .......... .......... .......... .......... .......... 44% 51.9M 5s
  1400K .......... .......... .......... .......... .......... 46%  230K 5s
  1450K .......... .......... .......... .......... .......... 48% 4.62M 5s
  1500K .......... .......... .......... .......... .......... 49%  242K 5s
  1550K .......... .......... .......... .......... .......... 51%  218K 5s
  1600K .......... .......... .......... .......... .......... 52%  266K 5s
  1650K .......... .......... .......... .......... .......... 54%  237K 4s
  1700K .......... .......... .......... .......... .......... 56%  744K 4s
  1750K .......... .......... .......... .......... .......... 57%  289K 4s
  1800K .......... .......... .......... .......... .......... 59%  251K 4s
  1850K .......... .......... .......... .......... .......... 60%  795K 4s
  1900K .......... .......... .......... .......... .......... 62%  311K 4s
  1950K .......... .......... .......... .......... .......... 64%  234K 3s
  2000K .......... .......... .......... .......... .......... 65%  247K 3s
  2050K .......... .......... .......... .......... .......... 67% 1.62M 3s
  2100K .......... .......... .......... .......... .......... 68%  247K 3s
  2150K .......... .......... .......... .......... .......... 70%  823K 3s
  2200K .......... .......... .......... .......... .......... 72%  306K 3s
  2250K .......... .......... .......... .......... .......... 73%  251K 3s
  2300K .......... .......... .......... .......... .......... 75% 1.85M 2s
  2350K .......... .......... .......... .......... .......... 76%  238K 2s
  2400K .......... .......... .......... .......... .......... 78%  249K 2s
  2450K .......... .......... .......... .......... .......... 80% 2.10M 2s
  2500K .......... .......... .......... .......... .......... 81%  253K 2s
  2550K .......... .......... .......... .......... .......... 83% 1.11M 2s
  2600K .......... .......... .......... .......... .......... 84%  271K 1s
  2650K .......... .......... .......... .......... .......... 86%  199K 1s
  2700K .......... .......... .......... .......... .......... 88% 57.7M 1s
  2750K .......... .......... .......... .......... .......... 89%  230K 1s
  2800K .......... .......... .......... .......... .......... 91%  231K 1s
  2850K .......... .......... .......... .......... .......... 92%  262K 1s
  2900K .......... .......... .......... .......... .......... 94%  242K 1s
  2950K .......... .......... .......... .......... .......... 96%  493K 0s
  3000K .......... .......... .......... .......... .......... 97%  323K 0s
  3050K .......... .......... .......... .......... .......... 99%  270K 0s
  3100K .......... ........                                    100% 1.84M=9.4s

2024-07-11 13:29:47 (330 KB/s) - '/var/cache/distfiles/py3-zimscraperlib-3.2.0.tar.gz.part' saved [3193608/3193608]

>>> py3-zimscraperlib: Fetching https://distfiles.alpinelinux.org/distfiles/edge/py3-zimscraperlib-3.2.0.tar.gz
>>> py3-zimscraperlib: Checking sha512sums...
py3-zimscraperlib-3.2.0.tar.gz: OK
>>> py3-zimscraperlib: Unpacking /var/cache/distfiles/py3-zimscraperlib-3.2.0.tar.gz...
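Note: abuild drives the fetch, verify, and unpack steps above itself. A rough hand-run equivalent, as a sketch only: the sha512 value below is a placeholder for the real sum pinned in the APKBUILD's sha512sums field, and $srcdir stands in for abuild's source directory.

    # fetch the source tarball into the distfiles cache (sketch of abuild's fetch step)
    wget -O /var/cache/distfiles/py3-zimscraperlib-3.2.0.tar.gz \
        https://distfiles.alpinelinux.org/distfiles/edge/py3-zimscraperlib-3.2.0.tar.gz
    # verify against the checksum recorded in the APKBUILD (placeholder value)
    echo "<sha512-from-APKBUILD>  /var/cache/distfiles/py3-zimscraperlib-3.2.0.tar.gz" | sha512sum -c -
    # unpack into the build's source directory
    tar -C "$srcdir" -xzf /var/cache/distfiles/py3-zimscraperlib-3.2.0.tar.gz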
2024-07-11 13:29:47,886 gpep517 INFO Building wheel via backend setuptools.build_meta:__legacy__
2024-07-11 13:29:47,924 root INFO running bdist_wheel
2024-07-11 13:29:47,977 root INFO running build
2024-07-11 13:29:47,977 root INFO running build_py
2024-07-11 13:29:47,986 root INFO creating build
2024-07-11 13:29:47,986 root INFO creating build/lib
2024-07-11 13:29:47,987 root INFO creating build/lib/zimscraperlib
2024-07-11 13:29:47,987 root INFO copying src/zimscraperlib/constants.py -> build/lib/zimscraperlib
2024-07-11 13:29:47,987 root INFO copying src/zimscraperlib/download.py -> build/lib/zimscraperlib
2024-07-11 13:29:47,987 root INFO copying src/zimscraperlib/fix_ogvjs_dist.py -> build/lib/zimscraperlib
2024-07-11 13:29:47,988 root INFO copying src/zimscraperlib/i18n.py -> build/lib/zimscraperlib
2024-07-11 13:29:47,988 root INFO copying src/zimscraperlib/filesystem.py -> build/lib/zimscraperlib
2024-07-11 13:29:47,988 root INFO copying src/zimscraperlib/html.py -> build/lib/zimscraperlib
2024-07-11 13:29:47,989 root INFO copying src/zimscraperlib/misc.py -> build/lib/zimscraperlib
2024-07-11 13:29:47,989 root INFO copying src/zimscraperlib/types.py -> build/lib/zimscraperlib
2024-07-11 13:29:47,989 root INFO copying src/zimscraperlib/__init__.py -> build/lib/zimscraperlib
2024-07-11 13:29:47,989 root INFO copying src/zimscraperlib/logging.py -> build/lib/zimscraperlib
2024-07-11 13:29:47,990 root INFO copying src/zimscraperlib/inputs.py -> build/lib/zimscraperlib
2024-07-11 13:29:47,990 root INFO copying src/zimscraperlib/uri.py -> build/lib/zimscraperlib
2024-07-11 13:29:47,990 root INFO creating build/lib/zimscraperlib/zim
2024-07-11 13:29:47,991 root INFO copying src/zimscraperlib/zim/archive.py -> build/lib/zimscraperlib/zim
2024-07-11 13:29:47,991 root INFO copying src/zimscraperlib/zim/items.py -> build/lib/zimscraperlib/zim
2024-07-11 13:29:47,991 root INFO copying src/zimscraperlib/zim/filesystem.py -> build/lib/zimscraperlib/zim
2024-07-11 13:29:47,992 root INFO copying src/zimscraperlib/zim/_libkiwix.py -> build/lib/zimscraperlib/zim
2024-07-11 13:29:47,992 root INFO copying src/zimscraperlib/zim/providers.py -> build/lib/zimscraperlib/zim
2024-07-11 13:29:47,992 root INFO copying src/zimscraperlib/zim/metadata.py -> build/lib/zimscraperlib/zim
2024-07-11 13:29:47,992 root INFO copying src/zimscraperlib/zim/__init__.py -> build/lib/zimscraperlib/zim
2024-07-11 13:29:47,993 root INFO copying src/zimscraperlib/zim/creator.py -> build/lib/zimscraperlib/zim
2024-07-11 13:29:47,993 root INFO creating build/lib/zimscraperlib/video
2024-07-11 13:29:47,993 root INFO copying src/zimscraperlib/video/probing.py -> build/lib/zimscraperlib/video
2024-07-11 13:29:47,994 root INFO copying src/zimscraperlib/video/encoding.py -> build/lib/zimscraperlib/video
2024-07-11 13:29:47,994 root INFO copying src/zimscraperlib/video/config.py -> build/lib/zimscraperlib/video
2024-07-11 13:29:47,994 root INFO copying src/zimscraperlib/video/__init__.py -> build/lib/zimscraperlib/video
2024-07-11 13:29:47,994 root INFO copying src/zimscraperlib/video/presets.py -> build/lib/zimscraperlib/video
2024-07-11 13:29:47,995 root INFO creating build/lib/zimscraperlib/image
2024-07-11 13:29:47,995 root INFO copying src/zimscraperlib/image/transformation.py -> build/lib/zimscraperlib/image
2024-07-11 13:29:47,995 root INFO copying src/zimscraperlib/image/convertion.py -> build/lib/zimscraperlib/image
2024-07-11 13:29:47,996 root INFO copying src/zimscraperlib/image/probing.py -> build/lib/zimscraperlib/image
2024-07-11 13:29:47,996 root INFO copying src/zimscraperlib/image/optimization.py -> build/lib/zimscraperlib/image
2024-07-11 13:29:47,996 root INFO copying src/zimscraperlib/image/utils.py -> build/lib/zimscraperlib/image
2024-07-11 13:29:47,996 root INFO copying src/zimscraperlib/image/__init__.py -> build/lib/zimscraperlib/image
2024-07-11 13:29:47,997 root INFO copying src/zimscraperlib/image/presets.py -> build/lib/zimscraperlib/image
2024-07-11 13:29:47,997 root INFO running egg_info
2024-07-11 13:29:47,997 root INFO creating src/zimscraperlib.egg-info
2024-07-11 13:29:48,005 root INFO writing src/zimscraperlib.egg-info/PKG-INFO
2024-07-11 13:29:48,006 root INFO writing dependency_links to src/zimscraperlib.egg-info/dependency_links.txt
2024-07-11 13:29:48,006 root INFO writing entry points to src/zimscraperlib.egg-info/entry_points.txt
2024-07-11 13:29:48,006 root INFO writing requirements to src/zimscraperlib.egg-info/requires.txt
2024-07-11 13:29:48,007 root INFO writing top-level names to src/zimscraperlib.egg-info/top_level.txt
2024-07-11 13:29:48,007 root INFO writing manifest file 'src/zimscraperlib.egg-info/SOURCES.txt'
2024-07-11 13:29:48,017 root INFO reading manifest file 'src/zimscraperlib.egg-info/SOURCES.txt'
2024-07-11 13:29:48,017 root INFO reading manifest template 'MANIFEST.in'
2024-07-11 13:29:48,018 root INFO adding license file 'LICENSE'
2024-07-11 13:29:48,020 root INFO writing manifest file 'src/zimscraperlib.egg-info/SOURCES.txt'
2024-07-11 13:29:48,021 root INFO copying src/zimscraperlib/VERSION -> build/lib/zimscraperlib
2024-07-11 13:29:48,040 root INFO installing to build/bdist.linux-loongarch64/wheel
2024-07-11 13:29:48,040 root INFO running install
2024-07-11 13:29:48,052 root INFO running install_lib
2024-07-11 13:29:48,061 root INFO creating build/bdist.linux-loongarch64
2024-07-11 13:29:48,061 root INFO creating build/bdist.linux-loongarch64/wheel
2024-07-11 13:29:48,061 root INFO creating build/bdist.linux-loongarch64/wheel/zimscraperlib
2024-07-11 13:29:48,062 root INFO copying build/lib/zimscraperlib/constants.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib
2024-07-11 13:29:48,062 root INFO copying build/lib/zimscraperlib/download.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib
2024-07-11 13:29:48,062 root INFO copying build/lib/zimscraperlib/fix_ogvjs_dist.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib
2024-07-11 13:29:48,062 root INFO copying build/lib/zimscraperlib/i18n.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib
2024-07-11 13:29:48,063 root INFO copying build/lib/zimscraperlib/filesystem.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib
2024-07-11 13:29:48,063 root INFO creating build/bdist.linux-loongarch64/wheel/zimscraperlib/zim
2024-07-11 13:29:48,063 root INFO copying build/lib/zimscraperlib/zim/archive.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib/zim
2024-07-11 13:29:48,063 root INFO copying build/lib/zimscraperlib/zim/items.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib/zim
2024-07-11 13:29:48,064 root INFO copying build/lib/zimscraperlib/zim/filesystem.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib/zim
2024-07-11 13:29:48,064 root INFO copying build/lib/zimscraperlib/zim/_libkiwix.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib/zim
2024-07-11 13:29:48,064 root INFO copying build/lib/zimscraperlib/zim/providers.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib/zim
2024-07-11 13:29:48,064 root INFO copying build/lib/zimscraperlib/zim/metadata.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib/zim
2024-07-11 13:29:48,065 root INFO copying build/lib/zimscraperlib/zim/__init__.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib/zim
2024-07-11 13:29:48,065 root INFO copying build/lib/zimscraperlib/zim/creator.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib/zim
2024-07-11 13:29:48,065 root INFO copying build/lib/zimscraperlib/html.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib
2024-07-11 13:29:48,066 root INFO copying build/lib/zimscraperlib/misc.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib
2024-07-11 13:29:48,066 root INFO creating build/bdist.linux-loongarch64/wheel/zimscraperlib/video
2024-07-11 13:29:48,066 root INFO copying build/lib/zimscraperlib/video/probing.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib/video
2024-07-11 13:29:48,066 root INFO copying build/lib/zimscraperlib/video/encoding.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib/video
2024-07-11 13:29:48,067 root INFO copying build/lib/zimscraperlib/video/config.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib/video
2024-07-11 13:29:48,067 root INFO copying build/lib/zimscraperlib/video/__init__.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib/video
2024-07-11 13:29:48,067 root INFO copying build/lib/zimscraperlib/video/presets.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib/video
2024-07-11 13:29:48,067 root INFO copying build/lib/zimscraperlib/types.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib
2024-07-11 13:29:48,068 root INFO copying build/lib/zimscraperlib/__init__.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib
2024-07-11 13:29:48,068 root INFO creating build/bdist.linux-loongarch64/wheel/zimscraperlib/image
2024-07-11 13:29:48,068 root INFO copying build/lib/zimscraperlib/image/transformation.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib/image
2024-07-11 13:29:48,068 root INFO copying build/lib/zimscraperlib/image/convertion.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib/image
2024-07-11 13:29:48,068 root INFO copying build/lib/zimscraperlib/image/probing.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib/image
2024-07-11 13:29:48,069 root INFO copying build/lib/zimscraperlib/image/optimization.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib/image
2024-07-11 13:29:48,069 root INFO copying build/lib/zimscraperlib/image/utils.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib/image
2024-07-11 13:29:48,069 root INFO copying build/lib/zimscraperlib/image/__init__.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib/image
2024-07-11 13:29:48,070 root INFO copying build/lib/zimscraperlib/image/presets.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib/image
2024-07-11 13:29:48,070 root INFO copying build/lib/zimscraperlib/logging.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib
2024-07-11 13:29:48,070 root INFO copying build/lib/zimscraperlib/inputs.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib
2024-07-11 13:29:48,070 root INFO copying build/lib/zimscraperlib/VERSION -> build/bdist.linux-loongarch64/wheel/zimscraperlib
2024-07-11 13:29:48,071 root INFO copying build/lib/zimscraperlib/uri.py -> build/bdist.linux-loongarch64/wheel/zimscraperlib
2024-07-11 13:29:48,071 root INFO running install_egg_info
2024-07-11 13:29:48,080 root INFO Copying src/zimscraperlib.egg-info to build/bdist.linux-loongarch64/wheel/zimscraperlib-3.2.0-py3.12.egg-info
2024-07-11 13:29:48,082 root INFO running install_scripts
2024-07-11 13:29:48,086 root INFO creating build/bdist.linux-loongarch64/wheel/zimscraperlib-3.2.0.dist-info/WHEEL
2024-07-11 13:29:48,086 wheel INFO creating '/home/buildozer/aports/testing/py3-zimscraperlib/src/python-scraperlib-3.2.0/.dist/.tmp-_vx58pm7/zimscraperlib-3.2.0-py3-none-any.whl' and adding 'build/bdist.linux-loongarch64/wheel' to it
2024-07-11 13:29:48,087 wheel INFO adding 'zimscraperlib/VERSION'
2024-07-11 13:29:48,087 wheel INFO adding 'zimscraperlib/__init__.py'
2024-07-11 13:29:48,087 wheel INFO adding 'zimscraperlib/constants.py'
2024-07-11 13:29:48,088 wheel INFO adding 'zimscraperlib/download.py'
2024-07-11 13:29:48,088 wheel INFO adding 'zimscraperlib/filesystem.py'
2024-07-11 13:29:48,088 wheel INFO adding 'zimscraperlib/fix_ogvjs_dist.py'
2024-07-11 13:29:48,089 wheel INFO adding 'zimscraperlib/html.py'
2024-07-11 13:29:48,089 wheel INFO adding 'zimscraperlib/i18n.py'
2024-07-11 13:29:48,089 wheel INFO adding 'zimscraperlib/inputs.py'
2024-07-11 13:29:48,089 wheel INFO adding 'zimscraperlib/logging.py'
2024-07-11 13:29:48,090 wheel INFO adding 'zimscraperlib/misc.py'
2024-07-11 13:29:48,090 wheel INFO adding 'zimscraperlib/types.py'
2024-07-11 13:29:48,090 wheel INFO adding 'zimscraperlib/uri.py'
2024-07-11 13:29:48,091 wheel INFO adding 'zimscraperlib/image/__init__.py'
2024-07-11 13:29:48,091 wheel INFO adding 'zimscraperlib/image/convertion.py'
2024-07-11 13:29:48,091 wheel INFO adding 'zimscraperlib/image/optimization.py'
2024-07-11 13:29:48,092 wheel INFO adding 'zimscraperlib/image/presets.py'
2024-07-11 13:29:48,092 wheel INFO adding 'zimscraperlib/image/probing.py'
2024-07-11 13:29:48,092 wheel INFO adding 'zimscraperlib/image/transformation.py'
2024-07-11 13:29:48,092 wheel INFO adding 'zimscraperlib/image/utils.py'
2024-07-11 13:29:48,093 wheel INFO adding 'zimscraperlib/video/__init__.py'
2024-07-11 13:29:48,093 wheel INFO adding 'zimscraperlib/video/config.py'
2024-07-11 13:29:48,093 wheel INFO adding 'zimscraperlib/video/encoding.py'
2024-07-11 13:29:48,094 wheel INFO adding 'zimscraperlib/video/presets.py'
2024-07-11 13:29:48,094 wheel INFO adding 'zimscraperlib/video/probing.py'
2024-07-11 13:29:48,094 wheel INFO adding 'zimscraperlib/zim/__init__.py'
2024-07-11 13:29:48,094 wheel INFO adding 'zimscraperlib/zim/_libkiwix.py'
2024-07-11 13:29:48,095 wheel INFO adding 'zimscraperlib/zim/archive.py'
2024-07-11 13:29:48,095 wheel INFO adding 'zimscraperlib/zim/creator.py'
2024-07-11 13:29:48,095 wheel INFO adding 'zimscraperlib/zim/filesystem.py'
2024-07-11 13:29:48,096 wheel INFO adding 'zimscraperlib/zim/items.py'
2024-07-11 13:29:48,096 wheel INFO adding 'zimscraperlib/zim/metadata.py'
2024-07-11 13:29:48,096 wheel INFO adding 'zimscraperlib/zim/providers.py'
2024-07-11 13:29:48,097 wheel INFO adding 'zimscraperlib-3.2.0.dist-info/LICENSE'
2024-07-11 13:29:48,097 wheel INFO adding 'zimscraperlib-3.2.0.dist-info/METADATA'
2024-07-11 13:29:48,098 wheel INFO adding 'zimscraperlib-3.2.0.dist-info/WHEEL'
2024-07-11 13:29:48,098 wheel INFO adding 'zimscraperlib-3.2.0.dist-info/entry_points.txt'
2024-07-11 13:29:48,098 wheel INFO adding 'zimscraperlib-3.2.0.dist-info/top_level.txt'
2024-07-11 13:29:48,098 wheel INFO adding 'zimscraperlib-3.2.0.dist-info/RECORD'
2024-07-11 13:29:48,099 root INFO removing build/bdist.linux-loongarch64/wheel
2024-07-11 13:29:48,101 gpep517 INFO The backend produced .dist/zimscraperlib-3.2.0-py3-none-any.whl
zimscraperlib-3.2.0-py3-none-any.whl
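Note: the wheel above is built by gpep517, which fell back to the legacy setuptools backend (setuptools.build_meta:__legacy__) because the project declares no PEP 517 backend of its own. A minimal sketch of the equivalent manual invocation, run from the unpacked source tree; --output-fd 1 just echoes the produced wheel filename, as seen in the last log line:

    # build the wheel roughly the way abuild's gpep517 step does (sketch)
    python3 -m gpep517 build-wheel \
        --backend setuptools.build_meta:__legacy__ \
        --output-fd 1 \
        --wheel-dir .dist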
============================= test session starts ==============================
platform linux -- Python 3.12.3, pytest-8.2.2, pluggy-1.5.0
rootdir: /home/buildozer/aports/testing/py3-zimscraperlib/src/python-scraperlib-3.2.0
plugins: httpbin-2.0.0, cov-5.0.0
collected 333 items / 4 deselected / 329 selected

tests/download/test_download.py ............FFF.F                        [  5%]
tests/filesystem/test_filesystem.py .....                                [  6%]
tests/html/test_html.py ......                                           [  8%]
tests/i18n/test_i18n.py ........................                         [ 15%]
tests/image/test_image.py .............................................. [ 29%]
..........................                                               [ 37%]
tests/inputs/test_inputs.py ...................                          [ 43%]
tests/logging/test_logging.py ...............                            [ 48%]
tests/misc/test_misc.py ....                                             [ 49%]
tests/ogvjs/test_ogvjs.py ..                                             [ 49%]
tests/types/test_types.py ..............                                 [ 54%]
tests/uri/test_uri.py ..............                                     [ 58%]
tests/video/test_video.py .....................                          [ 64%]
tests/zim/test_archive.py .....F.F.                                      [ 67%]
tests/zim/test_fs.py .......                                             [ 69%]
tests/zim/test_libkiwix.py ................                              [ 74%]
tests/zim/test_zim_creator.py .............F.F.FFF...................... [ 87%]
..........................................                               [100%]

=================================== FAILURES ===================================
____ test_youtube_download_serial[https://vimeo.com/619427082-619427082_0] _____

self =

    def _new_conn(self):
        """Establish a socket connection and set nodelay settings on it.

        :return: New socket connection.
        """
        extra_kw = {}
        if self.source_address:
            extra_kw["source_address"] = self.source_address

        if self.socket_options:
            extra_kw["socket_options"] = self.socket_options

        try:
>           conn = connection.create_connection(
                (self._dns_host, self.port), self.timeout, **extra_kw
            )

/usr/lib/python3.12/site-packages/urllib3/connection.py:174:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/site-packages/urllib3/util/connection.py:95: in create_connection
    raise err
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('vimeo.com', 443), timeout = 20.0, source_address = None
socket_options = [(6, 1, 1)]

    def create_connection(
        address,
        timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
        source_address=None,
        socket_options=None,
    ):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host, port)``) and
        return the socket object.  Passing the optional *timeout* parameter will set
        the timeout on the socket instance before attempting to connect.  If no
        *timeout* is supplied, the global default timeout setting returned by
        :func:`socket.getdefaulttimeout` is used.  If *source_address* is set it must
        be a tuple of (host, port) for the socket to bind as a source address before
        making the connection.  An host of '' or port 0 tells the OS to use the
        default.
        """

        host, port = address
        if host.startswith("["):
            host = host.strip("[]")
        err = None

        # Using the value from allowed_gai_family() in the context of getaddrinfo lets
        # us select whether to work with IPv4 DNS records, IPv6 records, or both.
        # The original create_connection function always returns all records.
        family = allowed_gai_family()

        try:
            host.encode("idna")
        except UnicodeError:
            return six.raise_from(
                LocationParseError(u"'%s', label empty or too long" % host), None
            )

        for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket.socket(af, socktype, proto)

                # If provided, set socket level options before connecting.
                _set_socket_options(sock, socket_options)

                if timeout is not socket._GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               OSError: [Errno 101] Network unreachable

/usr/lib/python3.12/site-packages/urllib3/util/connection.py:85: OSError

During handling of the above exception, another exception occurred:

self =
request =

    def _send(self, request):
        headers = self._merge_headers(request.headers)
        add_accept_encoding_header(headers, SUPPORTED_ENCODINGS)

        max_redirects_exceeded = False

        session = self._get_instance(cookiejar=self._get_cookiejar(request))

        try:
>           requests_res = session.request(
                method=request.method,
                url=request.url,
                data=request.data,
                headers=headers,
                timeout=self._calculate_timeout(request),
                proxies=self._get_proxies(request),
                allow_redirects=True,
                stream=True,
            )

/usr/lib/python3.12/site-packages/yt_dlp/networking/_requests.py:324:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.12/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.12/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.12/site-packages/urllib3/connectionpool.py:799: in urlopen
    retries = retries.increment(
/usr/lib/python3.12/site-packages/urllib3/util/retry.py:525: in increment
    raise six.reraise(type(error), error, _stacktrace)
/usr/lib/python3.12/site-packages/urllib3/packages/six.py:770: in reraise
    raise value
/usr/lib/python3.12/site-packages/urllib3/connectionpool.py:715: in urlopen
    httplib_response = self._make_request(
/usr/lib/python3.12/site-packages/urllib3/connectionpool.py:404: in _make_request
    self._validate_conn(conn)
/usr/lib/python3.12/site-packages/urllib3/connectionpool.py:1058: in _validate_conn
    conn.connect()
/usr/lib/python3.12/site-packages/urllib3/connection.py:363: in connect
    self.sock = conn = self._new_conn()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    def _new_conn(self):
        """Establish a socket connection and set nodelay settings on it.

        :return: New socket connection.
        """
        extra_kw = {}
        if self.source_address:
            extra_kw["source_address"] = self.source_address

        if self.socket_options:
            extra_kw["socket_options"] = self.socket_options

        try:
            conn = connection.create_connection(
                (self._dns_host, self.port), self.timeout, **extra_kw
            )

        except SocketTimeout:
            raise ConnectTimeoutError(
                self,
                "Connection to %s timed out. (connect timeout=%s)"
                % (self.host, self.timeout),
            )

        except SocketError as e:
>           raise NewConnectionError(
                self, "Failed to establish a new connection: %s" % e
            )
E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 101] Network unreachable

/usr/lib/python3.12/site-packages/urllib3/connection.py:186: NewConnectionError

The above exception was the direct cause of the following exception:

self =
url_or_request = 'https://vimeo.com/619427082', video_id = '619427082'
note = None, errnote = 'Unable to download webpage', fatal = True, data = None
headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/92.0.4515.159 .../*;q=0.8', 'Accept-Language': 'en-us,en;q=0.5', 'Sec-Fetch-Mode': 'navigate', 'Referer': 'https://vimeo.com/619427082'}
query = {}, expected_status = None
impersonate = ImpersonateTarget(client=None, version=None, os=None, os_version=None)
require_impersonation = False

    def _request_webpage(self, url_or_request, video_id, note=None, errnote=None,
                         fatal=True, data=None, headers=None, query=None,
                         expected_status=None, impersonate=None,
                         require_impersonation=False):
        """
        Return the response handle.

        See _download_webpage docstring for arguments specification.
        """
        if not self._downloader._first_webpage_request:
            sleep_interval = self.get_param('sleep_interval_requests') or 0
            if sleep_interval > 0:
                self.to_screen(f'Sleeping {sleep_interval} seconds ...')
                time.sleep(sleep_interval)
        else:
            self._downloader._first_webpage_request = False

        if note is None:
            self.report_download_webpage(video_id)
        elif note is not False:
            if video_id is None:
                self.to_screen(str(note))
            else:
                self.to_screen(f'{video_id}: {note}')

        # Some sites check X-Forwarded-For HTTP header in order to figure out
        # the origin of the client behind proxy. This allows bypassing geo
        # restriction by faking this header's value to IP that belongs to some
        # geo unrestricted country. We will do so once we encounter any
        # geo restriction error.
        if self._x_forwarded_for_ip:
            headers = (headers or {}).copy()
            headers.setdefault('X-Forwarded-For', self._x_forwarded_for_ip)

        extensions = {}

        if impersonate in (True, ''):
            impersonate = ImpersonateTarget()
        requested_targets = [
            t if isinstance(t, ImpersonateTarget) else ImpersonateTarget.from_str(t)
            for t in variadic(impersonate)
        ] if impersonate else []
        available_target = next(filter(self._downloader._impersonate_target_available, requested_targets), None)
        if available_target:
            extensions['impersonate'] = available_target
        elif requested_targets:
            message = 'The extractor is attempting impersonation, but '
            message += (
                'no impersonate target is available' if not str(impersonate)
                else f'none of these impersonate targets are available: "{", ".join(map(str, requested_targets))}"')
            info_msg = ('see https://github.com/yt-dlp/yt-dlp#impersonation '
                        'for information on installing the required dependencies')
            if require_impersonation:
                raise ExtractorError(f'{message}; {info_msg}', expected=True)
            self.report_warning(f'{message}; if you encounter errors, then {info_msg}', only_once=True)

        try:
>           return self._downloader.urlopen(self._create_request(url_or_request, data, headers, query, extensions))

/usr/lib/python3.12/site-packages/yt_dlp/extractor/common.py:896:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/site-packages/yt_dlp/YoutubeDL.py:4160: in urlopen
    return self._request_director.send(req)
/usr/lib/python3.12/site-packages/yt_dlp/networking/common.py:117: in send
    response = handler.send(request)
/usr/lib/python3.12/site-packages/yt_dlp/networking/_helper.py:208: in wrapper
    return func(self, *args, **kwargs)
/usr/lib/python3.12/site-packages/yt_dlp/networking/common.py:337: in send
    return self._send(request)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
request =

    def _send(self, request):
        headers = self._merge_headers(request.headers)
        add_accept_encoding_header(headers, SUPPORTED_ENCODINGS)

        max_redirects_exceeded = False

        session = self._get_instance(cookiejar=self._get_cookiejar(request))

        try:
            requests_res = session.request(
                method=request.method,
                url=request.url,
                data=request.data,
                headers=headers,
                timeout=self._calculate_timeout(request),
                proxies=self._get_proxies(request),
                allow_redirects=True,
                stream=True,
            )

        except requests.exceptions.TooManyRedirects as e:
            max_redirects_exceeded = True
            requests_res = e.response

        except requests.exceptions.SSLError as e:
            if 'CERTIFICATE_VERIFY_FAILED' in str(e):
                raise CertificateVerifyError(cause=e) from e
            raise SSLError(cause=e) from e

        except requests.exceptions.ProxyError as e:
            raise ProxyError(cause=e) from e

        except (requests.exceptions.ConnectionError, requests.exceptions.Timeout) as e:
            raise TransportError(cause=e) from e

        except urllib3.exceptions.HTTPError as e:
            # Catch any urllib3 exceptions that may leak through
>           raise TransportError(cause=e) from e
E           yt_dlp.networking.exceptions.TransportError: : Failed to establish a new connection: [Errno 101] Network unreachable

/usr/lib/python3.12/site-packages/yt_dlp/networking/_requests.py:352: TransportError

During handling of the above exception, another exception occurred:

self =
args = ('https://vimeo.com/619427082', , True, {}, True)
kwargs = {}

    @functools.wraps(func)
    def wrapper(self, *args, **kwargs):
        while True:
            try:
>               return func(self, *args, **kwargs)

/usr/lib/python3.12/site-packages/yt_dlp/YoutubeDL.py:1622:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/site-packages/yt_dlp/YoutubeDL.py:1757: in __extract_info
    ie_result = ie.extract(url)
/usr/lib/python3.12/site-packages/yt_dlp/extractor/common.py:740: in extract
    ie_result = self._real_extract(url)
/usr/lib/python3.12/site-packages/yt_dlp/extractor/vimeo.py:844: in _real_extract
    webpage, urlh = self._download_webpage_handle(
/usr/lib/python3.12/site-packages/yt_dlp/extractor/common.py:960: in _download_webpage_handle
    urlh = self._request_webpage(url_or_request, video_id, note, errnote, fatal, data=data,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
url_or_request = 'https://vimeo.com/619427082', video_id = '619427082'
note = None, errnote = 'Unable to download webpage', fatal = True, data = None
headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/92.0.4515.159 .../*;q=0.8', 'Accept-Language': 'en-us,en;q=0.5', 'Sec-Fetch-Mode': 'navigate', 'Referer': 'https://vimeo.com/619427082'}
query = {}, expected_status = None
impersonate = ImpersonateTarget(client=None, version=None, os=None, os_version=None)
require_impersonation = False

    def _request_webpage(self, url_or_request, video_id, note=None, errnote=None,
                         fatal=True, data=None, headers=None, query=None,
                         expected_status=None, impersonate=None,
                         require_impersonation=False):
        """
        Return the response handle.

        See _download_webpage docstring for arguments specification.
        """
        if not self._downloader._first_webpage_request:
            sleep_interval = self.get_param('sleep_interval_requests') or 0
            if sleep_interval > 0:
                self.to_screen(f'Sleeping {sleep_interval} seconds ...')
                time.sleep(sleep_interval)
        else:
            self._downloader._first_webpage_request = False

        if note is None:
            self.report_download_webpage(video_id)
        elif note is not False:
            if video_id is None:
                self.to_screen(str(note))
            else:
                self.to_screen(f'{video_id}: {note}')

        # Some sites check X-Forwarded-For HTTP header in order to figure out
        # the origin of the client behind proxy. This allows bypassing geo
        # restriction by faking this header's value to IP that belongs to some
        # geo unrestricted country. We will do so once we encounter any
        # geo restriction error.
        if self._x_forwarded_for_ip:
            headers = (headers or {}).copy()
            headers.setdefault('X-Forwarded-For', self._x_forwarded_for_ip)

        extensions = {}

        if impersonate in (True, ''):
            impersonate = ImpersonateTarget()
        requested_targets = [
            t if isinstance(t, ImpersonateTarget) else ImpersonateTarget.from_str(t)
            for t in variadic(impersonate)
        ] if impersonate else []
        available_target = next(filter(self._downloader._impersonate_target_available, requested_targets), None)
        if available_target:
            extensions['impersonate'] = available_target
        elif requested_targets:
            message = 'The extractor is attempting impersonation, but '
            message += (
                'no impersonate target is available' if not str(impersonate)
                else f'none of these impersonate targets are available: "{", ".join(map(str, requested_targets))}"')
            info_msg = ('see https://github.com/yt-dlp/yt-dlp#impersonation '
                        'for information on installing the required dependencies')
            if require_impersonation:
                raise ExtractorError(f'{message}; {info_msg}', expected=True)
            self.report_warning(f'{message}; if you encounter errors, then {info_msg}', only_once=True)

        try:
            return self._downloader.urlopen(self._create_request(url_or_request, data, headers, query, extensions))
        except network_exceptions as err:
            if isinstance(err, HTTPError):
                if self.__can_accept_status_code(err, expected_status):
                    return err.response
            if errnote is False:
                return False
            if errnote is None:
                errnote = 'Unable to download webpage'

            errmsg = f'{errnote}: {err}'
            if fatal:
>               raise ExtractorError(errmsg, cause=err)
E               yt_dlp.utils.ExtractorError: [vimeo] 619427082: Unable to download webpage: : Failed to establish a new connection: [Errno 101] Network unreachable (caused by TransportError(': Failed to establish a new connection: [Errno 101] Network unreachable'))

/usr/lib/python3.12/site-packages/yt_dlp/extractor/common.py:909: ExtractorError

During handling of the above exception, another exception occurred:

url = 'https://vimeo.com/619427082', video_id = '619427082'
tmp_path = PosixPath('/tmp/pytest-of-buildozer/pytest-62/test_youtube_download_serial_h0')

    @pytest.mark.slow
    @pytest.mark.parametrize(
        "url,video_id",
        [
            ("https://vimeo.com/619427082", "619427082"),
            ("https://vimeo.com/619427082", "619427082"),
        ],
    )
    def test_youtube_download_serial(url, video_id, tmp_path):
        yt_downloader = YoutubeDownloader(threads=1)
        options = BestMp4.get_options(
            target_dir=tmp_path,
            filepath=pathlib.Path("%(id)s/video.%(ext)s"),
        )
>       yt_downloader.download(url, options)

tests/download/test_download.py:146:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
.testenv/lib/python3.12/site-packages/zimscraperlib/download.py:63: in download
    raise future.exception()
/usr/lib/python3.12/concurrent/futures/thread.py:58: in run
    result = self.fn(*self.args, **self.kwargs)
.testenv/lib/python3.12/site-packages/zimscraperlib/download.py:40: in _run_youtube_dl
    ydl.download([url])
/usr/lib/python3.12/site-packages/yt_dlp/YoutubeDL.py:3602: in download
    self.__download_wrapper(self.extract_info)(
/usr/lib/python3.12/site-packages/yt_dlp/YoutubeDL.py:3577: in wrapper
    res = func(*args, **kwargs)
/usr/lib/python3.12/site-packages/yt_dlp/YoutubeDL.py:1611: in extract_info
    return self.__extract_info(url, self.get_info_extractor(key), download, extra_info, process)
/usr/lib/python3.12/site-packages/yt_dlp/YoutubeDL.py:1640: in wrapper
    self.report_error(str(e), e.format_traceback())
/usr/lib/python3.12/site-packages/yt_dlp/YoutubeDL.py:1088: in report_error
    self.trouble(f'{self._format_err("ERROR:", self.Styles.ERROR)} {message}', *args, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
message = "ERROR: [vimeo] 619427082: Unable to download webpage: : ...on.HTTPSConnection object at 0x7fffeb489970>: Failed to establish a new connection: [Errno 101] Network unreachable'))"
tb = '  File "/usr/lib/python3.12/site-packages/yt_dlp/extractor/common.py", line 740, in extract\n    ie_result = self._re...ion.HTTPSConnection object at 0x7fffeb489970>: Failed to establish a new connection: [Errno 101] Network unreachable\n'
is_error = True

    def trouble(self, message=None, tb=None, is_error=True):
        """Determine action to take when a download problem appears.

        Depending on if the downloader has been configured to ignore
        download errors or not, this method may throw an exception or
        not when errors are found, after printing the message.

        @param tb          If given, is additional traceback information
        @param is_error    Whether to raise error according to ignorerrors
        """
        if message is not None:
            self.to_stderr(message)
        if self.params.get('verbose'):
            if tb is None:
                if sys.exc_info()[0]:  # if .trouble has been called from an except block
                    tb = ''
                    if hasattr(sys.exc_info()[1], 'exc_info') and sys.exc_info()[1].exc_info[0]:
                        tb += ''.join(traceback.format_exception(*sys.exc_info()[1].exc_info))
                    tb += encode_compat_str(traceback.format_exc())
                else:
                    tb_data = traceback.format_list(traceback.extract_stack())
                    tb = ''.join(tb_data)
            if tb:
                self.to_stderr(tb)
        if not is_error:
            return
        if not self.params.get('ignoreerrors'):
            if sys.exc_info()[0] and hasattr(sys.exc_info()[1], 'exc_info') and sys.exc_info()[1].exc_info[0]:
                exc_info = sys.exc_info()[1].exc_info
            else:
                exc_info = sys.exc_info()
>           raise DownloadError(message, exc_info)
E           yt_dlp.utils.DownloadError: ERROR: [vimeo] 619427082: Unable to download webpage: : Failed to establish a new connection: [Errno 101] Network unreachable (caused by TransportError(': Failed to establish a new connection: [Errno 101] Network unreachable'))

/usr/lib/python3.12/site-packages/yt_dlp/YoutubeDL.py:1027: DownloadError
----------------------------- Captured stdout call -----------------------------
[vimeo] Extracting URL: https://vimeo.com/619427082
[vimeo] 619427082: Downloading webpage
----------------------------- Captured stderr call -----------------------------
WARNING: [vimeo] The extractor is attempting impersonation, but no impersonate target is available; if you encounter errors, then see https://github.com/yt-dlp/yt-dlp#impersonation for information on installing the required dependencies
ERROR: [vimeo] 619427082: Unable to download webpage: : Failed to establish a new connection: [Errno 101] Network unreachable (caused by TransportError(': Failed to establish a new connection: [Errno 101] Network unreachable'))
____ test_youtube_download_serial[https://vimeo.com/619427082-619427082_1] _____

self =

    def _new_conn(self):
        """Establish a socket connection and set nodelay settings on it.

        :return: New socket connection.
        """
        extra_kw = {}
        if self.source_address:
            extra_kw["source_address"] = self.source_address

        if self.socket_options:
            extra_kw["socket_options"] = self.socket_options

        try:
>           conn = connection.create_connection(
                (self._dns_host, self.port), self.timeout, **extra_kw
            )

/usr/lib/python3.12/site-packages/urllib3/connection.py:174:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/site-packages/urllib3/util/connection.py:95: in create_connection
    raise err
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('vimeo.com', 443), timeout = 20.0, source_address = None
socket_options = [(6, 1, 1)]

    def create_connection(
        address,
        timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
        source_address=None,
        socket_options=None,
    ):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host, port)``) and
        return the socket object.  Passing the optional *timeout* parameter will set
        the timeout on the socket instance before attempting to connect.  If no
        *timeout* is supplied, the global default timeout setting returned by
        :func:`socket.getdefaulttimeout` is used.  If *source_address* is set it must
        be a tuple of (host, port) for the socket to bind as a source address before
        making the connection.  An host of '' or port 0 tells the OS to use the
        default.
        """

        host, port = address
        if host.startswith("["):
            host = host.strip("[]")
        err = None

        # Using the value from allowed_gai_family() in the context of getaddrinfo lets
        # us select whether to work with IPv4 DNS records, IPv6 records, or both.
        # The original create_connection function always returns all records.
        family = allowed_gai_family()

        try:
            host.encode("idna")
        except UnicodeError:
            return six.raise_from(
                LocationParseError(u"'%s', label empty or too long" % host), None
            )

        for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket.socket(af, socktype, proto)

                # If provided, set socket level options before connecting.
                _set_socket_options(sock, socket_options)

                if timeout is not socket._GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               OSError: [Errno 101] Network unreachable

/usr/lib/python3.12/site-packages/urllib3/util/connection.py:85: OSError

During handling of the above exception, another exception occurred:

self =
request =

    def _send(self, request):
        headers = self._merge_headers(request.headers)
        add_accept_encoding_header(headers, SUPPORTED_ENCODINGS)

        max_redirects_exceeded = False

        session = self._get_instance(cookiejar=self._get_cookiejar(request))

        try:
>           requests_res = session.request(
                method=request.method,
                url=request.url,
                data=request.data,
                headers=headers,
                timeout=self._calculate_timeout(request),
                proxies=self._get_proxies(request),
                allow_redirects=True,
                stream=True,
            )

/usr/lib/python3.12/site-packages/yt_dlp/networking/_requests.py:324:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.12/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.12/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.12/site-packages/urllib3/connectionpool.py:799: in urlopen
    retries = retries.increment(
/usr/lib/python3.12/site-packages/urllib3/util/retry.py:525: in increment
    raise six.reraise(type(error), error, _stacktrace)
/usr/lib/python3.12/site-packages/urllib3/packages/six.py:770: in reraise
    raise value
/usr/lib/python3.12/site-packages/urllib3/connectionpool.py:715: in urlopen
    httplib_response = self._make_request(
/usr/lib/python3.12/site-packages/urllib3/connectionpool.py:404: in _make_request
    self._validate_conn(conn)
/usr/lib/python3.12/site-packages/urllib3/connectionpool.py:1058: in _validate_conn
    conn.connect()
/usr/lib/python3.12/site-packages/urllib3/connection.py:363: in connect
    self.sock = conn = self._new_conn()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    def _new_conn(self):
        """Establish a socket connection and set nodelay settings on it.

        :return: New socket connection.
        """
        extra_kw = {}
        if self.source_address:
            extra_kw["source_address"] = self.source_address

        if self.socket_options:
            extra_kw["socket_options"] = self.socket_options

        try:
            conn = connection.create_connection(
                (self._dns_host, self.port), self.timeout, **extra_kw
            )

        except SocketTimeout:
            raise ConnectTimeoutError(
                self,
                "Connection to %s timed out. (connect timeout=%s)"
                % (self.host, self.timeout),
            )

        except SocketError as e:
>           raise NewConnectionError(
                self, "Failed to establish a new connection: %s" % e
            )
E           urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 101] Network unreachable

/usr/lib/python3.12/site-packages/urllib3/connection.py:186: NewConnectionError

The above exception was the direct cause of the following exception:

self =
url_or_request = 'https://vimeo.com/619427082', video_id = '619427082'
note = None, errnote = 'Unable to download webpage', fatal = True, data = None
headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/92.0.4515.159 .../*;q=0.8', 'Accept-Language': 'en-us,en;q=0.5', 'Sec-Fetch-Mode': 'navigate', 'Referer': 'https://vimeo.com/619427082'}
query = {}, expected_status = None
impersonate = ImpersonateTarget(client=None, version=None, os=None, os_version=None)
require_impersonation = False

    def _request_webpage(self, url_or_request, video_id, note=None, errnote=None,
                         fatal=True, data=None, headers=None, query=None,
                         expected_status=None, impersonate=None,
                         require_impersonation=False):
        """
        Return the response handle.

        See _download_webpage docstring for arguments specification.
        """
        if not self._downloader._first_webpage_request:
            sleep_interval = self.get_param('sleep_interval_requests') or 0
            if sleep_interval > 0:
                self.to_screen(f'Sleeping {sleep_interval} seconds ...')
                time.sleep(sleep_interval)
        else:
            self._downloader._first_webpage_request = False

        if note is None:
            self.report_download_webpage(video_id)
        elif note is not False:
            if video_id is None:
                self.to_screen(str(note))
            else:
                self.to_screen(f'{video_id}: {note}')

        # Some sites check X-Forwarded-For HTTP header in order to figure out
        # the origin of the client behind proxy. This allows bypassing geo
        # restriction by faking this header's value to IP that belongs to some
        # geo unrestricted country. We will do so once we encounter any
        # geo restriction error.
        if self._x_forwarded_for_ip:
            headers = (headers or {}).copy()
            headers.setdefault('X-Forwarded-For', self._x_forwarded_for_ip)

        extensions = {}

        if impersonate in (True, ''):
            impersonate = ImpersonateTarget()
        requested_targets = [
            t if isinstance(t, ImpersonateTarget) else ImpersonateTarget.from_str(t)
            for t in variadic(impersonate)
        ] if impersonate else []
        available_target = next(filter(self._downloader._impersonate_target_available, requested_targets), None)
        if available_target:
            extensions['impersonate'] = available_target
        elif requested_targets:
            message = 'The extractor is attempting impersonation, but '
            message += (
                'no impersonate target is available' if not str(impersonate)
                else f'none of these impersonate targets are available: "{", ".join(map(str, requested_targets))}"')
            info_msg = ('see https://github.com/yt-dlp/yt-dlp#impersonation '
                        'for information on installing the required dependencies')
            if require_impersonation:
                raise ExtractorError(f'{message}; {info_msg}', expected=True)
            self.report_warning(f'{message}; if you encounter errors, then {info_msg}', only_once=True)

        try:
>           return self._downloader.urlopen(self._create_request(url_or_request, data, headers, query, extensions))

/usr/lib/python3.12/site-packages/yt_dlp/extractor/common.py:896:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/site-packages/yt_dlp/YoutubeDL.py:4160: in urlopen
    return self._request_director.send(req)
/usr/lib/python3.12/site-packages/yt_dlp/networking/common.py:117: in send
    response = handler.send(request)
/usr/lib/python3.12/site-packages/yt_dlp/networking/_helper.py:208: in wrapper
    return func(self, *args, **kwargs)
/usr/lib/python3.12/site-packages/yt_dlp/networking/common.py:337: in send
    return self._send(request)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
request =

    def _send(self, request):
        headers = self._merge_headers(request.headers)
        add_accept_encoding_header(headers, SUPPORTED_ENCODINGS)

        max_redirects_exceeded = False

        session = self._get_instance(cookiejar=self._get_cookiejar(request))

        try:
            requests_res = session.request(
                method=request.method,
                url=request.url,
                data=request.data,
                headers=headers,
                timeout=self._calculate_timeout(request),
                proxies=self._get_proxies(request),
                allow_redirects=True,
                stream=True,
            )

        except requests.exceptions.TooManyRedirects as e:
            max_redirects_exceeded = True
            requests_res = e.response

        except requests.exceptions.SSLError as e:
            if 'CERTIFICATE_VERIFY_FAILED' in str(e):
                raise CertificateVerifyError(cause=e) from e
            raise SSLError(cause=e) from e

        except requests.exceptions.ProxyError as e:
            raise ProxyError(cause=e) from e

        except (requests.exceptions.ConnectionError, requests.exceptions.Timeout) as e:
            raise TransportError(cause=e) from e

        except urllib3.exceptions.HTTPError as e:
            # Catch any urllib3 exceptions that may leak through
>           raise TransportError(cause=e) from e
E           yt_dlp.networking.exceptions.TransportError: : Failed to establish a new connection: [Errno 101] Network unreachable

/usr/lib/python3.12/site-packages/yt_dlp/networking/_requests.py:352: TransportError

During handling of the above exception, another exception occurred:

self =
args = ('https://vimeo.com/619427082', , True, {}, True)
kwargs = {}

    @functools.wraps(func)
    def wrapper(self, *args, **kwargs):
        while True:
            try:
>               return func(self, *args, **kwargs)

/usr/lib/python3.12/site-packages/yt_dlp/YoutubeDL.py:1622:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/site-packages/yt_dlp/YoutubeDL.py:1757: in __extract_info
    ie_result = ie.extract(url)
/usr/lib/python3.12/site-packages/yt_dlp/extractor/common.py:740: in extract
    ie_result = self._real_extract(url)
/usr/lib/python3.12/site-packages/yt_dlp/extractor/vimeo.py:844: in _real_extract
    webpage, urlh = self._download_webpage_handle(
/usr/lib/python3.12/site-packages/yt_dlp/extractor/common.py:960: in _download_webpage_handle
    urlh = self._request_webpage(url_or_request, video_id, note, errnote, fatal, data=data,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
url_or_request = 'https://vimeo.com/619427082', video_id = '619427082'
note = None, errnote = 'Unable to download webpage', fatal = True, data = None
headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/92.0.4515.159 .../*;q=0.8', 'Accept-Language': 'en-us,en;q=0.5', 'Sec-Fetch-Mode': 'navigate', 'Referer': 'https://vimeo.com/619427082'}
query = {}, expected_status = None
impersonate = ImpersonateTarget(client=None, version=None, os=None, os_version=None)
require_impersonation = False

    def _request_webpage(self, url_or_request, video_id, note=None, errnote=None,
                         fatal=True, data=None, headers=None, query=None,
                         expected_status=None, impersonate=None,
                         require_impersonation=False):
        """
        Return the response handle.

        See _download_webpage docstring for arguments specification.
        """
        if not self._downloader._first_webpage_request:
            sleep_interval = self.get_param('sleep_interval_requests') or 0
            if sleep_interval > 0:
                self.to_screen(f'Sleeping {sleep_interval} seconds ...')
                time.sleep(sleep_interval)
        else:
            self._downloader._first_webpage_request = False

        if note is None:
            self.report_download_webpage(video_id)
        elif note is not False:
            if video_id is None:
                self.to_screen(str(note))
            else:
                self.to_screen(f'{video_id}: {note}')

        # Some sites check X-Forwarded-For HTTP header in order to figure out
        # the origin of the client behind proxy. This allows bypassing geo
        # restriction by faking this header's value to IP that belongs to some
        # geo unrestricted country. We will do so once we encounter any
        # geo restriction error.
if self._x_forwarded_for_ip: headers = (headers or {}).copy() headers.setdefault('X-Forwarded-For', self._x_forwarded_for_ip) extensions = {} if impersonate in (True, ''): impersonate = ImpersonateTarget() requested_targets = [ t if isinstance(t, ImpersonateTarget) else ImpersonateTarget.from_str(t) for t in variadic(impersonate) ] if impersonate else [] available_target = next(filter(self._downloader._impersonate_target_available, requested_targets), None) if available_target: extensions['impersonate'] = available_target elif requested_targets: message = 'The extractor is attempting impersonation, but ' message += ( 'no impersonate target is available' if not str(impersonate) else f'none of these impersonate targets are available: "{", ".join(map(str, requested_targets))}"') info_msg = ('see https://github.com/yt-dlp/yt-dlp#impersonation ' 'for information on installing the required dependencies') if require_impersonation: raise ExtractorError(f'{message}; {info_msg}', expected=True) self.report_warning(f'{message}; if you encounter errors, then {info_msg}', only_once=True) try: return self._downloader.urlopen(self._create_request(url_or_request, data, headers, query, extensions)) except network_exceptions as err: if isinstance(err, HTTPError): if self.__can_accept_status_code(err, expected_status): return err.response if errnote is False: return False if errnote is None: errnote = 'Unable to download webpage' errmsg = f'{errnote}: {err}' if fatal: > raise ExtractorError(errmsg, cause=err) E yt_dlp.utils.ExtractorError: [vimeo] 619427082: Unable to download webpage: : Failed to establish a new connection: [Errno 101] Network unreachable (caused by TransportError(': Failed to establish a new connection: [Errno 101] Network unreachable')) /usr/lib/python3.12/site-packages/yt_dlp/extractor/common.py:909: ExtractorError During handling of the above exception, another exception occurred: url = 'https://vimeo.com/619427082', video_id = '619427082' tmp_path = PosixPath('/tmp/pytest-of-buildozer/pytest-62/test_youtube_download_serial_h1') @pytest.mark.slow @pytest.mark.parametrize( "url,video_id", [ ("https://vimeo.com/619427082", "619427082"), ("https://vimeo.com/619427082", "619427082"), ], ) def test_youtube_download_serial(url, video_id, tmp_path): yt_downloader = YoutubeDownloader(threads=1) options = BestMp4.get_options( target_dir=tmp_path, filepath=pathlib.Path("%(id)s/video.%(ext)s"), ) > yt_downloader.download(url, options) tests/download/test_download.py:146: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ .testenv/lib/python3.12/site-packages/zimscraperlib/download.py:63: in download raise future.exception() /usr/lib/python3.12/concurrent/futures/thread.py:58: in run result = self.fn(*self.args, **self.kwargs) .testenv/lib/python3.12/site-packages/zimscraperlib/download.py:40: in _run_youtube_dl ydl.download([url]) /usr/lib/python3.12/site-packages/yt_dlp/YoutubeDL.py:3602: in download self.__download_wrapper(self.extract_info)( /usr/lib/python3.12/site-packages/yt_dlp/YoutubeDL.py:3577: in wrapper res = func(*args, **kwargs) /usr/lib/python3.12/site-packages/yt_dlp/YoutubeDL.py:1611: in extract_info return self.__extract_info(url, self.get_info_extractor(key), download, extra_info, process) /usr/lib/python3.12/site-packages/yt_dlp/YoutubeDL.py:1640: in wrapper self.report_error(str(e), e.format_traceback()) /usr/lib/python3.12/site-packages/yt_dlp/YoutubeDL.py:1088: in report_error self.trouble(f'{self._format_err("ERROR:", self.Styles.ERROR)} 
{message}', *args, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = message = "ERROR: [vimeo] 619427082: Unable to download webpage: : ...on.HTTPSConnection object at 0x7fffeb103200>: Failed to establish a new connection: [Errno 101] Network unreachable'))" tb = ' File "/usr/lib/python3.12/site-packages/yt_dlp/extractor/common.py", line 740, in extract\n ie_result = self._re...ion.HTTPSConnection object at 0x7fffeb103200>: Failed to establish a new connection: [Errno 101] Network unreachable\n' is_error = True def trouble(self, message=None, tb=None, is_error=True): """Determine action to take when a download problem appears. Depending on if the downloader has been configured to ignore download errors or not, this method may throw an exception or not when errors are found, after printing the message. @param tb If given, is additional traceback information @param is_error Whether to raise error according to ignorerrors """ if message is not None: self.to_stderr(message) if self.params.get('verbose'): if tb is None: if sys.exc_info()[0]: # if .trouble has been called from an except block tb = '' if hasattr(sys.exc_info()[1], 'exc_info') and sys.exc_info()[1].exc_info[0]: tb += ''.join(traceback.format_exception(*sys.exc_info()[1].exc_info)) tb += encode_compat_str(traceback.format_exc()) else: tb_data = traceback.format_list(traceback.extract_stack()) tb = ''.join(tb_data) if tb: self.to_stderr(tb) if not is_error: return if not self.params.get('ignoreerrors'): if sys.exc_info()[0] and hasattr(sys.exc_info()[1], 'exc_info') and sys.exc_info()[1].exc_info[0]: exc_info = sys.exc_info()[1].exc_info else: exc_info = sys.exc_info() > raise DownloadError(message, exc_info) E yt_dlp.utils.DownloadError: ERROR: [vimeo] 619427082: Unable to download webpage: : Failed to establish a new connection: [Errno 101] Network unreachable (caused by TransportError(': Failed to establish a new connection: [Errno 101] Network unreachable')) /usr/lib/python3.12/site-packages/yt_dlp/YoutubeDL.py:1027: DownloadError ----------------------------- Captured stdout call ----------------------------- [vimeo] Extracting URL: https://vimeo.com/619427082 [vimeo] 619427082: Downloading webpage ----------------------------- Captured stderr call ----------------------------- WARNING: [vimeo] The extractor is attempting impersonation, but no impersonate target is available; if you encounter errors, then see https://github.com/yt-dlp/yt-dlp#impersonation for information on installing the required dependencies ERROR: [vimeo] 619427082: Unable to download webpage: : Failed to establish a new connection: [Errno 101] Network unreachable (caused by TransportError(': Failed to establish a new connection: [Errno 101] Network unreachable')) _________________________ test_youtube_download_nowait _________________________ tmp_path = PosixPath('/tmp/pytest-of-buildozer/pytest-62/test_youtube_download_nowait0') @pytest.mark.slow def test_youtube_download_nowait(tmp_path): with YoutubeDownloader(threads=1) as yt_downloader: future = yt_downloader.download( "https://vimeo.com/619427082", BestMp4.get_options(target_dir=tmp_path), wait=False, ) assert future.running() assert not yt_downloader.executor._shutdown done, not_done = concurrent.futures.wait( [future], return_when=concurrent.futures.ALL_COMPLETED ) > assert future.exception() is None E assert DownloadError("ERROR: [vimeo] 619427082: Unable to download webpage: : Failed to establish a new connection: [Errno 101] Network unreachable'))") 
_________________________ test_youtube_download_nowait _________________________

tmp_path = PosixPath('/tmp/pytest-of-buildozer/pytest-62/test_youtube_download_nowait0')

    @pytest.mark.slow
    def test_youtube_download_nowait(tmp_path):
        with YoutubeDownloader(threads=1) as yt_downloader:
            future = yt_downloader.download(
                "https://vimeo.com/619427082",
                BestMp4.get_options(target_dir=tmp_path),
                wait=False,
            )
            assert future.running()
            assert not yt_downloader.executor._shutdown
            done, not_done = concurrent.futures.wait(
                [future], return_when=concurrent.futures.ALL_COMPLETED
            )
>           assert future.exception() is None
E           assert DownloadError("ERROR: [vimeo] 619427082: Unable to download webpage: : Failed to establish a new connection: [Errno 101] Network unreachable'))") is None
E            +  where DownloadError("ERROR: [vimeo] 619427082: Unable to download webpage: : Failed to establish a new connection: [Errno 101] Network unreachable'))") = >()
E            +  where > = .exception

tests/download/test_download.py:164: AssertionError
----------------------------- Captured stdout call -----------------------------
[vimeo] Extracting URL: https://vimeo.com/619427082
[vimeo] 619427082: Downloading webpage
----------------------------- Captured stderr call -----------------------------
WARNING: [vimeo] The extractor is attempting impersonation, but no impersonate target is available; if you encounter errors, then see https://github.com/yt-dlp/yt-dlp#impersonation for information on installing the required dependencies
ERROR: [vimeo] 619427082: Unable to download webpage: : Failed to establish a new connection: [Errno 101] Network unreachable (caused by TransportError(': Failed to establish a new connection: [Errno 101] Network unreachable'))
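The test above exercises the non-blocking path of zimscraperlib's downloader: with wait=False, download() hands back a concurrent.futures.Future instead of blocking or raising. A minimal sketch of that usage, reconstructed from the test source shown in the traceback (the import path and option classes are taken from the log, not verified against zimscraperlib's documentation):

    import concurrent.futures
    import pathlib

    from zimscraperlib.download import BestMp4, YoutubeDownloader  # import path assumed

    with YoutubeDownloader(threads=1) as yt_downloader:
        # wait=False returns a Future; the actual download runs on the executor thread
        future = yt_downloader.download(
            "https://vimeo.com/619427082",
            BestMp4.get_options(target_dir=pathlib.Path("/tmp/out")),
            wait=False,
        )
        concurrent.futures.wait([future], return_when=concurrent.futures.ALL_COMPLETED)
        if future.exception() is not None:
            raise future.exception()  # e.g. DownloadError when the network is unreachable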
_____________________ test_youtube_download_contextmanager _____________________

(socket -> NewConnectionError -> TransportError chain identical to test_youtube_download_serial above)
        try:
            return self._downloader.urlopen(self._create_request(url_or_request, data, headers, query, extensions))
        except network_exceptions as err:
            if isinstance(err, HTTPError):
                if self.__can_accept_status_code(err, expected_status):
                    return err.response
            if errnote is False:
                return False
            if errnote is None:
                errnote = 'Unable to download webpage'
            errmsg = f'{errnote}: {err}'
            if fatal:
>               raise ExtractorError(errmsg, cause=err)
E               yt_dlp.utils.ExtractorError: [vimeo] 619427082: Unable to download webpage: : Failed to establish a new connection: [Errno 101] Network unreachable (caused by TransportError(': Failed to establish a new connection: [Errno 101] Network unreachable'))

/usr/lib/python3.12/site-packages/yt_dlp/extractor/common.py:909: ExtractorError

During handling of the above exception, another exception occurred:

tmp_path = PosixPath('/tmp/pytest-of-buildozer/pytest-62/test_youtube_download_contextm0')

    @pytest.mark.slow
    def test_youtube_download_contextmanager(tmp_path):
        with YoutubeDownloader(threads=1) as yt_downloader:
>           yt_downloader.download(
                "https://vimeo.com/619427082", BestWebm.get_options(target_dir=tmp_path)
            )

tests/download/test_download.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
.testenv/lib/python3.12/site-packages/zimscraperlib/download.py:63: in download
    raise future.exception()
/usr/lib/python3.12/concurrent/futures/thread.py:58: in run
    result = self.fn(*self.args, **self.kwargs)
.testenv/lib/python3.12/site-packages/zimscraperlib/download.py:40: in _run_youtube_dl
    ydl.download([url])
/usr/lib/python3.12/site-packages/yt_dlp/YoutubeDL.py:3602: in download
    self.__download_wrapper(self.extract_info)(
/usr/lib/python3.12/site-packages/yt_dlp/YoutubeDL.py:3577: in wrapper
    res = func(*args, **kwargs)
/usr/lib/python3.12/site-packages/yt_dlp/YoutubeDL.py:1611: in extract_info
    return self.__extract_info(url, self.get_info_extractor(key), download, extra_info, process)
/usr/lib/python3.12/site-packages/yt_dlp/YoutubeDL.py:1640: in wrapper
    self.report_error(str(e), e.format_traceback())
/usr/lib/python3.12/site-packages/yt_dlp/YoutubeDL.py:1088: in report_error
    self.trouble(f'{self._format_err("ERROR:", self.Styles.ERROR)} {message}', *args, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
message = "ERROR: [vimeo] 619427082: Unable to download webpage: : ...on.HTTPSConnection object at 0x7fffeb107ec0>: Failed to establish a new connection: [Errno 101] Network unreachable'))"
tb = ' File "/usr/lib/python3.12/site-packages/yt_dlp/extractor/common.py", line 740, in extract\n    ie_result = self._re...ion.HTTPSConnection object at 0x7fffeb107ec0>: Failed to establish a new connection: [Errno 101] Network unreachable\n'
is_error = True

(same trouble frame as in test_youtube_download_serial above:)
>           raise DownloadError(message, exc_info)
E           yt_dlp.utils.DownloadError: ERROR: [vimeo] 619427082: Unable to download webpage: : Failed to establish a new connection: [Errno 101] Network unreachable (caused by TransportError(': Failed to establish a new connection: [Errno 101] Network unreachable'))

/usr/lib/python3.12/site-packages/yt_dlp/YoutubeDL.py:1027: DownloadError
----------------------------- Captured stdout call -----------------------------
[vimeo] Extracting URL: https://vimeo.com/619427082
[vimeo] 619427082: Downloading webpage
----------------------------- Captured stderr call -----------------------------
WARNING: [vimeo] The extractor is attempting impersonation, but no impersonate target is available; if you encounter errors, then see https://github.com/yt-dlp/yt-dlp#impersonation for information on installing the required dependencies
ERROR: [vimeo] 619427082: Unable to download webpage: : Failed to establish a new connection: [Errno 101] Network unreachable (caused by TransportError(': Failed to establish a new connection: [Errno 101] Network unreachable'))
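All three yt-dlp failures above are the same environmental problem: the build sandbox has no outbound network route ([Errno 101]), so the very first HTTPS connection to vimeo.com fails. One way to keep such a suite green offline is to probe connectivity once and skip the network-marked tests; a hypothetical conftest.py sketch (not part of zimscraperlib):

    import socket

    import pytest

    def _network_available(host="vimeo.com", port=443, timeout=5):
        try:
            socket.create_connection((host, port), timeout=timeout).close()
            return True
        except OSError:  # includes [Errno 101] Network unreachable and timeouts
            return False

    def pytest_collection_modifyitems(config, items):
        if _network_available():
            return
        skip_net = pytest.mark.skip(reason="no outbound network in build environment")
        for item in items:
            if "slow" in item.keywords:  # the failing tests all carry @pytest.mark.slow
                item.add_marker(skip_net)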
_________________________________ test_search __________________________________

real_zim_file = local('/tmp/pytest-of-buildozer/pytest-62/data1/small.zim')

    @pytest.mark.slow
    def test_search(real_zim_file):
>       with Archive(real_zim_file) as zim:

tests/zim/test_archive.py:60:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>   ???
E   RuntimeError: Cluster pointer table outside (or not fully inside) ZIM file.

libzim/libzim.pyx:815: RuntimeError
________________________________ test_get_tags _________________________________

small_zim_file = local('/tmp/pytest-of-buildozer/pytest-62/data0/small.zim')
real_zim_file = local('/tmp/pytest-of-buildozer/pytest-62/data1/small.zim')

    def test_get_tags(small_zim_file, real_zim_file):
        with Archive(small_zim_file) as zim:
            assert zim.get_tags() == ["_ftindex:no"]
            assert zim.get_tags(libkiwix=True) == [
                "_ftindex:no",
                "_pictures:yes",
                "_videos:yes",
                "_details:yes",
            ]
            assert zim.tags == zim.get_tags()

>       with Archive(real_zim_file) as zim:

tests/zim/test_archive.py:81:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>   ???
E   RuntimeError: Cluster pointer table outside (or not fully inside) ZIM file.

libzim/libzim.pyx:815: RuntimeError
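This RuntimeError is libzim's signature for a truncated or corrupt archive: the cluster pointer table recorded in the header points past the end of the file. Given the network failures elsewhere in this run, the real_zim_file fixture was most likely only partially downloaded. A small guard that turns this into a clearer message (hypothetical helper, assuming the py3-libzim reader binding):

    import pathlib

    from libzim.reader import Archive  # py3-libzim binding

    def open_zim(path: pathlib.Path) -> Archive:
        try:
            return Archive(str(path))
        except RuntimeError as exc:  # libzim reports structural damage as RuntimeError
            raise RuntimeError(f"{path} appears truncated or corrupt: {exc}") from exc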
""" if input is not None: if kwargs.get('stdin') is not None: raise ValueError('stdin and input arguments may not both be used.') kwargs['stdin'] = PIPE if capture_output: if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None: raise ValueError('stdout and stderr arguments may not be used ' 'with capture_output.') kwargs['stdout'] = PIPE kwargs['stderr'] = PIPE with Popen(*popenargs, **kwargs) as process: try: stdout, stderr = process.communicate(input, timeout=timeout) except TimeoutExpired as exc: process.kill() if _mswindows: # Windows accumulates the output in a single blocking # read() call run on child threads, with the timeout # being done in a join() on those threads. communicate() # _after_ kill() is required to collect that and add it # to the exception. exc.stdout, exc.stderr = process.communicate() else: # POSIX _communicate already populated the output so # far into the TimeoutExpired exception. process.wait() raise except: # Including KeyboardInterrupt, communicate handled that. process.kill() # We don't call process.wait() as .__exit__ does that for us. raise retcode = process.poll() if check and retcode: > raise CalledProcessError(retcode, process.args, output=stdout, stderr=stderr) E subprocess.CalledProcessError: Command '['/usr/bin/env', 'wget', '-t', '5', '--retry-connrefused', '--random-wait', '-O', '/tmp/pytest-of-buildozer/pytest-62/test_urlitem_html0/file.html', '-c', 'https://en.wikipedia.org/wiki/Main_Page']' returned non-zero exit status 4. /usr/lib/python3.12/subprocess.py:571: CalledProcessError ----------------------------- Captured stderr call ----------------------------- --2024-07-11 13:38:09-- https://en.wikipedia.org/wiki/Main_Page Resolving en.wikipedia.org (en.wikipedia.org)... 199.59.148.229, 2001::1 Connecting to en.wikipedia.org (en.wikipedia.org)|199.59.148.229|:443... failed: Operation timed out. Connecting to en.wikipedia.org (en.wikipedia.org)|2001::1|:443... failed: Network unreachable. Retrying. --2024-07-11 13:40:25-- (try: 2) https://en.wikipedia.org/wiki/Main_Page Connecting to en.wikipedia.org (en.wikipedia.org)|199.59.148.229|:443... failed: Operation timed out. Connecting to en.wikipedia.org (en.wikipedia.org)|2001::1|:443... failed: Network unreachable. Retrying. --2024-07-11 13:42:41-- (try: 3) https://en.wikipedia.org/wiki/Main_Page Connecting to en.wikipedia.org (en.wikipedia.org)|199.59.148.229|:443... failed: Operation timed out. Connecting to en.wikipedia.org (en.wikipedia.org)|2001::1|:443... failed: Network unreachable. Retrying. --2024-07-11 13:44:59-- (try: 4) https://en.wikipedia.org/wiki/Main_Page Connecting to en.wikipedia.org (en.wikipedia.org)|199.59.148.229|:443... failed: Operation timed out. Connecting to en.wikipedia.org (en.wikipedia.org)|2001::1|:443... failed: Network unreachable. Retrying. --2024-07-11 13:47:18-- (try: 5) https://en.wikipedia.org/wiki/Main_Page Connecting to en.wikipedia.org (en.wikipedia.org)|199.59.148.229|:443... failed: Operation timed out. Connecting to en.wikipedia.org (en.wikipedia.org)|2001::1|:443... failed: Network unreachable. Giving up. 
_____________________________ test_urlitem_binary ______________________________

tmp_path = PosixPath('/tmp/pytest-of-buildozer/pytest-62/test_urlitem_binary0')
png_image_url = 'https://commons.wikimedia.org/static/images/project-logos/commonswiki.png'

    def test_urlitem_binary(tmp_path, png_image_url):
        file_path = tmp_path / "file.png"
>       save_large_file(png_image_url, file_path)

tests/zim/test_zim_creator.py:297:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
.testenv/lib/python3.12/site-packages/zimscraperlib/download.py:124: in save_large_file
    subprocess.run(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

input = None, capture_output = False, timeout = None, check = True
popenargs = (['/usr/bin/env', 'wget', '-t', '5', '--retry-connrefused', '--random-wait', ...],)
kwargs = {}
process =
stdout = None, stderr = None, retcode = 4

(same subprocess.run frame as in test_urlitem_html above:)
>           raise CalledProcessError(retcode, process.args,
                                     output=stdout, stderr=stderr)
E           subprocess.CalledProcessError: Command '['/usr/bin/env', 'wget', '-t', '5', '--retry-connrefused', '--random-wait', '-O', '/tmp/pytest-of-buildozer/pytest-62/test_urlitem_binary0/file.png', '-c', 'https://commons.wikimedia.org/static/images/project-logos/commonswiki.png']' returned non-zero exit status 4.

/usr/lib/python3.12/subprocess.py:571: CalledProcessError
----------------------------- Captured stderr call -----------------------------
--2024-07-11 13:49:36--  https://commons.wikimedia.org/static/images/project-logos/commonswiki.png
Resolving commons.wikimedia.org (commons.wikimedia.org)... 198.35.26.96, 2620:0:863:ed1a::1
Connecting to commons.wikimedia.org (commons.wikimedia.org)|198.35.26.96|:443... connected.
OpenSSL: error:0A000126:SSL routines::unexpected eof while reading
Unable to establish SSL connection.
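Unlike the [Errno 101] failures, this connection was established but the TLS handshake was cut off ("unexpected eof while reading"), which points at a middlebox or firewall dropping the session mid-handshake. The failure can be reproduced outside the suite with a few lines of stdlib ssl (a diagnostic sketch, not part of the tests):

    import socket
    import ssl

    ctx = ssl.create_default_context()
    with socket.create_connection(("commons.wikimedia.org", 443), timeout=20) as raw:
        # raises ssl.SSLEOFError if the peer closes mid-handshake, as in the test above
        with ctx.wrap_socket(raw, server_hostname="commons.wikimedia.org") as tls:
            print(tls.version())  # e.g. 'TLSv1.3' on success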
_________________________ test_filelikeprovider_nosize _________________________

self =
method = 'GET', url = '/static/images/project-logos/commonswiki.png'
body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/static/images/project-logos/commonswiki.png', query=None, fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False

    def urlopen(
        self, method, url, body=None, headers=None, retries=None, redirect=True,
        assert_same_host=True, timeout=_Default, pool_timeout=None,
        release_conn=None, chunked=False, body_pos=None, **response_kw
    ):
        """
        Get a connection from the pool and perform an HTTP request. This is the
        lowest level call for making a request, so you'll need to specify all
        the raw details.

        .. note:: More commonly, it's appropriate to use a convenience method
           provided by :class:`.RequestMethods`, such as :meth:`request`.

        .. note:: `release_conn` will only behave as expected if
           `preload_content=False` because we want to make
           `preload_content=False` the default behaviour someday soon without
           breaking backwards compatibility.

        :param method: HTTP request method (such as GET, POST, PUT, etc.)
        :param url: The URL to perform the request on.
        :param body: Data to send in the request body, either :class:`str`,
            :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a
            file-like object.
        :param headers: Dictionary of custom headers to send, such as
            User-Agent, If-None-Match, etc. If None, pool headers are used. If
            provided, these headers completely replace any pool-specific headers.
        :param retries: Configure the number of retries to allow before raising
            a :class:`~urllib3.exceptions.MaxRetryError` exception.
            Pass ``None`` to retry until you receive a response. Pass a
            :class:`~urllib3.util.retry.Retry` object for fine-grained control
            over different types of retries. Pass an integer number to retry
            connection errors that many times, but no other types of errors.
            Pass zero to never retry. If ``False``, then retries are disabled
            and any exception is raised immediately. Also, instead of raising a
            MaxRetryError on redirects, the redirect response will be returned.
        :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
        :param redirect: If True, automatically handle redirects (status codes
            301, 302, 303, 307, 308). Each redirect counts as a retry.
            Disabling retries will disable redirect, too.
        :param assert_same_host: If ``True``, will make sure that the host of
            the pool requests is consistent else will raise HostChangedError.
            When ``False``, you can use the pool on an HTTP proxy and request
            foreign hosts.
        :param timeout: If specified, overrides the default timeout for this
            one request. It may be a float (in seconds) or an instance of
            :class:`urllib3.util.Timeout`.
        :param pool_timeout: If set and the pool is set to block=True, then
            this method will block for ``pool_timeout`` seconds and raise
            EmptyPoolError if no connection is available within the time period.
        :param release_conn: If False, then the urlopen call will not release
            the connection back into the pool once a response is received (but
            will release if you read the entire contents of the response such
            as when `preload_content=True`). This is useful if you're not
            preloading the response's content immediately. You will need to
            call ``r.release_conn()`` on the response ``r`` to return the
            connection back into the pool. If None, it takes the value of
            ``response_kw.get('preload_content', True)``.
        :param chunked: If True, urllib3 will send the body using chunked
            transfer encoding. Otherwise, urllib3 will send the body using the
            standard content-length form. Defaults to False.
        :param int body_pos: Position to seek to in file-like body in the event
            of a retry or redirect. Typically this won't need to be set because
            urllib3 will auto-populate the value when needed.
        :param \\**response_kw: Additional parameters are passed to
            :meth:`urllib3.response.HTTPResponse.from_httplib`
        """

        parsed_url = parse_url(url)
        destination_scheme = parsed_url.scheme

        if headers is None:
            headers = self.headers

        if not isinstance(retries, Retry):
            retries = Retry.from_int(retries, redirect=redirect, default=self.retries)

        if release_conn is None:
            release_conn = response_kw.get("preload_content", True)

        # Check host
        if assert_same_host and not self.is_same_host(url):
            raise HostChangedError(self, url, retries)

        # Ensure that the URL we're connecting to is properly encoded
        if url.startswith("/"):
            url = six.ensure_str(_encode_target(url))
        else:
            url = six.ensure_str(parsed_url.url)

        conn = None

        # Track whether `conn` needs to be released before
        # returning/raising/recursing. Update this variable if necessary, and
        # leave `release_conn` constant throughout the function. That way, if
        # the function recurses, the original value of `release_conn` will be
        # passed down into the recursive call, and its value will be respected.
        #
        # See issue #651 [1] for details.
        #
        # [1]
        release_this_conn = release_conn

        http_tunnel_required = connection_requires_http_tunnel(
            self.proxy, self.proxy_config, destination_scheme
        )

        # Merge the proxy headers. Only done when not using HTTP CONNECT. We
        # have to copy the headers dict so we can safely change it without those
        # changes being reflected in anyone else's copy.
        if not http_tunnel_required:
            headers = headers.copy()
            headers.update(self.proxy_headers)

        # Must keep the exception bound to a separate variable or else Python 3
        # complains about UnboundLocalError.
        err = None

        # Keep track of whether we cleanly exited the except block. This
        # ensures we do proper cleanup in finally.
        clean_exit = False

        # Rewind body position, if needed. Record current position
        # for future rewinds in the event of a redirect/retry.
        body_pos = set_file_position(body, body_pos)

        try:
            # Request a connection from the queue.
            timeout_obj = self._get_timeout(timeout)
            conn = self._get_conn(timeout=pool_timeout)

            conn.timeout = timeout_obj.connect_timeout

            is_new_proxy_conn = self.proxy is not None and not getattr(
                conn, "sock", None
            )
            if is_new_proxy_conn and http_tunnel_required:
                self._prepare_proxy(conn)

            # Make the request on the httplib connection object.
>           httplib_response = self._make_request(
                conn,
                method,
                url,
                timeout=timeout_obj,
                body=body,
                headers=headers,
                chunked=chunked,
            )

/usr/lib/python3.12/site-packages/urllib3/connectionpool.py:715:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/site-packages/urllib3/connectionpool.py:404: in _make_request
    self._validate_conn(conn)
/usr/lib/python3.12/site-packages/urllib3/connectionpool.py:1058: in _validate_conn
    conn.connect()
/usr/lib/python3.12/site-packages/urllib3/connection.py:419: in connect
    self.sock = ssl_wrap_socket(
/usr/lib/python3.12/site-packages/urllib3/util/ssl_.py:449: in ssl_wrap_socket
    ssl_sock = _ssl_wrap_socket_impl(
/usr/lib/python3.12/site-packages/urllib3/util/ssl_.py:493: in _ssl_wrap_socket_impl
    return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
/usr/lib/python3.12/ssl.py:455: in wrap_socket
    return self.sslsocket_class._create(
/usr/lib/python3.12/ssl.py:1042: in _create
    self.do_handshake()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , block = False

    @_sslcopydoc
    def do_handshake(self, block=False):
        self._check_connected()
        timeout = self.gettimeout()
        try:
            if timeout == 0.0 and block:
                self.settimeout(None)
>           self._sslobj.do_handshake()
E           ssl.SSLEOFError: [SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1000)

/usr/lib/python3.12/ssl.py:1320: SSLEOFError

During handling of the above exception, another exception occurred:

self =
request = , stream = True
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()

    def send(
        self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
    ):
        """Sends PreparedRequest object. Returns Response object.

        :param request: The :class:`PreparedRequest ` being sent.
        :param stream: (optional) Whether to stream the request content.
        :param timeout: (optional) How long to wait for the server to send
            data before giving up, as a float, or a :ref:`(connect timeout,
            read timeout) ` tuple.
        :type timeout: float or tuple or urllib3 Timeout object
        :param verify: (optional) Either a boolean, in which case it controls
            whether we verify the server's TLS certificate, or a string, in
            which case it must be a path to a CA bundle to use
        :param cert: (optional) Any user-provided SSL certificate to be trusted.
        :param proxies: (optional) The proxies dictionary to apply to the request.
        :rtype: requests.Response
        """
        try:
            conn = self.get_connection_with_tls_context(
                request, verify, proxies=proxies, cert=cert
            )
        except LocationValueError as e:
            raise InvalidURL(e, request=request)

        self.cert_verify(conn, request.url, verify, cert)
        url = self.request_url(request, proxies)
        self.add_headers(
            request,
            stream=stream,
            timeout=timeout,
            verify=verify,
            cert=cert,
            proxies=proxies,
        )

        chunked = not (request.body is None or "Content-Length" in request.headers)

        if isinstance(timeout, tuple):
            try:
                connect, read = timeout
                timeout = TimeoutSauce(connect=connect, read=read)
            except ValueError:
                raise ValueError(
                    f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                    f"or a single float to set both timeouts to the same value."
                )
        elif isinstance(timeout, TimeoutSauce):
            pass
        else:
            timeout = TimeoutSauce(connect=timeout, read=timeout)

        try:
>           resp = conn.urlopen(
                method=request.method,
                url=url,
                body=request.body,
                headers=request.headers,
                redirect=False,
                assert_same_host=False,
                preload_content=False,
                decode_content=False,
                retries=self.max_retries,
                timeout=timeout,
                chunked=chunked,
            )

/usr/lib/python3.12/site-packages/requests/adapters.py:667:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/site-packages/urllib3/connectionpool.py:799: in urlopen
    retries = retries.increment(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
method = 'GET', url = '/static/images/project-logos/commonswiki.png'
response = None
error = SSLError(SSLEOFError(8, '[SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1000)'))
_pool =
_stacktrace =

    def increment(
        self, method=None, url=None, response=None, error=None,
        _pool=None, _stacktrace=None,
    ):
        """Return a new Retry object with incremented retry counters.

        :param response: A response object, or None, if the server did not
            return a response.
        :type response: :class:`~urllib3.response.HTTPResponse`
        :param Exception error: An error encountered during the request, or
            None if the response was received successfully.

        :return: A new ``Retry`` object.
        """
        if self.total is False and error:
            # Disabled, indicate to re-raise the error.
            raise six.reraise(type(error), error, _stacktrace)

        total = self.total
        if total is not None:
            total -= 1

        connect = self.connect
        read = self.read
        redirect = self.redirect
        status_count = self.status
        other = self.other
        cause = "unknown"
        status = None
        redirect_location = None

        if error and self._is_connection_error(error):
            # Connect retry?
            if connect is False:
                raise six.reraise(type(error), error, _stacktrace)
            elif connect is not None:
                connect -= 1

        elif error and self._is_read_error(error):
            # Read retry?
            if read is False or not self._is_method_retryable(method):
                raise six.reraise(type(error), error, _stacktrace)
            elif read is not None:
                read -= 1

        elif error:
            # Other retry?
            if other is not None:
                other -= 1

        elif response and response.get_redirect_location():
            # Redirect retry?
            if redirect is not None:
                redirect -= 1
            cause = "too many redirects"
            redirect_location = response.get_redirect_location()
            status = response.status

        else:
            # Incrementing because of a server error like a 500 in
            # status_forcelist and the given method is in the allowed_methods
            cause = ResponseError.GENERIC_ERROR
            if response and response.status:
                if status_count is not None:
                    status_count -= 1
                cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
                status = response.status

        history = self.history + (
            RequestHistory(method, url, error, status, redirect_location),
        )

        new_retry = self.new(
            total=total,
            connect=connect,
            read=read,
            redirect=redirect,
            status=status_count,
            other=other,
            history=history,
        )

        if new_retry.is_exhausted():
>           raise MaxRetryError(_pool, url, error or ResponseError(cause))
E           urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='commons.wikimedia.org', port=443): Max retries exceeded with url: /static/images/project-logos/commonswiki.png (Caused by SSLError(SSLEOFError(8, '[SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1000)')))

/usr/lib/python3.12/site-packages/urllib3/util/retry.py:592: MaxRetryError

During handling of the above exception, another exception occurred:

tmp_path = PosixPath('/tmp/pytest-of-buildozer/pytest-62/test_filelikeprovider_nosize0')
png_image_url = 'https://commons.wikimedia.org/static/images/project-logos/commonswiki.png'

    def test_filelikeprovider_nosize(tmp_path, png_image_url):
        fileobj = io.BytesIO()
>       stream_file(png_image_url, byte_stream=fileobj)

tests/zim/test_zim_creator.py:323:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
.testenv/lib/python3.12/site-packages/zimscraperlib/download.py:199: in stream_file
    resp = session.get(
/usr/lib/python3.12/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.12/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.12/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
request = , stream = True
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()

(same HTTPAdapter.send frame as above; the MaxRetryError is translated into a requests exception:)

        except (ProtocolError, OSError) as err:
            raise ConnectionError(err, request=request)

        except MaxRetryError as e:
            if isinstance(e.reason, ConnectTimeoutError):
                # TODO: Remove this in 3.0.0: see #2811
                if not isinstance(e.reason, NewConnectionError):
                    raise ConnectTimeout(e, request=request)

            if isinstance(e.reason, ResponseError):
                raise RetryError(e, request=request)

            if isinstance(e.reason, _ProxyError):
                raise ProxyError(e, request=request)

            if isinstance(e.reason, _SSLError):
                # This branch is for urllib3 v1.22 and later.
>               raise SSLError(e, request=request)
E               requests.exceptions.SSLError: HTTPSConnectionPool(host='commons.wikimedia.org', port=443): Max retries exceeded with url: /static/images/project-logos/commonswiki.png (Caused by SSLError(SSLEOFError(8, '[SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1000)')))

/usr/lib/python3.12/site-packages/requests/adapters.py:698: SSLError
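This test reaches the same dead TLS endpoint through requests instead of wget, so the SSLEOFError surfaces as requests.exceptions.SSLError. The call under test, reconstructed from the traceback (the stream_file signature is taken from the call shown above, not from zimscraperlib's docs):

    import io

    from zimscraperlib.download import stream_file  # import path assumed

    fileobj = io.BytesIO()
    # streams the response body into the in-memory buffer; raises
    # requests.exceptions.SSLError here because the handshake is aborted
    stream_file(
        "https://commons.wikimedia.org/static/images/project-logos/commonswiki.png",
        byte_stream=fileobj,
    )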
During handling of the above exception, another exception occurred:

tmp_path = PosixPath('/tmp/pytest-of-buildozer/pytest-62/test_filelikeprovider_nosize0')
png_image_url = 'https://commons.wikimedia.org/static/images/project-logos/commonswiki.png'

    def test_filelikeprovider_nosize(tmp_path, png_image_url):
        fileobj = io.BytesIO()
>       stream_file(png_image_url, byte_stream=fileobj)

tests/zim/test_zim_creator.py:323: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
.testenv/lib/python3.12/site-packages/zimscraperlib/download.py:199: in stream_file
    resp = session.get(
/usr/lib/python3.12/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.12/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.12/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
request = , stream = True
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()

    def send(
        self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
    ):
        """Sends PreparedRequest object. Returns Response object.

        :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
        :param stream: (optional) Whether to stream the request content.
        :param timeout: (optional) How long to wait for the server to send
            data before giving up, as a float, or a :ref:`(connect timeout,
            read timeout) <timeouts>` tuple.
        :type timeout: float or tuple or urllib3 Timeout object
        :param verify: (optional) Either a boolean, in which case it controls whether
            we verify the server's TLS certificate, or a string, in which case it
            must be a path to a CA bundle to use
        :param cert: (optional) Any user-provided SSL certificate to be trusted.
        :param proxies: (optional) The proxies dictionary to apply to the request.
        :rtype: requests.Response
        """

        try:
            conn = self.get_connection_with_tls_context(
                request, verify, proxies=proxies, cert=cert
            )
        except LocationValueError as e:
            raise InvalidURL(e, request=request)

        self.cert_verify(conn, request.url, verify, cert)
        url = self.request_url(request, proxies)
        self.add_headers(
            request,
            stream=stream,
            timeout=timeout,
            verify=verify,
            cert=cert,
            proxies=proxies,
        )

        chunked = not (request.body is None or "Content-Length" in request.headers)

        if isinstance(timeout, tuple):
            try:
                connect, read = timeout
                timeout = TimeoutSauce(connect=connect, read=read)
            except ValueError:
                raise ValueError(
                    f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                    f"or a single float to set both timeouts to the same value."
                )
        elif isinstance(timeout, TimeoutSauce):
            pass
        else:
            timeout = TimeoutSauce(connect=timeout, read=timeout)

        try:
            resp = conn.urlopen(
                method=request.method,
                url=url,
                body=request.body,
                headers=request.headers,
                redirect=False,
                assert_same_host=False,
                preload_content=False,
                decode_content=False,
                retries=self.max_retries,
                timeout=timeout,
                chunked=chunked,
            )

        except (ProtocolError, OSError) as err:
            raise ConnectionError(err, request=request)

        except MaxRetryError as e:
            if isinstance(e.reason, ConnectTimeoutError):
                # TODO: Remove this in 3.0.0: see #2811
                if not isinstance(e.reason, NewConnectionError):
                    raise ConnectTimeout(e, request=request)

            if isinstance(e.reason, ResponseError):
                raise RetryError(e, request=request)

            if isinstance(e.reason, _ProxyError):
                raise ProxyError(e, request=request)

            if isinstance(e.reason, _SSLError):
                # This branch is for urllib3 v1.22 and later.
>               raise SSLError(e, request=request)
E               requests.exceptions.SSLError: HTTPSConnectionPool(host='commons.wikimedia.org', port=443): Max retries exceeded with url: /static/images/project-logos/commonswiki.png (Caused by SSLError(SSLEOFError(8, '[SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1000)')))

/usr/lib/python3.12/site-packages/requests/adapters.py:698: SSLError
_______________________________ test_urlprovider _______________________________

tmp_path = PosixPath('/tmp/pytest-of-buildozer/pytest-62/test_urlprovider0')
png_image_url = 'https://commons.wikimedia.org/static/images/project-logos/commonswiki.png'

    def test_urlprovider(tmp_path, png_image_url):
        file_path = tmp_path / "file.png"
>       save_large_file(png_image_url, file_path)

tests/zim/test_zim_creator.py:335: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
.testenv/lib/python3.12/site-packages/zimscraperlib/download.py:124: in save_large_file
    subprocess.run(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

input = None, capture_output = False, timeout = None, check = True
popenargs = (['/usr/bin/env', 'wget', '-t', '5', '--retry-connrefused', '--random-wait', ...],)
kwargs = {}
process = 
stdout = None, stderr = None, retcode = 4

    def run(*popenargs,
            input=None, capture_output=False, timeout=None, check=False, **kwargs):
        """Run command with arguments and return a CompletedProcess instance.

        The returned instance will have attributes args, returncode, stdout and
        stderr. By default, stdout and stderr are not captured, and those attributes
        will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them,
        or pass capture_output=True to capture both.

        If check is True and the exit code was non-zero, it raises a
        CalledProcessError.
        The CalledProcessError object will have the return code in the
        returncode attribute, and output & stderr attributes if those streams
        were captured.

        If timeout is given, and the process takes too long, a TimeoutExpired
        exception will be raised.

        There is an optional argument "input", allowing you to pass bytes or
        a string to the subprocess's stdin.  If you use this argument you may
        not also use the Popen constructor's "stdin" argument, as it will be
        used internally.

        By default, all communication is in bytes, and therefore any "input"
        should be bytes, and the stdout and stderr will be bytes. If in text
        mode, any "input" should be a string, and stdout and stderr will be
        strings decoded according to locale encoding, or by "encoding" if set.
        Text mode is triggered by setting any of text, encoding, errors or
        universal_newlines.

        The other arguments are the same as for the Popen constructor.
        """
        if input is not None:
            if kwargs.get('stdin') is not None:
                raise ValueError('stdin and input arguments may not both be used.')
            kwargs['stdin'] = PIPE

        if capture_output:
            if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
                raise ValueError('stdout and stderr arguments may not be used '
                                 'with capture_output.')
            kwargs['stdout'] = PIPE
            kwargs['stderr'] = PIPE

        with Popen(*popenargs, **kwargs) as process:
            try:
                stdout, stderr = process.communicate(input, timeout=timeout)
            except TimeoutExpired as exc:
                process.kill()
                if _mswindows:
                    # Windows accumulates the output in a single blocking
                    # read() call run on child threads, with the timeout
                    # being done in a join() on those threads.  communicate()
                    # _after_ kill() is required to collect that and add it
                    # to the exception.
                    exc.stdout, exc.stderr = process.communicate()
                else:
                    # POSIX _communicate already populated the output so
                    # far into the TimeoutExpired exception.
                    process.wait()
                raise
            except:  # Including KeyboardInterrupt, communicate handled that.
                process.kill()
                # We don't call process.wait() as .__exit__ does that for us.
                raise
            retcode = process.poll()
            if check and retcode:
>               raise CalledProcessError(retcode, process.args,
                                         output=stdout, stderr=stderr)
E               subprocess.CalledProcessError: Command '['/usr/bin/env', 'wget', '-t', '5', '--retry-connrefused', '--random-wait', '-O', '/tmp/pytest-of-buildozer/pytest-62/test_urlprovider0/file.png', '-c', 'https://commons.wikimedia.org/static/images/project-logos/commonswiki.png']' returned non-zero exit status 4.

/usr/lib/python3.12/subprocess.py:571: CalledProcessError
----------------------------- Captured stderr call -----------------------------
--2024-07-11 13:53:39--  https://commons.wikimedia.org/static/images/project-logos/commonswiki.png
Resolving commons.wikimedia.org (commons.wikimedia.org)... 198.35.26.96, 2620:0:863:ed1a::1
Connecting to commons.wikimedia.org (commons.wikimedia.org)|198.35.26.96|:443... connected.
OpenSSL: error:0A000126:SSL routines::unexpected eof while reading
Unable to establish SSL connection.
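Context for the failure above: save_large_file shells out to wget with check=True, so any non-zero exit becomes a CalledProcessError; exit status 4 is wget's documented code for a network failure, which matches the SSL handshake EOF in the captured stderr. A sketch of the same invocation with the error handled rather than propagated (the destination path is illustrative; the flags mirror the command line in the log):

# Sketch: running wget the way the traceback shows and handling failure.
import subprocess

url = "https://commons.wikimedia.org/static/images/project-logos/commonswiki.png"
dest = "/tmp/commonswiki.png"   # illustrative output path

cmd = [
    "/usr/bin/env", "wget",
    "-t", "5",                  # up to 5 tries
    "--retry-connrefused",      # also retry when the connection is refused
    "--random-wait",            # randomize the delay between retries
    "-O", dest,                 # write to this file
    "-c",                       # resume a partial download if present
    url,
]

try:
    subprocess.run(cmd, check=True, capture_output=True, text=True)
except subprocess.CalledProcessError as exc:
    # wget exits with status 4 on network failure, as seen in this build.
    print(f"wget failed with exit status {exc.returncode}:\n{exc.stderr}")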
__________________________ test_urlprovider_nolength ___________________________

tmp_path = PosixPath('/tmp/pytest-of-buildozer/pytest-62/test_urlprovider_nolength0')
png_image_url = 'https://commons.wikimedia.org/static/images/project-logos/commonswiki.png'
png_image = PosixPath('/tmp/pytest-of-buildozer/pytest-62/test_urlprovider_nolength0/original.png')

    def test_urlprovider_nolength(tmp_path, png_image_url, png_image):
        # save url's content locally using external tool
        png_image = tmp_path / "original.png"
>       save_large_file(png_image_url, png_image)

tests/zim/test_zim_creator.py:350: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
.testenv/lib/python3.12/site-packages/zimscraperlib/download.py:124: in save_large_file
    subprocess.run(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

input = None, capture_output = False, timeout = None, check = True
popenargs = (['/usr/bin/env', 'wget', '-t', '5', '--retry-connrefused', '--random-wait', ...],)
kwargs = {}
process = 
stdout = None, stderr = None, retcode = 4

    def run(*popenargs,
            input=None, capture_output=False, timeout=None, check=False, **kwargs):
        """Run command with arguments and return a CompletedProcess instance.

        The returned instance will have attributes args, returncode, stdout and
        stderr. By default, stdout and stderr are not captured, and those attributes
        will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them,
        or pass capture_output=True to capture both.

        If check is True and the exit code was non-zero, it raises a
        CalledProcessError.

        The CalledProcessError object will have the return code in the
        returncode attribute, and output & stderr attributes if those streams
        were captured.

        If timeout is given, and the process takes too long, a TimeoutExpired
        exception will be raised.

        There is an optional argument "input", allowing you to pass bytes or
        a string to the subprocess's stdin.  If you use this argument you may
        not also use the Popen constructor's "stdin" argument, as it will be
        used internally.

        By default, all communication is in bytes, and therefore any "input"
        should be bytes, and the stdout and stderr will be bytes. If in text
        mode, any "input" should be a string, and stdout and stderr will be
        strings decoded according to locale encoding, or by "encoding" if set.
        Text mode is triggered by setting any of text, encoding, errors or
        universal_newlines.

        The other arguments are the same as for the Popen constructor.
        """
        if input is not None:
            if kwargs.get('stdin') is not None:
                raise ValueError('stdin and input arguments may not both be used.')
            kwargs['stdin'] = PIPE

        if capture_output:
            if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
                raise ValueError('stdout and stderr arguments may not be used '
                                 'with capture_output.')
            kwargs['stdout'] = PIPE
            kwargs['stderr'] = PIPE

        with Popen(*popenargs, **kwargs) as process:
            try:
                stdout, stderr = process.communicate(input, timeout=timeout)
            except TimeoutExpired as exc:
                process.kill()
                if _mswindows:
                    # Windows accumulates the output in a single blocking
                    # read() call run on child threads, with the timeout
                    # being done in a join() on those threads.  communicate()
                    # _after_ kill() is required to collect that and add it
                    # to the exception.
                    exc.stdout, exc.stderr = process.communicate()
                else:
                    # POSIX _communicate already populated the output so
                    # far into the TimeoutExpired exception.
                    process.wait()
                raise
            except:  # Including KeyboardInterrupt, communicate handled that.
                process.kill()
                # We don't call process.wait() as .__exit__ does that for us.
                raise
            retcode = process.poll()
            if check and retcode:
>               raise CalledProcessError(retcode, process.args,
                                         output=stdout, stderr=stderr)
E               subprocess.CalledProcessError: Command '['/usr/bin/env', 'wget', '-t', '5', '--retry-connrefused', '--random-wait', '-O', '/tmp/pytest-of-buildozer/pytest-62/test_urlprovider_nolength0/original.png', '-c', 'https://commons.wikimedia.org/static/images/project-logos/commonswiki.png']' returned non-zero exit status 4.

/usr/lib/python3.12/subprocess.py:571: CalledProcessError
----------------------------- Captured stderr call -----------------------------
--2024-07-11 13:55:39--  https://commons.wikimedia.org/static/images/project-logos/commonswiki.png
Resolving commons.wikimedia.org (commons.wikimedia.org)... 198.35.26.96, 2620:0:863:ed1a::1
Connecting to commons.wikimedia.org (commons.wikimedia.org)|198.35.26.96|:443... connected.
OpenSSL: error:0A000126:SSL routines::unexpected eof while reading
Unable to establish SSL connection.
=============================== warnings summary ===============================
tests/conftest.py:161
  /home/buildozer/aports/testing/py3-zimscraperlib/src/python-scraperlib-3.2.0/tests/conftest.py:161: PytestRemovedIn9Warning: Marks applied to fixtures have no effect
    See docs: https://docs.pytest.org/en/stable/deprecations.html#applying-a-mark-to-a-fixture-function
    @pytest.mark.slow

tests/filesystem/test_filesystem.py: 7 warnings
tests/zim/test_fs.py: 7 warnings
tests/zim/test_zim_creator.py: 14 warnings
  /usr/lib/python3.12/site-packages/magic/__init__.py:437: PendingDeprecationWarning: Using compatibility mode with libmagic's python binding. See https://github.com/ahupp/python-magic/blob/master/COMPAT.md for details.
    warnings.warn(

tests/i18n/test_i18n.py::test_lang_details[zh-Hans-expected0]
  /usr/lib/python3.12/site-packages/iso639/iso639.py:20: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
    from pkg_resources import resource_filename

tests/i18n/test_i18n.py::test_lang_details[zh-Hans-expected0]
tests/i18n/test_i18n.py::test_lang_details[zh-Hans-expected0]
  /usr/lib/python3.12/site-packages/pkg_resources/__init__.py:3117: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('zope')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`.
  See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
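The DeprecationWarning above comes from py3-iso639 importing pkg_resources.resource_filename. On Python 3.12 the stdlib importlib.resources covers the same use case; a sketch of the replacement pattern (the package and file names are hypothetical, chosen only to show the shape of the call):

# Sketch: replacing pkg_resources.resource_filename with importlib.resources.
# "somepkg" and "data.tab" are hypothetical names for illustration.
from importlib import resources

# Old, deprecated:
#   from pkg_resources import resource_filename
#   path = resource_filename("somepkg", "data.tab")

# New: as_file() yields a real filesystem path, extracting the resource
# from a zip/wheel if the package is not installed as plain files.
with resources.as_file(resources.files("somepkg") / "data.tab") as path:
    text = path.read_text(encoding="utf-8")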
---------- coverage: platform linux, python 3.12.3-final-0 -----------
Name                                                                         Stmts   Miss  Cover   Missing
-----------------------------------------------------------------------------------------------------------
.testenv/lib/python3.12/site-packages/zimscraperlib/__init__.py                  6      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/constants.py                17      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/download.py                 81      1    99%   61
.testenv/lib/python3.12/site-packages/zimscraperlib/filesystem.py               19      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/fix_ogvjs_dist.py           25      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/html.py                     39      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/i18n.py                     94      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/image/__init__.py            5      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/image/convertion.py         27      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/image/optimization.py       94      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/image/presets.py            62      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/image/probing.py            44      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/image/transformation.py     27      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/image/utils.py               7      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/inputs.py                   38      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/logging.py                  33      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/misc.py                      3      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/types.py                    19      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/uri.py                      17      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/video/__init__.py            3      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/video/config.py             92      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/video/encoding.py           22      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/video/presets.py            27      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/video/probing.py             9      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/zim/__init__.py              8      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/zim/_libkiwix.py            78      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/zim/archive.py              52      3    94%   92-94
.testenv/lib/python3.12/site-packages/zimscraperlib/zim/creator.py             136      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/zim/filesystem.py           62      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/zim/items.py                77      7    91%   73, 123-125, 133, 162-163
.testenv/lib/python3.12/site-packages/zimscraperlib/zim/metadata.py             50      0   100%
.testenv/lib/python3.12/site-packages/zimscraperlib/zim/providers.py            46     16    65%   58-59, 76-84, 88-92, 95
-----------------------------------------------------------------------------------------------------------
TOTAL                                                                         1319     27    98%

=========================== short test summary info ============================
FAILED tests/download/test_download.py::test_youtube_download_serial[https://vimeo.com/619427082-619427082_0]
FAILED tests/download/test_download.py::test_youtube_download_serial[https://vimeo.com/619427082-619427082_1]
FAILED tests/download/test_download.py::test_youtube_download_nowait - assert...
FAILED tests/download/test_download.py::test_youtube_download_contextmanager
FAILED tests/zim/test_archive.py::test_search - RuntimeError: Cluster pointer...
FAILED tests/zim/test_archive.py::test_get_tags - RuntimeError: Cluster point...
FAILED tests/zim/test_zim_creator.py::test_urlitem_html - subprocess.CalledPr...
FAILED tests/zim/test_zim_creator.py::test_urlitem_binary - subprocess.Called...
FAILED tests/zim/test_zim_creator.py::test_filelikeprovider_nosize - requests...
FAILED tests/zim/test_zim_creator.py::test_urlprovider - subprocess.CalledPro...
FAILED tests/zim/test_zim_creator.py::test_urlprovider_nolength - subprocess....
==== 11 failed, 318 passed, 4 deselected, 32 warnings in 1672.21s (0:27:52) ====
/usr/lib/python3.12/site-packages/_pytest/pathlib.py:98: PytestWarning: (rm_rf) error removing /tmp/pytest-of-buildozer/garbage-b519ff9c-41f7-4feb-b942-20e92cc73b15/test_safe_delete_no_perms0
: [Errno 39] Directory not empty: 'test_safe_delete_no_perms0'
  warnings.warn(
/usr/lib/python3.12/site-packages/_pytest/pathlib.py:98: PytestWarning: (rm_rf) error removing /tmp/pytest-of-buildozer/garbage-b519ff9c-41f7-4feb-b942-20e92cc73b15/test_safe_get_no_perms0
: [Errno 39] Directory not empty: 'test_safe_get_no_perms0'
  warnings.warn(
/usr/lib/python3.12/site-packages/_pytest/pathlib.py:98: PytestWarning: (rm_rf) error removing /tmp/pytest-of-buildozer/garbage-b519ff9c-41f7-4feb-b942-20e92cc73b15/test_safe_set_no_perms0
: [Errno 39] Directory not empty: 'test_safe_set_no_perms0'
  warnings.warn(
/usr/lib/python3.12/site-packages/_pytest/pathlib.py:98: PytestWarning: (rm_rf) error removing /tmp/pytest-of-buildozer/garbage-b519ff9c-41f7-4feb-b942-20e92cc73b15
: [Errno 39] Directory not empty: '/tmp/pytest-of-buildozer/garbage-b519ff9c-41f7-4feb-b942-20e92cc73b15'
  warnings.warn(
>>> ERROR: py3-zimscraperlib: check failed
>>> py3-zimscraperlib: Uninstalling dependencies...
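All eleven failures are network-dependent: the youtube/vimeo downloads, the wget and requests fetches from commons.wikimedia.org, and the two archive tests that consume the fetched files. One possible mitigation, sketched here as an assumption rather than anything present in the package, is a conftest.py hook that probes outbound TLS and skips the affected tests by name when the handshake fails (the probe host and the name list are illustrative):

# Sketch for conftest.py: skip known network-dependent tests when the
# builder cannot complete a TLS handshake with the hosts they use.
import socket
import ssl

import pytest

NETWORK_TESTS = (
    "test_youtube_download",
    "test_urlitem",
    "test_filelikeprovider_nosize",
    "test_urlprovider",
)

def _tls_reachable(host: str = "commons.wikimedia.org", port: int = 443) -> bool:
    # Complete a full TLS handshake, not just a TCP connect: in this build
    # the TCP connection succeeded and only the handshake was cut.
    try:
        ctx = ssl.create_default_context()
        with socket.create_connection((host, port), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=host):
                return True
    except OSError:  # ssl.SSLError is a subclass of OSError
        return False

def pytest_collection_modifyitems(config, items):
    if _tls_reachable():
        return
    skip = pytest.mark.skip(reason="outbound TLS unavailable in build environment")
    for item in items:
        if item.name.startswith(NETWORK_TESTS):
            item.add_marker(skip)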
(1/247) Purging .makedepends-py3-zimscraperlib (20240711.132933) (2/247) Purging ffmpeg (6.1.1-r9) (3/247) Purging gifsicle (1.95-r0) (4/247) Purging wget (1.24.5-r0) (5/247) Purging py3-gpep517-pyc (16-r0) (6/247) Purging py3-gpep517 (16-r0) (7/247) Purging py3-installer-pyc (0.7.0-r2) (8/247) Purging py3-installer (0.7.0-r2) (9/247) Purging py3-wheel-pyc (0.42.0-r1) (10/247) Purging py3-wheel (0.42.0-r1) (11/247) Purging py3-babel-pyc (2.14.0-r2) (12/247) Purging py3-babel (2.14.0-r2) (13/247) Purging py3-tz-pyc (2024.1-r1) (14/247) Purging py3-tz (2024.1-r1) (15/247) Purging py3-beautifulsoup4-pyc (4.12.3-r2) (16/247) Purging py3-beautifulsoup4 (4.12.3-r2) (17/247) Purging py3-soupsieve-pyc (2.5-r1) (18/247) Purging py3-soupsieve (2.5-r1) (19/247) Purging py3-colorthief-pyc (0.2.1-r1) (20/247) Purging py3-colorthief (0.2.1-r1) (21/247) Purging py3-iso639-pyc (0.4.5-r1) (22/247) Purging py3-iso639 (0.4.5-r1) (23/247) Purging py3-libzim (3.4.0-r1) (24/247) Purging py3-lxml-pyc (5.1.0-r0) (25/247) Purging py3-lxml (5.1.0-r0) (26/247) Purging py3-magic-pyc (0.4.27-r3) (27/247) Purging py3-magic (0.4.27-r3) (28/247) Purging py3-optimize-images-pyc (1.5.1-r1) (29/247) Purging py3-optimize-images (1.5.1-r1) (30/247) Purging py3-watchdog-pyc (4.0.0-r1) (31/247) Purging py3-watchdog (4.0.0-r1) (32/247) Purging py3-yaml-pyc (6.0.1-r3) (33/247) Purging py3-yaml (6.0.1-r3) (34/247) Purging py3-piexif-pyc (1.1.3-r7) (35/247) Purging py3-piexif (1.1.3-r7) (36/247) Purging py3-pytest-cov-pyc (5.0.0-r0) (37/247) Purging py3-pytest-cov (5.0.0-r0) (38/247) Purging py3-pytest-pyc (8.2.2-r1) (39/247) Purging py3-pytest (8.2.2-r1) (40/247) Purging py3-iniconfig-pyc (2.0.0-r1) (41/247) Purging py3-iniconfig (2.0.0-r1) (42/247) Purging py3-pluggy-pyc (1.5.0-r0) (43/247) Purging py3-pluggy (1.5.0-r0) (44/247) Purging py3-py-pyc (1.11.0-r3) (45/247) Purging py3-py (1.11.0-r3) (46/247) Purging py3-coverage-pyc (7.5.1-r0) (47/247) Purging py3-coverage (7.5.1-r0) (48/247) Purging py3-pytest-httpbin-pyc (2.0.0-r1) (49/247) Purging py3-pytest-httpbin (2.0.0-r1) (50/247) Purging py3-httpbin-pyc (0.10.2-r3) (51/247) Purging py3-httpbin (0.10.2-r3) (52/247) Purging py3-flask-pyc (3.0.3-r0) (53/247) Purging py3-flask (3.0.3-r0) (54/247) Purging py3-click-pyc (8.1.7-r2) (55/247) Purging py3-click (8.1.7-r2) (56/247) Purging py3-itsdangerous-pyc (2.1.2-r4) (57/247) Purging py3-itsdangerous (2.1.2-r4) (58/247) Purging py3-jinja2-pyc (3.1.4-r0) (59/247) Purging py3-jinja2 (3.1.4-r0) (60/247) Purging py3-werkzeug-pyc (3.0.3-r0) (61/247) Purging py3-werkzeug (3.0.3-r0) (62/247) Purging py3-markupsafe-pyc (2.1.5-r1) (63/247) Purging py3-markupsafe (2.1.5-r1) (64/247) Purging py3-raven-pyc (6.10.0-r7) (65/247) Purging py3-raven (6.10.0-r7) (66/247) Purging py3-blinker-pyc (1.7.0-r1) (67/247) Purging py3-blinker (1.7.0-r1) (68/247) Purging py3-brotli-pyc (1.1.0-r2) (69/247) Purging py3-brotli (1.1.0-r2) (70/247) Purging py3-decorator-pyc (5.1.1-r4) (71/247) Purging py3-decorator (5.1.1-r4) (72/247) Purging py3-resizeimage-pyc (1.1.20-r1) (73/247) Purging py3-resizeimage (1.1.20-r1) (74/247) Purging py3-pillow-pyc (10.4.0-r0) (75/247) Purging py3-pillow (10.4.0-r0) (76/247) Purging py3-wsgiprox-pyc (1.5.2-r1) (77/247) Purging py3-wsgiprox (1.5.2-r1) (78/247) Purging py3-certauth-pyc (1.3.0-r1) (79/247) Purging py3-certauth (1.3.0-r1) (80/247) Purging py3-openssl-pyc (24.1.0-r1) (81/247) Purging py3-openssl (24.1.0-r1) (82/247) Purging py3-cryptography-pyc (42.0.8-r0) (83/247) Purging py3-cryptography (42.0.8-r0) (84/247) Purging 
py3-tldextract-pyc (5.1.2-r1) (85/247) Purging py3-tldextract (5.1.2-r1) (86/247) Purging py3-requests-file-pyc (2.1.0-r0) (87/247) Purging py3-requests-file (2.1.0-r0) (88/247) Purging py3-requests-pyc (2.32.3-r0) (89/247) Purging py3-requests (2.32.3-r0) (90/247) Purging py3-certifi-pyc (2024.2.2-r1) (91/247) Purging py3-certifi (2024.2.2-r1) (92/247) Purging py3-charset-normalizer-pyc (3.3.2-r1) (93/247) Purging py3-charset-normalizer (3.3.2-r1) (94/247) Purging py3-idna-pyc (3.7-r0) (95/247) Purging py3-idna (3.7-r0) (96/247) Purging py3-urllib3-pyc (1.26.18-r1) (97/247) Purging py3-urllib3 (1.26.18-r1) (98/247) Purging py3-filelock-pyc (3.13.1-r1) (99/247) Purging py3-filelock (3.13.1-r1) (100/247) Purging py3-gevent-websocket-pyc (0.10.1-r7) (101/247) Purging py3-gevent-websocket (0.10.1-r7) (102/247) Purging py3-gevent-pyc (23.9.1-r1) (103/247) Purging py3-gevent (23.9.1-r1) (104/247) Purging py3-cffi-pyc (1.16.0-r1) (105/247) Purging py3-cffi (1.16.0-r1) (106/247) Purging py3-cparser-pyc (2.22-r1) (107/247) Purging py3-cparser (2.22-r1) (108/247) Purging py3-greenlet-pyc (3.0.3-r1) (109/247) Purging py3-greenlet (3.0.3-r1) (110/247) Purging py3-zope-event-pyc (5.0-r1) (111/247) Purging py3-zope-event (5.0-r1) (112/247) Purging py3-setuptools-pyc (70.3.0-r0) (113/247) Purging py3-setuptools (70.3.0-r0) (114/247) Purging py3-packaging-pyc (24.1-r0) (115/247) Purging py3-packaging (24.1-r0) (116/247) Purging py3-parsing-pyc (3.1.2-r1) (117/247) Purging py3-parsing (3.1.2-r1) (118/247) Purging py3-zope-interface-pyc (6.0-r1) (119/247) Purging py3-zope-interface (6.0-r1) (120/247) Purging py3-six-pyc (1.16.0-r9) (121/247) Purging py3-six (1.16.0-r9) (122/247) Purging yt-dlp-core-pyc (2024.07.09-r0) (123/247) Purging yt-dlp-core (2024.07.09-r0) (124/247) Purging ffmpeg-libavdevice (6.1.1-r9) (125/247) Purging ffmpeg-libavfilter (6.1.1-r9) (126/247) Purging ffmpeg-libavformat (6.1.1-r9) (127/247) Purging ffmpeg-libpostproc (6.1.1-r9) (128/247) Purging ffmpeg-libswscale (6.1.1-r9) (129/247) Purging libass (0.17.3-r0) (130/247) Purging libbluray (1.3.4-r1) (131/247) Purging libev (4.33-r1) (132/247) Purging libimagequant (4.2.2-r0) (133/247) Purging libopenmpt (0.7.7-r0) (134/247) Purging libplacebo (6.338.2-r2) (135/247) Purging libpulse (17.0-r1) (136/247) Purging librist (0.2.10-r1) (137/247) Purging libsndfile (1.2.2-r0) (138/247) Purging libsrt (1.5.3-r0) (139/247) Purging libssh (0.10.6-r0) (140/247) Purging libunibreak (6.1-r0) (141/247) Purging libuv (1.48.0-r0) (142/247) Purging libwebpdemux (1.3.2-r0) (143/247) Purging libxslt (1.1.39-r1) (144/247) Purging libzim (9.1.0-r1) (145/247) Purging libzmq (4.3.5-r2) (146/247) Purging lilv-libs (0.24.24-r1) (147/247) Purging mbedtls (3.6.0-r0) (148/247) Purging mpg123-libs (1.32.6-r0) (149/247) Purging openjpeg (2.5.2-r0) (150/247) Purging orc (0.4.38-r0) (151/247) Purging python3-pyc (3.12.3-r1) (152/247) Purging python3-pycache-pyc0 (3.12.3-r1) (153/247) Purging pyc (3.12.3-r1) (154/247) Purging sdl2 (2.28.5-r1) (155/247) Purging shaderc (2024.0-r1) (156/247) Purging speexdsp (1.2.1-r2) (157/247) Purging sratom (0.6.16-r0) (158/247) Purging tdb-libs (1.4.10-r0) (159/247) Purging tiff (4.6.0t-r0) (160/247) Purging v4l-utils-libs (1.26.1-r0) (161/247) Purging vidstab (1.1.1-r0) (162/247) Purging vulkan-loader (1.3.261.1-r0) (163/247) Purging yaml (0.2.5-r2) (164/247) Purging zimg (3.0.5-r2) (165/247) Purging alsa-lib (1.2.12-r0) (166/247) Purging ffmpeg-libavcodec (6.1.1-r9) (167/247) Purging aom-libs (3.9.1-r0) (168/247) Purging 
dbus-libs (1.14.10-r3) (169/247) Purging ffmpeg-libswresample (6.1.1-r9) (170/247) Purging ffmpeg-libavutil (6.1.1-r9) (171/247) Purging fontconfig (2.15.0-r1) (172/247) Purging harfbuzz (9.0.0-r0) (173/247) Purging freetype (2.13.2-r0) (174/247) Purging fribidi (1.0.15-r0) (175/247) Purging python3 (3.12.3-r1) (176/247) Purging gdbm (1.24-r0) (177/247) Purging libjxl (0.10.2-r0) (178/247) Purging giflib (5.2.2-r0) (179/247) Purging glib (2.80.3-r0) (180/247) Purging glslang-libs (1.3.283.0-r0) (181/247) Purging graphite2 (1.3.14-r6) (182/247) Purging libva (2.21.0-r0) (183/247) Purging libdrm (2.4.122-r0) (184/247) Purging libpciaccess (0.18.1-r0) (185/247) Purging hwdata-pci (0.382-r0) (186/247) Purging icu-libs (74.2-r0) (187/247) Purging icu-data-en (74.2-r0) (188/247) Purging openexr-libopenexr (3.1.13-r1) (189/247) Purging imath (3.1.11-r2) (190/247) Purging lame-libs (3.100-r5) (191/247) Purging lcms2 (2.16-r0) (192/247) Purging libasyncns (0.8-r3) (193/247) Purging libmount (2.40.1-r1) (194/247) Purging libblkid (2.40.1-r1) (195/247) Purging libvdpau (1.5-r3) (196/247) Purging libxext (1.3.6-r2) (197/247) Purging libxfixes (6.0.1-r4) (198/247) Purging libx11 (1.8.9-r1) (199/247) Purging libxcb (1.16.1-r0) (200/247) Purging libxdmcp (1.1.5-r1) (201/247) Purging libbsd (0.12.2-r0) (202/247) Purging libbz2 (1.0.8-r6) (203/247) Purging libdav1d (1.4.3-r0) (204/247) Purging libdovi (3.3.0-r0) (205/247) Purging libeconf (0.6.3-r0) (206/247) Purging wayland-libs-client (1.23.0-r0) (207/247) Purging libffi (3.4.6-r0) (208/247) Purging libflac (1.4.3-r1) (209/247) Purging libgcrypt (1.10.3-r0) (210/247) Purging libgpg-error (1.49-r0) (211/247) Purging libhwy (1.0.7-r0) (212/247) Purging libintl (0.22.5-r0) (213/247) Purging libjpeg-turbo (3.0.3-r0) (214/247) Purging libltdl (2.4.7-r3) (215/247) Purging libmd (1.1.0-r0) (216/247) Purging libtheora (1.1.1-r18) (217/247) Purging libvorbis (1.3.7-r2) (218/247) Purging libogg (1.3.5-r5) (219/247) Purging libpanelw (6.5_p20240601-r0) (220/247) Purging libpng (1.6.43-r0) (221/247) Purging libwebpmux (1.3.2-r0) (222/247) Purging libwebp (1.3.2-r0) (223/247) Purging libsharpyuv (1.3.2-r0) (224/247) Purging libsodium (1.0.20-r0) (225/247) Purging libSvtAv1Enc (2.1.2-r0) (226/247) Purging libxapian (1.4.25-r0) (227/247) Purging libuuid (2.40.1-r1) (228/247) Purging libvpx (1.14.1-r0) (229/247) Purging libxau (1.0.11-r4) (230/247) Purging libxml2 (2.12.8-r0) (231/247) Purging mpdecimal (4.0.0-r0) (232/247) Purging x265-libs (3.6-r0) (233/247) Purging numactl (2.0.18-r0) (234/247) Purging openexr-libilmthread (3.1.13-r1) (235/247) Purging openexr-libiex (3.1.13-r1) (236/247) Purging opus (1.5.2-r0) (237/247) Purging rav1e-libs (0.7.1-r0) (238/247) Purging readline (8.2.10-r0) (239/247) Purging sord-libs (0.16.16-r0) (240/247) Purging serd-libs (0.32.2-r0) (241/247) Purging soxr (0.1.3-r7) (242/247) Purging spirv-tools (1.3.261.1-r0) (243/247) Purging sqlite-libs (3.46.0-r0) (244/247) Purging x264-libs (0.164_git20231001-r0) (245/247) Purging xvidcore (1.3.7-r2) (246/247) Purging xz-libs (5.6.2-r0) (247/247) Purging zix-libs (0.4.2-r0) Executing busybox-1.36.1-r31.trigger OK: 213 MiB in 99 packages