>>> py3-mygpoclient: Building community/py3-mygpoclient 1.10-r1 (using abuild 3.17.0_rc1-r0) started Mon, 13 Apr 2026 17:28:37 +0000
>>> py3-mygpoclient: Validating /home/buildozer/aports/community/py3-mygpoclient/APKBUILD...
>>> py3-mygpoclient: Analyzing dependencies...
>>> py3-mygpoclient: Installing for build: build-base python3 py3-setuptools py3-pytest py3-pytest-cov py3-minimock
( 1/34) Installing libbz2 (1.0.8-r6)
( 2/34) Installing libffi (3.5.2-r0)
( 3/34) Installing gdbm (1.26-r0)
( 4/34) Installing xz-libs (5.8.2-r0)
( 5/34) Installing mpdecimal (4.0.1-r0)
( 6/34) Installing libpanelw (6.6_p20260404-r0)
( 7/34) Installing sqlite-libs (3.53.0-r0)
( 8/34) Installing python3 (3.14.3-r0)
( 9/34) Installing python3-pycache-pyc0 (3.14.3-r0)
(10/34) Installing pyc (3.14.3-r0)
(11/34) Installing python3-pyc (3.14.3-r0)
(12/34) Installing py3-parsing (3.3.2-r1)
(13/34) Installing py3-parsing-pyc (3.3.2-r1)
(14/34) Installing py3-packaging (26.0-r1)
(15/34) Installing py3-packaging-pyc (26.0-r1)
(16/34) Installing py3-setuptools (82.0.1-r1)
(17/34) Installing py3-setuptools-pyc (82.0.1-r1)
(18/34) Installing py3-iniconfig (2.3.0-r1)
(19/34) Installing py3-iniconfig-pyc (2.3.0-r1)
(20/34) Installing py3-pluggy (1.6.0-r1)
(21/34) Installing py3-pluggy-pyc (1.6.0-r1)
(22/34) Installing py3-py (1.11.0-r5)
(23/34) Installing py3-py-pyc (1.11.0-r5)
(24/34) Installing py3-pygments (2.20.0-r0)
(25/34) Installing py3-pygments-pyc (2.20.0-r0)
(26/34) Installing py3-pytest (9.0.3-r0)
(27/34) Installing py3-pytest-pyc (9.0.3-r0)
(28/34) Installing py3-coverage (7.13.5-r1)
(29/34) Installing py3-coverage-pyc (7.13.5-r1)
(30/34) Installing py3-pytest-cov (5.0.0-r2)
(31/34) Installing py3-pytest-cov-pyc (5.0.0-r2)
(32/34) Installing py3-minimock (1.3.0-r1)
(33/34) Installing py3-minimock-pyc (1.3.0-r1)
(34/34) Installing .makedepends-py3-mygpoclient (20260413.172848)
Executing busybox-1.37.0-r31.trigger
OK: 448.0 MiB in 139 packages
>>> py3-mygpoclient: Cleaning up srcdir
>>> py3-mygpoclient: Cleaning up pkgdir
>>> py3-mygpoclient: Cleaning up tmpdir
>>> py3-mygpoclient: Fetching https://distfiles.alpinelinux.org/distfiles/edge/py3-mygpoclient-1.10.tar.gz
/var/cache/distfiles/edge/py3-mygpoclient-1.10.tar.gz: OK
>>> py3-mygpoclient: Fetching https://distfiles.alpinelinux.org/distfiles/edge/py3-mygpoclient-1.10.tar.gz
/var/cache/distfiles/edge/py3-mygpoclient-1.10.tar.gz: OK
>>> py3-mygpoclient: Unpacking /var/cache/distfiles/edge/py3-mygpoclient-1.10.tar.gz...
/usr/lib/python3.14/site-packages/setuptools/dist.py:765: SetuptoolsDeprecationWarning: License classifiers are deprecated.
!!

        ********************************************************************************
        Please consider removing the following classifiers in favor of a SPDX license expression:

        License :: OSI Approved :: GNU General Public License v3 or later (GPLv3+)

        See https://packaging.python.org/en/latest/guides/writing-pyproject-toml/#license for details.
        ********************************************************************************

!!
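Editor's note: the SetuptoolsDeprecationWarning above asks upstream to replace the GPLv3+ Trove classifier with an SPDX license expression. A hedged sketch of what that fix could look like in a pyproject.toml — the surrounding fields are assumptions, since this log does not show mygpoclient's actual packaging metadata:

```toml
# pyproject.toml — sketch only; every field except the license expression is assumed.
[project]
name = "mygpoclient"
version = "1.10"
# SPDX expression replacing the deprecated classifier
# "License :: OSI Approved :: GNU General Public License v3 or later (GPLv3+)"
license = "GPL-3.0-or-later"
```

This form requires setuptools >= 77; the build environment above installs setuptools 82.0.1, so the expression would be accepted there.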
  self._finalize_license_expression()
running build
running build_py
creating build/lib/mygpoclient
copying mygpoclient/json_test.py -> build/lib/mygpoclient
copying mygpoclient/__init__.py -> build/lib/mygpoclient
copying mygpoclient/json.py -> build/lib/mygpoclient
copying mygpoclient/util.py -> build/lib/mygpoclient
copying mygpoclient/locator.py -> build/lib/mygpoclient
copying mygpoclient/api.py -> build/lib/mygpoclient
copying mygpoclient/feeds.py -> build/lib/mygpoclient
copying mygpoclient/api_test.py -> build/lib/mygpoclient
copying mygpoclient/simple.py -> build/lib/mygpoclient
copying mygpoclient/public.py -> build/lib/mygpoclient
copying mygpoclient/testing.py -> build/lib/mygpoclient
copying mygpoclient/http.py -> build/lib/mygpoclient
copying mygpoclient/http_test.py -> build/lib/mygpoclient
copying mygpoclient/simple_test.py -> build/lib/mygpoclient
copying mygpoclient/locator_test.py -> build/lib/mygpoclient
copying mygpoclient/public_test.py -> build/lib/mygpoclient
running build_scripts
creating build/scripts-3.14
copying and adjusting bin/mygpo-simple-client -> build/scripts-3.14
copying and adjusting bin/mygpo-bpsync -> build/scripts-3.14
copying and adjusting bin/mygpo-list-devices -> build/scripts-3.14
changing mode of build/scripts-3.14/mygpo-simple-client from 644 to 755
changing mode of build/scripts-3.14/mygpo-bpsync from 644 to 755
changing mode of build/scripts-3.14/mygpo-list-devices from 644 to 755
python -m pytest
============================= test session starts ==============================
platform linux -- Python 3.14.3, pytest-9.0.3, pluggy-1.6.0
rootdir: /home/buildozer/aports/community/py3-mygpoclient/src/mygpoclient-1.10
plugins: cov-5.0.0
collected 101 items

mygpoclient/api_test.py ................................................ [ 47%]
......                                                                   [ 53%]
mygpoclient/http_test.py FFFFFFFFFFFF                                    [ 65%]
mygpoclient/json_test.py ....                                            [ 69%]
mygpoclient/locator_test.py ...............                              [ 84%]
mygpoclient/public_test.py ........                                      [ 92%]
mygpoclient/simple_test.py ........                                      [100%]

=================================== FAILURES ===================================
_______________________ Test_HttpClient.test_BadRequest ________________________

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.14/urllib/request.py:1321: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.14/http/client.py:1358: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.14/http/client.py:1404: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.14/http/client.py:1353: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.14/http/client.py:1113: in _send_output
    self.send(msg)
/usr/lib/python3.14/http/client.py:1057: in send
    self.connect()
/usr/lib/python3.14/http/client.py:1023: in connect
    self.sock = self._create_connection(
/usr/lib/python3.14/socket.py:870: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('localhost', 9876), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.14/socket.py:855: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def test_BadRequest(self): client = HttpClient() path = self.URI_BASE + '/badrequest' > self.assertRaises(BadRequest, client.GET, path) mygpoclient/http_test.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ mygpoclient/http.py:161: in GET return self._request('GET', uri, None) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ mygpoclient/http.py:147: in _request response = self._opener.open(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/urllib/request.py:487: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/urllib/request.py:504: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.14/urllib/request.py:464: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.14/urllib/request.py:1350: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.14/urllib/request.py:1324: URLError ___________________________ Test_HttpClient.test_GET ___________________________ self = http_class = req = , http_conn_args = {} host = 'localhost:9876', h = headers = {'Connection': 'close', 'Host': 'localhost:9876', 'User-Agent': 'mygpoclient/1.10 (+http://gpodder.org/mygpoclient/)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.14/urllib/request.py:1321: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.14/http/client.py:1358: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.14/http/client.py:1404: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.14/http/client.py:1353: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.14/http/client.py:1113: in _send_output
    self.send(msg)
/usr/lib/python3.14/http/client.py:1057: in send
    self.connect()
/usr/lib/python3.14/http/client.py:1023: in connect
    self.sock = self._create_connection(
/usr/lib/python3.14/socket.py:870: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('localhost', 9876), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.14/socket.py:855: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def test_GET(self): client = HttpClient() path = self.URI_BASE + '/noauth' > self.assertEqual(client.GET(path), self.RESPONSE) ^^^^^^^^^^^^^^^^ mygpoclient/http_test.py:174: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ mygpoclient/http.py:161: in GET return self._request('GET', uri, None) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ mygpoclient/http.py:147: in _request response = self._opener.open(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/urllib/request.py:487: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/urllib/request.py:504: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.14/urllib/request.py:464: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.14/urllib/request.py:1350: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = 'localhost:9876', h = headers = {'Connection': 'close', 'Host': 'localhost:9876', 'User-Agent': 'mygpoclient/1.10 (+http://gpodder.org/mygpoclient/)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.14/urllib/request.py:1324: URLError ______________________ Test_HttpClient.test_GET_after_PUT ______________________ self = http_class = req = , http_conn_args = {} host = 'localhost:9876', h = headers = {'Connection': 'close', 'Content-Length': '31', 'Content-Type': 'application/x-www-form-urlencoded', 'Host': 'localhost:9876', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.14/urllib/request.py:1321: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.14/http/client.py:1358: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.14/http/client.py:1404: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.14/http/client.py:1353: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.14/http/client.py:1113: in _send_output
    self.send(msg)
/usr/lib/python3.14/http/client.py:1057: in send
    self.connect()
/usr/lib/python3.14/http/client.py:1023: in connect
    self.sock = self._create_connection(
/usr/lib/python3.14/socket.py:870: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('localhost', 9876), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.14/socket.py:855: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def test_GET_after_PUT(self): client = HttpClient() for i in range(10): path = self.URI_BASE + '/file.%(i)d.txt' % locals() > client.PUT(path, self.RESPONSE + str(i).encode('utf-8')) mygpoclient/http_test.py:216: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ mygpoclient/http.py:169: in PUT return self._request('PUT', uri, data) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ mygpoclient/http.py:147: in _request response = self._opener.open(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/urllib/request.py:487: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/urllib/request.py:504: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.14/urllib/request.py:464: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.14/urllib/request.py:1350: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = 'localhost:9876', h = headers = {'Connection': 'close', 'Content-Length': '31', 'Content-Type': 'application/x-www-form-urlencoded', 'Host': 'localhost:9876', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.14/urllib/request.py:1324: URLError ________________________ Test_HttpClient.test_NotFound _________________________ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? 
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.14/urllib/request.py:1321: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.14/http/client.py:1358: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.14/http/client.py:1404: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.14/http/client.py:1353: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.14/http/client.py:1113: in _send_output
    self.send(msg)
/usr/lib/python3.14/http/client.py:1057: in send
    self.connect()
/usr/lib/python3.14/http/client.py:1023: in connect
    self.sock = self._create_connection(
/usr/lib/python3.14/socket.py:870: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('localhost', 9876), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.14/socket.py:855: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def test_NotFound(self):
        client = HttpClient()
        path = self.URI_BASE + '/notfound'
>       self.assertRaises(NotFound, client.GET, path)

mygpoclient/http_test.py:159: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
mygpoclient/http.py:161: in GET
    return self._request('GET', uri, None)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
mygpoclient/http.py:147: in _request
    response = self._opener.open(request)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/urllib/request.py:487: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/urllib/request.py:504: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.14/urllib/request.py:464: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.14/urllib/request.py:1350: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.14/urllib/request.py:1324: URLError
__________________________ Test_HttpClient.test_POST ___________________________

self = 
http_class = 
req = , http_conn_args = {}
host = 'localhost:9876', h = 
headers = {'Connection': 'close', 'Content-Length': '35', 'Content-Type': 'application/x-www-form-urlencoded', 'Host': 'localhost:9876', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.14/urllib/request.py:1321:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.14/http/client.py:1358: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.14/http/client.py:1404: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.14/http/client.py:1353: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.14/http/client.py:1113: in _send_output
    self.send(msg)
/usr/lib/python3.14/http/client.py:1057: in send
    self.connect()
/usr/lib/python3.14/http/client.py:1023: in connect
    self.sock = self._create_connection(
/usr/lib/python3.14/socket.py:870: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('localhost', 9876), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.14/socket.py:855: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def test_POST(self):
        client = HttpClient()
        path = self.URI_BASE + '/noauth'
        self.assertEqual(
>           client.POST(
                path, self.DUMMYDATA), codecs.encode(
                self.DUMMYDATA.decode('utf-8'), 'rot-13').encode('utf-8'))

mygpoclient/http_test.py:190:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
mygpoclient/http.py:165: in POST
    return self._request('POST', uri, data)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
mygpoclient/http.py:147: in _request
    response = self._opener.open(request)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/urllib/request.py:487: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/urllib/request.py:504: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.14/urllib/request.py:464: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.14/urllib/request.py:1350: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = 'localhost:9876', h =
headers = {'Connection': 'close', 'Content-Length': '35', 'Content-Type': 'application/x-www-form-urlencoded', 'Host': 'localhost:9876', ...}

[... do_open listing identical to the one above omitted ...]
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.14/urllib/request.py:1324: URLError ___________________________ Test_HttpClient.test_PUT ___________________________ self = http_class = req = , http_conn_args = {} host = 'localhost:9876', h = headers = {'Connection': 'close', 'Content-Length': '35', 'Content-Type': 'application/x-www-form-urlencoded', 'Host': 'localhost:9876', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.14/urllib/request.py:1321: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.14/http/client.py:1358: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.14/http/client.py:1404: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.14/http/client.py:1353: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.14/http/client.py:1113: in _send_output self.send(msg) /usr/lib/python3.14/http/client.py:1057: in send self.connect() /usr/lib/python3.14/http/client.py:1023: in connect self.sock = self._create_connection( /usr/lib/python3.14/socket.py:870: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 9876), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.14/socket.py:855: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def test_PUT(self): client = HttpClient() path = self.URI_BASE + '/noauth' > self.assertEqual(client.PUT(path, self.DUMMYDATA), b'PUT OK') ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ mygpoclient/http_test.py:210: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ mygpoclient/http.py:169: in PUT return self._request('PUT', uri, data) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ mygpoclient/http.py:147: in _request response = self._opener.open(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/urllib/request.py:487: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/urllib/request.py:504: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.14/urllib/request.py:464: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.14/urllib/request.py:1350: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = 'localhost:9876', h = headers = {'Connection': 'close', 'Content-Length': '35', 'Content-Type': 'application/x-www-form-urlencoded', 'Host': 'localhost:9876', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.14/urllib/request.py:1324: URLError ______________________ Test_HttpClient.test_Unauthorized _______________________ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? 
# We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.14/urllib/request.py:1321: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.14/http/client.py:1358: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.14/http/client.py:1404: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.14/http/client.py:1353: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.14/http/client.py:1113: in _send_output self.send(msg) /usr/lib/python3.14/http/client.py:1057: in send self.connect() /usr/lib/python3.14/http/client.py:1023: in connect self.sock = self._create_connection( /usr/lib/python3.14/socket.py:870: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 9876), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. 
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.14/socket.py:855: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def test_Unauthorized(self): client = HttpClient('invalid-username', 'invalid-password') path = self.URI_BASE + '/auth' > self.assertRaises(Unauthorized, client.GET, path) mygpoclient/http_test.py:164: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ mygpoclient/http.py:161: in GET return self._request('GET', uri, None) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ mygpoclient/http.py:147: in _request response = self._opener.open(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/urllib/request.py:487: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/urllib/request.py:504: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.14/urllib/request.py:464: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.14/urllib/request.py:1350: in 
http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.14/urllib/request.py:1324: URLError _____________________ Test_HttpClient.test_UnknownResponse _____________________ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
[... do_open listing, http.client call chain and create_connection listing
identical to the test_POST traceback above omitted ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.14/socket.py:855: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def test_UnknownResponse(self):
        client = HttpClient()
        path = self.URI_BASE + '/invaliderror'
>       self.assertRaises(UnknownResponse, client.GET, path)

mygpoclient/http_test.py:154:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
mygpoclient/http.py:161: in GET
    return self._request('GET', uri, None)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
mygpoclient/http.py:147: in _request
    response = self._opener.open(request)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/urllib/request.py:487: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/urllib/request.py:504: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.14/urllib/request.py:464: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.14/urllib/request.py:1350: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

[... do_open listing identical to the one above omitted ...]
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.14/urllib/request.py:1324: URLError
____________________ Test_HttpClient.test_authenticated_GET ____________________

self =
http_class =
req = , http_conn_args = {}
host = 'localhost:9876', h =
headers = {'Connection': 'close', 'Host': 'localhost:9876', 'User-Agent': 'mygpoclient/1.10 (+http://gpodder.org/mygpoclient/)'}
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.14/urllib/request.py:1321: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.14/http/client.py:1358: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.14/http/client.py:1404: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.14/http/client.py:1353: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.14/http/client.py:1113: in _send_output self.send(msg) /usr/lib/python3.14/http/client.py:1057: in send self.connect() /usr/lib/python3.14/http/client.py:1023: in connect self.sock = self._create_connection( /usr/lib/python3.14/socket.py:870: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 9876), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.14/socket.py:855: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def test_authenticated_GET(self): client = HttpClient(self.USERNAME, self.PASSWORD) path = self.URI_BASE + '/auth' > self.assertEqual(client.GET(path), self.RESPONSE) ^^^^^^^^^^^^^^^^ mygpoclient/http_test.py:179: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ mygpoclient/http.py:161: in GET return self._request('GET', uri, None) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ mygpoclient/http.py:147: in _request response = self._opener.open(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/urllib/request.py:487: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/urllib/request.py:504: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.14/urllib/request.py:464: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.14/urllib/request.py:1350: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = 'localhost:9876', h = headers = {'Connection': 'close', 'Host': 'localhost:9876', 'User-Agent': 'mygpoclient/1.10 (+http://gpodder.org/mygpoclient/)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.14/urllib/request.py:1324: URLError ___________________ Test_HttpClient.test_authenticated_POST ____________________ self = http_class = req = , http_conn_args = {} host = 'localhost:9876', h = headers = {'Connection': 'close', 'Content-Length': '35', 'Content-Type': 'application/x-www-form-urlencoded', 'Host': 'localhost:9876', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.14/urllib/request.py:1321:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.14/http/client.py:1358: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.14/http/client.py:1404: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.14/http/client.py:1353: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.14/http/client.py:1113: in _send_output
    self.send(msg)
/usr/lib/python3.14/http/client.py:1057: in send
    self.connect()
/usr/lib/python3.14/http/client.py:1023: in connect
    self.sock = self._create_connection(
/usr/lib/python3.14/socket.py:870: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('localhost', 9876), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.14/socket.py:855: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def test_authenticated_POST(self): client = HttpClient(self.USERNAME, self.PASSWORD) path = self.URI_BASE + '/auth' self.assertEqual( > client.POST( path, self.DUMMYDATA), codecs.encode( self.DUMMYDATA.decode('utf-8'), 'rot-13').encode('utf-8')) mygpoclient/http_test.py:198: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ mygpoclient/http.py:165: in POST return self._request('POST', uri, data) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ mygpoclient/http.py:147: in _request response = self._opener.open(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/urllib/request.py:487: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/urllib/request.py:504: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.14/urllib/request.py:464: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.14/urllib/request.py:1350: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = 'localhost:9876', h = headers = {'Connection': 'close', 'Content-Length': '35', 'Content-Type': 'application/x-www-form-urlencoded', 'Host': 'localhost:9876', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.14/urllib/request.py:1324: URLError
___________________ Test_HttpClient.test_unauthenticated_GET ___________________

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.14/urllib/request.py:1321:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.14/http/client.py:1358: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.14/http/client.py:1404: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.14/http/client.py:1353: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.14/http/client.py:1113: in _send_output
    self.send(msg)
/usr/lib/python3.14/http/client.py:1057: in send
    self.connect()
/usr/lib/python3.14/http/client.py:1023: in connect
    self.sock = self._create_connection(
/usr/lib/python3.14/socket.py:870: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('localhost', 9876), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.14/socket.py:855: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def test_unauthenticated_GET(self): client = HttpClient() path = self.URI_BASE + '/auth' > self.assertRaises(Unauthorized, client.GET, path) mygpoclient/http_test.py:184: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ mygpoclient/http.py:161: in GET return self._request('GET', uri, None) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ mygpoclient/http.py:147: in _request response = self._opener.open(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/urllib/request.py:487: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.14/urllib/request.py:504: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.14/urllib/request.py:464: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.14/urllib/request.py:1350: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.14/urllib/request.py:1324: URLError __________________ Test_HttpClient.test_unauthenticated_POST ___________________ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? 
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.14/urllib/request.py:1321:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.14/http/client.py:1358: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.14/http/client.py:1404: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.14/http/client.py:1353: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.14/http/client.py:1113: in _send_output
    self.send(msg)
/usr/lib/python3.14/http/client.py:1057: in send
    self.connect()
/usr/lib/python3.14/http/client.py:1023: in connect
    self.sock = self._create_connection(
/usr/lib/python3.14/socket.py:870: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('localhost', 9876), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.
        Connect to *address* (a 2-tuple ``(host, port)``) and return the
        socket object.  Passing the optional *timeout* parameter will set
        the timeout on the socket instance before attempting to connect.
        If no *timeout* is supplied, the global default timeout setting
        returned by :func:`getdefaulttimeout` is used.  If *source_address*
        is set it must be a tuple of (host, port) for the socket to bind as
        a source address before making the connection.  A host of '' or
        port 0 tells the OS to use the default.  When a connection cannot
        be created, raises the last error if *all_errors* is False, and an
        ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.14/socket.py:855: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def test_unauthenticated_POST(self):
        client = HttpClient()
        path = self.URI_BASE + '/auth'
>       self.assertRaises(Unauthorized, client.POST, path, self.DUMMYDATA)

mygpoclient/http_test.py:205:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
mygpoclient/http.py:165: in POST
    return self._request('POST', uri, data)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
mygpoclient/http.py:147: in _request
    response = self._opener.open(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/urllib/request.py:487: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.14/urllib/request.py:504: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.14/urllib/request.py:464: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.14/urllib/request.py:1350: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.14/urllib/request.py:1324: URLError
=========================== short test summary info ============================
FAILED mygpoclient/http_test.py::Test_HttpClient::test_BadRequest - urllib.er...
FAILED mygpoclient/http_test.py::Test_HttpClient::test_GET - urllib.error.URL...
FAILED mygpoclient/http_test.py::Test_HttpClient::test_GET_after_PUT - urllib...
FAILED mygpoclient/http_test.py::Test_HttpClient::test_NotFound - urllib.erro...
FAILED mygpoclient/http_test.py::Test_HttpClient::test_POST - urllib.error.UR...
FAILED mygpoclient/http_test.py::Test_HttpClient::test_PUT - urllib.error.URL...
FAILED mygpoclient/http_test.py::Test_HttpClient::test_Unauthorized - urllib....
FAILED mygpoclient/http_test.py::Test_HttpClient::test_UnknownResponse - urll...
FAILED mygpoclient/http_test.py::Test_HttpClient::test_authenticated_GET - ur...
FAILED mygpoclient/http_test.py::Test_HttpClient::test_authenticated_POST - u...
FAILED mygpoclient/http_test.py::Test_HttpClient::test_unauthenticated_GET - ...
FAILED mygpoclient/http_test.py::Test_HttpClient::test_unauthenticated_POST
======================== 12 failed, 89 passed in 17.17s ========================
make: *** [makefile:31: test] Error 1
>>> ERROR: py3-mygpoclient: check failed
>>> py3-mygpoclient: Uninstalling dependencies...
( 1/34) Purging .makedepends-py3-mygpoclient (20260413.172848)
( 2/34) Purging py3-setuptools-pyc (82.0.1-r1)
( 3/34) Purging py3-setuptools (82.0.1-r1)
( 4/34) Purging py3-pytest-cov-pyc (5.0.0-r2)
( 5/34) Purging py3-pytest-cov (5.0.0-r2)
( 6/34) Purging py3-pytest-pyc (9.0.3-r0)
( 7/34) Purging py3-pytest (9.0.3-r0)
( 8/34) Purging py3-iniconfig-pyc (2.3.0-r1)
( 9/34) Purging py3-iniconfig (2.3.0-r1)
(10/34) Purging py3-packaging-pyc (26.0-r1)
(11/34) Purging py3-packaging (26.0-r1)
(12/34) Purging py3-parsing-pyc (3.3.2-r1)
(13/34) Purging py3-parsing (3.3.2-r1)
(14/34) Purging py3-pluggy-pyc (1.6.0-r1)
(15/34) Purging py3-pluggy (1.6.0-r1)
(16/34) Purging py3-py-pyc (1.11.0-r5)
(17/34) Purging py3-py (1.11.0-r5)
(18/34) Purging py3-pygments-pyc (2.20.0-r0)
(19/34) Purging py3-pygments (2.20.0-r0)
(20/34) Purging py3-coverage-pyc (7.13.5-r1)
(21/34) Purging py3-coverage (7.13.5-r1)
(22/34) Purging py3-minimock-pyc (1.3.0-r1)
(23/34) Purging py3-minimock (1.3.0-r1)
(24/34) Purging python3-pyc (3.14.3-r0)
(25/34) Purging python3-pycache-pyc0 (3.14.3-r0)
(26/34) Purging pyc (3.14.3-r0)
(27/34) Purging python3 (3.14.3-r0)
(28/34) Purging gdbm (1.26-r0)
(29/34) Purging libbz2 (1.0.8-r6)
(30/34) Purging libffi (3.5.2-r0)
(31/34) Purging libpanelw (6.6_p20260404-r0)
(32/34) Purging mpdecimal (4.0.1-r0)
(33/34) Purging sqlite-libs (3.53.0-r0)
(34/34) Purging xz-libs (5.8.2-r0)
Executing busybox-1.37.0-r31.trigger
OK: 390.9 MiB in 105 packages
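Diagnosis note: all twelve failures bottom out in the same place — `sock.connect(sa)` to `('localhost', 9876)` is refused, i.e. the HTTP server the `Test_HttpClient` fixtures expect on port 9876 was never reachable in the build sandbox. A minimal standalone probe for that condition (a sketch; the host and port are taken from the tracebacks above, and whether anything should listen there depends on the test harness):

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers the ConnectionRefusedError ([Errno 111]) seen in the log.
        return False

if __name__ == "__main__":
    # Port 9876 from the tracebacks; in the failing build environment
    # this should come back False.
    print(port_open("localhost", 9876))
```

Running this before `pytest` would distinguish "test server never started" from a genuine client-side regression.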
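The `URLError` reported at `request.py:1324` is not a separate bug: `do_open()` catches the socket-level `OSError` and re-raises it as `urllib.error.URLError`, keeping the original error in the exception's `reason` attribute. A short sketch of that wrapping, assuming (as in the failing environment) that nothing listens on port 9876:

```python
import urllib.error
import urllib.request

try:
    # Same kind of request the failing tests issue via HttpClient.
    urllib.request.urlopen("http://localhost:9876/", timeout=2)
except urllib.error.URLError as exc:
    # do_open() raised URLError(err); the underlying OSError
    # (here a refused connection) is preserved as exc.reason.
    print(type(exc.reason).__name__)
```

Inspecting `exc.reason` is the quickest way to see the real cause behind the otherwise opaque `URLError` lines in the summary.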