Source: python-digitalocean
Version: 1.16.0-3
Severity: serious
Justification: FTBFS
Tags: trixie sid ftbfs
User: [email protected]
Usertags: ftbfs-20240615 ftbfs-trixie
Hi,

During a rebuild of all packages in sid, your package failed to build on amd64.

Relevant part (hopefully):
> make[1]: Entering directory '/<<PKGBUILDDIR>>'
> dh_auto_build
> I: pybuild base:311: /usr/bin/python3.12 setup.py build
> running build
> running build_py
> creating
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/FloatingIP.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Volume.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Tag.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Balance.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Region.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Record.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Firewall.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Size.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Domain.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Metadata.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/LoadBalancer.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/baseapi.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Droplet.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Manager.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Action.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Image.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Kernel.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/VPC.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Project.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Account.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Certificate.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/SSHKey.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Snapshot.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> I: pybuild base:311: /usr/bin/python3 setup.py build
> running build
> running build_py
> creating
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/FloatingIP.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Volume.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Tag.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Balance.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Region.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Record.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Firewall.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Size.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Domain.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Metadata.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/LoadBalancer.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/baseapi.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Droplet.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Manager.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Action.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Image.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Kernel.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/VPC.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Project.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Account.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Certificate.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/SSHKey.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Snapshot.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> PYTHONPATH=. python3 -m sphinx -N -bhtml docs/ build/html
> Running Sphinx v7.2.6
> WARNING: Invalid configuration value found: 'language = None'. Update your
> configuration to a valid language code. Falling back to 'en' (English).
> making output directory... done
> WARNING: The pre-Sphinx 1.0 'intersphinx_mapping' format is deprecated and
> will be removed in Sphinx 8. Update to the current format as described in the
> documentation. Hint: "intersphinx_mapping = {'<name>':
> ('https://docs.python.org/',
> None)}".https://www.sphinx-doc.org/en/master/usage/extensions/intersphinx.html#confval-intersphinx_mapping
> loading intersphinx inventory from https://docs.python.org/objects.inv...
> WARNING: failed to reach any of the inventories with the following issues:
> intersphinx inventory 'https://docs.python.org/objects.inv' not fetchable due
> to <class 'requests.exceptions.ProxyError'>:
> HTTPSConnectionPool(host='docs.python.org', port=443): Max retries exceeded
> with url: /objects.inv (Caused by ProxyError('Unable to connect to proxy',
> NewConnectionError('<urllib3.connection.HTTPSConnection object at
> 0x7f58057dfdd0>: Failed to establish a new connection: [Errno 111] Connection
> refused')))
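
[Side note: the two intersphinx warnings above are harmless for this build — the inventory fetch fails only because the buildd has no network access. The deprecation warning can still be silenced in docs/conf.py using the current mapping format; the key name and URL below are illustrative assumptions, not copied from this package's conf.py:]

```python
# Hypothetical docs/conf.py fragment in the post-Sphinx-1.0 format the
# warning asks for: named keys mapping to (base URL, inventory) tuples.
intersphinx_mapping = {
    "python": ("https://docs.python.org/3", None),
}
```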
> building [mo]: targets for 0 po files that are out of date
> writing output...
> building [html]: targets for 2 source files that are out of date
> updating environment: [new config] 2 added, 0 changed, 0 removed
> reading sources... [ 50%] digitalocean
> reading sources... [100%] index
>
> /<<PKGBUILDDIR>>/digitalocean/Domain.py:docstring of
> digitalocean.Domain.Domain.create_new_domain_record:15: ERROR: Unexpected
> indentation.
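
[Side note: this Sphinx ERROR typically means an indented block in the create_new_domain_record docstring follows a paragraph without the blank line (or `::` literal-block marker) that reStructuredText requires. A schematic illustration only — the real docstring is not reproduced here:]

```python
# Schematic reST strings: an indented block glued directly onto a
# paragraph triggers "Unexpected indentation"; a blank line plus "::"
# introduces it as a literal block instead.
bad = (
    "Creates a new record.\n"
    "    params is a dict like:\n"   # indented with no blank line before
    "        {\"type\": \"A\"}\n"    # it -> ERROR: Unexpected indentation.
)

good = (
    "Creates a new record.\n"
    "\n"                             # blank line separates the paragraph
    "params is a dict like::\n"      # "::" opens a literal block
    "\n"
    "    {\"type\": \"A\"}\n"
)
```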
> looking for now-outdated files... none found
> pickling environment... done
> checking consistency... done
> preparing documents... done
> copying assets... copying static files... done
> copying extra files... done
> done
> writing output... [ 50%] digitalocean
> writing output... [100%] index
>
> generating indices... genindex py-modindex done
> highlighting module code... [ 6%] digitalocean.Account
> highlighting module code... [ 12%] digitalocean.Action
> highlighting module code... [ 19%] digitalocean.Domain
> highlighting module code... [ 25%] digitalocean.Droplet
> highlighting module code... [ 31%] digitalocean.FloatingIP
> highlighting module code... [ 38%] digitalocean.Image
> highlighting module code... [ 44%] digitalocean.Kernel
> highlighting module code... [ 50%] digitalocean.LoadBalancer
> highlighting module code... [ 56%] digitalocean.Manager
> highlighting module code... [ 62%] digitalocean.Metadata
> highlighting module code... [ 69%] digitalocean.Record
> highlighting module code... [ 75%] digitalocean.Region
> highlighting module code... [ 81%] digitalocean.SSHKey
> highlighting module code... [ 88%] digitalocean.Size
> highlighting module code... [ 94%] digitalocean.Tag
> highlighting module code... [100%] digitalocean.Volume
>
> writing additional pages... search done
> dumping search index in English (code: en)... done
> dumping object inventory... done
> build succeeded, 4 warnings.
>
> The HTML pages are in build/html.
> make[1]: Leaving directory '/<<PKGBUILDDIR>>'
> dh_auto_test -O--buildsystem=pybuild
> I: pybuild pybuild:308: cp -r /<<PKGBUILDDIR>>/digitalocean/tests
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build
> I: pybuild base:311: cd
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build; python3.12 -m
> pytest
> ============================= test session starts
> ==============================
> platform linux -- Python 3.12.4, pytest-8.2.2, pluggy-1.5.0
> rootdir: /<<PKGBUILDDIR>>
> collected 152 items
>
> tests/test_action.py .. [
> 1%]
> tests/test_baseapi.py .... [
> 3%]
> tests/test_certificate.py .... [
> 6%]
> tests/test_domain.py ....... [
> 11%]
> tests/test_droplet.py ............................................ [
> 40%]
> tests/test_firewall.py FF.FF [
> 43%]
> tests/test_floatingip.py ...... [
> 47%]
> tests/test_image.py ....... [
> 51%]
> tests/test_load_balancer.py .......... [
> 58%]
> tests/test_manager.py ............................... [
> 78%]
> tests/test_project.py ......... [
> 84%]
> tests/test_snapshot.py .. [
> 86%]
> tests/test_tag.py ....... [
> 90%]
> tests/test_volume.py .......... [
> 97%]
> tests/test_vpc.py ....
> [100%]
>
> =================================== FAILURES
> ===================================
> ________________________ TestFirewall.test_add_droplets
> ________________________
>
> self = <urllib3.response.HTTPResponse object at 0x7f6ac3ff02e0>
>
> @contextmanager
> def _error_catcher(self) -> typing.Generator[None, None, None]:
> """
> Catch low-level python exceptions, instead re-raising urllib3
> variants, so that low-level exceptions are not leaked in the
> high-level api.
>
> On exit, release the connection back to the pool.
> """
> clean_exit = False
>
> try:
> try:
> > yield
>
> /usr/lib/python3/dist-packages/urllib3/response.py:710:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> _
>
> self = <urllib3.response.HTTPResponse object at 0x7f6ac3ff02e0>, amt = 10240
>
> def _raw_read(
> self,
> amt: int | None = None,
> ) -> bytes:
> """
> Reads `amt` of bytes from the socket.
> """
> if self._fp is None:
> return None # type: ignore[return-value]
>
> fp_closed = getattr(self._fp, "closed", False)
>
> with self._error_catcher():
> data = self._fp_read(amt) if not fp_closed else b""
> if amt is not None and amt != 0 and not data:
> # Platform-specific: Buggy versions of Python.
> # Close the connection when no data is returned
> #
> # This is redundant to what httplib/http.client _should_
> # already do. However, versions of python released before
> # December 15, 2012 (http://bugs.python.org/issue16298) do
> # not properly close the connection in all cases. There is
> # no harm in redundantly calling close.
> self._fp.close()
> if (
> self.enforce_content_length
> and self.length_remaining is not None
> and self.length_remaining != 0
> ):
> # This is an edge case that httplib failed to cover due
> # to concerns of backward compatibility. We're
> # addressing it here to make sure IncompleteRead is
> # raised during streaming, so all calls with incorrect
> # Content-Length are caught.
> > raise IncompleteRead(self._fp_bytes_read,
> > self.length_remaining)
> E urllib3.exceptions.IncompleteRead: IncompleteRead(44
> bytes read, -44 more expected)
>
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
>
> The above exception was the direct cause of the following exception:
>
> def generate():
> # Special case for urllib3.
> if hasattr(self.raw, "stream"):
> try:
> > yield from self.raw.stream(chunk_size, decode_content=True)
>
> /usr/lib/python3/dist-packages/requests/models.py:820:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> _
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
> data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
> data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
> with self._error_catcher():
> /usr/lib/python3.12/contextlib.py:158: in __exit__
> self.gen.throw(value)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> _
>
> self = <urllib3.response.HTTPResponse object at 0x7f6ac3ff02e0>
>
> @contextmanager
> def _error_catcher(self) -> typing.Generator[None, None, None]:
> """
> Catch low-level python exceptions, instead re-raising urllib3
> variants, so that low-level exceptions are not leaked in the
> high-level api.
>
> On exit, release the connection back to the pool.
> """
> clean_exit = False
>
> try:
> try:
> yield
>
> except SocketTimeout as e:
> # FIXME: Ideally we'd like to include the url in the
> ReadTimeoutError but
> # there is yet no clean way to get at it from this context.
> raise ReadTimeoutError(self._pool, None, "Read timed out.")
> from e # type: ignore[arg-type]
>
> except BaseSSLError as e:
> # FIXME: Is there a better way to differentiate between
> SSLErrors?
> if "read operation timed out" not in str(e):
> # SSL errors related to framing/MAC get wrapped and
> reraised here
> raise SSLError(e) from e
>
> raise ReadTimeoutError(self._pool, None, "Read timed out.")
> from e # type: ignore[arg-type]
>
> except (HTTPException, OSError) as e:
> # This includes IncompleteRead.
> > raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E urllib3.exceptions.ProtocolError: ('Connection broken:
> IncompleteRead(44 bytes read, -44 more expected)', IncompleteRead(44 bytes
> read, -44 more expected))
>
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
>
> During handling of the above exception, another exception occurred:
>
> self = <tests.test_firewall.TestFirewall testMethod=test_add_droplets>
>
> @responses.activate
> def test_add_droplets(self):
> data = self.load_from_file('firewalls/droplets.json')
>
> url = self.base_url + "firewalls/12345/droplets"
> responses.add(responses.POST, url,
> body=data,
> status=204,
> content_type='application/json')
>
> droplet_id = json.loads(data)["droplet_ids"][0]
> > self.firewall.add_droplets([droplet_id])
>
> tests/test_firewall.py:73:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> _
> digitalocean/Firewall.py:202: in add_droplets
> return self.get_data(
> digitalocean/baseapi.py:216: in get_data
> req = self.__perform_request(url, type, params)
> digitalocean/baseapi.py:133: in __perform_request
> return requests_method(url, **kwargs)
> /usr/lib/python3/dist-packages/requests/api.py:115: in post
> return request("post", url, data=data, json=json, **kwargs)
> /usr/lib/python3/dist-packages/requests/api.py:59: in request
> return session.request(method=method, url=url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
> resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
> r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
> self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> _
>
> def generate():
> # Special case for urllib3.
> if hasattr(self.raw, "stream"):
> try:
> yield from self.raw.stream(chunk_size, decode_content=True)
> except ProtocolError as e:
> > raise ChunkedEncodingError(e)
> E requests.exceptions.ChunkedEncodingError: ('Connection
> broken: IncompleteRead(44 bytes read, -44 more expected)', IncompleteRead(44
> bytes read, -44 more expected))
>
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
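
[Side note: all four failures share the same shape. The tests register a mocked 204 No Content response but give it a JSON body, and newer urllib3 enforces the declared content length — my reading of the traceback, not a verified diagnosis. The odd negative "more expected" value is then just 0 minus the body size, as this small illustration shows (it mirrors the length arithmetic in the urllib3 code quoted above; it is not that code):]

```python
# Why a 44-byte body on a 204 yields
# "IncompleteRead(44 bytes read, -44 more expected)":
# urllib3 tracks length_remaining = declared - read, and a 204 response
# effectively declares 0 bytes, so any mocked body drives it negative.
def remaining_after_read(declared_length: int, bytes_read: int) -> int:
    return declared_length - bytes_read

body = b'{"droplet_ids": [12345]}'.ljust(44)  # pad to 44 bytes, as in the log
assert len(body) == 44
print(remaining_after_read(0, len(body)))  # -> -44
```

[A plausible test-side fix would be to drop the body= argument from the status=204 responses.add(...) calls in tests/test_firewall.py, since a 204 must not carry a body.]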
> __________________________ TestFirewall.test_add_tags
> __________________________
>
> self = <urllib3.response.HTTPResponse object at 0x7f6ac3ff2110>
>
> @contextmanager
> def _error_catcher(self) -> typing.Generator[None, None, None]:
> """
> Catch low-level python exceptions, instead re-raising urllib3
> variants, so that low-level exceptions are not leaked in the
> high-level api.
>
> On exit, release the connection back to the pool.
> """
> clean_exit = False
>
> try:
> try:
> > yield
>
> /usr/lib/python3/dist-packages/urllib3/response.py:710:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> _
>
> self = <urllib3.response.HTTPResponse object at 0x7f6ac3ff2110>, amt = 10240
>
> def _raw_read(
> self,
> amt: int | None = None,
> ) -> bytes:
> """
> Reads `amt` of bytes from the socket.
> """
> if self._fp is None:
> return None # type: ignore[return-value]
>
> fp_closed = getattr(self._fp, "closed", False)
>
> with self._error_catcher():
> data = self._fp_read(amt) if not fp_closed else b""
> if amt is not None and amt != 0 and not data:
> # Platform-specific: Buggy versions of Python.
> # Close the connection when no data is returned
> #
> # This is redundant to what httplib/http.client _should_
> # already do. However, versions of python released before
> # December 15, 2012 (http://bugs.python.org/issue16298) do
> # not properly close the connection in all cases. There is
> # no harm in redundantly calling close.
> self._fp.close()
> if (
> self.enforce_content_length
> and self.length_remaining is not None
> and self.length_remaining != 0
> ):
> # This is an edge case that httplib failed to cover due
> # to concerns of backward compatibility. We're
> # addressing it here to make sure IncompleteRead is
> # raised during streaming, so all calls with incorrect
> # Content-Length are caught.
> > raise IncompleteRead(self._fp_bytes_read,
> > self.length_remaining)
> E urllib3.exceptions.IncompleteRead: IncompleteRead(41
> bytes read, -41 more expected)
>
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
>
> The above exception was the direct cause of the following exception:
>
> def generate():
> # Special case for urllib3.
> if hasattr(self.raw, "stream"):
> try:
> > yield from self.raw.stream(chunk_size, decode_content=True)
>
> /usr/lib/python3/dist-packages/requests/models.py:820:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> _
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
> data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
> data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
> with self._error_catcher():
> /usr/lib/python3.12/contextlib.py:158: in __exit__
> self.gen.throw(value)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> _
>
> self = <urllib3.response.HTTPResponse object at 0x7f6ac3ff2110>
>
> @contextmanager
> def _error_catcher(self) -> typing.Generator[None, None, None]:
> """
> Catch low-level python exceptions, instead re-raising urllib3
> variants, so that low-level exceptions are not leaked in the
> high-level api.
>
> On exit, release the connection back to the pool.
> """
> clean_exit = False
>
> try:
> try:
> yield
>
> except SocketTimeout as e:
> # FIXME: Ideally we'd like to include the url in the
> ReadTimeoutError but
> # there is yet no clean way to get at it from this context.
> raise ReadTimeoutError(self._pool, None, "Read timed out.")
> from e # type: ignore[arg-type]
>
> except BaseSSLError as e:
> # FIXME: Is there a better way to differentiate between
> SSLErrors?
> if "read operation timed out" not in str(e):
> # SSL errors related to framing/MAC get wrapped and
> reraised here
> raise SSLError(e) from e
>
> raise ReadTimeoutError(self._pool, None, "Read timed out.")
> from e # type: ignore[arg-type]
>
> except (HTTPException, OSError) as e:
> # This includes IncompleteRead.
> > raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E urllib3.exceptions.ProtocolError: ('Connection broken:
> IncompleteRead(41 bytes read, -41 more expected)', IncompleteRead(41 bytes
> read, -41 more expected))
>
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
>
> During handling of the above exception, another exception occurred:
>
> self = <tests.test_firewall.TestFirewall testMethod=test_add_tags>
>
> @responses.activate
> def test_add_tags(self):
> data = self.load_from_file('firewalls/tags.json')
>
> url = self.base_url + "firewalls/12345/tags"
> responses.add(responses.POST, url,
> body=data,
> status=204,
> content_type='application/json')
>
> tag = json.loads(data)["tags"][0]
> > self.firewall.add_tags([tag])
>
> tests/test_firewall.py:104:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> _
> digitalocean/Firewall.py:222: in add_tags
> return self.get_data(
> digitalocean/baseapi.py:216: in get_data
> req = self.__perform_request(url, type, params)
> digitalocean/baseapi.py:133: in __perform_request
> return requests_method(url, **kwargs)
> /usr/lib/python3/dist-packages/requests/api.py:115: in post
> return request("post", url, data=data, json=json, **kwargs)
> /usr/lib/python3/dist-packages/requests/api.py:59: in request
> return session.request(method=method, url=url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
> resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
> r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
> self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> _
>
> def generate():
> # Special case for urllib3.
> if hasattr(self.raw, "stream"):
> try:
> yield from self.raw.stream(chunk_size, decode_content=True)
> except ProtocolError as e:
> > raise ChunkedEncodingError(e)
> E requests.exceptions.ChunkedEncodingError: ('Connection
> broken: IncompleteRead(41 bytes read, -41 more expected)', IncompleteRead(41
> bytes read, -41 more expected))
>
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> ______________________ TestFirewall.test_remove_droplets
> _______________________
>
> self = <urllib3.response.HTTPResponse object at 0x7f6ac3fbfd60>
>
> @contextmanager
> def _error_catcher(self) -> typing.Generator[None, None, None]:
> """
> Catch low-level python exceptions, instead re-raising urllib3
> variants, so that low-level exceptions are not leaked in the
> high-level api.
>
> On exit, release the connection back to the pool.
> """
> clean_exit = False
>
> try:
> try:
> > yield
>
> /usr/lib/python3/dist-packages/urllib3/response.py:710:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> _
>
> self = <urllib3.response.HTTPResponse object at 0x7f6ac3fbfd60>, amt = 10240
>
> def _raw_read(
> self,
> amt: int | None = None,
> ) -> bytes:
> """
> Reads `amt` of bytes from the socket.
> """
> if self._fp is None:
> return None # type: ignore[return-value]
>
> fp_closed = getattr(self._fp, "closed", False)
>
> with self._error_catcher():
> data = self._fp_read(amt) if not fp_closed else b""
> if amt is not None and amt != 0 and not data:
> # Platform-specific: Buggy versions of Python.
> # Close the connection when no data is returned
> #
> # This is redundant to what httplib/http.client _should_
> # already do. However, versions of python released before
> # December 15, 2012 (http://bugs.python.org/issue16298) do
> # not properly close the connection in all cases. There is
> # no harm in redundantly calling close.
> self._fp.close()
> if (
> self.enforce_content_length
> and self.length_remaining is not None
> and self.length_remaining != 0
> ):
> # This is an edge case that httplib failed to cover due
> # to concerns of backward compatibility. We're
> # addressing it here to make sure IncompleteRead is
> # raised during streaming, so all calls with incorrect
> # Content-Length are caught.
> > raise IncompleteRead(self._fp_bytes_read,
> > self.length_remaining)
> E urllib3.exceptions.IncompleteRead: IncompleteRead(44
> bytes read, -44 more expected)
>
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
>
> The above exception was the direct cause of the following exception:
>
> def generate():
> # Special case for urllib3.
> if hasattr(self.raw, "stream"):
> try:
> > yield from self.raw.stream(chunk_size, decode_content=True)
>
> /usr/lib/python3/dist-packages/requests/models.py:820:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> _
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
> data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
> data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
> with self._error_catcher():
> /usr/lib/python3.12/contextlib.py:158: in __exit__
> self.gen.throw(value)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> _
>
> self = <urllib3.response.HTTPResponse object at 0x7f6ac3fbfd60>
>
> @contextmanager
> def _error_catcher(self) -> typing.Generator[None, None, None]:
> """
> Catch low-level python exceptions, instead re-raising urllib3
> variants, so that low-level exceptions are not leaked in the
> high-level api.
>
> On exit, release the connection back to the pool.
> """
> clean_exit = False
>
> try:
> try:
> yield
>
> except SocketTimeout as e:
> # FIXME: Ideally we'd like to include the url in the
> ReadTimeoutError but
> # there is yet no clean way to get at it from this context.
> raise ReadTimeoutError(self._pool, None, "Read timed out.")
> from e # type: ignore[arg-type]
>
> except BaseSSLError as e:
> # FIXME: Is there a better way to differentiate between
> SSLErrors?
> if "read operation timed out" not in str(e):
> # SSL errors related to framing/MAC get wrapped and
> reraised here
> raise SSLError(e) from e
>
> raise ReadTimeoutError(self._pool, None, "Read timed out.")
> from e # type: ignore[arg-type]
>
> except (HTTPException, OSError) as e:
> # This includes IncompleteRead.
> > raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E urllib3.exceptions.ProtocolError: ('Connection broken:
> IncompleteRead(44 bytes read, -44 more expected)', IncompleteRead(44 bytes
> read, -44 more expected))
>
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
>
> During handling of the above exception, another exception occurred:
>
> self = <tests.test_firewall.TestFirewall testMethod=test_remove_droplets>
>
> @responses.activate
> def test_remove_droplets(self):
> data = self.load_from_file('firewalls/droplets.json')
>
> url = self.base_url + "firewalls/12345/droplets"
> responses.add(responses.DELETE,
> url,
> body=data,
> status=204,
> content_type='application/json')
>
> droplet_id = json.loads(data)["droplet_ids"][0]
> > self.firewall.remove_droplets([droplet_id])
>
> tests/test_firewall.py:89:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> _
> digitalocean/Firewall.py:212: in remove_droplets
> return self.get_data(
> digitalocean/baseapi.py:216: in get_data
> req = self.__perform_request(url, type, params)
> digitalocean/baseapi.py:133: in __perform_request
> return requests_method(url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:671: in delete
> return self.request("DELETE", url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
> resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
> r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
> self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> _
>
> def generate():
> # Special case for urllib3.
> if hasattr(self.raw, "stream"):
> try:
> yield from self.raw.stream(chunk_size, decode_content=True)
> except ProtocolError as e:
> > raise ChunkedEncodingError(e)
> E requests.exceptions.ChunkedEncodingError: ('Connection
> broken: IncompleteRead(44 bytes read, -44 more expected)', IncompleteRead(44
> bytes read, -44 more expected))
>
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> ________________________ TestFirewall.test_remove_tags
> _________________________
>
> self = <urllib3.response.HTTPResponse object at 0x7f6ac3ff0100>
>
> @contextmanager
> def _error_catcher(self) -> typing.Generator[None, None, None]:
> """
> Catch low-level python exceptions, instead re-raising urllib3
> variants, so that low-level exceptions are not leaked in the
> high-level api.
>
> On exit, release the connection back to the pool.
> """
> clean_exit = False
>
> try:
> try:
> > yield
>
> /usr/lib/python3/dist-packages/urllib3/response.py:710:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> _
>
> self = <urllib3.response.HTTPResponse object at 0x7f6ac3ff0100>, amt = 10240
>
> def _raw_read(
> self,
> amt: int | None = None,
> ) -> bytes:
> """
> Reads `amt` of bytes from the socket.
> """
> if self._fp is None:
> return None # type: ignore[return-value]
>
> fp_closed = getattr(self._fp, "closed", False)
>
> with self._error_catcher():
> data = self._fp_read(amt) if not fp_closed else b""
> if amt is not None and amt != 0 and not data:
> # Platform-specific: Buggy versions of Python.
> # Close the connection when no data is returned
> #
> # This is redundant to what httplib/http.client _should_
> # already do. However, versions of python released before
> # December 15, 2012 (http://bugs.python.org/issue16298) do
> # not properly close the connection in all cases. There is
> # no harm in redundantly calling close.
> self._fp.close()
> if (
> self.enforce_content_length
> and self.length_remaining is not None
> and self.length_remaining != 0
> ):
> # This is an edge case that httplib failed to cover due
> # to concerns of backward compatibility. We're
> # addressing it here to make sure IncompleteRead is
> # raised during streaming, so all calls with incorrect
> # Content-Length are caught.
> > raise IncompleteRead(self._fp_bytes_read,
> > self.length_remaining)
> E urllib3.exceptions.IncompleteRead: IncompleteRead(41
> bytes read, -41 more expected)
>
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
>
> The above exception was the direct cause of the following exception:
>
> def generate():
> # Special case for urllib3.
> if hasattr(self.raw, "stream"):
> try:
> > yield from self.raw.stream(chunk_size, decode_content=True)
>
> /usr/lib/python3/dist-packages/requests/models.py:820:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> _
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
> data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
> data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
> with self._error_catcher():
> /usr/lib/python3.12/contextlib.py:158: in __exit__
> self.gen.throw(value)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
> self = <urllib3.response.HTTPResponse object at 0x7f6ac3ff0100>
>
> @contextmanager
> def _error_catcher(self) -> typing.Generator[None, None, None]:
> """
> Catch low-level python exceptions, instead re-raising urllib3
> variants, so that low-level exceptions are not leaked in the
> high-level api.
>
> On exit, release the connection back to the pool.
> """
> clean_exit = False
>
> try:
> try:
> yield
>
> except SocketTimeout as e:
> # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
> # there is yet no clean way to get at it from this context.
> raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>
> except BaseSSLError as e:
> # FIXME: Is there a better way to differentiate between SSLErrors?
> if "read operation timed out" not in str(e):
> # SSL errors related to framing/MAC get wrapped and reraised here
> raise SSLError(e) from e
>
> raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>
> except (HTTPException, OSError) as e:
> # This includes IncompleteRead.
> > raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(41 bytes read, -41 more expected)', IncompleteRead(41 bytes read, -41 more expected))
>
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
>
> During handling of the above exception, another exception occurred:
>
> self = <tests.test_firewall.TestFirewall testMethod=test_remove_tags>
>
> @responses.activate
> def test_remove_tags(self):
> data = self.load_from_file('firewalls/tags.json')
>
> url = self.base_url + "firewalls/12345/tags"
> responses.add(responses.DELETE, url,
> body=data,
> status=204,
> content_type='application/json')
>
> tag = json.loads(data)["tags"][0]
> > self.firewall.remove_tags([tag])
>
> tests/test_firewall.py:119:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> digitalocean/Firewall.py:232: in remove_tags
> return self.get_data(
> digitalocean/baseapi.py:216: in get_data
> req = self.__perform_request(url, type, params)
> digitalocean/baseapi.py:133: in __perform_request
> return requests_method(url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:671: in delete
> return self.request("DELETE", url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
> resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
> r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
> self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
> def generate():
> # Special case for urllib3.
> if hasattr(self.raw, "stream"):
> try:
> yield from self.raw.stream(chunk_size, decode_content=True)
> except ProtocolError as e:
> > raise ChunkedEncodingError(e)
> E requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(41 bytes read, -41 more expected)', IncompleteRead(41 bytes read, -41 more expected))
>
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> =============================== warnings summary ===============================
> .pybuild/cpython3_3.12_digitalocean/build/tests/test_droplet.py::TestDroplet::test_get_kernel_available_with_pages
> .pybuild/cpython3_3.12_digitalocean/build/tests/test_manager.py::TestManager::test_get_droplet_snapshots
> .pybuild/cpython3_3.12_digitalocean/build/tests/test_manager.py::TestManager::test_get_per_region_volumes
> .pybuild/cpython3_3.12_digitalocean/build/tests/test_manager.py::TestManager::test_get_volume_snapshots
> /usr/lib/python3/dist-packages/responses/__init__.py:436: DeprecationWarning: Argument 'match_querystring' is deprecated. Use 'responses.matchers.query_param_matcher' or 'responses.matchers.query_string_matcher'
> warn(
>
> -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
> =========================== short test summary info ============================
> FAILED tests/test_firewall.py::TestFirewall::test_add_droplets - requests.exc...
> FAILED tests/test_firewall.py::TestFirewall::test_add_tags - requests.excepti...
> FAILED tests/test_firewall.py::TestFirewall::test_remove_droplets - requests....
> FAILED tests/test_firewall.py::TestFirewall::test_remove_tags - requests.exce...
> ================== 4 failed, 148 passed, 4 warnings in 1.06s ===================
> E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build; python3.12 -m pytest
> I: pybuild pybuild:308: cp -r /<<PKGBUILDDIR>>/digitalocean/tests /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build
> I: pybuild base:311: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build; python3.11 -m pytest
> ============================= test session starts ==============================
> platform linux -- Python 3.11.9, pytest-8.2.2, pluggy-1.5.0
> rootdir: /<<PKGBUILDDIR>>
> collected 152 items
>
> tests/test_action.py ..                                                  [  1%]
> tests/test_baseapi.py ....                                               [  3%]
> tests/test_certificate.py ....                                           [  6%]
> tests/test_domain.py .......                                             [ 11%]
> tests/test_droplet.py ............................................       [ 40%]
> tests/test_firewall.py FF.FF                                             [ 43%]
> tests/test_floatingip.py ......                                          [ 47%]
> tests/test_image.py .......                                              [ 51%]
> tests/test_load_balancer.py ..........                                   [ 58%]
> tests/test_manager.py ...............................                    [ 78%]
> tests/test_project.py .........                                          [ 84%]
> tests/test_snapshot.py ..                                                [ 86%]
> tests/test_tag.py .......                                                [ 90%]
> tests/test_volume.py ..........                                          [ 97%]
> tests/test_vpc.py ....                                                   [100%]
>
> =================================== FAILURES ===================================
> ________________________ TestFirewall.test_add_droplets ________________________
>
> self = <urllib3.response.HTTPResponse object at 0x7f1d228def20>
>
> @contextmanager
> def _error_catcher(self) -> typing.Generator[None, None, None]:
> """
> Catch low-level python exceptions, instead re-raising urllib3
> variants, so that low-level exceptions are not leaked in the
> high-level api.
>
> On exit, release the connection back to the pool.
> """
> clean_exit = False
>
> try:
> try:
> > yield
>
> /usr/lib/python3/dist-packages/urllib3/response.py:710:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
> self = <urllib3.response.HTTPResponse object at 0x7f1d228def20>, amt = 10240
>
> def _raw_read(
> self,
> amt: int | None = None,
> ) -> bytes:
> """
> Reads `amt` of bytes from the socket.
> """
> if self._fp is None:
> return None # type: ignore[return-value]
>
> fp_closed = getattr(self._fp, "closed", False)
>
> with self._error_catcher():
> data = self._fp_read(amt) if not fp_closed else b""
> if amt is not None and amt != 0 and not data:
> # Platform-specific: Buggy versions of Python.
> # Close the connection when no data is returned
> #
> # This is redundant to what httplib/http.client _should_
> # already do. However, versions of python released before
> # December 15, 2012 (http://bugs.python.org/issue16298) do
> # not properly close the connection in all cases. There is
> # no harm in redundantly calling close.
> self._fp.close()
> if (
> self.enforce_content_length
> and self.length_remaining is not None
> and self.length_remaining != 0
> ):
> # This is an edge case that httplib failed to cover due
> # to concerns of backward compatibility. We're
> # addressing it here to make sure IncompleteRead is
> # raised during streaming, so all calls with incorrect
> # Content-Length are caught.
> >             raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
> E           urllib3.exceptions.IncompleteRead: IncompleteRead(44 bytes read, -44 more expected)
>
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
>
> The above exception was the direct cause of the following exception:
>
> def generate():
> # Special case for urllib3.
> if hasattr(self.raw, "stream"):
> try:
> > yield from self.raw.stream(chunk_size, decode_content=True)
>
> /usr/lib/python3/dist-packages/requests/models.py:820:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
> data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
> data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
> with self._error_catcher():
> /usr/lib/python3.11/contextlib.py:158: in __exit__
> self.gen.throw(typ, value, traceback)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
> self = <urllib3.response.HTTPResponse object at 0x7f1d228def20>
>
> @contextmanager
> def _error_catcher(self) -> typing.Generator[None, None, None]:
> """
> Catch low-level python exceptions, instead re-raising urllib3
> variants, so that low-level exceptions are not leaked in the
> high-level api.
>
> On exit, release the connection back to the pool.
> """
> clean_exit = False
>
> try:
> try:
> yield
>
> except SocketTimeout as e:
> # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
> # there is yet no clean way to get at it from this context.
> raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>
> except BaseSSLError as e:
> # FIXME: Is there a better way to differentiate between SSLErrors?
> if "read operation timed out" not in str(e):
> # SSL errors related to framing/MAC get wrapped and reraised here
> raise SSLError(e) from e
>
> raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>
> except (HTTPException, OSError) as e:
> # This includes IncompleteRead.
> > raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(44 bytes read, -44 more expected)', IncompleteRead(44 bytes read, -44 more expected))
>
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
>
> During handling of the above exception, another exception occurred:
>
> self = <tests.test_firewall.TestFirewall testMethod=test_add_droplets>
>
> @responses.activate
> def test_add_droplets(self):
> data = self.load_from_file('firewalls/droplets.json')
>
> url = self.base_url + "firewalls/12345/droplets"
> responses.add(responses.POST, url,
> body=data,
> status=204,
> content_type='application/json')
>
> droplet_id = json.loads(data)["droplet_ids"][0]
> > self.firewall.add_droplets([droplet_id])
>
> tests/test_firewall.py:73:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> digitalocean/Firewall.py:202: in add_droplets
> return self.get_data(
> digitalocean/baseapi.py:216: in get_data
> req = self.__perform_request(url, type, params)
> digitalocean/baseapi.py:133: in __perform_request
> return requests_method(url, **kwargs)
> /usr/lib/python3/dist-packages/requests/api.py:115: in post
> return request("post", url, data=data, json=json, **kwargs)
> /usr/lib/python3/dist-packages/requests/api.py:59: in request
> return session.request(method=method, url=url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
> resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
> r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
> self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
> def generate():
> # Special case for urllib3.
> if hasattr(self.raw, "stream"):
> try:
> yield from self.raw.stream(chunk_size, decode_content=True)
> except ProtocolError as e:
> > raise ChunkedEncodingError(e)
> E requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(44 bytes read, -44 more expected)', IncompleteRead(44 bytes read, -44 more expected))
>
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> __________________________ TestFirewall.test_add_tags __________________________
>
> self = <urllib3.response.HTTPResponse object at 0x7f1d22819270>
>
> @contextmanager
> def _error_catcher(self) -> typing.Generator[None, None, None]:
> """
> Catch low-level python exceptions, instead re-raising urllib3
> variants, so that low-level exceptions are not leaked in the
> high-level api.
>
> On exit, release the connection back to the pool.
> """
> clean_exit = False
>
> try:
> try:
> > yield
>
> /usr/lib/python3/dist-packages/urllib3/response.py:710:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
> self = <urllib3.response.HTTPResponse object at 0x7f1d22819270>, amt = 10240
>
> def _raw_read(
> self,
> amt: int | None = None,
> ) -> bytes:
> """
> Reads `amt` of bytes from the socket.
> """
> if self._fp is None:
> return None # type: ignore[return-value]
>
> fp_closed = getattr(self._fp, "closed", False)
>
> with self._error_catcher():
> data = self._fp_read(amt) if not fp_closed else b""
> if amt is not None and amt != 0 and not data:
> # Platform-specific: Buggy versions of Python.
> # Close the connection when no data is returned
> #
> # This is redundant to what httplib/http.client _should_
> # already do. However, versions of python released before
> # December 15, 2012 (http://bugs.python.org/issue16298) do
> # not properly close the connection in all cases. There is
> # no harm in redundantly calling close.
> self._fp.close()
> if (
> self.enforce_content_length
> and self.length_remaining is not None
> and self.length_remaining != 0
> ):
> # This is an edge case that httplib failed to cover due
> # to concerns of backward compatibility. We're
> # addressing it here to make sure IncompleteRead is
> # raised during streaming, so all calls with incorrect
> # Content-Length are caught.
> >             raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
> E           urllib3.exceptions.IncompleteRead: IncompleteRead(41 bytes read, -41 more expected)
>
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
>
> The above exception was the direct cause of the following exception:
>
> def generate():
> # Special case for urllib3.
> if hasattr(self.raw, "stream"):
> try:
> > yield from self.raw.stream(chunk_size, decode_content=True)
>
> /usr/lib/python3/dist-packages/requests/models.py:820:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
> data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
> data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
> with self._error_catcher():
> /usr/lib/python3.11/contextlib.py:158: in __exit__
> self.gen.throw(typ, value, traceback)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
> self = <urllib3.response.HTTPResponse object at 0x7f1d22819270>
>
> @contextmanager
> def _error_catcher(self) -> typing.Generator[None, None, None]:
> """
> Catch low-level python exceptions, instead re-raising urllib3
> variants, so that low-level exceptions are not leaked in the
> high-level api.
>
> On exit, release the connection back to the pool.
> """
> clean_exit = False
>
> try:
> try:
> yield
>
> except SocketTimeout as e:
> # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
> # there is yet no clean way to get at it from this context.
> raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>
> except BaseSSLError as e:
> # FIXME: Is there a better way to differentiate between SSLErrors?
> if "read operation timed out" not in str(e):
> # SSL errors related to framing/MAC get wrapped and reraised here
> raise SSLError(e) from e
>
> raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>
> except (HTTPException, OSError) as e:
> # This includes IncompleteRead.
> > raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(41 bytes read, -41 more expected)', IncompleteRead(41 bytes read, -41 more expected))
>
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
>
> During handling of the above exception, another exception occurred:
>
> self = <tests.test_firewall.TestFirewall testMethod=test_add_tags>
>
> @responses.activate
> def test_add_tags(self):
> data = self.load_from_file('firewalls/tags.json')
>
> url = self.base_url + "firewalls/12345/tags"
> responses.add(responses.POST, url,
> body=data,
> status=204,
> content_type='application/json')
>
> tag = json.loads(data)["tags"][0]
> > self.firewall.add_tags([tag])
>
> tests/test_firewall.py:104:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> digitalocean/Firewall.py:222: in add_tags
> return self.get_data(
> digitalocean/baseapi.py:216: in get_data
> req = self.__perform_request(url, type, params)
> digitalocean/baseapi.py:133: in __perform_request
> return requests_method(url, **kwargs)
> /usr/lib/python3/dist-packages/requests/api.py:115: in post
> return request("post", url, data=data, json=json, **kwargs)
> /usr/lib/python3/dist-packages/requests/api.py:59: in request
> return session.request(method=method, url=url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
> resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
> r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
> self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
> def generate():
> # Special case for urllib3.
> if hasattr(self.raw, "stream"):
> try:
> yield from self.raw.stream(chunk_size, decode_content=True)
> except ProtocolError as e:
> > raise ChunkedEncodingError(e)
> E requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(41 bytes read, -41 more expected)', IncompleteRead(41 bytes read, -41 more expected))
>
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> ______________________ TestFirewall.test_remove_droplets _______________________
>
> self = <urllib3.response.HTTPResponse object at 0x7f1d22798cd0>
>
> @contextmanager
> def _error_catcher(self) -> typing.Generator[None, None, None]:
> """
> Catch low-level python exceptions, instead re-raising urllib3
> variants, so that low-level exceptions are not leaked in the
> high-level api.
>
> On exit, release the connection back to the pool.
> """
> clean_exit = False
>
> try:
> try:
> > yield
>
> /usr/lib/python3/dist-packages/urllib3/response.py:710:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
> self = <urllib3.response.HTTPResponse object at 0x7f1d22798cd0>, amt = 10240
>
> def _raw_read(
> self,
> amt: int | None = None,
> ) -> bytes:
> """
> Reads `amt` of bytes from the socket.
> """
> if self._fp is None:
> return None # type: ignore[return-value]
>
> fp_closed = getattr(self._fp, "closed", False)
>
> with self._error_catcher():
> data = self._fp_read(amt) if not fp_closed else b""
> if amt is not None and amt != 0 and not data:
> # Platform-specific: Buggy versions of Python.
> # Close the connection when no data is returned
> #
> # This is redundant to what httplib/http.client _should_
> # already do. However, versions of python released before
> # December 15, 2012 (http://bugs.python.org/issue16298) do
> # not properly close the connection in all cases. There is
> # no harm in redundantly calling close.
> self._fp.close()
> if (
> self.enforce_content_length
> and self.length_remaining is not None
> and self.length_remaining != 0
> ):
> # This is an edge case that httplib failed to cover due
> # to concerns of backward compatibility. We're
> # addressing it here to make sure IncompleteRead is
> # raised during streaming, so all calls with incorrect
> # Content-Length are caught.
> >             raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
> E           urllib3.exceptions.IncompleteRead: IncompleteRead(44 bytes read, -44 more expected)
>
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
>
> The above exception was the direct cause of the following exception:
>
> def generate():
> # Special case for urllib3.
> if hasattr(self.raw, "stream"):
> try:
> > yield from self.raw.stream(chunk_size, decode_content=True)
>
> /usr/lib/python3/dist-packages/requests/models.py:820:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
> data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
> data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
> with self._error_catcher():
> /usr/lib/python3.11/contextlib.py:158: in __exit__
> self.gen.throw(typ, value, traceback)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
> self = <urllib3.response.HTTPResponse object at 0x7f1d22798cd0>
>
> @contextmanager
> def _error_catcher(self) -> typing.Generator[None, None, None]:
> """
> Catch low-level python exceptions, instead re-raising urllib3
> variants, so that low-level exceptions are not leaked in the
> high-level api.
>
> On exit, release the connection back to the pool.
> """
> clean_exit = False
>
> try:
> try:
> yield
>
> except SocketTimeout as e:
> # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
> # there is yet no clean way to get at it from this context.
> raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>
> except BaseSSLError as e:
> # FIXME: Is there a better way to differentiate between SSLErrors?
> if "read operation timed out" not in str(e):
> # SSL errors related to framing/MAC get wrapped and reraised here
> raise SSLError(e) from e
>
> raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>
> except (HTTPException, OSError) as e:
> # This includes IncompleteRead.
> > raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(44 bytes read, -44 more expected)', IncompleteRead(44 bytes read, -44 more expected))
>
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
>
> During handling of the above exception, another exception occurred:
>
> self = <tests.test_firewall.TestFirewall testMethod=test_remove_droplets>
>
> @responses.activate
> def test_remove_droplets(self):
> data = self.load_from_file('firewalls/droplets.json')
>
> url = self.base_url + "firewalls/12345/droplets"
> responses.add(responses.DELETE,
> url,
> body=data,
> status=204,
> content_type='application/json')
>
> droplet_id = json.loads(data)["droplet_ids"][0]
> > self.firewall.remove_droplets([droplet_id])
>
> tests/test_firewall.py:89:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> digitalocean/Firewall.py:212: in remove_droplets
> return self.get_data(
> digitalocean/baseapi.py:216: in get_data
> req = self.__perform_request(url, type, params)
> digitalocean/baseapi.py:133: in __perform_request
> return requests_method(url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:671: in delete
> return self.request("DELETE", url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
> resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
> r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
> self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
> def generate():
> # Special case for urllib3.
> if hasattr(self.raw, "stream"):
> try:
> yield from self.raw.stream(chunk_size, decode_content=True)
> except ProtocolError as e:
> > raise ChunkedEncodingError(e)
> E requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(44 bytes read, -44 more expected)', IncompleteRead(44 bytes read, -44 more expected))
>
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> ________________________ TestFirewall.test_remove_tags _________________________
>
> self = <urllib3.response.HTTPResponse object at 0x7f1d2379c160>
>
> @contextmanager
> def _error_catcher(self) -> typing.Generator[None, None, None]:
> """
> Catch low-level python exceptions, instead re-raising urllib3
> variants, so that low-level exceptions are not leaked in the
> high-level api.
>
> On exit, release the connection back to the pool.
> """
> clean_exit = False
>
> try:
> try:
> > yield
>
> /usr/lib/python3/dist-packages/urllib3/response.py:710:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
> self = <urllib3.response.HTTPResponse object at 0x7f1d2379c160>, amt = 10240
>
> def _raw_read(
> self,
> amt: int | None = None,
> ) -> bytes:
> """
> Reads `amt` of bytes from the socket.
> """
> if self._fp is None:
> return None # type: ignore[return-value]
>
> fp_closed = getattr(self._fp, "closed", False)
>
> with self._error_catcher():
> data = self._fp_read(amt) if not fp_closed else b""
> if amt is not None and amt != 0 and not data:
> # Platform-specific: Buggy versions of Python.
> # Close the connection when no data is returned
> #
> # This is redundant to what httplib/http.client _should_
> # already do. However, versions of python released before
> # December 15, 2012 (http://bugs.python.org/issue16298) do
> # not properly close the connection in all cases. There is
> # no harm in redundantly calling close.
> self._fp.close()
> if (
> self.enforce_content_length
> and self.length_remaining is not None
> and self.length_remaining != 0
> ):
> # This is an edge case that httplib failed to cover due
> # to concerns of backward compatibility. We're
> # addressing it here to make sure IncompleteRead is
> # raised during streaming, so all calls with incorrect
> # Content-Length are caught.
> >             raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
> E           urllib3.exceptions.IncompleteRead: IncompleteRead(41 bytes read, -41 more expected)
>
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
>
> The above exception was the direct cause of the following exception:
>
> def generate():
> # Special case for urllib3.
> if hasattr(self.raw, "stream"):
> try:
> > yield from self.raw.stream(chunk_size, decode_content=True)
>
> /usr/lib/python3/dist-packages/requests/models.py:820:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
> data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
> data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
> with self._error_catcher():
> /usr/lib/python3.11/contextlib.py:158: in __exit__
> self.gen.throw(typ, value, traceback)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
> self = <urllib3.response.HTTPResponse object at 0x7f1d2379c160>
>
> @contextmanager
> def _error_catcher(self) -> typing.Generator[None, None, None]:
> """
> Catch low-level python exceptions, instead re-raising urllib3
> variants, so that low-level exceptions are not leaked in the
> high-level api.
>
> On exit, release the connection back to the pool.
> """
> clean_exit = False
>
> try:
> try:
> yield
>
> except SocketTimeout as e:
> # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
> # there is yet no clean way to get at it from this context.
> raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>
> except BaseSSLError as e:
> # FIXME: Is there a better way to differentiate between SSLErrors?
> if "read operation timed out" not in str(e):
> # SSL errors related to framing/MAC get wrapped and reraised here
> raise SSLError(e) from e
>
> raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>
> except (HTTPException, OSError) as e:
> # This includes IncompleteRead.
> > raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(41 bytes read, -41 more expected)', IncompleteRead(41 bytes read, -41 more expected))
>
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
>
> During handling of the above exception, another exception occurred:
>
> self = <tests.test_firewall.TestFirewall testMethod=test_remove_tags>
>
> @responses.activate
> def test_remove_tags(self):
> data = self.load_from_file('firewalls/tags.json')
>
> url = self.base_url + "firewalls/12345/tags"
> responses.add(responses.DELETE, url,
> body=data,
> status=204,
> content_type='application/json')
>
> tag = json.loads(data)["tags"][0]
> > self.firewall.remove_tags([tag])
>
> tests/test_firewall.py:119:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> digitalocean/Firewall.py:232: in remove_tags
> return self.get_data(
> digitalocean/baseapi.py:216: in get_data
> req = self.__perform_request(url, type, params)
> digitalocean/baseapi.py:133: in __perform_request
> return requests_method(url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:671: in delete
> return self.request("DELETE", url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
> resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
> r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
> self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
> def generate():
> # Special case for urllib3.
> if hasattr(self.raw, "stream"):
> try:
> yield from self.raw.stream(chunk_size, decode_content=True)
> except ProtocolError as e:
> > raise ChunkedEncodingError(e)
> E requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(41 bytes read, -41 more expected)', IncompleteRead(41 bytes read, -41 more expected))
>
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> =============================== warnings summary ===============================
> .pybuild/cpython3_3.11_digitalocean/build/tests/test_droplet.py::TestDroplet::test_get_kernel_available_with_pages
> .pybuild/cpython3_3.11_digitalocean/build/tests/test_manager.py::TestManager::test_get_droplet_snapshots
> .pybuild/cpython3_3.11_digitalocean/build/tests/test_manager.py::TestManager::test_get_per_region_volumes
> .pybuild/cpython3_3.11_digitalocean/build/tests/test_manager.py::TestManager::test_get_volume_snapshots
> /usr/lib/python3/dist-packages/responses/__init__.py:436: DeprecationWarning: Argument 'match_querystring' is deprecated. Use 'responses.matchers.query_param_matcher' or 'responses.matchers.query_string_matcher'
> warn(
>
> -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
> =========================== short test summary info ============================
> FAILED tests/test_firewall.py::TestFirewall::test_add_droplets - requests.exc...
> FAILED tests/test_firewall.py::TestFirewall::test_add_tags - requests.excepti...
> FAILED tests/test_firewall.py::TestFirewall::test_remove_droplets - requests....
> FAILED tests/test_firewall.py::TestFirewall::test_remove_tags - requests.exce...
> ================== 4 failed, 148 passed, 4 warnings in 1.39s ===================
> E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build; python3.11 -m pytest
> dh_auto_test: error: pybuild --test -i python{version} -p "3.12 3.11" returned exit code 13
The full build log is available from:
http://qa-logs.debian.net/2024/06/15/python-digitalocean_1.16.0-3_unstable.log
All bugs filed during this archive rebuild are listed at:
https://bugs.debian.org/cgi-bin/pkgreport.cgi?tag=ftbfs-20240615;[email protected]
or:
https://udd.debian.org/bugs/?release=na&merged=ign&fnewerval=7&flastmodval=7&fusertag=only&fusertagtag=ftbfs-20240615&[email protected]&allbugs=1&cseverity=1&ctags=1&caffected=1#results
A list of current common problems and possible solutions is available at
http://wiki.debian.org/qa.debian.org/FTBFS . You're welcome to contribute!
If you reassign this bug to another package, please mark it as 'affects'-ing
this package. See https://www.debian.org/Bugs/server-control#affects
If you fail to reproduce this, please provide a build log and diff it with mine
so that we can identify if something relevant changed in the meantime.