On 12/03/2025 06.17, Nicholas Piggin wrote:
Currently the fetch code does not fail gracefully when the retry limit is
exceeded: it just falls out of the loop with no file, which then ends up
hitting other errors.

In preparation for adding more cases where a download gets retried,
add an explicit check for retry limit exceeded.
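
For illustration, after this change the loop is shaped roughly like the
standalone sketch below. This is a simplified example, not the actual
asset.py code; the fetch_with_retries name and its parameters are invented
here. The final iteration exists only to raise once the real download
attempts have been used up:

import os
import urllib.error
import urllib.request

def fetch_with_retries(url, dst_path, attempts=3):
    for retries in range(attempts + 1):
        if retries == attempts:
            # Fail explicitly instead of silently falling out of the loop
            # with no file downloaded.
            raise Exception("Retries exceeded downloading %s" % url)
        try:
            with open(dst_path, "wb") as dst:
                with urllib.request.urlopen(url) as resp:
                    dst.write(resp.read())
            return
        except urllib.error.URLError:
            # Transient failure: drop any partial file and try again.
            if os.path.exists(dst_path):
                os.remove(dst_path)

Keeping the check inside the loop keeps the failure path next to the retry
logic and leaves room for more retryable cases in the loop body later.
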
Signed-off-by: Nicholas Piggin <npig...@gmail.com>
---
tests/functional/qemu_test/asset.py | 5 ++++-
1 file changed, 4 insertions(+), 1 deletion(-)
diff --git a/tests/functional/qemu_test/asset.py b/tests/functional/qemu_test/asset.py
index f0730695f09..6a1c92ffbef 100644
--- a/tests/functional/qemu_test/asset.py
+++ b/tests/functional/qemu_test/asset.py
@@ -116,7 +116,10 @@ def fetch(self):
         self.log.info("Downloading %s to %s...", self.url, self.cache_file)
         tmp_cache_file = self.cache_file.with_suffix(".download")
 
-        for retries in range(3):
+        for retries in range(4):
+            if retries == 3:
+                raise Exception("Retries exceeded downloading %s", self.url)
+
             try:
                 with tmp_cache_file.open("xb") as dst:
                     with urllib.request.urlopen(self.url) as resp:
Reviewed-by: Thomas Huth <th...@redhat.com>