On 25/02/2021 00.10, Philippe Mathieu-Daudé wrote:
+Thomas/Daniel/Alex/Peter/Paolo/Stefan/Markus
On 2/24/21 9:02 PM, Niek Linnenbank wrote:
Hi Philippe, Cleber,
[...]
Indeed. Just this morning I received an e-mail from github with the
following text:
"[GitHub] Git LFS disabled for nieklinnenbank
Git LFS has been disabled on your personal account nieklinnenbank
because you’ve exceeded your data plan by at least 150%.
Please purchase additional data packs to cover your bandwidth and
storage usage:
https://github.com/account/billing/data/upgrade
Current usage as of 24 Feb 2021 09:49AM UTC:
Bandwidth: 1.55 GB / 1 GB (155%)
Storage: 0.48 GB / 1 GB (48%)"
I wasn't aware of it, but it appears that GitHub has these quotas for
Large File Storage (LFS). I uploaded the files to Git LFS because
single files are also limited to 100 MiB each in regular Git
repositories.
With those strict limits, in my opinion GitHub isn't really a solution,
since the bandwidth limit will be reached very quickly; at least for the
LFS part, that is. I don't know yet whether there is any limit for regular access.
My current ideas:
- we can try to split up the larger files into sizes < 100 MiB in order
to use regular GitHub storage, and then download each part and combine
them into the final image.
I'm not really in favour of this, but it can work, if GitHub doesn't
have any other limit/quota. The cost is that we have to add more
complexity to the acceptance test code.
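For what it's worth, the split/reassemble mechanics themselves are simple; a minimal sketch with GNU coreutils (the image name and 1 MiB part size are placeholders for demonstration, a real setup would split at e.g. 90 MiB):

```shell
#!/bin/sh
set -e
workdir=$(mktemp -d)

# Stand-in for a real test image (3 MiB of zeros); names are placeholders.
dd if=/dev/zero of="$workdir/image.img" bs=1M count=3 2>/dev/null

# Record a checksum of the full image before splitting.
(cd "$workdir" && sha1sum image.img > image.img.sha1)

# Split into 1 MiB parts with numeric suffixes (image.img.part00, ...).
(cd "$workdir" && split -b 1M -d image.img image.img.part)

# Simulate the download side: only the parts are available.
rm "$workdir/image.img"

# Reassemble (lexical glob order matches the numeric suffixes)
# and verify against the recorded checksum.
(cd "$workdir" && cat image.img.part* > image.img && sha1sum -c image.img.sha1)
```

The complexity Niek mentions would be in teaching the test download logic to fetch N parts and verify the result, not in the splitting itself.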
Well, if you want to go down that road (which I don't really like), you
could also host the binaries on gitlab instead, where our CI is running, so
the binaries would be hosted in the same network as the CI.
- we can try to just update the URLs to armbian that are working now
(with the risk of breaking again in the near future). I've also found
this link, which may be more stable:
https://archive.armbian.com/orangepipc/archive/
I'd give this a try! If we later discover that the links are still not
stable, we can reconsider something else.
- or use the server that I'm hosting - and I don't mind adding the
license files on it if needed (should be GPLv2, right?)
You know that you also have to be ready to provide all the source code
for the binaries that you host, to adhere to the conditions of the GPL?
That should be doable if you originally downloaded it along with the
binaries, but otherwise finding the sources for such binaries can be hard...
I'd be interested to hear your opinion and suggestions.
Kind regards,
Niek
Some of the impractical options I can think of...:
- do not contribute tests using binary blob
That would be a huge step backward to the times when we did not have the
"acceptance" tests yet.
- do not allow test image >100 MiB
Test images should not be too huge anyway, but I don't think we should
introduce such an artificial limit.
- contribute tests with the sha1 of a (big) image but say "if you want
the test image, contact me off-list"; then, when the contributor
stops responding, we remove the test
That does not really scale. And how do you add such a test to the CI? That'd
mean that everybody has to contribute gitlab runners? I don't think that's
feasible either.
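(For the record, the checksum half of that idea is the easy part; a minimal sketch of "ship only a SHA-1 with the test" verification, where the file contents are placeholders and the expected hash is computed inline only to keep the sketch self-contained:)

```shell
#!/bin/sh
# Hypothetical sketch: the test carries a SHA-1 and refuses to use a
# locally-obtained image unless it matches. Names here are made up.
set -e

img=$(mktemp)
printf 'pretend image contents\n' > "$img"

# In a real test this would be a constant committed with the test code;
# here we compute it so the example runs on its own.
expected=$(sha1sum "$img" | awk '{print $1}')

actual=$(sha1sum "$img" | awk '{print $1}')
if [ "$actual" = "$expected" ]; then
    echo "image verified, running test"
else
    echo "image missing or corrupt, skipping test" >&2
fi
```

The hard part, as noted above, is not the verification but that CI has no way to obtain the image in the first place.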
- have anyone set up their own servers with test sources and images,
without committing anything to the repository. Interested
maintainers/testers are on their own.
- testing done behind the scene
That's maybe an option for those tests whose binaries cannot be found on
the internet anymore.
TBH I'm a bit hopeless.
C'mon, most of the "acceptance" tests are just working fine (from the
"download" perspective); it's only a few tests that are troubled. I'd say
let's give it another try with archive.armbian.com, and if that does not
work, we can still consider simply removing those troubled tests.
Thomas