CI is using up too much git-lfs bandwidth #100

GitHub puts a quota on git-lfs bandwidth usage, as well as on storage space. We are nowhere near the storage limit, but because every CI run pulls down all the LFS data once for every version of Python tested, it eats through the free quota very quickly.
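For illustration, a test matrix like the hypothetical excerpt below multiplies the downloads: each matrix job performs its own checkout, so three Python versions means three full pulls of the LFS data per CI run (the version list and action pins here are illustrative, not taken from the repo's actual workflow):

```yaml
jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.8", "3.9", "3.10"]
    steps:
      # Each of the three matrix jobs runs this step independently, so all
      # LFS objects are downloaded three times, all counted against the quota.
      - uses: actions/checkout@v4
        with:
          lfs: true
```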
Comments
@ml-evs I don't know much about the new GitHub Actions CI setup - can you look into this? There ought to be a way to cache data between runs.
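(For context, the usual manual recipe for caching LFS data between runs is roughly the sketch below: restore the local LFS object store from the Actions cache, keyed on the set of LFS object IDs, so that `git lfs pull` only downloads on a cache miss. Step names and version pins are illustrative.)

```yaml
steps:
  - uses: actions/checkout@v4   # plain checkout: LFS files stay as pointers

  - name: List LFS object IDs to build a cache key
    run: git lfs ls-files --long | cut -d ' ' -f1 | sort > .lfs-assets-id

  - uses: actions/cache@v4
    with:
      path: .git/lfs            # the local LFS object store
      key: lfs-${{ hashFiles('.lfs-assets-id') }}

  - name: Fetch LFS data
    # Downloads only objects missing from the restored store (none on a
    # cache hit), then replaces the pointer files in the working tree.
    run: git lfs pull
```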
I've just opened #101, which hopefully fixes this by making use of https://github.com/nschloe/action-cached-lfs-checkout until GitHub adds native support for caching LFS data (or, ideally, stops charging for internal network bandwidth). I think the LFS quota is much more generous on GitHub's free "academic team" plan. Depending on how you view things, the echemdata organisation might be upgradable to this tier as long as someone in the organisation retains an academic affiliation (and as long as the org continues to be used for things like galvani).
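Wiring that action in should look roughly like the sketch below (based on the action's README; the version pin is illustrative and any extra inputs should be checked there):

```yaml
steps:
  # Drop-in replacement for actions/checkout that stores LFS objects in the
  # Actions cache, so repeat runs restore them instead of re-downloading.
  - name: Checkout (with cached LFS)
    uses: nschloe/action-cached-lfs-checkout@v1
```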
#101 doesn't seem to be having much effect; I'm not sure if that's because the bandwidth for the month has already been exceeded. Unfortunately, this is a consequence of not having an up-to-date release on PyPI: anyone who wants to install the dev version is either failing (and using up git-lfs bandwidth in the process) or having to disable LFS.
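(For anyone hitting this: a common way to disable LFS for the install, assuming the package itself doesn't need the LFS files at runtime, is to skip the LFS smudge during the clone. The repository URL below is inferred from context.)

```sh
# git-lfs honours this variable and leaves pointer files in place of the
# large files, so the install consumes no LFS bandwidth.
GIT_LFS_SKIP_SMUDGE=1 pip install git+https://github.com/echemdata/galvani
```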
I still got GitHub Actions errors yesterday, even though it is the beginning of a new month. I think this might be the last straw that finally pushes me over the edge to migrate to https://sourcehut.org/ - I would certainly rather pay them than Microsoft.