
Git repo size is ~6Gb #3

Open
xavriley opened this issue Jan 24, 2024 · 0 comments
Congratulations on getting this released! I've been looking forward to working with it since ISMIR.

A small issue you might already know about: it looks like the pretrained models have already been moved to Google Drive, but they still appear to be in this GitHub repo's history, as the clone comes in at 5.77 GiB.

$ git clone git@github.com:polimi-ispl/larsnet.git
Cloning into 'larsnet'...
remote: Enumerating objects: 3487, done.
remote: Counting objects: 100% (255/255), done.
remote: Compressing objects: 100% (204/204), done.
remote: Total 3487 (delta 89), reused 196 (delta 47), pack-reused 3232
Receiving objects: 100% (3487/3487), 5.77 GiB | 12.38 MiB/s, done.
Resolving deltas: 100% (827/827), done.

One option would be to add a note to the README about doing a shallow clone (--depth 1) for people who don't need the history.
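A minimal sketch of what that README note is getting at, demonstrated against a throwaway local repo (so it needs no network access or the real multi-gigabyte clone) to confirm that --depth 1 fetches only the latest commit:

```shell
#!/bin/sh
set -e
# Build a throwaway repo with two commits, then shallow-clone it.
# (The file:// prefix matters: plain local-path clones ignore --depth.)
tmp=$(mktemp -d)
git init -q "$tmp/origin_repo"
git -C "$tmp/origin_repo" -c user.email=a@b -c user.name=demo \
    commit -q --allow-empty -m "first"
git -C "$tmp/origin_repo" -c user.email=a@b -c user.name=demo \
    commit -q --allow-empty -m "second"
git clone -q --depth 1 "file://$tmp/origin_repo" "$tmp/shallow_copy"
# The shallow copy's history is truncated to a single commit:
git -C "$tmp/shallow_copy" rev-list --count HEAD
```

For this repo the README line itself would just be `git clone --depth 1 git@github.com:polimi-ispl/larsnet.git`.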

The more drastic option would be to delete the large file(s) from the git repo using something like:

git filter-branch --force --index-filter \
  'git rm --cached --ignore-unmatch path/to/large_model_file.pth' \
  --prune-empty --tag-name-filter cat -- --all
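Before running a rewrite like that, you'd want to confirm which paths actually account for the bulk of the 5.77 GiB (the model path above is only a placeholder). A sketch using just stock git plus awk/sort, shown here on a throwaway repo with a stand-in "large model" file:

```shell
#!/bin/sh
set -e
# Demo repo: one large file and one small file in a single commit.
tmp=$(mktemp -d)
git init -q "$tmp/repo"
head -c 100000 /dev/zero > "$tmp/repo/model.bin"   # stand-in large file
echo "readme" > "$tmp/repo/README.md"
git -C "$tmp/repo" add -A
git -C "$tmp/repo" -c user.email=a@b -c user.name=demo commit -qm "add files"
# The actual recipe: list every object reachable from any ref, keep only
# blobs, and sort by size so the biggest candidates for removal come first.
git -C "$tmp/repo" rev-list --objects --all |
  git -C "$tmp/repo" cat-file \
      --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' |
  awk '$1 == "blob" { print $3, $4 }' |
  sort -rn | head -n 5
```

model.bin should be the first line of the output; in the real repo, run the same pipeline from inside the clone to find the actual paths to remove.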

You'd then need to force-push to GitHub to overwrite the previous history (and anyone with an existing clone would need to re-clone afterwards).
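A local sketch of why the force is needed, with a throwaway bare repo standing in for GitHub: after the rewrite the branch tip no longer fast-forwards from the remote's, so a plain push is rejected.

```shell
#!/bin/sh
set -e
# Throwaway bare repo standing in for GitHub.
tmp=$(mktemp -d)
git init -q --bare "$tmp/remote.git"
git clone -q "$tmp/remote.git" "$tmp/work" 2>/dev/null
cd "$tmp/work"
git checkout -q -b main
git -c user.email=a@b -c user.name=demo commit -q --allow-empty -m "original history"
git push -q origin main
# Simulate a history rewrite (amending replaces the commit hash)...
git -c user.email=a@b -c user.name=demo commit -q --amend --allow-empty -m "rewritten history"
# ...after which a plain push is rejected as non-fast-forward:
! git push -q origin main 2>/dev/null
# Only a forced push overwrites the remote's refs:
git push -q --force origin main
```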

As I say, it's not a big deal, but it might make the repo easier to live with over time.
