I have some large files in my repository that are slowing down the checkout process. To work around this locally, I can use the following command:

```sh
git clone --no-checkout --filter=blob:limit=1m
```
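For context, a fuller local sequence might look like this sketch (the URL and path are placeholders; the sparse-checkout step keeps the checkout from lazily re-fetching filtered blobs outside the selected paths):

```sh
# partial clone: blobs larger than 1 MiB are omitted from the initial download
git clone --no-checkout --filter=blob:limit=1m https://github.com/OWNER/REPO.git
cd REPO
# only paths under some/dir get checked out, so filtered blobs elsewhere
# are never fetched on demand
git sparse-checkout set some/dir
git checkout main
```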
However, I cannot find any way to pass these additional options to the `checkout` action. Any ideas how to skip large files with this checkout action?
Note: GitLab supports `GIT_FETCH_EXTRA_FLAGS`, which can be used for that purpose. A similar feature would be useful here as well.
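For comparison, the GitLab side is a one-line CI variable (the filter value mirrors the command above):

```yaml
# .gitlab-ci.yml: extra flags passed to the runner's `git fetch`
variables:
  GIT_FETCH_EXTRA_FLAGS: "--filter=blob:limit=1m"
```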
As a hack, I assume one could configure the checkout step along these lines (`actions/checkout@v4` assumed):

```yaml
- uses: actions/checkout@v4
  with:
    # sparse checkout to set the blob:none filter, but include all files
    sparse-checkout: '/*/'
    # allow interacting with the Git history
    fetch-depth: 0
```
That way, we get a clone with intact history, but without downloading the old blobs.
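To illustrate the trade-off (the path below is a placeholder): commit metadata is local, but commands that need old blob contents will fetch them from the remote on demand.

```sh
# works offline: the full commit history was fetched (fetch-depth: 0)
git log --oneline
# triggers on-demand blob downloads: patch output needs old file contents
git log -p -- some/large/file
```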
A real fix is in #1396.
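Assuming that change exposes a `filter` input on the action, usage might look like this sketch:

```yaml
# hypothetical usage, assuming a `filter` input on the checkout action
- uses: actions/checkout@v4
  with:
    filter: 'blob:limit=1m'
```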