
Cachebeta task: cache base docker image layers #11034

Closed
ygnr opened this issue Jul 31, 2019 · 22 comments

Comments

@ygnr

ygnr commented Jul 31, 2019

Required Information

Question, Bug, or Feature?
Type: Question

Enter Task Name: CacheBeta

Issue Description

Can we use the CacheBeta task to cache Docker images instead of pulling them every time? I tried it with the cache path set to /var/lib/docker on a Hosted Ubuntu agent; as expected, it failed with "permission denied". Is there any way to do this, or is there any documentation?
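As the replies below discuss, one workaround that avoids the root-owned /var/lib/docker is to save/load images into a user-writable directory that the cache task can restore. A minimal sketch (CACHE_DIR, IMAGE, and the tarball naming are placeholders, not from any official sample; the docker commands are echoed so the sketch runs anywhere, so drop the echo on a real agent):

```shell
#!/bin/sh
# Sketch only: cache an image via docker save/load in a directory the Cache
# task can restore, instead of the root-owned /var/lib/docker.
# CACHE_DIR, IMAGE, and the tarball naming are placeholders.
CACHE_DIR="${PIPELINE_WORKSPACE:-/tmp}/docker-image-cache"
IMAGE="ubuntu:18.04"
TARBALL="$CACHE_DIR/$(echo "$IMAGE" | tr '/:' '__').tar"
mkdir -p "$CACHE_DIR"
if [ -f "$TARBALL" ]; then
  # cache hit: restore the image into the local daemon
  echo "docker load -i $TARBALL"
else
  # cache miss: pull once, then snapshot for the next build
  echo "docker pull $IMAGE && docker save -o $TARBALL $IMAGE"
fi
```

Note that, as mentioned later in the thread, docker load can be surprisingly slow, so this mainly pays off for images that are expensive to rebuild or pull.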

@tjhowse

tjhowse commented Aug 2, 2019

I would also like a mechanism for speeding up the acquisition of my docker image layers by either using the cache pipeline task or something else.

@bryanmacfarlane bryanmacfarlane added Area: ArtifactsPackages Azure Artifacts Packaging Team enhancement and removed question route labels Aug 20, 2019
@apluche apluche added Area: ArtifactsCore Azure Artifacts Core Team and removed Area: ArtifactsPackages Azure Artifacts Packaging Team labels Aug 21, 2019
@johnterickson
Contributor

FWIW I had started experimenting with this and there are a few options I came across:

  1. docker save/load (I've found the performance of this to be pretty bad)
  2. use buildctl --import-cache and --export-cache:

     buildctl build ... --export-cache type=local,dest=path/to/output-dir
     buildctl build ... --import-cache type=local,src=path/to/input-dir

     The directory layout conforms to OCI Image Spec v1.0.

     --export-cache options:
       • mode=min (default): only export layers for the resulting image
       • mode=max: export all the layers of all intermediate steps
       • ref=docker.io/user/image:tag: reference for registry cache exporter
       • dest=path/to/output-dir: directory for local cache exporter
     --import-cache options:
       • ref=docker.io/user/image:tag: reference for registry cache importer
       • src=path/to/input-dir: directory for local cache importer
       • digest=sha256:deadbeef: digest of the manifest list to import for local cache importer. Defaults to the digest of the "latest" tag in index.json
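Putting the two flags together, a single invocation can both consume the previous build's layer cache and refresh it for the next one. A sketch (the cache directory is a placeholder, and the command is echoed rather than executed because buildctl needs a running buildkitd):

```shell
#!/bin/sh
# Sketch: one buildctl invocation that imports the previous build's layer
# cache and exports an updated one. CACHE is a placeholder path.
CACHE="${PIPELINE_WORKSPACE:-/tmp}/buildkit-layer-cache"
mkdir -p "$CACHE"
echo buildctl build \
  --frontend=dockerfile.v0 \
  --local context=. --local dockerfile=. \
  --import-cache "type=local,src=$CACHE" \
  --export-cache "type=local,dest=$CACHE"
```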

@Rod-Sychev

FWIW I had started experimenting with this and there are a few options I came across:

  1. docker save/load (I've found the performance of this to be pretty bad)
  2. use buildctl --import-cache and --export-cache

Hey @johnterickson, any feedback regarding these options and outcome?

@johnterickson
Contributor

@Rod-Sychev - I'd recommend going the buildctl route. For whatever reason, "docker load" from a local file was slower than doing a pull from a remote registry. If you don't have a private registry and you have super slow-to-run layers (e.g. building lots of native dependencies) then "docker load/save" might still pay off.

The challenge with buildctl is getting it installed on the agent.
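For reference, that install can be scripted from a release tarball. A sketch, assuming the v0.6.2 linux-amd64 asset naming (check the moby/buildkit releases page for the current version; the steps that need root and network access are left as comments):

```shell
#!/bin/sh
# Sketch: install buildctl/buildkitd on a hosted agent from a release tarball.
# The version and asset name are assumptions; verify against moby/buildkit releases.
BUILDKIT_VERSION="v0.6.2"
BUILDKIT_URL="https://github.com/moby/buildkit/releases/download/${BUILDKIT_VERSION}/buildkit-${BUILDKIT_VERSION}.linux-amd64.tar.gz"
# On the agent you would run something like:
#   curl -sSL "$BUILDKIT_URL" | sudo tar -xz -C /usr/local
#   sudo /usr/local/bin/buildkitd &   # buildctl talks to a running daemon
echo "would fetch $BUILDKIT_URL"
```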

@johnterickson
Contributor

johnterickson commented Sep 24, 2019

@Rod-Sychev btw here's my YML for buildctl tinkering: https://dev.azure.com/codesharing-SU0/cachesandbox/_git/Scripts?path=%2Fdocker.yml&version=GBmaster

Here's what I see:
No cache:
  Total: ~3m

Run 1 (Cache Miss) https://dev.azure.com/codesharing-SU0/cachesandbox/_build/results?buildId=16043&view=results
  docker build: 3m + 30s exporting layers
  storing cache: 14s
  Total: 4m 3s

Run 2 (Cache Hit) https://dev.azure.com/codesharing-SU0/cachesandbox/_build/results?buildId=16048&view=logs&j=12f1170f-54f2-53f3-20dd-22fc7dff55f9
  restore 'buildkit': 10s
  restore docker layers: 12s
  docker build: 1m 12s
  Total: 1m 46s

Run 3 (Deps hit, src change) https://dev.azure.com/codesharing-SU0/cachesandbox/_build/results?buildId=16049&view=results
  restore 'buildkit': 9s
  restore docker layers: 19s
  docker build: 1m 30s
  Total: 2m 8s

Not sure why it's taking docker build so long when all layers are present.

@fadnavistanmay
Contributor

@Rod-Sychev @ygnr @tjhowse - Can you please refer to https://github.com/fadnavistanmay/Scripts/blob/master/docker.yml for caching the docker base image layers? Let us know how it goes.

@stefankip

stefankip commented Oct 21, 2019

I have implemented the steps and scripting from the docker.yml file for a .NET Core microservice application, but it doesn't seem to be any faster, even though the cache is used.
The dotnet restore step in particular still takes some time (unpacking 48.0s done), even though it looks like it should have been cached (note the CACHED output at the end):

#16 [build 5/9] RUN dotnet restore
#16 pulling sha256:e6220b144b9c6c6328e87c1889462088917561bdb4fa73a878066f3f8a89878a
#16 pulling sha256:c05081985e91e6b5326feeb7606d42deb420a10bfa800b4446158093b7b7b8f8
#16 pulling sha256:9a0b0ce99936ce4861d44ce1f193e881e5b40b5bf1847627061205b092fa7f1d
#16 pulling sha256:db3b6004c61a0e86fbf910b9b4a6611ae79e238a336011a1b5f9b177d85cbf9d
#16 pulling sha256:f8f0759202953be4b156f44bba90b682b61f985f9bbc60e7262b216f70dabb96
#16 pulling sha256:6ef14aff1139e1065ec0928ae1c07f2cff8c2b35e760f4b463df5c64e6ea1101
#16 pulling sha256:c527443ccfede68672bb54490c4ee256c1746af05b291928c2cdb0c97beb8545
#16 pulling sha256:6c5e96b85e8cee8b200325cdc8d5cc1ca84678c37eac127ba163f5c94b5d2a60
#16 pulling sha256:d39c626fbbd181fbf5328dc0979c98498454de2180ea8a73e438f00391201759
#16 pulling sha256:86446b9f2ecb592aac25f6fa852b4deff130a6dd5681e84c00a55e295c08bc8a
#16 pulling sha256:c156b8c92f5448b42c13d4a1fb43a5517e7cdd078fc0306575c6567f6a60feca
#16 pulling sha256:c527443ccfede68672bb54490c4ee256c1746af05b291928c2cdb0c97beb8545 0.3s done
#16 pulling sha256:86446b9f2ecb592aac25f6fa852b4deff130a6dd5681e84c00a55e295c08bc8a 0.3s done
#16 pulling sha256:c156b8c92f5448b42c13d4a1fb43a5517e7cdd078fc0306575c6567f6a60feca 0.3s done
#16 pulling sha256:f8f0759202953be4b156f44bba90b682b61f985f9bbc60e7262b216f70dabb96 0.5s done
#16 pulling sha256:c05081985e91e6b5326feeb7606d42deb420a10bfa800b4446158093b7b7b8f8 1.0s done
#16 pulling sha256:db3b6004c61a0e86fbf910b9b4a6611ae79e238a336011a1b5f9b177d85cbf9d 1.2s done
#16 pulling sha256:6ef14aff1139e1065ec0928ae1c07f2cff8c2b35e760f4b463df5c64e6ea1101 3.2s done
#16 pulling sha256:9a0b0ce99936ce4861d44ce1f193e881e5b40b5bf1847627061205b092fa7f1d 4.2s done
#16 pulling sha256:e6220b144b9c6c6328e87c1889462088917561bdb4fa73a878066f3f8a89878a 6.0s done
#16 pulling sha256:6c5e96b85e8cee8b200325cdc8d5cc1ca84678c37eac127ba163f5c94b5d2a60 10.1s done
#16 pulling sha256:d39c626fbbd181fbf5328dc0979c98498454de2180ea8a73e438f00391201759 17.4s done
#16 unpacking
#16 unpacking 48.0s done
#16 pulling sha256:c05081985e91e6b5326feeb7606d42deb420a10bfa800b4446158093b7b7b8f8 0.1s done
#16 pulling sha256:9a0b0ce99936ce4861d44ce1f193e881e5b40b5bf1847627061205b092fa7f1d
#16 pulling sha256:db3b6004c61a0e86fbf910b9b4a6611ae79e238a336011a1b5f9b177d85cbf9d
#16 pulling sha256:f8f0759202953be4b156f44bba90b682b61f985f9bbc60e7262b216f70dabb96
#16 pulling sha256:6ef14aff1139e1065ec0928ae1c07f2cff8c2b35e760f4b463df5c64e6ea1101
#16 pulling sha256:c527443ccfede68672bb54490c4ee256c1746af05b291928c2cdb0c97beb8545
#16 pulling sha256:6c5e96b85e8cee8b200325cdc8d5cc1ca84678c37eac127ba163f5c94b5d2a60
#16 pulling sha256:d39c626fbbd181fbf5328dc0979c98498454de2180ea8a73e438f00391201759
#16 pulling sha256:86446b9f2ecb592aac25f6fa852b4deff130a6dd5681e84c00a55e295c08bc8a
#16 pulling sha256:c156b8c92f5448b42c13d4a1fb43a5517e7cdd078fc0306575c6567f6a60feca 0.0s done
#16 pulling sha256:9a0b0ce99936ce4861d44ce1f193e881e5b40b5bf1847627061205b092fa7f1d 0.1s done
#16 pulling sha256:db3b6004c61a0e86fbf910b9b4a6611ae79e238a336011a1b5f9b177d85cbf9d 0.1s done
#16 pulling sha256:f8f0759202953be4b156f44bba90b682b61f985f9bbc60e7262b216f70dabb96 0.1s done
#16 pulling sha256:6ef14aff1139e1065ec0928ae1c07f2cff8c2b35e760f4b463df5c64e6ea1101 0.1s done
#16 pulling sha256:c527443ccfede68672bb54490c4ee256c1746af05b291928c2cdb0c97beb8545 0.1s done
#16 pulling sha256:6c5e96b85e8cee8b200325cdc8d5cc1ca84678c37eac127ba163f5c94b5d2a60 0.1s done
#16 pulling sha256:d39c626fbbd181fbf5328dc0979c98498454de2180ea8a73e438f00391201759 0.1s done
#16 pulling sha256:86446b9f2ecb592aac25f6fa852b4deff130a6dd5681e84c00a55e295c08bc8a 0.1s done
#16 unpacking 0.1s done
#16 pulling sha256:c05081985e91e6b5326feeb7606d42deb420a10bfa800b4446158093b7b7b8f8 0.1s done
#16 pulling sha256:9a0b0ce99936ce4861d44ce1f193e881e5b40b5bf1847627061205b092fa7f1d 0.1s done
#16 pulling sha256:db3b6004c61a0e86fbf910b9b4a6611ae79e238a336011a1b5f9b177d85cbf9d 0.1s done
#16 pulling sha256:f8f0759202953be4b156f44bba90b682b61f985f9bbc60e7262b216f70dabb96 0.1s done
#16 pulling sha256:6ef14aff1139e1065ec0928ae1c07f2cff8c2b35e760f4b463df5c64e6ea1101 0.1s done
#16 pulling sha256:6c5e96b85e8cee8b200325cdc8d5cc1ca84678c37eac127ba163f5c94b5d2a60 0.1s done
#16 pulling sha256:86446b9f2ecb592aac25f6fa852b4deff130a6dd5681e84c00a55e295c08bc8a 0.0s done
#16 pulling sha256:c527443ccfede68672bb54490c4ee256c1746af05b291928c2cdb0c97beb8545 0.1s done
#16 pulling sha256:d39c626fbbd181fbf5328dc0979c98498454de2180ea8a73e438f00391201759 0.1s done
#16 unpacking 0.1s done
#16 pulling sha256:6ef14aff1139e1065ec0928ae1c07f2cff8c2b35e760f4b463df5c64e6ea1101 0.1s done
#16 pulling sha256:c527443ccfede68672bb54490c4ee256c1746af05b291928c2cdb0c97beb8545 0.0s done
#16 pulling sha256:c05081985e91e6b5326feeb7606d42deb420a10bfa800b4446158093b7b7b8f8 0.1s done
#16 pulling sha256:9a0b0ce99936ce4861d44ce1f193e881e5b40b5bf1847627061205b092fa7f1d 0.1s done
#16 pulling sha256:db3b6004c61a0e86fbf910b9b4a6611ae79e238a336011a1b5f9b177d85cbf9d 0.1s done
#16 pulling sha256:f8f0759202953be4b156f44bba90b682b61f985f9bbc60e7262b216f70dabb96 0.1s done
#16 pulling sha256:6c5e96b85e8cee8b200325cdc8d5cc1ca84678c37eac127ba163f5c94b5d2a60 0.1s done
#16 pulling sha256:d39c626fbbd181fbf5328dc0979c98498454de2180ea8a73e438f00391201759 0.1s done
#16 unpacking 0.1s done
#16 CACHED

@johnterickson
Contributor

@stefankip Can you share some YAML and Dockerfile to give some context?

@stefankip

stefankip commented Oct 23, 2019

I'll check if I can create a test scenario somewhere next week...

@stefankip

stefankip commented Oct 25, 2019

So a small update: I've done some testing with an Angular project using npm as the package manager.
I used this script for running the build: https://dev.azure.com/codesharing-SU0/cachesandbox/_git/Scripts?path=%2Fdocker.yml&version=GBmaster.

  • Building and pushing the docker image without buildkit and cache (plain docker commands from DevOps) took approx. 4m 30s.
  • Building the image with the buildkit and cache script, with cache available, took approx. 4m 40s. The docker cache is only 9.2 MB in size.
  • Building the image with the buildkit and cache script and setting the export-cache mode to max, with cache available, took approx. 1m 30s. The docker cache is 516 MB in size.

@johnterickson
Contributor

Thanks for the tip @stefankip! I've added mode=max to my script.

4m 30s down to 1m 30s sounds like a good improvement!

@stefankip

Yeah, it does. But my .NET Core project didn't see the same improvement. I'll do some more tests with that.

@johnterickson
Contributor

@fadnavistanmay Do you have a docker sample included in your samples repo?

@fadnavistanmay
Contributor

Good point @johnterickson . Let me add it.

@johnterickson
Contributor

A sample is added to https://github.com/fadnavistanmay/azure-pipelines-caching-yaml, which we will later align with the docs.

@ygnr
Author

ygnr commented Dec 1, 2019

@johnterickson Thank you. That worked partially for us, but in our case, in order to run some integration tests we spin up some dependencies in docker containers (via docker-compose), so those images are pulled every time. Any ideas on how I can cache these images?

@johnterickson
Contributor

@ygnr The hosted agents have no persistent state, so the images/layers need to be pulled from somewhere on each build (either from Pipeline Caching or elsewhere).
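One hedged option for the docker-compose dependencies, building on the save/load idea from earlier in the thread: snapshot the pulled images into a cached directory, so the next build loads from the cache instead of pulling. A sketch (CACHE_DIR and the image list are placeholders for your compose services; the docker commands are echoed so the sketch runs without a daemon):

```shell
#!/bin/sh
# Sketch: persist docker-compose dependency images across hosted builds.
# CACHE_DIR and the IMAGES list are placeholders for your compose services.
CACHE_DIR="${PIPELINE_WORKSPACE:-/tmp}/compose-image-cache"
IMAGES="postgres:11 redis:5"
mkdir -p "$CACHE_DIR"
if [ -f "$CACHE_DIR/deps.tar" ]; then
  # cache hit: load the snapshot before docker-compose up
  echo "docker load -i $CACHE_DIR/deps.tar"
else
  # cache miss: pull once, then snapshot all dependency images together
  echo "docker-compose pull && docker save -o $CACHE_DIR/deps.tar $IMAGES"
fi
```

Whether this beats a plain pull depends on the earlier caveat that docker load can be slower than pulling from a nearby registry.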

@wrsx

wrsx commented Dec 16, 2019

@johnterickson Is it possible to demonstrate how we can push the image after it's built with buildctl? We rely on pushing our image to a private registry through an Azure Pipelines service connection.

@stefankip

Yeah, I'm struggling with the same thing: getting the resulting image pushed to ACR.
I tried using the --output type=image,name=acr-url.io/repo:tag,push=false argument and then ran docker images, but the image is not listed :-(

@stefankip

So I found out how we can use the created image.
Add this to the docker command:
--output type=docker,name=some-registry.com/repository:tag
And replace sudo $DOCKER_COMMAND with sudo $DOCKER_COMMAND | docker load.
This way the created image is loaded by docker and you can simply push the image to the registry.
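Assembled end to end, the change described above looks roughly like this (the image name and frontend flags are placeholders modeled on the thread's script; the final pipe and push are commented because they need a running buildkitd and docker daemon):

```shell
#!/bin/sh
# Sketch: build with buildctl, stream the result into docker, then push.
# IMAGE is a placeholder registry/repository:tag.
IMAGE="some-registry.com/repository:tag"
DOCKER_COMMAND="buildctl build --frontend=dockerfile.v0 --local context=. --local dockerfile=."
DOCKER_COMMAND="$DOCKER_COMMAND --output type=docker,name=$IMAGE"
# On the agent:
#   sudo $DOCKER_COMMAND | docker load
#   docker push "$IMAGE"
echo "$DOCKER_COMMAND"
```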

@stefankip

Can someone explain what the GOOFS=0 part of the Buildkit cache key is?
And what is the purpose of the CACHE_KEY_FALLBACK restoreKey?

@joaku

joaku commented Jun 12, 2020

So I found out how we can use the created image.
Add this to the docker command:
--output type=docker,name=some-registry.com/repository:tag
And replace sudo $DOCKER_COMMAND with sudo $DOCKER_COMMAND | docker load.
This way the created image is loaded by docker and you can simply push the image to the registry.

Stefan, I'm trying to follow your instructions. In this block of code, where should I add those options to the docker command?

DOCKER_COMMAND="$(DOCKER_COMMAND)"
if [ -d "$(BUILD_KIT_CACHE)" ]; then
  echo "Will use cached layers from $(BUILD_KIT_CACHE)"
  find $(BUILD_KIT_CACHE)
  DOCKER_COMMAND="$DOCKER_COMMAND --import-cache type=local,src=$(BUILD_KIT_CACHE)"
fi
if [ "$(BuildKitLayersHit)" != "true" ]; then
  echo "Will store cached layers to $(BUILD_KIT_CACHE)"
  DOCKER_COMMAND="$DOCKER_COMMAND --export-cache mode=max,type=local,dest=$(BUILD_KIT_CACHE)"
fi
sudo $DOCKER_COMMAND

EDIT:
When I append it to the DOCKER_COMMAND at the variables definitions I get:

  • sudo buildctl build --frontend=dockerfile.v0 --local context=. --local dockerfile=. -output type=docker,name=.azurecr.io/: --import-cache type=local,src=/home/vsts/work/1/buildkitcache

error: could not read /home/vsts/work/1/buildkitcache/index.json: open /home/vsts/work/1/buildkitcache/index.json: no such file or directory
