podman pull fails with Error: writing blob: storing blob to file "/var/tmp/storage3244944871/5": happened during read: unexpected EOF #17545

Closed
Divya1388 opened this issue Feb 17, 2023 · 17 comments
Labels: kind/bug, locked - please file new issue/PR

Comments

@Divya1388

Issue Description

This is related to #12962
I am trying to pull an image from a private Azure Container Registry. The same pull works fine with Docker, but with Podman it always fails with the error below.
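For illustration, the pull is of this general form (the registry and image names below are placeholders, not the actual ones):

# Hypothetical ACR registry/repository names, for illustration only
podman login myregistry.azurecr.io
podman pull myregistry.azurecr.io/myrepo/myimage:latest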

Steps to reproduce the issue

1.
2.
3.

Describe the results you received

Error: writing blob: storing blob to file "/var/tmp/storage3244944871/5": happened during read: unexpected EOF

Describe the results you expected

successful pull of an image

podman info output

host:
  arch: amd64
  buildahVersion: 1.27.3
  cgroupControllers: []
  cgroupManager: cgroupfs
  cgroupVersion: v1
  conmon:
    package: conmon-2.1.4-1.module+el8.7.0+17498+a7f63b89.x86_64
    path: /usr/bin/conmon
    version: 'conmon version 2.1.4, commit: 419a7c7817d7da098aa7648abddea5014f593ac6'
  cpuUtilization:
    idlePercent: 95.12
    systemPercent: 1.29
    userPercent: 3.59
  cpus: 2
  distribution:
    distribution: '"rhel"'
    version: "8.7"
  eventLogger: file
  hostname: l7lo2arcaue0001
  idMappings:
    gidmap:
    - container_id: 0
      host_id: 1004
      size: 1
    - container_id: 1
      host_id: 296608
      size: 65536
    uidmap:
    - container_id: 0
      host_id: 1003
      size: 1
    - container_id: 1
      host_id: 296608
      size: 65536
  kernel: 4.18.0-425.10.1.el8_7.x86_64
  linkmode: dynamic
  logDriver: k8s-file
  memFree: 534781952
  memTotal: 8140144640
  networkBackend: cni
  ociRuntime:
    name: runc
    package: runc-1.1.4-1.module+el8.7.0+17498+a7f63b89.x86_64
    path: /usr/bin/runc
    version: |-
      runc version 1.1.4
      spec: 1.0.2-dev
      go: go1.18.4
      libseccomp: 2.5.2
  os: linux
  remoteSocket:
    path: /run/user/1003/podman/podman.sock
  security:
    apparmorEnabled: false
    capabilities: CAP_NET_RAW,CAP_CHOWN,CAP_DAC_OVERRIDE,CAP_FOWNER,CAP_FSETID,CAP_KILL,CAP_NET_BIND_SERVICE,CAP_SETFCAP,CAP_SETGID,CAP_SETPCAP,CAP_SETUID,CAP_SYS_CHROOT
    rootless: true
    seccompEnabled: true
    seccompProfilePath: /usr/share/containers/seccomp.json
    selinuxEnabled: true
  serviceIsRemote: false
  slirp4netns:
    executable: /usr/bin/slirp4netns
    package: slirp4netns-1.2.0-2.module+el8.7.0+17498+a7f63b89.x86_64
    version: |-
      slirp4netns version 1.2.0
      commit: 656041d45cfca7a4176f6b7eed9e4fe6c11e8383
      libslirp: 4.4.0
      SLIRP_CONFIG_VERSION_MAX: 3
      libseccomp: 2.5.2
  swapFree: 0
  swapTotal: 0
  uptime: 290h 50m 46.00s (Approximately 12.08 days)
plugins:
  authorization: null
  log:
  - k8s-file
  - none
  - passthrough
  - journald
  network:
  - bridge
  - macvlan
  - ipvlan
  volume:
  - local
registries:
  search:
  - registry.access.redhat.com
  - registry.redhat.io
  - docker.io
store:
  configFile: /home/newuser/.config/containers/storage.conf
  containerStore:
    number: 0
    paused: 0
    running: 0
    stopped: 0
  graphDriverName: overlay
  graphOptions: {}
  graphRoot: /home/newuser/.local/share/containers/storage
  graphRootAllocated: 1063256064
  graphRootUsed: 1050763264
  graphStatus:
    Backing Filesystem: xfs
    Native Overlay Diff: "true"
    Supports d_type: "true"
    Using metacopy: "false"
  imageCopyTmpDir: /var/tmp
  imageStore:
    number: 0
  runRoot: /run/user/1003/containers
  volumePath: /home/newuser/.local/share/containers/storage/volumes
version:
  APIVersion: 4.2.0
  Built: 1670845316
  BuiltTime: Mon Dec 12 11:41:56 2022
  GitCommit: ""
  GoVersion: go1.18.4
  Os: linux
  OsArch: linux/amd64
  Version: 4.2.0

Podman in a container

No

Privileged Or Rootless

Privileged

Upstream Latest Release

Yes

Additional environment details


Additional information


Divya1388 added the kind/bug label on Feb 17, 2023
@Luap99
Member

Luap99 commented Feb 17, 2023

@mtrmac I assume this is the same as #17193?
Although if it always fails with this error, it may have a different cause here.

@mtrmac
Collaborator

mtrmac commented Feb 17, 2023

Yes to both; the relevant retries have been added to the main branch during the past month, and the primary one will land in Podman 4.4.2.

And if it always fails, that is more surprising.

Does it always fail with the same URL? At the same offset? (The Podman 4.4.2 updates also include that kind of data in the error message and/or debug log.)

Or is it more of a random thing that just happens to trigger reliably enough because the image is so large / has so many layers?
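For reference, a rough sketch of how to capture that extra detail once the server side is on 4.4.2 (the image name is a placeholder):

# Keep the full debug log of a failing pull for comparison across attempts;
# on 4.4.2 the error should also include the blob URL/offset information.
podman --log-level=debug pull registry.example.com/myrepo/myimage:latest 2>&1 | tee pull-debug.log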

@Divya1388
Author

@mtrmac it always fails

@killinq-joke

Same on Ubuntu 22.04 (WSL) with a fresh install of Podman, downloading through an HTTP proxy.
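For what it's worth, a minimal sketch of the proxy setup here (the proxy host and port below are placeholders): Podman's registry client uses Go's net/http, which should honor the standard proxy environment variables, so they need to be set in the environment of the process doing the pull.

export HTTP_PROXY=http://proxy.example.com:3128     # hypothetical proxy
export HTTPS_PROXY=http://proxy.example.com:3128
export NO_PROXY=localhost,127.0.0.1
podman pull registry.example.com/myrepo/myimage:latest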

@dpnishant

dpnishant commented Mar 1, 2023

Facing the same issue: podman pull always fails but works just fine with docker pull. Adding some more logs, in case that helps. I believe this has something to do with Podman's retry mechanism and that Docker handles it differently.

OS: macOS 13.0 (22A380)
CPU: Apple M1 Max
Podman version: 4.4.2
Docker version: 20.10.21-rd, build ac29474

~ podman --log-level trace pull --platform linux/amd64 hub.foo.com/myrepo/redis:7-alpine
INFO[0000] podman filtering at log level trace
DEBU[0000] Called pull.PersistentPreRunE(podman --log-level trace pull --platform linux/amd64 hub.foo.com/myrepo/redis:7-alpine)
DEBU[0000] SSH Ident Key "/Users/johndoe/.ssh/podman-machine-default" SHA256:TKNOih0ytjIogknOt8X3Lr0up3nFXcUbkdW9SvswohM ssh-ed25519
DEBU[0000] DoRequest Method: GET URI: http://d/v4.4.2/libpod/_ping
DEBU[0000] Loading registries configuration "/etc/containers/registries.conf"
DEBU[0000] Found credentials for hub.foo.com in credential helper containers-auth.json in file /Users/johndoe/.config/containers/auth.json
DEBU[0000] No credentials matching docker.io found in /Users/johndoe/.config/containers/auth.json
DEBU[0000] No credentials matching docker.io found in /Users/johndoe/.config/containers/auth.json
DEBU[0000] Found an empty credential entry "https://index.docker.io/v1/" in "/Users/johndoe/.docker/config.json" (an unhandled credential helper marker?), moving on
DEBU[0000] No credentials matching docker.io found in /Users/johndoe/.dockercfg
DEBU[0000] No credentials for docker.io found
DEBU[0000] DoRequest Method: POST URI: http://d/v4.4.2/libpod/images/pull
Trying to pull hub.foo.com/myrepo/redis:7-alpine...
Getting image source signatures
Copying blob sha256:b7e40f3e68f994cff07d581cf283381861c65c8d6d2eb4a895a9701528b6c7a9
Copying blob sha256:1a990ecc86f092d159c35dbd34b9495256da8d34d3b30f4a2cae249c4d148139
Copying blob sha256:c158987b05517b6f2c5913f3acef1f2182a32345a304fe357e3ace5fadcad715
Copying blob sha256:f7ed7b580c2cca7a6c8f2f4b9ca7fd5e5ecc98854c0565ad387835c90a822edf
Copying blob sha256:f2520a938316fcd3430657090cec57e14416dafca8e96c09a25e885dcfadc8da
Copying blob sha256:d48b7a41c3f19fca598b9eb6177f68d627d9087576e338be05fb8d2a5bd93c1a
Copying blob sha256:06cb59a4898430bdecebcbff86ad68fd2847861cb1e405b154ddf700cb6e8345
Copying blob sha256:64a96ebe482b607f3dee6f56c50f7020822bee658487dab74416f8db690c86a7
Error: writing blob: storing blob to file "/var/tmp/storage3033008814/4": happened during read: unexpected EOF
DEBU[0108] Shutting down engines
➜  ~ podman machine stop
Waiting for VM to exit...
Machine "podman-machine-default" stopped successfully
➜  ~ docker pull --platform linux/amd64 hub.foo.com/myrepo/redis:7-alpine
Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?
➜  ~ sudo ln -s -f "/Users/$USER/.docker/run/docker.sock" /var/run/docker.sock
Password:
➜  ~ docker pull --platform linux/amd64 hub.foo.com/myrepo/redis:7-alpine
7-alpine: Pulling from myrepo/redis
Digest: sha256:1dd65201ed48c679c21632cd4c58d3704a3420025f7a4dca66a525747d71b69e
Status: Image is up to date for hub.foo.com/myrepo/redis:7-alpine
hub.foo.com/myrepo/redis:7-alpine
➜  ~ docker image rm hub.foo.com/myrepo/redis:7-alpine
Untagged: hub.foo.com/myrepo/redis:7-alpine
Untagged: hub.foo.com/myrepo/redis@sha256:1dd65201ed48c679c21632cd4c58d3704a3420025f7a4dca66a525747d71b69e
Deleted: sha256:f9665ae988dbbcfe8fa085230348c51e5a7a95b0ecb6bdd2ec0d55844e555c0a
Deleted: sha256:c3d67ce932c482d5fc73d41225612d2edd2c16398308ba37c8309c1c8c0b0c67
➜  ~ docker pull --platform linux/amd64 hub.foo.com/myrepo/redis:7-alpine
7-alpine: Pulling from myrepo/redis
c158987b0551: Already exists
1a990ecc86f0: Already exists
f2520a938316: Already exists
f7ed7b580c2c: Already exists
d48b7a41c3f1: Already exists
b7e40f3e68f9: Already exists
06cb59a48984: Already exists
64a96ebe482b: Pull complete
Digest: sha256:1dd65201ed48c679c21632cd4c58d3704a3420025f7a4dca66a525747d71b69e
Status: Downloaded newer image for hub.foo.com/myrepo/redis:7-alpine
hub.foo.com/myrepo/redis:7-alpine
➜  ~

@mtrmac
Collaborator

mtrmac commented Mar 1, 2023

Error: writing blob: storing blob to file "/var/tmp/storage3033008814/4": happened during read: unexpected EOF

Is that really Podman 4.4.2? This seems to be a remote Podman, so the server version (in the VM, I guess) is the one that matters. I would expect a 4.4.2 error to include the additional URL/offset data mentioned above.

Also, the debug-level log that would be useful is the log from the Podman server side, not from the client running on a Mac.

@dpnishant

You are right: I installed it on macOS using Homebrew (brew install podman) and I am running it with podman machine start.

Is there a way to run Podman natively on macOS without the VM? If not, how do I get the debug logs from the Podman server inside the VM?

@rhatdan
Member

rhatdan commented Mar 1, 2023

You cannot run Podman natively on a Mac; it only runs in client-server mode with a VM. Containers are a Linux thing and require a Linux kernel.

@dpnishant

@rhatdan, @mtrmac So how do I get the debug logs of the podman server from inside the VM, for debugging this issue?

@rhatdan
Member

rhatdan commented Mar 3, 2023

You can stop the service and run it manually.

podman --log-level debug system service -t 0
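For a podman machine setup, a rough sketch of doing that inside the VM (assuming the default machine and the socket-activated rootful service):

podman machine ssh                                   # open a shell inside the VM
sudo systemctl stop podman.socket                    # stop the socket-activated API service
sudo podman --log-level debug system service -t 0    # run the API service manually with debug logging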

@dpnishant

I managed to get the logs from both the podman client and server.

OS: macOS 13.0 (22A380)
CPU: Apple M1 Max
Podman (client) version: 4.4.2

~ podman --log-level trace pull --platform linux/amd64 hub.foo.com/apps/myapp:1.0.3
INFO[0000] podman filtering at log level trace
DEBU[0000] Called pull.PersistentPreRunE(podman --log-level trace pull --platform linux/amd64 hub.foo.com/apps/myapp:1.0.3)
DEBU[0000] SSH Ident Key "/Users/johndoe/.ssh/podman-machine-default" SHA256:TKNOih0ytjIogknOt8X3Lr0up3nFXcUbkdW9SvswohM ssh-ed25519
DEBU[0000] DoRequest Method: GET URI: http://d/v4.4.2/libpod/_ping
DEBU[0000] Loading registries configuration "/etc/containers/registries.conf"
DEBU[0000] Found credentials for hub.foo.com in credential helper containers-auth.json in file /Users/johndoe/.config/containers/auth.json
DEBU[0000] No credentials matching docker.io found in /Users/johndoe/.config/containers/auth.json
DEBU[0000] No credentials matching docker.io found in /Users/johndoe/.config/containers/auth.json
DEBU[0000] Found an empty credential entry "https://index.docker.io/v1/" in "/Users/johndoe/.docker/config.json" (an unhandled credential helper marker?), moving on
DEBU[0000] No credentials matching docker.io found in /Users/johndoe/.dockercfg
DEBU[0000] No credentials for docker.io found
DEBU[0000] DoRequest Method: POST URI: http://d/v4.4.2/libpod/images/pull
Trying to pull hub.foo.com/apps/myapp:1.0.3...
Getting image source signatures
Copying blob sha256:3d51a39ba405fc1b46007633c09b5388246bde32fdc9e224fd1d46e15f8cab4f
Copying blob sha256:391aa60d798921957837d6d49b2c8ec1d40090acdb93e7b63e2b10311de10bb1
Copying blob sha256:7608715873ec5c02d370e963aa9b19a149023ce218887221d93fe671b3abbf58
Copying blob sha256:6d87af36c87a2afe3797ea76c0f29e630ce254a41e3183e7c75da10c3afd73ea
Copying blob sha256:18e855ea6c2e5aea18b9594ecf1324124c242798b9cf7bd33a3c0175d0ee1aec
Copying blob sha256:efd2628a9d8be4265cd14a2270cb994333917832af3dab6ba06f14b3f8ea673f
Copying blob sha256:b8bade255407e5aae315a50508ee20045f1bf60870d23a8f2171ffcaacbe3604
Copying blob sha256:580263e2d3156c888372e0695254fcf0c80518e6efcacc524e70afe323928f6e
Copying blob sha256:6068cad611ea0100a9f89066da5246eade076e24a12c48989b51ed63fba8ec22
Copying blob sha256:3cfc1b709af2ad4de3724b037345ef34da299623103202f9d30f7373271b851e
Error: writing blob: storing blob to file "/var/tmp/storage719313567/6": happened during read: unexpected EOF
DEBU[0347] Shutting down engines

OS: Fedora VM (podman machine start)
Podman (server) version: 4.4.1

[root@localhost ~]# podman --log-level debug system service -t 0
INFO[0000] podman filtering at log level debug
DEBU[0000] Called service.PersistentPreRunE(podman --log-level debug system service -t 0)
DEBU[0000] Using conmon: "/usr/bin/conmon"
DEBU[0000] Initializing boltdb state at /var/lib/containers/storage/libpod/bolt_state.db
DEBU[0000] Using graph driver overlay
DEBU[0000] Using graph root /var/lib/containers/storage
DEBU[0000] Using run root /run/containers/storage
DEBU[0000] Using static dir /var/lib/containers/storage/libpod
DEBU[0000] Using tmp dir /run/libpod
DEBU[0000] Using volume path /var/lib/containers/storage/volumes
DEBU[0000] Using transient store: false
DEBU[0000] Set libpod namespace to ""
DEBU[0000] [graphdriver] trying provided driver "overlay"
DEBU[0000] Cached value indicated that overlay is supported
DEBU[0000] Cached value indicated that overlay is supported
DEBU[0000] Cached value indicated that metacopy is being used
DEBU[0000] NewControl(/var/lib/containers/storage/overlay): nextProjectID = 3172503863
DEBU[0000] Cached value indicated that native-diff is not being used
INFO[0000] Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled
DEBU[0000] backingFs=xfs, projectQuotaSupported=true, useNativeDiff=false, usingMetacopy=true
DEBU[0000] Initializing event backend journald
DEBU[0000] Configured OCI runtime runj initialization failed: no valid executable found for OCI runtime runj: invalid argument
DEBU[0000] Configured OCI runtime runsc initialization failed: no valid executable found for OCI runtime runsc: invalid argument
DEBU[0000] Configured OCI runtime crun-wasm initialization failed: no valid executable found for OCI runtime crun-wasm: invalid argument
DEBU[0000] Configured OCI runtime ocijail initialization failed: no valid executable found for OCI runtime ocijail: invalid argument
DEBU[0000] Configured OCI runtime kata initialization failed: no valid executable found for OCI runtime kata: invalid argument
DEBU[0000] Configured OCI runtime youki initialization failed: no valid executable found for OCI runtime youki: invalid argument
DEBU[0000] Configured OCI runtime krun initialization failed: no valid executable found for OCI runtime krun: invalid argument
DEBU[0000] Using OCI runtime "/usr/bin/crun"
INFO[0000] Setting parallel job count to 13
DEBU[0000] registered SIGHUP watcher for config
INFO[0000] API service listening on "/run/podman/podman.sock". URI: "unix:/run/podman/podman.sock"
DEBU[0000] CORS Headers were not set
DEBU[0000] waiting for SIGHUP to reload configuration
DEBU[0000] API service(s) shutting down, idle for 0s
DEBU[0000] API service shutdown request ignored as timeout Duration is UnlimitedService
DEBU[0039] IdleTracker:new 0m+0h/0t connection(s)        X-Reference-Id=0x400012a028
DEBU[0039] IdleTracker:active 0m+0h/1t connection(s)     X-Reference-Id=0x400012a028
@ - - [10/Mar/2023:20:57:37 +0530] "GET /v4.4.2/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
DEBU[0039] IdleTracker:idle 1m+0h/1t connection(s)       X-Reference-Id=0x400012a028
DEBU[0039] IdleTracker:closed 1m+0h/1t connection(s)     X-Reference-Id=0x400012a028
DEBU[0174] IdleTracker:new 0m+0h/1t connection(s)        X-Reference-Id=0x400061c000
DEBU[0174] IdleTracker:active 0m+0h/2t connection(s)     X-Reference-Id=0x400061c000
@ - - [10/Mar/2023:20:59:52 +0530] "GET /v4.4.2/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
DEBU[0174] IdleTracker:idle 1m+0h/2t connection(s)       X-Reference-Id=0x400061c000
DEBU[0174] IdleTracker:closed 1m+0h/2t connection(s)     X-Reference-Id=0x400061c000
DEBU[0174] IdleTracker:new 0m+0h/2t connection(s)        X-Reference-Id=0x4000010010
DEBU[0174] IdleTracker:active 0m+0h/3t connection(s)     X-Reference-Id=0x4000010010
DEBU[0174] Loading registries configuration "/etc/containers/registries.conf"
DEBU[0174] Loading registries configuration "/etc/containers/registries.conf.d/000-shortnames.conf"
DEBU[0174] Loading registries configuration "/etc/containers/registries.conf.d/999-podman-machine.conf"
DEBU[0174] Stored credentials for hub.foo.com in credential helper containers-auth.json
DEBU[0174] Pulling image https://hub.foo.com/apps/myapp:1.0.3 (policy: always)
@ - - [10/Mar/2023:20:59:52 +0530] "POST /v4.4.2/libpod/images/pull?alltags=false&arch=amd64&authfile=&os=linux&password=&policy=always&quiet=false&reference=https%3A%2F%2Fhub.foo.com%2Fapps%2Fmyapp%3A1.0.3&username=&variant= HTTP/1.1" 200 37 "" "Go-http-client/1.1"
DEBU[0174] IdleTracker:idle 1m+0h/3t connection(s)       X-Reference-Id=0x4000010010
DEBU[0174] IdleTracker:closed 1m+0h/3t connection(s)     X-Reference-Id=0x4000010010
DEBU[0192] IdleTracker:new 0m+0h/3t connection(s)        X-Reference-Id=0x400012a148
DEBU[0192] IdleTracker:active 0m+0h/4t connection(s)     X-Reference-Id=0x400012a148
@ - - [10/Mar/2023:21:00:10 +0530] "GET /v4.4.2/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
DEBU[0192] IdleTracker:idle 1m+0h/4t connection(s)       X-Reference-Id=0x400012a148
DEBU[0192] IdleTracker:closed 1m+0h/4t connection(s)     X-Reference-Id=0x400012a148
DEBU[0192] IdleTracker:new 0m+0h/4t connection(s)        X-Reference-Id=0x400061c010
DEBU[0192] IdleTracker:active 0m+0h/5t connection(s)     X-Reference-Id=0x400061c010
DEBU[0192] Stored credentials for hub.foo.com in credential helper containers-auth.json
DEBU[0192] Pulling image hub.foo.com/apps/myapp:1.0.3 (policy: always)
DEBU[0192] Looking up image "hub.foo.com/apps/myapp:1.0.3" in local containers storage
DEBU[0192] Normalized platform linux/amd64 to {amd64 linux  [] }
DEBU[0192] Trying "hub.foo.com/apps/myapp:1.0.3" ...
DEBU[0192] Trying "hub.foo.com/apps/myapp:1.0.3" ...
DEBU[0192] Trying "hub.foo.com/apps/myapp:1.0.3" ...
DEBU[0192] Normalized platform linux/amd64 to {amd64 linux  [] }
DEBU[0192] Attempting to pull candidate hub.foo.com/apps/myapp:1.0.3 for hub.foo.com/apps/myapp:1.0.3
DEBU[0192] parsed reference into "[overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]hub.foo.com/apps/myapp:1.0.3"
DEBU[0192] Copying source image //hub.foo.com/apps/myapp:1.0.3 to destination image [overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]hub.foo.com/apps/myapp:1.0.3
DEBU[0192] Using registries.d directory /etc/containers/registries.d
DEBU[0192] Trying to access "hub.foo.com/apps/myapp:1.0.3"
DEBU[0192] Found credentials for hub.foo.com/apps/myapp in credential helper containers-auth.json in file /var/tmp/auth.json.4044354193
DEBU[0192]  No signature storage configuration found for hub.foo.com/apps/myapp:1.0.3, using built-in default file:///var/lib/containers/sigstore
DEBU[0192] Looking for TLS certificates and private keys in /etc/docker/certs.d/hub.foo.com
DEBU[0192] GET https://hub.foo.com/v2/
DEBU[0194] Ping https://hub.foo.com/v2/ status 401
DEBU[0194] GET https://hub.foo.com/v2/auth?account=johndoe&scope=repository%3Aapps%2Fmyapp%3Apull&service=hub.foo.com
DEBU[0196] Increasing token expiration to: 60 seconds
DEBU[0196] GET https://hub.foo.com/v2/apps/myapp/manifests/1.0.3
DEBU[0197] Content-Type from manifest GET is "application/vnd.docker.distribution.manifest.v2+json"
DEBU[0197] Using blob info cache at /var/lib/containers/cache/blob-info-cache-v1.boltdb
DEBU[0197] IsRunningImageAllowed for image docker:hub.foo.com/apps/myapp:1.0.3
DEBU[0197]  Using default policy section
DEBU[0197]  Requirement 0: allowed
DEBU[0197] Overall: allowed
DEBU[0197] Downloading /v2/apps/myapp/blobs/sha256:cf5492592b3e75afe4c0b110df61994272b9065975bb1211fb880504f8b2768c
DEBU[0197] GET https://hub.foo.com/v2/apps/myapp/blobs/sha256:cf5492592b3e75afe4c0b110df61994272b9065975bb1211fb880504f8b2768c
DEBU[0200] Reading /var/lib/containers/sigstore/apps/myapp@sha256=147dee945dc0cfd4aa38efb6a421d1f3f7fd38379d49f8dce5aab595755c2034/signature-1
DEBU[0200] Not looking for sigstore attachments: disabled by configuration
DEBU[0200] Manifest has MIME type application/vnd.docker.distribution.manifest.v2+json, ordered candidate list [application/vnd.docker.distribution.manifest.v2+json, application/vnd.docker.distribution.manifest.v1+prettyjws, application/vnd.oci.image.manifest.v1+json, application/vnd.docker.distribution.manifest.v1+json]
DEBU[0200] ... will first try using the original manifest unmodified
DEBU[0200] Checking if we can reuse blob sha256:391aa60d798921957837d6d49b2c8ec1d40090acdb93e7b63e2b10311de10bb1: general substitution = true, compression for MIME type "application/vnd.docker.image.rootfs.diff.tar.gzip" = true
DEBU[0200] Checking if we can reuse blob sha256:7608715873ec5c02d370e963aa9b19a149023ce218887221d93fe671b3abbf58: general substitution = true, compression for MIME type "application/vnd.docker.image.rootfs.diff.tar.gzip" = true
DEBU[0200] Checking if we can reuse blob sha256:6d87af36c87a2afe3797ea76c0f29e630ce254a41e3183e7c75da10c3afd73ea: general substitution = true, compression for MIME type "application/vnd.docker.image.rootfs.diff.tar.gzip" = true
DEBU[0200] Checking if we can reuse blob sha256:18e855ea6c2e5aea18b9594ecf1324124c242798b9cf7bd33a3c0175d0ee1aec: general substitution = true, compression for MIME type "application/vnd.docker.image.rootfs.diff.tar.gzip" = true
DEBU[0200] Checking if we can reuse blob sha256:3d51a39ba405fc1b46007633c09b5388246bde32fdc9e224fd1d46e15f8cab4f: general substitution = true, compression for MIME type "application/vnd.docker.image.rootfs.diff.tar.gzip" = true
DEBU[0200] Checking if we can reuse blob sha256:efd2628a9d8be4265cd14a2270cb994333917832af3dab6ba06f14b3f8ea673f: general substitution = true, compression for MIME type "application/vnd.docker.image.rootfs.diff.tar.gzip" = true
DEBU[0200] Failed to retrieve partial blob: blob type not supported for partial retrieval
DEBU[0200] Failed to retrieve partial blob: blob type not supported for partial retrieval
DEBU[0200] Failed to retrieve partial blob: blob type not supported for partial retrieval
DEBU[0200] Failed to retrieve partial blob: blob type not supported for partial retrieval
DEBU[0200] Downloading /v2/apps/myapp/blobs/sha256:391aa60d798921957837d6d49b2c8ec1d40090acdb93e7b63e2b10311de10bb1
DEBU[0200] GET https://hub.foo.com/v2/apps/myapp/blobs/sha256:391aa60d798921957837d6d49b2c8ec1d40090acdb93e7b63e2b10311de10bb1
DEBU[0200] Downloading /v2/apps/myapp/blobs/sha256:7608715873ec5c02d370e963aa9b19a149023ce218887221d93fe671b3abbf58
DEBU[0200] Downloading /v2/apps/myapp/blobs/sha256:6d87af36c87a2afe3797ea76c0f29e630ce254a41e3183e7c75da10c3afd73ea
DEBU[0200] GET https://hub.foo.com/v2/apps/myapp/blobs/sha256:6d87af36c87a2afe3797ea76c0f29e630ce254a41e3183e7c75da10c3afd73ea
DEBU[0200] Failed to retrieve partial blob: blob type not supported for partial retrieval
DEBU[0200] GET https://hub.foo.com/v2/apps/myapp/blobs/sha256:7608715873ec5c02d370e963aa9b19a149023ce218887221d93fe671b3abbf58
DEBU[0200] Downloading /v2/apps/myapp/blobs/sha256:18e855ea6c2e5aea18b9594ecf1324124c242798b9cf7bd33a3c0175d0ee1aec
DEBU[0200] GET https://hub.foo.com/v2/apps/myapp/blobs/sha256:18e855ea6c2e5aea18b9594ecf1324124c242798b9cf7bd33a3c0175d0ee1aec
DEBU[0200] Failed to retrieve partial blob: blob type not supported for partial retrieval
DEBU[0200] Downloading /v2/apps/myapp/blobs/sha256:3d51a39ba405fc1b46007633c09b5388246bde32fdc9e224fd1d46e15f8cab4f
DEBU[0200] GET https://hub.foo.com/v2/apps/myapp/blobs/sha256:3d51a39ba405fc1b46007633c09b5388246bde32fdc9e224fd1d46e15f8cab4f
DEBU[0200] Downloading /v2/apps/myapp/blobs/sha256:efd2628a9d8be4265cd14a2270cb994333917832af3dab6ba06f14b3f8ea673f
DEBU[0200] GET https://hub.foo.com/v2/apps/myapp/blobs/sha256:efd2628a9d8be4265cd14a2270cb994333917832af3dab6ba06f14b3f8ea673f
DEBU[0202] Detected compression format gzip
DEBU[0202] Using original blob without modification
DEBU[0202] Detected compression format gzip
DEBU[0202] Using original blob without modification
DEBU[0202] Detected compression format gzip
DEBU[0202] Using original blob without modification
DEBU[0203] Detected compression format gzip
DEBU[0203] Using original blob without modification
DEBU[0203] Detected compression format gzip
DEBU[0203] Using original blob without modification
DEBU[0203] Detected compression format gzip
DEBU[0203] Using original blob without modification
DEBU[0203] Checking if we can reuse blob sha256:b8bade255407e5aae315a50508ee20045f1bf60870d23a8f2171ffcaacbe3604: general substitution = true, compression for MIME type "application/vnd.docker.image.rootfs.diff.tar.gzip" = true
DEBU[0203] Failed to retrieve partial blob: blob type not supported for partial retrieval
DEBU[0203] Downloading /v2/apps/myapp/blobs/sha256:b8bade255407e5aae315a50508ee20045f1bf60870d23a8f2171ffcaacbe3604
DEBU[0203] GET https://hub.foo.com/v2/apps/myapp/blobs/sha256:b8bade255407e5aae315a50508ee20045f1bf60870d23a8f2171ffcaacbe3604
DEBU[0203] Checking if we can reuse blob sha256:580263e2d3156c888372e0695254fcf0c80518e6efcacc524e70afe323928f6e: general substitution = true, compression for MIME type "application/vnd.docker.image.rootfs.diff.tar.gzip" = true
DEBU[0203] Failed to retrieve partial blob: blob type not supported for partial retrieval
DEBU[0203] Downloading /v2/apps/myapp/blobs/sha256:580263e2d3156c888372e0695254fcf0c80518e6efcacc524e70afe323928f6e
DEBU[0203] GET https://hub.foo.com/v2/apps/myapp/blobs/sha256:580263e2d3156c888372e0695254fcf0c80518e6efcacc524e70afe323928f6e
DEBU[0205] Detected compression format gzip
DEBU[0205] Using original blob without modification
DEBU[0205] Checking if we can reuse blob sha256:6068cad611ea0100a9f89066da5246eade076e24a12c48989b51ed63fba8ec22: general substitution = true, compression for MIME type "application/vnd.docker.image.rootfs.diff.tar.gzip" = true
DEBU[0205] Failed to retrieve partial blob: blob type not supported for partial retrieval
DEBU[0205] Downloading /v2/apps/myapp/blobs/sha256:6068cad611ea0100a9f89066da5246eade076e24a12c48989b51ed63fba8ec22
DEBU[0205] GET https://hub.foo.com/v2/apps/myapp/blobs/sha256:6068cad611ea0100a9f89066da5246eade076e24a12c48989b51ed63fba8ec22
DEBU[0206] Detected compression format gzip
DEBU[0206] Using original blob without modification
DEBU[0206] Checking if we can reuse blob sha256:3cfc1b709af2ad4de3724b037345ef34da299623103202f9d30f7373271b851e: general substitution = true, compression for MIME type "application/vnd.docker.image.rootfs.diff.tar.gzip" = true
DEBU[0206] Failed to retrieve partial blob: blob type not supported for partial retrieval
DEBU[0206] Downloading /v2/apps/myapp/blobs/sha256:3cfc1b709af2ad4de3724b037345ef34da299623103202f9d30f7373271b851e
DEBU[0206] GET https://hub.foo.com/v2/apps/myapp/blobs/sha256:3cfc1b709af2ad4de3724b037345ef34da299623103202f9d30f7373271b851e
DEBU[0208] Detected compression format gzip
DEBU[0208] Using original blob without modification
DEBU[0209] Detected compression format gzip
DEBU[0209] Using original blob without modification
DEBU[0252] Applying tar in /var/lib/containers/storage/overlay/d543b8cad89e3428ac8852a13cb2dbfaf55b1e10fd95a9753e51faf393d60e81/diff
DEBU[0253] Applying tar in /var/lib/containers/storage/overlay/bfb1c4f61e6894b03a55984bc11db11961c3d643810f5760e985e8272bfdbc0a/diff
DEBU[0253] Applying tar in /var/lib/containers/storage/overlay/6167e4f5b9a0f3c6eb6ceb81fd6733ea9e6947afe5d0c899843cca43da814ea6/diff
DEBU[0254] Applying tar in /var/lib/containers/storage/overlay/a5772f0ce8cebd21424578f3f2b59aacbdce8621ad9b5fbeedd75d8e440c09d0/diff
DEBU[0539] Error pulling candidate hub.foo.com/apps/myapp:1.0.3: writing blob: storing blob to file "/var/tmp/storage719313567/6": happened during read: unexpected EOF
@ - - [10/Mar/2023:21:00:10 +0530] "POST /v4.4.2/libpod/images/pull?alltags=false&arch=amd64&authfile=&os=linux&password=&policy=always&quiet=false&reference=hub.foo.com%2Fapps%2Fmyapp%3A1.0.3&username=&variant= HTTP/1.1" 200 1269 "" "Go-http-client/1.1"
DEBU[0539] IdleTracker:idle 1m+0h/5t connection(s)       X-Reference-Id=0x400061c010
DEBU[0539] IdleTracker:closed 1m+0h/5t connection(s)     X-Reference-Id=0x400061c010

The server's log activity got stuck for ~5 minutes after the DEBU[0254] line and then finally failed. FWIW, hub.foo.com is running Red Hat Quay. This, again, worked fine with the docker CLI (see below).

~ docker pull --platform linux/amd64 hub.foo.com/apps/myapp:1.0.3
1.0.3: Pulling from apps/myapp
7608715873ec: Pull complete
391aa60d7989: Pull complete
18e855ea6c2e: Pull complete
efd2628a9d8b: Pull complete
6d87af36c87a: Pull complete
3d51a39ba405: Pull complete
b8bade255407: Pull complete
580263e2d315: Pull complete
6068cad611ea: Pull complete
3cfc1b709af2: Pull complete
Digest: sha256:147dee945dc0cfd4aa38efb6a421d1f3f7fd38379d49f8dce5aab595755c2034
Status: Downloaded newer image for hub.foo.com/apps/myapp:1.0.3
hub.foo.com/apps/myapp:1.0.3

Let me know if you need anything else.
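If it helps, I can also try to reproduce this outside of Podman's pull path from inside the VM using skopeo, which uses the same containers/image library; a sketch (the output directory is arbitrary, and a skopeo login hub.foo.com may be needed first):

# --debug prints each HTTP request; dir: writes the raw manifest and blobs to a local directory
skopeo --debug copy docker://hub.foo.com/apps/myapp:1.0.3 dir:/tmp/myapp-blobs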

@mtrmac
Collaborator

mtrmac commented Mar 10, 2023

OK, the Podman inside the podman machine VM needs to be updated to 4.4.2. I don't know how that's done.

@rhatdan
Member

rhatdan commented Mar 13, 2023

You need to get a newer podman machine.
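In practice that means recreating the machine, which pulls a fresh VM image with a newer Podman inside; a sketch, assuming the default machine name and that the VM's local images and containers can be discarded:

podman machine stop
podman machine rm podman-machine-default
podman machine init
podman machine start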

@nagi-daintta

"You need to get a newer podman machine."
This won't help; the problem persists with podman version 4.5.0.

@mtrmac
Collaborator

mtrmac commented May 9, 2023

@nagi-daintta If so, please file a new issue, with all the data required by the bug reporting form, as well as other data suggested in this issue. “Problem persists” is not enough to diagnose it.
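At a minimum, that means gathering something like the following from the failing environment (the server side, for podman machine setups); the image name is a placeholder:

podman version
podman info
podman --log-level=debug pull registry.example.com/myrepo/myimage:latest 2>&1 | tee pull-debug.log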

@vchinchambekar

vchinchambekar commented Jul 24, 2023

Hi all,
I'm also facing the same issue with podman pull; my podman version is 4.4.1-14.
OS: Red Hat Enterprise Linux release 8.8 (Ootpa)
Any help, please?

Error: writing blob: storing blob to file "/var/tmp/storage3177196635/3": happened during read: Digest did not match, expected sha256:bcd97

@mtrmac
Collaborator

mtrmac commented Jul 24, 2023

@vchinchambekar “digest did not match” is not “unexpected EOF”.

Again,

please file a new issue, with all the data required by the bug reporting form, as well as other data suggested in this issue.

github-actions bot added the locked - please file new issue/PR label on Oct 23, 2023
github-actions bot locked as resolved and limited conversation to collaborators on Oct 23, 2023