
Error: cannot acquire lock: Lock FcntlFlock of /export/repo.lock failed: resource temporarily unavailable #6363

Closed
ninkisa opened this issue May 22, 2019 · 12 comments
Labels
kind/support A question or request for support

Comments

@ninkisa

ninkisa commented May 22, 2019

Version information:

ipfs version --all
go-ipfs version: 0.4.21-rc1-
Repo version: 7
System version: amd64/linux
Golang version: go1.12.5

Description:

We are using ipfs in a docker container. After a restart of the container, it started failing with:
ipfs id
Error: cannot acquire lock: Lock FcntlFlock of /export/repo.lock failed: resource temporarily unavailable

I tried to clean the lock files with ipfs repo fsck, but it didn't fix the problem:

ipfs repo fsck
Lockfiles have been removed.
ipfs id
Error: could not build arguments for function "reflect".makeFuncStub (/usr/local/go/src/reflect/asm_amd64.s:12): failed to build *mfs.Root: function "github.com/ipfs/go-ipfs/core/node".Files (/go-ipfs/core/node/core.go:74) returned a non-nil error: error loading filesroot from DAG: merkledag: not found

I also tried manually deleting ~/.ipfs/datastore/LOCK and ~/.ipfs/repo.lock, with the same result.

@Stebalien Stebalien added the kind/bug A bug in existing code (including security flaws) label May 22, 2019
@eingenito
Contributor

Hey @ninkisa. Is this problem occurring with the official docker image available at https://hub.docker.com/r/ipfs/go-ipfs/builds? Or are you building your own? And if you are, can you point us to an image we can try?

Thanks.

@ninkisa
Author

ninkisa commented May 23, 2019

Hi, yes, I'm using the official docker image ipfs/go-ipfs. Originally I was running v0.4.20, but I decided to try the latest release candidate v0.4.21-rc1 to see if there was a difference. Running the container with an empty folder works fine, but with the existing repository it fails with the above error. Any idea what to check or how to fix it?

@ninkisa
Author

ninkisa commented May 23, 2019

I'm attaching some output from the commands I tried:
error.txt

@Stebalien
Member

Error: cannot acquire lock: Lock FcntlFlock of /export/repo.lock failed: resource temporarily unavailable

This usually means that some other process still has the lock. Could you check lsof /path/to/repo/repo.lock?
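That message is the stock OS error text for EAGAIN: a non-blocking lock attempt found the file already locked by another process. A minimal two-process sketch (plain Python for illustration, not go-ipfs code; the path is a made-up stand-in for /export/repo.lock) reproduces it:

```python
import fcntl
import multiprocessing
import os

LOCK_PATH = "/tmp/demo_repo.lock"  # hypothetical stand-in for /export/repo.lock

def hold_lock(acquired, release):
    # The "daemon": take an exclusive fcntl-style lock and hold it.
    fd = os.open(LOCK_PATH, os.O_RDWR | os.O_CREAT)
    fcntl.lockf(fd, fcntl.LOCK_EX)
    acquired.set()
    release.wait()

if __name__ == "__main__":
    acquired = multiprocessing.Event()
    release = multiprocessing.Event()
    daemon = multiprocessing.Process(target=hold_lock, args=(acquired, release))
    daemon.start()
    acquired.wait()

    # A second process (this one) tries a non-blocking lock and fails.
    fd = os.open(LOCK_PATH, os.O_RDWR | os.O_CREAT)
    try:
        fcntl.lockf(fd, fcntl.LOCK_EX | fcntl.LOCK_NB)
    except OSError as e:
        # Typically "Resource temporarily unavailable" (EAGAIN) on Linux.
        print("cannot acquire lock:", os.strerror(e.errno))

    release.set()
    daemon.join()
```

lsof on the lock file shows which process holds it open, which is why it is the right first diagnostic here.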

Error: could not build arguments for function "reflect".makeFuncStub (/usr/local/go/src/reflect/asm_amd64.s:12): failed to build *mfs.Root: function "github.com/ipfs/go-ipfs/core/node".Files (/go-ipfs/core/node/core.go:74) returned a non-nil error: error loading filesroot from DAG: merkledag: not found

Could you post your config? (ipfs config show).

@ninkisa
Author

ninkisa commented May 27, 2019

Hello Sebastian,

Here is the output from lsof /path/to/repo/repo.lock:

/ $ lsof /export/repo.lock
1 /sbin/tini /dev/null
1 /sbin/tini pipe:[851888]
1 /sbin/tini pipe:[851889]
6 /usr/local/bin/ipfs /dev/null
6 /usr/local/bin/ipfs pipe:[851888]
6 /usr/local/bin/ipfs pipe:[851889]
6 /usr/local/bin/ipfs /export/datastore/006076.log
6 /usr/local/bin/ipfs anon_inode:[eventpoll]
6 /usr/local/bin/ipfs /export/repo.lock
6 /usr/local/bin/ipfs /export/datastore/LOCK
6 /usr/local/bin/ipfs /export/datastore/LOG
6 /usr/local/bin/ipfs /export/datastore/MANIFEST-006077
6 /usr/local/bin/ipfs socket:[10472371]
6 /usr/local/bin/ipfs /export/datastore/006079.ldb
6 /usr/local/bin/ipfs socket:[10472372]
6 /usr/local/bin/ipfs socket:[10472373]
6 /usr/local/bin/ipfs /export/datastore/006081.ldb
6 /usr/local/bin/ipfs socket:[855470]
6 /usr/local/bin/ipfs socket:[853982]
6 /usr/local/bin/ipfs /export/datastore/006082.ldb
6 /usr/local/bin/ipfs socket:[853990]
2599 /bin/sh /dev/pts/0
2599 /bin/sh /dev/pts/0
2599 /bin/sh /dev/pts/0
2599 /bin/sh /dev/tty

I ran 'ipfs repo fsck' again, and lsof after that:

/ $ lsof /export/repo.lock
1 /sbin/tini /dev/null
1 /sbin/tini pipe:[851888]
1 /sbin/tini pipe:[851889]
6 /usr/local/bin/ipfs /dev/null
6 /usr/local/bin/ipfs pipe:[851888]
6 /usr/local/bin/ipfs pipe:[851889]
6 /usr/local/bin/ipfs /export/datastore/006076.log
6 /usr/local/bin/ipfs anon_inode:[eventpoll]
6 /usr/local/bin/ipfs /export/repo.lock (deleted)
6 /usr/local/bin/ipfs /export/datastore/LOCK (deleted)
6 /usr/local/bin/ipfs /export/datastore/LOG
6 /usr/local/bin/ipfs /export/datastore/MANIFEST-006077
6 /usr/local/bin/ipfs /export/datastore/006079.ldb
6 /usr/local/bin/ipfs /export/datastore/006081.ldb
6 /usr/local/bin/ipfs socket:[855470]
6 /usr/local/bin/ipfs socket:[853982]
6 /usr/local/bin/ipfs /export/datastore/006082.ldb
6 /usr/local/bin/ipfs socket:[853990]
2599 /bin/sh /dev/pts/0
2599 /bin/sh /dev/pts/0
2599 /bin/sh /dev/pts/0
2599 /bin/sh /dev/tty

Here is the ipfs config:
config.txt

@Stebalien
Member

According to that, IPFS is still running.

@ninkisa
Author

ninkisa commented May 28, 2019

Yes, it is running, but it is not responding:

$ ipfs id

13:46:26.065 DEBUG cmd/ipfs: config path is /export main.go:139
13:46:26.066 WARNI fsrepo: NoSync is now deprecated in favor of datastore specific settings. If you want to disable fsync on flatfs set 'sync' to false. See https://github.com/ipfs/go-ipfs/blob/master/docs/datastores.md#flatfs. fsrepo.go:406
13:46:26.074 DEBUG blockservi: BlockService GetBlock: 'QmSszb8dKw36z66Xtm6cfkjnQqLxaC9XXffz7B6g5hN4Kk' blockservice.go:199
13:46:26.075 DEBUG blockservi: BlockService GetBlock: 'QmWjP6wKfcY9jPLyN3ge9F5kPzPq9f1WkMSM3tZk9afmvF' blockservice.go:199
13:46:26.078 DEBUG blockservi: BlockService GetBlock: 'QmSKboVigcD3AY4kLsob117KJcMHvMUu6vNFqk1PQzYUpp' blockservice.go:199
13:46:26.079 DEBUG blockservi: BlockService GetBlock: 'QmWv3PNQKnWBNBGqy5TqVoYJwKV4yhR9i5EbwsQ2qa4daV' blockservice.go:199
13:46:26.079 DEBUG blockservi: Blockservice: Searching bitswap blockservice.go:229
13:46:26.079 DEBUG core: core is shutting down... core.go:669
13:46:26.079 DEBUG blockservi: blockservice is shutting down... blockservice.go:321
Error: error loading filesroot from DAG: merkledag: not found
If I run it in a clean repository it is fine, but I need the data in the current repo.

How can I repair it?

@ninkisa
Author

ninkisa commented May 30, 2019

Yesterday the same thing happened on another machine. First we removed the block folder to see if that would fix anything, but with no result. We had to delete the datastore folder and re-init the repository to make it work, but this is not a solution, as we would have to add all files to IPFS again.
Any ideas?

@Stebalien
Member

Did you use ipfs repo fsck on this other machine? If you run that on a machine that's actually running IPFS, you'll corrupt your repo. That's why I asked you to run lsof.

Otherwise, is your IPFS repo on a networked drive?
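The "(deleted)" entries in the lsof output above show why removing a live lock file is dangerous: the running daemon keeps its lock on the now-orphaned inode, while a fresh file at the same path can be locked by anyone. A minimal single-process sketch (illustrative Python; it uses flock for the demo, whereas go-ipfs uses fcntl locks, but the unlink behavior is the same):

```python
import fcntl
import os

LOCK_PATH = "/tmp/demo_fsck.lock"  # hypothetical stand-in for /export/repo.lock

# The "daemon" creates and exclusively locks the lock file.
daemon_fd = os.open(LOCK_PATH, os.O_RDWR | os.O_CREAT)
fcntl.flock(daemon_fd, fcntl.LOCK_EX | fcntl.LOCK_NB)

# A cleanup step unlinks the file, as `ipfs repo fsck` does. The daemon's
# lock now lives on an orphaned inode -- exactly the
# "/export/repo.lock (deleted)" line in the lsof output above.
os.unlink(LOCK_PATH)

# A second opener creates a *fresh* file at the same path and can lock it
# immediately: nothing stops two writers from sharing the repo anymore.
second_fd = os.open(LOCK_PATH, os.O_RDWR | os.O_CREAT)
fcntl.flock(second_fd, fcntl.LOCK_EX | fcntl.LOCK_NB)
print("second lock acquired while the daemon still holds the deleted inode")
```

Two uncoordinated writers on one datastore is a plausible route to the "merkledag: not found" corruption seen above.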

@ninkisa
Author

ninkisa commented Jun 11, 2019

Yes, we used "ipfs repo fsck", but only after the daemon started failing with repo.lock failed: resource temporarily unavailable.
No, ipfs is not on a network drive.

@Stebalien
Member

Yes, we used "ipfs repo fsck", but only after the daemon started failing with repo.lock failed: resource temporarily unavailable.

That's the error that's returned when:

  1. Some other IPFS process is running (or something has locked the repo.lock file).
  2. The repo's "api" file is missing (e.g., the daemon isn't running or is shutting down).

We need to improve the error (#6434) but that would definitely explain why your datastore got corrupted. We should probably also make that command less dangerous (#6435).
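The interaction of those two causes can be sketched as follows. This is a hypothetical Python illustration of the decision the CLI faces, not the real go-ipfs code; the `connect` helper and its return values are invented for the example:

```python
import errno
import fcntl
import os

def connect(repo_path):
    """Hypothetical sketch: prefer the daemon's advertised API,
    else open the repo directly (which needs the exclusive lock)."""
    api_file = os.path.join(repo_path, "api")
    if os.path.exists(api_file):
        # A running daemon wrote its API address here -- talk to it over HTTP.
        with open(api_file) as f:
            return ("daemon", f.read().strip())
    # No "api" file: fall back to opening the repo directly. This requires
    # taking repo.lock, and fails with EAGAIN if another process (e.g. a
    # daemon that is shutting down) still holds it.
    fd = os.open(os.path.join(repo_path, "repo.lock"), os.O_RDWR | os.O_CREAT)
    try:
        fcntl.lockf(fd, fcntl.LOCK_EX | fcntl.LOCK_NB)
    except OSError as e:
        if e.errno in (errno.EAGAIN, errno.EACCES):
            raise RuntimeError(
                "cannot acquire lock: resource temporarily unavailable") from e
        raise
    return ("local", fd)
```

So "lock held" plus "api file missing" is the exact combination that surfaces this error even though a daemon is still alive.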


Can you reproduce this without running ipfs repo fsck? Next time you see the "resource temporarily unavailable" error, could you check lsof before running anything like ipfs repo fsck?

@Stebalien Stebalien added kind/support A question or request for support and removed kind/bug A bug in existing code (including security flaws) labels Jun 14, 2019
@Stebalien
Member

Closing as this looks like it isn't a bug (and I've filed a followup issue to make this command less dangerous in #6435).

Feel free to continue debugging in the comments, we can reopen if this turns out to be an actual bug. Closing this just makes tracking open issues a bit easier.
