Intermittent failure to release port bindings #189
Comments
This is most likely due to the possible 5-second wait before port forwarding is established (or terminated), a limitation in Lima at the moment. See #71 (comment).
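For anyone hitting the establishment side of that delay in scripts or tests, here is a minimal sketch (not part of colima or lima; the address and timeout are placeholders) that polls a forwarded port until it accepts TCP connections before proceeding:

```go
// Poll a forwarded port until it accepts TCP connections, to absorb the
// delay before Lima establishes the forward. Illustration only.
package main

import (
	"fmt"
	"net"
	"time"
)

// waitForPort dials addr repeatedly until a connection succeeds or the
// timeout elapses.
func waitForPort(addr string, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for time.Now().Before(deadline) {
		conn, err := net.DialTimeout("tcp", addr, time.Second)
		if err == nil {
			conn.Close()
			return nil
		}
		time.Sleep(250 * time.Millisecond)
	}
	return fmt.Errorf("port %s not reachable within %s", addr, timeout)
}

func main() {
	// 127.0.0.1:443 is only an example; substitute the published port.
	if err := waitForPort("127.0.0.1:443", 10*time.Second); err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println("port is reachable")
}
```

The same loop can wrap any published port before a test starts talking to it.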
Hi! Not sure if helpful, but sharing in case it is. I've just experienced a "connection refused" error in my local colima, running on macOS 11.6.2 with colima 0.3.2.
I see that regularly in tests also, which I didn't mention above. When the tests don't fail due to ports already being bound, they sometimes fail to connect instead, and I guess it's the same delay @abiosoft mentioned. Ugly though. Not sure of a good workaround. ddev projects go up and down perhaps hundreds of times in a test run, so I could add a …
The plan is to have vmnet networking bundled with colima so that each VM gets an accessible static IP address (regardless of port forwarding to localhost).
It would be awesome to have a reliable port binding situation. For me, requiring root for the install is fine; I definitely know the drawbacks.
Kindly install the current development version with …
Should this be reopened, as it seems to remain an issue? Thanks for all the great work.
Yeah, that's fine.
I have to restart DDEV's Colima tests sometimes 3-4-5 times just because of this issue. I understand why it doesn't much affect an ordinary user, but wow, I wish there was a way around this. Even a workaround.
@rfay considering that gvproxy is now being utilised in Colima, Docker/Containerd events can be monitored directly and the ports forwarded via gvproxy instead. It will be resolved in Lima eventually, and the preference is to delegate to Lima as much as possible. But nonetheless, I will explore it and see whether it can be done without too much effort.
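For context on what "monitoring Docker events directly" could look like, here is a rough sketch assuming the Docker Go SDK (github.com/docker/docker/client). This is not colima's actual implementation: the forwarding steps are only placeholder comments, and the option/type names vary between SDK versions.

```go
// Watch Docker container start/die events, the kind of signal that could
// drive setting up or tearing down port forwards immediately instead of
// waiting for a periodic port scan. Illustration only.
package main

import (
	"context"
	"log"

	"github.com/docker/docker/api/types"
	"github.com/docker/docker/api/types/filters"
	"github.com/docker/docker/client"
)

func main() {
	cli, err := client.NewClientWithOpts(client.FromEnv, client.WithAPIVersionNegotiation())
	if err != nil {
		log.Fatal(err)
	}

	// Only container events are of interest here.
	f := filters.NewArgs()
	f.Add("type", "container")

	msgs, errs := cli.Events(context.Background(), types.EventsOptions{Filters: f})
	for {
		select {
		case m := <-msgs:
			switch m.Action {
			case "start":
				// A real implementation would inspect the container's published
				// ports here and ask gvproxy to expose them on the host.
				log.Printf("container %s started: set up forwards", m.Actor.Attributes["name"])
			case "die":
				// ...and tear the forwards down promptly so the host port is released.
				log.Printf("container %s stopped: release forwards", m.Actor.Attributes["name"])
			}
		case err := <-errs:
			log.Fatal(err)
		}
	}
}
```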
Much appreciated!
Any thoughts about workarounds I could use to mitigate this problem?
@rfay I explored this #189 (comment) and it is relatively straightforward to implement for Docker (and Containerd) but not for Kubernetes. Does your workflow involve Kubernetes, or is it primarily Docker? Nonetheless, I will prioritise this. It can always be improved and will be an optional feature.
I guess I didn't understand what you meant in #189 (comment) (and still don't). What action would I take to solve this problem of port bindings not being released? Run some kind of external process that listens and takes some kind of action? This is only for Docker. Thanks so much for looking at it.
@rfay no action is required on your end. I will notify you when it is ready for testing.
I've run into this issue after updating my machine today to macOS 13.2.1 (22D68). I'm looking for a workaround using …
Pretty sure this is obsolete, so closing.
Hi, and thanks for this project!
In ddev's full test run for macOS/Colima I see intermittent failures, apparently caused by a failure to release port bindings. I haven't experienced this in a local non-GitHub-Actions environment, but since most runs are successful, I think it's most likely an issue with Colima/Lima; I just don't know how to chase it.
See https://github.com/drud/ddev/runs/5234475567?check_suite_focus=true and search the test run for FAIL:. The first failure happens in TestRouterConfigOverride, and you get "Unable to listen on required ports, port 443 is already in use". It seems that port 443 has not been released between tests (and this happens with other ports on other runs). I don't expect that this has anything to do with the tests themselves, as nothing like this happens in the many other test environments, including Docker Desktop on macOS, Linux, Windows, and WSL2.
Any thoughts on how to study this problem and narrow down the scope would be much appreciated. I can probably use tmate and have it stop on failure, but with the current test structure my bet is that the issue might have resolved itself before I get there. Maybe not. But it's awkward and hard to chase because it hits probably one of every 3 test runs, and the test runs take almost 2 hours.
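One brute-force mitigation sketch for the test-suite symptom (purely speculative, not part of ddev or colima): before starting the next project, wait until the host port has actually been released by trying to bind it yourself.

```go
// Wait until a host port that Lima forwards has actually been released,
// by attempting to bind it and retrying until that succeeds. Illustration only.
package main

import (
	"fmt"
	"net"
	"time"
)

// waitForPortFree returns nil once we can bind the port ourselves,
// i.e. once the stale forward has been torn down.
func waitForPortFree(port int, timeout time.Duration) error {
	addr := fmt.Sprintf("127.0.0.1:%d", port)
	deadline := time.Now().Add(timeout)
	for time.Now().Before(deadline) {
		ln, err := net.Listen("tcp", addr)
		if err == nil {
			ln.Close() // the port is free; release it again immediately
			return nil
		}
		time.Sleep(500 * time.Millisecond)
	}
	return fmt.Errorf("port %d still bound after %s", port, timeout)
}

func main() {
	// Note: binding ports below 1024 may require elevated privileges on
	// some systems; 443 is used here only because it matches the failure above.
	if err := waitForPortFree(443, 15*time.Second); err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println("port 443 released")
}
```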