
[flaky test][receiver/collectdreceiver] Lifecycle test failing due to bind: address already in use #30805

Closed
crobert-1 opened this issue Jan 26, 2024 · 5 comments


Component(s)

receiver/collectd

Describe the issue you're reporting

Failure link: https://github.com/open-telemetry/opentelemetry-collector-contrib/actions/runs/7673900683/job/20917582102?pr=30803

```
go test -race -timeout 300s -parallel 4 --tags="" ./...
?   	github.com/open-telemetry/opentelemetry-collector-contrib/receiver/collectdreceiver/internal/metadata	[no test files]
--- FAIL: TestComponentLifecycle (0.00s)
    --- FAIL: TestComponentLifecycle/metrics-lifecycle (0.00s)
        generated_component_test.go:82: 
            	Error Trace:	/home/runner/work/opentelemetry-collector-contrib/opentelemetry-collector-contrib/receiver/collectdreceiver/generated_component_test.go:82
            	Error:      	Received unexpected error:
            	            	listen tcp 127.0.0.1:8081: bind: address already in use
            	Test:       	TestComponentLifecycle/metrics-lifecycle
FAIL
FAIL	github.com/open-telemetry/opentelemetry-collector-contrib/receiver/collectdreceiver	1.046s
FAIL
make[2]: *** [../../Makefile.Common:124: test] Error 1
```
crobert-1 added the `needs triage` label Jan 26, 2024

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

crobert-1 (issue author) commented:

It looks like the lifecycle test starts a receiver, shuts it down, then starts another with the exact same config, binding to the same endpoint.

I believe a goroutine leak may be causing this, judging by the context of the test failure, its low frequency, and the fact that goleak isn't enabled for this package (suggesting it has a known leak, at least an intermittent one).
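The suspected failure mode can be demonstrated with the standard library `net` package alone: if the first listener is never actually closed (for example, a leaked goroutine still holding the socket), a second bind to the same endpoint fails exactly as in the CI log above. A minimal sketch; the helper name `bindTwice` is hypothetical and not taken from the receiver's code:

```go
package main

import (
	"fmt"
	"net"
)

// bindTwice binds an ephemeral local port, tries to bind the same
// address again while the first listener is still open, then closes
// the first listener and tries once more. It returns both errors.
func bindTwice() (whileOpen, afterClose error) {
	l1, err := net.Listen("tcp", "127.0.0.1:0")
	if err != nil {
		return err, err
	}
	addr := l1.Addr().String()

	// A shutdown that leaks the listener looks like this: the socket
	// stays bound, so a second Start on the same endpoint fails with
	// "bind: address already in use".
	if l2, err := net.Listen("tcp", addr); err != nil {
		whileOpen = err
	} else {
		l2.Close()
	}

	// After a clean Close, the endpoint is reusable.
	l1.Close()
	if l3, err := net.Listen("tcp", addr); err != nil {
		afterClose = err
	} else {
		l3.Close()
	}
	return whileOpen, afterClose
}

func main() {
	whileOpen, afterClose := bindTwice()
	fmt.Println("bind while first listener open:", whileOpen)
	fmt.Println("bind after close:", afterClose)
}
```

This is also why enabling goleak would likely surface the bug directly: a goroutine still blocked serving the old listener would be reported at the end of the test, rather than manifesting as an intermittent bind failure on the next Start.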

crobert-1 removed the `needs triage` label Mar 5, 2024

atoulme commented Mar 5, 2024

Appreciated; it looks like the tests are catching an interesting issue. Thanks for the report.


github-actions bot commented May 6, 2024

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

github-actions bot added the `Stale` label May 6, 2024

github-actions bot commented Jul 5, 2024

This issue has been closed as inactive because it has been stale for 120 days with no activity.

github-actions bot closed this as not planned Jul 5, 2024