
[exporter/azuremonitor] Fix memory leaks on shutdown #30801

Closed
wants to merge 5 commits

Conversation


crobert-1 (Member) commented Jan 26, 2024

Description:

goleak was detecting a few leaking goroutines in this exporter. This change fixes one and ignores the other, as it comes from a third-party dependency. I've filed microsoft/ApplicationInsights-Go#70 to track the ApplicationInsights goroutine leak.

This change does the following:

  1. Implements Shutdown for the metric, trace, and log exporters. This Shutdown needs to call Close on the transportChannel, as the transport channel's documentation requires.
  2. Copies the trace exporter tests for metrics and logs. These are the tests added in factory_test.go, widening coverage; without them we might have missed the need to add a Shutdown call to the metrics and logs exporters.
  3. Updates the mock channel, which I originally generated manually using mockery.
  4. Adds a goleak check.

Link to tracking Issue:
#30438

Testing:
All tests are passing.
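The goleak check in this PR uses go.uber.org/goleak, which inspects goroutine stacks after the package's tests finish. The idea it guards against can be shown with a stdlib-only sketch (the `leaked` helper and its fixed sleep are illustrative simplifications; goleak itself matches stack frames rather than counting goroutines):

```go
package main

import (
	"fmt"
	"runtime"
	"time"
)

// leaked reports whether run left extra goroutines behind, by comparing
// the goroutine count before and after. A crude stand-in for goleak.
func leaked(run func()) bool {
	before := runtime.NumGoroutine()
	run()
	time.Sleep(50 * time.Millisecond) // give finished goroutines time to exit
	return runtime.NumGoroutine() > before
}

func main() {
	// A goroutine that exits cleanly does not leak.
	fmt.Println(leaked(func() {
		done := make(chan struct{})
		go func() { close(done) }()
		<-done
	}))

	// A goroutine blocked forever never exits, so it leaks.
	fmt.Println(leaked(func() {
		go func() { select {} }()
	}))
}
```

A Shutdown that never calls Close on the transport channel leaves its goroutines in the second, blocked state, which is what the added check detects.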

```go
select {
case <-exporter.transportChannel.Close():
	return nil
case <-time.After(shutdownTimeout):
	// Currently, due to a dependency's bug (https://github.com/microsoft/ApplicationInsights-Go/issues/70),
	// the channel returned by Close may never send. This is not an error.
	return nil
}
```
crobert-1 (Member Author) commented:

Let me know if you disagree with my reasoning here; happy to discuss further.

"This is not an error" might be a bit misleading, as technically we want the channel returned by Close to send a message before the timeout is hit, but I think it captures the idea.


crobert-1 (Member Author) commented Jan 26, 2024

Tests are failing due to microsoft/ApplicationInsights-Go#60. I'll investigate whether there's some way to account for this on our end.

In my mind, anything is better than the current functionality, which shuts down abruptly without any kind of safeguards. From my understanding, the worst case of the data race is simply dropping data that was meant to be sent, but that's what we're doing every time anyway, since we're not even calling Close. Any chance of a successful flush is better than a guaranteed 0% success rate. I'll let others weigh in here though.

crobert-1 (Member Author) commented:

There's no way to work around this at this point, so I'm going to close for now. The final solution may have to be to ignore the leak rather than fail on the race condition, unfortunately.

@crobert-1 crobert-1 closed this Feb 8, 2024