[Fleet] Memory leak #159762

Closed
nchaulet opened this issue Jun 15, 2023 · 1 comment · Fixed by #159807
@nchaulet
Member

nchaulet commented Jun 15, 2023

Description

It seems that Fleet is leaking memory.

Investigation

It seems related to our usage of `AsyncLocalStorage`.

TODO: add more investigation info (heap dump).

Fix: #159807
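
As a reference point, here is a minimal, hypothetical sketch (not the actual Fleet/Kibana code; names are assumptions) of the kind of pattern that could produce this: creating a fresh `AsyncLocalStorage` per request instead of reusing one, so each instance and whatever context it captures stays reachable.

```ts
// Hypothetical illustration only; names and shape are assumptions,
// not the actual Fleet audit-logging code.
import { AsyncLocalStorage } from 'async_hooks';

interface AuditContext {
  requestId: string;
  username?: string;
}

// Suspected leaky shape: a new AsyncLocalStorage for every request.
// Each instance is a separate store the runtime keeps track of, so
// allocating one per call can pile up objects that are never freed.
function handleRequestLeaky<T>(requestId: string, handler: () => Promise<T>) {
  const storage = new AsyncLocalStorage<AuditContext>();
  return storage.run({ requestId }, handler);
}
```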

@nchaulet added the bug (Fixes for quality problems that affect the customer experience) and Team:Fleet (Team label for Observability Data Collection Fleet team) labels on Jun 15, 2023
@nchaulet self-assigned this on Jun 15, 2023
@elasticmachine
Contributor

Pinging @elastic/fleet (Team:Fleet)

juliaElastic pushed a commit that referenced this issue Jun 20, 2023
## Description 

Related to #158361 
Closes #158361
Closes #159762
When we introduced audit logging, it seems we also introduced a memory leak.

Comparing two heap dumps after triggering ~6k requests against Fleet, we can see that these `AsyncLocalStorage` objects are never deleted:

<img width="1307" alt="Screenshot 2023-06-15 at 9 04 13 AM"
src="https://github.com/elastic/kibana/assets/1336873/7f4b0a7b-75a9-4c87-8fb9-cd539a2029e0">

This PR tries to fix that by using a single instance of `AsyncLocalStorage`.

The call that creates those `AsyncLocalStorage` instances is not conditional on audit logging being enabled or not.
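
A minimal sketch of the single-instance approach described above (hypothetical names; the real change is in #159807): one module-level `AsyncLocalStorage` created once and reused with `run()` for every request.

```ts
// Hypothetical illustration of a single shared AsyncLocalStorage.
import { AsyncLocalStorage } from 'async_hooks';

interface AuditContext {
  requestId: string;
}

// Created once at module load, reused for every request.
const auditStorage = new AsyncLocalStorage<AuditContext>();

export function withAuditContext<T>(requestId: string, handler: () => Promise<T>) {
  // run() scopes the context to this async call chain without
  // allocating a new AsyncLocalStorage per request.
  return auditStorage.run({ requestId }, handler);
}

export function getAuditContext(): AuditContext | undefined {
  return auditStorage.getStore();
}
```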

## Tests

Testing this **PR** on Cloud on a 1 GB instance with the agent page open (doing long polling), we can see that it seems to improve memory usage compared to the current 8.8.1 **baseline**, also with an agent page open:

<img width="810" alt="Screenshot 2023-06-15 at 3 04 35 PM"
src="https://github.com/elastic/kibana/assets/1336873/a392f734-2b91-48ac-a906-3a8e88e2b209">

---------

Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>
juliaElastic pushed a commit that referenced this issue Jun 20, 2023