Infinite memory consumption when infinite client reconnection #4218
Comments
I'm not sure I'm following: you are creating resources without clearing them on termination, so obviously valgrind reports them as leaks. What is the issue here, precisely?
In the first case the leak was smaller than in the case where resources are created and cleaned up in another process.
I'm still not following, sorry. You are allocating more resources, so you get more memory. Again, what is the issue?
Memory allocation and deallocation happen in the second process, yet memory grows in the first.
Yes, because of the IPC connections you are making.
But why doesn't it free resources when the connection is dropped?
Because you are keeping it open. If you want to drop it, close the sockets, etc. Again, I'm not sure what the problem is.
If I close the sockets, the server will stop working.
Yes, if you want to keep it running, it requires memory. I mean, it can't really run on thin air; it needs CPU and memory resources. Your machines need to have enough resources to handle the amount of traffic you expect. There is no issue nor memory leak here.
Okay, my mistake. valgrind showed me this result.
If I increase the number of connections, I see a lot of messages like this in valgrind. I would be very glad if you could help me.
@bluca
@Detect1ve you may have to tell the allocator to actually shrink/free memory back to the OS. I use malloc_trim(0) when I need to do so.
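A minimal sketch of that suggestion, assuming a glibc-based system (malloc_trim is a glibc extension declared in &lt;malloc.h&gt;, not part of POSIX; the wrapper name is illustrative):

```c
#include <malloc.h> /* glibc-specific header providing malloc_trim */

/* After free(), glibc's allocator often keeps the released heap pages
   cached for reuse rather than returning them to the kernel, so the
   process's resident size can stay high even though nothing is leaked.
   malloc_trim(0) asks the allocator to give back as much of that
   cached memory as it can; it returns 1 if any memory was released. */
static void release_cached_heap(void)
{
    malloc_trim(0);
}
```

Calling something like this after a burst of disconnects can shrink the resident size that tools such as top report; that is a separate effect from a true leak, which is what valgrind flags.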
@joakim-brannstrom
Issue description
There are two processes, each creating a context and sockets. In one process, objects are created and deleted; in the other I see memory growth (it looks like a memory leak).
Environment
Minimal test code / Steps to reproduce the issue
Simple example without checks for returned error codes:
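A minimal sketch of the setup described above might look like the following two programs. The ipc endpoint, socket types, and message contents are assumptions, and error checks are omitted as stated.

```c
/* Process 1 (hypothetical reconstruction): a long-lived server that
   creates a context and socket once and never cleans them up.
   Build: gcc server.c -lzmq */
#include <zmq.h>

int main(void)
{
    void *ctx = zmq_ctx_new();
    void *rep = zmq_socket(ctx, ZMQ_REP);
    zmq_bind(rep, "ipc:///tmp/repro.ipc"); /* endpoint name is illustrative */

    /* Echo messages back forever. */
    for (;;) {
        char buf[64];
        int n = zmq_recv(rep, buf, sizeof buf, 0);
        zmq_send(rep, buf, n < 0 ? 0 : (size_t)n, 0);
    }
}
```

```c
/* Process 2 (hypothetical reconstruction): repeatedly creates a context
   and a socket, exchanges one message, and tears everything down again.
   Build: gcc client.c -lzmq */
#include <zmq.h>

int main(void)
{
    for (;;) {
        void *ctx = zmq_ctx_new();
        void *req = zmq_socket(ctx, ZMQ_REQ);
        zmq_connect(req, "ipc:///tmp/repro.ipc");

        zmq_send(req, "ping", 4, 0);
        char buf[64];
        zmq_recv(req, buf, sizeof buf, 0);

        zmq_close(req);    /* the client frees everything each iteration */
        zmq_ctx_term(ctx); /* ...yet the server's memory is seen to grow */
    }
}
```

The reported behavior is that the first process's memory grows while the second loops, as shown in the results below.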
What's the actual result? (include assertion message & call stack if applicable)
If we start only one process, which only creates the context and sockets but doesn't clean up after itself, we get the following leak:
If I add a second process that creates and deletes resources, the memory consumption of the first process increases:
Accordingly, there is a dependency: the more resources are created and deleted, the more the first process's memory consumption grows.
What's the expected result?
Everything suggests that the memory growth is unbounded, so the process will eventually run out of memory. Can you tell me what the problem is?