High memory consumption after 0.16.x #1200

Closed
rise0chen opened this issue Sep 13, 2023 · 10 comments · Fixed by #1204

Comments

@rise0chen

It looks like #1150.

Memory usage grows by about 5 MB for every 3000 messages I send over a single WebSocket connection.

How to reproduce:

// The JSON-RPC request that is sent repeatedly over one WebSocket connection.
var rpc_data = {
	"jsonrpc": "2.0",
	"method": "hello",
	"params": [],
	"id": 2
};
var ws = new WebSocket("ws://192.168.2.3:3030");

ws.onopen = () => {
	console.log("open ws");
	ws.send(JSON.stringify(rpc_data));
};
ws.onmessage = (data) => {
	// Re-send the same request on every response to keep a steady stream of calls.
	console.log("recv");
	ws.send(JSON.stringify(rpc_data));
};
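
For completeness, a minimal server-side sketch that mirrors the linked repro (this is an assumption based on a jsonrpsee 0.20-style API, not the actual repro code; the builder and start signatures differ slightly between jsonrpsee versions, and only the address and method name are taken from the test above):

use jsonrpsee::server::{RpcModule, Server};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Assumption: 0.20-style API. Other releases use a slightly different
    // builder (e.g. ServerBuilder::default()), and `start` returns a
    // `Result` on some versions.
    let server = Server::builder().build("0.0.0.0:3030").await?;

    let mut module = RpcModule::new(());
    // The "hello" method called by the WebSocket test above.
    module.register_method("hello", |_params, _ctx| "hello")?;

    let handle = server.start(module);
    handle.stopped().await;
    Ok(())
}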
@niklasad1
Member

Hey, can you report which version you are using?

Also, I can't tell whether the memory usage is high because something is leaking (i.e. the connection isn't cleaned up properly) or because of something else.

I guess we could yank v0.17 and v0.18, because IIRC those versions have issues.

@rise0chen
Author

I am using 0.20.
I have tested 0.17, 0.20, and master.

@niklasad1
Member

OK, can you share your server code so that we can try to reproduce it easily?

@rise0chen
Author

https://github.com/rise0chen/test_jsonrpsee.git

root@f0fca0e971de:~/demon# RUST_LOG=INFO cargo run
    Finished dev [unoptimized + debuginfo] target(s) in 0.07s
     Running `target/debug/demon`
2023-09-13T14:46:07.790018Z  INFO demon: memory_info: MemoryInfo { rss: 12959744, vms: 312107008, shared: 11575296, text: 17891328, data: 9621504 }
2023-09-13T14:46:08.278279Z  INFO demon: memory_info: MemoryInfo { rss: 19267584, vms: 381329408, shared: 13410304, text: 17891328, data: 16240640 }
2023-09-13T14:46:08.766852Z  INFO demon: memory_info: MemoryInfo { rss: 24621056, vms: 381329408, shared: 13934592, text: 17891328, data: 20721664 }
2023-09-13T14:46:09.250193Z  INFO demon: memory_info: MemoryInfo { rss: 29216768, vms: 381329408, shared: 13934592, text: 17891328, data: 25333760 }
2023-09-13T14:46:09.736768Z  INFO demon: memory_info: MemoryInfo { rss: 33542144, vms: 381329408, shared: 13934592, text: 17891328, data: 29937664 }
2023-09-13T14:46:10.226172Z  INFO demon: memory_info: MemoryInfo { rss: 38129664, vms: 381329408, shared: 13934592, text: 17891328, data: 34549760 }
2023-09-13T14:46:10.714844Z  INFO demon: memory_info: MemoryInfo { rss: 42725376, vms: 381329408, shared: 13934592, text: 17891328, data: 39157760 }
2023-09-13T14:46:11.199223Z  INFO demon: memory_info: MemoryInfo { rss: 47321088, vms: 381329408, shared: 13934592, text: 17891328, data: 43765760 }
2023-09-13T14:46:11.680800Z  INFO demon: memory_info: MemoryInfo { rss: 51916800, vms: 381329408, shared: 13934592, text: 17891328, data: 48377856 }
2023-09-13T14:46:12.165494Z  INFO demon: memory_info: MemoryInfo { rss: 56782848, vms: 381329408, shared: 13934592, text: 17891328, data: 52985856 }
2023-09-13T14:46:12.656301Z  INFO demon: memory_info: MemoryInfo { rss: 61108224, vms: 381329408, shared: 13934592, text: 17891328, data: 57589760 }
2023-09-13T14:46:13.138764Z  INFO demon: memory_info: MemoryInfo { rss: 65703936, vms: 381329408, shared: 13934592, text: 17891328, data: 62201856 }
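
For reference, a rough sketch of how memory figures like these can be collected on Linux (an assumption for illustration only; the repro crate may use a different mechanism, e.g. an existing procfs crate):

use std::fs;

// Hypothetical helper: reads /proc/self/statm and converts the page counts
// to bytes, producing fields like the rss/vms/shared/text/data values in the
// log above.
#[derive(Debug)]
struct MemoryInfo {
    rss: u64,
    vms: u64,
    shared: u64,
    text: u64,
    data: u64,
}

fn memory_info() -> std::io::Result<MemoryInfo> {
    // statm fields (in pages): size resident shared text lib data dt
    let statm = fs::read_to_string("/proc/self/statm")?;
    let pages: Vec<u64> = statm
        .split_whitespace()
        .filter_map(|f| f.parse().ok())
        .collect();
    let page = 4096; // assumption: 4 KiB pages; query sysconf(_SC_PAGESIZE) for portability
    Ok(MemoryInfo {
        vms: pages[0] * page,
        rss: pages[1] * page,
        shared: pages[2] * page,
        text: pages[3] * page,
        data: pages[5] * page,
    })
}

fn main() {
    println!("memory_info: {:?}", memory_info().unwrap());
}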

@niklasad1
Member

Yeah, you are correct, but it would be good if you could add the data for v0.16 as well. Adding my graphs:

v0.20

[graph: memory usage on v0.20]

v0.16.3

[graph: memory usage on v0.16.3]

@rise0chen
Author

https://github.com/rise0chen/test_jsonrpsee/tree/v0.16

root@f0fca0e971de:~/demon# RUST_LOG=INFO cargo run
   Compiling test_jsonrpsee v0.1.0 (/root/demon)
    Finished dev [unoptimized + debuginfo] target(s) in 6.59s
     Running `target/debug/test_jsonrpsee`
2023-09-13T23:05:00.074975Z  INFO connection{remote_addr=127.0.0.1:44256 conn_id=0}: jsonrpsee_server::server: Accepting new connection 1/100
2023-09-13T23:05:00.075558Z  INFO test_jsonrpsee: memory_info: MemoryInfo { rss: 12140544, vms: 313151488, shared: 10530816, text: 18620416, data: 9887744 }
2023-09-13T23:05:00.545386Z  INFO test_jsonrpsee: memory_info: MemoryInfo { rss: 12140544, vms: 382373888, shared: 10530816, text: 18620416, data: 12128256 }
2023-09-13T23:05:01.000579Z  INFO test_jsonrpsee: memory_info: MemoryInfo { rss: 12140544, vms: 382373888, shared: 10530816, text: 18620416, data: 12128256 }
2023-09-13T23:05:01.455944Z  INFO test_jsonrpsee: memory_info: MemoryInfo { rss: 12140544, vms: 382373888, shared: 10530816, text: 18620416, data: 12128256 }
2023-09-13T23:05:01.907101Z  INFO test_jsonrpsee: memory_info: MemoryInfo { rss: 12140544, vms: 382373888, shared: 10530816, text: 18620416, data: 12128256 }
2023-09-13T23:05:02.375756Z  INFO test_jsonrpsee: memory_info: MemoryInfo { rss: 12140544, vms: 382373888, shared: 10530816, text: 18620416, data: 12128256 }

@niklasad1
Member

niklasad1 commented Sep 14, 2023

Yeah, I believe this "regression" is because we replaced our own FutureDriver (which had a wake-up poll issue) with tokio::spawn. Let me take a look at where the increased allocations come from, though; it could be something else, because the data had to be Arc'ed/cloned to satisfy the Send + 'static trait bounds.
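
To make the constraint concrete, here is a small sketch (hypothetical names, not the actual jsonrpsee internals) of why moving a call onto tokio::spawn forces the request data to be owned or Arc'ed rather than borrowed:

use std::sync::Arc;

// Hypothetical handler for illustration only.
async fn handle_call(raw_request: Arc<String>) {
    // ... parse and dispatch the JSON-RPC call ...
    let _ = raw_request.len();
}

fn dispatch(raw_request: &str) {
    // `tokio::spawn` requires the future to be `Send + 'static`, so it cannot
    // borrow `raw_request` from the connection task's stack frame; the data
    // has to be cloned or wrapped in an `Arc`, costing an allocation per call.
    let owned = Arc::new(raw_request.to_owned());
    tokio::spawn(handle_call(owned));
}

#[tokio::main]
async fn main() {
    dispatch(r#"{"jsonrpc":"2.0","method":"hello","params":[],"id":2}"#);
}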

@rise0chen
Author

How long will it take to fix this problem?
I'm considering whether to roll back to a previous version or wait for an update.

@niklasad1
Member

niklasad1 commented Sep 15, 2023

I'm not sure we can get the memory usage down to what v0.16 currently has, because it had a FutureDriver which polled the internal methods at the same time as reading the socket and doing other things.

If #1069 isn't a concern for you, then downgrade.

Ideally, we would have some local executor for the methods and just borrow the params as we did before, but sadly that's not a trivial change. A sketch of that shape follows below.
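
As an illustration of the pre-0.17 shape (hypothetical names, not the actual jsonrpsee code): method futures polled on the connection task itself can borrow the request data instead of cloning it:

use futures::stream::{FuturesUnordered, StreamExt};

// Hypothetical sketch: drive the in-flight method calls on the same task that
// reads the socket, so each call future can borrow its request.
async fn drive_borrowed_calls(requests: &[String]) {
    let mut pending = FuturesUnordered::new();
    for req in requests {
        // Borrows `req`; no Arc/clone needed because the future never
        // leaves this task.
        pending.push(async move { req.len() });
    }
    while let Some(_response_len) = pending.next().await {
        // ... write the response back on the WebSocket ...
    }
}

#[tokio::main]
async fn main() {
    let reqs = vec![r#"{"jsonrpc":"2.0","method":"hello","params":[],"id":2}"#.to_string()];
    drive_borrowed_calls(&reqs).await;
}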

@niklasad1
Member

This should be fixed in v0.20.1.

I found a leak, which is now fixed.
