# Memory leaks? #28
Do you have simple code that reproduces it?
If I run this code with the inspect flag: (screenshot missing from the page). But if I run similar code with the native fetch, I see that memory is cleaned up regularly and consumption stays fairly constant.
What's your system, and …
Linux Mint 22. But I am also running the script on an Ubuntu 22.04.3 server.
```js
setInterval(() => {
  const afterMem = process.memoryUsage();
  console.log(
    afterMem,
    afterMem.heapUsed - initMem.heapUsed,
    afterMem.heapTotal - initMem.heapTotal,
  );
}, 1e3);
```

You can append this code for testing. I think Node.js' GC is lazy, and the memory difference you see after a few seconds is probably caching that Node.js keeps for performance optimization.
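To separate lazy garbage collection from a real leak, one option is to force a full collection before each measurement. A minimal sketch, assuming the script is started with `node --expose-gc` (`sampleMemory` and its labels are illustrative names, not part of the thread's code):

```javascript
// Sketch: run with `node --expose-gc` so global.gc is defined.
// Forcing a full collection before sampling removes lazy-GC noise;
// if heapUsed still grows steadily across samples, it is a real leak.
const initMem = process.memoryUsage();

function sampleMemory(label) {
  if (typeof global.gc === "function") {
    global.gc(); // only defined when node runs with --expose-gc
  }
  const m = process.memoryUsage();
  console.log(label, {
    heapUsedDelta: m.heapUsed - initMem.heapUsed,
    rssDelta: m.rss - initMem.rss,
  });
  return m;
}

// e.g. sample once before the workload and once after:
const before = sampleMemory("before");
// ... run the fetch workload here ...
const after = sampleMemory("after");
```

If the forced collection flattens the curve, the earlier growth was just deferred GC rather than a leak.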
Is `initMem` measured right at the script start?
```js
import { LibCurl, fetch } from "@ossiana/node-libcurl";

// Runs taskExecutor with at most `concurrencyLimit` tasks in flight,
// until shouldStop(counter) returns true.
class ConcurrencyRunner {
  #concurrencyLimit = 0;
  #taskExecutor = null;
  #shouldStop = null;

  constructor({
    concurrencyLimit = 1,
    taskExecutor,
    shouldStop = () => false,
  } = {}) {
    if (typeof concurrencyLimit !== "number" || concurrencyLimit < 1) {
      throw new Error("concurrencyLimit must be a positive number");
    }
    if (typeof taskExecutor !== "function") {
      throw new Error("taskExecutor must be a function");
    }
    if (typeof shouldStop !== "function") {
      throw new Error("shouldStop must be a function");
    }
    this.#concurrencyLimit = concurrencyLimit;
    this.#taskExecutor = taskExecutor;
    this.#shouldStop = shouldStop;
  }

  async run() {
    let counter = 0;
    const tasks = new Set();
    while (1) {
      if (await this.#shouldStop(counter)) {
        break;
      }
      if (tasks.size < this.#concurrencyLimit) {
        const taskPromise = Promise.resolve()
          .then(() => this.#taskExecutor(counter))
          .catch((error) => {
            console.error("ConcurrencyRunner.run", error);
          })
          .finally(() => {
            tasks.delete(taskPromise);
          });
        tasks.add(taskPromise);
        counter++;
      } else {
        await Promise.race(tasks); // wait for a slot to free up
      }
    }
    await Promise.all(tasks);
  }
}

let initMem;

const repro = async () => {
  initMem = process.memoryUsage();
  await new ConcurrencyRunner({
    concurrencyLimit: 20,
    async taskExecutor() {
      let resp = await fetch(
        "https://raw.githubusercontent.com/nodejs/node/main/doc/changelogs/CHANGELOG_V20.md",
        {
          headers: { "Cache-Control": "no-cache" },
        },
      );
      await resp.text();
    },
    shouldStop(t) {
      console.log(t);
      return t === 5000; // stop after 5000 requests
    },
  }).run();
  const afterMem = process.memoryUsage();
  console.log(
    initMem,
    afterMem,
    afterMem.heapUsed - initMem.heapUsed,
    afterMem.heapTotal - initMem.heapTotal,
  );
};

repro()
  .then(() => console.log("Finished"))
  .catch((err) => console.log("Error", err));

setInterval(() => {
  const afterMem = process.memoryUsage();
  console.log(
    afterMem,
    afterMem.heapUsed - initMem.heapUsed,
    afterMem.heapTotal - initMem.heapTotal,
  );
}, 1e3);
```
Any ideas? It looks like a problem with how the library reads the response body...
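One thing worth noting for this repro: libcurl allocates outside the V8 heap, so a leak in the native binding may not show up in `heapUsed` at all. A minimal sketch of a report that also watches `rss` and `external` (the field names come from `process.memoryUsage()`; `nativeVsHeapReport` is an illustrative helper, not part of the repro):

```javascript
// Sketch: a native-addon leak (like one inside libcurl) usually does NOT
// show up in heapUsed. Compare rss and external as well -- if heapUsed is
// flat but rss/external climb, the leak is on the native side.
function nativeVsHeapReport(initMem) {
  const m = process.memoryUsage();
  return {
    heapUsedDelta: m.heapUsed - initMem.heapUsed,   // JS objects
    externalDelta: m.external - initMem.external,   // Buffers / native bindings
    rssDelta: m.rss - initMem.rss,                  // whole-process memory
  };
}

const initMem = process.memoryUsage();
// ... run the repro workload above ...
console.log(nativeVsHeapReport(initMem));
```

Comparing these three deltas after the 5000-request run would help localize whether the growth is in JS objects or in native allocations.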
I'll find time to check it.
Any update on this? I'm facing the same problem.
Does v1.6.8 still have this problem?
Yes, running the same example provided by the issue author on v1.6.8.
Try running this test: https://github.com/Ossianaa/node-libcurl/blob/master/test/test-0001/index.js. I've tested with that code and memory doesn't seem to grow permanently.
I'm using `request.session` instead of `fetch`.

Same result as with `fetch`.
Are the curl instances being disposed after each use?
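For context, the disposal pattern this question refers to usually looks like the sketch below. `FakeCurl` and its `close()` method are stand-ins, not node-libcurl's actual API; whether the library exposes an explicit cleanup call needs to be checked against its own documentation:

```javascript
// Sketch of explicit resource disposal with a stand-in class.
// FakeCurl and close() are hypothetical; the real library may differ.
class FakeCurl {
  constructor() { this.closed = false; }
  async request() { return "ok"; }
  close() { this.closed = true; } // would release native handles
}

// Wrap every use in try/finally so the handle is freed even on errors.
async function withCurl(fn) {
  const curl = new FakeCurl();
  try {
    return await fn(curl);
  } finally {
    curl.close(); // always dispose, even if the request throws
  }
}

// usage: each task gets a fresh, reliably-disposed instance
withCurl((c) => c.request()).then((r) => console.log(r));
```

If instances are created per-request but never disposed, native handles can accumulate in a way that looks exactly like the heap-independent growth described above.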
Node's V8 GC is lazy.
I have a memory leak in a Node.js script, and I suspect it is caused by node-libcurl. Have you seen anything similar? Any advice? Thanks.