Huge memory leak #2034
Thank you for your report! I think that v8.0.0-beta.11 is buggy. I am fixing it now, but that bug may have affected this. We would like to see whether the memory leak also occurs in the previous version, v8.0.0-beta.10, to make sure this version is the one affected.
I think I got the same error. It works fine on my local machine, but I get a JavaScript heap out of memory error when building the project. I'm using Docker; the build works fine when memory is 5 GB, but with less than that I get the JavaScript heap out of memory error.
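For reference, a common way to raise Node's heap limit during a containerized build (this is Node's standard flag, not anything specific to this project; the 4 GB value is illustrative):

```sh
# give the Node process a 4 GB old-space heap while running `nuxt build`
NODE_OPTIONS=--max-old-space-size=4096 npm run build
```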
Looks like changing Head to useHead / useSeoMeta solves the leak; I think it's because of how computed properties were being used. So far memory looks normal (below 150 MB); will see how it goes in 1-2 days.
I greatly appreciate your work
I should point out that I switched from the head tag in the template to useHead in the script block.
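For reference, a minimal sketch of that change in a Nuxt 3 component (the title value is illustrative, not from the reproduction):

```vue
<script setup lang="ts">
// Instead of rendering <Head><Title>…</Title></Head> in the template,
// register the head data from the script block:
useHead({
  title: 'My page',
})
</script>
```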
I can confirm this is still present when using
Adding some information. With code like the below, I've found that if

```vue
<script setup lang="ts">
const { $i18n } = useNuxtApp()
const timeLabel = computed(() => $i18n.t('general'))
await new Promise(resolve => setTimeout(resolve, 10))
</script>
```
You can try to reproduce with the below:

```sh
# build nuxt app
npm run build

# start nuxt app with `node --inspect`
HOST=localhost node --trace-gc --inspect --max-old-space-size=600 ./node_modules/nuxt/bin/nuxt.mjs preview
```

To stress test it:

```sh
npx autocannon --duration 50 http://localhost:3000
```
Tested: no memory leak for this case, or maybe I tested the wrong way, I don't know.
@hakan-akgul
Sorry for the late reply. I need some time; I want to come back with some useful information if possible. We get the snapshots from the test environment, but the huge problem is in production. I don't know if this is useful or not, but maybe you can find a pattern:
I've encountered a similar issue in our application. Interestingly, the app would spontaneously switch to Turkish (tr) by itself. Although we do have a route set up for tr, as you've mentioned, the switch occurs without any explicit trigger. It's quite perplexing, and it seems to be related to the issue raised by @hakan-akgul.
Do you also encounter memory leaks in the application?
Same issue #2612
The latest edge release contains a fix for what was likely the larger memory leak; please let me know if you can confirm this in your project! You can install it as an alias. From my testing it seems there is still a smaller memory leak present; I'm still working on finding the cause and fixing that.
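For reference, the usual way to alias a Nuxt module's edge build (assuming the edge package is published as @nuxtjs/i18n-edge, the common naming convention for Nuxt modules; check the module docs for the exact package name):

```sh
npm i @nuxtjs/i18n@npm:@nuxtjs/i18n-edge
```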
Also, if you're seeing some
Before the optimization, it consumed 1 GB per 1000 requests. Now, after checking, it consistently stays at the level of 100-300 MB. I tested it with 10,000 requests. After 5000 requests, the following error appeared:

```
./.output/server/chunks/app/server.mjs:497:20
./.output/server/chunks/app/server.mjs:10255:33
```

It seems there is still a memory leak issue, and the extendBaseUrl function is only used by i18n. There might be another problem elsewhere.
@s00d
Sure, of course.

```sh
npm i
npm run build
node .output/server/index.mjs
ab -n 10000 -c 100 http://localhost:3000/
```

ab (ApacheBench) is a utility that lets you send a bunch of requests in parallel:

```sh
# macOS
brew install httpd
# Debian
sudo apt-get install apache2-utils
# CentOS
sudo yum install httpd-tools
```

On Windows, it should work through bash, in theory.

The project is completely clean, only i18n is installed, and it reproduces on previous versions as well. I've added

```
[nuxt] [request error] [unhandled] [500] Maximum call stack size exceeded
at ./.output/server/index.mjs:12899:10
at ./.output/server/index.mjs:12904:29
at ./.output/server/index.mjs:12904:29
at ./.output/server/index.mjs:12904:29
// ....
at ./.output/server/index.mjs:12904:29
at ./.output/server/index.mjs:12904:29
at ./.output/server/index.mjs:12904:29
at resolveBaseUrl (./.output/server/index.mjs:11475:12)
at extendComposer (./.output/server/index.mjs:11627:22)
at ./.output/server/index.mjs:11585:7
at EffectScope.run (./.output/server/node_modules/@vue/reactivity/dist/reactivity.cjs.js:42:16)
at i18n.install (./.output/server/index.mjs:11584:11)
at Object.use (./.output/server/node_modules/@vue/runtime-core/dist/runtime-core.cjs.js:3778:18)
at setup (./.output/server/index.mjs:13211:9)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async Object.callAsync (./.output/server/index.mjs:5133:16)
at async applyPlugin (./.output/server/index.mjs:6747:35)
at async applyPlugins (./.output/server/index.mjs:6767:7)
at async createNuxtAppServer (./.output/server/index.mjs:13627:7)
at async Object.renderToString (./.output/server/node_modules/vue-bundle-renderer/dist/runtime.mjs:173:19)
at async ./.output/server/index.mjs:6118:21
at async ./.output/server/index.mjs:5172:22
at async Object.handler (./.output/server/index.mjs:2293:19)
at async Server.toNodeHandle (./.output/server/index.mjs:2482:7)
```
The memory leak is completely gone. After the requests finish, memory consumption resets to normal. However, the "Maximum call stack size exceeded" error is definitely related to i18n: if I disable the module, the issue disappears. The problem lies in the extendBaseUrl method of the baseUrl function. The baseUrl function ends up referencing itself, and with each subsequent request the nesting of calls increases; after around 6500 requests, everything crashes. I suspect that it all starts here:

The method is referencing itself, leading to effectively infinite recursion. Maybe this fix could help: But it's still better to double-check everything.
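To illustrate the failure mode described above (a hypothetical sketch, not the module's actual code; the names extendBaseUrl and baseUrl are taken from the comment, everything else is assumed): if extendBaseUrl re-wraps the same shared options object on every request, each new closure captures the previous wrapper, and resolving the value later has to walk one call frame per request.

```ts
// Hypothetical reconstruction of the self-referencing wrapper.
interface I18nOptions {
  baseUrl: string | (() => string)
}

function extendBaseUrl(options: I18nOptions): void {
  const prev = options.baseUrl
  // Bug: the new closure captures the previous wrapper, so every
  // request adds one more layer of nesting to the same shared object.
  options.baseUrl = () => (typeof prev === 'function' ? prev() : prev)
}

const shared: I18nOptions = { baseUrl: 'https://example.com' }
for (let i = 0; i < 10_000; i++) extendBaseUrl(shared) // one call per request

// Resolving the value now recurses ~10,000 frames deep and throws
// "RangeError: Maximum call stack size exceeded":
// (shared.baseUrl as () => string)()
```

Under this reading, a fix would unwrap or cache the original value instead of wrapping the previous wrapper, so the nesting depth stays constant regardless of request count.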
Thanks for providing the reproduction and testing method. I have been able to replicate this locally, and I think you're right about it being caused by extendBaseUrl. Based on the issue @s00d found, I'll keep this issue open until that has been fixed. While the memory usage is more stable in the edge release, I will be checking out the provided fix; expect another release soon.
Closing, as the memory leak in the provided reproductions has been fixed; we will track the other potential leak in #2612. If anyone is still experiencing memory leaks, please let us know there (and provide a reproduction if possible 🙏)
Environment
Reproduction
Source code: https://github.com/cyperdark/i18n
Describe the bug
Memory leak video:
https://i.osuck.link/1683034408_ScxZCLgnzT.mp4
Additional context
No response
Logs
No response