gltfpack: Error running through large gltf file #140
Comments
For files this large I would recommend using native gltfpack builds (you can download them on the releases page, see https://github.com/zeux/meshoptimizer/releases/tag/v0.14). Wasm is limited in heap space; the current limit is 2 GB. I think I can compile it to raise the limit to 4 GB, but I'm not sure that's going to be enough to fit a file this big. Native builds, on the other hand, support however much memory your computer has. Any chance you can share the file, possibly privately via e-mail? The largest file I've tested so far was a 200 MB .glb, which is 5 times smaller; it would be good to be able to test files larger than that and possibly optimize memory consumption somewhere.
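For context on the 2 GB figure: WebAssembly (wasm32) linear memory uses 32-bit addressing and grows in 64 KiB pages, and engines have historically capped it at the signed 32-bit boundary. A quick back-of-the-envelope check (illustrative only, not gltfpack code):

```python
# wasm32 linear memory grows in 64 KiB pages
WASM_PAGE = 64 * 1024

# 2 GiB: the signed 32-bit address boundary many engines capped memory at
limit_2gb = 2**31

# number of wasm pages that fit under the 2 GiB cap
print(limit_2gb // WASM_PAGE)  # 32768
```

Raising the limit to 4 GB corresponds to using the full unsigned 32-bit range; anything beyond that needs a native build, which is why native builds can use all available system memory.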
Something else that would help would be to run the attached debug build of gltfpack with the following node option:
I suspect it will still run out of memory since 4 GB might not be enough, but it should be closer, and the callstack with the failure will help me prioritize memory optimizations.
This should be substantially improved by #142; I have one more change I'm going to make in the next little while which would reduce the peak memory consumption a bit more by unloading parts of the input file early. |
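The "unloading parts of the input file early" idea can be sketched roughly as follows: once a raw buffer has been consumed by decoding, drop the reference to it immediately so the raw bytes and the decoded data are never both resident at peak. This is a hedged illustration of the general technique, with made-up names; it is not gltfpack's actual implementation:

```python
def decode_buffers(raw_buffers):
    """Decode each buffer, releasing its raw bytes as soon as it is consumed
    so raw input and decoded output don't both stay in memory."""
    decoded = []
    for name in list(raw_buffers):
        data = raw_buffers.pop(name)   # drop the raw reference early
        decoded.append(len(data))      # stand-in for real decoding work
    return decoded

# hypothetical input: two binary buffers from a glTF file
buffers = {"buffer0.bin": bytes(1024), "buffer1.bin": bytes(2048)}
print(decode_buffers(buffers))  # [1024, 2048]
print(buffers)                  # {} -- raw data already released
```

The peak memory saving comes from the `pop` inside the loop: at any moment only one raw buffer plus the decoded results are alive, rather than the whole input file plus all decoded data.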
Running gltfpack from npm.
gltfpack -i sourcefile.gltf -o output\destination.gltf
Error:
The input model is a large glTF file (440 MB .gltf + 502 MB .bin).