The MetalRoughSpheres are HUGE #45
I'm the author. This model isn't intended to be either basic or compact. The spheres are high-poly and there are a lot of them; the idea is to carefully test the effects of the metalness and roughness channels of the map. I had intended to make another one that tested the factors, as opposed to the texture, but never got around to that.

The model has more than 64k vertices, and I had problems exporting 32-bit indices with the then-current version of COLLADA2GLTF. So I split the model into 5 separate meshes and exported with 16-bit indices. Each sphere has 2562 vertices and 7680 triangles.

Is this really considered a bug? It was certainly intentional on my part. It doesn't need to be shown in a high-performance rendering situation; it's for examining the materials in fine detail.
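To make the 16-bit index constraint concrete, here is a small back-of-envelope sketch (not from the original thread) of why a model with these per-sphere counts has to be split into multiple meshes. The sphere count of 100 is an assumption taken from later comments in this thread.

```python
import math

VERTS_PER_SPHERE = 2562      # per-sphere count stated above
TRIS_PER_SPHERE = 7680
SPHERE_COUNT = 100           # assumption: the grid of spheres in the model

total_verts = VERTS_PER_SPHERE * SPHERE_COUNT   # 256,200 vertices in total
max_u16 = 1 << 16                               # 65,536 addressable vertices

# Minimum number of meshes so every primitive stays within the 16-bit
# index range (ignoring label geometry and any other overhead):
min_meshes = math.ceil(total_verts / max_u16)    # 4
spheres_per_mesh = max_u16 // VERTS_PER_SPHERE   # 25 whole spheres fit

print(total_verts, min_meshes, spheres_per_mesh)
```

With label geometry and round numbers per mesh, ending up with 5 meshes instead of the theoretical minimum of 4 seems plausible.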
I tagged it with the intention of categorizing it alongside other (possible) issues in the sample models. I'm fine with removing that tag. I don't have a strong opinion on the tradeoff between samples that showcase glTF best practice and samples that include extra hand optimizations that tooling doesn't automatically provide (e.g. reusing a single sphere mesh).
@emackey Yes, I noticed that the textures are carefully adjusted to the model (and that it's not just 100 spheres with different materials). I also considered that this might be a side effect of COLLADA2GLTF, which, IIRC, may "bake" translations into meshes and create files that are structurally simpler; but based on what you said, this is probably not the reason. I was just surprised to see that an asset that could be very small and simple is actually huge and complex. The point is: one could easily have not 100 spheres with 8,000 triangles each, but 1,000 spheres with 20,000 triangles each, and the resulting asset could still be only a fraction of the current size. However, OK to close this one if it's not considered worth changing.
FWIW, MetalRoughSpheres also takes noticeably longer to import into Blender than any other sample (about 9s on my PC), more than twice the time taken by the next slowest (GearboxAssy, about 4s). I import all the sample files to test the importer, and the two files for MetalRoughSpheres account for about 17% of the total test time. Not that that means anything, but it really is huge :)

I put together this script to generate the glTF @emackey mentioned that's like MetalRoughSpheres, but with factors instead of textures. There's one mesh/material pair for every sphere, and they all reuse the same accessors. All together, the glTF and bin come to about 141KB. If you want to try it, just run it.

For MetalRoughSpheres, you could do the same thing, reusing the position/normal accessors between spheres, but you'd still need texture data for every one. That would still cut a significant chunk out of the filesize, though. I could modify my script to do that if there's interest in making this smaller (the annoying bit is just writing down the net of an icosahedral sphere).
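The accessor-sharing idea above can be sketched in glTF JSON terms: every sphere gets its own mesh/material pair, but all primitives point at the same POSITION/NORMAL/index accessors. This is a hypothetical illustration, not the actual mrspheres.py; accessor and bufferView definitions are elided, and the accessor indices 0, 1, 2 and the 2.5-unit grid spacing are made up for the example.

```python
def build_shared_sphere_gltf(n):
    """Build a minimal glTF dict with n spheres sharing one set of geometry accessors."""
    gltf = {
        "asset": {"version": "2.0"},
        "materials": [], "meshes": [], "nodes": [],
        "scenes": [{"nodes": list(range(n))}],
    }
    for i in range(n):
        gltf["materials"].append({
            "pbrMetallicRoughness": {
                "metallicFactor": (i // 10) / 9.0,   # vary the factors per sphere
                "roughnessFactor": (i % 10) / 9.0,
            }
        })
        gltf["meshes"].append({
            "primitives": [{
                "attributes": {"POSITION": 0, "NORMAL": 1},  # shared accessors
                "indices": 2,                                # same index buffer for all
                "material": i,
            }]
        })
        gltf["nodes"].append({
            "mesh": i,
            "translation": [(i % 10) * 2.5, (i // 10) * 2.5, 0],
        })
    return gltf

doc = build_shared_sphere_gltf(100)
```

Because the vertex and index data exist only once, the `bin` payload stays the size of a single sphere no matter how many spheres the scene shows.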
I can certainly see advantages to having a much smaller sample model, so long as most of the quality is preserved (meaning, don't substantially drop the triangles-per-sphere or the number of spheres). If @scurest or anyone wants to take a shot at compacting the model as described above, that would be great! The original model came from a .blend file that is included in the repository.
Here's a script to generate MetalRoughSpheres (again, just run it).
There are actually more verts in this version because of how I unwrapped the icosahedron, I guess D: The number of tris is the same. For some reason, the viewer draws this model with flat shading despite it having normals. Anyone know why?

Edit: One more thing. Although it's interesting to get the size down by reusing accessors, it really tanks performance when viewing, so I don't necessarily think this is a good idea.
Btw, running the textures through zopflipng will shave about half the filesize off. It's insignificant compared to the size of the `bin` file, though.
@scurest three.js's glTF loader treats it as flat shading if geometry.attributes.normal is undefined.
@scurest The above problem seems to be flat shading because the Labels mesh has no NORMAL attribute.
three.js is following this implementation note from the specification:
So if smooth normals are intended, then attributes.NORMAL must be included.
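The fallback behavior referenced above (when no NORMAL attribute is present, implementations compute flat normals per triangle) can be sketched as follows. This is an illustrative snippet, not code from any loader: each face normal is the normalized cross product of two triangle edges, so every vertex of a face shades identically and the surface looks faceted.

```python
def flat_normal(p0, p1, p2):
    """Flat (per-face) normal of a triangle, from the cross product of its edges."""
    ux, uy, uz = (p1[i] - p0[i] for i in range(3))
    vx, vy, vz = (p2[i] - p0[i] for i in range(3))
    nx = uy * vz - uz * vy
    ny = uz * vx - ux * vz
    nz = ux * vy - uy * vx
    length = (nx * nx + ny * ny + nz * nz) ** 0.5
    return (nx / length, ny / length, nz / length)

# A counter-clockwise triangle in the XY plane faces +Z:
print(flat_normal((0, 0, 0), (1, 0, 0), (0, 1, 0)))  # (0.0, 0.0, 1.0)
```

Smooth shading, by contrast, needs per-vertex normals averaged across adjacent faces, which is exactly what the NORMAL accessor provides.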
@cx20 👍 Specifically, it happens when the spheres (which have normals) and the labels (which don't) share a material. I've revised the gist to include normals for the labels, so the spheres are smooth now.
Oh, yeah, that's a bug in THREE.GLTFLoader then. Shared material isn't supposed to matter. 😁 Will fix.
@scurest Although it's not a big problem: when I tried to display it with the same test code, the diameter of the spheres seems to be about 2.4 to 2.5 times larger.
Is it? As far as I can tell, they're all unit spheres: the ones in the original .blend, the ones in the current MetalRoughSpheres, and the ones generated by my script. How do you see a difference?
@scurest I tried testing the two models with the same code. Three.js + new MetalRoughSpheres.gltf (generated by mrspheres.py) result: (screenshot omitted). Unfortunately, I am not familiar with Blender, so I did not know where to check.
Ah, you're right, it's because of this matrix: 1/0.4 = 2.5. I missed it in the glTF because I was looking for a scale, not a matrix. I revised mrspheres.py.
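A matrix like this is easy to overlook when grepping a glTF for "scale", because glTF nodes can carry either separate TRS properties or a single column-major 4x4 `matrix` with the scale baked in. The following sketch (illustrative only, with a made-up uniform 2.5 matrix) shows how to recover the per-axis scale as the length of each of the first three columns.

```python
def column_scales(m):
    """Per-axis scale of a column-major 4x4 matrix (list of 16 floats):
    the Euclidean length of each of the first three columns."""
    cols = [m[0:3], m[4:7], m[8:11]]
    return [sum(c * c for c in col) ** 0.5 for col in cols]

# A glTF node "matrix" with a uniform 2.5 scale baked in:
matrix = [2.5, 0.0, 0.0, 0.0,
          0.0, 2.5, 0.0, 0.0,
          0.0, 0.0, 2.5, 0.0,
          0.0, 0.0, 0.0, 1.0]

print(column_scales(matrix))  # [2.5, 2.5, 2.5]
```

That matches the observed factor: a node scaled by 1/0.4 makes unit spheres render 2.5 times larger.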
@scurest I confirmed that it was improved by the latest version of mrspheres.py.
The same problem can be observed in the IridescenceDielectricSpheres and IridescenceMetallicSpheres models. In contrast to the MetalRoughSpheres, these can be deduplicated automatically:

```ts
import path from "path";
import fs from "fs";
import { NodeIO } from "@gltf-transform/core";
import { dedup } from "@gltf-transform/functions";
import { KHRONOS_EXTENSIONS } from "@gltf-transform/extensions";

const baseDir = "C:/glTF-Sample-Models/";

function ensureDirectoryExists(fileName: string) {
  const directory = path.dirname(fileName);
  if (!fs.existsSync(directory)) {
    fs.mkdirSync(directory, { recursive: true });
  }
}

async function runOptimize(modelName: string) {
  const inputDir = baseDir + modelName + "/glTF/";
  const outputDir = baseDir + modelName + "/glTF-Optimized/";
  const inputFileName = inputDir + modelName + ".gltf";
  const outputFileName = outputDir + modelName + ".gltf";
  const io = new NodeIO().registerExtensions(KHRONOS_EXTENSIONS);
  const document = await io.read(inputFileName);
  await document.transform(dedup());
  const jsonDocument = await io.writeJSON(document);
  ensureDirectoryExists(outputFileName);
  fs.writeFileSync(outputFileName, JSON.stringify(jsonDocument.json, null, 2));
  for (const uri of Object.keys(jsonDocument.resources)) {
    const resource = jsonDocument.resources[uri];
    const resourceFileName = path.join(outputDir, uri);
    ensureDirectoryExists(resourceFileName);
    fs.writeFileSync(resourceFileName, resource);
  }
}

async function run() {
  await runOptimize("IridescenceDielectricSpheres");
  await runOptimize("IridescenceMetallicSpheres");
}

run();
```

Now, whatever we're doing with that: we'll still have these 10MB files in the Git history. But I'll probably still open a PR for that, because this reduces the size of the checked-out files considerably.
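The `dedup()` transform above works roughly by collapsing byte-identical resources to a single copy and remapping references to the survivor. Here is a conceptual Python sketch of that idea (hypothetical and simplified; the real gltf-transform implementation deduplicates accessors, meshes, textures, and materials at the object level, not raw byte blobs).

```python
import hashlib

def dedup_accessors(accessor_blobs):
    """Collapse byte-identical blobs; return the unique blobs and an
    old-index -> new-index remapping table."""
    seen = {}      # content hash -> index of the surviving copy
    remap = []     # old accessor index -> new accessor index
    unique = []
    for blob in accessor_blobs:
        key = hashlib.sha256(blob).hexdigest()
        if key not in seen:
            seen[key] = len(unique)
            unique.append(blob)
        remap.append(seen[key])
    return unique, remap

# 100 spheres that all carry identical position data collapse to one
# accessor; the label geometry stays separate:
blobs = [b"sphere-position-data"] * 100 + [b"label-data"]
unique, remap = dedup_accessors(blobs)
print(len(unique))  # 2
```

This is why the transform helps models whose geometry was duplicated per instance, but cannot help MetalRoughSpheres, where every sphere's texture coordinates genuinely differ.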
@javagl Perhaps both can be left in the directory, with a separate explanation of the optimization.
@DRx3D I'm not sure what you mean. I think that we should generally try to make sure that the sample models (or sample assets) are good. This term has many dimensions and trade-offs (and maybe some of them are somewhat subjective). But I cannot imagine any reason not to reduce the size of a model by 97% while also reducing the amount of RAM that is required for rendering it. There are no disadvantages or drawbacks, from what I can tell. Unfortunately, this optimization is not applicable to the MetalRoughSpheres.
It looks like this issue has already been resolved for the iridescence sphere tests. They are substantially smaller in this repo than in the previous one. For MetalRoughSpheres, it looks like we now have both a large textured version and the small non-textured version side by side. Is it intentional that we keep both? Is only one needed?
The iridescence models had been updated in #6. The original version still increases the repo size, but I still think that it's worthwhile to have a small version of the model, e.g. when it is supposed to be a basic demo model that may be included in some viewer package, and to have faster downloads when selecting it in the glTF-Sample-Viewer. For the MetalRoughSpheres: I'm not sure about the roles or intentions of the original model and the non-textured one. I don't have a strong opinion (beyond "the model should not be so large"), but...
But... replacing the textures would make it very similar to the non-textured version.
From just looking at the glTF file and textures, it's hard to understand the actual structure of the MetalRoughSpheres model. The asset reports its generator as "COLLADA2GLTF with hand-edits", so I'm not sure whether the culprit here is the original COLLADA file, COLLADA2GLTF, or (unlikely) the hand-editing. There seem to be five meshes, but it's not clear which part of the model belongs to which mesh (and it may be fiddly to figure it out).
But please correct me if I'm wrong at any point here:
- The `bin` buffer file has nearly 11 megabytes
- From the `accessor` structure, it seems that the whole scene contains approximately 500000 triangles
- The same geometry should fit into a `bin` file with less than 400 kilobytes

So when this is considered or supposed to be a "very basic test model for PBR in glTF", then this may not shine the best light on glTF ("Compact"? "No, not at all...").
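The "less than 400 kilobytes" figure above can be sanity-checked with a rough estimate (mine, not from the original post): if one sphere's geometry were shared by all instances, the binary payload would be a single sphere's vertex and index data. Per-sphere counts are the 2562 vertices and 7680 triangles stated earlier in this thread; label geometry and padding are ignored.

```python
# One shared sphere: float32 positions/normals/UVs plus uint16 indices.
VERTS = 2562
TRIS = 7680

pos = VERTS * 3 * 4          # positions: 3 float32 components per vertex
nrm = VERTS * 3 * 4          # normals:   3 float32 components per vertex
uv  = VERTS * 2 * 4          # texcoords: 2 float32 components per vertex
idx = TRIS * 3 * 2           # indices:   3 uint16 values per triangle

bin_bytes = pos + nrm + uv + idx
print(bin_bytes)             # -> 128064 bytes, well under 400 KB
```

Even tripling that for labels and slack stays far below the roughly 11 MB that the current `bin` file occupies.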
At the risk of sounding self-righteous: in the (now somewhat outdated) https://github.com/javagl/gltfTestModels/tree/master/SimpleSpheres test model, I used one sphere and showed it 25 times...