Add glTF exporter #3366
Conversation
I tried to get this to work by using this function:

```javascript
function download(buffers) {
    var element = document.createElement('a');
    var blob = new Blob(buffers, { type: "octet/stream" });
    element.href = URL.createObjectURL(blob);
    element.download = 'scene.glb';
    element.style.display = 'none';
    document.body.appendChild(element);
    element.click();
    document.body.removeChild(element);
}

// Export as scene.glb
const exporter = new GltfExporter();
const arrayBuffer = exporter.buildGlb(entity);
download([arrayBuffer]);
```

I can download the generated file, but am I doing something wrong or does it not work for meshes yet?

Generated file: scene.zip

EDIT: I just thought about running it through this validator, and there are quite a few problems left:
Co-authored-by: Hermann Rolfes <lama12345@gmail.com>
One entity can have multiple meshInstances, and each meshInstance.node (GraphNode) belongs to its own JSON glTF node. So first all JSON nodes need to be written, and then resources.meshInstances can be iterated to assign the mesh IDs to the proper JSON nodes:
scripts/exporters/gltf-exporter.js (outdated)

```javascript
if (entity.render && entity.render.enabled) {
    entity.render.meshInstances.forEach((meshInstance) => {
        node.mesh = resources.meshInstances.indexOf(meshInstance);
    });
}

if (entity.model && entity.model.enabled) {
    entity.model.meshInstances.forEach((meshInstance) => {
        node.mesh = resources.meshInstances.indexOf(meshInstance);
    });
}
```
Suggested change:

```javascript
if (entity.render && entity.render.enabled) {
    entity.render.meshInstances.forEach((meshInstance) => {
        node.mesh = resources.meshInstances.indexOf(meshInstance);
    });
}
if (entity.model && entity.model.enabled) {
    entity.model.meshInstances.forEach((meshInstance) => {
        node.mesh = resources.meshInstances.indexOf(meshInstance);
    });
}
```
This will be addressed in follow-up PRs.
scripts/exporters/gltf-exporter.js (outdated)

```javascript
        });
    }
}
```
Suggested change:

```javascript
assignMeshIDs(resources, json) {
    var meshInstances = resources.meshInstances;

    // Assign meshes to JSON nodes based on meshInstance.node
    meshInstances.forEach(meshInstance => {
        const entityIndex = resources.entities.indexOf(meshInstance.node);
        if (entityIndex != -1) {
            json.nodes[entityIndex].mesh = meshInstances.indexOf(meshInstance);
        } else {
            console.warn('GltfExporter#assignMeshIDs> meshInstance referring to unexported json node');
        }
    });
}
```
Since there is still only one node for each entity, this will result in each meshInstance in an entity overwriting json.nodes[entityIndex].mesh as we loop through them, right? Should we be checking for meshInstances in models/renders in writeNodes and creating a new JSON node for each?
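As a hedged sketch of that idea (a hypothetical helper, not the PR's actual code), each meshInstance could get its own glTF node, so multiple meshes on one entity no longer overwrite each other's `mesh` index:

```javascript
// Hypothetical sketch (not the PR's actual code): when an entity carries more
// than one meshInstance, create one child glTF node per mesh instead of
// repeatedly overwriting json.nodes[entityIndex].mesh.
function assignMeshesToNodes(json, entityIndex, meshIndices) {
    const parent = json.nodes[entityIndex];
    if (meshIndices.length === 1) {
        // single mesh: assign directly to the entity's own node
        parent.mesh = meshIndices[0];
        return;
    }
    // multiple meshes: one new child node per mesh
    parent.children = parent.children || [];
    for (const meshIndex of meshIndices) {
        parent.children.push(json.nodes.length);
        json.nodes.push({ mesh: meshIndex });
    }
}
```

This keeps the common single-mesh case flat while still producing a valid node per mesh in the multi-mesh case.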
Yes, you're right, @CynthiaXJia, there is a problem here. I'm not 100% sure of the right solution at the moment, but what you suggest sounds sensible.
This will be addressed in follow-up PRs.
this is addressed here now: #4683
scripts/exporters/gltf-exporter.js (outdated)

```javascript
this.writeBuffers(resources, json);
this.writeBufferViews(resources, json);
this.writeCameras(resources, json);
this.writeNodes(resources, json);
```
Suggested change:

```javascript
this.writeNodes(resources, json);
this.assignMeshIDs(resources, json);
```
I reworked the code to de-interleave the vertex buffer data and tested it on SeeMore (as a non-trivial test case). The glTF can be opened in Blender (for some reason there are still errors that prevent full loading), or in the PlayCanvas Viewer.

SeeMore demo: https://playcanv.as/p/MflWvdTW/
SeeMore.glb file: Seemore_1629624802680_7.zip
JavaScript code for quick testing: https://github.com/KILLTUBE/gltf/blob/master/src/gltf-exporter.js (just pick a frame in devtools and copy & paste)

It would be nice to get to the point of a fully working, error-free SeeMore glTF/glb file. Currently I can't even get an error report from KhronosGroup/glTF-Validator (it runs out of memory).
Cool to see you experimenting with this. I was wondering why, in your version, you are de-interleaving the vertex buffer data. In my version, I'm trying to leave the vertex buffer data untouched. Ideally, it will handle both cases. Interleaved vertex buffer data is generally considered to be more efficient.
Thanks! I de-interleaved the vertex buffer mostly because I just wanted it to work, and I'm mostly using playcanvas-gltf, which still has problems parsing interleaved data: playcanvas/playcanvas-gltf#7. Once it generates a GLB file, it can just be loaded into Blender, and Blender can do the interleaving and Draco compression if necessary. I also found the stride logic somewhat confusing, so I just wanted to make it easier to understand.
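To illustrate what de-interleaving means here (an illustrative sketch, not the PR's code; the function name and byte-based parameters are assumptions), each attribute is copied out of the shared interleaved buffer into its own tightly packed array:

```javascript
// Illustrative sketch: extract one float32 attribute from an interleaved
// vertex buffer into a tightly packed Float32Array. byteOffset is where the
// attribute starts, byteStride is the size of one whole vertex in bytes.
function deinterleaveFloat32(arrayBuffer, byteOffset, byteStride, numComponents, vertexCount) {
    const view = new DataView(arrayBuffer);
    const out = new Float32Array(vertexCount * numComponents);
    for (let v = 0; v < vertexCount; v++) {
        for (let c = 0; c < numComponents; c++) {
            // little-endian read, 4 bytes per float component
            out[v * numComponents + c] = view.getFloat32(byteOffset + v * byteStride + c * 4, true);
        }
    }
    return out;
}
```

For example, with a layout of position (3 floats) followed by UV (2 floats), the stride is 20 bytes, positions start at byte offset 0, and UVs at byte offset 12.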
Hi Will,
@tanaydimriepigraph Unfortunately, Will is backlogged on some internal stuff and is not able to recommence work on this just yet. He is eager to get back to it though. |
* Add glTF exporter
* Lint fixes
* Update scripts/exporters/gltf-exporter.js (Co-authored-by: Hermann Rolfes <lama12345@gmail.com>)
* Use pc.math.roundUp
* Switch from forEach and push to map
* Updated to the same format as usdz exporter, added example
* glTF exporter returns a promise to match usdz exporter
* Handling all formats / types / semantics, fixes bench export (index buffer format was the issue here)
* Updated example
* Cleanup
* Small cleanup
* Evaluate min & max with better precision
* Fix based on comment
* A dirty solution to get non-interleaved VBs working (with validation errors)

Co-authored-by: Hermann Rolfes <lama12345@gmail.com>
Co-authored-by: Martin Valigursky <mvaligursky@snapchat.com>
```javascript
const getSemantic = (engineSemantic) => {
    switch (engineSemantic) {
        case pc.SEMANTIC_POSITION: return 'POSITION';
        case pc.SEMANTIC_NORMAL: return 'NORMAL';
        case pc.SEMANTIC_TANGENT: return 'TANGENT';
        case pc.SEMANTIC_COLOR: return 'COLOR_0';
        case pc.SEMANTIC_BLENDINDICES: return 'JOINTS_0';
        case pc.SEMANTIC_BLENDWEIGHT: return 'WEIGHTS_0';
        case pc.SEMANTIC_TEXCOORD0: return 'TEXCOORD_0';
        case pc.SEMANTIC_TEXCOORD1: return 'TEXCOORD_1';
        case pc.SEMANTIC_TEXCOORD2: return 'TEXCOORD_2';
        case pc.SEMANTIC_TEXCOORD3: return 'TEXCOORD_3';
        case pc.SEMANTIC_TEXCOORD4: return 'TEXCOORD_4';
        case pc.SEMANTIC_TEXCOORD5: return 'TEXCOORD_5';
        case pc.SEMANTIC_TEXCOORD6: return 'TEXCOORD_6';
        case pc.SEMANTIC_TEXCOORD7: return 'TEXCOORD_7';
    }
};
```
In glb-parser.js there is:

```javascript
const gltfToEngineSemanticMap = {
    'POSITION': SEMANTIC_POSITION,
    'NORMAL': SEMANTIC_NORMAL,
    'TANGENT': SEMANTIC_TANGENT,
    'COLOR_0': SEMANTIC_COLOR,
    'JOINTS_0': SEMANTIC_BLENDINDICES,
    'WEIGHTS_0': SEMANTIC_BLENDWEIGHT,
    'TEXCOORD_0': SEMANTIC_TEXCOORD0,
    'TEXCOORD_1': SEMANTIC_TEXCOORD1,
    'TEXCOORD_2': SEMANTIC_TEXCOORD2,
    'TEXCOORD_3': SEMANTIC_TEXCOORD3,
    'TEXCOORD_4': SEMANTIC_TEXCOORD4,
    'TEXCOORD_5': SEMANTIC_TEXCOORD5,
    'TEXCOORD_6': SEMANTIC_TEXCOORD6,
    'TEXCOORD_7': SEMANTIC_TEXCOORD7
};
```
An object in itself is shorter, but since it's basically the same object (just inverted), I started to think these variables could be exported and then just automatically inverted, for example:
```javascript
function objectInvert(obj) {
    const ret = {};
    for (const key in obj) {
        const val = obj[key];
        ret[val] = key;
    }
    return ret;
}

export const engineToGltfSemanticMap = objectInvert(gltfToEngineSemanticMap);
```
(but maybe this is just overcomplicating things)
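For what it's worth, the inversion round-trips as expected; here is a quick self-contained check with placeholder values standing in for the engine constants (the values below are illustrative assumptions, not the engine's actual constants):

```javascript
// Invert a key -> value map into a value -> key map. Assumes values are
// unique, which holds for the semantic maps discussed above.
function objectInvert(obj) {
    const ret = {};
    for (const key in obj) {
        ret[obj[key]] = key;
    }
    return ret;
}

// Placeholder values for illustration only (not the real pc.SEMANTIC_* values).
const gltfToEngine = {
    'TEXCOORD_0': 'ENGINE_TEXCOORD0',
    'JOINTS_0': 'ENGINE_BLENDINDICES'
};
const engineToGltf = objectInvert(gltfToEngine);
```

Since every glTF attribute name maps to a distinct engine semantic, no entries are lost in the inversion.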
I started with the object initially, but the engine needs to be loaded before the pcx (extras) scripts, so static-time access to the pc constants is not possible here. Instead, I added a function, which executes after the engine has been loaded. When we have full tree-shaking, and make this a proper module, we'd definitely do something else.
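The deferred-lookup approach described above can be sketched like this (a simplified illustration, not the actual pcx code; `getSemanticName` and the cached map are assumptions for this sketch):

```javascript
// Simplified sketch of the deferred-lookup pattern: the map over pc constants
// is built lazily on first call, after the engine has loaded, rather than at
// module-evaluation time (when `pc` would not yet be available).
let engineToGltfSemantic = null;

function getSemanticName(pc, engineSemantic) {
    if (!engineToGltfSemantic) {
        engineToGltfSemantic = {
            [pc.SEMANTIC_POSITION]: 'POSITION',
            [pc.SEMANTIC_NORMAL]: 'NORMAL',
            [pc.SEMANTIC_TEXCOORD0]: 'TEXCOORD_0'
            // ... remaining semantics omitted for brevity
        };
    }
    return engineToGltfSemantic[engineSemantic];
}
```

The map is built once and cached, so repeated lookups pay no construction cost.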
Implements a glTF exporter. Currently supports:
API is:
Also exposed a function on the engine's BoundingBox to evaluate min and max for an array of vertices:
The new engine example GltfExport converts a source scene created from multiple glbs into a single glb (limited material & texture support).
I confirm I have read the contributing guidelines and signed the Contributor License Agreement.