WebGPURenderer: Align integer attribute check of WebGL backend. #28918

Merged 2 commits on Jul 19, 2024
2 changes: 1 addition & 1 deletion src/renderers/webgl-fallback/nodes/GLSLNodeBuilder.js
@@ -476,7 +476,7 @@ ${ flowData.code }

const array = dataAttribute.array;

- if ( ( array instanceof Uint32Array || array instanceof Int32Array || array instanceof Uint16Array || array instanceof Int16Array ) === false ) {
+ if ( ( array instanceof Uint32Array || array instanceof Int32Array || array instanceof Int16Array ) === false ) {
@RenaudRohlinger (Collaborator) commented on Jul 24, 2024:

This just breaks Uint16Array support in the WebGL backend and now generates this kind of error:

[.WebGL-0x13000c4ea00] GL_INVALID_OPERATION: Vertex shader input type does not match the type of the bound vertex attribute.

Firefox:
WebGL warning: drawElementsInstanced: Vertex attrib 1 requires data of type INT, but is being supplied with type FLOAT.

Are you sure about this one? /cc @Mugen87
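
For context, the GL errors quoted above occur when the attribute type declared in the shader and the way the buffer is bound on the JavaScript side disagree. In WebGL2 the two binding paths look roughly like this (plain WebGL2 calls for illustration only, not the actual three.js code path):

function bindAttribute( gl, location, size, type, isInteger ) {

	if ( isInteger ) {

		// Integer path: the shader must declare the input as an int/uint type (ivec*/uvec*).
		gl.vertexAttribIPointer( location, size, type, 0, 0 );

	} else {

		// Float path: the data is converted (and optionally normalized) to float,
		// matching a float/vec* input in the shader.
		gl.vertexAttribPointer( location, size, type, false, 0, 0 );

	}

}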

@Mugen87 (Collaborator, Author) commented on Jul 24, 2024:

This change was required to make compressed models work with the WebGL backend, so I would say the previous approach wasn't right.

Do you mind demonstrating with a fiddle how the error occurs?

Ideally, the GLSL builder should only generate integer types (ivec*/uvec*) if the gpuType is IntType or when a Uint32Array or Int32Array is used. In all other cases, the shader type should be float.

Related: #28920 (comment)
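
A minimal sketch of that rule, using an illustrative helper name rather than the actual GLSLNodeBuilder implementation (IntType is the three.js constant from src/constants.js):

// Hypothetical helper: should the attribute be declared with an integer
// GLSL type (ivec*/uvec*) instead of a float type (vec*)?
function usesIntegerShaderType( attribute ) {

	// An explicit integer gpuType always forces the integer path.
	if ( attribute.gpuType === IntType ) return true;

	const array = attribute.array;

	// Only 32-bit integer arrays imply an integer shader type by default;
	// 8- and 16-bit arrays fall back to float unless gpuType says otherwise.
	return array instanceof Uint32Array || array instanceof Int32Array;

}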


nodeType = nodeType.slice( 1 );

2 changes: 1 addition & 1 deletion src/renderers/webgl-fallback/utils/WebGLAttributeUtils.js
@@ -135,7 +135,7 @@ class WebGLAttributeUtils {
bytesPerElement: array.BYTES_PER_ELEMENT,
version: attribute.version,
pbo: attribute.pbo,
- isInteger: type === gl.INT || type === gl.UNSIGNED_INT || type === gl.UNSIGNED_SHORT || attribute.gpuType === IntType,
+ isInteger: type === gl.INT || type === gl.UNSIGNED_INT || attribute.gpuType === IntType,
id: _id ++
};
