need docs about GLboolean (8-bit) vs GL_TRUE/GL_FALSE (32-bit) #7

Open

Lokathor (Owner) opened this issue Mar 7, 2021 · 0 comments

So the deal is that GL_TRUE and GL_FALSE are of type GLenum, which is a 32-bit value, and then separately you sometimes need to pass a GLboolean, which is an 8-bit value.
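For concreteness, here's a minimal sketch of the mismatch, assuming the standard GL widths from the C headers (the exact definitions in this crate may differ):

```rust
// Minimal sketch of the relevant GL types, using the standard widths
// from the C headers.
pub type GLenum = u32; // 32-bit
pub type GLboolean = u8; // 8-bit

pub const GL_TRUE: GLenum = 1;
pub const GL_FALSE: GLenum = 0;

// A stand-in with the same shape as glDepthMask(GLboolean flag):
fn depth_mask(flag: GLboolean) {
    let _ = flag;
}

fn main() {
    // depth_mask(GL_TRUE); // error[E0308]: expected `u8`, found `u32`
    depth_mask(GL_TRUE as GLboolean); // an explicit cast is required
}
```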

Since Rust doesn't automatically coerce between integer types the way C does, you can't actually use GL_TRUE or GL_FALSE in both positions. If we define them as u8 values then they won't match GLenum usage, and if we define them as u32 values they won't match GLboolean usage.

Also, we can't simply use bool rather than u8, because a GLboolean is sometimes an out param from GL, and I'm too paranoid about GL writing an invalid bit pattern into a bool (any value other than 0 or 1 in a Rust bool is instant undefined behavior).
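To illustrate the out-param case, here's a sketch reusing the aliases above. Note that gl_get_booleanv is a hypothetical stand-in for however the loader exposes glGetBooleanv, stubbed so the example stands alone:

```rust
// GL_DEPTH_WRITEMASK is a real pname that yields a single boolean.
const GL_DEPTH_WRITEMASK: GLenum = 0x0B72;

// Hypothetical stand-in for the loaded glGetBooleanv function pointer.
unsafe fn gl_get_booleanv(_pname: GLenum, data: *mut GLboolean) {
    *data = 1;
}

fn read_depth_writemask() -> bool {
    let mut raw: GLboolean = 0;
    unsafe { gl_get_booleanv(GL_DEPTH_WRITEMASK, &mut raw) };
    // Convert on the Rust side: any non-zero byte counts as true. If `raw`
    // were a Rust `bool` and GL wrote any value other than 0 or 1, that
    // would already be undefined behavior before we even looked at it.
    raw != 0
}
```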

  • For now: users should write true as _ or false as _ when they need a GLboolean. Then, in a future breaking release, we can probably make GLboolean a proper newtype over u8, similar to VkBool in the vkvk crate (see the sketch below).
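Here's a rough sketch of what that newtype could look like, replacing the plain u8 alias (this is an assumption about the eventual design, not a committed API):

```rust
/// Sketch of a possible future GLboolean newtype (hypothetical design).
/// repr(transparent) keeps it ABI-compatible with the raw u8, so GL can
/// still read it as a parameter and write it as an out param.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
#[repr(transparent)]
pub struct GLboolean(pub u8);

impl From<bool> for GLboolean {
    fn from(b: bool) -> Self {
        Self(b as u8)
    }
}

impl From<GLboolean> for bool {
    fn from(b: GLboolean) -> Self {
        // Tolerate whatever bit pattern GL wrote: non-zero means true.
        b.0 != 0
    }
}
```

With something like this, call sites could write glDepthMask(true.into()) instead of glDepthMask(true as _), and out params stay sound because every u8 bit pattern is a valid GLboolean.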