Compute shaders in Ebitengine #3080
Replies: 2 comments
-
Thank you for the suggestion. As Ebitengine is just a 2D game engine, I don't plan to add compute shaders. I won't close this discussion thread, so feel free to put your ideas here.
-
Thanks for the reply. I would just add a few more arguments: compute shaders are not about being 3D or anything like that. They are about being able to do efficient parallel computing. For a very basic example, we could do some very fast linear algebra in them that our game might need. A famous example is Noita: it is a 2D game, but it needs cellular-automata capabilities that are pretty unreasonable to implement without compute shaders, which is why its author had to use a custom engine. There are many such examples; it all depends on the needs of the developer. Thanks for taking the time to reply and consider the question.
-
Thanks for the engine. It looks great.
My question is, as in the title: is there a way to use compute shaders in Ebitengine?
In Kage, I see that we can use fragment shaders, but compute shaders and other shader types are not an option.
I tried several ways to get the window's graphics context in order to access OpenGL functionality (other GPU APIs would be acceptable too), but I did not manage to do it. I really do not want to hack the original engine, as I think hacking would only complicate everything with little to gain. I even think I once saw a comment about Ebitengine using an old API for compatibility reasons, one that probably could not use compute shaders? Or am I wrong?
So am I correct in thinking that it is not possible right now to get the window's context and do custom OpenGL/Vulkan/etc. operations as advanced as compute shaders? If so, I would ask you to consider this feature for the engine.
It would also be great to have a very basic example of using such a context in the examples directory of the repo. I do not care much whether it is OpenGL or anything else, as long as it is a standard API that can be learned and used with confidence.
In my own use case, I needed compute shaders not for graphics, but for exactly what they are named for: efficient parallel computing.
I understand that accessing such a context could introduce some weird bugs if one is not very careful; we would be stepping outside the engine's guarantees, after all. That is what customization entails.
But I do think that the people who would use this type of context would be quickly filtered out by the problems it brings. It would be even better if the feature were clearly and openly marked as "not recommended", to warn off anyone who is not aware of the "monsters ahead". Even with all of that said, I still think such a feature would give the engine much more power to support ambitious custom games that it cannot support as it is now (or as I think it is now).
Thanks.