Extension for Procedural textures #1889
Comments
While supporting the feature in general, I have to comment on some details.
MIME type is a formally defined term (administered by IANA) designated for identifying specific data formats, so glTF's mimeType property is not the right place to declare a procedural recipe. To accommodate the possibility of other texture sources, we deliberately made texture.source optional. Assuming that procedural definitions may be reused for different textures, the definitions themselves should be stored in a new top-level array.
Thanks for your insights @lexaknyazev - this is way above my level of expertise! Could you please explain what 'a new top-level array' means and what this would look like?
"A new top-level array" is an array defined within an extension object that is located in the root-level Here's how it may look for procedural textures. {
"extensionsUsed": [
"IKEA_textures_procedural"
],
"asset": {
"version": "2.0"
},
"extensions": {
"IKEA_textures_procedural": {
"procedures": [
{
"type": "WOOD",
...
}
]
}
},
"textures": [
{
"extensions": {
"IKEA_textures_procedural": {
"procedure": 0
}
}
}
]
}
If we go down the path of supporting procedural textures, we should seriously consider existing standards like MaterialX Standard Nodes. I'm not suggesting that we include all of MaterialX here — just that we use similar standard nodes for procedural textures, effectively supporting the "pattern graph" subset of MaterialX. This provides very flexible inputs to the existing glTF material system (and upcoming extensions) without defining entirely new shading models. MaterialX can be used with Autodesk Standard Surface — a very good match for glTF's direction — but MaterialX also has additional features for defining custom surface shaders, and I think that should be out of our scope for the time being, as I'm not confident it fits into the realtime transmission ecosystem well right now.
Thanks @donmccurdy. I guess the first step is to understand what is already available - apart from MaterialX, what are the other (open) ways of procedurally generating textures?
Ah ok, I think I misunderstood - I thought you were referring to the mime-types.
NVIDIA's MDL is another good example. My understanding is that it interops well with MaterialX, and maybe even uses the same standard nodes (citation needed?). I'm not aware of any other open, declarative representation of a procedural texture. I say "declarative" because you could argue that OSL, GLSL, etc. are also representations of a procedural texture, but they're not compelling to me for this use case, because they're imperative, less composable, and less portable.
I'd recommend putting the extension on the texture object — this fits very well with how we extend for KTX and WebP textures today.
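For reference, a rough sketch of that existing pattern: KHR_texture_basisu keeps a standard image in texture.source as a fallback and points to the KTX2 image from the extension, and a procedural extension could sit in the same place.
{
  "textures": [
    {
      "source": 0,
      "extensions": {
        "KHR_texture_basisu": {
          "source": 1
        }
      }
    }
  ],
  "images": [
    { "uri": "fallback.png" },
    { "uri": "texture.ktx2" }
  ]
}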
I'd strongly prefer to keep the semantics of the existing image objects (and their mimeType) unchanged.
The very first step could be even simpler: define and name the most-required texture patterns using something like MaterialX or MDL and just refer to those names from the extension.
I suspect that defining high-level texture patterns (wood, checkerboard, etc.) would turn out to be more complicated than defining a discrete set of low-level, composable nodes. MaterialX is able to provide relatively simple OSL reference implementations for each of its standard nodes, which is a big plus to me. Practically, an engine that supports these high-level texture patterns will absolutely need to assemble shaders at loading time, whether through a node-based system or direct shader generation. If that's the case, I think we should go directly to lower-level nodes as MaterialX, MDL, and USD are doing.
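To make the contrast concrete, a hypothetical low-level graph for a two-tone wood-ish color could look something like this. The node names (texcoord, noise2d, mix) are borrowed from MaterialX standard nodes, but the JSON structure is invented purely for illustration:
{
  "procedures": [
    {
      "nodes": [
        { "name": "uv", "type": "texcoord" },
        { "name": "grain", "type": "noise2d", "inputs": { "texcoord": "uv" } },
        { "name": "out", "type": "mix",
          "inputs": { "fg": [0.55, 0.35, 0.2], "bg": [0.3, 0.18, 0.1], "mix": "grain" } }
      ],
      "output": "out"
    }
  ]
}
An engine would evaluate such a graph in topological order per pixel (or compile it to a shader), rather than special-casing a monolithic "WOOD" type.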
FWIW, MatX is higher-level than MDL; MatX is a general-purpose shader graph, and MDL is specifically a physically based model. MatX will ultimately target MDL, similarly to how it targets OSL and GLSL.
Interesting topic, been thinking about something like this for a while 😄 One thing that could be worth considering is how these procedurals are created by artists - could this e.g. fit into how Blender3D would do node-based procedural textures? (I think there is already some interest in the Blender community about this - also check out this guy's stuff: https://gumroad.com/simonthommes). I suppose Substance Designer is also very much in this space.
@donmccurdy For our first step in a prototype we would choose one of the 3D tools that support the creation of wood textures and have the code readily available to reverse that 'magic data' into pixels.
This is exactly what we are aiming for in our prototype, @pjoe.
Great :) I do think, like @donmccurdy, that you would quickly get to the point where it makes most sense to build this from simpler 'nodes', e.g. 4D noise (which is needed to make procedural noise tile). IIRC 4D noise nodes landed in Blender 2.81. At load time you could 'execute' the node graph to generate 'normal' 2D textures (maybe at different resolutions depending on the device) and hand those off to the actual renderer. If you haven't already looked at Substance Designer, it is a good source of inspiration for this kind of system - even though it is proprietary.
Thanks @pjoe - this sounds like what I am after :-) In order not to choke myself with too much to do, I rely on others' expertise when it comes to choosing the editing tool :-) Later on we will definitely look into more complex solutions, such as how to define the procedural texture generation.
I don't think Blender currently has the ability to export these procedural texture nodes in any format (except .blend). You can try asking the Blender community, might be others looking at this (e.g. https://devtalk.blender.org/, https://blender.chat/).
If your goal is to support a discrete list of "known" procedural textures (let's say "WOOD", for example), then for a tightly-controlled pipeline (i.e. you can make the needed changes in 1-2 DCC tools and 1-2 viewers) this should be a reasonable approach. For more widespread adoption I think it would be necessary to shift to lower-level nodes like MaterialX or MDL — I don't expect it is possible to get widespread agreement across the ecosystem on high-level abstractions like "wood". I may open a PR describing a lower-level node-based procedural texture approach at some point, but for the moment it sounds like these are different things, yes. 👍
I've been keeping an eye on things like https://blenderartists.org/t/materialx-blender-integration/700331 and godotengine/godot-proposals#714, but at the moment I don't see any signal from Blender or Godot on how they plan to approach this problem. Blender is working on USD integration, but I'm not sure what that means in terms of node-based materials, if anything.
I agree, our first goal is to do a proof of concept that we are happy with :-)
This is exactly what @rsahlin and I have been discussing. I'm no expert on procedural generation, but I suspect that most procedural representations boil down to a limited number of math operations and noise generation primitives, i.e. a simple math expression and not a Turing-complete domain-specific language (I do think branching will be required, though). If the representation is low-level enough, other software packages (Substance/Blender) will be able to export their procedural node graphs to this format! Here are the sources for Blender's texture nodes, for example: https://developer.blender.org/diffusion/B/browse/master/source/blender/nodes/texture/nodes/. Some of these nodes are too high-level (bricks), but the sources could be used to re-create a procedural texture that has been created in Blender and exported to a declarative format. After this proof of concept we can define a set of "lowest common denominator" nodes that must be present in an extension.

Another thing we have been talking about is that it would be nice for the generator to be able to use other images/textures in the glTF file as inputs, in addition to simple value constants. This way, a noise/mask/grain texture could be embedded into the glTF and then used in the generation step for a procedural wood texture at load time.

I think the focus should be on load-time generation; we don't have to go to runtime evaluation to get the benefits of procedural generation. As long as we take the "pixel-centric" approach, per-pixel evaluation at runtime should be possible in the future.
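As a sketch of that image-input idea (the structure and property names here are hypothetical, not a proposal), a procedure could reference an entry in the glTF images array alongside scalar parameters:
{
  "IKEA_textures_procedural": {
    "procedures": [
      {
        "type": "WOOD",
        "inputs": {
          "grainMask": { "image": 2 },
          "ringScale": 0.35
        }
      }
    ]
  }
}
At load time the generator would decode image 2 like any other glTF image and feed it into the wood generation step.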
Right, there are a couple of fundamental building blocks for doing procedural textures, like noise (4D for tiling in uv, 6D if you e.g. also need tiling in time), shape, scatter, etc. My best reference for inspiration is, as mentioned, Substance Designer; see e.g.: https://www.youtube.com/playlist?list=PLB0wXHrWAmCwWfVVurGIQO_tMVWCFhnqE For the Blender source, note that it has a couple of versions of its nodes: one for Cycles and one for Eevee. See e.g. the 4D perlin noise shader here: https://developer.blender.org/diffusion/B/browse/master/source/blender/gpu/shaders/material/gpu_shader_material_noise.glsl$191
Btw, I also found my original investigation into tiling noise in Blender: https://blender.stackexchange.com/questions/135437/how-to-make-tileable-procedural-noise-texture NOTE: 4D noise was added in 2.81 (I think) - not by me :)
Just to confirm: you're saying this extension is a proof of concept and won't use "lowest common denominator" nodes, but that you think that could be a next step afterward? The latter is obviously a big project, so we can certainly start a new thread for that if you'd rather keep this issue focused on something simpler.
I would say that this issue serves the purpose of discussing how an extension for procedural textures could be done. As I see it, both phases are relevant for procedural textures - it's just that for the moment our focus is on the first phase.
Ok, I think so. If you feel that the comments here are getting ahead of your immediate goals feel free to let us know and we can split threads as needed. :)
You do have a valid point.
What if I wanted to do a MaterialX-material-to-glTF-material mapping with a stage-0 approach, where one pretends that the MaterialX definition is a png? Has anyone done that? Stage 1 would be like glTF variants for the procedural properties.
@fire Hi and thanks for your comment. I am not sure I understand what you mean by "pretends that the MaterialX definition is a png"? Do you mean that the input, the recipe if you will, to the procedural generation should be a png (and not the list of operations needed to perform the image generation)? Could you please elaborate?
My friend Lyuma explains: MaterialX in glTF would allow for a generalization of the "KHR_materials_variants" extension - in other words, MaterialX would allow for character or object customization with textures baked at load time. The standard parameters in the shop example of KHR_materials_variants, like colors, would be good for a store. We can also use floats to affect the material output. For example, a float of 0 would generate a texture from SDF nodes showing a triangle, and a float of 1 would generate a square - so the customization is not only a color.
Thanks for the clarification @fire. What we are proposing is a (2D) procedural image generator. It will not aim at affecting geometry or other parts of the material - just provide texture input for basecolor, normal, roughness and possibly occlusion. Hope this answers your question?
Yes. All the possible PBR GLTF2 material parameters that are supported. This is exactly what I want.
Great :-) We are in the process of starting the second phase of the proof of concept - I will be able to share more in the coming months.
Investigating interest in an extension for procedural textures - IKEA_procedural_images
The purpose of this extension is to enable procedurally generated images that can be used as texture references.
The proposed solution will offer the possibility of generating textures at load time by defining new texture sources and a way of describing image generation from a set of image parameters.
One goal of this is to reduce transmission size; another goal is to be able to offer dynamic output depending on input parameters - for instance, altering the exact look of a wood texture at load time.
The initial proof of concept was to verify if the quality of procedurally generated wood textures is good enough - we are convinced that we achieved this.
The goal of this extension is to provide a generic enough descriptive syntax that other types of textures can be created.
It is NOT to create a flexible material graph, nor to force implementations to do texture generation in shaders (on the fly).
This shall certainly be possible; however, it is not the goal for the first iteration.
Think of it as taking a subset of a procedural graph and then locking the operations so that implementations are simpler.
Overview
You may see this as a compiled (or locked) subpart of a material nodegraph.
We estimate that a metalness output is not needed, since the procedural specification targets one material - but this is somewhat undecided.
It is up to implementations to dynamically generate the textures, for instance in the fragment shader; however, this behavior is neither mandated nor the goal of this first iteration of the extension.
Way forward
1: Texture reference
For a procedurally generated texture to be usable as a source, it must be possible to reference it.
This can be done on an image level, where the procedural file is referenced.
The following will add two procedurally generated images as sources, generating the basecolor and normal maps when the image is loaded.
A fallback is specified; however, this is probably not wanted.
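A minimal sketch of how this could look; the extension property names (uri, output) are illustrative assumptions, not a finished schema:
{
  "images": [
    {
      "uri": "fallback_basecolor.png",
      "extensions": {
        "IKEA_procedural_images": {
          "uri": "wood.proc",
          "output": "basecolor"
        }
      }
    },
    {
      "uri": "fallback_normal.png",
      "extensions": {
        "IKEA_procedural_images": {
          "uri": "wood.proc",
          "output": "normal"
        }
      }
    }
  ]
}
Here both images reference the same procedural recipe file and differ only in which target map they ask the generator to produce.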
2: Procedural generator
This part will specify some sort of declarative syntax for understanding the data provided to the generator.
Input will be one or more datasets; output is the generated image (one for each of the texture maps).
It shall be (reasonably) easy to implement, provide sufficient performance for real-world load-time usage, and be able to guarantee that the visual output is the same across implementations.
It shall be possible to output different target maps such as color, normal, roughness etc.
The procedurally generated image is specified in the images array using the extension. It is recommended to require this extension, or to provide only a very basic image fallback (otherwise the whole purpose of reducing transmission size is lost).
Source code (C99) and wasm will be provided for the generator, reducing the implementation overhead.
One part of this project will be to define the (graphic) operators that are needed to create the procedural textures.
Another important part will be to create a test and verification suite so that implementers know that they are getting the textures right.