
KHR_materials_physical_scale #1949

Closed
wants to merge 5 commits

Conversation

bhouston
Contributor

@bhouston bhouston commented Mar 1, 2021

A first attempt at solving the need for reusable materials based on specifying how their intrinsic UVs map to a 1 meter by 1 meter square.

@bhouston bhouston changed the title first proposal for Physical UVs KHR_materials_physicalScale Mar 1, 2021
@mlimper
Contributor

mlimper commented Mar 1, 2021

Thanks for the great proposal! It seems very logical and clean that the material texture itself can be scaled and the rest is then just handled via the UVs on the actual 3D model (assuming 1 UV unit = 1 meter from there on).

Following the first sentence in the Copyright Statement, the "Suggestions" part could possibly go into a new section, "Non-Normative" / "Implementation Notes", maybe along with an implementation suggestion or an example of how to combine this with texture transforms?

@bhouston
Contributor Author

bhouston commented Mar 1, 2021

Thanks for the great proposal! It seems very logical and clean that the material texture itself can be scaled and the rest is then just handled via the UVs on the actual 3D model (assuming 1 UV unit = 1 meter from there on).

Just to be specific, it is the "material" that is scaled to physical size. The textures will be scaled as part of that, but the physicalScale is specified on the material.

@elalish
Contributor

elalish commented Mar 1, 2021

I like the AR use-case, but this seems a little orthogonal to glTF. For instance, this extension would not apply to any actual 3D model, right? Since the UV coordinates nearly always introduce stretch which makes the physical scale non-uniform. Also, specifying how UV directions are to be treated feels arbitrary and hard to future-proof. What about wood grain applied at an angle on a wall?

Couldn't this use case be satisfied without any extension by simply including a mesh quad of the appropriate size with the material applied to it? This would have the advantage of being renderable in a normal viewer like a fabric swatch, instead of just being inexplicably empty. How this would be applied in AR might have some conventions, but I doubt it's ready to be standardized yet.

@bhouston
Contributor Author

bhouston commented Mar 1, 2021 via email

@donmccurdy
Contributor

This is all new to me — do you mind giving an example or two of how "material reuse" is applied in practice? And, how might this extension interact with KHR_texture_transform?

Aside: our extension naming conventions recommend using entirely snake_case moving forward.

@bhouston bhouston changed the title KHR_materials_physicalScale KHR_materials_physical_scale Mar 3, 2021
@bhouston
Contributor Author

bhouston commented Mar 3, 2021

This is all new to me — do you mind giving an example or two of how "material reuse" is applied in practice?

I describe two use cases in the proposal.

Because you are at Google, I think one really good use case would be AR replacement of wall and floor materials -- such as previewing backsplash tiles, hardwood, or wallpaper. ARCore could extract the wall or floor geometry, and then this material could be applied to that geometry at the specified scale so that it looks correct. Right now, AR previewing of tiles, flooring, or wall paint doesn't really work via glTF at all.

The same use case applies in a floor planner where the room is fully virtual. Having a library of glTF materials would enable one to apply them to the walls and floors and easily see physically correct results. With an arbitrary material scale you do not know what you will get, and interchange would be impossible. The only solution would be for each group of people wanting to interchange materials to develop their own conventions for the material's physical scale. Hopefully everyone would pick 1 UV to 1 meter, but that is unlikely.

I think replacing fabric on upholstered chairs, on drapes, or even on clothing is another use case for physical-scale materials. For example, it would allow one to replace a black leather jacket's material with brown leather.

And, how might this extension interact with KHR_texture_transform?

This is fully orthogonal to KHR_texture_transform because the proposed KHR_materials_physical_scale applies to the material as a whole. It would act like a parent scale to all of the internal scales of the material.
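To make the "parent scale" idea concrete, here is a toy sketch. This is not spec text -- the function name, argument names, and the simplified transform order are invented for illustration, with the assumption that 1 UV unit on the mesh equals 1 meter:

```python
import math

def uv_to_texture_space(uv, physical_scale, tex_offset=(0.0, 0.0),
                        tex_rotation=0.0, tex_scale=(1.0, 1.0)):
    """Map mesh UVs (1 UV unit = 1 meter) to final texture coordinates.

    physical_scale: (width_m, height_m) that one tile of the material covers.
    The material-level physical scale acts as a parent transform; the
    per-texture KHR_texture_transform (offset/rotation/scale) is applied
    inside it. The transform order here is simplified for illustration.
    """
    # Parent scale: one material tile spans physical_scale meters.
    u = uv[0] / physical_scale[0]
    v = uv[1] / physical_scale[1]
    # Then a (simplified) per-texture transform, as in KHR_texture_transform.
    u, v = u * tex_scale[0], v * tex_scale[1]
    cos_r, sin_r = math.cos(tex_rotation), math.sin(tex_rotation)
    u, v = cos_r * u + sin_r * v, -sin_r * u + cos_r * v
    return (u + tex_offset[0], v + tex_offset[1])

# A 2 m x 2 m wall region (UVs in meters) with a material whose physical
# size is 0.5 m x 0.5 m tiles 4 times in each direction:
print(uv_to_texture_space((2.0, 2.0), (0.5, 0.5)))  # (4.0, 4.0)
```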

Aside: our extension naming conventions recommend using entirely snake_case moving forward.

Thank you! I've updated the title of this proposal to reflect that.

@bhouston
Contributor Author

This extension would enable this use case: https://apps.apple.com/us/app/primer-supply/id1451986109

@elalish
Contributor

elalish commented Mar 11, 2021

Okay, but you still haven't answered my second question: how is this extension better or different than using a quad?

@rsahlin

rsahlin commented Mar 24, 2021

I don't think that a material should have physical scale.
We have done similar prototypes to the AR use case you mention.
I think this use case can be solved either by using real-world UV scale (together with texture transform) or by applying the material to a quad (which would give the real-world mapping of UVs).
Either way, I think additional specification needs to be done in some kind of guidelines, for instance to know whether the material can be applied to a floor or a wall (X + Z or Z + Y).
Maybe this would be better suited to 3D Commerce?

@bhouston
Contributor Author

@elalish wrote:

Okay, but you still haven't answered my second question: how is this extension better or different than using a quad?

I guess you are saying that, as a convention, if there is just a quad in the file, we can infer that it represents the material's physical size? I worry that is a bit arbitrary. I think an extension that simply specifies this is easier than requiring a very specific scene definition.

@rsahlin

I think this usecase can be solved by either using real world UV scale (together with texture transform) or by applying the material on a quad (which would give the real world mapping of UVs)

My proposal is exactly how 3DS Max specifies real-world scale. Thus I am aligned with you -- we want real-world UV scale! From the 3DS Max docs:

This extension specifies real-world scale in nearly the same way 3DS Max allows you to. :) I am confused why you say that we shouldn't do this, but then suggest we should do it as an alternative.

Substance also specifies a physical size:

  • "Physical Size | This indicated the span the texture would have in the physical world, in X (length), Y (width) and Z (height). It is therefore inherently related to the material which is produced in the graph. The Physical Size can be used, for instance, to display the texture at its correct ratio in the 2D View and 3D View." https://docs.substance3d.com/sddoc/graph-parameters-102400069.html

VRay Scans contain physical size:

  • "The .vrscan file stores information about the physical size of the scanned sample and by clicking on a point over a given object, the texture tiling is modified so that the texture is the correct size for the clicked point." https://docs.chaosgroup.com/display/VMAYA/VRayScannedMtl

The new physically-based material standard from Vizoo, U3M, has a physical size, "length_measurement", in millimeters. It is otherwise nearly equivalent to glTF.

@elalish
Contributor

elalish commented Mar 30, 2021

I guess you are saying as a convention, if there is just a quad in the file, we can infer that this means it is its physical size? I worry that is a bit arbitrary. I think an extension that just specifies this is easier than specifying that one had to have a very specific scene definition.

I would say that any geometry in the glTF exactly specifies the physical scale of any applied material already. I'm concerned about specifying this size again, because then inevitably files will be created where the two do not agree, and what are we supposed to assume then? If you're trying to represent a swatch of floor material, a quad seems quite natural, and in fact matches the physical sample you would be handed at a show room. That seems much better than a glTF with no geometry at all, which most viewers will simply show as an empty scene. For the AR tiling use case, you'll need something external to identify these files regardless, since the use case won't work with an arbitrary glTF, so specifying it differently inside the glTF seems superfluous. I also don't think the geometry needs to be highly specified to work in this use case; anything with uniform UV-stretch (a cube, cylinder, quad, etc) would work, as the scale can be calculated from any single triangle.
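The "scale can be calculated from any single triangle" point above can be sketched in a few lines. This is an illustrative sketch with invented names, assuming uniform UV stretch and glTF's meter scene unit:

```python
import math

def meters_per_uv_unit(p0, p1, uv0, uv1):
    """Ratio of world-space edge length (meters) to UV-space edge length.

    With uniform UV stretch, this ratio is identical for every edge, so a
    single edge of a single triangle determines the material's physical scale.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return dist(p0, p1) / dist(uv0, uv1)

# A 2 m x 2 m quad mapped to the full 0-1 UV square: one edge of any of
# its triangles gives 2 meters per UV unit.
scale = meters_per_uv_unit((0.0, 0.0, 0.0), (2.0, 0.0, 0.0),
                           (0.0, 0.0), (1.0, 0.0))
print(scale)  # 2.0
```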

@bhouston
Contributor Author

bhouston commented Mar 30, 2021

I feel that I have failed to communicate the value of physical scale that so many other tools use: Substance, 3DS Max, V-Ray Scans, Maya, U3M, and most CAD tools. I blame myself for this. I'll close the issue and just work around this.

@bhouston bhouston closed this Mar 30, 2021
@bhouston
Contributor Author

Interesting implementation detail -- though that is outside of whether this is desired in the first place -- MaterialX supports real-world absolute sizes on a per-image basis, rather than at the material level:

http://www.materialx.org/assets/MaterialX.v1.38.Spec.pdf#page=12

"This allows images and the quantities they represent such as displacement amount to be specified at an absolute real-world size, and then be converted automatically to the expected scene units of the application."

@jstone-lucasfilm

Yes, physical distances are expressed in MaterialX using the unit system, with the GLSL/OSL/MDL shader generators automatically translating between units for renderers. This allows the creators of a material library to pin selected float and vector values to the scene unit of their authoring environment (e.g. meters), and importing applications have a straightforward way to visualize these materials in their own scene units.

In spirit this is similar to the handling of color spaces, where the creators of a material library can pin selected colors to one or more spaces in their authoring environment (e.g. lin_rec709), with shader generators automatically translating between spaces for renderers.

@MiiBond
Contributor

MiiBond commented Mar 31, 2021

We (Adobe/Substance) are definitely interested in an extension for real world material scale. I think it's worth re-opening this issue and continuing the conversation.
I can easily see shipping a glTF with a library of materials where each material might have a different real-world scale. Tying it to a specific piece of geometry, though possible (given glTF has a fixed unit of metres), seems cumbersome. Not knowing much about this myself, my assumption would be that that an app like a textile viewer would have its own models to view the materials on. These models would have a known UV-to-real-world-unit conversion and then they could appropriately tile each material in the library, given the scale defined by this extension.

In short, I think this is potentially a very valuable extension. I'll ask someone here with more expertise in this area to comment further.

@MiiBond MiiBond reopened this Mar 31, 2021
@MasterZap

MasterZap commented Mar 31, 2021

I think it would be a mistake to pull this. As you yourself note, this has been a solved problem in 3ds Max for a decade or two.

The basic gist in 3ds Max is:

An object set up for "real world" textures does not use UVs that go in some arbitrary 0-1 space. The UVs are scaled in actual units. (Ideally, we pick a real-world unit and stick to it for everything. Max was never this lucky, so the unit in Max is the "scene unit" as defined by the scene you are in.) So a 10 by 3 piece of wood has UVs that reach from 0-10 in one axis and 0-3 in the other.

Textures applied at "real world" scale are simply scaled accordingly. I.e., if the texture represents a 5 by 5 unit piece of real-world material, the UVs coming from the object are simply divided by the texture's size, to be turned into the actual texture lookup in the 0-1 grid of the bitmap itself.

Done.

So on our 10 by 3 piece of wood, the texture would repeat twice in the U direction and cover 3/5ths of the V direction.
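That arithmetic can be sketched directly (illustrative names, assuming the convention just described):

```python
def real_world_lookup(uv, texture_size):
    """Convert real-world UVs (in scene units) to 0-1 texture coordinates.

    uv: the object's UVs, authored in scene units.
    texture_size: the physical size (in the same units) the texture covers.
    Values above 1 simply mean the texture repeats (wraps).
    """
    return (uv[0] / texture_size[0], uv[1] / texture_size[1])

# The 10 x 3 piece of wood with a 5 x 5 texture: the texture repeats twice
# in U and covers 3/5ths of V, matching the worked example above.
print(real_world_lookup((10.0, 3.0), (5.0, 5.0)))  # (2.0, 0.6)
```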

Interesting implementation detail -- though that is outside of whether this is desired in the first place --
MaterialX supports real-world absolute sizes on a per-image basis, rather than at the material level:

You need both, obviously. You need to know how big your object is. You also need to know how large bit of that surface your texture covers.

/Z

@rsahlin

rsahlin commented Mar 31, 2021

My proposal is exactly how 3DS Max specifies real-world scale. Thus I am aligned with you -- we want to do real world UV scale! From 3DS Max docs:

What I am saying @bhouston is that what you want to do can already be done, in one of two ways.
1: Model with simple geometry, for instance a quad that provides UV to real-world mapping. 1 unit = 1 meter in glTF.
2: Model uses KHR_texture_transform that maps real world UV coordinates (onto the model)

No need for an extension in my opinion.

I am also saying that if anything is needed, I think it is more of a guideline as how to do the above.
As such I do not believe it is part of glTF standard.

@MiiBond
It seems to me that what you want to do is more in line with 3D Commerce / Configuration.
If mapping of UVs to a quad is too cumbersome then perhaps your usecase can be solved by multiple scenes in the model?

  • Yes, I know that having multiple scenes can potentially bloat the file size - if this proves to be an issue I would rather look into how that can be made more efficient (than having this type of extension)

That MaterialX has support for it comes as no surprise to me - they aim to solve the problem of sharing assets in the content creation pipeline.
I don't believe that is what glTF should be for - I see glTF as a format for distributing authored assets.

@jstone-lucasfilm

@rsahlin This is somewhat tangential to Ben's original post, but I wanted to provide my thoughts on one point above:

That MaterialX has support for it comes as no surprise to me - they aim to solve the problem of sharing assets in the content creation pipeline.
I don't believe that is what glTF should be for - I see glTF as a format for distributing authored assets.

I see your point here, and it's true that the emphasis of MaterialX is artist-facing materials in a content creation pipeline, while glTF materials are focused on end-user delivery for real-time rendering. But I wonder if there's really a need for two completely separate material models in these two projects, where an artist authors an asset in a production shading model using MaterialX/USD and then translates it to a new material system for delivery to the end user.

Even within content creation pipelines, there's a strong need to optimize material assets for real-time rendering, and there are MaterialX facilities for texture baking, shader code generation, and other optimizations that can be leveraged in preparing an asset for delivery, without needing to change the underlying shading model or material system.

Could there be benefits in converging these two material systems in the future, focusing on real-time optimizations that preserve the authored look that the artist originally intended, and removing translation steps that simply convert between arbitrary conventions?

@MiiBond
Contributor

MiiBond commented Apr 2, 2021

@rsahlin having multiple scenes would be even more cumbersome. I also agree with @bhouston that including a particular piece of geometry to communicate the scale seems completely arbitrary. The fact that this extension proposes data on the material directly, not tied to any individual mesh, is key. An application can stream down a library of materials (or an individual one) and know, for certain, how to map each material onto a mesh of its choosing while maintaining the real-world scale. Think of a material library like Substance Source.

As far as transforms included with the material go, I would assume that they are relative to the material's real-world scale. i.e. they are applied like always but with a higher-level scale then applied on top to scale the 0-1 space into the real-world dimensions on the model. Make sense?

@rsahlin

rsahlin commented Apr 2, 2021

@MiiBond I don't see how adding a quad or box to this file would make the scale arbitrary?
In its simplest form, the coordinates (e.g. the 4 corners of the quad) would be your real-world coordinates - from these you would know the exact real-world dimensions of your texture.

The other solution of using KHR_texture_transform would not need this mesh.

The third option would be to use metadata to define the material physical scale, because what you are defining is not really part of the model. It's more like additional data to help you achieve some other usecases (distributing material libraries) - perfect for metadata.

A fourth solution could be to put this information in a container type of file (manifest) that 'ties together' for instance multiple glTFs and additional data files (such as sound, physics, collision and behavior scripts)

I think a manifest is a very interesting solution and one that I see the need for in other usecases.

@elalish
Contributor

elalish commented Apr 2, 2021

A fourth solution could be to put this information in a container type of file (manifest) that 'ties together' for instance multiple glTFs and additional data files (such as sound, physics, collision and behavior scripts)

I think a manifest is a very interesting solution and one that I see the need for in other usecases.

This is tangential to this thread, but I find this idea very interesting as well and worth talking about in more depth perhaps somewhere else. I think what you're referring to is a need for an open alternative to Apple's proprietary .reality files. We don't want to expand the scope of glTF to include all that, but some kind of container would be nice. Personally, I think something like an iframe of a little self-contained website might be ideal (leaning on the existing standards of HTML, CSS, and JS). The emerging web packaging standard might help.

@donmccurdy
Contributor

That MaterialX has support for it comes to no surprise to me - they aim to solve problem of sharing of assets in the content creation pipeline... I don't believe that is what glTF should be for - I see the format as a distribution of authored assets.

... I wonder if there's really a need for two completely separate material models in these two projects ...

I might be drifting a bit off topic here, but the material parameters glTF aims to support in the near-to-medium-term are substantively very similar to Autodesk Standard Surface, with significant input from Autodesk (and other parties). To my understanding MaterialX also offers excellent Standard Surface compatibility (along with custom shading models, which I'm more skeptical of adapting into the realtime / publishing context of glTF), so we have at least a promising starting point for alignment now.

In the longer-term I'm also very interested in using something like MaterialX Standard Nodes to enable procedural inputs to the glTF material system. I think I've seen the term "pattern graph" used in MaterialX contexts? Standard Nodes appear to be a nearly ideal way to express procedural textures, and I'd be sorry to invent an incompatible alternative.

@donmccurdy
Contributor

To the original topic, I don't think there is any disagreement here that real-world scale materials are a useful concept. We do also need to reckon with the fact that most software consuming glTF files will have no such concept, and it's unclear what it means for a vendor to "implement" or "not implement" the extension proposed. That is unusual, and I'm worried the extension would not gather wide support, to the detriment of the ecosystem.

For that reason I do agree with @rsahlin's suggestion that this is "metadata", i.e. it is not needed to correctly render the model, and provides information — like author, license, or product SKU — that should ideally be preserved when the model is edited, or even when it is converted to/from formats like USD.

If that's the case, then #1893 (draft, but 98% complete) might be a good solution. It provides a method of attaching metadata to the glTF file, or to specific objects like materials, within an XMP packet. In which case Khronos, or another organization, could define an XMP namespace for real-world material scale with the properties described here. This comes with an important benefit — software editing the glTF file does not need to understand real-world material scale, or any particular XMP namespace, it only needs to understand that there is some metadata attached to a material and that the metadata should be preserved.
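As a purely hypothetical sketch of that idea (the XMP namespace and property names below are invented, and the packet layout assumes the draft in #1893), such metadata might be attached to a material roughly like this:

```python
import json

# Hypothetical glTF JSON: a file-level list of XMP packets, with a material
# referencing one by index. "matscale:" is an invented example namespace,
# not a published standard.
gltf = {
    "extensions": {
        "KHR_xmp": {
            "packets": [
                {
                    "@context": {"matscale": "http://example.com/ns/material-scale/"},
                    "matscale:physicalWidth": 0.5,   # meters spanned by 1 UV unit in U
                    "matscale:physicalHeight": 0.5,  # meters spanned by 1 UV unit in V
                }
            ]
        }
    },
    "materials": [
        {"name": "oak_floor", "extensions": {"KHR_xmp": {"packet": 0}}}
    ],
}

# Software that edits this file only needs to preserve the packet, not
# understand it.
print(json.dumps(gltf, indent=2))
```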

There are scenarios where I'd favor manifest files or wrapping formats (3D Tiles, .reality, etc.) but I'm not sure this is one.

@bhouston
Contributor Author

bhouston commented Apr 3, 2021

I am fine with it being in the per-material metadata as a key or keys. We should have standard key names for it. This is basically equivalent to this extension, just with the data in a different place, but functionally the same or possibly even better.

@jstone-lucasfilm

I might be drifting a bit off topic here, but the material parameters glTF aims to support in the near-to-medium-term are substantively very similar to Autodesk Standard Surface, with significant input from Autodesk (and other parties). To my understanding MaterialX also offers excellent Standard Surface compatibility (along with custom shading models, which I'm more skeptical of adapting into the realtime / publishing context of glTF), so we have at least a promising starting point for alignment now.

Just to provide some additional thoughts on this topic, one straightforward way to advance this alignment would be to create a MaterialX Physically Based Shading graph for the glTF BRDF, allowing it to be rendered in any current or future environment that supports MaterialX content (e.g. Maya, 3dsMax, Arnold, RenderMan). As two examples, here are the MaterialX PBS graphs for Autodesk Standard Surface and UsdPreviewSurface.

Note that the glTF BRDF, while it has some similarities with both Autodesk Standard Surface and UsdPreviewSurface, is not identical to either model. As an example, I don't believe there's any energy conservation between the diffuse and specular lobes in the glTF BRDF, so assets authored in this BRDF will change their look if their inputs are simply remapped to an energy-conserving model such as Autodesk Standard Surface.

Instead of trying to exactly match these shading models, my recommendation would be to create a unique MaterialX graph for the glTF BRDF, allowing content authored for this model to be recreated exactly in any shading language through Shader Generation. Following the pattern for Autodesk Standard Surface and UsdPreviewSurface, the MaterialX graph for the glTF BRDF can be robustly versioned, allowing the model to evolve over time without changing the look of assets authored for earlier versions of the glTF BRDF.

@bhouston
Contributor Author

bhouston commented Apr 5, 2021

Whoa, this discussion went in a very interesting direction -- It would be dream to support node-based shaders in glTF and basing it off MaterialX would be very interesting. I had proposed shader graphs for glTF back in 2019 and I did try to base it a bit off of MaterialX but I got stuck not knowing the right subset to pull out -- https://docs.google.com/document/d/1Y6JFE2FV164IFDe7_cYhp2gzhSapB76fUNPgmsI6DDY/edit

@bhouston
Contributor Author

bhouston commented Apr 5, 2021

@jstone-lucasfilm wrote:

As an example, I don't believe there's any energy conservation between the diffuse and specular lobes in the glTF BRDF,

This should be fixed BTW. This is a bug in glTF BRDF.

@rsahlin

rsahlin commented Apr 6, 2021

As an example, I don't believe there's any energy conservation between the diffuse and specular lobes in the glTF BRDF

@jstone-lucasfilm Could you please point to the BRDF definition that exhibits this behavior?
Of course the glTF BRDF is meant to be energy conserving.

@jstone-lucasfilm

@rsahlin Let me know if there are more rigorous implementations of the glTF BRDF elsewhere, and I'll post the code I've been looking at.

In the glTF Sample Viewer, for example, there doesn't appear to be any energy conservation between the diffuse and specular terms for image-based lights:

https://github.com/KhronosGroup/glTF-Sample-Viewer/blob/b03cdfb5271632563bf03c643c74460583f8f367/source/Renderer/shaders/pbr.frag#L355

In the analytic light path of this viewer, there's an implied Fresnel weighting of the diffuse term, but no consideration of the full directional albedo (e.g. including both the F and G terms of the specular lobe):

https://github.com/KhronosGroup/glTF-Sample-Viewer/blob/b03cdfb5271632563bf03c643c74460583f8f367/source/Renderer/shaders/pbr.frag#L423
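The gap can be illustrated with a toy calculation (this is not the Sample Viewer's actual code, just a simplified model of the issue): summing a Lambertian diffuse term and a Schlick-Fresnel specular term, with no cross-lobe weighting, lets total reflectance exceed 1 at grazing angles:

```python
def schlick_fresnel(f0, cos_theta):
    """Schlick's approximation of the Fresnel reflectance."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def naive_total_reflectance(albedo, f0, cos_theta):
    # Diffuse and specular simply summed, with no (1 - F) weighting of the
    # diffuse lobe and no directional-albedo (F and G) correction.
    return albedo + schlick_fresnel(f0, cos_theta)

# A white dielectric (albedo = 1, F0 = 0.04) viewed at a grazing angle
# reflects more energy than it receives:
print(naive_total_reflectance(1.0, 0.04, 0.05))  # ~1.78, i.e. > 1
```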

It looks like these energy-conservation issues are acknowledged in the glTF specification, but there's no other reference renderer provided, so it's not clear how users would compare their own renders against a "ground truth" visual.

By creating a MaterialX graph for the glTF BRDF, it would be possible to generate accurate reference renders in GLSL, OSL, and MDL as needed by users. This would follow the approach used for Autodesk Standard Surface, where the MaterialX graph has become the reference implementation of the shader, and teams can either choose to generate shading code directly from this definition or use it to guide their own application-specific implementations.

@elalish
Contributor

elalish commented Apr 6, 2021

@jstone-lucasfilm So, the idea is that glTF specifies physically what the materials represent, and then implementations are free to approximate them to their choosing. Therefore energy conservation is required for absolute correctness, but is often not perfect in practice. We use unoptimized path tracing as ground truth, since that is much easier code to check for physical validity. Here is a set of example comparisons of glTF renderers including a path tracer. At the bottom of the page you'll find a furnace test that checks for energy conservation. Like most of the others, the sample viewer is correct for IBLs with metals and dielectrics, but loses some energy during the transition between.

@jstone-lucasfilm

@elalish That's great to see, and I like the inclusion of the path-traced renderer for validation outside of a real-time environment. Still, I can see benefits in having a full MaterialX graph for the glTF BRDF, which can provide both a physical specification and a direct source for reference and production renders. Following the approach for Autodesk Standard Surface, the graph for the glTF BRDF could be included in a future version of the MaterialX repository, allowing applications to render this BRDF automatically without additional libraries.

@rsahlin

rsahlin commented Apr 6, 2021

@jstone-lucasfilm As already pointed out by @elalish, realtime implementations of the glTF spec usually take shortcuts in order to achieve a realistic framerate.
The section you refer to is non-normative and intended as a help for implementers who choose to support realtime rasterization of glTF models. :-)
With regards to ground truth for realtime rasterizers:
I think consistency and a deterministic output is more important than achieving energy conservation in all scenarios.
This is handled (more in depth) by 3D Commerce.

I can't really see the need for a graph based material system.
Personally I am not convinced that is the best way forward for glTF, I think that type of solution is more in line of what is needed in an authoring format.

@bhouston
Contributor Author

bhouston commented Apr 6, 2021

Personally I am not convinced that is the best way forward for glTF, I think that type of solution is more in line of what is needed in an authoring format.

I am biased, as I suggested MaterialX node graphs earlier. But we could look at the precedent of UE4. It offers node-based shader graphs which are executed at run-time, and it provides much more efficient transmission of data than baking out textures every time. It is why UE4 materials can look so real while not taking up a massive amount of space. Both Three.js and Babylon.js also offer node graphs for real-time materials. It greatly simplifies the specification of materials - just specify the BRDF and its inputs, and let the user figure out the textures, transforms, and modulators required to achieve their effect. It is honestly beautiful once you go down that path. Interoperability has been the main challenge for those with graph-based materials.

@proog128
Contributor

proog128 commented Jun 2, 2021

Let me know if there are more rigorous implementations of the glTF BRDF elsewhere, and I'll post the code I've been looking at.

@jstone-lucasfilm The Enterprise PBR Material is a more rigorous implementation of the glTF BRDF. It is energy conserving in all cases, no matter what settings an artist uses.

It looks like these energy-conservation issues are acknowledged in the glTF specification, but there's no other reference renderer provided, so it's not clear how users would compare their own renders against a "ground truth" visual.

In addition to the math described in the Enterprise PBR specification, we provide test scenes and ground truth renderings to check implementations. There is also a WebGL-based path tracer which implements the BSDF.

To give a bit more context to the glTF specification: the specification and with it Appendix B is split into a normative and a non-normative part. The normative part deliberately does not demand energy conservation, since due to the additional computation overhead we don't want to enforce it (yet). So we ended up with a normative part that describes the material as a set of BSDFs and layers, their parameters and how these parameters affect the BSDFs/layers. And a non-normative part that describes a very simplistic implementation of the normative definitions. It's just an example to get people started. Think of the normative part as the MaterialX or MDL node graph, and the non-normative part as the implementation of the nodes (for example, which flavor of GGX shadowing-masking term should be used, or how does the layering operator ensure energy conservation).

The same applies to all the recent extensions, like KHR_materials_transmission, _volume, _sheen, _clearcoat, _specular.

Still, I can see benefits in having a full MaterialX graph for the glTF BRDF, which can provide both a physical specification and a direct source for reference and production renders.

I think a MaterialX graph for the glTF BRDF would be a nice addition. It would allow us to use glTF PBR in many authoring tools. We just have to be careful that the graph really respects the normative part of the glTF specification to achieve consistent renderings. MDL, for example, was lacking a compatible layering operation for the sheen term. Given the high-level structure of nodes in MaterialX, a few details might be missing or incompatible there as well, although from reading the MaterialX spec I believe there will be only very minor issues. (This isn't a coincidence, as we designed the glTF PBR to be compatible with many open material models like Blender's Principled BSDF, Standard Surface, and Enterprise PBR.)

A good starting point for designing a graph might be the overview in this pull request. It depicts the glTF BRDF in a pseudo-"graph language", inspired by MDL.

@RichardTea

To bring this discussion back to the original PR:

A specified material physical scale is very valuable.

I agree that the material scale needs to be attached to the material itself, and not inferred by the meshes it is applied to.

I see two main reasons why:

  • If the material physical size is defined by a specific mesh that uses it, this would require all "material library" glTF files to include meshes. This makes it more difficult for applications to determine whether a glTF contains a material or mesh library, and is likely to result in material libraries 'polluting' an application's mesh database with large numbers of 'dummy' quads.

  • A glTF file that uses the same 'physical material' on multiple meshes/nodes will result in an uncertainty about the actual material size, as each usage will be rounded differently. The errors could be significant when combined with 'quantized' vertex or UV coordinates.

@elalish

elalish commented Jul 6, 2022

  • A glTF file that uses the same 'physical material' on multiple meshes/nodes will result in an uncertainty about the actual material size, as each usage will be rounded differently. The errors could be significant when combined with 'quantized' vertex or UV coordinates.

This is just how UV texture mapping works: it stretches the scale of materials. It's not "uncertain", it simply will not exactly match a scale that is separately defined. As for quads, you only need one: all the materials can be variants on it. I'm curious if you have any solutions to the issues I raised above: #1949 (comment)

@bhouston

bhouston commented Jul 6, 2022

This is just how UV texture mapping works: it stretches the scale of materials. It's not "uncertain", it simply will not exactly match a scale that is separately defined. As for quads, you only need one: all the materials can be variants on it. I'm curious if you have any solutions to the issues I raised above: #1949 (comment)

@elalish I think you are presuming that what is being asked here is not valuable or that we do not understand the alternatives you have described, that isn't the case. Your alternatives are not true alternatives to what I am asking for, which is this:

https://knowledge.autodesk.com/support/3ds-max/learn-explore/caas/CloudHelp/cloudhelp/2020/ENU/3DSMax-Lighting-Shading/files/GUID-27F58B25-C61E-4658-AB1E-7A6C20B23D1F-htm.html

In this extension, I was attempting to provide a solution to the second requirement as described in the Autodesk documentation above:

"The second requirement is available in the Material Editor. All 2D texture maps, such as Bitmap, provide a Use Real-World Scale checkbox on the Coordinates rollout. Like Real-World Map Size, by default this checkbox is off. When on, the U/V parameter names change to Width/Height and the Tiling label changes to Size. You can then specify the horizontal/vertical offsets and size of the texture map in current system units."

I was trying to allow this information, which is stored in 3DS Max and similar tools, to be represented in glTF so we could do this type of mapping. It is distinct from scale/tiling: it is a different value with a different meaning, and it is treated differently by the system. It is incredibly valuable to know and is used extensively in CAD systems, room planners, and other applications where real-world scale is important.
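To make the distinction concrete, here is a minimal sketch of the arithmetic involved, assuming the convention proposed in this PR (1 UV unit maps to 1 meter on the mesh). The function name and the "repeats per meter" framing are illustrative, not part of any shipped extension:

```python
# Sketch: converting a real-world swatch size into a UV tiling factor,
# assuming 1 UV unit == 1 meter on the mesh (the convention in this PR).
# The function name below is a hypothetical illustration.

def physical_scale_to_uv_tiling(width_m: float, height_m: float) -> tuple:
    """A swatch covering width_m x height_m meters repeats 1/width_m
    times per meter in U and 1/height_m times per meter in V."""
    return (1.0 / width_m, 1.0 / height_m)

# A 0.8 m x 1.0 m texture (the example given later in this thread):
scale_u, scale_v = physical_scale_to_uv_tiling(0.8, 1.0)
print(scale_u, scale_v)  # 1.25 repeats per meter in U, 1.0 in V
```

Note the direction of the dependency: the physical size is a property of the material, and the tiling factor falls out of it per mesh, which is the opposite of baking a tiling number into each mesh by hand.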

@elalish

elalish commented Jul 6, 2022

That is interesting; perhaps someone from Autodesk could weigh in here? I just don't understand how they can achieve that mathematically. Can you tell me how you will apply a texture to a sphere in such a way that it has a constant real-world scale?

@bhouston

bhouston commented Jul 6, 2022

Can you tell me how you will apply a texture to a sphere in such a way that it has a constant real-world scale?

It doesn't work in all cases when applying 2D textures to arbitrary 3D objects; its applicability isn't meant to be universal. So even if it makes no sense for spheres, that doesn't mean it isn't useful.

(Although 3DS Max cheats a bit because it can handle both 2D and 3D materials, as do a number of CAD systems, in part because they support procedural volumetric materials. Thus it can theoretically do a sphere correctly with real-world scale if you are using one of these procedural real-world materials.)

But for many things that are flat, like walls, floors, fabrics, it does work really well. This is generally where this is useful.

@bhouston

bhouston commented Jul 6, 2022

For a real world use case, take a look at this:

https://developer.apple.com/augmented-reality/roomplan/

If we were to import one of these models, even if we didn't know anything about the UVs on these objects, knowing the real physical size of the tiles or flooring, etc., we could set this up immediately and it would look great.

These are real world useful scenarios that I am trying to enable.

@bhouston

bhouston commented Jul 6, 2022

To figure out the scaling required, you need to get the world-space magnitude of the spatial derivatives of the UVs on the object. I think you could do this on a triangle-area-weighted basis. If they are consistent, then you can apply a real-world-scale material to this object accurately. If they are inconsistent, then you have something like a sphere, where you cannot apply a real-world-scale material correctly.

@elalish

elalish commented Jul 6, 2022

I like the use case, I just don't understand why you can't enable it with vanilla glTF. What is the gap this extension is closing? glTF is in the business of arbitrary 3D geometry, so it seems odd to make an extension where nearly all glTF models can't be mathematically reconciled against it. The glTF by definition has access to everything about the UVs, so as you say, you can calculate the scale you'll get. You just can't make it constant (unless your UVs have been very carefully designed). And a single quad can easily represent the desired scale of a material swatch, and make it visible to viewers that don't support a special extension.

@bhouston

bhouston commented Jul 6, 2022

glTF is in the business of arbitrary 3D geometry, so it seems odd to make an extension where nearly all glTF models can't be mathematically reconciled against it.

You and I live in different worlds. My world is the world of products. @Threekit has created millions upon millions of glTFs for our clients, and we use real-world materials in them constantly: fabrics, veneers, flooring, etc. I think you are looking at artistic-creation glTFs, and at glTFs that contain baked materials with a unified UV layout.

Myself, RapidCompact, Adobe and ILM/MaterialX and a few others said this was useful in some form. You and Richard said it wasn't. So I guess we don't do it.

This is a frustrating situation for me, but I have other things where I can spend my time productively and life is short, so as I said previously I will move on.

@RichardTea

@bhouston I was agreeing with you and trying to get some traction towards accepting this PR, as it would solve one of my problems and it seemed to be 'parked'.

@elalish I work in the CAD space. Everything has a real size, including the physical materials. The grain of wood, the crystal size of metals, the height and width of a brick are fixed, and need to remain the same regardless of which piece of geometry they are applied to.

In the CAD simulations, the user will apply different materials to the same piece of geometry, and expects the bricks in the texture to stay the size of bricks.

When people create material packs, they want to specify "this texture image is exactly 0.8m by 1.0m". I currently try to work this out from DPI, which is awful.
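The DPI workaround mentioned above amounts to the following arithmetic (1 inch is 0.0254 m). This is a sketch of why it is fragile, since DPI metadata is often missing or wrong; the function name is illustrative:

```python
# Inferring a texture's physical size from image resolution and DPI
# metadata (1 inch == 0.0254 m). Fragile: DPI metadata is frequently
# absent or incorrect, which is the complaint being made above.
def size_from_dpi(width_px: int, height_px: int, dpi: float) -> tuple:
    meters_per_inch = 0.0254
    return (width_px / dpi * meters_per_inch,
            height_px / dpi * meters_per_inch)

# Recovering the "exactly 0.8 m by 1.0 m" example requires the image
# to carry exactly the right DPI value:
print(size_from_dpi(2000, 2500, 63.5))  # ~(0.8, 1.0) meters
```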

@elalish

elalish commented Jul 6, 2022

@RichardTea Agreed, and I've worked in the CAD space as well. You're right, it's just not something we can solve in a format. glTF already unambiguously defines exactly how big every piece of every applied texture is. What you want is a CAD feature, making it easy to create the appropriate UV coordinates with minimal stretch. It's a difficult problem, but there are packages that tackle this. However, if we define these sizes again, it will only conflict with how the format is specified, and won't help us display the objects properly.

@bhouston

bhouston commented Jul 6, 2022

I strongly believe @elalish is blocking something he doesn't understand. It is frustrating and has been for a while.

I think back to a feature in a recent 3D format discussion that I thought was a bad idea. The way I handled it was to explain my position, and then say that if the consensus was to continue with it, so be it, but I wanted my dissenting voice heard. I then backed down and let it proceed. I did this because I am not all-knowing, especially when the group is as experienced and knowledgeable as those who tend to hang around the glTF standardization group.

@bhouston

Closing this for lack of interest.

@bhouston bhouston closed this Oct 17, 2023