glTF 2.0: New KHR_environments extension #946
Interesting. I thought the IBL would just be one form of light that was about to be defined in the new lights extension, but considering that it does not necessarily contribute to the lighting, a dedicated extension makes sense. The lack of support for real cube maps might either require some workarounds or additional extensions. In any case, one probably has to either…
The new light class makes sense. Probably only for PBR materials, but I prefer the reference.
To make sure I'm understanding this, the general case would still be that we expect PBR models to pick up environments from the rendering engine, not ship them with the glTF file itself, right? Take for example the models in sbtron's glTF 2 demo. The same glTF file can be loaded into multiple different environments, and will reflect the selected environment without changes to the glTF itself.
Yes, the general case would be to deploy without the environment map, and I see this extension as a "pure" extension, like PBR specular-glossiness is right now. To explain this extension a little further: normally you deploy/use the glTF 2.0 file without any environment map, so the engine decides which environment lighting is used, as in the images above and as glTF 2.0 is specified right now. But I do see another use case: if I want to send someone an asset and I want this person to see the 3D content exactly the way I intend, I also need to send the environment map plus some additional information. For this reason, I also want to be able to deploy the environment map inside the glTF (especially glb) file.
If this is added, can we add a transform matrix to orient the envmap, and a color to scale it?
Good idea. |
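A minimal sketch of what such per-environment properties could look like (written as a TypeScript object literal; every property name here is hypothetical and not part of any spec):

```ts
// Hypothetical extension entry combining the suggestions above: a rotation to
// orient the environment map and a color/intensity to scale its contribution.
// None of these names are normative; they only illustrate the idea.
const environmentEntry = {
  texture: 0,                       // index of the environment texture (hypothetical)
  rotation: [0.0, 0.0, 0.0, 1.0],   // quaternion orienting the map (x, y, z, w)
  tint: [1.0, 1.0, 1.0],            // linear RGB multiplier applied to sampled radiance
  intensity: 1.0                    // overall scale factor
};
```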
Are these environment maps intended to be used as reflection probes? If so, it might be useful to be able to associate an environment map with a node. In many game engines, a scene may have multiple reflection probes, and it uses the weighted relative distances to the different probes to choose one for a particular object. The specific algorithm would have to be implementation-dependent, but the data should be available.
The original idea is having one static environment map, which influences the whole scene.
Are cubemaps most common? If so, should that be the only one supported to start, or is that too limited compared to what shading tools will create? Could you give a brief rundown of each possibility for type?
CC @moneimne, this discussion may be of interest to you. |
I definitely see a strong use case for this when displaying/previewing glTF models. I expect that when an artist creates an asset, they often want it to be displayed in a relevant environment. If the engine imposes its own environment map, the model might end up looking out of place or awkward in that context. An extension like this might not be as useful when loading multiple glTF models into the same scene, though. Contradicting environment maps would detract from the realism that PBR aims to add. I suppose the engine could ignore the extension in that case.
@pjcozzi Type should define how the environment map is "encoded". I suggest that we should only support the panorama and mirror ball formats.
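For reference, a small sketch of the standard equirectangular ("panorama") mapping referred to above, i.e. how a direction vector maps to UVs on a 2:1 panorama. The Y-up, right-handed convention here is an assumption for illustration, not something defined by the proposal:

```ts
// Map a normalized direction vector to equirectangular (panorama) UVs.
// Assumes a right-handed, Y-up convention; other conventions shift the result.
function directionToEquirectUV(dir: [number, number, number]): [number, number] {
  const [x, y, z] = dir;
  const u = 0.5 + Math.atan2(z, x) / (2 * Math.PI); // longitude -> [0, 1]
  const v = 0.5 - Math.asin(y) / Math.PI;           // latitude  -> [0, 1]
  return [u, v];
}

// Example: the +X direction lands in the horizontal middle of the image.
console.log(directionToEquirectUV([1, 0, 0])); // [0.5, 0.5]
```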
@moneimne Regarding the environment maps included in the glTF scene, we should define this exactly in the specification, e.g. the different deployment cases. Also, in the last case, I would suggest storing the environment map in a separate glTF file without any other scene data. This glTF file would still be a valid glTF file, plus it would have all the information about the environment map, like the type and an additional rotation.
Let me comment on the extension a bit.
One could also want to include envmap probes, i.e. envmaps that are local to a part of the scene. The problem with doing so is that it is very hard to define how to render them appropriately without some form of complex probe interpolation, typically done only on smoothed probes and using some form of angular basis for it. I would leave this out, unless there is clarity on a simple implementation that actually works.
How would an app know "if the asset has to be rendered like the artist wants"? It seems that one environment map would need to always override the other, e.g., "if the runtime has an environment map, the environment map in the extension may be ignored."
Sounds like cube maps will be a lot of work; this is probably why we punted on them earlier. 😄 But are they widely used enough that the work is justified to "get this right"? Any thoughts @lexaknyazev @bghgary?
The app does not know; it depends more on the context. Having an environment map would also imply supporting HDR images. Having cube maps, we would also need to support more samplers. Basically easy to define, but I think it will take some time until all agree. My suggestion is to put all extensions, except lighting and common materials, on hold for now.
Sounds good. |
Vendor extensions too? My two pending extension PRs are ready to merge and implemented.
No, no, just the official "KHR_" ones.
For orienting the environment map, could we add an extension to the node? The main advantage of this is that by having only one way to specify transforms in glTF, we get easier integration in libraries and support for all transformation features, for example animation.
Yes, I think your suggestion is the better approach. Also, it would be possible to have several environment maps, using IBL depending on the position of the actor.
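A sketch of that node-based idea, assuming a hypothetical layout in which a node references an environment and the node's (animatable) rotation orients it:

```ts
// Hypothetical glTF fragment: the environment is attached to a node, so the
// node's rotation (which can be animated like any other node) orients the map.
const gltfFragment = {
  nodes: [
    {
      name: "EnvironmentProbe",
      rotation: [0.0, 0.7071, 0.0, 0.7071],     // 90 degrees around Y
      extensions: {
        KHR_environments: { environment: 0 }     // index into the extension's environment list
      }
    }
  ]
};
```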
@McNopper Regarding environment map encoding: I am not convinced that we should support mirror balls. The reason is that they display heavy distortions (I believe azimuthal equidistant corresponds to mirror ball).
I am fine without support for mirror balls, as we normally use the equirectangular representation, and I think others do not have a strong opinion on this. What we have to define is how these HDR images are encoded. .ktx does have the advantage that it supports cube maps, mip maps, and floating-point textures. Also, it is a Khronos standard.
@McNopper we use equirectangular as well; that said, I am leaning towards cube maps because of the lower distortions (on top of hardware support). In terms of HDR encoding we mostly use .hdr internally (.exr as well).
I mean both should be possible, cube maps and equirectangular, as they are both commonly used. KTX is from Khronos: https://www.khronos.org/opengles/sdk/tools/KTX/file_format_spec/
(New to the community - hello :)) My 2 cents: My impression is that base glTF is meant to define the object (or hierarchy of objects), but not the lighting nor any aspect of the final rendering (post effects, etc. ...although we do have 'camera', which seems odd to me). The choice to stick to PBR materials makes sense for this goal: a glTF object can be dropped into any PBR renderer and look fairly accurate and fitting with the rest of the scene, regardless of lighting. Environment-based IBLs seem to me purely a lighting concern, and thus do not belong in glTF. On the other hand, if glTF is meant to define an entire 3D experience, then certainly you need IBLs, but a bunch of other stuff too. I don't think this is the way to go, since the use cases for a "drop-in object" are numerous, and one could imagine another spec being defined for entire scenes/experiences.
We're looking into something like this, and it would be handy because our app is purely a model viewer, so it'd be nice to have one file for everything, but I agree this should be stored next to a glTF model, not inside of it.
Welcome @stevesan
There was a lot of early discussion of this in 2.0 (see #696 (comment) and #746 and other issues). The general idea was that we do want glTF to be capable of sending whole scenes. But lights and environments and such weren't ready to go when 2.0 was released, so they're being worked on as extensions. If and when the extensions become mature and widely supported, they can be candidates for moving to core glTF in a future version.
Just waiting for a Windows build and thought that I'd write up my current thoughts on this extension:
No immediate plans to move forward with this extension.
I'm just getting started with glTF. I'll be using it for whole-scene transfer between apps, and was surprised to find HDR textures and environment maps missing. Just a data point. |
If KHR_environment requires KHR_lights, how would that work exactly? Does this break any rules, to have one extension require another?
This looks like a bug with the mipmap generation. It should know how to "wrap" the texture while resizing, and then there would be no problem.

IMO, KTX support may not be bad, but if added, it sounds more like another extension. It is a generally useful format if you need custom mipmaps, GPU compression support, etc., so it means huge work by itself before even getting to the env extension. Why not just expose some HDR image formats as separate extensions, like the DDS extension? The env extension would then not have to deal with this issue, which is really not part of it, and the engines would decide what to support.

Supporting multiple probes is a more advanced feature, so maybe that is for another extension?

Making KHR_environment dependent on KHR_lights sounds cool, but what exactly would it depend on? It doesn't care about the other lighting. IMO, the environment is the most basic form of lighting, so some implementations may decide not to support KHR_lights.

I don't think letting the implementations do the convolution will create much discrepancy, since they will all need to be fast and will probably do some simple blur :) Maybe the question is how bad such runtime convolution would be quality-wise. Imagine also if you have multiple probes; that is a stress that even game engines don't need to handle, since it's always precalculated. However, if the convolution is offline, this will require a format that also supports mipmaps to handle different roughness values. It also means that the engine's BRDF model may be different than yours. Maybe if the engine supports mipmaps and they are provided, it should just use them, at your own responsibility; otherwise it does its own thing?
This actually has nothing to do with the mipmap generation. It's an issue with how mipmaps are sampled. The hardware chooses a mip level based on the screen-space derivatives of the UVs (how they're changing across the polygon). If the UVs are discontinuous (i.e. jump directly from 0 to 1), as in this case, the derivative becomes very large at the seam, so the blurriest mip level is selected. Hence the artifact.
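A small illustration of that mechanism, assuming the usual GL-style LOD selection from screen-space derivatives (this is not code from any engine mentioned here):

```ts
// Sketch of how hardware picks a mip level from screen-space UV derivatives,
// illustrating why a 0 -> 1 seam in equirectangular UVs selects the smallest
// (blurriest) mip and produces a visible artifact.
function mipLevelFromDerivatives(
  dudx: number, dvdx: number,   // change of UV per pixel in x
  dudy: number, dvdy: number,   // change of UV per pixel in y
  texWidth: number, texHeight: number
): number {
  // Scale derivatives into texel space.
  const dx = Math.hypot(dudx * texWidth, dvdx * texHeight);
  const dy = Math.hypot(dudy * texWidth, dvdy * texHeight);
  // GL-style LOD: log2 of the larger footprint.
  return Math.max(0, Math.log2(Math.max(dx, dy)));
}

// Smooth UVs: about one texel per pixel -> mip 0.
console.log(mipLevelFromDerivatives(1 / 2048, 0, 0, 1 / 1024, 2048, 1024)); // ~0
// At the wrap seam the interpolated U jumps from ~1 back to ~0 within one pixel,
// so the derivative spans nearly the whole texture -> the blurriest mip.
console.log(mipLevelFromDerivatives(0.999, 0, 0, 1 / 1024, 2048, 1024));    // ~11
```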
As the discussion regarding IBL and how to define it pops up again, I would like this extension to be discussed and reviewed. In addition, we need a parameter for the default orientation of "front" and/or the center of the panorama image.
I still have some reservations about creating an IBL extension, and shipping one with only panorama .hdr images feels like a particularly short-term workaround to me. @UX3D-nopper could you say more about why you would like to revisit the extension?
From 3D Commerce there is the demand to ship the IBL with the glTF.
I don't understand this requirement... Perhaps we can discuss more soon.
Unity stores reflection probes as cubemaps. three.js and Babylon support equirectangular IBL, but have to convert it to cubemaps at load time before using it, to my understanding. The projection produces artifacts at the poles, so we tend to find cubemaps preferable.
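For context, a sketch of the load-time conversion described above using three.js. The API names reflect recent three.js releases, and the import path may differ between versions (newer builds use "three/addons/..."):

```ts
// Load an equirectangular .hdr panorama and prefilter it into a cubemap-based
// environment that PBR materials can use for IBL.
import * as THREE from 'three';
import { RGBELoader } from 'three/examples/jsm/loaders/RGBELoader.js';

function applyEquirectEnvironment(
  renderer: THREE.WebGLRenderer,
  scene: THREE.Scene,
  url: string
): void {
  const pmrem = new THREE.PMREMGenerator(renderer); // converts and prefilters at load time
  new RGBELoader().load(url, (equirect) => {
    const envMap = pmrem.fromEquirectangular(equirect).texture;
    scene.environment = envMap;                     // used for IBL by PBR materials
    equirect.dispose();
    pmrem.dispose();
  });
}
```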
The IBL storage, transmission, and usage comprise several key questions that haven't been thoroughly investigated for glTF yet. We cannot make a KHR extension otherwise. The current state of existing DCC tools shouldn't be a deciding factor here. Note that most of the following questions do not depend on each other.

Shape

Orientation
glTF defines fixed XYZ directions, and we even rejected an extension that was supposed to remap global axes. For the same reasons, standardized IBLs should have a fixed orientation.

Values Interpretation
The values coming from the IBL should have a well-defined physical meaning. This implies their range and possible runtime adjustments (bias / multiplier).

Bitstream Format
Regardless of prefiltering, there are multiple storage options. The final choice should take into account data transmission, runtime processing, and VRAM costs.
Also, storing non-pre-filtered IBLs, such as HDR / raw RGBE, directly in the glTF goes a little against the spirit of delivering the data in a ready-to-render form, as pre-filtered data would be. For example, BabylonJS developed their own pre-filtered environment format. This could be KTX2 if it can include both diffuse and specular pre-filtered environments. Otherwise, I think we could use an empty glTF file with a KHR extension as a container for such an environment. It might even warrant a new file extension, to indicate that it contains only an environment with no model and is intended to be loaded alongside some other glTF model file. Of course, you could still bundle such an environment along with a model in a single glTF file.
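A sketch of the "environment-only glTF container" idea; the structure is illustrative only:

```ts
// Hypothetical: a valid glTF file that carries nothing but an environment,
// intended to be loaded alongside a separate model file.
const environmentOnlyGltf = {
  asset: { version: "2.0" },
  extensionsUsed: ["KHR_environments"],
  extensions: {
    KHR_environments: { /* environment definition as in the proposal at the bottom of this thread */ }
  }
  // no scenes, nodes, or meshes: the file carries only the environment
};
```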
I agree with @lexaknyazev but disagree with @emackey. Every renderer I know (babylon, filament, three, and the sample renderer) uses a different IBL prefiltering format, which also implies different shaders to interpret it. They have different pros and cons, different artifacts, different upfront and per-frame costs. There is not at all a clear "right way", and they are in no way interoperable, yet they all do a pretty good job with PBR. Also, as I've demonstrated with three.js, it is even possible to get good results now with just-in-time prefiltering, which removes the need for transmitting prefiltered IBLs at all (since I can prefilter them in less time than a texture upload takes).
That's a fair observation that different engines need different pre-filtering. I still think this type of extension is to be handled with extreme caution. Typical glTF models are intended to integrate into a variety of lighting environments, unless the model contains a complete scene description including the environment (which has not been a common use case so far, to my knowledge). I've seen issues on GitHub (and I'm not naming names) where developers wanted to prevent users from selecting their own lighting environments, preferring to wait for Khronos to ship the IBL along with the model. This is not the expected default case. A typical glTF file contains a single object or a couple of objects that are to be placed into a lighting environment of the client's choosing.
I don't know about others, and I'm sure I'm out of the mainstream, but I'm using glTF as an exchange format between my front end (three.js) and my back end (Blender). Right now I have my own scene file format that includes the environment (typically .hdr or .exr, equirectangular) plus the glTF scene object with pretty much everything else, because I can't represent the environment in glTF. I'd love to have it all in glTF.
Please continue discussion here: #1850 |
For PBR and image-based lighting, an environment map is needed. The following JSON snippet defines environments by example:
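(The following is an illustrative sketch of such a snippet, written as a TypeScript object literal; the property names are hypothetical rather than a committed schema.)

```ts
// Hypothetical example of the proposed extension: a list of environments,
// each with a "type" describing how the texture is encoded.
const extensionsExample = {
  extensions: {
    KHR_environments: {
      environments: [
        {
          name: "Studio",
          type: "equirectangular",   // or e.g. "cube", "mirrorBall"
          texture: 0,                // index of the (HDR) environment texture
          rotation: [0.0, 0.0, 0.0, 1.0]
        }
      ]
    }
  }
};
```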
"type" defines, the format of the environment texture.
For non-PBR materials, the texture can be used just as the environment. For PBR, the texture needs to be sampled and pore-filtered.
It needs to be discussed, if cube maps as textures should be supported. Furthermore, a standard HDR image format has to be selected.
Finally, find out a possibility, to provide the pre-sampled/-filtered images as well.