Add support for on-the-fly automatic 3D rendering quality settings adjustment #2183
Comments
If something like this is part of the core, I'd be interested in the possibility of it picking a sensible default based on the specs of the machine before initializing any 3D graphics (so, during the splash screen).
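As an illustration of what such spec-based defaults could look like, here is a minimal GDScript sketch assuming Godot 4's RenderingServer API. The preset names and the idea of keying off the adapter name are purely illustrative, and by the time a script runs the renderer is already initialized, so a real implementation would need to hook in earlier:

```gdscript
extends Node

# Hypothetical heuristic: pick a default quality preset from the GPU adapter
# name. The "low"/"high" preset names are illustrative only.
func pick_default_preset() -> String:
    var adapter := RenderingServer.get_video_adapter_name().to_lower()
    # Software rasterizers and most integrated GPUs warrant a low default.
    if adapter.contains("llvmpipe") or adapter.contains("intel"):
        return "low"
    return "high"
```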
See also #917.
The goal is to change settings during gameplay. This would also allow using higher quality settings in less demanding scenes automatically.
Kind of reminds me of what Nintendo Switch games are doing in order to best use the limited hardware... I am also a bit concerned about this part: "and it creates a node at the root level that changes the root viewport". I am guessing that the created node is a singleton, something of which users should be made aware. Would it be possible to also add a non-interactable entry to the AutoLoad settings page indicating that this singleton exists? Having code sneakily add stuff to my scene tree sounds like a debugger's worst nightmare.

This feels like a very useful addition, but it could potentially involve a lot of incomprehensible "black magic" happening behind the curtains and break a couple of assumptions users make about how their own game works. Making it obvious when this feature is enabled will be essential to avoid a lot of confusion and frustration.
Hi, I'm passing by with a couple of thoughts:
This is why manual presets will also be configurable. If you set all graphics presets to never disable a feature, then the feature will never be disabled 🙂
VRS is planned to be implemented by @BastiaanOlij as part of his XR work, but this is the topic of a separate proposal.
This can already be done manually, but we will probably have to decouple the UI viewport from the 3D viewport by default to handle this more nicely. (This is also required for #917.)
Very much looking forward to sinking my teeth into VRS. It won't be right off the bat, but it's so important to get performance up in VR, where this makes a marked difference by reducing render quality in your peripheral vision, where most fragments are wasted anyway.
Sounds great!
Possibly off-topic, but this should be in the project settings for the sake of convenience.
This is now implemented: godotengine/godot#51870
You could have ideally linked that instead, or does it affect UI too?
It doesn't affect UI or anything 2D. However, the way 3D render scaling is currently performed isn't suited for dynamic resolution scaling. This is because render buffers need to be recreated every time the rendering resolution changes. Recreating buffers is a slow operation, so it should be avoided here.
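For context, a minimal sketch of what that 3D-only scaling looks like from a script, assuming the `Viewport.scaling_3d_*` properties added by godotengine/godot#51870 in Godot 4:

```gdscript
extends Node

func _ready() -> void:
    # Render the 3D scene at 70% resolution, then upscale with bilinear
    # filtering. 2D and UI rendering stay at native resolution.
    var viewport := get_viewport()
    viewport.scaling_3d_mode = Viewport.SCALING_3D_MODE_BILINEAR
    viewport.scaling_3d_scale = 0.7
```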
Good stuff. In fact it can be integrated into Viewport Container, because it already features stuff like viewport resolution. Also viewport factor there shouldn't be an integer. |
This is already implemented by godotengine/godot#52215.
Strange. It works decently in my case (viewport scaling by a factor), naturally, if not done every frame.
The goal of dynamic resolution scaling is to change the scaling factor every frame. This is how resolution changes can be smoothed out over time. If you perform scaling in 0.1 increments, resolution changes become very noticeable during gameplay. The current approach does allow changing the scale factor every frame, but it comes with a hefty performance penalty that negates the benefits of dynamic resolution scaling.
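To make the intended behavior concrete, here is a minimal GDScript sketch of per-frame scale adjustment. Note that it uses `Engine.get_frames_per_second()` as a stand-in timing source, which includes the vsync wait discussed later in this thread, so it's only an illustration of the smoothing idea:

```gdscript
extends Node

const TARGET_FRAME_TIME_MS := 1000.0 / 60.0  # Aim for 60 FPS.
const ADJUST_SPEED := 0.05  # Small steps so resolution changes are smoothed out.

func _process(_delta: float) -> void:
    # Stand-in timing source; a real implementation needs the render time
    # excluding the vsync wait (see the discussion below).
    var frame_time_ms := 1000.0 / maxf(Engine.get_frames_per_second(), 1.0)
    var error := (TARGET_FRAME_TIME_MS - frame_time_ms) / TARGET_FRAME_TIME_MS
    var viewport := get_viewport()
    viewport.scaling_3d_scale = clampf(
            viewport.scaling_3d_scale + error * ADJUST_SPEED, 0.25, 1.0)
```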
By the way, does the debug monitor system have anything related to performance that is affected by the viewport scale? Is it fragment ops in that case, or something of that sort?
There is a visual profiler in the
One thing required (even when implementing resolution scaling in GDScript) is a measurement of the actual render time. As far as I can see, there is currently no way to get a read on this duration when vsync is enabled. All measurements return a time delta which includes the wait time for vsync:
An implementation of one of these could allow a really simple GDScript solution, and allow users to scale their game independently without any limitations. Edit: I created a separate feature request which only requests a way to measure frame render time when vsync is enabled:
That could be solved by also storing how long the vsync took. |
Is there currently a way to read how long the vsync took, or is this something that would also need to be added?
It will probably have to be added. I don't remember there being a way to check how long vsync takes. |
This is already supported as mentioned in #8399 (comment). (The same metrics are used by dynamic resolution in most other game engines – though note that doing good dynamic resolution scaling on PC is harder than on console for various reasons, such as the timings being generally less reliable on PC.) |
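If the metrics referenced there are the RenderingServer render-time measurements, a minimal sketch of reading them could look like the following (method availability depends on the Godot 4.x version):

```gdscript
extends Node

func _ready() -> void:
    # Ask the renderer to record CPU and GPU render times for this viewport.
    RenderingServer.viewport_set_measure_render_time(
            get_viewport().get_viewport_rid(), true)

func _process(_delta: float) -> void:
    var rid := get_viewport().get_viewport_rid()
    # Unlike the process delta, these timings exclude the vsync wait.
    var cpu_ms := RenderingServer.viewport_get_measured_render_time_cpu(rid)
    var gpu_ms := RenderingServer.viewport_get_measured_render_time_gpu(rid)
    print("Render time - CPU: %.2f ms, GPU: %.2f ms" % [cpu_ms, gpu_ms])
```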
Describe the project you are working on
Godot
Describe the problem or limitation you are having in your project
Godot has always had many good settings to switch between quality and performance. In Godot 4.0, they are improved further.
Still, for users making relatively simple games, distributing them and expecting them to work everywhere can be a bit challenging.
Describe the feature / enhancement and how it helps to overcome the problem or limitation
The idea is to add a system that makes it easy to toggle quality settings for 3D rendering on the fly, so this feature can be enabled with minimal effort by users (while still allowing a great deal of flexibility in case you want to control it).
Describe how your proposal will work, with code, pseudo-code, mock-ups, and/or diagrams
How it works (low level)
Internally, a new resource type will be created: RenderingQualityLevels. A node RenderingQualityAdjuster will take this resource and apply it to a selected viewport (or the root viewport).
RenderingQualityLevels will contain the following base properties:
Then, a list of settings is laid out, such as:
Each setting also needs to show the list of levels, so you can customize what that setting will be at each level.
The "low level" modus of operation is basically:
Add a RenderingQualityAdjuster node to the scene, and set which viewport it will work on: either a specific viewport (or viewports), or just the root viewport.
Set the RenderingQualityLevels resource on it, and it will automatically take effect when the game runs (it stays inactive while in the editor). See the sketch below.
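A sketch of what that could look like in practice. Note that RenderingQualityAdjuster and RenderingQualityLevels do not exist yet, so the property name and resource path here are hypothetical:

```gdscript
extends Node3D

func _ready() -> void:
    # RenderingQualityAdjuster/RenderingQualityLevels are the proposed (not
    # yet existing) classes; "quality_levels" and the .tres path are made up.
    var adjuster := RenderingQualityAdjuster.new()
    adjuster.quality_levels = preload("res://rendering_quality_levels.tres")
    # Leave the target viewport unset to adjust the root viewport.
    add_child(adjuster)
```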
How it works (exposed to the user)
Most users will most likely not use the above, because it takes some knowledge to understand which quality settings to enable and disable, so the following simplified interface will be present:
If this enhancement will not be used often, can it be worked around with a few lines of script?
It could be done with a script but, to be honest, the idea is to provide something out of the box that users can use and trust.
Is there a reason why this should be core and not an add-on in the asset library?
Most users working on simple 3D games will want to use this option, so it makes sense to have it as core.