
Add support for on-the-fly automatic 3D rendering quality settings adjustment #2183

Open

reduz opened this issue Jan 25, 2021 · 22 comments

reduz (Member) commented Jan 25, 2021

Describe the project you are working on

Godot

Describe the problem or limitation you are having in your project

Godot has always had many good settings for trading quality against performance, and Godot 4.0 improves them further.

Still, for users making relatively simple games, distributing them and expecting them to run well everywhere can be a bit challenging.

Describe the feature / enhancement and how it helps to overcome the problem or limitation

The idea is to add a system that makes it easy to toggle quality settings for 3D rendering on the fly, so the feature can be enabled with minimal effort (while still allowing a great deal of flexibility if you want to control it yourself).

Describe how your proposal will work, with code, pseudo-code, mock-ups, and/or diagrams

How it works (low level)

Internally, a new resource type will be created: RenderingQualityLevels. A node RenderingQualityAdjuster will take this resource and apply it to a selected viewport (or the root viewport).

RenderingQualityLevels will contain the following base properties:

  • level_count: the number of quality levels
  • lower_threshold (milliseconds): if the time it takes to render a frame goes below this threshold, the quality level is incremented by one
  • upper_threshold (milliseconds): if the time it takes to render a frame goes above this threshold, the quality level is decremented by one

Then a list of settings is laid out, such as:

  • anti-aliasing
  • debanding
  • GI quality
  • SSAO quality
  • shadow quality
  • bokeh quality
  • switching to a lower viewport resolution
  • etc.

Each setting also needs to expose the list of levels, so you can customize its value at each level.

The low-level mode of operation is basically:

Add a RenderingQualityAdjuster node to the scene and set which viewport it will work on (a specific viewport, several viewports, or just the root viewport).

Set the RenderingQualityLevels resource on it; the adjuster stays inactive in the editor and automatically takes effect when the game runs.
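To make the workflow concrete, here is a minimal sketch of what this could look like from a script. None of this API exists yet; every class, property, and method name below comes from this proposal or is an illustrative placeholder.

```gdscript
extends Node3D

func _ready() -> void:
    # Hypothetical API from the proposal; nothing here is implemented yet.
    var levels := RenderingQualityLevels.new()
    levels.level_count = 3
    levels.lower_threshold = 10.0  # ms: below this, quality goes up one level.
    levels.upper_threshold = 16.0  # ms: above this, quality goes down one level.

    var adjuster := RenderingQualityAdjuster.new()
    adjuster.quality_levels = levels
    # Placeholder property: target a specific viewport, or leave it unset
    # to act on the root viewport.
    adjuster.target_viewport = get_viewport().get_path()
    add_child(adjuster)
```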

How it works (exposed to the user)

Most users will likely not use the above, because it takes some knowledge to understand which quality settings to enable and disable, so the following simplified interface will be provided:

  • A project setting, "automatic quality adjust": toggle it on and it creates a node at the root level that adjusts the root viewport, all automatically (see the sketch after this list).
  • Optionally, a path to a RenderingQualityLevels resource can be provided, which will be used instead of the default.
  • A RenderingQualityLevelsDefault resource will be provided, which comes preconfigured and should more or less work for most projects.
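As a rough illustration of the simplified interface, a script could query the toggle like this. Both setting names are invented placeholders; the proposal does not specify them.

```gdscript
func _ready() -> void:
    # Placeholder setting names; the actual names are not specified in the proposal.
    if ProjectSettings.get_setting("rendering/quality/automatic_adjust", false):
        var levels_path: String = ProjectSettings.get_setting(
                "rendering/quality/levels_resource", "")
        # The engine would create the adjuster node on the root viewport
        # automatically, using levels_path if set, else the default resource.
        print("Automatic quality adjust enabled, levels resource: ", levels_path)
```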

If this enhancement will not be used often, can it be worked around with a few lines of script?

It could be done with a script, but the idea is to provide something out of the box that users can use and trust.

Is there a reason why this should be core and not an add-on in the asset library?

Because most users working on simple 3D games will want this option, so it makes sense to have it in core.

@reduz reduz changed the title Automatic 3D quality settings. Support for on-the-fly automatic 3D rendering quality settings adjustment. Jan 25, 2021
theraot commented Jan 25, 2021

If something like this is part of the core, I'd be interested in the possibility of it picking a sensible default based on the specs of the machine before initializing any 3D graphics (so, during the splash screen).

Calinou (Member) commented Jan 25, 2021

See also #917.

I'd be interested in the possibility of it picking a sensible default based on the specs of the machine before initializing any 3D graphics (so, during the splash screen).

The goal is to change settings during gameplay. This would also allow using higher quality settings in less demanding scenes automatically.

Ansraer commented Jan 25, 2021

Kind of reminds me of what Nintendo Switch games are doing in order to make the best use of the limited hardware...
Sounds like a good idea, but I am a bit worried that devs will forget about enabling it and then struggle to figure out why their quality settings aren't used the way they want them to be used. A warning displayed next to the render settings when this feature is enabled would be necessary imo.
Also, what happens when a user who would rather play with better graphics but a lower framerate disables this node (in an in-game options menu) after it has already been loaded by default? Will the changed settings stay the way they are, or will they revert to whatever they were before they were adjusted?

I am also a bit concerned about this part: "and it creates a node at the root level that changes the root viewport". I am guessing that the created node is a singleton, something users should be made aware of. Would it be possible to also add a non-interactable entry to the AutoLoad settings page indicating that this singleton exists? Having code sneakily add stuff to my scene tree sounds like a debugger's worst nightmare.

This feels like a very useful addition, but could potentially involve a lot of incomprehensible "black magic" happening behind the curtains and break a couple of assumptions users make about how their own game works. Making it obvious when this feature is enabled will be essential to avoid a lot of confusion and frustration.

@darksylinc

Hi, I'm passing by:

A couple of thoughts:

  • Different techniques have different costs on different GPUs. For example, pre-GCN AMD cards (old, and they don't support Vulkan, but for the sake of example) have more trouble with SSAO because they're bad at hiding memory latency. Integrated GPUs have bandwidth problems, so removing bandwidth-intensive algorithms has a greater impact on them.
  • Artistically an author may give higher priority to certain algorithms (or even force them to stay at a minimum)
  • Don't forget about VRS (Variable Rate Shading) which allows for immediate selective (or global) quality degradation that looks better than simply downgrading the viewport resolution
  • Auto-quality needs damping, like any noisy signal. Otherwise it can introduce jitter, stutter, or FPS spikes as the quality gets downgraded and upgraded too quickly. While the average FPS may look fine, the gameplay will feel horrendous because of the inconsistent frame timings. Conversely, very aggressive damping means quality may stay degraded for too long (or permanently) when it could easily be improved, because whatever event caused the framerate to drop is gone now. It takes a lot of testing and good damping formulas until it "feels right" (a minimal damping sketch follows this list).
  • External factors such as power saving, background PSO optimization (by the driver), and other processes (e.g. antivirus) can cause weird inconsistent readings. Performance may be too slow because the driver decided to use an unoptimized PSO until the optimized PSO is ready; thus Godot will think the feature is too slow. Likewise, if rendering becomes too lightweight, the GPU may go into a lower power mode; causing frame timings to go up despite getting rid of a feature.
  • Resolution Scaling should not affect UI
  • Despite all that's been said, it's not a bad idea. Plenty of games use dynamic quality adjustment (particularly focused on resolution scaling); I'm just highlighting common problems.
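To illustrate the damping point above, here is a minimal sketch (with made-up thresholds and smoothing factor) of filtering the frame-time signal with an exponential moving average plus a cooldown before any quality switch:

```gdscript
extends Node

const SMOOTHING := 0.05           # Lower values = heavier damping.
const UPPER_THRESHOLD_MS := 20.0  # Example thresholds only; tune per project.
const LOWER_THRESHOLD_MS := 10.0
const COOLDOWN_SEC := 2.0         # Minimum time between quality switches.

var smoothed_ms := 16.0
var cooldown := 0.0

func _process(delta: float) -> void:
    # NOTE: delta includes the V-Sync wait (see the discussion further down),
    # so a real implementation should use a proper render-time metric instead.
    var frame_ms := delta * 1000.0
    # The exponential moving average damps out single-frame spikes.
    smoothed_ms += (frame_ms - smoothed_ms) * SMOOTHING
    cooldown = maxf(cooldown - delta, 0.0)
    if cooldown > 0.0:
        return
    if smoothed_ms > UPPER_THRESHOLD_MS:
        _change_quality(-1)
        cooldown = COOLDOWN_SEC
    elif smoothed_ms < LOWER_THRESHOLD_MS:
        _change_quality(+1)
        cooldown = COOLDOWN_SEC

func _change_quality(direction: int) -> void:
    # Hypothetical hook: apply the project's own quality settings here.
    pass
```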

Calinou (Member) commented Jan 25, 2021

Artistically an author may give higher priority to certain algorithms (or even force them to stay at a minimum)

This is why manual presets will also be configurable. If you set all graphics presets to never disable a feature, then the feature will never be disabled 🙂

Don't forget about VRS (Variable Rate Shading) which allows for immediate selective (or global) quality degradation that looks better than simply downgrading the viewport resolution

VRS is planned to be implemented by @BastiaanOlij as part of his XR work, but this is the topic of a separate proposal.

Resolution Scaling should not affect UI

This can already be done manually, but we will probably have to decouple the UI viewport from the 3D viewport by default to handle this more nicely. (This is also required for #917.)

@BastiaanOlij

Very much looking forward to sinking my teeth into VRS. It won't be right off the bat, but it's so important for getting performance up in VR, where it makes a marked difference by reducing render quality in your peripheral vision, where most fragments are wasted anyway.

phelioz commented Jan 26, 2021

Sounds great!
This is probably already planned, but I would really like to be able to adjust the RenderingQualityAdjuster manually as well: instead of auto-switching quality levels, it could also be controlled directly.
Then it could easily be used for low/medium/high graphics settings in games where auto-switching the quality levels is not desired.
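For example, a manual mode on the proposed node could look roughly like this (purely hypothetical, since neither the node nor these properties exist yet):

```gdscript
@onready var adjuster := $RenderingQualityAdjuster  # Hypothetical node.

func apply_graphics_preset(preset: String) -> void:
    adjuster.automatic = false  # Placeholder: disable on-the-fly switching.
    match preset:
        "low":
            adjuster.current_level = 0
        "medium":
            adjuster.current_level = 1
        "high":
            adjuster.current_level = 2
```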

@Calinou Calinou changed the title Support for on-the-fly automatic 3D rendering quality settings adjustment. Add support for on-the-fly automatic 3D rendering quality settings adjustment Jan 30, 2021
@Calinou Calinou added this to the 4.x milestone Aug 8, 2021
atirut-w commented Oct 6, 2021

This can already be done manually, but we will probably have to decouple the UI viewport from the 3D viewport by default to handle this more nicely. (This is also required for #917.)

Possibly off-topic, but this should be in project settings for the sake of convenience.

Calinou (Member) commented Oct 6, 2021

Possibly off-topic, but this should be in project settings for the sake of convenience.

This is now implemented: godotengine/godot#51870

atirut-w commented Oct 6, 2021

Ideally you could have linked that instead. Or does it affect the UI too?

Calinou (Member) commented Oct 6, 2021

Ideally you could have linked that instead. Or does it affect the UI too?

It doesn't affect UI or anything 2D. However, the way 3D render scaling is currently performed isn't suited for dynamic resolution scaling. This is because render buffers need to be recreated every time the rendering resolution changes. Recreating buffers is a slow operation, so it should be avoided here.

roalyr commented Nov 14, 2021

Good stuff. In fact, it could be integrated into ViewportContainer, because that already exposes things like viewport resolution. Also, the viewport scale factor there shouldn't be an integer.

Calinou (Member) commented Nov 14, 2021

Also, the viewport scale factor there shouldn't be an integer.

This is already implemented by godotengine/godot#52215 in master. However, changing the scale factor will cause buffers to be reallocated, which makes it unsuitable for dynamic resolution scaling as it's too slow.

roalyr commented Nov 14, 2021

Strange. It works decently in my case (viewport scaling by a factor), naturally, if not done every frame.
What about a slightly delayed resolution factor change? Say, every 0.2-0.5 s in +/-0.1 steps?

Calinou (Member) commented Nov 14, 2021

Strange. It works decently in my case (viewport scaling by a factor), naturally, if not done every frame.
What about a slightly delayed resolution factor change? Say, every 0.2-0.5 s in +/-0.1 steps?

The goal of dynamic resolution scaling is to change the scaling factor every frame. This is how resolution changes can be smoothed out over time. If you perform scaling in 0.1 increments, resolution changes become very noticeable during gameplay. The current approach does allow changing the scale factor every frame, but it comes with a hefty performance penalty that negates the benefits of dynamic resolution scaling.
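For reference, a naive per-frame version using the scaling_3d_scale Viewport property from the current master branch might look like the sketch below (the target frame time and step size are arbitrary). As explained above, each change currently reallocates the render buffers, so this is illustrative rather than something to ship today:

```gdscript
extends Node

const TARGET_MS := 16.6  # Arbitrary 60 FPS target.

func _process(delta: float) -> void:
    var frame_ms := delta * 1000.0
    var viewport := get_viewport()
    # Nudge the scale a tiny amount each frame so individual changes
    # stay imperceptible; clamp to a sane range.
    var step := 0.005 * signf(TARGET_MS - frame_ms)
    viewport.scaling_3d_scale = clampf(viewport.scaling_3d_scale + step, 0.5, 1.0)
```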

roalyr commented Nov 15, 2021

By the way, does the debug monitor system have anything related to performance that is affected by the viewport scale? Would that be fragment operations in this case, or something of that sort?

Calinou (Member) commented Nov 15, 2021

By the way, does the debug monitor system have anything related to performance that is affected by the viewport scale? Would that be fragment operations in this case, or something of that sort?

There is a visual profiler in the master branch, but I don't think it goes this far. If you want to do advanced GPU profiling, using something like NVIDIA Nsight is likely the only option.

thmasn commented Nov 11, 2023

One thing required (even when implementing resolution scaling in GDScript) is a measurement of the actual render time.

As far as I can see, there is currently no way to read this duration when V-Sync is enabled. All measurements return a time delta that includes the wait time for V-Sync:

  • Using delta in _process gives the "clamped" V-Sync time of exactly 1/60 of a second.

  • Using Performance.get_monitor(Performance.TIME_PROCESS) varies more, but seems to max out around 1/63 of a second.
    -> Here, an additional monitor like "TIME_PROCESS_WITHOUT_VSYNCH" could be added.

  • There seems to be no beforeRender or afterRender callback.
    -> Additional methods _before_render and _after_render could be added to the Node class.

An implementation of one of these would allow a really simple GDScript solution, letting users scale their game independently without any limitations.

Edit: I created a separate feature request that only asks for a way to measure the frame render time when V-Sync is enabled:
#8399

@atirut-w

As far as I can see, there is currently no way to read this duration when V-Sync is enabled. All measurements return a time delta that includes the wait time for V-Sync:

That could be solved by also storing how long the V-Sync wait took.

thmasn commented Nov 11, 2023

As far as I can see, there is currently no way to read this duration when V-Sync is enabled. All measurements return a time delta that includes the wait time for V-Sync:

That could be solved by also storing how long the V-Sync wait took.

Is there currently a way to read how long the V-Sync wait took, or is this something that would also need to be added?

@atirut-w

It will probably have to be added. I don't remember there being a way to check how long V-Sync takes.

Calinou (Member) commented Nov 12, 2023

It will probably have to be added. I don't remember there being a way to check how long V-Sync takes.

This is already supported, as mentioned in #8399 (comment). (The same metrics are used by dynamic resolution in most other game engines. Note, though, that doing good dynamic resolution scaling on PC is harder than on console for various reasons, such as the timings being generally less reliable on PC.)
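For completeness, here is how those per-viewport render-time metrics can be read from GDScript in Godot 4, if I'm reading the RenderingServer API correctly (the reported CPU/GPU times exclude the V-Sync wait):

```gdscript
extends Node

func _ready() -> void:
    # Enable render-time measurement for the root viewport once.
    RenderingServer.viewport_set_measure_render_time(
            get_viewport().get_viewport_rid(), true)

func _process(_delta: float) -> void:
    var rid := get_viewport().get_viewport_rid()
    # Both values are in milliseconds.
    var cpu_ms := RenderingServer.viewport_get_measured_render_time_cpu(rid)
    var gpu_ms := RenderingServer.viewport_get_measured_render_time_gpu(rid)
    print("CPU: %.2f ms, GPU: %.2f ms" % [cpu_ms, gpu_ms])
```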
