Allow sensitive UI to make use of the visible-blurred state #743
Comments
So a lot has changed since this issue was filed. We do have a more concrete concept of trusted UI. Currently, the spec says:
This would essentially involve allowing visible-blurred in this paragraph, which allows for head pose (and no input) to be read. /agenda Are we okay with applications knowing head poses (and head poses only) while displaying trusted UI?
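A rough sketch of what that would look like from the page's side, assuming only the standard visibilitychange event and visibilityState attribute on XRSession (the input helpers here are hypothetical, not part of the API):

```js
// Sketch: reacting to visibility changes on an immersive XRSession.
// `suspendControllerInput` / `resumeControllerInput` are hypothetical
// app-side helpers, not part of the WebXR API.
session.addEventListener('visibilitychange', () => {
  switch (session.visibilityState) {
    case 'visible':
      // Full-rate poses and input sources are available again.
      resumeControllerInput();
      break;
    case 'visible-blurred':
      // Head poses may still be delivered (possibly throttled), but input
      // sources are withheld. Keep rendering so the scene stays stable
      // behind the trusted UI, but stop acting on user input.
      suspendControllerInput();
      break;
    case 'hidden':
      // No poses are delivered and frame callbacks may not run at all.
      suspendControllerInput();
      break;
  }
});
```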
The concern with applications knowing head poses is that it can be a side channel for text input. So, for example, if we have navigation where a user types in a URL in TIUI, the head pose could theoretically be used as a side channel to recover the URL that was input.
Do you think we can potentially relax this language to allow for TIUI to be shown in the blurred state for things like permission prompts, but perhaps not sensitive typing? Let the UA make its own judgement, for example. Also, my personal experience with typing in VR is that my eyes move, not my head, but it could still involve detectably small perturbations.
I think we should definitely have a note that by allowing poses, sensitive typing could have a side channel, and it's therefore not recommended for things like password input, etc.
Yeah, that would work really well. Unsure if we should also have that warning for URL input.
If your eyes move to look at the keys, should systems that use eye tracking for better immersion turn this off during sensitive UI? |
This issue was first discussed in #724. I recommend reading that for additional context.
It would be a quality-of-life improvement on some platforms (mainly PCs, where power is less of a concern) to allow an immersive session's visibilityState to be set to visible-blurred rather than hidden when sensitive UI is being shown to the user. This would enable the display of such UI to be less jarring and give user agents more flexibility in its presentation. If the application receives unfiltered head poses while such UI is being displayed, however, it may be able to infer private information (such as passwords) from observing the user's head movement while they interact with, for example, a virtual keyboard.

The purpose of this issue is to determine what restrictions must be placed on the data delivered to sessions in the visible-blurred state to enable their use with sensitive UI while still allowing a reasonable user experience (if any can be found). The presumption is that head poses must either be severely throttled or suppressed entirely in order to mitigate that risk. A more detailed breakdown is given in the privacy and security repo:
Of course, such low frequencies would not generally yield an acceptable tracking experience for rendering an immersive environment, thus negating their use. It's possible that the UA could fake it in some cases by, for example, rendering a cube map or a similar set of viewports that are wider than necessary and reprojecting them as needed to provide a tracked view of the session's content without needing to deliver fully accurate poses. The question is really how aggressive the UA needs to be about it.
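As a rough illustration of the throttling idea, a UA-side filter might hold poses back on some interval; the 500 ms value and function name below are invented for the sketch, and any real mitigation would be internal and implementation-specific:

```js
// Illustration only: one way a UA-side filter might degrade head poses for a
// visible-blurred session. The interval and function name are made up for
// this sketch.
const POSE_INTERVAL_MS = 500; // deliver at most ~2 distinct poses per second
let lastDelivery = { time: -Infinity, pose: null };

function filterPoseForBlurredSession(rawPose, nowMs) {
  if (nowMs - lastDelivery.time < POSE_INTERVAL_MS) {
    // Repeat the previous pose so the small head perturbations that could
    // leak keyboard input never reach the page between updates.
    return lastDelivery.pose;
  }
  lastDelivery = { time: nowMs, pose: rawPose };
  return rawPose;
}
```

A filter this coarse would blunt the keystroke side channel, but it is exactly the kind of low-frequency delivery that the reprojection approach described above would need to compensate for.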