
Added touched property to GamepadButton #26

Merged: 2 commits merged into w3c:gh-pages on Jul 8, 2016
Conversation

@toji (Member) commented Jun 15, 2016

This is my proposal for adding explicit touchpad support to the Gamepad spec. It seems like creating a pull request here may be a more productive way to foster conversation around the feature than the mailing list, which was relatively quiet on the subject.

It seems to me that effective touchpad support should only require this single new attribute. Touchpad position can be accurately represented by the existing axes array, but developers will need to be able to disambiguate between [0, 0] meaning a centered touch and no touch at all. That could be handled via the existing pressed value on a button, but it's awkward both in name and in use. Many touchpads are clickable, meaning that the single control may have both a touched and a pressed state. Having a touched and a pressed state on a single button allows us to track both states while logically associating them with a single physical control. Furthermore, some touchpads are pressure sensitive, but it's hard to imagine that such a control would also have separate tracking for click pressure, so the single existing value attribute should cover both touch pressure and button pressed amount.
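For illustration, here's a minimal sketch of how a developer might consume this; the axis and button indices are hypothetical and would vary by controller mapping:

```js
// Minimal sketch; assumes a hypothetical mapping where the touchpad is
// exposed as axes[0]/axes[1] and as buttons[2].
function readTouchpad(gamepad) {
  const pad = gamepad.buttons[2];
  if (!pad.touched) {
    return null; // No finger on the pad, so axes[0]/axes[1] carry no meaning.
  }
  return {
    x: gamepad.axes[0],   // Touch position, -1..1.
    y: gamepad.axes[1],
    clicked: pad.pressed, // Physical click of the pad.
    pressure: pad.value   // Touch/click pressure, where the hardware supports it.
  };
}
```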

There is also some upcoming hardware (Oculus Touch) for which many of the physical buttons will also have capacitive touch tracking to determine the position of the fingers. Adding a touched property to the buttons serves that case well.

There still exists some ambiguity over which button is associated with which axes, but that's a pre-existing problem and this addition doesn't make it any worse. This also ignores multi-touch surfaces (The PS4 controller has one) but considering the large amount of effort that has gone into supporting multi-touch for mobile devices it seems like that's not something we want to duplicate into the gamepad API.

@cvan (Contributor) commented Jun 15, 2016

There still exists some ambiguity over which button is associated with which axes, but that's a pre-existing problem and this addition doesn't make it any worse. This also ignores multi-touch surfaces (The PS4 controller has one) but considering the large amount of effort that has gone into supporting multi-touch for mobile devices it seems like that's not something we want to duplicate into the gamepad API.

Let me preface this by saying I'm extremely pleased with the tremendous work you've done and progress you've made with your Chromium builds. And I'm glad we have browser builds today to prototype HTC Vive motion controller support with WebVR.

That aside, perhaps it's premature to introduce GamepadButton#touched without consideration of multitouch.

(And, apologies for the radio silence on the mailing list thread; I had a lengthy draft I never got around to sending. So thanks for bringing it up again.)

We want to know the pressure in addition to the location and distance moved of the X and Y axes. As @patrickhlauke brought up in the mailing list thread, Pointer Events are likely a good candidate for representing the touch data from controllers.

IMO, instead of overloading GamepadButton, perhaps we should introduce a GamepadAxis interface (per my comment here).

But this raises the question: should we continue designing an API for accessing gamepad data on the Web that diverges from native controller APIs (at least the ones I'm aware of)?

In particular, the SDL2 GameController API already supports multitouch events, haptic feedback (with support for controllers with multiple haptic motors; this goes beyond the Gamepad#vibrate(pattern) in the current Chromium WebVR builds), and controller mappings. These are all items missing from the Gamepad API spec but supported well by SDL2 APIs.

There's been a Bugzilla bug on file for this for some time now. And I filed an issue about SDL2 (#24), wherein we can continue the discussion. And many of the associated issues about the current API's limitations are already filed as well: #19, #24, #7, #6, #4.

Anyway, I wanted to bring this up before we introduce too many new properties or overload the existing interfaces. Thanks!

@toji (Member, Author) commented Jun 15, 2016

Thanks for the comments @cvan! A couple of things I wanted to point out in response:

We want to know the pressure in addition to the location and distance moved of the X and Y axes.

I feel like the button value handles this appropriately right now. I can't think of any situation in which you'd have a separate touch pressure and pressed value on the same input.

Pointer Events are likely a good candidate for representing the touch data from controllers.

I agree! Especially for multitouch inputs, it feels like we'd need a much deeper level of API support than we likely want to add to the Gamepad API. I'm hesitant to say that all touch inputs need to use a separate API like that, though. It doesn't feel appropriate to route the Vive's touchpad through PointerEvents, for example. All we really need in that case is a way to disambiguate the [0, 0] case and the rest works as-is.

IMO, instead of overloading GamepadButton, perhaps we should introduce a GamepadAxis interface (per my comment here).

I assume that would be used in place of the floats in the axes array? If so, I'm really reluctant to do that, since it would be a significant breaking change. Adding touched to the button doesn't require any existing code to change and can safely be ignored by devs who don't care about the property. I'll also point out that while the Steam/Vive controllers have a convention of associating vibration with axes (because they actually vibrate the touchpads), that doesn't apply to all devices. I don't think it's a good idea to associate axis data with vibration.

I'll also point out that I haven't submitted a pull request for any haptics support, because it's the part of the experimental builds that I'm least happy with. After toying with my original API I definitely think we can do better than a simple pulse pattern for the entire gamepad, and would love to discuss that side of things further! Probably a better conversation for an issue like #19 though.

@luser (Contributor) commented Jun 21, 2016

I don't think I know enough about the Vive/Steam controller hardware to understand what's going on here. Can you give me a high-level summary of what they do? I'm a bit loath to overload bits of the existing API to support very different controllers. I'd also like to make sure we're not adding things to the API that are too specific to a single controller.

@luser (Contributor) commented Jun 21, 2016

Specifically, it feels like trying to shoehorn a touchpad into the existing buttons/axes is just not a great model. Building something off of Pointer Events seems like it would be much better, since that spec has spent a lot of time describing useful semantics.

@patrickhlauke (Member) commented

is it naive of me to suggest that a gamepad with built-in touchpad could/should be effectively treated as two separate inputs? on the one hand, the strictly gamepad side (buttons, sticks with axes, etc) which can be queried using the gamepad API, and on the other a touchpad, which is handled using the pointer events API?

i think the one pain point here would be if there are multiple gamepads, as there's currently no concept of assigning anything other than a unique ID for each pointer - assuming the touchpad is multitouch, you could end up with lots of pointers, all with their own unique ID, but no overarching "this belongs to one hand/single input/controller device" (and even if there were, marrying up any unique identifier for a gamepad to the unique identifier for the pointers would probably present further challenges).

@patrickhlauke (Member) commented

having said all that, if it's the case of a touchpad which is essentially used exactly like a stick (so you are really only getting X and Y), i could see the argument of treating it exactly like a stick ("how" the actual hardware detects those X/Y is transparent to the developer, what would count is what it "acts" like)

@toji (Member, Author) commented Jun 21, 2016

I completely agree that we don't want to do something that only works for a single set of controllers! I'm also wary of tying gamepads too closely to a spec as large as Pointer Events, which has thus far been developed with no regard to this use case. It was my hope that this would be a good middle ground, but I'm very happy to have a constructive conversation about alternatives!

So to start off, I should point out a couple of the controllers that I'm aware of which could potentially take advantage of this feature.

  • Steam Controller: Two touchpads, each only registers a single touch point and can register clicks like a button.
  • Vive Controllers: One touchpad per controller (you typically hold two). Each registers a single touch point and can register clicks like a button.
  • Oculus Touch: No touchpad, but has several capacitive controls to detect finger positions. (Joystick and two triggers can all detect when there is a finger on them.) Joystick can be clicked as well, like joysticks on XBox/PS controllers. Triggers are analog like XBox/PS triggers.
  • Daydream controller: The controller for Google's VR system has a touchpad that registers a single touch point and registers clicks like a button.
  • PS4 Controller: Has a touchpad that can register two touch points and registers clicks like a button.
  • Shield Gamepad: Has a touchpad, though I can't find out much about it. Appears to support one touch point. Clickable like a button.
  • WiiU Gamepad: Touch sensitive screen, only registers a single touch point. Not clickable. (I really doubt this device is ever going to be exposed to the browser, though.)

There's also a sprinkling of long-tail devices that have touchpads on them. It appears that very few, if any, support multi-touch.

So from my point of view it seems like most gamepads that have touch support use it for a single touchpad that functions more or less like a joystick and is clickable. That general case already maps pretty well to the existing gamepad spec, with the exception of not having a clear notion of "touched". That's what prompted the idea of adding this single property to the button, as it felt unwieldy to link to a separate and fairly complex API to gain one additional bit of input, especially considering that the Pointer Events spec is event-based, not polling-based, and there's no clear way to associate a given pointer with a given gamepad.
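As a rough sketch of what that polling model looks like with just the extra touched bit (the indices are again hypothetical, and handleTouch is a made-up application callback):

```js
// Rough polling sketch; buttons[2]/axes[0]/axes[1] are hypothetical indices
// for a single-touch, clickable touchpad.
function pollTouchpads() {
  for (const gamepad of navigator.getGamepads()) {
    if (!gamepad) continue;
    const pad = gamepad.buttons[2];
    if (pad && pad.touched) {
      // handleTouch is a hypothetical application callback.
      handleTouch(gamepad.index, gamepad.axes[0], gamepad.axes[1], pad.pressed);
    }
  }
  requestAnimationFrame(pollTouchpads);
}
requestAnimationFrame(pollTouchpads);
```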

I agree that for devices with larger multitouch surfaces (especially something like the WiiU gamepad) pointer events seem more natural, but that seems to be the minority of gamepad uses.

@luser (Contributor) commented Jun 21, 2016

Okay, so the Steam controller API seems to just expose it as a pair of axes:
https://github.com/ValveSoftware/source-sdk-2013/blob/0d8dceea4310fde5706b3ce1c70609d72a38efdf/mp/src/public/steam/isteamcontroller.h#L63

The Vive is used via the OpenVR SDK, I guess? That seems to treat everything as axes as well:
https://github.com/ValveSoftware/openvr/wiki/IVRSystem::GetControllerState (seems like an awkward API, TBH)

The Oculus Touch hardware seems to map the best to what you're proposing here, since it literally has bit flags for "user is touching this button" for each of the buttons:
https://developer.oculus.com/documentation/pcsdk/latest/concepts/dg-input-touch-touch/

There are a few other things there that I don't know how you'd map to the Gamepad API, like ovrTouch_RIndexPointing.

The Daydream Controller also seems to map the touch point to a pair of axes, and also exposes isTouching, which is true if the user is touching the touchpad:
https://developers.google.com/vr/android/reference/com/google/vr/sdk/controller/Controller

For the DualShock 4, I'm not sure how you'd map more than one touch to the Gamepad API.

Kinda hard to find any info on the Shield Gamepad.

I think the Wii U Gamepad is pretty far out of scope. :)

I guess the only real qualm I have here is that for everything but the Oculus Touch, the 'touched' property corresponds to the touchpad, and you're just sort of exposing it as an additional button, which feels weird. They do all appear to be clickable like analog sticks, so maybe that's not the worst thing. The Gamepad API doesn't have any concept of mapping a pair of axes and a button together into a single unit, which maybe it should since every dual-analog gamepad functionally has two sticks that are each composed of a pair of axes and a button for clicking the stick, but you're not really making things worse in that regard.

Leaving aside that I don't actually know what you use the touched indicator for in VR games, I think this presents a little bit of a wart on the API, in that for non-VR controllers that don't have a touched indicator, the way you've proposed the spec text, this is always going to be false even if you're pressing the button in question, which seems odd. Do you think it'd be sensible to instead say something like "true if the button is capable of detecting touch and is currently touched, or true if the button is pressed regardless of whether it is capable of detecting touch"? Sort of similar to how the spec states that for digital buttons, the value should be set to 0.0 or 1.0 for unpressed or pressed.

@patrickhlauke (Member) commented

with the exception of not having a clear notion of "touched". That's what prompted the idea of adding this single property to the button

what about active, borrowing very loosely from the idea of active pointers?

@toji (Member, Author) commented Jun 21, 2016

There are a few other things there that I don't know how you'd map to the Gamepad API, like ovrTouch_RIndexPointing

I believe that those states are actually just convenience booleans to indicate common hand gestures based on the touched states of the controls. Thumb and middle finger down + index up == IndexPointing. That could be easily handled at the JS level and is (IMO) far too specific of a concept to try to expose through this API.
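For example, something roughly along these lines could reconstruct that gesture in script (the button indices here are hypothetical placeholders for the trigger, grip, and thumb controls):

```js
// Hypothetical mapping: buttons[1] = index trigger, buttons[2] = grip (middle finger),
// buttons[3] = thumb rest. "Index pointing" is roughly: grip and thumb engaged,
// index trigger not touched.
function isIndexPointing(gamepad) {
  const trigger = gamepad.buttons[1];
  const grip = gamepad.buttons[2];
  const thumb = gamepad.buttons[3];
  return grip.pressed && thumb.touched && !trigger.touched;
}
```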

I guess the only real qualm I have here is that for everything but the Oculus Touch, the 'touched' property corresponds to the touchpad, and you're just sort of exposing it as an additional button, which feels weird. They do all appear to be clickable like analog sticks, so maybe that's not the worst thing. The Gamepad API doesn't have any concept of mapping a pair of axes and a button together into a single unit...

You hit on most of my reasoning for associating it with the button instead of the axis, actually. In pretty much all cases that I can see today the touchable controls also function as a button, and the button is a single element for the control, whereas there are often multiple axes for the single control. I'm also extremely loath to change the axes array right now, as it would be backwards-compatibility breaking in a big way. (We already went through that once with buttons.) Admittedly it's not the most intuitive layout, but it maps well to many of the underlying APIs and seems to work well for the reality of how these devices are designed today. And, crucially, existing code can simply ignore its existence.

Do you think it'd be sensible to instead say something like "true if the button is capable of detecting touch and is currently touched, true if the button is pressed regardless of whether it is capable of detecting touch"? Sort of similar to how the spec states that for digital buttons, the value should be set to 0.0 or 1.0 for depressed or pressed.

That sounds far more sensible than my proposal, actually! I can't think of any situation where you would have a control that is pressed but not touched. :)

what about active, borrowing very loosely from the idea of active pointers?

I'm not completely opposed to it, but lacking the context provided by the pointer API I feel like it would be easily confused with the pressed state.

@toji (Member, Author) commented Jul 7, 2016

Ping! I would like to keep the conversation around this feature moving. Are there still significant objections to it or changes that should be made? And if we're not going to go this route to handle touchpads then can we decide on the mechanism that will be used?

@cvan (Contributor) commented Jul 8, 2016

/cc @kearwood

@luser (Contributor) commented Jul 8, 2016

I don't know about touchpads in general--I think your proposal to use a pair of axes for single-touch touchpads is probably okay enough, and maybe if it proves to be insufficient we could spec something more complicated based on PointerEvents in the future.

I think the touched property is reasonable, modulo my additional proposal in the previous comment. Without entirely wrangling it into spec-language, something like:

  • If the button is capable of detecting touch, touched should be true if the button is currently touched and false if the button is not currently touched.
  • If the button is not capable of detecting touch but has an analog value associated with it (e.g. it is a trigger), touched should be true if the analog value is greater than zero, and false if the analog value is zero.
  • If the button is not capable of detecting touch and is a simple digital button, touched should mirror the value of pressed for the button.

should work and provide useful values for all controllers. Does that sound sensible?
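For what it's worth, here's a rough sketch of that fallback logic as a UA might compute it (not spec text; the parameter names are just stand-ins for whatever the underlying platform reports):

```js
// Rough sketch of the proposed fallback behavior, not spec text.
// hasTouchSensor / rawTouch / isDigital stand in for platform-reported data.
function computeTouched(hasTouchSensor, rawTouch, isDigital, value, pressed) {
  if (hasTouchSensor) return rawTouch; // Real touch data from the hardware.
  if (!isDigital) return value > 0;    // Analog control (e.g. a trigger).
  return pressed;                      // Plain digital button.
}
```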

@toji (Member, Author) commented Jul 8, 2016

Sounds very reasonable to me, and in fact I've already updated my Chrome proof-of-concept to match that behavior.

@toji (Member, Author) commented Jul 8, 2016

Updated the pull request to include language like what @luser suggested. Thanks!

@luser merged commit 29f9854 into w3c:gh-pages on Jul 8, 2016
@luser (Contributor) commented Jul 8, 2016

Thanks for doing the work here! One other thing that might be nice is to write a W3C test for the new property.
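Even just a basic existence check would be a start; sketching roughly, and assuming the usual web-platform-tests testharness.js setup:

```js
// Sketch of a possible web-platform-tests check; assumes testharness.js is loaded.
test(() => {
  assert_true('touched' in GamepadButton.prototype,
              'GamepadButton should expose a touched attribute');
}, 'GamepadButton.touched is exposed');
```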

@cvan (Contributor) commented Jul 8, 2016

@toji do you want to open an issue about supporting multitouch or at least touch events (using Pointer Events, for example)?

@toji (Member, Author) commented Jul 9, 2016

Thanks for the review and merge! I'm happy to work on a test for this behavior!

@cvan: #27
