This bug report was migrated from our old Bugzilla tracker.
Reported in version: 2.0.9 Reported for operating system, platform: Linux, x86_64
Comments on the original bug report:
On 2018-10-25 06:46:47 +0000, Ellie wrote:
There seems to be an API oversight, unless both the people in chat and I are blind (which is very possible, haha; sorry if I just missed it):
The SDL_FINGERDOWN/UP/... events and the tfinger event structure map to some "touch device", but there is no connection from that to any of the screens - which means in a multiscreen setup, it is impossible to find out what screen the input actually maps to.
Since this is bad and will lead to broken applications in less trivial multiscreen setups, please add a new function that provides one of the following (whatever makes more sense/is easier to implement):
EITHER map a touch device to any of the screens,
OR map any specific finger input event to any of the screens.
On 2018-11-05 08:09:31 +0000, Ellie wrote:
A note on whether this is actually an oversight, or whether the functionality exists and we just didn't find it, would be useful!
On 2018-11-06 22:16:59 +0000, Sam Lantinga wrote:
Unfortunately the same touch API refers to both touchscreens (which are tied to a display) and touchpads (which are not)
In both cases we return normalized coordinates and let the application decide what it means. I don't know of a cross-platform way of knowing that, but if someone contributes a patch, we can expose that in the API.
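To make the ambiguity concrete, here is a minimal sketch of how finger events are consumed today; it assumes the application already knows which window its touches belong to, which is exactly the information this report says cannot currently be determined:

```c
#include <SDL.h>

static void handle_touch_events(SDL_Window *window)
{
    SDL_Event ev;
    while (SDL_PollEvent(&ev)) {
        if (ev.type == SDL_FINGERDOWN || ev.type == SDL_FINGERMOTION) {
            int w, h;
            SDL_GetWindowSize(window, &w, &h);
            /* For a touchscreen the normalized position is window-relative,
               so scaling by the window size yields pixels; for a touchpad
               the same numbers only describe a position on the pad itself,
               with no connection to any screen. */
            float px = ev.tfinger.x * (float)w;
            float py = ev.tfinger.y * (float)h;
            (void)px;
            (void)py;
        }
    }
}
```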
On 2018-11-07 01:15:50 +0000, Ellie wrote:
How about a function that maps the tfinger event struct to another struct with additional info (for API compatibility/retaining the old tfinger struct size) with the following members:
- screen index (-1 if not mapped/known)
- screen pixel pos x (undefined/whatever if screen index is -1)
- screen pixel pos y (undefined/whatever if screen index is -1)
This would both save me from doing the float-to-pixel conversion (because 1. I can trivially normalize myself, and 2. in 99% of cases where I care about the screen, I DON'T WANT normalized coordinates anyway), and also support e.g. mapping a touchpad to the range of multiple screens.
Or am I missing some corner case that wouldn't work with this?
In case the screen can't be detected, it would simply return -1 and I wouldn't be worse off than I am right now (which is basically having no clue which screen it is ever, and just randomly taking screen 0 which may be completely wrong in a multiscreen setup, but at least has a chance of being right as well).
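Purely to illustrate the proposal above, a hypothetical C sketch of what such a lookup could return; neither the struct nor the function exist in SDL, the names are made up for this sketch:

```c
#include <SDL.h>

/* Hypothetical illustration only -- none of these names exist in SDL. */
typedef struct MyTouchScreenInfo {
    int screen_index; /* display index, or -1 if not mapped/known */
    int screen_x;     /* pixel position on that display; undefined if screen_index == -1 */
    int screen_y;
} MyTouchScreenInfo;

MyTouchScreenInfo MyGetTouchScreenInfo(const SDL_TouchFingerEvent *tfinger);
```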
On 2018-11-07 11:42:16 +0000, Alex Szpakowski wrote:
I made a patch a while ago which adds a SDL_TouchDeviceType enum and SDL_GetTouchDeviceType(SDL_TouchID).
It has implementations for Android/iOS/macOS/Windows/wayland/x11, but at the time it was created only iOS and x11 had runtime APIs for determining that (as far as I know), so on the other platforms it uses the device type typically found on that platform (direct/screen touches on Windows, indirect touches on macOS because of laptop trackpads and no touchscreen Macs, etc.). I think macOS now has an API for that, though.
If that sounds reasonable I can update the patch to work with the latest SDL source.
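For context, a short sketch of how the proposed query could be used; the function and enum names follow the patch as described here and are assumptions about its final form:

```c
#include <SDL.h>

/* Assumed names from the patch description: SDL_GetTouchDeviceType and an
   SDL_TouchDeviceType enum distinguishing direct and indirect devices. */
void classify_first_touch_device(void)
{
    SDL_TouchID id = SDL_GetTouchDevice(0);
    switch (SDL_GetTouchDeviceType(id)) {
    case SDL_TOUCH_DEVICE_DIRECT:            /* touchscreen: coords are window-relative */
        break;
    case SDL_TOUCH_DEVICE_INDIRECT_ABSOLUTE: /* e.g. graphics tablet in absolute mode */
        break;
    case SDL_TOUCH_DEVICE_INDIRECT_RELATIVE: /* e.g. laptop trackpad */
        break;
    default:
        break;
    }
}
```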
On 2018-11-07 14:27:04 +0000, Sam Lantinga wrote:
Sure, that would be great. Thanks Alex!
On 2018-11-07 14:27:51 +0000, Sam Lantinga wrote:
Can you also extend the event with integer coordinates and set them when appropriate for the touch device?
Thanks!
On 2018-11-07 15:41:31 +0000, Alex Szpakowski wrote:
Currently the normalized x/y position in touch events is window-relative for direct touches, the absolute position on the touch device for most indirect touches, and a position on the touch device relative to some starting position for certain indirect touches (Apple TV controllers behave that way, I believe).
What would the integer coordinates represent in each case? We can already get the position of the mouse cursor relative to the window with SDL's other APIs, which direct touch presses currently drive. For direct touches I don't think exposing a touch position on the screen (rather than the window) is particularly useful for apps.
On 2018-11-07 15:50:54 +0000, Ellie wrote:
The thing I am looking for is really the screen index. The device type doesn't seem sufficient for determining that in a multiscreen setup.
> For direct touches I don't think exposing a touch position on the screen (rather than the window) is particularly useful for apps.
It is useful because the whole "clipping events to window size" can make strokes over the entire screen really difficult to interpret as a scroll for a specific window.
The mouse API also gave me quite the headache with its "clip mouse events to window unless there is an input lock".
Therefore, I would really opt for screen coordinates & screen index. It's not hard to get the window coordinates from that. If you want, that could be also provided in addition, but IMHO screen-based events are much more universally useful.
On 2018-11-07 15:52:03 +0000, Ellie wrote:
And I think there might be an oversight:
> We can already get the position of the mouse cursor relative to the window with SDL's other APIs,
Is there one for multitouch? I didn't find one; that's exactly why I filed this ticket. Of course for single touch it exists via the fake touch mouse events, but that doesn't help with multitouch.
On 2018-11-07 16:20:50 +0000, Alex Szpakowski wrote:
(In reply to Jonas Thiem from comment # 8)
> The thing I am looking for is really the screen index. The device type doesn't seem sufficient for determining that in a multiscreen setup.
For direct touches (where there is a screen to index), you can get the index from the SDL window at least. It's definitely more roundabout than a single API though.
(In reply to Jonas Thiem from comment # 8)
> > For direct touches I don't think exposing a touch position on the screen (rather than the window) is particularly useful for apps.
> It is useful because the whole "clipping events to window size" can make strokes over the entire screen really difficult to interpret as a scroll for a specific window.
I wonder what the implications are with respect to window focus. I wouldn't want SDL apps to think my touch input should be going to them, when I'm trying to interact with an unrelated thing on my screen.
In any case I think this sort of thing is separate in scope (in terms of implementation as well as implications) from the device type change, so I'll get that in first.
On 2018-11-07 16:22:34 +0000, Alex Szpakowski wrote:
(In reply to Alex Szpakowski from comment # 10)
> I wonder what the implications are with respect to window focus. I wouldn't want SDL apps to think my touch input should be going to them, when I'm trying to interact with an unrelated thing on my screen.
I forgot to add to this: do operating systems deal with this in different ways? Is this something we can leverage OS behaviour/APIs for, rather than implementing something that breaks the conventions of certain operating systems?
On 2018-11-08 06:15:14 +0000, Ellie wrote:
> I wonder what the implications are with respect to window focus. I wouldn't want SDL apps to think my touch input should be going to them, when I'm trying to interact with an unrelated thing on my screen.
I already track window focus myself. (SDL has events for this, after all)
> For direct touches (where there is a screen to index), you can get the index from the SDL window at least. It's definitely more roundabout than a single API though.
Ok, I think I am fundamentally misunderstanding something here. What is a direct touch? Because again, for the multitouch finger events I can't see any way of doing that... As for the single-coordinate fake mouse touch events, sure, but those don't help me with multitouch in the slightest.
I don't mind the device type going in, I just don't think it helps much with this issue. I'm sure someone else finds it useful, but it's really not what I am looking for. (I want a finger-event-to-screen-index-and-coordinates mapping)
On 2018-11-08 12:12:20 +0000, Alex Szpakowski wrote:
(In reply to Jonas Thiem from comment # 12)
> > I wonder what the implications are with respect to window focus. I wouldn't want SDL apps to think my touch input should be going to them, when I'm trying to interact with an unrelated thing on my screen.
> I already track window focus myself. (SDL has events for this, after all)
I trust that you and other advanced users will, but I think it's important to do the right thing by default otherwise many apps won't.
(In reply to Jonas Thiem from comment # 12)
> Ok I think I am fundamentally misunderstanding something here. What is direct touch? Because again for the multitouch finger events, I can't see any way of doing that... as for the single coordinate fake mouse touch events, sure, but that doesn't help me with multitouch in the slightest.
A direct touch device is when the touchscreen is the same physical device as the screen that can have the SDL window on it. For example an iPhone or a touchscreen tablet/laptop hybrid.
If the touch device doesn't have the screen on it (such as a trackpad, for example), there's no such thing as a screen index that your finger is touching, so your code needs to be able to separate those two (i.e. the device type API) if it wants to map direct touches to a particular screen.
For direct touches, with the way SDL's touch coordinates are currently setup (and with the new touch device type API to filter out indirect touches), you can get the screen index that a direct touch belongs to because you have its position relative to the window, and you have the display index of the window via SDL_GetWindowDisplayIndex (or SDL_GetWindowPosition + SDL_GetDisplayBounds if you want to be more precise).
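A sketch of that roundabout route, assuming the application already knows which SDL_Window the direct touch belongs to (which, as discussed below, the finger event itself does not tell you at this point):

```c
#include <SDL.h>

/* Map a direct touch to a display index and global pixel coordinates.
   Assumes the caller already knows the window the touch is relative to. */
int touch_to_display(SDL_Window *window, const SDL_TouchFingerEvent *tf,
                     int *screen_x, int *screen_y)
{
    int wx, wy, ww, wh;
    SDL_GetWindowPosition(window, &wx, &wy);
    SDL_GetWindowSize(window, &ww, &wh);

    /* Direct-touch coordinates are normalized within the window, so scale
       to pixels and offset by the window position. */
    *screen_x = wx + (int)(tf->x * (float)ww);
    *screen_y = wy + (int)(tf->y * (float)wh);

    /* Coarse answer: the display SDL considers the window to be on. */
    return SDL_GetWindowDisplayIndex(window);
}
```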
On 2018-11-08 12:48:41 +0000, Ellie wrote:
> I trust that you and other advanced users will, but I think it's important to do the right thing by default otherwise many apps won't.
I understand, and I don't mind IF there is an obvious way to turn this off. Sadly, for mouse events there isn't, which leads to really annoying problems like the mouse being stuck in the app when an exception interrupts the release of the lock, and other issues.
So if such a function/hint/... to disable focus-filtering touch events is provided, I don't mind. It just actually needs to be THERE.
> If the touch device doesn't have the screen on it (such as a trackpad, for example), there's no such thing as a screen index that your finger is touching, so your code needs to be able to separate those two (i.e. the device type API) if it wants to map direct touches to a particular screen.
No, that's not what I am looking for. I am looking for the mouse translation that would be done for the touch event, NOT the visual screen device touched. This would e.g. map to multiple screens depending on where you touch a trackpad, but obviously only to one screen per specific tfinger instance/specific touch event.
> For direct touches, with the way SDL's touch coordinates are currently setup (and with the new touch device type API to filter out indirect touches), you can get the screen index that a direct touch belongs to because you have its position relative to the window
Can I? I don't think I can, because a window can cover multiple screens, and I don't have that relative position at all (tfinger only has the normalized position over the whole device). This is exactly why I made this ticket.
Summed up:
I think there are a lot of ideas and suggestions going on that don't cover at all what I'm looking for. I don't mind those, but it would be nice if there was an actual way to do what I am actually looking for, which is:
- map tfinger to screen index + screen pos FOR THE ACTUAL MOUSE EVENT (the same mapping the fake mouse touch event does, NOT the screen the touch device is necessarily part of, which wouldn't make sense for e.g. trackpads)
- get this information ALWAYS, irrespective of window focus. I don't care about windows; I can manage my windows on my own. If I need to explicitly enable this unfiltered behavior I don't care, but there needs to be a way to do it
The other ideas are all nice, but don't help me at all.
On 2018-11-08 12:52:42 +0000, Ellie wrote:
Actually, a small correction: thinking about it, I assume most trackpads would not have an absolute mapping at all (since they are usually meant for relative input), so -1 for screen index / no mapping would be completely fine for those. But e.g. a dedicated external graphics tablet set up for absolute input I would expect to be mapped properly, depending on how it's set up (e.g. to cover the range of multiple screens, so events on the left half should be mapped to screen 0, the right half to screen 1, or similar).
On 2018-11-08 13:56:08 +0000, Alex Szpakowski wrote:
(In reply to Jonas Thiem from comment # 14)
> Can I? I don't think I can, because a window can cover multiple screens, and I don't have that relative position at all (tfinger only has the normalized position over the whole device). This is exactly why I made this ticket.
For direct touch presses, tfinger currently has the normalized position inside the window, right? You can figure out the global position from that information combined with the window's position.
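For the "more precise" variant mentioned earlier (SDL_GetWindowPosition + SDL_GetDisplayBounds), a sketch of resolving a global pixel position to a display index:

```c
#include <SDL.h>

/* Given a global (desktop) pixel position, find the display containing it. */
int display_for_point(int gx, int gy)
{
    int n = SDL_GetNumVideoDisplays();
    for (int i = 0; i < n; i++) {
        SDL_Rect r;
        if (SDL_GetDisplayBounds(i, &r) == 0 &&
            gx >= r.x && gx < r.x + r.w &&
            gy >= r.y && gy < r.y + r.h) {
            return i;
        }
    }
    return -1; /* not within any display's bounds */
}
```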
(In reply to Jonas Thiem from comment # 15)
> Actually, small correction: thinking about it, I assume most trackpads would actually not have an absolute mapping at all (since they usually are meant for relative input), so -1 for screen index / no mapping would be completely fine.
This is why the touch device type is an important part of this discussion. :)
On 2018-11-09 07:59:58 +0000, Ellie wrote:
> For direct touch presses, tfinger currently has the normalized position inside the window, right? You can figure out the global position from that information combined with the window's position.
Nope, though that'd be really great. It has no link to any window at all, just to a magical touch device that is completely disconnected from any screen, window, or whatever, which is exactly why I find the current API so completely useless and made this ticket. I just checked the header files again; I'm pretty sure I'm not missing something.
On 2018-11-09 18:14:53 +0000, Alex Szpakowski wrote:
(In reply to Jonas Thiem from comment # 17)
> Nope, that'd be really great. It has no link to any window at all, just to a magical touch device that is completely disconnected from any screen, window, or whatever, which is exactly why I find the current API so completely useless and made this ticket. I just checked the header files again, I'm pretty sure I'm not missing something
It might not be documented very well right now, but direct touch positions are meant to be normalized within the window the touch event came from. It might also be possible for the positions to go outside of [0, 1] depending on how the OS deals with touches that begin inside the SDL window and move outside of it.
Here are some backend implementations, so you can see that this is the case:
On 2018-11-09 22:52:00 +0000, Ellie wrote:
Okay, but if that is true, how do I get the window? I can't see a member for it in either SDL_Event or SDL_TouchFingerEvent. Without the specific window, knowing the position is normalized for one of the windows doesn't help me too much, I'm afraid :x
On 2018-11-09 23:18:52 +0000, Alex Szpakowski wrote:
Yes, that information is missing from SDL's APIs. If it wanted to match the SDL_mouse APIs, it would have the window ID as part of the finger events, as well as a function to get the current window focus for a given touch ID+index combo.
They would return an invalid id / a null window pointer for touches which didn't originate from a direct device type, of course.
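If the finger events did carry a window ID as suggested here, usage could look roughly like the sketch below; the windowID field is an assumption at this point in the thread, it is not yet part of SDL_TouchFingerEvent:

```c
#include <SDL.h>

/* Sketch only: assumes a windowID field on SDL_TouchFingerEvent, which the
   event does not have yet at this point in the discussion. */
static int display_for_finger_event(const SDL_Event *ev)
{
    if (ev->type == SDL_FINGERDOWN || ev->type == SDL_FINGERUP ||
        ev->type == SDL_FINGERMOTION) {
        SDL_Window *win = SDL_GetWindowFromID(ev->tfinger.windowID);
        if (win != NULL) {
            /* From here, screen coordinates follow as in the earlier sketch. */
            return SDL_GetWindowDisplayIndex(win);
        }
    }
    return -1; /* indirect device or unknown window */
}
```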
On 2018-11-10 11:31:39 +0000, Ellie wrote:
Ok, then that missing window is what I would ask to be added within the scope of this ticket, since I'm pretty sure that is all I need to calculate the information I want. Of course the touch device type and other ideas are still useful to have.
On 2018-12-11 19:45:28 +0000, Ellie wrote:
Is there any chance this will be added some time soon? It's almost impossible to do proper multitouch handling in more complex multi-screen setups without this information.
On 2018-12-22 20:27:34 +0000, Ellie wrote:
I'm removing the waiting status since I think it's obvious what is necessary (the window ID is missing from the tfinger struct and needs to be made available), and there is no further discussion going on, in the hope of putting this back onto the radar. It would be really useful if this could be addressed some time soon.
On 2019-02-16 13:47:45 +0000, Ellie wrote:
Is there any plan to fix this moving forward? Or is the multitouch API just going to remain completely unsuitable for multi-screen setups for the foreseeable future? Because that would be a bit of a shame.
On 2019-06-18 13:52:19 +0000, Sam Lantinga wrote:
If all we need is the window ID, that seems reasonable to add, as long as we maintain binary compatibility.
Alex, I haven't thought through how this makes sense for the different types of touch input. Would you like to take a crack at this?
On 2019-08-02 08:44:21 +0000, Ozkan Sezer wrote:
Is that not an abi breakage for 2.0?
On 2019-08-02 11:09:11 +0000, Alex Szpakowski wrote:
(In reply to Ozkan Sezer from comment # 26)
> Is that not an abi breakage for 2.0?
SDL_TouchFingerEvent is part of the larger SDL_Event union, which is guaranteed to be at least 56 bytes. Since SDL_TouchFingerEvent is smaller than that (even with the new field), and since the only SDL functions that use SDL_TouchFingerEvent do so through the larger SDL_Event union, there's no ABI breakage.
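A sketch of that size argument, using SDL's own compile-time assert macro; the guarantee checked is the one described above (the union is padded to at least 56 bytes and SDL_TouchFingerEvent stays within it):

```c
#include <SDL.h>

/* The SDL_Event union is padded to a fixed minimum size, so growing
   SDL_TouchFingerEvent inside that limit leaves the union's size and the
   offsets of its other members unchanged for existing callers. */
SDL_COMPILE_TIME_ASSERT(sdl_event_padding, sizeof(SDL_Event) >= 56);
SDL_COMPILE_TIME_ASSERT(tfinger_fits, sizeof(SDL_TouchFingerEvent) <= sizeof(SDL_Event));
```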
On 2019-08-02 12:36:09 +0000, Ozkan Sezer wrote:
OK then, thanks.
On 2019-08-04 20:11:34 +0000, Alex Szpakowski wrote: