Pointer Event Overhaul #3833
This would require internal tracking on some backends, because pointer events generally don't have information about position when you click.
I don't think it's a great idea to stuff more data into simple events when users don't really care about them.
It's not keyboard input; it's on the same level as mouse buttons, so applying anything keyboard-related to them is not going to work. I'd rather just have an extension for that which users can implement. In general, I'd suggest that we have a generic handler. For example, with generic touch input you want to emulate scrolling with 2 fingers, and maybe have some other gestures; however, that will collide with users who want their own 2-finger handling (maybe some touch-based game, etc.), so such emulation should have a way to be turned off. If we provide a way to turn such emulation off, it basically means that we don't have to do anything about the identity of the pointer at all, because once users start to care more about a specific source of input, they can implement a protocol for Touch/Tablet/Whatever input. So I'd say that:
I think that's how it's usually done, even if you look into more complex systems with Wayland protocol emulation, etc. This also solves the performance concern: carrying fatty structs around when 90% of users won't ever look into them (all the tablet tool stuff) is not really a great thing for performance. Besides, a trait extension means that all the unused code will be straight-up deleted, and for Wayland I can completely avoid binding the protocols at all, meaning that even less code will be executed on the client in the end, and things will generally be more compact.
Uhm, the rest of your text argues in favor of generic input handling. E.g. pen can also be handled in your proposed
If you have experience with how others did that, it would be great to hear. Who is "usually"?
I think we are talking about around 64 bytes, so I don't think performance plays a role here.

I think, in conclusion, you are suggesting taking the alternative route I proposed under "Keep `WindowEvent::Touch`". There was one uncertainty that came up while reading your post: I'm not sure if you intend to stop routing events to the generic handler once users have registered a specialized one, which I believe would be a mistake. The motivation behind this is described in "Unified Pointer Handling", but to add more to that, we should take into account how Winit is used in the wild, which is mostly not directly, e.g. through Bevy or Eframe/Egui, our biggest users on crates.io (not counting Tauri, because AFAIK they don't expose Winit). In cases like this it should be possible for libraries to hook into Winit without actually changing the Winit input for users. We have talked about use-cases like this in the past and came to the consensus that we want to add explicit event-suppression, but not implicit!

We will discuss this proposal in the next meeting, so I will adjust it to whatever conclusion we come to then, but otherwise I'm happy to change it to keeping specialized event handlers.
The event type (`size_of::<WindowEvent>()`) is already about five times larger than what the pointer event variants would end up at after adding all the pen axes as `f32` each.
If there is such a benefit, wouldn't the other types of events also benefit from the same kind of API, such that other code paths could be optimized out? This sounds more like a general refactor of the event system. But why does the implementation of touch and pen input in particular have to be burdened by it? Why can't this be a feature flag instead? Why can't this be a `const bool` in `ApplicationHandler` instead? Besides, this is quite a verbose interface: you'd have to implement a trait, and also implement a method that returns a `dyn` trait reference.
Because it's exactly how the Wayland API works. All other backends, except maybe X11, won't benefit much from this. But I don't mind that; I think we should try to use each backend to its fullest if possible. In this case it comes with little downside.
We know that in general the trait-based approach has been poorly received, but on the other hand we haven't really been offered an alternative solution. See #3432.
yes, it should be left on its own, since it's probably similar to mouse buttons.
Hm, thinking about it, users have to store some state for their renderer anyway. The only issue is when you want to do scrolling for the touch input, since you generally want to stop such a thing, so I guess having a trait should disable some of the handling. That would also mean that we'd need a pointer device type identity, but that shouldn't be a problem.
I think GTK; they usually don't bind stuff unless you explicitly ask for it, at least it was like that with some handlers, like text-input (IME).
That's what the plan is, see #3432.
You cannot test all the feature permutations: if you have 12 features, that's 2^12 combinations, and that's just from the events. Now add other features and backend-specific features, and you have untestable code.
You can't do much with such a bool, so no. A bool will just require you to have every possible method in your application; it won't result in all of that code being deleted, unlike the trait approach.
It's as verbose as enabling a feature; doing

```rust
fn pointer_handler(&self) -> Option<&dyn PointerHandler> {
    Some(self)
}
```

is not much different from figuring out what feature to enable and adding a bunch of `cfg`-gated code.
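To make the trade-off concrete, here is a minimal, self-contained sketch of the opt-in pattern being discussed. All names here (`PointerHandler`, `pen_pressure`, the app types) are hypothetical stand-ins extrapolated from this thread, not actual Winit APIs:

```rust
// Hypothetical sketch of the opt-in extension-trait pattern; not real Winit API.

/// Specialized pointer input, implemented only by applications that care.
trait PointerHandler {
    /// React to pen pressure; returns e.g. the stroke width to use.
    fn pen_pressure(&self, pressure: f64) -> f64;
}

/// Stand-in for Winit's `ApplicationHandler`.
trait ApplicationHandler {
    /// Default: opted out. A backend seeing `None` can skip binding the
    /// related protocols/selectors entirely, so specialized code paths
    /// are never even registered.
    fn pointer_handler(&self) -> Option<&dyn PointerHandler> {
        None
    }
}

struct BasicApp; // never thinks about pens or touch
impl ApplicationHandler for BasicApp {}

struct DrawingApp {
    base_width: f64,
}

impl PointerHandler for DrawingApp {
    fn pen_pressure(&self, pressure: f64) -> f64 {
        self.base_width * pressure
    }
}

impl ApplicationHandler for DrawingApp {
    fn pointer_handler(&self) -> Option<&dyn PointerHandler> {
        Some(self) // opts in
    }
}

fn main() {
    // The basic app never receives specialized events.
    assert!(BasicApp.pointer_handler().is_none());

    // The drawing app opts in and gets pen data routed to it.
    let app = DrawingApp { base_width: 4.0 };
    let width = app.pointer_handler().map(|h| h.pen_pressure(0.5));
    assert_eq!(width, Some(2.0));
}
```

The point of the default `None` is that an application which never mentions `PointerHandler` lets the linker drop that code entirely, which a runtime `bool` cannot do.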
macOS is the same: you can basically stop registering selectors if the user doesn't want certain events to be handled. I think Windows is also the same, since you can omit some event handling altogether.
@daxpedda In general, we can have both traits working at the same time; it's just that we'd change certain behavior, like disabling touch scrolling emulation, once you enable touch or other advanced emulations we might implement.
No, what I mean is: why are we not routing it to mouse buttons then? Ergo the proposed
On macOS you don't require specific events for different pointer types, e.g. right now we already handle some pen input on macOS without registering any additional selectors. But maybe there is more I don't know about.
I'm rather strongly against adding exceptions like this; I believe we should decide to go either one way or the other.
I mean, users who do multi-touch input want to handle scrolling themselves; the default behavior for clients that don't handle touch input is two-finger scrolling. Maybe we'd need something like an opt-out on the handler.
Changes from the latest meeting:
- Rename `CursorMoved` to `PointerMoved`.
- Rename `CursorEntered` to `PointerEntered`.
- Rename `CursorLeft` to `PointerLeft`.
- Rename `MouseInput` to `PointerButton`.
- Add `position` to every `PointerEvent`.
- Remove `Touch`, which is folded into the `Pointer*` events.
- New `PointerType` added to `PointerEntered` and `PointerLeft`, signifying which pointer type is the source of this event.
- New `PointerSource` added to `PointerMoved`, similar to `PointerType` but holding additional data.
- New `ButtonSource` added to `PointerButton`, similar to `PointerType` but holding pointer-type-specific buttons. Use `ButtonSource::mouse_button()` to easily normalize any pointer button type to a generic mouse button.
- In the same spirit, rename `DeviceEvent::MouseMotion` to `PointerMotion`.
- Remove `Force::Calibrated::altitude_angle`.

Fixes rust-windowing#3833. Fixes rust-windowing#883. Fixes rust-windowing#336.
Co-authored-by: Kirill Chibisov <[email protected]>
Goals
Unified Pointer Handling
We currently handle `Touch` completely separately from mouse events (`WindowEvent::Cursor*`); the initial idea was, I assume, that the type doesn't really fit all the information that touch wants to express. This would make sense, but comes with a big downside: if users don't explicitly handle the `Touch` event, their application just won't work on touch devices.

This issue will only grow if we want to introduce pen/stylus support (#99). So to address this, we take inspiration from the Web Pointer Events Specification, which was mainly introduced to address this exact problem. The idea is to have a single event for all pointer input, which contains details about all possible inputs, be it touch, pen/stylus or anything else. That way users don't have to explicitly handle certain input types, but can still choose to do so, without accidentally leaving one unsupported.
Addresses #1555 (doesn't fix it, because there is a performance component to this on Wayland that would not be addressed by this proposal).
Fixes #336.
Pen/Stylus Support
To support pen/stylus we run into the same problem as above: we can't introduce a separate event, otherwise applications will suddenly stop working when a pen is used and they don't explicitly handle it.
Additionally, this will require adding much more data to our current `WindowEvent::Cursor*` events. So while we are at it, we should figure out an API that will allow us to add more pointer type support in the future.

Addresses #99 (implementation is being done in #3810).
Position Data in Pointer Events
We are currently missing position data in all pointer events except `WindowEvent::CursorMoved`. This isn't just annoying for users, who have to handle two events to get the information they want; it also requires implementations to account for that and send "fake" `WindowEvent::CursorMoved` events to report the position.

The solution should be quite simple: add `position: PhysicalPosition<f64>` to all pointer events.

Fixes #883.
Fixes #1909.
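As a rough illustration of what this buys users (the event shapes below are assumptions based on this proposal, not the actual Winit definitions), the click position can be read straight off the button event instead of being cached from a separate move event:

```rust
// Hypothetical event shapes; not the actual Winit definitions.
struct PhysicalPosition<T> {
    x: T,
    y: T,
}

enum WindowEvent {
    // Previously only the move event carried a position...
    PointerMoved { position: PhysicalPosition<f64> },
    // ...with this proposal, button (and enter/leave) events carry it too.
    PointerButton { position: PhysicalPosition<f64>, pressed: bool },
}

/// No state tracking needed: the press event is self-contained.
fn click_position(event: &WindowEvent) -> Option<(f64, f64)> {
    match event {
        WindowEvent::PointerButton { position, pressed: true } => {
            Some((position.x, position.y))
        }
        _ => None,
    }
}

fn main() {
    let event = WindowEvent::PointerButton {
        position: PhysicalPosition { x: 10.0, y: 20.0 },
        pressed: true,
    };
    assert_eq!(click_position(&event), Some((10.0, 20.0)));
}
```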
Implementation
- Rename `DeviceEvent::MouseMotion` to `DeviceEvent::PointerMotion` to represent any pointer device.
- Rename `WindowEvent::Cursor*` to `WindowEvent::Pointer*`. "Cursor" is more commonly used when describing the visual representation of a "Pointer", which we would like to avoid when talking about touch or pen/stylus.
- Add `position: PhysicalPosition<f64>` to `WindowEvent::PointerEntered/Left`.
- Rename `WindowEvent::MouseInput` to `WindowEvent::PointerButton`.
- Fold `WindowEvent::Touch` into pointer events:
  - `TouchPhase::Started` -> `PointerButton` with `ElementState::Pressed` & `PointerEntered`
  - `TouchPhase::Moved` -> `PointerMoved`
  - `TouchPhase::Ended` -> `PointerButton` with `ElementState::Released` & `PointerLeft`
  - `TouchPhase::Cancelled` -> `PointerLeft`
- Add new `PointerType`, `PointerSource` and `ButtonSource` enums, which can hold information about any device type. `ButtonSource` should be convertible to `MouseButton`, so users can do general pointer input handling without understanding each device type.

Proposed API
Alternative Changes
Keep `WindowEvent::Touch`

We could additionally emit the `Touch` event, and introduce a new event for each device type, for user convenience; having a dedicated place to handle a specific device type is certainly a nice touch. Though it would lead to a duplication of events, which personally I'm not a big fan of.

Remove `position` from `WindowEvent::PointerLeft`

While this is certainly useful, I don't know if it makes sense on all backends; it certainly works on Web. We could either wrap it in an `Option`, if this information is not available on all backends, or remove it entirely.

Add a `WindowEvent::PointerChange` event

Pointer devices like touch or pen/stylus can update their details without actually moving. In the current proposal this would use `PointerMoved`, but we could introduce a new `PointerChange` event to address this. However, like other pointer events, we would probably still want to include a `position` field, so the data would look exactly like `PointerMoved`.

Add `KeyType`

Maybe "Key" is not the best term to use here, but we might want to use a similar device type distinction for keyboard input as well. This would probably include renaming the `KeyboardInput` event to `KeyInput`. The question is really what device types would show up here. My understanding is that devices like joysticks, steering wheels and the like would all fall under gamepads, which we already decided can be fully implemented by external libraries without any help from Winit.

The reason this has come up here is Wayland's tablet protocol, which exposes a surprising number of graphics-tablet-specific input events. However, AFAIK this is very Wayland-specific, and no other backend exposes this amount of specific information. No matter how we want to address it though, it should at least be routed to `KeyboardInput`, for the same reason we want to route all pointer events to the `Pointer*` events.

Gestures
events.Gestures
The idea would be to implement a new
Gesture
event with an enum containing all possible gestures. This is only related to this proposal because we might want to addPointerType
/PointerSource
to it (or a variation of it).We currently have the following gestures:
DoubleTapGesture
PanGesture
PinchGesture
RotationGesture
And more in the coming like #3413. Additionally I would also argue that we could fit
MouseWheel
and pen/stylus "actions" into this as well (#3768).I think this would make it much easier to add new gestures and centralize a place to handle them. Adding
PointerType
/PointerSource
to it would also allow users to identify what kind of device this is coming from, which brings me to the next topic.Touchpad
This is related to gestures and #2379. The idea is that we need more information on touch events if they relate to a touchpad and not a touchscreen. This could easily be addressed by the current proposal by adding a
Touchpad
variant toPointerType
/PointerSource
(where we could also expose the touchpad coordinates).I don't know how this would look implementation-wise on other backends, but Web has an open issue for that: #2963.
FAQ
How would touch events work?

1. `PointerEntered`*
2. `PointerButton` with `ElementState::Pressed`
3. `PointerMoved`
4. `PointerButton` with `ElementState::Released` (when not cancelled)
5. `PointerLeft`*

*: Only emitted when it makes sense, e.g. on a touch screen. A touchpad doesn't emit those unless actually pressed.

This is exactly like using a mouse, fulfilling the purpose of this proposal. The only difference is that all events come with `PointerSource`/`PointerType::Touch`, exposing the `FingerId` and `Force`.

How would pen/stylus input work?

Exactly the same as touch events, but coming with `PointerSource`/`PointerType::Pen/Eraser/Airbrush`, exposing any relevant information.

How would pen/stylus hover work without touching the screen?

Exactly like touch events, but not emitting `PointerButton` unless the pen/stylus actually makes contact. This would also be exactly how touchpads work.

How would changes to e.g. tilt of a pen/stylus be represented?

By using the `PointerMoved` event. See the "Add a `WindowEvent::PointerChange` event" alternative.

How would graphic tablet buttons be represented?

See the "Add `KeyType`" alternative. In any case, they have to go through the regular `KeyboardInput`, unless we want to repeat the same issue that motivates this proposal: if users don't explicitly handle it, the app won't react at all. This would sometimes make sense when certain actions are not translatable to e.g. keyboard input, but often they are.
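Putting the FAQ answers together, generic code can treat every pointer the same while specialized handling stays opt-in. Again, this is a hypothetical sketch with made-up names, not the final API:

```rust
// Hypothetical sketch: generic pointer handling with optional specialization.

#[derive(Debug, Clone, Copy, PartialEq)]
enum MouseButton {
    Left,
    Right,
}

enum ButtonSource {
    Mouse(MouseButton),
    Touch { finger_id: u64 },
    Pen { pressure: f64 },
}

impl ButtonSource {
    fn mouse_button(&self) -> MouseButton {
        match self {
            ButtonSource::Mouse(b) => *b,
            // A touch contact and a pen tip both act as a primary click.
            ButtonSource::Touch { .. } | ButtonSource::Pen { .. } => MouseButton::Left,
        }
    }
}

/// Application code that works for mouse, touch and pen alike.
fn on_pointer_button(source: &ButtonSource) -> &'static str {
    // Specialized handling is opt-in...
    if let ButtonSource::Pen { pressure } = source {
        if *pressure > 0.8 {
            return "thick stroke";
        }
    }
    // ...while everything else falls back to generic mouse semantics.
    match source.mouse_button() {
        MouseButton::Left => "select",
        MouseButton::Right => "context menu",
    }
}

fn main() {
    // A touch tap behaves like a left click without any touch-specific code.
    assert_eq!(on_pointer_button(&ButtonSource::Touch { finger_id: 0 }), "select");
    // Pen-aware code can still react to pressure.
    assert_eq!(on_pointer_button(&ButtonSource::Pen { pressure: 0.9 }), "thick stroke");
}
```

This is the behavior the proposal is after: an application that never mentions touch or pen still responds to them correctly, and specialization is a local, deliberate choice.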