We have quite a few issues with touch-enabled sites on Windows. [1] Our touch support stretches back to when we first implemented MozTouch events, and over time it has morphed into a weird combination of W3C touch and simple gesture support. It is rather messy to fix, but I'd like to get it cleaned up now so that we produce a reliable stream of events on all the Windows platforms we support touch on. (This includes Win7, Win8, and Metro.)

We are constrained by limitations in the way Windows handles touch input on the desktop and by our own implementation. For the desktop browser, there are two different Windows event sets we can work with, and Windows implements them as mutually exclusive: we can receive one type or the other, but not both. The two event sets are Gesture and Touch. The switch we use to decide which to process is a call to nsIWidget's RegisterTouchWindow.
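At the OS level this switch is simple. Here's a minimal Win32 sketch (not the actual widget code) of what RegisterTouchWindow drives; domWantsTouch is a hypothetical stand-in for whatever the DOM asked for:

    #include <windows.h>

    // Once RegisterTouchWindow() succeeds the window receives WM_TOUCH, and
    // Windows stops delivering WM_GESTURE to it; the two event sets are
    // mutually exclusive per window.
    void ChooseEventSet(HWND hwnd, bool domWantsTouch)
    {
      if (domWantsTouch) {
        RegisterTouchWindow(hwnd, 0);   // raw touch: WM_TOUCH
      } else {
        UnregisterTouchWindow(hwnd);    // default: WM_GESTURE (pan/zoom/rotate)
      }
    }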

If RegisterTouchWindow has not been called, we consume only Windows Gesture events, which generate nsIDOMSimpleGestureEvents (rotate, magnify, swipe) and pixel scroll events. For the specific case of panning content, the widget queries the event state manager's DecideGestureEvent to see whether the underlying element wants pixel scroll / pan feedback. [2] Based on the returned panDirection, we request certain Gesture events from Windows and send pixel scroll accordingly. If the underlying element can't be panned in the direction of the input, we opt out of receiving Gesture events and fall back on sending simple mouse input. (This is why you'll commonly get selection when dragging your finger horizontally across a page.)
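To give a feel for the Windows half of that negotiation, here's a simplified sketch of mapping the panDirection answer onto a GESTURECONFIG via SetGestureConfig. The PanDirection enum is a stand-in for the real panDirection values, not Gecko code:

    #include <windows.h>

    enum PanDirection { ePanNone, ePanVertical, ePanHorizontal, ePanBoth };

    void ConfigurePanFeedback(HWND hwnd, PanDirection dir)
    {
      GESTURECONFIG cfg = { GID_PAN, 0, 0 };
      switch (dir) {
      case ePanNone:
        cfg.dwBlock = GC_PAN;  // opt out; fall back on simple mouse input
        break;
      case ePanVertical:
        cfg.dwWant  = GC_PAN | GC_PAN_WITH_SINGLE_FINGER_VERTICALLY;
        cfg.dwBlock = GC_PAN_WITH_SINGLE_FINGER_HORIZONTALLY;
        break;
      case ePanHorizontal:
        cfg.dwWant  = GC_PAN | GC_PAN_WITH_SINGLE_FINGER_HORIZONTALLY;
        cfg.dwBlock = GC_PAN_WITH_SINGLE_FINGER_VERTICALLY;
        break;
      case ePanBoth:
        cfg.dwWant  = GC_PAN | GC_PAN_WITH_SINGLE_FINGER_VERTICALLY |
                      GC_PAN_WITH_SINGLE_FINGER_HORIZONTALLY;
        break;
      }
      SetGestureConfig(hwnd, 0, 1, &cfg, sizeof(GESTURECONFIG));
    }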

On the flip side, if the DOM communicates through RegisterTouchWindow that the window supports touch input, we bypass all Gesture events and instead request Touch events from Windows. In this case we do not fire nsIDOMSimpleGestureEvents, mouse events, or pixel scroll events; instead we fire W3C-compliant touch input. We do not call DecideGestureEvent, and we do not generate pan feedback on the window. You can see this behavior using a good W3C touch demo. [3]
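The raw side of that path looks roughly like the sketch below; DispatchW3CTouch is a hypothetical placeholder for the widget code that converts TOUCHINPUT records into DOM touch events:

    #include <windows.h>
    #include <vector>

    void DispatchW3CTouch(const std::vector<TOUCHINPUT>& points);  // assumed

    LRESULT OnTouch(HWND hwnd, WPARAM wParam, LPARAM lParam)
    {
      UINT count = LOWORD(wParam);
      std::vector<TOUCHINPUT> points(count);
      HTOUCHINPUT handle = reinterpret_cast<HTOUCHINPUT>(lParam);
      if (GetTouchInputInfo(handle, count, points.data(), sizeof(TOUCHINPUT))) {
        // Coordinates arrive in hundredths of a pixel; convert with
        // TOUCH_COORD_TO_PIXEL() before dispatching.
        DispatchW3CTouch(points);
      }
      CloseTouchInputHandle(handle);
      return 0;  // consumed: Windows synthesizes no mouse or gesture fallback
    }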

One of the concerns here is that since we do not differentiate the metro and desktop browsers via the UA string, the two need to emulate each other closely. The browser would appear completely broken to content if the same UA sent two different event streams. So we need to take into account how metrofx works as well.

With metrofx we can differentiate between mouse and touch input when we receive it, so we split the two up and fire the appropriate events for each. When receiving mouse input, we fire standard mouse events. When receiving touch input, we fire W3C touch events and nsIDOMSimpleGestureEvents. We also fire mouse down / mouse up (click) events from touch so that taps on the screen emulate clicking the mouse. Metrofx ignores RegisterTouchWindow, never queries DecideGestureEvent, and does not fire pixel scroll events. Panning of web pages is currently handled in the front end via js in response to W3C touch events, which I might note is not as performant as desktop's pixel scroll handling. In time this front end handling will hopefully be replaced by async pan/zoom, which lives down in the layers backend.
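The mouse/touch split falls out of the WinRT input APIs, along these lines (a hedged C++/CX sketch; both Dispatch helpers are hypothetical placeholders for the metro widget's dispatch code):

    using namespace Windows::UI::Input;
    using namespace Windows::Devices::Input;

    void DispatchMouseEvents(PointerPoint^ point);            // assumed
    void DispatchTouchAndGestureEvents(PointerPoint^ point);  // assumed

    void OnPointerPressed(PointerPoint^ point)
    {
      // Every WinRT pointer carries its source device type, so mouse and
      // touch can be told apart the moment the input arrives.
      switch (point->PointerDevice->PointerDeviceType) {
      case PointerDeviceType::Mouse:
        DispatchMouseEvents(point);            // standard mouse events
        break;
      case PointerDeviceType::Touch:
        DispatchTouchAndGestureEvents(point);  // W3C touch + simple gestures,
        break;                                 // plus click synthesis on tap
      default:
        break;
      }
    }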

Note that the metrofx front end makes very little use of nsIDOMSimpleGestureEvents; the only ones we use are the left/right swipe events for navigation. If we chose to, we could ignore these and not generate nsIDOMSimpleGestureEvents at all. [4]

To clean this up, I'd like to propose the following:

1) Abandon generating nsIDOMSimpleGestureEvents on Windows for both backends when processing touch input from touch displays.*

This would mean that if the desktop front end wants to do something with pinch or zoom, it would have to process W3C touch events instead. Note that we could still fire simple gestures from devices like trackpads; but for touch displays, we would not support these events.

* There's one exception to this in metro: we would continue to fire MozEdgeUIGesture. [5]

2) Rework how we process touch events in the Windows widget such that (roughly sketched in code after the list):

* Both backends respect RegisterTouchWindow and only fire W3C events when it is set.
* If RegisterTouchWindow has been called:
** Send touchstart and the first touchmove, and check the returned event statuses.
** If either of these returns eConsumeNoDefault, continue sending W3C events only. No mouse or pixel scroll events would be sent.
** If neither of these events returns eConsumeNoDefault:
*** Abandon sending W3C touch events.
*** Generate pixel scroll events in the appropriate direction based on DecideGestureEvent, or simple mouse events if DecideGestureEvent indicates scrolling isn't possible.
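A minimal sketch of that proposed flow, assuming the widget can make the call once per touch sequence. The helper functions are hypothetical, and EventStatus mirrors Gecko's nsEventStatus values:

    enum EventStatus { eIgnore, eConsumeDoDefault, eConsumeNoDefault };

    EventStatus DispatchDOMTouchStart();  // fire W3C touchstart, return status
    EventStatus DispatchDOMTouchMove();   // fire the first W3C touchmove
    void ContinueW3CTouchOnly();          // W3C touch events only from here on
    void FallBackToScrollOrMouse();       // pixel scroll per DecideGestureEvent,
                                          // or plain mouse if panning isn't possible

    void ProcessTouchSequence()
    {
      EventStatus start = DispatchDOMTouchStart();
      EventStatus move  = DispatchDOMTouchMove();

      if (start == eConsumeNoDefault || move == eConsumeNoDefault) {
        ContinueW3CTouchOnly();     // content consumed the touch input
      } else {
        FallBackToScrollOrMouse();  // content passed; revert to scroll/mouse
      }
    }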

Feedback welcome on this approach. With Win8 going fully touch-capable, this problem is only going to get worse over time, so I think we need to get it cleaned up and standardized.

There is also the open issue of future support for other touch specs, which I'm not taking into consideration here. If anyone has any input on W3C touch support vs. MS Pointer support, I'd love to hear it. I'd hate to get this all cleaned up only to find that we have to change our touch input processing again for the nth time. Maybe now is a good time to abandon W3C touch completely; as I understand it, MS Pointer events solve some of the problems we have with mixed touch/mouse input.

Jim

[1] https://bugzilla.mozilla.org/show_bug.cgi?id=806805
[2] http://mxr.mozilla.org/mozilla-central/source/content/events/src/nsEventStateManager.cpp#2946
[3] http://paulirish.com/demo/multi
[4] http://mxr.mozilla.org/mozilla-central/source/browser/metro/base/content/input.js#1001
[5] http://mxr.mozilla.org/mozilla-central/source/dom/interfaces/events/nsIDOMSimpleGestureEvent.idl#89
