Richard Yao posted on Sat, 12 Sep 2015 17:24:41 +0000 as excerpted:

> I asked in #gentoo-embedded on freenode, but I would like to pose this
> question to a wider audience through the list.
> 
> What is the status of phone/tablet support? In specific, I am curious
> about:
> 
> * Modern hardware options (especially those without a hardware keyboard)
> * Status of F/OSS drivers
> * Xorg touch support (multitouch, virtual keyboards, etcetera)
> * Desktop environments designed for multitouch in portage
> * Graphical applications (e.g. dialer, calendar, web browser,
> email client)
> * Reliable filesystem options for flash storage in small memory
> environments.

I've actually been thinking about replacing my (dearly departed) netbook 
with a tablet (x86_64-based, for me, the idea being to build once and 
share packages across machines where possible), so I too am interested.

Meanwhile, based on my own experience I can already answer one of them, 
multitouch, in the positive, at least to some extent, with more limited 
comments on some of the others as well.

[TL;DR: just stop reading here.]

As it so happens, I have a touchpad (a Logitech T650) attached to my 
workstation here.  The kernel has supported the hardware since 3.16 or 
3.18, IIRC; before that it didn't /quite/ work, tho some other touchpads 
did.

The xorg driver choices are xf86-input-evdev, -synaptics, and -mtrack.
-evdev doesn't work well without hardware buttons, however, so that 
leaves -synaptics, and -mtrack.

xf86-input-synaptics is the more mature of the two and works reasonably 
well, but is relatively limited in the number of gestures it recognizes.  
OTOH, it can use pressure as a Z axis, often controlling speed or brush-
width, for example, while mtrack doesn't.  As a mature driver, it has a 
manpage, and additional optional apps such as a kcontrol (aka kde system 
settings) applet.  It's limited to the standard seven "button" events, 
however, that being its biggest drawback, for me.

xf86-input-mtrack is what I'm using, and am quite happy with, tho being 
less mature it has a readme instead of a manpage, and is otherwise less 
commonly supported (no kcontrol applet for it, for example).  It supports 
far more gestures, and can work in either relative mode, which is what 
I'm using with my touchpad, or absolute mode, which a number of users 
reported works quite well on their touch-screens.

What mtrack does for gestures is translate them into xorg button events, 
registering up to 20 "buttons", depending on which gestures you have 
configured.

The first seven buttons are standardized to left/middle/right, scroll-up/
down/left/right.  Both mtrack and synaptics default 1-3 finger taps to 
the first three buttons, normally left/right/middle mouse buttons.  They 
also map two-finger-swipes to scroll events in the appropriate direction, 
up/down/left/right, buttons 4/5/6/7.  Beyond button 7, however, is not 
standardized.

Mtrack, however, lets you configure three-finger-swipe up/down/left/right 
(mapped to four "buttons"), four-finger-swipe up/down/left/right (four 
more), two-finger pinch-in/out and rotate-left/right (four more), and 
finally, 4-finger-tap (a final one more) for a total of 20 "buttons", 
with 7 having standardized mapping and 13 not standardized.
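
For reference, wiring the gestures to that button layout might look 
something like the following xorg.conf snippet.  This is a sketch, not 
my actual config: the option names are as I recall them from the mtrack 
README, so double-check them against the version in portage, and the 
identifier is made up.

```
Section "InputClass"
        Identifier      "mtrack touchpad"
        MatchIsTouchpad "on"
        Driver          "mtrack"
        # Nonstandard buttons 8+ (button numbers match the ones
        # used in the sxhkd examples in this post):
        Option "SwipeUpButton"     "8"    # three-finger swipes
        Option "SwipeDownButton"   "9"
        Option "SwipeLeftButton"   "10"
        Option "SwipeRightButton"  "11"
        Option "Swipe4UpButton"    "12"   # four-finger swipes
        Option "Swipe4DownButton"  "13"
        Option "Swipe4LeftButton"  "14"
        Option "Swipe4RightButton" "15"
        Option "ScaleDownButton"   "16"   # two-finger pinch in/out
        Option "ScaleUpButton"     "17"
        Option "RotateLeftButton"  "18"   # two-finger rotate
        Option "RotateRightButton" "19"
        Option "TapButton4"        "20"   # four-finger tap
EndSection
```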

Because mtrack maps these gestures to unstandardized X button events, 
apps such as firefox that normally recognize gestures such as pinch 
(which it would use for zoom) won't directly recognize them here, since 
there's no standard or direct button to gesture mapping beyond the first 
7 buttons.

So you have to run another app to map button events to actual actions.  
I ended up using an app called sxhkd, the simple X hotkey daemon, for 
that.  
This is a very powerful app that lets you map X keyboard and mouse events 
to various actions, so in addition to the mouse events we're talking 
here, people with keyboards with "extra" keys can configure it to watch 
for those events and trigger various actions, too.

In my sxhkd config I break the 13 nonstandard button events down into 
three groups, based on whether the actions they perform should apply 
globally, be app-specific, or be a global idea applied per app.

The globally mapped actions are no problem.  I simply use another app 
(xdotool) to perform the specific action, regardless of what window is 
actually active.

The app-specific actions are somewhat more of a problem.  What I ended up 
doing is using a scripted setup that uses xdotool to get the active 
window, and another app, xprop, to get selected properties of that window 
(title, window-class, window-role), and match them against a preconfigured 
set of properties that I know identifies that window.  The script is 
named "winactiveis" and takes a key into the match table as a parameter.  
It returns shell true or shell false depending on whether the active 
window matches the configured values based on the key I fed the script.
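
The script is my own hack, so here's a minimal sketch of the idea.  To 
be clear, this is a hypothetical reconstruction, not the actual script, 
and the class strings in the table are illustrative; you'd verify the 
real ones with xprop yourself.

```shell
#!/bin/sh
# Sketch of a "winactiveis"-style helper: returns shell true if the
# active window matches the class configured for the given key.

# The preconfigured match table: key -> expected WM_CLASS substring.
# (Illustrative values; check yours with xprop.)
class_for_key() {
        case "$1" in
                gwenview) echo "gwenview" ;;
                firefox)  echo "Navigator" ;;
                orion)    echo "dosbox" ;;   # orion runs in DOSBox
                *)        echo "" ;;
        esac
}

# Pure matching logic, separated out so it's testable without X:
# true if the WM_CLASS string contains the key's expected class.
class_matches() {
        want=$(class_for_key "$1")
        [ -n "$want" ] || return 1
        case "$2" in
                *"$want"*) return 0 ;;
                *)         return 1 ;;
        esac
}

# The actual check: query X for the active window and its WM_CLASS.
winactiveis() {
        win=$(xdotool getactivewindow) || return 1
        class=$(xprop -id "$win" WM_CLASS 2>/dev/null)
        class_matches "$1" "$class"
}
```

A fuller version would match against the window title and window-role 
properties the same way, as described above.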

I set up the swipe4 events, along with pinch, rotate, and tap4, as 
global, either fully global (don't care which window is active) or a 
global idea applied per app.  The swipe3 events are app-specific.

So for example, swipe4left/right equate to buttons 14 and 15, and their 
sxhkd config, globally applied, is:

# 14,15: swipe4left/right: window-close/desktop-up
button14
        wmctrl -c :ACTIVE:
button15
        xdotool key super+F12


(wmctrl is another app, window-manager control.  -c tells it to close the 
given window, and :ACTIVE: points it at the active window.  So four-
finger-swipe-left closes the active window.  xdotool key uses the xtest 
extension to simulate the named keys.  In this case, super-F12 aka win-
F12, is the global keyboard shortcut configured in kwin to switch 
desktops.  So four-finger-swipe-right switches to the next desktop.)

Here's the sxhkd config for two-finger-rotate-left/right, a global idea, 
applied per app, to apps that let you rotate an image, in this case only 
gwenview, since that's all I've configured for it so far:

# 18,19: rotate: app-context
button18
        winactiveis gwenview && xdotool key ctrl+l
button19
        winactiveis gwenview && xdotool key ctrl+r

(Here you see the winactiveis shell script, passed "gwenview".  If it 
returns true, gwenview's main window is active, and the xdotool action 
will be triggered, in this case ctrl-l or ctrl-r, which in gwenview is 
configured to rotate the image left or right.)

And here's a more complex multi-app-specific config for three-finger-
swipe-up/down.  These events trigger app specific actions, something 
different for each app it has been configured for.

# 8,9: swipe3up/down: app-specific
# gwenview: start-page/up
# firefox: zoom out/in
# orion: (cancel)/"gl n" (*100)
button8
        winactiveis gwenview && xdotool key super+Return ; \
        winactiveis firefox && xdotool key ctrl+minus ; \
        winactiveis orion && killall xdotool
button9
        winactiveis gwenview && xdotool key Escape ; \
        winactiveis firefox && xdotool key ctrl+plus ; \
        winactiveis orion && xdotool type --delay 1333 "gl ngl ngl ngl 
ngl ngl ngl ngl ngl ngl ngl ngl ngl ngl ngl ngl ngl ngl ngl ngl ngl ngl 
ngl ngl ngl ngl ngl ngl ngl ngl ngl ngl ngl ngl ngl ngl ngl ngl ngl ngl 
ngl ngl ngl ngl ngl ngl ngl ngl ngl ngl n"

(Again, winactiveis matches the active window.  If it's gwenview it does 
one thing, if it's firefox it does something else, and if it's orion (an 
old DOS-based game I run in DOSBox, so it's actually dosbox I'm testing 
for), it does something else again.

For gwenview, swipe3up triggers super-Return, which tells gwenview to go 
to the start page, while swipe3down triggers Escape, which goes a level 
up in the tree: if it's displaying an image, it returns to directory 
view; if it's displaying a directory, it goes a level up in the dir 
tree; etc.

For firefox, swipe3up triggers ctrl-minus, while swipe3down triggers ctrl-
plus, which zoom firefox out and in, respectively.

For orion, swipe3down types a series of keys repeatedly, in order to 
load a saved game and advance one turn, trying to trigger something with 
a small chance of happening.  Once it does, or if something else triggers 
instead, I abort the key repeat using the opposite gesture, swipe3up, 
killing the xdotool invocation doing the repeat.)
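
Incidentally, that long literal string could be generated instead of 
typed out in the config.  A small sketch of the same idea; the repeat 
count of 50 is an assumption, adjust to taste:

```shell
#!/bin/sh
# Generate "gl n" repeated N times, for feeding to xdotool type,
# instead of embedding the long literal in the sxhkd config.
repeat_keys() {
        n=$1
        s=""
        i=0
        while [ "$i" -lt "$n" ]; do
                s="${s}gl n"
                i=$((i + 1))
        done
        printf '%s' "$s"
}

# Usage (repeat count is an assumption):
# xdotool type --delay 1333 "$(repeat_keys 50)"
```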

....

So bottom line, multitouch and gestures work well, particularly with the 
xf86-input-mtrack driver, which recognizes up to 20 gestures and 
translates them into button events.  That's extremely flexible, but 
beyond the 7 standard buttons (1/2/3 and scroll vert/horiz), actually 
mapping the buttons to specific actions tends to be rather complex and 
require a bunch of helper apps and scripting.  Flexible, in that you can 
trigger literally any action you want, per-app or globally, but rather 
complex to set up, with full functionality basically out of reach for 
users who don't know how to script it themselves and who don't have 
someone to do it for them.

....

I wouldn't anticipate any specific issues with onscreen keyboards or 
multi-touch enabled apps, either.

Based on my netbook experience, however, I would be a bit worried about 
app display on low resolution screens, say under 1024 px wide (in 
whichever mode, landscape or portrait, you run), if the app isn't well 
designed for it, and about trying to maintain reasonable app display area 
when the screen is shared with the onscreen keyboard.

Additionally, switching landscape/portrait may well require complex 
scripted solutions of the type I used above for gesture to action 
mapping, tho I'm not familiar enough with the territory to say for sure.  
If you're willing to hack your own solutions using existing tools as I 
did above, I've little doubt you'll be fine.  If not, you're likely to 
run into problems except when using pre-rolled solutions like plasma-
active, which will considerably limit your app selection.  Of course, 
android or the like would be the easy way around that, but if you were 
satisfied with that, you wouldn't have posted the question, right? =:^)
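
To give a flavor of the kind of hack I mean for rotation, here's a 
sketch of a landscape/portrait toggle.  It's entirely hypothetical: the 
output name "eDP1" and the device name "Touchscreen" are assumptions 
you'd replace with your own (see xrandr and "xinput list"), and the 
xrandr parsing is approximate.

```shell
#!/bin/sh
# Hypothetical screen-rotation toggle sketch.  OUT and TOUCH are
# assumptions -- substitute your own xrandr output and xinput device.
OUT=eDP1
TOUCH=Touchscreen

# Pure helper: given the current rotation, return the next one
# (a simple two-state toggle between landscape and portrait).
next_rotation() {
        case "$1" in
                normal) echo left ;;
                *)      echo normal ;;
        esac
}

# Transformation matrix keeping touch coordinates aligned with the
# rotated screen.
matrix_for() {
        case "$1" in
                left) echo "0 -1 1 1 0 0 0 0 1" ;;
                *)    echo "1 0 0 0 1 0 0 0 1" ;;
        esac
}

rotate_toggle() {
        # Current rotation: the first bare rotation keyword on the
        # output's line in verbose xrandr output.
        cur=$(xrandr --query --verbose | awk -v o="$OUT" '
                $1 == o { for (i = 3; i <= NF; i++)
                        if ($i ~ /^(normal|left|inverted|right)$/) {
                                print $i; exit } }')
        new=$(next_rotation "$cur")
        xrandr --output "$OUT" --rotate "$new"
        xinput set-prop "$TOUCH" "Coordinate Transformation Matrix" \
                $(matrix_for "$new")
}
```

Bind that to a gesture button in sxhkd, or to an accelerometer event if 
you can get one, and you have a crude rotation switch.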

My primary concern, however, is lack of data on specific hardware kernel 
drivers.  The kernel drivers for my t650 touchpad, for instance, simply 
weren't complete when I first got the device, and I was looking at all 
sorts of potential workarounds, but I regularly run mainline Linus pre-
release git kernels, and after one kernel update the device suddenly 
registered differently and more completely.  After that, I could even 
use evdev with it, tho as I said, evdev functionality is extremely 
limited, and both synaptics and mtrack were basically full-functionality 
without further configuration, beyond tweaking the details.  I'd 
anticipate similar issues for touchscreen hardware, etc.

Tho if it has USB support you can always plug in a standard keyboard and 
trackpad and use that until you get the builtin hardware up and running.

And I should also mention: don't expect keyboard or mouse input at the 
CLI, unless you have something plugged in.  Touchscreen and onscreen 
keyboard work effectively only in X.  (My touchpad works with gpm to the 
extent that the pointer moves, but without hardware buttons I can't 
really click anything, which means no onscreen keyboard at the CLI 
either, even if there's some ncurses or whatever based solution that 
presumably would work with gpm, and I'm not aware of any, tho perhaps 
I've just missed it.)

-- 
Duncan - List replies preferred.   No HTML msgs.
"Every nonfree program has a lord, a master --
and if you use the program, he is your master."  Richard Stallman

