Thanks to Samuel and Devin for their input.  As Devin said:

> Smart phones use touchscreen braille input to make typing faster. ...
> This attempts to bring this to Linux smart phones and such like that.

This is precisely my goal.  I'd like to provide touchscreen braille input for 
Linux smart phones.  I'm particularly interested in supporting postmarketOS 
(pmOS), because it targets older (and thus more economical) phones.  However, 
I'd hope that my code would work on any Linux (or for that matter, BSD) variant.

Since I have NO interest in trying to create a screen reader, I need to 
understand the existing packages and what I'd need to do in order to support 
them.  Please feel free to correct and/or supplement these notes...

# BRLTTY

> BRLTTY is a background process (daemon) which provides access to the 
> Linux/Unix console (when in text mode) for a blind person using a refreshable 
> braille display.  It drives the braille display, and provides complete screen 
> review functionality. Some speech capability has also been incorporated. -- 
> https://brltty.app/

This seems pretty promising, but I have no clue how I should send event 
messages to BRLTTY.  I sent a note to their mailing list, but other advice and 
suggestions would be very welcome.

# Orca

Orca appears to support Braille input and output.  I read that:

- "The Orca screen reader can display the user interface on a refreshable 
Braille display."
- "Orca supports contracted braille via the liblouis project."

So, it would seem reasonable to take advantage of Orca's Braille input 
capabilities.  I gather that Orca prefers to use the AT-SPI protocol on D-Bus.  
As Samuel pointed out, I could support this via atspi_generate_keyboard_event.
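
To make that concrete, here is a rough sketch in C using libatspi (the AT-SPI2 client library), which is where atspi_generate_keyboard_event lives. I'm assuming the braille chord has already been decoded into text somewhere else; the helper name and the build line are just my guesses, so treat this as an illustration rather than a recipe:

```c
/* Sketch: inject text into the focused application via AT-SPI.
 * Build (roughly): gcc atspi_input.c $(pkg-config --cflags --libs atspi-2)
 */
#include <atspi/atspi.h>
#include <stdio.h>

/* Hypothetical helper: send one UTF-8 string as a synthesized key event.
 * ATSPI_KEY_STRING delivers the text as a whole, so we don't have to map
 * each character to a keysym plus modifier state. */
static gboolean send_text(const char *text)
{
    GError *error = NULL;
    gboolean ok = atspi_generate_keyboard_event(0, text,
                                                ATSPI_KEY_STRING, &error);
    if (!ok) {
        fprintf(stderr, "AT-SPI key synthesis failed: %s\n",
                error ? error->message : "unknown error");
        g_clear_error(&error);
    }
    return ok;
}

int main(void)
{
    if (atspi_init() != 0) {
        fprintf(stderr, "could not connect to the AT-SPI registry\n");
        return 1;
    }

    send_text("hello");  /* stand-in for whatever the chord decoder produced */

    atspi_exit();
    return 0;
}
```

Whether Orca (and the toolkits it watches) handle ATSPI_KEY_STRING as gracefully as per-keysym synthesis is something I'd appreciate input on.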

# Yasr

I read that "Yasr is a general-purpose console screen reader for GNU/Linux and 
other Unix-like operating systems."  I suspect that I'd want to talk to it via 
uinput, but perhaps Samuel can clarify and/or correct this.
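
In case uinput does turn out to be the right route, here is the sort of thing I have in mind: a virtual keyboard device that the kernel (and therefore yasr, or anything else watching the console) treats like a real one. This assumes a reasonably recent kernel (UI_DEV_SETUP needs >= 4.4) and permission to open /dev/uinput; the device name is made up:

```c
/* Sketch: type a single letter through /dev/uinput so that a console
 * screen reader sees it as ordinary keyboard input. */
#include <fcntl.h>
#include <linux/uinput.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>

/* Queue one input event on the virtual device. */
static void emit(int fd, int type, int code, int value)
{
    struct input_event ev;
    memset(&ev, 0, sizeof(ev));
    ev.type = type;
    ev.code = code;
    ev.value = value;
    write(fd, &ev, sizeof(ev));
}

int main(void)
{
    int fd = open("/dev/uinput", O_WRONLY | O_NONBLOCK);
    if (fd < 0) {
        perror("/dev/uinput");
        return 1;
    }

    /* Declare a keyboard that can send the keys we care about. */
    ioctl(fd, UI_SET_EVBIT, EV_KEY);
    ioctl(fd, UI_SET_KEYBIT, KEY_A);   /* just one key for the demo */

    struct uinput_setup setup;
    memset(&setup, 0, sizeof(setup));
    setup.id.bustype = BUS_VIRTUAL;
    strcpy(setup.name, "braille-chord-keyboard");  /* name is my invention */
    ioctl(fd, UI_DEV_SETUP, &setup);
    ioctl(fd, UI_DEV_CREATE);

    sleep(1);  /* give userspace a moment to pick up the new device */

    /* Press and release 'a', with a SYN_REPORT after each batch. */
    emit(fd, EV_KEY, KEY_A, 1);
    emit(fd, EV_SYN, SYN_REPORT, 0);
    emit(fd, EV_KEY, KEY_A, 0);
    emit(fd, EV_SYN, SYN_REPORT, 0);

    ioctl(fd, UI_DEV_DESTROY);
    close(fd);
    return 0;
}
```

The appeal of this route is that it isn't specific to yasr at all; the drawback is that it needs elevated privileges (or a udev rule) to open /dev/uinput.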

-r
