On 13/02/12 12:59, supernova wrote:
Let's focus on HUD innovation, leaving the menu as a backstop till we have
great solutions for *both* aspects of the menu: exploration/discovery, and
command invocation.
This is important. The HUD has potential, just as the Dash does; together
they are the real tutors of any Ubuntu user. Maybe in the future the HUD
will understand voice commands?!
Supernova
Yeah, it could, fairly easily I think. The menu fragments would need to be
reformed into a JSGF grammar and then passed to PocketSphinx to listen for.
Continuous speech recognition is still some way off, but if there is a
grammar it is listening for (i.e. it matches against a limited vocabulary
rather than the entire English language) then it can be really quite good.
The menu data that the HUD accesses appears to be quite a good corpus to
feed into a vocabulary file. Now that I know a bit about D-Bus
introspection, I'm fairly sure the menu data is in there somewhere, and
that all the bits exist today and just need plumbing together.
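
As a very rough sketch (untested, and the menu labels and file name here
are just made up for illustration), turning a list of menu labels into a
JSGF grammar that PocketSphinx could listen for might look something like
this in Python:

# Sketch: build a JSGF grammar from HUD/menu item labels so PocketSphinx
# only has to match against a small vocabulary instead of all of English.
# The labels below are placeholders; in practice they would come from the
# application's menus (e.g. via D-Bus introspection of the menu data).

menu_labels = ["File Open", "File Save", "Edit Undo", "View Full Screen"]

def write_jsgf(labels, path="menu.gram"):
    # JSGF wants plain alternatives separated by "|"; strip characters
    # that would confuse the grammar and lower-case everything.
    alternatives = " | ".join(
        "".join(c for c in label if c.isalnum() or c.isspace()).lower()
        for label in labels
    )
    grammar = (
        "#JSGF V1.0;\n"
        "grammar menu;\n"
        "public <command> = " + alternatives + ";\n"
    )
    with open(path, "w") as f:
        f.write(grammar)

write_jsgf(menu_labels)

# The resulting grammar could then be handed to PocketSphinx, e.g. with
# the command-line tool (assuming it is installed and a mic is available):
#   pocketsphinx_continuous -inmic yes -jsgf menu.gram

The real work would be pulling the label strings out of the live menus
rather than hardcoding them, which is where the D-Bus introspection would
presumably come in.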
Alan.