Orca, AT-SPI and the Compiz zoom project

Willie Walker William.Walker at Sun.COM
Tue May 1 14:19:36 BST 2007


> I've been trying to familiarise myself with the whole world of
> accessibility, but I'm having a bit of trouble understanding what I'll
> actually need for my Compiz zoom project this summer.

Welcome!

> I know pretty much everything I need to know on the compiz-side of
> things, but accessibility is new to me, and I'm hoping for a few
> pointers for where to look for information, or rather what information
> to look for.

The current API for driving a magnifier is the gnome-mag API.  It's a
pretty complex API, and I think there may be an opportunity for
improvement with something such as Compiz.  From my standpoint (that
of a consumer), the main concepts behind gnome-mag are:

1) The magnifier is a system service to be accessed by an assistive
technology (AT).  ATs discover (and, if necessary, activate) the
gnome-mag magnifier and talk to it.

2) Each AT needs to write its own configuration GUI for the magnifier.

3) The AT tells the magnifier to create one or more 'zoomers', each of
which magnifies an area of the display.  Each zoomer can have various
settings associated with it, and the AT can update the region of
interest for each zoomer at any time.

4) In use, the AT tells each zoomer to magnify an area of the screen,
typically in response to mouse movement, object focus or caret
movement events, and/or speech progress.  (A rough sketch of this
interaction follows the list.)
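
To make the zoomer/region-of-interest idea a bit more concrete, here
is a rough Python sketch of that interaction pattern.  The class and
method names are purely illustrative placeholders, not the actual
gnome-mag interface (which is CORBA/Bonobo based and considerably more
involved):

# Illustrative only: these class/method names are placeholders, not
# the actual gnome-mag (CORBA/Bonobo) interface.

class Zoomer:
    """One magnified view of some area of the display."""

    def __init__(self, viewport, magnification=2.0, smoothing="bilinear"):
        self.viewport = viewport            # where the zoomer is drawn (x, y, w, h)
        self.magnification = magnification  # per-zoomer settings
        self.smoothing = smoothing
        self.roi = (0, 0, 0, 0)             # region of interest on the desktop

    def set_roi(self, x, y, width, height):
        # The AT calls this whenever the region of interest should change.
        self.roi = (x, y, width, height)
        # ...the magnifier would re-render the viewport from the new ROI...

class Magnifier:
    """The system service an AT discovers/activates and talks to."""

    def __init__(self):
        self.zoomers = []

    def create_zoomer(self, viewport, **settings):
        zoomer = Zoomer(viewport, **settings)
        self.zoomers.append(zoomer)
        return zoomer

# Typical AT usage: one zoomer on the right half of a 1280x1024 display,
# updated as the caret (or mouse, or speech) moves.
magnifier = Magnifier()
zoomer = magnifier.create_zoomer((640, 0, 640, 1024), magnification=3.0)
zoomer.set_roi(100, 200, 213, 341)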

With Compiz, I think there's an opportunity for the magnifier to be
more autonomous while continuing to offer itself up for interaction
with other assistive technologies.

As a loose analogy to this, one might consider what is done with
BrlTTY/BrlAPI.  BrlTTY runs as its own process, providing interaction
with braille displays, most notably for virtual consoles.  BrlAPI
provides an interface for other applications to access the braille
display while BrlTTY is running.  By default, when Orca detects BrlTTY
is running, it assumes the user wants braille and goes ahead and uses
it.  

Something similar could be done with Compiz.  For example:

1) Compiz could provide its own sophisticated GUI for the magnifier,
allowing for at least what could be accomplished with gnome-mag (and
perhaps more).  This UI would also include the ability to
create/destroy/modify magnifier views of the display.  I suspect a
typical use case would be that the user has a main 'dynamic' magnified
region that tracks where they are on the display and zero or more
'static' magnified regions that monitor fixed areas of the display
(roughly the first sketch after this list).  This would be a HUGE
improvement, and would not require all other ATs to write their own
magnifier GUIs.

2) Compiz itself could more easily provide support for the various
mouse tracking methods (e.g., centered, proportional, push) rather
than requiring the assistive technology to do so.  (The last sketch
after this list includes a simple 'centered' tracker.)

3) Compiz could expose itself as a simpler service to the screen
reader, with an API that provides mostly hints/recommendations about
the area of the screen to magnify (roughly the second sketch after
this list).  One would need to engage end users in a discussion about
what they really want to achieve, but I'd guess the API would mostly
be for suggesting where the 'dynamic' magnified region should go.
Again, talking with end users would be a requirement.

4) If need be, the assistive technology could provide font/text hints
about the portions of the screen being magnified, but I suspect Compiz
itself could probably accomplish this by using AT-SPI directly (see
the last sketch after this list).
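
As a tiny illustration of the 'dynamic' versus 'static' regions in
point 1, here's the kind of configuration such a GUI might produce.
The format and field names are made up for illustration only:

# Hypothetical configuration -- the format is not an existing API.
magnifier_config = {
    "regions": [
        # One 'dynamic' region that follows the user around the display.
        {"type": "dynamic", "viewport": (640, 0, 640, 1024),
         "magnification": 3.0, "tracking": ["focus", "caret", "mouse"]},
        # One 'static' region pinned to a fixed area (e.g. the panel).
        {"type": "static", "viewport": (0, 0, 640, 200),
         "magnification": 2.0, "watch": (0, 984, 1280, 40)},
    ]
}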
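
As a strawman for the hints/recommendations service in point 3, here's
what the interaction might look like from the AT's side.  The D-Bus
service name, object path, interface, and method are all made up --
this is not an existing Compiz or gnome-mag API:

import dbus

bus = dbus.SessionBus()
# Hypothetical service/object/interface names.
proxy = bus.get_object("org.example.CompizMagnifier",
                       "/org/example/CompizMagnifier")
hints = dbus.Interface(proxy, "org.example.CompizMagnifier.Hints")

# "The caret just moved into this rectangle; please consider showing it."
hints.SuggestRegionOfInterest(412, 380, 240, 32, "caret")

# "Speech is currently reading this paragraph."
hints.SuggestRegionOfInterest(100, 300, 600, 180, "speech")

The key point is that the AT only suggests; the magnifier stays free
to combine or ignore hints based on the user's settings.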
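
And for points 2 and 4, here's a rough sketch of how the magnifier
itself could follow the user through AT-SPI.  It uses the pyatspi
Python bindings purely as an illustration (Compiz would presumably use
the C bindings and its own rendering path), and the zoom factor,
viewport size, and set_roi() helper are assumptions:

import pyatspi

ZOOM = 3.0                  # assumed zoom factor
VIEW_W, VIEW_H = 640, 1024  # assumed size of the magnified viewport

def set_roi(x, y, width, height):
    # Placeholder for whatever actually moves the magnified region.
    print("ROI:", x, y, width, height)

def center_on(x, y):
    # Simple 'centered' tracking: keep the point of interest mid-viewport.
    w, h = int(VIEW_W / ZOOM), int(VIEW_H / ZOOM)
    set_roi(int(x - w / 2), int(y - h / 2), w, h)

def on_focus(event):
    # Follow keyboard focus: magnify the extents of the focused object.
    if not event.detail1:   # 0 means focus was lost, not gained
        return
    try:
        box = event.source.queryComponent().getExtents(pyatspi.DESKTOP_COORDS)
        set_roi(box.x, box.y, box.width, box.height)
    except NotImplementedError:
        pass

def on_caret_moved(event):
    # Follow the caret: detail1 carries the new caret offset.
    try:
        text = event.source.queryText()
        x, y, w, h = text.getCharacterExtents(event.detail1,
                                              pyatspi.DESKTOP_COORDS)
        center_on(x + w / 2, y + h / 2)
    except NotImplementedError:
        pass

def on_mouse(event):
    # 'mouse:abs' events carry the pointer position in detail1/detail2.
    center_on(event.detail1, event.detail2)

pyatspi.Registry.registerEventListener(on_focus,
                                        "object:state-changed:focused")
pyatspi.Registry.registerEventListener(on_caret_moved,
                                        "object:text-caret-moved")
pyatspi.Registry.registerEventListener(on_mouse, "mouse:abs")
pyatspi.Registry.start()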

I know I'm going to repeat myself, but I think now is the time to
really engage end users, gather their requirements, and then develop
the architecture from that.

Hope this helps.  This is going to be exciting and useful stuff for
the community.

Will




