Clients reading their surface position on screen
Daniel van Vugt
daniel.van.vugt at canonical.com
Tue Jul 22 07:58:32 UTC 2014
If accessibility and Autopilot can live with a point (the centre of a
widget) rather than needing an accurate region, then we can just deal in
one point per widget. And that keeps anpok's dream alive of perfect
input mapping in 3D shells :)
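
For illustration, a minimal sketch (hypothetical names, not real Mir
API) of why one centre point per widget is enough: the compositor
already keeps a per-surface transform, and mapping a single point
through it is always well defined, whereas an axis-aligned region
generally stops being a rectangle once the surface is rotated or
projected in 3D.

// Hypothetical sketch, not actual Mir API: map one widget centre point
// from surface-local coordinates to screen coordinates through whatever
// transform the shell applies to the surface.
#include <glm/glm.hpp>

struct Widget
{
    glm::vec2 centre_in_surface;  // widget centre, surface-local coordinates
};

// surface_to_screen is the 4x4 transform the shell uses when compositing
// the surface: screen placement plus any rotation/scale/3D projection.
glm::vec2 widget_centre_on_screen(Widget const& w,
                                  glm::mat4 const& surface_to_screen)
{
    glm::vec4 p = surface_to_screen * glm::vec4(w.centre_in_surface, 0.0f, 1.0f);
    return glm::vec2(p.x / p.w, p.y / p.w);  // perspective divide for 3D shells
}

For a plain 2D shell this degenerates into adding the surface's top-left
position, but the same mapping stays correct when the surface is rotated
or perspective-projected, which an axis-aligned region would not.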
On 22/07/14 15:52, Christopher James Halse Rogers wrote:
>
>
> On Tue, Jul 22, 2014 at 11:23 AM, Luke Yelavich
> <luke.yelavich at canonical.com> wrote:
>> On Tue, Jul 22, 2014 at 09:42:42AM EST, Gerry Boland wrote:
>>> Hey folks,
>>> in working on QtCompositor, I stumbled across a problem ([1]).
>>>
>>> Autopilot needs to know the position of items in an application
>>> (buttons, etc.) in screen coordinates, not surface coordinates. It
>>> needs this because it generates input via uevent, which is defined in
>>> screen coordinates.
>>
>> Qt's accessibility framework also needs to be able to present this
>> information via at-spi to accessibility tools such as Orca.(1) This
>> probably has more to do with Qt internals working with Mir, but even
>> though this thread is primarily about Autopilot, accessibility
>> requirements are worth keeping in mind here as well.
>
> My memory of our discussions in Malta was that it seemed like a good
> idea for Mir itself to take some part of the role of the at-spi registry
> and have toolkits report the surface-relative positions of their
> widgets. Then Mir can provide (appropriately permissioned) accessibility
> applications with the absolute positions of the various widgets.
>
>
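
To make that registry idea concrete, a rough sketch under made-up names
(none of this is real Mir or at-spi API): toolkits push surface-relative
widget centres, and the compositor resolves them to screen coordinates
on behalf of permissioned accessibility or testing clients.

// Hypothetical sketch of the registry data flow described above.
#include <functional>
#include <map>
#include <string>

struct Point { float x, y; };

class WidgetRegistry
{
public:
    // Toolkit side: report (or update) a widget's centre in surface
    // coordinates whenever it moves.
    void report(std::string const& widget_id, Point centre_in_surface)
    {
        widgets_[widget_id] = centre_in_surface;
    }

    // Compositor side: resolve a widget to screen coordinates for a
    // trusted client (Autopilot, an at-spi bridge, ...), using the
    // surface's current transform.
    Point screen_position(std::string const& widget_id,
                          std::function<Point(Point)> const& surface_to_screen) const
    {
        return surface_to_screen(widgets_.at(widget_id));
    }

private:
    std::map<std::string, Point> widgets_;  // widget id -> surface-local centre
};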