Christopher James Halse Rogers
raof at ubuntu.com
Tue Nov 12 00:30:38 UTC 2013
On Mon, 2013-11-11 at 10:33 -0200, Daniel d'Andrada wrote:
> Hi Christopher,
> The job of EventHub, in short, is multiplexing the event streams from
> all those /dev/input/event* files into a single output stream of events
> (those events then having device ids to identify which device they
> came from).
Right, and that's what I'd like to *make* it do. What it *currently*
does is multiplex the event streams. Oh, and (badly) manage hotplug
of input devices. Oh, and manage the translation from hardware
scancodes to keycodes. Oh, and provide a description of input
devices. Oh, and…
My proposal is that EventHub retain its task of multiplexing the event
streams by watching fds provided by InputDevices. I just want to discard
the other ancillary things that it does.
> To me it looks like a very clear-cut task. Thus I think that whatever
> you wanna do with those events, that should take place *after* EventHub.
> Just like InputReader, taking those "raw events" from EventHub and
> "cooking" them.
I don't think it makes sense to have the uncooked events leave the
device-specific code. What currently happens is that InputReader “cooks”
them, but needs to call back into EventHub in order to do that.
Only the device-specific code really knows how to cook the events.
Also, pragmatically, libtouchpad only provides fully cooked events.
> I do agree with you on the vibration API, though. Feels out of place and
> separate from the rest of the API and code. I bet it was added to
> EventHub at a much later time.
> So interpreting evdev event triplets (type, code, value) to come up with
> higher-level events or gestures like tap-to-click, two-finger scrolling,
> software button emulation, etc should definitely be done post-EventHub.
> InputReader already does very similar things.
Doing it at a higher level only makes sense if that higher level allows
us to collect behaviours common to multiple lower levels. Currently the
upper level deals with at least two entirely disjoint lower levels -
keyboards and touchscreens.
I don't think that touchpads are going to share much behaviour with
touchscreens - they are fundamentally different input devices - and what
little behaviour they do share I don't believe will be easy to abstract
at the higher layer.
Fundamentally I believe that event cooking in InputReader is a
misdesign, at least for our expanded use-cases.