From: Alexandre Courbot
Subject: [Adonthell-devel] Input system theory
Date: 19 Feb 2002 16:04:17 +0100

Here is a first thought about the input system. As you will notice, it
looks a lot like ClanLib's (but there aren't 36 ways to do it right,
are there? ;)), tuned to our needs.

On top of everything there would be the input manager. It just keeps a
list of listeners and what they listen to, and calls their callbacks
when needed. Nothing revolutionary here, but I think we might handle a
stack of listeners, with optional signal propagation. That is, the
listener on top of the stack (i.e. the one that has the focus right
now, for example the window you are typing text into) is notified of
the event first. If the callback returns false, that means the listener
hasn't caught the event and it may be propagated to the next one,
etc... This would make sense, as the display is handled by the window
manager, which works in a similar way with windows.
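Just to fix the idea, here is roughly what the manager could look like.
All the names (input_listener, input_manager, notify, raise_event) are
mine and nothing is final:

#include <list>

class event;   // the event class is sketched further down

class input_listener
{
public:
    virtual ~input_listener () { }

    // Returns true if the event was caught, false to let it
    // propagate to the next listener on the stack.
    virtual bool notify (event & ev) = 0;
};

class input_manager
{
public:
    // The last listener pushed is the one with the focus.
    void push_listener (input_listener * l) { stack_.push_front (l); }
    void pop_listener () { stack_.pop_front (); }

    // Walk the stack from top (focused) to bottom, stopping as
    // soon as a callback catches the event.
    void raise_event (event & ev)
    {
        for (std::list<input_listener *>::iterator it = stack_.begin ();
             it != stack_.end (); ++it)
            if ((*it)->notify (ev)) break;
    }

private:
    std::list<input_listener *> stack_;
};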

Now for the events themselves. You'd have an event class that defines a
type (KEYBOARD, MOUSE, JOYSTICK, CONTROL) that is set by the
constructors of the keyboard_event, mouse_event, ... classes that
inherit from it. A callback function would be passed a reference to an
event, and from the type it can tell which kind of event it is and cast
it accordingly.
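Something along these lines (again, class and method names are just
placeholders to be refined):

class event
{
public:
    enum input_type { KEYBOARD, MOUSE, JOYSTICK, CONTROL };

    input_type type () const { return type_; }
    virtual ~event () { }

protected:
    // Subclasses set the type, so callbacks can cast safely.
    event (input_type t) : type_ (t) { }

private:
    input_type type_;
};

class keyboard_event : public event
{
public:
    // 'key' would use our own key mapping, not SDL's.
    keyboard_event (int key) : event (KEYBOARD), key_ (key) { }
    int key () const { return key_; }

private:
    int key_;
};

And in a callback:

// bool my_listener::notify (event & ev)
// {
//     if (ev.type () == event::KEYBOARD)
//     {
//         keyboard_event & kev = (keyboard_event &) ev;
//         // handle kev.key () ...
//     }
//     ...
// }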

There is nothing special about keyboard, mouse and joystick events.
We'll use our own mapping here, no longer SDL's. Where it becomes
interesting is with the control events.

Control events are just events that are mapped to certain controls. As
James proposed (I have to read the mail again), the whole game can be
controlled by a "pseudo" joypad (i.e. logical keys) that would provide
about the same buttons as the SNES one. As not everybody will want to
play with a pad, or will have enough buttons on theirs, the 3 other
kinds of events (keyboard, mouse and joystick) could be mapped to a
logical function of the pad. For example, I can map CTRL to the ACTION
button, UP to the UP direction, LEFT CLICK, SHIFT and Joystick Button 1
to the RUN button, etc... There is simply a table of input mappings for
keyboard, mouse and joystick. Whenever a keyboard, mouse, or joystick
event happens, we check whether it is mapped to a control. If it is, we
raise the corresponding control event. The lower-level event is also
launched, of course.
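The keyboard table could be as simple as this (the control_button
values and the control_mapper class are just placeholders of mine):

#include <map>

// The logical pad buttons; the actual set is to be decided.
enum control_button { UP, DOWN, LEFT, RIGHT, ACTION, RUN, CANCEL };

class control_mapper
{
public:
    void map_key (int key, control_button b) { keymap_[key] = b; }

    // Returns true and fills 'b' when 'key' is bound to a control;
    // the manager then raises a control event on top of the
    // keyboard event.
    bool key_to_control (int key, control_button & b) const
    {
        std::map<int, control_button>::const_iterator it = keymap_.find (key);
        if (it == keymap_.end ()) return false;
        b = it->second;
        return true;
    }

private:
    std::map<int, control_button> keymap_;
    // ... plus similar tables for mouse and joystick buttons.
};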

How will it work in practice? A client program (controlling a character
on the map, for example) will only listen to the CONTROL events. It
won't be notified at all of keypresses other than the ones that are
mapped to a control, and will immediately know which control has been
activated, without needing to mess with mapping tables. Whenever we need
to read some text in a window, we set the window to intercept keyboard
events and control events, and do not propagate control events
(otherwise, if an alphanumerical key is mapped to a control and pushed,
the window will get the key (and display the corresponding character),
and the map client behind will get the control event and launch the
corresponding action).
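With the same hypothetical classes as above, the two listeners of this
example could look like (a control_event subclass carrying the logical
button is assumed):

class map_client : public input_listener
{
public:
    bool notify (event & ev)
    {
        if (ev.type () != event::CONTROL) return false;
        // cast to control_event and move the character
        // according to the button (UP, ACTION, ...)
        return true;
    }
};

class text_window : public input_listener
{
public:
    bool notify (event & ev)
    {
        switch (ev.type ())
        {
            case event::KEYBOARD:
                // display the corresponding character
                return true;
            case event::CONTROL:
                // caught but ignored, so the map client
                // behind us never sees it
                return true;
            default:
                return false;
        }
    }
};

The text window sits on top of the map client on the listener stack, so
it gets first pick of every event and simply swallows the controls.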

Voila - nothing really original here (thanks ClanLib ;)), but it would
work just fine for us IMO, and the control mapping would be totally
transparent to the applications, which is what we wanted to achieve.

Opinions, comments before I start coding that?
Alex.

PS: I've just tried ClanLib 0.5.4 yesterday. It's really worth a look.
The examples are very, very impressive, and it's truly a design jewel
(even ClanDisplay, despite what you said, Kenneth ;)). I haven't
managed to run the examples under OpenGL though - but I haven't insisted
too much, so many things to code! :p

-- 
http://www.gnurou.org




