Overview of GTK input and event handling

This chapter describes in detail how GTK handles input. If you are interested in how a user's key press or mouse motion is translated into a change of a GTK widget, you should read this chapter. This knowledge will also be useful if you decide to implement your own widgets.

Devices and events

The most basic input devices that every computer user has interacted with are keyboards and mice; beyond these, GTK supports touchpads, touchscreens and more exotic input devices such as graphics tablets. Inside GTK, every such input device is represented by a GdkDevice object.

To simplify dealing with the variability between these input devices, GTK has a concept of master and slave devices. The concrete physical devices, which have many different characteristics (mice may have 2, 3 or 8 buttons; keyboards have different layouts and may or may not have a separate number block; etc.), are represented as slave devices. Each slave device is associated with a virtual master device. Master devices always come in pointer/keyboard pairs - you can think of such a pair as a seat.

GTK widgets generally deal with the master devices, and thus can be used with any pointing device or keyboard.

When a user interacts with an input device (e.g. moves a mouse or presses a key on the keyboard), GTK receives events from the windowing system. These are typically directed at a specific surface: for pointer events, the surface under the pointer (though grabs complicate this); for keyboard events, the surface with the keyboard focus.

GDK translates these raw windowing system events into GdkEvents. Typical input events are button clicks, pointer motion, key presses or touch events. These are all represented as GdkEvents, but you can differentiate between them by looking at their type, using gdk_event_get_event_type().

Some events, such as touch events or button press/release pairs, are connected to each other in an event sequence that uniquely identifies events belonging to the same interaction.

When GTK creates a GdkSurface, it connects to the “event” signal on it, which receives all of these input events. Surfaces have additional signals and properties, e.g. to deal with window management related events.

Event propagation

The function which initially receives input events on the GTK side is responsible for a number of tasks.

  1. Find the widget which got the event.

  2. Generate crossing (i.e. enter and leave) events when the focus or hover location changes from one widget to another.

  3. Send the event to widgets.

An event is propagated down and up the widget hierarchy in three phases (see GtkPropagationPhase) towards a target widget.

Figure 14. Event propagation phases


For key events, the top-level window gets a first shot at activating mnemonics and accelerators. If that does not consume the event, the target widget for event propagation is the window’s current focus widget (see gtk_window_get_focus()).

For pointer events, the target widget is determined by picking the widget at the event’s coordinates (see gtk_widget_pick()).

In the first phase (the capture phase) the event is delivered to each widget from the top-most (the top-level GtkWindow or grab widget) down to the target GtkWidget. Event controllers that are attached with GTK_PHASE_CAPTURE get a chance to react to the event.

After the capture phase, the target widget runs the event controllers attached to it with GTK_PHASE_TARGET. This is known as the target phase, and it only happens on that widget.

In the last phase (the bubble phase), the event is delivered to each widget from the target to the top-most, and event controllers attached with GTK_PHASE_BUBBLE are run.

Events are not delivered to a widget which is insensitive or unmapped.

At any time during propagation, a controller may indicate that a received event was consumed and propagation should therefore be stopped. If gestures are used, this may happen when the gesture claims the event touch sequence (or the pointer events) for itself. See the gesture states section below to learn more about gestures and sequences.