PySoy has gone through several controller models, all incomplete, all crap. I can summarize why they were all crap very simply:

Every game programmer had to write the same crufty code.

We ended up with code like this:

key = soy.controllers.Keyboard(win)
pearl = bks['Pearl']
key['Q'] = soy.actions.Force(pearl, soy.atoms.Vector((-100,    0,    0)))
key['R'] = soy.actions.Force(pearl, soy.atoms.Vector((   0,  100,    0)))
key['S'] = soy.actions.Force(pearl, soy.atoms.Vector(( 100,    0,    0)))
key['T'] = soy.actions.Force(pearl, soy.atoms.Vector((   0, -100,    0)))
key['U'] = soy.actions.Force(pearl, soy.atoms.Vector((   0,    0, -100)))
key['V'] = soy.actions.Force(pearl, soy.atoms.Vector((   0,    0,  100)))
key['q'] = soy.actions.Quit()
key[ 9 ] = soy.actions.Quit() # 9 = esc key
key['f'] = fullscreenToggle
key['w'] = wireframeToggle
key['['] = lessLight
key[']'] = moreLight
wcn = soy.controllers.Window(win)
wcn['close'] = soy.actions.Quit()

That was just an example - can you imagine the sort of lengthy files we'd need for a full game, when every form of input has to be mapped, and commonly remapped, to apply different functions to different objects?

To arrive at a better, higher-level design, let's look at how games actually use input devices:

  • keyboard is used for text input
  • keyboard keys are mapped like buttons or directional controls on a joystick
  • pointer's motion across the window moves a sprite that replaces the system cursor
  • pointer hovering over a button or area causes something to happen
  • pointer's motion is mapped to gesture input, which triggers certain actions
  • pointer button clicks and releases
  • pointer button clicks and drags
  • pointer scrolls (x/y axis)
  • pointer's extra buttons (beyond the first 3, excluding scroll) are clicked like joystick buttons
  • joystick axis and buttons
  • accelerometer/gyro changes an axis which is attached directly to some object
  • accelerometer/gyro inputs 3d gesture, such as swing or shake

Some of these can be done easily by exposing an atomic property. Atomic types can be shared between objects so a change in one is reflected as a change in another.
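A minimal sketch of the shared-atom idea in plain Python (the `Vector`, `Body`, and `AxisInput` classes here are stand-ins, not the actual soy API): because both the body and the input hold a reference to the same value, the controller writes straight into the body with no event plumbing at all.

```python
class Vector:
    """Stand-in for an atomic type: a mutable value that can be shared
    between several owners, so a change made through one owner is
    visible through all of them."""
    def __init__(self, x=0.0, y=0.0, z=0.0):
        self.x, self.y, self.z = x, y, z

class Body:
    def __init__(self, position):
        self.position = position   # holds a reference, not a copy

class AxisInput:
    def __init__(self, target):
        self.target = target       # the very same Vector the Body holds
    def update(self, x, y, z):
        self.target.x, self.target.y, self.target.z = x, y, z

pos = Vector()
ship = Body(pos)
stick = AxisInput(pos)     # input device writes directly into the body
stick.update(1.0, 2.0, 3.0)
print(ship.position.x)     # 1.0 -- the body saw the change
```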

Others can be passed to an object that has been attached to handle them. For example, an object in 3d space may spin while it's being hovered over, and be attached for motion events when dragged.

Gestures are the most difficult to implement and can wait for now, but an API could be exposed for designing and using gestures according to paths.
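To make the path idea concrete, here is one hedged sketch of what path-based gesture recognition could look like (the quantize-to-cardinal-directions approach is just one possible scheme, not a committed design): a stroke of pointer positions is reduced to the sequence of directions it moves in, and that sequence is looked up in a gesture table.

```python
def directions(points):
    """Quantize a stroke into the run-length-compressed sequence of
    cardinal directions (N/S/E/W) it moves in."""
    dirs = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if abs(dx) >= abs(dy):
            d = 'E' if dx > 0 else 'W'
        else:
            d = 'S' if dy > 0 else 'N'   # window y grows downward
        if not dirs or dirs[-1] != d:
            dirs.append(d)
    return dirs

# hypothetical gesture table: direction sequence -> gesture name
GESTURES = {('E', 'S'): 'corner', ('E', 'W', 'E'): 'shake'}

stroke = [(0, 0), (5, 0), (10, 1), (10, 6), (10, 12)]
print(GESTURES.get(tuple(directions(stroke))))   # corner
```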

This must be Pythonic and simple. It must also be efficient: we need to test for hover only on objects we're actually hovering over, and the chain for a specific event needs to link and unlink as it's set/unset from Python, not be discovered every time an event is received.

That is to say, don't pass an event to a method of Window, then a method of its children, then a method of their children, then to a Scene, then to bodies in the scene - know ahead of time (calculated when set) which objects want what so the only question is which has priority when multiple objects want the same event. We can do that with a priority number and a boolean return for whether the event should be passed on or is "caught".
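A sketch of that dispatch model in plain Python (the `EventHub` class and its method names are illustrative, not a proposed soy API): handlers link in once at bind time, sorted by priority, and dispatch walks the precomputed list until a handler returns True to say the event is "caught".

```python
import bisect

class EventHub:
    """Handlers register per event name, kept sorted by priority
    (higher first).  Dispatch never walks a widget tree: it only
    visits handlers that asked for this event, in priority order,
    and stops at the first one that returns True ("caught")."""
    def __init__(self):
        self._handlers = {}   # event name -> [(-priority, seq, callback)]
        self._seq = 0
    def bind(self, event, callback, priority=0):
        entry = (-priority, self._seq, callback)   # seq breaks priority ties
        self._seq += 1
        bisect.insort(self._handlers.setdefault(event, []), entry)
    def dispatch(self, event, *args):
        for _, _, callback in self._handlers.get(event, ()):
            if callback(*args):
                return True   # caught: do not pass on
        return False

hub = EventHub()
log = []

def window_close():          # low-priority default handler
    log.append('window quits')
    return True

def confirm_dialog():        # high-priority handler runs first
    log.append('confirm dialog')
    return True              # caught: the window never sees it

hub.bind('close', window_close, priority=0)
hub.bind('close', confirm_dialog, priority=10)
hub.dispatch('close')
print(log)   # ['confirm dialog']
```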

The code for handling events should "live" with the object that those events act on. If the "A" button causes thrust to a ship, that thrust method should be part of the ship's class. For abstraction, a new Controller class could be made for that ship, allowing buttons to be remapped and applying to any kind of ship in that game.
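One way that could look, as a hypothetical sketch (the `Ship` and `ShipController` names and methods are invented for illustration): the thrust code lives on the ship, and the controller is nothing but a remappable table from button names to method names, so it works with any ship-like object.

```python
class Ship:
    """The behaviour lives with the object the events act on."""
    def __init__(self):
        self.velocity = 0.0
    def thrust(self):
        self.velocity += 1.0
    def brake(self):
        self.velocity = max(0.0, self.velocity - 1.0)

class ShipController:
    """Maps button names to method names on any ship-like object;
    the mapping dict can be edited at runtime to remap buttons."""
    DEFAULT = {'A': 'thrust', 'B': 'brake'}
    def __init__(self, ship, mapping=None):
        self.ship = ship
        self.mapping = dict(mapping or self.DEFAULT)
    def press(self, button):
        method = self.mapping.get(button)
        if method:
            getattr(self.ship, method)()

ship = Ship()
pad = ShipController(ship)
pad.press('A'); pad.press('A'); pad.press('B')
print(ship.velocity)          # 1.0

pad.mapping['X'] = 'thrust'   # remapped at runtime, no ship changes
pad.press('X')
print(ship.velocity)          # 2.0
```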

Certain objects, like soy.widgets.Window, need a default handler for things like its own close button at a low priority, so a higher-priority handler (e.g., a confirm/save dialog) can activate first. As another example, a text input widget will need to capture all keyboard events while it's in use, even though some keys would normally do something different.

Other objects, like a Projector/Camera, can transform an event, such as changing a pointer's window x,y position into scene x,y,z coordinates.
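As a rough sketch of that transform (a deliberately simplified pinhole setup, not soy's actual projection code): the pixel is mapped to normalized device coordinates, then the eye ray from an assumed camera at (0, 0, cam_z) looking down -z is intersected with the z=0 plane.

```python
def unproject(px, py, width, height, fov_scale=1.0, cam_z=10.0):
    """Turn a window pixel (px, py) into the scene point where the
    ray from a camera at (0, 0, cam_z), looking down -z, crosses
    the z=0 plane.  fov_scale and cam_z are illustrative parameters."""
    # pixel -> normalized device coords in [-1, 1], y flipped
    nx = (2.0 * px / width) - 1.0
    ny = 1.0 - (2.0 * py / height)
    # the ray travels cam_z units of depth to reach z=0, spreading
    # linearly with distance
    t = cam_z
    return (nx * fov_scale * t, ny * fov_scale * t, 0.0)

# a click at the window's centre lands at the scene origin
print(unproject(400, 300, 800, 600))   # (0.0, 0.0, 0.0)
```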