
Multimodal Gesture Input

Multimodal gestures are compound gestures made up of multiple component gestures that come from more than one mode of input. In some cases this may require more than one input device. Simple multimodal input can combine different methods on a single touch surface. For example, a multimodal gesture can be defined that uses both pen/stylus and finger touch to create a gesture event, or one that requires a touch fiducial and a finger touch together to activate a gesture event.
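
The sketch below illustrates the basic idea: a compound gesture event is only dispatched once input from every required mode is active at the same time. The class, mode, and event names are illustrative only and are not part of GML or any particular SDK.

  # Minimal sketch of a compound (multimodal) gesture recognizer.
  # All names here are illustrative, not part of any specific SDK.

  class CompoundGestureRecognizer:
      """Fires a gesture event only when input from every required mode is active at once."""

      def __init__(self, required_modes, on_gesture):
          self.required_modes = set(required_modes)   # e.g. {"stylus", "finger"}
          self.active_modes = set()
          self.on_gesture = on_gesture

      def input_began(self, mode, point):
          self.active_modes.add(mode)
          # The compound gesture activates only when every required mode is present.
          if self.required_modes <= self.active_modes:
              self.on_gesture({"type": "pen_plus_touch", "point": point})

      def input_ended(self, mode):
          self.active_modes.discard(mode)

  # Example: require a stylus contact and a finger contact together.
  recognizer = CompoundGestureRecognizer({"stylus", "finger"}, print)
  recognizer.input_began("stylus", (120, 80))
  recognizer.input_began("finger", (300, 210))   # both modes active -> event fires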

Alternatively, a multimodal gesture input system can use several different modes of input within a single application. For example, voice commands, touch gestures, and accelerometer gestures can all be used to control an application at the same time, providing a more immersive experience (see the sketch after the list below).

  • touch_motion_library
  • touch_sensor_library
  • touch_voice_library
  • touch_motion_sensor_library
  • touch_motion_sensor_voice_controls
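
As a rough illustration of parallel multimodal input, the following sketch routes events from touch, voice, and motion (accelerometer) sources through a single dispatcher so that any mode can trigger the same application behavior. The dispatcher, gesture names, and event fields are hypothetical.

  # Minimal sketch of a parallel multimodal dispatcher: events from several
  # input modes are funneled into one handler. Names are illustrative only.

  from collections import defaultdict

  class MultimodalDispatcher:
      def __init__(self):
          self.handlers = defaultdict(list)   # gesture name -> callbacks

      def bind(self, gesture, callback):
          self.handlers[gesture].append(callback)

      def dispatch(self, mode, gesture, data=None):
          # All modes share one event path, so touch, voice, and motion
          # gestures can drive the same application behaviors.
          for callback in self.handlers[gesture]:
              callback(mode, data)

  dispatcher = MultimodalDispatcher()
  dispatcher.bind("zoom_in", lambda mode, d: print("zoom via", mode))

  dispatcher.dispatch("touch", "zoom_in", {"scale": 1.2})   # pinch gesture
  dispatcher.dispatch("voice", "zoom_in")                   # spoken "zoom in"
  dispatcher.dispatch("motion", "zoom_in", {"tilt": 12.0})  # accelerometer tilt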

One of the most powerful aspects of a parallel multimodal gesture control scheme is that users can arbitrarily switch input modalities at run time. Users can even switch input modes on and off using voice commands, or simply use wearables for certain controls and touch for others, depending on the gameplay or the production task.
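
One way to picture run-time modality switching is a simple registry of input modes that can be enabled or disabled while the application runs, with events from disabled modes simply ignored. This is only a sketch with made-up mode names, not a GML feature.

  # Minimal sketch of enabling and disabling input modes at run time.
  # In practice each mode would wrap a real device listener
  # (touch surface, wearable, microphone, ...).

  class InputModeRegistry:
      def __init__(self):
          self.modes = {}   # mode name -> enabled flag

      def register(self, name, enabled=True):
          self.modes[name] = enabled

      def set_enabled(self, name, enabled):
          self.modes[name] = enabled

      def accepts(self, name):
          # Events from disabled modes are ignored by the application.
          return self.modes.get(name, False)

  registry = InputModeRegistry()
  for mode in ("touch", "voice", "wearable"):
      registry.register(mode)

  # Drop touch control mid-session and drive the same behaviors from the wearable.
  registry.set_enabled("touch", False)
  print(registry.accepts("touch"), registry.accepts("wearable"))   # False True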

Multiple Input Modes

GML has been designed from the ground up to support multiple input modes, using either a single device or multiple devices, and gestures can be attached to various behaviors in an application regardless of which mode produced them. Multimodal gesture input schemes are well suited to hands-free input and on-the-fly control scheme changes. For example, when playing a game, a voice command can be used to turn eye-tracking-controlled target tracking on and off. The user can say “sniper mode” to enable eye tracking so that a gun sight appears at the user's gaze location, say “fire” to shoot the gun, and simply repeat the phrase “sniper mode” to turn eye-tracking control off and switch back to the previous control scheme (a rough sketch of this pattern follows the linked example below).

  • motion_drag & touch_drag (virtual divide)
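
The “sniper mode” scenario above could be wired together along the following lines. The eye tracker, gun, and voice-command hook here are stand-ins for illustration, not any specific engine or GestureWorks API.

  # Minimal sketch of switching control schemes at run time with voice commands,
  # following the "sniper mode" example above. All classes are stand-ins.

  class EyeTracker:
      """Stand-in for a real gaze-tracking device."""
      enabled = False
      def gaze_point(self):
          return (640, 360)   # placeholder gaze coordinates

  class Gun:
      """Stand-in for the in-game weapon."""
      def fire(self, target):
          print("fired at", target)

  class SniperModeController:
      def __init__(self, eye_tracker, gun):
          self.eye_tracker = eye_tracker
          self.gun = gun
          self.sniper_mode = False

      def on_voice_command(self, phrase):
          if phrase == "sniper mode":
              # Repeating the phrase toggles eye-tracking control on and off.
              self.sniper_mode = not self.sniper_mode
              self.eye_tracker.enabled = self.sniper_mode
          elif phrase == "fire" and self.sniper_mode:
              # Aim at wherever the user is currently looking.
              self.gun.fire(self.eye_tracker.gaze_point())

  controller = SniperModeController(EyeTracker(), Gun())
  controller.on_voice_command("sniper mode")   # gaze-controlled sight appears
  controller.on_voice_command("fire")          # shoots at the gaze location
  controller.on_voice_command("sniper mode")   # back to the normal control scheme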

Crossmodal Gesture Input
