
Sensor Gesture Index

In GML, user actions can also be described using sensor data. These sensor gestures work with a variety of existing commodity devices such as Android smartphones and wearables. In some cases sensor input can be extended to act as a complementary multimodal input option alongside existing touch or motion gestures.

Accelerometer & IMUs

Most portable smart devices such as cell phones, tablets and Ultrabooks have built-in sensors that measure the acceleration and orientation of the device in space: accelerometers, gyroscopes and magnetometers. In the most recent generation of portable smart devices the output from these sensors is combined in a single IMU (Inertial Measurement Unit), which provides methods for self-calibration and qualified motion data in 3D space. The result is drift-free data that tracks the 9DOF of the device: six directions of movement (up/down, left/right, in/out) as well as three axes of rotation (x, y, z).

By analyzing this fused “inertial” data (6DOF) we can confidently establish whether a device is being tilted or tapped.

  • Hand held device Tilt gestures
    • device-tilt-roll
    • device-tilt-pitch
    • device-tilt-yaw
  • Hand held device Motion gestures
    • device-swipe (up/down/left/right/in/out)
    • device-flick (up/down/left/right/in/out)
    • device-drag (up/down/left/right/in/out)
  • Hand held device Tap gestures
    • device-left-tap
    • device-right-tap
    • device-top-tap
    • device-bottom-tap
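As a minimal sketch of how tilt gestures such as device-tilt-roll and device-tilt-pitch can be derived, the gravity vector reported by the accelerometer gives two of the three tilt angles directly (yaw requires gyroscope or magnetometer data). The axis convention and the 20-degree threshold below are illustrative assumptions:

```python
import math

def tilt_from_accel(ax, ay, az):
    """Estimate roll and pitch (degrees) from a gravity-dominated
    accelerometer reading; the device is assumed to be roughly at rest.
    Assumed axis convention: x right, y up, z out of the screen."""
    roll = math.degrees(math.atan2(ax, az))
    pitch = math.degrees(math.atan2(ay, math.hypot(ax, az)))
    return roll, pitch

def classify_tilt(roll, pitch, threshold=20.0):
    """Map tilt angles to the gesture names above; the threshold is
    an illustrative placeholder."""
    if abs(roll) >= threshold:
        return "device-tilt-roll"
    if abs(pitch) >= threshold:
        return "device-tilt-pitch"
    return None  # yaw cannot be recovered from gravity alone
```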

In addition to phones, tablets and Ultrabooks, other devices with embedded IMUs can be used as input for sensor gesture analytics. These include hand-held devices and wearables such as rings, watches, gloves, wands, remote controls and game controllers.


Wearables

Wearable smart devices, also known as wearables, often contain groups of sensors. In some cases these devices can use Bluetooth to stream sensor data in real time, so that gestures can be tracked and the device can be used to directly control applications or other secondary devices.

Myo Armband

The Myo armband is a wearable sensor that uses EMG (electromyography) signals to establish rudimentary hand poses and an on-board IMU (6DOF) to detect device acceleration and orientation. The pose and motion data is then communicated via Bluetooth.

Dynamic hand poses:
* Wave-in
* Wave-out
* Finger-thumb-tap

Static hand poses:
* Fist
* Splay

Proposed hand poses:
* Thumbs-up
* Hook/Curled Fingers
* Index-thumb-flick

These poses can then be combined with standard kinemetric analysis to create the following motion gestures:

Myo Motion Gestures:

  • Begin
    • fist-begin
    • splay-begin
    • wave-in
    • wave-out
  • Tap
    • finger-thumb-tap
  • Hold
    • splay-hold
    • fist-hold
  • Drag
    • splay-drag
    • fist-drag
  • Swipe
    • splay-swipe
    • fist-swipe
  • Rotate/Tilt
    • splay-rotate
    • fist-rotate
    • splay-tilt
    • fist-tilt
    • bimanual-splay-rotate
    • bimanual-fist-rotate
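As an illustrative sketch of pose-qualified motion (the speed thresholds are placeholder assumptions, not Myo SDK values), a pose reported by the armband can be combined with a window of hand-speed samples to yield one of the composite names above:

```python
def classify_myo_motion(pose, speeds, hold_max=0.05, swipe_min=1.0):
    """Classify a window of hand-speed samples (m/s), recorded while
    `pose` ('fist' or 'splay') was held, into a composite gesture name.
    hold_max and swipe_min are illustrative thresholds."""
    mean = sum(speeds) / len(speeds)
    peak = max(speeds)
    if peak >= swipe_min:
        motion = "swipe"      # brief, fast movement
    elif mean > hold_max:
        motion = "drag"       # sustained, moderate movement
    else:
        motion = "hold"       # effectively stationary
    return f"{pose}-{motion}"
```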


Nod Ring

The Nod ring is a wearable ring that contains a set of buttons and an integrated IMU (6DOF), and uses Bluetooth to send button state and motion data. There are two tactile buttons, two touch pads and a slider touch pad.

Press:

  • tbtn-a-press
  • tbtn-b-press
  • tpad-a-press
  • tpad-b-press
  • tslider-press

These button states can then be combined with standard kinemetric motion analysis to create a set of Nod ring motion gestures:

Nod Motion Gestures:

  • Hold
    • tbtn-a-press-hold
    • tbtn-b-press-hold
    • tpad-a-press-hold
    • tpad-b-press-hold
    • tslider-press-hold
  • Drag
    • tbtn-a-press-drag
    • tbtn-b-press-drag
    • tpad-a-press-drag
    • tpad-b-press-drag
    • tslider-press-drag
  • Swipe (up/down/left/right)
    • tbtn-a-press-swipe
    • tbtn-b-press-swipe
    • tpad-a-press-swipe
    • tpad-b-press-swipe
    • tslider-press-swipe
  • Flick
    • tbtn-a-press-flick
    • tbtn-b-press-flick
    • tpad-a-press-flick
    • tpad-b-press-flick
    • tslider-press-flick
  • Tilt (roll/pitch/yaw)
    • tbtn-a-press-tilt
    • tbtn-b-press-tilt
    • tpad-a-press-tilt
    • tpad-b-press-tilt
    • tslider-press-tilt

These gestures can then be combined across more than one Nod ring to create a bimanual input scheme. For example, the left hand can qualify the input of the right hand to create rich gesture combinations and sequences. The same technique can be used to create a left-hand “key” and right-hand “cursor” high-bandwidth input paradigm.
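The key/cursor idea can be sketched as follows; the dict format and the mode names ("pan", "zoom") are hypothetical illustrations, not the Nod SDK:

```python
def bimanual_event(left_ring, right_ring):
    """Left-hand 'key' / right-hand 'cursor' sketch: the button held on
    the left ring selects a mode that qualifies the right ring's motion
    deltas. Input dicts and mode names are hypothetical."""
    mode = {"tbtn-a-press": "pan", "tbtn-b-press": "zoom"}.get(left_ring.get("button"))
    if mode is None:
        return None  # no key held: right-hand motion is ignored
    return (mode, right_ring["dx"], right_ring["dy"])
```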


Hand Held Controllers

There are a number of hand-held devices dedicated to application control. Many were developed as controllers for dedicated gaming systems, and some have been re-purposed by developers to act as custom input devices.

WiiMote

The Nintendo Wii has a few versions of this hand-held input device. It uses button input in combination with IR and accelerometer tracking to create rich motion controls. When used with IR tracking the device is capable of 9DOF; when using only accelerometer data over Bluetooth, only 6DOF are available.

There are six main buttons, a d-pad and a joystick on the wiimote, as well as two buttons and a joystick on the nunchuck:

wiimote:

  • a-press
  • b-press
  • plus-press
  • minus-press
  • 1-press
  • 2-press
  • dpad-up
  • dpad-down
  • dpad-left
  • dpad-right
  • joystick-dx
  • joystick-dy

nunchuck:

  • c-press
  • z-press
  • joystick-dx
  • joystick-dy

When used in conjunction with accelerometer data, each button state can be used to create a rich set of motion gestures, for example:

  • a-press-hold
  • a-press-drag
  • a-press-flick
  • a-press-swipe-left
  • a-press-swipe-right
  • a-press-swipe-up
  • a-press-swipe-down
  • a-press-rotate-roll
  • a-press-rotate-pitch
  • a-press-rotate-yaw
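As a sketch of how the swipe variants above could be named, a velocity estimate (integrated from accelerometer data) taken while a button is held can be classified by its dominant direction. The min_speed threshold is an illustrative assumption:

```python
import math

def classify_press_swipe(button, vx, vy, min_speed=0.8):
    """Name a composite gesture such as 'a-press-swipe-left' from the
    held button and a velocity estimate (m/s). min_speed is illustrative."""
    if math.hypot(vx, vy) < min_speed:
        return None  # too slow to count as a swipe
    if abs(vx) >= abs(vy):
        direction = "right" if vx > 0 else "left"
    else:
        direction = "up" if vy > 0 else "down"
    return f"{button}-press-swipe-{direction}"
```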

PS3Move

The Sony PlayStation uses the PlayStation Move device as a hand-held controller. It combines visible (colored) light tracking with an accelerometer to provide 9DOF motion tracking. It also provides six primary interaction buttons on the main controller, with four buttons, a d-pad and a joystick on the secondary controller.

primary controller:

  • circle-press
  • square-press
  • triangle-press
  • cross-press
  • move-press
  • trigger-press

secondary controller:

  • L1-press
  • L2-press
  • cross-press
  • circle-press
  • joystick-dx
  • joystick-dy
  • joystick-press
  • dpad-up
  • dpad-down
  • dpad-left
  • dpad-right

When used in conjunction with accelerometer data, each button state can be used to create a rich set of motion gestures, for example:

  • trigger-press-hold
  • trigger-press-drag
  • trigger-press-flick
  • trigger-press-swipe-up
  • trigger-press-swipe-down
  • trigger-press-swipe-left
  • trigger-press-swipe-right
  • trigger-press-rotate-roll
  • trigger-press-rotate-pitch
  • trigger-press-rotate-yaw
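A gesture like trigger-press-hold differs from a plain press only in how long the button stays down; a minimal sketch, where the 0.5 s threshold is an illustrative assumption rather than a PlayStation Move SDK value:

```python
def classify_press(button, press_time, release_time, hold_after=0.5):
    """Distinguish a plain press from a press-hold (e.g.
    'trigger-press-hold') by press duration in seconds.
    hold_after is an illustrative threshold."""
    if release_time - press_time >= hold_after:
        return f"{button}-press-hold"
    return f"{button}-press"
```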

Voice Commands

Voice commands are defined as simplified single- or dual-word phrases used to elicit discrete events. In order to maintain response times that keep pace with traditional mouse, keyboard and other gesture input events, a “better than real-time” recognition system is used to ensure that the time between the phrase utterance and the recognition event is less than 16 ms.

Voice commands defined by the GML in this section have been designed and tested on the following devices:

  • Desktop microphones:
  • Embedded microphones:
  • Head mounted microphones:
  • Throat microphones:
  • In-Ear Bluetooth microphones:

When using predictive or “hypothesis” based recognition methods it is advantageous to use short distinct words and phrases with low syllabic complexity. This tends to improve the recognition rate and accuracy as the voice commands are easy to discern, simple to pronounce and in most cases easier to remember.
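One way to check that a command vocabulary stays easy to discern is to flag pairs of phrases whose spellings are too close. A sketch using Levenshtein edit distance; the min_dist cutoff is an assumption for illustration, not a recognizer parameter:

```python
def edit_distance(a, b):
    """Levenshtein distance via standard dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def confusable_pairs(vocab, min_dist=3):
    """Flag command pairs whose spellings are likely too similar to
    discern reliably. min_dist is an illustrative cutoff."""
    return [(a, b) for i, a in enumerate(vocab) for b in vocab[i + 1:]
            if edit_distance(a, b) < min_dist]
```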

Word groups associated with common directional commands:

  Class         Phrases
  Directional   right, left
  Directional   forward, in, run
  Directional   back, backward, out
  Directional   up, jump


Multimodal Gesture Index

gestures/sensor/gesture_index.txt · Last modified: 2019/01/29 19:06 (external edit)