The Leap Motion system detects and tracks hands, fingers, and finger-like tools. The device operates at close range with high precision and a high tracking frame rate.
The Leap Motion software analyzes the objects observed in the device's field of view. It recognizes hands, fingers, and tools, reporting discrete positions, gestures, and motion. The Leap Motion field of view is an inverted pyramid centered on the device. The effective range of the Leap Motion Controller extends from approximately 25 to 600 millimeters above the device (1 inch to 2 feet).
The Leap Motion system employs a right-handed Cartesian coordinate system. The origin is centered at the top of the Leap Motion Controller. The x- and z-axes lie in the horizontal plane, with the x-axis running parallel to the long edge of the device. The y-axis is vertical, with positive values increasing upwards (in contrast to the downward orientation of most computer graphics coordinate systems). The z-axis has positive values increasing toward the user.
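Because the Leap Motion coordinate system is right-handed with +z pointing toward the user, applications targeting graphics conventions with +z pointing into the scene must convert. The sketch below is illustrative (not part of the SDK): it negates z to map a Leap-space point into a left-handed convention where +z points away from the user.

```python
# Illustrative sketch (not an SDK function): convert a point from Leap
# coordinates (right-handed, y up, +z toward the user, millimeters) into a
# left-handed graphics convention where +z points into the scene.

def leap_to_left_handed(x, y, z):
    """Negate z so positive z points away from the user."""
    return (x, y, -z)

# A point 200 mm above the device and 100 mm toward the user:
print(leap_to_left_handed(0.0, 200.0, 100.0))  # (0.0, 200.0, -100.0)
```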
The Leap Motion API measures physical quantities with the following units:
|Distance:||millimeters|
|Time:||microseconds (unless otherwise noted)|
|Speed:||millimeters/second|
|Angle:||radians|
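Since distances are reported in millimeters and timestamps in microseconds, computing a speed requires a unit conversion. The helper below is a hypothetical sketch (not an SDK function) showing the arithmetic: displacement in millimeters divided by elapsed time converted from microseconds to seconds.

```python
# Hypothetical helper (not an SDK function): given two positions in
# millimeters and two timestamps in microseconds, compute speed in
# millimeters per second.

def speed_mm_per_s(p0, p1, t0_us, t1_us):
    dx, dy, dz = (b - a for a, b in zip(p0, p1))
    dist_mm = (dx * dx + dy * dy + dz * dz) ** 0.5
    dt_s = (t1_us - t0_us) / 1_000_000  # microseconds -> seconds
    return dist_mm / dt_s

# 10 mm of travel over 10,000 microseconds (0.01 s) -> 1000 mm/s
print(speed_mm_per_s((0, 0, 0), (0, 10, 0), 0, 10_000))  # 1000.0
```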
As the Leap Motion Controller tracks hands, fingers, and tools in its field of view, it provides updates as a set, or frame, of data. Each Frame object representing a frame contains lists of the basic tracking data, such as hands, fingers, and tools, as well as recognized gestures and factors describing the overall motion in the scene. The Frame object is essentially the root of the Leap Motion data model.
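The hierarchy described above can be sketched as a simplified data model. The classes below are hypothetical stand-ins for the SDK's Frame, Hand, and Pointable types, intended only to show how a frame acts as the root object holding the tracking data for one update.

```python
# A simplified, hypothetical model of the Frame hierarchy -- not the actual
# SDK classes. Each Frame is the root object holding the hands, pointables,
# and gestures observed in one tracking update.
from dataclasses import dataclass, field

@dataclass
class Pointable:
    length_mm: float   # length of the visible portion, in millimeters
    is_tool: bool      # True for tools, False for fingers

@dataclass
class Hand:
    palm_position: tuple                       # (x, y, z) in millimeters
    pointables: list = field(default_factory=list)

@dataclass
class Frame:
    id: int
    timestamp_us: int                          # microseconds
    hands: list = field(default_factory=list)
    gestures: list = field(default_factory=list)

frame = Frame(id=1, timestamp_us=0,
              hands=[Hand((0.0, 150.0, 0.0), [Pointable(55.0, False)])])
print(len(frame.hands))  # 1
```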
To read more about Frames, see Getting Frame Data.
The hand model provides information about the position, characteristics, and movement of a detected hand as well as lists of the fingers and tools associated with the hand.
More than two hands can appear in the hand list for a frame if more than one person’s hands or other hand-like objects are in view. However, we recommend keeping at most two hands in the Leap Motion Controller’s field of view for optimal motion tracking quality.
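One illustrative way to honor the two-hand recommendation in application code is to keep only the two hands nearest the device. This filter is not an SDK feature; it is a sketch assuming palm positions are available as millimeter coordinates.

```python
# Illustrative sketch (not an SDK feature): when more than two hand-like
# objects are in view, keep only the two whose palms are closest to the
# device origin, since tracking quality is best with at most two hands.

def closest_two(palm_positions):
    """palm_positions: list of (x, y, z) tuples in millimeters."""
    def dist_sq(p):
        x, y, z = p
        return x * x + y * y + z * z
    return sorted(palm_positions, key=dist_sq)[:2]

palms = [(0, 500, 0), (0, 150, 0), (100, 300, -50)]
print(closest_two(palms))  # [(0, 150, 0), (100, 300, -50)]
```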
The Leap Motion Controller detects and tracks both fingers and tools within its field of view. The Leap Motion software classifies finger-like objects according to shape. A tool is longer, thinner, and straighter than a finger.
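The shape-based distinction can be illustrated with a toy classifier. The Leap Motion software uses its own internal recognition; the length and width thresholds below are invented purely for illustration.

```python
# A toy heuristic sketch of the finger/tool distinction described above.
# These thresholds are invented for illustration; the Leap Motion software
# performs its own classification internally.

def classify_pointable(length_mm, width_mm):
    """Longer, thinner objects are more likely tools than fingers."""
    if length_mm > 90 and width_mm < 12:
        return "tool"
    return "finger"

print(classify_pointable(length_mm=150, width_mm=8))  # tool
print(classify_pointable(length_mm=55, width_mm=16))  # finger
```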
In the Leap Motion model, the physical characteristics of fingers and tools are abstracted into a Pointable object. Fingers and tools are types of Pointable objects.
The Leap Motion software classifies a detected pointable object as either a finger or a tool.
The Leap Motion software recognizes certain movement patterns as gestures that could indicate a user intent or command. Gestures are observed for each finger or tool individually. The Leap Motion software reports gestures observed in a frame in the same way that it reports other motion tracking data, such as fingers and hands.
The following movement patterns are recognized by the Leap Motion software:
|Circle:||A finger tracing a circle.|
|Swipe:||A long, linear movement of a finger.|
|Key Tap:||A tapping movement by a finger as if tapping a keyboard key.|
|Screen Tap:||A tapping movement by a finger as if tapping a vertical computer screen.|
Important: Before using gestures in your application, you must enable recognition for each gesture type you intend to use. The Controller class provides an enableGesture() method for this purpose.
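The effect of this requirement can be sketched with a simplified mock controller (not the real SDK): only gesture types that have been explicitly enabled are reported, mirroring the behavior of Controller's enableGesture() method.

```python
# A simplified mock (not the real SDK) showing why gestures must be enabled:
# the controller only reports gesture types that were explicitly turned on.

class MockController:
    GESTURE_TYPES = {"circle", "swipe", "key_tap", "screen_tap"}

    def __init__(self):
        self._enabled = set()

    def enable_gesture(self, gesture_type):
        """Turn on recognition for one gesture type."""
        if gesture_type not in self.GESTURE_TYPES:
            raise ValueError("unknown gesture type: " + gesture_type)
        self._enabled.add(gesture_type)

    def report(self, observed):
        """Return only the observed gestures whose type is enabled."""
        return [g for g in observed if g in self._enabled]

ctrl = MockController()
ctrl.enable_gesture("swipe")
print(ctrl.report(["swipe", "circle"]))  # ['swipe']
```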