A. Building Unity UIs that our hands can interact with.
B. Interaction at a distance -- projecting a cursor toward the UI and interacting with a pinch
Added Detector utilities to the core assets -- these build on LeapPinchDetector to aid developers in detecting when a hand is in a desired state. Includes two new example scenes. Detectors include (see the usage sketch after this list):
Extended fingers
Finger and palm pointing direction
Pinch detector (moved from the former PinchUtilities module)
Proximity to a list of target game objects
A logic gate for combining detectors
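As a rough illustration of how these detectors can drive game logic, the sketch below subscribes to a detector's activation events. The Detector base type and its OnActivate/OnDeactivate UnityEvents are assumptions based on the descriptions above; the new example scenes show the actual API.

```csharp
// Minimal sketch (assumed API): reacts when an attached detector, such as an
// extended-finger or pinch detector, becomes active or inactive.
using UnityEngine;
using Leap.Unity;

public class DetectorLogger : MonoBehaviour
{
    // Assign any detector component (extended fingers, palm direction,
    // pinch, proximity, or a logic gate) in the Inspector.
    public Detector detector;

    void OnEnable()
    {
        // OnActivate/OnDeactivate are assumed to be UnityEvents on the Detector base class.
        detector.OnActivate.AddListener(HandleActivate);
        detector.OnDeactivate.AddListener(HandleDeactivate);
    }

    void OnDisable()
    {
        detector.OnActivate.RemoveListener(HandleActivate);
        detector.OnDeactivate.RemoveListener(HandleDeactivate);
    }

    void HandleActivate()   { Debug.Log("Hand entered the desired state."); }
    void HandleDeactivate() { Debug.Log("Hand left the desired state."); }
}
```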
Added LeapCSharp.NET3.5.dll, which contains the C# wrapper classes around LeapC. Removed LeapC source files.
Removed the PinchUtilities module. Its code was moved into core as a detector; its examples were moved to DetectionExamples.
Added a Sphere mesh setting to CapsuleHand and provided lower-poly sphere objects.
Made it optional for an IHandModel to persist in the scene in Edit mode.
Fixed turning graphics and physics hands on and off in the LeapHandController.
Fixed bug in which old frames would be used if the head transform didn't change
Fixed bug that caused temporal warping to fail 1-in-5 times on Vive (hands not rotating with camera)
Added ToggleModelGroup() method for groups in HandPool
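A hypothetical usage sketch follows; the string group-name parameter is an assumption, so check the HandPool source for the exact signature.

```csharp
// Hypothetical sketch: toggles a named HandPool model group when a key is pressed.
// The string-parameter signature of ToggleModelGroup() is an assumption.
using UnityEngine;
using Leap.Unity;

public class HandGroupToggler : MonoBehaviour
{
    public HandPool handPool;            // Assign the scene's HandPool in the Inspector.
    public string groupName = "Physics"; // Hypothetical group name.

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.T))
        {
            handPool.ToggleModelGroup(groupName);
        }
    }
}
```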
Fixed bug with group enabling and disabling in HandPool
Added a script to detect Vive vs. Oculus and adjust the camera height relative to the floor -- this won't be apparent in any of the Core Asset scenes, since they feature only hands.
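A minimal sketch of the idea, assuming detection via the headset model string (the shipped script, property names, and offset value may differ):

```csharp
// Sketch only: raise the camera rig when the HMD reports an eye-level tracking
// origin (Oculus at the time) so the floor height matches a room-scale Vive setup.
using UnityEngine;
using UnityEngine.XR; // UnityEngine.VR in older Unity versions

public class HmdHeightAdjuster : MonoBehaviour
{
    // Hypothetical eye-height offset to apply for non-Vive headsets.
    public float eyeHeightOffset = 1.6f;

    void Start()
    {
        bool isVive = XRDevice.model.ToLower().Contains("vive");
        if (!isVive)
        {
            transform.localPosition += Vector3.up * eyeHeightOffset;
        }
    }
}
```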