With the iPhone in 2007 and the iPad in 2010, you learned a new set of finger movements and gestures, and now you can’t picture a daily routine without them.
Apple’s upcoming Vision Pro headset, which Apple positions as a “spatial computing” device, has no hardware-based control mechanism. Instead, it relies on eye tracking and hand gestures to let users manipulate objects in the virtual space in front of them. In a recent developer session, Apple designers outlined the specific gestures that can be used with Vision Pro and how some of the interactions will work (a brief developer-oriented sketch follows the list):
Tap – Tapping the thumb and index finger together signals to the headset that you want to tap on the virtual element you’re looking at. Users have also described this as a pinch, and it is the equivalent of tapping on an iPhone’s screen.
Double Tap – Two quick pinches in succession register as a double tap.
Pinch and Hold – Pinching and holding is similar to a tap-and-hold gesture, and is used for actions like highlighting text.
Pinch and Drag – Pinching and dragging can be used to scroll and to move windows around. You can scroll horizontally or vertically, and moving your hand faster scrolls faster.
Zoom – Zoom is one of the two main two-handed gestures. You can pinch your fingers together and pull your hands apart to zoom in, and presumably zooming out will involve a pushing motion. Window sizes can also be adjusted by dragging at their corners.
Rotate – Rotate is the other two-handed gesture; based on Apple’s chart, it involves pinching the fingers together and rotating the hands to manipulate virtual objects.
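For developers, these system gestures are expected to arrive through SwiftUI’s standard gesture APIs rather than as raw eye or hand data. The following is a minimal sketch, assuming a visionOS SwiftUI app; the view content and print statements are placeholders, and the pairing of each system gesture with a particular SwiftUI recognizer is an illustration, not Apple’s official mapping.

```swift
import SwiftUI

// Minimal sketch: the system gestures surface to a visionOS app
// as ordinary SwiftUI gestures, so no eye- or hand-tracking code is needed.
struct GestureDemoView: View {
    @State private var zoom: CGFloat = 1.0
    @State private var angle: Angle = .zero
    @State private var offset: CGSize = .zero

    var body: some View {
        Text("Look at me, then pinch")
            .padding()
            .scaleEffect(zoom)
            .rotationEffect(angle)
            .offset(offset)
            // Double tap takes precedence over a single tap.
            .gesture(
                TapGesture(count: 2).onEnded { _ in print("double tap") }
                    .exclusively(before: TapGesture().onEnded { _ in print("tap") })
            )
            // Pinch and hold maps to a long press.
            .onLongPressGesture { print("pinch and hold") }
            // Pinch and drag scrolls or moves content.
            .gesture(DragGesture().onChanged { value in
                offset = value.translation
            })
            // The two-handed zoom arrives as a magnify gesture.
            .gesture(MagnifyGesture().onChanged { value in
                zoom = value.magnification
            })
            // The two-handed rotate arrives as a rotate gesture.
            .gesture(RotateGesture().onChanged { value in
                angle = value.rotation
            })
    }
}
```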
These gestures work in tandem with eye tracking: the multiple cameras built into Vision Pro follow your eye movements with impressive accuracy, and eye position is a key factor in targeting what you want to interact with using hand gestures. For example, looking at an app icon or on-screen element targets and highlights it, and you can then follow up with a gesture.
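Notably, Apple has said that apps don’t receive raw eye data; the system itself draws the highlight on whatever the user is looking at. In SwiftUI on visionOS, opting a view into interactivity is enough, as in this hedged sketch (the view content is a placeholder):

```swift
import SwiftUI

// Sketch: the system, not the app, highlights a view while the user's
// gaze rests on it. Eye position itself is never exposed to the app.
struct TargetableCard: View {
    var body: some View {
        Text("Gaze to target, pinch to select")
            .padding()
            .hoverEffect(.highlight) // lit up by the system when looked at
            .onTapGesture { print("selected") }
    }
}
```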
Apple noted that your hands can be kept in your lap, and encourages this, since holding your hands and arms up in the air can become tiring. Because the cameras can track such precise movements, a tiny pinch gesture is all that’s needed for the equivalent of a tap.
Once a movement has been initiated, users can select and manipulate objects both close to them and far away, and larger gestures can be used to control objects directly in front of them.
In addition to gestures, the headset will support hand movements such as air typing. Gestures will work together, of course: to create a drawing, users look at a spot on the canvas, select a brush with their hand, and draw with a gesture in the air. Looking elsewhere moves the cursor immediately to wherever they’re looking.
Beyond these six main gestures, developers can create custom gestures for their apps to perform other actions.
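Apple’s session didn’t spell out how custom gestures are built, but one plausible route, using the visionOS ARKit hand-tracking API, is to read hand-joint positions directly and recognize an app-specific shape. A hedged sketch follows; the 2 cm threshold is an arbitrary illustration, not an Apple-recommended value.

```swift
import ARKit
import simd

// Hedged sketch: recognize an app-specific pinch by measuring the distance
// between thumb tip and index fingertip. Hand tracking requires user
// permission and runs inside an immersive space.
let session = ARKitSession()
let handTracking = HandTrackingProvider()

func watchForCustomPinch() async throws {
    try await session.run([handTracking])
    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor
        guard anchor.isTracked, let skeleton = anchor.handSkeleton else { continue }
        // Joint positions are expressed relative to the hand anchor.
        let thumb = skeleton.joint(.thumbTip).anchorFromJointTransform.columns.3
        let index = skeleton.joint(.indexFingerTip).anchorFromJointTransform.columns.3
        let distance = simd_distance(SIMD3(thumb.x, thumb.y, thumb.z),
                                     SIMD3(index.x, index.y, index.z))
        if distance < 0.02 { // fingertips within ~2 cm: treat as a pinch
            print("custom pinch on \(anchor.chirality) hand")
        }
    }
}
```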
As expected, Bluetooth keyboards, trackpads, mice, and game controllers can be connected to the headset to supplement hand and eye gestures, and there are also voice-based search and dictation tools.
Stay tuned for additional details as they become available.
Via MacRumors and developer.apple.com