How do you navigate your Mac? It’s a simple and time-honored one-two punch. First, keyboard. Then, mouse or trackpad. iPhone and iPad have changed that somewhat because we have a touchscreen to navigate, and that requires an even easier interface. Fingers.
At first, the most critical gesture was the now-ubiquitous pinch-to-zoom, which nearly everyone performs without thinking. That type too small? Take two fingers and spread them apart. That picture not fitting on screen? Take two fingers and bring them together. You don’t ever think about this, you just do it — almost as naturally as you swipe your screen.
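As an aside for the developers in the audience: part of why pinch-to-zoom feels effortless is that the system does the remembering for us. Here is a minimal, illustrative UIKit sketch of how an app maps that finger spread to a zoom; the class name is my own invention, not a system API.

```swift
import UIKit

// A minimal sketch of handling the pinch gesture described above,
// using UIKit's UIPinchGestureRecognizer. ZoomableImageView is an
// illustrative name, not part of any Apple framework.
final class ZoomableImageView: UIImageView {
    override init(image: UIImage?) {
        super.init(image: image)
        isUserInteractionEnabled = true
        addGestureRecognizer(
            UIPinchGestureRecognizer(target: self, action: #selector(handlePinch(_:)))
        )
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    @objc private func handlePinch(_ gesture: UIPinchGestureRecognizer) {
        // Fingers spreading apart yield a scale > 1 (zoom in);
        // fingers coming together yield a scale < 1 (zoom out).
        transform = transform.scaledBy(x: gesture.scale, y: gesture.scale)
        gesture.scale = 1 // reset so each callback applies an incremental change
    }
}
```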
Simple, right? Everyone with an iPhone or iPad or Android device knows how to use such multi-touch actions to do this or that. What’s the problem?
Meanwhile, almost anyone with access to a Mac or Windows PC knows the basics: select an area, then cut and paste. We have a long and growing list of keyboard shortcuts that accomplish almost anything the mouse or trackpad can do.
But one thing is missing: we cannot remember all the shortcuts. Likewise, multi-touch has become somewhat burdensome because we can’t remember all the gestures. Sure, a single touch on a button is easy. A double tap might zoom in on an image or location, and most of us can remember to spread to zoom in or pinch to zoom out.
Yet iPhone and iPad come with even more multi-touch gestures. On iPad, for example, a four-finger swipe lets you move from one app to another and back again. Now there are new gestures to select and insert text on the iPad’s display.
It’s official. We’ve reached the same problem with gestures that we have with keyboard shortcuts. Remembering which gesture does what is akin to remembering which keyboard shortcut does what within the confines of each app (often they differ from app to app).
What Apple had wrought with the mouse/windows/icons user interface of the 1984 Macintosh, it smashed with the multi-touch iPhone, even though the 2007 model sold fewer than 6 million units in its first year. Apple has sold another 1.15 billion iPhones since.
Add Android to the tally and that simple math means there are about 4 billion touchscreen devices with multi-touch capabilities, while most of the 1 billion or so traditional PCs still rely on the keyboard, mouse, and trackpad to navigate (Microsoft’s Surface touchscreen technology notwithstanding, since hardly anyone uses a Surface tablet-style).
I bring this up because gestures have reached a threshold similar to that of keyboard shortcuts. There are too many to remember, and new ones are coming faster than we can integrate them into our navigation routines. Google touts another one on the way in the Pixel 4. Andy Meek explains:
Radar technology called Project Soli will be able to track hand gestures in the air and then turn those gestures into executable commands. At the outset, the functionality will be limited, with the Soli-based controls only able to do things like snooze an alarm, silence a phone call or skip songs.
Think of it as Face ID but always on, always ready to track a specific movement in front of the display and then respond accordingly.
Except now you have something else to remember. The example video seems interesting: a person stares at the Pixel 4 display and swipes a hand in front of it to perform this or that task. In real life, though, we’re holding the smartphone in one hand and would have to use the other hand for the mid-air swipe, where now we simply use a finger to touch the display.
See? Something else to learn and remember. Gestures have become the new keyboard shortcuts.
Too many to remember.