

Could Apple Bring Eye Tracking And Pinch Gestures To The Mac?



For most millennial and Gen Z PC users, the graphical user interface popularized by the Mac is the only computer interface they have ever known. But many of us Boomers and Gen Xers cut our tech teeth on purely text-based interfaces, programming and operating minicomputers and early PCs.

My first tech-related job used DEC PDP-11s, which we programmed entirely from the keyboard. The early PCs also ran text-based operating systems: the Apple I and Apple II were controlled through keystrokes, and Microsoft's DOS, which shipped with the IBM PC at its 1981 launch, had a steep learning curve.

The evolution of man-machine interfaces has made computers easier to use. Thanks to smartphones, just about everyone knows how to interact with some computing device.

On April 1, 2024, Apple celebrated the 48th anniversary of its founding.

Reflecting on this anniversary and Apple's history reminded me of the major contributions the company has made to computing. At the hardware level, it blazed new trails with the Apple I and II, the Lisa, the Macintosh, and the candy-colored iMacs, followed by the iPod, iPhone, iPad, Apple Watch, and most recently, the Vision Pro.

However, it is Apple's quest to make the man-machine interface easy to use that makes its role and influence in computing so compelling. Steve Jobs said many times that the computers of the past were hard to use. He committed to making them easier and built that philosophy into every product Apple has made.

With the Mac, he brought the mouse and a graphical user interface to the mass market. With the iPod, he gave us the scroll wheel, which made using a hand-held music player a breeze. With the iPhone, Apple took the concept of touch as a UI to new levels, and it did the same with the iPad, making the tablet one of the most portable computers of our time. And although Amazon and Google offer voice assistants of their own, Apple's Siri has become a voice navigation tool available across all Apple devices.

The Apple Watch has become the premier smartwatch, and its new style of touch makes it simple to use.

Now, with the Vision Pro, Apple is primed to make eye tracking, pinch gestures, hand gestures, and voice a standard user interface for its XR headsets.

The UI used to navigate the Vision Pro is new, and I believe Apple can and will extend it to current Apple products, particularly the Mac and MacBooks, starting with eye tracking. If you have used the Vision Pro, you understand that eye tracking is a revolutionary way to navigate the headset. By monitoring an individual's eye movement and gaze, it creates a novel mode of interaction with digital devices: you look at an app to select it, pinch to open it, and use hand gestures to position windows and jump between open applications.

I could see these gestures eventually replacing the mouse, and perhaps even the keyboard, on a Mac, coupled with voice commands for creating content and navigating. This approach could also sidestep Apple's long-standing resistance to a touchscreen Mac, a feature in which it has yet to show any interest.

This is not to say that the mouse and keyboard should disappear. However, adding eye tracking and pinch gestures would extend the Mac's UI and make it even more versatile.

Apple's history of inventing and influencing man-machine interfaces is undeniable. It has become the hallmark of Apple's importance in making computers and computing devices easy to use.


Disclosure: Apple subscribes to Creative Strategies research reports along with many high tech companies around the world.
