
When it comes to augmented reality, it’s glasses or bust

We have AR on our phones today and merely improving its quality won’t be enough to thrill users and drive adoption

Augmented reality (AR) is the next big thing. At least, that’s what Tim Cook believes, and he’s not alone.

There are several companies betting on AR as the next great computing platform. Apple has been building tools for AR development for iPhone and iPad for a few years now.

AR is here today on our iPhones and iPads. Why isn’t it a bigger deal? Why aren't everyday users as excited about the potential of AR as Tim Cook is? Why aren’t all our everyday apps rushing to make it a core feature, or even an essential one?

I believe it’s because AR on our phones and tablets is never going to take off. It just barely qualifies as AR, and it suffers from cognitive and usability limitations that are almost insurmountable. For AR to reach its potential, it must reach us in the form of glasses.

Defining augmented reality

Before explaining why I think glasses are essential to AR’s success, I should set some ground rules about what exactly constitutes augmented reality.

Simply put, AR integrates computer-generated graphics into the real world around you. It generates objects or effects with the correct scale, orientation, position, and (to some degree) lighting to appear as real objects do.

And as our view of reality shifts, so too does the scale, position, and orientation of the virtual objects. AR graphics don't have to look realistic, but they do have to look realistically located.
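To make “realistically located” concrete, here’s a minimal sketch of how this works with Apple’s ARKit, the framework behind iPhone and iPad AR. It detects a ground plane and pins a virtual object to it, so the object keeps a fixed real-world position and scale as the camera moves. (The view controller wiring is illustrative; a real app needs its own scene setup.)

```swift
import UIKit
import ARKit
import SceneKit

class ARDemoController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = .horizontal   // look for ground planes
        sceneView.delegate = self
        sceneView.session.run(config)
    }

    // Called when ARKit detects a new plane anchor in the real world.
    func renderer(_ renderer: SCNSceneRenderer,
                  didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        // A 10 cm cube attached to the world-tracked anchor. Because it
        // is rooted in real space, walking closer makes it bigger on
        // screen, exactly as a real box would be.
        let box = SCNBox(width: 0.1, height: 0.1, length: 0.1,
                         chamferRadius: 0)
        let boxNode = SCNNode(geometry: box)
        boxNode.position = SCNVector3(0, 0.05, 0) // sit on the plane
        node.addChildNode(boxNode)
    }
}
```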

There are some gimmicks in photo-sharing apps that are billed as “AR” but don’t meet these criteria. It’s usually just a 3D object placed anywhere in the scene, with no regard for correct scale, position, or orientation.

Pokémon Go’s original AR mode (left) wasn’t really AR. But nobody really cared much when real AR was introduced (right). Credit: Niantic

For example, when Pokémon Go launched, its “AR” mode for capturing Pokémon worked this way: You just rotated until you were facing a certain direction and there would be a little critter floating there, on top of a door or your computer monitor or whatever.

The app has since updated this feature to a “true” AR mode, which finds ground planes and places creatures on them at the correct scale and orientation. You can move around them and view them from different perspectives. Move closer and they get bigger, staying rooted to a specific position in real-world space.

Nobody cares about phone AR

Pokémon Go is probably the best example of just how little people actually care about phone AR. While the mainstream press always billed it as an “augmented reality game”, the hook that made Pokémon Go popular was its location-based gameplay. The so-called AR mode (which wasn’t really AR, at first) was always optional, and most players turned it off to save battery life.

Now that Pokémon Go has a real AR mode for capturing critters, are people impressed? Did the mainstream media fawn over how immersive and amazing it is? Did all those players turn AR mode back on?

No, no, and no. Nobody cares. Players still disable the AR mode.

Whether it’s Snapchat’s dancing hotdog or Pokémon Go, augmented reality on our phone screens is perceived as little more than a nifty toy. And it’s going to stay that way.

The key to AR is reality

Augmented reality does have tremendous, world-changing potential. Think about it: the real world around you, but with computer-generated graphics integrated into it. That’s useful for just about everything.

It will revolutionise travel, construction and factory work, social media and dating apps, mapping, stargazing, shopping (online and in-store), toys, games... the applications are truly endless.

But AR on our phones is not computer graphics on reality. It’s computer graphics on a screen, superimposed on what amounts to a video feed of reality. We’ve had that for ages; it happens on our TV sets every day. There’s a wide gulf between seeing actual reality with your own eyes and seeing a representation of reality on a flat 2D screen.

When we use AR on our phones, we are looking at a wide view of reality in which a small rectangular portion is playing a real-time video of that same reality.

That video is invariably wrong for the phone’s position and its distance from our eyes, and it doesn’t reproduce the colours, dynamic range, and focus of everything our eyes see around the screen. It doesn’t look real because it isn’t real, and because we can see reality all around it.

And as our heads and eyes make subtle movements, the image on the phone doesn't follow suit, so our brains are not fooled.

Better phone-based AR isn’t going to cut it

Augmented reality on phones is going to get better and better. Rumour has it that iPad Pros and high-end iPhones shipping this year will have “time-of-flight” sensors on the rear cameras to more quickly produce the 3D depth information necessary for augmented reality. They’ll also improve photo-taking in some ways, but that’s not related to AR.
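If those rumoured sensors ship, the payoff for developers would be immediate, per-pixel depth rather than depth inferred from camera motion. Here’s a hedged sketch of how that might surface, modelled on ARKit’s existing frame-semantics pattern; treat the `.sceneDepth` names as assumptions about a future API, not something confirmed for these devices:

```swift
import ARKit

// Hypothetical: opt in to per-pixel depth from a time-of-flight
// sensor, following ARKit's frame-semantics convention.
func makeDepthConfiguration() -> ARWorldTrackingConfiguration? {
    guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
        return nil  // device lacks a depth sensor
    }
    let config = ARWorldTrackingConfiguration()
    config.frameSemantics.insert(.sceneDepth)
    return config
}

// On each frame, a depth map would arrive alongside the camera image,
// giving instant 3D structure instead of waiting for the user to wave
// the phone around so planes can be detected from motion.
func handle(frame: ARFrame) {
    guard let depth = frame.sceneDepth else { return }
    let depthMap: CVPixelBuffer = depth.depthMap  // distance per pixel
    _ = depthMap  // feed into plane finding, occlusion, etc.
}
```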

Better phone hardware may be a necessary step toward eventual widespread AR adoption with glasses. It’s probably critical to building the tools and software developers need to create augmented reality experiences. But it’s not the thing that will make AR a part of our daily lives.

No matter how good our phones become at doing AR, our brains will still receive too many signals telling them that what we’re seeing is not reality.

No matter how good phone (or tablet) AR gets, you’ll always be looking at the real world through a camera on a handheld object’s screen. Credit: Apple

We’ll still be looking at a screen, held up by our arms, at some distance in front of our eyes. The “real world” we see on that screen will not match the actual real world in our view around the phone in sharpness, lighting, focus, or perspective.

We’ll interact with it by tapping and swiping on a small, flat rectangle. Our senses will be bombarded with the fact that not only are the computer graphics on the display not actually real, but neither is the “reality.”

Getting AR right

When we look at AR through a pair of glasses or goggles, we are seeing actual reality: real photons bouncing off real objects in the real world, upon which 3D graphics are superimposed. When our heads move, in subtle or significant ways, our view of reality changes exactly as we expect, because it isn’t computer-generated at all; it’s real.

Looking through glasses, we don’t see actual reality around a limited window of “fake” reality. It’s all just the actual world around us.

Current AR glasses have a limited field of view for the section that can contain computer graphics. That’s one of the challenges to overcome before widespread adoption, but it’s not a total deal-breaker: your view of the world is not limited, only your view of the graphics overlay. It’s nothing like the limited field of view in virtual reality, which constrains everything you see.

To make AR really work for the masses, we simply must have glasses or goggles. We have to see graphics on top of our actual view of the real world. But that doesn’t mean the headset has to be a stand-alone item like Microsoft’s HoloLens.

HoloLens 2 is impressively compact for what it does, but it’s not nearly compact enough. Credit: Adam Patrick Murray / IDG

I think the most likely scenario is a relatively compact and lightweight pair of glasses that contains only a transparent display and 3D sensing hardware. It would feed its position and orientation data to your phone, which would generate the correct graphics to superimpose onto your view. Those graphics would be sent back to the glasses and displayed on their transparent surface.
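As a thought experiment, the traffic over that link might look something like this. None of it is a real Apple API; it’s just a sketch of the division of labour, with the glasses sensing and displaying while the phone does the computing:

```swift
import Foundation
import simd

// Hypothetical wire format for a phone-tethered AR display.

// Upstream, glasses -> phone: where the wearer's head is right now.
struct PosePacket {
    var timestamp: TimeInterval    // when the pose was sampled
    var position: SIMD3<Float>     // head position in world space
    var orientation: simd_quatf    // head orientation
}

// Downstream, phone -> glasses: the rendered overlay, tagged with the
// pose it was rendered against, so the glasses could nudge it to
// account for any head movement since the frame was generated.
struct FramePacket {
    var renderedForPose: PosePacket
    var imageData: Data            // compressed overlay frame
}
```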

Given the low latency with which all that must happen, I wouldn’t be surprised if Apple’s first AR glasses have a tether that runs from behind your ear to the Lightning port on your iPhone. (For a rough sense of the budget: at a 60Hz display, a frame lasts about 16.7 milliseconds, and the entire sense-render-display round trip has to fit well inside that to keep virtual objects pinned to the world.) The phone could potentially stay in your pocket, but there may be some value in using it as a pointing controller to interact with your AR environment.

[Memo to myself: Patent iPhone wrist straps for AR controller use.]

Magic Leap uses goggles tethered to a processing core as well as a separate controller. But the tether and goggles are too big, and requiring a separate controller is never really going to fly. People aren’t going to carry that around with them all the time. Making the iPhone both the processing unit and a controller makes some sense.

Magic Leap uses hefty goggles tethered to a sizable processing unit and a separate wireless controller. Too cumbersome for the mass market. Credit: Magic Leap

Eventually, chip production and battery technology may get to the point where fully self-contained AR glasses of sufficient quality could be small and light enough to be accepted by the masses.

HoloLens 2 is impressively small for what it does, but it’s still way too bulky for hundreds of millions of everyday consumers.

When can we expect Apple AR glasses? Nobody knows. While the marketing opportunities for releasing a vision-based product in the year 2020 are enticing, it’s hard to imagine that the technology will be ready quite yet. I think an “Apple Glasses” product in 2021 is an aggressive timetable, and 2022 or 2023 is more likely.

There are huge technical challenges to overcome, and the price has to come down significantly; Magic Leap and HoloLens 2 cost $2,300 to $3,500, and it’s hard to imagine an AR product gaining mass adoption if it costs more than a high-end phone.