Augment Reality

What is the job of a modern smartphone? These little slates go with us everywhere, and we increasingly see people looking at them instead of speaking to parties on the other end of a telephonic conversation.

Even that observation is old news. Looking at a phone has been our primary pose for at least five years. We see people walking down the street cradling their phones in their middle three fingers, the thumb hovering over the lower half of the screen, waiting to type. What comes out is often mangled, and we all hope that autocorrect knows what they meant.

Technology has shifted to help with this. Emoji predate smartphones, but picking the face or icon that represents one's reaction is faster and usually more accurate than typing it out on a capacitive screen.

The dense grid of pixels presented 6 to 24 inches from one's face is, in many ways, sharper than reality itself. It is bright, crisp, and fills one's near field of vision. A recipe for attraction and addiction! The phone screen can also be isolating - removing one from the world by focusing attention on a silent conversation or on content wholly disconnected from one's surroundings.

But the phone is not just a touch screen with a radio. It also has cameras and microphones to capture the world around us. One can look at the world through the screen to gather data about one's context. The classic example is using Google Maps or (sigh) Apple Maps to navigate an unfamiliar area; the phone gives us insight into where we are.

One can look at the map top-down to see how the streets fit together and watch them scroll past on the screen, with the view centered by GPS and Wi-Fi geolocation. This view is a step up from the AAA maps one might previously have used while driving a car.

Another step up is Street View, which lets me map the stitched panoramic photos in the app onto my immediate view. "Yes," we say with a glance around, "this matches what I see." That building on the screen is there to my left. My destination is two houses from here!

The navigation application is a first step - and not a new one - in helping us understand our immediate environment. It creates real economic opportunities, for example by turning an amateur driver into a competent chauffeur: we can hop into an Uber and get where we mean to go, not because of the driver's expertise but because of their device-augmented understanding of the road and route.

Another use goes even deeper: the camera and motion sensors can project phantom objects into space. We can understand three-dimensional objects on a two-dimensional screen through time and motion. As we move the phone around, we get different perspectives on a virtual object in front of us - or all around us.
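For developers, the primitive behind this is world tracking: the phone fuses camera frames with accelerometer and gyroscope data to estimate where the device is in the room. Here is a minimal sketch in Swift, assuming ARKit and SceneKit on iOS; the class name and the cube are illustrative, not taken from any particular app.

```swift
import UIKit
import ARKit
import SceneKit

// A minimal sketch, assuming ARKit and SceneKit: place a virtual cube two
// metres in front of where the session starts. Because the cube's position is
// expressed in world space, walking around it reveals its other faces - the
// "time and motion" view of a three-dimensional object on a flat screen.
class PhantomObjectViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)

        // World tracking fuses the camera with the motion sensors to estimate
        // the device's position and orientation.
        sceneView.session.run(ARWorldTrackingConfiguration())

        // A half-metre cube, two metres ahead of the starting pose.
        let cube = SCNNode(geometry: SCNBox(width: 0.5, height: 0.5,
                                            length: 0.5, chamferRadius: 0.02))
        cube.position = SCNVector3(0, 0, -2)
        sceneView.scene.rootNode.addChildNode(cube)
    }
}
```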

I spoke with a sales representative who used a sticker to help sell an oil rig. The sticker anchored the augmented reality "space" so that it was shared by everyone holding a phone. Since every phone could see the same sticker, every app agreed on its position and orientation relative to it.

The sticker also gated the AR app he gave to everyone - a stranger could not look at the rig. Once the app loaded, the machine appeared all around them. They could hold up their phones to the empty room in front of them and see the rig in detail. They walked around it, phones in hand, to inspect all of its parts. The model came from the engineering application used to design the rig - assets that were already broadly available within the vendor's organization.

He made the sale.
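For the technically curious, the sticker trick corresponds to what ARKit calls image detection. The sketch below is a hedged guess at the shape of such an app, not the one he actually used: "StickerImages" and "rig.scn" are hypothetical names for a scanned copy of the sticker and a model exported from the engineering tool.

```swift
import UIKit
import ARKit
import SceneKit

// A hedged sketch of image-anchored AR. Every phone that recognizes the same
// printed sticker computes the same anchor, so everyone in the room shares one
// coordinate frame for the virtual rig.
class RigViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)

        let configuration = ARWorldTrackingConfiguration()
        // "StickerImages" is an asset-catalog group holding a scan of the
        // sticker; the group name and registered physical size are assumptions.
        configuration.detectionImages =
            ARReferenceImage.referenceImages(inGroupNamed: "StickerImages", bundle: .main)
        sceneView.session.run(configuration)
    }

    // Called when the camera spots the sticker: attach the rig model to the
    // image anchor so it appears in the same spot for every viewer.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARImageAnchor,
              let rigScene = SCNScene(named: "rig.scn") else { return }
        for child in rigScene.rootNode.childNodes {
            node.addChildNode(child)
        }
    }
}
```

Without the sticker, the app never finds its anchor - which is exactly the gate he wanted.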

I have discussed augmented reality so far in terms of consuming content, but there are magnificent opportunities for creation. Working through the "magic glass," one can create and modify content that drives insight and beauty. One example is a simple drawing program that leaves virtual "string" wherever one draws with the phone in space. Move the phone back, and the string, message, or image remains, hanging in the air. One can make things happen in three dimensions by moving the phone along any axis. One technique that truly amazed me was holding a finger on the screen and simply walking forward. The line forms behind you, as if pulled through an imaginary hole in the phone. Drawing a network of connections across a room this way delivers some fantastic results.
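That walking-forward trick is simpler to build than it sounds. A rough sketch, again assuming ARKit and SceneKit: while a finger rests on the screen, drop a tiny bead at the camera's current position on every frame, and the beads trail out behind the phone like string pulled through that imaginary hole.

```swift
import UIKit
import ARKit
import SceneKit

// A rough sketch of AR "string" drawing: each rendered frame, while touching
// the screen, add a small sphere at the phone's current position in the world.
class StringDrawingViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()
    var isDrawing = false

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
        sceneView.session.run(ARWorldTrackingConfiguration())
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) { isDrawing = true }
    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) { isDrawing = false }

    // Runs once per rendered frame; the camera transform is the phone's pose.
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        guard isDrawing,
              let transform = sceneView.session.currentFrame?.camera.transform else { return }
        let bead = SCNNode(geometry: SCNSphere(radius: 0.005))
        // Column 3 of the 4x4 camera transform holds its position in world space.
        bead.position = SCNVector3(transform.columns.3.x,
                                   transform.columns.3.y,
                                   transform.columns.3.z)
        sceneView.scene.rootNode.addChildNode(bead)
    }
}
```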

My test for the utility of a technology is uptake by my seven-year-old children. They cottoned to augmented reality immediately. They saw how they could move their devices to interact, and then mixed that with physical objects. They built funny cars with Legos and pull-back motors to launch from ramps into the sky, and used AR apps to draw rings of fire through which their plastic stuntmen fly for points. This "magic glass" effect is powerful and intuitive. Victory comes from combining the physical and the virtual - a reality that exists only inside an iPad tracking the two together.

A slightly more educational example is a physics simulator. My sons have a simple app that launches planets and stars into the space in front of them. The objects react to each other through gravity, creating orbits, collisions, and escapes off into space. The relationships of size, speed, and direction become much more intuitive - and more fun - when one can simply experiment.
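The underlying simulation is small enough to sketch. Here is a toy version in plain Swift - not my sons' app, whose internals I do not know - that steps Newtonian gravity for a handful of bodies. Launch a light "planet" sideways near a heavy "star" and it traces an orbit; launch it too fast and it skips off into space.

```swift
import simd

struct Body {
    var position: SIMD3<Double>
    var velocity: SIMD3<Double>
    var mass: Double
}

// One semi-implicit Euler step of Newtonian gravity: every body accelerates
// toward every other with G * m / r^2 along the line between them.
func step(_ bodies: inout [Body], dt: Double, g: Double = 6.674e-11) {
    var accelerations = [SIMD3<Double>](repeating: .zero, count: bodies.count)
    for i in bodies.indices {
        for j in bodies.indices where j != i {
            let offset = bodies[j].position - bodies[i].position
            let distance = max(simd_length(offset), 1e-3)   // avoid dividing by zero
            accelerations[i] += (g * bodies[j].mass / (distance * distance)) * (offset / distance)
        }
    }
    for i in bodies.indices {
        bodies[i].velocity += accelerations[i] * dt
        bodies[i].position += bodies[i].velocity * dt
    }
}

// A heavy star at the origin and a light planet given a sideways shove,
// using roughly Sun-and-Earth numbers in metres, kilograms, and seconds.
var system = [
    Body(position: .zero, velocity: .zero, mass: 2e30),
    Body(position: SIMD3(1.5e11, 0, 0), velocity: SIMD3(0, 2.98e4, 0), mass: 6e24),
]
for _ in 0..<365 { step(&system, dt: 86_400) }   // one simulated year in one-day steps
print(system[1].position)                        // roughly back where it started
```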

Direct interaction with reality is a tremendous opportunity for understanding, for children and adults alike. It opens up a whole set of metaphors unavailable on the consoles and personal computers that have been the entertainment and productivity domains of most developers so far. I am surprised that so little no-code tooling is available for AR - yet. But that will change.

The job of the modern smartphone is to augment reality. The technology is mostly here. The next step is up to us. 

Photo by UNIBOA on Unsplash