What is the difference between AR and VR?
While both technologies involve simulated reality, AR and VR rely on different underlying components and generally serve different audiences.
In virtual reality, the user almost always wears an eye-covering headset and headphones to completely replace the real world with a virtual one. The idea of VR is to shut out the real world as much as possible and insulate the user from it. Once inside, the VR universe can be coded to provide just about anything, from a lightsaber battle with Darth Vader to a realistic (yet wholly invented) recreation of Earth. While VR has some business applications in product design, training, architecture and retail, the majority of VR applications today are built around entertainment, especially gaming.
Augmented reality, on the other hand, integrates the simulated world with the real one. In most applications the user relies on a smartphone or tablet screen to accomplish this, aiming the device’s camera at a point of interest and generating a live video stream of that scene on the screen. The screen is then overlaid with helpful information, such as repair instructions, navigation cues or diagnostic data.
However, AR can also be used in entertainment applications. The mobile game Pokémon Go, in which players attempt to capture virtual creatures while moving around in the real world, is a classic example.
Augmented reality has abundant — and growing — use cases. Here are some real applications you can engage with today.
- Ikea Place is a mobile app that allows you to envision Ikea furniture in your own home, by overlaying a 3D representation of the piece atop a live video stream of your room.
- YouCam Makeup lets users virtually try on real-life cosmetics via a live selfie.
- Repair technicians can don a headset that walks them through the steps of fixing or maintaining a broken piece of equipment, diagramming exactly where each part goes and the order in which to do things.
- Various sports are relying on augmented reality to provide real-time statistics and improve physical training for athletes.
Beyond gaming and other entertainment cases, some business examples of virtual reality include:
- Architects are using VR to design homes — and let clients “walk through” before the foundation has ever been laid.
- Automobiles and other vehicles are increasingly being designed in VR.
- Firefighters, soldiers and other workers in hazardous environments are using VR to train without putting themselves at risk.
When were virtual reality and augmented reality first introduced?
While primitive virtual reality systems got their start in the 1950s and 1960s, the concepts of VR and AR began to gain momentum in military applications during the early 1980s. Motion pictures such as Tron, The Matrix and Minority Report all offered futuristic riffs on how these technologies would evolve in the years to come.
The first mainstream attempt at releasing a consumer VR headset was the Sega VR in 1993, an add-on for the Sega Genesis gaming console. While it never made it to market, it did stoke consumer interest in the technology. It would not be until the Oculus Rift — crowdfunded in 2012, with a consumer version shipping in 2016 — that a VR headset found success with a consumer audience, though today these devices remain expensive and largely of interest to niche, gaming-focused users.
Augmented reality split off from virtual reality around 1990 and was brought to the public’s attention in 1998, when TV broadcasters began overlaying a yellow line on the football field to indicate the distance to a first down. Over the next decade, AR applications were developed for both military use (such as in fighter jet cockpits) and consumers, as print magazines and packaged goods began embedding QR codes that could be scanned with a cell phone, making the product “come alive” with a short 3D video.
In 2014, Google rolled out Google Glass to consumers, with an eye toward equipping everyone with a head-mounted AR display. The headset, controlled via voice and touch gestures, was met with skepticism and criticism, largely over fears that wearers could be recording video in public at any time. Privacy suddenly became a major talking point in consumer AR. Google ultimately suspended the project and relaunched it a few years later with enterprise users in mind.
User privacy has become a growing issue with consumer AR headgear
How is augmented reality being used in business?
Today, business and enterprise use cases are the predominant applications for AR. Some key examples include:
- Design and construction — Arguably the most common and fruitful application of AR today: designers use augmented reality to see what hypothetical products (or structures) would look like in real environments and to make virtual tweaks to existing products without ever laying a hand on them.
- Maintenance and repairs — AR technology can guide technicians through the steps of repairing, upgrading, and maintaining a wide range of products, ranging from industrial equipment to entire buildings. AR allows technicians to work on equipment without having to refer to printed manuals or websites, overlaying detailed instructions – often visual – atop the machinery itself.
- Training and education — Businesses are using AR technology to provide an immersive experience when training employees, allowing them to more comprehensively visualize new products and concepts. Schools are following suit.
- Healthcare — AR technology has made its way into the surgery room, with overlays showing the critical steps of an operation, patients' vital statistics, and more.
- Retail — From virtual makeup to virtual changing rooms, businesses are using AR to give shoppers a revamped, modernized retail experience.
- Technology — Products like Splunk Augmented Reality bring AR to major utility companies, helping them improve responses during power outages and gain full visibility into their data.
- Marketing — AR concepts on packaging, point-of-sale materials, and even billboards give businesses a brand new — and much more memorable — way to interact directly with customers.
What are the components of an augmented reality system?
Augmented reality varies depending on implementation, but the most common components include the following, categorized by hardware and software.
These hardware components comprise the backbone of augmented reality. Some of these components might already be supported if you are engaging in AR with your smartphone (more in the following section):
- Processor – Augmented reality requires significant processing power to create the imagery needed and place it in the proper location for it to appear to exist in a real-world environment. Processors may be incorporated in a mobile handset or embedded into a wearable device (more on this below).
- Display – In AR, imagery is created and then populated on some form of display. This can take several forms, depending on the specific application. These include:
- Mobile handheld device – The smartphone or tablet screen is arguably the most common way AR imagery is viewed. A user points his or her phone’s camera at a point of interest, and the live video generated by the camera is overlaid with AR information.
- Wearable device – Smart glasses such as Google Glass, Vuzix Blade, and Solos Smart Glasses are designed as standard eyeglasses that also contain a small display visible only to the wearer. The wearer sees the real world by looking straight through the lenses, while the embedded display provides an informational overlay. VR headsets are less common in AR environments because they do not allow the wearer to see the real world directly; instead, the scene has to be captured on video and displayed on the built-in screen, which is otherwise opaque.
- Automotive HUDs – HUDs, or heads-up displays, are systems that use your car’s windshield as a screen. A device projects an image – speed, directions, etc. – from the dashboard upwards onto the windshield. The driver sees the reflection of this imagery as it bounces off the glass like a mirror.
- Others — Looking ahead, more futuristic devices like smart contact lenses and systems that can project an image directly onto the retina may become viable.
- Camera – As the primary sensor required for AR to function, the camera feeds the live video to the processor, which detects key facets of the environment on which the AR data is overlaid. The camera itself does not process any of the digital information; it merely provides the video feed.
- Other sensors – AR is often designed for motion, so additional sensor types are required for operation. These may include spatial sensors, such as accelerometers and digital compasses, which indicate the direction the camera is facing; GPS sensors, which track the user’s location in the world; microphones, which incorporate audio data into the experience; and lidar, which uses lasers to measure exact distances.
- Input devices – A user on the move is often not at liberty to type commands into a computer. As such, AR systems have been devised to work with numerous types of input technologies. Foremost is the mobile device touchscreen, providing a natural interaction if a phone or tablet is available. Other options include voice recognition technology, so users can control the system via speech, and gesture recognition systems, which typically translate the motion of the user’s hand into commands.
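To make the sensor discussion above concrete, here is a minimal sketch — illustrative Python, not tied to any real device API, with all function names and values invented for the example — of how a compass heading derived from magnetometer readings might decide whether a point of interest falls inside the camera’s field of view:

```python
import math

def heading_degrees(mag_x: float, mag_y: float) -> float:
    """Compass heading (0 = north, 90 = east) from the horizontal
    magnetometer components, assuming the device is held level."""
    # atan2 handles all four quadrants; convert radians to degrees.
    return math.degrees(math.atan2(mag_x, mag_y)) % 360.0

def poi_visible(camera_heading: float, poi_bearing: float,
                fov_degrees: float = 60.0) -> bool:
    """True if a point of interest at the given compass bearing falls
    inside the camera's horizontal field of view."""
    # Smallest signed angle between the two directions.
    diff = (poi_bearing - camera_heading + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_degrees / 2.0

# Device pointing roughly north-east; point of interest at bearing 70.
h = heading_degrees(0.7, 0.7)
print(round(h), poi_visible(h, 70.0))  # prints: 45 True
```

A real system would also fuse accelerometer data to compensate for device tilt, but the principle — convert raw sensor values into an orientation, then test it against the locations of overlay content — is the same.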
Several types of software algorithms are needed to enable augmented reality. Broadly, these include:
- Image registration – Software that takes a photographic representation of one’s surroundings and uses it to determine real-world coordinates and the objects within it. Image registration maps the real world, determining what is in the foreground versus the background, where one object ends and another begins, and which points of interest carry additional information.
- 3D rendering – With the real world mapped and categorized, the next step is overlaying the augmented reality information on top of it. The 3D renderer creates virtual objects and places them in the appropriate location within the live image. The Augmented Reality Markup Language (ARML), an Open Geospatial Consortium standard, describes the location and appearance of virtual objects.
- Content management – A back-end system that maintains a database of virtual objects and 3D models.
- Interface – Whether it’s a video game or a technical management tool, the interface is the intermediary between the user and the video representation of the augmented reality environment.
- Development toolkits – A variety of open source and proprietary technologies that give programmers a framework for building AR applications on the platform of their choice.
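The image-registration step described above often boils down to aligning two sets of matched points — features found in the current camera frame versus a reference. As a rough sketch (plain Python, not any particular library’s API; the sample points are invented), the 2D closed form of the classic Kabsch/Procrustes fit recovers the rotation and translation relating the two views:

```python
import math

def estimate_rigid_transform(src, dst):
    """Estimate the rotation angle (radians) and translation that best
    map 2D points `src` onto matched points `dst` — the 2D closed form
    of the Kabsch/Procrustes rigid alignment."""
    n = len(src)
    # Centroids of both point sets.
    csx = sum(x for x, _ in src) / n
    csy = sum(y for _, y in src) / n
    cdx = sum(x for x, _ in dst) / n
    cdy = sum(y for _, y in dst) / n
    # Accumulate dot and cross products of centered correspondences.
    dot = cross = 0.0
    for (sx, sy), (dx, dy) in zip(src, dst):
        ax, ay = sx - csx, sy - csy
        bx, by = dx - cdx, dy - cdy
        dot += ax * bx + ay * by
        cross += ax * by - ay * bx
    theta = math.atan2(cross, dot)
    # Translation that moves the rotated source centroid onto dst's.
    tx = cdx - (csx * math.cos(theta) - csy * math.sin(theta))
    ty = cdy - (csx * math.sin(theta) + csy * math.cos(theta))
    return theta, (tx, ty)

# A unit square seen rotated 90 degrees and shifted by (5, 2).
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(5, 2), (5, 3), (4, 3), (4, 2)]
theta, (tx, ty) = estimate_rigid_transform(src, dst)
print(round(math.degrees(theta)), round(tx, 6), round(ty, 6))  # 90 5.0 2.0
```

Production AR trackers solve a harder 3D version of this problem, with noisy and partially wrong matches, but the core idea — recover the camera’s pose from point correspondences so overlays stay pinned to the world — is the same.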
How does augmented reality work on mobile?
If you encounter an AR application today, it will probably be a mobile app: any smartphone owner has access to hundreds of AR applications on iPhone or Android without the need for any additional hardware. All the core software capabilities needed to enable AR are built into the operating system.
In a typical use case, the user launches an AR application on his or her phone or tablet. Most AR apps are fairly simple in design: the user aims the device at a point of interest and waits for the application to populate the screen with additional context. This could be anything from walking directions to the identities of stars in the sky to dance steps.
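Behind that aim-and-overlay flow, the renderer has to convert a virtual object’s 3D position (relative to the camera) into a pixel location on the phone’s screen. A minimal sketch of that projection step, assuming a simple pinhole camera model with invented screen dimensions and focal length (not any real mobile SDK’s API):

```python
def project_to_screen(point, focal_px, cx, cy):
    """Project a 3D point in camera coordinates (x right, y down,
    z forward, metres) to pixel coordinates via a pinhole model."""
    x, y, z = point
    if z <= 0:
        return None  # behind the camera; nothing to draw
    # Perspective divide, then shift to the screen centre (cx, cy).
    u = cx + focal_px * x / z
    v = cy + focal_px * y / z
    return (u, v)

# Hypothetical 1280x720 preview with an 800 px focal length:
# a virtual label anchored 0.5 m right of centre, 2 m ahead.
pix = project_to_screen((0.5, 0.0, 2.0), focal_px=800, cx=640, cy=360)
print(pix)  # (840.0, 360.0)
```

As the sensors report camera movement, the app re-runs this projection every frame, which is what keeps the overlay visually “attached” to the point of interest.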
Hundreds of AR applications are available on mobile devices