Augmented reality (AR) uses technology to combine a simulated environment with a real environment. AR relies on optics to create a simulated environment that annotates or enhances the real environment so that the user can experience them as one environment. The hardware for augmented reality typically includes a computer capable of real-time simulation that synchronizes and maps the simulated to the real environment using a visual output display.

Currently, the AR optics of most interest are see-through head-mounted displays (HMDs), which are also known as see-through near-eye displays or head-up displays (HUDs). As in virtual reality, wearable devices (e.g., haptic gloves) that sense and respond to the user's motions, devices for audio feedback, and trackers for body, head, and eye may be used to interact with the simulation. However, in augmented reality, the user also interacts with objects in the real world.


How do augmented reality optics work?

For augmented reality to work, an optical system must project an image onto a transparent display in front of the user's eyes to overlay the current environment. This may take the form of a head-mounted display, a handheld display (such as a digital tablet), or a mounted display (such as a windshield).

AR Headsets or Mounted Displays

Typically, an optical system for augmented reality includes light sources (display), receivers (eyes), and optical elements (lenses).

  • The light sources for the augmented content are microdisplays, such as organic light-emitting diodes (OLEDs) or liquid crystal displays (LCDs). A binocular HMD typically has two displays that provide separate images for each eye and generate 3D perception through stereoscopy. In a holographic HMD, the light source is modulated coherent light from a spatial light modulator (SLM). The light sources for the real world are light scattered off or emitted by objects within the field of view. 
  • The receivers are simply the eyes of the user.
  • The optical elements combine light from the microdisplays and light from the real world, projecting augmented information (from the microdisplays) onto the real world. An example is shown below in Figure 1, where the microdisplay is imaged at a distance from the AR glasses through the beam-shaping lens, in-coupling prism, prescription lens, and a free-form image combiner. The real scene and the virtual image (augmented information) are imaged to the eye through the prescription lens.  

Figure 1: Schematic diagram of Prescription AR:
(a) The side view and the beam path of the AR image of the proposed system. The prescription lens serves both for vision correction and as a waveguide for the AR image. Light rays from a microdisplay, refracted by a beam-shaping lens, enter the prescription lens through an in-coupling prism and create a magnified virtual image located at a distance from the lens. 
(b) The detailed diagram for geometric parameters in the Prescription AR.
(c) The 3D diagram of optical components.

Reprinted with permission. © The Optical Society.
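The "magnified virtual image located at a distance from the lens" in the caption can be illustrated with the Gaussian thin-lens equation: placing the microdisplay inside the focal length of the eyepiece optics yields an upright, magnified virtual image. A minimal sketch, using illustrative focal length and object distance (not values from the Prescription AR paper):

```python
# Thin-lens sketch of how an HMD eyepiece forms a magnified virtual image
# of a microdisplay. The numbers below are illustrative assumptions only.

def thin_lens_image(f_mm: float, s_o_mm: float) -> tuple[float, float]:
    """Gaussian lens equation: 1/s_o + 1/s_i = 1/f.
    Returns (image distance s_i, lateral magnification m).
    A negative s_i means a virtual image on the same side as the object."""
    s_i = 1.0 / (1.0 / f_mm - 1.0 / s_o_mm)
    m = -s_i / s_o_mm
    return s_i, m

# Microdisplay placed inside the focal length -> magnified virtual image.
s_i, m = thin_lens_image(f_mm=50.0, s_o_mm=40.0)
print(f"virtual image at {s_i:.0f} mm (negative = same side), magnification {m:.1f}x")
# -> virtual image at -200 mm, magnification 5.0x
```

Moving the display closer to the focal point pushes the virtual image farther away and increases magnification, which is why small display shifts are a common focus-adjustment mechanism in HMDs.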

Design Considerations

The effects of aberrations on image quality in HMDs are similar to those in other optical systems. Aberrations such as axial chromatic aberration, spherical aberration, coma, astigmatism, and field curvature introduce blur. Aberrations such as distortion, coma, and lateral chromatic aberration induce warping. Aberration control is important in the design of AR HMD optics. An example all-reflective freeform design is shown in Figure 2, where (a) displays the system design, (b) shows the test object, (c) provides the image simulation results of an initial system, and (d) demonstrates the image after optimization. In this example, we can appreciate the power of optimization in reducing aberrations and distortions.  
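As a toy illustration of the warping aberrations mentioned above, a third-order radial distortion term maps a field point at radius r to r(1 + k·r²), bending straight lines near the edge of the field. The coefficient below is illustrative, not from any real HMD design:

```python
# Minimal radial-distortion sketch (illustrative coefficient k, not from a
# real design): each field point is pushed outward (pincushion, k > 0) or
# pulled inward (barrel, k < 0) in proportion to r^2.

def distort(x: float, y: float, k: float) -> tuple[float, float]:
    r2 = x * x + y * y
    scale = 1.0 + k * r2
    return x * scale, y * scale

# A corner point of a normalized field (r^2 = 2) with a 1% distortion term:
xd, yd = distort(1.0, 1.0, k=0.01)
print(xd, yd)  # -> 1.02 1.02, each coordinate shifted outward by 2%
```

Because the shift grows with r², the image center stays sharp while edges warp, which is why distortion is often corrected digitally by pre-warping the displayed image with the inverse mapping.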

Because it is extremely challenging to match both the FOV and the resolution of the human eye, tradeoffs are often made for specific tasks among FOV, weight (volume, number of optical elements), resolution, pupil size (eye box), eye clearance, and the size of the microdisplays. Some of these tradeoffs can be addressed with improved technology. A few potential solutions: the tradeoff between FOV and resolution can be addressed with high-resolution insets, partial binocular overlap, spatial tiling, temporal division, and diffraction-order tiling; the tradeoff between FOV and pupil size can be addressed by duplicating the exit pupil into an array and employing eye-tracking devices. 
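The FOV-versus-resolution tradeoff can be quantified with a back-of-envelope calculation: the eye resolves roughly 1 arcminute, so matching that across a wide FOV demands very large pixel counts. The numbers below are illustrative assumptions, not product specifications:

```python
# Back-of-envelope FOV vs. resolution tradeoff: pixels per axis needed to
# match a given angular resolution across a field of view.
# (1 arcmin is a common approximation of the eye's resolution limit.)

def pixels_needed(fov_deg: float, eye_res_arcmin: float = 1.0) -> int:
    """Pixels per axis to achieve eye_res_arcmin over fov_deg degrees."""
    return round(fov_deg * 60.0 / eye_res_arcmin)

for fov in (30, 60, 100):
    print(f"{fov} deg FOV -> {pixels_needed(fov)} px per axis")
# -> 30 deg needs 1800 px, 60 deg needs 3600 px, 100 deg needs 6000 px
```

A 100-degree FOV at eye-limited resolution would need roughly 6000 pixels per axis, well beyond current microdisplays, which is what motivates techniques such as high-resolution insets and tiling.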


Figure 2: Simulating an all-reflective freeform design

Also see the VR page for HMDs.

What is needed to design augmented reality optics?

Synopsys provides a complete set of tools to study AR/VR devices. The design of AR optical systems has multiple software needs. The optical engineer needs software to create and optimize the imaging system, analyze stray light in the optical path, and design diffractive optical elements. The mechanical engineer needs a CAD package to draw the system layout and perform thermal and structural analysis. The AR system may also require an electrical engineer to track eye motion and send the signals to the optical system. 


The workflow: 

  • Optical Systems:
    • CODE V optical design software can be used to trace rays through the optical system and optimize it to reduce aberrations, decrease distortion, and increase resolution, as demonstrated for a head-mounted display. Augmented reality optics for automotive head-up displays (HUDs) can also be modeled in CODE V (see next section). The geometry can then be exported to LightTools.

    • LightTools illumination design software can model illumination, stray light, and ghost images. LightTools can also be used to optimize illumination uniformity. 

  • Design gratings using the RSoft Photonic Device Tools

    Diffraction gratings couple light into the waveguide plate and couple it back out of the plate toward the eyes. Gratings must be designed properly so that the optical system produces good images. Gratings can be optimized based on the diffraction angles, efficiencies, and other properties of any order or combination of orders.
    • DiffractMOD RCWA is a very efficient tool to rigorously calculate diffraction properties of transversely periodic devices.
    • FullWAVE FDTD is another powerful tool to rigorously calculate diffraction properties of transversely periodic devices when a full time-domain analysis is necessary. 
    • MOST optimization in the RSoft CAD Environment provides a convenient method to optimize gratings with either FullWAVE or DiffractMOD.

Once the gratings have been built, the Bidirectional Scattering Distribution Function (BSDF) information and layout files can be exported directly to LightTools to define a surface property.  All diffractive properties are included in the RSoft BSDF files, which contain information about how a surface (thin film, patterns, etc.) scatters light. 
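The diffraction angles these tools optimize follow the classical grating equation, sin(θₘ) = sin(θᵢₙ) + m·λ/d, for a grating of period d. A minimal sketch, independent of the RSoft tools, with illustrative wavelength and period values:

```python
import math

# Illustrative grating-equation check (not RSoft output): for a grating of
# period d, transmitted order m satisfies
#     sin(theta_m) = sin(theta_in) + m * wavelength / d.
# Orders with |sin(theta_m)| > 1 are evanescent and carry no propagating light.

def diffraction_angle_deg(wavelength_nm: float, period_nm: float,
                          m: int, theta_in_deg: float = 0.0):
    s = math.sin(math.radians(theta_in_deg)) + m * wavelength_nm / period_nm
    if abs(s) > 1.0:
        return None  # evanescent order
    return math.degrees(math.asin(s))

# 532 nm (green) light at normal incidence on a 600 nm period grating:
for m in (-1, 0, 1, 2):
    print(m, diffraction_angle_deg(532.0, 600.0, m))
# orders -1 and +1 propagate at about +/-62.4 deg; order +2 is evanescent
```

For waveguide in-coupling, the period is chosen so the first-order angle inside the glass exceeds the critical angle, trapping the light by total internal reflection; rigorous tools like DiffractMOD and FullWAVE are then needed to compute how much energy each order actually carries.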


How do you design augmented reality optics for an automotive application?

A head-up display (HUD) augments a driver's field of view with an image from a display. Software is needed to model the rays traveling through the windshield, and also to evaluate the quality of the projected image.


CODE V has powerful features for the HUD design space, tackling a wide range of opto-mechanical system design challenges. Engineers can use this optical design software for CAD visualization and import for ray tracing. CODE V can accommodate new freeform surfaces, offering flexible design degrees of freedom for compact designs when modeling windshields as the combiner. New freeform surface types allow for enhanced aberration control as display resolution at the viewer's eye increases (higher-density display pixels) combined with more compact form factors.

After design completion, it is important to check the final system performance not only against nominal criteria but also against actual as-built performance. For this, LightTools is the logical next step for viewer simulation. A reverse ray trace from spectral color objects representing a display image in LightTools shows the projected HUD image for a viewer (onto a model scene). A LightTools simulation can also help uncover unforeseen issues with stray images or reflections in the system. Engineers can also use LightTools CAD import and measurement tools to determine:

  • Eyebox to windshield distance

  • Approximate angle of incidence on windshield

  • Windshield to dashboard distance

LightTools simulation of real-world optical system performance is a key strength for Synopsys Optical Solutions users in their engineering and design work.
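Two of the measurements listed above can be roughed out geometrically before any detailed simulation. The sketch below assumes a flat combiner, a horizontal driver gaze, and illustrative rake angle and distances (not real vehicle data):

```python
# Quick geometric estimates for HUD layout (all numbers are illustrative
# assumptions, not measured vehicle data).

def aoi_on_windshield_deg(rake_deg: float) -> float:
    """Windshield tilted rake_deg from horizontal, horizontal driver gaze:
    the angle of incidence, measured from the windshield normal, is 90 - rake."""
    return 90.0 - rake_deg

def virtual_image_distance_mm(eye_to_windshield_mm: float,
                              windshield_to_pgu_mm: float) -> float:
    """For an idealized flat combiner, the virtual image appears at the total
    unfolded optical path length from the eye."""
    return eye_to_windshield_mm + windshield_to_pgu_mm

print(aoi_on_windshield_deg(30.0))               # -> 60.0 deg on steeply raked glass
print(virtual_image_distance_mm(750.0, 1250.0))  # -> 2000.0 mm virtual image distance
```

Real windshields are curved and wedged, so these numbers only seed the design; the curved combiner acts as a weak mirror and shifts the virtual image distance, which is exactly what the CODE V and LightTools ray traces capture.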

What is the difference between augmented reality and virtual reality optics?

The key difference is that virtual reality simulates the entire visual environment, while augmented reality optics capture the real environment and map the simulated environment onto it, presenting them together using a visual display.

In augmented reality, the display is often see-through to combine the simulated environment with the real environment. In virtual reality (VR), the display only needs to output the simulated environment. 

There are a few differences between AR and VR optics.

  • First, AR requires high-luminance displays, especially for bright environments such as outdoors and operating rooms.
  • Second, dipvergence should be less than 1 to 3 arc minutes for see-through HMDs for AR.
  • Lastly, see-through HMDs often follow a folded design to enable a wide FOV and compact form factor. See-through HMDs must integrate an optical combiner to combine reflected light from the virtual scene and transmitted light from real-world objects. For prototyping, a beam splitter is often used as a combiner. To reduce the form factor of the combiner, holographic optical elements (HOEs) can be used, as they are thin and flat and can function like a beam splitter for a specific wavelength.
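The first bullet's luminance requirement can be estimated from the combiner's split between reflectance R (virtual image path) and transmittance T (real-world path): the display luminance seen at the eye must exceed the transmitted background by some contrast factor. All values below are illustrative assumptions:

```python
# Rough see-through luminance budget (illustrative T, R, and contrast values):
# virtual content reaching the eye is R * L_display; the real-world background
# seen through the combiner is T * L_ambient. For visibility we require
#     R * L_display >= contrast * T * L_ambient.

def required_display_nits(ambient_nits: float, T: float = 0.8,
                          R: float = 0.2, contrast: float = 1.5) -> float:
    """Display luminance so virtual content has `contrast` x the seen background."""
    return contrast * T * ambient_nits / R

print(required_display_nits(10_000))  # bright outdoor scene -> 60000.0 nits
print(required_display_nits(500))     # indoor room          -> 3000.0 nits
```

A bright outdoor scene of ~10,000 nits already pushes the display requirement to tens of thousands of nits under these assumptions, which is why high-luminance microdisplays are a central AR hardware challenge.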

What are some real-world applications for augmented reality optics?

  • HUDs for driving. 
  • Surgery, superimposing help and procedure instructions. 
  • Combat aid.
  • Engineering, to aid the 3D design of architecture and products.
  • Social interaction, e.g., engaging with real and virtual audiences at the same time.
  • Entertainment, including gaming that involves the real environment and tourism activities that explain a historical scene or its changes over time.
  • Education, e.g., textbooks that superimpose explanations on real environments.


[1] Jui-Yi Wu and Jonghyun Kim, "Prescription AR: a fully-customized prescription-embedded augmented reality display," Opt. Express 28, 6225-6241 (2020).