Gazing into the future of augmented reality glasses

Augmented reality (AR) sits between virtual reality (VR) and the real world.

In VR systems, the user’s perception of the world around them is entirely based on computer-generated information. This is typically presented using a head-mounted display (HMD): headgear that totally occludes the wearer’s vision and provides a screen on which the generated virtual world is displayed.

VR systems are used in many different applications, but one of the most recognisable uses of VR is in the gaming industry. In this case, the user is presented with a completely virtual world with which they can interact.

In AR systems, the user’s perception of the world around them is enhanced, or changed, based on computer-generated information overlaid on the user’s normal perception of reality.

A well-known example of an AR system would be a stargazing software application for a smartphone. In this example, the software application presents a virtual overlay of star constellations on a normal camera output feed of the night sky.

AR glasses

AR systems do not need to fully occlude the user’s vision, and so can come in the form of less restrictive headwear or eyewear with a transparent screen, like the lenses of a pair of glasses, onto which the computer-generated information can be projected. This can be achieved in a compact arrangement, for example, by using waveguides within the lenses or by other means.

A simple AR system may only project fixed information onto the screen, whereas more complex AR systems may adapt the information being displayed, or where on the screen it is displayed, based on what the user is looking at and/or their eyeline.

To achieve this, the AR system may include an outward-facing camera for monitoring the user’s environment, so that the information being displayed can be adapted to the user’s surroundings. Similarly, it may include an inward-facing camera for monitoring the user’s eye, so that the user’s eyeline can be assessed and the position of the information on the screen adjusted accordingly.
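As a loose illustration of the second idea, the sketch below repositions a virtual label so that it stays near, but not directly over, the estimated gaze point, clamped to the display area. The function name, screen resolution, offset, and label size are illustrative assumptions, not any real headset’s API.

```python
# Sketch: keep an on-screen label anchored near the wearer's gaze point.
# All names and values here are illustrative assumptions.

def reposition_label(gaze_x, gaze_y, screen_w=1920, screen_h=1080,
                     offset=(40, -40), label_size=(200, 50)):
    """Place a label's top-left corner near the gaze point, clamped on-screen."""
    x = gaze_x + offset[0]
    y = gaze_y + offset[1]
    # Clamp so the whole label stays within the display area.
    x = max(0, min(x, screen_w - label_size[0]))
    y = max(0, min(y, screen_h - label_size[1]))
    return (x, y)
```

In practice the gaze estimate would be refreshed every frame, so a real system would also smooth the label’s position to avoid jitter.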

Medical uses for AR glasses

There have been some fascinating use cases for AR glasses in the medical field.

In current examples of AR glasses, eye tracking technology utilises a mixture of infrared (IR) light sources and cameras to continuously monitor the movement and position of the wearer’s eyes. The cameras capture pupil position and corneal reflections of the IR light, allowing the system to determine where the wearer is looking. A real-time processing unit in the AR glasses calculates the wearer’s eyeline, and their particular gaze point. This information is used for a range of interactions within the AR environment, like selecting objects, navigating menus, and adjusting the focus of virtual elements to match the wearer’s gaze. For example, if the wearer is looking at text, the text can be highlighted or magnified.
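The pupil/corneal-reflection approach described above can be sketched as follows: the vector from the IR glint to the pupil centre is mapped to a point on the display using coefficients obtained during a user calibration. Here a simple affine mapping stands in for the polynomial or model-based mappings real systems use; the function names and coefficients are illustrative assumptions.

```python
# Sketch of pupil-centre / corneal-reflection (PCCR) gaze estimation.
# The calibration coefficients below are illustrative, not from a real device.

def glint_pupil_vector(pupil_centre, glint_centre):
    """Vector from the corneal reflection (glint) to the pupil centre, in pixels."""
    return (pupil_centre[0] - glint_centre[0],
            pupil_centre[1] - glint_centre[1])

def estimate_gaze(vector, calib):
    """Map the glint-pupil vector to display coordinates.

    calib = (ax, bx, cx, ay, by, cy), found during calibration:
        gaze_x = ax*vx + bx*vy + cx
        gaze_y = ay*vx + by*vy + cy
    """
    vx, vy = vector
    ax, bx, cx, ay, by, cy = calib
    return (ax * vx + bx * vy + cx, ay * vx + by * vy + cy)
```

The calibration step (the wearer fixating a few known targets while the glint-pupil vectors are recorded) is what ties the eye-camera coordinates to positions on the display.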

It is easy to see the fantastic opportunities these systems offer as assistive technologies for people with disabilities or various diseases. For example, an AR system may be adapted to highlight certain objects or obstacles for a visually impaired wearer. Further, AR glasses may be adapted to act as a computer input device for a wearer with impaired motor function, such as a wearer with motor neuron disease or paralysis from the neck down, using only eye movements to control the input.

Tracking eye movements enables AR systems to enhance the user experience by making interactions with the AR environment more realistic and immersive. It can also act as a means of controlling a virtual interface for wearers with impaired manual function.

However, the movements of the eye also represent a significant dataset that can be used to assess the health of the wearer, as well as to assist those with pre-diagnosed conditions.

One company making continued strides in the technical field of AR glasses is Lumus.

In a recent interview with Med-Tech News, David Goldman, VP of marketing at Lumus, discussed the potential medical applications of eye tracking in AR glasses, and how near-to-eye display eye tracking holds significant potential for the medtech and AR industries.

Eye tracking via AR glasses can be used to collect health data from the wearer without any invasive testing or procedures. For instance, the eye tracking components can monitor the wearer’s pupil size, pupillary response, blink rate, eye movements, and gaze direction to detect abnormalities and diagnose a number of different conditions and diseases, such as glaucoma, diabetic retinopathy, macular degeneration, concussions and even certain neurological disorders.

For example, people with glaucoma may have difficulty tracking moving objects with their eyes, which can be detected through the eye tracking function of AR glasses. For concussion detection, an AR system may compare post-incident eye tracking data against a baseline of normal eye tracking data collected before the incident, and flag any significant deviations.
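One simple way to frame the baseline comparison is as a z-score: how many standard deviations a post-incident metric sits from the wearer’s own pre-incident measurements. The metric names and the threshold below are illustrative assumptions, not clinical cut-offs.

```python
# Sketch: flag eye tracking metrics that deviate markedly from a personal
# baseline. Metric names and the threshold are illustrative assumptions.
from statistics import mean, stdev

def deviation_from_baseline(baseline_samples, post_incident_value):
    """Z-score of a post-incident metric against baseline samples."""
    mu = mean(baseline_samples)
    sigma = stdev(baseline_samples)
    return (post_incident_value - mu) / sigma

def flag_metrics(baseline, post_incident, threshold=2.0):
    """Return the names of metrics whose z-score magnitude exceeds the threshold."""
    return [name for name, samples in baseline.items()
            if abs(deviation_from_baseline(samples, post_incident[name])) > threshold]
```

Using each wearer’s own baseline, rather than a population average, is what lets per-person variation (some people naturally blink more, or have slower saccades) drop out of the comparison.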

Similarly, pupil size and eye blink rate can be used to determine stress levels, attention, and fatigue – for example, pupils tend to dilate when people are stressed or tired, and people tend to blink more often when fatigued.
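The blink-rate part of this is straightforward to sketch: count blinks over a recent window of timestamps and compare the rate to a threshold. The threshold value here is an illustrative assumption, not a validated fatigue cut-off.

```python
# Sketch: estimate blink rate from blink timestamps and apply a simple
# threshold. The 25 blinks/min cut-off is an illustrative assumption.

def blinks_per_minute(blink_timestamps_s, window_s=60.0):
    """Blink rate over the most recent window of blink timestamps (seconds)."""
    if not blink_timestamps_s:
        return 0.0
    latest = max(blink_timestamps_s)
    recent = [t for t in blink_timestamps_s if t > latest - window_s]
    return len(recent) * (60.0 / window_s)

def looks_fatigued(rate_bpm, threshold_bpm=25.0):
    """Flag an elevated blink rate as a possible sign of fatigue."""
    return rate_bpm > threshold_bpm
```

A real system would combine blink rate with pupil size and other signals before drawing any conclusion, since each signal alone is easily confounded (by dry eyes, lighting changes, and so on).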

When it comes to neurological disorders, AR eye tracking can serve as a screening tool for disorders such as Alzheimer’s and Parkinson’s disease by continuously monitoring a person’s pupils and eye movement patterns. The technology looks for irregularities relative to a baseline measurement, such as impaired smooth pursuit (the smooth eye movement that keeps the gaze fixed on a moving object), abnormal saccades (the rapid eye movements that shift the gaze from one point to another), or altered pupillary responses.
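The smooth-pursuit check, in particular, is often summarised as pursuit "gain": eye velocity divided by target velocity. Healthy pursuit gain is close to 1, and a markedly low gain suggests the eye is lagging the moving target. The sketch below assumes evenly sampled eye and target positions in degrees; the 0.7 cut-off is an illustrative assumption.

```python
# Sketch: pursuit gain = mean eye speed / mean target speed over matched,
# evenly sampled positions (degrees). The cut-off is an illustrative assumption.

def pursuit_gain(eye_positions_deg, target_positions_deg, dt_s):
    """Mean |eye velocity| divided by mean |target velocity|."""
    def mean_speed(positions):
        steps = [abs(b - a) / dt_s for a, b in zip(positions, positions[1:])]
        return sum(steps) / len(steps)
    return mean_speed(eye_positions_deg) / mean_speed(target_positions_deg)

def pursuit_impaired(gain, low_cutoff=0.7):
    """Flag a gain well below 1 as possible impaired smooth pursuit."""
    return gain < low_cutoff
```

In practice, catch-up saccades would first be detected and excluded from the velocity estimate, otherwise they inflate the apparent pursuit speed.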

Conclusion

As outlined above, AR eye tracking can provide valuable insights into the symptoms of various conditions, particularly as a means for early detection and continuous monitoring.

It is always exciting to see developments such as these in fields that have been relying on the same diagnostic and monitoring tools for a long time. It will be interesting to see if these concepts can be implemented across a wider range of conditions in the future.