US20260050167
2026-02-19
Physics
G02B27/0172
The patent application describes a head-mounted display system designed to enhance virtual and augmented reality experiences by using eye metrics for interaction. The system includes a display worn by the user, inward-facing sensors or cameras that monitor the eyes, and processing electronics. Its primary function is to initiate or drive virtual-content activity from eye inputs such as gaze direction, blinks, and other eye gestures.
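The mapping from eye inputs to content activity can be sketched as a simple decision function. This is a minimal illustration, not the patent's method: the field names, thresholds, and action labels below are assumptions invented for the example.

```python
from dataclasses import dataclass

# Hypothetical eye-metric sample; field names are illustrative,
# not drawn from the patent text.
@dataclass
class EyeMetrics:
    gaze_direction: tuple  # normalized (x, y) position on the display
    eyes_closed: bool
    blink_detected: bool

def select_action(metrics: EyeMetrics) -> str:
    """Map eye inputs to a virtual-content action: gaze, blinks,
    and closure drive content activity, as the summary describes."""
    if metrics.eyes_closed:
        return "pause_content"        # sustained closure suspends content
    if metrics.blink_detected:
        return "activate_gazed_item"  # blink acts as a click on the gaze target
    x, y = metrics.gaze_direction
    if abs(x) > 0.9 or abs(y) > 0.9:
        return "scroll_toward_gaze"   # gaze near a display edge pans the scene
    return "idle"

print(select_action(EyeMetrics((0.0, 0.0), False, True)))  # activate_gazed_item
```

In a real system the processing electronics would run this kind of mapping continuously on streamed sensor frames; the fixed edge threshold of 0.9 here is purely illustrative.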
The technology pertains to the fields of virtual reality (VR) and augmented reality (AR), focusing on new methods for interacting with digital content. Traditional VR systems present digital imagery while fully occluding the real world, whereas AR systems overlay digital information onto the user's view of real surroundings. This invention aims to improve user interaction within these environments by replacing conventional input tools with eye-based controls.
Advancements in sensor technology have enabled the passive detection of subtle user inputs, eliminating the need for traditional input devices like keyboards and mice. The system leverages this capability by using eye metrics to control virtual content. It can detect eye closure and gaze direction to alter the display state, enhancing the user's interaction with virtual environments.
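The closure-driven display-state change mentioned above can be illustrated as a small state machine that dims the display once the eyes stay closed past a threshold. This is a sketch under assumed parameters: the `DIM_AFTER_S` threshold and the state names are invented for illustration and do not come from the application.

```python
DIM_AFTER_S = 0.5  # assumed: dim once the eyes stay closed this long (seconds)

class DisplayController:
    """Toy controller altering display state from eye-closure input."""

    def __init__(self):
        self.state = "active"
        self.closed_for = 0.0  # accumulated closure time

    def update(self, eyes_closed: bool, dt: float) -> str:
        """Advance the controller by dt seconds of sensor data."""
        if eyes_closed:
            self.closed_for += dt
            if self.closed_for >= DIM_AFTER_S:
                self.state = "dimmed"  # suspend rendering while eyes are shut
        else:
            self.closed_for = 0.0
            self.state = "active"      # eyes open: restore the display
        return self.state

ctrl = DisplayController()
ctrl.update(True, 0.3)         # closure below threshold: still "active"
print(ctrl.update(True, 0.3))  # dimmed
```

The threshold keeps ordinary blinks (tens of milliseconds) from toggling the display, so only deliberate or sustained closure changes state.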
This eye-driven interaction method offers significant potential for VR and AR applications. By using natural eye movements as input, the system provides a more intuitive and immersive user experience. It can be applied in fields such as gaming, training simulations, and virtual meetings, where seamless and efficient interaction with digital content is crucial.