Invention Title:

MIXED REALITY INTERACTION WITH EYE-TRACKING TECHNIQUES

Publication number:

US20250199604

Publication date:

Section:

Physics

Class:

G06F 3/011

Inventors:

Assignee:

Applicant:

Smart overview of the Invention

The patent application explores an innovative approach to user interactions in mixed reality (MR) environments by integrating eye-tracking techniques with various secondary input methods. The system determines the user's gaze to identify a location of interest (LOI) within both digital content and the real world. This gaze-based identification enhances the precision and intuitiveness of MR interactions.
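
At a high level, the gaze-to-LOI step can be pictured as casting a ray from the eye along the gaze direction and keeping the nearest thing it hits, whether that is digital content or reconstructed real-world geometry. The Python sketch below illustrates this idea under simplifying assumptions (targets reduced to bounding spheres; names such as Target and find_loi are invented for illustration) and is not the application's actual implementation.

    # Illustrative sketch only: pick the location of interest (LOI) by casting
    # the gaze ray against both virtual content and real-world geometry, here
    # simplified to bounding spheres. All names and types are assumptions.
    import math
    from dataclasses import dataclass

    def _sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

    def _dot(a, b):
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

    @dataclass
    class Target:
        name: str
        center: tuple     # world-space position (x, y, z)
        radius: float     # bounding-sphere radius
        is_virtual: bool  # True for digital content, False for real-world surfaces

    def ray_sphere_distance(origin, direction, target):
        """Distance along a unit gaze ray to the target's bounding sphere, or None."""
        oc = _sub(origin, target.center)
        b = _dot(oc, direction)
        c = _dot(oc, oc) - target.radius ** 2
        disc = b * b - c
        if disc < 0:
            return None
        t = -b - math.sqrt(disc)
        return t if t >= 0 else None

    def find_loi(origin, direction, targets):
        """Return the nearest target hit by the gaze ray, i.e. the LOI candidate."""
        hits = [(ray_sphere_distance(origin, direction, t), t) for t in targets]
        hits = [(d, t) for d, t in hits if d is not None]
        return min(hits, key=lambda h: h[0])[1] if hits else None

    # Example: gaze aimed at a virtual panel, with a real table also in the scene.
    targets = [Target("menu_panel", (0.0, 1.5, -2.0), 0.3, True),
               Target("table", (1.0, 0.8, -2.5), 0.6, False)]
    loi = find_loi((0.0, 1.6, 0.0), (0.0, 0.0, -1.0), targets)  # -> menu_panel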

Eye-Tracking Techniques

Eye-tracking is central to the system because it determines where the user is looking. By analyzing the user's gaze direction, fixations, and saccades, the system can accurately identify the LOI. This capability forms the basis for all subsequent interaction, letting users engage with their environment in a more natural and seamless way.
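
As a rough illustration of the fixation/saccade distinction, a dispersion-threshold check over a short window of gaze samples is one common way to decide when the gaze is stable enough to anchor an LOI. The threshold value and sample format in the sketch below are assumptions, not figures from the application.

    # Illustrative sketch: classify a short window of gaze samples as a fixation
    # (gaze stable enough to anchor the LOI) or part of a saccade, using a simple
    # dispersion threshold. The threshold and sample format are assumed values.
    def is_fixation(samples, max_dispersion_deg=1.0):
        """samples: list of (azimuth_deg, elevation_deg) gaze angles over ~100-200 ms."""
        if not samples:
            return False
        az = [s[0] for s in samples]
        el = [s[1] for s in samples]
        dispersion = (max(az) - min(az)) + (max(el) - min(el))
        return dispersion <= max_dispersion_deg

    def fixation_centroid(samples):
        """Mean gaze angles of a fixation, used to aim the LOI-selection ray."""
        az = sum(s[0] for s in samples) / len(samples)
        el = sum(s[1] for s in samples) / len(samples)
        return az, el

    # Example: nearly constant gaze angles are treated as a fixation.
    window = [(10.1, -4.9), (10.2, -5.0), (10.0, -5.1)]
    if is_fixation(window):
        gaze_direction = fixation_centroid(window)  # approximately (10.1, -5.0)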

Secondary Input Methods

Once an LOI is identified through eye-tracking, various secondary inputs can be employed to interact with it. These inputs include:

  • Finger gestures
  • Hand gestures
  • Eye gestures
  • Wrist band device input
  • Handheld controller input

These methods give users multiple ways to manipulate or engage with the LOI, enhancing flexibility and the overall user experience (see the sketch below).
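
One plausible way to accommodate such heterogeneous inputs is to normalize each source into a common event type so downstream interaction logic does not care which device produced it. The Python sketch below shows the idea using invented names (InputSource, SecondaryInput) and gesture strings; it is an illustration, not the application's design.

    # Illustrative sketch: normalize the listed secondary input sources into one
    # event type so downstream interaction logic can treat them uniformly.
    # Enum members, gesture strings, and fields are assumptions for illustration.
    from dataclasses import dataclass
    from enum import Enum, auto

    class InputSource(Enum):
        FINGER_GESTURE = auto()
        HAND_GESTURE = auto()
        EYE_GESTURE = auto()
        WRIST_BAND = auto()
        HANDHELD_CONTROLLER = auto()

    @dataclass
    class SecondaryInput:
        source: InputSource
        gesture: str            # e.g. "pinch", "double_blink", "trigger_pull"
        magnitude: float = 1.0  # normalized strength, e.g. pinch distance or stick deflection

    # Two very different devices producing events the system can treat alike:
    pinch = SecondaryInput(InputSource.FINGER_GESTURE, "pinch", 0.7)
    trigger = SecondaryInput(InputSource.HANDHELD_CONTROLLER, "trigger_pull", 1.0)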

Interactive Actions

The combination of eye-tracking and secondary inputs enables a range of interactive actions on the LOI. Users can perform tasks such as zooming, rotating, panning, moving objects, or opening actionable menus. This functionality allows for a more dynamic and responsive interaction model within MR environments, catering to diverse user needs and preferences.
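
A compact way to picture this step is a mapping from recognized secondary-input gestures to actions applied at the gaze-selected LOI. The gesture names and bindings in the sketch below are illustrative assumptions, not the application's actual mappings.

    # Illustrative sketch: once gaze has fixed the LOI, a secondary input selects
    # which interactive action to apply to it. The gesture-to-action bindings are
    # assumptions for illustration, not taken from the application.
    ACTION_MAP = {
        "pinch":        "zoom",
        "twist":        "rotate",
        "swipe":        "pan",
        "grab_drag":    "move",
        "double_blink": "open_menu",
    }

    def apply_action(loi_name, gesture, magnitude=1.0):
        """Dispatch the action implied by a secondary input to the named LOI."""
        action = ACTION_MAP.get(gesture)
        if loi_name is None or action is None:
            return None
        # A real system would call into the MR scene graph; here we only report
        # what would happen.
        return f"{action} on {loi_name} (strength {magnitude:.2f})"

    # Example: apply_action("menu_panel", "pinch", 0.7) -> "zoom on menu_panel (strength 0.70)"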

Applications and Benefits

By harnessing eye-tracking alongside other input methods, this system significantly improves user engagement and control in MR settings. It offers potential applications across various fields such as gaming, education, design, and remote collaboration. The enhanced interaction model promises a more immersive experience by aligning digital interfaces more closely with natural human behaviors.