Invention Title:

MICRO HAND GESTURES FOR CONTROLLING VIRTUAL AND GRAPHICAL ELEMENTS

Publication number:

US20240393887

Publication date:

Section:

Physics

Class:

G06F3/017

Inventors:

Applicant:

Smart overview of the Invention

The described technology involves controlling virtual or graphical elements on a display through hand gestures detected by an eyewear device equipped with a camera system. The system captures video frames and processes them to recognize specific hand shapes that correspond to predefined gestures. Each gesture is linked to a particular action, enabling the control of virtual elements relative to the display. This includes micro-scale movements, such as a thumb sliding along an extended finger, which are used to manipulate interactive graphical elements like sliders.
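
To make the frame-to-action pipeline concrete, the sketch below shows one way such a mapping could be structured, using Python purely for illustration. The classifier stub, the gesture names, and the action callbacks are hypothetical stand-ins; the patent does not specify this code.

    from typing import Callable, Optional

    def classify_hand_shape(frame: object) -> Optional[str]:
        # Stand-in for the camera-based hand-shape classifier; a real
        # system would run a vision model on each captured video frame.
        return "pinch"

    def select_element() -> None:
        print("virtual element selected")

    def open_menu() -> None:
        print("menu opened")

    # Each predefined gesture is linked to a particular action.
    GESTURE_ACTIONS: dict[str, Callable[[], None]] = {
        "pinch": select_element,
        "open_palm": open_menu,
    }

    def process_frame(frame: object) -> None:
        gesture = classify_hand_shape(frame)
        action = GESTURE_ACTIONS.get(gesture) if gesture else None
        if action:
            action()

    process_frame(frame=None)  # prints "virtual element selected"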

Technical Field

This innovation pertains to display control in electronic devices, particularly wearables such as smart glasses. It focuses on real-time tracking of hand gestures and micro-movements for interacting with virtual objects and graphical elements. The technology leverages augmented reality (AR), which integrates virtual objects into the physical environment so that they appear and behave like real objects, enhancing user interaction.

Background

Modern electronic devices, including smartphones, tablets, laptops, and wearable devices like smart eyewear, utilize various cameras, sensors, and input systems for user interaction. Graphical user interfaces (GUIs) facilitate this interaction through elements like icons and sliders. Virtual reality (VR) creates immersive environments, while AR overlays digital information onto the real world, providing a blended experience that enhances perception and interaction.

Detailed Description

The method involves an eyewear device with a camera capturing video data to detect hand shapes. These shapes are matched against predefined gestures, each linked to an action that controls a virtual element on the display. The technology enables precise control of interactive graphical elements by establishing a scale along an extended finger and calibrating it to the scale of a graphical element, such as a slider. Small movements of the thumb along that finger scale then translate into correspondingly fine adjustments of the graphical element, as sketched below.
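
As an illustration of the finger-scale calibration, the following sketch normalizes the thumb's position along the segment from a finger's knuckle to its tip and maps that fraction onto a slider's value range. The landmark coordinates, function names, and value range are assumptions for the example, not details taken from the patent.

    def project_fraction(knuckle, fingertip, thumb_tip):
        # Project the thumb tip onto the knuckle->fingertip segment
        # and return its normalized position in [0, 1].
        fx, fy = fingertip[0] - knuckle[0], fingertip[1] - knuckle[1]
        tx, ty = thumb_tip[0] - knuckle[0], thumb_tip[1] - knuckle[1]
        length_sq = fx * fx + fy * fy
        if length_sq == 0.0:
            return 0.0
        t = (tx * fx + ty * fy) / length_sq
        return max(0.0, min(1.0, t))

    def slider_value(fraction, lo=0.0, hi=100.0):
        # Calibrate the finger scale to the slider's graphical range.
        return lo + fraction * (hi - lo)

    # Thumb halfway along the index finger maps to the slider midpoint.
    f = project_fraction((0.0, 0.0), (10.0, 0.0), (5.0, 1.0))
    print(slider_value(f))  # 50.0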

Additional Features

The eyewear device may also include touch-sensitive input areas for additional control options. Touch inputs can trigger various actions in the GUI, such as selecting items or navigating menus through taps and swipes. The system supports diverse orientations and draws on the computer-vision techniques that underpin AR to enrich the user experience. This flexibility allows the eyewear device to adapt its functionality to user needs and context.
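
As one hypothetical way to turn raw touchpad events into the tap and swipe actions mentioned above, the sketch below classifies a touch by its displacement and duration. The event fields and thresholds are illustrative assumptions, not values from the patent.

    from dataclasses import dataclass

    @dataclass
    class TouchEvent:
        x0: float          # touch-down position (normalized pad units)
        x1: float          # lift-off position
        duration_s: float  # contact duration in seconds

    SWIPE_MIN_DX = 0.2     # assumed minimum travel for a swipe
    TAP_MAX_S = 0.3        # assumed maximum duration for a tap

    def classify_touch(ev: TouchEvent) -> str:
        dx = ev.x1 - ev.x0
        if abs(dx) >= SWIPE_MIN_DX:
            return "swipe_right" if dx > 0 else "swipe_left"
        if ev.duration_s <= TAP_MAX_S:
            return "tap"
        return "hold"

    print(classify_touch(TouchEvent(0.1, 0.6, 0.25)))   # swipe_right
    print(classify_touch(TouchEvent(0.4, 0.42, 0.10)))  # tap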