Invention Title:

INTERFACES, SYSTEMS AND APPARATUSES FOR CONSTRUCTING 3D AR ENVIRONMENT OVERLAYS, AND METHODS FOR MAKING AND USING SAME

Publication number:

US20240412510

Section:

Physics

Class:

G06V20/20

Smart overview of the Invention

The patent application describes systems, interfaces, apparatuses, and methods for constructing 3D augmented reality (AR) environment overlays. This involves capturing and displaying images, identifying objects within those images, and generating a 3D AR environment overlaid on the captured image. A key feature is a ray pointer that enhances interaction with both the image and the virtual constructs in the AR environment. These constructs correspond to real-world objects or attributes identified in the image and can be selected, activated, animated, or manipulated within the AR space.
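
As a rough illustration of the ray-pointer interaction described above, the following Python sketch hit-tests a pointing ray against bounding spheres around virtual constructs and returns the nearest one hit. The names (Ray, Construct, pick) and the sphere-based hit test are illustrative assumptions, not details taken from the application.

    # Hypothetical ray-pointer picking sketch; names are not from the patent.
    from dataclasses import dataclass
    import math

    @dataclass
    class Ray:
        origin: tuple      # (x, y, z) position of the pointer
        direction: tuple   # unit vector along the pointing direction

    @dataclass
    class Construct:
        name: str
        center: tuple      # world-space position of the virtual construct
        radius: float      # bounding-sphere radius used for hit testing

    def pick(ray, constructs):
        """Return the nearest construct whose bounding sphere the ray hits."""
        best, best_t = None, math.inf
        for c in constructs:
            oc = tuple(cc - oo for cc, oo in zip(c.center, ray.origin))
            t = sum(d * o for d, o in zip(ray.direction, oc))  # projection onto the ray
            if t < 0:
                continue  # construct lies behind the pointer
            closest = tuple(o + t * d for o, d in zip(ray.origin, ray.direction))
            miss2 = sum((p - q) ** 2 for p, q in zip(closest, c.center))
            if miss2 <= c.radius ** 2 and t < best_t:
                best, best_t = c, t
        return best

    scene = [Construct("lamp", (0, 1, 2), 0.3), Construct("chair", (1, 0, 3), 0.5)]
    hit = pick(Ray((0, 0, 0), (0.0, 0.35, 0.94)), scene)  # roughly toward the lamp

Bounding spheres are only one cheap proxy; a real implementation might test full meshes or rely on the AR platform's own hit-testing.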

Background

This invention falls within the realm of selection interfaces commonly used in computer software. Traditional interfaces rely on "hard selection" protocols such as clicking or tapping. Previous innovations by the inventor introduced motion-based systems that use changes in motion direction to perform commands such as scrolling or activating functions. This application builds on those concepts by enabling selection and manipulation through motion attributes that attract or otherwise influence target objects or their attributes.

General Aspects

The invention employs motion sensors to detect movement, which is then used to manipulate virtual or real objects. The sensors capture the proximity, direction, speed, and acceleration of a selection object as it moves toward desired targets. These targets can be real-world devices or virtual constructs that are controlled remotely. The selection object can be anything from a human body part to a machine part whose motion is used to interact with the system's interface.
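
The paragraph above names four motion attributes; a minimal sketch of deriving them from timestamped position samples follows. The sample layout and the target argument are assumptions made for illustration, not the patent's own data format.

    import math

    def motion_attributes(samples, target):
        """samples: list of (t, (x, y, z)) sensor readings, oldest first.
        target: (x, y, z) position of a candidate target object."""
        (t0, p0), (t1, p1), (t2, p2) = samples[-3:]
        v1 = tuple((b - a) / (t1 - t0) for a, b in zip(p0, p1))  # earlier velocity
        v2 = tuple((b - a) / (t2 - t1) for a, b in zip(p1, p2))  # latest velocity
        speed = math.hypot(*v2)
        direction = tuple(c / speed for c in v2) if speed else (0.0, 0.0, 0.0)
        # finite difference between the two velocity estimates
        accel = tuple((b - a) / ((t2 - t0) / 2) for a, b in zip(v1, v2))
        proximity = math.dist(p2, target)
        return proximity, direction, speed, accel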

Apparatuses and Systems

The apparatuses include user interfaces equipped with feedback units, motion sensors, and processing units. These components work together to detect motion within sensing zones and translate it into command outputs. Upon activation, the interface displays selectable objects that move relative to the selection object’s motion. Faster movement results in quicker attraction of these objects towards the selection object. This dynamic interaction allows for intuitive control over both real and virtual elements.
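
To make the "faster movement, quicker attraction" behavior concrete, here is a hedged per-frame update in which each selectable object steps toward the selection object by an amount proportional to the pointer's speed. The Selectable type and the GAIN constant are illustrative assumptions, not details from the application.

    from dataclasses import dataclass

    @dataclass
    class Selectable:
        name: str
        position: tuple  # (x, y, z)

    GAIN = 0.05  # assumed tuning constant: how strongly speed drives attraction

    def attract(objects, pointer_pos, pointer_speed, dt):
        """Move each selectable toward the pointer; faster motion pulls harder."""
        for obj in objects:
            to_ptr = tuple(p - o for p, o in zip(pointer_pos, obj.position))
            dist = sum(c * c for c in to_ptr) ** 0.5
            if dist == 0:
                continue
            step = min(GAIN * pointer_speed * dt, dist)  # clamp to avoid overshoot
            obj.position = tuple(
                o + step * c / dist for o, c in zip(obj.position, to_ptr)
            )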

Interface Dynamics

The interface enhances user interaction by adjusting attributes of selectable objects as they move closer to the selection object. This includes changes in size, color, or highlighting effects. The system uses a "gravity" effect where objects aligned with the motion are attracted while others are repelled. As selections become more certain, non-selected items fade or reposition themselves. Once selected, objects can trigger actions or reveal submenus for further interaction, creating a seamless user experience.
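
One plausible reading of the "gravity" effect, offered here only as an assumed sketch, scores each object by the dot product between the pointer's motion direction and the unit vector toward that object: well-aligned objects are emphasized while misaligned ones fade. The attribute names and thresholds below are illustrative, not taken from the patent.

    from dataclasses import dataclass

    @dataclass
    class MenuObject:
        name: str
        position: tuple
        opacity: float = 1.0
        highlighted: bool = False

    def gravity_pass(pointer_pos, motion_dir, objects):
        """Positive alignment attracts and highlights; misaligned items fade."""
        for obj in objects:
            to_obj = tuple(o - p for o, p in zip(obj.position, pointer_pos))
            norm = sum(c * c for c in to_obj) ** 0.5 or 1.0
            score = sum(m * c / norm for m, c in zip(motion_dir, to_obj))
            obj.opacity = max(0.2, score)   # assumed fade floor for misaligned items
            obj.highlighted = score > 0.9   # assumed threshold for near-certain picks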