Invention Title:

WEARABLE TERMINAL APPARATUS, PROGRAM, DISPLAY METHOD, AND VIRTUAL IMAGE DISTRIBUTION SYSTEM

Publication number:

US20240288931

Section:

Physics

Class:

G06F3/011

Smart Overview of the Invention

A wearable terminal apparatus for user interaction comprises a processor that controls a display unit. The display unit presents a virtual image of a polyhedron in a designated space, and the apparatus can display distinct images on different surfaces of the polyhedron, allowing the user to receive several kinds of information visually.

Technological Context

Virtual reality (VR), mixed reality (MR), and augmented reality (AR) technologies facilitate immersive experiences for users via head-mounted displays. These technologies allow users to perceive virtual images or environments that blend with or replace their real surroundings, depending on the mode of operation. The display of these virtual elements is contingent upon the user's position and orientation within their environment.

Functionality of the Display Method

The display method employed by the wearable terminal apparatus involves rendering a virtual polyhedron in space. Each surface of this polyhedron can present different images containing specific information, effectively creating an interactive experience for the user. This method enhances the user's ability to engage with both real and virtual elements simultaneously.
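As a rough sketch of this idea (not the patent's actual implementation), the polyhedron can be modelled as a set of faces, each carrying its own image, with simple back-face culling deciding which face images the viewer currently sees. All names and image identifiers here are illustrative:

```python
from dataclasses import dataclass

@dataclass
class Face:
    """One surface of the virtual polyhedron (hypothetical structure)."""
    normal: tuple    # outward unit normal of the face
    image_id: str    # identifier of the image shown on this face

def visible_faces(faces, view_dir):
    """Return image ids of faces oriented toward the viewer.

    A face is visible when its outward normal points against the
    viewing direction (dot product < 0): simple back-face culling.
    """
    visible = []
    for f in faces:
        dot = sum(n * v for n, v in zip(f.normal, view_dir))
        if dot < 0:
            visible.append(f.image_id)
    return visible

# A cube with a distinct image on each of its six faces.
cube = [
    Face(( 1, 0, 0), "weather"), Face((-1, 0, 0), "news"),
    Face(( 0, 1, 0), "clock"),   Face(( 0, -1, 0), "mail"),
    Face(( 0, 0, 1), "map"),     Face(( 0, 0, -1), "music"),
]

# A viewer looking along -z sees only the +z face.
print(visible_faces(cube, (0, 0, -1)))  # → ['map']
```

Rotating the polyhedron or moving the viewer changes which faces pass the culling test, which is what lets one virtual object carry several independent information surfaces.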

Components and User Interaction

The wearable terminal includes a body fitted with sensors such as depth sensors and cameras, along with a visor through which users see both their real-world surroundings and projected virtual images. The visor's light-transmitting properties let users perceive these virtual images as if they coexist with the actual environment, creating a seamless blend of reality and virtuality.
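A minimal sketch of how these components might compose, assuming hypothetical `DepthSensor` and `Visor` classes (the patent does not specify an API, and the sensor reading here is a placeholder value):

```python
from dataclasses import dataclass, field

@dataclass
class DepthSensor:
    """Measures distance to real-world surfaces (hypothetical API)."""
    def measure(self):
        return 2.5  # metres to nearest surface; placeholder reading

@dataclass
class Visor:
    """Light-transmitting display: the real scene passes through,
    virtual images are overlaid on top of it."""
    transmittance: float = 0.7
    def render(self, virtual_image):
        return f"overlay({virtual_image}, transmittance={self.transmittance})"

@dataclass
class WearableTerminal:
    depth: DepthSensor = field(default_factory=DepthSensor)
    visor: Visor = field(default_factory=Visor)

    def show(self, image):
        # Query the depth sensor so the virtual image can be placed
        # no closer than the nearest real surface.
        distance = self.depth.measure()
        return self.visor.render(image), distance
```

The composition mirrors the description above: the body aggregates the sensors, and the visor is the only component the user looks through.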

Calibration and Visible Region Detection

To optimize user experience, the wearable terminal apparatus incorporates calibration processes that define the visible region based on user orientation and position. This calibration ensures that virtual images are displayed within the user's field of view, enhancing interaction and immersion. Adjustments can be made automatically during normal operation to refine this visible region further.
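One simple way to model such a visible region (a sketch under assumed geometry, not the patent's calibration procedure) is as a cone with its apex at the user's position, its axis along the gaze direction, and a half-angle of half the field of view:

```python
import math

def in_visible_region(user_pos, gaze_dir, point, fov_deg=100.0):
    """Return True if `point` falls inside the user's visible region.

    The visible region is modelled as a cone: apex at the user's
    position, axis along the gaze direction, half-angle fov/2.
    (Hypothetical model; the 100-degree default is illustrative.)
    """
    vec = tuple(p - u for p, u in zip(point, user_pos))
    norm = math.sqrt(sum(c * c for c in vec)) or 1.0
    gnorm = math.sqrt(sum(c * c for c in gaze_dir)) or 1.0
    cos_angle = sum(v * g for v, g in zip(vec, gaze_dir)) / (norm * gnorm)
    return cos_angle >= math.cos(math.radians(fov_deg / 2))

# User at the origin gazing along +x with a 100-degree field of view.
near_axis = in_visible_region((0, 0, 0), (1, 0, 0), (2.0, 0.5, 0.0))  # ~14° off-axis
off_axis  = in_visible_region((0, 0, 0), (1, 0, 0), (0.0, 0.0, 3.0))  # 90° off-axis
```

Re-running this test as the user's position and orientation change is the kind of automatic refinement the calibration process performs, so that virtual images are only drawn where the user can actually see them.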