US20240282059
2024-08-22
Physics
G06T19/006
A wearable device is designed to enhance the user experience by integrating augmented reality (AR) capabilities. The device includes a camera, a display, and processing circuitry that together capture images of the real world and present them alongside three-dimensional (3D) virtual images. The goal is seamless interaction between real and virtual environments, with potential applications in the metaverse.
The device uses its camera to obtain external images representing the real environment around the user. These external images are stored in a dedicated command buffer, separate from the display buffer used for rendering the visual output. This separation simplifies data management and improves the device's processing efficiency.
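The buffer separation might be modeled as in the minimal C++ sketch below. All names (CommandBuffer, WearableDevice, captureExternalImage) are hypothetical illustrations, not terms from the patent, and the buffers are simplified to single images with one representative depth value each.

```cpp
#include <cstdint>
#include <vector>

// Hypothetical types: the camera feed is recorded into its own command
// buffer, kept apart from the buffer that drives the display, so each can
// be filled and managed independently.
struct Pixel { uint8_t r, g, b, a; };

struct CommandBuffer {
    std::vector<Pixel> pixels;  // image data recorded into this buffer
    float depth;                // representative depth, used later for ordering
};

struct WearableDevice {
    CommandBuffer externalImageBuffer;  // holds frames captured by the camera
    CommandBuffer displayBuffer;        // holds the composited output frame

    // Store a captured camera frame in its dedicated buffer, leaving the
    // display buffer untouched until composition time.
    void captureExternalImage(const std::vector<Pixel>& frame, float depth) {
        externalImageBuffer.pixels = frame;
        externalImageBuffer.depth  = depth;
    }
};
```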
To render real and virtual elements accurately, the device adjusts the processing order among multiple command buffers. This adjustment is based on depth information derived from the external images, the 3D images, and any virtual objects present. Ordering the buffers by depth ensures that nearer content correctly occludes farther content, producing a coherent visual experience for the user.
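One plausible realization of this reordering, sketched below under the same single-depth-per-buffer assumption, is a back-to-front sort (painter's order). The BufferEntry record and orderByDepth function are assumed names for illustration only.

```cpp
#include <algorithm>
#include <vector>

// Hypothetical per-buffer record pairing a command buffer with a
// representative depth derived from the external images, the 3D images,
// and any virtual objects.
struct BufferEntry {
    int   bufferId;  // which command buffer this entry refers to
    float depth;     // representative depth (larger = farther from the user)
};

// Order the command buffers back-to-front so that nearer content is
// processed last and ends up drawn over farther content.
void orderByDepth(std::vector<BufferEntry>& buffers) {
    std::sort(buffers.begin(), buffers.end(),
              [](const BufferEntry& a, const BufferEntry& b) {
                  return a.depth > b.depth;  // farthest first
              });
}
```

Back-to-front processing is a simple way to resolve occlusion without a per-pixel depth test; a real device might instead keep per-pixel depth, but the patent summary only specifies that buffer order depends on depth information.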
The rendering process combines the contents of the various command buffers to generate a final screen, which is then shown on the wearable device's display. Users thus see an integrated view of their surroundings enriched with virtual elements, with all components rendered in consistent depth order so that depth perception and realism are preserved.
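The combination step could look like the sketch below, which composites already depth-ordered layers into one screen image using standard "over" alpha blending. The summary does not say how buffers are blended, so the alpha-compositing choice and the composeScreen name are assumptions for illustration.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

struct Pixel { uint8_t r, g, b, a; };
using Layer = std::vector<Pixel>;

// Composite depth-ordered layers (back-to-front) into a single screen
// image; the result is what the wearable's display shows.
Layer composeScreen(const std::vector<Layer>& orderedLayers,
                    std::size_t pixelCount) {
    Layer screen(pixelCount, Pixel{0, 0, 0, 255});  // start from opaque black
    for (const Layer& layer : orderedLayers) {
        for (std::size_t i = 0; i < pixelCount; ++i) {
            const Pixel& src = layer[i];
            Pixel& dst = screen[i];
            float alpha = src.a / 255.0f;  // source coverage in [0, 1]
            dst.r = static_cast<uint8_t>(src.r * alpha + dst.r * (1.0f - alpha));
            dst.g = static_cast<uint8_t>(src.g * alpha + dst.g * (1.0f - alpha));
            dst.b = static_cast<uint8_t>(src.b * alpha + dst.b * (1.0f - alpha));
        }
    }
    return screen;
}
```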
The technology described has significant implications for future applications, particularly in metaverse services that rely on high-speed networks like 5G and 6G. By enhancing interconnectivity between real and virtual objects, this wearable device could revolutionize how users interact with digital content in their everyday environments, paving the way for new AR experiences.