US20240264795
2024-08-08
Physics
G06F3/165
An attraction system enhances guest experiences in interactive spaces using augmented reality (AR) and virtual reality (VR) imagery. The system features a display that presents immersive visual content, while an audio controller manages a network of speakers positioned throughout the space. This arrangement produces interactive audio that corresponds to guests' actions and movements, creating a more engaging environment.
The system operates through a controller whose processors analyze data reflecting the guest's state, which may include movements, gestures, or facial expressions. Based on this data, the AR/VR imagery is dynamically adjusted, and the audio controller is instructed to produce interactive audio that appears to emanate from specific elements within the AR/VR environment. This integration keeps the audio and visual components synchronized, reinforcing immersion.
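To illustrate how an audio controller might make a sound appear to emanate from a specific virtual element, the following minimal Python sketch applies inverse-distance panning across a 2D speaker layout. The Speaker class, the speaker_gains function, and the rolloff parameter are illustrative assumptions; the patent does not specify a panning method, and a real system might use VBAP or ambisonics instead.

    import math
    from dataclasses import dataclass

    @dataclass
    class Speaker:
        x: float  # speaker position in room coordinates (meters)
        y: float

    def speaker_gains(source_xy, speakers, rolloff=1.0):
        # Weight each speaker by inverse distance to the virtual sound
        # source, so the speakers nearest the source play loudest.
        weights = [1.0 / (1.0 + rolloff * math.hypot(s.x - source_xy[0],
                                                     s.y - source_xy[1]))
                   for s in speakers]
        total = sum(weights)
        return [w / total for w in weights]  # normalized gains, sum to 1.0

For example, speaker_gains((2.0, 3.0), [Speaker(0, 0), Speaker(4, 0), Speaker(2, 5)]) returns gains that favor the speaker nearest the point (2, 3), so the sound seems to come from that location.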
A non-transitory computer-readable medium stores instructions for generating the interactive audio. These instructions enable processors to receive sensor data and identify guest interactions with interactive objects. After determining the audio that corresponds to each interaction, the system directs the speaker array to output sounds as if they originate from the interactive objects themselves, further enriching the experience.
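A sketch of that instruction flow, assuming 2D positions and a hypothetical audio_controller.play_at call (the patent names no API and no specific detection logic):

    from dataclasses import dataclass

    @dataclass
    class InteractiveObject:
        name: str
        position: tuple        # (x, y) in room coordinates
        trigger_radius: float  # distance within which a touch counts
        sound_id: str          # audio clip tied to this object

    def find_interaction(hand_xy, objects):
        # Identify the first object whose trigger zone contains the
        # guest's hand position reported by the sensors.
        for obj in objects:
            dx = hand_xy[0] - obj.position[0]
            dy = hand_xy[1] - obj.position[1]
            if dx * dx + dy * dy <= obj.trigger_radius ** 2:
                return obj
        return None

    def on_sensor_frame(hand_xy, objects, audio_controller):
        # Route the matched object's sound so it appears to come
        # from the object's own location in the room.
        obj = find_interaction(hand_xy, objects)
        if obj is not None:
            # play_at is a stand-in for whatever call drives the
            # speaker array; it is not taken from the patent.
            audio_controller.play_at(obj.sound_id, obj.position)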
The method for providing interactive audio begins with receiving user inputs that indicate trajectories of virtual objects within a virtual space. These inputs allow the system to map movements from the virtual space to the physical space, so that the AR/VR imagery reflects actions in real time. Interactive audio is then generated from these trajectories, giving guests soundscapes synchronized with their interaction with the environment.
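One way to read the virtual-to-physical mapping is as a calibrated transform applied to each trajectory point. The uniform scale-and-offset transform below is an assumption for illustration; an actual attraction would calibrate whatever transform its space requires.

    def virtual_to_physical(v_xy, scale=1.0, offset=(0.0, 0.0)):
        # Map a virtual-space point into room coordinates. A uniform
        # scale and offset is assumed here for simplicity.
        return (v_xy[0] * scale + offset[0], v_xy[1] * scale + offset[1])

    def audio_positions(trajectory, scale=1.0, offset=(0.0, 0.0)):
        # Convert a virtual object's trajectory into the sequence of
        # physical positions at which its sound should be rendered.
        return [virtual_to_physical(p, scale, offset) for p in trajectory]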
The attraction system aims to create a fully immersive experience by combining interactive audio with AR/VR imagery and physical objects. Guests are continuously monitored through sensors that track their movements and gestures, and this data drives real-time adjustments to both audio and visual elements, so that guest actions shape the experience. For instance, sounds may vary in volume or direction based on the guest's position relative to virtual or physical stimuli.
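The volume and direction behavior in that example could reduce to two small functions, sketched below. The inverse-distance attenuation model and the function names are assumptions, not taken from the patent.

    import math

    def volume_for_guest(guest_xy, stimulus_xy, base_volume=1.0, rolloff=0.5):
        # Sounds get quieter as the guest moves away from the stimulus.
        d = math.hypot(guest_xy[0] - stimulus_xy[0],
                       guest_xy[1] - stimulus_xy[1])
        return base_volume / (1.0 + rolloff * d)

    def bearing_to_stimulus(guest_xy, stimulus_xy):
        # Direction from the guest to the stimulus, in degrees
        # (0 = +x axis), usable for steering the apparent audio direction.
        return math.degrees(math.atan2(stimulus_xy[1] - guest_xy[1],
                                       stimulus_xy[0] - guest_xy[0]))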