US20250246084
2025-07-31
Physics
G08G5/55
A novel method for interacting with autonomous unmanned aerial vehicles (UAVs) involves creating a shared virtual environment based on perception inputs from sensor devices. These sensors, including image capture devices on the UAV, gather data while the UAV is in flight. The resulting virtual environment provides a dynamic representation of the physical world that is accessible to various network-connected devices, such as multiple UAVs and mobile computing devices. The environment supports visual augmentations and guides the UAV's autonomous navigation.
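As a rough illustration of how such a shared environment might be organized, the sketch below accumulates perception inputs from multiple devices into a simple voxel map that networked clients can query. The names (PerceptionInput, SharedVirtualEnvironment) and the voxel representation are assumptions made for illustration, not structures taken from the application.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class PerceptionInput:
    """One sensor observation (e.g., image-derived world points) from a device."""
    device_id: str
    timestamp: float
    points: List[Tuple[float, float, float]]  # world-frame points seen by the sensor

@dataclass
class SharedVirtualEnvironment:
    """Dynamic model of the physical world, built from many devices' inputs."""
    occupied: Dict[Tuple[int, int, int], float] = field(default_factory=dict)
    voxel_size: float = 0.5  # meters per voxel (assumed resolution)

    def integrate(self, obs: PerceptionInput) -> None:
        # Fold one device's perception input into the shared model.
        for x, y, z in obs.points:
            key = (int(x // self.voxel_size),
                   int(y // self.voxel_size),
                   int(z // self.voxel_size))
            self.occupied[key] = obs.timestamp  # remember when the voxel was last observed

    def snapshot(self) -> List[Tuple[int, int, int]]:
        # View served to network-connected clients (other UAVs, mobile devices).
        return list(self.occupied.keys())
```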
Autonomous vehicles navigate physical environments using perception inputs generated by onboard sensors. These inputs, such as images from cameras, are used to estimate the vehicle's position and orientation, and autonomous navigation systems use those estimates to maneuver the vehicle without direct human intervention. The described techniques enhance this process by integrating data from multiple sensors into a comprehensive virtual model of the environment.
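A minimal sketch of that estimation step is shown below, assuming a camera-derived position and yaw are blended into the running state with a fixed gain; the function name and the blend factor are hypothetical, not taken from the application.

```python
import numpy as np

def update_pose(prev_position: np.ndarray,
                prev_yaw: float,
                visual_position: np.ndarray,
                visual_yaw: float,
                blend: float = 0.3):
    """Blend a camera-derived position/orientation estimate into the running state.

    visual_position / visual_yaw stand in for whatever the perception pipeline
    extracts from captured images; the blend factor is an assumed tuning constant.
    """
    position = (1.0 - blend) * prev_position + blend * visual_position
    # Wrap the yaw difference so the blend behaves correctly near +/- pi.
    yaw_error = np.arctan2(np.sin(visual_yaw - prev_yaw),
                           np.cos(visual_yaw - prev_yaw))
    yaw = prev_yaw + blend * yaw_error
    return position, yaw

# Example: nudge a stale estimate toward the latest visual measurement.
pos, yaw = update_pose(np.zeros(3), 0.0, np.array([1.0, 0.2, 0.0]), 0.05)
```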
The UAVs discussed include rotor-based aircraft like quadcopters and fixed-wing aircraft. Each UAV is equipped with control actuators and various sensors for navigation and image capture. For instance, a quadcopter might have electrically driven rotors for propulsion and multiple stereoscopic image capture devices arranged around its perimeter for 360-degree coverage. These devices capture images for navigation and potentially for display to users, enhancing both autonomous flight capabilities and user interaction.
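The perimeter arrangement can be illustrated with a short sketch that spaces stereoscopic pairs evenly around the airframe and checks that their fields of view jointly cover 360 degrees; the pair count and field-of-view value are assumed for the example, not specified by the application.

```python
def perimeter_camera_yaws(num_pairs: int = 4, fov_deg: float = 110.0):
    """Yaw angles (degrees) for stereoscopic pairs spaced evenly around the hull.

    With four pairs and a 110-degree horizontal field of view, adjacent pairs
    overlap, giving combined 360-degree coverage; both numbers are assumptions.
    """
    step = 360.0 / num_pairs
    yaws = [i * step for i in range(num_pairs)]
    coverage_ok = fov_deg >= step  # each pair must at least span the gap to its neighbor
    return yaws, coverage_ok

# Example: four pairs mounted at 0, 90, 180, and 270 degrees cover the full perimeter.
print(perimeter_camera_yaws())  # ([0.0, 90.0, 180.0, 270.0], True)
```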
The UAVs feature different types of image capture devices tailored for specific tasks. High-resolution color images are captured for user viewing, while lower-resolution grayscale images are used for navigation to reduce processing loads. A gimbal mechanism allows for tracking moving objects by adjusting the camera orientation mechanically or digitally. This flexibility ensures effective image capture in dynamic environments, facilitating applications like filming or surveillance.
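The split between viewing and navigation imagery, and the digital side of gimbal tracking, might look roughly like the following sketch: the full-resolution color frame is kept for display, a downsampled grayscale copy feeds navigation, and a crop around the tracked object's pixel location stands in for digital pan and tilt. The downsampling factor, crop size, and luminance weights are assumed values, not figures from the application.

```python
import numpy as np

def navigation_frame(color_frame: np.ndarray, factor: int = 4) -> np.ndarray:
    """Downsample a high-resolution color frame to low-resolution grayscale.

    The navigation pipeline only needs geometry, so shrinking the frame and
    discarding color keeps the processing load small.
    """
    gray = color_frame @ np.array([0.299, 0.587, 0.114])  # luminance conversion
    return gray[::factor, ::factor]                        # simple decimation

def digital_gimbal_crop(frame: np.ndarray, center_xy, size: int = 256) -> np.ndarray:
    """Emulate gimbal tracking by cropping around a tracked object's pixel location."""
    cx, cy = center_xy
    half = size // 2
    x0 = int(np.clip(cx - half, 0, frame.shape[1] - size))
    y0 = int(np.clip(cy - half, 0, frame.shape[0] - size))
    return frame[y0:y0 + size, x0:x0 + size]
```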
The UAV's navigation system includes a motion planner that autonomously maneuvers the vehicle through its environment using data from image capture devices and other sensors. It also comprises a tracking system for monitoring objects within the environment. The motion planner generates trajectories based on sensor data and control inputs, ensuring precise navigation and interaction with external systems like mobile devices operated by users.
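As a sketch of the planner's data flow rather than the application's actual algorithm, the following generates waypoints from the current position toward a commanded target and nudges them away from sensor-derived obstacles; the clearance radius and step count are assumed tuning values.

```python
import numpy as np

def plan_trajectory(current: np.ndarray,
                    target: np.ndarray,
                    obstacles: list,
                    clearance: float = 2.0,
                    steps: int = 20):
    """Straight-line trajectory toward a target, pushed away from sensed obstacles.

    Inputs mirror the description: sensor-derived obstacle positions and a
    control input (the target) go in, a sequence of waypoints comes out.
    """
    waypoints = []
    for i in range(1, steps + 1):
        point = current + (target - current) * (i / steps)
        for obs in obstacles:
            offset = point - obs
            dist = np.linalg.norm(offset)
            if 1e-6 < dist < clearance:
                # Push the waypoint out to the clearance radius around the obstacle.
                point = obs + offset / dist * clearance
        waypoints.append(point)
    return waypoints

# Example: fly toward (10, 0, 5) while avoiding an obstacle near the direct path.
path = plan_trajectory(np.zeros(3), np.array([10.0, 0.0, 5.0]),
                       [np.array([5.0, 0.5, 2.5])])
```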