Team Voxel (UAS 5.0) / Team TK6 (UAS 4.0)
Team Voxel used commercial off-the-shelf (COTS) components and minimized the form factor for agility. Their solution utilized LiDAR, RGB cameras, and IMU data for efficient SLAM. They introduced an optional secondary vehicle as a network repeater to mitigate risks associated with high-bandwidth communication between the vehicle and the ground station. Team Voxel’s quadcopter was built using a hybrid-X carbon fiber frame designed for structural integrity and reduced vibrations. They used 3D-printed parts as necessary to secure the avionics and sensors. Their onboard GPU-accelerated compute platform enabled real-time mapping by fusing imagery from multiple RGB cameras with a precise LiDAR point cloud. All processing occurred onboard to minimize the impact of potential communication loss with the ground station. The team compressed the data before transmission to the ground station to manage its substantial volume. They used open-source machine learning models to detect humans and improve situational awareness from the data received from the vehicle. After the vehicle returned, Team Voxel post-processed the uncompressed RGB and LiDAR data to generate a high-accuracy, colorized map of the environment. This approach ensured the adaptability and effectiveness of their UAS solution.
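The colorized map produced in post-processing hinges on one core operation: projecting each LiDAR point into a calibrated RGB image and sampling the pixel color at that location. The NumPy sketch below illustrates only that projection step; the function name, argument layout, and near-plane cutoff are illustrative assumptions rather than details of Team Voxel’s actual pipeline.

```python
import numpy as np

def colorize_point_cloud(points_lidar, image, T_cam_lidar, K):
    """Assign an RGB color to each LiDAR point by projecting it into a camera image.

    points_lidar : (N, 3) points in the LiDAR frame
    image        : (H, W, 3) RGB image
    T_cam_lidar  : (4, 4) homogeneous transform from the LiDAR to the camera frame
    K            : (3, 3) camera intrinsic matrix
    """
    # Transform points into the camera frame.
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]

    # Keep only points in front of the camera (0.1 m near plane is an assumption).
    pts_cam = pts_cam[pts_cam[:, 2] > 0.1]

    # Pinhole projection into pixel coordinates.
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]

    # Discard points that fall outside the image bounds.
    h, w = image.shape[:2]
    u, v = uv[:, 0].astype(int), uv[:, 1].astype(int)
    valid = (u >= 0) & (u < w) & (v >= 0) & (v < h)

    # Return camera-frame points with their sampled colors normalized to [0, 1].
    return pts_cam[valid], image[v[valid], u[valid]] / 255.0
```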
At the heart of Team TK6’s strategy was the development of a quadcopter that was both lightweight and cost-effective in order to meet first responder requirements for search and rescue. To achieve this, they relied on commercially available off-the-shelf components, which not only helped to reduce overall costs but also ensured that spare parts were easily accessible. Their base vehicle weighed 350 g and measured only 450 mm diagonally. By using a modular approach to the payload and easily swappable batteries, the operator could customize the quadcopter with RGB, thermal, or night vision capability and adjust battery capacity to meet the needs of a particular mission. They optimized the power system to achieve an efficiency of greater than 7 g/W (thrust in grams per watt of power), resulting in an average flight time of 20 minutes that could be extended up to 30 minutes with the high-density Li-Ion battery option. For comparison, standard vehicle setups with similar motor sizes have efficiencies in the range of 2–5 g/W. The low weight and compact size allowed exploration of tight spaces and made the vehicle resilient to propeller strikes and collisions, which are inevitable when exploring an unknown environment.
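The efficiency figure maps directly onto hover power and endurance. The short Python sketch below works through that arithmetic; the all-up weight and usable battery energies are hypothetical placeholders chosen to roughly reproduce the quoted 20- and 30-minute figures, since the article only states the 350 g base weight and the greater-than-7 g/W efficiency.

```python
# Rough endurance estimate from a thrust-efficiency figure (grams of thrust per watt).
# The all-up weight and battery energies below are illustrative assumptions; the
# article only gives the 350 g base vehicle weight and an efficiency above 7 g/W.

EFFICIENCY_G_PER_W = 7.0          # thrust efficiency at hover (g/W)
ALL_UP_WEIGHT_G = 500.0           # hypothetical: base vehicle + battery + payload
USABLE_ENERGY_WH = {              # hypothetical usable battery energy
    "standard LiPo": 24.0,
    "high-density Li-Ion": 36.0,
}

hover_power_w = ALL_UP_WEIGHT_G / EFFICIENCY_G_PER_W   # watts required to hover

for name, energy_wh in USABLE_ENERGY_WH.items():
    endurance_min = energy_wh / hover_power_w * 60.0
    print(f"{name}: ~{hover_power_w:.0f} W hover draw, ~{endurance_min:.0f} min endurance")
```

With these placeholder values the script prints roughly 71 W of hover draw, giving about 20 minutes on the standard pack and about 30 minutes on the Li-Ion pack.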
Their sensor suite was capable of transmitting full 1080p video at up to 90 FPS to allow for first-person view (FPV) flying, while the downward-facing camera ensured a stable hover in GPS-denied environments, reducing pilot workload. The particular camera sensor used allowed for extremely low-light operation without the use of IR illumination or LEDs, which assists in covert operations. In a normal use case, built-in LEDs (which can be swapped out for the IR-LED variant) assist in exploring areas that would otherwise be in pitch-black darkness. The system included a ground station computer, but could also be operated using only an FPV headset or external monitor, reducing the amount of equipment the operator needs to carry on-site. In summary, the compact and lightweight solution cost under $2,000, making it an affordable option for first responders and others who need a collision-resistant, high-performance quadcopter capable of visual exploration of unknown environments.
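The article does not describe how the downward-facing camera stabilizes the hover, but a common approach in GPS-denied flight is optical-flow-based drift estimation. The sketch below shows that general idea with OpenCV; the function, its parameters, and the pinhole scaling by altitude are assumptions for illustration, not Team TK6’s flight stack.

```python
import cv2
import numpy as np

def estimate_horizontal_drift(prev_gray, curr_gray, altitude_m, focal_px, dt_s):
    """Estimate horizontal velocity from downward-camera optical flow (conceptual sketch).

    Track corner features between consecutive downward-facing frames, take the median
    pixel displacement, and scale by altitude to obtain metres of drift per second.
    """
    # Detect corners in the previous frame to track.
    prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                       qualityLevel=0.01, minDistance=8)
    if prev_pts is None:
        return np.zeros(2)

    # Track the corners into the current frame with pyramidal Lucas-Kanade flow.
    curr_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, prev_pts, None)
    good = status.ravel() == 1
    if not good.any():
        return np.zeros(2)

    # Median pixel displacement is robust to a few bad tracks.
    flow_px = np.median((curr_pts - prev_pts)[good].reshape(-1, 2), axis=0)

    # Convert pixels to metres via the pinhole model, then to velocity.
    drift_m = flow_px * altitude_m / focal_px
    return drift_m / dt_s  # (vx, vy) in m/s, expressed in the image frame
```

In practice a flight controller would fuse such an estimate with IMU and rangefinder data rather than act on it directly.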
Team Voxel (UAS 5.0) is a two-person team consisting of Abhijith Jagadeesh and Vidullan Surendran, who met while working on their shared passion for perception systems for humanoid robots. Team TK6 (UAS 4.0) is a two-person team consisting of Ghanghoon Paik and Vidullan Surendran, formed while they pursued their graduate studies in Aerospace Engineering at Pennsylvania State University. Abhijith Jagadeesh completed his M.S. in Robotics and focuses on robust 3D mapping and localization solutions for robots operating in GPS-denied environments. By optimizing SLAM pipelines for low-SWaP systems, he has been able to deploy them successfully on commercial UAVs and legged robots. Ghanghoon Paik was awarded his Ph.D. in 2024 for developing novel tools for generating optimal trajectories in powered gravity assist maneuvers. His primary research areas are interplanetary mission design, trajectory optimization, and control systems. His other interests include machine learning systems and software engineering as applied to UAS and robotics. Vidullan, who received his Ph.D. in 2023, is currently developing learning algorithms that enable contact-rich manipulation for humanoid robots, as well as perception systems for world modeling. His other passion is the development of micro-UAVs capable of operating in indoor GPS-denied environments, with a focus on improving human-vehicle team effectiveness.