Building the future of aerial navigation: A new multimodal dataset for UAS research

Aerial drone with the multimodal sensor suite preparing for a flight test in St. Paul, Minnesota

With drones and other unmanned aerial systems (UAS) becoming an everyday presence in the sky, the way we think about airspace is changing fast. From infrastructure inspection and search-and-rescue to doorstep delivery, autonomous flight is no longer science fiction. But before drones can safely share airspace with crewed aircraft or fly beyond the operator’s visual line of sight, they must navigate reliably and autonomously in complex environments.

Developing this level of autonomy is challenging, according to CTS scholars Demoz Gebre-Egziabher and Kirsten Strandjord, professors of aerospace engineering and mechanics at the University of Minnesota (UMN). Navigation algorithms must be tested and validated under realistic conditions to ensure they are reliable and safe.

Benchmark datasets provide the raw materials researchers can use to build and evaluate navigation algorithms. Ground-based robotics has benefited from well-known datasets such as KITTI, Waymo Open, and Cityscapes, which have helped drive breakthroughs in computer vision and self-driving cars. But for aerial systems, comparably comprehensive datasets are surprisingly scarce, Gebre-Egziabher and Strandjord note. Several aerial datasets have been developed, but most are limited to indoor or tightly controlled environments. What has been missing is a large-scale outdoor multimodal aerial dataset that reflects the real-world diversity of flight conditions.

A UMN research project is aiming to bridge this gap. Led by Gebre-Egziabher and Strandjord, a team has started to create a multimodal benchmark dataset designed specifically for UAS and small UAS (sUAS) research. Built using a custom sensor pod developed by Honeywell International, this multi-sensor suite provides synchronized views of the world critical for tasks such as mapping, localization, and collision avoidance.

The first published version of the dataset includes sensor measurements collected during a flight test conducted by graduate research assistants Cooper Alexander and Sharveshwaran Umashankar. The data covers multiple flight paths over a crop field, a controlled but realistic outdoor environment. All sensor outputs were recorded in standard, easy-to-use formats, making them compatible with popular software tools.

An essential part of a standard benchmark dataset is the “ground truth”: the empirical reference for position, velocity, and timing (PVT) solutions. To improve the accuracy of the PVT ground truth, the researchers are incorporating measurements that use real-time corrections. They first cross-checked this data against other corrected measurements and then verified it against survey markers placed at precisely known locations. These checks showed the data is reliable, establishing confidence in using these measurements as the official reference for future versions of the dataset.
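The survey-marker verification the team describes amounts to comparing each corrected position fix against a point whose coordinates are known exactly. Below is a minimal sketch of that kind of check, using made-up coordinates and a small-angle flat-earth approximation; none of the values, names, or thresholds come from the actual dataset.

```python
import math

# Illustrative survey-marker check: compare corrected GNSS fixes against a
# marker whose surveyed latitude/longitude is known exactly. All coordinates
# below are invented for demonstration, not taken from the UMN dataset.

EARTH_RADIUS_M = 6_378_137.0  # WGS-84 equatorial radius

def enu_offset_m(lat_ref_deg, lon_ref_deg, lat_deg, lon_deg):
    """Approximate east/north offset (meters) from a reference point,
    valid only for points very close to the reference."""
    lat_ref = math.radians(lat_ref_deg)
    d_lat = math.radians(lat_deg - lat_ref_deg)
    d_lon = math.radians(lon_deg - lon_ref_deg)
    east = EARTH_RADIUS_M * math.cos(lat_ref) * d_lon
    north = EARTH_RADIUS_M * d_lat
    return east, north

def horizontal_error_m(marker, fix):
    """Horizontal distance (meters) between a fix and the surveyed marker."""
    e, n = enu_offset_m(marker[0], marker[1], fix[0], fix[1])
    return math.hypot(e, n)

# Surveyed marker position (illustrative) and a few logged corrected fixes
marker = (44.9537, -93.0900)
fixes = [(44.95370008, -93.09000011), (44.95369995, -93.08999990)]

errors = [horizontal_error_m(marker, f) for f in fixes]
print(max(errors))  # worst-case horizontal error, in meters
```

In practice a check like this would be run over every fix logged while the receiver sat on the marker, and the error statistics (mean, max, RMS) reported as evidence for the ground-truth accuracy claim.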

According to Gebre-Egziabher, this work is only the beginning. “The dataset will continue to grow, incorporating flights in diverse environments—including suburban and urban areas—to better mimic real operational scenarios,” he says. “By filling this crucial data gap, this work lays the foundation for the next generation of aerial navigation research.”

This project was supported by CTS seed funding. Awarded biennially, this funding aims to help CTS scholars develop expertise in emerging areas and foster strategic relationships that position them for future funding opportunities.

—Adapted from content contributed by Demoz Gebre-Egziabher and Kirsten Strandjord.
