Associate Dir., Mobility Tech. Lab, Mechanical Engineering
This project focused on improving the eye-tracking analysis tools used with the HumanFIRST driving simulator. Eye tracking is an important tool for simulation-based studies, allowing researchers to understand where participants focus their visual attention while driving.
The eye-tracking system provides a nearly continuous record of the direction in which the driver is looking with respect to real-world coordinates. However, this alone gives no information about the objects at which the driver is looking. Determining when a driver is fixated on a given element of the simulated world (e.g., a vehicle or sign) requires additional processing. Current methods for processing this data are time- and resource-intensive, requiring a researcher to manually review the eye-tracking data. This motivates an automated solution that programmatically combines eye-tracking and simulator data to determine which object(s), in either the real world or the simulated world, the driver is looking at.
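The report does not describe the matching algorithm itself. One plausible sketch of the core step, assuming the gaze is available as a 3D direction vector and that object positions are logged by the simulator in the same driver-centered coordinate frame (both assumptions, not details from the source), is an angular nearest-object test: the sample is attributed to whichever object lies closest to the gaze ray, within a tolerance.

```python
import math

def gaze_target(gaze_dir, objects, max_angle_deg=3.0):
    """Return the name of the object closest to the gaze ray, or None
    if no object falls within the angular tolerance.

    gaze_dir : 3D gaze direction vector (need not be unit length)
    objects  : dict mapping object name -> 3D position relative to the
               driver's head, in the same coordinate frame as gaze_dir
    max_angle_deg : hypothetical tolerance; a real study would tune this
                    to the tracker's accuracy and object sizes
    """
    def unit(v):
        m = math.sqrt(sum(c * c for c in v))
        return tuple(c / m for c in v)

    g = unit(gaze_dir)
    best_name, best_angle = None, max_angle_deg
    for name, pos in objects.items():
        d = unit(pos)
        # Clamp the dot product to guard against floating-point drift.
        dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(g, d))))
        angle = math.degrees(math.acos(dot))
        if angle <= best_angle:
            best_name, best_angle = name, angle
    return best_name
```

For example, a gaze straight ahead at `(1, 0, 0)` with a lead vehicle 20 m ahead and a sign off to the side would be attributed to the lead vehicle; a gaze pointed well away from both returns `None`. A production version would also need to account for occlusion and object extent, which this sketch ignores.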
This was accomplished by developing and implementing software that provides useful eye-tracking data to researchers without time- and resource-intensive human intervention and hand coding of data. The output of the analysis software was designed as a set of summary statistics and metrics that will be useful across different simulation studies. Additionally, visualization software was created to let researchers view key simulator and eye-tracking data, both to gain context and insight and to identify and characterize anomalies in the analysis software. Overall, the implemented software will increase the efficiency with which eye-tracking data can be used alongside simulator data.
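The source does not list the specific summary statistics produced. A minimal sketch of the kind of per-object metrics such software might emit, assuming the upstream stage has already labeled each eye-tracker sample with the object being looked at (the sampling rate, minimum-fixation threshold, and metric names below are all illustrative assumptions, not the project's actual outputs):

```python
from itertools import groupby

def dwell_summary(labels, sample_rate_hz=60.0, min_fixation_s=0.1):
    """Collapse per-sample gaze-target labels into per-object metrics.

    labels : sequence of object names (or None for "no target"), one per
             eye-tracker sample, in temporal order
    Returns a dict: object name -> {"total_s", "fixations", "mean_s"}
    """
    dt = 1.0 / sample_rate_hz
    stats = {}
    # groupby merges consecutive identical labels into candidate fixations.
    for name, run in groupby(labels):
        if name is None:
            continue
        duration = sum(1 for _ in run) * dt
        if duration < min_fixation_s:
            continue  # discard very short glances as tracker noise
        s = stats.setdefault(name, {"total_s": 0.0, "fixations": 0})
        s["total_s"] += duration
        s["fixations"] += 1
    for s in stats.values():
        s["mean_s"] = s["total_s"] / s["fixations"]
    return stats
```

Metrics like total dwell time, fixation count, and mean fixation duration per object are standard in eye-tracking analysis and are the sort of study-independent summaries the paragraph above describes.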