Infrared Sensing for Driver Assistive Systems

Principal Investigator(s):

Craig Shankwitz, Former Director, Intelligent Vehicles Laboratory, Mechanical Engineering

Project summary:

A driver-assistive system that uses high-accuracy, differentially corrected GPS (DGPS), high-accuracy geospatial databases, radar, computers, and driver interfaces (both a head-up display (HUD) and a tactile seat) has been developed to help drivers maintain lane position and avoid collisions during periods of low visibility. These systems have been tested and deployed in both Minnesota and Alaska. Collision avoidance information is provided to the driver through the HUD. Objects within the HUD field of view that are determined to be a threat to the driver are indicated as square boxes. A white box represents an advisory; if the detected object is less than 50 feet or three seconds from a collision, the box turns red (a warning).
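The advisory/warning distinction above can be sketched as a simple threshold check. The 50-foot and three-second thresholds come from the description; the function name, inputs, and time-to-collision estimate are illustrative assumptions, not the project's actual implementation.

```python
def hud_box_color(range_ft: float, closing_speed_fps: float) -> str:
    """Return the HUD box color for a detected threat object.

    Sketch of the reported logic: white = advisory; red = warning when
    the object is less than 50 feet away or less than 3 seconds from
    collision. Time to collision is approximated as range divided by
    closing speed (an assumption for this sketch).
    """
    if closing_speed_fps > 0:
        time_to_collision_s = range_ft / closing_speed_fps
    else:
        time_to_collision_s = float("inf")  # not closing; no collision time

    if range_ft < 50.0 or time_to_collision_s < 3.0:
        return "red"    # warning
    return "white"      # advisory
```

For example, an object 200 feet away closing at 10 ft/s (20 seconds to collision) would draw a white advisory box, while the same object at 40 feet would draw a red warning box.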

Drivers who have used the system have suggested that a more accurate representation of detected objects on the HUD would make the driving task easier. Emerging super dynamic range and high dynamic range cameras (SDRC and HDRC, respectively) appeared to offer a feasible, inexpensive means of addressing these concerns.

This report documents the research program. SDRC and HDRC technology failed to meet expectations, but infrared imagery was successfully integrated with the standard HUD.


Project details:

  • Project number: 2004056
  • Start date: 01/2004
  • Project status: Completed
  • Research area: Transportation Safety and Traffic Flow
  • Topics: Intelligent vehicles