How will robots and humans interact in our future transportation system?
We have long envisioned a future where cars drive themselves and fly through the air. But what is the reality of automation in our transportation future?
At the CTS Fall Luncheon on November 9, Duke University associate professor Mary (Missy) Cummings discussed the current state of autonomous transportation and explored how we can balance the interactions between humans and robots in the future.
As one of the first female fighter pilots in the United States, Cummings experienced the role of computers in transportation firsthand. “When I began flying F-18s, I saw that the computer does a much better job landing and taking off from an aircraft carrier than even the best pilots in the nation,” Cummings said. “In fact, on takeoff they won’t even allow pilots to touch the controls, because humans can easily over-control the plane and cause it to stall—leading to a deadly crash.”
The realization that computers can control an aircraft better than even the most elite pilots led Cummings to return to graduate school and eventually found the Humans and Autonomy Laboratory, which focuses on the complex interactions of human and computer decision making. As the lab’s director, she has worked on many transportation-related projects, including the development of self-driving dump trucks, human-controlled drones for inspecting oil pipelines, robotic forklifts, and R2-D2-like robots that could someday replace co-pilots in commercial airplanes.
Cummings also offered insight on autonomous cars. In her research, Cummings has found that one of the greatest challenges of self-driving cars is the tendency of the supervising humans to become distracted, especially over long periods of time. In one study, people were asked to operate a simulated self-driving car for a four-hour drive. Three hours into the simulated drive, several moose ambled across the road and were not detected by the car because of light mist. As a result of boredom and distraction, nearly every driver hit the moose.
“Urban driving will be easier to automate than cross-country driving because of the boredom that sets in while traveling long stretches of road,” Cummings said. “In order for a driverless car to really work in these conditions, we need a way to know the state of the mind of the human to see if the driver even has a chance of intervening.”
Cummings’ research has led her to a deep understanding of the jobs robots can do well and the jobs that require human input. “Computers do a great job with skill-based and rule-based reasoning—doing tasks that are automatic and repeatable and [involve] applying a set of procedures that take you through how to deal with an anomaly,” Cummings said. “After that comes knowledge-based and then expert-based reasoning, and this is where automation falls apart because computers can’t do deductive reasoning or operate under uncertainty.”
This need for human knowledge and expertise means that in the future, skill-based jobs may be replaced with new jobs that require higher levels of cognition. “Robotic forklifts, trucks, and trains don’t exist by themselves. They have to have human supervision,” Cummings said. “Now you have humans in control-room situations, which means our future workforce is going to have a greater need for educated, technology-savvy people who understand how to work under uncertainty.”
- CTS Fall Luncheon: Man vs. Machine or Man + Machine? (video presentation)