As auto manufacturers rush to bring self-driving features to market, they may have a responsibility to ensure motorists understand the limits of the latest technologies.
Weeks after autonomous driving technology contributed to a fatal crash for the first time, U.S. Department of Transportation Secretary Anthony Foxx told a room full of automated driving leaders Wednesday that they can't expect car owners to understand the capabilities and complexity of autonomous systems, or to always play by the agreed-upon rules.
Although he never mentioned Tesla Motors or Joshua Brown by name, Foxx made it clear he was addressing the May 7 crash that killed Brown while he operated a Model S which had its Autopilot feature engaged.
"Sometimes the coolness of the technology may drive people to push the limits beyond what manufacturers have intended," Foxx said while delivering the keynote address at the Automated Vehicle Symposium in San Francisco. "We need to think about that even as we build these systems. We must consider not just what's within the scope of their intended use, but within the uses that can be reasonably foreseen."
The National Highway Traffic Safety Administration is investigating the collision, in which a tractor trailer turned across Brown's path on a divided Florida highway. Neither Autopilot nor Brown braked, and the Model S struck the side of the tractor trailer and passed underneath, shearing off the top of the car.
Autopilot was not designed to sense that sort of cross-traffic, and drivers must acknowledge they remain responsible for all operations before they can use the feature. But for many, the accident crystallized concerns that drivers won't remain vigilant when semi-autonomous technologies are engaged, or that misconceptions persist about the car's capabilities.
Foxx's use of the word "reasonably" may have been carefully chosen. What can be "reasonably" expected from a product's performance—whether it's a car or a smartphone—often serves as a barometer in assigning liability during lawsuits.
"Public expectations matter," said Bryant Walker Smith, an assistant law professor at the University of South Carolina who has been at the forefront of self-driving legal matters. "Liability often depends on the perceived reasonableness of a company or its product."
Foxx's hints at the legal ramifications of autonomous deployment underscored a more cautious tone in his remarks. In previous settings, he has made no effort to hide his "bullish" stance on autonomous vehicles. But he said Wednesday that safety must always remain at the forefront of development efforts. Noting that federal research shows 94 percent of motor-vehicle crashes are caused by human factors, Foxx said he remains a believer that autonomous technology can dramatically reduce injuries and fatalities, perhaps by as much as 80 percent.
"But we don't want to replace crashes that occur based on human factors with large numbers of crashes caused by systems," he said. "So the challenge would be for us to think about how we integrate these technologies."