The talk of the auto industry this year isn’t about Detroit’s record profits. Rather, it’s about racing to field vehicles that drive themselves.
But it’s not as easy as it sounds.
The road to self-driving cars isn’t just about technology.
It requires people to understand and learn the limitations of software embedded in their new ride. People who don’t mistake the “Autopilot” in their Tesla model for the system that flies Airbus airliners safely around the world.
It especially demands people who don’t risk getting themselves killed by trusting a machine to do something it’s not designed to do — like Joshua Brown, the former Navy SEAL from my hometown of Canton, Ohio. He died in May when a 2015 Tesla Model S on “Autopilot” failed to detect a semi. The car slammed into its trailer while Brown was said to be watching a Harry Potter movie.
Tesla Motors requires drivers to “remain engaged and aware” when Autopilot is enabled. It says drivers “must keep their hands on the steering wheel.”
Note the word “must.” When drivers don’t, bad things follow. Accidents can happen. People can get killed. And public confidence in a promising technology can be undermined.
This is not your father’s auto industry. The convergence of old manufacturing and new technology is forging a new, disruptive business. It will look very different from the model built in Detroit and the industrial Midwest.
The new version will reshape the realities of driving and personal transportation. It should affect the investment cases for both the traditional auto industry and the Silicon Valley-based tech industry, which is making a big play for the auto space.
It will extend the intrusiveness of the wireless tech industry into the brain of the vehicle. That could, first, further erode the privacy of both driver and passengers. And, second, create an electronic referee equipped to apportion blame between drivers, vehicles, or both.
The net result is dramatic change. It could revise the principles connecting liability with the auto insurance business. It could favor automakers determining culpability for accidents because they will own the data created and stored in vehicles. It could restrain the relative freedoms associated with driving for the past century.
But the transition is unlikely to be as smooth as an upgrade to a new iPhone. Not if the technology depends on human beings to understand its capability, to respect its limits and to use the software-driven hardware correctly, as it was designed.
The likes of Tesla’s Autopilot and Google’s fully autonomous car, still in its testing phase, are the most prominent examples of these all-new suites of technology. They will take time, miles and accumulated experience. They’ll need transparency with the public before meaningful numbers of drivers are comfortable with them and clamoring to own them.
Paradigm-shifting technology is not new to the auto industry and its customers worldwide. Nor are learning curves that tend to begin on high-priced metal before migrating across all product lines. What’s new is the effort to excise people from the process and replace them with machines. They have limitations, too.
Daniel Howes is a columnist at The Detroit News. Views expressed in his essays are his own and do not necessarily reflect those of Michigan Radio, its management or the station licensee, The University of Michigan. You can find Howes' columns online here.