When I went to Las Vegas last week, I didn't hit the clubs, didn't gamble, and didn't even catch a show. Those are activities for another trip. This time I attended the Car IQ systems security conference, where I met engineers, automotive executives and programmers who all shared the same concern about self-driving cars: are they safe?
Companies like Uber and Tesla are rolling out self-driving technology faster than the industry and ordinary citizens are comfortable with.
In doing so, they force the adoption of these systems and trigger an arms race in vehicle technology. It will no longer be enough to have the most ergonomic cup holder if your car can't match a competitor's advanced driver-assistance technology. But deploying this technology so aggressively has many people worried. As with many cutting-edge innovations, there will be casualties and costs.
Their task in Las Vegas, then, was to define the acceptable safety standard of the future.
Sincere people, sobering statistics
It was impressive to see how sincerely these people cared about vehicle safety, and not just from a liability standpoint. It was heartening to hear the issues they were tackling: very intelligent people who cared deeply and had the resources to take every necessary precaution. But the issue is complex and will likely evolve over time. There were, however, a few starting points.
First of all, there are approximately 1.3 million vehicle-related fatalities worldwide each year. Somehow that has become acceptable. If the same standard were applied to another sector of society, it would be outrageous: if 1.3 million people died each year in plane crashes, no one would fly. Yet somehow we hand teenagers the keys to a car and consider the risk just the price of growing up. Measured in crashes per mile traveled, current self-driving test vehicles and adventurous owners of cars with advanced driver-assistance features have shown the technology to carry half the risk of human-driven cars. In other words, an autonomous vehicle drives twice as far as a human, on average, before a fatal accident.
Is that sufficient? Not according to those at the conference, and probably not according to ordinary people. Either way, a robotic car that is half as likely to kill you as a human-driven one isn't safe enough.
To decide what counts as acceptably safe, we have to find our tolerance for risk. We don't want self-driving vehicles to be merely safer than the average driver, because the average includes drunk drivers, distracted drivers, young drivers, older drivers and aggressive drivers. The real benchmark is how much safer self-driving is than a good driver. When we strip out the sub-categories above, roughly 10% of the world's road fatalities involve "good drivers": people who were not drunk, distracted, new, incapacitated or driving recklessly.

To determine how much safer than good drivers self-driving vehicles would need to be to be tolerable, it helps to look at other safety standards. How safe, for example, must water be to be considered safe? When scientists talk about parts per million of a particular contaminant, that is lost on the average person. But if we are told that a given level of pollution will cause one death in 100,000, or one in a million, over a lifetime, that level tends to be accepted in the United States. Numbers like these often set our pollution and safety standards.
Applying the same logic to vehicle fatality statistics can help set a safety standard. For autonomous vehicles to be accepted as truly safe and introduced en masse, they need to be a hundred times safer than a good driver. If that were achieved, and most conference attendees suggested the technology is very close, converting every car to driverless operation would cut annual road deaths from 1.3 million to roughly 1,300: the good-driver toll of about 130,000, divided by 100. An automated fleet would also eliminate categories like drunk-driving fatalities outright.
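The arithmetic behind that target can be sketched in a few lines. The inputs are the article's rough estimates, not precise statistics:

```python
# Illustrative arithmetic only; all inputs are the article's approximate figures.

annual_road_deaths = 1_300_000   # worldwide vehicle-related fatalities per year
good_driver_share = 0.10         # ~10% of fatalities involve "good drivers"
safety_multiplier = 100          # target: 100x safer than a good driver

# Fatalities attributable to good drivers under today's conditions.
good_driver_deaths = annual_road_deaths * good_driver_share   # 130,000

# Projected annual toll if all driving matched 100x good-driver safety.
projected_deaths = good_driver_deaths / safety_multiplier     # 1,300

print(f"Good-driver baseline: {good_driver_deaths:,.0f} deaths/year")
print(f"At 100x safer: {projected_deaths:,.0f} deaths/year")
```

The point of the sketch is simply that the benchmark is anchored to the good-driver rate, not the average-driver rate, which is why the projected figure is 1,300 rather than 13,000.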
Closing the gaps
We know, however, that the rollout of self-driving technology won't be a simple conversion. Human drivers will coexist with robotic cars, and every crash or fatality involving a self-driving vehicle will receive significantly more ink than its human counterparts. That attention will, no doubt, add friction to our embrace of driverless technology. But we must also recognize that delay carries opportunity costs.
If driverless technology becomes merely twice as safe as humans, then it's fair to say that half the lives lost in road accidents could have been saved had we converted. The problem with this calculation is that no one is held liable for the lives a technology failed to save; liability is reserved for lives lost to technology. That asymmetry adds another layer of resistance as we grow more comfortable with our cars' driving software.
So get your torches and pitchforks ready, because self-driving vehicles are coming. There will be accidents, and there will be casualties. But before you set fire to a self-driving car or slash its tires, remember that good, conscientious and very smart people are working tirelessly to guard against every eventuality.
But as with people, nothing is foolproof. At least not yet.
Jerry Mooney is a professor of languages and communications at the College of Idaho and author of History Yoghurt & the Moon. Follow him on Twitter: @JerryMooney
Whiteboard photo: Sarah Ruddat, Auto IQ Program Director