Children are often reminded never to get into cars with strangers. But for many of us, that is a completely normal activity today. In the next decade, we might even be willing to enter cars without a driver.
Ride-sharing and shared-economy networks such as Uber and Lyft are leading a transition from individual ownership to new models of vehicle mobility and access, including increased use of autonomous vehicles. Deloitte forecasts that autonomous vehicles (also known as driverless or self-driving cars) will account for more than 80% of new-vehicle sales in urban areas by 2040. As automotive companies grapple with the advent of autonomous vehicles and their impact on operations, they must also rethink how they handle risk.
Change of responsibility
Driverless cars should eventually reduce the frequency of collisions and total liability costs, but there will be growing pains before that point is reached. As autonomous vehicles become viable and human drivers adapt to sharing the road with them, collisions may actually increase in the short term. With autonomous vehicles, drivers bear less responsibility for road safety; in turn, the manufacturers, component suppliers, and technology companies that build autonomous vehicles, and the software that controls them, assume more liability risk. According to a study conducted between 2015 and 2017 by the National Highway Traffic Safety Administration, the overwhelming majority (94%) of crashes were attributable to driver error, and only a small share of collisions were attributed to vehicle or equipment failures. Over the next two decades, that equation will surely change, raising questions for insurers about how to quantify risk and price coverage.
Should your autonomous car kill you if it saves the lives of more people?
Smart cars will be programmed to obey the rules and avoid accidents, but when the unforeseen happens, should the occupant's safety prevail, or should the car seek the lesser evil?
This dilemma, debated for decades in philosophy and ethics classes, takes on a more current form now that intelligent, driverless cars are about to become part of daily life. Machines taking control of the roads raises important safety questions, as well as ethical and technological debates about who will be responsible for the accidents that do occur, even if they occur less and less often thanks to the decisions the vehicles make.
How will one of these autonomous vehicles be programmed to act when an accident is inevitable? A flat tire, a slick road surface, human error in another car… If a self-driving car must choose between hitting a school bus and crashing violently into a wall, what should it be programmed to do? Should it decide to protect its occupants? Or should it choose the lesser evil, even if the price is the lives of its passengers?
It seems clear that, however well designed and programmed, and however much safer than human driving, there will be rare circumstances in which victims are unavoidable. In those cases, since the driver will have no choice and no ability to maneuver, will the manufacturers be legally and morally responsible for the accident?
In an article in Wired, Patrick Lin, director of the Ethics + Emerging Sciences Group at California Polytechnic State University, asked whether customers should be left to set or pick an "ethical configuration" when buying one of these cars. One customer might decide that his own life is the top priority in a possible accident; another might prefer to value the common good above all else. According to one survey, 44% of drivers would prefer that the common good be chosen above all else, while 12% would rather leave the choice of "ethical configuration" in the hands of the manufacturer.
This is a complex issue with ethical implications for manufacturers, because what, exactly, would these "ethical configurations" be? Lin poses a pointed question: "Imagine, for example, that manufacturers choose to save hybrid cars before trucks that consume a lot of gas, or motorcyclists with helmets before motorcyclists without helmets. Or even other choices like whether to save children before the elderly, or men before women, or rich rather than poor?"
In an accident that requires choosing one victim over another, explains Lin, the manufacturer will continue to be held responsible for giving the user the choice of one thing over another, “that is, the option to discriminate against a particular type of driver or person.”
At the end of the day, in the circumstance of an inevitable accident, saving or valuing a life more means choosing another as a victim. Technology gives us control to establish order over what was previously instinctive or chaotic. And that means a new responsibility for us. Placing decisions in the hands of drivers is not a perfect solution either.
A new approach to insurance
Ultimately, the traditional approach to auto liability will have to give way to greater product liability coverage or hybrid coverage. The question that still challenges coverage of autonomous vehicles is who is at fault in the event of a collision. If the vehicle is under the driver's control, personal automobile coverage will apply; but if autonomous technology is in use, responsibility should shift to the manufacturer's product liability coverage.
For highly autonomous vehicles there will be a strong argument that drivers cannot be at fault and that any collision is the result of product failure. A limiting factor in how quickly insurers adopt this position is the lack of meaningful regulation and of claims data for incidents involving autonomous vehicles.
In addition, access to the massive amounts of data collected by vehicles, which can help reconstruct the conditions of a collision, is a point of contention for insurers. Original equipment manufacturers and insurers will need to find common ground on using that data to create hybrid insurance policies and to ensure that liability can be assigned fairly and adequately while protecting vehicle owners' personal information. Society is moving from driving individually owned cars to riding in autonomous vehicles; for that future to become reality, the auto and insurance industries must be ready and willing to change the way they forecast risk.
For insurance regulation to evolve, favorable legislation is needed that permits testing of autonomous vehicles while ensuring the safety of drivers and pedestrians.
If you or a loved one has been the victim of an accident involving an autonomous or self-driving car, you need to know your rights. Please contact our personal injury attorneys at P&M Law, where our team of experienced attorneys will fight to get you the compensation you deserve. Call us at 832-844-6428 or text our attorneys directly at 832-438-3012.