Self-Driving Car Rules May Be Eased Despite Safety Concerns
At a time when more and more questions are being asked about the safety of self-driving cars, the Trump administration is backing the technology in a report that may ease self-driving car rules.
In a new report, the National Highway Traffic Safety Administration (NHTSA) reveals it is planning to change some important rules that have hampered the development of autonomous vehicles, such as those prohibiting cars that lack pedals, mirrors, and steering wheels.
The NHTSA’s report Automated Vehicles 3.0 is billed as a blueprint for the future of motoring. It’s also radical in its push toward automation. Dangerously so, critics would argue.
Not only have self-driving cars failed to prove their safety, but America’s vast insurance industry is not ready for a change in personal injury litigation that would see a switch from driver liability to product liability against carmakers. Nobody has thought this end of the equation through properly. In the United Kingdom, the government has said insurance law will be overhauled to ensure all parties are covered in accidents involving autonomous vehicles. However, questions remain. The issue is far more complicated in the U.S. given the way laws differ from state to state.
Within the guidelines contained in the 80-page report, the NHTSA said it wants to reconsider the necessity and appropriateness of current safety standards in relation to self-driving vehicles.
Currently, carmakers have to meet more than 70 car safety standards in the United States. Most of these assume the presence of a driver. The necessity for a steering wheel, for example, has been a given. This may no longer be the case, although conflicting reports have been coming from the Department of Transportation this week. The regulator will no longer make some of these assumptions, according to the NHTSA report.
States are being requested to review their traffic laws and regulations that pose a barrier to testing autonomous vehicle technology, to adopt the feds’ voluntary recommendations into law and allow a “technology-neutral” environment that doesn’t favor established carmakers over startups.
Some carmakers have been lobbying for a lifting of these restrictions to pave the way for greater investment in automated car technology.
General Motors has filed a petition to allow cars without pedals or steering wheels to become part of a ride-hailing service it intends to launch in 2019. The giant carmaker already operates some self-driving Chevy Bolts in San Francisco.
Waymo, formerly the Google self-driving car project, plans a similar autonomous service in Arizona this year. Its vehicles will have human controls but no driver will be behind the wheel.
Although carmakers welcomed the change, consumer watchdog groups and safety advocates warned the new guidance is a step backward that will make the roads less safe.
Alarmingly, the Trump administration may also be easing the way for self-driving big rigs.
Last week, the administration included trucks in its updated driverless vehicle policy. The Department of Transportation will “no longer assume” that a commercial motor vehicle driver is necessarily human or needs to be in the cab. Self-driving trucks have already been manufactured by Volvo.
What Are the Dangers of the Rush to Self-Driving Cars?
To use a transportation metaphor from earlier times, the feds appear to be putting the cart before the horse in their rush to automation.
Although we are unlikely to see fleets of driverless cars or trucks on the streets of Virginia Beach, Norfolk, or Hampton any time soon, some states are ahead of Virginia in the testing of self-driving cars. The results from California, Florida, and Arizona are mixed.
Deaths from self-driving cars were once rare. This is no longer the case.
In the western United States, self-driving car manufacturers like Tesla are being hit by lawsuits. The latest was brought by a woman whose Tesla drove into a fire truck while the carmaker's much-heralded Autopilot was activated. Although these cars are not fully self-driving, they are seen as an important step toward autonomous vehicles.
Heather Lommatzsch filed a lawsuit against Tesla in a Utah state court claiming the carmaker was negligent over a crash in May.
Lommatzsch’s Tesla Model S crashed into a fire truck when the car was driving in Autopilot mode. Lommatzsch was glancing at her phone at the time. She broke her foot in the accident. She said she was under the impression that the Tesla would stop automatically in Autopilot mode.
It’s not clear why the Tesla did not stop, but these cars appear to have blind spots for fire trucks. Two other Teslas have crashed into fire trucks in California.
How Often Are Self-Driving Car Crashes Fatal?
A handful of self-driving vehicle crashes have been fatal. In March, Tesla driver Walter Huang was in Autopilot mode when his vehicle suddenly sped up and crashed into a concrete barrier in Silicon Valley, California. Huang was killed.
A report from the National Transportation Safety Board (NTSB) noted the car sped up suddenly from 62 mph to 70.8 mph in the seconds before the crash. The Autopilot function never attempted to brake or to steer away from the obstacle.
In Florida, two teens from Fort Lauderdale were killed and a third was injured after the Tesla Model S they were driving crashed and burst into flames. As well as raising more questions about the safety of self-driving cars, the crash also raised concerns about the Tesla lithium-ion battery pack. The NTSB report pointed out the battery reignited twice after firefighters extinguished the blaze in the electric Tesla.
Although most of the victims of self-driving cars have been drivers, an innocent bystander was killed in a crash in Arizona.
In March 2018, an Uber self-driving car killed Elaine Herzberg, a 49-year-old pedestrian who was crossing the road in Tempe, Arizona. The self-driving Volvo failed to make an emergency stop. It appears the emergency braking maneuvers were disabled while the car was controlled by a computer.
The complexity of self-driving car technology is a serious cause for concern. Some of the accidents suggest an incorrect computer setting can be the difference between life and death.
Tesla has repeatedly claimed its Autopilot function reduces crashes by 40 percent. The NHTSA has questioned this statistic.
This kind of rhetoric can give drivers or passengers in self-driving or semi-autonomous cars a false sense of security. In some cases, drivers have been watching movies or taking part in another activity in a car, trusting the technology, moments before a fatal impact.
Although the champions of self-driving cars are correct that the number of fatal crashes in these vehicles is a fraction of that in cars driven by humans, there are also considerably fewer of them on the roads.
In her recent update on the technology, Transportation Secretary Elaine Chao said the public has legitimate safety concerns about self-driving cars.
Given that 2018 has been a bad year for self-driving cars, the decision to move full speed ahead with the technology and to allow basic features like steering wheels to be stripped out seems a rash one. At the same time, not enough thought has been given to shaking up the insurance industry to deal with the widespread use of self-driving cars and the inevitable increase in accidents with injuries.
In America, drivers like to be in control of their cars. The idea of cars and trucks driving themselves is an unsettling one. If you or a loved one has been injured in a car or truck accident in Virginia Beach or elsewhere, please call Cooper Hurley Injury Lawyers at (757) 455-0077.