California has proposed that self-driving cars be required to have a licensed driver aboard and a steering wheel, but Google and other companies developing the cars say that allowing a human driver to take over could itself be a safety issue. A Feb. 4 letter from the National Highway Traffic Safety Administration suggests they may have some support in that position: the NHTSA said in the letter that the artificial intelligence that drives the cars will be considered the driver under federal law.
The NHTSA has not made a decision on other car safety issues that have been raised, such as the California proposal, but it has expressed a willingness to rewrite or waive some safety regulations given sufficient documentation. The agency also said that it would produce regulations on driverless cars within six months and that it was willing to assist in widespread deployment of the vehicles once safety concerns were addressed.
The decision about who is considered the driver of the car means that vehicle systems that previously notified humans of problems such as low tire pressure via a dashboard alert will now notify the car's software instead. However, federal regulations still require cars to have a brake and a steering wheel, and these and other safety issues must be resolved before the cars become available.
It is hoped that self-driving cars will make the roads considerably safer, since so many accidents result from the actions of human drivers. People who suffer catastrophic injuries in these accidents may find that the insurance company of the driver who caused the accident offers little compensation. If this occurs, they may want to speak to a lawyer about filing a civil lawsuit. If the driver's behavior could be considered negligent or reckless, the injured person may be awarded compensation.