You’re used to being in control when you drive: your hands on the wheel, your eyes on the road, and, most importantly, the decisions yours to make.
But maybe not for much longer.
The spread of autonomous vehicles across the U.S. is starting to undermine the control you take for granted, and lawmakers are racing to figure out what happens when software breaks the rules.
How would you feel about your vehicle answering directly to police?
Electric, autonomous… cars seem to be escaping our control
When we think about America’s automobile future, most of us picture electric vehicles first and autonomous vehicles next — and that’s a pretty accurate take on the projections.
First came the “electric revolution,” bringing quiet motors and doing away with gas.
But now we’re moving into a new chapter: autonomous vehicles. These are cars designed to drive themselves using a combination of advanced smart sensors, AI, and sophisticated software.
These aren’t just assist features anymore. Companies like Waymo are already rolling out sixth-generation robotaxis that can carry passengers without a human driver.
At the same time, self-driving trucks are operating in the Southern U.S., with autonomous rigs covering long hauls between major hubs.
The thing is, this technology is no longer theoretical: the future is here.
What happens if a self-driving car commits a traffic violation? There’s a plan
A big loophole has been exposed in our traffic laws: they were written for drivers, not computers. So if a self-driving car breaks a traffic law in the U.S. today, it’s complicated.
Under current laws in many places, police can’t just slap a speeding ticket on a vehicle with no human at the wheel because there’s no “driver” to cite.
Right now, the situation varies between states. In some places, laws still require a licensed person to be behind the wheel before a citation can be issued.
Luckily, lawmakers aren’t ignoring the problem. Under new proposals — some expected to take effect by the middle of this year — police would issue “notices of autonomous vehicle noncompliance” directly to the companies that operate these vehicles.
This would assign responsibility and consequences when automated systems make mistakes, a major step in regulating technology that’s moving from novelty to everyday reality.
Yes, police will force your car to “respond to their calls”
This is the moment autonomous cars stop being silent “passengers” and start talking back. Some of us won’t like it, but it’s not negotiable.
California Assembly Bill 1777’s new requirements will kick in on July 1, 2026, for AVs that operate without a human behind the wheel.
Manufacturers have a big new job. Every qualifying AV will have to be equipped with a two-way voice communication system and a dedicated emergency response line that police, firefighters, and other first responders can use when they need to “address” a vehicle.
It’s like a built-in hotline: not just for emergencies, but a way for officials to speak with a remote human operator who’s monitoring the fleet and can direct the vehicle if needed.
When will you receive a call… from the police?
If an AV interferes with human activity, first responders have a solution. When speaking to the operator isn’t enough, they can issue what’s called an emergency geofencing message, ordering the vehicle to move out of the way within minutes.
Autonomous cars promise freedom from the wheel, but they also bring new layers of accountability.
If vehicles are allowed to drive themselves, they can’t be allowed to operate outside the law. Requiring them to communicate directly with police means responsibility shifts from the passenger to the company behind the software.
For you, that means control is evolving — not disappearing.
The car may handle the steering. But when something goes wrong, someone still has to answer. And soon, that answer won’t be optional.
