The problems with driverless cars

As Bloomberg reports, driverless cars are currently just a bit too good at driving:

They obey the law all the time, as in, without exception. This may sound like the right way to program a robot to drive a car, but good luck trying to merge onto a chaotic, jam-packed highway with traffic flying along well above the speed limit. It tends not to work out well.

Driverless cars also just react differently than humans do:

Google is working to make the vehicles more “aggressive” like humans — law-abiding, safe humans — so they “can naturally fit into the traffic flow, and other people understand what we’re doing and why we’re doing it,” [Dmitri Dolgov, principal engineer of the program] said. “Driving is a social game.”

Google has already programmed its cars to behave in more familiar ways, such as inching forward at a four-way stop to signal they’re going next. But autonomous models still surprise human drivers with their quick reflexes, coming to an abrupt halt, for example, when they sense a pedestrian near the edge of a sidewalk who might step into traffic.

“These vehicles are either stopping in a situation or slowing down when a human driver might not,” said Brandon Schoettle, co-author of the Michigan study. “They’re a little faster to react, taking drivers behind them off guard.”

That could account for the prevalence of slow-speed, rear-end crashes, he added.

Michael Lowrey

Michael Lowrey is a contributor to Carolina Journal and a policy analyst for the John Locke Foundation. Lowrey has written numerous articles for the foundation on topics su...