When safety becomes a loophole

How Malcolm Gladwell got me thinking about self-driving cars, street football, and unintended consequences

I recently watched a clip of Malcolm Gladwell talking about self-driving cars in urban settings, and it’s been sitting with me ever since.

The point he made was simple but brilliant: the problem with driverless cars isn’t that they make mistakes, it’s that they don’t.

In Phoenix, where Waymo cars operate, Gladwell ran alongside one during a test. He deliberately cut in front of it, circled it, even stood still in its path. The car, perfectly calibrated for pedestrian safety, just… waited. Patiently. For as long as it took.

Which led him to this thought: what’s to stop a bunch of streetwise kids from playing football in the middle of the road if they know a self-driving car will always stop? No risk. No danger. No angry driver beeping. Just a flawless system doing exactly what it’s designed to do, even if that means grinding traffic to a halt.

And he’s right. I can picture it already: a group of savvy 11-year-olds deliberately stopping a fleet of self-driving Teslas just so they can use the headlights to light up their winter kickabout.
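
To make the loophole concrete, here’s a deliberately naive sketch of the kind of policy Gladwell is describing. Everything here (the names, the logic) is hypothetical, nothing like a real self-driving stack, but it shows why an absolute rule is an exploitable one:

```python
# Hypothetical sketch: a yield-to-pedestrians policy with no notion
# of being gamed. Names and logic are illustrative, not Waymo's.

from dataclasses import dataclass

@dataclass
class Observation:
    pedestrian_in_path: bool

def choose_action(obs: Observation) -> str:
    # The rule is absolute: if anyone is in the path, wait.
    # No timeout, no escalation, no "am I being played?" check.
    if obs.pedestrian_in_path:
        return "wait"
    return "proceed"

# Eleven kids set up a kickabout in the road:
for tick in range(5):
    print(tick, choose_action(Observation(pedestrian_in_path=True)))
# Prints "wait" on every tick, forever if you let it run.
# The car is flawless, and the street belongs to the kids.
```

And the fix isn’t obvious, either: any timeout or creep-forward heuristic you bolt on is exactly the kind of unpredictability the system was built to eliminate.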

When predictability becomes a feature — and a flaw

Gladwell’s point is a useful reminder that systems don’t exist in a vacuum. They exist alongside people. And people adapt.

Make something perfectly safe and predictable, and you don’t just create confidence; you create opportunity. People will find the loophole. They’ll push the edges. They’ll repurpose the system for something it was never meant to do.

That’s not cynicism. It’s reality.

So the question isn’t just “how do we make something work?” but “what might people do with it once it does?”
