"When a measure becomes a target, it ceases to be a good measure."
— Marilyn Strathern's formulation of Goodhart's Law
Goodhart's Law is a near-universal law of human systems. It explains why so many well-intentioned plans gradually warp into performance art—and why metrics, when turned into targets, can break even the best-designed systems.
What Is Goodhart's Law?
Goodhart's Law was first articulated by economist Charles Goodhart in 1975:
"Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes."
Or, more simply: When a measure becomes a target, it stops being a good measure.
How It Goes Off the Rails
You pick a metric to understand how something is going—website engagement, test scores, or how many bananas your team ships per quarter. At first, the metric is useful. It reflects the thing you care about. But once you start optimizing for that metric directly, something shifts. The metric stops being a signal. It becomes the game.
And then the game gets weird.
Example: Teaching to the Test
A school district decides to focus on test scores. Teachers start teaching to the test. Scores improve. The district celebrates. But when you dig deeper, you find out students can memorize formulas but can't solve real problems. The metric got better, but the goal—real learning—got left behind.
Example: Fitness Tracking
You want to get in shape, so you count your daily steps. One night you're pacing your apartment at 11:54 p.m., phone in hand, chasing a clean 10,000. But your heart rate hasn't hit triple digits since March, and you haven't touched a dumbbell in a month.
Example: Software Productivity
Someone decides to measure productivity by lines of code. Now you've got developers inflating boilerplate or splitting one-liners into ten just to look busy. The codebase expands. The product doesn't improve.
Example: Sales Calls
You reward reps for number of calls. Suddenly everyone's flooding voicemails with robotic greetings. No one's actually having a conversation, but the numbers look amazing in the deck.
Why does this keep happening? Because numbers are seductive. They feel objective. They give you a sense of control in a messy world. But the second people realize which numbers matter, they adapt. They find shortcuts. They exploit cracks. The map becomes the territory. And then the territory falls apart.
The smarter or more resourceful the people involved, the faster the metric collapses into self-parody.
How to Avoid Goodhart's Trap
- Use Multiple Metrics: Relying on a single number is like flying a plane with one dial. Track complementary signals. If you're measuring speed, also measure quality. If you're tracking engagement, also track retention. Example: instead of just counting tickets closed, also measure user satisfaction and bug reopen rates (see the first sketch after this list).
- Measure Outcomes, Not Proxies: People game proxies because they're easier to manipulate. Go for the thing you actually care about, even if it's messier or harder to quantify.
  - Bad: Lines of code
  - Better: Working features in production
  - Best: User adoption and retention
- Refresh Metrics Regularly: Every metric decays as a useful signal once people figure out how to optimize for it. Treat metrics like perishable goods: review and rotate them.
- Include Qualitative Input: Talk to the humans affected. Surveys, interviews, Slack channels, postmortems. People will often tell you how they're gaming the system if you bother to ask.
- Build Feedback Loops: Create systems where misalignment is obvious early. That means anomaly detection, dashboards that surface surprising context, and easy ways for people to escalate when something feels off (see the second sketch after this list).
- Incentivize Intent, Not Just Output: Reward curiosity, learning, and iteration, not just hitting numbers. The smartest operators create cultures where you can't win by optimizing a dumb metric.
- Watch for Meta-Gaming: Assume someone will eventually optimize how metrics are chosen or reported. That's the "game the game" layer. Preempt it by being transparent and by rotating who defines targets.
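Here's a minimal sketch of the multi-metric idea, assuming a support team that exports three complementary signals per reporting period. The class name, field names, thresholds, and numbers are all illustrative, not a recipe.

```python
from dataclasses import dataclass

@dataclass
class SupportMetrics:
    """Complementary signals for one reporting period (names are illustrative)."""
    tickets_closed: int
    avg_satisfaction: float  # e.g. average of a 1-5 post-ticket survey
    reopen_rate: float       # fraction of closed tickets reopened within 30 days

def looks_healthy(m: SupportMetrics) -> bool:
    """Throughput only counts when the quality signals hold up alongside it."""
    return (
        m.tickets_closed > 0
        and m.avg_satisfaction >= 4.0  # thresholds are arbitrary, for illustration
        and m.reopen_rate <= 0.10
    )

# A quarter where the headline metric looks great but the complements disagree:
q3 = SupportMetrics(tickets_closed=1200, avg_satisfaction=3.1, reopen_rate=0.22)
print(looks_healthy(q3))  # False: closures are up, but satisfaction and reopens say otherwise
```

The point isn't the specific thresholds; it's that no single field can be gamed without another one telling on it.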
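And a rough sketch of the feedback-loop idea: a crude z-score check that flags a metric value sitting far outside its own history, so a sudden spike in "calls made" triggers a human look rather than a bonus. The function name, threshold, and data are hypothetical stand-ins for whatever real anomaly detection you'd actually run.

```python
from statistics import mean, stdev

def looks_anomalous(history: list[float], latest: float, z_threshold: float = 3.0) -> bool:
    """Flag the latest value if it sits more than z_threshold standard deviations
    from its own history, a crude stand-in for real anomaly detection."""
    if len(history) < 2:
        return False  # not enough data to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return latest != mu
    return abs(latest - mu) / sigma > z_threshold

# Daily "calls made" by one rep; a sudden spike earns a question, not a celebration.
calls = [42.0, 38.0, 45.0, 40.0, 44.0, 39.0, 41.0]
print(looks_anomalous(calls, 180.0))  # True: the count jumped, so ask why before rewarding it
```

A check like this doesn't tell you whether the spike is gaming or genuine effort; it just makes sure someone asks.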
Final Thoughts
Goodhart's Law is a warning: metrics are powerful, but they are not the goal. When designing systems—whether in business, education, or AI—remember that what you measure will shape behavior. Choose your metrics wisely, and always be on the lookout for the unintended consequences of optimization.