You’re in a team meeting. The boss, beaming with pride, unveils the new “North Star Metric.” From now on, everything is about reducing customer support ticket resolution time. “What gets measured gets managed!” they declare.
A month later, the numbers are incredible. Resolution times have plummeted. Every chart is trending exactly the way it’s supposed to. The boss is thrilled.
But something’s wrong. Customers are angrier than ever. They’re complaining that their problems aren’t actually being solved. They’re just getting their tickets closed faster. The support team, under pressure to hit the new target, has mastered the art of closing a ticket and telling the customer to open a new one if the problem persists.
The metric looks great. The reality is a dumpster fire.
This isn’t a one-off disaster. It’s a predictable pattern of human behavior, a universal law that explains why so many well-intentioned plans go spectacularly wrong.
It’s called Goodhart’s Law. And it’s the reason your KPIs and OKRs are probably making things worse.
The Origin Story: A Banker’s Cynical Truth
The law comes from a British economist named Charles Goodhart. In 1975, he was looking at the UK government’s attempts to control inflation by targeting the money supply. He made a simple but profound observation: as soon as the government announced they were targeting a specific measure of money, that measure immediately stopped being a reliable indicator of what was happening in the economy.
People and banks changed their behavior to get around the new rules. The targeted metric became a distorted funhouse mirror: it reflected everyone’s efforts to hit the target, not the economic reality it was supposed to represent.
His observation was boiled down into a beautifully cynical maxim:
“When a measure becomes a target, it ceases to be a good measure.”
In other words, the moment you start grading people on a number, they stop trying to achieve the goal and start trying to game the number.
The Basic Explanation
Think of it like trying to lose weight when the only thing that counts is the number on the scale at your Wednesday weigh-in.
Stepping on the scale is a measure of your progress. But if it becomes your one and only target, you start doing weird things. You might dehydrate yourself before weigh-in, or only eat celery for a day. The number on the scale will go down, but have you actually gotten healthier? Nope. You’ve just gotten better at manipulating the scale.
Goodhart’s Law says this happens with any metric. The metric is just a proxy, a shadow of the real thing you care about. The moment you start chasing the shadow, you lose sight of the object casting it.
The Goal: A happy, well-educated student.
The Proxy Metric: Standardized test scores.
The Result: Schools “teach to the test,” and students learn how to pass exams, not how to think critically. The metric improves, but the goal is missed.
The law reveals a fundamental flaw in our obsession with data. We pick a simple number to represent a complex reality, and then we’re shocked when people optimize for the simple number instead of the complex reality.
Goodhart’s Law in the Wild
Once you see it, you can’t unsee it. It’s the hidden gremlin breaking systems everywhere.
The Cobra Effect: This is the most famous example. In colonial India, the British government, concerned about the number of venomous cobras in Delhi, offered a bounty for every dead cobra. The goal was fewer live cobras; the measure that became the target was dead cobras turned in. What happened? People started breeding cobras just to kill them and collect the bounty. When the government caught on and canceled the program, the breeders released their now-worthless cobras, and the cobra population ended up higher than before.
Hospital Readmissions: Hospitals were incentivized to reduce the average length of a patient’s stay. The metric looked great on paper. But in reality, some hospitals were just discharging patients prematurely, leading to more readmissions and worse health outcomes down the line.
Developer Productivity: A manager decides to measure developer productivity by the number of lines of code written. The result? Bloated, inefficient code. Developers learn to write ten lines to do what could be done in two. The metric goes up, but the quality of the software goes down.
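To make that concrete, here’s a contrived sketch (in Python, with made-up function names) of what optimizing for line count looks like. Both functions do exactly the same thing; only one of them pads the metric.

```python
# A contrived illustration of gaming a lines-of-code metric.
# Both functions behave identically; one just inflates the line count.

# The "productive" version, as judged by line count:
def is_even_padded(n):
    result = None
    remainder = n % 2
    if remainder == 0:
        result = True
    else:
        result = False
    return result

# The version a developer writes when nobody is counting lines:
def is_even(n):
    return n % 2 == 0

assert is_even_padded(10) == is_even(10)
assert is_even_padded(7) == is_even(7)
```

Several extra lines, zero added value. The metric sees a more “productive” developer; the codebase sees dead weight.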
How to Defend Against Goodhart’s Law
So, are all metrics useless? No. But you have to be smart about how you use them. Goodhart’s Law is a warning, not a death sentence. Here’s how to keep your metrics from turning against you.
Step 1: Measure Outcomes, Not Outputs.
Don’t target the proxy. Target the thing you actually care about. Instead of “ticket resolution time,” target “customer satisfaction score after ticket resolution.” Instead of “lines of code,” target “bugs that make it to production” or “system uptime.” This forces the focus back on the real goal.
Step 2: Use a Basket of Metrics.
Never rely on a single metric. A single number is too easy to game. Use a balanced scorecard of several metrics, some of which might even be in tension with each other. For example, measure both sales volume and customer lifetime value. This makes it much harder to optimize one number at the expense of the overall health of the system.
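Here’s a minimal sketch of what such a basket might look like for the support-ticket example from the intro. All field names and thresholds are hypothetical, invented purely for illustration:

```python
# A minimal "basket of metrics" health check. No single number can
# look good on its own; the metrics are deliberately in tension.
from dataclasses import dataclass

@dataclass
class SupportScorecard:
    avg_resolution_hours: float  # output: how fast tickets close
    post_resolution_csat: float  # outcome: 0-5 survey score after closure
    reopen_rate: float           # counterweight: fraction of tickets reopened

def is_healthy(s: SupportScorecard) -> bool:
    # Fast resolution only counts if satisfaction stays high and
    # tickets aren't being closed just to bounce straight back.
    return (s.avg_resolution_hours <= 24
            and s.post_resolution_csat >= 4.0
            and s.reopen_rate <= 0.05)

# The gamed scenario from the intro: blazing-fast closures,
# furious customers, tickets reopened as fast as they're closed.
gamed = SupportScorecard(avg_resolution_hours=2.0,
                         post_resolution_csat=2.1,
                         reopen_rate=0.30)
print(is_healthy(gamed))  # False: the basket catches what speed alone missed
```

The design choice matters more than the code: the reopen rate and the satisfaction score exist specifically to punish the “close it fast, tell them to open a new one” trick.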
Step 3: Add a Human Element.
Numbers don’t tell the whole story. Pair your quantitative metrics with qualitative feedback. Talk to your customers. Talk to your employees. Do they feel like things are actually getting better? This human check can be a powerful antidote to a metric that has gone rogue.
Step 4: Keep the Target a Secret (Or Change It Often).
This is the advanced move. If people don’t know what the exact target is, they can’t game it. They’re forced to just do good work. Alternatively, changing the targets periodically can keep everyone on their toes and prevent them from over-optimizing for one specific number.
The Bottom Line
Goodhart’s Law is a powerful reminder that humans are clever, and systems are fragile. The moment you turn a measurement into a target, you give people an incentive to corrupt that measurement.
What gets measured gets managed, yes. But what gets targeted gets gamed.
The most effective leaders and organizations understand that metrics are a compass, not a map. They point you in a direction, but they don’t tell you about the terrain. The goal isn’t to reach a specific number. The goal is to make things better. And no single number can ever capture that.
Named Law: Goodhart’s Law
Simple Definition: When a measure becomes a target, it ceases to be a good measure.
Origin: Coined by British economist Charles Goodhart in 1975 in the context of monetary policy.
Wikipedia: Goodhart’s Law
Category: Economics & Business
Subcategory: Value & Decision-Making