Goodhart's Law
"When a measure becomes a target, it ceases to be a good measure."
Goodhart's Law is an adage named after British economist Charles Goodhart that describes the unintended consequences of using a statistical measure as a target for policy or management decisions.
Origins
The law was formulated by Charles Goodhart in 1975 in the context of British monetary policy, and has since been applied broadly across economics, management, and the social sciences. Goodhart's original phrasing was "Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes"; the popular wording quoted above is anthropologist Marilyn Strathern's later generalization. Goodhart observed that when policymakers target a specific economic indicator, people's behavior changes in ways that make that indicator less reliable.
The Core Principle
The fundamental insight is that measurement changes behavior. When people know they're being measured on a specific metric, they will optimize for that metric, often in ways that:
- Game the system rather than improve the underlying reality
- Sacrifice other important but unmeasured aspects
- Make the metric less indicative of what it originally measured
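The dynamic can be made concrete with a toy simulation (the functions and numbers here are hypothetical, chosen purely for illustration): an agent splits a fixed effort budget between two tasks, true value rewards balance through diminishing returns, but the metric observes only one task.

```python
def true_value(effort_a, effort_b):
    # Diminishing returns on each task: real value comes from balance.
    return effort_a ** 0.5 + effort_b ** 0.5

def metric(effort_a, effort_b):
    # The proxy metric observes only task A.
    return effort_a

total = 10.0

# Split effort evenly vs. pour everything into what the metric sees.
balanced = (total / 2, total / 2)
gamed = (total, 0.0)

print(f"balanced: metric={metric(*balanced):.1f}, true value={true_value(*balanced):.2f}")
print(f"gamed:    metric={metric(*gamed):.1f}, true value={true_value(*gamed):.2f}")
# The metric doubles (5.0 -> 10.0) while true value falls (4.47 -> 3.16):
# optimizing the measure has decoupled it from what it was meant to track.
```

Any concave value function paired with a one-dimensional proxy produces the same divergence; nothing about these particular numbers is special.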
Classic Examples
Economics
- Unemployment statistics: When governments target unemployment rates, employment agencies may reclassify people as "not seeking work"
- GDP growth: Focusing solely on GDP can lead to environmental degradation and inequality
- Test scores: Schools "teaching to the test" rather than providing comprehensive education
Business
- Sales quotas: Salespeople may push unnecessary products or delay deals until the next quarter
- Code commits: Developers writing unnecessary code to meet commit quotas
- Customer satisfaction scores: Employees gaming surveys rather than improving service
Academia
- Citation counts: Researchers citing each other excessively or publishing in predatory journals
- Publication counts: A preference for quantity over quality in research
Variants and Related Concepts
Campbell's Law
"The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures."
Lucas Critique
In economics, the observation that relationships estimated from historical data tend to break down when policy changes, because people adapt their expectations and behavior to the new policy.
Cobra Effect
When an incentive designed to solve a problem makes it worse (named after a colonial-era bounty on cobras in India that reportedly led people to breed cobras for the reward).
Why It Happens
- Gaming: People find loopholes to meet targets without achieving intended outcomes
- Tunnel vision: Focus on measured aspects while neglecting unmeasured ones
- Short-termism: Optimizing for immediate metric improvements over long-term value
- Perverse incentives: Misaligned rewards that encourage counterproductive behavior
Mitigation Strategies
Multiple Metrics
- Use balanced scorecards with multiple, sometimes competing metrics
- Include leading and lagging indicators
- Measure both outcomes and processes
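One way to operationalize "multiple, competing metrics" — sketched here with a hypothetical scoring rule, assuming each metric is normalized to the range 0 to 1 — is to aggregate with a geometric mean, so that maxing out one metric cannot compensate for neglecting another:

```python
from math import prod  # Python 3.8+

def balanced_score(metrics):
    # Geometric mean: one weak metric drags the whole score down,
    # so gaming a single number at the expense of the rest doesn't pay.
    return prod(metrics) ** (1 / len(metrics))

print(balanced_score([0.9, 0.9, 0.9]))  # steady performance: ~0.90
print(balanced_score([1.0, 1.0, 0.7]))  # ~0.89: lower, even though its plain average also equals 0.90
```

A plain arithmetic mean would rate the two profiles identically; the geometric mean penalizes the lopsided one, which is exactly the anti-gaming property wanted here.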
Qualitative Assessment
- Combine quantitative metrics with qualitative reviews
- Regular human oversight and judgment
- Context-sensitive evaluation
Dynamic Metrics
- Regularly change or rotate metrics to prevent gaming
- Use unpredictable measurement criteria
- Focus on outcomes rather than easily gamed proxies
Systems Thinking
- Consider the broader system effects of metrics
- Monitor for unintended consequences
- Design metrics that align with desired behaviors
Modern Applications
Goodhart's Law is particularly relevant in the age of:
- AI and algorithms: Optimizing for specific metrics in machine learning
- Social media: Engagement metrics driving addictive design
- Performance management: KPIs in modern organizations
- Public policy: Data-driven governance and accountability
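In machine learning the same failure shows up as metric overfitting. A minimal, purely synthetic sketch: select among many no-skill "models" by their score on a small validation set, and the winning score looks well above chance even though every model is just guessing.

```python
import random

random.seed(42)  # reproducible illustration

def validation_score(n=50):
    # A no-skill model guessing on n binary examples: true accuracy is 0.5.
    return sum(random.random() < 0.5 for _ in range(n)) / n

# "Hyperparameter search": keep the best validation score across 200 tries.
scores = [validation_score() for _ in range(200)]
print(f"best of 200 validation scores: {max(scores):.2f}")
# The selected score sits well above 0.5, yet no model has any real skill:
# once the validation metric became the selection target, it stopped being
# an honest estimate of accuracy.
```

The standard remedy mirrors the mitigation list above: hold out a fresh test set that was never used for selection, so at least one measurement is never turned into a target.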
Key Takeaway
The law serves as a reminder that metrics are tools, not goals. The challenge lies in designing measurement systems that encourage the right behaviors while maintaining their diagnostic value. Effective measurement requires ongoing vigilance, adaptation, and a deep understanding of human behavior and system dynamics.