>[!abstract]
>Goodhart's law states that when a measure becomes a target, it ceases to be a good measure. Originally formulated in economics, it captures how metrics lose reliability once they are used for policy or performance incentives, since individuals and organizations adapt their behavior to optimize the measure rather than the underlying reality it was meant to track. This often leads to perverse incentives, gaming, or distortions in which pursuit of the target undermines the system's actual goals. The law underscores the fragility of proxies and the difficulty of managing complex systems through simplified indicators.

>[!related]
>- **North** (upstream): [[Performance measurement]]
>- **West** (similar): [[Campbell's law]] (in social decision-making), [[Perverse incentive]]
>- **East** (different): [[Effective KPI design]] (metrics that remain valid even when targeted)
>- **South** (downstream): [[Gaming the system]]
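The divergence between a proxy and its underlying goal can be sketched in a toy model (all numbers and the payoff structure here are hypothetical, chosen only to illustrate the dynamic): an agent splits a fixed effort budget between real work and gaming the metric. Because gaming inflates the metric more cheaply than real work does, optimizing the metric drives true quality to zero.

```python
def outcomes(gaming_effort, budget=10.0):
    """Return (true_quality, metric) for a given split of effort.

    True quality comes only from real work; the metric counts real
    work but is inflated threefold by gaming effort (hypothetical
    weights, for illustration only).
    """
    real_effort = budget - gaming_effort
    true_quality = real_effort
    metric = real_effort + 3 * gaming_effort
    return true_quality, metric

# Before the metric becomes a target: all effort goes to real work.
q_before, m_before = outcomes(gaming_effort=0.0)

# After: the agent searches for the split that maximizes the metric.
best_gaming = max(
    (g / 10 for g in range(0, 101)),       # candidate splits 0.0 .. 10.0
    key=lambda g: outcomes(g)[1],          # rank by the metric alone
)
q_after, m_after = outcomes(best_gaming)

print(q_before, m_before)  # 10.0 10.0
print(q_after, m_after)    # 0.0 30.0 — metric up, true quality gone
```

The metric rises from 10 to 30 while true quality collapses from 10 to 0: the measure, once targeted, no longer tracks what it was meant to measure.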