When it comes to human behavior, whenever we baseline success against a particular measure, the effectiveness of that measure tends to decline. This idea, known as Goodhart's Law, is named after Charles Goodhart, an economist and former advisor to the Bank of England. If we are working within a system of rewards and punishments, for instance, we tend to optimize our actions within that system to avoid punishments, even when the effect on the overall system is negative.

The practical consequence of Goodhart's Law is that as soon as we set a target for a measure, we change the system in which the measure operates, and thus we change not only what the measure means, but also what the measure's target means. In software development, a popular measure that is subject to Goodhart's Law is velocity.
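As a minimal sketch of how velocity falls prey to Goodhart's Law: story-point estimates are relative, so once velocity becomes a target, a team can "improve" it simply by inflating estimates, without delivering any more working software. The sprint numbers below are invented for illustration.

```python
# Hypothetical sprint data illustrating Goodhart's Law applied to velocity.
# The same amount of work is delivered in both quarters; only the
# estimation scale changes after velocity becomes a target.

def velocity(sprints):
    """Average story points completed per sprint."""
    return sum(sprints) / len(sprints)

before_target = [21, 23, 22]  # points per sprint, honest estimates
after_target = [34, 36, 35]   # same work, inflated estimates

print(velocity(before_target))  # 22.0
print(velocity(after_target))   # 35.0
# Measured velocity rose roughly 60%, yet the delivered value is
# unchanged: targeting the measure changed what the measure means.
```

The point of the sketch is that nothing in the metric itself distinguishes genuine improvement from estimate inflation, which is exactly why setting a velocity target changes what velocity measures.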
Related resources:
- Goodhart's Law Video
- Goodhart's Law Podcasts
- Goodhart's Law Example
Authored by Philip Rogers
Agile World Resources are provided free of charge to anyone seeking to learn more, and are shared under a Creative Commons Attribution license. This means that if you use a resource elsewhere, you must credit Agile World Publishing as the source, name the author, and credit the photo creator (if a photo is used).