The downsides to having targets & Goodhart’s amazing law


Today’s world is very target driven. In our work environment, Key Performance Indicators (KPIs) are often the yardsticks used to evaluate employees. In our personal lives, everything from our clothes to our cars serves as a metric to gauge our stature in society (the halo effect). In this context, it is essential to ask the following questions: 1) Are we being objective? 2) Are there downsides to having targets? 3) If yes, can we avoid them altogether?

Unfortunately, the answers to the last two questions are ‘yes, there are downsides to having targets’ and ‘no, we cannot avoid them completely.’ Goodhart’s law explains why.

Simply put, the law states the following:

When a measure becomes a target/metric, it ceases to be a good measure.

Goodhart’s law [Charles Goodhart, British economist]

Understanding Goodhart’s law

After a couple of readings, the statement makes intuitive sense. Evolution has wired humans to be very conscious of being judged and evaluated. And when a measure is made the metric for that evaluation, people eventually end up trying to “optimize the measure,” even if that sometimes has detrimental effects in the long run. Couple that with the perception that “the ends justify the means,” and we’ve got ourselves a species trying to game the system.
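
For anyone who prefers to see this in code, here is a tiny toy model, cooked up purely for illustration (the numbers, the functions, and the split between “real work” and “gaming” are all my own made-up assumptions, not taken from any study):

```python
# Toy sketch of Goodhart's law -- purely illustrative, all numbers made up.
# A worker has 10 units of effort to split between real work and activity
# that merely inflates the measured number. Gaming is assumed to move the
# metric faster than real work does (that is the whole temptation), but it
# creates no actual value.

def metric(real, gamed):
    return real + 2 * gamed      # the measure counts gamed output generously

def true_value(real, gamed):
    return real                  # only real work matters in the end

EFFORT = 10

# Before the measure becomes a target, there is no reason to game it:
print("measure ignored :", metric(EFFORT, 0), "| true value:", true_value(EFFORT, 0))

# Once the measure *is* the target, find the metric-maximising split by
# searching over every possible amount of gamed effort:
best = max(range(EFFORT + 1), key=lambda gamed: metric(EFFORT - gamed, gamed))
print("measure targeted:", metric(EFFORT - best, best),
      "| true value:", true_value(EFFORT - best, best))
# The metric doubles while the true value drops to zero -- the measure stops
# being a good measure the moment it becomes the target.
```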


The British government, concerned about the number of venomous cobras in Delhi, offered a bounty for every dead cobra. Initially, this was a successful strategy; large numbers of snakes were killed for the reward. Eventually, however, enterprising people began to breed more cobras for the income!! When the number of snakes killed became a target, notice how people began maximizing that target even when that defeated the original purpose.

An example of the law and of people trying to game/manipulate the system (source: Wikipedia)

Let us look at another example. There are already many classic examples of the law, ranging from cars (the Volkswagen emissions scandal) to rats (the Great Hanoi Rat Massacre). Therefore, I decided to go with something else for a change.

An application/example of Goodhart’s law

Consider a real-life scenario. Alice and Bob go out on a couple of dates. Every time, Alice notices that Bob listens attentively to everything she has to say and that he speaks in a down-to-earth manner. She mentally labels him a great listener and believes he is very humble. Is this a wise thing to do? Not really, in my opinion. Why? Again, Goodhart’s law.

People know that the world values and appreciates good listeners and humble people; these traits are often used as metrics to evaluate a person’s character. And consequently, by Goodhart’s law, they are bad measures, AT LEAST IN THE SHORT RUN. Maybe Bob really is a great listener and a humble person, but at that point it should be a conjecture, not a solid belief, because for all Alice knows, Bob could be putting on a show for her.

The downsides of having targets

I know that this is a grim thought. Obviously, not every person who does “good things” is doing them for their own benefit. Many people selflessly strive to make the world a better place. My point is simply that we should not judge anyone too quickly based on metrics we have incentivized as a species. (More about this in my next post, “The unnerving implications of Goodhart’s law.”) There are both wonderful and awful people in the world, and we should give time some leeway to reveal a person’s true character, good or bad.

On a different note, there are apparently variants/flavors of the law, which you can read about here. However, the core insight remains the same – “The ‘soundness’ of a measure diminishes when you incentivize optimizing it.”

Possible solutions

One possible solution is to use multiple metrics/targets, chosen according to the requirement and, preferably, such that people cannot game them all simultaneously. Another possible (though not always practical) solution is to keep the metrics private. An example is the admission process of elite schools such as Harvard and Stanford. These schools never officially release the procedure and metrics by which applicants are accepted or denied. We have their admission statistics, but not their modus operandi, if you will.
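
To make the “multiple metrics” idea concrete, here is the same toy model from earlier, extended with a second, equally made-up metric that gaming actively hurts (think output volume versus complaint rate). This is only a sketch of the intuition, not a recipe:

```python
# Two toy metrics that cannot be gamed simultaneously -- illustrative only.
# Metric A over-rewards gamed output, while metric B penalises it, so the
# evaluation is taken as the weaker of the two scores.

def metric_a(real, gamed):
    return real + 2 * gamed          # easy to inflate by gaming

def metric_b(real, gamed):
    return real - 2 * gamed          # gaming shows up as damage here

def combined(real, gamed):
    return min(metric_a(real, gamed), metric_b(real, gamed))

EFFORT = 10

# Find the amount of gamed effort that maximises the combined score:
best = max(range(EFFORT + 1), key=lambda gamed: combined(EFFORT - gamed, gamed))
print("best split under the combined target: gamed effort =", best)
# best comes out as 0 -- with these two metrics, the only way to look good
# on both is to actually do the real work.
```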


All said and done, I don’t think it is possible to completely avoid the ramifications of the law. However, we can certainly dilute its effects by wisely choosing our targets.

[For any math lovers reading this: I keep thinking of this law as, loosely, the psychology equivalent of Gödel’s incompleteness theorems, in that a perfect and complete system of targets/metrics can never be achieved. Let me know what you think.]

Note: Some readers may well have completely different views from mine on certain portions of this article. In that case, I’d love to hear your thoughts on the matter. Feel free to hit me up at reachthetaciturn@gmail.com

Until next time, The Taciturn.
