Goodhart’s Law is expressed simply as: “When a measure becomes a target, it ceases to be a good measure.” In other words, once a single metric is chosen as the goal, people tend to optimize for the metric itself rather than the underlying outcome it was meant to track, often regardless of the consequences.
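A minimal sketch of one way this plays out (the setup here is hypothetical, not from the source): suppose each project has a true value, but we can only observe a noisy proxy score. If we select winners by the proxy, the selected items will look better on the metric than they actually are, because selection favors items whose noise happened to be positive.

```python
import random

random.seed(0)

# Hypothetical data: each "project" has a true value; the measured
# score is the true value plus noise (an imperfect proxy metric).
projects = [{"true_value": random.gauss(50, 10)} for _ in range(1000)]
for p in projects:
    p["score"] = p["true_value"] + random.gauss(0, 10)

# Optimize for the proxy: pick the top 10 projects by measured score.
top = sorted(projects, key=lambda p: p["score"], reverse=True)[:10]

avg_score = sum(p["score"] for p in top) / len(top)
avg_true = sum(p["true_value"] for p in top) / len(top)

# The winners' proxy scores overstate their true value: targeting the
# measure inflates it relative to the quantity it was meant to track.
print(f"average proxy score of winners: {avg_score:.1f}")
print(f"average true value of winners:  {avg_true:.1f}")
```

Running this, the winners' average proxy score exceeds their average true value: the act of selecting on the measure is exactly what degrades it as a measure.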

Source: Unintended Consequences and Goodhart’s Law – Towards Data Science