Goodhart's law

Goodhart's law asserts that "as soon as a measure becomes a target, it ceases to be a good measure". It was introduced in its original formulation as "Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes" (Goodhart1981).

Examples

Over-fitting a scientific model to the data available during its formulation leads to poor reproducibility on previously unseen data.

In machine learning, an instance of this is over-fitting to the training set, which leads to poor generalisation to an unseen test set.
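As a minimal sketch of this effect in Python (assuming only numpy is available; the sine target function, sample sizes, and polynomial degrees are illustrative choices, not from the original): polynomials of increasing degree are fit to a small noisy training set. Training error, the measure being optimised, keeps falling as model capacity grows, while test error, the quantity actually of interest, eventually rises.

    import numpy as np

    rng = np.random.default_rng(0)

    # Noisy samples of a simple underlying function: y = sin(x) + noise.
    def sample(n):
        x = rng.uniform(0, 3, n)
        return x, np.sin(x) + 0.1 * rng.normal(size=n)

    x_train, y_train = sample(10)
    x_test, y_test = sample(100)

    def mse(coeffs, x, y):
        return np.mean((np.polyval(coeffs, x) - y) ** 2)

    # Training error keeps improving with model capacity, but test error
    # (the real target of interest) eventually gets worse.
    for degree in (1, 3, 9):
        coeffs = np.polyfit(x_train, y_train, degree)
        print(f"degree {degree}: "
              f"train MSE = {mse(coeffs, x_train, y_train):.4f}, "
              f"test MSE = {mse(coeffs, x_test, y_test):.4f}")

With only 10 training points, the degree-9 polynomial can interpolate them almost exactly, so its training MSE is near zero while its test MSE is much larger: optimising the measure destroyed its value as a proxy.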

See also

Moral uncertainty
Instrumental goals