Goodhart's law

From RB Wiki
Revision as of 12:41, 27 January 2020 by El Mahdi El Mhamdi (talk | contribs)

Goodhart's law asserts that "as soon as a measure becomes a target, it ceases to be a good measure". It was introduced in its original formulation as "Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes" in Goodhart1981.


Over-fitting a scientific model to the data that was available when the model was formulated leads to poor reproducibility on data that was unseen.

In machine learning, an instance of this is over-fitting to the training set, which leads to poor generalisation on an unseen test set.
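A minimal sketch of this phenomenon, using synthetic data and NumPy polynomial fitting (the degrees, noise level, and sample sizes below are illustrative choices, not from the original text): when training error itself becomes the target, a high-degree polynomial drives it to near zero while fitting the noise, whereas a simple model close to the true structure keeps a modest training error but transfers better to unseen data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of an underlying linear relationship y = 2x + noise.
x_train = np.linspace(0.0, 1.0, 10)
y_train = 2.0 * x_train + rng.normal(scale=0.3, size=x_train.size)
x_test = np.linspace(0.0, 1.0, 100)
y_test = 2.0 * x_test + rng.normal(scale=0.3, size=x_test.size)

def mse(coeffs, x, y):
    """Mean squared error of a fitted polynomial on the data (x, y)."""
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

# Degree-9 polynomial: with 10 points it can interpolate the training data,
# so the training error (the "measure" turned "target") collapses to ~0.
over = np.polyfit(x_train, y_train, deg=9)

# Degree-1 fit: matches the true linear structure, leaves the noise unfit.
simple = np.polyfit(x_train, y_train, deg=1)

print("train MSE, degree 9:", mse(over, x_train, y_train))
print("train MSE, degree 1:", mse(simple, x_train, y_train))
# On unseen test data the picture typically reverses: the degree-9 fit
# oscillates between training points and generalises poorly.
print("test MSE, degree 9:", mse(over, x_test, y_test))
print("test MSE, degree 1:", mse(simple, x_test, y_test))
```

Optimising the training error ever harder (here, by raising the polynomial degree) improves the measured quantity while degrading the quantity it was meant to proxy, which is Goodhart's law in miniature.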

Moral uncertainty

Instrumental goals