Goodhart's law
From RB Wiki
Latest revision as of 11:41, 27 January 2020
Goodhart's law asserts that "as soon as a measure becomes a target, it ceases to be a good measure". It was introduced in its original formulation, "Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes", in [https://books.google.ch/books?id=OMe6UQxu1KcC&pg=PA111&redir_esc=y#v=onepage&q&f=false Goodhart1981].
Examples
Over-fitting a scientific model to the data that was available when the model was formulated leads to poor reproducibility on data that was unseen.

In machine learning, an instance of this is over-fitting to the training set, which leads to poor generalisation to an unseen test set.
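The machine-learning instance can be sketched numerically. The following is a minimal, hypothetical illustration (not from the article; the data, polynomial degrees, and random seed are arbitrary choices): making training-set error the target drives that measure toward zero, while error on unseen test data, which the measure was supposed to track, stays large.

```python
# Hypothetical sketch: training error as a Goodhart-style target.
# A degree-9 polynomial interpolates all 10 noisy training points
# (near-zero training error), yet fits the noise rather than the
# underlying function, so its error on unseen test inputs is large.
import numpy as np

rng = np.random.default_rng(0)

def true_fn(x):
    return np.sin(x)  # arbitrary "ground truth" for the sketch

# 10 noisy training points; a denser noise-free test grid.
x_train = np.linspace(0.0, 3.0, 10)
y_train = true_fn(x_train) + rng.normal(0.0, 0.1, size=x_train.shape)
x_test = np.linspace(0.1, 2.9, 50)
y_test = true_fn(x_test)

def errors(degree):
    """Mean squared error on train and test for a polynomial fit."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_mse, test_mse

tr3, te3 = errors(3)  # modest-capacity model
tr9, te9 = errors(9)  # interpolates all 10 training points

print(f"degree 3: train {tr3:.2e}, test {te3:.2e}")
print(f"degree 9: train {tr9:.2e}, test {te9:.2e}")
```

The degree-9 fit "wins" on the targeted measure (its training error is essentially zero) while the measure has ceased to reflect what we actually care about, accuracy on unseen data.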