4 Matching Annotations
  1. Jan 2022
    1. Goodhart's law is an adage often stated as "When a measure becomes a target, it ceases to be a good measure".[1] It is named after British economist Charles Goodhart, who advanced the idea in a 1975 article on monetary policy in the United Kingdom:[2][3]

      Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.

      We measure what we find important.

      Measures can and often do become self-fulfilling targets. (Read: "Rankings and Reactivity" by W. Espeland and M. Sauder, https://www.stmarys-ca.edu/sites/default/files/attachments/files/rankings-and-reactivity-2007.pdf)

      When a measure becomes a target it ceases to be a good measure.

      So why measure?


      Are observation and measurement part of a larger complex process, one that isn't finished until the process itself is finished?


      This seems related to the measurement problem in quantum mechanics, Schrödinger's cat, the Heisenberg uncertainty principle, and the observer effect.

    1. As Goodhart’s law suggests, metrics can fail if given too much power, and over-emphasizing metrics can lead to gaming, manipulation, or “a myopic focus on short-term goals.” Many of the most important parts of digital well-being cannot be captured by quantitative

      Goodhart's Law is an adage often stated as "When a measure becomes a target, it ceases to be a good measure". It is named after British economist Charles Goodhart, who advanced the idea in a 1975 article on monetary policy in the United Kingdom:

      Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.

  2. Aug 2021
    1. Assessing staff solely on the basis of quantitative metrics is never acceptable, no matter what type of metric is being used

      See Goodhart's Law and some background on why these kinds of measurements are difficult.

  3. Sep 2020
    1. Yet another disadvantage to poorly chosen metrics is that they can lean into something called Goodhart’s Law, a subset of the moral hazard of gameplay specific to measurement. A good summation of the law comes from British anthropologist Marilyn Strathern, who describes it like this: “When a measure becomes a target, it ceases to be a good measure”

      Compare with the Heisenberg uncertainty principle.
