2 Matching Annotations
  1. Feb 2019
    1. In other words, when YouTube fine-tunes its algorithms, is it trying to end compulsive viewing, or is it merely trying to make people compulsively watch nicer things?

      YouTube's business interests are clearly rewarded by compulsive viewing. Even supposing it is possible to distinguish "nicer" things, YouTube might have to act against its business interests if less-nice things DO lead to more compulsive viewing. Going even deeper, as Rob suggests below: can viewing itself shape both how (compulsively?) and what (nice or not-nice?) we view?

    1. Algorithms will privilege some forms of ‘knowing’ over others, and the person writing that algorithm is going to get to decide what it means to know… not precisely, like in the former example, but through their values. If they value knowledge that is popular, then knowledge slowly drifts towards knowledge that is popular.

      I'm so glad I read Dave's post after having just read Rob Horning's great post, "The Sea Was Not a Mask", also addressing algorithms and YouTube.