8 Matching Annotations
  1. Dec 2022
    1. "If you don’t know, you should just say you don’t know rather than make something up," says Stanford researcher Percy Liang, who spoke at a Stanford event Thursday.

      Love this response

  2. Sep 2022
    1. humanity as not only the source and context for technology and its use, but its ultimate yardstick for the constructive use and impact of technology. This may sound obvious (it certainly does to me), but in practice it needs to be repeated to ensure it is used as such a yardstick from the very first design stage of any new technology.

      Compare [[Networked Agency 20160818213155]] regarding having a specific issue to address that is shared by the user group wielding a tech / tool, in their own context.

      Compare [[Open data begint buiten 20200808162905]] regarding how the only yardstick for open data stems from its role as a policy instrument: impact achieved outside, in the aimed-for policy domains, through the increased agency of the open data users.

      Tech impact is not to be measured in eyeballs, usage, revenue etc. That's (understandably) the corporation's singular and limited view; the rest of us should not adopt it as the only possible one.

  3. Mar 2020
    1. new technologies are present in everyone's life, both at work and in daily life. Often we do not even realise that we are interacting with automated systems, or that we are scattering data about our personal identity across the network. This produces a serious asymmetry between those who extract that data (for their own interests) and those who supply it (without knowing it). To obtain certain services, some sites ask us to confirm that we are not a robot, but in reality the question should be turned around
    2. «Ethics must accompany the entire cycle of technology development: from the choice of research directions through to design, production, distribution and the end user. In this sense Pope Francis has spoken of "algor-ethics" ("algoretica")»
  4. Jan 2020
    1. Similar to the technical architecture of classic colonialism, digital colonialism is rooted in the design of the tech ecosystem for the purposes of profit and plunder. If the railways and maritime trade routes were the "open veins" of the Global South back then, today, digital infrastructure takes on the same role: Big Tech corporations use proprietary software, corporate clouds, and centralised Internet services to spy on users, process their data, and spit back manufactured services to subjects of their data fiefdoms.


    2. The underlying guiding idea of a “trustworthy AI” is, first and foremost, conceptual nonsense. Machines are not trustworthy; only humans can be trustworthy (or untrustworthy). If, in the future, an untrustworthy corporation or government behaves unethically and possesses good, robust AI technology, this will enable more effective unethical behaviour.


  5. Nov 2017
  6. Oct 2017
    1. Why is all the focus on teaching lay people how to code, and not teaching computer scientists and people who work in tech companies to center empathy and humanity in their work?

      . . .

      I think there should be an element of infusing discussions of ethics, humanity and social consequences into computer science curricula, and I believe that even human-centered design does not go far enough; I suggest that designers of tech consider more “empathetic and participatory design” where there is some degree of involving people who are not in the tech company as autonomous persons in product design decisions, and not just using them as research/testing subjects.