3 Matching Annotations
  1. Nov 2023
  2. Jun 2023
    1. Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.

      What is missing here? The one risk with the highest probability, because we are already living its impacts: climate. The phrase is not just a strategic bait and switch for the AI businesses, but also a more blatant bait and switch with regard to climate politics.

    2. [[Jaan Tallinn]] is connected to Nick Bostrom with regard to the risks of AI and other existential risks, which is problematic. It may be worthwhile to map out these various institutions, donors and the connections between them, in order to get a better grasp of influences and to formulate responses to the 'TESCREAL' bunch. Cf. [[2023-longtermism-an-odd-and-peculiar-ideology]], where I observe the same.