3 Matching Annotations
  1. Sep 2018
    1. In this singularity-free world, the future would be bleak for programmers. (Imagine having to cope with hundreds of years of legacy software!)

      I'm not really sure I agree with this. Regardless of whether the Singularity eventually arrives, software will evolve. We have seen whole languages fall largely out of mainstream use in the past, including Fortran, COBOL, and ALGOL. Developers will always be looking to become more efficient, and our current languages will either be abandoned or improved; new languages will emerge and become the new standard. If anything, the future seems far bleaker for programmers and developers in the years after the Singularity than in the years leading up to it. After all, with the prevalence of docile, conscious machines, most of the work programmers do, everything from bug-fixing to data analytics, would be done more quickly, cheaply, and efficiently by the machines. In other words, in a post-Singularity world, programmers as we know them would no longer exist.

    1. It would be if he knew he was wrong. I can’t put my finger on it, but I sense something strange about him.

      Distrust. Even without any quantifiable proof of an error in Hal, distrust manifests itself in the relationship between the humans and him. This will be one of the most important features of our relationship with sentient machines in the future as well: whether we will be able to trust independently thinking machines with control over critical aspects of our society.

    1. Perpetual progress is a strong statement of the transhumanist commitment to seek “more intelligence, wisdom, and effectiveness, an open-ended lifespan, and the removal of political, cultural, biological, and psychological limits to continuing development. Perpetually overcoming constraints on our progress and possibilities as individuals, as organizations, and as a species. Growing in healthy directions without bound.”

      What stands out to me here is the lengths to which they went in defining the "constraints" on their pursuit of perpetual progress; specifically, they describe these constraints as "...political, cultural, biological, and psychological limits...". While I had earlier viewed religion as transhumanism's biggest constraint, this description makes me pause: the constraints referred to here, such as our divisive political systems, deeply embedded cultural practices, psychological issues stemming from society, and our ever-fragile health, all now seem worthy nemeses to the transhumanist commitment.