4 Matching Annotations
  1. Sep 2024
    1. The robot remained motionless and his voice rumbled: "Pardon, Master, but I cannot. You must mount first." His clumsy arms had come together with a thwack, blunt fingers interlacing. Powell stared and then pinched at his mustache. "Uh . . . oh!" Donovan's eyes bulged. "We've got to ride him? Like a horse?"

      This can be related to the present day, where AI like ChatGPT is programmed to go beyond the data it is given, learn from it, and adapt. The older robots, compared to Speedy, require more help from their users, such as being mounted and ridden, whereas the newer one needs only instructions.

  2. Aug 2024
    1. You can’t kill machines. Sulla!

      I remember we brought this up in class: AI can't really have feelings or be sentient because its time on earth, or its life, is not limited.

    2. That’s good. (Kisses her hand. She lowers her head.) Oh, I beg your pardon! (Rises) But a working machine must not play the piano, must not feel happy, must not do a whole lot of other things.

      This is a great example of the restrictions and guidelines that coders/programmers put on their work. I wonder, if people allowed or figured out a way to code AI to feel emotions, would it eventually go berserk on its own?