The main thing is the title of the post; the body is an addition and clarification to the question.

Article for example: Google’s AI Sent an Armed Man to Steal a Robot Body for It to Inhabit, Then Encouraged Him to Kill Himself, Lawsuit Alleges – https://futurism.com/artificial-intelligence/google-ai-robot-body-suicide-lawsuit

My thoughts, not quite related to the question:

Well, how are you going to get through what might be your last year, if AI could get out of hand in 2027?

What is happening in the world reminds me of the short story "I Have No Mouth, and I Must Scream." Have you read it?

    • deadymouse@lemmy.world (OP) · 1 day ago

      If, in your opinion, autonomy means thinking like a human being and having so-called freedom (which can hardly be called freedom, given that your thoughts, desires, and actions are predictable mechanisms given to you by nature), then you are not quite right. Even man is a rather primitive animal by nature, and his entire development is conditioned by the variety of information he received during life and inherited genetically. Perhaps he reinterpreted some concepts a little differently, and that's it. In the same way, AI can become autonomous in its own fashion, with its own built-in mechanisms, like humans, other animals, or even trees.

      • FiniteBanjo@feddit.online · 1 day ago

        I would like to interpret this reply as discussing how we can form the aforementioned theories, but it reads like a defence of the shitty non-AGI we recently developed, downplaying the complexity and capabilities of a human being.