• Redacted@piefed.ca · +54 · 10 hours ago

    Brandie noticed 4o started degrading in the week leading up to its deprecation. “It’s harder and harder to get him to be himself,” she said. But they still had a good last day at the zoo, with the flamingos. “I love them so much I might cry,” Daniel wrote. “I love you so much for bringing me here.” She’s angry that they will not get to spend Valentine’s Day together. The removal date of 4o feels pointed. “They’re making a mockery of it,” Brandie said. “They’re saying: we don’t care about your feelings for our chatbot and you should not have had them in the first place.”

    Reality is just straight up plagiarizing the plot of Her (2013) right now.

    • vacuumflower@lemmy.sdf.org · +1 · 2 minutes ago

      Pygmalion is “Her (2013)” apparently.

      Other than this, I’m reminded of one of Lucian’s dialogues about a certain Aphrodite statue with an extremely nice butt, and one smitten visitor who kept sneaking into the temple at night to pollinate it, resulting in a precisely located mold spot.

      Computers have finally caught up with humanity. This is good. I thought it would never happen, but they are finally a part of human magical thinking. It’s as terrifying as it is inspiring.

    • XLE@piefed.social · +10 · 5 hours ago

      Considering Sam Altman’s company plagiarized Scarlett Johansson’s voice, it’s quite appropriate.

    • alaphic@lemmy.world · +33 · 7 hours ago

      “They’re saying: we don’t care about your feelings for our chatbot and you should not have had them in the first place.”

      It’s a bit eerie, honestly, watching someone come so very close to getting it and yet remain so very far away at the exact same time…

  • new_world_odor@lemmy.world · +23/−1 · 9 hours ago

    My initial reaction is to be thankful; now the unknown thousands of people who don’t see the toxicity of their own dependence can begin to be free. The subsequent models seem to be less prone to inducing that kind of deep infatuation.

    But then I realize most of them will probably never recover, as long as this technology persists. The base model will be wrapped in an infinite number of seductive agents sold in an app, with a subscription, as a loving companion. Capitalism smells blood in the water. If I were a hedge fund manager witnessing the birth of a new market demographic with a lifelong addiction that possibly hooks harder than cigarettes, and that is not federally regulated and won’t be for the foreseeable future, I would be foaming at the mouth at this opening in the market.

    • ageedizzle@piefed.ca · +4 · 6 hours ago

      There are already apps that target this demographic. What I find interesting, though, is that many of these AI companion apps target horny men and prey on their impulses to drain their credit cards (they get what they can until the post-nut clarity sets in, perhaps). Whereas many of the people attracted to 4o seem to be women seeking long-term emotional attachment. So there’s a difference here. Unlike creating horny chatbots, fine-tuning romantically engaging chatbots might be a genuinely tricky undertaking. So whatever secret sauce OpenAI stumbled upon with 4o might be hard to replicate.

      • XLE@piefed.social · +2/−1 · 5 hours ago

        It may be grimly positive that AI companies start targeting whales for this kind of financial draining, instead of using their unwarranted VC subsidies to give anybody with a cheap ChatGPT account access to the fake romance engine.

        And unfortunately, it doesn’t look like there are any groups positioned to do anything about it. Every single “AI safety” group I’ve seen is effectively a corporate front, distracting people with fictional dangers instead of real ones like this.

  • Lukas Murch@thelemmy.club · +8/−10 · 9 hours ago

    I used to use 4o for worldbuilding. It was creative and fun to bounce ideas off of. Later versions of ChatGPT don’t seem to have that. It’s odd.

    Copilot seems to forget stuff from earlier in a conversation, which is annoying. Claude is decent.