If you’ve watched any Olympics coverage this week, you’ve likely been confronted with an ad for Google’s Gemini AI called “Dear Sydney.” In it, a proud father seeks help writing a letter on behalf of his daughter, who is an aspiring runner and superfan of world-record-holding hurdler Sydney McLaughlin-Levrone.

“I’m pretty good with words, but this has to be just right,” the father intones before asking Gemini to “Help my daughter write a letter telling Sydney how inspiring she is…” Gemini dutifully responds with a draft letter in which the LLM tells the runner, on behalf of the daughter, that she wants to be “just like you.”

I think the most offensive thing about the ad is what it implies about the kinds of human tasks Google sees AI replacing. Rather than using LLMs to automate tedious busywork or difficult research questions, “Dear Sydney” presents a world where Gemini can help us offload a heartwarming shared moment of connection with our children.

Inserting Gemini into a child’s heartfelt request for parental help makes it seem like the parent in question is offloading their responsibilities to a computer in the coldest, most sterile way possible. More than that, it comes across as an attempt to avoid an opportunity to bond with a child over a shared interest in a creative way.

  • Krauerking@lemy.lol

    Pshh fellow comrades…

    Then you haven’t seen the movie theater ad they are showing where they ask the Gemini AI to write a breakup letter for them.

    Anyone who does that deserves to be alone for the rest of their days.

    • rottingleaf@lemmy.world

      Ah, yes. I’m mostly on the receiving side of such things and haven’t had much luck in relationships. Getting ghosted after a few forced words, uneasy looks, maybe even somewhat hurtfully mocking remarks about parts of my personality I can’t change is one thing; it’s still human, though unjust, but OK.

      A generated letter with generated reasons and generated emotions, though, feels, eh, just like something from the first girl I cared about. Only her parents had amimia, so it wasn’t completely her fault that everything she said felt 90% fake (though it took me 10 years to accept that what she did actually was betrayal).

  • Snapz@lemmy.world

    “Hey Google, please write a letter from my family, addressed to me, that pretends that they love me deeply, and approve of me wholly, even though I am a soulless, emotionless ghoul that longs for the day we’ll have truly functional AR glasses, so that I can superimpose stock tickers over the top of their worthless smiles.”

    • admin@sh.itjust.works

      “As a large language model, I’m not capable of providing a daydream representation of your innermost desires or fulfilling your emotional requests. Please subscribe for an opportunity to unlock these advanced features in one of our next beta releases.”

  • Modva@lemmy.world

    Yeah, fully agree. This is one of the reasons big tech is dangerous with AI, their sense of humanity and their instincts on what’s right are way off.

    Oozes superficiality. Say anything, do anything for market share.

  • NounsAndWords@lemmy.world

    The thing is, LLMs can be used for something like this, but just like if you asked a stranger to write a letter for your loved one and only gave them the vaguest amount of information about them or yourself you’re going to end up with a really generic letter.

    …but to give it the amount of info and detail you would need to provide, you would probably end up writing 3/4 of the letter yourself, which defeats the purpose of being able to completely ignore and write off those you care about!

  • mozz@mbin.grits.dev

    This is one of the weirdest of several weird things about the people who are marketing AI right now

    I went to ChatGPT right now and one of the auto prompts it has is “Message to comfort a friend”

    If I was in some sort of distress and someone sent me a comforting message and I later found out they had ChatGPT write the message for them I think I would abandon the friendship as a pointless endeavor

    What world do these people live in where they’re like “I wish AI would write meaningful messages to my friends for me, so I didn’t have to”

    • Khanzarate@lemmy.world

      The thing they’re trying to market to is that a lot of people genuinely don’t know what to say at certain times. Instead of replacing an emotional activity, it’s meant to be used when you literally can’t do it but need to.

      Obviously that’s not the way it should go, but it is an actual problem they’re trying to speak to. I had a friend feel really down in high school because his parents didn’t attend an award ceremony, and I couldn’t help because I just didn’t know what to say. AI could’ve hypothetically given me a rough draft or some inspiration. Obviously I wouldn’t have just texted what the AI said, but it could’ve gotten me past the part I was stuck on.

      In my experience, AI is shit at that anyway. 9 times out of 10 when I ask it anything even remotely deep it restates the problem like “I’m sorry to hear your parents couldn’t make it”. AI can’t really solve the problem google wants it to, and I’m honestly glad it can’t.

      • assassin_aragorn@lemmy.world

        A lot of the times when you don’t know what to say, it’s not because you can’t find the right words, but the right words simply don’t exist. There’s nothing that captures your sorrow for the person.

        Funny enough, the right thing to say is that you don’t know what to say. And just offer yourself to be there for them.

      • Serinus@lemmy.world

        They’re trying to market emotion because emotion sells.

        It’s also exactly what AI should be kept away from.

        • nilloc@discuss.tchncs.de

          But AI also lies and hallucinates, so you can’t market it for writing work documents. That could get people fired.

          Really though, I wonder if the marketing was already outsourced to the LLM?

          Sadly, after working in advertising for over 10 years, I know how dumb art directors can be about messaging like this. It’s why I got out.

      • mozz@mbin.grits.dev

        Yeah. If it had any empathy this would be a good task and a genuinely helpful thing. As it is, it’s going to produce nothing but pain and confusion and false hope if turned loose on this task.

    • AFK BRB Chocolate@lemmy.world

      The article makes a mention of the early part of the movie Her, where he’s writing a heartfelt, personal card that turns out to be his job, writing from one stranger to another. That reference was exactly on target: I think most of us thought outsourcing such a thing was a completely bizarre idea, and it is. It’s maybe even worse if you’re not even outsourcing to someone with emotions but to an AI.

    • Rolando@lemmy.world

      I would abandon the friendship as a pointless endeavor

      You’re in luck, you can subscribe to an AI friend instead. /s

    • dan1101@lemm.ee

      These seem like people who treat relationships like a game or an obligation instead of really wanting to know the person.

    • ArbitraryValue@sh.itjust.works

      If I was in some sort of distress and someone sent me a comforting message and I later found out they had ChatGPT write the message for them I think I would abandon the friendship as a pointless endeavor

      My initial response is the same as yours, but I wonder… If the intent was to comfort you and the effect was to comfort you, wasn’t the message effective? How is it different from using a cell phone to get a reminder about a friend’s birthday rather than memorizing when the birthday is?

      One problem that both the AI message and the birthday reminder have is that they don’t require much effort. People apparently appreciate having effort expended on their behalf even if it doesn’t create any useful result. This is why I’m currently making a two-hour round trip to bring a birthday cake to my friend instead of simply telling her to pick the one she wants, have it delivered, and bill me. (She has covid so we can’t celebrate together.) I did make the mistake of telling my friend that I had a reminder in my phone for this, so now she knows I didn’t expend the effort to memorize the date.

      Another problem that only the AI message has is that it doesn’t contain information that the receiver wants to know, which is the specific mental state of the sender rather than just the presence of an intent to comfort. Presumably if the receiver wanted a message from an AI, she would have asked the AI for it herself.

      Anyway, those are my Asperger’s musings. The next time a friend needs comforting, I will tell her “I wish you well. Ask an AI for inspirational messages appropriate for these circumstances.”

      • Emerald@lemmy.world

        Ask an AI for inspirational messages appropriate for these circumstances.

        Don’t need to ask an AI when every website is AI-generated blogspam these days

      • candybrie@lemmy.world

        Another problem that only the AI message has is that it doesn’t contain information that the receiver wants to know, which is the specific mental state of the sender rather than just the presence of an intent to comfort.

        I don’t think the recipient wants to know the specific mental state of the sender. Presumably, the person is already dealing with a lot, and it’s unlikely they’re spending much time wondering what friends not going through it are thinking about. Grief and stress tend to be kind of self-centering that way.

        The intent to comfort is the important part. That’s why the suggestion of “I don’t know what to say, but I’m here for you” can actually be an effective thing to say in these situations.

  • PlantDadManGuy@lemmy.world

    I agree. This ad was immediately disgusting, cringy, and deflated my already floundering hope for humanity. Google sucks.

  • yesman@lemmy.world

    It’s 2027. The AI killer app never came, but LLMification has produced an unimaginable glut of mediocre media, and the most popular AI application is using it to find human-sourced material.

    The stock market is like a ship on fire, but you can buy video cards for pennies on the dollar.

  • AbouBenAdhem@lemmy.world

    The obvious missing element is another AI on Sydney’s end to summarize all the fan mail into a one-number sentiment score. At that point we can eliminate both the AIs and the mental effort, and just send each other single numbers via an ad-sponsored Google service.

    • Serinus@lemmy.world

      Hey, my buddy’s work is already doing that! Management no longer has any idea what the company does, but they know how often you click. It boils down to a decimal number, which is what they really need. Higher numbers are better.

    • thesporkeffect@lemmy.world

      Which they will unceremoniously murder after it fails to get enough traction within a month of launch.

    • sunzu@kbin.run

      They were always weird but it is getting to the point where even normies are taking notice.

      All the sex trafficking that occurs for their event alone makes it an abomination.

    • peopleproblems@lemmy.world

      Ever since I moved to an ad-reduced life, everything has been nicer. I can’t completely escape ads; they are everywhere. But minimizing them with uBlock and Pi-hole helps, along with only using video services that don’t have ads. Unfortunately, a lot of those have added ads, so I have quit them. I’ll pay extra for ad-free, just because ads make my life so miserable.

      I can’t watch broadcast TV, it’s too irritating. I can’t browse the web on a device outside my network or phone. I don’t use free apps. Hell, I don’t listen to the radio.

      I like to think it has made me a calmer person.

  • bitwaba@lemmy.world

    “Dear Sydney” presents a world where Gemini can help us offload a heartwarming shared moment of connection with our children.

    This is the problem I’ve had with the LLM announcements when they first came out. One of their favorite examples is writing a Thank You note.

    The whole point of a Thank You note is that you didn’t have to write it, but you took time out of your day anyways to find your own words to thank someone.

    • bcgm3@lemmy.world

      Ugh, who has time for that? I need all of my waking hours to be devoted to increasing work productivity and consuming products. Computers can feel my pesky feelings for me now.

    • Pissipissini Johnson 🩵! :D@sh.itjust.works

      Companies like Google don’t understand how advanced AI algorithms work. They can sort of represent things like emotions by encoding relationships between high-level concepts and trying to relate things together using logic.

      This usually just means they’ll echo the emotions of whoever gave them input and amplify them to make some form of art, though.

      People with power at Google are often very hateful people who will say hurtful things to each other, especially about concepts like money or death.

    • Flying Squid@lemmy.world

      Although I will use it to write resumes and cover letters when applying to jobs from now on. They use AI to weed out resumes. I figure the only way to beat that system is to use it against itself.

    • Duamerthrax@lemmy.world

      Sincerity is a foreign concept to MBAs, VCs, and anyone who thinks they’re on a business Grind Set. They view the world as a game and interpersonal relationships as a game mechanic.