• PushButton@lemmy.world · +2 · 5 hours ago

    > My feelings on this are conflicted. I’m happy to add a tool that helps people. But I feel like our hand was forced in a weird way.

    Oh really? Don’t tell us you’re not happy about the free marketing/traffic…

  • oantolin@discuss.online · +58 · 18 hours ago

    Normally people use ChatGPT to vibe code; this is the first instance I’m aware of where ChatGPT used people to vibe code!

    • zod000@lemmy.ml · +4 · 17 hours ago

      I’ve actually had this happen more than once because of a coworker who uses Copilot: it would “help” him with code by calling functions that don’t exist. He has since finally stopped trusting Copilot’s code without more testing.

      • JWBananas@lemmy.world · +5 · 14 hours ago

        Worse, they typically do exist, just in training data made up of others’ code. Sometimes if you put the function names into GitHub search, you can find them.

  • JackbyDev@programming.dev · +24/-2 · 19 hours ago

    Fascinating! Because this notation is already used by another tool (and possibly more), it might not be as silly as it sounds. From the headline, it sounded like some really weird API had been added to something.

    Happy to see this sort of optimism in the wake of AI causing people’s programs to get bad reviews because the AI thinks they can do things they can’t.

    Thanks for the share. Maybe I’m looking too far into it, or just in one of those moods, but this really is oddly inspirational to me.

  • palordrolap@fedia.io · +14/-2 · 18 hours ago

    TBH, this is barely any different from marketing promising that a product will have a feature the development team only finds out about later, purely by accident, when upper management asks about it.

    • L0rdMathias@sh.itjust.works · +24/-1 · 17 hours ago

      It’s worse. This is like the restaurant across the street, a company completely unaffiliated with me or even my industry, running a lunch promotion that advertises something for my business which I did not approve and do not sell.

      If a human being did this, it would be so grossly, obviously illegal it wouldn’t even have to go to court. It’s blatant fraud. Lying on this level far predates modern copyright and property law; it’s uncivilized, wild-animal behavior that has been frowned upon in almost every society since the dawn of civilization.

    • cecilkorik@lemmy.ca · +9 · 18 hours ago

      It’s much different, because you can fire your salespeople for failing to consult the engineering team, promising shit that is impossible, damaging your brand and reputation, and providing little-to-no return on investment.

      The biggest difference is that you can’t fire ChatGPT (as much as I desperately wish we could).

      • palordrolap@fedia.io · +3/-2 · 17 hours ago

        OK, yeah, you can’t control a third party’s promises (or hallucinations), but the boss isn’t going to fire someone from sales and/or marketing. They’ll fire the developer for failing to deliver.

  • YourAvgMortal@lemmy.world · +5/-1 · 18 hours ago

    ChatGPT is just fancy autocomplete, so it probably got the notation from somewhere else; it’s not really capable of inventing new stuff on its own (unless it hallucinates). It would be interesting to ask it where it saw that notation before, since you didn’t support it; in a way, you could say it’s a standard form of notation, just from a different service.

    • LesserAbe@lemmy.world · +6 · 15 hours ago

      You know it’s not strictly autocompleting sentences that previously existed, right? It’s producing the words it anticipates should follow the ones before. I’ve had it suggest code libraries that don’t exist, and you’ll hear about people going to the library to ask for books that were never written but are attributed to real authors, because the titles sound like something they would write.

      Tab music notation is super common, and although this particular service didn’t support it before, you can see how it might be the sort of request people make, so ChatGPT combined the two.
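      For reference, ASCII guitar tab looks something like this (a made-up two-bar sketch, not from the article): one line per string, highest string on top, with numbers marking which fret to play.

      ```
      e|-----0-----|-----------|
      B|---1---1---|---3---3---|
      G|-2-------2-|-4-------4-|
      D|-----------|-----------|
      A|-----------|-----------|
      E|-----------|-----------|
      ```

      It’s exactly the kind of informal, widely circulated plain-text format you’d expect to be all over a model’s training data.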

    • Paradox@lemdro.id · +2 · 14 hours ago (edited)

      IIRC, Wikipedia supports it for tab notation.

      Personally, I much prefer LilyPond. I wonder if this tool supports it. I’d love a workflow that scans sheet music and outputs LilyPond at the other end.
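      For anyone who hasn’t seen it, LilyPond is a plain-text music engraving language; a minimal input file looks something like this (a made-up fragment, compiled with `lilypond file.ly` to produce engraved output):

      ```lilypond
      \version "2.24.0"

      \relative c' {
        \time 4/4
        c4 e g e   % one bar: C major arpeggio up and back
        c1         % whole-note C to finish
      }
      ```

      Unlike ASCII tab, it describes actual pitches and durations rather than fret positions, which is why it engraves so cleanly.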