• latenightnoir@lemmy.world · 23 hours ago

    Sad to see you leave (not really, tho’), love to watch you go!

    Edit: I bet if any AI developing company would stop acting and being so damned shady and would just ASK FOR PERMISSION, they’d receive a huge amount of data from all over. There are a lot of people who would like to see AGI become a real thing, but not if it’s being developed by greedy and unscrupulous shitheads. As it stands now, I think the only ones who are actually doing it for the R&D and not as eye-candy to glitz away people’s money for aesthetically believable nonsense are a handful of start-up-likes with (not in a condescending way) kids who’ve yet to have their dreams and idealism trampled.

    • daniskarma@lemmy.dbzer0.com · 18 hours ago

      In Spain we trained an AI using a mix of public datasets made available for AI training and public-sector material (legislation, congress sessions, etc.). And the AI turned out quite good. Obviously not top of the line, but very good overall.

      It was a public project not a private company.

    • HakFoo@lemmy.sdf.org · 22 hours ago

      But what data would it be?

      Part of the “gobble all the data” perspective is that you need a broad corpus to be meaningfully useful. Not many people are going to grant you an $892 billion market cap when your model is only a genius about a handful of narrow subjects that you could get deep volunteer support on.

      OTOH, there’s probably a sane business in narrow, siloed (cheap, efficient, and with more bounded expectations) AI products: the reinvention of the “expert system” with clear guardrails; the image generator that only does seaside background landscapes but can’t generate a cat to save its life; the LLM that’s a prettified version of a knowledgebase search and NOTHING MORE.

      • latenightnoir@lemmy.world · 19 hours ago

        You’ve highlighted exactly why I also fundamentally disagree with the current trend of all things AI being for-profit. This should be 100% non-profit and driven purely by scientific goals, in which case using copyrighted data wouldn’t even be an issue in the first place… It’d be like literally giving someone access to a public library.

        Edit: but to focus on this specific instance, where we have to deal with the here-and-now: I could see them receiving, say, 60-75% of the data they have now, hassle-free - at the very least, and uniformly distributed. Again, AI development isn’t what irks most people; it’s calling plagiarism generators and search-engine fuck-ups AI and selling them back to the very people who generated the databases used for those abhorrences - or, worse, working toward replacing those people entirely with LLMs!

        Train the AI to be factually correct instead and sell it as an easy-to-use knowledge base? Aces! Train the AI to write better code and sell it as an on-board stackoverflow Jr.? Amazing! Even having it as a mini-assistant on your phone so that you have someone to pester you to get the damned laundry out of the washing machine before it starts to stink is a neat thing, but that would require less advertising and shoving down our throats, and more accepting the fact that you can still do that with five taps and a couple of alarm entries.

        Edit 2: oh, and another thing which would require a buttload of humility, but would alleviate a lot of tension would be getting it to cite and link to its sources every time! Have it be transformative enough to give you the gist without shifting into plagiarism, then send you to the source for the details!

  • Embargo@lemm.ee · 22 hours ago

    Oh no! How will I generate a picture of Sam Altman blowing himself now!?

  • isableandaking@lemmy.world · 22 hours ago

    I mean if they pay for it like everyone else does I don’t think it is a problem. Yes it will cost you billions and billions to do it correctly, but then you basically have the smartest creature on earth (that we know of) and you can replicate/improve on it in perpetuity. We still will have to pay you licensing fees to use it in our daily lives, so you will be making those billions back.

    Now, I would say let them use anything that is old or freeware - textbooks, government-owned material, etc. We sponsored it with our learning and our taxes, so we get a stake in all AI companies: humanity gets a 51% share in any AI business built on humanity’s knowledge. We are then free to vote on how the tech is used, we hold a controlling share, and whatever price is set, we get half of it back in taxes at the end of the year. The more you use it, the more you pay - and the more you get back.

    • Grimy@lemmy.world · 18 hours ago

      If it costs billions and billions, then only a handful of companies can afford to build an AI and they now have a monopoly on a technology that will eventually replace a chunk of the workforce. It would basically be giving our economy to Google.

      • isableandaking@lemmy.world · 26 minutes ago

        Yep, exactly - that’s why you make it people-owned. What is your alternative? There are companies/governments that can afford it even at these steep prices.

    • interdimensionalmeme@lemmy.ml · 16 hours ago

      They’re unprofitable as it is already. They’re not going to be able to generate enough upfront capital to buy and then enclose all of humanity’s previous works, only to sell it back to us. I also think it would be heinous for them to enclose and exploit our commons in this manner. It belongs to all of us. Sure, train it and use it, but also release it openly (or the government can confiscate it; I’m fine with that as well). Anything but allowing those rat-snakes to keep it all for themselves.

      • isableandaking@lemmy.world · 23 minutes ago

        They can be even more unprofitable like Amazon was for years and years - and now they print money. I don’t think it’s a bad model, but it’s gonna come down to just a couple governments/companies having powerful AIs where we are not needed anymore - so if it’s privately owned it would spell doom for the human species or at least a huge portion of it, potential enslavement as well.

    • PapaStevesy@lemmy.world · 22 hours ago

      The owners of the copyrighted works should be paid in perpetuity too though, since part of their work goes into everything the AI spits out.

      • isableandaking@lemmy.world · 27 minutes ago

        I don’t see why I’m downvoted for this, but I don’t agree with this opinion - it’s like teaching a human being. Even if you buy everything only once, it’s still a hell of a bill. We are talking all books, all movies, all games, all software, all memes, all things - one of each is still trillions if you legally want to train your new thing on it.

  • TheBrideWoreCrimson@sopuli.xyz · 13 hours ago

    My main takeaway is that some contrived notion of “national security” has now become an acceptable justification for business decisions in the US.

  • LovableSidekick@lemmy.world · 22 hours ago

    Alright, I confess! Almost all of my training in computer programming came from copyrighted material. Put the cuffs on me!

    • snooggums@lemmy.world · 21 hours ago

      You were trained and learned and are able to create new things.

      AI poorly mimics things it has seen before.

      • LovableSidekick@lemmy.world · 6 hours ago

        The issue being raised is copyright infringement, not the quality of the results. Writers “borrow” each other’s clever figures of speech all the time without payment or attribution. I’m sure I have often copypasted code without permission. AI does nothing on its own, it’s always a tool used by human beings. I think the copyright argument against AI is built on a false distinction between using one tool vs another.

        My larger argument is that nobody has an inherent right to control what everybody else does with some idea they’ve created. For many thousands of years people saw stuff and freely imitated it. Oh look, there’s an “arch” - I think I’ll build a building like that. Oh look, that tribe uses that root to make medicine, let’s do the same thing. This process was known as “the spread of civilization” until somebody figured out that an authority structure could give people dibs on their ideas and force other people to pay to copy them. As we evolve more capabilities (like AI) I think it’s time to figure out another way to reward creators without getting in the way of improvement, instead of hanging onto a “Hey, that’s Mine!” mentality that does more to enrich copy producers than it does to enrich creators.

        • snooggums@lemmy.world · 10 hours ago

          Yes, whether copyright should exist is a different discussion than how AI is violating it in a very different way than snippets being reused in different contexts as part of a new creative work.

          Intentionally using a single line is very different than scooping up all the data and hitting a randomizer until it stumbles into some combination that happens to look usable. Kind of like how a single business jacking up prices is different than a monopoly jacking up all the prices.

          • LovableSidekick@lemmy.world · 9 hours ago

            Stripping away your carefully crafted wording, the differences fade away. “Hitting a randomizer” until usable ideas come out is an equally inaccurate description of either human creativity or AI. And again, the contention is that using AI violates copyright, not how it allegedly does that.

            • snooggums@lemmy.world · 6 hours ago

              So the other thing with AI is the companies are not just making money on the output like an artist would. They are making bank on investors and stock market speculation that exists only because they scooped up massive amounts of copyrighted materials to create their output. It really isn’t comparable to a single artist or even a collection of artists.

              • LovableSidekick@lemmy.world · 6 hours ago

                Again, AI doesn’t do anything, any more than hammers and saws build houses. People use AI to do things. Anyway, profiting from investors and speculators without giving creators a piece of the action isn’t a consequence of AI, it’s how our whole system already works.

  • tehn00bi@lemmy.world · 8 hours ago

    Perhaps this is just a problem with the way the model works: it always requires new data and is unable to sit with its current data - to ponder and expand upon it while making new connections about the ideas that influenced the author. LLMs are a smoke-and-mirrors show, not a real intelligence.

      • This is fine🔥🐶☕🔥@lemmy.world · 18 hours ago

        And he also said “child pornography is not necessarily abuse.”

        In the US, it is illegal to possess or distribute child pornography, apparently because doing so will encourage people to sexually abuse children.

        This is absurd logic. Child pornography is not necessarily abuse. Even if it was, preventing the distribution or posession of the evidence won’t make the abuse go away. We don’t arrest everyone with videotapes of murders, or make it illegal for TV stations to show people being killed.

        Wired has an article on how these laws destroy honest people’s lives.

        https://web.archive.org/web/20130116210225/http://bits.are.notabug.com/

        Big yikes from me whenever I see him venerated.

      • ccunning@lemmy.world · 22 hours ago

        Yes, and he killed himself after the FBI threw the book at him for doing exactly what these AI assholes are doing without repercussions.

        • FauxLiving@lemmy.world · 20 hours ago

          And for some reason suddenly everyone leaps back to the side of the FBI and copyright because it’s a meme to hate on LLMs.

          It’s almost like people don’t have real convictions.

          You can’t be Team Aaron when it’s popular and then Team Copyright Maximalist when the winds change and it’s time to hate on LLMs or diffusion models.

  • shaggyb@lemmy.world · 19 hours ago

    “How am I supposed to make any money if I can’t steal all of my products to sell back to the world that produced them?”

    Yeah, fuck that. The whole industry deserves to die.