• GreenShimada@lemmy.world
    21 days ago

    For anyone unsure: the Jevons paradox is that when more of a resource is available to consume, humans will simply consume more of it rather than use the gains to make better use of the resource.

    Case in point: AI models could be written to be more efficient in token use (see DeepSeek), but instead AI companies just buy up all the GPUs and shove more compute in.

    For the expansive bloat, the same goes for phones. Our phones are orders of magnitude better than they were 10 years ago, and now they’re loaded with bloat because the manufacturers think, “Well, there’s more compute and memory. Let’s shove more bloat in there!”

    • VibeSurgeon@piefed.social
      21 days ago

      Case in point: AI models could be written to be more efficient in token use

      They are being written to be more efficient in inference, but the gains are being offset by trying to wring more capabilities out of the models by ballooning token use.

      Which is indeed a form of the Jevons paradox.

      • errer@lemmy.world
        21 days ago

        Costs have been dropping by a factor of 3 per year, but token use increased 40x over the same period. So while the efficiency is contributing a bit to the use, the use is exploding even faster.
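        Back-of-envelope with the figures above (a sketch, assuming both factors apply over the same window):

```python
# Hedged back-of-envelope using the rough figures quoted above:
# per-token cost falls ~3x while token consumption grows ~40x.
cost_drop = 3.0
usage_growth = 40.0

# Net change in total spend: usage growth outruns the cost savings.
spend_multiplier = usage_growth / cost_drop
print(f"Total spend still grows ~{spend_multiplier:.1f}x")
```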

    • GamingChairModel@lemmy.world
      21 days ago

      The Jevons paradox is that when more of a resource is available to consume, humans will simply consume more of it rather than use the gains to make better use of the resource.

      More specifically, it’s when an improvement in efficiency causes the underlying resource to be used more, because the efficiency reduces cost and using that resource then becomes even more economically attractive.

      So when factories got more efficient at using coal in the 19th century, England saw a huge increase in coal demand, despite using less coal for any given task.
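      The mechanism above can be sketched as a toy elasticity model (the constant-elasticity demand curve and all numbers are illustrative assumptions, not an economic claim from the thread):

```python
# Toy model of the Jevons paradox: demand for the useful service follows a
# constant-elasticity curve D(p) = k * p**(-e), where p is the cost per unit
# of useful work, p = resource_price / efficiency.
def resource_use(efficiency: float, elasticity: float,
                 resource_price: float = 1.0, k: float = 1.0) -> float:
    p = resource_price / efficiency   # cost per unit of useful work
    demand = k * p ** (-elasticity)   # units of useful work demanded
    return demand / efficiency        # raw resource actually consumed

# Elasticity > 1: doubling efficiency INCREASES total resource use (the paradox).
assert resource_use(2.0, 1.5) > resource_use(1.0, 1.5)
# Elasticity < 1: efficiency gains do reduce resource use.
assert resource_use(2.0, 0.5) < resource_use(1.0, 0.5)
```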

      • Quetzalcutlass@lemmy.world
        21 days ago

        Also Eli Whitney inventing the cotton gin to make separating cotton fiber from seed less of a tedious and backbreaking process, which led to a massive expansion of slave plantations in the American South due to the increased output and profitability of the crop.

  • OwOarchist@pawb.social
    21 days ago

    When you become one with the penguin, though … then you can begin to feel how much faster modern hardware is.

    Hell, I’ve got a 2016 budget-model chromebook that still feels quick and snappy that way.

    • MudMan@fedia.io
      21 days ago

      But… 2016 was a decade ago. If it feels quick and snappy that way that means the post is right.

      Which it kinda isn’t but hey.

      • Skullgrid@lemmy.world
        21 days ago

        The point is that the software is what’s wrong, not the hardware. It feels snappy because it’s running Linux, not because it’s old hardware.

  • brotato@slrpnk.net
    21 days ago

    The tech debt problem will keep getting worse as product teams keep promising more in less time. Keep making developers move faster. I’m sure nothing bad will come of it.

    Capitalism truly ruins everything good and pure. I used to love writing clean code and now it’s just “prompt this AI to spit out sloppy code that mostly works so you can focus on what really matters… meetings!”

    • dfyx@lemmy.helios42.de
      21 days ago

      Well, until you open a browser… or five, because these days nobody wants to build native applications anymore, and instead they shove webapps into Electron containers.

      Right now, my laptop doesn’t have to run much: just KDE, a browser, email, a music player, a couple of messengers, and some background services. In total, that uses about 9.5 GB of RAM. 20 years ago we would have run the same workload in less than 1 GB.
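      For anyone curious how a figure like that is computed, here’s a sketch of the free(1)-style arithmetic over a /proc/meminfo-style dump (the sample numbers below are made up for illustration):

```python
# Compute "used" RAM the way free(1) roughly does, from /proc/meminfo text.
# SAMPLE is a fabricated stand-in for the real file's contents.
SAMPLE = """\
MemTotal:       16384000 kB
MemFree:         1024000 kB
Buffers:          512000 kB
Cached:          4096000 kB
"""

def used_kb(meminfo: str) -> int:
    fields = {}
    for line in meminfo.splitlines():
        key, value = line.split(":")
        fields[key.strip()] = int(value.split()[0])  # drop the "kB" suffix
    # used = total - free - buffers - cache (ignoring SReclaimable for brevity)
    return fields["MemTotal"] - fields["MemFree"] - fields["Buffers"] - fields["Cached"]

print(f"{used_kb(SAMPLE) / 1024 / 1024:.1f} GB in use")
```

      On a real Linux box you would feed it `open("/proc/meminfo").read()` instead of the sample string.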

  • mlg@lemmy.world
    20 days ago

    The modern web is an insult to the idea of efficiency at practically every level.

    You cannot convince me that isolation and sandboxing require a fat 4 GB slice of RAM for a measly 4 tabs.

    • kalpol@lemmy.ca
      20 days ago

      It is crazy that I can have a Core 2 Duo with 8 GB of RAM that struggles to load web pages.

      • CovfefeKills@lemmy.world
        19 days ago

        Actshually, it’s bandwidth censorship: if you make something too heavy to be used, then it won’t get used. It’s one of the things China is doing to separate their internet from the rest of the world’s, by having a domestic internet so blazingly fast that going out to the world wide web feels unbearable.

        So yesh, the Epstein class are making the news too slow for typical users to access. /maybe some sarcasm, maybe not, I’m not sure yet

        EDIT: I have decided I was not being sarcastic. https://ioda.inetintel.cc.gatech.edu/reports/shining-a-light-on-the-slowdown-ioda-to-track-internet-bandwidth-throttling/

        
        Episodes of network throttling have been reported in countries like Russia, Iran, Egypt, Zimbabwe, and many more, especially during politically sensitive periods such as elections and protests. In some cases, entire regions, such as Iran’s Khuzestan province, have experienced indiscriminate throttling, regardless of the protocol or specific services in use. Throttling is particularly effective and appealing to authoritarian governments for several reasons: it is simple to implement, difficult to detect or attribute, and hard to circumvent.
  • Whitebrow@lemmy.world
    21 days ago

    I still remember playing StarCraft 2 shortly after release on a $300 laptop, and it ran perfectly well on medium settings.

    Looked amazing. Felt incredibly responsive. Polished. Optimized.

    Nowadays it’s RTX this, framegen that; you need an SSD or loading times are abysmal; oh, and don’t forget the 40 GB of storage and 32 GB of RAM for a 3-hour-long walking simulator. How about you optimize your goddamn game instead? Don’t even get me started on the price tags for these things.

    Software and game development is definitely a spectrum, but holy shit, the proportion of sloppy releases is so high that it’s hard to see that at times.

  • GenderNeutralBro@lemmy.sdf.org
    21 days ago

    Everything bad people said about web apps 20+ years ago has proved true.

    It’s like, great, now we have consistent cross-platform software. But it’s all bloated, slow, and only “consistent” with itself (if even that). The world raced to the bottom, and here we are. Everything is bound to lowest-common-denominator tech. Everything has all the disadvantages of client-server architecture, even when it all runs (or should run) locally.

    It is completely fucking insane how long I have to wait for lists to populate with data that could already be in memory.

    But at least we’re not stuck with Windows-only admin consoles anymore, so that’s nice.

    All the advances in hardware performance have been used to make it faster (more to the point, “cheaper”) to develop software, not faster to run it.

  • OctopusNemeses@lemmy.world
    21 days ago

    I’m pretty sure the “unused RAM is wasted RAM” thing has caused its share of damage through shit developers who took it to mean “use memory with reckless abandon.”

    • ThePantser@sh.itjust.works
      21 days ago

      It would be nice if I could force programs to use more RAM, though. I actually have 100 GB of DDR4 in my desktop; I bought it over a year ago when DDR4 was unloved and cheap. But I’ve tried to force programs not to offload as much. Like Firefox: I hate that I have the RAM, but it still unloads webpages in the background and won’t ever use more than 6 GB.

      • iglou@programming.dev
        21 days ago

        Programs that care about memory optimization will typically adapt to your setup, up to a point. More RAM isn’t going to make a program run any better if it has no use for it.

        • Vlyn@lemmy.zip
          21 days ago

          Don’t fully disable swap on Windows, it can break things :-/

            • Vlyn@lemmy.zip
              21 days ago

              Maybe it has changed again, but I gave it a try in the past, first when 16 GB was a lot, then when 32 GB was a lot. I always thought “I’m not filling up the RAM anyway, might as well disable it!”

              Yeah, no, Windows is not a fan. You get random “running out of memory” errors, even though with 16 GB I still had 3-4 GB of RAM free.

              Some apps require the page file, same for crash dumps. So I just set it to a fixed value (like 32 GB min and max) on my 64 GB machine.

      • floquant@lemmy.dbzer0.com
        21 days ago

        Set swappiness to 5 or something similar, or disable swap altogether unless you’re regularly getting close to max usage

    • Vlyn@lemmy.zip
      21 days ago

      With 32 and 64 GB systems I’ve never run out of RAM, so the RAM isn’t the issue at all.

      Optimization just sucks.

        • Vlyn@lemmy.zip
          20 days ago

          Decent sized for what?

          Creative writing and roleplay? Plenty, but I try to fit it into my 16 GB VRAM as otherwise it’s too slow for my liking.

          Coding/complex tasks? No, that would need 128 GB and upwards, and it would still be awfully slow, unless you use a Mac with unified memory.

          For image and video generation you’d want to fit it into GPU VRAM again, system RAM would be way too slow.
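          The rough arithmetic behind those sizes, as a sketch (the quantization levels and model sizes below are illustrative assumptions, and KV-cache/runtime overhead is ignored):

```python
# Rule of thumb for how much memory an LLM's weights alone need:
# bytes ≈ parameter_count × bytes_per_weight.
def weight_gb(params_billion: float, bits_per_weight: int) -> float:
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

# A mid-size model quantized to 4 bits squeezes under 16 GB of VRAM...
assert weight_gb(24, 4) < 16
# ...while a large model at 8 bits wants on the order of 128 GB.
assert weight_gb(120, 8) >= 120
```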

  • bampop@lemmy.world
    21 days ago

    My PC is 15 times faster than the one I had 10 years ago. It’s the same old PC but I got rid of Windows.

  • oyo@lemmy.zip
    21 days ago

    Windows 11 is the slowest Windows I’ve ever used, by far. Why do I have to wait 15-45 seconds to see my folders when I open Explorer? If you have a slow or intermittent internet connection, it’s literally unusable.

  • Michal@programming.dev
    20 days ago

    PCs aren’t that much faster per core; they have more cores, so they can do more at a time, but it takes effort to optimize for parallel work. Also, the form factor keeps getting smaller, more people use laptops now, and you can’t cheat thermal efficiency.
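    A minimal sketch of that point: the cores are there, but the program has to be restructured to use them (toy CPU-bound example):

```python
# Serial and parallel versions of the same CPU-bound job. Extra cores only
# help the second one; nothing about the first gets faster for free.
from concurrent.futures import ProcessPoolExecutor

def busy_sum(n: int) -> int:
    return sum(i * i for i in range(n))

def serial(chunks):
    return sum(busy_sum(n) for n in chunks)

def parallel(chunks):
    with ProcessPoolExecutor() as pool:          # one worker per core by default
        return sum(pool.map(busy_sum, chunks))   # same result, spread over cores

if __name__ == "__main__":
    chunks = [200_000] * 8
    assert serial(chunks) == parallel(chunks)    # identical answers, but only
    # the parallel version actually exercises the extra cores.
```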

    • leftzero@lemmy.dbzer0.com
      20 days ago

      My first PC ran at 16 MHz on turbo.

      PCs today are orders of magnitude faster. Way less fun, but faster.

      What’s even more orders of magnitude slower and infinitely more bloated is software. Which is the point of the post.

      It’s almost impossible to find any piece of actually optimised software these days (with some exceptions, like SQLite), to the point that 99 percent of the software currently in use could be considered unintentional (or intentional) malware.

      Particularly egregious are web browsers, which seem designed to waste the maximum possible amount of resources and run as inefficiently as possible.

      And the fact that most supposedly desktop software these days runs on top of one of those pieces of intentional malware (it’s impossible to achieve such levels of inefficiency and bloat unintentionally; it requires active effort) obviously doesn’t help.

        • Blue_Morpho@lemmy.world
          18 days ago

          Only on some name-brand PCs, which used it for compatibility. On home-built or local-store machines, the turbo button would overclock. I remember telling a friend that although their 16 MHz chip could run at 20, not to do it because it would compromise longevity! Ha! Mind you, the CPUs in those days didn’t have heat sinks, but still. Oh no, your 386 might not work in 20 years from running too hot!