Companies are going all-in on artificial intelligence right now, investing millions or even billions into the area while slapping the AI initialism on their products, even when doing so seems strange and pointless.

Heavy investment and increasingly powerful hardware tend to mean more expensive products. To find out whether people would actually pay extra for hardware with AI capabilities, TechPowerUp put the question to its forum users.

The results show that over 22,000 people, a massive 84% of the overall vote, said no, they would not pay more. More than 2,200 participants said they didn’t know, while just under 2,000 voters said yes.

  • Godort@lemm.ee

    This is one of those weird things that venture capital does sometimes.

VC is injecting cash into tech right now at obscene levels because they think that AI is going to be hugely profitable in the near future.

    The tech industry is happily taking that money and using it to develop what they can, but it turns out the majority of the public don’t really want the tool if it means they have to pay extra for it. Especially in its current state, where the information it spits out is far from reliable.

    • cheese_greater@lemmy.world

I don’t want it outside of heavily sandboxed, limited-scope applications. I don’t get why people want an agent of chaos fucking with all the files and systems they’ve cobbled together

      • FiveMacs@lemmy.ca

NDAs also legally prevent you from using this forced garbage. Companies are going to get screwed over by other companies; capitalism is gonna implode, hopefully

    • Tenthrow@lemmy.world

      I have to endure a meeting at my company next week to come up with ideas on how we can wedge AI into our products because the dumbass venture capitalist firm that owns our company wants it. I have been opting not to turn on video because I don’t think I can control the cringe responses on my face.

    • TipRing@lemmy.world

      Back in the 90s in college I took a Technology course, which discussed how technology has historically developed, why some things are adopted and other seemingly good ideas don’t make it.

      One of the things that is required for a technology to succeed is public acceptance. That is why AI is doomed.

      • SkyeStarfall@lemmy.blahaj.zone

AI is not doomed; LLMs, or consumer AI products, might be

In industry, AI is and will be used (though probably not LLMs, except in a few niche use cases)

        • TipRing@lemmy.world

Yeah, I mean the AI being shoveled at us by techbros. Actual ML stuff is currently and will continue to be useful for all sorts of not-sexy but vital research and production tasks. I do task automation for my job and I use things like transcription models and OCR; my company uses smart sorting via rapid image recognition and other really cool ways of getting computers to do things that humans are bad at. It’s things like LLMs that just aren’t there - yet. I have seen very early research on AI that is trained to actually understand language and learns by context. It’s years away, but eventually we might see AI that really can do what the current AI companies are claiming.

  • Kraiden@kbin.run

    someone tried to sell me a fucking AI fridge the other day. Why the fuck would I want my fridge to “learn my habits?” I don’t even like my phone “learning my habits!”

    • fruitycoder@sh.itjust.works

I want AI in my fridge for sure. Grocery shopping sucks. Forgetting how old something is sucks. Letting all the cold out while I crawl around to see what I have sucks.

I want my fridge to be like the Sims: just get deliveries or pick up the order, fill it out, and get told what ingredients I have. Bonus points if it can just tell me what recipes I can cook right now; even better if I can ask for a time frame.

      That would be sick!

Still not going to give ecorp all of my data or put some half-baked Internet of Things device on my WiFi for it. But it would be cool.

      • Kraiden@kbin.run

Ye, that’d be sick! And that’s also not what was being sold: this fridge did none of that. What exactly made it “AI” I didn’t bother to find out, but I work in IT and I guarantee it wasn’t this. Also, I’m not convinced I want my fridge to be able to spend my money for me. I want to be able to have a ramen month if I need/want to

      • Tinks@lemmy.world

        Absolutely this. There IS a scenario in which I would love a “smart” or “AI” fridge, but it’s gotta be damn impressive to even be worth my time.

It needs to know everything in my fridge, how long it’s been there and its expiration date, and I want it to build grocery lists for me based on what is low, and let me know ahead of time that I should use something up that’s going bad soon. Bonus points if it recommends some options for how to do that based on my tastes. And I want to do this without having to manually input or remove everything.

        But we’re still SO far from being able to do this reliably, let alone at any kind of acceptable price point, and yet fridge makers keep shoving out dumb fridges with a screen on them and calling them “smart”. I hate it.

        • fruitycoder@sh.itjust.works

          A couple planets! /s

I would be willing to never have one in my lifetime just to see climate change slowed to a rate nature can naturally adapt to and people can afford to adjust to, honestly.

I don’t foresee it being any worse than food waste and wasted grocery trips are for me.

Computer vision, a couple of services, a DB, and network access can be pretty lightweight. Any extra voice or natural-language interface is probably overkill, and without special hardware (and the ecological cost of that) not worth it from an energy-use standpoint.

          All speculation of course
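The “couple services and a db” part really can be lightweight. A minimal sketch of the inventory piece, assuming a plain SQLite table (all item names and dates here are made up):

```python
import sqlite3

# Hypothetical minimal "smart fridge" inventory: one SQLite table of items
# with added/expiry dates stored as ISO strings.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (name TEXT, added TEXT, expires TEXT)")
conn.executemany(
    "INSERT INTO items VALUES (?, ?, ?)",
    [("milk",  "2024-06-26", "2024-07-03"),
     ("eggs",  "2024-06-21", "2024-07-15"),
     ("salsa", "2024-05-22", "2024-06-30")],
)

today = "2024-07-01"
# "Use this up soon": expired or expiring within 3 days, oldest first.
# ISO date strings compare correctly as plain text.
soon = conn.execute(
    "SELECT name FROM items WHERE expires <= date(?, '+3 days') "
    "ORDER BY expires",
    (today,),
).fetchall()
names = [n for (n,) in soon]
print(names)  # → ['salsa', 'milk']
```

The hard part is not the database; it is getting items into it without manual entry, which is where the computer-vision piece would come in.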

    • Zron@lemmy.world

      Why does a fridge need to know your habits?

      It has to keep the food cold all the time. The light has to come on when you open the door.

What could it possibly be learning?

      • 1995ToyotaCorolla@lemmy.world

        Hi Zron, you seem to really enjoy eating shredded cheese at 2:00am! For your convenience, we’ve placed an order for 50lbs of shredded cheese based on your rate of consumption. Thanks!

          • Kraiden@kbin.run

I think you’re being sarcastic, but I unironically agree. Cars and fridges can, and should, stay dumb, with the notable exception of battery management systems in electric vehicles. That’s the single acceptable use case for a car, IMHO.

            • captainlezbian@lemmy.world

I think CarPlay is a wonderful feature. My car should absolutely allow syncing with my phone. I don’t think it should send telemetry or anything like that, though. But I think internal process monitoring should also be a thing: display error codes, show me that a tire is low, monitor the battery, etc. But the manufacturer shouldn’t get that info. My car shouldn’t know my sex life, and the manufacturer definitely shouldn’t.

            • Oh I absolutely agree, some things don’t need to be “smart”.

Imagine if someone put a microchip in a potato peeler claiming that it would add features like “sensing the amount of pressure applied to the potato to ensure clean peels”. The reason they haven’t done that is that the data would only benefit the user, and they can’t think of a way to make it benefit the company’s profit margins.

        • variants@possumpat.io

          We also took the liberty of canceling your health insurance to help protect the shareholders from your abhorrent health expenses in the far future

          • rottingleaf@lemmy.world

If your fridge spies on you, certain people can gain better insight into how healthy your food habits are, how organized you are, how often things go bad and get thrown out, and what medicine (the kind that has to be kept cold) you store there and how often you use it.

That will then affect your insurance, your credit rating, and possibly many other ratings other people are interested in.

      • njm1314@lemmy.world

So it can see what you like to eat, then it can tell your grocery store, then your grocery store can raise the prices on those items. That’s the point. It’s the same thing as those memberships and coupon apps. That’s the end goal.

        • rottingleaf@lemmy.world

They can already see what you like to eat from what you’re buying, LOL. No, not this.

A fridge can give them information on how you eat.

      • JackbyDev@programming.dev
        1. Know when you’re about to put groceries in so it makes the fridge colder so the added heat doesn’t make things go bad.
        2. Know when you don’t use it and let it get a tiny bit warmer to save a teeny bit of power. (The vast majority of power is cooling new items, not keeping things cold though.)
        3. Tell you where things are?
        4. Ummm… Maybe give you an optimized layout of how to store things?
        5. Be an attack vector on your home’s wifi
        6. Wait, no, uh,
        7. Push notifications
        8. Do you not have phones?
    • jubilationtcornpone@sh.itjust.works

      I’m still pissed about the fact that I can’t buy a reasonably priced TV that doesn’t have WiFi. I should never have left my old LG Plasma bolted to the wall of my previous house when I sold it. That thing had a fantastic picture and doubled as a space heater in the winter.

      • cestvrai@lemm.ee

        Projector gang checking in 🤓📽️

        Everything alright here?

        You can always join us in the peaceful realm of select input.

        (there are still WiFi-free options)

      • AdrianTheFrog@lemmy.world

It doesn’t seem all that hard to make, as long as you don’t mind the severely reduced flexibility in capacity and glass bottles shattering against each other at the bottom.

        • TheGrandNagus@lemmy.world

          Not to mention the increased expense, loudness, greater difficulty cleaning, and many more points of failure!

      • Kraiden@kbin.run

Now THIS I could get behind! Still not AI though; it’s a very dumb timer system that would be very useful. 1950s tech could do this!

    • Ragnarok314159@sopuli.xyz

And it would improve your life zero. That’s what’s absurd about LLMs in their current iteration: they provide almost no benefit to the vast majority of people.

      All a learning model would do for a fridge is send you advertisements for whatever garbage food is on sale. Could it make recipes based on what you have? Tell it you want to slowly get healthier and have it assist with grocery selection?

      Nah, fuck you and buy stuff.

      • captainlezbian@lemmy.world

        Exactly, it’s entirely about extra monetization. They all think in terms of hype and money, never in terms of life improvement.

I’d actually love AI to control something like a Home Assistant setup, learning how I like things and predicting changes (mind you, I still need to get it set up at all). But most people don’t even want a smart home.

        Make something that makes the unpleasant parts of life easier and people will be happy with it

  • peopleproblems@lemmy.world

    AI in Movies: “The only Logical solution, is the complete control/eradication of humanity.”

    AI in Real Life: “Dave, I see you only have beer, soda, and cheese in your fridge. I am concerned for your health. I can write you a reminder to purchase better food choices.” Dave: “THESE AI ARE EVIL, I WILL NEVER SUBMIT TO YOUR POWER!”

    • catloaf@lemm.ee

      I would pay less, and then either use it for dumb stuff or just not use it at all.

  • BlackLaZoR@kbin.run

    There’s really no point unless you work in specific fields that benefit from AI.

Meanwhile every large corpo tries to shove AI into every possible place it can. They’d introduce ChatGPT to your toilet seat if they could.

    • br3d@lemmy.world

      “Shits are frequently classified into three basic types…” and then gives 5 paragraphs of bland guff

      • Krackalot@discuss.tchncs.de

        With how much scraping of reddit they do, there’s no way it doesn’t try ordering a poop knife off of Amazon for you.

      • catloaf@lemm.ee

        It’s seven types, actually, and it’s called the Bristol scale, after the Bristol Royal Infirmary where it was developed.

        • br3d@lemmy.world

I know. But I was satirising GPT’s bland writing style, not providing facts.

    • fuckwit_mcbumcrumble@lemmy.dbzer0.com

Someone recently demoed AI acceleration for 3D upscaling (think DLSS/AMD’s equivalent), and it showed a nice boost in performance. It could be useful in the future.

I think it’s kind of like ray tracing. We don’t have a real use for it now, but eventually someone will figure out something it’s actually good for and use it.

      • AdrianTheFrog@lemmy.world

We have plenty of real uses for ray tracing right now, from Blender to whatever that Avatar game was doing, to Lumen, to partial RT, to full path tracing. You just can’t do real-time GI with any semblance of fine detail without RT, from what I’ve seen (although the Lumen SDF mode gets pretty close).

Although the RT cores themselves are more debatably useful, they still give a decent performance boost most of the time over “software” RT.

      • NekuSoul@lemmy.nekusoul.de

        AI acceleration for 3d upscaling

        Isn’t that not only similar to, but exactly what DLSS already is? A neural network that upscales games?

        • fuckwit_mcbumcrumble@lemmy.dbzer0.com

But instead of relying on the GPU to power it, a dedicated AI chip did the work. It had its own distinct chip on the graphics card that handled the upscaling.

          I forget who demoed it, and searching for anything related to “AI” and “upscaling” gets buried with just what they’re already doing.

          • barsoap@lemm.ee

            That’s already the nvidia approach, upscaling runs on the tensor cores.

And no, it’s not something magical, it’s just matrix math. AI workloads are lots of convolutions on gigantic, low-precision, floating-point matrices. Low-precision because neural networks are robust against random perturbation, and rounding is exactly that, random perturbation; there’s no point in spending electricity and heat on high precision if it doesn’t make the output any better.

The kicker? Those tensor cores are less complicated than ordinary GPU cores. For general-purpose hardware, and that includes consumer-grade GPUs, it’s way more sensible to make sure the ALUs can deal with 8-bit floats and leave everything else the same. That stuff is going to be standard by the next generation of even potatoes: every SoC with an integrated GPU has enough oomph to sensibly run reasonable inference loads. And by “reasonable” I mean actually quite big; as far as I’m aware, e.g. Firefox’s built-in translation runs on the CPU, the models are small enough.

Nvidia, OTOH, is very much in the market for AI accelerators and figured it could corner the upscaling market and sell another new generation of cards by making its software rely on those cores, even though it could run on the other cores. As AMD demonstrated, their stuff also runs on Nvidia hardware.

What’s actually special sauce in that area is the RT cores, that is, accelerators for casting rays through BVH trees. That’s indeed specialised hardware, but those things are nowhere near fast enough to compute enough rays for even remotely tolerable output, which is where all that upscaling/denoising comes into play.
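The robustness-to-rounding claim is easy to sanity-check: quantize a random weight matrix to 8-bit integers and compare outputs. A toy sketch using a made-up symmetric int8 scheme, not any vendor’s actual format:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 64)).astype(np.float32)  # "weights"
x = rng.standard_normal(64).astype(np.float32)        # input vector

scale = np.abs(W).max() / 127.0          # map the largest weight to 127
W_q = np.round(W / scale).astype(np.int8)
W_deq = W_q.astype(np.float32) * scale   # dequantize for comparison

y_full = W @ x
y_quant = W_deq @ x

# The per-weight rounding error behaves like small random noise and
# mostly washes out in the sum.
rel_err = float(np.linalg.norm(y_full - y_quant) / np.linalg.norm(y_full))
print(rel_err)  # small (well under 5% for this setup)
```

Each weight now takes one byte instead of four, and the output barely moves, which is exactly why spending silicon on high precision buys nothing here.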

            • fuckwit_mcbumcrumble@lemmy.dbzer0.com

Nvidia’s tensor cores are inside the GPU; this was outside the GPU, but on the same card (the PCB looked like an abomination). If I remember right, in total it used slightly less power but performed about 30% faster than normal DLSS.

              • AdrianTheFrog@lemmy.world

Having to send full frames off the GPU for extra processing has got to come with some extra latency/problems compared to just doing it on the GPU. And I’d be shocked if it has the motion vectors and other engine data that DLSS uses, which would require games to be specifically modified for this adaptation. IDK, but I don’t think we have enough details about this to really judge whether it’s useful or not, although I’m leaning toward ‘not’ for this particular implementation. They never showed any actual comparisons to DLSS either.

As a side note, I found this other article on the same topic where they obviously didn’t know what they were talking about and mixed up frame rates and power consumption; it’s very entertaining to read:

                The NPU was able to lower the frame rate in Cyberpunk from 263.2 to 205.3, saving 22% on power consumption, and probably making fan noise less noticeable. In Final Fantasy, frame rates dropped from 338.6 to 262.9, resulting in a power saving of 22.4% according to PowerColor’s display. Power consumption also dropped considerably, as it shows Final Fantasy consuming 338W without the NPU, and 261W with it enabled.

                • NekuSoul@lemmy.nekusoul.de

                  I’ve been trying to find some better/original sources [1] [2] [3] and from what I can gather it’s even worse. It’s not even an upscaler of any kind, it apparently uses an NPU just to control clocks and fan speeds to reduce power draw, dropping FPS by ~10% in the process.

                  So yeah, I’m not really sure why they needed an NPU to figure out that running a GPU at its limit has always been wildly inefficient. Outside of getting that investor money of course.

  • magiccupcake@lemmy.world

Most people already have pretty decent AI hardware in the form of a GPU.

Sure, dedicated hardware might be more efficient for mobile devices, but that’s already done better in the cloud.

    • PriorityMotif@lemmy.world

The Google Coral TPU has been around for years and it’s cheap. It works well for object detection.

      https://docs.frigate.video

There are a lot of use cases in manufacturing where you can do automated inspection of parts as they go by on a conveyor, or have a robot arm pick and place parts/boxes/pallets, etc.

      Those types of systems have been around for decades, but they can always be improved.
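A toy stand-in for that kind of conveyor inspection, diffing each frame against a “golden” reference image. Real systems run a trained detector (e.g. on a Coral TPU); everything here, thresholds included, is made up:

```python
import numpy as np

def inspect(frame, template, tol=10.0):
    # Toy pass/fail: mean absolute pixel difference against a "golden"
    # reference image of a known-good part.
    diff = np.abs(frame.astype(np.float32) - template.astype(np.float32))
    return float(diff.mean()) < tol

rng = np.random.default_rng(1)
golden = rng.integers(0, 255, (32, 32), dtype=np.uint8)  # reference part

good = golden.copy()        # identical part -> passes
bad = golden.copy()
bad[4:20, 4:20] = 0         # large "defect" patch -> fails

print(inspect(good, golden), inspect(bad, golden))  # → True False
```

Template diffing like this breaks under lighting and alignment changes, which is precisely why learned models took over this niche.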

    • Nomecks@lemmy.ca

      It’s not really done better in the cloud if you can push the compute out to the device. When you can leverage edge hardware you save bandwidth fees and a ton of cloud costs. It’s faster in the cloud because you can leverage a cluster with economies of scale, but any AI company would prefer the end-user to pay for that compute instead, if they can service requests adequately.

      • AdrianTheFrog@lemmy.world

Yeah, you also have to deal with the latency of the cloud, which is a big problem for a lot of possible applications

  • OhmsLawn@lemmy.world

I honestly have no idea what AI does to a processor, and would therefore not pay extra for the badge.

    If it provided a significant speed improvement or something, then yeah, sure. Nobody has really communicated to me what the benefit is. It all seems like hand waving.

    • originalucifer@moist.catsweat.com

What they mean is that they’re putting in dedicated processors or other hardware just to run an LLM. It doesn’t speed up anything other than the faux-AI tool they’re implementing.

LLMs require a ton of math that is better suited to video processors than to the general-purpose CPU in most machines.
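That “ton of math” is mostly matrix multiplication; a transformer’s feed-forward block, for instance, is essentially two matmuls with a nonlinearity between them. A toy NumPy sketch with made-up, tiny dimensions:

```python
import numpy as np

rng = np.random.default_rng(42)
d_model, d_ff = 8, 32          # toy sizes; real LLMs use thousands
W1 = rng.standard_normal((d_model, d_ff))
W2 = rng.standard_normal((d_ff, d_model))

def ffn(x):
    # One transformer-style feed-forward block: matmul -> ReLU -> matmul.
    # This is the kind of work GPUs/NPUs parallelize far better than CPUs.
    return np.maximum(x @ W1, 0.0) @ W2

x = rng.standard_normal((4, d_model))   # a "batch" of 4 token vectors
y = ffn(x)
print(y.shape)  # → (4, 8)
```

Every output element is an independent dot product, so thousands of simple cores can compute them all at once; a CPU with a handful of fat cores cannot.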

    • tal@lemmy.today

      I honestly have no Idea what AI does to a processor

      Parallel processing capability. CPUs historically worked with mostly-non-massively-parallelizable tasks; maybe you’d use a GPU if you wanted that.

      I mean, that’s not necessarily “AI” as such, but LLMs are a neat application that uses them.

      On-CPU video acceleration does parallel processing too.

      Software’s going to have to parallelize if it wants to get much by way of performance improvements, anyway. We haven’t been seeing rapid exponential growth in serial computation speed since the early 2000s. But we can get more parallel compute capacity.

  • FiniteBanjo@lemmy.today

People already aren’t paying for them; Nvidia’s main source of income right now is industry use, not consumer parts.

  • cmrn@lemmy.world

I still don’t understand how the AI buzzword 10x’d all these valuations, when it’s always either (a) exactly what they were doing before, now with a fancy new name, or (b) AI deliberately shoehorned in, in ways with no practical benefit

    • dinckel@lemmy.world

Isn’t that the entire point of what most business people do? The whole goal is to upsell some schmuck by speaking too fast and mentioning a lot of words that don’t really mean anything. The difference now is that the business person in this case is the leadership of most of the tech industry

  • TheEntity@lemmy.world

    And what do the companies take away from this? “Cool, we just won’t leave you any other options.”

  • metaStatic@kbin.earth

This goes to show just how far the current grift has gone.

    AI enhanced hardware? Jesus Fuck take all my money that’s amazing.

    Dedicated LLM chatbot hardware? Die in a fire for even suggesting this is AI.

  • Sagrotan@lemmy.world

They’ll pay for it. When the tech companies decide it’s a thing to make money off of and advertise it, all the good ants will buy, buy, buy, and the rest of the time they will work, work, work for it.

  • Sam_Bass@lemmy.world

It’s bad enough they shove it at you on some websites. I’m really not interested in being their lab rat.

  • t00l@lemmy.world

They want you to buy the hardware and pay for the additional energy costs so they can deliver Clippy 2.0, the watching-you-wank edition.