Workers should learn AI skills and companies should adopt AI because it’s a “cognitive amplifier,” claims Satya Nadella.

In other words: please help us, use our AI.

  • SparroHawc@lemmy.zip · 8 days ago

    It’s not the query that burns through electricity like crazy, it’s training the models.

    You can run a query yourself at home on a desktop computer, as long as it has enough RAM and compute to support the model you’re using (think a few high-end GPUs).

    Training a model requires a huge pile of compute power though, and the AI companies are constantly scraping the internet to ~~steal~~ find more training material.
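As a rough sketch of the “enough RAM to support the model” point above (my own illustrative numbers, not from the comment): the floor on memory is roughly parameter count times bytes per parameter, which is why quantization is what makes big models fit on home hardware.

```python
def memory_needed_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate memory to hold the model weights alone.

    bytes_per_param: 2.0 for fp16 weights, ~0.5 for 4-bit quantized weights.
    Real usage is higher (KV cache, activations), so treat this as a floor.
    """
    return params_billions * 1e9 * bytes_per_param / 1024**3

# Hypothetical 70B-parameter model, fp16 vs. 4-bit quantization:
fp16 = memory_needed_gb(70, 2.0)  # ~130 GB: multiple high-end GPUs
q4 = memory_needed_gb(70, 0.5)    # ~33 GB: one big GPU, or system RAM
print(round(fp16, 1), round(q4, 1))
```

The 4-bit figure is why a desktop with plenty of RAM can serve queries that the fp16 model never could.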

    • sobchak@programming.dev · 7 days ago

      Dunno if that’s true or not. Generally, much more compute is used in inference than training, since you only train once, then use that model for millions of queries or whatever. However, some of these AI companies may be training many models constantly to one-up each other and pump their stock; dunno. The “thinking” model paradigm is also shifting a lot more compute to inference. IIRC OpenAI spent $300k of compute just on inference to complete a single benchmark a few months ago (and found that, like training, exponentially increasing amounts of compute are needed for small gains in performance).
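The training-vs-inference trade-off above can be sketched with the standard rule-of-thumb approximations (not figures from the thread): training costs about 6 FLOPs per parameter per training token, and a forward pass costs about 2 FLOPs per parameter per generated token. The model size and token count below are hypothetical.

```python
def training_flops(n_params: float, n_train_tokens: float) -> float:
    # Rule of thumb: ~6 FLOPs per parameter per training token.
    return 6 * n_params * n_train_tokens

def inference_flops_per_token(n_params: float) -> float:
    # Forward pass only: ~2 FLOPs per parameter per generated token.
    return 2 * n_params

# Hypothetical 70B-parameter model trained on 1.4T tokens:
N, D = 70e9, 1.4e12
break_even_tokens = training_flops(N, D) / inference_flops_per_token(N)
print(f"{break_even_tokens:.2e}")  # 3 * D = 4.2e12 generated tokens
```

At, say, 500 tokens per reply, that break-even is on the order of 8 billion replies — reachable for a model serving millions of users daily, which supports the point that inference dominates at scale, and “thinking” models multiply the tokens per query on top of that.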