• JAWNEHBOY@reddthat.com
    8 hours ago

    I didn’t even realize how big a deal RVA23 was with hypervisor support as a mandatory feature. So RISC-V is hoping to leapfrog ARM into the data center and pull consumer computing forward along with it?

    • Riskable@programming.dev
      3 hours ago

      RVA23 is a big deal because it allows the big players (e.g. Google, Amazon, Meta, OpenAI, Anthropic, and more) to avoid vendor lock-in for their super duper ultra wicked mega tuned-to-fuck-and-back specialty software (not just AI stuff). Basically, they can tune their software to a generic platform to the nth degree and then switch chips later if they want without having to re-work that level of tuning.
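
      One way to picture that "tune once, switch chips later" point: software can gate its codepaths on the extensions the profile guarantees, rather than on one vendor's chip. A simplified sketch below checks the single-letter extensions in a RISC-V ISA string (the kind Linux exposes in /proc/cpuinfo); the ISA string is hypothetical, and real code would use the kernel's hwprobe interface or HWCAP rather than string parsing:

      ```c
      #include <stdbool.h>
      #include <stdio.h>
      #include <string.h>

      /* Simplified sketch: does a RISC-V ISA string advertise a given
       * single-letter extension? Only scans the single-letter section
       * before any "_z..." multi-letter suffixes, and skips the
       * "rv64"/"rv32" prefix (which itself contains a 'v'). */
      static bool has_ext(const char *isa, char ext) {
          const char *end = strchr(isa, '_');   /* stop at multi-letter exts */
          size_t len = end ? (size_t)(end - isa) : strlen(isa);
          for (size_t i = 4; i < len; i++)      /* skip "rv64"/"rv32" */
              if (isa[i] == ext)
                  return true;
          return false;
      }

      int main(void) {
          const char *isa = "rv64imafdcvh_zicsr_zifencei";  /* hypothetical chip */
          /* 'v' = vector, 'h' = hypervisor -- both mandatory in RVA23 */
          printf("vector:     %s\n", has_ext(isa, 'v') ? "yes" : "no");
          printf("hypervisor: %s\n", has_ext(isa, 'h') ? "yes" : "no");
          return 0;
      }
      ```

      The point of the profile is that code like this stops being necessary: if the chip claims RVA23, 'v' and 'h' are simply there, whichever vendor made it.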

      The other big reason why RISC-V is a big deal right now is energy efficiency. Roughly 40% of a data center’s operating cost is cooling. By using right-sized RISC-V chips in their servers they can save a ton of money on cooling. Compare that to, say, an Intel Xeon, where the chip wastes energy on zillions of unused extensions and sub-architecture stuff (thank Transmeta for that). Every little unused part of a huge, power-hungry chip like a Xeon eats power and generates heat.

      Don’t forget that vector extensions are also mandatory in RVA23. That’s just as big a deal as the virtualization stuff because AI (which heavily relies on vector math) is now the status quo for data center computing.
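
      Concretely, the payoff of a mandatory vector extension is that ordinary math loops get vectorized with no source changes. A minimal sketch (the profile flag name `-march=rva23u64` in recent GCC/Clang is an assumption here; older toolchains used flags like `-march=rv64gcv`):

      ```c
      #include <stdio.h>

      /* Plain scalar dot product -- the core loop of most AI/vector-math
       * workloads. Built with a recent GCC/Clang and -march=rva23u64,
       * the compiler is free to auto-vectorize this into RVV
       * instructions, because RVA23 guarantees the V extension exists
       * on every conforming chip. */
      static float dot(const float *a, const float *b, int n) {
          float sum = 0.0f;
          for (int i = 0; i < n; i++)
              sum += a[i] * b[i];
          return sum;
      }

      int main(void) {
          float a[4] = {1.0f, 2.0f, 3.0f, 4.0f};
          float b[4] = {5.0f, 6.0f, 7.0f, 8.0f};
          printf("%.1f\n", dot(a, b, 4));  /* 1*5 + 2*6 + 3*7 + 4*8 = 70.0 */
          return 0;
      }
      ```

      Same source, any RVA23 chip: that is what makes a mandatory vector extension different from an optional one that software can’t count on.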

      My prediction is that AI workload enhancements will become a necessary feature in desktops and laptops soon too. But not because of anything Microsoft integrates into their OS and Office suites (e.g. Copilot). It’ll be because of Internet search and gaming.

      Using an AI to search the Internet is such a vastly superior experience that there’s no way anyone will want to go back once they’ve tried it. Also, for it to work well, it needs to run queries on the user’s behalf locally, not in Google’s or Microsoft’s cloud.

      There’s no way end users are going to pay for an inferior product that only serves search results from a single company (e.g. Microsoft’s solution—if they ever make one—will for sure use Bing and it would never bother to search multiple engines simultaneously).

    • alessandro@lemmy.caOP
      7 hours ago

      Well, ARM looks like it’s hoping to leapfrog over x86 (Intel/AMD) in desktop computing. Once “RISC” translation technology (Box86, FEX, and the like) heads into PC gaming… we may begin to see alternatives to the companies that fed on the PC gaming industry (mostly AMD/Nvidia) and are now turning their back on it after various other things came along (cryptocurrency, AI…)