It might mean we could get AI datacenter-level GPUs to run our own models on relatively cheaply as e-waste. Wouldn’t that be cool?

  • Ptsf@lemmy.world
    16 hours ago

    The hostility is because you’re questioning my answer without bothering to do even the most basic research; it’s frankly whiny and annoying. To answer you again, though: no. There’s no way you’re installing any of this hardware anywhere second-hand, as I originally stated. Anything else is copeium.

    • Draconic NEO@lemmy.dbzer0.com
      12 hours ago

      Your behavior and tone in this thread have been incredibly hostile towards people simply for expressing ideas you didn’t like or didn’t think were possible. If this is how you respond to people discussing these kinds of topics, maybe you shouldn’t participate in our communities.