• boonhet@lemm.ee
    2 hours ago

    What price point are you trying to hit?

    With regard to AI? None, tbh.

    With this super-fast storage I have other cool ideas, but I don’t think I can get enough bandwidth to saturate it.

    • gravitas_deficiency@sh.itjust.works
      1 hour ago

      You’re willing to pay $0 to have hardware ML support for local training and inference?

      Well, I’ll just say that you’re gonna get what you pay for.