Nvidia reveals new A.I. chip, says costs of running LLMs will ‘drop significantly’::Currently, Nvidia dominates the market for AI chips, with over 80% market share, according to some estimates.

  • abhibeckert@lemmy.world · 1 year ago

    if I wanted to run, say, BLOOM (an open-source LLM), I’d need to spend close to $100K on hardware

    Doesn’t that require dozens of nodes with over a terabyte of RAM each? And state-of-the-art networking?

    Sounds closer to $100M than $100K.

    • GenderNeutralBro@lemmy.sdf.org · 1 year ago

      If you want to train your own network like they did, you’d want something like that, yeah, but to run the trained network you “only” need ~360GB of memory.

      For context, even if you wanted to run this on CPU, there are currently no AM5 motherboards (Ryzen 7000 series) that support more than 192GB of memory. You literally can’t even run it on high-end consumer hardware.
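
      The ~360GB figure lines up with a back-of-the-envelope weight-memory estimate. This is a sketch, not an official sizing: it assumes BLOOM’s 176 billion parameters stored at 2 bytes each (bfloat16/float16 inference weights), and ignores activations and KV cache, which add more on top.

      ```python
      # Rough memory estimate for holding BLOOM's weights in RAM.
      # Assumptions: 176B parameters, 2 bytes per parameter (bf16/fp16).
      # Activations and KV cache are NOT included.
      PARAMS = 176e9
      BYTES_PER_PARAM = 2

      weights_gb = PARAMS * BYTES_PER_PARAM / 1e9
      print(f"~{weights_gb:.0f} GB just for the weights")  # ~352 GB
      ```

      At ~352GB for the weights alone, the gap to the 192GB consumer-platform ceiling is obvious even before counting runtime overhead.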