• DarkThoughts@fedia.io · 9 months ago

      Hopefully we'll see more purpose-built hardware for this, like expansion cards with pretty much just tensor cores and their own RAM.

    • QuadratureSurfer@lemmy.world · 9 months ago

      I’ve got it running with a 3090 and 32GB of RAM.

      There are some models that let you run with hybrid system RAM and VRAM (it will just be slower than running it exclusively with VRAM).
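      For anyone wondering what that looks like in practice, here's a rough sketch using llama-cpp-python's `n_gpu_layers` option, which offloads only part of the model to the GPU and keeps the rest in system RAM. It assumes a CUDA-enabled build of llama-cpp-python and a local GGUF model; the model file and layer count below are just placeholders:

      ```python
      # Rough sketch of hybrid VRAM / system-RAM inference with llama-cpp-python.
      # Assumes a CUDA-enabled build; the file name and layer split are placeholders.
      from llama_cpp import Llama

      llm = Llama(
          model_path="./models/mixtral-8x7b.Q4_K_M.gguf",  # hypothetical local model file
          n_gpu_layers=20,  # offload this many layers to VRAM; the rest stay in system RAM
          n_ctx=2048,       # context window size
      )

      out = llm("Why is hybrid RAM/VRAM inference slower than pure VRAM?", max_tokens=64)
      print(out["choices"][0]["text"])
      ```

      Fewer offloaded layers means less VRAM used, but more of the forward pass runs on the CPU against slower system memory, which is where the slowdown comes from.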

      • Deceptichum@kbin.social · 9 months ago

        Yeah, but damn does it get slow.

        I always find it interesting how much slower text generation is than image generation. I can generate a 1024x1024 image in maybe 20 seconds, but with text I get about one word per second.