• vale@sh.itjust.works

    Take a look at Ollama.ai and just follow the installation instructions. A decent GPU is recommended, and the models are around 10GB each, iirc.
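
    Once it's installed and the server is running, you can talk to it over its local REST API. Here's a minimal sketch in Python, assuming Ollama is listening on its default port 11434 and you've already pulled a model (the "llama3" name and the prompt are just examples, swap in whatever you downloaded):

    ```python
    # Minimal sketch: query a locally running Ollama server via its REST API.
    # Assumes Ollama is installed, the server is up on the default port 11434,
    # and a model (here "llama3" as an example) has already been pulled.
    import requests

    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "llama3",         # example model name; use whichever you pulled
            "prompt": "Why is the sky blue?",
            "stream": False,           # return one JSON object instead of a token stream
        },
        timeout=120,
    )
    resp.raise_for_status()
    print(resp.json()["response"])     # the generated text
    ```

    You can also just run it interactively from the terminal with the CLI if you don't need to script anything.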