VexCatalyst@lemmy.fmhy.ml to Selfhosted@lemmy.world · English · 1 year ago
Guide: Self-hosting open source GPT chat with no GPU using GPT4All (forum.tuxdigital.com)
12 comments · 76 upvotes · 6 downvotes
Stefen Auris@pawb.social · 1 year ago
I loved this; my only disappointment is that you can't use it as a server that others can connect to and use through the chat interface.

webghost0101@lemmy.fmhy.ml · 1 year ago (edited)
Use this web UI (it's the Stable Diffusion-style UI, but for LLMs): https://github.com/oobabooga/text-generation-webui
I am pretty sure it has a server option. Here is a leaderboard of the models it likely supports, including GPT4All: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
The best one I tried is Wizard Vicuna 13B, running on an RTX 2070.

Stefen Auris@pawb.social · 1 year ago
Oh hey, this is super useful, thanks! :D
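For anyone following the suggestion above: a minimal sketch of running text-generation-webui so other machines on your network can reach the chat interface. The `--listen` and `--listen-port` flags come from the project's README; flag names change between releases, so check the current docs before relying on this.

```shell
# Clone the web UI suggested in the thread
git clone https://github.com/oobabooga/text-generation-webui
cd text-generation-webui

# Install dependencies (a Python virtual environment is assumed;
# the project also ships one-click installers that handle this)
pip install -r requirements.txt

# Start in server mode: --listen binds to 0.0.0.0 instead of
# localhost, so other devices on the LAN can connect;
# --listen-port selects the port to serve on.
python server.py --listen --listen-port 7860
```

Other devices on the network can then open the chat interface at http://HOST-IP:7860 in a browser (substitute the machine's actual LAN address).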