Pro@programming.dev to Technology@lemmy.world · English · 5 days ago
Google quietly released an app that lets you download and run AI models locally (github.com)
You can use it in termux

Greg Clarke@lemmy.ca · English · 5 days ago
Has this actually been done? If so, I assume it would only be able to use the CPU.

Euphoma@lemmy.ml · English · 5 days ago
Yeah, I have it in Termux. Ollama is in the package repos for Termux. The speed it generates at does feel like CPU speed, but idk.
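
Since Ollama exposes the same local HTTP API no matter where it runs, here is a minimal Python sketch of how you could check the "feels like CPU speed" observation once the Termux package is installed and `ollama serve` is running. Assumptions not in the thread: the server is on its default port 11434, and a small model such as `llama3.2:1b` has already been pulled; swap in whatever model you actually have.

```python
# Minimal sketch: query a locally running Ollama server (e.g. started with
# `ollama serve` inside Termux) over its default HTTP API on port 11434.
# The model name "llama3.2:1b" is an assumption; use any model you have pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

payload = {
    "model": "llama3.2:1b",  # assumed small model, friendlier to a phone CPU
    "prompt": "Explain what Termux is in one sentence.",
    "stream": False,         # return one JSON object instead of a token stream
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

print(body["response"])

# eval_count / eval_duration (nanoseconds) give a rough tokens-per-second figure,
# which is one way to confirm whether generation really is CPU-bound.
if body.get("eval_duration"):
    tps = body["eval_count"] / (body["eval_duration"] / 1e9)
    print(f"~{tps:.1f} tokens/s")
```

Single-digit tokens per second on a phone would be consistent with CPU-only inference; a GPU- or NPU-accelerated path would typically be noticeably faster for the same model size.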