Stamets@lemmy.world to People Twitter@sh.itjust.works · 1 year ago
The dream (image post)
CeeBee@lemmy.world · 1 year ago, replying to "I don't know of an LLM that works decently on personal hardware":
Ollama with ollama-webui. Models like solar-10.7b and mistral-7b work nicely on local hardware. Solar 10.7b should work well on a card with 8 GB of VRAM.
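For anyone curious what "running it locally" looks like in practice, here is a minimal sketch of querying a local Ollama server from Python. It assumes Ollama is installed and running on its default port (11434) and that a model such as mistral has already been pulled (e.g. `ollama pull mistral`); the `/api/generate` endpoint is Ollama's standard REST API.

```python
# Minimal sketch: ask a locally running Ollama server for a completion.
# Assumes Ollama is running on localhost:11434 and the "mistral" model
# has already been pulled with `ollama pull mistral`.
import json
import urllib.request

def ask_ollama(prompt: str, model: str = "mistral") -> str:
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one complete response instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask_ollama("Why run an LLM on local hardware?"))
```

The same server also backs ollama-webui, so the browser UI and scripts like this can share one local model install.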
ParetoOptimalDev@lemmy.today · 1 year ago:
If you have really low specs, use the recently open-sourced Microsoft Phi model.