- cross-posted to:
- fosai@lemmy.world
- localllama@sh.itjust.works
Original Mistral AI blog: https://mistral.ai/news/mixtral-of-experts/
A 32k-token context window seems to be approaching a usable size.
I tried MistralOrca and, while it was impressive, its context limit kept it from being immediately useful.