You can run it locally with the GPT4All toolkit, along with any other LLM you want.
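For anyone curious, here's roughly what that looks like with GPT4All's Python bindings. A minimal sketch; the model filename is just an example, swap in whatever GGUF model you want:

```python
# pip install gpt4all
from gpt4all import GPT4All

# Example model name (hypothetical choice) - GPT4All downloads it on first run.
model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf")

# Run a prompt entirely on your own machine, no API calls.
with model.chat_session():
    reply = model.generate("Summarize why local inference matters for privacy.", max_tokens=200)
    print(reply)
```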
Yeah, there are loads of ways to run LLMs locally. My issue is: if you're going to call it open source and name the model used, why not just link to the source code? The open-source AI scene also almost never includes the training data with its source, which is why I wanted to see if Proton would.