Using Ollama
You can find the full Ollama documentation on the Ollama website (ollama.ai).
Step 1 - Install Ollama
Linux and WSL2
```sh
curl https://ollama.ai/install.sh | sh
```

Mac OSX

Download the macOS app from ollama.ai.
Windows
Not yet supported
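After installing, you can confirm the binary is on your `PATH` (a quick sanity check, assuming a standard install):

```shell
# Print the installed Ollama version to confirm the install succeeded
ollama --version
```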
Step 2 - Start the server
```sh
ollama serve
```

Step 3 - Download a model
For example, we will use Mistral 7B; many other models are listed in the Ollama model library.
```sh
ollama run mistral
```

Step 4 - Enable the server in the client
settings -> ChatBot -> ChatBot Backend -> Ollama
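With the backend selected, you can also sanity-check the server directly over its REST API. This sketch assumes the server is running on Ollama's default port (11434) and that the `mistral` model from Step 3 has finished downloading:

```shell
# The root endpoint confirms the server is reachable ("Ollama is running")
curl http://localhost:11434

# Request a single, non-streamed completion from the downloaded model
curl http://localhost:11434/api/generate -d '{
  "model": "mistral",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

If the second call returns a JSON object with a `response` field, the client should work end to end.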