Using Ollama

You can find the full Ollama documentation here.

Step 1 - Install Ollama

Linux and WSL2

curl https://ollama.ai/install.sh | sh

macOS

Download the macOS app from the Ollama website.

Windows

Not yet supported
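After installing on Linux/WSL2 or macOS, you can quickly confirm that the `ollama` CLI landed on your PATH. This is just a convenience check, not part of the official install script:

```shell
# Verify the install: 'command -v' checks whether the ollama binary is on PATH.
if command -v ollama >/dev/null 2>&1; then
  status="installed ($(ollama --version))"
else
  status="ollama not found on PATH"
fi
echo "$status"
```

If the binary is missing, re-run the install script (or re-download the app) before moving on to the next step.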

Step 2 - Start the server

ollama serve
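Once the server is up, its root endpoint answers with "Ollama is running". The sketch below assumes the default port 11434 (adjust the URL if you changed `OLLAMA_HOST`), and falls back to a message when the server is not reachable:

```shell
# Health check against the default Ollama endpoint (port 11434).
# The '|| echo' fallback keeps the check from aborting when the server is down.
reply=$(curl -s http://localhost:11434 || echo "server not reachable")
echo "$reply"
```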

Step 3 - Download a model

For example, we will use Mistral 7B. Many other models are available in the Ollama model library.

ollama run mistral
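`ollama run` pulls the model on first use and drops you into an interactive session. Once the model is downloaded, clients can also query it through Ollama's REST API; a minimal sketch against the `/api/generate` endpoint (prompt text is just an example, and the fallback keeps the call from failing when no server is running):

```shell
# Build a request for Ollama's /api/generate endpoint.
# "stream": false asks for a single JSON response instead of a token stream.
payload='{"model": "mistral", "prompt": "Why is the sky blue?", "stream": false}'
response=$(curl -s http://localhost:11434/api/generate -d "$payload" || echo '{}')
echo "$response"
```

This is the same API the client in the next step talks to behind the scenes.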

Step 4 - Enable the server in the client

Settings -> ChatBot -> ChatBot Backend -> Ollama
