# Using Ollama
You can find the full Ollama documentation here.
## Step 1 - Install Ollama
### Linux and WSL2
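On Linux and WSL2, Ollama is typically installed with its official install script (as always when piping a script into a shell, you may want to inspect it first):

```shell
# Download and run the official Ollama install script
curl -fsSL https://ollama.com/install.sh | sh
```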
### macOS
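On macOS, you can download the desktop app from https://ollama.com/download. If you use Homebrew, installing from the command line is also possible (the package name `ollama` is an assumption; check `brew info ollama` first):

```shell
# Install Ollama via Homebrew (alternative to the desktop app)
brew install ollama
```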
### Windows
Not yet supported
## Step 2 - Start the server
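The server is started with a single command (on systems where the installer registered Ollama as a system service, it may already be running):

```shell
# Start the Ollama server; by default it listens on http://localhost:11434
ollama serve
```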
## Step 3 - Download a model
As an example, we will use Mistral 7B. There are many other models to choose from in the library.
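Models are fetched by name with `ollama pull` (the tag `mistral` refers to the default Mistral 7B build; the download is several gigabytes):

```shell
# Download the Mistral 7B weights
ollama pull mistral

# Or download and immediately open an interactive chat session
ollama run mistral
```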
## Step 4 - Enable the server in the client
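The exact setting depends on your client, but every client talks to Ollama's HTTP API, which listens on `http://localhost:11434` by default. Before configuring the client, you can verify the server and model are reachable with a request like this (the prompt is just an illustration):

```shell
# Send a single prompt to the Mistral model via the Ollama REST API
curl http://localhost:11434/api/generate \
  -d '{"model": "mistral", "prompt": "Why is the sky blue?", "stream": false}'
```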