Using Ollama
Step 1 - Install Ollama
Linux and WSL2
curl https://ollama.ai/install.sh | sh

macOS

Download and run the Ollama application from the Ollama website (https://ollama.ai).
Windows

Install Ollama inside WSL2 using the Linux and WSL2 instructions above.
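Whichever platform you installed on, a quick way to confirm the CLI is available (assuming ollama ended up on your PATH) is to ask it for its version:

ollama --version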
Step 2 - Start the server
ollama serve
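By default the server listens on http://localhost:11434. As a quick sanity check (assuming you kept the default port), you can ask the API which models are installed locally:

curl http://localhost:11434/api/tags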
Step 3 - Download a model

ollama run mistral
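ollama run mistral pulls the model the first time it is used and then drops you into an interactive prompt. If you would rather just fetch a model for the server to use, or check what is already downloaded, the standard Ollama CLI also provides:

ollama pull mistral
ollama list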
Step 4 - Enable the server in the client
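The client needs to know where to reach the server; unless you changed it, that is http://localhost:11434. How you select Ollama varies from client to client, but under the hood it will issue requests against Ollama's HTTP API, along the lines of this sketch (assuming the default port and the mistral model downloaded in Step 3):

curl http://localhost:11434/api/generate -d '{
  "model": "mistral",
  "prompt": "Why is the sky blue?",
  "stream": false
}'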