Using KoboldCpp

You can find the full KoboldCpp documentation here.

Step 1 - Clone the repo

git clone https://github.com/LostRuins/koboldcpp
cd koboldcpp

Step 2 - Download the model

As an example, we will use the OpenChat 3.5 model, which is what the demo instance uses. There are many other models to choose from.

Navigate to TheBloke/openchat_3.5-GGUF on Hugging Face and download one of the models, such as openchat_3.5.Q5_K_M.gguf. Place this file inside the ./models directory.
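
If you prefer the command line, a quick way to fetch the file is with curl. The URL below follows Hugging Face's standard resolve pattern for TheBloke/openchat_3.5-GGUF; adjust the filename if you pick a different quantization:

# create the models directory if it does not exist yet
mkdir -p ./models

# download the Q5_K_M quantization from Hugging Face
curl -L -o ./models/openchat_3.5.Q5_K_M.gguf \
  https://huggingface.co/TheBloke/openchat_3.5-GGUF/resolve/main/openchat_3.5.Q5_K_M.gguf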

Step 3 - Build KoboldCpp

make
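
By default, make produces a CPU-only build. KoboldCpp's Makefile also supports acceleration flags if you have the corresponding libraries installed; the flags below reflect commonly documented build options, but check the repository README for the options current to your checkout:

# build with OpenBLAS for faster CPU inference
make LLAMA_OPENBLAS=1

# build with cuBLAS for NVIDIA GPU offloading
make LLAMA_CUBLAS=1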

Step 4 - Run the server

python ./koboldcpp.py ./models/openchat_3.5.Q5_K_M.gguf
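
The server listens on port 5001 by default. koboldcpp.py also accepts optional tuning flags; the invocation below is a sketch using commonly documented options (verify against python ./koboldcpp.py --help for your version):

# larger context window, with 32 layers offloaded to an NVIDIA GPU
# (--usecublas requires a cuBLAS-enabled build)
python ./koboldcpp.py ./models/openchat_3.5.Q5_K_M.gguf \
  --port 5001 --contextsize 4096 --usecublas --gpulayers 32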

Step 5 - Enable the server in the client

First select KoboldCpp as the backend in the client:

settings -> ChatBot -> ChatBot Backend -> KoboldCpp

Then configure KoboldCpp:

settings -> ChatBot -> KoboldCpp

In the "Use KoboldCpp" settings, ensure that "Use Extra" is enabled. This allows Amica to use KoboldCpp's extra features, such as streaming.
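
To confirm that Amica can reach the server, you can query KoboldCpp's HTTP API directly. This assumes the default port 5001; the /api/extra/ routes are the "Extra" endpoints that features like streaming rely on:

# should return the name of the loaded model
curl http://localhost:5001/api/v1/model

# should return version info if the Extra API is available
curl http://localhost:5001/api/extra/version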
