Using LLaVA
LLaVA / BakLLaVA can be used with llama.cpp.
See the llama.cpp documentation for full details.
As an example, we will use the BakLLaVA-1 model, which is what powers the demo instance.
Navigate to the BakLLaVA-1 model page and download either the q4 or q5 quantization, as well as the mmproj-model-f16.gguf file.
The mmproj-model-f16.gguf file contains the multimodal projector and is required for the vision model to process images.
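With both files downloaded, the server can be started along the following lines. This is a minimal sketch: the exact model filename and context size are assumptions, and flag names should be checked against your llama.cpp version with `./server --help`.

```shell
# Start the llama.cpp server with the quantized model and the
# multimodal projector file (filenames here are examples).
./server \
  -m ./BakLLaVA-1.q4_k_m.gguf \
  --mmproj ./mmproj-model-f16.gguf \
  -c 4096 \
  --host 0.0.0.0 --port 8080
```

The `--mmproj` flag is what enables the vision pathway; without it the server loads the language model only.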
Read the documentation for more information on the server options, or run ./server --help.
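Once the server is running, an image can be sent in a request. The sketch below assumes the server is on localhost:8080 and uses the `image_data` field with an `[img-N]` placeholder in the prompt; the request shape may differ between llama.cpp versions, so verify it against the server documentation.

```shell
# Hypothetical request: base64-encode a local image and reference it
# in the prompt via the matching [img-12] tag.
curl http://localhost:8080/completion -d '{
  "prompt": "USER:[img-12]\nDescribe the image.\nASSISTANT:",
  "image_data": [{"data": "'"$(base64 -w0 image.jpg)"'", "id": 12}]
}'
```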