Llama.cpp

Get started with Llama.cpp

Llama.cpp is an open-source C/C++ inference engine for running LLMs locally. Its bundled `llama-server` binary exposes an OpenAI-compatible HTTP API (port 8080 by default), which this provider connects to.
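Before configuring the provider, the llama.cpp server needs to be running. A minimal sketch, assuming you have built llama.cpp and downloaded a GGUF model file (the model path below is a placeholder, not from this doc):

```shell
# Start the llama.cpp OpenAI-compatible server on port 8080
# (the default port, matching the apiBase used in the config below).
# <MODEL_FILE> is a placeholder for your local GGUF model file.
llama-server -m ./models/<MODEL_FILE>.gguf --port 8080
```

Once started, the server listens at http://localhost:8080, which is the `apiBase` the configuration points at.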
Configuration

```yaml
name: My Config
version: 0.0.1
schema: v1
models:
  - name: <MODEL_NAME>
    provider: llama.cpp
    model: <MODEL_ID>
    apiBase: http://localhost:8080
```
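To confirm the configuration points at a working server, you can query the OpenAI-compatible endpoint directly. A hedged sketch, assuming a llama.cpp server is already running on port 8080 (`<MODEL_ID>` is the same placeholder used in the config above):

```shell
# Send a test chat completion to the local llama.cpp server.
# This uses the OpenAI-compatible /v1/chat/completions endpoint
# that llama-server exposes.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "<MODEL_ID>",
        "messages": [{"role": "user", "content": "Hello!"}]
      }'
```

A JSON response containing a `choices` array indicates the server is reachable and the provider configuration should work.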