Ollama

Ollama is an open-source tool that lets you run large language models (LLMs) locally on your own computer. To use Ollama, first install it, then download the model you want to run with the ollama run command.
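For example, after installing Ollama, a model can be downloaded and started from the terminal (a short sketch using the chat model recommended below):

ollama run llama3.1:8b

The run command pulls the model if it is not already present locally and then opens an interactive prompt.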

Chat model

We recommend configuring Llama3.1 8B as your chat model.

config.json
{
  "models": [
    {
      "title": "Llama3.1 8B",
      "provider": "ollama",
      "model": "llama3.1:8b"
    }
  ]
}
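The model referenced in this configuration must be available on your local Ollama server. A minimal sketch, assuming the Ollama CLI is on your PATH:

ollama pull llama3.1:8b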

Autocomplete model

We recommend configuring Qwen2.5-Coder 1.5B as your autocomplete model.

config.json
{
  "tabAutocompleteModel": {
    "title": "Qwen2.5-Coder 1.5B",
    "provider": "ollama",
    "model": "qwen2.5-coder:1.5b"
  }
}
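As with the chat model, pull the autocomplete model before using it and verify that it shows up locally (a sketch, assuming a default Ollama install):

ollama pull qwen2.5-coder:1.5b
ollama list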

Embeddings model

We recommend configuring Nomic Embed Text as your embeddings model.

config.json
{
  "embeddingsProvider": {
    "provider": "ollama",
    "model": "nomic-embed-text"
  }
}
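To confirm the embeddings model is working, you can pull it and call Ollama's embeddings endpoint directly (a sketch, assuming Ollama is serving on its default port, 11434):

ollama pull nomic-embed-text
curl http://localhost:11434/api/embeddings -d '{
  "model": "nomic-embed-text",
  "prompt": "Hello, world"
}'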

Reranking model

Ollama currently does not offer any reranking models.

See the list of reranking model providers for alternatives.