# Autocomplete model

An "autocomplete model" is an LLM that is trained on a special format called fill-in-the-middle (FIM). This format is designed to be given the prefix and suffix of a code file and predict what goes between. This task is very specific, which on one hand means that the models can be smaller (even a 3B parameter model can perform well). On the other hand, this means that Chat models, though larger, will perform poorly.

In Continue, these models are used to display inline Autocomplete suggestions as you type.

If you have the ability to use any model, we recommend Codestral from Mistral.

If you want to run a model locally, we recommend [Starcoder2-3B] with [Ollama](../model-providers/top-level/ollama.md).
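
As a rough illustration of what an autocomplete integration does under the hood, the sketch below requests a single FIM completion from a local Ollama server. The endpoint, the `starcoder2:3b` model tag, and support for the `suffix` field are assumptions based on Ollama's generate API (and the model's template), not a description of Continue's own implementation.

```python
# Minimal sketch of requesting a fill-in-the-middle completion from a local
# Ollama server (default port 11434). Assumes `ollama pull starcoder2:3b`
# has been run and that the model's template supports the `suffix` field.
import json
import urllib.request

payload = {
    "model": "starcoder2:3b",
    "prompt": "def add(a, b):\n    return ",   # code before the cursor
    "suffix": "\n\nprint(add(2, 3))\n",        # code after the cursor
    "stream": False,
    "options": {"num_predict": 32},             # keep the completion short
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    completion = json.loads(resp.read())["response"]

print(completion)  # expected to be the missing middle, e.g. "a + b"
```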