Guides Overview
Model & Setup Guides
- Using Ollama with Continue: local AI development with Ollama
- Setting up Codestral: configure Mistral's Codestral model
- How to Self-Host a Model: self-hosting AI models
- Running Continue Without Internet: offline development setup
- Llama 3.1 Setup: getting started with Llama 3.1
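A common thread in the setup guides above is registering a local model with Continue. As a rough sketch, an Ollama-served Llama 3.1 entry in Continue's `config.json` might look like the fragment below (field names follow Continue's JSON config format; the model tag and the `apiBase` value, Ollama's default endpoint, are illustrative and should be checked against the individual guides):

```json
{
  "models": [
    {
      "title": "Llama 3.1 8B (local)",
      "provider": "ollama",
      "model": "llama3.1:8b",
      "apiBase": "http://localhost:11434"
    }
  ]
}
```

The same shape applies to self-hosted or offline setups: point `apiBase` at wherever the model server is reachable.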
Advanced Tutorials
- Build Your Own Context Provider: create custom context providers
- Custom Code RAG: implement custom retrieval-augmented generation
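To give a feel for the context-provider tutorial, here is a minimal sketch of the shape a custom context provider can take. The interfaces below are simplified local stand-ins, not Continue's actual SDK types, and the `todoProvider` is a hypothetical example; the full tutorial covers the real API surface:

```typescript
// Simplified stand-ins for Continue's context types (assumed shapes,
// not the real SDK definitions).
interface ContextItem {
  name: string;
  description: string;
  content: string;
}

interface CustomContextProvider {
  title: string;
  displayTitle?: string;
  description?: string;
  // Given the user's query, return items to inject into the prompt.
  getContextItems(query: string): Promise<ContextItem[]>;
}

// Hypothetical provider that would surface TODO comments as context.
const todoProvider: CustomContextProvider = {
  title: "todos",
  displayTitle: "TODOs",
  description: "Project TODO comments",
  async getContextItems(query: string): Promise<ContextItem[]> {
    // A real provider would scan the workspace here; this stub just
    // echoes the query so the data flow is visible.
    return [
      {
        name: "TODO list",
        description: "Open TODO items",
        content: `TODO items matching "${query}"`,
      },
    ];
  },
};
```

The key idea is that a provider is just an object with metadata plus an async `getContextItems` function; everything else (file scanning, HTTP calls, embedding lookups) lives inside that function.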
Contributing
Have a guide idea or found an issue? We welcome contributions! Check our GitHub repository to get involved.