Configure Replicate with Continue to access newly released language models or models you have deployed through their platform, with support for a range of models including CodeLlama.
Change `~/.continue/config.json` to look like this:
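A minimal sketch of the relevant `models` entry, assuming the standard Continue `config.json` schema; the title, model slug, and API key shown here are placeholders, not required values:

```json
{
  "models": [
    {
      "title": "Replicate CodeLlama",
      "provider": "replicate",
      "model": "codellama-13b",
      "apiKey": "YOUR_REPLICATE_API_KEY"
    }
  ]
}
```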
If you do not set the `model` parameter, it will default to `replicate/llama-2-70b-chat:58d078176e02c219e11eb4da5a02a7830a283b14cf8f94537af893ccff5ee781`.
View the source