The Tetrate Agent Router Service provides a unified gateway for accessing a range of AI models, offering fast inference as well as embedding services. The gateway acts as an intelligent router that distributes requests across multiple model providers, providing enterprise-grade reliability and performance optimization.

Getting Started

  1. Obtain an API key from the Agent Router Service portal
  2. Configure Continue to use Agent Router Service as your model provider

Configuring Models

You can also quickly use models from Tetrate on the Continue Hub by visiting the Tetrate page on the Continue Hub. If you don't see the model you want to use, the easiest option is to remix one of the models on the Tetrate page and change the model to the one you want.

Available Models

Agent Router Service supports routing to various models. Go to the Agent Router Service model catalog for a full list of supported models.
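
Because the router exposes an OpenAI API-compatible endpoint (see Configuration Options below), you may also be able to list the models available to your key programmatically. The following is a minimal sketch, assuming the openai Python package, that the router supports the standard /v1/models endpoint, and an API key stored in the TETRATE_API_KEY environment variable; the file name is illustrative.

list_models.py
# List the models reachable through the Agent Router Service.
# Assumes the router supports the standard OpenAI-compatible /v1/models endpoint.
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["TETRATE_API_KEY"],        # your Agent Router Service key
    base_url="https://api.router.tetrate.ai/v1",  # the router's OpenAI-compatible base URL
)

for model in client.models.list():
    print(model.id)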

Configuration Options

As Agent Router Service is an OpenAI API-compatible provider, you can configure it like any other OpenAI API-compatible provider: set provider: openai and apiBase: https://api.router.tetrate.ai/v1. Update your Continue configuration file (~/.continue/config.yaml) with your Agent Router Service API key:

On your local machine

config.yaml
models:
  - name: <MODEL_NAME>
    provider: openai
    model: <MODEL_ID>
    apiKey: <TETRATE_API_KEY>
    apiBase: https://api.router.tetrate.ai/v1
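
Before pointing Continue at the router, you may want to confirm the key and endpoint work. Below is a minimal sketch, assuming the openai Python package, an API key in the TETRATE_API_KEY environment variable, and <MODEL_ID> replaced with a model ID from the Agent Router Service catalog; the file name is illustrative.

check_key.py
# Quick sanity check of an Agent Router Service key against the
# OpenAI-compatible chat completions endpoint.
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["TETRATE_API_KEY"],
    base_url="https://api.router.tetrate.ai/v1",
)

response = client.chat.completions.create(
    model="<MODEL_ID>",  # replace with a model ID from the catalog
    messages=[{"role": "user", "content": "Reply with a short greeting."}],
)
print(response.choices[0].message.content)

If this prints a response, the same apiKey and apiBase values will work in the Continue configuration above.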

As a local model block

Review the configuration reference for model blocks and for using local blocks. In particular, look at the roles and capabilities sections and the model capabilities guide.
model_name.yaml
name: <MODEL_NAME>
models:
  - name: <MODEL_NAME>
    provider: openai
    model: <MODEL_ID>
    apiKey: ${{ inputs.TETRATE_API_KEY }}
    apiBase: https://api.router.tetrate.ai/v1
    roles:
      - chat
      - edit
      - apply
    capabilities:
      - tool_use

As a model block on Continue Hub

See the Continue Hub documentation to learn how to use model blocks.
model_name.yaml
name: <MODEL_NAME>
models:
  - name: <MODEL_NAME>
    provider: openai
    model: <MODEL_ID>
    apiKey: ${{ inputs.TETRATE_API_KEY }}
    apiBase: https://api.router.tetrate.ai/v1
    roles:
      - chat
      - edit
      - apply
    capabilities:
      - tool_use