Configure Amazon SageMaker with Continue to use deployed LLM endpoints for both chat and embedding models, supporting LMI and HuggingFace TEI deployments with AWS credentials
SageMaker can be used for both chat and embedding models. Chat models are supported for endpoints deployed with LMI, and embedding models are supported for endpoints deployed with HuggingFace TEI. Here is an example SageMaker configuration setup:
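A minimal sketch, assuming Continue's YAML `config.yaml` format; the model names and endpoint names below are placeholders for your own deployments:

```yaml
models:
  - name: deployed-chat-model        # placeholder display name
    provider: sagemaker
    model: my-chat-endpoint-name     # placeholder: your LMI endpoint name
    roles:
      - chat
  - name: deployed-embedding-model   # placeholder display name
    provider: sagemaker
    model: my-embedding-endpoint-name  # placeholder: your TEI endpoint name
    roles:
      - embed
```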
The value in `model` should be the name of the SageMaker endpoint you deployed. Authentication uses temporary or long-term credentials in `~/.aws/credentials` under a profile called "sagemaker":
```ini
[sagemaker]
aws_access_key_id = abcdefg
aws_secret_access_key = hijklmno
aws_session_token = pqrstuvwxyz # Optional: means short term creds.
```