
Adding Models

This section explains the steps to add AWS Bedrock models and configure the required access controls.
1

Navigate to AWS Bedrock Models in AI Gateway

From the TrueFoundry dashboard, navigate to AI Gateway > Models and select AWS Bedrock.
Navigate to AWS Bedrock Models

2

Add AWS Bedrock Account Name and Collaborators

Give a unique name to the Bedrock account, which will be used later to refer to its models. The models in the account will be referred to as @providername/@modelname. Add collaborators to your account. You can decide which users/teams have access to the models in the account (User Role) and who can add/edit/remove models in this account (Manager Role). You can read more about access control here.
AWS Bedrock Model Account Form

3

Add Region and Authentication

Select the default AWS region for the models in this account. The account-level region serves as the default for all models unless explicitly overridden at the model level. Provide the authentication details that the gateway will use to access the Bedrock models. TrueFoundry supports both AWS Access Key/Secret Key and Assume Role based authentication. The sections below explain how to generate the access/secret keys or roles.
Required IAM Policy

First, create the IAM policy that grants permission to invoke Bedrock models. This policy can be attached to either an IAM user (for access key authentication) or an IAM role (for assumed role authentication). The following policy grants permission to invoke all models in your available regions (to check the list of available regions for different models, refer to AWS Bedrock):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Sid": "InvokeAllModels",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": [
        "arn:aws:bedrock:*::foundation-model/*",
        "arn:aws:bedrock:*:<aws-account-id>:inference-profile/*",
        "arn:aws:bedrock:*:<aws-account-id>:application-inference-profile/*"
      ]
    }
  ]
}
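If you manage IAM in code, the policy above can be rendered with your account ID substituted. A minimal sketch (the account ID passed in below is a placeholder):

```python
import json

def build_bedrock_invoke_policy(aws_account_id: str) -> str:
    """Render the Bedrock invoke policy shown above with the account ID filled in."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Sid": "InvokeAllModels",
                "Action": [
                    "bedrock:InvokeModel",
                    "bedrock:InvokeModelWithResponseStream",
                ],
                "Resource": [
                    "arn:aws:bedrock:*::foundation-model/*",
                    f"arn:aws:bedrock:*:{aws_account_id}:inference-profile/*",
                    f"arn:aws:bedrock:*:{aws_account_id}:application-inference-profile/*",
                ],
            }
        ],
    }
    return json.dumps(policy, indent=2)

# "123456789012" is a placeholder account ID.
print(build_bedrock_invoke_policy("123456789012"))
```

The output can be passed directly to `aws iam create-policy --policy-document`.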
Using AWS Access Key and Secret
  1. Create an IAM user (or choose an existing IAM user) following these steps.
  2. Attach the IAM policy created above to this user.
  3. Create an access key for this user as per this doc.
  4. Use this access key and secret while adding the provider account to authenticate requests to the Bedrock model.
Using Assumed Role

The gateway role assumes your role, which in turn accesses Bedrock models.
  1. Create an IAM role in your AWS account that has access to Bedrock. Attach the IAM policy with Bedrock permissions (shown above) to this role.
  2. Configure the trust policy for this role to allow the gateway role to assume it. Use the appropriate role ARN based on your deployment:
For SAAS deployments:
  • Gateway role ARN: arn:aws:iam::416964291864:role/tfy-ctl-production-ai-gateway-deps
For on-prem deployments:
  • Your gateway role ARN will look like: arn:aws:iam::<your-aws-account-id>:role/<account-prefix>-truefoundry-deps
Trust Policy

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Statement1",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::416964291864:role/tfy-ctl-production-ai-gateway-deps"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {
          "sts:ExternalId": "your-external-id"
        }
      }
    }
  ]
}
Replace the Principal AWS ARN in the trust policy with the appropriate gateway role ARN based on your deployment type (SAAS or on-prem).
You can optionally configure an external ID in the trust policy (as shown in the example above) for additional security. If you use an external ID, make sure to provide the same external ID when creating the Bedrock model integration in TrueFoundry.
  3. Read more about how assumed roles work here.
4

Add Models

Select the models from the list that you want to add. You can use Select All to select all the models.
If the model you are looking for is not present in the options, you can add it using + Add Model at the end of the list.
TrueFoundry AI Gateway supports all text and image models in Bedrock. The complete list of models supported by Bedrock can be found here.

Inference

After adding the models, you can perform inference using an OpenAI-compatible API via the Playground or integrate with your own application.
Infer Model in Playground or Get Code Snippet to integrate in your application
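As a sketch, a chat completion request to the gateway's OpenAI-compatible API using only the standard library. The base URL, API key, and model name below are placeholders; copy the exact values from the Code Snippet shown for your model:

```python
import json
import urllib.request

# Placeholders: copy the real values from your model's Code Snippet button.
GATEWAY_BASE_URL = "https://<your-control-plane-url>/api/llm"  # hypothetical
API_KEY = "<your-truefoundry-api-key>"
MODEL = "my-bedrock-account/claude-3-5-sonnet"  # hypothetical provider-account/model name

def build_payload(model: str, messages: list) -> dict:
    """Build an OpenAI-style chat completion request body."""
    return {"model": model, "messages": messages}

def chat(messages: list) -> dict:
    """POST the payload to the gateway's OpenAI-compatible endpoint."""
    body = json.dumps(build_payload(MODEL, messages)).encode()
    req = urllib.request.Request(
        f"{GATEWAY_BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Requires a reachable gateway:
# reply = chat([{"role": "user", "content": "Hello"}])
# print(reply["choices"][0]["message"]["content"])
```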

FAQ:

How do I override the default cost for a model?

In case you have custom pricing for your models, you can override the default cost by clicking the Edit Model button and then choosing the Private Cost Metric option.
Edit Model

Set custom cost metric

Can I add models from different regions?

Yes, you can add models from different regions. You can provide a top-level default region for the account and also override it at the model level.
What is Cross-Region Inference?

Cross-Region Inference dynamically routes your inference requests across multiple AWS regions to optimize performance and handle traffic bursts. Bedrock selects the best region based on load, latency, and availability. Learn more in the AWS Bedrock Cross-Region Inference documentation.

Key Difference: Inference Profile ID vs Model ID

To use cross-region inference, you must use an Inference Profile ID instead of a regular model ID. Inference profiles define the foundation model and the AWS regions where requests can be routed.
  • Regular Model ID: anthropic.claude-3-5-sonnet-20240620-v1:0 (single region)
  • Inference Profile ID: us.anthropic.claude-3-5-sonnet-20240620-v1:0 (cross-region routing)
Inference profile IDs can be:
  • System-defined geographic profiles: Use geographic prefixes (us., eu., apac.) followed by the model ID (e.g., us.anthropic.claude-3-5-sonnet-20240620-v1:0). The prefix indicates routing within that geography.
  • Custom inference profiles: Use full ARN format (e.g., arn:aws:bedrock:us-east-1:123456789012:inference-profile/my-profile)
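Assuming the prefix conventions above, a small helper to derive a system-defined geographic profile ID from a regular model ID, and to check whether an ID already carries one:

```python
GEO_PREFIXES = ("us", "eu", "apac")

def to_inference_profile_id(model_id: str, geography: str) -> str:
    """Prefix a regular Bedrock model ID with a geographic routing prefix."""
    if geography not in GEO_PREFIXES:
        raise ValueError(f"unknown geography: {geography!r}")
    return f"{geography}.{model_id}"

def is_inference_profile_id(model_id: str) -> bool:
    """True if the ID starts with a system-defined geographic prefix."""
    return model_id.split(".", 1)[0] in GEO_PREFIXES

print(to_inference_profile_id("anthropic.claude-3-5-sonnet-20240620-v1:0", "us"))
# us.anthropic.claude-3-5-sonnet-20240620-v1:0
```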
How to Use It

1. Add the inference profile ID as a model: When adding a model in TrueFoundry, use the inference profile ID (e.g., us.anthropic.claude-3-5-sonnet-20240620-v1:0) instead of the regular model ID. If it’s not in the dropdown, use + Add Model and enter it manually.
AWS Bedrock cross-region inference configuration interface

2. Configure IAM permissions for ALL destination regions: This is critical. When Bedrock routes to a different region, your IAM role/access key must have permissions in that region. You must grant permissions for both the inference profile and the foundation model in all destination regions.

Update your IAM policy to use * for the region to allow access across all regions:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["bedrock:InvokeModel", "bedrock:InvokeModelWithResponseStream"],
      "Resource": ["arn:aws:bedrock:*::foundation-model/*", "arn:aws:bedrock:*:YOUR-AWS-ACCOUNT-ID:inference-profile/*"]
    }
  ]
}
Replace YOUR-AWS-ACCOUNT-ID in the policy above with your actual AWS account ID. The * in the region position allows access across all regions.
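One way to catch region-pinned ARNs before deploying: a small check that every Bedrock resource ARN in a policy uses * in the region field (the fourth :-separated component). A sketch; the sample policy below is illustrative:

```python
import json

def region_pinned_resources(policy_json: str) -> list:
    """Return Bedrock resource ARNs whose region field is not the wildcard '*'."""
    policy = json.loads(policy_json)
    pinned = []
    for stmt in policy.get("Statement", []):
        resources = stmt.get("Resource", [])
        if isinstance(resources, str):
            resources = [resources]
        for arn in resources:
            parts = arn.split(":")
            # arn:aws:bedrock:<region>:<account>:<resource> -> region at index 3
            if len(parts) > 3 and parts[2] == "bedrock" and parts[3] != "*":
                pinned.append(arn)
    return pinned

sample = '{"Version":"2012-10-17","Statement":[{"Effect":"Allow","Action":["bedrock:InvokeModel"],"Resource":["arn:aws:bedrock:us-east-1::foundation-model/*"]}]}'
print(region_pinned_resources(sample))  # flags the us-east-1 ARN
```

An empty result means every Bedrock resource ARN already allows all regions.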
Most Common Mistake: Users grant permissions only in their default region. If Bedrock routes to a different region without permissions, requests will fail with “Access Denied”. Always grant permissions in ALL potential destination regions. For geographic profiles, ensure permissions in both source and destination regions.
3. Check Service Control Policies (SCPs): If your organization uses SCPs to restrict region access, ensure they allow access to all destination regions in your inference profile. Blocking any destination region will prevent cross-region inference from working.

Troubleshooting
“Access Denied” errors

Cause: Missing IAM permissions in the destination region where Bedrock routed the request.

Solution:
  • Ensure your IAM policy grants Bedrock permissions across all regions (use * in the region part of the ARN)
  • For geographic profiles, grant permissions in both source and destination regions
  • Check if Service Control Policies (SCPs) are blocking access to certain regions
Cause: You’re using a regular model ID instead of an inference profile ID.

Solution: Use the inference profile ID format (e.g., us.anthropic.claude-3-5-sonnet-20240620-v1:0) instead of the regular model ID.
Learn More

For detailed AWS documentation, see: