

# Service roles
<a name="security-iam-sr"></a>

Amazon Bedrock uses [IAM service roles](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_terms-and-concepts.html#iam-term-service-role) for some features to let Amazon Bedrock carry out tasks on your behalf.

The console automatically creates service roles for supported features.

You can also create a custom service role and customize the attached permissions to your specific use case. If you use the console, you can select this role instead of letting Amazon Bedrock create one for you.

To set up the custom service role, you carry out the following general steps.

1. Create the role by following the steps at [Creating a role to delegate permissions to an AWS service](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-service.html).

1. Attach a **trust policy**.

1. Attach the relevant **identity-based permissions**.

**Important**  
When setting the `iam:PassRole` permission, make sure that a user can't pass a role where the role has more permissions than you want the user to have. For example, Alice might not be allowed to perform `bedrock:InvokeModel` on a custom model. If Alice can pass a role to Amazon Bedrock to create an evaluation of that custom model, the service could invoke that model on behalf of Alice while running the job.
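One way to contain this risk is to scope `iam:PassRole` to a single approved role and restrict the pass to the Bedrock service with the `iam:PassedToService` condition key. The following sketch builds such a policy; the role name `BedrockEvalRole` and the account ID are hypothetical examples.

```python
import json

# Sketch of an identity-based policy that limits the roles a user can pass
# to Amazon Bedrock. The role name "BedrockEvalRole" is a hypothetical example.
pass_role_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PassOnlyApprovedRoleToBedrock",
            "Effect": "Allow",
            "Action": "iam:PassRole",
            "Resource": "arn:aws:iam::123456789012:role/BedrockEvalRole",
            # iam:PassedToService ensures the role can only be passed to Bedrock.
            "Condition": {
                "StringEquals": {"iam:PassedToService": "bedrock.amazonaws.com"}
            },
        }
    ],
}

print(json.dumps(pass_role_policy, indent=4))
```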

Refer to the following links for more information about IAM concepts that are relevant to setting service role permissions.
+ [AWS service role](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_terms-and-concepts.html#iam-term-service-role)
+ [Identity-based policies and resource-based policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_identity-vs-resource.html)
+ [Using resource-based policies for Lambda](https://docs.aws.amazon.com/lambda/latest/dg/access-control-resource-based.html)
+ [AWS global condition context keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html)
+ [Condition keys for Amazon Bedrock](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazonbedrock.html#amazonbedrock-policy-keys)

Select a topic to learn more about service roles for a specific feature.

**Topics**
+ [Create a custom service role for batch inference](batch-iam-sr.md)
+ [Create a service role for model customization](model-customization-iam-role.md)
+ [Create a service role for importing pre-trained models](model-import-iam-role.md)
+ [Create a service role for Amazon Bedrock Agents](agents-permissions.md)
+ [Create a service role for Amazon Bedrock Knowledge Bases](kb-permissions.md)
+ [Create a service role for Amazon Bedrock Flows in Amazon Bedrock](flows-permissions.md)
+ [Service role requirements for model evaluation jobs](model-evaluation-security-service-roles.md)

# Create a custom service role for batch inference
<a name="batch-iam-sr"></a>

To use a custom service role for batch inference instead of the one Amazon Bedrock automatically creates for you in the AWS Management Console, create an IAM role and attach the following permissions by following the steps at [Creating a role to delegate permissions to an AWS service](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-service.html).

**Topics**
+ [Trust relationship](#batch-iam-sr-trust)
+ [Identity-based permissions for the batch inference service role](#batch-iam-sr-identity)

## Trust relationship
<a name="batch-iam-sr-trust"></a>

The following trust policy allows Amazon Bedrock to assume this role and submit and manage batch inference jobs. Replace the *values* as necessary. The policy contains optional condition keys (see [Condition keys for Amazon Bedrock](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazonbedrock.html#amazonbedrock-policy-keys) and [AWS global condition context keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html)) in the `Condition` field that we recommend you use as a security best practice.

**Note**  
As a best practice for security purposes, replace the `*` in the `aws:SourceArn` value with specific batch inference job IDs after you have created them.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "bedrock.amazonaws.com"
            },
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "123456789012"
                },
                "ArnEquals": {
                    "aws:SourceArn": "arn:aws:bedrock:us-east-1:123456789012:model-invocation-job/*"
                }
            }
        }
    ]
}
```

------
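As an illustrative helper (not an AWS API), the trust policy above can be templated so that the wildcard is swapped for a real job ID once the job exists, as the note recommends. The function name and parameters are assumptions for this sketch.

```python
import json

# Illustrative helper that fills in the batch inference trust policy template;
# pass a real job ID after creation to replace the wildcard.
def batch_trust_policy(account_id: str, region: str, job_id: str = "*") -> dict:
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": {"Service": "bedrock.amazonaws.com"},
                "Action": "sts:AssumeRole",
                "Condition": {
                    "StringEquals": {"aws:SourceAccount": account_id},
                    "ArnEquals": {
                        "aws:SourceArn": (
                            f"arn:aws:bedrock:{region}:{account_id}"
                            f":model-invocation-job/{job_id}"
                        )
                    },
                },
            }
        ],
    }

print(json.dumps(batch_trust_policy("123456789012", "us-east-1"), indent=2))
```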

## Identity-based permissions for the batch inference service role
<a name="batch-iam-sr-identity"></a>

The following topics describe and provide examples of permissions policies that you might need to attach to your custom batch inference service role, depending on your use case.

**Topics**
+ [(Required) Permissions to access input and output data in Amazon S3](#batch-iam-sr-s3)
+ [(Optional) Permissions to run batch inference with inference profiles](#batch-iam-sr-ip)

### (Required) Permissions to access input and output data in Amazon S3
<a name="batch-iam-sr-s3"></a>

To allow a service role to access the Amazon S3 bucket containing your input data and the bucket to which to write your output data, attach the following policy to the service role. Replace *values* as necessary.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
         "Sid": "S3Access",
         "Effect": "Allow",
         "Action": [
            "s3:GetObject",
            "s3:PutObject",
            "s3:ListBucket"
         ],
         "Resource": [
            "arn:aws:s3:::${InputBucket}",
            "arn:aws:s3:::${InputBucket}/*",
            "arn:aws:s3:::${OutputBucket}",
            "arn:aws:s3:::${OutputBucket}/*"
         ],
         "Condition": {
            "StringEquals": {
                "aws:ResourceAccount": [
                    "123456789012"
                ]
            }
         }
        }
    ]
}
```

------
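The statement above can be built from your bucket names. This sketch (bucket names are hypothetical) also shows why each bucket appears twice: `s3:ListBucket` applies to the bucket ARN, while `s3:GetObject` and `s3:PutObject` apply to the object ARNs (`bucket/*`).

```python
import json

# Sketch of building the S3 access statement from bucket names.
def s3_data_statement(input_bucket: str, output_bucket: str, account_id: str) -> dict:
    arns = []
    for bucket in (input_bucket, output_bucket):
        # Bucket ARN for ListBucket, object ARN for Get/PutObject.
        arns += [f"arn:aws:s3:::{bucket}", f"arn:aws:s3:::{bucket}/*"]
    return {
        "Sid": "S3Access",
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
        "Resource": arns,
        # aws:ResourceAccount keeps access within your own account.
        "Condition": {"StringEquals": {"aws:ResourceAccount": [account_id]}},
    }

print(json.dumps(s3_data_statement("amzn-s3-demo-input", "amzn-s3-demo-output", "123456789012"), indent=2))
```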

### (Optional) Permissions to run batch inference with inference profiles
<a name="batch-iam-sr-ip"></a>

To run batch inference with an [inference profile](inference-profiles.md), a service role must have permissions to invoke both the inference profile and the model in each Region that the inference profile spans.

For permissions to invoke with a cross-Region (system-defined) inference profile, use the following policy as a template for the permissions policy to attach to your service role:

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "CrossRegionInference",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:inference-profile/${InferenceProfileId}",
                "arn:aws:bedrock:us-east-1::foundation-model/${ModelId}",
                "arn:aws:bedrock:us-west-2::foundation-model/${ModelId}"
            ]
        }
    ]
}
```

------

For permissions to invoke with an application inference profile, use the following policy as a template for the permissions policy to attach to your service role:

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ApplicationInferenceProfile",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:application-inference-profile/${InferenceProfileId}",
                "arn:aws:bedrock:us-east-1::foundation-model/${ModelId}",
                "arn:aws:bedrock:us-west-2::foundation-model/${ModelId}"
            ]
        }
    ]
}
```

------
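The `Resource` lists above follow a pattern: the inference profile ARN plus the foundation-model ARN in every Region the profile spans. This sketch assembles such a list; the profile ID, model ID, and Regions are illustrative assumptions.

```python
# Sketch: assembling the Resource list for a cross-Region inference profile.
def inference_profile_resources(profile_arn: str, model_id: str, regions: list) -> list:
    """Return the profile ARN plus the foundation-model ARN in each Region
    the profile spans (foundation-model ARNs have no account ID field)."""
    return [profile_arn] + [
        f"arn:aws:bedrock:{region}::foundation-model/{model_id}" for region in regions
    ]

# Hypothetical profile ID, model ID, and Regions for illustration.
resources = inference_profile_resources(
    "arn:aws:bedrock:us-east-1:123456789012:inference-profile/us.anthropic.claude-v2",
    "anthropic.claude-v2",
    ["us-east-1", "us-west-2"],
)
print(resources)
```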

# Create a service role for model customization
<a name="model-customization-iam-role"></a>

To use a custom role for model customization instead of the one Amazon Bedrock automatically creates, create an IAM role and attach the following permissions by following the steps at [Creating a role to delegate permissions to an AWS service](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-service.html).
+ Trust relationship
+ Permissions to access your training and validation data in S3 and to write your output data to S3
+ (Optional) If you encrypt any of the following resources with a KMS key, permissions to decrypt the key (see [Encryption of custom models](encryption-custom-job.md))
  + A model customization job or the resulting custom model
  + The training, validation, or output data for the model customization job

**Topics**
+ [Trust relationship](#model-customization-iam-role-trust)
+ [Permissions to access training and validation files and to write output files in S3](#model-customization-iam-role-s3)
+ [(Optional) Permissions to create a distillation job with a cross-Region inference profile](#customization-iam-sr-ip)

## Trust relationship
<a name="model-customization-iam-role-trust"></a>

The following example policy allows Amazon Bedrock to assume this role and carry out the model customization job.

You can optionally restrict the scope of the permission for [cross-service confused deputy prevention](cross-service-confused-deputy-prevention.md) by using one or more global condition context keys with the `Condition` field. For more information, see [AWS global condition context keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html).
+ Set the `aws:SourceAccount` value to your account ID.
+ (Optional) Use the `ArnEquals` or `ArnLike` condition to restrict the scope to specific model customization jobs in your account.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "bedrock.amazonaws.com"
            },
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "123456789012"
                },
                "ArnEquals": {
                    "aws:SourceArn": "arn:aws:bedrock:us-east-1:123456789012:model-customization-job/*"
                }
            }
        }
    ]
}
```

------
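In IAM, `ArnEquals` and `ArnLike` behave identically and both support wildcards. As a rough illustration of how a wildcard `aws:SourceArn` pattern like the one above scopes the trust policy to jobs in one account and Region, this sketch uses Python's `fnmatch` as an approximation of IAM's ARN matching (`*` matches any sequence of characters).

```python
from fnmatch import fnmatchcase

# fnmatch approximates IAM's ArnLike/ArnEquals wildcard matching.
pattern = "arn:aws:bedrock:us-east-1:123456789012:model-customization-job/*"

same_account_job = "arn:aws:bedrock:us-east-1:123456789012:model-customization-job/abc123"
other_account_job = "arn:aws:bedrock:us-east-1:999999999999:model-customization-job/abc123"

print(fnmatchcase(same_account_job, pattern))   # True
print(fnmatchcase(other_account_job, pattern))  # False
```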

## Permissions to access training and validation files and to write output files in S3
<a name="model-customization-iam-role-s3"></a>

Attach the following policy to allow the role to access your training and validation data and the bucket to which to write your output data. Replace the values in the `Resource` list with your actual bucket names.

To restrict access to a specific folder in a bucket, add an `s3:prefix` condition key with your folder path. You can follow the **User policy** example in [Example 2: Getting a list of objects in a bucket with a specific prefix](https://docs.aws.amazon.com/AmazonS3/latest/userguide/amazon-s3-policy-keys.html#condition-key-bucket-ops-2).

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::training-bucket",
                "arn:aws:s3:::training-bucket/*",
                "arn:aws:s3:::validation-bucket",
                "arn:aws:s3:::validation-bucket/*"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:PutObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::output-bucket",
                "arn:aws:s3:::output-bucket/*"
            ]
        }
    ]
}
```

------
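As a sketch of the `s3:prefix` restriction mentioned above, the following statement limits `s3:ListBucket` to one folder. The bucket and folder names are hypothetical.

```python
# Sketch of the s3:prefix restriction: ListBucket is limited to one folder.
# Bucket and folder names are hypothetical.
list_training_folder = {
    "Effect": "Allow",
    "Action": "s3:ListBucket",
    "Resource": "arn:aws:s3:::training-bucket",
    "Condition": {
        # StringLike allows the trailing wildcard in the folder path.
        "StringLike": {"s3:prefix": ["fine-tuning-data/*"]}
    },
}
print(list_training_folder["Condition"])
```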

## (Optional) Permissions to create a distillation job with a cross-Region inference profile
<a name="customization-iam-sr-ip"></a>

To use a cross-Region inference profile for a teacher model in a distillation job, the service role must have permissions to invoke both the inference profile and the model in each Region that the inference profile spans.

For permissions to invoke with a cross-Region (system-defined) inference profile, use the following policy as a template for the permissions policy to attach to your service role:

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "CrossRegionInference",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:inference-profile/${InferenceProfileId}",
                "arn:aws:bedrock:us-east-1::foundation-model/${ModelId}",
                "arn:aws:bedrock:us-west-2::foundation-model/${ModelId}"
            ]
        }
    ]
}
```

------

# Create a service role for importing pre-trained models
<a name="model-import-iam-role"></a>

To use a custom role for model import, create an IAM service role and attach the following permissions. For information on how to create a service role in IAM, see [Creating a role to delegate permissions to an AWS service](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-service.html).

These permissions apply to both methods of importing models into Amazon Bedrock:
+ **Custom model import jobs** — For importing customized open-source foundation models (such as Mistral AI or Llama models). For more information, see [Use Custom model import to import a customized open-source model into Amazon Bedrock](model-customization-import-model.md).
+ **Create custom model** — For importing Amazon Nova models that you fine-tuned in SageMaker AI. For more information, see [Import a SageMaker AI-trained Amazon Nova model](import-with-create-custom-model.md).

**Topics**
+ [Trust relationship](#model-import-iam-role-trust)
+ [Permissions to access model files in Amazon S3](#model-import-iam-role-s3)

## Trust relationship
<a name="model-import-iam-role-trust"></a>

The following example policy allows Amazon Bedrock to assume this role and carry out model import operations.

You can optionally restrict the scope of the permission for [cross-service confused deputy prevention](cross-service-confused-deputy-prevention.md) by using one or more global condition context keys with the `Condition` field. For more information, see [AWS global condition context keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html).
+ Set the `aws:SourceAccount` value to your account ID.
+ (Optional) Use the `ArnEquals` or `ArnLike` condition to restrict the scope to specific operations in your account. The following example restricts access to custom model import jobs.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "1",
            "Effect": "Allow",
            "Principal": {
                "Service": "bedrock.amazonaws.com"
            },
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "123456789012"
                },
                "ArnEquals": {
                    "aws:SourceArn": "arn:aws:bedrock:us-east-1:123456789012:model-import-job/*"
                }
            }
        }
    ]
}
```

------

## Permissions to access model files in Amazon S3
<a name="model-import-iam-role-s3"></a>

Attach the following policy to allow the role to access model files in an Amazon S3 bucket. Replace the values in the `Resource` list with your actual bucket names.

For custom model import jobs, this is your own Amazon S3 bucket containing the customized open-source model files. For creating custom models from SageMaker AI-trained Amazon Nova models, this is the Amazon-managed Amazon S3 bucket where SageMaker AI stores the trained model artifacts. SageMaker AI creates this bucket when you run your first SageMaker AI training job. 

To restrict access to a specific folder in a bucket, add an `s3:prefix` condition key with your folder path. You can follow the **User policy** example in [Example 2: Getting a list of objects in a bucket with a specific prefix](https://docs.aws.amazon.com/AmazonS3/latest/userguide/amazon-s3-policy-keys.html#condition-key-bucket-ops-2).

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "1",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::bucket",
                "arn:aws:s3:::bucket/*"
            ],
            "Condition": {
                "StringEquals": {
                    "aws:ResourceAccount": "123456789012"
                }
            }
        }
    ]
}
```

------

# Create a service role for Amazon Bedrock Agents
<a name="agents-permissions"></a>

To use a custom service role for agents instead of the one Amazon Bedrock automatically creates, create an IAM role and attach the following permissions by following the steps at [Creating a role to delegate permissions to an AWS service](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-service.html).
+ Trust policy
+ A policy containing the following identity-based permissions:
  + Access to the Amazon Bedrock base models.
  + Access to the Amazon S3 objects containing the OpenAPI schemas for the action groups in your agents.
  + Permissions for Amazon Bedrock to query knowledge bases that you want to attach to your agents.
  + If any of the following situations pertain to your use case, add the statement to the policy or add a policy with the statement to the service role:
    + (Optional) If you enable multi-agent collaboration, permissions to get the aliases and invoke agent collaborators.
    + (Optional) If you associate a Provisioned Throughput with your agent alias, permissions to perform model invocation using that Provisioned Throughput.
    + (Optional) If you associate a guardrail with your agent, permissions to apply that guardrail. If the guardrail is encrypted with a KMS key, the service role will also need [permissions to decrypt the key](guardrails-permissions-kms.md)
    + (Optional) If you encrypt your agent with a KMS key, [permissions to decrypt the key](encryption-agents.md).

Whether you use a custom role or not, you also need to attach a **resource-based policy** to the Lambda functions for the action groups in your agents to provide permissions for the service role to access the functions. For more information, see [Resource-based policy to allow Amazon Bedrock to invoke an action group Lambda function](#agents-permissions-lambda).

**Topics**
+ [Trust relationship](#agents-permissions-trust)
+ [Identity-based permissions for the Agents service role](#agents-permissions-identity)
+ [(Optional) Identity-based policy to allow Amazon Bedrock to use Provisioned Throughput with your agent alias](#agents-permissions-pt)
+ [(Optional) Identity-based policy to allow Amazon Bedrock to associate and invoke agent collaborators](#agents-permissions-mac)
+ [(Optional) Identity-based policy to allow Amazon Bedrock to use guardrails with your Agent](#agents-permissions-gr)
+ [(Optional) Identity-based policy to allow Amazon Bedrock to access files from S3 to use with code interpretation](#agents-permissions-files-ci)
+ [Resource-based policy to allow Amazon Bedrock to invoke an action group Lambda function](#agents-permissions-lambda)

## Trust relationship
<a name="agents-permissions-trust"></a>

The following trust policy allows Amazon Bedrock to assume this role and create and manage agents. Replace the *values* as necessary. The policy contains optional condition keys (see [Condition keys for Amazon Bedrock](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazonbedrock.html#amazonbedrock-policy-keys) and [AWS global condition context keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html)) in the `Condition` field that we recommend you use as a security best practice.

**Note**  
As a best practice for security purposes, replace the `*` in the `aws:SourceArn` value with specific agent IDs after you have created them.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "bedrock.amazonaws.com"
            },
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "123456789012"
                },
                "ArnLike": {
                    "aws:SourceArn": "arn:aws:bedrock:us-east-1:123456789012:agent/*"
                }
            }
        }
    ]
}
```

------

## Identity-based permissions for the Agents service role
<a name="agents-permissions-identity"></a>

Attach the following policy to provide permissions for the service role, replacing *values* as necessary. The policy contains the following statements. Omit a statement if it isn't applicable to your use case. The policy contains optional condition keys (see [Condition keys for Amazon Bedrock](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazonbedrock.html#amazonbedrock-policy-keys) and [AWS global condition context keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html)) in the `Condition` field that we recommend you use as a security best practice.

**Note**  
If you encrypt your agent with a customer-managed KMS key, refer to [Encryption of agent resources for agents created before January 22, 2025](encryption-agents.md) for further permissions you need to add.
+ Permissions to use Amazon Bedrock foundation models to run model inference on prompts used in your agent's orchestration.
+ Permissions to access your agent's action group API schemas in Amazon S3. Omit this statement if your agent has no action groups.
+ Permissions to access knowledge bases associated with your agent. Omit this statement if your agent has no associated knowledge bases.
+ Permissions to access a third-party (Pinecone or Redis Enterprise Cloud) knowledge base associated with your agent. Omit this statement if your knowledge base is first-party (Amazon OpenSearch Serverless or Amazon Aurora) or if your agent has no associated knowledge bases.
+ Permissions to access a prompt from Prompt management. Omit this statement if you don't plan to test a prompt from prompt management with your agent in the Amazon Bedrock console.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AgentModelInvocationPermissions",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-v2",
                "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-v2:1",
                "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-instant-v1"
            ]
        },
        {
            "Sid": "AgentActionGroupS3",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": [
                "arn:aws:s3:::amzn-s3-demo-bucket/SchemaJson"
            ],
            "Condition": {
                "StringEquals": {
                    "aws:ResourceAccount": "123456789012"
                }
            }
        },
        {
            "Sid": "AgentKnowledgeBaseQuery",
            "Effect": "Allow",
            "Action": [
                "bedrock:Retrieve",
                "bedrock:RetrieveAndGenerate"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:knowledge-base/knowledge-base-id"
            ]
        },
        {
            "Sid": "Agent3PKnowledgeBase",
            "Effect": "Allow",
            "Action": [
                "bedrock:AssociateThirdPartyKnowledgeBase"
            ],
            "Resource": "arn:aws:bedrock:us-east-1:123456789012:knowledge-base/knowledge-base-id",
            "Condition": {
                "StringEquals": {
                    "bedrock:ThirdPartyKnowledgeBaseCredentialsSecretArn": "arn:aws:secretsmanager:us-east-1:123456789012:secret:secret-id"
                }
            }
        },
        {
            "Sid": "AgentPromptManagementConsole",
            "Effect": "Allow",
            "Action": [
                "bedrock:GetPrompt"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:prompt/prompt-id"
            ]
        }
    ]
}
```

------
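Because the statements above are omitted or included depending on your use case, the policy can be assembled programmatically. This sketch uses a hypothetical helper; the ARNs are the example values from the policy, and only two of the optional statements are modeled.

```python
# Hypothetical helper that assembles the service-role policy statements above,
# omitting the ones that don't apply. ARNs are the example values from the policy.
def agent_role_statements(has_action_groups: bool, knowledge_base_arns: list) -> list:
    # Model invocation permissions are always required for orchestration.
    statements = [{
        "Sid": "AgentModelInvocationPermissions",
        "Effect": "Allow",
        "Action": ["bedrock:InvokeModel"],
        "Resource": ["arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-v2"],
    }]
    if has_action_groups:  # omit if the agent has no action groups
        statements.append({
            "Sid": "AgentActionGroupS3",
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": ["arn:aws:s3:::amzn-s3-demo-bucket/SchemaJson"],
        })
    if knowledge_base_arns:  # omit if no knowledge bases are associated
        statements.append({
            "Sid": "AgentKnowledgeBaseQuery",
            "Effect": "Allow",
            "Action": ["bedrock:Retrieve", "bedrock:RetrieveAndGenerate"],
            "Resource": knowledge_base_arns,
        })
    return statements

policy = {"Version": "2012-10-17", "Statement": agent_role_statements(True, [])}
print(len(policy["Statement"]))
```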

## (Optional) Identity-based policy to allow Amazon Bedrock to use Provisioned Throughput with your agent alias
<a name="agents-permissions-pt"></a>

If you associate a [Provisioned Throughput](prov-throughput.md) with an alias of your agent, attach the following identity-based policy to the service role or add the statement to the policy in [Identity-based permissions for the Agents service role](#agents-permissions-identity).

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
      {        
        "Sid": "UseProvisionedThroughput",
        "Effect": "Allow",
        "Action": [
            "bedrock:InvokeModel", 
            "bedrock:GetProvisionedModelThroughput"
        ],
        "Resource": [
            "arn:aws:bedrock:us-east-1:123456789012:provisioned-model/${provisioned-model-id}"
        ]
      }
    ]
}
```

------

## (Optional) Identity-based policy to allow Amazon Bedrock to associate and invoke agent collaborators
<a name="agents-permissions-mac"></a>

If you enable [multi-agent collaboration](agents-multi-agent-collaboration.md), attach the following identity-based policy to the service role or add the statement to the policy in [Identity-based permissions for the Agents service role](#agents-permissions-identity).

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AmazonBedrockAgentMultiAgentsPolicyProd",
            "Effect": "Allow",
            "Action": [
                "bedrock:GetAgentAlias",
                "bedrock:InvokeAgent"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:agent-alias/${agent-id}/${agent-alias-id}"
            ]
        }
    ]
}
```

------

## (Optional) Identity-based policy to allow Amazon Bedrock to use guardrails with your Agent
<a name="agents-permissions-gr"></a>

If you associate a [guardrail](guardrails.md) with your agent, attach the following identity-based policy to the service role or add the statement to the policy in [Identity-based permissions for the Agents service role](#agents-permissions-identity).

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ApplyGuardrail",
            "Effect": "Allow",
            "Action": "bedrock:ApplyGuardrail",
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:guardrail/${guardrail-id}"
            ]
        }
    ]
}
```

------

## (Optional) Identity-based policy to allow Amazon Bedrock to access files from S3 to use with code interpretation
<a name="agents-permissions-files-ci"></a>

If you [enable code interpretation](agents-enable-code-interpretation.md) for your agent, attach the following identity-based policy to the service role or add the statement to the policy in [Identity-based permissions for the Agents service role](#agents-permissions-identity).

------
#### [ JSON ]

****  

```
{
  "Version": "2012-10-17",
  "Statement": [
      {       
        "Sid": "AmazonBedrockAgentFileAccess", 
        "Effect": "Allow",
        "Action": [
            "s3:GetObject",
            "s3:GetObjectVersion",
            "s3:GetObjectVersionAttributes",
            "s3:GetObjectAttributes"
        ],
        "Resource": [
            "arn:aws:s3:::[[customerProvidedS3BucketWithKey]]"
        ]
      }
    ]
}
```

------

## Resource-based policy to allow Amazon Bedrock to invoke an action group Lambda function
<a name="agents-permissions-lambda"></a>

Follow the steps at [Using resource-based policies for Lambda](https://docs.aws.amazon.com/lambda/latest/dg/access-control-resource-based.html) and attach the following resource-based policy to a Lambda function to allow Amazon Bedrock to access the Lambda function for your agent's action groups, replacing the *values* as necessary. The policy contains optional condition keys (see [Condition keys for Amazon Bedrock](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazonbedrock.html#amazonbedrock-policy-keys) and [AWS global condition context keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html)) in the `Condition` field that we recommend you use as a security best practice.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AccessLambdaFunction",
            "Effect": "Allow",
            "Principal": {
                "Service": "bedrock.amazonaws.com"
            },
            "Action": "lambda:InvokeFunction",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:function-name",
            "Condition": {
                "StringEquals": {
                    "AWS:SourceAccount": "123456789012"
                },
                "ArnLike": {
                    "AWS:SourceArn": "arn:aws:bedrock:us-east-1:123456789012:agent/${agent-id}"
                }
            }
        }
    ]
}
```

------
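With the AWS SDK, an equivalent statement can be attached through Lambda's `AddPermission` API. The sketch below only builds the request parameters (the function name and agent ID are placeholders); the actual call is shown commented out because it needs AWS credentials.

```python
# Request parameters for Lambda's AddPermission API, which attaches a
# resource-based policy statement like the one above. The function name and
# agent ID are placeholders.
params = {
    "FunctionName": "function-name",
    "StatementId": "AccessLambdaFunction",
    "Action": "lambda:InvokeFunction",
    "Principal": "bedrock.amazonaws.com",
    "SourceAccount": "123456789012",
    # Scopes the permission to a single agent in your account.
    "SourceArn": "arn:aws:bedrock:us-east-1:123456789012:agent/AGENT_ID",
}
# import boto3
# boto3.client("lambda").add_permission(**params)
print(params["SourceArn"])
```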

# Create a service role for Amazon Bedrock Knowledge Bases
<a name="kb-permissions"></a>

To use a custom role for a knowledge base instead of the one Amazon Bedrock automatically creates, create an IAM role and attach the following permissions by following the steps at [Creating a role to delegate permissions to an AWS service](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-service.html). As a security best practice, include only the permissions that your use case requires.

**Note**  
A policy can't be shared between multiple service roles.
+ Trust relationship
+ Access to the Amazon Bedrock base models
+ Access to the data source where you store your data
+ (If you create a vector database in Amazon OpenSearch Service) Access to your OpenSearch Service collection
+ (If you create a vector database in Amazon Aurora) Access to your Aurora cluster
+ (If you create a vector database in Pinecone or Redis Enterprise Cloud) Permissions for AWS Secrets Manager to authenticate your Pinecone or Redis Enterprise Cloud account
+ (Optional) If you encrypt any of the following resources with a KMS key, permissions to decrypt the key (see [Encryption of knowledge base resources](encryption-kb.md)).
  + Your knowledge base
  + Data sources for your knowledge base
  + Your vector database in Amazon OpenSearch Service
  + The secret for your third-party vector database in AWS Secrets Manager
  + A data ingestion job

**Topics**
+ [Trust relationship](#kb-permissions-trust)
+ [Permissions to access Amazon Bedrock models](#kb-permissions-access-models)
+ [Permissions to access your data sources](#kb-permissions-access-ds)
+ [Permissions to decrypt your AWS KMS key for encrypted data sources in Amazon S3](#kb-permissions-kms-datasource)
+ [Permissions to chat with your document](#kb-permissions-chatdoc)
+ [Permissions for multimodal content](#kb-permissions-multimodal)
+ [Permissions to access your Amazon Kendra GenAI index](#kb-permissions-kendra)
+ [Permissions to access your vector database in Amazon OpenSearch Serverless](#kb-permissions-oss)
+ [Permissions to access your vector database in OpenSearch Managed Clusters](#kb-permissions-osm)
+ [Permissions to access your Amazon Aurora database cluster](#kb-permissions-rds)
+ [Permissions to access your vector database in Amazon Neptune Analytics](#kb-permissions-neptune)
+ [Permissions to access your vector store in Amazon S3 Vectors](#kb-permissions-s3vectors)
+ [Permissions to access a vector database configured with an AWS Secrets Manager secret](#kb-permissions-secret)
+ [Permissions for AWS to manage an AWS KMS key for transient data storage during data ingestion](#kb-permissions-kms-ingestion)
+ [Permissions to access a data source in another user's AWS account](#kb-permissions-otherds)

## Trust relationship
<a name="kb-permissions-trust"></a>

The following trust policy allows Amazon Bedrock to assume this role and create and manage knowledge bases on your behalf. You can restrict the scope of the permission by using one or more global condition context keys. For more information, see [AWS global condition context keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html). Set the `aws:SourceAccount` value to your account ID, and use the `ArnEquals` or `ArnLike` condition to restrict the scope to specific knowledge bases.

**Note**  
As a best practice for security purposes, replace the wildcard (`*`) in the `AWS:SourceArn` value with specific knowledge base IDs after you have created them.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "bedrock.amazonaws.com"
            },
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "123456789012"
                },
                "ArnLike": {
                    "AWS:SourceArn": "arn:aws:bedrock:us-east-1:123456789012:knowledge-base/*"
                }
            }
        }
    ]
}
```

------
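To illustrate the scoping step, the trust policy above can be generated from its parameters with a short Python sketch (the helper name and knowledge base ID are our own examples, not part of any AWS SDK):

```python
import json

def kb_trust_policy(account_id: str, region: str,
                    knowledge_base_id: str = "*") -> dict:
    """Build the knowledge base trust policy, scoped to the account and,
    once known, to a specific knowledge base ID."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "bedrock.amazonaws.com"},
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {"aws:SourceAccount": account_id},
                "ArnLike": {
                    "AWS:SourceArn":
                        f"arn:aws:bedrock:{region}:{account_id}:knowledge-base/{knowledge_base_id}"
                },
            },
        }],
    }

# After creating the knowledge base, regenerate the policy with its ID
# instead of the wildcard and update the role's trust relationship.
scoped = kb_trust_policy("123456789012", "us-east-1", "KB12345678")
print(json.dumps(scoped, indent=4))
```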

## Permissions to access Amazon Bedrock models
<a name="kb-permissions-access-models"></a>

Attach the following policy to provide permissions for the role to use Amazon Bedrock models to embed your source data.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "bedrock:ListFoundationModels",
                "bedrock:ListCustomModels"
            ],
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1::foundation-model/amazon.titan-embed-text-v1",
                "arn:aws:bedrock:us-east-1::foundation-model/cohere.embed-english-v3",
                "arn:aws:bedrock:us-east-1::foundation-model/cohere.embed-multilingual-v3"
            ]
        }
    ]
}
```

------

## Permissions to access your data sources
<a name="kb-permissions-access-ds"></a>

Select from the following data sources to attach the necessary permissions for the role.

**Topics**
+ [Permissions to access your Amazon S3 data source](#kb-permissions-access-s3)
+ [Permissions to access your Confluence data source](#kb-permissions-access-confluence)
+ [Permissions to access your Microsoft SharePoint data source](#kb-permissions-access-sharepoint)
+ [Permissions to access your Salesforce data source](#kb-permissions-access-salesforce)

### Permissions to access your Amazon S3 data source
<a name="kb-permissions-access-s3"></a>

If your data source is Amazon S3, attach the following policy to provide permissions for the role to access the S3 bucket that you will connect to as your data source.

If you encrypted the data source with an AWS KMS key, attach permissions to decrypt the key to the role by following the steps at [Permissions to decrypt your AWS KMS key for your data sources in Amazon S3](encryption-kb.md#encryption-kb-ds).

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Sid": "S3ListBucketStatement",
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::amzn-s3-demo-bucket"
            ],
            "Condition": {
                "StringEquals": {
                    "aws:ResourceAccount": "123456789012"
                }
            }
        },
        {
            "Sid": "S3GetObjectStatement",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": [
                "arn:aws:s3:::amzn-s3-demo-bucket/*"
            ],
            "Condition": {
                "StringEquals": {
                    "aws:ResourceAccount": "123456789012"
                }
            }
        }
    ]
}
```

------
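The bucket-level and object-level statements differ only in their action and resource suffix. The following Python sketch (the helper name is ours, not an AWS SDK call) generates both statements, applying the `aws:ResourceAccount` guard to each:

```python
import json

def s3_data_source_policy(bucket_name: str, account_id: str) -> dict:
    """Build the S3 data-source policy: ListBucket on the bucket itself,
    GetObject on its objects, both conditioned on the owning account."""
    condition = {"StringEquals": {"aws:ResourceAccount": account_id}}
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "S3ListBucketStatement",
                "Effect": "Allow",
                "Action": ["s3:ListBucket"],
                "Resource": [f"arn:aws:s3:::{bucket_name}"],
                "Condition": condition,
            },
            {
                "Sid": "S3GetObjectStatement",
                "Effect": "Allow",
                "Action": ["s3:GetObject"],
                "Resource": [f"arn:aws:s3:::{bucket_name}/*"],
                "Condition": condition,
            },
        ],
    }

policy = s3_data_source_policy("amzn-s3-demo-bucket", "123456789012")
print(json.dumps(policy, indent=4))
```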

### Permissions to access your Confluence data source
<a name="kb-permissions-access-confluence"></a>

**Note**  
The Confluence data source connector is in preview release and is subject to change.

Attach the following policy to provide permissions for the role to access Confluence.

**Note**  
`secretsmanager:PutSecretValue` is necessary only if you use OAuth 2.0 authentication with a refresh token.  
The Confluence OAuth 2.0 **access** token has a default expiry time of 60 minutes. If this token expires while your data source is syncing (a sync job), Amazon Bedrock uses the provided **refresh** token to regenerate it. This regeneration refreshes both the access and refresh tokens, so Amazon Bedrock requires write permissions on your secret to carry the updated tokens from the current sync job to the next.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "secretsmanager:GetSecretValue",
                "secretsmanager:PutSecretValue"
            ],
            "Resource": [
                "arn:aws:secretsmanager:us-east-1:123456789012:secret:SecretId"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "kms:Decrypt"
            ],
            "Resource": [
                "arn:aws:kms:us-east-1:123456789012:key/KeyId"
            ],
            "Condition": {
                "StringLike": {
                    "kms:ViaService": [
                        "secretsmanager.us-east-1.amazonaws.com"
                    ]
                }
            }
        }
    ]
}
```

------

### Permissions to access your Microsoft SharePoint data source
<a name="kb-permissions-access-sharepoint"></a>

**Note**  
The SharePoint data source connector is in preview release and is subject to change.

Attach the following policy to provide permissions for the role to access SharePoint.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "secretsmanager:GetSecretValue"
            ],
            "Resource": [
                "arn:aws:secretsmanager:us-east-1:123456789012:secret:SecretId"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "kms:Decrypt"
            ],
            "Resource": [
                "arn:aws:kms:us-east-1:123456789012:key/KeyId"
            ],
            "Condition": {
                "StringLike": {
                    "kms:ViaService": [
                        "secretsmanager.us-east-1.amazonaws.com"
                    ]
                }
            }
        }
    ]
}
```

------

### Permissions to access your Salesforce data source
<a name="kb-permissions-access-salesforce"></a>

**Note**  
The Salesforce data source connector is in preview release and is subject to change.

Attach the following policy to provide permissions for the role to access Salesforce.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "secretsmanager:GetSecretValue"
            ],
            "Resource": [
                "arn:aws:secretsmanager:us-east-1:123456789012:secret:SecretId"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "kms:Decrypt"
            ],
            "Resource": [
                "arn:aws:kms:us-east-1:123456789012:key/KeyId"
            ],
            "Condition": {
                "StringLike": {
                    "kms:ViaService": [
                        "secretsmanager.us-east-1.amazonaws.com"
                    ]
                }
            }
        }
    ]
}
```

------

## Permissions to decrypt your AWS KMS key for encrypted data sources in Amazon S3
<a name="kb-permissions-kms-datasource"></a>

If you encrypted your data sources in Amazon S3 with an AWS KMS key, attach the following policy to your Amazon Bedrock Knowledge Bases service role to allow Amazon Bedrock to decrypt your key. Replace *Region* and *AccountId* with the Region and account ID to which the key belongs. Replace *KeyId* with the ID of your AWS KMS key.

```
{
    "Version": "2012-10-17",		 	 	 
    "Statement": [{
        "Effect": "Allow",
        "Action": [
            "kms:Decrypt"
        ],
        "Resource": [
            "arn:aws:kms:${Region}:${AccountId}:key/${KeyId}"
        ],
        "Condition": {
            "StringEquals": {
                "kms:ViaService": [
                    "s3.${Region}.amazonaws.com"
                ]
            }
        }
    }]
}
```
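Because this policy is written with `${...}` placeholders, one way to fill it in is with Python's `string.Template`, which uses the same placeholder syntax (a sketch; the key ID below is an example value):

```python
import json
from string import Template

# Template mirrors the policy above; ${Region} appears in both the key ARN
# and the kms:ViaService domain, so one substitution fills both.
KMS_DECRYPT_TEMPLATE = Template("""{
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["kms:Decrypt"],
        "Resource": ["arn:aws:kms:${Region}:${AccountId}:key/${KeyId}"],
        "Condition": {
            "StringEquals": {
                "kms:ViaService": ["s3.${Region}.amazonaws.com"]
            }
        }
    }]
}""")

policy = json.loads(KMS_DECRYPT_TEMPLATE.substitute(
    Region="us-east-1",
    AccountId="123456789012",
    KeyId="1234abcd-12ab-34cd-56ef-1234567890ab"))
print(json.dumps(policy, indent=4))
```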

## Permissions to chat with your document
<a name="kb-permissions-chatdoc"></a>

Attach the following policy to provide permissions for the role to use Amazon Bedrock models to chat with your document:

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "bedrock:RetrieveAndGenerate"
            ],
            "Resource": "*"
        }
    ]
}
```

------

If you want to grant a user access only to chat with your document (and not to retrieve from any knowledge base), use the following policy:

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "bedrock:RetrieveAndGenerate"
            ],
            "Resource": "*"
        },
        {
            "Effect": "Deny",
            "Action": [
                "bedrock:Retrieve"
            ],
            "Resource": "*"
        }
    ]
}
```

------

If you want to both chat with your document and use `RetrieveAndGenerate` with a specific knowledge base, provide the knowledge base ARN in the `Resource` field and use the following policy:

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "bedrock:RetrieveAndGenerate"
            ],
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "bedrock:Retrieve"
            ],
            "Resource": "arn:aws:bedrock:us-east-1:123456789012:knowledge-base/$KnowledgeBaseId"
        }
    ]
}
```

------

## Permissions for multimodal content
<a name="kb-permissions-multimodal"></a>

When working with multimodal content (images, audio, video), additional permissions are required depending on your processing approach.

### Nova Multimodal Embeddings permissions
<a name="kb-permissions-multimodal-mme"></a>

When using Nova Multimodal Embeddings, attach the following policy to provide permissions for asynchronous model invocation:

```
{
    "Sid": "BedrockInvokeModelStatement",
    "Effect": "Allow",
    "Action": ["bedrock:InvokeModel"],
    "Resource": [
        "arn:aws:bedrock:us-east-1::foundation-model/amazon.nova-*-multimodal-embeddings-*",
        "arn:aws:bedrock:us-east-1::async-invoke/*"
    ],
    "Condition": {
        "StringEquals": {
            "aws:ResourceAccount": ""
        }
    }
},
{
    "Sid": "BedrockGetAsyncInvokeStatement",
    "Effect": "Allow",
    "Action": ["bedrock:GetAsyncInvoke"],
    "Resource": ["arn:aws:bedrock:us-east-1::async-invoke/*"],
    "Condition": {
        "StringEquals": {
            "aws:ResourceAccount": ""
        }
    }
}
```

### Bedrock Data Automation (BDA) permissions
<a name="kb-permissions-multimodal-bda"></a>

When using BDA to process multimodal content, attach the following policy:

```
{
    "Sid": "BDAInvokeStatement",
    "Effect": "Allow",
    "Action": ["bedrock:InvokeDataAutomationAsync"],
    "Resource": [
        "arn:aws:bedrock:us-east-1:aws:data-automation-project/public-rag-default",
        "arn:aws:bedrock:us-east-1::data-automation-profile/*"
    ]
},
{
    "Sid": "BDAGetStatement",
    "Effect": "Allow",
    "Action": ["bedrock:GetDataAutomationStatus"],
    "Resource": "arn:aws:bedrock:us-east-1::data-automation-invocation/*"
}
```

If you use customer-managed AWS KMS keys with BDA, also attach the following policy. Replace *account-id*, *region*, and *key-id* with your specific values:

```
{
    "Sid": "KmsPermissionStatementForBDA",
    "Effect": "Allow",
    "Action": [
        "kms:GenerateDataKey",
        "kms:Decrypt",
        "kms:DescribeKey",
        "kms:CreateGrant"
    ],
    "Resource": ["arn:aws:kms:region:account-id:key/key-id"],
    "Condition": {
        "StringEquals": {
            "aws:ResourceAccount": "account-id",
            "kms:ViaService": "bedrock.region.amazonaws.com"
        }
    }
}
```

## Permissions to access your Amazon Kendra GenAI index
<a name="kb-permissions-kendra"></a>

If you created an Amazon Kendra GenAI index for your knowledge base, then attach the following policy to your Amazon Bedrock Knowledge Bases service role to allow access to the index. In the policy, replace *Region*, *AccountId*, and *IndexId* with the values for your index. You can allow access to multiple indexes by adding them to the `Resource` list. To allow access to every index in your AWS account, replace *IndexId* with a wildcard (`*`).

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "kendra:Retrieve",
                "kendra:DescribeIndex"
            ],
            "Resource": "arn:aws:kendra:us-east-1:123456789012:index/${IndexId}" 
        }
    ]
}
```

------

## Permissions to access your vector database in Amazon OpenSearch Serverless
<a name="kb-permissions-oss"></a>

If you created a vector database in Amazon OpenSearch Serverless for your knowledge base, attach the following policy to your Amazon Bedrock Knowledge Bases service role to allow access to the collection. Replace *Region* and *AccountId* with the Region and account ID to which the database belongs. Replace *CollectionId* with the ID of your OpenSearch Serverless collection. You can allow access to multiple collections by adding them to the `Resource` list.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "aoss:APIAccessAll"
            ],
            "Resource": [
                "arn:aws:aoss:us-east-1:123456789012:collection/${CollectionId}"
            ]
        }
    ]
}
```

------

## Permissions to access your vector database in OpenSearch Managed Clusters
<a name="kb-permissions-osm"></a>

If you created a vector database in an OpenSearch Managed Cluster for your knowledge base, attach the following policy to your Amazon Bedrock Knowledge Bases service role to allow access to the domain. Replace *Region* and *AccountId* with the Region and account ID to which the database belongs. You can allow access to multiple domains by adding them to the `Resource` list. For more information about configuring permissions, see [Prerequisites and permissions required for using OpenSearch Managed Clusters with Amazon Bedrock Knowledge Bases](kb-osm-permissions-prereq.md).

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "es:ESHttpGet",
                "es:ESHttpPost",
                "es:ESHttpPut",
                "es:ESHttpDelete"
            ],
            "Resource": [
                "arn:aws:es:us-east-1:123456789012:domain/domainName/indexName"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "es:DescribeDomain"
            ],
            "Resource": [
                "arn:aws:es:us-east-1:123456789012:domain/domainName"
            ]
        }
    ]
}
```

------

## Permissions to access your Amazon Aurora database cluster
<a name="kb-permissions-rds"></a>

**Note**  
The Amazon Aurora cluster must reside in the same AWS account as the Amazon Bedrock knowledge base.

If you created a database (DB) cluster in Amazon Aurora for your knowledge base, attach the following policy to your Amazon Bedrock Knowledge Bases service role to allow access to the DB cluster and to provide read and write permissions on it. Replace *Region* and *AccountId* with the Region and account ID to which the DB cluster belongs. Replace *DbClusterId* with the ID of your Amazon Aurora DB cluster. You can allow access to multiple DB clusters by adding them to the `Resource` list.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Sid": "RdsDescribeStatementID",
            "Effect": "Allow",
            "Action": [
                "rds:DescribeDBClusters"
            ],
            "Resource": [
                "arn:aws:rds:us-east-1:123456789012:cluster:${DbClusterId}"
            ]
        },
        {
            "Sid": "DataAPIStatementID",
            "Effect": "Allow",
            "Action": [
                "rds-data:BatchExecuteStatement",
                "rds-data:ExecuteStatement"
            ],
            "Resource": [
                "arn:aws:rds:us-east-1:123456789012:cluster:${DbClusterId}"
            ]
        }
    ]
}
```

------

## Permissions to access your vector database in Amazon Neptune Analytics
<a name="kb-permissions-neptune"></a>

If you created an Amazon Neptune Analytics graph for your knowledge base, attach the following policy to your Amazon Bedrock Knowledge Bases service role to allow access to the graph. In the policy, replace *Region* and *AccountId* with the Region and account ID to which the database belongs. Replace *GraphId* with the ID of your graph database.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Sid": "NeptuneAnalyticsAccess",
            "Effect": "Allow",
            "Action": [
                "neptune-graph:GetGraph",
                "neptune-graph:ReadDataViaQuery",
                "neptune-graph:WriteDataViaQuery",
                "neptune-graph:DeleteDataViaQuery"
            ],
            "Resource": [
                "arn:aws:neptune-graph:us-east-1:123456789012:graph/${GraphId}"
            ]
        }
    ]
}
```

------

## Permissions to access your vector store in Amazon S3 Vectors
<a name="kb-permissions-s3vectors"></a>

If you choose to use Amazon S3 Vectors for your knowledge base, attach the following policy to your Amazon Bedrock Knowledge Bases service role to allow access to the vector index.

In the policy, replace *Region* and *AccountId* with the Region and account ID to which the vector index belongs. Replace *BucketName* with the name of your S3 vector bucket and *IndexName* with the name of your vector index. For more information about Amazon S3 Vectors, see [Setting up to use Amazon S3 Vectors](https://docs.aws.amazon.com/AmazonS3/latest/userguide/s3-vectors-setting-up.html).

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Sid": "S3VectorBucketReadAndWritePermission",
            "Effect": "Allow",
            "Action": [
                "s3vectors:PutVectors",
                "s3vectors:GetVectors",
                "s3vectors:DeleteVectors",
                "s3vectors:QueryVectors",
                "s3vectors:GetIndex"
            ],
            "Resource": "arn:aws:s3vectors:us-east-1:123456789012:bucket/${BucketName}/index/${IndexName}"
        }
    ]
}
```

------
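The vector index ARN has a two-level path (`bucket/.../index/...`), which is easy to mistype. A short Python sketch (names are ours, not an AWS SDK call) that assembles the ARN and the policy around it:

```python
import json

def s3_vectors_policy(region: str, account_id: str,
                      bucket_name: str, index_name: str) -> dict:
    """Build the read/write policy for a single S3 Vectors index."""
    index_arn = (f"arn:aws:s3vectors:{region}:{account_id}:"
                 f"bucket/{bucket_name}/index/{index_name}")
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "S3VectorBucketReadAndWritePermission",
            "Effect": "Allow",
            "Action": [
                "s3vectors:PutVectors",
                "s3vectors:GetVectors",
                "s3vectors:DeleteVectors",
                "s3vectors:QueryVectors",
                "s3vectors:GetIndex",
            ],
            "Resource": index_arn,
        }],
    }

policy = s3_vectors_policy("us-east-1", "123456789012",
                           "my-vector-bucket", "my-vector-index")
print(json.dumps(policy, indent=4))
```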

## Permissions to access a vector database configured with an AWS Secrets Manager secret
<a name="kb-permissions-secret"></a>

If your vector database is configured with an AWS Secrets Manager secret, attach the following policy to your Amazon Bedrock Knowledge Bases service role to allow AWS Secrets Manager to authenticate your account to access the database. Replace *Region* and *AccountId* with the Region and account ID to which the database belongs. Replace *SecretId* with the ID of your secret.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "secretsmanager:GetSecretValue"
            ],
            "Resource": [
                "arn:aws:secretsmanager:us-east-1:123456789012:secret:${SecretId}"
            ]
        }
    ]
}
```

------

If you encrypted your secret with an AWS KMS key, attach permissions to decrypt the key to the role by following the steps at [Permissions to decrypt an AWS Secrets Manager secret for the vector store containing your knowledge base](encryption-kb.md#encryption-kb-3p).

## Permissions for AWS to manage an AWS KMS key for transient data storage during data ingestion
<a name="kb-permissions-kms-ingestion"></a>

To allow AWS to use an AWS KMS key for transient data storage while ingesting your data source, attach the following policy to your Amazon Bedrock Knowledge Bases service role. Replace *Region*, *AccountId*, and *KeyId* with the appropriate values.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "kms:GenerateDataKey",
                "kms:Decrypt"
            ],
            "Resource": [
                "arn:aws:kms:us-east-1:123456789012:key/${KeyId}"
            ]
        }
    ]
}
```

------

## Permissions to access a data source in another user's AWS account
<a name="kb-permissions-otherds"></a>

To allow access to a data source in another user's AWS account, you must create a role that allows cross-account access to the Amazon S3 bucket in that account. Replace *BucketName* and *BucketOwnerAccountId* with the appropriate values.

**Permissions Required on Knowledge Base role**

The knowledge base role that you provide during knowledge base creation (`CreateKnowledgeBase`) requires the following Amazon S3 permissions.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Sid": "S3ListBucketStatement",
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::amzn-s3-demo-bucket"
            ],
            "Condition": {
                "StringEquals": {
                    "aws:ResourceAccount": "123456789012"
                }
            }
        },
        {
            "Sid": "S3GetObjectStatement",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": [
                "arn:aws:s3:::amzn-s3-demo-bucket/*"
            ],
            "Condition": {
                "StringEquals": {
                    "aws:ResourceAccount": "123456789012"
                }
            }
        }
    ]
}
```

------

If the Amazon S3 bucket is encrypted with an AWS KMS key, the following statement also needs to be added to the knowledge base role. Replace *Region*, *BucketOwnerAccountId*, and *KeyId* with the appropriate values.

```
{
    "Sid": "KmsDecryptStatement",
    "Effect": "Allow",
    "Action": [
        "kms:Decrypt"
    ],
    "Resource": [
        "arn:aws:kms:${Region}:${BucketOwnerAccountId}:key/${KeyId}"
    ],
    "Condition": {
        "StringEquals": {
            "kms:ViaService": [
                "s3.${Region}.amazonaws.com"
            ]
        }
    }
}
```

**Permissions required on a cross-account Amazon S3 bucket policy**

The bucket in the other account requires the following Amazon S3 bucket policy. Replace *KbRoleArn* and *BucketName* with the appropriate values.

------
#### [ JSON ]

****  

```
{
   "Version":"2012-10-17",		 	 	 
   "Statement": [
      {
         "Sid": "ListBucket",
         "Effect": "Allow",
         "Principal": {
            "AWS": "123456789012"
         },
         "Action": [
            "s3:ListBucket"
         ],
         "Resource": [
            "arn:aws:s3:::amzn-s3-demo-bucket"
         ]
      },
      {
         "Sid": "GetObject",
         "Effect": "Allow",
         "Principal": {
            "AWS": "123456789012"
         },
         "Action": [
            "s3:GetObject"
         ],
         "Resource": [
            "arn:aws:s3:::amzn-s3-demo-bucket/*"
         ]
      }
   ]
}
```

------
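As an illustration, a Python sketch (the helper name is ours) that builds this bucket policy with the knowledge base role ARN as the principal, matching the *KbRoleArn* placeholder:

```python
import json

def cross_account_bucket_policy(kb_role_arn: str, bucket_name: str) -> dict:
    """Build the bucket policy that grants the knowledge base role in the
    other account ListBucket on the bucket and GetObject on its objects."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "ListBucket",
                "Effect": "Allow",
                "Principal": {"AWS": kb_role_arn},
                "Action": ["s3:ListBucket"],
                "Resource": [f"arn:aws:s3:::{bucket_name}"],
            },
            {
                "Sid": "GetObject",
                "Effect": "Allow",
                "Principal": {"AWS": kb_role_arn},
                "Action": ["s3:GetObject"],
                "Resource": [f"arn:aws:s3:::{bucket_name}/*"],
            },
        ],
    }

# Example role ARN and bucket name are hypothetical values.
policy = cross_account_bucket_policy(
    "arn:aws:iam::123456789012:role/MyKnowledgeBaseRole",
    "amzn-s3-demo-bucket")
print(json.dumps(policy, indent=4))
```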

**Permissions required on cross-account AWS KMS key policy**

If the cross-account Amazon S3 bucket is encrypted with an AWS KMS key in that account, the key policy requires the following statement. Replace *KbRoleArn* and *KmsKeyArn* with the appropriate values.

```
{
    "Sid": "Example policy",
    "Effect": "Allow",
    "Principal": {
        "AWS": [
            "${KbRoleArn}"
        ]
    },
    "Action": [
        "kms:Decrypt"
    ],
    "Resource": "${KmsKeyArn}"
}
```

# Create a service role for Amazon Bedrock Flows in Amazon Bedrock
<a name="flows-permissions"></a>

To create and manage a flow in Amazon Bedrock, you must use a service role with the necessary permissions outlined on this page. You can use a service role that Amazon Bedrock automatically creates for you in the console or use one that you customize yourself.

**Note**  
If you use the service role that Amazon Bedrock automatically creates for you in the console, it attaches permissions dynamically when you add nodes to your flow and save it. If you remove nodes, however, the corresponding permissions aren't deleted, so you must remove any permissions that you no longer need. To manage the permissions for the role that was created for you, follow the steps at [Modifying a role](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_manage_modify.html) in the IAM User Guide.

To create a custom service role for Amazon Bedrock Flows, create an IAM role by following the steps at [Creating a role to delegate permissions to an AWS service](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-service.html). Then attach the following permissions to the role.
+ Trust policy
+ The following identity-based permissions:
  + Access to the Amazon Bedrock base models that the flow will use. Add each model that's used in the flow to the `Resource` list.
  + If you invoke a model using Provisioned Throughput, permissions to access and invoke the provisioned model. Add each model that's used in the flow to the `Resource` list.
  + If you invoke a custom model, permissions to access and invoke the custom model. Add each model that's used in the flow to the `Resource` list.
  + Permissions based on the nodes that you add to the flow:
    + If you include prompt nodes that use prompts from Prompt management, you need permissions to access the prompt. Add each prompt that's used in the flow to the `Resource` list.
    + If you include knowledge base nodes, you need permissions to query the knowledge base. Add each knowledge base that's queried in the flow to the `Resource` list.
    + If you include agent nodes, you need permissions to invoke an alias of the agent. Add each agent that's invoked in the flow to the `Resource` list.
    + If you include S3 retrieval nodes, you need permissions to access the Amazon S3 bucket from which data will be retrieved. Add each bucket from which data is retrieved to the `Resource` list.
    + If you include S3 storage nodes, you need permissions to write to the Amazon S3 bucket in which output data will be stored. Add each bucket to which data is written to the `Resource` list.
    + If you include guardrails for a knowledge base node or a prompt node, you need permissions to apply the guardrails in a flow. Add each guardrail that's used in the flow to the `Resource` list.
    + If you include Lambda nodes, you need permissions to invoke the Lambda function. Add each Lambda function which needs to be invoked to the `Resource` list.
    + If you include Amazon Lex nodes, you need permissions to use the Amazon Lex bot. Add each bot alias which needs to be used to the `Resource` list.
    + If you encrypted any resource invoked in a flow, you need permissions to decrypt the key. Add each key to the `Resource` list.
+ If you encrypt the flow, you also need to attach a key policy to the KMS key that you use to encrypt the flow.

**Note**  
The following changes were recently implemented:  
Previously, AWS Lambda and Amazon Lex resources were invoked using the Amazon Bedrock service principal. For flows created after 2024-11-22, the Amazon Bedrock Flows service role is used to invoke AWS Lambda and Amazon Lex resources instead. If you created any flows that use either of these resources before 2024-11-22, update your Amazon Bedrock Flows service roles with AWS Lambda and Amazon Lex permissions.  
Previously, Prompt management resources were rendered using the `bedrock:GetPrompt` action. For flows created after 2024-11-22, the `bedrock:RenderPrompt` action is used to render the prompt resource instead. If you created any flows that use a prompt resource before 2024-11-22, update your Amazon Bedrock Flows service roles with `bedrock:RenderPrompt` permissions.
If you're using a service role that Amazon Bedrock automatically created for you in the console, Amazon Bedrock will attach the corrected permissions dynamically when you save the flow.

**Topics**
+ [Trust relationship](#flows-permissions-trust)
+ [Identity-based permissions for the flows service role](#flows-permissions-identity)

## Trust relationship
<a name="flows-permissions-trust"></a>

Attach the following trust policy to the flow execution role to allow Amazon Bedrock to assume this role and manage a flow. Replace the *values* as necessary. The policy contains optional condition keys (see [Condition keys for Amazon Bedrock](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazonbedrock.html#amazonbedrock-policy-keys) and [AWS global condition context keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html)) in the `Condition` field that we recommend you use as a security best practice.

**Note**  
As a best practice, replace the wildcard (`*`) in the `aws:SourceArn` value with the flow ID after you have created the flow.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "FlowsTrustBedrock",
            "Effect": "Allow",
            "Principal": {
                "Service": "bedrock.amazonaws.com"
            },
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "123456789012"
                },
                "ArnLike": {
                    "aws:SourceArn": "arn:aws:bedrock:us-east-1:123456789012:flow/*"
                }
            }
        }
    ]
}
```

------
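If you create the role programmatically, the trust policy above can be generated from your account, Region, and flow ID. The following is a minimal sketch (the function name and parameters are hypothetical, not part of any AWS SDK); it also shows how you might tighten `aws:SourceArn` from the wildcard to a specific flow ID after the flow exists:

```python
import json

def flows_trust_policy(account_id: str, region: str, flow_id: str = "*") -> str:
    """Build the Amazon Bedrock Flows trust policy document.

    Pass flow_id="*" before the flow exists; regenerate with the real
    flow ID afterward to scope the role to that single flow.
    """
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "FlowsTrustBedrock",
                "Effect": "Allow",
                "Principal": {"Service": "bedrock.amazonaws.com"},
                "Action": "sts:AssumeRole",
                "Condition": {
                    # Guard against the confused-deputy problem by pinning
                    # the account and the flow ARN that may assume this role.
                    "StringEquals": {"aws:SourceAccount": account_id},
                    "ArnLike": {
                        "aws:SourceArn": f"arn:aws:bedrock:{region}:{account_id}:flow/{flow_id}"
                    },
                },
            }
        ],
    }
    return json.dumps(policy, indent=4)
```

The returned string can be passed as the assume-role policy document when you create the role.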

## Identity-based permissions for the flows service role
<a name="flows-permissions-identity"></a>

Attach the following policy to provide permissions for the service role, replacing *values* as necessary. The policy contains the following statements. Omit a statement if it isn't applicable to your use-case. The policy contains optional condition keys (see [Condition keys for Amazon Bedrock](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazonbedrock.html#amazonbedrock-policy-keys) and [AWS global condition context keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html)) in the `Condition` field that we recommend you use as a security best practice.
+ Access to the Amazon Bedrock base models that the flow will use. Add each model that's used in the flow to the `Resource` list.
+ If you invoke a model using Provisioned Throughput, permissions to access and invoke the provisioned model. Add each model that's used in the flow to the `Resource` list.
+ If you invoke a custom model, permissions to access and invoke the custom model. Add each model that's used in the flow to the `Resource` list.
+ Permissions based on the nodes that you add to the flow:
  + If you include prompt nodes that use prompts from Prompt management, you need permissions to access the prompt. Add each prompt that's used in the flow to the `Resource` list.
  + If you include knowledge base nodes, you need permissions to query the knowledge base. Add each knowledge base that's queried in the flow to the `Resource` list.
  + If you include agent nodes, you need permissions to invoke an alias of the agent. Add each agent that's invoked in the flow to the `Resource` list.
  + If you include S3 retrieval nodes, you need permissions to access the Amazon S3 bucket from which data will be retrieved. Add each bucket from which data is retrieved to the `Resource` list.
  + If you include S3 storage nodes, you need permissions to write to the Amazon S3 bucket in which output data will be stored. Add each bucket to which data is written to the `Resource` list.
  + If you include guardrails for a knowledge base node or a prompt node, you need permissions to apply the guardrails in a flow. Add each guardrail that's used in the flow to the `Resource` list.
  + If you include Lambda nodes, you need permissions to invoke the Lambda function. Add each Lambda function which needs to be invoked to the `Resource` list.
  + If you include Amazon Lex nodes, you need permissions to use the Amazon Lex bot. Add each bot alias which needs to be used to the `Resource` list.
  + If you encrypted any resource invoked in a flow, you need permissions to decrypt the key. Add each key to the `Resource` list.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "InvokeModel",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1::foundation-model/ModelId"
            ]
        },
        {
            "Sid": "InvokeProvisionedThroughput",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel",
                "bedrock:GetProvisionedModelThroughput"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:provisioned-model/ModelId"
            ]
        },
        {
            "Sid": "InvokeCustomModel",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel",
                "bedrock:GetCustomModel"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:custom-model/ModelId"
            ]
        },
        {
            "Sid": "UsePromptFromPromptManagement",
            "Effect": "Allow",
            "Action": [
                "bedrock:RenderPrompt"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:prompt/PromptId"
            ]
        },
        {
            "Sid": "QueryKnowledgeBase",
            "Effect": "Allow",
            "Action": [
                "bedrock:Retrieve"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:knowledge-base/KnowledgeBaseId"
            ]
        },
        {
            "Sid": "InvokeAgent",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeAgent"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:agent-alias/AgentId/AgentAliasId"
            ]
        },
        {
            "Sid": "AccessS3Bucket",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": [
                "arn:aws:s3:::amzn-s3-demo-bucket/*"
            ],
            "Condition": {
                "StringEquals": {
                    "aws:ResourceAccount": "123456789012"
                }
            }
        },
        {
            "Sid": "WriteToS3Bucket",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject"
            ],
            "Resource": [
                "arn:aws:s3:::amzn-s3-demo-bucket",
                "arn:aws:s3:::amzn-s3-demo-bucket/*"
            ],
            "Condition": {
                "StringEquals": {
                    "aws:ResourceAccount": "123456789012"
                }
            }
        },
        {
            "Sid": "GuardrailPermissions",
            "Effect": "Allow",
            "Action": [
                "bedrock:ApplyGuardrail"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:guardrail/GuardrailId"
            ]
        },
        {
            "Sid": "LambdaPermissions",
            "Effect": "Allow",
            "Action": [
                "lambda:InvokeFunction"
            ],
            "Resource": [
                "arn:aws:lambda:us-east-1:123456789012:function:FunctionId"
            ]
        },
        {
            "Sid": "AmazonLexPermissions",
            "Effect": "Allow",
            "Action": [
                "lex:RecognizeUtterance"
            ],
            "Resource": [ 
                "arn:aws:lex:us-east-1:123456789012:bot-alias/BotId/BotAliasId"
            ]
        },
        {
            "Sid": "KMSPermissions",
            "Effect": "Allow",
            "Action": [
                "kms:GenerateDataKey",
                "kms:Decrypt"
            ],
            "Resource": [
                "arn:aws:kms:us-east-1:123456789012:key/KeyId"
            ],
             "Condition": {
                "StringEquals": {
                    "aws:ResourceAccount": "123456789012"
                }
            }
        }
    ]
}
```

------
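Because you omit statements that don't apply, the identity-based policy is effectively assembled per flow. The sketch below (function and parameter names are hypothetical) builds only the statements that a flow's nodes require, shown for three statement types; the agent, S3, guardrail, Lambda, Lex, and KMS statements extend the same way:

```python
import json

def flows_role_policy(account_id, region, model_ids=(), prompt_ids=(), kb_ids=()):
    """Assemble an identity-based policy containing only the statements
    a given flow needs. Omitting unused statements keeps the role
    closer to least privilege."""
    arn = f"arn:aws:bedrock:{region}"
    statements = []
    if model_ids:
        statements.append({
            "Sid": "InvokeModel",
            "Effect": "Allow",
            "Action": ["bedrock:InvokeModel"],
            # Foundation model ARNs have no account ID component.
            "Resource": [f"{arn}::foundation-model/{m}" for m in model_ids],
        })
    if prompt_ids:
        statements.append({
            "Sid": "UsePromptFromPromptManagement",
            "Effect": "Allow",
            "Action": ["bedrock:RenderPrompt"],
            "Resource": [f"{arn}:{account_id}:prompt/{p}" for p in prompt_ids],
        })
    if kb_ids:
        statements.append({
            "Sid": "QueryKnowledgeBase",
            "Effect": "Allow",
            "Action": ["bedrock:Retrieve"],
            "Resource": [f"{arn}:{account_id}:knowledge-base/{k}" for k in kb_ids],
        })
    return json.dumps({"Version": "2012-10-17", "Statement": statements}, indent=4)
```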

# Service role requirements for model evaluation jobs
<a name="model-evaluation-security-service-roles"></a>

To create a model evaluation job, you must specify a service role. A service role is an [IAM role](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles.html) that a service assumes to perform actions on your behalf. An IAM administrator can create, modify, and delete a service role from within IAM. For more information, see [Create a role to delegate permissions to an AWS service](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-service.html) in the *IAM User Guide*. 

The required IAM actions and resources depend on the type of model evaluation job you are creating. Use the following sections to learn more about the required Amazon Bedrock, Amazon SageMaker AI, and Amazon S3 IAM actions, service principals, and resources. You can optionally choose to encrypt your data using AWS Key Management Service.

**Topics**
+ [Service role requirements for automatic model evaluation jobs](automatic-service-roles.md)
+ [Service role requirements for human-based model evaluation jobs](model-eval-service-roles.md)
+ [Required service role permissions for creating a model evaluation job that uses a judge model](judge-service-roles.md)
+ [Service role requirements for knowledge base evaluation jobs](rag-eval-service-roles.md)

# Service role requirements for automatic model evaluation jobs
<a name="automatic-service-roles"></a>

To create an automatic model evaluation job, you must specify a service role. The policy you attach grants Amazon Bedrock access to resources in your account, and allows Amazon Bedrock to invoke the selected model on your behalf.

You must also attach a trust policy that defines Amazon Bedrock as the service principal using `bedrock.amazonaws.com`. Each of the following policy examples shows you the exact IAM actions that are required based on each service invoked in an automatic model evaluation job.

To create a custom service role, see [Creating a role that uses a custom trust policy](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-custom.html) in the *IAM User Guide*.

**Required Amazon S3 IAM actions**  
The following policy example grants access to the S3 buckets where your model evaluation results are saved, and (optionally) access to any custom prompt datasets you have specified.

------
#### [ JSON ]

****  

```
{
"Version": "2012-10-17",
"Statement": [
    {
        "Sid": "AllowAccessToCustomDatasets",
        "Effect": "Allow",
        "Action": [
            "s3:GetObject",
            "s3:ListBucket"
        ],
        "Resource": [
            "arn:aws:s3:::my_customdataset1_bucket",
            "arn:aws:s3:::my_customdataset1_bucket/myfolder",
            "arn:aws:s3:::my_customdataset2_bucket",
            "arn:aws:s3:::my_customdataset2_bucket/myfolder"
        ]
    },
    {
        "Sid": "AllowAccessToOutputBucket",
        "Effect": "Allow",
        "Action": [
            "s3:GetObject",
            "s3:ListBucket",
            "s3:PutObject",
            "s3:GetBucketLocation",
            "s3:AbortMultipartUpload",
            "s3:ListBucketMultipartUploads"
        ],
        "Resource": [
            "arn:aws:s3:::my_output_bucket",
            "arn:aws:s3:::my_output_bucket/myfolder"
        ]
    }
]
}
```

------
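If you script role creation, it can help to generate the `Resource` entries from your bucket names. The sketch below (names hypothetical) pairs the bucket-level ARN that `s3:ListBucket` evaluates against with an object-level pattern for `s3:GetObject`; the trailing `/*` under the prefix is one common pattern, so adjust it to your own key layout:

```python
def s3_dataset_statement(buckets_with_prefixes):
    """Build the read-access statement for custom prompt datasets.

    For each (bucket, prefix) pair, emit the bucket ARN (used by
    s3:ListBucket) and an object-level ARN pattern under the prefix
    (used by s3:GetObject).
    """
    resources = []
    for bucket, prefix in buckets_with_prefixes:
        resources.append(f"arn:aws:s3:::{bucket}")
        resources.append(f"arn:aws:s3:::{bucket}/{prefix}/*")
    return {
        "Sid": "AllowAccessToCustomDatasets",
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": resources,
    }
```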

**Required Amazon Bedrock IAM actions**  
You also need to create a policy that allows Amazon Bedrock to invoke the model you plan to specify in the automatic model evaluation job. To learn more about managing access to Amazon Bedrock models, see [Access Amazon Bedrock foundation models](model-access.md). In the `"Resource"` section of the policy, you must specify at least one ARN of a model you have access to. To use a model encrypted with a customer managed KMS key, you must add the required IAM actions and resources to the IAM service role policy. You must also add the service role to the AWS KMS key policy.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowAccessToBedrockResources",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel",
                "bedrock:InvokeModelWithResponseStream",
                "bedrock:CreateModelInvocationJob",
                "bedrock:StopModelInvocationJob",
                "bedrock:GetProvisionedModelThroughput",
                "bedrock:GetInferenceProfile", 
                "bedrock:ListInferenceProfiles",
                "bedrock:GetImportedModel",
                "bedrock:GetPromptRouter",
                "sagemaker:InvokeEndpoint"
            ],
            "Resource": [
                "arn:aws:bedrock:*::foundation-model/*",
                "arn:aws:bedrock:*:111122223333:inference-profile/*",
                "arn:aws:bedrock:*:111122223333:provisioned-model/*",
                "arn:aws:bedrock:*:111122223333:imported-model/*",
                "arn:aws:bedrock:*:111122223333:application-inference-profile/*",
                "arn:aws:bedrock:*:111122223333:default-prompt-router/*",
                "arn:aws:sagemaker:*:111122223333:endpoint/*",
                "arn:aws:bedrock:*:111122223333:marketplace/model-endpoint/all-access"
            ]
        }
    ]
}
```

------

**Service principal requirements**  
You must also specify a trust policy that defines Amazon Bedrock as the service principal. This allows Amazon Bedrock to assume the role. The wildcard (`*`) model evaluation job ARN is required so that Amazon Bedrock can create model evaluation jobs in your AWS account.

------
#### [ JSON ]

****  

```
{
"Version": "2012-10-17",
"Statement": [{
    "Sid": "AllowBedrockToAssumeRole",
    "Effect": "Allow",
    "Principal": {
        "Service": "bedrock.amazonaws.com"
    },
    "Action": "sts:AssumeRole",
    "Condition": {
        "StringEquals": {
            "aws:SourceAccount": "111122223333"
        },
        "ArnEquals": {
            "aws:SourceArn": "arn:aws:bedrock:us-east-1:111122223333:evaluation-job/*"
        }
    }
}]
}
```

------
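Putting the pieces together, the trust policy can be generated and saved locally before you create the role. The snippet below (role and file names are hypothetical) writes the document and notes the AWS CLI commands you might then run with the permissions policies saved alongside:

```python
import json
import pathlib

# Trust policy allowing Amazon Bedrock to assume the evaluation service
# role, scoped to evaluation jobs in the named account and Region.
trust = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "AllowBedrockToAssumeRole",
        "Effect": "Allow",
        "Principal": {"Service": "bedrock.amazonaws.com"},
        "Action": "sts:AssumeRole",
        "Condition": {
            "StringEquals": {"aws:SourceAccount": "111122223333"},
            "ArnEquals": {
                "aws:SourceArn": "arn:aws:bedrock:us-east-1:111122223333:evaluation-job/*"
            },
        },
    }],
}

pathlib.Path("trust.json").write_text(json.dumps(trust, indent=4))

# The role could then be created and wired up with commands along
# these lines (role and policy names are placeholders):
#   aws iam create-role --role-name my-eval-role \
#       --assume-role-policy-document file://trust.json
#   aws iam put-role-policy --role-name my-eval-role \
#       --policy-name eval-permissions --policy-document file://permissions.json
```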

# Service role requirements for human-based model evaluation jobs
<a name="model-eval-service-roles"></a>

To create a model evaluation job that uses human evaluators, you must specify two service roles.

The following lists summarize the IAM policy requirements for each required service role that must be specified in the Amazon Bedrock console.

**Summary of IAM policy requirements for the Amazon Bedrock service role**
+ You must attach a trust policy which defines Amazon Bedrock as the service principal.
+ You must allow Amazon Bedrock to invoke the selected models on your behalf.
+ You must allow Amazon Bedrock to access the S3 bucket that holds your prompt dataset and the S3 bucket where you want the results saved.
+ You must allow Amazon Bedrock to create the required human loop resources in your account.
+ (Recommended) Use a `Condition` block to restrict the accounts that can assume the role.
+ (Optional) You must allow Amazon Bedrock to decrypt your KMS key if you've encrypted your prompt dataset bucket or the Amazon S3 bucket where you want the results saved.

**Summary of IAM policy requirements for the Amazon SageMaker AI service role**
+ You must attach a trust policy which defines SageMaker AI as the service principal.
+ You must allow SageMaker AI to access the S3 bucket that holds your prompt dataset and the S3 bucket where you want the results saved.
+ (Optional) You must allow SageMaker AI to use your customer managed keys if you've encrypted your prompt dataset bucket or the Amazon S3 bucket where you want the results saved.

To create a custom service role, see [Creating a role that uses a custom trust policy](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-custom.html) in the *IAM User Guide*.

**Required Amazon S3 IAM actions**  
The following policy example grants access to the S3 buckets where your model evaluation results are saved, and access to the custom prompt dataset you have specified. You need to attach this policy to both the SageMaker AI service role and the Amazon Bedrock service role.

------
#### [ JSON ]

****  

```
{
"Version": "2012-10-17",
"Statement": [
    {
        "Sid": "AllowAccessToCustomDatasets",
        "Effect": "Allow",
        "Action": [
            "s3:GetObject",
            "s3:ListBucket"
        ],
        "Resource": [
            "arn:aws:s3:::custom-prompt-dataset"
        ]
    },
    {
        "Sid": "AllowAccessToOutputBucket",
        "Effect": "Allow",
        "Action": [
            "s3:GetObject",
            "s3:ListBucket",
            "s3:PutObject",
            "s3:GetBucketLocation",
            "s3:AbortMultipartUpload",
            "s3:ListBucketMultipartUploads"
        ],
        "Resource": [
            "arn:aws:s3:::model_evaluation_job_output"
        ]
    }
]
}
```

------

**Required Amazon Bedrock IAM actions**  
To allow Amazon Bedrock to invoke the model you plan to specify in the model evaluation job, attach the following policy to the Amazon Bedrock service role. In the `"Resource"` section of the policy, you must specify at least one ARN of a model you have access to. To use a model encrypted with a customer managed KMS key, you must add the required IAM actions and resources to the IAM service role policy. You must also add any required AWS KMS key policy elements.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowAccessToBedrockResources",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel",
                "bedrock:InvokeModelWithResponseStream",
                "bedrock:CreateModelInvocationJob",
                "bedrock:StopModelInvocationJob",
                "bedrock:GetProvisionedModelThroughput",
                "bedrock:GetInferenceProfile", 
                "bedrock:ListInferenceProfiles",
                "bedrock:GetImportedModel",
                "bedrock:GetPromptRouter",
                "sagemaker:InvokeEndpoint"
            ],
            "Resource": [
                "arn:aws:bedrock:*::foundation-model/*",
                "arn:aws:bedrock:*:111122223333:inference-profile/*",
                "arn:aws:bedrock:*:111122223333:provisioned-model/*",
                "arn:aws:bedrock:*:111122223333:imported-model/*",
                "arn:aws:bedrock:*:111122223333:application-inference-profile/*",
                "arn:aws:bedrock:*:111122223333:default-prompt-router/*",
                "arn:aws:sagemaker:*:111122223333:endpoint/*",
                "arn:aws:bedrock:*:111122223333:marketplace/model-endpoint/all-access"
            ]
        }
    ]
}
```

------

**Required Amazon Augmented AI IAM actions**  
You also must create a policy that allows Amazon Bedrock to create resources related to human-based model evaluation jobs. Because Amazon Bedrock creates the needed resources to start the model evaluation job, you must use `"Resource": "*"`. You must attach this policy to the Amazon Bedrock service role.

------
#### [ JSON ]

****  

```
{
"Version": "2012-10-17",
"Statement": [
    {
        "Sid": "ManageHumanLoops",
        "Effect": "Allow",
        "Action": [
            "sagemaker:StartHumanLoop",
            "sagemaker:DescribeFlowDefinition",
            "sagemaker:DescribeHumanLoop",
            "sagemaker:StopHumanLoop",
            "sagemaker:DeleteHumanLoop"
        ],
        "Resource": "*"
    }
]
}
```

------

**Service principal requirements (Amazon Bedrock)**  
You must also specify a trust policy that defines Amazon Bedrock as the service principal. This allows Amazon Bedrock to assume the role.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowBedrockToAssumeRole",
            "Effect": "Allow",
            "Principal": {
                "Service": "bedrock.amazonaws.com"
            },
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "111122223333"
                },
                "ArnEquals": {
                    "aws:SourceArn": "arn:aws:bedrock:us-east-1:111122223333:evaluation-job/*"
                }
            }
        }
    ]
}
```

------

**Service principal requirements (SageMaker AI)**  
You must also specify a trust policy that defines SageMaker AI as the service principal. This allows SageMaker AI to assume the role.

------
#### [ JSON ]

****  

```
{
"Version": "2012-10-17",
"Statement": [
{
  "Sid": "AllowSageMakerToAssumeRole",
  "Effect": "Allow",
  "Principal": {
    "Service": "sagemaker.amazonaws.com"
  },
  "Action": "sts:AssumeRole"
}
]
}
```

------

# Required service role permissions for creating a model evaluation job that uses a judge model
<a name="judge-service-roles"></a>

To create a model evaluation job that uses an LLM as a judge, you must specify a service role. The policy you attach grants Amazon Bedrock access to resources in your account, and allows Amazon Bedrock to invoke the selected model on your behalf.

The trust policy defines Amazon Bedrock as the service principal using `bedrock.amazonaws.com`. Each of the following policy examples shows you the exact IAM actions that are required based on each service invoked in the model evaluation job.

To create a custom service role as described below, see [Creating a role that uses a custom trust policy](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-custom.html) in the *IAM User Guide*.

## Required Amazon Bedrock IAM actions
<a name="judge-service-roles-br"></a>

You need to create a policy that allows Amazon Bedrock to invoke the models you plan to specify in the model evaluation job. To learn more about managing access to Amazon Bedrock models, see [Access Amazon Bedrock foundation models](model-access.md). In the `"Resource"` section of the policy, you must specify at least one ARN of a model you have access to. To use a model encrypted with a customer managed KMS key, you must add the required IAM actions and resources to the IAM service role policy. You must also add the service role to the AWS KMS key policy.

The service role must include access to at least one supported evaluator model. For a list of currently supported evaluator models, see [Supported models](evaluation-judge.md#evaluation-judge-supported).

------
#### [ JSON ]

****  

```
{
	"Version": "2012-10-17",
	"Statement": [
		{
			"Sid": "BedrockModelInvoke",
			"Effect": "Allow",
			"Action": [
				"bedrock:InvokeModel",
				"bedrock:CreateModelInvocationJob",
				"bedrock:StopModelInvocationJob"
			],
			"Resource": [
				"arn:aws:bedrock:us-east-1::foundation-model/*",
				"arn:aws:bedrock:us-east-1:111122223333:inference-profile/*",
				"arn:aws:bedrock:us-east-1:111122223333:provisioned-model/*",
				"arn:aws:bedrock:us-east-1:111122223333:imported-model/*"
			]
		}
	]
}
```

------

## Required Amazon S3 IAM actions and resources
<a name="judge-service-roles-s3"></a>

Your service role policy needs to include access to the Amazon S3 bucket where you want the output of model evaluation jobs saved, and access to the prompt dataset you have specified in your `CreateEvaluationJob` request or via the Amazon Bedrock console.

------
#### [ JSON ]

****  

```
{
	"Version": "2012-10-17",
	"Statement": [
		{
			"Sid": "FetchAndUpdateOutputBucket",
			"Effect": "Allow",
			"Action": [
				"s3:GetObject",
				"s3:ListBucket",
				"s3:PutObject",
				"s3:GetBucketLocation",
				"s3:AbortMultipartUpload",
				"s3:ListBucketMultipartUploads"
			],
			"Resource": [
				"arn:aws:s3:::my_customdataset1_bucket",
				"arn:aws:s3:::my_customdataset1_bucket/myfolder",
				"arn:aws:s3:::my_customdataset2_bucket",
				"arn:aws:s3:::my_customdataset2_bucket/myfolder"
			]
		}
	]
}
```

------

# Service role requirements for knowledge base evaluation jobs
<a name="rag-eval-service-roles"></a>

To create a knowledge base evaluation job, you must specify a service role. The policy that you attach to the role grants Amazon Bedrock access to resources in your account, and it allows Amazon Bedrock to do the following:
+ Invoke the models that you select for output generation with the `RetrieveAndGenerate` API action, and evaluate the knowledge base outputs.
+ Invoke the Amazon Bedrock Knowledge Bases `Retrieve` and `RetrieveAndGenerate` API actions on your knowledge base instance.

To create a custom service role, see [Creating a role that uses custom trust policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-custom.html) in the *IAM User Guide*.

**Required IAM actions for Amazon S3 access**  
The following example policy grants access to the S3 buckets where both of the following occur: 
+ You save your knowledge base evaluation results.
+ Amazon Bedrock reads your input dataset.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement":
    [
        {
            "Sid": "AllowAccessToCustomDatasets",
            "Effect": "Allow",
            "Action":
            [
                "s3:GetObject",
                "s3:ListBucket"
            ],
            "Resource":
            [
                "arn:aws:s3:::my_customdataset1_bucket",
                "arn:aws:s3:::my_customdataset1_bucket/myfolder",
                "arn:aws:s3:::my_customdataset2_bucket",
                "arn:aws:s3:::my_customdataset2_bucket/myfolder"
            ]
        },
        {
            "Sid": "AllowAccessToOutputBucket",
            "Effect": "Allow",
            "Action":
            [
                "s3:GetObject",
                "s3:ListBucket",
                "s3:PutObject",
                "s3:GetBucketLocation",
                "s3:AbortMultipartUpload",
                "s3:ListBucketMultipartUploads"
            ],
            "Resource":
            [
                "arn:aws:s3:::my_output_bucket",
                "arn:aws:s3:::my_output_bucket/myfolder"
            ]
        }
    ]
}
```

------

**Required Amazon Bedrock IAM actions**  
You also need to create a policy that allows Amazon Bedrock to do the following:

1. Invoke the models that you plan to specify for the following: 
   + Result generation with the `RetrieveAndGenerate` API action.
   + Evaluation of results.

   For the `Resource` key in the policy, you must specify at least one ARN of a model you have access to. To use a model that's encrypted with a customer-managed KMS key, you must add the required IAM actions and resources to the IAM service role policy. You must also add the service role to the AWS KMS key policy.

1. Call the `Retrieve` and `RetrieveAndGenerate` API actions. Note that when the console automatically creates the role, it grants permissions for both the `Retrieve` and `RetrieveAndGenerate` API actions, regardless of which action you choose to evaluate for that job. This gives the role additional flexibility and reusability. However, for added security, the automatically created role is tied to a single knowledge base instance.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowSpecificModels",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel",
                "bedrock:InvokeModelWithResponseStream",
                "bedrock:CreateModelInvocationJob",
                "bedrock:StopModelInvocationJob",
                "bedrock:GetProvisionedModelThroughput",
                "bedrock:GetInferenceProfile",
                "bedrock:GetImportedModel"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1::foundation-model/*",
                "arn:aws:bedrock:us-east-1:123456789012:inference-profile/*",
                "arn:aws:bedrock:us-east-1:123456789012:provisioned-model/*",
                "arn:aws:bedrock:us-east-1:123456789012:imported-model/*",
                "arn:aws:bedrock:us-east-1:123456789012:application-inference-profile/*"
            ]
        },
        {
            "Sid": "AllowKnowledgeBaseApis",
            "Effect": "Allow",
            "Action": [
                "bedrock:Retrieve",
                "bedrock:RetrieveAndGenerate"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:knowledge-base/knowledge-base-id"
            ]
        }
    ]
}
```

------

**Service principal requirements**  
You must also specify a trust policy that defines Amazon Bedrock as the service principal. This policy allows Amazon Bedrock to assume the role. The wildcard (`*`) model evaluation job ARN is required so that Amazon Bedrock can create model evaluation jobs in your AWS account.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowBedrockToAssumeRole",
            "Effect": "Allow",
            "Principal": {
                "Service": "bedrock.amazonaws.com"
            },
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "123456789012"
                },
                "ArnEquals": {
                    "aws:SourceArn": "arn:aws:bedrock:us-east-1:123456789012:evaluation-job/*"
                }
            }
        }
    ]
}
```

------