

# Security
<a name="security-overview"></a>

Security in Amazon Bedrock encompasses multiple layers of protection for your data, applications, and infrastructure.

**Topics**
+ [Data protection](data-protection.md)
+ [Identity and access management for Amazon Bedrock](security-iam.md)
+ [Cross-account access to Amazon S3 bucket for custom model import jobs](cross-account-access-cmi.md)
+ [Compliance validation for Amazon Bedrock](compliance-validation.md)
+ [Incident response in Amazon Bedrock](security-incident-response.md)
+ [Resilience in Amazon Bedrock](disaster-recovery-resiliency.md)
+ [Infrastructure security in Amazon Bedrock](infrastructure-security.md)
+ [Cross-service confused deputy prevention](cross-service-confused-deputy-prevention.md)
+ [Configuration and vulnerability analysis in Amazon Bedrock](vulnerability-analysis-and-management.md)
+ [Amazon Bedrock abuse detection](abuse-detection.md)
+ [Prompt injection security](prompt-injection.md)

# Data protection
<a name="data-protection"></a>

The AWS [shared responsibility model](https://aws.amazon.com/compliance/shared-responsibility-model/) applies to data protection in Amazon Bedrock. As described in this model, AWS is responsible for protecting the global infrastructure that runs all of the AWS Cloud. You are responsible for maintaining control over your content that is hosted on this infrastructure. You are also responsible for the security configuration and management tasks for the AWS services that you use. For more information about data privacy, see the [Data Privacy FAQ](https://aws.amazon.com/compliance/data-privacy-faq/). For information about data protection in Europe, see the [AWS Shared Responsibility Model and GDPR](https://aws.amazon.com/blogs/security/the-aws-shared-responsibility-model-and-gdpr/) blog post on the *AWS Security Blog*.

For data protection purposes, we recommend that you protect AWS account credentials and set up individual users with AWS IAM Identity Center or AWS Identity and Access Management (IAM). That way, each user is given only the permissions necessary to fulfill their job duties. We also recommend that you secure your data in the following ways:
+ Use multi-factor authentication (MFA) with each account.
+ Use SSL/TLS to communicate with AWS resources. We require TLS 1.2 and recommend TLS 1.3.
+ Set up API and user activity logging with AWS CloudTrail. For information about using CloudTrail trails to capture AWS activities, see [Working with CloudTrail trails](https://docs.aws.amazon.com/awscloudtrail/latest/userguide/cloudtrail-trails.html) in the *AWS CloudTrail User Guide*.
+ Use AWS encryption solutions, along with all default security controls within AWS services.
+ Use advanced managed security services such as Amazon Macie, which assists in discovering and securing sensitive data that is stored in Amazon S3.
+ If you require FIPS 140-3 validated cryptographic modules when accessing AWS through a command line interface or an API, use a FIPS endpoint. For more information about the available FIPS endpoints, see [Federal Information Processing Standard (FIPS) 140-3](https://aws.amazon.com/compliance/fips/).
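As a sketch of the last point, you can direct an SDK client at a FIPS endpoint by supplying the endpoint URL explicitly. The endpoint name below follows the `bedrock-runtime-fips.<region>.amazonaws.com` pattern from the AWS FIPS endpoint listings; verify it for your Region, and note that the boto3 call is shown only in a comment so the snippet stays self-contained.

```python
# Sketch: build a FIPS endpoint URL for Amazon Bedrock Runtime.
# The endpoint pattern is an assumption based on the AWS FIPS endpoint
# listings; confirm availability in your Region before relying on it.

def bedrock_runtime_fips_endpoint(region: str) -> str:
    """Return the presumed FIPS endpoint URL for Bedrock Runtime in a Region."""
    return f"https://bedrock-runtime-fips.{region}.amazonaws.com"

endpoint = bedrock_runtime_fips_endpoint("us-east-1")
# With boto3 you would then pass it as endpoint_url:
# client = boto3.client("bedrock-runtime", region_name="us-east-1",
#                       endpoint_url=endpoint)
print(endpoint)
```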

We strongly recommend that you never put confidential or sensitive information, such as your customers' email addresses, into tags or free-form text fields such as a **Name** field. This includes when you work with Amazon Bedrock or other AWS services using the console, API, AWS CLI, or AWS SDKs. Any data that you enter into tags or free-form text fields used for names may be used for billing or diagnostic logs. If you provide a URL to an external server, we strongly recommend that you do not include credentials information in the URL to validate your request to that server.

For more information on how your data is processed for abuse detection, see [Amazon Bedrock abuse detection](https://docs.aws.amazon.com/bedrock/latest/userguide/abuse-detection.html).

Amazon Bedrock has a concept of a Model Deployment Account—in each AWS Region where Amazon Bedrock is available, there is one such deployment account per model provider. These accounts are owned and operated by the Amazon Bedrock service team. Model providers don't have any access to those accounts. After delivery of a model from a model provider to AWS, Amazon Bedrock will perform a deep copy of a model provider’s inference and training software into those accounts for deployment. Because the model providers don't have access to those accounts, they don't have access to Amazon Bedrock logs or to customer prompts and completions.

**Topics**
+ [Data encryption](data-encryption.md)
+ [Protect your data using Amazon VPC and AWS PrivateLink](usingVPC.md)

# Data encryption
<a name="data-encryption"></a>

Amazon Bedrock uses encryption to protect data at rest and data in transit.

**Encryption in transit**

Within AWS, all inter-network data in transit supports TLS 1.2 encryption.

Requests to the Amazon Bedrock API and console are made over a secure (SSL) connection. You pass AWS Identity and Access Management (IAM) roles to Amazon Bedrock to provide permissions to access resources on your behalf for training and deployment. 

**Encryption at rest**

Amazon Bedrock provides [Encryption of custom models](encryption-custom-job.md) at rest.

## Key management
<a name="key-management"></a>

Use the AWS Key Management Service to manage the keys that you use to encrypt your resources. For more information, see [AWS Key Management Service concepts](https://docs.aws.amazon.com/kms/latest/developerguide/concepts.html#master_keys). You can encrypt the following resources with a KMS key.
+ Through Amazon Bedrock
  + Model customization jobs and their output custom models – During job creation in the console or by specifying the `customModelKmsKeyId` field in the [CreateModelCustomizationJob](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_CreateModelCustomizationJob.html) API call.
  + Agents – During agent creation in the console or by specifying the `customerEncryptionKeyArn` field in the [CreateAgent](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_agent_CreateAgent.html) API call.
  + Data source ingestion jobs for knowledge bases – During knowledge base creation in the console or by specifying the `kmsKeyArn` field in the [CreateDataSource](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_agent_CreateDataSource.html) or [UpdateDataSource](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_agent_UpdateDataSource.html) API call.
  + Vector stores in Amazon OpenSearch Service – During vector store creation. For more information, see [Creating, listing, and deleting Amazon OpenSearch Service collections](https://docs.aws.amazon.com/opensearch-service/latest/developerguide/serverless-manage.html) and [Encryption of data at rest for Amazon OpenSearch Service](https://docs.aws.amazon.com/opensearch-service/latest/developerguide/encryption-at-rest.html).
  + Model evaluation jobs – During model evaluation job creation in the console or by specifying a key ARN in the `customerEncryptionKeyId` field in the [CreateEvaluationJob](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_CreateEvaluationJob.html) API call.
+ Through Amazon S3 – For more information, see [Using server-side encryption with AWS KMS keys (SSE-KMS)](https://docs.aws.amazon.com/AmazonS3/latest/userguide/UsingKMSEncryption.html).
  + Training, validation, and output data for model customization
  + Data sources for knowledge bases
+ Through AWS Secrets Manager – For more information, see [Secret encryption and decryption in AWS Secrets Manager](https://docs.aws.amazon.com/secretsmanager/latest/userguide/security-encryption.html).
  + Vector stores for third-party models

After you encrypt a resource, you can find the ARN of the KMS key by selecting a resource and viewing its **Details** in the console or by using the following `Get` API calls.
+ [GetModelCustomizationJob](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_GetModelCustomizationJob.html)
+ [GetAgent](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_agent_GetAgent.html)
+ [GetIngestionJob](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_agent_GetIngestionJob.html)
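As a minimal sketch of the first case, the request below passes a customer managed key in `customModelKmsKeyId` when creating a model customization job. The job name, model names, role, and bucket URIs are placeholders; only the field names come from the API reference, and the boto3 call itself is shown in a comment.

```python
# Sketch: parameters for CreateModelCustomizationJob with a customer
# managed KMS key. All resource names below are placeholders.

def customization_job_params(kms_key_arn: str) -> dict:
    """Build the request parameters, encrypting the output model with kms_key_arn."""
    return {
        "jobName": "my-customization-job",            # placeholder
        "customModelName": "my-custom-model",         # placeholder
        "roleArn": "arn:aws:iam::111122223333:role/Role",
        "baseModelIdentifier": "amazon.titan-text-express-v1",
        "trainingDataConfig": {"s3Uri": "s3://amzn-s3-demo-bucket/train.jsonl"},
        "outputDataConfig": {"s3Uri": "s3://amzn-s3-demo-bucket/output/"},
        "customModelKmsKeyId": kms_key_arn,           # the encryption key
    }

params = customization_job_params(
    "arn:aws:kms:us-east-1:111122223333:key/1234abcd-12ab-34cd-56ef-123456SAMPLE")
# With boto3:
# bedrock = boto3.client("bedrock")
# bedrock.create_model_customization_job(**params)
print(params["customModelKmsKeyId"])
```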

# Encryption of custom models
<a name="encryption-custom-job"></a>

Amazon Bedrock uses your training data with the [CreateModelCustomizationJob](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_CreateModelCustomizationJob.html) action, or with the [console](model-customization-submit.md), to create a custom model, which is a fine-tuned version of an Amazon Bedrock foundation model. Your custom models are managed and stored by AWS.

Amazon Bedrock uses the fine-tuning data you provide only to fine-tune an Amazon Bedrock foundation model. Amazon Bedrock doesn't use fine-tuning data for any other purpose. Your training data isn't used to train the base Titan models or distributed to third parties. Other usage data, such as usage timestamps, logged account IDs, and other information logged by the service, is also not used to train the models.

Amazon Bedrock doesn't store any of the training or validation data you provide for fine-tuning after the fine-tuning job completes.

Note that fine-tuned models can reproduce some of their fine-tuning data while generating completions. If your application must not expose fine-tuning data in any form, first filter confidential data out of your training data. If you already created a customized model using confidential data by mistake, you can delete that custom model, filter the confidential information out of the training data, and then create a new model.
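As an illustration of that filtering step, the sketch below drops JSONL training records that contain email-like strings before the data is submitted for fine-tuning. The regex is deliberately simple and the records are hypothetical; real redaction needs a fuller PII strategy.

```python
import re

# Sketch: filter JSONL training records containing email addresses out of
# fine-tuning data. The pattern is a simple illustration, not a complete
# PII detector.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def filter_confidential(lines):
    """Keep only training records with no email-like strings."""
    return [line for line in lines if not EMAIL_RE.search(line)]

records = [
    '{"prompt": "Summarize the report", "completion": "..."}',
    '{"prompt": "Email jane.doe@example.com the invoice", "completion": "..."}',
]
print(filter_confidential(records))
```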

For encrypting custom models (including copied models), Amazon Bedrock offers you two options:

1. **AWS owned keys** – By default, Amazon Bedrock encrypts custom models with AWS owned keys. You can't view, manage, or use AWS owned keys, or audit their use. However, you don't have to take any action or change any programs to protect the keys that encrypt your data. For more information, see [AWS owned keys](https://docs.aws.amazon.com/kms/latest/developerguide/concepts.html#aws-owned-cmk) in the *AWS Key Management Service Developer Guide*.

1. **Customer managed keys** – You can choose to encrypt custom models with customer managed keys that you manage yourself. For more information about AWS KMS keys, see [Customer managed keys](https://docs.aws.amazon.com/kms/latest/developerguide/concepts.html#customer-cmk) in the *AWS Key Management Service Developer Guide*.

**Note**  
Amazon Bedrock automatically enables encryption at rest using AWS owned keys at no charge. If you use a customer managed key, AWS KMS charges apply. For more information about pricing, see [AWS Key Management Service pricing](https://aws.amazon.com/kms/pricing/).

For more information about AWS KMS, see the [AWS Key Management Service Developer Guide](https://docs.aws.amazon.com/kms/latest/developerguide/).

**Topics**
+ [How Amazon Bedrock uses grants in AWS KMS](#encryption-br-grants)
+ [Understand how to create a customer managed key and how to attach a key policy to it](#encryption-key-policy)
+ [Permissions and key policies for custom and copied models](#encryption-cm-statements)
+ [Monitor your encryption keys for the Amazon Bedrock service](#encryption-monitor-key)
+ [Encryption of training, validation, and output data](#encryption-custom-job-data)

## How Amazon Bedrock uses grants in AWS KMS
<a name="encryption-br-grants"></a>

If you specify a customer managed key to encrypt a custom model for a model customization or model copy job, Amazon Bedrock creates a **primary** KMS [grant](https://docs.aws.amazon.com/kms/latest/developerguide/grants.html) associated with the custom model on your behalf by sending a [CreateGrant](https://docs.aws.amazon.com/kms/latest/APIReference/API_CreateGrant.html) request to AWS KMS. This grant allows Amazon Bedrock to access and use your customer managed key. Grants in AWS KMS are used to give Amazon Bedrock access to a KMS key in a customer’s account.

Amazon Bedrock requires the primary grant to use your customer managed key for the following internal operations:
+ Send [DescribeKey](https://docs.aws.amazon.com/kms/latest/APIReference/API_DescribeKey.html) requests to AWS KMS to verify that the symmetric customer managed KMS key ID you entered when creating the job is valid.
+ Send [GenerateDataKey](https://docs.aws.amazon.com/kms/latest/APIReference/API_GenerateDataKey.html) and [Decrypt](https://docs.aws.amazon.com/kms/latest/APIReference/API_Decrypt.html) requests to AWS KMS to generate data keys encrypted by your customer managed key and to decrypt the encrypted data keys so that they can be used to encrypt the model artifacts.
+ Send [CreateGrant](https://docs.aws.amazon.com/kms/latest/APIReference/API_CreateGrant.html) requests to AWS KMS to create scoped-down secondary grants with a subset of the preceding operations (`DescribeKey`, `GenerateDataKey`, `Decrypt`) for the asynchronous execution of model customization, model copy, or Provisioned Throughput creation.
+ Specify a retiring principal during grant creation so that the service can later send [RetireGrant](https://docs.aws.amazon.com/kms/latest/APIReference/API_RetireGrant.html) requests.

You have full access to your customer managed AWS KMS key. You can revoke access to the grant by following the steps at [Retiring and revoking grants](https://docs.aws.amazon.com/kms/latest/developerguide/grant-manage.html#grant-delete) in the [AWS Key Management Service Developer Guide](https://docs.aws.amazon.com/kms/latest/developerguide/) or remove the service’s access to your customer managed key at any time by modifying the [key policy](https://docs.aws.amazon.com/kms/latest/developerguide/key-policies.html). If you do so, Amazon Bedrock won’t be able to access the custom model encrypted by your key.

### Life cycle of primary and secondary grants for custom models
<a name="encryption-primary-secondary-grants"></a>
+ **Primary grants** have a long lifespan and remain active as long as the associated custom models are still in use. When a custom model is deleted, the corresponding primary grant is automatically retired.
+ **Secondary grants** are short-lived. They are automatically retired as soon as the operation that Amazon Bedrock performs on behalf of the customers is completed. For example, once a model copy job is finished, the secondary grant that allowed Amazon Bedrock to encrypt the copied custom model will be retired immediately.
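If you choose to retire a grant yourself rather than wait for the automatic retirement described above, the KMS `RetireGrant` call takes the key ARN and the grant ID. A hedged sketch, with placeholder values and the boto3 calls shown only in comments:

```python
# Sketch: parameters for retiring a KMS grant by key ARN and grant ID.
# Both values below are placeholders.

def retire_grant_params(key_arn: str, grant_id: str) -> dict:
    """Build the RetireGrant request parameters."""
    return {"KeyId": key_arn, "GrantId": grant_id}

params = retire_grant_params(
    "arn:aws:kms:us-east-1:111122223333:key/1234abcd-12ab-34cd-56ef-123456SAMPLE",
    "0ab0ac0d0b000f00ea00cc0a0e00fc00bce000c000f0000000c0bc0a0000aaafSAMPLE")
# With boto3:
# kms = boto3.client("kms")
# kms.list_grants(KeyId=params["KeyId"])   # enumerate grants first
# kms.retire_grant(**params)
print(sorted(params))
```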

## Understand how to create a customer managed key and how to attach a key policy to it
<a name="encryption-key-policy"></a>

To encrypt an AWS resource with a key that you create and manage, you perform the following general steps:

1. (Prerequisite) Ensure that your IAM role has permissions for the [CreateKey](https://docs.aws.amazon.com/kms/latest/APIReference/API_CreateKey.html) action.

1. Follow the steps at [Creating keys](https://docs.aws.amazon.com/kms/latest/developerguide/create-keys.html) to create a customer managed key by using the AWS KMS console or the [CreateKey](https://docs.aws.amazon.com/kms/latest/APIReference/API_CreateKey.html) operation.

1. Creation of the key returns an `Arn` for the key that you can use for operations that require using the key (for example, when [submitting a model customization job](model-customization-submit.md) or [running model inference](inference-invoke.md)).

1. Create and attach a key policy to the key with the required permissions. To create a key policy, follow the steps at [Creating a key policy](https://docs.aws.amazon.com/kms/latest/developerguide/key-policy-overview.html) in the *AWS Key Management Service Developer Guide*.
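The steps above can be sketched as follows. The policy here is a minimal stand-in (use the statements from the following sections for real custom-model keys), and the boto3 calls are shown in comments so the snippet stays runnable without credentials:

```python
import json

# Sketch of steps 2-4: create the key, capture its Arn, then attach a
# policy document. The statement below is a simplified stand-in.
policy = {
    "Version": "2012-10-17",
    "Id": "PermissionsCustomModelKey",
    "Statement": [{
        "Sid": "PermissionsEncryptDecryptModel",
        "Effect": "Allow",
        "Principal": {"AWS": ["arn:aws:iam::111122223333:role/Role"]},
        "Action": ["kms:Decrypt", "kms:GenerateDataKey",
                   "kms:DescribeKey", "kms:CreateGrant"],
        "Resource": "*",
    }],
}
policy_document = json.dumps(policy)
# With boto3:
# kms = boto3.client("kms")
# key_arn = kms.create_key()["KeyMetadata"]["Arn"]
# kms.put_key_policy(KeyId=key_arn, PolicyName="default",
#                    Policy=policy_document)
print(json.loads(policy_document)["Id"])
```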

## Permissions and key policies for custom and copied models
<a name="encryption-cm-statements"></a>

After you create a KMS key, you attach a key policy to it. Key policies are [resource-based policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_identity-vs-resource.html) that you attach to your customer managed key to control access to it. Every customer managed key must have exactly one key policy, which contains statements that determine who can use the key and how they can use it. You can specify a key policy when you create your customer managed key. You can modify the key policy at any time, but there might be a brief delay before the change becomes available throughout AWS KMS. For more information, see [Managing access to customer managed keys](https://docs.aws.amazon.com/kms/latest/developerguide/control-access-overview.html#managing-access) in the [AWS Key Management Service Developer Guide](https://docs.aws.amazon.com/kms/latest/developerguide/).

The following KMS [actions](https://docs.aws.amazon.com/service-authorization/latest/reference/list_awskeymanagementservice.html#awskeymanagementservice-actions-as-permissions) are used for keys that encrypt custom and copied models:

1. [kms:CreateGrant](https://docs.aws.amazon.com/kms/latest/APIReference/API_CreateGrant.html) – Creates a grant for a customer managed key by allowing the Amazon Bedrock service principal access to the specified KMS key through [grant operations](https://docs.aws.amazon.com/kms/latest/developerguide/grants.html#terms-grant-operations). For more information about grants, see [Grants in AWS KMS](https://docs.aws.amazon.com/kms/latest/developerguide/grants.html) in the [AWS Key Management Service Developer Guide](https://docs.aws.amazon.com/kms/latest/developerguide/).
**Note**  
Amazon Bedrock also sets up a retiring principal and automatically retires the grant after it is no longer required.

1. [kms:DescribeKey](https://docs.aws.amazon.com/kms/latest/APIReference/API_DescribeKey.html) – Provides the customer managed key details to allow Amazon Bedrock to validate the key.

1. [kms:GenerateDataKey](https://docs.aws.amazon.com/kms/latest/APIReference/API_GenerateDataKey.html) – Provides the customer managed key details to allow Amazon Bedrock to validate user access. Amazon Bedrock stores generated ciphertext alongside the custom model to be used as an additional validation check against custom model users.

1. [kms:Decrypt](https://docs.aws.amazon.com/kms/latest/APIReference/API_Decrypt.html) – Decrypts the stored ciphertext to validate that the role has proper access to the KMS key that encrypts the custom model.

As a security best practice, we recommend that you include the [kms:ViaService](https://docs.aws.amazon.com/kms/latest/developerguide/conditions-kms.html#conditions-kms-via-service) condition key to limit access to the key to the Amazon Bedrock service.

Although you can attach only one key policy to a key, the key policy can contain multiple statements: add statements to the list in the `Statement` field of the policy.

The following statements are relevant to encrypting custom and copied models:

### Encrypt a model
<a name="encryption-key-policy-encrypt"></a>

To use your customer managed key to encrypt a custom or copied model, include the following statement in a key policy to allow encryption of a model. In the `Principal` field, add the accounts that you want to allow to encrypt and decrypt with the key to the list that the `AWS` subfield maps to. If you use the `kms:ViaService` condition key, you can add a line for each Region, or use `*` in place of `${region}` to allow all Regions that support Amazon Bedrock.

```
{
    "Sid": "PermissionsEncryptDecryptModel",
    "Effect": "Allow",
    "Principal": {
        "AWS": [
            "arn:aws:iam::${account-id}:role/${role}"
        ]
    },
    "Action": [
        "kms:Decrypt",
        "kms:GenerateDataKey",
        "kms:DescribeKey",
        "kms:CreateGrant"
    ],
    "Resource": "*",
    "Condition": {
        "StringLike": {
            "kms:ViaService": [
                "bedrock.${region}.amazonaws.com"
            ] 
        }
    }
}
```
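A quick sketch of filling in that template: the `${account-id}`, `${role}`, and `${region}` placeholders become concrete values before the statement goes into a key policy. Python's `string.Template` can't use hyphens in placeholder names, so the sketch uses `$account` in place of `${account-id}`:

```python
import json
from string import Template

# Sketch: substitute concrete values into the key-policy statement above.
# $account stands in for ${account-id} because Template identifiers
# can't contain hyphens.
statement_template = Template(json.dumps({
    "Sid": "PermissionsEncryptDecryptModel",
    "Effect": "Allow",
    "Principal": {"AWS": ["arn:aws:iam::$account:role/$role"]},
    "Action": ["kms:Decrypt", "kms:GenerateDataKey",
               "kms:DescribeKey", "kms:CreateGrant"],
    "Resource": "*",
    "Condition": {"StringLike": {
        "kms:ViaService": ["bedrock.$region.amazonaws.com"]}},
}))

statement = json.loads(statement_template.substitute(
    account="111122223333", role="Role", region="us-east-1"))
print(statement["Condition"]["StringLike"]["kms:ViaService"][0])
```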

### Allow access to an encrypted model
<a name="encryption-key-policy-decrypt"></a>

To allow access to a model that has been encrypted with a KMS key, include the following statement in a key policy to allow decryption of the key. In the `Principal` field, add the accounts that you want to allow to decrypt with the key to the list that the `AWS` subfield maps to. If you use the `kms:ViaService` condition key, you can add a line for each Region, or use `*` in place of `${region}` to allow all Regions that support Amazon Bedrock.

```
{
    "Sid": "PermissionsDecryptModel",
    "Effect": "Allow",
    "Principal": {
        "AWS": [
            "arn:aws:iam::${account-id}:role/${role}"
        ]
    },
    "Action": [
        "kms:Decrypt"
    ],
    "Resource": "*",
    "Condition": {
        "StringLike": {
            "kms:ViaService": [
                "bedrock.${region}.amazonaws.com"
            ] 
        }
    }
}
```

To learn about the key policies that you need to create, see the section that corresponds to your use case:

### Set up key permissions for encrypting custom models
<a name="encryption-cm"></a>

If you plan to encrypt a model that you customize with a KMS key, the key policy for the key depends on your use case. See the section that corresponds to your use case:

#### The roles that will customize the model and the roles that will invoke the model are the same
<a name="encryption-cm-custom-invoke-same"></a>

If the roles that will invoke the custom model are the same as the roles that will customize the model, you only need the statement from [Encrypt a model](#encryption-key-policy-encrypt). In the `Principal` field in the following policy template, add accounts that you want to allow to customize and invoke the custom model to the list that the `AWS` subfield maps to.

```
{
    "Version":"2012-10-17",		 	 	 
    "Id": "PermissionsCustomModelKey",
    "Statement": [
        {
            "Sid": "PermissionsEncryptCustomModel",
            "Effect": "Allow",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::111122223333:role/Role"
                ]
            },
            "Action": [
                "kms:Decrypt",
                "kms:GenerateDataKey",
                "kms:DescribeKey",
                "kms:CreateGrant"
            ],
            "Resource": "*",
            "Condition": {
                "StringLike": {
                    "kms:ViaService": [
                        "bedrock.us-east-1.amazonaws.com"
                    ]
                }
            }
        }
    ]
}
```


#### The roles that will customize the model and the roles that will invoke the model are different
<a name="encryption-custom-invoke-different"></a>

If the roles that will invoke the custom model are different from the roles that will customize the model, you need both the statement from [Encrypt a model](#encryption-key-policy-encrypt) and the statement from [Allow access to an encrypted model](#encryption-key-policy-decrypt). Modify the statements in the following policy template as follows:

1. The first statement allows encryption and decryption of the key. In the `Principal` field, add accounts that you want to allow to customize the custom model to the list that the `AWS` subfield maps to.

1. The second statement allows only decryption of the key. In the `Principal` field, add accounts that you want to only allow to invoke the custom model to the list that the `AWS` subfield maps to.

```
{
    "Version":"2012-10-17",		 	 	 
    "Id": "PermissionsCustomModelKey",
    "Statement": [
        {
            "Sid": "PermissionsEncryptCustomModel",
            "Effect": "Allow",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::111122223333:role/Role"
                ]
            },
            "Action": [
                "kms:Decrypt",
                "kms:GenerateDataKey",
                "kms:DescribeKey",
                "kms:CreateGrant"
            ],
            "Resource": "*",
            "Condition": {
                "StringLike": {
                    "kms:ViaService": [
                        "bedrock.us-east-1.amazonaws.com"
                    ]
                }
            }
        },
        {
            "Sid": "PermissionsDecryptModel",
            "Effect": "Allow",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::111122223333:role/Role"
                ]
            },
            "Action": [
                "kms:Decrypt"
            ],
            "Resource": "*",
            "Condition": {
                "StringLike": {
                    "kms:ViaService": [
                        "bedrock.us-east-1.amazonaws.com"
                    ]
                }
            }
        }
    ]
}
```


### Set up key permissions for copying custom models
<a name="encryption-copy"></a>

When you copy a model that you own or that has been shared with you, you might have to manage up to two key policies:

#### Key policy for key that will encrypt a copied model
<a name="encryption-copied-model-key-policy"></a>

If you plan to use a KMS key to encrypt a copied model, the key policy for the key depends on your use case. See the section that corresponds to your use case:

##### The roles that will copy the model and the roles that will invoke the model are the same
<a name="encryption-copied-model-copy-invoke-same"></a>

If the roles that will invoke the copied model are the same as the roles that will create the model copy, you only need the statement from [Encrypt a model](#encryption-key-policy-encrypt). In the `Principal` field in the following policy template, add accounts that you want to allow to copy and invoke the copied model to the list that the `AWS` subfield maps to:

```
{
    "Version":"2012-10-17",		 	 	 
    "Id": "PermissionsCopiedModelKey",
    "Statement": [
        {
            "Sid": "PermissionsEncryptCopiedModel",
            "Effect": "Allow",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::111122223333:role/Role"
                ]
            },
            "Action": [
                "kms:Decrypt",
                "kms:GenerateDataKey",
                "kms:DescribeKey",
                "kms:CreateGrant"
            ],
            "Resource": "*",
            "Condition": {
                "StringLike": {
                    "kms:ViaService": [
                        "bedrock.us-east-1.amazonaws.com"
                    ]
                }
            }
        }
    ]
}
```


##### The roles that will copy the model and the roles that will invoke the model are different
<a name="encryption-copied-model-copy-invoke-different"></a>

If the roles that will invoke the copied model are different from the roles that will create the model copy, you need both the statement from [Encrypt a model](#encryption-key-policy-encrypt) and the statement from [Allow access to an encrypted model](#encryption-key-policy-decrypt). Modify the statements in the following policy template as follows:

1. The first statement allows encryption and decryption of the key. In the `Principal` field, add accounts that you want to allow to create the copied model to the list that the `AWS` subfield maps to.

1. The second statement allows only decryption of the key. In the `Principal` field, add accounts that you want to only allow to invoke the copied model to the list that the `AWS` subfield maps to.

```
{
    "Version":"2012-10-17",		 	 	 
    "Id": "PermissionsCopiedModelKey",
    "Statement": [
        {
            "Sid": "PermissionsEncryptCopiedModel",
            "Effect": "Allow",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::111122223333:role/Role"
                ]
            },
            "Action": [
                "kms:Decrypt",
                "kms:GenerateDataKey",
                "kms:DescribeKey",
                "kms:CreateGrant"
            ],
            "Resource": "*",
            "Condition": {
                "StringLike": {
                    "kms:ViaService": [
                        "bedrock.us-east-1.amazonaws.com"
                    ]
                }
            }
        },
        {
            "Sid": "PermissionsDecryptCopiedModel",
            "Effect": "Allow",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::111122223333:role/Role"
                ]
            },
            "Action": [
                "kms:Decrypt"
            ],
            "Resource": "*",
            "Condition": {
                "StringLike": {
                    "kms:ViaService": [
                        "bedrock.us-east-1.amazonaws.com"
                    ]
                }
            }
        }
    ]
}
```


#### Key policy for key that encrypts the source model to be copied
<a name="encryption-copy-source-model-key-policy"></a>

If the source model that you will copy is encrypted with a KMS key, attach the statement from [Allow access to an encrypted model](#encryption-key-policy-decrypt) to the key policy for the key that encrypts the source model. This statement allows the model copy role to decrypt the key that encrypts the source model. In the `Principal` field in the following policy template, add the accounts that you want to allow to copy the source model to the list that the `AWS` subfield maps to:

```
{
    "Version":"2012-10-17",		 	 	 
    "Id": "PermissionsSourceModelKey",
    "Statement": [
        {
            "Sid": "PermissionsDecryptModel",
            "Effect": "Allow",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::111122223333:role/Role"
                ]
            },
            "Action": [
                "kms:Decrypt"
            ],
            "Resource": "*",
            "Condition": {
                "StringLike": {
                    "kms:ViaService": [
                        "bedrock.us-east-1.amazonaws.com"
                    ]
                }
            }
        }
    ]
}
```


## Monitor your encryption keys for the Amazon Bedrock service
<a name="encryption-monitor-key"></a>

When you use an AWS KMS customer managed key with your Amazon Bedrock resources, you can use [AWS CloudTrail](https://docs.aws.amazon.com/awscloudtrail/latest/userguide/cloudtrail-user-guide.html) or [Amazon CloudWatch Logs](https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/WhatIsCloudWatchLogs.html) to track requests that Amazon Bedrock sends to AWS KMS.
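One way to surface these requests is CloudTrail's `LookupEvents` API, which accepts a single lookup attribute per call; a hedged sketch queries by event name and filters for Bedrock-initiated KMS calls afterwards. The sample event here is fabricated for illustration, and the boto3 call is shown only in a comment:

```python
import json

# Sketch: find the KMS CreateGrant events that Amazon Bedrock generates.
# LookupEvents supports one lookup attribute per request, so we query by
# event name and filter on event source and invoker client-side.
lookup_params = {
    "LookupAttributes": [
        {"AttributeKey": "EventName", "AttributeValue": "CreateGrant"},
    ],
}

def bedrock_kms_events(events):
    """Keep events where KMS was called on behalf of Amazon Bedrock."""
    out = []
    for e in events:
        detail = json.loads(e["CloudTrailEvent"])  # JSON string per event
        if (detail.get("eventSource") == "kms.amazonaws.com"
                and detail.get("userIdentity", {}).get("invokedBy")
                == "bedrock.amazonaws.com"):
            out.append(detail)
    return out

# With boto3:
# page = boto3.client("cloudtrail").lookup_events(**lookup_params)
# grants = bedrock_kms_events(page["Events"])
sample = [{"CloudTrailEvent": json.dumps({
    "eventSource": "kms.amazonaws.com",
    "userIdentity": {"invokedBy": "bedrock.amazonaws.com"}})}]
print(len(bedrock_kms_events(sample)))
```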

The following is an example AWS CloudTrail event for [CreateGrant](https://docs.aws.amazon.com/kms/latest/APIReference/API_CreateGrant.html) to monitor KMS operations called by Amazon Bedrock to create a primary grant:

```
{
    "eventVersion": "1.09",
    "userIdentity": {
        "type": "AssumedRole",
        "principalId": "AROAIGDTESTANDEXAMPLE:SampleUser01",
        "arn": "arn:aws:sts::111122223333:assumed-role/RoleForModelCopy/SampleUser01",
        "accountId": "111122223333",
        "accessKeyId": "EXAMPLE",
        "sessionContext": {
            "sessionIssuer": {
                "type": "Role",
                "principalId": "AROAIGDTESTANDEXAMPLE",
                "arn": "arn:aws:iam::111122223333:role/RoleForModelCopy",
                "accountId": "111122223333",
                "userName": "RoleForModelCopy"
            },
            "attributes": {
                "creationDate": "2024-05-07T21:46:28Z",
                "mfaAuthenticated": "false"
            }
        },
        "invokedBy": "bedrock.amazonaws.com"
    },
    "eventTime": "2024-05-07T21:49:44Z",
    "eventSource": "kms.amazonaws.com",
    "eventName": "CreateGrant",
    "awsRegion": "us-east-1",
    "sourceIPAddress": "bedrock.amazonaws.com",
    "userAgent": "bedrock.amazonaws.com",
    "requestParameters": {
        "granteePrincipal": "bedrock.amazonaws.com",
        "retiringPrincipal": "bedrock.amazonaws.com",
        "keyId": "arn:aws:kms:us-east-1:111122223333:key/1234abcd-12ab-34cd-56ef-123456SAMPLE",
        "operations": [
            "Decrypt",
            "CreateGrant",
            "GenerateDataKey",
            "DescribeKey"
        ]
    },
    "responseElements": {
        "grantId": "0ab0ac0d0b000f00ea00cc0a0e00fc00bce000c000f0000000c0bc0a0000aaafSAMPLE",
        "keyId": "arn:aws:kms:us-east-1:111122223333:key/1234abcd-12ab-34cd-56ef-123456SAMPLE"
    },
    "requestID": "ff000af-00eb-00ce-0e00-ea000fb0fba0SAMPLE",
    "eventID": "ff000af-00eb-00ce-0e00-ea000fb0fba0SAMPLE",
    "readOnly": false,
    "resources": [
        {
            "accountId": "111122223333",
            "type": "AWS::KMS::Key",
            "ARN": "arn:aws:kms:us-east-1:111122223333:key/1234abcd-12ab-34cd-56ef-123456SAMPLE"
        }
    ],
    "eventType": "AwsApiCall",
    "managementEvent": true,
    "recipientAccountId": "111122223333",
    "eventCategory": "Management"
}
```
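
When auditing such activity programmatically, events can be filtered on the fields shown above. A minimal sketch in Python, assuming you have already exported CloudTrail records as a list of dicts (the `sample` record is a trimmed-down stand-in for the event above):

```python
def bedrock_creategrant_events(events):
    """Return CloudTrail records for CreateGrant calls that Amazon Bedrock made to AWS KMS."""
    return [
        e for e in events
        if e.get("eventName") == "CreateGrant"
        and e.get("eventSource") == "kms.amazonaws.com"
        and e.get("userIdentity", {}).get("invokedBy") == "bedrock.amazonaws.com"
    ]

# A trimmed-down record shaped like the example event above.
sample = {
    "eventName": "CreateGrant",
    "eventSource": "kms.amazonaws.com",
    "userIdentity": {"invokedBy": "bedrock.amazonaws.com"},
}
matches = bedrock_creategrant_events([sample, {"eventName": "Decrypt"}])
print(len(matches))  # 1
```

The same three-field filter can be used as a CloudTrail Lake query or a CloudWatch Logs filter pattern if you deliver your trail to a log group.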

## Encryption of training, validation, and output data
<a name="encryption-custom-job-data"></a>

When you use Amazon Bedrock to run a model customization job, you store the input files in your Amazon S3 bucket. When the job completes, Amazon Bedrock stores the output metrics files in the S3 bucket that you specified when creating the job and the resulting custom model artifacts in an S3 bucket controlled by AWS.

The output files are encrypted with the encryption configurations of the S3 bucket. These are encrypted either with [SSE-S3 server-side encryption](https://docs.aws.amazon.com/AmazonS3/latest/userguide/UsingServerSideEncryption.html) or with [AWS KMS SSE-KMS encryption](https://docs.aws.amazon.com/AmazonS3/latest/userguide/UsingKMSEncryption.html), depending on how you set up the S3 bucket.
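
If you need to confirm which scheme applies to your output files, the bucket's default encryption configuration tells you. A hedged sketch that interprets a response shaped like the S3 `GetBucketEncryption` output (the `config` dicts below are hand-built samples, not live API results):

```python
def default_encryption(config):
    """Return 'SSE-S3' or 'SSE-KMS' based on a GetBucketEncryption-style response."""
    rules = config["ServerSideEncryptionConfiguration"]["Rules"]
    algo = rules[0]["ApplyServerSideEncryptionByDefault"]["SSEAlgorithm"]
    return "SSE-KMS" if algo == "aws:kms" else "SSE-S3"

# Sample response shapes: AES256 indicates SSE-S3; aws:kms indicates SSE-KMS.
sse_s3 = {"ServerSideEncryptionConfiguration": {"Rules": [
    {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]}}
sse_kms = {"ServerSideEncryptionConfiguration": {"Rules": [
    {"ApplyServerSideEncryptionByDefault": {
        "SSEAlgorithm": "aws:kms",
        "KMSMasterKeyID": "arn:aws:kms:us-east-1:111122223333:key/EXAMPLE"}}]}}

print(default_encryption(sse_s3), default_encryption(sse_kms))
```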

# Encryption of imported custom models
<a name="encryption-import-model"></a>

Amazon Bedrock supports creating custom models through two methods that both use the same encryption approach. Your custom models are managed and stored by AWS:
+ **Custom model import jobs** — For importing customized open-source foundation models (such as Mistral AI or Llama models).
+ **Create custom model** — For importing Amazon Nova models that you customized in SageMaker AI.

For encryption of your custom models, Amazon Bedrock provides the following options: 
+ **AWS owned keys** – By default, Amazon Bedrock encrypts imported custom models with AWS owned keys. You can't view, manage, or use AWS owned keys, or audit their use. However, you don't have to take any action or change any programs to protect the keys that encrypt your data. For more information, see [AWS owned keys](https://docs.aws.amazon.com//kms/latest/developerguide/concepts.html#aws-owned-cmk) in the *AWS Key Management Service Developer Guide*.
+ **Customer managed keys (CMK)** – You can choose to add a second layer of encryption over the existing AWS owned encryption keys by specifying a customer managed key (CMK). You create, own, and manage your customer managed keys.

   Because you have full control over this layer of encryption, you can perform the following tasks: 
  + Establish and maintain key policies
  + Establish and maintain IAM policies and grants
  + Enable and disable keys
  + Rotate key cryptographic material
  + Add tags
  + Create key aliases
  + Schedule keys for deletion

  For more information, see [customer managed keys](https://docs.aws.amazon.com/kms/latest/developerguide/concepts.html#customer-cmk) in the *AWS Key Management Service Developer Guide*.

**Note**  
For all the custom models that you import, Amazon Bedrock automatically enables encryption at rest using AWS owned keys to protect customer data at no charge. If you use a customer managed key, AWS KMS charges apply. For more information about pricing, see [AWS Key Management Service Pricing](https://aws.amazon.com/kms/pricing/).

## How Amazon Bedrock uses grants in AWS KMS
<a name="import-model-kms-grants"></a>

If you specify a customer managed key to encrypt the imported model, Amazon Bedrock creates a **primary** AWS KMS [grant](https://docs.aws.amazon.com/kms/latest/developerguide/grants.html) associated with the imported model on your behalf by sending a [CreateGrant](https://docs.aws.amazon.com//kms/latest/APIReference/API_CreateGrant.html) request to AWS KMS. This grant allows Amazon Bedrock to access and use your customer managed key. Grants in AWS KMS are used to give Amazon Bedrock access to a KMS key in a customer's account.

Amazon Bedrock requires the primary grant to use your customer managed key for the following internal operations:
+ Send [DescribeKey](https://docs.aws.amazon.com/kms/latest/APIReference/API_DescribeKey.html) requests to AWS KMS to verify that the symmetric customer managed KMS key ID you entered when creating the job is valid.
+ Send [GenerateDataKey](https://docs.aws.amazon.com//kms/latest/APIReference/API_GenerateDataKey.html) and [Decrypt](https://docs.aws.amazon.com//kms/latest/APIReference/API_Decrypt.html) requests to AWS KMS to generate data keys encrypted by your customer managed key and decrypt the encrypted data keys so that they can be used to encrypt the model artifacts.
+ Send [CreateGrant](https://docs.aws.amazon.com//kms/latest/APIReference/API_CreateGrant.html) requests to AWS KMS to create scoped down secondary grants with a subset of the above operations (`DescribeKey`, `GenerateDataKey`, `Decrypt`), for the asynchronous execution of model import and for on-demand inference. 
+ Amazon Bedrock specifies a retiring principal during the creation of grants, so the service can send a [RetireGrant](https://docs.aws.amazon.com//kms/latest/APIReference/API_RetireGrant.html) request.

You have full access to your customer managed AWS KMS key. You can revoke access to the grant by following the steps at [Retiring and revoking grants](https://docs.aws.amazon.com//kms/latest/developerguide/grant-manage.html#grant-delete) in the *AWS Key Management Service Developer Guide* or remove the service’s access to your customer managed key at any time by modifying the key policy. If you do so, Amazon Bedrock won’t be able to access the imported model encrypted by your key.

### Life cycle of primary and secondary grants for custom imported models
<a name="import-model-kms-grants-lifecycle"></a>
+ **Primary grants** have a long lifespan and remain active as long as the associated custom models are still in use. When a custom imported model is deleted, the corresponding primary grant is automatically retired.
+ **Secondary grants** are short-lived. They are automatically retired as soon as the operation that Amazon Bedrock performs on behalf of the customers is completed. For example, once a custom model import job is finished, the secondary grant that allowed Amazon Bedrock to encrypt the custom imported model will be retired immediately.

# Using a customer managed key (CMK)
<a name="import-model-using-cmk"></a>

If you plan to use a customer managed key to encrypt your custom imported model, complete the following steps:

1. Create a customer managed key with the AWS Key Management Service.

1. Attach a [resource-based policy](https://docs.aws.amazon.com//IAM/latest/UserGuide/access_policies_identity-vs-resource.html) with permissions for the specified roles to create and use custom imported models.

**Create a customer managed key**

First, ensure that you have `CreateKey` permissions. Then follow the steps at [creating keys](https://docs.aws.amazon.com//kms/latest/developerguide/create-keys.html) to create a customer managed key, either in the AWS KMS console or with the [CreateKey](https://docs.aws.amazon.com/kms/latest/APIReference/API_CreateKey.html) API operation. Make sure to create a symmetric encryption key.

Creation of the key returns an ARN for the key that you can use as the `importedModelKmsKeyId` when importing a custom model with custom model import.
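
As a hypothetical illustration, the returned key ARN plugs into the import request as follows. This is a sketch of the request parameters only; the job name, model name, role ARN, and S3 URI are placeholders, and the parameter names should be verified against the CreateModelImportJob API reference:

```python
key_arn = "arn:aws:kms:us-east-1:111122223333:key/1234abcd-12ab-34cd-56ef-EXAMPLE"

# Hypothetical parameters for a model import request; only
# importedModelKmsKeyId relates to encryption with your customer managed key.
import_job_request = {
    "jobName": "my-import-job",                                    # placeholder
    "importedModelName": "my-imported-model",                      # placeholder
    "roleArn": "arn:aws:iam::111122223333:role/ModelImportRole",   # placeholder
    "modelDataSource": {"s3DataSource": {"s3Uri": "s3://amzn-s3-demo-bucket/model/"}},
    "importedModelKmsKeyId": key_arn,                              # key created above
}
print(import_job_request["importedModelKmsKeyId"])
```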

**Create a key policy and attach it to the customer managed key**

Key policies are [resource-based policies](https://docs.aws.amazon.com//IAM/latest/UserGuide/access_policies_identity-vs-resource.html) that you attach to your customer managed key to control access to it. Every customer managed key must have exactly one key policy, which contains statements that determine who can use the key and how they can use it. You can specify a key policy when you create your customer managed key. You can modify the key policy at any time, but there might be a brief delay before the change becomes available throughout AWS KMS. For more information, see [Managing access to customer managed keys](https://docs.aws.amazon.com//kms/latest/developerguide/control-access-overview.html#managing-access) in the *AWS Key Management Service Developer Guide*.

**Encrypt an imported custom model**

To use your customer managed key to encrypt an imported custom model, you must include the following AWS KMS operations in the key policy:
+ [kms:CreateGrant](https://docs.aws.amazon.com/kms/latest/APIReference/API_CreateGrant.html) – creates a grant for a customer managed key by allowing the Amazon Bedrock service principal access to the specified KMS key through [grant operations](https://docs.aws.amazon.com/kms/latest/developerguide/grants.html#terms-grant-operations). For more information about grants, see [Grants in AWS KMS](https://docs.aws.amazon.com/kms/latest/developerguide/grants.html) in the *AWS Key Management Service Developer Guide*.
**Note**  
Amazon Bedrock also sets up a retiring principal and automatically retires the grant after it is no longer required.
+ [kms:DescribeKey](https://docs.aws.amazon.com/kms/latest/APIReference/API_DescribeKey.html) – provides the customer managed key details to allow Amazon Bedrock to validate the key.
+ [kms:GenerateDataKey](https://docs.aws.amazon.com/kms/latest/APIReference/API_GenerateDataKey.html) – Provides the customer managed key details to allow Amazon Bedrock to validate user access. Amazon Bedrock stores generated ciphertext alongside the imported custom model to be used as an additional validation check for imported custom model users.
+ [kms:Decrypt](https://docs.aws.amazon.com/kms/latest/APIReference/API_Decrypt.html) – Decrypts the stored ciphertext to validate that the role has proper access to the KMS key that encrypts the imported custom model.

The following is an example policy that you can attach to a key for a role that you'll use to encrypt models that you import:

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Id": "KMS key policy for a key to encrypt an imported custom model",
    "Statement": [
        {
            "Sid": "Permissions for model import API invocation role",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::111122223333:user/role"
            },
            "Action": [
                "kms:Decrypt",
                "kms:GenerateDataKey",
                "kms:DescribeKey",
                "kms:CreateGrant"
            ],
            "Resource": "*"
        }
    ]
}
```

------
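
If you manage several import roles, the policy above can be generated rather than hand-edited. A minimal sketch, assuming a hypothetical `ModelImportRole`:

```python
import json

# The four KMS actions that the key policy above grants to the import role.
BEDROCK_IMPORT_ACTIONS = ["kms:Decrypt", "kms:GenerateDataKey",
                          "kms:DescribeKey", "kms:CreateGrant"]

def import_key_policy(role_arns):
    """Build a key policy allowing the given roles to encrypt imported custom models."""
    return {
        "Version": "2012-10-17",
        "Id": "KMS key policy for a key to encrypt an imported custom model",
        "Statement": [{
            "Sid": "Permissions for model import API invocation role",
            "Effect": "Allow",
            "Principal": {"AWS": role_arns},
            "Action": BEDROCK_IMPORT_ACTIONS,
            "Resource": "*",
        }],
    }

policy = import_key_policy(["arn:aws:iam::111122223333:role/ModelImportRole"])
print(json.dumps(policy, indent=2))
```

The resulting document can be passed to `PutKeyPolicy` or stored in infrastructure-as-code templates.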

**Decrypt an encrypted imported custom model**

If you're importing a custom model that has already been encrypted by another customer managed key, you must add `kms:Decrypt` permissions for the same role, as in the following policy:

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Id": "KMS key policy for a key that encrypted a custom imported model",
    "Statement": [
        {
            "Sid": "Permissions for model import API invocation role",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::111122223333:user/role"
            },
            "Action": [
                "kms:Decrypt"
            ],
            "Resource": "*"
        }
    ]
}
```

------

# Monitoring your encryption keys for the Amazon Bedrock service
<a name="import-model-monitor-encryption-keys"></a>

When you use an AWS KMS customer managed key with your Amazon Bedrock resources, you can use [AWS CloudTrail](https://docs.aws.amazon.com//awscloudtrail/latest/userguide/cloudtrail-user-guide.html) or [Amazon CloudWatch Logs](https://docs.aws.amazon.com//AmazonCloudWatch/latest/logs/WhatIsCloudWatchLogs.html) to track requests that Amazon Bedrock sends to AWS KMS.

The following is an example AWS CloudTrail event for [CreateGrant](https://docs.aws.amazon.com/kms/latest/APIReference/API_CreateGrant.html) to monitor AWS KMS operations called by Amazon Bedrock to create a primary grant:

```
{
"eventVersion": "1.09",
    "userIdentity": {
"type": "AssumedRole",
        "principalId": "AROAIGDTESTANDEXAMPLE:SampleUser01",
        "arn": "arn:aws:sts::111122223333:assumed-role/RoleForModelImport/SampleUser01",
        "accountId": "111122223333",
        "accessKeyId": "EXAMPLE",
        "sessionContext": {
"sessionIssuer": {
"type": "Role",
                "principalId": "AROAIGDTESTANDEXAMPLE",
                "arn": "arn:aws:iam::111122223333:role/RoleForModelImport",
                "accountId": "111122223333",
                "userName": "RoleForModelImport"
            },
            "attributes": {
"creationDate": "2024-05-07T21:46:28Z",
                "mfaAuthenticated": "false"
            }
        },
        "invokedBy": "bedrock.amazonaws.com"
    },
    "eventTime": "2024-05-07T21:49:44Z",
    "eventSource": "kms.amazonaws.com",
    "eventName": "CreateGrant",
    "awsRegion": "us-east-1",
    "sourceIPAddress": "bedrock.amazonaws.com",
    "userAgent": "bedrock.amazonaws.com",
    "requestParameters": {
"granteePrincipal": "bedrock.amazonaws.com",
        "retiringPrincipal": "bedrock.amazonaws.com",
        "keyId": "arn:aws:kms:us-east-1:111122223333:key/1234abcd-12ab-34cd-56ef-123456SAMPLE",
        "operations": [
            "Decrypt",
            "CreateGrant",
            "GenerateDataKey",
            "DescribeKey"
        ]
    },
    "responseElements": {
"grantId": "0ab0ac0d0b000f00ea00cc0a0e00fc00bce000c000f0000000c0bc0a0000aaafSAMPLE",
        "keyId": "arn:aws:kms:us-east-1:111122223333:key/1234abcd-12ab-34cd-56ef-123456SAMPLE"
    },
    "requestID": "ff000af-00eb-00ce-0e00-ea000fb0fba0SAMPLE",
    "eventID": "ff000af-00eb-00ce-0e00-ea000fb0fba0SAMPLE",
    "readOnly": false,
    "resources": [
        {
"accountId": "111122223333",
            "type": "AWS::KMS::Key",
            "ARN": "arn:aws:kms:us-east-1:111122223333:key/1234abcd-12ab-34cd-56ef-123456SAMPLE"
        }
    ],
    "eventType": "AwsApiCall",
    "managementEvent": true,
    "recipientAccountId": "111122223333",
    "eventCategory": "Management"
}
```

Attach the following resource-based policy to the KMS key by following the steps at [Creating a policy](https://docs.aws.amazon.com//kms/latest/developerguide/key-policy-overview.html). The policy contains two statements.

1. Permissions for a role to encrypt model customization artifacts. Add ARNs of the imported custom model builder roles to the `Principal` field.

1. Permissions for a role to use the imported custom model in inference. Add ARNs of imported custom model user roles to the `Principal` field.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Id": "KMS Key Policy",
    "Statement": [
        {
            "Sid": "Permissions for imported model builders",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::123456789012:user/role"
            },
            "Action": [
                "kms:Decrypt",
                "kms:GenerateDataKey",
                "kms:DescribeKey",
                "kms:CreateGrant"
            ],
            "Resource": "*"
        },
        {
            "Sid": "Permissions for imported model users",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::123456789012:user/role"
            },
            "Action": "kms:Decrypt",
            "Resource": "*"
        }
    ]
}
```

------

# Encryption in Amazon Bedrock Data Automation
<a name="encryption-bda"></a>

 Amazon Bedrock Data Automation (BDA) uses encryption to protect your data at rest. This includes the blueprints, projects, libraries, and extracted insights stored by the service. BDA offers two options for encrypting your data: 

1.  AWS owned keys – By default, BDA encrypts your data with AWS owned keys. You can't view, manage, or use AWS owned keys, or audit their use. However, you don't have to take any action or change any programs to protect the keys that encrypt your data. For more information, see [AWS owned keys](https://docs.aws.amazon.com/kms/latest/developerguide/concepts.html#aws-owned-cmk) in the AWS Key Management Service Developer Guide. 

1.  Customer managed keys – You can choose to encrypt your data with customer managed keys that you manage yourself. For more information about AWS KMS keys, see [Customer managed keys](https://docs.aws.amazon.com/kms/latest/developerguide/concepts.html#customer-cmk) in the AWS Key Management Service Developer Guide. BDA does not support customer managed keys for use in the Amazon Bedrock console, only for API operations. 

 Amazon Bedrock Data Automation automatically enables encryption at rest using AWS owned keys at no charge. If you use a customer managed key, AWS KMS charges apply. For more information about pricing, see AWS KMS [pricing](https://aws.amazon.com/kms/pricing/). 

## How Amazon Bedrock uses grants in AWS KMS
<a name="encryption-bda-grants"></a>

If you specify a customer managed key for encryption of your BDA resources when calling InvokeDataAutomationAsync or CreateDataAutomationLibrary, the service creates a grant associated with your resources on your behalf by sending a CreateGrant request to AWS KMS. This grant allows BDA to access and use your customer managed key. The grant created by CreateDataAutomationLibrary is not used if the customer ingests vocabulary entities into the library. 

 BDA uses the grant for your customer managed key for the following internal operations: 
+ DescribeKey — Send requests to AWS KMS to verify that the symmetric customer managed AWS KMS key ID you provided is valid.
+ GenerateDataKey and Decrypt — Send requests to AWS KMS to generate data keys encrypted by your customer managed key and decrypt the encrypted data keys so that they can be used to encrypt your resources.
+ CreateGrant — Send requests to AWS KMS to create scoped down grants for the asynchronous execution of operations. The grant operations vary by API:
  + InvokeDataAutomationAsync: DescribeKey, GenerateDataKey, Decrypt
  + CreateDataAutomationLibrary: DescribeKey, GenerateDataKey, Decrypt, CreateGrant
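
The per-API operation lists above lend themselves to a lookup table, for example when checking that an observed grant is no broader than what the calling API needs. A sketch using only the operations stated above:

```python
# Grant operations by BDA API, taken from the bullets above.
BDA_GRANT_OPERATIONS = {
    "InvokeDataAutomationAsync": {"DescribeKey", "GenerateDataKey", "Decrypt"},
    "CreateDataAutomationLibrary": {"DescribeKey", "GenerateDataKey", "Decrypt", "CreateGrant"},
}

def grant_matches_api(api_name, granted_ops):
    """Check whether a grant's operations are limited to what the named API needs."""
    return set(granted_ops) <= BDA_GRANT_OPERATIONS[api_name]

print(grant_matches_api("InvokeDataAutomationAsync", ["Decrypt", "DescribeKey"]))  # True
```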

 You have full access to your customer managed AWS KMS key. You can revoke access to the grant by following the steps at Retiring and revoking grants in the AWS KMS Developer Guide or remove the service's access to your customer managed key at any time by modifying the key policy. If you do so, BDA won't be able to access the resources encrypted by your key. 

If you initiate a new InvokeDataAutomationAsync call after revoking a grant, BDA recreates the grant.

The grants created by InvokeDataAutomationAsync are retired by BDA after 30 hours.

The grants created by CreateDataAutomationLibrary are retired by BDA when the library is deleted.
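
For reconciliation purposes, the 30-hour retirement rule above can be turned into an expected retirement time for a given grant. A small illustration (the timestamp is arbitrary):

```python
from datetime import datetime, timedelta, timezone

# Per the rule above, BDA retires InvokeDataAutomationAsync grants after 30 hours.
RETIREMENT_WINDOW = timedelta(hours=30)

def expected_retirement(created_at):
    """Latest time by which BDA retires a grant created by InvokeDataAutomationAsync."""
    return created_at + RETIREMENT_WINDOW

created = datetime(2024, 5, 7, 21, 49, 44, tzinfo=timezone.utc)
print(expected_retirement(created).isoformat())  # 2024-05-09T03:49:44+00:00
```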

## Creating a customer managed key and attaching a key policy
<a name="encryption-bda-creating-keys"></a>

 To encrypt BDA resources with a key that you create and manage, follow these general steps: 

1.  (Prerequisite) Ensure that your IAM role has permissions for the CreateKey action. 

1.  Follow the steps at [ Creating keys ](https://docs.aws.amazon.com/kms/latest/developerguide/create-keys.html) to create a customer managed key using the AWS KMS console or the CreateKey operation. 

1.  Creation of the key returns an ARN that you can use for operations that require using the key (for example, when creating a project or blueprint in BDA), like the invokeDataAutomationAsync operation. 

1.  Create and attach a key policy to the key with the required permissions. To create a key policy, follow the steps at [ Creating a key policy ](https://docs.aws.amazon.com/kms/latest/developerguide/key-policy-create.html) in the AWS KMS Developer Guide. 

## Permissions and key policies for Amazon Bedrock Data Automation resources
<a name="encryption-bda-key-policies.title"></a>

After you create an AWS KMS key, you attach a key policy to it. The following AWS KMS actions are used for keys that encrypt BDA resources:

1.  kms:CreateGrant – Creates a grant for a customer managed key by allowing the BDA service access to the specified AWS KMS key through grant operations, needed for InvokeDataAutomationAsync and CreateDataAutomationLibrary. 

1.  kms:DescribeKey – Provides the customer managed key details to allow BDA to validate the key. 

1.  kms:GenerateDataKey – Provides the customer managed key details to allow BDA to validate user access. 

1.  kms:Decrypt – Decrypts the stored ciphertext to validate that the role has proper access to the AWS KMS key that encrypts the BDA resources. 

**Key policy for Amazon Bedrock Data Automation**

To use your customer managed key to encrypt BDA resources, include the following statements in your key policy. Replace the example account ID, Region, and role ARN with your own values:

```
{
  "Version": "2012-10-17",		 	 	 
  "Id": "KMS key policy for a key to encrypt data for BDA resource",
  "Statement": [
    {
      "Sid": "Enable DescribeKey, Decrypt, GenerateDataKey",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::111122223333:role/Role"
      },
      "Action": [
        "kms:DescribeKey",
        "kms:GenerateDataKey",
        "kms:Decrypt"
      ],
      "Resource": "*",
      "Condition": {
        "StringEquals": {
          "kms:ViaService": "bedrock.us-east-1.amazonaws.com"
        }
      }
    },
    {
      "Sid": "Enable CreateGrant",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::111122223333:role/Role"
      },
      "Action": "kms:CreateGrant",
      "Resource": "*",
      "Condition": {
        "StringEquals": {
          "kms:ViaService": "bedrock.us-east-1.amazonaws.com"
        },
        "ForAllValues:StringEquals": {
          "kms:GrantOperations": [
            "CreateGrant",
            "GenerateDataKey",
            "Decrypt",
            "DescribeKey"
          ]
        }
      }
    }
  ]
}
```

**IAM role permissions**

The IAM role used to interact with BDA and AWS KMS should have the following permissions. Replace the example Region, account ID, and key ID with your own values:

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "kms:GenerateDataKey",
                "kms:Decrypt",
                "kms:DescribeKey",
                "kms:CreateGrant"
            ],
            "Resource": "arn:aws:kms:us-east-1:123456789012:key/KeyId",
            "Condition": {
                "StringLike": {
                    "kms:ViaService": [
                        "bedrock.us-east-1.amazonaws.com"
                    ]
                }
            }
        }
    ]
}
```

------

## Amazon Bedrock Data Automation encryption context
<a name="encryption-bda-context"></a>

For all DataAutomationLibrary operations, including InvokeDataAutomationLibraryIngestionJob, BDA uses the following encryption context in all AWS KMS cryptographic operations, where the key is `aws:bedrock:data-automation-library-arn` and the value is the `libraryArn`:

```
"encryptionContext": {
     "aws:bedrock:data-automation-library-arn": "<LibraryArn>"
}
```

For DataAutomationProject operations, BDA uses the following encryption context:

```
"encryptionContext": {
     "DataAutomationProjectArn": "<DataAutomationProjectArn>"
}
```

For Blueprint operations, BDA uses the following encryption context:

```
"encryptionContext": {
     "BlueprintArn": "<BlueprintArn>"
}
```

For all other operations, BDA uses the following encryption context:

```
"encryptionContext": {
     "aws:bedrock:data-automation-customer-account-id": "111122223333"
}
```
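
The four variants can be collapsed into one helper that selects the context for a given resource, for example when building `kms:EncryptionContext` conditions or checking CloudTrail records. A sketch using the key names shown above (the ARN arguments are placeholders):

```python
def bda_encryption_context(account_id, library_arn=None, project_arn=None, blueprint_arn=None):
    """Return the encryption context BDA uses for a given resource type."""
    if library_arn:
        return {"aws:bedrock:data-automation-library-arn": library_arn}
    if project_arn:
        return {"DataAutomationProjectArn": project_arn}
    if blueprint_arn:
        return {"BlueprintArn": blueprint_arn}
    # All other operations fall back to the account-scoped context.
    return {"aws:bedrock:data-automation-customer-account-id": account_id}

print(bda_encryption_context("111122223333"))
```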

**Using encryption context for monitoring**  
When you use a symmetric customer managed key to encrypt your data, you can also use the encryption context in audit records and logs to identify how the customer managed key is being used. The encryption context also appears in logs generated by AWS CloudTrail or Amazon CloudWatch Logs.

**Using encryption context to control access to your customer managed key**  
You can use the encryption context in key policies and IAM policies as conditions to control access to your symmetric customer managed key. You can also use encryption context constraints in a grant. BDA uses an encryption context constraint in grants to control access to the customer managed key in your account or Region. The grant constraint requires that the operations that the grant allows use the specified encryption context. 

The following are example key policy statements to grant access to a customer managed key for a specific encryption context. The condition in this policy statement requires that the grants have an encryption context constraint that specifies the encryption context.

```
[
    {
        "Sid": "Enable DescribeKey",
        "Effect": "Allow",
        "Principal": {
            "AWS": "arn:aws:iam::111122223333:role/ExampleRole"
        },
        "Action": ["kms:DescribeKey"],
        "Resource": "*",
        "Condition": {
            "StringLike": {
                "kms:ViaService": [
                    "bedrock.${region}.amazonaws.com"
                ]
            }
        }
    },
    {
        "Sid": "Enable Decrypt, GenerateDataKey",
        "Effect": "Allow",
        "Principal": {
            "AWS": "arn:aws:iam::111122223333:role/ExampleRole"
        },
        "Action": [
            "kms:GenerateDataKey",
            "kms:Decrypt"
        ],
        "Resource": "*",
        "Condition": {
            "StringLike": {
                "kms:ViaService": [
                    "bedrock.${region}.amazonaws.com"
                ],
                "kms:EncryptionContext:aws:bedrock:data-automation-customer-account-id": "111122223333"
            }
        }
    },
    {
        "Sid": "Enable CreateGrant",
        "Effect": "Allow",
        "Principal": {
            "AWS": "arn:aws:iam::111122223333:role/ExampleRole"
        },
        "Action": "kms:CreateGrant",
        "Resource": "*",
        "Condition": {
            "StringLike": {
                "kms:ViaService": [
                    "bedrock.${region}.amazonaws.com"
                ],
                "kms:EncryptionContext:aws:bedrock:data-automation-customer-account-id": "111122223333"
            },
            "StringEquals": {
                "kms:GrantOperations": ["Decrypt", "DescribeKey", "GenerateDataKey"]
            }
        }
    }
]
```
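
When auditing a key policy like the one above, you can check that the `CreateGrant` statement actually carries the encryption-context condition. A sketch that inspects a statement shaped like the example (the `stmt` dict below is an abbreviated copy):

```python
def has_account_context_condition(statement, account_id):
    """True if the statement conditions access on the BDA account-id encryption context."""
    cond = statement.get("Condition", {}).get("StringLike", {})
    key = "kms:EncryptionContext:aws:bedrock:data-automation-customer-account-id"
    return cond.get(key) == account_id

# Abbreviated copy of the CreateGrant statement from the example above.
stmt = {
    "Sid": "Enable CreateGrant",
    "Effect": "Allow",
    "Action": "kms:CreateGrant",
    "Resource": "*",
    "Condition": {"StringLike": {
        "kms:ViaService": ["bedrock.us-east-1.amazonaws.com"],
        "kms:EncryptionContext:aws:bedrock:data-automation-customer-account-id": "111122223333",
    }},
}
print(has_account_context_condition(stmt, "111122223333"))  # True
```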

When calling `CreateDataAutomationLibrary`, attach the following policy to grant BDA the necessary permissions to access your customer managed key.

```
[
    {
        "Sid": "Enable CreateGrant for CreateDataAutomationLibrary",
        "Effect": "Allow",
        "Principal": {
            "AWS": "arn:aws:iam::111122223333:role/ExampleRole"
        },
        "Action": "kms:CreateGrant",
        "Resource": "*",
        "Condition": {
            "StringLike": {
                "kms:ViaService": [
                    "bedrock.${region}.amazonaws.com"
                ]
            },
            "StringEquals": {
                "kms:GrantOperations": ["Decrypt", "DescribeKey", "GenerateDataKey", "CreateGrant"]
            }
        }
    }
]
```

## Monitoring your encryption keys for Amazon Bedrock Data Automation
<a name="encryption-bda-monitoring"></a>

 When you use an AWS KMS customer managed key with your Amazon Bedrock Data Automation resources, you can use [AWS CloudTrail](https://docs.aws.amazon.com/awscloudtrail/latest/userguide/cloudtrail-user-guide.html) or [Amazon CloudWatch](https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/WhatIsCloudWatchLogs.html) to track requests that Amazon Bedrock Data Automation sends to AWS KMS. The following is an example AWS CloudTrail event for [CreateGrant](https://docs.aws.amazon.com/kms/latest/APIReference/API_CreateGrant.html) to monitor AWS KMS operations called by Amazon Bedrock Data Automation to create a primary grant: 

```
{
    "eventVersion": "1.09",
    "userIdentity": {
        "type": "AssumedRole",
        "principalId": "AROAIGDTESTANDEXAMPLE:SampleUser01",
        "arn": "arn:aws:sts::111122223333:assumed-role/RoleForDataAutomation/SampleUser01",
        "accountId": "111122223333",
        "accessKeyId": "EXAMPLE",
        "sessionContext": {
            "sessionIssuer": {
                "type": "Role",
                "principalId": "AROAIGDTESTANDEXAMPLE",
                "arn": "arn:aws:iam::111122223333:role/RoleForDataAutomation",
                "accountId": "111122223333",
                "userName": "RoleForDataAutomation"
            },
            "attributes": {
                "creationDate": "2024-05-07T21:46:28Z",
                "mfaAuthenticated": "false"
            }
        },
        "invokedBy": "bedrock.amazonaws.com"
    },
    "eventTime": "2024-05-07T21:49:44Z",
    "eventSource": "kms.amazonaws.com",
    "eventName": "CreateGrant",
    "awsRegion": "us-east-1",
    "sourceIPAddress": "bedrock.amazonaws.com",
    "userAgent": "bedrock.amazonaws.com",
    "requestParameters": {
        "granteePrincipal": "bedrock.amazonaws.com",
        "retiringPrincipal": "bedrock.amazonaws.com",
        "keyId": "arn:aws:kms:us-east-1:111122223333:key/1234abcd-12ab-34cd-56ef-123456SAMPLE",
        "constraints": {
            "encryptionContextSubset": {
                "aws:bedrock:data-automation-customer-account-id": "000000000000"
            }
        },
        "operations": [
            "Decrypt",
            "CreateGrant",
            "GenerateDataKey",
            "DescribeKey"
        ]
    },
    "responseElements": {
        "grantId": "0ab0ac0d0b000f00ea00cc0a0e00fc00bce000c000f0000000c0bc0a0000aaafSAMPLE",
        "keyId": "arn:aws:kms:us-east-1:111122223333:key/1234abcd-12ab-34cd-56ef-123456SAMPLE"
    },
    "requestID": "ff000af-00eb-00ce-0e00-ea000fb0fba0SAMPLE",
    "eventID": "ff000af-00eb-00ce-0e00-ea000fb0fba0SAMPLE",
    "readOnly": false,
    "resources": [
        {
            "accountId": "111122223333",
            "type": "AWS::KMS::Key",
            "ARN": "arn:aws:kms:us-east-1:111122223333:key/1234abcd-12ab-34cd-56ef-123456SAMPLE"
        }
    ],
    "eventType": "AwsApiCall",
    "managementEvent": true,
    "recipientAccountId": "111122223333",
    "eventCategory": "Management"
}
```
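CloudTrail delivers events like the one above as JSON records. As an illustration (not part of the AWS API), a small filter along these lines could flag `CreateGrant` calls that Amazon Bedrock Data Automation makes against a specific key; the `is_bedrock_create_grant` helper and the abbreviated sample record are hypothetical:

```python
def is_bedrock_create_grant(event: dict, key_arn: str) -> bool:
    """Return True if a CloudTrail event is a CreateGrant call made by
    Amazon Bedrock against the given KMS key."""
    return (
        event.get("eventSource") == "kms.amazonaws.com"
        and event.get("eventName") == "CreateGrant"
        and event.get("userIdentity", {}).get("invokedBy") == "bedrock.amazonaws.com"
        and event.get("requestParameters", {}).get("keyId") == key_arn
    )

# Abbreviated version of the event shown above, as loaded from a log file.
sample = {
    "eventSource": "kms.amazonaws.com",
    "eventName": "CreateGrant",
    "userIdentity": {"invokedBy": "bedrock.amazonaws.com"},
    "requestParameters": {
        "keyId": "arn:aws:kms:us-east-1:111122223333:key/1234abcd-12ab-34cd-56ef-123456SAMPLE"
    },
}
print(is_bedrock_create_grant(sample, sample["requestParameters"]["keyId"]))  # True
```

You could apply the same predicate over a CloudWatch Logs export of your CloudTrail trail to audit which grants Amazon Bedrock has created.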

# Encryption of agent resources
<a name="encryption-agents-new"></a>

Encryption of data at rest by default helps reduce the operational overhead and complexity involved in protecting sensitive data. At the same time, it enables you to build secure applications that meet strict encryption compliance and regulatory requirements.

Amazon Bedrock uses AWS owned keys by default to automatically encrypt your agent's information, including control plane data and session data. You can't view, manage, or audit the use of AWS owned keys. For more information, see [AWS owned keys](https://docs.aws.amazon.com/kms/latest/developerguide/concepts.html#aws-owned-cmk).

While you can't disable this layer of encryption, you can choose to encrypt your agent's information with a customer managed key instead of an AWS owned key. Amazon Bedrock supports the use of a symmetric customer managed key (CMK) that you create, own, and manage. For more information, see [Customer managed keys](https://docs.aws.amazon.com/kms/latest/developerguide/concepts.html#customer-cmk).

**Important**  
Amazon Bedrock automatically encrypts your agent's session information using AWS owned keys at no charge.
AWS KMS charges apply when you use a customer managed key. For more information about pricing, see [AWS Key Management Service Pricing](https://aws.amazon.com/kms/pricing/).
If you created your agent *before* January 22, 2025 and want to use a customer managed key to encrypt agent resources, follow the instructions in [Encryption of agent resources for agents created before January 22, 2025](encryption-agents.md).

# Encryption of agent resources with customer managed keys (CMK)
<a name="cmk-agent-resources"></a>

You can create a customer managed key at any time to encrypt the following agent information that you provide when building your agent.

**Note**  
The following agent resources are encrypted with a customer managed key only for agents created after January 22, 2025.


****  
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/bedrock/latest/userguide/cmk-agent-resources.html)

To use a customer managed key, complete the following steps:

1. Create a customer managed key with AWS Key Management Service.

1. Create a key policy and attach it to the customer managed key.

## Create a customer managed key
<a name="create-cmk-agent"></a>

You can create a symmetric customer managed key by using the AWS Management Console or the AWS Key Management Service APIs. 

First, make sure that you have `CreateKey` permissions, and then follow the steps for [Creating a symmetric customer managed key](https://docs.aws.amazon.com/kms/latest/developerguide/create-keys.html#create-symmetric-cmk) in the *AWS Key Management Service Developer Guide*.

**Key policy** - Key policies control access to your customer managed key. Every customer managed key must have exactly one key policy, which contains statements that determine who can use the key and how they can use it. When you create your customer managed key, you can specify a key policy. For more information, see [Managing access to customer managed keys](https://docs.aws.amazon.com/kms/latest/developerguide/overview.html) in the *AWS Key Management Service Developer Guide*.

If you created your agent after January 22, 2025 and want to use a customer managed key to encrypt your agent's information, make sure that the user or role calling the agent API operations has the following permissions in the key policy:
+ [kms:GenerateDataKey](https://docs.aws.amazon.com/kms/latest/APIReference/API_GenerateDataKey.html) – returns a unique symmetric data key for use outside of AWS KMS.
+ [kms:Decrypt](https://docs.aws.amazon.com/kms/latest/APIReference/API_Decrypt.html) – decrypts ciphertext that was encrypted by a KMS key.

Creating the key returns an `Arn` that you can use as the `customerEncryptionKeyArn` when creating your agent. 
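As a sketch of how the pieces fit together (the agent name and role ARN below are hypothetical placeholders; this assumes the boto3 `kms` and `bedrock-agent` clients), you might create the key and pass its `Arn` when creating the agent:

```python
def build_create_agent_params(agent_name: str, role_arn: str, key_arn: str) -> dict:
    """Assemble CreateAgent request parameters, including the CMK ARN."""
    return {
        "agentName": agent_name,
        "agentResourceRoleArn": role_arn,
        "customerEncryptionKeyArn": key_arn,
    }

def create_encrypted_agent(agent_name: str, role_arn: str) -> str:
    """Create a symmetric CMK, then create an agent encrypted with it.
    Requires credentials with kms:CreateKey and bedrock:CreateAgent."""
    import boto3
    kms = boto3.client("kms")
    key_arn = kms.create_key(Description="Agent encryption CMK")["KeyMetadata"]["Arn"]
    agent = boto3.client("bedrock-agent").create_agent(
        **build_create_agent_params(agent_name, role_arn, key_arn))
    return agent["agent"]["agentArn"]
```

The helper keeps the request parameters in one place so the same dictionary can also be reused with `UpdateAgent` if you later rotate to a different key.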

## Create a key policy and attach it to the customer managed key
<a name="attach-policy-agent"></a>

If you encrypt agent resources with a customer managed key, you must set up an identity-based policy and a resource-based policy to allow Amazon Bedrock to encrypt and decrypt the agent resources on your behalf.

**Identity-based policy**

Attach the following identity-based policy to an IAM role or user with permissions to call the agent API operations that encrypt and decrypt agent resources on your behalf. This policy validates that the user making the API call has AWS KMS permissions. Replace `${region}`, `${account-id}`, `${agent-id}`, and `${key-id}` with the appropriate values.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "EncryptAgents",
            "Effect": "Allow",
            "Action": [
                "kms:GenerateDataKey",
                "kms:Decrypt"
            ],
            "Resource": "arn:aws:kms:${region}:${account-id}:key/${key-id}",
            "Condition": {
                "StringEquals": {
                    "kms:EncryptionContext:aws:bedrock:arn": "arn:aws:bedrock:${region}:${account-id}:agent/${agent-id}"
                }
            }
        }
    ]
}
```

------
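A small helper can fill in the policy placeholders programmatically before you attach the policy. This is an illustrative sketch (the template mirrors the identity-based policy statement above; `string.Template` placeholders can't contain hyphens, so underscores stand in for them, and all substituted values are examples):

```python
import json
from string import Template

# Template of the identity-based policy, with substitutable placeholders.
POLICY_TEMPLATE = Template(json.dumps({
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "EncryptAgents",
        "Effect": "Allow",
        "Action": ["kms:GenerateDataKey", "kms:Decrypt"],
        "Resource": "arn:aws:kms:${region}:${account_id}:key/${key_id}",
        "Condition": {"StringEquals": {
            "kms:EncryptionContext:aws:bedrock:arn":
                "arn:aws:bedrock:${region}:${account_id}:agent/${agent_id}"
        }},
    }],
}))

# Substitute example values and validate that the result is still JSON.
policy = POLICY_TEMPLATE.substitute(
    region="us-east-1", account_id="123456789012",
    key_id="1234abcd-12ab-34cd-56ef-1234567890ab", agent_id="AGENT123")
print(json.loads(policy)["Statement"][0]["Resource"])
# arn:aws:kms:us-east-1:123456789012:key/1234abcd-12ab-34cd-56ef-1234567890ab
```

The rendered string can then be passed as the `PolicyDocument` when creating or updating the IAM policy.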

**Resource-based policy**

Attach the following resource-based policy to your AWS KMS key *only* if you are creating action groups whose schema in Amazon S3 is encrypted. You don't need to attach a resource-based policy for any other use case.

To attach the following resource-based policy, change the scope of the permissions as necessary and replace the `${region}`, `${account-id}`, `${agent-id}`, and `${key-id}` with the appropriate values.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Allow account root to modify the KMS key, not used by Amazon Bedrock.",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::${account-id}:root"
            },
            "Action": "kms:*",
            "Resource": "arn:aws:kms:${region}:${account-id}:key/${key-id}"
        },
        {
            "Sid": "Allow Amazon Bedrock to encrypt and decrypt agent resources on behalf of authorized users",
            "Effect": "Allow",
            "Principal": {
                "Service": "bedrock.amazonaws.com"
            },
            "Action": [
                "kms:GenerateDataKey",
                "kms:Decrypt"
            ],
            "Resource": "arn:aws:kms:${region}:${account-id}:key/${key-id}",
            "Condition": {
                "StringEquals": {
                    "kms:EncryptionContext:aws:bedrock:arn": "arn:aws:bedrock:${region}:${account-id}:agent/${agent-id}"
                }
            }
        }
    ]
}
```

------

## Changing the customer managed key
<a name="change-cmk"></a>

Amazon Bedrock agents don't support re-encryption of agent versions when the customer managed key associated with the *DRAFT* agent changes, or when you move from a customer managed key to an AWS owned key. Only the data for the *DRAFT* resource is re-encrypted with the new key.

Make sure that you don't delete, or remove permissions for, any key that a versioned agent uses to serve production data.

To view and verify the keys being used by a version, call [GetAgentVersion](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_agent_GetAgentVersion.html) and check the `customerEncryptionKeyArn` in the response.
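For example, a check along these lines could confirm the key in use (the helper names are hypothetical; the boto3 call requires credentials with `bedrock:GetAgentVersion` permissions):

```python
from typing import Optional

def key_in_use(response: dict) -> Optional[str]:
    """Return the customerEncryptionKeyArn recorded for an agent version,
    or None if the version doesn't report a customer managed key."""
    return response.get("agentVersion", {}).get("customerEncryptionKeyArn")

def verify_version_key(agent_id: str, version: str) -> Optional[str]:
    """Call GetAgentVersion and report which key the version is encrypted with."""
    import boto3  # requires AWS credentials
    resp = boto3.client("bedrock-agent").get_agent_version(
        agentId=agent_id, agentVersion=version)
    return key_in_use(resp)
```

Running this check before deleting or restricting a key helps you avoid cutting off a version that still serves production traffic.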

# Encrypt agent sessions with customer managed key (CMK)
<a name="ltm-permissions"></a>

If you've enabled memory for your agent and you encrypt agent sessions with a customer managed key, you must set up the following key policy and IAM permissions for the calling identity.

**Customer managed key policy**

Amazon Bedrock uses these permissions to generate encrypted data keys and then uses those keys to encrypt agent memory. Amazon Bedrock also needs permissions to re-encrypt the generated data key with different encryption contexts. Re-encrypt permissions are also used when you transition between customer managed keys, or between a customer managed key and a service owned key. For more information, see [Hierarchical Keyring](https://docs.aws.amazon.com/database-encryption-sdk/latest/devguide/use-hierarchical-keyring.html).

Replace `${region}`, `${account-id}`, and `${caller-identity-role}` with the appropriate values.

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Allow access for bedrock to enable long term memory",
            "Effect": "Allow",
            "Principal": {
                "Service": "bedrock.amazonaws.com"
            },
            "Action": [
                "kms:GenerateDataKeyWithoutPlaintext",
                "kms:ReEncrypt*"
            ],
            "Resource": "*",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "${account-id}"
                },
                "ArnLike": {
                    "aws:SourceArn": "arn:aws:bedrock:${region}:${account-id}:agent-alias/*"
                }
            }
        },
        {
            "Sid": "Allow the caller identity control plane permissions for long term memory",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::${account-id}:role/${caller-identity-role}"
            },
            "Action": [
                "kms:GenerateDataKeyWithoutPlaintext",
                "kms:ReEncrypt*"
            ],
            "Resource": "*",
            "Condition": {
                "StringLike": {
                    "kms:EncryptionContext:aws-crypto-ec:aws:bedrock:arn": "arn:aws:bedrock:${region}:${account-id}:agent-alias/*"
                }
            }
        },
        {
            "Sid": "Allow the caller identity data plane permissions to decrypt long term memory",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::${account-id}:role/${caller-identity-role}"
            },
            "Action": [
                "kms:Decrypt"
            ],
            "Resource": "*",
            "Condition": {
                "StringLike": {
                    "kms:EncryptionContext:aws-crypto-ec:aws:bedrock:arn": "arn:aws:bedrock:${region}:${account-id}:agent-alias/*",
                    "kms:ViaService": "bedrock.${region}.amazonaws.com"
                }
            }
        }
    ]
}
```

**IAM permissions to encrypt and decrypt agent memory**

The identity calling the Agents API needs the following IAM permissions to configure an AWS KMS key for agents with memory enabled. Amazon Bedrock agents use these permissions to make sure that the caller identity is authorized for the permissions mentioned in the key policy above. For the API operations that invoke agents, Amazon Bedrock agents use the caller identity's `kms:Decrypt` permissions to decrypt memory.

Replace `${region}`, `${account-id}`, and `${key-id}` with the appropriate values.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AgentsControlPlaneLongTermMemory",
            "Effect": "Allow",
            "Action": [
                "kms:GenerateDataKeyWithoutPlaintext",
                "kms:ReEncrypt*"
            ],
            "Resource": "arn:aws:kms:${region}:${account-id}:key/${key-id}",
            "Condition": {
                "StringLike": {
                    "kms:EncryptionContext:aws-crypto-ec:aws:bedrock:arn": "arn:aws:bedrock:${region}:${account-id}:agent-alias/*"
                }
            }
        },
        {
            "Sid": "AgentsDataPlaneLongTermMemory",
            "Effect": "Allow",
            "Action": [
                "kms:Decrypt"
            ],
            "Resource": "arn:aws:kms:${region}:${account-id}:key/${key-id}",
            "Condition": {
                "StringLike": {
                    "kms:EncryptionContext:aws-crypto-ec:aws:bedrock:arn": "arn:aws:bedrock:${region}:${account-id}:agent-alias/*"
                }
            }
        }
    ]
}
```

------
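The `StringLike` conditions in the policies above compare the encryption-context value against a wildcard pattern. As a rough illustration (IAM's `*` wildcard behaves much like shell-style globbing here; the ARNs below are examples), you can sanity-check an agent-alias ARN against the pattern:

```python
from fnmatch import fnmatchcase

# Wildcard pattern from the StringLike condition above.
pattern = "arn:aws:bedrock:us-east-1:123456789012:agent-alias/*"

# An alias ARN in the same Region and account matches...
print(fnmatchcase(
    "arn:aws:bedrock:us-east-1:123456789012:agent-alias/AGENT123/ALIAS456",
    pattern))  # True

# ...but an ARN from a different Region does not.
print(fnmatchcase(
    "arn:aws:bedrock:us-west-2:123456789012:agent-alias/AGENT123",
    pattern))  # False
```

Because the pattern pins the Region and account, a key scoped this way can't be used to decrypt memory for aliases in other accounts or Regions.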

# Preventative security best practice for agents
<a name="security-best-practice-agents"></a>

The following best practices for Amazon Bedrock can help prevent security incidents:

**Use secure connections**

Always use encrypted connections, such as those that begin with `https://`, to keep sensitive information secure in transit.

**Implement least privilege access to resources**

When you create custom policies for Amazon Bedrock resources, grant only the permissions required to perform a task. Start with a minimum set of permissions and grant additional permissions as needed. Implementing least privilege access is essential to reducing the risk and impact that could result from errors or malicious attacks. For more information, see [Identity and access management for Amazon Bedrock](security-iam.md).

**Do not include PII in any of the agent resources containing customer data**

When creating, updating, or deleting agent resources (for example, when using [CreateAgent](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_agent_CreateAgent.html)), don't include personally identifiable information (PII) in any fields that don't support using a customer managed key, such as action group names and knowledge base names. For the list of fields that support using a customer managed key, see [Encryption of agent resources with customer managed keys (CMK)](cmk-agent-resources.md).

# Encryption of agent resources for agents created before January 22, 2025
<a name="encryption-agents"></a>

**Important**  
If you created your agent *after* January 22, 2025, follow the instructions in [Encryption of agent resources](encryption-agents-new.md).

Amazon Bedrock encrypts your agent's session information. By default, Amazon Bedrock encrypts this data using an AWS managed key. Optionally, you can encrypt the agent artifacts using a customer managed key.

For more information about AWS KMS keys, see [Customer managed keys](https://docs.aws.amazon.com/kms/latest/developerguide/concepts.html#customer-cmk) in the *AWS Key Management Service Developer Guide*.

If you encrypt your agent sessions with a custom KMS key, you must set up the following identity-based policy and resource-based policy to allow Amazon Bedrock to encrypt and decrypt agent resources on your behalf.

1. Attach the following identity-based policy to an IAM role or user with permissions to make `InvokeAgent` calls. This policy validates that the user making an `InvokeAgent` call has KMS permissions. Replace the *region*, *account-id*, *agent-id*, and *key-id* with the appropriate values.

------
#### [ JSON ]

****  

   ```
   {
       "Version": "2012-10-17",
       "Statement": [
           {
               "Sid": "EncryptDecryptAgents",
               "Effect": "Allow",
               "Action": [
                   "kms:GenerateDataKey",
                   "kms:Decrypt"
               ],
               "Resource": "arn:aws:kms:us-east-1:123456789012:key/key-id",
               "Condition": {
                   "StringEquals": {
                       "kms:EncryptionContext:aws:bedrock:arn": "arn:aws:bedrock:us-east-1:123456789012:agent/agent-id"
                   }
               }
           }
       ]
   }
   ```

------

1. Attach the following resource-based policy to your KMS key. Change the scope of the permissions as necessary. Replace the *region*, *account-id*, *agent-id*, and *key-id* with the appropriate values.

------
#### [ JSON ]

****  

   ```
   {
       "Version": "2012-10-17",
       "Statement": [
           {
               "Sid": "AllowRootModifyKMSKey",
               "Effect": "Allow",
               "Principal": {
                   "AWS": "arn:aws:iam::123456789012:root"
               },
               "Action": "kms:*",
               "Resource": "arn:aws:kms:us-east-1:123456789012:key/KeyId"
           },
           {
               "Sid": "AllowBedrockEncryptAgent",
               "Effect": "Allow",
               "Principal": {
                   "Service": "bedrock.amazonaws.com"
               },
               "Action": [
                   "kms:GenerateDataKey",
                   "kms:Decrypt"
               ],
               "Resource": "arn:aws:kms:us-east-1:123456789012:key/KeyId",
               "Condition": {
                   "StringEquals": {
                       "kms:EncryptionContext:aws:bedrock:arn": "arn:aws:bedrock:us-east-1:123456789012:agent/AgentId"
                   }
               }
           },
           {
               "Sid": "AllowRoleEncryptAgent",
               "Effect": "Allow",
               "Principal": {
                   "AWS": "arn:aws:iam::123456789012:role/Role"
               },
               "Action": [
                   "kms:GenerateDataKey*",
                   "kms:Decrypt"
               ],
               "Resource": "arn:aws:kms:us-east-1:123456789012:key/KeyId"
           },
           {
               "Sid": "AllowAttachmentPersistentResources",
               "Effect": "Allow",
               "Principal": {
                   "Service": "bedrock.amazonaws.com"
               },
               "Action": [
                   "kms:CreateGrant",
                   "kms:ListGrants",
                   "kms:RevokeGrant"
               ],
               "Resource": "*",
               "Condition": {
                   "Bool": {
                       "kms:GrantIsForAWSResource": "true"
                   }
               }
           }
       ]
   }
   ```

------

# Encryption of Amazon Bedrock Flows resources
<a name="encryption-flows"></a>

Amazon Bedrock encrypts your data at rest. By default, Amazon Bedrock encrypts this data using an AWS managed key. Optionally, you can encrypt the data using a customer managed key.

For more information about AWS KMS keys, see [Customer managed keys](https://docs.aws.amazon.com/kms/latest/developerguide/concepts.html#customer-cmk) in the *AWS Key Management Service Developer Guide*.

If you encrypt data with a custom KMS key, you must set up the following identity-based policy and resource-based policy to allow Amazon Bedrock to encrypt and decrypt data on your behalf.

1. Attach the following identity-based policy to an IAM role or user with permissions to make Amazon Bedrock Flows API calls. This policy validates that the user making Amazon Bedrock Flows calls has KMS permissions. Replace the *region*, *account-id*, *flow-id*, and *key-id* with the appropriate values.

------
#### [ JSON ]

****  

   ```
   {
       "Version": "2012-10-17",
       "Statement": [
           {
               "Sid": "EncryptFlow",
               "Effect": "Allow",
               "Action": [
                   "kms:GenerateDataKey",
                   "kms:Decrypt"
               ],
               "Resource": "arn:aws:kms:us-east-1:123456789012:key/${key-id}",
               "Condition": {
                   "StringEquals": {
                       "kms:EncryptionContext:aws:bedrock-flows:arn": "arn:aws:bedrock:us-east-1:123456789012:flow/${flow-id}",
                       "kms:ViaService": "bedrock.us-east-1.amazonaws.com"
                   }
               }
           }
       ]
   }
   ```

------

1. Attach the following resource-based policy to your KMS key. Change the scope of the permissions as necessary. Replace the *IAM-USER/ROLE-ARN*, *region*, *account-id*, *flow-id*, and *key-id* with the appropriate values.

------
#### [ JSON ]

****  

   ```
   {
       "Version": "2012-10-17",
       "Statement": [
           {
               "Sid": "AllowRootModifyKMSId",
               "Effect": "Allow",
               "Principal": {
                   "AWS": "arn:aws:iam::123456789012:root"
               },
               "Action": "kms:*",
               "Resource": "arn:aws:kms:us-east-1:123456789012:key/KeyId"
           },
           {
               "Sid": "AllowRoleUseKMSKey",
               "Effect": "Allow",
               "Principal": {
                   "AWS": "arn:aws:iam::123456789012:role/RoleName"
               },
               "Action": [
                   "kms:GenerateDataKey",
                   "kms:Decrypt"
               ],
               "Resource": "arn:aws:kms:us-east-1:123456789012:key/${key-id}",
               "Condition": {
                   "StringEquals": {
                       "kms:EncryptionContext:aws:bedrock-flows:arn": "arn:aws:bedrock:us-east-1:123456789012:flow/FlowId",
                       "kms:ViaService": "bedrock.us-east-1.amazonaws.com"
                   }
               }
           }
       ]
   }
   ```

------

1. For [flow executions](flows-create-async.md), attach the following identity-based policy to a [service role with permissions to create and manage flows](flows-permissions.md). This policy validates that your service role has AWS KMS permissions. Replace the *region*, *account-id*, *flow-id*, and *key-id* with the appropriate values.

------
#### [ JSON ]

****  

   ```
   {
       "Version": "2012-10-17",
       "Statement": [
           {
               "Sid": "EncryptionFlows",
               "Effect": "Allow",
               "Action": [
                   "kms:GenerateDataKey",
                   "kms:Decrypt"
               ],
               "Resource": "arn:aws:kms:us-east-1:123456789012:key/key-id",
               "Condition": {
                   "StringEquals": {
                       "kms:EncryptionContext:aws:bedrock-flows:arn": "arn:aws:bedrock:us-east-1:123456789012:flow/flow-id",
                       "kms:ViaService": "bedrock.us-east-1.amazonaws.com"
                   }
               }
           }
       ]
   }
   ```

------

# Encryption of knowledge base resources
<a name="encryption-kb"></a>

Amazon Bedrock encrypts resources related to your knowledge bases. By default, Amazon Bedrock encrypts this data using an AWS owned key. Optionally, you can encrypt these resources using a customer managed key.

Encryption with a KMS key can occur with the following processes:
+ Transient data storage while ingesting your data sources
+ Passing information to OpenSearch Service if you let Amazon Bedrock set up your vector database
+ Querying a knowledge base

The following resources used by your knowledge bases can be encrypted with a KMS key. If you encrypt them, you need to add permissions to decrypt the KMS key.
+ Data sources stored in an Amazon S3 bucket
+ Third-party vector stores

For more information about AWS KMS keys, see [Customer managed keys](https://docs.aws.amazon.com/kms/latest/developerguide/concepts.html#customer-cmk) in the *AWS Key Management Service Developer Guide*.

**Note**  
Amazon Bedrock Knowledge Bases uses TLS encryption for communication with third-party data source connectors and vector stores where the provider permits and supports TLS encryption in transit.

**Topics**
+ [Encryption of transient data storage during data ingestion](#encryption-kb-ingestion)
+ [Encryption of information passed to Amazon OpenSearch Service](#encryption-kb-oss)
+ [Encryption of information passed to Amazon S3 Vectors](#encryption-kb-s3-vector)
+ [Encryption of knowledge base retrieval](#encryption-kb-runtime)
+ [Permissions to decrypt your AWS KMS key for your data sources in Amazon S3](#encryption-kb-ds)
+ [Permissions to decrypt an AWS Secrets Manager secret for the vector store containing your knowledge base](#encryption-kb-3p)
+ [Permissions for Bedrock Data Automation (BDA) with AWS KMS encryption](#encryption-kb-bda)

## Encryption of transient data storage during data ingestion
<a name="encryption-kb-ingestion"></a>

When you set up a data ingestion job for your knowledge base, you can encrypt the job with a custom KMS key.

To allow Amazon Bedrock to use an AWS KMS key for transient data storage while ingesting your data source, attach the following policy to your Amazon Bedrock service role. Replace the example values with your own AWS Region, account ID, and AWS KMS key ID.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "kms:GenerateDataKey",
                "kms:Decrypt"
            ],
            "Resource": [
                "arn:aws:kms:us-east-1:123456789012:key/key-id"
            ]
        }
    ]
}
```

------
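One way to attach this inline policy to your service role is with the IAM `PutRolePolicy` API. The sketch below assumes boto3; the policy name is a hypothetical label, and the role name and key ARN are yours to supply:

```python
import json

# The transient-storage policy above, as a Python dict ready to attach.
TRANSIENT_STORAGE_POLICY = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["kms:GenerateDataKey", "kms:Decrypt"],
        "Resource": ["arn:aws:kms:us-east-1:123456789012:key/key-id"],
    }],
}

def attach_to_service_role(role_name: str) -> None:
    """Attach the policy inline to the Amazon Bedrock service role.
    Requires credentials with iam:PutRolePolicy."""
    import boto3
    boto3.client("iam").put_role_policy(
        RoleName=role_name,
        PolicyName="KbTransientStorageKms",  # hypothetical policy name
        PolicyDocument=json.dumps(TRANSIENT_STORAGE_POLICY),
    )
```

Keeping the policy as a dict makes it easy to substitute your own key ARN before serializing it for the API call.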

## Encryption of information passed to Amazon OpenSearch Service
<a name="encryption-kb-oss"></a>

If you opt to let Amazon Bedrock create a vector store in Amazon OpenSearch Service for your knowledge base, Amazon Bedrock can pass a KMS key that you choose to Amazon OpenSearch Service for encryption. To learn more about encryption in Amazon OpenSearch Service, see [Encryption in Amazon OpenSearch Service](https://docs.aws.amazon.com/opensearch-service/latest/developerguide/serverless-encryption.html).

## Encryption of information passed to Amazon S3 Vectors
<a name="encryption-kb-s3-vector"></a>

If you opt to let Amazon Bedrock create an S3 vector bucket and vector index in Amazon S3 Vectors for your knowledge base, Amazon Bedrock can pass a KMS key that you choose to Amazon S3 Vectors for encryption. To learn more about encryption in Amazon S3 Vectors, see [Encryption with Amazon S3 Vectors](https://docs.aws.amazon.com/AmazonS3/latest/userguide/s3-vectors-bucket-encryption.html).

## Encryption of knowledge base retrieval
<a name="encryption-kb-runtime"></a>

You can encrypt sessions in which you generate responses from querying a knowledge base with a KMS key. To do so, include the ARN of a KMS key in the `kmsKeyArn` field when making a [RetrieveAndGenerate](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_agent-runtime_RetrieveAndGenerate.html) request. To allow Amazon Bedrock to encrypt the session context, attach the following policy, replacing the example values with your own AWS Region, account ID, and AWS KMS key ID.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "bedrock.amazonaws.com"
            },
            "Action": [
                "kms:GenerateDataKey",
                "kms:Decrypt"
            ],
            "Resource": "arn:aws:kms:us-east-1:123456789012:key/key-id"
        }
    ]
}
```

------
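A request that uses this key might look like the following sketch (the query, knowledge base ID, model ARN, and key ARN are placeholders; this assumes the boto3 `bedrock-agent-runtime` client, which accepts the key in `sessionConfiguration`):

```python
def build_rag_request(query: str, kb_id: str, model_arn: str, key_arn: str) -> dict:
    """Assemble a RetrieveAndGenerate request that encrypts the session
    with a customer managed key via sessionConfiguration.kmsKeyArn."""
    return {
        "input": {"text": query},
        "retrieveAndGenerateConfiguration": {
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": kb_id,
                "modelArn": model_arn,
            },
        },
        "sessionConfiguration": {"kmsKeyArn": key_arn},
    }

def query_knowledge_base(**kwargs) -> str:
    """Send the request. Requires credentials permitted to call RetrieveAndGenerate."""
    import boto3
    resp = boto3.client("bedrock-agent-runtime").retrieve_and_generate(
        **build_rag_request(**kwargs))
    return resp["output"]["text"]
```

Separating request assembly from the network call keeps the encryption configuration easy to inspect and test.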

## Permissions to decrypt your AWS KMS key for your data sources in Amazon S3
<a name="encryption-kb-ds"></a>

You store the data sources for your knowledge base in your Amazon S3 bucket. To encrypt these documents at rest, you can use the Amazon S3 SSE-S3 server-side encryption option. With this option, objects are encrypted with service keys managed by the Amazon S3 service. 

For more information, see [Protecting data using server-side encryption with Amazon S3-managed encryption keys (SSE-S3)](https://docs.aws.amazon.com/AmazonS3/latest/userguide/UsingServerSideEncryption.html) in the *Amazon Simple Storage Service User Guide*.

If you encrypted your data sources in Amazon S3 with a custom AWS KMS key, attach the following policy to your Amazon Bedrock service role to allow Amazon Bedrock to decrypt your key. Replace the example values with your own AWS Region, account ID, and AWS KMS key ID.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "kms:Decrypt"
            ],
            "Resource": [
                "arn:aws:kms:us-east-1:123456789012:key/key-id"
            ],
            "Condition": {
                "StringEquals": {
                    "kms:ViaService": [
                        "s3.us-east-1.amazonaws.com"
                    ]
                }
            }
        }
    ]
}
```

------

## Permissions to decrypt an AWS Secrets Manager secret for the vector store containing your knowledge base
<a name="encryption-kb-3p"></a>

If the vector store containing your knowledge base is configured with an AWS Secrets Manager secret, you can encrypt the secret with a custom AWS KMS key by following the steps at [Secret encryption and decryption in AWS Secrets Manager](https://docs.aws.amazon.com/secretsmanager/latest/userguide/security-encryption.html).

If you do so, attach the following policy to your Amazon Bedrock service role to allow it to decrypt your key. Replace the example values with your own AWS Region, account ID, and AWS KMS key ID.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "kms:Decrypt"
            ],
            "Resource": [
                "arn:aws:kms:us-east-1:123456789012:key/key-id"
            ]
        }
    ]
}
```

------

## Permissions for Bedrock Data Automation (BDA) with AWS KMS encryption
<a name="encryption-kb-bda"></a>

When you use BDA to process multimodal content with customer managed AWS KMS keys, additional permissions are required beyond the standard AWS KMS permissions.

Attach the following policy to your Amazon Bedrock service role to allow BDA to work with encrypted multimedia files. Replace the example values with your own AWS Region, account ID, and AWS KMS key ID.

```
{
    "Sid": "KmsPermissionStatementForBDA",
    "Effect": "Allow",
    "Action": [
        "kms:GenerateDataKey",
        "kms:Decrypt",
        "kms:DescribeKey",
        "kms:CreateGrant"
    ],
    "Resource": "arn:aws:kms:region:account-id:key/key-id",
    "Condition": {
        "StringEquals": {
            "aws:ResourceAccount": "account-id",
            "kms:ViaService": "bedrock.region.amazonaws.com"
        }
    }
}
```

The BDA-specific permissions include `kms:DescribeKey` and `kms:CreateGrant` actions, which are required for BDA to process encrypted audio, video, and image files.

# Protect your data using Amazon VPC and AWS PrivateLink
<a name="usingVPC"></a>

To control access to your data, we recommend that you use a virtual private cloud (VPC) with [Amazon VPC](https://docs.aws.amazon.com/vpc/latest/userguide/what-is-amazon-vpc.html). Using a VPC protects your data and lets you monitor all network traffic in and out of the AWS job containers by using [VPC Flow Logs](https://docs.aws.amazon.com/vpc/latest/userguide/flow-logs.html).

You can further protect your data by configuring your VPC so that your data isn't available over the internet and instead creating a VPC interface endpoint with [AWS PrivateLink](https://docs.aws.amazon.com/vpc/latest/privatelink/what-is-privatelink.html) to establish a private connection to your data.

The following list shows some Amazon Bedrock features with which you can use a VPC to protect your data:
+ Model customization – [(Optional) Protect your model customization jobs using a VPC](custom-model-job-access-security.md#vpc-model-customization)
+ Batch inference – [Protect batch inference jobs using a VPC](batch-vpc.md)
+ Amazon Bedrock Knowledge Bases – [Access Amazon OpenSearch Serverless using an interface endpoint (AWS PrivateLink)](https://docs.aws.amazon.com/opensearch-service/latest/developerguide/serverless-vpc.html)

## Set up a VPC
<a name="create-vpc"></a>

You can use a [default VPC](https://docs.aws.amazon.com/vpc/latest/userguide/default-vpc.html) or create a new VPC by following the guidance at [Get started with Amazon VPC](https://docs.aws.amazon.com/vpc/latest/userguide/vpc-getting-started.html) and [Create a VPC](https://docs.aws.amazon.com/vpc/latest/userguide/create-vpc.html).

When you create your VPC, we recommend that you use the default DNS settings for your endpoint route table, so that standard Amazon S3 URLs (for example, `https://s3.aws-region.amazonaws.com/training-bucket`) resolve.

The following topics show how to set up a VPC endpoint with AWS PrivateLink and present an example use case for using a VPC to protect access to your Amazon S3 files.

**Topics**
+ [Set up a VPC](#create-vpc)
+ [Use interface VPC endpoints (AWS PrivateLink) to create a private connection between your VPC and Amazon Bedrock](vpc-interface-endpoints.md)
+ [(Example) Restrict data access to your Amazon S3 data using VPC](vpc-s3.md)

# Use interface VPC endpoints (AWS PrivateLink) to create a private connection between your VPC and Amazon Bedrock
<a name="vpc-interface-endpoints"></a>

You can use AWS PrivateLink to create a private connection between your VPC and Amazon Bedrock. You can access Amazon Bedrock as if it were in your VPC, without the use of an internet gateway, NAT device, VPN connection, or Direct Connect connection. Instances in your VPC don't need public IP addresses to access Amazon Bedrock.

You establish this private connection by creating an *interface endpoint*, powered by AWS PrivateLink. We create an endpoint network interface in each subnet that you enable for the interface endpoint. These are requester-managed network interfaces that serve as the entry point for traffic destined for Amazon Bedrock.

For more information, see [Access AWS services through AWS PrivateLink](https://docs.aws.amazon.com/vpc/latest/privatelink/privatelink-access-aws-services.html) in the *AWS PrivateLink Guide*.

## Considerations for Amazon Bedrock VPC endpoints
<a name="vpc-endpoint-considerations"></a>

Before you set up an interface endpoint for Amazon Bedrock, review [Considerations](https://docs.aws.amazon.com/vpc/latest/privatelink/create-interface-endpoint.html#considerations-interface-endpoints) in the *AWS PrivateLink Guide*.

Amazon Bedrock supports making the following API calls through VPC endpoints.



| Category | Endpoint suffix | 
| --- | --- | 
| [Amazon Bedrock Control Plane API actions](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_Operations_Amazon_Bedrock.html) | bedrock | 
| [Amazon Bedrock Runtime API actions](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_Operations_Amazon_Bedrock_Runtime.html) | bedrock-runtime | 
| Amazon Bedrock Mantle API actions | bedrock-mantle | 
| [Amazon Bedrock Agents Build-time API actions](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_Operations_Agents_for_Amazon_Bedrock.html) | bedrock-agent | 
| [Amazon Bedrock Agents Runtime API actions](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_Operations_Agents_for_Amazon_Bedrock_Runtime.html) | bedrock-agent-runtime | 

**Availability Zones**

Amazon Bedrock and Amazon Bedrock Agents endpoints are available in multiple Availability Zones.

## Create an interface endpoint for Amazon Bedrock
<a name="vpc-endpoint-create"></a>

You can create an interface endpoint for Amazon Bedrock using either the Amazon VPC console or the AWS Command Line Interface (AWS CLI). For more information, see [Create an interface endpoint](https://docs.aws.amazon.com/vpc/latest/privatelink/create-interface-endpoint.html#create-interface-endpoint-aws) in the *AWS PrivateLink Guide*.

Create an interface endpoint for Amazon Bedrock using any of the following service names:
+ `com.amazonaws.region.bedrock`
+ `com.amazonaws.region.bedrock-runtime`
+ `com.amazonaws.region.bedrock-mantle`
+ `com.amazonaws.region.bedrock-agent`
+ `com.amazonaws.region.bedrock-agent-runtime`
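The service names all follow the `com.amazonaws.region.suffix` pattern, so if you provision several endpoints you can generate the names programmatically. A minimal sketch (the helper name is illustrative):

```python
# Endpoint suffixes for Amazon Bedrock interface endpoints.
SUFFIXES = [
    "bedrock",
    "bedrock-runtime",
    "bedrock-mantle",
    "bedrock-agent",
    "bedrock-agent-runtime",
]

def service_names(region):
    """Build the AWS PrivateLink service names for the given Region."""
    return [f"com.amazonaws.{region}.{suffix}" for suffix in SUFFIXES]

for name in service_names("us-east-1"):
    print(name)
```

Each name could then be passed to `aws ec2 create-vpc-endpoint --vpc-endpoint-type Interface --service-name <name>`, along with your VPC and subnet IDs.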

After you create the endpoint, you have the option to enable a private DNS hostname. To do so, select **Enable Private DNS Name** in the Amazon VPC console when you create the VPC endpoint.

If you enable private DNS for the interface endpoint, you can make API requests to Amazon Bedrock using its default Regional DNS name. The following examples show the format of the default Regional DNS names.
+ `bedrock.region.amazonaws.com`
+ `bedrock-runtime.region.amazonaws.com`
+ `bedrock-mantle.region.api.aws`
+ `bedrock-agent.region.amazonaws.com`
+ `bedrock-agent-runtime.region.amazonaws.com`

## Create an endpoint policy for your interface endpoint
<a name="vpc-endpoint-policy"></a>

An endpoint policy is an IAM resource that you can attach to an interface endpoint. The default endpoint policy allows full access to Amazon Bedrock through the interface endpoint. To control the access allowed to Amazon Bedrock from your VPC, attach a custom endpoint policy to the interface endpoint.

An endpoint policy specifies the following information:
+ The principals that can perform actions (AWS accounts, IAM users, and IAM roles).
+ The actions that can be performed.
+ The resources on which the actions can be performed.

For more information, see [Control access to services using endpoint policies](https://docs.aws.amazon.com/vpc/latest/privatelink/vpc-endpoints-access.html) in the *AWS PrivateLink Guide*.

**Example: VPC endpoint policy for Amazon Bedrock actions**  
The following is an example of a custom endpoint policy. When you attach this resource-based policy to your interface endpoint, it grants access to the listed Amazon Bedrock actions for all principals on all resources.

------
#### [ JSON ]


```
{
   "Version":"2012-10-17",		 	 	 
   "Statement": [
      {
         "Principal": "*",
         "Effect": "Allow",
         "Action": [
            "bedrock:InvokeModel",
            "bedrock:InvokeModelWithResponseStream"
         ],
         "Resource":"*"
      }
   ]
}
```

------
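To reason about what a custom endpoint policy permits, it can help to check candidate actions against the policy's Allow statements. The following is a deliberately simplified sketch — real IAM evaluation also weighs Deny statements, principals, resources, and conditions:

```python
from fnmatch import fnmatch

# The example endpoint policy above, as a Python dict.
ENDPOINT_POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Principal": "*",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel",
                "bedrock:InvokeModelWithResponseStream",
            ],
            "Resource": "*",
        }
    ],
}

def is_allowed(policy, action):
    """Return True if any Allow statement's action patterns match.
    Simplification: ignores Deny, Principal, Resource, and Condition."""
    for stmt in policy["Statement"]:
        if stmt["Effect"] != "Allow":
            continue
        actions = stmt["Action"]
        if isinstance(actions, str):
            actions = [actions]
        # IAM action patterns may contain "*" wildcards; fnmatch handles them.
        if any(fnmatch(action, pattern) for pattern in actions):
            return True
    return False

print(is_allowed(ENDPOINT_POLICY, "bedrock:InvokeModel"))                  # True
print(is_allowed(ENDPOINT_POLICY, "bedrock:CreateModelCustomizationJob"))  # False
```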

**Example: VPC endpoint policy for Amazon Bedrock Mantle actions**  
The following is an example of a custom endpoint policy. When you attach this resource-based policy to your interface endpoint, it grants access to the listed Amazon Bedrock Mantle actions for all principals on all resources.

```
{
   "Version":"2012-10-17",		 	 	 
   "Statement": [
      {
         "Principal": "*",
         "Effect": "Allow",
         "Action": [
            "bedrock-mantle:CreateInference"
         ],
         "Resource":"*"
      }
   ]
}
```

# (Example) Restrict data access to your Amazon S3 data using VPC
<a name="vpc-s3"></a>

You can use a VPC to restrict access to data in your Amazon S3 buckets. For further security, you can configure your VPC with no internet access and create an endpoint for it with AWS PrivateLink. You can also restrict access by attaching resource-based policies to the VPC endpoint or to the S3 bucket.

**Topics**
+ [Create an Amazon S3 VPC Endpoint](#vpc-s3-create)
+ [(Optional) Use IAM policies to restrict access to your S3 files](#vpc-policy-rbp)

## Create an Amazon S3 VPC Endpoint
<a name="vpc-s3-create"></a>

If you configure your VPC with no internet access, you need to create an [Amazon S3 VPC endpoint](https://docs.aws.amazon.com/AmazonS3/latest/userguide/privatelink-interface-endpoints.html) to allow your model customization jobs to access the S3 buckets that store your training and validation data and that will store the model artifacts.

Create the S3 VPC endpoint by following the steps at [Create a gateway endpoint for Amazon S3](https://docs.aws.amazon.com/vpc/latest/privatelink/vpc-endpoints-s3.html#create-gateway-endpoint-s3).

**Note**  
If you don't use the default DNS settings for your VPC, you need to ensure that the URLs for the locations of the data in your training jobs resolve by configuring the endpoint route tables. For information about VPC endpoint route tables, see [Routing for Gateway endpoints](https://docs.aws.amazon.com/AmazonVPC/latest/UserGuide/vpce-gateway.html#vpc-endpoints-routing).

## (Optional) Use IAM policies to restrict access to your S3 files
<a name="vpc-policy-rbp"></a>

You can use [resource-based policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_identity-vs-resource.html) to more tightly control access to your S3 files. You can use any combination of the following types of resource-based policies.
+ **Endpoint policies** – You can attach endpoint policies to your VPC endpoint to restrict access through the VPC endpoint. The default endpoint policy allows full access to Amazon S3 for any user or service in your VPC. While creating or after you create the endpoint, you can optionally attach a resource-based policy to the endpoint to add restrictions, such as only allowing the endpoint to access a specific bucket or only allowing a specific IAM role to access the endpoint. For examples, see [Edit the VPC endpoint policy](https://docs.aws.amazon.com/vpc/latest/privatelink/vpc-endpoints-s3.html#edit-vpc-endpoint-policy-s3).

  The following is an example policy you can attach to your VPC endpoint to only allow it to access the bucket that you specify.

------
#### [ JSON ]


  ```
  {
      "Version":"2012-10-17",		 	 	 
      "Statement": [
          {
              "Sid": "RestrictAccessToTrainingBucket",
              "Effect": "Allow",
              "Principal": "*",
              "Action": [
                  "s3:GetObject",
                  "s3:ListBucket"
              ],
              "Resource": [
                  "arn:aws:s3:::bucket",
                  "arn:aws:s3:::bucket/*"
              ]
          }
      ]
  }
  ```

------
+ **Bucket policies** – You can attach a bucket policy to an S3 bucket to restrict access to it. To create a bucket policy, follow the steps at [Using bucket policies](https://docs.aws.amazon.com/AmazonS3/latest/userguide/bucket-policies.html). To restrict access to traffic that comes from your VPC, you can use condition keys to specify the VPC itself, a VPC endpoint, or the IP address of the VPC. You can use the [aws:sourceVpc](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html#condition-keys-sourcevpc), [aws:sourceVpce](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html#condition-keys-sourcevpce), or [aws:VpcSourceIp](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html#condition-keys-vpcsourceip) condition keys.

  The following is an example policy you can attach to an S3 bucket to deny all traffic to the bucket unless it comes from your VPC.

------
#### [ JSON ]


  ```
  {
      "Version":"2012-10-17",		 	 	 
      "Statement": [
          {
              "Sid": "RestrictAccessToOutputBucket",
              "Effect": "Deny",
              "Principal": "*",
              "Action": [
                  "s3:GetObject",
                  "s3:PutObject",
                  "s3:ListBucket"
              ],
              "Resource": [
                  "arn:aws:s3:::bucket",
                  "arn:aws:s3:::bucket/*"
              ],
              "Condition": {
                  "StringNotEquals": {
                      "aws:sourceVpc": "vpc-11223344556677889"
                  }
              }
          }
      ]
  }
  ```

------

  For more examples, see [Control access using bucket policies](https://docs.aws.amazon.com/vpc/latest/privatelink/vpc-endpoints-s3.html#bucket-policies-s3).
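If you generate bucket policies for several buckets or VPCs, a small builder keeps the documents consistent. A sketch under that assumption (the `vpc_only_bucket_policy` helper name is hypothetical):

```python
import json

def vpc_only_bucket_policy(bucket, vpc_id):
    """Deny reads/writes on the bucket unless the request's source VPC
    matches vpc_id (mirrors the example bucket policy above)."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "RestrictAccessToOutputBucket",
                "Effect": "Deny",
                "Principal": "*",
                "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
                "Resource": [
                    f"arn:aws:s3:::{bucket}",
                    f"arn:aws:s3:::{bucket}/*",
                ],
                "Condition": {
                    "StringNotEquals": {"aws:sourceVpc": vpc_id}
                },
            }
        ],
    }

doc = vpc_only_bucket_policy("training-bucket", "vpc-11223344556677889")
print(json.dumps(doc, indent=4))
```

You could serialize the result with `json.dumps` and apply it with `aws s3api put-bucket-policy`.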

# Identity and access management for Amazon Bedrock
<a name="security-iam"></a>

AWS Identity and Access Management (IAM) is an AWS service that helps an administrator securely control access to AWS resources. IAM administrators control who can be *authenticated* (signed in) and *authorized* (have permissions) to use Amazon Bedrock resources. IAM is an AWS service that you can use with no additional charge.

**Topics**
+ [Audience](#security_iam_audience)
+ [Authenticating with identities](#security_iam_authentication)
+ [Managing access using policies](#security_iam_access-manage)
+ [How Amazon Bedrock works with IAM](security_iam_service-with-iam.md)
+ [Identity-based policy examples for Amazon Bedrock](security_iam_id-based-policy-examples.md)
+ [Managing IAM policies on Projects](security-iam-projects.md)
+ [AWS managed policies for Amazon Bedrock](security-iam-awsmanpol.md)
+ [Service roles](security-iam-sr.md)
+ [Configure access to Amazon S3 buckets](s3-bucket-access.md)
+ [Troubleshooting Amazon Bedrock identity and access](security_iam_troubleshoot.md)

## Audience
<a name="security_iam_audience"></a>

How you use AWS Identity and Access Management (IAM) differs based on your role:
+ **Service user** – Request permissions from your administrator if you can't access features. See [Troubleshooting Amazon Bedrock identity and access](security_iam_troubleshoot.md).
+ **Service administrator** – Determine which Amazon Bedrock features your users should access and submit permission requests. See [How Amazon Bedrock works with IAM](security_iam_service-with-iam.md).
+ **IAM administrator** – Write policies to manage access to Amazon Bedrock. See [Identity-based policy examples for Amazon Bedrock](security_iam_id-based-policy-examples.md).

## Authenticating with identities
<a name="security_iam_authentication"></a>

Authentication is how you sign in to AWS using your identity credentials. You must be authenticated as the AWS account root user, an IAM user, or by assuming an IAM role.

You can sign in as a federated identity by using credentials provided through an identity source, such as AWS IAM Identity Center (IAM Identity Center), your company's single sign-on authentication, or your Google or Facebook credentials. For more information about signing in, see [How to sign in to your AWS account](https://docs.aws.amazon.com/signin/latest/userguide/how-to-sign-in.html) in the *AWS Sign-In User Guide*.

For programmatic access, AWS provides software development kits (SDKs) and a command line interface (CLI) to cryptographically sign your requests by using your credentials. For more information, see [AWS Signature Version 4 for API requests](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_sigv.html) in the *IAM User Guide*.

### AWS account root user
<a name="security_iam_authentication-rootuser"></a>

 When you create an AWS account, you begin with one sign-in identity called the AWS account *root user* that has complete access to all AWS services and resources. We strongly recommend that you don't use the root user for everyday tasks. For tasks that require root user credentials, see [Tasks that require root user credentials](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_root-user.html#root-user-tasks) in the *IAM User Guide*. 

### Federated identity
<a name="security_iam_authentication-federated"></a>

As a best practice, require human users to use federation with an identity provider to access AWS services using temporary credentials.

A *federated identity* is a user from your enterprise directory, web identity provider, or Directory Service that accesses AWS services using credentials from an identity source. Federated identities assume roles that provide temporary credentials.

For centralized access management, we recommend AWS IAM Identity Center. For more information, see [What is IAM Identity Center?](https://docs.aws.amazon.com/singlesignon/latest/userguide/what-is.html) in the *AWS IAM Identity Center User Guide*.

### IAM users and groups
<a name="security_iam_authentication-iamuser"></a>

An *[IAM user](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_users.html)* is an identity with specific permissions for a single person or application. We recommend using temporary credentials instead of IAM users with long-term credentials. For more information, see [Require human users to use federation with an identity provider to access AWS using temporary credentials](https://docs.aws.amazon.com/IAM/latest/UserGuide/best-practices.html#bp-users-federation-idp) in the *IAM User Guide*.

An [IAM group](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_groups.html) specifies a collection of IAM users and makes permissions easier to manage for large sets of users. For more information, see [Use cases for IAM users](https://docs.aws.amazon.com/IAM/latest/UserGuide/gs-identities-iam-users.html) in the *IAM User Guide*.

### IAM roles
<a name="security_iam_authentication-iamrole"></a>

An *[IAM role](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles.html)* is an identity with specific permissions that provides temporary credentials. You can assume a role by [switching from a user to an IAM role (console)](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_switch-role-console.html) or by calling an AWS CLI or AWS API operation. For more information, see [Methods to assume a role](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_manage-assume.html) in the *IAM User Guide*.

IAM roles are useful for federated user access, temporary IAM user permissions, cross-account access, cross-service access, and applications running on Amazon EC2. For more information, see [Cross account resource access in IAM](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies-cross-account-resource-access.html) in the *IAM User Guide*.

## Managing access using policies
<a name="security_iam_access-manage"></a>

You control access in AWS by creating policies and attaching them to AWS identities or resources. A policy defines permissions when associated with an identity or resource. AWS evaluates these policies when a principal makes a request. Most policies are stored in AWS as JSON documents. For more information about JSON policy documents, see [Overview of JSON policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies.html#access_policies-json) in the *IAM User Guide*.

Using policies, administrators specify who has access to what by defining which **principal** can perform **actions** on what **resources**, and under what **conditions**.

By default, users and roles have no permissions. An IAM administrator creates IAM policies and adds them to roles, which users can then assume. IAM policies define permissions regardless of the method used to perform the operation.

### Identity-based policies
<a name="security_iam_access-manage-id-based-policies"></a>

Identity-based policies are JSON permissions policy documents that you attach to an identity (user, group, or role). These policies control what actions identities can perform, on which resources, and under what conditions. To learn how to create an identity-based policy, see [Define custom IAM permissions with customer managed policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_create.html) in the *IAM User Guide*.

Identity-based policies can be *inline policies* (embedded directly into a single identity) or *managed policies* (standalone policies attached to multiple identities). To learn how to choose between managed and inline policies, see [Choose between managed policies and inline policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies-choosing-managed-or-inline.html) in the *IAM User Guide*.

### Resource-based policies
<a name="security_iam_access-manage-resource-based-policies"></a>

Resource-based policies are JSON policy documents that you attach to a resource. Examples include IAM *role trust policies* and Amazon S3 *bucket policies*. In services that support resource-based policies, service administrators can use them to control access to a specific resource. You must [specify a principal](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_elements_principal.html) in a resource-based policy.

Resource-based policies are inline policies that are located in that service. You can't use AWS managed policies from IAM in a resource-based policy.

### Other policy types
<a name="security_iam_access-manage-other-policies"></a>

AWS supports additional policy types that can set the maximum permissions granted by more common policy types:
+ **Permissions boundaries** – Set the maximum permissions that an identity-based policy can grant to an IAM entity. For more information, see [Permissions boundaries for IAM entities](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_boundaries.html) in the *IAM User Guide*.
+ **Service control policies (SCPs)** – Specify the maximum permissions for an organization or organizational unit in AWS Organizations. For more information, see [Service control policies](https://docs.aws.amazon.com/organizations/latest/userguide/orgs_manage_policies_scps.html) in the *AWS Organizations User Guide*.
+ **Resource control policies (RCPs)** – Set the maximum available permissions for resources in your accounts. For more information, see [Resource control policies (RCPs)](https://docs.aws.amazon.com/organizations/latest/userguide/orgs_manage_policies_rcps.html) in the *AWS Organizations User Guide*.
+ **Session policies** – Advanced policies passed as a parameter when creating a temporary session for a role or federated user. For more information, see [Session policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies.html#policies_session) in the *IAM User Guide*.

### Multiple policy types
<a name="security_iam_access-manage-multiple-policies"></a>

When multiple types of policies apply to a request, the resulting permissions are more complicated to understand. To learn how AWS determines whether to allow a request when multiple policy types are involved, see [Policy evaluation logic](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_evaluation-logic.html) in the *IAM User Guide*.

# How Amazon Bedrock works with IAM
<a name="security_iam_service-with-iam"></a>

Before you use IAM to manage access to Amazon Bedrock, learn what IAM features are available to use with Amazon Bedrock.






**IAM features you can use with Amazon Bedrock**  

| IAM feature | Amazon Bedrock support | 
| --- | --- | 
|  [Identity-based policies](#security_iam_service-with-iam-id-based-policies)  |   Yes  | 
|  [Resource-based policies](#security_iam_service-with-iam-resource-based-policies)  |   No   | 
|  [Policy actions](#security_iam_service-with-iam-id-based-policies-actions)  |   Yes  | 
|  [Policy resources](#security_iam_service-with-iam-id-based-policies-resources)  |   Yes  | 
|  [Policy condition keys](#security_iam_service-with-iam-id-based-policies-conditionkeys)  |   Yes  | 
|  [ACLs](#security_iam_service-with-iam-acls)  |   No   | 
|  [ABAC (tags in policies)](#security_iam_service-with-iam-tags)  |   Yes  | 
|  [Temporary credentials](#security_iam_service-with-iam-roles-tempcreds)  |   Yes  | 
|  [Principal permissions](#security_iam_service-with-iam-principal-permissions)  |   Yes  | 
|  [Service roles](#security_iam_service-with-iam-roles-service)  |   Yes  | 
|  [Service-linked roles](#security_iam_service-with-iam-roles-service-linked)  |   No   | 

To get a high-level view of how Amazon Bedrock and other AWS services work with most IAM features, see [AWS services that work with IAM](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_aws-services-that-work-with-iam.html) in the *IAM User Guide*.

## Identity-based policies for Amazon Bedrock
<a name="security_iam_service-with-iam-id-based-policies"></a>

**Supports identity-based policies:** Yes

Identity-based policies are JSON permissions policy documents that you can attach to an identity, such as an IAM user, group of users, or role. These policies control what actions users and roles can perform, on which resources, and under what conditions. To learn how to create an identity-based policy, see [Define custom IAM permissions with customer managed policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_create.html) in the *IAM User Guide*.

With IAM identity-based policies, you can specify allowed or denied actions and resources as well as the conditions under which actions are allowed or denied. To learn about all of the elements that you can use in a JSON policy, see [IAM JSON policy elements reference](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_elements.html) in the *IAM User Guide*.

### Identity-based policy examples for Amazon Bedrock
<a name="security_iam_service-with-iam-id-based-policies-examples"></a>



To view examples of Amazon Bedrock identity-based policies, see [Identity-based policy examples for Amazon Bedrock](security_iam_id-based-policy-examples.md).

## Resource-based policies within Amazon Bedrock
<a name="security_iam_service-with-iam-resource-based-policies"></a>

**Supports resource-based policies:** No 

Resource-based policies are JSON policy documents that you attach to a resource. Examples of resource-based policies are IAM *role trust policies* and Amazon S3 *bucket policies*. In services that support resource-based policies, service administrators can use them to control access to a specific resource. For the resource where the policy is attached, the policy defines what actions a specified principal can perform on that resource and under what conditions. You must [specify a principal](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_elements_principal.html) in a resource-based policy. Principals can include accounts, users, roles, federated users, or AWS services.

To enable cross-account access, you can specify an entire account or IAM entities in another account as the principal in a resource-based policy. For more information, see [Cross account resource access in IAM](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies-cross-account-resource-access.html) in the *IAM User Guide*.

## Policy actions for Amazon Bedrock
<a name="security_iam_service-with-iam-id-based-policies-actions"></a>

**Supports policy actions:** Yes

Administrators can use AWS JSON policies to specify who has access to what. That is, which **principal** can perform **actions** on what **resources**, and under what **conditions**.

The `Action` element of a JSON policy describes the actions that you can use to allow or deny access in a policy. Include actions in a policy to grant permissions to perform the associated operation.



To see a list of Amazon Bedrock actions, see [Actions defined by Amazon Bedrock ](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazonbedrock.html#amazonbedrock-actions-as-permissions) in the *Service Authorization Reference*.

Policy actions in Amazon Bedrock use the following prefix before the action:

```
bedrock
```

To specify multiple actions in a single statement, separate them with commas.

```
"Action": [
   "bedrock:action1",
   "bedrock:action2"
]
```





To view examples of Amazon Bedrock identity-based policies, see [Identity-based policy examples for Amazon Bedrock](security_iam_id-based-policy-examples.md).

## Policy resources for Amazon Bedrock
<a name="security_iam_service-with-iam-id-based-policies-resources"></a>

**Supports policy resources:** Yes

Administrators can use AWS JSON policies to specify who has access to what. That is, which **principal** can perform **actions** on what **resources**, and under what **conditions**.

The `Resource` JSON policy element specifies the object or objects to which the action applies. As a best practice, specify a resource using its [Amazon Resource Name (ARN)](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference-arns.html). For actions that don't support resource-level permissions, use a wildcard (`*`) to indicate that the statement applies to all resources.

```
"Resource": "*"
```

To see a list of Amazon Bedrock resource types and their ARNs, see [Resources defined by Amazon Bedrock ](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazonbedrock.html#amazonbedrock-resources-for-iam-policies) in the *Service Authorization Reference*. To learn with which actions you can specify the ARN of each resource, see [Actions defined by Amazon Bedrock ](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazonbedrock.html#amazonbedrock-actions-as-permissions).





Some Amazon Bedrock API actions support multiple resources. For example, [AssociateAgentKnowledgeBase](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_agent_AssociateAgentKnowledgeBase.html) accesses *AGENT12345* and *KB12345678*, so a principal must have permissions to access both resources. To specify multiple resources in a single statement, separate the ARNs with commas. 

```
"Resource": [
   "arn:aws:bedrock:aws-region:111122223333:agent/AGENT12345",
   "arn:aws:bedrock:aws-region:111122223333:knowledge-base/KB12345678"
]
```
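When a script grants a principal permissions over an agent and its knowledge base, building both ARNs from the same Region and account ID avoids mismatches. A minimal sketch (the helper name is illustrative):

```python
def agent_kb_resources(region, account_id, agent_id, kb_id):
    """Build the ARNs for the agent and knowledge base resources that
    AssociateAgentKnowledgeBase needs permissions on."""
    base = f"arn:aws:bedrock:{region}:{account_id}"
    return [f"{base}:agent/{agent_id}", f"{base}:knowledge-base/{kb_id}"]

print(agent_kb_resources("us-east-1", "111122223333", "AGENT12345", "KB12345678"))
```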

To view examples of Amazon Bedrock identity-based policies, see [Identity-based policy examples for Amazon Bedrock](security_iam_id-based-policy-examples.md).

## Policy condition keys for Amazon Bedrock
<a name="security_iam_service-with-iam-id-based-policies-conditionkeys"></a>

**Supports service-specific policy condition keys:** Yes

Administrators can use AWS JSON policies to specify who has access to what. That is, which **principal** can perform **actions** on what **resources**, and under what **conditions**.

The `Condition` element specifies when statements execute based on defined criteria. You can create conditional expressions that use [condition operators](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_elements_condition_operators.html), such as equals or less than, to match the condition in the policy with values in the request. To see all AWS global condition keys, see [AWS global condition context keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html) in the *IAM User Guide*.

To see a list of Amazon Bedrock condition keys, see [Condition Keys for Amazon Bedrock](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazonbedrock.html#amazonbedrock-policy-keys) in the *Service Authorization Reference*. To learn which actions and resources you can use a condition key with, see [Actions defined by Amazon Bedrock](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazonbedrock.html#amazonbedrock-actions-as-permissions).

All Amazon Bedrock actions support condition keys using Amazon Bedrock models as the resource.

To view examples of Amazon Bedrock identity-based policies, see [Identity-based policy examples for Amazon Bedrock](security_iam_id-based-policy-examples.md).

## ACLs in Amazon Bedrock
<a name="security_iam_service-with-iam-acls"></a>

**Supports ACLs:** No 

Access control lists (ACLs) control which principals (account members, users, or roles) have permissions to access a resource. ACLs are similar to resource-based policies, although they do not use the JSON policy document format.

## ABAC with Amazon Bedrock
<a name="security_iam_service-with-iam-tags"></a>

**Supports ABAC (tags in policies):** Yes

Attribute-based access control (ABAC) is an authorization strategy that defines permissions based on attributes called tags. You can attach tags to IAM entities and AWS resources, then design ABAC policies to allow operations when the principal's tag matches the tag on the resource.

To control access based on tags, you provide tag information in the [condition element](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_elements_condition.html) of a policy using the `aws:ResourceTag/key-name`, `aws:RequestTag/key-name`, or `aws:TagKeys` condition keys.

If a service supports all three condition keys for every resource type, then the value is **Yes** for the service. If a service supports all three condition keys for only some resource types, then the value is **Partial**.

For more information about ABAC, see [Define permissions with ABAC authorization](https://docs.aws.amazon.com/IAM/latest/UserGuide/introduction_attribute-based-access-control.html) in the *IAM User Guide*. To view a tutorial with steps for setting up ABAC, see [Use attribute-based access control (ABAC)](https://docs.aws.amazon.com/IAM/latest/UserGuide/tutorial_attribute-based-access-control.html) in the *IAM User Guide*.
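As a sketch of this pattern, the following statement allows a principal to view an agent only when the agent's tag matches the principal's tag. The `project` tag key is illustrative, not a required key; `aws:ResourceTag` and `aws:PrincipalTag` are standard AWS global condition keys:

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "MatchProjectTag",
            "Effect": "Allow",
            "Action": "bedrock:GetAgent",
            "Resource": "*",
            "Condition": {
                "StringEquals": {
                    "aws:ResourceTag/project": "${aws:PrincipalTag/project}"
                }
            }
        }
    ]
}
```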

## Using temporary credentials with Amazon Bedrock
<a name="security_iam_service-with-iam-roles-tempcreds"></a>

**Supports temporary credentials:** Yes

Temporary credentials provide short-term access to AWS resources and are automatically created when you use federation or switch roles. AWS recommends that you dynamically generate temporary credentials instead of using long-term access keys. For more information, see [Temporary security credentials in IAM](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_temp.html) and [AWS services that work with IAM](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_aws-services-that-work-with-iam.html) in the *IAM User Guide*.

## Cross-service principal permissions for Amazon Bedrock
<a name="security_iam_service-with-iam-principal-permissions"></a>

**Supports forward access sessions (FAS):** Yes

Forward access sessions (FAS) use the permissions of the principal calling an AWS service, combined with the requesting AWS service's permissions, to make requests to downstream services. For policy details when making FAS requests, see [Forward access sessions](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_forward_access_sessions.html).

## Service roles for Amazon Bedrock
<a name="security_iam_service-with-iam-roles-service"></a>

**Supports service roles:** Yes

 A service role is an [IAM role](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles.html) that a service assumes to perform actions on your behalf. An IAM administrator can create, modify, and delete a service role from within IAM. For more information, see [Create a role to delegate permissions to an AWS service](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-service.html) in the *IAM User Guide*. 

**Warning**  
Changing the permissions for a service role might break Amazon Bedrock functionality. Edit service roles only when Amazon Bedrock provides guidance to do so.

## Service-linked roles for Amazon Bedrock
<a name="security_iam_service-with-iam-roles-service-linked"></a>

**Supports service-linked roles:** No 

 A service-linked role is a type of service role that is linked to an AWS service. The service can assume the role to perform an action on your behalf. Service-linked roles appear in your AWS account and are owned by the service. An IAM administrator can view, but not edit the permissions for service-linked roles. 

# Identity-based policy examples for Amazon Bedrock
<a name="security_iam_id-based-policy-examples"></a>

By default, users and roles don't have permission to create or modify Amazon Bedrock resources. To grant users permission to perform actions on the resources that they need, an IAM administrator can create IAM policies.

To learn how to create an IAM identity-based policy by using these example JSON policy documents, see [Create IAM policies (console)](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_create-console.html) in the *IAM User Guide*.

For details about actions and resource types defined by Amazon Bedrock, including the format of the ARNs for each of the resource types, see [Actions, Resources, and Condition Keys for Amazon Bedrock ](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazonbedrock.html) in the *Service Authorization Reference*.

**Topics**
+ [Policy best practices](#security_iam_service-with-iam-policy-best-practices)
+ [Use the Amazon Bedrock console](#security_iam_id-based-policy-examples-console)
+ [Allow users to view their own permissions](#security_iam_id-based-policy-examples-view-own-permissions)
+ [Deny access for inference of foundation models](#security_iam_id-based-policy-examples-deny-inference)
+ [Allow users to invoke a provisioned model](#security_iam_id-based-policy-examples-perform-actions-pt)
+ [Identity-based policy examples for Amazon Bedrock Agents](security_iam_id-based-policy-examples-agent.md)

## Policy best practices
<a name="security_iam_service-with-iam-policy-best-practices"></a>

Identity-based policies determine whether someone can create, access, or delete Amazon Bedrock resources in your account. These actions can incur costs for your AWS account. When you create or edit identity-based policies, follow these guidelines and recommendations:
+ **Get started with AWS managed policies and move toward least-privilege permissions** – To get started granting permissions to your users and workloads, use the *AWS managed policies* that grant permissions for many common use cases. They are available in your AWS account. We recommend that you reduce permissions further by defining AWS customer managed policies that are specific to your use cases. For more information, see [AWS managed policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_managed-vs-inline.html#aws-managed-policies) or [AWS managed policies for job functions](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_job-functions.html) in the *IAM User Guide*.
+ **Apply least-privilege permissions** – When you set permissions with IAM policies, grant only the permissions required to perform a task. You do this by defining the actions that can be taken on specific resources under specific conditions, also known as *least-privilege permissions*. For more information about using IAM to apply permissions, see [ Policies and permissions in IAM](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies.html) in the *IAM User Guide*.
+ **Use conditions in IAM policies to further restrict access** – You can add a condition to your policies to limit access to actions and resources. For example, you can write a policy condition to specify that all requests must be sent using SSL. You can also use conditions to grant access to service actions if they are used through a specific AWS service, such as CloudFormation. For more information, see [ IAM JSON policy elements: Condition](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_elements_condition.html) in the *IAM User Guide*.
+ **Use IAM Access Analyzer to validate your IAM policies to ensure secure and functional permissions** – IAM Access Analyzer validates new and existing policies so that the policies adhere to the IAM policy language (JSON) and IAM best practices. IAM Access Analyzer provides more than 100 policy checks and actionable recommendations to help you author secure and functional policies. For more information, see [Validate policies with IAM Access Analyzer](https://docs.aws.amazon.com/IAM/latest/UserGuide/access-analyzer-policy-validation.html) in the *IAM User Guide*.
+ **Require multi-factor authentication (MFA)** – If you have a scenario that requires IAM users or a root user in your AWS account, turn on MFA for additional security. To require MFA when API operations are called, add MFA conditions to your policies. For more information, see [ Secure API access with MFA](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_mfa_configure-api-require.html) in the *IAM User Guide*.

For more information about best practices in IAM, see [Security best practices in IAM](https://docs.aws.amazon.com/IAM/latest/UserGuide/best-practices.html) in the *IAM User Guide*.

## Use the Amazon Bedrock console
<a name="security_iam_id-based-policy-examples-console"></a>

To access the Amazon Bedrock console, you must have a minimum set of permissions. These permissions must allow you to list and view details about the Amazon Bedrock resources in your AWS account. If you create an identity-based policy that is more restrictive than the minimum required permissions, the console won't function as intended for entities (users or roles) with that policy.

You don't need to allow minimum console permissions for users that are making calls only to the AWS CLI or the AWS API. Instead, allow access to only the actions that match the API operation that they're trying to perform.
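For example (an illustrative minimal policy, not an official managed policy), an identity that only lists available models from the AWS CLI needs nothing more than:

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ListModelsOnly",
            "Effect": "Allow",
            "Action": "bedrock:ListFoundationModels",
            "Resource": "*"
        }
    ]
}
```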

To ensure that users and roles can still use the Amazon Bedrock console, also attach the Amazon Bedrock [AmazonBedrockFullAccess](security-iam-awsmanpol.md#security-iam-awsmanpol-AmazonBedrockFullAccess) or [AmazonBedrockReadOnly](security-iam-awsmanpol.md#security-iam-awsmanpol-AmazonBedrockReadOnly) AWS managed policy to the entities. For more information, see [Adding permissions to a user](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_users_change-permissions.html#users_change_permissions-add-console) in the *IAM User Guide*.

## Allow users to view their own permissions
<a name="security_iam_id-based-policy-examples-view-own-permissions"></a>

This example shows how you might create a policy that allows IAM users to view the inline and managed policies that are attached to their user identity. This policy includes permissions to complete this action on the console or programmatically using the AWS CLI or AWS API.

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ViewOwnUserInfo",
            "Effect": "Allow",
            "Action": [
                "iam:GetUserPolicy",
                "iam:ListGroupsForUser",
                "iam:ListAttachedUserPolicies",
                "iam:ListUserPolicies",
                "iam:GetUser"
            ],
            "Resource": ["arn:aws:iam::*:user/${aws:username}"]
        },
        {
            "Sid": "NavigateInConsole",
            "Effect": "Allow",
            "Action": [
                "iam:GetGroupPolicy",
                "iam:GetPolicyVersion",
                "iam:GetPolicy",
                "iam:ListAttachedGroupPolicies",
                "iam:ListGroupPolicies",
                "iam:ListPolicyVersions",
                "iam:ListPolicies",
                "iam:ListUsers"
            ],
            "Resource": "*"
        }
    ]
}
```

## Deny access for inference of foundation models
<a name="security_iam_id-based-policy-examples-deny-inference"></a>

To prevent a user from invoking foundation models, deny access to the API actions that invoke models directly. The following example shows an identity-based policy that denies access to running inference on a specific model. This policy can be used as a service control policy (SCP) to control model access across an organization.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": {
        "Sid": "DenyInference",
        "Effect": "Deny",
        "Action": [
            "bedrock:InvokeModel",
            "bedrock:InvokeModelWithResponseStream",
            "bedrock:CreateModelInvocationJob"
        ],
        "Resource": "arn:aws:bedrock:*::foundation-model/model-id"
    }
}
```

------

To deny inference access to all foundation models, use `*` for the model ID. Other actions, such as `Converse` and `StartAsyncInvoke`, are blocked automatically when `InvokeModel` is denied. For a list of model IDs, see [Supported foundation models in Amazon Bedrock](models-supported.md).
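For example, following the ARN pattern in the policy above, replacing the model ID with `*` in the `Resource` denies inference on every foundation model:

```
"Resource": "arn:aws:bedrock:*::foundation-model/*"
```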

## Allow users to invoke a provisioned model
<a name="security_iam_id-based-policy-examples-perform-actions-pt"></a>

The following is a sample policy that you can attach to an IAM role to allow it to use a provisioned model in model inference. For example, you could attach this policy to a role that you want to only have permissions to use a provisioned model. The role won't be able to manage or see information about the Provisioned Throughput.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ProvisionedThroughputModelInvocation",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel",
                "bedrock:InvokeModelWithResponseStream"
            ],
            "Resource": "arn:aws:bedrock:us-east-1:123456789012:provisioned-model/my-provisioned-model"
        }
    ]
}
```

------

# Identity-based policy examples for Amazon Bedrock Agents
<a name="security_iam_id-based-policy-examples-agent"></a>

Select a topic to see example IAM policies that you can attach to an IAM role to provision permissions for actions in [Automate tasks in your application using AI agents](agents.md).

**Topics**
+ [Required permissions for Amazon Bedrock Agents](#iam-agents-ex-all)
+ [Allow users to view information about and invoke an agent](#security_iam_id-based-policy-examples-perform-actions-agent)
+ [Control access to service tiers](#security_iam_id-based-policy-examples-service-tiers)

## Required permissions for Amazon Bedrock Agents
<a name="iam-agents-ex-all"></a>

For an IAM identity to use Amazon Bedrock Agents, you must configure it with the necessary permissions. You can attach the [AmazonBedrockFullAccess](security-iam-awsmanpol.md#security-iam-awsmanpol-AmazonBedrockFullAccess) policy to grant the proper permissions to the role.

To restrict permissions to only actions that are used in Amazon Bedrock Agents, attach the following identity-based policy to an IAM role:

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AgentPermissions",
            "Effect": "Allow",
            "Action": [  
                "bedrock:ListFoundationModels",
                "bedrock:GetFoundationModel",
                "bedrock:TagResource", 
                "bedrock:UntagResource", 
                "bedrock:ListTagsForResource", 
                "bedrock:CreateAgent", 
                "bedrock:UpdateAgent", 
                "bedrock:GetAgent", 
                "bedrock:ListAgents", 
                "bedrock:DeleteAgent",
                "bedrock:CreateAgentActionGroup", 
                "bedrock:UpdateAgentActionGroup", 
                "bedrock:GetAgentActionGroup", 
                "bedrock:ListAgentActionGroups", 
                "bedrock:DeleteAgentActionGroup",
                "bedrock:GetAgentVersion",
                "bedrock:ListAgentVersions", 
                "bedrock:DeleteAgentVersion",
                "bedrock:CreateAgentAlias", 
                "bedrock:UpdateAgentAlias",               
                "bedrock:GetAgentAlias",
                "bedrock:ListAgentAliases",
                "bedrock:DeleteAgentAlias",
                "bedrock:AssociateAgentKnowledgeBase",
                "bedrock:DisassociateAgentKnowledgeBase",
                "bedrock:ListAgentKnowledgeBases",
                "bedrock:GetKnowledgeBase",
                "bedrock:ListKnowledgeBases",
                "bedrock:PrepareAgent",
                "bedrock:InvokeAgent",
                "bedrock:AssociateAgentCollaborator",
                "bedrock:DisassociateAgentCollaborator",
                "bedrock:GetAgentCollaborator",
                "bedrock:ListAgentCollaborators",
                "bedrock:UpdateAgentCollaborator"
            ],
            "Resource": "*"
        }
    ]   
}
```

------

You can further restrict permissions by omitting [actions](security_iam_service-with-iam.md#security_iam_service-with-iam-id-based-policies-actions) or specifying [resources](security_iam_service-with-iam.md#security_iam_service-with-iam-id-based-policies-resources) and [condition keys](security_iam_service-with-iam.md#security_iam_service-with-iam-id-based-policies-conditionkeys). An IAM identity can call API operations on specific resources. For example, the [UpdateAgent](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_agent_UpdateAgent.html) operation can only be used on agent resources, and the [InvokeAgent](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_agent-runtime_InvokeAgent.html) operation can only be used on alias resources. For API operations that aren't used on a specific resource type (such as [CreateAgent](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_agent_CreateAgent.html)), specify `*` as the `Resource`. If you specify an API operation that can't be used on the resource specified in the policy, Amazon Bedrock returns an error.

## Allow users to view information about and invoke an agent
<a name="security_iam_id-based-policy-examples-perform-actions-agent"></a>

The following is a sample policy that you can attach to an IAM role to allow it to view information about or edit an agent with the ID *AGENT12345* and to interact with its alias with the ID *ALIAS12345*. For example, you could attach this policy to a role that you want to only have permissions to troubleshoot an agent and update it.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "GetAndUpdateAgent",
            "Effect": "Allow",
            "Action": [
                "bedrock:GetAgent",
                "bedrock:UpdateAgent"
            ],
            "Resource": "arn:aws:bedrock:us-east-1:123456789012:agent/AGENT12345"
        },
        {
            "Sid": "InvokeAgent",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeAgent"
            ],
            "Resource": "arn:aws:bedrock:us-east-1:123456789012:agent-alias/AGENT12345/ALIAS12345"
        }
    ]
}
```

------

## Control access to service tiers
<a name="security_iam_id-based-policy-examples-service-tiers"></a>

Amazon Bedrock service tiers provide different levels of processing priority and pricing for inference requests. By default, all service tiers (priority, default, and flex) are available to any user with the required Amazon Bedrock permissions; access is granted unless you explicitly restrict it.

However, organizations may want to control which service tiers their users can access to manage costs or enforce usage policies. You can implement access restrictions by using IAM policies with the `bedrock:ServiceTier` condition key to deny access to specific service tiers. This approach allows you to maintain granular control over which team members can use premium service tiers like "priority" or cost-optimized tiers like "flex".

The following example shows an identity-based policy that denies access to all service tiers. This type of policy is useful when you want to prevent users from specifying any service tier, forcing them to use the system default behavior:

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Deny",
            "Action": "bedrock:InvokeModel",
            "Resource": "*",
            "Condition": {
                "StringEquals": {
                    "bedrock:ServiceTier": ["reserved", "priority", "default", "flex"]
                }
            }
        }
    ]
}
```

You can customize this policy to deny access to only specific service tiers by modifying the `bedrock:ServiceTier` condition values. For example, to deny only the premium "priority" tier while allowing "default" and "flex", you would specify only `["priority"]` in the condition. This flexible approach allows you to implement usage policies that align with your organization's cost management and operational requirements. For more information about service tiers, see [Service tiers for optimizing performance and cost](service-tiers-inference.md).
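If you generate such policies programmatically, a small helper along these lines keeps the tier list in one place. This is a sketch: the function name is illustrative, and the tier strings and `bedrock:ServiceTier` condition key follow the example above:

```python
import json

def deny_service_tiers_policy(tiers):
    """Build an identity-based policy document that denies
    bedrock:InvokeModel for the given service tiers
    (for example, ["priority"])."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Deny",
                "Action": "bedrock:InvokeModel",
                "Resource": "*",
                "Condition": {
                    "StringEquals": {"bedrock:ServiceTier": list(tiers)}
                }
            }
        ]
    }

# Deny only the premium "priority" tier
print(json.dumps(deny_service_tiers_policy(["priority"]), indent=4))
```

You could pass the resulting document to `iam.create_policy`, as in the project examples later in this guide.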

# Managing IAM policies on Projects
<a name="security-iam-projects"></a>

You can control access to Amazon Bedrock Projects by creating IAM policies that are scoped to individual project resources. This lets you manage permissions at the project level as an alternative to granting broad Amazon Bedrock access on IAM users and roles.

## Understanding Project-Level IAM Policies
<a name="security-iam-projects-understanding"></a>

Project-level IAM policies allow you to:
+ **Centralize access control**: Scope permissions to a specific project resource
+ **Simplify management**: Update one shared policy instead of editing individual user and role policies
+ **Audit easily**: View all permissions for a project in one place
+ **Delegate administration**: Allow project owners to manage access to their projects

## Attaching IAM Policies to Projects
<a name="security-iam-projects-attaching"></a>

### Attach a Policy to Grant Access
<a name="security-iam-projects-attach-grant"></a>

Create an identity-based policy scoped to a project's ARN, then attach it to the users and roles that need access:

```
import boto3
import json

iam = boto3.client('iam', region_name='us-east-1')

project_arn = "arn:aws:bedrock-mantle:us-east-1:123456789012:project/proj_abc123"

# Define the identity-based policy document
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowTeamAlphaAccess",
            "Effect": "Allow",
            "Action": [
                "bedrock-mantle:ListTagsForResources",
                "bedrock-mantle:GetProject"
            ],
            "Resource": project_arn
        }
    ]
}

policy_json = json.dumps(policy_document)

# Create a managed policy
create_response = iam.create_policy(
    PolicyName="TeamAlphaAccessPolicy",
    PolicyDocument=policy_json,
    Description="Grants Team Alpha read access to the Bedrock project"
)

policy_arn = create_response['Policy']['Arn']
print(f"Policy created: {policy_arn}")

# Attach the policy to alice (IAM user)
iam.attach_user_policy(
    UserName="alice",
    PolicyArn=policy_arn
)
print("Policy attached to alice")

# Attach the policy to bob (IAM user)
iam.attach_user_policy(
    UserName="bob",
    PolicyArn=policy_arn
)
print("Policy attached to bob")

# Attach the policy to TeamAlphaRole (IAM role)
iam.attach_role_policy(
    RoleName="TeamAlphaRole",
    PolicyArn=policy_arn
)
print("Policy attached to TeamAlphaRole")
```

### Grant Full Project Access to a Team
<a name="security-iam-projects-full-access"></a>

Allow a team full access to manage and use a project:

```
import boto3
import json

iam = boto3.client('iam', region_name='us-east-1')

project_arn = "arn:aws:bedrock-mantle:us-east-1:123456789012:project/proj_abc123"

# Identity-based policy — no Principal block needed
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "FullProjectAccess",
            "Effect": "Allow",
            "Action": "bedrock-mantle:*",
            "Resource": project_arn
        }
    ]
}

# Create a managed policy
create_response = iam.create_policy(
    PolicyName="DataScienceFullAccess",
    PolicyDocument=json.dumps(policy_document),
    Description="Grants DataScienceTeamRole full access to the Bedrock project"
)

policy_arn = create_response['Policy']['Arn']
print(f"Policy created: {policy_arn}")

# Attach to the DataScienceTeamRole
iam.attach_role_policy(
    RoleName="DataScienceTeamRole",
    PolicyArn=policy_arn
)

print("Full access policy attached to DataScienceTeamRole")
```

### Grant Read-Only Access
<a name="security-iam-projects-readonly"></a>

Attach a policy that allows viewing project details and making inference requests only:

```
import boto3
import json

iam = boto3.client('iam', region_name='us-east-1')

project_arn = "arn:aws:bedrock-mantle:us-east-1:123456789012:project/proj_abc123"

# Identity-based policy — no Principal block needed
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ReadOnlyAccess",
            "Effect": "Allow",
            "Action": [
                "bedrock-mantle:CreateInference",
                "bedrock-mantle:GetProject",
                "bedrock-mantle:ListProjects",
                "bedrock-mantle:ListTagsForResources"
            ],
            "Resource": project_arn
        }
    ]
}

# Create a managed policy
create_response = iam.create_policy(
    PolicyName="ReadOnlyAccessPolicy",
    PolicyDocument=json.dumps(policy_document),
    Description="Grants viewer1 and viewer2 read-only access to the Bedrock project"
)

policy_arn = create_response['Policy']['Arn']
print(f"Policy created: {policy_arn}")

# Attach to viewer1
iam.attach_user_policy(
    UserName="viewer1",
    PolicyArn=policy_arn
)
print("Policy attached to viewer1")

# Attach to viewer2
iam.attach_user_policy(
    UserName="viewer2",
    PolicyArn=policy_arn
)
print("Policy attached to viewer2")
```







# AWS managed policies for Amazon Bedrock
<a name="security-iam-awsmanpol"></a>

To add permissions to users, groups, and roles, it's easier to use AWS managed policies than to write policies yourself. It takes time and expertise to [create IAM customer managed policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_create-console.html) that provide your team with only the permissions they need. To get started quickly, you can use our AWS managed policies. These policies cover common use cases and are available in your AWS account.

For a list of AWS managed policies, see [AWS managed policies](https://docs.aws.amazon.com/aws-managed-policy/latest/reference/policy-list.html) in the *AWS Managed Policy Reference*. For more information about AWS managed policies, see [AWS managed policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_managed-vs-inline.html#aws-managed-policies) in the *IAM User Guide*.

AWS services maintain and update AWS managed policies. You can't change the permissions in AWS managed policies. Services occasionally add additional permissions to an AWS managed policy to support new features. This type of update affects all identities (users, groups, and roles) where the policy is attached. Services are most likely to update an AWS managed policy when a new feature is launched or when new operations become available. Services do not remove permissions from an AWS managed policy, so policy updates won't break your existing permissions.

Additionally, AWS supports managed policies for job functions that span multiple services. For example, the **ReadOnlyAccess** AWS managed policy provides read-only access to all AWS services and resources. When a service launches a new feature, AWS adds read-only permissions for new operations and resources. For a list and descriptions of job function policies, see [AWS managed policies for job functions](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_job-functions.html) in the *IAM User Guide*.

**Topics**
+ [AWS managed policy: AmazonBedrockFullAccess](#security-iam-awsmanpol-AmazonBedrockFullAccess)
+ [AWS managed policy: AmazonBedrockReadOnly](#security-iam-awsmanpol-AmazonBedrockReadOnly)
+ [AWS managed policy: AmazonBedrockLimitedAccess](#security-iam-awsmanpol-AmazonBedrockLimitedAccess)
+ [AWS managed policy: AmazonBedrockMarketplaceAccess](#security-iam-awsmanpol-AmazonBedrockMarketplaceAccess)
+ [AWS managed policy: AmazonBedrockMantleFullAccess](#security-iam-awsmanpol-AmazonBedrockMantleFullAccess)
+ [AWS managed policy: AmazonBedrockMantleReadOnly](#security-iam-awsmanpol-AmazonBedrockMantleReadOnly)
+ [AWS managed policy: AmazonBedrockMantleInferenceAccess](#security-iam-awsmanpol-AmazonBedrockMantleInferenceAccess)
+ [Amazon Bedrock updates to AWS managed policies](#security-iam-awsmanpol-updates)

## AWS managed policy: AmazonBedrockFullAccess
<a name="security-iam-awsmanpol-AmazonBedrockFullAccess"></a>

You can attach the [AmazonBedrockFullAccess](https://docs.aws.amazon.com/aws-managed-policy/latest/reference/AmazonBedrockFullAccess.html) policy to your IAM identities to grant administrative permissions that allow a user to create, read, update, and delete Amazon Bedrock resources.

**Permissions details**

This policy includes the following permissions:
+  `ec2` (Amazon Elastic Compute Cloud) – Allows permissions to describe VPCs, subnets, and security groups. 
+  `iam` (AWS Identity and Access Management) – Allows principals to pass IAM roles to the Amazon Bedrock service, but only roles with "Amazon Bedrock" in their names. The permissions are restricted to `bedrock.amazonaws.com` for Amazon Bedrock operations. 
+  `kms` (AWS Key Management Service) – Allows principals to describe AWS KMS keys and aliases. 
+  `bedrock` (Amazon Bedrock) – Allows principals read and write access to all actions in the Amazon Bedrock control plane and runtime service. 
+  `sagemaker` (Amazon SageMaker AI) – Allows principals to access the Amazon SageMaker AI resources in the customer's account, which serves as the foundation for the Amazon Bedrock Marketplace feature. 

------
#### [ JSON ]

****  

```
{
     "Version": "2012-10-17",
     "Statement": [
         {
             "Sid": "BedrockAll",
             "Effect": "Allow",
             "Action": [
                 "bedrock:*"
             ],
             "Resource": "*"
         },
         {
             "Sid": "DescribeKey",
             "Effect": "Allow",
             "Action": [
                 "kms:DescribeKey"
             ],
             "Resource": "arn:*:kms:*:::*"
         },
         {
             "Sid": "APIsWithAllResourceAccess",
             "Effect": "Allow",
             "Action": [
                 "iam:ListRoles",
                 "ec2:DescribeVpcs",
                 "ec2:DescribeSubnets",
                 "ec2:DescribeSecurityGroups"
             ],
             "Resource": "*"
         },
         {
             "Sid": "MarketplaceModelEndpointMutatingAPIs",
             "Effect": "Allow",
             "Action": [
                 "sagemaker:CreateEndpoint",
                 "sagemaker:CreateEndpointConfig",
                 "sagemaker:CreateModel",
                 "sagemaker:DeleteEndpoint",
                 "sagemaker:UpdateEndpoint"
             ],
             "Resource": [
                 "arn:aws:sagemaker:*:*:endpoint/*",
                 "arn:aws:sagemaker:*:*:endpoint-config/*",
                 "arn:aws:sagemaker:*:*:model/*"
             ],
             "Condition": {
                 "StringEquals": {
                     "aws:CalledViaLast": "bedrock.amazonaws.com",
                     "aws:ResourceTag/sagemaker-sdk:bedrock": "compatible"
                 }
             }
         },
         {
             "Sid": "MarketplaceModelEndpointAddTagsOperations",
             "Effect": "Allow",
             "Action": [
                 "sagemaker:AddTags"
             ],
             "Resource": [
                 "arn:aws:sagemaker:*:*:endpoint/*",
                 "arn:aws:sagemaker:*:*:endpoint-config/*",
                 "arn:aws:sagemaker:*:*:model/*"
             ],
             "Condition": {
                 "ForAllValues:StringEquals": {
                     "aws:TagKeys": [
                         "sagemaker-sdk:bedrock",
                         "bedrock:marketplace-registration-status",
                         "sagemaker-studio:hub-content-arn"
                     ]
                 },
                 "StringLike": {
                     "aws:RequestTag/sagemaker-sdk:bedrock": "compatible",
                     "aws:RequestTag/bedrock:marketplace-registration-status": "registered",
                     "aws:RequestTag/sagemaker-studio:hub-content-arn": "arn:aws:sagemaker:*:aws:hub-content/SageMakerPublicHub/Model/*"
                 }
             }
         },
         {
             "Sid": "MarketplaceModelEndpointDeleteTagsOperations",
             "Effect": "Allow",
             "Action": [
                 "sagemaker:DeleteTags"
             ],
             "Resource": [
                 "arn:aws:sagemaker:*:*:endpoint/*",
                 "arn:aws:sagemaker:*:*:endpoint-config/*",
                 "arn:aws:sagemaker:*:*:model/*"
             ],
             "Condition": {
                 "ForAllValues:StringEquals": {
                     "aws:TagKeys": [
                         "sagemaker-sdk:bedrock",
                         "bedrock:marketplace-registration-status",
                         "sagemaker-studio:hub-content-arn"
                     ]
                 },
                 "StringLike": {
                     "aws:ResourceTag/sagemaker-sdk:bedrock": "compatible",
                     "aws:ResourceTag/bedrock:marketplace-registration-status": "registered",
                     "aws:ResourceTag/sagemaker-studio:hub-content-arn": "arn:aws:sagemaker:*:aws:hub-content/SageMakerPublicHub/Model/*"
                 }
             }
         },
         {
             "Sid": "MarketplaceModelEndpointNonMutatingAPIs",
             "Effect": "Allow",
             "Action": [
                 "sagemaker:DescribeEndpoint",
                 "sagemaker:DescribeEndpointConfig",
                 "sagemaker:DescribeModel",
                 "sagemaker:DescribeInferenceComponent",
                 "sagemaker:ListEndpoints",
                 "sagemaker:ListTags"
             ],
             "Resource": [
                 "arn:aws:sagemaker:*:*:endpoint/*",
                 "arn:aws:sagemaker:*:*:endpoint-config/*",
                 "arn:aws:sagemaker:*:*:model/*"
             ],
             "Condition": {
                 "StringEquals": {
                     "aws:CalledViaLast": "bedrock.amazonaws.com"
                 }
             }
         },
         {
             "Sid": "MarketplaceModelEndpointInvokingOperations",
             "Effect": "Allow",
             "Action": [
                 "sagemaker:InvokeEndpoint",
                 "sagemaker:InvokeEndpointWithResponseStream"
             ],
             "Resource": [
                 "arn:aws:sagemaker:*:*:endpoint/*"
             ],
             "Condition": {
                 "StringEquals": {
                     "aws:CalledViaLast": "bedrock.amazonaws.com",
                     "aws:ResourceTag/sagemaker-sdk:bedrock": "compatible"
                 }
             }
         },
         {
             "Sid": "DiscoveringMarketplaceModel",
             "Effect": "Allow",
             "Action": [
                 "sagemaker:DescribeHubContent"
             ],
             "Resource": [
                 "arn:aws:sagemaker:*:aws:hub-content/SageMakerPublicHub/Model/*",
                 "arn:aws:sagemaker:*:aws:hub/SageMakerPublicHub"
             ]
         },
         {
             "Sid": "AllowMarketplaceModelsListing",
             "Effect": "Allow",
             "Action": [
                 "sagemaker:ListHubContents"
             ],
             "Resource": "arn:aws:sagemaker:*:aws:hub/SageMakerPublicHub"
         },
         {
             "Sid": "PassRoleToSageMaker",
             "Effect": "Allow",
             "Action": [
                 "iam:PassRole"
             ],
             "Resource": [
                 "arn:aws:iam::*:role/*SageMaker*ForBedrock*"
             ],
             "Condition": {
                 "StringEquals": {
                     "iam:PassedToService": [
                         "sagemaker.amazonaws.com",
                         "bedrock.amazonaws.com"
                     ]
                 }
             }
         },
         {
             "Sid": "PassRoleToBedrock",
             "Effect": "Allow",
             "Action": [
                 "iam:PassRole"
             ],
             "Resource": "arn:aws:iam::*:role/*AmazonBedrock*",
             "Condition": {
                 "StringEquals": {
                     "iam:PassedToService": [
                         "bedrock.amazonaws.com"
                     ]
                 }
             }
         },
         {
             "Sid": "MarketplaceOperationsFromBedrockFor3pModels",
             "Effect": "Allow",
             "Action": [
                 "aws-marketplace:Subscribe",
                 "aws-marketplace:ViewSubscriptions",
                 "aws-marketplace:Unsubscribe"
             ],
             "Resource": "*",
             "Condition": {
                 "StringEquals": {
                     "aws:CalledViaLast": "bedrock.amazonaws.com"
                 }
             }
         }
     ]
 }
```

------

## AWS managed policy: AmazonBedrockReadOnly
<a name="security-iam-awsmanpol-AmazonBedrockReadOnly"></a>

You can attach the [AmazonBedrockReadOnly](https://docs.aws.amazon.com/aws-managed-policy/latest/reference/AmazonBedrockReadOnly.html) policy to your IAM identities to grant read-only permissions to view all resources in Amazon Bedrock.

## AWS managed policy: AmazonBedrockLimitedAccess
<a name="security-iam-awsmanpol-AmazonBedrockLimitedAccess"></a>

You can attach the [AmazonBedrockLimitedAccess](https://docs.aws.amazon.com/aws-managed-policy/latest/reference/AmazonBedrockLimitedAccess.html) policy to your IAM identities to grant access to Amazon Bedrock operations, AWS KMS key information, networking resources, and AWS Marketplace subscriptions for third-party foundation models. The policy includes the following statements:
+ The `BedrockAPIs` statement allows you to perform several operations in Amazon Bedrock, including:
  + Passing the Amazon Bedrock API key when making API requests to the Amazon Bedrock service.
  + Describing information about resources.
  + Creating resources (guardrails, models, jobs).
  + Creating, building, refining, and testing Automated Reasoning policies.
  + Deleting resources.
  + Invoking models on all resources.
+ The `DescribeKey` statement allows you to view information about KMS keys across all Regions and accounts, as long as the policies on the keys permit you to do so.
+ The `APIsWithAllResourceAccess` statement allows you to:
  + List IAM roles.
  + Describe Amazon VPC resources (VPCs, subnets, and security groups) across all resources.
+ The `MarketplaceOperationsFromBedrockFor3pModels` statement enables you to:
  + Subscribe to AWS Marketplace offerings.
  + View subscriptions.
  + Unsubscribe from AWS Marketplace offerings.
**Note**  
The condition key `aws:CalledViaLast` restricts these actions to only when they are called through the Amazon Bedrock service.

## AWS managed policy: AmazonBedrockMarketplaceAccess
<a name="security-iam-awsmanpol-AmazonBedrockMarketplaceAccess"></a>

You can attach the [AmazonBedrockMarketplaceAccess](https://docs.aws.amazon.com/aws-managed-policy/latest/reference/AmazonBedrockMarketplaceAccess.html) policy to your IAM identities to grant permissions to manage and use Amazon Bedrock Marketplace model endpoints through SageMaker AI integration. The policy includes the following statements:
+ The `BedrockMarketplaceAPIs` statement allows you to create, delete, register, deregister, and update marketplace model endpoints in Amazon Bedrock on all resources.
+ The `MarketplaceModelEndpointMutatingAPIs` statement allows you to create and manage SageMaker AI endpoints, endpoint configurations, and models on specified resources.
  + Use the `aws:CalledViaLast` condition key to ensure that these actions are only performed when called through Bedrock.
  + Use the `aws:ResourceTag/sagemaker-sdk:bedrock` condition key to ensure that these actions are only performed on resources tagged as Amazon Bedrock-compatible.
+ The `MarketplaceModelEndpointAddTagsOperations` statement allows adding specific tags to SageMaker AI endpoints, endpoint configurations, and models on specified resources.
  + Use the `aws:TagKeys` condition key to restrict which tags can be added.
  + Use the `aws:RequestTag/*` condition keys to ensure that tag values match specified patterns.
+ The `MarketplaceModelEndpointDeleteTagsOperations` statement allows deleting specific tags from SageMaker AI endpoints, endpoint configurations, and models on specified resources.
  + Use the `aws:TagKeys` condition key to restrict which tags can be deleted.
  + Use the `aws:ResourceTag/*` condition keys to ensure that deleted tags match specified patterns.
+ The `MarketplaceModelEndpointNonMutatingAPIs` statement allows viewing and describing SageMaker AI endpoints, endpoint configurations, and models on specified resources.
  + Use the `aws:CalledViaLast` condition key to ensure that actions are only performed through the Amazon Bedrock service.
+ The `MarketplaceModelEndpointInvokingOperations` statement allows invoking SageMaker AI endpoints on specified resources.
  + Use the `aws:CalledViaLast` condition key to ensure that actions are only performed through the Amazon Bedrock service.
  + Use the `aws:ResourceTag/sagemaker-sdk:bedrock` condition key to ensure that actions are only performed on Amazon Bedrock-compatible resources.
+ The `DiscoveringMarketplaceModel` statement allows describing SageMaker AI hub content on specified resources.
+ The `AllowMarketplaceModelsListing` statement allows listing SageMaker AI hub contents on specified resources.
+ The `PassRoleToSageMaker` statement allows passing IAM roles to SageMaker AI and Amazon Bedrock on specified resources.
  + Use the `iam:PassedToService` condition key to ensure that roles are only passed to the specified services.
+ The `PassRoleToBedrock` statement allows you to pass specific IAM roles to Amazon Bedrock on specified resources.
  + Use the `iam:PassedToService` condition key to ensure that roles are only passed to the Amazon Bedrock service.
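The tagging statements above pair `ForAllValues:StringEquals` on `aws:TagKeys` — every key in the request must appear in the allowed list — with per-key value conditions. The key check alone can be sketched locally; this is a simplified approximation of the IAM semantics, not the real evaluation:

```python
# Allowed tag keys from the AddTags/DeleteTags statements in the policy.
ALLOWED_TAG_KEYS = {
    "sagemaker-sdk:bedrock",
    "bedrock:marketplace-registration-status",
    "sagemaker-studio:hub-content-arn",
}

def tag_keys_allowed(requested_keys) -> bool:
    """ForAllValues:StringEquals semantics for aws:TagKeys: the request
    passes only if every requested key is in the allowed set (an empty
    request trivially satisfies the ForAllValues check)."""
    return set(requested_keys).issubset(ALLOWED_TAG_KEYS)

print(tag_keys_allowed(["sagemaker-sdk:bedrock"]))                # True
print(tag_keys_allowed(["sagemaker-sdk:bedrock", "team:owner"]))  # False
```

Note that a single disallowed key causes the whole tagging request to be denied, which is why Amazon Bedrock Marketplace-managed tags cannot be mixed with arbitrary custom tags in one call under this policy.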

## AWS managed policy: AmazonBedrockMantleFullAccess
<a name="security-iam-awsmanpol-AmazonBedrockMantleFullAccess"></a>

You can attach the [AmazonBedrockMantleFullAccess](https://docs.aws.amazon.com/aws-managed-policy/latest/reference/AmazonBedrockMantleFullAccess.html) policy to your IAM identities to grant full access to all Amazon Bedrock Mantle operations.

**Permissions details**

This policy includes the following permissions:
+ `bedrock-mantle` (Amazon Bedrock Mantle) – Allows principals full access to all actions in the Amazon Bedrock Mantle service.

## AWS managed policy: AmazonBedrockMantleReadOnly
<a name="security-iam-awsmanpol-AmazonBedrockMantleReadOnly"></a>

You can attach the [AmazonBedrockMantleReadOnly](https://docs.aws.amazon.com/aws-managed-policy/latest/reference/AmazonBedrockMantleReadOnly.html) policy to your IAM identities to grant read-only permissions to view Amazon Bedrock Mantle resources and to call the service with a bearer token.

**Permissions details**

This policy includes the following permissions:
+ `bedrock-mantle` (Amazon Bedrock Mantle) – Allows principals to get and list Amazon Bedrock Mantle project resources and to use a bearer token for authentication.

## AWS managed policy: AmazonBedrockMantleInferenceAccess
<a name="security-iam-awsmanpol-AmazonBedrockMantleInferenceAccess"></a>

You can attach the [AmazonBedrockMantleInferenceAccess](https://docs.aws.amazon.com/aws-managed-policy/latest/reference/AmazonBedrockMantleInferenceAccess.html) policy to your IAM identities to grant permissions to run inference on Amazon Bedrock Mantle models.

**Permissions details**

This policy includes the following permissions:
+ `bedrock-mantle` (Amazon Bedrock Mantle) – Allows principals to get and list Amazon Bedrock Mantle project resources, create inference requests, and use a bearer token for authentication.

## Amazon Bedrock updates to AWS managed policies
<a name="security-iam-awsmanpol-updates"></a>

View details about updates to AWS managed policies for Amazon Bedrock since this service began tracking these changes. For automatic alerts about changes to this page, subscribe to the RSS feed on the [Document history for the Amazon Bedrock User Guide](doc-history.md).


| Change | Description | Date | 
| --- | --- | --- | 
|  [AmazonBedrockMantleFullAccess](#security-iam-awsmanpol-AmazonBedrockMantleFullAccess) – New policy  |  Amazon Bedrock added a new policy to grant full access to all Amazon Bedrock Mantle operations.  | December 3, 2025 | 
|  [AmazonBedrockMantleReadOnly](#security-iam-awsmanpol-AmazonBedrockMantleReadOnly) – New policy  |  Amazon Bedrock added a new policy to grant read-only access to Amazon Bedrock Mantle resources.  | December 3, 2025 | 
|  [AmazonBedrockMantleInferenceAccess](#security-iam-awsmanpol-AmazonBedrockMantleInferenceAccess) – New policy  |  Amazon Bedrock added a new policy to grant inference access to Amazon Bedrock Mantle models.  | December 3, 2025 | 
|  [AmazonBedrockFullAccess](#security-iam-awsmanpol-AmazonBedrockFullAccess) – Updated policy  |  Amazon Bedrock updated the AmazonBedrockFullAccess managed policy to enable access to all serverless foundation models by default.  | July 14, 2025 | 
|  [AmazonBedrockMarketplaceAccess](#security-iam-awsmanpol-AmazonBedrockMarketplaceAccess) – New policy  |  Amazon Bedrock added a new policy to grant customers permissions to access Amazon Bedrock Marketplace foundation models through a SageMaker AI endpoint.  | June 13, 2025 | 
|  [AmazonBedrockLimitedAccess](#security-iam-awsmanpol-AmazonBedrockLimitedAccess) – New policy  |  Amazon Bedrock added a new policy to grant customers basic permissions to access core actions in Amazon Bedrock.  | June 13, 2025 | 
|  [AmazonBedrockFullAccess](#security-iam-awsmanpol-AmazonBedrockFullAccess) – Updated policy  |  Amazon Bedrock updated the AmazonBedrockFullAccess managed policy to grant customers the necessary permissions to create, read, update, and delete Amazon Bedrock Marketplace resources. This includes permissions to manage the underlying Amazon SageMaker AI resources, as they serve as the foundation for the Amazon Bedrock Marketplace functionality.  | December 4, 2024 | 
|  [AmazonBedrockReadOnly](#security-iam-awsmanpol-AmazonBedrockReadOnly) – Updated policy  |  Amazon Bedrock updated the AmazonBedrockReadOnly managed policy to grant customers the necessary permissions to read Amazon Bedrock Marketplace resources. This includes permissions to view the underlying Amazon SageMaker AI resources, as they serve as the foundation for the Amazon Bedrock Marketplace functionality.  | December 4, 2024 | 
|  [AmazonBedrockReadOnly](#security-iam-awsmanpol-AmazonBedrockReadOnly) – Updated policy  |  Amazon Bedrock updated the AmazonBedrockReadOnly policy to include read-only permissions for custom model import.  | October 18, 2024 | 
|  [AmazonBedrockReadOnly](#security-iam-awsmanpol-AmazonBedrockReadOnly) – Updated policy  |  Amazon Bedrock updated the AmazonBedrockReadOnly policy to include read-only permissions for Amazon Bedrock Custom Model Import.  | September 3, 2024 | 
|  [AmazonBedrockReadOnly](#security-iam-awsmanpol-AmazonBedrockReadOnly) – Updated policy  |  Amazon Bedrock added inference profile read-only permissions.  | August 27, 2024 | 
|  [AmazonBedrockReadOnly](#security-iam-awsmanpol-AmazonBedrockReadOnly) – Updated policy  |  Amazon Bedrock updated the AmazonBedrockReadOnly policy to include read-only permissions for Amazon Bedrock Guardrails, Amazon Bedrock Model evaluation, and Amazon Bedrock Batch inference.  | August 21, 2024 | 
|  [AmazonBedrockReadOnly](#security-iam-awsmanpol-AmazonBedrockReadOnly) – Updated policy  |  Amazon Bedrock added batch inference (model invocation job) read-only permissions.  | August 21, 2024 | 
|  [AmazonBedrockFullAccess](#security-iam-awsmanpol-AmazonBedrockFullAccess) – New policy  |  Amazon Bedrock added a new policy to give users permissions to create, read, update, and delete resources.  | December 12, 2023 | 
|  [AmazonBedrockReadOnly](#security-iam-awsmanpol-AmazonBedrockReadOnly) – New policy  |  Amazon Bedrock added a new policy to give users read-only permissions for all actions.  | December 12, 2023 | 
|  Amazon Bedrock started tracking changes  |  Amazon Bedrock started tracking changes for its AWS managed policies.  | December 12, 2023 | 

# Service roles
<a name="security-iam-sr"></a>

Amazon Bedrock uses [IAM service roles](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_terms-and-concepts.html#iam-term-service-role) for some features to let Amazon Bedrock carry out tasks on your behalf.

The console automatically creates service roles for supported features.

You can also create a custom service role and customize the attached permissions for your specific use case. If you use the console, you can select this role instead of letting Amazon Bedrock create one for you.

To set up the custom service role, you carry out the following general steps.

1. Create the role by following the steps at [Creating a role to delegate permissions to an AWS service](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-service.html).

1. Attach a **trust policy**.

1. Attach the relevant **identity-based permissions**.

**Important**  
When setting the `iam:PassRole` permission, make sure that a user can't pass a role where the role has more permissions than you want the user to have. For example, Alice might not be allowed to perform `bedrock:InvokeModel` on a custom model. If Alice can pass a role to Amazon Bedrock to create an evaluation of that custom model, the service could invoke that model on behalf of Alice while running the job.

Refer to the following links for more information about IAM concepts that are relevant to setting service role permissions.
+ [AWS service role](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_terms-and-concepts.html#iam-term-service-role)
+ [Identity-based policies and resource-based policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_identity-vs-resource.html)
+ [Using resource-based policies for Lambda](https://docs.aws.amazon.com/lambda/latest/dg/access-control-resource-based.html)
+ [AWS global condition context keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html)
+ [Condition keys for Amazon Bedrock](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazonbedrock.html#amazonbedrock-policy-keys)
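The general steps above can be sketched programmatically. The helper below builds a trust policy matching the examples later in this section, scoped with `aws:SourceAccount` and `aws:SourceArn`; the boto3 call is shown commented out because it requires real credentials, and the role name is hypothetical:

```python
import json

def bedrock_trust_policy(account_id: str, source_arn_pattern: str) -> dict:
    """Build a trust policy that lets Amazon Bedrock assume the role,
    scoped with aws:SourceAccount / aws:SourceArn to help prevent the
    cross-service confused deputy problem."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": {"Service": "bedrock.amazonaws.com"},
                "Action": "sts:AssumeRole",
                "Condition": {
                    "StringEquals": {"aws:SourceAccount": account_id},
                    "ArnEquals": {"aws:SourceArn": source_arn_pattern},
                },
            }
        ],
    }

policy = bedrock_trust_policy(
    "123456789012",
    "arn:aws:bedrock:us-east-1:123456789012:model-invocation-job/*",
)

# With credentials configured, the role could then be created like this:
# import boto3
# iam = boto3.client("iam")
# iam.create_role(
#     RoleName="MyBedrockServiceRole",  # hypothetical name
#     AssumeRolePolicyDocument=json.dumps(policy),
# )
```

After creating the role, you would attach the feature-specific identity-based permissions described in the topics below, then pass the role's ARN when you start the job.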

Select a topic to learn more about service roles for a specific feature.

**Topics**
+ [Create a custom service role for batch inference](batch-iam-sr.md)
+ [Create a service role for model customization](model-customization-iam-role.md)
+ [Create a service role for importing pre-trained models](model-import-iam-role.md)
+ [Create a service role for Amazon Bedrock Agents](agents-permissions.md)
+ [Create a service role for Amazon Bedrock Knowledge Bases](kb-permissions.md)
+ [Create a service role for Amazon Bedrock Flows in Amazon Bedrock](flows-permissions.md)
+ [Service role requirements for model evaluation jobs](model-evaluation-security-service-roles.md)

# Create a custom service role for batch inference
<a name="batch-iam-sr"></a>

To use a custom service role for batch inference instead of the one Amazon Bedrock automatically creates for you in the AWS Management Console, create an IAM role and attach the following permissions by following the steps at [Creating a role to delegate permissions to an AWS service](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-service.html).

**Topics**
+ [Trust relationship](#batch-iam-sr-trust)
+ [Identity-based permissions for the batch inference service role](#batch-iam-sr-identity)

## Trust relationship
<a name="batch-iam-sr-trust"></a>

The following trust policy allows Amazon Bedrock to assume this role and submit and manage batch inference jobs. Replace the *values* as necessary. The policy contains optional condition keys (see [Condition keys for Amazon Bedrock](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazonbedrock.html#amazonbedrock-policy-keys) and [AWS global condition context keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html)) in the `Condition` field that we recommend you use as a security best practice.

**Note**  
As a best practice for security purposes, replace the wildcard (`*`) in the `aws:SourceArn` value with specific batch inference job IDs after you have created them.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "bedrock.amazonaws.com"
            },
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "123456789012"
                },
                "ArnEquals": {
                    "aws:SourceArn": "arn:aws:bedrock:us-east-1:123456789012:model-invocation-job/*"
                }
            }
        }
    ]
}
```

------

## Identity-based permissions for the batch inference service role
<a name="batch-iam-sr-identity"></a>

The following topics describe and provide examples of permissions policies that you might need to attach to your custom batch inference service role, depending on your use case.

**Topics**
+ [(Required) Permissions to access input and output data in Amazon S3](#batch-iam-sr-s3)
+ [(Optional) Permissions to run batch inference with inference profiles](#batch-iam-sr-ip)

### (Required) Permissions to access input and output data in Amazon S3
<a name="batch-iam-sr-s3"></a>

To allow a service role to access the Amazon S3 bucket containing your input data and the bucket to which to write your output data, attach the following policy to the service role. Replace *values* as necessary.
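The `${InputBucket}` and `${OutputBucket}` placeholders in the policy below happen to match Python's `string.Template` syntax, so the document can be filled in programmatically. A minimal sketch — the bucket names are hypothetical placeholders you would replace with your own:

```python
import json
from string import Template

# JSON template mirroring the S3 access policy in this section.
POLICY_TEMPLATE = Template("""{
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "S3Access",
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
        "Resource": [
            "arn:aws:s3:::${InputBucket}",
            "arn:aws:s3:::${InputBucket}/*",
            "arn:aws:s3:::${OutputBucket}",
            "arn:aws:s3:::${OutputBucket}/*"
        ],
        "Condition": {"StringEquals": {"aws:ResourceAccount": ["123456789012"]}}
    }]
}""")

# Substitute real bucket names, then parse to confirm the result is valid JSON.
policy = json.loads(POLICY_TEMPLATE.substitute(
    InputBucket="amzn-s3-demo-input-bucket",
    OutputBucket="amzn-s3-demo-output-bucket",
))
print(policy["Statement"][0]["Resource"][0])  # arn:aws:s3:::amzn-s3-demo-input-bucket
```

Note that both the bucket ARN (for `s3:ListBucket`) and the object ARN with `/*` (for `s3:GetObject` and `s3:PutObject`) are required; omitting either is a common cause of access-denied errors in batch jobs.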

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
         "Sid": "S3Access",
         "Effect": "Allow",
         "Action": [
            "s3:GetObject",
            "s3:PutObject",
            "s3:ListBucket"
         ],
         "Resource": [
            "arn:aws:s3:::${InputBucket}",
            "arn:aws:s3:::${InputBucket}/*",
            "arn:aws:s3:::${OutputBucket}",
            "arn:aws:s3:::${OutputBucket}/*"
         ],
         "Condition": {
            "StringEquals": {
                "aws:ResourceAccount": [
                    "123456789012"
                ]
            }
         }
        }
    ]
}
```

------

### (Optional) Permissions to run batch inference with inference profiles
<a name="batch-iam-sr-ip"></a>

To run batch inference with an [inference profile](inference-profiles.md), a service role must have permissions to invoke the inference profile in an AWS Region, in addition to the model in each Region in the inference profile.
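Concretely, the `Resource` list must contain the inference profile ARN plus one foundation-model ARN for each Region the profile spans. A small helper to build that list — the profile and model IDs in the example call are illustrative:

```python
def inference_profile_resources(account_id, profile_region, profile_id,
                                model_id, model_regions):
    """ARNs a service role needs for batch inference with a cross-Region
    (system-defined) inference profile: the profile itself, plus the
    foundation model in every Region the profile routes to."""
    resources = [
        f"arn:aws:bedrock:{profile_region}:{account_id}:inference-profile/{profile_id}"
    ]
    resources += [
        f"arn:aws:bedrock:{region}::foundation-model/{model_id}"
        for region in model_regions
    ]
    return resources

# Illustrative values; substitute your own profile ID, model ID, and Regions.
print(inference_profile_resources(
    "123456789012", "us-east-1", "my-profile-id",
    "my-model-id", ["us-east-1", "us-west-2"],
))
```

Note that foundation-model ARNs have an empty account field because the models are AWS-owned resources, while the inference profile ARN carries your account ID.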

For permissions to invoke with a cross-Region (system-defined) inference profile, use the following policy as a template for the permissions policy to attach to your service role:

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "CrossRegionInference",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:inference-profile/${InferenceProfileId}",
                "arn:aws:bedrock:us-east-1::foundation-model/${ModelId}",
                "arn:aws:bedrock:us-west-2::foundation-model/${ModelId}"
            ]
        }
    ]
}
```

------

For permissions to invoke with an application inference profile, use the following policy as a template for the permissions policy to attach to your service role:

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ApplicationInferenceProfile",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:application-inference-profile/${InferenceProfileId}",
                "arn:aws:bedrock:us-east-1::foundation-model/${ModelId}",
                "arn:aws:bedrock:us-west-2::foundation-model/${ModelId}"
            ]
        }
    ]
}
```

------

# Create a service role for model customization
<a name="model-customization-iam-role"></a>

To use a custom role for model customization instead of the one Amazon Bedrock automatically creates, create an IAM role and attach the following permissions by following the steps at [Creating a role to delegate permissions to an AWS service](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-service.html).
+ Trust relationship
+ Permissions to access your training and validation data in S3 and to write your output data to S3
+ (Optional) If you encrypt any of the following resources with a KMS key, permissions to decrypt the key (see [Encryption of custom models](encryption-custom-job.md))
  + A model customization job or the resulting custom model
  + The training, validation, or output data for the model customization job

**Topics**
+ [Trust relationship](#model-customization-iam-role-trust)
+ [Permissions to access training and validation files and to write output files in S3](#model-customization-iam-role-s3)
+ [(Optional) Permissions to create a distillation job with a cross-Region inference profile](#customization-iam-sr-ip)

## Trust relationship
<a name="model-customization-iam-role-trust"></a>

The following example policy allows Amazon Bedrock to assume this role and carry out the model customization job.

You can optionally restrict the scope of the permission for [cross-service confused deputy prevention](cross-service-confused-deputy-prevention.md) by using one or more global condition context keys with the `Condition` field. For more information, see [AWS global condition context keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html).
+ Set the `aws:SourceAccount` value to your account ID.
+ (Optional) Use the `ArnEquals` or `ArnLike` condition to restrict the scope to specific model customization jobs in your account ID.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "bedrock.amazonaws.com"
            },
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "123456789012"
                },
                "ArnEquals": {
                    "aws:SourceArn": "arn:aws:bedrock:us-east-1:123456789012:model-customization-job/*"
                }
            }
        }
    ]
}
```

------

## Permissions to access training and validation files and to write output files in S3
<a name="model-customization-iam-role-s3"></a>

Attach the following policy to allow the role to access your training and validation data and the bucket to which to write your output data. Replace the values in the `Resource` list with your actual bucket names.

To restrict access to a specific folder in a bucket, add an `s3:prefix` condition key with your folder path. You can follow the **User policy** example in [Example 2: Getting a list of objects in a bucket with a specific prefix](https://docs.aws.amazon.com/AmazonS3/latest/userguide/amazon-s3-policy-keys.html#condition-key-bucket-ops-2).
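A sketch of what such a prefix-scoped statement could look like. The bucket and folder names are hypothetical, and this follows the general pattern of the linked Amazon S3 example rather than a Bedrock-specific requirement:

```python
def scoped_list_bucket_statement(bucket: str, folder: str) -> dict:
    """Allow s3:ListBucket only for keys under the given folder, using the
    s3:prefix condition key."""
    return {
        "Effect": "Allow",
        "Action": ["s3:ListBucket"],
        "Resource": [f"arn:aws:s3:::{bucket}"],
        "Condition": {"StringLike": {"s3:prefix": [f"{folder}/*"]}},
    }

stmt = scoped_list_bucket_statement("training-bucket", "fine-tuning-data")
print(stmt["Condition"]["StringLike"]["s3:prefix"])  # ['fine-tuning-data/*']
```

The condition applies only to `s3:ListBucket`; the `s3:GetObject` statement is already scoped per object, so you would narrow its `Resource` ARNs to the same folder instead.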

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::training-bucket",
                "arn:aws:s3:::training-bucket/*",
                "arn:aws:s3:::validation-bucket",
                "arn:aws:s3:::validation-bucket/*"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:PutObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::output-bucket",
                "arn:aws:s3:::output-bucket/*"
            ]
        }
    ]
}
```

------

## (Optional) Permissions to create a distillation job with a cross-Region inference profile
<a name="customization-iam-sr-ip"></a>

To use a cross-Region inference profile for a teacher model in a distillation job, the service role must have permissions to invoke the inference profile in an AWS Region, in addition to the model in each Region in the inference profile.

For permissions to invoke with a cross-Region (system-defined) inference profile, use the following policy as a template for the permissions policy to attach to your service role:

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "CrossRegionInference",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:inference-profile/${InferenceProfileId}",
                "arn:aws:bedrock:us-east-1::foundation-model/${ModelId}",
                "arn:aws:bedrock:us-west-2::foundation-model/${ModelId}"
            ]
        }
    ]
}
```

------
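As a sketch of how the `Resource` list above is composed: the inference profile ARN lives in the Region where the job runs, and a foundation model ARN is needed for each Region the profile can route to. The helper below is illustrative only; the IDs and Regions are placeholders:

```python
def distillation_invoke_resources(account_id, profile_region, profile_id,
                                  model_id, model_regions):
    """Resource ARNs for bedrock:InvokeModel with a cross-Region
    inference profile: the profile in the job's Region plus the
    foundation model in every Region the profile spans."""
    resources = [
        f"arn:aws:bedrock:{profile_region}:{account_id}:inference-profile/{profile_id}"
    ]
    # Foundation model ARNs have an empty account field.
    resources += [
        f"arn:aws:bedrock:{region}::foundation-model/{model_id}"
        for region in model_regions
    ]
    return resources

# Placeholder IDs shown here; a real system-defined profile ID looks
# like "us.anthropic.claude-3-5-sonnet-20240620-v1:0".
print(distillation_invoke_resources(
    "123456789012", "us-east-1", "InferenceProfileId",
    "ModelId", ["us-east-1", "us-west-2"]))
```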

# Create a service role for importing pre-trained models
<a name="model-import-iam-role"></a>

To use a custom role for model import, create an IAM service role and attach the following permissions. For information about how to create a service role in IAM, see [Creating a role to delegate permissions to an AWS service](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-service.html).

These permissions apply to both methods of importing models into Amazon Bedrock:
+ **Custom model import jobs** — For importing customized open-source foundation models (such as Mistral AI or Llama models). For more information, see [Use Custom model import to import a customized open-source model into Amazon Bedrock](model-customization-import-model.md).
+ **Create custom model** — For importing Amazon Nova models that you fine-tuned in SageMaker AI. For more information, see [Import a SageMaker AI-trained Amazon Nova model](import-with-create-custom-model.md).

**Topics**
+ [Trust relationship](#model-import-iam-role-trust)
+ [Permissions to access model files in Amazon S3](#model-import-iam-role-s3)

## Trust relationship
<a name="model-import-iam-role-trust"></a>

The following example policy allows Amazon Bedrock to assume this role and carry out model import operations.

You can optionally restrict the scope of the permission for [cross-service confused deputy prevention](cross-service-confused-deputy-prevention.md) by using one or more global condition context keys in the `Condition` field. For more information, see [AWS global condition context keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html).
+ Set the `aws:SourceAccount` value to your account ID.
+ (Optional) Use the `ArnEquals` or `ArnLike` condition to restrict the scope to specific operations in your account. The following example restricts access to custom model import jobs.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "1",
            "Effect": "Allow",
            "Principal": {
                "Service": "bedrock.amazonaws.com"
            },
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "123456789012"
                },
                "ArnEquals": {
                    "aws:SourceArn": "arn:aws:bedrock:us-east-1:123456789012:model-import-job/*"
                }
            }
        }
    ]
}
```

------

## Permissions to access model files in Amazon S3
<a name="model-import-iam-role-s3"></a>

Attach the following policy to allow the role to access your model files in an Amazon S3 bucket. Replace the values in the `Resource` list with your actual bucket names.

For custom model import jobs, this is your own Amazon S3 bucket containing the customized open-source model files. For creating custom models from SageMaker AI-trained Amazon Nova models, this is the Amazon-managed Amazon S3 bucket where SageMaker AI stores the trained model artifacts. SageMaker AI creates this bucket when you run your first SageMaker AI training job. 

To restrict access to a specific folder in a bucket, add an `s3:prefix` condition key with your folder path. You can follow the **User policy** example in [Example 2: Getting a list of objects in a bucket with a specific prefix](https://docs.aws.amazon.com/AmazonS3/latest/userguide/amazon-s3-policy-keys.html#condition-key-bucket-ops-2).

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "1",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::bucket",
                "arn:aws:s3:::bucket/*"
            ],
            "Condition": {
                "StringEquals": {
                    "aws:ResourceAccount": "123456789012"
                }
            }
        }
    ]
}
```

------

# Create a service role for Amazon Bedrock Agents
<a name="agents-permissions"></a>

To use a custom service role for agents instead of the one Amazon Bedrock automatically creates, create an IAM role and attach the following permissions by following the steps at [Creating a role to delegate permissions to an AWS service](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-service.html).
+ Trust policy
+ A policy containing the following identity-based permissions:
  + Access to the Amazon Bedrock base models.
  + Access to the Amazon S3 objects containing the OpenAPI schemas for the action groups in your agents.
  + Permissions for Amazon Bedrock to query knowledge bases that you want to attach to your agents.
  + If any of the following situations pertain to your use case, add the statement to the policy or add a policy with the statement to the service role:
    + (Optional) If you enable multi-agent collaboration, permissions to get the aliases and invoke agent collaborators.
    + (Optional) If you associate a Provisioned Throughput with your agent alias, permissions to perform model invocation using that Provisioned Throughput.
    + (Optional) If you associate a guardrail with your agent, permissions to apply that guardrail. If the guardrail is encrypted with a KMS key, the service role will also need [permissions to decrypt the key](guardrails-permissions-kms.md)
    + (Optional) If you encrypt your agent with a KMS key, [permissions to decrypt the key](encryption-agents.md).

Whether you use a custom role or not, you also need to attach a **resource-based policy** to the Lambda functions for the action groups in your agents to provide permissions for the service role to access the functions. For more information, see [Resource-based policy to allow Amazon Bedrock to invoke an action group Lambda function](#agents-permissions-lambda).

**Topics**
+ [Trust relationship](#agents-permissions-trust)
+ [Identity-based permissions for the Agents service role](#agents-permissions-identity)
+ [(Optional) Identity-based policy to allow Amazon Bedrock to use Provisioned Throughput with your agent alias](#agents-permissions-pt)
+ [(Optional) Identity-based policy to allow Amazon Bedrock to associate and invoke agent collaborators](#agents-permissions-mac)
+ [(Optional) Identity-based policy to allow Amazon Bedrock to use guardrails with your Agent](#agents-permissions-gr)
+ [(Optional) Identity-based policy to allow Amazon Bedrock to access files from S3 to use with code interpretation](#agents-permissions-files-ci)
+ [Resource-based policy to allow Amazon Bedrock to invoke an action group Lambda function](#agents-permissions-lambda)

## Trust relationship
<a name="agents-permissions-trust"></a>

The following trust policy allows Amazon Bedrock to assume this role and create and manage agents. Replace the *values* as necessary. The policy contains optional condition keys (see [Condition keys for Amazon Bedrock](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazonbedrock.html#amazonbedrock-policy-keys) and [AWS global condition context keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html)) in the `Condition` field that we recommend you use as a security best practice.

**Note**  
As a best practice for security purposes, replace the wildcard (`*`) with specific agent IDs after you have created them.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "bedrock.amazonaws.com"
            },
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "123456789012"
                },
                "ArnLike": {
                    "aws:SourceArn": "arn:aws:bedrock:us-east-1:123456789012:agent/*"
                }
            }
        }
    ]
}
```

------

## Identity-based permissions for the Agents service role
<a name="agents-permissions-identity"></a>

Attach the following policy to provide permissions for the service role, replacing the *values* as necessary. The policy contains the following statements. Omit a statement if it isn't applicable to your use case. The policy contains optional condition keys (see [Condition keys for Amazon Bedrock](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazonbedrock.html#amazonbedrock-policy-keys) and [AWS global condition context keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html)) in the `Condition` field that we recommend you use as a security best practice.

**Note**  
If you encrypt your agent with a customer-managed KMS key, refer to [Encryption of agent resources for agents created before January 22, 2025](encryption-agents.md) for further permissions you need to add.
+ Permissions to use Amazon Bedrock foundation models to run model inference on prompts used in your agent's orchestration.
+ Permissions to access your agent's action group API schemas in Amazon S3. Omit this statement if your agent has no action groups.
+ Permissions to access knowledge bases associated with your agent. Omit this statement if your agent has no associated knowledge bases.
+ Permissions to access a third-party (Pinecone or Redis Enterprise Cloud) knowledge base associated with your agent. Omit this statement if your knowledge base is first-party (Amazon OpenSearch Serverless or Amazon Aurora) or if your agent has no associated knowledge bases.
+ Permissions to access a prompt from Prompt management. Omit this statement if you don't plan to test a prompt from prompt management with your agent in the Amazon Bedrock console.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AgentModelInvocationPermissions",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-v2",
                "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-v2:1",
                "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-instant-v1"
            ]
        },
        {
            "Sid": "AgentActionGroupS3",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": [
                "arn:aws:s3:::amzn-s3-demo-bucket/SchemaJson"
            ],
            "Condition": {
                "StringEquals": {
                    "aws:ResourceAccount": "123456789012"
                }
            }
        },
        {
            "Sid": "AgentKnowledgeBaseQuery",
            "Effect": "Allow",
            "Action": [
                "bedrock:Retrieve",
                "bedrock:RetrieveAndGenerate"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:knowledge-base/knowledge-base-id"
            ]
        },
        {
            "Sid": "Agent3PKnowledgeBase",
            "Effect": "Allow",
            "Action": [
                "bedrock:AssociateThirdPartyKnowledgeBase"
            ],
            "Resource": "arn:aws:bedrock:us-east-1:123456789012:knowledge-base/knowledge-base-id",
            "Condition": {
                "StringEquals": {
                    "bedrock:ThirdPartyKnowledgeBaseCredentialsSecretArn": "arn:aws:secretsmanager:us-east-1:123456789012:secret:SecretId"
                }
            }
        },
        {
            "Sid": "AgentPromptManagementConsole",
            "Effect": "Allow",
            "Action": [
                "bedrock:GetPrompt"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:prompt/prompt-id"
            ]
        }
    ]
}
```

------

## (Optional) Identity-based policy to allow Amazon Bedrock to use Provisioned Throughput with your agent alias
<a name="agents-permissions-pt"></a>

If you associate a [Provisioned Throughput](prov-throughput.md) with an alias of your agent, attach the following identity-based policy to the service role or add the statement to the policy in [Identity-based permissions for the Agents service role](#agents-permissions-identity).

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
      {        
        "Sid": "UseProvisionedThroughput",
        "Effect": "Allow",
        "Action": [
            "bedrock:InvokeModel", 
            "bedrock:GetProvisionedModelThroughput"
        ],
        "Resource": [
            "arn:aws:bedrock:us-east-1:123456789012:provisioned-model/${provisioned-model-id}"
        ]
      }
    ]
}
```

------

## (Optional) Identity-based policy to allow Amazon Bedrock to associate and invoke agent collaborators
<a name="agents-permissions-mac"></a>

If you enable [multi-agent collaboration](agents-multi-agent-collaboration.md), attach the following identity-based policy to the service role or add the statement to the policy in [Identity-based permissions for the Agents service role](#agents-permissions-identity).

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AmazonBedrockAgentMultiAgentsPolicyProd",
            "Effect": "Allow",
            "Action": [
                "bedrock:GetAgentAlias",
                "bedrock:InvokeAgent"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:agent-alias/${agent-id}/${agent-alias-id}"
            ]
        }
    ]
}
```

------

## (Optional) Identity-based policy to allow Amazon Bedrock to use guardrails with your Agent
<a name="agents-permissions-gr"></a>

If you associate a [guardrail](guardrails.md) with your agent, attach the following identity-based policy to the service role or add the statement to the policy in [Identity-based permissions for the Agents service role](#agents-permissions-identity).

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ApplyGuardrail",
            "Effect": "Allow",
            "Action": "bedrock:ApplyGuardrail",
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:guardrail/${guardrail-id}"
            ]
        }
    ]
}
```

------

## (Optional) Identity-based policy to allow Amazon Bedrock to access files from S3 to use with code interpretation
<a name="agents-permissions-files-ci"></a>

If you [enable code interpretation](agents-enable-code-interpretation.md), attach the following identity-based policy to the service role or add the statement to the policy in [Identity-based permissions for the Agents service role](#agents-permissions-identity).

------
#### [ JSON ]

****  

```
{
  "Version": "2012-10-17",
  "Statement": [
      {       
        "Sid": "AmazonBedrockAgentFileAccess", 
        "Effect": "Allow",
        "Action": [
            "s3:GetObject",
            "s3:GetObjectVersion",
            "s3:GetObjectVersionAttributes",
            "s3:GetObjectAttributes"
        ],
        "Resource": [
            "arn:aws:s3:::${customerProvidedS3BucketWithKey}"
        ]
      }
    ]
}
```

------

## Resource-based policy to allow Amazon Bedrock to invoke an action group Lambda function
<a name="agents-permissions-lambda"></a>

Follow the steps at [Using resource-based policies for Lambda](https://docs.aws.amazon.com/lambda/latest/dg/access-control-resource-based.html) and attach the following resource-based policy to a Lambda function to allow Amazon Bedrock to access the Lambda function for your agent's action groups, replacing the *values* as necessary. The policy contains optional condition keys (see [Condition keys for Amazon Bedrock](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazonbedrock.html#amazonbedrock-policy-keys) and [AWS global condition context keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html)) in the `Condition` field that we recommend you use as a security best practice.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AccessLambdaFunction",
            "Effect": "Allow",
            "Principal": {
                "Service": "bedrock.amazonaws.com"
            },
            "Action": "lambda:InvokeFunction",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:function-name",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "123456789012"
                },
                "ArnLike": {
                    "aws:SourceArn": "arn:aws:bedrock:us-east-1:123456789012:agent/${agent-id}"
                }
            }
        }
    ]
}
```

------
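The statement above can also be attached programmatically with the Lambda `AddPermission` API. Below is a hedged sketch of the request parameters only; the function name and agent ID are placeholders, and in a real account you would pass the result to `boto3.client("lambda").add_permission(**kwargs)`:

```python
def bedrock_invoke_permission(function_name, account_id, agent_id,
                              region="us-east-1"):
    """Parameters granting Amazon Bedrock permission to invoke the
    action group Lambda function, scoped to a single agent."""
    return {
        "FunctionName": function_name,
        "StatementId": "AccessLambdaFunction",
        "Action": "lambda:InvokeFunction",
        "Principal": "bedrock.amazonaws.com",
        # Confused-deputy protection: only requests made on behalf of
        # this account and this agent are allowed.
        "SourceAccount": account_id,
        "SourceArn": f"arn:aws:bedrock:{region}:{account_id}:agent/{agent_id}",
    }

kwargs = bedrock_invoke_permission("function-name", "123456789012", "AGENT123")
print(kwargs)
```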

# Create a service role for Amazon Bedrock Knowledge Bases
<a name="kb-permissions"></a>

To use a custom role for a knowledge base instead of the one Amazon Bedrock automatically creates, create an IAM role and attach the following permissions by following the steps at [Creating a role to delegate permissions to an AWS service](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-service.html). As a security best practice, include only the permissions that your configuration requires.

**Note**  
A policy cannot be shared between multiple roles when the service role is used.
+ Trust relationship
+ Access to the Amazon Bedrock base models
+ Access to the data source for where you store your data
+ (If you create a vector database in Amazon OpenSearch Service) Access to your OpenSearch Service collection
+ (If you create a vector database in Amazon Aurora) Access to your Aurora cluster
+ (If you create a vector database in Pinecone or Redis Enterprise Cloud) Permissions for AWS Secrets Manager to authenticate your Pinecone or Redis Enterprise Cloud account
+ (Optional) If you encrypt any of the following resources with a KMS key, permissions to decrypt the key (see [Encryption of knowledge base resources](encryption-kb.md)).
  + Your knowledge base
  + Data sources for your knowledge base
  + Your vector database in Amazon OpenSearch Service
  + The secret for your third-party vector database in AWS Secrets Manager
  + A data ingestion job

**Topics**
+ [Trust relationship](#kb-permissions-trust)
+ [Permissions to access Amazon Bedrock models](#kb-permissions-access-models)
+ [Permissions to access your data sources](#kb-permissions-access-ds)
+ [Permissions to decrypt your AWS KMS key for encrypted data sources in Amazon S3](#kb-permissions-kms-datasource)
+ [Permissions to chat with your document](#kb-permissions-chatdoc)
+ [Permissions for multimodal content](#kb-permissions-multimodal)
+ [Permissions to access your Amazon Kendra GenAI index](#kb-permissions-kendra)
+ [Permissions to access your vector database in Amazon OpenSearch Serverless](#kb-permissions-oss)
+ [Permissions to access your vector database in OpenSearch Managed Clusters](#kb-permissions-osm)
+ [Permissions to access your Amazon Aurora database cluster](#kb-permissions-rds)
+ [Permissions to access your vector database in Amazon Neptune Analytics](#kb-permissions-neptune)
+ [Permissions to access your vector store in Amazon S3 Vectors](#kb-permissions-s3vectors)
+ [Permissions to access a vector database configured with an AWS Secrets Manager secret](#kb-permissions-secret)
+ [Permissions for AWS to manage an AWS KMS key for transient data storage during data ingestion](#kb-permissions-kms-ingestion)
+ [Permissions for AWS to manage data sources from another user's AWS account](#kb-permissions-otherds)

## Trust relationship
<a name="kb-permissions-trust"></a>

The following example policy allows Amazon Bedrock to assume this role and create and manage knowledge bases. You can restrict the scope of the permission by using one or more global condition context keys. For more information, see [AWS global condition context keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html). Set the `aws:SourceAccount` value to your account ID. Use the `ArnEquals` or `ArnLike` condition to restrict the scope to specific knowledge bases.

**Note**  
As a best practice for security purposes, replace the wildcard (`*`) with specific knowledge base IDs after you have created them.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "bedrock.amazonaws.com"
            },
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "123456789012"
                },
                "ArnLike": {
                    "aws:SourceArn": "arn:aws:bedrock:us-east-1:123456789012:knowledge-base/*"
                }
            }
        }
    ]
}
```

------
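To illustrate why the trailing `*` in the `aws:SourceArn` value matters, here is a rough approximation of `ArnLike` matching. This is a simplification for illustration only; real IAM evaluates ARNs segment by segment rather than as one flat string:

```python
from fnmatch import fnmatchcase

def arn_like(pattern, arn):
    """Approximate ArnLike: case-sensitive matching with * and ? wildcards."""
    return fnmatchcase(arn, pattern)

pattern = "arn:aws:bedrock:us-east-1:123456789012:knowledge-base/*"

# Matches any knowledge base in this account and Region...
assert arn_like(pattern, "arn:aws:bedrock:us-east-1:123456789012:knowledge-base/KBID12345")
# ...but not one in a different account or Region.
assert not arn_like(pattern, "arn:aws:bedrock:us-east-1:999999999999:knowledge-base/KBID12345")
assert not arn_like(pattern, "arn:aws:bedrock:us-west-2:123456789012:knowledge-base/KBID12345")
```

After you create your knowledge bases, you can tighten the policy by replacing the wildcard with specific knowledge base IDs.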

## Permissions to access Amazon Bedrock models
<a name="kb-permissions-access-models"></a>

Attach the following policy to provide permissions for the role to use Amazon Bedrock models to embed your source data.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "bedrock:ListFoundationModels",
                "bedrock:ListCustomModels"
            ],
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1::foundation-model/amazon.titan-embed-text-v1",
                "arn:aws:bedrock:us-east-1::foundation-model/cohere.embed-english-v3",
                "arn:aws:bedrock:us-east-1::foundation-model/cohere.embed-multilingual-v3"
            ]
        }
    ]
}
```

------

## Permissions to access your data sources
<a name="kb-permissions-access-ds"></a>

Select from the following data sources to attach the necessary permissions for the role.

**Topics**
+ [Permissions to access your Amazon S3 data source](#kb-permissions-access-s3)
+ [Permissions to access your Confluence data source](#kb-permissions-access-confluence)
+ [Permissions to access your Microsoft SharePoint data source](#kb-permissions-access-sharepoint)
+ [Permissions to access your Salesforce data source](#kb-permissions-access-salesforce)

### Permissions to access your Amazon S3 data source
<a name="kb-permissions-access-s3"></a>

If your data source is Amazon S3, attach the following policy to provide permissions for the role to access the S3 bucket that you will connect to as your data source.

If you encrypted the data source with an AWS KMS key, attach permissions to decrypt the key to the role by following the steps at [Permissions to decrypt your AWS KMS key for your data sources in Amazon S3](encryption-kb.md#encryption-kb-ds).

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "S3ListBucketStatement",
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::amzn-s3-demo-bucket"
            ],
            "Condition": {
                "StringEquals": {
                    "aws:ResourceAccount": "123456789012"
                }
            }
        },
        {
            "Sid": "S3GetObjectStatement",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": [
                "arn:aws:s3:::amzn-s3-demo-bucket/*"
            ],
            "Condition": {
                "StringEquals": {
                    "aws:ResourceAccount": "123456789012"
                }
            }
        }
    ]
}
```

------

### Permissions to access your Confluence data source
<a name="kb-permissions-access-confluence"></a>

**Note**  
The Confluence data source connector is in preview release and is subject to change.

Attach the following policy to provide permissions for the role to access Confluence.

**Note**  
`secretsmanager:PutSecretValue` is only necessary if you use OAuth 2.0 authentication with a refresh token.  
The Confluence OAuth 2.0 **access** token has a default expiry time of 60 minutes. If this token expires while your data source is syncing (sync job), Amazon Bedrock uses the provided **refresh** token to regenerate it. This regeneration refreshes both the access and refresh tokens. To keep the tokens current from one sync job to the next, Amazon Bedrock requires write/put permissions for your secret credentials.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "secretsmanager:GetSecretValue",
                "secretsmanager:PutSecretValue"
            ],
            "Resource": [
                "arn:aws:secretsmanager:us-east-1:123456789012:secret:SecretId"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "kms:Decrypt"
            ],
            "Resource": [
                "arn:aws:kms:us-east-1:123456789012:key/KeyId"
            ],
            "Condition": {
                "StringLike": {
                    "kms:ViaService": [
                        "secretsmanager.us-east-1.amazonaws.com"
                    ]
                }
            }
        }
    ]
}
```

------
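The token rotation described in the note above can be sketched as follows. This is an illustrative simulation only; the field names and 60-minute lifetime stand in for whatever the connector actually stores in your secret:

```python
import secrets
import time

def refresh_confluence_tokens(secret_value, now=None):
    """Simulate the mid-sync rotation Amazon Bedrock performs: once the
    OAuth 2.0 access token passes its ~60-minute lifetime, both tokens
    are regenerated and the secret must be rewritten -- which is why
    the role needs secretsmanager:PutSecretValue, not just
    GetSecretValue."""
    now = time.time() if now is None else now
    if now < secret_value["accessTokenExpiresAt"]:
        return secret_value  # still valid; nothing to write back
    return {
        "accessToken": secrets.token_urlsafe(16),   # stand-in for a real token
        "refreshToken": secrets.token_urlsafe(16),  # refresh token rotates too
        "accessTokenExpiresAt": now + 3600,         # 60-minute lifetime
    }
```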

### Permissions to access your Microsoft SharePoint data source
<a name="kb-permissions-access-sharepoint"></a>

**Note**  
The SharePoint data source connector is in preview release and is subject to change.

Attach the following policy to provide permissions for the role to access SharePoint.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "secretsmanager:GetSecretValue"
            ],
            "Resource": [
                "arn:aws:secretsmanager:us-east-1:123456789012:secret:SecretId"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "kms:Decrypt"
            ],
            "Resource": [
                "arn:aws:kms:us-east-1:123456789012:key/KeyId"
            ],
            "Condition": {
                "StringLike": {
                    "kms:ViaService": [
                        "secretsmanager.us-east-1.amazonaws.com"
                    ]
                }
            }
        }
    ]
}
```

------

### Permissions to access your Salesforce data source
<a name="kb-permissions-access-salesforce"></a>

**Note**  
The Salesforce data source connector is in preview release and is subject to change.

Attach the following policy to provide permissions for the role to access Salesforce.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "secretsmanager:GetSecretValue"
            ],
            "Resource": [
                "arn:aws:secretsmanager:us-east-1:123456789012:secret:SecretId"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "kms:Decrypt"
            ],
            "Resource": [
                "arn:aws:kms:us-east-1:123456789012:key/KeyId"
            ],
            "Condition": {
                "StringLike": {
                    "kms:ViaService": [
                        "secretsmanager.us-east-1.amazonaws.com"
                    ]
                }
            }
        }
    ]
}
```

------

## Permissions to decrypt your AWS KMS key for encrypted data sources in Amazon S3
<a name="kb-permissions-kms-datasource"></a>

If you encrypted your data sources in Amazon S3 with an AWS KMS key, attach the following policy to your Amazon Bedrock Knowledge Bases service role to allow Amazon Bedrock to decrypt your key. Replace *${Region}* and *${AccountId}* with the Region and account ID to which the key belongs. Replace *${KeyId}* with the ID of your AWS KMS key.

```
{
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": [
            "kms:Decrypt"
        ],
        "Resource": [
            "arn:aws:kms:${Region}:${AccountId}:key/${KeyId}"
        ],
        "Condition": {
            "StringEquals": {
                "kms:ViaService": [
                    "s3.${Region}.amazonaws.com"
                ]
            }
        }
    }]
}
```
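A small sketch of filling in the placeholders above with Python's `string.Template`; the Region, account ID, and key ID below are examples only:

```python
import json
from string import Template

# Same policy as above, kept as a template with ${...} placeholders.
KMS_DECRYPT_POLICY = Template("""{
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["kms:Decrypt"],
        "Resource": ["arn:aws:kms:${Region}:${AccountId}:key/${KeyId}"],
        "Condition": {
            "StringEquals": {"kms:ViaService": ["s3.${Region}.amazonaws.com"]}
        }
    }]
}""")

policy = KMS_DECRYPT_POLICY.substitute(
    Region="us-east-1",
    AccountId="123456789012",
    KeyId="1234abcd-12ab-34cd-56ef-1234567890ab",
)
print(json.dumps(json.loads(policy), indent=4))  # valid JSON after substitution
```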

## Permissions to chat with your document
<a name="kb-permissions-chatdoc"></a>

Attach the following policy to provide permissions for the role to use Amazon Bedrock models to chat with your document:

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "bedrock:RetrieveAndGenerate"
            ],
            "Resource": "*"
        }
    ]
}
```

------

If you want to grant a user access only to chat with your document (and not to `Retrieve` from any knowledge bases), use the following policy:

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "bedrock:RetrieveAndGenerate"
            ],
            "Resource": "*"
        },
        {
            "Effect": "Deny",
            "Action": [
                "bedrock:Retrieve"
            ],
            "Resource": "*"
        }
    ]
}
```

------

If you want to both chat with your document and query a specific knowledge base, replace *$KnowledgeBaseId* with the ID of your knowledge base and use the following policy:

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "bedrock:RetrieveAndGenerate"
            ],
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "bedrock:Retrieve"
            ],
            "Resource": "arn:aws:bedrock:us-east-1:123456789012:knowledge-base/$KnowledgeBaseId"
        }
    ]
}
```

------

## Permissions for multimodal content
<a name="kb-permissions-multimodal"></a>

When working with multimodal content (images, audio, video), additional permissions are required depending on your processing approach.

### Nova Multimodal Embeddings permissions
<a name="kb-permissions-multimodal-mme"></a>

When using Nova Multimodal Embeddings, attach the following policy to provide permissions for asynchronous model invocation:

```
{
    "Sid": "BedrockInvokeModelStatement",
    "Effect": "Allow",
    "Action": ["bedrock:InvokeModel"],
    "Resource": [
        "arn:aws:bedrock:us-east-1::foundation-model/amazon.nova-*-multimodal-embeddings-*",
        "arn:aws:bedrock:us-east-1::async-invoke/*"
    ],
    "Condition": {
        "StringEquals": {
            "aws:ResourceAccount": ""
        }
    }
},
{
    "Sid": "BedrockGetAsyncInvokeStatement",
    "Effect": "Allow",
    "Action": ["bedrock:GetAsyncInvoke"],
    "Resource": ["arn:aws:bedrock:us-east-1::async-invoke/*"],
    "Condition": {
        "StringEquals": {
            "aws:ResourceAccount": ""
        }
    }
}
```
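Because the `aws:ResourceAccount` condition must carry your own account ID, it can help to template the statements rather than hand-edit them (a minimal sketch; the function name is ours, and the Sids, actions, and ARNs mirror the example above):

```python
def nova_mme_policy_statements(region, account_id):
    """Build the async-invocation statements for Nova Multimodal Embeddings,
    pinning every resource to the caller's account."""
    condition = {"StringEquals": {"aws:ResourceAccount": account_id}}
    async_invoke_arn = f"arn:aws:bedrock:{region}::async-invoke/*"
    return [
        {
            "Sid": "BedrockInvokeModelStatement",
            "Effect": "Allow",
            "Action": ["bedrock:InvokeModel"],
            "Resource": [
                f"arn:aws:bedrock:{region}::foundation-model/"
                "amazon.nova-*-multimodal-embeddings-*",
                async_invoke_arn,
            ],
            "Condition": condition,
        },
        {
            "Sid": "BedrockGetAsyncInvokeStatement",
            "Effect": "Allow",
            "Action": ["bedrock:GetAsyncInvoke"],
            "Resource": [async_invoke_arn],
            "Condition": condition,
        },
    ]
```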

### Bedrock Data Automation (BDA) permissions
<a name="kb-permissions-multimodal-bda"></a>

When using BDA to process multimodal content, attach the following policy:

```
{
    "Sid": "BDAInvokeStatement",
    "Effect": "Allow",
    "Action": ["bedrock:InvokeDataAutomationAsync"],
    "Resource": [
        "arn:aws:bedrock:us-east-1:aws:data-automation-project/public-rag-default",
        "arn:aws:bedrock:us-east-1::data-automation-profile/*"
    ]
},
{
    "Sid": "BDAGetStatement",
    "Effect": "Allow",
    "Action": ["bedrock:GetDataAutomationStatus"],
    "Resource": "arn:aws:bedrock:us-east-1::data-automation-invocation/*"
}
```

If you use customer-managed AWS KMS keys with BDA, also attach the following policy. Replace *account-id*, *region*, and *key-id* with your specific values:

```
{
    "Sid": "KmsPermissionStatementForBDA",
    "Effect": "Allow",
    "Action": [
        "kms:GenerateDataKey",
        "kms:Decrypt",
        "kms:DescribeKey",
        "kms:CreateGrant"
    ],
    "Resource": ["arn:aws:kms:region:account-id:key/key-id"],
    "Condition": {
        "StringEquals": {
            "aws:ResourceAccount": "account-id",
            "kms:ViaService": "bedrock.region.amazonaws.com"
        }
    }
}
```
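The KMS statement repeats the same values in several places; a templated version (a sketch; the function name is ours) keeps the account ID, Region, and key ID in sync across the `Resource` and `Condition` fields:

```python
def bda_kms_statement(region, account_id, key_id):
    """Build the KMS statement for BDA with a customer managed key,
    restricting key use to Amazon Bedrock in the caller's account."""
    return {
        "Sid": "KmsPermissionStatementForBDA",
        "Effect": "Allow",
        "Action": [
            "kms:GenerateDataKey",
            "kms:Decrypt",
            "kms:DescribeKey",
            "kms:CreateGrant",
        ],
        "Resource": [f"arn:aws:kms:{region}:{account_id}:key/{key_id}"],
        "Condition": {
            "StringEquals": {
                "aws:ResourceAccount": account_id,
                "kms:ViaService": f"bedrock.{region}.amazonaws.com",
            }
        },
    }
```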

## Permissions to access your Amazon Kendra GenAI index
<a name="kb-permissions-kendra"></a>

If you created an Amazon Kendra GenAI index for your knowledge base, attach the following policy to your Amazon Bedrock Knowledge Bases service role to allow access to the index. In the policy, replace the Region, account ID, and *${IndexId}* in the example ARN with the values for your index. You can allow access to multiple indexes by adding them to the `Resource` list. To allow access to every index in your AWS account, replace *${IndexId}* with a wildcard (`*`).

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "kendra:Retrieve",
                "kendra:DescribeIndex"
            ],
            "Resource": "arn:aws:kendra:us-east-1:123456789012:index/${IndexId}" 
        }
    ]
}
```

------

## Permissions to access your vector database in Amazon OpenSearch Serverless
<a name="kb-permissions-oss"></a>

If you created a vector database in OpenSearch Serverless for your knowledge base, attach the following policy to your Amazon Bedrock Knowledge Bases service role to allow access to the collection. Replace the Region and account ID in the example ARN with those of your collection, and replace *${CollectionId}* with the ID of your OpenSearch Serverless collection. You can allow access to multiple collections by adding them to the `Resource` list.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "aoss:APIAccessAll"
            ],
            "Resource": [
                "arn:aws:aoss:us-east-1:123456789012:collection/${CollectionId}"
            ]
        }
    ]
}
```

------

## Permissions to access your vector database in OpenSearch Managed Clusters
<a name="kb-permissions-osm"></a>

If you created a vector database in an OpenSearch Managed Cluster for your knowledge base, attach the following policy to your Amazon Bedrock Knowledge Bases service role to allow access to the domain. Replace the Region and account ID in the example ARNs with those of your domain, and replace *domainName* and *indexName* with the names of your domain and index. You can allow access to multiple domains by adding them to the `Resource` list. For more information about configuring permissions, see [Prerequisites and permissions required for using OpenSearch Managed Clusters with Amazon Bedrock Knowledge Bases](kb-osm-permissions-prereq.md).

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Effect": "Allow",       
            "Action": [
                "es:ESHttpGet", 
                "es:ESHttpPost", 
                "es:ESHttpPut", 
                "es:ESHttpDelete" 
            ],
            "Resource": [
                "arn:aws:es:us-east-1:123456789012:domain/domainName/indexName"
            ]       
        }, 
        {
            "Effect": "Allow",
            "Action": [
                "es:DescribeDomain" 
            ],
            "Resource": [
                "arn:aws:es:us-east-1:123456789012:domain/domainName"
            ]       
        }
    ]
}
```

------
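Since the data-plane statement is scoped to a specific index and the `es:DescribeDomain` statement to the domain itself, a helper that derives both ARNs from the same domain values avoids mismatches (a sketch; the function name is ours, and the actions mirror the example above):

```python
def osm_access_statements(region, account_id, domain_name, index_name):
    """Build the two statements for an OpenSearch Managed Cluster domain:
    HTTP data-plane access on one index, plus DescribeDomain on the domain."""
    domain_arn = f"arn:aws:es:{region}:{account_id}:domain/{domain_name}"
    return [
        {
            "Effect": "Allow",
            "Action": [
                "es:ESHttpGet",
                "es:ESHttpPost",
                "es:ESHttpPut",
                "es:ESHttpDelete",
            ],
            "Resource": [f"{domain_arn}/{index_name}"],
        },
        {
            "Effect": "Allow",
            "Action": ["es:DescribeDomain"],
            "Resource": [domain_arn],
        },
    ]
```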

## Permissions to access your Amazon Aurora database cluster
<a name="kb-permissions-rds"></a>

**Note**  
The Amazon Aurora cluster must reside in the same AWS account as the one where the knowledge base is created for Amazon Bedrock.

If you created a database (DB) cluster in Amazon Aurora for your knowledge base, attach the following policy to your Amazon Bedrock Knowledge Bases service role to allow access to the DB cluster and to provide read and write permissions on it. Replace the Region and account ID in the example ARNs with those of your DB cluster, and replace *${DbClusterId}* with the ID of your Amazon Aurora DB cluster. You can allow access to multiple DB clusters by adding them to the `Resource` list.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Sid": "RdsDescribeStatementID",
            "Effect": "Allow",
            "Action": [
                "rds:DescribeDBClusters"
            ],
            "Resource": [
                "arn:aws:rds:us-east-1:123456789012:cluster:${DbClusterId}"
            ]
        },
        {
            "Sid": "DataAPIStatementID",
            "Effect": "Allow",
            "Action": [
                "rds-data:BatchExecuteStatement",
                "rds-data:ExecuteStatement"
            ],
            "Resource": [
                "arn:aws:rds:us-east-1:123456789012:cluster:${DbClusterId}"
            ]
        }
    ]
}
```

------

## Permissions to access your vector database in Amazon Neptune Analytics
<a name="kb-permissions-neptune"></a>

If you created an Amazon Neptune Analytics graph for your knowledge base, attach the following policy to your Amazon Bedrock Knowledge Bases service role to allow access to the graph. In the policy, replace the Region and account ID in the example ARN with those of your graph, and replace *${GraphId}* with the ID of your graph database.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Sid": "NeptuneAnalyticsAccess",
            "Effect": "Allow",
            "Action": [
                "neptune-graph:GetGraph",
                "neptune-graph:ReadDataViaQuery",
                "neptune-graph:WriteDataViaQuery",
                "neptune-graph:DeleteDataViaQuery"
            ],
            "Resource": [
                "arn:aws:neptune-graph:us-east-1:123456789012:graph/${GraphId}"
            ]
        }
    ]
}
```

------

## Permissions to access your vector store in Amazon S3 Vectors
<a name="kb-permissions-s3vectors"></a>

If you choose to use Amazon S3 Vectors for your knowledge base, attach the following policy to your Amazon Bedrock Knowledge Bases service role to allow access to the vector index.

In the policy, replace the Region and account ID in the example ARN with those of your vector index. Replace *${BucketName}* with the name of your S3 vector bucket and *${IndexName}* with the name of your vector index. For more information about Amazon S3 Vectors, see [Setting up to use Amazon S3 Vectors](https://docs.aws.amazon.com/AmazonS3/latest/userguide/s3-vectors-setting-up.html).

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Sid": "S3VectorBucketReadAndWritePermission",
            "Effect": "Allow",
            "Action": [
                "s3vectors:PutVectors",
                "s3vectors:GetVectors",
                "s3vectors:DeleteVectors",
                "s3vectors:QueryVectors",
                "s3vectors:GetIndex"
            ],
            "Resource": "arn:aws:s3vectors:us-east-1:123456789012:bucket/${BucketName}/index/${IndexName}"
        }
    ]
}
```

------

## Permissions to access a vector database configured with an AWS Secrets Manager secret
<a name="kb-permissions-secret"></a>

If your vector database is configured with an AWS Secrets Manager secret, attach the following policy to your Amazon Bedrock Knowledge Bases service role to allow AWS Secrets Manager to authenticate your account to access the database. Replace the Region and account ID in the example ARN with those of your database, and replace *${SecretId}* with the ID of your secret.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "secretsmanager:GetSecretValue"
            ],
            "Resource": [
                "arn:aws:secretsmanager:us-east-1:123456789012:secret:${SecretId}"
            ]
        }
    ]
}
```

------

If you encrypted your secret with an AWS KMS key, attach permissions to decrypt the key to the role by following the steps at [Permissions to decrypt an AWS Secrets Manager secret for the vector store containing your knowledge base](encryption-kb.md#encryption-kb-3p).

## Permissions for AWS to manage an AWS KMS key for transient data storage during data ingestion
<a name="kb-permissions-kms-ingestion"></a>

To allow the creation of an AWS KMS key for transient data storage in the process of ingesting your data source, attach the following policy to your Amazon Bedrock Knowledge Bases service role. Replace the Region, account ID, and *${KeyId}* in the example ARN with the appropriate values.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "kms:GenerateDataKey",
                "kms:Decrypt"
            ],
            "Resource": [
                "arn:aws:kms:us-east-1:123456789012:key/${KeyId}"
            ]
        }
    ]
}
```

------

## Permissions to manage data sources in another AWS account
<a name="kb-permissions-otherds"></a>

To allow access to a data source in another user's AWS account, you must create a role that allows cross-account access to the Amazon S3 bucket in that account. Replace the example bucket name (*amzn-s3-demo-bucket*) and account ID (*123456789012*) in the following policies with the appropriate values.

**Permissions Required on Knowledge Base role**

The knowledge base role that is provided during knowledge base creation (`CreateKnowledgeBase`) requires the following Amazon S3 permissions.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Sid": "S3ListBucketStatement",
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::amzn-s3-demo-bucket"
            ],
            "Condition": {
                "StringEquals": {
                    "aws:ResourceAccount": "123456789012"
                }
            }
        },
        {
            "Sid": "S3GetObjectStatement",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": [
                "arn:aws:s3:::amzn-s3-demo-bucket/*"
            ],
            "Condition": {
                "StringEquals": {
                    "aws:ResourceAccount": "123456789012"
                }
            }
        }
    ]
}
```

------

If the Amazon S3 bucket is encrypted using an AWS KMS key, add the following statement to the knowledge base role as well. Replace *${Region}*, *${BucketOwnerAccountId}*, and *${KeyId}* with the appropriate values.

```
{
    "Sid": "KmsDecryptStatement",
    "Effect": "Allow",
    "Action": [
        "kms:Decrypt"
    ],
    "Resource": [
        "arn:aws:kms:${Region}:${BucketOwnerAccountId}:key/${KeyId}"
    ],
    "Condition": {
        "StringEquals": {
            "kms:ViaService": [
                "s3.${Region}.amazonaws.com"
            ]
        }
    }
}

**Permissions required on a cross-account Amazon S3 bucket policy**

The bucket in the other account requires the following Amazon S3 bucket policy. Replace *${KbRoleArn}* and the example bucket name (*amzn-s3-demo-bucket*) with the appropriate values.

------
#### [ JSON ]

****  

```
{
   "Version":"2012-10-17",		 	 	 
   "Statement": [
      {
         "Sid": "ListBucket",
         "Effect": "Allow",
         "Principal": {
            "AWS": "123456789012"
         },
         "Action": [
            "s3:ListBucket"
         ],
         "Resource": [
            "arn:aws:s3:::amzn-s3-demo-bucket"
         ]
      },
      {
         "Sid": "GetObject",
         "Effect": "Allow",
         "Principal": {
            "AWS": "123456789012"
         },
         "Action": [
            "s3:GetObject"
         ],
         "Resource": [
            "arn:aws:s3:::amzn-s3-demo-bucket/*"
         ]
      }
   ]
}
```

------
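The bucket policy's `Principal` should name the knowledge base role, so generating both statements from the same role ARN and bucket name keeps them consistent (a sketch; the function name is ours, and the Sids and actions mirror the example above):

```python
def cross_account_bucket_policy(kb_role_arn, bucket_name):
    """Build the bucket policy that lets a knowledge base role in another
    account list the bucket and read its objects."""
    def statement(sid, action, resource):
        return {
            "Sid": sid,
            "Effect": "Allow",
            "Principal": {"AWS": kb_role_arn},
            "Action": [action],
            "Resource": [resource],
        }
    return {
        "Version": "2012-10-17",
        "Statement": [
            statement("ListBucket", "s3:ListBucket",
                      f"arn:aws:s3:::{bucket_name}"),
            statement("GetObject", "s3:GetObject",
                      f"arn:aws:s3:::{bucket_name}/*"),
        ],
    }
```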

**Permissions required on cross-account AWS KMS key policy**

If the cross-account Amazon S3 bucket is encrypted using an AWS KMS key in that account, the policy of the AWS KMS key requires the following statement. Replace *${KbRoleArn}* and *${KmsKeyArn}* with the appropriate values.

```
{
    "Sid": "Example policy",
    "Effect": "Allow",
    "Principal": {
        "AWS": [
            "${KbRoleArn}"
        ]
    },
    "Action": [
        "kms:Decrypt"
    ],
    "Resource": "${KmsKeyArn}"
}
```

# Create a service role for Amazon Bedrock Flows in Amazon Bedrock
<a name="flows-permissions"></a>

To create and manage a flow in Amazon Bedrock, you must use a service role with the necessary permissions outlined on this page. You can use a service role that Amazon Bedrock automatically creates for you in the console or use one that you customize yourself.

**Note**  
If you use the service role that Amazon Bedrock automatically creates for you in the console, it attaches permissions dynamically when you add nodes to your flow and save the flow. If you remove nodes, however, the permissions aren't deleted, so you have to delete the permissions you no longer need yourself. To manage the permissions for the role that was created for you, follow the steps at [Modifying a role](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_manage_modify.html) in the IAM User Guide.

To create a custom service role for Amazon Bedrock Flows, create an IAM role by following the steps at [Creating a role to delegate permissions to an AWS service](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-service.html). Then attach the following permissions to the role.
+ Trust policy
+ The following identity-based permissions:
  + Access to the Amazon Bedrock base models that the flow will use. Add each model that's used in the flow to the `Resource` list.
  + If you invoke a model using Provisioned Throughput, permissions to access and invoke the provisioned model. Add each model that's used in the flow to the `Resource` list.
  + If you invoke a custom model, permissions to access and invoke the custom model. Add each model that's used in the flow to the `Resource` list.
  + Permissions based on the nodes that you add to the flow:
    + If you include prompt nodes that use prompts from Prompt management, you need permissions to access the prompt. Add each prompt that's used in the flow to the `Resource` list.
    + If you include knowledge base nodes, you need permissions to query the knowledge base. Add each knowledge base that's queried in the flow to the `Resource` list.
    + If you include agent nodes, you need permissions to invoke an alias of the agent. Add each agent that's invoked in the flow to the `Resource` list.
    + If you include S3 retrieval nodes, you need permissions to access the Amazon S3 bucket from which data will be retrieved. Add each bucket from which data is retrieved to the `Resource` list.
    + If you include S3 storage nodes, you need permissions to write to the Amazon S3 bucket in which output data will be stored. Add each bucket to which data is written to the `Resource` list.
    + If you include guardrails for a knowledge base node or a prompt node, you need permissions to apply the guardrails in a flow. Add each guardrail that's used in the flow to the `Resource` list.
    + If you include Lambda nodes, you need permissions to invoke the Lambda function. Add each Lambda function which needs to be invoked to the `Resource` list.
    + If you include Amazon Lex nodes, you need permissions to use the Amazon Lex bot. Add each bot alias which needs to be used to the `Resource` list.
    + If you encrypted any resource invoked in a flow, you need permissions to decrypt the key. Add each key to the `Resource` list.
+ If you encrypt the flow, you also need to attach a key policy to the KMS key that you use to encrypt the flow.

**Note**  
The following changes were recently implemented:  
Previously, AWS Lambda and Amazon Lex resources were invoked using the Amazon Bedrock service principal. This behavior is changing for flows created after 2024-11-22 and the Amazon Bedrock Flows service role will be used to invoke the AWS Lambda and Amazon Lex resources. If you created any flows that use either of these resources before 2024-11-22, you should update your Amazon Bedrock Flows service roles with AWS Lambda and Amazon Lex permissions.
Previously, Prompt management resources were rendered using the `bedrock:GetPrompt` action. This behavior is changing for flows created after 2024-11-22 and the `bedrock:RenderPrompt` action will be used to render the prompt resource. If you created any flows that use a prompt resource before 2024-11-22, you should update your Amazon Bedrock Flows service roles with `bedrock:RenderPrompt` permissions.
If you're using a service role that Amazon Bedrock automatically created for you in the console, Amazon Bedrock will attach the corrected permissions dynamically when you save the flow.

**Topics**
+ [Trust relationship](#flows-permissions-trust)
+ [Identity-based permissions for the flows service role](#flows-permissions-identity)

## Trust relationship
<a name="flows-permissions-trust"></a>

Attach the following trust policy to the flow execution role to allow Amazon Bedrock to assume this role and manage a flow. Replace the *values* as necessary. The policy contains optional condition keys (see [Condition keys for Amazon Bedrock](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazonbedrock.html#amazonbedrock-policy-keys) and [AWS global condition context keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html)) in the `Condition` field that we recommend you use as a security best practice.

**Note**  
As a best practice, replace the wildcard (`*`) in the `AWS:SourceArn` value with the ID of your flow after you have created it.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Sid": "FlowsTrustBedrock",
            "Effect": "Allow",
            "Principal": {
                "Service": "bedrock.amazonaws.com"
            },
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "123456789012"
                },
                "ArnLike": {
                    "AWS:SourceArn": "arn:aws:bedrock:us-east-1:123456789012:flow/*"
                }
            }
        }
    ]
}
```

------
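The `aws:SourceAccount` and `AWS:SourceArn` conditions guard against the confused deputy problem, so it helps to generate them from your actual account, Region, and flow ID rather than leave the wildcard in place longer than needed (a sketch; the function name is ours, and the statement mirrors the example above):

```python
def flows_trust_policy(account_id, region, flow_id="*"):
    """Build the Amazon Bedrock Flows trust policy; pass a concrete
    flow_id after creating the flow to tighten the ArnLike condition."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "FlowsTrustBedrock",
            "Effect": "Allow",
            "Principal": {"Service": "bedrock.amazonaws.com"},
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {"aws:SourceAccount": account_id},
                "ArnLike": {
                    "AWS:SourceArn":
                        f"arn:aws:bedrock:{region}:{account_id}:flow/{flow_id}"
                },
            },
        }],
    }
```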

## Identity-based permissions for the flows service role
<a name="flows-permissions-identity"></a>

Attach the following policy to provide permissions for the service role, replacing *values* as necessary. The policy contains the following statements. Omit a statement if it isn't applicable to your use case. The policy contains optional condition keys (see [Condition keys for Amazon Bedrock](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazonbedrock.html#amazonbedrock-policy-keys) and [AWS global condition context keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html)) in the `Condition` field that we recommend you use as a security best practice.
+ Access to the Amazon Bedrock base models that the flow will use. Add each model that's used in the flow to the `Resource` list.
+ If you invoke a model using Provisioned Throughput, permissions to access and invoke the provisioned model. Add each model that's used in the flow to the `Resource` list.
+ If you invoke a custom model, permissions to access and invoke the custom model. Add each model that's used in the flow to the `Resource` list.
+ Permissions based on the nodes that you add to the flow:
  + If you include prompt nodes that use prompts from Prompt management, you need permissions to access the prompt. Add each prompt that's used in the flow to the `Resource` list.
  + If you include knowledge base nodes, you need permissions to query the knowledge base. Add each knowledge base that's queried in the flow to the `Resource` list.
  + If you include agent nodes, you need permissions to invoke an alias of the agent. Add each agent that's invoked in the flow to the `Resource` list.
  + If you include S3 retrieval nodes, you need permissions to access the Amazon S3 bucket from which data will be retrieved. Add each bucket from which data is retrieved to the `Resource` list.
  + If you include S3 storage nodes, you need permissions to write to the Amazon S3 bucket in which output data will be stored. Add each bucket to which data is written to the `Resource` list.
  + If you include guardrails for a knowledge base node or a prompt node, you need permissions to apply the guardrails in a flow. Add each guardrail that's used in the flow to the `Resource` list.
  + If you include Lambda nodes, you need permissions to invoke the Lambda function. Add each Lambda function which needs to be invoked to the `Resource` list.
  + If you include Amazon Lex nodes, you need permissions to use the Amazon Lex bot. Add each bot alias which needs to be used to the `Resource` list.
  + If you encrypted any resource invoked in a flow, you need permissions to decrypt the key. Add each key to the `Resource` list.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Sid": "InvokeModel",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1::foundation-model/ModelId"
            ]
        },
        {
            "Sid": "InvokeProvisionedThroughput",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel",
                "bedrock:GetProvisionedModelThroughput"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:provisioned-model/ModelId"
            ]
        },
        {
            "Sid": "InvokeCustomModel",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel",
                "bedrock:GetCustomModel"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:custom-model/ModelId"
            ]
        },
        {
            "Sid": "UsePromptFromPromptManagement",
            "Effect": "Allow",
            "Action": [
                "bedrock:RenderPrompt"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:prompt/PromptId"
            ]
        },
        {
            "Sid": "QueryKnowledgeBase",
            "Effect": "Allow",
            "Action": [
                "bedrock:Retrieve"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:knowledge-base/KnowledgeBaseId"
            ]
        },
        {
            "Sid": "InvokeAgent",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeAgent"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:agent-alias/AgentId/AgentAliasId"
            ]
        },
        {
            "Sid": "AccessS3Bucket",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": [
                "arn:aws:s3:::amzn-s3-demo-bucket/*"
            ],
            "Condition": {
                "StringEquals": {
                    "aws:ResourceAccount": "123456789012"
                }
            }
        },
        {
            "Sid": "WriteToS3Bucket",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject"
            ],
            "Resource": [
                "arn:aws:s3:::amzn-s3-demo-bucket",
                "arn:aws:s3:::amzn-s3-demo-bucket/*"
            ],
            "Condition": {
                "StringEquals": {
                    "aws:ResourceAccount": "123456789012"
                }
            }
        },
        {
            "Sid": "GuardrailPermissions",
            "Effect": "Allow",
            "Action": [
                "bedrock:ApplyGuardrail"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:guardrail/GuardrailId"
            ]
        },
        {
            "Sid": "LambdaPermissions",
            "Effect": "Allow",
            "Action": [
                "lambda:InvokeFunction"
            ],
            "Resource": [
                "arn:aws:lambda:us-east-1:123456789012:function:FunctionId"
            ]
        },
        {
            "Sid": "AmazonLexPermissions",
            "Effect": "Allow",
            "Action": [
                "lex:RecognizeUtterance"
            ],
            "Resource": [ 
                "arn:aws:lex:us-east-1:123456789012:bot-alias/BotId/BotAliasId"
            ]
        },
        {
            "Sid": "KMSPermissions",
            "Effect": "Allow",
            "Action": [
                "kms:GenerateDataKey",
                "kms:Decrypt"
            ],
            "Resource": [
                "arn:aws:kms:us-east-1:123456789012:key/KeyId"
            ],
             "Condition": {
                "StringEquals": {
                    "aws:ResourceAccount": "123456789012"
                }
            }
        }
    ]
}
```

------
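Since you should omit statements that don't apply to your flow, one way to keep the role minimal is to filter the full document by the node types your flow actually uses (a sketch; the Sid-to-node-type mapping below is ours and mirrors the Sids in the example policy):

```python
# Which statement Sid each optional flow node type requires.
NODE_SIDS = {
    "prompt": "UsePromptFromPromptManagement",
    "knowledge_base": "QueryKnowledgeBase",
    "agent": "InvokeAgent",
    "s3_retrieval": "AccessS3Bucket",
    "s3_storage": "WriteToS3Bucket",
    "guardrail": "GuardrailPermissions",
    "lambda": "LambdaPermissions",
    "lex": "AmazonLexPermissions",
}

def trim_flow_policy(policy, node_types, always_keep=("InvokeModel",)):
    """Drop statements whose corresponding node type isn't used by the flow."""
    keep = set(always_keep) | {NODE_SIDS[n] for n in node_types}
    return {
        "Version": policy["Version"],
        "Statement": [s for s in policy["Statement"] if s["Sid"] in keep],
    }
```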

# Service role requirements for model evaluation jobs
<a name="model-evaluation-security-service-roles"></a>

To create a model evaluation job, you must specify a service role. A service role is an [IAM role](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles.html) that a service assumes to perform actions on your behalf. An IAM administrator can create, modify, and delete a service role from within IAM. For more information, see [Create a role to delegate permissions to an AWS service](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-service.html) in the *IAM User Guide*. 

The required IAM actions and resources depend on the type of model evaluation job you are creating. Use the following sections to learn more about the required Amazon Bedrock, Amazon SageMaker AI, and Amazon S3 IAM actions, service principals, and resources. You can optionally choose to encrypt your data using AWS Key Management Service.

**Topics**
+ [Service role requirements for automatic model evaluation jobs](automatic-service-roles.md)
+ [Service role requirements for human-based model evaluation jobs](model-eval-service-roles.md)
+ [Required service role permissions for creating a model evaluation job that uses a judge model](judge-service-roles.md)
+ [Service role requirements for knowledge base evaluation jobs](rag-eval-service-roles.md)

# Service role requirements for automatic model evaluation jobs
<a name="automatic-service-roles"></a>

To create an automatic model evaluation job, you must specify a service role. The policy you attach grants Amazon Bedrock access to resources in your account, and allows Amazon Bedrock to invoke the selected model on your behalf.

You must also attach a trust policy that defines Amazon Bedrock as the service principal using `bedrock.amazonaws.com`. Each of the following policy examples shows you the exact IAM actions that are required based on each service invoked in an automatic model evaluation job.

To create a custom service role, see [Creating a role that uses a custom trust policy](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-custom.html) in the *IAM User Guide*.

**Required Amazon S3 IAM actions**  
The following policy example grants access to the S3 buckets where your model evaluation results are saved, and (optionally) access to any custom prompt datasets you have specified.

------
#### [ JSON ]

****  

```
{
"Version":"2012-10-17",		 	 	 
"Statement": [
    {
        "Sid": "AllowAccessToCustomDatasets",
        "Effect": "Allow",
        "Action": [
            "s3:GetObject",
            "s3:ListBucket"
        ],
        "Resource": [
            "arn:aws:s3:::my_customdataset1_bucket",
            "arn:aws:s3:::my_customdataset1_bucket/myfolder",
            "arn:aws:s3:::my_customdataset2_bucket",
            "arn:aws:s3:::my_customdataset2_bucket/myfolder"
        ]
    },
    {
        "Sid": "AllowAccessToOutputBucket",
        "Effect": "Allow",
        "Action": [
            "s3:GetObject",
            "s3:ListBucket",
            "s3:PutObject",
            "s3:GetBucketLocation",
            "s3:AbortMultipartUpload",
            "s3:ListBucketMultipartUploads"
        ],
        "Resource": [
            "arn:aws:s3:::my_output_bucket",
            "arn:aws:s3:::my_output_bucket/myfolder"
        ]
    }
]
}
```

------
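If you keep datasets and results in several locations, a helper that emits one read-only statement covering every dataset path and a read-write statement for the output path reduces copy-paste errors (a sketch; the function name is ours, and the Sids and actions mirror the example above):

```python
def eval_s3_statements(output_path, dataset_paths=()):
    """Build the S3 statements for an automatic model evaluation job.
    Paths are "bucket" or "bucket/prefix" strings."""
    def arns(path):
        bucket = path.split("/", 1)[0]
        resources = [f"arn:aws:s3:::{bucket}"]
        if path != bucket:  # also cover the prefix, if one was given
            resources.append(f"arn:aws:s3:::{path}")
        return resources

    statements = []
    if dataset_paths:
        resources = []
        for path in dataset_paths:
            resources.extend(arns(path))
        statements.append({
            "Sid": "AllowAccessToCustomDatasets",
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": resources,
        })
    statements.append({
        "Sid": "AllowAccessToOutputBucket",
        "Effect": "Allow",
        "Action": [
            "s3:GetObject",
            "s3:ListBucket",
            "s3:PutObject",
            "s3:GetBucketLocation",
            "s3:AbortMultipartUpload",
            "s3:ListBucketMultipartUploads",
        ],
        "Resource": arns(output_path),
    })
    return statements
```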

**Required Amazon Bedrock IAM actions**  
You also need to create a policy that allows Amazon Bedrock to invoke the model you plan to specify in the automatic model evaluation job. To learn more about managing access to Amazon Bedrock models, see [Access Amazon Bedrock foundation models](model-access.md). In the `"Resource"` section of the policy, you must specify at least one ARN of a model you have access to. To use a model encrypted with a customer managed KMS key, you must add the required IAM actions and resources to the IAM service role policy. You must also add the service role to the AWS KMS key policy.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowAccessToBedrockResources",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel",
                "bedrock:InvokeModelWithResponseStream",
                "bedrock:CreateModelInvocationJob",
                "bedrock:StopModelInvocationJob",
                "bedrock:GetProvisionedModelThroughput",
                "bedrock:GetInferenceProfile", 
                "bedrock:ListInferenceProfiles",
                "bedrock:GetImportedModel",
                "bedrock:GetPromptRouter",
                "sagemaker:InvokeEndpoint"
            ],
            "Resource": [
                "arn:aws:bedrock:*::foundation-model/*",
                "arn:aws:bedrock:*:111122223333:inference-profile/*",
                "arn:aws:bedrock:*:111122223333:provisioned-model/*",
                "arn:aws:bedrock:*:111122223333:imported-model/*",
                "arn:aws:bedrock:*:111122223333:application-inference-profile/*",
                "arn:aws:bedrock:*:111122223333:default-prompt-router/*",
                "arn:aws:sagemaker:*:111122223333:endpoint/*",
                "arn:aws:bedrock:*:111122223333:marketplace/model-endpoint/all-access"
            ]
        }
    ]
}
```

------

**Service principal requirements**  
You must also specify a trust policy that defines Amazon Bedrock as the service principal. This allows Amazon Bedrock to assume the role. The wildcard (`*`) model evaluation job ARN is required so that Amazon Bedrock can create model evaluation jobs in your AWS account.

------
#### [ JSON ]

****  

```
{
"Version":"2012-10-17",		 	 	 
"Statement": [{
    "Sid": "AllowBedrockToAssumeRole",
    "Effect": "Allow",
    "Principal": {
        "Service": "bedrock.amazonaws.com"
    },
    "Action": "sts:AssumeRole",
    "Condition": {
        "StringEquals": {
            "aws:SourceAccount": "111122223333"
        },
        "ArnEquals": {
            "aws:SourceArn": "arn:aws:bedrock:us-east-1:111122223333:evaluation-job/*"
        }
    }
}]
}
```

------
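If you prefer to script role creation rather than use the console, the trust policy above can be generated for any account and Region. The following Python sketch is a minimal builder for the same document; the role name in the commented boto3 call is a placeholder assumption.

```python
import json

def build_bedrock_trust_policy(account_id: str, region: str) -> dict:
    """Trust policy that lets Amazon Bedrock assume the role, scoped to
    evaluation jobs created in the given account and Region."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "AllowBedrockToAssumeRole",
            "Effect": "Allow",
            "Principal": {"Service": "bedrock.amazonaws.com"},
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {"aws:SourceAccount": account_id},
                "ArnEquals": {
                    "aws:SourceArn": f"arn:aws:bedrock:{region}:{account_id}:evaluation-job/*"
                }
            }
        }]
    }

trust_policy = build_bedrock_trust_policy("111122223333", "us-east-1")
trust_policy_json = json.dumps(trust_policy, indent=4)

# With boto3 installed and credentials configured, the role could then be
# created like this ("BedrockEvalServiceRole" is a placeholder name):
#   import boto3
#   iam = boto3.client("iam")
#   iam.create_role(
#       RoleName="BedrockEvalServiceRole",
#       AssumeRolePolicyDocument=trust_policy_json,
#   )
```

After the role exists, attach the Amazon S3 and Amazon Bedrock permission policies shown earlier in this section to it.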

# Service role requirements for human-based model evaluation jobs
<a name="model-eval-service-roles"></a>

To create a model evaluation job that uses human evaluators, you must specify two service roles.

The following lists summarize the IAM policy requirements for each required service role that must be specified in the Amazon Bedrock console.

**Summary of IAM policy requirements for the Amazon Bedrock service role**
+ You must attach a trust policy which defines Amazon Bedrock as the service principal.
+ You must allow Amazon Bedrock to invoke the selected models on your behalf.
+ You must allow Amazon Bedrock to access the S3 bucket that holds your prompt dataset and the S3 bucket where you want the results saved.
+ You must allow Amazon Bedrock to create the required human loop resources in your account.
+ (Recommended) Use a `Condition` block to specify the accounts that can assume the role.
+ (Optional) If you've encrypted your prompt dataset bucket or the Amazon S3 bucket where you want the results saved, you must allow Amazon Bedrock to decrypt your KMS key.

**Summary of IAM policy requirements for the Amazon SageMaker AI service role**
+ You must attach a trust policy which defines SageMaker AI as the service principal.
+ You must allow SageMaker AI to access the S3 bucket that holds your prompt dataset and the S3 bucket where you want the results saved.
+ (Optional) If you've encrypted your prompt dataset bucket or the Amazon S3 bucket where you want the results saved, you must allow SageMaker AI to use your customer managed keys.

To create a custom service role, see [Creating a role that uses a custom trust policy](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-custom.html) in the *IAM User Guide*.

**Required Amazon S3 IAM actions**  
The following policy example grants access to the S3 buckets where your model evaluation results are saved, and access to the custom prompt dataset you have specified. You need to attach this policy to both the SageMaker AI service role and the Amazon Bedrock service role.

------
#### [ JSON ]

****  

```
{
"Version":"2012-10-17",		 	 	 
"Statement": [
    {
        "Sid": "AllowAccessToCustomDatasets",
        "Effect": "Allow",
        "Action": [
            "s3:GetObject",
            "s3:ListBucket"
        ],
        "Resource": [
            "arn:aws:s3:::custom-prompt-dataset"
        ]
    },
    {
        "Sid": "AllowAccessToOutputBucket",
        "Effect": "Allow",
        "Action": [
            "s3:GetObject",
            "s3:ListBucket",
            "s3:PutObject",
            "s3:GetBucketLocation",
            "s3:AbortMultipartUpload",
            "s3:ListBucketMultipartUploads"
        ],
        "Resource": [
            "arn:aws:s3:::model_evaluation_job_output"
        ]
    }
]
}
```

------

**Required Amazon Bedrock IAM actions**  
To allow Amazon Bedrock to invoke the model you plan to specify in the model evaluation job, attach the following policy to the Amazon Bedrock service role. In the `"Resource"` section of the policy, you must specify at least one ARN of a model that you have access to. To use a model that's encrypted with a customer managed KMS key, you must add the required IAM actions and resources to the IAM service role policy. You must also add any required AWS KMS key policy elements.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Sid": "AllowAccessToBedrockResources",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel",
                "bedrock:InvokeModelWithResponseStream",
                "bedrock:CreateModelInvocationJob",
                "bedrock:StopModelInvocationJob",
                "bedrock:GetProvisionedModelThroughput",
                "bedrock:GetInferenceProfile", 
                "bedrock:ListInferenceProfiles",
                "bedrock:GetImportedModel",
                "bedrock:GetPromptRouter",
                "sagemaker:InvokeEndpoint"
            ],
            "Resource": [
                "arn:aws:bedrock:*::foundation-model/*",
                "arn:aws:bedrock:*:111122223333:inference-profile/*",
                "arn:aws:bedrock:*:111122223333:provisioned-model/*",
                "arn:aws:bedrock:*:111122223333:imported-model/*",
                "arn:aws:bedrock:*:111122223333:application-inference-profile/*",
                "arn:aws:bedrock:*:111122223333:default-prompt-router/*",
                "arn:aws:sagemaker:*:111122223333:endpoint/*",
                "arn:aws:bedrock:*:111122223333:marketplace/model-endpoint/all-access"
            ]
        }
    ]
}
```

------

**Required Amazon Augmented AI IAM actions**  
You must also create a policy that allows Amazon Bedrock to create resources related to human-based model evaluation jobs. Because Amazon Bedrock creates the resources needed to start the model evaluation job, you must use `"Resource": "*"`. You must attach this policy to the Amazon Bedrock service role.

------
#### [ JSON ]

****  

```
{
"Version":"2012-10-17",		 	 	 
"Statement": [
    {
        "Sid": "ManageHumanLoops",
        "Effect": "Allow",
        "Action": [
            "sagemaker:StartHumanLoop",
            "sagemaker:DescribeFlowDefinition",
            "sagemaker:DescribeHumanLoop",
            "sagemaker:StopHumanLoop",
            "sagemaker:DeleteHumanLoop"
        ],
        "Resource": "*"
    }
]
}
```

------

**Service principal requirements (Amazon Bedrock)**  
You must also specify a trust policy that defines Amazon Bedrock as the service principal. This allows Amazon Bedrock to assume the role.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Sid": "AllowBedrockToAssumeRole",
            "Effect": "Allow",
            "Principal": {
                "Service": "bedrock.amazonaws.com"
            },
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "111122223333"
                },
                "ArnEquals": {
                    "aws:SourceArn": "arn:aws:bedrock:us-east-1:111122223333:evaluation-job/*"
                }
            }
        }
    ]
}
```

------

**Service principal requirements (SageMaker AI)**  
You must also specify a trust policy that defines SageMaker AI as the service principal. This allows SageMaker AI to assume the role.

------
#### [ JSON ]

****  

```
{
"Version":"2012-10-17",		 	 	 
"Statement": [
{
  "Sid": "AllowSageMakerToAssumeRole",
  "Effect": "Allow",
  "Principal": {
    "Service": "sagemaker.amazonaws.com"
  },
  "Action": "sts:AssumeRole"
}
]
}
```

------

# Required service role permissions for creating a model evaluation job that uses a judge model
<a name="judge-service-roles"></a>

To create a model evaluation job that uses an LLM as a judge, you must specify a service role. The policy that you attach grants Amazon Bedrock access to resources in your account and allows Amazon Bedrock to invoke the selected models on your behalf.

The trust policy defines Amazon Bedrock as the service principal using `bedrock.amazonaws.com`. Each of the following policy examples shows the exact IAM actions that are required for each service invoked in the model evaluation job.

To create a custom service role as described below, see [Creating a role that uses a custom trust policy](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-custom.html) in the *IAM User Guide*.

## Required Amazon Bedrock IAM actions
<a name="judge-service-roles-br"></a>

You need to create a policy that allows Amazon Bedrock to invoke the models you plan to specify in the model evaluation job. To learn more about managing access to Amazon Bedrock models, see [Access Amazon Bedrock foundation models](model-access.md). In the `"Resource"` section of the policy, you must specify at least one ARN of a model that you have access to. To use a model that's encrypted with a customer managed KMS key, you must add the required IAM actions and resources to the IAM service role policy. You must also add the service role to the AWS KMS key policy.

The service role must include access to at least one supported evaluator model. For a list of currently supported evaluator models, see [Supported models](evaluation-judge.md#evaluation-judge-supported).

------
#### [ JSON ]

****  

```
{
	"Version":"2012-10-17",		 	 	 
	"Statement": [
		{
			"Sid": "BedrockModelInvoke",
			"Effect": "Allow",
			"Action": [
				"bedrock:InvokeModel",
				"bedrock:CreateModelInvocationJob",
				"bedrock:StopModelInvocationJob"
			],
			"Resource": [
				"arn:aws:bedrock:us-east-1::foundation-model/*",
				"arn:aws:bedrock:us-east-1:111122223333:inference-profile/*",
				"arn:aws:bedrock:us-east-1:111122223333:provisioned-model/*",
				"arn:aws:bedrock:us-east-1:111122223333:imported-model/*"
			]
		}
	]
}
```

------

## Required Amazon S3 IAM actions and resources
<a name="judge-service-roles-s3"></a>

Your service role policy needs to include access to the Amazon S3 bucket where you want the output of model evaluation jobs saved, and access to the prompt dataset you have specified in your `CreateEvaluationJob` request or via the Amazon Bedrock console.

------
#### [ JSON ]

****  

```
{
	"Version":"2012-10-17",		 	 	 
	"Statement": [
		{
			"Sid": "FetchAndUpdateOutputBucket",
			"Effect": "Allow",
			"Action": [
				"s3:GetObject",
				"s3:ListBucket",
				"s3:PutObject",
				"s3:GetBucketLocation",
				"s3:AbortMultipartUpload",
				"s3:ListBucketMultipartUploads"
			],
			"Resource": [
				"arn:aws:s3:::my_customdataset1_bucket",
				"arn:aws:s3:::my_customdataset1_bucket/myfolder",
				"arn:aws:s3:::my_customdataset2_bucket",
				"arn:aws:s3:::my_customdataset2_bucket/myfolder"
			]
		}
	]
}
```

------

# Service role requirements for knowledge base evaluation jobs
<a name="rag-eval-service-roles"></a>

To create a knowledge base evaluation job, you must specify a service role. The policy that you attach to the role grants Amazon Bedrock access to resources in your account, and it allows Amazon Bedrock to do the following:
+ Invoke the models that you select for output generation with the `RetrieveAndGenerate` API action, and evaluate the knowledge base outputs.
+ Invoke the Amazon Bedrock Knowledge Bases `Retrieve` and `RetrieveAndGenerate` API actions on your knowledge base instance.

To create a custom service role, see [Creating a role that uses custom trust policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-custom.html) in the *IAM User Guide*.

**Required IAM actions for Amazon S3 access**  
The following example policy grants access to the S3 buckets where both of the following occur: 
+ You save your knowledge base evaluation results.
+ Amazon Bedrock reads your input dataset.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement":
    [
        {
            "Sid": "AllowAccessToCustomDatasets",
            "Effect": "Allow",
            "Action":
            [
                "s3:GetObject",
                "s3:ListBucket"
            ],
            "Resource":
            [
                "arn:aws:s3:::my_customdataset1_bucket",
                "arn:aws:s3:::my_customdataset1_bucket/myfolder",
                "arn:aws:s3:::my_customdataset2_bucket",
                "arn:aws:s3:::my_customdataset2_bucket/myfolder"
            ]
        },
        {
            "Sid": "AllowAccessToOutputBucket",
            "Effect": "Allow",
            "Action":
            [
                "s3:GetObject",
                "s3:ListBucket",
                "s3:PutObject",
                "s3:GetBucketLocation",
                "s3:AbortMultipartUpload",
                "s3:ListBucketMultipartUploads"
            ],
            "Resource":
            [
                "arn:aws:s3:::my_output_bucket",
                "arn:aws:s3:::my_output_bucket/myfolder"
            ]
        }
    ]
}
```

------

**Required Amazon Bedrock IAM actions**  
You also need to create a policy that allows Amazon Bedrock to do the following:

1. Invoke the models that you plan to specify for the following: 
   + Result generation with the `RetrieveAndGenerate` API action.
   + Evaluation of results.

   For the `Resource` key in the policy, you must specify at least one ARN of a model you have access to. To use a model that's encrypted with a customer-managed KMS key, you must add the required IAM actions and resources to the IAM service role policy. You must also add the service role to the AWS KMS key policy.

1. Call the `Retrieve` and `RetrieveAndGenerate` API actions. Note that when the console creates the role automatically, it grants permissions for both the `Retrieve` and `RetrieveAndGenerate` API actions, regardless of which action you choose to evaluate for that job. This makes the role more flexible and reusable. However, for added security, the automatically created role is tied to a single knowledge base instance.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Sid": "AllowSpecificModels",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel",
                "bedrock:InvokeModelWithResponseStream",
                "bedrock:CreateModelInvocationJob",
                "bedrock:StopModelInvocationJob",
                "bedrock:GetProvisionedModelThroughput",
                "bedrock:GetInferenceProfile",
                "bedrock:GetImportedModel"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1::foundation-model/*",
                "arn:aws:bedrock:us-east-1:123456789012:inference-profile/*",
                "arn:aws:bedrock:us-east-1:123456789012:provisioned-model/*",
                "arn:aws:bedrock:us-east-1:123456789012:imported-model/*",
                "arn:aws:bedrock:us-east-1:123456789012:application-inference-profile/*"
            ]
        },
        {
            "Sid": "AllowKnowledgeBaseApis",
            "Effect": "Allow",
            "Action": [
                "bedrock:Retrieve",
                "bedrock:RetrieveAndGenerate"
            ],
            "Resource": [
                "arn:aws:bedrock:us-east-1:123456789012:knowledge-base/knowledge-base-id"
            ]
        }
    ]
}
```

------

**Service Principal Requirements**  
You must also specify a trust policy that defines Amazon Bedrock as the service principal. This policy allows Amazon Bedrock to assume the role. The wildcard (`*`) model evaluation job ARN is required so that Amazon Bedrock can create model evaluation jobs in your AWS account.

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Sid": "AllowBedrockToAssumeRole",
            "Effect": "Allow",
            "Principal": {
                "Service": "bedrock.amazonaws.com"
            },
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "123456789012"
                },
                "ArnEquals": {
                    "aws:SourceArn": "arn:aws:bedrock:us-east-1:123456789012:evaluation-job/*"
                }
            }
        }
    ]
}
```

------

# Configure access to Amazon S3 buckets
<a name="s3-bucket-access"></a>

Multiple Amazon Bedrock features require access to data that is stored in Amazon S3 buckets. To access this data, you must configure the following permissions:


****  

| Use case | Permissions | 
| --- | --- | 
| Permissions to retrieve data from S3 bucket | `s3:GetObject`, `s3:ListBucket` | 
| Permissions to write data to S3 bucket | s3:PutObject | 
| Permissions to decrypt KMS key that encrypted the S3 bucket | `kms:Decrypt`, `kms:DescribeKey` | 

The identities or resources to which you need to attach the preceding permissions depend on the following factors:
+ Multiple features in Amazon Bedrock use [service roles](security-iam-sr.md). If a feature uses a service role, you must configure the permissions such that the service role, rather than the user's IAM identity, has access to the S3 data. Some Amazon Bedrock features can automatically create a service role for you and attach the required [identity-based permissions](https://docs.aws.amazon.com//IAM/latest/UserGuide/access_policies.html#policies_id-based) to the service role, if you use the AWS Management Console.
+ Some features in Amazon Bedrock allow an identity to access an S3 bucket in a different account. If S3 data needs to be accessed from a different account, the bucket owner must include the above [resource-based permissions](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies.html#policies_resource-based) in an [S3 bucket policy](https://docs.aws.amazon.com/AmazonS3/latest/userguide/bucket-policies.html) attached to the S3 bucket.

The following describes how to determine where you need to attach the necessary permissions to access S3 data:
+ IAM identity permissions
  + If you can auto-create a service role in the console, the permissions will be configured for the service role, so you don't need to configure it yourself.
  + If you prefer to use a custom service role or the identity that requires access isn't a service role, navigate to [Attach permissions to an IAM identity to allow it to access an Amazon S3 bucket](#s3-bucket-access-identity) to learn how to create an identity-based policy with the proper permissions.
+ Resource-based permissions
  + If the identity requires access to S3 data in the same account, you don't need to attach an S3 bucket policy to the bucket containing the data.
  + If the identity requires access to S3 data in a different account, navigate to [Attach a bucket policy to an Amazon S3 bucket to allow another account to access it](#s3-bucket-access-cross-account) to learn how to create an S3 bucket policy with the proper permissions.
**Important**  
Automatic creation of a service role in the AWS Management Console attaches the proper identity-based permissions to the role, but you still must configure the S3 bucket policy if the identity that requires access to it is in a different AWS account.

For more information, see the following links:
+ To learn more about controlling access to data in Amazon S3, see [Access control in Amazon S3](https://docs.aws.amazon.com/AmazonS3/latest/userguide/access-management.html).
+ To learn more about Amazon S3 permissions, see [Actions defined by Amazon S3](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-actions-as-permissions).
+ To learn more about AWS KMS permissions, see [Actions defined by AWS Key Management Service](https://docs.aws.amazon.com/service-authorization/latest/reference/list_awskeymanagementservice.html#awskeymanagementservice-actions-as-permissions).

Proceed through the topics that pertain to your use case:

**Topics**
+ [Attach permissions to an IAM identity to allow it to access an Amazon S3 bucket](#s3-bucket-access-identity)
+ [Attach a bucket policy to an Amazon S3 bucket to allow another account to access it](#s3-bucket-access-cross-account)
+ [(Advanced security option) Include conditions in a statement for more fine-grained access](#s3-bucket-access-conditions)

## Attach permissions to an IAM identity to allow it to access an Amazon S3 bucket
<a name="s3-bucket-access-identity"></a>

This topic provides a template for a policy to attach to an IAM identity. The policy includes the following statements defining permissions to grant an IAM identity access to an S3 bucket:

1. Permissions to retrieve data from an S3 bucket. This statement also includes a condition using the `s3:prefix` [condition key](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-policy-keys) to restrict access to a specific folder in the bucket. For more information about this condition, see the **User policy** section in [Example 2: Getting a list of objects in a bucket with a specific prefix](https://docs.aws.amazon.com/AmazonS3/latest/userguide/amazon-s3-policy-keys.html#condition-key-bucket-ops-2).

1. (If you need to write data to an S3 location) Permissions to write data to an S3 bucket. This statement also includes a condition using the `aws:ResourceAccount` [condition key](https://docs.aws.amazon.com//IAM/latest/UserGuide/reference_policies_condition-keys.html#condition-keys-resourceaccount) to restrict access to requests sent from a specific AWS account.

1. (If the S3 bucket is encrypted with a KMS key) Permissions to describe and decrypt the KMS key that encrypted the S3 bucket.
**Note**  
If your S3 bucket is versioning-enabled, each object version that you upload by using this feature can have its own encryption key. You're responsible for tracking which encryption key was used for which object version.

Add, modify, and remove the statements, resources, and conditions in the following policy and replace the `${values}` placeholders as necessary:

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Sid": "ReadS3Bucket",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::${S3Bucket}",
                "arn:aws:s3:::${S3Bucket}/*"
            ]
        },
        {
            "Sid": "WriteToS3Bucket",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:PutObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::${S3Bucket}",
                "arn:aws:s3:::${S3Bucket}/*"
            ]
        },
        {
            "Sid": "DecryptKMSKey",
            "Effect": "Allow",
            "Action": [
                "kms:Decrypt",
                "kms:DescribeKey"
            ],
            "Resource": "arn:aws:kms:us-east-1:123456789012:key/${KMSKeyId}"
        }
    ]
}
```

------

After modifying the policy to your use case, attach it to the service role (or IAM identity) that requires access to the S3 bucket. To learn how to attach permissions to an IAM identity, see [Adding and removing IAM identity permissions](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_manage-attach-detach.html).

## Attach a bucket policy to an Amazon S3 bucket to allow another account to access it
<a name="s3-bucket-access-cross-account"></a>

This topic provides a template for a resource-based policy to attach to an S3 bucket to allow an IAM identity to access data in the bucket. The policy includes the following statements defining permissions for an identity to access the bucket:

1. Permissions to retrieve data from an S3 bucket.

1. (If you need to write data to an S3 location) Permissions to write data to an S3 bucket.

1. (If the S3 bucket is encrypted with a KMS key) Permissions to describe and decrypt the KMS key that encrypted the S3 bucket.
**Note**  
If your S3 bucket is versioning-enabled, each object version that you upload by using this feature can have its own encryption key. You're responsible for tracking which encryption key was used for which object version.

The permissions are similar to the identity-based permissions described in [Attach permissions to an IAM identity to allow it to access an Amazon S3 bucket](#s3-bucket-access-identity). However, each statement also requires a `Principal` field that specifies the identity to which the permissions are granted (with most features in Amazon Bedrock, this is the service role). Add, modify, and remove the statements, resources, and conditions in the following policy and replace the `${values}` placeholders as necessary:

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Sid": "ReadS3Bucket",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::111122223333:role/ServiceRole"
            },
            "Action": [
                "s3:GetObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::${S3Bucket}",
                "arn:aws:s3:::${S3Bucket}/*"
            ]
        },
        {
            "Sid": "WriteToS3Bucket",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::111122223333:role/ServiceRole"
            },
            "Action": [
                "s3:GetObject",
                "s3:PutObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::${S3Bucket}",
                "arn:aws:s3:::${S3Bucket}/*"
            ]
        },
        {
            "Sid": "DecryptKMSKey",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::111122223333:role/ServiceRole"
            },
            "Action": [
                "kms:Decrypt",
                "kms:DescribeKey"
            ],
            "Resource": "arn:aws:kms:us-east-1:123456789012:key/${KMSKeyId}"
        }
    ]
}
```

------

After modifying the policy to your use case, attach it to the S3 bucket. To learn how to attach a bucket policy to an S3 bucket, see [Adding a bucket policy by using the Amazon S3 console](https://docs.aws.amazon.com/AmazonS3/latest/userguide/add-bucket-policy.html).
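As with the identity-based policy, the bucket policy can be generated and applied in a short script. The following Python sketch builds a combined read/write statement for a single cross-account role; the bucket and role names are placeholder assumptions, and the boto3 call is shown only in comments.

```python
import json

def build_cross_account_bucket_policy(bucket: str, role_arn: str) -> dict:
    """Bucket policy granting the given role (typically in another
    account) read/write access to the bucket."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "CrossAccountReadWrite",
            "Effect": "Allow",
            "Principal": {"AWS": role_arn},
            "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
            "Resource": [f"arn:aws:s3:::{bucket}", f"arn:aws:s3:::{bucket}/*"],
        }],
    }

bucket_policy = build_cross_account_bucket_policy(
    "amzn-s3-demo-bucket", "arn:aws:iam::111122223333:role/ServiceRole"
)
policy_text = json.dumps(bucket_policy)

# Apply it with boto3 (the caller must own the bucket):
#   import boto3
#   boto3.client("s3").put_bucket_policy(
#       Bucket="amzn-s3-demo-bucket", Policy=policy_text
#   )
```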

## (Advanced security option) Include conditions in a statement for more fine-grained access
<a name="s3-bucket-access-conditions"></a>

For greater control over the identities that can access your resources, you can include conditions in a policy statement. The policy in this topic provides an example that uses the following condition keys:
+ `s3:prefix` – An S3 condition key that restricts access to a specific folder in an S3 bucket. For more information about this condition key, see the **User policy** section in [Example 2: Getting a list of objects in a bucket with a specific prefix](https://docs.aws.amazon.com/AmazonS3/latest/userguide/amazon-s3-policy-keys.html#condition-key-bucket-ops-2).
+ `aws:ResourceAccount` – A global condition key that restricts access to requests from a specific AWS account.

The following policy restricts read access to the *my-folder* folder in the *amzn-s3-demo-bucket* S3 bucket and restricts write access for the *amzn-s3-demo-destination-bucket* S3 bucket to requests from the AWS account with the ID *111122223333*:

------
#### [ JSON ]

****  

```
{
    "Version":"2012-10-17",		 	 	 
    "Statement": [
        {
            "Sid": "ReadS3Bucket",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::amzn-s3-demo-bucket",
                "arn:aws:s3:::amzn-s3-demo-bucket/*"
            ],
            "Condition" : {
                "StringEquals" : {
                    "s3:prefix": "my-folder" 
                }
            }
        },
        {
            "Sid": "WriteToS3Bucket",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:PutObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::amzn-s3-demo-destination-bucket",
                "arn:aws:s3:::amzn-s3-demo-destination-bucket/*"
            ],
            "Condition": {
                "StringEquals": {
                    "aws:ResourceAccount": "111122223333"
                }
            }
        }
    ]
}
```

------

To learn more about conditions and condition keys, see the following links:
+ To learn about conditions, see [IAM JSON policy elements: Condition](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_elements_condition.html) in the IAM User Guide.
+ To learn about condition keys specific to S3, see [Condition keys for Amazon S3](https://docs.aws.amazon.com/service-authorization/latest/reference/list_amazons3.html#amazons3-policy-keys) in the Service Authorization Reference.
+ To learn about global condition keys used across AWS services, see [AWS global condition context keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html#condition-keys-resourceaccount).

# Troubleshooting Amazon Bedrock identity and access
<a name="security_iam_troubleshoot"></a>

Use the following information to help you diagnose and fix common issues that you might encounter when working with Amazon Bedrock and IAM.

**Topics**
+ [I am not authorized to perform an action in Amazon Bedrock](#security_iam_troubleshoot-no-permissions)
+ [I am not authorized to perform iam:PassRole](#security_iam_troubleshoot-passrole)
+ [I want to allow people outside of my AWS account to access my Amazon Bedrock resources](#security_iam_troubleshoot-cross-account-access)

## I am not authorized to perform an action in Amazon Bedrock
<a name="security_iam_troubleshoot-no-permissions"></a>

If you receive an error that you're not authorized to perform an action, your policies must be updated to allow you to perform the action.

The following example error occurs when the `mateojackson` IAM user tries to use the console to view details about a fictional `my-example-widget` resource but doesn't have the fictional `bedrock:GetWidget` permissions.

```
User: arn:aws:iam::123456789012:user/mateojackson is not authorized to perform: bedrock:GetWidget on resource: my-example-widget
```

In this case, the policy for the `mateojackson` user must be updated to allow access to the `my-example-widget` resource by using the `bedrock:GetWidget` action.
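
For example, an administrator could attach an identity-based policy similar to the following sketch to the `mateojackson` user. The action and resource here are the fictional ones from the error message; substitute the actual action and resource ARN that appear in your own error.

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "bedrock:GetWidget",
            "Resource": "arn:aws:bedrock:us-east-1:123456789012:widget/my-example-widget"
        }
    ]
}
```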

If you need help, contact your AWS administrator. Your administrator is the person who provided you with your sign-in credentials.

## I am not authorized to perform iam:PassRole
<a name="security_iam_troubleshoot-passrole"></a>

If you receive an error that you're not authorized to perform the `iam:PassRole` action, your policies must be updated to allow you to pass a role to Amazon Bedrock.

Some AWS services allow you to pass an existing role to that service instead of creating a new service role or service-linked role. To do this, you must have permissions to pass the role to the service.

The following example error occurs when an IAM user named `marymajor` tries to use the console to perform an action in Amazon Bedrock. However, the action requires the service to have permissions that are granted by a service role. Mary does not have permissions to pass the role to the service.

```
User: arn:aws:iam::123456789012:user/marymajor is not authorized to perform: iam:PassRole
```

In this case, Mary's policies must be updated to allow her to perform the `iam:PassRole` action.
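
For example, an administrator could attach a policy similar to the following sketch, which lets Mary pass a specific service role only to Amazon Bedrock. The role name `BedrockExecutionRole` is a placeholder; the `iam:PassedToService` condition restricts which service the role can be passed to.

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "iam:PassRole",
            "Resource": "arn:aws:iam::123456789012:role/BedrockExecutionRole",
            "Condition": {
                "StringEquals": {
                    "iam:PassedToService": "bedrock.amazonaws.com"
                }
            }
        }
    ]
}
```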

If you need help, contact your AWS administrator. Your administrator is the person who provided you with your sign-in credentials.

## I want to allow people outside of my AWS account to access my Amazon Bedrock resources
<a name="security_iam_troubleshoot-cross-account-access"></a>

You can create a role that users in other accounts or people outside of your organization can use to access your resources. You can specify who is trusted to assume the role. For services that support resource-based policies or access control lists (ACLs), you can use those policies to grant people access to your resources.

To learn more, consult the following:
+ To learn whether Amazon Bedrock supports these features, see [How Amazon Bedrock works with IAM](security_iam_service-with-iam.md).
+ To learn how to provide access to your resources across AWS accounts that you own, see [Providing access to an IAM user in another AWS account that you own](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_common-scenarios_aws-accounts.html) in the *IAM User Guide*.
+ To learn how to provide access to your resources to third-party AWS accounts, see [Providing access to AWS accounts owned by third parties](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_common-scenarios_third-party.html) in the *IAM User Guide*.
+ To learn how to provide access through identity federation, see [Providing access to externally authenticated users (identity federation)](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_common-scenarios_federated-users.html) in the *IAM User Guide*.
+ To learn the difference between using roles and resource-based policies for cross-account access, see [Cross account resource access in IAM](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies-cross-account-resource-access.html) in the *IAM User Guide*.
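
As an illustrative sketch, a role trust policy that allows principals in another account to assume the role might look like the following. The account ID `444455556666` is a placeholder, and in practice you would typically narrow the principal to specific roles or users and add conditions such as `sts:ExternalId`.

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::444455556666:root"
            },
            "Action": "sts:AssumeRole"
        }
    ]
}
```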

# Cross-account access to Amazon S3 bucket for custom model import jobs
<a name="cross-account-access-cmi"></a>

If you are importing your model from an Amazon S3 bucket that belongs to a different AWS account, the bucket owner must grant your account permission to access the bucket before you import your customized model. See [Prerequisites for importing custom model](custom-model-import-prereq.md).

## Configure cross-account access to Amazon S3 bucket
<a name="configure-cross-acct-access"></a>

This section walks you through the steps for creating the policies that grant the import role in the user's account access to the Amazon S3 bucket in the bucket owner's account.

1. In the bucket owner's account, create a bucket policy that grants access to the import role in the user's account.

   The following example bucket policy, created and applied to bucket `s3://amzn-s3-demo-bucket` by the bucket owner, grants access to the role `ImportRole` in the user's account `123456789012`.

------
#### [ JSON ]

****  

   ```
   { 
       "Version": "2012-10-17",
       "Statement": [
           {
               "Sid": "CrossAccountAccess",
               "Effect": "Allow",
               "Principal": {
                   "AWS": "arn:aws:iam::123456789012:role/ImportRole"
               },           
               "Action": [
                   "s3:ListBucket",
                   "s3:GetObject"
               ],
               "Resource": [
                   "arn:aws:s3:::amzn-s3-demo-bucket",
                   "arn:aws:s3:::amzn-s3-demo-bucket/*"
               ]
           }
       ]
   }
   ```

------

1. In the user’s AWS account, create an import execution role policy. For `aws:ResourceAccount`, specify the account ID of the bucket owner's AWS account.

   The following example import execution role policy in the user's account grants access to the Amazon S3 bucket `s3://amzn-s3-demo-bucket` when it belongs to the bucket owner's account `111122223333`.

------
#### [ JSON ]

****  

   ```
   {
       "Version": "2012-10-17",
       "Statement": [
           {
               "Effect": "Allow",
               "Action": [
                   "s3:ListBucket",
                   "s3:GetObject"
               ],
               "Resource": [
                   "arn:aws:s3:::amzn-s3-demo-bucket",
                   "arn:aws:s3:::amzn-s3-demo-bucket/*"
               ],
               "Condition": {
                   "StringEquals": {
                       "aws:ResourceAccount": "111122223333"
                   }
               }
           }
       ]
   }
   ```

------

## Configure cross-account access to Amazon S3 bucket encrypted with a custom AWS KMS key
<a name="configure-cross-acct-access-kms"></a>

If the Amazon S3 bucket is encrypted with a custom AWS Key Management Service (AWS KMS) key, the bucket owner must also grant the import role in the user's account access to that key.

To configure cross-account access to an Amazon S3 bucket encrypted with a custom AWS KMS key

1. In the bucket owner's account, create a bucket policy that grants access to the import role in the user's account.

   The following example bucket policy, created and applied to bucket `s3://amzn-s3-demo-bucket` by the bucket owner, grants access to the role `ImportRole` in the user's account `123456789012`.

------
#### [ JSON ]

****  

   ```
   { 
      "Version": "2012-10-17",
      "Statement": [
       {
           "Sid": "CrossAccountAccess",
           "Effect": "Allow",
           "Principal": {
               "AWS": "arn:aws:iam::123456789012:role/ImportRole"
           },           
           "Action": [
               "s3:ListBucket",
               "s3:GetObject"
           ],
           "Resource": [
               "arn:aws:s3:::amzn-s3-demo-bucket",
               "arn:aws:s3:::amzn-s3-demo-bucket/*"
           ]
        }
      ]
   }
   ```

------

1. In the bucket owner's account, add a statement like the following to the AWS KMS key policy to allow the import role in the user's account to decrypt with the key.

   ```
   {
       "Sid": "Allow use of the key by the destination account",
       "Effect": "Allow",
       "Principal": {
           "AWS": "arn:aws:iam::123456789012:role/ImportRole"
       },
       "Action": [
           "kms:Decrypt",
           "kms:DescribeKey"
       ],
       "Resource": "*"
   }
   ```

1. In the user’s AWS account, create an import execution role policy. For `aws:ResourceAccount`, specify the account ID of the bucket owner's AWS account. Also, grant access to the AWS KMS key that is used to encrypt the bucket.

   The following example import execution role policy in the user's account grants access to the Amazon S3 bucket `s3://amzn-s3-demo-bucket` when it belongs to the bucket owner's account `111122223333`, and to the AWS KMS key `arn:aws:kms:us-west-2:111122223333:key/111aa2bb-333c-4d44-5555-a111bb2c33dd`.

------
#### [ JSON ]

****  

   ```
   {
       "Version": "2012-10-17",
       "Statement": [
           {
               "Effect": "Allow",
               "Action": [
                   "s3:ListBucket",
                   "s3:GetObject"
               ],
               "Resource": [
                   "arn:aws:s3:::amzn-s3-demo-bucket",
                   "arn:aws:s3:::amzn-s3-demo-bucket/*"
               ],
               "Condition": {
                   "StringEquals": {
                       "aws:ResourceAccount": "111122223333"
                   }
               }
           },
           {
               "Effect": "Allow",
               "Action": [
                   "kms:Decrypt",
                   "kms:DescribeKey"
               ],
               "Resource": "arn:aws:kms:us-west-2:111122223333:key/111aa2bb-333c-4d44-5555-a111bb2c33dd"
           }
       ]
   }
   ```

------

# Compliance validation for Amazon Bedrock
<a name="compliance-validation"></a>

To learn whether an AWS service is within the scope of specific compliance programs, see [AWS services in Scope by Compliance Program](https://aws.amazon.com/compliance/services-in-scope/) and choose the compliance program that you are interested in. For general information, see [AWS Compliance Programs](https://aws.amazon.com/compliance/programs/).

You can download third-party audit reports using AWS Artifact. For more information, see [Downloading Reports in AWS Artifact](https://docs.aws.amazon.com/artifact/latest/ug/downloading-documents.html).

Your compliance responsibility when using AWS services is determined by the sensitivity of your data, your company's compliance objectives, and applicable laws and regulations. For more information about your compliance responsibility when using AWS services, see [AWS Security Documentation](https://docs.aws.amazon.com/security/).

# Incident response in Amazon Bedrock
<a name="security-incident-response"></a>

Security is the highest priority at AWS. As part of the AWS Cloud [shared responsibility model](https://aws.amazon.com/compliance/shared-responsibility-model), AWS manages a data center, network, and software architecture that meets the requirements of the most security-sensitive organizations. AWS is responsible for any incident response with respect to the Amazon Bedrock service itself. Also, as an AWS customer, you share a responsibility for maintaining security in the cloud. This means that you control the security you choose to implement from the AWS tools and features you have access to. In addition, you’re responsible for incident response on your side of the shared responsibility model.

By establishing a security baseline that meets the objectives for your applications running in the cloud, you're able to detect deviations that you can respond to. To help you understand the impact that incident response and your choices have on your corporate goals, we encourage you to review the following resources:
+ [AWS Security Incident Response Guide](https://docs.aws.amazon.com/whitepapers/latest/aws-security-incident-response-guide/welcome.html)
+ [AWS Best Practices for Security, Identity, and Compliance](https://aws.amazon.com/architecture/security-identity-compliance)
+ [Security Perspective of the AWS Cloud Adoption Framework (CAF)](https://docs.aws.amazon.com/whitepapers/latest/overview-aws-cloud-adoption-framework/security-perspective.html) whitepaper

[Amazon GuardDuty](https://aws.amazon.com/guardduty/) is a managed threat detection service that continuously monitors for malicious or unauthorized behavior to help you protect your AWS accounts and workloads, and to identify suspicious activity before it escalates into an incident. It monitors activity such as unusual API calls or potentially unauthorized deployments that indicate a possible account or resource compromise, or reconnaissance by bad actors. For example, Amazon GuardDuty can detect suspicious activity in Amazon Bedrock APIs, such as a user signing in from a new location and using Amazon Bedrock APIs to remove Amazon Bedrock Guardrails or to change the Amazon S3 bucket set for model training data.

# Resilience in Amazon Bedrock
<a name="disaster-recovery-resiliency"></a>

The AWS global infrastructure is built around AWS Regions and Availability Zones. AWS Regions provide multiple physically separated and isolated Availability Zones, which are connected with low-latency, high-throughput, and highly redundant networking. With Availability Zones, you can design and operate applications and databases that automatically fail over between zones without interruption. Availability Zones are more highly available, fault tolerant, and scalable than traditional single or multiple data center infrastructures. 

For more information about AWS Regions and Availability Zones, see [AWS Global Infrastructure](https://aws.amazon.com/about-aws/global-infrastructure/).

# Infrastructure security in Amazon Bedrock
<a name="infrastructure-security"></a>

As a managed service, Amazon Bedrock is protected by the AWS global network security. For information about AWS security services and how AWS protects infrastructure, see [AWS Cloud Security](https://aws.amazon.com/security/). To design your AWS environment using the best practices for infrastructure security, see [Infrastructure Protection](https://docs.aws.amazon.com/wellarchitected/latest/security-pillar/infrastructure-protection.html) in *Security Pillar AWS Well‐Architected Framework*.

You use AWS published API calls to access Amazon Bedrock through the network. Clients must support the following:
+ Transport Layer Security (TLS). We require TLS 1.2 and recommend TLS 1.3.
+ Cipher suites with perfect forward secrecy (PFS) such as DHE (Ephemeral Diffie-Hellman) or ECDHE (Elliptic Curve Ephemeral Diffie-Hellman). Most modern systems such as Java 7 and later support these modes.
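
As a minimal illustration of the client-side TLS requirement, the following Python sketch configures an SSL context that refuses to negotiate anything older than TLS 1.2. This is only to show the concept; the AWS SDKs and CLI configure TLS for you.

```
import ssl

# Build a client-side TLS context with certificate verification enabled,
# then floor the protocol version at TLS 1.2. Modern OpenSSL builds will
# still negotiate TLS 1.3 when the server supports it.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

print(context.minimum_version.name)  # TLSv1_2
```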

Additionally, requests must be signed by using an access key ID and a secret access key that is associated with an IAM principal. Or you can use the [AWS Security Token Service](https://docs.aws.amazon.com/STS/latest/APIReference/Welcome.html) (AWS STS) to generate temporary security credentials to sign requests.

# Cross-service confused deputy prevention
<a name="cross-service-confused-deputy-prevention"></a>

The confused deputy problem is a security issue where an entity that doesn't have permission to perform an action can coerce a more-privileged entity to perform the action. In AWS, cross-service impersonation can result in the confused deputy problem. Cross-service impersonation can occur when one service (the *calling service*) calls another service (the *called service*). The calling service can be manipulated to use its permissions to act on another customer's resources in a way it should not otherwise have permission to access. To prevent this, AWS provides tools that help you protect your data for all services with service principals that have been given access to resources in your account. 

We recommend using the [`aws:SourceArn`](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html#condition-keys-sourcearn) and [`aws:SourceAccount`](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html#condition-keys-sourceaccount) global condition context keys in resource policies to limit the permissions that Amazon Bedrock gives another service to the resource. Use `aws:SourceArn` if you want only one resource to be associated with the cross-service access. Use `aws:SourceAccount` if you want to allow any resource in that account to be associated with the cross-service use.

The most effective way to protect against the confused deputy problem is to use the `aws:SourceArn` global condition context key with the full ARN of the resource. If you don't know the full ARN of the resource or if you are specifying multiple resources, use the `aws:SourceArn` global condition context key with wildcard characters (`*`) for the unknown portions of the ARN. For example, `arn:aws:bedrock:*:123456789012:*`. 

If the `aws:SourceArn` value does not contain the account ID, such as an Amazon S3 bucket ARN, you must use both global condition context keys to limit permissions. 

The value of `aws:SourceArn` must be the ARN of the Amazon Bedrock resource that is associated with the cross-service access, such as a model customization job. 

The following example shows how you can use the `aws:SourceArn` and `aws:SourceAccount` global condition context keys in Amazon Bedrock to prevent the confused deputy problem.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "bedrock.amazonaws.com"
            },
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "111122223333"
                },
                "ArnEquals": {
                    "aws:SourceArn": "arn:aws:bedrock:us-east-1:111122223333:model-customization-job/*"
                }
            }
        }
    ] 
}
```

------

# Configuration and vulnerability analysis in Amazon Bedrock
<a name="vulnerability-analysis-and-management"></a>

Configuration and IT controls are a shared responsibility between AWS and you, our customer. For more information, see the AWS [shared responsibility model](https://aws.amazon.com/compliance/shared-responsibility-model/).

# Amazon Bedrock abuse detection
<a name="abuse-detection"></a>

AWS is committed to the responsible use of AI. To help prevent potential misuse, Amazon Bedrock implements automated abuse detection mechanisms to identify potential violations of AWS’s [Acceptable Use Policy](https://aws.amazon.com/aup/) (AUP) and Service Terms, including the [Responsible AI Policy](https://aws.amazon.com/machine-learning/responsible-ai/policy/) or a third-party model provider’s AUP.

Our abuse detection mechanisms are fully automated, so there is no human review of, or access to, user inputs or model outputs.

Automated abuse detection includes: 
+ **Categorize content** — We use classifiers to detect harmful content (such as content that incites violence) in user inputs and model outputs. A classifier is an algorithm that processes model inputs and outputs and assigns a type of harm and a level of confidence. We may run these classifiers on both first-party and third-party model usage, including models that are fine-tuned using Amazon Bedrock model customization. The classification process is automated and does not involve human review of user inputs or model outputs.
+ **Identify patterns** — We use classifier metrics to identify potential violations and recurring behavior. We may compile and share anonymized classifier metrics with third-party model providers. Amazon Bedrock does not store user input or model output and does not share these with third-party model providers.
+ **Detecting and blocking child sexual abuse material (CSAM)** — You are responsible for the content you (and your end users) upload to Amazon Bedrock and must ensure this content is free from illegal images. To help stop the dissemination of CSAM, Amazon Bedrock may use automated abuse detection mechanisms (such as hash matching technology or classifiers) to detect apparent CSAM. If Amazon Bedrock detects apparent CSAM in your image inputs, Amazon Bedrock will block the request and return a `ValidationException` (HTTP 400) error in the API response. Amazon Bedrock may also file a report with the National Center for Missing and Exploited Children (NCMEC) or a relevant authority. We take CSAM seriously and will continue to update our detection, blocking, and reporting mechanisms. You might be required by applicable laws to take additional actions, and you are responsible for those actions.

Once our automated abuse detection mechanisms identify potential violations, we may request information about your use of Amazon Bedrock and compliance with our terms of service or a third-party provider’s AUP. These requests are sent to the email address associated with your AWS account, so ensure that your account contact information is current and monitored. In the event that you are unresponsive, unwilling, or unable to comply with these terms or policies, AWS may suspend your access to Amazon Bedrock. You may also be billed for failed fine-tuning jobs if our automated tests detect model responses that are inconsistent with a third-party model provider's license terms and policies.

Contact AWS Support if you have additional questions. For more information, see the [Amazon Bedrock FAQs](https://aws.amazon.com/bedrock/faqs/?refid=6f95042b-28fe-493f-8858-601fe99cea89).

# Prompt injection security
<a name="prompt-injection"></a>

 As per the [AWS Shared Responsibility Model](https://aws.amazon.com/compliance/shared-responsibility-model/), AWS is responsible for securing the underlying cloud infrastructure, including the hardware, software, networking, and facilities that run AWS services. However, customers are responsible for securing their applications, data, and resources deployed on AWS. 

In the context of Amazon Bedrock, AWS handles the security of the underlying infrastructure, including the physical data centers, networking, and the Amazon Bedrock service itself. However, the responsibility for secure application development and preventing vulnerabilities like prompt injection lies with the customer. 

Prompt injection is an application-level security concern, similar to SQL injection in database applications. Just as AWS services like Amazon RDS and Amazon Aurora provide secure database engines while customers remain responsible for preventing SQL injection in their applications, Amazon Bedrock provides a secure foundation for natural language processing, but customers must take measures to prevent prompt injection vulnerabilities in their code. Additionally, AWS provides detailed documentation, best practices, and guidance on secure coding practices for Amazon Bedrock and other AWS services. 

To protect against prompt injection and other security vulnerabilities when using Amazon Bedrock, customers should follow these best practices: 
+ **Input Validation** – Validate and sanitize all user input before passing it to the Amazon Bedrock API or tokenizer. This includes removing or escaping special characters and ensuring that input adheres to expected formats. 
+ **Secure Coding Practices** – Follow secure coding practices, such as using parameterized queries, avoiding string concatenation for input, and practicing the principle of least privilege when granting access to resources. 
+ **Security Testing** – Regularly test your applications for prompt injection and other security vulnerabilities using techniques like penetration testing, static code analysis, and dynamic application security testing (DAST). 
+ **Stay Updated** – Keep your Amazon Bedrock SDK, libraries, and dependencies up-to-date with the latest security patches and updates. Monitor AWS security bulletins and announcements for any relevant updates or guidance. AWS provides detailed documentation, blog posts, and sample code to help customers build secure applications using Bedrock and other AWS services. Customers should review these resources and follow the recommended security best practices to protect their applications from prompt injection and other vulnerabilities. 
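
The input validation practice above can be sketched as follows. This is an illustrative example, not an Amazon Bedrock API; the length limit and character rules are assumptions to tune for your own application.

```
import re

MAX_PROMPT_CHARS = 4000  # assumed application-specific limit


def sanitize_user_input(text: str) -> str:
    """Validate and normalize untrusted text before it is placed in a prompt."""
    # Drop ASCII control characters, which have no place in a prompt.
    cleaned = re.sub(r"[\x00-\x08\x0b\x0c\x0e-\x1f\x7f]", "", text)
    # Collapse runs of whitespace to reduce formatting-based prompt tricks.
    cleaned = re.sub(r"\s+", " ", cleaned).strip()
    # Enforce an upper bound on input length.
    if len(cleaned) > MAX_PROMPT_CHARS:
        raise ValueError("input exceeds maximum allowed length")
    return cleaned


print(sanitize_user_input("Hello\x00  world\n"))  # Hello world
```

Combine checks like these with allow-list format validation (for example, rejecting input that does not match an expected pattern) rather than relying on character stripping alone.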

You can use an Amazon Bedrock Guardrail to help protect against prompt injection attacks. For more information, see [Detect prompt attacks with Amazon Bedrock Guardrails](guardrails-prompt-attack.md).

When creating an Amazon Bedrock agent, use the following techniques to help protect against prompt injection attacks. 
+ Associate a guardrail with the agent. For more information, see [Implement safeguards for your application by associating a guardrail with your agent](agents-guardrail.md).
+ Use [advanced prompts](https://docs.aws.amazon.com/bedrock/latest/userguide/advanced-prompts.html) to enable the default pre-processing prompt. Every agent has a default pre-processing prompt that you can enable. This is a lightweight prompt that uses a foundation model to determine if user input is safe to be processed. You can use its default behavior or fully customize the prompt to include any other classification categories. Optionally, you can author your own foundation model response parser in an [AWS Lambda](https://docs.aws.amazon.com/bedrock/latest/userguide/lambda-parser.html) function to implement your own custom rules. 

  For more information, see [How Amazon Bedrock Agents works](agents-how.md).
+ Update the system prompt by using advanced prompt features. Newer models differentiate between system and user prompts. If you use system prompts in an agent, we recommend that you clearly define the scope of what the agent can and cannot do. Also, check the model provider's own documentation for model specific guidance. To find out which serverless models in Amazon Bedrock support system prompts, see [Inference request parameters and response fields for foundation models](model-parameters.md). 