
Create a service role for importing pre-trained models

To use a custom role for model import, create an IAM service role and attach the following permissions. For information on how to create a service role in IAM, see Creating a role to delegate permissions to an AWS service.

These permissions apply to both methods of importing models into Amazon Bedrock: custom model import jobs and creating custom models from SageMaker AI-trained Amazon Nova models.

Trust relationship

The following example policy allows Amazon Bedrock to assume this role and carry out model import operations.

You can optionally restrict the scope of the permission for cross-service confused deputy prevention by using one or more global condition context keys with the Condition field. For more information, see AWS global condition context keys.

  • Set the aws:SourceAccount value to your account ID.

  • (Optional) Use the ArnEquals or ArnLike condition to restrict the scope to specific operations in your account. The following example restricts access to custom model import jobs.

JSON
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "1",
      "Effect": "Allow",
      "Principal": {
        "Service": "bedrock.amazonaws.com"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {
          "aws:SourceAccount": "account-id"
        },
        "ArnEquals": {
          "aws:SourceArn": "arn:aws:bedrock:us-east-1:111122223333:model-import-job/*"
        }
      }
    }
  ]
}
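If you prefer to create the role programmatically, the trust policy above can be passed to IAM with the AWS SDK for Python (boto3). The following is a minimal sketch; the role name is a hypothetical placeholder, and you should substitute your own account ID and Region in the policy values.

```python
import json

# Trust policy from the example above; replace the account ID and
# job ARN with values for your own account and Region.
TRUST_POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "1",
            "Effect": "Allow",
            "Principal": {"Service": "bedrock.amazonaws.com"},
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {"aws:SourceAccount": "111122223333"},
                "ArnEquals": {
                    "aws:SourceArn": "arn:aws:bedrock:us-east-1:111122223333:model-import-job/*"
                },
            },
        }
    ],
}


def create_import_role(role_name="BedrockModelImportRole"):
    """Create the service role with the trust policy above.

    The role name is a hypothetical example, not a required value.
    """
    import boto3  # AWS SDK for Python

    iam = boto3.client("iam")
    response = iam.create_role(
        RoleName=role_name,
        AssumeRolePolicyDocument=json.dumps(TRUST_POLICY),
        Description="Service role for Amazon Bedrock model import",
    )
    return response["Role"]["Arn"]
```

Creating the role requires `iam:CreateRole` permission for the caller; the function returns the new role's ARN, which you pass to the model import operation.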

Permissions to access model files in Amazon S3

Attach the following policy to allow the role to access model files in an Amazon S3 bucket. Replace the values in the Resource list with your actual bucket names.

For custom model import jobs, this is your own Amazon S3 bucket containing the customized open-source model files. For creating custom models from SageMaker AI-trained Amazon Nova models, this is the Amazon-managed Amazon S3 bucket where SageMaker AI stores the trained model artifacts. SageMaker AI creates this bucket when you run your first SageMaker AI training job.

To restrict access to a specific folder in a bucket, add an s3:prefix condition key with your folder path. You can follow the User policy example in Example 2: Getting a list of objects in a bucket with a specific prefix.

JSON
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "1",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::bucket",
        "arn:aws:s3:::bucket/*"
      ],
      "Condition": {
        "StringEquals": {
          "aws:ResourceAccount": "account-id"
        }
      }
    }
  ]
}
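This policy can likewise be attached with boto3. The sketch below assumes the service role already exists; the role name, policy name, bucket name, and folder path are hypothetical placeholders. It also shows a separate s3:ListBucket statement with an s3:prefix condition, following the pattern in the Example 2 user policy referenced above.

```python
import json

# Permissions policy from the example above; replace the bucket
# names and account ID with your own values.
S3_ACCESS_POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "1",
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": ["arn:aws:s3:::bucket", "arn:aws:s3:::bucket/*"],
            "Condition": {"StringEquals": {"aws:ResourceAccount": "111122223333"}},
        }
    ],
}

# Optional: a separate statement that limits listing to one folder,
# using the s3:prefix condition key (folder path is a placeholder).
PREFIX_STATEMENT = {
    "Sid": "ListFolderOnly",
    "Effect": "Allow",
    "Action": "s3:ListBucket",
    "Resource": "arn:aws:s3:::bucket",
    "Condition": {"StringLike": {"s3:prefix": "my-model-folder/*"}},
}


def attach_s3_policy(role_name="BedrockModelImportRole"):
    """Attach the permissions as an inline policy on the role.

    Role and policy names are hypothetical examples.
    """
    import boto3  # AWS SDK for Python

    iam = boto3.client("iam")
    iam.put_role_policy(
        RoleName=role_name,
        PolicyName="BedrockModelImportS3Access",
        PolicyDocument=json.dumps(S3_ACCESS_POLICY),
    )
```

Keeping the prefix condition in its own s3:ListBucket statement matters: s3:prefix is not present on GetObject requests, so adding it to a statement that also allows s3:GetObject would cause the condition to fail for object reads.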