Create a service role for importing pre-trained models
To use a custom role for model import, create an IAM service role and attach the following permissions. For information on how to create a service role in IAM, see Creating a role to delegate permissions to an AWS service.
These permissions apply to both methods of importing models into Amazon Bedrock:
- Custom model import jobs — For importing customized open-source foundation models (such as Mistral AI or Llama models). For more information, see Use Custom model import to import a customized open-source model into Amazon Bedrock.
- Create custom model — For importing Amazon Nova models that you fine-tuned in SageMaker AI. For more information, see Import a SageMaker AI-trained Amazon Nova model.
Trust relationship
The trust policy allows Amazon Bedrock to assume this role and carry out model import operations. An example policy that you can use is shown after the following list.
You can optionally restrict the scope of the permission for cross-service confused deputy prevention by using one or more global condition context keys with the Condition field. For more information, see AWS global condition context keys.
- Set the aws:SourceAccount value to your account ID.
- (Optional) Use the ArnEquals or ArnLike condition to restrict the scope to specific operations in your account. The following example restricts access to custom model import jobs.
Permissions to access model files in Amazon S3
Attach the following policy to allow the role to access model files in your Amazon S3 bucket. Replace the values in the Resource list with your actual bucket names.
For custom model import jobs, this is your own Amazon S3 bucket containing the customized open-source model files. For creating custom models from SageMaker AI-trained Amazon Nova models, this is the Amazon-managed Amazon S3 bucket where SageMaker AI stores the trained model artifacts. SageMaker AI creates this bucket when you run your first SageMaker AI training job.
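A sketch of such an access policy is shown below. The bucket name amzn-s3-demo-bucket is a placeholder, and the s3:GetObject and s3:ListBucket actions are assumed here as the minimal set needed to read the model files.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowModelFileAccess",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::amzn-s3-demo-bucket",
        "arn:aws:s3:::amzn-s3-demo-bucket/*"
      ]
    }
  ]
}
```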
To restrict access to a specific folder in a bucket, add an s3:prefix condition key with your folder path. You can follow the User policy example in Example 2: Getting a list of objects in a bucket with a specific prefix.
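As a sketch of that approach, the following variant limits the role to a single folder; the bucket name (amzn-s3-demo-bucket) and folder path (model-folder) are placeholders. The s3:prefix condition applies to the s3:ListBucket action, while object reads are scoped by the folder in the object ARN.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowObjectAccessInModelFolder",
      "Effect": "Allow",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::amzn-s3-demo-bucket/model-folder/*"
    },
    {
      "Sid": "AllowListingOfModelFolder",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::amzn-s3-demo-bucket",
      "Condition": {
        "StringLike": {
          "s3:prefix": [
            "model-folder/*"
          ]
        }
      }
    }
  ]
}
```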