Prerequisites for deploying a custom model for on-demand inference

Before you can deploy a custom model for on-demand inference, make sure you meet the following requirements:

  • You must use the US East (N. Virginia) AWS Region.

  • You must have customized the model on or after July 16, 2025. For supported models, see Supported base models.

  • Your account must have permission to access the model that you are deploying. For more information about model customization access and security, see Model customization access and security.

  • If the model is encrypted with an AWS KMS key, you must have permission to use that key. For more information, see Encryption of custom models.
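
The Region and customization-date requirements above can be pre-checked in code before attempting deployment. The sketch below is a minimal, hypothetical helper (the function name and structure are illustrative, not part of any AWS SDK); the account-permission and KMS checks are not covered here because they depend on your IAM configuration:

```python
from datetime import datetime, timezone

# Requirements from this page (assumed constants, not SDK values):
# on-demand inference for custom models requires US East (N. Virginia)
# and a model customized on or after July 16, 2025.
REQUIRED_REGION = "us-east-1"
CUSTOMIZATION_CUTOFF = datetime(2025, 7, 16, tzinfo=timezone.utc)

def meets_deployment_prerequisites(region: str, customized_at: datetime) -> bool:
    """Return True if the Region and customization-date prerequisites are met."""
    return region == REQUIRED_REGION and customized_at >= CUSTOMIZATION_CUTOFF
```

For example, a model customized on August 1, 2025 in `us-east-1` passes, while the same model in `us-west-2`, or one customized before the cutoff, does not. You could feed this helper the `creationTime` returned by the Amazon Bedrock `ListCustomModels` API for your model.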