Create a custom model (AWS SDKs)
To create a custom model from a SageMaker AI-trained Amazon Nova model stored in Amazon S3, use the CreateCustomModel API operation. You can use the following code to create a custom model with the SDK for Python (Boto3). The code creates the custom model and then checks its status until the model is ACTIVE and ready to use.
To use the code, update the following parameters. The code example also includes optional parameters, such as clientRequestToken for idempotency and modelTags for resource tagging; a minimal call that sets only the required parameters is sketched after the parameter list.
- modelName – Provide a unique name for the model.
- s3Uri – Specify the path to the Amazon managed Amazon S3 bucket that stores the model artifacts. SageMaker AI creates this bucket when you run your first SageMaker AI training job.
- roleArn – Specify the Amazon Resource Name (ARN) of the IAM service role that Amazon Bedrock assumes to perform tasks on your behalf. For more information about creating this role, see Create a service role for importing pre-trained models.
- modelKmsKeyArn (Optional) – Specify the AWS KMS key used to encrypt the model in Amazon Bedrock. If you don't provide an AWS KMS key, Amazon Bedrock encrypts the model with an AWS managed AWS KMS key. For information about encryption, see Encrypt imported custom models.
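If you want to see only the required request shape first, the following minimal sketch passes just modelName, roleArn, and modelSourceConfig. The placeholder Region, model name, role ARN, and S3 URI are the same placeholders used in the full example below; replace them with your own values.

import boto3

bedrock_client = boto3.client(service_name='bedrock', region_name='REGION')

# Minimal sketch: only the required CreateCustomModel parameters.
# Replace the placeholder model name, role ARN, and S3 URI with your own values.
response = bedrock_client.create_custom_model(
    modelName='modelName',
    roleArn='serviceRoleArn',
    modelSourceConfig={
        's3DataSource': {
            's3Uri': 's3://amzn-s3-demo-bucket/folder/',
        }
    },
)
print(response['modelArn'])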
After you create the custom model, it appears in the ListCustomModels response with a customizationType of imported. To track the status of the new model, you can use the GetCustomModel API operation.
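For a quick one-off check outside the polling loop shown below, you can call GetCustomModel directly. This sketch assumes model_arn holds the ARN returned by CreateCustomModel; the fields it prints (modelStatus, customizationType, failureMessage) are the same ones used in the full example.

# One-off status check; model_arn is the ARN returned by CreateCustomModel.
response = bedrock_client.get_custom_model(modelIdentifier=model_arn)
print(f"Status: {response.get('modelStatus')}")
print(f"Customization type: {response.get('customizationType')}")
if response.get('modelStatus') == 'Failed':
    print(f"Failure reason: {response.get('failureMessage')}")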
import boto3
import uuid
from botocore.exceptions import ClientError
import time


def create_custom_model(bedrock_client):
    """
    Creates a custom model in Amazon Bedrock from a SageMaker AI-trained
    Amazon Nova model stored in Amazon S3.

    Args:
        bedrock_client: The Amazon Bedrock client instance

    Returns:
        dict: Response from the CreateCustomModel API call
    """
    try:
        # Create a unique client request token for idempotency
        client_request_token = str(uuid.uuid4())

        # Define the model source configuration
        model_source_config = {
            's3DataSource': {
                's3Uri': 's3://amzn-s3-demo-bucket/folder/',
            }
        }

        # Create the custom model
        response = bedrock_client.create_custom_model(
            # Required parameters
            modelName='modelName',
            roleArn='serviceRoleArn',
            modelSourceConfig=model_source_config,
            # Optional parameters
            clientRequestToken=client_request_token,
            modelKmsKeyArn='keyArn',
            modelTags=[
                {
                    'key': 'Environment',
                    'value': 'Production'
                },
                {
                    'key': 'Project',
                    'value': 'AIInference'
                }
            ]
        )
        print(f"Custom model creation initiated. Model ARN: {response['modelArn']}")
        return response
    except ClientError as e:
        print(f"Error creating custom model: {e}")
        raise


def list_custom_models(bedrock_client):
    """
    Lists all custom models in Amazon Bedrock.

    Args:
        bedrock_client: An Amazon Bedrock client.

    Returns:
        dict: Response from the ListCustomModels API call
    """
    try:
        response = bedrock_client.list_custom_models()
        print(f"Total number of custom models: {len(response['modelSummaries'])}")

        for model in response['modelSummaries']:
            print("ARN: " + model['modelArn'])
            print("Name: " + model['modelName'])
            print("Status: " + model['modelStatus'])
            print("Customization type: " + model['customizationType'])
            print("------------------------------------------------------")

        return response
    except ClientError as e:
        print(f"Error listing custom models: {e}")
        raise


def check_model_status(bedrock_client, model_arn):
    """
    Checks the status of a custom model creation.

    Args:
        model_arn (str): The ARN of the custom model
        bedrock_client: An Amazon Bedrock client.

    Returns:
        dict: Response from the GetCustomModel API call
    """
    try:
        max_time = time.time() + 60 * 60  # 1 hour
        while time.time() < max_time:
            response = bedrock_client.get_custom_model(modelIdentifier=model_arn)
            status = response.get('modelStatus')
            print(f"Job status: {status}")

            if status == 'Failed':
                print(f"Failure reason: {response.get('failureMessage')}")
                break
            if status == 'Active':
                print("Model is ready for use.")
                break

            time.sleep(60)
    except ClientError as e:
        print(f"Error checking model status: {e}")
        raise


def main():
    bedrock_client = boto3.client(service_name='bedrock', region_name='REGION')

    # Create the custom model
    model_arn = create_custom_model(bedrock_client)["modelArn"]

    # Check the status of the model
    if model_arn:
        check_model_status(bedrock_client, model_arn)

    # View all custom models
    list_custom_models(bedrock_client)


if __name__ == "__main__":
    main()
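In this example, clientRequestToken is generated with uuid.uuid4() for idempotency: reusing the same token on a retried CreateCustomModel call lets Amazon Bedrock treat the retry as the same request instead of creating a second model. check_model_status then polls GetCustomModel once per minute for up to one hour and stops early when the status becomes Active or Failed.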