CfnModelPropsMixin

class aws_cdk.mixins_preview.aws_sagemaker.mixins.CfnModelPropsMixin(props, *, strategy=None)

Bases: Mixin

The AWS::SageMaker::Model resource creates a model to host at an Amazon SageMaker endpoint.

For more information, see Deploying a Model on Amazon SageMaker Hosting Services in the Amazon SageMaker Developer Guide.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-model.html

CloudformationResource:

AWS::SageMaker::Model

Mixin:

true

ExampleMetadata:

fixture=_generated

Example:

# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import CfnTag
from aws_cdk.mixins_preview import mixins
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

# environment: Any

cfn_model_props_mixin = sagemaker_mixins.CfnModelPropsMixin(sagemaker_mixins.CfnModelMixinProps(
    containers=[sagemaker_mixins.CfnModelPropsMixin.ContainerDefinitionProperty(
        container_hostname="containerHostname",
        environment=environment,
        image="image",
        image_config=sagemaker_mixins.CfnModelPropsMixin.ImageConfigProperty(
            repository_access_mode="repositoryAccessMode",
            repository_auth_config=sagemaker_mixins.CfnModelPropsMixin.RepositoryAuthConfigProperty(
                repository_credentials_provider_arn="repositoryCredentialsProviderArn"
            )
        ),
        inference_specification_name="inferenceSpecificationName",
        mode="mode",
        model_data_source=sagemaker_mixins.CfnModelPropsMixin.ModelDataSourceProperty(
            s3_data_source=sagemaker_mixins.CfnModelPropsMixin.S3DataSourceProperty(
                compression_type="compressionType",
                hub_access_config=sagemaker_mixins.CfnModelPropsMixin.HubAccessConfigProperty(
                    hub_content_arn="hubContentArn"
                ),
                model_access_config=sagemaker_mixins.CfnModelPropsMixin.ModelAccessConfigProperty(
                    accept_eula=False
                ),
                s3_data_type="s3DataType",
                s3_uri="s3Uri"
            )
        ),
        model_data_url="modelDataUrl",
        model_package_name="modelPackageName",
        multi_model_config=sagemaker_mixins.CfnModelPropsMixin.MultiModelConfigProperty(
            model_cache_setting="modelCacheSetting"
        )
    )],
    enable_network_isolation=False,
    execution_role_arn="executionRoleArn",
    inference_execution_config=sagemaker_mixins.CfnModelPropsMixin.InferenceExecutionConfigProperty(
        mode="mode"
    ),
    model_name="modelName",
    primary_container=sagemaker_mixins.CfnModelPropsMixin.ContainerDefinitionProperty(
        container_hostname="containerHostname",
        environment=environment,
        image="image",
        image_config=sagemaker_mixins.CfnModelPropsMixin.ImageConfigProperty(
            repository_access_mode="repositoryAccessMode",
            repository_auth_config=sagemaker_mixins.CfnModelPropsMixin.RepositoryAuthConfigProperty(
                repository_credentials_provider_arn="repositoryCredentialsProviderArn"
            )
        ),
        inference_specification_name="inferenceSpecificationName",
        mode="mode",
        model_data_source=sagemaker_mixins.CfnModelPropsMixin.ModelDataSourceProperty(
            s3_data_source=sagemaker_mixins.CfnModelPropsMixin.S3DataSourceProperty(
                compression_type="compressionType",
                hub_access_config=sagemaker_mixins.CfnModelPropsMixin.HubAccessConfigProperty(
                    hub_content_arn="hubContentArn"
                ),
                model_access_config=sagemaker_mixins.CfnModelPropsMixin.ModelAccessConfigProperty(
                    accept_eula=False
                ),
                s3_data_type="s3DataType",
                s3_uri="s3Uri"
            )
        ),
        model_data_url="modelDataUrl",
        model_package_name="modelPackageName",
        multi_model_config=sagemaker_mixins.CfnModelPropsMixin.MultiModelConfigProperty(
            model_cache_setting="modelCacheSetting"
        )
    ),
    tags=[CfnTag(
        key="key",
        value="value"
    )],
    vpc_config=sagemaker_mixins.CfnModelPropsMixin.VpcConfigProperty(
        security_group_ids=["securityGroupIds"],
        subnets=["subnets"]
    )
),
    strategy=mixins.PropertyMergeStrategy.OVERRIDE
)

Create a mixin to apply properties to AWS::SageMaker::Model.

Parameters:
  • props (Union[CfnModelMixinProps, Dict[str, Any]]) – L1 properties to apply.

  • strategy (Optional[PropertyMergeStrategy]) – (experimental) Strategy for merging nested properties. Default: - PropertyMergeStrategy.MERGE
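
The sketch below illustrates the two strategies with a single property; it is not taken from the generated example above, and the property value is only a placeholder.

# A minimal sketch of the two merge strategies. The props object only needs
# the properties you want the mixin to apply.
from aws_cdk.mixins_preview import mixins
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

# Default behavior (PropertyMergeStrategy.MERGE): nested properties are merged
# with whatever the target construct already defines.
merge_mixin = sagemaker_mixins.CfnModelPropsMixin(
    sagemaker_mixins.CfnModelMixinProps(
        enable_network_isolation=True
    )
)

# Explicit OVERRIDE: the supplied nested properties replace the existing values.
override_mixin = sagemaker_mixins.CfnModelPropsMixin(
    sagemaker_mixins.CfnModelMixinProps(
        enable_network_isolation=True
    ),
    strategy=mixins.PropertyMergeStrategy.OVERRIDE
)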

Methods

apply_to(construct)

Apply the mixin properties to the construct.

Parameters:

construct (IConstruct)

Return type:

IConstruct

supports(construct)

Check if this mixin supports the given construct.

Parameters:

construct (IConstruct)

Return type:

bool
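
The two methods are typically used together: check supports before calling apply_to. A minimal sketch, reusing cfn_model_props_mixin from the example above and assuming cfn_model is an existing AWS::SageMaker::Model L1 construct declared elsewhere in the stack:

# cfn_model: IConstruct (an existing AWS::SageMaker::Model L1 construct)

if cfn_model_props_mixin.supports(cfn_model):
    cfn_model_props_mixin.apply_to(cfn_model)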

Attributes

CFN_PROPERTY_KEYS = ['containers', 'enableNetworkIsolation', 'executionRoleArn', 'inferenceExecutionConfig', 'modelName', 'primaryContainer', 'tags', 'vpcConfig']

Static Methods

classmethod is_mixin(x)

(experimental) Checks if x is a Mixin.

Parameters:

x (Any) – Any object.

Return type:

bool

Returns:

true if x is an object created from a class which extends Mixin.

Stability:

experimental

ContainerDefinitionProperty

class CfnModelPropsMixin.ContainerDefinitionProperty(*, container_hostname=None, environment=None, image=None, image_config=None, inference_specification_name=None, mode=None, model_data_source=None, model_data_url=None, model_package_name=None, multi_model_config=None)

Bases: object

Describes the container, as part of model definition.

Parameters:
  • container_hostname (Optional[str]) – This parameter is ignored for models that contain only a PrimaryContainer. When a ContainerDefinition is part of an inference pipeline, the value of the parameter uniquely identifies the container for the purposes of logging and metrics. For information, see Use Logs and Metrics to Monitor an Inference Pipeline. If you don’t specify a value for this parameter for a ContainerDefinition that is part of an inference pipeline, a unique name is automatically assigned based on the position of the ContainerDefinition in the pipeline. If you specify a value for the ContainerHostName for any ContainerDefinition that is part of an inference pipeline, you must specify a value for the ContainerHostName parameter of every ContainerDefinition in that pipeline.

  • environment (Any) – The environment variables to set in the Docker container. Don’t include any sensitive data in your environment variables. The maximum length of each key and value in the Environment map is 1024 bytes. The maximum length of all keys and values in the map, combined, is 32 KB. If you pass multiple containers to a CreateModel request, then the maximum length of all of their maps, combined, is also 32 KB.

  • image (Optional[str]) – The path where inference code is stored. This can be either in Amazon EC2 Container Registry or in a Docker registry that is accessible from the same VPC that you configure for your endpoint. If you are using your own custom algorithm instead of an algorithm provided by SageMaker, the inference code must meet SageMaker requirements. SageMaker supports both registry/repository[:tag] and registry/repository[@digest] image path formats. For more information, see Using Your Own Algorithms with Amazon SageMaker. The model artifacts in an Amazon S3 bucket and the Docker image for inference container in Amazon EC2 Container Registry must be in the same region as the model or endpoint you are creating.

  • image_config (Union[IResolvable, ImageConfigProperty, Dict[str, Any], None]) – Specifies whether the model container is in Amazon ECR or a private Docker registry accessible from your Amazon Virtual Private Cloud (VPC). For information about storing containers in a private Docker registry, see Use a Private Docker Registry for Real-Time Inference Containers. The model artifacts in an Amazon S3 bucket and the Docker image for inference container in Amazon EC2 Container Registry must be in the same region as the model or endpoint you are creating.

  • inference_specification_name (Optional[str]) – The inference specification name in the model package version.

  • mode (Optional[str]) – Whether the container hosts a single model or multiple models.

  • model_data_source (Union[IResolvable, ModelDataSourceProperty, Dict[str, Any], None]) – Specifies the location of ML model data to deploy. Currently you cannot use ModelDataSource in conjunction with SageMaker batch transform, SageMaker serverless endpoints, SageMaker multi-model endpoints, and SageMaker Marketplace.

  • model_data_url (Optional[str]) – The S3 path where the model artifacts, which result from model training, are stored. This path must point to a single gzip compressed tar archive (.tar.gz suffix). The S3 path is required for SageMaker built-in algorithms, but not if you use your own algorithms. For more information on built-in algorithms, see Common Parameters. The model artifacts must be in an S3 bucket that is in the same region as the model or endpoint you are creating. If you provide a value for this parameter, SageMaker uses AWS Security Token Service to download model artifacts from the S3 path you provide. AWS STS is activated in your AWS account by default. If you previously deactivated AWS STS for a region, you need to reactivate AWS STS for that region. For more information, see Activating and Deactivating AWS STS in an AWS Region in the AWS Identity and Access Management User Guide. If you use a built-in algorithm to create a model, SageMaker requires that you provide an S3 path to the model artifacts in ModelDataUrl.

  • model_package_name (Optional[str]) – The name or Amazon Resource Name (ARN) of the model package to use to create the model.

  • multi_model_config (Union[IResolvable, MultiModelConfigProperty, Dict[str, Any], None]) – Specifies additional configuration for multi-model endpoints.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-containerdefinition.html

ExampleMetadata:

fixture=_generated

Example:

# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

# environment: Any

container_definition_property = sagemaker_mixins.CfnModelPropsMixin.ContainerDefinitionProperty(
    container_hostname="containerHostname",
    environment=environment,
    image="image",
    image_config=sagemaker_mixins.CfnModelPropsMixin.ImageConfigProperty(
        repository_access_mode="repositoryAccessMode",
        repository_auth_config=sagemaker_mixins.CfnModelPropsMixin.RepositoryAuthConfigProperty(
            repository_credentials_provider_arn="repositoryCredentialsProviderArn"
        )
    ),
    inference_specification_name="inferenceSpecificationName",
    mode="mode",
    model_data_source=sagemaker_mixins.CfnModelPropsMixin.ModelDataSourceProperty(
        s3_data_source=sagemaker_mixins.CfnModelPropsMixin.S3DataSourceProperty(
            compression_type="compressionType",
            hub_access_config=sagemaker_mixins.CfnModelPropsMixin.HubAccessConfigProperty(
                hub_content_arn="hubContentArn"
            ),
            model_access_config=sagemaker_mixins.CfnModelPropsMixin.ModelAccessConfigProperty(
                accept_eula=False
            ),
            s3_data_type="s3DataType",
            s3_uri="s3Uri"
        )
    ),
    model_data_url="modelDataUrl",
    model_package_name="modelPackageName",
    multi_model_config=sagemaker_mixins.CfnModelPropsMixin.MultiModelConfigProperty(
        model_cache_setting="modelCacheSetting"
    )
)

Attributes

container_hostname

This parameter is ignored for models that contain only a PrimaryContainer.

When a ContainerDefinition is part of an inference pipeline, the value of the parameter uniquely identifies the container for the purposes of logging and metrics. For information, see Use Logs and Metrics to Monitor an Inference Pipeline. If you don’t specify a value for this parameter for a ContainerDefinition that is part of an inference pipeline, a unique name is automatically assigned based on the position of the ContainerDefinition in the pipeline. If you specify a value for the ContainerHostName for any ContainerDefinition that is part of an inference pipeline, you must specify a value for the ContainerHostName parameter of every ContainerDefinition in that pipeline.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-containerdefinition.html#cfn-sagemaker-model-containerdefinition-containerhostname

environment

The environment variables to set in the Docker container. Don’t include any sensitive data in your environment variables.

The maximum length of each key and value in the Environment map is 1024 bytes. The maximum length of all keys and values in the map, combined, is 32 KB. If you pass multiple containers to a CreateModel request, then the maximum length of all of their maps, combined, is also 32 KB.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-containerdefinition.html#cfn-sagemaker-model-containerdefinition-environment

image

The path where inference code is stored.

This can be either in Amazon EC2 Container Registry or in a Docker registry that is accessible from the same VPC that you configure for your endpoint. If you are using your own custom algorithm instead of an algorithm provided by SageMaker, the inference code must meet SageMaker requirements. SageMaker supports both registry/repository[:tag] and registry/repository[@digest] image path formats. For more information, see Using Your Own Algorithms with Amazon SageMaker.

The model artifacts in an Amazon S3 bucket and the Docker image for inference container in Amazon EC2 Container Registry must be in the same region as the model or endpoint you are creating.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-containerdefinition.html#cfn-sagemaker-model-containerdefinition-image

image_config

Specifies whether the model container is in Amazon ECR or a private Docker registry accessible from your Amazon Virtual Private Cloud (VPC).

For information about storing containers in a private Docker registry, see Use a Private Docker Registry for Real-Time Inference Containers.

The model artifacts in an Amazon S3 bucket and the Docker image for inference container in Amazon EC2 Container Registry must be in the same region as the model or endpoint you are creating.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-containerdefinition.html#cfn-sagemaker-model-containerdefinition-imageconfig

inference_specification_name

The inference specification name in the model package version.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-containerdefinition.html#cfn-sagemaker-model-containerdefinition-inferencespecificationname

mode

Whether the container hosts a single model or multiple models.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-containerdefinition.html#cfn-sagemaker-model-containerdefinition-mode

model_data_source

Specifies the location of ML model data to deploy.

Currently you cannot use ModelDataSource in conjunction with SageMaker batch transform, SageMaker serverless endpoints, SageMaker multi-model endpoints, and SageMaker Marketplace.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-containerdefinition.html#cfn-sagemaker-model-containerdefinition-modeldatasource

model_data_url

The S3 path where the model artifacts, which result from model training, are stored.

This path must point to a single gzip compressed tar archive (.tar.gz suffix). The S3 path is required for SageMaker built-in algorithms, but not if you use your own algorithms. For more information on built-in algorithms, see Common Parameters.

The model artifacts must be in an S3 bucket that is in the same region as the model or endpoint you are creating.

If you provide a value for this parameter, SageMaker uses AWS Security Token Service to download model artifacts from the S3 path you provide. AWS STS is activated in your AWS account by default. If you previously deactivated AWS STS for a region, you need to reactivate AWS STS for that region. For more information, see Activating and Deactivating AWS STS in an AWS Region in the AWS Identity and Access Management User Guide.

If you use a built-in algorithm to create a model, SageMaker requires that you provide an S3 path to the model artifacts in ModelDataUrl.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-containerdefinition.html#cfn-sagemaker-model-containerdefinition-modeldataurl

model_package_name

The name or Amazon Resource Name (ARN) of the model package to use to create the model.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-containerdefinition.html#cfn-sagemaker-model-containerdefinition-modelpackagename

multi_model_config

Specifies additional configuration for multi-model endpoints.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-containerdefinition.html#cfn-sagemaker-model-containerdefinition-multimodelconfig
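
The generated example above uses placeholder strings; the sketch below shows a more typical single-model container, following the image path format and the .tar.gz requirement described for image and model_data_url. The account ID, region, repository, bucket, and environment variable are made-up values.

# A hedged sketch of a single-model container definition. The image uses the
# registry/repository[:tag] format and model_data_url points to a
# gzip-compressed tar archive; all identifiers here are illustrative only.
# "SingleModel" assumes the SageMaker API's SingleModel/MultiModel mode values.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

primary_container = sagemaker_mixins.CfnModelPropsMixin.ContainerDefinitionProperty(
    image="123456789012.dkr.ecr.us-east-1.amazonaws.com/my-inference-image:latest",
    model_data_url="s3://my-model-bucket/models/model.tar.gz",
    environment={"EXAMPLE_ENV_VAR": "value"},
    mode="SingleModel"
)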

HubAccessConfigProperty

class CfnModelPropsMixin.HubAccessConfigProperty(*, hub_content_arn=None)

Bases: object

The configuration for a private hub model reference that points to a public SageMaker JumpStart model.

For more information about private hubs, see Private curated hubs for foundation model access control in JumpStart.

Parameters:

hub_content_arn (Optional[str]) – The ARN of your private model hub content. This should be a ModelReference resource type that points to a SageMaker JumpStart public hub model.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-hubaccessconfig.html

ExampleMetadata:

fixture=_generated

Example:

# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

hub_access_config_property = sagemaker_mixins.CfnModelPropsMixin.HubAccessConfigProperty(
    hub_content_arn="hubContentArn"
)

Attributes

hub_content_arn

The ARN of your private model hub content.

This should be a ModelReference resource type that points to a SageMaker JumpStart public hub model.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-hubaccessconfig.html#cfn-sagemaker-model-hubaccessconfig-hubcontentarn

ImageConfigProperty

class CfnModelPropsMixin.ImageConfigProperty(*, repository_access_mode=None, repository_auth_config=None)

Bases: object

Specifies whether the model container is in Amazon ECR or a private Docker registry accessible from your Amazon Virtual Private Cloud (VPC).

Parameters:
  • repository_access_mode (Optional[str]) – Set this to one of the following values: - Platform - The model image is hosted in Amazon ECR. - Vpc - The model image is hosted in a private Docker registry in your VPC.

  • repository_auth_config (Union[IResolvable, RepositoryAuthConfigProperty, Dict[str, Any], None]) – (Optional) Specifies an authentication configuration for the private docker registry where your model image is hosted. Specify a value for this property only if you specified Vpc as the value for the RepositoryAccessMode field, and the private Docker registry where the model image is hosted requires authentication.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-imageconfig.html

ExampleMetadata:

fixture=_generated

Example:

# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

image_config_property = sagemaker_mixins.CfnModelPropsMixin.ImageConfigProperty(
    repository_access_mode="repositoryAccessMode",
    repository_auth_config=sagemaker_mixins.CfnModelPropsMixin.RepositoryAuthConfigProperty(
        repository_credentials_provider_arn="repositoryCredentialsProviderArn"
    )
)

Attributes

repository_access_mode

Set this to one of the following values:

  • Platform - The model image is hosted in Amazon ECR.

  • Vpc - The model image is hosted in a private Docker registry in your VPC.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-imageconfig.html#cfn-sagemaker-model-imageconfig-repositoryaccessmode

repository_auth_config

(Optional) Specifies an authentication configuration for the private docker registry where your model image is hosted.

Specify a value for this property only if you specified Vpc as the value for the RepositoryAccessMode field, and the private Docker registry where the model image is hosted requires authentication.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-imageconfig.html#cfn-sagemaker-model-imageconfig-repositoryauthconfig
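
Building on the descriptions above, the sketch below shows the Vpc access mode together with an authentication configuration; the Lambda function ARN is a hypothetical placeholder.

# A sketch of a private-registry image config. repository_auth_config is only
# needed when repository_access_mode is "Vpc" and the registry requires
# authentication; the ARN below is a placeholder.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

image_config = sagemaker_mixins.CfnModelPropsMixin.ImageConfigProperty(
    repository_access_mode="Vpc",
    repository_auth_config=sagemaker_mixins.CfnModelPropsMixin.RepositoryAuthConfigProperty(
        repository_credentials_provider_arn="arn:aws:lambda:us-east-1:123456789012:function:registry-credentials"
    )
)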

InferenceExecutionConfigProperty

class CfnModelPropsMixin.InferenceExecutionConfigProperty(*, mode=None)

Bases: object

Specifies details about how containers in a multi-container endpoint are run.

Parameters:

mode (Optional[str]) – How containers in a multi-container endpoint are run. The following values are valid: - Serial - Containers run as a serial pipeline. - Direct - Only the individual container that you specify is run.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-inferenceexecutionconfig.html

ExampleMetadata:

fixture=_generated

Example:

# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

inference_execution_config_property = sagemaker_mixins.CfnModelPropsMixin.InferenceExecutionConfigProperty(
    mode="mode"
)

Attributes

mode

How containers in a multi-container endpoint are run. The following values are valid:

  • Serial - Containers run as a serial pipeline.

  • Direct - Only the individual container that you specify is run.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-inferenceexecutionconfig.html#cfn-sagemaker-model-inferenceexecutionconfig-mode
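
A short sketch using one of the documented values rather than a placeholder string:

# "Serial" runs the containers as a pipeline; "Direct" invokes only the
# container you specify.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

inference_execution_config = sagemaker_mixins.CfnModelPropsMixin.InferenceExecutionConfigProperty(
    mode="Serial"
)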

ModelAccessConfigProperty

class CfnModelPropsMixin.ModelAccessConfigProperty(*, accept_eula=None)

Bases: object

The access configuration file to control access to the ML model.

You can explicitly accept the model end-user license agreement (EULA) within the ModelAccessConfig.

Parameters:

accept_eula (Union[bool, IResolvable, None]) – Specifies agreement to the model end-user license agreement (EULA). The AcceptEula value must be explicitly defined as True in order to accept the EULA that this model requires. You are responsible for reviewing and complying with any applicable license terms and making sure they are acceptable for your use case before downloading or using a model.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-modelaccessconfig.html

ExampleMetadata:

fixture=_generated

Example:

# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

model_access_config_property = sagemaker_mixins.CfnModelPropsMixin.ModelAccessConfigProperty(
    accept_eula=False
)

Attributes

accept_eula

Specifies agreement to the model end-user license agreement (EULA).

The AcceptEula value must be explicitly defined as True in order to accept the EULA that this model requires. You are responsible for reviewing and complying with any applicable license terms and making sure they are acceptable for your use case before downloading or using a model.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-modelaccessconfig.html#cfn-sagemaker-model-modelaccessconfig-accepteula
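
A short sketch that accepts the EULA explicitly; only set this to True after reviewing the model’s license terms.

from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

model_access_config = sagemaker_mixins.CfnModelPropsMixin.ModelAccessConfigProperty(
    accept_eula=True  # must be explicitly True to accept the model's EULA
)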

ModelDataSourceProperty

class CfnModelPropsMixin.ModelDataSourceProperty(*, s3_data_source=None)

Bases: object

Specifies the location of ML model data to deploy.

If specified, you must specify one and only one of the available data sources.

Parameters:

s3_data_source (Union[IResolvable, S3DataSourceProperty, Dict[str, Any], None]) – Specifies the S3 location of ML model data to deploy.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-modeldatasource.html

ExampleMetadata:

fixture=_generated

Example:

# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

model_data_source_property = sagemaker_mixins.CfnModelPropsMixin.ModelDataSourceProperty(
    s3_data_source=sagemaker_mixins.CfnModelPropsMixin.S3DataSourceProperty(
        compression_type="compressionType",
        hub_access_config=sagemaker_mixins.CfnModelPropsMixin.HubAccessConfigProperty(
            hub_content_arn="hubContentArn"
        ),
        model_access_config=sagemaker_mixins.CfnModelPropsMixin.ModelAccessConfigProperty(
            accept_eula=False
        ),
        s3_data_type="s3DataType",
        s3_uri="s3Uri"
    )
)

Attributes

s3_data_source

Specifies the S3 location of ML model data to deploy.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-modeldatasource.html#cfn-sagemaker-model-modeldatasource-s3datasource

MultiModelConfigProperty

class CfnModelPropsMixin.MultiModelConfigProperty(*, model_cache_setting=None)

Bases: object

Specifies additional configuration for hosting multi-model endpoints.

Parameters:

model_cache_setting (Optional[str]) – Whether to cache models for a multi-model endpoint. By default, multi-model endpoints cache models so that a model does not have to be loaded into memory each time it is invoked. Some use cases do not benefit from model caching. For example, if an endpoint hosts a large number of models that are each invoked infrequently, the endpoint might perform better if you disable model caching. To disable model caching, set the value of this parameter to Disabled.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-multimodelconfig.html

ExampleMetadata:

fixture=_generated

Example:

# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

multi_model_config_property = sagemaker_mixins.CfnModelPropsMixin.MultiModelConfigProperty(
    model_cache_setting="modelCacheSetting"
)

Attributes

model_cache_setting

Whether to cache models for a multi-model endpoint.

By default, multi-model endpoints cache models so that a model does not have to be loaded into memory each time it is invoked. Some use cases do not benefit from model caching. For example, if an endpoint hosts a large number of models that are each invoked infrequently, the endpoint might perform better if you disable model caching. To disable model caching, set the value of this parameter to Disabled.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-multimodelconfig.html#cfn-sagemaker-model-multimodelconfig-modelcachesetting
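
A short sketch that turns model caching off, as described above:

from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

multi_model_config = sagemaker_mixins.CfnModelPropsMixin.MultiModelConfigProperty(
    model_cache_setting="Disabled"  # disables caching; by default multi-model endpoints cache models
)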

RepositoryAuthConfigProperty

class CfnModelPropsMixin.RepositoryAuthConfigProperty(*, repository_credentials_provider_arn=None)

Bases: object

Specifies an authentication configuration for the private docker registry where your model image is hosted.

Specify a value for this property only if you specified Vpc as the value for the RepositoryAccessMode field of the ImageConfig object that you passed to a call to CreateModel and the private Docker registry where the model image is hosted requires authentication.

Parameters:

repository_credentials_provider_arn (Optional[str]) – The Amazon Resource Name (ARN) of an AWS Lambda function that provides credentials to authenticate to the private Docker registry where your model image is hosted. For information about how to create an AWS Lambda function, see Create a Lambda function with the console in the AWS Lambda Developer Guide.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-repositoryauthconfig.html

ExampleMetadata:

fixture=_generated

Example:

# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

repository_auth_config_property = sagemaker_mixins.CfnModelPropsMixin.RepositoryAuthConfigProperty(
    repository_credentials_provider_arn="repositoryCredentialsProviderArn"
)

Attributes

repository_credentials_provider_arn

The Amazon Resource Name (ARN) of an AWS Lambda function that provides credentials to authenticate to the private Docker registry where your model image is hosted.

For information about how to create an AWS Lambda function, see Create a Lambda function with the console in the AWS Lambda Developer Guide.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-repositoryauthconfig.html#cfn-sagemaker-model-repositoryauthconfig-repositorycredentialsproviderarn

S3DataSourceProperty

class CfnModelPropsMixin.S3DataSourceProperty(*, compression_type=None, hub_access_config=None, model_access_config=None, s3_data_type=None, s3_uri=None)

Bases: object

Describes the S3 data source.

Your input bucket must be in the same AWS region as your training job.

Parameters:
  • compression_type (Optional[str])

  • hub_access_config (Union[IResolvable, HubAccessConfigProperty, Dict[str, Any], None]) – The configuration for a private hub model reference that points to a SageMaker JumpStart public hub model.

  • model_access_config (Union[IResolvable, ModelAccessConfigProperty, Dict[str, Any], None])

  • s3_data_type (Optional[str]) – If you choose S3Prefix, S3Uri identifies a key name prefix. SageMaker uses all objects that match the specified key name prefix for model training. If you choose ManifestFile, S3Uri identifies an object that is a manifest file containing a list of object keys that you want SageMaker to use for model training. If you choose AugmentedManifestFile, S3Uri identifies an object that is an augmented manifest file in JSON lines format. This file contains the data you want to use for model training. AugmentedManifestFile can only be used if the Channel’s input mode is Pipe. If you choose Converse, S3Uri identifies an Amazon S3 location that contains data formatted according to Converse format. This format structures conversational messages with specific roles and content types used for training and fine-tuning foundational models.

  • s3_uri (Optional[str]) – Depending on the value specified for the S3DataType, identifies either a key name prefix or a manifest. For example: - A key name prefix might look like this: s3://bucketname/exampleprefix/ - A manifest might look like this: s3://bucketname/example.manifest A manifest is an S3 object which is a JSON file consisting of an array of elements. The first element is a prefix which is followed by one or more suffixes. SageMaker appends the suffix elements to the prefix to get a full set of S3Uri. Note that the prefix must be a valid non-empty S3Uri that precludes users from specifying a manifest whose individual S3Uri is sourced from different S3 buckets. The following code example shows a valid manifest format: [ {"prefix": "s3://customer_bucket/some/prefix/"}, "relative/path/to/custdata-1", "relative/path/custdata-2", ... "relative/path/custdata-N" ] This JSON is equivalent to the following S3Uri list: s3://customer_bucket/some/prefix/relative/path/to/custdata-1 s3://customer_bucket/some/prefix/relative/path/custdata-2 ... s3://customer_bucket/some/prefix/relative/path/custdata-N The complete set of S3Uri in this manifest is the input data for the channel for this data source. The object that each S3Uri points to must be readable by the IAM role that SageMaker uses to perform tasks on your behalf. Your input bucket must be located in the same AWS region as your training job.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-s3datasource.html

ExampleMetadata:

fixture=_generated

Example:

# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

s3_data_source_property = sagemaker_mixins.CfnModelPropsMixin.S3DataSourceProperty(
    compression_type="compressionType",
    hub_access_config=sagemaker_mixins.CfnModelPropsMixin.HubAccessConfigProperty(
        hub_content_arn="hubContentArn"
    ),
    model_access_config=sagemaker_mixins.CfnModelPropsMixin.ModelAccessConfigProperty(
        accept_eula=False
    ),
    s3_data_type="s3DataType",
    s3_uri="s3Uri"
)

Attributes

compression_type

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-s3datasource.html#cfn-sagemaker-model-s3datasource-compressiontype

hub_access_config

The configuration for a private hub model reference that points to a SageMaker JumpStart public hub model.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-s3datasource.html#cfn-sagemaker-model-s3datasource-hubaccessconfig

model_access_config

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-s3datasource.html#cfn-sagemaker-model-s3datasource-modelaccessconfig

s3_data_type

If you choose S3Prefix, S3Uri identifies a key name prefix.

SageMaker uses all objects that match the specified key name prefix for model training.

If you choose ManifestFile, S3Uri identifies an object that is a manifest file containing a list of object keys that you want SageMaker to use for model training.

If you choose AugmentedManifestFile, S3Uri identifies an object that is an augmented manifest file in JSON lines format. This file contains the data you want to use for model training. AugmentedManifestFile can only be used if the Channel’s input mode is Pipe.

If you choose Converse, S3Uri identifies an Amazon S3 location that contains data formatted according to Converse format. This format structures conversational messages with specific roles and content types used for training and fine-tuning foundational models.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-s3datasource.html#cfn-sagemaker-model-s3datasource-s3datatype

s3_uri

Depending on the value specified for the S3DataType, identifies either a key name prefix or a manifest.

For example:

  • A key name prefix might look like this: s3://bucketname/exampleprefix/

  • A manifest might look like this: s3://bucketname/example.manifest

A manifest is an S3 object which is a JSON file consisting of an array of elements. The first element is a prefix which is followed by one or more suffixes. SageMaker appends the suffix elements to the prefix to get a full set of S3Uri. Note that the prefix must be a valid non-empty S3Uri that precludes users from specifying a manifest whose individual S3Uri is sourced from different S3 buckets.

The following code example shows a valid manifest format:

[ {"prefix": "s3://customer_bucket/some/prefix/"},

"relative/path/to/custdata-1",

"relative/path/custdata-2",

...

"relative/path/custdata-N"

]

This JSON is equivalent to the following S3Uri list:

s3://customer_bucket/some/prefix/relative/path/to/custdata-1

s3://customer_bucket/some/prefix/relative/path/custdata-2

...

s3://customer_bucket/some/prefix/relative/path/custdata-N

The complete set of S3Uri in this manifest is the input data for the channel for this data source. The object that each S3Uri points to must be readable by the IAM role that SageMaker uses to perform tasks on your behalf.

Your input bucket must be located in the same AWS region as your training job.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-s3datasource.html#cfn-sagemaker-model-s3datasource-s3uri
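
Putting the pieces together, the sketch below configures an S3 prefix data source; the bucket, prefix, and compression type value are assumptions for illustration (the linked CloudFormation reference lists the allowed values).

# A hedged sketch of an S3DataSourceProperty using the S3Prefix data type
# described above. The bucket/prefix are placeholders and the compression
# type string is an assumed value; check the CloudFormation reference.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

s3_data_source = sagemaker_mixins.CfnModelPropsMixin.S3DataSourceProperty(
    s3_data_type="S3Prefix",
    s3_uri="s3://my-model-bucket/llm-weights/",
    compression_type="None",
    model_access_config=sagemaker_mixins.CfnModelPropsMixin.ModelAccessConfigProperty(
        accept_eula=True
    )
)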

VpcConfigProperty

class CfnModelPropsMixin.VpcConfigProperty(*, security_group_ids=None, subnets=None)

Bases: object

Specifies an Amazon Virtual Private Cloud (VPC) that your SageMaker jobs, hosted models, and compute resources have access to.

You can control access to and from your resources by configuring a VPC. For more information, see Give SageMaker Access to Resources in your Amazon VPC.

Parameters:
  • security_group_ids (Optional[Sequence[str]]) – The VPC security group IDs, in the form sg-xxxxxxxx. Specify the security groups for the VPC that is specified in the Subnets field.

  • subnets (Optional[Sequence[str]]) – The IDs of the subnets in the VPC to which you want to connect your training job or model. For information about the availability of specific instance types, see Supported Instance Types and Availability Zones.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-vpcconfig.html

ExampleMetadata:

fixture=_generated

Example:

# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

vpc_config_property = sagemaker_mixins.CfnModelPropsMixin.VpcConfigProperty(
    security_group_ids=["securityGroupIds"],
    subnets=["subnets"]
)

Attributes

security_group_ids

The VPC security group IDs, in the form sg-xxxxxxxx.

Specify the security groups for the VPC that is specified in the Subnets field.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-vpcconfig.html#cfn-sagemaker-model-vpcconfig-securitygroupids

subnets

The IDs of the subnets in the VPC to which you want to connect your training job or model.

For information about the availability of specific instance types, see Supported Instance Types and Availability Zones.

See:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-vpcconfig.html#cfn-sagemaker-model-vpcconfig-subnets
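
A short sketch with the identifier formats described above; the IDs are placeholders.

from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

vpc_config = sagemaker_mixins.CfnModelPropsMixin.VpcConfigProperty(
    security_group_ids=["sg-0123456789abcdef0"],
    subnets=["subnet-0123456789abcdef0", "subnet-0fedcba9876543210"]
)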