CfnInferenceComponentMixinProps
- class aws_cdk.mixins_preview.aws_sagemaker.mixins.CfnInferenceComponentMixinProps(*, deployment_config=None, endpoint_arn=None, endpoint_name=None, inference_component_name=None, runtime_config=None, specification=None, tags=None, variant_name=None)
Bases: object

Properties for CfnInferenceComponentPropsMixin.
- Parameters:
  - deployment_config (Union[IResolvable, InferenceComponentDeploymentConfigProperty, Dict[str, Any], None]) – The deployment configuration for an endpoint, which contains the desired deployment strategy and rollback configurations.
  - endpoint_arn (Optional[str]) – The Amazon Resource Name (ARN) of the endpoint that hosts the inference component.
  - endpoint_name (Optional[str]) – The name of the endpoint that hosts the inference component.
  - inference_component_name (Optional[str]) – The name of the inference component.
  - runtime_config (Union[IResolvable, InferenceComponentRuntimeConfigProperty, Dict[str, Any], None]) – The runtime config for the inference component.
  - specification (Union[IResolvable, InferenceComponentSpecificationProperty, Dict[str, Any], None]) – The specification for the inference component.
  - tags (Optional[Sequence[Union[CfnTag, Dict[str, Any]]]]) – An array of tags to apply to the resource.
  - variant_name (Optional[str]) – The name of the production variant that hosts the inference component.
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import CfnTag
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

cfn_inference_component_mixin_props = sagemaker_mixins.CfnInferenceComponentMixinProps(
    deployment_config=sagemaker_mixins.CfnInferenceComponentPropsMixin.InferenceComponentDeploymentConfigProperty(
        auto_rollback_configuration=sagemaker_mixins.CfnInferenceComponentPropsMixin.AutoRollbackConfigurationProperty(
            alarms=[sagemaker_mixins.CfnInferenceComponentPropsMixin.AlarmProperty(
                alarm_name="alarmName"
            )]
        ),
        rolling_update_policy=sagemaker_mixins.CfnInferenceComponentPropsMixin.InferenceComponentRollingUpdatePolicyProperty(
            maximum_batch_size=sagemaker_mixins.CfnInferenceComponentPropsMixin.InferenceComponentCapacitySizeProperty(
                type="type",
                value=123
            ),
            maximum_execution_timeout_in_seconds=123,
            rollback_maximum_batch_size=sagemaker_mixins.CfnInferenceComponentPropsMixin.InferenceComponentCapacitySizeProperty(
                type="type",
                value=123
            ),
            wait_interval_in_seconds=123
        )
    ),
    endpoint_arn="endpointArn",
    endpoint_name="endpointName",
    inference_component_name="inferenceComponentName",
    runtime_config=sagemaker_mixins.CfnInferenceComponentPropsMixin.InferenceComponentRuntimeConfigProperty(
        copy_count=123,
        current_copy_count=123,
        desired_copy_count=123
    ),
    specification=sagemaker_mixins.CfnInferenceComponentPropsMixin.InferenceComponentSpecificationProperty(
        base_inference_component_name="baseInferenceComponentName",
        compute_resource_requirements=sagemaker_mixins.CfnInferenceComponentPropsMixin.InferenceComponentComputeResourceRequirementsProperty(
            max_memory_required_in_mb=123,
            min_memory_required_in_mb=123,
            number_of_accelerator_devices_required=123,
            number_of_cpu_cores_required=123
        ),
        container=sagemaker_mixins.CfnInferenceComponentPropsMixin.InferenceComponentContainerSpecificationProperty(
            artifact_url="artifactUrl",
            deployed_image=sagemaker_mixins.CfnInferenceComponentPropsMixin.DeployedImageProperty(
                resolution_time="resolutionTime",
                resolved_image="resolvedImage",
                specified_image="specifiedImage"
            ),
            environment={
                "environment_key": "environment"
            },
            image="image"
        ),
        model_name="modelName",
        startup_parameters=sagemaker_mixins.CfnInferenceComponentPropsMixin.InferenceComponentStartupParametersProperty(
            container_startup_health_check_timeout_in_seconds=123,
            model_data_download_timeout_in_seconds=123
        )
    ),
    tags=[CfnTag(
        key="key",
        value="value"
    )],
    variant_name="variantName"
)
Attributes
- deployment_config
The deployment configuration for an endpoint, which contains the desired deployment strategy and rollback configurations.
- endpoint_arn
The Amazon Resource Name (ARN) of the endpoint that hosts the inference component.
- endpoint_name
The name of the endpoint that hosts the inference component.
- inference_component_name
The name of the inference component.
- runtime_config
The runtime config for the inference component.
- specification
The specification for the inference component.
- tags
An array of tags to apply to the resource.
- variant_name
The name of the production variant that hosts the inference component.