CfnProcessingJobPropsMixin
- class aws_cdk.mixins_preview.aws_sagemaker.mixins.CfnProcessingJobPropsMixin(props, *, strategy=None)
Bases:
Mixin
An Amazon SageMaker processing job that is used to analyze data and evaluate models.
For more information, see Process Data and Evaluate Models.
Also, note the following details specific to processing jobs created using CloudFormation stacks:
When you delete a CloudFormation stack with a processing job resource, the processing job is stopped using the StopProcessingJob API but not deleted. Any tags associated with the processing job are deleted using the DeleteTags API.
If any part of your CloudFormation stack deployment fails and a rollback initiates, processing jobs with a specified ProcessingJobName value might cause the stack to become stuck in a failed state. This occurs because during a rollback, CloudFormation attempts to recreate the stack resources. Processing job names must be unique, so when CloudFormation attempts to recreate a processing job using the already defined name, this results in an AlreadyExists error. To prevent this, we recommend that you don’t specify the optional ProcessingJobName property, thereby allowing SageMaker to auto-generate a unique name for your processing job. This ensures successful stack rollbacks when necessary. If you must use custom job names, you have to manually modify the ProcessingJobName and redeploy the stack to recover from a failed rollback.
- See:
- CloudformationResource:
AWS::SageMaker::ProcessingJob
- Mixin:
true
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import CfnTag
from aws_cdk.mixins_preview import mixins
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

cfn_processing_job_props_mixin = sagemaker_mixins.CfnProcessingJobPropsMixin(sagemaker_mixins.CfnProcessingJobMixinProps(
    app_specification=sagemaker_mixins.CfnProcessingJobPropsMixin.AppSpecificationProperty(
        container_arguments=["containerArguments"],
        container_entrypoint=["containerEntrypoint"],
        image_uri="imageUri"
    ),
    environment={
        "environment_key": "environment"
    },
    experiment_config=sagemaker_mixins.CfnProcessingJobPropsMixin.ExperimentConfigProperty(
        experiment_name="experimentName",
        run_name="runName",
        trial_component_display_name="trialComponentDisplayName",
        trial_name="trialName"
    ),
    network_config=sagemaker_mixins.CfnProcessingJobPropsMixin.NetworkConfigProperty(
        enable_inter_container_traffic_encryption=False,
        enable_network_isolation=False,
        vpc_config=sagemaker_mixins.CfnProcessingJobPropsMixin.VpcConfigProperty(
            security_group_ids=["securityGroupIds"],
            subnets=["subnets"]
        )
    ),
    processing_inputs=[sagemaker_mixins.CfnProcessingJobPropsMixin.ProcessingInputsObjectProperty(
        app_managed=False,
        dataset_definition=sagemaker_mixins.CfnProcessingJobPropsMixin.DatasetDefinitionProperty(
            athena_dataset_definition=sagemaker_mixins.CfnProcessingJobPropsMixin.AthenaDatasetDefinitionProperty(
                catalog="catalog",
                database="database",
                kms_key_id="kmsKeyId",
                output_compression="outputCompression",
                output_format="outputFormat",
                output_s3_uri="outputS3Uri",
                query_string="queryString",
                work_group="workGroup"
            ),
            data_distribution_type="dataDistributionType",
            input_mode="inputMode",
            local_path="localPath",
            redshift_dataset_definition=sagemaker_mixins.CfnProcessingJobPropsMixin.RedshiftDatasetDefinitionProperty(
                cluster_id="clusterId",
                cluster_role_arn="clusterRoleArn",
                database="database",
                db_user="dbUser",
                kms_key_id="kmsKeyId",
                output_compression="outputCompression",
                output_format="outputFormat",
                output_s3_uri="outputS3Uri",
                query_string="queryString"
            )
        ),
        input_name="inputName",
        s3_input=sagemaker_mixins.CfnProcessingJobPropsMixin.S3InputProperty(
            local_path="localPath",
            s3_compression_type="s3CompressionType",
            s3_data_distribution_type="s3DataDistributionType",
            s3_data_type="s3DataType",
            s3_input_mode="s3InputMode",
            s3_uri="s3Uri"
        )
    )],
    processing_job_name="processingJobName",
    processing_output_config=sagemaker_mixins.CfnProcessingJobPropsMixin.ProcessingOutputConfigProperty(
        kms_key_id="kmsKeyId",
        outputs=[sagemaker_mixins.CfnProcessingJobPropsMixin.ProcessingOutputsObjectProperty(
            app_managed=False,
            feature_store_output=sagemaker_mixins.CfnProcessingJobPropsMixin.FeatureStoreOutputProperty(
                feature_group_name="featureGroupName"
            ),
            output_name="outputName",
            s3_output=sagemaker_mixins.CfnProcessingJobPropsMixin.S3OutputProperty(
                local_path="localPath",
                s3_upload_mode="s3UploadMode",
                s3_uri="s3Uri"
            )
        )]
    ),
    processing_resources=sagemaker_mixins.CfnProcessingJobPropsMixin.ProcessingResourcesProperty(
        cluster_config=sagemaker_mixins.CfnProcessingJobPropsMixin.ClusterConfigProperty(
            instance_count=123,
            instance_type="instanceType",
            volume_kms_key_id="volumeKmsKeyId",
            volume_size_in_gb=123
        )
    ),
    role_arn="roleArn",
    stopping_condition=sagemaker_mixins.CfnProcessingJobPropsMixin.StoppingConditionProperty(
        max_runtime_in_seconds=123
    ),
    tags=[CfnTag(
        key="key",
        value="value"
    )]
), strategy=mixins.PropertyMergeStrategy.OVERRIDE)
Create a mixin to apply properties to AWS::SageMaker::ProcessingJob.
- Parameters:
props (Union[CfnProcessingJobMixinProps, Dict[str, Any]]) – L1 properties to apply.
strategy (Optional[PropertyMergeStrategy]) – (experimental) Strategy for merging nested properties. Default: PropertyMergeStrategy.MERGE
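The generated example above sets every available property; in practice a props mixin usually carries only the properties you want to layer onto a target resource. The following is a minimal sketch under that assumption (it treats all CfnProcessingJobMixinProps fields as optional, and the values are placeholders):

from aws_cdk.mixins_preview import mixins
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

# Carry only the properties to apply. PropertyMergeStrategy.MERGE (the default)
# deep-merges nested properties, while OVERRIDE replaces them wholesale.
stopping_condition_mixin = sagemaker_mixins.CfnProcessingJobPropsMixin(
    sagemaker_mixins.CfnProcessingJobMixinProps(
        stopping_condition=sagemaker_mixins.CfnProcessingJobPropsMixin.StoppingConditionProperty(
            max_runtime_in_seconds=3600  # placeholder value
        )
    ),
    strategy=mixins.PropertyMergeStrategy.MERGE
)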
Methods
- apply_to(construct)
Apply the mixin properties to the construct.
- Parameters:
construct (IConstruct)
- Return type:
- supports(construct)
Check if this mixin supports the given construct.
- Parameters:
construct (IConstruct)
- Return type:
bool
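As a usage sketch, supports can guard apply_to before the mixin writes its L1 properties onto a construct. The aws_cdk.aws_sagemaker.CfnProcessingJob construct and its constructor arguments below are assumptions for illustration (property names mirror the AWS::SageMaker::ProcessingJob resource; values are placeholders), not something defined by this mixins module:

from aws_cdk import Stack
from aws_cdk import aws_sagemaker as sagemaker
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

stack = Stack()

# Assumed existing L1 resource for AWS::SageMaker::ProcessingJob (placeholder values).
cfn_processing_job = sagemaker.CfnProcessingJob(stack, "MyProcessingJob",
    app_specification=sagemaker.CfnProcessingJob.AppSpecificationProperty(
        image_uri="imageUri"
    ),
    processing_resources=sagemaker.CfnProcessingJob.ProcessingResourcesProperty(
        cluster_config=sagemaker.CfnProcessingJob.ClusterConfigProperty(
            instance_count=1,
            instance_type="ml.m5.xlarge",
            volume_size_in_gb=30
        )
    ),
    role_arn="roleArn"
)

# A mixin carrying just the environment variables to merge onto the job.
env_mixin = sagemaker_mixins.CfnProcessingJobPropsMixin(
    sagemaker_mixins.CfnProcessingJobMixinProps(
        environment={"LOG_LEVEL": "INFO"}
    )
)

if env_mixin.supports(cfn_processing_job):
    env_mixin.apply_to(cfn_processing_job)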
Attributes
- CFN_PROPERTY_KEYS = ['appSpecification', 'environment', 'experimentConfig', 'networkConfig', 'processingInputs', 'processingJobName', 'processingOutputConfig', 'processingResources', 'roleArn', 'stoppingCondition', 'tags']
Static Methods
- classmethod is_mixin(x)
(experimental) Checks if x is a Mixin.
- Parameters:
x (Any) – Any object.
- Return type:
bool
- Returns:
true if x is an object created from a class which extends Mixin.
- Stability:
experimental
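A small sketch of the check, assuming CfnProcessingJobMixinProps can be constructed with no properties set:

from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

mixin = sagemaker_mixins.CfnProcessingJobPropsMixin(
    sagemaker_mixins.CfnProcessingJobMixinProps()
)

# True for any object created from a class that extends Mixin; False otherwise.
sagemaker_mixins.CfnProcessingJobPropsMixin.is_mixin(mixin)          # True
sagemaker_mixins.CfnProcessingJobPropsMixin.is_mixin("not a mixin")  # False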
AppSpecificationProperty
- class CfnProcessingJobPropsMixin.AppSpecificationProperty(*, container_arguments=None, container_entrypoint=None, image_uri=None)
Bases:
object
Configuration to run a processing job in a specified container image.
- Parameters:
container_arguments (Optional[Sequence[str]]) – The arguments for a container used to run a processing job.
container_entrypoint (Optional[Sequence[str]]) – The entrypoint for a container used to run a processing job.
image_uri (Optional[str]) – The container image to be run by the processing job.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

app_specification_property = sagemaker_mixins.CfnProcessingJobPropsMixin.AppSpecificationProperty(
    container_arguments=["containerArguments"],
    container_entrypoint=["containerEntrypoint"],
    image_uri="imageUri"
)
Attributes
- container_arguments
The arguments for a container used to run a processing job.
- container_entrypoint
The entrypoint for a container used to run a processing job.
- image_uri
The container image to be run by the processing job.
AthenaDatasetDefinitionProperty
- class CfnProcessingJobPropsMixin.AthenaDatasetDefinitionProperty(*, catalog=None, database=None, kms_key_id=None, output_compression=None, output_format=None, output_s3_uri=None, query_string=None, work_group=None)
Bases:
object
Configuration for Athena Dataset Definition input.
- Parameters:
catalog (Optional[str]) – The name of the data catalog used in Athena query execution.
database (Optional[str]) – The name of the database used in the Athena query execution.
kms_key_id (Optional[str]) – The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data generated from an Athena query execution.
output_compression (Optional[str]) – The compression used for Athena query results.
output_format (Optional[str]) – The data storage format for Athena query results.
output_s3_uri (Optional[str]) – The location in Amazon S3 where Athena query results are stored.
query_string (Optional[str]) – The SQL query statements to be executed.
work_group (Optional[str]) – The name of the workgroup in which the Athena query is being started.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

athena_dataset_definition_property = sagemaker_mixins.CfnProcessingJobPropsMixin.AthenaDatasetDefinitionProperty(
    catalog="catalog",
    database="database",
    kms_key_id="kmsKeyId",
    output_compression="outputCompression",
    output_format="outputFormat",
    output_s3_uri="outputS3Uri",
    query_string="queryString",
    work_group="workGroup"
)
Attributes
- catalog
The name of the data catalog used in Athena query execution.
- database
The name of the database used in the Athena query execution.
- kms_key_id
The AWS Key Management Service ( AWS KMS) key that Amazon SageMaker uses to encrypt data generated from an Athena query execution.
- output_compression
The compression used for Athena query results.
- output_format
The data storage format for Athena query results.
- output_s3_uri
The location in Amazon S3 where Athena query results are stored.
- query_string
The SQL query statements to be executed.
- work_group
The name of the workgroup in which the Athena query is being started.
ClusterConfigProperty
- class CfnProcessingJobPropsMixin.ClusterConfigProperty(*, instance_count=None, instance_type=None, volume_kms_key_id=None, volume_size_in_gb=None)
Bases:
object
Configuration for the cluster used to run a processing job.
- Parameters:
instance_count (Union[int, float, None]) – The number of ML compute instances to use in the processing job. For distributed processing jobs, specify a value greater than 1. The default value is 1.
instance_type (Optional[str]) – The ML compute instance type for the processing job.
volume_kms_key_id (Optional[str]) – The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the processing job. Certain Nitro-based instances include local storage, dependent on the instance type. Local storage volumes are encrypted using a hardware module on the instance. You can’t request a VolumeKmsKeyId when using an instance type with local storage. For a list of instance types that support local instance storage, see Instance Store Volumes. For more information about local instance storage encryption, see SSD Instance Store Volumes.
volume_size_in_gb (Union[int, float, None]) – The size of the ML storage volume in gigabytes that you want to provision. You must specify sufficient ML storage for your scenario. Certain Nitro-based instances include local storage with a fixed total size, dependent on the instance type. When using these instances for processing, Amazon SageMaker mounts the local instance storage instead of Amazon EBS gp2 storage. You can’t request a VolumeSizeInGB greater than the total size of the local instance storage. For a list of instance types that support local instance storage, including the total size per instance type, see Instance Store Volumes.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

cluster_config_property = sagemaker_mixins.CfnProcessingJobPropsMixin.ClusterConfigProperty(
    instance_count=123,
    instance_type="instanceType",
    volume_kms_key_id="volumeKmsKeyId",
    volume_size_in_gb=123
)
Attributes
- instance_count
The number of ML compute instances to use in the processing job.
For distributed processing jobs, specify a value greater than 1. The default value is 1.
- instance_type
The ML compute instance type for the processing job.
- volume_kms_key_id
The AWS Key Management Service ( AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the processing job.
Certain Nitro-based instances include local storage, dependent on the instance type. Local storage volumes are encrypted using a hardware module on the instance. You can’t request a
VolumeKmsKeyId when using an instance type with local storage. For a list of instance types that support local instance storage, see Instance Store Volumes.
For more information about local instance storage encryption, see SSD Instance Store Volumes.
- volume_size_in_gb
The size of the ML storage volume in gigabytes that you want to provision.
You must specify sufficient ML storage for your scenario.
Certain Nitro-based instances include local storage with a fixed total size, dependent on the instance type. When using these instances for processing, Amazon SageMaker mounts the local instance storage instead of Amazon EBS gp2 storage. You can’t request a VolumeSizeInGB greater than the total size of the local instance storage. For a list of instance types that support local instance storage, including the total size per instance type, see Instance Store Volumes.
DatasetDefinitionProperty
- class CfnProcessingJobPropsMixin.DatasetDefinitionProperty(*, athena_dataset_definition=None, data_distribution_type=None, input_mode=None, local_path=None, redshift_dataset_definition=None)
Bases:
object
Configuration for Dataset Definition inputs.
The Dataset Definition input must specify exactly one of either AthenaDatasetDefinition or RedshiftDatasetDefinition types.
- Parameters:
athena_dataset_definition (Union[IResolvable, AthenaDatasetDefinitionProperty, Dict[str, Any], None]) – Configuration for Athena Dataset Definition input.
data_distribution_type (Optional[str]) – Whether the generated dataset is FullyReplicated or ShardedByS3Key (default).
input_mode (Optional[str]) – Whether to use File or Pipe input mode. In File (default) mode, Amazon SageMaker copies the data from the input source onto the local Amazon Elastic Block Store (Amazon EBS) volumes before starting your training algorithm. This is the most commonly used input mode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your algorithm without using the EBS volume.
local_path (Optional[str]) – The local path where you want Amazon SageMaker to download the Dataset Definition inputs to run a processing job. LocalPath is an absolute path to the input data. This is a required parameter when AppManaged is False (default).
redshift_dataset_definition (Union[IResolvable, RedshiftDatasetDefinitionProperty, Dict[str, Any], None]) – Configuration for Redshift Dataset Definition input.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

dataset_definition_property = sagemaker_mixins.CfnProcessingJobPropsMixin.DatasetDefinitionProperty(
    athena_dataset_definition=sagemaker_mixins.CfnProcessingJobPropsMixin.AthenaDatasetDefinitionProperty(
        catalog="catalog",
        database="database",
        kms_key_id="kmsKeyId",
        output_compression="outputCompression",
        output_format="outputFormat",
        output_s3_uri="outputS3Uri",
        query_string="queryString",
        work_group="workGroup"
    ),
    data_distribution_type="dataDistributionType",
    input_mode="inputMode",
    local_path="localPath",
    redshift_dataset_definition=sagemaker_mixins.CfnProcessingJobPropsMixin.RedshiftDatasetDefinitionProperty(
        cluster_id="clusterId",
        cluster_role_arn="clusterRoleArn",
        database="database",
        db_user="dbUser",
        kms_key_id="kmsKeyId",
        output_compression="outputCompression",
        output_format="outputFormat",
        output_s3_uri="outputS3Uri",
        query_string="queryString"
    )
)
Attributes
- athena_dataset_definition
Configuration for Athena Dataset Definition input.
- data_distribution_type
Whether the generated dataset is FullyReplicated or ShardedByS3Key (default).
- input_mode
Whether to use File or Pipe input mode.
In File (default) mode, Amazon SageMaker copies the data from the input source onto the local Amazon Elastic Block Store (Amazon EBS) volumes before starting your training algorithm. This is the most commonly used input mode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your algorithm without using the EBS volume.
- local_path
The local path where you want Amazon SageMaker to download the Dataset Definition inputs to run a processing job.
LocalPath is an absolute path to the input data. This is a required parameter when AppManaged is False (default).
- redshift_dataset_definition
Configuration for Redshift Dataset Definition input.
ExperimentConfigProperty
- class CfnProcessingJobPropsMixin.ExperimentConfigProperty(*, experiment_name=None, run_name=None, trial_component_display_name=None, trial_name=None)
Bases:
object
Associates a SageMaker job as a trial component with an experiment and trial.
Specified when you call the CreateProcessingJob API.
- Parameters:
experiment_name (Optional[str]) – The name of an existing experiment to associate with the trial component.
run_name (Optional[str]) – The name of the experiment run to associate with the trial component.
trial_component_display_name (Optional[str]) – The display name for the trial component. If this key isn’t specified, the display name is the trial component name.
trial_name (Optional[str]) – The name of an existing trial to associate the trial component with. If not specified, a new trial is created.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

experiment_config_property = sagemaker_mixins.CfnProcessingJobPropsMixin.ExperimentConfigProperty(
    experiment_name="experimentName",
    run_name="runName",
    trial_component_display_name="trialComponentDisplayName",
    trial_name="trialName"
)
Attributes
- experiment_name
The name of an existing experiment to associate with the trial component.
- run_name
The name of the experiment run to associate with the trial component.
- trial_component_display_name
The display name for the trial component.
If this key isn’t specified, the display name is the trial component name.
- trial_name
The name of an existing trial to associate the trial component with.
If not specified, a new trial is created.
FeatureStoreOutputProperty
- class CfnProcessingJobPropsMixin.FeatureStoreOutputProperty(*, feature_group_name=None)
Bases:
object
Configuration for processing job outputs in Amazon SageMaker Feature Store.
- Parameters:
feature_group_name (Optional[str]) – The name of the Amazon SageMaker FeatureGroup to use as the destination for processing job output. Note that your processing script is responsible for putting records into your Feature Store.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

feature_store_output_property = sagemaker_mixins.CfnProcessingJobPropsMixin.FeatureStoreOutputProperty(
    feature_group_name="featureGroupName"
)
Attributes
- feature_group_name
The name of the Amazon SageMaker FeatureGroup to use as the destination for processing job output.
Note that your processing script is responsible for putting records into your Feature Store.
NetworkConfigProperty
- class CfnProcessingJobPropsMixin.NetworkConfigProperty(*, enable_inter_container_traffic_encryption=None, enable_network_isolation=None, vpc_config=None)
Bases:
object
Networking options for a job, such as network traffic encryption between containers, whether to allow inbound and outbound network calls to and from containers, and the VPC subnets and security groups to use for VPC-enabled jobs.
- Parameters:
enable_inter_container_traffic_encryption (Union[bool, IResolvable, None]) – Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.
enable_network_isolation (Union[bool, IResolvable, None]) – Whether to allow inbound and outbound network calls to and from the containers used for the processing job.
vpc_config (Union[IResolvable, VpcConfigProperty, Dict[str, Any], None]) – Specifies an Amazon Virtual Private Cloud (VPC) that your SageMaker jobs, hosted models, and compute resources have access to. You can control access to and from your resources by configuring a VPC. For more information, see Give SageMaker Access to Resources in your Amazon VPC.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

network_config_property = sagemaker_mixins.CfnProcessingJobPropsMixin.NetworkConfigProperty(
    enable_inter_container_traffic_encryption=False,
    enable_network_isolation=False,
    vpc_config=sagemaker_mixins.CfnProcessingJobPropsMixin.VpcConfigProperty(
        security_group_ids=["securityGroupIds"],
        subnets=["subnets"]
    )
)
Attributes
- enable_inter_container_traffic_encryption
Whether to encrypt all communications between distributed processing jobs.
Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.
- enable_network_isolation
Whether to allow inbound and outbound network calls to and from the containers used for the processing job.
- vpc_config
Specifies an Amazon Virtual Private Cloud (VPC) that your SageMaker jobs, hosted models, and compute resources have access to.
You can control access to and from your resources by configuring a VPC. For more information, see Give SageMaker Access to Resources in your Amazon VPC .
ProcessingInputsObjectProperty
- class CfnProcessingJobPropsMixin.ProcessingInputsObjectProperty(*, app_managed=None, dataset_definition=None, input_name=None, s3_input=None)
Bases:
object
The inputs for a processing job.
The processing input must specify exactly one of either S3Input or DatasetDefinition types.
- Parameters:
app_managed (Union[bool, IResolvable, None]) – When True, input operations such as data download are managed natively by the processing job application. When False (default), input operations are managed by Amazon SageMaker.
dataset_definition (Union[IResolvable, DatasetDefinitionProperty, Dict[str, Any], None]) – Configuration for Dataset Definition inputs. The Dataset Definition input must specify exactly one of either AthenaDatasetDefinition or RedshiftDatasetDefinition types.
input_name (Optional[str]) – The name for the processing job input.
s3_input (Union[IResolvable, S3InputProperty, Dict[str, Any], None]) – Configuration for downloading input data from Amazon S3 into the processing container.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

processing_inputs_object_property = sagemaker_mixins.CfnProcessingJobPropsMixin.ProcessingInputsObjectProperty(
    app_managed=False,
    dataset_definition=sagemaker_mixins.CfnProcessingJobPropsMixin.DatasetDefinitionProperty(
        athena_dataset_definition=sagemaker_mixins.CfnProcessingJobPropsMixin.AthenaDatasetDefinitionProperty(
            catalog="catalog",
            database="database",
            kms_key_id="kmsKeyId",
            output_compression="outputCompression",
            output_format="outputFormat",
            output_s3_uri="outputS3Uri",
            query_string="queryString",
            work_group="workGroup"
        ),
        data_distribution_type="dataDistributionType",
        input_mode="inputMode",
        local_path="localPath",
        redshift_dataset_definition=sagemaker_mixins.CfnProcessingJobPropsMixin.RedshiftDatasetDefinitionProperty(
            cluster_id="clusterId",
            cluster_role_arn="clusterRoleArn",
            database="database",
            db_user="dbUser",
            kms_key_id="kmsKeyId",
            output_compression="outputCompression",
            output_format="outputFormat",
            output_s3_uri="outputS3Uri",
            query_string="queryString"
        )
    ),
    input_name="inputName",
    s3_input=sagemaker_mixins.CfnProcessingJobPropsMixin.S3InputProperty(
        local_path="localPath",
        s3_compression_type="s3CompressionType",
        s3_data_distribution_type="s3DataDistributionType",
        s3_data_type="s3DataType",
        s3_input_mode="s3InputMode",
        s3_uri="s3Uri"
    )
)
Attributes
- app_managed
When True, input operations such as data download are managed natively by the processing job application.
When False (default), input operations are managed by Amazon SageMaker.
- dataset_definition
Configuration for Dataset Definition inputs.
The Dataset Definition input must specify exactly one of either AthenaDatasetDefinition or RedshiftDatasetDefinition types.
- input_name
The name for the processing job input.
- s3_input
Configuration for downloading input data from Amazon S3 into the processing container.
ProcessingOutputConfigProperty
- class CfnProcessingJobPropsMixin.ProcessingOutputConfigProperty(*, kms_key_id=None, outputs=None)
Bases:
object
Configuration for uploading output from the processing container.
- Parameters:
kms_key_id (Optional[str]) – The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the processing job output. KmsKeyId can be an ID of a KMS key, ARN of a KMS key, or alias of a KMS key. The KmsKeyId is applied to all outputs.
outputs (Union[IResolvable, Sequence[Union[IResolvable, ProcessingOutputsObjectProperty, Dict[str, Any]]], None]) – An array of outputs configuring the data to upload from the processing container.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

processing_output_config_property = sagemaker_mixins.CfnProcessingJobPropsMixin.ProcessingOutputConfigProperty(
    kms_key_id="kmsKeyId",
    outputs=[sagemaker_mixins.CfnProcessingJobPropsMixin.ProcessingOutputsObjectProperty(
        app_managed=False,
        feature_store_output=sagemaker_mixins.CfnProcessingJobPropsMixin.FeatureStoreOutputProperty(
            feature_group_name="featureGroupName"
        ),
        output_name="outputName",
        s3_output=sagemaker_mixins.CfnProcessingJobPropsMixin.S3OutputProperty(
            local_path="localPath",
            s3_upload_mode="s3UploadMode",
            s3_uri="s3Uri"
        )
    )]
)
Attributes
- kms_key_id
The AWS Key Management Service ( AWS KMS) key that Amazon SageMaker uses to encrypt the processing job output.
KmsKeyId can be an ID of a KMS key, ARN of a KMS key, or alias of a KMS key. The KmsKeyId is applied to all outputs.
- outputs
An array of outputs configuring the data to upload from the processing container.
ProcessingOutputsObjectProperty
- class CfnProcessingJobPropsMixin.ProcessingOutputsObjectProperty(*, app_managed=None, feature_store_output=None, output_name=None, s3_output=None)
Bases:
object
Describes the results of a processing job.
The processing output must specify exactly one of either S3Output or FeatureStoreOutput types.
- Parameters:
app_managed (Union[bool, IResolvable, None]) – When True, output operations such as data upload are managed natively by the processing job application. When False (default), output operations are managed by Amazon SageMaker.
feature_store_output (Union[IResolvable, FeatureStoreOutputProperty, Dict[str, Any], None]) – Configuration for processing job outputs in Amazon SageMaker Feature Store.
output_name (Optional[str]) – The name for the processing job output.
s3_output (Union[IResolvable, S3OutputProperty, Dict[str, Any], None]) – Configuration for uploading output data to Amazon S3 from the processing container.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

processing_outputs_object_property = sagemaker_mixins.CfnProcessingJobPropsMixin.ProcessingOutputsObjectProperty(
    app_managed=False,
    feature_store_output=sagemaker_mixins.CfnProcessingJobPropsMixin.FeatureStoreOutputProperty(
        feature_group_name="featureGroupName"
    ),
    output_name="outputName",
    s3_output=sagemaker_mixins.CfnProcessingJobPropsMixin.S3OutputProperty(
        local_path="localPath",
        s3_upload_mode="s3UploadMode",
        s3_uri="s3Uri"
    )
)
Attributes
- app_managed
When True, output operations such as data upload are managed natively by the processing job application.
When False (default), output operations are managed by Amazon SageMaker.
- feature_store_output
Configuration for processing job outputs in Amazon SageMaker Feature Store.
- output_name
The name for the processing job output.
- s3_output
Configuration for uploading output data to Amazon S3 from the processing container.
ProcessingResourcesProperty
- class CfnProcessingJobPropsMixin.ProcessingResourcesProperty(*, cluster_config=None)
Bases:
object
Identifies the resources, ML compute instances, and ML storage volumes to deploy for a processing job.
In distributed training, you specify more than one instance.
- Parameters:
cluster_config (Union[IResolvable, ClusterConfigProperty, Dict[str, Any], None]) – The configuration for the resources in a cluster used to run the processing job.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

processing_resources_property = sagemaker_mixins.CfnProcessingJobPropsMixin.ProcessingResourcesProperty(
    cluster_config=sagemaker_mixins.CfnProcessingJobPropsMixin.ClusterConfigProperty(
        instance_count=123,
        instance_type="instanceType",
        volume_kms_key_id="volumeKmsKeyId",
        volume_size_in_gb=123
    )
)
Attributes
- cluster_config
The configuration for the resources in a cluster used to run the processing job.
RedshiftDatasetDefinitionProperty
- class CfnProcessingJobPropsMixin.RedshiftDatasetDefinitionProperty(*, cluster_id=None, cluster_role_arn=None, database=None, db_user=None, kms_key_id=None, output_compression=None, output_format=None, output_s3_uri=None, query_string=None)
Bases:
object
Configuration for Redshift Dataset Definition input.
- Parameters:
cluster_id (Optional[str]) – The Redshift cluster identifier.
cluster_role_arn (Optional[str]) – The IAM role attached to your Redshift cluster that Amazon SageMaker uses to generate datasets.
database (Optional[str]) – The name of the Redshift database used in Redshift query execution.
db_user (Optional[str]) – The database user name used in Redshift query execution.
kms_key_id (Optional[str]) – The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data from a Redshift execution.
output_compression (Optional[str]) – The compression used for Redshift query results.
output_format (Optional[str]) – The data storage format for Redshift query results.
output_s3_uri (Optional[str]) – The location in Amazon S3 where the Redshift query results are stored.
query_string (Optional[str]) – The SQL query statements to be executed.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

redshift_dataset_definition_property = sagemaker_mixins.CfnProcessingJobPropsMixin.RedshiftDatasetDefinitionProperty(
    cluster_id="clusterId",
    cluster_role_arn="clusterRoleArn",
    database="database",
    db_user="dbUser",
    kms_key_id="kmsKeyId",
    output_compression="outputCompression",
    output_format="outputFormat",
    output_s3_uri="outputS3Uri",
    query_string="queryString"
)
Attributes
- cluster_id
The Redshift cluster identifier.
- cluster_role_arn
The IAM role attached to your Redshift cluster that Amazon SageMaker uses to generate datasets.
- database
The name of the Redshift database used in Redshift query execution.
- db_user
The database user name used in Redshift query execution.
- kms_key_id
The AWS Key Management Service ( AWS KMS) key that Amazon SageMaker uses to encrypt data from a Redshift execution.
- output_compression
The compression used for Redshift query results.
- output_format
The data storage format for Redshift query results.
- output_s3_uri
The location in Amazon S3 where the Redshift query results are stored.
- query_string
The SQL query statements to be executed.
S3InputProperty
- class CfnProcessingJobPropsMixin.S3InputProperty(*, local_path=None, s3_compression_type=None, s3_data_distribution_type=None, s3_data_type=None, s3_input_mode=None, s3_uri=None)
Bases:
object
Configuration for downloading input data from Amazon S3 into the processing container.
- Parameters:
local_path (Optional[str]) – The local path in your container where you want Amazon SageMaker to write input data to. LocalPath is an absolute path to the input data and must begin with /opt/ml/processing/. LocalPath is a required parameter when AppManaged is False (default).
s3_compression_type (Optional[str]) – Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. Gzip can only be used when Pipe mode is specified as the S3InputMode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your container without using the EBS volume.
s3_data_distribution_type (Optional[str]) – Whether to distribute the data from Amazon S3 to all processing instances with FullyReplicated, or whether the data from Amazon S3 is sharded by Amazon S3 key, downloading one shard of data to each processing instance.
s3_data_type (Optional[str]) – Whether you use an S3Prefix or a ManifestFile for the data type. If you choose S3Prefix, S3Uri identifies a key name prefix. Amazon SageMaker uses all objects with the specified key name prefix for the processing job. If you choose ManifestFile, S3Uri identifies an object that is a manifest file containing a list of object keys that you want Amazon SageMaker to use for the processing job.
s3_input_mode (Optional[str]) – Whether to use File or Pipe input mode. In File mode, Amazon SageMaker copies the data from the input source onto the local ML storage volume before starting your processing container. This is the most commonly used input mode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your processing container into named pipes without using the ML storage volume.
s3_uri (Optional[str]) – The URI of the Amazon S3 prefix from which Amazon SageMaker downloads the data required to run a processing job.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

s3_input_property = sagemaker_mixins.CfnProcessingJobPropsMixin.S3InputProperty(
    local_path="localPath",
    s3_compression_type="s3CompressionType",
    s3_data_distribution_type="s3DataDistributionType",
    s3_data_type="s3DataType",
    s3_input_mode="s3InputMode",
    s3_uri="s3Uri"
)
Attributes
- local_path
The local path in your container where you want Amazon SageMaker to write input data to.
LocalPath is an absolute path to the input data and must begin with /opt/ml/processing/. LocalPath is a required parameter when AppManaged is False (default).
- s3_compression_type
Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container.
Gzip can only be used when Pipe mode is specified as the S3InputMode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your container without using the EBS volume.
- s3_data_distribution_type
Whether to distribute the data from Amazon S3 to all processing instances with FullyReplicated, or whether the data from Amazon S3 is sharded by Amazon S3 key, downloading one shard of data to each processing instance.
- s3_data_type
Whether you use an S3Prefix or a ManifestFile for the data type.
If you choose S3Prefix, S3Uri identifies a key name prefix. Amazon SageMaker uses all objects with the specified key name prefix for the processing job. If you choose ManifestFile, S3Uri identifies an object that is a manifest file containing a list of object keys that you want Amazon SageMaker to use for the processing job.
- s3_input_mode
Whether to use File or Pipe input mode.
In File mode, Amazon SageMaker copies the data from the input source onto the local ML storage volume before starting your processing container. This is the most commonly used input mode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your processing container into named pipes without using the ML storage volume.
- s3_uri
The URI of the Amazon S3 prefix from which Amazon SageMaker downloads the data required to run a processing job.
S3OutputProperty
- class CfnProcessingJobPropsMixin.S3OutputProperty(*, local_path=None, s3_upload_mode=None, s3_uri=None)
Bases:
object
Configuration for uploading output data to Amazon S3 from the processing container.
- Parameters:
local_path (Optional[str]) – The local path of a directory where you want Amazon SageMaker to upload its contents to Amazon S3. LocalPath is an absolute path to a directory containing output files. This directory will be created by the platform and exist when your container’s entrypoint is invoked.
s3_upload_mode (Optional[str]) – Whether to upload the results of the processing job continuously or after the job completes.
s3_uri (Optional[str]) – The URI of the Amazon S3 prefix from which Amazon SageMaker downloads the data required to run a processing job.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

s3_output_property = sagemaker_mixins.CfnProcessingJobPropsMixin.S3OutputProperty(
    local_path="localPath",
    s3_upload_mode="s3UploadMode",
    s3_uri="s3Uri"
)
Attributes
- local_path
The local path of a directory where you want Amazon SageMaker to upload its contents to Amazon S3.
LocalPath is an absolute path to a directory containing output files. This directory will be created by the platform and exist when your container’s entrypoint is invoked.
- s3_upload_mode
Whether to upload the results of the processing job continuously or after the job completes.
- s3_uri
The URI of the Amazon S3 prefix from which Amazon SageMaker downloads the data required to run a processing job.
StoppingConditionProperty
- class CfnProcessingJobPropsMixin.StoppingConditionProperty(*, max_runtime_in_seconds=None)
Bases:
object
Configures conditions under which the processing job should be stopped, such as how long the processing job has been running.
After the condition is met, the processing job is stopped.
- Parameters:
max_runtime_in_seconds (Union[int, float, None]) – The maximum length of time, in seconds, that a training or compilation job can run before it is stopped. For compilation jobs, if the job does not complete during this time, a TimeOut error is generated. We recommend starting with 900 seconds and increasing as necessary based on your model. For all other jobs, if the job does not complete during this time, SageMaker ends the job. When RetryStrategy is specified in the job request, MaxRuntimeInSeconds specifies the maximum time for all of the attempts in total, not each individual attempt. The default value is 1 day. The maximum value is 28 days. The maximum time that a TrainingJob can run in total, including any time spent publishing metrics or archiving and uploading models after it has been stopped, is 30 days.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

stopping_condition_property = sagemaker_mixins.CfnProcessingJobPropsMixin.StoppingConditionProperty(
    max_runtime_in_seconds=123
)
Attributes
- max_runtime_in_seconds
The maximum length of time, in seconds, that a training or compilation job can run before it is stopped.
For compilation jobs, if the job does not complete during this time, a TimeOut error is generated. We recommend starting with 900 seconds and increasing as necessary based on your model.
For all other jobs, if the job does not complete during this time, SageMaker ends the job. When RetryStrategy is specified in the job request, MaxRuntimeInSeconds specifies the maximum time for all of the attempts in total, not each individual attempt. The default value is 1 day. The maximum value is 28 days.
The maximum time that a TrainingJob can run in total, including any time spent publishing metrics or archiving and uploading models after it has been stopped, is 30 days.
VpcConfigProperty
- class CfnProcessingJobPropsMixin.VpcConfigProperty(*, security_group_ids=None, subnets=None)
Bases:
object
Specifies an Amazon Virtual Private Cloud (VPC) that your SageMaker jobs, hosted models, and compute resources have access to.
You can control access to and from your resources by configuring a VPC. For more information, see Give SageMaker Access to Resources in your Amazon VPC .
- Parameters:
security_group_ids (Optional[Sequence[str]]) – The VPC security group IDs, in the form sg-xxxxxxxx. Specify the security groups for the VPC that is specified in the Subnets field.
subnets (Optional[Sequence[str]]) – The ID of the subnets in the VPC to which you want to connect your training job or model. For information about the availability of specific instance types, see Supported Instance Types and Availability Zones.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk.mixins_preview.aws_sagemaker import mixins as sagemaker_mixins

vpc_config_property = sagemaker_mixins.CfnProcessingJobPropsMixin.VpcConfigProperty(
    security_group_ids=["securityGroupIds"],
    subnets=["subnets"]
)
Attributes
- security_group_ids
The VPC security group IDs, in the form sg-xxxxxxxx.
Specify the security groups for the VPC that is specified in the Subnets field.
- subnets
The ID of the subnets in the VPC to which you want to connect your training job or model.
For information about the availability of specific instance types, see Supported Instance Types and Availability Zones .