interface InferenceComponentRuntimeConfigProperty
| Language | Type name |
|---|---|
| .NET | Amazon.CDK.Mixins.Preview.AWS.SageMaker.Mixins.CfnInferenceComponentPropsMixin.InferenceComponentRuntimeConfigProperty |
| Go | github.com/aws/aws-cdk-go/awscdkmixinspreview/v2/awssagemaker/mixins#CfnInferenceComponentPropsMixin_InferenceComponentRuntimeConfigProperty |
| Java | software.amazon.awscdk.mixins.preview.services.sagemaker.mixins.CfnInferenceComponentPropsMixin.InferenceComponentRuntimeConfigProperty |
| Python | aws_cdk.mixins_preview.aws_sagemaker.mixins.CfnInferenceComponentPropsMixin.InferenceComponentRuntimeConfigProperty |
| TypeScript | @aws-cdk/mixins-preview » aws_sagemaker » mixins » CfnInferenceComponentPropsMixin » InferenceComponentRuntimeConfigProperty |
Runtime settings for a model that is deployed with an inference component.
Example
```typescript
// The code below shows an example of how to instantiate this type.
// The values are placeholders you should change.
import { mixins as sagemaker_mixins } from '@aws-cdk/mixins-preview/aws-sagemaker';

const inferenceComponentRuntimeConfigProperty: sagemaker_mixins.CfnInferenceComponentPropsMixin.InferenceComponentRuntimeConfigProperty = {
  copyCount: 123,
  currentCopyCount: 123,
  desiredCopyCount: 123,
};
```
Properties
| Name | Type | Description |
|---|---|---|
| copyCount | number | The number of runtime copies of the model container to deploy with the inference component. |
| currentCopyCount | number | The number of runtime copies of the model container that are currently deployed. |
| desiredCopyCount | number | The number of runtime copies of the model container that you requested to deploy with the inference component. |
copyCount?
Type: number (optional)
The number of runtime copies of the model container to deploy with the inference component.
Each copy can serve inference requests.
currentCopyCount?
Type: number (optional)
The number of runtime copies of the model container that are currently deployed.
desiredCopyCount?
Type: number (optional)
The number of runtime copies of the model container that you requested to deploy with the inference component.
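In practice, only `copyCount` is something you set; the current and desired counts are reported back by the service as the deployment converges. The sketch below makes that shape concrete without requiring the preview package: the interface is a hand-written stand-in mirroring the three properties documented above, not an import from `@aws-cdk/mixins-preview`.

```typescript
// Hand-written stand-in for InferenceComponentRuntimeConfigProperty,
// mirroring the properties documented above. The real type lives in
// @aws-cdk/mixins-preview/aws-sagemaker.
interface InferenceComponentRuntimeConfigProperty {
  readonly copyCount?: number;
  readonly currentCopyCount?: number;
  readonly desiredCopyCount?: number;
}

// Request two runtime copies of the model container. currentCopyCount
// and desiredCopyCount are left unset: they are typically populated by
// SageMaker, not supplied by the caller.
const runtimeConfig: InferenceComponentRuntimeConfigProperty = {
  copyCount: 2,
};

console.log(runtimeConfig.copyCount);
```

Because every property is optional, an empty object `{}` is also a valid value; omitting `copyCount` leaves the copy count to the service default.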