Class CfnInferenceComponent.InferenceComponentSpecificationProperty.Jsii$Proxy
- All Implemented Interfaces:
CfnInferenceComponent.InferenceComponentSpecificationProperty, software.amazon.jsii.JsiiSerializable
- Enclosing interface:
CfnInferenceComponent.InferenceComponentSpecificationProperty
Nested Class Summary
Nested classes/interfaces inherited from class software.amazon.jsii.JsiiObject
software.amazon.jsii.JsiiObject.InitializationMode
Nested classes/interfaces inherited from interface software.amazon.awscdk.services.sagemaker.CfnInferenceComponent.InferenceComponentSpecificationProperty
CfnInferenceComponent.InferenceComponentSpecificationProperty.Builder, CfnInferenceComponent.InferenceComponentSpecificationProperty.Jsii$Proxy
-
Constructor Summary
Constructors
protected Jsii$Proxy(CfnInferenceComponent.InferenceComponentSpecificationProperty.Builder builder) - Constructor that initializes the object based on literal property values passed by the CfnInferenceComponent.InferenceComponentSpecificationProperty.Builder.
protected Jsii$Proxy(software.amazon.jsii.JsiiObjectRef objRef) - Constructor that initializes the object based on values retrieved from the JsiiObject.
-
Method Summary
com.fasterxml.jackson.databind.JsonNode $jsii$toJson()
final boolean equals(Object o)
final String getBaseInferenceComponentName() - The name of an existing inference component that is to contain the inference component that you're creating with your request.
final Object getComputeResourceRequirements() - The compute resources allocated to run the model, plus any adapter models, that you assign to the inference component.
final Object getContainer() - Defines a container that provides the runtime environment for a model that you deploy with an inference component.
final String getModelName() - The name of an existing SageMaker AI model object in your account that you want to deploy with the inference component.
final Object getStartupParameters() - Settings that take effect while the model container starts up.
final int hashCode()
Methods inherited from class software.amazon.jsii.JsiiObject
jsiiAsyncCall, jsiiAsyncCall, jsiiCall, jsiiCall, jsiiGet, jsiiGet, jsiiSet, jsiiStaticCall, jsiiStaticCall, jsiiStaticGet, jsiiStaticGet, jsiiStaticSet, jsiiStaticSet
-
Constructor Details
-
Jsii$Proxy
protected Jsii$Proxy(software.amazon.jsii.JsiiObjectRef objRef)
Constructor that initializes the object based on values retrieved from the JsiiObject.
- Parameters:
objRef - Reference to the JSII managed object.
-
Jsii$Proxy
protected Jsii$Proxy(CfnInferenceComponent.InferenceComponentSpecificationProperty.Builder builder)
Constructor that initializes the object based on literal property values passed by the CfnInferenceComponent.InferenceComponentSpecificationProperty.Builder.
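In practice this protected constructor is not called directly; property instances are assembled through the nested Builder. A minimal sketch, assuming a hypothetical model name and compute sizing (the compute requirement field names are assumed to mirror the underlying CloudFormation properties):

import software.amazon.awscdk.services.sagemaker.CfnInferenceComponent;

// Sketch only: builds the specification via the generated Builder rather than
// the protected Jsii$Proxy constructor.
CfnInferenceComponent.InferenceComponentSpecificationProperty spec =
    new CfnInferenceComponent.InferenceComponentSpecificationProperty.Builder()
        .modelName("my-sagemaker-model") // assumed: name of an existing SageMaker AI model
        .computeResourceRequirements(
            new CfnInferenceComponent.InferenceComponentComputeResourceRequirementsProperty.Builder()
                .numberOfCpuCoresRequired(2)  // assumed sizing
                .minMemoryRequiredInMb(2048)  // assumed sizing
                .build())
        .build();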
-
-
Method Details
-
getBaseInferenceComponentName
public final String getBaseInferenceComponentName()
Description copied from interface: CfnInferenceComponent.InferenceComponentSpecificationProperty
The name of an existing inference component that is to contain the inference component that you're creating with your request.
Specify this parameter only if your request is meant to create an adapter inference component. An adapter inference component contains the path to an adapter model. The purpose of the adapter model is to tailor the inference output of a base foundation model, which is hosted by the base inference component. The adapter inference component uses the compute resources that you assigned to the base inference component.
When you create an adapter inference component, use the Container parameter to specify the location of the adapter artifacts. In the parameter value, use the ArtifactUrl parameter of the InferenceComponentContainerSpecification data type.
Before you can create an adapter inference component, you must have an existing inference component that contains the foundation model that you want to adapt.
- Specified by:
getBaseInferenceComponentName in interface CfnInferenceComponent.InferenceComponentSpecificationProperty
- See Also:
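As a hedged illustration of the adapter scenario described above, an adapter specification might name the base component and point the container at the adapter artifacts; the component name and S3 URL below are placeholders:

import software.amazon.awscdk.services.sagemaker.CfnInferenceComponent;

// Sketch only: an adapter specification reuses the base component's compute,
// so computeResourceRequirements is intentionally omitted.
CfnInferenceComponent.InferenceComponentSpecificationProperty adapterSpec =
    new CfnInferenceComponent.InferenceComponentSpecificationProperty.Builder()
        .baseInferenceComponentName("base-foundation-component") // assumed existing base component
        .container(
            new CfnInferenceComponent.InferenceComponentContainerSpecificationProperty.Builder()
                .artifactUrl("s3://my-bucket/adapters/adapter.tar.gz") // assumed adapter artifact location
                .build())
        .build();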
-
getComputeResourceRequirements
public final Object getComputeResourceRequirements()
Description copied from interface: CfnInferenceComponent.InferenceComponentSpecificationProperty
The compute resources allocated to run the model, plus any adapter models, that you assign to the inference component.
Omit this parameter if your request is meant to create an adapter inference component. An adapter inference component is loaded by a base inference component, and it uses the compute resources of the base inference component.
Returns union: either IResolvable or CfnInferenceComponent.InferenceComponentComputeResourceRequirementsProperty
- Specified by:
getComputeResourceRequirements in interface CfnInferenceComponent.InferenceComponentSpecificationProperty
- See Also:
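Because the getter is union-typed, reading the value back usually involves a type check. A sketch, assuming spec was built with a concrete property (as in the Builder example above) and that the nested getter name mirrors the CloudFormation property:

// Sketch only: the value may be an unresolved IResolvable token or the
// concrete compute resource requirements property.
Object raw = spec.getComputeResourceRequirements();
if (raw instanceof CfnInferenceComponent.InferenceComponentComputeResourceRequirementsProperty) {
    CfnInferenceComponent.InferenceComponentComputeResourceRequirementsProperty reqs =
        (CfnInferenceComponent.InferenceComponentComputeResourceRequirementsProperty) raw;
    System.out.println(reqs.getMinMemoryRequiredInMb()); // assumed getter name
} else if (raw instanceof software.amazon.awscdk.IResolvable) {
    // Still a CDK token; it resolves during synthesis.
}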
-
getContainer
public final Object getContainer()
Description copied from interface: CfnInferenceComponent.InferenceComponentSpecificationProperty
Defines a container that provides the runtime environment for a model that you deploy with an inference component.
Returns union: either IResolvable or CfnInferenceComponent.InferenceComponentContainerSpecificationProperty
- Specified by:
getContainer in interface CfnInferenceComponent.InferenceComponentSpecificationProperty
- See Also:
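For a non-adapter component, the container value would typically be built from the same nested Builder; the image URI and environment entries below are placeholders, and the builder method names are assumed to mirror the CloudFormation Image and Environment properties:

import software.amazon.awscdk.services.sagemaker.CfnInferenceComponent;

// Sketch only: container specification for a component that hosts the model image directly.
CfnInferenceComponent.InferenceComponentContainerSpecificationProperty container =
    new CfnInferenceComponent.InferenceComponentContainerSpecificationProperty.Builder()
        .image("123456789012.dkr.ecr.us-east-1.amazonaws.com/my-image:latest") // assumed ECR image URI
        .environment(java.util.Map.of("LOG_LEVEL", "info"))                    // assumed environment variables
        .build();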
-
getModelName
public final String getModelName()
Description copied from interface: CfnInferenceComponent.InferenceComponentSpecificationProperty
The name of an existing SageMaker AI model object in your account that you want to deploy with the inference component.
- Specified by:
getModelName in interface CfnInferenceComponent.InferenceComponentSpecificationProperty
- See Also:
-
getStartupParameters
public final Object getStartupParameters()
Description copied from interface: CfnInferenceComponent.InferenceComponentSpecificationProperty
Settings that take effect while the model container starts up.
Returns union: either IResolvable or CfnInferenceComponent.InferenceComponentStartupParametersProperty
- Specified by:
getStartupParameters in interface CfnInferenceComponent.InferenceComponentSpecificationProperty
- See Also:
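A short sketch of how the startup settings might be supplied; the two timeout fields are assumptions that mirror the underlying CloudFormation properties of InferenceComponentStartupParameters:

import software.amazon.awscdk.services.sagemaker.CfnInferenceComponent;

// Sketch only: timeouts applied while the model container starts up.
CfnInferenceComponent.InferenceComponentStartupParametersProperty startup =
    new CfnInferenceComponent.InferenceComponentStartupParametersProperty.Builder()
        .containerStartupHealthCheckTimeoutInSeconds(300) // assumed field name, in seconds
        .modelDataDownloadTimeoutInSeconds(600)           // assumed field name, in seconds
        .build();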
-
$jsii$toJson
@Internal public com.fasterxml.jackson.databind.JsonNode $jsii$toJson()
- Specified by:
$jsii$toJson in interface software.amazon.jsii.JsiiSerializable
-
equals
public final boolean equals(Object o)
-
hashCode
public final int hashCode()
-