Class CfnAgentPropsMixin.InferenceConfigurationProperty.Jsii$Proxy
- All Implemented Interfaces:
CfnAgentPropsMixin.InferenceConfigurationProperty, software.amazon.jsii.JsiiSerializable
- Enclosing interface:
CfnAgentPropsMixin.InferenceConfigurationProperty
Nested Class Summary
Nested classes/interfaces inherited from class software.amazon.jsii.JsiiObject:
software.amazon.jsii.JsiiObject.InitializationMode
Nested classes/interfaces inherited from interface software.amazon.awscdk.cfnpropertymixins.services.bedrock.CfnAgentPropsMixin.InferenceConfigurationProperty:
CfnAgentPropsMixin.InferenceConfigurationProperty.Builder, CfnAgentPropsMixin.InferenceConfigurationProperty.Jsii$Proxy
Constructor Summary
Constructors:
- protected Jsii$Proxy(CfnAgentPropsMixin.InferenceConfigurationProperty.Builder builder)
  Constructor that initializes the object based on literal property values passed by the CfnAgentPropsMixin.InferenceConfigurationProperty.Builder.
- protected Jsii$Proxy(software.amazon.jsii.JsiiObjectRef objRef)
  Constructor that initializes the object based on values retrieved from the JsiiObject.
Method Summary
- com.fasterxml.jackson.databind.JsonNode $jsii$toJson()
- final boolean equals(Object o)
- final Number getMaximumLength(): The maximum number of tokens allowed in the generated response.
- final java.util.List<java.lang.String> getStopSequences(): A list of stop sequences.
- final Number getTemperature(): The likelihood of the model selecting higher-probability options while generating a response.
- final Number getTopK(): While generating a response, the model determines the probability of the following token at each point of generation.
- final Number getTopP(): The percentage of most-likely candidates that the model considers for the next token.
- final int hashCode()
Methods inherited from class software.amazon.jsii.JsiiObject:
jsiiAsyncCall, jsiiAsyncCall, jsiiCall, jsiiCall, jsiiGet, jsiiGet, jsiiSet, jsiiStaticCall, jsiiStaticCall, jsiiStaticGet, jsiiStaticGet, jsiiStaticSet, jsiiStaticSet
-
Constructor Details
-
Jsii$Proxy
protected Jsii$Proxy(software.amazon.jsii.JsiiObjectRef objRef)
Constructor that initializes the object based on values retrieved from the JsiiObject.
- Parameters:
  objRef - Reference to the JSII managed object.
-
Jsii$Proxy
protected Jsii$Proxy(CfnAgentPropsMixin.InferenceConfigurationProperty.Builder builder)
Constructor that initializes the object based on literal property values passed by the CfnAgentPropsMixin.InferenceConfigurationProperty.Builder.
-
-
Method Details
-
getMaximumLength
Description copied from interface: CfnAgentPropsMixin.InferenceConfigurationProperty
The maximum number of tokens allowed in the generated response.
- Specified by:
  getMaximumLength in interface CfnAgentPropsMixin.InferenceConfigurationProperty
- See Also:
-
getStopSequences
Description copied from interface: CfnAgentPropsMixin.InferenceConfigurationProperty
A list of stop sequences. A stop sequence is a sequence of characters that causes the model to stop generating the response.
- Specified by:
  getStopSequences in interface CfnAgentPropsMixin.InferenceConfigurationProperty
- See Also:
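The stop-sequence behavior described above happens inside the Bedrock service; this property only carries the configured values. As a rough, plain-Java illustration (the class and method names here are hypothetical, not part of the CDK API), a generator honoring stop sequences cuts its output at the first occurrence of any configured sequence:

```java
// Illustrative sketch only: truncate generated text at the earliest
// occurrence of any configured stop sequence.
class StopSequenceSketch {
    public static String truncateAtStop(String text, java.util.List<String> stopSequences) {
        int cut = text.length();
        for (String stop : stopSequences) {
            int idx = text.indexOf(stop);
            if (idx >= 0 && idx < cut) {
                cut = idx;  // earliest stop sequence wins
            }
        }
        return text.substring(0, cut);
    }
}
```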
-
getTemperature
Description copied from interface: CfnAgentPropsMixin.InferenceConfigurationProperty
The likelihood of the model selecting higher-probability options while generating a response. A lower value makes the model more likely to choose higher-probability options, while a higher value makes the model more likely to choose lower-probability options.
The default value is the default value for the model that you are using. For more information, see Inference parameters for foundation models.
- Specified by:
  getTemperature in interface CfnAgentPropsMixin.InferenceConfigurationProperty
- See Also:
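Temperature is applied by the model service, not by this class, but its effect can be sketched in plain Java (illustrative only; names and logits below are hypothetical): dividing the model's logits by the temperature before the softmax sharpens the distribution at low values and flattens it at high values.

```java
import java.util.Arrays;

// Illustrative sketch only: how a temperature setting reshapes
// next-token probabilities. Lower temperature concentrates
// probability mass on the highest-scoring token.
class TemperatureSketch {
    public static double[] softmax(double[] logits, double temperature) {
        double[] probs = new double[logits.length];
        double max = Arrays.stream(logits).max().orElse(0.0);
        double sum = 0.0;
        for (int i = 0; i < logits.length; i++) {
            // Subtract max for numerical stability, scale by temperature.
            probs[i] = Math.exp((logits[i] - max) / temperature);
            sum += probs[i];
        }
        for (int i = 0; i < probs.length; i++) {
            probs[i] /= sum;  // normalize to a probability distribution
        }
        return probs;
    }
}
```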
-
getTopK
Description copied from interface: CfnAgentPropsMixin.InferenceConfigurationProperty
While generating a response, the model determines the probability of the following token at each point of generation. The value that you set for topK is the number of most-likely candidates from which the model chooses the next token in the sequence. For example, if you set topK to 50, the model selects the next token from among the top 50 most likely choices.
- Specified by:
  getTopK in interface CfnAgentPropsMixin.InferenceConfigurationProperty
- See Also:
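The topK filtering described above can be sketched in plain Java (illustrative only; the sampling itself runs in the Bedrock service, and the names here are hypothetical): keep only the k highest-probability candidates before choosing the next token.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Illustrative sketch only: restrict the candidate pool for the
// next token to the k most likely entries.
class TopKSketch {
    // Returns the indices of the k highest-probability entries,
    // ordered from most to least likely.
    public static List<Integer> topK(double[] probs, int k) {
        List<Integer> indices = new ArrayList<>();
        for (int i = 0; i < probs.length; i++) {
            indices.add(i);
        }
        indices.sort(Comparator.comparingDouble((Integer i) -> probs[i]).reversed());
        return indices.subList(0, Math.min(k, indices.size()));
    }
}
```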
-
getTopP
Description copied from interface: CfnAgentPropsMixin.InferenceConfigurationProperty
The percentage of most-likely candidates that the model considers for the next token. For example, if you choose a value of 0.8 for topP, the model selects from the top 80% of the probability distribution of tokens that could be next in the sequence.
The default value is the default value for the model that you are using. For more information, see Inference parameters for foundation models.
- Specified by:
  getTopP in interface CfnAgentPropsMixin.InferenceConfigurationProperty
- See Also:
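The topP (nucleus) filtering described above can likewise be sketched in plain Java (illustrative only; hypothetical names, not CDK or Bedrock API): keep the smallest set of most-likely tokens whose cumulative probability reaches topP.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Illustrative sketch only: nucleus filtering keeps candidates in
// descending probability order until their cumulative probability
// reaches the topP threshold.
class TopPSketch {
    public static List<Integer> nucleus(double[] probs, double topP) {
        List<Integer> indices = new ArrayList<>();
        for (int i = 0; i < probs.length; i++) {
            indices.add(i);
        }
        indices.sort(Comparator.comparingDouble((Integer i) -> probs[i]).reversed());
        List<Integer> kept = new ArrayList<>();
        double cumulative = 0.0;
        for (int i : indices) {
            kept.add(i);
            cumulative += probs[i];
            if (cumulative >= topP) {
                break;  // threshold reached; stop adding candidates
            }
        }
        return kept;
    }
}
```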
-
$jsii$toJson
@Internal
public com.fasterxml.jackson.databind.JsonNode $jsii$toJson()
- Specified by:
  $jsii$toJson in interface software.amazon.jsii.JsiiSerializable
-
equals
public final boolean equals(Object o)
-
hashCode
public final int hashCode()
-