Class CfnPrompt.PromptModelInferenceConfigurationProperty.Builder
java.lang.Object
software.amazon.awscdk.services.bedrock.CfnPrompt.PromptModelInferenceConfigurationProperty.Builder
- All Implemented Interfaces:
software.amazon.jsii.Builder<CfnPrompt.PromptModelInferenceConfigurationProperty>
- Enclosing interface:
CfnPrompt.PromptModelInferenceConfigurationProperty
@Stability(Stable)
public static final class CfnPrompt.PromptModelInferenceConfigurationProperty.Builder
extends Object
implements software.amazon.jsii.Builder<CfnPrompt.PromptModelInferenceConfigurationProperty>
A builder for CfnPrompt.PromptModelInferenceConfigurationProperty
Constructor Summary
Constructors:
Builder()
Method Summary
build() - Builds the configured instance.
maxTokens(Number maxTokens) - Sets the value of CfnPrompt.PromptModelInferenceConfigurationProperty.getMaxTokens()
stopSequences(List<String> stopSequences) - Sets the value of CfnPrompt.PromptModelInferenceConfigurationProperty.getStopSequences()
temperature(Number temperature) - Sets the value of CfnPrompt.PromptModelInferenceConfigurationProperty.getTemperature()
topP(Number topP) - Sets the value of CfnPrompt.PromptModelInferenceConfigurationProperty.getTopP()
Constructor Details
Builder
public Builder()
Method Details
maxTokens
@Stability(Stable)
public CfnPrompt.PromptModelInferenceConfigurationProperty.Builder maxTokens(Number maxTokens)
Sets the value of CfnPrompt.PromptModelInferenceConfigurationProperty.getMaxTokens()
Parameters:
maxTokens - The maximum number of tokens to return in the response.
Returns:
this
stopSequences
@Stability(Stable)
public CfnPrompt.PromptModelInferenceConfigurationProperty.Builder stopSequences(List<String> stopSequences)
Sets the value of CfnPrompt.PromptModelInferenceConfigurationProperty.getStopSequences()
Parameters:
stopSequences - A list of strings that define sequences after which the model will stop generating.
Returns:
this
temperature
@Stability(Stable)
public CfnPrompt.PromptModelInferenceConfigurationProperty.Builder temperature(Number temperature)
Sets the value of CfnPrompt.PromptModelInferenceConfigurationProperty.getTemperature()
Parameters:
temperature - Controls the randomness of the response. Choose a lower value for more predictable outputs and a higher value for more surprising outputs.
Returns:
this
topP
@Stability(Stable)
public CfnPrompt.PromptModelInferenceConfigurationProperty.Builder topP(Number topP)
Sets the value of CfnPrompt.PromptModelInferenceConfigurationProperty.getTopP()
Parameters:
topP - The percentage of most-likely candidates that the model considers for the next token.
Returns:
this
build
@Stability(Stable)
public CfnPrompt.PromptModelInferenceConfigurationProperty build()
Builds the configured instance.
Specified by:
build in interface software.amazon.jsii.Builder<CfnPrompt.PromptModelInferenceConfigurationProperty>
Returns:
a new instance of CfnPrompt.PromptModelInferenceConfigurationProperty
Throws:
NullPointerException - if any required attribute was not provided
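The setters above are typically chained into a single expression. A minimal sketch of constructing this property (the parameter values below are illustrative, not defaults):

```java
import java.util.Arrays;

import software.amazon.awscdk.services.bedrock.CfnPrompt;

public class InferenceConfigExample {
    public static void main(String[] args) {
        // All four attributes of this property are optional;
        // set only the ones you need before calling build().
        CfnPrompt.PromptModelInferenceConfigurationProperty config =
            new CfnPrompt.PromptModelInferenceConfigurationProperty.Builder()
                .maxTokens(512)                              // cap on response length
                .stopSequences(Arrays.asList("\n\nHuman:"))  // stop generating at this sequence
                .temperature(0.7)                            // moderate randomness
                .topP(0.9)                                   // nucleus-sampling cutoff
                .build();

        System.out.println(config.getMaxTokens()); // prints 512
    }
}
```

The resulting object is an immutable value instance of CfnPrompt.PromptModelInferenceConfigurationProperty; its getters return the values set on the builder.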