Class PromptInferenceConfigurationProps.Builder
java.lang.Object
software.amazon.awscdk.services.bedrock.alpha.PromptInferenceConfigurationProps.Builder
- All Implemented Interfaces:
software.amazon.jsii.Builder<PromptInferenceConfigurationProps>
- Enclosing interface:
PromptInferenceConfigurationProps
@Stability(Experimental)
public static final class PromptInferenceConfigurationProps.Builder
extends Object
implements software.amazon.jsii.Builder<PromptInferenceConfigurationProps>
A builder for PromptInferenceConfigurationProps.
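A minimal usage sketch (the property values below are illustrative examples, not library defaults):

    import java.util.List;
    import software.amazon.awscdk.services.bedrock.alpha.PromptInferenceConfigurationProps;

    // Configure inference settings through the fluent builder. All values
    // shown are illustrative; none are defaults documented on this page.
    PromptInferenceConfigurationProps props =
        new PromptInferenceConfigurationProps.Builder()
            .maxTokens(512)                // cap on tokens in the response
            .stopSequences(List.of("###")) // stop generating at "###"
            .temperature(0.7)              // 0.0-1.0; higher = more random
            .topP(0.9)                     // 0.0-1.0; nucleus-sampling cutoff
            .build();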
Constructor Summary
Constructors
Constructor | Description
Builder()
Method Summary
Modifier and Type | Method | Description
PromptInferenceConfigurationProps | build() | Builds the configured instance.
PromptInferenceConfigurationProps.Builder | maxTokens(Number maxTokens) | Sets the value of PromptInferenceConfigurationProps.getMaxTokens()
PromptInferenceConfigurationProps.Builder | stopSequences(List<String> stopSequences) | Sets the value of PromptInferenceConfigurationProps.getStopSequences()
PromptInferenceConfigurationProps.Builder | temperature(Number temperature) | Sets the value of PromptInferenceConfigurationProps.getTemperature()
PromptInferenceConfigurationProps.Builder | topP(Number topP) | Sets the value of PromptInferenceConfigurationProps.getTopP()
Constructor Details
Builder
public Builder()
Method Details
maxTokens
@Stability(Experimental)
public PromptInferenceConfigurationProps.Builder maxTokens(Number maxTokens)
Sets the value of PromptInferenceConfigurationProps.getMaxTokens()
Parameters:
maxTokens - The maximum number of tokens to return in the response.
Returns:
this
stopSequences
@Stability(Experimental)
public PromptInferenceConfigurationProps.Builder stopSequences(List<String> stopSequences)
Sets the value of PromptInferenceConfigurationProps.getStopSequences()
Parameters:
stopSequences - A list of strings that define sequences after which the model will stop generating.
Returns:
this
temperature
@Stability(Experimental)
public PromptInferenceConfigurationProps.Builder temperature(Number temperature)
Sets the value of PromptInferenceConfigurationProps.getTemperature()
Parameters:
temperature - Controls the randomness of the response. Higher values make the output more random; lower values make it more deterministic. Valid range is 0.0 to 1.0.
Returns:
this
topP
@Stability(Experimental)
public PromptInferenceConfigurationProps.Builder topP(Number topP)
Sets the value of PromptInferenceConfigurationProps.getTopP()
Parameters:
topP - The percentage of most-likely candidates that the model considers for the next token. Valid range is 0.0 to 1.0.
Returns:
this
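Note (a general guideline, not stated on this page): temperature and topP both shape token sampling, and Bedrock model guidance commonly recommends tuning one of them rather than both. A minimal sketch:

    // Sketch: rely on nucleus sampling alone and leave temperature unset.
    PromptInferenceConfigurationProps nucleusOnly =
        new PromptInferenceConfigurationProps.Builder()
            .topP(0.9) // consider only tokens within the top 90% probability mass
            .build();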
build
public PromptInferenceConfigurationProps build()
Builds the configured instance.
Specified by:
build in interface software.amazon.jsii.Builder<PromptInferenceConfigurationProps>
Returns:
a new instance of PromptInferenceConfigurationProps
Throws:
NullPointerException - if any required attribute was not provided
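Because every attribute of this props type is set through an optional fluent method, a build() with nothing configured is expected to succeed; the NullPointerException clause applies only when a required attribute exists and was omitted, and none appear required on this page. A minimal sketch under that assumption:

    // Assumes all attributes of PromptInferenceConfigurationProps are optional,
    // as the setter-only API on this page suggests; build() then succeeds empty.
    PromptInferenceConfigurationProps empty =
        new PromptInferenceConfigurationProps.Builder().build();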