TextAIPromptInferenceConfiguration
Inference configuration for text-based AI Prompts.
Contents
- maxTokensToSample
The maximum number of tokens to generate in the response.
Type: Integer
Valid Range: Minimum value of 0. Maximum value of 4096.
Required: No
- temperature
The temperature setting for controlling randomness in the generated response.
Type: Float
Valid Range: Minimum value of 0. Maximum value of 1.
Required: No
- topK
The top-K sampling parameter: the number of highest-probability tokens the model considers at each step of token selection.
Type: Integer
Valid Range: Minimum value of 0. Maximum value of 200.
Required: No
- topP
The top-P sampling parameter: the cumulative probability cutoff used for nucleus sampling.
Type: Float
Valid Range: Minimum value of 0. Maximum value of 1.
Required: No
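As a sketch of how the documented fields and ranges fit together, the configuration could be modeled and validated as follows. The class and helper names here are illustrative only, not part of any AWS SDK; the range checks mirror the valid ranges listed above, and all fields are optional as documented.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class TextAIPromptInferenceConfiguration:
    """Illustrative model of the documented shape; all fields are optional."""
    maxTokensToSample: Optional[int] = None   # Valid range: 0 to 4096
    temperature: Optional[float] = None       # Valid range: 0 to 1
    topK: Optional[int] = None                # Valid range: 0 to 200
    topP: Optional[float] = None              # Valid range: 0 to 1

    def __post_init__(self) -> None:
        # Enforce the documented valid ranges for any field that is set.
        if self.maxTokensToSample is not None and not 0 <= self.maxTokensToSample <= 4096:
            raise ValueError("maxTokensToSample must be between 0 and 4096")
        if self.temperature is not None and not 0.0 <= self.temperature <= 1.0:
            raise ValueError("temperature must be between 0 and 1")
        if self.topK is not None and not 0 <= self.topK <= 200:
            raise ValueError("topK must be between 0 and 200")
        if self.topP is not None and not 0.0 <= self.topP <= 1.0:
            raise ValueError("topP must be between 0 and 1")

    def to_request_dict(self) -> dict:
        # Omit unset fields, since every field is optional in the API shape.
        return {k: v for k, v in self.__dict__.items() if v is not None}


# Example: a configuration that sets only two of the four optional fields.
cfg = TextAIPromptInferenceConfiguration(maxTokensToSample=1024, temperature=0.3)
```

Calling `cfg.to_request_dict()` would then yield `{"maxTokensToSample": 1024, "temperature": 0.3}`, leaving the unset `topK` and `topP` fields out of the request body.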
See Also
For more information about using this API in one of the language-specific AWS SDKs, see the following: