Base inference parameters to pass to a model in a call to Converse or ConverseStream. For more information, see Inference parameters for foundation models.
If you need to pass additional parameters that the model supports, use the additionalModelRequestFields request field in the call to Converse or ConverseStream. For more information, see Model parameters.
Namespace: Amazon.BedrockRuntime.Model
Assembly: AWSSDK.BedrockRuntime.dll
Version: 3.x.y.z
public class InferenceConfiguration
The InferenceConfiguration type exposes the following members.

Constructors

Name | Description
---|---
InferenceConfiguration() |
Properties

Name | Type | Description
---|---|---
MaxTokens | System.Nullable<System.Int32> | Gets and sets the property MaxTokens. The maximum number of tokens to allow in the generated response. The default value is the maximum allowed value for the model that you are using. For more information, see Inference parameters for foundation models.
StopSequences | System.Collections.Generic.List<System.String> | Gets and sets the property StopSequences. A list of stop sequences. A stop sequence is a sequence of characters that causes the model to stop generating the response. Starting with version 4 of the SDK, this property defaults to null. If no data for this property is returned from the service, the property will also be null. This was changed to improve performance and to allow the SDK and caller to distinguish between a property that was not set and a property that was set to an empty value. To retain the previous SDK behavior, set the AWSConfigs.InitializeCollections static property to true.
Temperature | System.Nullable<System.Single> | Gets and sets the property Temperature. The likelihood of the model selecting higher-probability options while generating a response. A lower value makes the model more likely to choose higher-probability options, while a higher value makes the model more likely to choose lower-probability options. The default value is the default value for the model that you are using. For more information, see Inference parameters for foundation models.
TopP | System.Nullable<System.Single> | Gets and sets the property TopP. The percentage of most-likely candidates that the model considers for the next token. For example, if you choose a value of 0.8 for TopP, the model selects from the top 80% of the probability distribution of tokens that could be next in the sequence. The default value is the default value for the model that you are using. For more information, see Inference parameters for foundation models.
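As a sketch of how these properties are typically set together, the following passes an InferenceConfiguration in a Converse call. The model ID, prompt, and parameter values are illustrative placeholders; substitute a model you have access to in your Region.

```csharp
using System;
using System.Collections.Generic;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Minimal sketch: send one user message with explicit inference parameters.
// Requires valid AWS credentials and model access; values below are examples only.
var client = new AmazonBedrockRuntimeClient();

var request = new ConverseRequest
{
    ModelId = "anthropic.claude-3-haiku-20240307-v1:0", // illustrative model ID
    Messages = new List<Message>
    {
        new Message
        {
            Role = ConversationRole.User,
            Content = new List<ContentBlock> { new ContentBlock { Text = "Hello" } }
        }
    },
    InferenceConfig = new InferenceConfiguration
    {
        MaxTokens = 512,     // cap on tokens in the generated response
        Temperature = 0.5F,  // lower values favor higher-probability tokens
        TopP = 0.9F,         // restrict sampling to the top 90% of probability mass
        StopSequences = new List<string> { "END" } // stop generating at this sequence
    }
};

var response = await client.ConverseAsync(request);
Console.WriteLine(response.Output.Message.Content[0].Text);
```

Model-specific parameters that InferenceConfiguration does not cover would go in the request's additionalModelRequestFields, as noted above.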
.NET: Supported in 8.0 and newer, Core 3.1
.NET Standard: Supported in 2.0
.NET Framework: Supported in 4.7.2 and newer