Class CfnFlowVersion.PromptModelInferenceConfigurationProperty
Contains inference configurations related to model inference for a prompt.
Namespace: Amazon.CDK.AWS.Bedrock
Assembly: Amazon.CDK.Lib.dll
Syntax (csharp)
public class CfnFlowVersion.PromptModelInferenceConfigurationProperty : CfnFlowVersion.IPromptModelInferenceConfigurationProperty
Syntax (vb)
Public Class CfnFlowVersion.PromptModelInferenceConfigurationProperty Implements CfnFlowVersion.IPromptModelInferenceConfigurationProperty
Remarks
For more information, see Inference parameters.
ExampleMetadata: fixture=_generated
Examples
// The code below shows an example of how to instantiate this type.
// The values are illustrative placeholders you should change.
using Amazon.CDK.AWS.Bedrock;
var promptModelInferenceConfigurationProperty = new PromptModelInferenceConfigurationProperty {
MaxTokens = 512, // upper bound on tokens generated in the response
StopSequences = new [] { "END" }, // example stop sequence
Temperature = 0.7, // 0–1; lower values give more predictable output
TopP = 0.9 // 0–1; nucleus-sampling cutoff
};
Synopsis
Constructors
| PromptModelInferenceConfigurationProperty() | Contains inference configurations related to model inference for a prompt. |
Properties
| MaxTokens | The maximum number of tokens to return in the response. |
| StopSequences | A list of strings that define sequences after which the model will stop generating. |
| Temperature | Controls the randomness of the response. |
| TopP | The percentage of most-likely candidates that the model considers for the next token. |
Constructors
PromptModelInferenceConfigurationProperty()
Contains inference configurations related to model inference for a prompt.
public PromptModelInferenceConfigurationProperty()
Remarks
For more information, see Inference parameters.
Properties
MaxTokens
The maximum number of tokens to return in the response.
public double? MaxTokens { get; set; }
Property Value
double?
StopSequences
A list of strings that define sequences after which the model will stop generating.
public string[]? StopSequences { get; set; }
Property Value
string[]
Temperature
Controls the randomness of the response.
public double? Temperature { get; set; }
Property Value
double?
Remarks
Choose a lower value for more predictable outputs and a higher value for more surprising outputs.
TopP
The percentage of most-likely candidates that the model considers for the next token.
public double? TopP { get; set; }
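Taken together, the four properties are typically tuned as a group. The sketch below shows a low-randomness configuration; it assumes the same Amazon.CDK.AWS.Bedrock namespace as the examples above, and the specific values (512 tokens, temperature 0.2, and so on) are illustrative assumptions, not recommended defaults.

```csharp
using Amazon.CDK.AWS.Bedrock;

// A sketch of a configuration biased toward predictable outputs.
// All values are illustrative assumptions; tune them for your model.
var deterministicConfig = new CfnFlowVersion.PromptModelInferenceConfigurationProperty {
    MaxTokens = 512,                  // cap the response length
    StopSequences = new [] { "END" }, // stop generating when "END" is emitted
    Temperature = 0.2,                // low temperature: more predictable output
    TopP = 0.9                        // sample only from the top 90% probability mass
};
```

Because Temperature and TopP both constrain token sampling, a common practice is to adjust one while leaving the other at its default rather than tuning both at once.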