interface PromptVariantProperty
| Language | Type name |
|---|---|
| .NET | Amazon.CDK.Mixins.Preview.AWS.Bedrock.Mixins.CfnPromptPropsMixin.PromptVariantProperty |
| Go | github.com/aws/aws-cdk-go/awscdkmixinspreview/v2/awsbedrock/mixins#CfnPromptPropsMixin_PromptVariantProperty |
| Java | software.amazon.awscdk.mixins.preview.services.bedrock.mixins.CfnPromptPropsMixin.PromptVariantProperty |
| Python | aws_cdk.mixins_preview.aws_bedrock.mixins.CfnPromptPropsMixin.PromptVariantProperty |
| TypeScript | @aws-cdk/mixins-preview » aws_bedrock » mixins » CfnPromptPropsMixin » PromptVariantProperty |
Contains details about a variant of the prompt.
Example
```ts
// The code below shows an example of how to instantiate this type.
// The values are placeholders you should change.
import { mixins as bedrock_mixins } from '@aws-cdk/mixins-preview/aws-bedrock';

declare const additionalModelRequestFields: any;
declare const any: any;
declare const auto: any;
declare const json: any;

const promptVariantProperty: bedrock_mixins.CfnPromptPropsMixin.PromptVariantProperty = {
  additionalModelRequestFields: additionalModelRequestFields,
  genAiResource: {
    agent: {
      agentIdentifier: 'agentIdentifier',
    },
  },
  inferenceConfiguration: {
    text: {
      maxTokens: 123,
      stopSequences: ['stopSequences'],
      temperature: 123,
      topP: 123,
    },
  },
  metadata: [{
    key: 'key',
    value: 'value',
  }],
  modelId: 'modelId',
  name: 'name',
  templateConfiguration: {
    chat: {
      inputVariables: [{
        name: 'name',
      }],
      messages: [{
        content: [{
          cachePoint: {
            type: 'type',
          },
          text: 'text',
        }],
        role: 'role',
      }],
      system: [{
        cachePoint: {
          type: 'type',
        },
        text: 'text',
      }],
      toolConfiguration: {
        toolChoice: {
          any: any,
          auto: auto,
          tool: {
            name: 'name',
          },
        },
        tools: [{
          cachePoint: {
            type: 'type',
          },
          toolSpec: {
            description: 'description',
            inputSchema: {
              json: json,
            },
            name: 'name',
          },
        }],
      },
    },
    text: {
      cachePoint: {
        type: 'type',
      },
      inputVariables: [{
        name: 'name',
      }],
      text: 'text',
      textS3Location: {
        bucket: 'bucket',
        key: 'key',
        version: 'version',
      },
    },
  },
  templateType: 'templateType',
};
```
Properties
| Name | Type | Description |
|---|---|---|
| additionalModelRequestFields? | any | Contains model-specific inference configurations that aren't in the inferenceConfiguration field. |
| genAiResource? | IResolvable \| Prompt | Specifies a generative AI resource with which to use the prompt. |
| inferenceConfiguration? | IResolvable \| Prompt | Contains inference configurations for the prompt variant. |
| metadata? | IResolvable \| (IResolvable \| Prompt)[] | An array of objects, each containing a key-value pair that defines a metadata tag and value to attach to a prompt variant. |
| modelId? | string | The unique identifier of the model or inference profile with which to run inference on the prompt. |
| name? | string | The name of the prompt variant. |
| templateConfiguration? | IResolvable \| Prompt | Contains configurations for the prompt template. |
| templateType? | string | The type of prompt template to use. |
additionalModelRequestFields?
Type: any (optional)
Contains model-specific inference configurations that aren't in the inferenceConfiguration field.
To see model-specific inference parameters, see Inference request parameters and response fields for foundation models.
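As a sketch of how this field might be used, the object below passes a model-specific parameter alongside the standard inference settings. The `top_k` parameter is an Anthropic-style example chosen purely for illustration; consult your model's documentation for its actual request fields.

```typescript
// Hypothetical sketch: model-specific parameters not covered by
// inferenceConfiguration. top_k is used here purely as an illustration.
const additionalModelRequestFields: Record<string, any> = {
  top_k: 250,
};
```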
genAiResource?
Type: IResolvable | Prompt (optional)
Specifies a generative AI resource with which to use the prompt.
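A minimal sketch of this property, pointing the variant at a Bedrock agent. The agent-alias ARN shown is a made-up placeholder, not a real resource.

```typescript
// Sketch: a genAiResource referring to an existing Bedrock agent.
// The ARN below is a placeholder; substitute your own agent-alias ARN.
const genAiResource = {
  agent: {
    agentIdentifier:
      'arn:aws:bedrock:us-east-1:111122223333:agent-alias/AGENTID123/ALIASID456',
  },
};
```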
inferenceConfiguration?
Type: IResolvable | Prompt (optional)
Contains inference configurations for the prompt variant.
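A minimal text inference configuration sketch; the specific values are illustrative placeholders, not recommendations.

```typescript
// Sketch of a text inference configuration for a prompt variant.
const inferenceConfiguration = {
  text: {
    maxTokens: 512,          // upper bound on generated tokens
    temperature: 0.7,        // sampling randomness
    topP: 0.9,               // nucleus sampling cutoff
    stopSequences: ['###'],  // generation stops if this string appears
  },
};
```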
metadata?
Type: IResolvable | (IResolvable | Prompt)[] (optional)
An array of objects, each containing a key-value pair that defines a metadata tag and value to attach to a prompt variant.
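For example, a variant might carry tags like the following; the keys and values here are arbitrary illustrations.

```typescript
// Sketch: tagging a prompt variant with key-value metadata.
const metadata = [
  { key: 'team', value: 'search-experience' },
  { key: 'stage', value: 'experiment' },
];
```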
modelId?
Type: string (optional)
The unique identifier of the model or inference profile with which to run inference on the prompt.
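The value may identify either a foundation model or an inference profile. Both examples below are illustrative and are not guaranteed to exist in your account or Region.

```typescript
// Sketch: two common shapes for modelId (illustrative values only).
const foundationModelId = 'anthropic.claude-3-haiku-20240307-v1:0';
const inferenceProfileArn =
  'arn:aws:bedrock:us-east-1:111122223333:application-inference-profile/EXAMPLEID';
```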
name?
Type: string (optional)
The name of the prompt variant.
templateConfiguration?
Type: IResolvable | Prompt (optional)
Contains configurations for the prompt template.
templateType?
Type: string (optional)
The type of prompt template to use.
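Putting the properties above together, a minimal plain-text variant might look like the sketch below. Using 'TEXT' as the templateType value is an assumption based on the underlying AWS::Bedrock::Prompt CloudFormation resource; verify the allowed values against the current AWS documentation.

```typescript
// Sketch: a minimal prompt variant with a plain-text template.
// The model ID and templateType value are assumptions, not guarantees.
const minimalVariant = {
  name: 'summarize-v1',
  modelId: 'anthropic.claude-3-haiku-20240307-v1:0', // illustrative model ID
  templateType: 'TEXT',
  templateConfiguration: {
    text: {
      text: 'Summarize the following text:\n\n{{document}}',
      inputVariables: [{ name: 'document' }], // referenced as {{document}} above
    },
  },
};
```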
