ChatPromptVariantProps
- class aws_cdk.aws_bedrock_alpha.ChatPromptVariantProps(*, model, variant_name, prompt_variables=None, messages, inference_configuration=None, system=None, tool_configuration=None)
Bases:
CommonPromptVariantProps
(experimental) Properties for creating a chat prompt variant.
- Parameters:
model (IBedrockInvokable) – (experimental) The model used to run the prompt. The model can be a foundation model, a custom model, or a provisioned model.
variant_name (str) – (experimental) The name of the prompt variant.
prompt_variables (Optional[Sequence[str]]) – (experimental) The variables in the prompt template that can be filled in at runtime. Default: - No variables defined.
messages (Sequence[ChatMessage]) – (experimental) The messages in the chat prompt. Must include at least one User message. The messages should alternate between User and Assistant.
inference_configuration (Optional[PromptInferenceConfiguration]) – (experimental) Inference configuration for the chat prompt. Default: - No inference configuration provided.
system (Optional[str]) – (experimental) Context or instructions for the model to consider before generating a response. Default: - No system message provided.
tool_configuration (Union[ToolConfiguration, Dict[str, Any], None]) – (experimental) The configuration of tools available to the model and how the model must use them. Default: - No tool configuration provided.
- Stability:
experimental
- ExampleMetadata:
fixture=default infused
Example:
cmk = kms.Key(self, "cmk")
variant_chat = bedrock.PromptVariant.chat(
    variant_name="variant1",
    model=bedrock.BedrockFoundationModel.ANTHROPIC_CLAUDE_3_5_SONNET_V1_0,
    messages=[
        bedrock.ChatMessage.user("From now on, you speak Japanese!"),
        bedrock.ChatMessage.assistant("Konnichiwa!"),
        bedrock.ChatMessage.user("From now on, you speak {{language}}!")
    ],
    system="You are a helpful assistant that only speaks the language you're told.",
    prompt_variables=["language"],
    tool_configuration=bedrock.ToolConfiguration(
        tool_choice=bedrock.ToolChoice.AUTO,
        tools=[
            bedrock.Tool.function(
                name="top_song",
                description="Get the most popular song played on a radio station.",
                input_schema={
                    "type": "object",
                    "properties": {
                        "sign": {
                            "type": "string",
                            "description": "The call sign for the radio station for which you want the most popular song. Example call signs are WZPZ and WKR."
                        }
                    },
                    "required": ["sign"]
                }
            )
        ]
    )
)

bedrock.Prompt(self, "prompt1",
    prompt_name="prompt-chat",
    description="my first chat prompt",
    default_variant=variant_chat,
    variants=[variant_chat],
    kms_key=cmk
)
Attributes
- inference_configuration
(experimental) Inference configuration for the Chat Prompt.
- Default:
No inference configuration provided.
- Stability:
experimental
- messages
(experimental) The messages in the chat prompt.
Must include at least one User Message. The messages should alternate between User and Assistant.
- Stability:
experimental
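The constraint above (at least one User message, roles alternating between User and Assistant) can be sketched as a plain-Python check. This is a hypothetical helper, not part of the aws_bedrock_alpha API; plain role strings stand in for ChatMessage.user / ChatMessage.assistant entries:

    def validate_chat_roles(roles):
        """Check the chat-prompt message constraints: at least one 'user'
        message, and no two consecutive messages with the same role."""
        if "user" not in roles:
            return False  # must include at least one User message
        for prev, curr in zip(roles, roles[1:]):
            if prev == curr:
                return False  # roles must alternate between User and Assistant
        return True

    # Mirrors the example's message order: user, assistant, user.
    print(validate_chat_roles(["user", "assistant", "user"]))  # True
    print(validate_chat_roles(["user", "user"]))               # False
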
- model
(experimental) The model used to run the prompt.
The model can be a foundation model, a custom model, or a provisioned model.
- Stability:
experimental
- prompt_variables
(experimental) The variables in the prompt template that can be filled in at runtime.
- Default:
No variables defined.
- Stability:
experimental
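Prompt variables appear in message text as {{name}} placeholders (see {{language}} in the example above) and are filled in at runtime by Bedrock, not by the CDK. The substitution behavior can be illustrated with a plain-Python sketch; this helper is hypothetical and only demonstrates the template syntax:

    import re

    def fill_template(template, variables):
        """Illustrative only: substitute {{name}} placeholders the way the
        Bedrock runtime fills prompt variables at invocation time."""
        return re.sub(r"\{\{(\w+)\}\}", lambda m: variables[m.group(1)], template)

    print(fill_template("From now on, you speak {{language}}!", {"language": "French"}))
    # From now on, you speak French!
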
- system
(experimental) Context or instructions for the model to consider before generating a response.
- Default:
No system message provided.
- Stability:
experimental
- tool_configuration
(experimental) The configuration of tools available to the model and how the model must use them.
- Default:
No tool configuration provided.
- Stability:
experimental
- variant_name
(experimental) The name of the prompt variant.
- Stability:
experimental