PromptVariant
- class aws_cdk.aws_bedrock_alpha.PromptVariant(*args: Any, **kwargs)
Bases:
object
(experimental) Factory class for creating prompt variants.
Provides static methods to create different types of prompt variants with proper configuration and type safety.
- Stability:
experimental
Example:
cmk = kms.Key(self, "cmk")

# Assuming you have an existing agent and alias
agent = bedrock.Agent.from_agent_attributes(self, "ImportedAgent",
    agent_arn="arn:aws:bedrock:region:account:agent/agent-id",
    role_arn="arn:aws:iam::account:role/agent-role"
)

agent_alias = bedrock.AgentAlias.from_attributes(self, "ImportedAlias",
    alias_id="alias-id",
    alias_name="my-alias",
    agent_version="1",
    agent=agent
)

agent_variant = bedrock.PromptVariant.agent(
    variant_name="agent-variant",
    model=bedrock.BedrockFoundationModel.ANTHROPIC_CLAUDE_3_5_SONNET_V1_0,
    agent_alias=agent_alias,
    prompt_text="Use the agent to help with: {{task}}. Please be thorough and provide detailed explanations.",
    prompt_variables=["task"]
)

bedrock.Prompt(self, "agentPrompt",
    prompt_name="agent-prompt",
    description="Prompt for agent interactions",
    default_variant=agent_variant,
    variants=[agent_variant],
    kms_key=cmk
)
Static Methods
- classmethod agent(*, agent_alias, prompt_text, model, variant_name, prompt_variables=None)
(experimental) Creates an agent prompt template variant.
- Parameters:
  - agent_alias (IAgentAlias) – (experimental) An alias pointing to the agent version to be used.
  - prompt_text (str) – (experimental) The text prompt. Variables are declared by enclosing the variable name in double curly braces, as in {{variable_name}}.
  - model (IBedrockInvokable) – (experimental) The model used to run the prompt. The model can be a foundation model, a custom model, or a provisioned model.
  - variant_name (str) – (experimental) The name of the prompt variant.
  - prompt_variables (Optional[Sequence[str]]) – (experimental) The variables in the prompt template that can be filled in at runtime. Default: - No variables defined.
- Return type:
PromptVariant
- Returns:
A PromptVariant configured for agent interactions
- Stability:
experimental
- classmethod chat(*, messages, inference_configuration=None, system=None, tool_configuration=None, model, variant_name, prompt_variables=None)
(experimental) Creates a chat template variant.
Use this template type when the model supports the Converse API or the Anthropic Claude Messages API. It lets you include a System prompt plus prior User and Assistant messages for context.
- Parameters:
  - messages (Sequence[ChatMessage]) – (experimental) The messages in the chat prompt. Must include at least one User message; the messages should alternate between User and Assistant.
  - inference_configuration (Optional[PromptInferenceConfiguration]) – (experimental) Inference configuration for the Chat Prompt. Default: - No inference configuration provided.
  - system (Optional[str]) – (experimental) Context or instructions for the model to consider before generating a response. Default: - No system message provided.
  - tool_configuration (Union[ToolConfiguration, Dict[str, Any], None]) – (experimental) The configuration defining the tools available to the model and how it must use them. Default: - No tool configuration provided.
  - model (IBedrockInvokable) – (experimental) The model used to run the prompt. The model can be a foundation model, a custom model, or a provisioned model.
  - variant_name (str) – (experimental) The name of the prompt variant.
  - prompt_variables (Optional[Sequence[str]]) – (experimental) The variables in the prompt template that can be filled in at runtime. Default: - No variables defined.
- Return type:
PromptVariant
- Returns:
A PromptVariant configured for chat interactions
- Stability:
experimental
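Example (a minimal sketch modeled on the agent example above; the ChatMessage.user() and ChatMessage.assistant() factory helpers are assumed to come from this alpha module, and all names and message text are illustrative):
chat_variant = bedrock.PromptVariant.chat(
    variant_name="chat-variant",
    model=bedrock.BedrockFoundationModel.ANTHROPIC_CLAUDE_3_5_SONNET_V1_0,
    messages=[
        # Must start with a User message and alternate User/Assistant
        bedrock.ChatMessage.user("Tell me about {{topic}}."),  # assumed helper
        bedrock.ChatMessage.assistant("Happy to help. What would you like to know?"),  # assumed helper
        bedrock.ChatMessage.user("Keep the answer brief."),
    ],
    system="You are a concise, factual assistant.",
    prompt_variables=["topic"]
)

bedrock.Prompt(self, "chatPrompt",
    prompt_name="chat-prompt",
    description="Prompt for chat interactions",
    default_variant=chat_variant,
    variants=[chat_variant]
)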
- classmethod text(*, prompt_text, inference_configuration=None, model, variant_name, prompt_variables=None)
(experimental) Creates a text template variant.
- Parameters:
  - prompt_text (str) – (experimental) The text prompt. Variables are declared by enclosing the variable name in double curly braces, as in {{variable_name}}.
  - inference_configuration (Optional[PromptInferenceConfiguration]) – (experimental) Inference configuration for the Text Prompt. Default: - No inference configuration provided.
  - model (IBedrockInvokable) – (experimental) The model used to run the prompt. The model can be a foundation model, a custom model, or a provisioned model.
  - variant_name (str) – (experimental) The name of the prompt variant.
  - prompt_variables (Optional[Sequence[str]]) – (experimental) The variables in the prompt template that can be filled in at runtime. Default: - No variables defined.
- Return type:
PromptVariant
- Returns:
A PromptVariant configured for text processing
- Stability:
experimental
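Example (a minimal sketch; PromptInferenceConfiguration.text() is assumed to be the inference-configuration factory in this alpha module, and all names are illustrative):
text_variant = bedrock.PromptVariant.text(
    variant_name="text-variant",
    model=bedrock.BedrockFoundationModel.ANTHROPIC_CLAUDE_3_5_SONNET_V1_0,
    prompt_text="Summarize the following discussion about {{topic}}.",
    prompt_variables=["topic"],
    # Optional tuning of the model's generation behavior (assumed factory)
    inference_configuration=bedrock.PromptInferenceConfiguration.text(
        temperature=0.7,
        max_tokens=1000
    )
)

bedrock.Prompt(self, "textPrompt",
    prompt_name="text-prompt",
    description="Prompt for text processing",
    default_variant=text_variant,
    variants=[text_variant]
)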