TextPromptVariantProps

class aws_cdk.aws_bedrock_alpha.TextPromptVariantProps(*, model, variant_name, prompt_variables=None, prompt_text, inference_configuration=None)

Bases: CommonPromptVariantProps

(experimental) Properties for creating a text prompt variant.

Parameters:
  • model (IBedrockInvokable) – (experimental) The model used to run the prompt. This can be a foundation model, a custom model, or a provisioned model.

  • variant_name (str) – (experimental) The name of the prompt variant.

  • prompt_variables (Optional[Sequence[str]]) – (experimental) The variables in the prompt template that can be filled in at runtime. Default: - No variables defined.

  • prompt_text (str) – (experimental) The text prompt. Variables are referenced by enclosing their names in double curly braces, as in {{variable_name}}.

  • inference_configuration (Optional[PromptInferenceConfiguration]) – (experimental) Inference configuration for the Text Prompt. Default: - No inference configuration provided.
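
Only model, variant_name, and prompt_text are required; the other properties fall back to their defaults. A minimal sketch (assuming the same bedrock alias used in the example below):

variant = bedrock.PromptVariant.text(
    variant_name="minimalVariant",
    model=bedrock.BedrockFoundationModel.ANTHROPIC_CLAUDE_SONNET_V1_0,
    prompt_text="Write a one-sentence summary of our conversation."
)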

Stability:

experimental

Example:

# Imports assumed to be in scope:
#   from aws_cdk import aws_kms as kms
#   from aws_cdk import aws_bedrock_alpha as bedrock
# `self` refers to the enclosing Stack (or other construct scope).

# Customer-managed KMS key used to encrypt the prompt
cmk = kms.Key(self, "cmk")
claude_model = bedrock.BedrockFoundationModel.ANTHROPIC_CLAUDE_SONNET_V1_0

# Text prompt variant with one runtime variable and explicit inference settings
variant1 = bedrock.PromptVariant.text(
    variant_name="variant1",
    model=claude_model,
    prompt_variables=["topic"],
    prompt_text="This is my first text prompt. Please summarize our conversation on: {{topic}}.",
    inference_configuration=bedrock.PromptInferenceConfiguration.text(
        temperature=1,
        top_p=0.999,
        max_tokens=2000
    )
)

# Prompt that uses variant1 as its default variant and is encrypted with the CMK
prompt1 = bedrock.Prompt(self, "prompt1",
    prompt_name="prompt1",
    description="my first prompt",
    default_variant=variant1,
    variants=[variant1],
    kms_key=cmk
)

Attributes

inference_configuration

(experimental) Inference configuration for the Text Prompt.

Default:
  • No inference configuration provided.

Stability:

experimental

model

(experimental) The model used to run the prompt.

This can be a foundation model, a custom model, or a provisioned model.

Stability:

experimental

prompt_text

(experimental) The text prompt.

Variables are referenced by enclosing their names in double curly braces, as in {{variable_name}}.

Stability:

experimental

prompt_variables

(experimental) The variables in the prompt template that can be filled in at runtime.

Default:
  • No variables defined.

Stability:

experimental
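
For example, a variant whose prompt_variables names match the {{placeholders}} in its prompt_text (a sketch reusing claude_model and the bedrock alias from the example above):

variant2 = bedrock.PromptVariant.text(
    variant_name="variant2",
    model=claude_model,
    prompt_variables=["audience", "topic"],
    prompt_text="Explain {{topic}} to an audience of {{audience}}."
)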

variant_name

(experimental) The name of the prompt variant.

Stability:

experimental