inferenceConfig
Construct an aws.sdk.kotlin.services.bedrockagentcorecontrol.model.InferenceConfiguration inside the given block.
The inference configuration parameters that control model behavior during evaluation, including temperature, token limits, and sampling settings.
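A minimal sketch of how the DSL-style builder might be used to populate this property. The field names and types shown (temperature, topP, maxTokens) are assumptions inferred from the description above, not confirmed members of the generated model; check the InferenceConfiguration class for the exact properties.

```kotlin
import aws.sdk.kotlin.services.bedrockagentcorecontrol.model.InferenceConfiguration

// Sketch only: property names/types below are assumptions based on the
// description (temperature, token limits, sampling settings). Verify them
// against the generated InferenceConfiguration model before use.
val config = InferenceConfiguration {
    temperature = 0.2f   // lower values give more deterministic output (assumed field)
    topP = 0.9f          // nucleus-sampling cutoff (assumed field)
    maxTokens = 1024     // cap on generated tokens (assumed field)
}
```

In a request builder that exposes inferenceConfig, the same block can typically be supplied inline (inferenceConfig { ... }) instead of constructing the model separately, which is the pattern this property is generated for.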