inferenceConfig

The inference parameters that control model behavior during evaluation, including temperature, maximum token count, and sampling settings.
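A minimal sketch of what such a configuration might contain. The field names used here (temperature, maxTokens, topP, stopSequences) are assumptions modeled on common model-inference APIs and are not confirmed by this document:

```python
# Hypothetical inferenceConfig payload; all field names are assumptions,
# not taken from this document.
inference_config = {
    "temperature": 0.2,   # lower values -> more deterministic sampling
    "maxTokens": 512,     # upper bound on generated tokens
    "topP": 0.9,          # nucleus-sampling probability threshold
    "stopSequences": [],  # strings that halt generation when emitted
}

# Basic sanity checks before submitting the config to an evaluation run.
assert 0.0 <= inference_config["temperature"] <= 1.0
assert inference_config["maxTokens"] > 0
assert 0.0 < inference_config["topP"] <= 1.0
```

Lower temperature and topP values make evaluation runs more reproducible, which is often desirable when comparing model outputs across runs.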