HumanEvaluationCustomMetric
In a model evaluation job that uses human workers, you must define the name of the metric, how you want that metric rated (ratingMethod), and, optionally, a description of the metric.
Contents
- name
-
The name of the metric. Your human evaluators will see this name in the evaluation UI.
Type: String
Length Constraints: Minimum length of 1. Maximum length of 63.
Pattern:
[0-9a-zA-Z-_.]+
Required: Yes
- ratingMethod
-
Choose how you want your human workers to evaluate your model. Valid values for rating methods are ThumbsUpDown, IndividualLikertScale, ComparisonLikertScale, ComparisonChoice, and ComparisonRank.
Type: String
Length Constraints: Minimum length of 1. Maximum length of 100.
Pattern:
[0-9a-zA-Z-_]+
Required: Yes
- description
-
An optional description of the metric. Use this parameter to provide more details about the metric.
Type: String
Length Constraints: Minimum length of 1. Maximum length of 63.
Pattern:
.+
Required: No
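The constraints above can be sketched as a small client-side check. This is a hypothetical helper (not part of any AWS SDK), assuming the metric is represented as a plain dictionary with the field names listed in this page:

```python
import re

# Pattern and length constraints documented for HumanEvaluationCustomMetric.
NAME_PATTERN = re.compile(r"^[0-9a-zA-Z\-_.]+$")
RATING_METHODS = {
    "ThumbsUpDown",
    "IndividualLikertScale",
    "ComparisonLikertScale",
    "ComparisonChoice",
    "ComparisonRank",
}

def validate_metric(metric: dict) -> list:
    """Return a list of constraint violations (empty if the metric is valid)."""
    errors = []

    # name: required, 1-63 chars, pattern [0-9a-zA-Z-_.]+
    name = metric.get("name", "")
    if not (1 <= len(name) <= 63) or not NAME_PATTERN.match(name):
        errors.append("name must match [0-9a-zA-Z-_.]+ and be 1-63 characters")

    # ratingMethod: required, 1-100 chars, one of the documented values
    method = metric.get("ratingMethod", "")
    if method not in RATING_METHODS:
        errors.append("ratingMethod must be one of the documented rating methods")

    # description: optional, 1-63 chars when present
    desc = metric.get("description")
    if desc is not None and not (1 <= len(desc) <= 63):
        errors.append("description, if present, must be 1-63 characters")

    return errors

# Example metric definition (hypothetical values).
metric = {
    "name": "helpfulness",
    "ratingMethod": "IndividualLikertScale",
    "description": "How helpful the response is to the prompt.",
}
print(validate_metric(metric))  # an empty list means all constraints pass
```

Running such a check before submitting the evaluation job surfaces constraint violations locally instead of as a service-side validation error.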
See Also
For more information about using this API in one of the language-specific AWS SDKs, see the following: