TextAiPromptInferenceConfiguration

Inference configuration for text-based AI Prompts.

Types

class Builder
object Companion

Properties

The maximum number of tokens to generate in the response.

The temperature setting for controlling randomness in the generated response.

val topK: Int

The top-K sampling parameter for token selection.

val topP: Float

The top-P sampling parameter for nucleus sampling.

Functions

open operator override fun equals(other: Any?): Boolean
open override fun hashCode(): Int
open override fun toString(): String
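
Usage

A minimal sketch of constructing this configuration, assuming the companion object exposes a DSL-style invoke factory that accepts a Builder lambda (a common pattern for generated Kotlin model classes). The maxTokens and temperature member names and types are inferred from the property descriptions above and are not confirmed by this page.

// Hypothetical construction; the invoke(...) factory on Companion, and the
// maxTokens and temperature member names/types, are assumptions inferred from
// the Builder/Companion types and property descriptions on this page.
val inferenceConfig = TextAiPromptInferenceConfiguration {
    maxTokens = 512     // assumed: maximum number of tokens to generate in the response
    temperature = 0.7f  // assumed: randomness of the generated response
    topK = 50           // documented: top-K sampling parameter for token selection
    topP = 0.9f         // documented: top-P (nucleus) sampling parameter
}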