CommonModelParameters interface

Common language model parameters for Chat Completions requests. Properties that are omitted fall back to the default values noted below.
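
Taken together, the properties documented below describe an object of roughly the following shape (a sketch reconstructed from this reference; all properties are optional):

interface CommonModelParameters {
  frequencyPenalty?: number; // [-2, 2]; default 0
  maxTokens?: number;        // maximum number of tokens to generate
  model?: string;            // e.g. 'gpt-4o'
  presencePenalty?: number;  // [-2, 2]; default 0
  seed?: number;             // reproducible sampling when set
  stop?: string[];           // stop sequences
  temperature?: number;      // default 0.7
}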

Properties

frequencyPenalty

A float in the range [-2, 2]. Positive values reduce the likelihood of repeated tokens; negative values increase it. Default is 0.

maxTokens

Maximum number of tokens to generate.

model

The name of the model to use (for example, 'gpt-4o'). Defaults to null if not specified.

presencePenalty

A float in the range [-2, 2]. Positive values penalize tokens that have already appeared in the text, making the model more likely to move on to new topics; negative values have the opposite effect. Default is 0.

seed

Random seed used to make outputs reproducible across requests. If omitted, sampling is randomized.

stop

List of stop sequences; generation halts as soon as any of them is produced. Default is none.

temperature

Sampling temperature; higher values produce more varied output, lower values more deterministic output. Default is 0.7.

Property Details

frequencyPenalty

A float in the range [-2, 2]. Positive values reduce the likelihood of repeated tokens; negative values increase it. Default is 0.

frequencyPenalty?: number

Property Value

number

maxTokens

Maximum number of tokens to generate.

maxTokens?: number

Property Value

number

model

The name of the model to use (for example, 'gpt-4o'). Defaults to null if not specified.

model?: string

Property Value

string

presencePenalty

A float in the range [-2, 2]. Positive values penalize tokens that have already appeared in the text, making the model more likely to move on to new topics; negative values have the opposite effect. Default is 0.

presencePenalty?: number

Property Value

number

seed

Random seed used to make outputs reproducible across requests. If omitted, sampling is randomized.

seed?: number

Property Value

number

stop

List of stop sequences; generation halts as soon as any of them is produced. Default is none.

stop?: string[]

Property Value

string[]

temperature

Sampling temperature; higher values produce more varied output, lower values more deterministic output. Default is 0.7.

temperature?: number

Property Value

number
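
As a usage sketch, a caller might set only the parameters it cares about and rely on the defaults for the rest. The sendChatCompletion helper below is hypothetical and stands in for whatever function accepts a CommonModelParameters object:

// Illustrative only: sendChatCompletion is a hypothetical helper,
// not part of this interface's API.
const params: CommonModelParameters = {
  model: "gpt-4o",   // example model name
  temperature: 0.2,  // overrides the 0.7 default for more focused output
  seed: 42,          // request reproducible sampling
  stop: ["\n\n"],    // halt generation at the first blank line
};
// frequencyPenalty, presencePenalty, and maxTokens are omitted,
// so their default values apply.
await sendChatCompletion("Summarize this document.", params);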