Class AzureOpenAICompletionParameters
- Namespace
- FoundationaLLM.Common.Models.Orchestration.Direct
- Assembly
- FoundationaLLM.Common.dll
Supported model configuration parameters.
public class AzureOpenAICompletionParameters
- Inheritance
- object ← AzureOpenAICompletionParameters
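A minimal usage sketch, assuming only the class and property names documented on this page; the serializer setup is illustrative. Because every property is nullable and annotated with `[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]`, unset properties are omitted from the serialized payload.

```csharp
using System.Text.Json;
using FoundationaLLM.Common.Models.Orchestration.Direct;

var parameters = new AzureOpenAICompletionParameters
{
    Temperature = 0.2f,
    MaxTokens = 256,
    Stop = new[] { "\n\n" }
};

// The [JsonPropertyName] attributes map each property to its
// snake_case wire name; null properties are skipped entirely.
string json = JsonSerializer.Serialize(parameters);
// json contains only the "temperature", "max_tokens", and "stop" keys.
```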
Properties
BestOf
Generates best_of completions server-side and returns the "best" (the one with the highest log probability per token).
[JsonPropertyName("best_of")]
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
public int? BestOf { get; set; }
Property Value
- int?
Echo
Echo back the prompt in addition to the completion. This parameter cannot be used with gpt-35-turbo.
[JsonPropertyName("echo")]
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
public bool? Echo { get; set; }
Property Value
- bool?
FrequencyPenalty
Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim.
[JsonPropertyName("frequency_penalty")]
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
public float? FrequencyPenalty { get; set; }
Property Value
- float?
LogProbs
Include the log probabilities of the logprobs most likely tokens, as well as the chosen tokens.
[JsonPropertyName("logprobs")]
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
public int? LogProbs { get; set; }
Property Value
- int?
LogitBias
Modify the likelihood of specified tokens appearing in the completion.
[JsonPropertyName("logit_bias")]
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
public LogitBias? LogitBias { get; set; }
Property Value
- LogitBias?
MaxTokens
The maximum number of tokens to generate.
[JsonPropertyName("max_tokens")]
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
public int? MaxTokens { get; set; }
Property Value
- int?
N
How many completions to generate for each prompt.
[JsonPropertyName("n")]
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
public int? N { get; set; }
Property Value
- int?
PresencePenalty
Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics.
[JsonPropertyName("presence_penalty")]
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
public float? PresencePenalty { get; set; }
Property Value
- float?
Stop
Up to four sequences where the API will stop generating further tokens. The returned text won't contain the stop sequence. For GPT-4 Turbo with Vision, up to two sequences are supported.
[JsonPropertyName("stop")]
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
public string[]? Stop { get; set; }
Property Value
- string[]?
Stream
Whether to stream back partial progress.
[JsonPropertyName("stream")]
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
public bool? Stream { get; set; }
Property Value
- bool?
Suffix
The suffix that comes after a completion of inserted text.
[JsonPropertyName("suffix")]
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
public string? Suffix { get; set; }
Property Value
- string?
Temperature
Controls randomness in the model. Lower values will make the model more deterministic and higher values will make the model more random.
[JsonPropertyName("temperature")]
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
public float? Temperature { get; set; }
Property Value
- float?
TopP
An alternative to sampling with temperature, called nucleus sampling: only the tokens comprising the top_p cumulative probability mass are considered. Defaults to null.
[JsonPropertyName("top_p")]
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
public float? TopP { get; set; }
Property Value
- float?
User
A unique identifier representing your end user, which can help monitor and detect abuse.
[JsonPropertyName("user")]
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
public string? User { get; set; }
Property Value
- string?
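Deserialization works symmetrically: the snake_case names on the wire map back onto the C# properties through the same `[JsonPropertyName]` attributes. A sketch, assuming only the members documented above:

```csharp
using System.Text.Json;
using FoundationaLLM.Common.Models.Orchestration.Direct;

// A fragment of a completion request body using the wire-format names.
string json = """{"temperature":0.7,"top_p":0.95,"max_tokens":128}""";

var parameters = JsonSerializer.Deserialize<AzureOpenAICompletionParameters>(json);
// parameters.Temperature, parameters.TopP, and parameters.MaxTokens are
// populated; everything not present in the JSON stays null.
```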