fix: Correct documentation for AI config function. #754

Merged
merged 8 commits on Jan 24, 2025
19 changes: 9 additions & 10 deletions packages/sdk/server-ai/src/api/LDAIClient.ts
@@ -8,22 +8,21 @@ import { LDAIConfig, LDAIDefaults } from './config/LDAIConfig';
export interface LDAIClient {
/**
* Retrieves and processes an AI configuration based on the provided key, LaunchDarkly context,
* and variables. This includes the model configuration and the processed prompts.
* and variables. This includes the model configuration and the processed messages.
*
* @param key The key of the AI configuration.
* @param context The LaunchDarkly context object that contains relevant information about the
* current environment, user, or session. This context may influence how the configuration is
* processed or personalized.
* @param defaultValue A fallback value containing model configuration and messages. This will
* be used if the configuration is not available from LaunchDarkly.
* @param variables A map of key-value pairs representing dynamic variables to be injected into
* the prompt template. The keys correspond to placeholders within the template, and the values
* the message templates. The keys correspond to placeholders within the template, and the values
* are the corresponding replacements.
* @param defaultValue A fallback value containing model configuration and prompts. This will
* be used if the configurationuration is not available from launchdarkly.
*
* @returns The AI configurationuration including a processed prompt after all variables have been
* substituted in the stored prompt template. This will also include a `tracker` used to track
* the state of the AI operation. If the configuration cannot be accessed from LaunchDarkly, then
* the return value will include information from the defaultValue.
* @returns The AI `config`, processed `messages`, and a `tracker`. If the configuration cannot be accessed from
* LaunchDarkly, then the return value will include information from the defaultValue. The returned `tracker` should
* be used to track the state of the AI operation.
*
* @example
* ```
@@ -34,7 +33,7 @@ export interface LDAIClient {
* enabled: false,
* };
*
* const result = modelConfig(key, context, defaultValue, variables);
* const result = config(key, context, defaultValue, variables);
* // Output:
* {
* enabled: true,
Expand All @@ -44,7 +43,7 @@ export interface LDAIClient {
* maxTokens: 4096,
* userDefinedKey: "myValue",
* },
* prompt: [
* messages: [
* {
* role: "system",
* content: "You are an amazing GPT."
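For reference, below is a minimal sketch of how the documented method might be called from a server application. Only the `config(key, context, defaultValue, variables)` signature and the returned `config`, `messages`, and `tracker` described in the JSDoc above are taken from this diff; the package names, the `initAi` helper, the use of `await`, and all concrete keys and values are assumptions for illustration.

```typescript
// Hypothetical usage sketch. Only the config(key, context, defaultValue, variables)
// signature and the { enabled, model, messages, tracker } result shape come from the
// JSDoc in this PR; package names, initAi, and all concrete values are assumptions.
import { init } from '@launchdarkly/node-server-sdk';
import { initAi } from '@launchdarkly/server-sdk-ai';

async function main() {
  // Standard server SDK client; the AI client wraps it.
  const ldClient = init('my-sdk-key');
  await ldClient.waitForInitialization();
  const aiClient = initAi(ldClient);

  // Context that may influence how the configuration is processed or personalized.
  const context = { kind: 'user', key: 'user-123' };

  // Fallback used if the configuration is not available from LaunchDarkly.
  const defaultValue = { enabled: false };

  // Variables substituted into placeholders in the stored message templates.
  const variables = { userDefinedKey: 'myValue' };

  // Whether config is synchronous or returns a Promise is not shown in this excerpt;
  // await is assumed here.
  const { enabled, model, messages, tracker } = await aiClient.config(
    'my-ai-config-key',
    context,
    defaultValue,
    variables,
  );

  if (enabled) {
    // messages has the variables substituted into the stored templates; the tracker
    // should be used to record the state of the AI operation.
    console.log(model, messages, tracker !== undefined);
  }

  await ldClient.close();
}

main().catch((err) => console.error(err));
```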