Unofficial, community-maintained wrapper for the OpenAI REST APIs. See https://platform.openai.com/docs/api-reference/introduction for further info on the REST endpoints.
Thank you for your understanding and support, and thanks to everyone who has contributed to the library so far!
Add :openai as a dependency in your mix.exs file.
def deps do
[
{:openai, "~> 0.6.2"}
]
end
You can configure openai in your Mix config.exs (by default $project_root/config/config.exs). If you're using Phoenix, add the configuration to your config/dev.exs|test.exs|prod.exs files. An example config:
import Config
config :openai,
# find it at https://platform.openai.com/account/api-keys
api_key: "your-api-key",
# find it at https://platform.openai.com/account/org-settings under "Organization ID"
organization_key: "your-organization-key",
# optional, use when required by an OpenAI API beta, e.g.:
beta: "assistants=v1",
# optional, passed to [HTTPoison.Request](https://hexdocs.pm/httpoison/HTTPoison.Request.html) options
http_options: [recv_timeout: 30_000],
# optional, useful if you want to do local integration tests using Bypass or similar
# (https://github.com/PSPDFKit-labs/bypass), do not use it for production code,
# but only in your test config!
api_url: "http://localhost/"
Note: you can load your OS environment variables in the configuration file. For instance, if you set an environment variable named OPENAI_API_KEY for the API key, you can read it with System.get_env("OPENAI_API_KEY"). Keep in mind that config.exs is evaluated at compile time, so get_env/1 runs during the build; if you want to read environment variables at runtime, use runtime.exs instead of config.exs in your application (see the Elixir configuration docs).
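For example, a minimal config/runtime.exs sketch (the OPENAI_ORGANIZATION_KEY variable name below is only an example, use whatever your environment provides):
import Config

config :openai,
  # read at runtime, so the key is not baked into the compiled artifact
  api_key: System.get_env("OPENAI_API_KEY"),
  organization_key: System.get_env("OPENAI_ORGANIZATION_KEY"),
  http_options: [recv_timeout: 30_000]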
The client library configuration can be overridden at runtime by passing a %OpenAI.Config{} struct as the last argument of the function you need to use. For instance, if you need to use a different api_key, organization_key or http_options, you can simply do:
# Builds a config struct with "test-api-key" as api_key; every other field falls back
# to the values taken from config.exs, so you don't need to set the defaults manually.
config_override = %OpenAI.Config{api_key: "test-api-key"}
# chat_completion with overridden config
OpenAI.chat_completion([
model: "gpt-3.5-turbo",
messages: [
%{role: "system", content: "You are a helpful assistant."},
%{role: "user", content: "Who won the world series in 2020?"},
%{role: "assistant", content: "The Los Angeles Dodgers won the World Series in 2020."},
%{role: "user", content: "Where was it played?"}
]
],
config_override # <--- pass the overridden configuration as the last argument of the function
)
# chat_completion with standard config
OpenAI.chat_completion(
model: "gpt-3.5-turbo",
messages: [
%{role: "system", content: "You are a helpful assistant."},
%{role: "user", content: "Who won the world series in 2020?"},
%{role: "assistant", content: "The Los Angeles Dodgers won the World Series in 2020."},
%{role: "user", content: "Where was it played?"}
]
)
You can perform a config override in all the functions. Note that the params argument must be passed explicitly as a list in square brackets when the configuration is overridden, as in the example above.
Get your API key from https://platform.openai.com/account/api-keys
Retrieve the list of available models
OpenAI.models()
{:ok, %{
data: [%{
"created" => 1651172505,
"id" => "davinci-search-query",
"object" => "model",
"owned_by" => "openai-dev",
"parent" => nil,
"permission" => [
%{
"allow_create_engine" => false,
"allow_fine_tuning" => false,
"allow_logprobs" => true,
...
}
],
"root" => "davinci-search-query"
},
....],
object: "list"
}}
See: https://platform.openai.com/docs/api-reference/models/list
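A quick usage sketch, pattern-matching the response shown above to list the available model ids (note that the top-level keys are atoms while the nested ones are strings):
{:ok, %{data: models}} = OpenAI.models()

Enum.map(models, & &1["id"])
# => ["davinci-search-query", ...]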
Retrieve specific model info
OpenAI.models("davinci-search-query")
{:ok,
%{
created: 1651172505,
id: "davinci-search-query",
object: "model",
owned_by: "openai-dev",
parent: nil,
permission: [
%{
"allow_create_engine" => false,
"allow_fine_tuning" => false,
"allow_logprobs" => true,
"allow_sampling" => true,
"allow_search_indices" => true,
"allow_view" => true,
"created" => 1669066353,
"group" => nil,
"id" => "modelperm-lYkiTZMmJMWm8jvkPx2duyHE",
"is_blocking" => false,
"object" => "model_permission",
"organization" => "*"
}
],
root: "davinci-search-query"
}}
See: https://platform.openai.com/docs/api-reference/models/retrieve
Returns one or more predicted completions for the given prompt. The function accepts the set of parameters used by the OpenAI Completions API.
OpenAI.completions(
model: "finetuned-model",
prompt: "once upon a time",
max_tokens: 5,
temperature: 1,
...
)
Example response:
{:ok, %{
choices: [
%{
"finish_reason" => "length",
"index" => 0,
"logprobs" => nil,
"text" => "\" thing we are given"
}
],
created: 1617147958,
id: "...",
model: "...",
object: "text_completion"
}
}
See: https://platform.openai.com/docs/api-reference/completions/create
This API has been deprecated by OpenAI, as engines have been replaced by models. If you are still using it, consider switching to completions(params) as soon as possible!
OpenAI.completions(
"davinci", # engine_id
prompt: "once upon a time",
max_tokens: 5,
temperature: 1,
...
)
{:ok, %{
choices: [
%{
"finish_reason" => "length",
"index" => 0,
"logprobs" => nil,
"text" => "\" thing we are given"
}
],
created: 1617147958,
id: "...",
model: "...",
object: "text_completion"
}
}
See: https://beta.openai.com/docs/api-reference/completions/create for the complete list of parameters you can pass to the completions function
Creates a completion for the chat message
OpenAI.chat_completion(
model: "gpt-3.5-turbo",
messages: [
%{role: "system", content: "You are a helpful assistant."},
%{role: "user", content: "Who won the world series in 2020?"},
%{role: "assistant", content: "The Los Angeles Dodgers won the World Series in 2020."},
%{role: "user", content: "Where was it played?"}
]
)
{:ok,
%{
choices: [
%{
"finish_reason" => "stop",
"index" => 0,
"message" => %{
"content" =>
"The 2020 World Series was played at Globe Life Field in Arlington, Texas due to the COVID-19 pandemic.",
"role" => "assistant"
}
}
],
created: 1_677_773_799,
id: "chatcmpl-6pftfA4NO9pOQIdxao6Z4McDlx90l",
model: "gpt-3.5-turbo-0301",
object: "chat.completion",
usage: %{
"completion_tokens" => 26,
"prompt_tokens" => 56,
"total_tokens" => 82
}
}}
See: https://platform.openai.com/docs/api-reference/chat/create for the complete list of parameters you can pass to the chat_completion function
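A minimal sketch of extracting the assistant's reply from the response shape shown above (the question is just an example):
{:ok, %{choices: [first_choice | _]}} =
  OpenAI.chat_completion(
    model: "gpt-3.5-turbo",
    messages: [%{role: "user", content: "Where was the 2020 World Series played?"}]
  )

# the nested keys are strings, so use bracket access
first_choice["message"]["content"]
# => "The 2020 World Series was played at Globe Life Field in Arlington, Texas..."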
Creates a completion for the chat messages as a stream. By default it streams to self(), but you can override this by passing a config override to the function with a different stream_to value in the http_options parameter.
import Config
config :openai,
api_key: "your-api-key",
http_options: [recv_timeout: :infinity, async: :once],
...
http_options must be set as above when you want to treat the chat completion as a stream.
OpenAI.chat_completion([
model: "gpt-3.5-turbo",
messages: [
%{role: "system", content: "You are a helpful assistant."},
%{role: "user", content: "Who won the world series in 2020?"},
%{role: "assistant", content: "The Los Angeles Dodgers won the World Series in 2020."},
%{role: "user", content: "Where was it played?"}
],
stream: true # set this param to true
]
)
|> Stream.each(fn res ->
IO.inspect(res)
end)
|> Stream.run()
%{
"choices" => [
%{"delta" => %{"role" => "assistant"}, "finish_reason" => nil, "index" => 0}
],
"created" => 1682700668,
"id" => "chatcmpl-7ALbIuLju70hXy3jPa3o5VVlrxR6a",
"model" => "gpt-3.5-turbo-0301",
"object" => "chat.completion.chunk"
}
%{
"choices" => [
%{"delta" => %{"content" => "The"}, "finish_reason" => nil, "index" => 0}
],
"created" => 1682700668,
"id" => "chatcmpl-7ALbIuLju70hXy3jPa3o5VVlrxR6a",
"model" => "gpt-3.5-turbo-0301",
"object" => "chat.completion.chunk"
}
%{
"choices" => [
%{"delta" => %{"content" => " World"}, "finish_reason" => nil, "index" => 0}
],
"created" => 1682700668,
"id" => "chatcmpl-7ALbIuLju70hXy3jPa3o5VVlrxR6a",
"model" => "gpt-3.5-turbo-0301",
"object" => "chat.completion.chunk"
}
%{
"choices" => [
%{
"delta" => %{"content" => " Series"},
"finish_reason" => nil,
"index" => 0
}
],
"created" => 1682700668,
"id" => "chatcmpl-7ALbIuLju70hXy3jPa3o5VVlrxR6a",
"model" => "gpt-3.5-turbo-0301",
"object" => "chat.completion.chunk"
}
%{
"choices" => [
%{"delta" => %{"content" => " in"}, "finish_reason" => nil, "index" => 0}
],
"created" => 1682700668,
"id" => "chatcmpl-7ALbIuLju70hXy3jPa3o5VVlrxR6a",
"model" => "gpt-3.5-turbo-0301",
"object" => "chat.completion.chunk"
}
...
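Since the call returns a stream of chunks shaped like the ones above, you can also collect the text deltas into a single string. A minimal sketch, assuming the streaming http_options configuration shown earlier is in place:
full_text =
  OpenAI.chat_completion(
    model: "gpt-3.5-turbo",
    messages: [%{role: "user", content: "Who won the world series in 2020?"}],
    stream: true
  )
  # each chunk carries a "choices" list with a "delta" map, as in the output above
  |> Stream.flat_map(fn chunk -> chunk["choices"] end)
  |> Stream.map(fn choice -> get_in(choice, ["delta", "content"]) || "" end)
  |> Enum.join()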
Creates a new edit for the provided input, instruction, and parameters
OpenAI.edits(
model: "text-davinci-edit-001",
input: "What day of the wek is it?",
instruction: "Fix the spelling mistakes"
)
{:ok,
%{
choices: [%{"index" => 0, "text" => "What day of the week is it?\n"}],
created: 1675443483,
object: "edit",
usage: %{
"completion_tokens" => 28,
"prompt_tokens" => 25,
"total_tokens" => 53
}
}}
See: https://platform.openai.com/docs/api-reference/edits/create
Generates an image based on the given prompt. Image functions can take some time to execute and the API may return a timeout error; if needed, you can pass an optional configuration struct with HTTPoison http_options as the second argument of the function to increase the timeout.
OpenAI.images_generations(
[prompt: "A developer writing a test", size: "256x256"],
%OpenAI.Config{http_options: [recv_timeout: 10 * 60 * 1000]} # optional!
)
{:ok,
%{
created: 1670341737,
data: [
%{
"url" => ...Returned url
}
]
}}
Note: this api signature changed in v0.3.0 to be compliant with the conventions of the other APIs; the alias OpenAI.image_generations(params, request_options) is still available for retrocompatibility. If you are using it, consider switching to OpenAI.images_generations(params, request_options) as soon as possible.
Note 2: the official way of passing http_options changed in v0.5.0 to be compliant with the conventions of the other APIs; the previous signature is still available for retrocompatibility, but if you are using it, consider switching to OpenAI.images_generations(params, config) as soon as possible.
See: https://platform.openai.com/docs/api-reference/images/create
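A minimal sketch of pulling the generated image URL out of the response shape shown above:
{:ok, %{data: [%{"url" => image_url} | _]}} =
  OpenAI.images_generations(prompt: "A developer writing a test", size: "256x256")

IO.puts(image_url)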
Edits an existing image based on the given prompt. Image functions can take some time to execute and the API may return a timeout error; if needed, you can pass an optional configuration struct with HTTPoison http_options as the last argument of the function to increase the timeout.
OpenAI.images_edits(
"/home/developer/myImg.png",
[prompt: "A developer writing a test", size: "256x256"],
%OpenAI.Config{http_options: [recv_timeout: 10 * 60 * 1000]} # optional!
)
{:ok,
%{
created: 1670341737,
data: [
%{
"url" => ...Returned url
}
]
}}
Note: the official way of passing http_options changed in v0.5.0 to be compliant with the conventions of the other APIs; the alias OpenAI.images_edits(file_path, params, request_options) is still available for retrocompatibility. If you are using it, consider switching to OpenAI.images_edits(file_path, params, config) as soon as possible.
See: https://platform.openai.com/docs/api-reference/images/create-edit
Creates a variation of an existing image. Image functions can take some time to execute and the API may return a timeout error; if needed, you can pass an optional configuration struct with HTTPoison http_options as the last argument of the function to increase the timeout.
OpenAI.images_variations(
"/home/developer/myImg.png",
[n: "5"],
%OpenAI.Config{http_options: [recv_timeout: 10 * 60 * 1000]} # optional!
)
{:ok,
%{
created: 1670341737,
data: [
%{
"url" => ...Returned url
}
]
}}
Note: the official way of passing http_options changed in v0.5.0 to be compliant with the conventions of the other APIs; the alias OpenAI.images_variations(file_path, params, request_options) is still available for retrocompatibility. If you are using it, consider switching to OpenAI.images_variations(file_path, params, config) as soon as possible.
See: https://platform.openai.com/docs/api-reference/images/create-variation
OpenAI.embeddings(
model: "text-embedding-ada-002",
input: "The food was delicious and the waiter..."
)
{:ok,
%{
data: [
%{
"embedding" => [0.0022523515000000003, -0.009276069000000001,
0.015758524000000003, -0.007790373999999999, -0.004714223999999999,
0.014806155000000001, -0.009803046499999999, -0.038323310000000006,
-0.006844355, -0.028672641, 0.025345700000000002, 0.018145794000000003,
-0.0035904291999999997, -0.025498080000000003, 5.142790000000001e-4,
-0.016317246, 0.028444072, 0.0053713582, 0.009631619999999999,
-0.016469626, -0.015390275, 0.004301531, 0.006984035499999999,
-0.007079272499999999, -0.003926933, 0.018602932000000003, 0.008666554,
-0.022717162999999995, 0.011460166999999997, 0.023860006,
0.015568050999999998, -0.003587254600000001, -0.034843990000000005,
-0.0041555012999999995, -0.026107594000000005, -0.02151083,
-0.0057618289999999996, 0.011714132499999998, 0.008355445999999999,
0.004098358999999999, 0.019199749999999998, -0.014336321, 0.008952264,
0.0063395994, -0.04576447999999999, ...],
"index" => 0,
"object" => "embedding"
}
],
model: "text-embedding-ada-002-v2",
object: "list",
usage: %{"prompt_tokens" => 8, "total_tokens" => 8}
}}
See: https://platform.openai.com/docs/api-reference/embeddings/create
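A minimal sketch of extracting the embedding vector (a list of floats) from the response shape shown above:
{:ok, %{data: [%{"embedding" => vector} | _]}} =
  OpenAI.embeddings(
    model: "text-embedding-ada-002",
    input: "The food was delicious and the waiter..."
  )

Enum.take(vector, 3)
# => [0.0022523515000000003, -0.009276069000000001, 0.015758524000000003]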
Generates audio from the input text.
OpenAI.audio_speech(
model: "tts-1",
input: "You know that Voight-Kampf test of yours. Did you ever take that test yourself?",
voice: "alloy"
)
{:ok, <<255, 255, ...>>}
See: https://platform.openai.com/docs/api-reference/audio/create to get info on the params accepted by the api
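Since audio_speech returns the raw audio binary, a minimal sketch of saving it to disk (the file name is just an example):
{:ok, audio_binary} =
  OpenAI.audio_speech(
    model: "tts-1",
    input: "You know that Voight-Kampf test of yours. Did you ever take that test yourself?",
    voice: "alloy"
  )

File.write!("speech.mp3", audio_binary)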
Transcribes audio into the input language.
OpenAI.audio_transcription(
"./path_to_file/blade_runner.mp3", # file path
model: "whisper-1"
)
{:ok,
%{
text: "I've seen things you people wouldn't believe.."
}}
See: https://platform.openai.com/docs/api-reference/audio/create to get info on the params accepted by the api
Translates audio into English.
OpenAI.audio_translation(
"./path_to_file/werner_herzog_interview.mp3", # file path
model: "whisper-1"
)
{:ok,
%{
text: "I thought if I walked, I would be saved. It was almost like a pilgrimage. I will definitely continue to walk long distances. It is a very unique form of life and existence that we have lost almost entirely from our normal life."
}
}
See: https://platform.openai.com/docs/api-reference/audio/create to get info on the params accepted by the api
Returns a list of files that belong to the user's organization.
OpenAI.files()
{:ok,
%{
data: [
%{
"bytes" => 123,
"created_at" => 213,
"filename" => "file.jsonl",
"id" => "file-123321",
"object" => "file",
"purpose" => "fine-tune",
"status" => "processed",
"status_details" => nil
}
],
object: "list"
}
}
See: https://platform.openai.com/docs/api-reference/files
Returns a file that belongs to the user's organization, given a file id.
OpenAI.files("file-123321")
{:ok,
%{
bytes: 923,
created_at: 1675370979,
filename: "file.jsonl",
id: "file-123321",
object: "file",
purpose: "fine-tune",
status: "processed",
status_details: nil
}
}
See: https://platform.openai.com/docs/api-reference/files/retrieve
Upload a file that contains document(s) to be used across various endpoints/features. Currently, the size of all the files uploaded by one organization can be up to 1 GB. Please contact OpenAI if you need to increase the storage limit.
OpenAI.files_upload("./file.jsonl", purpose: "fine-tune")
{:ok,
%{
bytes: 923,
created_at: 1675373519,
filename: "file.jsonl",
id: "file-123",
object: "file",
purpose: "fine-tune",
status: "uploaded",
status_details: nil
}
}
See: https://platform.openai.com/docs/api-reference/files/upload
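The returned file id is what the fine-tuning endpoints documented below expect. A minimal chaining sketch (the model name is just an example):
{:ok, file} = OpenAI.files_upload("./file.jsonl", purpose: "fine-tune")
{:ok, fine_tune} = OpenAI.finetunes_create(training_file: file.id, model: "curie")

fine_tune.status
# => "pending"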
Deletes a file.
OpenAI.files_delete("file-123")
{:ok, %{deleted: true, id: "file-123", object: "file"}}
See: https://platform.openai.com/docs/api-reference/files/delete
List your organization's fine-tuning jobs.
OpenAI.finetunes()
{:ok,
%{
object: "list",
data: [%{
"id" => "t-AF1WoRqd3aJAHsqc9NY7iL8F",
"object" => "fine-tune",
"model" => "curie",
"created_at" => 1614807352,
"fine_tuned_model" => null,
"hyperparams" => { ... },
"organization_id" => "org-...",
"result_files" = [],
"status": "pending",
"validation_files" => [],
"training_files" => [ { ... } ],
"updated_at" => 1614807352,
}],
}
}
See: https://platform.openai.com/docs/api-reference/fine-tunes/list
Gets info about a fine-tune job.
OpenAI.finetunes("t-AF1WoRqd3aJAHsqc9NY7iL8F")
{:ok,
%{
object: "list",
data: [%{
"id" => "t-AF1WoRqd3aJAHsqc9NY7iL8F",
"object" => "fine-tune",
"model" => "curie",
"created_at" => 1614807352,
"fine_tuned_model" => null,
"hyperparams" => { ... },
"organization_id" => "org-...",
"result_files" = [],
"status": "pending",
"validation_files" => [],
"training_files" => [ { ... } ],
"updated_at" => 1614807352,
}],
}
}
See: https://platform.openai.com/docs/api-reference/fine-tunes/retrieve
Creates a job that fine-tunes a specified model from a given dataset.
OpenAI.finetunes_create(
training_file: "file-123213231",
model: "curie",
)
{:ok,
%{
created_at: 1675527767,
events: [
%{
"created_at" => 1675527767,
"level" => "info",
"message" => "Created fine-tune: ft-IaBYfSSAK47UUCbebY5tBIEj",
"object" => "fine-tune-event"
}
],
fine_tuned_model: nil,
hyperparams: %{
"batch_size" => nil,
"learning_rate_multiplier" => nil,
"n_epochs" => 4,
"prompt_loss_weight" => 0.01
},
id: "ft-IaBYfSSAK47UUCbebY5tBIEj",
model: "curie",
object: "fine-tune",
organization_id: "org-1iPTOIak4b5fpuIB697AYMmO",
result_files: [],
status: "pending",
training_files: [
%{
"bytes" => 923,
"created_at" => 1675373519,
"filename" => "file-12321323.jsonl",
"id" => "file-12321323",
"object" => "file",
"purpose" => "fine-tune",
"status" => "processed",
"status_details" => nil
}
],
updated_at: 1675527767,
validation_files: []
}}
See: https://platform.openai.com/docs/api-reference/fine-tunes/create
Get fine-grained status updates for a fine-tune job.
OpenAI.finetunes_list_events("ft-AF1WoRqd3aJAHsqc9NY7iL8F")
{:ok,
%{
data: [
%{
"created_at" => 1675376995,
"level" => "info",
"message" => "Created fine-tune: ft-123",
"object" => "fine-tune-event"
},
%{
"created_at" => 1675377104,
"level" => "info",
"message" => "Fine-tune costs $0.00",
"object" => "fine-tune-event"
},
%{
"created_at" => 1675377105,
"level" => "info",
"message" => "Fine-tune enqueued. Queue number: 18",
"object" => "fine-tune-event"
},
...,
]
}
}
See: https://platform.openai.com/docs/api-reference/fine-tunes/events
Immediately cancel a fine-tune job.
OpenAI.finetunes_cancel("ft-AF1WoRqd3aJAHsqc9NY7iL8F")
{:ok,
%{
created_at: 1675527767,
events: [
...
%{
"created_at" => 1675528080,
"level" => "info",
"message" => "Fine-tune cancelled",
"object" => "fine-tune-event"
}
],
fine_tuned_model: nil,
hyperparams: %{
"batch_size" => 1,
"learning_rate_multiplier" => 0.1,
"n_epochs" => 4,
"prompt_loss_weight" => 0.01
},
id: "ft-IaBYfSSAK47UUCbebY5tBIEj",
model: "curie",
object: "fine-tune",
organization_id: "org-1iPTOIak4b5fpuIB697AYMmO",
result_files: [],
status: "cancelled",
training_files: [
%{
"bytes" => 923,
"created_at" => 1675373519,
"filename" => "file123.jsonl",
"id" => "file-123",
"object" => "file",
"purpose" => "fine-tune",
"status" => "processed",
"status_details" => nil
}
],
updated_at: 1675528080,
validation_files: []
}}
Deletes a fine-tuned model.
OpenAI.finetunes_delete_model("model-id")
{:ok,
%{
id: "model-id",
object: "model",
deleted: true
}
}
See: https://platform.openai.com/docs/api-reference/fine-tunes/delete-model
Classifies if text violates OpenAI's Content Policy
OpenAI.moderations(input: "I want to kill everyone!")
{:ok,
%{
id: "modr-6gEWXyuaU8dqiHpbAHIsdru0zuC88",
model: "text-moderation-004",
results: [
%{
"categories" => %{
"hate" => false,
"hate/threatening" => false,
"self-harm" => false,
"sexual" => false,
"sexual/minors" => false,
"violence" => true,
"violence/graphic" => false
},
"category_scores" => %{
"hate" => 0.05119025334715844,
"hate/threatening" => 0.00321022979915142,
"self-harm" => 7.337320857914165e-5,
"sexual" => 1.1111642379546538e-6,
"sexual/minors" => 3.588798147546868e-10,
"violence" => 0.9190407395362855,
"violence/graphic" => 1.2791929293598514e-7
},
"flagged" => true
}
]
}}
See: https://platform.openai.com/docs/api-reference/moderations/create
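A minimal sketch of checking the flagged field from the response shape shown above:
{:ok, %{results: [result | _]}} = OpenAI.moderations(input: "I want to kill everyone!")

if result["flagged"] do
  # handle content that violates the policy
end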
The following APIs are currently in beta. To use them, be sure to set the beta parameter in your config.
config :openai,
# optional, use when required by an OpenAI API beta, e.g.:
beta: "assistants=v1"
Retrieves the list of assistants.
OpenAI.assistants()
{:ok,
%{
data: [
%{
"created_at" => 1699472932,
"description" => nil,
"file_ids" => ["file-..."],
"id" => "asst_...",
"instructions" => "...",
"metadata" => %{},
"model" => "gpt-4-1106-preview",
"name" => "...",
"object" => "assistant",
"tools" => [%{"type" => "retrieval"}]
}
],
first_id: "asst_...",
has_more: false,
last_id: "asst_...",
object: "list"
}}
See: https://platform.openai.com/docs/api-reference/assistants/listAssistants
Retrieves the list of assistants filtered by query params.
OpenAI.assistants(after: "", limit: 10)
{:ok,
%{
data: [
%{
"created_at" => 1699472932,
"description" => nil,
"file_ids" => ["file-..."],
"id" => "asst_...",
"instructions" => "...",
"metadata" => %{},
"model" => "gpt-4-1106-preview",
"name" => "...",
"object" => "assistant",
"tools" => [%{"type" => "retrieval"}]
},
...
],
first_id: "asst_...",
has_more: false,
last_id: "asst_...",
object: "list"
}}
See: https://platform.openai.com/docs/api-reference/assistants/listAssistants
Retrieves an assistant by its id.
OpenAI.assistants("asst_...")
{:ok,
%{
created_at: 1699472932,
description: nil,
file_ids: ["file-..."],
id: "asst_...",
instructions: "...",
metadata: %{},
model: "gpt-4-1106-preview",
name: "...",
object: "assistant",
tools: [%{"type" => "retrieval"}]
}}
See: https://platform.openai.com/docs/api-reference/assistants/getAssistant
Creates a new assistant.
OpenAI.assistants_create(
model: "gpt-3.5-turbo-1106",
name: "My assistant",
instructions: "You are a research assistant.",
tools: [
%{type: "retrieval"}
],
file_ids: ["file-..."]
)
{:ok,
%{
created_at: 1699640038,
description: nil,
file_ids: ["file-..."],
id: "asst_...",
instructions: "You are a research assistant.",
metadata: %{},
model: "gpt-3.5-turbo-1106",
name: "My assistant",
object: "assistant",
tools: [%{"type" => "retrieval"}]
}}
See: https://platform.openai.com/docs/api-reference/assistants/createAssistant
Modifies an existing assistant.
OpenAI.assistants_modify(
"asst_...",
model: "gpt-4-1106-preview",
name: "My upgraded assistant"
)
{:ok,
%{
created_at: 1699640038,
description: nil,
file_ids: ["file-..."],
id: "asst_...",
instructions: "You are a research assistant.",
metadata: %{},
model: "gpt-4-1106-preview",
name: "My upgraded assistant"
object: "assistant",
tools: [%{"type" => "retrieval"}]
}}
See: https://platform.openai.com/docs/api-reference/assistants/modifyAssistant
Deletes an assistant.
OpenAI.assistants_delete("asst_...")
{:ok,
%{
deleted: true,
id: "asst_...",
object: "assistant.deleted"
}}
See: https://platform.openai.com/docs/api-reference/assistants/deleteAssistant
Retrieves the list of files associated with a particular assistant.
OpenAI.assistant_files("asst_...")
{:ok,
%{
data: [
%{
"assistant_id" => "asst_...",
"created_at" => 1699472933,
"id" => "file-...",
"object" => "assistant.file"
}
],
first_id: "file-...",
has_more: false,
last_id: "file-...",
object: "list"
}}
See: https://platform.openai.com/docs/api-reference/assistants/listAssistantFiles
Retrieves the list of files associated with a particular assistant, filtered by query params.
OpenAI.assistant_files("asst_...", order: "desc")
{:ok,
%{
data: [
%{
"assistant_id" => "asst_...",
"created_at" => 1699472933,
"id" => "file-...",
"object" => "assistant.file"
}
],
first_id: "file-...",
has_more: false,
last_id: "file-...",
object: "list"
}}
See: https://platform.openai.com/docs/api-reference/assistants/listAssistantFiles
Retrieves an assistant file by its id
OpenAI.assistant_file("asst_...", "file_...")
{:ok,
%{
assistant_id: "asst_...",
created_at: 1699472933,
id: "file-...",
object: "assistant.file"
}}
See: https://platform.openai.com/docs/api-reference/assistants/getAssistantFile
Attaches a previously uploaded file to the assistant.
OpenAI.assistant_file_create("asst_...", file_id: "file-...")
{:ok,
%{
assistant_id: "asst_...",
created_at: 1699472933,
id: "file-...",
object: "assistant.file"
}}
See: https://platform.openai.com/docs/api-reference/assistants/createAssistantFile
Detaches a file from the assistant. The file itself is not automatically deleted.
OpenAI.assistant_file_delete("asst_...", "file-...")
{:ok,
%{
deleted: true,
id: "file-...",
object: "assistant.file.deleted"
}}
See: https://platform.openai.com/docs/api-reference/assistants/deleteAssistantFile
Retrieves the list of threads. NOTE: At the time of this writing, this functionality remains undocumented by OpenAI.
OpenAI.threads()
{:ok,
%{
data: [
%{
"created_at" => 1699705727,
"id" => "thread_...",
"metadata" => %{"key_1" => "value 1", "key_2" => "value 2"},
"object" => "thread"
},
...
],
first_id: "thread_...",
has_more: false,
last_id: "thread_...",
object: "list"
}}
Retrieves the list of threads, filtered by query params. NOTE: At the time of this writing, this functionality remains undocumented by OpenAI.
OpenAI.threads(limit: 2)
{:ok,
%{
data: [
%{
"created_at" => 1699705727,
"id" => "thread_...",
"metadata" => %{"key_1" => "value 1", "key_2" => "value 2"},
"object" => "thread"
},
...
],
first_id: "thread_...",
has_more: false,
last_id: "thread_...",
object: "list"
}}
Creates a new thread with some messages and metadata.
messages = [
%{
role: "user",
content: "Hello, what is AI?",
file_ids: ["file-..."]
},
%{
role: "user",
content: "How does AI work? Explain it in simple terms."
}
]
metadata = %{
key_1: "value 1",
key_2: "value 2"
}
OpenAI.threads_create(messages: messages, metadata: metadata)
{:ok,
%{
created_at: 1699703890,
id: "thread_...",
metadata: %{"key_1" => "value 1", "key_2" => "value 2"},
object: "thread"
}}
See: https://platform.openai.com/docs/api-reference/threads/createThread
Creates a new thread and runs it.
messages = [
%{
role: "user",
content: "Hello, what is AI?",
file_ids: ["file-..."]
},
%{
role: "user",
content: "How does AI work? Explain it in simple terms."
}
]
thread_metadata = %{
key_1: "value 1",
key_2: "value 2"
}
thread = %{
messages: messages,
metadata: thread_metadata
}
run_metadata = %{
key_3: "value 3"
}
params = [
assistant_id: "asst_...",
thread: thread,
model: "gpt-4-1106-preview",
instructions: "You are an AI learning assistant.",
tools: [%{
"type" => "retrieval"
}],
metadata: run_metadata
]
OpenAI.threads_create_and_run(params)
{:ok,
%{
assistant_id: "asst_...",
cancelled_at: nil,
completed_at: nil,
created_at: 1699897907,
expires_at: 1699898507,
failed_at: nil,
file_ids: ["file-..."],
id: "run_...",
instructions: "You are an AI learning assistant.",
last_error: nil,
metadata: %{"key_3" => "value 3"},
model: "gpt-4-1106-preview",
object: "thread.run",
started_at: nil,
status: "queued",
thread_id: "thread_...",
tools: [%{"type" => "retrieval"}]
}}
See: https://platform.openai.com/docs/api-reference/runs/createThreadAndRun
Modifies an existing thread.
metadata = %{
key_3: "value 3"
}
OpenAI.threads_modify("thread_...", metadata: metadata)
{:ok,
%{
created_at: 1699704406,
id: "thread_...",
metadata: %{"key_1" => "value 1", "key_2" => "value 2", "key_3" => "value 3"},
object: "thread"
}}
See: https://platform.openai.com/docs/api-reference/threads/modifyThread
Deletes a thread.
OpenAI.threads_delete("thread_...")
{:ok,
%{
deleted: true,
id: "thread_...",
object: "thread.deleted"
}}
See: https://platform.openai.com/docs/api-reference/threads/deleteThread
Retrieves the list of messages associated with a particular thread.
OpenAI.thread_messages("thread_...")
{:ok,
%{
data: [
%{
"assistant_id" => nil,
"content" => [
%{
"text" => %{
"annotations" => [],
"value" => "How does AI work? Explain it in simple terms."
},
"type" => "text"
}
],
"created_at" => 1699705727,
"file_ids" => [],
"id" => "msg_...",
"metadata" => %{},
"object" => "thread.message",
"role" => "user",
"run_id" => nil,
"thread_id" => "thread_..."
},
...
],
first_id: "msg_...",
has_more: false,
last_id: "msg_...",
object: "list"
}}
See: https://platform.openai.com/docs/api-reference/messages/listMessages
Retrieves the list of messages associated with a particular thread, filtered by query params.
OpenAI.thread_messages("thread_...", after: "msg_...")
{:ok,
%{
data: [
%{
"assistant_id" => nil,
"content" => [
%{
"text" => %{
"annotations" => [],
"value" => "How does AI work? Explain it in simple terms."
},
"type" => "text"
}
],
"created_at" => 1699705727,
"file_ids" => [],
"id" => "msg_...",
"metadata" => %{},
"object" => "thread.message",
"role" => "user",
"run_id" => nil,
"thread_id" => "thread_..."
},
...
],
first_id: "msg_...",
has_more: false,
last_id: "msg_...",
object: "list"
}}
See: https://platform.openai.com/docs/api-reference/messages/listMessages
Retrieves a thread message by its id.
OpenAI.thread_message("thread_...", "msg_...")
{:ok,
%{
assistant_id: nil,
content: [
%{
"text" => %{"annotations" => [], "value" => "Hello, what is AI?"},
"type" => "text"
}
],
created_at: 1699705727,
file_ids: ["file-..."],
id: "msg_...",
metadata: %{},
object: "thread.message",
role: "user",
run_id: nil,
thread_id: "thread_..."
}}
See: https://platform.openai.com/docs/api-reference/messages/getMessage
Creates a message within a thread.
params = [
role: "user",
content: "Hello, what is AI?",
file_ids: ["file-9Riyo515uf9KVfwdSrIQiqtC"],
metadata: %{
key_1: "value 1",
key_2: "value 2"
}
]
OpenAI.thread_message_create("thread_...", params)
{:ok,
%{
assistant_id: nil,
content: [
%{
"text" => %{"annotations" => [], "value" => "Hello, what is AI?"},
"type" => "text"
}
],
created_at: 1699706818,
file_ids: ["file-..."],
id: "msg_...",
metadata: %{"key_1" => "value 1", "key_2" => "value 2"},
object: "thread.message",
role: "user",
run_id: nil,
thread_id: "thread_..."
}}
See: https://platform.openai.com/docs/api-reference/messages/createMessage
Modifies a message within a thread.
params = [
metadata: %{
key_3: "value 3"
}
]
OpenAI.thread_message_modify("thread_...", "msg_...", params)
{:ok,
%{
assistant_id: nil,
content: [
%{
"text" => %{"annotations" => [], "value" => "Hello, what is AI?"},
"type" => "text"
}
],
created_at: 1699706818,
file_ids: ["file-..."],
id: "msg_...",
metadata: %{"key_1" => "value 1", "key_2" => "value 2", "key_3" => "value 3"},
object: "thread.message",
role: "user",
run_id: nil,
thread_id: "thread_..."
}}
See: https://platform.openai.com/docs/api-reference/messages/modifyMessage
Retrieves the list of files associated with a particular message of a thread.
OpenAI.thread_message_files("thread_...", "msg_...")
{:ok,
%{
data: [
%{
"created_at" => 1699706818,
"id" => "file-...",
"message_id" => "msg_...",
"object" => "thread.message.file"
}
],
first_id: "file-...",
has_more: false,
last_id: "file-...",
object: "list"
}}
See: https://platform.openai.com/docs/api-reference/messages/listMessageFiles
Retrieves the list of files associated with a particular message of a thread, filtered by query params.
OpenAI.thread_message_files("thread_...", "msg_...", after: "file-...")
{:ok,
%{
data: [
%{
"created_at" => 1699706818,
"id" => "file-...",
"message_id" => "msg_...",
"object" => "thread.message.file"
}
],
first_id: "file-...",
has_more: false,
last_id: "file-...",
object: "list"
}}
See: https://platform.openai.com/docs/api-reference/messages/listMessageFiles
Retrieves the message file object.
OpenAI.thread_message_file("thread_...", "msg_...", "file-...")
{:ok,
%{
created_at: 1699706818,
id: "file-...",
message_id: "msg_...",
object: "thread.message.file"
}}
See: https://platform.openai.com/docs/api-reference/messages/getMessageFile
Retrieves the list of runs associated with a particular thread, filtered by query params.
OpenAI.thread_runs("thread_...", limit: 10)
{:ok, %{
data: [],
first_id: nil,
has_more: false,
last_id: nil,
object: "list"
}}
See: https://platform.openai.com/docs/api-reference/runs/listRuns
Retrieves a particular thread run by its id.
OpenAI.thread_run("thread_...", "run_...")
{:ok,
%{
assistant_id: "asst_J",
cancelled_at: nil,
completed_at: 1700234149,
created_at: 1700234128,
expires_at: nil,
failed_at: nil,
file_ids: [],
id: "run_",
instructions: "You are an AI learning assistant.",
last_error: nil,
metadata: %{"key_3" => "value 3"},
model: "gpt-4-1106-preview",
object: "thread.run",
started_at: 1700234129,
status: "expired",
thread_id: "thread_",
tools: [%{"type" => "retrieval"}]
}}
See: https://platform.openai.com/docs/api-reference/runs/getRun
Creates a run for a thread using a particular assistant.
params = [
assistant_id: "asst_...",
model: "gpt-4-1106-preview",
tools: [%{
"type" => "retrieval"
}]
]
OpenAI.thread_run_create("thread_...", params)
{:ok,
%{
assistant_id: "asst_...",
cancelled_at: nil,
completed_at: nil,
created_at: 1699711115,
expires_at: 1699711715,
failed_at: nil,
file_ids: ["file-..."],
id: "run_...",
instructions: "...",
last_error: nil,
metadata: %{},
model: "gpt-4-1106-preview",
object: "thread.run",
started_at: nil,
status: "queued",
thread_id: "thread_...",
tools: [%{"type" => "retrieval"}]
}}
See: https://platform.openai.com/docs/api-reference/runs/createRun
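Runs start in the "queued" status, so a common pattern is to poll thread_run/2 until the run settles. A minimal polling sketch (the module and function names are only illustrative):
defmodule MyApp.RunPoller do
  # Polls a run every second until it leaves the "queued"/"in_progress" states.
  def wait_for_run(thread_id, run_id) do
    {:ok, run} = OpenAI.thread_run(thread_id, run_id)

    if run.status in ["queued", "in_progress"] do
      Process.sleep(1_000)
      wait_for_run(thread_id, run_id)
    else
      run
    end
  end
end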
Modifies an existing thread run.
params = [
metadata: %{
key_3: "value 3"
}
]
OpenAI.thread_run_modify("thread_...", "run_...", params)
{:ok,
%{
assistant_id: "asst_...",
cancelled_at: nil,
completed_at: 1699711125,
created_at: 1699711115,
expires_at: nil,
failed_at: nil,
file_ids: ["file-..."],
id: "run_...",
instructions: "...",
last_error: nil,
metadata: %{"key_3" => "value 3"},
model: "gpt-4-1106-preview",
object: "thread.run",
started_at: 1699711115,
status: "expired",
thread_id: "thread_...",
tools: [%{"type" => "retrieval"}]
}}
See: https://platform.openai.com/docs/api-reference/runs/modifyRun
When a run has the status "requires_action" and required_action.type is submit_tool_outputs, this endpoint can be used to submit the outputs from the tool calls once they're all completed. All outputs must be submitted in a single request.
params = [
tool_outputs: [%{
tool_call_id: "call_abc123",
output: "test"
}]
]
OpenAI.thread_run_submit_tool_outputs("thread_...", "run_...", params)
{:ok,
%{
assistant_id: "asst_abc123",
cancelled_at: nil,
completed_at: nil,
created_at: 1699075592,
expires_at: 1699076192,
failed_at: nil,
file_ids: [],
id: "run_abc123",
instructions: "You tell the weather.",
last_error: nil,
metadata: %{},
model: "gpt-4",
object: "thread.run",
started_at: 1699075592,
status: "queued",
thread_id: "thread_abc123",
tools: [
%{
"function" => %{
"description" => "Determine weather in my location",
"name" => "get_weather",
"parameters" => %{
"properties" => %{
"location" => %{
"description" => "The city and state e.g. San Francisco, CA",
"type" => "string"
},
"unit" => %{"enum" => ["c", "f"], "type" => "string"}
},
"required" => ["location"],
"type" => "object"
}
},
"type" => "function"
}
]
}
}
See: https://platform.openai.com/docs/api-reference/runs/submitToolOutputs
Cancels an in_progress run.
OpenAI.thread_run_cancel("thread_...", "run_...")
{:ok,
%{
assistant_id: "asst_...",
cancelled_at: nil,
completed_at: 1699711125,
created_at: 1699711115,
expires_at: nil,
failed_at: nil,
file_ids: ["file-..."],
id: "run_...",
instructions: "...",
last_error: nil,
metadata: %{"key_3" => "value 3"},
model: "gpt-4-1106-preview",
object: "thread.run",
started_at: 1699711115,
status: "expired",
thread_id: "thread_...",
tools: [%{"type" => "retrieval"}]
}}
See: https://platform.openai.com/docs/api-reference/runs/cancelRun
Retrieves the list of steps associated with a particular run of a thread.
OpenAI.thread_run_steps("thread_...", "run_...")
{:ok,
%{
data: [
%{
"assistant_id" => "asst_...",
"cancelled_at" => nil,
"completed_at" => 1699897927,
"created_at" => 1699897908,
"expires_at" => nil,
"failed_at" => nil,
"id" => "step_...",
"last_error" => nil,
"object" => "thread.run.step",
"run_id" => "run_...",
"status" => "completed",
"step_details" => %{
"message_creation" => %{"message_id" => "msg_..."},
"type" => "message_creation"
},
"thread_id" => "thread_...",
"type" => "message_creation"
}
],
first_id: "step_...",
has_more: false,
last_id: "step_...",
object: "list"
}}
See: https://platform.openai.com/docs/api-reference/runs/listRunSteps
Retrieves the list of steps associated with a particular run of a thread, filtered by query params.
OpenAI.thread_run_steps("thread_...", "run_...", order: "asc")
{:ok,
%{
data: [
%{
"assistant_id" => "asst_...",
"cancelled_at" => nil,
"completed_at" => 1699897927,
"created_at" => 1699897908,
"expires_at" => nil,
"failed_at" => nil,
"id" => "step_...",
"last_error" => nil,
"object" => "thread.run.step",
"run_id" => "run_...",
"status" => "completed",
"step_details" => %{
"message_creation" => %{"message_id" => "msg_..."},
"type" => "message_creation"
},
"thread_id" => "thread_...",
"type" => "message_creation"
}
],
first_id: "step_...",
has_more: false,
last_id: "step_...",
object: "list"
}}
See: https://platform.openai.com/docs/api-reference/runs/listRunSteps
Retrieves a thread run step by its id.
OpenAI.thread_run_step("thread_...", "run_...", "step_...")
{:ok,
%{
assistant_id: "asst_...",
cancelled_at: nil,
completed_at: 1699897927,
created_at: 1699897908,
expires_at: nil,
failed_at: nil,
id: "step_...",
last_error: nil,
object: "thread.run.step",
run_id: "run_...",
status: "completed",
step_details: %{
"message_creation" => %{"message_id" => "msg_..."},
"type" => "message_creation"
},
thread_id: "thread_...",
type: "message_creation"
}}
See: https://platform.openai.com/docs/api-reference/runs/getRunStep
The following APIs are deprecated but still supported by the library for retrocompatibility with older versions. If you are using them, consider removing them from your project as soon as possible!
Note: from version 0.5.0 the search, answers and classifications APIs are no longer supported (since they have been removed by OpenAI); if you still need them, consider using v0.4.2.
Get the list of available engines
OpenAI.engines()
{:ok, %{
"data" => [
%{"id" => "davinci", "object" => "engine", "max_replicas": ...},
...,
...
]
}
See: https://beta.openai.com/docs/api-reference/engines/list
Retrieve specific engine info
OpenAI.engines("davinci")
{:ok, %{
"id" => "davinci",
"object" => "engine",
"max_replicas": ...
}
}
See: https://beta.openai.com/docs/api-reference/engines/retrieve
The package is available as open source under the terms of the MIT License.