...
- New Adapters: Anthropic, Gemini, xAI, Groq, Ollama, and VLLM, each providing specialized support for its respective LLM API.
- `:json_schema` Mode: The OpenAI adapter (and others) now supports a `:json_schema` mode for more structured JSON outputs.
- `Instructor.Extras.ChainOfThought`: A new module to guide multi-step reasoning processes with partial returns and final answers.
- Enhanced Streaming: More robust partial/array streaming pipelines, plus improved SSE-based parsing for streamed responses.
- Re-ask/Follow-up Logic: Adapters can now re-ask the LLM to correct invalid JSON responses when `max_retries` is set.
- OpenAI Adapter Refactor: A major internal refactor for more flexible streaming modes, additional “response format” options, and better error handling.
- Ecto Dependency: Updated from `3.11` to `3.12`.
- Req Dependency: Now supports `~> 0.5` or `~> 1.0`.
- Schema Documentation via `@doc`: Schemas that use `@doc` to send instructions to the LLM will now emit a warning. Please migrate to `@llm_doc`, provided via `use Instructor`.
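As a sketch of the migration (the module and fields here are illustrative, not from the release notes), replace `@doc` with `@llm_doc` and pull in `use Instructor`:

```elixir
defmodule UserInfo do
  use Ecto.Schema
  # `use Instructor` provides the @llm_doc attribute
  use Instructor

  # Previously this instruction lived in @doc; @llm_doc now carries it to the LLM
  @llm_doc """
  A user's basic information, extracted from free-form text.
  """
  @primary_key false
  embedded_schema do
    field(:name, :string)
    field(:age, :integer)
  end
end
```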
- Some adapter configurations now require specifying an `:api_path` or `:auth_mode`. Verify that your adapter config matches the new format.
- The OpenAI adapter’s `:json_schema` mode strips unsupported fields (e.g., `format`, `pattern`) from schemas before sending them to the LLM.
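For illustration, an adapter config under the new format might look like the following. The exact keys and values here are assumptions; check your adapter’s documentation for the supported `:api_path` and `:auth_mode` values:

```elixir
# config/config.exs -- illustrative sketch, not a verbatim reference
config :instructor,
  adapter: Instructor.Adapters.OpenAI,
  openai: [
    api_key: System.get_env("OPENAI_API_KEY"),
    # Some adapters now require these; the values below are assumptions
    api_path: "/v1/chat/completions",
    auth_mode: :bearer
  ]
```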
- Various improvements to JSON parsing and streaming handling, including better handling of partial/invalid responses.
- Support for the together.ai inference server
- Support for the Ollama local inference server
- GPT-4 Vision support
- Added `:json` and `:md_json` modes to support more models and inference servers
- Changed the default HTTP settings and where they are stored.

  Before:

  ```elixir
  config :openai, http_options: [...]
  ```

  After:

  ```elixir
  config :instructor, :openai, http_options: [...]
  ```
- Reworked the OpenAI client to allow for better control of default settings and to reduce dependencies
## v0.0.4 - 2024-01-15
- `Instructor.Adapters.Llamacpp` for running Instructor against local LLMs
- `use Instructor.EctoType` for supporting custom Ecto types
- More documentation
- Bug fixes in the Ecto → JSON Schema → GBNF grammar pipeline, with better tests added
## v0.0.3 - 2024-01-10
- Schemaless Ecto support
- `response_model: {:partial, Model}` partial streaming mode
- `response_model: {:array, Model}` record streaming mode
- Fixed a bug when handling nested module names
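As a sketch, partial streaming can be used roughly like this (assuming a `UserInfo` embedded schema; the exact shape of the streamed elements may differ by version):

```elixir
# Streams incrementally populated UserInfo structs as tokens arrive
Instructor.chat_completion(
  model: "gpt-3.5-turbo",
  stream: true,
  response_model: {:partial, UserInfo},
  messages: [%{role: "user", content: "John Doe is 42 years old"}]
)
|> Enum.each(fn
  {:partial, partial} -> IO.inspect(partial, label: "partial")
  {:ok, user} -> IO.inspect(user, label: "final")
end)
```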
## v0.0.2 - 2023-12-30
- `use Instructor.Validator` for validation callbacks on your Ecto schemas
- `max_retries:` option to re-ask the LLM to fix any validation errors
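A minimal sketch of the two features together (the schema, regex, and model name are illustrative):

```elixir
defmodule Email do
  use Ecto.Schema
  use Instructor.Validator

  @primary_key false
  embedded_schema do
    field(:address, :string)
  end

  # Called on every LLM response; errors here trigger a re-ask
  @impl true
  def validate_changeset(changeset) do
    Ecto.Changeset.validate_format(changeset, :address, ~r/@/)
  end
end

# Re-asks the LLM up to twice when validation fails:
# Instructor.chat_completion(
#   model: "gpt-3.5-turbo",
#   response_model: Email,
#   max_retries: 2,
#   messages: [%{role: "user", content: "Extract the email from: jane@example.com"}]
# )
```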
## v0.0.1 - 2023-12-19
- Structured prompting with LLMs using Ecto