0.6.0
New Features
- Support custom LLMs through Ollama and OpenAI compatible APIs by @cyyeh in #376 and #457 (a minimal usage sketch follows this list)
- Support MS SQL Server by @onlyjackfrost in #443
- Enhance schema change feature and detect more affected semantic layer resources by @fredalai in #442
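For context on the custom LLM support above: an "OpenAI compatible API" is any endpoint that speaks the OpenAI chat completions protocol, which is what Ollama exposes locally. The snippet below is a minimal sketch of that protocol using the standard `openai` Python client; the base URL, model name, and api_key placeholder are assumptions for a local Ollama instance and are not part of Wren AI's own configuration.

```python
# Minimal sketch: calling an OpenAI compatible endpoint (here, a local Ollama
# instance) with the standard openai client. The base_url, model name, and
# api_key value are placeholders for illustration only.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI compatible endpoint (assumed default port)
    api_key="ollama",                      # Ollama ignores the key, but the client requires a value
)

response = client.chat.completions.create(
    model="llama3",  # any model already pulled into Ollama
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)
```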
Fixes and Chores
- Implement Engine Adapter in ai service by @paopa in #440
- Fix some minor bugs in ai service by @cyyeh in #445, #449 and #451
- Improve the logging mechanism in ai service by @cyyeh in #447
- Support force deploy in wren ui by @onlyjackfrost in #452
- Overwrite docker compose file if using custom llm by @onlyjackfrost in #458
Maintenance and Documentation
- Change our official naming from WrenAI to Wren AI by @chilijung in #441
- Add log collecting instruction in bug report template by @onlyjackfrost in #438
- Add blog post link in README.md by @chilijung in #456
Notes and known issues:
- Wren AI now supports using customized LLMs. To save users from having to delete the Qdrant container and volume themselves, we clear and reinitialize the vector database every time you start/restart the ai service. The reinitialization process may occasionally fail, in which case you might have to run the "deploy" action again
- Wren AI now supports MS SQL Server as a data source, but due to issues related to Ibis and Sqlglot, users might run into problems when creating calculated fields. Canner/wren-engine#632 (a minimal connection sketch follows below)
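For reference, the MS SQL Server support goes through the Ibis MSSQL backend, which is the layer the linked issue concerns. The sketch below shows that connection path in isolation; the host, credentials, and database name are placeholders, and Wren AI manages this connection internally, so this is only an illustration, not a setup step.

```python
# Minimal sketch of the Ibis MSSQL backend that the MS SQL Server support
# relies on; host, credentials, and database are placeholders.
import ibis

con = ibis.mssql.connect(
    host="localhost",
    port=1433,
    user="sa",
    password="your_password",
    database="master",
)
print(con.list_tables())  # basic queries work; calculated fields may hit the issue above
```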
Full Changelog: 0.5.0...0.6.0