Releases: getcellm/cellm
v0.1.1
This is a minor release that mainly fixes a bug where API keys changed via the UI or appsettings.Local.json would not get picked up until Excel was restarted.
What's Changed
- docs: Add CLA by @kaspermarstal in #102
- docs: Fix README typos by @kaspermarstal in #103
- build: Add global.json to specify .NET SDK version 9.X.X by @johnnyoshika in #101
- fix: Changing API keys while app is running by @kaspermarstal in #109
Full Changelog: v0.1.0...v0.1.1
v0.1.0
Release v0.1.0
Cellm is an Excel extension that lets you use Large Language Models (LLMs) like ChatGPT in cell formulas. It is designed for automating repetitive text-based tasks and comes with:
- Local and hosted models: Defaults to free local inference (Gemma 2 2B via Ollama) while supporting commercial APIs
- Formula-driven workflow: `=PROMPT()` and `=PROMPTWITH()` functions for drag-and-fill operations across cell ranges.
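For example, a formula like the following can be dragged down a column to process each row. The cell references and prompt text are illustrative, and the sketch assumes `=PROMPTWITH()` takes a provider/model string as its first argument:

```
=PROMPT(A1, "Extract the company name from this text")
=PROMPTWITH("openai/gpt-4o-mini", A1, "Extract the company name from this text")
```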
Install
- Download `Cellm-AddIn64-packed.xll` and `appsettings.json` and put them in the same folder.
- Double-click on `Cellm-AddIn64-packed.xll`. Excel will open and install Cellm.
- Download and install Ollama. Cellm uses Ollama and the Gemma 2 2B model by default. Gemma 2 2B will be downloaded automatically the first time you call `=PROMPT()`. To call other models, see the Models section in the README.
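To use a hosted provider instead of the local Ollama default, API keys can be configured via `appsettings.Local.json` placed next to the add-in (as referenced in the v0.1.1 notes). The section and key names below are a hypothetical sketch, not the actual schema — see the README for the exact format:

```json
{
  "OpenAiConfiguration": {
    "ApiKey": "sk-..."
  }
}
```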
Uninstall
- In Excel, go to File > Options > Add-Ins.
- In the `Manage` drop-down menu, select `Excel Add-ins` and click `Go...`.
- Uncheck `Cellm-AddIn64-packed.xll` and click `OK`.
Known Limitations
- Windows-only: No macOS/Linux support planned for initial versions
- Input constraints:
  - Formula arguments limited to 8,192 characters (Excel string limit)
  - No native support for multi-turn conversations
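Since multi-turn conversations are not natively supported, one workaround is to chain single-turn calls by feeding one cell's output into the next prompt. The cell references and prompts below are illustrative:

```
B1: =PROMPT(A1, "Summarize this report in three bullet points")
C1: =PROMPT(B1, "Translate the summary into French")
```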
- Model variability: Output quality depends on selected LLM (validate critically)
Contribution & Feedback
Report issues or suggest improvements via GitHub Issues.
License: Fair Core License
Full Documentation: README
What's Changed
- feat: Add LlamafileClient by @kaspermarstal in #1
- bug: Fix AddSystemMessage by @kaspermarstal in #4
- bug: Fix llamafile health uri by @kaspermarstal in #3
- models: Add qwen-0.5b by @kaspermarstal in #2
- docs: Tighten up README by @kaspermarstal in #6
- feat: Manually dispose of ServiceLocator by @kaspermarstal in #7
- bug: By default assign telemetry to default model of provider, not default model of Cellm; refactor: Rename GoogleClient to GoogleAiClient by @kaspermarstal in #8
- docs: Add support for Mistral by @kaspermarstal in #9
- bug: Disable sentry by default until fix for missing immutable arrays is identified by @kaspermarstal in #11
- feat: Add concurrency rate limiting by @kaspermarstal in #10
- feat: Add support for running multiple Llamafiles simultaneously by @kaspermarstal in #12
- build: Enforce code style in build by @kaspermarstal in #13
- git: Add Excel files to .gitignore by @kaspermarstal in #14
- docs: Improve README by @kaspermarstal in #15
- Prompt: Further optimize system prompt for small models with limited instruction-following capability. Larger models will understand anyway by @kaspermarstal in #16
- docs: Proof-read README.md by @kaspermarstal in #17
- feat: Add support for OpenAI tools by @kaspermarstal in #18
- feat: Upgrade default Anthropic model to claude-3-5-sonnet-20241022 by @kaspermarstal in #20
- refactor: Add provider enum by @kaspermarstal in #19
- refactor: Rename CellmFunctions to Functions and CellPrompts to SystemMessages by @kaspermarstal in #21
- ci: Add conventional commits lint by @kaspermarstal in #22
- build: Make internals visible to Cellm.Tests by @kaspermarstal in #23
- refactor: Pull out XllPath into settable property by @kaspermarstal in #24
- feat: Add SentryBehavior and CachingBehavior to model request pipeline by @kaspermarstal in #25
- refactor: Splot Tools into ToolRunner and ToolFactory by @kaspermarstal in #26
- feat: Upgrade Claude 3.5 Sonnet by @kaspermarstal in #27
- refactor: Remove superfluous interfaces by @kaspermarstal in #28
- feat: Add FileReader tool by @kaspermarstal in #29
- ci: Add dependabot by @kaspermarstal in #36
- build(deps): bump Microsoft.Extensions.Caching.Memory from 8.0.0 to 8.0.1 in /src/Cellm by @dependabot in #38
- ci: Disable commitlint for dependabot by @kaspermarstal in #43
- build(deps): bump Sentry.Extensions.Logging from 4.10.2 to 4.12.1 by @dependabot in #39
- ci: Disable conventional commits lint by @kaspermarstal in #44
- build(deps): bump Sentry.Profiling from 4.10.2 to 4.12.1 by @dependabot in #41
- build(deps): bump Microsoft.Extensions.Configuration.Json from 8.0.0 to 8.0.1 by @dependabot in #42
- fix: Parse tool description attributes by @kaspermarstal in #45
- feat: Add support for prompt as single string argument by @kaspermarstal in #50
- fix: CacheBehavior caches value even when model request is mutated downstream by @kaspermarstal in #56
- ci: Run dotnet format with restore by @kaspermarstal in #58
- fix: Remove Sentry metrics which were deprecated by @kaspermarstal in #59
- build(deps): bump Microsoft.Extensions.Configuration, Microsoft.Extensions.DependencyInjection, Microsoft.Extensions.Logging.Console, Microsoft.Extensions.Options and Microsoft.Extensions.Options.ConfigurationExtensions by @dependabot in #51
- fix: Increase timeouts for local LLMs by @kaspermarstal in #57
- build(deps): bump Microsoft.Extensions.Caching.Memory and Microsoft.Extensions.Options by @dependabot in #54
- build: Target .NET 8.0 by @kaspermarstal in #62
- feat: Replace GoogleAI provider with Google's OpenAI compatible endpoint by @kaspermarstal in #63
- feat: Update Llamafile version, add larger Llamafile models by @kaspermarstal in #64
- refactor: Clean up src/Cellm/AddIn by @kaspermarstal in #65
- feat: Use Microsoft.Extensions.AI by @kaspermarstal in #66
- feat: Add Ollama provider by @kaspermarstal in #67
- fix: Ollama provider by @kaspermarstal in #69
- build: Remove json schema deps no longer needed by @kaspermarstal in #72
- feat: Add HybridCache, remove MemoryCache by @kaspermarstal in #73
- feat: Add OpenAiCompatible chat client by @kaspermarstal in #74
- refactor: Models by @kaspermarstal in #75
- build: Target .NET 9 and update deps by @kaspermarstal in #83
- refactor: Use ExcelAsyncUtil to run task by @kaspermarstal in #84
- feat: Remove support for embedded Ollama and Llamafile servers by @kaspermarstal in #85
- build: Copy appsettings.Local.json to bin dir only if exists by @kaspermarstal in #86
- docs: Fix appsettings.Local.*.json examples and update readme to match by @kaspermarstal in #87
- bug: Fix OpenAiCompatible API key by @kaspermarstal in #94
- feat: Add Mistral and DeepSeek providers by @kaspermarstal in #95
- feat: Add Ribbon UI by @kaspermarstal in #96
- feat: Auto-download Ollama models by @kaspermarstal in #98
- docs: Update README.md with installations instructions via Release page by @kaspermarstal in #100
Full Changelog: https://github.com/getcellm/cellm/commits/v0.1.0