This project provides a personalized AI assistant that can be used through a CLI or as a VSCode extension. It supports multiple LLM API providers and offers various tools and commands for enhanced interaction.
Before running the AI assistant, ensure that:
- A `preferences.yaml` file exists at the target location. This controls which providers and models are available to aidev.

  ```bash
  # Create the preferences directory
  mkdir -p ~/.config/aidev

  # Move the sample preferences to the target location
  cp preferences.yaml.sample ~/.config/aidev/preferences.yaml
  ```

  The default location for this file is `~/.config/aidev/`, but you can override this location by setting the `AIDEV_PREFERENCES_DIR` environment variable:

  ```bash
  export AIDEV_PREFERENCES_DIR="/custom/path/to/preferences/dir"
  ```
- API keys are configured for your chosen LLM provider(s). Store your API keys in `~/.config/aidev/keys/` following this convention:

  ```bash
  # Create the config directory
  mkdir -p ~/.config/aidev/keys

  # Store your API keys (one per file)
  echo 'your-api-key' > ~/.config/aidev/keys/openai.key
  chmod 600 ~/.config/aidev/keys/*.key # Secure the files
  ```

  By default, keys are stored in `~/.config/aidev/keys`, but you can override this location by setting the `AIDEV_KEY_DIR` environment variable:

  ```bash
  export AIDEV_KEY_DIR="/custom/path/to/keys/dir"
  ```

  Key files for each provider are required to use the provider's configured models:

  - `anthropic.key`
  - `openai.key`
  - `google.key`
  - `groq.key`
- The `code` command is available on your PATH. This is typically installed with Visual Studio Code.
- For the VSCode extension to work properly, the command used to run the CLI must be aliased to `ai`. Add this alias to your shell configuration file:

  ```bash
  alias ai='node /path/to/aidev/dist/cli.js'
  ```

  Replace `/path/to/aidev` with the path to this project. A quick verification sketch follows this list.
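Once these prerequisites are in place, a quick sanity check (a minimal sketch, assuming a zsh setup and that the alias was added to `~/.zshrc`; the project path is a placeholder) might look like:

```bash
# Confirm the `code` command is on PATH (installed with Visual Studio Code)
code --version

# Reload your shell configuration so the `ai` alias is picked up
source ~/.zshrc

# Launch the assistant from the root of the project you want to work on
cd /path/to/your/project
ai
```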
To build and run the latest from source:

```bash
yarn
yarn dev
```

Or:

```bash
yarn
yarn build
node ./dist/cli.js "$@"
```
To build and install the VSCode extension:

```bash
yarn
yarn vsix
```

In VSCode, select `Extensions: Install from VSIX` from the command palette and select the `ai-0.0.0.vsix` payload produced by the command above.
Available LLM API providers:
- Anthropic
- OpenAI
- Groq
- Ollama
Tools available to the LLMs:

- `shell_execute`: Execute a zsh command.
- `read_directories`: Reads the contents of directories into the conversation.
- `read_files`: Reads the contents of files into the conversation.
- `write_file`: Writes contents to a file.
- `edit_file`: Modifies the contents of an existing file.
The following meta commands have special behaviors to manipulate the context or conversation. All other user input is added as a user message to the conversation.
- `:branch <branch>`: Create a new branch in the conversation.
- `:clear`: Clear all messages in the conversation.
- `:continue`: Re-prompt the model without a user message.
- `:exit`: Exit the conversation.
- `:help`: Display available commands.
- `:load <patterns, ...>`: Load file contents into the conversation.
- `:loaddir <patterns, ...>`: Load directory entries into the conversation.
- `:model <model>`: Change the model backing the assistant.
- `:models`: List available models.
- `:prompt`: Draft a prompt in VSCode.
- `:redo`: Redo the last undone action.
- `:remove <branch>`: Remove a branch and all its messages.
- `:rename <from_branch> <to_branch>`: Rename an existing branch.
- `:rollback <savepoint>`: Roll back to a previously defined savepoint.
- `:save`: Save the conversation to a file.
- `:savepoint <name>`: Mark a point in the conversation to which you can roll back.
- `:shell`: Run a shell command and include its output in the conversation.
- `:status`: Show the branch topology of the conversation.
- `:switch <branch>`: Switch to an existing branch.
- `:undo`: Undo the last user action.
- `:unload [<patterns, ...>]`: Remove matching files or directories from the conversation.
- `:unstash [<patterns, ...>]`: Remove stashed file(s) from the chat context.
- `:write <path>`: Write a stashed file to disk.
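As a sketch of how these commands compose (the savepoint name and file path below are hypothetical), you might load a file, mark a savepoint before a risky request, and roll back if the result isn't what you wanted:

```
:load src/index.ts
:savepoint before-refactor
<ask the model to attempt a large refactor>
:rollback before-refactor
:unload src/index.ts
```

Branching works similarly: `:branch` forks the conversation, `:status` shows the topology, and `:switch` moves to an existing branch.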
The root of your project (where you launch the CLI) can contain special files to configure the behavior of the assistant:
- `aidev.system`: Will be injected verbatim into the system prompt.
- `aidev.ignore`: Can contain gitignore-like patterns that will filter out files or directories from being read into the context.
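For example (the file contents below are illustrative only), a project root might set up both files like this:

```bash
# Inject project-specific guidance verbatim into the system prompt
cat > aidev.system <<'EOF'
This is a TypeScript monorepo; prefer yarn over npm and keep changes minimal.
EOF

# Keep bulky or generated paths out of the context (gitignore-like patterns)
cat > aidev.ignore <<'EOF'
node_modules/
dist/
*.vsix
EOF
```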
The VSCode extension provides the following features:
- "Open aidev" command in the VSCode command palette.
- "Open aidev with a specific model" command in the VSCode command palette.
- Quick access to "Open aidev" using the keyboard shortcut Command+Shift+I.
- Automatic injection of open editor files into the conversation context when using the extension.
These commands and features make it easy to interact with the AI assistant directly from your VSCode environment.