# LiteLLM Config Generator

A helper Python program that generates a configuration file for the LiteLLM proxy.


Provide a LiteLLM proxy configuration file in YAML with the `model_list` section removed, then run this program as described below to have the `model_list` filled in from templates. A sketch of the idea is shown below.
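For illustration only, a minimal `litellm-template.yaml` might look like the following (a hypothetical example; the exact keys depend on your own proxy setup):

```yaml
# litellm-template.yaml — hypothetical LiteLLM proxy config with model_list removed
general_settings:
  master_key: os.environ/LITELLM_MASTER_KEY
litellm_settings:
  drop_params: true
```

from which the generator produces a full configuration whose `model_list` is filled with entries such as:

```yaml
model_list:
  - model_name: gpt-4o                     # example entry generated from a template
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
```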

## Usage

### 1. Environment Setup

If you have both VSCode and Docker installed, you can use the `.devcontainer` directory to set up a dev container. It contains all the necessary Python-related plugins to help you modify this project. (It is also the environment I used to develop this project.)

Otherwise, install a recent Python 3.x on your system.

Follow the steps below to set up the environment:

1. First, create a virtual environment:

   ```bash
   python3 -m venv .venv
   ```

2. Activate the virtual environment:

   On Windows:

   ```bash
   .venv\Scripts\activate
   ```

   On Unix or macOS:

   ```bash
   source .venv/bin/activate
   ```

3. Install the required packages:

   ```bash
   pip install -r requirements.txt
   ```

### 2. Configuration

Copy the `config.example.yaml` file to `config.yaml`, and the `litellm-template.example.yaml` file to `litellm-template.yaml`.
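For example, on Unix-like systems:

```bash
cp config.example.yaml config.yaml
cp litellm-template.example.yaml litellm-template.yaml
```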

Fill in the `config.yaml` file with your own configuration.

The `litellm-template.yaml` file is where you put your LiteLLM configuration with the `model_list` section removed.

### 3. Modify the template to suit your needs

The `config_generator/src/component/model_poper.py` file contains the logic to generate the `model_list`. It delegates the generation to the different `AbstractLLMPoper` implementations defined in the `config_generator/src/component/llm_poper` directory.

So far we have the following `AbstractLLMPoper` implementations:

| Provider | Implementation | Supports Fetching Model List |
| --- | --- | --- |
| OpenAI | `config_generator/src/component/llm_poper/openai.py` | Yes |
| Google | `config_generator/src/component/llm_poper/google.py` | Yes |
| Anthropic | `config_generator/src/component/llm_poper/anthropic.py` | No |
| Mistral | `config_generator/src/component/llm_poper/mistral.py` | Yes |
| Groq | `config_generator/src/component/llm_poper/groq.py` | Yes |
| GitHub Copilot | `config_generator/src/component/llm_poper/copilot.py` | No |
| TogetherAI | `config_generator/src/component/llm_poper/togetherai.py` | Yes |
| OpenRouter | `config_generator/src/component/llm_poper/openrouter.py` | Yes |

You may be interested in modifying the template in each `AbstractLLMPoper` implementation to suit your needs.
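For orientation, here is a minimal, hypothetical sketch of the delegation pattern described above. The class and method names other than `AbstractLLMPoper` (`pop_models`, `OpenAIPoper`, `build_model_list`) are illustrative only; consult the actual files listed in the table for the real interfaces.

```python
from abc import ABC, abstractmethod

# Hypothetical sketch of the delegation pattern; names besides
# AbstractLLMPoper are illustrative, not the project's actual API.
class AbstractLLMPoper(ABC):
    @abstractmethod
    def pop_models(self) -> list[dict]:
        """Return model_list entries for one provider."""


class OpenAIPoper(AbstractLLMPoper):
    def pop_models(self) -> list[dict]:
        # A provider that supports fetching could query its API here;
        # this sketch just returns a hard-coded template entry.
        return [{
            "model_name": "gpt-4o",
            "litellm_params": {
                "model": "openai/gpt-4o",
                "api_key": "os.environ/OPENAI_API_KEY",
            },
        }]


def build_model_list(popers: list[AbstractLLMPoper]) -> list[dict]:
    # model_poper.py delegates to each implementation and concatenates results.
    entries: list[dict] = []
    for poper in popers:
        entries.extend(poper.pop_models())
    return entries
```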

### 4. Run the program

```bash
./run.sh
```

The output LiteLLM configuration file will be saved to the location specified by `io.output-file`.
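As an illustration only (the authoritative schema is whatever `config.example.yaml` defines), `io.output-file` presumably corresponds to a nested key in `config.yaml` along these lines:

```yaml
# config.yaml — hypothetical sketch; see config.example.yaml for the real schema
io:
  output-file: litellm-config.yaml  # where the generated LiteLLM config is written
```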