Context window formatting #16

Open
Keyrxng opened this issue Oct 25, 2024 · 1 comment
Keyrxng commented Oct 25, 2024

This plugin was designed with a very clear structure for context window formatting, per the V1 spec, and it was and still is implemented to spec (as verified by the unit tests) via these functions.
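
For reference, the structured output the spec describes (visible in the formatted chat further down) can be sketched with helpers along these lines. This is a hypothetical sketch; the function names are illustrative, not the plugin's actual exports:

```typescript
// Hypothetical sketch of the V1 formatting helpers; names are illustrative,
// not the plugin's real function names.
function sectionHeader(kind: string, issueNumber: number, repoPath?: string): string {
  // e.g. "=== Current Issue #10 Specification === ubq-testing/ask-plugin/10 ==="
  const suffix = repoPath ? ` === ${repoPath} ===` : " ===";
  return `=== Current Issue #${issueNumber} ${kind}${suffix}`;
}

function sectionFooter(kind: string, issueNumber: number): string {
  // e.g. "=== End Current Issue #10 Specification ==="
  return `=== End Current Issue #${issueNumber} ${kind} ===`;
}

console.log(sectionHeader("Specification", 10, "ubq-testing/ask-plugin/10"));
// === Current Issue #10 Specification === ubq-testing/ask-plugin/10 ===
console.log(sectionFooter("Specification", 10));
// === End Current Issue #10 Specification ===
```

The point of the headers and dividers is that each section of the context window is unambiguously delimited for the model.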

While building #14 just now, I noticed that what is actually being fed to the model has been stripped of all this formatting and structure; the comparison is below.

  • The readme should have its own section within the context window like the others; it's currently inside the conversational block.
  • I assume the headers, dividers, and structure should be restored so that the model is fed the structured version?
  • `ubiquibot-configyml yml plugins - uses - plugin link0 with model openaibaseurl` is the code block from the readme, the plugin install YAML, which should read:
```yml
plugins:
  - uses:
      - plugin: http://localhost:4000
        with:
          model: ""
          openAiBaseUrl: ""
```
  1. We've lost the URL in `- plugin: http://localhost:4000` (it became `link0`).
  2. In `ubiquibot-configyml yml`, the period was removed, and `yml` appears twice because the backticks (including the ` ```yml ` fence) were stripped.
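
The garbling pattern above looks like what an aggressive plain-text normalization pass produces: lowercasing, backtick/punctuation stripping, URL placeholders, and whitespace collapsing. A minimal sketch that reproduces it, with purely illustrative names (this is an assumption about the cause, not the plugin's actual code):

```typescript
// Hypothetical normalizer reproducing the garbling described above.
// All names here are illustrative.
function stripContext(text: string): string {
  let linkIndex = 0;
  return text
    .replace(/https?:\/\/\S+/g, () => `link${linkIndex++}`) // URLs become indexed placeholders
    .toLowerCase()
    .replace(/[`*_#>"']/g, "")    // markdown syntax and quotes removed
    .replace(/[.:,()[\]{}]/g, "") // punctuation removed (dashes survive)
    .replace(/\s+/g, " ")         // newlines and indentation collapsed
    .trim();
}

// The readme's install snippet, as it appears before stripping:
const readmeInstall = [
  "`ubiquibot-config.yml`:",
  "",
  "```yml",
  "plugins:",
  "  - uses:",
  "      - plugin: http://localhost:4000",
  "        with:",
  '          model: ""',
  '          openAiBaseUrl: ""',
  "```",
].join("\n");

console.log(stripContext(readmeInstall));
// → ubiquibot-configyml yml plugins - uses - plugin link0 with model openaibaseurl
```

This reproduces both symptoms exactly: the period in `ubiquibot-config.yml` is dropped (merging it with the duplicated fence-language token `yml`), and the plugin URL collapses to `link0`.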

What's being fed to the model:

```
current issue 10 specification ubq-testing/ask-plugin/10 test end current issue 10 specification current issue 10 conversation ubq-testing/ask-plugin 10 2438352893 keyrxng ubqbot test 2614664090 keyrxng test 2614664090 keyrxng ubiquity-os/command-ask this is a highly context aware github organization integrated bot that uses the openai gpt-4o model to provide highly relevant answers to questions and queries in github issues and pull requests usage in any repository where your ubiquity os app is installed both issues and pull requests alike you simply mention ubiquityos with your question or query and using the latest openai gpt-4o model the bot will provide you with a highly relevant answer how it works with its huge context window we are able to feed the entire conversational history to the model which we obtain by recursively fetching any referenced issues or pull requests from the chat history this allows the model to have a very deep understanding of the current scope and provide highly relevant answers as it receives everything from discussions to pull request diffs and review comments it is a highly versatile and capable bot that can assist in a wide range of scenarios installation ubiquibot-configyml yml plugins - uses - plugin link0 with model openaibaseurl devvars for local testing to use the openrouter api for fetching chat history set the openrouterapikey in the devvars file and specify the openaibase url in the ubiquibot-configyml file alternatively you can set the openaiapikey in the devvars file sh openaiapikeyyouropenaiapikey supabaseurlyoursupabaseurl supabasekeyyoursupabasekey voyageaiapikeyyourvoyageaiapikey openrouterapikeyyouropenrouterapikey ubiquityosappnameubiquityos testing sh yarn test end current issue 10 conversation
```

The formatted chat:

This is logged before the changes happen.

```
FORMATTED CHAT:
=== Current Issue #10 Specification === ubq-testing/ask-plugin/10 ===

test
=== End Current Issue #10 Specification ===

=== Current Issue #10 Conversation === ubq-testing/ask-plugin #10 ===

2438352893 Keyrxng: @ubqbot test
2614664090 Keyrxng: test
2614664090 Keyrxng: # @ubiquity-os/command-ask

This is a highly context aware GitHub organization integrated bot that uses the OpenAI GPT-4o model to provide highly relevant answers to questions and queries in GitHub issues and pull requests.

Usage

In any repository where your Ubiquity OS app is installed, both issues and pull requests alike, you simply mention @UbiquityOS with your question or query and using the latest OpenAi GPT-4o model, the bot will provide you with a highly relevant answer.

How it works

With it's huge context window, we are able to feed the entire conversational history to the model which we obtain by recursively fetching any referenced issues or pull requests from the chat history. This allows the model to have a very deep understanding of the current scope and provide highly relevant answers.

As it receives everything from discussions to pull request diffs and review comments, it is a highly versatile and capable bot that can assist in a wide range of scenarios.

Installation

ubiquibot-config.yml:

plugins:
  - uses:
      - plugin: http://localhost:4000
        with:
          model: ""
          openAiBaseUrl: ""

.dev.vars (for local testing):

To use the Openrouter API for fetching chat history, set the OPENROUTER_API_KEY in the .dev.vars file and specify the OpenAiBase URL in the ubiquibot-config.yml file. Alternatively, you can set the OPENAI_API_KEY in the .dev.vars file.

OPENAI_API_KEY=your_openai_api_key
SUPABASE_URL=your_supabase_url
SUPABASE_KEY=your_supabase_key
VOYAGEAI_API_KEY=your_voyageai_api_key
OPENROUTER_API_KEY=your_openrouter_api_key
UBIQUITY_OS_APP_NAME="UbiquityOS"

Testing

yarn test

=== End Current Issue #10 Conversation ===
```

0x4007 commented Oct 26, 2024

I'm assuming that the separators may help with comprehension of the prompt, because reading what's passed in now is almost incomprehensible.
