From 55921b262f1a41364d45f4eedf0c1f5503d1e2d4 Mon Sep 17 00:00:00 2001
From: KRISH SONI <67964054+krishvsoni@users.noreply.github.com>
Date: Wed, 18 Oct 2023 21:54:28 +0530
Subject: [PATCH 1/3] docker docs
---
README.md | 11 ++++++++++-
1 file changed, 10 insertions(+), 1 deletion(-)
diff --git a/README.md b/README.md
index edeb7b66f..109728edc 100644
--- a/README.md
+++ b/README.md
@@ -78,7 +78,7 @@ If you don't have enough resources to run it, you can use bitsnbytes to quantize
## QuickStart
-Note: Make sure you have Docker installed
+Note: Make sure you have [Docker](https://www.docker.com/) installed
On Mac OS or Linux, write:
@@ -124,10 +124,19 @@ Make sure you have Python 3.10 or 3.11 installed.
(check out [`application/core/settings.py`](application/core/settings.py) if you want to see more config options.)
2. (optional) Create a Python virtual environment:
+You can follow the [Python official documentation](https://docs.python.org/3/tutorial/venv.html) for virtual environments.
+
+a) On Mac OS and Linux
```commandline
python -m venv venv
. venv/bin/activate
```
+b) On Windows
+```commandline
+python -m venv venv
+venv\Scripts\activate
+```
+
3. Change to the `application/` subdir and install dependencies for the backend:
```commandline
pip install -r application/requirements.txt
From 9e632aa0bdfbf80d78b03c4fe9f12a3add28e9ee Mon Sep 17 00:00:00 2001
From: KRISH SONI <67964054+krishvsoni@users.noreply.github.com>
Date: Mon, 23 Oct 2023 22:01:07 +0530
Subject: [PATCH 2/3] Update README.md
---
README.md | 110 +++++++++++++++++++++++++++++++++++-------------------
1 file changed, 72 insertions(+), 38 deletions(-)
diff --git a/README.md b/README.md
index b570dfc75..3a6448ddd 100644
--- a/README.md
+++ b/README.md
@@ -7,9 +7,9 @@
- DocsGPT is a cutting-edge open-source solution that streamlines the process of finding information in project documentation. With its integration of the powerful GPT models, developers can easily ask questions about a project and receive accurate answers.
+ DocsGPT is a cutting-edge open-source solution that streamlines the process of finding information in the project documentation. With its integration of the powerful GPT models, developers can easily ask questions about a project and receive accurate answers.
-Say goodbye to time-consuming manual searches, and let DocsGPT help you quickly find the information you need. Try it out and see how it revolutionizes your project documentation experience. Contribute to its development and be a part of the future of AI-powered assistance.
+Say goodbye to time-consuming manual searches, and let DocsGPT help you quickly find the information you need. Try it out and see how it revolutionizes your project documentation experience. Contribute to its development and be a part of the future of AI-powered assistance.
@@ -21,64 +21,62 @@ Say goodbye to time-consuming manual searches, and let DocsGPT
-### Production Support / Help for companies:
+### Production Support / Help for companies:
We're eager to provide personalized assistance when deploying your DocsGPT to a live environment.
-- [Schedule Demo 👋](https://cal.com/arc53/docsgpt-demo-b2b?date=2023-10-04&month=2023-10)
+
+- [Book Demo 👋](https://airtable.com/appdeaL0F1qV8Bl2C/shrrJF1Ll7btCJRbP)
- [Send Email ✉️](mailto:contact@arc53.com?subject=DocsGPT%20support%2Fsolutions)
-
+
### [🎉 Join the Hacktoberfest with DocsGPT and Earn a Free T-shirt! 🎉](https://github.com/arc53/DocsGPT/blob/main/HACKTOBERFEST.md)
![video-example-of-docs-gpt](https://d3dg1063dc54p9.cloudfront.net/videos/demov3.gif)
-
## Roadmap
You can find our roadmap [here](https://github.com/orgs/arc53/projects/2). Please don't hesitate to contribute or create issues, it helps us improve DocsGPT!
## Our Open-Source models optimized for DocsGPT:
-| Name | Base Model | Requirements (or similar) |
-|-------------------|------------|----------------------------------------------------------|
-| [Docsgpt-7b-falcon](https://huggingface.co/Arc53/docsgpt-7b-falcon) | Falcon-7b | 1xA10G gpu |
-| [Docsgpt-14b](https://huggingface.co/Arc53/docsgpt-14b) | llama-2-14b | 2xA10 gpu's |
-| [Docsgpt-40b-falcon](https://huggingface.co/Arc53/docsgpt-40b-falcon) | falcon-40b | 8xA10G gpu's |
-
+| Name | Base Model | Requirements (or similar) |
+| --------------------------------------------------------------------- | ----------- | ------------------------- |
+| [Docsgpt-7b-falcon](https://huggingface.co/Arc53/docsgpt-7b-falcon)    | Falcon-7b   | 1xA10G GPU                |
+| [Docsgpt-14b](https://huggingface.co/Arc53/docsgpt-14b)                | llama-2-14b | 2xA10 GPUs                |
+| [Docsgpt-40b-falcon](https://huggingface.co/Arc53/docsgpt-40b-falcon)  | falcon-40b  | 8xA10G GPUs               |
If you don't have enough resources to run it, you can use bitsnbytes to quantize.
-
## Features
![Group 9](https://user-images.githubusercontent.com/17906039/220427472-2644cff4-7666-46a5-819f-fc4a521f63c7.png)
-
## Useful links
- [Live preview](https://docsgpt.arc53.com/)
-
- [Join our Discord](https://discord.gg/n5BX8dh8rU)
-
- [Guides](https://docs.docsgpt.co.uk/)
- [Interested in contributing?](https://github.com/arc53/DocsGPT/blob/main/CONTRIBUTING.md)
+- 🔍🔥 [Live preview](https://docsgpt.arc53.com/)
- [How to use any other documentation](https://docs.docsgpt.co.uk/Guides/How-to-train-on-other-documentation)
+- 💬🎉 [Join our Discord](https://discord.gg/n5BX8dh8rU)
- [How to host it locally (so all data will stay on-premises)](https://docs.docsgpt.co.uk/Guides/How-to-use-different-LLM)
+- 📚😎 [Guides](https://docs.docsgpt.co.uk/)
+- 👩‍💻👨‍💻 [Interested in contributing?](https://github.com/arc53/DocsGPT/blob/main/CONTRIBUTING.md)
+- 🗂️🚀 [How to use any other documentation](https://docs.docsgpt.co.uk/Guides/How-to-train-on-other-documentation)
+- 🏠🔐 [How to host it locally (so all data will stay on-premises)](https://docs.docsgpt.co.uk/Guides/How-to-use-different-LLM)
## Project structure
+
- Application - Flask app (main application).
- Extensions - Chrome extension.
-- Scripts - Script that creates similarity search index and stores for other libraries.
+- Scripts - Script that creates similarity search index for other libraries.
- Frontend - Frontend uses Vite and React.
## QuickStart
-Note: Make sure you have [Docker](https://www.docker.com/) installed
+Note: Make sure you have [Docker](https://docs.docker.com/) installed
On Mac OS or Linux, write:
@@ -89,15 +87,17 @@ It will install all the dependencies and allow you to download the local model o
Otherwise, refer to this Guide:
1. Download and open this repository with `git clone https://github.com/arc53/DocsGPT.git`
-2. Create a `.env` file in your root directory and set the env variable `OPENAI_API_KEY` with your OpenAI API key and `VITE_API_STREAMING` to true or false, depending on if you want streaming answers or not.
+2. Create a `.env` file in your root directory and set the env variable `OPENAI_API_KEY` with your [OpenAI API key](https://platform.openai.com/account/api-keys) and `VITE_API_STREAMING` to true or false, depending on whether you want streaming answers or not.
It should look like this inside:
-
+
```
API_KEY=Yourkey
VITE_API_STREAMING=true
```
- See optional environment variables in the `/.env-template` and `/application/.env_sample` files.
-3. Run `./run-with-docker-compose.sh`.
+
+ See optional environment variables in the [/.env-template](https://github.com/arc53/DocsGPT/blob/main/.env-template) and [/application/.env_sample](https://github.com/arc53/DocsGPT/blob/main/application/.env_sample) files.
+
+3. Run [./run-with-docker-compose.sh](https://github.com/arc53/DocsGPT/blob/main/run-with-docker-compose.sh).
4. Navigate to http://localhost:5173/.
To stop, just run `Ctrl + C`.
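Put together, the manual route above might look roughly like this on Mac OS or Linux (a sketch of steps 1-4, with `Yourkey` standing in for your real OpenAI API key):

```commandline
# 1. Clone the repository and enter it
git clone https://github.com/arc53/DocsGPT.git
cd DocsGPT

# 2. Create the root .env file (replace Yourkey with your OpenAI API key)
printf "API_KEY=Yourkey\nVITE_API_STREAMING=true\n" > .env

# 3. Build and start the containers
./run-with-docker-compose.sh
```

Then open http://localhost:5173/ in your browser (step 4) and press `Ctrl + C` when you want to stop.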
@@ -105,10 +105,12 @@ To stop, just run `Ctrl + C`.
## Development environments
### Spin up mongo and redis
-For development, only two containers are used from `docker-compose.yaml` (by deleting all services except for Redis and Mongo).
+
+For development, only two containers are used from [docker-compose.yaml](https://github.com/arc53/DocsGPT/blob/main/docker-compose.yaml) (by deleting all services except for Redis and Mongo).
See file [docker-compose-dev.yaml](./docker-compose-dev.yaml).
Run
+
```
docker compose -f docker-compose-dev.yaml build
docker compose -f docker-compose-dev.yaml up -d
@@ -119,35 +121,67 @@ docker compose -f docker-compose-dev.yaml up -d
Make sure you have Python 3.10 or 3.11 installed.
1. Export required environment variables or prepare a `.env` file in the `/application` folder:
- - Copy `.env_sample` and create `.env` with your OpenAI API token for the `API_KEY` and `EMBEDDINGS_KEY` fields.
+ - Copy [.env_sample](https://github.com/arc53/DocsGPT/blob/main/application/.env_sample) and create `.env` with your OpenAI API token for the `API_KEY` and `EMBEDDINGS_KEY` fields.
(check out [`application/core/settings.py`](application/core/settings.py) if you want to see more config options.)
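   For illustration, the resulting `/application/.env` could look like this (placeholder values; both fields take your OpenAI key, mirroring the root `.env` example above):

```
API_KEY=Yourkey
EMBEDDINGS_KEY=Yourkey
```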
2. (optional) Create a Python virtual environment:
+ You can follow the [Python official documentation](https://docs.python.org/3/tutorial/venv.html) for virtual environments.
+
+a) On Mac OS and Linux
+
```commandline
python -m venv venv
. venv/bin/activate
```
-3. Change to the `application/` subdir and install dependencies for the backend:
+
+b) On Windows
+
```commandline
-pip install -r application/requirements.txt
+python -m venv venv
+venv\Scripts\activate
```
+
+3. Change to the `application/` subdir with `cd application/` and install the dependencies for the backend:
+
+```commandline
+pip install -r requirements.txt
+```
+
4. Run the app using `flask run --host=0.0.0.0 --port=7091`.
5. Start the worker with `celery -A application.app.celery worker -l INFO`.
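
Steps 4 and 5 start two long-running processes, so you will typically want two terminals. A minimal sketch, using the commands exactly as given above:

```commandline
# Terminal 1: Flask API (step 4)
flask run --host=0.0.0.0 --port=7091

# Terminal 2: Celery background worker (step 5)
celery -A application.app.celery worker -l INFO
```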
-### Start frontend
+### Start frontend
Make sure you have Node version 16 or higher.
-1. Navigate to the `/frontend` folder.
-2. Install dependencies by running `npm install`.
-3. Run the app using `npm run dev`.
+1. Navigate to the [/frontend](https://github.com/arc53/DocsGPT/tree/main/frontend) folder.
+2. Install the required packages `husky` and `vite` (skip this step if they are already installed).
+
+```commandline
+npm install husky -g
+npm install vite -g
+```
+
+3. Install dependencies by running `npm install --include=dev`.
+4. Run the app using `npm run dev`.
+
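+As one continuous sequence from the repository root, the frontend setup above might look like this (assuming the global installs from step 2 are still needed):
+
+```commandline
+cd frontend                  # step 1
+npm install husky -g         # step 2 (skip if already installed)
+npm install vite -g          # step 2 (skip if already installed)
+npm install --include=dev    # step 3
+npm run dev                  # step 4
+```
+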
+## Contributing
+
+Please refer to the [CONTRIBUTING.md](CONTRIBUTING.md) file for information about how to get involved. We welcome issues, questions, and pull requests.
+
+## Code Of Conduct
+
+We as members, contributors, and leaders pledge to make participation in our community a harassment-free experience for everyone, regardless of age, body size, visible or invisible disability, ethnicity, sex characteristics, gender identity and expression, level of experience, education, socio-economic status, nationality, personal appearance, race, religion, or sexual identity and orientation. Please refer to the [CODE_OF_CONDUCT.md](CODE_OF_CONDUCT.md) file for more information.
## Many Thanks To Our Contributors
-
-
+
+
+## License
+
+The source code license is [MIT](https://opensource.org/license/mit/), as described in the [LICENSE](LICENSE) file.
Built with [🦜️🔗 LangChain](https://github.com/hwchase17/langchain)
From 130a6b67bd6113513c964437de841f33037db14d Mon Sep 17 00:00:00 2001
From: KRISH SONI <67964054+krishvsoni@users.noreply.github.com>
Date: Mon, 23 Oct 2023 22:16:19 +0530
Subject: [PATCH 3/3] Update README.md
---
README.md | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/README.md b/README.md
index 3a6448ddd..9aaad49ac 100644
--- a/README.md
+++ b/README.md
@@ -76,7 +76,7 @@ If you don't have enough resources to run it, you can use bitsnbytes to quantize
## QuickStart
-Note: Make sure you have [Docker](https://docs.docker.com/) installed
+Note: Make sure you have [Docker](https://docs.docker.com/engine/install/) installed
On Mac OS or Linux, write: