
Commit
updated tech architecture
luv-singh-ai committed Nov 24, 2024
1 parent 5d2164f commit a26d6db
Showing 4 changed files with 33 additions and 14 deletions.
27 changes: 13 additions & 14 deletions README.md
@@ -28,36 +28,35 @@ Project Sukoon aims to build open-source solutions and standards for using AI to

1. Prototyped with [Crew AI agent framework](https://www.crewai.com/)
2. Developed backend and frontend using [LangGraph framework](https://www.langchain.com/langgraph)
3. Shifted to [AutoGen framework](https://microsoft.github.io/autogen/docs/tutorial/introduction/) based on expert consultation
4. Currently addressing web UI issues and agent loop problems with AutoGen
5. Planning to create API endpoints for LangGraph and integrate with WhatsApp API
6. Iterated and added more agents to the pipeline
3. Tried the [AutoGen framework](https://microsoft.github.io/autogen/docs/tutorial/introduction/) but did not deploy it due to web UI issues
4. Completed v3 of the Sukoon chatbot and deployed it at IIT Kanpur, among other sites
5. Created API endpoints for LangGraph and a framework for integrating with the WhatsApp API
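The multi-agent pipeline described above can be sketched as a router that dispatches each user message to a specialist agent. The agent names and keyword-based routing below are illustrative assumptions, not Sukoon's actual agents or routing logic (a real LangGraph pipeline would route with an LLM):

```python
# Illustrative sketch of an agent-routing pipeline.
# Agent names and routing rules are hypothetical, not Sukoon's implementation.

def empathy_agent(message: str) -> str:
    # Default listener agent for general emotional-support messages.
    return "I hear you. Tell me more about how you're feeling."

def referral_agent(message: str) -> str:
    # Agent for messages that ask about professional help.
    return "It may help to talk to a professional; here are some resources."

def route(message: str) -> str:
    # Keyword routing stands in for the LLM-based routing a real
    # LangGraph pipeline would do.
    if any(w in message.lower() for w in ("helpline", "therapist", "doctor")):
        return referral_agent(message)
    return empathy_agent(message)

print(route("I feel low today"))
print(route("Can you suggest a therapist?"))
```

Adding a new agent then amounts to writing another handler function and extending the routing step.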

[Watch the video](https://drive.google.com/file/d/1zFL8nz0d8aqzHxJhFU0h-ScDdFaSkPeT/view?usp=drive_link)

## Installation

### Technical Architecture
![Technical Architecture](archive/tech_arch_latest.png)
![Technical Architecture](archive/sukoon_tech_arch_1x.png)

This repository contains the LangGraph code; the AutoGen code is in a different branch.

# LangGraph (Current Version)
# LangGraph (`main` branch)
```
- Go to langgraph branch
- clone the repo and create a virtual environment. Create a `.env` file and add your secret keys (e.g. your OpenAI key)
- install all dependencies in your environment (pip install -r requirements.txt)
- run 'python sukoon_api.py'
- go to sukoon-frontend(cd sukoon-frontend), run 'npm start' to access it in your browser.
- alternatively use this vercel deployment to access it - https://sukoon-1.vercel.app
- To use the API, run 'python sukoon_api.py'; otherwise run `python sukoon.py` to use it in the terminal
- To use the web UI, cd to `sukoon-frontend` and run 'npm start' to access it in your browser.
- There's a newer frontend version in the `frontend-vite` folder. To use it, cd there and run `npm run dev` to view it locally.
- alternatively, use this Vercel deployment to access it - https://sukoon-1.vercel.app (might be stopped in future)
```
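Once `sukoon_api.py` is running, the API can be exercised from Python. The endpoint path, port, and payload shape below are assumptions for illustration only, not the documented contract:

```python
import json
import urllib.request

# Hypothetical request to a locally running Sukoon API; the /query
# endpoint, port 8001, and payload fields are assumptions.
API_URL = "http://localhost:8001/query"

def build_request(message: str) -> urllib.request.Request:
    # Build a POST request carrying the user message as JSON.
    payload = json.dumps({"input": message}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("I have been feeling anxious lately")
print(req.full_url, req.get_method())

# Uncomment once the server is running locally:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

Check the FastAPI route definitions in `sukoon_api.py` for the actual endpoint names and request schema.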

## Steps to add environment variables
Create a .env file with:
```
OPENAI_API_KEY = '<YOUR_OPENAI_API_KEY>'
ANTHROPIC_API_KEY = '<ANTHROPIC_API_KEY>'
LANGCHAIN_API_KEY = '<YOUR_LANGCHAIN_API_KEY>'
```
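At runtime these keys can be loaded with `python-dotenv`, or with a minimal stdlib parser like this sketch, which assumes the simple `KEY = 'value'` format shown above:

```python
import os

def load_env(path: str = ".env") -> None:
    """Minimal .env loader for KEY = 'value' lines.

    python-dotenv is the more robust choice in practice; this sketch only
    handles the simple format shown in the README.
    """
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip blanks, comments, and malformed lines.
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # Strip surrounding whitespace and quotes from the value.
            os.environ[key.strip()] = value.strip().strip("'\"")

# Example with a throwaway file:
with open("/tmp/example.env", "w") as f:
    f.write("OPENAI_API_KEY = 'sk-test'\n")
load_env("/tmp/example.env")
print(os.environ["OPENAI_API_KEY"])  # sk-test
```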
- Add Portkey if you want observability

- Alternatively, try this:
```
On Mac/Linux -
@@ -70,7 +69,7 @@ setx OPENAI_API_KEY "your_api_key_here"
# How to contribute 🤝
There are a few ways you can contribute to Sukoon:

- By providing feedback on the POC
- By providing feedback on the Sukoon Chatbot
- By helping with testing and evaluation (see the relevant code in the `tests` and `evals` folders)
- By raising issues in the issues section
- By contributing to the codebase based on the issues
Binary file added archive/sukoon_tech_arch_1x.png
Binary file added archive/sukoon_tech_arch_3x.png
20 changes: 20 additions & 0 deletions sukoon.py
@@ -299,6 +299,26 @@ def chat(message: str, config: dict):
# )
# return {"messages": response}

# TO USE LANGFUSE (https://langfuse.com/docs/integrations/langchain/example-python-langgraph#goal-of-this-cookbook):
# %pip install langfuse
# %pip install langchain langgraph langchain_openai langchain_community
# # get keys for your project from https://cloud.langfuse.com
# os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-***"
# os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-***"
# os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com" # for EU data region
# # os.environ["LANGFUSE_HOST"] = "https://us.cloud.langfuse.com" # for US data region

# # your openai key
# os.environ["OPENAI_API_KEY"] = "***"
# from langfuse.callback import CallbackHandler
# # config={"callbacks": [langfuse_handler]}
# # Initialize Langfuse CallbackHandler for Langchain (tracing)
# langfuse_handler = CallbackHandler()

# for s in graph.stream({"messages": [HumanMessage(content = "What is Langfuse?")]},
# config={"callbacks": [langfuse_handler]}):
# print(s)

# TO USE OTHER MODELS:

# to use ollama via ollama pull llama3.1
