
[Bug]: concurrent_requests 25 has no effect in graphrag since using fnllm with vllm/sglang #1633

Open
xxll88 opened this issue Jan 17, 2025 · 0 comments
Labels
bug (Something isn't working), triage (Default label assignment; indicates a new issue needs review by a maintainer)

Comments


xxll88 commented Jan 17, 2025

Do you need to file an issue?

  • I have searched the existing issues and this bug is not already filed.
  • My model is hosted on OpenAI or Azure. If not, please look at the "model providers" issue and don't file a new one here.
  • I believe this is a legitimate bug, not just a question. If this is a question, please use the Discussions area.

Describe the bug

When using an OpenAI-compatible model served by vLLM/SGLang, concurrent_requests = 25 has no effect since GraphRAG 1.0 switched to fnllm:

[Image: screenshot with GraphRAG 1.0 (fnllm)]

When using GraphRAG 0.5:

[Image: screenshot with GraphRAG 0.5]
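
For reference, a minimal sketch of where concurrent_requests is normally set in settings.yaml; the endpoint, model name, and environment variable below are placeholders for an OpenAI-compatible vLLM/SGLang server and are not taken from this report:

```yaml
llm:
  api_key: ${GRAPHRAG_API_KEY}        # placeholder env var
  type: openai_chat                   # OpenAI-compatible chat endpoint
  model: my-local-model               # placeholder model name
  api_base: http://localhost:8000/v1  # vLLM / SGLang OpenAI-compatible server (assumed)
  concurrent_requests: 25             # expected to cap the number of parallel LLM calls
```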

Steps to reproduce

No response

Expected Behavior

No response

GraphRAG Config Used

# Paste your config here

Logs and screenshots

No response

Additional Information

  • GraphRAG Version:
  • Operating System:
  • Python Version:
  • Related Issues:
xxll88 added the bug and triage labels on Jan 17, 2025