
qwen2.5 with the vLLM engine fails with "failed to infer device type" #2511

Open
1 of 3 tasks
kevinchi8781 opened this issue Nov 4, 2024 · 2 comments

@kevinchi8781

System Info / 系統信息

[Three screenshots attached showing the system environment and the error output.]

Running Xinference with Docker?

  • docker
  • pip install
  • installation from source

Version info

1

The command used to start Xinference

1

Reproduction

1111

Expected behavior

111

@XprobeBot XprobeBot added this to the v0.16 milestone Nov 4, 2024
@kevinchi8781 (Author)

[Two screenshots attached: supplementary logs.]

@kevinchi8781 (Author)

[Screenshot attached: the installed vLLM version.]
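For context: vLLM raises "failed to infer device type" when it cannot detect any supported accelerator platform, typically because the CUDA-enabled PyTorch build cannot see a GPU (e.g. a Docker container started without GPU access, or a CPU-only torch install). The sketch below is a hypothetical illustration of that kind of device-inference logic, not vLLM's actual code; the function name and parameters are assumptions for the example.

```python
# Hypothetical sketch of device-inference logic, illustrating why an
# engine can end up with nothing to select. This is NOT vLLM's code.
def infer_device(cuda_available: bool, allow_cpu: bool) -> str:
    """Pick a device string, or fail if no usable platform is found."""
    if cuda_available:
        # A GPU is visible to the process; use it.
        return "cuda"
    if allow_cpu:
        # Explicit CPU fallback was requested.
        return "cpu"
    # Neither a GPU nor an allowed fallback: the error seen in this issue.
    raise RuntimeError("failed to infer device type")


if __name__ == "__main__":
    print(infer_device(cuda_available=True, allow_cpu=False))   # cuda
    print(infer_device(cuda_available=False, allow_cpu=True))   # cpu
```

In practice, a first diagnostic step is checking whether PyTorch sees the GPU inside the same environment that runs Xinference (`python -c "import torch; print(torch.cuda.is_available())"`); if that prints `False` in a Docker setup, the container likely needs to be started with GPU access (e.g. `--gpus all` with the NVIDIA Container Toolkit).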
