Describe the solution you'd like
I would like to see DeepSeek integrated into this repository as an additional model option. This would involve updating the existing infrastructure so that users can seamlessly select DeepSeek alongside Claude, leveraging AWS resources for inference and deployment.
Why the solution is needed
DeepSeek's robust capabilities can broaden the diversity and scope of model responses. It covers use cases that current models may not address and gives developers and users additional flexibility to explore different large language model features and performance characteristics.
Additional context
DeepSeek has recently become available on AWS:
DeepSeek Now Available on AWS
Integrating it into bedrock-claude-chat would allow the project to showcase multiple advanced models side-by-side and further expand AI capabilities.
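As a rough illustration of what "selecting DeepSeek alongside Claude" could look like on the backend, here is a minimal sketch of calling a DeepSeek model through Amazon Bedrock's Converse API with boto3. The model ID `us.deepseek.r1-v1:0` and the region are assumptions; check the Bedrock console for the exact inference-profile ID available in your account and region.

```python
# Hedged sketch: invoking a DeepSeek model via Amazon Bedrock's Converse API.
# The model ID below is an assumption -- verify it in the Bedrock console.
DEEPSEEK_MODEL_ID = "us.deepseek.r1-v1:0"  # assumed inference-profile ID

def build_converse_request(model_id: str, prompt: str, max_tokens: int = 512) -> dict:
    """Build the keyword arguments for bedrock_runtime.converse()."""
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": max_tokens},
    }

# Usage (requires AWS credentials and boto3; not executed here):
#   import boto3
#   client = boto3.client("bedrock-runtime", region_name="us-east-1")
#   response = client.converse(**build_converse_request(DEEPSEEK_MODEL_ID, "Hello"))
#   print(response["output"]["message"]["content"][0]["text"])
```

Because the Converse API uses the same request shape for Claude and other Bedrock models, swapping the `modelId` is, in principle, the main change needed to offer DeepSeek as an additional option.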
Implementation feasibility
Are you willing to collaborate with us to discuss the solution, decide on the approach, and assist with the implementation?
Yes, I am able to implement the feature and create a pull request.
No, I am unable to implement the feature, but I am open to discussing the solution.
I tried to spin this up in the Bedrock console earlier today. The only instance I could select was an ml.p5e.48xlarge. The EC2 console lists a p5en.48xlarge at roughly $85/hour, with 192 vCPUs and 2,048 GB of RAM. I'm not sure whether that matches the Bedrock instance, since the console didn't mention any GPUs.
This might not be feasible for this project, as the cost would shoot up.