
[Request] Support Context Caching for OpenRouter Claude models #6785

Open
Arbow opened this issue Mar 7, 2025 · 3 comments
Labels
🌠 Feature Request New feature or request | 特性与建议

Comments

@Arbow

Arbow commented Mar 7, 2025

🥰 Requirement description

According to OpenRouter's official documentation, prompt caching is supported: https://openrouter.ai/docs/features/prompt-caching

🧐 Solution

Add the cache_control parameter so the client can control prompt caching.
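For reference, a minimal sketch of what such a request could look like, based on the payload shape described in the linked OpenRouter docs. The model slug, helper name, and document text are placeholders, not lobe-chat code:

```ts
// Sketch of a prompt-caching request to OpenRouter (assumed payload shape from
// https://openrouter.ai/docs/features/prompt-caching). Placeholder values only.
const LONG_DOCUMENT = '...several thousand tokens of stable reference text...';

async function askWithCachedPrefix(question: string) {
  const response = await fetch('https://openrouter.ai/api/v1/chat/completions', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      model: 'anthropic/claude-3.5-sonnet', // placeholder Claude slug
      messages: [
        {
          role: 'system',
          content: [
            { type: 'text', text: 'Answer questions about the attached document.' },
            {
              // The large, stable prefix goes in its own text part and is marked
              // as a cache breakpoint so later requests can reuse the cached prefix.
              type: 'text',
              text: LONG_DOCUMENT,
              cache_control: { type: 'ephemeral' },
            },
          ],
        },
        { role: 'user', content: question },
      ],
    }),
  });

  const completion = await response.json();
  return completion.choices[0].message.content as string;
}
```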

📝 Supplementary information

No response

@Arbow Arbow added the 🌠 Feature Request New feature or request | 特性与建议 label Mar 7, 2025
@lobehubbot
Member

👀 @Arbow

Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.


@Aloxaf
Contributor

Aloxaf commented Mar 8, 2025

The cache_control breakpoint can only be inserted into the text part of a multipart message.

If the documentation is accurate here, OpenRouter's cache breakpoints can only be placed on text parts, which makes the feature feel quite limited.
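To illustrate that constraint, here is a hypothetical helper (not an existing lobe-chat function): a plain-string message would first have to be normalized into the multipart array form before a cache_control breakpoint can be attached, and it can only go on a text part:

```ts
type TextPart = { type: 'text'; text: string; cache_control?: { type: 'ephemeral' } };
type ContentPart = TextPart | { type: 'image_url'; image_url: { url: string } };

interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string | ContentPart[];
}

// Hypothetical helper: mark the last text part of a message as a cache breakpoint.
function withCacheBreakpoint(message: ChatMessage): ChatMessage {
  // Normalize a plain string into the multipart array form first.
  const parts: ContentPart[] =
    typeof message.content === 'string'
      ? [{ type: 'text', text: message.content }]
      : [...message.content];

  // Attach the breakpoint to the last text part; image parts cannot carry it.
  const lastTextIndex = parts.map((p) => p.type).lastIndexOf('text');
  if (lastTextIndex !== -1) {
    const textPart = parts[lastTextIndex] as TextPart;
    parts[lastTextIndex] = { ...textPart, cache_control: { type: 'ephemeral' } };
  }

  return { ...message, content: parts };
}
```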


3 participants