Feature request: global patch like in requests-cache? #150
Comments
Yes, this is a sensible feature request. I generally consider patching a last resort because of the limitations mentioned here, but when working with another package you don't control, you might not have any other options. If you let me know which library you're using, I can take a look and see whether there are any other options. For example, some libraries use a single internal session object, so you can patch out just that instead of all instances of `aiohttp.ClientSession`.
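As a rough illustration of that approach (the module and attribute names below are hypothetical, not from any real package): if a library keeps a single module-level session, you can swap a `CachedSession` into just that attribute.

```python
import asyncio
from types import SimpleNamespace

from aiohttp_client_cache import CachedSession, SQLiteBackend

# Stand-in for a third-party module that keeps one internal session
# (the name and attribute are made up for illustration)
somelib = SimpleNamespace(_session=None)

async def main():
    async with CachedSession(cache=SQLiteBackend('somelib_cache')) as cached:
        # Patch only the library's own session attribute; every other
        # aiohttp.ClientSession in the process stays untouched.
        somelib._session = cached
        # ...calls into the library would now go through the cached session...

asyncio.run(main())
```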
Hi @JWCook, one example would be https://github.com/openai/openai-python/. Independently of whether it would be straightforward to patch an instance of `ClientSession` in that package, a global patch would still be convenient.
I tried setting the session like this:

```python
import asyncio
import openai

from aiohttp_client_cache import CachedSession as AIOCachedSession
from aiohttp_client_cache import MongoDBBackend as AIOMongoDBBackend

async def get_responses(N):
    tasks = []
    async with AIOCachedSession(cache=AIOMongoDBBackend('my_async_cache', host="<HOST>", port=27017)) as aiosession:
        openai.aiosession.set(aiosession)
        for i in range(N):
            tasks.append(asyncio.create_task(openai.ChatCompletion.acreate(
                model='gpt-3.5-turbo',
                messages=[{"role": "user", "content": "How are you?"}])))
        responses = await asyncio.gather(*tasks)
        return responses
```

But I am unable to benefit from caching.
@meetps That's strange. I don't have a paid OpenAI account to test that out right now, but from looking over the openai-python code, your example looks correct. This example verifies that the internal session created by `openai` really is the `CachedSession` you set:

```python
import asyncio

import openai
from aiohttp_client_cache import CachedSession, SQLiteBackend

async def test_session():
    async with CachedSession(cache=SQLiteBackend()) as session:
        openai.aiosession.set(session)
        async with openai.api_requestor.aiohttp_session() as internal_session:
            print(internal_session.__class__)

asyncio.run(test_session())
```

There could be something about the responses that's causing them to not be cached. Could you show me an example response (status code, headers, and content)?
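If it helps narrow this down, here's a small check (a sketch using httpbin as a stand-in GET endpoint) for whether a given response actually came from the cache; aiohttp-client-cache marks cached responses with a `from_cache` attribute:

```python
import asyncio
from aiohttp_client_cache import CachedSession, SQLiteBackend

async def check_caching():
    async with CachedSession(cache=SQLiteBackend('debug_cache')) as session:
        for attempt in range(2):
            response = await session.get('https://httpbin.org/get')
            # Cached responses carry a from_cache flag; getattr() guards
            # against a plain ClientResponse that doesn't have it.
            print(attempt, response.status, getattr(response, 'from_cache', False))

asyncio.run(check_caching())
```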
Back to the topic of a global patch: the biggest problem with that is unintentional caching outside of the module in which it's used (the cause of many bug reports!). I'd feel much better about it if, for example, we could patch out `ClientSession` only within a limited scope rather than everywhere.
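For what it's worth, one rough sketch of what a scope-limited patch could look like, using `unittest.mock` to substitute a cached subclass for `aiohttp.ClientSession` inside a context manager (this is only an illustration of the idea, not an API this library provides):

```python
import asyncio
from contextlib import contextmanager
from unittest import mock

import aiohttp
from aiohttp_client_cache import CachedSession, SQLiteBackend

@contextmanager
def patched_client_session(**backend_kwargs):
    """Swap aiohttp.ClientSession for a cached subclass, only inside this block."""
    backend = SQLiteBackend(**backend_kwargs)

    class PatchedSession(CachedSession):
        def __init__(self, *args, **kwargs):
            kwargs.setdefault('cache', backend)
            super().__init__(*args, **kwargs)

    # Note: this only affects code that looks up aiohttp.ClientSession at call
    # time; modules that did `from aiohttp import ClientSession` at import time
    # keep the original class (one of the usual limitations of patching).
    with mock.patch.object(aiohttp, 'ClientSession', PatchedSession):
        yield

async def main():
    with patched_client_session(cache_name='scoped_cache'):
        async with aiohttp.ClientSession() as session:
            print(type(session))  # PatchedSession, i.e. a CachedSession

asyncio.run(main())
```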
I'm unsure if we'd want to enable direct monkey-patching of `aiohttp.ClientSession` globally. But patching a subset of named sessions using scopes (similar to nest_asyncio) and a mapping along these lines could work:

```python
CACHE_SESSION_MAPPING = {
    "my_cache_session": <params>,
    ...
}

with nest_asyncio.Session("my_cache_session"):
    # <core components to be cached>
```

With this functionality, we could give users the flexibility to either install a global patch, or annotate the async modules that need to be cached with names and apply a cache only to those sessions. I am not familiar enough with the codebase to comment on actual implementation details.
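To make the idea slightly more concrete, here is one way named scopes could be sketched with `contextvars` (every name here, including `CACHE_SESSION_MAPPING` and `cached_scope`, is hypothetical; nothing like this exists in the library today):

```python
import asyncio
from contextlib import asynccontextmanager
from contextvars import ContextVar

from aiohttp_client_cache import CachedSession, SQLiteBackend

# Hypothetical registry of named cache configurations
CACHE_SESSION_MAPPING = {
    "my_cache_session": {"cache_name": "my_cache", "expire_after": 3600},
}

# Records which named scope is active; a real implementation could have a
# patched ClientSession consult this to decide whether (and how) to cache.
_active_scope = ContextVar("cache_scope", default=None)

@asynccontextmanager
async def cached_scope(name):
    token = _active_scope.set(name)
    try:
        params = CACHE_SESSION_MAPPING[name]
        async with CachedSession(cache=SQLiteBackend(**params)) as session:
            yield session
    finally:
        _active_scope.reset(token)

async def main():
    async with cached_scope("my_cache_session") as session:
        response = await session.get("https://httpbin.org/get")
        print(response.status, getattr(response, "from_cache", False))

asyncio.run(main())
```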
OpenAI request and response example:
Unable to see anything obvious preventing this from being cached.
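One thing that might be worth ruling out (just a guess, not something confirmed here): the chat completion endpoint is a POST, and aiohttp-client-cache only caches GET and HEAD requests by default, so POST would need to be added to the backend's allowed methods. A minimal sketch:

```python
import asyncio
from aiohttp_client_cache import CachedSession, SQLiteBackend

async def main():
    # allowed_methods defaults to ('GET', 'HEAD'); add POST so that
    # POST-based API calls are eligible for caching.
    backend = SQLiteBackend('openai_cache', allowed_methods=('GET', 'POST'))
    async with CachedSession(cache=backend) as session:
        ...  # e.g. openai.aiosession.set(session), as in the snippet above

asyncio.run(main())
```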
Feature description
Global cache via monkeypatching, just like in requests-cache. [Apologies if this is not a sensible feature request; I'm new to aiohttp]
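For reference, the requests-cache global patch being referred to looks roughly like this (requests-cache, not this library):

```python
import requests
import requests_cache

# After install_cache(), every requests call in the process goes through the
# cache, with no changes to the code that makes the requests.
requests_cache.install_cache('demo_cache')

requests.get('https://httpbin.org/get')  # fetched from the network
requests.get('https://httpbin.org/get')  # served from the cache
```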
Use case
Cache requests of an existing Python package that relies on aiohttp.ClientSession (without modifying the package).

Workarounds
Not that I'm aware of. I tried using requests-cache with the code snippet above, but that did not work.