promise: using the same callback both for request and response headers #114
Comments
Do you have a full trace / example of what you're seeing? Not sure I follow what the issue is here. |
Unfortunately I already quick-patched the example, but I'll try to re-explain: when the PUSH_PROMISE frame arrives, the promise callback is triggered with the promised stream. After that, it triggers the stream's :headers callback with the request headers carried in the PUSH_PROMISE frame. If the processing continues, it'll later trigger the same :headers callback again with the response headers. Problem: because it's using the same callback, I have to infer whether those are request or response headers, and then act accordingly. As an example, this is my workaround for the issue:

# PUSH_PROMISE callback
def on_promise(stream)
  stream.on(:headers) do |h|
    k, _ = h.first
    # infer the header type from the first pseudo-header:
    # requests start with :method, responses with :status
    if k == ":method"
      __on_promise_request(stream, h)
    else
      __on_promise_response(stream, h)
    end
  end
end |
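For context, a minimal sketch of how a workaround like the one above could be wired into a request stream, assuming the http-2 gem's client API (HTTP2::Client, new_stream, and a :promise event on the parent stream); the socket plumbing is omitted and example.com is a placeholder:

require "http/2"

conn = HTTP2::Client.new
stream = conn.new_stream

# the parent request stream emits :promise with the promised (pushed) stream
stream.on(:promise) do |promise|
  on_promise(promise)
end

stream.headers({
  ":scheme"    => "https",
  ":method"    => "GET",
  ":authority" => "example.com",
  ":path"      => "/",
}, end_stream: true)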
Ah, I'm with you now, thanks for the example. Curious, what do you do with the request headers? Partly, I'm wondering whether we need to emit those at all. |
Oh yes we do :) . I'm using the example of concurrent requests to a server. Whenever a push promise comes and I have the headers, I want to check whether I have a pending request matching it in my queue. If I don't have it, or the request is already in-flight, I refuse the stream. Otherwise, I process the pushed stream. The idea of such a heuristic is to guard against a denial of service from the server emitting infinite streams (only push what is needed) and to minimize contention (two concurrent requests for the same resource). |
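As an illustration of that heuristic (hypothetical names, not part of http-2 itself): a registry of pending requests keyed by method and path, consulted when the pushed request headers arrive; an entry leaves the registry once its request goes in-flight, so only still-queued requests accept a push.

class PendingRequests
  def initialize
    # ["GET", "/style.css"] => request object, kept only while the request is queued
    @pending = {}
  end

  def add(method, path, request)
    @pending[[method, path]] = request
  end

  # called when the request is actually sent
  def mark_in_flight(method, path)
    @pending.delete([method, path])
  end

  # accept a push only if a matching request is still pending (not yet in-flight)
  def accept_push?(headers)
    h = headers.to_h
    @pending.key?([h[":method"], h[":path"]])
  end
end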
I see, fair enough. With respect to the API shape, honestly, not sure. On the one hand, I see your point on having to do extra work to distinguish req/resp sets in the headers callback. On the other hand, semantically it's the correct callback to trigger, and I think it would also be unexpected for this one stream type to behave differently from all the others. Are there any other options we can consider here? |
In theory yes, but the client promise stream is the only one capable of receiving two types of headers, so it's clearly the exception here. I really lean towards that approach, i.e. emitting the request headers together with the stream, because semantically it makes sense to emit both the stream AND the headers: the headers come as payload in the PUSH_PROMISE frame. The only problem I see would be if they required a CONTINUATION frame, in which case you'd have to delay the callback until all headers have been received. I don't know how much of a problem that would be, though. I'd be fine with a separate callback. There is maybe a more effective workaround for the current behaviour, however:

# PUSH_PROMISE callback
def on_promise(stream)
  stream.once(:headers) do |h|
    # request headers, always (once only fires for the first emission)
    if all_conditions_are_met(h)
      # response callbacks
      stream.on(:headers) ...
      stream.on(:data) ...
    else
      ...
    end
  end
end |
That's a good catch and a strong reason against augmenting the :promise callback with the headers payload. What is all_conditions_are_met checking in your case? |
In light of that, I have to concur. I don't know if you consider my pattern an acceptable use of the feature; otherwise I'd suggest a separate callback (:request_headers?).
In my client lib case, I'm supporting concurrent requests, and I'd like to allow push responses only for the ones in my list of (pending) requests and reject all other push promises. So that would be the check. |
What about :promise_headers? |
Yeah, I think that could work. Willing to wire up the callbacks and tests for this? :) |
I think I can give it a try. This would be a breaking change, so I'd also suggest releasing it as 0.9 (hint: time for a release :) ). |
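To make the proposal concrete, here is a sketch of how the promised stream might be consumed once a separate event exists, using the :promise_headers name suggested above; the event name, the wanted? helper and the refuse call are assumptions of this sketch, not the library's current API:

stream.on(:promise) do |promise|
  promise.on(:promise_headers) do |req_headers|
    # request headers from the PUSH_PROMISE frame only;
    # reset the promised stream if we don't want it (refuse is assumed here)
    promise.refuse unless wanted?(req_headers)
  end

  promise.on(:headers) do |res_headers|
    # response headers only
  end

  promise.on(:data) do |chunk|
    # response body chunk
  end
end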
I've just noticed that, when trying to read promises from a client, the same callback is used both when the promise arrives (carrying the request headers) and when the response headers come:
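Roughly this shape, as a sketch of the behaviour being described (assuming the gem's :promise and :headers events):

stream.on(:promise) do |promise|
  promise.on(:headers) do |h|
    # fires first with the request headers carried in the PUSH_PROMISE frame,
    # and again later with the response headers
  end
end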
I'd propose to solve this by sending the request headers along in the promise callback itself, which would be more straightforward:
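Along these lines (a sketch of the proposed shape, not the current API; the second block argument is the proposal):

stream.on(:promise) do |promise, req_headers|
  # req_headers: the request headers carried in the PUSH_PROMISE frame
  promise.on(:headers) do |res_headers|
    # response headers only
  end
end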
@igrigorik what do you think? Is this feasible?