Expected behavior of Response.aclose? #2742
Hello everyone! I am using the `AsyncClient` to proxy requests to a streaming API. I have followed the docs here to return the streaming response from the proxy, resulting in code that looks similar to this (modified for brevity):

```python
import httpx
import fastapi.responses
import starlette.background

client = httpx.AsyncClient(base_url=API_URL)

async def proxy(request):
    request = client.build_request(
        "GET",
        request.path,
    )
    resp = await client.send(request, stream=True)
    return fastapi.responses.StreamingResponse(
        resp.aiter_lines(),
        background=starlette.background.BackgroundTask(resp.aclose),
    )
```

As instructed in the link above:

> We have a BackgroundTask to ensure the response is closed.

However, this does not appear to actually close any connections. After running the proxy and the streaming API through debuggers, I can see that the proxy calls `resp.aclose()`. On closer inspection, I'm assuming that this is due to connection pooling done by the `AsyncClient`.

My question is: is this right? Or am I just doing something wrong? Given the way the documentation explains how to handle streaming responses when proxying requests, it seems like the server would always be left in a state where it is streaming responses back to a client that isn't acting on them. Does the documentation have to be updated to avoid this situation? I felt a bit misled that it says "Failing to do so [call Response.aclose()] would leave connections open", which seemed to imply that calling it would actually close the connection.
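For reference, here is a small sketch of how the close could be instrumented to confirm what httpx thinks has happened. The `close_and_log` helper is just an illustrative name, not something from the docs; it only uses the public `Response.is_closed` and `Response.is_stream_consumed` properties:

```python
import logging

import httpx

logger = logging.getLogger(__name__)

async def close_and_log(resp: httpx.Response) -> None:
    # Runs as the StreamingResponse background task after the body has been
    # forwarded to the downstream client.
    await resp.aclose()
    logger.info(
        "upstream response: is_closed=%s is_stream_consumed=%s",
        resp.is_closed,
        resp.is_stream_consumed,
    )
```

Passing `BackgroundTask(close_and_log, resp)` instead of `BackgroundTask(resp.aclose)` then shows whether `aclose()` is reached at all and what state the response is left in.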
Hi. There are two different things mixed in here: how httpx behaves on its own, and how your proxy is set up. Assume I don't know anything about your FastAPI setup; can you reproduce this with httpx or httpcore alone? (This should also help you debug the problem.)
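A minimal, self-contained reproduction along those lines might look like the sketch below. This is only an illustration of taking FastAPI out of the picture; `client._transport._pool` is a private attribute and is used here purely for debugging:

```python
import asyncio

import httpx

async def main() -> None:
    async with httpx.AsyncClient() as client:
        request = client.build_request("GET", "https://www.example.com")
        resp = await client.send(request, stream=True)
        # Close without consuming the body, mirroring the proxy's background task.
        await resp.aclose()
        # Peek at the underlying httpcore pool (private attribute, debugging only).
        print(client._transport._pool.connections)

asyncio.run(main())
```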
Something like this can reproduce that issue for me:

```python
import httpcore
import logging
import trio

logging.getLogger("httpcore.http11").setLevel(1)

async def log(p, e):
    if p.startswith('http11'):
        print(p, e)

pool = httpcore.AsyncConnectionPool()

async def main():
    with trio.move_on_after(0.3):
        await pool.request("GET", "https://www.youtube.com", extensions={"trace": log})
    print(pool.connections)

trio.run(main)
```

Output:

```
http11.send_request_headers.started {'request': <Request [b'GET']>}
http11.send_request_headers.complete {'return_value': None}
http11.send_request_body.started {'request': <Request [b'GET']>}
http11.send_request_body.complete {'return_value': None}
http11.receive_response_headers.started {'request': <Request [b'GET']>}
http11.receive_response_headers.failed {'exception': Cancelled()}
http11.response_closed.started {}
http11.response_closed.failed {'exception': Cancelled()}
[<AsyncHTTPConnection ['https://youtube.com:443', HTTP/1.1, ACTIVE, Request Count: 1]>]
```
Hi. At the moment we have a concrete problem with our connection pool: if a request is cancelled in the middle of execution, it will leave the connection open. We are working on it and hope to have it fixed in the next httpcore release.
Also see: encode/httpcore#658 and encode/httpcore#704
If you're looking for a solution to that problem, this PR is probably what you're looking for.
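Until that fix is released, one possible caller-side mitigation is to shield the close from cancellation, for example with anyio (which Starlette already depends on). This is only a sketch of the general technique, not an officially recommended httpx pattern, and `aclose_shielded` is an illustrative name:

```python
import anyio
import httpx

async def aclose_shielded(resp: httpx.Response) -> None:
    # Keep closing the response (and releasing its pooled connection) even if
    # the surrounding task is being cancelled, e.g. because the downstream
    # client disconnected mid-stream.
    with anyio.CancelScope(shield=True):
        await resp.aclose()
```

In the proxy from the original question this would be used as `BackgroundTask(aclose_shielded, resp)` in place of `BackgroundTask(resp.aclose)`.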