
Http2Connection is still returned by Pool even after lastStreamCreated exceeds Last Stream ID in received GOAWAY frame #2396

Closed
asw12 opened this issue Jul 22, 2022 · 1 comment · Fixed by #2408
Assignees
Labels
type/bug A general bug
Milestone

Comments

asw12 commented Jul 22, 2022

When using HTTP/2 with a connection pool, an HTTP/2 server may occasionally send our reactor-netty client a GOAWAY frame indicating the last stream ID it will accept. In my use case, I am communicating with an nginx server, which by default allows only 1000 requests on a single connection before closing it by sending a GOAWAY.

Expected Behavior

After a GOAWAY frame is received and the last stream ID is reached, the Pool should eventually stop returning that Connection (even without setting or waiting for ConnectionPoolSpec#maxLifeTime).
If I have a retrySpec somewhere downstream of this client usage, subsequent attempts should proceed with a different Connection.

Actual Behavior

The Connection stays in the Pool, and subsequent retries that use the same Connection immediately fail with PrematureCloseException.

Steps to Reproduce

Test case here:
main...asw12:reactor-netty:bad_connection_reuse_after_h2_goaway

The main idea is to first set up a pool with a single HTTP/2 connection that cannot be gracefully killed (no max lifetime, graceful shutdown with an unlimited timeout, and one stream that is held open).
Then issue another request for which the web client has retry logic and the server will initiate the graceful shutdown of the connection.

If the retry happens after GOAWAY is processed, then you'll see messages like:
io.netty.handler.codec.http2.Http2Exception$StreamException: Cannot create stream 7 greater than Last-Stream-ID 5 from GOAWAY
where that stream ID corresponds to the retry number (e.g. all the way up to 205 for 100 retries). That exception bubbles up in reactor-netty as
reactor.netty.http.client.PrematureCloseException: Connection prematurely closed BEFORE response

That indicates the request with retries is continuously checking out the same PooledConnection, which remains in the pool despite no longer being usable for new streams.
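The HTTP/2 rule behind that error can be sketched in plain Java (class and method names here are hypothetical, not reactor-netty internals): client-initiated stream IDs are odd and strictly increasing, so once a GOAWAY with last-stream-ID N arrives, a connection whose next stream ID would exceed N must not be offered for new requests.

```java
// Hypothetical sketch of the acquire-time check the pool is missing.
final class H2ConnectionState {
    private int lastCreatedStreamId = -1;              // client streams: 1, 3, 5, ...
    private int goAwayLastStreamId = Integer.MAX_VALUE; // until a GOAWAY arrives

    void onGoAway(int lastStreamId) { goAwayLastStreamId = lastStreamId; }

    int nextStreamId() { return lastCreatedStreamId < 0 ? 1 : lastCreatedStreamId + 2; }

    // Reusable only if the next stream it would open is still <= the
    // last stream ID the peer promised to accept (RFC 9113, section 6.8).
    boolean usableForNewStream() { return nextStreamId() <= goAwayLastStreamId; }

    void openStream() {
        if (!usableForNewStream()) {
            throw new IllegalStateException("Cannot create stream " + nextStreamId()
                    + " greater than Last-Stream-ID " + goAwayLastStreamId + " from GOAWAY");
        }
        lastCreatedStreamId = nextStreamId();
    }
}
```

With this state tracked per connection, a pool could test `usableForNewStream()` before handing the connection out, instead of letting the stream creation fail and surface as a PrematureCloseException.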

Possible Solution

I feel like this kind of PrematureCloseException belongs to the same category of exceptions handled by #2140, where the Connection is no longer marked as persistent and is summarily removed from the pool.
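A minimal sketch of that "mark non-persistent, drop from pool" idea (the `SimplePool` class below is hypothetical, not reactor-netty's actual pool): a connection released as non-persistent is simply never re-offered, so a downstream retry acquires a fresh one.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical sketch of the proposed handling, not reactor-netty internals.
final class SimplePool<C> {
    private final Deque<C> idle = new ArrayDeque<>();

    void release(C conn, boolean persistent) {
        if (persistent) {
            idle.addLast(conn); // healthy: return to the pool for reuse
        }
        // non-persistent (e.g. saw GOAWAY): not re-offered; caller closes it
    }

    C acquire() {
        return idle.pollFirst(); // null -> caller opens a new connection
    }
}
```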

Your Environment

Reactor Netty version:
Observed in 1.0.17, reproduced on 1.1.0-M5 snapshot (3917290)

JVM version:
Corretto-18.0.0.37.1

@asw12 asw12 added status/need-triage A new issue that still need to be evaluated as a whole type/bug A general bug labels Jul 22, 2022
@violetagg violetagg removed the status/need-triage A new issue that still need to be evaluated as a whole label Jul 25, 2022
@violetagg violetagg self-assigned this Jul 25, 2022
@violetagg violetagg added this to the 1.0.22 milestone Jul 25, 2022
violetagg added a commit that referenced this issue Jul 28, 2022
According to the specification
https://www.rfc-editor.org/rfc/rfc9113.html#section-6.8

`Receivers of a GOAWAY frame MUST NOT open additional streams on the connection`

Reactor Netty runs with graceful shutdown by default, which means that a `GOAWAY`
might be received while the connection is still alive.

`Http2Pool` should consider this use case and stop offering a connection that received `GOAWAY`.

Fixes #2396
@violetagg violetagg linked a pull request Jul 28, 2022 that will close this issue
violetagg (Member) commented
@asw12 Thanks for the detailed explanation and the reproducible example. PTAL #2408
