
Amortize HttpRequestStream & DuplexPipeStream ReadAsync calls #11942

Closed

Conversation

benaadams
Member

@benaadams benaadams commented Jul 7, 2019

If _pipeReader.TryRead fails to get the data synchronously, defer to an IAsyncEnumerable<int> that is allocated once, on the first failure. (Use case: WebSockets/SignalR over TLS.)

Before: [allocation profile screenshot]

After: [allocation profile screenshot]

I used an IAsyncEnumerable because the code then very closely mimics the original read loop, other than adding yield before the returns and slightly modified behaviour for the finally clause.
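For illustration, here is a minimal sketch of the pattern described above, with hypothetical names (AmortizedReader, AwaitNextAsync, and CopyTo are illustrative; this is not the PR's actual code): TryRead completes the read synchronously when data is already buffered, and a single lazily-created async iterator handles the awaited path.

using System;
using System.Buffers;
using System.Collections.Generic;
using System.IO.Pipelines;
using System.Threading;
using System.Threading.Tasks;

public sealed class AmortizedReader
{
    private readonly PipeReader _pipeReader;
    private IAsyncEnumerator<int>? _readEnumerator; // allocated once, on the first sync miss
    private Memory<byte> _userBuffer;               // argument handed to the iterator

    public AmortizedReader(PipeReader pipeReader) => _pipeReader = pipeReader;

    public ValueTask<int> ReadAsync(Memory<byte> buffer, CancellationToken cancellationToken = default)
    {
        if (_pipeReader.TryRead(out ReadResult result))
        {
            // Fast path: data was already buffered, so complete synchronously
            // without allocating an async state machine.
            return new ValueTask<int>(CopyTo(buffer, result));
        }

        // Slow path: create the reusable async iterator on the first failure only.
        _userBuffer = buffer;
        _readEnumerator ??= ReadAsyncAwaited();
        return AwaitNextAsync();
    }

    private async ValueTask<int> AwaitNextAsync()
    {
        await _readEnumerator!.MoveNextAsync();
        return _readEnumerator.Current;
    }

    private async IAsyncEnumerator<int> ReadAsyncAwaited()
    {
        while (true)
        {
            ReadResult result = await _pipeReader.ReadAsync();
            // yield return hands back the byte count; on the next awaited read the
            // existing iterator is resumed instead of a new state machine being built.
            yield return CopyTo(_userBuffer, result);
        }
    }

    private int CopyTo(Memory<byte> buffer, ReadResult result)
    {
        ReadOnlySequence<byte> sequence = result.Buffer;
        long copied = Math.Min(sequence.Length, buffer.Length);
        ReadOnlySequence<byte> slice = sequence.Slice(0, copied);
        slice.CopyTo(buffer.Span);
        _pipeReader.AdvanceTo(slice.End);
        return (int)copied;
    }
}

Note that the AwaitNextAsync wrapper in this sketch still allocates one small state machine per awaited read; a hand-rolled IValueTaskSource (see the later ManualResetValueTaskSourceCore change below) can avoid even that.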

Resolves: #11940

/cc @stephentoub @davidfowl @halter73

@benaadams
Member Author

Hiding whitespace shows the changes to the main methods better: https://github.com/aspnet/AspNetCore/pull/11942/files?w=1

@benaadams benaadams force-pushed the HttpRequestStream.ReadAsync- branch from 6eb9771 to 3b40f05 Compare July 8, 2019 00:24
@stephentoub
Member

What's the single-threaded CPU overhead of this approach?

@benaadams benaadams force-pushed the HttpRequestStream.ReadAsync- branch from 28e2e14 to 86032fb Compare July 8, 2019 13:15
@analogrelay analogrelay added this to the 3.0.0-preview8 milestone Jul 8, 2019
@benaadams benaadams changed the title Amortize HttpRequestStream & DuplexPipeStream ReadAsync calls [WIP] Amortize HttpRequestStream & DuplexPipeStream ReadAsync calls Jul 10, 2019
@benaadams benaadams force-pushed the HttpRequestStream.ReadAsync- branch from 86032fb to 3b5f879 Compare July 11, 2019 14:25
@benaadams benaadams changed the title [WIP] Amortize HttpRequestStream & DuplexPipeStream ReadAsync calls Amortize HttpRequestStream & DuplexPipeStream ReadAsync calls Jul 11, 2019
@benaadams
Member Author

What's the single-threaded CPU overhead of this approach?

Looks good; it only engages the AsyncEnumerable when there is no data and goes back to sync when there is data:

InMemoryTransportBenchmark

              Method |     Mean |     Error |     StdDev |   Median |     Op/s | Allocated |
 ------------------- |---------:|----------:|-----------:|---------:|---------:|----------:|
-          Plaintext | 84.22 us | 3.2848 us |  9.1020 us | 86.95 us | 11,874.2 |     352 B |
+          Plaintext | 79.11 us | 3.8055 us | 10.8573 us | 83.59 us | 12,641.1 |     348 B |
- PlaintextPipelined | 11.52 us | 0.2011 us |  0.1783 us | 11.46 us | 86,778.7 |     368 B |
+ PlaintextPipelined | 10.50 us | 0.2045 us |  0.1913 us | 10.46 us | 95,220.7 |     355 B |


return actual;
// Finally blocks in enumerators aren't executed prior to the yield return,
// so we advance here
Contributor


Hm... that seems like it will be a bug farm in the future.

Member


I don't think this is that bad. The finally still runs on an uncaught exception or on completion of the method. I think it would be weirder if the finally ran for every yield return.
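To make the semantics being debated concrete, here is a small self-contained sketch (illustrative, not from the PR) showing that an async iterator's finally does not run when the method suspends at yield return; it runs on completion, on an uncaught exception, or via disposal:

using System;
using System.Collections.Generic;
using System.Threading.Tasks;

class IteratorFinallyDemo
{
    static async IAsyncEnumerable<int> Reads()
    {
        try
        {
            await Task.Yield();
            Console.WriteLine("before yield");
            yield return 1;                 // suspends here; the finally does NOT run
            Console.WriteLine("after resume");
        }
        finally
        {
            // Runs on completion, an uncaught exception, or DisposeAsync --
            // not at each yield return.
            Console.WriteLine("finally");
        }
    }

    static async Task Main()
    {
        await foreach (int n in Reads())
        {
            Console.WriteLine($"consumer got {n}");
        }
    }
}

This prints "before yield", "consumer got 1", "after resume", and only then "finally".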

@benaadams benaadams force-pushed the HttpRequestStream.ReadAsync- branch from 4dd5bde to 38690bc Compare July 13, 2019 21:41
@benaadams
Member Author

Changed to use ManualResetValueTaskSourceCore as it makes it a bit simpler.
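For context, a minimal sketch of the ManualResetValueTaskSourceCore pattern, with illustrative names (ReusableReadSource and RentTask are assumptions, not the PR's code): one reusable object implements IValueTaskSource<int> and backs every awaited read's ValueTask<int>.

using System;
using System.Threading.Tasks;
using System.Threading.Tasks.Sources;

public sealed class ReusableReadSource : IValueTaskSource<int>
{
    // Mutable struct doing the heavy lifting; deliberately not readonly.
    private ManualResetValueTaskSourceCore<int> _core;

    // Hand out a ValueTask tied to the current version of the core.
    public ValueTask<int> RentTask() => new ValueTask<int>(this, _core.Version);

    // Producer side: called when the awaited read completes.
    public void SetResult(int bytesRead) => _core.SetResult(bytesRead);

    int IValueTaskSource<int>.GetResult(short token)
    {
        int result = _core.GetResult(token);
        _core.Reset(); // bump the version so the source can back the next read
        return result;
    }

    ValueTaskSourceStatus IValueTaskSource<int>.GetStatus(short token) => _core.GetStatus(token);

    void IValueTaskSource<int>.OnCompleted(Action<object?> continuation, object? state,
        short token, ValueTaskSourceOnCompletedFlags flags)
        => _core.OnCompleted(continuation, state, token, flags);
}

The consumer awaits RentTask(); because GetResult resets the core, the same instance is reused for every awaited read, which is the simplification referred to above.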

@benaadams benaadams force-pushed the HttpRequestStream.ReadAsync- branch from 4af0241 to 2835575 Compare July 18, 2019 13:14
@benaadams
Member Author

Rebased

@halter73
Member

@davidfowl @jkotalik Can you also take a look at this PR? It LGTM.

@pranavkm pranavkm added this to the 5.0.0-alpha1 milestone Aug 13, 2019
@benaadams benaadams force-pushed the HttpRequestStream.ReadAsync- branch from 33b7347 to ca00257 Compare August 20, 2019 02:57
@benaadams
Member Author

Rebased

@benaadams
Member Author

@halter73 #12639 fixed the counting, but still needs the timing change

Error message
Moq.MockException :
Expected invocation on the mock should never have been performed, but was 32 times:
h => h.OnTimeout(It.IsAny<TimeoutReason>())

Configured setups:
ITimeoutHandler h => h.OnTimeout(It.IsAny<TimeoutReason>())

Performed invocations:
ITimeoutHandler.OnTimeout(TimeoutReason.ReadDataRate)
ITimeoutHandler.OnTimeout(TimeoutReason.ReadDataRate)
ITimeoutHandler.OnTimeout(TimeoutReason.ReadDataRate)
... (32 times in total)

Stack trace
   at Moq.Mock.VerifyCalls(Mock targetMock, InvocationShape expectation, LambdaExpression expression, Times times, String failMessage)
   at Moq.Mock.Verify[T](Mock`1 mock, Expression`1 expression, Times times, String failMessage)
   at Moq.Mock`1.Verify(Expression`1 expression, Times times)
   at Moq.Mock`1.Verify(Expression`1 expression, Func`1 times)
   at Microsoft.AspNetCore.Server.Kestrel.Core.Tests.Http2TimeoutTests.DATA_Received_TooSlowlyOnSecondStream_AbortsConnectionAfterNonAdditiveRateTimeout() in /_/src/Servers/Kestrel/test/InMemory.FunctionalTests/Http2/Http2TimeoutTests.cs:line 775
--- End of stack trace from previous location where exception was thrown ---

@benaadams benaadams force-pushed the HttpRequestStream.ReadAsync- branch from ca00257 to 9137a7f Compare August 20, 2019 14:23
var result = _context.RequestBodyPipe.Reader.TryRead(out readResult);
_readResult = readResult;
CountBytesRead(readResult.Buffer.Length);
TryStart();
Member Author


@halter73 note: had to include this Http2MessageBody change for TryStart/TryStop to prevent test failures.

Member


It definitely feels better to call TryStart here. It does important stuff that isn't even timing-related, like checking that the request body size isn't larger than what's allowed.

What test failures were you seeing, specifically?

Member Author

@benaadams benaadams Aug 20, 2019


One example was the following, though there were a couple of different types:

Error message
Moq.MockException :
Expected invocation on the mock should never have been performed, but was 32 times: 
h => h.OnTimeout(It.IsAny<TimeoutReason>())

Configured setups:
 ITimeoutHandler h => h.OnTimeout(It.IsAny<TimeoutReason>())

Performed invocations: 
ITimeoutHandler.OnTimeout(TimeoutReason.ReadDataRate)
ITimeoutHandler.OnTimeout(TimeoutReason.ReadDataRate)
ITimeoutHandler.OnTimeout(TimeoutReason.ReadDataRate)
ITimeoutHandler.OnTimeout(TimeoutReason.ReadDataRate)
ITimeoutHandler.OnTimeout(TimeoutReason.ReadDataRate)
...

Stack trace
   at Moq.Mock.VerifyCalls(Mock targetMock, InvocationShape expectation, LambdaExpression expression, Times times, String failMessage)
   at Moq.Mock.Verify[T](Mock`1 mock, Expression`1 expression, Times times, String failMessage)
   at Moq.Mock`1.Verify(Expression`1 expression, Times times)
   at Moq.Mock`1.Verify(Expression`1 expression, Func`1 times)
   at Microsoft.AspNetCore.Server.Kestrel.Core.Tests.Http2TimeoutTests.DATA_Received_TooSlowlyOnSecondStream_AbortsConnectionAfterNonAdditiveRateTimeout() in /_/src/Servers/Kestrel/test/InMemory.FunctionalTests/Http2/Http2TimeoutTests.cs:line 775
--- End of stack trace from previous location where exception was thrown ---

Member


Ah. That makes sense. If you start reading a request body with TryRead, and then call ReadAsync, Kestrel will reset _readTimingBytesRead to 0 on the first ReadAsync call, ignoring the bytes counted by TryRead.

I wonder how common using both TryRead and ReadAsync on the same RequestBody will be before we do it implicitly in ReadAsyncInternal as part of this PR. We might want to backport this part of the change.
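A sketch of the usage pattern being discussed, as a hypothetical handler (not code from the PR): the body is read first with TryRead and then with ReadAsync, which under the pre-fix behaviour started the read-rate timing only at the ReadAsync call.

using System.IO.Pipelines;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;

static class MixedReadExample
{
    public static async Task HandleAsync(HttpContext httpContext)
    {
        PipeReader reader = httpContext.Request.BodyReader;

        if (reader.TryRead(out ReadResult result))
        {
            // ... consume result.Buffer synchronously; pre-fix, these bytes
            // were not counted toward Kestrel's read-rate timing ...
            reader.AdvanceTo(result.Buffer.End);
        }

        ReadResult next = await reader.ReadAsync(); // pre-fix, timing began only here
        reader.AdvanceTo(next.Buffer.End);
    }
}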

Member Author


Does Http1ContentLengthMessageBody need this?

Http1ChunkedEncodingMessageBody has it, but ContentLength doesn't have the same pattern (TryRead earlier, and TryStop).

Member Author


Ah, it is correct; TryStop is in CreateReadResultFromConnectionReadResult.

Added an isolated change for Http2: #13305

@benaadams
Member Author

Probably want to hold on this for now until dotnet/coreclr#26310 is evaluated

@stephentoub
Member

Probably want to hold on this for now until dotnet/coreclr#26310 is evaluated

I'll welcome your help evaluating it :)

@analogrelay
Contributor

I asked @halter73 what the deal is here and he said "oooh, this is a spicy one, but now that we're in 5.0, we can look at it". 😉

@davidfowl @halter73 Let's re-open this now that 5.0 is open
@benaadams can you rebase?

@halter73
Member

halter73 commented Nov 6, 2019

dotnet/coreclr#26310 did get merged, so adding IAsyncEnumerable<int> ReadAsyncAwaited probably doesn't make sense any more.

@benaadams
Member Author

Agreed, but it still needs to be evaluated as it may be backed out: https://github.com/dotnet/coreclr/issues/27403

/cc @sebastienros

@stephentoub
Member

My hope is that the functionality from #27403 remains and ends up being enabled by default in .NET 5. But I need help from folks in vetting that and coming back with perf numbers from workloads to justify it.

Even if that ships on-by-default, though, there's potentially value in optimizing critical cases with hand-rolled implementations, if doing so can, for example, do a more efficient job of caching than the built-in caching. And as any such hand-rolled implementation involves a ton more code, it would need to be so much better that it was worth the maintenance / bug risk that is likely in such complicated code. I don't know if this particular case / implementation meets that bar.

@stephentoub
Member

(btw, I've implemented an analyzer to help flag cases of ValueTask misuse that can cause problems as more and more such caching is applied: dotnet/roslyn-analyzers#3001)
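For reference, the kind of misuse such an analyzer flags looks like this (illustrative sketch): a ValueTask may be backed by a pooled IValueTaskSource, so it must be consumed exactly once.

using System.IO;
using System.Threading.Tasks;

static class ValueTaskMisuseDemo
{
    public static async Task RunAsync(Stream stream, byte[] buffer)
    {
        ValueTask<int> read = stream.ReadAsync(buffer);

        int first = await read;   // OK: a single consumption
        int second = await read;  // BUG: the same ValueTask must not be awaited twice;
                                  // its backing source may already have been reset and reused
    }
}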

@halter73 halter73 self-requested a review November 6, 2019 18:19
@analogrelay
Contributor

@benaadams can you rebase to resolve conflicts?

@halter73 @jkotalik are we good to merge this once conflicts are resolved (and failing builds either kicked into submission or fixed ;))?

@davidfowl
Member

We should make the pipelines change @benaadams

@halter73
Member

We should make the pipelines change @benaadams

What changes exactly are you referring to?

@analogrelay
Contributor

@benaadams @davidfowl @halter73 dotnet/coreclr#26310 has been merged now. Do we still want this change? I'd like to clean it up if not.

@benaadams
Member Author

No, it isn't needed currently.

@benaadams benaadams closed this Jan 6, 2020
@analogrelay
Contributor

Sweet! Thanks!

@amcasey amcasey added the area-networking label and removed the area-runtime label Jun 6, 2023
@github-actions github-actions bot locked and limited conversation to collaborators Dec 8, 2023
Labels: area-networking
Successfully merging this pull request may close these issues.

HttpRequestStream & DuplexPipeStream don't amortize their ValueTask ReadAsync calls