Performance changes between 1.2.0, 1.3.1 and 2.0a #1614
that's cool. let me check
By the way, I don't mean to sound negative. Aiohttp's performance is pretty good and from what I can tell with
it is totally fine. i actually thought about some consistent performance benchmarks, and this test suite will significantly help me, especially now that i am working on internal refactoring and http pipelining support
@samuelcolvin i did some optimization work on
all changes are in master now
Thanks, can you link to the commit where this was done? As per this discussion, do you think pipelining will have noticeable effects on real-world applications? That discussion suggests it's unlikely to help in practice.
pipelining in a python application is a benchmark-only tool :)
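For context, HTTP/1.1 pipelining means sending several requests down one connection before reading any responses, which is why it flatters benchmarks far more than real clients. A minimal sketch over a raw socket (the host, port, and request count are illustrative assumptions, not from this thread):

```python
# Sketch of HTTP/1.1 pipelining over a raw socket -- the pattern that
# benchmark tools exploit. Assumes a keep-alive server on localhost:8080.
import socket

REQUEST = b"GET / HTTP/1.1\r\nHost: localhost\r\n\r\n"

with socket.create_connection(("localhost", 8080)) as sock:
    # Send four requests back-to-back without waiting for any response;
    # the server must answer them in order on the same connection.
    sock.sendall(REQUEST * 4)
    # Read the concatenated responses; a real client would parse the
    # stream back into four separate HTTP responses.
    print(sock.recv(65536).decode(errors="replace"))
```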
@samuelcolvin could you run the benchmark again with aiohttp from the parser branch?
Will do.
parser branch is merged to master
I've split the code into a separate repo as running the full tests and displaying the results was a real pain: https://github.com/samuelcolvin/aiohttp-benchmarks
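For anyone skimming, the handlers such benchmarks exercise look roughly like this (a hypothetical minimal example, not the actual code from that repo):

```python
# A minimal aiohttp app of the kind a benchmark suite hammers with an
# HTTP load generator. The route and port are illustrative.
from aiohttp import web

async def json_handler(request):
    # TechEmpower-style JSON test: serialize a tiny payload per request.
    return web.json_response({"message": "Hello, World!"})

app = web.Application()
app.router.add_get("/json", json_handler)

if __name__ == "__main__":
    web.run_app(app, port=8080)
```

A load generator then drives it and reports requests per second, e.g. `wrk -t4 -c64 -d30s http://localhost:8080/json` (thread and connection counts are arbitrary here).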
Wow! That's great!
Yes, I saw that too and was surprised; I thought 3.6 had performance improvements for asyncio.
There were improvements for asyncio and dicts... Can't see at first sight why it should be slower in 3.6 😓
I've added the results pivoted to compare python versions. The change is fairly consistent.
Any ideas about the surprisingly bad results with 3.6?
on my mac i get ~5-7% better performance under python3.6, but my test is very simple
Perhaps someone else could run my benchmarks and confirm I'm not going mad or have some obscure problem with my python installation? It takes about 20 mins to run but it's very easy to set up, then it can be left to chomp through all the cases.
I will do
FYI, JSON and simpletext (the ones I tried out of curiosity) behave at least as well on python3.6 as on python3.5. Python 3.6 could have better performance, but it's not really appreciable at first sight, something around 5%, and a serious benchmark would have to be executed to be sure. In any case, the important thing here is that I cannot reproduce the huge decrease between 3.5 and 3.6. Note: I ran the JSON and simpletext tests at least 5 times per version and picked the best time; otherwise, handmade tests sharing the CPU with other user processes might bias the results.
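The pick-the-best-time approach described above looks roughly like this (`best_of` is a made-up helper, not code from either benchmark repo):

```python
# Repeat a measurement N times and keep the minimum: background processes
# can only make a run slower, so the fastest run is the least biased.
import json
import time

def best_of(n, func, *args):
    """Return the fastest wall-clock duration of n runs of func."""
    best = float("inf")
    for _ in range(n):
        start = time.perf_counter()
        func(*args)
        best = min(best, time.perf_counter() - start)
    return best

# Trivial stand-in workload; substitute the real benchmark body here.
payload = {"message": "Hello, World!"}
print(f"fastest of 5 runs: {best_of(5, json.dumps, payload):.6f}s")
```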
I will run benchmarks on separate aws c3 instances next week
There will need to be significant changes to my code to allow running on separate machines. I'll see if I can make the changes tomorrow.
I've modified the benchmark code to run the server remotely and rerun the test: https://github.com/samuelcolvin/aiohttp-benchmarks Python 3.5 vs. 3.6 is much closer, but the trend is still that 3.6 is single-digit percentage points slower. Obviously running the test uses up CPU credits pretty quickly, but I was careful to make sure the tests finished before the server ran out of credits.
We are observing a memory leak and 100% CPU usage with 1.3.3. It happens on Linux; on OS X I don't see such problems. Version 1.2.0 works well on both Linux and OS X. I'm investigating this issue; if there's anyone else with the same problem, show your setup. I'm trying to write the minimal code needed to reproduce this issue.
Is it on the server or the client? Do you see the CPU usage immediately or after some time? Btw, please create a new ticket
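One generic way to chase a leak like that is to diff allocation snapshots with the stdlib tracemalloc module (a sketch of the general technique, not what was actually done in the follow-up ticket):

```python
# Snapshot allocations before and after the suspect workload and diff
# them; the top growth sites show where memory is accumulating.
import tracemalloc

tracemalloc.start()
before = tracemalloc.take_snapshot()

# Placeholder workload -- in practice, run the suspect aiohttp server
# or client here for a while before taking the second snapshot.
leaked = [bytearray(1024) for _ in range(1000)]

after = tracemalloc.take_snapshot()
for stat in after.compare_to(before, "lineno")[:10]:
    print(stat)
```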
@samuelcolvin could you create a new PR for FrameworkBenchmarks with aiohttp 2.0?
btw maybe you want to merge aiohttp-benchmarks into aiohttp? or move it to aio-libs?
Was just thinking about this and was waiting for the 2.0 release. Will do. I think this can be closed too; any further discussion should happen on a new issue.
For me it's not part of the framework, so it should be a separate repo. I'll transfer it. I think we should also delete (or move) the current benchmarks directory.
agree on benchmarks directory
I'll wait 24 hours in case the release causes immediate problems which need fixing with patch releases. Congratulations on 2.0.0 🎉
Benchmarks moved into this org: https://github.com/aio-libs/aiohttp-benchmarks. FrameworkBenchmarks updated (PR pending): TechEmpower/FrameworkBenchmarks#2609
Awesome! Thanks!
I've just updated FrameworkBenchmarks to 1.3.1 to stop the `CancelledError` issue (TechEmpower/FrameworkBenchmarks#2561). As part of that, I ran the benchmarks (locally with vagrant, so not the most rigorous testing setup) to see if there were performance changes. Here's the result:
(All numbers are requests per second as given by their `results.json` output. The different steps refer to the number of queries executed for the DB tests and concurrency for the other tests. Apart from aiohttp, no other packages have changed.)
It seems that there is a general trend of ~10% performance regression; however, the code using raw asyncpg queries is consistently much faster.
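For reference, "raw asyncpg" here means querying PostgreSQL through an asyncpg pool directly, with no ORM or abstraction layer in between. A hypothetical handler (the DSN, table, and column names are made up, loosely mirroring the TechEmpower schema):

```python
# Sketch of a "raw asyncpg" aiohttp handler: one parameterized query per
# request, nothing between the handler and the driver.
import asyncpg
from aiohttp import web

async def single_query(request):
    async with request.app["pg_pool"].acquire() as conn:
        # fetchrow runs one query and returns a single record.
        row = await conn.fetchrow(
            "SELECT id, randomnumber FROM world WHERE id = $1", 42
        )
    return web.json_response(dict(row))

async def make_app():
    app = web.Application()
    # Hypothetical DSN; the pool is created once at startup.
    app["pg_pool"] = await asyncpg.create_pool(
        "postgresql://user:pass@localhost/benchmark"
    )
    app.router.add_get("/db", single_query)
    return app

if __name__ == "__main__":
    # Recent aiohttp versions accept a coroutine that returns the app.
    web.run_app(make_app(), port=8080)
```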
I know these tests are far from perfect, but my questions are: