Lolex is very slow #229
Sorry, I didn't notice that there was previously an issue created for that: #159. I guess there are now two people with inefficient tests 😄 I understand that without a benchmark it's hard to make speed comparisons, but even putting the performance aside, objects are not meant to be used as "hashmaps" instead of Map. This was always inefficient, and that's why Maps (the actual hashmaps) were added and have been widely supported for some time now. At this point it's just bad practice to use objects like this. If there's really a need to support ES3, then I guess the only option for me is to fork. I'll still wait for your response, since I don't know if your requirements have changed in the past few months.
There is a diff I wrote using a priority queue that mostly worked, but it was too risky to enable by default. Lolex has millions of downloads a month and I'm very hesitant to push for this sort of change. If you make a PR that keeps the existing implementation while enabling a queue-based one behind an option/flag - I promise to review/add it as long as it doesn't break the existing API.
Just a historical note: Maps were added because objects don't support non-string (or symbol) keys. Objects used as maps are still as fast as Maps (at least in Node/V8).
But hasn't my test proven otherwise? Maybe my info is outdated, but I remember that extending objects or deleting properties resulted in internal classes being created, at least in V8, so with no property names ever being the same in lolex (property names are unique ids) a lot of stuff is going on under the hood. Or is that not the case now?
This is all OT - but generally V8 represents objects in two ways - "hidden classes" or hash maps. It calls them "slow mode" and "fast mode" and switches between them based on how the object behaves. I go over it in here (it's a slightly old answer but captures the gist of things). When an object is very "dynamic" it behaves the same way a hash map would.
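To make the pattern under discussion concrete, here is a minimal sketch of the two storage strategies being compared. The shapes and names (`firstTimerObject`, `firstTimerMap`, `callAt`) are simplified stand-ins for illustration, not lolex's actual internals:

```javascript
// Object-as-map variant: unique numeric ids as keys, `delete` on removal.
// Frequent adds/deletes of never-repeating keys tend to push V8 objects
// into dictionary ("slow") mode.
function firstTimerObject(timers) {
  let first = null;
  for (const id in timers) {
    if (!Object.prototype.hasOwnProperty.call(timers, id)) continue;
    if (first === null || timers[id].callAt < first.callAt) first = timers[id];
  }
  return first;
}

// Map variant: same logic, no hasOwnProperty guard needed.
function firstTimerMap(timers) {
  let first = null;
  for (const timer of timers.values()) {
    if (first === null || timer.callAt < first.callAt) first = timer;
  }
  return first;
}

// Both should agree on which timer fires next.
const obj = {};
const map = new Map();
[{ id: 1, callAt: 30 }, { id: 2, callAt: 10 }, { id: 3, callAt: 20 }].forEach((t) => {
  obj[t.id] = t;
  map.set(t.id, t);
});
console.log(firstTimerObject(obj).id, firstTimerMap(map).id); // both report timer 2
```

Either way the scan is linear; the difference in practice comes from how the engine represents the container, which is what the comments above are debating.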
I implemented a Map version and also an Array version to have more comparison between different data structures. Below are the results of stress testing and the code I used to perform the test. TL;DR both the Map and Array versions perform comparably, and the current version (called "default" in the tests) stands out as the slowest. All of the versions seem to reach linear-ish growth in test duration at some point, and from that we can deduce that the current version can be even 13 times slower than the alternatives in some scenarios. In the most basic scenario of a recurring timeout the current version is ~3 times slower. I'd like to note that I didn't put much effort into implementing the Map and Array versions, with the exception of implementing a "clever" fast removal of an item from an array, so this could possibly be further improved. I'd also like to note that the slowdown from my original post was extremely pronounced because of the slower machine I'm using for work. Not everyone can use top-notch equipment, and those are the cases where performance really matters.

The Results

I reran the tests several times and the results seem to be consistent on my machine.
The Code

```javascript
"use strict";

var lolex = require("../src/lolex-src");

function testClock(flags, count, totalTime) {
  var clock = lolex.createClock(undefined, flags);

  function work() {
    clock.setTimeout(work, count);
  }

  for (let index = 0; index < count; ++index) {
    clock.setTimeout(work, index);
  }

  clock.tick(totalTime);
}

function stress(name, flags) {
  describe(`stress (${name})`, () => {
    beforeEach(() => {
      global.gc();
    });

    it("should warm up", () => {
      testClock(flags, 1, 100000);
    }).timeout(1000000);

    it("should handle 1 timeout at all times in the queue (ineffective interval)", () => {
      testClock(flags, 1, 1000000);
    }).timeout(1000000);

    it("should handle 2 timeouts at all times in the queue", () => {
      testClock(flags, 2, 1000000);
    }).timeout(1000000);

    it("should handle 4 timeouts at all times in the queue", () => {
      testClock(flags, 4, 1000000);
    }).timeout(1000000);

    it("should handle 8 timeouts at all times in the queue", () => {
      testClock(flags, 8, 1000000);
    }).timeout(1000000);

    it("should handle 16 timeouts at all times in the queue", () => {
      testClock(flags, 16, 1000000);
    }).timeout(1000000);

    it("should handle 32 timeouts at all times in the queue", () => {
      testClock(flags, 32, 1000000);
    }).timeout(1000000);

    it("should handle 64 timeouts at all times in the queue", () => {
      testClock(flags, 64, 1000000);
    }).timeout(1000000);

    it("should handle 128 timeouts at all times in the queue", () => {
      testClock(flags, 128, 1000000);
    }).timeout(1000000);

    it("should handle 256 timeouts at all times in the queue", () => {
      testClock(flags, 256, 1000000);
    }).timeout(1000000);

    it("should handle 512 timeouts at all times in the queue", () => {
      testClock(flags, 512, 1000000);
    }).timeout(1000000);
  });
}

stress("default");
stress("map", { useMap: true });
stress("array", { useArray: true });
```

So I've ended up preparing two new alternatives that perform comparably, with the map version being better in more extreme scenarios. OTOH the benefit of the array version is that old browsers will support it. What do you suggest I do at this point? Having three different versions is overkill, and I added more than one just for testing purposes in the first place. When you've made the decision I'll prepare a PR.
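The "clever" fast removal of an array item mentioned above is not shown in the thread; a common technique it may refer to is swap-and-pop, sketched here as an assumption (`removeAt` is an invented name, not the author's code):

```javascript
// Swap-and-pop: O(1) removal from an array when element order doesn't matter.
// Instead of splicing (which shifts every later element), overwrite the
// removed slot with the last element and shrink the array by one.
function removeAt(arr, index) {
  const last = arr.pop();
  if (index < arr.length) {
    arr[index] = last; // the former last element takes the removed slot
  }
  return arr;
}

const timers = ["a", "b", "c", "d"];
removeAt(timers, 1); // removes "b"; "d" takes its place
console.log(timers); // [ 'a', 'd', 'c' ]
```

The trade-off is that the array's order is scrambled, which is fine for a timer table that is scanned in full to find the next due timer anyway.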
@fatfisz first of all - thanks for this! Personally I'd prefer a priority queue version alongside the current implementation, with the ability to switch between them. A map/array is nice, but switching to a priority queue makes sense to me and really helped our perf in Node :) Basically I suspect the majority use case is executing timers one by one in order up to a certain point, for which a priority queue is ideal since timers are not inserted in order. Note that your benchmark only inserts timers "in order".
Ok, I'll do a priority queue version too. I didn't want to invest too much time into this and went for a quick win instead, but I have no problem with working on it more. Btw. all versions are implemented next to each other and activated by flags (

The most problematic thing in here is the

This all hinges on an assumption that
There are multiple versions on NPM but you can also use the one we use at Node.
A priority queue (implemented as a heap) is actually quite good at that.
I think that's a safe and valid assumption.
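For reference, the priority-queue approach suggested above can be sketched as a binary min-heap keyed by due time. This is an illustrative toy, not lolex's or Node's actual implementation; `TimerHeap` and `callAt` are invented names:

```javascript
// Minimal binary min-heap ordered by callAt: push/pop are O(log n),
// and the earliest timer is always at index 0.
class TimerHeap {
  constructor() { this.items = []; }
  push(timer) {
    const a = this.items;
    a.push(timer);
    let i = a.length - 1;
    while (i > 0) { // sift up until the parent is not later
      const p = (i - 1) >> 1;
      if (a[p].callAt <= a[i].callAt) break;
      [a[p], a[i]] = [a[i], a[p]];
      i = p;
    }
  }
  pop() {
    const a = this.items;
    const top = a[0];
    const last = a.pop();
    if (a.length > 0) {
      a[0] = last;
      let i = 0;
      for (;;) { // sift down toward the smaller child
        const l = 2 * i + 1, r = l + 1;
        let min = i;
        if (l < a.length && a[l].callAt < a[min].callAt) min = l;
        if (r < a.length && a[r].callAt < a[min].callAt) min = r;
        if (min === i) break;
        [a[min], a[i]] = [a[i], a[min]];
        i = min;
      }
    }
    return top;
  }
}

// Timers inserted out of order come back sorted by due time.
const heap = new TimerHeap();
[50, 10, 30, 20].forEach((callAt) => heap.push({ callAt }));
const order = [];
while (heap.items.length > 0) order.push(heap.pop().callAt);
console.log(order); // [ 10, 20, 30, 50 ]
```

This is exactly the shape of workload described above: timers arrive in arbitrary order but are consumed earliest-first.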
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Sorry for the lack of communication. I actually made 2 attempts at working on that, but every time I gave up because the way Lolex is structured makes it very uncomfortable for me to work on it (1 big file with everything in it). Also there were some other problems that would need to be solved anyway. So I'll just close this issue, since at this point it's stale, as the bot said. |
Is there any chance we could get the (lolex) code used in the stress test above? A 10x change in lolex execution time would change my life, even if I have to go bug-hunting. Any help extremely appreciated.
Same answer as #159 . Just have a look at a suitable priority queue and see if you can make it work. Ask questions that are not already answered by these two issues. |
@fatso83 I actually found https://github.com/akhilkedia/lolex-queue/blob/master/fast-lolex.js and got it to work for my needs. The tick() method wasn't implemented (only next() was), nor were a few other methods it depends on, so I implemented them, and it's now over 10 times faster.
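Layering tick() on top of a next()-style API, as described above, is straightforward in principle. The sketch below is a guess at the shape of that relationship; `MiniQueueClock` and its methods are invented here and are not the actual fast-lolex code:

```javascript
// Hypothetical sketch: tick() built on a next()-style primitive.
class MiniQueueClock {
  constructor() { this.now = 0; this.queue = []; }
  setTimeout(fn, delay) {
    this.queue.push({ fn, callAt: this.now + delay });
    this.queue.sort((a, b) => a.callAt - b.callAt); // keep earliest first
  }
  // next(): fire the earliest pending timer and advance the clock to it.
  next() {
    const timer = this.queue.shift();
    if (!timer) return;
    this.now = timer.callAt;
    timer.fn();
  }
  // tick(): run next() while the earliest timer is due within the window,
  // then land exactly on the target time.
  tick(ms) {
    const target = this.now + ms;
    while (this.queue.length > 0 && this.queue[0].callAt <= target) {
      this.next();
    }
    this.now = target;
  }
}

const clock = new MiniQueueClock();
const fired = [];
clock.setTimeout(() => fired.push("b"), 20);
clock.setTimeout(() => fired.push("a"), 10);
clock.tick(15);
console.log(fired, clock.now); // [ 'a' ] 15
```

The subtlety a real implementation has to handle is that callbacks fired by next() may schedule new timers, which is why tick() re-checks the head of the queue on every iteration rather than snapshotting it up front.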
@arthurwolf I had a branch with fake-timers using a priority queue but #159 (comment) |
For the "it was never slow for me": it definitely was for me (I used to run a trading bot in "backtesting" mode instead of its normal "live" mode, so it spends a lot of time within lolex).

Regarding "we don't have benchmarks": any timing will show a big difference between the two methods. The code I forked from (lolex-fast.js) even provides a testing script in their repo.
I'm running Node 10.5.0, and my test runner is Jest.
What actually happens
As in the title, Lolex is very slow, at least much slower than it could be. For a particular test in my project the current Lolex implementation takes 124s to complete on my machine. After some profiling and diagnosing the problem, I managed to trim that down to 22s.
The culprit is `clock.timers`: it's an object, but it's not an appropriate data structure for the operations that Lolex performs on it. Profiling in Chrome showed that the slowest pieces of code in particular are the `for...in` loops with `hasOwnProperty` calls. Using `Map` for `clock.timers` resulted in a tremendous speedup.

How to reproduce
I have a piece of code that sets a new timeout a second after the function is called, something like this:
Then my test looks like so:
Basically there's a clock component and I want to test what is displayed at different moments in time. While the test may not be perfect (it tests the whole integration instead of display and step separately), it highlighted the problem.
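The original code snippets above were lost; what follows is a hedged reconstruction of the described pattern, a function that re-arms a one-second timeout each time it runs, against a bare-bones stand-in for a fake clock. All names here (`setFakeTimeout`, `advance`, `scheduleTick`) are invented for illustration and are not the issue author's code or lolex's API:

```javascript
// Tiny stand-in for a fake clock: a virtual "now" plus a pending-timer list.
let now = 0;
const pending = [];

function setFakeTimeout(fn, delay) {
  pending.push({ fn, callAt: now + delay });
}

function advance(ms) {
  const target = now + ms;
  // Fire due timers earliest-first until none remain before the target time.
  for (;;) {
    pending.sort((a, b) => a.callAt - b.callAt);
    if (pending.length === 0 || pending[0].callAt > target) break;
    const timer = pending.shift();
    now = timer.callAt;
    timer.fn();
  }
  now = target;
}

// The pattern from the issue: each run re-schedules itself a second later.
let renders = 0;
function scheduleTick() {
  renders++; // e.g. update the displayed time
  setFakeTimeout(scheduleTick, 1000);
}

scheduleTick();
advance(5000); // advance fake time by 5 seconds
console.log(renders); // 6: the initial call plus one per elapsed second
```

Note how every advance step adds exactly one new pending timer, so the timer table churns constantly even though it stays small; that churn is what stresses the underlying data structure.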
Next steps
I'll gladly prepare a PR with the changes. From what I've managed to find out, the lowest supported version of Node for Lolex is 6, and it supports `Map`, so there should be no problem.