Ability to specify benchmarks that should run synchronously #5412
Comments
It looks like multiple suites run concurrently; see vitest/packages/vitest/src/runtime/runners/benchmark.ts, lines 38 to 39 at commit 6eda473.
However, multiple benches inside a single suite appear to run sequentially.
Though I'm not familiar with the field of benchmarking, I would feel everything should run sequentially by default, and that probably gives better stability. Maybe we can check what other benchmark frameworks do in general.
I'm playing with examples like this one (https://stackblitz.com/edit/vitest-dev-vitest-cdshc4?file=test.log). Multiple bench files also run in parallel by default, but they can be made sequential.
Sample code:
import { bench, describe } from "vitest"
import fs from "node:fs";
const sleep = (ms: number) => new Promise(resolve => setTimeout(resolve, ms))
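// Each bench appends a line to test.log, so the order of entries in that file
// shows whether benches and suites interleave (i.e. run concurrently).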
describe("suite1", () => {
let i = 0;
bench("bench1", async () => {
await fs.promises.appendFile("test.log",`file1, suite1, bench1 = ${i++}\n`)
await sleep(500);
}, { time: 0, iterations: 0 })
let j = 0;
bench("bench2", async () => {
await fs.promises.appendFile("test.log",`file1, suite1, bench2 = ${j++}\n`)
await sleep(500);
}, { time: 0, iterations: 0 })
})
describe("suite2", () => {
let i = 0;
bench("bench1", async () => {
await fs.promises.appendFile("test.log",`file1, suite2, bench1 = ${i++}\n`)
await sleep(500);
}, { time: 0, iterations: 0 })
})
Thank you very much. We did what you suggested and it worked fine. We implemented it in vite.config.js like this, so that our tests can still run in parallel.
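The config from that comment is not preserved in this thread. The following is a minimal sketch of one way to achieve it, assuming Vitest's fileParallelism option and that mode is "benchmark" when the runner is invoked via vitest bench; it is not necessarily the author's exact config.
// vitest.config.ts (illustrative sketch only)
import { defineConfig } from "vitest/config"

export default defineConfig(({ mode }) => ({
  test: {
    // Keep regular test runs parallel, but run benchmark files one at a time
    // so they do not compete for the shared database.
    fileParallelism: mode !== "benchmark",
  },
}))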
And regarding whether benchmarks should run sequentially: I can see arguments for both. A good first step could be to add a small section about it to the documentation.
I do agree with that. I think we should flatten all benchmarks and use the suite just for reporting.
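As a rough illustration of what flattening could mean, here is a purely hypothetical sketch, not Vitest's actual runner code, and the BenchTask shape is invented for the example: collect every bench from every suite into one flat list, await each one in turn, and keep the suite name only as a reporting label.
// Illustrative only. Assumes every bench from every describe() block has
// already been collected into a flat list.
interface BenchTask {
  suiteName: string
  benchName: string
  fn: () => Promise<void> | void
}

async function runFlattenedSequentially(tasks: BenchTask[]) {
  for (const task of tasks) {
    // Awaiting each bench before starting the next prevents overlapping work
    // on shared resources such as a database.
    await task.fn()
    console.log(`${task.suiteName} > ${task.benchName} finished`)
  }
}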
Clear and concise description of the problem
In cases where the code under test is not only JavaScript but also uses a shared resource such as a database, benchmarks are difficult to run in isolation.
Example: we are currently benchmarking Prisma vs Kysely and have written these tests, but since they use a shared resource it is not optimal to run them in parallel.
Here is one of the tests
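The test referenced above is not preserved in this thread. A hypothetical bench of roughly that shape could look like the following; the client setup, table name, and query are placeholders, not the author's code.
// Placeholder sketch: schema, table, and query are invented for illustration.
import { bench, describe } from "vitest"
import { PrismaClient } from "@prisma/client"
import { Kysely, PostgresDialect } from "kysely"
import pg from "pg"

const prisma = new PrismaClient()
const db = new Kysely<any>({
  dialect: new PostgresDialect({ pool: new pg.Pool() }),
})

// Both benches talk to the same database, so letting them run concurrently
// makes their timings interfere with each other.
describe("fetch 100 users", () => {
  bench("prisma", async () => {
    await prisma.user.findMany({ take: 100 })
  })

  bench("kysely", async () => {
    await db.selectFrom("user").selectAll().limit(100).execute()
  })
})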
Suggested solution
An ability to run
vitest bench --synchronous
where all promises are awaited and the benchmark tests are performed synchronously, one after another.
Alternative
No response
Additional context
No response