Automatically annotate slow tests #86
Comments
As with the other issue, if there's interest in this I'd consider submitting a PR.
@bdsl Thanks for this idea. As for how to modify your suite's test code, that is not a feature I want to bring into scope for SpeedTrap. Perhaps a tool like Rector (GitHub, Website) could perform the actual code modification. As you suggested, such a tool would need a programmatic output from SpeedTrap identifying which test case and test method were detected as slow. Adding a programmatic output would be a valuable feature for SpeedTrap, which currently only outputs to the screen.
I agree it makes sense not to put code-modification functionality into SpeedTrap; it would fit better in Rector. I don't know whether Rector currently has a rule to add a given annotation to a list of classes or methods, but hopefully that would be easy to create if not. I'll try to find a clean way to output an easily machine-readable list from SpeedTrap. I suppose the difficulty is that SpeedTrap doesn't have any CLI of its own to take options or manage output.
Related to #88
Once we've found the slow tests, it might be handy to have them all annotated. That would let us find them again later, and would also let us configure PHPUnit to run them separately from the fast tests - e.g. we might want to run fast tests before commit and slow tests only before deployment.
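The split described above could look like this (a sketch using PHPUnit's built-in `@group` annotation and its `--group` / `--exclude-group` command-line options; the test name is invented):

```php
/**
 * Marked slow so it can be excluded from the quick pre-commit run.
 *
 * @group slow
 */
public function testFullPurchaseFlow(): void
{
    // ...
}
```

With tests tagged this way, `phpunit --exclude-group slow` runs the quick pre-commit suite and `phpunit --group slow` runs only the slow tests before deployment.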
I'm not sure the functionality to edit test code necessarily belongs inside this tool - maybe it would be better to implement this as a recommendation of another tool that can do the annotations (assuming one exists), plus an option to output the list of test cases in a suitable machine-readable format - probably something like what you get by piping the existing output through
grep -o '\S*::\S*'
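For illustration, here is that pipeline run against two invented report lines (the exact wording of SpeedTrap's screen output may differ; only the `Class::method` shape matters to the pattern):

```shell
# Two invented lines in the general shape of SpeedTrap's screen report.
speedtrap_output=' 1. 1205ms to run Acme\Tests\CheckoutTest::testFullPurchaseFlow
 2.  734ms to run Acme\Tests\ReportTest::testYearlySummary'

# -o prints only the matching part of each line, so everything except the
# Class::method token (rank, duration, prose) is discarded.
printf '%s\n' "$speedtrap_output" | grep -o '\S*::\S*'
```

Each matched token comes out on its own line, which is easy for a follow-up annotation tool to consume.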