# Minimum definition of done for product work and program work


## Minimum definition of done for user stories implemented in the product

| Deliverable or Required Service | Performance Standard(s) | Acceptable Quality Level | Method of Surveillance |
| --- | --- | --- | --- |
| Tested Code | Code delivered under the order must have substantial test code coverage | Minimum of 90% test coverage of all relevant code; which coverage metric to use is agreed upon by the team | Combination of manual review and results from automated testing; the build currently fails the lines metric when line coverage is below 90% |
| Properly Styled Code | TSLint based on codelyzer settings for the frontend, and eslint-config-airbnb for the server | 0 linting errors and at most 10 warnings | Combination of manual review and results from automated testing |
| Accessible | Web Content Accessibility Guidelines 2.0 AA (WCAG 2.0 AA) standards | 0 errors reported for WCAG 2.0 AA standards by an automated scanner and 0 errors reported in manual testing | http://squizlabs.github.io/HTML_CodeSniffer/ or https://github.com/pa11y/pa11y |
| Deployed | Code must successfully build and deploy into the staging environment | Successful build with a single command | Combination of manual review and automated testing |
| Documentation | All dependencies are listed and their licenses are documented. Major functionality in the software/source code is documented. Individual methods are documented inline using comments that permit the use of tools such as JSDoc. A system diagram is provided. | Changes to documentation are updated | Combination of manual review and automated testing, if available |
| User research | Usability testing and other user research methods must be conducted at regular intervals throughout the development process (not just at the beginning or end) | Artifacts from usability testing and/or other research methods with end users are available at the end of every applicable sprint, in accordance with the vendor's research plan | Product owner evaluates the artifacts against the research plan |
| Secure | Snyk vulnerability testing and the OWASP Application Security Verification Standard 3.0 | Code submitted must be free of medium- and high-level static and dynamic security vulnerabilities | Clean tests from a static testing SaaS (such as Snyk) and from OWASP ZAP, along with documentation explaining any false positives |
| Code review | Peer review | Peer review comments addressed | Pull requests cannot merge into restricted branches until reviewed |
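To make the coverage gate concrete, here is a minimal sketch of how a 90% line-coverage threshold can be enforced. The page does not name the project's test runner, so a Jest/Istanbul-style configuration is assumed here:

```js
// jest.config.js — a sketch only; the actual test runner and coverage tool
// used by the project are not specified on this page.
module.exports = {
  collectCoverage: true,
  coverageThreshold: {
    global: {
      // Fail the test run when line coverage drops below 90%,
      // mirroring the "lines" metric the table calls out.
      lines: 90,
    },
  },
};
```

Other metrics (branches, functions, statements) can be added to the threshold once the team agrees on which coverage metric to enforce.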
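The style gate pairs a shareable config with a warning cap. A minimal server-side setup, assuming the eslint-config-airbnb package named in the table is installed:

```js
// .eslintrc.js — extends the Airbnb ruleset the table specifies for the server.
module.exports = {
  extends: 'airbnb',
};
```

Running `eslint --max-warnings 10 .` then fails the check on any lint error or on more than the ten allowed warnings; the frontend plays the same role with its TSLint/codelyzer configuration.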
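The accessibility row links to pa11y, which can also be driven from Node so the WCAG 2.0 AA scan runs in CI. A minimal sketch (the URL is a placeholder for the staging environment):

```js
// scan-a11y.js — drives pa11y (linked in the table) against a running page.
const pa11y = require('pa11y');

async function main() {
  const results = await pa11y('http://localhost:8080/', {
    standard: 'WCAG2AA', // the WCAG 2.0 AA ruleset named above
  });

  // Print each reported issue, then exit non-zero so CI fails on any error.
  results.issues.forEach((issue) => {
    console.log(`${issue.code}: ${issue.message} (${issue.selector})`);
  });
  process.exit(results.issues.length > 0 ? 1 : 0);
}

main().catch((error) => {
  console.error(error);
  process.exit(1);
});
```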
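For the documentation row, "comments that permit the use of tools such as JSDoc" means annotating each method inline. A sketch using a hypothetical helper (the name and signature are illustrative, not taken from the codebase):

```js
/**
 * Looks up a permit record by its identifier.
 *
 * @param {string} permitId - Identifier of the permit to look up.
 * @returns {Promise<Object|null>} The permit record, or null if none exists.
 */
async function findPermit(permitId) {
  // Placeholder body: a real implementation would query the data store.
  return null;
}

module.exports = { findPermit };
```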

## Minimum definition of done for program work

- Wiki updated where needed
- Updated invitations in the group calendar
- Ops repository documentation updated
- Stakeholders notified