
Quality Control Plan

Yue Qi edited this page May 9, 2019 · 1 revision

Testing Roadmap

The quality control and quality assurance activities should be carried out throughout the entire SDLC, including the following steps:

  1. Conduct technical reviews of the content/interface/design model to uncover errors.
  2. Conduct unit testing to ensure the correctness of each functional component.
  3. Conduct code reviews to verify adherence to the coding standard and the correctness of the implemented functionality.
  4. Conduct integration testing to test the navigation throughout the architecture and uncover errors on interfaces.
  5. Conduct system functionality testing to ensure no obvious defects in the workflow.
  6. Conduct system quality testing to ensure the system can meet the specified quality attributes.
  7. Conduct acceptance testing with a controlled and monitored population of end users to evaluate the usability and reliability of the system.

Technical Review

Design Review

Artifacts: Architecture/interface design document.
Tools: N/A
When: The design document has been completed.
Whom: The whole team.
Entry Criteria: A new version of the design document has been completed.
Exit Criteria: All team members agree on the consistency and completeness of the design with respect to the architecture drivers, or a revision plan has been generated.
Effort Estimation: 8 hours (2 hours/person * 4 people)
Training: If possible, training on process-related issues and the human psychological side of reviews should be conducted.
Defect Reporting & Tracking: Defects found should be added to the project’s backlog to be fixed later.

Code Review

Artifacts: Code implementing the critical functionalities.
Tools: N/A
When: Code implementing the critical functionalities has been developed but not yet delivered.
Whom: The whole team.
Entry Criteria: The critical functional requirements have been implemented.
Exit Criteria: The code has been reviewed for the following aspects:
  1. Coding standard (readability/formatting/commenting/consistency/naming/dead-code removal).
  2. Correctness and efficiency of the algorithms.
  3. Correctness of syntax.
  4. The code implements the design correctly.
Effort Estimation: 8 hours (2 hours/person * 4 people)
Training: Code review process & coding standard.
Defect Reporting & Tracking:
  1. All defects and code-improvement issues reported by the code review should be summarized and added to the issue tracker.
  2. Defects should be marked with their severity, and all critical-level defects should be fixed before the functionality is delivered.
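As an illustration of the exit criteria above, the following hypothetical Python before/after sketch shows the kind of change a review typically requests; both function names and the tax rate are invented for this example:

```python
# Before review: unclear name, dead code, no documentation.
def calc(x):
    tmp = []  # dead code a reviewer would flag for removal
    return x * 0.2

# After review: descriptive name, type hints, docstring, dead code removed.
def sales_tax(amount: float) -> float:
    """Return the 20% sales tax due on `amount` (rate assumed for the example)."""
    return amount * 0.2
```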

Unit Testing

Artifacts: Methods/functions at the code level.
Tools: JUnit, Pytest, JaCoCo, ACTS
When: Unit testing should be completed after implementation but before integration.
Whom: At least two developers test independently.
Entry Criteria: The method has been implemented.
Exit Criteria:
  1. All critical functional requirements have been thoroughly tested.
  2. All classes/methods written by the development team (rather than generated by the IDE) have been thoroughly tested.
  3. All unit tests pass.
  4. For critical methods that implement the core functions, line coverage and branch coverage each reach 80% for critical code parts.
Effort Estimation: 22.5 hours (0.5 hour/test case * 5 test cases/functional requirement * 9 critical functional requirements)
Training: Unit testing concepts and tools.
Defect Reporting & Tracking:
  1. All defects reported by unit testing should be tracked in the issue tracker and fixed as soon as possible.
  2. All defects have to be resolved before development enters the next stage.
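Since the plan names Pytest among the tools, a minimal sketch of a unit test follows; the validator under test is invented for illustration and stands in for one of the critical functional requirements:

```python
# Hypothetical unit under test (name and logic invented for illustration).
def is_valid_score(score: int) -> bool:
    """A score is valid when it falls in the inclusive range 0..100."""
    return 0 <= score <= 100

# Pytest collects functions prefixed with `test_` and runs every assert.
def test_accepts_boundary_values():
    assert is_valid_score(0) and is_valid_score(100)

def test_rejects_out_of_range_values():
    assert not is_valid_score(-1) and not is_valid_score(101)
```

With the pytest-cov plugin installed, running `pytest --cov --cov-branch` reports the line and branch coverage that the 80% targets above are checked against.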

Integration Testing

Artifacts: Interfaces, components.
Tools:
  1. Mockito for testing interfaces.
  2. JUnit and Pytest for testing components.
When: Integration testing should be completed after all components pass unit testing but before system testing.
Whom: At least two developers test independently.
Entry Criteria: Components have passed unit testing.
Exit Criteria:
  1. All interfaces (intra-system or cross-system) have been tested.
  2. All critical components have been tested.
  3. All tests pass.
Effort Estimation: 2 hours/interface
Training: Integration testing concepts and tools.
Defect Reporting & Tracking:
  1. All defects reported by integration testing shall be tracked using the issue tracker and fixed as soon as possible.
  2. All defects have to be resolved before development enters the next stage.
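Mockito covers the Java side; for Python components the standard library's `unittest.mock` plays the same role. A minimal sketch, in which the order service and the payment-gateway interface are invented for illustration:

```python
from unittest.mock import Mock

# Hypothetical cross-system interface: an order service calling an
# external payment gateway (both names invented for illustration).
def place_order(gateway, amount: float) -> str:
    return "confirmed" if gateway.charge(amount) else "declined"

# Mock the remote interface so the test runs without the real system,
# then verify that the interface contract was exercised as expected.
gateway = Mock()
gateway.charge.return_value = True

result = place_order(gateway, 19.99)           # -> "confirmed"
gateway.charge.assert_called_once_with(19.99)  # the call crossed the interface once
```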

Static Analysis

Artifacts: Code
Tools: IntelliJ IDEA, PyCharm, SonarQube
When: Every build.
Whom: The developer responsible for the piece of code.
Entry Criteria: The project can be built.
Exit Criteria:
  1. All coding-standard issues found by the IDE are fixed.
  2. All bugs/vulnerabilities found by SonarQube are fixed.
  3. All code-smell issues found by SonarQube above critical level are fixed.
Effort Estimation: 1 hour/release
Training: Coding-standard check process & SonarQube.
Defect Reporting & Tracking:
  1. All defects reported by static analysis shall be tracked using the issue tracker and fixed as soon as possible.
  2. All defects have to be resolved before development enters the next stage.
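As an example of the bug class such tools report, the mutable-default-argument pattern below is commonly flagged by Python static analyzers, including the IDE inspections named above; the function names are invented:

```python
# Bug pattern commonly flagged by static analysis: the default list is
# created once and shared across calls, so results leak between calls.
def collect_bad(item, items=[]):
    items.append(item)
    return items

# The usual fix: default to None and allocate a fresh list per call.
def collect_good(item, items=None):
    if items is None:
        items = []
    items.append(item)
    return items
```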

System Testing

Testing towards System Functionality

The activity diagram showing the system functionality is presented as follows: Activity Diagram


Testing towards Quality Attributes

Performance Testing (QA-P01)


Performance Testing (QA-P02)


Availability Testing (QA-A01)


Acceptance Testing


Quality Assurance Management

QA in the Scrum Process

Testing should be conducted in parallel with development:

  1. Testing tasks should be established according to the corresponding entry criteria.
  2. Testing tasks should be prioritized and assigned to individuals at the daily standup meetings.
  3. The testing of most user stories should be completed by the end of each sprint.
  4. For some stories, which are completed near the end of a sprint, the corresponding testing may spill into the beginning of the next sprint.

Defects Tracking

  1. All defects should be documented in defect tables (or an issue-tracking tool such as JIRA) and classified by their root cause.
  2. After a defect is detected, the person who introduced it and the person who reported it should provide the information needed for root-cause analysis and repair.
  3. For all issues, the product owner should determine whether they need to be resolved in the current sprint (which may require communicating with the client).
  4. All defects to be fixed should be prioritized based on severity, impact, and effort to repair.
  5. High-priority issues should be re-tested after they are fixed.
  6. The state of each issue (to be fixed, to be re-tested, closed, or won’t be fixed) should be updated in the issue tracker promptly.
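The workflow above can be sketched as a small data model; the field names and the transition check are assumptions for illustration, not a prescribed issue-tracker schema:

```python
from dataclasses import dataclass

# Issue states named in step 6 above.
STATES = {"to be fixed", "to be re-tested", "closed", "won't be fixed"}

@dataclass
class Defect:
    """One defect-table row / issue-tracker record (illustrative only)."""
    title: str
    root_cause: str
    severity: str              # e.g. "critical", "major", "minor" (assumed scale)
    state: str = "to be fixed"

    def transition(self, new_state: str) -> None:
        """Update the defect's state, rejecting states outside the workflow."""
        if new_state not in STATES:
            raise ValueError(f"unknown state: {new_state}")
        self.state = new_state

# Per steps 5-6: a fixed defect is re-tested, then closed.
defect = Defect("login fails on empty password", "missing input check", "critical")
defect.transition("to be re-tested")
defect.transition("closed")
```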
