Project / Product Test and Quality Assurance Procedure  

1. Purpose #

1.1 This procedure establishes a formal and standardized process for all testing and quality assurance (QA) activities for projects and products. Its purpose is to ensure that all developed solutions are rigorously tested to identify defects, validate functionality against requirements, and ensure a high-quality, stable, and user-centric product before deployment to production.

2. Scope of Application #

2.1 This procedure applies to all testing and QA activities for projects and products that have completed the Development and Monitoring phase (BM-TD-P-25-004) and are ready for formal verification. It covers all tasks from test plan creation and execution to bug tracking, regression testing, and User Acceptance Testing (UAT).

3. Definition #

3.1 Test and QA Phase: The period of a project’s lifecycle dedicated to verifying that the developed product meets the quality standards and requirements outlined in the planning and design documents.
3.2 Test Plan: A detailed document that outlines the scope, objectives, schedule, and resources required for a testing effort, including the types of testing to be performed.
3.3 Test Case: A specific set of conditions or actions used to verify the functionality of a feature and confirm that a requirement is met.
3.4 Defect/Bug: An error, flaw, or fault in a computer program that causes it to produce an incorrect or unexpected result.
3.5 Regression Testing: A type of testing that ensures recent code changes have not adversely affected existing functionalities.
3.6 UAT (User Acceptance Testing): Formal testing by business stakeholders or end-users to confirm that the product meets their requirements.
3.7 Staging Environment: A pre-production environment used for final testing and validation before deploying to the live production environment.
3.8 Quality Standards: The set of criteria used to measure the fitness of the product, including its adherence to functional and non-functional requirements, usability, performance, and security.

4. Responsibility #

4.1 Developers
4.1.1 Conducts unit and integration testing as part of the development process.
4.1.2 Ensures that automated tests (e.g., unit and integration tests) are integrated into the CI/CD pipeline, so that every time a developer submits new code, an automated process runs these tests to verify that the changes are stable and have not introduced new bugs (see the sketch after this list).
4.1.3 Fixes bugs and defects identified during the testing phase.
4.1.4 Supports the QA and UAT processes by providing technical clarifications and assistance.
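For illustration, a minimal automated unit test of the kind referenced in 4.1.2 is sketched below, assuming pytest as the project’s test runner; the `pricing` module and `calculate_order_total` function are hypothetical placeholders, not part of any existing codebase.

```python
# Illustrative only: a minimal automated unit test of the kind referenced in 4.1.2.
# The module `pricing` and function `calculate_order_total` are hypothetical examples;
# substitute the project's real modules. Assumes pytest is the configured test runner,
# so the CI/CD pipeline can run the whole suite with a plain `pytest` command on each commit.
import pytest

from pricing import calculate_order_total  # hypothetical module under test


def test_order_total_applies_percentage_discount():
    # 100.00 with a 10% discount should come to 90.00.
    assert calculate_order_total(subtotal=100.00, discount_pct=10) == pytest.approx(90.00)


def test_order_total_rejects_negative_subtotal():
    # Invalid input should fail loudly rather than produce a wrong total.
    with pytest.raises(ValueError):
        calculate_order_total(subtotal=-5.00, discount_pct=0)
```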
4.2 Project Lead
4.2.1 Oversees the entire testing and QA process, ensuring it is thorough and effective.
4.2.2 Approves test plans and ensures all test cases are aligned with the project’s requirements.
4.2.3 Coordinates with the development team to prioritize and schedule bug fixes.
4.2.4 Records and manages the defect backlog in Jira, ensuring bugs are tracked and prioritized.
4.2.5 Compiles all test results, bug reports, and UAT feedback into a final report.
4.2.6 Drafts and manages the product test checklist to track the execution of individual test cases and their results.
4.3 IT Operational Manager
4.3.1 Facilitates communication between the development team and Business Development regarding testing progress and defect status.
4.3.2 Provides final approval and sign-off on the overall quality of the product.
4.3.3 Provides final authorization for the product’s deployment after a successful Final Quality Review and Sign-off.
4.4 Business Development Department
4.4.1 Collaborates with the Technology Department to define UAT success criteria.
4.4.2 Performs formal User Acceptance Testing (UAT) to ensure the product meets business requirements.
4.4.3 Provides final sign-off on the product’s quality and readiness for deployment from a business perspective.
4.5 Creative & Design Department
4.5.1 Provides support to developers on design implementation, ensuring the final product matches the approved mockups and prototypes.

5. Workflow #

5.1 Test Planning
5.1.1 The Senior Developer, in collaboration with developers and Business Development, drafts a comprehensive test plan that outlines all testing activities.
5.1.2 The test plan includes functional, integration, regression, and UAT testing, with clear objectives and success criteria.
5.1.3 The test plan is reviewed and approved by the Senior Developer and IT Operational Manager.
5.2 Test Case Creation & Execution
5.2.1 Test Case Structure: Each test case must be detailed and follow a specific format to ensure clarity and repeatability. A test case should include the fields listed below (a minimal structured representation is sketched after this list):
* Test Case ID: A unique identifier for the test.
* Test Title: A clear and concise description of the test’s purpose.
* Preconditions: Any setup steps required before the test can be run.
* Test Steps: The specific actions to be performed, listed in a clear, numbered sequence.
* Expected Result: The specific outcome or behavior the system should exhibit.
* Actual Result: The outcome observed during test execution.
* Status: Pass, Fail, or N/A.
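For illustration, the sketch below shows one possible way to capture the fields from 5.2.1 as a structured record (for example, for export to Jira or a spreadsheet); it is a non-normative Python example, and all sample values are hypothetical.

```python
# Illustrative only: one way to represent the test-case fields listed in 5.2.1
# as a structured record, so results can be exported to Jira or a spreadsheet.
# Field names mirror 5.2.1; the sample values are hypothetical.
from dataclasses import dataclass
from enum import Enum
from typing import List, Optional


class Status(Enum):
    PASS = "Pass"
    FAIL = "Fail"
    NA = "N/A"


@dataclass
class TestCase:
    test_case_id: str                      # Test Case ID: unique identifier
    title: str                             # Test Title: purpose of the test
    preconditions: List[str]               # Preconditions: setup required before running
    steps: List[str]                       # Test Steps: numbered actions to perform
    expected_result: str                   # Expected Result: behavior the system should exhibit
    actual_result: Optional[str] = None    # Actual Result: filled in during execution
    status: Status = Status.NA             # Status: Pass, Fail, or N/A


# Example with hypothetical values:
tc = TestCase(
    test_case_id="TC-042",
    title="Login succeeds with valid credentials",
    preconditions=["A registered user account exists on staging"],
    steps=["Open the login page", "Enter valid credentials", "Submit the form"],
    expected_result="User is redirected to the dashboard",
)
tc.actual_result = "User is redirected to the dashboard"
tc.status = Status.PASS
```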
5.2.2 Developers create and execute these detailed test cases on the staging environment. All results, including pass/fail status and any identified bugs, are logged in Jira. The Project Lead is responsible for documenting the results in Jira.
5.3 Code Quality Standards
5.3.1 A set of rules and guidelines that ensure all code is consistent, maintainable, readable, secure, and performant. These standards apply to all team members and all parts of the codebase.
5.3.2 General Principles (a short sketch illustrating the Security and Readability principles follows this list)
* Maintainability – Code should be easy to modify and extend with minimal risk of introducing bugs.
* Readability – Code should be clear, consistent, and self-explanatory to any developer.
* Security – Code should follow best practices to prevent vulnerabilities (e.g., input validation, secure storage, and safe handling of credentials).
* Performance – Code should be efficient in time and memory usage, avoiding premature optimization but addressing clear performance bottlenecks.
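For illustration, the short Python sketch below makes the Security and Readability principles concrete (input validation, credentials read from the environment, clear naming); the names `PAYMENT_API_KEY`, `get_payment_api_key`, and `parse_quantity` are hypothetical examples.

```python
# Illustrative only: two of the general principles in 5.3.2 made concrete.
# Security: validate input and read credentials from the environment instead of
# hard-coding them. Readability: clear names and focused docstrings.
# The variable name PAYMENT_API_KEY is a hypothetical example.
import os


def get_payment_api_key() -> str:
    """Return the payment API key from the environment, never from source code."""
    key = os.environ.get("PAYMENT_API_KEY")
    if not key:
        raise RuntimeError("PAYMENT_API_KEY is not set")
    return key


def parse_quantity(raw: str) -> int:
    """Validate user-supplied quantity before it reaches business logic."""
    quantity = int(raw)  # raises ValueError on non-numeric input
    if not 1 <= quantity <= 1000:
        raise ValueError("quantity must be between 1 and 1000")
    return quantity
```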
5.3.3 Core Development Principles
5.3.3.1 SOLID
* Single Responsibility: Each class/function should do one thing.
* Open/Closed: Code should be open for extension but closed for modification.
* Liskov Substitution: Subtypes should be replaceable with their base types.
* Interface Segregation: Prefer small, specific interfaces over large general ones.
* Dependency Inversion: Depend on abstractions, not on concretions.
5.3.3.2 DRY (Don’t Repeat Yourself) – Avoid duplicating logic; abstract common functionality.
5.3.3.3 YAGNI (You Aren’t Gonna Need It) – Only implement features when they are required, not based on assumptions.
5.3.3.4 KISS (Keep It Simple, Stupid) – Prefer straightforward solutions over overly complex ones. A short sketch illustrating several of these principles appears below.
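For illustration, the sketch below applies Single Responsibility, Open/Closed, Dependency Inversion, and DRY to a small hypothetical reporting helper; the class names are examples only, not part of any existing codebase.

```python
# Illustrative only: a small sketch of several principles from 5.3.3.
# The names ReportStore, FileReportStore, and ReportService are hypothetical.
from abc import ABC, abstractmethod


class ReportStore(ABC):
    """Small, specific interface (Interface Segregation / Dependency Inversion)."""

    @abstractmethod
    def save(self, name: str, content: str) -> None: ...


class FileReportStore(ReportStore):
    """One responsibility: persisting reports to disk (Single Responsibility)."""

    def save(self, name: str, content: str) -> None:
        with open(f"{name}.txt", "w", encoding="utf-8") as fh:
            fh.write(content)


class ReportService:
    """One responsibility: building report content. Storage is injected, so new
    stores (database, object storage, ...) extend behavior without modifying
    this class (Open/Closed)."""

    def __init__(self, store: ReportStore) -> None:
        self._store = store

    def publish(self, name: str, lines: list[str]) -> None:
        content = "\n".join(lines)  # shared formatting lives in one place (DRY)
        self._store.save(name, content)


# Usage: swap FileReportStore for any other ReportStore implementation.
ReportService(FileReportStore()).publish("weekly_qa_summary", ["All tests passed."])
```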
5.3.4 Style Guidelines
1. Follow language-specific style guides (e.g., PEP 8 for Python, ESLint for JavaScript/TypeScript, Google/Oracle guidelines for Java, etc.).
2. Write meaningful comments only where needed (focus on why, not what).
3. Ensure consistent formatting via automated tools (e.g., Black for Python, Prettier for JS).
5.4 Defect Tracking & Management
5.4.1 Any bugs or defects found during testing are immediately logged in Jira with sufficient detail (see the defect-logging sketch after this subsection).
5.4.2 The IT Operational Manager and Project Lead review all new bugs, prioritizing them for a fix in the current or a future sprint.
5.4.3 The development team is responsible for fixing the bugs, and the original tester verifies the fix.
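For illustration, the sketch below shows how a defect could be logged in Jira programmatically, assuming the third-party `jira` Python client; the server URL, project key, and credential environment variables are placeholders and would need to match the team’s actual Jira configuration.

```python
# Illustrative only: logging a defect in Jira as described in 5.4.1, sketched with the
# third-party `jira` Python client (pip install jira). The server URL, project key,
# and credential variables are placeholders, not the team's real configuration.
import os

from jira import JIRA

jira = JIRA(
    server="https://example.atlassian.net",  # placeholder URL
    basic_auth=(os.environ["JIRA_USER"], os.environ["JIRA_API_TOKEN"]),
)

issue = jira.create_issue(
    project="QA",  # placeholder project key
    summary="Login page: 500 error when password field is empty",
    description=(
        "Steps to reproduce, expected result, actual result, environment, "
        "and screenshots go here, per 5.4.1."
    ),
    issuetype={"name": "Bug"},
)
print(f"Logged defect {issue.key}")
```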
5.5 Regression Testing
5.5.1 Regression testing is often automated via the CI/CD pipeline (a minimal sketch follows this subsection).
5.5.2 After bug fixes are implemented, the development team performs regression testing to ensure that the changes have not introduced new bugs or broken existing functionality.
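For illustration, the sketch below shows one way the regression run could be triggered from the CI/CD pipeline, assuming pytest and a `regression` marker registered in the project’s test configuration; both are assumptions, not a description of the existing pipeline.

```python
# Illustrative only: a CI entry point for the regression run described in 5.5,
# assuming pytest and a "regression" marker registered in the project's
# pytest configuration (both are assumptions).
import sys

import pytest

if __name__ == "__main__":
    # Run only tests marked @pytest.mark.regression; a non-zero exit code
    # fails the pipeline stage so a broken build cannot be promoted to staging.
    exit_code = pytest.main(["-m", "regression", "--maxfail=5", "-q"])
    sys.exit(exit_code)
```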
5.6 User Acceptance Testing (UAT)
5.6.1 Once the product is stable on the staging environment and all major bugs are fixed, the Business Development Department participates in UAT.
5.6.2 Business Development executes a predefined set of UAT test cases to formally validate that the product meets business and user requirements.
5.6.3 The Project Lead is responsible for documenting the pass/fail status of the UAT in Jira.
5.6.4 The UAT results are documented, and a final decision on acceptance is made.
5.7 Final Quality Review & Sign-off
5.7.1 The Project Lead compiles all test results, bug reports, and UAT feedback.
5.7.2 A final report is submitted to the IT Operational Manager.
5.7.3 The IT Operational Manager provides the final official report via email to the Business Development Department and the GM.

6. Generating records #

6.1 <<Project Test Plan>>
6.2 <<Product Test Checklist>>
6.3 <<UAT Report>>

7. Reference documents #

7.1 <<BM-TD-P-25-004 Project / Product Development and Monitoring Procedure>>

8. Relevant documents #

8.1 <<BM-TD-P-25-001 Project and Product Initiation Procedure>>
8.2 <<BM-TD-P-25-002 Project / Product Planning Procedure>>

9. Flow chart #

Updated on November 25, 2025