Strategising is arguably the first and most important step in implementing a Quality Assurance framework in any environment. In practice, QA is usually an afterthought (there are exceptions, even in time- and budget-constrained projects) that only comes into play after the development approach has been fully implemented and the end product, the software, has been produced or readied for launch.
In this situation, it would be reckless to jump onto a project as Quality Assurance personnel and simply start testing the application, whether with the intention of breaking it (negative testing) or of validating it (regression, unit or functional tests). Approaching the system for Quality Assurance purposes must be a planned exercise with a well-defined scope describing how QA will increase overall system quality. This implies a strict definition of the framework for ensuring adequate quality of the software/system. By strategising before test implementation begins, there is little possibility of test implementation failure, of testing taking more time than planned, or of the test budget being overspent.
A test strategy document can contain anything that relates to a given project. But assume the project is for a software house with an already live software product that lacks adequate quality assurance maintenance mechanisms or any form of efficient documentation, and that you have been contracted to put these in place. In that case, the following should be key features of your documentation:
Fundamental Test Process / Quality Assurance Procedure:
Since Quality Assurance is only now being introduced into the development approach, an explicit definition of the test process involved in the planning and test implementation stages of the project is required. This part of the strategy is designed to reconcile the development process with the software testing life cycle and with best testing practices as the route to Quality Assurance.
Entry and Exit criteria: This is a strict definition of the resources, tools, timescales, project status, environment conditions, metrics and milestones that must be in place before test implementation commences (entry criteria) and that trigger the end of the test execution process (exit criteria). These could be listed separately under each of the test approaches or methodologies to be used in test implementation, as listed in the test strategy document.
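One way to make entry and exit criteria unambiguous is to record them as explicit, checkable conditions rather than prose. The sketch below is purely illustrative: the condition names and thresholds are hypothetical, not taken from any real project.

```python
# Hypothetical sketch: entry/exit criteria encoded as explicit,
# machine-checkable conditions. All names and thresholds are illustrative.

ENTRY_CRITERIA = [
    "test_environment_ready",
    "requirements_signed_off",
    "test_data_loaded",
]

EXIT_CRITERIA = {
    "min_pass_rate_pct": 98,    # minimum percentage of test cases passing
    "max_open_blockers": 0,     # no unresolved blocker defects allowed
}

def entry_met(status: dict) -> bool:
    """All entry conditions must hold before test execution starts."""
    return all(status.get(name, False) for name in ENTRY_CRITERIA)

def exit_met(results: dict) -> bool:
    """Execution ends once pass-rate and defect thresholds are satisfied."""
    return (results.get("pass_rate_pct", 0) >= EXIT_CRITERIA["min_pass_rate_pct"]
            and results.get("open_blockers", 1) <= EXIT_CRITERIA["max_open_blockers"])

print(entry_met({"test_environment_ready": True,
                 "requirements_signed_off": True,
                 "test_data_loaded": True}))                      # True
print(exit_met({"pass_rate_pct": 90, "open_blockers": 2}))        # False
```

Listing the criteria this way per methodology (one set for manual regression, another for automation, and so on) keeps the triggers auditable.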
Testing Methodologies: This is a choice of the test processes to be implemented and how they relate to the project, what benefits each entails, and when and how each would be implemented. It defines the trade-offs between methodologies and approaches such as black-box versus white-box versus grey-box testing, manual versus automated testing (or a combination of the two), performance (stress and/or load) testing, alpha versus beta testing, unit testing, and functional and non-functional testing.
Requirements Analysis: Agile environments usually place very little emphasis on requirements specification documentation. And where such documentation exists, it is of little use for Quality Assurance simply because it does not offer an extensive and detailed analysis of the system's functionality. A detailed requirements analysis is therefore included in this test strategy. This ensures that tests are prepared only after a full audit of the system's functionality, as analysed and approved by the right person.
Test Scenarios: Test scenarios would be deduced from the requirements analysis to capture all basic outcomes and form the basis for testing. These would typically cover areas such as security, registration, login, attempts to hack into the system, using the dashboard, using the site audit functionality, using the competitive positioning, on- and off-site SEO results, and workload functionalities. Scenarios are based on documented assumptions, and are intended to cover all possible outcomes of the system.
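Scenarios become easier to trace back to the requirements analysis when they are captured as structured records rather than free text. The sketch below is hypothetical: the IDs, fields and titles are invented for illustration, though the functional areas mirror those listed above.

```python
# Hypothetical sketch: test scenarios as structured records so coverage
# can be traced and reported per functional area. IDs/titles are invented.

scenarios = [
    {"id": "TS-001", "area": "security",     "title": "Reject invalid login credentials"},
    {"id": "TS-002", "area": "registration", "title": "Register a new user account"},
    {"id": "TS-003", "area": "dashboard",    "title": "Dashboard loads for a logged-in user"},
    {"id": "TS-004", "area": "site_audit",   "title": "Run a site audit and view results"},
]

def by_area(area: str) -> list:
    """Filter scenarios by functional area for planning and reporting."""
    return [s for s in scenarios if s["area"] == area]

print([s["id"] for s in by_area("security")])  # ['TS-001']
```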
100% Functionality / Regression Test Suite: The test suite is a collection of test cases deduced from the test scenarios, which are in turn collated from the detailed requirements analysis. The test suite would be used to develop manual and automated regression test suites and to ensure test coverage of the system; it therefore has to cover 100% of the system's functionality.
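As a minimal sketch of what one slice of such a regression suite could look like, the example below uses Python's standard unittest module. The function under test (`slugify`) is a stand-in invented for illustration; the point is the structure — one test case per scenario, runnable on every regression cycle.

```python
# Minimal regression-suite sketch using Python's unittest.
# slugify() is a hypothetical stand-in for real system functionality.
import unittest

def slugify(title: str) -> str:
    """Example function under regression: normalise a page title to a URL slug."""
    return "-".join(title.lower().split())

class RegressionSuite(unittest.TestCase):
    def test_basic_title(self):
        self.assertEqual(slugify("Site Audit"), "site-audit")

    def test_extra_whitespace(self):
        self.assertEqual(slugify("  SEO   Results "), "seo-results")

# Run the suite programmatically and report the overall outcome.
suite = unittest.TestLoader().loadTestsFromTestCase(RegressionSuite)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

In practice each test case would map to a scenario ID from the test scenario outline, which is how 100% coverage is demonstrated.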
Automated Testing Framework: A framework is designed to ensure that the system functionality, as described in the requirements analysis, broken down in the test scenario outline and further interpreted into test cases (compiled in a manual test suite), is scripted into an automated regression test suite. How are we doing the scripting? What tools are we using? Is Selenium in use? Which Selenium tools are in use? How will the scripts be grouped? Would automation be managed on screen, or would it run as an integrated part of an online service? . . . to be continued
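One common answer to the grouping question is the page-object pattern, sketched below. In a real framework the driver would be a Selenium WebDriver instance; here a stub stands in so the structure is self-contained and runnable, and the URL and element names are purely illustrative.

```python
# Hypothetical sketch of page-object grouping for automated scripts.
# StubDriver records actions instead of driving a real browser; in a real
# framework a selenium.webdriver instance would be passed in instead.

class StubDriver:
    """Stand-in for a Selenium WebDriver: records actions for inspection."""
    def __init__(self):
        self.actions = []
    def get(self, url):
        self.actions.append(("get", url))
    def type(self, field, value):
        self.actions.append(("type", field, value))
    def click(self, element):
        self.actions.append(("click", element))

class LoginPage:
    """One page object per screen keeps scripts grouped by functionality."""
    URL = "https://example.com/login"  # illustrative URL

    def __init__(self, driver):
        self.driver = driver
    def open(self):
        self.driver.get(self.URL)
    def login(self, user, password):
        self.driver.type("username", user)
        self.driver.type("password", password)
        self.driver.click("submit")

driver = StubDriver()
page = LoginPage(driver)
page.open()
page.login("qa_user", "secret")
print(len(driver.actions))  # 4
```

Grouping scripts this way means a change to the login screen is absorbed in one page object rather than in every test that touches it.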