Monday, 11 June 2012

Strategizing For Software Quality Assurance


Strategising is arguably the first and most important step in implementing a Quality Assurance framework in any environment. In reality, QA is usually an afterthought (with exceptions on time- and budget-constrained projects) that comes into play after development approaches have been fully implemented and the end product (the software) has been produced or readied for launch.
In this instance, it would be suicidal to simply jump onto a project as Quality Assurance personnel and start testing the application with the intention of breaking it (negative testing) or just validating the software (regression, unit or functional tests). Approaching the system for Quality Assurance purposes must be a planned exercise, with a well-defined scope of how QA will be used to increase overall system quality. This implies a strict definition of the framework for ensuring adequate quality of the software/system. By strategising in advance of test implementation, there is little possibility of test implementation failure, of testing taking more time than planned, or of the test budget being overspent.
A test strategy document can contain anything that relates to a given project. But based on the assumption that the given project is for a software house with an already live software product that lacks an adequate quality assurance maintenance mechanism or any form of efficient documentation, and that you have been contracted to provide these, the following should be key features of your documentation:
Fundamental Test Process / Quality Assurance Procedure:
Since Quality Assurance is just being introduced into the development approach, an explicit definition of the test process involved in the planning and test implementation stages of the project is needed. This part of the strategy is designed to reconcile the development process with the software testing life cycle and with best testing practices as the foundation of Quality Assurance.
Entry and Exit Criteria: This is a strict definition of the resources, tools, timescales, project status, environment conditions, metrics and milestones that must be in place before test implementation commences (entry criteria), as well as those that trigger the end of the test execution process (exit criteria). These could be listed separately under each of the test approaches or methodologies to be used in test implementation, as listed in the test strategy document.
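To make the idea concrete, here is a minimal Python sketch of exit criteria expressed as a machine-checkable rule. The field names and thresholds (a 95% pass rate, all planned cases executed, zero open blocker defects) are purely illustrative assumptions, not figures from any particular project.

def exit_criteria_met(results):
    """Return True when the agreed exit criteria for a test cycle hold."""
    pass_rate = results["passed"] / results["executed"]
    return (
        results["executed"] == results["planned"]   # all planned cases were run
        and pass_rate >= 0.95                       # agreed pass-rate threshold (assumed)
        and results["open_blockers"] == 0           # no unresolved blocker defects
    )

# Example: 116 of 120 planned cases passed, none blocked -> criteria met (True)
print(exit_criteria_met(
    {"planned": 120, "executed": 120, "passed": 116, "open_blockers": 0}
))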
Testing Methodologies: This is the choice of test processes to be implemented and how they relate to the project, the benefits each entails, and when and how they would be implemented. It defines the trade-offs between methodologies and approaches such as black-box versus white-box versus grey-box testing, manual versus automated testing (or a combination of the two), performance (stress and/or load) testing, alpha versus beta testing, unit testing, and functional versus non-functional testing.
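As a small illustration of where two of these approaches differ, the Python snippet below contrasts a white-box unit test (written with knowledge of the code's internal boundary) with a black-box functional check (written only against the stated behaviour). The discount() function and its pricing rule are hypothetical, invented purely for the example.

import unittest

def discount(total):
    # Hypothetical rule: orders of 100 or more get a 10% discount.
    return total * 0.9 if total >= 100 else total

class UnitLevelTests(unittest.TestCase):
    # White-box: exercises the 100-unit boundary known from the code itself.
    def test_boundary_values(self):
        self.assertEqual(discount(100), 90)
        self.assertEqual(discount(99.99), 99.99)

class FunctionalLevelTests(unittest.TestCase):
    # Black-box: checks only the externally stated behaviour.
    def test_large_order_is_cheaper(self):
        self.assertLess(discount(150), 150)

if __name__ == "__main__":
    unittest.main()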
Requirements Analysis: Agile environments usually place very little emphasis on requirements specification documentation. And where such documents do exist, they are of little use for Quality Assurance simply because they do not offer an extensive and detailed analysis of the system functionality. A detailed requirements analysis is therefore included in this test strategy. This is to ensure that tests are prepared after a full audit of the system's functionality, as analysed and approved by the right person.
Test Scenarios: Test scenarios would be derived from the requirements analysis to capture all basic outcomes and form the basis for testing. These would typically cover areas such as security, registration, login, attempts to hack into the system, using the dashboard, using the site audit functionality, using the competitive positioning, on- and off-site SEO results and workload functionalities. Scenarios are based on documented assumptions and are intended to cover all possible outcomes of the system.
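By way of illustration, the scenario outline could be captured as structured data before any test cases are written. The scenario IDs, wording and expected outcomes below are hypothetical placeholders that simply mirror the functional areas named above.

# Hypothetical scenario outline; IDs and expected outcomes are placeholders.
test_scenarios = [
    {"id": "TS-01", "area": "Registration",
     "scenario": "New user registers with valid details",
     "expected": "Account is created and a confirmation email is sent"},
    {"id": "TS-02", "area": "Login",
     "scenario": "Registered user logs in with a wrong password",
     "expected": "Access is denied with an error message"},
    {"id": "TS-03", "area": "Security",
     "scenario": "Dashboard URL is requested without an active session",
     "expected": "User is redirected to the login page"},
    {"id": "TS-04", "area": "Site audit",
     "scenario": "User runs a site audit on a verified domain",
     "expected": "An audit report is generated"},
]

for s in test_scenarios:
    print(f'{s["id"]} [{s["area"]}]: {s["scenario"]} -> {s["expected"]}')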
100% Functionality / Regression Test Suite: The test suite is a collection of test cases derived from the test scenarios, which are in turn collated from the detailed requirements analysis. The test suite would be used to develop the manual and automated regression test suites and to ensure test coverage of the system; hence, it has to cover 100% of the system functionality.
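As a sketch of how such a suite might be assembled (using Python's unittest, purely for illustration), individual test cases can be collected into one regression suite while recording which requirements each case traces back to. The requirement IDs, test classes and assertions below are hypothetical placeholders.

import unittest

class LoginTests(unittest.TestCase):
    covers = {"REQ-01", "REQ-02"}       # requirements this case traces back to (hypothetical IDs)
    def test_valid_login(self):
        self.assertTrue(True)           # placeholder assertion for the sketch

class DashboardTests(unittest.TestCase):
    covers = {"REQ-03"}
    def test_dashboard_loads(self):
        self.assertTrue(True)           # placeholder assertion for the sketch

def build_regression_suite():
    suite = unittest.TestSuite()
    loader = unittest.defaultTestLoader
    suite.addTests(loader.loadTestsFromTestCase(LoginTests))
    suite.addTests(loader.loadTestsFromTestCase(DashboardTests))
    return suite

if __name__ == "__main__":
    all_requirements = {"REQ-01", "REQ-02", "REQ-03", "REQ-04"}
    covered = LoginTests.covers | DashboardTests.covers
    print(f"Requirement coverage: {len(covered)}/{len(all_requirements)}")
    unittest.TextTestRunner().run(build_regression_suite())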
Automated Testing Framework: A framework is designed to ensure that the system functionality, as described in the requirements analysis, broken down in the test scenario outline and further interpreted into test cases (compiled in a manual test suite), is scripted into an automated regression test suite. How are we doing the scripting? What tools are we using? Is Selenium in use? Which Selenium tools are in use? How will the scripts be grouped? Would automation be managed on screen, or would it run as an integrated part of an online service? . . . to be continued
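Assuming, purely for illustration, that Selenium WebDriver with Python were the chosen combination, a single scripted regression check might look like the sketch below. The URL, element locators and credentials are hypothetical placeholders, and a local ChromeDriver is assumed to be available.

import unittest
from selenium import webdriver
from selenium.webdriver.common.by import By

class LoginRegressionTest(unittest.TestCase):
    def setUp(self):
        self.driver = webdriver.Chrome()    # assumes a local Chrome/ChromeDriver setup

    def test_valid_login_reaches_dashboard(self):
        d = self.driver
        d.get("https://example.com/login")                        # placeholder URL
        d.find_element(By.ID, "username").send_keys("qa.user")    # placeholder locator and data
        d.find_element(By.ID, "password").send_keys("secret")
        d.find_element(By.ID, "submit").click()
        self.assertIn("Dashboard", d.title)

    def tearDown(self):
        self.driver.quit()

if __name__ == "__main__":
    unittest.main()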

Friday, 8 June 2012

Developing A Quality Assurance Framework For An Agile Environment


The essence of Quality Assurance in development or project management is primarily to ensure that requirements from business sponsors are delivered as defined, on time and to budget. Hence, Quality Assurance is the function of software quality that assures that the standards, processes and procedures are appropriate for the project and are correctly implemented. Naturally, the need for Quality Assurance procedures could be as basic as having a rich automated regression test suite, or as broad as defining an entire test/QA process that complements development approaches such as the Test-Driven, Waterfall and Scrum development methodologies.

In many environments around the world, which are naturally fast-paced and agile, the approach has always been to have an idea (a requirement), simply code/develop it and subsequently go live. New ideas are then developed, or present ideas are enhanced (new features); they are coded and pushed live. This cycle is usually continuous, and (as in many fast-paced agile environments) it leaves little room for detailed requirements analysis documentation, formal reviews, quality assurance, quality control or software testing. The end result is usually a backlog of bugs (usually detected by users, and the most expensive to fix) or poor-quality software.

The major challenge in reconciling an agile environment with a quality assurance framework is that, on the face of it, the present process works. It works in the sense that it delivers what the business sponsors want (never mind that the process leads to a backlog of bugs and an unmanageable bug life cycle, or that its implementation is inefficient in terms of time and budget), though this is not always true for time- and budget-constrained projects. The second challenge is that, since these processes are in constant use for business features with deadlines, attempting to reconcile QA with the present practices would imply slowing the team's pace in delivering features or projects. Nobody likes to be blamed for the late delivery of a project.

Reconciling an agile environment with a Quality Assurance framework requires a comprehensive assimilation of the company's product or service specification, the development methodology in use, the approach to development and the timescale objectives (i.e. seeing the bigger picture of the entire project, product or service). This involves planning every little bit along the way. Hence, the first and most important step in reconciling any agile approach with efficient quality assurance procedures is to strategise; this explains the need for a test strategy document.

The primary objective of strategising is to have a detailed and well-defined road map indicating where we are going with regard to the quality overview of the software, and how those quality goals would be achieved. It would be a road map that guides us in ensuring that the overall objective of Quality Assurance is met: that the standards, processes and procedures are appropriate for the project and are correctly implemented.

The test strategy document would ideally contain all the quality assurance phases, testing realities, risks and expectations in the given environment. The beauty of this is that test strategies are merely a proposal by QA to inject best practices into an already existing development methodology and, hence, require the approval and sign-off of a project manager (or, in practice, the head of the development team). By signing off a test strategy document, the project manager is assumed to be able to embrace the reality of the imminent changes that are bound to be encountered while implementing QA procedures.

In conclusion, given the task of managing the quality assurance aspect of an organisation or a given project, your best bet is to immediately commence working on a test strategy document with the intention of getting it signed off by your line manager for subsequent implementation. And who says a test strategy document cannot contain a detailed requirements analysis of your own interpretation?