Advanced test scripts
All IT organisations, and most business organisations, are familiar with the need to test systems. The basic tool for this is the test script.
A test script sets out a succession of step-by-step instructions for carrying out a test. Testing should be scripted to an extent commensurate with the risk of not testing properly, yet whether you look at corporate standards or individual projects, it is striking how seldom even minimum standards are met. Do scripts state the expected result, so you can tell unequivocally whether the test has passed or failed? In my experience, most don’t. Nor do they tell the user what preconditions must be satisfied before the test can begin (navigate to…, using these privileges…), or how to check the result, and so on.
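The difference a stated expected result makes can be sketched in a few lines of Python. This is an illustration only; the field names are my own shorthand, not part of any standard:

```python
from dataclasses import dataclass

@dataclass
class TestStep:
    """One scripted step; field names are illustrative only."""
    precondition: str  # what must already be true (login, privileges, data)
    action: str        # the instruction the tester follows
    expected: str      # the stated expected result
    check_how: str     # how the tester verifies the actual result

def verdict(step: TestStep, actual: str) -> str:
    # Because the expected result is written down, pass/fail is a
    # comparison rather than the tester's judgement call.
    return "PASS" if actual == step.expected else "FAIL"

step = TestStep(
    precondition="Logged in with report-viewing privileges",
    action="Select 'Reports' menu, then 'End of day'",
    expected="End-of-day report opens showing today's date",
    check_how="Read the report header on screen",
)
print(verdict(step, "End-of-day report opens showing today's date"))  # PASS
```

Leave out `expected` and `check_how`, as most real scripts do, and `verdict` cannot be computed at all; that is the gap in most scripts today.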
So here is my recipe for a truly complete test script. You won’t want to use it because it’s long and complicated and you can’t see the point of it, but if you can tell me which fields are not needed by anyone with a legitimate interest in your testing, feel free to take them out.
The script falls into five sections:
- Header.
  - Planned execution date.
  - Summary of test’s objective or purpose.
  - Test condition(s) implemented.
  - Positive/negative test?
- Set up/initial conditions.
  - Start point (i.e., navigating to the appropriate screen/process/field).
  - Access procedure (e.g., navigating login and security).
  - User ID/privileges required.
  - Preceding actions/tests to prepare the application.
  - File selection, parameter settings, etc.
  - The preconditions for the test as a whole (e.g., account no., currency, etc.).
- Test steps.
  - Step no. (if sequence is significant).
  - Location (screen/form/field name at which the tester should enter test input – for example, “Go to screen…”, “Select ‘Reports’ menu”, etc.).
  - Test inputs.
    - Data to be entered.
    - Option(s) to be selected.
  - Test actions (step by step, checklist-style – e.g., “Enter data”, “Select option A”, “Click on Submit”, etc.).
  - Expected result.
  - Method for checking actual against expected (if not just “check actual vs expected results” – including automated file comparisons, checking back-end systems, end-of-day reports, messages, etc.).
- Execution record.
  - Actual test time/date.
  - Actual result.
  - Checked boxes (against each test step, to confirm completion).
  - Notes, with a general prompt to record anomalies, unexpected results, unplanned steps, and unusual system behaviour.
  - Narrative/commentary (to support re-runs and regression testing).
  - Root cause of failure (e.g., “Comm320 failure”, “Data feed”).
  - Defect reference field (to locate defect reports, anomalies, etc.; may be needed at both step and script levels).
- Sign off.
  - Tester’s name and signature.
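As a sketch of how the whole recipe hangs together, the template can be written as one record with a completeness check, so nobody signs off a script with blank fields. Every name below is my own shorthand for the fields above, and the structure is an illustration, not a prescribed format:

```python
from dataclasses import dataclass, field, fields

@dataclass
class TestScript:
    """One record per script; field names are shorthand for the list above."""
    # Header
    planned_date: str = ""
    objective: str = ""
    test_conditions: str = ""
    positive_test: str = ""          # "positive" or "negative"
    # Set up / initial conditions
    start_point: str = ""
    access_procedure: str = ""
    user_privileges: str = ""
    preconditions: str = ""
    # Test steps
    steps: list = field(default_factory=list)
    expected_result: str = ""
    check_method: str = ""
    # Execution record
    actual_time: str = ""
    actual_result: str = ""
    notes: str = ""
    root_cause: str = ""
    defect_ref: str = ""
    # Sign off
    tester: str = ""

def incomplete_fields(script: TestScript) -> list:
    """Names of fields still blank: a quick audit before sign-off."""
    return [f.name for f in fields(script) if not getattr(script, f.name)]

draft = TestScript(objective="Verify end-of-day report",
                   expected_result="Report totals match ledger")
print(incomplete_fields(draft))  # lists everything not yet filled in
```

If a field really is not needed by anyone with a legitimate interest in your testing, delete it from the record; the audit then stops nagging about it.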
Try it. Really, it works.