"A bug in comprehension" - reviews as part of the testing process



"It's not a bug, it's a bug in comprehension..." An amusing phrase I heard from the head of a development team in an organization I worked in, during a discussion with the project manager. But what is it all about?
Organizational culture varies from company to company, as do the types of testing required of the QA team.
In quite a few companies, the tester is required to examine the quality of the characterization documents (Specs, FDs, and User Stories) before they are handed over to development.
In the organizations I have worked in, no Change Request or User Story is handed over to development without the approval of the tester assigned to the project, who is a key figure in the product's life cycle.
Lack of coordination and an inadequate characterization can lead to a final product that differs from the original intent, schedule delays, features developed contrary to the customer's tastes or needs, areas of the system that fail to perform the action they were designed for, and time-consuming debates over whether something is a bug or whether the characterization was simply not unequivocal and clear enough.
Why is the tester required to review characterization documents?
Among other things, we are the link between the client, the project manager, and the developers, which means we need to understand the needs of each side. Understanding the customer's needs, recognizing that for the developer to do his job he needs information that is as direct and clear as possible, and maintaining constructive, effective communication with the project manager - these are the basic infrastructure for productive and effective work.
In this article, we will examine characterizations and tasks, and the reasons why it is important to review documents before handing them over to development.
Our work as testers begins at the very first foundation stone, when an internal organizational need arises, the client demands a new development, or a change is requested to an existing module.
In the first stage, a meeting is held in which the requirement is presented in its infancy (a concept) and the solution formulated to answer it is explained.
In this context, I will share a presentation I saw at a conference a short while ago, which featured a famous sentence attributed to Henry Ford, the American automobile industrialist and founder of Ford:

"If I had asked people what they wanted, they would have said - faster horses..."
This means that there, in the conference room, is the time to suggest that things could be developed differently, in a more efficient way or one that will improve the experience of using the final product, before even one line of code is written. We test the system regularly and know it well. Do we see a better, more efficient way to meet the customer's needs? For example, "logics" that can be moved from the server and implemented in the client? Can we minimize areas that will be hard-coded, to make future changes easier? Is there another solution that does not touch shared code, or one that minimizes risk?
Aligning everyone involved in the development process (especially ensuring that the developer understands the task requirements).
Part of the general impression we need to come away with from the meeting is that there is coordination and understanding between the parties, especially on the part of whoever is going to carry out the actual development (even before the characterization reaches their hands). If the developer does not understand, it is reasonable to assume he will not be able to deliver as required. Do not be shy about asking leading questions and, if necessary, requesting a follow-up meeting.
Review: Once the Spec reaches our hands, it is time to examine whether it is sufficiently clear and whether it covers everything the developer is required to implement. This is our opportunity to use our experience as testers to discover potential problems before they are implemented in code, to offer constructive criticism of the text, and to demand that important details be corrected or added. (It is important to do this in good spirit - do not forget that someone worked for hours on this.)
It is important to note that it is impossible to "find" everything during the document review stage, especially in large developments; questions and problems will inevitably arise during and after development. The best we can do is learn from everything, minimize risks, and enrich our toolbox as testers. It should also be remembered that it is wrong to concentrate all the responsibility on the tester as a bottleneck: responsibility for review and feedback also lies with the developers, project managers, and other parties involved in the process.
So how is this done?
As an example, we will take a requirement to develop a "Contact us" module on an existing website.
"The requirement is to create a tab on the main page with the title "Contact us". Clicking the tab opens a new page, in which each registered user can fill in his name and an email address for a reply.
A "Drop down LOV" should be created, containing the referral subjects from which the user can choose.
Below it, the user can enter free text in a text box labeled "Enter your message" and upload a file in PDF format.
Below the text box, add a Send button and a confirmation message for when the file and message are sent to the company's systems."

On the face of it, a simple requirement, but let's see what questions should arise for us as testers based on the characterization above:

● What will be displayed to a user who is not registered on the site when clicking the new tab?
● Should we validate the e-mail address format?
● What values will the Drop Down include?
● Will the same values be displayed to every user (per authorization)? If not - which values should be displayed for each authorization or combination of authorizations?
● Are all fields always enabled? If not, what is the trigger for locking them?
● Is there a character limit on the text field? (If so, what is the limit? Should we include a visible counter?)
● What should happen when a user tries to upload a file in a format other than PDF?
● Are error messages pre-defined? (Including their text and the trigger that pops up each message.)
● Should we set a file size limit?
● How can the user remove/delete the file in case they change their mind?
● Can the user upload more than one file?
● Is the Submit button always visible/enabled? If not, what is the trigger that makes it available?
● Should we reset/clear the fields after sending?
● To which screen will the user return after sending?
● What should happen if sending fails?
● Are there mandatory fields? If so, what should happen if Submit is clicked without filling them in? (Highlighted fields? Pop-ups?)
● If a screenshot is attached - does it "make sense" to us?
That is the idea in a nutshell (the list can go on, but I think the general idea is clear). Remember that it is impossible to characterize everything; there is a limit to how much detail is possible and desirable within the time frame, taking cost-benefit into account.
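To make the point concrete, here is a minimal sketch (in Python, purely illustrative) of the validation logic behind such a "Contact us" form. Every constant and rule in it - the character limit, the file size limit, which fields are mandatory - is an assumed answer to one of the questions above, not something the sample characterization actually specifies. Each assumption the code is forced to make is exactly a question the Spec should have answered before development started.

```python
import re

# All of the following limits are ASSUMED answers to the review questions above;
# the sample characterization does not actually specify any of them.
MAX_MESSAGE_LEN = 500            # assumed character limit for the text box
MAX_FILE_SIZE = 5 * 1024 ** 2    # assumed 5 MB upload limit
ALLOWED_EXT = ".pdf"             # the spec does say PDF only

# Deliberately simple format check, not a full RFC-compliant email validator.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_submission(email, message, filename, file_size):
    """Return a list of error keys; an empty list means the form may be sent."""
    errors = []
    if not EMAIL_RE.match(email):
        errors.append("invalid_email")
    if not message.strip():
        errors.append("empty_message")      # assuming the message is mandatory
    if len(message) > MAX_MESSAGE_LEN:
        errors.append("message_too_long")
    if filename and not filename.lower().endswith(ALLOWED_EXT):
        errors.append("not_pdf")
    if file_size > MAX_FILE_SIZE:
        errors.append("file_too_large")
    return errors
```

For example, `validate_submission("a@b.com", "Hello", "doc.pdf", 1024)` returns an empty list, while passing `"image.png"` as the filename adds `"not_pdf"` to the errors. A developer writing this for real would have to invent these answers on the spot if the characterization stays silent, and the tester would then argue about whether the result is a bug or "a bug in comprehension".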
As an additional option, it is always a good idea to suggest switching to "test-oriented characterization".
After reviewing the Spec and passing our comments to the project manager / product manager in the format used in the organization (whether by email or through test management tools), he will have to address our comments and update or change the specification accordingly. Of course, this ping-pong usually takes several rounds until we feel the Spec is mature enough for development.




Additional tips for requirement review:
● Is it written in a way that a developer who is not familiar with the project could perform the task?
● Is there a description of the existing state versus the required state?
● Are there infrastructure changes that will affect other developments/projects?
● Is there a reference to permissions?
● Is there a clear flow for each "player" who will use the system?
● Are all fields characterized? (Names, character limit, format, maximum size.)
● If data is retrieved from the DB for display - is the retrieval identical for everyone? If not, what should be retrieved for each authorization?
● Is there any reference to the system's reactions when deviating from the normal flow? (In a test-oriented characterization format it is easier to spot such deficiencies.)
● Is it clear what triggers each subsequent action?
● Mandatory fields.
● Reference to the visibility and the Enabled/Disabled state of fields, plus the trigger that locks/unlocks them.
● Sample files - if files are used, it is very important to insist on sample files, because this is the best way to prevent inconsistencies between the files the client will actually use and the improvised files used in testing.
● If there are screenshots - do they match the existing system?
● Are error messages defined? (Including their text and the trigger that pops up each message.)
Do not be shy about insisting - insisting on subtleties (such as requiring a specific limit on the "price" field when the screen also has "price before discount" and "price after discount") can save unnecessary discussions and wasted time later.
I hope this article will help testers who need to review characterization documents. I would add to everything written here that it is very important for communication between us and our work environment - whether the developers or the project team - to be courteous and considerate. At the end of the day, we all share the same goal, and some consideration and understanding of the other side can go a long way.

