Quality software testing guidelines (for startups)

The importance of software testing (quality assurance, or QA) cannot be stressed enough. Here is some experience from my startup regarding a good testing process.

The goal of testing is to ensure that the released code does not contain any critical issues. This is different from the common understanding that testing is just about finding as many defects as possible. There are two main types of testing: a) testing new features and bug fixes, and b) regression testing to validate that old code still works.

Reporting defects

When an issue is discovered, it is important that it is easy to understand and reproduce. Making the extra effort to report issues in an understandable format means that the development team will be able to fix them much faster. The recommended format for reporting bugs is as follows (an example report is shown after the list):

  1. Time and date of encountering the issue – it can be a few days later when a developer starts to investigate, and the faster they can locate the event in the logs, the better.
  2. User accounts involved – the log usually includes the username, so it is possible to extract all actions related to that account.
  3. Servers and environments – is it testing, dev, staging or production? When an environment has many servers, include the specific server whenever possible.
  4. Short descriptive headline – one sentence is enough, and it helps you remember which issue is being discussed when someone refers to it.
  5. Expected behavior – what the program was supposed to do.
  6. Actual behavior – what actually happened and how it differed from the expected behavior.
  7. Steps to reproduce – a clear sequence of actions that lets the developer see the defect without spending a lot of time figuring it out.
  8. Screenshots – one picture says more than a thousand words, especially if you mark the area of the screenshot that is relevant to the defect.
  9. Severity estimation – defects can be critical and not so critical. Critical defects that need immediate attention must stand out.
  10. Comments – anything else the developer needs to know.
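
To make the format concrete, here is a hypothetical example report; every name, server, time and detail below is made up for illustration:

    Time and date:      2024-03-12 14:35
    User account:       testuser42
    Server/environment: staging, web-02
    Headline:           Checkout fails with "payment declined" for a valid test card
    Expected behavior:  payment is accepted and the order confirmation page is shown
    Actual behavior:    the error "payment declined" is shown even though the test card is valid
    Steps to reproduce: 1) log in as testuser42, 2) add any item to the cart, 3) pay with the standard test card
    Screenshots:        attached, error message marked in red
    Severity:           critical – blocks all purchases on staging
    Comments:           started after the payment service update the previous day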

Planning the testing

If you approach testing without any plan, you can never be sure that everything in scope has been verified. This goes against the primary goal of testing – verifying that the code works. If we cannot be confident that the code works after testing, the term Quality Assurance loses its meaning. There has to be at least some planning, and the tests must cover:

  1. Normal user flow – successful login, completed purchase
  2. Negative user flow – failed login, insufficient funds
  3. Edge cases and error handling – buy -1 or 0 items, enter long and invalid input. Only valid values must be accepted.
  4. Security testing – SQL injection, XSS, CSRF and more; check the OWASP Top 10 for current trends

It is not necessary to write down all the test cases step by step (unless it is an automated test), as this usually takes more time than executing the test cases themselves. It is enough to write down in free form what is going to be tested. For example, a test plan for login would be: a) success, b) wrong password, c) no password/username, d) 1,000,000-character username, e) SQL injections.
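
Such a free-form plan can still be turned directly into automated checks later. Below is a minimal sketch in Python using pytest and requests, assuming a hypothetical /login endpoint on a hypothetical staging host that returns HTTP 200 on success and 401 on failure; the URL, field names, credentials and status codes are assumptions, not part of any real project.

    import pytest
    import requests

    # Hypothetical test environment and endpoint - adjust to your own application.
    BASE_URL = "https://staging.example.com"

    # The free-form login test plan expressed as data-driven cases:
    # (username, password, expected HTTP status)
    CASES = [
        ("alice", "correct-password", 200),   # a) success
        ("alice", "wrong-password", 401),     # b) wrong password
        ("", "", 401),                        # c) no username/password
        ("a" * 1_000_000, "x", 401),          # d) 1,000,000-character username
        ("' OR '1'='1", "irrelevant", 401),   # e) SQL injection attempt
    ]

    @pytest.mark.parametrize("username,password,expected_status", CASES)
    def test_login(username, password, expected_status):
        response = requests.post(
            BASE_URL + "/login",
            data={"username": username, "password": password},
            timeout=10,
        )
        assert response.status_code == expected_status

The point is not the exact assertions but that every item of the free-form plan maps to one concrete, repeatable test.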

Tools for testing

Testing involves the cooperation of many parties – testers, developers and managers. In a startup environment they can all be located in different places, so the team needs an accessible defect tracking system. I have found trello.com to be irreplaceable. Trello requires no setup and you can log in with your Google account. You can drag and drop Trello cards (issues) from one list to another, assign project members to each card and set deadlines.

A major part of testing is regression testing: all of the old features need to be tested over and over again during each release. If this testing is done manually, there must be a really good reason for such a waste of time. There absolutely have to be automated tools that help get the repetitive job done with the least effort. For websites I can recommend Selenium.
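
As a minimal sketch of what such an automated regression check can look like, here is a small Python script using Selenium WebDriver; the URL, element IDs, credentials and expected page text are hypothetical placeholders, not taken from any real project.

    from selenium import webdriver
    from selenium.webdriver.common.by import By

    # Hypothetical URL, element IDs and credentials - adjust to your own application.
    BASE_URL = "https://staging.example.com"

    driver = webdriver.Firefox()
    try:
        # Regression check: the login flow that worked in the last release still works.
        driver.get(BASE_URL + "/login")
        driver.find_element(By.ID, "username").send_keys("testuser")
        driver.find_element(By.ID, "password").send_keys("correct-password")
        driver.find_element(By.ID, "login-button").click()

        # After a successful login the dashboard heading should be visible.
        heading = driver.find_element(By.TAG_NAME, "h1").text
        assert "Dashboard" in heading, "login regression check failed"
        print("Login regression check passed")
    finally:
        driver.quit()

Once such scripts exist, rerunning the whole regression suite on every release candidate costs almost nothing.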

Tracking the testing process is very important to guarantee that all of the features get validated during the regression test phase. I use a Google Docs spreadsheet, which is a single multi-user page, so there is no need to send around a huge number of Excel files. On the left-hand side are listed all of the features and test cases for the particular project, and the header row contains the server version and date. Once a test case has been run and everything is good, the cell is marked OK. If a test fails, a defect is reported and the defect number is written into the cell. The result is a very good overview of the testing status and the quality of the code in one place.
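
For illustration, a small slice of such a tracking sheet might look like the sketch below; the feature names, versions, dates and defect number are made up.

    Feature / test case            | v1.2 (Mar 10) | v1.3 (Mar 24)
    Login – success                | OK            | OK
    Login – wrong password         | OK            | OK
    Checkout – normal purchase     | OK            | BUG-142
    Checkout – insufficient funds  | OK            | OK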

 

If you have any good ideas to add from your side, please share them in the comments below.

