Testing Non-Functional Requirements


Back in the day, I was brought onto a project to lead performance testing activities for an enterprise web application that had been in development for a couple of years. The team had daily stand-ups but the project as a whole was run as waterfall and we had reached the "testing phase" of the project.

With performance testing, we needed to ensure the application could handle multiple concurrent users, as up to that point we’d only done single-user functional testing. In addition, development had been conducted against a small database, and we needed to conduct performance testing with a more "production-like" database. Before we started the performance testing, it was important to set up a test environment that was sized and configured similarly to production.

Once we had built out the large database, it was time to start throwing lots of users at the application, produce some reports, and move on to the next project! Except for one problem. We couldn’t move past the home page, with just a single user. All we saw were database timeouts and errors. Introducing a production-sized database had quickly brought the application to its knees. Although there was nothing in the plan for performance tuning, it was clearly time for a tuning phase!

Performance testing is a type of non-functional testing that validates that the application conforms to its non-functional performance requirements, such as the number of concurrent users, response time, and error rate.
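Those requirements translate naturally into checks a script can evaluate. As a minimal sketch, the thresholds and sample timings below are hypothetical stand-ins, not values from any real project:

```python
# Hypothetical targets -- real values come from the project's
# non-functional requirements document.
MAX_P95_MS = 800
MAX_ERROR_RATE = 0.01

def evaluate(response_times_ms, error_count, total_requests):
    """Return (p95, error_rate, passed) for one batch of test results."""
    ordered = sorted(response_times_ms)
    # Nearest-rank 95th percentile.
    p95 = ordered[max(0, int(len(ordered) * 0.95) - 1)]
    error_rate = error_count / total_requests
    passed = p95 <= MAX_P95_MS and error_rate <= MAX_ERROR_RATE
    return p95, error_rate, passed

# Sample timings in milliseconds from a hypothetical test run.
times = [120, 180, 250, 300, 410, 520, 640, 700, 750, 900]
p95, rate, ok = evaluate(times, error_count=0, total_requests=len(times))
```

Once the thresholds are explicit like this, "does the application meet its performance requirements?" becomes a yes/no answer a pipeline can act on.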

In addition to Performance requirements, other non-functional requirement types include:

  • Scalability - does the application conform to scalability requirements by handling additional users and workloads without compromising the user experience?
  • Security - does the application conform to security requirements, by protecting access to the application functionality and data?
  • Capacity - does the application conform to the capacity requirements related to data volume capacity?
  • Reliability - does the application conform to reliability requirements related to uptime and application availability?
  • Maintainability - does the application conform to maintainability requirements related to the ability of support personnel to support, revise and enhance the application?

A typical enterprise development project will have requirements around these areas and it’s essential to have a strategy to confirm the application conforms to those requirements.

As you look at these requirements, early in the project, it’s important to ask three questions:

  • How can I validate the application conforms to these requirements?
  • How can I automate that validation?
  • How can I include that automation in the CI/CD pipeline?
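One way to answer the third question is a small gate script the pipeline runs after deployment to a test environment: it exits zero on success and non-zero on failure, which any CI/CD tool can interpret. This is an illustrative sketch, with a simulated request standing in for real client code:

```python
import sys
import time

def simulated_request():
    """Stand-in for one real HTTP call; replace with your client code."""
    start = time.perf_counter()
    time.sleep(0.01)  # pretend server-side work
    return (time.perf_counter() - start) * 1000  # elapsed milliseconds

def gate(threshold_ms=500, samples=5):
    """Return 0 (pass) if the slowest sample is under the threshold, else 1."""
    worst = max(simulated_request() for _ in range(samples))
    return 0 if worst <= threshold_ms else 1

if __name__ == "__main__":
    sys.exit(gate())
```

A pipeline stage then just invokes the script and fails the build on a non-zero exit code, the same convention functional test runners already use.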

Most agile projects today include automated functional testing within the sprints, but there are still a lot of projects that don’t start focusing on NON-functional tests until it’s time for a deployment.

We all know the value of automated regression testing of functional requirements. With automated regression testing we can find bugs quickly and address them before they fester and become a bigger issue. The same is true with regression testing of non-functional requirements.

Testing tools and test strategies can allow you to validate an application’s conformance to these requirements, and build automated test scripts that can be included in the CI/CD pipeline.

For example, some popular performance test tools include Apache JMeter, Micro Focus LoadRunner, and Gatling. All of these tools allow you to define test scenarios, the number of concurrent users, load patterns, test durations, etc. In addition, they all provide canned and customizable reports to communicate the results of the tests.
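The core idea behind all of these tools, many concurrent virtual users exercising the application while timings are collected, can be sketched in a few lines. This toy version uses a thread pool and a simulated transaction in place of a real HTTP call:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_transaction(user_id):
    """Stand-in for one virtual user's request; swap in a real HTTP call."""
    start = time.perf_counter()
    time.sleep(0.02)  # simulated server work
    return (time.perf_counter() - start) * 1000  # elapsed milliseconds

def run_load(concurrent_users=10, iterations=3):
    """Drive concurrent_users in parallel for several iterations,
    returning every transaction's response time."""
    timings = []
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        for _ in range(iterations):
            timings.extend(pool.map(fake_transaction, range(concurrent_users)))
    return timings

timings = run_load()
```

Dedicated tools add what this sketch lacks, realistic load ramp-up, correlation of dynamic values, distributed load generation, and reporting, but the measurement loop is the same shape.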

Security requirements are another critical type of non-functional requirement. These requirements must be defined early, as they are critical input to defining the appropriate application architecture. The Open Web Application Security Project (OWASP) is an organization of application security experts that focus on defining key application security threats and strategies for mitigating them.

The OWASP Testing Guide provides guidance and best practices for validating an application’s conformance to Security requirements and best practices. In addition, there is a broad range of Security tools such as the OWASP Zed Attack Proxy (ZAP) that can allow you to include automated Security testing into the CI pipeline.
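Some security checks are simple enough to automate directly, for instance verifying that responses carry the HTTP security headers OWASP recommends. The captured headers below are a hypothetical example; in practice you would fetch them with your HTTP client or a scanner such as ZAP:

```python
# Hypothetical response headers captured from a test request.
captured = {
    "Content-Type": "text/html",
    "Strict-Transport-Security": "max-age=31536000",
    "X-Content-Type-Options": "nosniff",
}

# A few headers commonly recommended by OWASP guidance.
REQUIRED = [
    "Strict-Transport-Security",  # enforce HTTPS
    "X-Content-Type-Options",     # block MIME sniffing
    "Content-Security-Policy",    # restrict script sources
]

missing = [header for header in REQUIRED if header not in captured]
```

A check like this flags the missing Content-Security-Policy header and can fail the pipeline, while deeper scans (injection, broken access control) are left to dedicated tools.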

So you’re probably wondering, what happened during our “tuning” phase? Of course the first step was to get the application to work with one user! Luckily most of our issues were isolated to the database. After weeks of iteratively tuning and testing we were able to clean up a LOT of performance issues within our SQL stored procedures.

It turned out that 90% of our performance issues were related to about 5 types of SQL coding problems. If we had started that testing at the beginning of the project, we would have caught those issues early and made sure they did not propagate throughout the application.

Just like with functional testing, automated regression testing of NON-functional requirements during the sprint will save you lots of time, money, and headaches down the road.