Saturday, January 9, 2016

Creating an Automated Test Strategy - Use Case



Background

On a recent project I was working on a team developing a software component as part of a larger system. This component can be thought of as the controller in a distributed MVC system, where the model has its own embedded controller. Our controller (the Master Controller) was written in Java; it used a JNI layer to interface with the C-based Model/Model Controller assembly and communicated with a web server hosting the View. I wanted to develop an automated test strategy that would:
  • Validate the Master Controller
  • Guard against regressions in the Master Controller
  • Validate system interactions
  • Allow us to develop functionality independently of the schedules/delivery of new functionality in the View and Model/Model Controller components
While most of this could be accomplished with unit tests, I decided we also needed some level of integration test for validating the system interactions. In the following description, bear in mind that the Master Controller is the device under test (DUT); the tests exercise its interfaces to the View and Model/Model Controller but do not test those components directly.

Strategy

It is always a good idea to discuss and document your test strategy. It was even more important for this project, as the other developers on the team were not as familiar with standard automated test tenets, coming from environments where automated testing was not a priority. I went about this by performing the following steps:
  • Develop and document the proposed strategy on internal project wiki
  • Gain team buy-in
  • Implement integration test framework
  • Create tests exemplifying usage for a variety of different types of modules
  • Provide training
Our overall strategy was to rely on unit tests for the majority of functional testing, while also providing a powerful integration test environment and a small number of tests validating interactions between these components.

Unit Test

There are some widely adopted principles for creating unit tests. For the wiki and the training, I put together a few simple patterns/anti-patterns. If you are at all familiar with unit testing, there will be no surprises here (a short sketch follows the lists):

Do

  • Test each module in isolation
  • Test edge conditions
    • bad or null inputs
    • etc
  • Keep tests fast
    • Ideally milliseconds
  • Keep cyclomatic complexity low

Don't

  • Allow timing dependencies other than timeouts
    • No sleeps/timers
  • Span multiple threads in a single test
  • Depend on other tests or order of execution
  • Leave artifacts
    • Use @After methods (jUnit) to ensure that artifact cleanup will happen independent of test failure
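To make these patterns concrete, here is a minimal sketch of a jUnit test that follows them. The ConfigParser and Config classes are hypothetical, invented purely for illustration rather than taken from the project:

    import static org.junit.Assert.*;

    import java.io.File;

    import org.junit.After;
    import org.junit.Before;
    import org.junit.Test;

    // Hypothetical module under test; only illustrates the patterns above.
    public class ConfigParserTest {

        private File tempConfig;
        private ConfigParser parser;

        @Before
        public void setUp() throws Exception {
            // Fresh fixture per test: no dependence on other tests or their order.
            tempConfig = File.createTempFile("config", ".json");
            parser = new ConfigParser();
        }

        @Test
        public void parsesEmptyFileToDefaults() throws Exception {
            // Edge condition: empty input yields defaults rather than an exception.
            Config config = parser.parse(tempConfig);
            assertTrue(config.isDefault());
        }

        @Test(expected = IllegalArgumentException.class)
        public void rejectsNullFile() {
            // Edge condition: null input is rejected explicitly.
            parser.parse(null);
        }

        @After
        public void tearDown() {
            // Cleanup runs even when a test fails, so no artifacts are left behind.
            if (tempConfig != null) {
                tempConfig.delete();
            }
        }
    }

Each test runs in milliseconds, touches no other module, and cleans up after itself regardless of outcome.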
I chose jUnit, Mockito, and PowerMock as the tools for our unit tests. This decision was based on familiarity with the tools, their popularity, and their suitability for use within our system. Mockito provided most of the mocking functionality we needed; it allowed us to:
  • Handle external dependencies easily
    • Good unit tests only test the module under test, none of its dependencies
  • Mock responses from these dependencies
  • Verify the method calls on these dependencies, including
    • Parameters passed
      • With stock argument matchers or custom validators
    • Number of invocations
  • Verify that methods that shouldn't be called are not called
  • Mimic asynchronous callbacks from mocked objects
PowerMock provided us with the key additional capability to mock static method calls.
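As a rough sketch of how these capabilities fit together, a test might look like the following. The MasterController, ModelGateway, ModelListener, and SystemClock types are hypothetical stand-ins invented for this example, not the project's actual classes:

    import static org.mockito.Mockito.*;

    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.mockito.invocation.InvocationOnMock;
    import org.mockito.stubbing.Answer;
    import org.powermock.api.mockito.PowerMockito;
    import org.powermock.core.classloader.annotations.PrepareForTest;
    import org.powermock.modules.junit4.PowerMockRunner;

    // Hypothetical classes used for illustration only.
    @RunWith(PowerMockRunner.class)
    @PrepareForTest(SystemClock.class)
    public class MasterControllerTest {

        @Test
        public void forwardsCommandToModelGateway() {
            // Mock the external dependency instead of touching the real JNI layer.
            ModelGateway gateway = mock(ModelGateway.class);
            when(gateway.sendCommand("start")).thenReturn(true);

            // Mock a static call with PowerMock (e.g. if the controller timestamps commands).
            PowerMockito.mockStatic(SystemClock.class);
            PowerMockito.when(SystemClock.now()).thenReturn(1000L);

            MasterController controller = new MasterController(gateway);
            controller.handleViewRequest("start");

            // Verify parameters passed and the number of invocations...
            verify(gateway, times(1)).sendCommand("start");
            // ...and that methods that shouldn't be called are not called.
            verify(gateway, never()).shutdown();
        }

        @Test
        public void deliversAsynchronousCallback() {
            ModelGateway gateway = mock(ModelGateway.class);

            // Mimic an asynchronous callback from the mocked object by invoking
            // the listener as soon as the controller registers it.
            doAnswer(new Answer<Void>() {
                @Override
                public Void answer(InvocationOnMock invocation) {
                    ModelListener listener = (ModelListener) invocation.getArguments()[0];
                    listener.onModelChanged("ready");
                    return null;
                }
            }).when(gateway).registerListener(any(ModelListener.class));

            MasterController controller = new MasterController(gateway);
            controller.start();

            verify(gateway).registerListener(any(ModelListener.class));
        }
    }

Note that the "asynchronous" callback is delivered synchronously from the mock, which keeps the test single-threaded and fast, in line with the unit test rules above.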

Integration Test

Although unit testing covers the bulk of the testing for the product, it is also useful to validate the interactions between the components. In the next post I will describe in detail the integration test strategy I implemented for this project.

System Test

The unit and integration tests created and supported by the development team were only one piece of the overall validation strategy. The QA team tested the overall product manually and used Selenium to create automated tests driven through the HTML5/JavaScript View component.

CI

For tests to be effective, of course, they need to actually be run frequently. Ideally every developer would run the unit test suite before checking in, but this is not enforceable. We had a Jenkins CI environment, where we set up the automated tests to run whenever code changes were checked in (polling for changes on a fixed interval). Failures were logged and emailed to the team for resolution. We also tied in a code coverage tool to report progress against our goals.

TDD

For those who are not familiar with test-driven development, the basic idea is that you create your tests before you implement your code, using a recipe like the following (a small sketch appears after the list):

  1. Create your interfaces
  2. Create your tests to these interfaces
  3. Implement your code
  4. Run your tests
  5. Rinse/repeat as necessary until all tests are green (pass)
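As a tiny, hypothetical illustration of steps 1 and 2, the test below is written against an interface before any real implementation exists (the Validator interface and LengthValidator stub are invented for this example):

    import static org.junit.Assert.*;

    import org.junit.Test;

    public class ValidatorTest {

        // Step 1: the interface is defined, but not yet implemented.
        interface Validator {
            boolean isValid(String input);
        }

        // Placeholder implementation; step 3 replaces this with real code.
        static class LengthValidator implements Validator {
            @Override
            public boolean isValid(String input) {
                throw new UnsupportedOperationException("not implemented yet");
            }
        }

        // Step 2: tests capture the requirements before the code is written.
        @Test
        public void rejectsEmptyInput() {
            Validator validator = new LengthValidator();
            assertFalse(validator.isValid(""));
        }

        @Test
        public void acceptsNonEmptyInput() {
            Validator validator = new LengthValidator();
            assertTrue(validator.isValid("abc"));
        }
    }

These tests start out red; steps 3-5 iterate on the implementation until they pass.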
Some of the benefits of TDD are:
  1. Enforces accurate requirements and interface design up front
  2. Improves code readability, interface design, architecture, and quality (you are clearly much less likely to write untestable code :))
  3. Ensures that tests don't fall behind implementation
As part of this project I adopted a TDD approach, although I found it difficult to adopt whole-heartedly. I found "concurrent" test/development a better fit than strict adherence to the recipe. It definitely took longer to take a TDD approach than it would have to simply write the code, but no longer than it would have to develop the code and then write the tests later. I advocated for similar approaches by other team members as part of the team training, but we did not enforce it.

Up Next

As already mentioned, my next post will delve into the integration test strategy and methodology that I adopted for the team on this project.