Test Plans

Just as good software has layers of code that interact, it also has layers of testing. The VistA Software Lifecycle for program development uses a waterfall or iterative process.

Random Test.
A random test simply feeds random input into a system or program to see whether it survives. As a testing technique it is not targeted at all; it tries to verify that the system as a whole doesn't behave too badly when given unexpected input.
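The idea can be sketched in a few lines of Python (used here as a neutral stand-in for MUMPS; the `parse_record` routine and its `^`-delimited format are invented for illustration). The test asserts nothing about correct answers, only that no input crashes the routine:

```python
import random
import string

def parse_record(text):
    """Toy parser standing in for the system under test (hypothetical)."""
    name, _, value = text.partition("^")
    return {"name": name, "value": value}

def random_test(runs=1000):
    """Feed random printable input and check only that nothing crashes."""
    for _ in range(runs):
        length = random.randint(0, 40)
        text = "".join(random.choice(string.printable) for _ in range(length))
        try:
            parse_record(text)
        except Exception as exc:
            print(f"unexpected failure on input {text!r}: {exc}")
            return False
    return True

print(random_test())
```

Note how weak the check is: the test passes as long as nothing blows up, which is exactly why random testing gives broad but shallow coverage.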

Design Test.
In design testing, it is important that all the interactions between the elements of a design are thought through. The essence of the Design phase is to deal with issues at the conceptual level, focusing on broad coverage of all the interacting parts, even when some of them do not yet exist.

Unit Test.
A unit test looks at just a single aspect of a program or system. Correct use of unit testing will always outstrip random testing. Good unit testing involves picking the right input to give to the program and verifying that the program produces the right result. While this principle sounds true of all layers of testing, in practice it is easiest to express with unit testing. Unit testing can easily be automated in almost any programming language: there is a framework called JUnit for Java, and MUnit for MUMPS. The Expect programming language also provides some powerful tools for creating unit tests.
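As a sketch of the same style of test that JUnit or MUnit would express, here is a unit test in Python's built-in `unittest` module. The `bmi` routine is a made-up example; the point is the shape of the test: chosen inputs, expected outputs, and a boundary case.

```python
import unittest

def bmi(weight_kg, height_m):
    """Hypothetical routine under test: compute body-mass index."""
    if height_m <= 0:
        raise ValueError("height must be positive")
    return round(weight_kg / height_m ** 2, 1)

class BmiTest(unittest.TestCase):
    # Picking the right input: one typical value, one boundary condition.
    def test_typical_value(self):
        self.assertEqual(bmi(70, 1.75), 22.9)

    def test_zero_height_rejected(self):
        with self.assertRaises(ValueError):
            bmi(70, 0)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(BmiTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("all passed:", result.wasSuccessful())
```

Unlike the random test, each case here encodes a known right answer, which is what lets unit testing outstrip random testing for the same effort.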

Component Test.
Exactly what makes a component is difficult to describe. Generally speaking, components are software elements that are combined to produce a working integrated system. In the VistA context, this means components are the individual elements that you can export with a KIDS Build. This is usually a low-volume testing phase, because you're only testing boundary conditions for each component, and each component only needs to be retested when its interface changes.
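To illustrate what "only testing boundary conditions" means, here is a small Python sketch. The `lookup` component and its table entries are invented; the cases probe the edges of the interface (empty input, last entry, case sensitivity) rather than every internal path.

```python
def lookup(entries, key):
    """Hypothetical component: exact-match lookup in an exported table."""
    for k, v in entries:
        if k == key:
            return v
    return None

# Component testing exercises the boundaries of the interface only.
table = [("DFN", "patient file"), ("DUZ", "user file")]
boundary_cases = [
    ([], "DFN", None),            # empty table
    (table, "DUZ", "user file"),  # last entry in the table
    (table, "dfn", None),         # case matters at this interface
]
for entries, key, expected in boundary_cases:
    assert lookup(entries, key) == expected
print("component boundaries hold")
```

Because only the interface is tested, this suite needs rerunning only when the interface changes, which is what keeps the phase low-volume.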

Integration Test.
An integration test focuses on the Package or Patch level. It deals with the interfaces between components, which are usually well defined anyway. In the VistA system, any software element, to work properly, has to work through a rather well-defined interface. Only when an existing subsystem doesn't quite match the needs of the package or patch do the integration tests involve a lot of work. Usually, integration tests are quite simple, especially if the unit testing and component testing have done their job: you pretty much put it all together and check that it runs, along with the most basic of sanity checks. (Can it access the database? Can a user log in? Is it producing log files?)
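The sanity checks listed above can be sketched as a smoke test. The `FakeDatabase` and `FakeAuth` classes are stand-ins invented for this example; in a real system they would be the actual wired-together subsystems.

```python
import os
import tempfile

class FakeDatabase:
    """Stand-in for the real datastore (hypothetical)."""
    def ping(self):
        return True

class FakeAuth:
    """Stand-in for the real sign-on subsystem (hypothetical)."""
    def login(self, user, password):
        return user == "tester" and password == "secret"

def integration_smoke_test(db, auth, log_dir):
    """Put it all together and run only the most basic sanity checks."""
    checks = {
        "database reachable": db.ping(),
        "user can log in": auth.login("tester", "secret"),
        "log directory writable": os.access(log_dir, os.W_OK),
    }
    for name, ok in checks.items():
        print(f"{name}: {'ok' if ok else 'FAIL'}")
    return all(checks.values())

with tempfile.TemporaryDirectory() as d:
    print(integration_smoke_test(FakeDatabase(), FakeAuth(), d))
```

Each check mirrors one of the questions in the text; the test says nothing about correctness of results, only that the assembled pieces talk to each other.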

System Test.
System testing is the most complex, but if you have proper requirements and design documentation, it isn't that bad. Writing a test plan when you know what the system is supposed to do is easy for any competent tester, no matter how large the system, even one as large as VistA. If the unit testing, component testing, and integration testing have done their job, system testing should really only be about validating the software against the requirements, not finding bugs. If you're finding significant bugs at the system test stage, either your unit or component testing wasn't done correctly, or your requirements and design process is poor.
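"Validating against the requirements" can be made concrete by tracing each documented requirement to a test. The requirement IDs, the `MemoryStore` system, and its `save`/`load` interface below are all invented for illustration:

```python
class MemoryStore:
    """Toy system under test (hypothetical)."""
    def __init__(self):
        self._data = {}
    def save(self, key, value):
        self._data[key] = value
    def load(self, key):
        return self._data[key]

def system_stores_and_retrieves(store):
    """REQ-01 (hypothetical): a saved value can be read back."""
    store.save("B1", "ASPIRIN")
    return store.load("B1") == "ASPIRIN"

def system_rejects_unknown_key(store):
    """REQ-02 (hypothetical): reading an unsaved key raises KeyError."""
    try:
        store.load("MISSING")
    except KeyError:
        return True
    return False

# One pass/fail verdict per requirement, not a bug hunt.
results = {check.__doc__.split(":")[0]: check(MemoryStore())
           for check in (system_stores_and_retrieves,
                         system_rejects_unknown_key)}
print(results)
```

A failure here points back at the earlier layers: either the requirement was never encoded in a unit or component test, or the requirements themselves were unclear.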

This is why the requirements documents and the design process are so important. By the time testers get involved, the entire process is very inter-related and involved. Fortunately, VistA is built using a simple component architecture that combines small pieces to make large ones that just work.

When testing doesn't work, it is a warning that something is awry. One reason may be that the work in progress hasn't allowed for proper timescales. Another might be that the people doing the work aren't following a proper iterative process. A third might be that the requirements and design phase was done so poorly there is no way to write proper test plans. It is almost never the case that the software is "too complex": NASA managed to debug the entire shuttle flight-control software, which depends on sub-second timing and huge numbers of program interactions in real time, and the VistA system has been debugged for years with all of its interacting parts. The goal for advancement is providing a testing system that captures much of those debugging insights in software.

Testing How To List

 * Testing How To Select Good Input
 * Testing How To Analyze Dependencies
 * Testing How To Statically Check A System
 * Testing How To Test Concurrency
 * Testing How To Do Code Coverage
 * Testing How To Create The Next Test
 * Testing How To Do Statistical Testing
 * Testing How To Do User Driven Testing
 * Testing How To Use Automatic Usage Data
 * Testing How To Do Pairwise Interaction Testing