The test program is one of the most vital components of any ERP installation. It must be carefully planned and exhaustively pursued from the very beginning of the implementation.
Unfortunately, testing is too often scrimped on, or, worse yet, elements of the test program are sacrificed in an effort to get the product out the door.
This doesn’t work. Scrimping on testing simply transfers part of the testing effort into the go-live phase of the installation, in the form of bugs that have to be hunted down at even greater cost in time, resources, and schedule disruption.
The most important part of testing is developing a comprehensive test plan. This should be done early in the implementation and adhered to rigidly. Simply poking at the system haphazardly doesn’t constitute an adequate test plan.
The testing should be as comprehensive as practical: the more complete the planned testing, the more certain you can be of the outcome. As a practical matter, though, time and resource constraints mean you cannot test every case. Pick and choose to cover the major cases, and make sure the workflow works as it is supposed to.
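One lightweight way to keep the plan explicit rather than ad hoc is to record the planned cases as data. The modules and scenarios below are purely hypothetical examples, not taken from any particular ERP package; this is a minimal sketch of the idea, not a prescribed format:

```python
# A minimal test-plan record: each entry names the module, the scenario,
# and whether it is a major case that must pass before go-live.
test_plan = [
    {"module": "order_entry", "scenario": "standard sale",             "major": True},
    {"module": "order_entry", "scenario": "credit memo",               "major": True},
    {"module": "inventory",   "scenario": "negative stock adjustment", "major": False},
]

def major_cases(plan):
    """Return the cases that must be covered before go-live."""
    return [case for case in plan if case["major"]]

must_cover = major_cases(test_plan)
```

Even a table this simple makes it obvious which major cases remain uncovered, which is exactly what haphazard poking at the system cannot tell you.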
Fortunately, unlike the case with software development, the bones of your ERP system have already been tested. Most of the time you’re just testing the configuration and the flow between modules. This is a much easier job.
However, this doesn’t apply in cases where you have customized the package by writing additional code. Those customizations have to be tested from the ground up just as any new software does.
Ideally, testing starts as soon as system configuration does. Each part of each module should be tested for correctness as it is developed, irrespective of the other modules.
Bugs at this level are easier to spot and fix. Generally, they will consist of configuration errors within the module or errors in logic. The logic errors are the more difficult to track down.
Once each module is complete it needs to be tested with dummy data to confirm that it is working to specification. At this stage inputs and outputs will be stubbed out.
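To illustrate what stubbing inputs and outputs looks like in practice, here is a hedged sketch using Python's standard `unittest.mock`. The `post_invoice` function and the ledger interface are hypothetical stand-ins for a module under test, not features of any real ERP system:

```python
from unittest.mock import Mock

def post_invoice(invoice, ledger):
    """Hypothetical module step: validate an invoice, then hand it to the
    downstream ledger interface."""
    if invoice["amount"] <= 0:
        raise ValueError("invoice amount must be positive")
    ledger.post(account=invoice["account"], amount=invoice["amount"])
    return "posted"

# Stub out the downstream module so the invoice logic can be tested alone.
stub_ledger = Mock()
result = post_invoice({"account": "4000", "amount": 125.0}, stub_ledger)
```

Because the ledger is a stub, the test confirms the module's own logic and its outbound call without any other module being configured yet.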
Workflow within the module should be carefully checked to make sure that it works as advertised. Also look for unresolved edge cases, conditions that aren’t allowed for in the module. These should have been caught much earlier in the project, but sometimes they don’t rear their heads until this stage.
As modules are completed they should be linked together and tested again. Once more, dummy data plays a major role in testing, and it becomes more important as the process nears completion. It should be actual transaction data, or modeled on real data as closely as possible.
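One way to keep dummy data close to real data is to sample and jitter observed transactions rather than inventing values from scratch. The sketch below assumes a small set of hypothetical real transactions; the field names and the ±10% jitter are illustrative choices, not a standard:

```python
import random

# Hypothetical observed transactions to model the dummy data on.
real_transactions = [
    {"customer": "C001", "amount": 120.50},
    {"customer": "C002", "amount": 89.99},
    {"customer": "C001", "amount": 310.00},
]

def make_dummy_transactions(sample, n, seed=0):
    """Generate n dummy transactions whose customers and amounts are drawn
    from the observed sample, so the test data stays close to real data."""
    rng = random.Random(seed)
    customers = [t["customer"] for t in sample]
    amounts = [t["amount"] for t in sample]
    return [
        {
            "customer": rng.choice(customers),
            # Jitter each sampled amount by up to +/-10% to avoid exact duplicates.
            "amount": round(rng.choice(amounts) * rng.uniform(0.9, 1.1), 2),
        }
        for _ in range(n)
    ]

dummy = make_dummy_transactions(real_transactions, 100)
```

Data generated this way exercises the linked modules with realistic distributions of customers and amounts, which is what matters at this stage.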
Don’t neglect user testing. Users should be brought in at the module stage and they become increasingly important as the implementation progresses. Pay close attention to their comments and problems. Things that can be identified in the testing phase can be fixed before the user base is exposed to them.
As the modules are completed and linked together they should be subjected to increasingly severe and comprehensive testing. Performance should be tested and monitored as well.
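Performance testing at this stage can start as simply as timing the linked workflow at increasing volumes. In the sketch below, `run_workflow` is a hypothetical stand-in for an end-to-end process and the one-second budget is an assumed target, not a standard figure:

```python
import time

def run_workflow(batch):
    """Stand-in for an end-to-end workflow; here it just totals the batch."""
    return sum(item["amount"] for item in batch)

def measure(batch_sizes, budget_seconds=1.0):
    """Time the workflow at each batch size and flag any run that
    exceeds the (assumed) time budget."""
    results = {}
    for n in batch_sizes:
        batch = [{"amount": 1.0}] * n
        start = time.perf_counter()
        run_workflow(batch)
        elapsed = time.perf_counter() - start
        results[n] = {"seconds": elapsed, "within_budget": elapsed < budget_seconds}
    return results

report = measure([100, 10_000, 1_000_000])
```

Running the same measurement after each round of linking gives an early warning when a newly added module degrades throughput, rather than discovering it at go-live.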
The final stage is testing the complete system with a rich supply of dummy data, or real transaction data if that is available. You will probably find a range of issues, from minor to major. These need to be tracked down and fixed before the system goes live.
A good testing program is a lot of work, but it is necessary for a successful implementation. The more complete the testing, the fewer problems you will have when you go live.