Practically speaking, a large portion of the responsibility for the quality of information systems rests with systems users and management. Two things must happen for TQM to become a reality with systems projects. First, the full organizational support of management must exist, which is a departure from merely endorsing the newest management gimmick. Such support means establishing a context in which managers seriously consider how the quality of information systems, and of information itself, affects their work.
Early commitment to quality from the analyst and business users is necessary to achieve the goal of quality. This commitment results in exerting an evenly paced effort toward quality throughout the systems development life cycle, and it stands in stark contrast to having to pour huge amounts of effort into ironing out problems at the end of the project.
Organizational support for quality in management information systems can be achieved by providing on-the-job time for IS quality circles, which consist of six to eight organizational peers specifically charged with considering both how to improve information systems and how to implement improvements.
Through work in IS quality circles or through other mechanisms already in place, management and users must develop guidelines for quality standards of information systems. Preferably, standards will be reshaped every time a new system or major modification is to be formally proposed by the systems analysis team.
Hammering out quality standards is not easy, but it is possible and has been done. Part of the systems analyst’s job is encouraging users to crystallize their expectations about information systems and their interactions with them.
Departmental quality standards must then be communicated through feedback to the systems analysis team. The team is often surprised by the standards that emerge. Users' expectations typically are less complex than what experienced analysts know could be done with a system. In addition, human issues that have been overlooked or underrated by the analyst team may be designated as extremely pressing in users' quality standards. Getting users involved in spelling out quality standards for information systems will help the analyst avoid expensive mistakes in unwanted or unnecessary systems development.
One of the strongest quality management actions the systems analysis team can take is to conduct structured walkthroughs routinely. Structured walkthroughs are a way of using peer reviewers to monitor the system's programming and overall development, point out problems, and allow the programmer or analyst responsible for that portion of the system to make suitable changes. Structured walkthroughs involve at least four people: the person responsible for the part of the system or subsystem being reviewed (a programmer or analyst), a walkthrough coordinator, a programmer or analyst peer, and a peer who takes notes about suggestions.
Each person attending a walkthrough has a special role to play. The coordinator is there to ensure that the others adhere to any roles assigned to them and to ensure that any activities scheduled are accomplished. The programmer or analyst is there to listen, not to defend his or her thinking, rationalize a problem, or argue. The programmer or analyst peer is present to point out errors or potential problems, not to specify how the problems should be remedied. The notetaker records what is said so that the others present can interact without encumbrance.
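The role assignments above can be sketched in a few lines of code. This is only an illustrative model, not a tool the text prescribes; the role names and the `Walkthrough` class are assumptions made for the example.

```python
from dataclasses import dataclass
from typing import Dict

# The four roles described above; names here are illustrative.
REQUIRED_ROLES = {"author", "coordinator", "reviewer", "notetaker"}

@dataclass
class Walkthrough:
    participants: Dict[str, str]  # maps role -> person filling it

    def is_properly_staffed(self) -> bool:
        # Every required role must be filled before the session starts.
        return REQUIRED_ROLES.issubset(self.participants)

session = Walkthrough(participants={
    "author": "Lee",        # programmer/analyst whose work is reviewed
    "coordinator": "Pat",   # keeps participants to their assigned roles
    "reviewer": "Sam",      # peer who points out errors, not remedies
    "notetaker": "Kim",     # records suggestions so others interact freely
})
print(session.is_properly_staffed())  # True
```

Separating the roles this way mirrors the point of the passage: the author listens, the reviewer critiques, and the coordinator and notetaker keep the session productive.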
Structured walkthroughs fit well in a total quality management approach when performed throughout the systems development life cycle. The time they take should be short—half an hour to an hour at most—which means that they must be well coordinated. The figure below shows a form that is useful in organizing the structured walkthrough and reporting its results. Because walkthroughs take time, do not overuse them.
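A reporting form like the one in the figure could be modeled as a simple record. The field names below are assumptions for illustration; the actual form in the figure may differ.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch of a walkthrough report form; field names are
# illustrative, not taken from the figure itself.
@dataclass
class WalkthroughReport:
    portion_reviewed: str         # part of the system or subsystem covered
    coordinator: str              # who ran the session
    duration_minutes: int         # sessions should be time-boxed
    suggestions: List[str] = field(default_factory=list)

    def within_time_box(self) -> bool:
        # Half an hour to an hour at most, per the guideline above.
        return self.duration_minutes <= 60

report = WalkthroughReport(
    portion_reviewed="Order-entry input screens",
    coordinator="Pat",
    duration_minutes=45,
)
report.suggestions.append("Clarify validation rules for customer ID field")
print(report.within_time_box())  # True
```

Capturing suggestions in a structured record makes it easier to verify later that each one was acted on, which is the feedback loop the next paragraph emphasizes.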
Use structured walkthroughs as a way to obtain (and then act on) valuable feedback from a perspective that you lack. As with all quality assurance measures, the point of walkthroughs is to evaluate the product systematically on an ongoing basis rather than wait until completion of the system.