

Section 22
Audit Evidence

Internal and external auditors reviewing manual applications have always been handicapped by control procedures that can be compromised at any time for a variety of reasons. In the manual portions of an application, any error, oversight, or negligent act can result in noncompliant processing of a single transaction. Auditors evaluating manual systems can only ask about procedures and attempt to infer conclusions about those procedures from whatever tangible output the procedures produce.

Both internal and external auditors realize that automated applications can significantly reduce, if not eliminate, this handicap. If the auditors can test the programs at a particular point in time and then identify and test program changes, it becomes easier to reach conclusions about the processing of every transaction based on testing a small sample of those transactions.
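
As a simple illustration of this sampling idea, the sketch below (in Python, with hypothetical field names and an assumed extended-price calculation) recomputes an expected value for a random sample of transactions and compares it with the recorded result. It is only a sketch of the concept, not a prescribed audit test.

    import random

    # Hypothetical transactions produced by the automated application.
    transactions = [
        {"id": i, "quantity": q, "unit_price": p, "extended": q * p}
        for i, (q, p) in enumerate([(3, 9.50), (1, 120.00), (7, 4.25), (2, 60.00), (5, 15.00)])
    ]

    # Because the same program logic processes every transaction, testing a
    # small random sample supports conclusions about the remaining transactions.
    for t in random.sample(transactions, k=3):
        expected = t["quantity"] * t["unit_price"]
        status = "OK" if abs(expected - t["extended"]) < 0.005 else "EXCEPTION"
        print(t["id"], status)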

Auditors evaluating an automated application need to review many new items, including program listings, database structures, and system history logs. Automated applications have provided the impetus for developing the data processing audit discipline. Until computers became an effective contributor to the business process, the audit community had little need to acquire computer-related skills.

INITIAL WORKPAPERS

IT auditors must understand the types of evidence normally available for applications and the characteristics or attributes of that evidence. The information normally available for computerized applications includes narratives, flow diagrams, file and field descriptions and specifications, and output specifications and distribution. At this point in the review, IT auditors should determine whether the available background information is sufficient to meet the audit objectives. Several workpapers can be used to replace missing documentation or, provided they relate to the application, to serve as a baseline for evaluating the current information.

In most application reviews, IT auditors find a clear relationship between the quantity and quality of the application documentation and the time required to complete audit procedures. The auditor’s time is also affected by the magnitude of the system and the volume and sensitivity of the data passing through the application.

Selecting the Sample. The IT auditor should begin by identifying all of the end users of the application under review. Working with management, the auditor should identify the key end users to survey and strongly consider randomly selecting a similar number of additional end users for the sample.
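
To illustrate this step, the sketch below draws such a sample in Python; the end-user names and the management-identified users are purely hypothetical, and this is one possible way to build the survey sample rather than a required procedure.

    import random

    # Hypothetical list of all end users identified for the application under review.
    all_end_users = [
        "A. Ramirez", "B. Chen", "C. Okafor", "D. Patel",
        "E. Novak", "F. Hughes", "G. Tanaka", "H. Weiss",
    ]

    # End users identified with management's help (assumed for illustration).
    management_identified = ["B. Chen", "E. Novak"]

    # Randomly select a similar number of additional end users from the rest.
    remaining = [u for u in all_end_users if u not in management_identified]
    random_selection = random.sample(remaining, k=len(management_identified))

    survey_recipients = management_identified + random_selection
    print(survey_recipients)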

Distributing the Survey. The IT auditor reproduces the required number of surveys and sends the questionnaire to the selected end users with an explanatory letter, visit, or telephone call. The questionnaires can be returned anonymously if appropriate.

Evaluating Survey Results. The IT auditor in this situation has an unusually difficult task. The completed surveys reflect end-user opinions, although each end user may believe that his or her responses are much closer to facts than opinions. The IT auditor must form an independent conclusion while remaining sensitive to any end users whose opinions or conclusions differ from the auditor's own. Conducting a survey is optional; the IT auditor should decide whether to use one based on the objectives of the audit being performed.
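
As a possible aid in summarizing the returned questionnaires, the sketch below tallies hypothetical ratings by question; the question labels and the 1-to-5 scale are assumptions, and the tally is only an input to the auditor's independent conclusion, not a substitute for it.

    from collections import Counter
    from statistics import mean

    # Hypothetical responses: one dict per returned survey, mapping each
    # question to a 1-5 rating (5 = most favorable).
    responses = [
        {"accuracy": 4, "timeliness": 3, "support": 5},
        {"accuracy": 2, "timeliness": 4, "support": 4},
        {"accuracy": 5, "timeliness": 3, "support": 3},
    ]

    # Show the average rating and the full spread of answers for each question,
    # so differing end-user opinions stay visible rather than being averaged away.
    for question in responses[0]:
        ratings = [r[question] for r in responses]
        print(question, "avg:", round(mean(ratings), 2), "counts:", Counter(ratings))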

