Doctoral Dissertations

Author

Paul L. Bowen

Date of Award

12-1992

Degree Type

Dissertation

Degree Name

Doctor of Philosophy

Major

Business Administration

Major Professor

A. Faye Borthick

Committee Members

Jack E. Kiger, James H. Scheiner, Mary G. Leitnaker, Michael D. Vose

Abstract

Empirical evidence indicates that the computerized information systems managers use to make operational, tactical, and strategic decisions contain data quality problems. Information economists have proven that, ceteris paribus, more accurate data increase the value of an information system. This dissertation examines the effects on information systems of (1) improving input control effectiveness and (2) increasing the frequency with which organizations identify, investigate, and correct data errors. When errors in an accounting information system accumulate to the maximum allowable error level, known as the clearing level, they are identified, investigated, and corrected, i.e., the errors are cleared. The number of errors in the system is the current error level. This set of assumptions can be modeled as a Markov process with an embedded Markov chain. Each event affecting an information system is assumed to have a probability of being processed correctly that is independent of previous error states, i.e., the probability that an event is processed correctly depends only on the number of errors currently in the database.

The Markov model is used to prove four theorems that reflect commonly held assumptions. The first two theorems show that, if input control effectiveness remains constant, lowering the clearing level improves data accuracy but increases the frequency of clearings. Theorems 3 and 4 show that, for a given clearing level, improving input control effectiveness retards the accumulation of data errors and decreases the frequency of clearings.

The Markov model is also used to prove four additional theorems that reveal less obvious relationships. Theorem 5 reveals that, for a given clearing level, improving input control effectiveness increases the variability of the time between clearings. Theorem 6 shows that, if the probability of correctly processing an event is independent of the current error level, improving input control effectiveness without lowering the clearing level does not improve average data quality. Theorem 7 demonstrates that, under the same independence assumption, lowering the clearing level yields linear marginal decreases in the average proportion of errors. Theorem 8, the most interesting result, states that, if the clearing level remains constant, improving input control effectiveness yields increasing marginal increases in the length of time between clearings.

The eight theorems emphasize the importance of implementing effective input controls, lowering the clearing level, and developing procedures to identify and correct data errors. More effective input control reduces the costs of maintaining accurate information systems, provides more time after each clearing to generate critical reports, and makes lower clearing levels economically feasible. Lowering the clearing level improves the accuracy of the data and thus reduces the risk of erroneous reports. Developing procedures that identify and correct data errors reduces clearing costs and enhances data accuracy. Because the cost of implementing more effective controls escalates as projects progress through the systems development life cycle, the results support the assertion that accountants should become more involved in systems development activities. The results also suggest that system developers should build software that enhances the ability of organizations to maintain accurate databases.
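
To make the modeling assumptions concrete, the following is a minimal simulation sketch (not taken from the dissertation) of the embedded Markov chain under the constant-probability assumption used in Theorems 6 through 8: each event is processed correctly with a hypothetical probability p_correct, each incorrectly processed event adds one error, and the database is cleared when the error count reaches the clearing level. All names and parameters are illustrative.

import random
import statistics

def simulate_clearing_cycle(p_correct, clearing_level, rng):
    """Simulate one clearing cycle of the embedded Markov chain.

    Starting from zero errors, events are processed until the number of
    accumulated errors reaches the clearing level, at which point the
    errors are cleared.  Returns (events_processed, error_level_history),
    where the history records the error level seen by each event.
    """
    errors = 0
    events = 0
    history = []
    while errors < clearing_level:
        history.append(errors)
        events += 1
        # Constant-probability assumption: each event is processed
        # correctly with probability p_correct regardless of the
        # current error level.
        if rng.random() >= p_correct:
            errors += 1
    return events, history

def summarize(p_correct, clearing_level, cycles=20_000, seed=1):
    """Estimate the mean and variance of the time between clearings and
    the average number of errors in the database across many cycles."""
    rng = random.Random(seed)
    times, levels = [], []
    for _ in range(cycles):
        t, hist = simulate_clearing_cycle(p_correct, clearing_level, rng)
        times.append(t)
        levels.extend(hist)
    return {
        "mean_time_between_clearings": statistics.fmean(times),
        "var_time_between_clearings": statistics.variance(times),
        "avg_errors_in_database": statistics.fmean(levels),
    }

if __name__ == "__main__":
    L = 10  # clearing level (maximum allowable error level)
    for p in (0.90, 0.95, 0.99):
        stats = summarize(p, L)
        print(f"p_correct={p:.2f}  "
              f"mean T={stats['mean_time_between_clearings']:8.1f}  "
              f"var T={stats['var_time_between_clearings']:10.1f}  "
              f"avg errors={stats['avg_errors_in_database']:.2f}")

Under these assumptions, raising p_correct lengthens the expected time between clearings at an increasing rate and increases its variability, while leaving the average number of errors in the database essentially unchanged, which is consistent with the relationships described in Theorems 5, 6, and 8.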
