Managing data quality in accounting information systems: A stochastic clearing system approach.



Abstract

Empirical evidence indicates that the computerized information systems managers use to make operational, tactical, and strategic decisions contain data quality problems. Information economists have proven that, ceteris paribus, more accurate data increase the value of an information system. This dissertation examines the effects on information systems of (1) improving input control effectiveness and (2) increasing the frequency with which organizations identify, investigate, and correct data errors.

When errors in an accounting information system accumulate to the maximum allowable error level, known as the clearing level, they are identified, investigated, and corrected, i.e., the errors are cleared. The number of errors in the system is the current error level. This set of assumptions can be modeled as a Markov process with an embedded Markov chain. Each event affecting an information system is assumed to have a probability of being processed correctly that is independent of previous error states, i.e., the probability that an event is processed correctly depends only on the number of errors currently in the database.

The Markov model is used to prove four theorems that reflect commonly held assumptions. The first two theorems show that, if input control effectiveness remains constant, lowering the clearing level improves data accuracy but increases the frequency of clearings. Theorems 3 and 4 show that, for a given clearing level, improving input control effectiveness retards the accumulation of data errors and decreases the frequency of clearings.

The Markov model is also used to prove four additional theorems that reveal less obvious relationships. Theorem 5 reveals that, for a given clearing level, improving input control effectiveness increases the variability of the time between clearings. Theorem 6 shows that, if the probability of correctly processing an event is independent of the current error level, improving input control effectiveness without lowering the clearing level does not improve average data quality. Theorem 7 demonstrates that, under the same independence assumption, lowering the clearing level yields linear marginal decreases in the average proportion of errors. Theorem 8 states that, if the clearing level remains constant, improving input control effectiveness yields increasing marginal increases in the length of time between clearings.
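The clearing dynamics described above can be illustrated with a short Monte Carlo sketch. This is not the dissertation's analytical Markov model; it is a hypothetical simulation, assuming (as in Theorems 6–8) that each event is mis-processed with a fixed probability `p_error` independent of the current error level, and that errors are cleared to zero once they reach the clearing level. The function name and parameters are illustrative.

```python
import random

def simulate_clearing(p_error, clearing_level, n_events, seed=0):
    """Simulate error accumulation and clearing in an information system.

    Each processed event is mis-processed with probability p_error,
    independent of the current error level. When accumulated errors
    reach clearing_level, they are identified, investigated, and
    corrected (cleared), resetting the error level to zero.

    Returns (average error level, mean number of events between clearings).
    """
    rng = random.Random(seed)
    errors = 0
    error_levels = []        # current error level after each event
    clearing_gaps = []       # events elapsed between successive clearings
    since_last_clearing = 0
    for _ in range(n_events):
        since_last_clearing += 1
        if rng.random() < p_error:
            errors += 1
        if errors >= clearing_level:
            errors = 0       # clear: identify, investigate, correct
            clearing_gaps.append(since_last_clearing)
            since_last_clearing = 0
        error_levels.append(errors)
    avg_errors = sum(error_levels) / len(error_levels)
    mean_gap = (sum(clearing_gaps) / len(clearing_gaps)
                if clearing_gaps else float("inf"))
    return avg_errors, mean_gap

# A lower clearing level improves average accuracy but clears more often
# (Theorems 1 and 2); better input controls (lower p_error) lengthen the
# time between clearings for a fixed level (Theorems 3 and 4).
for level in (5, 10):
    avg, gap = simulate_clearing(p_error=0.1, clearing_level=level,
                                 n_events=100_000)
    print(f"clearing level {level}: "
          f"avg errors {avg:.2f}, mean events between clearings {gap:.1f}")
```

Under these assumptions the expected gap between clearings is roughly the clearing level divided by the error rate (a negative-binomial waiting time), which is consistent with the trade-off the first two theorems describe.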

Bibliographic Details

  • Author: Bowen, Paul Larry
  • Institution: The University of Tennessee
  • Degree grantor: The University of Tennessee
  • Subjects: Business Administration, Accounting; Operations Research; Computer Science
  • Degree: Ph.D.
  • Year: 1992
  • Pages: 106 p.
  • Format: PDF
  • Language: English
