Author List: Klein, Barbara D.; Goodhue, Dale L.; Davis, Gordon B.;
MIS Quarterly, 1997, Volume 21, Issue 2, pp. 169-194.
There is strong evidence that data items stored in organizational databases have a significant rate of errors. If undetected in use, those errors in stored data may significantly affect business outcomes. Published research suggests that users of information systems tend to be ineffective in detecting data errors. This paper argues, however, that rather than accepting poor human error detection performance, MIS researchers need to develop better theories of human error detection and a clearer understanding of the conditions under which performance improves. The paper applies several theory bases (primarily signal detection theory, but also a theory of individual task performance, theories of effort and accuracy in decision making, and theories of goals and incentives) to develop a set of propositions about successful human error detection. These propositions are tested in a laboratory setting. The results strongly challenge earlier assertions that humans are poor detectors of data errors. The findings of the two laboratory experiments show that explicit error detection goals and incentives can modify error detection performance. These findings provide an improved understanding of the conditions under which users detect data errors, and they indicate that detection behavior in organizational settings can be influenced through managerial directives, training, and incentives.
Keywords: Information attributes; user behavior data