Authors: Hufnagel, Ellen M.; Conca, Christopher
Information Systems Research, 1994, Volume 5, Issue 1, pp. 48-73.
Abstract: Surveys that require users to evaluate or make judgments about information systems and their effect on specific work activities can produce misleading results if respondents do not interpret or answer questions in the ways intended by the researcher. This paper provides a framework for understanding both the cognitive activities involved and the errors and biases in judgment that can result when users are asked to categorize a system, explain its effects, or predict their own future actions and preferences with respect to use of a system. Specific suggestions are offered for wording survey questions and response categories so as to elicit more precise and reliable responses. In addition, possible sources of systematic bias are discussed, using examples drawn from published IS research. Recommendations are made for further research aimed at better understanding how and to what extent judgment biases could affect the results of IS surveys.
Keywords: IS evaluation; Response bias; Survey research; User satisfaction

List of Topics

#219 (0.314): response, responses, different, survey, questions, results, research, activities, respond, benefits, certain, leads, two-stage, interactions, study, address, respondents, question, directly, categories
#183 (0.137): explanations, explanation, bias, use, kbs, biases, facilities, cognitive, making, judgment, decisions, likely, decision, important, prior, judgments, feedback, types, difficult, lead
#189 (0.137): recommendations, recommender, systems, preferences, recommendation, rating, ratings, preference, improve, users, frame, contextual, using, frames, sensemaking, filtering, manipulation, specific, collaborative, items
#157 (0.098): evaluation, effectiveness, assessment, evaluating, paper, objectives, terms, process, assessing, criteria, evaluations, methodology, provides, impact, literature, potential, important, evaluated, identifying, multiple
#32 (0.077): research, studies, issues, researchers, scientific, methodological, article, conducting, conduct, advanced, rigor, researcher, methodology, practitioner, issue, relevance, findings, validation, papers, published
#0 (0.050): information, types, different, type, sources, analysis, develop, used, behavior, specific, conditions, consider, improve, using, alternative, understanding, data, available, main, target
#96 (0.050): errors, error, construction, testing, spreadsheet, recovery, phase, spreadsheets, number, failures, inspection, better, studies, modules, rate, replicated, detection, correction, optimal, discovering