Author List: Prat, Nicolas; Comyn-Wattiau, Isabelle; Akoka, Jacky
Journal of Management Information Systems, 2015, Volume 32, Issue 3, Pages 229-267.
Artifacts, such as software systems, pervade organizations and society. In the field of information systems (IS), they form the core of research, and their evaluation thus represents a major issue. Although IS research paradigms are increasingly intertwined, building and evaluating artifacts has traditionally been the purview of design science research (DSR). DSR in IS has not yet reached maturity, and this is particularly true of artifact evaluation. This paper investigates the "what" and the "how" of IS artifact evaluation: what are the objects and criteria of evaluation, what are the methods for evaluating the criteria, and what are the relationships between the "what" and the "how" of evaluation? To answer these questions, we develop a taxonomy of evaluation methods for IS artifacts. With this taxonomy, we analyze IS artifact evaluation practice, as reflected by ten years of DSR publications in the basket of journals of the Association for Information Systems (AIS). This research brings to light important relationships between the dimensions of IS artifact evaluation and identifies seven typical evaluation patterns: demonstration; simulation- and metric-based benchmarking of artifacts; practice-based evaluation of effectiveness; simulation- and metric-based absolute evaluation of artifacts; practice-based evaluation of usefulness or ease of use; laboratory, student-based evaluation of usefulness; and algorithmic complexity analysis. The study also reveals that artifact evaluation practice focuses on a few criteria. Beyond immediate usefulness, IS researchers are urged to investigate ways of evaluating the long-term organizational impact and the societal impact of artifacts.
Keywords: artifact evaluation; content analysis; design evaluation; design science research; taxonomy

List of Topics (topic id, weight of the topic in this document, top terms)

#157 0.310 evaluation effectiveness assessment evaluating paper objectives terms process assessing criteria evaluations methodology provides impact literature potential important evaluated identifying multiple
#207 0.214 design artifacts alternative method artifact generation approaches alternatives tool science generate set promising requirements evaluation problem designed incentives components addressing
#21 0.126 research information systems science field discipline researchers principles practice core methods area reference relevance conclude set focus propose perspective inquiry
#169 0.059 research journals journal information systems articles academic published business mis faculty discipline analysis publication management tenure authors publications disciplines years