Author List: Blohm, Ivo; Riedl, Christoph; Füller, Johann; Leimeister, Jan Marco
Information Systems Research, 2016, Volume 27, Issue 1, Pages 27-48.
Information technology (IT) has created new patterns of digitally mediated collaboration that allow open sourcing of ideas for new products and services. These novel sociotechnical arrangements afford fine-grained manipulation of how tasks can be represented and have changed the way organizations ideate. In this paper, we investigate differences in behavioral decision-making resulting from IT-based support of open idea evaluation. We report results from a randomized experiment with 120 participants comparing IT-based decision-making support using a rating scale (representing a judgment task) and a preference market (representing a choice task). We find that the rating scale-based task invokes significantly higher perceived ease of use than the preference market-based task and that perceived ease of use mediates the effect of the task representation treatment on users' decision quality. Furthermore, we find that the understandability of the ideas being evaluated, which we assess through the ideas' readability, and the perception of the task's variability moderate the strength of this mediation effect, which becomes stronger with increasing perceived task variability and decreasing understandability of the ideas. We contribute to the literature by explaining how perceptual differences between task representations for open idea evaluation affect users' decision quality and translate into differences in mechanism accuracy. These results enhance our understanding of how crowdsourcing, as a novel mode of value creation, may effectively complement traditional work structures.
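The abstract describes a moderated mediation structure: task representation (rating scale vs. preference market) affects decision quality through perceived ease of use, with idea readability and perceived task variability moderating the mediated path. The sketch below is a minimal illustration of that model structure on simulated data; all variable names, the simulated values, the OLS specification, and the sample size reuse are assumptions for illustration and do not reproduce the paper's actual analysis or dataset.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 120  # matches the reported number of participants (illustrative only)

# Simulated variables (all hypothetical):
# treatment: 1 = rating scale (judgment task), 0 = preference market (choice task)
treatment = rng.integers(0, 2, n)
readability = rng.normal(0, 1, n)        # understandability of the ideas
task_variability = rng.normal(0, 1, n)   # perceived task variability
ease_of_use = 0.5 * treatment + rng.normal(0, 1, n)
decision_quality = 0.4 * ease_of_use + rng.normal(0, 1, n)

df = pd.DataFrame({
    "treatment": treatment,
    "readability": readability,
    "task_variability": task_variability,
    "ease_of_use": ease_of_use,
    "decision_quality": decision_quality,
})

# Mediator model: task representation -> perceived ease of use
mediator_model = smf.ols("ease_of_use ~ treatment", data=df).fit()

# Outcome model: ease of use -> decision quality, with the moderators
# entering through interaction terms on the mediator path.
outcome_model = smf.ols(
    "decision_quality ~ treatment + ease_of_use"
    " + ease_of_use:task_variability + ease_of_use:readability",
    data=df,
).fit()

# Point estimate of the indirect (mediated) effect at mean moderator values.
indirect_effect = (mediator_model.params["treatment"]
                   * outcome_model.params["ease_of_use"])
print(f"Indirect effect at mean moderators: {indirect_effect:.3f}")

In a full analysis the indirect effect and its moderation would typically be tested with bootstrapped confidence intervals rather than the single point estimate shown here.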
Keywords: crowdsourcing; computer-mediated communication and collaboration; decision support systems; idea evaluation; rating scales; preference markets
Algorithm:

List of Topics

#9 0.100 using subjects results study experiment did conducted task time used experienced use preference experimental presented decision-making empirical significantly effects better
#142 0.092 creativity ideas idea creative individual generation techniques individuals problem support cognitive ideation stimuli memory generate enhance generated solutions solving quality
#99 0.085 perceived usefulness acceptance use technology ease model usage tam study beliefs intention user intentions users behavioral perceptions determinants constructs studies
#295 0.078 task fit tasks performance cognitive theory using support type comprehension tools tool effects effect matching types theories modification working time
#177 0.074 decision accuracy aid aids prediction experiment effects accurate support making preferences interaction judgment hybrid perceptual strategy account context restrictiveness taking
#173 0.071 effect impact affect results positive effects direct findings influence important positively model data suggest test factors negative affects significant relationship
#190 0.063 new licensing license open comparison type affiliation perpetual prior address peer question greater compared explore competing crowdsourcing provide choice place
#157 0.060 evaluation effectiveness assessment evaluating paper objectives terms process assessing criteria evaluations methodology provides impact literature potential important evaluated identifying multiple
#189 0.053 recommendations recommender systems preferences recommendation rating ratings preference improve users frame contextual using frames sensemaking filtering manipulation specific collaborative items
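Each line of the listing above gives a topic identifier, a weight, and the topic's top terms, which is the typical output format of a probabilistic topic model such as LDA fitted to a document corpus. The sketch below shows one way such a listing could be produced with gensim; the toy corpus, the parameter values, and the weight convention (average topic share across documents) are assumptions for illustration, not the procedure behind the list above.

from gensim import corpora
from gensim.models import LdaModel

# Toy corpus: each document is a list of tokens (hypothetical).
documents = [
    ["idea", "evaluation", "rating", "scale", "decision", "quality"],
    ["preference", "market", "choice", "task", "crowdsourcing"],
    ["perceived", "ease", "use", "technology", "acceptance", "model"],
    ["task", "fit", "performance", "cognitive", "support", "tools"],
]

dictionary = corpora.Dictionary(documents)
bow_corpus = [dictionary.doc2bow(doc) for doc in documents]

lda = LdaModel(bow_corpus, num_topics=3, id2word=dictionary,
               passes=10, random_state=0)

# Print each topic as "#id weight term term ...", where the weight here is the
# topic's average share across documents (one plausible convention; the weights
# in the listing above may be defined differently).
doc_topics = [dict(lda.get_document_topics(bow, minimum_probability=0.0))
              for bow in bow_corpus]
for topic_id in range(lda.num_topics):
    avg_share = sum(dt.get(topic_id, 0.0) for dt in doc_topics) / len(doc_topics)
    terms = " ".join(term for term, _ in lda.show_topic(topic_id, topn=10))
    print(f"#{topic_id} {avg_share:.3f} {terms}")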