Author List: Pitt, Leyland F.; Watson, Richard T.; Kavan, C. Bruce;
MIS Quarterly, 1997, Volume 21, Issue 2, Pages 209-221.
This paper responds to the research note in this issue by Van Dyke et al. concerning SERVQUAL, an instrument for measuring service quality, and its use in the IS domain. It attempts to balance the arguments they draw from the marketing literature with the well-documented counterarguments of SERVQUAL's developers, as well as our own research evidence and observations in an IS-specific environment. Specifically, evidence is provided to show that the service quality perceptions-expectations subtraction in SERVQUAL is far more rigorously grounded than Van Dyke et al. suggest; that the expectations construct, while potentially ambiguous, is generally a vector in the case of an IS department; and that the dimensions of service quality seem to be as applicable to the IS department as to any other organizational setting. The paper then demonstrates that the problems with the reliability of difference score calculations in SERVQUAL are not nearly as serious as Van Dyke et al. suggest, and that, while perceptions-only measurement of service quality might have marginally better predictive and convergent validity, this comes at considerable expense to managerial diagnostics; it also reiterates some of the problems of dimensional instability found in our previous research, highlighted by Van Dyke et al. and discussed in many other studies of SERVQUAL across a range of settings. Finally, four areas for further research are identified.
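For readers unfamiliar with the two quantities at the center of the exchange, the sketch below (not taken from the paper; the item ratings, helper names, and illustrative reliability figures are hypothetical) shows the perceptions-expectations subtraction that yields a SERVQUAL gap score, and the classical psychometric formula for the reliability of a difference score that underlies the Van Dyke et al. critique. Whether and how that formula applies to SERVQUAL gap scores is precisely what the debate concerns.

# Minimal sketch; all data values are hypothetical.

def gap_scores(perceptions, expectations):
    # Per-item gap score: G_i = P_i - E_i (the perceptions-expectations subtraction).
    return [p - e for p, e in zip(perceptions, expectations)]

def difference_score_reliability(rel_p, rel_e, sd_p, sd_e, corr_pe):
    # Classical reliability of a difference score D = P - E:
    #   rho_D = (sd_p^2*rel_p + sd_e^2*rel_e - 2*corr_pe*sd_p*sd_e)
    #           / (sd_p^2 + sd_e^2 - 2*corr_pe*sd_p*sd_e)
    numerator = sd_p**2 * rel_p + sd_e**2 * rel_e - 2 * corr_pe * sd_p * sd_e
    denominator = sd_p**2 + sd_e**2 - 2 * corr_pe * sd_p * sd_e
    return numerator / denominator

# Hypothetical 7-point ratings for one respondent on a four-item dimension.
perceptions  = [5, 6, 4, 5]
expectations = [6, 7, 6, 6]
print(gap_scores(perceptions, expectations))   # [-1, -1, -2, -1]

# With reliable components that are only moderately correlated, the difference
# score can itself remain acceptably reliable (0.75 in this illustrative case).
print(difference_score_reliability(0.90, 0.85, 1.0, 1.0, 0.5))

The point the example makes is arithmetic rather than substantive: the reliability of the gap score depends on the component reliabilities and on the perceptions-expectations correlation, so blanket claims that difference scores are unreliable need to be checked against the values actually observed in IS settings.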
Keywords: IS research agenda; marketing of IS; measurement; reliability; service quality; validity
Algorithm:

List of Topics

#115 0.353 quality different servqual service high-quality difference used quantity importance use measure framework impact assurance better include means van dimensions assessing
#263 0.169 instrument measurement factor analysis measuring measures dimensions validity based instruments construct measure conceptualization sample reliability development develop responses assess use
#127 0.131 systems information research theory implications practice discussed findings field paper practitioners role general important key grounded researchers domain new identified
#82 0.123 case study studies paper use research analysis interpretive identify qualitative approach understanding critical development managerial elements exploring points positivist presents
#222 0.100 research researchers framework future information systems important present agenda identify areas provide understanding contributions using literature studies paper potential review