Author List: van Dyke, Thomas P.; Kappelman, Leon A.; Prybutok, Victor R.
MIS Quarterly, 1997, Volume 21, Issue 2, pp. 195-208.
A recent MIS Quarterly article rightfully points out that service is an important part of the role of the information systems (IS) department and that most IS assessment measures have a product orientation (Pitt et al. 1995). The article went on to suggest the use of an IS-context-modified version of the SERVQUAL instrument to assess the quality of the services supplied by an information services provider (Parasuraman et al. 1985, 1988, 1991). However, a number of problems with the SERVQUAL instrument have been discussed in the literature (e.g., Babakus and Boller 1992; Carman 1990; Cronin and Taylor 1992, 1994; Teas 1993). This article reviews that literature and discusses some of its implications for measuring service quality in the information systems context. The findings indicate that SERVQUAL suffers from a number of conceptual and empirical difficulties. Conceptual difficulties include the operationalization of perceived service quality as a difference, or gap, score; the ambiguity of the expectations construct; and the unsuitability of a single measure of service quality across different industries. Empirical problems, which may be linked to the use of difference scores, include reduced reliability, poor convergent validity, and poor predictive validity. These findings suggest that (1) some alternative to difference scores is preferable and should be utilized; (2) if difference scores are used, caution should be exercised in the interpretation of IS-SERVQUAL results; and (3) further work is needed in the development of measures for assessing the quality of IS services.
Keywords: evaluation; IS management; measurement; service quality; user attitudes; user expectations
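The abstract's link between difference (gap) scores and reduced reliability follows from classical test theory: the reliability of a difference D = P − E falls as the two components become more correlated, even when each component is itself reliable. A minimal Python sketch of the standard difference-score reliability formula (function and variable names are illustrative, not from the article):

```python
def difference_score_reliability(r_pp, r_ee, sd_p, sd_e, r_pe):
    """Reliability of the difference score D = P - E, given the
    reliabilities of P and E (r_pp, r_ee), their standard deviations
    (sd_p, sd_e), and the correlation between them (r_pe)."""
    num = r_pp * sd_p**2 + r_ee * sd_e**2 - 2 * r_pe * sd_p * sd_e
    den = sd_p**2 + sd_e**2 - 2 * r_pe * sd_p * sd_e
    return num / den

# Even with reliable components (0.80 each, equal variances), a moderate
# correlation between perceptions and expectations (0.50) lowers the
# gap score's reliability:
print(difference_score_reliability(0.80, 0.80, 1.0, 1.0, 0.50))  # ~0.60
```

With perception and expectation scales each at reliability 0.80, a perception-expectation correlation of 0.50 drops the gap score's reliability to about 0.60, which is consistent with the empirical problems the abstract attributes to difference scores.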