Investigating response styles and item homogeneity using item response models
Wetzel, Eunike (2013): Investigating response styles and item homogeneity using item response models, Bamberg: opus.
Author: Wetzel, Eunike
Year of publication: 2013
Language: English
Remark: Bamberg, Univ., Diss., 2013
Abstract:
Measurement invariance is a prerequisite for drawing accurate and valid inferences concerning individuals’ trait levels from questionnaire data. However, several factors can influence a person’s item responses in addition to his or her latent trait level. The research in this dissertation investigated three of these factors: 1) individual differences in response styles, 2) the measurement invariance of items between subgroups of respondents, and 3) the measurement invariance of items across assessment periods.
Mixed Rasch analyses of data from the German NEO-PI-R showed that respondents differed systematically in their response scale use: some preferred extreme categories while others preferred moderate categories. Multidimensional item response models showed that response styles (especially extreme response style) explained variance in item responses that was incremental to the variance explained by the traits. Thus, individual differences in response styles had an influence on item responses.
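The distinction between extreme and moderate response scale use can be made concrete with a simple descriptive index. The sketch below is a hypothetical illustration with made-up responses, not the mixed Rasch or multidimensional item response models used in the dissertation; it merely computes the proportion of a respondent’s answers that fall in the extreme categories of a 5-point scale:

```python
import numpy as np

def extreme_response_index(responses, n_categories=5):
    """Proportion of a respondent's answers in the two extreme
    categories (1 and n_categories) of an n-point Likert scale."""
    responses = np.asarray(responses)
    extreme = (responses == 1) | (responses == n_categories)
    return extreme.mean(axis=-1)

# Hypothetical responses of two respondents to eight 5-point items:
# one favouring extreme categories, one favouring moderate ones.
extreme_person = [1, 5, 1, 5, 2, 1, 3, 5]
moderate_person = [2, 3, 3, 4, 2, 3, 4, 3]

print(extreme_response_index(extreme_person))   # → 0.75
print(extreme_response_index(moderate_person))  # → 0.0
```

Two respondents with the same trait level but different values on such an index would produce systematically different item responses, which is the incremental variance the multidimensional models pick up.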
The measurement invariance of items between subgroups of respondents was investigated with respect to differential item functioning (DIF) for gender in the German NEO-PI-R. Several NEO-PI-R facets, especially from the neuroticism, agreeableness, and conscientiousness domains, contained items that were not measurement invariant for men and women.
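The core idea of a DIF analysis is to compare the item responses of two groups after matching respondents on the trait being measured. The following sketch implements a simple standardization-style screening index (weighted mean difference in item response within strata of the total score), which is a hypothetical illustration and not the item-response-model-based DIF test used in the dissertation:

```python
import numpy as np

def standardized_dif(item, total, group, n_strata=4):
    """Standardization-style DIF screening index: the difference in
    mean item response between group 0 and group 1, computed within
    strata of the matching variable (total score) and weighted by
    stratum size. Values near zero suggest no DIF."""
    item, total, group = map(np.asarray, (item, total, group))
    # Cut the total score into n_strata quantile-based strata.
    edges = np.quantile(total, np.linspace(0, 1, n_strata + 1))
    strata = np.clip(np.searchsorted(edges, total, side="right") - 1,
                     0, n_strata - 1)
    diff, weight = 0.0, 0
    for s in range(n_strata):
        a = item[(strata == s) & (group == 0)]
        b = item[(strata == s) & (group == 1)]
        if len(a) and len(b):
            n = len(a) + len(b)
            diff += n * (a.mean() - b.mean())
            weight += n
    return diff / weight if weight else 0.0
```

An item is flagged when, at the same trait level, one group systematically endorses it more strongly than the other; such items are candidates for removal or revision during test construction.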
The measurement invariance of items across assessment periods was investigated for link items from the reading and science domains in the Programme for International Student Assessment (PISA). Measurement invariance was violated in both item sets. Some items showed large differences in item difficulty between assessments, which may in part be attributed to changes in item wording and to position effects.
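A first screening for this kind of difficulty drift can be done at the classical-test-theory level by comparing logit-transformed proportions correct for the same link items across assessments. This is a hypothetical sketch with made-up numbers, not PISA’s operational Rasch-based linking procedure:

```python
import numpy as np

def difficulty_drift(p_correct_t1, p_correct_t2):
    """Shift in item difficulty on the logit scale between two
    assessments of the same link items. Positive values mean the
    item became harder (proportion correct dropped)."""
    p1 = np.asarray(p_correct_t1, dtype=float)
    p2 = np.asarray(p_correct_t2, dtype=float)
    logit = lambda p: np.log(p / (1.0 - p))
    # Classical difficulty is inversely related to proportion correct,
    # so drift toward "harder" is logit(p1) - logit(p2).
    return logit(p1) - logit(p2)

# Hypothetical proportions correct for three link items at two assessments:
drift = difficulty_drift([0.60, 0.50, 0.70], [0.60, 0.30, 0.70])
print(np.round(drift, 2))  # items 1 and 3 stable, item 2 drifted harder
```

Link items with large drift cannot anchor the scales of the two assessments, which is why their invariance properties matter for trend comparisons.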
In sum, it was shown that individual differences in response styles, the lack of measurement invariance of items between subgroups of respondents, and the lack of measurement invariance of items across assessment periods can impair the measurement of the intended traits and, in consequence, render trait inferences and comparisons between individuals or groups invalid. Thus, measures should be taken to reduce the impact of factors that interfere with measurement invariance. Such measures can be aimed at test construction: for example, the item or response format can be adjusted to elicit response styles to a lesser degree, and items can be selected that remain invariant across subgroups of participants and across assessment periods.
GND Keywords: Testtheorie; Testkonstruktion; Persönlichkeitsfaktor; Rasch-Modell
Keywords: response styles; item response models; measurement invariance
Type: Doctoral thesis
Activation date: September 13, 2013
Permalink: https://fis.uni-bamberg.de/handle/uniba/1852