Application of Two-Parameter Item Response Theory for Determining Form-Dependent Items on Exams Using Different Item Orders
- Additional Information
- Availability:
Center for Educational Assessment. 813 North Pleasant Street, Amherst, MA 01002. e-mail:
[email protected]; Tel: 413-577-2180; Web site: https://scholarworks.umass.edu/pare
- Peer Reviewed:
Y
- Source:
Practical Assessment, Research & Evaluation, 15
- Education Level:
Higher Education
Postsecondary Education
- Subject Terms:
- ISSN:
1531-7714
- Abstract:
Using multiple versions of an assessment has the potential to introduce item environment effects. These types of effects result in version-dependent item characteristics (i.e., difficulty and discrimination). Methods to detect such effects, and their resulting implications, are important for all levels of assessment where multiple forms of an assessment are created. This report describes a novel method for identifying items that do and do not display form dependence. The first two steps identify form-dependent items using a differential item functioning (DIF) analysis of item parameters estimated by Item Response Theory. The method is illustrated using items that appeared in four forms (two trial and two released versions) of a first-semester general chemistry examination. Eighteen of fifty-six items were identified as having form-dependent item parameters. Thirteen of those items displayed form dependence consistent with reasons previously identified in the literature: preceding item difficulty, content priming, and a combination of preceding item difficulty and content priming. The remaining five items displayed form dependence that did not align with reasons reported in the literature. An analysis was also done to determine whether all predicted instances of form dependence could be found. Several items for which form dependence would have been expected, based on preceding item difficulty or content priming, were identified, yet those items did not display form dependence. We identify and rationalize form dependence for thirteen of the eighteen flagged items; however, we are unable to predict in advance which items will display form dependence.
- Abstractor:
As Provided
- Publication Date:
2023
- Accession Number:
EJ1401324
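
The abstract describes the method only at a high level: two-parameter (2PL) IRT item parameters are estimated for each form and compared in a DIF-style analysis to flag form-dependent items. As a rough illustration of that idea only, the Python sketch below defines the 2PL item characteristic function and flags items whose estimated discrimination (a) or difficulty (b) shifts noticeably between two forms. The item names, parameter values, and thresholds are hypothetical, and the simple threshold comparison stands in for whatever DIF criterion the report actually applies.

```python
import numpy as np

def p_correct(theta, a, b):
    """Two-parameter logistic (2PL) IRT model: probability of a correct
    response given ability theta, discrimination a, and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Hypothetical per-form parameter estimates, {item_id: (a, b)}.
# In practice these would come from fitting a 2PL model to the response
# data from each exam form separately.
form_A = {"item01": (1.2, -0.3), "item02": (0.8, 0.5)}
form_B = {"item01": (1.1, -0.2), "item02": (0.7, 1.4)}

def flag_form_dependent(params_a, params_b, max_da=0.3, max_db=0.5):
    """Flag items whose discrimination or difficulty differs between
    forms by more than illustrative thresholds (not from the report)."""
    flagged = []
    for item in sorted(params_a.keys() & params_b.keys()):
        a1, b1 = params_a[item]
        a2, b2 = params_b[item]
        if abs(a1 - a2) > max_da or abs(b1 - b2) > max_db:
            flagged.append(item)
    return flagged

print(p_correct(0.0, 1.2, -0.3))            # ~0.59: 2PL probability at theta = 0
print(flag_form_dependent(form_A, form_B))  # ['item02']: difficulty shifted by 0.9
```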