Graded Response Method: Does Question Type Influence the Assessment of Critical Thinking?

  • Author(s): Fukuzawa, Sherry; deBraga, Michael
  • Language:
    English
  • Source:
    Journal of Curriculum and Teaching. 2019 8(1):1-10.
  • Publication Date:
    2019
  • Document Type:
    Journal Articles
    Reports - Research
    Tests/Questionnaires
  • Additional Information
    • Availability:
      Sciedu Press. 1120 Finch Avenue West Suite 701-309, Toronto, ON., M3J 3H7, Canada. Tel: 416-479-0028; Fax: 416-642-8548; e-mail: [email protected]; Web site: http://www.sciedupress.com/journal/index.php/jct
    • Peer Reviewed:
      Y
    • Number of Pages:
      10
    • Education Level:
      Higher Education
      Postsecondary Education
    • Subject Terms:
    • ISSN:
      1927-2677
    • Abstract:
      Graded Response Method (GRM) is an alternative to multiple-choice testing in which students rank options according to their relevance to the question. GRM requires discrimination and inference between statements and offers a cost-effective critical thinking assessment in large courses where open-ended answers are not feasible. This study compared critical thinking assessment via GRM, open-ended questions, and multiple-choice questions composed from Bloom's taxonomy in an introductory undergraduate course in anthropology and archaeology (N=53 students). Critical thinking was operationalized as the ability to assess a question with evidence to support or evaluate arguments (Ennis, 1993). We predicted that students who performed well on multiple-choice questions from Bloom's taxonomy levels 4-6 and on open-ended questions would also perform well on GRM questions involving similar concepts, and that high-performing students on GRM would have higher course grades. The null hypothesis was that question type would have no effect on critical thinking assessment. In two quizzes, the correlation between GRM and open-ended questions was weak (R² = 0.15), but it was strong in the exam (R² = 0.56). Correlations were consistently higher between GRM and multiple-choice questions from Bloom's taxonomy levels 4-6 (R² = 0.23, 0.31, 0.21) than from levels 1-3 (R² = 0.13, 0.29, 0.18). GRM is a viable alternative to multiple choice for critical thinking assessment without added resources or grading effort.
    • Abstractor:
      As Provided
    • Accession Number:
      EJ1206447