Item Analysis Projects

Item Analysis – DIF, Complexity, Visual-Spatial, Item Order Effect, Item Stability

These projects study examination items using a variety of psychometric analyses and categorization rubrics:

  • Differential item functioning (DIF): DIF studies have been used to identify examination items showing situational and persistence DIF; this work helps us identify items that favor certain student populations (a brief illustrative sketch of such an analysis appears after this list).
  • Complexity: Complexity is used as an a priori means of estimating item difficulty; a revised complexity rubric is used both to develop new examinations and to evaluate how students answer examination items.
  • Visual-spatial components: Visual-spatial reference components (e.g., graphs, equations, structures) are evaluated in light of item difficulty and complexity to better understand how students use these components when responding to examination items.
  • Item order effects: Order effect studies evaluate how two (or more) orderings of examination items and response options influence item difficulty and response patterns.
  • Item stability: Stability studies examine whether item difficulty and discrimination values are independent of examination form, including trial tests, released examinations (two forms), and use in paired-question or conceptual examinations; this work informs how norm-referenced examinations can be constructed without collecting new norming data.
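To make the quantities involved concrete, the sketch below (a minimal illustration, not code from any of the studies cited here) computes classical item difficulty and discrimination and a Mantel-Haenszel DIF index for a dichotomously scored response matrix. The data shapes, group coding, and five-stratum matching criterion are illustrative assumptions only.

```python
"""Illustrative sketch: classical item statistics and a Mantel-Haenszel DIF check."""
import numpy as np

def item_statistics(responses):
    """responses: (n_examinees, n_items) array of 0/1 scores.
    Returns per-item difficulty (proportion correct) and discrimination
    (corrected point-biserial: item score vs. rest-of-test total)."""
    responses = np.asarray(responses, dtype=float)
    difficulty = responses.mean(axis=0)
    total = responses.sum(axis=1)
    discrimination = np.empty(responses.shape[1])
    for j in range(responses.shape[1]):
        rest = total - responses[:, j]          # exclude item j from the criterion
        discrimination[j] = np.corrcoef(responses[:, j], rest)[0, 1]
    return difficulty, discrimination

def mantel_haenszel_dif(item, group, total, n_strata=5):
    """Mantel-Haenszel common odds ratio (and ETS delta scale) for one item.
    item: 0/1 scores; group: 0 = reference, 1 = focal; total: matching criterion."""
    edges = np.unique(np.quantile(total, np.linspace(0, 1, n_strata + 1)[1:-1]))
    strata = np.digitize(total, edges)
    num = den = 0.0
    for k in np.unique(strata):
        m = strata == k
        a = np.sum((group[m] == 0) & (item[m] == 1))   # reference, correct
        b = np.sum((group[m] == 0) & (item[m] == 0))   # reference, incorrect
        c = np.sum((group[m] == 1) & (item[m] == 1))   # focal, correct
        d = np.sum((group[m] == 1) & (item[m] == 0))   # focal, incorrect
        n = a + b + c + d
        if n == 0:
            continue
        num += a * d / n
        den += b * c / n
    alpha = num / den if den > 0 else np.nan
    delta = -2.35 * np.log(alpha)                      # ETS delta metric
    return alpha, delta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    scores = (rng.random((500, 40)) < 0.7).astype(int)  # fabricated demo data
    group = rng.integers(0, 2, 500)
    p, r = item_statistics(scores)
    alpha, delta = mantel_haenszel_dif(scores[:, 0], group, scores.sum(axis=1))
    print(f"item 1: difficulty={p[0]:.2f}, discrimination={r[0]:.2f}, MH delta={delta:.2f}")
```

With simulated data that has no group effect, as above, the MH delta hovers near zero; in practice a flagged item would show a delta well away from zero after matching on total score.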

References

  • Knaus, K. J., Murphy, K. L., & Holme, T. A. (2009). Designing Chemistry Practice Exams for Enhanced Benefits: An Instrument for Comparing Performance and Mental Effort Measures. Journal of Chemical Education, 86(7), 827.
  • Holme, T., & Murphy, K. (2011). Assessing Conceptual versus Algorithmic Knowledge: Are We Engendering New Myths in Chemical Education? In Investigating Classroom Myths through Research on Teaching and Learning (pp. 195-206). American Chemical Society.
  • Knaus, K., Murphy, K., Blecking, A., & Holme, T. (2011). A valid and reliable instrument for cognitive complexity rating assignment of chemistry exam items. Journal of Chemical Education, 88(5), 554-560.
  • Schroeder, J., Murphy, K. L., & Holme, T. A. (2012). Investigating factors that influence item performance on ACS exams. Journal of Chemical Education, 89(3), 346-350.
  • Grunert, M. L., Raker, J. R., Murphy, K. L., & Holme, T. A. (2013). Polytomous versus dichotomous scoring on multiple-choice examinations: development of a rubric for rating partial credit. Journal of Chemical Education, 90(10), 1310-1315.
  • Kendhammer, L., Holme, T., & Murphy, K. (2013). Identifying differential performance in general chemistry: Differential item functioning analysis of ACS general chemistry trial tests. Journal of Chemical Education, 90(7), 846-853.
  • Raker, J. R., Trate, J. M., Holme, T. A., & Murphy, K. (2013). Adaptation of an instrument for measuring the cognitive complexity of organic chemistry exam items. Journal of Chemical Education, 90(10), 1290-1295.
  • Kendhammer, L. K., & Murphy, K. L. (2014). General Statistical Techniques for Detecting Differential Item Functioning Based on Gender Subgroups: A Comparison of the Mantel-Haenszel Procedure, IRT, and Logistic Regression. In Innovative Uses of Assessments for Teaching and Research (pp. 47-64). American Chemical Society.
  • Murphy, K. L., & Holme, T. (2014). Improving instructional design with better analysis of assessment data. Journal of Learning Design, 7(2), 1.
  • Brandriet, A., Reed, J. J., & Holme, T. (2015). A Historical Investigation into Item Formats of ACS Exams and Their Relationships to Science Practices. Journal of Chemical Education, 92(11), 1798-1806.
  • Elkins, K. M., & Murphy, K. L. (2016). Use of the Online Version of an ACS General Chemistry Exam: Evaluation of Student Performance and Impact on the Final Exam. In Technology and Assessment Strategies for Improving Student Learning in Chemistry (pp. 211-223). American Chemical Society.
  • Luxford, C., & Holme, T. (2016). How Do Chemistry Educators View Items That Test Conceptual Understanding? In Technology and Assessment Strategies for Improving Student Learning in Chemistry (pp. 195-210). American Chemical Society.
  • Reed, J. J., Brandriet, A. R., & Holme, T. A. (2016). Analyzing the Role of Science Practices in ACS Exam Items. Journal of Chemical Education, 94(1), 3-10.
  • Reed, J. J., Raker, J. R., & Murphy, K. L. (2019). Evaluation of subset norm stability in ACS general chemistry exams. Submitted to Journal of Chemical Education.