Journal of Information Systems Education (JISE)

Volume 33, Issue 4, Pages 405-415

Fall 2022


Quality Assurance of Learning Assessments in Large Information Systems and Decision Analysis Courses


Zsolt Ugray
Brian K. Dunn

Utah State University
Logan, UT 84322, USA

Abstract: As Information Systems courses have become more data-focused and enrollments have grown, the need to assess technical and analytical skills efficiently and effectively has increased. Multiple-choice examinations provide a means of accomplishing this, though creating effective multiple-choice assessment items in a technical course context can be challenging. This study presents an iterative quality improvement framework based on the Plan-Do-Study-Act (PDSA) quality assurance cycle for developing and improving such multiple-choice assessments. Integral to this framework, we also present a rigorous, reliable, and valid measure of assessment and item quality using discrimination efficiency and the KR-20 assessment reliability measure. We demonstrate the effectiveness of our approach across exams developed and administered for two courses: a highly technical introductory Information Systems course and an introductory data analytics course. Using this approach, we show that assessment quality improves iteratively when instructors measure items and exams rigorously and apply this PDSA framework.
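For readers unfamiliar with the reliability measure named in the abstract, the Kuder-Richardson Formula 20 (KR-20) is conventionally defined as follows; this is the standard textbook definition, not a formula reproduced from the article itself:

\[ \mathrm{KR\text{-}20} \;=\; \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} p_i \,(1 - p_i)}{\sigma_X^{2}}\right) \]

where k is the number of dichotomously scored items on the exam, p_i is the proportion of examinees answering item i correctly, and sigma_X^2 is the variance of the examinees' total scores. Values closer to 1 indicate greater internal consistency of the assessment.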

Keywords: Learning goals & outcomes, Item analysis, Learning assessment, Student learning

Download This Article: JISE2022v33n4pp405-415.pdf


Recommended Citation: Ugray, Z., & Dunn, B. K. (2022). Quality Assurance of Learning Assessments in Large Information Systems and Decision Analysis Courses. Journal of Information Systems Education, 33(4), 405-415.