Journal of Medical Education

Abstract

Background and purpose: The Mini-CEX is a well-recognized, reliable, and valid clinical assessment tool widely used in medical education. However, there is no effective way to evaluate the quality of supervisors' item-level scoring on completed Mini-CEX forms. We developed strategies to evaluate rating quality from the completed forms.

Methods: We collected 1396 Mini-CEX assessment forms from eighteen clinical departments across five medical centers. We used QI Macros 2020 in Excel to produce statistical process control (SPC) charts and stacked area figures. The 5 major teaching hospitals and 4 major clinical departments were de-identified with the numbers 1 to 5 and the letters A to D.

Results: One clinical department (C1) showed good rating quality, reflected in high variation of score SD (p < 0.01), whereas A2 and D3 showed poor quality owing to high rates of all-items-identical scores (AIIS) (p < 0.01). Departments A4, B2, and D2 showed poor quality because of high unrated rates (p < 0.01), together accounting for 80.2% of all unrated items. The stacked figure showed that ratings clustered on scores of 8 and 9 (72.2%), suggesting overestimation of students' performance and poor scoring quality.

Conclusions: Our study disclosed shortcomings of Mini-CEX assessment that signal inadequacy in the quality assurance systems of the 5 medical centers collaborating with one medical school. Further investigation and evaluation to ensure the quality of clinical assessments at more medical schools and medical centers in Taiwan are warranted.
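The three quality indicators used in the abstract (variation of item scores, the AIIS rate, and the unrated rate) can be sketched per form as follows. This is a minimal illustration only: the record layout, field names, and example scores are hypothetical, and it does not reproduce the study's data or the QI Macros SPC workflow.

```python
from statistics import pstdev

# Hypothetical Mini-CEX form records: department code plus per-item
# scores, with None marking an item the supervisor left unrated.
forms = [
    {"dept": "C1", "scores": [6, 7, 9, 8, 7, 6]},       # varied scores
    {"dept": "A2", "scores": [8, 8, 8, 8, 8, 8]},       # all items identical
    {"dept": "A4", "scores": [9, None, None, 8, None, 9]},  # partly unrated
]

def is_aiis(form):
    """All Items Identical Scores: every rated item carries the same value."""
    rated = [s for s in form["scores"] if s is not None]
    return len(rated) > 0 and len(set(rated)) == 1

def unrated_rate(form):
    """Fraction of items on the form left blank."""
    return sum(s is None for s in form["scores"]) / len(form["scores"])

def score_sd(form):
    """Population SD of the rated items; higher SD suggests more
    discriminating scoring across items."""
    rated = [s for s in form["scores"] if s is not None]
    return pstdev(rated) if len(rated) > 1 else 0.0
```

Aggregating these per-form values by department and plotting them on SPC charts (as the study did with QI Macros) would then flag departments whose AIIS or unrated rates fall outside control limits.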

First Page

23

Last Page

32

DOI

10.6145/jme.202212_26(4).0002
