
Assessment of Clinical Competence of Interns using Work Place Based Assessment in Ophthalmology Rotational Posting
Correspondence
Yashi Bansal,
87, SAS Nagar Extension, Jalandhar, Punjab, India.
E-mail: dryashibansal@gmail.com
Introduction: Acquisition of clinical skills is the main aim of the compulsory clinical rotations during internship. Whether students actually acquire these skills is largely left to chance, as there is no formal assessment to ensure that skill learning has taken place. Work Place Based Assessment (WPBA) is widely regarded as the most suitable method for assessing clinical competence. The Mini-Clinical Evaluation Exercise (mini-CEX) and Directly Observed Procedural Skills (DOPS) are among the tools used for this purpose. These tools can also be used to give feedback to students and, at the same time, serve as an excellent teaching-learning opportunity.
Aim: To assess the feasibility and use of the Mini-Clinical Evaluation Exercise (mini-CEX) and Directly Observed Procedural Skills (DOPS) as tools for assessing the clinical competence of interns.
Materials and Methods: This prospective interventional study was conducted in the Department of Ophthalmology, Punjab Institute of Medical Sciences, Jalandhar, Punjab, India, from June 2016 to June 2017. The clinical competence of interns was assessed using mini-CEX and DOPS. One hundred interns each undertook four mini-CEX encounters, each with a different assessor, and at least one DOPS for refraction was conducted per intern. If the result was unsatisfactory, further DOPS encounters were undertaken until performance was satisfactory. Grading was done against a checklist of key points pre-decided by the assessors, with evaluation on a scale of 1-9 (1-3 unsatisfactory, 4-6 satisfactory, 7-9 superior). Satisfaction with the mini-CEX assessment process was also graded on a nine-point scale, as was the interns' overall performance in the refraction procedure on DOPS. Feedback on the conduct and acceptability of the assessment tools was collected from the interns and assessors at the end of their posting using a pre-validated questionnaire. All results were computed using SPSS software (version 22.0). Average scores of all interns in each subcompetency of mini-CEX and DOPS were recorded, and the progression of scores was followed from the first to the fourth mini-CEX. Scores on mini-CEX 1 versus mini-CEX 4 were compared using ANOVA with post-hoc Tukey's test, as sketched below.
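As a rough illustration of the statistical comparison described above, the following Python sketch runs a one-way ANOVA across the four mini-CEX encounters followed by Tukey's post-hoc test. The study itself used SPSS (version 22.0); the scores generated here are hypothetical placeholders, not the study's data, and the variable names are illustrative assumptions only.

# Minimal sketch: one-way ANOVA across the four mini-CEX encounters,
# followed by Tukey's post-hoc test (covers the mini-CEX 1 vs 4 contrast).
# Scores are hypothetical placeholders on the 1-9 scale, NOT study data.
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
# Hypothetical 9-point scores for 100 interns on mini-CEX 1..4
scores = {i: rng.integers(3, 10, size=100) for i in range(1, 5)}

# Omnibus one-way ANOVA across the four encounters
f_stat, p_value = f_oneway(*scores.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

# Post-hoc Tukey HSD on all pairwise comparisons
all_scores = np.concatenate(list(scores.values()))
groups = np.repeat([f"mini-CEX {i}" for i in scores], 100)
print(pairwise_tukeyhsd(all_scores, groups, alpha=0.05))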
Results: A total of 100 interns undertook 400 mini-CEX encounters (four per intern) and 160 DOPS encounters (at least one per intern). Across all four mini-CEX encounters (n=400), grading was satisfactory in 76.25% (305) and unsatisfactory in 23.75% (95). Of the 160 DOPS, 62.5% (100) were graded satisfactory and 37.5% (60) unsatisfactory. Mean satisfaction with the mini-CEX assessment process was 8.0 among interns and 7.7 among faculty. Mean overall performance in DOPS was graded 6.2.
Conclusion: The mini-CEX and DOPS are useful tools for assessing interns' competence and for further improving their clinical skills. It is feasible to use these assessment methods in an ophthalmology clinical setting, and they serve as an effective means of giving feedback.