Structured Program For Final-Year Undergraduate Students To Improve Clinical Skills To Prepare For Effective Patient Management
K L Karanth, S Kanagasabai, S B Ibrahim, M Najimuddin, D K Marasinghe, S De
Keywords
clinical evaluation exercise (CEX), directly observed procedural skills (DOPS), mini-clinical evaluation exercise (mini-CEX), workplace-based assessment (WPBA)
Citation
K L Karanth, S Kanagasabai, S B Ibrahim, M Najimuddin, D K Marasinghe, S De. Structured Program For Final-Year Undergraduate Students To Improve Clinical Skills To Prepare For Effective Patient Management. The Internet Journal of Gynecology and Obstetrics. 2015 Volume 19 Number 1.
DOI: 10.5580/IJGO.28355
Abstract
Objective: The aim of this study was to assess the mini-Clinical Evaluation Exercise (mini-CEX) and Directly Observed Procedural Skills (DOPS) in terms of their ability to prepare final-year undergraduate students for effective patient management.
Methods: This prospective study assessed the effect of the intervention (mini-CEX and DOPS) on students' performance in the end-of-posting clinical examination. Final-year obstetrics and gynecology students who consented to participate during their 6-week posting were formatively assessed (study group) with the mini-CEX and DOPS by four faculty members using four different clinical cases and procedures (Papanicolaou smear, high vaginal swab, Pipelle sampling, and intrauterine contraceptive device insertion). The internal consistency of the assessment forms was determined, and the correlations of the traditional Clinical Evaluation Exercise (CEX) scores in the study group with the mini-CEX and DOPS scores were analyzed.
Results: Traditional clinical examination scores were significantly higher in the study group (52.93 ± 3.51) than in the control group (47.72 ± 5.46). Moderate positive correlations were observed between traditional clinical examination scores and both mini-CEX scores (r = 0.58) and DOPS scores (r = 0.50). The reliability coefficients of the mini-CEX, DOPS, and self-assessment form were 0.915, 0.847, and 0.949, respectively.
Conclusions: Use of the mini-CEX and DOPS as formative assessment techniques with multiple encounters assesses students in a much broader range of clinical situations than does the traditional clinical examination and offers students greater opportunities to prepare to practice confidently in real situations.
Introduction
Clinical skills of undergraduate medical students are traditionally assessed at the end of their clinical rotation (the Clinical Evaluation Exercise (CEX)). In clinical skills tests, students gather patient information and examine the patient at the bedside, then present a diagnosis and treatment plan to the assessor, who evaluates the student's performance. This traditional method involves a one-time assessment and does not address competencies such as communication skills and bedside manner, despite the fact that they are essential for clinical practice in real clinical scenarios. The traditional examination method is therefore incomplete. It is also conducted in an unsupervised manner, is based mainly on theoretical knowledge, and is summative. Assessment should evolve from the use of numerical scores toward measuring the standard expected at the end of a period of training. In 1990, Miller1 proposed a framework for assessing clinical competence. In this framework, he distinguished between what occurs in practice in the higher strata of a pyramid (action and performance) and the actual knowledge base in the lower strata (competence and knowledge). Professional performance in real clinical scenarios occupies the top stratum of Miller's competence assessment framework.
Workplace-based assessment (WPBA) methods assess professional performance and collect information about trainees' performance in normal practice. The roles of physicians and the key competencies associated with each role are well established.2,3 These competencies serve as an indicator of the quality of doctors for the public, patients, and the doctors themselves, and they are assessed in the workplace using various methods.4 Because learning is the key purpose of assessment, great interest in workplace-based training of medical students has recently developed. Formative assessment is designed to be an ongoing part of the instructional process that supports and enhances learning through feedback.5 WPBA is a formative assessment technique in which feedback regarding the student's performance is provided to minimize the difference between the desired and actual performance. The feedback is specific, and WPBA is a two-way process in which trainers provide comments and encourage trainees to self-reflect on their performance.6,7 Numerous assessment methods that are suitable for providing feedback based on observation of trainee performance in the workplace have been developed.8 Such assessment methods must be reliable, valid, feasible, and acceptable, and should have an educational impact on practice.9 Assessment drives learning. A well-designed, well-implemented WPBA can promote learning and help students to develop the skills, behavior, and attitudes necessary to become competent doctors in real clinical scenarios.10 WPBA is needed in place of one-time assessment of undergraduate students to help them become effective house officers. Students may be better prepared for housemanship if they are encouraged to practice with a conscious effort toward improving their knowledge, skills, and attitude regarding comprehensive management of common conditions and skilled performance of common procedures in the field of obstetrics and gynecology. Faculty can contribute to this effort by using multiple educational tools (e.g., assessment in clinical skills laboratories) and providing feedback so that students can confidently practice in real-life scenarios. Effective feedback from a credible source has a substantial impact on clinical practice.11 The aim of the present study was to assess the effectiveness of two WPBA tools, the mini-CEX and directly observed procedural skills (DOPS), through formative assessment of final-year obstetrics and gynecology students to prepare them for effective patient management and house officer duties.
Methods
This prospective study involved final-year students during their last 6 weeks in the obstetrics and gynecology department. Before the study began, a focus group discussion about WPBA was conducted with the students, and they were familiarized with the educational and performance assessment tools used in this study (mini-CEX and DOPS). The mini-CEX and DOPS were demonstrated to the students prior to implementation of the study. Hospital faculty members were familiarized with the formative assessment technique using the mini-CEX and DOPS tools during a departmental meeting, and written consent was obtained from interested individuals who agreed to participate in the study. The mini-CEX form prepared by the American Board of Internal Medicine was used. The DOPS and self-assessment forms were constructed with input from both the students and faculty, and the items in these forms were contextualized (Appendices A and B).
For each mini-CEX encounter, one faculty member observed the student conduct a focused interview or physical examination in an inpatient or outpatient setting. After asking the student about their diagnostic or therapeutic decisions, the faculty member completed the rating form (www.annals.org) and provided feedback. For each encounter, the faculty member recorded the date; the complexity of the patient's problem (low, such as preeclampsia, anemia complicating pregnancy, fibroid uterus, or endometriosis; moderate, such as twin pregnancy, diabetes complicating pregnancy, pelvic inflammatory disease, or carcinoma of the cervix; or high, such as hypertension and diabetes complicating pregnancy, epilepsy complicating pregnancy, thrombophilias complicating pregnancy, carcinoma of the endometrium, or carcinoma of the ovary); the setting (outpatient or inpatient); the number of minutes spent observing the encounter; and the number of minutes spent giving feedback. The student also noted whether the focus of the encounter was data gathering, diagnosis, treatment, or counseling. The faculty member rated the student on interviewing skills, physical examination skills, professionalism, clinical judgment, counseling, organization and efficiency, and overall competence. All students were assessed on all items of the mini-CEX. The faculty member also rated his or her own satisfaction with the method as a valid and efficient assessment device on a nine-point scale on which 1 was "dissatisfied" and 9 was "very satisfied."

For each DOPS encounter, the student demonstrated a procedure in the skills laboratory. The faculty member rated the student on explaining the procedure, obtaining informed consent, preprocedural preparation, demonstrating asepsis, level of procedural skill, consideration of the patient during the procedure, seeking senior help when encountering a problem, communication with the patient, providing postprocedural instructions, and overall competence. For each encounter, the faculty member also recorded the number of minutes spent observing the encounter and the number of minutes spent giving feedback.

For the self-assessment, the students assessed their own understanding of the varied presentations of diseases, complications, and treatment modalities; ability to enquire in detail about symptoms, gather and prioritize essential and accurate information, and perform a thorough physical examination; use of a clinical investigatory and analytical thinking approach to clinical situations to generate a diagnosis and formulate a management plan for common diseases; ability to independently perform procedural skills in a skills laboratory; ability to communicate clearly with patients and with health professionals; maintenance of ethical practice in working with health care personnel and with patients; and overall performance. A nine-point Likert scale (on which 1 to 3 was "unsatisfactory," 4 to 6 was "satisfactory," and 7 to 9 was "above expected") was used for the self-assessment.
The assessment forms were prepared based on a nine-point Likert scale because this scale exhibits higher sensitivity and accuracy in measuring continuous variables than do lower-point scales (5-point or 7-point Likert scales).11 Permission to conduct this study was obtained from the ethics committee of the institute.
Sixty-four students were included in the control group, and 65 students were included in the study group. Each student in the study group was assessed with the mini-CEX four times by four different faculty members in either the outpatient or the inpatient department; four different cases per student were selected by consensus. In the DOPS assessment, procedures commonly performed during housemanship in an obstetrics and gynecology rotation (Papanicolaou smear, high vaginal swab, Pipelle sampling, and intrauterine contraceptive device insertion) were performed on a mannequin in a clinical skills laboratory under observation by different faculty members. Feedback was provided on the performance, and each student's rating was assigned by mutual consensus. Students from both groups underwent the traditional bedside clinical examination, and the performance of the two groups was compared. During the bedside clinical examination, theoretical knowledge of the DOPS procedures and various obstetric emergencies was discussed. The traditional clinical examination was conducted by faculty members blinded to the students' participation in the formative assessment. In the traditional bedside clinical examination, cases were allotted to the students at a stipulated time, and the students were assessed with respect to their performance of clinical skills, competency in case management, knowledge of theoretical aspects of obstetric emergencies, and performance of gynecological outpatient procedures (Papanicolaou smear, high vaginal swab, Pipelle sampling, and intrauterine contraceptive device insertion). Soft skills were not assessed, and one faculty member assessed one student per patient. All students underwent a final university examination. A self-assessment form with feedback from the participating students in the study group was collected to assess changes in knowledge, clinical skills, and attitudes, as well as satisfaction, before and after the intervention (at the beginning of the rotation, at the end of the rotation, and after the final examination).
Statistical analysis
Differences in traditional examination scores between the study and control groups were analyzed using Student's t-test. The internal consistency of the assessment forms was evaluated using Cronbach's alpha. Pearson's product-moment correlation and regression analyses between the traditional clinical examination scores and the mini-CEX and DOPS scores were also performed. Self-assessment scores before and after the intervention were compared using Student's t-test. Statistical significance was set at p < 0.01. Statistical analysis was conducted using PASW Statistics for Windows, version 18.0 (SPSS Inc., Chicago, IL).
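For readers without access to SPSS, the same analysis pipeline can be reproduced with open-source tools. The following Python sketch is a minimal illustration, not the authors' SPSS procedure: the score arrays are simulated from the summary statistics reported in the Results, and the mini-CEX array and all variable names are hypothetical.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated scores matching the reported group summaries
# (study: 52.93 +/- 3.51, n = 65; control: 47.72 +/- 5.46, n = 64);
# the actual per-student scores were not published.
study = rng.normal(52.93, 3.51, size=65)
control = rng.normal(47.72, 5.46, size=64)

# Student's t-test for the between-group difference in traditional
# clinical examination scores (significance threshold p < 0.01).
t_stat, p_val = stats.ttest_ind(study, control)

# Hypothetical per-student mini-CEX means, for the correlation and
# regression analyses against traditional examination scores.
mini_cex = rng.normal(7.0, 1.0, size=65)
r, p_r = stats.pearsonr(mini_cex, study)

# Simple linear regression of traditional scores on mini-CEX scores;
# with one predictor, r-squared equals the square of Pearson's r.
slope, intercept, r_value, p_reg, stderr = stats.linregress(mini_cex, study)

print(f"t = {t_stat:.2f}, p = {p_val:.4f}")
print(f"r = {r:.2f}, r^2 = {r_value**2:.3f}")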
Results
Table 1 summarizes the setting, degree of difficulty, focus, duration of assessment, and time required for feedback using the mini-CEX and DOPS. In total, 90.2% of the mini-CEX assessments were conducted in the inpatient department, and the remainder were conducted in the outpatient department. The degree of difficulty of the cases assessed by the mini-CEX was low for 69.5% of the cases, moderate for 24.6%, and high for 5.9%. All students in the study group performed data gathering, diagnosis, and management, whereas 48.82% performed patient counseling. The time required (mean ± standard deviation) for assessment using the mini-CEX and for subsequent feedback was 22.94 ± 3.88 and 5.73 ± 2.20 minutes, respectively, whereas that required for assessment using DOPS and for subsequent feedback was 4.29 ± 1.23 and 1.92 ± 0.65 minutes, respectively.
Table 2 compares the traditional clinical examination scores between the study and control groups. A significant difference was observed in the scores between the two groups, suggesting that the use of the mini-CEX and DOPS impacted the students’ performance.
Table 3 shows the reliability coefficients of the mini-CEX, DOPS, patient questionnaire, and self-assessment form. Strong internal consistency was present among the items. An item would have been deleted if the Cronbach's alpha calculated with that item removed had exceeded the overall Cronbach's alpha; no item met this criterion, so none were deleted from the questionnaire.
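For reference, Cronbach's alpha and the item-deletion check described above can be computed directly from the item-score matrix. The sketch below is an illustration only, using a simulated nine-point ratings matrix rather than the study data; the dimensions (65 students by 7 mini-CEX items) mirror the study design but the values are synthetic.

import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical nine-point ratings: 65 students x 7 mini-CEX items,
# built from a per-student base level plus small per-item noise.
rng = np.random.default_rng(1)
base = rng.integers(4, 9, size=(65, 1))
ratings = np.clip(base + rng.integers(-1, 2, size=(65, 7)), 1, 9).astype(float)

print(f"overall alpha = {cronbach_alpha(ratings):.3f}")

# Alpha-if-item-deleted: an item is a deletion candidate when removing
# it would raise alpha above the overall value.
for j in range(ratings.shape[1]):
    reduced = np.delete(ratings, j, axis=1)
    print(f"item {j}: alpha if deleted = {cronbach_alpha(reduced):.3f}")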
Table 4 shows the results of the correlation and regression analyses of the mini-CEX and traditional clinical examination scores, as well as of the DOPS and traditional clinical examination scores. Pearson's product-moment correlation coefficient showed a moderate positive correlation between the traditional clinical examination scores and the mini-CEX (r = 0.58, p < 0.001) and between the DOPS and the traditional clinical examination scores (r = 0.50, p < 0.001). Regression of the mini-CEX on the traditional clinical examination scores indicated a linear and statistically significant positive relationship between the two variables (r² = 0.332, adjusted r² = 0.321, p < 0.001). Regression of DOPS on the traditional clinical examination scores also indicated a linear and statistically significant positive relationship (r² = 0.246, adjusted r² = 0.234, p < 0.001). Scatter plots of the regressions of the mini-CEX and of DOPS on the traditional clinical examination scores are shown in Figures 1 and 2.
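As a consistency check on these regression statistics, the adjusted r² follows from the standard formula (shown here in LaTeX notation), assuming n = 65 study-group students and p = 1 predictor:

\text{adjusted } R^2 = 1 - (1 - R^2)\,\frac{n - 1}{n - p - 1}

For the mini-CEX, 1 − (1 − 0.332) × 64/63 ≈ 0.321, and for DOPS, 1 − (1 − 0.246) × 64/63 ≈ 0.234, both matching the reported adjusted values.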
Table 5 compares the pre-intervention self-assessment scores with the post-intervention and post-final examination self-assessment scores. Statistically significant differences were observed between the post-intervention and pre-intervention self-assessment scores as well as between the post-final examination self-assessment and pre-intervention self-assessment scores. These findings suggest that the intervention used in this study impacted both short- and long-term clinical competency.
The mean intervals between commencement of the mini-CEX and of DOPS and commencement of the traditional clinical examination were 21 (range, 5-32) and 16 (range, 7-23), respectively.
Discussion
The mini-CEX and DOPS were selected as intervention tools for the formative assessment of cognitive and practical skills in undergraduate medical students. The significant improvement in traditional clinical examination scores in the study group suggests that the intervention was effective. The faculty was blinded to the identity of the students in the study and control groups during the traditional clinical examination; therefore, there was no bias during the evaluation. Hence, comparison of the traditional clinical examination results between the two groups provided a good assessment of the success of the intervention using the mini-CEX and DOPS. In the present study, we opted for a nine-point scale over a five-point scale because nine-point scales appear to provide more accurate scores and better educational assessment, although interrater reliability is similar for nine- and five-point scales.11 Feedback, which is a core component of formative assessment, promotes students' learning by informing them of their progress, giving them resources for improvement, and motivating them to engage in appropriate learning activities and acquire reasoning skills through reflective practice.5 Formative assessment builds confidence as students face examinations and engage in practice. It also facilitates a better academic understanding between faculty members and students. Initial anxiety was observed during the mini-CEX assessment, but multiple encounters provided insight into students' competency and benefited their preparation for and successful completion of the final examination.7 Translation of effective feedback into practice was evidenced by the improvements in the traditional clinical examination and self-assessment scores in the present study. The benefits of WPBA include one-to-one training, the ability to identify learning needs, and the ability to demonstrate concrete evidence of progress in training and learning. For each WPBA, the educational value of the assessment is increased when the purpose of the particular assessment is clearly defined and understood by both faculty and student. A specific WPBA objectively structured for each technical skill should be utilized. For difficult cases, competency is not the same as expertise; WPBA is used only to assess baseline competency, not mastery of a given technique.12
Few studies on the use of the mini-CEX and DOPS in undergraduate students have been performed.10,12 In the present study, the mini-CEX and DOPS were selected as tools for formative assessment because the clinical and practical scenarios in which the undergraduates were assessed are very similar to their working conditions during housemanship after completion of undergraduate medical education. The mini-CEX and DOPS involve direct observation of the interaction between a student and a patient (in this study, DOPS used mannequins to simulate real-life situations). The mini-CEX can be conducted in any setting (outpatient, inpatient, or emergency). Assessment of the clinical interaction between the student and patient involves the observation of humanistic qualities; such assessment is lacking in traditional clinical assessment.13 The multiple-encounter mini-CEX is superior to the traditional CEX as an assessment tool, and its measurement characteristics are similar to those of other performance assessment techniques such as the use of standardized patients.14 The traditional CEX assesses a student's performance in dealing with one patient, during one encounter, with one examiner.15 In the mini-CEX, however, many encounters with different assessors and different patients provide students the opportunity to assimilate knowledge and develop skills. The mini-CEX assesses the student's ability to focus and prioritize diagnoses and management plans in the real context of clinical practice.14 Compared with other formats such as the use of standardized patients, the mini-CEX has higher accuracy, is conducted in a real setting, is more feasible, and is less expensive.16,17 In the present study, the reliability coefficient of the mini-CEX showed significantly greater internal consistency among four raters than in a previous study, which reported a reliability coefficient of 0.8 among 8 to 14 raters.11 This may be explained by the involvement of experienced, trained faculty members in the formative assessment. The primary purpose of the mini-CEX is to provide an opportunity to observe the student's clinical skills; it is not used in high-stakes assessments and should not be used to rank or compare students.17 A study that aimed to estimate the validity and reliability of the undergraduate mini-CEX and the challenges involved in its implementation observed that the mini-CEX had limited reliability because of faculty stringency and limited validity because of faculty examiner status; case complexity, attachment specialty, and faculty focus together accounted for 29% of the variation in scores. In the same study, stakeholders were interviewed, and the majority felt that the mini-CEX was more reliable and valid than the previous long case.7 DOPS involves the observation of students performing common gynecological procedures. No previous reports have described the use of DOPS in undergraduate students. In the present study, DOPS was conducted in the students' skills laboratory in view of the legal issues involved in undergraduate students performing procedures on patients. Performance on mannequins may not be as accurate as working on real patients, but it fulfills the requirements of the upper strata of Miller's pyramid. The reliability coefficient of DOPS was comparable with those reported in other studies,12,18 although these previous studies involved postgraduate students.
The significant Pearson's product-moment correlation and regression between the mini-CEX and CEX scores indicate the ability to distinguish between students' levels of experience and the presence of good construct validity. A similar observation was made between DOPS and the CEX. However, a limitation is that the assessment of the DOPS procedures during the CEX was theoretical in the present study. The significant improvement between the pre-intervention and post-final examination self-assessment scores suggests a beneficial educational impact of the intervention.4,19 It also implies that the intervention helped students to understand and retain knowledge. This study is an interventionist experimental model that has established that WPBA tools such as the mini-CEX and DOPS make a difference in the knowledge, attitude, and skills of the learner.20 This conclusion was further reinforced by the students' self-assessment, which emphasized the benefits of the intervention. The strengths of WPBA are formative observation, interactive educational feedback, and reflection by both faculty and students; however, WPBA should not be relied upon to certify competence. Its results should be interpreted alongside those of other assessment methods when assessing professional competency.7
Strengths and limitations of the study
The mini-CEX and DOPS are educational tools that formatively assess undergraduate students and can be put into practice because of their feasibility, cost-effectiveness, and long-term educational impact. The students were randomly selected for inclusion in the study group and were aware of the type of tools by which they were assessed. In DOPS, the students were aware of the type of procedure to be assessed before they performed it, allowing them to prepare well in advance of the assessment. Long-term assessment is necessary at the end of an obstetrics and gynecology posting during housemanship to determine the impact of the posting on practice. It would have been ideal to compare the final examination scores between the control and study groups because this would have revealed the intermediate-term effect of the intervention; however, because of the confidentiality of individual students' scores in comprehensive examinations, we were unable to compare final-year performance between the two groups and thus could not assess this effect. The present study focused mainly on the impact of the intervention on examination performance and scores. Performance in real-life situations and feedback from consultants may be considered to evaluate the long-term impact of this intervention.
Conclusion
WPBA can feasibly be implemented in undergraduate formative assessment, and it focuses on both the process and the outcome of learning. The multiple-encounter mini-CEX and DOPS are superior to the traditional CEX because they are validated, more reliable, more feasible, and less expensive and have a greater educational impact. The mini-CEX and DOPS showed a good level of internal consistency and significant correlations with traditional clinical examination scores, with a long-term impact. The present program improved the clinical skills of final-year undergraduate students during their obstetrics and gynecology rotation to prepare them for effective patient management, indicating that WPBA can replace the existing traditional clinical examination.
Acknowledgements
The authors thank the management staff of Melaka Manipal Medical College, Melaka, Malaysia for permitting completion of the study. The authors also thank Prof. Tejinder Singh, Prof. Dinesh K Badyal, Prof. Jugesh Chhatwal, and Dr. Sarabmeet Singh Lehl from CMCL-FAIMER Regional Institute, Christian Medical College, Ludhiana, India for guidance in completing the study.