The effect of a grading incentive and a problem-specific mobile electronic clinical evaluation tool (eCEX) in the direct observation of medical students’ clinical competencies: A pilot study
G Ferenchick, S Sneed, D Solomon, A Mohmand
Keywords
clinical clerkship, clinical competence/standards, computers, handheld, health care, internal medicine, medical, observation, quality assurance, students
Citation
G Ferenchick, S Sneed, D Solomon, A Mohmand. The effect of a grading incentive and a problem-specific mobile electronic clinical evaluation tool (eCEX) in the direct observation of medical students’ clinical competencies: A pilot study. The Internet Journal of Medical Education. 2009 Volume 1 Number 2.
Abstract
Background: Direct observation of medical students’ clinical skills by faculty is uncommon, yet such observation enhances the validity of medical students’ clinical performance evaluation. Objectives: We tested the effect of a grading incentive combined with a mobile, competency-specific assessment tool (eCEX) involving eleven faculty members and twelve students during their internal medicine clerkship. Methods: Six students were required to use the eCEX to document ten discrete clinical observations (e.g. performing an abdominal exam in a patient with abdominal pain), while the other six did not use the eCEX and were instead required to complete a single observed comprehensive history and physical examination. Results: The average number of direct observations per student, per 8-week clerkship, increased from 14.7 ± 27 to 26.2 ± 16 (p = 0.032). Observation of a focused physical examination accounted for the majority of the directly observed encounters. Students and faculty both agreed that the eCEX helped them understand which problem-specific competencies were targeted for assessment. Both faculty and students rated the program as "easy to use", and faculty generally agreed that the eCEX improved their ability to provide feedback to students. Conclusions: The eCEX holds promise for facilitating direct observation and evaluation of medical students’ clinical skills.
Background
Direct observation and assessment of a medical student’s clinical skills is not a routine practice. Holmboe noted, in 2004, that concern about this situation had been simmering since the mid-1970s.[1] Several subsequent studies endorsed the view that a significant number of medical students are never observed while they interview and examine patients in real clinical settings.[2, 3] Several factors combine to create this problem. Lack of awareness on the part of clinical faculty of what constitutes the expected level of competence of a medical student is an important element.[4, 5] Also important is the lack of defined learning outcomes that are equally transparent to both faculty and students.[4, 5] Finally, when faculty do evaluate a student’s clinical skills, they often infer a rating from factors other than direct observation, such as the case presentation.[3, 6]
Direct observation of students’ clinical skills enhances the validity and objectivity of faculty ratings.[7] We therefore saw the need to develop technology that makes direct, objective, and transparent observation practical and accessible to both faculty and students in a structured format.
Methods
We designed a web-based authoring tool that allows faculty to efficiently develop content (e.g. curricular objectives, problem-specific assessment tools, logs) for use on mobile devices. The user can upload data (e.g. educational logs, competence assessments) to a central database. This mobile program has been an integral part of the internal medicine clerkship at Michigan State University for the past two years. The Clerkship Directors in Internal Medicine (CDIM) curriculum was adapted for real-time use on mobile devices in dynamic training environments.[8] Screen shots of the curricular interface are shown in Figure 1.
Figure 1
The 5 screen shots above show the home page as it appears on a mobile device and the organization of the curriculum and specific learning objectives as they pertain to abdominal pain – history taking. The first shot demonstrates the integration of curriculum, log, and assessment tools, all organized around the core training problems. The micro-CEX represents the newest addition to this program.
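As a rough illustration of the authoring model described above, one core training problem and its learning objectives might be represented as structured content like the sketch below. The schema, field names, and objective wording are illustrative assumptions, not the actual eCEX format.

```python
# Hypothetical content record for one core training problem, as a faculty
# author might create it in the web tool for delivery to a mobile device.
# The schema and field names are assumptions, not the actual eCEX format.
abdominal_pain = {
    "problem": "Abdominal pain",
    "objectives": {
        "history_taking": [
            "Characterize the onset, location, duration, and quality of pain",
            "Screen for alarm features such as GI bleeding or weight loss",
        ],
        "physical_examination": [
            "Inspect, auscultate, percuss, and palpate the abdomen",
            "Assess for peritoneal signs and organomegaly",
        ],
    },
    # Curriculum, logs, and assessment tools are integrated per problem.
    "tools": ["curriculum", "log", "assessment"],
}
```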
From this interface, students choose problem-specific performance objectives, which are displayed as electronic checklists representing specific competencies, such as the physical examination of a patient with abdominal pain. The device is then handed to a faculty member who, guided by the electronic checklist, evaluates the student’s interaction with a real patient. Figure 2 shows the program, which we have termed the eCEX. The program not only generates an electronic checklist but can also capture free-text faculty feedback related to a specific skill. An electronic record of the student’s performance is then captured on the mobile device and uploaded to a central database for administrative purposes.
Figure 2
The 6 screen shots above demonstrate the use of the pull-down menu to specify the competency to be assessed and the electronic checklists used to assess the student’s competencies and understanding of specific abnormalities related to the condition. Faculty can document free-text feedback. A competency registry of the student’s accomplishments (color coded, with green meaning the competency was “well done” and yellow meaning it “needs improvement”) is displayed on the mobile device and uploaded to the student’s administrative log.
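To make the observation workflow concrete, the sketch below models a single eCEX record: a problem-specific checklist scored as “well done” or “needs improvement”, optional free-text feedback, and serialization for upload to the central database. All class names, fields, and example data are hypothetical illustrations, not the actual eCEX implementation.

```python
import json
from dataclasses import dataclass, field, asdict

# The two-level rating mirrors the color-coded registry described above:
# green = "well done", yellow = "needs improvement".
RATING_COLORS = {"well_done": "green", "needs_improvement": "yellow"}

@dataclass
class ChecklistItem:
    competency: str     # e.g. "Palpates all four quadrants"
    rating: str         # "well_done" or "needs_improvement"
    feedback: str = ""  # optional free-text faculty feedback

@dataclass
class EcexObservation:
    student_id: str
    evaluator_id: str
    problem: str            # core training problem, e.g. "Abdominal pain"
    competency_domain: str  # e.g. "Focused physical examination"
    items: list[ChecklistItem] = field(default_factory=list)

    def to_upload_payload(self) -> str:
        """Serialize the record for upload to the central database."""
        return json.dumps(asdict(self))

# Example: one focused abdominal-exam observation (hypothetical data).
obs = EcexObservation(
    student_id="S-01",
    evaluator_id="F-07",
    problem="Abdominal pain",
    competency_domain="Focused physical examination",
    items=[
        ChecklistItem("Palpates all four quadrants", "well_done"),
        ChecklistItem("Assesses for rebound tenderness", "needs_improvement",
                      feedback="Explain the maneuver to the patient first."),
    ],
)
payload = obs.to_upload_payload()  # sent to the administrative database
```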
Pilot Study
Twelve third-year medical students completing their internal medicine clerkship participated in this pilot study of the eCEX during the academic year 2007-2008. Six took part as the “eCEX” group and the other six as the “no eCEX” group. Those in the eCEX group initiated their own direct observations with faculty and were required to arrange and document ten discrete problem-focused eCEX evaluations. Students in the “eCEX” group received one hour of orientation to the entire software program, including the eCEX. Students in the “no eCEX” group were observed by faculty while each elicited a single comprehensive history and carried out a single comprehensive physical examination, as arranged by the clerkship administration. All other observations were discretionary for both groups.
Our primary outcome data included the number of patient encounters that were directly observed by attending and resident faculty during the eight-week clerkship, as reported by the students in an end-of-clerkship online survey. We compared the responses of the eCEX and no eCEX groups to six specific questions concerning direct observation of focused and full physical examinations, focused and full history-taking, and any directly observed patient education and patient counseling interactions.
Our secondary outcome data, from another online survey, included the eCEX students’ and the evaluating faculty’s perceptions of the technical aspects and educational value of the eCEX. Data were analyzed with SPSS (v 17). In view of the small number of study subjects and the non-normal distribution of the data, we used the Mann-Whitney test for differences between the “eCEX” and “no eCEX” groups on continuous outcome variables. Since we were testing only for an increase in the number of directly observed encounters, a one-tailed test was employed. The level of significance was set at p < 0.05.
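For readers who wish to reproduce this analysis outside SPSS, the snippet below shows an equivalent one-tailed Mann-Whitney test in Python with SciPy. The per-student counts are placeholders, not the study data.

```python
from scipy.stats import mannwhitneyu

# Placeholder per-student counts of directly observed encounters;
# the actual study data are not reproduced here.
ecex    = [31, 18, 45, 22, 9, 33]   # hypothetical "eCEX" group (n = 6)
no_ecex = [4, 10, 2, 25, 7, 40]     # hypothetical "no eCEX" group (n = 6)

# One-tailed test: are the eCEX counts stochastically greater?
stat, p = mannwhitneyu(ecex, no_ecex, alternative="greater")
print(f"U = {stat:.1f}, one-tailed p = {p:.3f}")  # significant if p < 0.05
```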
Results
The total number of direct observations was 158 for the eCEX group (average 26.2 ± 16 per student) over the eight-week clerkship, compared with 14.7 ± 27 per student over the same period for the no-eCEX group (p < 0.05).
The eCEX group reported an average of 12.3 direct observations per student for a “focused physical examination” and 6.7 for “focused history-taking”, compared with 3.3 and 4.8, respectively, for the non-eCEX group (p = 0.05) (Table 1).
All 6 of the eCEX students answered the online survey concerning the usability and educational usefulness of the eCEX, as did 11 of the 20 evaluating faculty. Regarding educational utility, students “strongly agreed” that the eCEX improved their ability to identify the specific history and physical examination competencies they needed to know and demonstrate. They appreciated having to perform multiple small observations, as well as the ability to choose which encounters they wished to have observed (Table 2).
Faculty also agreed that the eCEX enhanced their ability to identify the specific history and physical examination competencies they had to evaluate. They further reported that it improved their ability to assess students’ competencies and that they felt better able to provide feedback to the students they evaluated.
Students and faculty found the eCEX program technically easy to use. Training of faculty by students in the use of the software, just prior to evaluation, took an average of 6 minutes, and faculty spent an average of 20.3 minutes evaluating a student. Students found it harder, on average, to schedule attending faculty for direct observation than residents (Tables 2 & 3).
Discussion
This pilot study has led us to conclude that use of the eCEX is educationally valuable to students and faculty. First, it provided both students and faculty with a specific, structured format for the direct observation of students’ competencies. Students and faculty alike agreed, or strongly agreed, that it helped them understand the targets of the assessment. Consequently, the eCEX may help eliminate one barrier to the valid assessment of students’ clinical skills, namely that faculty evaluators are commonly unaware of the level of performance expected of students.[5] Secondly, the number of directly observed student-patient encounters reported by the non-eCEX group was surprisingly high, averaging 15 per student over the eight weeks of clerkship, which exceeds figures reported in previous studies.[3] Nevertheless, the eCEX group reported a two-fold increase in the number of directly observed student-patient encounters, mostly in focused history-taking and physical examination. Thirdly, the eCEX was found easy to use and imparted a sense of self-efficacy to the evaluating faculty, both in providing feedback to students and in assessing students’ specific competencies. Students also appreciated the freedom to choose the specific competencies in which they were assessed. Finally, on the negative side, arranging faculty or resident observations was not consistently easy.
Limitations
This pilot study was not without drawbacks. It was a single-center study in which a small number of students and faculty participated, so our results may not be generally applicable. Allocation of students to the two groups was convenience-based rather than random, which may have introduced bias due to the personal characteristics of students in each group; students in the no eCEX group, for example, may have been less eager to arrange their direct observations. Furthermore, inter-observer variability among our faculty was not determined. Evaluation of such variability is essential if the eCEX is to be employed in high-stakes situations. One study, for example, demonstrated low percentage agreement on many individual items of a standardized checklist among raters viewing three videotaped encounters of emergency medicine residents interacting with standardized patients.[9] Another limitation was that students were obliged to arrange ten direct observations as an essential requirement for passing the clerkship. The key determinant of the primary outcome in the study (viz. the number of directly observed student-patient encounters) was therefore the student’s initiative rather than the use of an electronic device, which, after all, only facilitated the direct observation once it had been arranged.
We have previously demonstrated that mandating clinical exposure through a grading incentive (e.g. seeing an obligatory minimum number of patients with diabetes, congestive heart failure, etc.) is an effective policy for ensuring that all students document exposure to patients presenting with core clinical problems.[10]
We conclude that the use of a grading incentive and the eCEX increases the number of directly observed, focused history-taking and physical examination encounters in the context of student-initiated observations. Feedback from students showed that effective use of the system was hassle-free, with little time commitment on the part of the preceptor or the student. The eCEX enhanced the evaluating faculty members’ sense of self-efficacy in their task, and it was acceptable to faculty and students alike, without any sense of added burden. Almost all users highlighted its ease of use. These features make the eCEX a promising tool for promoting and facilitating faculty observation of medical students’ clinical skills.