Determining Student Satisfaction of a Magnetic Resonance Imaging Curriculum in Preparation for Certification Examination
B Southers, K O Lewis, D E Fleck, S Lee
Citation
B Southers, K O Lewis, D E Fleck, S Lee. Determining Student Satisfaction of a Magnetic Resonance Imaging Curriculum in Preparation for Certification Examination. The Internet Journal of Medical Education. 2014 Volume 4 Number 1.
Abstract
Recent Magnetic Resonance Imaging students at the University of Cincinnati have performed well relative to the national average in certification examination scores and pass rates. However, as no known Magnetic Resonance Imaging curriculum reviews have been performed, there is no scientific evidence demonstrating areas of curricular success and deficiency. A mixed-method study was conducted with previous Magnetic Resonance Imaging students and revealed areas of both curricular satisfaction and needed improvement. Implications include continued use of methods deemed successful and modification of areas needing improvement. A future consideration is the implementation of this study’s instruments by similar allied health educational programs.
INTRODUCTION
- Statement of the Problem
Magnetic Resonance Imaging (MRI) faculty in the Advanced Medical Imaging Technology (AMIT) program have used the American Registry of Radiologic Technologists (ARRT) Certification Examination scores and pass percentage rates as the major index of their effectiveness. These data are valuable because (a) the exams are designed to measure knowledge and skills agreed upon by an expert body to be necessary for a competent working professional, (b) they are objective, and (c) they set a national standard, enabling programs to compare effectiveness across institutions. However, programs have considerable latitude to vary their instruction around the skeleton structure provided by these exams (and the professional bodies that design them), according to their assessment of the needs of students, employers, and patients. Historically, programs have had few evaluation methods for assessing the effectiveness of their curriculum for these larger aims. Tools available have generally been limited to post-graduation employment rates and surveys directed to graduates six months post-graduation. However, the usefulness of these tools is limited. Employment rates are largely based on several non-curricular factors, such as the effect of the economy on available medical imaging employment, and may not accurately measure curricular effectiveness. Post-graduation surveys of student satisfaction can provide pertinent information regarding the curriculum, yet are subject to certain variables such as difficulty locating graduates and poor graduate cooperation in completing the survey, as well as personal factors such as mood. Thus, in addition to the certification exams, which define the structure of an effective program, a more fine-grained and reliable method of curricular assessment is also helpful.
The primary goals of the AMIT MRI curriculum are to provide and foster sound learning opportunities for all incoming students, both didactically and clinically. Didactic instruction places a focus on physics, instrumentation, safety, anatomy, pathology, and physiology, while clinical instruction focuses on educating the student in routine and emergent patient care, proper operation of MRI scanners, protocol selection, and manipulation of MRI scanning techniques. Students successfully completing the MRI curriculum will have fulfilled requirements for the nationally administered board examinations. Successful completion of the certification examination provides the examinee with official ARRT certification in a particular medical imaging modality.
According to the ARRT, certification is defined as “the initial recognition of an individual who satisfies certain standards within a profession”.1 Certification examination scores and pass percentage rates provide quantitative results, so both may be used as markers to gauge didactic curricular success. However, as students may rely on supplemental study information not provided by the curriculum, these data are limited in measuring curricular effectiveness as a whole. Further, as no known curriculum review had been performed prior to this study, there is no scientific evidence or supporting data demonstrating curricular efficacy in relation to AMIT MRI students passing the ARRT MRI Certification Examination. In order to gain insight into which specific factors contribute to program success and which areas should be improved, quantitative and qualitative evaluation is essential. Therefore, a scientific review of the curriculum addressing student satisfaction, areas of curricular success, and areas of needed curricular improvement was needed.
- Purpose/Research Questions
The primary research question for this study was: Is the MRI curriculum effective in preparing students for the ARRT MRI Certification Examination?
The following research questions were examined using both quantitative and qualitative methods:
1) In preparation for the ARRT MRI Certification Examination, what is the overall satisfaction level of students concerning MRI curriculum course content, course materials, delivery methods, and teaching quality?
2) In preparation for the ARRT MRI Certification Examination, what are specific areas of success regarding MRI curriculum course content, course materials, delivery methods, and teaching quality?
3) In preparation for the ARRT MRI Certification Examination, what are specific areas of needed improvement regarding MRI curriculum course content, course materials, delivery methods, and teaching quality?
Determining student satisfaction and locating areas deemed successful and areas of needed improvement within the MRI curriculum can have a direct impact on future curricular coursework, instructor methodology, and curricular objectives. Further, as ongoing evaluation and revision are essential to a program’s success, particularly within allied health programs that require certification, the creation and implementation of a measurement tool focused on curricular efficacy in certification examination preparation is needed.
- Background and Significance
In 1989, the AMIT program was developed at the University of Cincinnati to formally educate students in multiple medical imaging modalities. Since its inception, no formal curriculum review has been performed. Without a formal review and feedback provided by former program graduates, faculty cannot fully understand and improve upon student learning.9 Certain factors regarding curricular development and assessment must also be considered, such as curricular content, format, and objectives.2
Currently in the AMIT program, there are two modalities in which students may choose to study – Nuclear Medicine Technology and MRI. Both curriculum modalities are one year in length and are each instructed by one full-time faculty member. Prior to January 2009, adjunct faculty instructed all didactic and clinical MRI coursework. The current MRI curriculum consists of 15-week “MRI Physics and Instrumentation” and “MRI Anatomy, Pathology, and Physiology” courses that run consecutively in the Autumn, Spring, and Summer semesters. Additionally, three 15-week clinical placement courses are required. Clinical placement requirements are 24 hours per week throughout the curriculum, with the exception of holiday weeks and final examination weeks. MRI faculty responsibilities include creating and delivering all MRI lectures, grading all didactic work, and managing all aspects of the program, including updating curricular materials, performing student clinical site visits, and creating student clinical site rotation schedules.
Additionally, since January 2009, MRI students have been required to take three 200-question mock MRI certification examinations similar in content and question style to the ARRT MRI Certification Examination. Mock examinations were implemented to further prepare MRI students for standardized certification examinations.
REVIEW OF RELATED LITERATURE
The purpose of reviewing the collective literature was to locate any previous MRI curricular review. The review revealed no known MRI curriculum reviews at the time of this research study. This review was performed using databases such as Scopus, PubMed, Questia, and the Education Resources Information Center (ERIC). Keywords used included: “MRI Curriculum”, “MRI Curriculum Review”, “MRI Program Curriculum Review”, “MRI Program Curriculum Evaluation”, “Magnetic Resonance Imaging Program” + “Curriculum Review”, and “Imaging Curriculum Evaluation”.
Review of related literature found information on MRI curriculum standards. According to the “Magnetic Resonance Curriculum” – a formal MRI curriculum collaboratively established by the Association of Educators in Imaging and Radiologic Sciences (AEIRS), the American Society of Radiologic Technologists (ASRT), and the International Society for Magnetic Resonance in Medicine’s (ISMRM) Section for Magnetic Resonance Technologists (SMRT) – a curriculum:
1. Should establish “national, standardized educational guidelines for MRI, including clinical and didactic components”
2. Needs to be “suitable for all programs in this discipline, including limited fellowships, certificate programs and collegiate based education programs”
3. Should “represent current practice and trends in the field”.3
Results from the literature search revealed a program outline created by the Joint Review Committee on Education in Radiologic Technology (JRCERT), the national accrediting body of MRI education programs. The JRCERT document:
1. Requires that accredited MRI programs meet MRI curriculum and academic practice guidelines and measures. Further, certain specific objectives are outlined, such as evaluating program faculty performance to assure that instructional responsibilities are performed.
2. Includes important efficacy standards, such as Standard 5, Objective 5.2. This Standard lists credentialing examination data, job placement data, and program completion rate data as measurements of curricular effectiveness
3. Lists a curricular review process as being of significant importance to program effectiveness as well, as stated in Standard 5, Objective 5.4: “Analysis of student learning outcome data and program effectiveness data allow the program to identify strengths and areas for improvement to bring about systematic program improvement”.
4. Emphasizes that, as part of a program’s assessment cycle, a program should review its assessment plan at least every two years to ensure that “assessment measures are adequate” and “effective in measuring student learning outcomes”.4
Further review found one discussion on curriculum development and assessment, which states:
1. A curriculum review and subsequent development “should be an ongoing process that is responsive to changes”. This is an important consideration, as curricula should not remain static if learning is to improve and student outcomes are to increase.
2. The use of surveys, specifically graduate surveys, is often an indicator used to measure student learning.
3. Surveys of program graduates are integral in obtaining opinions about a program’s learning outcomes, areas of strength, and areas of weakness in their learning.5
Finally, review of literature noted benefits of curriculum revision:
1. Increasing awareness of student needs, understanding current best practices, and “initiatives that affect curriculum”. As a result of curriculum review, decision-making to revise the curriculum should be a shared experience.
2. Changes need to hinge upon data collected and in accordance with the consensus view, rather than the individual preferences of faculty. Thus, collecting and analyzing data from former students are necessary components in the curriculum review and development process.6
METHODOLOGY
A. Study Setting
Two separate study environments were utilized in this study: 1) A Likert-scale questionnaire and open-ended question section were administered anonymously to study participants via the online survey tool, SurveyMonkey. A direct link to SurveyMonkey was provided via e-mail from the principal investigator; 2) A videotaped and audiotaped focus group interview session was conducted on the University of Cincinnati campus, French East Building, Room 207.
B. Participants
Participants were former MRI students at the University of Cincinnati in Cincinnati, Ohio, USA. Inclusion criteria included all previous AMIT MRI students who had taken the ARRT MRI Certification Examination. Exclusion criteria included previous AMIT MRI students who had elected not to take the ARRT MRI Certification Examination at the time of this study. As this study was to evaluate the level of successful preparation for the ARRT MRI Certification Examination and eligibility was based on successful completion of the AMIT curriculum, participants consisted of previous graduates who had completed the ARRT examination. Eighteen of thirty-four total AMIT graduates from 2009-2011 were located and given the study information sheet via e-mail. A total of twelve participants consented to complete the questionnaire and open-ended question sections. All twelve participants were given the focus group information sheet via e-mail. Five participants consented to participate in the focus group.
C. Instruments
Five primary sections consisting of five Likert-scale statements each were developed:
1. Section I: MRI Curriculum Content
2. Section II: MRI Curriculum Materials
3. Section III: MRI Curriculum Delivery Methods
4. Section IV: MRI Curriculum Teaching Quality (Teaching Methods)
5. Section V: MRI Curriculum Teaching Quality (Faculty Skills).
Nine open-ended questions were developed to collect data on study participants’ thoughts and opinions on the AMIT MRI curriculum in preparation for the ARRT MRI Certification Examination. Common terminology for the ARRT examination is the use of the term “registry” rather than “certification”, so “registry” was incorporated in this instrument for improved clarity. Content validity was established through review of the questionnaire instruments by five educators considered expert raters in medical imaging and allied health. Upon completion, five recommended changes were reviewed and subsequently incorporated prior to the beginning of the study.
Based on the online survey findings, thirteen open-ended focus group questions were developed to provide further insight on specifics of the MRI curriculum.
D. Data Collection
Due to the nature and purpose of the study, where all study participants were previous students of the same educational program, a non-probability convenience sampling method was used. Following IRB approval, initial data were collected using a quantitative 5-point Likert-scale and qualitative open-ended questionnaire administered anonymously to study participants using the online survey tool, SurveyMonkey. One 90-minute focus group was led by two researchers on the University of Cincinnati campus, French East Building, Room 207. Discussions were recorded using video and audio recording devices for transcription and data analysis purposes.
E. Data Analysis
Quantitative survey data were transferred from SurveyMonkey to SPSS and Excel for data analysis. Descriptive statistical analysis was performed using SPSS version 19.0.7 The mean, standard error, range, and Cronbach’s Alpha were reported. Although the sample size of this study was relatively limited (n=12), statistical significance was reported at p < 0.05, and internal consistency was reported at α > 0.9 (Cronbach’s Alpha = 0.955), indicating a high level of reliability. Qualitative responses to the open-ended questions were transferred to Excel for analysis. Focus group data were transcribed in Microsoft Word and then transferred to Microsoft Excel for data analysis. Qualitative data analysis was performed using the content analysis hand coding method, “for the purpose of classifying large amounts of text into an efficient number of categories that represent similar meanings”.8 Use of this method allowed for a summative assessment of the qualitative data collected regarding the perceptions participants described about the MRI curriculum.
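For readers who wish to reproduce the descriptive statistics reported here, the following is a minimal sketch in Python with pandas (the study itself used SPSS and Excel); the file name and column layout are hypothetical assumptions, not the study’s actual export.

```python
# Minimal sketch of the descriptive analysis described above, assuming the
# 5-point Likert responses were exported to a CSV with one row per
# participant and one column per statement (e.g., "Q1" ... "Q25").
# The file name and column labels are hypothetical.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's Alpha: (k/(k-1)) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

responses = pd.read_csv("likert_responses.csv")  # hypothetical export

# Per-statement descriptives of the kind reported in Tables 1-5:
# mean, standard error, and range.
summary = pd.DataFrame({
    "mean": responses.mean(),
    "std_error": responses.sem(),
    "range": responses.max() - responses.min(),
})
print(summary)

# Internal consistency across all Likert statements.
print(f"Cronbach's Alpha = {cronbach_alpha(responses):.3f}")
```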
RESULTS OF THE STUDY
Tables 1-5 list all statements, means, and standard errors for the five categories found within the questionnaire. As seen in Table 1 (MRI curriculum content), data regarding useful and pertinent content offered in the MRI curriculum demonstrated the highest score for appropriateness of curricular content (highest overall), while the statement regarding learning from outside the curriculum scored lowest. Interestingly, 100% of participants responded “Strongly Agree” or “Agree” to the statement concerning the efficacy of curriculum content, which contradicts results for the statement regarding learning from outside the curriculum (negatively worded), where 75% of responses were “Strongly Agree” or “Agree”.
Regarding MRI curriculum materials (Table 2), scores were highest regarding the implementation of educational materials to assist in certification preparation, where participants indicated the curriculum contained a sufficient level of pertinent materials, and lowest for visual design materials, assigned readings and assignments, and the quantity/quality of MRI mock-certification materials.
The highest score in MRI curriculum delivery methods (Table 3) was demonstrated in variety of content delivery methods, and the lowest regarding the efficacy of incorporating additional or different curriculum delivery methods. Additionally, this was the lowest score of all 25 individual statements. The ability of delivery methods to stimulate or increase “desire to learn” received the second lowest score in this section, as well as overall.
Regarding overall quality of teaching methods (Table 4), “MRI curriculum teaching objectives were met” scored highest (highest overall). Additionally, the statement regarding faculty commitment within the MRI curriculum scored second highest in this section and second highest overall. While considerably high in comparison to all 25 individual statements, the lowest scores within this section were seen in reference to the sufficiency and efficacy of teaching methods (68% higher than all other statement scores).
The highest scores in the faculty teaching skills section (Table 5) were in regard to faculty ability to communicate their knowledge and experience. The lowest scores were seen regarding faculty respecting and accommodating differing learning styles among students.
As seen in Graph 1 below, section scores were highest in Section IV – Curriculum Teaching Quality (Teaching Methods), and lowest in Section III – Curriculum Delivery Methods.
Several primary coded categories were identified during open-ended data analysis and are presented in Table 6. The highest numbers of participant comments in the open-ended question data were seen in two coded categories: “Effective materials and curricular strengths” and “Areas of needed improvement”. Regarding curricular strengths (27 total comments), several are noted below (the total number of coded comments for each area is listed in parentheses):
1. PowerPoint lectures from instructor (nine)
a. Examples: “The slides that we could print and bring to class to take additional notes on. These are the only things I used for my ARRT exam”; “PowerPoints were most effective”
2. Course delivery methods (nine)
a. Examples: “I think the professor did a great job delivering the material…this definitely showed in my registry performance”; “I think the professor did a great job delivering the material”
3. Instructor abilities (five)
a. Examples: “He seemed to know exactly what was necessary for me to be successful on the ARRT Registry Exam”; “I found the teaching technique to be effective”
4. Course content (three)
a. Examples: “All the topics in the ARRT exams were covered in class very well”; “I think the overall layout of the course benefits students”
Regarding areas of needed improvement (26 comments):
1. Variety in teaching methods (six)
a. Examples: “More variety in teaching material, not strictly PowerPoints”; “visual aids”
2. MRI anatomy and vascular anatomy (four)
a. Examples: “I think more time needs to be spent in the anatomy section”; “MRI anatomy and pathology book would be helpful”
Other coded categories of note included:
1. Student expectations met (12)
a. Examples: “the MRI curriculum met my expectations exceedingly well”; “I believe that my success on the ARRT Board exam was a direct correlation of the curriculum in the MRI AMIT Program”
2. Suggestions for improvement (12)
a. Examples: “I would perhaps change the time students spend in clinical settings depending on how the student is progressing in the program”; “I think changing the book will help prospective students a lot”
3. MRI Physics – most interesting curricular subject (seven)
a. Examples: “the physics of MRI was the most interesting”; “How the biochemistry of the body is ultimately what makes us able to image it”
4. MRI Physics – least interesting subject (five)
a. Examples: “physics in MRI”; “learning the various pulse sequences”
5. MRI Physics – most difficult subject (14)
a. Examples: “k-space”; “time-of-flight”; “pulse sequences”.
Analysis of focus group data found five primary themes, which are shown in Table 7:
1. Study materials (eight comments)
a. Lecture notes (three)
i. Examples: “my studying before was mostly notes”; “I used that for the test and took it a year after I graduated and studied two days before I took the test”
2. Effective materials and curricular strengths (31 comments)
a. MRI physics content as a curricular strength (10)
i. Examples: “basic principles kind of showed throughout and on the exam, that's where they focused”; “basic principles over and over”
b. Didactic MRI anatomy lectures (seven)
i. Examples: “I remember the PowerPoints with the anatomy, where you had a slide with the labels and the next one blank to help you study”
c. Instructor demeanor (six)
i. Examples: “very approachable”; “very willing to meet outside of class”
3. Areas of needed improvement (26 comments)
a. Patient care instruction (seven)
i. Examples: “how oxygen tanks work”; “how to take blood pressure”
b. Ineffectiveness of the MRI physics textbook used (six)
i. Examples: “Sometimes the book would say something on a page and a few pages say something different”; “a lot of mistakes”
c. MRI anatomy/vascular anatomy instruction (five)
i. Examples: “we don’t cover the vascular system”; “vascular system in depth”
4. Post-graduation outcomes (10 comments)
a. Personal satisfaction (three)
i. Examples: “I actually feel legitimate now that I have a bachelor’s degree”; “I feel confident”
b. Patience (two)
i. Examples: “I learned to be patient”; “I really learned patience, how to be a professional”
5. Suggestions for improvement (15 comments)
a. Initial MRI overview prior to clinical rotations (10)
i. Examples: “just a few basics”; “have a general overview of the anatomy, obviously not detailed structures, but at least a few pictures of each section, a few brain images”
b. Increased essay questions on examinations (three)
i. Examples: “I would suggest for you to do is more essay questions”; “you should give more essay”
DISCUSSION
As seen in Figure 1, ARRT Certification Examination scores and pass percentage rates obtained via the password-protected ARRT Program Director website indicate that 2009-2011 AMIT graduates performed higher than the average of all other MRI programs nationwide (97% pass rate vs. 86% national average; 87% average score vs. 83% national average). Although success on the MRI certification examination compared to the national average has been shown, further data were needed to support this success, as well as to locate specific areas of success and deficiency within the curriculum in order to maintain and, ultimately, improve student outcomes.
Comparison of the analyzed open-ended question data and focus group data showed similar areas of effective technique and curricular strengths beneficial toward certification examination preparation. Significant coded focus group data regarding areas of needed improvement were found, which also corroborated the coded open-ended data. Upon quantitative and qualitative data analysis, a theme emerged: participants may regard the MRI curriculum as effective in preparation for the ARRT Certification Examination, yet additions and revisions are necessary to improve student outcomes.
Quantitative data from this study demonstrated an overall very high satisfaction level from participants in all five quantitative data sections. MRI curriculum content results (Table 1) may indicate that while participants utilized supplementary content, the curriculum materials were sufficient in examination preparation; participants noted PowerPoint lectures in all courses as effective preparatory tools. Regarding MRI curriculum materials (Table 2), the lower scores in quantity/quality of additional curriculum materials, such as visual aids and readings, compared with the (highest) scores regarding satisfaction with curriculum materials, may indicate that while participants noted the curriculum was sufficient, additional materials, assignments, examinations, and review materials are needed for further curricular improvement. This is corroborated by participants suggesting the incorporation of interactive MRI websites into learning.
Qualitative data further corroborate this need for additional assessment and implementation of improvements, such as more rigorous instruction in patient care techniques, anatomy, vascular anatomy, and ethical/legal issues. Further areas of needed improvement found in the qualitative data analysis are the completion of an extensive assessment of the current MRI textbook, with a possible textbook change, and the creation of mock certification examinations more similar to the national certification examination. Regarding MRI curriculum delivery methods (Table 3), quantitative results scored lowest overall, indicating a need to further assess delivery methods in didactic MRI coursework, in line with data regarding materials used within the curriculum. Scores regarding the efficacy of incorporating additional or different curriculum delivery methods were the lowest of all 25 statements. This may indicate that additional or different content delivery methods would not increase curriculum effectiveness, which is contradictory to the quantitative and qualitative data reported regarding the efficacy of current curricular materials.
A contrast was seen in the qualitative open-ended and focus group data results as well, where participants suggested the need for additional delivery methods to improve curricular and instructor effectiveness, such as non-PowerPoint instructional methods, increased use of essay questions on didactic examinations, and increased use of instructional websites and videos. MRI Curriculum Teaching Quality – Teaching Methods (Table 4) demonstrated the highest mean score, suggesting that while delivery methods are in need of improvement, faculty instruction skills may be deemed effective in certification examination preparation. Qualitative data clarified this methodology/faculty skill contrast, as participants noted that while PowerPoint lectures and instructor abilities are effective in conveying necessary information, further delivery methods could improve material retention (increased essay questions, instructional videos). Scores regarding teaching objectives were highest in this category, suggesting participants considered curricular teaching objectives to have been met. Additionally, the “faculty commitment” results, which scored second highest in this section and second highest overall, may suggest faculty commitment met participants’ educational needs. While considerably high in comparison to all 25 individual statements, the lowest scores were seen in teaching methods used in the curriculum and faculty effectiveness in meeting student needs, possibly suggesting further improvements in faculty teaching methods and effectiveness are needed. Qualitative data analysis found the use of PowerPoint lectures, case study assignments, quizzes performed on a regular basis, and interactive anatomy lectures to be successful teaching methods.
In MRI Curriculum Teaching Quality – Faculty Skills (Table 5), the highest scores regarding the quality of faculty teaching skills were seen in the faculty’s ability to “communicate their knowledge and experience” toward certification preparation, which may suggest participants benefited from faculty knowledge and experience. The lowest scores were in the area of faculty incorporating and accommodating differing learning styles, which may indicate further improvement is needed in incorporating instructional techniques that accommodate differing student learning styles. Qualitative results showed participants felt curricular expectations were met and faculty dedication met student needs, citing meeting with students outside of class and creating a comfortable learning environment as faculty strengths.
CONCLUSION
Program evaluation and measurement of student satisfaction with a curriculum focus on the pertinence of curricular content and implementation practices within the curriculum in order to identify methods by which to enhance the curriculum and increase program efficacy and value.10 Although this study primarily focused on evaluation of student satisfaction regarding certification examination preparation, results revealed many aspects of overall curricular success as well as areas where improvement is needed to increase program efficacy and value for the learner.
A thorough assessment of this study’s results and participant recommendations will be conducted prior to any curricular changes being made; however, it is clear that results from this study will have a profound impact on the current MRI curriculum. Participants provided valuable insight and suggestions vital for curricular improvement, such as the incorporation of a pre-clinical rotation review to better prepare students for real-time patient care scenarios. Participants also noted that while multiple-choice test items are good preparation for the certification examination, their use fosters rote memorization rather than true comprehension of course material. Implementation of more essay or other question types could potentially benefit students and increase understanding of the material.
An implication of this study could be the implementation of improvements within the MRI curriculum. In particular, significant additions could be made to curricular content, along with modifications in instructional methodology. These include increased application of non-PowerPoint lectures, instructional videos, interactive websites, and increased instruction in patient care and ethical/legal issues. Other implications could be continued instructional use of PowerPoint lectures in both physics and anatomy courses, and the use of quizzes, review activities, and mock certification examinations as assessment and preparatory tools.
Limitations of this study included:
1. Difficulty locating potential participants. Some previous AMIT MRI students were not available to participate at the time of the study.
2. Some former AMIT MRI students chose not to participate in this study. No information was provided as to why some previous students chose not to participate.
3. Lack of overall MRI curriculum review performed by similar MRI programs nationwide.
4. Although a pattern of positive and negative comments from the focus group matched both quantitative and qualitative results from the anonymous questionnaire, these findings may be limited due to the lack of anonymity. Any difference between the anonymous and focus group data results could be attributable to:
a. The principal investigator who co-moderated the focus group interview was the sole AMIT MRI Program faculty member, and all study participants were previous AMIT MRI students under his guidance. Another researcher, Dr. Kadriye O. Lewis, served as co-moderator for the focus group session.
b. Focus group participants could potentially have been influenced by comments from other participants.
Programs in allied health fields requiring professional certification could directly benefit from continuous program assessment to effectively identify both areas of curricular strength and areas of needed improvement.10 Therefore, a future consideration could be the implementation of this study’s data collection instruments and methodologies, as they are non-program specific, by allied health programs whose fields require similar examinations for professional certification. Due to the limited sample size and relatively limited scope of curricular assessment within this study, another future consideration is the completion of a larger, more comprehensive AMIT MRI curriculum assessment. Finally, to maintain curricular success congruent with JRCERT standards, assessment of this study’s curricular survey instruments within two years of this study is recommended.4