International Journal of Healthcare Simulation
Entrustable Professional Activities for simulation faculty?! A novel approach to standardizing mentorship and faculty development for healthcare simulation programs

DOI:10.54531/gdil6011, Pages: 1-10
Article Type: Essay
Abstract

Simulation-based education (SBE) literature emphasizes debriefing frameworks, with little discussion of how SBE competencies are developed. Introduced in 2005 by the Royal College of Physicians and Surgeons of Canada, Entrustable Professional Activities (EPAs) offer a robust curriculum development and assessment process for workplace-based assessments. There is a paucity of literature on EPAs related to simulation and on how simulation faculty move from novice to independent practice. The objective of this curricular innovation project was to develop standardized EPAs and milestones to assess the independence of simulation faculty by the end of mentorship. Using a modified Delphi technique, the team identified expert faculty to rate the level of importance of each EPA and milestone. Five EPAs were identified: Technology; Scenario Design; Simulation Facilitation; Prebriefing; and Debriefing. EPAs provide a structured framework for tracking progress and targeting areas for formative feedback, and offer an opportunity to advance and transform faculty development for simulation programs.


What this essay adds

    There is a paucity of literature on the mentorship of simulation faculty as they move from novice to independent practice.

    Given the current gap in simulation faculty development and mentorship, there is a need for a standardized formative assessment approach that provides structured, observation-based assessment of all domains of simulation competence, including technology, scenario design, simulation facilitation, prebriefing and debriefing.

    Building on the competency-based medical education approach established by the Royal College of Physicians and Surgeons of Canada, the Provincial Simulation team developed five Entrustable Professional Activities (EPAs) and associated milestones, which offer a robust curriculum development and assessment process for simulation faculty development.

    Using a modified Delphi technique, the team identified expert interprofessional faculty from rural and urban centres across Alberta to rate the level of importance for each EPA and milestone, facilitating development of a valid and reliable Entrustable Professional Activities: Faculty Assessment for Simulation Tool (EPA-FAST).

    The EPA-FAST is a highly replicable tool that provides a clear, structured framework for the systematic formative assessment of faculty towards safe independent practice. It can be generalized to other simulation programs and offers a significant advancement to the field of simulation by standardizing mentorship and faculty development programs.

Introduction/Background

The simulation-based education (SBE) literature emphasizes debriefing frameworks and methods to maintain the quality of simulation facilitators, with little discussion detailing how faculty develop SBE competencies over time [1–3]. Despite its importance, simulation faculty development concentrates primarily on foundational skills, such as debriefing [4–6], and neglects to describe a trajectory through which simulation faculty develop these skills from novice to independent practice.

Currently, there are several approaches to faculty development for simulation facilitators. One common and effective approach is peer coaching [7]. Peer coaching can include teaching specific to (i) psychological safety, (ii) frameworks, (iii) method/strategy, (iv) content, (v) learner-centredness, (vi) co-facilitation, (vii) time management, (viii) difficult situations, (ix) debriefing adjuncts and (x) individual style and experience [7]. Alternatively, mentorship as an approach to faculty development creates targeted, learner-centred opportunities that promote the development and sustainment of expert SBE skills, knowledge, attitudes and behaviours [8]. Priorities of mentorship programs for simulation faculty development include creating a safe learning environment and a nurturing relationship, and encouraging and modelling deliberate self-reflection with feedback. The emphasis is also on promoting ample opportunities to develop and sustain debriefing and facilitation skills, and on supporting healthcare facilitators who juggle multiple responsibilities [9].

It has been recognized that a structured, tiered approach to faculty development, mentorship and certification ensures quality instruction and includes observation, didactic and interactive experiential learning, practice with expert feedback and mentoring [3]. Introduced in 2005 by the Royal College of Physicians and Surgeons of Canada as part of competency-based medical education (CBME) [10], Entrustable Professional Activities (EPAs) offer a robust curriculum development and assessment process for faculty development and workplace-based assessment along a continuum from knowledge acquisition to application and proficiency [11,12].

While there is emerging evidence on the development and application of EPAs and associated milestones for medical residents and health professional education [13–17], there is a paucity of literature on EPAs specifically for faculty development across a healthcare simulation career. EPAs are defined as reliable, ‘observable tasks’ that simulation facilitators are ‘trusted’ or expected to be able to perform independently by the end of mentorship [14,18]. A milestone is a specific observable marker of an individual’s ability along a developmental continuum (i.e. as they progress from beginner tasks to more complex tasks and towards independent practice) [19–21]. EPAs and milestones focus on the appropriate expectations that the mentor trusts the simulation faculty to perform safely and independently, and they help identify achievements and targeted areas for improvement within the workplace environment [18].

Just as clinicians need EPAs to develop and demonstrate competence, so must simulation facilitators have entrustable skills, knowledge and attitudes; passion alone is no longer adequate to achieve simulation excellence [19,22]. EPAs can be useful in assessing readiness to practice, but entrustability cannot be determined by a single simulation event, coaching or mentorship session [20]. Further, there are several applications of EPAs within CBME, including both undergraduate and graduate studies [19,21], and beyond medical education; for example, Keating et al. described the use of EPAs to assess nurse practitioner readiness using SBE [20].

An identified gap for the simulation community has been the lack of standardization of the core competencies required to reliably mentor faculty towards best practice, as well as of defining and monitoring essential competency progression over time. Currently proposed frameworks for competencies of simulation facilitators include topics on simulation curriculum, educational theory, assessment, debriefing, simulation research, simulation operations and administration [12]. Thomas and Kellgren applied Benner’s novice to expert model to simulation faculty development as a conceptual framework for simulation faculty, yet there is no standard approach in the literature describing how simulation faculty develop these competencies/skills from novice to expert, independent practice over time [23].

There are also few valid and reliable evaluation tools to formatively assess simulation faculty, outside of the Debriefing Assessment for Simulation in Healthcare (DASH) [24,25], which focuses primarily on debriefing skills alone and not on formative and summative assessment of the skills of simulation faculty across the continuum of their career. Similarly, the Facilitator Competency Rubric (FCR) was developed for formative and summative evaluation of the competency of simulation facilitators, with scores that guide and prioritize faculty development but evaluate faculty at only one point in time [26]. The FCR includes components of preparation, prebriefing, facilitation, debriefing and evaluation. Each component has a scoring rating that differentiates who is competent, who needs help (beginner/advanced beginner) and who can provide that help (proficient/expert). The FCR targets facilitators in academic undergraduate nursing simulation settings rather than simulation faculty providing continuing education within the healthcare environment [26].

In alignment with CBME, the Provincial Simulation program in Alberta, Canada, addressed this identified gap by developing a novel set of EPAs and milestones specifically targeting formative assessment of competencies for SBE. While mapping of EPAs and milestones has traditionally been used for residency training [27], this novel curricular development of EPAs for simulation faculty training illustrates an education innovation to advance standard competencies in SBE, which to the authors’ knowledge has not been done by other simulation programs globally. Applying EPAs to simulation faculty development can serve as a framework across the spectrum of health science education and in a variety of education domains to achieve higher levels of proficiency and mastery within the workplace [27]. It has been recognized that EPA application can go beyond CBME for physicians or healthcare professionals. EPAs can be used as an agenda for further development and research across all levels of the educational continuum and implemented across disciplines and professions for continuing professional development and certification [11,28,29]. Harnessing the use of EPAs and milestones for formative assessment of simulation faculty is an opportunity for significant advancement in transforming and standardizing faculty development and mentorship for simulation programs globally.

The goal of this curricular innovation evaluation paper is to describe the use of a modified Delphi technique to develop standardized EPAs and milestones that a simulation faculty is trusted to perform independently by the end of a faculty development mentorship program.

Methods

Needs assessment

In 2017, the Provincial Simulation program completed a needs assessment of independent simulation faculty and champions across Alberta to gain a better understanding of the current state of faculty development needs and to explore gaps in SBE mentorship design, tools, resources and the lack of standardization of expected competencies. Prior to this needs assessment, there had been no formal inventory over the preceding 10 years of simulation faculty’s continuing education needs, upskilling opportunities, education resources, mentorship, peer feedback, evaluation of outcomes and certification.

The needs assessment results highlighted a mismatch in resources, delivery and formal assessment of simulation faculty. The Provincial Simulation program used various tools and approaches to faculty development and mentorship that, without provincial standardization, remained siloed across sites based on geographic location. The findings prompted a review of current processes to align with the requirements of national simulation accreditation standards, which include the domains of governance, infrastructure, education and healthcare systems.

Simulation accreditation was recognized as an opportunity to standardize the simulation curriculum, as well as to integrate formative assessment and evaluation of education approaches to faculty development and mentorship. The process of applying for national accreditation allowed the program to take stock of how it was measuring its capacity, growth and training beyond the initial novice courses in simulation, and to initiate future planning for maintaining and upskilling its existing faculty.

SWOT analysis

To identify priority areas for future planning of the program, the provincial program implemented a systematic inquiry, applying a SWOT (Strengths, Weaknesses, Opportunities and Threats) analysis [30] aligned with the national simulation accreditation standards (see Table 1).

Table 1:
SWOT (Strengths, Weaknesses, Opportunities and Threats) analysis based on national simulation accreditation standards

Infrastructure
Accreditation standard requirement: There is a process in place to perform regular peer assessments and feedback on the performance of the instructor.
Strength: Assessment tools exist in the simulation literature (e.g. DASH¹, OSAD², peer debrief, plus-delta).
Weakness: The tools are used intermittently for faculty/peer and self-evaluation. Although expert faculty mentors engage in formative peer assessment, no formal tracking of performance is in place.
Opportunities: A formalized working group is in place to develop standardized curriculum and tools.
Threats: Capacity of the team is impacted by competing demands of frontline simulation, projects and ongoing expansion of simulation throughout the province.

Education
Accreditation standard requirement: Minimal expectation includes scenario development, learning objectives, facilitation and debriefing (>1-hour workshop); encourage progression of learning through continued training, observation, co-facilitation and feedback.
Strength: The current Workshop in Simulation Education (WISE 1³) focuses on debriefing skills, facilitation and scenario development. A defined mentorship program was adapted with some members of the simulation team.
Weakness: No formalized or structured feedback approach in the curriculum. Vast geography in the province has historically hindered training and development.
Opportunities: Opportunity to develop competency with the new curriculum. Ability to expand instructional design to include podcasts and webinars, and to leverage virtual options for uptake given geographical limitations.
Threats: The Provincial Simulation team is unstandardized in its current approach to mentorship as well as in the delivery of content of the existing WISE 1 course.

Curriculum Evaluation
Accreditation standard requirement: There is a quality review process in place whereby curriculum evaluation data, for individuals or groups, are used to help modify and improve the curriculum and the delivery of courses/sessions to ensure that all educational objectives continue to be met adequately.
Strength: The WISE 1 evaluation tool allowed for collation and dissemination of course-level feedback among simulation faculty.
Weakness: Data from the WISE 1 evaluation were not consistently collected, measurable or observable to inform faculty development and mentorship. No current knowledge management system (KMS) in place to support collation, theming and dissemination of evaluation data.
Opportunities: Opportunity to develop program evaluation formative assessment tools with the new faculty development curriculum and mentorship. Opportunity to evaluate new faculty in mentorship as part of an ongoing formative assessment, peer feedback and program quality review.
Threats: Cost to hire an Education Lead to manage faculty development and mentorship provincially, the capacity of current expert faculty/consultants to mentor new faculty, and resources to procure a KMS with health authority fiscal restraints in place.

1 Debriefing Assessment for Simulation in Healthcare

2 Objective Structured Assessment of Debriefing

3 WISE: Workshop in Simulation Education (foundational 2-day course for simulation faculty)

Development and curriculum mapping of EPAs and milestones for simulation faculty

CBME focuses on the use of milestones and EPAs to provide structure for teaching, learning and assessment [31]. An EPA is an essential task of a discipline (profession, specialty or subspecialty) that an individual can be trusted to perform without direct supervision in a given healthcare context, once sufficient competence has been demonstrated [28,29,32]. Medical curricula have moved through many iterations, from a time-based model to a competency-based model and, most recently, to the addition of EPAs [11].

Building on the EPA approach from CBME [11], the Provincial team developed an Entrustable Professional Activities: Faculty Assessment for Simulation Tool (EPA-FAST) for new simulation faculty starting mentorship. This EPA-FAST focuses on the trusted tasks of the discipline and the appropriate expectations that simulation faculty can perform safely and independently while also tracking achievements and targeted areas for improvement. Within each EPA is a series of milestones, or specific observable tasks, that require sign-off as faculty advance in mentorship.

As part of the curriculum mapping exercise, the Provincial EPA Faculty group took into consideration EPAs and milestones for faculty development that aligned with the competencies from the new Provincial Faculty Development Curriculum. Through curriculum mapping, they identified gaps, which led to modification of milestones and ensured alignment with Operational Expectations and Procedures, the Strategic Plan and National Simulation Accreditation standards. Further, existing tools and simulation curriculum standards in the literature were considered in the development of the EPAs and milestones, including an existing internal mentorship document, the Harvard DASH [24,25], International Nursing Association for Clinical Simulation and Learning (INACSL) Standards [33], the Royal College of Physicians and Surgeons of Canada (RCPSC) [34] and the Canadian Patient Safety Institute (CPSI) [35].

Modified Delphi technique

The modified Delphi method is a group consensus strategy that systematically uses literature review, opinion of stakeholders and the judgement of experts within a field to reach agreement [36]. The goal of the modified Delphi in our curricular innovation project was to decrease the number of EPAs and specific milestones and to improve the clarity of the language so that each EPA/milestone would resonate with groups of experts across a range of disciplines, clinical areas and levels of expertise. We chose a core group of experts considered important and knowledgeable in the field of SBE to assist us with the consensus strategy.

An expert is defined as one who is knowledgeable about the subject of SBE and capable of representing the views of his or her peers [37]. Several Delphi studies recommend using 10–20 carefully selected expert respondents, enough to provide a range of opinions but also few enough for the research team to be able to summarize and integrate those opinions [37].

The modified Delphi review was completed by the EPA Faculty group as well as 20 Simulation Experts with diverse experience in provincial simulation programs. In addition to providing tracked changes and feedback, the experts responded to the questions below:

    1) Does this EPA and associated milestones resonate with you as key observable tasks of the discipline required for a simulation faculty to practice independently? (yes/no)

    2) Do you see ways to improve the strength of the language? If so, please re-write, add comments or make a suggestion to combine with another EPA/milestone(s).

    3) Using a 4-point scale: extremely important (3), very important (2), moderately important (1), not important (0), how important is this EPA and associated milestone(s) for a simulation faculty to practice independently?

In total, three Delphi rounds were completed including a first-round review by EPA Faculty group and second- and third-round review by simulation experts.
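For programs looking to replicate this kind of analysis, the sketch below (in Python, purely illustrative) shows one way expert importance ratings could be aggregated into per-EPA averages for each round and checked for round-to-round stability. The data values, EPA names and stability threshold are hypothetical and are not drawn from the project's actual responses.

```python
from statistics import mean

# Hypothetical ratings: for each Delphi round, each expert assigns each EPA an
# importance score on the 0-3 scale described above. Values are illustrative only.
ratings = {
    "round_2": {"Technology": [3, 2, 3, 3], "Prebriefing": [3, 3, 2, 3]},
    "round_3": {"Technology": [3, 3, 3, 3], "Prebriefing": [3, 3, 3, 3]},
}

def round_averages(round_ratings):
    """Average importance rating per EPA for a single Delphi round."""
    return {epa: mean(scores) for epa, scores in round_ratings.items()}

avg_r2 = round_averages(ratings["round_2"])
avg_r3 = round_averages(ratings["round_3"])

# A simple stability check between successive rounds: an EPA is treated as
# stable if its average rating shifts by less than an arbitrary threshold.
THRESHOLD = 0.5
for epa in avg_r2:
    drift = abs(avg_r3[epa] - avg_r2[epa])
    print(f"{epa}: round 2 = {avg_r2[epa]:.2f}, round 3 = {avg_r3[epa]:.2f}, "
          f"stable = {drift < THRESHOLD}")
```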

Results

Demographics of faculty experts

Driven by the accreditation standards and the provincial governance model, there was an identified need to increase diversity in the simulation experts’ group by varying years of experience, professional background and geographic representation. This was important for the membership of both the EPA Faculty group and the external Expert Simulation Faculty. The team was representative of multiprofessional rural, urban and academic experience.

The EPA Faculty group, which completed round 1 of the modified Delphi, included nine members: a Medical Director (n = 1), Research Scientist (n = 1), Education Coordinator (n = 1), Technical Consultant (n = 1), Simulation Lead (n = 1) and Expert Faculty Mentors/Consultants (n = 4) from both rural and urban centres, ensuring comprehensive representation of disciplines and expertise.

The Expert Simulation Faculty for the Delphi review included stakeholders employed by the Provincial health authority, with interprofessional representation (n = 4) and diverse experience in simulation (between 5 and 15 years) across academic, rural and urban settings. Figure 1 provides an overview of the demographics of the 20 Expert Simulation Faculty in rounds 2 and 3 of the Delphi method.

Figure 1:

Expert faculty representation by professional experience domains.

The final 5 EPAs and 31 associated milestones identified after the completion of three rounds of modified Delphi were: (1) Technology, (2) Scenario Design and Fidelity-Realism, (3) Simulation Facilitation (Considerations for Session Planning and Implementation), (4) Prebriefing and (5) Debriefing. See Supplementary material for the EPA-FAST.

The Expert Simulation Faculty in rounds 2 and 3 of the modified Delphi were also asked to rate, on a 4-point scale, how important each of the five EPAs and associated milestone(s) was for a simulation faculty to practice independently: extremely important (3), very important (2), moderately important (1), not important (0). The average ratings for rounds 2 and 3 are summarized in Figure 2.

Figure 2:

Average EPA rating based on expert faculty response in Delphi rounds.

The following sections describe the key findings after each of the rounds of modified Delphi.

Round 1 modified Delphi

In the first round of the modified Delphi, the milestones were separated into levels of skill: Beginner, Novice, Advanced and Expert. This categorization, however, was found to be complex and required a grounded understanding of what was deemed to be beginner vs. novice or expert. Research-Scholarship, Patient Safety and Teamwork and Culture were initially included as EPAs but were removed because they were difficult to align with clearly observable tasks.

Round 2 modified Delphi

In the second round of the modified Delphi, the Logistics EPA was removed, and operational-based checklists were created for the provincial simulation program. These were recognized as being specific to the individual program and therefore less generalizable across institutions outside of Alberta.

Several experts were confused by the phrasing ‘safety competencies’ in the Scenario Design EPA, and questions also arose about the use of the Promoting Excellence and Reflective Learning in Simulation (PEARLS) framework as the exclusive debriefing tool [4]. Several experts noted that scenario development was not essential in their work, where pre-existing curriculum is most often used. Some experts gave the ‘Technology’ EPA a low rating and deemed several milestones unnecessary. Concerns were also raised regarding the narrow focus of the ‘Setting the Stage’ EPA, with suggestions to instead explore milestones for facilitation and post-session practices.

The changes made in response to the second round of the modified Delphi included a title change of the Setting the Stage EPA to Simulation Facilitation and Implementation to better encompass pre/during/post simulation facilitation. Further, the use of standard nomenclature aligning with the simulation program’s Operational Expectations and Healthcare Simulation Dictionary [38] led to generalized rewording of these milestones.

Round 3 modified Delphi

In round 3 of the modified Delphi, further changes were made to the Technology EPA and milestones: troubleshooting language was simplified, 12 milestones were merged into 5, and specific technical or clinical language was removed to improve generalizability. Experts noted confusion with the term ‘embedded participant’; therefore, a definition of an embedded participant was added. Although several experts wanted to include additional milestones for procedural task trainer skills, the group decided not to include a procedural-based inventory, as this was a request specific to one group for residency training. Following the third round of the modified Delphi, the EPA Faculty group reviewed the results for consensus. Any items that did not achieve agreement were dropped or revised for clarity. The final analysis of the three iterative rounds of the modified Delphi revealed stability across the successive rounds. Consensus was achieved between the Expert Simulation Faculty and the EPA Faculty group on all items by the third round, which led to the finalization of the EPA-FAST. Figure 3 highlights the evolution of the number of track changes, milestones and EPAs from the initial EPA Faculty group round to the third Delphi round. In summary, we started with 9 EPAs and 144 milestones; by round 3 of the modified Delphi, the experts agreed on 5 EPAs and 31 milestones, with a total of 228 track changes from the original document.

Figure 3:

EPA and milestones evolution.

Table 2 describes the specific track changes to the EPAs and milestones through each of rounds 1–3 of the modified Delphi.

Table 2:
Changes to EPAs and milestones across three rounds of the modified Delphi

EPAs
Round 1: Total EPAs pre-round: 9; post-round: 6. Initial EPAs aligned with the language and domains of the Provincial Simulation Program Faculty Development. Three EPAs (Research-Scholarship, Patient Safety and Teamwork/Culture) were removed due to difficulty finding observable tasks; these concepts were already embedded in other EPAs. An overall assessment rating for each EPA was included in the tool.
Round 2: Total EPAs pre-round: 6; post-round: 5. The Logistics EPA was removed, and operational-based checklists were created for the provincial simulation program. The title of the ‘Setting the Stage’ EPA was changed to ‘Simulation Facilitation and Implementation’ to better encompass simulation facilitation. Average EPA rating from experts (1 = not important to 4 = extremely important): Technology 3.0; Setting the Stage 3.3; Scenario 3.5; Prebrief 3.7; Debrief 3.8.
Round 3: Total EPAs: 5; no change post-round. The overall assessment rating for each EPA was removed; the rating scale was used as the overall indicator of performance. Average EPA rating from experts (1 = not important to 4 = extremely important): Technology 3.05; Setting the Stage 3.8; Scenario 3.75; Prebrief 3.95; Debrief 3.95.

Milestones
Round 1: Total milestones pre-round: 144; post-round: 70. Milestones were initially categorized into levels (Beginner, Novice, Advanced and Expert); this categorization was removed because of the inability to differentiate specific observable tasks across levels. Milestones were streamlined to include only observable tasks during a simulation session.
Round 2: Total milestones pre-round: 66; post-round: 42. Revisions were made to the technology milestones to ensure greater generalizability across programs and institutions. Most EPAs and milestones resonated as important with experts: 98/100 responses answered yes to the question of whether these are key observable tasks of the discipline required for a simulation faculty to practice independently.
Round 3: Total milestones pre-round: 42; post-round: 31. Technology milestones were adapted by removing specific language for CPR feedback. The suggestion of adding milestones for procedural task trainer skills was excluded because it is not generalizable to all simulation faculty. All EPAs and milestones resonated as important with experts: all answered yes to the question of whether these are key observable tasks of the discipline required for a simulation faculty to practice independently.

Taxonomy and Track Changes
Round 1: Total track changes: 93. Language was modified to ensure it is generalizable outside the Provincial Simulation Program (e.g. removed ‘brave space’ as a reference to psychological safety and ‘follow the leader’ as a co-debrief style).
Round 2: Total track changes: 105. Standard nomenclature was changed to align with the Provincial simulation program’s policies and the Healthcare Simulation Dictionary. The PEARLS debrief model, though noted to have a narrow focus, remained because it is the model used throughout the current faculty development curriculum. The word ‘simulationists’ was replaced with ‘simulation faculty’.
Round 3: Total track changes: 30. The definition of ‘embedded participant’ was included for clarity and alignment with current language in the literature. A reference link to the eSIM Program’s Operational Expectations (guidance documents) and Standard Scenario template was included.

Discussion

Despite its importance, simulation faculty development concentrates primarily on foundational skills, such as debriefing, and neglects to describe the trajectory through which simulation faculty develop these skills from novice to independent practice. While there is emerging evidence on the development and application of EPAs for medical residents and health professional education [13–15,28], there is a paucity of literature on EPAs specifically for faculty development across a healthcare simulation career. Further, according to Gardner et al., no formal demonstration of competency is required for simulation centre leaders or expert simulation faculty [22]. It has been recognized that to be optimally successful, simulation faculty not only need knowledge and skills related to delivery of educational curricula, but must also be skilled in areas beyond debriefing (e.g. technology) [22].

The development of our standardized EPA-FAST for simulation faculty builds on the work of Iqbal et al. [13], who proposed an EPA framework to serve as a roadmap for ‘longitudinal training and entrustment of small group facilitators’ in which learning activities are mapped against predetermined competencies, as well as on programmatic development for simulation faculty [12]. Yet despite the emerging need, there have been minimal evaluation tools to formatively assess simulation faculty [39]. Two tools commonly cited in the literature are the DASH [24,25] and the FCR [26]. The DASH [24,25] focuses primarily on debriefing skills and not on formative assessment of simulation faculty across the continuum of their career. Similarly, the FCR focuses on assessing competency based on levels (i.e. Beginner, Novice, Competent, Proficient, Expert) [26] and does not include trustable observable skills that are required for independence or assess observable behaviours over time. The FCR was initially targeted at facilitators in academic undergraduate nursing simulation labs. Further, the FCR does not include measurable milestones or observable tasks of the discipline that can be formatively assessed over time, which is a current gap for simulation faculty providing SBE to staff within a healthcare environment [26]. Our proposed EPA-FAST validates the competencies and concepts described by Leighton et al. in the FCR [26], and it enhances assessment of readiness to practice beyond a 5-point Likert scale. In the FCR, predictors of competency included whether the simulation was facilitated on a particular day or time of week and the fidelity of the simulation [26]. In contrast to the FCR, the EPA-FAST standardizes simulation faculty competencies for all new faculty and the assessment of those competencies, thereby promoting independence.

Our findings from this curricular innovation project describe the use of a modified Delphi technique to develop standardized EPAs and milestones that a simulation faculty is trusted to perform independently by the end of a faculty development mentorship program. An unintended outcome of the modified Delphi was the identification of a mismatch in experts’ expectations of the skills required to be an independent simulation faculty. The evaluation of the current state of independent faculty revealed knowledge gaps, specifically around scenario design and technology. This was likely due to the advancement of the simulation expert faculty mentor’s role, specifically leveraging the expertise of the provincial simulation program in providing technology support for simulation sessions. This was predominantly noted in physician expert responses in the modified Delphi rounds. Further, faculty experts in programs with access to existing pre-designed curriculum and scenarios gave lower ratings to the scenario design competency, as scenario design was a skill they had not developed. However, the EPA Faculty group decided to retain the technology and scenario design EPAs and milestones in the EPA-FAST, as faculty require an understanding of all domains of simulation to be considered independent in their practice. This ensures that all simulation faculty have a basic literacy of SBE competencies and mitigates barriers such as hierarchy, supporting long-term sustainability and the scale and spread of the simulation program.

Implementation of EPAs and milestones and sustaining mentorship

Following the development of our EPA-FAST, the next logical step was to determine how to operationalize this process as new faculty complete the required faculty development courses. As an initial step, a faculty development flow map was developed to illustrate the steps to be followed as new faculty move through mentorship towards independence.

Given the importance of tracking and documenting new faculty as they move through the continuum of faculty development in mentorship, an electronic fillable form was created for each faculty member completing Faculty Development (FD) courses. This internal program-level tool is used to screen potential applicants and to determine the breadth and scope of their simulation plans, informing a detailed strategy for the facilitation of simulation sessions.

As new faculty enter mentorship (i.e. once the required foundational online and in-person simulation faculty development courses are completed), an initial meeting with expert faculty mentors is set up to outline the steps and mentorship plan. During this consultation, a needs assessment is completed with the new faculty, highlighting learning objectives for future sessions and how these will be attained (i.e. by identifying their perceived and unperceived needs). It is during this stage that the EPA-FAST fillable tracking tool will be started for each new faculty member. Upon observing simulation sessions, milestones within each of the five EPAs (e.g. technology, prebriefing, etc.) will be referenced and signed off according to the date the observation took place. Session dates will be tracked, as well as the dates that specific observable milestones were achieved or remain in progress. The number of mentorship sessions required to sign off on all the EPAs and milestones will vary based on the individual’s experience and comfort, but it is estimated to be a minimum of 3–6 sessions for new faculty.

The internal tracking form also contains a section to document follow-up conversations 3–6 months post-mentorship and EPA sign-off. The overarching benefit of developing a comprehensive document that can be utilized for each new faculty member is the ability to reference conversations from screening for courses through to post-EPA completion. As is sometimes the case, several expert faculty mentors may be part of a team mentoring one faculty member to independence. Having the ability to revisit previous conversations and reflect on learning objectives provides a comprehensive, continuous and sustainable mentorship process that can be shared easily among several expert faculty mentors. Further, to ensure standardization of this process, the Provincial Simulation Program intends to transition to an online learning management system that will track, monitor and centralize the EPA-FAST and each individual’s mentorship plan.
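To make the tracking workflow concrete, the sketch below (in Python, purely hypothetical) models one way a per-faculty record could be represented electronically, with dated milestone sign-offs grouped under each EPA. The class names, fields and example milestone descriptions are illustrative assumptions, not the program's actual form or learning management system.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Dict, List, Optional

@dataclass
class MilestoneRecord:
    """One observable milestone within an EPA and, once observed, its sign-off date."""
    description: str
    signed_off_on: Optional[date] = None  # None while still in progress

@dataclass
class FacultyEpaTracker:
    """Hypothetical per-faculty tracking record for EPA-FAST style mentorship."""
    faculty_name: str
    # EPA name (e.g. "Prebriefing", "Technology") -> list of milestone records
    epas: Dict[str, List[MilestoneRecord]] = field(default_factory=dict)
    session_dates: List[date] = field(default_factory=list)

    def record_session(self, session_date: date) -> None:
        """Log the date of an observed mentorship/simulation session."""
        self.session_dates.append(session_date)

    def sign_off(self, epa: str, milestone_index: int, when: date) -> None:
        """Mark a specific milestone as observed on a given session date."""
        self.epas[epa][milestone_index].signed_off_on = when

    def outstanding(self) -> Dict[str, List[str]]:
        """Milestones not yet signed off, grouped by EPA, to target future sessions."""
        return {
            epa: [m.description for m in milestones if m.signed_off_on is None]
            for epa, milestones in self.epas.items()
        }

# Illustrative use: two hypothetical milestones under one EPA, one signed off after a session.
tracker = FacultyEpaTracker(
    faculty_name="New Faculty A",
    epas={"Prebriefing": [MilestoneRecord("Orients learners to the environment"),
                          MilestoneRecord("Establishes a fiction contract")]},
)
tracker.record_session(date(2023, 1, 15))
tracker.sign_off("Prebriefing", 0, date(2023, 1, 15))
print(tracker.outstanding())
```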

Finally, one approach a simulation program might consider to support the sustainability of EPAs and milestones is to create an online Community of Practice (CoP) for new simulation faculty graduates. It has been recognized that mentorship, alongside proactive planning, will assist faculty with developing and demonstrating the necessary knowledge, skills and behaviours for high-quality simulation facilitation [22]. Access to a CoP network of simulation mentors and peers [22] promotes the sociology of a simulation mentorship environment. The goal of the CoP is to promote deliberate practice and reflection on debriefing strategies and other facilitation domains. The focus is on sharing common simulation facilitation challenges and successes related to skills and knowledge such as difficult debriefing, co-debriefing, or using PEARLS effectively [8].

While our proposed EPA-FAST targeted new simulation faculty, there is a plethora of opportunities for future faculty development, and we recommend the development of advanced EPAs and milestones for simulation faculty in the domains of co-debriefing, peer debriefing, virtually facilitated simulations, systems integration simulation, operations, and advanced simulation technology and research.

Limitations

This curricular innovation project is subject to some identified limitations. The results of the modified Delphi were generated through simulation expert responses and assumptions within a Canadian healthcare system; therefore, care should be taken in generalizing these findings to other settings and contexts. Further, the curricular innovation project used a cross-sectional design and a convenience, non-probability sample, which may have introduced sampling and selection bias in the participants and their feedback on the EPAs and milestones. Experts may also have had some degree of recall bias, recalling only very positive or very negative experiences, potentially impacting their scoring of the EPAs and milestones. Further research is needed to validate the EPA-FAST across different contexts and healthcare systems.

Conclusion

Harnessing the use of EPAs and milestones for formative assessment of simulation faculty is an opportunity for significant advancement in transforming and standardizing faculty development and mentorship for simulation programs globally. Currently, there is wide variation in how simulation faculty develop these skills across their career from novice to independent practice. The objective of this curricular innovation project was to use a modified Delphi technique to develop EPAs and milestones that a simulation faculty is trusted to perform independently by the end of a faculty development mentorship program. Five EPAs and 31 milestones were identified through three rounds of the modified Delphi: Technology; Scenario Design; Simulation Facilitation; Prebriefing; and Debriefing. The EPA-FAST provides a structured framework of clear expectations for assessing and tracking the progress of simulation faculty, targeting areas for improvement and formative feedback to facilitate independent and safe practice. While mapping of EPAs and milestones has traditionally been used for residency training, this novel curricular development of the EPA-FAST for simulation faculty training provides opportunities for significant advancement in championing new approaches to faculty development and mentorship for simulation programs locally, nationally and internationally.

Supplementary material

Supplementary data are available at The International Journal of Healthcare Simulation online.

Declarations

Acknowledgements

This project could not have been accomplished without the leadership support from eSIM Provincial Program, Alberta Health Service. The authors would like to acknowledge the following individuals for their contributions to the EPA-FAST: Faculty Assessment for Simulation Tool: Alejandra Boscan, Mirette Dube, AnnaMaria Mundell, Chris Dyte, Danaiet Teame, Gord McNeil, Helen Catena, Irina Charania, James Huffman, John Kortbeek, Jon Duff, Kristin Fraser, Megan Rolleman, Nicholle Oomen, Ryan Iwasiw, Ryan Wilkie, Sue Barnes, Ken Brisbin, Jonathan Jaekel and Stuart Rose.

Authors’ contributions

All authors contributed to manuscript conception and design. Material, preparation, data collection and analysis were performed by AK, CS, NT, TF, JS, CE, VG. The first draft of the manuscript was written by AK and all the authors commented on previous versions of the manuscript. All authors read and approved the final manuscript.

Funding

None declared.

Availability of data and materials

None declared.

Ethics approval and consent to participate

None declared.

Competing interests

None declared.

References

1. 

Palaganas JC, Epps C, Raemer DB. A history of simulation-enhanced interprofessional education. Journal of Interprofessional Care. 2014 Mar 1; 28(2):110–115.

2. 

Irby DM, O’Sullivan PS, Steinert Y. Is it time to recognize excellence in faculty development programs? Medical Teacher. 2015 Aug 1; 37(8):705–706.

3. 

Peterson DT, Watts PI, Epps CA, White ML. Simulation faculty development: a tiered approach. Simulation in Healthcare. 2017 Aug 1; 12(4):254–259.

4. 

Eppich W, Cheng A. Promoting Excellence and Reflective Learning in Simulation (PEARLS): development and rationale for a blended approach to health care simulation debriefing. Simulation in Healthcare. 2015 Apr 1; 10(2):106–115.

5. 

Hall K, Tori K. Best practice recommendations for debriefing in simulation-based education for Australian undergraduate nursing students: an integrative review. Clinical Simulation in Nursing. 2017 Jan 1; 13(1):39–50.

6. 

Cheng A, Eppich W, Kolbe M, Meguerdichian M, Bajaj K, Grant V. A conceptual framework for the development of debriefing skills: a journey of discovery, growth, and maturity. Simulation in Healthcare. 2020 Feb 1; 15(1):55–60.

7. 

Cheng A, Grant V, Huffman J, et al. Coaching the debriefer: peer coaching to improve debriefing quality in simulation programs. Simulation in Healthcare. 2017 Oct 1; 12(5):319–325.

8. 

Terpstra N, King S. The missing link: cognitive apprenticeship as a mentorship framework for simulation facilitator development. Clinical Simulation in Nursing. 2021 Oct 1; 59:111–118.

9. 

Kumar AH, Howard SK, Udani AD. Tipping the scales: prioritizing mentorship and support in simulation faculty development. Simulation in Healthcare. 2018 Feb 1; 13(1):72.

10. 

Frank JR, Snell LS, Cate OT, et al. Competency-based medical education: theory to practice. Medical Teacher. 2010 Aug 1; 32(8):638–645.

11. 

ten Cate O. Nuts and bolts of entrustable professional activities. Journal of Graduate Medical Education. 2013 Mar 1; 5(1):157–158.

12. 

Meguerdichian MJ, Bajaj K, Walker K. Fundamental underpinnings of simulation education: describing a four-component instructional design approach to healthcare simulation fellowships. Advances in Simulation. 2021 May 11; 6(1):18.

13. 

Iqbal MZ, Könings KD, Al-Eraky M, AlSheikh MH, van Merrienboer JJG. Development of an entrustable professional activities (EPAs) framework for small group facilitators through a participatory design approach. Medical Education Online. 2020 Jan 1; 25(1):1694309.

14. 

Keating S, McLeod-Sordjan R, Lemp M, Willenbrock D, Fried AM, Cassara M. Evaluating entrustable professional activities in a nurse practitioner readiness for practice simulation. The Journal for Nurse Practitioners. 2021 May 1; 17(5):611–614.

15. 

ten Cate O, Schumacher DJ. Entrustable professional activities versus competencies and skills: exploring why different concepts are often conflated. Advances in Health Sciences Education. 2022 May 1; 27(2):491–499.

16. 

Postmes L, Tammer F, Posthumus I, Wijnen-Meijer M, van der Schaaf M, ten Cate O. EPA-based assessment: clinical teachers’ challenges when transitioning to a prospective entrustment-supervision scale. Medical Teacher. 2021 Apr 3; 43(4):404–410.

17. 

van Dam M, Ramani S, ten Cate O. An EPA for better bedside teaching. Clinical Teacher. 2021 Mar 24; 18(4):398–403.

18. 

van Loon KA, Bonnie LHA, van Dijk N, Scheele F. Benefits of EPAs at risk? The influence of the workplace environment on the uptake of EPAs in EPA-based curricula. Perspectives on Medical Education. 2021 Aug 1; 10(4):200–206.

19. 

Ryan MS, Iobst W, Holmboe ES, Santen SA. Competency-based medical education across the continuum: how well aligned are medical school EPAs to residency milestones? Medical Teacher. 2021 Nov 22; 0(0):1–9.

20. 

Keating S, McLeod-Sordjan R, Lemp MC. Nurse practitioner handoff communication: a simulation based experience. Journal of Nursing Education. 2021 Aug 1; 60(8):476–477.

21. 

Shorey S, Lau TC, Lau ST, Ang E. Entrustable professional activities in health care education: a scoping review. Medical Education. 2019 Apr 4; 53(8):766–777.

22. 

Gardner AK, Gee D, Ahmed RA. Entrustable professional activities (EPAs) for simulation leaders: the time has come. Journal of Surgical Education [Internet]. 2018 Jan 1 [cited 2022 May 19]. Available from: http://www.scopus.com/inward/record.url?scp=85045112656&partnerID=8YFLogxK.

23. 

Thomas CM, Kellgren M. Benner’s novice to expert model: an application for simulation facilitators. Nursing Science Quarterly. 2017 Jul 1; 30(3):227–234.

24. 

Simon R, Raemer D, Rudolph JW. Debriefing Assessment for Simulation in Healthcare (DASH) [Internet]. Center for Medical Simulation. 2012. Available from: https://harvardmedsim.org/wp-content/uploads/2017/01/DASH.handbook.2010.Final.Rev.2.pdf.

25. 

Brett-Fleegler M, Rudolph J, Eppich W, et al. Debriefing assessment for simulation in healthcare: development and psychometric properties. Simulation in Healthcare. 2012 Oct 1; 7(5):288–294.

26. 

Leighton K, Mudra V, Gilbert GE. Development and psychometric evaluation of the facilitator competency rubric. Nursing Education Perspectives. 2018 Nov 1; 39(6):E3.

27. 

Jeyalingam T, Walsh CM, Tavares W, et al. Variable or fixed? Exploring entrustment decision making in workplace- and simulation-based assessments. Academic Medicine. 2022 Jun; 97(7):1057–1064. https://doi.org/10.1097/ACM.0000000000004661.

28. 

ten Cate O, Balmer DF, Caretta-Weyer H, Hatala R, Hennus MP, West DC. Entrustable professional activities and entrustment decision making: a development and research agenda for the next decade. Academic Medicine. 2021 Jul 1; 96(7S):S96.

29. 

Ten Cate O, Taylor DR. The recommended description of an entrustable professional activity: AMEE Guide No. 140. Medical Teacher. 2021 Oct 3; 43(10):1106–1114.

30. 

Benzaghta M, Elwalda A, Mousa M, Erkan I, Rahman M. SWOT analysis applications: an integrative literature review. Journal of Global Business Insights. 2021 Mar 1; 6(1):55–73.

31. 

Frank JR, Snell LS, Cate OT, et al. Competency-based medical education: theory to practice. Medical Teacher. 2010 Aug 1; 32(8):638–645.

32. 

Carraccio C, Englander R, Gilhooly J, et al. Building a framework of entrustable professional activities, supported by competencies and milestones, to bridge the educational continuum. Academic Medicine. 2017 Mar 1; 92(3):324–330.

33. 

Durham CF. The International Nursing Association for Clinical Simulation and Learning (INACSL), a community of practice for simulation. Clinical Simulation in Nursing. 2013 Aug 1; 9(8):e275–e276.

34. 

Posner G. Chapter 60 - accrediting simulation programs. In: Chiniara G, editor. Clinical simulation. 2nd edition. [Internet]. Academic Press. 2019 [cited 2022 May 19]. p. 905–915. Available from: https://www.sciencedirect.com/science/article/pii/B9780128156575000759.

35. 

Hassen P, Hoffman C, Gebran J, Leonard P, Dyck J. The Canadian patient safety institute: building a safer system and stronger culture of safety. British Columbia Medical Journal. 2006 Sep 1;48(7):5.

36. 

McMillan SS, King M, Tully MP. How to use the nominal group and Delphi techniques. International Journal of Clinical Pharmacy. 2016 Jun 1; 38(3):655–662.

37. 

Niederberger M, Spranger J. Delphi technique in health sciences: a map. Frontiers in Public Health [Internet]. 2020 [cited 2022 May 19]; 8. Available from: https://www.frontiersin.org/article/10.3389/fpubh.2020.00457.

38. 

Agency for Healthcare Research and Quality. Healthcare simulation dictionary [Internet]. 2016. Available from: https://www.ahrq.gov/sites/default/files/publications/files/sim-dictionary.pdf.

39. 

Meguerdichian M, Bajaj K, Wong N, et al. Simulation fellowships: survey of current summative assessment practices. Simulation in Healthcare. 2019 Oct 1; 14(5):300–306.
Supplementary materials
  • Supplementary-material_S1.docx