A28 Enhancing Simulation Effectiveness from Design to Evaluation: An Improvement Science Approach in Higher Education

Article Type: Education
Abstract

Introduction:

The rapid expansion of simulation-based education (SBE) in UK nursing curricula has been driven by factors such as the COVID-19 pandemic, updated Nursing and Midwifery Council (NMC) standards and placement shortages. At the same time, financial constraints, supply chain issues and technological advances are reshaping the way simulation is planned and delivered, at times hindering SBE implementation. It is therefore imperative to deliver high-quality, high-impact simulation to ensure the current workforce is prepared to face the challenges of the future [1].

Improvement science methodologies could represent a scalable solution as they offer structured, evidence-based approaches to planning and delivering simulation programmes, ensuring sustainability, operational resilience and simulation effectiveness [2].

This abstract presents the application of improvement science tools to the planning and delivery of an SBE project, aiming to increase awareness and pave the way for a future-proof roadmap to sustainable and impactful SBE.

Methods:

Different improvement methods were employed to plan, deliver and evaluate a four-week simulated placement for around 40 undergraduate nursing students (Table 1).

Evaluation focussed on students’ perceived competence and confidence in specific clinical skills, including the use of electronic patient record (EPR) systems. A mixed-methods pre-/post-test design was employed, and data were collected using anonymised questionnaires, debriefs and informal discussions to capture emergent issues and insights. T-tests, analysis of variance (ANOVA) and statistical process control (SPC) were used to analyse the quantitative data. Ethical approval was obtained for the study.
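For illustration only (this is not the authors' analysis code), the sketch below shows how paired pre-/post-questionnaire scores might be compared with a paired t-test and summarised with XmR-style SPC limits; the variable names and the scores are synthetic placeholders assumed for the example.

```python
# Illustrative sketch: paired pre-/post-test comparison and simple
# XmR-style SPC limits for questionnaire scores.
# The data below are synthetic placeholders, not the study's results.
import numpy as np
from scipy import stats

# Hypothetical per-student confidence scores (0-100) before and after the placement
pre = np.array([55, 62, 70, 58, 64, 61, 67, 59, 63, 66], dtype=float)
post = np.array([78, 85, 90, 74, 88, 80, 86, 79, 84, 87], dtype=float)

# Paired t-test: is the mean post-test score significantly different from pre-test?
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")

# XmR (individuals/moving-range) control limits for the post-test scores
moving_range = np.abs(np.diff(post))
mr_bar = moving_range.mean()
centre = post.mean()
ucl = centre + 2.66 * mr_bar   # 2.66 is the standard XmR constant for individuals charts
lcl = centre - 2.66 * mr_bar
print(f"SPC centre line = {centre:.1f}, limits = [{lcl:.1f}, {ucl:.1f}]")
```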

Results:

The application of the Model for Improvement (MFI) allowed the team to plan and evaluate the project, resulting in data-driven decisions and successful outcomes [3]. Results showed:

Confidence in learning improved from 68% (pre-test) to 97% (post-test)

Statistically significant differences (p<0.05) in perceived competence were detected for 6/11 proficiencies tested and for 7/7 EPR system parameters investigated in the before/after groups

Overall post-test competence was higher than pre-test across all proficiencies.

Table 1 shows the application of some of the tools employed and their benefits.

Discussion:

Embedding improvement science into SBE offers evidence-based guidance that ensures rigour, effectiveness and a learner-centred focus, leading to enhanced planning accuracy, educational impact and operational resilience. Widespread uptake of improvement science in SBE will drive more effective, sustainable and responsive simulation programmes, ultimately improving nurse preparedness and, potentially, patient care.

Future work should focus on:

Scoping the current application of the MFI in simulation programmes

Integrating the MFI into SBE more systematically

Evaluating faculty’s experience in using the tools

Ethics Statement:

As the submitting author, I can confirm that all relevant ethical standards of research and dissemination have been met. Additionally, I can confirm that the necessary ethical approval has been obtained, where applicable.

References

1. Dalwood P, Haig A, Sykes M, Eaton J. Simulation fidelity and nursing performance: a systematic review. Nurse Educ Pract. 2018;31:72–77.

2. Langley GJ, Moen R, Nolan KM, Nolan TW, Norman CL, Provost LP. The Improvement Guide: A Practical Approach to Enhancing Organizational Performance. 2nd ed. San Francisco: Jossey-Bass; 2009.

3. Reed JE, Howe C, Doyle C, Bell D. Simple rules for evidence translation: insights from the SHIFT-Evidence framework. BMJ Qual Saf. 2018;27(8):672–680.

Supporting Documents – Table 1-A28

Table 1. Tools for improvement in SBE.

Tool: Process Mapping (PM)
Adoption: collection of data; students’ sign-in process; day-to-day running of the simulation; EPAD completion; faculty upskilling.
Benefits (process control): implemented for all the main workstreams to promote process visualisation, encouraging comprehension, facilitating discussions, and eliminating bottlenecks and redundant steps.

Tool: Action Effect Method (AEM)
Adoption: kit and equipment required; learning resources needed; tracking and monitoring; critical interventions to PDSA; programme evaluation.
Benefits (resources and requirements): linked aims to the required actions and resources, allowing the team to identify elements to implement and critical aspects warranting related testing cycles.

Tool: Plan-Do-Study-Act (PDSA)
Adoption: scenario development; EPR system integration; attendance; evaluation completion; proficiency tracking.
Benefits (responsiveness and effectiveness): cycles enabled rapid-cycle testing of changes, evaluating their immediate impact and refining the simulation environment in near real time.