Human factors and systems simulation methods to optimize peri-operative EHR design and implementation
Advances in Simulation volume 10, Article number: 23 (2025)
Abstract
The increasing adoption of electronic health records (EHRs) in healthcare can be overwhelming to users and can pose hidden safety threats and inefficiencies if the system is not well aligned with workflows. This quality improvement study, conducted from September 2023 to April 2024, aimed to proactively test a new EHR in a peri-operative children’s hospital setting prior to go-live, using systems-focused simulation and human factors methods, to improve the safety, efficiency, and usability of the EHR. The project was conducted at a large, academic, quaternary care children’s hospital undergoing a transition from one EHR to another. Two cycles of usability testing, followed by in situ simulations, were used to test the new EHR with interprofessional peri-operative team members prior to go-live. Usability testing, based on relevant clinical workflows, was completed over Zoom using the EHR “testing” environment with individual care providers across multiple peri-operative roles. In situ simulations were facilitated in the actual peri-operative and otolaryngology clinic spaces with full interprofessional teams. Qualitative data were collected and summarized through debriefing and recordings of the sessions. Human factors and patient safety principles were integrated throughout the recommendations. A total of 475 recommendations were made to improve the safety, efficiency, usability, and optimization of the EHR. The outcomes included a range of usability and system issues, including latent safety threats and their impact on safe and quality patient care. Numerous usability improvements, including some critical issues, were uncovered and mitigated prior to the go-live date.
Background
In the past 15 years, there has been increasing adoption of electronic health records (EHRs) [1,2,3]. These electronic systems are implemented across hospitals or health systems, requiring attention to staff training, project and change management, and especially patient and staff safety [4,5,6,7,8]. In complex healthcare systems, adding a significant change (i.e., a new EHR) can be overwhelming for users, pose safety risks to patients, and cause inefficiencies if the system’s design does not align with current workflows [9,10,11,12,13]. These factors can lead to low adoption rates and increased dissatisfaction among users. EHRs have helped healthcare institutions reduce medical costs, track and share patient information more easily across hospital sites, and reduce medication errors while eliminating illegible handwritten documentation [14,15,16,17,18,19]. In the longer term, when adopted across a health system, they may also reduce the extra steps required to navigate multiple documentation systems.
Human factors (HF) and systems-focused simulation (SFS) involve the scientific study, and application, of how people interact with equipment, tools and technology, information, and other people to perform tasks in their environment [20,21,22,23]. As described in the Systems Engineering Initiative for Patient Safety (SEIPS 2.0) human factors framework, any change to system elements, such as technology (i.e., a new EHR), tools, environment, or tasks, can have unpredictable impacts on other system elements, healthcare processes, and desired outcomes, including safe patient care [24,25,26]. Healthcare simulation has been used as an effective tool to uncover latent safety threats in healthcare environments, systems of care, and technologies, as it can bridge the gap between work as imagined and work as done [27, 28]. While a new EHR is built with safety and efficiency in mind in an ideal state of work as imagined, what actually happens during patient care can be very different, resulting in unintended consequences [29]. Simulation provides an experiential approach focused on user-centred design, in which users participate in methods such as usability testing or in situ simulations (i.e., simulation located in the actual clinical environment) and then provide feedback through debriefing [30,31,32,33]. The debriefing is focused on identifying and mitigating hazards related to the new or changing system element, not on the performance, knowledge, or skills of clinicians.
HF methodologies, including usability testing, are essential to integrate during the design, development, implementation, and operation of any eHealth system to protect patients against harm [34,35,36]. Usability issues can account for up to 60% of information technology (IT) sentinel events according to the Joint Commission [37, 38], and the introduction of EHRs can lead to unintended consequences [14, 34, 35, 39]. Given that approximately 3–5 end users may identify up to 85% of the usability issues in a product, usability testing is a cost-effective method for identifying potential risks and challenges in human–product interaction [20].
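The often-cited 85% figure can be illustrated with the simple probabilistic model of usability evaluation popularized by Nielsen and colleagues; the sketch below is illustrative only, and the per-evaluator detection probability is an assumed, commonly quoted value rather than one measured in this study.

```latex
% Illustrative sketch (assumed values): proportion of usability problems expected to be
% found by n independent evaluators if each finds any given problem with probability \lambda.
\[
  P(n) = 1 - (1 - \lambda)^{n}, \qquad
  \text{e.g., } \lambda \approx 0.31 \;\Rightarrow\; P(5) = 1 - (0.69)^{5} \approx 0.84
\]
```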
New EHR systems are often generic or “out of the box” and are then customized to meet the needs of specific institutions, but they are typically not tested prior to implementation. Customization typically involves demonstrations and working-group sessions led by analysts who are competent in the system. Without detailed “hand to keyboard” scenarios that mimic real-life work, it can be difficult to identify EHR build issues, including deviations from current workflows and potential errors [40].
Enhancing the build process with cycles of usability testing and in situ systems-focused simulations offers many benefits, including improving the design, safety, and efficiency of the system while engaging end users and enhancing adoption [41]. The peri-operative environment is especially susceptible to high-risk safety threats, staff anxiety, and decreased efficiency given its frequent EHR interaction and fast-paced workflows.
Our paper presents a quality improvement (QI) study utilizing HF and SFS methods to improve the usability of a new EHR for peri-operative care providers.
Methods
Setting
The project was conducted at a large, academic, quaternary care children’s hospital undergoing a transition from one EHR to another.
This project was deemed quality improvement (QI) by the Boston Children’s Hospital Department of Pediatrics Performance Excellence Group. QI projects that are designed to improve clinical care to better conform to established or accepted standards are considered exempt from human subjects review by our institutional review board.
Project team
The project team consisted of members from the hospital’s IT and simulation programs’ leadership, EHR implementation team (including analysts and frontline clinicians), and an external consultancy group that provided expertise in HF and SFS.
Needs assessment
Peri-operative care was identified as a focus for HF and SFS due to the time sensitivity of patient care, the complexity of workflows, and the high volume of throughput, which suggested that proactive evaluation for safety and efficiency would be highly beneficial. During project planning, the study team engaged surgeons, physician assistants, peri-operative nurses, business operations, and scheduling roles in a video-conference, change-based needs assessment [22]. Based on case types and availability, the Otolaryngology (ORL) department was chosen to represent the surgical perspective. Based on participants’ greatest areas of concern, the level of change from current workflows, and potential gaps they anticipated with the new EHR, the scenario and testing content were identified: tympanostomy tube placement surgery, with and without tonsillectomy/adenoidectomy.
Plan, do, study, act cycles
Two cycles of 1:1 participant usability testing of the EHR (Plan, Do, Study, Act cycles), using typical workflows for each role, were followed later by team-based in situ simulation phases. Figure 1 depicts a detailed timeline of each cycle of usability testing and simulations.
Usability testing
The initial plan was to complete a larger test cycle of 3–4 participants per role in October 2023 (cycle 1) and a smaller retest cycle in January/February 2024 (cycle 2) to validate build changes. Due to delays in the build, this plan was reversed so that cycle 1 of usability testing included 1–2 participants per role. As the EHR build progressed and approached the implementation date, cycle 2 (January/February 2024) increased the number of participants to enable a larger cohort of 3–4 participants per role to take part in the usability testing (Fig. 1).
All usability testing sessions were facilitated by SFS and HF experts, lasted 90–120 min via video conference, and were recorded for data analysis purposes with consent. Each session included a prebrief followed by simulated scenarios in the EHR testing environment.
Each prebrief followed simulation best practice, including welcome and introductions, the rationale for usability and simulation testing, the goals of the session (i.e., testing the system to collect user feedback, not the person’s abilities or knowledge of the system), updates from improvement work already completed, and key messages to build psychological safety and transparency, a commitment to respect participants’ time, and assurance that participant feedback would be anonymized [42]. The workflow and scenario were introduced with a short orientation to the EHR interface.
Each participant completed their specific documentation workflows for the given scenario (Table 1).
The scenarios created by the study team were proactively registered in the test EHR environment by IT analysts. Each scenario was modified to the appropriate steps in the overall workflow based on each participant’s role (e.g., booked for a clinic consult visit prior to the ORL sessions). Following each usability test, a debrief was performed at the end of the session using the PEARLS for Systems Integration debriefing framework [30], which included a reactions phase followed by open-ended questions to elicit feedback on system usability, including perceived risks and benefits, efficiencies lost or gained, functionality, terminology, patient safety concerns, staff experience, and workflow alignment. All potential hazards and improvement ideas were summarized and cross-checked with participants prior to closing each session.
Simulations
In situ SFS sessions were conducted at two sites (the main and a community hospital) (Fig. 1). Table 2 describes the expanded simulation objectives, which included broader system elements such as implementation of the EHR into the peri-operative environments and processes and its integration across all roles.
Each session included a systems-focused prebrief and debrief, with added details on in situ considerations, the importance of psychological safety [43], and the feedback collection and reporting process. During the debrief, any feedback that was deemed critical to patient safety, regulatory compliance, or operational effectiveness was escalated to the EHR implementation team for urgent review. Consent was obtained for photograph/video capture.
The “surgical consult visit” simulation was conducted in situ in the ORL clinic on a separate day to examine the integration of the EHR in the clinic environment (i.e., arriving the patient, documenting the exam, placing the required orders, and obtaining consent). A patient and parent from the institution’s family advisory council participated in this simulation.
The procedure phases of the simulation (i.e., pre-, intra-, and post-operative care) were conducted in two different operating room (OR) locations on two separate days, consistent with the plan for EHR implementation at two sites within the healthcare organization. During the in situ OR days, two standardized patients (i.e., patient/parent roles) participated pre-operatively and post-operatively to enable documentation and consent during patient care. During intra-operative simulations, a Laerdal 5-year-old mannequin was used so that medical interventions, such as medication administration and airway support including induction, could be completed during use of the EHR.
After each cycle of testing, all findings and recommendations from the usability and simulation sessions were summarized by the consultant group and reported to the project team for review and action by the EHR implementation team. The hierarchy of intervention effectiveness was used as a foundational framework to reflect on the recommendations established (by participants, facilitators, and the HF/SFS consultant group) and their effectiveness in addressing the issues [44].
Participants
Key leaders from each peri-operative stakeholder group identified participants for all sessions based on criteria determined by the project team. Participants who had prior exposure to the target EHR interface through past work or participation in the organization’s EHR implementation working groups were recruited. When a participant was not familiar with the EHR, a project team facilitator guided the participant through an EHR orientation at the beginning of the session.
Results
A total of 324 recommendations were made during the two usability testing cycles. Table 3 presents the total number of recommendations made in each cycle of usability testing and a breakdown of each type of recommendation. The recommendations fell into four categories: (1) changes to the physical layout, content, and build of the software system; (2) process or workflow changes requiring change management; (3) improvements to the content of training materials and education sessions; and (4) other items, including physical hardware and further investigations.
Of the 324 total recommendations that resulted from usability testing, 275 were unique across cycles 1 and 2. Of those, 76% (209) were completed within six months post go-live. Nearly all outstanding recommendations involved software or build changes. Half of these stemmed from technical limitations or organizational decisions not to adopt the recommendation that resulted from the usability testing. For example, the organization standardized advanced scheduling timeframes for surgery, which differed from previous categorizations, and certain surgery-specific questions could not be moved to different forms in the workflow due to technical system constraints. Other unaddressed recommendations were deferred based on time and resource constraints (Fig. 2).
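The reported completion percentage follows directly from the unique and completed recommendation counts above:

```latex
% Completion rate of unique usability-testing recommendations at six months post go-live.
\[
  \frac{209 \ \text{completed}}{275 \ \text{unique}} = 0.76 = 76\%
\]
```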
Recommendation category
Tables 3 and 4 provide examples of specific findings from both the usability testing and the simulations as they relate to optimal outcomes. Efforts were made to prioritize systems-focused resolution strategies, when possible, over solely people-focused (e.g., training) solutions. Sample critical outcomes include improving the efficiency of documentation and ordering to minimize surgical start delays, and making the system more intuitive by matching staff workflows to improve safety and satisfaction (e.g., improving the design of the electronic surgical consent forms).
Systems-focused simulations revealed 151 recommendations, the majority of which related to additional build changes. Critical recommendations, such as those concerning the electronic consent forms, were escalated to the EHR implementation team for urgent review.
Discussion
The ability to prioritize patient safety with proactive testing of new workflows requires a significant shift in an organization’s thinking and processes. It requires allocating limited resources and time up front to avoid the commonplace, more reactive, and higher-risk pattern of implementing changes first and then mitigating poorly designed systems afterward.
Human factors and SFS methods are proactive approaches to identify risk, mitigate harm, and improve efficiency while embracing user-informed design [43]. Through two cycles of usability testing followed by in situ simulations, we studied the design and integration of all system elements prior to the EHR launch, resulting in over 400 unique recommendations. This approach enabled us to identify high-risk and high-impact findings early in the project, allowing more time to mitigate identified hazards and to better align the design and its adoption with users prior to implementation.
The findings included a range of usability and system issues, including latent safety threats and their impact on safe and quality patient care. In our project, there were numerous usability improvements, including some critical issues that were uncovered and mitigated prior to the go-live date. Examples included considerable user concerns with consent forms and processes (e.g., risk of wrong-site surgery due to consent design issues, incomplete consents), system misconfigurations caught early that would have resulted in incorrect medication quantities being ordered, and poor alignment with users’ workflows that resulted in observed delays and user frustration. Improvements to the patient experience included adding a field to electronically track patient belongings in the OR. Considering the typical length of an ear tube surgery (i.e., < 10 min per case), with rapid turnaround and high volume per day, the importance of reducing unnecessary clicks, aligning surgical positioning templates to the typical patient case, and aligning EHR forms to the workflow cannot be overstated. The time efficiencies to be gained, user satisfaction (or frustration), and unnecessary time pressures in an operating room environment are all outputs of EHR design and how well it is optimized to the context. An additional benefit was that, prompted by the pre-implementation usability tests and simulations, end users set up further team-based simulations to practice using the EHR and workflows, moving away from previously siloed training approaches.
Our methods are applicable to other institutions interested in adopting more proactive patient safety approaches, such as HF usability testing followed by SFS in Plan, Do, Study, Act cycles, for new EHRs. The active engagement of users from the needs assessment through to project completion enables learning about how work is actually done within the new software [45]. This is often the missing link for organizations seeking higher reliability, safer care, and a health system that is well designed and supports people to do their best work [46]. Bates et al. [47] describe decision support combined with an EHR as a potent means to create a “better cockpit” for clinicians, helping them avoid errors, be more thorough, and align better with evidence-based practice, thereby improving patient outcomes. They emphasize the inherent need for HF usability testing to make it easier for the clinician to “do the right thing” and note that the system’s design can make the difference between success and failure to adopt. This evidence supports our use case for employing these methods prior to the launch of new EHRs, as demonstrated here in a peri-operative environment.
The hierarchy of intervention effectiveness [44] is a risk management theory that divides interventions to reduce risk (i.e., recommendations for change) into people-focused and systems-focused interventions [48]. Systems-focused changes, such as forcing functions, introducing automation, and standardizing or simplifying processes and tools, are more effective at changing human behavior than people-focused interventions such as policy development, training, and education. Our project resulted in a wide variety of recommendations spanning both systems-focused and people-focused elements. Because we were early in the EHR build, the majority of recommendations were for changes to simplify and standardize the software build and improve automation, which, from a safety science lens, are more effective at improving compliance than education- or policy-type strategies [47, 49]. Testing the technology through 1:1 usability testing first was helpful to focus on the specific tools within the EHR and make specific build changes that might otherwise have been challenging to identify if only simulation had been used. Combining usability testing followed by simulations enabled deeper, systematic testing of the system elements and resulted in a broad range of recommendations to improve the integration of the EHR into the health system.
Lessons learned/reflections
Our project was intended to use proactive testing prior to the EHR launch to identify issues and safety threats before the system was used in the live clinical environment for the first time, given the high risk and the large amount of change happening at once. Shifting safety upstream is essential if our goal is to become more proactive, rather than reactive, to harm in healthcare. Routine testing at this early juncture helped uncover many of the listed problems and enabled us to fix them prior to use in the live patient care environment. Further, applying HF and SFS during EHR design changes, upgrades, or even an uptick in EHR-reported safety threats should be part of routine operations, so that any new or potentially concerning changes are tested and inadvertently harmful, ineffective, or insufficient modifications to an existing EHR are avoided.
Our project had limitations, some of which were not predictable. Our timelines were short, and the large volume of findings uncovered (i.e., 475 recommendations) required organizational resources to mitigate risks in a timely way before the next cycle of testing could proceed. While all recommendations were reviewed, the decision as to whether to implement a recommended change was left to the multidisciplinary EHR implementation teams. As such, it was not always clear what changes the various analyst teams had made in the IT testing environment prior to the next cycle of testing. Although limiting, this was not reported to have a negative impact on the effectiveness of subsequent cycles of testing; users and faculty were pleased to proceed as long as they could see progress being made toward improvements. Purposeful anticipation of the unknown risks that might surface, and a process for escalating the critical ones, were key to enabling teams to make iterative improvements in real time during simulations (e.g., consent form issues were escalated immediately by leadership for mitigation during the simulations). The importance of immediate escalation by people with the ability to make changes, especially when timelines are tight, cannot be overstated. This occurred multiple times during our project when high-risk findings required escalation by leadership.
Ensuring that sufficient time is allotted in project planning is essential to effectively execute each cycle of iterative testing. No perfect timeline can be provided as a “catch-all”, given that every project varies in its scope, overall timeline and budget, number of evaluation objectives identified, and number of testing cycles required. However, building usability testing and simulation into the project timeline, and ensuring the availability of resources such as IT analysts to build test patients and to address recommendations and outcomes within their workload, helps reduce the impact on project timelines. Dubé et al. [23] describe the essential need for a well-thought-out and well-executed “pre-work” phase when building impactful systems-focused simulations, which serves as a useful reference for project planning considerations [50].
Given the limited resources, not all areas of the EHR could be evaluated with HF and SFS. Even without advertising, multiple users from other clinical areas of the hospital came forward requesting similar testing of workflows in their areas during design. This is indeed a gap in traditional IT project implementation planning. Resources for conducting these methods can be limited when implementing a new institution-wide EHR, and the most challenging aspect of this work was the inability to offer these methods to all interested groups who requested it once the work was underway. Careful prioritization is both important and difficult. To prioritize, we determined that areas with high-risk, time-pressured documentation during live patient care, and where efficiency and high workload velocity were critical, would benefit most from this evaluation. In this project, the high volume of cases, the high-risk environment of the operating room, and the fact that lessons learned could be applied in other operative settings (i.e., beyond ear tube cases to other surgical situations) made the peri-operative setting a priority for HF and SFS testing. Given time limitations and resource constraints, we limited the patient care areas included and prioritized those areas and workflows with a high level of patient care interface with the EHR and where the risk of patient harm was greatest. A future direction would be to further study which aspects of a new EHR would benefit most from these methods.
This demand for HF and SFS is a marker for the increasing need to embed HF and systems simulation specialists into healthcare organizations. As we strive for a safer and more reliable healthcare system, we advocate for organizations to put their resources towards more proactive patient safety testing whenever possible.
Our paper presents a QI use case highlighting the essential need for proactive and synergistic use of HF and SFS methods during the implementation of eHealth technologies with peri-operative end-user teams. Our project resulted in a total of 475 recommendations to improve the design and adoption of a new EHR system at a US pediatric hospital.
Data availability
Data is provided within the manuscript.
Abbreviations
- IT: Information technology
- EHR: Electronic health record
- HF: Human factors
- SFS: Systems-focused simulation
- SEIPS 2.0: Systems Engineering Initiative for Patient Safety
- QI: Quality improvement
- ORL: Otolaryngology
References
Office of the National Coordinator for Health Information Technology. ‘National Trends in Hospital and Physician Adoption of Electronic Health Records,’ Health IT Quick-Stat #61. https://www.healthit.gov/data/quickstats/national-trends-hospital-and-physician-adoption-electronic-health-records. Accessed 15 Mar 2025.
Mazur LM, Mosaly PR, Moore C, Marks L. Association of the usability of electronic health records with cognitive workload and performance levels among physicians. JAMA Netw Open. 2019;2(4):e191709. https://doi.org/10.1001/jamanetworkopen.2019.1709.
Grout RW, Hood D, Nelson SJ, Harris PA, Embí PJ. Selecting EHR-driven recruitment strategies: An evidence-based decision guide. J Clin Transl Sci. 2022;6(1):e108. https://doi.org/10.1017/cts.2022.439.
Blumenthal D. Implementation of the Federal Health Information Technology Initiative. N Engl J Med. 2011;365(25):2426–31. https://doi.org/10.1056/NEJMsr1112158.
Classen DC, Longhurst CA, Davis T, Milstein JA, Bates DW. Inpatient EHR User Experience and Hospital EHR Safety Performance. JAMA Netw Open. 2023;6(9):e2333152. https://doi.org/10.1001/jamanetworkopen.2023.33152.
Prosci. The Prosci ADKAR® model [Internet]. 2020 [cited 26 Oct 2020]. Available from: https://www.prosci.com/adkar.
Kotter J. Leading change: why transformation efforts fail [Internet]. 1995 [cited 29 Oct 2020]. Available from: https://hbr.org/1995/05/leading-change-why-transformation-efforts-fail-2.
Project Management Institute. PMBOK guide and standards [Internet]. 2024 [cited 21 Nov 2024]. Available from: https://www.pmi.org/standards/program-management-fifth-edition.
Khairat S, Coleman C, Newlin T, et al. A mixed-methods evaluation framework for electronic health records usability studies. J Biomed Inform. 2019;94:103175. https://doi.org/10.1016/j.jbi.2019.103175.
Sinsky CA, Privitera MR. Creating a “manageable cockpit” for clinicians: a shared responsibility. JAMA Intern Med. 2018;178(6):741–2. https://doi.org/10.1001/jamainternmed.2018.0575.
Howe JL, Adams KT, Hettinger AZ, Ratwani RM. Electronic health record usability issues and potential contribution to patient harm. JAMA. 2018;319(12):1276–8. https://doi.org/10.1001/jama.2018.1171.
Schulte F, Fry E. Death By 1,000 Clicks: where electronic health records went wrong. Fortune. 2019. https://kffhealthnews.org/news/death-by-a-thousand-clicks/. Accessed 15 Mar 2025.
Rotenstein LS, Holmgren AJ, Downing NL, Bates DW. Differences in total and after-hours electronic health record time across ambulatory specialties. JAMA Intern Med. 2021;181(6):863–5. https://doi.org/10.1001/jamainternmed.2021.0256. PMID: 33749732; PMCID: PMC7985815.
Sittig DF, Ash JS, Zhang J, Osheroff JA, Shabot MM. Lessons from “Unexpected increased mortality after implementation of a commercially sold computerized physician order entry system.” Pediatrics. 2006;118(2):797–801. https://doi.org/10.1542/peds.2005-3132.
Radley DC, Wasserman MR, Olsho LE, et al. Reduction in medication errors in hospitals due to adoption of computerized provider order entry systems. J Am Med Inform Assoc. 2013;20(3):470–6. https://doi.org/10.1136/amiajnl-2012-001241.
Potts AL, Barr FE, Gregory DF, Wright L, Patel NR. Computerized physician order entry and medication errors in a pediatric critical care unit. Pediatrics. 2004;113(1 Pt 1):59–63. https://doi.org/10.1542/peds.113.1.59.
Rotenstein LS, Holmgren AJ, Healey MJ, et al. Association between electronic health record time and quality of care metrics in primary care. JAMA Netw Open. 2022;5(10):e2237086. https://doi.org/10.1001/jamanetworkopen.2022.37086.
Shekelle PG, Pane JD, Agniel D, et al. Assessment of variation in electronic health record capabilities and reported clinical quality performance in ambulatory care clinics, 2014–2017. JAMA Netw Open. 2021;4(4):e217476. https://doi.org/10.1001/jamanetworkopen.2021.7476.
Adler-Milstein J, Zhao W, Willard-Grace R, Knox M, Grumbach K. Electronic health records and burnout: time spent on the electronic health record after hours and message volume associated with exhaustion but not with cynicism among primary care clinicians. J Am Med Inform Assoc. 2020;27(4):531–8. https://doi.org/10.1093/jamia/ocz220.
Saleem JJ, Patterson ES, Militello L, Asch SM, Doebbeling BN, Render ML. Using human factors methods to design a new interface for an electronic medical record. AMIA Annu Symp Proc. 2007;11(2007):640–4.
Colman N, Doughty C, Arnold J, Stone K, Reid J, Dalpiaz A, Hebbar KB. Simulation-based clinical systems testing for healthcare spaces: from intake through implementation. Adv Simul (Lond). 2019;2(4):19. https://doi.org/10.1186/s41077-019-0108-7.
Brazil V. Translational simulation: not “where?” but “why?” A functional view of in situ simulation. Adv Simul (Lond). 2017;19(2):20. https://doi.org/10.1186/s41077-017-0052-3.
Dubé M, Posner G, Stone K, White M, Kaba A, Bajaj K, Cheng A, Grant V, Huang S, Reid J. Building impactful systems-focused simulations: integrating change and project management frameworks into the pre-work phase. Adv Simul (Lond). 2021;6(1):16. https://doi.org/10.1186/s41077-021-00169-x.
Holden RJ, Carayon P, Gurses AP, et al. SEIPS 2.0: a human factors framework for studying and improving the work of healthcare professionals and patients. Ergonomics. 2013;56(11):1669–86. https://doi.org/10.1080/00140139.2013.838643.
Lane-Fall MB, Koilor CB, Givan K, Klaiman T, Barg FK. Patient- and team-level characteristics associated with handoff protocol fidelity in a hybrid implementation study: results from a qualitative comparative analysis. Jt Comm J Qual Patient Saf. 2023;49(8):356–64. https://doi.org/10.1016/j.jcjq.2023.04.003.
Bethel C, Rainbow JG, Johnson K. A qualitative descriptive study of the COVID-19 pandemic: Impacts on nursing care delivery in the critical care work system. Appl Ergon. 2022;102:103712. https://doi.org/10.1016/j.apergo.2022.103712.
Deutsch ES. Bridging the gap between work-as-imagined and work-as-done. Patient Safety Advisory. 2017;14(2):80–3.
Hollnagel E. Prologue: Why do our expectations of how work should be done never correspond exactly to how work is done? In: Braithwaite J, Wears RL, Hollnagel E, editors. Resilient Health Care. Vol. 3. Reconciling work-as-imagined and work-as-done. Boca Raton (FL): CRC Press, Taylor & Francis Group; 2017. p. xvii-xxv.
Miller ME, Scholl G, Corby S, Mohan V, Gold JA. The impact of electronic health record-based simulation during intern boot camp: interventional study. JMIR Med Educ. 2021;7(1):e25828. https://doi.org/10.2196/25828.
Dubé MM, Reid J, Kaba A, et al. PEARLS for systems integration: a modified PEARLS framework for debriefing systems-focused simulations. Simul Healthc. 2019;14(5):333–42. https://doi.org/10.1097/SIH.0000000000000381.
Rudolph JW, Simon R, Rivard P, Dufresne RL, Raemer DB. Debriefing with good judgment: combining rigorous feedback with genuine inquiry. Anesthesiol Clin. 2007;25(2):361–76. https://doi.org/10.1016/j.anclin.2007.03.007.
Eppich W, Cheng A. Promoting excellence and reflective learning in simulation (PEARLS): development and rationale for a blended approach to health care simulation debriefing. Simul Healthc. 2015;10(2):106–15. https://doi.org/10.1097/SIH.0000000000000072.
Nickson CP, Petrosoniak A, Barwick S, Brazil V. Translational simulation: from description to action. Adv Simul (Lond). 2021;6(1):6. https://doi.org/10.1186/s41077-021-00160-6.
Dumas JS, Redish J. A practical guide to usability testing (revised edition). Intellect Ltd. 1999.
Nielsen J. Usability engineering. Academic Press, Inc.; 1993.
Haggerty T, Brabson L, Grogg KA, et al. Usability testing of an electronic health application for patient activation on weight management. Mhealth. 2021;20(7):45. https://doi.org/10.21037/mhealth-20-119.
Digital Health Canada. eSafety Guidelines: eSafety for eHealth. Toronto, ON; 2013.
The Joint Commission: Safe use of health information technology. Sentinel Event Alert #54. 2015. https://www.jointcommission.org/-/media/tjc/documents/resources/patient-safety-topics/sentinel-event/sea_54_hit_4_26_16.pdf. Accessed 15 Mar 2025.
Ash JS, Berg M, Coiera E. Some unintended consequences of information technology in health care: the nature of patient care information system-related errors. J Am Med Inform Assoc. 2004;11(2):104–12.
Ratwani R, Savage S, Will A, et al. A usability and safety analysis of electronic health records: a multi-center study. J Am Med Inform Assoc. 2018;25(9):1197–201. https://doi.org/10.1093/jamia/ocy088.
Dubé M, Jones B, Kaba A, Cunnington, et al. Preventing harm: testing and implementing health care protocols using systems integration and learner-focused simulations: a case study of a new postcardiac surgery, cardiac arrest protocol. Clin Simul Nurs. 2020;44:3–11. https://doi.org/10.1016/j.ecns.2019.10.006.
Watts PI, Rossier K, Bowler F, et al. Onward and upward: introducing the healthcare simulation standards of best practice. Clin Simul Nurs. 2021;58:1–4.
Dube M, Kessler D, Huang L, Petrosoniak A, Bajaj K. Considerations for psychological safety with system-focused debriefings. BMJ Simul Technol Enhanc Learn. 2020;6(3):132–4. https://doi.org/10.1136/bmjstel-2019-000579.
Institute for Safe Medication Practices (ISMP). Medication error prevention “toolbox”. ISMP Medication Safety Alert! June 2, 1999; 4 (11): 1.
Ravert P, Whipple K, Hunsaker S. Academic electronic health record implementation: tips for success. Clin Simul Nurs. 2020;41:9–13.
Weintraub AY, Deutsch ES, Hales RL, et al. Using high-technology simulators to prepare anesthesia providers before implementation of a new electronic health record module: a technical report. Anesth Analg. 2017;124(6):1815–9. https://doi.org/10.1213/ANE.0000000000001775.
Bates DW, Kuperman GJ, Wang S, et al. Ten commandments for effective clinical decision support: making the practice of evidence-based medicine a reality. J Am Med Inform Assoc. 2003;10(6):523–30. https://doi.org/10.1197/jamia.M1370.
Cafazzo JA, St-Cyr O. From discovery to design: the evolution of human factors in healthcare. Healthc Q. 2012;15 Spec No:24–9. https://doi.org/10.12927/hcq.2012.22845.
Powers EM, Shiffman RN, Melnick ER, Hickner A, Sharifi M. Efficacy and unintended consequences of hard-stop alerts in electronic health record systems: a systematic review. J Am Med Inform Assoc. 2018;25(11):1556–66. https://doi.org/10.1093/jamia/ocy112.
Dubé M, Laberge J, Sigalet E, Shultz J, et al. Evaluations for new healthcare environment commissioning and operational decision making using simulation and human factors: a case study of an interventional trauma operating room. HERD. 2021;14(4):442–56. https://doi.org/10.1177/1937586721999668.
Acknowledgements
The author team would like to acknowledge the following colleagues for their support of this work: Heather Nelson and Vinny Chiang (Sponsorship); McKristie Mannan, Melise Keays, Julia Galvez Delgado, Dayna Downing Robertson, Gabriel Arato, Saja Traoui, and members of the Immersive Design Systems and Healthcare Systems Simulation International teams who supported the project work in various ways including participant recruitment, committee and team communications, facilitation, simulation technical operations, and internal team resource allocation.
Funding
No funding was required for the development of this manuscript.
Author information
Authors and Affiliations
Contributions
MD, JR, SB, MCM, ES, RL, MS, DK, DW, RB, LC, and JA were involved in the project conceptualization, analysis and interpretation of the data, drafting of the manuscript, and development of all figures and tables. MD, JR, SB, MCM, MS, DK, and JA were involved in referencing of the article. LC was involved in the post go-live assessment of recommendation completion, generation of the relevant figure, and updates to the manuscript.
Corresponding author
Ethics declarations
Ethics approval and consent to participate
This project was deemed quality improvement (QI) by the Boston Children’s Hospital Department of Pediatrics Performance Excellence Group. QI projects that are designed to improve clinical care to better conform to established or accepted standards are considered exempt from human subjects review by our institutional review board.
Consent for publication
Not applicable.
Competing interests
MCM, RL, MS, DK, DW, RB, and JA have no conflicts of interest and no financial disclosures. MD and SB are faculty for Healthcare Systems Simulation International, which provided consulting services. MS is a consultant for Impact Advisors, which provided consulting services. Dr. Eliot Shearer is the principal investigator for a sponsor-initiated clinical trial (IRB-P00046118) sponsored by Akouos (a fully owned subsidiary of Eli Lilly). He is also a member of the scientific advisory board for Akouos. However, he receives no compensation or incentive from this relationship and has no personal financial interest in Eli Lilly or Akouos. There are no other competing interests from the author team, and none disclosed that influence the content of this paper.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.
About this article
Cite this article
Dubé, M., Hron, J.D., Biesbroek, S. et al. Human factors and systems simulation methods to optimize peri-operative EHR design and implementation. Adv Simul 10, 23 (2025). https://doi.org/10.1186/s41077-025-00349-z
DOI: https://doi.org/10.1186/s41077-025-00349-z