SIMULATION-BASED MOCK-UP EVALUATION FRAMEWORK
MARCH 2016
AUTHOR
Jonas Shultz, MSc, EDAC, Human Factors Specialist, Health Quality Council of Alberta;
Adjunct Lecturer, Department of Anesthesia, Cumming School of Medicine, University of Calgary
DOCUMENT COPYRIGHT
The Health Quality Council of Alberta holds copyright and intellectual property rights of this
document. This document is licensed under a Creative Commons “Attribution-NonCommercial-NoDerivatives 4.0 International” license: http://creativecommons.org/licenses/by-nc-nd/4.0/
You may copy, distribute, or transmit only unaltered copies of the document, and only for
non-commercial purposes.
Attribution is required if content from this document is used as a resource or reference in another
work that is created. To reference this document, please use the following citation:
Shultz J. Simulation-Based Mock-up Evaluation Framework. Calgary, Alberta, Canada: Health
Quality Council of Alberta; March 2016.
Please contact the Health Quality Council of Alberta for more information: [email protected],
403.297.8162.
CONTENTS
FOREWORD  2
INTRODUCTION  3
Need for a framework  3
Purpose and goals of the framework  4
Intended audience  4
Principles of the framework  4
METHODS AND BENEFITS OF GAINING USER INPUT IN DESIGN  5
User input into designs  5
Range of architectural evaluation methods  6
Costs and benefits  7
Supplementary expertise  9
EVALUATION PLANNING  10
Spaces to evaluate  10
Consideration for an evaluation  10
Scope of the evaluation  12
Timing of the evaluation  12
Iterative design and evaluation  14
DESIGNING AND BUILDING A MOCK-UP  14
Simple mock-ups  16
Detailed mock-ups  16
Live (functional) mock-ups  17
SIMULATION ROLES AND RESPONSIBILITIES  21
Simulation preparation  21
Enacting simulation scenarios  25
Data collection and analysis  26
Disseminating findings and recommendations  29
Tracking and evaluating implemented recommendations  30
SUMMARY  30
APPENDICES  31
Appendix 1: Listing of evaluation tasks  32
Appendix 2: Simulation scenario template  33
Appendix 3: Sample pre-briefing script and instructions  34
Appendix 4: Photography suggestions  36
Appendix 5: Videography suggestions  37
Appendix 6: Data analysis spreadsheet template  38
Appendix 7: Recommendations template  40
ACKNOWLEDGEMENTS  41
REFERENCES  43
SIMULATION-BASED MOCK-UP EVALUATION  1
FOREWORD
The physical space, equipment, and people within any
healthcare environment have a bearing on patient
experience. The size and design of a hospital room and
the placement of equipment, for example, influence
how healthcare teams and individuals interact to
perform their work. A poorly designed space can
inadvertently introduce hazards both for the patient
and healthcare workers and many of these scenarios
can be anticipated and avoided by involving users in
the design process to help the end product meet their
needs. An important tool for gathering users’ input is
designing and testing mock-up environments, where users perform their typical roles in the mock-up space
as if in a real-life scenario. This approach shows how healthcare professionals use and interact with the
space, medical equipment, one another, and the patient so that the design can address any challenges that
might otherwise have been overlooked. A better design process can improve patient safety, staff efficiency,
and user experience, and can yield financial returns as well.
[Photo courtesy of DIALOG©]
This Simulation-Based Mock-up Evaluation Framework, developed by the Health Quality Council of Alberta
(HQCA) in collaboration with experts in human factors, healthcare design, and patient simulation as well
as provincial and national stakeholders, outlines an approach to collect and analyze data from mock-up
healthcare environments. This framework is intended to be a guiding document to support the evaluation of
mock-ups from which an improved design process can result.
The HQCA was supported in this work through the generous contributions of time and talent from various
stakeholder groups and individuals, with the shared goal of achieving better outcomes for patients. Many
people with specialized knowledge in designing safe, high-quality environments participated in the process
of creating and reviewing this framework. We gratefully acknowledge their efforts.
Andrew Neuner
Chief Executive Officer, HQCA
March 2016
2  HEALTH QUALITY COUNCIL OF ALBERTA
INTRODUCTION
Patient and staff safety as well as other patient and
staff outcomes are strongly linked to the built healthcare
environment in critical ways. Infections, patient falls,
and other complications of care, including surgical
complications and increased length of stay, as well as
healthcare worker injuries are all affected by the built
environment.1-4 Such adverse outcomes can be the
result of decisions made during the design process that
can inadvertently introduce design flaws in the system,
known as latent conditions.5 All decisions, even correct
ones, have the potential to introduce latent conditions.
[Photo courtesy of DIALOG©]
These design decisions can contribute to adverse events in healthcare settings.6 Basing these decisions,
which are made early in the design process, on evidence-based data will improve outcomes. This framework
describes a systematic way to do this using human factors/ergonomicsi methods to test and optimize design
for safety with representative user groups. It is applicable to a wide variety of stakeholders including those
who initiate, plan, or participate in the design process of built environments for healthcare. Applying this
framework is expected to yield cost efficiencies as well as improve patient safety, staff efficiency, and the
experience of patients and staff who use the space.
NEED FOR A FRAMEWORK
Various human factors methods can be, and have been, used to conduct simulation-based mock-up evaluations and extract information from them.8-15 A mock-up is a prototype or model of a design that is used for teaching,
demonstration, design evaluation, or other purposes to enable testing of a design.16 For this framework, the
term mock-up refers to those which are built at full scale. Simulation, for the purpose of this framework, refers
to the enactment of relevant tasks, performed by individuals or teams, while interacting with the mock-up,
potentially with real patients and their families. Patient simulation is more commonly used for clinical training;
however, in this case the focus is on how well the built environment supports performance. Little guidance
is available for organizations and healthcare design teams that want to extract more evidence-based data
from their mock-ups.
Although the use of full-scale mock-ups to gain user input can be valuable and is growing in popularity, the use
of simulation within a mock-up is less frequently employed. This may be due to a lack of guidance outlining
how simulation data can be collected, analyzed, and used to enhance the design of the built environment.
Watkins and colleagues have cautioned that simulation within mock-ups will deliver a lackluster performance
if the approach does not include a carefully orchestrated research setting and process.17
i The International Ergonomics Association, in a definition later adopted by the Human Factors and Ergonomics Society, defines Human Factors/Ergonomics as “the scientific
discipline concerned with the understanding of interactions among humans and other elements of a system, and the profession that applies theory, principles,
data and methods to design in order to optimize human well-being and overall system performance.”7
Designing and building environments that facilitate safer, more efficient, and cost-effective care is increasingly
important. In anticipation of a growing population as well as aging facilities, changing technology and methods
of delivering care, many provincial governments are heavily investing in healthcare facilities. The Alberta
government budgeted $2.2 billion to build and expand health infrastructure in its five-year (2015-20) capital
plan.ii, 18 The British Columbia government budgeted $2.7 billion for health infrastructure over three years
(2015-18).19 The Ontario government budgeted $11 billion in hospital capital grants over 10 years (2015-26).20
Given the substantial investments in healthcare infrastructure, optimizing the design of planned facilities is
critical. Therefore, there is a need for a simulation-based mock-up evaluation framework, one that can be
useful across jurisdictions sharing similar goals.
PURPOSE AND GOALS OF THE FRAMEWORK
This Simulation-Based Mock-up Evaluation Framework builds upon extensive experience using and refining
the evaluation methodology in a variety of healthcare environments across Alberta. Participation of stakeholders
with broad expertise in this area, as well as a review of pertinent literature, also guided the development of
this framework. The framework outlines an approach to collect and analyze data from full-scale mock-ups,
through the use of simulation, where individuals enact processes and procedures that will be performed in
the space. It provides an opportunity to meaningfully engage planned users of the space, including patients
and healthcare professionals, into the design process. Data from the simulation-based mock-up evaluations
are intended to guide design decisions in efforts to optimize the built environment for safe and efficient
patient-centred care.
The goals of the framework are to:
• Enhance awareness and use of simulation-based mock-up evaluations.
• Assist with planning for an evaluation, as applicable, before the design process starts.
• Provide guidance for individuals/organizations to conduct a mock-up evaluation.
INTENDED AUDIENCE
This framework is intended for all individuals and organizations involved in the design process of healthcare
environments. This includes government, healthcare administrators, planners, and architects. It is also
applicable to individuals, teams, and organizations that research the built environment, or provide expertise
to optimize built environment design including human factors specialists, simulation consultants, quality
improvement consultants, and academics. Finally, it is applicable to those who will be using the designed
environment including patients, families, healthcare professionals and staff.
PRINCIPLES OF THE FRAMEWORK
1. A simulation-based mock-up evaluation should be considered, and if applicable, planned, as part of the pre-design stage for inclusion in the design stage.
2. The mock-up evaluation should be thoroughly planned to maximize effectiveness. The scope of the evaluation should be outlined during pre-design (during or just after functional programming). It should include evaluation objectives, the time and costs required to build and evaluate a mock-up, and the phase within the design stage in which the evaluation should occur. The evaluation should occur before finalizing the design decisions it is intended to inform.
ii In Alberta, projects of $5 million and greater (major projects) are generally managed by Alberta Infrastructure, whereas projects under $5 million (minor capital projects and infrastructure maintenance programs) are generally managed by Alberta Health Services.
3. Building of the mock-up should align with evaluation timing and objectives. The degree to which a mock-up is completed (mock-up fidelity) can vary significantly. The mock-up should be built to an appropriate level of fidelity to enable testing of evaluation objectives during the appropriate design phase.
4. Roles and responsibilities for those involved in the evaluation should be clearly defined. This includes identifying who will be responsible for evaluation design, staging the mock-up, data collection, and data analysis, as well as who will participate in the scenario enactments. Availability of expertise (e.g., human factors) should be assessed to identify whether individuals external to the organization are needed.
5. The simulation scenarios that are created and enacted should test the evaluation objectives. Evaluating a mock-up involves selecting frequent, urgent, and challenging tasks to create simulation scenarios that will test predetermined evaluation objectives. The scenarios are enacted by users of the space within the mock-up, which includes needed supplies and equipment (real or mock-ups).
6. Recommendations should be informed by evidence-based data from scenario enactments. Evidence-based data, collected through user feedback and video analysis, are used to identify potential issues and successes with the planned design. The recommendations that are developed should address any identified issues.
METHODS AND BENEFITS OF GAINING
USER INPUT IN DESIGN
USER INPUT INTO DESIGNS
Designing or renovating a built environment to support
and facilitate patient care is a complex process. An
important component in human factors evaluations is
the incorporation of users’ input (a form of participatory
ergonomicsiii ) into the design process to help ensure
the space will meet the needs of each group of users.
Depending on the room being designed, users may
include healthcare professionals such as physicians,
nurses, pharmacists, and/or allied health professionals.
[Photo courtesy of Alberta Health Services]
Support staff such as protection services, facilities
maintenance and engineering, porters, and housekeepers may also be included, as may patients and their
families. Different patients will have diverse needs within the same space and across different healthcare
environments, be it a hospital, long term care facility, ambulance, or outpatient clinic. Similarly, wheelchair
users, people with disabilities, and able-bodied individuals may all interact differently with built environments.
Input from all users is critical. This is consistent with provincial, national and international recognition of the
important role that patients and their families play in health quality improvement.23 Meaningful engagement
from all user groups can enhance patient experience, remove latent conditions or hazards for patients and
staff, reduce conflicts (i.e., bumps) between equipment and people, and improve efficiencies in the final design.
iii Participatory ergonomics is defined as “the involvement of people in planning and controlling a significant amount of their own work activities, with sufficient
knowledge and power to influence both processes and outcomes in order to achieve desirable goals”.21 Hignett and colleagues reviewed the benefits of
participatory ergonomics across a range of industries, including healthcare.22
RANGE OF ARCHITECTURAL EVALUATION METHODS
Design teams soliciting feedback from users can share two-dimensional (2D) drawings, three-dimensional
(3D) renderings that allow animated walkthroughs, renderings depicting more spatial qualities, virtual
reality environments, and physical models (e.g., mock-ups) at various scales to communicate architectural
designs.24 The method selected should reflect the degree of precision and detail sought in the feedback
provided from the user groups. For example, 2D drawings are commonly used to gather user input through
focus groups or meetings, and in some cases tabletop exercises are used to walk users through clinical
processes on top of a 2D drawing.25 Feedback obtained through use of 2D drawings may be compromised,
however, due to difficulties with conceptualizing how big the actual space will be. Furthermore, 2D drawings
are not always easily interpreted or understood by those who do not view them regularly.
There is a growing trend to build full-scale mock-ups that allow users to experience a replica of the planned
space, provide feedback, and then move outlets, equipment, walls, and anything else needed to achieve a
desired configuration.17, 26 Although having users walk through a mock-up and inspect the planned room
helps them to better visualize the space, how the space will support both current and planned uses might
not be clear. Potential design issues may remain hidden, especially if the space, once filled with equipment
and supplies, is then used by users in ways unanticipated by designers.27 Furthermore, it is difficult to move
beyond assessing hazards associated with a single technology to assess the complete context and understand
how multiple technologies will work together, and the hazards introduced when interacting with multiple
technologies simultaneously.28 Simulating various processes and procedures anticipated for the space can
help the design team understand the clinical processes that the built environment is intended to support and
the implications when multiple users use the space simultaneously.
Simulation within mock-ups allows design teams to test, refine, and discover new design concepts based on
anticipated processes and procedures.17 Evaluation objectives can target the testing and optimization of (1)
room size and configuration; (2) space requirements; (3) access requirements; (4) equipment and supply
placement; (5) visibility of the patient, equipment, and monitors; (6) work flows and processes; as well as
(7) user experience. These objectives are quite different from those of simulations that focus on clinical skills or
process/culture improvement, and consequently, the scenarios and data collected will likely differ. Using
simulation within a full-scale mock-up allows design teams to experience and test design ideas before
construction.9-15, 29-31
Within Alberta, the use of simulation to evaluate mock-ups has been undertaken as part of the design process
for a number of new builds, primarily those focusing on acute care spaces in Calgary (see Table 1). These
evaluations provided opportunities to develop, use, and refine the evaluation process through lessons
learned, and this framework builds upon those learnings.
TABLE 1: Simulation-based mock-up evaluations conducted to design healthcare facilities in Alberta

Room type                                  | City           | Facility
Hybrid operating rooms                     | Calgary        | Foothills Medical Centre (cardiovascular)32
                                           | Calgary        | Peter Lougheed Centre (vascular)33
                                           | Edmonton       | Mazankowski Alberta Heart Institute (cardiac)34
                                           | Calgary        | Foothills Medical Centre (interventional trauma)11,14,35
                                           | Calgary        | South Health Campus36
Intensive care unit patient rooms          | Calgary        | Foothills Medical Centre10
                                           | Calgary        | Peter Lougheed Centre10
Emergency department exam rooms            | Calgary        | South Health Campus37
Ambulance patient compartments             | Provincial     | Emergency Medical Services38
Acute care unit patient rooms              | Calgary        | South Health Campus39
Designated assisted living resident rooms  | Provincial     | Facility design standards and guidelines29
Outpatient exam rooms                      | Calgary        | South Health Campus40
Medical day unit pods                      | Grande Prairie | Grande Prairie Regional Hospital41
Systemic prep rooms                        | Grande Prairie | Grande Prairie Regional Hospital41
COSTS AND BENEFITS
Simulation-based mock-up evaluations can lower costs, as well as improve patient safety, staff efficiency,
and user experience. The ability to make design modifications diminishes as a project moves through the
design process, and ultimately affects what is in and out of scope for an evaluation. Early user input into the
design process allows for greater influence on both safety and the final cost, as it is less expensive to implement
safety enhancements early in the project lifecycle (see Figure 1). Conducting a mock-up evaluation during
the correct phase in the design process is essential for maximizing the benefits of the evaluation.
The cost to build a mock-up might include construction, consultant fees, or rental space. The amount is
highly variable and dependent on a variety of factors including the level of fidelity (i.e., how elaborate the
mock-up is, as described on page 16). Table 2 outlines approximate cost ranges for each fidelity type, which
are based on reported amounts from Alberta Infrastructure and Alberta Health Services (AHS) to build
mock-ups. The cost of personnel to conduct and participate in the evaluation will depend on whether in-house expertise and in-kind support are available. Given that five to 15 per cent of a construction budget
can be spent making design-related changes during construction (two to three per cent is commonly considered
acceptable), Johansson advocated for the use of mock-ups, which in one example cost 0.5 per cent of the
construction budget (and often less), to identify issues before construction.24
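The gap between these percentages can be made concrete with simple arithmetic. The sketch below is illustrative only; the $100-million construction budget is a hypothetical figure, not one taken from the framework.

```python
# Illustrative sketch only: the $100-million budget below is hypothetical.
def cost_at(percent, budget):
    """Dollar amount corresponding to a percentage of the construction budget."""
    return budget * percent / 100

budget = 100_000_000  # hypothetical construction budget ($)

mockup = cost_at(0.5, budget)                        # mock-up: ~0.5% (Johansson's example)
changes = (cost_at(5, budget), cost_at(15, budget))  # 5-15% spent on design-related changes
accepted = (cost_at(2, budget), cost_at(3, budget))  # 2-3% commonly considered acceptable

print(f"Mock-up:                ${mockup:,.0f}")
print(f"Design-related changes: ${changes[0]:,.0f} to ${changes[1]:,.0f}")
print(f"Commonly accepted:      ${accepted[0]:,.0f} to ${accepted[1]:,.0f}")
```

Even at the low end of the reported change-cost range, a mock-up at 0.5 per cent of the budget is an order of magnitude cheaper than fixing issues during construction.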
FIGURE 1: The cost-influence curve 42
[Figure: “Moving safety upstream in the healthcare facility design process.” The curve plots the project lifecycle phases (pre-design, schematic design, design development, construction documents, bid, construction, commissioning/punch list, occupancy) against the ability to influence final cost, which falls from high to low as the project progresses. Early phases offer safety through proactive design and ease of safety implementation; later phases bring rising costs of safety implementation, the need for retrofitting enhancements, and ongoing costs of adverse events. Cost-influence curve adapted from Max Wideman (2001) and Christensen and Manuele (1999).]
In Alberta, simulation-based mock-up evaluations resulted in net savings estimated at $1.7 million. This was
achieved through a series of evaluations that examined four mock-ups, which informed the design template
for 900 patient rooms at the South Health Campus in Calgary (see Figure 6).43 Savings were gained through
avoidance of change order requisitions and were calculated using direct costs only. Additionally, had the
changes been made after opening the hospital, this would have resulted in a loss of an estimated 940 patient
bed days. Design changes made as a result of the evaluation included reconfiguring the bathroom, headwall,
and the addition of data and electrical outlets. These changes also represent a different type of value:
changes made during design can minimize or avoid 40 or 50 years of inconvenience.44 Indeed, the
magnitude of cost savings will differ between projects.
TABLE 2: Approximate costs to build a mock-up

Mock-up fidelity           | Approximate cost range to build
Simple mock-up             | $440 – $25,000
Detailed mock-up           | $51,000 – $158,000
Live (functional) mock-up  | Uses built and fully functional rooms
Improvements in patient safety and staff efficiency
have also been demonstrated. Specifically, the number
of bumps between people and/or equipment was reduced by 44 per cent in one area of an interventional
trauma operating room mock-up at the Foothills Medical
Centre (see Figure 7). This was measured by enacting
the same scenario before and after implementing design
changes and comparing bump data between the two
scenario enactments.14 Bumps are indicators of physical
congestion and potential contamination risk (particularly
when sterile and non-sterile items come into contact).
Design changes included relocating supply cabinets
for better access, as well as moving one wall, the ceiling-mounted articulating arms, and the surgical
table to increase available space.
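The 44 per cent figure is a straightforward before/after comparison. As an illustration, the bump counts below are hypothetical; only the percentage reduction is reported in the text.

```python
# Hypothetical bump counts; the framework reports only the 44% reduction.
def percent_reduction(before, after):
    """Percentage decrease from a before-count to an after-count."""
    return 100 * (before - after) / before

bumps_before, bumps_after = 50, 28  # hypothetical counts from the two enactments
print(f"{percent_reduction(bumps_before, bumps_after):.0f}% fewer bumps")  # 44% fewer bumps
```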
Photo courtesy of DIALOG©
Involving clinical teams in the simulations, and ultimately the design process, enhances their level of comfort
with providing patient care in new facilities.45 Team involvement also provides the opportunity to practice
and improve task performance, skills, and team work. For example, the time required to position a C-arm in
a hybrid operating room was decreased from four minutes, eight seconds to one minute, 20 seconds through
the participation of operating-room team members in four scenario enactments.46 This represents a two minute
and 48 second reduction in the time required to have a potentially life-saving diagnostic test performed.
Furthermore, participating in the simulations can engage or re-engage these teams in the design process.
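The reported time savings can be verified with quick arithmetic; the sketch below simply re-derives the reduction from the two quoted times.

```python
# Check the reported C-arm positioning times: 4 min 8 s before, 1 min 20 s after.
before_s = 4 * 60 + 8   # 248 seconds
after_s = 1 * 60 + 20   # 80 seconds

saved = before_s - after_s            # 168 seconds
minutes, seconds = divmod(saved, 60)  # split back into minutes and seconds
print(f"Reduction: {minutes} min {seconds} s")  # Reduction: 2 min 48 s
```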
Involving members of the public or patient and family advisors as participants in the simulations can
help foster and maintain patient-focused design. For example, seniors living in a supportive living facility
played the role of residents in a mock-up evaluation of a resident’s room (see Figure 5). Design specifications
that resulted from involving these members of the public included ensuring unobstructed turning radius of
a wheelchair on at least two sides of the bed, room reconfiguration options for alternate bed locations, and
views of the outdoors when both sitting and lying in bed. These design solutions are expected to enhance
the residents’ experiences.29
SUPPLEMENTARY EXPERTISE
It should be remembered that a simulation-based mock-up evaluation is only one of many tools both within
human factors and other areas of expertise that can be used to evaluate the built environment. Mock-up
evaluations do not negate the need to consider other methods (e.g., tabletop exercises, usability testing,
post-occupancy evaluations) as well as other areas of expertise (e.g., infection prevention and control, process
improvement, occupational health and safety, ergonomics) to optimize design. Engaging all appropriate
stakeholders and experts throughout the design process is necessary. Simulation-based mock-up evaluations
provide a venue for potential collaborations.
EVALUATION PLANNING
This section outlines important considerations for planning a simulation-based mock-up evaluation. This
includes what types of rooms would benefit most from a mock-up evaluation, when to start considering and
planning for an evaluation, determining the evaluation scope, and identifying the optimal point in the design
process to conduct the evaluation.
SPACES TO EVALUATE
Nationally, the Canadian Health Care Facilities Z8000 standard, published by the Canadian Standards Association, recommends building a mock-up for areas with multiple interrelated activities, complex traffic
flows, repetitive design in room types, complex projects, and for projects working within or next to an existing
facility to minimize disturbances to surrounding services.47 This standard provides requirements and guidance
for the planning, design, and construction of Canadian healthcare facilities. Although there is significant
variation in how these mock-ups can be used, this framework recommends considering a simulation-based
mock-up evaluation for:
• Room design templates, such as standardized (or highly repetitive) patient rooms used throughout a facility.
• Rooms with higher potential for adverse outcomes, such as those where operative or invasive procedures are performed.
• Highly technical spaces with complex design requirements, such as a patient room in an intensive care unit.
• Innovative room designs tailored around technological advancements, such as a hybrid operating room.
• Rooms that foster new team relationships, such as those bringing together inter-professional groups that have previously not worked together.
• Innovative or new work processes that need to be developed or reviewed, such as a change from centralized to bedside charting.
• Expensive rooms, with consideration given to both their building and operating costs, such as a hybrid operating room.
• Rooms with a known history of contributing to adverse events or worker injuries.
• Rooms where user resistance to the design is anticipated.
CONSIDERATION FOR AN EVALUATION
A capital program or project includes three stages:
1. Pre-design
2. Design
3. Construction/commissioning
In Alberta, the Health Facilities Capital Program Manual describes the responsibilities, accountabilities, and
processes for the planning and delivery of a capital program or project delivered by the Alberta government.48
Pre-design typically begins with an evaluation of existing spaces, often to address deficiencies or changes
in service delivery. This involves conducting a needs assessment that articulates clinical needs, or gaps,
between current conditions and desired outcomes and includes a preliminary risk assessment. If the needs
assessment is approved by government then a business case is prepared, which is a systematic process to
evaluate different options and develop a facility solution that meets the requirements identified in the needs
assessment. Information from the business case, if approved and funded as part of the capital plan, would
then be incorporated into the project management plan, functional program, and overall project schedule.
The functional program details the scope of services for a new facility that a project must address and
functional requirements that must be met. The functional program is used to provide instruction and
clarification during the design stage.
It is during the pre-design stage when a simulation-based mock-up evaluation should be considered for
inclusion in the design stage. If applicable, the business case and/or functional program would state that
consideration be given to conducting a simulation-based mock-up evaluation or that an evaluation be required
during the design stage.
GUIDING PRINCIPLE 1
A simulation-based mock-up evaluation should be considered, and if
applicable, planned, as part of the pre-design stage for inclusion in the
design stage.
If proceeding with a plan to conduct a simulation-based mock-up evaluation, the scope of the evaluation
(see page 12), design phase when the evaluation should occur (see Figure 3), time required to conduct the
evaluation (see Figure 2), and costs to build a mock-up (see Table 2) should be outlined during or just after
functional programming. Preparing for and conducting an evaluation typically takes between three and
six months, but will vary depending on whether video analysis is performed, the number of participants
in the simulations, the number of scenario enactments, and the time required to build a mock-up. Many of
the steps can occur simultaneously with other planning and design activities. Video analysis extends what
is learned through user feedback by providing evidence-based data that is more quantitative and objective,
and can be used to support or refute recommended design changes, resolve disagreement on the value of
particular design elements, or create a more detailed understanding of room use. This approach is discussed
in more detail in the video analysis section on page 27.
FIGURE 2: Approximate time requirements to prepare for and conduct a simulation-based mock-up evaluation
[Figure: a timeline spanning approximately 3-6 months overall, showing the evaluation steps in sequence: evaluation planning (page 10), designing and building a mock-up (page 14), simulation preparation (page 21), enacting simulation scenarios (page 25), user feedback and analysis (page 26), optional video analysis (page 27), and disseminating findings and recommendations (page 29). Indicated durations range from about 1 month for planning and 1-2 months for building a mock-up, down to 2 days for the scenario enactments and 1 day or more for dissemination.]
SCOPE OF THE EVALUATION
Evaluation objectives, which outline what will be tested for both design and outcomes, need to be identified.
These are often determined by the design team in collaboration with the evaluator or evaluation team. Together
the objectives form the intended scope of the evaluation and are critical for many subsequent steps in the
evaluation process. Evaluation objectives may include, but are not limited to, the assessment of:
 Unit configuration
 Room size
 Design or design feature comparisons
 Space requirements for equipment or processes
 Access to the patient and/or equipment
 Patient/family spaces and experiences
 Patient transport routes to and from the room
 Room configuration
 Furniture, fixtures, and equipment placement (e.g., headwall configuration, sink locations)
 Furniture, fixtures, and equipment usability
 Visibility of patient, monitors, supplies (e.g., alcohol-based hand rubs), and/or equipment
 Supply placement
 Adverse events, such as those pertaining to infections, patient handling, medication safety, falls, behavioral health, and security49
 Work flows and processes
 Team functioning/performance
TIMING OF THE EVALUATION
The design process follows a schedule, typically outlined in a project management plan, and produces a
design that incorporates functional requirements. The functional requirements are identified in a functional
program which describes the scope of services to be addressed by a project and are used during design to
provide instruction, scope clarification, project costing and estimated operating costs. The Royal Architectural
Institute of Canada outlines five phases of the design process.50 These include:
1. Schematic design phase – review program requirements and develop documents, which include floor plans, to illustrate the scale and character of the project as well as how the parts functionally relate.
2. Design development phase – further develop schematic design floor plans into drawings and other documents to outline architectural, structural, mechanical, and electrical systems, materials, and any other elements as appropriate.
3. Construction documents phase – further develop drawings and documents to prepare specifications and construction documents that detail the requirements for the construction of the project.
4. Bidding and negotiation phase – obtain bids or negotiated proposals, award, and prepare contracts for construction.
5. Construction phase – construct the project, which involves monitoring for progress and defects based on the interpretation of construction documents, potentially producing renderings to clarify specifications in construction documents, and preparing change orders and change directives as needed to modify or extend construction.
Commissioning, although not specifically listed as a design phase, is a quality-oriented process of inspection and
functional performance testing for all design requirements prior to occupancy. Development of a commissioning
plan starts during design (potentially following the schematic design phase). This often dovetails with
operational commissioning which details the clinical and non-clinical operational and move-in requirements to
relocate staff and patients into a facility. This includes activities such as orientation and training of staff, dry
runs, and testing of procedures and equipment. Operational commissioning starts during the pre-design
stage and ends with occupancy. Planning the move may begin early in the design process while the actual
move-in could be phased in over a set time period in order for the project to be fully operational.
Deciding when to conduct an evaluation should be based on predetermined evaluation objectives (i.e., the
evaluation scope) and may require that evaluations be performed at more than one phase in the design
process. Figure 3 illustrates the design phase(s) most applicable to various evaluation objectives.
FIGURE 3: Design phase(s) most applicable to various evaluation objectives
[Figure 3 is a matrix mapping each of the evaluation objectives listed above, from unit configuration through team functioning/performance, to the design phase(s) most applicable for its evaluation: schematic design, design development, and/or construction/commissioning.]
ITERATIVE DESIGN AND EVALUATION
Iterative design is a concept that involves testing, analyzing, and refining design through a cyclical process.
This process could involve conducting a series of simulation-based mock-up evaluations at various design
phases, each with differing evaluation objectives. For example, a mock-up to assess space requirements and
room size during the schematic design phase would be modified according to the evaluation findings. The
modified mock-up would then be re-evaluated to also assess equipment placement and usability during the
design development phase with findings again incorporated into the mock-up. A final re-evaluation of the
mock-up during construction/commissioning would then assess workflows and team performance. With
each modification, additional design details would also be incorporated into the mock-up to make the space
appear more completed and realistic for the subsequent evaluation. Iterative design concepts can have a
large effect on the design schedule, but depending on the project may be time well spent.
Multiple evaluations may also occur within a design phase if significant design modifications are made
following an evaluation (e.g., room re-configuration) that warrant subsequent re-testing. For example, if
the location of a patient’s bed was changed to a different wall, then re-evaluating the reconfigured design
is recommended. In some cases the need to plan for multiple evaluations may be more apparent; others are
dependent on the findings of preceding evaluations.
GUIDING PRINCIPLE 2
The mock-up evaluation should be thoroughly planned to maximize effectiveness.
The scope of the evaluation should be outlined during pre-design (during or just
after functional programming). It should include evaluation objectives, time and
costs required to build and evaluate a mock-up, and identify which phase within the
design stage the evaluation should occur. The evaluation should occur before
finalizing design decisions that the evaluation is intended to inform.
DESIGNING AND BUILDING A MOCK-UP
Determining the location to build a mock-up is important and should consider a variety of factors. The mock-up can be built within existing facilities, within the shell of a new facility, or even off site. Co-ordinating and transporting users who will participate in the simulations becomes much easier and more efficient when the mock-up is built within an existing facility from which users will be recruited. Using the shell of a new facility may make it easier to have physical space available to house the mock-up; however, construction hazards, timelines, and the risks associated with having individuals on an active construction site should be considered. Off-site facilities, such as a simulation centre or a warehouse, may be useful when users come from various geographic locations, or when space within an existing or new shell facility is unavailable. Other uses for the mock-up, such as a venue for design team meetings, should also be considered.
Mock-up fidelity, or the degree to which a mock-up is completed, can vary significantly. Levels of fidelity for
physical mock-ups include simple, detailed, and live (or functional) mock-ups.31 Real or mock-up furniture
and equipment can be used within the mock-up. If real equipment is not included in the mock-up, this will limit
the scope of the potential findings and likely exclude, for example, equipment usability and manoeuvrability.
Clinical users or a task analysis (described later, page 21), should be consulted to develop a comprehensive
list of items to be included in the mock-up. The list should be cross-referenced with the scenarios to ensure,
at a minimum, that all supplies and equipment (real or mock-ups) used in the scenarios are present.
Although mock-up fidelity may be affected by cost and time requirements, it is important that it aligns with
the appropriate design phase (see Table 3) and the evaluation objectives. For example, if one objective is
to assess the size of the room, including walls, it may not be necessary to include a ceiling or functioning
lights, and furthermore the evaluation should occur during schematic design before finalizing decisions
regarding room size. If, however, the location of a call bell is included in the evaluation scope, the planned
placement of the call bell needs to be identified in the mock-up and used in a scenario enactment. The placement
of fixtures is determined during the design development phase, and as such, evaluating placement should
occur during design development to inform this decision. Similarly, if the evaluation includes assessment of
the visibility of a patient monitor, then the mock-up and scenario needs to include a monitor with displayed
information, which can even be a matter of printing information on a sheet of paper that is then taped onto
a plywood monitor. Other equipment or items, which may block visibility of the monitor, also need to be
included. If visibility of the monitor is found to be problematic, solutions would likely involve repositioning
or relocating the monitor (and associated data and electrical outlets), relocating objects blocking visibility of
the monitor, or moving the work area of individuals viewing the monitor. Solutions involving the placement
of data and electrical outlets, and potentially relocating objects blocking visibility, need to be addressed
during the design development phase. Some solutions, however, could be incorporated later in the design
process to enhance visibility, such as adjusting monitor height, or using adjustable mounts, which could be
addressed during construction/commissioning.
TABLE 3: Mock-up fidelity most applicable for each design phase
Schematic design – Simple mock-up
Design development – Detailed mock-up
Construction/commissioning – Live (functional) mock-up
GUIDING PRINCIPLE 3
Building of the mock-up should align with evaluation timing and objectives.
The degree to which a mock-up is completed (mock-up fidelity) can vary significantly.
The mock-up should be built to an appropriate level of fidelity to enable testing of
evaluation objectives during the appropriate design phase.
SIMPLE MOCK-UPS
Description: Simple mock-ups can be as basic as using tape to indicate the location of walls and cardboard boxes to indicate equipment or cabinet locations (see Figure 4). Although economical with respect to the time and money needed to build, some simple mock-ups do not provide the same physical constraints as a real room. That is, participants and equipment can move beyond tape lines and through ‘walls’ during scenario enactments, which can make evaluation and analysis less precise, require more effort, and make findings more challenging to interpret. Another common approach is to construct walls with plywood or foam-core (see Figure 5). This approach more accurately assesses a specific room size or configuration. Real or mock-up furniture and equipment can be used to fill the space.
An alternative is to not have the wall locations indicated as part of the mock-up and measure how much
space is used during scenario enactments in the absence of physical limits; this is known as a functional
space experiment and has been successfully used by Hignett and colleagues.12 This technique then defines
the minimum-sized rectangle to encompass the flow of users during scenario enactments.
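As an illustration of the functional space idea, a minimal sketch follows (the function name and data format are hypothetical, not part of the framework). Assuming participant positions during an enactment have been captured as (x, y) coordinates in metres, for example from overhead video tracking, the smallest axis-aligned rectangle enclosing the traced movement can be computed:

```python
def functional_space(points):
    """Return (width, depth, area) of the minimum axis-aligned
    rectangle enclosing the observed movement positions.

    points: iterable of (x, y) positions in metres, e.g. sampled
    from overhead video of a scenario enactment.
    """
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    width = max(xs) - min(xs)   # extent along x
    depth = max(ys) - min(ys)   # extent along y
    return width, depth, width * depth

# Example: four positions traced during an enactment
w, d, area = functional_space([(0.5, 0.5), (3.0, 1.0), (2.5, 4.0), (1.0, 3.5)])
# w = 2.5 m, d = 3.5 m, area = 8.75 m^2
```

Note that an axis-aligned box is the simplest variant of a minimum-sized rectangle; finding the true minimum-area rectangle at an arbitrary rotation would require a more involved method such as rotating calipers.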
Uses: Simple mock-ups are most useful for evaluations early in the design process to evaluate space requirements because room size and configuration can be adjusted in both the mock-up and design plans. They can also help keep design teams from prematurely agreeing on a design because it is easy to provide several different alternatives using simple mock-ups.
When to use: Schematic design.
DETAILED MOCK-UPS
Description: Detailed mock-ups are finished to a greater degree compared with simple mock-ups and
may include flooring, a ceiling, furniture, light fixtures, the headwall with electrical and gas outlets, and
functioning millwork, as well as other details (see Figure 6). Vendors are often involved in the supply and
installation of larger pieces of equipment.
Uses: These are particularly useful to examine more detailed aspects of the design, such as headwall configurations,
architectural details, space and workflow efficiency, and usability issues with the design or placement of
equipment. Detailed mock-ups can also be used for staff orientation, training, or fundraising efforts.
When to use: Design development.
LIVE (FUNCTIONAL) MOCK-UPS
Description: Live mock-ups are fully functional rooms where actual patient care could (and might) occur
(see Figure 7). These typically involve renovating a room on a functioning unit before a major renovation
or use of a room after construction as part of commissioning.
Uses: These provide an opportunity to support operational commissioning through staff orientation and
training, test processes and communication channels, and assess how the room is set up and whether items
are conveniently located. Live mock-ups can also be used to evaluate design changes resulting from previous
mock-up evaluations. If actual patient care is being provided within the space, such as when renovating a
room for a pilot test, data can be collected over time. Data can include those related to patient safety, such
as adverse events (i.e., infections and patient falls) as well as patient and staff outcomes (i.e., length of stay
and workplace injuries). Data from neighbouring rooms or units can also be used for comparison.
When to use: Construction/commissioning.
FIGURE 4: SIMPLE (TAPED) mock-ups used for simulation-based evaluation
Medical Day Unit Pod and Systemic Preparation Room 41
Facility: Grande Prairie Regional Hospital, Grande Prairie, Alberta.
Location: Off-site in Queen Elizabeth II Hospital.
Description: Walls, millwork, curtains, and doors were marked out using tape on the floor. Furniture, equipment, and
supplies came from the hospital. Where real equipment could not be obtained, equivalently sized items were used.
Outlet locations were not identified.
Cost to build: $880 for both spaces.
FIGURE 5: SIMPLE (CONSTRUCTED) mock-ups used for simulation-based evaluation
Designated Assisted Living Resident Room 29
Facility: To inform Design Guidelines for Continuing Care Facilities in Alberta.51
Location: Off-site warehouse, Spruce Grove, Alberta.
Description: Walls constructed with wood studs and drywall. Cabinets, fridge, and wardrobes constructed with wood
but not made to be opened. Plumbing fixtures installed but not functional. Furniture and equipment supplied by local
facility and vendors. The locations of outlets were indicated by the covers.
Cost to build: $22,000 (initially built to be 30 m2 and was enlarged to 32 m2).
Cardiac Hybrid Operating Room 34
Facility: Mazankowski Alberta Heart Institute, Edmonton, Alberta
Location: On-site, within planned space.
Description: Walls and counters constructed with steel studs and foam core. Cabinets outlined with tape on floor.
Ceiling-mounted equipment (fully adjustable), surgical table, and computers constructed with wood. Equipment and
supplies from hospital. Outlet locations were not identified.
Cost to build: $25,000. Equipment mock-ups (i.e., articulating arms, surgical table) reused for multiple mock-ups.
FIGURE 6: DETAILED mock-ups used for simulation-based evaluation
Intensive Care Unit Patient Room 10
Facility: Foothills Medical Centre, Calgary, Alberta.
Location: On-site in neighbouring building.
Description: Walls, doors, cabinets, and counters fully finished. Lighting and call bell installed and operable. Articulating
arms installed. Equipment supplied by hospital. Gas, data, and electrical outlets installed but not connected.
Cost to build: $158,000.
Emergency Department Exam Room 39
Facility: South Health Campus, Calgary, Alberta.
Location: On-site in neighbouring building (former shipping and receiving area).
Description: Walls, doors, cabinets, and counters fully finished. Lighting installed and operable. Equipment supplied by
hospital. Gas, data, and electrical outlets installed but not connected.
Cost to build: $153,000 for four mock-up patient rooms (Acute Care Unit, Intensive Care Unit, Emergency Department,
Out-patient). $47,000 of equipment reused during construction of actual room.
FIGURE 7: LIVE (FUNCTIONAL) mock-ups used for simulation-based evaluation
Interventional Trauma Operating Room 14, 35
Facility: Foothills Medical Centre, Calgary, Alberta.
Location: Built and commissioned room.
Description: Fully built room tested as part of operational commissioning with high-fidelity simulator.
Cost to build: $6 million (full room build cost).52
EMS Ambulance Patient Compartment 38
Facility: Safety concept ambulance design.
Location: Simulation testing conducted throughout the province.
Description: Operable ambulance, equipped with patient simulator and eye movement tracking device.
Cost: One ambulance out of service for seven months.
SIMULATION ROLES AND RESPONSIBILITIES
Preparing for and conducting a simulation-based mock-up evaluation requires extensive collaboration among
diverse stakeholder groups. The National Interprofessional Competency Framework describes competency
domains required for effective interprofessional collaboration.53 To enhance role clarity, the roles and responsibilities for each person involved should be clearly defined. This includes identifying who will be
responsible for evaluation design, staging the mock-up, data collection, and data analysis as well as who
will participate in the scenario enactments. In some cases, multiple roles will be carried out by a single
individual; at other times a team of individuals will support a lead person for a particular role. The tasks
required before, during, and after the scenario enactments that should be assigned are listed (see Appendix 1)
and described in the following sections. A review of internal expertise available, such as expertise in human
factors or equivalent, will need to be considered to identify if individuals external to the organization should
join the evaluation team. Performing a dry run (or pilot test) can help to clarify roles and ensure all required
equipment and supplies are present.
GUIDING PRINCIPLE 4
Roles and responsibilities for those involved in the evaluation should be clearly defined.
This includes identifying who will be responsible for evaluation design, staging the
mock-up, data collection, and data analysis as well as who will participate in the
scenario enactments. Availability of expertise (e.g., human factors) should be assessed
to identify if individuals external to the organization are needed.
SIMULATION PREPARATION
Creating simulation scenarios
Simulation scenarios (see Appendix 2 for template) are detailed descriptions of the clinical and/or non-clinical
tasks to be performed by representative users (including patients and families) while interacting with the
mock-up of the built environment. The design of a scenario is important because it has an effect on what
can be discovered as part of the evaluation. For this reason, it is important that the tasks selected reflect
those that are the most frequent, urgent, and challenging tasks to be performed in the space, while also
testing the evaluation objectives listed in the evaluation scope. Challenging circumstances might include the
need to accommodate the maximum number of staff while ensuring continual access to medical equipment
and supplies.
A human factors specialist can conduct a task analysis (iv) to assist in developing appropriate scenarios, which
will result in a detailed understanding of how the tasks are performed in the space, as well as the equipment
and supplies used (which should also be included in the mock-up).
(iv) Task analysis can be described as the process of learning about ordinary users by observing them in action to understand in detail how they perform their tasks and achieve their intended goals.54
Creating simulation scenarios – continued
Details to consider:
 A task analysis helps to identify and select the most appropriate tasks for the simulation scenarios. It is typically accomplished through systematic observations of users interacting with a space, its environment, equipment, and related work processes. These observations should be conducted by an individual with expertise in human factors.
 Information from the task analysis should be complemented with information from other sources, including the functional program, design requirements of the room, user focus groups, as well as design team questions or concerns to be addressed. Safety reporting data and user interviews can also be used to identify the essential and safety-critical tasks, as well as contributing factors to adverse events (e.g., distractions, interruptions, multi-tasking, etc.) to be incorporated into the scenarios.
 The number of tasks and the number of scenario enactments selected for the evaluation will depend on the evaluation scope and range of tasks typically performed in the space. In some situations, the same scenario may be enacted more than once. This may be, for example, to test design or process alternatives, to increase staff engagement in situations where knowledge and tasks are expected to be learned based on experience using the room, to obtain a higher level of accuracy or precision in the results, and/or to allow for statistical analysis. The number of scenario enactments used to evaluate the mock-ups listed in Table 1 ranged from two to five, with an average of four.
 The number of tasks and scenario enactments planned will affect the time required to enact scenarios, the number of debriefing sessions, the time required of participants and observers, as well as the volume of information to be analyzed.
 It is important that the tasks selected result in reasonable time requirements for all individuals involved. Typically, one full day (or two half days) is dedicated to conducting the simulations. Half the time is used for performing scenario enactments and half (although often more) for structured debriefing to guide participant reflections.
 Depending on how long it takes to perform the selected tasks, the number of scenario enactments may be limited by the amount of time available.
 Representative users should provide feedback on draft versions of the scenarios with respect to realism and the degree to which they represent actual tasks expected to be performed in the space. Specific start and end points of the scenario should be identified.
One person (or sometimes a team) should be responsible for creating the simulation scenarios through collaboration with stakeholder groups representing the various users of the space; this person is the lead.
The lead could be a human factors specialist, simulation consultant, healthcare practitioner with appropriate
medical content expertise, or some combination. This person(s) will select appropriate tasks to be included
in the scenarios, identify roles required to enact the scenarios, and determine required equipment and supplies.
GUIDING PRINCIPLE 5
The simulation scenarios that are created and enacted should test the evaluation objectives.
Evaluating a mock-up involves selecting frequent, urgent, and challenging tasks
to create simulation scenarios that will test predetermined evaluation objectives.
The scenarios are enacted by users of the space within the mock-up, which includes
needed supplies and equipment (real or mock-ups).
Ethics and consent
Any project that uses information about or collected from individuals should go through an ethics review
process to ensure participants are respected and protected. An ethics review process is used to identify and
mitigate risks to participants in how information is collected and used in the project. For quality improvement
and evaluation projects an ethics review process can usually be accomplished by the project team and does
not require review by a Research Ethics Board unless local policies require it. Projects where there may be
significant risks to individuals will benefit from a review by someone outside the project team with knowledge
of ethical issues and mitigation strategies in that context. A pRoject Ethics Community Consensus Initiative
(ARECCI) has developed a process and supporting tools that project teams can use to complete an ethics
review for a non-research project.55
Details to consider:
 Participants should be informed about the nature of the evaluation, their role in the evaluation, how information will be used, and that they may withdraw participation at any time without consequence.
 Participants should also be told how their information will be used and be reassured that no personally identifying information will be revealed.
 It is recommended that participants be informed before scenario enactment so that alternate participants can be found if some choose not to continue.
 Participants may benefit from receiving both written (e.g., a short frequently asked questions sheet) and verbal information about the evaluation process. The least amount of personal information relevant to the evaluation should be collected. Avoid collecting personal identifiers (e.g., name, contact information) if possible.
 When participants can be identified through photos or videos used in the evaluation, written consent is strongly recommended. Consent to take and use photos and videos must clearly articulate how the images will be used (e.g., data analysis, presentation, publication, etc.). Obtaining consent after the evaluation for new uses of the photos and videos can be difficult and time consuming.
 Distributing, collecting, and tracking consent forms for all those who observe or participate in the scenario enactments should be managed by at least one person (although more assistance may be required).
Recruiting participants
A listing of roles required to enact the scenarios is used to recruit individuals (and potentially backups) to
participate in the scenario enactments. The person responsible for recruiting participants should be familiar
with the typical characteristics of the roles involved in the scenarios, and also with the knowledge and
experience levels of those individuals who might participate in the enactments. Those selected to participate
ideally are the actual planned users (or equivalent) of the space. For example, a trauma surgeon would
enact the role of a trauma surgeon. When the scenarios involve providing patient care, the simulated patient
can be a mannequin, patient simulator (with vital signs, clinical signs and symptoms controlled by a computer),
or confederates (patient and family advisors, members of the public, or actual patients).
Details to consider:
 Individuals should not participate in roles that do not match their job titles. Furthermore, those selected to participate in the scenario enactments should have had no prior involvement in the design process.
 Ideally, each scenario enactment should have different individuals participating. In other words, the same participants should not participate in multiple scenarios. This is to best represent the breadth of approaches possible between individuals while performing their roles. Challenges recruiting different people for each scenario enactment may preclude this as a viable option. Additionally, users’ level of experience should be considered when selecting participants as this affects user expectations and potentially performance. Select a range of users to include novice users with little or no experience/knowledge, occasional users with some previous experience, and expert users.56
 Recruiting individuals may require back-filling their actual positions on the day of the scenario enactments.
 Special consideration should be given to selecting individuals to participate as patient and family members. Selection should be based on how closely they represent typical patients who will receive care in that space. Various tasks (e.g., performing CPR) may require the use of a mannequin or patient simulator for some or all scenario enactments.
Staging the mock-up
Equipment and supplies are needed that will make the mock-up environment as realistic as possible. Essential
equipment and supplies will be listed with the written scenario scripts and in the task analysis (if conducted).
Realism in the mock-up overall will enhance the scenario enactments and influence both the interactions
and perceptions of those involved in the simulations. For example, device placement, movement, and use
can be affected by the presence and length of attached lines and tubes. Furthermore, the presence of tubes,
lines and electrical cables restricts the access and movement of people and equipment in tight spaces, which
is another important design consideration. Where possible, real equipment and supplies should be used;
however, using mock-up equipment with cardboard or printed screen shots for images on computer screens
may be necessary. When equipment mock-ups are used, tubes, lines and electrical cables should be included.
Training participants
Training in advance of the simulations should be provided for any new processes or equipment with which
participants may be unfamiliar. This will enhance accuracy in the way in which people interact with the
equipment and environment to allow for a more accurate assessment of the evaluation objectives. Vendors
may be best able to provide just-in-time training for some equipment or devices. Clinical educators or other
staff members should provide training for new or altered processes.
Setting up an observation room
Permitting and encouraging stakeholders, particularly design team members, to observe the scenario enactments lets them see first-hand the issues that emerge, and can therefore accelerate decision-making and minimize the time required to make design changes.
Details to consider:
 Other than the simulation participants (as required by the scenario scripts), only the simulation director and individual(s) taking photos and videos should be present in the mock-up during scenario enactments. Any additional observers in the mock-up during the enactments would affect work flow and use of the space.
 Streaming live video feeds of the scenario enactments into an adjacent observation area is essential to minimize any distraction caused by observers. In some cases, observers can watch through a window into the mock-up space.
ENACTING SIMULATION SCENARIOS
Providing pre-briefing and scenario enactment instructions
Participants are provided instructions and a pre-briefing just before each scenario enactment (see Appendix
3 for a sample script). The pre-briefing is used, in part, to create a psychologically safe environment where
participants feel comfortable providing open and honest feedback, and to augment their understanding of
what is most relevant or critical with respect to the evaluation objectives. The script generally outlines
background information, such as the purpose of the evaluation and an introduction to the space, tasks
included in the scenario, roles and consent. The script also reinforces that the focus of the evaluation is on
the design of the space, and not on individual performance. The person responsible for the pre-briefing and
scenario instructions needs to be familiar with the scenarios, evaluation objectives, and methods.
Simulation participants might be asked to verbalize their thoughts as they enact the scenarios; this is called
a think-aloud protocol.57 In group settings, this is sometimes referred to as constructive interaction. This enables
simulation participants to provide feedback mid-scenario as they encounter difficulties and reduces reliance
on human memory when gathering feedback during the debriefing sessions after each scenario enactment.
With this approach, instructions should be provided to participants before each scenario enactment, including
what types of thoughts should be verbalized as well as a demonstration of how to think aloud. As an example,
the person providing the pre-brief could play the role of a nurse hooking up an intravenous pump and
verbalize a need to plug in the pump but that there are no conveniently located electrical outlets.
Directing the scenario enactments
Scenario enactments are guided by the simulation director, typically a person familiar with the scenarios
and with the clinical procedures and processes. The simulation director
ensures participants follow scenario scripts and that the scenario starts and ends as planned. The simulation
director will also ensure that there are no observers in the room during scenario enactments.
Operating the patient simulator
If a high-fidelity patient simulator is used, the scenario will need to be programmed into the simulator and
be operated by an individual with this expertise.
Taking photos and videos
Multiple video cameras can be used to capture all activities within the room and to allow for backup recordings
in the event of unanticipated video equipment failures. Two or three people are usually responsible for taking
photo, video, and audio recordings during the scenario enactments. Pictures will assist with both analysis
and presenting findings (see Appendix 4 for suggestions). Video angles should capture full-room use, while
also capturing task-specific detailed usage data, particularly if video analysis is to be performed. This will
require a variety of video cameras and mounts (see Appendix 5 for suggestions). Although there are high-tech
alternatives for synchronizing multiple video cameras for data analysis and video editing/production,
this can easily be done using a clapperboard or by simply clapping one's hands to provide a visual and audio
marker, which ideally will be captured by all video cameras.
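If recordings are later aligned in software, the clap marker can also be located automatically as the loudest transient in each camera's audio track. A minimal sketch of this idea in Python is shown below; the function names, data layout, and sample rate are illustrative assumptions, not part of this framework.

```python
import numpy as np

def clap_sample(audio: np.ndarray) -> int:
    """Index of the loudest sample - assumed to be the clap marker."""
    return int(np.argmax(np.abs(audio)))

def trim_offsets(tracks: dict, sample_rate: int) -> dict:
    """Seconds to trim from the start of each recording so the clap
    occurs at the same instant in every camera's video."""
    peaks = {cam: clap_sample(a) for cam, a in tracks.items()}
    earliest = min(peaks.values())
    return {cam: (p - earliest) / sample_rate for cam, p in peaks.items()}
```

For example, if one camera's clap falls 300 samples after another's, that recording would be trimmed by 300 divided by the sample rate; in practice, a clearly audible clap captured by every camera is what makes this step reliable.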
Recording audio
Digital audio recorders are increasingly advanced and inexpensive. They can be used to record activities,
supplementing or backing up the video recordings. They can be mounted in the space or hung
from the ceiling in an unobtrusive fashion. Lapel microphones on key participants can enhance the quality
and accuracy of audio recordings.
DATA COLLECTION AND ANALYSIS
User feedback and analysis
The debriefing is intended to guide participants' reflections on their experiences within the space after each
scenario enactment is completed. When done skillfully, participants are led through reflection on the
tasks that occurred during the simulation in an organized and thoughtful way; debriefing helps structure
this reflective process. Semi-structured interview questions are recommended for debriefing participants:
a predetermined list of questions is asked, while retaining the flexibility to explore
additional topics or questions as they arise. The architect can be a valuable addition to the participant
debriefing sessions. Importantly, the architect’s role is to provide information requested by participants
and not to critique or refute feedback. Conversations with the architect prior to the debriefing can help set
the expectation that the goal is to generate feedback and suggestions. Having the architect present during
debriefing also can speed up the implementation of participant feedback as the architect hears their suggestions first-hand.
As part of the debriefing session, recommended design modifications can be captured on sticky notes and
placed on mock-up walls, on a large floor plan, or in fact anywhere to indicate the location of proposed
changes to the placement of fixtures, equipment, or supplies. Photos of various items can be used instead of
sticky notes to make their placement more obvious to participants during the scenario enactments, and are
easily movable. Recommended placement can be recorded after each debriefing session by photographing
where items were placed. Placement can continue to be adjusted as needed after each scenario enactment
and then be re-photographed. Asking for the rationale behind recommended placement of various items
can help resolve discrepancies in opinions and identify relevant data that can be extracted from the videos if
further support or analysis is needed.
While participants are debriefing, a separate session should be conducted with all observers. A separate
session ensures the participants can be candid in their observations, uninfluenced by previous design decisions
or by the rationale behind those decisions. This encourages thinking from a new perspective when identifying
design issues or recommending design alternatives. Moreover, the observers are often supervisors of the
participants; separate debriefing sessions further ensure participants feel comfortable providing open feedback.
Details to consider:
 Two debriefing facilitators are likely required: one to debrief the simulation participants and one to
debrief the observers. Both should be trained facilitators who are comfortable leading debriefing sessions
and focus groups.
 Having additional support to take notes on the feedback provided is helpful. One or both of the debriefing
facilitators should have experience with qualitative analysis methods so they can identify themes, summarize
concerns from the feedback provided, and extract or identify potential solutions to mitigate latent conditions
or hazards.
 Additional feedback may be sought beyond the debriefing sessions. This may be useful, for example, if the
scenarios include a large number of participants and there is a concern that not all opinions will be heard,
to provide quantitative metrics with measurable data for statistical testing of user acceptance, or to further
assess aspects of the evaluation (e.g., engagement, realism).
Video analysis
The amount of data collection and analysis will depend on a number of factors, including evaluation objectives,
timelines, and availability of expertise and resources. Various data collection and analysis methods for
consideration are described next. If video analysis is planned, an individual with experience using a variety
of human factors methodologies should be engaged. Video analysis involves behavioural coding, analysis,
and interpretation; video editing experience will enhance the presentation of findings. In addition to analysis,
this individual should be involved early enough in the design process to participate in discussions about the
evaluation objectives and to develop appropriate metrics to assess them.
Video coding
Video recordings of each scenario enactment should be reviewed independently by two or more individuals,
if possible, and coded against criteria that assess the evaluation objectives (see Appendix 6 for template). Common
coding categories include adjustments made to equipment or monitors, bumps between people and/or equipment,
excessive reaches, participants searching for equipment or supplies, line snags, visibility issues, and other
usability issues encountered. Usability issues typically include participant verbalizations. The data from
all coders are merged; duplicates are removed and discrepancies are resolved through consensus, typically
after reviewing the video segment again.
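The merge step can be mechanized when each coder's events carry a timestamp and category. A sketch follows; the record fields and the two-second tolerance are assumptions for illustration, not values prescribed by this framework.

```python
def merge_coding(coder_a, coder_b, tolerance_s=2):
    """Merge two coders' event lists. Events with the same category
    within tolerance_s seconds are treated as the same observation;
    events seen by only one coder are flagged for consensus review."""
    merged = list(coder_a)
    review = []
    for ev in coder_b:
        duplicate = any(
            ev["category"] == m["category"]
            and abs(ev["time_s"] - m["time_s"]) <= tolerance_s
            for m in coder_a
        )
        if not duplicate:
            merged.append(ev)   # only coder B recorded this event
            review.append(ev)   # re-watch this video segment together
    return merged, review
```

Running the comparison in the reverse direction would similarly flag events only coder A recorded; both lists of discrepancies are then resolved by reviewing the video segment again.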
How the data are used will depend on the evaluation objectives, but most commonly involve identifying
patterns within pertinent categories. For example, if room size is being assessed, data pertaining to bumps
(e.g., areas of congestion within the room), adjustments made to equipment or monitors (e.g., if things were
frequently being moved out of the way), and verbalizations (e.g., comments pertaining to room size) may
be relevant. In contrast, assessments of equipment or supply placement may focus more on bumps (e.g.,
frequency of bumps into a particular item), excessive reaches (e.g., to obtain or use the item), and searches
for an item (e.g., which may indicate it is not intuitively or conveniently located).
Link analysis
Link analysis is used to identify and represent links (or relationships) and to determine the nature, frequency,
and importance of those links.58,59 The Center for Health Design recommends this tool for use during design to
support decision-making at varying levels of design detail.60 Link analysis involves using lines on a diagram
(potentially an architectural drawing) to indicate where individuals involved in the scenario enactments moved
from and to. Movement and room utilization are more accurately depicted by transcribing the actual paths
of individuals onto an overhead view or architectural drawing of the space (see Figure 8). This may also be
referred to as a spaghetti diagram. It typically involves watching the videos of the scenario enactments and
drawing the links (lines) by hand or onto an electronic file (e.g., using Microsoft PowerPoint). Technology-based
options, such as use of a real-time locating system, can automate the transcription process. Using a
centrally located ceiling-mounted video camera, ideally with a wide-angle or fish-eye lens to capture the entire
room, makes this step more efficient. Some scenarios or rooms may require multiple camera angles to fully
capture all movements through the space. The resulting diagram can be used to visualize motion patterns,
high-traffic areas of congestion, inefficiencies in staff workflow, and use of space by individuals and teams,
or across teams.
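Once paths have been transcribed as (x, y) coordinates, simple derived measures fall out of the same data, such as distance walked per role and a count of how often paths cross each area of the floor. The sketch below illustrates this under assumed units (metres) and an assumed grid-cell size; it is not a method named by this framework.

```python
import math
from collections import Counter

def path_length(points):
    """Total distance walked along one transcribed path."""
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

def traffic_grid(paths, cell=1.0):
    """Count path points falling in each floor-grid cell; high counts
    flag candidate congestion areas worth reviewing on video."""
    counts = Counter()
    for pts in paths.values():
        for x, y in pts:
            counts[(int(x // cell), int(y // cell))] += 1
    return counts
```

The grid counts only suggest where to look; the spaghetti diagram and the video itself remain the evidence for any design recommendation.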
FIGURE 8: Link analysis from a simulation-based mock-up evaluation of an interventional trauma
operating room (see also Figure 7).11 Legend: surgeons/resident, anesthesiologist, circulating nurses,
scrub nurses, anesthesia respiratory therapists, diagnostic imaging team.
Bump analysis
Conducting a bump analysis involves coding (generally as part of video coding as previously described)
physical contact between two objects – people and/or equipment – that were not intended to make contact.
Typically, this also includes coding what bumped into what and, potentially, whether each object was
considered sterile. The location of each bump can be plotted onto an architectural drawing of the space
and typically overlaid onto the link analysis diagram (see Figure 9). Combining data from a link and bump
analysis shows interactions between people and equipment, and can identify areas of physical congestion
and risk of contamination within sterile/clean areas. Data can be further examined to see which items are
most frequently bumped, to assess whether design modifications could reduce the frequency of bumps.
FIGURE 9: Bump analysis added to link analysis.11
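Coded bump records can then be tallied to see which items are struck most often and which bumps threaten sterility. A minimal sketch, with illustrative (assumed) record fields:

```python
from collections import Counter

def bump_ranking(bumps):
    """Items ranked by how often they were struck."""
    return Counter(b["struck"] for b in bumps).most_common()

def contamination_risks(bumps):
    """Bumps in which either object involved was considered sterile."""
    return [b for b in bumps
            if b.get("bumper_sterile") or b.get("struck_sterile")]
```

A frequently struck item near the top of the ranking is a candidate for relocation; any bump involving a sterile object warrants review with infection prevention and control.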
GUIDING PRINCIPLE 6
Recommendations should be informed by evidence-based data from scenario enactments.
Evidence-based data, collected through user feedback and video analysis, are used to
identify potential issues and successes with the planned design. The recommendations
that are developed should address any identified issues.
DISSEMINATING FINDINGS AND RECOMMENDATIONS
Deliverables from a simulation-based mock-up evaluation often include a written report and/or presentation
to the design team, accompanied by a listing of recommendations (see Appendix 7). Identified issues (and
successes) with the planned design are often evidenced through pictures, video clips, user feedback, and/
or video analysis data. After the findings are shared, the recommendations are reviewed by the design and
evaluation teams to decide which recommendations will be implemented or require further investigation.
The teams also assign responsibility for implementation and investigation as applicable.
Details to consider:
 The identity of participants in the pictures and videos should be protected by using editing software to blur faces unless permitted through explicit consent.
 The dissemination plan should also include sharing the results with simulation participants and stakeholder
groups, and should consider broad distribution (program, site, provincial, national, or international presentations; journal publication; etc.) to promote knowledge transfer of lessons learned.
TRACKING AND EVALUATING IMPLEMENTED RECOMMENDATIONS
Tracking which recommendations were planned, which were actually implemented, and what effect they
had provides many benefits. Initially it provides a mechanism for feedback to the evaluation and design
teams. It also supports evidence-based design for use in future designs and the generation of guidelines,
and produces measurable outcomes to assess the overall value of conducting the simulation-based
mock-up evaluation.
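Where recommendations are kept in a structured list (as in the recommendations template in Appendix 7), this tracking reduces to a tally over the decision and implementation fields. A sketch, with field names assumed for illustration:

```python
from collections import Counter

def implementation_summary(recommendations):
    """Tally recommendations by decision-to-proceed status and report
    how many accepted ones have actually been implemented."""
    by_decision = Counter(r["decision"] for r in recommendations)
    accepted = [r for r in recommendations if r["decision"].startswith("Yes")]
    implemented = sum(1 for r in accepted if r.get("implemented"))
    return by_decision, implemented, len(accepted)
```

The resulting counts give the evaluation and design teams a quick measure of uptake to report back alongside the qualitative findings.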
Details to consider:
 Collecting data to measure the performance of the built environment more globally, known as a
post-occupancy evaluation, should be considered. This often includes the use of interviews, surveys,
observations, and/or physical measurements. Assessing the effect of implemented recommendations can
be a subcomponent of the post-occupancy evaluation.
 Findings from the post-occupancy evaluation should also be disseminated.
SUMMARY
The methodology described in this framework ensures
simulation-based mock-up evaluations are guided
by principles that have proven effectiveness in many
healthcare settings in Alberta, and elsewhere, to
bring about evidence-based design decisions. The
meaningful participation of all user groups early in the
design of healthcare environments can significantly
improve the end result – creating safer, higher-quality
spaces that enhance the overall patient experience
as well as the experience for all users of the space. In
addition, latent conditions which contribute to adverse
events for patients, families, and staff are removed, conflicts between equipment and people (i.e., bumps)
are reduced, and functional and cost efficiencies are achieved. A simulation-based mock-up evaluation is
one of many tools both within human factors and in other areas of expertise that can be used to evaluate
the built environment. However, it is unique in that the evaluation process allows design teams to test,
refine, and discover new design concepts as part of the design process based on anticipated processes and
procedures. Furthermore, it facilitates meaningful engagement of planned users of the space into the design
process. Other methods, such as tabletop exercises, usability testing, and post-occupancy evaluations, can
be combined with mock-up evaluations and with other areas of expertise, including infection prevention and
control, process improvement, occupational health and safety, and ergonomics to evaluate and improve
design. Engaging all appropriate stakeholders and experts in a collaborative process improves design outcomes and ultimately patient care.
APPENDICES
The appendices are intended to assist in conducting simulation-based mock-up evaluations. It is important
to note, however, that the information and templates contained in the appendices will likely need to be
modified on a project-by-project basis to best reflect the requirements of the design team and the specific
evaluation objectives. The documents in the appendices can be downloaded from hqca.ca/humanfactors.
Appendix 1: Listing of evaluation tasks
Appendix 2: Simulation scenario template
Appendix 3: Sample pre-briefing script and instructions
Appendix 4: Photography suggestions
Appendix 5: Videography suggestions
Appendix 6: Data analysis spreadsheet template
Appendix 7: Recommendations template
APPENDICES COPYRIGHT
The Health Quality Council of Alberta holds copyright for the documents in the Appendices. The Appendices are licensed
under a Creative Commons “Attribution-NonCommercial 4.0 International” license:
http://creativecommons.org/licenses/by-nc/4.0/
For non-commercial purposes, you may copy and redistribute the material in any medium or format as well as remix,
transform, and build upon the material.
Attribution is requested if content from the Appendices is used as a resource or reference in another work that is created.
To reference the documents, please use the following citation:
Shultz J. Simulation-Based Mock-up Evaluation Framework. Calgary, Alberta, Canada: Health Quality Council of Alberta;
March 2016.
Please contact the Health Quality Council of Alberta for more information: [email protected], 403.297.8162.
APPENDIX 1: LIST OF EVALUATION TASKS
Simulation preparation
 Create simulation scenarios
 Obtain ethics approval and consent
 Recruit participants
 Stage the mock-up with furniture, equipment and supplies
 Train participants
 Set up an observation room
Enacting simulation scenarios
 Provide pre-briefing and scenario-enactment instructions
 Direct the scenario enactments
 Operate the patient simulator
 Take photos and videos
Data collection and analysis
 Debrief with scenario participants and observers for feedback and analysis
 Conduct video analysis
 Develop findings and recommendations
APPENDIX 2: SIMULATION SCENARIO TEMPLATE
SCENARIO [INSERT #]
SIMULATION-BASED MOCK-UP EVALUATION OF [INSERT ROOM NAME]
Date:
Background [insert appropriate background information about the room and rationale for the evaluation]
Stakeholders
Project sponsor: [insert name and contact information]
[list stakeholders, including the name of the group being represented and the name and contact information
of the individual representing the group]
Scenario [insert written description/overview of the scenario]
Scenario tasks: [list tasks in chronological order]
Evaluation objectives: [list evaluation objectives associated with each task and how they will be evaluated]
Notes: [for example, confederate instructions to introduce a distraction]
Scenario requirements
Location: [insert address and directions]
Participants: [list all roles directly involved in this scenario and the name of the individual enacting each role]
Equipment required: [list all equipment involved in this scenario and name of the individual responsible
for bringing each piece of equipment]
Supplies required: [list all supplies involved in this scenario and name of the individual responsible for
bringing each supply]
Other requirements: [includes room configuration requirements, make up/moulage needed, etc.]
Pre-scenario training required [list all training needed prior to enacting scenarios including training
provided in-house or by vendors]
Out of scope [list things that will not be included in the evaluation scope]
APPENDIX 3: SAMPLE PRE-BRIEFING SCRIPT AND INSTRUCTIONS
PRE-BRIEFING SCRIPT: SIMULATION-BASED MOCK-UP EVALUATION OF [INSERT ROOM NAME]
Date:
Welcome Participants [Provide the same instructions for each scenario enactment]
Thank-you
Thank you for taking part in the [insert room name] simulation-based mock-up evaluation. Your participation in the mock-up evaluation will help inform the design of this room.
Safety briefing [insert safety briefing information as applicable to both the mock-up room and the site where the mock-up is located. This will include site safety rules, known hazards such as tripping hazards from electrical cords, evacuation arrangements, and locations of drinking water and sanitary facilities. If the mock-up is located within a construction site, this
information may be presented by site management.]
Introductions I would like everyone to introduce themselves.
Schedule
You should have received a schedule and questionnaire upon arrival. Please return the completed questionnaire before you leave.
Consent forms
I hope everyone has reviewed and signed consent forms. As previously explained, one consent form shows that you give your permission to participate; the other form says that
you agree to permit evaluators to take photos and videos during the scenario enactments
and share them with [list all uses] to best communicate specific room design issues and recommendations. If you have not signed the consent forms, please come see me. These consent forms must be signed before anyone can participate.
Study Instructions
Background
The purpose of this simulation-based mock-up evaluation is to help inform the design of the new [insert room name]. [Insert appropriate background information about the room and rationale for the evaluation].
Scenarios
As part of this evaluation, you will be asked to participate in a number of scenario enactments
that are expected to commonly occur in this room. The focus of the study is NOT YOUR
PERFORMANCE but the adequacy and use of space within the room for the patient and
healthcare professionals. Before each enactment, the scenario will be read aloud, and
participants will be reminded if they miss any of the tasks in the scenario.
Think aloud
While enacting the scenario, we ask that you think aloud. When you think aloud, you state
out loud what comes to mind as you enact your role. For example, if I were enacting the role of a nurse in the scenario and was hooking up an IV pump that needs to be plugged
in, I might say that I need to plug in the pump and that there are no conveniently located electrical outlets. There are no right or wrong comments, so please speak freely. If you
forget to state your thoughts out loud, I may occasionally prompt you to continue to do so.
Debrief
If you forget to mention something during a scenario, you will also be given an opportunity
to have a reflective conversation through debriefing at the end of each scenario. For
those involved in the scenario, I will be conducting the debriefing. For those not directly
involved in the scenario, [insert name of debriefing facilitator] will be conducting the
debriefing session.
Questions
Before we start with the scenario enactments, does anyone have any questions?
Thanks
Once again, I’d like to thank each one of you and next we will discuss the scenario which will be enacted.
APPENDIX 4: PHOTOGRAPHY SUGGESTIONS
Photos to take
 Full room – aerial view (with no people)
 Full room – panoramic views (with no people)
 Room set-up before and after the scenario enactment
 All equipment, including close-ups of any interfaces
 All outlet configurations
 Post-it note placement (in the mock-up or on a floor plan) after each debriefing session
 Each participant (for identification when conducting link analysis, etc.)
 Ongoing photos during the scenario enactments
Camera equipment
 Digital camera(s)
 Extra batteries and battery charger(s)
 Memory card(s)
 Photo editing software
 Long electrical extension cords
 Tripod(s)
APPENDIX 5: VIDEOGRAPHY SUGGESTIONS
Video camera angles
 Full room – aerial view (ideally with one video camera, potentially with a fish-eye
lens, but may require multiple video cameras)
 Video camera focusing on each area of use (e.g., medication preparation area,
anesthetic area/head of the bed). Consider anticipated locations of people and
equipment to minimize any obstructions they may cause given the planned video angles.
Recording test videos prior to the scenario enactments and reviewing them on a computer
is important to ensure video camera placement will sufficiently capture all desired angles
with acceptable video and audio quality.
Video camera equipment
 Multiple video cameras
 Memory cards
 Wide-angle lenses
 Batteries and chargers
 Long electrical extension cords
 Lapel microphones
 Suction mounts
 Clamp mounts
 Flexible and fixed extension arms for mounts
 Tripods
 Computer capable of editing large video files
 Video analysis software (e.g., Microsoft Excel, NVivo, Noldus)
 Video editing software
APPENDIX 6: DATA ANALYSIS SPREADSHEET TEMPLATE
The data analysis spreadsheet is often customized to support analysis. For example, when the category of
bump is selected, the evaluator may include additional columns in the spreadsheet to identify the ‘bumper’
and ‘bumpee’, and may also further specify if those objects were sterile or not. In some cases it may be helpful
to number the bumps on the architectural drawing and in the spreadsheet to allow for cross-referencing.
ROOM: [INSERT ROOM NAME]
SCENARIO: [INSERT SCENARIO # AND NAME]
Cameras: C1: Aerial | C2: Head of bed | C3: Surgical area | C4: Foot of bed

CAMERA | ELAPSED VIDEO TIME | NOTES | CODER | CATEGORY | COMMENT
[camera] | 00:00:00 | SCENARIO START | | |
[camera] | [time] | [insert description/observation] | [initials] | [see below] |
[camera] | 00:30:00 | SCENARIO END | | |

SAMPLE DATA ANALYSIS SPREADSHEET ENTRIES
CAMERA | ELAPSED VIDEO TIME | NOTES | CODER | CATEGORY | COMMENT
C4: Foot of bed | 00:03:24 | Circulating nurse bumps head on ceiling-mounted monitor while prepping back table with OR instruments | JS | Bump |
C4: Foot of bed | 00:03:50 | Nurse says there is lots of room at the end of the OR table to gown up | SB | Beneficial |
C4: Foot of bed | 00:05:14 | Circulating nurse must move the equipment carrier behind the anesthetic machine to make space for the case cart to be brought over | JS | Congestion |
C1: Aerial | 00:05:38 | Nurse suggests moving or rotating OR charting desk for more adaptable space use | SB | Suggestion |
C3: Surgical area | 00:07:20 | Anesthesia respiratory therapist squeezes between patient and IV pump | JS | Access |
C3: Surgical area | 00:07:45 | Nurse has difficulty removing arm extenders from OR table | JS | Usability |
C4: Foot of bed | 00:07:55 | Nurse trips on sponge bucket | SB | Tripping hazard | good angle for highlight video
C2: Head of bed | 00:08:10 | Anesthesiologist searches through 2 drawers of drug cart | SB | Searching |
C2: Head of bed | 00:08:23 | Anesthesiologist reaches over IV tubing to deliver medication in patient's arm to avoid walking around equipment | SB | Excessive reach |
C3: Surgical area | 00:08:50 | Anesthesiologist asks for the site rite (ultrasound) for the 2nd time | JS | Communication |
C1: Aerial | 00:40:48 | DI tech moves monitor to bring in C-arm | JS | Adjustment |
C3: Surgical area | 00:41:03 | Lines resting on C-arm are pulled when rotating C-arm | SB | Line snag |
C3: Surgical area | 00:43:15 | Nurse lifts bed drapes to watch lines while moving the C-arm | JS | Visibility |
CATEGORY DEFINITIONS
Adjustment: Adjustment made to equipment or monitor.
Access: Equipment needed that is not easily accessible.
Beneficial: Positive design feature noted by participant or evaluator.
Bump: Physical contact between two objects (people and/or equipment) that were not intended to make contact.
Communication: Unsuccessful attempt to communicate (i.e., lack of response, demonstrated confusion, repeated instructions, multiple simultaneous communications, or disruptions due to environmental noise).
Congestion: An object (person or equipment) is in the way.
Excessive reach: Accessing something beyond one’s ‘reach envelope’, which is the length of an extended arm.
Line snag: Unintentionally applying force to a line (IV) being used as part of patient care.
Cord/cable snag: Unintentionally applying force to a cord (for power supply or patient monitoring) being used as part of patient care.
Searching: The location of a supply or equipment is unknown to an individual needing it.
Suggestion: Verbalized comment or room design/equipment observation.
Tripping hazard: Object (person or equipment) required to move over another object, cord, or line.
Usability: Difficulty using a computerized technology, equipment, packaging, or data entry devices.
Visibility: Needing to see something (i.e., patient, monitor, equipment, etc.) that is not in view of the individual needing to see it.
APPENDIX 7: RECOMMENDATIONS TEMPLATE
SIMULATION-BASED MOCK-UP EVALUATION OF [INSERT ROOM NAME]: RECOMMENDATIONS
Last updated: DD MMM YYYY

# | Recommendation | Priority | Capital Cost | Operating Cost | Decision to Proceed | Operational Owner | Comments
[insert name to cluster recommendations by area or recommendation type, e.g., Bathroom]
1 | [insert recommendation] | High | High | High | Yes - to be incorporated | | [include recommendation modifications, added details, alternative solutions, plans to follow up, or implementation progress]
2 | [insert recommendation] | Medium | Medium | Medium | Yes - already incorporated | |
3 | [insert recommendation] | Low | Low | Low | Yes - but modified | |
[insert name to cluster recommendations by area or recommendation type, e.g., Articulating Arm]
4 | [insert recommendation] | | N/A | N/A | No | |
[insert name to cluster recommendations by area or recommendation type, e.g., Communication]
5 | [insert recommendation] | | | | Follow-up | |
ACKNOWLEDGEMENTS
PR O JEC T TE A M
Project team members were instrumental in developing the evaluation methodology, brought expertise from
prior simulation-based mock-up evaluations, and assisted in crafting the framework.
 Bev Knudtson BID EDAC, Guidelines & Evaluation Specialist, Alberta Health Services (AHS)
 Jason Laberge MSc, Manager of Human Factors, AHS
 Mirette Dubé RRT MSc CAE, Simulation Consultant – Foothills Medical Centre Lead, AHS
 Steve Fowler BA, Director, Government Integration, AHS
 Susan Biesbroek MSc, Human Factors Specialist, AHS
 Trevor Peter PEng, Director – Northern Region, Health Facilities Branch, Alberta Infrastructure
W O R K I NG G R OU P
Working group members contributed their knowledge during stakeholder meetings and/or provided feedback
on draft versions of the framework.
• Byron King BSc BPE, HQCA Patient/Family Safety Advisory Panel Member
• Christine Vis BSN RN, Manager, Trauma Services and PCU 44 Trauma/Surgery, AHS
• David Baker PEng, Director – Health Capital Planning, South, Health Facilities Branch, Alberta Infrastructure
• David Crocker BSc, Director of Operations, Calgary Foothills Primary Care Network
• Deanna Picklyk BA MSW RSW, Director of Engagement and Patient Experience, AHS
• Deb Maerz BScOT MBA, Clinical Liaison, Major Capital Projects, Edmonton Zone, AHS
• Doug Kitlar, Director of Facility Management, GEF Seniors Housing
• Greg Hallihan BA(Hons) MASc, Human Factors Program Manager, W21C
• Jan Davies MSc MD FRCPC, Professor of Anesthesia, University of Calgary
• Janice Cullen BSc MHSA, Director, Strategic Capital Planning – Calgary Zone, AHS
• Joan Rooke RN BN, Clinical Liaison, Major Capital Health Care Projects, Calgary Zone, AHS
• Joanne Ganton BComm, Executive Director – Engagement & Patient Experience, AHS
• Ken Rea Architect AAA MRAIC Dipl Arch Tech BDT BArch, Associate, DIALOG
• Paul Perschon MA, Manager – Process Excellence, AHS
• Robin Snell AAA OAA MRAIC LEED AP EDAC, Principal, Parkin Architects Ltd; Vice-Chair, CSA Technical Committee for Health Care Facilities
• Sue Barnes RN BScN CPN(C), Simulation Consultant, AHS
• Terry Qaqish BA(Hons), Consultant, Process Improvement, Lean Six Sigma Black Belt, Covenant Health
SIMULATION-BASED MOCK-UP EVALUATION
EXTERNAL EXPERT REVIEWERS
• Ellen Taylor PhD(c) AIA MBA EDAC, Director of Research, The Center for Health Design
• Anjali Joseph PhD EDAC, Spartanburg Regional Healthcare System Endowed Chair in Architecture + Health Design; Director, Center for Health Facilities Design and Testing; Associate Professor of Architecture, Clemson University
• Craig Zimring PhD EDAC, Director, SimTigrate Design Lab; Professor, College of Architecture, Georgia Institute of Technology
• Sue Hignett PhD, Professor of Healthcare Ergonomics & Patient Safety, Loughborough University
• Tony Easty PhD PEng CCE, Professor, Institute of Biomaterials & Biomedical Engineering, University of Toronto
ADDITIONAL ACKNOWLEDGEMENTS
Individuals from across Canada with expertise in healthcare and/or human factors offered insights and
feedback on draft versions of the framework.
• Bob Webb PhD FErgS CHFP, President, HumanSystems® Incorporated
• Deanna Harrison BSc BA CPE, Program Leader, Workplace Health, Fraser Health
• Deborah Goodwin MSc (Ergonomics) CCPE LEED Green Associate, Senior Ergonomist, AHS
• Linda Sagmeister CCPE CRSP, Canadian Certified Professional Ergonomist, St. John’s, NL
Thank you to the many individuals at the HQCA who also provided contributions and support.
REFERENCES
1. Joseph A, Rashid M. The Architecture of Safety: Hospital Design. Curr Opin Crit Care 2007 Dec;13(6):714-9.
2. Ulrich RS. View Through a Window May Influence Recovery from Surgery. Science 1984 Apr 27;224:420-1.
3. Hamilton DK, Watkins DH. Evidence-Based Design for Multiple Building Types. Hoboken, New Jersey, United States of America: John Wiley & Sons, Inc.; 2009.
4. Evanoff B, Wolf L, Aton E, Canos J, Collins J. Reduction in Injury Rates in Nursing Personnel Through Introduction of Mechanical Lifts in the Workplace. Am J Ind Med 2003;44:451-7.
5. The Canadian Patient Safety Dictionary [Internet]. Ottawa, Ontario, Canada: 2003 Oct [cited 2015 Feb 20]. Available from: http://www.royalcollege.ca/portal/page/portal/rc/common/documents/publications/patient_safety_dictionary_e.pdf
6. Reiling JG. Safe by Design: Designing Safety in Health Care Facilities, Processes, and Culture. Oak Brook, Illinois, United States of America: Joint Commission Resources; 2007.
7. International Ergonomics Association. Definition and Domains of Ergonomics | IEA Website [Internet]. 2015 [cited 15 September 2015]. Available from: http://www.iea.cc/whats/
8. Carayon P, editor. Handbook of Human Factors and Ergonomics in Health Care and Patient Safety. New Jersey: CRC Press; 2006.
9. Shultz J, Chisholm S. Supportive Living Resident Suite Evaluation: Using Simulation to Evaluate a Mock-up. Proceedings of the Human Factors and Ergonomics Society 54th Annual Meeting; 2010 Sept 27-Oct 1; San Francisco, California, United States of America: Human Factors and Ergonomics Society, Inc.; 2010.
10. Chisholm S, Shultz J, Caird J, Lord J, Boiteau P, Davies J. Identification of Intensive Care Unit (ICU) System Integration Conflicts: Evaluation of Two Mock-up Rooms Using Patient Simulation. Proceedings of the Human Factors and Ergonomics Society 52nd Annual Meeting; 2008 Sept 22-26; New York City, New York, United States of America. 2008. p. 798-802.
11. Biesbroek S, Shultz J, Kirkpatrick A, Kortbeek J. Human Factors Evaluation of an Interventional Trauma Operating Room Mock-up. In: 2012 Symposium on Human Factors and Ergonomics in Health Care; 2012 Mar 12-14; Baltimore, Maryland, United States of America. Human Factors and Ergonomics Society, Inc.; 2012. p. 73-78.
12. Hignett S, Lu J, Fray M. Two Case Studies Using Mock-Ups for Planning Adult and Neonatal Intensive Care Facilities. Journal of Healthcare Engineering 2010;1(3):399-413.
13. Holmdahl T, Lanbeck P. Design for the Post-Antibiotic Era: Experiences from a New Building for Infectious Diseases in Malmö, Sweden. HERD 2013 Summer;6(4):27-52.
14. Kirkpatrick AW, Vis C, Dubé M, Biesbroek S, Ball CG, Laberge J, Shultz J, Rea K, Sadler D, Holcomb JB, Kortbeek J. The Evolution of a Purpose Designed Hybrid Trauma Operating Room from the Trauma Service Perspective: The RAPTOR (Resuscitation with Angiography Percutaneous Treatments and Operative Resuscitations). Injury 2014 Sep;45(9):1413-21.
15. Watkins N, Kobelja M, Peavey E, Thomas S, Lyon J. An Evaluation of Operating Room Safety and Efficiency: Pilot Utilization of a Structured Focus Group Format and Three-dimensional Video Mock-up to Inform Design Decision. HERD 2011 Fall;5(1):6-22.
16. Wikipedia: the free encyclopedia [Internet]. St. Petersburg (FL): Wikimedia Foundation, Inc. 2001 – Mockup [cited 2015 Jul 16]. Available from: http://en.wikipedia.org/wiki/Mockup
17. Watkins N, Lorenz S, Naos I. The Functional Mock-Up: The University Medical Center at Princeton Inpatient Room Mock-Up Project. Healthcare Design Magazine [Internet] 2010 Mar 11 [cited 2014 Jun 17]. Available from: http://www.healthcaredesignmagazine.com/article/functional-mock-university-medicalcenter-princeton-inpatient-room-mock-project
18. Government of Alberta [Internet]. Budget 2015 – Capital Plan; 2015-18 [cited 2015 Nov 2]. Available from: http://finance.alberta.ca/publications/budget/budget2015-october/fiscal-plan-capital-plan.pdf
19. Government of British Columbia [Internet]. Budget and Fiscal Plan, 2015/16 – 2017/18; 2015 Feb 17 [cited 2015 Nov 2]. Available from: http://bcbudget.gov.bc.ca/2015/bfp/2015_budget_and_fiscal_plan.pdf
20. Government of Ontario [Internet]. Ontario Budget 2015 [cited 2015 Nov 2]. Available from: http://www.fin.gov.on.ca/en/budget/ontariobudgets/2015/papers_all.pdf
21. Wilson JR. Ergonomics and participation. In: Wilson JR, Corlett EN, eds. Evaluation of Human Work: A Practical Ergonomics Methodology, 2nd edn. London: Taylor and Francis; 1995. p. 1071-1096.
22. Hignett S, Wilson J, Morris W. Finding Ergonomics Solutions – Participatory Approaches. Occup Med [Internet] 2005 May [cited 2015 Sep 11];55(3):200-207. Available from: http://occmed.oxfordjournals.org/content/55/3/200.long
23. Verma J, Petersen S, Samis S, Akunov N, Graham J. Healthcare Priorities in Canada: A Backgrounder [Internet]. Canadian Foundation for Healthcare Improvement; 2014 Apr [cited 2015 Sep 11]. Available from: http://www.cfhi-fcass.ca/sf-docs/default-source/documents/harkness-healthcare-priorities-canadabackgrounder-e.pdf?sfvrsn=2
24. Johansson J. Patient Rooms of a California Based Hospital: Benefits of Physical Mock-ups vs. Benefits of Virtual Mock-ups [Internet]. University of Minnesota, College of Design; 2012 [cited 2014 Jun 17]. Available from: http://vr.design.umn.edu/portfolio/documents/VRReport_JJohansson.pdf
25. Jessa M. Human Factors Evaluation of Emergency Department Trauma Room and Standard Treatment Room, Fort Saskatchewan Health Centre. Alberta, Canada: Alberta Health Services; 2011 Dec 7.
26. Himwich DB, Hamilton B. The Smart Choice: Test Drive your Building Plan through Rapid Prototype Mock-Ups. Healthcare Design Magazine [Internet] 2008 Aug 31 [cited 2014 Jun 17]. Available from: http://www.healthcaredesignmagazine.com/article/smart-choice-test-drive-your-building-plan-through-rapidprototype-mock-ups
27. Carbasho T. Lean Design Process Prepares Seattle Children’s Hospital for Tomorrow. Mockup Simulation Critical to Determining Best Design [Internet]. Tradeline, Inc.; 2012 July 10 [cited 2014 Jun 18]. Available from: http://www.tradelineinc.com/reports/2012-7/lean-design-process-prepares-seattle-childrens-hospital-tomorrow
28. Pennathur P, Thompson D, Abernathy J, Martinez E, Pronovost P, Kim G, et al. Technologies in the Wild (TiW): Human Factors Implications for Patient Safety in the Cardiovascular Operating Room. Ergonomics [Internet] 2013 Feb [cited 2014 May 21];56(2):205-219. Available from: http://www.tandfonline.com/doi/abs/10.1080/00140139.2012.757655
29. Shultz J, Chisholm S. Supportive Living Resident Suite Evaluation: Using Simulation to Evaluate a Mock-up. Proceedings of the Human Factors and Ergonomics Society 54th Annual Meeting; 2010 Sept 27-Oct 1; San Francisco, California, United States of America: Human Factors and Ergonomics Society, Inc.; 2010.
30. Wahr J, Abernathy J. Improving Patient Safety in the Cardiac Operating Room: Doing the Right Thing the Right Way, Every Time. Current Anesthesiology Reports 2014;4(2):113-123.
31. Peavey EK, Zoss J, Watkins N. Simulation and Mock-up Research Methods to Enhance Design Decision Making. HERD 2012 Spring;5(3):133-143.
32. Biesbroek S, Lee B, Milloy S, Dubé M, Loken J, Puzey C, Wiens K. CVOR Commissioning Evaluation: Simulation Evaluation Project. Alberta, Canada: Alberta Health Services; 2014.
33. Laberge J, Raven A. Human Factors Evaluation of the Peter Lougheed Centre (PLC) Vascular Hybrid OR Suite. Presented at: Peter Lougheed Centre 25th Anniversary Celebration; 2013; Calgary, Alberta, Canada.
34. Shultz J, Biesbroek S. Human Factors Evaluation of the Hybrid OR Mock-Up. Alberta, Canada: Alberta Health Services; 2011.
35. Biesbroek S, Shultz J, Laberge J, Dube M, Vis C. Validating a Mock-up Simulation of an Interventional Trauma Operating Room. Paper presented at: W21C SIMposium; 2014; Calgary, Alberta, Canada.
36. Teteris E, Chisholm S, Shultz J, Caird J, Mayer A. Human Factors Evaluation of the South Health Campus Mock-ups: ICU Room. Alberta, Canada: Alberta Health Services; 2009.
37. Mayer A, Teteris E, Chisholm S, Caird J, Shultz J. Human Factors Evaluation of the South Health Campus Mock-Up ED Exam Room II. Paper presented at: Halifax 9 Symposium; 2009 Oct 22-24; Montreal, Quebec, Canada.
38. Hallihan G, Caird JK, Wilkins M, Wiley K, Clayden N, Blanchard I, Plato M. Human Factors Evaluations of a New Ambulance Compartment: Workspace Use and Paramedic Flow. Presented at: 2014 International Symposium on Human Factors and Ergonomics in Health Care: Leading the Way; 2014 Mar 16-19; Chicago, Illinois, United States of America: Human Factors and Ergonomics Society; 2014.
39. Caird J, Shultz J, Mayer A, Teteris E, Chisholm S. Human Factors Evaluation of the South Campus Mock-up ACU Room. Alberta, Canada: Alberta Health Services; 2009.
40. Chisholm S, Shultz J, Mayer A, Teteris E, Caird J. Human Factors Evaluation of the South Health Campus Mock-ups: Outpatient Exam Room. Alberta, Canada: Alberta Health Services; 2009.
41. Raven A. Human Factors Mock-up Evaluation of Grande Prairie Cancer Centre 90% Detailed Design: Medical Day Unit and Systemic Prep. Alberta, Canada: Alberta Health Services; 2014 Sep 24.
42. Taylor E, Hignett S, Joseph A. The Environment of Safe Care: Considering Building Design as One Facet of Safety. In: Proceedings of the 2014 International Symposium on Human Factors and Ergonomics in Health Care: Advancing the Cause; 2014 Mar 16-19; Chicago, Illinois, United States of America. Human Factors and Ergonomics Society; 2014. p. 123-127.
43. Caird J, Shultz J, Mayer A, Chisholm S, Teteris E. Using Human Factors Methods in Patient Simulation to Determine the Architectural Usability of Hospital Mock-up Rooms. Presented at: International Meeting of Simulation in Healthcare Conference; 2010 Jan 23-27; Phoenix, Arizona, United States of America. 2010.
44. Thrall T. Mock-Up Lets Patients, Staff Test New Rooms Before They Are Built. H&HN 2008 July;82(7):20.
45. Kobayashi L, Shapiro MJ, Sucov A, Woolard R, Boss RM 3rd, Dunbar J, Sciamacco R, Karpik K, Jay G. Portable Advanced Medical Simulation for New Emergency Department Testing and Orientation. Acad Emerg Med 2006 Jun;13(6):691-5. Epub 2006 Apr 24.
46. Dube M, Vis C, Shultz J, Laberge J, Sigalet. Simulation in Trauma: Identifying Latent Threats to Patient Safety in the Interventional OR. Presented at: The Trauma Association of Canada Scientific Congress; 2013 Apr 13; Whistler, British Columbia, Canada. 2013.
47. Canadian Standards Association. Canadian Health Care Facilities Z8000-11. Toronto, Ontario, Canada: Canadian Standards Association; 2011. 448 p.
48. Alberta Health Services, Alberta Health, Alberta Infrastructure. Health Facilities Capital Program Manual Version 1.0 [Internet]. Alberta, Canada: Alberta Health Services, Alberta Health, Alberta Infrastructure; 2013 Jun. 132 p. Available from: http://www.infrastructure.alberta.ca/documents/HF_Capital_Program_Manual_(2).pdf
49. Joseph A, Taylor E, Quan X, Nanda U, Ancheta C, Levin D, et al. Safety Risk Assessment for Healthcare Facility Environments [Internet]. California: The Center for Health Design; 2015 May [cited 2015 Sep 11]. Available from: https://www.healthdesign.org/sites/default/files/sra_july2015.pdf
50. The Royal Architectural Institute of Canada. Schedule B to Document Six, 2006 Edition. Services of the Architect and Responsibilities of the Client. Ottawa, Ontario, Canada: The Royal Architectural Institute of Canada; 2006.
51. Government of Alberta. Design Guidelines for Continuing Care Facilities in Alberta (Draft). 2014 Oct. Available from: http://www.seniors.alberta.ca/documents/CC-Design-Guidelines-Facilities-2014.pdf
52. Alberta Health Services, Calgary Health Trust. State-of-the-art Trauma Operating Room Opens at Foothills Medical Centre [Internet]. 2013 March 27. Available from: http://www.albertahealthservices.ca/8198.asp
53. Orchard C, Bainbridge L, Bassendowski S, Stevenson K, Wagner SJ, Weinberg L, et al. National Interprofessional Competency Framework. Canada: Canadian Interprofessional Health Collaborative; 2010. Available from: http://www.cihc.ca/files/CIHC_IPCompetencies_Feb1210.pdf
54. Wikipedia: the free encyclopedia [Internet]. St. Petersburg (FL): Wikimedia Foundation, Inc. 2001 – Task analysis [cited 2015 Jul 16]. Available from: http://en.wikipedia.org/wiki/Task_analysis
55. ARECCI: A Project Ethics Community Consensus Initiative. Alberta Innovates Health Solutions; 2015. Available from: http://www.aihealthsolutions.ca/outreach-learning/arecci-a-project-ethics-community-consensus-initiative/
56. Horton W. Designing and Writing Online Documentation: Hypermedia for Self-supporting Products. 2nd ed. New York, NY: Wiley; 1994.
57. Nielsen J. Usability Engineering. San Diego, CA, United States of America: Academic Press; 1993.
58. Carayon P, Alvarado CJ, Hundt AS. Reducing Workload and Increasing Patient Safety through Work and Workspace Design. Paper commissioned by the Institute of Medicine Committee on the Work Environment for Nurses and Patient Safety, Center for Quality and Productivity Improvement, University of Wisconsin-Madison; 2003. CQPI Technical Report No. 185.
59. Stanton N, Salmon PM, Walker GH, Baber C, Jenkins DP. Human Factors Methods: A Practical Guide for Engineering and Design. Burlington, VT: Ashgate Publishing; 2005.
60. Joseph A, Quan X, Taylor E, Jelen M. Designing for Patient Safety: Developing Methods to Integrate Patient Safety Concerns in the Design Process. The Center for Health Design; 2012. Available from: https://www.healthdesign.org/sites/default/files/chd416_ahrqreport_final.pdf
PROMOTING AND IMPROVING PATIENT SAFETY AND HEALTH SERVICE QUALITY ACROSS ALBERTA
www.hqca.ca
210, 811 - 14 Street NW, Calgary, Alberta T2N 2A4