THE POWER OF TAPPING

- An interaction model for implementing NFC in Android applications
MASTER THESIS BY:
MARTIN HOLEBY & PATRIK SANDBERG
SUPERVISOR – ÅKE WALLDIUS
EXAMINER – KRISTINA HÖÖK
HUMAN COMPUTER INTERACTION
COMPUTER SCIENCE AND COMMUNICATION (CSC)
ROYAL INSTITUTE OF TECHNOLOGY (KTH)
2013 STOCKHOLM
"Knowledge is experience. The rest is just information."
- A. Einstein
ABSTRACT
Near Field Communication (NFC) is an RFID-based technology that creates a port between
the physical and the digital world through an interaction technique often referred to as
tapping; communication is initiated when two interactive NFC-items briefly touch each other.
The tapping interaction technique opens up for creation of new types of user interfaces that
are compelling to use for both experienced and novice users. Previous work has proven the
strengths of interfaces utilizing NFC from a usability perspective, but except for a couple of
attempts to make the technique more widespread, e.g. by Nokia in the early 2000s, the
market penetration is unexpectedly low given NFC’s potential.
We have explored possible implementations of the tapping technique and created a design
space by developing three prototypes: Sparakvittot, an NFC integrated version of an already up
and running service used to handle digital receipts; PatientSafetyPrototype, a tool for hospital
nurses used to ease the medicine handling process in order to enhance patient safety; TapThat,
used to immediately transfer playback of a sound file between devices by a simple tap. These
three prototypes serve as illustrations of NFC’s three different modes of operation: card
emulation, which makes active NFC hardware act as if it were passive, i.e. not capable of
initiating communication; Read/write, i.e. making active NFC hardware read or write a
passive NFC-tag; Peer-to-peer, which makes active NFC hardware communicate with
other active NFC hardware. All three prototypes were designed, developed and
evaluated with end-users. Together, the three prototypes show some of the potential and
strengths of NFC, but they also show the importance of finding a consistent model for
interaction which users can recognize and relate to irrespective of which application is being
used.
The report concludes with an interaction model to be used when developing an NFC
integrated Android application in order to create a pliable user experience. The interaction
model is not necessarily Android specific and can also be used when implementing the tapping
technique in applications in general. In short, our interaction model states that applications
should provide feedback when tapping. This feedback should consist of sounds, haptic
feedback and GUI dialogues. Application preferences should make it possible for experienced
users to decide which feedback they prefer. It is also important to define a clear interaction
model and be consistent in how the tapping technique is implemented in different contexts. Our
interaction model is followed by other findings from the study that we believe are important
to consider when implementing NFC in Android smartphone applications: the importance of
determining if NFC integration is suitable, to use specific intent filters and to use high-fidelity
prototypes when evaluating.
SAMMANFATTNING
Near Field Communication (NFC) är en RFID-baserad teknik som skapar en port mellan den
fysiska och den digitala världen genom en interaktionsteknik kallad tapping:
kommunikationen initieras genom att två interaktiva NFC-objekt ”nuddar” varandra.
Tapping-interaktion möjliggör skapandet av nya typer av användargränssnitt som är tilltalande
att använda för både erfarna användare och för nybörjare. Tidigare forskning har påvisat
styrkan hos användargränssnitt som använder NFC ur ett användbarhetsperspektiv, men
förutom ett par försök att på ett omfattande sätt sprida tekniken (t.ex. av Nokia i början av
2000-talet), är marknadspenetrationen fortfarande mycket låg med tanke på NFCs potential.
Vi har utforskat möjliga implementeringar av tapping-tekniken och skapat en designrymd
genom att utveckla tre prototyper: Sparakvittot, en NFC-integrerad version av en redan
lanserad tjänst som används för att hantera digitala kvitton; PatientSafetyPrototype, ett verktyg
för sjuksköterskor med syfte att underlätta medicinhanteringsprocessen för att i sin tur öka
patientsäkerheten; TapThat, som används för att på ett smidigt sätt överföra uppspelningen av
en ljudfil mellan enheter genom ett enkelt tap. De tre prototyperna korrelerar med NFCs tre
“modes of operation”: Kortemulering, vilket innebär att en aktiv NFC-enhet agerar som om
den vore passiv. Läs/skriv-läge, vilket innebär att man låter en aktiv NFC-enhet läsa och skriva
till en passiv tag. Peer-to-peer, vilket innebär att man låter en aktiv NFC-enhet kommunicera
med en annan aktiv NFC-enhet. Alla tre prototyper designades, utvecklades och utvärderades
tillsammans med användare. Tillsammans visar de tre applikationerna delar av NFCs styrkor
och potential men de visar också hur viktigt det är att definiera en konsekvent
interaktionsmodell som användare känner igen och kan relatera till oavsett vilken applikation
som används.
Rapporten avslutas med en interaktionsmodell att använda vid utveckling av NFC-integrerade
androidapplikationer för att skapa en smidig användarupplevelse. Interaktionsmodellen är
inte helt androidspecifik utan kan även användas mer generellt vid implementering av tapping
även i andra system. Kortfattat innebär vår interaktionsmodell att användaren ska få
återkoppling när de tappar. Återkopplingen bör bestå av ljud, haptisk återkoppling och
grafiska dialoger. Inställningar i applikationen bör finnas för att avancerade användare ska
kunna ställa in vilken återkoppling de vill ha. Det är också viktigt att definiera en tydlig
interaktionsmodell och vara konsekvent gällande hur tapping-interaktionen fungerar i olika
kontexter. Efter presentationen av vår interaktionsmodell beskriver vi andra intressanta
slutsatser som är bra att tänka på när man ska implementera NFC i androidapplikationer: att
undersöka om NFC är lämpligt att implementera överhuvudtaget, att använda
specifika ”intent filters” och att använda ”high-fidelity”-prototyper när man utvärderar.
ACKNOWLEDGEMENT
We would like to thank all the people who, throughout our study, have provided us with
both feedback and support. We are also grateful for all the discussions we have had, both
regarding our work and during the execution of it. Our dearest thank
you to:
Kristina Höök (KTH), for great feedback regarding this report and how to enhance the UX of
it.
Åke Walldius (KTH), for excellent supervision including a lot of great feedback and general
support, inspiring academic spirit and patience.
Per Einarsson (Sparakvittot), for very interesting discussions, vivid entrepreneurship and for
sharing valuable insights about Sparakvittot’s business.
Kenneth Andersson and Christoffer Magnani (The Mobile Life), for letting us write the
thesis at The Mobile Life, providing us with feedback and ideas regarding our prototypes and
for giving us the opportunity to attend The Mobile World Congress ‘12, Barcelona.
All our test users - without your participation our work would not have been possible to
conduct.
Our dear girlfriends Camilla and Jeanna for their support and for putting up with us in
general throughout our work.
Martin & Patrik
Stockholm, February 2013
1 INTRODUCTION
1.1 PROBLEM BACKGROUND
1.2 AIM AND OBJECTIVES
1.3 RESEARCH QUESTION
1.4 DELIMITATIONS
1.5 TARGET GROUP
1.6 THE MOBILE LIFE
1.7 ABBREVIATIONS
2 THEORY
2.1 USABILITY
2.1.1 DEFINITIONS OF USABILITY
2.1.2 CONTEXT
2.1.3 USABILITY FEATURES AND ATTRIBUTES
2.1.4 USABILITY INSPECTION
2.1.5 USABILITY TESTING
2.2 MEASURING USABILITY
2.2.1 LAYERED MODEL OF USABILITY
2.2.2 EVALUATING DURING DESIGN
2.2.3 EVALUATING WITH USERS
2.2.4 USABILITY MEASURES DEFINED BY ISO 9241-11
2.3 USER CENTERED DESIGN
2.4 UTILIZING THE MOBILE CONTEXT
2.5 NEAR FIELD COMMUNICATION
2.5.1 NEAR FIELD COMMUNICATION TECHNOLOGY
2.5.2 NFC FORUM
2.5.3 HOW NEAR FIELD COMMUNICATION WORKS
2.5.4 USAGE
2.5.5 POSSIBLE BENEFITS
2.6 NFC INTEGRATION IN ANDROID APPLICATIONS
3 METHODOLOGY
3.1 CONTEXTUAL DESIGN
3.1.1 CONTEXTUAL INQUIRY
3.1.2 WORK MODELING
3.1.3 CONSOLIDATION
3.1.4 WORK REDESIGN
3.1.5 USER ENVIRONMENT DESIGN
3.1.6 MOCKUP AND TEST WITH CUSTOMERS
3.2 TARGET GROUP ANALYSIS
3.3 FIELD STUDY
3.4 HEURISTIC EVALUATION
3.5 USABILITY TESTING
3.5.1 OBSERVATION
3.5.2 VIDEO OBSERVATION
3.5.3 SEMI-STRUCTURED INTERVIEWS
3.6 COMBINATION OF METHODS
3.7 RELIABILITY & VALIDITY
4 PERFORMING THE STUDY
4.1 DESIGN PROCESS
4.2 STUDY
4.2.1 PRE-STUDY – SPARAKVITTOT
4.2.2 MEDICINE HANDLING
4.2.3 TAPTHAT
4.2.4 OVERALL AIM
5 SPARAKVITTOT
5.1 BACKGROUND
5.2 PROTOTYPE DEVELOPMENT
5.2.1 PROTOTYPE
5.3 MARKET STUDY
5.3.1 PROTOTYPE DEMONSTRATION
5.3.2 RESULT OF MARKET STUDY
5.4 ANALYSIS
6 MEDICINE HANDLING
6.1 DESIGN
6.1.1 CONTEXTUAL ANALYSIS
6.1.2 RESULT OF CONTEXTUAL ANALYSIS
6.1.3 CONTEXTUAL DESIGN
6.1.4 RESULT OF CONTEXTUAL DESIGN
6.2 PROTOTYPE DEVELOPMENT
6.2.1 PROTOTYPE
6.3 EVALUATION
6.3.1 HEURISTIC EVALUATION
6.3.2 FIELD VISIT AND FIELD TEST
6.3.3 USABILITY TESTING
6.3.4 RESULTS OF USABILITY TESTING
6.4 ANALYSIS
7 TAPTHAT
7.1 BACKGROUND
7.2 DESIGN
7.3 PROTOTYPE DEVELOPMENT
7.3.1 PROTOTYPE
7.4 EVALUATION
7.4.1 USABILITY TESTING
7.4.2 RESULTS OF USABILITY TESTING
7.5 ANALYSIS
8 OVERALL ANALYSIS
8.1 QUANTITATIVE VERSUS QUALITATIVE DATA
8.2 NFC ECOSYSTEMS
9 DISCUSSION
9.1 USER INTERFACE DESIGNER’S PERSPECTIVE
9.2 USABILITY EVALUATOR’S PERSPECTIVE
9.3 COMMON PERSPECTIVE
10 CONCLUSION
10.1 INTERACTION MODEL
10.1.1 PROVIDE FEEDBACK WHEN TAPPING
10.1.2 DEFINE CLEAR INTERACTION MODELS
10.2 OTHER FINDINGS
10.2.1 DETERMINE IF NFC INTEGRATION IS SUITABLE
10.2.2 USE HIGH-FIDELITY PROTOTYPES WHEN EVALUATING
10.2.3 USE SPECIFIC INTENT FILTERS
10.2.4 DETERMINE IF NFC INTEGRATION IS POSSIBLE
11 REFERENCES
12 APPENDIX
12.1 MEDICINE HANDLING USABILITY TESTING
12.1.1 TEST INSTRUCTIONS
12.1.2 INTERVIEW QUESTIONS
12.2 TAPTHAT USABILITY TESTING
12.3 CONSENT FORM
PREFACE
In this chapter we briefly present our field of study, explain our division of work and how we
combined two theses into one report.
Field of study
In consultation with our commission company, The Mobile Life, we decided to explore the
possibilities of Near Field Communication (NFC). Together we had a mutual interest in
NFC integration in smartphone applications, which we agreed should be the field of our
study. Coming from a background of usability studies we decided to approach the topic from
a usability perspective. Important for The Mobile Life was that the work had practical
elements, in order for us to gain practical knowledge about possible NFC implementations,
and this approach appealed to us as well. The Mobile Life also preferred that the work be
carried out by two persons cooperating – in that way an agile approach could be applied to
the work, which had a number of positive effects. For example, brainstorming sessions were
held together in order to prevent the workflow from getting stuck. Person A also critically
reviewed person B’s work and vice versa, in order to ensure academic rigor throughout the
whole study. During the development phase, pair programming, a practice from the agile
methodology Extreme Programming, was applied for the same reasons.
Division of work
A division of work was done with the goal to answer a common research question. Two
different areas of responsibility were defined to concretely describe our division of work. The
two areas of responsibility were:
1. User interface design
2. Usability evaluation
A user interface designer is responsible for analyzing users, activities, tasks and context of use
in order to design an appropriate application. The application design is developed into an
actual application by a team of developers. The usability evaluator is responsible for
continuously evaluating the application during the different processes of each application
development cycle in order to come up with feedback and suggestions for improvements.
Martin took the role as user interface designer while Patrik took the role as usability evaluator,
correlating with the areas of responsibility within User-centered design. With a clear division of
work we have cooperated and combined our two theses into one report. One thesis focuses
on user interface design and the other thesis focuses on usability evaluation. From this point
onwards, the two theses will be referred to as one. In order for the following parts of the
thesis to be read in a comfortable way we have chosen to refer to all work performed
together as work performed by us (“We did...”).
1 INTRODUCTION
In this chapter the problem background is presented, followed by the aim and objectives of the
study. The problem definition, including our research question, is then described, followed by
delimitations and target group. Lastly, a section about our commissioning body is presented,
followed by abbreviations used in this report.
With this thesis we want to emphasize the importance of investigating how the interaction
technique of tapping, provided by NFC-enabled Android applications, should be utilized to
give the user a good user experience. The main focus is on how the tapping technique should
be implemented in order to ease the user interaction in three different types of concepts,
exemplified through Android application prototypes. A goal is to iteratively perform usability
evaluation throughout the whole development process of the prototypes as a result of keeping
a user-centered design focus. We also want to highlight how other factors like technical
restrictions affect the implementation of NFC on the Android platform.
1.1 Problem background
NFC is a proximity-based connectivity technology used for creating short-range wireless
communication ports between electronic devices. NFC is known for providing the interaction
technique of tapping, meaning that users “touch” an NFC-object with an NFC-enabled device
and thereby initiate a communication port.
NFC is utilized when the technology is integrated into services and applications, ideally
smartphone applications (NFC Forum, 2012). NFC can however be used in virtually any
context. Picture 1 below shows examples of how NFC can be used.
Picture 1 - Examples of NFC use (NFC Forum, 2012).
The first NFC-enabled phone was the Nokia 6131, released back in 2006, and shortly
afterwards a few other NFC-ready models came out on the market. The future for NFC in
smartphones was looking good, especially from a business perspective, as stated by
Benyó et al. (2007):
“The combination of the world’s most popular mobile device, the cell phone, with
the novel wireless technology NFC (Near Field Communication) makes possible
variety of business opportunities. Payment, ticketing, access control, content
distribution, smart advertising, peer-to-peer data/money transfer – the potential is
virtually limitless.”
Strömmer et al. (2007) shared the view of a promising future for enhanced mobile applications
on NFC-enabled smartphones by demonstrating the powerful combination of NFC together with
data retrieval from the phone’s sensors. In Europe, NFC is today, in early 2013, mainly used
within the public transport sector in cities like Stockholm, Barcelona and London.
Companies like Assa Abloy have started to develop products that use NFC instead of physical
keys, and at 2012’s MWC (Mobile World Congress) in Barcelona a lot of NFC-related pilot
projects were presented. Interestingly, in Asia, RFID, the technology that NFC is built
upon, has been used in the public transportation sector since the early 2000s, and RFID
integration into smartphones there started in 2004 according to IDG News Service (2007).
Given its potential, the number of NFC mobile products commercially available to end-users
is, as of early 2013, very limited, at least outside Asia. One can say that NFC still
seems to be in its pilot stage.
NFC has not yet reached iOS and can thereby not be utilized by iOS developers. This creates
problems when it comes to market acceleration, as iPhone users are considered early adopters
of new technologies. But the market penetration problem can also be seen on the Android
platform, even though it holds NFC support. The share of Android applications in the Google
Play store that utilize NFC is almost nonexistent, even though Android has supported NFC
since the release of Gingerbread, Android 2.3, in 2010.
The first NFC-equipped smartphones were released in 2006, but being the first generation of
smartphones they were technically limited compared to those released today, in early 2013.
They did not have the same processing power, and the accuracy of their sensors
was not nearly as good as in modern smartphones. On top of this, the NFC
smartphones of 2007 could not do much more than read and write tags, and therefore no one
seemed to know how to utilize the technology from a business perspective. Because of this, the
early versions of NFC-integrated smartphones worked merely as a proof of concept rather
than bringing much actual benefit to the users.
Another part of the explanation might be that the infrastructure surrounding the
NFC-equipped smartphones from 2006 and onwards has likely been too weak for a
commercial success to happen. In order for NFC to become widely used, a whole ecosystem
surrounding the technology needs to be created. Take a payment application for
example: smartphone users want to be able to pay by using NFC in their smartphones. Other
than the fact that their smartphones need to be NFC-enabled, retailers also have to be
equipped with NFC-enabled pay points. In other words, both the smartphone users and the
retailers need to see the possible benefits of NFC in order to invest money in the technology.
If the users notice that only a handful of stores have invested in the technology, they will soon
lose interest and stop using the NFC payment service. Possibly the users will also lose their
interest in NFC services in general. If the demand for NFC-enabled smartphones continues to
be low, the smartphone manufacturers will not launch any large numbers of NFC-enabled
phones, and so on. This problem can be described as analogous to the famous chicken-and-egg
problem, i.e. who will take the first major step?
Nokia’s unsuccessful attempt to introduce NFC smartphones to the masses has created a
certain cautiousness among the involved actors, such as retailers, phone manufacturers, mobile
network operators and smartphone application developers. Many of the involved actors see
NFC’s potential, but until someone is willing to make the first major investment, the NFC
ecosystem status quo will remain.
But as with many other interesting technologies, the new generation of smartphones seems
to set the standard also for the NFC technology. According to a report from the Mercator
Advisory Group (Mercator Advisory Group, 2010), around 116 million smartphones
equipped with NFC were predicted to ship in 2011, while more than 260 million were predicted
to ship during 2012. By 2015, the shipment number is expected to rise to
510 million.
Google and Samsung’s flagship smartphone Galaxy Nexus, released in late 2011, together with
Android’s Operating System 4.0 - Ice Cream Sandwich, which holds more NFC support than
previous versions, might work as an icebreaker for NFC technology. A lot of recent NFC-related
focus has been put on Google’s service Google Wallet and its competitors such as Isis, which
enable smartphone payments and use of loyalty cards via NFC. These types of services have
started to prove NFC’s e-commerce potential and many companies have, with clearer business
opportunities in mind, started to back the technology. Google can because of this be seen
as a bit of a battering ram for introducing NFC to a wider audience. But payments through
the smartphone are far from the only utilization of the technique, as proven by, for example,
Touchanote (www.touchanote.com), which stores digital notes in NFC-tags, and Proximiant
(www.proximiant.com), which can transfer digital receipts to your smartphone.
Yet a lot more can be done since NFC can be used in virtually any context - especially contexts
not only revolving around payment solutions. So what better way to prove that than to
come up with our own solutions that utilize powerful smartphones in combination with the
interaction possibilities offered by NFC?
Through our three applications, we aimed to explore the potential of NFC in different
domains, using different settings. At the same time, this approach allowed us to bring forth an
interaction technique carried across different applications – a way of tapping that users could
recognize irrespective of which NFC-application they were using.
1.2 Aim and objectives
The aim of this study was to explore how the tapping interaction technique, enabled by NFC,
should be implemented in Android smartphone applications to create interfaces that ease the
user interaction and provide for a smooth, supple and interesting interaction experience. The
challenges in this area have been identified and discussed.
In order to accomplish our aim, the following objectives were defined:
• Find suitable scopes of use for NFC and the tapping technique and define concepts,
revolving around these scopes, in detail.
• Exemplify the concepts by developing prototypes.
• Iteratively evaluate for usability, by using design guidelines and heuristics throughout
the design process as well as by performing user testing on the finished prototypes.
• Discuss challenges that occurred when developing NFC-integrated Android
smartphone applications, based on the design and development of the prototypes and
the results of the evaluations.
• Create an interaction model summarizing a suggested approach when developing
NFC-integrated Android smartphone applications. The model is meant to describe how to
best utilize the tapping technique from a user-centered perspective, based on our own
experiences.
1.3 Research question
The research question we have tried to answer was whether we could design a tapping
interaction that could work in different domains, under different demands, allowing users to
learn and carry over their understanding of how to use NFC between different applications.
1.4 Delimitations
NFC provides the end user with the tapping interaction technique. This technique enables an
alternative way of interacting compared to using a touchscreen graphical user interface (GUI),
which is the traditional way of interacting with a smartphone. Of primary interest for us is to
study how to best utilize the tapping technique in a usability context, rather than investigating
all other aspects of the NFC technology on smartphones. We have therefore limited ourselves
to a domain revolving around particular concepts that utilize the technique. It is also
important to understand that NFC needs to be implemented into an application in order to
be able to evaluate the interaction technique it provides, which also motivates our choice of
working with concepts.
We have chosen to limit ourselves to Android, as this was by far the biggest smartphone
Operating System (OS) with support for NFC at the time we started this study, in
February 2012.
NFC offers three modes of operation, of which two, Read/Write and Peer-to-peer,
are available for developers in the Android 4.0 Software Development Kit (SDK). In our
role as developers, we have focused our two main concepts around the two modes that are
possible to implement, but we have also briefly looked at the third mode, Card emulation.
Card emulation is considered in our first concept, called Sparakvittot, which also works as our
pre-study, followed by Read/Write in the Medicine Handling concept and Peer-to-peer in the
last concept called TapThat.
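To make this delimitation concrete, the following sketch shows roughly what the Peer-to-peer mode looks like in the Android 4.0 SDK: while a hypothetical activity is in the foreground, the NDEF message it registers is pushed to another NFC-enabled device when the two devices are tapped together. The activity name, MIME type and payload are our own illustrative assumptions and not taken from any of the prototypes; Card emulation has no corresponding public API in this SDK version, which is why it is only treated briefly.

import android.app.Activity;
import android.nfc.NdefMessage;
import android.nfc.NdefRecord;
import android.nfc.NfcAdapter;
import android.os.Bundle;

// Hypothetical activity sketching Peer-to-peer mode in the Android 4.0 SDK:
// while this activity is in the foreground, the message below is pushed to
// another NFC-enabled device when the two devices are tapped together.
public class BeamSenderActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        NfcAdapter adapter = NfcAdapter.getDefaultAdapter(this);
        if (adapter == null) {
            return; // The device has no NFC hardware, so P2P is not available.
        }
        // Hypothetical MIME type and payload, only meant to illustrate the API.
        byte[] payload = "track-id:42".getBytes();
        NdefRecord record = new NdefRecord(
                NdefRecord.TNF_MIME_MEDIA,
                "application/vnd.example.tapthat".getBytes(),
                new byte[0],
                payload);
        NdefMessage message = new NdefMessage(new NdefRecord[] { record });
        // The message is delivered over NFC Peer-to-peer (Android Beam) on tap.
        adapter.setNdefPushMessage(message, this);
    }
}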
As developers, it is of course important for us to gain knowledge about implementing
NFC in Android smartphone applications, since it sets the boundaries for what can be
developed, but this is not the primary concern of our study. Our main objective rather focuses
on utilizing and evaluating the interaction technique of tapping, as a part of the user interface
in our prototypes, in order to later come up with an interaction model describing how to
implement the tapping technique from a user-centered perspective. Evaluation of the rest of
the user interface, such as the GUI of the application, is therefore of secondary concern.
When developing the prototypes we have used Android 4.x, which was the latest version of
the Android Software Development Kit (SDK) when we started this study. At that time, it was
the version holding the most features, both regarding NFC and in general.
1.5 Target group
This thesis addresses people interested in human computer interaction (HCI) and new
interaction techniques - especially the tapping technique enabled by NFC technology. It can
also be of interest for people working in the mobile application development industry,
especially regarding NFC integration in mobile applications. The thesis describes a pragmatic
way of using the user-centered philosophy to develop usable end products, in our case the
prototypes. It may therefore also be of interest for software developers and usability evaluators.
1.6 The Mobile Life
The Mobile Life is a company specialized in mobile application development. The company
enables other companies and organizations to take advantage of the mobile channel, for
campaigns or as part of their critical business operations. The Mobile Life has offices in
Stockholm and Singapore.
1.7 Abbreviations
In this chapter we present abbreviations specific to our topic, which are used in the report.
NFC specific
• Radio Frequency Identification (RFID) - A wireless technology that uses radio frequency
magnetic fields to transfer data. The field range is up to ten meters.
• Near Field Communication (NFC) - An ISO standard applied on top of RFID. The field
range is lower compared to RFID, being up to four centimeters.
• NFC chip - The hardware making NFC-communication possible. Consists of a data
storage chip and a low-range antenna.
• NFC-enabled item - An item that supports NFC-communication. The NFC-enabled
item can e.g. be a tag, a sticker, a wristband or a smartphone. An NFC-enabled item is
from now on sometimes referred to as an “NFC-item”.
• Interaction technique - An interaction technique or input technique provides a way for
users of a computer device to accomplish a single task on the device.
• Tapping - NFC-enabled smartphones provide users with the interaction technique of
tapping, which means that a smartphone is tapped towards another NFC-enabled
item, such as an NFC-tag or another NFC-enabled smartphone, in order to interact
with it.
• Passive NFC-item - An NFC-item that is passive cannot perform any operations on
other NFC-items or trigger any communication. It simply lies and waits to be read or
written to. NFC-tags are almost always passive.
• Active NFC-item - An active NFC-item can perform operations, such as read or write,
on passive NFC-items and communicate with other active NFC-items. It can actively
trigger a communication with another NFC-item.
• NFC Read and Write (R/W) mode - An NFC-item can be written to (Write mode),
meaning that data is written to and stored on its chip. Read mode means that the
data on the chip is read and retrieved by the device reading it.
• NFC Peer-to-peer (P2P) mode - A P2P network is a network of computer devices in
which each connected device can act as both client and server for other devices in the
network. An NFC-device can exchange data with another NFC-device, within a closed
network between the devices, through the NFC P2P mode.
• NFC Card emulation mode - Making an NFC-chip act as a passive NFC-tag with static
data.
Common
• Smartphone - A combination of mobile phone and computer equipped with multiple
sensors such as camera, GPS, accelerometer, compass and touchscreen.
• Original Equipment Manufacturer (OEM) - Refers to the company that originally
manufactured a product.
• Operating System (OS) - A set of software that manages computer hardware and provides
common services for computer programs.
• Android - An operating system, provided by Google, for smartphones and tablets.
• WiFi - Technology that enables devices to exchange data wirelessly, with high
bandwidth.
• Bluetooth - Technology that enables devices to exchange data wirelessly, with low
bandwidth.
• Software Development Kit (SDK) - Software development tools that allow creation of
applications for a certain platform, like an operating system or hardware platform.
• Graphical User Interface (GUI) - A type of user interface that allows users to interact
with electronic devices with a graphical interface rather than text commands.
• Electronic Healthcare Record (EHR) - Digital documents containing information about a
patient and their medical history.
2 THEORY
In this chapter, background theory regarding the topic is presented. The term usability is
described first, followed by a section about measuring usability, and after that user centered
design is presented.
2.1 Usability
The term usability, which is a central term within HCI, is used to indicate whether a design or
a system satisfies the needs of its users. The expression easy to use is commonly referred to
when talking about usability, but the term is more complex than that.
2.1.1 Definitions of usability
As discussed in Breaking down Usability (van Welie et al., 1999), there does not seem to be a
mutual and objective understanding, within the contemporary HCI paradigm, of how to define
the term usability. The authors state that:
“...usability certainly cannot be expressed in one objective measure. Several authors
have proposed definitions and categorizations of usability and there seems to be at
least some consensus on the concept of usability and they mostly differ on more
detailed levels”.
The ISO 9241-11 standard defines usability as:
”[the] extent to which a product can be used by specified users to achieve specified
goals with effectiveness, efficiency and satisfaction in a specified context of use.”
Furthermore effectiveness is defined as “accuracy and completeness with which users achieve
specified goals” and efficiency as “resources expended in relation to the accuracy and
completeness with which users achieve goals”. Satisfaction is subjective to the user and
defined as “freedom from discomfort, and positive attitudes towards the use of the product“
while context of use relates to “users, tasks, equipment (hardware, software and materials),
and the physical and social environments in which a product is used”. (ISO 9241-11, 1998)
van Welie et al. (1999) criticize the ISO standard definition for being quite
abstract and theoretical and therefore, in many cases, hard to apply in practice. Jacob Nielsen
takes a different approach, trying to define usability with elements that are slightly more
specific. Nielsen (1993) defines usability based on five elements: learnability, efficiency,
memorability, errors/safety and satisfaction.
A system should be easy to learn how to use so that users quickly can start working with it
(learnability). Once the users have learned the system, it must be efficient to work with
(efficiency). It must be possible to return to using the system, after a while of not using it, and
still be able to remember how it works (memorability). Users should also be able to make as
few errors as possible. If some users still get it wrong, they must be able to return to the
situation before the error occurred (errors/safety). The users should feel comfortable using the
system. The users should feel that it is pleasant to work with the system, i.e. simply like it.
(Nielsen, 1993)
There are several other definitions of usability. Shneiderman defines usability in a similar
manner to Nielsen but uses a slightly different terminology. A comparison of the ISO
definition and the definitions given by Shneiderman and Nielsen is shown in Table 1 below.
Table 1 - Usability definitions as given by ISO 9241-11, Shneiderman and Nielsen
(van Welie et al., 1999).
Dix et al. (1998) take a more practical approach when trying to describe usability and define
three main groups: learnability, flexibility and robustness. Each group is specified further by
factors that influence the group it belongs to. Table 2 shows the usability factors as described
by Dix et al. (1998, referenced from van Welie et al., 1999).
Table 2 - Usability definition by Dix et al. (van Welie et al., 1999).
Of course there are advantages and disadvantages with each definition. Criticism given
by van Welie et al. towards Dix’s definition is that:
“…from a practical viewpoint, Dix’s categorization gives the designer concrete
measures for improving the usability of a design. On the other hand, it is odd that
Nielsen’s notions of efficiency or error rate can not be found in Dix’s categorization,
as they are clear indicators of usability”. (van Welie et al., 1999)
2.1.2 Context
When talking about usability it is important to involve the context. Dey (2001) defines
context as:
“…any information that can be used to characterise the situation of an entity. An
entity is a person, place, or object that is considered relevant to the interaction
between a user and an application, including the user and applications themselves.”
Context can take many different shapes and can e.g. revolve around users, activities,
tasks and context of use, to name a few aspects that are relevant for this thesis. Context can also
describe a certain device or platform, such as smartphone, tablet, Android or iOS.
As stated by (Bevan and Macleod, 1994):
“[Usability] is a property of the overall system: it is the quality of use in a context.”
According to Bevan and Macleod, the characteristics of the context (users, tasks and
environment) may be as important when determining usability as the characteristics of the
product itself:
“Quality of use can instead be measured as the outcome of interaction in a context:
the extent to which the intended goals of use of the overall system are achieved
(effectiveness); the resources such as time, money or mental effort that have to be
expended to achieve the intended goals (efficiency); and the extent to which the user
finds the overall system acceptable (satisfaction).” (Bevan and Macleod, 1994)
According to Bevan and Macleod (1994), the overall system consists of the users, tasks,
equipment (hardware, software and material) and environments, which all influence the
interaction. The relationship between the factors is presented in Diagram 1 below:
Diagram 1 - Quality of use as a function of context of use factors together with usability
elements (Bevan and Macleod, 1994).
Based on the usability factors in Diagram 1, Bevan and Macleod further claim that the usability
of a product can be defined as:
“…the ability of a product to be used with effectiveness, efficiency and satisfaction by
specified users to achieve specified goals in particular environments”.
2.1.3 Usability features and attributes
The ideal way to specify and measure usability would be to specify quantifiable features and
attributes required to make a product usable. By measuring whether these features and
attributes are present in the implemented product, an indication is given of whether a
usable product can be ensured or not.
On top of the different definitions of usability there are several lists with design principles,
design patterns and heuristics, defining features and attributes aimed at ensuring “usable”
systems. Nielsen gives a list of guidelines with the purpose of affecting the five elements used to
compose his definition of usability. Shneiderman gives his own list of heuristics called the 8
golden rules of design. A list of seven dialogue principles is given by the ISO 9241-10 standard and
a list of ergonomic criteria is given by Bastien and Scapin. (van Welie et al., 1999)
Preece et al. (2007) describe design principles as guidelines for making design decisions,
based on knowledge, experience and common sense. Design principles can be seen as
common suggestions of what should and should not be implemented in a user interface.
Design principles are however not meant to describe how specific objects or parts of an
interface should be designed - design guidelines describe these types of issues.
A common criticism of attempts to describe usability features and attributes is that they are
too general in their approach, overlooking contextual factors. The features and
attributes making a product “usable” are very hard to define since they depend
on the context in which the product is to be used. (Bevan and Macleod, 1994)
In contrast to this criticism, Bevan and Macleod also state that design
guidelines have the advantage of being applicable early in the design process, that they are easy
to understand and that they can be used both for the design of the user interface and to perform
heuristic evaluation on a product. (Bevan and Macleod, 1994)
Further, Preece et al. (2007) describe design patterns as solutions to problems in a certain
context. A pattern describes a problem, how it can be solved and in which context this
solution has been proven to work. Cooper et al. (2007) and Tidwell (2006) state that design
patterns cannot be used as general “recipes” when implementing design solutions. Each
implementation differs from the others since they, to a certain extent, are context
dependent.
2.1.4 Usability inspection
Usability inspection is an umbrella term for methods where expert evaluators review a GUI in
order to find usability problems and assess the overall design in terms of usability. These
methods are considered cost-effective, informal and easy to use. Generally, usability inspections
are most often used early in a design process, sometimes so early that the evaluator inspects
interface proposals while they are still in the specification phase. Studies have shown that
usability inspection methods are able to find usability problems that are missed by user testing,
but that user testing illuminates problems that inspection methods fail to notice. Therefore a
combination of methods most often yields the best results. (Nielsen, 1994)
2.1.5 Usability testing
Usability testing is a commonly used analytical method to evaluate the performance and
acceptance of computer applications. When conducting usability testing, a user typically
interacts with a computer application in a controlled environment, following instructions in
order to complete tasks within a specific scenario and context. During a usability
test, data regarding the user’s interaction and behavior are collected by a test leader or a
test assistant. Essential elements of usability testing are: the product that is going to be
evaluated, the task scenario, the collection of behavioral data and that the test is performed in
a more or less controlled environment. (Wichansky, 2000)
2.2 Measuring usability
There are numerous different definitions of usability. Acquiring knowledge about
different aspects of context is highly important when talking about usability. Heuristic design
approaches, as well as usability inspection and usability testing, have been covered in the
sections above. But how should one go about developing a user-centered product in
practice, given all these definitions, design approaches and evaluation methods?
2.2.1 Layered model of usability
In order to compile the knowledge domain and the definitions and principles given by ISO,
Nielsen, Dix et al. and Shneiderman, and in turn reduce some of the confusion about how
these factors are related, van Welie et al. have come up with a layered model, presented in
Diagram 2 below:
Diagram 2 - Layered model of usability (van Welie et al., 1999).
van Welie et al. (1999) describe their model the following way:
“On the highest level the ISO definition of usability is given split up in three aspects:
efficiency, effectiveness and satisfaction. This level is a rather abstract way of looking
at usability and is not directly applicable in practice. However it does give three solid
pillars for looking at usability that are based on a well-formed theory (ISO 9241-11,
1998). The next level contains a number of usage indicators, which are indicators of
the usability level that can actually be observed in practice when users are at work.
Each of these indicators contributes to the abstract aspects of the higher level. For
instance, a low error-rate contributes to a better effectiveness and good performance
speed indicates good efficiency.
One level lower is the level of means. Means cannot be observed in usability testing
and are not goals by themselves whereas indicators are observable goals. The means
are used in "heuristics" for improving one or more of the usage indicators and are
consequently not goals by themselves. For instance, consistency may have a positive
effect on learnability and warnings may reduce errors”.
van Welie and his co-authors’ layered model aims to provide concrete guidance on how to
implement and measure usability while developing a system. The layered model can be used
both during design and afterwards to evaluate usability, which is described in more detail in
the following two sections.
2.2.2 Evaluating during design
Evaluation during the design process is often more complex than evaluating with users. The
designer should focus on the means that influence the usage indicators, since the usage
indicators themselves cannot be evaluated directly and thereby do not provide any quantitative
data. The means can be evaluated by looking at the way they are present in the design and by
estimating whether they have a positive or negative impact on the usage indicators. The means
can also be used as heuristics when designing. (van Welie et al., 1999)
2.2.3 Evaluating with users
When evaluating with users, evaluation is done by looking at the usage indicators, which are
used to provide data about usage of a system. By using scenarios while user testing, hard data
about the performance speed of a system and the number of errors made can be obtained.
Through interviews, in connection with the user testing, qualitative data can also be acquired.
These data can be used to analyze the usability of the system. (van Welie et al., 1999)
2.2.4 Usability measures defined by ISO 9241-11
In another attempt to concretize how usability can be measured, the ISO 9241-11 standard gives
examples of usability measures in terms of efficiency, effectiveness and satisfaction, shown in
Table 3 below.
Table 3 - Examples of measures of usability (ISO 9241-11, appendix B).
2.3 User centered design
The term user centered design (UCD), sometimes referred to as user centered system design
(UCSD), was originally coined by Norman and Draper (1986):
“...user-centred design emphasizes that the purpose of the system is to serve the user,
not to use a specific technology, not to be an elegant piece of programming. The
needs of the users should dominate the design of the interface, and the needs of the
interface should dominate the design of the rest of the system.”
Gulliksen et al. (2003) comment on Norman and Draper’s work (1986), stating that:
“They [Norman and Draper] emphasized the importance of having a good
understanding of the users (but without necessarily involving them actively in the
process)”.
User centered system design is based on user focus and consideration of usability during the
system development and also throughout the whole system life cycle (Gulliksen and
Göransson, 2002). Diagram 3 describes the iterative work process of user centered system
design:
Diagram 3 - The basic elements of the iterative user-centered work process
(Gulliksen and Göransson, 2002)
It is good practice to always involve users in a system design process. The ways in which they
participate can however vary. The involvement can be fairly low; users can be interviewed
about their demands and needs, be observed when using a system or take part in usability
testing. The involvement may also be of more substantial character; users can, in addition to
previously mentioned involvement, participate as design assistants during the whole design
process. (Abras et al., 2004)
UCD can be considered an appropriate term for assembling insight into the art of developing
usable systems. Though users should be involved in system design, how to carry this out is
somewhat optional. (Karat, 1996)
When it comes to utilizing UCD there are many different methods that support the
philosophy, for example usability testing, usability engineering, heuristic evaluation, discount
evaluation and participatory design. It is also desirable to, early in a system design process,
gather some typical users for a quick evaluation in order to get valuable early feedback. (Abras
et al., 2004)
Kangas (2005) talks about user centered system design used in a mobile context, saying that it
is of great importance to provide the user with real usage context:
“The most important aspect of the design process is to provide the user with the real
usage context. For mobile phones this means users need to be able to touch the
buttons and see software that feels like it is actually working.” (Kangas, 2005)
UCD being a design philosophy, there are many different approaches to it and none of them
is wrong. There are no strict rules to follow, except for the importance of always involving
users in some way or another - as long as attention to the users is maintained throughout
the process, it can be considered user centered design.
2.4 Utilizing the mobile context
The following section is a summary of the article Mobile Application Design Guidelines by
Gualtieri et al. (2011). The mobile application design guidelines presented are based on
existing user experience best practices and focus on how to utilize the benefits of the mobile
context.
Gualtieri et al. suggest the following approach when designing a mobile application: create
mobile personas of your users, design for the mobile context and validate your design in a
mobile context. A persona is a vivid, narrative description of a made-up person representing a
segment of the target group. Context is always of importance when designing for a good user
experience. The context describes the environment in which users might use the application
and the features that might be most useful to the users in that environment. Finding design
solutions suitable in a mobile context is the second step one should carry out when designing a
mobile application. (Gualtieri et al., 2011)
Location, locomotion, immediacy, intimacy and device form factor are key design points for
mobile applications. Below, in Table 4, a more detailed description of these five key design
points is given, in the so-called LLIID model:
Table 4 - The five dimensions of mobile UX context: LLIID (Gualtieri et al., 2011)
Furthermore, Gualtieri et al. propose that the mobile application designer uses the LLIID
model when designing an application while simultaneously involving the context of the
persona and the scenario. This approach is described in Diagram 4 below:
Diagram 4 - Design for the mobile, persona and scenario context (Gualtieri et al., 2011)
2.5 Near Field Communication
Near Field Communication (NFC) is a proximity-based connectivity technology used for
creating short-range wireless communication ports between electronic devices through a
touch-based interface.
2.5.1 Near Field Communication technology
NFC is built upon a technology called Radio Frequency Identification (RFID) that is used for
wireless connectivity. RFID enables the capability of accepting and transmitting information
beyond just a few meters and does not need a direct line of sight to read information. It uses
radio frequency waves and can act in passive or active mode. Wireless RFID-connectivity can
be created between two nodes that are situated at a distance of up to 10 meters from each
other, depending on which wavelengths are used.
NFC is more restrictive than RFID and works only at short range. As a layer on top
of RFID, NFC is more or less compatible with existing structures, tags, and contactless smart
cards. However, it does differ from RFID. NFC, for example, operates at 13.56 MHz to
exchange data between enabled devices whereas RFID uses different frequencies between
120 kHz and 10 GHz. Communication through NFC begins when two enabled devices are in
close proximity of each other (within 4 centimeters), i.e. physically almost touching one another.
In essence, this technology brings virtual connectivity to the physical world.
(NFC Forum, 2012)
The technology is getting more and more popular and is today, in early 2013, used for
example in the Stockholm Metro system where users tap their SL Access cards against
terminals to confirm their tickets.
2.5.2 NFC Forum
NFC Forum is a non-profit organization that has about 140 member companies and works
on defining specifications in order to ensure interoperability among devices and services
utilizing NFC technology. The organization also works on educating the market about the
technology and on promoting its use. (NFC Forum, 2012)
2.5.3 How Near Field Communication works
NFC is often referred to as an application enabler, i.e. used to trigger a launch of a certain
application or function. But how does it work? NFC is basically used to perform a handshake
between two devices that are put close together. The device that first initiates the handshake is
the initiator and tries to establish a connection to the other device. Once a connection has
been established, the two devices exchange data through NFC directly or through other
wireless technologies such as Bluetooth or WiFi. NFC has three modes of operation, defined
by NFC Forum (NFC Forum, 2010) and illustrated in Picture 2, which are:
1. Read/Write mode - usually used for exchange of content information - normally an
active device retrieves information from a passive tag.
2. Peer-to-Peer mode - used when two active devices exchange data.
3. Card Emulation mode - the NFC-device emulates an NFC-tag, e.g. a contactless smart
card (such as SL’s Access cards).
Picture 2 - The three modes of operation on NFC-enabled smartphones: Read/Write,
Peer-to-peer and Card emulation (Cavoukian A., 2011)
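As a rough illustration of the Read/Write mode listed above, the sketch below shows how an Android activity can pick up the NDEF message delivered when a tag is discovered. The activity name is hypothetical and the snippet assumes the tag carries an NFC Forum text record encoded in UTF-8, so it should be read as a minimal sketch rather than a complete implementation.

import android.app.Activity;
import android.content.Intent;
import android.nfc.NdefMessage;
import android.nfc.NdefRecord;
import android.nfc.NfcAdapter;
import android.os.Parcelable;

import java.io.UnsupportedEncodingException;

// Hypothetical activity illustrating Read mode: it extracts the NDEF payload
// from the intent Android delivers when an NFC tag has been discovered.
// (For a fresh launch, the same check can be applied to getIntent() in onCreate.)
public class TagReaderActivity extends Activity {

    @Override
    protected void onNewIntent(Intent intent) {
        super.onNewIntent(intent);
        if (NfcAdapter.ACTION_NDEF_DISCOVERED.equals(intent.getAction())) {
            Parcelable[] rawMessages =
                    intent.getParcelableArrayExtra(NfcAdapter.EXTRA_NDEF_MESSAGES);
            if (rawMessages == null || rawMessages.length == 0) {
                return; // Nothing readable on the tag.
            }
            // A tag normally carries one NDEF message; take its first record.
            NdefMessage message = (NdefMessage) rawMessages[0];
            NdefRecord record = message.getRecords()[0];
            String text = decodeTextRecord(record.getPayload());
            // The application would now act on the text, e.g. show a receipt,
            // and give sound, haptic and GUI feedback for the tap.
        }
    }

    // Decodes an NDEF text record: the first byte is a status byte whose lower
    // six bits give the length of the language code that precedes the text.
    private String decodeTextRecord(byte[] payload) {
        try {
            int languageCodeLength = payload[0] & 0x3F;
            return new String(payload, 1 + languageCodeLength,
                    payload.length - 1 - languageCodeLength, "UTF-8");
        } catch (UnsupportedEncodingException e) {
            return "";
        }
    }
}

Data pushed over Peer-to-peer (Android Beam) is delivered to the receiving application through the same NDEF discovery intent, so the handling on the receiving side looks essentially the same.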
2.5.4 Usage
NFC can be utilized within basically any application scope. Examples of uses that are
commonly referred to when speaking about NFC are payment solutions, sharing of content
between devices and within the ticketing and transportation sector. Opportunities are also
created to gather valuable market research when using NFC. Imagine a customer in a store
that scans a shirt equipped with an NFC-tag with their mobile device to gather more
information on the product. The customer can then instantly retrieve information such as
availability in another store location, size information or washing advice. The information
scanned by the customers can be used by the store to gain a better understanding of their
customers’ interests - leading to a deeper relationship between the consumers and the store.
Through targeted marketing such as tailored promotions, loyalty point incentives and
coupons an enhanced shopping experience could be offered to the customer as well as
potentially increased revenue for the store.
A concrete and interesting example of NFC usage is a pilot project at Clarion Hotel Stockholm
that started in November 2010 and ran for eight months. The project tested out
the concept of replacing physical hotel room keys with digital keys, saved on the hotel guests’
smartphones. There were a total of 28 loyalty guests taking part in the project with an average
of 17 hotel stays per guest. Here is an explanation of how the pilot project worked:
“The guests made reservations using their regular channels and on the day of arrival,
they received a check-in invitation sent to them via SMS. By a link in the check-in
invitation the guests accessed an online application where they checked in and
received a hotel room number. When check-in was completed, a digital hotel room
key was sent to the NFC-enabled mobile phone. Upon arrival at the hotel, the guests
could skip the check-in line, and go directly to their room and open the door by
holding the mobile phone up against the door lock. Check out was also managed
through the mobile keys application and the digital hotel room keys were deactivated
when check out was completed.” (Assa Abloy, n.d.)
After the pilot project, Assa Abloy and Clarion Hotel evaluated it and obtained
many positive results. For example, nine out of 10 guests stated that they saved time
and appreciated the fact that they did not have to wait in line at the reception in order to
check-in. A few of the participants pointed out that they missed the interaction with the hotel
staff, but that they did not miss waiting in line at the reception. Eight out of 10 guests
thought that it would be a good idea to be able to receive additional information regarding
the hotel such as maps, room service menus and spa services via the smartphone application
as well. Assa Abloy (n.d.) concludes that:
“Eight out of 10 claimed that the mobile keys application made their hotel stay more
pleasant. Almost everybody would use the service if it was available today and their
mobile phone supported it.”
“The NFC ecosystem as well as the NFC technology is still under development. The
pilot has experienced some technical challenges in getting all the parts of the
ecosystem to work seamlessly. It has been an important learning for all the companies
involved.”
It is interesting to point out that Assa Abloy’s report did not address why they did not take the
project any further.
2.5.5 Possible benefits
NFC opens up various ways of communicating and making data transactions in a quick and
easily accessible way. It has the potential to make electronic services and other interactions
accessible to a larger number of people - young or old, technically aware or not, fully
functional or physically disabled. NFC is ideal for use in mobile contexts such as
smartphones, as concluded by Benyó et al. (2007) and Strömmer et al. (2007). The technology
presents possible benefits to both mobile users and businesses. It is a versatile technology that
can be implemented in a lot of different situations, providing a simple interaction technique,
as stated by Rukzio et al. (2006) and Ailisto et al. (2007). Convenience and simplicity of use
are examples of advantages.
Rukzio et al. (2006) e.g. state that the tapping technique “…conforms to our everyday physical
interactions as we often touch objects with our hand or fingers”. In addition, all users in
Rukzio et al.’s (2006) study found a tapping user interface the most trustworthy, secure and
least error-prone one. Ailisto et al. conclude that NFC opens up new possibilities for
physically impaired people. In their paper Physical Browsing with NFC technology (2007) they show
that users with restrictions in the physical movement of their hands and who have problems
with hand-and-eye coordination did not have any problems using an NFC interface.
2.6 NFC integration in Android applications
When developing NFC-integrated Android applications, a number of design guidelines
should be followed. When building any type of Android application one should follow
the Android Developer’s Guide (Android Developer, 2012a) and the Android Design Guidelines,
which include both design style guidelines and interaction pattern guidelines (Android
Developer, 2012b).
The NFC Basics Guidelines and the Advanced NFC Guidelines defined by Android Developer
(2012c; 2012d) should be used when implementing NFC in Android applications. The video
How to NFC by Google I/O (2011) is a very useful source for learning how to utilize the
NFC technology and implement NFC in Android applications the proper way.
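To give a flavor of what these guidelines cover, the sketch below follows the foreground dispatch pattern from the Android NFC documentation: the activity registers a specific intent filter so that only NDEF tags carrying a particular MIME type are routed to it while it is visible. The activity name and MIME type are hypothetical, and the snippet is only meant as an illustration of the “use specific intent filters” advice returned to in the conclusions, not as a prescribed implementation.

import android.app.Activity;
import android.app.PendingIntent;
import android.content.Intent;
import android.content.IntentFilter;
import android.nfc.NfcAdapter;

// Hypothetical activity showing foreground dispatch with a specific intent
// filter, so that only tags carrying the expected MIME type are dispatched
// to this activity while it is in the foreground.
public class TapAwareActivity extends Activity {

    private NfcAdapter nfcAdapter;
    private PendingIntent pendingIntent;
    private IntentFilter[] filters;

    @Override
    protected void onResume() {
        super.onResume();
        nfcAdapter = NfcAdapter.getDefaultAdapter(this);
        if (nfcAdapter == null) {
            return; // No NFC hardware on this device.
        }
        // Deliver NFC intents back to this activity instead of starting a new one.
        pendingIntent = PendingIntent.getActivity(this, 0,
                new Intent(this, getClass()).addFlags(Intent.FLAG_ACTIVITY_SINGLE_TOP), 0);

        IntentFilter ndefFilter = new IntentFilter(NfcAdapter.ACTION_NDEF_DISCOVERED);
        try {
            // Hypothetical MIME type; a specific type keeps unrelated tags
            // from triggering the application.
            ndefFilter.addDataType("application/vnd.example.tapthat");
        } catch (IntentFilter.MalformedMimeTypeException e) {
            throw new RuntimeException("Invalid MIME type", e);
        }
        filters = new IntentFilter[] { ndefFilter };
        nfcAdapter.enableForegroundDispatch(this, pendingIntent, filters, null);
    }

    @Override
    protected void onPause() {
        super.onPause();
        if (nfcAdapter != null) {
            nfcAdapter.disableForegroundDispatch(this);
        }
    }
}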
3 METHODOLOGY
In this chapter our method theory, including contextual design, target group analysis, field study,
heuristic evaluation and usability testing, is presented. Then a section on the combination of
methods is presented, followed by reliability and validity.
3.1 Contextual Design
The following section, including subsections, is based upon the much-cited article Contextual Design by Beyer and Holtzblatt (1999), unless otherwise stated.
According to Gulliksen and Göransson, contextual design is used to gain knowledge about
users in their own environment and to create detailed representations of all appropriate
contexts surrounding the user. Contextual design is a pragmatic way of applying ethnographic
methods. (Gulliksen and Göransson, 2002).
The method is considered a user centered design (UCD) method and was originally devised
by Hugh Beyer and Karen Holtzblatt (1999). The method embodies ethnographic field studies,
workflow streamlining and human-computer interaction interface design through a partly
interactive process. Beyer and Holtzblatt describe their method as:
“Contextual Design is a state-of-the-art approach to designing products directly from a
designer’s understanding of how the customer works. Great product ideas come from
the marriage of a designer’s detailed understanding of a customer’s need and his or
her in-depth understanding of the possibilities introduced by technology.”
The preferred approach of contextual design is divided into the following steps, as stated by Beyer and Holtzblatt (1999):
• Contextual Inquiry - Talk to individual customers in their workplace while they work.
• Work Modeling - Draw models representing the work of the customers you talk to.
• Consolidation - Create a single statement of the work practice of your market or organization.
• Work Redesign - Invent and develop better ways to work.
• User Environment Design - Represent the entire system for product planning, marketing, UI design, and specification.
• Mockup and Test with Customers - Test and iterate the design with customers using rough paper mockups.
When it comes to utilizing the contextual design method, Beyer and Holtzblatt state that it is favorable if the designers of a system not just understand a customer’s needs but also participate in the collection of the user data, to really get a deeper understanding of what is needed. They also conclude that if individual steps are not applicable they can be shortened or even disregarded. Steps can also be elaborated with additional techniques if suitable. The method does not contain any specific process but rather gives fragments of a whole system’s life cycle perspective. Finally, contextual inquiry is the most important part of contextual design. If only one step of the method is to be used, it is contextual inquiry.
3.1.1 Contextual Inquiry
The reason contextual inquiry is the most important step of contextual design is that it results in the requirements for the future system. Beyer and Holtzblatt state that:
“...data gathered from customers is the base criterion for deciding which needs to
address, what the system should do, and how it should be structured.”
Design teams are supposed to do one-on-one field interviews with users. The interviews are conducted at the users’ workplace in order to find out what matters in their work. To understand the users’ motivations and strategies a contextual interviewer tracks users as they work. During the tracking the interviewer asks questions about how a user, step by step, performs each action. After the tracking session a more standard interview is performed where the contextual interviewer, through discussion, develops a shared interpretation of the work with the user. By making a system designer sit down with users, the designer hopefully becomes aware of the users’ struggles, which changes the designer’s perspective on what the issues are and why they matter.
3.1.2 Work Modeling
To make the data gathered during a contextual inquiry coherent for easy sharing between
teams, work modeling is suggested. Work modeling is a way to present a user’s work. A work
model is supposed to be pictorial and compact and in that way easy to understand. A work
model should not be continuous text like a scenario or a summary.
3.1.3 Consolidation
The part called consolidation is when a design team compiles material from interviews in order to identify user patterns and put together an overall structure without losing individual variation. The consolidation is conducted in order to understand the organization and identify the needs of the customer. This results in corporate data that can be reused in future developments, if applicable.
3.1.4 Work Redesign
The work redesign is the part of contextual design where it is time to be creative. The designers take all their knowledge from the contextual inquiry and the consolidated data and discuss how they can improve the users’ work. A system is supposed to improve work practice for the people using it. It is the design team’s challenge to develop a system that really does that, i.e. a system that provides the users with functions they favor. The work redesign is supposed to result in a vision of how users will use the new proposed system. The vision should consist of a presentation of the system, when it is to be delivered and the support structures needed to make the new system successful.
3.1.5 User Environment Design
The user environment design (UED) is supposed to capture the main structure of a system. It
is also supposed to describe the function and flow of it. The system must support a good
workflow and have suitable functions and structure. To be able to achieve a good UED, paper
prototypes of the system design are created and later used for testing the user interface ideas
before developing any software.
3.1.6 Mockup and Test with Customers
In order to ensure that errors in the new design can be found before any software is developed, the UED paper prototypes are tested. This gives the users of the proposed system the possibility to easily contribute to the design of the system. Mockup testing is a good way to avoid arguments
and disagreements. Beyer and Holtzblatt state that:
“…it’s important to test and iterate a design early, before anyone becomes invested in
the design and before anyone spends time writing code.”
Paper prototypes also keep the designers’ focus on the basic structure instead of details in the user interface, which otherwise may take up resources in vain, especially if that part of the interface is later redesigned.
Contextual design is basically a user-centered design method that is supposed to help system development teams learn about and consider potential users’ issues regarding a system.
3.2 Target group analysis
A target group analysis is a method used to identify groups of people with common usage
patterns. It has many convergent elements compared to the contextual design method. The
result of a target group analysis explains how a particular job or certain tasks are performed
and can e.g. be used as the base when writing a use case specification. A target group analysis
usually consists of two parts when it comes to software development. First one has to define
who the users of the system will be. Secondly one has to determine what the users will
accomplish when using the system. Both these steps are supposed to be completed before
starting the design process of the system or at an early stage in the design process.
(Gould, 1988).
3.3 Field study
In contrast to laboratory studies, field studies take place in real world settings. Through
observations and interviews data is gathered in accurate social and cultural context. Field
studies most often result in large amounts of grounded qualitative data.
(Kjeldskov & Graham, 2003)
A field study can deliver comprehensive and thorough data, and one may therefore be inspired to conduct usability testing in the field as well. Field testing may however not be worthwhile when evaluating user interfaces in order to improve user interaction, because of the costly resources needed. It is thus good usability practice to conduct a field pilot test or a contextual inquiry prior to conducting usability testing in a laboratory setting, in order to build up a homogeneous user environment. (Kallio et al., 2005)
3.4 Heuristic evaluation
Heuristic evaluation is an informal usability inspection method for finding usability problems
in a user interface so that they can be addressed as part of the iterative design process. The
method involves having a small set of evaluators examine whether the interface follows recognized usability principles, known as the heuristics. (Nielsen and Molich, 1990; Nielsen 1994)
Nielsen (1994) proposes ten general principles for user interface design. He calls them
heuristics because they are more in the nature of rules of thumb than specific usability
guidelines. The ten principles are listed with a short explanation of each principle on the next
page.
1. Visibility of system status: Keep users informed about what is going on.
2. Match between system and the real world: The system should use real-world conventions
and speak the users' language.
3. User control and freedom: Users often choose system functions by mistake. Support undo and redo.
4. Consistency and standards: Follow platform conventions.
5. Error prevention: Careful design that prevents problems from occurring is of course the
best design practice. Descriptive error messages are however needed when problems
actually occur.
6. Recognition rather than recall: Instructions for use of the system should be visible or
easily retrievable whenever appropriate.
7. Flexibility and efficiency of use: Shortcuts may speed up the interaction for expert users.
8. Aesthetic and minimalist design: Dialogues should not contain information that is irrelevant or rarely needed.
9. Help users recognize, diagnose, and recover from errors: Error messages should be expressed
in plain language (no error codes), precisely indicate the problem, and constructively
suggest a solution.
10. Help and documentation: Even though it is better if the system can be used without
documentation, it may be necessary to provide help and documentation.
3.5 Usability testing
Usability testing is a commonly used evaluation technique for evaluating systems regarding user acceptance and performance. It is a trustworthy way to measure user and system performance, but also to get feedback regarding user satisfaction. (Wichansky, 2000)
Usability testing is an evaluation method that may be conducted in many different ways. The common goal, no matter the approach, is to evaluate a system or service. User tests are supposed to be carried out with members of the target group that is intended to use the product or service being tested (Rubin and Chisnell, 2008). The purpose of a usability test is to find different types of usability issues that may arise between the user and the system. It is important to point out that it is the system that is being tested, not the user. Usability tests are often performed in controlled environments where the interaction between the user and the system can be monitored and recorded. According to Rubin and Chisnell (2008), five users per test group is sufficient to find most of a system’s usability problems.
Rubin and Chisnell (2008) further recommend that a pilot test is carried out before the user testing is conducted. This means that an initial user performs a test to verify that all elements are working, that the issues are clear, that all areas intended to be investigated actually are investigated and so on. The pilot test is a way to revise the real test, i.e. to expose potential problems and eliminate them before the real testing is performed.
Concerning the choice of testing environment in mobile HCI studies, a laboratory environment is appropriate for evaluating design ideas, specific products or theories about design or user interaction. It is favorable to not have any interference from the outside world, in order to keep the user focused on the task at hand. (Kjeldskov and Graham, 2003)
3.5.1 Observation
The observation method is a common and reliable method used in human-computer interaction studies. One of the biggest advantages of the method is that the person observing gets to see the interaction process in real life, first hand, which is not the case with answers from interviews or surveys, which tend to be subjective. The observation method is exceptionally good when the interaction to be evaluated is hard to describe, for example manual processes where a complex cooperation with other people and/or devices exists (Benyon, 2010). Though the observation method is powerful, it can be hard to observe the users discreetly enough. Users tend to notice that they are being observed and may therefore change their behavior, which might lead to unreliable results.
3.5.2 Video observation
Video observation is most often used as a complementary resource to the observation. Gould
(1988) states that:
“Besides being useful for measuring time, errors and user attitudes, brief videotapes
of users attempting to use a new system have tremendous impact upon management,
especially where users are having problems”.
Since the users’ interactions are recorded they can be played back and analyzed afterwards. After a first look at the material and a compilation of the desirable data one may easily go back and look at specifically interesting parts of the interaction. This makes video observation powerful, but the analysis process is time consuming and therefore sometimes considered cumbersome.
3.5.3 Semi-structured interviews
People use vocal communication on a daily basis to interact with each other face-to-face. Most often they exchange experiences, feelings and hopes in a relaxed way. Kvale (1996) suggests that an interview should be based on everyday conversations, but without being unprofessional. A professional interview method enables many different types of possible interview techniques, from more spontaneous ones with diffuse answers to more structured interviews with more defined, specific answers. A problem with structured interviews is the risk of guiding respondents to answers they do not stand for, as found by Leech (2002). To minimize this risk Leech (2002) suggests the use of semi-structured interviews, where open-ended questions hopefully lead to detail and depth while at the same time giving a better understanding of the interviewees’ perspectives. Semi-structured interviews also allow for hypothesis testing and quantitative analysis of interview responses.
3.6 Combination of methods
A common approach when performing a study is to use multiple methods for data collection. Which methods to apply depends mainly on the focus of the study, the participants involved, the nature of the technique and the resources available. Regarding the combination of different methods there is no right or wrong, as long as all the factors stated above are taken into account. (Preece et al., 2007).
(Preece et al., 2007).
In other words, there is no specific combination of methods suitable for a specific scenario; every study has to be designed according to the factors previously stated, in order to fulfill a specific aim. For example, at the beginning of a project one may not have specific questions, and it is therefore more productive to explore difficulties through observations and interviews instead of sending out questionnaires. (Preece et al., 2007).
Conducting interviews is thus a good starting point for a study, since interviews are suitable for exploring issues, which is of interest in the beginning of any study. Table 5 below shows a comparison of some of the methods used in this study.
Table 5. Comparison of methods, based on matrix by Preece et al. (2007)
3.7 Reliability & Validity
It is important to critically question the methods used to collect data. When questioning
methods one may consider a method’s reliability and validity. Reliability describes how certain
it is that the results would be more or less identical if the study was to be performed again
under the same premises and using the same methods. The validity of a method can be
explained as the trustworthiness and is supposed to answer the question: does the method
really measure what it is intended to measure? Both the reliability and validity of methods are
important to consider when performing a study. (Joppe, 2000 referenced in Golafshani, 2003).
4 PERFORMING THE STUDY
This chapter presents how we have performed our study. The first section is about our design
process. The second section is about our study in general. Lastly the aim of each sub-study is
presented followed by our overall aim.
4.1 Design process
When performing our study we have followed a user-centered approach and based our work process on the key elements of the User centered design process. The four key elements - analysis, design suggestions, evaluation and feedback - their relation and how they fit into our study are shown in Diagram 5 below. We have chosen the methods used in our study according to the principles of user-centered design. The diagram gives a general description of how our study has been conducted; each sub-study involves different methods.
Diagram 5 - A description of our workflow based on the user-centered work process, defined
by Gulliksen and Göransson (2002).
As shown in Diagram 5, Martin has taken a solution-focused approach in his work, focusing on the analysis of users, activities, tasks and context of use, but most importantly the acquisition of design suggestions. Martin has thereby taken the role of user interface designer. The result of Martin’s work has been used as the basis for the three concepts that have been exemplified by prototypes. The sub-studies, and each correlating prototype, are described in the following sections.
Patrik has had a greater focus on problem-based and analytical work, taking the role of usability evaluator. Patrik’s area of responsibility and his work revolved around designing the usability tests used to evaluate the prototypes. Patrik has also been responsible for providing feedback to Martin based on the results of the usability tests.
The prototype development has been performed together using pair programming. By
developing iteratively and together we have been able to develop more sophisticated
prototypes. Throughout our study we have helped each other by e.g. brainstorming together
and the work performed has been iterative in the sense that we constantly have been giving
and receiving feedback regarding each other’s work.
4.2 Study
During our study we have performed three sub-studies, one for each of the previously proposed concepts, which are: Sparakvittot, Medicine Handling and TapThat. Each concept
describes a possible scope of use for the tapping technique. Within each sub-study a prototype
has been developed, in order to exemplify the correlating concept. The aim of each sub-study
is given below, followed by our overall aim.
4.2.1 Pre-study – Sparakvittot
The aim was to explore whether an implementation of the tapping technique in the Sparakvittot Android application could enhance the user experience.
4.2.2 Medicine handling
Medicine handling is a security-sensitive problem where nurses and other personnel need to be assured that their NFC service is indeed functioning correctly, allowing them to distribute the right medicines to the right patient. In this sub-study, we explored whether tapping could be implemented in a sufficiently stable way to ensure a more secure medicine handling process.
4.2.3 TapThat
With our last sub-study we wanted to explore whether differences in the interaction design, compared to the two previous prototypes, were needed when implementing the peer-to-peer mode of operation.
4.2.4 Overall aim
As previously stated our research question was whether we could design a tapping interaction
model that could work in several different domains, under quite different demands, allowing
users to learn and carry over their understanding of how to use NFC between different
applications. While the development of our prototypes has served as our empirical material, the finished prototypes have defined a potential design domain within which to evaluate the tapping interaction.
5 Sparakvittot
In this chapter our pre-study is presented. Firstly a background to the subject is given. The
prototype development and the market study are then described followed by an analysis of the
results.
5.1 Background
Through our commission body, The Mobile Life, a company named Findity contacted us.
They were interested in NFC and wanted to investigate the possibilities of improving their
service by implementing the tapping technique into their smartphone application, Sparakvittot.
Findity is a company aiming to reduce paper receipts in the retail business, by offering digital
receipts, through their service Sparakvittot (Save the receipt). The service helps users, i.e. retail
customers, to keep track of their receipts digitally. Simultaneously it helps retailers to improve
their business advantage. More specifically Findity’s aim is to save the users from the hassle of
handling paper receipts while also giving the retailers information about their customers’
shopping behaviors.
Retailers subscribe to Sparakvittot and receive a software module, which needs to be installed
in their pay point system. After the installation the retailer can offer their customers digital
receipts. The users sign up to the free service on a computer or through a smartphone
application, currently available for iOS, Android and Windows Phone, and are after a
successful signup ready to receive digital receipts. After a purchase at a retailer connected to
Sparakvittot, each customer is offered a paper receipt, a digital receipt or both. If customers
want a digital receipt they have to verify themselves through an ID card, which gets scanned
by the cashier. The system then saves the customers’ receipts on a server, provided and owned
by Findity. The users can then later follow up their purchase history by logging onto a website
or using one of the smartphone applications. In addition to keeping track of transaction
histories, the users get offered loyalty points and coupons from retailers connected to the
service. The retailers can likewise follow the purchase patterns of their customers and offer
them additional values to their shopping experience.
It is worth mentioning that Sparakvittot was initially intended to be used as a web client on computers only. Findity decided to also develop Android and iOS applications as a test to see how well their service worked on smartphones. In 2012, 95% of the usage of the service came from the smartphone applications, which indicates the power of certain services being used on the go.
For us, there were several problems to investigate regarding a possible integration of the tapping technique in Sparakvittot’s Android application. We needed to determine how to best utilize the tapping interaction in the application and adapt the user interface to meet the new interaction possibilities. We also needed to keep the rollout factor in mind, since there was a potential for the tapping interaction technique to actually be implemented in the Sparakvittot application, given that the study resulted in positive conclusions.
Shortly after our first meeting with Findity they asked us if we could develop a prototype that
could identify a customer via tapping interaction. If possible they also wanted us to show how
a digital receipt could be sent straight from an NFC reader to a user’s smartphone, instead of
first going through a server which was the current solution. The prototype was supposed to be
shown at Sweden’s biggest fair for retailers and suppliers called Butiks- och leverantörsmässan
(Retail and suppliers’ fair). At the fair we would get the opportunity to conduct informal
interviews with interesting people from the retail business, such as employees of the main pay
point system providers in Sweden, to hear what they thought of NFC and investigate the
possibilities of integration into their systems.
5.2 Prototype development
After finishing a literature study about NFC and the use of tapping-based interaction
techniques we developed our tapping interaction enabled prototype on top of the already
existing Sparakvittot Android application. Since we only wanted to show specific functionality – identification through NFC and a wireless transfer of the receipt – we started off by stripping down the existing application, removing all unnecessary views and features that did not fit our purpose. The prototype was then developed together, following the Android design guidelines.
5.2.1 Prototype
The finished prototype demonstrates tapping interaction as an application enabler and shows how users can identify themselves and receive a digital receipt by tapping a smartphone to an NFC hardware module attached to a pay point terminal. Due to some technical difficulties we actually made the data transfer through a WiFi connection between our simulated pay point terminal and the smartphone, but the concept looks and feels as if it was powered only by NFC. The pay point client was a Java program developed by us. During the demonstration the pay point client was running on a laptop to simulate a pay point terminal. The user interfaces of the prototypes and a flowchart describing a typical interaction between them are given below in Sparakvittot Flowchart 1. The picture in the top left corner shows how the user interface of our prototype looks after launching it, which can be done either by tapping the phone onto an NFC-tag, representing an NFC-enabled pay point terminal in a retail store, or by clicking on the application icon on the smartphone’s start screen, which is the traditional way of launching a smartphone application.
38 Sparakvittot Flowchart 1 - NFC interaction between a pay point terminal and the Sparakvittot
smartphone application.
The picture in the top right corner of Sparakvittot Flowchart 1 shows the user interface of our pay point terminal. By clicking on the item icons on the right, one adds items to the shopping cart, represented by the big white area with the sum below, 0.00 kr in this case, which is located in the bottom left part of the interface. In the middle left picture, items have been added to the cart by the cashier. When the customer has paid, the cashier asks whether the customer wants a paper receipt, a digital receipt or both. If the customer answers that a digital receipt is desirable, the cashier presses the Sparakvittot icon in the bottom right corner of the pay point terminal interface. In the bottom left corner, below the shopping cart area, “Ready to send the receipt” (“Redo att Sparakvittot”) pops up. The customer then taps the NFC-equipped payment terminal with an NFC-enabled smartphone, and the Sparakvittot-ID currently logged onto that smartphone is sent through NFC to the pay point system software, which validates it with the Sparakvittot back-end server. If the user is authorized, the user’s name pops up on the cashier’s screen and the receipt is sent through NFC to the smartphone, described by the arrows 2a above. If an error occurs, e.g. if the customer pulls away the smartphone too quickly during the identification process, an error dialogue pops up on the customer’s smartphone screen saying “No receipt was sent” (“Inget kvitto skickades”).
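Since the report describes this exchange only at flow level, the following Java sketch is purely conceptual: it assumes that the tapped tag delivers the simulated terminal’s network address and that the Sparakvittot-ID and the receipt are then exchanged as single text lines over the WiFi connection mentioned above. The class, method and protocol details are our own assumptions and not the prototype’s actual implementation.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.PrintWriter;
    import java.net.Socket;

    // Conceptual sketch of the identification/receipt exchange in
    // Sparakvittot Flowchart 1; names and message format are assumptions.
    public class ReceiptExchange {

        // Sends the user's Sparakvittot-ID to the simulated terminal and returns
        // the receipt payload. If the exchange fails, the caller is expected to
        // show the "Inget kvitto skickades" dialogue.
        public static String fetchReceipt(String terminalHost, int terminalPort,
                                          String sparakvittotId) throws Exception {
            Socket socket = new Socket(terminalHost, terminalPort);
            try {
                PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
                BufferedReader in = new BufferedReader(
                        new InputStreamReader(socket.getInputStream(), "UTF-8"));
                out.println(sparakvittotId); // identification step
                return in.readLine();        // receipt payload (arrows 2a)
            } finally {
                socket.close();
            }
        }
    }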
5.3 Market study
Parallel to the development of the prototype we performed a market study to find out what was needed for a rollout to be made possible. We considered which actors would have to be involved and what their demands might be. After our second meeting with Per Einarsson, the CEO of Findity, we came to the conclusion that an integration of NFC involved more actors than we first thought.
5.3.1 Prototype demonstration
The demonstration of the prototype took place at Butiks- och leverantörsmässan at Kistamässan (the Kista fair) in Stockholm on the 28th of March 2012. We stood at Sparakvittot’s stand with our prototype of the pay point terminal running on a laptop and the Android application running on an NFC-enabled Android smartphone. During the day we talked to representatives from pay point system providers and retail owners with a lot of experience and valuable knowledge about the technology used to power the retail market, and let them test our prototypes. In the afternoon we also had two longer discussions with developers from two of Sweden’s biggest pay point system providers, which resulted in valuable insights regarding pay point systems in general.
5.3.2 Result of market study
Most people that tried our prototype at the fair liked the idea of using the tapping technique for identification and for receiving receipts directly through NFC. They thought that it showed a really cool and pleasant way of retrieving digital receipts. Some retail owners testing the prototypes questioned whether customers would look upon the interaction method with skepticism. In analogy with this possible suspicion, representatives from pay point providers pointed out that it took several years before the general public in Sweden trusted the use of chips in their credit cards instead of the older magnetic stripes. This does not mean that the technology is not worth developing further, though, since it often takes decades before a new technology is accepted by major corporations and societies and made available for the masses to use.
The retailers we interviewed seemed to like the fact that customers could get their receipts through NFC straight after a purchase, without going through Sparakvittot’s servers first, which reduces the need for the smartphone to have an Internet connection in the store. This would benefit stores with bad mobile Internet reception, since customers in such cases otherwise have to wait for an Internet connection before the receipt can be sent to them.
During the day at the fair we conducted informal interviews with two representatives from
Unikum and Datorrama, which are two of Sweden’s biggest pay point system providers. During
our discussions with the representatives we found that a roll out of Sparakvittot with NFC
integration, most likely, would be challenging to perform at that point, early 2012. The main
reason for this is that most retailers buy hosted pay point systems from companies like Unikum
and Datorrama. When Sparakvittot’s current solution, without NFC-support, is to be
implemented in a store, a software plugin needs to be installed into the pay point system. The
pay point system provider performs this installation. Customers who want to receive receipts
digitally are identified through the barcode on their ID-cards, which gets scanned by a barcode
scanner. That barcode scanner is keyboard emulated and connected to the pay point terminal
via USB. In other words the barcode scanner scans the social security number of the ID card
and then automatically types it into a field in the pay point terminal as if the barcode scanner
was a keyboard. The keyboard emulation solution is plug-and-play, meaning that there is no need for additional software or drivers to get it to work. The same kind of solution would be favorable for NFC, and there are keyboard-emulated NFC readers available on the market. But since the Android SDK currently does not support the card emulation mode of operation, a similar solution with keyboard-emulating NFC readers cannot be applied. In a working NFC
environment, there are generally many actors involved. After discussions with representatives
from Sparakvittot we found that the involved actors in their case are:
• Retail customers - the end users of Sparakvittot
• Pay point system providers
• Retailers offering Sparakvittot to their customers
• Software developers (developing Sparakvittot’s client applications)
• Phone manufacturers
• Phone network operators
5.4 Analysis
The users that tested our prototype at Butiks- och leverantörsmässan were in general positive towards an NFC-integrated Sparakvittot application. The general impression was that the test users liked the feature of identifying themselves by just tapping their smartphone to an NFC hardware module. They also seemed to like the idea of getting their digital receipts sent directly to their smartphones instead of having to wait for the synchronization with the servers, which is the current solution.
Regarding the investigation of the technical infrastructure needed to create an NFC ecosystem surrounding the NFC-integrated Sparakvittot Android application, our results show major challenges to overcome in order to realize a rollout.
The results of the interviews with the representatives from Unikum and Datorrama point out that without meeting the pay point system providers’ requirements, an NFC-integrated end user solution is probably not worth implementing for Sparakvittot. The challenge is not implementing NFC in Sparakvittot’s smartphone application; it is rather implementing NFC in the surrounding infrastructure and, to begin with, convincing retailers to invest in the technology.
6 Medicine handling
In this chapter our second sub-study is presented. The chapter starts with the design process of the prototype, followed by a description of the development of the application. The evaluation of the prototype is then described and an analysis of our results is outlined.
Previous work (Lahtela et al., 2008; Broll et al., 2007) has proposed utilizations of RFID and NFC within health care applications. Given the possibilities of creating more sophisticated technical solutions with the NFC technique – NFC being integrated into the Android SDK – we decided to approach the health care area to look for possible implementations of the technique in that context. We ended up developing a concept aimed at easing nurses’ medicine handling process.
6.1 Design
The first steps in designing a system are to determine who the users of the system will be and what they will be doing with it, according to Gould (1988). In correlation with Gould’s suggestion, the User centered design philosophy proposes that an analysis of users, activities, tasks and context of use is a good way to start.
6.1.1 Contextual analysis
We felt that we first needed to conduct a literature review in order to get more acquainted
with the medicine handling process. Qualitative data was also needed in order to define the
contexts more specifically and a target group analysis in the shape of semi-structured interviews
was therefore performed.
Literature review
A literature review was conducted to find a theoretical foundation in order to better
understand the working process of hospital nurses, problems related to medicine handling
and other related facts. A major aim of the literature review was also to learn more about the
terminology used by nurses, in order to later perform a target group analysis in the shape of interviews to find out more specific facts about their work situation.
Target group analysis
To further establish the target group and analyze the contexts surrounding them at their work
a target group analysis was conducted. Four nurses working at different units of the Linköping
university hospital were contacted and semi-structured face-to-face interviews with three of
them were held on the 30th of April 2012. The fourth nurse was interviewed at her work, in
shape of a contextual inquiry. The questions asked in the interviews concerned whether the interviewees’ unit performed medicine handling the traditional way or with the help of medicine carts, and which sorts of problems they could identify in the medicine handling process, to name some of the most important questions asked.
6.1.2 Result of contextual analysis
The interviewed nurses were at the time of the interviews working in different units of the
university hospital in Linköping, Sweden. Their work situation, regarding medicine handling,
is therefore specific to those nurses, although some of the information can be seen as general for other hospital nurses as well.
Nurses’ Medicine Handling
The university hospital in Linköping is divided into different units. Each unit is divided into
modules, which each in turn hold somewhere between five and nine patients. Each nurse is
responsible for distributing medicines to all the patients in one module. The distribution is
however only one part of the medicine handling process.
Medicine handling consists of three parts: ordination, preparation and distribution. Doctors
make the ordination and nurses perform the preparation and the distribution
(Nordström and Virén, 2010). The preparation consists of splitting and measuring the proper
medicines for each patient while the distribution includes giving the right patient their
medicines at the correct time. After each preparation is finished the nurse must sign the
performed action on a workstation computer. The same applies after a distribution is
performed, all in order to ensure patient safety. Medicine handling is one of the nurses’ common work tasks and is traditionally performed in the following way:
Preparation
1. The nurse prints an ordination list for each of the patients under her responsibility.
2. The nurse walks into the medicine room to prepare a medicine cocktail for each of
the patients.
3. While preparing a patient’s medicine cocktail the nurse manually picks out the
medicines, according to the ordination list, and validates that the proper medicine
and dose gets added.
Distribution
1. The nurse walks over to a patient, identifies the patient manually by asking the
patient for name and social security number, or by looking at the patient’s wristband
in case the patient is unable to talk.
2. When a patient is identified the nurse helps the patient to take the medicines and simultaneously makes sure that the patient actually takes the medicines.
3. The nurse is then obligated to sign the distribution, which is done on a workstation computer, placed in an office connected to the medicine room, i.e. outside the ward.
Some units at the Linköping university hospital use medicine carts as an intermediary in the medicine handling process. Normally one medicine cart is used for each module. The medicine carts are normally placed in the corridor outside of the wards and work as a temporary medicine room, so that the nurse can perform the preparation phase without having to walk all the way to the medicine room. Adjacent to each medicine cart a workstation computer is placed, so that the nurse can check each of her patients’ ordination lists and sign the preparations and distributions in a more flexible way. Normally the medicine
carts are prepared with the medicines to be used twice a week. The difference with using a
medicine cart compared to the traditional way of performing the medicine handling is that
the nurse has easier access to both the medicines that are distributed and the work station
computer, which speeds up the process.
Medicine handling is a time-consuming work task for nurses and occupies up to 40% of their
total work time according to Armitage and Knapman (2003). A common type of bodily injury
within health care is an injury that occurs due to improper handling of medicines, which is
estimated to cause 26.8% of the total amount of health care-related injuries in Sweden
according to the National Board of Health Supervision (Socialstyrelsen). It is further
estimated that one in ten hospital days is used to give care to patients exposed to incorrect medicine handling. (Socialstyrelsen, 2007 referenced in Nordstrom and Viren, 2010). Many of the mistakes made by nurses in the medicine handling process relate to a stressful work situation. Due to stress, nurses sometimes disobey the rules by signing the distributions before actually performing them. This is not safe for the patients, in case the nurse completely forgets the distribution or performs the distribution at a later point than intended. The nurses are exposed to a number of disturbing moments during their daily routines, and these commonly relate to the medicine handling process. For example, another patient or a colleague may need help with something, which might lead a nurse to forget to sign a distribution. Lahtela et al. (2008) conclude that there are seven different errors that can be committed in the medicine handling process at a hospital:
1. Wrong dose of the medicine
2. Wrong time of distribution of medicine
3. Too rapid intravenous velocity (IV rate)
4. Wrong concentration of the medicine
5. Wrong way of distributing the medicine
6. Wrong medicine distributed
7. Wrong medicine because the patient is not identified correctly
Lahtela et al. also present five key points indicating that the medicine handling process is properly performed, in order to ensure patient safety:
1. Correct medicine
2. Correct patient
3. Correct dose
4. Correct way of distributing the medicine
5. Correct time
Cosmic
Cosmic, which is an electronic healthcare record (EHR) system, is used by the hospitals of the
county of Östergötland and thereby also by the university hospital in Linköping. Cosmic is
installed on laptops or desktop computers that are placed in different locations throughout each hospital unit, such as the nurses’ station, the medicine room and adjacent to the medicine cart.
Whenever nurses need to check an ordination list they log onto one of the laptops and
launch Cosmic. They choose a patient and the patient’s ordination list is shown. After
preparing the medicines they sign the preparation by clicking on the medicines in the
ordination list, one at a time or by marking several, and then choose Prepare (“Iordningsställ”).
When the preparation is completed they distribute the medicines to the patients. They walk
back to the computer and sign the distribution, which is similar to signing a preparation. Two
pictures of the interface of Cosmic, Picture 3 and Picture 4, are shown on the next page.
Picture 3 - List of patients in Cosmic.
Picture 4 - A patient’s ordination list in Cosmic.
6.1.3 Contextual design
In order to understand the interaction between the user and the system in the specific environment, Gulliksen and Göransson (2002) argue that contextual design is a good method to use. Beyer and Holtzblatt (1999), who originally coined the term Contextual Design, describe
their method by saying:
“The best product designs result when the product’s designers are involved in
collecting and interpreting customer data and appreciate what real people need.
Contextual Design gives designers the tools to do just that”
A proposed design of a system that could help the nurses was now to be developed according
to the principles of the contextual design model. A contextual inquiry, which is the most
important part of the model, was performed with the fourth of the interviewed nurses at her workplace. We were shown around the unit for about 45 minutes while asking questions about the medicine handling process and being instructed how it is performed in practice.
Before this, a semi-structured interview with the nurse was performed according to the same
protocol as the other semi-structured interviews.
A work redesign was then conducted. The consolidated data worked as a base as we discussed
how a smartphone application with NFC support could be used to improve the medicine
handling process for the nurses. In order to concretize the result of the work redesign a first set
of mock-ups was developed following the principles of the fifth step of the model, the user
environment design. The main purpose of this step is to capture the main structure of the
proposed system and to describe the function and flow of it.
Early on in the design process, a solution that enabled validation of medicines during the preparation phase, by tapping medicine jars with the smartphone, was suggested. However, the contextual analysis pointed out facts that made us drop this idea. We found that the preparation of medicines was performed in different ways in different units and that the task model differed to the extent that a general solution could not be found.
6.1.4 Result of contextual design
Picture 5 below shows the first sketch of the application. Digital mock-ups were also created to demonstrate the possible interaction flow of the system, as seen in Picture 6.
Picture 5 – The first sketch of the look and feel of the possible application.
Picture 6 - The first digital mock-ups of the application aimed to show the possible interaction
flow of the system (the flow goes from left to right, one row at a time).
The last step of the Contextual design model is Mockup and Test with Customers, consisting of presenting paper prototypes to the users of the proposed system in order for them to easily contribute to the design of the system and quickly find possible errors. This step was performed on the 13th of April with two of the previously interviewed nurses and gave a lot of insightful feedback. The most important feedback received was that the proposed design did not correlate with either the interaction flow or the placement of interaction elements in the GUI of Cosmic. We therefore asked both nurses to draw a sketch of Cosmic, to be used as the source when iterating the design of the application. One of the sketches is presented in Picture 7 on the next page.
Picture 7 - Sketch of Cosmic drawn by one of the nurses that provided feedback during the
mock-up demonstration.
It is important to point out that the sketch is a representation of Cosmic as it is used on the
computers provided by each hospital unit. In order to convert this GUI to the dimensions of
a smartphone, the GUI needed to be stripped down. A new proposed design was created in
order to exemplify a possible look and feel of the proposed system, based on the drawings and
feedback from the nurses. The proposed design is illustrated on the next page in Picture 8.
Picture 8 - A final digital mock-up of the proposed application
6.2 Prototype development
By developing a high-fidelity prototype named PatientSafetyPrototype, to be used by hospital nurses in a medicine-handling context, we hoped to ease the nurses’ work process and ensure higher patient safety. With the prototype we have exemplified a possible implementation of the tapping interaction technique utilizing the NFC Read/Write mode, which was the most suitable mode to use considering the context.
6.2.1 Prototype
Our aim with the prototype was to help nurses in two ways. Firstly, we helped them by transferring the EHR information, usually accessed through a computer with Cosmic, to a smartphone application. By making the EHRs ubiquitous we prevented the nurses from running back and forth to a computer, often placed in the medicine rooms, saving them time. Secondly, and more importantly for our study, we utilized the tapping technique as part of the smartphone application’s user interface, which enabled the nurses to tap their ID cards with the smartphone in order to authorize themselves. They also tapped medicine boxes, for retrieval of the correlating ordination list, and patients’ wristbands for identification of the patients.
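How the prototype tells these three kinds of tags apart is not specified in this report, but the following hypothetical Java sketch illustrates one possible approach, assuming that each tag stores a text payload with a role prefix followed by an identifier; all names and the payload format are our own assumptions.

    // Hypothetical dispatcher for the three tag roles used by
    // PatientSafetyPrototype; the payload format is an assumption.
    public final class TagDispatcher {

        public interface Listener {
            void onAccessCard(String nurseId);      // log in / authorization
            void onMedicineBox(String patientId);   // open the patient's ordination list
            void onWristband(String patientId);     // identify the patient at distribution
            void onUnknownTag(String payload);      // tag outside the expected semantics
        }

        public static void dispatch(String payload, Listener listener) {
            if (payload.startsWith("access:")) {
                listener.onAccessCard(payload.substring("access:".length()));
            } else if (payload.startsWith("medbox:")) {
                listener.onMedicineBox(payload.substring("medbox:".length()));
            } else if (payload.startsWith("wristband:")) {
                listener.onWristband(payload.substring("wristband:".length()));
            } else {
                listener.onUnknownTag(payload);
            }
        }
    }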
User interface and functionality
The prototype’s GUI is based on the GUI of Cosmic. The purpose of this was to reduce the
cognitive load for the end users, by designing an interface with features they were already
familiar with. The GUI differs from the computer version not only by being stripped down
but also by the touchscreen interaction it enables. For example, as previously mentioned, the
nurses are able to access their patients’ ordinations ubiquitously, e.g. in the medicine room, in
a ward or on the fly.
So how does it work? Firstly the nurse needs to start the application and log in, in order to
access the application data. This interaction flow is described in PatientSafetyPrototype
Flowchart 1 below.
PatientSafetyPrototype Flowchart 1 - Flowchart of the log in process after starting the
application.
The nurses can start the application in two different ways: the classic way, by clicking on the application icon as seen in step 1 of PatientSafetyPrototype Flowchart 1 above, or the tapping interaction way, by tapping an access card with the smartphone. If users choose the classic way they are presented with step 2 and have to identify themselves by tapping their access cards. If they instead choose the tapping interaction way they automatically skip step 2 and jump to step 3 right away. They log in by entering their PIN codes and clicking on the log in button (“Logga in”). When the users are identified and authorized they are presented with a list of their patients. PatientSafetyPrototype Flowchart 2 below shows the patient list.
PatientSafetyPrototype Flowchart 2 - The list in portrait mode on the left, step 5, and in
landscape on the top right, step 6, followed by the detailed information dialog after a long click
has been performed, step 7.
In PatientSafetyPrototype Flowchart 2, step 5 shows the GUI in portrait mode after a successful log in. The smartphone can also be rotated 90 degrees in order to show the same view in landscape mode, if the users prefer that. The nurses choose a patient in the list either by single-clicking that row of the touchscreen GUI, as shown in step 6, or by tapping the phone to an NFC-equipped medicine box. Hospitalized patients have their own box, which is stored in the medicine cart. The nurses can also choose a patient in the list by tapping the patient’s NFC-equipped wristband. By long-clicking one of the patients in the list, the nurses are given more detailed information about that patient, like registration date and planned discharge date, which is shown in a pop-up dialogue.
When a patient has been selected, either by single clicking in a row or by tapping a medicine
box or a patient’s wristband, the nurses are presented with the ordination view in step 8,
shown in the PatientSafetyPrototype Flowchart 3 below.
PatientSafetyPrototype Flowchart 3 - Preparation, distribution and signing of medicines.
As previously stated, the preparation of medicines is the first step for the nurses when
performing medicine handling. When a preparation is complete the nurses are obliged to sign
the preparation in the system. How to do this in our prototype is shown in step 9 where the
user has single-clicked each of the prepared medicines in the time column 08:00. Each
prepared medicine is indicated by a yellow background in the corresponding cell. When all the medicines in a column have been prepared, the Prepare (“Iordningsställ”) button pops up in the top right corner of the GUI. The nurses sign the preparation by pressing this button and are then presented with step 10. Italic and bold text indicates that these medicines have been prepared. The nurses distribute the prepared medicines by walking to their patients’ wards and tapping their wristbands in order to identify them. A dialogue, which indicates whether the correct patient has been identified or not, pops up, as seen in step 11. The nurses make sure that the medicines are properly distributed and then sign the distribution, as seen in step 12. In the same manner as for the preparation signing, a distributed medicine’s corresponding cell gets a turquoise background. When the whole column is selected as distributed, i.e. all cells have a turquoise background, the Sign (“Signera”) button pops up in the same place as the Prepare button. After clicking Sign (“Signera”), the time of signing and the nurse’s initials are shown in each cell, indicating that a distribution has been performed, step 13.
The feature described in step 11, i.e. a dialogue showing that the correct patient has been identified, was implemented after the pilot test of the prototype, based on the pilot test user’s feedback. We also felt that it would be a good idea to apply the same principle when the users tap the wrong patient. A dialogue showing “NOT THE SAME PATIENT” (“EJ SAMMA PATIENT”) was therefore given to the user in those cases. Picture 9 below shows this dialogue.
Picture 9 - Dialogue used to give the user feedback.
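The check behind step 11 and Picture 9 can be sketched in Java as follows. This is an illustration only: the method name, the parameters and the wording of the positive message are our own assumptions, while the negative message is the one used in the prototype.

    import android.app.Activity;
    import android.app.AlertDialog;

    // Illustrative sketch of the patient identification check; names are assumptions.
    public final class PatientCheck {

        // Compares the patient ID read from the tapped wristband with the patient
        // whose ordination list is currently open, and shows a feedback dialogue.
        public static void onWristbandTapped(Activity activity,
                                             String selectedPatientId,
                                             String tappedPatientId) {
            boolean samePatient = selectedPatientId.equals(tappedPatientId);
            String message = samePatient
                    ? "Patienten identifierad"   // assumed wording for the positive case
                    : "EJ SAMMA PATIENT";        // wording from the prototype (Picture 9)
            new AlertDialog.Builder(activity)
                    .setMessage(message)
                    .setPositiveButton(android.R.string.ok, null)
                    .show();
        }
    }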
6.3 Evaluation
The evaluation of the PatientSafetyPrototype consisted of a heuristic evaluation, a field visit, a field test and usability testing with nurses in a controlled environment.
6.3.1 Heuristic evaluation
As the first version of our prototype was developed, a heuristic evaluation was performed. A
review of the user interface was believed suitable in order to find possible usability problems
and overall design issues, especially regarding the tapping technique and the interaction it
provided. The reason for performing the heuristic evaluation only after the initial version of the prototype had been developed, and not during the mock-up stage, was that the tapping technique is a physical sort of interaction and therefore must be practically implemented in order to really inspect the result. The heuristics used in the evaluation were the Means of the Layered model of usability, defined by van Welie et al.
Consistency
Our prototype is overall consistent with Cosmic’s medicine handling module, but Cosmic
also holds a lot of other features. The GUI is designed according to Cosmic, but applied in a mobile context, which creates both limitations and possibilities. The most obvious limitation is that the screen size of a smartphone is much smaller than desktop or laptop screens. The GUI must therefore be stripped down substantially. A hard consideration to make is choosing which data to present to the user and which data to leave out on the smartphone.
The smartphone context enables new ways of interacting through the touchscreen interface
and the sensors integrated into the phone. An example is the feature described in step 7 of
PatientSafetyPrototype Flowchart 2. This feature enables the user to retrieve extensive
information about the patient that could not fit into the GUI initially due to screen size constraints. By showing the most important data directly to the user while “hiding” extensive data, which can however still be retrieved, no data is lost.
A constraint regarding the consistency of the prototype compared to the computer version of
Cosmic is the way a signing of a preparation and a distribution has to be done. To sign a preparation or a distribution, the users of our prototype have to mark all the medicines in the corresponding column, e.g. 08:00. In Cosmic, medicines can be handled one at a time.
We have been consistent with the meaning of the text styles, indicating which step of the handling was last performed; see steps 8, 10 and 13 in PatientSafetyPrototype Flowchart 3. Normal text means that the medicine has been ordinated by a doctor, but not prepared. Italic and bold text means that the medicine has been prepared, but not distributed. Text that is bold and holds a time stamp and a nurse’s initials means that the medicine has been distributed, all according to Cosmic. For our smartphone implementation we added background colors to emphasize a selected cell. We implemented this feature both for the preparation and the distribution phase. We used yellow for selected preparations and turquoise for selected distributions. When a set of medicines had been marked as either prepared or distributed, the corresponding column was disabled, making it impossible to click on.
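These styling rules can be summarized in a small Java sketch. The helper below is our own illustration: the enum, the method signature, the exact turquoise color value and the distributed-cell text format are assumptions, while the style-per-state mapping follows the description above.

    import android.graphics.Color;
    import android.graphics.Typeface;
    import android.widget.TextView;

    // Illustrative styling helper; names, color values and text format are assumptions.
    public final class MedicineCellStyler {

        public enum State { ORDINATED, PREPARED, DISTRIBUTED }

        public static void style(TextView cell, String medicineName, State state,
                                 boolean selectedForPreparation,
                                 boolean selectedForDistribution,
                                 String signTime, String nurseInitials) {
            switch (state) {
                case ORDINATED:   // ordinated by a doctor, not yet prepared
                    cell.setTypeface(Typeface.DEFAULT, Typeface.NORMAL);
                    cell.setText(medicineName);
                    break;
                case PREPARED:    // prepared, not yet distributed
                    cell.setTypeface(Typeface.DEFAULT, Typeface.BOLD_ITALIC);
                    cell.setText(medicineName);
                    break;
                case DISTRIBUTED: // signed distribution: bold text, time stamp and initials
                    cell.setTypeface(Typeface.DEFAULT, Typeface.BOLD);
                    cell.setText(medicineName + "  " + signTime + " " + nurseInitials);
                    break;
            }
            if (selectedForPreparation) {
                cell.setBackgroundColor(Color.YELLOW);
            } else if (selectedForDistribution) {
                cell.setBackgroundColor(Color.rgb(64, 224, 208)); // turquoise
            } else {
                cell.setBackgroundColor(Color.TRANSPARENT);
            }
        }
    }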
Shortcuts
The whole meaning of the tapping interaction technique is to introduce shortcuts to the user,
by enabling a more physical way of interacting compared to the traditional touchscreen GUI.
An example applied to our prototype is the “NFC-triggered way” of launching the application.
When tapping an access card to launch the application, the user jumps straight to step 3 of PatientSafetyPrototype Flowchart 1 instead of having to go through step 2.
Another example is identifying a patient during the distribution phase, which can be done by
simply tapping the wristband of the patient. The tapping can be done both in the patient list
view mode, described in step 5 of PatientSafetyPrototype Flowchart 2, in order to retrieve a
patient’s ordination list or in the ordination list view mode, described in step 8 of
PatientSafetyPrototype Flowchart 3, in order to perform an identification of a patient. This
reduces, or even eliminates, the number of clicks otherwise needed when navigating between
the views.
Feedback & Warnings
When users tap an NFC-tag integrated in a medicine box or in a patient’s wristband, sound
feedback is given. A happy sound is played when the phone successfully reads a tag, while a sad
sound is played if something goes wrong in the reading process - for example when a user
initiates a connection but pulls the smartphone away before all the tag information has been
read. On top of the sound feedback, haptic feedback in the shape of a vibration is given to
the user when the smartphone successfully reads a tag within the right semantic context - i.e.
reads the right type of tag at the right time.
For example, a vibration is given when nurses identify themselves successfully by tapping
their access cards in order to log in. But if a nurse instead taps a patient’s wristband when
the phone expects an access card, the interpretation of the semantics is wrong and therefore
no haptic feedback is given by the smartphone. A happy sound can still be played if the tap is
syntactically successful, but since it is semantically wrong, nothing further happens.
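The sketch below illustrates how this two-level feedback could be wired up on Android. The sound resources and the expectedInThisView flag are hypothetical placeholders; the actual prototype code is not reproduced here:

import android.content.Context;
import android.media.MediaPlayer;
import android.nfc.Tag;
import android.os.Vibrator;

public class TapFeedback {

    private final Context context;

    public TapFeedback(Context context) {
        this.context = context;
    }

    // Called once a tag read attempt has finished (successfully or not).
    public void onTagRead(Tag tag, boolean readCompleted, boolean expectedInThisView) {
        if (!readCompleted) {
            // Syntactic failure, e.g. the phone was pulled away too early.
            MediaPlayer.create(context, R.raw.sad).start();   // hypothetical sound resource
            return;
        }
        // Syntactic success: always play the happy sound.
        MediaPlayer.create(context, R.raw.happy).start();     // hypothetical sound resource

        if (expectedInThisView) {
            // Semantic success: the right type of tag in the right view, so vibrate.
            Vibrator vibrator = (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
            vibrator.vibrate(200); // milliseconds
        }
    }
}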
A dialogue showing “Tap your identification card to identify yourself” is presented to the user
during the log-in phase to make the interaction self-explanatory. We have discussed that more
feedback could be given to the user in order to more clearly define the interaction semantics
of the prototype; for example, dialogue warnings could be shown when users make semantic
mistakes.
6.3.2 Field visit and field test
According to van Welie et al. (1999), knowledge about humans as users of computer
applications, and about design principles, is independent of the design project. They also
state, however, that when it comes to the task world, every design project is different. In
accordance with that statement
we wanted to get a first person experience of the nurses’ working environment surrounding
the medicine handling process in order to design a proper test environment for the usability
testing. The results from the contextual analysis were a good start for understanding the
medicine handling process. However, in order to get more detailed and deeper knowledge
about the working environment, a field visit was performed. According to Kjeldskov and
Graham (2003), field studies are good sources for collecting large amounts of rich, grounded
data in a relatively short time.
Through the information gathered at the field visit we gained a better understanding of the
nurses’ working environment, which enabled an accurate design of the usability test
environment. The field visit took place on the 11th of May 2012 and during the three-hour
visit, two nurses separately showed how the medicine handling process is performed. The
tools used for the medicine handling were also shown and explained during the process.
demonstrating the medicine handling process the nurses were asked to describe every step of
it. Together with the nurses we got the opportunity to visit a medicine room and examine a
medicine cart in detail. Both the medicine room and the medicine cart play central parts in
the medicine handling process. Another key player in the medicine handling process is of
course the patient. Photographs of both the visited medicine room and the examined
medicine cart can be seen in Picture 10 below. Unfortunately, due to patient secrecy, we were
not allowed to take photographs of any patients or the wards they were placed in.
Picture 10 - The medicine room and a medicine cart.
During the field visit a field test of the prototype was also performed, with one of the nurses,
in order to see how it functioned in a real hospital environment. Picture 11 below shows a
collage of pictures from the field pilot test, showing the test user interacting with the prototype.
Picture 11 - Pilot test during the field visit.
The field-testing was performed as a walkthrough where the test person was asked to go
through each step of the medicine handling process and use our prototype while doing so.
The field-testing worked as a first assurance that we were on the right track with the
prototype, and since the result was very positive we concluded that it was time to usability
test it more thoroughly.
6.3.3 Usability testing
The usability testing was mainly aimed at investigating the tapping technique from the
perspectives of the usability factors effectiveness and satisfaction, but data regarding efficiency
was also of some interest. To gather data about the usability factors we looked at the
corresponding usage indicators of the Layered model (van Welie et al., 1999) in order to
concretize what to actually measure. The main usage indicators included in the usability
testing were errors/safety, learnability and satisfaction.
The main focus of the usability testing was to retrieve hard data on the accuracy of the tapping
technique as part of the user interface of the application, in order to identify whether the
tapping technique meets the usability requirement of being effective. Therefore the
quantitative part of the study protocol was mainly designed to revolve around effectiveness,
represented in the model by the usage indicators errors/safety and memorability. We decided to
only include errors/safety and to measure the number of successful and non-successful tapping
attempts made by each user while using the prototype. The number of non-successful tapping
attempts before a user might abandon the tapping technique in favor of standard GUI
smartphone interaction was also measured.
We were also interested in investigating the users’ satisfaction with the tapping technique
through qualitative data, in our case interviews.
The defined target group for the testing consisted of hospital nurses who:
1. Perform medicine handling in their everyday work
2. Perform preparation of medicines in a medicine room or with a medicine cart
3. Are experienced Cosmic users
4. Are experienced smartphone users (iOS, Android or Windows Phone)
The first three target group criteria are self-explanatory, while the reason for targeting
experienced smartphone users follows from our common research questions. We were
interested in evaluating NFC as an interaction technique for smartphone applications in
specific contexts. By testing on experienced smartphone users, the risk of unnecessary
cognitive load caused by unfamiliarity with smartphones was minimized.
The usability test took place in Linköping on the 12th and 13th of May 2012. A whole
apartment was converted into a controlled test environment. The test environment unit
consisted of four double wards, holding a total of five patients, one of whom was a real
person while the rest were cuddly toys. All five patients were equipped with NFC wristbands. A
simulated medicine room with a stationary medicine cart, holding medicine boxes marked
with numbers correlating with the wards and beds, was also built up. The test consisted of a
typical medicine-handling scenario, i.e. without disturbing elements, except for an empty
medicine box correlating with an empty bed in one of the wards. We wanted to investigate
how much of a disturbance the empty medicine box would be, and therefore we did not add
any feedback to the user when tapping it.
A pilot test was performed in order to control and evaluate the test environment, the written
test leader template and the interview questions. The pilot test pointed out some flaws of
varying significance in the test design. The pilot test user suggested some improvements of the
prototype, and we found some improvements of the test instructions and interview questions.
For example, the pilot test user initially found it hard to use the tapping technique and had
difficulties finding the correct target area on the smartphone to tap the tags against. This made
the user doubt whether the tapping technique really could be of any help, which in turn made
the user start clicking in the touchscreen GUI instead of tapping. We therefore decided, from
that point onwards, to instruct each user in how tapping tags worked before starting the test,
since the interaction technique was not as self-explanatory as we first thought. Other relevant
findings regarding the prototype are described below:
• Being able to prepare one medicine at a time in a patient’s ordination list view, since that sometimes is necessary.
• Feedback when tapping a patient while an ordination list view is presented on the screen. We managed to implement positive feedback for the case when the tapped patient was the same as the chosen patient in the ordination list view on the smartphone screen.
Unfortunately we did not have the time to implement the requested function of being able to
prepare one medicine at a time. But after fixing the small flaws in the test instructions and
interview questions, and implementing the suggested improvement regarding feedback, the
testing could begin. A total of six nurses took part in the usability testing, including the pilot
test user; five women and one man participated. A typical test took about an hour to finish,
depending on the length of the semi-structured interview.
Before the start of each testing the current user was shown how to interact with the
PatientSafetyPrototype, both through tapping and via the touchscreen GUI. Characteristics of
the Android OS, especially the back button, were described in order to teach non-Android
smartphone users the importance of the physical back button in Android.
During each testing the user’s interaction was recorded by the test assistant, who followed the
user around in the laboratory environment as the user performed the tasks given by the test
leader. The test leader also kept a test protocol in order to record the number of tapping
attempts, tapping errors and other interaction errors noticed. After each testing the user was
interviewed in a semi-structured interview. The questions were aimed at finding out what the
user thought about using the system and how the tapping technique was received in general;
the full question sheet can be found in Appendix 12.1 Medicine Handling Usability Testing.
6.3.4 Results of usability testing
The following results were retrieved from the usability testing performed with the
PatientSafetyPrototype. The two sections below present the compiled quantitative and
qualitative results, from the field visit and the usability testing, that are most relevant for our
study.
Quantitative results
The quantitative result of the observation is displayed in Table 6 on the next page, showing
how frequently the users used the tapping technique while performing the test tasks. The data
was collected by the test leader during the observation of each user test and has also been
validated with the help of the recorded video material.
The medicine boxes, light blue rows in Table 6, could be tapped in the preparation phase and
the patients’ wristbands, apricot colored rows in Table 6, could be tapped in the distribution
phase. When it comes to interpreting Table 6, an explanation of the “test codes” is given at
the bottom of Table 6. T2S on a light blue row, for example, means that the user tapped a
medicine box twice and performed a successful tap the second time. Similarly, T1S on an
apricot colored row means that the user tapped the patient’s wristband once and instantly
succeeded. A successful tap means that the user was taken to the right view after performing the
tap. Instead of tapping, the user could also choose to interact with the prototype by clicking in
its interface; C1S, for example, means that the user clicked once and was taken to the right view.
The data in column 1:2 has not been included in the calculations of the figures under A, B
and C. This is because medicine box 1:2 was “empty” - no patient was allocated to this bed
and thus not to the box either. We simply used this empty medicine box to make the test
environment as realistic as possible. Three users, out of five, got “tricked” when they tapped
this extra box. Unfortunately we had not thought of implementing any negative feedback, in
the shape of an alert dialog or similar, for this case, so the system simply responded by doing
nothing when the empty medicine box was tapped. The test leader therefore told the users
about this “bug” when they got stuck. By doing so, the tricked users quickly got back on the
right track again and went on with preparing medicines for the patients that actually occupied
a bed.
Finally, “*” indicates that the user took the medicine box out of its original position in the
medicine cart in order to tap it more easily.
Table 6 - Tap tracking
The data under column A shows that our test users chose the tapping technique over clicking
in the GUI 46 out of 50 times during the test. Tapping was chosen all of the 25 times when
identifying the patients during the distribution phase. For tapping a medicine box in order to
see a patient’s ordination list, instead of clicking, the frequency was not as high, with a total of
21 taps out of 25 possible. Column B shows the users’ success rate when choosing tapping as
the method of interaction. The total value for the medicine boxes was 22 successful taps out
of 28 tries. The success rate for tapping the patients’ wristbands was slightly higher, 47
successful taps out of 57 tries, when tapping was chosen as the interaction method. Column C
shows successful taps regardless of chosen interaction technique, which we consider to be
slightly less important than columns A and B.
Qualitative results
During each user testing the test leader was responsible for the documentation of the number
of taps and errors. Meanwhile, the test assistant recorded video of the user performing the
tasks. The video material has mainly been used to validate the data collected regarding the
number of taps and errors noticed but also to validate the interviews held after each user test.
It was hard to collect all data instantly during the user tests and the interviews. The video
material showed some data that we failed to document during the tests and has therefore
been really useful as a back-up resource. The most important observations from the video
recordings are the following:
• Different test users preferred different work processes. Test user three (T3) preferred to prepare each distribution of medicines, i.e. for distribution 08:00, for all patients before carrying out the distribution for each patient, one at a time. The cognitive load of keeping track of all medicines at the same time might have made T3 more vulnerable, and when tapping box 1:2 T3 got tricked and the process was disturbed. Instead of continuing to prepare all patients’ medicines first and thereafter distribute them, T3 started to prepare and then distribute one patient at a time. All the other test users used this process from the start. The behavior was later explained by T3 in the interview: the reason for the change was that there was no possibility to mark the medicine cocktail mug with a pen, as T3 usually does when preparing all at the same time, and it therefore felt safer preparing the medicines one at a time.
• Test user three (T3) made the following comment while identifying the first patient by tapping the wristband: “I’m doing this mostly for fun even though I am sure of which patient it is”.
• Test user four (T4) made small talk with the patient while tapping the wristband to identify the patient - multitasking at its best!
• Test user four (T4) burst out “This is very smooth!” (“Det här är ju väldigt smidigt!”) while tapping the fourth medicine box.
• Test user five (T5) held the smartphone in landscape mode and could therefore more easily tap the medicine boxes without pulling them out, as most of the other users did.
• Test user five (T5) once wondered if all the previous distributions really were signed. T5 therefore looked up each of the patients already visited and checked, in the application, that the distributions were signed properly while walking back to the medicine cart.
During the semi-structured interviews open discussions were initiated by the interviewer,
when suitable, in order to retrieve deeper and more meaningful answers. The users had
interesting and valuable feedback. Below, the most relevant answers and thoughts are stated as
quotes, one from each test person. The original interviews were held in Swedish, so we have
freely translated these quotes. The interview questions are available in the appendix.
T1: “It is a bit frustrating when the tapping does not work. However, even if it does
not always work, it is super great to have all the information on the phone. Some
patients can not speak, and then it is great to be able to tap the patient's wristband to
identify them.”
T2: “I think it works really well with the tapping technique. It is at least as fast,
probably faster, compared to using a workstation computer. To be able to sign a
distribution firsthand is great - one does not have to go back to the computer to do
the signing, this really saves time!”
T3: “Generally you save time. This is really pliable. The tapping was really fun to try
out. I primarily used it to check if it was the right patient that I was on my way to
distribute to. This worked great! The tapping is as fast as clicking, but it is more fun
to tap - it works similar to a barcode scanner. It felt natural to tap, especially when
identifying a patient which is great from a safety perspective.”
T4: “It worked great! I used it to check with myself that the right cocktail gets
delivered to the right patient. When working the night shift, I often prepare the
cocktails beforehand. Then it is extra useful to be able to tap the medicine box, to
quickly get access to an ordination list and in turn get to know which patient I am
supposed to go to.”
T5: “I used the tapping because it was fun. A smartphone screen is quite small - if one
should click on a patient, in the list, it is quite easy to pick the wrong one - but if you
tap, you are guaranteed to pick the right one.”
6.4 Analysis
When looking at the quantifiable result of the tapping, shown in Table 6, it is fairly obvious
that the users to a high extent used the tapping technique successfully. More exactly, the value
of successful taps when choosing tapping as the interaction technique was 47 out of 57 tries,
which we consider an acceptable rate of interaction success. When reading the table from left
to right, which correlates with the test order, it is noticeable that the errors and the number of
tries needed to perform successful taps were reduced for each task as each test proceeded. This
indicates a high learnability of the tapping technique.
The main conclusions regarding the qualitative data gathered during the semi-structured
interviews and through the video observation are that the users thought that the tapping
technique was pliable, easy to use and that it overall worked well. All the users were also
delighted to get the Electronic Healthcare Record (EHR) available on a mobile device.
Four of the users stated that they often get questions regarding if and when a patient has
taken their medicines. They concluded that having EHRs available on a smartphone
really helps them answer these types of questions. By identifying patients with NFC-enabled
wristbands they do not have to run back and forth to a computer. The time it takes to find
patient-specific information is reduced by our application, since the nurse only has to unlock
the phone and tap the patient’s wristband to access the patient’s EHR.
Based on feedback from the pilot test we added feedback when the user taps a patient while
an ordination list view is presented on the screen. In the case when the tapped patient was the
same as the chosen patient, positive feedback was given to the user and negative feedback was
given in the opposite case. Most of the users seemed to think it was fun to use this feature and
tap a patient’s wristband to see if it really was the same person. Test user three (T3), for
example, stated that:
“I know it is the same patient but I will tap and see anyway because it is fun!”
The medicine box 1:2, which had no patient, is worth mentioning again. By not placing any
patient in that bed we wanted to investigate how the users would react when they tapped a
medicine box with no patient linked to it. As an investigation of the presumed importance of
feedback, we decided not to give the users any feedback about the medicine box being empty.
We thought that this might work as a disturbing factor for the user, which also was shown, at
least to a certain extent. Three of the users reacted by tapping the medicine box multiple times
without understanding what they did “wrong”, while the two others just ignored the empty
medicine box and moved on. The results regarding this potentially “disturbing factor” showed
that three out of five users were bewildered by the medicine box with no patient. The lesson
learned is that negative feedback often is as important as positive feedback, if not more
important in some cases, which was also pointed out explicitly by two of the test persons in
the interviews held after the user testing.
7 TapThat
In this chapter our third sub-study is presented. The chapter starts with the background to the
sub-study followed by the design of the prototype and the prototype development. Lastly an
evaluation and an analysis of the sub-study are presented.
Whereas the Medicine Handling concept and its prototype utilize NFC’s read/write mode of
operation, we found it important to also investigate the other mode of operation possible
to implement in Android applications: peer-to-peer. It enables small amounts of data to be
transferred directly between active NFC-enabled devices.
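For reference, the peer-to-peer mode was exposed in the Android SDK of that time through Android Beam. The minimal sketch below shows how a small NDEF message could be pushed to another active device when two phones are tapped together; the MIME type and the payload string are illustrative assumptions and not taken from the thesis prototypes:

import android.app.Activity;
import android.nfc.NdefMessage;
import android.nfc.NdefRecord;
import android.nfc.NfcAdapter;
import android.os.Bundle;

public class BeamActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        NfcAdapter nfcAdapter = NfcAdapter.getDefaultAdapter(this);
        if (nfcAdapter == null) {
            return; // the device has no NFC hardware
        }
        // A small payload pushed to the other device when the two phones are tapped together.
        NdefRecord record = NdefRecord.createMime(
                "application/vnd.example.tapthat",            // hypothetical MIME type
                "track=it-bubblan;position=118".getBytes());  // hypothetical payload
        nfcAdapter.setNdefPushMessage(new NdefMessage(new NdefRecord[]{ record }), this);
    }
}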
7.1 Background
A possible implementation of an NFC peer-to-peer Android smartphone application is the
concept called SOUL developed by Florman Lindeberg et al. (2011). The main idea of the
concept is to relay audio playback between devices by only tapping them to each other. The
part of their concept description that we found most relevant for our study was the possibility
to:
“...seamlessly switch between different devices with something as easy as a tap with
your phone. Whatever is playing on your first device, will continue being played in
the second device. This solves the problem that online content from different sources
is difficult to transfer between devices. The same podcast, music, news or what you
are listening to can always follow you, independently of the device.”
Florman Lindeberg et al. (2011).
Since the inventors of the SOUL concept never implemented any prototype we decided to
pick up where they left off, i.e. develop a prototype and then evaluate it by conducting a
usability test.
7.2 Design
Most of the SOUL concept’s features were described in text by Florman Lindeberg et al.
(2011), so in order to realize their idea visually we needed to come up with design
suggestions. The design process was focused on the core feature of the SOUL concept, the
instant switch of playback between devices. The interfaces of both prototypes were therefore
primarily designed to fulfill this feature utilizing the tapping technique, but also to support
other typical media player features such as play, pause, stop, fast forward and reverse, through
the normal GUI in the form of touchscreen interaction for the smartphone and mouse
interaction for the Java desktop application.
The design process of the two TapThat prototypes was the most pragmatic out of our three
sub-studies, in the sense that previous experience, gained during the Sparakvittot and Medicine
Handling studies, worked as the primary base for the design decisions. The GUIs of Spotify,
Winamp and VLC were also used as sources of inspiration. Paper mock-ups were simply
drawn, discussed and iterated, followed by actual development of the prototypes.
7.3 Prototype development
Florman Lindeberg et al. (2011) name NFC as their choice of technology for a possible
implementation of their concept. The main interaction of the prototype can be described as:
“All the user has to do is tap the smart phone on to any compatible device to start
playing...” Florman Lindeberg et al. (2011).
The core functionality would consequently be the tap to start playing feature, enabled by NFC.
By simulating an implementation of the peer-to-peer mode of operation an instant transfer of
media metadata across compatible devices would be enabled.
For this usage scenario we developed one Android application and one Java computer
application, capable of toggling playback between each other immediately when a
smartphone is tapped to an NFC-tag. This prototype exemplified a possible implementation of
the NFC peer-to-peer mode, though without using the actual peer-to-peer mode of operation
to transfer the data between the applications. This was due to only having one active device, a
smartphone, available; two active devices are needed to create a peer-to-peer connection
through NFC. Lacking another active device, we used a WiFi connection to simulate the
NFC peer-to-peer connection instead. The data sent was small enough to easily be sent over
NFC using peer-to-peer, making the prototype easily adaptable and possible to run on two
smartphones instead of one smartphone and one computer.
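The sketch below illustrates the kind of hand-over message involved: a few bytes of metadata that would have fit in an NDEF record but, lacking a second active device, are sent over a WiFi socket instead. The host, port and message format are our own assumptions for illustration, not the prototype’s actual protocol:

import java.io.PrintWriter;
import java.net.Socket;

public class PlaybackHandover {

    // Sends the metadata needed to resume playback on the other device.
    // The tap on the NFC sticker only acts as the trigger; in this simulation
    // the actual transfer goes over a local WiFi socket.
    public static void sendHandover(String host, int port,
                                    String trackId, int positionSeconds) throws Exception {
        Socket socket = new Socket(host, port); // e.g. the laptop running the Java player
        try {
            PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
            // A few bytes of metadata - small enough to have been sent over NFC peer-to-peer.
            out.println("track=" + trackId + ";position=" + positionSeconds);
        } finally {
            socket.close();
        }
    }
}

A call such as sendHandover("192.168.0.10", 5050, "it-bubblan", 118) would, under these assumptions, tell the desktop player to continue the documentary at 1:58.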
When designing and developing the prototype we focused on simplicity and the core
functionality, i.e. instant transfer of playback between devices triggered by tapping the
smartphone to an NFC-tag. We designed stripped-down GUIs for both the smartphone and
the computer application. Both contained a toggle button for playback, a slider for the time
controller and three buttons for the three different tracks available.
7.3.1 Prototype
The TapThat prototype can be considered an interesting prediction of how NFC may be used
in the future. We focused on one of the core features of the SOUL concept, i.e. the
immediate switch of playback between devices.
The TapThat prototype consists of two music player applications, one Java computer
application and one Android smartphone application. An NFC tap triggers a simulated
peer-to-peer connection over which the applications exchange metadata. The metadata
contains information about the music file currently playing as well as the time code of the
playback session.
User interface and functionality
The user interfaces of both applications, and a typical interaction between them, are
described in TapThat Flowchart 1 below:
TapThat Flowchart 1 – A user going to work
Imagine a scenario where a user is having breakfast at home, before going to work. While
eating breakfast the user likes to listen to documentaries - P3 documentaries are especially
appreciated. The user therefore turns on a laptop computer and puts it on the kitchen
worktop, launches the TapThat - NFC Music Player and starts to listen to IT-Bubblan. After
listening for almost two minutes it is time to go to work. The user therefore picks up the
smartphone in order to continue listening to the documentary on the way to work. The user
unlocks the smartphone and sees the screen in step 1 in TapThat Flowchart 1 above.
To transfer the playback to the smartphone the user taps an NFC sticker on the computer.
Step 2a and step 2b, in TapThat Flowchart 1, explain what happens after the tap; the music
player on the computer stops playing while the application automatically launches on the
smartphone and starts playing the same documentary where the computer left off. Most
conveniently, the user does not have to fast-forward manually; this is handled automatically
by the smartphone, illustrated by step 3. The user leaves for work while continuing to listen
to IT-Bubblan through earphones.
The following scenario describes the user’s way home from work. The user launches the
TapThat application on the smartphone and is presented with the picture to the left in
TapThat Flowchart 2 below. The user finished listening to IT-Bubblan on the way to work
and therefore chooses to listen to Raoul Wallenberg on the way home (step 4).
TapThat Flowchart 2 - User going home from work
When entering the apartment the user has listened to almost two minutes of the documentary
and decides to listen to the rest on the computer while surfing the Internet for the latest news.
The user opens the screen of the laptop standing in the kitchen and sees the picture in the
top right corner of TapThat Flowchart 2. The user then simply taps the phone to the sticker on
the computer, which triggers the music player on the computer to start playing “Raoul
Wallenberg” at the correct time, described in step 5, while the smartphone application
simultaneously stops playing the file.
7.4 Evaluation
The evaluation of the TapThat prototype consisted of a usability test where smartphone
users tested the application with and without NFC in two scenarios.
7.4.1 Usability testing
The main idea with the usability testing of the TapThat prototype was to identify whether the
tapping technique meets the usability requirement of being efficient and at the same time
determine the users’ satisfaction with the tapping technique as implemented in the prototype.
Performance speed is one of the usage indicators for efficiency defined by van Welie et al. (1999).
When designing the usability testing of the TapThat prototype we decided to compare the
time it takes for a user to perform a specific task with and without the use of tapping in a
specific scenario. Independent of interaction technique, the user was to accomplish exactly the
same task, so that the techniques could be compared easily. Further, the users’
satisfaction with the tapping technique, another usage indicator defined by van Welie et al. (1999),
was of interest for the testing. Typical user scenarios of using TapThat were developed, which
are further described in Appendix 12.2 TapThat usability testing. In order to measure the users’
satisfaction, a short semi-structured interview was held after every test.
The target group for the testing consisted of people who use smartphones, running iOS,
Android or Windows Phone, in their everyday life. The testing was conducted with seven
users, including the user performing the pilot test.
The usability testing was conducted in a controlled environment. There was a desk with a
laptop, an NFC-enabled Android smartphone and a chair for the user to sit on during the test.
Before each test the current user was informed about the characteristics of the Android OS,
specifically the importance of the physical back button. Before each test started the tapping
technique was described in short, the NFC-tag was shown and the area of the phone where
the NFC-chip is located was pointed out, but the user was not allowed to test anything.
The pilot test revealed some minor problems with the testing instructions, so each scenario,
with and without NFC, was partially rewritten and then developed into two separate
sub-scenarios. The pilot test also showed a possible improvement of the GUI of the Java
computer application used in the scenario. The GUI improvement was implemented before
the main testing began.
During the testing a test protocol was filled out by the test leader, while the test assistant
observed the user’s actions in general. After finishing the scenarios, a semi-structured
interview with each user was held.
7.4.2 Results of usability testing
The usability testing of the TapThat prototype was mainly focused on retrieving data that
could not be measured during the usability testing of the PatientSafetyPrototype. One usage
indicator of the Layered model of usability that was not observed in the usability testing of the
PatientSafetyPrototype was performance speed, which is why this indicator was of primary
concern when designing the usability testing for the TapThat prototype.
Every user completed the scenarios twice, once using NFC and once using only touchscreen
interaction. Half of the users started with NFC and the other half with touchscreen
interaction.
Quantitative results
During the usability testing of the TapThat prototype the performance speed of standard
interaction, i.e. touchscreen interaction in the GUI, and tapping interaction was compared.
The results from the testing are presented in Table 7 below where tapping interaction is
referred to as NFC.
Table 7 - Quantitative results from the usability testing of the TapThat prototype. The unit is
seconds.
Table 7 is read from left to right, one row at a time. Each row represents one test user’s
results and each column represents the results from each scenario when performed with one
of the two interaction techniques. Every user completed the scenarios in numerical order.
Worth noticing is that Scenario 3 was only conducted with NFC interaction and therefore no
comparison of performance speed is possible in that scenario.
The results of Scenario 1 show that NFC, most of the time, is faster than standard
touchscreen interaction. The results of the standard touchscreen interaction in Scenario 1
vary somewhat, whereas the results of the NFC interaction are less spread out. In Scenario 2
the resulting times for both NFC and standard touchscreen interaction are lower than in
Scenario 1. One may also see that the results for each interaction technique are less spread out
in Scenario 2 than in Scenario 1. Worth noticing, though, is that there were two failed attempts
at reading the NFC-tag in Scenario 2 compared to only one in Scenario 1. Scenario 3 was
conducted mostly to see how fast NFC is as an application enabler/starter, and the results
were similar across users.
Qualitative results
After the practical part of the usability testing, shorter semi-structured interviews were
performed. The questions asked were aimed to determine the usage indicator satisfaction, from
the Layered model of Usability, and how the users perceived the tapping technique in general.
Some relevant quotes from the test users are presented below.
Right after the first tap, test user one (T1), who was the pilot test user, stated the following:
T1: “Damn, that was awesome!” (“Jäklar, det där var häftigt!”)
T2: “NFC was much more comfortable [than touch interaction].” (“NFC var mycket
skönare.”)
T3: “When listening to documentaries you usually do not keep track of where you are,
this enables a smooth transfer between devices.” (“Vanligtvis när man lyssnar på
dokumentärer håller man inte reda på var man är, det här gör byte av enhet smidigt!”)
When presented with the instructions for the standard interaction scenario, test user four (T4)
said the following.
T4: “Oh, how troublesome!” (“Åh, vad jobbigt!”). The test user started the test with
NFC-interaction.
7.5 Analysis
The results show that the mean task completion time when using NFC is clearly lower in the
second scenario, 6.3 seconds, compared to the first, 11.8 seconds. The mean values for
standard interaction are however almost identical, 27 seconds versus 26.7 seconds. An
influential factor might be that the users quickly learn new interaction techniques, but it is
hard to tell based only on the gathered data. This reasoning, though interesting, is of
secondary concern. What is more important is the users’ experience of utilizing the tapping
technique, which clearly shows the users’ approval of the technique.
The quotes presented in the results show that the users thought that the tapping interaction
technique, and the digital shortcut enabled by it, was smooth and useful. They seemed
satisfied with the prototype and the immediate transfer of audio playback between devices
that it enabled. The users were also asked which interaction technique they thought was the
most difficult to use, and all answered the standard technique. The reliability of this result can
be questioned, since all test users knew in advance that we were conducting a study on NFC,
which might have prevented them from expressing criticism towards the technique.
8 OVERALL ANALYSIS
In this chapter a summation of our results from the sub-studies is compiled and analyzed. The
results regarding the quantitative and qualitative evaluation of our prototypes are presented,
followed by an analysis of NFC ecosystems.
8.1 Quantitative versus qualitative data
The overall results from our two sub-studies Medicine Handling and TapThat show that assisted
NFC-interaction in an Android smartphone application can increase the efficiency and
effectiveness of an application, and also give the user a satisfying experience.
In our sub study TapThat the task completion time, i.e. the performance speed, was on average
lower for NFC-interaction in both Scenario 1 and Scenario 2, compared to touch interaction.
Regarding learnability and error frequency, the results correlate between the TapThat and the
Medicine Handling sub-studies. The users quickly learned how to perform taps correctly, and
the errors made were reduced as the tests proceeded in both sub-studies. The results were
similar even though the users were allowed to play with the tapping technique prior to the
test in the Medicine Handling case, while they were not in the TapThat case.
Our result also shows interesting factors other than the ones that we have tried to numerically
quantify. These factors are heavily related to user satisfaction and/or user experience. According
to van Welie et al.’s Layered model of Usability (1999), user satisfaction is one of the usage
indicators, all of which can be observed in practice, as well as one of three usability aspects in
the more abstract top layer. User satisfaction can thereby be looked upon as a subset of usability.
Usability is in turn often considered a subset of user experience. Or as stated in the User
Experience White Paper from the Dagstuhl Seminar on Demarcating User Experience (2011):
“UX is not the same as usability, although usability, as perceived by the user, is
typically an aspect contributing to the overall UX.”
UX factors relate more to affect, interpretation and meaning compared to traditional usability
factors such as performance and smooth interaction, according to the User Experience White
Paper. The authors further claim that usability measures such as completion time or the
number of clicks and errors are not good UX measures, as they cannot tell whether the user
perceived them as good or bad. We strongly agree on the importance of involving user
satisfaction and/or user experience in studies like the one described in this report. Being
self-critical, we might have focused a bit too much on quantifiable data, such as performance
speed and error proneness, instead of looking at more qualitative values such as the users’
feelings about using the tapping technique. Whether we choose to call this user satisfaction or
user experience is, we believe, of minor importance.
From our qualitative results we have extracted the following quotes, collected in the
retrospective interviews after the user tests. The quotes express some sort of feeling related
to what the users experienced when using our prototypes.
“I think it works really well with the tapping technique”
“It worked great”
“Damn, that was awesome!”
“This is really pliable.”
“This is very smooth”
“…this enables a smooth transfer between devices”
“NFC was much more comfortable”
“The tapping was really fun to try out.“
“The tapping is as fast as clicking, but it is more fun to tap”
“I used the tapping because it was fun”
“I know it is the same patient but I will tap and see anyway because it is fun!”
“It is a bit frustrating when the tapping does not work”
We conclude that these quotes show that the tapping interaction technique, within our
design space, was proven to be good, pliable and fun to use, but in some cases it also caused
frustration when not working properly. Overall we draw the conclusion that tapping has the
potential of giving the user a satisfying user experience!
8.2 NFC Ecosystems
The Sparakvittot sub-study pointed out some overarching challenges revolving around NFC
integration in smartphone applications. In order for an NFC ecosystem to work, all involved
actors need to invest in the technology as well as implement the same type of NFC solution,
following the same standards. To start building the ecosystem around Sparakvittot by
integrating NFC in the Sparakvittot Android application is probably to start at the wrong end.
The pay point system providers are far bigger actors, which is why it is probably a lot smarter
to start by implementing NFC support in their pay point systems and then move on to the
client applications. To develop a solution that follows the same standard, once both parties
are convinced to invest, is then the next challenge.
The challenges described above correlate with the conclusions made by Kabir (2011), who
points out problems revolving around an NFC powered payment solution in the shape of a
smartphone application. Kabir created a diagram in order to show the complexity of the NFC
infrastructure surrounding his proposed smartphone payment application, which is shown
below in Picture 12:
Picture 12 - A simple view of NFC ecosystem (Kabir, 2011).
Kabir concludes that an NFC ecosystem can be very complex. He points out that users of his
proposed application are stuck in the middle between mobile networking operators (MNO),
service providers and points of sale, as shown in Picture 12 above. Further, the MNOs and the
service providers have an intermediary in the trusted service manager (TSM). Beyond all this
there are also chip manufacturers, phone manufacturers and tag manufacturers, who are
important actors in the NFC ecosystem since they provide the hardware needed for NFC
implementations.
As in Kabir’s example, there are normally many actors involved in an NFC ecosystem, which
also was the case for Sparakvittot. Kabir’s conclusion heavily relates to the challenge we have
explored.
In order to meet the pay point system providers’ requirements, software or a driver for
peer-to-peer interaction between the pay point systems and the users’ smartphones would
have to be developed. The reason for this is that the Android SDK currently does not support
card emulation, as previously stated, and a solution with a keyboard-emulating NFC reader is
therefore out of the question. This leaves the peer-to-peer mode as the best solution.
One challenge with a peer-to-peer driver solution would however be that it would only
support Android smartphones, which is risky from a market perspective. In the second quarter
of 2012, no other smartphone OS than Android held support for NFC, except Nokia’s fading
Symbian OS. With the release of Windows Phone 8 (WP8), in the third quarter of 2012,
Microsoft added NFC support to the OS. However, the usage of Windows operating systems
for smartphones is low: only 3 percent for Windows Mobile and 1.3 percent for Windows
Phone 7 in the second quarter of 2012 in the USA, according to Nielsen (2012), as seen in
Picture 13 below.
Picture 13. Smartphone manufacturer share by operating system. Nielsen (2012).
It would have been interesting to look further into a peer-to-peer solution as a starting point,
but due to the scope and aim of this thesis we decided not to go any further. It is impossible
to know whether the other smartphone OSs, most importantly iOS (because of its high market
share - 34 percent in Q2 2012 in the USA according to Nielsen, 2012), will follow the same
standards as Android or not. Since WP8 is new and thus has a small market share, it does
not matter too much that it has NFC support, at least until its market penetration becomes
bigger. The safest way of ensuring a successful multi-platform rollout would therefore be to
wait, see how the other smartphone OSs will support NFC and then take the next step.
9 DISCUSSION
In this chapter a discussion about our work approach, choice of method and challenges that
came up during the study is presented.
Our research question was whether we could design a tapping interaction model that could
work in different domains, under different demands, allowing users to learn and carry over
their understanding of how to use NFC between different applications. While the
development of our prototypes has worked as empirics, the finished prototypes defined a
design domain within which to evaluate the tapping interaction. Both our design processes
and our results have therefore been of importance throughout the study, leading to our final
finding in the shape of an interaction model, presented in 10 Conclusion.
The whole process of designing the user interfaces, developing the prototypes and evaluating
them has been performed while working in a team, consisting of the following roles: one
interface designer, two software developers and one usability evaluator. This chapter discusses
our work process and choice of method, from three different perspectives: the user interface
designer’s, the usability evaluator’s and lastly a common perspective combining the two.
9.1 User interface designer’s perspective
The contextual design model was used during the analysis and design phase of the Medicine
Handling sub-study. The model was especially of great help for the context analysis part as it
guided the interface designer through the process of defining the contexts surrounding the
nurse in her work. The first idea was to also involve tapping interaction in the preparation
phase of medicines. For example the nurse could tap a medicine jar in order to validate that
the picked medicine was part of a certain patient’s ordination list. This idea was however
dropped because of the findings of the contextual analysis, which showed that different
hospital units use different methods and different equipment during the preparation phase,
making it rather hard to define the task domain. However, all interviewed nurses performed
the distribution in the same way, and a general use of the tapping technique to ensure patient
safety could therefore be identified.
An iterative process, heavily inspired by the contextual design model, was also used to develop
the graphical interface. To summarize, the contextual design model is of good use for the
designer when defining unknown contexts. For the TapThat sub-study a contextual design
approach was not needed since the contexts surrounding the concept were already familiar to
us. TapThat was also meant to be developed rather quickly due to time constraints, and a
contextual design approach was therefore seen as too heavy and time consuming in that case.
When it comes to using the Means of the Layered Model of Usability, it is hard to define whether
it gave any practical help during the design process or not. The model surely can give a more
concrete understanding of how usability can be defined, but it lacks pragmatic guidelines on
how to use it while designing.
User interface design is a pragmatic process, heavily based on artistic creativity and previous
experience. Designing is an iterative process in itself, leading to constant changes to the
product being designed in order to make the user interaction as easy as possible. Design
guidelines can have both a restricting and a helping effect on the designer. Deciding to what
extent guidelines should be followed is often complicated. In order to get the job done, the
designer needs to develop designs while iteratively looking over his own shoulder to ensure
that the result meets aesthetic, engineering as well as usability principles.
Heuristics such as the design and developer guidelines from Google concerning Android and
guidelines of more general character such as the LLIID model, proposed to be used when
designing for the mobile context in general, have been used throughout the design process as
well as in the implementation of the prototypes. The style and pattern design guidelines,
provided by Android, have been used more implicitly compared to the NFC developer
guidelines. The NFC developer guidelines, together with the heuristics proposed in the
presentation “How to NFC” (Google I/O 2011), have been of good help during the design
and prototype development process.
Design knowledge is gained through experience, and the design of the user interfaces of our
prototypes has to a large extent been influenced by practically gained experience, especially
regarding utilization of the tapping technique. The integration of the tapping technique in
the user interfaces has primarily been done in the implementation phase of the prototypes, as
it is hard to predict how the result will turn out when using illustrative analogue or even
digital mock-ups. One can say that NFC adds another interaction dimension to the user
interface, and the result of this dimension is impossible to predict without practically utilizing
it through implementation.
9.2 Usability evaluator’s perspective
Measuring usability is a complex procedure, likely because of the complexity of defining
usability in the first place. There are a lot of factors that affect the usability of a smartphone
application, especially when it is supposed to be utilizing NFC. The factors can be general
ones, affecting most other applications as well, but they can also be both domain and user
specific. Therefore it may be useful for a usability evaluator to use the usage indicators of
the Layered model of usability proposed by van Welie et al. (1999) in the search for usability
factors worth measuring. The usage indicators define observable factors that affect usability,
which may be of help when designing usability testing. A problem with the model, though, is
that it does not really define some of the more abstract usage indicators. It is up to the
designer of the usability evaluation to form their own idea of what these indicators really mean.
Using the usage indicators of the Layered model of usability as a base when designing the
usability testing for our sub-studies was helpful. Even if the usage indicators did not always
match the usage domain perfectly they enabled the usability evaluator to create a more vivid
estimation of the usability factors and in that sense ease the design process of the usability
testing.
During the study the usage indicators of the Layered model of usability have been applicable to a
high extent in our work. The model has worked as a good support when performing the study,
and the usage indicators have throughout the study helped concretize observable factors that
affect the usability of the prototypes being evaluated. The indicators that are quantifiable, for
example errors made, are in most cases easy to measure during usability tests, but their
relevance, when it comes to measuring the usability of an application, can be discussed.
Qualitative indicators on the other hand are a valuable data source but they require more
planning in order to, for example, be able to define questions for interviews or questionnaires
that will collect data that really answers what was intended. Qualitative data also tends to take
more time to process, but the final results are often more valuable than quantitative results.
The design of the resulting usability testing of the PatientSafetyPrototype gave us the insight that
designing a usability test aimed at evaluating NFC from a usability perspective is not an
obvious task. It is difficult to state exactly to what extent the Layered model of usability is
applicable for designing usability testing aimed at evaluating NFC implementations in
Android smartphone applications, but for our concepts it worked out well.
9.3 Common perspective
The user-centered design philosophy has been of good help during our work since it mixes a
pragmatic approach (interface and interaction design principles) with an analytical (usability
evaluation). Dividing the work into two perspectives enabled a certain abstraction. This
abstraction allowed the designer to artistically express himself while the evaluator validated
that the designs followed usability standards. By using this method we have been able to
create high-fidelity prototypes that utilize the NFC technology in practice and evaluate the
usability of each prototype.
Our study can be seen as primarily solution-focused since we first concluded what can be
accomplished with NFC in Android applications in order to figure out possible
implementations suitable to be applied in certain contexts. We then developed prototypes
that utilize the tapping technique in situations we believed were suitable. Finally we
performed usability evaluations of the prototypes. Based on the lessons learned from these
empirics we have compiled an interaction model presented in the next chapter, Conclusion.
10 CONCLUSION
In this final chapter of our study, our conclusion is presented. It consists of an interaction
model to be used when developing an NFC integrated Android application in order to create
pliable user interfaces by utilizing the tapping technique. The interaction model is not
necessarily Android specific and can to some extent be used when implementing NFC in
applications in general. The model is followed by other findings from the study that we believe
are important to consider when performing NFC implementation in Android smartphone
applications.
10.1 Interaction model
The aim of this study was to explore how the tapping interaction technique, enabled by NFC,
should be implemented in Android smartphone applications to create interfaces that ease the
user interaction and provide for a smooth, supple and interesting interaction experience. The
result is an interaction model suggesting how to give feedback to the user and possible ways of
defining clear interaction patterns.
10.1.1 Provide feedback when tapping
When designing the implementation of tapping interaction in an Android smartphone
application, feedback is of high importance. In order for users to feel confident in using NFC
and to trust the technique, they have to get feedback both when it is used the right way and
when it is used the wrong way. Feedback is needed at two different levels - the syntactical level
and the semantic level.
Sound and haptic
We have found it best practice to use sound feedback for the syntactical level - by playing a
happy sound if a tag is successfully read and a sad sound if it is not. For feedback at the
semantic level, we found haptic feedback to be best suited. In our prototypes we used vibration
for the case when an NFC-tag was read in the right semantic context. If the tag was read in
the wrong semantic context, nothing happened; the vibration therefore worked as a validation
that a tap had been successfully performed on both levels.
GUI dialogues
When the tapping technique is used together with a GUI, as in all our three prototypes, it is a
good practice to enhance the semantic feedback with dialogues. A good idea when creating
proper dialogues is to use colors that are visual identifiers for good and bad. In the western
world, green stands for good/approval and red stands for bad/warning. If the user taps a tag or
another device in the wrong semantic context, a dialogue with red visual elements can for
example be used. Positive feedback in the shape of a dialogue with green visual elements is
good for novice users, but it can be questioned whether more experienced users want this in
the longer run, as it might be seen as a disturbing element. If users have to wait for an
NFC-triggered dialog to disappear, or click a GUI button to remove it, the shortcut enabled by
NFC and the time savings provided thereof might be lost. Therefore, plan the use of the GUI
dialogues well.
It is also important to use clear and descriptive language in the dialogues:
1. Make use of the fact that NFC provides “toggle functionality” - either you did right or wrong. Similarly, NFC can be used as a light switch to turn something on or off depending on the previous state.
2. Use the contextual information provided when tapping as comparisons.
3. Communicate actively to the user.
An example of all three points above would be a dialog saying: “The patient you just scanned
(Arne Nilsson, 560120-2323) is not the same as the patient in the ordination list (Erik
Andersson, 761203-1911)” instead of: “Wrong patient scanned”.
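A minimal sketch of how such a descriptive warning dialogue could be built on Android follows; the class, method and message text are hypothetical and simply mirror the example above:

import android.app.AlertDialog;
import android.content.Context;

public class SemanticWarnings {

    // Shows a warning that actively compares the tapped patient with the expected one.
    public static void showWrongPatientDialog(Context context,
                                              String tappedPatient, String expectedPatient) {
        new AlertDialog.Builder(context)
                .setTitle("Wrong patient scanned")
                .setMessage("The patient you just scanned (" + tappedPatient
                        + ") is not the same as the patient in the ordination list ("
                        + expectedPatient + ").")
                .setPositiveButton("OK", null) // dismisses the dialog
                .show();
    }
}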
Novice and experience users
For more experienced users the sound feedback might become redundant when they feel
confident in using the tapping technique. Therefore NFC-enabled applications should be
equipped with the possibility for the user to choose which feedback they would like to get. For
novice users both sound and haptic feedback might be appropriate, but for experienced users
haptic feedback is probably enough, since it confirms that action is taken by the application
and thereby informs the user that the NFC-tag was properly read.
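As a sketch of how such a preference could be honoured, the helper below only plays the syntactical sound if the user has left sound feedback enabled. The preference key "pref_sound_feedback" and the resource R.raw.tap_ok are hypothetical; SharedPreferences and PreferenceManager are standard Android APIs.

    import android.content.Context;
    import android.content.SharedPreferences;
    import android.media.MediaPlayer;
    import android.preference.PreferenceManager;

    public class FeedbackSettings {

        // Plays the "tap ok" sound only if the user has not turned sound feedback
        // off in the application preferences (default: on, which suits novice users).
        public static void playTapSoundIfEnabled(Context context) {
            SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(context);
            if (prefs.getBoolean("pref_sound_feedback", true)) {
                MediaPlayer.create(context, R.raw.tap_ok).start();
            }
        }
    }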
10.1.2 Define clear interaction models
In order to make an application pliable and easy to use the interaction model has to be as
clear as daylight to any possible user. To accomplish this the interaction model has to be
consistent and self-explanatory: the user needs to be guided when tapping, and the same thing
should happen each time the user taps correctly in a certain context. This means that the user,
at least when using the default settings, gets guided before, during and after a tap. Before the
tap the user needs to be guided into tapping the correct tag or device. This is accomplished
through GUI dialogues, e.g. “Tap your ID card to identify yourself”. In order for the user to
perform a satisfying syntactical tap, i.e. hold the device within the detection area for a
sufficient amount of time, sound feedback is used to ensure a syntactically correct tap. After
the tap has been performed correctly at the syntactical level, semantic feedback in the shape of
haptics and dialogues is used to complete the procedure. A positive action is taken if the whole
chain is done right and a negative action otherwise.
NFC is good for providing shortcuts. One nurse proposed a feature where the user would be
able to tap a medicine box, while viewing another patient's ordination list, in order to go to
the other, newly tapped, patient's ordination list instead. The proposed interaction design
and our interaction design are presented in PatientSafetyPrototype Flowchart 4. It is an
interesting proposal, though we believe it is dangerous in some ways. The way our prototype was designed the
user had to go back from an ordination list to the list of patients before being able to tap a
new medicine box to show the corresponding new ordination list.
PatientSafetyPrototype Flowchart 4. Our interaction design and proposed design by a test user.
As shown in the above interaction example in PatientSafetyPrototype Flowchart 4, one
interaction step is eliminated in the proposed design that includes the tapping shortcut.
It is important in this case to carefully consider whether this feature is preferred by the general
user or only by the more experienced users. We assume that this shortcut might help
the more experienced users, while it could lead to confusion among novice users as it adds
another semantic rule: instead of only being able to tap a patient's wristband in the
ordination view, in order to verify the patient, the user can now also tap medicine boxes. No
matter which solution is chosen, it is important to keep a consistent set of semantics so that
users do not get confused - we chose to go with the rule of one tap per view.
A possible solution would be to introduce different context modes that the application can be
in. In this specific case the application could let the user be in either preparation mode or
distribution mode. Tapping a medicine box in the patient’s ordination list triggers the
preparation mode while tapping a patient's wristband triggers the distribution mode. If
implementing different context modes it would be preferable to show the user which mode the
application is in by, for example, setting the background GUI color to a specific color and/or
adding an icon of some sort. This is a good example of the possible usage of defining physical
contexts in NFC-tags.
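A sketch of how a tag-defined physical context could drive such a mode switch is shown below. The payload values, icon resources and colors are hypothetical; the View, ImageView and Color calls are standard Android APIs.

    import android.graphics.Color;
    import android.view.View;
    import android.widget.ImageView;

    public class ContextModes {

        // Switches the GUI into preparation or distribution mode depending on the
        // physical context stored in the tapped tag. The payload strings, icon
        // resources and color values are hypothetical placeholders.
        public static void applyContextMode(String tagPayload, View rootView, ImageView modeIcon) {
            if ("medicine-box".equals(tagPayload)) {
                rootView.setBackgroundColor(Color.parseColor("#FFF9C4")); // preparation mode
                modeIcon.setImageResource(R.drawable.ic_mode_preparation);
            } else if ("wristband".equals(tagPayload)) {
                rootView.setBackgroundColor(Color.parseColor("#B2DFDB")); // distribution mode
                modeIcon.setImageResource(R.drawable.ic_mode_distribution);
            }
        }
    }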
We have concluded that, in order to define clear interaction models, it is good practice to:
• Define the physical context in the tag (a sketch follows at the end of this section).
• Keep the user interaction model consistent.
• Make it possible to turn shortcuts ON/OFF in the preferences.
• Implement different context modes, carefully, if necessary.
When using tapping interaction it is important to be consistent about what happens when the
user taps in a certain context. The context can mean the state of the GUI or the physical
context surrounding the user.
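As a sketch of the first point above, a physical context can be written to a tag as an NDEF record that the application later recognizes. The MIME type and context strings below are hypothetical, while NdefRecord.createMime, NdefMessage and Ndef are part of the Android NFC API.

    import java.io.IOException;

    import android.nfc.FormatException;
    import android.nfc.NdefMessage;
    import android.nfc.NdefRecord;
    import android.nfc.Tag;
    import android.nfc.tech.Ndef;

    public class PhysicalContextWriter {

        // Writes a physical context (e.g. "medicine-box" or "wristband") to an
        // NDEF-formatted tag as a MIME record that the application can recognize.
        public static void writeContext(Tag tag, String context)
                throws IOException, FormatException {
            NdefRecord record = NdefRecord.createMime(
                    "application/vnd.example.nfc-context", context.getBytes("UTF-8"));
            NdefMessage message = new NdefMessage(new NdefRecord[]{ record });

            Ndef ndef = Ndef.get(tag); // returns null if the tag is not NDEF-formatted
            ndef.connect();
            try {
                ndef.writeNdefMessage(message);
            } finally {
                ndef.close();
            }
        }
    }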
10.2 Other findings
The interaction model is followed by other findings from the study that we believe are
important to consider when implementing NFC in Android smartphone applications.
10.2.1 Determine if NFC integration is suitable
The development of an NFC-integrated Android smartphone application can be performed in
many different ways. NFC can e.g. be integrated in an already existing application as in the
case with Sparakvittot or the application can be built completely from scratch. When building
from scratch the whole idea of the application can be based on NFC and the tapping
technique, as with TapThat. NFC and tapping interaction can, on the other hand, also play a
smaller role, simply being a complementary interaction technique to the touchscreen
interface, which is closer to the case of the Medicine Handling concept and the
PatientSafetyPrototype.
The first question you need to ask when preparing to implement NFC in an Android
application, no matter what role the tapping interaction will play in the application, is whether
NFC is suitable from a user experience perspective and thereby worth implementing at all.
The user experience factor has to do with whether the tapping technique actually makes
the interface easier to use and thereby enhances the user experience of the application.
The best solution is sometimes to leave the tapping technique out of the interface since there
are better solutions at hand. This was the case with a suggested feature in the
PatientSafetyPrototype. At first we believed that the users should be able to validate the
preparation of medicines by tapping the medicine jars with the phone during the preparation
phase. Since the preparation was performed in different ways on different hospital units but
also differently by each of the individuals on the same unit we felt that the task domain was
too complex to break down into a general pattern that then could be translated into a tapping
interaction pattern. We also questioned whether tapping medicine jars actually made the
nurses more effective, or whether it instead added cognitive load and thereby slowed the
process down. We concluded that the latter was the case.
Even though the tapping technique in general is a simple way of interacting one has to
critically question whether it actually enhances the application in each specific case. This
could e.g. be determined by user tests. New technologies should not be used just for the sake
of it but rather because of the enhancements they provide.
10.2.2 Use high-fidelity prototypes when evaluating
Kangas (2005) describes the importance of providing the user with real usage context when
performing usability studies in the mobile context. According to Kangas the users need to be
able to test software that feels like it is actually working - he confirms that it is a good choice to
use high-fidelity prototypes. Working with prototypes is also good from another perspective - they
can be used to demonstrate utilization of new technologies in a very concrete way.
When developing NFC-enabled Android applications one may be tempted to mock the
functionality of the tapping technique for testing purposes, e.g. by paper mock-ups. We
strongly believe that this is undesirable for two major reasons:
1. The tapping technique is a completely standalone interaction technique, working in a
dimension you cannot see, and is thereby part of its own interaction domain.
2. It is very hard to simulate tapping interaction without losing the whole interaction
feeling - assumptions have to be made about how the user is going to respond.
In order to test the tapping technique accurately, we strongly believe, you simply have to
implement it. Otherwise flaws may be completely overlooked because the tapping technique
cannot be described by e.g. paper mock-ups since they simply are not interactive enough.
Users may create their own expectations of how the device is going to respond when tapping,
which the evaluator may not share. Low-fidelity mock-ups are therefore poorly suited to the
NFC evaluation context.
Just like with many interaction techniques that are part of a touchscreen interface a lot of
information gets lost when trying to simulate NFC interaction. A comparison would, for
example, be to perform tests on the pinching technique, in a touchscreen interface, without
actually implementing it. Trying to describe to the user what would happen when you zoom in
by pinching your fingers together, without providing that exact functionality, is much harder
than just implementing it and letting the user see how it actually works. The very same
principle applies to NFC.
The medicine handling usability testing showed the advantage of using high-fidelity
prototypes. Without high-fidelity prototypes we would never have discovered the problems
surrounding some NFC hardware, i.e. that some users struggled with finding the right tapping
area on the smartphone we used or that the rubber wristbands were harder to tap compared
to the NFC stickers. Neither would we have seen that some users pulled out the medicine
boxes in order to tap them more easily.
10.2.3 Use specific intent filters
Regarding utilization of NFC as an application enabler it is highly important to use specific
intent filters when developing. This means that tapping a tag will launch the right application
on your smartphone. Imagine that you have all three of our applications installed on your
NFC-integrated smartphone. You do not want the PatientSafetyPrototype application to be
launched when you tap your computer in order to transfer a sound file to your phone with the
TapThat application. Conversely, you do not want TapThat to launch when you, as a nurse,
scan your ID card to launch the PatientSafetyPrototype application in order to log in. It is
important to keep intent filters in mind when developing an NFC-integrated Android
smartphone application.
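In the manifest this means declaring an NDEF_DISCOVERED intent filter with the application's own data type rather than a catch-all filter, so that only tags carrying that type launch the application. The same specificity applies at runtime: the sketch below registers a foreground dispatch filter for a single, hypothetical MIME type, so that tags intended for other applications are left alone. NfcAdapter.enableForegroundDispatch and the IntentFilter mechanism are standard Android APIs.

    import android.app.Activity;
    import android.app.PendingIntent;
    import android.content.Intent;
    import android.content.IntentFilter;
    import android.nfc.NfcAdapter;

    public class SpecificFilterActivity extends Activity {

        private NfcAdapter nfcAdapter;

        @Override
        protected void onResume() {
            super.onResume();
            nfcAdapter = NfcAdapter.getDefaultAdapter(this);

            // Only react to NDEF messages carrying our own (hypothetical) MIME type,
            // so that tags meant for other applications do not reach this one.
            IntentFilter ndefFilter = new IntentFilter(NfcAdapter.ACTION_NDEF_DISCOVERED);
            try {
                ndefFilter.addDataType("application/vnd.example.patientsafety");
            } catch (IntentFilter.MalformedMimeTypeException e) {
                throw new RuntimeException("Invalid MIME type", e);
            }

            PendingIntent pendingIntent = PendingIntent.getActivity(this, 0,
                    new Intent(this, getClass()).addFlags(Intent.FLAG_ACTIVITY_SINGLE_TOP), 0);

            nfcAdapter.enableForegroundDispatch(this, pendingIntent,
                    new IntentFilter[]{ ndefFilter }, null);
        }

        @Override
        protected void onPause() {
            super.onPause();
            nfcAdapter.disableForegroundDispatch(this);
        }
    }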
10.2.4 Determine if NFC integration is possible
To be able to successfully roll out an NFC-integrated Android application, a whole technical
infrastructure – an NFC ecosystem – needs to surround the application. This has already been
described in the Sparakvittot case. In the Medicine Handling case this means that tags need to
be bought, prepared and placed on the medicine boxes. The same applies to the patient
wristbands. In the TapThat case it means equipping both the computer and the smartphone
with compatible NFC hardware and software that use the same standards for information
transmission. On top of this there are technical, multi-platform-related issues both regarding
hardware and software, as well as economic and sometimes political factors, as in
Sparakvittot's case, that heavily influence whether the solution will be successful or not.
11 REFERENCES
Abras, C., Maloney-Krichmar, D. and Preece, J. (2004) User-Centered Design. In Bainbridge, W.
Encyclopedia of Human-Computer Interaction. Thousand Oaks: Sage Publications.
Ailisto, H., Matinmikko, T., Häikiö, J., Ylisaukko-oja, A., Strömmer, E., Hillukkala, M.,
Wallin, A., Siira, E., Pöyry, A., Törmänen, V., Huomo, T., Tuikka, T., Leskinen, S. and
Salonen, J., 2007. Physical Browsing with NFC technology. [pdf] Finland: VTT Technical
Research Center of Finland. Available at: <http://vtt.fi/inf/pdf/tiedotteet/2007/T2400.pdf>
[Accessed 07 March 2012].
Armitage, G. and Knapman, H., 2003. Adverse events in drug administration: a literature review.
[pdf] Journal of Nursing Management, 11, Issue 2, pp.130-140. Available at:
<http://onlinelibrary.wiley.com/doi/10.1046/j.1365-2834.2003.00359.x/pdf>.
[Accessed 20 March 2012].
Android Developer, 2012d. Advanced NFC. [online] Available at:
<http://developer.android.com/guide/topics/nfc/advanced-nfc.html>
[Accessed 24 February 2012]
Android Developer, 2012a. Android Developer’s Guide. [online] Available at:
<http://developer.android.com/guide/index.html> [Accessed 24 February 2012]
Android Developer, 2012b. Android Design Guidelines. [online] Available at:
<http://developer.android.com/design/index.html> [Accessed 24 February 2012]
Android Developer, 2012c. NFC Basics. [online] Available at:
<http://developer.android.com/guide/topics/nfc/nfc.html> [Accessed 24 February 2012]
Assa Abloy, n.d. Evaluation of the world's first pilot using NFC phones for check-in and hotel room keys.
[pdf]. Available at <http://www.assaabloy.com/Global/Products/Products-old/ASSA-ABLOYMobile-Keys/Report-ASSA-ABLOY-Mobile-Keys-Pilot-Clarion.pdf>
[Accessed 30 January 2013]
Benyó, B., Wilmos A., Kovacs, K. and Kutor, L., 2007. NFC Applications and Business Model of
the Ecosystem. In: Mobile and Wireless Communications Summit, 2007, StoLPaN. 16th IST.
Budapest, Hungary 1-5 July 2007.
Benyon, D., 2010. Designing Interactive Systems: A comprehensive guide to HCI and
interaction design. 2nd ed. Harlow: Pearson Education Limited.
Bevan N. and Macleod M., (1994) Behaviour and Information Technology, Volume 13, Issue 1-2,
Available at: <http://www.tandfonline.com/doi/abs/10.1080/01449299408914592>
[Accessed 13 March 2012]
Cavoukian A., 2011. Mobile Near Field Communications (NFC) “Tap ‘n Go” - Keep it Secure &
Private. Information and Privacy Commissioner, Ontario, Canada. [pdf] Available at:
<http://privacybydesign.ca/content/uploads/2011/02/mobile-nfc.pdf>
[Accessed March 14 2012]
Clarion Hotel Stockholm, 2012. NFC Mobile Phones Replace NFC. [pdf]. Available at:
<http://www.clarionstockholm.com/nfc-project> [Accessed 16 March 2012].
Cooper, A., Reinmann, R. and Cronin, D., (2007). About Face 3: The Essentials of Interaction
Design. University of Michigan: Wiley.
Dagstuhl Seminar on Demarcating User Experience, 2011. User Experience White Paper –
Bringing clarity to the concept of user experience. [pdf] Available at:
< http://www.allaboutux.org/files/UX-WhitePaper.pdf> [Accessed 30 January 2013]
Dey, A. (2001), Understanding and Using Context, College of Computing & GVU Center,
Georgia Institute of Technology, Atlanta, GA, USA. Available at:
<http://www.springerlink.com/content/1d9grxkjvquhpwkw/w> [Accessed 10 March 2012].
Dix, A., Abowd, G., Beale, R. and Finlay, J. (1998). Human-Computer Interaction, Prentice Hall
Europe.
Florman Lindeberg, C., Lindén, E., Gudmundsson, N., Höjer, O., Hosk, J. and Blom, C.,
2011. Soul - Just a tap away. Future of Media 2011. pp14-27.
Golafshani, N., 2003. Understanding reliability and validity in qualitative research. The qualitative
report, 8(4), pp.597-606. Available at:
http://peoplelearn.homestead.com/MEdHOME/QUALITATIVE/Reliab.VALIDITY.pdf
[Accessed 29 May 2012].
Google I/O, 2011. How to NFC. [video online] Available at:
<http://www.google.com/events/io/2011/sessions/how-to-nfc.html>
[Accessed 24 February 2012]
Gould, J.D., 1988, Designing Usable Systems. Available at:
<http://www.adammikeal.com/courses/chi/files/feb2.usable_systems.pdf>
[Accessed 02 May 2012]
Gregor Broll, Sven Siorpaes, Enrico Rukzo, Massimo Paolucci, John Hamard, Matthias
Wagner, Albrecht Schmidt, (2007), Supporting Mobile Service Usage through Physical
Mobile Interaction. In: Computer Society. Conference on Pervasive Computing and
Communications (PerCom'07). New York, USA 19-23 Mars, 2007. [Online] Available at:
<http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=4144771>
Gualtieri M., Gilpin M., Hammond J.S. and Knoll A., (2011), Mobile App Design Best
Practises. [pdf] Available at:
<http://www.webmetrics.com/content/download/4020/59629/file/Forrester_Mobile_App_
Design_Best_Practices.pdf> [Accessed 23 March 2012].
Gulliksen, J. and Göransson, B., (2002). Användarcentrerad systemdesign. Sweden, Lund:
Studentlitteratur.
Gulliksen, J., Göransson, B., Boivie, I., Blomkvist, S., Persson J., and Cajander, Å., 2003. Key
principles for user-centred systems design. 22:6, pp397-409. [pdf] Department of Information
Technology, Human-Computer Interaction, Uppsala University, Uppsala, Sweden. Available
at: <http://www.tandfonline.com/doi/pdf/10.1080/01449290310001624329>
[Accessed 23 March 2012].
IDG News Service, 2007. RFID Payment Chips Popular in Japan. [online]. Available at <
http://www.pcworld.com/article/129499/article.html> [Accessed 30 January 2013]
ISO 9241-11 (1998). Ergonomic requirements for office work with visual display terminals (VDTs) –
part 11: guidance on usability. Switzerland: International Organization for Standardization.
Available at: <http://www.it.uu.se/edu/course/homepage/acsd/vt09/ISO9241part11.pdf>
[Accessed 27 March 2012].
Kabir, Z., 2011. User Centeric Design of an NFC Mobile Wallet Framework. Available at:
<http://kth.diva-portal.org/smash/get/diva2:432724/FULLTEXT01>
[Accessed 20 April 2012].
Kangas, E., 2005. Applying user-centered design to mobile application development.
Communications of the ACM, 48(7), pp.55-59. Available at:
http://dl.acm.org/citation.cfm?id=1070866 [Accessed 24 March 2012].
Karat, J., 1996. Magazine interaction. 10(4), pp.18-20. [pdf] Available at:
<http://delivery.acm.org/10.1145/240000/234814/p18-karat.pdf>
[Accessed 22 March 2012]
Kjeldskov, J. and Graham, C., 2003. A review of mobile HCI research methods. Human-computer interaction with mobile devices, pp.317-335. Available at:
http://www.springerlink.com/index/4NJMXBLUKXFLJ9TM.pdf [Accessed 18 April 2012].
Kvale,S., 1996. Interviews: an introduction to qualitative research interviewing. Thousand
Oaks: SAGE Publication, Inc.
Lahtela, A., Hassinen, M. and Jylha, V., (2008), RFID and NFC in Healthcare: Safety of
Hospitals Medication Care. In: Pervasive Computing Technologies for Healthcare, 2008.
PervasiveHealth 2008. Second International Conference. [Online] Available at:
<http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=4571079>
[Accessed 23 March 2012]
Lunqvist, H., (2007), Smarta patientbrickor spara vårdplatser. it i vården, [online] 4 jun.
Available at: <http://itivarden.idg.se/2.2898/1.110715> [Accessed 10 March 2012].
Leech, B.L., 2002. Asking Questions: Techniques for Semistructured Interviews. [pdf] Rutgers
University, (I), pp.665-668. Available at:
<http://journals.cambridge.org/download.php?file=%2FPSC%2FPSC35_04%2FS10490965
02001129a.pdf&code=eb4e8a6ff99dda5e83c2e95bce65491e> [Accessed 16 April 2012].
Mercator Advisory Group, 2010. Global Sweep: NFC Technology to Redefine Smartphone Services.
[online] Available at:
<http://www.mercatoradvisorygroup.com/index.php?doc=news&action=view_item&type=2
&id=643> [Accessed 16 March 2012].
NFC Forum, 2010. NFC Forum Device Requirements - High Level Conformance
Requirements. [Online] Available at: <http://certification.nfcforum.org/docs/NFC_Forum_Device_Requirements.pdf> [Accessed 16 March 2012].
NFC Forum, 2012. About NFC. [Online] Available at: <http://www.nfcforum.org/aboutnfc/> [Accessed 16 March 2012].
NFC Forum, 2012, Near Field Communication and the NFC Forum: The Keys to Truly
Interoperable Communications. [pdf] Available at: <http://www.nfcforum.org/resources/white_papers/nfc_forum_marketing_white_paper.pdf> [Accessed 25
March 2012]
Nielsen, 2012. Smartphone manufacturer share by operating system. [image online] Available at:
<http://blog.nielsen.com/nielsenwire/?p=32494> [Accessed 23 August 2012].
Nielsen, J., (1993). Usability Engineering. Academic Press, London
Nielsen, J., (1994). Heuristic evaluation. In Nielsen, J., and Mack, R.L. (Eds.), Usability
inspection methods, John Wiley & Sons, New York, NY.
Nielsen, J., 1994. Usability Inspection Methods. Conference companion on Human factors in
computing, pp.413-414. Available at: http://dl.acm.org/citation.cfm?id=260531
[Accessed 03 April 2012].
Nielsen, J., and Molich, R. (1990). Heuristic evaluation of user interfaces, Proc. ACM CHI'90
Conf. (Seattle, WA, 1-5 April), 249-256.
Nordström, H. and Virén, K., 2010. Sjuksköterskors läkemedelshantering för god
patientsäkerhet. Available at: http://kau.divaportal.org/smash/get/diva2:323627/FULLTEXT01 [Accessed 14 April 2012].
Norman, D., A. and Draper, S., W., 1986. User Centred Systems Design. Hillsdale, NJ: Lawrence
Erlbaum Associates Inc.
Preece J., Rogers Y. and Sharp, H. (2007) Interaction Design: Beyond Human-computer
Interaction. Second Edition, New York: Wiley.
Rubin, J. and Chisnell, D., 2008. Handbook of Usability Testing - How to Plan, Design and
Conduct Effective Tests. 2nd ed. Indianapolis, Indiana, USA: Wiley Publishing Inc.
Rukzio E., Leichtenstern, K., Callaghan, V., Holleis, P., Schmidt, A. and Chin, J., 2006. In:
Ubicomp, Ubicomp 2006. Orange County, California, USA 17-21 September 2006. Berlin:
Springer-Verlag.
Torp, E., 2011. PayPal testar NFC-betalning i Sverige. Kelkoo Prylblogg, [Blog] 2011-12-22.
Available at: <http://blog.kelkoo.se/teknik/paypal-testar-nfc-betalning-i-sverige-2011-12>
[Accessed 16 March 2012].
Tidwell, J. (2005). Designing Interfaces: Patterns for Effective Interaction Design, Sebastopol:
O’Reilly.
van Welie, M., van der Veer, G., C. and Eliëns, A., (1999). Breaking down Usability. Faculty of
Computer Science - Vrije Universiteit Amsterdam [pdf]. Available at:
<http://www.cs.vu.nl/~gerrit/gta/docs/Interact99.pdf> [Accessed 16 March 2012].
Wichansky, A., M., 2000. Usability testing in 2000 and beyond. Ergonomics, 43(7), pp.998-1006. Available at: <http://www.ncbi.nlm.nih.gov/pubmed/10929833>
[Accessed 16 April 2012].
12 APPENDIX
12.1 Medicine Handling Usability Testing
12.1.1 Test instructions
Introduction
How is medicine distribution carried out on your unit?
We have built an app in which we have tried to imitate Cosmic as closely as possible, so most
of it will look familiar, although some things are different. To begin with, our app is scaled
down since we are only interested in the medicine distribution aspect.
Starting the app
The app can be started in two ways:
1) Tap the icon
2) Hold your ID card against the phone
Logging in
To log in you have to identify yourself - this is done by holding your access card against the
phone. You then enter your PIN code.
Start page
Once inside the app you see a list of all the unit's patients. To view a patient's ordination list
you tap one of the rows in the list. You can also select a patient's ordination list by tapping the
patient's medicine box when you are about to prepare the medicines. To identify a patient
once inside the ward room, you tap the patient's wristband.
Ordination list
To be able to prepare/sign a distribution, all boxes in a column have to be checked, i.e. all
medicines to be distributed at, for example, 08:00 have to be marked before the Sign button
appears. Prepare shows a yellow color in the checkboxes while Sign shows a turquoise color.
Once a distribution has been signed, the decision cannot be revoked in our app. If you check a
box, the adjacent columns are locked.
12.1.2 Interview questions
THE APP
What is your general impression of the app?
What do you think about being able to prepare and sign distributions directly on the phone?
What are the advantages and disadvantages of having "Cosmic" on the phone compared to a
desktop computer?
NFC
You chose to tap X times out of Y in order to identify a patient's ordination / start the app.
Why did you not use it more/fewer times?
What do you think about the tapping technique? If the test person does not understand the
question: Easy/problematic to use, works consistently/works too seldom?
Is tapping a good way to identify yourself, and if so why / why not?
What type of response/feedback do you prefer when you tap? For example, sound (to indicate
that a tag is within reach) and vibration (to indicate that the tag was read correctly), or only
sound, or only vibration? Do you have any other suggestions for types of feedback?
Are there any other cases where tapping could make your job easier? / Do you see any other
use for the tapping technique within healthcare?
If no answer is given, suggest: Tapping medicines to mark them in the list?
12.2 TapThat Usability Testing
The documentary listening day 1 (standard)
1) It is early in the morning and you feel like listening to a documentary while you eat
breakfast. Start one of the documentaries available in the media player on the computer.
2) Fast forward a little to illustrate that you have been listening for a while. Then keep
listening for a couple of seconds.
3) Now it is time to go to work, but the documentary is not over and it is very exciting, so you
want to keep listening on your smartphone on the way to work. Stop (pause) the audio file in
the computer program (the media player).
4) Deactivate the screen lock on your smartphone and start your favorite media player TapThat.
5) Select the same documentary you were listening to on the computer and skip to where you were.
6) Activate the screen lock on your smartphone and keep listening for a couple of seconds to
illustrate that you are listening on the way to work.
7) Now you have arrived at work and have to stop listening to the documentary in order to
work instead. Deactivate the screen lock and stop (pause) playback of the documentary on the phone.
8) Close the app by pressing the back button ( < ) on your smartphone.
The documentary listening day 2 (standard)
1) Finally! The working day is over and it is time to go home. On the way home you want to
listen to another documentary on your smartphone. Deactivate the screen lock and start TapThat.
2) Select a different documentary than the one you listened to earlier. Fast forward a little to
illustrate that you have been listening for a while on the way home. Activate the screen lock
and then keep listening for a couple of seconds.
3) Now you are home again, but the documentary is not over. You decide that you want to
keep listening on your computer. Deactivate the screen lock on your smartphone and stop
(pause) the audio.
4) In the computer program, select the same documentary you were listening to on the phone
and skip to where you were when you stopped playback on the smartphone.
The documentary listening day 1 (NFC)
1) It is early in the morning and you feel like listening to a documentary while you eat
breakfast. Start one of the documentaries available in the media player on the computer.
2) Fast forward a little to illustrate that you have been listening for a while. Then keep
listening for a couple of seconds.
3) Now it is time to go to work, but the documentary is not over and it is very exciting, so you
want to keep listening on your smartphone on the way to work.
4) Deactivate the screen lock on your smartphone and start your favorite media player TapThat.
5) Tap the phone against the tag to start listening on your smartphone.
6) Activate the screen lock and keep listening for a couple of seconds to illustrate that you are
listening on the way to work.
7) Now you have arrived at work and have to stop listening to the documentary in order to
work instead. Deactivate the screen lock on your smartphone and stop (pause) playback of the
documentary on the phone.
8) Close the app by pressing the back button ( < ) on your smartphone.
The documentary listening day 2 (NFC)
1) Finally! The working day is over and it is time to go home. On the way home you want to
listen to another documentary on your smartphone. Deactivate the screen lock and start TapThat.
2) Select a different documentary than the one you listened to earlier. Fast forward a little to
illustrate that you have been listening for a while on the way home. Activate the screen lock
and then keep listening for a couple of seconds.
3) Now you are home again, but the documentary is not over. You decide that you want to
keep listening on your computer. Deactivate the screen lock on your smartphone.
4) Tap your smartphone against the tag to keep listening on the computer.
12.3 Consent form
Consent form
Thank you for participating in our prototype evaluation! We will make a video recording of
the session in order to analyze the interaction with the prototype afterwards. Still images may
possibly also be taken.
The material will only be used within the project, and you as a participant will remain
anonymous. However, the results will be published in the form of a public report, and any still
images may be published in the report.
I understand that a video recording will take place during the test and that still images may be
taken. I give Martin Holeby and Patrik Sandberg permission to use the recorded material and
the still images for the above purposes, and I waive my rights to review or edit any of the
material.
Signature: Date: Name in block letters: