Innovations in Education and Teaching International
Vol. 45, No. 2, May 2008, 103–113
Assessing the accessibility of online learning
Joanne L. Badge (a)*, Emma Dawson (a), Alan J. Cann (b) and Jon Scott (a)
(a) School of Biological Sciences, University of Leicester, Leicester, UK; (b) Department of Biology, University of Leicester, Leicester, UK
*Corresponding author. Email: jlb34@leicester.ac.uk
A wide range of tools is now available to enable teaching practitioners to create web-based
educational materials from PowerPoint presentations, adding a variety of different digital
media, such as audio and animation. The pilot study described in this paper compared three
different systems for producing multimedia presentations from existing PowerPoint files. The
resulting resources were tested by a group of disabled students and a group of non-disabled
students. Our findings show that there were statistically significant differences between the
two groups in relation to their interaction with the resources. In particular, the students with
disabilities were significantly more active in using the available controls to customise the
running of the presentations. The data suggest that future work on why students with
accessibility issues made different uses of these resources could encourage practitioners’
deployment of multimedia resources for the benefit of all learners.
Introduction
In the UK, the Special Educational Needs and Disability Act (SENDA) 2001 requires that disabled
students are not treated unfavourably in Higher Education establishments and that reasonable
adjustment is made to provide them with access to the learning experience available to all
students. Policies to promote widening participation, and benchmarks and incentives to recruit
students from more diverse backgrounds and those with disabilities, have led to a doubling of the
declared number of disabled students entering Higher Education over the last 10 years (Higher
Education Statistics Authority, 2006; Riddell, Tinklin, & Wilson, 2005). It is often assumed by
teaching staff that the proportion of disclosed disabled students in practical and laboratory-based
courses is low, as they personally encounter few physically disabled students. However, learning
disabilities are included in the definition of disability in SENDA, so the needs of this group
of students must also be considered. A dyslexic student is more likely to choose a biological
science course than maths or computer science (Richardson & Wydell, 2003). It is therefore
particularly relevant that we seek to raise awareness of this growing trend among our
colleagues in bioscience.
The phrase ‘reasonable adjustment’ in the SENDA legislation requires staff to ensure that all
existing students can access their teaching materials in a format suitable for them. This includes
the provision of resources that a student can customise, such as a lecture handout supplied as the
original Word document, so that a student with a learning disability can change the font style
and colour to make it easier to read and understand, and the provision of software to enable
text to be converted to speech. Microsoft PowerPoint is widely used in Higher Education teaching
for presentations and lectures and to provide supporting materials on virtual learning environments
(VLEs), and its files can be customised by the student. However, use of PowerPoint is still controversial,
being highly dependent on the practitioner to provide meaningful content for sound pedagogy
(Jones, 2003), and research has been contradictory on whether it benefits student learning (Lowry,
1999; Szabo & Hastings, 2000). The now almost ubiquitous use of VLEs in UK Higher Education
(Browne & Jenkins, 2003; Ward, Gordon, Field, & Lehmann, 2001) means that PowerPoint files
are now more easily distributed directly to students than ever before. Taking this one stage further,
multimedia resources (containing video, animation or sound) can be used as assistive technology
by providing alternative formats, such as audio, diagrams and images, in place of long sections of
unbroken text that can be inaccessible to users with learning disabilities or to other disadvantaged
groups, such as those with English as a second language (Sloan, Stratford, & Gregor, 2006).
A wide range of tools is now available to enable instructors to add a variety of different
digital media to PowerPoint presentations to produce Web-based multimedia resources. We
report here a pilot study to investigate the use of three systems for the production of online
multimedia resources from existing PowerPoint files: two Adobe products, Flash and Presenter,
and a Java-based product, Impatica.¹ The resources were evaluated in terms of ease of production
for lecturing staff and resource preference by student users. To investigate the premise that
making multimedia resources available may benefit students with accessibility issues as well as
non-disabled students, two such groups of students were observed using the resources.
Method
A user-testing protocol was developed as the basis for this study to elicit how undergraduate
students with and without accessibility issues used the three different systems. The three
systems for producing multimedia Web-based resources were used to create the materials for
user testing. This provided a realistic and familiar activity with which to test the delivery systems.
To avoid distractions, we considered it important that the content of the teaching materials
be directly relevant to the students’ course of study. Materials were created for testing from PowerPoint
presentations sourced within the School of Biological Sciences and from one outside lecturer.
Once the study materials were created, user testing was carried out with individual students, and
the sessions were both directly observed and videotaped for later analysis. Participants were asked
to complete a series of questions relating to the content of the presentations. These questions
enabled testing of all the features and different presentation media within each format. This
approach also allowed us to observe how the students would use the systems in a simulated
real-life situation, such as searching for useful information in lecture support materials.
Creation of test materials
The project required the production of learning materials using three different formats: Flash,
Adobe Presenter and Impatica. In order that fair user testing could be carried out, exactly the same
content was used for all three formats. As Flash is a complex program, it was initially assumed
that pre-existing Flash presentations could be sourced within the School and/or by a search
online. This content would then be converted to PowerPoint for use in Adobe Presenter and
Impatica. However, the sourcing of suitable materials was more challenging than expected and
the conversion from Flash to PowerPoint was not straightforward. It was therefore decided to take a
different approach and start with PowerPoint presentations that could later be converted into the
different presentation outputs. Several PowerPoint presentations containing animations were
found within the teaching materials presented in the Faculty of Medicine and Biological Sciences
and were made available by the kind permission of their authors. One presentation used was from
a lecturer at Trinity College, Dublin (permission granted by the author). Four of these presentations
were used to create provisional materials.
Due to the different ways in which Flash, Adobe Presenter and Impatica handle PowerPoint
files, slightly different versions of each presentation had to be created for each platform in order
for them ultimately to appear identical in content. Narrative scripts were written for each presentation and sound tracks were recorded, edited and cued to the relevant animations.
To avoid over-exposure of the test participants to the materials and to work within acceptable
concentration levels, the period that each user tested the materials was kept to a maximum of one
hour. Following some pilot testing, two topics were selected for full user testing, one based on
basic concepts in immunology and the second on the process of DNA replication. Initial trials
suggested that it would take a minimum of 30–40 minutes to view the material passively and it
was felt that this would give an overall user testing time within the 60-minute maximum.
An example screenshot of each of the different Web-based presentation formats is shown in
Figures 1–3. The resources can be viewed at: http://www2.le.ac.uk/Members/jlb34/research/demo.
Figure 1. Screenshot from the Flash presentation of ‘Immunology’ content.
Figure 2. Screenshot from the Impatica presentation of ‘Immunology’ content.
Figure 3. Screenshot from the Adobe Presenter presentation of ‘Immunology’ content.
Recruitment of users for testing
Current undergraduate students were recruited for user testing. Disabled students at the
University of Leicester can register with the AccessAbility Centre to ensure that their needs are
highlighted and relevant steps are taken to support their studies. The AccessAbility Centre records
showed that there were between 15 and 20 students with registered accessibility issues (excluding
mental health problems) in the School of Biological Sciences. To recruit the test subject group
(those with registered accessibility issues), the AccessAbility Centre sent out a letter of invitation
to Biological Science students registered with them. An email was also sent to all students within
the School and to the first four years of students on the Medical School’s MBChB programme,
asking for volunteers. Students were offered a small cash fee for completing the task. Ten
students were recruited and came individually to user testing sessions. Each subject was asked to
complete a form notifying us of the nature of their registered accessibility issues.
A matched control group was recruited once the subject testing was completed. Participants
were matched for year of study, gender and course of study. The control group was recruited by
email and this elicited a very large number of replies so that matching the test participants was
easily achieved.
User testing
User testing sessions were carried out in a private office. The sessions were run by one member
of the team, who took notes and videotaped the session. Students signed a consent form to
give their permission for the videotaping of the session. The videotapes were used solely for
data collection, so that timings, actions and questions could be reliably recorded and analysed
later.
The computer used for testing was running Windows XP and was the same machine used to
create and test the materials, to ensure that all software, visual and sound problems had been
identified. A simple HTML page was created to host six links to the testing materials. There were
six presentations to view: a Flash, an Impatica and an Adobe Presenter presentation of each of the two
topics chosen (Immunology and DNA replication). The resources were presented in the
following order to all participants:
(1) Immunology – Flash version.
(2) Immunology – Impatica version.
(3) Immunology – Adobe Presenter version.
(4) DNA replication – Flash version.
(5) DNA replication – Impatica version.
(6) DNA replication – Adobe Presenter version.
The sample size in this pilot study was too small to allow for meaningful variation in the
order in which the presentations were viewed by each subject. Each system displayed the
materials in a different way. On each system, audio was synchronised to play with animations
and started automatically with each new slide. The Flash presentation did not progress
from slide to slide without user intervention; participants had to click to move forward through
the presentation. During the Impatica presentations, participants had to click to start the presentation,
but thereafter it played automatically. The script of the audio narration was displayed automatically
at the top right of the screen in time with the audio soundtrack. Participants could skip ahead by
selecting a slide by its title in the left-hand pane, or by using the fast-forward control. The
Adobe Presenter system played automatically. Again, the script of the audio narration was
displayed in the ‘notes’ tab in time with the audio soundtrack. Participants could skip ahead by
selecting a slide by its title in the right-hand pane, or by using the fast-forward control.
The thumb tab displayed a list of slides with small thumbnail images of each slide as well as
their titles.
Participants were given minimal information about the project and the task, namely that we
were interested in how students interacted with e-learning resources. They were asked to answer
three questions about each of the presentations. These questions were designed to encourage them
to make use of the various features of the three different platforms. Questions were based on the
scientific content of each presentation and therefore appeared to test the students’ knowledge of
the subject (e.g. immunology) rather than their use of the system. Questions were designed such
that they addressed information contained in the text on the slides, the audio track or some
graphical element of the presentation. The questions were structured so that the balance between
these three aspects was the same for each presentation, though the subject content of the questions
differed. Participants were also asked to comment on the three different styles of presentation
once the test was completed. The answers given by the students from the task questionnaires were
marked and their scores analysed.
Quotations from test participants
Comments made by the participants were transcribed from the video recordings. Test participants
were asked to comment freely on their experience of using the three presentation formats. The
majority of the comments were unsolicited; a few were made in response to the observer asking
the participants to comment on the experience of user testing at the end of the session. A selection
of these comments are included in the discussion section.
Results
User testing
Ten participants with accessibility issues were recruited and tested individually. Seven of the
participants were registered with dyslexia, one with dyspraxia, one with a visual impairment and
one with a hearing impairment. There were two males and eight females from a range of years of
study and courses within the Faculty. The control group of 10 non-disabled students was matched
for gender, year and course of study to the test group.
Factors analysed
Features used
Records were made of which features of Adobe Presenter and Impatica were used by the
participants. These features related to any operation that could be performed, such as using the
play or pause button, clicking on the tabs in Adobe Presenter to view the script or a different menu
view, or using the slide selector in Impatica. We found that, overall, the test group of disabled
students used significantly more features than the control group (Fisher’s exact test, p < 0.001).
Figure 4 shows a graphical plot of the number of times the two groups used a particular feature
of Adobe Presenter or Impatica.
Figure 4. Graph showing the number of times that the navigation and other visual control features of Adobe Presenter and Impatica were used by the test (disabled students) and control (non-disabled students) groups.
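For reference (a standard definition, not taken from the paper): Fisher’s exact test, used for this and the other categorical comparisons reported below, computes the probability of a given 2 × 2 contingency table with cell counts a, b, c, d and total n = a + b + c + d directly from the hypergeometric distribution,

p = \frac{(a+b)!\,(c+d)!\,(a+c)!\,(b+d)!}{a!\,b!\,c!\,d!\,n!},

with the two-sided p-value obtained by summing these probabilities over all tables at least as extreme as the one observed.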
Scores
The question sheets completed by the participants were marked and the number of correct
answers totalled. There was no statistically significant difference between the scores obtained by
the two groups (test mean score = 17; control mean score = 16; t(18) = 0.379; p > 0.05).
Use of search facility
Adobe Presenter and Impatica both provide a search facility that can be used to search for
keywords or phrases within the text of the slide or script. Some students made good use of this
feature to find answers to the questions they were asked to complete. However, when the data
were analysed (see Table 1), there was no statistically significant difference in the use of search
facilities between the test and control groups (Fisher’s exact test, p > 0.05).
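As a quick sanity check (a sketch, not part of the original analysis), the counts in Table 1 below can be fed to a standard implementation of Fisher’s exact test, for example SciPy’s scipy.stats.fisher_exact:

from scipy.stats import fisher_exact

# 2 x 2 contingency table from Table 1: rows are the two groups,
# columns are 'used search facility' / 'did not use search facility',
# tallied over four presentations (10 students per group).
table = [[6, 34],   # disabled students (test group)
         [3, 37]]   # non-disabled students (control group)

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
# The p-value is well above 0.05, consistent with the non-significant
# difference reported in the text.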
Table 1. The use of the search facility, counted over all four presentations (two Adobe Presenter and two Impatica); 10 students in each group.

                                         Used search facility    Did not use search facility
Disabled students (test group)                     6                         34
Non-disabled students (control group)              3                         37

Time taken to complete the task
Initially it was assumed that the test participants might take longer than the control participants
to complete the tasks and view all the presentations. However, in this study we found that there
was no significant difference between the overall time taken by the two groups to complete the exercises
(test group mean time = 2381 s (SD = 597); control group mean time = 2083 s (SD = 273);
t(18) = 1.431; p > 0.05).
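The reported statistic can be approximately reproduced from these summary values alone; a minimal sketch using SciPy’s two-sample t-test from summary statistics (an equal-variance test is assumed, as implied by the reported 18 degrees of freedom for two groups of 10):

from scipy.stats import ttest_ind_from_stats

# Two-sample t-test reconstructed from the published summary statistics.
t_stat, p_value = ttest_ind_from_stats(
    mean1=2381, std1=597, nobs1=10,   # disabled students (test group)
    mean2=2083, std2=273, nobs2=10,   # non-disabled students (control group)
    equal_var=True,
)
print(f"t(18) = {t_stat:.2f}, p = {p_value:.3f}")
# Gives t close to the reported 1.431 (small differences reflect rounding
# in the published means and SDs); p > 0.05, matching the reported result.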
Use of navigation tools
Several navigation tools exist in both Impatica and Adobe Presenter, similar to VCR controls and
those common to Internet media players (play, stop, pause, fast forward, rewind). A note was
made of how many slides played before the subject made use of any one of these controls. In our
study, the test participants located and used these controls more quickly than the control group,
and this difference was statistically significant (Fisher’s exact test, p < 0.001).
Preference for software
When students were asked which of the three presentations they preferred, there was a significant
preference for Adobe Presenter (Fisher’s exact test, p < 0.001) over the other two formats tested
(Impatica and Flash) (see Table 2).

Table 2. The number of students who stated a preference for a presentation format (10 students questioned in each group).

Stated preference          Disabled students    Non-disabled students*
Adobe Presenter                    8                      6
Impatica                           1                      1
Adobe Macromedia Flash             1                      1

* One student had no preference and one student liked a combination of features from all three formats; both were excluded from the calculations.
Discussion
This project investigated the use of the resources created by the three systems by two groups of
students, in order to determine whether students with accessibility issues made different uses of
these alternative formats when compared with a control group of students.
Students with accessibility issues encompass a wide spectrum of disabilities: although the
majority of the participants were registered as dyslexic, the group also included students with
visual and hearing impairments. Whilst these disabilities cannot be compared directly, SENDA
legislation enforces the remit of accessible design for all students, and this mix is likely to be
representative of the composition of students with accessibility issues in any given cohort. It was
therefore important to us in this pilot study to look at this group as a whole and at their reaction
to the materials presented.
The users were asked to perform distracter tasks (answering questions based on the content
of the presentations) while using the Web-based resources under observation. This usability-testing
format allowed the comparison of the use of resources by students with and without accessibility
issues. The aim was not to examine explicitly the absolute accessibility of each resource but to
gauge the comparative usability experience of these two groups. The relationship between the
usability and accessibility of websites, and therefore of Web-based teaching materials, is a complex
one (Petrie & Kheir, 2007). Usability is widely accepted in terms of the definition provided by
ISO 9241: ‘the extent to which a product can be used by specified users to achieve specified goals
with effectiveness, efficiency and satisfaction in a specified context of use’ (International
Standards Organization, 1992–2000). Accessibility, however, is a more complicated issue
(Iwarsson & Ståhl, 2003) and is often expressed in terms of guidelines and recommendations for
design. The Web Accessibility Initiative offers the definition: ‘Web accessibility means that people
with disabilities can use the Web. More specifically, Web accessibility means that people with
disabilities can perceive, understand, navigate, and interact with the Web, and that they can
contribute to the Web’ (Henry, 2006). Petrie and Kheir (2007) demonstrated that accessibility
issues and usability issues are two overlapping sets of problems, and neither is a wholly contained
subset of the other. Although products should comply with national legislation, the fact that
accessibility standards focus on guidelines and checklists rather than the concepts of usability
means that it is essential that products used in the educational environment be tested by both
disabled and non-disabled students. A study by Mackey and Ho (2008) demonstrated a strong
correlation between usability factors and the perceived improvement in learning by students using
a Web-based multimedia tutorial. Crowther, Keller, and Waddoups (2004) argued that user testing
of educational resources removed interface problems and freed the students to concentrate on the
educational content.
In our limited study, the most striking result was that the group of disabled students used
significantly more ‘user control’ features than the non-disabled group. From observation, this was
not because the students were confused, randomly clicking on control icons or losing their way
in the resources, but because they were actively seeking out information by selecting appropriate
sections of the material more directly than the control group. For example, two disabled students
commented:

I like the [Adobe Presenter] one … you could go straight to slide three instead of sitting all the way
through the presentation again.

It was really good to be able to pause [Impatica presentation] quite easily and to have the words
[narrative script] up on the screen, although they are quite small and you can’t just read them all the
time, if you pause it you can go through it.
It is possible that these students were used to customising their own learning experiences and
personalising their computing environment to facilitate their own learning and, as such, were
perhaps more self-aware than the control group, who mostly appeared to watch the presentations
passively, with little interaction. Our study found that the disabled participants located and
used these controls more quickly than the control group, and this difference was statistically
significant, again indicating that these students expect to be able to control their environment.
Adobe Presenter and Impatica both dictate a layout and visual appearance which can be customised
within certain ranges but which ultimately conforms to a standard format. The look and layout of
the two systems described here rely on some well-known visual navigation aids and controls; for
example, both use VCR-type symbols (pause as two vertical bars, stop as a square, play as a single
arrowhead and skip or fast forward as a double arrowhead). One student commented:

[Adobe Presenter] was set out like iTunes or Windows Media Player that I know, so I know fast
forward, play, stop was here [indicates to control bar at bottom of screen].
The paper-based tasks given to the students during the test session were devised as a distracter to
help them engage with the material and encourage their interaction with the software. The
questions addressed the science topics covered in the presentations and the answers required
recall from different elements of the presentation (graphical, sound and textual). We found no
statistically significant difference between the scores obtained by the two groups, indicating that
the questions did not discriminate between the two groups and that they had comparable knowledge of the underlying science content.
We found that there was no significant difference between the overall time taken by the two
groups to complete the exercises, which was surprising given that extra time is something
regularly offered to students with accessibility issues in examinations and other formal assessments. However, the data were skewed by one disabled subject who completed the tasks in the
shortest recorded time and a control subject who took the longest recorded time. Although the test
group participants appeared to take more care and time reading the text on the screen, their use of
the controls to stop and start the presentations often meant that they later made more efficient use
of their time when looking for the answers to various tasks.
The significant preference for Adobe Presenter over the other two formats tested (Impatica
and Flash) could have been due to the fact that this was the format presented last and thus may
have appeared easier to use as the students had become accustomed to the previous presentations.
However, at least one student remarked that they could not see the video controls on Impatica and
several commented that the layout of Adobe Presenter was uncluttered, clear and familiar.
The majority of the test group were students with dyslexia. A study by Alty, Al-Sharrah, and
Beacham (2006) found that while non-dyslexic students’ learning outcomes improved most when
they were presented with sound and diagrams rather than text alone, the opposite was true of a
group of dyslexic students undergoing the same tests (Beacham & Alty, 2006). This was a surprising
finding, given that dyslexic students are thought to have the most problems with text alone; Beacham
and Alty suggest that this could be because such students are forced to develop considerable
compensation skills for text over other media. Here, we demonstrate that these students took more
control than the control group over the resources they used in the test, by using the navigation,
playback and audio controls. Gaining statistically significant results in such a small-scale project
was very encouraging.
Future work
This pilot study, with a small number of participants and limited scope, has raised several interesting
questions. How do dyslexic students customise resources for their own use? Where is their attention
focused on the resource, and does that differ from non-disabled students? Further research with
dyslexic students in particular could provide useful and practical information for practitioners
creating resources for use online to support lectures or to deliver other material. Eye-tracking
and mouse/motion-tracking studies, with further usability testing, could elicit how and why
disabled students use the systems differently.
Acknowledgements
Our thanks go to the students who participated in this study and to Noel Murphy and Colin Hewitt
for allowing us to use their PowerPoint presentation materials. We are grateful to the University
of Leicester Fund for New Teaching Initiatives for supporting the work described in this report
and to the University of Leicester Teaching Enhancement Forum and GENIE CETL for providing
post funding for Dr Badge.
Note
1. Flash, http://www.adobe.com/products/flash/; Adobe Presenter, http://www.adobe.com/products/presenter/; Impatica, http://www.impatica.com/.
Notes on contributors
Joanne Badge is Web Resources Development Officer in the School of Biological Sciences at the University
of Leicester, providing support to academic staff for online teaching. She has a background in promoting
online resources for biological sciences.
Emma Dawson was employed as a research assistant to prepare the materials for this study and to carry out
the majority of the user testing.
Alan Cann is a senior lecturer in the Department of Biology at the University of Leicester. He has over 10
years’ experience of developing and delivering online learning and is particularly interested in online
assessment and interactivity. In recent years he has worked extensively on developing new formats
for online learning materials.
Jon Scott is Director of Studies in the School of Biological Sciences at the University of Leicester. His main
areas of teaching interest are neuroscience and human systems physiology which he teaches to biology and
medical students. He is also involved in teaching developments in relation to key skills in the biosciences.
References
Alty, J.L., Al-Sharrah, A., & Beacham, N. (2006). When humans form media and media form humans: An
experimental study examining the effects different digital media have on the learning outcomes of
students who have different learning styles. Interacting with Computers, 18, 891–909.
Beacham, N.A., & Alty, J.L. (2006). An investigation into the effects that digital media can have on the
learning outcomes of individuals who have dyslexia. Computers & Education, 47, 74–93.
Browne, T., & Jenkins, M. (2003). VLE surveys – A longitudinal perspective between March 2001 and
March 2003 for Higher Education in the United Kingdom. Oxford, UK: UCISA.
Crowther, M.S., Keller, C.C., & Waddoups, G.L. (2004). Improving the quality and effectiveness of
computer-mediated instruction through usability evaluations. British Journal of Educational
Technology, 35, 289–303.
Henry, S.L. (2006). Introduction to web accessibility. Retrieved July 18, 2007, from www.w3.org/WAI/intro/accessibility.php
Higher Education Statistics Authority. (2006). HESA online information service. Retrieved February 14,
2007, from http://www.hesa.ac.uk/holisdocs/home.htm
International Standards Organization. (1992–2000). Standard 9241: Ergonomic requirements for office
work with visual display terminals. Geneva, Switzerland: International Standards Organization.
Iwarsson, S., & Ståhl, A. (2003). Accessibility, usability and universal design – Positioning and definition
of concepts describing person–environment relationships. Disability & Rehabilitation, 25(2), 7.
Jones, A.M. (2003). The use and abuse of PowerPoint in teaching and learning in Life Sciences: A personal
overview. HEA Bioscience, 2. http://www.bioscience.heacademy.ac.uk/journal/vol2/beej-2-3.htm.
Lowry, R.B. (1999). Electronic presentation of lectures – Effect upon student performance. Royal Society
of Chemistry. http://rsc.org/pdf/uchemed/papers/1999/31_lowry.pdf.
Mackey, T.P., & Ho, J. (2008). Exploring the relationships between Web usability and students’ perceived
learning in Web-based multimedia (WBMM) tutorials. Computers & Education, 50, 386–409.
Petrie, H., & Kheir, O. (2007). The relationship between accessibility and usability of websites. San Jose,
CA: ACM Press.
Richardson, J.T.E., & Wydell, T.N. (2003). The representation and attainment of students with
dyslexia in UK higher education. Reading and Writing, 16, 475–503.
Riddell, S., Tinklin, T., & Wilson, A. (2005). New Labour, social justice and disabled students in higher
education. British Educational Research Journal, 31, 623–643.
Sloan, D., Stratford, J., & Gregor, P. (2006). Using multimedia to enhance the accessibility of the learning
environment for disabled students: Reflections from the Skills for Access project. ALT-J, 14(1), 39–54.
Szabo, A., & Hastings, N. (2000). Using IT in the undergraduate classroom: Should we replace the
blackboard with PowerPoint? Computers & Education, 35, 175–187.
Ward, J.P.T., Gordon, J., Field, M.J., & Lehmann, H.P. (2001). Communication and information technology
in medical education. The Lancet, 357 (9258), 792–796.