Format and long-term effect of a technique
mastering programme in first year Calculus
by
Anna Elizabetha du Preez
Submitted in partial fulfilment of the requirements for the degree of
Master of Science: Mathematics Education
in the Faculty of Natural and Agricultural Sciences
at the
University of Pretoria
Supervisor: Dr AF Harding
Co-supervisor: Prof JC Engelbrecht
April 2004
Abstract
The research on the format and long-term effect of a technique mastering
programme in the first year Calculus course involves a group of first year engineering
students at the University of Pretoria. Apart from conceptual understanding, these
students are also expected to master a certain amount of basic knowledge and rote
skills in the Calculus course. The process of acquiring and assessing basic knowledge
and rote skills (also referred to as must knows and techniques, respectively) is known
as the technique mastering programme at the University of Pretoria.
This study addresses two research questions. The first question deals with whether the paper-based assessment format for the technique mastering programme in first year Calculus can be replaced by computer-based assessment without a significant difference in performance. The second question deals with the long-term effect of the technique mastering programme: the study investigates which and how much of the knowledge and skills embedded by the technique mastering programme in the first year is retained after a further two years of study.
In answer to the first question, the study shows that statistically there is no
significant difference in performance in the technique mastering tests when the paper
format is replaced by an online format. Yet, for a large group of students the logistics
are formidable and the change to the online format under investigation is not
practically feasible. The second part of the study shows that, in general, there is a
disappointing decline in performance over a period of two years. There are, however,
areas in which students performed better after the elapsed period. The research is of diagnostic value in determining the future of the technique mastering programme with regard to both its format and content.
Format and long-term effect of a technique mastering
programme in teaching Calculus
Anna E du Preez
May 2004
Acknowledgements
The author wishes to thank the following people for their contributions to this dissertation.
Dr Ansie Harding (my supervisor) for her enthusiasm, perseverance and energy that went into
guiding me through this research.
Prof Johann Engelbrecht (co-supervisor) for his input and guidance, and keeping up with
the logistics.
The first year computer engineering students of 2001 involved in this research.
The third year computer engineering students of 2003 who so generously allowed me to interview them.
My colleagues involved with the mainstream first year mathematics course for their
input and leniency when time was of the essence.
Dr Rina Owen for assisting me in analysing the empirical data.
Victor (my husband) for his encouragement and especially his indulgence with me
while I was burning the midnight oil.
Tharina and Nanette (my daughters) for always believing in my ability during
all my trials and tribulations.
All my dearest friends and family who never gave up on my behalf and especially
my mother and father who put their needs aside to allow me to continue with my
studies.
Dr Kerstin Jordaan (my colleague and friend) for her encouragement and help in keeping
me focused.
Contents

1 Introduction . . . 5
   1.1 Setting the scene . . . 5
   1.2 Research questions . . . 7
   1.3 Structure of the dissertation . . . 7
   1.4 Significance of this research . . . 8

2 Technique mastering . . . 10
   2.1 Terminology and examples . . . 10
      2.1.1 Must knows . . . 10
      2.1.2 Techniques . . . 11
   2.2 Technique mastering, gateway testing and assessment . . . 12
   2.3 Long-term retention of core knowledge and basic skills . . . 15
   2.4 Gateway testing at other institutions . . . 18

3 Technique mastering at the University of Pretoria . . . 22
   3.1 Computer-based education in mathematics at the University of Pretoria . . . 22
   3.2 The current situation at the University of Pretoria . . . 23
   3.3 The technique mastering programme at UP in 2001 . . . 24

4 Computer-based assessment . . . 26
   4.1 Definitions and terminology . . . 26
   4.2 Online assessment . . . 26
   4.3 Examples of online assessment . . . 31
   4.4 Comparing online and paper assessment . . . 32
List of Tables

2.1 Comparison of a number of USA institutions' gateway testing programmes . . . 21
5.1 Distribution of grade twelve mathematics symbols . . . 34
5.2 Comparison between the results of the online and paper TMTs . . . 35
5.3 Student performance distribution (percentage-wise) comparing the results of the online and paper TMTs for TMT1 . . . 37
5.4 Student performance distribution (percentage-wise) comparing the results of the online and paper TMTs for TMT2 . . . 37
6.1 Comparison between the results of the paper TMTs in 2001 and the paper TMTs 2003 . . . 45
6.2 Student performance distribution (percentage-wise) comparing the results of the paper TMTs of 2001 and the paper TMTs of 2003 for TMT1 . . . 46
6.3 Student performance distribution (percentage-wise) comparing the results of the paper TMTs of 2001 and the paper TMTs of 2003 for TMT2 . . . 46
6.4 Student performance distribution (percentage-wise) for TMT1 Question 1 . . . 48
6.5 Student performance distribution (percentage-wise) for TMT1 Question 2 . . . 49
6.6 Student performance distribution (percentage-wise) for TMT1 Question 3 . . . 51
6.7 Student performance distribution (percentage-wise) for TMT1 Question 4 . . . 52
6.8 Student performance distribution (percentage-wise) for TMT1 Question 5 . . . 54
6.9 Student performance distribution (percentage-wise) for TMT2 Question 1 . . . 55
6.10 Student performance distribution (percentage-wise) for TMT2 Question 2 . . . 56
6.11 Student performance distribution (percentage-wise) for TMT2 Question 3 . . . 57
6.12 Student performance distribution (percentage-wise) for TMT2 Question 4 . . . 58
6.13 Student performance distribution (percentage-wise) for TMT2 Question 5 . . . 59
List of Figures

5.1 Comparison of distribution of marks for TMT1 and TMT2 . . . 38
6.1 Diagram comparing the pass percentages of TMT1 01 and TMT1 03 . . . 47
6.2 Diagram comparing the pass percentages of TMT2 01 and TMT2 03 . . . 47
6.3 The graph of f(x) = |2x + 1| as done by William . . . 52
6.4 The graph of f(x) = ln(−x) drawn by one of the students in the two consecutive tests . . . 53
Chapter 1
Introduction
1.1 Setting the scene
The mainstream first semester calculus course at the University of Pretoria covers, as at most other universities, standard topics such as limits, continuity, differentiation and integration. In 2001, when the research for this study was conducted, approximately 900 students were enrolled for the course, of which the majority (65%) were engineering students, with the others majoring in science subjects such as mathematics itself, physics, chemistry or information technology.

The group under discussion is typical of a first semester calculus group of any year in its diversity of preparedness for the course. On the one end of the scale are students who took Additional Mathematics as an additional subject in Grade 12, a subject whose content overlaps with the first year calculus course. On the other end of the scale are students from educationally disadvantaged backgrounds who come to university ill-prepared and laden with misconceptions regarding basic principles. There are also students who passed Grade 12 mathematics with distinction and students who come from other countries with a different school syllabus.
Devlin (1991) describes the students on the privileged side of the spectrum in a then-future vision as:
Imagine then the kind of person coming into our graduate schools, if not today,
then certainly tomorrow. Brought up from early childhood on a diet involving MTV,
Nintendo, graphical calculators packed with algorithms, Macintosh-style computers
and, in the not-too-distant future, hypermedia educational tools as well. Such a person
is going to enter mathematics with an outlook and a range of mental abilities quite
different from their instructors.
On the other side of the spectrum are students who have no experience of technology at all, some without even basic technological appliances at home.

When dealing with such a diverse group, in addition to a full curriculum and limited contact time, one has to be innovative to maximise students' learning experience and to stimulate these students in their journey of discovery.
After graduation, when these students enter the job market, they will most probably be required to utilise various resources connected to mathematics such as the internet, software programs, new textbooks, etc. They will be expected to analyse new material and apply and synthesise
this new knowledge in their problem solving strategy. One of the lecturer’s tasks then certainly
is to expose students to resources other than the prescribed textbook (Stewart, 1999). This is
feasible but time-consuming.
Although one of the main objectives of the calculus course under discussion is to equip students
with problem solving skills, one cannot ignore the necessity of acquiring basic knowledge and rote
skills before embarking on problem solving. The basic knowledge and rote skills are referred to as must knows and techniques, respectively. The process of acquiring and assessing the must
knows and the techniques is known at the University of Pretoria as the technique mastering
programme (TM programme for short).
Mastering of the must knows and techniques requires training and a fair amount of repetition
from the student’s side. These skills then need to be assessed, often more than once, until the
required level of expertise is achieved. Technique mastering is thus time consuming and labour
intensive for the student as well as for the lecturer.
Because of a high content volume for each contact period, with only enough time to teach and
illustrate concepts and theory for a particular topic, there is little time to incorporate practice
activities for the technique mastering programme into the contact sessions.
The time constraints imposed by the challenges mentioned above were the reason for the decision
to separate the technique mastering programme from the formal contact lecture programme and
to investigate the possibility of incorporating computer-based assessment into this programme. In
doing so students can independently master the necessary techniques by repetitively writing tests
without unnecessary external intervention. Lecturers are spared the drudgery of grading repetitive
technique mastering tests and so gain valuable hours.
This study describes the implementation of computer-based assessment, in particular web-based
assessment, into the technique mastering programme and compares results of the computer-based
versus paper-based assessment.
In addition, the long-term effect of the technique mastering programme is investigated. A sample group of the 2001 first year students is assessed again in 2003, in their third year of study, to determine how ingrained the must knows and techniques still are.
1.2 Research questions
The research questions of this study are formulated as follows:
1. Can paper-based assessment for the technique mastering programme in first year calculus be replaced by computer-based assessment without a significant difference in performance?

2. Which and how much of the knowledge and skills embedded by the technique mastering programme in first year calculus is retained after a further two years of study?
1.3 Structure of the dissertation
The dissertation consists of six chapters and three appendices.
Chapter 1 is introductory, describing the setting for the research. The research questions are formulated, followed by an exposition of the structure of the dissertation and a statement of the significance of the dissertation.
Chapter 2 gives an overview of technique mastering in general and of related programmes
at other institutions. Assessment of technique mastering is discussed and a literature review is
conducted on experiences with similar programmes as well as on the long-term retention of core
knowledge and basic skills.
Chapter 3 puts the University of Pretoria under the spotlight. A short history of computer-based education at the Mathematics Department up to the present is given, after which the situation in 2001, when this study was conducted, is described.
Chapter 4 deals with computer-based assessment. A literature review is conducted, followed
by a discussion of the advantages and disadvantages of computer-based assessment. The chapter
concludes by focussing on WebCT as a platform for computer-based assessment.
Chapter 5 endeavours to answer the first research question on whether the paper assessment in the technique mastering programme can be replaced by online assessment. The methodology is described and the collected data is statistically analysed to reach a conclusion.
Chapter 6 deals with the second research question on the long-term effect of the technique
mastering programme. Both a quantitative and a qualitative investigation and conclusions are
described. A topical comparison sheds light on which topics should be considered as essential
knowledge.
1.4 Significance of this research
Universities are at the knowledge forefront, with the result that any university functions within
an ever-changing environment. As new knowledge becomes available through research it is incorporated into the teaching programme. Lately, not only is the importance of incorporating new knowledge emphasised but also the importance of incorporating innovative teaching methods. The 1990's saw technology blossom, offering new possibilities for the classroom in particular. Visualisation and ease of computation are two of the major advantages offered by technology. The internet became commonly used only in the last part of the nineties but is rapidly becoming an integral part of the education world. It would be foolish not to embrace the possibilities that technology and the internet offer. Levine, Mazmanian, Miller and Pinkman (2000) are of the opinion that
The teaching of mathematics, and all subjects, for that matter, must undergo constant change if students are to be prepared to enter a rapidly-evolving technological
world. It is no longer a question of whether to use technology in the teaching and
learning experience. It is now the question of what technology to use and how and
when to use it.
Calculus is a subject taught across the world and the concerns at the University of Pretoria in
this regard are by no means unique. Yet, our particular student composition, our facilities, our
curriculum and our approach may be somewhat different from that of other universities, especially
those abroad. Although a number of other universities have embarked on similar technique mastering programmes and computer-based assessment, none of these programmes can be repeated
exactly in our context. It is important to develop a programme that is tailor-made for this particular situation. It is also important to establish what the success of such a programme is. It was
decided to turn to computer-based assessment for the technique mastering programme in the calculus course at the University of Pretoria and, if deemed successful, it could be used as an example
for other universities and also for convincing the few remaining sceptics in our own department. It
is also important to list concerns and problems encountered, especially in the relatively new field of technology in education.
The long-term success of the programme is perhaps even more significant than the outcome of the switch to computer-based assessment. If the long-term effect is below expectation we need to seriously reconsider our strategy. Yet, irrespective of the outcome, results should assist in addressing two issues. The first issue is the validity of what we consider as must knows. Is what we consider as must knows merely the basic knowledge necessary for passing the first year course, or does it have long-term value, even beyond university? Are we selecting the appropriate must knows?
It would be of interest here to investigate the long-term performance in the different categories of must knows and techniques. The second issue to be addressed is the method of deployment of the technique mastering programme. What were the concerns and problems encountered and where is there room for improvement? Do students benefit maximally from the programme?
An informed opinion, if not answers to these questions, will be valuable in directing the future of the technique mastering programme and improving on a programme that is a necessary and integral part of one of the most significant courses in the department.
Chapter 2
Technique mastering
2.1 Terminology and examples
As stated in the introduction, there is a core of basic knowledge and rote skills necessary for venturing into deeper conceptual mathematics and problem solving, the basic knowledge referred to as must knows and the rote skills as techniques, both prerequisites for success. The process of acquiring the necessary must knows and techniques, as well as the assessment of these, is collectively referred to as the technique mastering programme at the University of Pretoria, as stated before. It is not always easy to distinguish between a must know and a technique, and often what is considered basic knowledge by one person could be seen as a rote skill by another. We expand on the terminology for clarification.
2.1.1 Must knows
When knowledge is vital for some activity and hopefully becomes so ingrained that it can be recalled effortlessly, it is a must know. Real-life examples are facts such as that you need to drive on the left-hand side of the road in South Africa; that there are 100 cents in a Rand; that the boiling point of water is 100 °C. In mathematics the most basic must knows are, for example, facts such as that after 1 comes 2 followed by 3 etc.; that 1 plus 1 makes 2; that if you divide by 0 you run into trouble.
Examples of relevant must knows:
1. The properties and shapes of the graphs of basic functions such as

   f(x) = mx + c,  f(x) = e^x,  f(x) = ln x,  f(x) = sin x

   are considered as basic knowledge, must knows.
2. When solving an equation where the unknown variable is in the exponent you often need the following basic knowledge:

   If y = e^x then x = ln y.

   This is knowledge that needs to be ingrained and is therefore classified as a must know.
3. The identity

   sin^2 x + cos^2 x = 1

   is used in differentiation, integration and many other places and so is a must know.
4. When solving a quadratic equation, for example, one often makes use of:

   ab = 0 if and only if a = 0 or b = 0.

   This is a must know that is frequently encountered in problem solving (a worked instance follows this list).
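As a brief worked illustration (my own example, not taken from the study guide), must knows 2 and 4 settle two routine equations:

$$e^{2x} = 5 \;\Rightarrow\; 2x = \ln 5 \;\Rightarrow\; x = \tfrac{1}{2}\ln 5,$$

$$x^2 - 5x + 6 = 0 \;\Rightarrow\; (x - 2)(x - 3) = 0 \;\Rightarrow\; x = 2 \text{ or } x = 3.$$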
2.1.2 Techniques
When a process is repeated so often that it becomes second nature it is classified as a technique. Real-life examples include driving a car, eating with cutlery and reading. In mathematics the most basic of these techniques are counting and simple addition.
Examples of relevant techniques are:
1. Factorisation of polynomials, solving simple equations.
2. Techniques by which to determine the intercepts of a function with the axes, the domain, the range and the asymptotes.
3. The basic rules of differentiation such as the product rule, the quotient rule and the chain rule.
4. Sketching of y = f(x + a) by using the graph of y = f(x) and moving a units to the left on the x-axis.
5. 'Multiplying by one', for instance to help find indeterminate limits of the form 0/0 such as

   $$\lim_{x \to 1} \frac{\sqrt{x} - 1}{x - 1} = \lim_{x \to 1} \frac{\sqrt{x} - 1}{x - 1} \cdot \frac{\sqrt{x} + 1}{\sqrt{x} + 1} = \lim_{x \to 1} \frac{1}{\sqrt{x} + 1} = \frac{1}{2}.$$

2.2 Technique mastering, gateway testing and assessment
What is called technique mastering testing at the University of Pretoria is similar to what
is referred to as gateway testing in some American universities, also known as barrier testing.
According to the Oxford dictionary (2003), gateway means
an opening that can be closed by a gate, or an entrance with an opening for a gate.
The purpose of most gateway tests is to allow a student to pass through a gate once he/she has obtained the necessary knowledge or skills to use as the key to unlock the gate when entering a new cognitive environment. In the USA the term gateway test is used to describe a test that
typically covers some collection of routine mechanical skills, rather than concepts. The purpose of
gateway tests is stated by Megginson (1994) as
to assure that some particular collection of skills has actually been learned.
Common practice is that if a student fails to master particular skills he/she is forced to relearn
these skills well enough to pass the gateway test. Gateway tests can be taken during a current
course or near the beginning of a follow-up course, to assure that students either have mastered
routine skills taught in the present course or the prerequisite skills needed for the new course.
Megginson (1994) describes the main difference between traditional and gateway tests as follows.
The difference between traditional testing and gateway testing is that if a student
performs poorly on a gateway test over a particular collection of skills, the student
cannot compensate by learning other collections of skills well enough to overcome the
effect of the poor performance on the gateway test. Instead, the student must go back
and relearn the particular collection of skills well enough to pass the gateway test. The
test does not go away until the student has learned the material well enough to pass it.
Gateway learning, or in our case technique mastering, is categorised in the two lowest levels of Bloom's Taxonomy (1956), those of knowledge and comprehension, and does not require entering the higher levels of application, analysis, synthesis and evaluation. The philosophy behind technique mastering is that a learner cannot construct meaning from what he/she has learnt and cannot venture into problem solving before he/she has mastered the basic knowledge and skills.
Technique mastering is embedded in the broader philosophy of mastery learning. The basic
principles of mastery learning in a course as described in the Educational Psychology Interactive:
Mastery Learning (Huitt, 1996b) include
• enough time to demonstrate mastery of objectives,
• breaking the course into manageable units of instruction and
• requiring students to demonstrate mastery of objectives for a unit before moving on to other
units.
Mastery learning is further explained by the Abaetern Academy (Huitt, 1996a) as
. . . a teaching philosophy that assumes the student can and will master course
objectives if the proper instruction and guidance is offered, and if the time required to
learn is available. . . . In short, this means everyone masters the objectives of a course
before they go on to the next course. This enables them to take on the next level of
learning with confidence. . . Mastery learning is not new, but the oldest learning model
we have. Would our parents encourage us to drive if we hadn’t yet mastered walking?
Although the quote above refers to course objectives, the material allocated to a technique
mastering programme will be a subset of the course material, yet the principles of mastery learning
as stated above apply.
The Kumon method of learning mathematics, initiated by Toru Kumon in 1954 (The Kumon
Philosophy, 2003), is a good example of a mastery learning programme, practised worldwide on
primary and secondary school level. Kumon was convinced that his son could solve problems if
the skills necessary to understand advanced level mathematics were taught one step at a time
and so he created a series of worksheets to help his son. The Kumon method relies on splitting knowledge into manageable portions, self-paced instruction and nurturing the basic reading and mathematical skills in order
to master, step by step, the skills and knowledge they need for success in higher
level math and reading comprehension.
An example of mastery learning is presented by the University of Nebraska-Lincoln in their
Keller plan courses (Titsworth, 1997) replacing the traditional lecture-listen-test model commonly
used at many institutions.
Students are tested over the same information multiple times until they achieve
“mastery” of the content.
Although mastery learning has a large component of drill and practice, this cannot be conducted
blindly. According to Rissmann-Joyce (2002) in a study of the educational system and practices
of Japan, it is not enough to 'drill-and-kill' specific skills, unless the underlying mathematical
principles are understood. She also emphasises that mastery learning can be used for mastering
critical thinking skills, referring to the Japanese situation where
students are routinely practising the complex reasoning and critical thinking skills.
Mastery learning need therefore not be merely the mindless practising of problems but could
lead to constructing new knowledge and strategies out of elementary problems, in order to venture
into solving more complex problems.
Engelbrecht (1990) describes the necessity of mastery learning in mathematics. He compares the lack of even a small piece of mathematical knowledge to a weak spot in a wall (of mathematical knowledge) causing the whole wall to tumble.
A weak point in any gateway testing or technique mastering programme is the subjectivity of
the assessor in determining the content and pass criteria, referred to by Engelbrecht and Harding
(2003b):
the judgement of what concepts or techniques are assessed and what constitutes a
criterion of satisfactory performance is in the hands of the assessor.
In order for any technique mastering programme to be successful, students need to be repeatedly exposed to the must knows and the techniques. They need to work through lists and lists of similar practice problems and they need to be motivated to do so. In this process they need to be formatively and continuously assessed. Common practice is to set a high 'pass mark' of at least 80% and to make available a series of tests. A student writes the first test and, if he/she fails, needs to prepare more and write the second test, etc., until a satisfactory level of expertise is reached.
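As a concrete illustration of this cycle, the sketch below (in Python) models the repeated-testing procedure. It is an illustrative sketch only: the 80% pass mark comes from the text above, while the function names and the representation of a test are hypothetical.

PASS_MARK = 0.8  # the high 'pass mark' of at least 80% mentioned above

def mastery_cycle(equivalent_tests, take_test):
    """Present equivalent tests one after another until the student
    reaches the required level of expertise or the series runs out."""
    score = 0.0
    for attempt, test in enumerate(equivalent_tests, start=1):
        score = take_test(test)  # fraction of marks obtained on this attempt
        if score >= PASS_MARK:
            return attempt, score  # mastery demonstrated
        # below the pass mark: the student prepares further and rewrites
    return None, score  # series exhausted without reaching the pass mark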
However gateway testing is assessed, it seems to be common practice to have sample tests or handouts available for students to practise on. Tests are often available on the institutions' web sites or as handouts so that students have ample opportunity to practise for a gateway test.
Students know exactly what is expected of them, and in many cases students are rewarded for passing the gateway programme or penalised for failing. Students know beforehand what the rewards for passing the test or the penalties for failing are. A reward for passing the programme could be a higher symbol in the total course grading, whereas a penalty for failing could be a cap on the symbol attainable for the actual course. In some cases students fail the actual course because of failing the gateway programme and have to retake the gateway programme during the next semester before they can pass the actual course.
2.3 Long-term retention of core knowledge and basic skills
Concern regarding the long-term retention of the core knowledge and basic skills acquired by students in the first year is not unique to the University of Pretoria. It is especially disconcerting when success in follow-up courses is hampered by a lack of recall of core knowledge. It is also not uncommon for teachers of third year modules to complain that students either cannot do the mathematics of the first year any longer or, worse still, claim never to have encountered it.
Although literature in this regard is scanty, a study by Anderson, Austin, Barnard and Jagger (1998) investigates the issue of long-term retention of core knowledge by examining the extent to which certain core first year material is retained and understood. The study involves 155 students, mostly volunteers, at fifteen different institutions in the UK, all in their third year of study. Seven questions were posed, of which one was on the application of the definition of differentiability, the only question related to the TM programme at UP. It should be noted that this question was on a slightly higher level than questions typically included in the TM programme at UP. Other questions covered a spectrum wider than calculus. The material of the test was carefully selected to be representative of most first year undergraduate mathematics courses in the
UK.
Anderson et al (1998) report that
... only about 20% of the responses were substantially correct and almost 50% did
not contain anything that could be deemed to be minimally ‘credit-worthy’. This
suggests that a considerable amount of what is taught to mathematics students in
general as ‘core material’ in the Þrst year is poorly understood or badly remembered.
... the retention of Þrst year material is demonstrably weak and suggests some cause
for concern among those who teach them.
It is as if the experience of students, attending one module after another, is such
that they tend to ‘memory-dump’ what they have had in previous modules, rather than
retain it and build it into a coherent knowledge structure. In many instances, there
was little that the students actually could recall, even in cases where they had gone on
to further study in the same topic later.
Miller, Mercer and Dillon (1992), in a paper on acquiring and retaining mathematics skills based on a study of elementary school children, mostly with learning disabilities, caution that memorisation does not imply the ability to solve problems (moving from the concrete to the semi-concrete and the abstract).
Rote memorization of math does not teach students to understand mathematical
concepts.
The aim of the TM programme at UP surely is not only to secure students’ core knowledge
but to create understanding as well. If students do not think creatively enough and rather rely on
memory, it could fail them in the long run.
Steyn (2003) claims that retention of information is linked to the way that it is taught. It is
the responsibility of the lecturer to guide students in this process of learning and retention. Steyn
proposes
. . . that new information is more easily understood and retained when it can be
related to existing information.
Weinstein (1999) also refers to the important role of the teacher when he states that
facilitators (lecturers) can have a tremendous impact on helping students to develop
a useful repertoire of learning strategies. Students should have the opportunity to reflect
on their learning (strategies) and teachers should not only ask students what they think
but also how they think.
Gagné (1977) views long-term retention of knowledge as an essential part of learning when
saying that
Learning is a change in human disposition or capability, which persists over a period
of time, and which is not simply ascribable to processes of growth. . . . The change must
have more than momentary permanence; it must be capable of being retained over some
period of time . . . it must be distinguishable from the kind of change that is attributive
to growth.
Linked to long-term retention of knowledge is the ability to integrate knowledge between different subjects and different years of study. Knowledge obtained in the first year of study of mathematics could reappear in a physics course, for example, perhaps in a different notation, but should ideally be recognised as essentially the same knowledge.
The lack of ability to integrate knowledge is a widely recognised problem at universities. The University of Massachusetts Dartmouth currently addresses this issue in their IMPULSE programme (short for the Integrated Math, Physics, Undergraduate Laboratory Science and Engineering Program) (IMPULSE, 2004).
The IMPULSE faculty chose to integrate math, physics and engineering in order to
connect concepts, applications, and methods. In the traditional classroom setting, these
subjects are typically taught in an isolated manner, leaving students the responsibility
to draw the connections and relationships between them. The end result is that, while
students spend hours learning derivatives, for example, they are unable to use them in
a different context.
According to John Dowd (IMPULSE, 2004), a physics lecturer at UMD, students benefit greatly from the IMPULSE programme.
Integration - the fact that students are learning in a cross-disciplinary kind of way is a key goal. So that when they’re doing math, they’re also doing physics. They realize
that these subjects somehow connect to each other. In the past, I’ve had students come
to me who know how to take the derivative of x squared; it's 2x. I ask them: "What's the derivative of t squared?" "I don't know; we didn't study that." They're doing
mechanics. Many of the derivatives and integrals are over time. They don’t make the
conceptual jump, going from one symbol to another. They don’t realize the symbol
can stand for anything.
Schattschneider (2004) addresses the issue of knowledge recall in successive courses when she reports on dealing with the fact that the precalculus course at Moravian College did not meet the expectations of students and lecturers alike.
In the calculus course, many students will insist that they have never seen or used certain techniques simply because the context is so different. (For example, solving simple linear equations certainly is covered in precalculus, but linear equations there were never solved for something called dy/dx or y′.) Inevitably, teachers need to review still again the non-calculus skills that are essential to solve calculus problems. There is often a high degree of frustration on both the part of the students and of the teachers; in the precalculus course and (for the survivors) in the calculus course that follows it, there is low morale in both camps.
Instead of the precalculus course, they successfully introduced a course called Calculus I with
Review, a one-year course covering both Calculus I and the precalculus course
but at a slower pace, integrating the review of precalculus concepts and skills as they
were needed.
Importantly, time was spent on identifying students' weaknesses and how to address them.
In conclusion, we quote from a report on the Capstone Experience, a programme at the University of Nebraska at Kearney with the purpose of integrating 'real-life' experiences into problem solving:
Many ideas we teach in the calculus sequence and in Foundations of Mathematics are
abstract and difficult to fully grasp; even for students who eventually earn doctorates
in mathematics, many ideas (Newton quotients and Riemann sums, for example) may
take years to internalize (The Capstone Experience, 2003).
2.4 Gateway testing at other institutions
Although gateway testing is more commonly used in the USA, related programmes are also encountered elsewhere. In the United Kingdom, for example, diagnostic testing is used for almost the
same purpose as gateway testing. The report titled Measuring the Mathematics Problem (Savage
and Hawkes, 2000) describes the necessity to
assess the knowledge and skills of individual students, and to identify strengths and
weaknesses of whole groups.
In the USA there is an increasing emphasis on the importance of fundamental mathematical
concepts and essential skills that ideally would ensure a foundation that would give all students
solid preparation for work and citizenship, positive mathematical dispositions, and a conceptual
basis for further study. In the state of Tennessee, for example, students are expected to pass
gateway tests in a variety of subjects including mathematics, language, arts and science in order to
accomplish the goals set by the State of Tennessee namely that all children should be able to read
well and reach a certain level of competence in mathematics and science. Details are contained in
The Master plan for Tennessee Schools, Preparing for the 21st Century ( 2003).
We highlight one case of gateway testing in particular, that of the Department of Mathematics,
University of Michigan (Megginson, 1994) where gateway testing is used extensively to achieve
either, or both, of the following objectives:
• To assure that students have the prerequisite mathematical skills needed for the course. At the University of Michigan, the gateway tests serve to indicate to students that their preparation is inadequate for them to continue with the present or follow-up courses. These tests can be given at the start of a new semester or at the end of the present one.
• To assure that students have mastered routine skills taught in the present course. Megginson
describes some of the advantages such as more time available during lectures for concepts,
multi-step problems and applications and assurance that the skills are mastered.
We compare gateway testing programmes at a few institutions in the USA in Table 2.1.
Common factors present in all institutions’ gateway testing include:
• no partial credit
• high pass mark
• each student has to master at least 70% of the core knowledge before he/she can attempt a follow-up test
• preparation tests are available
• limited time to write the actual test
• students know what is expected of them.
Table 2.1: Comparison of a number of USA institutions' gateway testing programmes

Institutions compared: NAU (Northern Arizona University), UMich (University of Michigan), UWM (University of Wisconsin-Milwaukee), UTK (University of Tennessee, Knoxville), BSU (Ball State University, Indiana) and APICS (Atlantic Provinces Council on the Sciences).

Course: Pre-Calculus, Calculus I or Calculus II
Preparation tests available: online sample tests; online interactive tests; topics outlined; handouts; links to other institutions
Number of retakes: limited (5; 3; twice per week; 3) at some institutions, unlimited at others, in some cases with a penalty
Contribution to course grade: 0% at all but one institution (40%)
Pass criteria: 80%, 70% or 80% at some institutions; 6/7, 8/10 or 20/25 at others
No partial credit: all institutions
Time limit for completion: all institutions
Format: pen-and-paper, multiple choice or online
Content: differentiation rules, integration, basic algebra, limits, trigonometry, basic math skills
Chapter 3
Technique mastering at the University of Pretoria
3.1 Computer-based education in mathematics at the University of Pretoria
When electronic calculators became available in the 1970’s, the mathematics department at the
University of Pretoria was quick to purchase a number of these and to encourage application in the
classroom. Ever since, it has been a priority in the department to keep abreast of technology and
its applications in education. The 1980’s saw the introduction of the personal computer (PC) but
it was only in the 1990’s that these became a common sight in offices. The concept of a computer
laboratory for students also became a reality in the 1990's and computer-aided instruction became
a topic of research in the department (Engelbrecht, 1995). Simultaneously, graphical calculators
became available.
The 1990’s also saw the birth of the Reform Calculus movement where more emphasis is placed
on visualisation and verbal interpretation. This movement had its impact on the mathematics
teaching approach at this university and technological aid became even more important. In the
USA a number of institutions started prescribing graphical calculators or supplying these in a
laboratory situation for student usage. Locally these were too expensive to prescribe. For a while
each lecturer in the department had a graphical calculator and as a group they shared an overhead
device connected to a graphical calculator. This was not viable because of the logistics involved
but also because students were only observers and not participants. A decision had to be taken
whether to go for graphical calculators or for PCs. After much discussion, the department decided
on a computer laboratory equipped with PCs. Initially one laboratory with six computers was
supplied, but subsequently another laboratory was added with an additional 30 PCs. On
campus there are presently various computer laboratories, totalling close to 2000 PCs.
The first project involving computer-based mathematics teaching ran in 1993 and involved a selected group of students with the software package Mathematica as a teaching aid. These students used the interactive feature of Mathematica to communicate with their lecturer. The experiment proved to be successful but too expensive and labour intensive to expand to the large group of first year students (Engelbrecht, 1995).
Another online education project, undertaken by the department, was a bridging course for ill-prepared students, initiated in the early 1990's. The purpose was to bridge the gap between school mathematics and first year university mathematics, especially for students who had missed a year between Grade 12 and the first year of university. The software package SERGO was used, this package being a mastery learning programme. Theory was followed by practice exercises, followed by an evaluation test with immediate feedback. The underlying principle is that repetition builds confidence and reinforces the must knows necessary for success in the first year calculus course. The programme randomly chooses questions from a databank according to the level of difficulty set by the instructor. This was the department's first experience with online assessment. Because of major changes to the syllabus, the bridging course was discontinued since only a small part of the course was relevant to the new syllabus.
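The selection principle described above, random questions drawn from a databank at an instructor-set level of difficulty, can be sketched in Python as follows. The data layout and function name are my own assumptions for illustration, not SERGO's actual design.

import random

def draw_questions(databank, difficulty, n):
    """Randomly draw n questions of the requested difficulty level.

    databank is assumed to be a list of dicts, each carrying at least
    a 'difficulty' key; random.sample raises ValueError if the pool
    holds fewer than n questions."""
    pool = [q for q in databank if q["difficulty"] == difficulty]
    return random.sample(pool, n)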
3.2 The current situation at the University of Pretoria
Technology has become an integral part of the teaching programme. Almost every staff member is techno-literate and has his/her own PC, and all students have access to a PC.
Matlab is a software package that is frequently used in the department at present. The programming and graphical features of Matlab are especially useful in problem solving and visualisation,
respectively. In the first year calculus course, these features are explored by students in projects,
an integral part of the course. Engineering students frequently use this software package to solve
more complex problems in their later years. Matlab is available in all the engineering computer
laboratories as well as in the department of mathematics. This powerful software package is a
standard tool both for classroom demonstrations and for practical session explorations. Matlab is
a computational tool, not an assessment tool.
Other frequently used software packages include Scientific WorkPlace (a type-setting programme combined with Maple, a powerful computer algebra system) and its smaller version Scientific Notebook. Both these programmes are used by some students for mathematical manipulations and presentations of projects, although the software is not available in the laboratories. Lecturers use these for mathematical purposes and also for typing articles and examination papers. Mathematica (as discussed above) perhaps offers the highest mathematical capabilities and is used by
researchers and students alike for problem solving, but is unfortunately also not available in the
laboratories.
Engelbrecht and Harding (2001a, 2001b) describe the entrance of the department into the
new telematic era with the introduction of the first web-based mathematics course in 2000. The
university subscribes to WebCT as a platform, as previously stated. There are now a number of
fully online mathematics courses and even more hybrid courses running. The university supplies
the infrastructure in terms of computer laboratories and support and the department adds its
knowledge and know-how. An increasing number of students have internet access at home and so
this medium is becoming increasingly viable and appealing to students.
3.3 The technique mastering programme at UP in 2001
In 2001 the technique mastering programme had been running for a number of years in a fairly simple format. Appropriately categorised problems, requiring techniques rather than insight, are listed in the study guide (Appendix A). These problems are somewhat repetitive and provide ample practice opportunity. Students have access to all the problems from day one. Students are motivated to work systematically through these lists of problems to reinforce the techniques. They work on their own and seek help if necessary, either from their own lecturer, from a tutor or elsewhere. Two rounds of TMT (Technique Mastering Testing) take place per semester, preferably during the week preceding a semester test so that students can benefit from the feedback. Up to the year 2001 students were assessed on paper, repeatedly, until a satisfactory level of performance (80%) was obtained. The tests consisted of a subset of the exact questions listed in the study guide. The fact that the system allowed a student to write consecutive TMTs until he/she had reached a satisfactory level of performance meant that assessment was a labour intensive process for the graders.
The basic format was retained in 2001 but with one significant difference. After one year's
experience of running online courses using WebCT (Engelbrecht and Harding, 2001a, 2001b), it was decided to investigate the possibility of running the technique mastering assessment on WebCT. The problems listed in the study guide were shaped into WebCT format to create a question databank. A number of quizzes were set to accommodate students who had to write more than once. Students were briefed on how to access the website and how to go about accessing the online quizzes. Certain groups of students did their TMTs online, certain groups continued with the paper testing, and the research sample of students was exposed to both the paper and online versions of the TMTs.
Chapter 4
Computer-based assessment
4.1 Definitions and terminology
Computer-based assessment (online assessment for short), in contrast to pen-and-paper assessment (paper assessment for short), makes use of some software package to grade the student’s
input, which in most cases is fed into the computer via a keyboard. On the lowest level are multiple choice questions where students shade answers by pencil on an answer sheet, assessed by
an optical reader. On the other end of the scale are courses that are presented totally online in
a web environment and where online assessment forms an integral part of the course. It is also
possible to use only selected sections of a web environment, such as only the assessment tool.
Such a course with face-to-face tuition but with one or more components on the web is referred
to as a hybrid course. WebCT and Blackboard are examples of virtual learning environments
that create a web environment for educational purposes. Such extensive software packages offer
a variety of assessment options including different question types, the option to import graphics
into questions, availability settings for tests, an expandable question databank, a databank for
student records and many more. In the project under discussion, online assessment was used in
a web environment. An extensive report on web-based learning and assessment can be found in
Engelbrecht and Harding (2003b).
4.2 Online assessment
Assessment innovation, including web-based assessment, is the topic of a number of projects. Examples include the Innovation Assessment Program (United Inventors Association, 2003), the
Research on Assessment Practices of the Freudenthal Institute (2003) in the Netherlands and work done at the Indiana University Center for Innovation in Assessment (2003) and The MathSkills Discipline Network (2003) in the United Kingdom. Engelbrecht and Harding, in a paper on online assessment (2003a), give an overview of the topic. We quote loosely from this paper while simultaneously expanding on the topic.
Changing one's teaching approach without due attention to assessment is not sufficient. A number of authors emphasise the importance of readdressing assessment practices while teaching practices evolve. Gretton and Challis (1999) ask:
Assessment has various purposes. Is it for grading and sorting students? Is it
for encouraging learning? The answer is yes to both, but when both technology and
students’ skills are evolving so rapidly, then assessment style must also evolve to ensure
it continues to fulfil these objectives.
Smith and Wood (2000) link assessment and learning:
. . . appropriate assessment methods are of major importance in encouraging students to adopt successful approaches to their learning.
The answer to the question why one should venture into online assessment probably lies in
the fact that more and more situations are presently created where online assessment seems the
obvious route to take. Not only are computers in common use in most teaching environments, but
internet access is rapidly following the same route. We are currently experiencing what Miner and
Topping (2001) describe as The Web Revolution with major impact on society:
The World Wide Web is nearly a decade old now and its effect on math and science
communication has already been amazing. An enormous amount of information has
been made available on the Web. Increasingly, cross-referenced research articles and
abstracts are available in searchable online archives. In education, engaging interactive materials are widespread, and Web-based course management tools are becoming
frequent virtual companions to traditional pedagogical tools.
When deciding on online assessment in a course it is necessary to decide what aspect of the
assessment will be done online. Keeping normal term tests in a paper format and opting for online
assessment for the technique mastering programme seems to fall in line with the experience of Wise
(1997) in a study of Maryland’s functional tests for high school graduates in mathematics, reading
and citizenship:
27
University of Pretoria etd – Du Preez, A E (2004)
The CAT (Computerised Adaptive Test) versions have augmented rather than supplanted the paper-and-pencil versions: they are typically used with transfer students
and students who are retaking one or more test.
In an overview study of computer-based testing, Patelis (2000) urges educators to consider online assessment:
The age of the number-two pencil in standardised assessment is far from over, but
CBT (Computer-Based Testing) is becoming more popular . . . educators . . . are often
unaware of the options available to them.
Gateway or technique mastering testing, as discussed in a previous chapter, is particularly
suited to online assessment. It is important to note, however, that online assessment is not only
used for testing the must knows. The traditional perception is that MCQs can only be used for
testing lower level cognitive skills. This is not true, according to Hibberd (1996):
. . . they can be implemented to measure deeper understanding if questions are
imaginatively constructed.
An important advantage of online testing is that students can work any time, any place and are not limited to teacher-organised opportunities. In a report on The Pros and Cons of Online Assessment, issued by Customised Training Development (CTD for short) (2003), an independent research firm of San Francisco, it is maintained that immediate feedback is essential in formative assessment.
Kamps and Van Lint (1975) found that a traditional paper calculus test can be replaced with a
similar multiple choice test provided the questions are carefully selected so that the same concepts
are tested. They comment on the large amount of time necessary to set such a test, but also
mention the advantage of an accumulated question-bank over a period of time.
In a successful project at the University of Wolverhampton (Thelwall, 2000), computer-based tests were designed to replace paper tests, showing that this is feasible on a large scale.
Each test generates an exam randomly from a possible 80 000 variations and then
delivers the exam, marks it and gives feedback. The project delivers around 2 000 assessments per year in various forms: fully automated tests, automatically generated randomized web tests with automatically generated marking schemes, web self-assessments
with JavaScript, and automated feedback marking schemes with word macros.
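Random variation on this scale is typically achieved by parameterising question templates. The Python sketch below illustrates the general idea under that assumption; it is my own example, not the Wolverhampton project's actual implementation.

import random

def differentiation_item():
    """Generate one random variant of a routine differentiation question."""
    a = random.randint(2, 9)  # coefficient
    n = random.randint(2, 5)  # exponent
    question = f"Differentiate f(x) = {a}x^{n} with respect to x."
    answer = f"{a * n}x^{n - 1}"
    return question, answer

This single template already yields 8 x 4 = 32 distinct variants; a bank of such templates, each with several parameters, quickly reaches tens of thousands of variations.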
Boyce and Ecker (1995), in their research on the computer-orientated calculus course at Rensselaer Polytechnic Institute, see a widening scope of possibilities as a benefit of computerising routine tasks:
As some computational tasks are routinized, greater diversity of applications can
be considered.
From the teacher’s perspective the value of decreased grading time when assessing online should
not be underestimated. When working with large groups of a hundred and more students the
grading load impacts on valuable research time and affects staff budgets. Developing the tests is
time consuming but a question databank can be developed that eventually saves time.
Online assessment has the further advantage of enabling the teacher to readily obtain question-by-question profiles. Subsequent refinement of questions and tests can be carried out. The empirical data that becomes available makes online testing a valuable diagnostic instrument. The objective of every MCQ should be clearly understood, and a careful selection of the distracters can itself be utilised to provide diagnostic information (Engelbrecht and Harding, 2003a).
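Such a question-by-question profile can be computed directly from the stored responses. The Python sketch below tallies, for each question, how often every option (correct answer or distracter) was chosen; the response format is assumed for illustration.

from collections import Counter

def option_profile(answer_sheets):
    """Tally, per question, how often each option was chosen.

    answer_sheets is assumed to be a list of dicts mapping a question
    identifier to the option the student selected."""
    profile = {}
    for sheet in answer_sheets:
        for question, option in sheet.items():
            profile.setdefault(question, Counter())[option] += 1
    return profile

A distracter that attracts a large share of responses flags a common misconception and shows where feedback or teaching needs attention.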
Online testing also has disadvantages. At the moment, one of the biggest disadvantages of
online mathematics is symbolic representation.
. . . the Web can be a powerful vehicle for communicating mathematics and science
in general, just as it is revolutionizing communication in other disciplines and fields of
endeavor. In spite of this, it is nonetheless true that there are still significant obstacles
to publishing mathematics on the Web that other disciplines do not face. As has
long been the case in print, authoring and publishing mathematical notation can be a
complicated business (Miner and Topping, 2001).
MCQs also have shortcomings. First of all, there is no partial credit for these
questions. MCQs with distracters based on misconceptions can reinforce such misconceptions unless
immediate feedback is available (Engelbrecht and Harding, 2003a). Furthermore, one should always
remember that guessing has to be taken into account in some way or other.
Turning to the present study: When students need to do online tests in a secure environment,
the logistics of organising computer laboratory sessions for a large group of about 900 students are
formidable. A sound infrastructure with enough workstations is a prerequisite before embarking
on online assessment for a large group. It is also time consuming to develop questions for the
databank.
Scepticism amongst staff regarding the validity of assessing mathematics online is also still
common and there is a resistance to change among less computer literate staff members. An
aspect to keep in mind when converting to online assessment is that it could possibly be a new
experience for students and probably a new experience for some lecturers.
It is important to keep in mind that changes of this magnitude, in any field, are usually
accompanied by anxiety and ‘fear of the unknown’. Research in computer-based testing indicates
that once individuals have taken a computer-based test, they become more comfortable with the
new testing environment and find that the benefits associated with the test offset those aspects of
the test that are new and unfamiliar. (Yopp, 1999)
Because of the ‘fear’ learners experience the first time they encounter online testing, it is preferable for teachers to introduce practice tests. Students can practise quizzes in their own time to
familiarise themselves with the syntax of mathematical symbols and functions.
According to CTD (2003) online assessment is not always considered by educators as a viable
option.
Online assessment has proved to be a contentious area for educators. The common
image of online assessment is that of computer generated tests and many teachers will
immediately discard this notion seeing it as inappropriate for their subject area.
Other than self-check quizzes and online tests, online assessment can take many other forms
such as
• submission of assessment items via email or a subject delivery system,
• contributions to a bulletin board, either individually or in groups, and
• peer mentoring and commenting on a bulletin board.
Netshapala (2001) discusses the advantages and disadvantages of using computers in the mathematics classroom. Amongst the advantages for the classroom is the fact that learners
may repeat the same (or a similar) test without feeling personal rejection from their instructor. The
facelessness of the computer can be an advantage to a student with low self-esteem. There is no
fear of exposure. An individual student need not feel rejection from his/her classmates either, since
each individual can continue at his/her own pace. There is also value in re-writing a test since a
student has to prepare and rethink the same technique again and hopefully master it in the end.
4.3 Examples of online assessment
On investigating different models, that of the Old Dominion University in Virginia in the USA
stands out as an example of how students can practise skills interactively. Old Dominion University
introduced their Reform Calculus project in 1992 (Bogacki, Melrose and Wohl, 1995). Their
Interactive Tutorials and Tests, available on the Internet, are impressive. The use of these software
packages enables a student to type his solution using prescribed notation such as sqrt(x), sin(x),
/, * etc. The student’s input can be previewed and confirmed before submitting to avoid typing
errors, processed by the software application Mathcad. A student can ask for a hint to solve a
problem for the price of a few marks. This model also incorporates the computer algebra system
Maple that enables students to draw graphs, a skill that could prove to be useful for future careers.
Currently anyone is allowed to use the software and it is available to any institution as shareware,
for non-commercial use, provided permission is granted from the authors. Unfortunately this model
is not compatible with WebCT.
DIAGNOSYS is a software package widely used in the UK (Appleby, 1997).
It uses a deterministic expert system, that is, inferences drawn from a student’s
answers are considered to be definite. By careful design of both the network of skills
and of the questions that test those skills, useful conclusions can be obtained. It should
be noted, however, that such expert systems require extensive design and validation,
and can still produce highly questionable results (e.g. in medical diagnosis).
Respondus and Questionmark are two powerful and widely used software packages for online
assessment in Mathematics. Both these packages offer a variety of question types and an excellent
equation editor. Respondus’ compatibility with WebCT makes it an appealing option.
WebCT is an extensive software package that provides a platform for online education via the
internet. When designing a course with WebCT there are a variety of tools to choose from such
as the Calendar tool for posting the schedule of events, the Discussion tool that offers a bulletin
board for posting messages by both lecturer and students, the WebCT Mail tool for personal
communication, the Chatroom facility for synchronous communication, Content pages for posting
study material, a Whiteboard facility for freehand writing, the Quiz tool for assessment and My
Record for a record of the student’s marks to name but a few. WebCT is just one example of a
virtual learning environment, others include Blackboard and eCollege.
The powerful Quiz tool provides a variety of question types - multiple choice, matching, single
answer, calculated (parameter dependent) and paragraph questions - as well as a Question Databank facility. The designer sets a quiz, determines the availability settings and sets an optional
password. Once the quiz is graded an abundance of statistics becomes available.
Because the University of Pretoria subscribes to WebCT all lecturers have the option to make
use of this platform. The assessment on which this study reports was conducted via the Quiz tool
of WebCT.
4.4 Comparing online and paper assessment
Engelbrecht and Harding (2003a) compare performances in online assessment and paper assessment
in their model of presenting a course via the internet. They conclude that although there is no
significant difference in performance it is advisable to combine online and paper assessment modes.
There has never been any “disturbing” difference between the online and paper
sections. . . Students do seem to perform slightly better in the online section in general
although this is marginal in most cases . . .
In this study students show a slight preference for online testing but this is again marginal.
In a study at the North Carolina State University, written homework problems were replaced by
online problems, assessed by the software programme WebAssign (Bonham, Beichner and Deardorff,
2001). Student performance was similar between the paper and web sections. Although students
performed slightly better in the online version, the difference was not statistically significant.
A comparison between online and paper testing in the gateway programme of the University
of Michigan (discussed before) is given by LaRose and Megginson (2003). For the purpose of
their study the online entrance gateway test for Calculus II was replaced by a paper test. During
the courses only online testing was used. Student survey results demonstrate students’ perception
that the online test’s effectiveness is equivalent to that of the paper version for Calculus I. The
Calculus II students somewhat preferred the paper gateway.
Chapter 5
Comparison between paper and online TMTs
5.1 Scope of the study
The study reports on research with a convenience sample consisting of a group of 103 students
taught by the author. The group consists of Afrikaans speaking students, the majority studying
computer engineering. Table 5.1 gives the distribution of the Grade 12 marks for mathematics for
the sample group as well as for the whole first year group of engineering students. From this table
it is clear that from an academic point of view, this group can be considered as representative of
the whole first year group of engineering students. Although most of the students have a strong
academic background in mathematics, the group also contains students in the five-year plan
(extended programme) who are as a rule less well prepared for university study in mathematics.
Scheduling of activities in the technique mastering programme was simplified by the fact that
the entire sample group followed the same timetable. Since the students were all reasonably
computer literate, working on the web posed no problem.
As mentioned earlier, this group of students are all Afrikaans speaking and are all studying in
the same programme - making them more of a homogeneous group of students than the rest of the
Calculus students. Because of the homogeneity of the group, it would be dangerous to generalise
the results obtained from this group of students, despite the fact that academically they form a
representative sample of the entire group.
Symbol    Sample group %    2001 first year Calculus students %
A         32.04             33.42
B         22.33             20.73
C         33.98             26.76
D         11.65             17.46
E         0.0               1.38

Table 5.1: Distribution of grade twelve mathematics symbols
5.2 Research methodology
Students wrote two rounds of TMTs during the semester. The first (TMT1) was written in the week
before the first major semester test, about five weeks into the semester, and the second (TMT2)
one month later, in the week before the second semester test.
In both cases, all students did two similar TMTs during the same lecture period, one a traditional paper test and the other an online test. The questions and standard in these tests were
similar, but the online tests consisted mainly of multiple-choice and matching questions. See Appendix B for the online versions of the TMTs. The questions used in the paper TMTs are included
in the discussion in Chapter 6. The weights of the different topics in the online version of TMT1
were not exactly the same as those in the paper TMT1. This was corrected for TMT2. Here the
questions were carefully selected to test the same skills for both versions of the tests.
Students did the online tests under supervision during which time assistance on the technical
side was available.
As mentioned in Section 3.3, students who did not get 80% in either the online or paper
TMTs had to rewrite. These rewrites were done online only and do not form part of this study.
In this chapter the results of the online and paper TMTs are compared empirically for both
TMT1 and TMT2 .
Prior to the date set for the second TMT a questionnaire was issued on students’ attitude
concerning the two modes of assessment and on the technique-mastering programme in general.
5.3 Research results
Table 5.2 displays the distribution of the marks, number of students that participated, the percentage of the students that passed, the median and the mean of the two TMTs. For both the
TMTs we also include a column indicating the results for the best of the two tests. The reason for
this is that in order to pass the test, it was required that a student pass only one of the online or
paper tests.
Marks distribution                        TMT1                        TMT2
                                  Paper  Online  Best        Paper  Online  Best
Number of students                       100                        97
Number of students that passed     53     60      83          54     53      66
% of students that passed          53     60      83          55.7   54.6    68.0
Mean                               7.4    7.4     8.6         7.2    7.4     8.1
Median                             8      8       8           8      8       9
Standard deviation                 2.0    2.2     1.4         2.4    2.3     2.1
Pearson correlation coefficient          0.016                       0.604
Spearman correlation coefficient         0.011                       0.558
Sign test for differences: p-value       0.747                       0.428
(‘Best’ = best of the Paper and Online marks)

Table 5.2: Comparison between the results of the online and paper TMTs
Although the averages for both TMTs correspond favourably, it should be verified statistically
that there is no difference in average performance between the paper and online versions of the
TMTs. Because the data recorded on performances in the four tests is not normally distributed, a
t-test is not appropriate and the non-parametric sign test for differences is applied. The p-values
of 0.747 for TMT1 and 0.428 for TMT2 point to no significant difference in average performance
on a 5% level of significance between the paper and online versions in either of the two TMTs.
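The statistics reported here are standard; a minimal sketch of how they can be computed (this is an illustration, not the study's actual analysis script, and the function name and data layout are assumed) is:

    # Compare paired paper and online marks for one TMT: correlations plus
    # the non-parametric sign test used in this chapter.
    from scipy import stats

    def compare_versions(paper, online):
        # Pearson and Spearman correlation between the paired marks.
        pearson_r, _ = stats.pearsonr(paper, online)
        spearman_r, _ = stats.spearmanr(paper, online)
        # Sign test: count positive differences among the non-zero differences
        # and test against a binomial(n, 0.5) null hypothesis.
        diffs = [p - o for p, o in zip(paper, online) if p != o]
        n_pos = sum(d > 0 for d in diffs)
        sign_p = stats.binomtest(n_pos, n=len(diffs), p=0.5).pvalue
        return pearson_r, spearman_r, sign_p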
The Pearson correlation coefficient r = 0.016 and Spearman correlation coefficient r = 0.011
for TMT1 indicate poor correlation between the paper and online versions of the tests. Table
5.3 shows the student performance distribution (percentage-wise) split into three categories, for the
results of the paper version of TMT1 compared to the online version. The cells away from the
main diagonal of this table are heavily loaded, supporting the poor correlation between the two
versions of TMT1 .
The low correlation between the paper and online versions of TMT1 is reason for concern.
Although the averages for the two versions are comparable it appears that it is not the same
students who do well in both. The question of reliability immediately comes to mind. Ideally
a reliability coefficient should be calculated to draw a statistically valid conclusion regarding the
consistency with which students respond to the paper and to the online tests. The low correlation
seems to point to a possible problem regarding the reliability. Unfortunately the questions in the
two versions were not matched, nor are question-by-question results available for the online
version, negating the possibility of calculating a reliability coefficient. For TMT2 care was taken
to match the questions in the paper and online versions and there the correlation is much higher
as will be discussed shortly, although it would be wrong to conclude that the former (matching the
questions) implies the latter (higher correlation coefficient). Considering the extenuating factors
involved, TMT1 of 2001 should probably be regarded as a trial run and a learning curve for
both teachers and students. Students were unfamiliar with this mode of testing and experienced
difficulty with the technical side, teachers were still familiarising themselves with the art of posing
online questions and the problems of symbolic notation etc. Another possible explanation for the
low correlation could be that in order to pass, students had to pass only one of the two tests. It
is possible that once they passed the paper test they had little motivation for passing the online
version of the test while students who did not pass the paper version performed better in the online
version. This could explain why the averages are comparable but the correlation is low.
From the data collected on TMT1 alone it would be unfair to conclude that the paper mode
of testing could be replaced with online testing, and the first research
question would remain unanswered.
Before turning to the results of TMT2, note that for TMT1 only a small percentage (30%) of students passed both
the paper and online versions of the test. It should be noted, however, that this 30% represents
56.7% of the students that passed the paper version and 50% of the students that passed the online
version.
The Pearson correlation coefficient r = 0.604 and Spearman correlation coefficient r = 0.558
for TMT2 indicate stronger correlation between the paper and online versions of TMT2. Table
5.4 shows the student performance distribution (percentage-wise) split into three categories, for
TMT1              Paper: 0-5   Paper: 6-7   Paper: 8-10   Total
Online: 0-5            2            7             8          17
Online: 6-7            4            4            22          30
Online: 8-10          10           13            30          53
Total                 16           24            60         100

Table 5.3: Student performance distribution (percentage-wise) comparing the results of the Online
and Paper TMTs for TMT1
the results of the paper version of TMT2 compared to the online version. In this case the diagonal
cells are more heavily loaded.
For TMT2, the percentage of students that passed both the paper and online tests increased
from 30% (TMT1) to 42.7% and represents 75.9% of the students that passed the paper version
and 77.3% of the online version, respectively.
The percentage of students that passed either of the two tests decreased from 83% (TMT1)
to 68.6% (TMT2). A possible reason for this could be that students realised that the marks did
not contribute to their final mark, but this again is mere speculation and beyond the scope of this
study.
TMT2              Paper: 0-5   Paper: 6-7   Paper: 8-10   Total
Online: 0-5           14.6          5.2           4.2        24.0
Online: 6-7            8.3          3.1           8.3        19.7
Online: 8-10           4.2          9.4          42.7        56.3
Total                 27.1         17.7          55.2        100

Table 5.4: Student performance distribution (percentage-wise) comparing the results of the online
and paper TMTs for TMT2
In Figure 5.1, we compare the distribution of the marks for the online and paper versions of each
of TMT1 and TMT2. It is difficult to draw any conclusion from these figures except that in both
tests the best of the two marks (paper and online) is significantly higher than either of the two, which
seems to indicate that students have different assessment preferences.
[Bar charts showing, for TMT1 and TMT2, the percentage of students in the mark categories 0 to 5, 6 to 7 and 8 to 10 for the Paper, Online and Best marks.]

Figure 5.1: Comparison of distribution of marks for TMT1 and TMT2
5.4 Responses to the questionnaire
In the period between TMT1 and TMT2, 92 students completed a questionnaire (Appendix C).
Apart from certain questions on demographics, students were asked to give reasons for their performance in TMT1, and to indicate (and explain) their preference between the paper and online
tests. The questionnaire survey reveals results that could explain some of the empirical results.
About 86% of the students who completed the questionnaire passed TMT1 in a first attempt,
either paper or online. Of those who failed the paper test a small percentage (6%) are of the opinion
that they failed because their presentation of the answers was not mathematically
correct, while a larger percentage (14%) admit that they were not sufficiently prepared for the test.
Students who failed the online TMT1 offer the following reasons for their performance:
• They were not properly prepared.
• They guessed the answers.
• They blamed the computer: they knew the answers but did not type them correctly.
• A few had technical problems.
Of the 28 students who were not successful in their first attempt in either the paper or online
tests, 5 wrote the online test twice in order to pass and 23 needed 3 attempts to pass the online
test. Only 9 of these students consulted their lecturer or a tutor for assistance in preparing for the
retake.
A large percentage (69%) of the students prefer the online version of the test to the paper
version. They offer the following reasons:
It is user friendly.
You do not have to draw the graphs yourself, you can just choose one.
One can work backwards toward the question.
I like multiple choice questions.
I feel comfortable in front of a PC.
Students who prefer the paper version explain their preference as follows:
One can get partial credit.
I like to write out the answer.
I think while I write.
I don’t feel so pressed for time.
It is difficult to concentrate on the screen.
The given answers/notation for the online test confuse me.
Some general observations from the questionnaire may be of value in future.
• A large percentage of the students prefer the online tests and found the tests easier than the
paper tests, although this is not reflected in the results in Table 5.2.
• A number of students comment that when failing the first attempt, they would sit for the
second attempt without preparing in advance.
• A number of students confess that they guessed the correct answer in the multiple-choice
questions rather than working towards a solution.
• A suggestion arising from the results of the questionnaire is that there should be more than
two TMTs during the semester, each covering a smaller part of the syllabus content.
5.5 Concerns and problems
A number of concerns related to the WebCT software package have been expressed, the first being
the difficulty with mathematical symbols on the web. In 2001 it was somewhat difficult to get
mathematical symbols on the web. WebCT has since introduced an equation editor that makes it
easier. Another option is to link a software package such as Respondus to WebCT. Nevertheless, in
2001 there were still only a limited number of symbols available and students had to use notation
such as sqrt(x) for √x, int(x^2+1)dx for ∫(x² + 1)dx and d/dx(x-1) for d/dx (x − 1). In spite
of the fact that this notation was explained, students were not familiar with typing mathematical
symbols in this manner and students commented in the questionnaire that this was a problem.
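To illustrate the kind of linear syntax involved, the following sketch (using the SymPy library, which was not part of the study) shows how such typed input corresponds to the intended mathematics:

    import sympy as sp

    x = sp.symbols('x')
    sp.sympify('sqrt(x)')       # the typed form sqrt(x) parses to the square root of x
    sp.integrate(x**2 + 1, x)   # the intent of int(x^2+1)dx: x**3/3 + x (plus a constant)
    sp.diff(x - 1, x)           # the intent of d/dx(x-1): 1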
Security is a problem when students do online tests. Students could access tests from their home
computer or use any computer laboratory at the university. Although a password was required,
there is no guarantee that a student wrote his or her own test. To address this problem, supervised
sessions were scheduled in the large computer laboratories to enable a larger number of students to
write simultaneously, but a number of logistical problems were experienced during these sessions.
During informal discussions with students, it became clear that the objectives that many students have with their studies differ somewhat from ours as teachers. Our goal is that students
should master the mathematical content on a long-term basis. Some students do not value the
importance of basic mathematical skills for future use. Their primary objective is to pass the
course (short term). Students prefer arguing for extra marks in order to get to the required 80%
minimum for a TMT, rather than preparing again and attempting a rewrite, which would benefit them
in the long run.
Students also do not like the repetitive nature of the programme. They get bored with rewriting
a TMT and many confess to not preparing at all before a second attempt at a TMT, just hoping
that they will guess a sufficient number of answers correctly when doing the TMT online. This
is counterproductive and another approach should be considered. If we could bring students to
realise what the true aim of the TM programme is, they would perhaps be more prepared to put
effort into the programme. The structure of the TM programme is not challenging enough for
students to encourage total engagement in the programme - they do it because they have to. The
system should be modified in such a way that students take part in the programme with greater
motivation.
The fact that the results of the TMTs do not contribute towards their final mark may be an
important reason for the lack of enthusiasm for the TM programme. It is also difficult to develop
a mechanism to force students to write the TMTs. The possibility of changing to a system where
TMTs become part of regular (contributing) tests was considered and implemented in 2002.
5.6 Conclusions
The aim of the research reported on in this chapter is to compare the online and paper versions
of the technique-mastering tests in order to determine whether the paper TMTs can be replaced
by an online assessment programme. The empirical comparison between the two modes of testing
indicates no significant difference in average performance but low correlation between the paper
and online versions for TMT1 raises uncertainty regarding reliability. The results of TMT1 on
its own are therefore inconclusive and the first research question remains unanswered. However,
results of TMT2 not only show no significant difference in performance between the paper and
online versions but also strong correlation. Based on these results the question as to whether
paper testing can be replaced by online testing in the technique mastering programme can be
answered affirmatively.
There are indications, however, that it is not necessarily the same students who do well in the
online TMTs that do well in the paper TMTs. Students have different assessment preferences and
the paper assessment format could perhaps be enhanced by adding an online component rather
than being entirely replaced by the online tests. Different modes of assessment can provide for
this spectrum of assessment preferences. This conclusion is supported by the preferences expressed
by students in the questionnaire, indicating that the majority of the students prefer the online
assessment mode but that a substantial number of students still prefer conventional paper testing.
There are advantages and disadvantages to computerising the TMTs and a number of logistical
problems. Time saving on marking is a definite advantage. A disadvantage is that some students
as well as some teachers experience the initial internet exposure as new and different. Most of the
students are familiar with the internet but they have had little experience in using the internet in
their learning. Since this is the only section of the course presented on the internet, some students
experience this as overwhelming. A session was scheduled before the first test to give students an
opportunity to familiarise themselves with the procedure but this was still a very new experience
for many students. For many teachers, on the other hand, running an assessment programme on
the internet by posting tests on the web is also a new and overwhelming experience and some
teachers are reluctant to acquire the new expertise necessary for running such a programme.
WebCT tests are only used in the technique mastering programme and are not used in the
general assessment of the course, simply because of the logistical problems when dealing with a
large number of students. As could be expected, the teachers involved in the course decided in
2003 to use the best of both worlds and to change to a system of using multiple choice questions
that are answered by the students in pencil on paper and then marked by an optical reader. This
system has a smaller marking load than conventional paper tests and the teachers find it logistically
easier to run than internet-based testing.
It has to be emphasised that in this study we do not attempt to do a full investigation into
the problems of online assessment but rather investigate the possibility of using online tests in our
technique mastering programme. Since this study was done, Engelbrecht and Harding (2003a and
2003b) have done further studies on the use of online assessment in undergraduate mathematics
courses at the University of Pretoria.
Chapter 6
Longer-term effects of the TM programme
6.1 Scope of the study
For the research on the longer term effects of the TM programme we focus on a sample group of
third year students of 2003. The research consists of three parts - a quantitative investigation, a
topical comparison and a qualitative investigation based on interviews with selected students. In
the quantitative investigation we compare the performance in the two TMTs of 2001 (TMT1 01
and TMT2 01 for short) with that of the two identical TMTs of 2003 (TMT1 03 and TMT2 03 for
short), respectively. In the topical comparison we discuss the 2001 and 2003 performances per
question. In the qualitative investigation we look at perceptions of students on the value of the
TM programme, recorded from interviews with individual students.
6.2 Research methodology
The investigation in 2003 was done towards the end of the first semester. At that time only 43
of the original sample group of 103 students were available. These were the same computer engineering students who were now doing their final mathematics module, Stochastic Processes. By
then they had completed three semesters of Calculus, one semester each of Linear Algebra, Differential Equations and Numerical Methods and had nearly completed one semester of Stochastic
Processes in the mathematics department. Some of the other engineering courses such as Computer Programming, Circuits, Mechanics, Physics and Digital Systems also require mathematical
knowledge. One can safely assume that these students will have been exposed to a substantial
amount of mathematics in the various mathematics modules as well as in related subjects.
Without prior notice, the sample group was asked to write both TMTs in the same session.
The tests were in paper format only and were exact replicas of the TMTs of 2001. The TMTs
written in 2003 were then graded using the same criteria as in 2001. An overall comparison is
done by comparing the final marks for the TMTs of 2001 and 2003. The comparison between 2001
and 2003 focuses on this group of 43 students. The question by question results of 2003 are then
compared to those of the 2001 tests to determine the longer term effect of the technique mastering
programme.
To get individual feedback, and as a qualitative measure, 14 students were selected from the
group of 43, based on their 2001 final mark for the first semester, first year Calculus
course. There were 6 students with a final mark of more than 75%, 4 students with a final mark of
between 60% and 75% and 4 students with a final mark of between 50% and 60%. These students
were interviewed to report on their perceptions of the TM programme of 2001. We report in detail
on four interviews and quote other students. The interviews consisted of a number of structured
questions concerning the technique mastering programme in general as well as questions concerning
their personal performance in the TMTs of 2001 compared to that of 2003.
6.3 Quantitative investigation
We look at the correlation between the different tests and compare the means of the TMTs of
2001 and 2003. The purpose of the comparison is to use the results diagnostically to determine
a strategy for similar future programmes. Table 6.1 reflects the results of the two TMTs of 2001
together with those of the TMTs written in 2003.
For the 43 students in question, the average mark for TMT1 01 of 7.3 out of 10 is followed by an
average mark of 6.1 for TMT1 03. For TMT2 01 the average mark is 8 out of 10, dropping to 6.3 for
TMT2 03. From this data alone it is not possible to establish whether there is a significant difference
in average performance between the paper tests written in 2001 and the paper tests written in 2003,
respectively. Again the non-parametric sign test for differences seems appropriate because not
all data appears to be normally distributed. The p-values, obtained from this test, of 0.0008 for
TMT1 and 0.0001 for TMT2 point to a significant difference in average performance between the
Marks distribution                        TMT1              TMT2
                                     2001     2003     2001     2003
Number of students                        43                43
Number of students that pass          20       7        31       17
% of students that pass               46.5     16.3     72.1     39.5
Mean                                  7.3      6.1      8        6.3
Median                                7        7        8        6
Standard deviation                    1.9      1.6      2.1      2.5
Pearson correlation coefficient           0.440             0.446
Sign test for differences: p-value        0.0008            0.0001

Table 6.1: Comparison between the results of the paper TMTs in 2001 and the paper TMTs of 2003.
TMTs on both a 1% and a 5% level of significance. The drop in average performance in both
TMTs from 2001 to 2003 is therefore statistically verified.
The Pearson correlation coefficient r = 0.440 between TMT1 01 and TMT1 03 indicates a moderately strong correlation between performance in the two tests. Table 6.2 shows the student
performance distribution (percentage-wise) split into three categories, for the results of the paper
version of TMT1 for the two different years. Most of the cells on the main diagonal of this table are
loaded, supporting the moderately strong correlation between TMT1 01 and TMT1 03. A relatively
large percentage (34.8%) of the students that performed well in 2001 did not pass the same test in
2003. It should also be noted that only a very small percentage of students (4.7%) who failed
TMT1 in 2001 managed to pass in 2003. Further investigation is done by looking at performance in
individual questions and also by conducting interviews with a selected group of students, reported
on in later sections.
The Pearson correlation coefficient r = 0.446 between TMT2 01 and TMT2 03 indicates a moderately strong correlation between the tests. Table 6.3 shows the student performance distribution
(percentage-wise) split into three categories, for the results of TMT2 for the two different years. A
disturbing figure is that 41.8% of the students that performed well in 2001 did not pass the same test
in 2003. Note also that the percentage of students that passed TMT2 03 is more than double
that of TMT1 03 (39.5% compared to 16.3%).
TMT1              2003: 0-5   2003: 6-7   2003: 8-10   Total
2001: 0-5             14.0         7.0          0.0      21.0
2001: 6-7              9.3        18.6          4.7      32.6
2001: 8-10            11.6        23.2         11.6      46.4
Total                 34.9        48.8         16.3

Table 6.2: Student performance distribution (percentage-wise) comparing the results of the paper
TMTs of 2001 and the paper TMTs of 2003 for TMT1
TMT2              2003: 0-5   2003: 6-7   2003: 8-10   Total
2001: 0-5              4.7         2.3          2.3       9.3
2001: 6-7              4.7         7.0          7.0      18.7
2001: 8-10            25.6        16.2         30.2      72.0
Total                 35.0        25.5         39.5

Table 6.3: Student performance distribution (percentage-wise) comparing the results of the paper
TMTs of 2001 and the paper TMTs of 2003 for TMT2
Figure 6.1 represents a Venn diagram of the pass percentages for TMT1 01 and TMT1 03.
This diagram shows that a relatively small percentage of 11.6% passed both TMT1 01 and TMT1 03
and that almost half of the students in this group (48.9%) did not pass either of the two tests.
Only 46.4% (34.8% plus 11.6%) of this group initially reached the requirement of 80% to pass
TMT1 01. The reader should be reminded that students had to write TMT1 01 repeatedly until
they passed the test, and as a result 90.7% of the large group (described in Chapter 5) eventually
obtained the minimum mark of 80%. The percentage of students that scored at least 80% for
TMT1 03 is a mere 16.3% (11.6% plus 4.7%). Possible reasons for this poor performance will be
discussed in more detail (question-by-question) in Section 6.4.
Figure 6.2 represents a Venn-diagram for the pass percentages for TMT2 01 and TMT2 03 and
shows a larger percentage (than that for TMT1 ) of 30.2% that passed both TMT2 01 and TMT2 03
and a much smaller percentage (18.7%) that did not pass either of these two tests.
[Venn diagram: 34.8% passed TMT1 01 only, 11.6% passed both, 4.7% passed TMT1 03 only and 48.9% passed neither.]

Figure 6.1: Diagram comparing the pass percentages of TMT1 01 and TMT1 03
[Venn diagram: 41.8% passed TMT2 01 only, 30.2% passed both, 9.3% passed TMT2 03 only and 18.7% passed neither.]

Figure 6.2: Diagram comparing the pass percentages of TMT2 01 and TMT2 03
A slightly larger percentage of students, 72.0% (41.8% plus 30.2%), initially reached the required
80% to pass TMT2 01. The percentage of students that scored at least 80% for TMT2 03 is only
39.5%, but it should be mentioned that some students did not attempt to answer all the questions
for TMT2 03. During the interviews a few students claimed that they did not have enough time to
finish both the tests, and explained that they would have put more effort into the tests if the marks
were to have contributed towards their final mark. This was not a general response, however, and
on closer investigation of the papers and from interviews it appeared not to be a factor that had
a serious impact on the results.
6.4 Topical comparison
In this section we compare the question-by-question results of the 2001 and 2003 technique mastering tests. Unfortunately it is not possible to use a statistical test such as the χ²-test for comparison
purposes, due to the small sample size of 43 and the fact that, in comparison tables such as those
given for the questions below, the maximum of 20% of cells with (near-)zero counts, a restriction imposed by the χ²-test,
is often exceeded. The comparison is done descriptively but supported by the collected data.
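The validity check alluded to here can be sketched as follows (a minimal illustration, not the study's analysis code; it assumes a comparison table of raw counts, and rows or columns that are entirely zero would first have to be dropped):

    import numpy as np
    from scipy.stats import chi2_contingency

    def chi2_is_valid(table):
        # chi2_contingency returns the table of expected frequencies as its
        # fourth value; the usual rule of thumb is that no more than 20% of
        # cells may have an expected frequency below 5.
        _, _, _, expected = chi2_contingency(np.asarray(table))
        return np.mean(expected < 5) <= 0.20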
Since most of the students are Afrikaans-speaking, responses are translated. Pseudonyms are
used in reporting on interviews and when quoting students.
6.4.1 Comparison of TMT1 01 and TMT1 03
A question-by-question comparison is given of the first technique mastering test written in both
2001 and 2003.
Question 1
Let f(x) = √x and g(x) = x − 1. Determine f/g and f ◦ g. You may assume that
in each case the domains of f and g consist of all numbers x for which f(x) and
g(x) make sense.
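For reference, one possible worked solution (an illustrative sketch, not the original marking memorandum):
\[
\left(\frac{f}{g}\right)(x) = \frac{\sqrt{x}}{x-1} \quad (x \ge 0,\ x \ne 1), \qquad
(f \circ g)(x) = f(g(x)) = \sqrt{x-1} \quad (x \ge 1).
\]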
TMT1 Question 1   2003: 0   2003: 1   2003: 2   2003: 3   Total
2001: 0               0.0       0.0       0.0       0.0     0.0
2001: 1               0.0       0.0       0.0       2.3     2.3
2001: 2               7.0      16.3      34.9       2.3    60.5
2001: 3               4.7       4.7      27.9       0.0    37.2
Total                11.6      20.9      62.8       4.7

Table 6.4: Student performance distribution (percentage-wise) for TMT1 Question 1
Means: 2.3 for TMT1 01 and 1.6 for TMT1 03.
Objectives: To test basic knowledge and comprehension concerning combinations of functions.
Observations: Table 6.4 shows that the percentage of students who scored two or more for
the question dropped considerably from 97.7% in 2001 to 67.5% in 2003. One explanation could
lie in the fact that one mark was awarded for giving the domains of the two composite functions
(although the question did not explicitly ask for it) and while much emphasis was placed on
domains of functions in the first year, students may have forgotten this two years later. During
the interviews it became evident that these students most probably would have been able to write
down the restrictions, had they explicitly been asked to do so.
In general, students interviewed claim that they last encountered composite functions in their
first year and have not used them since.
Things like these [composite functions] are rather rare [in computer engineering
courses] (Charles)
The quantitative results, however, show that a reasonable percentage of students could still
deal with composite functions. They have most probably used them indirectly somewhere along the
line, perhaps not using the formal notation.
. . . what does f ◦ g mean? (Eugene on his answer sheet)
Memorising is not part of my learning pattern; if I don’t use it often, I tend to
forget the definitions. I am better at figuring out things. (Eugene explaining his
written answer)
Conclusion: Although students do not seem to have encountered the composite function
notation since first year, the underlying knowledge still seems reasonably intact. It is probably
a matter of not recognising the notation. They would have encountered composite functions in
other mathematics courses if not in their other subjects, since most functions are composite, e.g.
f(x) = sin(x + 2) or f(x) = √(x − 1).
Question 2
Solve for x: (1/2) ln x = 1 − ln 2.
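For reference, a possible worked solution (an illustrative sketch, not the original memorandum):
\[
\tfrac{1}{2}\ln x = 1 - \ln 2
\;\Longrightarrow\;
\ln x = 2 - 2\ln 2
\;\Longrightarrow\;
x = e^{2-2\ln 2} = \frac{e^2}{4}.
\]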
TMT1 Question 2   2003: 0   2003: 1   Total
2001: 0              18.6       4.7    23.3
2001: 1              30.2      46.5    76.7
Total                48.8      51.2

Table 6.5: Student performance distribution (percentage-wise) for TMT1 Question 2
Means: 0.8 for TMT1 01 and 0.7 for TMT1 03.
49
University of Pretoria etd – Du Preez, A E (2004)
Objective: To test whether students can use logarithmic properties to solve basic logarithmic
equations, specifically for the natural logarithm ln.
Observations: Although a reasonably high percentage (76.7%) scored full marks in 2001,
only 51.2% did so in 2003, a fair drop. A number of students are surprised at their failure to solve
the equation.
I don’t know what I did. I know ln, I suppose I forgot the rules. (Charles on viewing
his paper)
Students claim that although they no longer solve logarithmic equations such as in this particular problem, they often use logs to solve equations, mainly making use of a calculator to compute
whatever they need to. Most students seem confident of their knowledge.
In general logs and ln are not a problem. (Conrad)
A relatively high percentage of students (18.6%) could not solve this equation in either of the
two years, indicating that they either never mastered basic logarithmic laws in the first place or
fail to connect ln with log₁₀ and log₂, not an uncommon occurrence.
Conclusion: It is disappointing that the percentage of students that could solve a logarithmic
equation dropped by about a third, considering that logarithmic laws are first encountered in high
school. This is an important basic skill. Logarithmic laws should therefore be seen as knowledge that
was not well founded, either at school or at university level. This is a somewhat disturbing
situation and needs to be addressed.
Question 3
Sketch the graph of f(x) = |2x + 1|.
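For reference, one possible approach (an illustrative sketch; this mirrors the branch-splitting method mentioned in the observations below) is to use the definition of absolute value:
\[
f(x) = |2x+1| =
\begin{cases}
2x+1 & \text{if } x \ge -\tfrac{1}{2},\\[2pt]
-(2x+1) & \text{if } x < -\tfrac{1}{2},
\end{cases}
\]
a V-shaped graph with x-intercept −1/2 and y-intercept 1.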
Means: 1.4 for TMT1 01 and 0.9 for TMT1 03.
Objective: To test sketching of graphs of basic functions by performing vertical and horizontal
shifts and stretching.
Observations: Students generally feel that they no longer have to sketch this kind of graph,
but that they do make use of absolute values when solving differential equations, for example.
The number of students in 2001 who could sketch the graph correctly (65.1%) decreased to
almost half of that (37.2%) in 2003. Furthermore, more than half of the students (51.2%) could not
TMT1 Question 3   2003: 0   2003: 1   2003: 2   Total
2001: 0              20.9       0.0       4.7    25.6
2001: 1               0.0       2.3       7.0     9.3
2001: 2              30.2       9.3      25.6    65.1
Total                51.2      11.6      37.2

Table 6.6: Student performance distribution (percentage-wise) for TMT1 Question 3
sketch this graph at all in 2003. On investigation it appears that most students tend to stick to one
method, normally the one that was used at school, where they had to memorise the properties of
such graphs. When their memory fails them, they do not try different approaches such as reflecting
the negative part of y = 2x + 1 about the x-axis or even calculating and plotting values. In the
first year Calculus course one specific method is rarely enforced; students are expected to use
their initiative and resources to solve problems. This does not appear to be very successful.
On the positive side David, one of the few students who did show some initiative, explains that
he knew the general form and then computed a few values to confirm his results. When asked why
he had the correct x-intercept in contrast to most other students, he replies
I always check the x- and y-intercepts for all graphs.
Thomas, a student with an incorrect graph, shows insight when analysing his mistakes. He is
an example of a student whose knowledge is slightly rusty.
I possibly confused it [the question] with linear systems, but I realise my mistake
now. I moved the 1 (one) outside the thing [absolute value]. We do not actually work
with things like these [absolute value graphs] in other subjects. I know the concept
though ... graphically you move everything that is negative to positive.
William is an example of someone who elegantly used the definition of absolute value to split
the function into its two branches, a model student. He used exactly the same method in 2001.
Conclusion: As a result of the poor performance in this question in 2003 one is tempted to
conclude that sketching functions of the form f(x) = a|bx + c| + d should not be considered as
a must know. However, one of the main objectives of the first year Calculus course is for students to
be able to sketch different graphs, whether by manipulation of well-known functions or by using
[William’s sketches from 2001 and 2003.]

Figure 6.3: The graph of f(x) = |2x + 1| as done by William
tables. Students can usually sketch f(x) = |sin x| by just reflecting the negative part of y = sin x
about the x-axis, but they fail to use the same technique for this graph. In general students find
it difficult to see connections between different sections of the work and often do not think beyond
one specific method. Such skills should be cultivated more strongly in the first year.
Question 4
Sketch the graph of h(x) = ln(−x). Also give the domain of h.
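For reference, a possible worked solution (an illustrative sketch, not the original memorandum):
\[
h(x) = \ln(-x) \text{ is defined where } -x > 0, \text{ i.e. on the domain } x < 0,
\]
and its graph is that of ln x reflected about the y-axis, with x-intercept at x = −1 (since h(−1) = ln 1 = 0).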
TMT1 Question 4   2003: 0   2003: 1   2003: 2   Total
2001: 0               9.3       2.3       2.3    14.0
2001: 1               2.3       7.0       2.3    11.6
2001: 2              18.6      25.6      30.2    74.4
Total                30.2      34.9      34.9

Table 6.7: Student performance distribution (percentage-wise) for TMT1 Question 4
Means: 1.6 for TMT1 01 and 1.1 for TMT1 03.
Objective: To test horizontal reflection of a graph of a basic function such as f(x) = ln x.
Observations: In 2001 74.4% of students scored full marks for this question, whereas less than
half of that, only 34.9%, scored full marks in 2003, a considerable drop in performance. Students
lost one mark if they had either omitted to give the domain, which was explicitly asked for in this
case, or did not indicate the x-intercept even if the shape of the graph was correct. This could
explain the drop in performance.
A typical error made by students is given in the quote below:
ln(−x) is not defined since ln is defined for positive values of x only. (Sean)
An example of how well-founded knowledge can be retained is given in Figure 6.4.

[The student’s sketches from 2001 and 2003.]

Figure 6.4: The graph of f(x) = ln(−x) drawn by one of the students in the two consecutive tests.
Conclusion: In 2003 a number of students did not answer the second part of the question and
this may be one of the reasons for the drop in performance. Even though students did not score
particularly well in this question, a large group (almost two thirds) of students did get the shape of
the graph right, indicating that they can still sketch logarithmic graphs. Properties of symmetry,
as in this case a reflection about the y-axis, are important and useful for sketching graphs. There
does not seem to be reason for concern here.
Question 5
Determine the following limit (if it exists): lim_{x→∞} (x³ + x + 1)/(x² + 2x + 1).
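Assuming the reconstruction above, one possible evaluation (dividing through by x², the highest power in the denominator) is:
\[
\lim_{x\to\infty}\frac{x^{3}+x+1}{x^{2}+2x+1}
=\lim_{x\to\infty}\frac{x+\frac{1}{x}+\frac{1}{x^{2}}}{1+\frac{2}{x}+\frac{1}{x^{2}}}=\infty,
\]
so the limit does not exist; the quotient grows without bound.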
TMT1 Question 5   2003: 0   2003: 1   2003: 2   Total
2001: 0               0.0       2.3      32.6    34.9
2001: 1               0.0       0.0      11.6    11.6
2001: 2               2.3       4.7      46.5    53.5
Total                 2.3       7.0      90.7

Table 6.8: Student performance distribution (percentage-wise) for TMT1 Question 5
Means: 1.2 for TMT1 01 and 1.9 for TMT1 03.
Objective: To test asymptotic behaviour of functions.
Observations: Quite surprisingly, 90.7% of the students scored full marks for this question in
2003 compared to 53.5% in 2001. The mean for TMT1 03 is also considerably higher than that of
TMT1 01. Furthermore 32.6% had this question completely wrong in 2001 but scored full marks in
2003. These figures indicate that limits at infinity should definitely be considered as must knows
for this group of students. Quite a number of students mention the use of limits in their other
subjects.
Impressive also is the fact that students use three different methods to find the limit. Many
students use the method of eliminating the highest factor in the numerator and denominator,
which was initially taught for use in the case of the indeterminate form ∞/∞, and quite a few use
L’Hospital’s rule, taught during the second semester of first year Calculus. A third method was
used by Charles only, who explains his answer by simply stating that x³ >> x² when x → ∞,
showing a fair amount of insight.
It comes from an engineering way of thinking. (Charles)
Students confirm their continued exposure to this type of limit and application of it.
We use it all the time, like for instance in the processing of signals. I can remember
it well since we did a lot of it in sequences and series [in second year Calculus]. (Sean)
We often simulate a product with a function and then you want to establish the life
span of such a product. (Oliver)
Conclusion: The results of this question are significant. It is the only question of this test in
which students performed better after two years. The increase in performance was not necessarily
due to the technique mastering programme in the first year but to the fact that students were
repeatedly exposed to this knowledge in their field of study. It was important to lay a firm
foundation in the first year, but due to repeated exposure over a longer period of time knowledge
was not only retained but also improved upon.
6.4.2 Comparison of TMT2 01 and TMT2 03
A question-by-question comparison is given of the second technique mastering test written in 2001
and 2003.
Question 1
Differentiate the function: f(x) = sin x(sin x + cos x)
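For reference, a possible worked solution using the product rule (an illustrative sketch, not the original memorandum):
\[
f'(x) = \cos x\,(\sin x + \cos x) + \sin x\,(\cos x - \sin x)
= \cos^2 x - \sin^2 x + 2\sin x\cos x = \cos 2x + \sin 2x.
\]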
TMT2 Question 1   2003: 0   2003: 1   2003: 2   Total
2001: 0               2.3       0.0       2.3     4.7
2001: 1               2.3       7.0      32.6    41.9
2001: 2               0.0       4.7      48.8    53.5
Total                 4.7      11.6      83.7

Table 6.9: Student performance distribution (percentage-wise) for TMT2 Question 1
Means: 1.5 for TMT2 01 and 1.8 for TMT2 03.
Objective: To test the product rule for differentiation.
Observations: A remarkable 83.7% of students scored full marks for this question in 2003
compared to only 53.5% in 2001, a large improvement. The mean for TMT2 03 was also higher
than that of TMT2 01. These figures are pleasing but not unexpected. Students who did not score
full marks in 2001 were mostly confused with the signs of the derivatives of sin x and cos x, which
could indicate some confusion between differentiation and anti-differentiation.
In general, students feel that differentiation techniques are definitely a must know in almost
all of their subjects, although they can find derivatives and rules in tables and do not necessarily
have to memorise these. Top students such as Oliver and Brian commented that they memorise
rules anyway (even if formulae and rules are supplied in tests in the engineering courses), since one
needs to know where and when to use each rule and formula, and it saves time.
Conclusion: The results of this question are significant, especially because although students
claim that they may use tables to look up rules and formulae, most students are so familiar with the
product rule for differentiation that they know it by heart. Results show again that the continuous
use of rules reinforces the basic must knows of the first year Calculus.
Question 2
Differentiate the function: f(x) = ln((x − 1)/(x² + 1)).
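For reference, a possible worked solution that simplifies first using logarithmic properties (an illustrative sketch, not the original memorandum):
\[
f(x) = \ln(x-1) - \ln(x^2+1)
\;\Longrightarrow\;
f'(x) = \frac{1}{x-1} - \frac{2x}{x^2+1}.
\]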
TMT2 Question 2   2003: 0   2003: 1   2003: 2   Total
2001: 0               2.3       2.3       0.0     4.7
2001: 1               0.0       7.0      14.0    20.9
2001: 2              20.9       9.3      44.2    74.4
Total                23.3      18.6      58.1

Table 6.10: Student performance distribution (percentage-wise) for TMT2 Question 2
Means: 1.7 for TMT2 01 and 1.4 for TMT2 03.
Objectives: To test the properties of logarithmic functions and use of the chain rule for
differentiation.
Observations: In this question 58.1% of the students scored full marks in 2003 compared
to 74.4% in 2001, a fair drop. In 2003 only a small percentage of students used the properties
of logarithmic functions for simplification before differentiation. Most of the students focused on
(x − 1)/(x² + 1) instead, recognising it as a quotient and wanting to go the quotient rule route. Students feel
that since they do not use the quotient rule as often as the product rule, they cannot recall the
rule and had no references available. Some of those that tried to recall the quotient rule confused
the order of the terms in the numerator, and as a result made a sign error.
Conclusion: In this question the lack of exploring different approaches is disturbing. While
most of the students had no problem in differentiating the ln-function itself, they could not
differentiate y = (x − 1)/(x² + 1). Students could not remember the quotient rule and did not think of simplifying
by using logarithmic properties or by writing y = (x − 1)/(x² + 1) = (x − 1)(x² + 1)⁻¹ and then applying the
product rule and the chain rule. More effort should be spent on fostering innovative and wider
thinking amongst students.
Question 3
y is implicitly defined as a function of x. Determine dy/dx in terms of x and y if √(xy) + 1 = y.
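Under the reconstructed equation above (the scope of the radical is an editorial assumption), implicit differentiation would give:
\[
\frac{y + x\,y'}{2\sqrt{xy}} = y'
\;\Longrightarrow\;
y + x\,y' = 2\sqrt{xy}\,y'
\;\Longrightarrow\;
\frac{dy}{dx} = \frac{y}{2\sqrt{xy} - x}.
\]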
TMT2 Question 3   2003: 0   2003: 1   2003: 2   Total
2001: 0               9.3       2.3       4.7    16.3
2001: 1              11.6       4.7       0.0    16.3
2001: 2              27.9      11.6      27.9    67.4
Total                48.8      18.6      32.6

Table 6.11: Student performance distribution (percentage-wise) for TMT2 Question 3
Means: 1.5 for TMT2 01 and 0.9 for TMT2 03.
Objective: To test the technique of implicit differentiation.
Observations: In 2001 only about two-thirds of students (67.4%) scored full marks for this
question and this figure more than halved to 32.6% in 2003, a poor overall performance. Students
feel that they rarely encounter functions that are implicitly defined in their field of study.
... we do not use it [ implicit differentiation] (Charles)
During the interviews it became evident that students cannot recall what to do because they
have not used it for so long. They also seem to rely too heavily on tables of rules for differentiation
techniques.
... [in engineering courses] we became lazy, since we just look up the formulae (Anja
and Dirk)
... often it [looking up a formula] is the first step in a very long solution. (Anja)
Conclusion: Perhaps this is one of the techniques that is not essential for students in computer
engineering and could only be considered as a must know in terms of their mathematical foundation
in differentiation. Whether students would have been able to find f′(x) if y were replaced
by f(x) is debatable. The low full score percentage in 2001 points to the fact that more time
should be spent on the mastering of this particular technique.
Question 4
Find the most general form of a function f satisfying the condition f′(x) = x(1 + x³).
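For reference, a possible worked solution, simplifying before anti-differentiating (an illustrative sketch, not the original memorandum):
\[
f'(x) = x(1+x^3) = x + x^4
\;\Longrightarrow\;
f(x) = \frac{x^2}{2} + \frac{x^5}{5} + C.
\]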
TMT2 Question 4   2003: 0   2003: 1   2003: 2   Total
2001: 0               2.3       4.7       2.3     9.3
2001: 1               9.3       2.3      11.6    23.3
2001: 2               4.7       9.3      53.5    67.4
Total                16.3      16.3      67.4

Table 6.12: Student performance distribution (percentage-wise) for TMT2 Question 4
Means: 1.6 for TMT2 01 and 1.5 for TMT2 03.
Objective: To test basic anti-derivatives.
Observations: In first year Calculus, questions like this one are used to introduce integration.
It is interesting that the same percentage, namely 67.4%, scored full marks in both 2001 and in
2003. The means are also nearly the same in the two tests. The 23.3% (2001) and 16.3% (2003)
of students who scored 1 out of 2 lost one mark because they neglected to mention the integration
constant. Students who did not get any credit attempted to integrate without simplifying first. The
interviewed students feel that integration as such is definitely a must know.
Sean claims, justly or not, that in their engineering subjects they do not have to mention the
integration constant:
... we just keep in mind that there is a constant involved but do not compute it.
Conclusion: Although this question is fairly simple, it is still necessary to simplify before
integrating. The fact that such a large percentage of students could integrate this basic function
shows that they knew at least that the chain rule (∫[f(x)]ⁿ f′(x) dx) does not apply in this case.
One can conclude that integration is a valuable skill and is definitely a must know.
Question 5
Find the solution of the differential equation d²y/dx² = eˣ with the initial conditions
(dy/dx)(1) = e and y(1) = −4e.
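For reference, a possible worked solution (an illustrative sketch, not the original memorandum):
\[
\frac{dy}{dx} = e^x + C_1, \quad \frac{dy}{dx}(1) = e \Rightarrow C_1 = 0; \qquad
y = e^x + C_2, \quad y(1) = -4e \Rightarrow C_2 = -5e,
\]
so that y = eˣ − 5e.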
TMT2 Question 5   2003: 0   2003: 1   2003: 2   Total
2001: 0              16.3       0.0       0.0    16.3
2001: 1               0.0       0.0       0.0     0.0
2001: 2              34.9      11.6      37.2    83.7
Total                51.2      11.6      37.2

Table 6.13: Student performance distribution (percentage-wise) for TMT2 Question 5
Means: 1.7 for TMT2 01 and 0.9 for TMT2 03.
Objective: To test solving of basic differential equations with initial conditions.
Observations: The low performance of students in 2003 is reflected by the means and the fact
that more than half of the students (51.2%) could not do this question at all in 2003. The 83.7%
of students who scored full marks in 2001 dropped to 37.2% in 2003. This may be explained by
the fact that these students had since done a course in Differential Equations in which far more
sophisticated differential equations were dealt with. They may have found this problem somewhat
confusing. This notion is supported by the fact that during the interviews, when students had a
second look at the question, most of them could explain how to do it. As in Question 4 they often
omitted the integration constant, in which case they could not solve the equation fully.
Conclusion: This is clearly not the type of problem that this group of students encounters
regularly. Yet the underlying knowledge seems to be intact. Students found the question somewhat
unfamiliar and for this reason failed to answer it well. It is doubtful whether there is reason for
concern.
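By way of illustration, the solution requires nothing more than integrating twice and fitting the constants to the initial conditions:
\[
\frac{dy}{dx} = e^x + C_1, \quad \frac{dy}{dx}(1) = e \;\Rightarrow\; C_1 = 0; \qquad
y = e^x + C_2, \quad y(1) = -4e \;\Rightarrow\; C_2 = -5e,
\]
so that y = e^x − 5e.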
6.5 Qualitative investigation
We discuss students’ perceptions of the TM programme by addressing the structured questions in
the interviews.
6.5.1 Responses to structured questions
After an initial discussion to set the scene and to make sure the student recalled the technique mastering
programme, the first of a number of questions was addressed. What follows is a report on their
responses to three prominent questions.
Did the fact that you had to score 80% for such a test, without it contributing towards
your semester mark, influence your performance in such a test?
Most of the students cannot recall that the TMTs did not contribute to their final mark. Students
remark that in an engineering programme the workload is high and therefore they spent little if
any time preparing for such tests. They do what is required of them in order to pass and it is
not considered as important as some assignments and tests in mathematics or other subjects.
Martin says that he did not study as hard for a TMT as for tests that would contribute to the
final mark, but he did prepare for the TMTs and commented that
... the TMTs were not difficult since one did not need any ‘tricks’ to be able to do
it.
Sean mentions that since the course allowed for frequent evaluation in the form of class tests
that did contribute towards the final mark, he did not mind that the TMTs did not contribute
towards his semester mark.
Eugene remembers that he was nervous about the 80% pass mark at first, but later relaxed
and saw it as a way to evaluate his level of knowledge.
I did it only because I had to ... I did not prepare in advance, ... you already knew
the work . . . it is reassuring to know that you can do the basic stuff.
Anja feels that it was a valuable exercise even if it did not contribute towards the final mark.
You rewrite a TMT until you get your 80%, it is the only way to master it [skills].
Conrad feels that even if the TMTs did not contribute towards the final mark, all assessment
is valuable.
Summary: Students generally feel positive towards the technique mastering programme and
do not feel that having to score 80% on tests that did not contribute towards their semester mark
influenced them negatively.
Did you find the TMTs meaningful?
Most of the students claim that they wrote the tests simply because it was required of them and
at the time did not give any thought to the usefulness of the particular skills involved. It was just
another task, something they had to do. Later on, in other mathematics modules, they realised
that the TMTs reflect essential knowledge.
Although some students did not specifically prepare for the TMTs they feel that the formative
assessment helped them to determine whether they had mastered certain mathematical skills.
Oliver explains:
... by the time we wrote the tests we had already covered that part of the content
... I used it [results of TMTs] to see whether I’ve at least mastered the basic knowledge.
If not, I knew I had to work harder.
Lynn speculates on the necessity of TMTs and replies:
... perhaps they are necessary, then you know you know the basic stuff ... you don’t
think it is necessary in first year but later on you realise you need these [basic skills].
You have to know the basic rules and skills in order to succeed.
Anja comments that many engineering students are a little sceptical about whether the TM
programme is necessary at all, since engineering students use tables to look up the formulae they
need, but she personally thinks it is necessary to know the basics.
... you have to do it [TMTs] in first year, otherwise you won't understand some things
later ... even if we look up formulae in engineering, it is still important to understand.
Charles also feels unsure about the effect of the TM programme because of the low frequency of
tests and the small range of problems. He would have liked more basic practice exercises in order
to construct his own knowledge.
At that stage it helped to see whether you knew the work, but I needed more
exercises ... it was such a small part of the work that I am not sure that it made any
impact at all.
Conrad explains that the TM programme fitted into his idea of doing mathematics:
Yes, this is how I learn, by practising ... it is like a certain level you have to obtain
... at a certain stage you have to know enough, and understand enough to continue ...
when you pass [TMT], you are on a higher level.
Dirk comments that his friends who had to work harder to obtain the minimum requirements
for a TMT benefited in the end. He continues to say that students who had not mastered the
basic principles earlier on now encounter problems in different subject areas.
In the third year, students with gaps in their knowledge start to fall out ... TMTs
help you to see whether you have conceptual problems ... if you get the basics right,
the rest will be OK.
Only four students, Dirk, Oliver, Brian and William (11.3% of the sample group), passed all
four tests (both TMT1 01 and TMT1 03 as well as TMT2 01 and TMT2 03) at their first attempt.
Dirk is a top student, Brian and Oliver each obtain distinctions in about half of their subjects
and William has increased his average from 60% in the first year to more than 70% in the third
year. According to these students it is more convenient to know some of the basic rules and
definitions than to look them up in the given tables. These students also referred to the necessity of
understanding basic concepts and, judging from their responses, a programme such as the technique
mastering programme is definitely meaningful.
Summary: Students generally feel that the technique mastering programme contributed to
their basic skills and find the programme meaningful. One can conclude that students benefited
from obtaining a set of must knows that they are able to use in all areas.
Do you have any recommendations regarding the technique mastering programme?
Dirk comments that although he has no problem with the current system he suggests that
TMTs should contribute towards the final mark in order to force students to work
harder.
Introduce a similar programme in second and third year mathematics as it forces
students to master every concept. Increase the frequency to weekly tests.
Dirk also comments that he likes the immediate response of the online tests but that he thinks
the tests should not consist of multiple choice questions only.
Charles explains how he needed more exercises to practise the basic rules and techniques, and
felt it would help students if more problems were included in the programme.
Pete would like to have a summarised version of must knows available to use in other subjects,
even beyond first year.
Oliver suggests that the variables be varied, i.e. in integration and differentiation not to stick
to x as the variable but to use other variables, such as t, regularly as well.
Conclusion: Few students have recommendations regarding the TM programme. Those that
offer comments would like more practice exercises and a summary of the must knows.
6.5.2 Case studies
We include four case studies representing different student profiles.
Oliver (the high performing student)
Most first year engineering students start university directly after completing their twelve years
of school. Oliver came to the university after completing an extra year of study at a technical
institution to help him prepare for his study at our university. The curriculum he had followed
there included a fair amount of the content of the first semester engineering mathematics. He is
a hard working student with a positive attitude and good results (final marks for the three first
year mathematics modules are 81%, 78% and 71%). Oliver was able to score more than 80% in
2003 for both TMTs. In TMT1 he was one of few students who included the restrictions for the
functions f/g and f ∘ g.
Oliver's case is an example of how superficial knowledge cannot be retained over a period of
time. Although Oliver managed to sketch the graph of the absolute value function f(x) = |2x + 1|
in the first year test, he failed to do so in the follow-up 2003 test. He claims not to use it regularly
any more and says:
I understood it while in the first year but we did not use it immediately after that.
I am not very good at memorising; things get rusty after a while. If I read about it in
textbooks, I remember vaguely something about multiplying with a minus on the left
and right, and the signs interchange, but I get confused with what it really is. It is
definitely more difficult than factorising a quadratic or algebraic expression.
It is clear that he never properly understood the basic concept and is used to following a recipe
that he can now only vaguely recollect. A further comment substantiates this.
I realise that |x| means that the inside becomes positive, but I don’t know how to
solve for the x inside, since you normally get two answers. I can’t remember clearly,
do you have to multiply by a minus inside and outside? Especially with a complicated
function, I cannot think in reverse what will x be if the inside should stay positive.
He further claims that his inability stems from a lack of understanding in secondary school and
a confusion that developed because of two different methods followed in secondary school and at
university. He also points to the importance of the initial encounter with a concept.
The first time we encountered absolute values was in high school and there we
just memorised it. When we learned the other method in the first year (i.e. using the
definition of absolute values to write a function as a branch function) I understood it.
I realised it is a better way to deal with absolute values, but once the technique faded,
I stepped back to the way we did it in high school. I guess first impressions last.
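For reference, the branch-function method Oliver refers to rewrites the test function as
\[
f(x) = |2x+1| =
\begin{cases}
2x+1 & \text{if } x \ge -\frac{1}{2} \\
-(2x+1) & \text{if } x < -\frac{1}{2}
\end{cases}
\]
after which each branch can be sketched as a straight line.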
Asked whether a misconception can reappear after a period of time, and whether this
is because of faulty first impressions even though one has been convinced otherwise along the way, Oliver
replies:
Well, it all depends on how hard I work to get rid of a misconception. If I understand
it better I keep on doing it right, but some of the concepts get mixed up again if I don’t
use them regularly. After a while I can’t remember which of the two methods is correct.
In the question on logarithms Oliver performed well on both occasions and seems confident
of his knowledge. He compares his ingrained knowledge of logarithms with his somewhat shaky
knowledge of the absolute value concept.
I guess I understood ln from the start. There was never anything about ln that
bothered me. I think that with absolute values it was different. The first time in high
school I did not understand it well. When I do something right from the start, I never
have any problems further on.
This comment again stresses how important it is to grasp a concept firmly at the outset to
obtain a firm foundation and how difficult it is to rid yourself of misconceptions. It appears that
the technique mastering programme was not successful in this latter aspect for Oliver.
In TMT2 it was quite clear that Oliver had mastered the different differentiation rules well.
When asked why his knowledge of differentiation is still so firmly in place, he once again refers to
his secondary schooling but also points to the value of repetition and how concept understanding
develops over a period of time.
Since high school days, we continually use differentiation. Even difficult differentiation problems do not bother me at all. At the moment we particularly use differentiation
and integration a lot. Integration is not as easy as differentiation for me. Sometimes
I look at an integration problem and feel unsure of how to tackle it. I think that with
integration I also did not understand it immediately. It takes a while for me to master
something, but once I understand a concept I try to solve a problem by reasoning. I
like to know my new environment before I start. I often read ahead so that the new
concepts are not totally new to me. If the lecturer then explains the new concepts in
class and I can ask questions when I do not understand, I am able to master the new
work.
In summary, Oliver’s interview highlights:
• The importance of understanding a concept properly the first time round.
• The value of repetition in ingraining knowledge and concepts.
• The improvement of concept understanding by exposure to it over an extended period.
Anja (the hard-working, average student)
Anja is an average student that had difficulty in coping with Þrst year calculus and the heavy
workload of the computer engineering course. She feels at a disadvantage with her fellow students
that took Additional Mathematics as an extra subject at secondary school level. She managed to
pass the Þrst semester, Þrst year calculus, but during the second semester struggled with integration. She spent many extra hours studying calculus and as a result, neglected the parallel linear
algebra course and failed it. She repeated the linear algebra course online and passed it the next
semester. Since then she has not failed any subject and will probably complete the engineering
degree in the scheduled four years.
Anja has a good self-image, and in general has no problem seeking help from her lecturers
and fellow students when she does not understand a concept. However, Anja has a mental block
regarding her Þrst year mathematics. Anja’s case is an example of how students who are not
confident about their knowledge fail to be creative and do not use their own initiative to solve
problems. When we discuss her answer sheets for the TMTs, it becomes clear that she feels
uncomfortable to be suddenly confronted with first year calculus problems.
When we had to do these tests, it was impossible to remember all the small detail.
None of us could do it ... many of the students just sat and stared at f ∘ g not recalling
the definition.
She feels that the only content one remembers is the part that you regularly use.
We don't use it [compositions of functions] at all. I last saw it in first year. It is
only because we don’t use it often now that I could not do it.
Anja is one of the 9.3% (see Table 6.5) of students who could not solve the logarithmic
equation (Question 2 of TMT1) in 2001 and made similar mistakes in 2003. She compares the two
tests (during the interview), laughs and remarks
... it is obvious that I could not do this ... I have not practised it for a while.
Although I battled with ln, in 2001 I could do it eventually [for the exam], but now [in
2003] I forgot the rules ... if you don’t use them often, you lose it along the way.
She cannot explain what a logarithm is, and although she claims that she remembers the
principles, she clearly cannot apply the principles to natural logarithms.
I can use logs in problems. ln was a totally new concept for me in first year. I
remember the principles and can use the calculator to compute logs, but we only use
log10 and log2 in Signal Processes. We use logs to calculate decibels ... We use it often
now to change regular functions to decibels to draw certain graphs ... I’m just rusty at
this stage, haven’t done it for a while.
Anja needs time and practice. She could do the limit problem and credits second year Calculus
for that.
I have practised these over and over ... did twenty problems of a kind in my second
year, I will be able to do it in my sleep. I can’t say we use these limits a lot, but
sometimes it is part of the solution of a problem.
Concerning differentiation, Anja says it is not necessary to memorise the rules.
They [the engineering department] made us lazy since we use the tables in our
textbook to look up the formulae. It is much quicker. Sometimes it [differentiation] is
the first step in a long solution. Nobody works it out, we just use the formulae. There
are examples of all the rules in the appendices of our textbooks, so we just look it up. I
don't say it is not necessary to learn it in first year. It is important to learn it otherwise
you won't understand where it fits in. Anyway, for all follow-up math modules you have
to know it.
Anja's answer to the question in TMT2 on implicit differentiation shows that she lacks understanding. She did not score any marks on either occasion. Looking at her answer, she remarks:
Again I could not do it. Normally I try to remember how I did it previously.
In summary, Anja’s interview highlights:
• The importance of having core knowledge, like basic rules, techniques and skills, that cultivates confidence.
• The value of repetition in ingraining knowledge and concepts.
William (the improved student)
William describes himself as always having been an average student, even at school level. He
explains that at school he had no motivation to study, since he was uncertain of the future.
According to him the school he attended did not expose students to the outside world locally or
internationally. He is an introvert and keeps to himself. He struggled with first year Calculus
basically because he was not able to manage his time properly. Eventually he failed the first
semester, first year Calculus, repeated the course online and passed it with distinction.
He says that the engineering department regularly exposes them to the outside world. This
exposure has broadened his horizons, although he still is uncertain of the future and what lies
ahead.
William's case is an example of how stimulation and motivation can change a person's future.
William's present academic record is a testament to success. Part of William's success is his
perseverance. He wants to know more and reads about the subject. He claims that (after he failed
first semester mathematics) he started to prepare his mathematics in advance, since he discovered
that he is then better able to understand the new concepts.
I force myself to prepare ahead. I work every day. Sometimes I just read through
the work. Even if I do not understand what I read then, it helps if I have seen it
previously and it is then explained in class. I did not do it in my first year.
Upon examining William’s tests, it becomes clear that he has a deep understanding of the
concepts. William considers it as very important to understand the concepts well.
I understand the concepts now, I just have a gut feeling for it ... If you want to
continue with math, you need to understand the basic principles, otherwise you will
not be able to understand new concepts [built on the basic ones]. You won't be able to
do higher level math unless you understand the basics.
In summary, William’s interview highlights:
• The importance of intrinsic motivation.
• The value of working regularly.
• The value of thoroughly understanding basic principles and concepts.
Sean (the disadvantaged student)
Sean is a five-year-plan student (he follows the option of spreading the four year course over five
years) and has an eyesight disability. He was one of two students in the class who could not read
the blackboard or the slides used in class well. To help him (and the other student) cope,
the lecturer supplied them with written notes (or copies of the slides) before the lecture started.
They could then follow the discussion in class with the help of a friend who supplemented their
notes with class examples. Sean struggled to keep up with the pace despite the fact that he had
fewer subjects to complete in his first year. He started off with low marks for his first TMT, had
to rewrite it a few times, but managed to pass it eventually. In spite of his difficulties, he managed
to pass his first year mathematics on schedule.
Sean learned to cope through working regularly. He never missed any class or opportunity to
learn, and always asked questions whenever he was uncertain of any concept. Sean’s case is again
an example of how repeated exposure can help to reinforce concepts and skills.
I did not prepare for the TMTs since by the time we wrote the tests, we had done
the work in class, but once I get my script back, I made sure that I mastered whatever
I could not do in the test.
Thanks to his positive attitude, Sean rarely complains about the way tests are graded, and even
if he fails a test he uses it as a learning opportunity.
The tests were just a way to force one to go through the work, which is a necessary
exercise to prepare for semester tests and exams.
In his TMTs of 2003, he did not sketch the graph of f(x) = ln(−x), but was one of the few
who could recall the graph of y = ln x and its domain.
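For reference, the graph of f(x) = ln(−x) is simply the mirror image of y = ln x in the y-axis, so that its domain is x < 0.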
The two other questions that he could not do, were on implicit differentiation and differential
equations. It became clear (during the interview) that he had not mastered these two skills.
I can't recall implicit differentiation. Some of these we last saw in the first year
and in the differential equations course. I saw it only once in the differential equations
course.
In summary, Sean’s interview highlights:
• The importance of perseverance.
• The value of a positive attitude.
• The value of regular practice.
6.6 Conclusions and Recommendations
A number of factors that pertain to the technique mastering programme emerge from this study.
The first factor is the notion that a large percentage of students cannot integrate the knowledge
of the different areas of one single course, let alone integrate that knowledge into other areas.
Students tend to categorise subjects and easily conclude 'we don't use it often' or 'we don't use it at
all', without recognising the common core knowledge. Some students think of the TM programme
as some external exercise belonging to the first year Calculus programme ('something we had to
do') that had little to do with follow-up courses. This problem is by no means unique, as was
discussed in the review of long-term retention of knowledge; the University of Massachusetts'
IMPULSE programme for addressing this problem serves as an example. Whilst implementing a
separate programme is not necessarily feasible, we should familiarise ourselves with the course
material in other subjects, perhaps view examination papers, and include examples
related to other fields. Notation should ideally be standardised and, where that is not possible, students
should be made aware of corresponding notations.
The study also casts light on which of what we consider to be must knows are encountered
regularly in later studies. For example, the fact that knowledge of limits is so vital for this group
of students came as an eye-opener. We should revise the content of the technique mastering
programme regularly by taking cognisance of what happens in later years and in other subjects.
From interviews it transpired how important the motivational factor is. Often motivation for
studying a subject stems from its perceived usefulness. It is a task of the lecturer to put the TM
programme in context and to explain and demonstrate its usefulness beyond the first year.
Although at this stage great care is taken at the University of Pretoria to supply the mathematics content required in other subjects, we should also take note of the different strategies used
in other departments concerning the use of the same content. If students are allowed to look up
formulae and rules from tables in other subjects, it decreases the necessity to memorise formulae that are not frequently used. We are by no means advocating that memorisation should be
abandoned but this issue has to be discussed by the staff involved to review the policy.
What is the long-term effect of the technique mastering programme? It is difficult to determine
quantitatively how much of the basic knowledge and rote skills imparted in the first year is still
retained after two years and we did not attempt to do so; the approach was rather of a qualitative
nature. It is perhaps wishful thinking that the knowledge imparted in the TMTs will be retained
indefinitely. It can be expected that students will have forgotten some of this knowledge and these skills.
Yet, there is no doubt that the performance in the follow-up technique mastering tests of 2003
is disappointing in general. This is similar to the conclusion of Anderson et al (1998) that 'a
considerable amount of what is taught to mathematics students in general as "core material" in the
first year is poorly understood or badly remembered.' It also supports their idea that students
tend to " 'memory-dump' what they have had in previous modules, rather than retain it and build
it into a coherent knowledge structure." In cases where students had exposure in later years to the
knowledge gained in the first year, this knowledge certainly seems to be more firmly in place. In general,
the must knows and techniques are not retained to a sufficient extent over a period of time.
Although the concept of a technique mastering programme is viewed favourably by students
and considered meaningful, this study shows that the impact of the programme is not strong enough.
There certainly is room for improvement. A suggestion is that an online, interactive mastery
programme be initiated that provides students with the opportunity to work as often and
as long as they want, and even where they want, in order to master the basic skills and knowledge.
Such a programme can also provide for the needs of students with a poor grounding in secondary
school mathematics who need more exposure than students who, for example, did Additional
Mathematics at secondary school. This is also motivation for keeping the TM programme separate
from the lectures, unlike the situation at Moravian College (Schattschneider, 2004) where such a
programme was incorporated into the lectures. It has to be emphasised that the homogeneity of
their students warranted such an inclusion.
The purpose of the TM programme should be explained more clearly at the outset. One could
consider making the TM programme a prerequisite for follow-up courses, such as the gateway tests
used at some American universities. Such a prerequisite would ensure that students are forced to
revise the content of a TM programme and have the benefit of core knowledge ready and available
to start a new semester. If a similar programme can be introduced in all consecutive courses,
students will encounter certain basic skills and knowledge repeatedly.
Bibliography
[1] Anderson, J., Austin, K., Barnard, T. & Jagger, J. (1998). Do third-year mathematics undergraduates know what they are supposed to know?, International Journal of Mathematics
Education in Science and Technology, 29(3), 401-420.
[2] Appleby, J. (1997). Diagnosys, Habitat Issue 3, Retrieved September 2, 2003 from
http://cebe.cf.ac.uk/learning/habitat/HABITAT3/dignos.html
[3] Bloom, B. S. (1956). Taxonomy of educational objectives handbook 1: The cognitive domain.
Longman, New York.
[4] Bogacki, P. Melrose, G. & Wohl, P. R.(1992). Old Dominion University calculus project,
Retrieved November 28, 2001 from
http://www.math.odu.edu/cbii/calculus.html
[5] Bonham, S., Beichner, R. & Deardorff, D. (2001). Online homework: Does it make a difference? The Physics Teacher, 39, 293 - 296.
[6] Boyce, W.E. & Ecker, J. G. (1995). The computer-orientated calculus course at Rensselaer
Polytechnic Institute, The College Mathematics Journal, 26(1), 46-51.
[7] Customised Training Development, The pros and cons of online assessment, Retrieved September 2, 2003 from http://www.cmu.edu/teaching/howto/Bbseminars/assessment/pdfs/seminar_onlineassessment
[8] Devlin, K. (1991). Computers and Mathematics, Technology and calculus instruction: Notes
and references, Notices of the American Mathematical Society, 38(3), 190- 191.
[9] Engelbrecht, J. C. (1990). Rekenaargesteunde onderrig teenoor eksploratiewe gebruik van die
rekenaar in wiskunde-onderwys, Suid Afrikaanse Tydskrif vir Opvoedkunde, 10(4), 300-306.
[10] Engelbrecht, J. C. (1995). The Calculus and Mathematica experiment at the University of
Pretoria, Technical Report UP.
[11] Engelbrecht, J. & Harding, A. (2001a). WWW mathematics at the University of Pretoria:
The trial run, South African Journal of Science, 97(9/10), 368-370.
[12] Engelbrecht, J. & Harding, A. (2001b). Internet calculus: An option? Quaestiones Mathematicae, Supplement 1, 183-191.
[13] Engelbrecht, J. & Harding, A. (2003a). Online assessment in mathematics: Multiple assessment formats, New Zealand Journal of Mathematics 32 (Supplement), 57-66.
[14] Engelbrecht, J. & Harding, A. (2003b). Combining online and paper assessment in a web-based
course in undergraduate mathematics, To appear in Journal of Computers in Mathematics and
Science Teaching.
[15] Freudenthal Institute, Research on assessment practices, Retrieved September 2, 2003 from
http://www.freudenthal.nl/en/projects/
[16] Gagné, R. M. (1977). The conditions of learning. New York: Holt, Rinehart and Winston.
[17] Gretton, H. & Chalis, N. (1999). Assessment: Does the punishment fit the crime? Proceedings
of the International Conference on Technology in Education, San Francisco.
[18] Hibberd, S. (1996). The mathematical assessment of students entering university engineering
courses, Studies in Educational Evaluation, 22(4), 375-384.
[19] Huitt, W.G.(1996a). Mastery learning, Abaetern Academy, Retrieved August 28, 2003 from
http://www.abaetern.com/staff/mastery
[20] Huitt, W.G.(1996b). Mastery learning: Educational psychology interactive, Retrieved August
28, 2003 from
http://chiron.valdosta.edu/whuitt/col/instruct/mastery
[21] IMPULSE, (2004) Retrieved March 21 2004 from http://www.umassd.edu/engineering/impulse/
[22] Indiana University, Center for innovation in assessment, Retrieved September 2, 2003 from
http://www.indiana.edu/~cia/
[23] Kamps, H. J. L. & Van Lint, J. H. (1975). A comparison of a classical calculus test with a
similar multiple choice test, Educational Studies in Mathematics, 6, 259-271, translated by
Eindhoven, T. H.
[24] LaRose, G. P. & Megginson, R. (2003). Implementation and assessment of on-line gateway
testing at the University of Michigan. Primus, XIII(4), 289-307.
[25] Levine, L. E., Mazmanian, V., Miller, P. & Pinkham, R. (2000). Calculus, technology, and
coordination, T. H. E. Journal, 28(5) 18-23.
[26] Megginson, R. E. (1994). A gateway testing program at the University of Michigan, in Preparing for a New Calculus, Anita Solow, ed., MAA Notes 36, Mathematical Association of America, Washington, 85-88.
[27] Miller, S.P., Mercer, C. D. & Dillon A.S. (1992). CSA: Acquiring and retaining math skills,
Intervention in School and Clinic, 28(2), 105-110.
[28] Miner, R. & Topping, P. (2001), Math on the web: A status report, Design Science, Inc.
Retrieved September 3, 2003 from
http://www.coun.uvic.ca/learn/program/hndouts/bloom
[29] Netshapala, F. S. (2001). Classification and analysis of some computer software packages for
teaching mathematics, Unpublished dissertation Master of Science: Mathematics Education,
U.P. Pretoria.
[30] Patelis, T. (April 2000). An overview of computer-based testing, The College Board, Research
Notes, RN-09.
[31] Rissmann-Joyce, S. (2002). Dimensions of learning for elementary students, Japanese Math
Texts, Retrieved August 27, 2003 from
http://www.glocomnet.or.jp/fmf/J_math_dec_02
[32] Savage, M. & Hawkes, T. (2000). Measuring the mathematics problem, Report published by
The Engineering Council, London.
[33] Schattschneider, D. (2004). College Precalculus Can Be a Barrier to Calculus. Retrieved
March 21 2004 from http://www.oswego.edu/nsf-precalc/Schattschneider_paper.pdf.
[34] Smith, G. & Wood, L. (2000). Assessment of learning In university mathematics, International
Journal for Mathematics Education in Science and Technology, 31(1), 125-132.
[35] State Board of Education, Nashville Tennessee, (2003). The Master Plan for Tennessee Schools: Preparing for the 21st Century, Retrieved July 31, 2003 from http://www.state.tn.us/sbe/master.htm
[36] Steyn, T. M. (2003). A learning facilitation strategy for mathematics in a support course for
first year engineering students at the University of Pretoria, Unpublished thesis Philosophiae
Doctor: Education, UP Pretoria.
[37] Stewart, J. (1999). Calculus, Early Transcendentals (4th ed.). Brooks/Cole, Belmont CA.
[38] Thelwall, M. (2000). Computer based assessment: A versatile educational tool, Journal of
Computers and Education, 34, 37-49.
[39] Titsworth, S. (1997). Description of the Keller Plan, Retrieved March 21, 2004 from
http://www.unl.edu/speech/comm109/Files/Overview/desklrpln.htm
[40] The Capstone Experience, Mathematics and Statistics University of Nebraska Kearny, Retrieved March 21, 2004
from http://aaunk.unk.edu/asmt/2003Reformat/DptAsmt.htm
[41] The Kumon philosophy, Retrieved August 27, 2003 from
http://www.bocakumon.com/Home
[42] The MathSkills Discipline Network, Department of Education and Employment, U.K., Retrieved September 2, 2003 from
http://www.hull.ac.uk/mathskills/themes/theme3/mathskill.html
[43] United Inventors Association, Innovation assessment program, Retrieved September 2, 2003
from http://www.uiausa.com/UIAIAP.htm
[44] Weinstein, C.E. (1999). Teaching students how to learn. In: McKeachie, W.J. Teaching tips —
strategies, research, and theory for college and university teachers. Boston: Houghton Mifflin
Company.
[45] Wise, S. L. (1997). An evaluation of the item pools used for computerized adaptive test versions of the Maryland functional tests, Maryland State Department of Education.
[46] Yopp, J. (1999). Starting points for educators, student advisers and institutional score users,
Retrieved November 28, 2001 from http://www.ets.org
Appendix A
TECHNIQUE MASTERING QUESTIONS: WTW 114
FUNCTIONS
1. Determine the largest possible domain of f and in each case give the value of f at the given point a:
a. f(x) = 1/(x − 1), a = 5
b. f(x) = 3/|x − 4|, a = 1
c. f(x) = x + √(x + 1), a = 3
d. f(x) = 1/(x² + x + 1), a = 0
e. f(x) = 1/sin x, a = 5π/2
f. f(x) = e^(1/x), a = ln 3
g. f(x) = (√x + 1)/(√x − 1), a = 2
h. f(x) = tan x, a = −π/4
i. f(x) = |x|/x, a = −1
j. f(x) = (x² + 2x + 1)/(x² − 1)
k. f(x) = (ln x)/(x − 1)
l. f(x) = ln(1 + e^x)
2. For the given functions f and g, determine f + g, fg, f/g and f ∘ g. You may assume that in each case the domain of f and g consists of all numbers x for which f(x) and g(x) make sense:
a. f(x) = 2x + 5, g(x) = x²
b. f(x) = x², g(x) = 1/(x − 1)
c. f(x) = e^x, g(x) = ln x
d. f(x) = cos x, g(x) = 1/x
e. f(x) = x, g(x) = √(x − 1)
f. f(x) = x, g(x) = 2^x
ALGEBRAIC OPERATIONS WITH EXPONENTIALS, LOGARITHMS ETC
1. Express the following as a power of 2:
a. 2√2
b. 4⁸
c. 2 · 4^(2x)
d. 2^(x+y) ÷ 2^(x−y)
e. (2^x)^x
f. 2^(1/5) · 2^(−1/5) · 2^(3/5)
2. Simplify:
a. 3a⁴b³ · 2a³b⁻²
b. 12x^(2n+3) / 4x^(n−1)
c. (3x²y⁻³z⁰)⁻⁴
d. (3^(x+1))^(x+1)
e. (x^(1/y))^(1−y)
f. √(9x²)
g. (4a⁻²b⁶)^(−1/2)
h. x^(−1/2) · x⁻⁵
i. 7⁻¹x^(1/4) ÷ 2(3^(−1/2))²
j. (3y⁻¹)⁻² (49y)⁰
k. (2^(n−1))^(n−1) / (2^(n+1))^(n+1)
l. (16a³/81a⁻¹)^(−3/4)
m. x^(8n−3) (xy^(1/2))^(−6n) x³ / ((xy)^(−4n) (x⁻¹)^n)
n. 12^n × 8^(n−1) × 3^(n+1) / (24 × 6^(n−2) × 2^n)
o. ∛(a⁶√a)
p. (3^n)^(1−n) / (3^(1−n))^(n+1) ÷ 6⁻¹3^(−n−1) / (2⁻¹9^(−n−1))
q. (9^n − 3^(2n+1)) / ((3^n)² − 3^(n+2) · 3^n)
r. (5^x − 3² · 5^(x−2)) / (5^(x+1) + 3 · 5^x)
s. (a^(1/2) − b^(3/2))²
t. (3y)⁻² / (3y⁻²)
3. Express each logarithm as the sum or difference of simpler logarithms:
a. log(xyz)
b. log(x/y)
c. log(x²y³)
d. log(x^(1/2)y)
e. ln(xy²/(x + y))
f. ln ∛(x² + 1)
g. ln ∛(x²y)
h. ln(x³/y)
i. log √(x/(yz))
j. ln((x + y)³/(xy²))
4. Express each statement as a single logarithm with coefficient 1:
a. 3log x
b. 3log x − (1/2)log y
c. (1/2)(log x − 3log y)
d. 2log x + log(x + y)
e. 2log x − 3log y + log z
f. log(y − 2) + log y − 2log x
g. (1/2)[log x − 5(log y − 3log z)]
h. (1/2)[(log x − 5log y) − 3log z]
i. log 5 + 3log x − 2log 4 − log y
j. 3log(xy) − 2log x − log y
5. If log_a 2 = x and log_a 3 = y, express each of the following in terms of x and y:
a. log_a 27
b. log_a (2/9)
c. log_a ∛6
d. log_a (12a)
6. If f(x) = 2^x, determine the equations of g, h and k if
a. g is symmetrical to f about the X-axis
b. h is symmetrical to f about the Y-axis
c. k is symmetrical to f about the line y = x
7. If log_a b = 3, find log_b a.
8. Show that if y = r/(1 + ce^(−at)) then t = (1/a) ln(cy/(r − y)).
9. If f(x) = log_b x, find f(1/b) and f(√b).
10. If log_b (1/10) = −1/2, find b.
11. Simplify:
a. 4(e^x)⁴ + (x^e)⁴
b. (1/2)e^(2x) · 4e^(3/x)
c. 14e^(3x) / 7e^x
d. e^x · ∛(e^(−3x))
e. ln(e^(3x))
f. e^(ln 7x)
g. ln(9e^x · 10e^(2x))
h. e^(ln(x²/y))
i. 3ln x + ln(1/x)
j. ln e² + e^(−ln x)
12. Solve for x (without a calculator):
a. ln e^(4x) = 12
b. e^(4x) = 12
c. ln 4x = 12
d. e^(2 ln 3) = x
e. ln x = 3ln 2 − 2ln 3
f. (1/2)ln x = 1 − ln 2
GRAPHS
1. Sketch the graphs of the following functions:
a. f(x) = 2x + 3 on [−4, 3]
b. f(x) = |2x − 1|
c. f(x) = |x| + 1
d. f(x) = x² if x ≤ 1, and −x + 2 if x > 1
e. f(x) = x² − x − 6
f. f(x) = √(9 − x²) on [−3, 3]
g. f(x) = 2sin x + 1 on [−2π, π]
h. f(x) = cos(x/2) on [0, 2π]
i. f(x) = tan 2x on (−π/2, π/2)
2. Graph each function. Determine the domain of each function:
a. f(x) = ln x
b. g(x) = ln(−x)
c. h(x) = ln|x|
d. l(x) = |ln x|
e. m(x) = ln(1 + x)
f. n(x) = 1 + ln x
3. Graph each function. Determine the range of each function:
a. f(x) = e^x
b. g(x) = e^(−x)
c. h(x) = e^(|x|)
d. l(x) = |e^x|
e. m(x) = e^(1+x)
f. n(x) = 1 + e^x
LIMITS
Determine the following limits whenever they exist.
1. lim (x→0) (x + 1/x)
2. lim (x→1) (x² − 1)/(x² − 2x + 1)
3. lim (x→3) (x² − 9)/(x − 3)
4. lim (x→0) (x² − x)/x
5. lim (x→2) π/2
6. lim (x→0) |x|
7. lim (x→e) ln x²
8. lim (x→π/2) tan x
9. lim (x→π/2) (1 − sin x)
10. lim (x→2) (x³ + 4x + 1)
11. lim (x→∞) (x + 1/x)
12. lim (x→∞) (x² + x + 1)/(x³ + 2x² + 1)
13. lim (x→0⁺) ln x
14. lim (x→3⁻) −1/(3 − x)
15. lim (x→1⁺) (x² − 3x + 2)/(x² − 2x + 1)
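By way of illustration, No. 3 above is evaluated by factorising and cancelling:
\[
\lim_{x\to 3}\frac{x^2-9}{x-3} = \lim_{x\to 3}\frac{(x-3)(x+3)}{x-3} = \lim_{x\to 3}(x+3) = 6.
\]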
DIFFERENTIATION
Differentiate the following functions:
1. f(x) = 2x⁴ − 3x² + 5x + 2
2. f(x) = x^(−1/2) + x^(1/2)
3. f(x) = 3x² + (1/3)x⁻² + x
4. f(x) = x² − 3x^(7/3)
5. f(x) = x + 1 + 1/x²
6. f(x) = x − 1/x
7. f(x) = 12/x − 4/x³
8. f(x) = x + 1/x³ + 1/x⁴
9. f(x) = x⁴ − 8x³ + 2x² − x + 1
10. f(x) = 7x³ − 5x² + 3x − 17
11. f(x) = 9x⁻³ + 2x⁻² − 14
12. f(x) = −2x⁴ + x⁻² − 3x^(3/4)
13. f(x) = 12x⁴ + 3x³ + 5x⁻² − 4
14. f(x) = 4x⁻² − 7√x + 8x³ + 5
15. f(x) = 3x³ + 2x² − x + 1
16. f(x) = 4x³ − 7x² + 8x − 6
17. f(x) = x³ − 3x − 2x⁻⁴
18. f(x) = √x + 3 + 4x
19. f(x) = −3x⁻³ + 4x² + 1/x²
20. f(x) = x√x + 1/(x²√x)
21. f(x) = (x³ + 7x + 5)⁵
22. f(x) = √(x⁴ + x² + 2)
23. f(x) = (√x + 1)(x² + 1)
24. f(x) = √(x² + 2x + 3)
25. f(x) = x²(x³ − 1)
26. f(x) = (x³ − 3x)⁴
27. f(x) = x√(1 − x²)
28. f(x) = (2x − 4)(3x² + 2)
29. f(x) = (2x − 3)³(4x + 2)²
30. f(x) = (7x + 3)²(3x² − 14x + 5)
31. f(x) = (8x³ − 2)(3x² − 5x + 10)²
32. f(x) = (2x³ − 3)^(2/3)
33. f(x) = (3x² − 2x + 1)^(1/2)
34. f(x) = (2x − 3)(3x + 4)
35. f(x) = (2x³ − 1)(x⁴ + x)
36. f(x) = (1/x + 3)(x² − 5)
37. f(x) = (2x + 1)²(x² + 2)³
38. f(x) = √(3x² + 1)
39. f(x) = (2x − 5)³
40. f(x) = (6x − 5)⁻³
41. f(x) = (x² + x + 1)/(x² + 1)
42. f(x) = (2x + 3)/(3x + 2)
43. f(x) = (2x + 1)/(3x − 5)
44. f(x) = (x² + 5x − 1)/x²
45. f(x) = (x + 1)(x + 2)/((x − 1)(x − 2))
46. f(x) = (x² + 1)/(x² + 4)
47. f(x) = (x + x³)/√x
48. f(x) = (x + 1)/(x − 1)
49. f(x) = 10/(√x + 4)
50. f(x) = 8/(4 + x²)
51. f(x) = 4x/(x² + 1)
52. f(x) = 3/(2x + 1)²
53. f(x) = (3x² − 2x + 3)/(4x² − 5)
54. f(x) = (2x − 4)(3x + 5)/(2x² + 7)
55. f(x) = (2x − 3)/(x² + 2x)
56. f(x) = (4 − 2x + 3x²)/(x² + 2)
57. f(x) = 2x³ + (2 − x)/x³
58. f(x) = (x − 1)/(x + 1)
59. f(x) = x²/(x² + 1)
60. f(x) = 1/(x⁴ − 2x + 1)²
61. f(x) = sin(x + 2)
62. f(x) = sin(3x)
63. f(x) = cos(3x)
64. f(x) = cos(x⁴ + 7)
65. f(x) = tan(sin x)
66. f(x) = tan(x² + 5)
67. f(x) = sin(3x + 2)
68. f(x) = 2sin x − tan x
69. f(x) = 1 + x − cos x
70. f(x) = (sin x)/x
71. f(x) = tan(2x − x³)
72. f(x) = sin(3x² − 2x + 1)
73. f(x) = 3sin(x²) + 2sin(x) − sin(3)
74. f(x) = 3sin x − 2cos x
75. f(x) = −cos(πx − 1)
76. f(x) = cos(3x) − tan(3x)
77. f(x) = x² sin x
78. f(x) = cos(tan x)
79. f(x) = sin(2x + 3)
80. f(x) = sin(2πx)
81. f(x) = sin(1/x)
82. f(x) = x sin(1/x)
83. f(x) = cos(−x)
84. f(x) = tan π(1/2 − x)
85. f(x) = tan(x² + 1)
86. f(x) = tan(5x)
87. f(x) = x cos x
88. f(x) = sin(x/(1 + x))
89. f(x) = 4sin²(3x)
90. f(x) = (cos x)/(sin x)
91. f(x) = (x² + 3) sin x
92. f(x) = √(sin x)
93. f(x) = cos²(x³)
94. f(x) = x cos(5x)
95. f(x) = sin(2x) cos(3x)
96. f(x) = sin²(3x) + cos²(5x)
97. f(x) = x² tan(1/x)
98. f(x) = (sin x − x cos x)⁻¹
99. f(x) = (sin x + cos x)/tan x
100. f(x) = sin²(x + 1/x) + cos²(x + 1/x)
101. f(x) = x²/tan x
102. f(x) = cos(2x)
103. f(x) = sin²(x − 3)
104. f(x) = cos³(x² − x)
105. f(x) = sin x cos²x
106. f(x) = sin^(1/3)(2x)
107. f(x) = 4sin⁷(2 − 4x)
108. f(x) = 2sin x + 3sin²x
109. f(x) = 3cos³x + 4cos²x − 6
110. f(x) = 5cos²(x + 2) + 3cos(x + 2) − 5
111. f(x) = 5x² − 3tan²x + sin x
113. f(x) = (sin(x + 1))^(3/2)
114. f(x) = sin²x + cos²x
115. f(x) = √(sin x) + sin x
116. f(x) = (sin x)(sin x + cos x)
117. f(x) = 2sin x cos x
118. f(x) = tan x/(1 − tan x) + 1/sin x
119. f(x) = cos²x sin x
120. f(x) = √(1 − sin²x)
121. f(x) = x + tan²x
122. f(x) = x/cos x
123. f(x) = tan x cos x
124. f(x) = sin x cos x
125. f(x) = e^(x²)
126. f(x) = e^(5x)
127. f(x) = (x² + 3x)e^x
128. f(x) = xe^x − e^(−x)
129. f(x) = e^(x²) · e^(x+1)
130. f(x) = e^(x²)/e^(x−1)
131. f(x) = e^(−x)/(1 + e^(2x))
132. f(x) = (2 − e^(2x))/(3x − 1)
133. f(x) = e^(√(2x))
134. f(x) = e³/e^x
135. f(x) = e^(3x) + 2e^(2x) − 3e^x + 7
136. f(x) = e^(x²−2)
137. f(x) = e^x − 4e^(−x)
138. f(x) = cos(e^x)
139. f(x) = 3e^(2x) − 4e^x + 1
140. f(x) = e³ cos(2x)
141. f(x) = e^(−2x) + 4e^(−3x) + 7
142. f(x) = e^(2x+1)
143. f(x) = (1/2)e^(2x)
144. f(x) = e^(sin x)
145. f(x) = e^(2x)
146. f(x) = 2xe^x
147. f(x) = 1/(1 − e^(−x))
148. f(x) = e^(−x)/x
149. f(x) = x²e^(−x)
150. f(x) = e^(−1/x²)
151. f(x) = e^(x²+1)
152. f(x) = (e^x − e^(−x))/2
153. f(x) = e^(2x) cos(3x)
154. f(x) = e^(cos(4x))
155. f(x) = x² · 2^x
156. f(x) = 3^(5x)
157. f(x) = x⁴ + 4^x
158. f(x) = 9^(−x)
159. f(x) = tan(5^x)
160. f(x) = 3^(4x+1) + 2^(4x+2)
161. f(x) = 3^(x²+1)
162. f(x) = 2^(√x)
163. f(x) = 2^(−x)
164. f(x) = (1/2)^x
165. f(x) = e^x ln x
166. f(x) = ln(sin x)
167. f(x) = ln(1/x)
168. f(x) = ln(3xe^(−x))
169. f(x) = ln((x − 1)/(x² + 1))
170. f(x) = ln(e^x/(1 + e^x))
171. f(x) = ln(e^(sin 2x))
172. f(x) = ln(x/(2x + 1))
173. f(x) = ln(x²)
174. f(x) = ln(10x)
175. f(x) = ln(10^x)
176. f(x) = ln(3x) + 4ln x
177. f(x) = x² ln(2x)
178. f(x) = ln(x⁻¹)
179. f(x) = x ln x
180. f(x) = ln(1/x)
181. f(x) = (ln x)³
182. f(x) = x ln(√x)
183. f(x) = ln(7x)
184. f(x) = (ln x)^(1/2)
IMPLICIT DIFFERENTIATION
The following equations define y implicitly as a function of x. Determine dy/dx in terms of x and y in each case.
1. x² + y² = 100
2. x³ − y³ = 6xy
3. x²y + 3xy³ − x = 3
4. x³y² − 5x²y + x = 1
5. x² = (x + y)/(x − y)
6. √(xy) + 1 = y
7. (x² + 3y²)^(3/5) = x
8. cos(xy) = y
9. sin(x²y²) = x
10. tan³(xy² + y) = x
11. 3 + tan(xy) − 2 = 0
ANTIDERIVATIVES
Find the most general form of the function f satisfying the following:
1. f′(x) = x⁸
2. f′(x) = 1/x⁶
3. f′(x) = x^(5/7)
4. f′(x) = ∛(x²)
5. f′(x) = 4/∛x
6. f′(x) = 1/(2x³)
7. f′(x) = x√x
8. f′(x) = x³ − 2x + 7
9. f′(x) = x⁻³ + x − 3x^(1/4) + x²
10. f′(x) = x^(2/3) − 4x^(1/5) + 4
11. f′(x) = (7 − 3x + 4√x)/x^(3/4)
12. f′(x) = x(1 + x³)
13. f′(x) = (1 + x²)(2 − x)
14. f′(x) = x^(1/3)(2 − x)²
15. f′(x) = 1/x² − cos x
16. f′(x) = 4sin x + 2cos x
17. f′(x) = e^x
18. f′(x) = xe^(x²)
19. f′(x) = 2/x
DIFFERENTIAL EQUATIONS
Find the solution of each of the following differential equations with initial values.
1. dy/dx = x/2, y(1/2) = −1
2. dy/dx = −(3/2)x², y(−1) = −1/2
3. dy/dx = sin x, y(π/2) = 3
4. dy/dx = e^x, y(0) = 4
5. dy/dx = 1/x, y(1) = 2
6. d²y/dx² = 0, dy/dx(2) = 1, y(2) = 2
7. d²y/dx² = cos x, dy/dx(0) = 1, y(0) = 2
8. d²y/dx² = e^x, dy/dx(1) = e, y(1) = −4e
A GRAPHS
Sketch the graphs of the following pairs of functions, each time on the same set of axes:
1. f(x) = sinh x and g(x) = 2sinh x + 1
2. f(x) = cosh x and g(x) = cosh(x/2)
3. f(x) = tanh x and g(x) = tanh 2x
4. f(x) = ln x and g(x) = ln 2x
5. f(x) = ln x and g(x) = ln(x + 2)
6. f(x) = ln x and g(x) = 2ln x
7. f(x) = e^x and g(x) = e^(2x)
8. f(x) = e^x and g(x) = ln x
9. f(x) = sin x and g(x) = arcsin x
10. f(x) = cos x and g(x) = arccos x
11. f(x) = tan x and g(x) = arctan x
12. f(x) = arcsin x and g(x) = arcsin(x + 1)
B DIFFERENTIATION
Differentiate the following functions:
1. f(x) = sinh(x² + 4x)
2. f(x) = sinh(7x)
3. f(x) = cosh(x⁴ + 5x + 7)
4. f(x) = tanh(sinh 2x)
5. f(x) = tanh(x² + 5)
6. f(x) = (sinh x)/x
7. f(x) = tanh(2x − x³)
8. f(x) = 3sinh(x²) + 2cosh(x) − tanh(3)
9. f(x) = 3sinh 4x − 2cosh 5x
10. f(x) = cosh(tanh x)
11. f(x) = cosh(1/x)
12. f(x) = tanh(1/x)
13. f(x) = sinh(−x)
14. f(x) = cosh(x/(1 + x))
15. f(x) = 4cosh²(3x)
16. f(x) = sinh²(x⁴)
17. f(x) = x² sinh(1/x)
18. f(x) = (sinh x − x cosh x)⁻²
19. f(x) = (sinh x + cosh x)/tanh x
20. f(x) = cosh(e^x)
21. f(x) = e³ cosh(2x)
22. f(x) = e^(sinh x)
23. f(x) = e^(6x) cosh(3x)
24. f(x) = e^(cosh(4x))
25. f(x) = tanh(5^x)
26. f(x) = ln(cosh 8x)
27. f(x) = ln(e^(cosh 2x))
28. f(x) = arcsin(2x)
29. f(x) = arccos(x²)
30. f(x) = x arcsin x
31. f(x) = arcsin(x/2)
32. f(x) = arcsin(2x − 3)
33. f(x) = 2x arctan x
34. f(x) = arctan(5x)
35. f(x) = arcsin(sin x)
36. f(x) = arctan(3x − 4)
37. f(x) = arccos(x/4)
C INTEGRATION BY INSPECTION
1. ∫ e^(5x) dx
2. ∫ x⁸ dx
3. ∫ (1/x⁶) dx
4. ∫ x^(5/7) dx
5. ∫ (4/∛x) dx
6. ∫ (1/(2x³)) dx
7. ∫ ∛(x²) dx
8. ∫ x√x dx
9. ∫ (x³ − 2x + 7) dx
10. ∫ (x⁻³ + x − 3x^(1/4) + x²) dx
11. ∫ (x^(2/3) − 4x^(1/5) + 4) dx
12. ∫ ((7 − 3x + 4√x)/x^(3/4)) dx
13. ∫ x(1 + x³) dx
14. ∫ (1 + x²)(2 − x) dx
15. ∫ x^(1/3)(2 − x)² dx
16. ∫ (1/x² − cos x) dx
17. ∫ [4sin x + 2cos x] dx
18. ∫ e^x dx
19. ∫ xe^(x²) dx
20. ∫ (2/x) dx
21. ∫ e^(5x) dx
22. ∫ (1/(3x)) dx
23. ∫ (1/(2x + 1)) dx
24. ∫ sinh 4x dx
25. ∫ sinh(4x + 6) dx
26. ∫ cosh(2x + 3) dx
27. ∫ tanh 3x dx
28. ∫ (x²/(x³ − 4)) dx
29. ∫ (5x⁴/(x⁵ + 1)) dx
30. ∫ ((t + 1)/√t) dt
31. ∫ (x³/(x² + 1)) dx
32. ∫ x²(2 − x³)⁴ dx
33. ∫ xe^(2x²) dx
34. ∫ sin θ (cos θ − 3)³ dθ
35. ∫ (sin √x/√x) dx
36. ∫ (sin θ/(1 + cos θ)) dθ
37. ∫ tan θ dθ
38. ∫ (1/(x ln x)) dx
39. ∫ ((ln x)/x) dx
D INTEGRATION BY PARTS
1. ∫ xe^(2x) dx
2. ∫ x cos x dx
3. ∫ x sin 4x dx
4. ∫ x ln x dx
5. ∫ x² cos 3x dx
6. ∫ x² sin 2x dx
7. ∫ (ln x)² dx
8. ∫ arcsin x dx
9. ∫ θ sec²θ dθ
10. ∫ t² ln t dt
11. ∫ e^(2θ) sin 3θ dθ
12. ∫ te^(−t) dt
13. ∫ √t ln t dt
14. ∫ e^(3θ) cos 2θ dθ
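By way of illustration, No. 1 above is integrated by parts with u = x and dv = e^(2x) dx:
\[
\int xe^{2x}\,dx = \frac{x}{2}e^{2x} - \frac{1}{2}\int e^{2x}\,dx = \frac{x}{2}e^{2x} - \frac{1}{4}e^{2x} + C.
\]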
Appendix B
Appendix C
Questionnaire: TMT1 Paper versus online
Please help us to improve our system. Please include your name. You are guaranteed that I will not use anything you say or write against you, but I will possibly ask you to explain your response if necessary. THANK YOU SO MUCH!
Name: ………………
1. Did you pass TMT1 the first time? (Yes / No)
2. Did you pass the paper version of the test? (Yes / No)
3. If you chose No in 2, say why. (Did not write out my answer properly / Did not prepare properly / Other)
4. Did you pass the online version of the test? (Yes / No / n.a.)
5. If you chose No in 4, say why. (The system failed / Guessed the answers / Too little time / Did not prepare properly / Answers keyed in incorrectly / Other)
6. How many times did you write the online version of the test? (1x / 2x / 3x / 4x / 5x)
7. Did you go for tutoring before attempting a next online TMT? (n.a. / Yes / No)
8. If you chose Yes in 7, whom did you ask for help? (Tutor / Friend / My lecturer / Another lecturer / Other)
9. If you wrote the online version more than once, where did you do it? (Math lab / Engineering lab / Other)
10. Indicate which of the two tests was the easiest. (Paper / Online / The same)
11. Explain.
12. Was it easy to get access to the MATH LAB? (Yes / No)
13. Was it easy to get the PASSWORD? (Yes / No)
14. Was the tutor in the lab able to assist in the technical side of WebCT? (Yes / No)
15. Do you think the TMT programme is of value in the calculus course? (Yes / No)
16. Any suggestions?
Thank you for your participation!
