The response of educators and school management team members on a curriculum delivery intervention programme

By

Simon Mangwato Nkwana

Submitted in partial fulfillment of the requirements for the degree

Magister Educationis

in Assessment and Quality Assurance

in the

Faculty of Education

University of Pretoria

Supervisor: Prof. WJ Fraser

September 2010

© University of Pretoria

Declaration Statement

I, Mangwato Simon Nkwana, declare that this work is entirely my own and that it is original. All the work of others and sources that I have used or quoted have been indicated and acknowledged by means of references. The material contained in this report has not been previously submitted at this University or any other educational institution for degree purposes.

Student’s Signature…………………… Date………………

Acknowledgements

I would like to extend my sincerest thanks and appreciation to the following people:

• My supervisor, Prof. W.J. Fraser, for his unwavering support, intellectual guidance and constructive criticism throughout the period of this investigation.

• Dr Mike van der Linde and Ms Rene Ehlers from the Department of Statistics of the University of Pretoria for their assistance with the statistical processing and preparation of the data sheets used in this investigation.

• The principal of the school that participated in the study for their positive response.

• Teachers who made this study a success by agreeing to give their time to complete the questionnaires.

• Special thanks to my parents, in absentia (may their souls rest in peace), who guided me from my childhood until the time they passed away.

• Special thanks to my wife and children who gave me an opportunity to concentrate on this investigation throughout my time of study.

• I dedicate this dissertation to my family for their spiritual and emotional support for the entire period of this investigation.

• The almighty God who gave me strength, wisdom and courage to complete this research.

ABSTRACT

The main purpose of this study was to investigate the responses of educators and school management team members on a Curriculum Delivery Intervention Programme (CDIP) in a township school. The study also examined the factors that might influence the opinions educators and school management team members hold about curriculum delivery. The participants were 60 educators teaching at an underperforming secondary school in Mpumalanga province. The secondary school was conveniently selected from eight under-performing schools in the Witbank area. The data were collected through a mixed-methods approach using questionnaires, individual interviews and an observational checklist. A 100% response rate was achieved.

The literature review was conducted to identify the main interventions contained in the CDIP. Educators and school management teams responded to curriculum delivery intervention questionnaires. Data were also collected by means of individual interviews and an observational checklist. The focus fell on the following main interventions: departmental support, professional development and classroom practices.

The data for the study were analysed using both quantitative and qualitative techniques.

The findings of the study show that both educators and SMT members agree about the value of the curriculum delivery intervention programme in the school. They also showed that the school had not been adequately supported with the necessary resources by the Department. The results also indicate that educators and SMT members disagree that skills development covers everybody in pursuit of educators’ development in the school.

The staff agreed that they have an important role to play in ensuring that the implementation of the CDIP is successful in their school. The study also revealed that the school did not have all the key and relevant documentation expected in the school files.

However, educators indicated that the Department is not objective enough when delivering the curriculum in the school. The results derived from the curriculum delivery intervention might therefore not be entirely accurate and reliable.

The responses of staff correspond with their experiences of curriculum delivery practices initiated by the Department. Most of them also indicated that they have not seen much change in learners’ performance since the CDIP started. The majority of staff members also indicated that factors such as a lack of furniture, the unavailability of learning materials, a lack of teacher professional training and development, large numbers of learners in classrooms and a shortage of educators in scarce subjects play a significant role in influencing the teaching performance of the school.

A number of recommendations were made for further research. The limitations of the study were also discussed.

Keywords: Curriculum delivery, Interventions, Departmental support, Professional development, Monitoring, Evaluation, Curriculum development, Dissemination of information, Curriculum management, Curriculum implementation, Feedback and Improvement of performance of learners.

TABLE OF CONTENTS

Chapter One
Problem statement, aims and objectives of the investigation 1
1.1 Introduction 1
1.2 Problem statement and rationale of the investigation 5
1.2.2 Rationale for the study 9
1.3 Research questions for the investigation 14
1.5 Theoretical framework underpinning the investigation 16
1.7 Outline of the dissertation 19

Chapter Two
Literature review in terms of curriculum delivery interventions in schools 22
2.2 Curriculum development in terms of curriculum delivery interventions 25
2.4 Curriculum implementation in the classroom practice 27
2.5 Professional development of educators 28
2.6 The departmental support/interventions 29
2.7 Classroom practices (learning environment) 30
2.12 Monitoring (supervision) and evaluation and feedback 37

Chapter Three
3.3.1.1 Content validation and development of the questionnaire 47
3.3.1.2 Research instrument 49
3.4 Population and sample who participated 50
3.4.2 Types of interviews and observational checklist conducted
3.5.1 Descriptive analysis of the quantitative data
3.7 Validity

Chapter Four
Analysis and interpretation of the results of the investigation 58
4.2 The results of the quantitative data of the investigation 59
4.2.2 The biographical information of the respondents who participated in the study 59
4.2.3 Results of the frequency analysis 62
4.3 Results of the frequency analysis of the observational checklist 81
4.4 Interpreting box and whisker plots 84
4.5 Analysis of the findings of the qualitative approach to the investigation 103
4.5.1 Introduction 103
4.5.3 Departmental support for CDIP 106
4.5.4 Integration of interventions into a daily programme of educators 110
4.5.5 Dissemination and implementation of information 112
4.5.6 Monitoring, evaluation and feedback 116

Chapter Five

5.2 Summary of the problem to the study 119
5.3 Sub-questions 120
5.4 Summary of the aims and objectives of the study 121
Findings of the literature
5.6 Findings of the empirical investigation on both the qualitative and quantitative study 124
5.6.6 Monitoring and evaluation (giving feedback) 129
5.7 The findings of both the quantitative and qualitative sections of the investigation 130
Recommendations for further research 133

Appendices

Appendix 1: Letter to the school principal
Appendix 2: Letter to the Circuit manager for permission 143
Appendix 3: Letter 144
Appendix 5: Ethical clearance 146
Appendix 6: Observational checklist 147

List of Tables

Table 3.1: Data sources and methods applied in the study 48
Table 3.2: Sources consulted in the content validation of the questionnaire 50
Table 4.1: Gender of the respondents who participated in the study 59
Table 4.2: Educational qualifications of the respondents who participated in the study 60
Table 4.3: Age of the respondents who participated in the study 60
Table 4.4: Teaching experience of the respondents who participated in the study 61
Table 4.5: Streams of the respondents who participated in the study 61
Table 4.6: Post level of the respondents who participated in the study 62
Table 4.7: Comments of the respondents regarding the curriculum design of the programme 63
Table 4.8: Respondents’ opinions regarding departmental support and professional development in the CDIP 65
Table 4.9: Extent to which the respondents’ views on the dissemination of information and the implementation programme were achieved 69
Table 4.10: Extent to which the respondents’ views on the integration of interventions, evaluation and monitoring were achieved 72
Table 4.11: Extent to which the respondents’ views on departmental support and professional development were achieved 74
Table 4.12: Extent to which the respondents’ views on the implementation of the programme and dissemination of information were achieved 76
Table 4.13: Extent to which the respondents’ views on the integration of interventions and monitoring and evaluation were achieved 78
Table 4.14: Extent to which the respondents’ views on the programme evaluation were achieved 80
Table 4.15: Frequency analysis of items/variables on the observer’s assessment of management and administration, by checking school documents without interviewing the school management team 82
Table 4.16: Frequency analysis of items/variables on the observer’s assessment of curriculum management, by also checking the school documents without interviewing the school management team 83
Table 4.17: Frequency analysis of items/variables on the observer’s assessment of monitoring and moderation, by also checking the school documents without interviewing the school management team 83

List of figures

Figure 4.1: The plot of the programme evaluation vs. SMT and educators 85
Figure 4.2: The plot of the departmental support and professional development vs. educational qualification 87
Figure 4.3: The plot of the integration of interventions, monitoring and evaluation vs. streams (science and general department) in the school 89
Figure 4.4: The plot of the departmental support and professional development vs. post level 91
Figure 4.5: The plot of the dissemination of information vs. post level 93
Figure 4.6: The plot of the integration of interventions, monitoring and evaluation vs. post level 95
Figure 4.7: The plot of departmental support and professional development vs. age 97
Figure 4.8: The plot of the programme evaluation vs. post level in the school 99
Figure 4.9: The plot of the programme evaluation vs. teaching experience 101

ABBREVIATIONS

CDIP: Curriculum Delivery Intervention Programme
OBE: Outcomes-Based Education
MDOE: Mpumalanga Department of Education
NDOE: National Department of Education
DOE: Department of Education
NCS: National Curriculum Statement
P: Participant
R: Respondent

Chapter 1

Problem statement, aims and objectives of the investigation

1.1 Introduction

A planned curriculum delivery programme usually contains a series of related operations, including projects, activities, strategies and interventions with short- and long-term objectives. Such a programme aims to improve educators’ performance and ensure improved quality in learner attainment. A programme such as the Curriculum Delivery Intervention Programme is specifically designed to improve the poor or mediocre performance of both educators and learners in underperforming schools.

Davidson (2005, p. 23) states that, in terms of the evolution of the human race, evaluation is possibly the most important activity enabling such evolution, improvement and survival in an ever-changing environment. He further emphasises that every time something new is tried – a farming method, manufacturing process, medical treatment, social change programme, new management policy, or information system – it is important to consider its value, questioning whether it is better than what had gone before or took evolution to the next level. Evaluating and developing projects such as a Curriculum Delivery Intervention Programme remains crucial in improving learner attainment, since all stakeholders need to be empowered in content knowledge, governance and management.

One of the former Ministers of Education, Ms Naledi Pandor, launched a strategic intervention for learner attainment through curriculum delivery intervention. This was developed to consolidate and complete the work started in the foundation, intermediate, senior, and further education and training bands through the national literacy and numeracy strategy, especially in FET institutions. In the FET phase, the focus is on improving the results of under-performing schools and providing all learners with a fair opportunity to succeed. All provincial departments of education have been instructed by the National Department of Education to institute their own provincial units to deal with the unacceptable performance experienced in their respective provinces.

This strategic intervention programme, which focuses on curriculum delivery intervention, is currently running in the Mpumalanga province, following an outcry from both educators and the public that the provincial Department of Education is not giving schools, and educators in particular, enough support.

The Department of Education (2007, p. 9) indicated that a programme is required to deal with the developmental needs of the democratic state, particularly in terms of the legacy of the inherited and fragmented education and training system. It also highlighted other problems, including the lack of common standards across the system, unequal learning opportunities and allocations of resources across racial groups, together with an irrelevant and outdated curriculum and dysfunctional and underperforming institutions.

The Mpumalanga Department of Education (2006, p. 7) expresses itself as follows regarding the strategic plan of the department:

“This national strategy for learner attainment through educators’ performance is founded on the strategic plan and recapitalisation programme of Department of Education.”

It further explains the pursuit of quality education as follows:

“The pursuit of quality in education is one of the fundamental drivers in the education transformation process and is one of the various indicators of quality and learner performance that are indicated as a key determinant of quality.”

This strategic intervention for learner attainment seeks actively to tackle under-performance by learners and teachers in all schools (Mpumalanga Department of Education, 2008, p. 13).

In terms of educator performance, Ornstein and Hunkins (2004, p. 321) indicate that, as much as educators accept a new programme, for it to be successful students must also be willing to participate. They further state that there is still limited research to guide one on how to involve students but suggest that such involvement may be facilitated by student reaction to innovation.

Kgoseng (2007, p. 4) warns that the results of high-stakes examinations such as those for Grade 12 (which play a crucial role in the South African education system) attract a great deal of public interest concerning the credibility of the examination.

The researcher further states that achievement in the senior certificate should not be regarded as sufficient reason to ignore the need for quality schooling, nor should it detract from the requirement for other factors to be taken into consideration, such as laying a good basis in the foundation and senior phases. According to the Department of Education (2005, p. 6), emphasis is also placed on the need for authorities to develop an understanding of what happens in primary schools and the lower grades of secondary schools.

The Department of Education (2006, p. 21) states that, in terms of teacher development, the objective of addressing teacher shortages goes together with addressing the needs of the existing corps of educators: their competence, currency, retention in the system and support in the process of life-long learning. Such support should include all the available resources, such as education, face-to-face engagement and technology in mixed-model delivery.

According to a newsletter in the province (Mpumalanga Department of Education, 2007, p. 23), the commitment and dedication of all role-players in education have contributed to an encouraging output. In particular, it notes that 5,481 learners out of the 39,040 who sat for the 2006 Grade 12 examinations managed to obtain matriculation exemption, although this figure was lower in 2007. It does, however, indicate that 563 more exemptions were obtained in the 2006 examinations compared to the 4,914 achieved in 2005.

The Curriculum Delivery Intervention Programme designed to enhance learner attainment through educator performance that constitutes this study includes all learning activities required to facilitate changes in behaviour and effectiveness. Bradley (2001, p. 114) states that if these learning activities are to be carried out efficiently and effectively, evaluation should be used during the learning process to ensure that they meet the appropriate objectives.

The Mpumalanga Department of Education (2008, p. 11) showed that the focus areas of the intervention programme should be based on a six-pillar strategy that takes into account short-to-medium and long-term considerations. This is closely linked to the theoretical framework of Stufflebeam’s evaluation model (context, input, process and product). The six pillars underpinning the outreach programme’s support to educators are:

• quality assurance and professional development (as an input evaluation model);

• educator support and professional development (as an input evaluation model);

• resourcing (both human and physical);

• teaching and learning environment (regarded as the process evaluation model);

• stakeholder involvement (Anglo-Coal Mines, Xstrata Group and Old Mutual);

• learning environment (as contextual evaluation model).

The potential barriers of the above-mentioned six-pillar strategy to the evaluation project are the following:

• Resources such as time and money;

• skills, or lack thereof;

• obtaining permission(s) from the research area.

The Mpumalanga Department of Education (2008, p. 11) further states that, in order to achieve good results, monitoring and evaluation should be planned by the province’s Department of Education in such a way that it fulfills the following criteria:


• The Curriculum Deputy Director-General will oversee the performance of the project through monthly reports to senior management, starting from the school level and continuing through circuit and regional to provincial level;

• the progress report of the implementation of the plan shall be a standing agenda item in all senior management, regional management team and circuit meetings;

• under-performing schools will be prioritised for high-impact monitoring with, moreover, support by multifunctional task teams;

• there is a unit (Ayihlome Ifunde) in the province assigned to monitoring and supporting closely the under-performing schools during the implementation of the improvement plans; it will therefore be imperative for all branches to feed this detachment with information in relation to support programmes for these under-performing schools.

1.2 Problem statement and rationale of the investigation

1.2.1 Problem statement

Mpumalanga Department of Education (2009, p.18) explains the main problem in the province as follows:

“According to the results of a progression analysis, the results clearly indicated that the academic performance of learners, especially in the primary schools, is very poor. One reason for the poor performance in schools is the lack of effective management of curriculum and delivery of curriculum in schools by the school managers in the province. Not all school managers have been trained on the National Curriculum Statement and how to manage its implementation. As a result, conflict between teachers and school managers in schools is rife from lack of effective curriculum management.”

Van der Westhuizen (2003, p. 326) argues that there is a concern that the linkage between Total Quality Management (TQM) and improved learning outcomes may not be clear or may even be non-existent. Patterson (2005, p. 13) indicates that establishing and maintaining an organisational culture that supports and sustains change requires at least four steps: developing a series of belief statements, determining their implications, putting the implications into practice and revisiting the belief statements and implications regularly to ensure that the organisational culture is being preserved and renewed. This concern originates from the assumption that TQM may be relevant for the delivery of services such as resources and programmes to schools. Supporting the statement above, Van der Westhuizen (2002, p. 142) contends that establishing belief statements need not take long: five or six statements that have the support of the administration, teaching and support staff can affect a school significantly.

The researcher’s problem is that we do not know how educators and the school management team respond to an intervention programme and the delivery of the curriculum in schools. Van der Westhuizen (2002, p. 311) emphasises that the need for leadership is underlined by the fact that Total Quality Management has probably generated more failures than successes, caused largely by the reluctance of people to change. He further indicates that this is why leadership is so important in the implementation of change. The researcher wanted to know and understand how educators experienced, implemented and managed such a programme in a school. The problem stated above explains why these interventions were introduced in the Mpumalanga Department of Education: the intention was to hear the responses of educators and school management team members on the curriculum delivery intervention in schools in the province. This led the Mpumalanga Department of Education to address the issues related to mediocre performance by introducing the Curriculum Delivery Intervention Programme for the following reasons:

• The quality of the pass rate in Grade 12 is poor;

• the intake of learners in mathematics and physical science is low;

• the repetition rate is above the national average of 10% (at 18.5% in Grades 10 and 11);

• there is a high rate of drop-out at the exit point;

• the pass rates are lowest in Grade 1 at 77.3% and only slightly better in Grade 3 at 80%; and

• the pass rate in the foundation phases is lowest at 78.5%.


The Department of Education (2005, p. 12) states that the building of school infrastructure marks one of the responses of the Department of Education to deal with development in schools and to address mediocre performance in underperforming schools. The Mpumalanga Department of Education (2007, p. 4) says that this is a problem that is worrying everybody, including the DOE. This is because, while the DOE is providing schools with resources, it still expects, in terms of the conceptualisation of educational indicators at school level, that input should be equal to output. Scheerens, Glas and Thomas (2003, p. 181) argue that, in terms of input evaluation, the actual financial resources of a programme or school may be described and judged according to the level that is thought to be necessary in order to keep the system running, while output can be judged according to prefixed attainment levels.

Ramurath (2007, p. 14) states that staff did not have enough equipment then and had to find their own information to use in the classroom. She also claims that there are just some things that cannot be improvised. Now, because of the workshops and teachers’ guides, they can perform these tasks and take advantage of such experience before going to class. These resources are precious to them, she concludes, and as a result they take good care of them.

Education spokesperson Lunga Ngqengelele, in Ramurath (2007, p. 16), explains that the report will help the departmental officials to understand why some schools in rural areas do better than others despite having relatively fewer resources than the township schools.

The problems experienced during the implementation of this Curriculum Delivery Intervention Programme emanate from three major areas, namely classroom practices, professional development and departmental support, and should be dealt with effectively by the departmental officials. The problems relating to each of these major areas are discussed below:


Classroom practices:

• The SMT members are no longer able to visit classes unexpectedly to see what is happening between educators and learners.

• Unions and employers prefer school managers to visit classrooms through IQMS, which has proved ineffective to date.

• Shortage of qualified educators for English, Mathematics and Science, and facilities such as desks and chairs etc.

• The problem of overcrowding in classrooms.

• Usage of intoxicating substances by the learners.

• English as the language of teaching and learning.

• Parents’ involvement in taking care of education of their children.

Professional development:

• Schools have no norms of collegiality, where there is an expectation of shared work in a co-operative atmosphere for all educators.

• Educators and SMT members do not frequently observe one another for feedback, reflection and support regarding the teaching process.

• Lack of content knowledge by the educators.

• Staff members have no common, coherent set of goals and objectives that they have helped formulate, reflecting high expectations of themselves and their learners.

• Demoralised educators and lack of inspiration.

• No commitment among educators.

• Not enough training of educators by department officials.

Departmental support:

• Lack of a qualified school SMT and of knowledge or inspiration of others.

• Schools are not fully staffed with competent educators.

• Inexperience in the development and implementation of school policies.

• Lack of knowledge in terms of co-ordination of all activities in the schools.


• The physical environment of schools is not conducive to teaching and learning.

• Inexperience of curriculum implementers in terms of content knowledge to workshop educators.

• Unions and principals do not support the mentorship.

1.2.2 The rationale for the study.

The rationale for this study is based on a curriculum delivery intervention programme designed to deliver educational support to schools in the Mpumalanga Department of Education in all four regions of the province. Meyer (2002, p. 149) explains that ETD practitioners need to plan a curriculum which delivers a related learning programme within the structured workplace or any other learning context. He further says that they require the unit standard ‘plan a curriculum’ in order to plan a curriculum within their occupational competence. A Curriculum Delivery Intervention Programme with an optimal design for supporting schools in the province with all necessary resources must be delivered; it must be implemented throughout the region if it is to have any impact on the achievement of learners’ learning. Much is planned and developed, but it often does not get implemented because of a lack of a plan for dispersal throughout the education system, or rather school systems.

Frequently a new and innovative programme such as the CDIP is blunted at the gates of schools. The other main reason that may cause new curriculum development and delivery to miscarry is that implementation has not been considered critical in the departments of education in the nine provinces of the country. Van der Westhuizen (2003, p. 313) explains that this means an individual teacher could apply the quality process in the classroom, but would need the support and commitment of the school system’s leaders to introduce a viable quality improvement process. He further indicates that unless the staff see a genuine commitment to quality in the behaviour of the top team, improvement is unlikely to be implemented from below. A very interesting fact is that many individuals in the Department of Education responsible for curriculum do not possess a macro view of the process or realise that innovations need very careful planning and monitoring. Carl (1997, p. 169) adds that macro-implementation is the application of policy and curriculum initiatives as determined at national levels by curriculum authorities.


The Curriculum Delivery Intervention Programme is driven or governed by departmental support, which is expected from the departmental officials, because if there is no support one cannot expect good performance from schools or learners. Curriculum designers need to provide the necessary support for their recommended programme modifications to facilitate their rapid implementation. They have to do this to build self-confidence among those affected, such as the principal, educators and the school governing body. Clarke (2007, p. 173) emphasises that this is a matter which should be at the forefront of the minds of everyone who cares about quality education, and should particularly occupy the minds of principals, departmental officials and politically effective heads of education, who must come up with a way to deal with it. He continues by saying that, in terms of the estimate of operational costs by the Department of Education, Section 21 schools have a relatively free hand as to how they spend their budget. However, Section 20 schools have to use the state-allocated portion of their budget in accordance with set guidelines on the division between learning support materials (textbooks, library books, charts, etc.) and non-learning support materials (learner desks and chairs, copier machines, computers, etc.).

The support or delivery underpinned in the programme (CDIP) concerns the dissemination of information, wherein communication plays an important role. It is almost an axiom that, whenever a new programme is being designed, communication channels must be kept open so that the new programme does not come as a surprise to its customers. Continuous discussion about a new programme among the departmental officials, the principals and educators is a key to successful implementation of the programme. Carl (1997, p. 167) confirms the above-mentioned statements: the real measure of success during this application phase is largely determined by the quality of the planning, design and dissemination done beforehand. He further says that the success of implementation may be assured if the dissemination has been effective and specific strategies have also been followed during implementation.

The other important aspect to be looked into is whether educators can use what they have received or learned through workshops and whether they are able to integrate all the curricula offered in the CDIP by the Department into the daily programme of their schools. Integrated interventions involve estimating how much intervention consumers (learners, educators and school managers) will need in future and then identifying a mix of appropriate sources and forms of intervention to meet those needs in the most efficient and socially beneficial manner. Piaget (in Ornstein & Hunkins, 2004, p. 110) explains that integration ‘refers to horizontal relationships of curriculum experience’, which means that the organisation of experiences should be unified in relation to other elements of the curriculum being taught and that subjects should not be isolated or taught as single courses separate from the rest of the subjects.

The Integrated Quality Management System (IQMS), which is the national Department of Education’s model for school improvement, uses the results of the development appraisal (DA) of individual teachers as the basis for the development of the school improvement plan (SIP), which it describes as a blueprint for the actions and processes needed to produce school improvement. Clarke (2007, p. 132) explains that, to integrate an intervention programme such as the CDIP, schools need to comply with the following steps:

• The responsibility for developing the school’s improvement plan rests with the school development team (SDT), which is made up of the principal, the whole-school evaluation coordinator, democratically elected members of the school management and an elected post-level 1 educator.

• Essentially, this group uses information provided by each teacher’s development support group (DSG) to identify the development needs of every teacher or SMT member.

• They work with individual teachers and are required to mentor them and provide them with support. In addition, they must help them develop and refine a personal growth plan (PGP) for the school.

The other support that all educators expect to see is the support the Department offers their school to enable them to improve the school’s performance in terms of inputs such as funds, learner support materials, human resources and physical resources. Carl (1997, p. 49) confirms that curriculum support is a stage in curriculum development during which the curriculum consumers are prepared for the intended implementation and support reaches a critical point. He further says this support is provided through the distribution or publication of information, ideas and notions on in-service training to prepare all those involved and to inform them of the proposed curriculum.

According to Adam (1997, p. 4) in Steyn and Van Niekerk (2007, p. 224), professional development covers a variety of activities, all of which are designed to enhance the growth and professional competence of staff members. Educators’ participation in professional development can enhance the success and effectiveness of the professional development programme.

Teacher training in classroom management should be a regular component of any school’s teacher professional development programme. Clarke (2007, p. 95) emphasises that most experienced teachers benefit from being reminded of the range of strategies that are available to them. He states that teachers also have a valuable role to play in mentoring colleagues (particularly those who are new to the school) in the strategies and techniques of classroom discipline, and in the school’s preferred approach to dealing with non-compliant learners.

Arguably, quality teaching and learning at schools depends on staffing. Van Deventer and Kruger (2003, p. 220) state that, when identifying development needs, it is essential to decide which of the problems are the most important and should receive attention first. According to Clarke (2007, p. 95), if the focus of professional development for the staff during a particular period is to be classroom management, it is important that lesson visits focus on this aspect of teaching. With this in mind, visiting teachers should be asked to note how the educator they are observing manages a particular aspect of student behaviour or particular phases of the lesson.

This study contributes to the existing knowledge base on curriculum delivery intervention in a number of ways. Firstly, it reveals how the Mpumalanga Department of Education is having difficulties with curriculum delivery in schools in the province in the current context of trying to take education to a higher level. Secondly, it intends to inform the Department of Education about how curriculum delivery needs to be approached in the current context of transformation in the province. Thirdly, it proposes an explanation for the different meanings ascribed to the Curriculum Delivery Intervention Programme within the framework of the existing theory on professional development, classroom practices and departmental support.

In the broader sense, the study contributes to an African perspective regarding the implementation and development of curriculum in education. The unique context of this study is the lack of capacity of both regional and provincial departments to fast-track or carry out curriculum delivery in terms of implementation. Ornstein and Hunkins (2004, p. 323) confirm that in schools where there has been successful change, the curriculum directors assist teachers and principals by furnishing pedagogic and curricular knowledge. They further state that teachers within the system expect these people to keep abreast of the latest research and theories on any particular innovation and to communicate these insights to the school staff through feedback.

The other key question is based on the reporting or immediate feedback after the evaluation or checking of all existing documentation in the school. Feedback is information given in response to a person’s performance of a task, used as a basis for improvement. Scheerens, Glas and Thomas (2003, p. 12) confirm that programme evaluation is designed to have both formative and summative elements; the former is close to the improvement perspective and the latter close to the accountability perspective. They further emphasise that it is aimed at providing feedback that is relevant to support and improve the process of implementation.

Where special interventions have been put in place in an effort to improve results, any improvement in results needs to be justified in terms of better performance rather than less rigorous assessment tasks. The monitoring and reporting of results in this way is a key element of the management of good teaching and learning. Scheerens, Glas and Thomas (2003, p. 31) explain that learning, feedback, the formative role of evaluation, intrinsic interest in process and a methodology that is controllable by the teacher are the central characteristics. They further say that external school evaluation is more likely to be accountability-oriented and that internal evaluation is more likely to be improvement-oriented, although exceptions may occur, as when a school deploys an external consultant to review, for instance, its managerial structure.

1.3 Research questions for the investigation

The research question that guided this investigation reads as follows:

“How do educators and SMT members respond to the curriculum delivery intervention programme in township schools in terms of departmental support, professional development and classroom practices?”

The main question the researcher wants to address is how educators and the school management team members respond to the Curriculum Delivery Intervention Programme in the school. To explain and/or explore the main research topic according to the purpose noted above, seven secondary questions guided the inquiry into the Curriculum Delivery Intervention Programme (CDIP):

• How were educators informed about the implementation of the CDIP?

• What training have they received regarding the implementation of the curriculum delivery programme?

• What support did they get from Mpumalanga Department of Education?

• How did they implement the Curriculum Delivery Intervention Programme?

• How was the implementation monitored?

• How often did they receive feedback from programme evaluation?

• How was the Curriculum Delivery Intervention Programme integrated into the daily programme of the educators?

1.4 Aims and objectives

The aim of this study was to explore and/or explain how educators and the school management team members responded to a curriculum delivery intervention programme in terms of departmental support, professional development and classroom practices in order to improve the quality of teaching and learning in township schools. It also aimed to provide the researcher/evaluator with a list of the intended outcomes for the curriculum delivery intervention programme.

Aspects such as professional development, departmental support and classroom practices from 2007 to 2010 will be focused on to ensure that every school in the country provides every learner in the system with a fair chance to succeed at any level. This will be achieved by targeting the following intended outcomes of the curriculum delivery intervention programme. The following outcomes were envisaged in the investigation:

• To establish how educators are informed about the implementation of the Curriculum Delivery Intervention Programme (CDIP).

• To determine what training educators received regarding the implementation of the curriculum delivery intervention programme.

• To determine what support educators received.

• To establish how the implementation of the Curriculum Delivery Intervention Programme (CDIP) was achieved.

• To establish how the implementation is monitored.

• To establish how often educators receive feedback from programme evaluation.

• To determine how the Curriculum Delivery Intervention Programme (CDIP) is integrated into the daily programme of the educators.

The Department of Education has been facilitating the intervention for only four years. This study therefore sets out to contribute to the knowledge regarding the monitoring and evaluation that has underpinned the literature review on core functions, basic data and evaluation objects, which form the conceptual framework of learner attainment in the South African school environment.


1.5 Theoretical framework underpinning the investigation

Scientific-positivistic evaluation models inform this research. These assert that differences might be harnessed productively and used to unify social groupings, while contributing something essentially new and creative that did not exist before.

Ornstein and Hunkins (2004, p. 40) argue that the scientific-positivistic evaluation models have taken educational evaluation as the beginning of the modern era of programme evaluation. They further say that the evaluation plan was organised in four sequential steps as follows:

• Focusing on the goals and objectives of the programme.

• Classifying objectives.

• Defining objectives in behavioural terms.

• Finding situations in which achievements are shown.

Davidson (2005, p. 11) states that, in terms of the evolution of the human race, evaluation is possibly the most important activity that has allowed us to grow, improve things and survive in an ever-changing environment. In this case, to change or turn around the situation through curriculum delivery in the classroom is to improve the performance of schools. This will only be possible if the DOE provides the schools with the relevant resources. Aspects such as the dissemination of information, curriculum development, monitoring and evaluation, departmental support, professional development and classroom practices form the theoretical framework for the study.

Ornstein and Hunkins (2004, p. 342) show that there are three theoretical assertions about decision management/orientation. This approach to educational evaluation is the one presented by Daniel Stufflebeam. They further indicate that Stufflebeam’s approach to evaluation is recognised as the CIPP (context, input, process and product) model. Clarke (2007, p. 47) explains that departmental support will be through curriculum implementation in terms of learning support materials and physical and human resources, considered as the inputs in the context (schools). Stufflebeam’s approach to educational evaluation is explained in Ornstein and Hunkins (2004, p. 343) as follows:

“Contextual evaluation is the educational evaluation that involves studying the environment (school environment) of the programme; its purpose is to define the relevant environment and portray the desired and actual conditions pertaining to that environment. Input evaluation is designed to provide information and determine how to utilise resources (such as human resources, physical resources, learning support materials and professional development) to meet programme goals and assess the school’s capabilities to carry out the tasks of evaluation: evaluators consider the strategies suggested for achieving programme goals and identify the means by which a selected strategy will be implemented. Process evaluation is defined as a stage addressing the implementation of curriculum decisions that control and manage the programme, used to determine the congruency between the planned and actual activities (teaching and learning in the classroom) in classroom practice. Lastly, product evaluation involves evaluators gathering data to determine whether the final curriculum product now in use is accomplishing what they had hoped and to what extent the objectives created are being attained. Product evaluation provides evaluators with information that will enable them to decide whether to continue, terminate, or modify the new curriculum. It allows them to link actions at this stage of the model to other stages of the total change process.”

This comprehensive model considers evaluation to be a continuous process. The researchers contend that information is provided to management for the purpose of decision-making through a three-step process: firstly, delineating the information necessary for the collection of data on the implementation of the programme; secondly, obtaining the information; and finally, disseminating it to the interested parties. In this study the model is based on Stufflebeam’s approach in terms of the input that was directed by departmental support, professional development and educators’ practices in the classroom in order to improve the educators’ performance (which was indicated by learner performance in the school).

This means that all four of Stufflebeam’s approach models mentioned above revolve around a decision-management orientation to educational evaluation in the school. The information is provided to management for the purpose of decision-making with the aid of resources so that the implementation (process evaluation) and achievement (product evaluation) can take place in the school situation (context), supported in terms of Curriculum Delivery Interventions by the departmental officials.

In terms of Daniel Stufflebeam’s model, the departmental officials should provide schools with resources such as finances, learner support materials and equipment, and school management team members should ensure that there is proper planning so that implementation takes place in the school. The effective use of inputs such as finances, learner support materials and equipment will help the school to improve its performance; if not, it will be clear that there was no effective planning during the implementation (process), which means there will be no good results (product).

1.6 Research approach

It was decided to follow a multiple-methods research approach in this investigation. These methods will be discussed in depth in Chapter 3 of this investigation.

Firstly, the approach can be characterised as collaborative, as the researcher worked closely with the underperforming schools where the curriculum delivery intervention programme operates, and particularly with the educators and school management team, to facilitate tailoring and timeliness and to maximise the usefulness of curriculum delivery in the underperforming school. Creswell (2005, p. 589) explains the term collaborative as working jointly with others; this is central to action research and involves actively participating with others in research.

Secondly, the evaluation approach may also be considered a mixed methods approach because it employs both qualitative (e.g. observation, individual interview) and quantitative (e.g. questionnaire) measures.

Quantitative methods provide numerical representations of outcomes that can be used to assess accomplishment against goals, standards or targets; qualitative data, on the other hand, provide rich information that can be used to examine phenomena not readily amenable to quantitative exploration and/or to provide a contextualised, more complete explanation of the phenomenon or programme under study.
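To illustrate the quantitative side of this design, the sketch below shows how a simple frequency analysis and the five-number summary underlying a box-and-whisker plot (both reported on in Chapter 4) can be computed. This is a minimal illustration in Python, not the procedure actually used in this study (the statistical processing was performed by the Department of Statistics of the University of Pretoria), and the item responses shown are hypothetical.

    from collections import Counter
    import statistics

    # Hypothetical Likert-type responses (1 = strongly disagree ... 4 = strongly agree)
    # to one questionnaire item, one value per respondent (60 respondents).
    responses = [4, 3, 3, 2, 4, 1, 3, 4, 2, 3, 4, 4, 3, 2, 1, 3, 4, 3, 2, 4,
                 3, 3, 4, 2, 3, 4, 1, 3, 2, 4, 3, 4, 3, 3, 2, 4, 3, 1, 4, 3,
                 2, 3, 4, 3, 3, 2, 4, 3, 4, 2, 3, 4, 3, 2, 4, 3, 3, 4, 2, 3]

    # Frequency analysis: count and percentage per response category.
    counts = Counter(responses)
    for category in sorted(counts):
        percentage = 100 * counts[category] / len(responses)
        print(f"Category {category}: n = {counts[category]} ({percentage:.1f}%)")

    # Five-number summary used to draw a box-and-whisker plot.
    ordered = sorted(responses)
    q1, median, q3 = statistics.quantiles(ordered, n=4)
    print("min:", ordered[0], "Q1:", q1, "median:", median, "Q3:", q3, "max:", ordered[-1])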


1.7 Outline of the dissertation

The dissertation is structured into five chapters. Chapter 1 presents the problem statement, aims and objectives of the study. Chapter 2 provides the literature review (Curriculum Delivery Interventions) as the conceptual framework of the study. Chapter 3 outlines the methodology. Chapter 4 presents the analysis of the investigation using tables and graphs (box plots), and Chapter 5 provides the main findings, recommendations and implications.

Chapter 1 presents the problem statement, aims and objectives of this study. The Mpumalanga Department of Education was forced to introduce a Curriculum Delivery Intervention Programme with the intention of addressing the mediocre performances experienced in the province. The main problem identified in the province by the educational leaders is that the academic performance of learners in the primary schools is very poor, and this accounts for the poor results in secondary school education. Poor performance in both primary and secondary schools is the result of a lack of effective curriculum management and poor curriculum delivery by the school managers, the reason being that not all school managers have undergone intensive training in managing a school.

The aim of this investigation is outlined as to explore and/or explain how both educators and the school management team members responded to the curriculum delivery intervention programme in terms of departmental support, professional development and classroom practices in order to improve the quality of teaching and learning in the school. Aspects such as professional development, departmental support and classroom practices are focused on to ensure that every school in the country provides every learner in the system with a fair chance to succeed at any level.

The following objectives were envisaged in the investigation:

• To establish how educators are informed about the implementation of the Curriculum Delivery Intervention Programme (CDIP).


• To determine what training educators received regarding the implementation of the curriculum delivery intervention programme.

• To determine what support educators received.

• To establish how the implementation of the Curriculum Delivery Intervention Programme (CDIP) was achieved.

• To establish how the implementation is monitored.

• To establish how often educators receive feedback from programme evaluation.

• To determine how the Curriculum Delivery Intervention Programme (CDIP) is integrated into the daily programme of the educators.

The rationale for this study is based on a curriculum delivery intervention programme designed to deliver educational support to schools under the jurisdiction of the Mpumalanga Department of Education in all four regions of the province. A curriculum with an optimal design for supporting schools in the province with all necessary resources must be delivered and implemented throughout the region if it is to make any impact on the achievement of learners’ learning.

Chapter 2 presents the literature review that explores curriculum delivery interventions such as departmental support, classroom practices and professional development.

Chapter 3 of this study is based on the methodology used, giving details of the procedure that was followed in defining the population and sample, the methods used, data collection, analysis and ethical considerations. The approach used in this study is a mixed-methods one. The population for this study is all schools declared to be underperforming at the regional level in Nkangala. Sampling was done from all schools obtaining less than 50%. The researcher focused his research on one secondary school in the Mpumalanga province. Triangulation in the form of questionnaires, individual interviews and an observational checklist was used in the study, together with data analysis. Descriptive analysis and logical analysis were followed to explore the understanding of the phenomenon.


Chapter 4 of this investigation is based on the analysis and interpretation of the results of both the quantitative and qualitative data. The biographical information of the respondents who participated in the investigation, as it appears in Section A of the questionnaire, is also discussed in this chapter. The results of the frequency analysis were discussed using tables, as were the results of the frequency analysis of the observational checklist. The interpretation of box and whisker plots was also discussed in this chapter using graphs. Another approach discussed in this chapter was the analysis of the findings of the qualitative approach to the investigation.

Chapter 5 of this investigation is based on the main findings, recommendations and implications. It is also based on the findings of the empirical investigation on both the quantitative and qualitative study. The limitations, suggestions for future study and the closing comments were also discussed in this chapter.

1.8 Conclusion

Chapter 1 endeavoured to give the reasons why the Mpumalanga Department of Education introduced a curriculum delivery intervention programme in the province. A related problem is that nobody knows exactly how educators and the school management team respond to the CDIP introduced in the province.

The main question in this study is how educators and SMT members respond to the CDIP in township schools in terms of departmental support, professional development and classroom practices.


Chapter 2

Literature review in terms of curriculum delivery interventions in schools

A curriculum has been described as the blueprint for the functioning of the primary process in education. The curriculum could function as a powerful coordination mechanism. Eisner, in Jacobs, Gawe and Vakalisa (2002, p. 92), explains that the term curriculum is derived from the Latin word “currere”, meaning the course to be run or a task to be completed. Jacobs, Gawe and Vakalisa (2002, p. 92) explain that this means schools have programmes designed for learners that are to be completed within a specified period of time.

Delivery is a broad word that means to bring and hand over something promised by an individual or the government. The curriculum delivery interventions mentioned in the heading of this chapter embrace departmental support, professional development, classroom practices, monitoring and evaluation, programme implementation, curriculum management, change and improvement, integration of interventions and curriculum implementation. In the formal educational context, the South African Department of Education is the role player in delivering education as promised to the entire South African population. Popham (2004, p. 425) emphasises the fact that, to support such inter-specialist collaboration, special departmental seminars could be set up specifically for key departmental representatives of the three groups. The delivery referred to here is in terms of the Department providing and supporting schools with human resources, physical resources, learner support materials and professional development.

It is clear that in cases where the management of the school has improved, there is also notable progress in learner performance, probably through enhanced educator ability. The Department of Education (2007, p. 3) has suggested that it would be essential for all school principals, deputy principals and education specialists from under-performing schools to undergo an intensive management and leadership training programme organised by the province concerned. This means that the mentoring of school principals has been found to be an effective approach towards improving their management skills. It further explains that the development of management and leadership skills could also include district support in terms of departmental support, professional development and classroom practices as indicated.

To ensure that all schools are fully resourced in terms of their physical and financial requirements, the following should apply:

• Management and leadership support should be provided to schools.

• A focus session with the SMT, SGB and educators should be arranged to establish the shortcomings of the schools with a view to developing an improvement plan.

• Subject advisory support must be provided, especially in those areas where there is poor performance.

• Monitoring and support programmes should be provided for schools introducing Grade 12 for the first time.

• Scheduled visits and capacity building of new schools should be in place.

• Under-performing schools should be prioritized for high-impact monitoring and support by a multi-functional task team.

• Scheduled visits to schools should be conducted by district teams to facilitate an integrated intervention.

The second section of the literature review deals with curriculum development and attempts to answer the question why curricula were developed for this programme. The third section deals with the area of communication. The fourth section focuses on curriculum implementation; here we needed to explore how the Department of Education has supported schools and how schools have implemented the curriculum. Another section addresses professional development, where we explored how the Department developed and trained educators in terms of skills development. The fifth section is the nucleus of the whole implementation of the programme; here we looked at the support the regional officials offer to schools.

The sixth section dealt with classroom practice. Here we looked at the implementation of curriculum delivery in the classroom situation, where educators educate and learners learn and do all forms of assessment. The programme implementation was also subjected to discussion: we looked at how the regional offices design appropriate learning materials and select suitable learning strategies and techniques to facilitate the learning process.

The seventh section dealt with the areas identified for improvement and changes, and determined what it was that we should change or improve. Curriculum management emphasises that it is essential for principals and school management to ensure that they are fully familiar with this material in order to monitor what the subject heads and teams are doing.

One section explored the integration of interventions of the programme, where the integrated quality management system played a role. The national Department of Education’s model for school improvement uses the results of the development appraisal of individual teachers as the basis for the development of the school improvement plan.

All of the above-mentioned intervention strategies underpin the conceptual framework of the study. In terms of the theoretical framework they link well, because interventions in this study such as professional development (providing skills to the educators in the form of in-service training) and departmental support (providing human and physical resources and learning materials) refer to Stufflebeam’s input model, while classroom practices (teaching and learning taking place in the classroom situation by educators) refer to Stufflebeam’s process model as the implementation section in this study.


2.2 Curriculum Development in terms of Curriculum Delivery Interventions

The new education, training and development (ETD) system seeks to achieve a better integration between education and training in South Africa. Meyer (2002, p. 149) explains that ETD practitioners need to plan a curriculum which delivers a related learning programme within the structured workplace or any other learning context. He further says that they require the unit standard ‘plan a curriculum’ in order to plan a curriculum within their occupational competence. The role players suggested by the Skills Development Act, the Labour Relations Act and other relevant legislation need to be considered. The education and training philosophy or approach on which the curriculum is based, as well as the purpose, outcomes, materials and mode of delivery for each programme, should be stipulated in the curriculum framework. Carl (1997, p. 47) explains that the term or concept curriculum development lends itself to different interpretations and identifies a number of authoritative phases in order to show how curriculum development progresses:

“Curriculum design is that phase during which a new curriculum is planned or during which the re-planning and review of an existing curriculum are done after a full re-evaluation has been carried out. This phase usually has a number of characteristic components which include, inter alia, purposefulness, content, methods, learning experience and evaluation. Curriculum dissemination (which is often equated with implementation in the curriculum literature) is that phase in curriculum development during which the curriculum consumers are prepared for the intended implementation and during which, through information, ideas and notions, in-service training, etc., all those involved are motivated, prepared and informed of the proposed curriculum. Curriculum implementation is that phase during which the relevant design is applied in practice. Lastly, curriculum evaluation is that phase during which not only the success and effectiveness of the learners but also the general success of the economy of the country are evaluated. A distinction must therefore be drawn between curriculum-orientated and learner-orientated evaluation.”

Ornstein and Hunkins (2004, p. 322) support the statement made by Carl (1997, p. 47) above in terms of the dissemination of information by saying that the principal’s leadership is critical for the success of any curriculum development and implementation. Principals determine the organisational climate and support those persons involved in change. The principal creates an atmosphere in which good working relationships exist among teachers, and in which teachers are willing to take the risks necessary to create and deliver new programme changes. They further say that the director of curriculum can assist teachers, supervisors and principals in the implementation process by inspiring and providing the necessary support for the staff, by clarifying the district’s goals and values, conducting curriculum surveys and communicating district policies and guidelines.

In conclusion, curriculum planners have a wide selection of curriculum development models from which to choose. The models they select are influenced by their philosophical orientation and approaches to curriculum. Diversity of approach characterises curriculum creation; however, certain curriculum elements are universal and require attention from all curriculum developers. Content, experience and environment, for instance, are constants regardless of design or development.

2.3 Dissemination of information

The support or delivery intervention contained in the programme is the dissemination of information, wherein communication plays an important role in the success of the CDIP. It is almost an axiom that whenever a new programme is being designed, communication channels must be kept open so that the new programme does not come as a surprise to the consumers. Continuous discussions about a new programme among the departmental officials, the principals and educators are key to successful implementation of the programme. Carl (1997, p. 167) confirms the above-mentioned statements: the real measure of success during this application phase is largely determined by the quality of the planning, design and dissemination done beforehand. He further says that the success of implementation may be assured if the dissemination has been effective and specific strategies are followed during implementation.

Effective implementation of innovation requires time, personal interaction and contacts, in-service training and other forms of people-based support. Ornstein and Hunkins (1993, p. 304) state that curriculum designers need to provide the necessary support for their recommended programme or programme modifications to facilitate their rapid implementation.


2.4 Curriculum Implementation in classroom practices

Implementation is the application phase of core syllabi. The school’s broad curriculum contains different subject knowledge, different work schedules and lesson units that are developed by educators in the school. The sharing of instructional leadership (collaboration among educators in terms of classroom practices) determines, to a great extent, successful and effective curriculum implementation in schools. Successful implementation, however, depends on the extent to which all consumers are informed and have been prepared for the envisaged change, and whether they are prepared to associate themselves with it. Ornstein and Hunkins (2004, p. 298) mention that individuals might even resist using the term implementation, thinking that the term suggests a technical rationality. There are numerous reasons why innovative curricula are not successfully implemented. They further say that experts outside the school design many innovative programmes.

Ornstein and Hunkins (2004, p. 303) point out that curriculum designers need to make sure that the programme is running effectively by adhering to the following:

• Programmes must be designed so that they can be integrated into and supported by the organizations within which they are designed to function.

• In-service programmes that work have resulted from collaborative efforts and have addressed the needs of those who are to be affected by the new curriculum.

• Effective in-service training has the necessary flexibility to respond to the changing needs of the staff.

• Without adequate financial support, efforts to get a programme going district-wide will fail.

• A trust relationship must exist among all parties in the school, especially between administration and the educators.

• Implementation is a collaborative and emotional effort.


2.5 Professional Development of educators

According to Adam (1997, p. 4), in Steyn and Van Niekerk (2007, p. 224), professional development covers a variety of activities, all of which are designed to enhance the growth and professional competence of staff members. Educators’ participation in professional development can enhance the success and effectiveness of the professional development programme. The Department of Education (2000, p. 16) states that educators need information to successfully put the new curriculum into practice. It further states that the curriculum committee must make sure that the school has the following policies in place: a continuous assessment policy, a policy on the selection of textbooks, and a homework policy. Monteith, Van der Westhuizen and Nieuwoudt (2002, p. 22) state that resource management refers to all the sources a student can use to make learning easier. They further explain that resource management strategies are designed to assist learners in managing such resources in terms of effort and persistence, and are aimed at helping learners to manage the time they have available for a given task and their study environment.

Teacher training in classroom management should be a regular component of any school’s teacher professional development programme. Clarke (2007, p. 95) emphasises the fact that most experienced teachers benefit from being reminded of the range of strategies that are available to them. He states that teachers also have a valuable role to play in mentoring colleagues (particularly those who are new to the school) in the strategies and techniques of classroom discipline and in the school’s preferred approach to dealing with non-compliant learners.

Arguably, quality teaching and learning at schools depend on staffing. Van Deventer and Kruger (2003, p. 220) state that, when identifying development needs, it is essential to decide which of the problems are the most important and should receive attention first. According to Clarke (2007, p. 95), it is important, if the focus of professional development for the staff during a particular period is to be classroom management, that lesson visits focus on this aspect of teaching. With this in mind, visiting teachers should be asked to note how the educator they are observing manages a particular aspect of learner behaviour or particular phases of the lesson.


2.6 Departmental Support and Intervention

In terms of district support, Clarke (2007, p. 75) explains that it is the structure that is closest to the site of delivery, i.e. the district, that needs to take final responsibility for the performance of the schools falling under its jurisdiction. He further says that the district is responsible for supporting, monitoring and evaluating the school on an ongoing basis. Woolfolk (1998, p. 392) explains that motivation to learn and teach is encouraged when the sources of motivation are intrinsic, the goals are personally challenging, and the individual is focused on the task, has a mastery orientation, attributes successes and failures to controllable causes and believes ability can be improved.

In most schools that are under-performing, it has been established that one of the primary factors contributing to this under-performance is poor or inefficient management. In cases where the management of the school has improved, there is also notable progress in learner performance, probably through enhanced educator ability. The Department of Education (2007, p. 3) suggests that it is essential for all school principals, deputy principals and education specialists from under-performing schools to undergo an intensive management and leadership training programme organised by the province concerned. This means that the mentoring of school principals is an effective approach towards improving their management skills. It further explains that the development of management and leadership skills needs to:

• Make sure that all schools are fully resourced in terms of their physical and financial requirements.

• Arrange focus sessions with the SMT, SGB and educators to establish the shortcomings of the schools with a view to developing an improvement plan.

• Provide subject advisory support especially in those areas where there is poor performance.

• Provide monitoring and support programmes for schools introducing Grade 12 for the first time.

• Schedule visits and capacity building for new schools.

• Prioritise under-performing schools for high-impact monitoring and support by a multi-functional task team.

• Schedule visits to schools by district teams to facilitate an integrated intervention.

In order to obtain the commitment of school managers to good teaching and learning, it is important to understand precisely what is meant by ‘good teaching and learning’. Clarke (2007, p. 173) emphasises the fact that it is a matter which should be at the forefront of the minds of everyone who cares about quality education, and should particularly occupy the minds of principals, developmental officials and politically effective heads of education. He continues by saying that, in terms of the estimate of operational costs by the Department of Education, Section 21 schools have a relatively free hand as to how they spend their budget. However, Section 20 schools have to use the state-allocated portion of their budget in accordance with set guidelines on the division between learning support materials (textbooks, library books, charts, etc.) and non-learning support materials (learner desks and chairs, copier machines, computers, etc.).

The Department of Education (2000, p. 23) indicates that the Skills Development Act, No. 97 of 1998, makes provision for national, sector and workforce strategies to provide financing by means of levy-grant schemes with employers – for instance, organising workshops for educators on content knowledge. The Constitution of South Africa Act, No. 108 of 1996, clearly states that the child has a right to education and that the State has an obligation to provide this education to all children without discrimination. I would say that the commitment of the whole school community means a situation where teachers teach, learners learn, departmental officials give support and parents support both teachers and learners.

2.7 Classroom Practices (Learning Environment)

This section deals with good teaching and learning. In this context teaching quality refers to the teacher’s ability to teach and what he or she does to promote learning within the classroom situation. Clarke (2007, p. 207) stresses the need to create a positive learning climate, i.e. selecting appropriate instructional goals and assessment, using the curriculum effectively and employing those teaching behaviours that help learners to learn at high levels. Good teaching and learning is about the quality of what happens in the classroom and in the interaction between the teacher and students.

Kaplan and Owings (in Clarke, 2007, p. 223) state that, in reviewing the research linking teacher quality to learner achievement, there is a distinction between teaching quality and learning quality. The former concerns the inputs teachers bring to school, such as aptitude, professional preparation, qualifications and prior work experience. It is evident that learning is no longer restricted to an individual experience. A school that is working is regarded as a school with good leadership, where leadership is about getting things to change in terms of having a vision and strategy, aligning people, and motivating and inspiring them. Clarke (2007, p. 3) clearly states that management is about getting the system to operate effectively. Four key strategies that managers use to ensure operational effectiveness are:

• Planning and budgeting – creating systems for operational efficiency.

• Organising and staffing – making sure that everyone knows what is expected.

• Controlling and problem solving – making it happen.

• Predictability and order.

All of these objectives will be attainable if the departmental officials adhere to the Skills Development Act, No. 97 of 1998, which makes provision for national, sector and workforce strategies to provide financing by means of levy-grant schemes with employers.

Meyer (2002, p. 74) argues that education is a team and organisational process that requires new and innovative ways of learning and managing performance improvement. He further says that it becomes part of a continuous process of sharing information with learners and the environment. It is true that, as is the case with the NQF, a learning organisation also embodies the principle of lifelong learning. It is indeed a continuous and never-ending process.

Popham (2005, p. 52) points out that the major contribution of classroom assessment to teachers’ decision-making is that it provides reasonably accurate evidence about learners’ status. He adds that teachers are often forced to make inferences about learners’ knowledge, skills or attitudes based on informal observation. However, such unsystematic observations sometimes lead teachers to misjudge learners’ status. Moloi (2009, p. 135) points out that teachers realise that the benefits of self-assessment can only be realised when they are willing to share control of assessment by allowing learners to play a role in influencing the process, and when learners are involved in determining the assessment criteria.

Mestry, Steinberg and Almond (in Scheerens, Glass and Thomas, 2003, p. 100) view educational measurement as a form of evidentiary reasoning that covers the inference from observable learner behaviour in particular circumstances to learners’ general level of knowledge and skills.

Popham (2005, p. 55) asserts that content-related evidence of validity refers to the adequacy with which the content of the test represents the content of the assessment domain from which inferences are to be made. He states that when those in educational measurement several decades ago first dealt with the idea of content relationships, the focus was on achievement examinations, such as a test of the learners’ knowledge of history. Teachers who can test well are better teachers, he maintains, and he continues by saying that effective testing will enhance their performance, thereby benefiting both themselves and their learners in the classroom situation. Scheerens, Glass and Thomas (2003, p. 10) assert the importance of educational measurement but remind us that it is only one of the two basic forms of educational evaluation, the other being programme evaluation. They further state that the technology and formal conceptual background of educational measurement need to address the following issues:

• Item formats – both closed and open.


• Authentic assessment.

• Norm-referenced and criterion-referenced testing.

• The degree to which tests are curriculum-tied.

The Department of Education (in NCS, 2003, p. 13) contends that another problem in the assessment field is the appropriate selection of assessment tools and instruments; inappropriate assessment tools could lead to the development of only selected competences required by the curriculum. It is important that all forms of assessment should be applied to enable the development of a full range of competences, which will contribute to the development of learners.

2.8 Programme implementation

The design phase of the curriculum delivery programme is of critical importance for the success of any programme intervention. This is achieved by the design of appropriate learning materials and the selection of suitable learning strategies and techniques to facilitate the learning process.

Meyer (2002, p. 138) states that the implementation of interventions requires careful thought in complex settings. He contends that intervention involves performance, behaviour and other sensitive issues.

It is not only the managerial skills in various schools that may require a new approach to be implemented. Schools should also ensure that relevant interventions are selected to contribute to effective teaching and learning in terms of classroom practices. Clarke (2007, p. 18) explains that the way to make a large project of this kind manageable is to divide and conquer, that is, to divide the project into smaller, more manageable tasks. These smaller, short-term goals should obviously be included in the overall goals list as sub-sets, but given the same level of importance as the main objective. He states that people need to be the champions of their particular goal; together they will form the team to drive and monitor the implementation of the strategic plan.


It is both important and advisable for departmental officials to urge the teams involved in the implementation of the programme to schedule meetings on a regular basis, so that they can report on progress and share ideas on their successes and challenges.

According to Meyer (2002, p. 139), the plan for the implementation of interventions should be set up as follows:

• A plan to develop the intervention itself – to include the identification of the internal and external resources required to complete the development in terms of timelines, budgets, experts, roll-out dates and milestones.

• A strategy to ensure commitment throughout the interventions.

• An analysis of the target population.

• An analysis of intervention sequencing.

• Review of the intervention implementation.

2.9 Changes and improvement

In terms of professional development, apart from the requirement for the educators to improve, the results in schools need to be seriously monitored and evaluated by the DOE. Clarke (2007, p. 132) explains that the Integrated Quality Management System is the national Department of Education’s model for school improvement, and uses the results of the development appraisal of individual teachers as the basis for the creation of the school development plan. It describes this plan as a ‘blueprint for actions and processes needed to produce school improvement’.

Kotter and Schlesinger (1979, p. 106) introduce the idea that, in the context of educational management, change means, inter alia, that school principals are exposed to new controls and regulations, growth, increasing competition, technological development and changes in the workforce. Beckhard and Harris (in Van der Westhuizen, 2003, p. 182) state that changes in legislation, the availability of resources, market demands and social priorities often force principals or the state to redesign the organisation’s structure and, in addition, change procedures, redefine priorities and re-deploy resources.

Van der Westhuizen (2003, p. 325) makes the point that a quality improvement team is of key importance to the successful implementation of total quality management (TQM) to make schools fully functional. He further states that these teams need to be part of a strategy, since they reflect fundamental TQM concepts such as decentralisation, flat organisational structures and the empowerment of personnel. He also argues (2003, p. 326) that there is a concern that the linkage between TQM and improved learning outcomes may not be clear or may even be non-existent. This concern originates from the assumption that TQM may be relevant for the delivery of services such as resources and programmes to schools. In comparing TQM to teaching and learning, he posits that this support structure may not be applicable to this particular context.

In conclusion, the emphasis is that schools as organisations, which can themselves learn, can continuously improve people, processes and systems and therefore contribute towards quality education.

2.10 Curriculum management

Mestry and Grobbler (2004, p. 2) state that South African school managers have a multifaceted and enormous task to establish an environment that can lead to effective schooling. They further say that one of the major changes in principalship has been the range of expectations placed on these individuals. The Department of Education (2000, p. 8) states that there is general agreement that schools need good management and leadership to ensure a better quality education for the learners. These expectations have moved from the demand for management of curriculum and control to the demand for an educational leader who can foster staff development as well.

Clarke (2007, p. 238) advises school managers that the Department of Education has produced documents providing detailed information on how subject teams and teachers should approach this planning, and the kind of information that their planning material should include. He emphasises that it is essential for principals and school management to ensure that they are fully familiar with this material so that they are in a position to monitor what the subject heads and teams are doing.

The design of a Mpumalanga DOE intervention plan should be based on a four-pillar strategy that takes into account short-, medium- and long-term considerations (Department of Education, 2008, p. 19), which are the following:

• Curriculum support and professional development (content knowledge).

• Learner support and development (packaging/learning materials).

• Stakeholders’ involvement (Anglo Coal Mines, Old Mutual, Xstrata, etc.).

• Resources (human and physical).

The Mpumalanga Department of Education (2008, p. 12) introduced such interventions in curriculum delivery to support educator performance. These include quality assurance and monitoring of the work of educators, task-on-time, management plans and duties, content knowledge workshops and stakeholders’ involvement in the area.

2.11 Integration of interventions

Integration is the process of grouping together similar themes which should be dealt with simultaneously. The integrated interventions involve estimating how far curriculum delivery is needed by consumers (learners, educators and school managers) and then identifying a mix of appropriate sources and forms of interventions to meet those needs in the most efficient and socially beneficial manner. According to Piaget (in Ornstein and Hunkins, 2004, p. 110), integration ‘refers to horizontal relationships of curriculum experience’ and means that the organisation of experiences should be unified in relation to other elements of the curriculum being taught, and that subjects should not be isolated or taught as single courses separate from the rest of the subjects.

The Integrated Quality Management System (IQMS), which is the national Department of Education’s model for school improvement, uses the results of the development appraisal (DA) of individual teachers as the basis for the development of the school improvement plan (SIP), which it describes as a blueprint for actions and processes needed to produce school improvement. Clarke (2007, p. 132) explains that, to integrate the intervention programme, schools need to comply with the following:

• The responsibility for developing the school improvement plan rests with the school development team (SDT), which is made up of the principal, the whole-school evaluation coordinator, democratically elected members of the school management and elected post-level 1 educators.

• Essentially, this group uses information provided by each teacher’s development support group (DSG) to identify the development needs of every teacher or SMT member.

• They work with individual teachers and are required to mentor them and provide them with support. In addition, they must help them develop and refine a personal growth plan (PGP).

These personal growth plans inform the school improvement plan. It is clear that all the steps mentioned above will help the director for curriculum to be informed about what schools need in terms of support, so that they will be able to improve educator performance. It will also assist the director for curriculum to achieve the target by providing the relevant resources in terms of professional development (workshops on content knowledge, study skills, strategic planning and quality assurance for SMT members), departmental support (human and physical resources) and classroom practices (quality assurance in the classroom).

2.12 Monitoring (supervision), evaluation and feedback

James and Donald (2004, p. 231) define monitoring as the actual running of the programme on a day-to-day basis, tracking progress and making changes as necessary to ensure that it keeps on track for delivering its final objectives. They further emphasise that to monitor progress we first need some mechanism for collecting figures on the resources used. Evaluation is the assessment of the total value of the programme or system in a school or any social situation. Monitoring is a requirement that needs to be practised in the classroom situation in terms of quality assurance, and includes such considerations as checking both educators’ and learners’ portfolios. It is true that the processes of curriculum development and implementation must be supervised. There should be close monitoring of what is occurring and a determination of whether these actions are appropriate. Ornstein and Hunkins (2004, p. 322) show us that instructional supervision is important, especially at the level of implementation, and that the entire process of curriculum development needs to be supervised. Most of the time the word supervision is associated with instruction (control). Ornstein and Hunkins (2004, p. 354) define a portfolio as a compilation of student work gathered over time that furnishes evidence of a student’s understanding, skill, and even disposition to act in particular ways. They further suggest that a key benefit of the portfolio is that it allows the student to present his or her whole person, and allows the teacher to judge him or her in that fashion and give feedback. Clarke (2007, p. 252) confirms that one possibility is to ask subject heads to gather feedback from subject meetings and then for the issues identified to be discussed by the senior management team. He further says that it is important, however, that feedback be given to staff on changes that will be made to address their legitimate concerns or to address shortcomings that may have been identified.

It is also true that the evaluator serves as the ‘eyes and ears’ of the decision-makers. Ornstein and Hunkins (2004, p. 362) emphasise the fact that the evaluator’s role is to furnish data gathered from observations about how the curriculum is functioning in the school; it is then up to the curriculum coordinators, advisory committees and administrators to take the data gathered, judge its value and act accordingly. This underlines the point that the evaluator is essentially a pivotal support to the curriculum development and implementation effort.

In terms of classroom practices, Clarke (2007, p. 237) states that the Department of Education has produced documents (learning programmes, work schedules and lesson plans) that provide detailed information on how a subject team and teacher should approach their planning and the kind of information that their planning material should include. He continues that it is essential that principals and school management teams ensure that they are fully familiar with these materials in order to be in a position to monitor what their subject heads and teams are doing.

Meyer (2002, p. 32) states that strategic interventions should be monitored on a continuous basis in order to identify successes, shortcomings and areas for improvement and/or modification. Scheerens, Glass and Thomas (2003, p. 3) define evaluation as providing the information to allow judgments to be made based on this information.

Ornstein and Hunkins (2004, p. 335) go on to define evaluation as a process or group of processes by which an evaluator can gather data in order to make decisions. They continue by arguing that, like many other concepts in education, there is no actual consensus as to the meaning of evaluation. Despite this, Worthen and Sanders (in Ornstein & Hunkins, 2004, p. 335) define evaluation as “the formal determination of the quality, effectiveness, or value of a programme, product, project, process, objective and curriculum.” They further tell us that Stufflebeam has defined evaluation as the process of delineating, obtaining and providing useful information for judging decision alternatives. They also state that monitoring should be seen as a further qualification of evaluation, stressing its association with ongoing information gathering as a basis for management decisions, reliance on administrative data and a stronger preoccupation with description than with ‘valuing’. Schmidt and Houang (2003, p. 979) explain further that, before turning to the measurement of curriculum, they examine three types of cross-national curriculum evaluation studies, differentiated by the role that curriculum plays in each: the object of the evaluation, the criterion measure for the evaluation, and the context in which to interpret an educational evaluation.

A programme evaluation approach should be followed where programme managers are appointed to oversee the planning and execution of each step. Sherril, Foucek and Waterbury (2005, p. 2) explain that programme evaluation is necessary to gain information about programme efficacy and to identify areas for programme improvement. They further explain evaluation as a necessity to help programme administrators and planners to identify barriers to successful programme implementation and delivery.

According to Stake (in Stufflebeam, Madaus and Kellaghan, 2000, p. 343), the evaluation circumstances will be that someone is commissioned in some way to evaluate a programme, probably an ongoing programme. He further says that the evaluator has some clients or an audience, is to be of assistance to the educators who are responsible for the programme, and has the responsibility for preparing communication with this audience.

Sherril, Foucek and Waterbury (2005, p. 10) explain that evaluation is an opportunity for programme improvement and can be viewed as an integrated set of activities designed to identify programme strengths and areas for improvement. They further emphasise that evaluation can also provide the evidence that will serve as the basis for future programme planning and enhancement. Several steps are involved in getting started on evaluation:

• stating aims;

• defining evaluation goals and objectives;

• developing key questions; and

• creating an evaluation matrix that provides the framework for the evaluation design and the development of evaluation instruments.

Ornstein and Hunkins (2004, p. 342) explain that information is provided to management for the purpose of decision-making, and identify four process steps as types of programme evaluation:

• Context evaluation involves studying the environment of the programme; it is really a situation analysis.

• Input evaluation, the second stage of the model, is designed to provide information and determine how to utilise resources to meet programme goals.

• Process evaluation addresses the curriculum implementation decisions that control and manage the programme; it is used to determine the congruency between the planned and the actual activities.

• Product evaluation: here evaluators gather data to determine whether the final curriculum product now in use is accomplishing what they had hoped to achieve.

They further emphasise the fact that cooperation among all parties engaged in curriculum development and delivery is necessary; even though various people can play particular roles in an overall evaluation, it is wise to have one person in charge.

Sherril, Foucek and Waterbury (2005, p. 20) address the question of why agencies should conduct evaluations. They answer that agencies conduct evaluations for a number of reasons, including the following:

• To provide immediate feedback, enabling programme leaders and managers to make small yet immediate changes during the programme, in response to the identified needs and concerns.

• To provide information over the long term as the basis for programme planning, programme redesign and improvement.

• To meet the foundation or other funders’ requirements to provide evidence of the value received for the money invested in a programme through a grant. Given the ever-increasing calls for accountability from the government, from funders and from the public in general, there are regular demands for clear evaluation findings.

In conclusion, with some basic knowledge and understanding, evaluation can be conducted by most organisations within their current organisational capacity and integrated into routine work activities in a way that complements programme delivery. This chapter presented a literature review based on departmental support, professional development and classroom practices. Components such as curriculum development and design, dissemination of information, curriculum implementation and integration of interventions played an important role in this Curriculum Delivery Intervention Programme in the school.

The synthesis of this literature review is based on curriculum development, dissemination of information, curriculum implementation and integration of interventions. In terms of curriculum development, the curriculum is a cornerstone of the intellectual life of the school’s inhabitants. Its aim is to provide learners and educators with lived experiences that ideally foster deep understanding, sophisticated skills, appropriate attitudes and socially constructive values. It is therefore essential that great care be given to the creation of curricula. Immediately after the creation and design of curricula comes the dissemination of information, whereby the departmental officials should inform all educators and school management team members of the intention to bring the CDIP into the school. The reason for informing both educators and SMT members is to give them the opportunity to prepare themselves mentally to assist in the implementation of the CDIP; through workshops the implementation could be made effective in the school. One of the reasons for the implementation is to ensure that educators are able to integrate all these activities into their daily programme in the classroom situation. Monitoring and evaluation should then enable the departmental officials to identify problems, give feedback on their evaluation and help the whole staff to develop and turn the situation in the school around.

2.14 Conclusion

The pursuit of quality in education is one of the fundamental drivers of the educational transformation process. There are various indicators of quality, of which learner performance is one of the most important. The conceptual framework used to categorise types of curriculum delivery, curriculum development and monitoring consists of the following elements – curriculum design, curriculum dissemination, curriculum implementation and programme implementation – which underpin the Curriculum Delivery Intervention Programme. Strategic intervention in learner attainment in the classroom by educators should be monitored on a continuous basis.

In terms of the dissemination of information, the Department of Education needs to do this through the distribution or publication of information, ideas and notions, and through in-service training, so that those involved are prepared and informed about the aims and objectives intended to be achieved. The Department of Education should also aim at supporting the schools concerned with all relevant resources and giving the developmental training needed. Developmental support identified during curriculum design is one of the phases in curriculum development.

It is better to use formative assessment in order to identify successes or shortcomings as well as areas for improvement and/or modification. Any change to the learners’ attainment should be the result of consultations with all stakeholders and should be communicated to all involved. The most important form of evaluation is the quantifiable impact the strategic intervention has on the constituent parties in terms of organisational performance and results.

Change and improvement go hand in hand; both have been used in this study on the understanding that where there is change, there will be improvement. Even though it is difficult for people to accept changes, they will remain the cornerstone of any developing country such as South Africa. Change teaches us that, for one to achieve quality results, improvements should be made, so we should be ready to embrace and adhere to them. We need to concentrate on our learning and teaching environment (organisational learning). This involves improving our schools and producing quality, competent learners able to face the challenges of the world.

It is also important for educational leaders to place emphasis on the IQMS – the model of the Department of Education aimed at improving school results. By so doing, we will be holding those involved accountable. This accountability can only be achieved through educational measurement, where the focus is on the type of content offered to learners and the construction of items through checking their validity and reliability. In addition, this can be consolidated by following an educational strategy that involves administrative data in the schools through management information systems.

There is a need for departmental officials (such as curriculum implementers/managers and those assigned to administration) to go to schools to determine if the system of filing information is done in accordance with the regulations of the DOE. The DOE, through its financial section, must provide some input to give support in areas such as human and physical resources, staff retention, school management team workshops and SGB functionality.

School managers should encourage the participation of parents, learners and staff in the curriculum management of their schools. Finally, the whole school needs to be involved in marketing the school’s vision, particularly in terms of what type of learners it wishes to produce for the labour market.


Chapter 3

3.1 Introduction

This section of the dissertation provides the reader with brief descriptions of the overall assessment approach, the tasks associated with formative assessment and the specific data-gathering methods used across the programme (CDIP).

Creswell (2005, p. 510) defines mixed methods (multiple methods) as a research design that is a procedure for collecting, analysing and mixing both quantitative and qualitative data in a single study to understand a research problem. McMillan and Schumacher (2006, p. 474) explain that mixed methods (multiple methods) refer to a study that combines qualitative and quantitative techniques and/or data analysis within different phases of the research process. I decided to conduct both because I wanted to use questionnaires and perform follow-ups through individual interviews.

The purpose of this chapter is to explain to the reader which strategies were selected and applied during the investigation. Alternatively, a reason for conducting a mixed methods study might be that we seek to explain in more detail through qualitative research the initial quantitative statistical results.

The population comprised all under-performing high schools in the Witbank area, where Grade 12 results were used to identify educators’ performance. In terms of convenience sampling, only one high school was selected out of the five under-performing secondary schools in the western townships. The reason for selecting this school is that it was the only one that performed under 40% for three consecutive years; the other four performed between 40% and 50%. Purposive sampling was also considered and followed in this study.

Quantitative methods provide the opportunity to gather data (questionnaires) from a large number of people and to generalise results, whereas qualitative methods permit an in-depth exploration of a few individuals (interviews) based on departmental support, professional development and classroom practices.

The above statement confirms that the interventions (such as departmental support, professional development and classroom practices) are indeed linked to the theoretical framework of the study. The context evaluation model refers to the learning environment (the school) where teachers teach and learners learn. The input evaluation model refers to the human resources, physical resources, professional support and teaching and learning support materials. The process evaluation model refers to the teaching and learning in the classroom situation and, lastly, the product evaluation model refers to the achievement ultimately attained by learners, which confirms the success of all the activities supported and practised in the school during the process evaluation.

The research approach followed in this study can be regarded as a mixed-methods one, using questionnaires, individual interviews and an observational checklist. One of the strategies is the triangulation design, in which both qualitative and quantitative data are collected simultaneously. McMillan and Schumacher (2006, p. 28) explain that a triangulation design should be used because the strengths of each approach can be applied to provide not only a more complete result but also one that is more valid. Creswell (2005, p. 514) explains the purpose of triangulation as simultaneously collecting both quantitative and qualitative data, merging the data and using the results to understand a research problem.

Firstly, the research approach can be characterised as collaborative, as the researcher worked closely with the school where the CDIP operates, and particularly with the educators and the school management team. Creswell (2005, p. 589) explains that collaboration with others is central to action research and involves actively participating with others in the research.


Secondly, the evaluation can be seen as outcomes-based as the conceptual scaffold guiding the evaluation work comprised both the aims and outcomes associated with the CDIP and a set of questions closely derived from those stated outcomes.

Thirdly, the evaluation approach may also be considered a mixed-methods or multiple-methods one because it employs both qualitative (e.g. observation, individual interviews) and quantitative (e.g. questionnaires) data-gathering techniques. These techniques evaluate and contribute to a better understanding of the operation of the programme, as well as measuring its accomplishments. Creswell (2005, p. 594) defines a mixed-methods research design as procedures for collecting both quantitative and qualitative data in a single study and for analysing and reporting the data based on a priority and sequence of the information. Quantitative methods provide numerical representations of outcomes that can be used to assess accomplishment against goals, standards or targets. Qualitative data, on the other hand, provide rich information that can be used to examine phenomena not readily amenable to quantitative exploration and/or to provide a contextualised, more complete explanation of the phenomenon or programme under study.

3.3 Multiple methods approach (mixed-methods)

3.3.1 Quantitative approach

3.3.1.1 Content validation and development of the questionnaire

Leedy and Ormrod (2001, p. 98) define content validity as the extent to which a measurement instrument is a representative sample of the content area (domain) being measured. They further explain that content validity is often a consideration when we want to assess people’s achievement in some area, for instance the knowledge they have gained during classroom instruction or the job skills they have acquired in a rehabilitation programme. Creswell (2005, p. 590) also defines content validity as the extent to which the questions on the instrument and the scores from these questions are representative of the possible questions that could be asked about the content or skill. The quantitative portion of the evaluation comprised one questionnaire with five sections (Sections A, B, C, D and E) as its principal means of gathering data relevant to respondents’ views on the Curriculum Delivery Intervention Programme. This questionnaire (refer to Appendix 7) included Likert-type and open-ended questions to be applied to educators in the school.

Similar to the observational checklist (refer to Appendix 6), the questionnaire was developed by the researcher in consultation with the programme documents, the literature review and the research questions.

The questionnaire (see Appendix 7) includes questions that address the following:

• Biographical information of all educators participating in the study (e.g. qualifications, years of experience, etc.).

• Departmental support and professional development.

• Curriculum development and management, implementation of the programme and dissemination of information.

• Programme evaluation.

• Integration of interventions, monitoring and evaluation.

The sources consulted in the content validation of each section of the questionnaire, and the questionnaire items derived from them, were as follows:

• Departmental support and professional development – Clarke (2007:75); DOE (2007:3); Adam (1997:4); Clarke (2007:95); Van Deventer and Kruger (2003:220); DOE (2003:23); DOE (2010:5); DOE (2007:9); Mestry and Grobbler (2004:2); DOE (2008:19) – items 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32.

• Curriculum development and curriculum management – Meyer (2002:149); Carl (1997:47); Ornstein and Hunkins (2004:322) – items 33, 34, 35, 36, 37, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60.

• Programme implementation and dissemination of information – Meyer (2002:138); Clarke (2007:168); Carl (1997:167); Ornstein and Hunkins (1993:304) – items 36, 37, 39, 41, 42, 49, 54, 55.

• Programme evaluation and classroom practices – Clarke (2007:207); Meyer (2002:74); Cohen (1993:25); Popham (2005:52); Scheerens et al. (2003:100); Sherril (2005:2); Stufflebeam et al. (2000:343); Ornstein and Hunkins (2004:342) – items 61, 62, 63, 64, 65, 66, 67, 68, 69, 70.

• Integration of interventions, monitoring, evaluation and feedback – James and Donald (2004:231); Clarke (2007:237); Meyer (2002:32); Ornstein and Hunkins (2004:335); Clarke (2007:252); Scheerens et al. (2003:3); Ornstein and Hunkins (2004:110); Moloi (2009:135) – items 71, 72, 73, 74, 75, 76, 77, 78, 79, 80.

Table 3.1: Sources consulted in the content validation of the questionnaire

In addition, the questionnaire was designed to include questions that address the respondents’ views regarding what they consider appropriate additional professional development in terms of the aims of the CDIP. To serve the purpose of the programme evaluation it was essential to hear from respondents what they considered useful in terms of building their capacities in understanding the intervention itself.

3.3.1.2 Research instruments

Creswell (2005, p. 592) explains instrumentation as a tool used for measuring, observing or documenting quantitative data. He further states that researchers identify these instruments before they collect data, and that they may include a test, a questionnaire, a tally sheet, a log, an observational checklist, an inventory or an assessment instrument. A qualitative method is considered a tool for interviewing the participants (individual interviews) in order to understand their views on the curriculum delivery interventions in the school. A listing of the specific data-gathering activities associated with this Curriculum Delivery Intervention Programme is given in Table 3.2 (Data sources and methods). These data-gathering activities resulted from the researcher meeting with individual educators to explain the purposes and aims of evaluating the Curriculum Delivery Intervention Programme in their school and to sketch the particular methods to be used in the evaluation itself.


The answers to these questions provided the basis for judgments around the success of the curriculum delivery intervention programme in the school.

The data source and the methods applied were as follows:

Data source: educators in the under-performing high school.

• Individual interviews: N = 4 (educators in the under-performing high school).

• The questionnaire: N = 60 (mainly closed Likert-type items).

• Observational checklist: mainly closed questions based on the checking of the educators’ files and the school’s file.

Table 3.2: Data sources and methods applied in the study

3.3.2 Qualitative approach

3.3.2.1 Types of interviews and observational checklist conducted

Creswell (2005, p. 593) explains that interviews occur when researchers ask one or more participants general, open-ended questions and record their answers. McMillan and Schumacher (2006, p. 473) explain that, in interview records, interviewers reflect on their role and rapport, the interviewees’ reactions, additional information and extensions of interview meaning. The qualitative section of the investigation also included individual, face-to-face interviews with educators in the school (refer to Appendix 8). Individual interviews were conducted to provide in-depth, narrative data on the experiences of programme participants, in order to better understand what worked or did not work for the school and, importantly, why not.

The interviews with educators were guided by open-ended and closed questions (see Table 3.2) based on the interventions of the programme, set by the researcher (see Sections G1, G2, G3, G4, G5 and G6 as captured in Appendix 8, the interview instrument used for the individual interviews).

• Setting of the interview questions

Generally, this series of interviews granted the researcher extended opportunities to explore with participants (refer to Appendix 8 for the individual interview) their understanding of interventions such as professional development, departmental support and classroom practices developed in the school, as well as factors that facilitated or hindered their understanding of the curriculum delivery interventions, and concrete examples of ways in which participants believe they have benefited from the CDIP, based on the sub-questions built up from departmental support, professional development, curriculum development, dissemination of information and integration of the interventions (see Sections G1, G2, G3, G4, G5 and G6). McMillan and Schumacher (2006, p. 473) explain the interview guide approach as one in which the researcher selects interview topics in advance but decides the actual sequence and wording of the questions during the interview. As indicated, the topics were set in the interview guide and the questions were developed (see Sections G1, G2, G3, G4, G5 and G6) from the components of the literature review and asked during the interviews.

Specifically, the purpose of these interviews was to better understand participants' views on the nature of the curriculum delivery, the professional development provided, and the implementation and success of the interventions provided to the school. The individual interviews also provided a forum to explore participants' views on facilitators of and barriers to the success, scalability and sustainability of the CDIP undertaken. Further, building on the questions asked, the individual interviews provided a good forum for participants to offer their thinking on identifying additional strategies.

• Observational checklist (refer to Appendix 6)

Creswell (2005, p. 211) defines observation as the process of gathering open-ended, firsthand information by observing people and places at a research site. He further explains that when educators think about qualitative research, they often have in mind the process of collecting observational data in a specific school setting.

Observation in this study refers to the in-person viewing of programme activities such as professional development, departmental support and classroom practices.

The purpose of all the observations was to systematically gain a better first-hand sense of the context, content and focus of the professional development, departmental support and classroom practices provided to educators by the Curriculum Delivery Intervention Programme facilitators. The other purpose of the observational checklist was to check or observe the school's files, including educators' files and management files, by approaching individual educators and examining their subject files.

3.4 Population and sample that participated in the study

McMillan and Schumacher (2006, p. 119) define a population as a group of elements or cases, be they individuals, objects or events, that conform to specific criteria and are intended to provide a suitable base for the research. The population consisted of the under-performing schools in the Witbank area, where Grade 12 results were used to identify educators' performance. According to Ramurath (2006, p. 2), the Mpumalanga Department of Education identified 157 schools in the province that performed below 50% in the 2005, 2006, 2007, 2008 and 2009 Grade 12 examinations. In the Witbank area alone, out of twenty-two secondary schools with Grade 12, only eight high schools are under-performing. Five of these high schools are in the western townships in the area, one of which was selected and focused upon. The remaining four schools performed above 40% on average; the selected school is the only one in the area that obtained below 40% in terms of Grade 12 results.

Sampling has been defined by Creswell (2005, p. 146) as selecting a sub-group of the target population that the researcher plans to study. Convenience sampling, or as it is sometimes called, accidental or opportunity sampling, involves starting with the nearest individuals to serve as respondents and continuing this process until the required sample size has been obtained. In this study, purposive sampling was also used, which emphasises the researcher's judgment of the characteristics of the staff members in the selected school.

In terms of convenience sampling, only one high school was selected out of the five under-performing secondary schools. The reason for selecting this school is that it was the only one in the western township that performed under 40% for three consecutive years; the other four performed between 40% and 50%. A further reason was that the school has a staff of sixty and an enrolment of 2 100 learners. I chose to research this school because it was the only one in the area with such a high enrolment, and it can provide useful information on factors influencing educators' performance in terms of departmental support, professional development and classroom practices in our high schools. Sampling a single school would have suited a case study, but I decided not to use a case study design in this investigation. Researchers often use the term case study in conjunction with ethnography, which in turn links it to qualitative research; I wanted not only to explore, but also to explain how the variables in the study are related.

Creswell (2005, p. 53) explains that using the combination of both forms of data provides a better understanding of a research problem than one type of data alone.

Purposive sampling was considered and followed in this study. Cohen, Manion and Morrison (2002, p. 103) state that, in purposive sampling, researchers handpick the cases to be included in the sample based on their judgment of the characteristics of the staff member(s). This makes it clear that the sample has been chosen for a specific purpose. For example, a group of principals, HODs and educators at secondary schools might be chosen as the study group to look at the incidence of stress among school managers and educators and how it may affect their performance.

3.5 Analysis of the quantitative data

Data analysis was approached through a triangulation design (observational checklist, questionnaires and face-to-face interviews) to examine how educators and SMT members respond to the curriculum delivery intervention. In this study I collected all documents from the school in question (including, inter alia, assessment strategies, work schedules, analyses of learner performance, educators' profiles, teaching and learning materials, meeting minutes, library records, paper budgets, the number and size of classes, school furniture, school journals, parental involvement workshops, policies and plans).

Creswell (2005, p. 175) explains that to collect data with an instrument or a checklist, one needs a system for scoring the data. Scoring data means that the researcher assigns a numeric score to the response category for each question on the data collection instruments. Creswell (2005, p. 521) states that mixed-methods researchers quantify qualitative data in order to compare the data directly with statistical results. He concludes that one could reduce interview data from high school staff to themes and make counts of the occurrences of each theme.
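To make this scoring step concrete, the following minimal Python sketch is offered as an illustration only; the item responses in it are hypothetical and are not the study's actual records. It shows how answers coded 1 to 5 can be reduced to the frequency counts, percentages and means of the kind reported later in tables 4.7 to 4.14.

# Minimal sketch (hypothetical data): scoring Likert responses numerically
# and reducing them to frequencies, percentages and a mean per statement.
from collections import Counter

SCALE = {1: "Strongly disagree", 2: "Disagree", 3: "Uncertain",
         4: "Agree", 5: "Strongly agree"}

def score_item(responses):
    """Return frequencies, percentages, mean and missing count for one item.
    `responses` is a list of integer codes 1..5; None marks a missing answer."""
    answered = [r for r in responses if r is not None]
    n = len(answered)
    freq = Counter(answered)
    percent = {code: round(100 * freq.get(code, 0) / n) for code in SCALE}
    mean = sum(answered) / n
    missing = len(responses) - n
    return freq, percent, mean, missing

# Hypothetical responses from ten respondents to one statement:
item_responses = [4, 3, 4, 5, 2, None, 4, 3, 1, 4]
freq, percent, mean, missing = score_item(item_responses)
print(f"mean = {mean:.5f}, missing frequency = {missing}")
for code, label in SCALE.items():
    print(f"{label}: n = {freq.get(code, 0)} ({percent[code]}%)")

Applying such a routine to each statement yields the kind of n, percentage and mean columns used in the tables of chapter 4.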

3.5.1 Descriptive analysis of the empirical data

The data analysis in this investigation, as explained above, is a mixture of quantitative and qualitative analysis. When quantifying qualitative data, the data can be coded by using a logical analysis, whereby codes are assigned numbers and the number of times a code appears is recorded as numeric data. Quantitative data are descriptively analysed for frequency of occurrence, and in that way the two data sets can be compared. The variance indicates the dispersion of scores around the mean. In terms of inferential analysis, McMillan and Schumacher (2006, p. 308) explain that non-parametric procedures test hypotheses about the relationships between categorical variables, the shapes of distributions and the normality of distributions. They further explain that non-parametric techniques are concerned with frequencies, percentages and proportions.

Descriptive analysis was used in this case so that it would be possible to generalise the information. The appropriate numerical descriptive measure used was the systemic plot or box plot, alternatively called a box-and-whisker plot, to synthesise the analysis done on the tables. David, Robert and Juliette (1989, p. 916) explain that authors and readers can use a simple graphic method called the "box plot" to rapidly summarise and interpret tabular data. They further explain that three useful properties of exploratory data analysis techniques are that they require few prior assumptions about the data, that their statistical measures are resistant to outlying data values that may inordinately influence an analysis, and that they emphasise visual displays that clearly highlight important landmarks of the data. Keller and Warrack (2000, p. 119) define the box-and-whisker plot as a pictorial display that indicates the two extreme values of a data set and where the data are centred. It provides a graphic five-number summary of the data: the smallest value, the lower quartile, the median, the upper quartile and the largest value. In this investigation the box-and-whisker plot was used to analyse the data.
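To illustrate these landmarks, a minimal Python sketch follows; the scores in it are made up for illustration and are not the study's data. It computes the five-number summary with numpy and draws the corresponding box-and-whisker plot with matplotlib.

# Minimal sketch (made-up scores): the five-number summary behind a
# box-and-whisker plot, and the plot itself.
import numpy as np
import matplotlib.pyplot as plt

scores = np.array([2.1, 2.8, 3.0, 3.1, 3.2, 3.3, 3.5, 3.8, 4.0, 4.4])

smallest = scores.min()
q1, median, q3 = np.percentile(scores, [25, 50, 75])
largest = scores.max()
print("five-number summary:", smallest, q1, median, q3, largest)

# The box is bounded by the lower and upper quartiles, the line inside it
# marks the median, and the whiskers run out to the smallest and largest
# values, as described above.
plt.boxplot(scores, whis=(0, 100))  # whiskers at the extreme values
plt.ylabel("Mean response per respondent (scale 1-5)")
plt.show()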


3.6 Analysing the qualitative data

3.6.1 Coding and classification

Qualitatively, the interviews were analysed logically by using the detailed responses of the staff on the implementation of the curriculum delivery intervention, based on departmental support, professional development and classroom practices in the school. The analysis also covered departmental support, professional development, curriculum development, curriculum management, programme implementation, dissemination of information, programme evaluation, integration of interventions, feedback and monitoring (Creswell, 2005, p. 479). McMillan and Schumacher (2006, p. 471) describe a code as a descriptive name for the subject or topic of a data segment. The method of analysis involved a content/logical analysis, in which the data were partitioned into content domains for the comparison of themes across individual perspectives. I read and underlined or classified the responses and common themes in educators' descriptions of their perspectives on the curriculum delivery in their school. Moreover, the logical analysis was used in this case so that it was possible to explore the phenomenon. Themes identified by the staff were considered common themes in the interviews and were ultimately coded accordingly.
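A minimal sketch of this coding step is given below; the code book, keywords and responses are hypothetical and serve only to illustrate how responses can be classified under common themes and the occurrences of each theme counted.

# Minimal sketch (hypothetical code book and responses): classifying
# interview responses under common themes and counting theme occurrences.
from collections import Counter

CODE_BOOK = {  # theme -> keywords that signal it
    "departmental support": ["department", "official", "resources"],
    "professional development": ["workshop", "training", "skills"],
    "classroom practices": ["lesson", "assessment", "classroom"],
}

def code_response(text):
    """Return the set of themes whose keywords appear in one response."""
    text = text.lower()
    return {theme for theme, keywords in CODE_BOOK.items()
            if any(word in text for word in keywords)}

responses = [
    "The workshops on assessment helped my lesson planning.",
    "Officials from the department seldom bring resources.",
]
theme_counts = Counter(theme for r in responses for theme in code_response(r))
print(theme_counts)  # theme frequencies that can be compared with the statistics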

3.7 Trustworthiness, validity and ethical considerations

In this case, the emphasis was on the trustworthiness of the information collected. Creswell (2005, p. 252) states that validating findings means that the researcher determines the accuracy or credibility of those findings through strategies such as triangulation. The focus here was on the validity of the three data collection strategies: the questionnaire, the observational checklist and the face-to-face individual interviews.

Leedy and Ormrod (2001, p. 98) explain that content validity is the extent to which a measurement instrument is a representative sample of the content area (domain) being measured. The domain addressed in this study is the assessment of the curriculum delivery intervention in an under-performing school, using the questionnaires given to both educators and SMT members as the targeted participants or respondents. The criterion was to determine how effective the curriculum delivery intervention had been in their school situation in terms of departmental support, professional development and classroom practices. Leedy and Ormrod further argue that valid measuring instruments sample the content domain in appropriate proportions and require the particular behaviours and skills that are central to that domain.

The use of triangulation (observational checklist, individual interviews and questionnaires) played a significant role. The information on the curriculum delivery intervention programme, which is based on the departmental support, professional development and classroom practices, was collected from each of the programme participants. In these assessments, the school management team was asked to judge the extent to which their under-performing school exhibited certain characteristics at the start and completion of the intervention programme.

All ethical considerations were observed. I knew what was expected of me as a researcher. I made sure that I understood and observed the rights of the participants by using non-discriminatory language and respecting the audiences’ opinions.

Letters of informed consent were sent to the targeted school and circuit to request permission to conduct the evaluation research. To my knowledge, the data collection was done ethically and respected individuals' anonymity. Creswell (2005, p. 171) has the following to say about ethical clearance:

"Obtaining permission before starting to collect data or starting with the research is not only a part of the informed consent process but is also an ethical practice. Protecting the anonymity of individuals by assigning numbers to returned instruments and keeping the identity of individuals confidential offers privacy to participants."

The ethics application was approved by the Ethics Committee of the Faculty of Education.


3.8 Conclusion

The planning in this study was based on approaches that usually centre on a generic set of tasks. As the researcher, I used this procedure to explore the effects of curriculum delivery on educators' performance in terms of departmental support, professional development and classroom practices. In this study a multiple or mixed-methods research design, with an outcomes-based and collaborative format, was followed and explained appropriately. The target population was all under-performing schools in the Witbank area. One secondary school in the western part of the township was selected as the target school. The participants were selected by means of convenience and purposive sampling.

The observational checklist (educators' perspectives on the curriculum delivery programmes and documentation records), the questionnaires and the individual interviews (open-ended and closed questions) were used to collect the information. A triangulation design was used to collect the data; Creswell (2005, p. 514) argues that a triangulation design allows the simultaneous collection of both quantitative and qualitative data, the merging of those data and the use of the result to understand a research problem.


Chapter 4

Analysis and interpretation of the results of the investigation

4.1 Introduction

This chapter contains the results of the data captured with the questionnaire, the observational checklist and the individual interviews during the investigation. It reports on the responses of educators and school management teams to a Curriculum Delivery Intervention Programme implemented by the Mpumalanga Department of Education. It also focuses on the experience of staff regarding the implementation of the Curriculum Delivery Intervention Programme in terms of professional development, departmental support and classroom practices. Other issues covered by the chapter range from the dissemination of information about the implementation of the programme to feedback, monitoring and the integration of the information into the daily programme of educators. The questionnaires as well as the individual interview schedule dealt with the following issues: biographical information of the respondents, departmental support and professional development, curriculum development and management, implementation of the programme and dissemination of information, programme evaluation and the integration of the intervention, and monitoring and evaluation. The above-mentioned components or sections of the study have been divided into two analyses, namely the quantitative and the qualitative analysis of the investigation. The quantitative analysis is presented first, followed by the qualitative analysis.

All these components were analysed, showing the percentages as indicated in tables 4.1 to 4.17, followed by the interpretation of the results of the investigation. A large number of variables are addressed in section A of the questionnaire. The tables reflect the frequency analyses and percentage calculations of the responses of the respondents to each of the sections addressed in the questionnaire. These issues were also addressed by the theoretical framework. The data were further analysed through box-and-whisker plots, as shown in figures 4.1 to 4.9. The missing frequencies of respondents not responding to a question are also listed in the tables.


4.2 The results of the quantitative data of the investigation

4.2.1 Introduction

In analysing and interpreting these data, readers are reminded that the percentages reported in the tables that follow are based on relatively small numbers. The biographical information in the questionnaire is covered in section A. The rest of the questionnaire covers the components identified in the theoretical framework. These components range from departmental support and professional development, through curriculum development and management, to the implementation of the programme, the dissemination of information and programme evaluation. The data became valuable evidence for assessing the success or failure of the programme itself.

4.2.2 The biographical information of the respondents who participated in the study

All the respondents in the study work in the Mpumalanga province. The total number of respondents who participated in this study is sixty. Most of the details about the respondents' involvement in the study are presented in the tables below:

Table 4.1: Gender of the respondents who participated in the study

Gender    Frequency   Percentage
Female    25          43.86%
Male      32          56.14%
Total     57          100%
Frequency missing = 3

Table 4.1 shows that 25 (43.86%) of the respondents who participated in the study were female, while 32 (56.14%) were male. The results show that male respondents form the majority of the sample in the school. One can conclude that both males and females agree that the programme is helpful in improving teaching and learning in the school.


Table 4.2: Educational qualifications of the respondents who participated in the study

Educational qualification                                  Frequency   Percentage
Post-school diploma and B-degree (2 & 3)                   20          33.90%
Degree plus diploma and postgraduate diploma (4 & 5)       39          66.10%
Total                                                      59          100%
Frequency missing = 1

Table 4.2 above gives a clear picture that only 20 (33.90%) of the respondents reported having a post-school diploma and B-degree, while 39 (66.10%) reported having a degree plus diploma or a postgraduate diploma as their highest qualification. This indicates that the majority of the respondents are highly qualified to teach the learners in the school, and that academic qualifications play an important role in learners' performance in the school.

Table 4.3: Age of the respondents who participated in the study

Age                 Frequency   Percentage
39 years and less   26          44.83%
40 years and more   32          55.17%
Total               58          100%
Frequency missing = 2

Table 4.3 reveals that 26 (44.83%) of the respondents were aged 39 years or younger, while 32 (55.17%) were 40 years or older. The results indicate that the majority of the respondents who participated in the study were aged forty years and older.


Table 4.4: Teaching experience of the respondents who participated in the study

Teaching experience   Frequency   Percentage
10 years and less     28          46.67%
11 years and more     32          53.33%
Total                 60          100%

It can be noted in table 4.4 that 28 (46.67%) of the respondents reported teaching experience of 10 years or less, while 32 (53.33%) reported teaching experience of 11 years or more. This indicates that the majority of the educators have more than ten years' teaching experience.

Table 4.5: Streams of the respondents who participated in the study

Stream (department)   Learning areas                                                                      Frequency   Percentage
Science department    2 Mathematics, 4 Natural Science, 6 Economic and Management Sciences, 8 Technology  39          65%
General stream        1 Language, 3 Human and Social Sciences, 5 Arts and Culture, 7 Life Orientation     21          35%
Total                                                                                                     60          100%

Table 4.5 indicates that 39 (65%) of the respondents reported that they teach in the Science stream, while 21 (35%) reported teaching general subjects. This shows that most educators teach in the Science stream.


Table 4.6: Post level of the respondents who participated in the study

Post level                                            Frequency   Percentage
Educators (post level 1)                              39          67%
Principal, Deputy and HODs (post levels 2, 3 and 4)   19          33%
Total                                                 58          100%
Missing frequency = 2

Table 4.6 shows that 39 (67%) of the respondents are at post level 1, while 19 (33%) are at post levels 2, 3 and 4. This shows that most of the educators are at post level 1 compared with those at the higher post levels.

4.2.3 Results of the frequency analysis

The results of the frequency analysis are shown in the tables below. They are analysed according to the components (sections) indicated in the questionnaire, from section B to section E, together with three tables based on section F, the observational checklist. The tables in this investigation run from table 4.1 to table 4.17. A professional statistician aided the researcher in analysing the data collected.

The results presented in the tables below indicate the number of respondents who answered each of the statements. Monteith, Van der Westhuizen and Nieuwoudt (2002, p. 94) explain that Likert scales are used to summate ratings and are therefore also known as summated rating scales. The tables present each statement together with its five-point (Likert) scale; for each scale value the number of respondents and the corresponding percentage are given. Scale 1 indicates "Strongly disagree", scale 2 "Disagree", scale 3 "Uncertain", scale 4 "Agree" and scale 5 "Strongly agree". In another sub-section of the questionnaire the respondents were asked to respond on a scale where 1 indicates "Not at all", 2 "Less extent", 3 "Uncertain", 4 "Somewhat" and 5 "To a large extent". Lastly, "Yes" or "No" answers were used in the observational checklist to enable the observer to explore the delivery interventions in the school.
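As a small illustration of why such scales are called summated rating scales, the sketch below (with hypothetical codes, not the study's data) adds each respondent's scale values across the items of a section to give a single summated score.

# Minimal sketch (hypothetical codes): a summated rating adds one
# respondent's scale values across the items of a section.
ratings = {
    "respondent 1": [4, 5, 3],  # codes on three items of one section
    "respondent 2": [2, 3, 3],
}
for respondent, codes in ratings.items():
    print(respondent, "summated score:", sum(codes))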


4.2.4 The extent to which the statements apply to the respondents' understanding with regard to programme evaluation

Table 4.7: Comments of the respondents regarding the curriculum design of the programme

Summary results (N=60). Scale: 1 = Strongly disagree, 2 = Disagree, 3 = Uncertain, 4 = Agree, 5 = Strongly agree. Cells give n (%).

1. Outcomes could reasonably be expected as a result of participation in this curriculum delivery intervention programme (missing frequency = 1):
   6 (10%) | 8 (14%) | 14 (24%) | 29 (49%) | 2 (3%) | mean 3.2333
2. This programme helped me to understand managing curriculum delivery in the school (missing frequency = 1):
   4 (7%) | 12 (20%) | 20 (34%) | 22 (37%) | 1 (2%) | mean 3.10000
3. This programme helped me to understand my role as educator in the school better:
   3 (5%) | 11 (18%) | 20 (33%) | 23 (38%) | 3 (5%) | mean 3.16667
4. This programme helped to increase my knowledge in the processes of consultation in the school:
   5 (8%) | 8 (14%) | 14 (23%) | 29 (48%) | 4 (7%) | mean 3.31667
5. This programme helped to increase my skill in the processes of consultation in the school:
   4 (7%) | 9 (15%) | 19 (32%) | 26 (43%) | 1 (2%) | mean 3.18333
6. This programme helped me to understand how to work together with other colleagues better (missing frequency = 4):
   4 (7%) | 6 (11%) | 15 (27%) | 28 (50%) | 3 (5%) | mean 3.40000
7. This programme helped me to improve my interpersonal relations with my colleagues in the school as an educator (missing frequency = 1):
   6 (10%) | 7 (12%) | 8 (14%) | 35 (59%) | 3 (5%) | mean 3.35000
8. This programme helped me to become more collaborative in my dealings with issues or activities in the school (missing frequency = 3):
   4 (7%) | 8 (14%) | 20 (35%) | 22 (39%) | 3 (5%) | mean 3.20000
9. This programme helped me to improve my level of managing curriculum in the school (missing frequency = 2):
   3 (5%) | 8 (14%) | 14 (24%) | 31 (53%) | 2 (3%) | mean 3.33333
10. This programme helped me to receive feedback immediately after evaluation was done:
   1 (2%) | 8 (13%) | 27 (45%) | 22 (37%) | 2 (3%) | mean 3.26667

When the respondents who indicated "agree" and "strongly agree" with certain statements are combined, table 4.7 reveals very interesting results. Fifty-one percent (51%) of the respondents agree or strongly agree that outcomes can be expected as a result of participating in the CDIP, while 55% indicate that the programme helped them to increase their knowledge of the processes of consultation as well as to improve their level of managing curriculum in the school (see items 1, 4 and 9). It is interesting to note that 60% of the respondents were of the opinion that the CDIP helped them to improve their interpersonal relations with their colleagues in the school (see item 7). Forty-four percent (44%) of the respondents indicated that the programme helped them to become more collaborative in their dealings with issues or activities at school (see item 8). Forty percent (40%) of the respondents agree that they received feedback immediately after evaluation was done, while 45% were uncertain whether feedback is given immediately after evaluation is done (see item 10).

In response to the question whether the programme helped educators to improve their interpersonal relations with their colleagues, 38 (63%) indicated that they agree or strongly agree with the statement. The same applies to whether the programme assisted them in managing the curriculum in the school better: thirty-three (55%) of the respondents indicated that the programme helped them to improve their level of managing the curriculum in the school. Fifty-one percent (51%) of the respondents also indicated that outcomes can be expected as a result of their participation in the CDIP.

The findings with regard to programme evaluation in table 4.7 are that the respondents agree or strongly agree that the curriculum delivery interventions helped them to achieve the expected outcomes. Most of the respondents agree or strongly agree that the programme helped them to improve their interpersonal relations with their colleagues and to become more collaborative in dealing with issues or activities in the school. Sherril, Foucek and Waterbury (2005, p. 10) explain that evaluation is an opportunity for programme improvement and can be viewed as an integrated set of activities designed to identify programme strengths and areas for improvement. It is clear from the table that the respondents agree or strongly agree that this programme helped them to understand the management of curriculum delivery in the school.


4.2.5 Departmental support and professional development in the curriculum delivery intervention

Table 4.8: Extent to which the respondents' opinions with regard to the departmental support and professional development about the curriculum delivery in the school by the Department of Education

Summary results (N=60). Scale: 1 = Strongly disagree, 2 = Disagree, 3 = Uncertain, 4 = Agree, 5 = Strongly agree. Cells give n (%).

11. I understand the aims and objectives of the programme as explained to us:
    6 (10%) | 5 (8%) | 18 (30%) | 28 (47%) | 3 (5%) | mean 3.31034
12. I understand the details of the programme provided by departmental officials (missing frequency = 1):
    2 (3%) | 6 (10%) | 24 (41%) | 25 (42%) | 2 (3%) | mean 3.34483
13. The workshops provided by departmental support were clear in terms of the NCS (missing frequency = 1):
    3 (5%) | 14 (24%) | 14 (24%) | 26 (44%) | 2 (3%) | mean 3.22414
14. The workshops were effective:
    6 (10%) | 14 (24%) | 22 (37%) | 18 (30%) | - | mean 2.89655
15. I was supplied with all NCS documents (missing frequency = 3):
    5 (8%) | 13 (22%) | 5 (8%) | 33 (56%) | 4 (6%) | mean 3.36207
16. I attended workshops at least twice a year organised by the officials (missing frequency = 1):
    8 (14%) | 9 (16%) | 12 (21%) | 23 (40%) | 5 (9%) | mean 3.10345
17. Learners receive all learning materials for their respective learning areas:
    9 (15%) | 21 (35%) | 17 (29%) | 10 (17%) | 2 (3%) | mean 2.55172
18. Our school is well supplied with furniture (chairs, tables, etc.):
    12 (20%) | 26 (44%) | 5 (8%) | 17 (29%) | - | mean 2.44828
19. Our school is well renovated:
    21 (35%) | 15 (26%) | 6 (8%) | 16 (26%) | 2 (3%) | mean 2.43103
20. In our school the teacher-learner ratio is larger than 1:32:
    9 (15%) | 9 (16%) | 3 (5%) | 13 (22%) | 26 (44%) | mean 3.58621
21. The skills development was managed to cover everybody:
    11 (18%) | 20 (33%) | 13 (24%) | 14 (22%) | 2 (3%) | mean 2.60345
22. Our school is well supported on team building from the provincial department:
    14 (23%) | 30 (50%) | 8 (14%) | 8 (15%) | - | mean 2.18966
23. SMT members are provided with skills development on management:
    10 (17%) | 16 (27%) | 15 (30%) | 17 (24%) | 4 (6%) | mean 2.81034
24. SMT members are provided with skills development on leadership:
    16 (27%) | 11 (19%) | 16 (27%) | 10 (17%) | 3 (5%) | mean 2.62069
25. I understand how to develop the school improvement plan with the aid of officials:
    5 (8%) | 19 (32%) | 14 (29%) | 18 (25%) | 2 (3%) | mean 2.67241
26. I understand how to develop the school policies with the aid of officials (missing frequencies = 7):
    5 (8%) | 26 (43%) | 16 (27%) | 16 (28%) | 1 (2%) | mean 1.94828

In table 4.8 the stark difference between the opinions of the respondents regarding the departmental support and professional development in the curriculum delivery intervention is clearly evident. As the data in table 4.8 show, there are problems in terms of the support the departmental officials should provide to the school. In terms of district support, Clarke (2007, p. 75) explains that it is the structure closest to the site of delivery, i.e. the district, that needs to take final responsibility for the performance of the schools falling under its jurisdiction. He further says that the district is responsible for supporting, monitoring and evaluating the school on an ongoing basis. According to the respondents, the departmental officials have not done anything to address the shortage of resources experienced in the school. The majority of respondents are either uncertain (40%) or agree (45%) that they understood the details of the CDIP provided by the departmental officials, while 61% disagree or are uncertain whether the departmental workshops were effective (see item 14). Furthermore, 73% of the respondents strongly disagree or disagree that the school is well supported by the provincial department regarding team building (see item 22). The respondents strongly disagree (15%) and disagree (36%) that learners receive all learning materials for their respective learning areas (see item 17). Regarding departmental support in terms of skills development, the researcher wanted to know whether skills development exists to develop each and every educator in the school (see item 21); the respondents strongly disagree (18%) and disagree (33%) that this is the case. On whether the school is well supplied with furniture, the respondents strongly disagree (20%) and disagree (44%) (see item 18). Respondents reacted strongly concerning the teacher-learner ratio, with sixty-six percent (66%) strongly agreeing or agreeing that the teacher-learner ratio is larger than 1:32 (see item 20).

With regard to the departmental support and professional development, the findings of the investigation indicate that the respondents were uncertain whether the workshops held were effective. Thirty-four percent (34%) of the respondents strongly disagree or disagree that the workshops held were effective, while thirty percent (30%) agree that the workshops were effective. Fifty-one percent (51%) of the respondents disagree or strongly disagree that learners received all learning materials for their respective learning areas. Sixty-four percent (64%) disagree that the school was supplied with enough furniture, and sixty-one percent (61%) disagree or strongly disagree that the school is well renovated.

In terms of the learner-teacher ratio, 66% of the respondents agree that the ratio is larger than 1:32. Fifty-one percent (51%) of the respondents disagree or strongly disagree that the skills development covers everybody in the school. Another form of support the department is supposed to provide to schools in order to empower educators is staff team building; the majority of the respondents (73%) disagree or strongly disagree that the staff is well supported on team building in the school. Forty-six percent (46%) of the respondents strongly disagree or disagree that School Management Team members were provided with skills development on leadership.


4.2.6 Implementation of the programme, dissemination of information and curriculum development in the curriculum delivery intervention

Table 4.9: Extent to which the respondents' views with regard to the dissemination of information and the implementation of the CDIP in the school by the Department of Education were achieved

Summary results (N=60). Scale: 1 = Not at all, 2 = Less extent, 3 = Uncertain, 4 = Somewhat, 5 = To a large extent. Cells give n (%).

27. How well did the regional office inform you on the curriculum intervention programme?
    9 (15%) | 19 (32%) | 20 (33%) | 11 (18%) | 1 (2%) | mean 2.64407
28. How well does the regional office advise you when they visit the school? (missing frequency = 1):
    13 (22%) | 17 (29%) | 14 (24%) | 12 (20%) | 3 (5%) | mean 2.59322
29. How effective is communication in curriculum delivery in the school? (missing frequency = 2):
    2 (3%) | 14 (24%) | 13 (22%) | 25 (43%) | 4 (7%) | mean 3.23729
30. How effective was the implementation in the planning of lessons? (missing frequency = 2):
    3 (5%) | 16 (27%) | 16 (27%) | 22 (37%) | 1 (2%) | mean 3.00000
31. How effective was the implementation of the forms of assessment? (missing frequency = 1):
    2 (3%) | 16 (27%) | 14 (24%) | 26 (44%) | 1 (2%) | mean 3.18644
32. How effective was the implementation of the CDIP in the school? (missing frequency = 4):
    4 (7%) | 17 (30%) | 19 (34%) | 15 (27%) | 1 (2%) | mean 2.86441
33. How well do staff respond to the CDIP? (missing frequency = 3):
    4 (7%) | 17 (30%) | 17 (30%) | 17 (30%) | 2 (4%) | mean 2.98305
34. How well does the school handle the extra classes necessary? (missing frequency = 3):
    4 (7%) | 12 (22%) | 5 (9%) | 14 (26%) | 19 (35%) | mean 3.67797
35. How effective was the moderation of all educational tests by the SMT?
    1 (2%) | 13 (23%) | 11 (19%) | 24 (42%) | 8 (14%) | mean 3.45763
36. How effective was the moderation of all educational examination question papers by the SMT? (missing frequency = 5):
    4 (7%) | 9 (16%) | 8 (15%) | 24 (44%) | 10 (18%) | mean 3.44068
37. How effective were the SMT members in ensuring educators plan lessons by using schedules? (missing frequency = 8):
    5 (10%) | 21 (38%) | 15 (27%) | 13 (25%) | 5 (10%) | mean 3.03390
38. How effective was the lesson observation in class by SMT members? (missing frequency = 4):
    7 (13%) | 18 (35%) | 10 (18%) | 11 (21%) | 3 (5%) | mean 2.72881
39. How effective was communication in the school? (missing frequency = 2):
    3 (5%) | 12 (21%) | 13 (22%) | 25 (43%) | 8 (14%) | mean 3.35593
40. How well is the monitoring of curriculum delivery handled by the regional office? (missing frequency = 1):
    5 (8%) | 22 (37%) | 10 (17%) | 18 (30%) | 1 (2%) | mean 2.74576
41. How clear is the standard of curriculum delivery in the school by the regional office? (missing frequency = 1):
    6 (10%) | 18 (30%) | 20 (34%) | 15 (25%) | - | mean 2.79661
42. How useful are the standards of curriculum delivery in the school?
    1 (2%) | 9 (15%) | 21 (35%) | 28 (47%) | 1 (2%) | mean 3.33898


35. How effective was the moderation of all educational tests by SMT?

Missing frequency = 5

36. How effective was the moderation of all educational examination question papers by SMT?

Missing frequency = 8

37. How effective were the

SMT members in ensuring educators plan lessons by using schedules?

Missing frequency = 4

38. How effective was the lesson observation in class by SMT members

Missing Frequency = 2

39. How effective was communication in the school?

Missing Frequency = 1

40. How well is the monitoring of curriculum delivery handled by the regional office?

Missing frequency = 1

41. How clear is the standard of curriculum delivery in the school by the regional office?

42. How useful are the standards of curriculum delivery in the school?

5

6

1

7

3

5

1 2% 13 23% 11 19% 24 42% 8 14% 3.45763

4 7% 9 16% 8 15% 24 44% 10 18% 3.44068

10%

13% 21 38%

5%

8%

10%

2%

18

12

22

18

9

35%

21%

37%

30%

13

15 27% 10 18%

10

13

20

15% 21

25%

17%

22%

34%

11

25

18

15

21%

43%

30%

25%

35% 28 47%

5

3

8

1

-

1

10%

5%

14%

2%

-

2%

3.03390

2.72881

3.35593

2.74576

2.79661

3.33898


In table 4.9 different variables address the dissemination of information and the implementation of the programme. Overall, the respondents indicated that the objectives of the implementation of the programme and the dissemination of information had been met only "somewhat" or to a lesser extent. In reply to the question of how well the monitoring of curriculum delivery is handled by the regional office, the responses were "Not at all" (8%) and "Less extent" (37%). This shows that the monitoring of curriculum delivery in the school is not conducted effectively by the regional office. Forty-seven percent (47%) of the respondents indicated "Less extent" or "Not at all" on how well the regional office informed them about the curriculum delivery intervention programme. This means that the information about the introduction of the programme in the school did not reach staff members (see item 27). Carl (1997, p. 167) confirms the above-mentioned statements: the real measure of success during the application phase is largely determined by the quality of the planning, design and dissemination done beforehand. He further says that the success of implementation may be assured if the dissemination has been effective and specific strategies are followed during implementation. A sizeable proportion of the respondents (37%) indicated "Less extent" or "Not at all" when asked how effective the implementation of the curriculum delivery intervention in the school had been (see item 32). This indicates that the implementation of the CDIP in the school is not happening as the respondents expected.

The findings show that the curriculum development, the implementation of the programme and the dissemination of information in the school by the departmental officials were not effectively communicated. In terms of how effective the implementation of the CDIP in the school was, the finding was that the implementation of the curriculum delivery intervention is not happening exactly as expected. The findings on how well the monitoring of curriculum delivery is handled by the regional office affirmed the opinions of the respondents that this had not been done according to expectations.


4.2.7 Integration of interventions, monitoring and evaluation in the curriculum delivery intervention

Table 4.10: Extent to which the respondents' views on the integration of interventions and on evaluation and monitoring were achieved

Summary of results (N=60). Scale: 1 = Strongly disagree, 2 = Disagree, 3 = Uncertain, 4 = Agree, 5 = Strongly agree. Cells give n (%).

43. The programme is structured in such a way that it is monitored from the regional office (missing frequency = 1):
    7 (12%) | 5 (8%) | 23 (39%) | 21 (35%) | 3 (5%) | mean 3.15000
44. This programme helped our staff to be able to integrate all components within the programme with that of the daily programme of educators (missing frequency = 4):
    6 (11%) | 7 (13%) | 26 (47%) | 17 (30%) | - | mean 3.03333
45. The programme is structured in such a way that it is monitored within the school by the SMT members (missing frequency = 1):
    4 (7%) | 5 (8%) | 15 (25%) | 32 (54%) | 3 (5%) | mean 3.41667
46. The officials from the regional office visit the school at least twice a year (missing frequency = 8):
    3 (6%) | 15 (29%) | 6 (12%) | 12 (23%) | 16 (31%) | mean 3.55000
47. When departmental officials visit the school, they check all the school's files (missing frequency = 4):
    1 (2%) | 12 (21%) | 10 (18%) | 21 (38%) | 12 (21%) | mean 3.60000
48. Curriculum implementers are sent to the school so that they organise class visits with educators:
    3 (5%) | 10 (17%) | 14 (23%) | 24 (40%) | 9 (15%) | mean 3.43333
49. Departmental officials always inform our SMT members before they visit the school (missing frequency = 16):
    20 (45%) | 4 (9%) | 17 (39%) | 3 (7%) | - | mean 2.01667
50. The programme helped our staff to resolve conflicts better when they arise (missing frequency = 2):
    10 (17%) | 15 (26%) | 25 (42%) | 8 (13%) | - | mean 2.51667
51. The programme helped our staff to improve teaching performance in the school (missing frequency = 2):
    6 (10%) | 8 (13%) | 26 (43%) | 18 (31%) | - | mean 2.90000
52. This programme has enabled us to understand how to develop the school improvement plan through IQMS (missing frequency = 2):
    6 (10%) | 10 (17%) | 24 (41%) | 16 (28%) | 2 (3%) | mean 2.98333

Table 4.10 shows the respondents' views on the CDIP in terms of the integration of the interventions, monitoring and evaluation. In table 4.10 the respondents had to indicate whether the integration of the interventions in the curriculum delivery programme is clearly linked to the daily programme of the school. According to Piaget, in Ornstein and Hunkins (2004, p. 110), integration "refers to horizontal relationships of curriculum experience" and means that the organisation of experiences should be unified in relation to other elements of the curriculum being taught, and that subjects should not be isolated or taught as single courses apart from the rest of the subjects. Thirty percent (30%) of the respondents agreed that the staff is able to integrate all components within the programme with the daily programme of the school (see item 44), while 47% of the respondents were uncertain whether this had been the case.

Most of the respondents (55%) agreed or strongly agreed that the curriculum implementers were sent to the school to organise class visits with educators (see item 48). Thirty percent (30%) of the respondents agreed or strongly agreed that this programme helped the staff to improve teaching performance in the school, while 43% were uncertain (see item 51). Thirty-one percent (31%) of the respondents agreed or strongly agreed that this programme has enabled them to understand how to develop the school improvement plan through IQMS, while 41% indicated that they were uncertain or not sure (see item 52).

The findings with regard to the integration of the interventions, monitoring and evaluation emphasise that the majority of the respondents were not sure that the curriculum delivery interventions were integrated with their daily programme in the school. The majority of the educators indicated that they were not sure whether the programme helped them to integrate all components within the programme with the daily programme of educators. Most of the respondents agreed that the programme is monitored by the regional office.


4.2.8 Departmental support and professional development: curriculum delivery intervention programme

Table 4.11: Extent to which the respondents' views on the departmental support and professional development were achieved

Summary results (N=60). Scale: 1 = Strongly disagree, 2 = Disagree, 3 = Uncertain, 4 = Agree, 5 = Strongly agree. Cells give n (%).

53. My school is often visited by curriculum inspectors to provide assistance to the school (missing frequency = 1):
    8 (14%) | 11 (19%) | 12 (20%) | 23 (39%) | 5 (8%) | mean 3.10169
54. Learners receive all learning materials for their respective learning areas (missing frequency = 1):
    9 (15%) | 21 (36%) | 17 (29%) | 10 (17%) | 2 (3%) | mean 2.55932
55. Our school is well supplied with furniture (chairs, tables):
    12 (20%) | 26 (44%) | 5 (8%) | 17 (29%) | - | mean 2.42373
56. Our school is well renovated:
    21 (35%) | 15 (26%) | 6 (10%) | 16 (26%) | 2 (3%) | mean 2.40678
57. In our school the teacher-learner ratio is larger than 1:32:
    9 (15%) | 9 (15%) | 3 (5%) | 13 (22%) | 26 (44%) | mean 3.61017
58. I understand how to develop the school improvement plan with the aid of officials:
    5 (8%) | 19 (32%) | 15 (26%) | 17 (29%) | 4 (6%) | mean 2.96610
59. I understand how to develop the school policies with the aid of officials:
    5 (8%) | 26 (43%) | 16 (27%) | 10 (17%) | 3 (5%) | mean 2.66102
60. I have been provided with the schedule of visits to the school by the regional team:
    24 (40%) | 21 (35%) | 10 (17%) | 4 (7%) | 1 (2%) | mean 1.94915


In table 4.11 the respondents indicated that the school is often visited by curriculum inspectors to provide assistance to the school: forty-six percent (46%) of the respondents agree that curriculum inspectors are sent to the school, while 20% were uncertain about the visits by the curriculum implementers (see item 53). According to Adam (1997, p. 4), in Steyn and Van Niekerk (2007, p. 224), professional development covers a variety of activities, all of which are designed to enhance the growth and professional competence of staff members. Fifty percent (50%) of the respondents disagree or strongly disagree that learners received all learning materials for their respective learning areas (see item 54). Sixty-four percent (64%) of the respondents strongly disagree or disagree that the school is well supplied with furniture (see item 55). Sixty-four percent (64%) of the respondents strongly agree or agree that the teacher-learner ratio is larger than 1:32 (see item 57). Forty percent (40%) of the respondents disagree or strongly disagree that they understand how to develop the school improvement plan with the aid of officials (see item 58).

According to the findings in table 4.11, the majority of the respondents disagree that all learners receive learning materials. Most of the respondents disagreed that the school is supplied with enough chairs and tables. The majority of the respondents disagree or strongly disagree that the school is well renovated, and agree that the teacher-learner ratio is larger than 1:32 in the school. In terms of the skills development provided by the Department of Education, the respondents disagree or strongly disagree that they know how to develop the school improvement plan with the aid of officials.


4.2.9 The implementation of the programme and dissemination of information in the curriculum delivery interventions

Table 4.12: Extent to which the respondents' views on the implementation of the programme and dissemination of information were achieved

Summary results (N=60). Scale: 1 = Not at all, 2 = Less extent, 3 = Uncertain, 4 = Somewhat, 5 = To a large extent. Cells give n (%).

61. How well did the regional office inform you on the curriculum intervention programme?
    9 (15%) | 19 (32%) | 20 (33%) | 11 (18%) | 1 (2%) | mean 2.63333
62. How well does the regional office advise you when they visit the school? (missing frequency = 1):
    13 (22%) | 17 (29%) | 14 (24%) | 12 (20%) | 3 (5%) | mean 2.60000
63. How effective was the implementation of the CDIP in the school? (missing frequency = 9):
    4 (8%) | 8 (16%) | 22 (43%) | 15 (29%) | 2 (4%) | mean 2.85000
64. How well is the monitoring of curriculum delivery handled by the regional officials? (missing frequency = 1):
    5 (8%) | 22 (37%) | 13 (22%) | 18 (30%) | 1 (2%) | mean 2.76667
65. How clear is the standard of curriculum delivery in the school by the regional office? (missing frequency = 1):
    6 (10%) | 18 (30%) | 20 (34%) | 15 (25%) | - | mean 2.78333
66. How useful are the standards of curriculum delivery in the school?
    1 (2%) | 9 (15%) | 21 (35%) | 28 (47%) | 1 (2%) | mean 3.31667


Table 4.12 shows that forty-seven percent (47%) of the respondents indicated that they were informed "Not at all" or to a "Less extent" by the regional office about the CDIP (see item 61). Thirty-three percent (33%) of the respondents were uncertain, which means that these respondents were not sure whether they had been informed of the introduction of the CDIP in the school. Thirty-three percent (33%) of the respondents indicated "Somewhat" or "To a large extent" that the implementation of the CDIP was effective in the school, while 43% reported "Uncertain" (see item 63). Forty-five percent (45%) of the respondents reported that the monitoring of curriculum delivery had been handled "Not at all" or to a "Less extent" by the officials in the school (see item 64). This means that the monitoring of curriculum delivery is not handled as expected by the regional office and its officials.

The combination of the findings on curriculum development and the dissemination of information in table 4.12 above shows that the majority of the respondents were not aware, or aware only to a lesser extent, that the Department of Education had informed educators and school managers about the introduction of the curriculum delivery intervention programme. Most of the respondents were not sure whether the implementation of the curriculum delivery intervention programme was effective, while 28% of the respondents indicated that the implementation of the CDIP was effective to a larger extent.


4.2.10 Integration of interventions, monitoring and evaluation in the curriculum delivery intervention

Table 4.13: Extent to which the respondents' views on the integration of the intervention and monitoring and evaluation were achieved

Summary results (N=60). Scale: 1 = Strongly disagree, 2 = Disagree, 3 = Uncertain, 4 = Agree, 5 = Strongly agree. Cells give n (%).

66. The programme is structured in such a way that it is monitored from the regional office (missing frequency = 1):
    7 (12%) | 5 (8%) | 23 (39%) | 21 (35%) | 3 (5%) | mean 3.15000
67. This programme helped our staff to be able to integrate all components within the programme with that of the daily programme of educators:
    6 (10%) | 7 (12%) | 26 (43%) | 21 (35%) | - | mean 3.0333
68. The officials from the regional office visit the school at least twice a year (missing frequency = 8):
    3 (6%) | 15 (29%) | 6 (12%) | 12 (23%) | 16 (30%) | mean 3.5500
69. When departmental officials visit the school, they check all the school's files (missing frequency = 4):
    1 (2%) | 12 (21%) | 10 (18%) | 21 (38%) | 12 (21%) | mean 3.6000
70. This programme helped our staff to improve teaching performance in the school (missing frequency = 2):
    6 (10%) | 8 (14%) | 26 (45%) | 18 (31%) | - | mean 2.9000
71. This programme has enabled us to understand how to develop the school improvement plan through IQMS (missing frequency = 2):
    6 (10%) | 10 (17%) | 24 (41%) | 16 (28%) | 2 (3%) | mean 2.9833


In table 4.13 the respondents air their views on the programme in terms of the integration of the intervention, monitoring and evaluation. The largest group of respondents (40%) "Agree" or "Strongly agree" that the programme is structured in such a way that it is monitored from the regional office, while 39% of the respondents were "Uncertain" whether it is monitored by the departmental officials (see item 66). Forty-three percent (43%) of the respondents were "Uncertain" whether the programme helped staff to integrate all components within the programme with the daily programme of educators; only 35% of the respondents agreed that it did (see item 67). The majority of the respondents were "Uncertain" (43%) whether the programme helped them to improve teaching performance in the school, while 30% of the respondents agreed that the programme helped them to improve their teaching performance (see item 70).

The findings with regard to the integration of interventions, monitoring and evaluation show that the majority of the respondents were uncertain whether the programme was really monitored from the regional office. This means that the respondents are not sure that the programme is really structured in such a way that it is monitored by the regional office. The majority of the respondents were also uncertain whether the programme helped staff to improve teaching and learning so as to address performance in the school.


4.2.11 Programme evaluation in the curriculum delivery intervention

Table 4.14: Extent to which the respondents' views on the evaluation of the CDIP were achieved

Summary results (N=60). Scale: 1 = Strongly disagree, 2 = Disagree, 3 = Uncertain, 4 = Agree, 5 = Strongly agree. Cells give n (%).

72. This programme helped me to become more collaborative in my dealings with issues or activities in the school (missing frequency = 3):
    4 (7%) | 8 (14%) | 20 (35%) | 22 (39%) | 3 (5%) | mean 3.20000
73. This programme helped me to improve my level of managing curriculum in the school (missing frequency = 2):
    3 (5%) | 8 (14%) | 14 (25%) | 31 (53%) | 2 (3%) | mean 3.33333
74. This programme helped me to receive feedback immediately after evaluation was done:
    1 (2%) | 8 (13%) | 27 (45%) | 22 (37%) | 2 (3%) | mean 3.26667


Table 4.14 shows that forty-four percent (44%) of the respondents agreed or strongly agreed that this programme helped them to become more collaborative in their dealings with issues or activities in the school, while 35% of the respondents were uncertain (see item 72). Fifty-six percent (56%) of the respondents selected "Agree" or "Strongly agree" that this programme helped them to improve their level of managing curriculum in the school (see item 73). Forty-five percent (45%) of the respondents indicated that they were not sure that this programme helped them to receive feedback immediately after evaluation was done, while 37% agreed that they received feedback immediately after evaluation had been done (see item 74).

Combining the findings in table 4.14 above, the majority of the respondents indicated that this programme helped them to become more collaborative in their dealings with issues or activities in the school. Most of the respondents indicated that this programme helped them to receive feedback immediately after evaluation was done. The other respondents indicated that the programme helped them to improve their level of curriculum management in the school.

4.3 Results of the frequency analysis of the observational checklist

This section focuses on variable 1 to variable 35 of the observational checklist, which explores the researcher's perceptions of the usefulness of the curriculum delivery interventions in the management and administration, the curriculum management, and the monitoring and moderation of the work of educators in classroom practices. With the aid of a professional statistician, the data collected were statistically analysed and the frequency results were produced.

The frequency results are presented in the three tables below. In each sub-section of the observational checklist the observer was required to answer "Yes" or "No" to the question asked.


The following variables in the observational checklist were used by the researcher to inspect the school's files without interviewing the SMT members:

• V1 - Mission statement for the school
• V5 - School policies
• V21 - Work schedules for each learning area in the school
• V25 - Curriculum management plan used by SMT members
• V28 - Lesson observation tool used by the SMT members
• V29 - Annual assessment programme
• V31 - Monitoring tool used for moderating educators' files
• V33 - School improvement plan/school development plan

Table 4.15: Frequency analysis of items/variables the observer assessed on the management and administration documents

Variables (V1 to V33)   Frequency   Percentage
Yes                     9           64%
No                      7           36%
Total                   16          100%

Table 4.15 shows that for 64% of the items on the observational checklist (V2, V3, V4, V7, V8, V9, V11 and V12) the school has the documents in place, while for 36% of the items (V1, V5, V21, V25, V28, V29, V31 and V33) the school does not. The assumption is that the school management team members do not have the knowledge to develop these documents for the school.


Table 4.16: Frequency analysis of items/variables the observer assessed on the curriculum management

Variables (V15 to V30)   Frequency   Percentage
Yes                      9           56%
No                       8           44%
Total                    16          100%

Table 4.16 shows that for 56% of the variables (V15, V16, V18, V19, V7, V20, V24 and V26) the school has the documents, while for 44% of the items/variables on the observational checklist (V5, V21, V25, V28, V29, V31 and V33) the school does not. This shows that the school is not in possession of important documents it should have. The assumption can be made that the school management team does not have enough knowledge of how to develop these documents and make use of them.

Table 4.17: Frequency analysis of items/variables the observer assessed on the monitoring and moderation

Variables (V31 to V35)   Frequency   Percentage
Yes                      1           20%
No                       4           80%
Total                    5           100%

Table 4.17 shows that for 20% of the items on this part of the observational checklist (V35) the school has the documents, while for 80% of the items (V28, V29, V31 and V33, among others) the school does not. This shows that the school does not have the important documents it must have.


4.4 Interpreting the box-and-whisker plots

A systemic plot or box-and-whisker plot is used to give a picture or image of the variability in the data. The size of the rectangular box is determined by the first and third quartiles of the distribution, that is, the points below which 25% and 75% of the data in this study fall. The length of the box, the rectangle bounded by the hinges, represents the proportion of the distribution that falls between the 25th and 75th percentiles. The line across the box represents the median, and the whiskers extend to the minimum and the maximum, or the outermost adjacent values. The data are used below to illustrate the Curriculum Delivery Intervention Programme in the figures that follow. The differences between the variables considered in the CDIP, in which all participants in the programme rated the programme evaluation, the departmental support and professional development, the integration of interventions, monitoring and evaluation, and the curriculum development, implementation of the programme and dissemination of information, are illustrated in figures 4.1 to 4.9. Each of the interventions has its own figure, scaled and rated as in the tables, with different variables according to the intervention.


Figure 4.1: The plot of the programme evaluation vs. gender

Strongly disagree = 1 through strongly agree = 5

The box plot in figure 4.1 for males provides interesting information on the distribution of the evaluation of the programme (CDIP). Traditionally, the vertical axis of a box plot is read from the lowest value at the bottom to the highest value at the top. The median of the evaluation of the programme is just below 3.2 (the horizontal line inside the box), suggesting that the majority of the respondents were uncertain whether the programme had really helped them. The middle 50% of the distribution lies between 2.1 and 4, providing a rough estimate of the variability around the median. The lowest and highest extremes of the plot range between 1 and 4.4, which means that the lower whisker is longer than the upper whisker; the data are spread out more below the first quartile than above the third quartile. Most of the information is therefore clustered where respondents indicated “Strongly disagree” and “Uncertain”. Not one of the respondents showed agreement that the programme had helped them.

The box plot in figure 4.1 for the female respondents provides an additional visual description of the distribution of the programme evaluation. The median of the programme evaluation is just below 3.4 (the horizontal line in the box), suggesting that most of the respondents were uncertain or not sure whether the programme had helped them or not. The middle 50% of the distribution lies between 3 and 4 (the ends of the box), providing a rough estimate of the variability around the median. The lowest and highest extremes vary between 2 and 4.2, which means that the lower whisker is longer than the upper whisker; the data are spread out more below the first quartile than above the third quartile. Most of the information is clustered where respondents indicated “Disagree” and “Uncertain”.

In conclusion, both males and females, as indicated in the double box and whisker plot, are uncertain as to whether the programme really helped them or not. Both plots show that the respondents disagree or are uncertain that the programme delivered what was expected in the school. The box plot for males is longer than the box plot for females; thus, the information for males is spread out more than the information for females.
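The double box and whisker plots in figures 4.1 to 4.9 can be reproduced with standard plotting tools. The Python sketch below, using matplotlib, draws two boxes side by side with whiskers extending to the minimum and maximum values, as in the figures. The two groups of Likert scores are invented for illustration and are not the study’s data.

import matplotlib.pyplot as plt

# Hypothetical Likert scores for the two groups being compared
group_a = [1, 2, 2, 3, 3, 3, 3.2, 4, 4, 4.4]  # e.g. male respondents
group_b = [2, 3, 3, 3.3, 3.4, 3.6, 4, 4.2]    # e.g. female respondents

fig, ax = plt.subplots()
# whis=(0, 100) extends the whiskers to the minimum and maximum values
ax.boxplot([group_a, group_b], labels=["Group A", "Group B"], whis=(0, 100))
ax.set_ylabel("Strongly disagree = 1 through strongly agree = 5")
ax.set_title("Programme evaluation by group")
plt.show()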


Figure 4.2: The plot of the departmental support and professional development vs educational qualification.

Strongly disagree = 1 through strongly agree = 5

The box plot in figure 4.2 illustrates the distribution of information on departmental support versus the educational qualifications of the staff. The focus here is on those educators with a post-school diploma or B-degree. The median of the departmental support information is just above 3 (the horizontal line inside the box), suggesting that the majority of the respondents were uncertain or not sure that the Department really gave support to the CDIP. Comparing those who mentioned that they were not sure or uncertain with those who disagreed, those who indicated “uncertain” form the majority.


Fifty percent of the distribution of the information on departmental support lies between 2.3 and 3.4 (the ends of the box), providing a rough estimate of the variability around the median. The lowest and highest extremes range between 2.2 and 4. The upper whisker is longer than the lower whisker, so the data are spread out more above the third quartile than below the first quartile. Most of the information is clustered where respondents indicated that they disagreed or were uncertain about the statement.

The second box plot in figure 4.2 applies to those educators with a degree plus a diploma and a post-graduate qualification. The median of the departmental support information is just below 2.4 (the horizontal line inside the box), suggesting that the majority of the respondents disagreed that the school received enough support from the Department. The centre of the distribution lies between 2 and 3.1 (the ends of the box), providing a rough estimate of the variability around the median. The lowest and highest extremes range between 1.4 and 3.3. Most of the information is clustered where respondents indicated that they disagreed or were uncertain about the statements listed in the questionnaire.


Figure 4.3: The plot of the integration of interventions, monitoring and evaluation vs streams in the school.

Not at all =1 through to a larger extent = 5

The box plot in figure 4.3 describes the distribution of information about the integration of interventions (classroom practices) by educators in their respective streams in the school. The focus here is on streams 1, 3, 5 and 7. The plot indicates that the majority of the respondents are uncertain or not sure whether the integration of interventions in the classroom has been successful. The middle 50% of the distribution lies between 2.2 and 3.3 (the ends of the box), providing a rough estimate of the variability around the median. The lowest and highest extremes range between 1 and 4. The lower whisker is longer than the upper whisker, indicating that the data are spread out more below the first quartile than above the third quartile. Most of the information is clustered where respondents indicated “Strongly disagree” and “Disagree”. None of the respondents is therefore certain whether the programme helped him or her.

Science educators responded to items 2, 4, 6 and 8 in the questionnaire; the distribution of their responses is illustrated in figure 4.3. The figure suggests that the majority of the respondents are uncertain or not sure that the CDIP managed to help educators integrate the interventions into their daily programme. The middle 50% of the distribution lies between 2.4 and 3.2 (the ends of the box), providing a rough estimate of the variability around the median. The lowest and highest extremes deviate between 2.2 and 3.4. The lower whisker is longer than the upper whisker, so the data are spread out more below the first quartile than above the third quartile. Most of the information is clustered where respondents strongly disagreed with the question raised in the questionnaire.

In conclusion, the two streams (Science and General) indicated different views about the integration of interventions (classroom practices). Respondents from both streams indicated that they were uncertain whether the integration of the interventions is possible in the classroom. In the double box and whisker plot, the first box, based on the responses of educators in the General stream, is longer than the box based on the responses of educators in the Science stream. The information from the General stream educators is therefore spread more broadly than the responses gathered from the Science educators.


Figure 4.4: The plot of the departmental support and professional development vs. post level

Strongly disagree = 1 through strongly agree = 5

The box plot in figure 4.4 provides interesting information on the distribution of information about the support the school is supposed to receive from the Department of Education. The focus here is on the views of educators about departmental support, indicated by 1 in figure 4.4. The median of the departmental support is symmetrical at 2.4 (the horizontal line inside the box), suggesting a match between the respondents who disagreed and those who were uncertain. The middle 50% of the distribution lies between 2.1 and 3.2 (the ends of the box), providing a rough estimate of the variability around the median. The lowest and highest extremes range from 1.1 to 4.15. The lower whisker is approximately equal to the upper whisker, so the spread of the data below the first quartile is roughly equal to the spread above the third quartile. The information is clustered where most of the respondents strongly disagreed or were uncertain. In this case not one of the respondents gave any indication that the programme helped him or her.

The second box plot is based on the views of the school management team members on departmental support, indicated by 2, 3 and 4 in figure 4.4. The median of the departmental support is just above 2.3 (the horizontal line inside the box), suggesting that the majority of the respondents disagreed that departmental officials support the school. The centre of the distribution lies between 2.2 and 2.4 (the ends of the box), providing a rough estimate of the variability around the median. The lowest and highest extremes range from 2 to 3.2, which means that the lower whisker is shorter than the upper whisker, so the data are spread out more above the third quartile than below the first quartile. Most of the information is clustered at the point where respondents indicated their disagreement and uncertainty with the questions. None of the respondents gave any indication that the programme helped him or her.

In conclusion, both educators and school management team members indicated different views on the support the Department is supposed to offer to the school. Both box plots show that the respondents disagree or are uncertain that the support the departmental officials are supposed to offer to the school is evident. In the double box and whisker plot in figure 4.4 above, the first box plot, based on the responses of educators, is longer than the box plot based on the responses of the school management team members. The information in the box plot for educators’ responses is spread more than that in the box plot for SMT members’ responses.


Figure 4.5: The plot according to the dissemination of information vs. post level

Strongly disagree = 1 through strongly agree = 5

Figure 4.5 provides additional information about the dissemination of information by the Department of Education to the school. The focus here is on the views of educators about the dissemination of information, indicated by 1 in figure 4.5. The median of the dissemination of information is at 2.2 (the horizontal line inside the box), suggesting that the respondents disagreed that they had been informed of the curriculum delivery intervention programme. The middle 50% of the distribution lies between 2.2 and 3 (the ends of the box), providing a rough estimate of the variability around the median. The lowest and highest extremes range between 1.4 and 3.3. The lower whisker is shorter than the upper whisker, so the data are spread out more above the third quartile than below the first quartile. Most of the information is clustered where respondents indicated “Disagree” and “Uncertain”. In this case not one of the respondents gave any indication that the programme helped him or her.

The second box plot in figure 4.5 is based on the views of the school management team members about the dissemination of information, assigned to items 2, 3 and 4 in figure 4.5. The median of the dissemination of information is just above 2.4 (the horizontal line inside the box), suggesting that the majority of the respondents disagreed that departmental officials informed the school of relevant information regarding the content of the programme. The middle 50% of the distribution lies between 2.2 and 3.1 (the ends of the box), providing a rough estimate of the variability around the median. The lowest and highest extremes range from 1.4 to 4. Most of the information is clustered where respondents indicated “Disagree” and “Uncertain”. In this case not one of the respondents gave any indication that the programme helped him or her.

In conclusion, educators and school management team members indicated that they had different views in their responses about the dissemination of information. The majority of the respondents indicated that they disagree that they had been informed of what the programme entails. Both plots show that the respondents disagree or are uncertain that the information was disseminated when the programme started.


Figure 4.6: The plot of the integration of interventions, monitoring and evaluation vs. post level.

Strongly disagree = 1 through strongly agree = 5

Figure 4.6 provides information on the distribution of information about the integration of interventions in the classroom. The focus here is on the views of educators about the integration of interventions, indicated by 1 in figure 4.6. The median of the integration of interventions, or classroom practices, is at 3.2 (the horizontal line inside the box), suggesting that the respondents were uncertain or not sure that the integration of interventions happened as expected in the school. The middle 50% of the distribution lies between 3 and 3.2 (the ends of the box), providing a rough estimate of the variability around the median. The lowest and highest extremes range from 2.25 to 4. The information is clustered where most of the respondents indicated that they disagreed or were uncertain about the statements listed.


The second box plot is based on the views of the school management team members about the integration of interventions, assigned the numbers 2, 3 and 4 in figure 4.6. The median of the integration of interventions is just below 3.2 (the horizontal line inside the box), suggesting that the majority of the respondents were uncertain or not sure that the integration of interventions happened as expected in the school. The middle 50% of the distribution lies between 3 and 3.3 (the ends of the box), providing a rough estimate of the variability around the median. The lowest and highest extremes range from 2.3 to 4.15. The lower whisker is longer than the upper whisker, so the data are spread out more below the first quartile than above the third quartile. The information is clustered where the respondents indicated “Disagree” and “Uncertain”.

In conclusion, educators and school management team members indicated that they had the same views on the integration of interventions in the school. Both box plots show that the respondents disagree or are uncertain that the integration of interventions in the classroom situation is happening as expected in the school. In the double box and whisker plot, the first box plot, based on the responses of educators, is shorter than the second box plot, based on the responses of the school management team. The information in the box plot for educators’ responses is spread less than that in the box plot for SMT members’ responses.


Figure 4.7: Plot of departmental support and professional development vs. age.

Strongly disagree = 1 through strongly agree= 5

Figure 4.7 provides information on the distribution of information about the support the school is supposed to receive from the Department of Education. The focus is on the views of educators who are less than forty years (< 40 yrs) of age about departmental support, indicated by < 40 in figure 4.7. The median of the departmental support is at 2.2 (the horizontal line inside the box), suggesting that the respondents were uncertain or not sure whether departmental officials offered clear support to the school. The middle 50% of the distribution lies between 2 and 3.1 (the ends of the box), providing a rough estimate of the variability around the median. The lowest and highest extremes vary between 2.4 and 4.1.

The second box plot in figure 4.7 is based on the views of educators over forty years of age (> 40 yrs). The median of the departmental support is symmetrical at 3 (the horizontal line inside the box), suggesting that the majority of the respondents were uncertain that departmental officials offered relevant support to the school. The middle 50% of the distribution lies between 2.3 and 3.3 (the ends of the box), providing a rough estimate of the variability around the median. The lowest and highest extremes deviate between 1.3 and 4. The information is clustered where most of the respondents indicated that they disagreed with the statements listed.

In conclusion, educators in the two age groups indicated that they had the same views on the support the Department is supposed to offer to the school. Both box plots show that the respondents disagree or are uncertain that the support the departmental officials are supposed to offer to the school is provided. The box plot based on the responses of educators over forty years of age is longer than the box plot based on the responses of educators aged less than forty years; the information for the older educators is spread more than that for the younger educators.


Figure 4.8: The plot of the program evaluation vs. post level in the school.

Strongly disagree = 1 through strongly agree = 5

Figure 4.8 provides information on the distribution of the programme evaluation (CDIP). The focus here is on the views of educators about the evaluation of the programme, indicated by 1 in figure 4.8. The median of the evaluation of the programme is symmetrical at 3.1 (the horizontal line inside the box), suggesting that the majority of the respondents were uncertain or not sure whether the programme had really helped them or not. The middle 50% of the distribution lies between 2.3 and 4 (the ends of the box), providing a rough estimate of the variability around the median. The lowest and highest extremes range between 1.2 and 4.3. Most of the information is clustered where educators indicated that they disagreed with the statement.

The second box plot in figure 4.8 shows the distribution of the programme evaluation by the respondents. The focus here is on the views of SMT members about the evaluation of the programme, indicated by 2, 3 and 4 in figure 4.8. The median of the response to programme evaluation is just below 3.3 (the horizontal line in the box), suggesting that most of the respondents were uncertain or not sure whether the programme had helped them or not. The middle 50% of the distribution lies between 2.4 and 4 (the ends of the box), providing a rough estimation of the variability around the median. The lowest and highest extremes vary between 2 and 5. The fact that the lower whisker is shorter than the upper whisker means the data are spread out more above the third quartile than below the first quartile. It therefore appears as if the information is clustered where most of the respondents indicated that they agreed with the statement.

In conclusion, both educators at post level 1 and SMT members indicated that they disagree or are uncertain regarding the matter. Both box plots illustrate that the respondents disagree or are uncertain that the programme helped them. In the double box and whisker plot, the first box plot, based on the responses of educators at post level 1, is longer than the box plot for the SMT members. The information in the box plot for educators is spread more than the information in the box plot for SMT members.


Figure 4.9: The plot of the program evaluation vs. teaching experience

Strongly disagree = 1 through strongly agree = 5

Figure 4.9 provides information on the distribution of the evaluation of the programme. The focus is on the views of educators who have more than ten years of teaching experience, indicated by > 10 years in figure 4.9. The median of the evaluation of the programme is at 3.3 (the horizontal line inside the box), suggesting that the respondents were uncertain or not sure that the programme had helped them. This means that not one of them agreed that the programme itself had played a role in bringing about changes since it started in the school. The middle 50% of the distribution of the educators’ evaluations lies between 2.4 and 3.3 (the ends of the box), providing a rough estimate of the variability around the median. The lowest and highest extremes range between 1.3 and 4.2. The lower whisker is longer than the upper whisker, so the data are spread out more below the first quartile than above the third quartile. The information is clustered where respondents indicated “Disagree” or “Uncertain”.

The second box plot in figure 4.9 is based on the views of those educators with less than ten years of experience, indicated by < 10 years in figure 4.9. The median of the programme evaluation is just below 3.4 (the horizontal line inside the box), suggesting that the majority of the respondents were uncertain or not sure that the programme had helped them to see changes since it started in the school. The middle 50% of the distribution lies between 2.4 and 4 (the ends of the box), providing a rough estimate of the variability around the median. The lowest and highest extremes range from 1.1 to 4.4, which means that the lower whisker is longer than the upper whisker, so the data are spread out more below the first quartile than above the third quartile. The information is clustered where respondents indicated “Disagree” and “Uncertain”. In this case not one of the respondents showed any agreement that the programme had helped them.

In conclusion, educators with more than ten years of teaching experience and those with less than ten years had the same views in their responses about the evaluation of the programme in the school. Both box plots show that the respondents disagree or are uncertain that the programme helped them. In the double box and whisker plot, the first box plot, based on the responses of educators with more than ten years of teaching experience, is longer than the second box plot, based on the responses of educators with less than ten years. The information in the box plot for the more experienced educators is spread more than that in the box plot for educators with less than ten years of teaching experience in the school.


4.5 Analysis of the findings of the qualitative approach to the investigation

4.5.1 Introduction

In this section the researcher analyses the findings that were obtained during the individual interviews with the participants in the school, using the interview guide approach.

McMillan and Schumacher (2006, p. 473) explain the interview guide approach as an approach in which the researcher selects interview topics in advance but decides the actual sequence and wording of the questions during the interview. Creswell (2005, p. 231) explains that this analysis initially consists of developing a general sense of the data, and then coding descriptions and themes about the central phenomenon. He further describes transcription as the process of converting audiotape recordings or field notes into text data. In this section logical analysis was used to explore the phenomenon. Supporting this view, McMillan and Schumacher (2006, p. 375) contend that in logical cross-analyses, usually presented in matrix format, categories are crossed with one another to generate new insights for further data analysis.
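As a small illustration of such a cross-analysis, the Python sketch below crosses coded interview segments (theme by sentiment) into a matrix using pandas. The participant codes, themes and sentiment labels are hypothetical placeholders, not the study’s coded transcripts.

import pandas as pd

# Hypothetical coded interview segments
segments = pd.DataFrame({
    "participant": ["P1", "P1", "P2", "P2", "P3", "P4", "P4"],
    "theme": ["professional development", "departmental support",
              "professional development", "dissemination of information",
              "departmental support", "professional development",
              "monitoring and evaluation"],
    "sentiment": ["negative", "negative", "negative", "negative",
                  "negative", "positive", "negative"],
})

# Cross the two category sets into a matrix to surface patterns
matrix = pd.crosstab(segments["theme"], segments["sentiment"])
print(matrix)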

The findings focus on professional development, departmental support, integration of interventions into the daily programme of educators, dissemination of information and implementation, and monitoring, evaluation and feedback.

4.5.2 Professional development (training)

All four of the frustrated educators I interviewed mentioned that they lack support from the Department of Education in terms of professional development. The interesting point is that the principal of the school was one of the participants, participant 2 (P2).

In addressing the question whether it was possible to learn anything from the completion of class registers, participant 1 (P1) replied that some educators could not complete the registers correctly (P1.5.1) and that they could also not balance the register correctly (P1.5.2).


Participant 2 (P2) complained about the shortage of qualified Mathematics, Physical Science and Accounting teachers.

The other three educators (P1, P3, P4) said that, despite the fact that the teaching staff of a school devotes significant time and resources to professional development, it would be better if the Department could send them to university or give them a very effective workshop of at least a week or more (P1.9.1, P3.9.1 and P4.9.1).

Skills development is meant to develop educators, but this programme does not actually cover everybody. The reason it cannot cover many educators is that the Department of Education does not have enough funds for the programme (P3.9.2).

One participant (P2.9.3) mentioned that other educators had applied for skills development support, but no feedback had been provided thus far.

Another educator (P4) said the Department of Education should be serious about staff development (P4.9.2). The participant went on to explain that it is very important for the Department not to take educators for granted as people who merely assign homework as a matter of course; the district must learn the critical contribution that job-embedded professional development can make to general school excellence (P4.9.2).

One of the four educators I interviewed said that he no longer experienced problems with educator development when an educator was sent to in-service training covering the following (P1.9.1):

• In-service training that assists schools in terms of skills development

• Professional growth

• Personal development

• On-the-job training

• Personnel development


He believes that professional development covers quite a number of activities, all of which are designed to enhance the growth and professional competence of the staff members (P1.9.3).

The synthesis of the above findings is that the Department has to focus on staff development. Focusing on staff development in the school is the key to effective quality improvement. The dividends of this approach include a more effective school and improved learner achievement, greater satisfaction and higher morale.

In the case of professional development, it seemed that the Department does not actually support educators to advance their knowledge and skills in various fields of study. The Department organises workshops and focuses on staff regardless of the learning area an educator is teaching, but the officials who conduct the workshops are themselves not knowledgeable in the learning areas they facilitate. Gillmer, Pienaar, Van Dyk and White (1998, p. 60) emphasise that professional teachers need constant in-service training for general professional growth and effective service to the community.

The problem with the Department is that it does not prioritise the development of education specialists in an effort to manage curriculum implementation effectively. It does not make sure that principals are empowered so that they are able to ensure that every education specialist develops and is supported. According to Combrinck (2003, p. 60), the major problem in the Department was a lack of in-service training for teachers. He further states that teachers in general felt that training was inadequate, which made them feel incompetent. The Department does not make sure that all curriculum implementers facilitate learning areas in which they have expertise (especially scarce learning areas such as Mathematics, Physical Science and Accounting). It does not ensure that education specialists are empowered to develop policies so that, when they visit schools, they know what they are looking for. Departmental officials do not propagate the Skills Development Act, No. 97 of 1998, so that they collaborate with universities. The province must make sure that the Integrated Quality Management System (IQMS), which is the national Department of Education’s model for school improvement, serves the purpose it is intended to serve.

4.5.3 Departmental support for the CDIP

One educator reported that the regional office does not manage the programme properly, especially the planning itself. If departmental officials fail to support educators in terms of inputs, the learning outcomes will suffer. Educators emphasised that the Department of Education places some schools under quintile 1, which means that learners in those schools do not pay school fees; such schools depend on the support of the Department in terms of funds so that they can sustain themselves.

In answering the question whether there are problems that the school experiences with regard to departmental support, one educator (P1.9) replied that the problems experienced were related to the shortage of qualified educators and teacher-learner ratios that exceed the stipulated regulation. Further problems were overcrowding in the classrooms and a shortage of furniture. This participant (P1.9.1) wanted to know how it could be possible for a school to perform under these conditions.

Three educators interviewed indicated that the problem with retaining staff is that the Department of Education is competing with the private sector. Most schools lose qualified educators, who join higher-paying institutions in the private sector. This applies specifically to those educators teaching Mathematics and Physical Science. Schools and regional offices no longer have control over the movement of Mathematics and Physical Science educators because of the low salaries the Department pays (P2.9.4, P3.9.4 and P4.9.4). Another educator pointed out that most educators are now being absorbed by the private sector, and that there is a provincial and national outcry about the shortage of educators, particularly in certain scarce learning areas (P2.9.5).


Another problem concerns workshops that do not address the school’s needs. The Department must come up with a way to monitor these workshops so that educators will feel supported in terms of the curriculum itself (P1.9.2).

The professional development workshops should cover any of the professional development activities. If the Department of Education wants to show support to schools, it must first develop educators (P2.9.6). It is evident that the best and most effective workshops are those where individual educators and learning area teams have tested and shared ideas on good practices (P2.9.7).

Participant 1 (P1) pointed out that educators are no longer employed on the basis of their competency. For example, there are curriculum implementers who are specialists in their chosen fields of expertise, including the didactics of the learning area, who are not utilised by the same education department (P1.9.3). P1 added that, although there are excellent curriculum implementers who can encourage and inspire educators, sadly there are some who are relatively poorly qualified and lack teaching and assessment experience (P1.9.4).

Competency refers to the ability of educators to perform their normal duties and functions effectively. This means they should have the necessary ability, knowledge and skill to perform their functions successfully (P1.9.5). The induction of new educators is one of the critical elements of developing a committed and competent teaching staff with a shared vision of what constitutes good teaching and learning (P1.9.6).

In answering the question regarding different ways to develop educators, participant 2 (P2) replied that skills development is one way of improving an educator (P2.10.1). Another way is for the Department to organise workshops and make sure that educators are supported. The same participant argued that the Department has to monitor the workshops and send well-qualified facilitators to them (P2.10.2). Programmes such as the teach-a-teacher programme should be promoted in schools (P2.10.3). The suggestion was that the Department form partnerships with higher institutions of learning, such as universities, to assist with content knowledge (P2.10.4).

On the question whether sufficient support is given by the Department to schools, and whether all NCS documents are used in the cluster meetings, participant 3 (P3) replied that they have all NCS documents and policy documents (P3.11.1).

In answering the question about how often the regional office visited the school and provided feedback, participant 2 (P2) replied that officials visit schools twice a year and give feedback (P2.15.1).

To the question whether the regional office provides the school with learner support materials, participant 4 (P4) replied that there is an excellent stationery supply (P4.16.1). Educators completed a top-up form (a form used to order textbooks from the Department) for this year, but no textbooks have been delivered to the school by the Department of Education (P4.16.2).

In answering the question about what problems have been experienced with regard to developmental support from the regional office, participant 1 (P1) replied that the problem is that curriculum implementers do not visit the school on a regular basis (P1.18.1). The Department also has a problem replacing educators who decide to leave the profession and join industry (P1.18.2).

The synthesis of the findings of the investigation is that, within the broader context of education in South Africa, it is important that school managers reach the set objectives, because schools will then be able to reach the outcomes they want to achieve. Everything depends on school managers applying their minds to effective tasks if they want to achieve the set objectives. It also depends on the support given to schools to fulfil all the expectations the Department hopes to achieve. The National Department of Education (2010, p. 3) indicates that it is also focusing on strengthening teaching and learning in Grade R by distributing learning and teaching support packs for Grade R teachers to all 13 900 schools that offer Grade R. One can conclude that there seems to be no cohesion from school level up to provincial level on issues of communication, quality of teaching and learning, curriculum provision and resources. The Department of Education must seriously take the following into consideration and extend its support so that the learning context can be strengthened: learner achievement, quality of teaching and learning, curriculum provision and resources, and leadership and communication in schools. According to Singh (1997, p. 31) in Combrinck (2003, p. 52), some of the problems raised by teachers are the effectiveness of the assessment approach in large classes, the influence of a lack of facilities and resources on assessment methods, and the influence of multicultural classes on all aspects of assessment.

When I interviewed the participants, I found that most of the responses show that the Department does not provide schools with relevant learning materials, such as textbooks, renovated buildings or furniture, so that they are able to address the issue of overcrowding in schools. The teacher-learner ratio is larger than recommended in the national regulation, which constrains the teaching strategies the educator can use in the classroom.

Unless educators can use resources, including books, equipment, accommodation and time, educators’ expectations of learners will not be attained. It is not easy for educators to control and manage the learners’ work, especially the arrangements made by educators for learners with different abilities, notably the most able and those with learning difficulties. Combrinck (2003, p. 60) explains that teachers agree that the number of learners in a class and the availability of resources and facilities play a significant role in the success of the system.

I have realised that the Department does not provide the schools with enough facilities, such as school libraries and science laboratories. Learners studying Mathematics and Physical Sciences suffer in terms of acquiring knowledge, because these learning areas need practical work.


The Department does not succeed in retaining well-qualified staff in schools; most educators leave and join the private sector.

Many officials sent to schools by the Department are not experts in their fields. The Department needs to offer more workshops to departmental officials. Lucas (2009, p. 4) states that you can have all the knowledge in the world between your ears, but if you cannot effectively communicate it in a way that allows your learners or educators to gain it, retain it, recognise and recall it, and use it, they will likely leave the room feeling cheated. He further states that you must ensure that there is a transfer of learning from you to them and ultimately to the workplace; you must act as a conduit. Russel (1999, p. 107) further states that memory is important in accelerated learning because it drives how well learners retain new or changed learning.

4.5.4 Integration of interventions into the daily programme of educators

In answering the question how these interventions are assimilated into the daily programme of educators in the school, participant 2 (P2) replied as follows:

I’m not saying the Department is failing completely; there are those workshops that are helpful. We hear from educators who even disclose that their knowledge and skills have improved (P2.19.1).

The researcher wanted to know how often SMT members meet and discuss issues related to the school. Participant 3 (P3) replied that they meet to talk about the core functions or duties of SMT members, and hold meetings twice a month (P3.6.1).

On the question how the SMT members receive workshops on quality assurance, participant 2 (P2) replied that the Department organises workshops for HODs. However, those workshops are not always effective, and schools end up organising their own workshops (P2.20.1).

The synthesis of the findings of the investigation shows that the integration of interventions into daily classroom practices emerged as vitally important. Welch (1993, p. 267) indicated that a curriculum focusing on strategies for efficient learning and functioning is intended to help inefficient learners assimilate strategies and metacognitive behaviours that will enable them to meet the demands of mainstream classrooms. Three planning phases play a very significant role: the learning programme, the work schedule and lesson planning. Gipps (1995, p. 28) in O’Leary and Shiel (1997, p. 224) explains that the NCA in England and Wales has given teachers a better understanding of the curriculum, has improved their understanding of what children can do, has raised expectations for student achievement and has broadened teachers’ classroom practices. The learning programme is a learning area plan for the phase (all grades). The work schedule is based on the learning programme; it describes what is planned for one year and sequences the learning outcomes and assessment standards to ensure progression across the four terms of the year. Willoughby, Wood, McDermott and Mclaren (2000, p. 20) explain that learners who use elaborative interrogation may have greater arousal or engage in more effortful processing than learners in control groups; the connections made to the learner’s knowledge base seem to be the most common explanation for elaborative interrogation’s effectiveness.

One of the participants I interviewed said the Department sends officials with no expertise to the workshops to support educators on content knowledge. Departmental officials tend to forget that the documents need more explanation, as they are not easy to understand. The educators need clear guidance on the NCS policy. If educators understand how the concepts in the NCS have to be interpreted, they will be able to follow a good approach to sharing the workload and their expertise. The National Department of Education (2010, p. 5) explained that the NCS should provide clear guidelines on what educators ought to teach and assess on a grade-by-grade and learning area/subject basis. Learning area heads should be able to work together as a team in the school. According to Shepard et al. (1996) in Schafer, Swanson, Bene and Newberry (2001, p. 152), teachers are better able to utilise assessments as part of instruction by explaining learning goals to students through descriptions of the activities they will be expected to perform and the criteria against which they will be assessed.

It is also very important for the SMT members to ensure that they are familiar with learning and teaching materials, so that they are in a position to monitor what their learning area heads and learning area teams are doing. This can help learning area heads and educators to make sure that the assessment strategies they use provide a coherent framework for student assessment, and that the feedback they provide to students and their parents is an accurate reflection of their performance relative to some external and objective benchmark. Meisels, Bickel, Necholson, Xue and Atkins-Burnett (2001, p. 75) state that evidence is accruing on the potential of learners’ or educators’ performance on assessments to improve teaching and learning.

4.5.5 The dissemination and implementation of information

In answering the question whether participants had been informed of the programme in advance, participant 1 (P1) replied that they had not been informed of the programme (P1.8.2).

In response to the same question, participant 2 (P2) replied that officials must make sure that they engage all the people involved, choose purposeful activities, decide on the time span for the programme, the staff involved, how the money will be spent, the physical resources required, evaluation procedures and the structures needed to put the programme into effect (P2.8.1). This second participant (P2) also stated that some of the criteria that need to be met when implementing any programme include the following:

• The extent of management support.

• Clear rationale and objectives of the programme.

• Use of quality materials.

• A reasonable plan for achieving objectives.


• Lastly, a communication flow and feedback that are part of the process and programme.

In response to the question whether participants had been informed of the goal and objectives of the programme, participant 3 (P3) replied that the Department is just “pumping” these objectives at educators (P3.8.1). The same participant (P3.8.2) replied as follows: “What I have observed is that different teams come with the aim to support the school, but when you ask for assistance, you cannot get it”.

On the question whether different ways are used to communicate with educators, participant 2 (P2) replied that when he compares the Department’s programmes with programmes run by organisations outside the Department, there is a great difference. Non-departmental organisations do well in making sure that participants receive information about workshops on time (P2.8.1). Receiving information in advance of any workshop or school visit is important, as it gives educators a chance to prepare themselves (P2.8.2).

On the question of what different ways should be used by the district to implement a fruitful programme, participant 2 (P2) replied that the Department decides on behalf of educators and imposes its needs haphazardly upon them. The Department needs to address the needs of educators in the school properly by finding out what their needs are, or by conducting a needs analysis on a continuous basis (P2.10.1). No programme can be successful without being led. Once there is a vision and strategy for a particular programme, leaders need to be sure that all members of staff or people from the school district understand the vision and its value so that they can work toward it (P2.10.2). There is nothing more destructive to the successful implementation of a programme than the perception that the leader is willing to “talk the talk” but not “walk the walk”. Leaders from the district are not able to motivate and inspire those around them by helping them to understand the value and benefits of the vision to the school and to all those involved in it and committed to it (P2.10.3).


When I combine all the responses to the sub-questions analysed above, it becomes clear that educators complain that departmental officials did not inform them of the programme (the Curriculum Delivery Intervention Programme). Many curriculum development initiatives have failed because curriculum dissemination has not been done. Carl (1997, p. 167) emphasises that dissemination is often regarded as synonymous with implementation, while the two should in fact be regarded as separate phases. During dissemination the climate for the envisaged change is created and all users are prepared for it. Once the design has been finalised, the dissemination phase normally follows. The dissemination of information comprises the preparation of the curriculum through the distribution or promulgation of information, thoughts and concepts.

The information collected from the interviewees indicated that they had not been informed of the curriculum delivery intervention programme initiated by the Department. Good and regular communication with the staff in the school, especially the school management team members, plays a vital role in any institution during the building phase. New developments are important in the life of a school because they are planned as improvements to the existing curriculum. It is important for the Department to make sure that the improvements and value they bring are specifically promoted and celebrated. Hall (1998, p. 49) indicated that earlier studies discovered that the key is not merely having other change facilitators active at the school site; the important difference seems to be related to how well the principal and these other change facilitators work together as a change facilitating team. He further stated that it is this team of facilitators, under the lead of the principal, that makes successful change happen in schools. The Department should keep SMT members informed with regular updates and always remind them of the value the developments will bring. A very important point in this regard is that during dissemination ways must be thought out and utilised to eliminate resistance to change, in such a fashion that later implementation will progress successfully.


Implementation is meant to ensure that the performance of the employer’s obligations, with the aid of employees, satisfies the needs of ordinary people on the street. Lee, Moore and Taylor (1981, p. 708) explain implementation as the manner in which the manager may come to use the results of scientific effort, and state that the problem of implementation is determining which activities of the scientist and manager are most appropriate to bring about an effective relationship. At school level, all school managers should develop a visionary quality and the ability to achieve objectives by using goal setting and planning. Wheelen and Hunger (1987, p. 209) emphasise that the successful implementation of a strategy depends on having the right organisational structure, resource allocation, compensation programme, information system and school culture. The teams that function within the vision and mission of the school, or of any programme within the Department of Education, should have a vision and specific goals to achieve, using the activities within the school. The real measure of success during this application phase is largely determined by the quality of the planning, design and dissemination done beforehand. Another very important point is that the success of the implementation may be assured if the dissemination has been effective and specific strategies are followed during implementation.

It has already been mentioned that the implementation of curriculum delivery is problematic. It is very important for the Department to set up a number of task teams to work on the process of clarifying and developing the strategy, as set out above, to achieve the goals that have been agreed on. The Department must also make sure that these task teams can assign duties to members of staff in schools as they work toward the goals. The task teams need to be the champions of their particular goals set by the Department so that they can drive and monitor the implementation of the strategic plan of the programme. Meetings of a task team need to be scheduled on a regular basis so that members can report on progress and share ideas on their successes and challenges.


4.5.6 Monitoring, evaluation and feedback

Two educators reported that, with the knowledge they have of monitoring and evaluation, one needs to understand what to evaluate, identify problems, give feedback and work on the identified problems. If a school does not have expertise in evaluation and monitoring, it cannot achieve all its set objectives. Wholey (2001, p. 344) states that evaluators can assist policymakers and managers in identifying intended outcomes, establishing or revising agency and programme goals, identifying factors that could affect achievement of the goals, and developing strategies for achieving the goals.

When the school invites them, departmental officials have to assist and support the school.

The remaining two educators claimed that they work under stressful conditions during these school visits. Officials come to the school and instruct staff to change certain aspects. The educators also indicated that when the Department of Education comes up with a way of supporting schools, officials arrive during school hours and disturb the school’s progress and plans for that day. Educators are forced to leave their classes and busy themselves preparing documents, neglecting the core business of the day, which is teaching and learning.

The researcher wanted to know how the implementation of the CDIP is monitored and how educators receive feedback during the course of the implementation. Participant 2 (P2) replied that the term monitoring means control, where control is the process of monitoring to ensure that the processes decided upon for the improvement of schools are carried out as agreed. Lopez (2008, pp. 278-280) defines performance improvement as a systematic approach to improving productivity and competence, using a set of methods and procedures for realising opportunities related to people’s performance. He further emphasises that performance improvement has evolved from an instructional focus to a performance focus, where instructional solutions are but a subset of the range of solutions required to close performance gaps. The district does not have specific people assigned to schools for monitoring (P2.21.1), and it should not wait until a school is in a mess before intervening (P2.21.2).


To the same question, participant 4 (P4) replied that the quarterly process of assessing and reporting on school performance provides the district with an important opportunity not only to monitor and review the performance of its schools but also to monitor and review the overall academic performance of each school against established benchmarks (P4.21.1). Gipps (1999, p. 363) explained that governments have linked economic growth to educational performance and are using assessment to help determine curriculum, to impose high “standards” of performance, and to encourage competition among schools. Black (2000, p. 408) supports this statement, noting that innovations which included strengthening the practice of formative assessment produced significant, and often substantial, learning gains. It is also an opportunity to monitor and review the efficiency of the systems and procedures used to collect and compile the information needed to produce these reports (P4.21.2).

In answering the question about how soon the school receives feedback, a participant (P3) replied: “We do receive feedback immediately, but negative feedback most of the time” (P3.23.1). However, for most educators, staff appraisal is a good method for providing organised and specific feedback after a school visit.

In combining the analysis of the sub-questions above, I found that there is not enough support from the side of the departmental officials. Husen and Postlethwaite (1989, p. 2985) state that information cannot be employed to best effect for policy purposes unless education systems are systematically monitored over time. Monitoring is very simple to understand: it merely requires one to distribute all agreed activities to the employees and to follow up to see whether all the activities agreed upon have been executed in the way expected. The findings above show that there is no effective monitoring, because the Department cannot identify the exact needs of an individual school.


Evaluation implies checking whether all shared activities have been conducted as agreed, so that identified problems are addressed and positive feedback is provided. O’Leary and Shiel (1997, p. 222) explain that shortcomings in any of these areas might be expected to undermine teachers’ interpretations of the profiling system and hence its value as an assessment and reporting framework. According to Gipps (1995, p. 28) in O’Leary and Shiel (1997, p. 224), the NCA in England and Wales has given teachers a better understanding of the curriculum, has improved their understanding of what children can do, has raised expectations for student achievement and has broadened teachers’ classroom practices. Too often staff members only receive feedback when something goes wrong; when things go well, there are no comments.

As a prelude to introducing the findings of this study, and to provide some context within which to interpret these data, it is useful to report that the data clearly show that the programme was moderately successful. The programme was structured by representatives from the Department, who admitted that there are some problems and challenges the schools are experiencing. Lichtenberg and Ogle (2006, p. 235) emphasise that even if the results of an evaluation are less than ideal, at least the problem areas are exposed and plans can be developed or altered accordingly to deal with those deficits. They further state that it is a far better situation for correctional education when programme deficits or weaknesses are exposed as a result of an internal evaluation than to have outside agencies or evaluators with an influence on funding evaluate correctional education informally. The school managers and educators indicated that the curriculum delivery intervention delivered to schools may not be successful, because the support the Department gives to schools is not up to standard.

In the end, this programme had a measurable impact on curriculum delivery in the school. However, this impact varied from large to small, depending on the type of curriculum delivery needed in the school.


Chapter 5

Main findings, recommendations and implications

5.1 Introduction

This chapter highlights the compiled results of the data gathered and analysed in terms of educators’ and school management teams’ responses to a Curriculum Delivery Intervention Programme (CDIP). The focus of this dissertation is on the opinions of staff on the support and implementation of curriculum delivery interventions. As such, this chapter is strongly informed by the interventions discussed earlier, namely departmental support, classroom practices and professional development. The above-mentioned interventions refer to the following:

• Departmental support - the delivery of all necessary resources the school should have, such as learning materials, furniture, funds, human resources and enough classrooms.

• Classroom practices - the provision of the three levels of planning (learning programme, work schedule and lesson plan) and the relevant curriculum policy documents.

• Professional development - the development of educators through workshops and skills development entrenched in the constitution of this country.

5.2 Summary of the problem of the study

The main problem with regard to the academic performance of learners in the Mpumalanga Department of Education applies to the Foundation and Intermediate phases. Through the research conducted by commissions instituted by the Department of Education, it became clear that the reason for poor performance in schools is the lack of effective curriculum management and the poor delivery of the curriculum by both school managers and departmental officials. The results of a progression analysis clearly showed that the academic performance of learners from the Foundation to the Intermediate phase is very poor due to poor curriculum management by the school managers. The school managers feel that they are not well supported by the Department in terms of departmental support, classroom practices and professional development. Research by a commission instituted by the Department indicated that not all school managers have been trained on the National Curriculum Statement and how to manage its implementation.

5.3 Summary of the research question and sub-questions

The key research question that guided this investigation reads:

“How do educators and School management team members respond to the curriculum delivery intervention programme in township schools in terms of departmental support, professional development and classroom practices?”

The main issue the researcher wanted to understand was how educators and school management team members respond to the intervention programme and the delivery of the curriculum in the school. The researcher explored and/or explained the main topic according to the interventions, from which the following sub-questions were formulated:

• How were educators informed about the implementation of the curriculum delivery intervention programme?

• What training have they received regarding the implementation of the curriculum delivery programme?

• What support did they get from Mpumalanga Department of Education?

• How did they implement the curriculum delivery intervention programme?

• How was the implementation monitored?

• How often did they receive feedback from programme evaluation?

• How was the curriculum delivery intervention programme integrated into the daily programme of the educators?


5.4 Summary of the aims and objectives of the study

The aims and objectives of the study were to explore and/or explain how educators and school management team members responded to a curriculum delivery intervention programme in terms of departmental support, professional development and classroom practices, in order to improve the quality of teaching and learning in township schools. The study also aimed to provide the researcher with a list of the intended outcomes of the curriculum delivery intervention programme. The following outcomes were envisaged:

• To establish how educators are informed about the implementation of the Curriculum Delivery Intervention Programme (CDIP).

• To determine what training educators have received regarding the implementation of the curriculum delivery intervention programme.

• To determine what support educators received.

• To establish how the implementation of the Curriculum Delivery Intervention Programme was achieved.

• To establish how the implementation was monitored.

• To establish how often educators received feedback from programme evaluation.

• To determine how the Curriculum Delivery Intervention Programme is integrated into the daily programme of the educators.

5.5 Findings of the literature review

Curriculum delivery intervention has an extensive and emerging knowledge base. Researchers such as Meyer (2002, p. 149), Carl (1997, p. 47), Clarke (2007, p. 18) and Ornstein and Hunkins (2004, p. 322) share the sentiment that, in terms of curriculum development and programme implementation within the CDIP, a curriculum needs to be planned in such a way that it delivers a related learning programme within a structured workplace. The literature also emphasises that curriculum development lends itself to different interpretations and identifies six authoritative phases that show how curriculum development progresses (e.g. curriculum design, curriculum dissemination or dissemination of information, and curriculum implementation). It was also found that the implementation of interventions requires a plan to develop the intervention itself and a strategy to ensure commitment throughout the intervention.

Hall (1998, p. 49), Lee, Moore and Taylor (1998, p. 708), Wheelen and Hunger (1987, p. 209), Carl (1997, p. 167) and Ornstein and Hunkins (1993, p. 304) have emphasised that the support or delivery contained in the programme is the dissemination of information, wherein communication plays an important role. It is almost an axiom that whenever a new programme is being designed, communication channels must be kept open, so that the new programme does not come as a surprise to the customers.

Adam (1997, p. 4), Clarke (2007, p. 95), Kruger (2003, p. 220), Gillmer, Pienaar, Van Dyk and White (1998, p. 60), Combrinck (2003, p. 60), Monteith, Van der Westhuizen and Nieuwoudt (2002, p. 22) and DOE (2000, p. 16) have also identified that professional development covers a variety of activities, all of which are designed to enhance the growth and professional competence of staff members. It was also found that experienced teachers could benefit from being reminded of the range of strategies available to them. The authors further explain that teachers’ participation in professional development can enhance the success and effectiveness of a professional development programme. It was also found that the focus must be on professional development of staff during a particular period, for instance in classroom management and lesson presentation, and that a visiting teacher should be asked to note how the teacher manages a particular aspect of learner behaviour or phase of a lesson in order to develop that teacher.

Clarke (2007, pp. 75 & 173), Department of Education (2007, p. 3), Department of Education (2010, p. 3), Sigh (1997, p. 31) and Woolfork (1998, p. 392) have also identified that, in terms of departmental support, the district is responsible for supporting, monitoring and evaluating the school on an ongoing basis. It was also found that it is essential that all school principals, deputy principals and education specialists undergo an intensive management and leadership training programme.


Kaplan and Owings in Clarke (2007, p. 223), Popham (2005, p. 52), Scheerens, Glas and Thomas (2003, p. 100), Department of Education (in NCS, 2003, p. 13), Meyer (2002, p. 74), Mestry and Grobbler (2004, p. 238), Department of Education (2008, p. 19), Piaget in Ornstein and Hunkins (2004, p. 110), Clarke (2007, p. 132), Moloi (2009, p. 135), Welch (1993, p. 267), O’Leary and Shiel (1997, p. 224), Gipps (1999, p. 363), Meisels (2001, p. 75), Schafer, Swanson, Bene and Newberry (2001, p. 152) and Willoughby, Wood, McDermott and McLaren (2000, p. 20) have also stressed that, in classroom practices or the integration of interventions, there is a need to create a positive learning climate. They further explain that educators need to select appropriate instructional goals and assessments, use the curriculum effectively and employ those teaching behaviours that help learners to learn at a high level. It was found that education is about teamwork and organisational processes that require new and innovative ways of learning and of managing performance improvement. The Department of Education should produce documents providing detailed information on how learning area teams and teachers should approach this planning, and on the kind of information that their planning material should include.

James and Donald (2004, p. 231), Ornstein and Hunkins (2004, p. 322), Clarke (2007, p. 252), Meyer (2002, p. 32), Wholey (2001, p. 344), Gipps (1999, p. 363), Black (2000, p. 408), Husen and Postlethwaite (1989, p. 2985), O’Leary and Shiel (1997, p. 222), Lichtenberger and Ogle (2006, p. 235) and Schmidt and Houang (2003, p. 979) advised, in terms of monitoring and giving feedback, that monitoring is the actual running of the programme on a day-to-day basis, tracking progress and making changes as necessary. The Department should ensure that the programme keeps on track towards delivering its final objectives and that it is also practised in the classroom situation by the educator. It was also found that a strategic intervention should be monitored on a continuous basis in order to identify successes, shortcomings and areas for improvement. The authors stressed that a key benefit of the learner portfolio is that it allows the learner to present his or her whole person and allows the teacher to judge the learner and give feedback. It was also confirmed that one possibility is to ask learning area heads to gather feedback from learning area meetings, and then for the issues identified to be discussed by the senior management team members.

5.6 Findings of the empirical investigation in both the qualitative and quantitative study

The findings indicate that the Curriculum Delivery Intervention Programme (CDIP), as implemented in the underperforming schools, is not a problem as such. The problem arises when some of the departmental officials misuse it, even though the goals and objectives of the programme are clear. The departmental officials visit or support schools to ascertain the reasons for poor performance and to devise effective strategies to improve results at the under-performing schools. The principals felt that they were not clearly informed of the implementation of the programme. Quantitatively, most of the respondents indicated “Uncertain” or “Disagree” that they were informed of the implementation of the programme. The findings of the qualitative and quantitative study are integrated for each sub-question. In both the qualitative and quantitative analysis of the programme, the findings per research question are based on the following: dissemination of information, departmental support, professional development, implementation of the programme, monitoring and evaluation, and classroom practices/integration of interventions.
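As a minimal illustration of how such per-statement response frequencies might be summarised, the short Python sketch below tabulates hypothetical Likert responses. The column names and response values are assumptions for illustration only; they do not reproduce the study’s actual questionnaire data or the data sheets prepared by the Department of Statistics.

# Illustrative sketch only: hypothetical questionnaire responses,
# not the study's actual records.
import pandas as pd

# One row per respondent, one column per questionnaire statement.
responses = pd.DataFrame({
    "informed_of_programme": ["Uncertain", "Disagree", "Uncertain",
                              "Agree", "Disagree", "Uncertain"],
    "workshops_effective": ["Disagree", "Disagree", "Uncertain",
                            "Disagree", "Agree", "Disagree"],
})

# Percentage of respondents per response category for each statement -
# the kind of summary behind phrases such as "most respondents
# indicated 'Uncertain'".
summary = responses.apply(lambda col: col.value_counts(normalize=True))
print((summary.fillna(0) * 100).round(1))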

5.6.2 Dissemination of information

• On the statement to establish how educators are informed about the implementation of the curriculum delivery intervention programme, educators complained about the way the Department handled communication with the schools. The researcher found that educators and the principal felt that the programme had been imposed on them. The principal did not even know the goals and objectives of the programme implemented in the school. Instead of supporting the school with relevant resources, the departmental officials turned into witch-hunters and started to target and harass some of the principals. According to the participants, the Department of Education ignored the fact that dissemination of information, as one of the elements of communication, plays a vital role in curriculum delivery interventions. Through my observations as a researcher, it was found that circulars inviting teachers to attend workshops do exist. The Department also mentions what it intends to do to improve the results of the schools in the province, and indicates what is contained in the documents and what it is going to do in the schools in terms of departmental support. The question the researcher asked about what is contained in the documents is: “Are they implementing it?” If all are involved and informed, they should be able to exercise a meaningful influence during the implementation of the programme. This determines the level of preparedness that prevails in the school. Quantitatively, the majority of the respondents indicated “Uncertain” or “Disagree” that they were informed of the implementation of the programme.

If a poor process is followed during the dissemination phase (for example poor communication), successful implementation becomes a complex challenge. There will also be no meaningful curriculum renewal or implementation without active involvement, dynamic leadership and significant change in the curriculum. The dissemination of information will not occur through wishful thinking, but through physical and mental preparedness; hard work and diligent application happen through actions.

5.6.3 Departmental support

• On the statement to determine what support educators and SMT members received, it was found that all staff members complained about the poor support received from the departmental officials. When the data are analysed qualitatively, they reveal well-considered suggestions as far as this delivery intervention in the school is concerned. The most important element that emerged is that the workshops organised by the Department were not effective, because the facilitators were not experts in the learning areas they were facilitating. There is a lack of support in terms of resources such as learner support materials, as well as a shortage of educators in Mathematics and Science. Learners study without textbooks and without extra learning materials (study guides). It also emerged during the investigation that there is not enough furniture in the classrooms and that the teacher-learner ratio is above what is stipulated in the regulation. Facilities such as the science laboratory, school library and technical workshops are not in good standing to support effective teaching and learning in the school.

During my observations it was found that copies of the top-up form (a form used to order school furniture and textbooks from the Department) had been completed, but the school had not received any learning support materials or furniture. School management team members were not in possession of the school development plan, the school improvement plan and other relevant policy documents that need to receive attention at the school. There was evidence of circulars inviting teachers to workshop training sessions, and of workshop attendance registers. The analysis of the quantitative data revealed that the majority of respondents disagreed that there were enough learning materials or textbooks, as well as furniture, in the school. In terms of the teacher-learner ratio, most of the respondents indicated “Disagree” on the statement that the teacher-learner ratio is larger than what is stipulated in the regulation. They also disagreed that all workshops run by the Department were effective.

The most effective facilitating factors that one may expect are a pleasant and positive climate of renewal, thorough planning, good communication, and a high level of curriculum expertise among facilitators and consumers. Support with all relevant resources from the Department of Education is necessary for schools to improve the mediocre performance that results from being poorly resourced. The worst situation is when educators find themselves addressed by a person who is not an expert; they then lose interest in such workshops.

5.6.4 Professional development

• On the statement to determine what training educators and SMT members have received regarding the implementation of the curriculum delivery intervention programme, it was found qualitatively that skills development does not cover everybody in the school. The workshops organised by the departmental officials were not effective due to a lack of expertise. Most of the educators experience heavy workloads and have too little time to achieve the standards they would like to. Educators are becoming more stressed in performing their duties, because the educators who are there are not enough and no new educators are coming in. A very important aspect the Department is not aware of is that an effective programme of staff professional development is a critical element of good teaching and learning.

One tool the Department of Education aims to promote is the Integrated Quality Management System (IQMS). The beauty of this model is that everybody understands it, but currently it is no longer serving its purpose, because of the lack of proper monitoring by the Department of Education. Educators do as they wish with this programme; they take it as a way of increasing their salaries without proper implementation. During my observation the documentation with regard to the IQMS was requested. It was found that some of the educators do not have files or portfolios for the IQMS programme. Quantitatively, the majority of respondents indicated “Disagree” that skills development covers everybody in the school. With regard to the IQMS, most of the respondents indicated “Uncertain” that it is being used properly to improve the performance of educators in the school.

Another tool the Department regards as useful is workshops. The Department organises workshops to help educators improve or advance their knowledge so as to take education to a higher level. However, the Department fails to retain competent staff members in various schools. Many educators leave teaching and join the private sector; there is serious competition over salaries between the private sector and the Department of Education. This is one of the inhibiting factors lying within the educational system.

5.6.5 Implementation of programme

• On the statement to establish how the implementation of the curriculum delivery intervention programme was achieved, it was found that the approach needed to be corrected. Qualitatively, the staff had not been prepared for the curriculum delivery intervention programme. The process of curriculum implementation needs a very serious, ongoing interactive process with the people involved in the programme, and this had not been done by the Department of Education. The task of implementation is never finished, for there are always new ideas to bring to the new programme, and new materials and methods that educators might wish to try out. The departmental officials, however, think that as long as the process has kicked in, everything stops. The following is of importance:

- Clarifying lines of authority
- Involving affected parties in goal setting, staff selection and evaluation
- Specifying roles and responsibilities of educators
- Training personnel in change strategies and conflict-resolution techniques
- Furnishing impacted parties with the necessary support

Quantitatively, the majority of the respondents indicated “Uncertain” as to whether the implementation of the curriculum delivery intervention was effective in the school.


5.6.6 Monitoring and evaluation (giving feedback) of the programme

On both statements, to establish how the implementation was monitored and how often feedback was received from programme evaluation, it was found qualitatively that the programme was not well monitored. The Department of Education did not do a needs analysis. Monitoring and reporting results is a key element of good teaching and learning, and the Department must not underestimate its value. Another important key word that needs to be looked into is feedback. Feedback is given to the principals, but it consists of whatever shortcomings the Department found in the running of the school (for example, that the school does not have a school improvement plan and other relevant documentation), and the Department does not help to develop the school managers. The departmental officials lambasted the principals during their visits to the schools. Staff members received feedback when something went wrong; when things went well, there were no comments.

Quantitatively, most of the respondents indicated “Uncertain” or “Disagree” that monitoring was effective. The respondents indicated “Uncertain” that feedback was given to the school principal by the departmental officials.

Positive feedback is a necessary element; one must therefore ensure that sufficient positive feedback is given.

5.6.7 Classroom practices and integration of interventions

On the statement to establish how the curriculum delivery intervention programme was integrated into the daily programme of the educators, it was found that, with regard to the integration of interventions, there were no class visits or classroom observations. The Department of Education did nothing to empower the school managers with regard to classroom observation. Most of the school managers could not see what the teachers were doing in the classroom situation. The unions took control and did not allow the school managers to organise classroom support for the educators. The lack of classroom support and class visits affects the core business of the school. The core business refers to daily planning, that is, the three phases of planning: learning programmes, work schedules and lesson planning. Another problem I encountered is that the departmental officials were not experts in the learning areas they were facilitating. They were not in good standing to clarify some of the new terminology in the NCS. Some educators did not even control the books of learners. It was found that educators were still behind as far as the arrangement of educators’ portfolios is concerned. The Department did not have a way to deal with the IQMS. Educators take advantage of the fact that their principals do not know the NCS policies.

Quantitatively, the respondents indicated “Strongly agree” that the programme helped them to integrate all the components of the CDIP into the daily programme of educators. The majority of respondents indicated “Uncertain” that the programme helped them to improve teaching performance in the school.

5.7 The findings of both the quantitative and qualitative sections of the investigation

• The findings on departmental support and professional development in both the quantitative and qualitative analyses helped everybody to understand the support the Department of Education provided to the school. Quantitatively, the respondents illustrated in Figures 4.1, 4.8 and 4.9 that they were uncertain as to whether the programme was helpful to them. For those who indicated “Uncertain”, this could mean that they were not sure whether there had been any changes since the programme started in the school. Qualitatively, the participants reported that they were not satisfied with the way the Department dealt with curriculum delivery in their school. The schools were, for example, not supplied with the necessary resources to implement the programme successfully. In terms of professional development, both approaches brought to light that the workshops were not effective, the reason being that the facilitators appeared not to be competent enough to handle those training sessions. The information based on the variable “qualification” was also analysed quantitatively. The respondents were either uncertain or disagreed as to whether the support that the Department of Education was supposed to provide had actually taken place. Qualitatively, all participants reported that they did not see any changes as far as curriculum delivery was concerned. Both male and female educators were still experiencing the same challenges as before.

• The data reported in Figures 4.3 and 4.6 were analysed quantitatively. As illustrated in Figure 4.3, respondents were uncertain whether the integration of interventions had taken place in the classroom situation. Learners were still without learner support materials, no clear direction on the usage of the National Curriculum Statement was given, classes were not visited in the school and overcrowding remained a challenge. Qualitatively, participants replied that they were still facing the same challenges as illustrated by the quantitative analysis. They still experienced problems in interpreting the policy in terms of the three levels of planning.

• In Figures 4.2, 4.4 and 4.7, the quantitative analysis of the data using box plots showed that the majority of respondents disagreed that departmental support and professional development happened in the school, the reason being that, according to them, there were still shortages in the school.
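To make the box-plot summary above concrete, the following Python sketch plots hypothetical Likert-coded responses (1 = “Strongly disagree” to 5 = “Strongly agree”) for the three intervention areas. The data, coding and labels are assumptions for illustration only; the sketch merely shows the kind of per-scale distribution plot reported in Figures 4.2, 4.4 and 4.7, not the study’s actual figures.

# Illustrative sketch only: hypothetical Likert-coded data, not the study's.
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(1)

# 60 respondents per scale, coded 1 (Strongly disagree) to 5 (Strongly agree).
scales = {
    "Departmental support": rng.integers(1, 4, size=60),
    "Professional development": rng.integers(1, 4, size=60),
    "Classroom practices": rng.integers(2, 6, size=60),
}

# One box per scale, so the medians and spread of the 60 respondents
# are directly comparable across intervention areas.
fig, ax = plt.subplots()
ax.boxplot(list(scales.values()))
ax.set_xticklabels(list(scales.keys()), rotation=10)
ax.set_ylabel("Likert code (1-5)")
ax.set_title("Responses per intervention area (illustrative)")
plt.show()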

• The data reported in Figure 4.5 were analysed quantitatively. The respondents indicated that they were uncertain whether the curriculum development and implementation of the programme, as well as the dissemination of information, had happened in the school. When the data were analysed qualitatively, the participants revealed that they were not properly informed of the programme or how it works. This allows one to conclude that, in terms of both the quantitative and qualitative investigation, the participants agreed that they were not well informed of the programme. Information dissemination had not been done properly.

• The data reported in Figure 4.3, analysed quantitatively using box plots, indicated that the respondents disagreed that the departmental officials were able to give the principals comprehensive feedback immediately after the school had been visited or after an evaluation had taken place. When the data were analysed qualitatively, the participants likewise disagreed that feedback was given to the principals immediately after an evaluation had been done in the school.

5.8 Recommendations

• Dissemination of information to schools should be a cornerstone of the service the Department offers its customers, ensuring that schools receive information on time. In terms of curriculum dissemination, regular communication on curriculum issues in the form of circulars should take place.

• In terms of departmental support and professional development in the schools, the departmental officials should ensure that all schools have all the curriculum policies, necessary resources and requisite documents for the implementation of effective teaching and learning. The departmental officials should also ensure that all workshops conducted by them are effective.

• In terms of classroom practices, the curriculum delivery intervention programme should be integrated by educators into their daily classroom programme. The departmental officials should engage educators’ unions on the issue of classroom observation or class visits by SMT members. The departmental officials should also make sure that learners receive learner support materials and that clear direction on the usage and interpretation of the National Curriculum Statement is given.


• The departmental officials should know that feedback is an important tool that the Department of Education needs to take seriously, and one that could help SMT members to improve performance.

5.9 Limitations

The researcher acknowledges that there may be some limitations with regard to this study. In the first place, access to the school could have been limited because of time restrictions placed by the Department of Education. The second limitation relates to the sample of teachers, which was confined to one school in one province of South Africa, in this case Mpumalanga; generalisation of the findings is therefore unlikely. Additionally, the present study does not take into consideration the opinions of learners or the school governing body about their involvement in curriculum delivery. Another study could be conducted to investigate the responses of learners. A further observation is that this research is limited to the evaluation of a curriculum delivery intervention programme; it could therefore be expanded into other research topics that have not yet been explored.

5.10 Suggestions for future study

• In order to contribute to the expanding knowledge base about the curriculum delivery intervention programme, there should be a collaborative effort to present the information about this curriculum delivery and its evaluation at scholarly conferences, and to publish a scholarly article addressing its implementation in a specific school.

• The sustained effects of this curriculum delivery on the three interventions (departmental support, professional development and classroom practices) should continue to be studied. Every effort should be made to collect data periodically from schools to determine the long-term impact of the curriculum delivery interventions.

• The Department of Education should continue to conduct research on the extent to which the principals in the province practise the characteristics of effective and cohesive school management team members, as identified in the scholarly literature.

• This last comment does not relate directly to an extension of this investigation into another field of specialisation, but the immediate advantage of the outcomes of the study is that they could be used in the internal staff development programme of the educators who participated in the study.

• The closing comment about this investigation is that the curriculum delivery interventions are hampered by inaction on the part of the Department in supporting the school with relevant resources. More specifically, the study suggests that the Department of Education should ensure that there is extensive support in the form of resources to avoid mediocre performance in the school.

• The departmental officials should have a way to ensure that schools receive information on time. Dissemination of information to schools, and onward to their staff, should be a cornerstone of the Department’s work.

• Classroom observations or class visits are still a problem in many schools. School management team members are no longer allowed to conduct class visits in the schools, because the teacher unions are often an obstacle in this regard. The Department should make sure that it engages the unions so that teacher development becomes possible in the school.

• Policy documents should be distributed and clearly explained, so that all educators in the different schools in the province or district are able to apply them in the classroom situation. The Department of Education should ensure that all schools have all the curriculum policies.

• Feedback should always be provided to the principals of the schools. Feedback is an important tool that the departmental officials need to take seriously, making sure that comprehensive feedback is given immediately after a school has been visited.


6. REFERENCES

Alkire, G. J. (1995). Shaping Your School’s Culture. Thrust for Educational Leadership,

May/June 1995, Vol. 24.

Banks, J. A. (1992). Multicultural Education: For Freedom’s Sake. Washington, 122

Miller Hall D. Q-12, Seattle.

Barootchi, N. & Keshavarz, M. H. (2002). Assessment of Achievement through Portfolios and Teacher-made Tests. Educational Research, Vol. 44 (3), 2002.

Black, P. (2000). Research and the Development of Educational Assessment. Oxford

Review of Education, Vol. 26, Nos, 3 &4, 2000.

Black, P. & Wiliam, D. (1998). Inside the Black Box: Raising Standards through Classroom Assessment. Phi Delta Kappan, 55 (7), 781-784.

Bradley, C. (2001). Teaching and Learning. Pearson Education Inc. Cape Town.

Brookhart, S. M. (2004). Classroom Assessment: Tension and Interactions in Theory and Practice. Teachers College Record, Vol. 106, No. 3 March 2004, 429-458.

Carl, A. E. (1997). Teacher Empowerment through Curriculum Development. Juta &

Co, Ltd. Cape Town.

Cheng, Y. C. & Cheung, W. M. (2004). Four Types of School Environment: Multilevel

Self-management and Evaluation Quality. Educational Research and Evaluation, Vol.10

(1), 71-10.

Clarke, A. (2007). Handbook of School Management. Kate McCollum. Cape Town.

Cohen, A. (1993). A New Educational Paradigm. Phi Delta Kappa, 74 (10), 791-795.

Cohen, L., Manion, L. & Morrison, K. (2002). Research Methods in Education.

Routledge: London and New York.

Combrinck, M. (2003). An international comparative perspective on outcome-based assessment: Implications for South Africa. Perspectives in Education, Volume 21(1),

March 2003.

Creswell, J. W. (2005). Educational Research: Planning, Conducting, and Evaluating Quantitative and Qualitative Research (2nd ed.). Pearson Education Inc: New Jersey 07458.


Crooks, T. J. (2002). Educational Assessment in New Zealand Schools. Assessment in

Education, Vol. 9, No. 2, 2002.

David, F. W., Robert A. P & Juliette, J. K. (1989). The Box Plot: A Simple Visual

Method to Interpret Data. Annals of Internal Medicine, Volume 110. Number 11

Davidson, E. J. (2005). Evaluation Methodology Basics. London: Sage publications.

Department of Education (2003). National Curriculum Statement Grades 10-12:

Languages English Home Language. Pretoria: Government Printers.

DOE (2005). Intermediate Phase Systemic Evaluation Report. Pretoria: Government

Printers.

DOE (2006). Speeches by Minister of Education. Pretoria 28 December 2006:

Government Printers.

DOE (2007). National Strategy for Learner Attainment. Nelspruit, February 2007:

Government Printers.

DOE (2007, February 30). Report of the Commission of Enquiry into the Department of

Education. Pretoria: Government Printers.

DOE (2010). Curriculum News: Improving the Quality of Learning and Teaching.

Government Printers. Pretoria.

DOE (2000). School Management Team: Managing and Leading schools. Government

Printers. Pretoria.

DOE (2000). School Management Team: Instructional Leadership. Government printers.

Gillmer, H. M., Pienaar, E. A., Van Dyk, J. N. & White, P. (1998). Education 2.

SACTE. New York.

Gipps, C. (1999). Socio-Cultural Aspect of Assessment. Review of Research in

Education, Vol. 24. (1999),. 355-392.

Government of South Africa (1996). The Constitution of South Africa Act 1996.

Government printers. Pretoria

Government of South Africa (1998). The Skills Development Act 1998. Government

Printers. Pretoria.

Hall, G. E. (1988). The principal as Leader of the Change Facilitating Team. Journal of

Research and Development in Education, Volume 22, Number 1, Fall 1988.


Hoffman, S. T. (2003). On the Evaluation of Curriculum Reforms. Curriculum Studies, 2003, Vol. 35, No. 4, 459-478.

Husen, T. & Postlethwaite T. N. (1989). The International Encyclopedia of Education.

Pergamon. Oxford.

Jacobs, M., Gawe, N. & Vakalisa, N. (2002). Teaching-learning Dynamics. Heinemann.

Johannesburg.

James, C. & Donald, Y. (2004). Project Management for Information Systems (4th ed.). Pearson Education Limited. New Jersey 07458.

Johnson, R. B. & Onwuegbuzie, A. J. (2004). Mixed Methods Research: A Research

Paradigm Whose Time Has Come. Educational Researcher, Vol. 33, No. 7, 14-26.

Keller, G. & Warrack, B. (2000). Statistics: for management and Economics. Thomson learning. Johannesburg.

Kgoseng, C. (2007, February 18). City Press.

Kotter, J. P. & Schlesinger, L. A. (1979). Choosing Strategies for Change. Harvard

Business Review, 57 (2): 106-114.

Leahy, L. F., Visscher, N. S. & Witziers, B. R. (2005). Classroom Assessment: Minute by Minute, Day by Day. Educational Leadership, Vol. 63 (3), November 2005.

Lee, S. M., Moore, L. J. & Taylor, B.W. (1981). Implementation of Management

Science. Brown company Publishers. Pretoria.

Leedy, P. D. & Ormrod, J. E. (2001). Practical Research: Planning and Design. Pearson

Education International. New Jersey 07458.

Lichtenberger, E. & Ogle, J. T. (2006). The Collection of Post-Release Outcome Data for the Evaluation of Correctional Education Programs. The Journal of Correctional

Education, 57(3). September 2006.

Lopez, L. G. (2008). Performance Evaluation: Proven Approaches for Improving

Programme and Organisational performance. Pearson Education. San Francisco.

Lucas, R. W. (2009). Training Workshop Essentials: Designing, Developing and

Delivering, Essential resources for Training and HR Professionals. A wiley Imprint.

New York.

McMillan, J. & Schumacher, S. (2006). Research in Education: Evidence-based Inquiry.

Pearson Education Inc. Cape Town.


Moloi, L. (2009). Exploring the perceptions of English second language teachers about learner self-assessment in the secondary school. University of Pretoria.

Monteith, J. L. & Nieuwoudt, H. D. (2002). Teaching and Learning. Potchefstroomse

Universiteit.

Monteith, J. L., Van der Westhuizen, P. C. & Nieuwoudt, H. D. (2002). Educational Research. Potchefstroomse Universiteit.

Mpumalanga Department of Education. (2006). Age of Hope: Development of an

Education System for Faster and Shared Growth. Nelspruit 14 June 2006: Government

Printers.

MDOE (2006). Newsletter Vol. 1/2006.

MDOE (2007). Speeches by MEC. Nelspruit: Government Printers.

MDOE (2008). Provincial Strategies for Learner Attainment. Nelspruit: Government

Printers.

Meisels, S. J., Bickel, D. D., Nicholson, J., Xue, Y. & Atkins-Burnett, S. (2001). Trusting Teachers’ Judgements: A Validity Study of a Curriculum-Embedded Performance Assessment in Kindergarten to Grade 3. American Educational Research Journal, Spring 2001, Vol. 38, No. 1, 73-95.

Mestry, R. & Grobbler, B. R. (2004). The Training and Development of Principals to

Manage Schools Effectively Using the Competence Approach. ISEA Principalship, Vol.

32, No. 3, 2004.

Meyer, M. (2002). Managing Human Resource Development. Butterworth. Cape Town.

O’Leary, M. & Shiel, G. (1997). Curriculum Profiling in Australia and the United

Kingdom: Some Implications for Performance- Based Assessment in the United States.

Educational Assessment, 4(3), 203-235.

Ornstein, A. C. & Hunkins, F. P. (1993). Curriculum: Foundations, Principles and Issues. Pearson. Cape Town.

Ornstein, A. C. & Hunkins, F. P. (2004). Curriculum, Foundations, Principles and

Issues. Pearson. Cape Town.

Patterson, W. (2005). Grounding School Culture to Enable Real Change. High School

Magazine, 7 March 2005, (5-6).


Popham, W. J. (2005). Classroom Assessment: What Teachers Need to Know? Pearson

Education Inc. New Jersey.

Ramurath, N. (2007, August 4). Kaelo – Stories of Hope. The Teacher. Johannesburg.

Russell, L. (1999). The Accelerated Learning: Making the Instructional Process, fast,

flexible, and fun. Pearson Education. San Francisco.

Schafer, W. D., Swanson, G., Bene, N. & Newberry, G. (2001). Effects of Teacher Knowledge of Rubrics on Student Achievement in Four Content Areas. Applied Measurement in Education, 14(2), 151-170.

Scheerens, J., Glas, C. & Thomas, S. M. (2003). Educational Evaluation, Assessment and Monitoring: A Systemic Approach. Swets and Zeitlinger. Abingdon.

Schmidt, W. H. & Houang, R. (2003). Cross-National Curriculum Evaluation.

International Handbook of Educational Evaluation, 964-996.

Sherril, B. G., Foucek, A. & Waterbury, A. (2005). Program Evaluation: Principles and Practices (2nd ed.). Portland: Northwest Health Foundation.

Steyn, E. J. & Van Niekerk, G. M. (2007). Human Resource Management. UNISA,

Pretoria

Stufflebeam, D. L., Madaus, G. F. & Kellaghan, T. (2000). Evaluation Models: Viewpoints

on Educational and Human Services Evaluation. Kluwer Academic. Boston.

Turnbull, B. (2005). Educating School-based Management: A Tool for Team Self-

Review. International Leadership in Education, January-March 2005, Vol. 8 (1), 73.

Van Deventer, I. & Kruger, A. G. (2003). School Management Skills. Van Schaik

Publishers. Pretoria.

Van der Westhuizen, P. C. (2003). Effective Educational Management. Kagiso Tertiary.

Pretoria.

Van der Westhuizen, P. C. (2003). Schools as Organizations. Van Schaik Publishers.

Pretoria.

Van Herpen (1989). Conceptual Models in Use for Educational Indicators. Paper for conference on education; San Francisco.

Welch, M. (1993). Establishing Educational Partnerships with Strategic Intervention through Video-Mediated Staff Development. Journal of Educational and Psychological

consultation, 4(3), 267-273.


Wheelen, T. L. & Hunger J. D. (1987). Strategy Implementation. Wesley Publishers.

Canada.

Wholey, J. S. (2001). Managing for Results: Roles for Evaluators in a New Management

Era. American Journal of Evaluation, Vol. 22, No. 3, 2001, pp. 343-347.

Willoughby, T., Wood, E., McDermott, C. & McLaren, J. (2000). Enhancing Learning through Strategy Instruction and Group Interaction: Is Active Generation of Elaborations Critical? Applied Cognitive Psychology, 14: 19-30 (2000).

Woolfork, A. E. (1998). Educational Psychology. Allyn and Bacon. London.


Department of Science, Mathematics and Technology Education

[email protected]

012 420 2207

8 October 2009

The Principal

Mabande Comprehensive High School

Box 1010

Ogies

2232

Dear Sir

APPLICATION TO CONDUCT RESEARCH IN YOUR SCHOOL

I hereby request permission to undertake research in your school.

The title of my research project is “The response of educators and school management teams on a curriculum delivery intervention programme: a programme evaluation study”. The research is conducted to meet the requirements pertaining to my studies at the University of Pretoria. The details of the project have been included in Annexure A.

I wish to seek permission to conduct interviews with the school management team, as well as with all educators who are currently involved in the curriculum delivery intervention programme managed by the Department of Education. I also wish to have access to educators’ curriculum documents, preparation files and assessment documents.

Thanking you in anticipation

Yours faithfully

Mr MS Nkwana

Cell: 0725195053


Department of Science, Mathematics and Technology Education

[email protected]

012 420 2207

8 October 2009

Ms MI Bashele

The Circuit Manager

Witbank II

1035

Dear Ms Bashele

APPLICATION TO CONDUCT RESEARCH IN YOUR CIRCUIT

I hereby request permission to undertake research in Mabande Comprehensive High School.

The title of my research project is “The response of educators and school management teams on a curriculum delivery intervention programme: a programme evaluation study”. The research is conducted to meet the requirements pertaining to my studies at the University of Pretoria. The details of the project have been included in Annexure A.

I wish to seek permission to conduct interviews with the school management team, as well as with all educators who are currently involved in the curriculum delivery intervention programme managed by our Department of Education. I also wish to have access to educators’ curriculum documents, preparation files and assessment documents.

The Department, the School Management Team as well as all educators will have access to the transcribed interviews and documented observations prior to the finalisation of my report.

Thanking you in anticipation

Yours faithfully


Department of Science, Mathematics and Technology Education

[email protected]

012 420 2207

8 October 2009

Dear participant

I am a masters student with the University of Pretoria researching the topic “The response of educators and school management teams on a curriculum delivery intervention programme: a programme evaluation study” and request your kind participation in the study.

You will be free to participate or to withdraw at any time from the interview. This will not affect your relationship with the researcher.

The purpose of this study is to explore educators’ and school management team members’ responses to a curriculum delivery intervention programme in terms of departmental support, professional development and classroom practices, in order to improve the quality of teaching and learning in the schools that are part of the programme. The items contained in the questionnaire, as well as the interview questions, will focus on the impact the curriculum delivery intervention programme has had on your classroom teaching.

Data will be collected using a questionnaire and structured interviews (face-to-face interviews) at your school. I will also be using an observation checklist to observe your assessment file, your lesson preparation, your curriculum file and score sheets.

Do not hesitate to contact me should you wish to have more information on the study before or during the visit to your school. You will be given access to the transcribed interviews and findings before they are captured in my report. Your name, or the name of your school, will not be linked to findings in any way. Your identity will also be protected by the researcher. All activities that you participate in will remain confidential and anonymous.

You will not be subjected to any risks. The information and findings to be shared with you should contribute towards and enhance your classroom teaching practices. You will receive full recognition should this be required, in any publication that might flow from the study.

You will be requested to sign the consent form during my visit to your school. You are signing it with full knowledge of the nature and purpose of the procedures. You will receive a copy of the signed consent form.

Regards

Mr. MS Nkwana Prof WJ Fraser (Supervisor)


CONSENT AND APPROVAL TO PARTICIPATE IN THE INVESTIGATION

I ………………………………………………….. have read the information contained in the accompanying letter. I understand the reasons for participating in the study and I therefore agree to participate.

I understand that my name and personal details will remain confidential and anonymous.

Signed: ……………………………….. Date: ………………………….

