
S.13-31

SFU

OFFICE OF THE VICE-PRESIDENT, ACADEMIC AND PROVOST

8888 University Drive, Burnaby, BC

Canada V5A 1S6

TEL: 778.782.3925

FAX: 778.782.5876

[email protected]

www.sfu.ca/vpacademic

MEMORANDUM

ATTENTION: Senate

FROM: Jon Driver, Vice-President, Academic & Provost

DATE: January 15, 2013

PAGES: 6

RE: Report on Learning Outcomes and Assessment (LOA)

Introduction and Context

In June 2012, Senate approved a set of principles under which the Learning Outcomes and Assessment Working Group (LOAWG) would investigate the options for implementing a process to assess learning outcomes at Simon Fraser University. LOAWG undertook surveys and research, and prepared a report that was made public at the end of October 2012. Individual faculty members and groups of faculty members submitted comments on the report. The report was also discussed at four Senate committees with student, staff and faculty membership (SCUP, SCUTL, SGSC, SCUS) and at Deans' Council.

I am now bringing to Senate the following documents: the report and its appendices; the comments received from Senate committees; a summary in point form of all comments received; and examples of learning outcomes and assessment reporting frameworks. This covering memo serves two purposes. First, it is my response to the many issues that were raised about the report and, more generally, about whether or not the University can benefit from a more formal process for evaluating the expected outcomes of SFU academic programs. Second, I outline a "made in SFU" model for the development and assessment of learning outcomes. I believe that these issues require debate at Senate, and I anticipate that Senate may request me to provide further information before a decision is made about whether and/or how to proceed. I therefore suggest that we devote our time at the February 2013 meeting to discussing the report and my response to the many issues raised by commentators, and that I bring a final document to Senate at a subsequent meeting that will include one or more motions for consideration.

Response to Comments

(a) General concerns about LOA

As can be seen from various attachments, responses to the LOAWG report question the need to adopt a learning outcomes approach, raise concerns about workload and resource implications, and provide some suggestions for how to implement LOA processes that would be consistent with some key characteristics of the University.

Many faculty members expressed their concerns that any move to adopt LOA was antithetical to various characteristics of teaching and learning in a research university. I agree that the language used in many of the discussions about LOA is problematic, as a number of people commented. It tends to constrain the definition of university education as the delivery of a set of measurable attributes (often identified as quantifiable skills or competencies), and it downplays the less tangible benefits that many of us believe were the greatest benefits of our own education - the freedom to be intellectually curious, opportunities to discuss provocative ideas, and the experience of working with teachers who respected and encouraged us as partners in exploration. It is interesting that our own undergraduate students reflect this dichotomy between the development of the person and the practical need to be ready for a career. The preliminary report on the Fall 2012 undergraduate student survey shows that the most commonly cited reason for attending university was to obtain a job, but the most commonly cited outcome of a university education that the students identified was the development of critical thinking abilities.

As VP Academic, it is my responsibility to assess the external environment in which the University, as a publicly and student-funded institution, operates. I believe that there is more than adequate evidence that a general questioning of the value of post-secondary education is widespread, and that we must demonstrate that we are capable of valid assessment of our own performance. To bring LOA forward for consideration is neither a "top-down" approach, nor a fait accompli, as some have feared. I believe that we have to meet the challenges to universities in the current environment, and that an LOA approach will provide benefits for students, will make curriculum management more straightforward, and will allow us to retain the autonomy to make decisions about our academic programs. I take very seriously the concerns about the relationships between LOA processes on the one hand and corporatization, simplification, standardization and commodification of the University's activities on the other hand. As I hope will emerge later in this response, LOA processes are not in and of themselves value-laden, and I believe that we can create a set of processes that are consistent with the desire of faculty members to be accountable to their students, their discipline and the traditions of universities (in particular, SFU's), without their being strait-jacketed by the imposition of standards and bureaucratization of learning.

The idea that we should define and assess learning outcomes is not new to SFU. The terms of reference of SCUTL require the committee "to provide advice on matters pertaining to Learning Outcomes". The Task Force on Teaching and Learning that undertook numerous consultations and studies before preparing its 2010 report recommended as follows:

3.1. Identify and promote a set of attributes that every SFU graduate should possess or be able to demonstrate.

3.2. Ensure a student-centered focus in the curriculum: (1) by identifying learning expectations across all levels of the curriculum (in class and out of class) with consideration of the more general SFU graduate attributes, (2) by ensuring that the curriculum is well structured from the perspective of developing learning, and (3) by providing clearly stated information about expectations and responsibilities of instructors and students in syllabi for all courses.

3.6. Evaluate programs, courses, and instructors regularly, systematically, and appropriately for learning effectiveness.

Consistent with the above recommendations is a belief, expressed in many conversations that I have had with faculty members and students and reflected in some of the submitted comments, that defining learning outcomes can be beneficial to students, under the right conditions. Program outcomes allow students to identify the opportunities they will have during their education and after graduation. The outcomes of specific courses show students how the course relates to the overall program of study, and also provide a logical link between the work assigned and the intent of the instructor. Faculty members have commented verbally and in writing that they already do this, either formally in course outlines, or when introducing the syllabus to students, or during discussions throughout the term. I also believe that the numerous curriculum changes that come to Senate every month are essentially attempts to fine-tune the curriculum to reflect the intrinsic program and course learning outcomes conceptualized by the faculty members who constitute the academic unit and maintain the quality and relevance of its programs. It therefore seems to me that our debate is not so much about the value of developing, communicating and assessing learning outcomes, but rather about the operational, financial and jurisdictional difficulties in doing this in a more formal manner than at present. In the following sections, I address some of the comments on these issues.

(b) Unwanted bureaucracy

Legitimate concerns have been raised that development of LOA processes will result in new bureaucratic systems that require diversion of budget, and that will add workload to faculty and staff, for little benefit.

In response to these concerns, I have the following comments:

1. We already spend a significant amount of time discussing and modifying curriculum, often in the absence of a defined set of goals for our programs. If program-level learning outcomes are identified and/or developed, curriculum can be mapped onto those outcomes, and this should result in less time being devoted to modification of curriculum and course descriptions. We already assess student learning through assigning and grading work, and it will be relatively straightforward to identify the key assignments that are linked to learning outcomes, and to report on student performance.

2. We already have staff, notably in the Teaching and Learning Centre (TLC), with expertise in development and assessment of learning outcomes, and we already use the concept of learning outcomes in TLC programs that relate to course and curriculum design and effective teaching methods. Therefore, I do not anticipate hiring extra staff, particularly if LOA is phased in gradually over a number of years. We also have software in use (CurricUNET) that collects information on what learning outcomes have been defined for courses and programs.

3. Some language in the LOAWG report was not sufficiently clear, and this has led to concerns about the scope of the data-gathering exercise. This concern has been reinforced by the experience of some academic units with the data-collection requirements of external professional accrediting bodies, in disciplines such as Engineering and Business Administration. Specifically, the following language in the report has caused concerns: "Departments and programs [should] report result summaries for institutional-level analyses of aggregated data" (p.9); "it will be necessary for the University to provide academic units with standardized templates for the collection of selected LOA data; and it will be the responsibility of the University to coordinate the effort of compiling and reviewing LOA data received from all units" (p.10). Statements such as these have been interpreted as describing a process in which detailed course-level data are compiled on an annual basis in a central database. This was not the meaning of the recommendations, and an alternative understanding of the report is one of local control: the academic units define the data they wish to collect and retain the data, while the University collects information about the processes that operate in each unit, so that it can compile an account of the extent to which learning outcomes are being assessed, and so that it can document the ways in which assessment of student performance is being used to modify curriculum and teaching methods. In my recommendations, I will be proposing this latter process, and I have attached examples of how such reporting frameworks are structured.

4. The primary interest of the University should be in understanding how different units approach LOA, and in being able to report at a high level the diversity of approaches and their impacts within programs. As documented in the report, LOA is already a reality for some programs that seek professional accreditation. It would therefore be redundant for the University to require that each academic unit conform to a standardized University process. Furthermore, we should encourage diversity in approaches to LOA, so that the outcomes and the assessment processes are consistent with the needs and practices of the different disciplines. Many universities that are adopting an LOA approach do not have a separate office to support and oversee LOA in academic units. Instead, a council of faculty members and students monitors activities and advises the relevant administrator about practices that should be followed. I will return to this theme in my recommendations.

5. While I cannot deny that we will need to commit resources to this initiative, the annual cost can be reduced by phasing in the process gradually. As the report recommends, I will be suggesting that we tie the process to the regular review of academic units that runs on a seven-year cycle. This will also allow for time and flexibility to fine-tune processes to better ensure a fit to SFU.

(c) Loss of autonomy and creativity

Many concerns have been expressed about the impact of an LOA process on the autonomy and academic freedom of faculty members. These concerns include: faculty may wish to modify courses to meet the needs and interests of students, rather than predetermined outcomes; if outcomes must be measurable, faculty will be forced to spend less time on other goals that they may bring to their teaching; there will be a tendency to "teach to the test" (or outcome); course content and/or general outcomes will be imposed from outside the academic unit; data on student success rates will be used to allocate resources to units or evaluate individual faculty members; an LOA process will deter creativity and experimentation.

In response to these concerns, I first note that we all accept some constraints on our autonomy when we teach in a typical Canadian university. Because educational programs are divided into courses, and because course content and goals are defined by calendar descriptions and are often linked to other aspects of the curriculum (e.g. as prerequisites), we must conform to some extent to the expectations of our faculty colleagues and our students, and include some previously agreed-upon content/skills/attitudes/experiences as outlined in the course description. Moreover, because of British Columbia's long-established system of articulation between institutions, we enter arrangements with colleagues at other universities and colleges (through credit transfer agreements) that require many of our lower-division courses to be comparable to courses at other institutions. Also, in BC, the long-established Degree Quality Assessment Board sets criteria for and reviews all new proposed programs, prior to Ministerial approval.

Second, at the course level in particular, learning outcomes can be conceptualized as a set of minimum expectations that have been mutually agreed upon by faculty colleagues and that relate to the overall expectations of a program. Beyond these minimum expectations, faculty members are still free to develop their own goals for student learning, to decide how to teach the course, to choose relevant examples, to set assignments, to improvise or make a mid-semester adjustment when necessary, and, in the case of research faculty in particular, to incorporate their own unique and personal research experience and interests into the course. As a number of people commented, not every outcome of university education can be defined and measured, and we have to leave room for faculty members to express their interests in the courses they teach.

Third, I have no expectation that anyone outside the academic unit will dictate what learning outcomes should be, except where some professional or disciplinary bodies may require this as part of an accreditation or licensing process at the disciplinary level, and/or if the University decides to adopt some very general learning outcomes (Recommendation 3.1 from the Task Force on Teaching and Learning). An aspect of this has already been in operation at the undergraduate level since 2006, in the Senate-approved criteria for W, Q, and B designated courses. For most programs, I would anticipate that learning outcomes would be reviewed from time to time by the existing curriculum committees (probably with student input) and by external reviewers during the regular cycle of external reviews.

Fourth, I reiterate one of the principles approved by Senate last year - the data on student performance relative to expected outcomes will be "owned" by the academic unit. It is expected that academic units will use the data to modify curriculum and that individual faculty members may use the data to modify some aspects of their courses - the cycle of "continuous improvement" that is often mentioned in the literature. We would reconfirm the principle that data would not be used in the evaluation of individual instructors or teaching assistants (as articulated previously in the principles guiding the LOAWG process). We would expect external reviewers to be able to access anonymized aggregated data.

A Model for SFU

Rationale

Having read the LOAWG report, participated in many discussions, and read the comments that have been submitted, I believe that the main benefits of defining and assessing learning outcomes are as follows:

1. Greater clarity for intending and current students about the aims of programs and courses, and a better understanding of the rationale of assignments and grading practices. Better understanding of the course and program will also allow student evaluations of course and instructor to focus on what they have learnt and experienced in relation to the intended outcomes. This will also allow students to participate more fully in discussions about curriculum.

2. A framework within which curriculum committees can make decisions about program requirements and structure.

3. A process that allows academic units to self-assess the efficacy of their programs in meeting their intended goals, and to make "mid-course corrections" - essentially the "continuous improvement" model.

4. A clear statement of the pedagogical goals of the academic unit against which external reviewers can provide assessments.

5. Through compilation of information from academic units, an opportunity for the University to present credible data about university education, thus making the University more accountable.

The mature model

When fully implemented, I would expect the SFU model to have the following characteristics:

1. All units identify learning outcomes at the course and program level, collect data on student performance, and use that data to modify the curriculum and teaching methods.

2. There is a diversity of methods and processes employed; some units conform to requirements of a disciplinary or professional body; others have developed methods that are consistent with their discipline. Both qualitative and quantitative methods of assessment may be used.

3. Learning outcomes, assessment processes and their role in curriculum development are considered during external reviews. A cycle of "continuous improvement" is thus linked to the review cycle. Units report regularly to the VP Academic on the processes they use and the impact that this has had on their activities. This would probably occur twice in the review cycle - as a component of the self-study document, and at the time of the mid-cycle report to SCUP on what has been achieved since the review.

4. The VP Academic regularly compiles a report that synthesizes the impact of assessing learning outcomes, and uses this as a component of the University's accountability documents.

5. Support for the development or modification of learning outcomes and assessment processes comes from various support units, such as the Teaching and Learning Centre, the Centre for Online and Distance Education, and Institutional Research and Planning. The University provides software and IT support to assist in the collection and storage of data.

6. LOA processes are integrated into current Senate committees, when appropriate. For example, when considering new programs or courses SCUS and SGSC examine proposed learning outcomes; SCUP considers external reviewers' suggestions about curriculum and learning outcomes; SCUTL advises on the role of LOA in promoting high-quality teaching and learning, or in the development of effective course evaluations.

Processes for implementation

The LOAWG report also makes suggestions about how the University should implement LOA, and I propose the following:

1. The development of program- and course-based learning outcomes will remain the responsibility of the academic unit and its faculty members.

2. Institutional responsibility lies with the VP Academic, who will designate a current senior staff person to organize the implementation and the development of a sustainable, cyclical process for collection of information at the institutional level.

3. There will be an advisory committee of faculty members, students and relevant professional staff during the implementation phase.

4. Support for academic programs will come from units such as the Teaching and Learning Centre, Centre for Online and Distance Education, IT Services, and Institutional Research and Planning. The VP Academic will provide funding to help units cover certain costs of implementation.

5. Programs that are required to develop LOA by external professional and disciplinary bodies will determine their own timing for implementation, and this will be sufficient for University purposes. Such programs are not expected to undertake LOA processes twice.

6. For other programs, the implementation of LOA will be tied to the external review cycle.

7. Academic units will determine whether to proceed from program outcomes to course outcomes, or vice versa. Those that decide to begin with identification of program outcomes will be encouraged to develop them as soon as possible.

8. Development of a reporting framework for the collection of information about LOA processes will be undertaken by Institutional Research and Planning, with input from the advisory committee. Some examples of reporting frameworks have been included in the package of material that accompanies this memo.

9. Academic units will control their internal governance processes for defining learning outcomes, developing assessment protocols, and determining how to collect and utilize data.

Enclosures

TABLE OF CONTENTS

1) Report of the Learning Outcomes & Assessment Working Group (separate attachment)
   a. LOAWG Report
   b. Appendix A
   c. Appendix B

2) Examples of Reporting Frameworks
   a. Carnegie Mellon University
      i. Reporting Frameworks Overview
      ii. Template & Chart
      iii. History
      iv. Economics
      v. Social & Decision Sciences
      vi. Psychology
      vii. Information Systems
      viii. Mechanical Engineering
      ix. Fine Arts (Master's)
   b. Columbia University
      x. Astronomy (PhD)

3) Committee Comments to LOAWG Report
   a. SGSC
   b. SCUP
   c. Deans' Council
   d. SCUTL
   e. SCUS

4) Summary of University Community Comments to LOAWG Report
   a. General critiques
   b. Students
   c. Faculty
   d. Curriculum
   e. Model/approach
   f. Supports
   g. Linkages to accreditation
   h. Assessment & metrics
   i. Clarifications requested
   j. Suggestions to move forward

Report of the Learning Outcomes & Assessment Working Group

(Revised 04-Dec-12)

In fall 2011, following the site visit and report of the Northwest Commission on Colleges and Universities (NWCCU), the Vice-President, Academic (VPA) determined that it would be useful to have a working committee consider approaches to the implementation of a learning outcomes and assessment (LOA) framework at Simon Fraser University. The Learning Outcomes and Assessment Working Group (LOAWG) was mandated by the VPA to undertake six tasks:

1. Draft principles to guide the establishment and use of learning outcomes for curricular assessment at SFU. (Note: this will not include evaluations of individual instructors.)

2. Identify academic units that currently use, or are in the process of developing, processes for learning outcomes assessment.

3. Identify the curricular assessment processes (regular and off-cycle) currently utilized in academic units.

4. Review best-practice processes for establishing a learning outcomes assessment process, and recommend the most appropriate process for SFU.

5. Recommend appropriate timelines and milestones for implementing learning outcomes assessment at SFU, bearing in mind the timeline for accreditation with NWCCU, the importance of a communication plan, and the need to take a consultative approach.

6. Recommend how an ongoing process of learning outcomes assessment and curricular review could best be incorporated into current structures and processes at SFU.

The committee was chaired by Paul Budra, Associate Dean, Faculty of Arts and Social Sciences and consisted of the following members:

Mary-Ellen Kelm, Faculty of Arts & Social Sciences / Graduate Studies

Peter Liljedahl, Faculty of Education / Graduate Studies

Kevin Stewart, Beedie School of Business

Glyn Williams-Jones, Faculty of Science

Chris Groeneboer, Teaching & Learning Centre

Jessica Tilley, Institutional Research & Planning

Sarah Dench, University Curriculum & Institutional Liaison (VPA)

Susan Rhodes, University Curriculum & Institutional Liaison (VPA)

Ilia Starr, Learning Outcomes & Assessment Project (VPA)


Why an LOA Approach?

The practical value of a traditional university education is today under scrutiny. Prospective students of all ages and backgrounds face a reality in which, in today's globalized economy, simply having a university education does not guarantee quality career opportunities. The challenge for many is how best to invest limited resources in developing one's skillset for a successful and rewarding career within the parameters of a selective marketplace that demands highly specific qualifications and abilities.

The relevance and probable financial return on investment of a traditional university education is therefore in question. Increasingly, universities are facing competition from alternative education institutions such as professional schools, which often publicize student learning outcomes data because they are required to by their accreditors and also because "they recognize more generally the power of marketing centered not on institutional 'quality' in the traditional sense but on how students are treated in service responsiveness and in instruction tailored to individual needs."1 Given this reality, universities must take the initiative to proactively identify and solve perceived shortcomings in their teaching and learning systems, rather than have these identified arbitrarily by outsiders who may have their own 'solutions' in mind.

The onus is on Canadian universities to act now in assuring the public that students are achieving expected learning outcomes. Recognizing that quality assurance in the postsecondary education sector is a growing global concern, Ontario has recently established a Quality Assurance Framework through which it intends to meet this challenge as well as "facilitate greater international acceptance of our degrees and improve our graduates' access to university programs and employment worldwide."2 Earlier this year, Quebec publicized official recommendations to review and 'adjust' its universities' quality assurance mechanisms, citing the need to keep pace with postsecondary education trends on an international level while continuously improving the experience of its students and the accountability of its universities.3

1 Peter T. Ewell (2009). Assessment, Accountability, and Improvement: Revisiting the Tension. National Institute for Learning Outcomes Assessment (NILOA). Retrieved October 4, 2012 from http://www.learningoutcomeassessment.org/documents/PeterEwell_006.pdf

2 Ontario Universities Council on Quality Assurance (2012). Quality Assurance Framework. Retrieved September 25, 2012 from http://www.cou.on.ca/related-sites/the-ontario-universities-council-on-quality-assura/policies/qualityassurance-framework—guide

3 Québec Conseil supérieur de l'éducation (2012). Avis du Conseil supérieur de l'éducation sur l'assurance qualité à l'enseignement universitaire. Retrieved October 10, 2012 from http://www.cse.gouv.qc.ca/fichiers/documents/publications/Avis/50-0476cp.pdf


Also this year, British Columbia solicited public opinion on enhancing quality assurance at its own educational institutions and issued a Quality Assurance Consultation Document4 stating that provincial quality assurance processes must 'adapt' in order to remain current with international standards, better market BC's postsecondary institutions, and reassure employers and students alike that a university education is relevant to their needs.

In June 2008, SFU established a Task Force on Teaching and Learning (TFTL) charged with examining issues relating to the student academic learning experience and curriculum review, and with making recommendations aimed at supporting quality teaching and learning at the University. In its final report5 (2010), the TFTL acknowledged how "many academic institutions are redoubling their focus on the student experience and student retention by investing in engaging learning environments and integrating classroom and non-classroom experiences."

One of its key recommendations was that SFU "expand student-centered approaches to teaching within a process of ongoing improvement." It further advised SFU to "identify and promote a set of attributes that every SFU graduate should possess or be able to demonstrate" as well as "ensure a student-centered focus in the curriculum:

1. by identifying learning expectations across all levels of the curriculum (in class and out of class) with consideration of the more general SFU graduate attributes;6

2. by ensuring that the curriculum is well structured from the perspective of developing learning; and

3. by providing clearly stated information about expectations and responsibilities of instructors and students in syllabi for all courses."

The TFTL argued strongly that "clarity around learning expectations enables stronger links between program planning, expectations about learning by students and instructors and related supports for both students and instructors."

SFU has identified and articulated institutional goals in its Strategic Vision, one of which is to "equip its students with the knowledge, skills, and experiences that prepare them for life in an ever-changing and challenging world." Given the current lack of an LOA framework at SFU, however, it is difficult to know whether SFU graduates are achieving that goal. Implementing the development of LOA frameworks locally within academic units at the course and program levels is likely to better inform identification of general attributes that all SFU graduates should possess.

4 British Columbia Ministry of Advanced Education (2012). Quality Assurance Consultation Document. Retrieved October 10, 2012 from http://www.aved.gov.bc.ca/education_quality_assurance/docs/public_qa.pdf

5 Simon Fraser University (2010). Task Force on Teaching and Learning Recommendations Report. Retrieved October 10, 2012 from http://www.sfu.ca/content/dam/sfu/vpacademic/files/vp_academic_docs/pdfs/TFTL-Report-final.pdf

6 A set of SFU-specific graduate attributes was proposed by the TFTL in section 3.1 of its 2010 report (link above).

Research shows that university students respond favourably when clearly articulated learning outcomes are built into their programs, courses and assignments. Dr. Richard S. Ascough of Queen's University explains how "the pressure to articulate [learning] outcomes is not simply a top-down process; it also arises from our students themselves." Pressure to articulate learning outcomes, says Ascough, is rising not only from provincial governments, which are increasingly basing funding and resource allocations on market mechanisms and private-sector criteria, but also from 21st-century students who "want and often demand a clear idea of the return on investment of a given activity."7 Although implementing an LOA approach is challenging, it is through the systematic collection and careful examination of learning outcomes data that instructors, programs, departments, and faculties (with the appropriate supports and access to expertise as needed) can identify ways of enhancing their pedagogy, to the benefit of their students, disciplinary curricula and institutions as a whole.8

Crucial Definitions

Learning Outcome - A 'learning outcome' is an area of knowledge, practical skill, area of professional development, attitude, higher-order thinking skill, etc., that an instructor expects students to develop, learn, or master during a course or program. Learning outcomes are observable and measurable by quantitative or qualitative assessment models. For some examples see the Eberly Center for Teaching Excellence website at Carnegie Mellon University (http://www.cmu.edu/teaching/assessment/index.html).

Teaching Goal - A 'teaching goal' is anything that an instructor or a program coordinator intends that students will learn in their course or program (note: in this report it is assumed that instructors will retain autonomy to determine the pedagogical approach they will use to meet the learning outcomes, as well as the autonomy to teach material falling outside defined outcomes).

7 Richard S. Ascough (2011). Learning (About) Outcomes: How the Focus on Assessment Can Help Overall Course Design. Canadian Journal of Higher Education, 41(2), 45. Retrieved October 11, 2012 from http://ojs.library.ubc.ca/index.php/cjhe/article/view/549/2213

8 George Kuh and Stanley Ikenberry (2009). More Than You Think, Less Than We Need: Learning Outcomes Assessment in American Higher Education. National Institute for Learning Outcomes Assessment (NILOA). Retrieved October 10, 2012 from http://www.learningoutcomeassessment.org/documents/niloafullreportfinal2.pdf

Program - A 'program' is a set of coherent curricular requirements leading to a Senate-approved academic credential (e.g. bachelor or graduate degree, certificate or diploma, etc.). The definitive list of such programs is included in the SFU Calendar.

Principles for Investigating and Making Recommendations Concerning LOA

(Approved by Senate on June 11, 2012)

1. The primary purpose of learning outcomes and assessment processes is to communicate transparently the purposes of all degree, program and course requirements.

2. As per its Strategic Vision, SFU is committed to academic and intellectual freedom. Learning outcomes for courses and programs will be developed and determined at the local academic unit level and will reflect local disciplinary cultures. These will be aligned with enduring institutional goals, values, and principles as articulated in the SFU Strategic Vision.

3. SFU values regular assessment of the achievement of specified learning outcomes as a means of promoting continuous improvement of its courses and programs, and acknowledges that appropriate assessment of learning outcomes can occur before, during and after completion of a course or program.

4. Processes required by the establishment of learning outcomes and their assessment will be integrated into the regular processes of curricular and program review and renewal and disciplinary accreditation wherever possible.

5. Learning outcomes assessment will enable instructors to improve upon existing curricula and teaching methodologies. Processes of regular assessment will allow the academic units and the University to collect data concerning unit- and University-level achievement of identified learning outcomes. Learning outcomes assessment data will not be utilized for the evaluation of individual instructor and TA/TM performance, nor will the data be used as evidence to demote, fail to promote, dismiss or otherwise penalize individuals.

6. It is the responsibility of the University to provide resources (human, capital, technological) to academic units as required to enable and support learning outcomes and assessment procedures. Provision of this support is intended to minimize any addition to the net workload of instructors, TAs/TMs, and department staff.


7. As much as possible, the documentation generated by the Learning Outcomes and Assessment Working Group will be made broadly available to the SFU community for transparency and in accordance with SFU's sustainability goals.

Current LOA use in Academic Units

Over the summer of 2012 all academic units at SFU were asked to complete an online survey of their current practices (if any) surrounding LOA for each of their programs. Not every unit responded, despite reminders, and some used the survey's open questions to express their concerns about the accreditation process and the implementation of learning outcomes in general. Out of 457 programs9 solicited by the survey, a total of 273 were completed, yielding a response rate of nearly 60%. Of those programs for which responses were received, 8% of respondents reported that they are already accredited by an external accrediting body (in most cases by a disciplinary professional body) while a further 4% reported that they are currently seeking accreditation by an external accrediting body.
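The figures above can be sanity-checked directly. The following is a minimal sketch (not part of the report): the response rate follows from the two counts given, while the accredited/seeking counts are assumptions derived here from the report's rounded percentages.

```python
# Sanity check of the survey figures quoted above. The derived counts are
# illustrative assumptions computed from the report's rounded percentages.
solicited = 457   # programs solicited by the survey
completed = 273   # completed responses

response_rate = completed / solicited
print(f"Response rate: {response_rate:.1%}")  # 59.7%, i.e. "nearly 60%"

# 8% already accredited, 4% seeking accreditation (shares of respondents)
print(f"Already accredited: ~{round(0.08 * completed)} programs")     # ~22
print(f"Seeking accreditation: ~{round(0.04 * completed)} programs")  # ~11
```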

Of the 60% that responded, a handful of programs indicated they are currently implementing some form of learning outcomes and/or assessment process for some or all of their undergraduate and/or graduate programs or courses. 18% reported having program-level learning outcomes, and 64% within this group said they are also assessing students to determine whether these outcomes are being achieved. 16% reported having learning outcomes defined in all of their courses and 35% reported having learning outcomes defined in some of their courses. Represented in the above clusters are programs10 operating within the following academic units:

> Applied Sciences (General Studies)
> Archaeology
> Arts & Social Sciences (General Studies)
> Biomedical Physiology & Kinesiology
> Business
> Cognitive Science
> Computing Science
> Cultural Resources
> Development & Sustainability
> Digital Media
> Earth Sciences
> Education (Graduate Level)
> Engineering Science
> Environmental Science
> French
> Health Sciences
> History
> Liberal Arts
> Management & Systems Science
> Molecular Biology & Biochemistry
> Physics
> Political Science
> Psychology
> Public Policy
> Publishing
> Sustainable Community Development
> Urban Studies

9 It is important to note that SFU's official tally of program offerings as of 2012 is 145 (undergraduate, graduate and professional programs within its eight academic Faculties). Where LOA surveying purposes are concerned this figure more than doubles, as subdivisions within these programs are counted. For example, the undergraduate history degree constitutes one program, but it includes the following four subdivisions: major, honours, minor and extended minor.

10 A list of all programs that reported using some form of learning outcomes and/or assessment process can be found in section 2 of the survey report (see Appendix A).

Programs within academic units that reported being a member of an accreditation body11 or in the process of joining one include:

> Applied Science in Engineering Science (MASc)
> Biomedical Engineering with Biomedical Signals & Instrumentation Concentration - Honours (BASc)
> Biomedical Engineering with Pre-Med Concentration - Honours (BASc)
> Biomedical Engineering with Rehabilitation & Assistive Devices Concentration - Honours (BASc)
> Business Administration (PhD)
> Business Administration - Management of Technology (MBA)
> Business Administration (MBA; MBA Executive)
> Business - Major (BBA)
> Business with Accounting Concentration - Honours (BBA)
> Chemistry - Major & Honours (BSc)
> Communication (MA)
> Computer Engineering - Major & Honours (BASc)
> Electronics Engineering - Major & Honours (BASc)
> Engineering Physics - Electronics - Honours (BASc)
> Engineering Science - Major & Honours (BASc)
> Financial Risk Management (MSc)
> Health Sciences - Major (BA; BSc)
> Mathematics and Computing Science - Joint Honours (BSc; BA)
> Mechatronic Systems Engineering - Major & Honours (BASc; BBA)
> Psychology (MA; PhD)
> Systems - Major & Honours (BASc)

11 Specific accreditation bodies that were cited can be found in sections 9a and 10a of the survey report. Note that there may be some programs missing from this list. For example, the Canadian Institute of Planners certifies a stream within the MRM program.

In addition, those programs that prepare individuals for professional practices which require certification by a professional body must meet the certification standards of that body. Examples include the three Professional Programs in the Faculty of Education (which must meet standards of the Teacher Regulation Branch) and the Master of Counselling Psychology program (which is designed to meet the standards of the Canadian Counselling and Psychotherapy Association).

Current Curricular Assessment Processes

Through the survey responses and from informal discussions with faculty across SFU departments, we have found no consistent curricular assessment process used across SFU's academic units. Curriculum committees meet anywhere from annually to bi-weekly. The units that are currently employing some form of learning outcomes assessment have different approaches to administering them: some have formal meetings of curriculum committees, some use a Dean's or Chair's advisory council, and others use departmental retreats. The Beedie School of Business may have the most formal structure: Assurance of Learning Undergraduate and Graduate Committees. These committees discuss learning outcomes and assessment for accredited degree programs (BBA, MBA, MOT-MBA, EMBA, MScF, and PhD). The Undergraduate Curriculum Committee discusses joint honours, joint majors, and certificate program issues as they arise, and the Graduate Program Committee includes the GDBA in their program discussions. The Beedie School of Business also created an Accreditation Officer role, which replaced an external accreditation consultant, and helps to facilitate all the processes described.

The full report provided by Institutional Research and Planning (IRP) is attached as Appendix A. As the IRP summary indicates, caution must be used in interpreting the data. However, the responses show that only a small number of programs have begun to utilize an LOA approach. For the majority of SFU programs, creating and adopting meaningful course and program learning outcomes, and undertaking assessment of those, will be a significant change.

Appropriate Processes for SFU

Given the number of academic programs at SFU and their diversity, it is not clear that there can or should be one approach for LOA across the institution. SFU's academic units have different structures, histories, faculty complements, pedagogies, and needs. Therefore LOA processes at SFU need to reflect the University's decentralized culture as well as its commitment to academic freedom and integrity.

Nonetheless an LOA approach impacts curricular design and development at both the program and course level. Any approach adopted for SFU should have the following components:

1. Learning outcomes are made explicit to students in course outlines and other materials, course assignments, and other assessments.

2. Course learning outcomes are assessed within courses.

3. Program learning outcomes are assessed within courses and at a program level.

4. Departments and programs report result summaries for institutional-level analyses of aggregated data.

5. LOA data collection and analyses are conducted to provide information for the feedback loop into the learning outcomes cycle.

6. Results of the data analyses are used to continuously maintain or enhance quality of SFU programs and courses.

7. Institutional learning outcomes/graduate attributes (when established) are assessed at the institutional level.

8. Course and program enhancements are implemented at the local level.


Institutional support for the implementation of LOA can take many forms and ultimately must address points 1-8 above. Consistent with LOAWG's sixth Term of Reference, to recommend how ongoing LOA processes can best be incorporated into current structures and processes at SFU, it is noted that implicit in items 1-3 and 8 is the need for LOA to be embedded in program and course curriculum design and development at the unit level. Linkage to the Teaching and Learning Centre (TLC), which provides support to faculty members on curriculum mapping, design, development and assessment, would be essential; these activities are interrelated and cannot be artificially separated. Implicit in points 4-7 is a need for LOA to be closely linked to IRP, which maintains, houses and analyzes existing course, program and institutional data, along with expertise in institutional reporting.

In addition, for the purpose of reporting out to the NWCCU (and potentially to the Province in the future), there must be some consistent coordination of information, facilitated by either of the above units or by a separate unit that is linked to all parties involved in LOA activity across the University. Consistent with LOAWG's fourth Principle, which stipulates that processes associated with LOA be integrated into the regular processes of curricular and program review and accreditation where possible, IRP constitutes the existing repository for institutional data and reporting.

In summary, the infrastructure to support implementation of LOA processes at SFU already exists. However, the work of the multiple existing units which comprise that infrastructure needs to be identified and coordinated so that the workflow throughout the LOA cycle, as indicated in points 1-8, is clear.

Given the breadth of disciplines, forcing a single approach on units would be unlikely to yield the best possible results vis-a-vis student learning and course/program improvement. The choice of approach taken to the development and articulation of course- and program-level learning outcomes, along with corresponding assessment practices, should be within the purview of the individual Faculties, schools and departments. Academic units should have the option to retain LOA data that is generated locally. When cyclical processes (such as departmental reviews) require it, University-level evaluation of academic programs' LOA activities can and should be performed using aggregated data from the units. In order to make consistent and meaningful data aggregation a possibility, it will be necessary for the University to provide academic units with standardized templates for the collection of selected LOA data; and it will be the responsibility of the University to coordinate the effort of compiling and reviewing LOA data received from all units.

10

SFU

(Revised 04-Dec-12)

Curricular processes are currently standardized at the University level. All undergraduate curriculum changes must pass through the Senate Committee on Undergraduate Studies (SCUS), and all graduate curriculum changes must pass through the Senate Graduate Studies Committee (SGSC). It is already the case that, in the documentation sent to these committees, learning outcomes are being identified for new courses and are required (by the Board of Governors and the Ministry of Advanced Education) for proposals of new programs and degrees. Curricular changes to articulated learning outcomes at the course and program levels, such as additions or modifications, can be tracked through existing SCUS and SGSC processes. In order to support academic units as they develop learning outcomes, to coordinate LOA activity across the University, and to gather data for internal and external reporting functions, improved systems of coordination must be developed. All units involved must coordinate in order to ensure that there is a central resource and repository of information, that connections are effectively and appropriately made between key units and personnel, and that there is development and maintenance of standardized forms, processes, and updated resource documents (e.g. a guidebook, website, etc.). Some programs may want to appoint course coordinators, especially when specific courses are taught by many instructors.

The following suggests two approaches by which academic units may undertake LOA processes while retaining custodianship and control of detailed assessment data.

Program- to Course-level Approach

This cyclic framework is built on the principle that the departmental program committee identifies its program goals, maps the goals onto the program curriculum and courses, gathers data, analyzes data, and then allows the analysis of the data to influence the program committee's efforts to revise the program and the program goals. At some agreed-upon point within this cycle the program committee writes a report on its efforts and submits it to the VPA's office (e.g. annually, or as part of the regular External Review cycle).

[Figure: the program- to course-level cycle - 1. Create program outcomes → 2. Map outcomes onto program curriculum → 3. Write report → 4. Collect data → 5. Analyze data → 6. Modify program and program outcomes → back to 1.]

Below is a brief description of each step in the program- to course-level approach cycle:

1. Create program outcomes: At this stage the program committee identifies what it is that the program either hopes to achieve or already claims to achieve by the end of the program. One way to do this is for the committee to think from the perspective of a potential employer or an admission committee for a graduate program - what would they expect of a student who has completed the program?

2. Map outcomes onto program curriculum: The program outcomes identified in the previous stage must now be found somewhere within the curriculum of the program. In this stage, the committee first works to locate where in the existing curriculum specific outcomes are taught and assessed, and/or to identify places within the program curriculum where the outcomes could be taught and assessed (not necessarily taught and assessed in the same place). Second, the committee works to identify the methods that it will use to assess students' attainment of program outcomes. Something to keep in mind here is that a single goal may, and arguably should, appear in more than one course in the program. Also important to consider at this point is the way in which a capstone course or capstone experience may be utilized to assess some key program goals.


3. Write report: In the first iteration, once the curriculum mapping has been done, the program committee will produce a report documenting its progress thus far. This report will focus on the program outcomes that have been created/identified, where these outcomes are being addressed in the curriculum, and where and how these outcomes will be assessed. In subsequent iterations of the cycle, the report will be more extensive, including, in essence, what the program committee learned about its program through the collection and analysis of assessment data, how what it learned has manifested itself in modifications to the program and program outcomes, the refined learning outcomes, and its intentions for collecting new data. Note that there is no requirement that actual data accompany these reports. Data is collected, analysed, and held at the program/department level. What is included in the report is how the data has informed the program.

4. Collect data: The collection of data is really a longitudinal process in which the program committee enacts the plan it laid out in the curriculum mapping stage. How this looks will very much depend on what kind of data the program committee has decided to collect. It can range from longitudinal data on select students through the program, to data on all students at key stages within and following completion of the program. Programs that have area groups, area coordinators, and course coordinators may want to involve these personnel in this step (and the next).

5. Analyze data: This is the most important stage of the process. It is here that the program committee learns how well the program is performing in delivering its identified program outcomes to its students. It is very important that programs see this as their data and their analysis. Only through reflection on the results of this process can relevant programmatic changes occur.

6. Modify program and program outcomes: With the evidence of the assessment data, analysis and results, the program committee can begin the process of initiating and producing change. If the results indicate that the program is not meeting its programmatic objectives, then changes within the program and courses are warranted. Eventually, programs will begin to meet their objectives by adjusting courses and requirements through an iterative process, at which point it might be time to amend programmatic objectives and, with them, create new program goals.
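To make the mapping stage (steps 1-2) concrete, here is a minimal sketch of how a program committee's curriculum map could be recorded and checked for outcomes that are never assessed. All outcome descriptions, course codes, and assessment methods below are invented placeholders, not data from any SFU program.

```python
# Sketch: record a program- to course-level curriculum map and flag gaps.
# Outcomes, courses, and assessment methods are hypothetical placeholders.
from collections import defaultdict

# Step 1: program outcomes agreed on by the (hypothetical) program committee
program_outcomes = {
    "PO1": "Communicate disciplinary findings in writing",
    "PO2": "Apply quantitative methods to discipline-specific problems",
}

# Step 2: where each outcome is taught and assessed; an outcome may appear in
# several courses, and a capstone course can assess key program goals
curriculum_map = [
    # (outcome, course, taught_here, assessed_here, assessment_method)
    ("PO1", "XXX 101", True,  False, None),
    ("PO1", "XXX 490", True,  True,  "capstone essay rubric"),
    ("PO2", "XXX 201", True,  True,  "problem-set scores"),
    ("PO2", "XXX 490", False, True,  "capstone project rubric"),
]

# Summarize assessment coverage so un-assessed outcomes are visible (step 3 input)
assessed_in = defaultdict(list)
for outcome, course, taught, assessed, method in curriculum_map:
    if assessed:
        assessed_in[outcome].append(f"{course} ({method})")

for code, description in program_outcomes.items():
    coverage = "; ".join(assessed_in[code]) or "NOT ASSESSED"
    print(f"{code} - {description}: {coverage}")
```

Keeping the map in an explicit form like this makes the report-writing and data-collection steps a matter of reading off the structure rather than reconstructing it.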

Course- to Program-level Approach

While it may make theoretical sense to begin the process of instituting learning outcomes and assessment by first articulating program outcomes and then working down to the course level, in practice it may be easier to begin at the course level for two reasons. First, course-level outcomes are of a manageable size and scope and, second, instructors may already have de facto learning outcomes articulated in course outlines and syllabi. The process would build iteratively from course-level learning outcomes up to program-level outcomes.

[Figure: the course- to program-level cycle - 1. Create outcomes for new courses → 2. Create outcomes for existing courses → 3. Create outcomes for modules or levels → 4. Use module outcomes to adjust course outcomes → 5. Create outcomes for programs → 6. Write report → 7. Adjust module and course outcomes → back to 1.]

Below is a brief description of each step in the course- to program-level approach cycle:

1. Create learning outcomes for new courses: current forms for proposing new courses already contain a section in which to write learning outcomes. Completion of this section has been voluntary but would become required. All new courses will then require learning outcomes in order to pass through SCUS or SGSC and on to Senate.

2. Create learning outcomes for existing courses: when instructors teach existing courses, they should list learning outcomes on the course outlines. These learning outcomes should be communicated to undergraduate or graduate chairs. The chairs can then add them to course-change forms and bring them forward to SCUS or SGSC. If a course is taught by a number of instructors, those instructors should work collaboratively to agree upon broad learning outcomes for the course and include those in course outlines. Course coordinators may be useful in organizing and overseeing this process. Instructors should work to identify the methods that will be used to assess their students' attainment of the course learning outcomes.

3. Create learning outcomes for modules or levels: what should students know at the end of first year? At the end of second? What critical knowledge or skills do we want students to have as they progress through the stages of their degrees? Undergraduate and graduate curriculum committees should begin this conversation with their instructors. We should keep in mind several things: many SFU students do not progress through course levels in a systematic way; many SFU students sample courses from across programs to a significant extent before declaring a major; many SFU students transfer in from other institutions; and some programs may have streams or concentrations that define learning modules better than course levels do.

4. Use the module outcomes to adjust learning outcomes of courses: once learning outcomes for modules or years have been articulated, the undergraduate or graduate committees should compare these to the course learning outcomes that have been submitted. If the learning outcomes for individual courses are not aligned with the learning outcomes of modules, they may need to be adjusted.

5. Create learning outcomes for programs (majors, minors, honours, certificates): departments and academic units should articulate the learning outcomes of their programs. What do they expect a student who graduates with a major, or minor, or certificate in their discipline to know? What skills should they have? Departments and academic units should identify methods that will be used to assess students' attainment of these program learning outcomes. The data from their assessment procedures should be analyzed regularly to assess the program's performance in reaching its learning outcomes.

6. Write report: once the curriculum mapping has been done and the data from the assessment procedures have been analyzed, the program committee should document in a report how their program aligns with the principles of using learning outcomes and assessment. Such a report should subsequently become part of the regular curriculum cycle of the academic unit, tied to the seven-year external departmental review.

7. Use the report to adjust learning outcomes of modules and courses: using the analysis in the report, academic units should conduct an overview of the learning outcomes at both the course and module levels. Do the course outcomes feed into the module outcomes, and do the module outcomes inform the program outcomes? If not, they may need to be adjusted.

We recommend that programs be encouraged to choose either of the approaches described above, or to adapt a blend of the two.

Timelines and Milestones

Two milestones in the LOA implementation process have already occurred:

1. On June 11, 2012, SFU's Senate passed the Principles cited at the beginning of this report.

2. As of Sept. 1, 2012, the course and program proposal forms that come before SCUS and SGSC have included an area for stated learning outcomes.

Pending Senate Approval

If Senate approves the LOA initiative, we propose that:

1. As new courses and programs are developed, learning outcomes will be brought forward to SCUS and SGSC.

2. Learning outcomes will have to be developed gradually and systematically for existing courses and programs.

3. Recognizing disciplinary diversity and the fact that some programs are already developing LOA processes, we propose that academic units have learning outcomes for all of their courses and programs in place by their next regularly scheduled external review, beginning in spring 2014. Units with an external review scheduled for 2014 will be expected to include reference to learning outcomes and assessment in self-study documents prepared in fall 2013, but will not be expected to respond comprehensively to this in their review of curricula.

4. A mechanism to facilitate and support University-wide LOA activities should be set up or identified.

5. Assessment approaches must be integrated into academic units. These will vary from discipline to discipline and, obviously, cannot be put in place until learning outcomes have been articulated. The mechanism responsible for LOA should also be able to provide specialized assistance and advice to faculty in this regard. The most important assessment will take place during the regular external reviews that all academic units presently undergo every seven years, supplemented by a mid-term review report between full reviews.

Ongoing Processes

As stated in the Principles, "Processes required by the establishment of learning outcomes and their assessment will be integrated into the regular processes of curricular and program review and renewal and disciplinary accreditation wherever possible." As the information we have gathered makes clear, there are no regular processes of curricular and program review (with the exception of mandated external reviews of departments) across the University. Even those units presently accredited and using LOA have idiosyncratic processes for their maintenance and review.

There is evidence from other North American universities that various types of proprietary or in-house instruments, such as software specializing in online assessment tracking systems, can serve to simplify collection of information about course and program outcomes, and related assessment. Eventually, when the Canvas Learning Management System is fully implemented or the CurricUNET system is fully utilized, SFU may be able to move to an online tracking system for LOA reporting. However, this will need to be considered in light of the principles of local custodianship of unit-level data and aggregate reporting. Suffice it to say, products exist that SFU may wish to explore, but technological products are not necessary to make LOA at SFU a reality.

We recommend, then, that the LOA process be conducted at the local level and resourced by the office of the VPA, and that the cycle of regular assessment be fully built into the external review cycle, since this cycle requires both full review and mid-term reports on progress.

Adopting this approach will also ensure that LOA and curricular review are fully integrated into self-study and with all other activities of the department or program. This integration should make it straightforward for units to report out to external reviewers, and for LOA processes and aggregate data to be considered as part of the review reports sent to Senate.

Although some Canadian institutions utilize learning outcomes at the course level, and the Ontario post-secondary system is exploring the adoption of LOA frameworks across institutions, to date there is no Canadian model to guide SFU in deliberations on working with LOA. To find models, we examined a number of American universities12 with LOA processes to get a sense of the administrative structures needed to maintain LOA across the curriculum. In short, there is no single approach and, in fact, most institutions seem to conduct LOA on a relative shoestring budget. According to a report by the National Institute for Learning Outcomes Assessment (NILOA), only "20% [of American universities] have no assessment staff and 65% have two or fewer." However, in general, most universities that engage in LOA do have mechanisms to facilitate university-wide assessment processes, and the charge usually falls under the portfolio of a senior administrator, usually a provost or vice-provost in charge of academic affairs or his/her delegate. Eighty percent of all regionally accredited undergraduate-degree-granting U.S.-based institutions have "a person or unit" in charge of coordinating or implementing campus-wide assessment.13 In some cases an assessment council or committee is coordinated out of academic affairs to evaluate assessment efforts and results across the university and guide improvements in LOA processes and policy. Some universities also have a dedicated assessment office facilitating and supporting university-wide assessment efforts, sharing assessment resources, and sometimes engaging directly in data interpretation or providing instructional support for faculty. Currently many of these roles encompass responsibilities that at SFU are assigned to the TLC and IRP. Beyond that, practices are idiosyncratic and seem to respond to institutional cultures.

12 We focussed primarily on five NWCCU member institutions that were considered comparable to SFU insofar as they awarded an equivalent range of academic credentials (i.e. baccalaureate, master's and doctoral degrees) and were publicly funded, larger 'state' universities with similar student enrollment levels. Appendix B provides brief overviews of some of the institutions that were examined.

Providing Support

For SFU to be successful at incorporating an LOA approach into its current structures and processes, key functions for the provision of support will need to be in place. Whatever form these supports take, they must:

1. Provide resources and guides for use by faculty and staff in establishing learning outcomes and assessing their effectiveness.

2. Further investigate special software packages related to tracking LOA, and coordinate with the Chief Information Officer/Enterprise Systems/Project Management Office regarding compatibility with other IT systems currently in use (such as the faculty portal in the Beedie School of Business).

3. Meet regularly with Faculty- and University-level curriculum committees to advise on and support LOA processes and initiatives.

4. Gather information on LOA activities across the University for reporting purposes.

5. Work closely with units such as the TLC, IRP, and the Centre for Online and Distance Education (CODE), which provide support to instructors.

13 G. Kuh and S. Ikenberry (2009). More Than You Think, Less Than We Need: Learning Outcomes Assessment in American Higher Education. NILOA. Retrieved October 10, 2012 from http://www.learningoutcomeassessment.org/documents/niloafullreportfinal2.pdf

6. Provide LOA expertise to SCUS and SGSC, given that the only common denominators in the curriculum development process at SFU are these committees.

7. Work with the staff who manage external reviews, to ensure that LOA activities are integrated into the external review process.

To perform the above functions in the implementation of LOA at SFU, we recommend consideration of the following options.

Option 1: Align and Enhance Capacities of Existing Resources

The first option proposed for consideration is to expand the services currently embedded in the TLC and IRP, with coordination to ensure continuous improvement and alignment with institutional goals. Advantages of this option include capitalizing on: 1) the necessary expertise that already exists within these units; 2) established relationships and synergies between these units, other units (e.g. the University Curriculum and Liaison office), the Faculties and departments; and 3) greater efficiency and streamlined processes, by avoiding the overlapping mandates and added bureaucracy of a new unit.

This proposed option would also continue to build on recent efforts resulting from the TFTL's recommendations, which were accepted and incorporated by the VPA in current and future academic plans. Two areas of concern expressed by members of the University community were the need to: 1) respect and work with discipline-specific expertise, culture, and language, and 2) provide support that connects with the day-to-day work and challenges of faculty members and programs. The restructured TLC has been focused on establishing its connections within the disciplines and advancing in this area. In working with people on learning outcomes and curriculum, TLC professionals find the language and approach most appropriate within the disciplines, rather than taking a generalist or generic approach.

Furthermore, separating learning outcomes from the work that the TLC or CODE already undertake with programs, departments and faculty members may be a step backward. Taking an integrated approach, the TLC consults and works on curriculum mapping and associated faculty and academic development, and connects with other initiatives such as innovation using the new Canvas Learning Management System, graduate student development, etc. To do so, the TLC works with the academic plans and priorities set by each Faculty or department and with other units. The approach proposed would take advantage of the relationships already forged and the work already underway.


To move LOA forward at SFU, expertise in pedagogy, data collection and University-level reporting is required. From the Summer 2012 Learning Outcomes and Assessment Survey, it is clear that faculty and departments want discipline-specific examples of LOA materials, planning and implementation approaches, and consultation with people already within their Faculties. Building relationships, respecting discipline-specific approaches, and maintaining continuity with existing activities will be integral to the successful implementation of an LOA approach.

Similarly, IRP is already immersed in assessment and reporting activities related to teaching and learning. The office extracts, gathers, maintains, analyzes and reports on similar data (e.g. grades, retention, student outcomes survey data). Having a single office entrusted with the responsibility of managing data ensures quality, accuracy and consistency. At present IRP has the technological infrastructure and experience to manage, analyze and report course-, program- and University-level aggregated data. The duty of gathering, compiling, analyzing and reporting institutional-level learning outcomes data is already within IRP's mandate: IRP "collects, analyzes, maintains, and disseminates institutional research, information and data about the performance and effectiveness of all aspects of the University." As well, it is IRP's mandate to report official data to external agencies. It would be a logical extension of IRP's work to begin to consult with and support individual academic units in their own assessment of learning outcomes, much as the office already does for units regarding reporting on research productivity.

Option 1 avoids creating an additional level of bureaucracy or using scarce resources to create a new administrative unit. The establishment of LOA will require an increase to the workload of existing offices irrespective of any new unit created, and as such utilizing the existing structures or offices (with some additional staffing in each unit) would be more efficient. It has the additional benefit of enabling better communication and relationship building between the offices involved, and with faculty members such that they are better supported.

This approach will require that the TLC, CODE, and IRP work together closely with faculty and academic units in the design of assessment processes, from the outset. It may also require ad hoc 'assessment councils' to review LOA practices University-wide from time to time, and may require the office of the VPA to appoint assessment council members and provide secretariat services, including the organization and coordination of these meetings. The TLC and IRP would be present at these council meetings to provide their curriculum, pedagogical, data analysis/reporting, and other relevant expertise.

Option 2: Add Capacity to the Office of the VPA with a New Unit

The second option is to establish a compact LOA unit that, at least during SFU's period of transition to an ongoing process of LOA, would report to the VPA.

As previously described, instructional support and development, and evaluation at an institutional level, currently occur in the existing offices of the TLC and IRP, respectively. However, in moving to an LOA approach, SFU may move beyond the services these units presently provide, and may require a new support function that bridges the support provided to individual instructors and the functions of aggregated evaluation. Implicit in this approach is the assumption that all units would work together closely to ensure a coordinated approach to curricular development and assessment. Given the original commitment to avoid creating additional workload for faculty and staff, it is not reasonable to expect the existing units, whose services and resources are already fully utilized, simply to add LOA responsibilities to their current commitments without additional resources. The addition of a mandated LOA unit, even if small, could provide expertise and leadership in this work across the diversity of our academic units and cultures, and support ad hoc assessment councils as they are established; the three units could then work together across their complementary mandates.

Note that the decision to pursue Option 1 does not preclude the possibility of moving to Option 2 in the future, should the need arise.

Option 3: Blended or Evolving Services

The third option would be to begin with the services currently in place, with additional responsibilities and resources assigned to each. Over time, as needs evolve and/or service gaps are identified, a new unit could be developed which would, in turn, continue to grow and adapt to the work required to support mature LOA processes.


Recommendations

1. Programs that do not already have processes built in as part of their disciplinary accreditation should adopt either or both of the approaches to learning outcomes described above (program- to course-level outcomes, or course- to program-level outcomes).

2. Academic units will continue to complete the learning outcomes sections of the new course and program proposal forms used by SCUS and SGSC.

3. Academic units have the option to retain within the unit assessment data that is generated locally. When cyclical processes require it, University-level analysis related to learning outcomes assessment will use aggregated data received in report form from the units, which will be provided with standardized templates for the purpose.

4. The cycle of regular assessment of learning outcomes should be built into the external review cycle, beginning with units externally reviewed in spring 2014. LOA will become part of the regular process of external reviews, incorporated into the self-study documents as part of curricular review as of fall 2013 and subsequently every seven years. Curricular review, including comments on the assessment of learning outcomes, will also form part of the external review mid-term report.

5. The VPA will establish enhanced supports for LOA via one of three options: enhanced capacities in the existing units of the TLC and IRP; added capacity to the office of the VPA via a compact unit responsible for LOA, in place by summer 2013 with a mandate as described above; or an evolving blend of these options.


Simon Fraser University

Institutional Research and Planning

Appendix A

Summer 2012 Learning Outcomes and Assessment Survey:

Survey Findings

Prepared by

Kiran Bisra, Analyst

Institutional Research and Planning

September 26, 2012


PURPOSE

The goal of this study is to identify academic units that currently use, or are in the process of developing, learning outcomes and/or their assessment at Simon Fraser University (SFU). As well, this project aims to identify curricular assessment processes (regular and off-cycle) currently utilized in academic units.

For context, it was also important to gather information about whether learning outcomes have been implemented at the program and/or course level and whether academic units formally discuss their implementation and assessment.

METHODOLOGY AND RESPONSE RATE

The project targeted one key person in every academic unit whose portfolio includes curriculum issues.

Each respondent received an individualized survey covering all of the programs in their academic unit.

The surveys were open from June 26th to September 24th, 2012. In total, 84 individuals were invited to respond on behalf of 457 programs. In addition to the initial invitation, three reminders were sent to academic unit contacts over the course of survey administration.

In total, 273 programs had responses provided for them, yielding a response rate of 59.7%. Assuming that the sample is representative, percentages calculated on all programs are accurate within ±3.75%, 19 times out of 20. However, strong caution is advised when extrapolating these results to programs that did not respond: each academic unit is unique, and one program's responses may have no bearing on another program's.
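For readers who wish to verify the figure, the ±3.75% bound is consistent with the standard margin-of-error formula with a finite-population correction. The following short Python sketch is illustrative only; it assumes the conventional p = 0.5 and z = 1.96 for 95% confidence, which the report does not state explicitly:

    import math

    N = 457   # programs invited (population size)
    n = 273   # programs with responses (sample size)
    p = 0.5   # most conservative assumed proportion
    z = 1.96  # 95% confidence level, i.e. "19 times out of 20"

    # The finite-population correction shrinks the margin of error
    # when the sample covers a large share of the population.
    fpc = math.sqrt((N - n) / (N - 1))
    moe = z * math.sqrt(p * (1 - p) / n) * fpc

    print(f"response rate:   {n / N:.1%}")   # 59.7%
    print(f"margin of error: {moe:.2%}")     # ~3.77%, i.e. roughly +/-3.75%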

RESULTS

Highlights of the survey results include:

• Of those programs with responses, 18% have program-level learning outcomes, and of these, 64% are assessing students to determine whether they are achieving program-level outcomes.

• 16% of programs with responses have learning outcomes defined in all of their courses, and an additional 35% have learning outcomes in some of their courses.

• 8% of respondents are already accredited by an external accrediting body, and a further 4% are currently seeking accreditation.

• Of those individuals who responded, 65% wanted examples of learning outcomes and assessment materials/models from other universities, in their discipline, and consultation with faculty mentors who are experienced with learning outcomes assessment as support while establishing or improving learning outcomes assessment processes.

Appendix A1 is an example of an individualized survey that was sent out, and Appendix A2 displays the distribution of responses to the survey questions.


APPENDIX A1

SURVEY INSTRUMENT

Example Survey for Graduate Engineering Sciences Programs


Learning Outcomes and Assessment Survey



Page #1

Please select all programs for which you will be responding: (Q2)
[ ] Master of Applied Science in Engineering Science
[ ] Master of Engineering in Engineering Science
[ ] Doctor of Philosophy in Engineering Science

Page #2

Branching Information
• If Q3 = No then Skip to Page 5
• If not Q3 = Yes then Hide Q3-1

Do any of your programs have specific program-level learning outcomes? (Q3)
( ) Yes
( ) No

For which programs are there specific learning outcomes? (Q3-1)
[ ] Master of Applied Science in Engineering Science
[ ] Master of Engineering in Engineering Science
[ ] Doctor of Philosophy in Engineering Science

Page #3

Branching Information

• If not Q3-3 = Yes then Hide Q3-4

Are specific learning outcomes for this program(s) communicated to students through any of the following (check all that apply)? (Q3-2)

[Response grid: one row per program (Master of Applied Science in Engineering Science; Master of Engineering in Engineering Science; Doctor of Philosophy in Engineering Science), with a checkbox for each channel: SFU Calendar; Department website; Faculty website; Program website; Hard copy materials; Not communicated.]

Are there any other ways that you communicate program-level learning outcomes to students? (Q3-3)
( ) Yes
( ) No

Please specify: (Q3-4)

Page #4

Is your program(s) assessing students to determine whether they are achieving program-level learning outcomes? If so, please describe any methods, approaches or instruments that are used. (Q4)

[Response grid: for each program (Master of Applied Science in Engineering Science; Master of Engineering in Engineering Science; Doctor of Philosophy in Engineering Science): ( ) Yes ( ) No, with a text box for a description of learning outcomes assessment processes.]

Page #5

Branching Information

Do the courses in your program(s) have defined learning outcomes? (Q5)

[Response grid: for each program (Master of Applied Science in Engineering Science; Master of Engineering in Engineering Science; Doctor of Philosophy in Engineering Science): ( ) Yes, all of them ( ) Yes, some of them ( ) No, none of them.]

Page #6

Branching Information

• If not Q5-2 = Yes then Hide Q5-3

Are course-level learning outcomes communicated to students through any of the following (check all that apply)? (Q5-1)

[Response grid: one row per program (Master of Applied Science in Engineering Science; Master of Engineering in Engineering Science; Doctor of Philosophy in Engineering Science), with a checkbox for each channel: Course website; Course outline; Course syllabus; Not communicated.]

Are there any other ways that you communicate course-level learning outcomes to students? (Q5-2)
( ) Yes
( ) No

Please specify: (Q5-3)

Page #7

Do your course-level learning outcomes align with your program-level learning outcomes? (Q6)

[Response grid: for each program (Master of Applied Science in Engineering Science; Master of Engineering in Engineering Science; Doctor of Philosophy in Engineering Science): ( ) Yes ( ) Some of them ( ) No ( ) N/A, do not have program-level learning outcomes.]

Page #8

Does your academic unit already have, or is your academic unit in the process of developing, assessment models for learning outcomes for this program(s)? If so, please describe this process and your progress to date. (Q7)

[Response grid: for each program (Master of Applied Science in Engineering Science; Master of Engineering in Engineering Science; Doctor of Philosophy in Engineering Science): ( ) Already have ( ) In process ( ) No, with a text box: "Please describe below".]

Page #9

Branching Information

Is this program(s) already accredited or seeking accreditation with an external accrediting body? If so, what is the full name of the accrediting body and its website? If there is more than one, please separate with a semi-colon. (Q8-1)

[Response grid: for each program (Master of Applied Science in Engineering Science; Master of Engineering in Engineering Science; Doctor of Philosophy in Engineering Science): ( ) Yes, already accredited ( ) Yes, seeking accreditation ( ) No, neither, with text boxes for Name of Accrediting Body and Website URL.]

Is this program(s) already structured or restructuring curriculum to allow students to meet the requirements set by an external accrediting body? If so, what is the full name of the accrediting body and its website? If there is more than one, please separate with a semi-colon. (Q8-2)

[Response grid: for each program (Master of Applied Science in Engineering Science; Master of Engineering in Engineering Science; Doctor of Philosophy in Engineering Science): ( ) Yes, already structured ( ) Yes, restructuring ( ) No, neither, with text boxes for Name of Accrediting Body and Website URL.]

Page #10

Branching Information
• If not Q9 = Yes then Hide Q9-1

Are you aware of any "best practices" (e.g. reputed methodologies, popular instruments or mechanisms, etc.) for the identification and assessment of learning outcomes in your field? (Q9)
( ) Yes
( ) No

Please describe and provide any examples or reference information. (Q9-1)

Page #11

How often does your curriculum committee (or equivalent) meet? (Q10)

[Response grid: for each program (Master of Applied Science in Engineering Science; Master of Engineering in Engineering Science; Doctor of Philosophy in Engineering Science): ( ) Weekly ( ) Bi-weekly ( ) Monthly ( ) Once per semester ( ) Bi-annually ( ) Annually, with a text box: "Other, please describe".]

Page #12

Are learning outcomes and/or their assessment discussed during your curriculum committee (or equivalent) meetings? (Q11)

[Response grid: for each program (Master of Applied Science in Engineering Science; Master of Engineering in Engineering Science; Doctor of Philosophy in Engineering Science): ( ) Yes ( ) No.]


Page #13

Branching Information
• If not Q12 = Yes then Hide Q12-1

Besides curriculum committee meetings or their equivalent, are there other forums or committees within your unit that discuss learning outcomes and/or their assessment? (Q12)
( ) Yes
( ) No

Please describe: (Q12-1)

Page #14

What support(s) might your unit require in order to establish a learning outcomes assessment model, and/or improve upon existing learning outcomes assessment processes? (Q13)
[ ] Consultations with experts from the Teaching and Learning Centre
[ ] Consultations with faculty mentors who are experienced with learning outcomes assessment
[ ] Examples of learning outcomes and assessment materials/models from other universities, in my discipline
[ ] Professional development events
[ ] Other (please specify)

Page #15

Do you have thoughts, comments, or concerns that you would like to share with us about learning outcomes and their assessment? (Q14)

Page #16

We are interested in verifying the contact person in your unit in case we need to ask a clarifying question. Could you please provide your first and last name in the text box below? (Q15)


APPENDIX A2

DISTRIBUTION OF RESPONSES TO SURVEY QUESTIONS


This section displays the distribution of program responses to the survey questions. The tables that follow show the number and percentage of programs selecting each response to the questions. The question numbers in these tables do not always match the question numbers in the survey instruments. Assuming that the sample is representative, percentages calculated on all programs are accurate within ±3.75%, 19 times out of 20.

For questions where it was possible to choose more than one response, the tables present the percentage of respondents selecting each option, rather than the percentage of responses.
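Because respondents could select several options for such questions, the per-respondent percentages can legitimately sum to more than 100%. As a concrete illustration, here is a minimal Python sketch (not part of the original report) using the counts reported for question 15 below:

    # Counts from question 15 of this appendix (multi-select; 40 participants).
    selections = {
        "Examples from other universities, in my discipline": 26,
        "Consultation with experienced faculty mentors": 26,
        "Consultation with TLC experts": 22,
        "Professional development events": 14,
        "Other": 15,
    }
    participants = 40

    # Each percentage divides by the number of respondents,
    # not by the total number of selections made.
    for option, count in selections.items():
        print(f"{option}: {count / participants:.1%}")

    print(f"sum: {sum(selections.values()) / participants:.1%}")  # 257.5%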

Section 1: Closed-ended Questions

1) Do any of your programs have specific program-level learning outcomes?*

   Yes: 17 (37.0%)
   No: 29 (63.0%)
   Total participants: 46 (100.0%); missing cases: 0

   * For this question, percentages are based on the number of individuals who responded, not the number of programs.

2) For which programs are there specific learning outcomes?†

   Programs identified with learning outcomes: 50 (18.3%)
   Programs identified without learning outcomes: 223 (81.7%)
   Total programs with responses: 273 (100.0%)

   † For a list of identified programs please see open-ended responses in next section

3) Are specific learning outcomes for this program(s) communicated to students through any of the following?

   Hard copy materials: 29 (58.0%)
   Department website: 17 (34.0%)
   Program website: 14 (28.0%)
   Faculty website: 9 (18.0%)
   SFU Calendar: 7 (14.0%)
   Not communicated: 4 (8.0%)
   Total respondents: 50 (160.0%); missing cases: 0

   Are there other ways that you communicate program-level learning outcomes to students? Please specify.
   Open-ended responses are included in next section

4) Is your program(s) assessing students to determine whether they are achieving program-level learning outcomes?

   Yes: 30 (63.8%)
   No: 17 (36.2%)
   Total responses: 47 (100.0%)

   Description of learning outcomes and assessment processes
   Open-ended responses are included in next section


5) Do the courses in your program(s) have defined learning outcomes?

   Yes, all of them: 32 (15.5%)
   Yes, some of them: 71 (34.5%)
   No, none of them: 103 (50.0%)
   Total responses: 206 (100.0%); missing cases: 67

6) Are course-level learning outcomes communicated to students through any of the following (check all that apply)?

   Course syllabus: 46 (46.0%)
   Course outline: 42 (42.0%)
   Course website: 17 (17.0%)
   Not communicated: 36 (36.0%)
   Total respondents: 100 (141.0%)

   Are there other ways that you communicate course-level learning outcomes to students? Please specify.
   Open-ended responses are included in next section

7) Do your course-level learning outcomes align with your program-level learning outcomes?

   Yes: 28 (36.8%)
   Some of them: 5 (6.6%)
   No: 1 (1.3%)
   N/A, do not have program-level learning outcomes: 42 (55.3%)
   Total responses: 76 (100.0%); missing cases: 27

8) Does your academic unit already have, or is your academic unit in the process of developing, assessment models for learning outcomes for this program(s)?

   Yes, already have: 22 (8.6%)
   Yes, in process: 41 (16.0%)
   No: 193 (75.4%)
   Total responses: 256 (100.0%); missing cases: 17

   If so, please describe the process and your progress to date.
   Open-ended responses are included in next section

9) Is your program(s) already accredited or seeking accreditation with an external accrediting body?

   Yes, already accredited: 19 (8.1%)
   Yes, seeking accreditation: 10 (4.2%)
   No, neither: 207 (87.7%)
   Total responses: 236 (100.0%); missing cases: 37

   Name of accrediting body and website URL
   Open-ended responses are included in next section


10) Is this program(s) already structured or restructuring curriculum to allow students to meet the requirements set by an external accrediting body?

   Yes, already structured: 23 (9.3%)
   Yes, restructuring: 12 (4.9%)
   No, neither: 211 (85.8%)
   Total responses: 246 (100.0%); missing cases: 27

   Name of accrediting body and website URL
   Open-ended responses are included in next section

11) Are you aware of any "best practices" (e.g. reputed methodologies, popular instruments or mechanisms, etc.) for the identification and assessment of learning outcomes in your field?*

   Yes: 11 (27.5%)
   No: 29 (72.5%)
   Total participants: 40 (100.0%); missing cases: 6

   * For this question, percentages are based on the number of individuals who responded, not the number of programs.

   Please describe and provide any examples or reference information.
   Open-ended responses are included in next section

12) How often does your curriculum committee (or equivalent) meet?

   Weekly: 0 (0.0%)
   Bi-weekly: 11 (5.4%)
   Monthly: 85 (41.9%)
   Once per semester: 61 (30.0%)
   Bi-annually: 18 (8.9%)
   Annually: 14 (6.9%)
   Other: 14 (6.9%)
   Total responses: 203 (100.0%); missing cases: 70

   Other (please describe)
   Open-ended responses are included in next section

13) Are learning outcomes and/or their assessment discussed during your curriculum committee (or equivalent) meetings?

   Yes: 75 (43.1%)
   No: 99 (56.9%)
   Total responses: 174 (100.0%); missing cases: 99

14) Besides curriculum committee meetings or their equivalent, are there other forums or committees within your unit that discuss learning outcomes and/or their assessment?*

   Yes: 16 (40.0%)
   No: 24 (60.0%)
   Total participants: 40 (100.0%); missing cases: 6

   * For this question, percentages are based on the number of individuals who responded, not the number of programs.


15) What support(s) might your unit require in order to establish a learning outcomes assessment model, and/or improve upon existing learning outcomes assessment processes? (check all that apply)*

   Examples of learning outcomes and assessment materials/models from other universities, in my discipline: 26 (65.0%)
   Consultation with faculty mentors who are experienced with learning outcomes assessment: 26 (65.0%)
   Consultation with experts from the Teaching and Learning Centre: 22 (55.0%)
   Professional development events: 14 (35.0%)
   Other: 15 (37.5%)
   Total participants: 40 (257.5%); missing cases: 6

   * For this question, percentages are based on the number of individuals who responded, not the number of programs.

   Other (please describe)
   Open-ended responses are included in next section

16) Do you have thoughts, comments, or concerns that you would like to share with us about learning outcomes and their assessment?

   Open-ended responses are included in next section


Section 2: Open-ended Questions

Comments are reproduced exactly as written, except that references to individuals and departments have been removed for privacy reasons. Altered text is in [square brackets].

2) For which programs are there specific learning outcomes?

Biomedical Engineering with Biomedical Signals and Instrumentation Concentration Honours

Biomedical Engineering with Pre-Med Concentration Honours

Biomedical Engineering with Rehabilitation and Assistive Devices Concentration Honours

Business Administration M.B.A.

Business Administration M.B.A. (Executive)

Business Administration Ph.D.

Business Administration (Management of Technology) M.B.A.

Business with Accounting Concentration Honours

Business Major

Cognitive Science Honours

Cognitive Science Major

Cognitive Science Minor

Computer Engineering Honours

Computer Engineering Major

Cultural Resource Management Certificate

Development and Sustainability Grad Certificate

Development and Sustainability Minor

Digital Media M.D.M.

EdD Programs Ed.D.

Educational Leadership M.A.; M.Ed.

Electronics Engineering Honours

Electronics Engineering Major

Engineering Physics (Electronics) Honours

Environmental Science Honours

Environmental Science Major

Financial Risk Management M.Sc.

French and Education Certificate

French Language Proficiency Certificate

French M.A.

French Extended Minor

French Honours

French Major

History M.A.

History Ph.D.

Honours with Applied Behaviour Analysis Concentration Honours

Mechatronic Systems Engineering - Honors

Mechatronic Systems Engineering - Majors

Molecular Biology and Biochemistry M.Sc.

Molecular Biology and Biochemistry Ph.D.

Physics M.Sc.

Physics Ph.D.

Print and Digital Publishing Minor

Psychology with Applied Behaviour Analysis Concentration Major

Psychology Major

Public Policy


Publishing M.Pub.

Sustainable Community Development Certificate

Sustainable Community Development Post Baccalaureate Diploma

Systems Honours

Systems Major

3a) Are there other ways that you communicate program-level learning outcomes to students? Please specify

• AT first lecture, through powerpoint presentation which are available on line after the first lecture

• every opening class is an opportunity to re-confirm those outcomes

• In verbal orientation sessions

• Orally in class and in individual meetings with students.

• Program learning outcomes will be highlighted in course offerings. These learning outcomes materials are currently being prepared for the Fall 2013 academic year.

• Some instructors who teach courses with set LOAs do communicate these verbally to students. In some cases, instructors may even share the LOA rubric with students to illustrate, in part, how a particular assignment may be marked and what students are supposed to learn/demonstrate. These also help explain to students levels of expectation for the associated assignment.

• Supervisory committee meetings

• The expectations are communicated to the students at regular intervals both by the Senior Supervisor and the supervisor committee. In addition the student is initial provide information as to their progress in form their assessments in the individual graduates.

• Verbally in core courses.

• Verbally, in class.

4a) Description of learning outcomes and assessment processes

• Along with program level outcomes, a set of 65 indicators has been developed for evaluating student success in achieving these attributes (learning outcomes). Each course will be responsible for monitoring, on a class-wide basis, student success towards specific indicators by providing concrete data in the form of results from specific exam questions or other evaluations. These will be evaluated repeatedly, in multiple courses, using rubrics. A process has been developed to evaluate this data on a program-wide basis to determine if additional indicators are required and/or if changes to the course curricula is required.

• annual faculty progress reports about students, second language proficiency evaluations, comprehensive exams, thesis prospectus defences, thesis defences

• annual faculty progress reports about students, second language proficiency evaluations, thesis prospectus defences, thesis defences

• As above

• Comprehensive Examination Research Project

• Comprehensive Examination Thesis

• Concentration LOs are assessed in discipline courses mapped to LOs. *same method applies to all 8 concentrations in the [****].

• evaluation of applied projects which have been defined by the students themselves, and which evolve in consultation with faculty.

• Feedback from internship hosts and employers.

• Final outcomes of research project


• Interviews, self-reporting, surveys.

• Program-level LO are assessed in core [****] courses mapped to LOs. Concentrations each have defined LOs that are also assessed in identified courses. Assessment is made through direct embedded measures which use existing assignments/assessment tools for the course.

• Program-level LOs are assessed in comps, dissertation proposal (oral & written) and courses taught.

• Program-level LOs are assessed in final project.

• Program-level LOs are assessed in program capstone course.

• Program-level LOs are assessed in program coursework and final project.

• Program-level LOs are assessed in program coursework.

• Thesis examination

• Tutorial input, final essay, take home midterm exams

• Written and oral work, both individual and group projects. Their capstone project.

6a) Are there other ways that you communicate course-level learning outcomes to students?

Please specify.

• Course outlines

• Course-level LO are communicated to students by some instructors who teach Assurance of Learning identified courses within each of the degree programs. Course specific LO are indicated on most course outlines - however these are not always equivalent to defined Program or Concentration LOs.

• Direct interactions with student for honours project; that is, what is expected of them and what the student hopes to achieve

• Learning outcomes are presented during the first lecture of some courses.

• Orally in class and in individual meetings with students.

• N.B. I indicated that I was responding re MA and PhD in History. The questions on this page and the next are relevant to the undergraduate, not the graduate, programme. Consequently, I find them inappropriate.

• orally, often.

• Verbally in core courses; capstone courses.

• Will be reviewed in initial lectures

8a) Does your academic unit already have, or is your academic unit in the process of developing, assessment models for learning outcomes for this program(s). If so, please describe the process and your progress to date.

• Assessment models have been developed and implementation is in progress.
• [****] not formally assessed as a program. Required courses [****] assessed as [****] concentration courses.
• [****] Honours is not formally assessed as a program. Required courses [****] are all assessed as core [****] courses and [****] concentration courses.
• [****] minor is not formally assessed as a program. Students take 16 upper division [****] units. Required courses: [****] are not part of the core [****] courses assessed.
• [****] Joint Major not formally assessed as a program. Required courses [****] are all assessed as core [****] courses and [****] concentration courses.
• [****] Joint Major not formally assessed as a program. Required courses [****] are all assessed as core [****] courses and [****] concentration courses.
• [****] Cert not formally assessed as a program.
• Departmental coordinator has been identified and procedures for drafting learning and assessment outcomes are being formulated.
• [****] Joint Honours not formally assessed as a program. Required courses [****] are all assessed as core [****] courses and [****] concentration courses. Students complete at least one area of concentration. All concentrations have assessment models.
• [****] Joint Major not formally assessed as a program. Required courses [****] are all assessed as core [****] courses and [****] concentration courses.
• [****] is not formally assessed as a program. [****] does not have any course overlap with an assessed program.
• [****] Joint Major not formally assessed as a program. Required courses [****] are all assessed as core [****] courses and [****] concentration courses.
• [****] Joint Major not formally assessed as a program. Required courses [****] are all assessed as core [****] courses and [****] concentration courses.
• [****] Cert not formally assessed as a program.
• Infancy - will begin when I take over as UG chair
• [****] Joint Major not formally assessed as a program. Required courses [****] are all assessed as core [****] courses, [****] and [****] concentration courses.
• [****] Joint Honours not formally assessed as a program. Required courses [****] are all assessed as core [****] courses and [****] concentration courses.
• [****] Joint Major not formally assessed as a program. Required courses [****] are all assessed as core [****] courses, [****] concentration courses.
• [****] Joint Major not formally assessed as a program. Required courses [****] are all assessed as core [****] courses and [****] concentration courses.
• Rubrics have been developed for identified core and discipline courses. Embedded direct measure assessment model uses existing course assignments.
• Rubrics have been developed for identified discipline courses in all 8 concentrations. Embedded direct measure assessment model uses existing course assignments.
• Rubrics have been developed. Embedded direct measure assessment model uses existing program assignments.
• scheduled for 2012-13 academic year

• See above.
• The [****] concentration is formally assessed for discipline specific knowledge.
• The Honours Term at [****] is not currently running.
• The Honours Term at [****] is not currently running.
• the PhD just started this year, and the MSc was realigned to dovetail with it; so we're just trying to get overall processes in place at this point, we will be addressing assessment, etc. once those basic processes are fully in place and working.
• the steering committee is discussing these models.

• through student feedback and discussion
• Undertaking an inventory of current assessment methods and comparing vs. goals for program outcomes and graduate attributes.
• Until we hear about available resources for conducting post-graduation surveys of students, we are unable to develop assessment models at the program level
• We are developing ways to assess student progress through the supervisory committee process.
• We have discussed in Steering Committee and informally our pedagogical objectives and the means for achieving them.
• We have not been asked to develop them, but they would include something about intensive training related to [****] at the Graduate Level.
• We have not framed our program in this way, but it would not be difficult to do so. The program involves developing an deep, interdisciplinary understanding of [****], and being introduced to the conduct of field research and academic writing (students write a research based thesis)


• We will first need to create lists of outcomes before we list assessment models. The models will likely be based on the cumulative and hierarchical structure of the required courses in our programs.

• We will first need to create lists of outcomes before we list assessment models. The models will likely be based on the cumulative and hierarchical structure of the required courses in our programs.

• We will first need to create lists of outcomes before we list assessment models. The models will likely be based on the cumulative and hierarchical structure of the required courses in our programs. We find valid assessment to be extremely difficult because of the limited English abilities of a large number of our students.

• Work is in process in preparation for next [****] accreditation in 2014

9a) Name of Accrediting body and Website URL

• AACSB; EQUIS - http://www.aacsb.edu/ ; http://www.efmd.org/index.php/accreditationmain/equis
• as above - http://www.ceph.org/
• Canadian Engineering Accreditation Board - http://www.chemistry.ca/index.php?ci_id=2071&la_id=l
• Canadian Society for Chemistry - http://www.cma-canada.org/index.cfm?ci_id=4118&la_id=l
• CEPH: Council for Education on Public Health - http://www.cpa.ca/education/accreditation/; http://www.apa.org/ed/accreditation/
• Clinical Program Only: CPA, APA - http://www.engineerscanada.ca
• CMA Canada - http://www.engineerscanada.ca/e/pr_accreditation.cfm
• P Eng - www.engineerscanada.ca

10a) Name of Accrediting body and Website URL

• AACSB; EQUIS - apeg.bc.ca
• APEGBC - apeg.bc.ca
• APEGBC - http://www.aacsb.edu/ ; http://www.efmd.org/index.php/accreditation-main/equis
• Association of Professional Engineers and Geoscientists of British Columbia (APEGBC) - http://www.apeg.bc.ca/
• Canadian Engineering Accreditation Board - http://www.chemistry.ca/index.php?ci_id=18918da_id=l
• Canadian Society for Chemistry - http://www.cma-canada.org/index.cfm?ci_id=4118&la_id=l
• Clinical Program Only: CPA, APA - http://www.cpa.ca/education/accreditation/; http://www.apa.org/ed/accreditation/
• CMA Canada - http://www.engineerscanada.ca
• Our programme is structured, but I cannot say whether the structure meets the requirements of an external crediting body without knowing what that body is and what requirements it has. - http://www.engineerscanada.ca/e/pr_accreditation.cfm
• P Eng - www.engineerscanada.ca

11a) Please describe and provide any examples or reference information, (best-practices)

• An example is standardized testing. For example, there is a well-known test called the Force Concept Inventory which we have sometimes had students take during the first week of class and again during the last week of class to measure improvement.


• By keeping active in the profession, attending conferences, reading professional journals and discussing pedagogy among ourselves, we all have a good idea of "best practices."

• For MSc and PhD: the thesis. For PhD: the qualifying examination. http://graduate.ucsf.edu/system/files/Qualifying_Exam_and_Dissertation_SLOs.pdf
• Provide administrative support for individual mentoring of graduate students by faculty experts, including methodical tracking of student degree progress (see description of MIT tracking system in http://www.aps.org/programs/education/graduate/conf2008/resources/upload/gradedreport.PDF)

• Outcome based learning models were implemented by the [****] Chair responsible for overseeing this project within [****] at his previous institution ([****]). There is also awareness of US models through ABET.

• See Engineering Graduate Attribute Development Project http://egad.engineering.queensu.ca/

• The data provided by [****] during her presentation at SFU July, 2012

• There is published research on LOA available. AACSB doesn't specifically prescribe an 'approved' methodology for LOA. Direct embedded measures are preferable to MFTs or indirect survey methodologies.

• these are so numerous that we do not refer to them by name, but we assess students from multiple perspectives - and expect them to be both versatile and ingenious.

• TLC FASS [****] has met with some of our Faculty members to start informing them of various assessment of learning outcomes instruments

• UK Benchmarks

• We refer mainly to the popular post-secondary LO literature. The main Computer Science accreditation body is CIPS, which announced an LO-based policy: http://www.cips.ca/Computer-Science-Accreditation-Council-CSAC-moves-to-an-outcomebased-accreditation-approach-Sep2011

12a) How often does your curriculum committee (or equivalent) meet? Other (please describe)

• As necessary to interact with [****] about issues that arise

• As part of graduate studies committee

• As part of undergraduate studies committee

• during application review, and to put processes in place, we met a lot this spring (once every 1-2 weeks) and now have met only twice this summer

• Every few years.

• Have Steering Committee instead - meets every two to three years

• I am not a member of the graduate committee, so I can't speak to this.

• Meetings are scheduled around specific agenda items (SSHRC or program applications, course proposals, curricular changes, etc. This roughly amounts to meetings on a monthly basis, but sometimes this involves more or less frequent meetings.

• Nothing regular; depends entirely on whether there are issues about the curriculum that need to be discussed.

• rarely twice per semester

• when required; 2-3X semester

• 6 times annually


15a) What support(s) might your unit require in order to establish a learning outcomes assessment model, and/or improve upon existing learning outcomes assessment processes? Other (please describe)

• An example that a skeptical and overworked faculty could imagine being useful.
• Because we currently have 49% international students, SFU must repair the EAL problems on campus before we can undertake valid assessment of outcomes
• Consultations with experts in History at other universities that have long-standing experience with learning outcomes in History would be most helpful. If the university mandates learning outcomes for all academic units, I hope that it will fund such consultations.
• Don't yet know.
• Examples would be highly useful
• Experience from universities in other countries (US, Australia)
• [****] is very lucky to have [****] who is well versed in learning outcomes and assessment funding
• Funding and resources for locating students who have graduated and conducting the relevant research.
• not interested
• Potentially increase staffing in order to manage the process
• Staff support support for database development to monitor program evaluation metrics and automate report generation for annual program review. Books on rubric development for post-secondary.
• there may be a future need for staff to handle data collection and processing.
• There would definitely be a future need for staff to handle data collection and processing if any and all programs offered by an academic unit are included in a system of LOA.
• This question presumes that we are in favour of this initiative which as a dept we are not
• We are an interdisciplinary program
• We have made substantial progress in developing LOs, and have designed a presentation and process management system internally. We would be interested in comparing and sharing what we have done with other units.

16) Do you have thoughts, comments, or concerns that you would like to share with us about learning outcomes and their assessment?

• It is very important to have the flexibility for each academic unit to run the LO process, gather and present the materials in a way which is appropriate for that unit. It is unlikely that a global standard form or template will work well. This is particularly important so that the units are able to use the LO documents after the global process is complete: these need to be live, revisable, working documents.

• As faculty and students are not that receptive to significant changes, there is a need for significant training for faculty regarding OBE, particularly alignment of assessment with learning outcomes.

• At a recent Departmental Meeting, the [****] Department as a whole agreed that we have concerns about the initiative toward identified learning outcomes and would like further information before this initiative proceeds and also feel that the resources to support such an initiative could possibly be more fruitfully allocated.

• Given that the University will have to provide documentation continuously to demonstrate that we are meeting our learning objectives, [****] faculty are concerned about the large potential workload increase that could take time away from teaching activities, including efforts to innovate. We feel that this should be addressed, as most of our faculty remain either unaware of the issue or unconvinced of its importance with respect to how they teach.


I am a strong believer that learning outcomes should be an integral part of our programs.

They provide clear landmarks for students. It is particularly important in language training.

I found some of the survey's questions difficult to answer because they did not really apply to our programs. We are in the process of reviewing our entire [****] program (the rest of the curriculum had been revised after the last review; however, the outcomes were not stated clearly, so we will have to revisit this issue soon), but it is not done yet, so I had to answer no to some questions even though that does not reflect the reality. We are taking baby steps in the right direction, but it is not a black-and-white yes-or-no answer.

I feel that the learning outcomes and assessment model is incompatible with the pedagogy in our department and in the [****] more broadly. On the assessment side in particular, I fear that it will increase workloads, which are already high, and involve the department in increased administrative work that will in no way improve the pedagogy or learning in the department.

The basic philosophy of learning outcomes and assessment is inimical to creative and spontaneous inquiry, and to the development of authentic critical thinking ability. I fear that once learning outcomes and assessment are established, it will inevitably lead to a change in course design and instruction whereby courses are designed to most easily meet the quantitative assessment requirements of these new procedures. I am opposed to the establishment of a learning outcomes and assessment model at SFU.

I think the whole exercise is reductionist and beggars the learning process.

I think this is a very good idea.

I'm not happy we're going to have to go through this exercise just to let our sports teams play in an American league, instead of against other Canadian universities. When the university community was told about the NCAA plans, there was no mention of the huge effort required right across the university to make this pipe dream happen.

Is the university planning to develop institutional or universal learning outcomes for all undergraduates? If so, it is essential to know these in order to align individual programs along these goals.

Resources and time are, of course, a concern. While the [****] has adopted some practices around including learning objectives and core competencies from [****] in all of our syllabi, revising our curricula and evaluating the outcomes are a sizable undertaking and investment.

It is likely important for departments to distinguish between (and properly align) learning outcome goals, learning objectives, and the demonstrable traits against which student work will be evaluated. It would be helpful for departments embarking on defining and measuring LOAs to have clearly defined terminology to work with (i.e. a common language). LOAs and course grades are not necessarily equivalent assessments. LOA should also not be considered a measure of instructor evaluation, nor should it be used in [****] discussions. It is necessary for departments/faculties to control LOA data at all stages to prevent misuse.

It would be important that learning outcomes and assessments be fully integrated from course to program, to faculty, to university levels. There is a possible concern that University-level LOAs could lead to increased workload for students/faculty at the program level, and a further concern that any University-level LOAs might conflict with external professional accrediting bodies.

Learning outcome development and assessment is an expensive process. Properly monitoring an outcome-based education requires additional staff and instructor/TA support. This process will force instructors to be more unified in their coverage of course materials. Rubric development will be challenging and critical to success.

Let's not make this process overly difficult; once program-level outcomes have been set, the rest should be fairly simple, if bureaucratic.

LOA for programs like joint honours or majors will require inter-departmental coordination since LOs would be divided between the relevant academic units.


• Many members of the [****] Department have expressed considerable concern about the LO initiative, relating to matters such as workload, pedagogy, and top-down university governance.

[****] will be working on developing program learning outcomes.

Not sure what is meant by a 'model'; we have practical experience in integrating material and specifying outcomes for graduate students in our multi-disciplinary program. Any model would have to be applicable to our sort of program.

Some of this survey's questions switched from [****] to [****] MA and PhD. I answered 'no' or equivalent to those questions.

The concept and operationalization of learning outcomes are still vague for us. Concrete examples are helpful.

The [****] passed a motion at yesterday's faculty meeting [****] expressing great concern about both the process of being required to develop learning outcomes and the value of the exercise, especially the expenditures during times of evident fiscal restraint.

There are some problems with this survey.

1. I was asked to complete it in my capacity as [****] Programme Chair. But the survey asked me questions about the [****] programme when referring to [****] Majors and the [****] Honours programme. I found this inappropriate.

2. The preamble that I received in the email from [****] needs to point out in greater detail what the objective of the survey is. Is the survey looking for the ways we already express and assess learning outcomes in order to recognize and affirm the superior-quality teaching in evidence at SFU, or is the survey looking for gaps, inadequacies, etc.?

And what action will be taken on the basis of the information gathered in the survey? What does the survey mean by improving "the educational experience for all our students"? Is there a problem with the way in which we deliver our courses and formulate and assess learning outcomes now?

3. The preamble needs to offer a more elaborate definition of learning outcomes. How are these distinct from programme requirements or teaching objectives? The definition offered in the preamble is sufficiently malleable to be identical with requirements and objectives. It made the survey confusing. University-wide discussions have not congealed enough for all faculty and administrators to have an identical meaning for learning outcomes.

Two of the previous sets of questions were phrased in the present tense: "Does your curriculum committee discuss learning outcomes?" Since we have not yet discussed this, I cannot say that it is something we "do." But once we do begin discussions on learning outcomes, the curriculum committee will be the place where these discussions begin. Since we are a small department, much of this will also be done at the level of the entire department.

We have concerns with how to assess learning outcomes at the program level, as that requires the ability to track students after graduation. In the past, we have asked for support in doing this, so as to determine the quality of our degree, but have never received it. Unless the VPA is willing to provide the money and infrastructure support to enable this, I don't see how it will be possible to assess learning outcomes at the program level.

We are an interdisciplinary program and will be relying on the departments whose subjects we use to create the individual learning outcomes for the courses involved in [****] programs.

We are happy to begin the development of learning outcomes in our graduate programs. We will begin in the fall.

We are trying to spread a culture of awareness of learning outcomes, and are succeeding; we hope we are moving at a pace roughly equal to others in the university.

We need to address the SFU-wide focus on developing learning outcomes across faculties and look forward to being involved in this process. Note that [****] is a very diverse, non-departmentalized faculty; we have devised the [****] program to deal with that diversity, having students focus in their area of research while also learning about related fields and working together as much as possible to support interdisciplinary approaches to learning and research.

We will do it but we are very busy and need support.



SFU

APPENDIX B

Learning Outcomes & Assessment Working Group:

Institutional Overviews

Prepared by Lia Starr, Coordinator, Learning Outcomes & Assessment Project

Office of the Vice-President, Academic & Provost

September 2012


Portland State University

Portland, OR http://www.pdx.edu/

At a Glance

• Accreditor: NWCCU

• Total enrollment: 28,035 (2010-2011) / FTEs: 17,953 (2010-2011)

• Over 225 academic degree programs

• Degree range: bachelor's; master's; doctoral; graduate certificates

• 8 colleges and schools

Colleges & Schools

• College of Liberal Arts & Sciences
• College of Urban & Public Affairs
• Maseeh College of Engineering & Computer Science
• School of Business Administration
• Graduate School of Education
• School of Extended Studies
• School of Fine & Performing Arts
• School of Social Work

LOA-relevant Administrative Structure

[Organizational chart: Provost & Vice President, Academic Affairs; Institutional Research & Planning; Vice Provost, Academic Programs & Instruction; Associate Vice Provost, Teaching, Learning & Assessment; Institutional Assessment Council*; Center for Academic Excellence]

* The IAC reports annually to the Provost and the Faculty Senate.


Institutional Assessment Council

• Appointments are made by the Provost, Vice Provost for Academic Programs and Instruction, and the Dean of Undergraduate Studies to represent the breadth of academic units and programs of the institution.

• Priority has been the development of Campus Wide Learning Outcomes (CWLO) that define the undergraduate educational experience.

• Working on a five-year plan (since 2009) to promote the implementation and assessment of the CWLOs at a campus level as individual units continue their program level assessment activities.

Council Structure (twelve voting members including):

• Faculty representation

• Staff representation

• Student representative

• 5-6 x ex officio members (Vice Provost for Academic Programs & Instruction, Dean of Undergraduate Studies)

Center for Academic Excellence

• Assessment practices reflect a decentralized institutional culture, with classroom-level assessment remaining the responsibility of the individual schools, colleges and departments.

• Seeks to build institutional self-knowledge based on the amassed results of program-level student learning assessments.

• Provides classroom assessment resources while using program-level assessment as a vehicle for focusing faculty attention on student learning, program alignment, pedagogy, and student development.

Office of Institutional Research & Planning

• Collects, preserves, interprets, analyzes and disseminates information regarding the characteristics, activities, operations and policies of PSU.

• Information is used by members of the PSU community for:
  o mandated reporting requirements;
  o administrative decision-making and policy formulation;
  o academic assessment;
  o institutional planning; and
  o release to the general public.


University of Oregon

Eugene, OR http://www.uoregon.edu/

At a Glance

• Accreditor: NWCCU

• Total enrollment: 24,447 / FTEs: 23,451 (fall 2011)

• Over 260 academic programs

• Degree range: bachelor's; master's; doctoral; graduate certificates

• 7 colleges and schools

Colleges & Schools

• School of Architecture & Allied Arts
• College of Arts & Sciences
• Lundquist College of Business
• College of Education
• School of Journalism & Communication
• School of Music & Dance
• School of Law

LOA-relevant Administrative Structure

[Organizational chart: Senior Vice President & Provost; Vice Provost for Academic Affairs; Associate Vice Provost for Academic Affairs; Assessment Council*; Vice Provost for Undergraduate Studies; Teaching & Learning Center; Vice President for Finance & Administration; Office of Institutional Research]

* Chaired by the Associate Vice Provost for Academic Affairs.


Office of Academic Affairs - Assessment Council

• The Vice Provost for Academic Affairs is responsible for the accreditation portfolio; coordination of the assessment of student learning outcomes is delegated to the Associate Vice Provost for Academic Affairs.

• An Assessment Council, chaired by the AVP for Academic Affairs, is charged with discussion and coordination of the University's assessment efforts.

Structure:

• Vice Provost for Academic Affairs / Chair of Assessment Council

• Executive Assistant

• Council membership (details unknown)

Teaching & Learning Center

• Provides substantial student support in addition to faculty support.

• A Teaching Effectiveness Program exists to engage the academic community in viewing, assessing, and improving undergraduate instruction, but there does not appear to be any significant LOA component to the TLC's portfolio.

Office of Institutional Research

• Conducts research on students, faculty, and staff to promote ongoing institutional self-assessment.

• Fulfills UO's compliance reporting requirements at both the federal and state levels.

• Provides institutional research and assessment data and data analysis to departments and units throughout the University community.

• Organizes and conducts other institutional assessment programs, including assessment of student learning and student activities, both independently and in collaboration with other departments.

• Provides institutional assessment results and analysis to administrative units, academic departments, and offices providing student support services.


University of Washington

Seattle, Tacoma and Bothell, WA http://www.washington.edu/

At a Glance

• Accreditor: NWCCU

• Total enrollment: 43,619 (fall 2011) / FTEs: 50,527 (fall 2011)

• Over 150 academic and professional programs

• Degree range: bachelor's; master's; doctoral; professional programs

• 16 colleges and schools

Colleges & Schools

• College of Arts & Sciences
• College of Built Environments
• Foster School of Business
• School of Dentistry
• College of Education
• College of Engineering
• College of the Environment
• The Graduate School
• Information School
• School of Law
• School of Medicine
• School of Nursing
• School of Pharmacy
• Evans School of Public Affairs
• School of Public Health
• School of Social Work

LOA-relevant Administrative Structure

[Organizational chart: Provost & Executive Vice President; Vice Provost & Dean of Undergraduate Academic Affairs; Office of Educational Assessment; Center for Teaching & Learning*; Vice Provost of Planning & Budgeting; Office of Planning & Budgeting]

*CTL reports jointly to the Vice Provost & Dean of Undergraduate Academic Affairs, the Vice Provost & Dean of Graduate Studies, and the Dean of University Libraries; those three in turn report directly to the Provost & Executive Vice President.


Office of Educational Assessment

• Dedicated to the improvement of educational practice through assessment of teaching and learning and through evaluation and support of educational programs and services.

• Provides a variety of evaluation and assessment services to the UW community and outside agencies.

• Research staff are specialists in quantitative and qualitative methods; they collaborate with UW faculty and staff to create effective assessment strategies, particularly in the assessment of college-level outcomes, program evaluation, and survey research.

Admin:
• Directorship: Director
• Course Evaluation: Manager; Program Assistant
• Scanning & Scoring: Manager; 2 x Scanning & Scoring staff
• Testing Center: Manager; 2 x Educational Test Administrators
• Assessment, Program Evaluation, and Research Support: Director, UW Study of Undergraduate Learning; 4 x Research Scientists
• Development: Manager
• Accounting/Management/Programming: Administrator; Web Developer; Senior Computer Specialist; Fiscal Specialist; Administrator Emeritus

Center for Teaching & Learning

• Works with individuals, departments, and communities of practice, as well as in collaboration with campus partners, to share knowledge of best practices and evidence-based research on teaching, learning, and mentoring.

Office of Planning & Budgeting

• Supports the UW community in accomplishing its goals through the planning and allocation of financial and physical resources, and by providing analysis and information services to enhance university decision-making, planning and policy formation.


Washington State University

Pullman, Spokane, Tri-Cities, Vancouver, Extension, WA http://www.wsu.edu/

At a Glance

• Accreditor: NWCCU

• Total enrollment: 27,329 (fall 2011) / FTEs: 24,244 (average 2010-2011)

• Over 200 academic programs

• Degree range: bachelor's; master's; doctoral; professional; graduate certificates

• 11 colleges

Colleges & Schools

• College of Agricultural, Human & Natural Resource Sciences
• College of Arts & Sciences
• College of Business
• College of Communication
• College of Education
• College of Engineering & Architecture
• Honors College
• College of Nursing
• College of Pharmacy
• University College
• College of Veterinary Medicine

LOA-relevant Administrative Structure

[Organizational chart: Provost & Executive Vice President; Associate Executive Vice President; Liaison Council for Assessment of Undergraduate Programs; Office of Institutional Research; Office of Assessment of Teaching & Learning]


Liaison Council for Assessment of Undergraduate Programs

• Meets bi-monthly to plan and manage an institution-wide system for program assessment.

• WSU reports the Liaison Council's activity as part of its assessment system and its efforts to build capacity.

Representation:

• 11 x members, one from each college or school

• 2 x members from the WSU Tri-Cities and WSU Vancouver campuses

• Member from International Programs

• 4 x members from the Provost's office in the following capacities:
  o Associate Executive Vice President
  o Vice Provost for Academic Policy and Evaluation
  o Special Assistant to the Provost and Executive Vice President
  o Director, Office of Assessment of Teaching and Learning

Office of Assessment of Teaching & Learning

• Coordinates the bi-monthly meetings of the Liaison Council for Assessment of Undergraduate Programs.

• Works with colleges and programs on assessment of student learning and continuous improvement of undergraduate academic programs.

• Develops and implements assessment systems and interprets results.

• Develops and deploys best practices in assessment to improve teaching and learning.

• Meets assessment requirements for regional accreditation and academic program review.

Admin:

• Director
• 2 x Assessment Fellows
• 2 x Assessment Specialists
• 2 x Technical Managers
• Technical Assistant
• Administrative Assistant

Office of Institutional Research

• Provides institutional-level information for decision-making and planning purposes and to external audiences.


University of Alaska Anchorage

Anchorage, Kodiak, Valdez, Kenai, Matanuska-Susitna Valley, AK http://www.uaa.alaska.edu/

At a Glance

• Accreditor: NWCCU

• Total enrollment: 30,073 (2010-2011) / FTEs: 25,470 (2010-2011)

• Over 150 degree programs

• Degree range: bachelor's; master's; doctoral (as of May 2012); professional

• 8 colleges and schools + 4 satellite campus colleges

Colleges & Schools

• College of Arts & Sciences
• College of Business & Public Policy
• College of Education
• College of Health
• Community & Technical College
• School of Engineering
• School of Nursing
• School of Social Work

Satellite campus colleges:
• Matanuska-Susitna College
• Kenai Peninsula College
• Kodiak College
• Prince William Sound Community College

LOA-relevant Administrative Structure

[Organizational chart: Provost & Executive Vice Chancellor (Academic Affairs); Vice Provost for Undergraduate Academic Affairs*; Senior Vice Provost of Inst. Effectiveness, Engagement & Academic Support; Assistant Vice Provost & Accreditation Liaison Officer; Academic Programs & Assessment; Academic Assessment Committee**; Institutional Research; Center for Advancing Faculty Excellence]

* Title changed from VP Accreditation & Undergraduate Programs to VP Curriculum & Assessment, to now VP Undergraduate Academic Affairs.

** As of Feb. 2012, the VP Res & Grad Studies sits ex-officio on the AAC to solidify the Grad School's role in LOA.

B10

Appendix B- LOAWG: Institutional Overviews Sept. 2012

Academic Assessment Committee

• Directs the collection and analysis of assessment data, and responds to requests for information on official assessment results and achievement of student learning outcomes.

• Develops, maintains and implements an Academic Assessment Handbook for programs.

• Refers curricular and academic issues to the appropriate boards, reviews assessment-related policy and procedure change requests, and recommends actions to the appropriate bodies.

Representation:

• Members from most colleges or schools (6 in all)

• 3 x members from satellite campuses

• 4 x members from Faculty Senate

• 3 x ex-officio members, including:
  o Vice Provost for Undergraduate Academic Affairs (VPUAA)
  o Vice Provost for Research and Graduate Studies (VPRGS)

• Member from Consortium Library

• Accreditation and Academic Programs Coordinator (Office of Academic Affairs)

Academic Programs & Assessment

• Coordinates review processes for academic programs to determine if they are operationally efficient, achieving intended outcomes, and aligned with UAA's and academic units' missions.

• Two types of academic program reviews are performed: Student Outcomes Assessment and Program Review.

• For Student Outcomes Assessment, program faculty review program outcomes and determine whether students have achieved them, leading to faculty-driven program recommendations and plans for improvement.

Admin:

• Vice Provost for Undergraduate Academic Affairs

• Assistant Vice Provost and Accreditation Liaison Officer* (placed in Accreditation)

• Accreditation and Academic Programs Coordinator

Center for Advancing Faculty Excellence

• Offers workshops, forums and assistance to faculty in teaching, research, assessment, civic engagement, classroom success, student retention, etc.

• Runs a monthly workshop series to help faculty assess the efficacy of classroom strategies, document the effectiveness of their teaching, and share effective strategies.

Institutional Research

• Collects and analyzes institutional data to enhance decision-making, assessment and planning; is heavily involved in UAA's accreditation process.


Texas A&M University

College Station, TX http://www.tamu.edu/

At a Glance

• Accreditor: SACS

• Total enrollment: 46,422 (spring 2011) / FTEs: 43,074 (fall 2011)

• Over 360 academic degree programs

• Degree range: bachelor's; master's; doctoral; professional programs

• 10 colleges and schools

Colleges & Schools

• College of Agriculture & Life Sciences
• College of Architecture
• Bush School of Government & Public Service
• Mays Business School
• College of Education & Human Development
• Dwight Look College of Engineering
• College of Geosciences
• College of Liberal Arts
• College of Science
• College of Veterinary Medicine & Biomedical Sciences

LOA-relevant Administrative Structure

[Organizational chart: Provost & Executive Vice President for Academic Affairs; Vice Provost for Academic Affairs; Dean of Faculties & Associate Provost; Associate Vice President for Academic Services; Office of Institutional Assessment; Center for Teaching Excellence; Office of Institutional Studies & Planning]


Office of Institutional Assessment

• Supports and assists assessment efforts across TAMU, particularly those dealing with university-wide assessment and accreditation.

• Assists colleges in efforts to assess student learning.

• Supports institutional effectiveness efforts across the university.

• Plans and hosts an annual assessment conference.

• Assessment Liaisons represent each college and support unit and advise on issues regarding assessment practice and processes.

• An Assessment Review Committee produces an annual report for the Provost and President which summarizes local college assessment reviews, as well as administrative and educational support assessment reviews of the university.

Admin:

• Director

• Assistant Director

• 2 x Program Coordinators

• Administrative Assistant

• 2 x Graduate Assistants

• 28 x Assessment Liaisons from various colleges and administrative units

• 8 x Assessment Review Committee members from various colleges and administrative units

Center for Teaching Excellence

• Assists academic departments with defining program goals and determining where they are introduced, reinforced and assessed throughout the curriculum.

• Facilitates the processes of:
  o curriculum redesign;
  o creation of program-level learning outcomes;
  o establishment of course-level learning outcomes; and
  o development of a graphical matrix.

• Encourages departments to build high-impact practices (HIP) such as internships, service learning, study abroad and undergraduate research into their programs; these can be incorporated as stand-alone, for-credit components, or incorporated directly into courses.

• Offers one-on-one consulting with faculty on teaching related topics, from implementing high impact practices in courses to documenting results of changes made in teaching; assists with interpreting and responding to student feedback.

Office of Institutional Studies & Planning

• Provides analytical support for university-wide planning activities, fulfilling university compliance reporting requirements at both the Federal (SACS) and State levels.

• Primary responsibilities include an online information system, faculty and student studies and reports, and other reporting.


University of North Carolina at Chapel Hill

Chapel Hill, NC http://www.unc.edu/index.htm

At a Glance

• Accreditor: SACS

• Total enrollment: 29,137 (fall 2011) / FTEs: 26,837 (fall 2011)

• Over 250 academic degree programs

• Degree range: bachelor's; master's; doctoral; professional programs

• 15 colleges and schools

Colleges & Schools

• College of Arts & Sciences
• School of Dentistry
• School of Education
• Eshelman School of Pharmacy
• Gillings School of Global Public Health
• Graduate School
• Kenan-Flagler Business School
• School of Government
• School of Information & Library Science
• School of Journalism & Mass Communication
• School of Law
• School of Medicine
• School of Nursing
• School of Social Work
• Summer School

LOA-relevant Administrative Structure

[Organizational chart: Executive Vice Chancellor & Provost; Vice Provost for Finance & Academic Planning; Assistant Provost and Director of Institutional Research & Planning; Vice Provost for Academic Initiatives; Center for Faculty Excellence]


Office of Institutional Research & Assessment

• As part of the "Institutional Effectiveness" component of its mission, is responsible for coordinating campus-wide and unit-level assessment of academic programs and administrative processes to support UNC-Chapel Hill's quality improvement efforts, including:
  o providing technical support and consultation to assist units in assessing student LOs; and
  o coordinating accreditation activities.

• Provides resources for use by faculty and staff in assessing the effectiveness of their units and reporting the results as required for University accountability purposes.

• For academic programs assessment, provides guides for creating feasible systems of assessing student learning outcomes within programs and generating meaningful data for program improvement.

• Generates information on the current accreditation status of UNC-Chapel Hill and its professional programs and schools.

• An Assessment Policy Advisory Committee (APAC), made up of representatives from all academic units and the Division of Student Affairs, advises the Executive Vice Chancellor and Provost concerning policies and procedures for the outcomes assessment process at UNC-Chapel Hill (see "Principles" and "Recommendations for Resources to Support Assessment Activities").

Center for Faculty Excellence

• Provides integrated support to faculty across the entire spectrum of their responsibilities and throughout all stages of their careers.

• Partners with other units across the University to ensure efficient and effective delivery of support services to all faculty campus-wide.

• Collaborates with Instructional Technologies Services to help instructors devise novel approaches to instruction and assessment.

• E-Learning: provides resources for online course development, implementation and evaluation, as well as nationally recognized standards for online course development.

Deans are responsible for coordinating student LOA within their schools. They are also responsible for establishing internal reporting processes and schedules that ensure that assessments occur on a regular basis and that the results are used to improve programs as appropriate. The Office of the Executive Vice Chancellor and Provost incorporates data from school-level reports of outcomes assessment processes in its planning processes, and regularly uses these results for program improvement.


Boston University

Boston, MA http://www.bu.edu/

At a Glance

• Accreditor: NEASC

• Total enrollment: 32,067 (fall 2009) / FTEs: 24,846 (fall 2009)

• Over 250 academic degree programs

• Degree range: bachelor's; master's; doctoral; professional programs

• 17 colleges and schools

Colleges & Schools

• College of Arts & Sciences
• Graduate School of Arts & Sciences
• College of Communication
• College of Engineering
• College of Fine Arts
• College of General Studies
• College of Health & Rehabilitation Sciences: Sargent College
• Metropolitan College
• Goldman School of Dental Medicine
• School of Education
• School of Hospitality Administration
• School of Law
• School of Management
• School of Medicine
• Division of Graduate Medical Sciences
• School of Social Work
• School of Theology

LOA-relevant Administrative Structure

[Organizational chart: University Provost & Chief Academic Officer; Senior Vice President, Operations; Associate Provost for Faculty Affairs; Associate Provost for Undergraduate Education; Vice President, Budget, Planning & Institutional Research; Center for Excellence & Innovation in Teaching; Task Force "One BU: Unlocking the Undergraduate Experience"; Institutional Research]


Task Force - "One BU: Unlocking the Undergraduate Experience" on Assessment

• In June 2008, a Task Force called "One BU: Unlocking the Undergraduate Experience" was convened by the University Provost and chaired by the Associate Provost for Undergraduate Education; this committee worked for twelve months to define a set of shared principles for what constitutes a BU undergraduate education.

• The Task Force made three recommendations relating to LOA:
  o Add capacity in institutional research to support LOA, especially across the curriculum.
  o Encourage the use of ePortfolios as a self-appraising instrument.
  o Encourage the use of ePortfolios as a virtual forum for students to describe, to comment on, and to mentor each other.

• There does not appear to be any LO or Assessment office or council per se, and BU's LOA culture looks to be largely decentralized, with individual colleges and schools presently following their own approaches and philosophies.

• At the university level, the Center for Excellence and Innovation in Teaching espouses the accreditation standards of the Association to Advance Collegiate Schools of Business (AACSB).

Center for Excellence & Innovation in Teaching

• Promotes excellence in teaching by facilitating the appropriate use of new technologies in classrooms and laboratories; refining methods, instruments, and procedures for evaluating teaching; and working with the administration to improve the teaching infrastructure in classrooms and laboratories.

• Assists faculty with learning assessment (e.g. determining how well students understand class material / classroom assessment techniques).

• Assists faculty with identifying learning outcomes for course design following Bloom's taxonomy of educational objectives.

• Faculty representatives serve on a Teaching, Learning, and Instructional Resources Committee.

Institutional Research

• As part of Budget, Planning, and Institutional Research, provides statistics and information in support of BU's planning, management, and budgetary processes.

• Collects and analyzes key information regarding BU's faculty, staff, and student populations.

• Coordinates data for governmental, informational, and ranking submissions.


Carnegie Mellon University


Components of the Program Assessment Process

Each program is asked to use this form (doc) to articulate outcomes and identify measures, findings, and resulting actions. We use the following definitions to guide our work:

Program outcomes identify the knowledge, skills, abilities, etc., that students should be able to demonstrate upon completion of the program. The outcomes need to be specific and measurable.

Direct measures require that students demonstrate their knowledge, skills, abilities, etc., for faculty to then assess whether/how well they are achieving/have achieved a program outcome. Direct measures can include project artifacts, artistic work products, capstone experiences, case studies, exams, juried performances, oral presentations, papers, and portfolios.

Indirect measures gather perceptions of how well students are achieving/achieved a learning outcome. They can include alumni, employer, or student surveys, exit or focus group interviews, enrollment and retention data, and job placement data. Indirect measures complement the data collected from direct measures but cannot stand alone as a sole measure of student performance and program success.

Major findings result from analyzing the data collected and documenting areas of success and areas for improvement.

Actions provide evidence that findings have been or will be used to further develop and improve student achievement of program outcomes (i.e., actions that were or will be taken as a result of data collection and analysis). It is also important to state when findings provide evidence that students are successfully achieving a program outcome.

Carnegie Mellon University

Program-level Outcomes Assessment Chart

This form is intended to facilitate documentation of program-level outcomes assessment for accrediting agencies, advisory boards, and other internal or external audiences. For the purpose of following through on 2008 Self-Study recommendations, this information will be collected annually.

Date: Type Date Here Name of Person Completing Form: Type Your Name Here

Dept/Program: Type Your Department/Program Name Here

[Blank chart with five columns: (1) Program Outcomes; (2) Direct Performance Measures; (3) Indirect Performance Measures; (4) Major Finding(s); (5) What Actions Resulted from Finding(s)? The numbered footnotes below define each column.]

Suggestion: It may be easier to work the chart from right to left, beginning with documenting recent changes to the program curriculum in the "Actions" column.

1. Program outcomes identify the knowledge, skills, abilities, etc., that students should be able to demonstrate upon completion of the program. The outcomes need to be specific and measurable.

2. Direct measures require students to demonstrate their knowledge, etc., for faculty to then assess whether/how well students are achieving/have achieved a program outcome. Examples of direct measures include artistic work products, case studies, exams, juried performances, oral presentations, papers, and portfolios.

3. Indirect measures gather perceptions of whether/how well students are achieving/have achieved a program outcome. Examples of indirect measures include alumni, employer, and student surveys, exit and focus group interviews, enrollment and retention data, and job placement data. Indirect measures complement the data collected from direct measures and cannot stand alone as sole measures of student performance.

4. Programs should identify the major findings after analyzing the data collected.

5. Programs should provide evidence that the findings have been used to further develop and improve student achievement of program outcomes (i.e., actions that were taken as a result of data collection and analysis). It is also important to state when findings provide evidence that students are successfully achieving a program outcome.

Carnegie Mellon University

Sample Program-level Outcomes Assessment Chart

Program Outcome: Identify client needs and develop and communicate solutions to address the needs
• Direct Performance Measures: Final reports; class presentations; course artifacts
• Indirect Performance Measures: Employer satisfaction survey; alumni satisfaction survey
• Major Finding(s): Students were able to formulate good solutions but were not able to effectively communicate them to the intended audience.
• What Actions Resulted from Finding(s): Students will be required to submit draft documents to faculty for preliminary feedback and to practice oral presentations.

Program Outcome: Function effectively as a team
• Direct Performance Measures: Observation of team over time by faculty member
• Indirect Performance Measures: Peer survey evaluating team member performance
• Major Finding(s): Students do not know how to resolve interpersonal/intra-team conflict.
• What Actions Resulted from Finding(s): Students will participate in a conflict management workshop within the context of the class.

Program Outcome: Identify when a problem contains an ethical component and create an ethically defensible solution according to professional standards
• Direct Performance Measures: Oral and written case analysis; Defining Issues Test
• Major Finding(s): Students were able to identify ethical issues and create solutions but had difficulty defending the solutions.
• What Actions Resulted from Finding(s): Exploration of ethical issues will be more broadly addressed and integrated across the program curriculum.

Program Outcome: Use knowledge, skills and abilities to solve a problem in any context
• Direct Performance Measures: Fundamentals of Engineering Exam; evaluation of unstructured problem solving in capstone courses
• Major Finding(s): Students did not validate the effectiveness of their solutions.
• What Actions Resulted from Finding(s): Students will be required to write an analysis of the problem-solving process and its outcome(s).

Program Outcome: Utilize tools, techniques and skills to create an original work
• Direct Performance Measures: Externally juried reviews of student performances
• Major Finding(s): External reviewers consistently rated student performances at the highest proficiency level.
• What Actions Resulted from Finding(s): Curriculum currently achieves expected outcomes; no action at this time.

Program Outcome: Design and conduct experiments and analyze and interpret results
• Direct Performance Measures: Final reports
• Indirect Performance Measures: Alumni survey of students enrolled in graduate or professional programs
• Major Finding(s): Students need additional opportunities to design their own experiments.
• What Actions Resulted from Finding(s): Experimental design will be more widely integrated into the program curriculum.


Carnegie Mellon University

Program-level Outcomes Assessment Chart

This form is intended to facilitate documentation of program-level outcomes assessment for accrediting agencies, advisory boards, and other internal or external audiences. For the purpose of following through on 2008 Self-Study recommendations, this information will be collected annually.

Date: June 30, 2011

Name of Person Completing Form: Joe W. Trotter, Head

Dept/Program: Department of History (B.A. in History, B.A. in Global Studies, and B.A./B.S. in Ethics, History, and Public Policy)

Program Outcome I: Be able to explain continuity and change over time and place, by gathering, organizing, and interpreting evidence from primary and/or secondary sources that are relevant to particular historical contexts and appropriate to particular disciplines and/or course methodologies.
• Direct Performance Measures: Timeline and diagram assignments; mapping exercises/map quizzes; long essay exams (in-class); take-home exams; short answer/identification exams; graded contributions to discussion; oral presentations (group or individual)
• Major Finding(s): In personal exit interviews and/or on questionnaires completed by 17 out of 25 graduating primary and additional majors in May 2011, several students asked for more training in historiographical schools of thought and/or interpretive theories. Sample comment: "I would have liked to have learned more about the major critical and interpretive schools of thought."
• What Actions Resulted from Finding(s): Note for the "Actions" column: all three majors in the Department of History were newly implemented in fall 2009, less than two years ago. So far, only 5 students have graduated with the new EHPP major, only 5 have graduated with the new Global Studies major, and none have yet completed the new B.A. in History. We will begin the "Actions" process to further refine our new majors during the 2011-2012 academic year.

Program Outcome II: Be able to read texts (including entire books, routinely) and other media critically, to analyze evidence, arguments, and competing interpretations, and to challenge assumptions and values that underlie claims about the past and its relation to the present.
• Direct Performance Measures: Regular, extended readings (formerly known as "books"); long essay exams (in-class); take-home exams; required reading notes; journal responses
• Major Finding(s): In personal exit interviews and/or on questionnaires completed by 17 out of 25 graduating primary and additional majors in May 2011, students reported strong emphasis on and improvement in their critical reading skills. Sample comment: "All of the History courses I took at Carnegie Mellon included critical reading; I was particularly interested in courses that included readings from multiple points of view." However, several students asked for more training in historiographical schools of thought and/or interpretive theories. Sample comment: "I would have liked to have learned more about the major critical and interpretive schools of thought."
• What Actions Resulted from Finding(s): During the 2011-2012 year, the department's Undergraduate Education Committee will explore ways to increase historiographical emphasis in existing courses for majors, as well as the possibility (within staffing constraints) of creating a new course.

Program Outcome III: Be able to write analytical, historical arguments based upon the careful use of evidence, language, reasoning, and organization.
• Direct Performance Measures: Response papers; book reviews; oral presentations (informational or interpretive); book/document content analyses; film reviews; student debates; small group discussions of primary sources; graded contributions to discussion; short (5-10 pp.) analytical essays; long (20-30 pp.) research papers; interpretations of visual evidence; writing assignments to integrate primary and secondary sources; oral presentations (persuasive)
• Indirect Performance Measures: At the 2011 annual Western Pennsylvania Regional Conference of Phi Alpha Theta (the history honor society), eleven (11) of our students competed by giving historical research papers. Eleven out of eleven ranked highly, earning two "first place" and nine "second place" awards in different categories.
• Major Finding(s): Personal exit interviews and/or questionnaires completed by 17 out of 25 graduating primary and additional majors in May 2011 indicated that the department-wide emphasis on writing is effective. Sample comments: "I have also learned a lot about writing history"; "The program achieved both objectives [assembling sources and reading critically] well, while also helping me become a much better writer in the process."

Program Outcome IV: Be able to conduct historical, archival, or field research, independently and/or collaboratively, to integrate it with earlier scholars' work, and to present findings in written and/or oral formats that acknowledge sources properly, fully, and fairly.
• Direct Performance Measures: Long (20-30 pp.) research papers; ethnographic field notes; ethnographic and/or oral history interviews; interpretations of visual evidence; oral presentations (group or individual); graded contributions to discussion; historical simulations or role-playing exercises (based on research into different populations or points of view); student debates; small group discussions of primary sources; mastery of library resources, including specialized scholarly databases
• Indirect Performance Measures: In 2010-2011, six (6) members of the History faculty advised twelve (12) students on research projects presented at the annual "Meeting of the Minds" event.
• Major Finding(s): Personal exit interviews and/or questionnaires completed by 17 out of 25 graduating primary and additional majors in May 2011 indicated that most students felt strongly prepared as researchers. Sample comment: "I feel so much better prepared than many of my peers in terms of in-depth research and analytical thinking."

Program Outcome V: Be able to employ the knowledge and skills gained by studying the past to understand contemporary issues, to challenge inaccurate or unsupported claims, to make careful comparisons across time, space, and culture, and to take informed positions as students at an international university and as global citizens.
• Direct Performance Measures: Short and long essay assignments on the historical origins of contemporary issues; breadth and depth requirements in all three curricula; graded discussion leadership; graded contributions to discussion
• Indirect Performance Measures: For 2009-2010, History ranked first among Humanities Departments for the number of primary majors who studied abroad. [Source: Emily Half, Study Abroad Overview 2009-2010, 13 September 2010.]

Program Outcome VI (B.A. in History): Be able to articulate factual and contextual knowledge of specific places and times, to make careful comparisons (across time, space, and culture), and to discern how each generation (including theirs) uses the past for present purposes.
• Direct Performance Measures: [Capstone assignments]; knowledge of national (beyond U.S.), regional, and global historical development; knowledge of the world before 1900; research papers that integrate primary and secondary sources
• Indirect Performance Measures: Two (2) History majors studied abroad in 2009-2010.
• Major Finding(s): The number of students in the "new" majors in the History Department grew from 44 in fall 2009 to 110 in May 2011. The number of B.A. in History students grew from 7 in fall 2009 to 16 primary and 12 additional majors in spring 2011.

Program Outcome VI (B.A. in Global Studies): Be able to articulate complex understandings of the processes of globalization in the long- and short-term, by combining interdisciplinary, theoretical, and historical perspectives with cross-cultural knowledge and advanced language training.
• Direct Performance Measures: Policy-oriented research projects; written preparation and oral presentation of research proposals and preliminary results; peer critiques of written and oral work
• Indirect Performance Measures: Seven (7) Global Studies majors studied abroad in 2009-2010.
• Major Finding(s): The number of B.A. in Global Studies students grew from 11 in fall 2009 to 32 primary and 14 additional majors in spring 2011.

Program Outcome VI (B.A./B.S. in EHPP): Be able to persuade people to agree with their particular arguments and analyses; to conduct research under time and resource constraints; and to craft policies that address real-world problems in a way that is sensitive both to history and to competing sets of values.
• Direct Performance Measures: Book/film analyses; topical essays; normative essays; debates, mock trials, and legislative hearings; issue briefings; group projects, particularly crafting recommendations for policy makers
• Indirect Performance Measures: Seven (7) EHPP majors studied abroad in 2009-2010.
• Major Finding(s): The number of B.A./B.S. in EHPP students grew from 26 in fall 2009 to 33 primary and 3 additional majors in spring 2011.


Carnegie Mellon University

Program-level Outcomes Assessment Chart

This form is intended to facilitate documentation of program-level outcomes assessment for accrediting agencies, advisory boards, and other internal or external audiences. For the purpose of following through on 2008 Self-Study recommendations, this information will be collected annually.

Date: 6/30/2011 Name of Person Completing Form: Carol B. Goldburg

Dept/Program: Undergraduate Economics Program

Program Outcomes

Students should be able to identify, explain, and use economic concepts, theories, models; and data-analytic techniques.

Students should acquire and use knowledge and skills of economics, mathematics, statistics, and computing flexibly in a variety of contexts.

providing the foundation for success in graduate studies and careers in the public and private sectors.

Direct

Performance

Measures

In-class individual presentations

In-class group presentations

In-class quizzes/tests

Research projects

Indirect

Performance

Measures

Successful Application to

Graduate School

Employer Feedback

Senior exit interviews l-3'dyear end-of-year surveys

Meeting of the Minds presentations and awards

Discussions at Faculty

Meetings

Class discussions

In-class individual presentations

In-class group presentations

Successful application to

Graduate School

Employer Feedback

Alumni Feedback

Discussions at Faculty

Meetings

Major Finding(s)

What Actions Resulted from Finding(s)?

1) Most students can identify, explain, and use economic concepts, theories, models; and dataanalytic techniques.

2) Some students are frustrated that they cannot jump immediately into elective courses after taking just Principles of Economics (73-100).

3) Some students want their intermediate theory courses to spend time going over current events.

3) Introduce more current events/examples into the intermediate economic theory courses.

1) Some students find it difficult to transition form their theoretical training in statistics to econometric theory and applications.

2) While the faculty would like students to use economic/statistical data analysis programs

(e.g., e-views, R, etc), many students prefer to use Excel because, apart from proprietary inhouse data analysis packages. Excel is what many think they will use after their undergraduate degree program.

3) Few upper level clectives require econometric analysis (beyond OLS)

1) Review curriculum of identified upper level clectives to find ways in which data-analysis can be used effectively.

2) Help students to understand the value of distinguishing themselves via data-analytic skills that reach beyond the limited statistical capabilities of Excel.

3) Discuss at next Economics Curriculum Committee

Meeting (Fall 2011)

|

Students should be able to apply their economic tools to formulate positions on a wide range of social and economic problems and engage effectively in policy debates.

Direct

Performance

Measures

Class discussions

In-class individual presentations

In-class group presentations

In-class quizzes/tests

Research projects

Indirect

Performance

Measures

Successful Application to

Graduate School

Employer Feedback

Alumni Feedback

Discussions at Faculty

Meetings

Discussions at

Undergraduate Economics

Program co-curricular events.

j

, ..

, .

t>\ /

1) Students are successful in applying their economic tools to economic problems.

2) Some students have not been exposed to analytical frameworks that will allow them to effectively engage in policy debates on topics where "the sanctity of life" and other intangibles must be quantified.

Undergraduate Economics Program: June 2011

, from Fmding(s)?

,.

.

2) Discuss at next Economics Curriculum Committee

Meeting (Fall 2011)

Students should use investigative skills necessary for conducting original economic research and participating effectively in project teams.

Students should be able to deliver effective presentations in which they combine visual communication design with oral arguments and/or the written word.

Products Resulting From

SURG Grants

Senior Honors Theses

Senior Project Course

Peer Assessments of

Teamwork (based on articulated criteria)

Faculty Observations of

Teamwork (based on articulated criteria)

In-class individual presentations

In-class group presentations

Brief Written Responses

Written Essays

Written Reports

Employer Feedback

Alumni Feedback

Discussions at Faculty

Meetings

CMU Community Feedback

SURG Grants

1) Most students work well in teams. The majority of difficulties that arise can be traced to either a) cultural differences and/or b) work ethics.

2) When self-selecting into groups, the determinants are: academic ability, friendships, and nationality.

3) Many students have strong interests in pursuing research; however, they find the following barriers: a) identifying a faculty mentor, and/or b) setting up an individual research problem that can be accomplished in one term or even two terms, and/or c) having available summer funds so that they need not work elsewhere full-time.

4) More structure is needed for the Senior Honors Thesis Program.

5) Some of the very top students choose not to write a Senior Honors Thesis.

1) a) In some courses, faculty form groups and change group composition throughout the term.

3) a) The introduction of the new Economics Colloquium course (73-450) should serve as a faculty research introduction to the students.

c) Dennis Epple and Carol Goldburg will be working with Deans Robert Dammon and John Lehoczky to identify ways to help finance more summer research opportunities.

4) A new structure for the Senior Honors Thesis Program will be designed by the end of summer 2011. This will hopefully address point 5).

Meeting of the Minds presentations and awards

Alumni Feedback

Discussions at Faculty Meetings

CMU Community Feedback

Employer Feedback

1) Most students are strong public presenters with a keen sense of their audience, pacing, and the appropriate balance between the spoken and the graphic.

2) Many students would benefit from increased writing opportunities.

2) Discuss at next Economics Curriculum Committee Meeting (Fall 2011)


Carnegie Mellon University

Program-level Outcomes Assessment Chart

This form is intended to facilitate reporting program outcomes assessment to accrediting agencies, advisory boards and other internal or external audiences.

For the purpose of following through on 2008 Self-Study recommendations, this information will be collected annually.

Date: 7/11/2011

Program: Social and Decision Sciences

Name of Person Completing Form: John Miller and Connie Angermeier

Program Outcomes1

Direct Performance Measures2

Indirect Performance Measures3

Major Finding(s)4

What Actions Resulted from Finding(s)?5

1. Students apply frontier tools from the social sciences, particularly microeconomics, to understand policy decisions and outcomes and to describe, predict, and influence social systems.

* Written exams

* In-class individual presentations

* In-class group presentations

* Analytical essays

* Homework based on assigned readings (both field-specific and current news)

* Peer review

* Review from graduate Teaching Assistants

* Preparing multiple drafts of essays

* Senior and alumni surveys

* Some students have difficulty determining what is valid evidence in making arguments

* Some students are more prepared in economics than others

* Some students need to change how they view economics - from traditional microeconomics to behavioral economics

* Provide opportunities to participate in interactive economic simulations

* Extra opportunities for practice/study problems

* Require analysis of published news articles to better apply concepts taught in class

2. Students demonstrate how to write and speak about social science theories of individual and social behavior arising in economics, decision science, organizations, psychology, and political science, including results and debates.

* Written exams on theories and facts

* Writing assignments based on assigned texts and articles

* Preparing multiple drafts of essays

* Individual and group presentations and class discussions

* Analytical essays and term papers

* Preparing multiple drafts of essays

* Senior and alumni feedback/surveys

* Students often have difficulty sorting through contradictory findings about "facts"

* Students are able to identify and solve problems, but the quality and appearance of the work is substandard


* Provide opportunities to help students become better consumers of empirical research

* Provide opportunities to help students become better producers of empirical research

* Provide opportunities for students to share and learn about ethics and diverse perspectives within the social sciences

* Increased emphasis on communicating results - emphasis on total quality of work

* Require multiple drafts of written work to refine analytical abilities


* Homework based on assigned readings with focused questions for analysis

* Peer review

* Review from graduate Teaching Assistants

3. Students solve/explore unstructured real-world problems that require teamwork and contributions from diverse disciplines.

* Design and conduct novel empirical research

* Analytical essays, term papers, and reports

* Preparing multiple drafts of essays and reports

* In-class discussions

* Individual and group presentations

* Homework based on assigned readings with focused questions for analysis

* Peer review

* Review from graduate Teaching Assistants

* Review from external advisory board

* Senior and alumni feedback/surveys

* Some students are weak in presentation skills to non-peers

* Students have some difficulty synthesizing multiple disciplines

* Analytical ability in writing exceeds that in speaking

* Require multiple drafts of written work to refine analytical abilities

* Provide examples of analytical writing

* Extra opportunity for students to practice extended speech, to describe, narrate, express, and defend facts and opinions

* Emphasis placed on communicating results and total quality of work (specifically oral presentations, but also written work - students must be able to readily articulate connections between the theoretical and practical milieu to multiple audiences)

4. Students demonstrate independent learning skills and enthusiasm for the field.

* Senior thesis work

* Projects for study abroad and internships

* Projects for individual student research

* Projects for participating in ongoing faculty research

* Analytical papers for experiential education (ex: "Provide specific evidence as to the relative academic value of your internship. How does the work experience relate to your studies? What specific connections are there between the internship and the work you've done in your classes?")

* Reflective papers for experiential education

* Student reflective paper on experiential education (ex: "What one thing would you do differently to make this a better experience? How can you help your supervisor make this a more enriching experience?")

* Undergraduate, faculty, and alumni feedback/surveys

* In the SDS graduating class of 2010, approximately 26% of our students have experience abroad.1 Their projects/writings/presentations reflected their ability to be self-directed learners and demonstrated great enthusiasm for the entire learning experience. (Across the University, approximately 8% of undergraduates have studied abroad.)2

1 Data provided by the Study Abroad Office

* In the graduating class of 2010, approximately 40% of our students have participated in independent research and/or internships for academic credit (other students have done research or internships not for academic credit).

* Refine advising procedures to promote study abroad, independent research, and internships

* Offer various outlets for students to discover research/experiential opportunities


* Provide short-term experiential opportunities

Suggestion: It may be easier to work the chart from right to left, beginning with documenting recent changes to the program curriculum in the "actions" column.

1 Program outcomes identify knowledge, skills, attributes and/or capabilities students will demonstrate upon completion of the program. The outcomes need to be specific and measurable.

2 Programs should gather data to measure each stated outcome through direct measures (i.e., students demonstrate their knowledge, skills, etc.).

3 Indirect measures, where students, employers or others report their perceptions or observations of student/employee knowledge, skills, etc., can be provided but cannot stand alone as a sole measure of student performance.

4 Programs should identify the major findings after analyzing data collected.

5 Programs should provide evidence that the results have been applied to further the development and improvement of the program (i.e., actions that were taken as a result of data collection and analysis).

2 Data from CMU Factbook 2010-2011, Volume 25, Student Programs and Opportunities

Carnegie Mellon University

Psychology Department Program-level Outcomes Assessment Chart

This form is intended to facilitate reporting program outcomes assessment to accrediting agencies, advisory boards and other internal or external audiences.

For the purpose of following through on 2008 Self-Study recommendations, this information will be collected annually.

Date: Revised August 2011 Program: Psychology

Name of Person Completing Form: Sharon Carver, on behalf of the Psychology Department

Program Outcomes1

Goals for Psychology Learning Outcomes

1) Breadth of Knowledge in Psychology

Direct Performance Measures2

• The first set of program goals was adopted in 2006 in preparation for the Self-Study.

• At present, all of the Psychology assessments are within individual courses. We have no way of aggregating data to determine student progress relative to our program goals.

• The last survey of psychology graduates was conducted prior to our development of program goals.

• Undergraduates were involved in the 2010 Psychology Advisory Board process via focus group.

• Tests in Introductory Courses

• Discussions in Courses at all Levels

Indirect Performance Measures3

Major Finding(s)4

• Susan Ambrose & Elizabeth Whiteman reviewed the department's learning objectives to suggest improvements in specificity and wording in measurable terms (March, 2010).

What Actions Resulted from Finding(s)?5

• The revised goals have been distributed to faculty for use in their syllabi and annual reports (June 2010).

• Individualized suggestions re: strengthening syllabus communication of goals prepared for faculty in July 2011 re: Fall 2011 courses.

• Share goals with undergraduates, at least via the Psychology web site but perhaps also when they declare their majors.

• Tap into the departmental level data available from CMU's Institutional Research and Assessment via request of department chair

• Recruit our Undergraduate Administrator to conduct a survey focused on program improvement in the Spring of 2012.

• Not clearly sequencing the introductory curriculum

• Not providing coherence & breadth at the senior level re: Research

• Develop a sequencing plan

• Consider adding a senior "capstone" experience

• Not meeting the needs of students with Clinical Psychology interests

• Not keeping up with trends re: Bio/Psych and Cog Sci Tracks

• Faculty hire in clinical to improve course offerings, advising, internships, mock interviews, etc.

• Review major options to better utilize resources and meet student needs

• As of April, 2011, we have no system for aggregating survey or course performance.

• Have a faculty meeting discussion (Fall 2011) to brainstorm ways to gather direct performance data, including data from courses but also considering data that could be gleaned from advisor interactions (e.g., grant applications & success, Meeting of the Minds participation, post-graduation plans, etc.).

• Summer 2011 review of course goals to identify level of alignment with departmental goals. This analysis was restricted to courses that will be taught in the Fall 2011, with the clear purpose of guiding efforts to help faculty members improve Fall 2011 syllabi. Data revealed that most Introductory Courses listed goals that fit categories 1a-c but few included 1d & 1e. Feedback shared with faculty to begin listing those goals IF they are in fact part of the course objectives. One faculty member also included a goal to explain how apparently contradictory or unrelated theories can in fact be integrated.

1a) Describe multiple areas within psychology (e.g., social, cognitive, clinical, developmental, etc.), including theoretical perspectives, research findings, and their applications

1b) Identify theory, research, and applications in related disciplines (e.g., genetics, computer science, etc.)

1c) Explain diverse experimental paradigms used in psychology and related research areas

1d) Discuss the history of psychology within the primary area of study, including the impact of scientific revolutions, theory shifts, etc. on the choice of research questions, methods, etc.

1e) Describe ethical issues in conducting research

• Summer 2011 compilation of data from faculty annual report tables to determine which goals are being met and how well. Overall, this review revealed that faculty members have vastly different conceptions of how to complete this table. About 2/3 of the tables were completed at too global a level to be useful for identifying student progress relative to individual goals. The insights listed below were taken from the 1/3 of tables that included multiple rows for each course with progress impressions distinctly specified for separate goals.

• Explore ways to make the faculty annual report table more useful for program assessment purposes.

2) Depth of Knowledge in at least one area

2a) Synthesize disparate facts and theories in the primary area of study

• Papers in Advanced Courses (2 courses for all majors)

2b) Apply the research methods, experimental designs, and analysis techniques commonly used to investigate questions in the primary field of study

3) Proficiency in Information Search and Communication

• Senior Thesis

• Research Methods Group Project (2 courses for all majors)

• Senior Thesis

• Papers & Presentations in Advanced Courses (2 courses for all majors)

• As of April, 2011, we have no system for aggregating advanced course performance.

• Summer 2011 review of course goals to identify level of alignment with departmental goals. Goals in categories 2a and 2b were the most frequently listed across all courses. Many Advanced Course syllabi also included a depth of application goal that might be useful to add to the department goal list when the faculty members review it together in Fall 2011.

• Summer 2011 compilation of data from faculty annual report tables to determine which goals are being met and how well. There were several faculty who identified students' theoretical sophistication as the weakest area (2a) and several others who indicated student difficulty generating appropriate study designs to test specific hypotheses (2b).

• As of April, 2011, we have no system for aggregating honors thesis performance.

• As of April, 2011, we have no system for aggregating research methods performance.

• Beginning with the 2011 Senior Theses, the department chair completed a rating scale based on the relevant learning goals so that faculty can reflect on the performance data.

• Now that the rating scale has been pilot tested, give it to honors students and their advisors early in the year so that they are focusing on demonstrating progress in each of the required areas.

• Consider developing a rating scale that thesis advisors could complete early in the thesis process to assess the student's preparation for independent research.

• Collect poster session ratings?

• Have faculty submit their assessments of the papers and posters?

• As of April, 2011, we have no system for aggregating course performance.

• Summer 2011 review of course goals to identify level of alignment with departmental goals. Course goal data revealed that goals in these categories were pervasive in Research Methods and Advanced Courses, though not always explicitly mentioned. Goals related to learning to work in a team and to effectively lead a discussion were also common and might be useful to add to the department goal list. One faculty member also noted that there are different types of writing that we target, not just APA style (e.g., reflections, projects in formats other than standard research papers, etc.)

• Summer 2011 compilation of data from faculty annual report tables to determine which goals are being met and how well. Use of literature search tools was commonly rated as high, while depth of critique was less consistent. Most faculty members were impressed with the oral and written communication, though several mentioned that they included significant feedback on early versions in order to achieve this goal.

Senior Thesis

3a) Use psychology databases, e.g., PsychLit

3b) Read and critique psychological articles

3c) Deliver effective oral presentations

• Assignments and Papers in Methods and Advanced Courses

• Discussions in Courses at all Levels

• Research Methods Group Poster Presentation (2 courses for all majors)

• Advanced Course Presentations??

3d) Write effectively, using the format suggested by the American Psychological Association

• Meeting of the Minds Presentation of Senior Thesis

• Research Methods Project Reports (2 courses for all majors)

• Papers in Advanced Courses (2 courses for all majors)

4) Proficiency in the Investigation and Analysis of Behavior

• Senior Thesis

• Research Methods Project Reports (2 courses for all majors)

• As of April, 2011, we have no system for aggregating honors thesis performance.

• Beginning with the 2011 Senior Theses, the department chair completed a rating scale based on the relevant learning goals so that faculty can reflect on the performance data.

• As of April, 2011, we have no system for aggregating course performance.

• Summer 2011 review of course goals to identify level of alignment with departmental goals. The goals in these categories were most consistently listed for Research Methods courses, though some Advanced Courses also listed study design. Several of the research methods courses also mentioned learning to work effectively with subjects, particularly children, learning the IRB process, and learning to work as part of a team.

• Summer 2011 compilation of data from faculty annual report tables to determine which goals are being met and how well. Instructors of Research Methods courses noted that students do not uniformly enter the courses with the prerequisite Statistics knowledge and some struggle to design studies to test specific hypotheses.

• Senior Thesis

• As of April, 2011, we have no system for aggregating honors thesis performance.

• Beginning with the 2011 Senior Theses, the department chair completed a rating scale based on the relevant learning goals so that faculty can reflect on the performance data.

4a) Design and conduct psychological studies to address research questions

4b) Apply knowledge of statistical theory to choice of appropriate analyses

4c) Use statistical packages to analyze and interpret data

5) Dispositions of Curiosity, Critical Thinking, and Enthusiasm for the Field

• All Courses & Thesis

• As of April, 2011, we have no system for aggregating course performance.

• Summer 2011 review of course goals to identify level of alignment with departmental goals. Few courses actually list curiosity or enthusiasm as goals, though more include critical thinking. Feedback regarding this point was shared with faculty members in preparation for Fall 2011 syllabi, though some responded that it's too obvious and broad a goal to list.

• Summer 2011 compilation of data from faculty annual report tables to determine which goals are being met and how well. Across all course levels, the most uniform point mentioned in the assessment column of the report tables was the impressive level of interest and engagement of the students. Critical thinking was mentioned by several professors as challenging for students.

• As of April, 2011, we have no system for aggregating honors thesis performance.

• Beginning with the 2011 Senior Theses, the department chair completed a rating scale based on the relevant learning goals so that faculty can reflect on the performance data.

Suggestion: It may be easier to work the chart from right to left, beginning with documenting recent changes to the program curriculum in the "actions" column.

1 Program outcomes identify knowledge, skills, attributes and/or capabilities students will demonstrate upon completion of the program. The outcomes need to be specific and measurable.

2 Programs should gather data to measure each stated outcome through direct measures (i.e., students demonstrate their knowledge, skills, etc.).

3 Indirect measures, where students, employers or others report their perceptions or observations of student/employee knowledge, skills, etc., can be provided but cannot stand alone as a sole measure of student performance.

4 Programs should identify the major findings after analyzing data collected.

5 Programs should provide evidence that the results have been applied to further the development and improvement of the program (i.e., actions that were taken as a result of data collection and analysis).


Carnegie Mellon University

Program-level Outcomes Assessment Chart

This form is intended to facilitate reporting program outcomes assessment to accrediting agencies, advisory boards and other internal or external audiences.

For the purpose of following through on 2008 Self-Study recommendations, this information will be collected annually.

Date: August 5, 2010

Program: H&SS Information Systems

Name of Person Completing Form: Randy S. Weinberg

Program Outcomes1

Direct Performance Measures2

Indirect Performance Measures3

Major Finding(s)4

What Actions Resulted from Finding(s)?5

Research, analyze and articulate system requirements and business / development plans

Review of project documentation and project plans

Course examinations

Job placement (self-reported) as business / systems analysts and project managers

While students are capable of working in well-defined problem domains with limited ambiguity, dealing with real-world ambiguity and ill-defined requirements is a problem and frustration for many students.

Refinements to curriculum will continue to challenge students beyond obvious comfort level; projects in unfamiliar domains will be assigned.

Core courses will continue to give students more opportunity to deal with ambiguity.

Design effective solutions to meet organization and management needs for information and decision support

Evaluation of class projects and project reports

Follow-up reports from project clients on project deployments and problems

Graded Presentations

Student peer review of projects and project plans

Requests from previous project clients for subsequent team assignments

Students gain valuable experience, competence and confidence in undergraduate studies in designing solutions with familiar web and database technologies and practices; scaling up to enterprise thinking is a recurring shortfall.

Electives targeting larger systems, system architecture and enterprise planning will be offered more regularly.

Implement and test information systems using contemporary and leading practices and methodologies

Review of test plans and project status reports

Faculty observation of process

Reports from employers, student interns and alumni

As demonstrated in team and individual projects, students learn to develop realistic plans using familiar practices to meet course assignments. Scaling to new and unfamiliar environments is sometimes problematic.

Electives and projects will require students to create realistic plans and then manage their execution.

Faculty testing of class projects and term projects


Apply organizational, technical, economic and social aspects of information systems to real world problems

Course examinations

Acceptance rates of IS students into top tier graduate programs in IS

Written reflections on their learning experience in elective courses (Mgmt and Policy)

Work and communicate effectively in teams within organizations

Faculty observations on teamwork and professional communications, self-reports and peer evaluations

Feedback from project clients in project courses

Reports from employers, student interns and alumni

Teamwork continues to be an essential portion of IS student preparation.

Faculty observation of students' writing and presentations across the curriculum

Faculty will continue to research, develop and implement best practices in teamwork and professional communication.

Suggestion: It may be easier to work the chart from right to left, beginning with documenting recent changes to the program curriculum in the "actions" column.

1 Program outcomes identify knowledge, skills, attributes and/or capabilities students will demonstrate upon completion of the program. The outcomes need to be specific and measurable.

2 Programs should gather data to measure each stated outcome through direct measures (i.e., students demonstrate their knowledge, skills, etc.).

3 Indirect measures, where students, employers or others report their perceptions or observations of student/employee knowledge, skills, etc., can be provided but cannot stand alone as a sole measure of student performance.

4 Programs should identify the major findings after analyzing data collected.

5 Programs should provide evidence that the results have been applied to further the development and improvement of the program (i.e., actions that were taken as a result of data collection and analysis).

Carnegie Mellon University

Program-level Outcomes Assessment Chart

Date: June 14, 2010 Program: BS in Mechanical Engineering Name of Person Completing Form: Aubry, LeDuc, Michalek

Program Outcomes

A. an ability to apply knowledge of mathematics, science and engineering

Direct Performance Measures:
• External Reviewers of Capstone Design Projects
• Outcome-Specific: Assessment of homework performance on problems specifically involving the ability to use mathematics, science or engineering

Indirect Performance Measures:
• Alumni Survey
• Graduating Student Exit Survey
• Recruiter Survey

Major Finding(s):
• Students would benefit from a more formalized and standardized exposure to fundamentals of engineering statistics.
• 24-211 course has increased in workload due to innovative incorporation of additional computer lab content.

Actions Resulting From Finding(s):
• 36-220 Engineering Statistics and Quality Control included as a required course in our curriculum in place of a technical elective (this requirement can also be satisfied by taking Probability Theory and Statistics (36-217) or Introduction to Probability and Statistics I (36-225)).
• Three units added to 24-211 Numerical Methods course to reflect additional workload of new computer lab content.

B. an ability to design and conduct experiments, as well as to analyze and interpret data

Direct Performance Measures:
• External Reviewers of Capstone Design Projects
• Outcome-Specific: Assessment of laboratory reports in Thermal Fluids course

Indirect Performance Measures:
• Alumni Survey
• Graduating Student Exit Survey

Major Finding(s):
• Students desire updated and expanded experimental experiences and improved facilities.

Actions Resulting From Finding(s):
• Two courses added to increase student lab experience: (1) 24-321 Thermal Fluids Experimentation and Design and (2) 24-452 Mechanical Systems Laboratory.
• Electromechanical Systems course eliminated. Experimental experiences moved into new Mechanical Systems Laboratory course, and other topics moved to Dynamic Systems and Control course.
• Undergrad computer cluster refurbished. Main undergraduate experimental space refurbished. New lab equipment purchased and lab space upgraded. New experimental lab courses offered that effectively utilize space.
• Brand new experiments, in both thermal and mechanical systems, were identified and the required instrumentation purchased.

C. an ability to design a system, component, or process to meet desired needs within realistic constraints such as economic, environmental, social, political, ethical, health and safety, manufacturability, and sustainability

Direct Performance Measures:
• External Reviewers of Capstone Design Projects

Indirect Performance Measures:
• Alumni Survey
• Graduating Student Exit Survey
• Recruiter Survey

Major Finding(s):
• Material and implementation for the Engineering Analysis class was inconsistent from year to year.
• Students reported need for increased design experience, especially more quantitative and technical experiences to complement their open-ended design project.

Actions Resulting From Finding(s):
• The two courses (1) 24-370 Engineering Design and (2) 24-441 Engineering Analysis were replaced with a new design sequence consisting of (1) Engineering Design I: Tools and Methods in the Jr. year and (2) Engineering Design II: Conceptualization and Realization in the Sr. year.
• Replaced 24-321 Thermal Fluids Engineering with 24-321 Thermal-Fluids Experimentation and Design to incorporate required experience of design and realization of thermal systems.

D. an ability to function on multidisciplinary teams

Direct Performance Measures:
• External Reviewers of Capstone Design Projects
• Outcome-Specific: Assessment of questionnaires and interactions with team managers

Indirect Performance Measures:
• Alumni Survey
• Graduating Student Exit Survey
• Recruiter Survey

Major Finding(s):
• No problems identified.

Actions Resulting From Finding(s):
• No action required.

E. an ability to identify, formulate, and solve engineering problems

Direct Performance Measures:
• External Reviewers of Capstone Design Projects
• Outcome-Specific: Assessment of performance on examination problems involving identifying, formulating and solving engineering problems

Indirect Performance Measures:
• Alumni Survey
• Graduating Student Exit Survey
• Recruiter Survey

Major Finding(s):
• No problems identified.

Actions Resulting From Finding(s):
• No action required.

F. an understanding of professional and ethical responsibility

Direct Performance Measures:
• External Reviewers of Capstone Design Projects
• Fundamentals of Engineering Exam

Indirect Performance Measures:
• Alumni Survey
• Graduating Student Exit Survey

Major Finding(s):
• Need to strengthen ethical responsibility.

Actions Resulting From Finding(s):
• Ethics introduced in 24-302 Mechanical Engineering Seminar.

G. an ability to communicate effectively

Direct Performance Measures:
• External Reviewers of Capstone Design Projects
• Outcome-Specific: Assessment of reports in senior course

Indirect Performance Measures:
• Alumni Survey
• Graduating Student Exit Survey
• Recruiter Survey

Major Finding(s):
• No problems identified.

Actions Resulting From Finding(s):
• No action required.

H. the broad education necessary to understand the impact of engineering solutions in a global, economic, environmental, and societal context

Direct Performance Measures:
• External Reviewers of Capstone Design Projects
• Outcome-Specific: Assessment of user impact section of senior design final report

Indirect Performance Measures:
• Alumni Survey
• Graduating Student Exit Survey
• Student Advisory Committee (SAC)

Major Finding(s):
• Feedback for creating more global exposure for students.
• Deficiency in Graduating Student Exit Survey in Factor 9: "Impact of Engineering Solutions". In 2005, this was approximately 4.0 (from 1-7 with 7 being the highest). With implementation of changes, in 2008, this was approximately 5.2.

Actions Resulting From Finding(s):
• Created a study-abroad fellowship program to support student global experiences.
• Created a study abroad brochure, with ~20 programs abroad with courses easily transferrable to Carnegie Mellon.
• Created the International Service-Learning Engineering (ISLE) program to expose students to challenges and opportunities in other parts of the world (e.g., the Philippines), and help them place their engineering studies in a global context. Experience available to all students, strongly encouraged but not required.

I. a recognition of the need for, and an ability to engage in, life-long learning

Direct Performance Measures:
• Assessment of research section of senior design final report

Indirect Performance Measures:
• Alumni Survey
• Graduating Student Exit Survey

Major Finding(s):
• Identified need for more interactions in terms of life-long learning and advising.
• Deficiency in Graduating Student Exit Survey in Q34: "Advising/Computing: Academic advising by faculty". In 2005, this was approximately 3.6 (from 1-7 with 7 being the highest). With implementation of changes, in 2008, this was approximately 5.3.

Actions Resulting From Finding(s):
• Created an Undergraduate Teaching Fellows program for selected high-performing undergraduates to serve as computational or hands-on laboratory instructors.
• Hired full-time undergraduate advisor, Bonnie Olson, to provide student assistance on curriculum rules and choices. Faculty advising retained with emphasis on advice on technologies and career choices.
• Created faculty advisor / student advisee lunches, doubled the number of undergraduate graders, offering faculty-led advising seminars, and presenting student awards at faculty meetings.

J. a knowledge of contemporary issues

Direct Performance Measures:
• External Reviewers of Capstone Design Projects

Indirect Performance Measures:
• Graduating Student Exit Survey
• Student Advisory Committee (SAC)

Major Finding(s):
• Feedback for more student exposure to faculty research and cutting-edge topics.

Actions Resulting From Finding(s):
• Combined graduate/undergraduate courses in the areas of micro/nano-scale engineering, energy/the environment and innovative design/manufacturing.
• Five minute faculty presentations of research in required undergraduate courses.

K. an ability to use the techniques, skills, and modern engineering tools necessary for engineering practice

Direct Performance Measures:
• External Reviewers of Capstone Design Projects

Indirect Performance Measures:
• Alumni Survey
• Graduating Student Exit Survey

Major Finding(s):
• Students require additional exposure to modern CAD/CAE/CAM and experimental tools.
• Deficiency in Graduating Student Exit Survey in Factor 8: "System Design and Problem Solving". In 2005, this was approximately 4.8 (from 1-7 with 7 being the highest). With implementation of changes, in 2008, this was approximately 5.8.

Actions Resulting From Finding(s):
• Created 24-370 Engineering Design I: Tools and Methods to ensure our students learn modern CAD/CAE/CAM tools.
• Extended 24-211 Numerical Methods course with additional computer lab content.
• Created two experimental courses using modern experimentation techniques and equipment, 24-321 Thermal-Fluids Experimentation and Design and 24-452 Mechanical Systems Experimentation.

L. familiarity with college level mathematics and basic sciences

Direct Performance Measures:
• Analysis and discussions of ABET Criteria

Major Finding(s):
• Analysis and subsequent discussions of ABET Criteria revealed the need for additional mathematics and/or basic sciences exposure.

Actions Resulting From Finding(s):
• 36-220 Engineering Statistics included as a required course in our curriculum.

M. An ability to work professionally in both thermal and mechanical systems areas

Direct Performance Measures:
• Analysis and subsequent discussions of ABET Criteria

Indirect Performance Measures:
• Graduating Student Exit Survey

Major Finding(s):
• Additional design and realization of thermal systems needed.
• Deficiency in Graduating Student Exit Survey in Factor 14: "Laboratory Facilities". In 2005, this was approximately 3.9 (from 1-7 with 7 being the highest). With implementation of changes, in 2008, this was approximately 5.4.

Actions Resulting From Finding(s):
• Created 24-321 Thermal-Fluids Experimentation and Design to incorporate required experience of design and realization of thermal systems.

Mapping between Outcomes Assessment Tools and Program Outcomes

A number of methods are used to assess our Program outcomes. The mapping between outcomes and assessment methods is summarized in Table 1 below.

Table 1: Mapping between Assessment Methods and Program Outcomes. (The table marks, for each of Program Outcomes A-M, which of the following assessment methods apply: Alumni Survey; Graduating Student Exit Survey; Fundamentals of Engineering Exam; External Reviewers of Capstone Design Projects; Recruiter Survey; and Outcome-Specific Methods. The individual X markings are not legible in this copy.)

program outcomes

KNOWLEDGE AND SKILLS RELATED TO DESIGN METHODS AND APPROACHES

School of Design degree programs (only differences in program outcomes are noted): BFA Communication Design; BFA Industrial Design; MDes Interaction Design; MDes Communication Planning and Information Design

• effectively describe design ethics and integrate them into the practice of design
• effectively apply design research methods to all stages of the design process
• effectively implement environmentally and socially responsible design practices in all stages of the design process
• effectively communicate ideas orally, graphically, physically and in writing throughout all stages of the design process
• effectively implement a cohesive, iterative design process
• effectively work on design projects in disciplinary and multidisciplinary teams

direct performance measures

Observable objects: project plan; timeline; research notes; maps and diagrams; sketches/drawings; narratives (written and/or storyboards); recordings (video and/or sound); wireframes; sketch models; prototypes (physical or virtual); material/technology use; 2d compositions (text and/or image); 3d compositions; 4d compositions; aural compositions; interactive compositions; project and process documentation; production documentation; files for production; written papers; silkscreens; letterpress blocks; coding

Observable actions: field research; user studies; presentation (to a class or client); class discussion or critique; individual student/teacher interaction; team/class interaction

Sets of observable objects/actions: a set of objects/actions from the prior columns that collectively indicate process taken and decisions made (relationships/patterns)

(In the original chart, an X marks which direct performance measures apply to each program outcome; the individual markings are not legible in this copy.)

program outcomes

KNOWLEDGE AND SKILLS RELATED TO DISCIPLINARY CONTENT

School of Design degree programs (only differences in program outcomes are noted): BFA Communication Design; BFA Industrial Design; MDes Interaction Design; MDes Communication Planning and Information Design

• design useful form, basing decisions on the needs and desires of specific audiences, contexts, uses, and content (BFA Communication Design: focused on communication; BFA Industrial Design: useable form, focused on physical interaction; MDes Interaction Design: focused on interaction; MDes Communication Planning and Information Design: desirable form, focused on communication and information design)
• design 2d, 3d, 4d forms that are based on foundational design principles and reflect the needs and desires of audiences, contexts, uses, and content
• describe key historical and contemporary concepts, people, artifacts, and tools in communication design (respectively, industrial design; interaction design; communication and information design) and use them in the development of design work

direct performance measures

Observable objects: project plan; timeline; research notes; maps and diagrams; sketches/drawings; narratives (written and/or storyboards); recordings (video and/or sound); wireframes; sketch models; prototypes (physical or virtual); material/technology use; 2d compositions (text and/or image); 3d compositions; 4d compositions; aural compositions; interactive compositions; project and process documentation; production documentation; files for production; written papers; silkscreens; letterpress blocks; coding

Observable actions: field research; user studies; presentation (to a class or client); class discussion or critique; individual student/teacher interaction; team/class interaction

Sets of observable objects/actions: a set of objects/actions from the prior columns that collectively indicate process taken and decisions made (relationships/patterns)

(In the original chart, an X marks which direct performance measures apply to each program outcome; the individual markings are not legible in this copy.)

program outcomes

KNOWLEDGE AND SKILLS RELATED TO DESIGN PRODUCTION

School of Design degree programs (only differences in program outcomes are noted): BFA Communication Design; BFA Industrial Design; MDes Interaction Design; MDes Communication Planning and Information Design

• effectively describe the value of traditional and emerging design materials, technology, and processes and use them effectively in the development of design work
• effectively create design work for production/reproduction in/for various mediums

direct performance measures

Observable objects: project plan; timeline; research notes; maps and diagrams; sketches/drawings; narratives (written and/or storyboards); recordings (video and/or sound); wireframes; sketch models; prototypes (physical or virtual); material/technology use; 2d compositions (text and/or image); 3d compositions; 4d compositions; aural compositions; interactive compositions; project and process documentation; production documentation; files for production; written papers; silkscreens; letterpress blocks; coding

Observable actions: field research; user studies; presentation (to a class or client); class discussion or critique; individual student/teacher interaction; team/class interaction

Sets of observable objects/actions: a set of objects/actions from the prior columns that collectively indicate process taken and decisions made (relationships/patterns)

(In the original chart, an X marks which direct performance measures apply to each program outcome; the individual markings are not legible in this copy.)

SAMPLE - COLUMBIA UNIVERSITY

Attachment IV. Hypothetical Example of Ph.D. in Astronomy

Program Information

Name of the Program: Astronomy

Degree: Ph.D.

Department/Interdisciplinary Program: Astronomy

College/School: Graduate School

Contact Person

Name:

Title:

Email Address:

Phone:

Program Mission Statement

The Ph.D. program in Astronomy educates graduate students toward achieving an advanced level of understanding of modern astronomical concepts, and toward becoming capable of applying technology in the field, performing original thesis research, and communicating research results to the professional astronomical community.

Program Goals for Student Learning

Measures of Learning Outcomes

Goal 1 - Students will demonstrate advanced-level knowledge of astronomy.

A written and oral qualifying exam is given after students complete two years of course work. Performance on the exams is evaluated as excellent, very good, good, adequate, and not adequate.

The Graduate Education Committee reviews syllabi for required courses on which the qualifying exams are based and the pass/fail results on the exams to assess whether course work adequately prepares students for the test.

• A faculty committee (selected by the student and faculty mentor) evaluates the overall research project as excellent, very good, good, adequate, barely adequate, or not adequate.

Goal 2 - Students will design a scientific project, complete the research, and communicate the results in an oral presentation and a scholarly work.

• Each student is sent a project review letter each year after passing the qualifying exams which describes the strengths and weaknesses of the work.

• Students complete course requirements from one of these subfields: Theory, Computation, Observation, or Instrumentation.

Goal 3 - Students will develop expertise in an area of specialization within astronomy.

• The written and oral qualifying exams include questions to test the student's knowledge of the area of specialization.

Goal 4 - Students will make original contributions to the field of astronomy at the international level.

Goal 5 - Students will acquire the ability to obtain external grant support

• Students propose, design, complete, and successfully defend original thesis research before a faculty committee.

• By the end of their program, graduates will submit at least one refereed journal article for publication or make at least one presentation at a professional conference

• Students will prepare a mock proposal for an external research award, following the guidelines of the granting agency, for review by the department faculty


Goal 6 - Students will become proficient in designing and teaching college-level courses.

Students will serve as teaching assistants in courses under the direct supervision of the instructor for at least one year.

Students will teach independently for at least one semester.

Members of the Department faculty will attend at least two classes taught by each student each term and provide both a written and oral assessment of the quality of its content and the student's teaching performance.

Program Enhancement Based on Assessment Results

The Graduate Education Committee will review syllabi from courses on which the qualifying exams are based and results from the Qualifier, and make recommendations to the Chair once every three years.

The Graduate Education Committee will compare the quality of second year projects over a three year period and make recommendations to the Chair once every three years.

The Astronomy Department Assessment Committee (ADAC) will review stream course syllabi and the responses to the selected questions from final exams, and make recommendations to the Chair once every three years.

The ADAC will review publication and conference presentation records of students completing the Ph.D., and make recommendations to the Chair once every three years.


COMMITTEE COMMENTS TO

LEARNING OUTCOMES & ASSESSMENT WORKING GROUP REPORT

(NOV-DEC 2012)

From November 5, 2012 SGSC meeting:

SGSC members provided feedback on specific details of the report, as follows:

There is a definite benefit to graduate students in defining and assessing learning outcomes.

It is essential that the University provide some basic idea of the definition for a thesis, a project, extended essays, etc.

Learning outcomes would provide a rationale as to why a course is mandatory and what distinguishes a graduate course from an undergraduate course (especially in blended courses).

Questions were raised as to what resources will be available to help with the process. Answer: The VPA has committed, in the Senate-approved principles that will govern learning outcomes and assessment, to provide necessary resources.

There is a definite benefit for curriculum planning when learning outcomes are incorporated. This is consistent with the work already being done in the units that presently belong to accrediting bodies which require learning outcomes.

Graduate Student Society is supportive of learning outcomes provided there is graduate student representation at all levels of curriculum review.

SGSC does not believe that either the program-to-course or course-to-program model is preferable; units should be allowed to make their own choices.

It is appropriate to develop learning outcomes for components that are not course-based, such as a thesis. Outcomes may include mastery of the field of research or ability to produce a scholarly document for a journal, and better definitions for a thesis, project, etc., would help there.

SGSC favours individual units as the logical place for the development and administration of learning outcomes processes.

Learning outcomes are best handled by curriculum committees within the units, provided there is university-level support and staff. Beyond this, the training should remain within the TLC where webinars are already being conducted. With additional resources and admin support, TLC should be able to handle the process quite readily.

From November 7, 2012 SCUP meeting:

SCUP members provided feedback on specific details of the report, as follows:


• Program to Course option - "write report" should come later in the cycle or be relabeled as "prepare plan for data gathering".

• The proposals will require ongoing resources for support in order to make this successful and manageable, since the work will impact all aspects of program design and curricula.

• It may be easier to begin with applying these processes to graduate programs in the first instance.

• A correction is needed in the list of accredited programs under Engineering Science, which are accredited at the undergraduate level and not at the graduate level.

• Student involvement in the processes will be useful and important.

• How would unit ownership and production of data be reconciled with the need for reporting at the university level and "templates"?

• The Program to Course option may be less work in the long run, will balance against individual faculty investment in particular courses, and in general would be more transformative.

• Felt there was need for further discussion of the report, and as such the item was set to be revisited at the next SCUP meeting.

From November 21, 2012 SCUP meeting:

SCUP members provided feedback on specific details of the report, as follows:

• Clarification is required as to what "observable and measureable" means to SFU. Perhaps the word "assessable" should replace "measureable".

• Not every benefit from education can be quantified and assessed, as some are teaching goals rather than learning outcomes.

• There is concern about top-down, external imposition of bureaucracy versus articulating learning outcomes for internal purposes, for curriculum review and change.

• What about cost to Faculties and departments in implementing this process? What resources will be made available to departments, as this process will not be cost free?

• Concern regarding support and workload for faculty members and teaching assistants, as assessment of courses will be time consuming.

• Want examples of implemented assessment models at other universities, which should include how learning outcomes are assessed and reported for Humanities, Sciences, Business Administration, Social Sciences and some programs at the graduate level. Examples might show what the process looks like from the perception of a faculty member rather than the administrative level.

• Agreed that if process is rolled out through External Reviews, as proposed, it will help stagger the costs.


• Concern about how data might be misused; information could be massaged or manipulated to make the university look better, rather than presenting a candid assessment and reflecting deficits in student performance.

• Report is silent on the WQB curricula and the learning outcomes for these courses. P. Budra noted that an assessment of the WQB courses has not been conducted and perhaps it should be. S. Dench clarified that the committee had not reached agreement on this, and the alternative view is that learning outcomes are built into the WQB designation criteria and designated courses should be assessed as an integral part of every department's curricula. It was recommended to include a comment in the report regarding WQB courses.

• Suggested that report should address the relationship of learning outcomes to course performance; consider the question of whether achievement of setting learning outcomes is actually setting a minimum standard for the students.

• Edits suggested to language of "Appropriate Processes for SFU" point 1.

• A number of SCUP members supported the approach of developing learning outcomes first at the program level rather than the course level. This will allow for student involvement.

• Dean of Graduate Studies recommended beginning at the Graduate Program level, as there are fewer programs to assess and it would help to lessen the "fear factor".

• It was noted that neither every course nor every student needs to be assessed.

From November 28, 2012 Deans' Council meeting:

Decanal input:

• Suggested that certification be included. Education has programs that have to meet competency certification, and inclusion of these would further illustrate the degree to which the University is already working with learning outcomes and teaching goals.

• Suggestion that the crucial definitions be expanded to include a comment about the definition or nature of outcomes.

• Faculties vary in how they define graduate attributes. This would be a useful framework to start with, to help define program outcomes.

• Recommended that the report feature some examples of appropriate, assessable outcomes. J. Driver agreed a variety of examples will be appended to the report.

• Academic units have to be the determining bodies for establishing discipline-appropriate outcomes.

• Process of implementation is to be tied to the external review cycle.


From November 28, 2012 SCUTL meeting:

SCUTL members provided feedback on specific details of the report, as follows:

• After discussion of the LOAWG report, SCUTL felt it had more questions than answers; the issues it identifies arise from the view that while desirable in principle, implementation of learning outcomes and assessment may be more challenging than anticipated.

• Has no preference for one approach over the other; program-to-course or course-to-program depends on the unit/Faculty under review. Both approaches should be available to units.

• Considered the two options proposed regarding who will oversee the process and feels a combined resource, i.e., IRP and TLC, would be a better choice. Creating a new unit is time-consuming and in many ways redundant, given that TLC and/or IRP already are well equipped to oversee the learning outcomes and assessment proposals.

• SCUTL feedback implies that there are some Faculties unaware of the process and how/why it is being pursued. Some Faculties see this as an exercise they will have to do to obtain accreditation, even if they don't require accreditation. The key is that if faculty [members] don't respect the process, it won't be beneficial. Furthermore, the question as to whether a curriculum review would be a better instrument was raised.

• There is uncertainty as to how these goals would be implemented. In particular, who will do the work? Will faculty members be asked to take on the process without consideration of their existing workloads? Experience suggests that to properly implement LOA goals, a year or more of work is needed. SCUTL felt that without clear benefits being known, there will be reluctance to engage in the exercise.

• Strongly believes that whatever metrics for assessment of departmental-level work are developed, they cannot be skewed to university-wide goals.

From December 6, 2012 SCUS meeting:

SCUS members provided feedback on specific details of the report, as follows:

• In the report's "Crucial Definitions" the definition for "Learning Outcome" mentions skills but not understanding and values. This seems a gap.

• It is reductionist to try to distill learning outcomes to a single indicator that can be tracked. Not everything [instructors] do can be assessed, and [instructors] do not want to focus solely on that which can be assessed.


• "Assess" may be a misleading term. Seems limiting, but can be quantitative or qualitative.

• Resource implications cannot be overlooked. Cost-benefit analysis should be done to find the right balance with assessment.

• Is identifying learning outcomes at the outset of a course contradictory to a learner-centred approach, predetermining what students will learn?

• What happens when you measure a student's performance at level X? Would "continuous improvement" then imply X++? Seems like this means something for grading. If not, then an alternative term to "continuous improvement" should be considered.

• Is a broad policy review (particularly teaching and learning policies) needed in regard to this proposal?

• There will likely be a need to incentivize a change in behaviour.

SUMMARY OF UNIVERSITY COMMUNITY COMMENTS TO

LEARNING OUTCOMES & ASSESSMENT WORKING GROUP REPORT

(OCT 2012)

As of Dec. 17, 2012, 26 responses were received (some of which were multi-authored or endorsed by a group of individuals). Comments were also received from Deans' Council and four Senate committees (SCUP, SCUS, SCUTL, SGSC). The following themes were identified in the responses and comments.

General critiques

• Challenge the assumptions underpinning the report (that LOA should be adopted).

• Challenge the need for accountability or more instrumental justification for university education.

• Adoption of LOA is a fait accompli.

• Process is top-down assertion of LOA.

• We need critiques of efficacy of LOAs from research and experience of other institutions.

• Introduces commodification and conformity to learning.

• Adoption of LOA should be distinct from "accountability".

• SFU should be a leader, not a follower.

Students

• Will provide transparency and rationales for requirements.

• Find mechanisms to involve students (or, must involve students).

• Will this introduce "minimum level" standards approach to course performance and grading?

• This approach may not be "learner centered" as currently students learn things the teacher may not articulate.

• SFU students are already well-prepared.

• Students have a diversity of goals, not limited to learning outcomes.

• Learning experience will not be improved by LOs.

Faculty

• Impacts on workload will be excessive.

• Faculty already assess appropriately.

• LOs will change faculty-student interactions and negatively impact how faculty assess students' work.

• Faculty members are already transparent with and accountable to students.

• Teach to a diversity of goals, not a limited number of outcomes.

• Infringement on academic freedom.

• Takes pedagogical decisions out of the hands of faculty members and curtails their creativity.

Curriculum

• Incorporate into curriculum committees' work.

• Curriculum reviews would be better process to utilize than LOA.

• Reductionist and instrumental approach.

• Learning outcomes are not just about measurable skills and competencies - they include understanding, values, ethics, etc.

• LOs are a top-down approach to reforming curriculum.

• LOs attached to courses/programs and not to instructors will be too general to be useful.

• LOs incompatible with experiential education (how can "experience" be measured?).

• Report is silent on the WQB requirements - will these be included? Why not start with an assessment of WQB learning outcomes?

Model/approach

• Unit level control important, rather than top-down approach.

• LOs must be discipline appropriate.

• Allow choice of which model to undertake (program to course, course to program).

• Program to course would make most sense.

• Utilizing External Review process would allow staggered implementation.

• Iterative process described will result in conformity and teaching to the outcome.

• How can local "ownership" of process and data be reconciled with university-level reporting and standardized instruments such as templates?

Supports

• University level support, resources, and expertise important.

• What is the cost?

• Combine resources of TLC and IRP (support for and concerns about this).

Linkages to accreditation

• Imposition of LOs only an exercise to ensure accreditation.

• Concern over terms such as "quality assurance" and "continuous improvement".

• Serves managerial purposes only.

• Evidence from other jurisdictions that this approach is detrimental.

• Tail wagging the dog - if we didn't pursue accreditation, we wouldn't be looking at this.

Assessment and metrics

• Data can be skewed to university goals and/or manipulated.

• Need to reconcile reporting at university level and "templates" with unit ownership of data.

• Reductionist approach - does not capture benefits of university education that are difficult to measure and that vary from one faculty member to another.

• Must include quantitative and qualitative approaches.

• Templates developed may be incompatible with unit approaches to assessment.

• LOs that must be "measurable" will be stultifying and too limiting, particularly regarding application of LOs in the affective and aesthetic domains.

Clarifications requested

• Definition of LO (various additions/changes suggested).

• What do "observable" and "measurable" mean?

• Provide details on costs and/or cost-benefit analysis.

• Examples of LOs and assessment models are needed, particularly those that would be implemented at SFU.

• Define "continuous improvement" or find another term.

• Should provide evidence of benefits to learning related to LOs.

Suggestions to move forward

• Begin or pilot at Graduate level.

• Consider an SFU-wide or Faculty-by-Faculty discussion of graduate attributes.

• Find alternative approaches to institutionally-driven LOA that are "leaner" and locally-based.

• Allow departments to adopt voluntarily.

• Include linkages to Aboriginal and Indigenous LOAs.

• Allow more time to debate.

• Discuss further with NWCCU other ways that SFU might demonstrate that rich learning occurs.
