Assuring Learning

Participatory Process

Workshopping the mapping process with staff was important in securing initial engagement, particularly as the mapping process in some cases led to collaborative problem solving around curriculum and assessment structure. It was important to create a critical space where academic staff could discuss the relationship between the content and the attributes. Along with building this buy-in amongst academic staff, maps are more likely to reflect the program and unit content when the staff who deliver that content take part.

UTS

At UTS Business School the emphasis on a participatory process involved sitting down with subject coordinators and having them work through how the graduate attributes and program learning objectives fit into their subject. Using the Subject Overview Spreadsheet (SOS), subject coordinators collaborated not only in mapping the attributes across the program, but also in identifying and resolving issues around their distribution and gaps in the curriculum. While the teaching and learning team facilitated the process and did some of the early work of entering details into SOS to hand back to the subject coordinators, the process centred on the involvement of academic staff.

UNSW

At the Australian School of Business at UNSW, initial work on mapping was done through workshops where unit coordinators in program/discipline teams were asked informally to indicate which graduate attributes were involved in their assessment tasks. Using Post-it notes, they were asked to map out the distribution of the attributes across assessment tasks through a program or major, from which a number of gaps and overlaps were identified and discussed. The resulting maps from this exercise were developed by the ASB Teaching and Learning team and then presented back to the program directors and unit coordinators, who were responsible for any changes.

Griffith

Griffith Business School held a number of workshops off-campus, where staff worked through charting the learning goals over the course of the program on butcher’s paper. This included unit and program coordinators, heads of departments and the dean of learning and teaching. The process of refining the map was continuous, primarily taking place by email, but with additional yearly workshops to go over the process again to make sure the mapping reflected the way each unit was being delivered.

SCU

SCU engage in participatory mapping by email, sending out a spreadsheet with the attributes, which lecturers fill in for their individual units. The collaboration and negotiation occur at the level of discipline groups, who share out the assessment of the required attributes across the degree or major. A process of reflecting on the coverage of graduate attributes at the end of the semester also feeds into this.

Edith Cowan

Development of the law program at Edith Cowan University began with the embedding of new university-level attributes, along with the threshold learning outcomes (TLOs), into the program. This came about as part of a collaborative process of workshopping the TLOs and capturing current practices in order to identify strengths and gaps. This then led into a redesign of unit and learning tasks guided by the gaps identified.

Fostering a Program-Wide View

The process of manually working through how programs and units align with attributes was often talked about as being important in getting subject coordinators to take a program-level view of attributes. Significant engagement seemed to start from the moment subject coordinators could see the bigger picture of the program and how their unit fits with the development and assessment of attributes. This often created the opportunity for collaborative problem solving in addressing gaps, or in identifying assessments in the program that reflect the attributes for measurement. Most of the participants described a very similar process of teaching and learning staff presenting the graduate attributes to unit and program coordinators and having them map the attributes throughout the course or unit. This was typically done with specialist software, spreadsheets, butcher’s paper, or Post-it notes.

UTS

At UTS Business School, specially developed software (the Subject Overview Spreadsheet) was used to present how unit-level assessments fit in at a program and faculty level, drawing on program and subject coordinators’ own knowledge of the program. The presentation of this information through SOS made gaps and overlaps over the course of the program clear, and also identified how particular assessment types (multiple choice, essays, case studies) were distributed over the course. Being able to present all this information seemed to be important in fostering a program-wide view.
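
The Subject Overview Spreadsheet itself is an internal UTS Excel tool and its layout is not reproduced here. As a rough sketch of the kind of unit-by-attribute matrix such a tool collates (the unit codes, attribute names and mapping below are hypothetical, not UTS data), gaps and overlaps across a program can be read straight off the matrix:

    # A minimal, illustrative sketch of a program-level mapping matrix.
    # Unit codes, attribute names and the mapping itself are hypothetical.

    program_map = {
        "BUS101": {"communication", "teamwork"},
        "BUS202": {"critical thinking"},
        "BUS303": {"communication", "critical thinking", "ethics"},
    }
    attributes = {"communication", "teamwork", "critical thinking", "ethics", "research skills"}

    # Which units claim to develop or assess each attribute?
    coverage = {attr: [unit for unit, attrs in program_map.items() if attr in attrs]
                for attr in sorted(attributes)}

    for attr, units in coverage.items():
        if not units:
            status = "GAP: not addressed anywhere in the program"
        elif len(units) == 1:
            status = f"addressed once only, in {units[0]}"
        else:
            status = f"addressed in {len(units)} units: {', '.join(units)}"
        print(f"{attr:18s} -> {status}")

Laid out this way, the same table shows at a glance both the attributes no unit addresses and the attributes many units pile onto, which is the kind of information the workshops discussed.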

Melbourne

An ongoing process of review takes place at the Faculty of Business and Economics at the University of Melbourne, where program directors are asked to code all unit objectives against program learning goals and outcomes. A FileMaker Pro database is used to present how the program learning goals are distributed over the units. The mapping is updated every year, which feeds into a program review every five years.

QUT

In the QUT Business School all programs map to five key learning goals. For majors within programs, the goals are adapted to reflect discipline needs. Mapping was initially done in the core units, and then discipline staff were given the task of building on the core units and showing the sequential development of program goals across the units within the major. This required taking a view of the program as a whole and observing how each unit fits into it in relation to the attributes.

Charles Darwin University

In the law school at Charles Darwin University the degree was developed collaboratively with staff. While it was a complex and time-consuming task, it was thought to be formative for the teaching team, as everyone was able to get a program-level view and ownership of the program as a whole.

Mapping the Development of an Attribute Over the Program

Mapping the development of attributes over the program helps to encourage a more systematic discussion about curriculum structure beyond assessment near the end of the program. Mapping the progression towards the graduate level of attributes created opportunities to consider how to improve development by changing the substance or structure of a program. This also meant that there were multiple sets of data, allowing some analysis of the progression as well as assuring the learning at the end of the program. This additional data creates the possibility for very detailed changes to programs, courses, and assessments that have been shown to be effective in developing a particular attribute.

SCU

As SCU map by course rather than by individual assessment items, lecturers are asked to indicate the extent to which each attribute is covered in their course: 1) content only; 2) the attribute is assessed in the course; 3) students cannot pass the course without demonstrating the skill. The map of the program provides a broad sense of where the attribute is developed and assessed.

CQU

CQU build levels of attributes into their process, along with individual assessment items. Learning outcomes and assessments are rated in terms of the level of the graduate attribute demonstrated: 1) introductory; 2) intermediate; 3) graduate. These levels are used to show that the relevant attribute has been developed over the course of the program.
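
As a simple illustration of how such level ratings can be used (the unit codes and ratings below are hypothetical, not CQU data), the level recorded for an attribute in each unit can be checked to confirm that the attribute both progresses and reaches the graduate level by the end of the program:

    # Hypothetical ratings of one graduate attribute across the units of a program,
    # using the three levels described above: 1 = introductory, 2 = intermediate, 3 = graduate.
    levels = {"ACC101": 1, "ACC205": 2, "ACC210": 2, "ACC305": 3}

    # Order the units (here simply by unit code) and read off the progression.
    progression = [levels[unit] for unit in sorted(levels)]

    reaches_graduate_level = max(progression) == 3
    never_regresses = all(a <= b for a, b in zip(progression, progression[1:]))

    print("Progression:", progression)
    print("Reaches graduate level:", reaches_graduate_level)
    print("Develops without stepping back:", never_regresses)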

UTS

Mapping at UTS Business School involves working out, and communicating to students, where the program learning objectives are introduced, developed and assessed. This allows for more detailed adjustment when it comes to closing the loop, but is also designed to promote reflection amongst students on their progression towards graduate level.

Flinders

In the School of Law at Flinders, the law program was redeveloped over a series of meetings where there were discussions about the sequence of skills developed in the program and where performance of the skills at an introductory, intermediate and exit level is assessed.

Mapping by Assessment Task

Mapping at the assessment level allowed for a greater level of clarity and detail in understanding the distribution of graduate attributes over the program. Where this occurred, staff were able to very clearly indicate what activities were undertaken to develop and assess the attributes, unpacking the black box of courses to allow for reflection and improvement at this very detailed level. There was also the opportunity to communicate to students the relationship between the assessment task and the graduate outcomes, and in some cases students were asked to reflect on the last time they undertook work related to that attribute. For universities with embedded assurance of learning measurement, this was also useful in being able to draw on the marks for the parts of an assessment related to the graduate attribute.

UTS

At UTS Business School there is a descriptor for every course in the UTS handbook that contains all the learning objectives. For each of the assessments in that course, the course outline shows which program learning objectives are being addressed, as well as the subject learning objectives. These maps are available to students so they can see how particular skills are developed over the course of the unit. Students are also prompted to look back at the last time they addressed that particular outcome in their program.

QUT

In the QUT Business School, assessments are marked against criteria that are linked to a program learning goal, with the link made evident in ReView. The program learning goals encompass all of QUT’s graduate attributes. This means the mapping shows how each assessment relates to the graduate attributes.

CQU

The process of mapping at CQU involved every unit coordinator mapping each of their learning outcomes and assessment items to the graduate attributes, allowing gaps in the program to be identified.

Student Awareness of Attributes

A number of universities talked specifically about making students aware of, and having them reflect on, the graduate attributes. This was seen as an important part of helping students gain a sense of progression over the course of the program so they could reflect on and engage in their own learning. The next step beyond this was to turn student engagement in thinking about graduate attributes into a portfolio of achievements towards particular attributes.

Curtin

A distinctive system of graduate attribute mapping was used at Curtin’s Distance Education School. Graduate attributes are mapped against an employability skills framework. Students are asked to record their past extracurricular learning and previous studies, which are then combined with their current studies to produce a Career Point Index in line with the graduate attributes. Opportunities are then delivered in line with building up aspects of students’ Career Point Index through extracurricular learning activities. Students are encouraged from early in their program to start planning and developing their Career Point Index aligned to their desired career path.

UTS (Business)

The e-portfolio in the Bachelor of Business at UTS Business School is built around the graduate attributes. Students are prompted to find examples of how they have demonstrated each of the attributes through their coursework and extracurricular experiences. This served not only to highlight, for themselves and for future employers, the skills and attributes they had developed over the course of their studies, but also to identify areas for further development.

UTS (Law)

The School of Law at UTS received a grant to develop a website that explains to students the importance of the graduate attributes, contextualised to careers in law. It includes an explanation of different levels of performance of these attributes, how they link to legal practice, and a number of checklists for students to consider where they are in the development of the skills. Of particular note is the inclusion of videos of legal practitioners explaining the importance of the attributes to their work.

http://www.law.uts.edu.au/graduate-attributes/index.html

UWA
Learning outcomes feature on the law program’s website, and student outlines contain very specific learning outcomes that make the connection between assessments, units, and outcomes.
Mapping with Capstones Across the Program
For universities with capstones or compulsory units at the end of the degree, mapping was simplified by being able to catch all students who would otherwise be spread across many units and majors. Capstones were best used as a point of assessment following the development of an attribute over the program, allowing for more specific adjustments.
Bond
At Bond, capstone units are mapped as a point of assessment for particular learning objectives that have been developed through tasks earlier in the degree. The results of assessments related to that outcome are only of interest if there is a problem with demonstrating the graduate level of the attribute at the capstone. The review process then looks at all points where the attribute is developed.
QUT
In the QUT Business School, each major has a capstone unit. All goals are introduced in the core units, with further development of the goals in the major and outcomes typically measured in a capstone or final-semester unit. Mapping the program commenced in the compulsory core units and then proceeded to identify how the units in majors further developed the program learning goals.
Use of Mapping Software or Analogues
Mapping relies on the graphical representation of the attributes and how they are distributed across the program. The clear presentation of this information for discussion and debate amongst staff was important for the quality and integrity of the process. Some of the universities used specialist software, while others relied on Excel spreadsheets, or even Post-it notes or butcher’s paper.
UTS
UTS Business School use an Excel template developed in-house for mapping attributes (the SOS). The tool helps with collating the distribution of assessment tasks related to the attributes at a broader level (major or program), along with identifying the types of assessment used. Being able to present how course-level assessments fit in at a number of levels was particularly useful.
UNSW
At the Australian School of Business at UNSW the initial work on mapping was done through workshops where unit coordinators were asked to indicate which graduate attributes were involved in their assessment tasks on a table drawn on large whiteboards. From this, maps were made in Excel and disseminated to the schools. Some schools went on from there to fill in gaps in their units.
Consistent Criteria for Attributes across Programs
The use of consistent criteria in assessing an attribute across programs meant that the same rubrics were either embedded in assessments or used by external assessors. This consistency allows for useful comparisons across programs, as well as more developed and thought-through criteria. While some universities argued for being able to adapt criteria to make them more content- and course-specific, there seemed to be some clear advantages in consistent criteria.
UTS
At UTS Business School, consistent criteria for attributes are embedded into assessments using ReView. For each program learning objective an assessment rubric breaks the objective down into two or three criteria, with markers indicating the student’s level of achievement on each of these. Because all students in the faculty are marked against the same criteria, there are opportunities to benchmark across programs, and to have fairly high-level discussion and feedback on the suitability of the rubrics at the subject and program level.
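
ReView is a commercial marking tool and its internal data format is not described in the source; the sketch below uses a plain Python structure, with hypothetical objective, criterion and level names, simply to illustrate the idea of one shared rubric per program learning objective, broken into a small number of criteria that every marker applies in the same way:

    # A minimal, hypothetical representation of a shared rubric: one program learning
    # objective (PLO) broken into criteria, each marked on the same achievement levels.
    LEVELS = ["below expectations", "meets expectations", "exceeds expectations"]

    rubric = {
        "PLO1 Written communication": [
            "Structure and logical flow of the argument",
            "Clarity and accuracy of expression",
            "Use and acknowledgement of sources",
        ],
    }

    def record_mark(plo, criterion, level):
        """Check that a marker's judgement uses the shared rubric and agreed levels."""
        assert criterion in rubric[plo], "criterion must come from the shared rubric"
        assert level in LEVELS, "level must be one of the agreed achievement levels"
        return {"plo": plo, "criterion": criterion, "level": level}

    mark = record_mark("PLO1 Written communication",
                       "Clarity and accuracy of expression",
                       "meets expectations")
    print(mark)

Because every subject records judgements against the same criteria and levels, results captured this way can be compared across subjects and programs.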
QUT
QUT Business School developed generic rubrics for each undergraduate and postgraduate learning goal through a collaborative and consultative process. The rubrics are useful in communicating to staff and students the criteria and performance standards expected for each learning goal. Discipline teams in majors were responsible for adapting the generic rubrics to meet discipline needs. This was seen as important for staff engagement.
Bond
Bond use a rubric developed for each of the learning objectives. For each learning goal there is a separate assessment team, which takes a sample of assessments from the capstone and marks them against the rubric for that goal.
Embedding Measurement
The main divergence in approaches to measuring assurance of learning was the decision either to embed measurement or to use external examiners (a category that included teaching and learning staff). Where there was a strong institutional commitment to assurance of learning, and significant staff engagement, embedded assessment seemed to be the best option. Embedded assessment means that the measurement of the graduate attributes is built into the assessment of students, eliminating the need for double marking or for using marks from assessments that may involve different criteria. There was also the additional benefit of normalising assurance of learning within the system.
UTS
At UTS Business School, subject coordinators integrate the rubrics for program learning outcomes into assessment tasks; the results are then drawn on to report on particular learning outcomes. This is done through the ReView software, allowing all marking to be done online.
QUT
QUT Business School embeds assurance of learning into student assessments using ReView, with student learning typically assessed in a capstone unit or other unit at the end of the program. In ReView, assessments are marked with standard criteria which are linked to a learning goal. The overall performance for each learning goal is an aggregate of all of the student results from that learning goal across all linked assessment criteria. Embedding assurance of learning into routine activity and systems was seen as essential to build assurance of learning into the culture of the university, having all staff engaged in and reflecting on how units and programs develop the learning outcomes.
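
The exact aggregation ReView performs is not described in the source; as a rough sketch of the idea (hypothetical criterion-level results, plain Python), overall performance on a learning goal can be summarised by pooling every student’s result on every criterion linked to that goal:

    from collections import defaultdict

    # Hypothetical criterion-level results linked to learning goals (not QUT data).
    # Each tuple: (student, learning goal, criterion, score as a fraction of that criterion's marks).
    results = [
        ("s1", "Teamwork", "contribution to group task", 0.8),
        ("s1", "Communication", "report structure", 0.6),
        ("s2", "Teamwork", "contribution to group task", 0.5),
        ("s2", "Communication", "report structure", 0.9),
        ("s2", "Communication", "oral presentation", 0.7),
    ]

    # Overall performance on a learning goal = aggregate of every student result
    # on every criterion linked to that goal (here, a simple mean).
    by_goal = defaultdict(list)
    for student, goal, criterion, score in results:
        by_goal[goal].append(score)

    for goal, scores in sorted(by_goal.items()):
        print(f"{goal}: mean {sum(scores) / len(scores):.2f} over {len(scores)} criterion results")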
Bond
In assessments at Bond law school, knowledge requirements are built into the assessments, so performance on particular skills can be drawn on at multiple points in the program from the part-marks related only to that criterion.
External Examination
For universities where getting buy-in was more difficult, external examiners tended to be used. Primarily this was motivated by not wanting to add to the workloads of lecturers and of course and program coordinators. In most cases this involved taking assessment items (or a sample of them) and having external examiners, teaching and learning staff, or the lecturers re-mark them using criteria in line with the attributes. This removed responsibility for assurance of learning from course and program coordinators, but, particularly in research-intensive universities, external examination was the main way to measure achievement towards the graduate level on the attributes.
Bond
Bond take a sample of assessments from capstone units and use an assessment team (there is one for each of the learning goals) to apply the rubrics. The main advantages seem to be in not having to impose additional work on lecturers, and in responding to demands for external accountability. However, these seem to apply more in contexts where there is difficulty in securing the buy-in required to embed rubrics.
Monash
At the Faculty of Business and Economics at Monash, measurement occurs within the quasi-capstones for each of the majors. Each department forms an assurance of learning measurement group, which selects a subset of questions to be used for AOL at the program level. They use discipline-based assessment teams, with more specific rubrics developed as the result of a pilot of the process. The process is essentially re-marking, as it involves going back to a sample of assessment items and applying a rubric (exceeds, meets or fails expectations), although only for the relevant aspects of an assessment item. Sampling was seen as a way of reducing the workload.
UNSW
The Australian School of Business at UNSW has now implemented assurance of learning (AOL) processes and uses random samples of marked student assessment tasks in identified units. The program director and teaching team identify the unit and assessment task most suitable for assuring learning in the program. The process has mainly been to assess student work for AOL separately from the lecturer’s regular assessment grading, using external markers (external to the unit but usually internal to ASB) and a rubric which focuses on one graduate attribute/program learning goal; however, increasingly, the rubrics are being embedded in the regular unit marking guides so that external marking is not necessary, and a random sample of the students’ marked work can be used for AOL reporting. In some cases, rubrics for several graduate attributes/learning goals are used for the same task (e.g. written communication and critical thinking).
Melbourne
The Faculty of Business and Economics at the University of Melbourne undertake assurance of learning in a unit that maps well to a learning goal or objective, particularly one that is compulsory or has a high enrolment. The measurement then involves the lecturer identifying a piece of assessment they think assesses the achievement of the objective. Marking occurs independently, at three levels (low, meets, or exceeds expectations). The external assessment occurs through the teaching and learning office, with external assessors organised and paid externally.
UWA
UWA Business School apply rubrics embedded within assessments, but with external assessment for some attributes (e.g. oral communication). Rubrics are used for attributes like communication and teamwork, but not for more complex attributes such as knowledge. Generic rubrics were made and distributed to allow program staff to come up with their own. Actual marks are collected, sometimes covering only part of an assessment (e.g. an exam question).
Use of Multiple Measures of Assurance of Learning
Some universities emphasised the importance of multiple measures of their curriculum. The measurement of assurance of learning was complemented with external government surveys and student satisfaction results.
QUT
QUT Business School uses a broad package of data to understand program performance and assurance of learning outcomes, including not just the measurement of program learning outcomes but also university-level analysis of individual major performance, and unit and teaching evaluations. These different sources of information are considered each semester by discipline teams as part of a comprehensive review of program, major, unit and assessment outcomes.
Bond
As well as assurance of learning measures, Bond use data from government surveys and student completion surveys to help establish the validity of their data.
Use of Data Collection/Measurement Software
A variety of data collection tools were used to help collect and record performance on the attributes. Many universities talked about their experiences in trying out and having to choose between the many tools available, and how critical these tools were to the process. However, some expressed a preference for adopting generic technology such as Excel spreadsheets and databases to perform many of the same tasks as the more sophisticated, purpose-built software.
UTS
At UTS Business School, ReView was used for online marking, with a set of standards for attributes presented alongside the grade standards for the assignment. All assessment marking is done online, eliminating the need for extra data entry. The system is able to pull out the criteria for a particular attribute across assessments and present a picture of performance on that attribute.
Griffith
Griffith Business School developed their own software called ALEC (“Assurance of Learning Embedded in Courses”), which presents data in much the same way as ReView, with the optional entry of marks online, the application of customised rubrics and the presentation of performance on graduate attributes within the units.
Bond
Bond uses a program called STUNNER, which breaks assessment results into high, medium and low, and produces a report for each subject, and eventually each program, on whether a learning objective was achieved.
UWA
Specifically for assessing group work, UWA use SPARK to allow for anonymous self and peer evaluation.
External Requirements (e.g. Accreditation)
While in some cases assurance of learning represented a ‘tick and flick’ process of adhering to accreditation requirements, this external requirement was often used to kick-start a quality and improvement agenda. Building on the systems and processes put in place, a number of universities talked about turning this (sometimes) forced engagement into a force for productive curriculum review.
Bond
At Bond, initial staff engagement was strong due to the importance of accreditation. From this initial engagement, staff were placed into continuous improvement teams, making the responsibility for responding to AoL everyone’s business. Arguably this worked primarily because of the initial staff buy-in needed to get the university accredited.
UWA
UWA Business School talked about accreditation and its benefits as one way to initially engage staff with the value of a process for continuous improvement. It is then much easier to get staff looking at student performance and how to implement change to address any problems.
Communicating the Ease of the Process
Directly challenging staff perceptions about assurance of learning was often effective in improving staff engagement. Some of the important messages were around the ease and simplicity of the process, along with the idea that much of the assurance activity is already done implicitly. Staff were told that assurance of learning was simply a process to document and systematise what was being done anyway, and that it had the potential to save extra effort.
Monash
Acknowledging the degree of apprehension around AoL processes at the Faculty of Business and Economics at Monash, there was some work done in directly challenging perceptions that AoL was complex and time consuming.
Deakin
At the Faculty of Business and Law at Deakin the discussion around staff engagement concerned the institutional orientation towards rewarding research, and the challenges of getting academics’ time for anything else. There was an emphasis on how easy the process of AoL is, and that it saves work and time, particularly through electronic systems for the entry of marks. The other message was that there is not much difference between a marking guide and a rubric, and that there was nothing daunting about developing rubrics to measure student performance.
UTS
At UTS Business School there is an emphasis on making AoL as little extra work for staff as possible. Marking through ReView in particular is put forward as a timesaver.
UWA
While attempts were made to make AoL resource and time neutral at UWA, there were concerns that this might be indicative of a tolerance for the processes rather than real engagement.
Demonstrating Success/Effectiveness
Improving staff engagement often depended on selling staff on the evidence of assurance of learning making a difference. When direct benefits to their workloads or to the curriculum could be shown, engagement was likely to increase.
Monash
The Faculty of Business and Economics at Monash found that engagement comes from demonstrating success; however, this can be challenging as AoL is a long-term project. Staff want to see the evidence base and examples of AoL making a positive difference.
UWA
Selling staff on the usefulness and effectiveness of AoL was central to getting engagement at UWA. Staff need to be able to directly see the benefits in mapping, measurement, and curriculum change in order for them to be invested and spend time on the process.
Gradual Rollout
A gradual rollout of processes was identified as important in fostering staff engagement, particularly where this involved extended periods of building staff awareness of, and familiarity with, the processes before they were implemented.
Deakin
At the Faculty of Business and Law at Deakin the emphasis has been on getting the system set up and then rolling it out slowly, a group at a time, so that appropriate support can be provided.
UNSW
A smooth rollout of the process over an extended period of time was seen as important at the Australian School of Business at UNSW. In particular, processes were piloted before involving all staff. Having awareness-raising and engagement activities before commencing full AOL processes was also important.
Victoria
At Victoria, there was a sustained effort to foster buy-in before the processes began and before any extra work was required of staff. This was around getting staff to see the value of the process, as once it started AoL could quickly turn into a perfunctory ‘tick and flick’ exercise.
Demonstrated High Level Commitment and Leadership
A prerequisite for any significant staff engagement was the demonstration of high level commitment and leadership in AoL. Even where processes had gotten underway in order to satisfy accreditation requirements, there needed to be a constant and high level push for staff engagement and compliance until AoL became an institutional norm.
UTS
Engagement began at UTS Business School with getting approval for the process at the highest levels of the university: the executives, the dean, the deputy dean, associate deans, and heads of discipline groups. While this process could have been faster, it was important to work with and develop engagement at these high levels. This then flowed into a big drive to build support amongst staff in discipline groups, which was built on the high-level commitment to AoL.
QUT
At QUT continuous improvement is an institutionalised value. This agenda is strongly driven from the most senior leaders in the university and has resulted in a rigorous annual unit reporting process, and evaluation of all units and teaching every semester. AoL processes in the Business School have complemented and provided valuable additional direct data of performance outcomes.
Melbourne
Working to get staff engagement began at the Dean level at the Faculty of Business and Economics at the University of Melbourne. This flowed downwards to the Heads of Departments who facilitate the process with their program and subject coordinators.
UNSW
At the Australian School of Business at UNSW, developing the process in conjunction with key academic leaders helped to build engagement. Where program heads can see the worth and workability of the process, they tend to be more persuasive proponents of AOL.
Data Quality
Staff engagement could come from the quality of the data that resulted from AoL processes, particularly the presentation of statistics, charts, and tables that had clear implications for the curriculum. Being able to provide a clear picture of what was going on in terms of student learning was a powerful argument for AoL, particularly as it could confirm and legitimise issues unit and program coordinators had already been thinking about.
QUT
QUT Business School had a comprehensive approach to staff engagement that is worth describing in some detail. Initially, core unit coordinators determined how and where the learning goals would be introduced and measured in their units. Staff teaching in units in majors were also involved in the process from early on, with many vigorous discussions occurring about how and where to map the development of the learning goals in their major. While solutions could have been imposed, it was seen as important to have unit coordinators own the process of developing and measuring the learning goals for their discipline. The only requirement imposed was that each learning goal needed to be significantly addressed at least twice in the core units and twice in the major. As well as ensuring that the goals are measured through a significant piece of assessment (worth 20%), these points have also proved useful for interventions when improvements are required.

Each semester, a meeting is held for a formal handing over of the data related to each major in the program (AoL, ICR, and LEX data). Each discipline group sits together to consider the data through very robust discussion of its meaning. The visual presentation of the data in graphs and charts, and turning this handover into an event, were talked about as being important to fostering engagement. One of the most significant factors in engaging staff was the depth of view available in the data: a macro view of how students performed against each learning goal in the major, but also a micro view of how students performed against every criterion in every assessment item in the major. This meant that staff were able to make very specific analyses and identify those aspects of assessment and the learning goals most requiring improvement. This helped staff readily identify direct and tangible improvements, and enabled them to be very engaged in the process. A further very important factor was the insistence on engaging discipline groups, enabling them to own and take responsibility for the integrity of their discipline majors.

UTS
At UTS Business School the ReView program makes it possible for staff to engage with the data more directly. Along with the presentation of charts and data at workshops, staff are able to work with the data themselves and create charts and analyses cutting across many different levels. Presenting the data as a resource, as well as the basis for change and decision-making, seemed to be important for staff engagement.
Develop Leadership and Champions Across Faculty/School
Along with high-level leadership, staff engagement also came from the development of leadership amongst unit and program level staff. These people acted as champions, sharing practices and promoting the benefits that could come from engaging in AoL processes. These champions tended to emerge from the high-level engagement.
University of Adelaide
At the University of Adelaide Business School leadership was fostered through a broad assurance of learning committee that draws on a representative from each of the disciplines involved. This serves not only to make staff members responsible for interpreting the results, but also to enmesh key staff members in the process. These leaders can foster engagement through interaction with peers, along with ensuring the process reflects the experiences of the staff involved.
QUT

QUT Business School’s implementation was initially driven by a university-wide policy change to criterion-referenced assessment (CRA). Assessment champions were identified in each discipline school to guide the implementation of CRA. Having prior knowledge of CRA made the transition to mapping learning goals and aligning assessment criteria with them much easier. These assessment champions worked with the chairs of discipline school teaching and learning committees and together formed a local critical mass to support the discipline leaders in the mapping of learning goals and in influencing the attitudes and behaviours of colleagues towards the cultural change.

The undergraduate and postgraduate program coordinators work with the discipline leaders for each major in their program. The discipline leader was responsible for coordinating the mapping activities for their major and is now responsible each semester for engaging the discipline team in analysing and reporting student performance against learning outcomes and, where warranted, identifying improvements.

Delegating leadership responsibilities to key people who were able to influence colleagues created buy-in and eased the transition through interpersonal influence.

Provide Professional Development Opportunities

Staff engagement often came from professional development activities centred around AoL. These took the form of workshops, training in software, or forums to discuss and resolve difficulties and tensions around AoL processes.
UTS

UTS Business School undertook a multi-faceted series of efforts to engage staff, such as 15 workshops with discipline groups, the dissemination of a step-by-step guide to assurance of learning, the development of a website, and the availability of support for AoL tools and processes.
Curtin

Curtin Business School took the fairly innovative measure of using program and unit coordinators who had done AoL well, and having them present at seminars and engage in mentoring and peer support. These were staff members at the program or unit level who were able to share their approach to and experience of AoL to reduce anxiety about the process.
QUT

QUT Business School established a Teaching and Learning Team of four teaching and learning consultants and learning designers with a coordinator; this team has been pivotal to the successful implementation and to ongoing staff engagement. A multi-faceted strategy has been employed: workshops with discipline teams; workshops with teaching teams in individual units, especially core units; workshops with sessional staff; one-on-one support to individual academics to explore and improve assessment practice; development of assessment guidelines and audits of assessment practice; development of a website and AoL resources and tools; development of generic rubrics for undergraduate and postgraduate learning goals; and, most importantly, development of a Business School handbook based on the five learning goals to ensure staff and student understanding.
Flinders

At Flinders University law school, there was a recognition that resources and strategies had to be put into supporting staff, and that people couldn’t just be expected to do more. This led to the development of a community of practice to work through some of the challenges staff were having in working with graduate attributes and skills.
UWS

At the University of Western Sydney’s law school, a major inroad to achieving buy-in was having staff devote time to workshops, setting performance targets linked to learning outcomes, and providing one-on-one support covering not only what staff need to provide to program coordinators, but also teaching and learning supervisory support.
ECU

Staff from the Centre for Learning and Development undertake informal sessions and training in relation to graduate attributes (GAs). Staff are expected to address the embedding of GAs in unit and course reports, and staff in the T&L office monitor the mapping of GAs. For Curriculum 2012, an advisory group was convened for each of the priority areas and guidelines were developed collaboratively across faculties.
Involve a Broad Set of Stakeholders
Involving a broad set of stakeholders in discussion and reflection on the data was important in getting meaningful change in the curriculum. By involving staff who have direct and tacit experience of the units and programs, change can be more specific, direct and effective. Involving the staff who would be responsible for change is also important in fostering commitment to change, rather than having it imposed.
Adelaide
The University of Adelaide Business School has quite a de-centralised structure of responsibility for responding to measurement data. The aggregated results get sent back to the unit coordinator and to the discipline representative for assurance of learning. They are asked to comment on the data and make suggestions for changes at the unit level, then at the program level.
QUT
A de-centralised process occurs in the QUT Business School, where responsibility to develop and act on changes sits with discipline teams. A team including the Assistant Dean Teaching and Learning, program directors, the teaching and learning team, the discipline coordinator or head of each major, the chair of the teaching and learning committee from the discipline school, and some of the unit coordinators is brought together to discuss the findings. These meetings are positioned as an important event, with robust discussion around the interpretation and implications of the data. From these discussions, the discipline groups complete a one-page summary that explains the outcomes, what emerged from the data, and what they would improve for next time. The discipline report (for each major) is forwarded to the relevant program director, who prepares a whole-of-program response. The program and major reports are considered by the Business School Education Committee, Academic Board and the Dean’s Executive Group for any strategic-level responses.
UTS
UTS Business School have a process involving teaching and learning committees from the discipline groups within the faculty. The groups receive a summary of where problems may exist; from this they make recommendations about changes to the curriculum.
UWA
UWA works from program reports, which are reviewed by a committee including the teaching and learning team, lecturers whose units are involved in the measurement, the discipline chair, and the instructional designer.
UNSW
At the Australian School of Business at UNSW the emphasis is on program directors collaborating with teaching teams and making recommendations for improvement based on the AOL assessment data.
Melbourne
The Faculty of Business and Economics at the University of Melbourne has a reflective process for lecturers: following a discussion of results, they write a report about what they are going to do to foster improvements and file it with samples of student work and the results. This is all examined the following year.
Griffith
Working with a small team including the unit and program coordinators, Griffith Business School places responsibility on this group to work with the results for the different learning objectives. The use of small groups was a fairly practical measure based on the difficulty of getting senior people together at the same time. 
Critical/Reflective Discussion
An important part of closing the loop, along with engagement in the whole process, was the willingness to have critical discussions about AoL procedures, graduate attributes, mapping and measurement. This allows concerns to be aired and, potentially, issues in the implementation of the process to be corrected. Such critical discussion seemed to be particularly important in measurement, where some staff had reservations about the validity of how student performance was being marked. In some cases this discussion led to significant changes that improved the process.
UNSW
The Australian School of Business at UNSW reported that discussing AOL results was useful in reviewing program learning goals and the overlap between them. These discussions represent a willingness to critically evaluate the AOL process and ensure it is providing valid information with which to inform program decision-making.
UTS
At UTS Business School there was some discussion of the importance of critical feedback about the process itself, particularly the measures, and the subjects and tasks they are mapped to. Allowing such critical feedback and discussion was important for building meaningful engagement, along with improving the process, separate from improvements to curriculum and teaching practices.
Melbourne
At the Faculty of Business and Economics at the University of Melbourne there was a review process around the program learning goals and objectives, which constitutes an openness to critical feedback around the process of AoL.
Review Previous Proposed Actions
It was important for discussions around closing the loop to begin by examining the previously proposed actions and the effect these had on student performance. This reinforced the commitment to change and improvement, along with reminding groups of the wider context of the changes.
QUT
Closing the loop at QUT begins with examining the actions suggested last year in the annual university reporting, how they have been implemented and what has happened as a result. This enhances the interpretation of the data by highlighting any substantive changes that may have affected student performance, along with potentially improving suggested actions in light of the previous year’s strategies.
Melbourne
At the Faculty of Business and Economics at the University of Melbourne the academic would then write a report about what they were going to do, and file the sample piece of work, the results, and their approach to improvement. This would all be re-examined the following year to see what had happened.
Make Changes Once the Process Has Settled
It was important to hold off on making dramatic changes until a few rounds of the process had been completed. This allowed for some realistic benchmarks of performance, along with a clearer understanding of the status quo, before instituting any changes.
Monash
At the Faculty of Business and Economics at Monash it was important that they run through a full cycle of measurement before making any significant changes. While some issues become quite clear prior to running through a full cycle of assessment, there was an understanding that there needed to be some consistency in what was being measured before any significant changes occurred.
Focus on Improvements at the Program Level
There is the potential for blame to be thrown around when making changes at the unit level, as unit coordinators are in a sense the end of the line in terms of content and delivery. It was more productive to deal with change at the program level, identifying the development of attributes over the program and dealing with change across units.
UTS
At UTS Business School, discussion around closing the loop occurs at the program level, looking at what could be improved over the course of the program rather than attributing blame at a subject level. Where problems are identified, this leads to a review of how the relevant learning objective was introduced, developed, and assessed over the course of the program.
University of Adelaide
The University of Adelaide Business School have a process of working back from where learning is assessed to examine how a particular learning outcome has been introduced and developed over the program. Closing the loop at the program level means that there is recognition of the places in the program where changes could be made.
Bond
Working through the outcomes data, the law school at Bond University draws on a sub-committee of the teaching and learning group with a background in legal andragogy. They consider whether the graduate attributes and the TLOs are achieved at the program level and whether the units are applying best practice. Changes are made at the program level and then fed back to unit coordinators.
Keep Change Manageable
Manageable change was essential to demonstrating the benefits of AoL processes. Limiting the change that resulted from closing the loop to the one thought to be most significant meant that a clear example of improvement existed and comparisons in performance could be made without too many confounds.
QUT
In the QUT Business School there is an emphasis on the discipline teams coming up with the one point of change that would make the most significant difference. Often additional changes were identified and implemented, but the focus was on identifying the change that would have the most impact on improving student learning outcomes against the program goals.