Case Studies



For Griffith Business School, accreditation motivated the process, but the school went far beyond what AACSB required, implementing a process that emphasised continuous improvement. As well as responding to accreditation requirements, the school was focused on cross-campus consistency: across the many campuses where programs were offered, there needed to be some equivalence in learning outcomes, assessment, and experiences. This began with the development of faculty-level graduate attributes, which were subject to consultation with the faculty.

The first step undertaken was to articulate the learning goals. While these existed in people’s heads, they had not been written down. The school brought together staff who taught in the program, along with external stakeholders (employers and alumni), to give their views on what a graduate of a particular program should look like; this fed into defining the learning goals and their connection to the faculty-level graduate attributes.

The mapping process began with establishing how, and to what degree, the units in a program actually contributed to the learning goals. This made it possible to streamline the curriculum, but also revealed that in some majors certain outcomes were not being addressed. Griffith Business School held a number of off-campus workshops in which staff, including unit and program coordinators, heads of departments and the dean of learning and teaching, charted the learning goals over the course of a program on butcher’s paper. Refining the map was a continuous process, conducted primarily by email but supplemented by yearly workshops that revisited the exercise to ensure the mapping reflected the way each unit was being delivered. Developing rubrics against which to assess skills proved a problem, so teams of staff were brought together to write them, supported by a grant of $1,000 per rubric. The rubrics are peer reviewed and trialled.

Griffith Business School developed its own software called ALEC (Assurance of Learning Embedded in Courses), which presents data in much the same way as ReView, with the optional entry of marks online, the application of customised rubrics and the presentation of performance on graduate attributes within units. In the meantime, collecting the data has been a slow and laborious process. The process for marking students also outputs the assurance of learning data: students might be marked on, for example, ethical understanding as part of an essay.

Griffith Business School places responsibility on a small team, including the unit and program coordinators, to work with the results for the different learning objectives. The use of small groups was a largely practical measure, reflecting the difficulty of getting senior people together at the same time. Decisions about changes arising from the data are handled within the course; a major decision would have to go before the faculty’s learning and teaching committee and the faculty board for approval.


QUT Business School set out with a very clear and deliberate strategy for assurance of learning: to embed the process into the culture of the organisation as a means of continuous improvement. This was a long process that depended on staff engagement, and it has led to a sustainable approach to assurance. The respondent talked about the difference between getting compliance from staff and having them own the process themselves; the distinction between compliance and engagement was central to all stages of developing the approach. This brief case study of QUT Business School outlines the approach taken, highlighting the main phases of its implementation:

  • a) the development of school learning outcomes,
  • b) the mapping of learning outcomes,
  • c) outcome data collection, and
  • d) closing the loop.

The first phase concerned the development of school-level learning outcomes through a participative process across the school. The respondent described what had occurred previously: a perfunctory ‘tick and flick’ of graduate attributes across units without identifying in any detail how units and assessments developed skills. A strategic view of business school student learning outcomes was developed by the school dean, the heads of school, and the senior management team, then modified and developed in consultation with all staff. While these outcomes linked directly to the university graduate attributes, they have since replaced the graduate attributes in mapping and measurement because of their greater relevance to business programs. All programs within the school use the same five learning outcomes, although there are discipline-specific modifications to their wording. Communicating to students the levels of performance required of them over the program, and the skills they will gain from it, was a key part of the approach.

The mapping of the learning outcomes similarly involved facilitating and supporting the school to take ownership of how the outcomes fit into each program. The teaching and learning team mapped the eight core business units, working with the unit coordinators to identify assessments that addressed the learning outcomes. From there, discipline teams were asked to build their major onto these core units and to show the sequential development of the outcomes. This led to robust debate within the disciplines about what different levels of performance on the outcomes looked like over the program. The respondent noted that these distinctions could have been imposed from the top, but that it was important for the discipline groups to have this debate in order to own the measures of quality. The role of teaching and learning staff in this process was primarily as facilitators.

The next phase involved the collection of learning outcome data. While AACSB requires only a sample of student performance, the respondent said it was a matter of principle that, if the work was going to be done in developing the learning outcomes, all students should know how they were performing against these goals. Instead of identifying just one or two points of measurement for each outcome, performance data is collected over the course of the degree in order to capture the development of graduate competencies. And instead of having skills assessed externally, the rubrics describing levels of performance on outcomes were built into assessments and marked by unit coordinators. This was all enabled by the successful implementation of the ReView software.

The data collection phase feeds directly into a robust process of closing the loop. Student grades associated with different learning outcomes are collected through ReView and exported into a spreadsheet. Faculty are expected to attend a data handover day to workshop the results and actively engage in curriculum improvement. Each discipline group sits around a big table as the assurance data and unit review data (course experience questionnaire, graduate destinations survey, attrition, viability, and comparisons of the major against other universities) are presented. Those assembled are asked to discuss and interpret the data. Following the initial discussion, the discipline groups get time to work through the data themselves and are then asked to complete a one-page summary. This includes a description of the outcomes, some general trends, and thoughts on how to improve both the process and the curriculum. Groups are asked for the one change they would make that would have the most significant impact.

The respondent felt that the successful implementation of the process came from seeking to engage staff in each phase: identifying goals, mapping outcomes, collecting data on student achievement, and improving the curriculum based on the data. A key part of the successful engagement was the closing-the-loop process, which demonstrated a faculty-wide commitment to continuous improvement. The presentation of the data in these sessions was also thought to be important for engaging staff; the bar graphs of outcome results caught people’s attention and fostered discussion about how each unit fit into the program as a whole. Having both the micro view of performance in assessments at the unit level and the macro view at the program level fostered engagement at two levels. Being able to identify problems in the schedule of assessments was seen as valuable, yet the focus was on the program view in order to promote a sense of common purpose and pride in the quality of the program.

Using Technology to Aid Assurance of Learning – UTS Business School: A Case Study

Tracy Taylor (University of Technology Sydney) & Romy Lawson (James Cook University)

Among the institutions that currently hold AACSB accreditation worldwide, UTS Business School is (perhaps appropriately) notable for the extent to which technological aids have been successfully used to streamline the assurance of learning (AoL) process, and to make the process as transparent as possible to the school’s stakeholders. This work has been featured by AACSB in a recent Spotlight article, and this presentation aims to showcase some of the techniques and technologies developed at UTS Business School.

There are three keys to the success of the assurance of learning system that UTS Business School has developed. First, both the assessment techniques and the enabling technologies must be embedded in a whole-of-program approach within the everyday activities of faculty and staff; having AoL as an add-on to what is already expected of them generates inefficiency and resentment. Second, while developing the process that will ultimately be used, it is important to include faculty (and, where appropriate, student) input in order to ensure optimal implementation and results. Finally, the process that is created must be sustainable, and the best way to ensure this is to make certain that any assessment techniques and technological aids minimize effort for all concerned and prevent assurance of learning from becoming overly time- and resource-intensive.


Embedding the AoL process in the regular routine of UTS Business School operations has had just as profound an impact on the positive development of assurance of learning at the school as the technological tools used to streamline the process. The idea is to keep AoL tools and processes as simple as possible, so as not to unduly burden faculty with extra work. To that end, several of the tools developed for UTS Business School’s current AoL process are, technologically speaking, very basic. The first such tool to be developed was the SOS (Subject Overview Spreadsheet) (i). This is a simple Excel file, designed in the faculty Teaching and Learning Centre (see Lawson, Bajada & Lee, 2010), that allows faculty designated as subject coordinators to map the curriculum by inputting and tracking information about their subject: the learning goals and objectives, how particular assessment types (multiple choice, essays, case studies, etc.) are weighted and distributed over each course, the timing of due dates and tutor feedback to students, and which learning objectives are included in each assessment task. This information is then presented in a summary that allows faculty to review the whole-of-program experience for their learners, making clear any gaps (or overlaps) across the program.
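The gap-and-overlap logic behind a curriculum-mapping tool like the SOS can be illustrated with a short sketch. This is not the actual spreadsheet; the subject codes and objective labels are invented for illustration:

```python
# Hypothetical sketch of curriculum-mapping logic in the spirit of the SOS:
# each subject records which program learning objectives its assessments
# address, and the program-level summary exposes gaps (objectives no subject
# assesses) and overlaps (objectives assessed in many subjects).
from collections import Counter

# Illustrative data only; codes and labels are invented.
subjects = {
    "BUS101": ["written communication", "ethical reasoning"],
    "BUS102": ["quantitative analysis", "written communication"],
    "BUS201": ["written communication", "teamwork"],
}
program_objectives = [
    "written communication", "quantitative analysis",
    "ethical reasoning", "teamwork", "global perspective",
]

# Count how many subjects assess each objective.
coverage = Counter(obj for objs in subjects.values() for obj in objs)
gaps = [o for o in program_objectives if coverage[o] == 0]
overlaps = [o for o in program_objectives if coverage[o] > 2]

print("gaps:", gaps)          # prints: gaps: ['global perspective']
print("overlaps:", overlaps)  # prints: overlaps: ['written communication']
```

In the real SOS this summary view is what lets coordinators spot, at a glance, objectives that no assessment task addresses or that are assessed redundantly.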

Another technologically simple but highly useful tool is the AoL Repository Dropbox. This is a Word document that uses hyperlinks to all the other technological features of the AoL system to create a central data hub for each program. It acts as a portal to the assurance of learning data for each course and subject area within the program, and also links to summary reports for each program. The AoL Repository Dropbox has had a significant impact on making data management more efficient, allowing easy access to all aspects of the assurance of learning process from a single central clearinghouse. This is particularly useful for the program directors, so that they can have all the information they need at their fingertips when discussing issues that arise, different ways to “close the loop” on them, and even tracking the impact of previous actions.

Probably the most complex technological aid in UTS Business School’s AoL toolbox is the Web-based student marking system known as ReView (ii). This system permits the subject coordinators to work and communicate easily with faculty and tutors, and to view their marks and comments as they are entered. UTS Business School’s assessment rubrics are incorporated into the marking system, allowing assessment data entry to be done entirely online.

Assessment criteria are also color-coded to the school’s graduate attributes within ReView, allowing the system to store student marks and produce graphs of student data that can show either individual or aggregate performance for each program learning goal. Additionally, having the rubrics and criteria embedded in ReView helps to ensure standardized assessment of learning objectives across programs, allowing for easier benchmarking. The system also has a self-assessment feature for the students themselves, which gives them continuous feedback on their performance in each of the graduate attributes as they progress through their degree program. Most students see this as a valuable opportunity to engage with their own progress and to compare their own perceptions of it against staff assessments.
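The kind of aggregation described here, from criterion-level marks tagged to graduate attributes up to cohort-level performance per learning goal, can be sketched as follows. The data and function names are invented, not ReView’s actual interface:

```python
# Hypothetical sketch of aggregating criterion-level marks, each tagged to a
# graduate attribute, into cohort averages per attribute (the figures that a
# system like ReView might graph). All data and names are invented.
from statistics import mean

# Each record: (student id, graduate attribute, score out of 100)
# for one assessment criterion.
marks = [
    ("s1", "communication", 72), ("s1", "ethics", 65),
    ("s2", "communication", 58), ("s2", "ethics", 80),
    ("s3", "communication", 90), ("s3", "ethics", 55),
]

def attribute_averages(records):
    """Average cohort performance for each graduate attribute."""
    by_attr = {}
    for _student, attr, score in records:
        by_attr.setdefault(attr, []).append(score)
    return {attr: round(mean(scores), 1) for attr, scores in by_attr.items()}

print(attribute_averages(marks))
# prints: {'communication': 73.3, 'ethics': 66.7}
```

The same grouping can be filtered by student rather than averaged across the cohort, which corresponds to the individual-versus-aggregate views mentioned above.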


Without the buy-in that comes from including the faculty who will be responsible for using the assessment techniques and technologies in their development, the end result is less than optimal. Extensive faculty consultation is critical to creating a sense of ownership of the assurance of learning process and its implementation among faculty.

Additionally, the multiplicity of uses available through the technological tools can be a selling point for faculty in and of itself. The ReView system, for example, is not only a timesaver insofar as AoL data entry is concerned, but also makes it possible for faculty to work with the data themselves to create charts and do analyses that cut across different programs and levels. In addition to being the basis for decision-making and “closing the loop,” having the data garnered from the assurance of learning process available as a resource for the faculty in this way enhances the value of both the tools and the process.

Greater student involvement in the AoL process is also encouraged, and additional technological aids in use at UTS Business School make this possible. UTSOnline, for example, is a modified Blackboard™ virtual learning platform which students can use to review how the school’s learning outcomes are mapped and how they will be developed in their program. The UTS Student Handbook (which is made available on UTSOnline) includes subject outlines that state the learning objectives for each subject and program. The outlines also show where and how the objectives align with graduate attributes for each assessment task. The UTSOnline platform is also the means by which students receive their marks and feedback from the ReView system.

Another tool currently being piloted at the undergraduate level is an ePortfolio, which is designed to aid students with knowledge integration between the stages of their degree program. The ePortfolio process is introduced in one of the core first year subjects, and progresses throughout the rest of the core and major electives, to be completed in the capstone course for each major. The portfolio allows students to track their progress toward meeting program learning goals and demonstrating the specified graduate attributes, as well as providing a repository for examples of students’ work.


UTS Business School believes that an inclusive AoL process, streamlined by technological tools and embedded in the everyday activities of stakeholders, is the best way to ensure the sustainability of the process. Once assurance of learning becomes largely driven by the faculty themselves, it is essentially self-sustaining.


Lawson, R., Bajada, C., & Lee, J. (2010). SOS: A tool to support assessment practice across degree courses. Australian Technology Network, Sydney, NSW, Australia.

i. Australian Government, Office for Learning and Teaching. (2012). SOS Demo Tool. Electronic document, accessed August 24, 2012.

ii. Australian Government, Office for Learning and Teaching. (2012). ReView Overview. Electronic document, accessed September 4, 2012.