AHELO: Telling you what you need to know about higher education outcomes
Rankings of higher education institutions always grab the headlines, but they only include a small selection of the world’s colleges, and may not tell you what you’d like to know about what it’s like to study there.
We asked Karine Tremblay to tell us about AHELO (Assessment of Higher Education Learning Outcomes), an ambitious OECD project that could prove far more useful than simple league tables.
What is AHELO?
AHELO is the first attempt to measure, at an international level, what third-year undergraduate students have learned and are capable of doing. It will produce measures at the institution or department level, not at the national level, unlike other OECD studies such as PISA or PIAAC.
At this stage, we’re testing the feasibility of measuring learning outcomes in institutions in different countries, with different missions, languages and cultural backgrounds. In fact non-OECD countries make up nearly a third of the participants – Colombia, Egypt, Kuwait, Russia, and Saudi Arabia as observer.
To begin with, we’re focusing on generic skills (the so-called 21st-century skills – critical thinking, analytic reasoning, problem solving and written communication) and on discipline-specific skills in economics and engineering.
When can we see the results?
We’ll be releasing the results of this feasibility study in 2012. These won’t be data products. The idea is to show that it’s possible to devise a set of test instruments applicable across a range of different institutions, cultures and languages and that the practical implementation of these tests is feasible.
If the data prove comparable across countries, we’ll give the institutions that helped us with this part feedback on how their students perform relative to international benchmarks, but we won’t be going public with these results.
If a main study is launched, we would publish performance data on learning outcomes, along with context information to interpret performances – type and mission of institution, selectivity, characteristics of student intake, and so on.
How will you rank institutions?
We won’t. Current rankings like the Shanghai or Times Higher focus on inputs such as libraries or faculty characteristics and research performance, measured by numbers of citations, number of Nobel prizes, and the like. That’s fine if you’re picking a PhD programme for instance. For prospective undergraduates or employers though, it’s highly misleading to use such measures as proxies of higher education quality.
But in the absence of any better information, we see higher education institutions all over the world in a race to research excellence to make it to the top of the rankings, to the detriment of their teaching mission.
So, how is AHELO more useful than rankings?
AHELO data will allow a much more accurate assessment of higher education quality, focusing on one of the key missions of institutions: teaching. And in fact, thanks to the context data, it will be possible to analyse what is distinctive about high-performing institutions and spot best practices.
That makes it possible to identify what works, for which students and in which contexts. There is huge potential for reducing dropout rates and promoting more equitable outcomes. Remember that across the OECD, 3 out of 10 students entering higher education drop out without a degree. With an average of $53 000 spent per higher education student, the costs of failure are huge. The social costs for those who drop out are equally high.
Who will use AHELO results?
Anyone interested in higher education. Students can make better informed decisions. Institutions can improve their teaching and learning processes. Governments can more effectively account for public expenditure on tertiary education. Employers will be better informed about the capabilities of job candidates.
Are you optimistic about progress so far?
Yes. All the insights from our work so far suggest that AHELO is feasible, and interest is growing steadily. We’ve reached our target of 15 participating countries in the feasibility study, and received expressions of interest from twice as many, which is very promising for a main study.
One reason for this is that we’ve tried to involve as wide a range of participants as possible. Governments, institutions and academics serve on our expert groups. Students are at the core of the feasibility study, and we consult stakeholders regularly to report on progress and seek their feedback.
Interest is particularly high in the MENA region as well as in Latin America. Egypt for example remains highly committed despite the political turmoil. We’re also pleased to have Colombia with its strong track record in national assessments of higher education.
How have teachers and students reacted?
One of the big surprises was that agreeing on frameworks and instruments proved easier than we expected – getting academics from different countries to agree on what to measure in each discipline, and on a common test. We included economics to see whether agreement was possible in a social science.
Students have taken the generic skills test in their own language and provided qualitative validation that the test is meaningful and relevant to them. Students are now validating the disciplinary assessments as well. Here, initial feedback suggests that the tasks stimulate students’ interest and desire to participate.
What comes next?
AHELO seems feasible, and we’re moving to phase 2 with larger groups of students. The goal is to deepen our analyses and provide a quantitative proof of concept – demonstrate that practical implementation is feasible and that the tests yield relevant and statistically acceptable results.
UNESCO Global Forum: Rankings and Accountability in Higher Education: Uses and Misuses, Paris, 16-17 May 2011, organised with the OECD and World Bank, will address university rankings in light of their impact on policy and decision-making at institutional, national and regional levels.