University admissions overhaul - the PQA debate

Given the ongoing debate about the university admissions system, researchers at Cambridge Assessment have investigated different ways of predicting performance at A level.

We are pleased that UCAS took into account our concerns regarding a move to a post-results admissions system (or PQA, as it is widely known) and has decided against its implementation, as reported in UCAS' Admissions Process Review Findings and Recommendations report published today (28 March 2012).

We believe that the benefit deriving from PQA in terms of widening access has not been sufficiently demonstrated to warrant the degree of investment, disruption and risk that would be associated with a move. Widening access remains an important educational goal, and other, less disruptive methods of delivering it should be explored first. This page contains our research on the different ways of predicting performance at A level.

All are agreed that the principal rationale for looking at PQA is to deal with the widening access challenge. Much debate has therefore taken place relating to the relative degree of accuracy of predictions in the current system, the outcomes of which might shed light on whether a move to PQA would deliver better outcomes than the current system.

In 2011, researchers at Cambridge Assessment investigated different ways of predicting performance at A level. Two studies were carried out to determine whether alternative indicators could provide better predictive information that could be used as part of the university admissions process.

The findings from our studies allow comparison with the results from a UCAS-commissioned study based on predicted grades data from 2004. Overall, AS level grades and forecast grades were more accurate predictors of A level grades than predicted grades. However, it is important to note that the data for the UCAS predicted grades study were from a different time period. In addition, AS level grades, forecast grades and predicted grades were all found to be slightly biased predictors for some subgroups of candidates.

Compared with forecast grades, AS level grades appear to be approximately an equally accurate predictor of A level grades. Both forecast grades and AS level grades could be considered as potentially better alternatives to referee predicted grades (albeit with the time period limitations). However, even for the AS level and forecast grade predictors the prediction of A level grades could only be described as 'reasonable'.

An ongoing UCAS longitudinal study exploring the accuracy of predicted grades from 2008 to 2012 will help to resolve the time period difference, providing predicted grade accuracy data for corresponding 2009 and 2010 examination years; this study is due to report in 2013.

On 22 November 2011, our Group Chief Executive Simon Lebus discussed the implications of PQA at the Westminster Education Forum event 'University access and admissions - the next steps'. Simon explained the degree of investment, disruption and risk associated with such a move.

The accuracy of forecast grades for OCR A levels – Statistics Report Series No. 26

This report provides timelines for the UCAS application process together with key dates in the assessment cycle. The forecast grades used in the analyses are for the OCR exam board only. Forecast grades are those reported by teachers to the exam board prior to the final examination session. They differ from the predicted grades sent to UCAS as part of the university applications process in two ways:

  • The forecast grades deadline is in May, while UCAS receives predictions during the applications period, which ends in January
  • Forecast grades are used by awarding bodies to assist in the examinations awarding process, while predicted grades are used in the UCAS applications process.

The second report, Predicting A level grades using AS level grades – Statistics Report Series No. 29, investigated whether AS level grades could be used to predict A level performance. The AS level grades used in the analyses are for all exam boards.

For both reports, accuracy was reported overall and by gender, level of deprivation and school type.

Two further points can be drawn from the work. First, the vast majority of forecast predictions are no more than one grade out (and generally over- rather than under-optimistic). Second, it is not the case that independent schools over-predict in comparison with the maintained sector, thereby gaining an unfair advantage. There is therefore little evidence that candidates systematically underbid for places at research-led universities on the basis of erroneously forecast grades.

In addition, plans to overhaul the university admissions system do not rely on exam boards alone. A level results could be compressed by around ten days. However, this would reduce the time available to consider results appeals and queries. In any event, there would still be insufficient leeway to permit the introduction of PQA without changing some of the other variables, such as the start of the university year, or significantly bringing forward the dates at which A levels are sat, thereby reducing teaching time.

Teachers and other staff would also need to be available to give advice and guidance, meaning that both they and candidates would have to be in school in the middle of what are now the summer holidays. All of these things would be difficult to orchestrate, risky, or both. They would also require an expensive investment by universities in re-engineering their admissions systems.

Related materials

Research Matters

Research Matters is our free biannual publication, which allows us to share our assessment research, across a range of fields, with the wider assessment community.