29 April 2015
Schools should be judged by their results over a period of at least five years rather than condemned by one year’s performance, Cambridge Assessment says.
Group Director of Assessment Research and Development Tim Oates says that new research by his department has implications for the current approach to school accountability and for accountability measurements.
Teaching, exam reform or marking are often blamed when schools suffer wild swings in exam results. But a study published today by Cambridge Assessment researchers Tom Bramley and Tom Benton, Volatility in Exam Results, suggests that once the impact of factors such as reliability of marking is removed, significant volatility still exists. The Headmasters’ and Headmistresses’ Conference (HMC) has previously stated that at least one in five of its schools experienced volatility that it would define as a “serious concern”.
“This study shows quite clearly that exam results in a school may go up or down in unanticipated ways, caused by a wide and complex set of factors,” researcher Tom Bramley says.
“When swings occur they could be because of what is happening in the school or the children’s lives, they could be to do with the assessment itself or the way that national standards are applied, or to do with teaching and learning. But what our study shows is that when we’ve taken account of the variations which can be attributed to quality of marking and to the location of grade boundaries, surprisingly high levels of year-on-year volatility in exam results remain.
“Schools should still monitor exam results for an administrative error which might have occurred, and should still look for and alert exam boards to peculiar outcomes; but everyone in the system should be aware of the level of volatility typical in the context of the complex system which is schooling.”
Reflecting on the research, Tim Oates says that schools should be judged by their results over a period of at least five years – a five-year set of results, not a rolling average.
“It appears that underlying school-level volatility may be an enduring and persistent feature of education arrangements, which means that school performance – in terms of exam results – should be judged on a five-year picture rather than one-off annual drops or increases.”
The study does not seek to investigate all of the causes of volatility, instead analysing it in a way which removes the impact of quality of marking and the setting of grade boundaries. It finds that volatility still remains when both are removed.
Tim Oates adds: “This is a very important finding and one which challenges many assumptions, with implications for the approach to accountability and for accountability measurements. The analysis is a valuable contribution to building a far more powerful and analytic approach to system improvement and enhancement of assessment. It is a significant part of a picture that we are continuing to investigate at Cambridge Assessment.”
Robin Bevan, Headteacher of Southend High School for Boys and UK School Leadership representative on the Association of Teachers and Lecturers National Executive, commented:
“Judgements about school outcomes, and indeed about the performance of individual teachers, need to be based on sound evidence. This intelligent and insightful report is very welcome. The rigorous analysis clearly shows that schools will see natural fluctuations from year to year in exam outcomes. These variations should not be used, simplistically, to assess the effectiveness of schools on one year’s output or to assume there are problems with the reliability of exam markers.”