The coronavirus pandemic has made many people question whether the exam system in England is sufficiently robust or resilient. GCSE and A Level exams, along with many vocational assessments, have been cancelled again in 2021. Is there something about the specific assessment arrangements for these qualifications that makes them vulnerable to crises?
This is something we’ve been giving some thought to as we develop our outline principles for the future of education. Two of these principles are particularly relevant in this context: that dependable assessment is vital, and that technology can support teaching and attainment.
The word ‘resilience’ comes from the Latin resilio: to spring back. It is the ability to recover from difficulties or disturbance. It therefore has a slightly different connotation from ‘robustness’, which implies strength, or the ability to withstand shocks without breaking. (And we won’t talk about a third ‘R’ – ‘rigour’, which is more used to the limelight in normal times). Another way of summing up the difference is that robustness is dealing with ‘known unknowns’, and resilience is dealing with ‘unknown unknowns’. Thinking more broadly about risks (as we have been forced to by the pandemic) arguably just increases the scope of known unknowns, so we won’t belabour the distinction here, and will refer mainly to robustness.
The assessment system is designed to be robust in the sense of dealing with known unknowns, as shown by its detailed procedures for handling exceptions (for example access arrangements, absence and malpractice), and by risk mitigations such as sequestering (isolating) candidates who have timetable clashes or, in the worst case, using a reserve paper when a paper is leaked. But the painful experiences of 2020 and 2021 have shown that it struggles to withstand shocks such as closed schools, cancelled exams and lost learning. This has led to calls for the system to be changed in some way so that it can cope better with future shocks. The main impact of this particular shock (the pandemic) on exams has been that it was deemed impossible to get large numbers of people into the same place (exam halls) at the same time (the set date for the exam). In this blog we consider whether more online assessment could improve the robustness of the system.
Could we make more use of online assessment?
Online assessment at home (same time, different place)
Assessing students in their own homes has been a solution in some situations (more often in higher education in England) when lockdowns have prevented students from attending the place where they would normally have taken exams. The main issues in these cases are the standardisation of testing conditions (including the recognition that many students may not have a suitable home environment for taking an assessment) and the perception or reality of unfairness arising from the greatly increased possibilities for malpractice. ‘Remote proctoring’ (a webcam with a person watching the examinee while they take the assessment) has been one attempted solution, though not without its critics.
Online assessment at an examination centre (same place, different time)
For GCSEs and A Levels taken in England, examination centres are usually school and college halls. But if we relax the ‘same time’ requirement we could also make use of centres with dedicated facilities for administering online assessment, though not necessarily to large numbers of people at the same time. Such centres are routinely used for many online tests (e.g. the driving theory test in the UK). Their main advantage is that much stricter controls are possible to prevent malpractice: for example, verifying that the candidate is genuinely the person who is supposed to be taking the test, that the testing conditions are standardised, and that the computer on which the test is taken is appropriately ‘locked down’ so that it can only be used for the assessment itself. To avoid actual or perceived unfairness, we would need to ensure that examinees taking the test on different dates did not see exactly the same set of questions. Psychometric techniques could be used to address differences in difficulty between different sets of questions, but work would be needed to build public confidence in the fairness of this.
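One family of psychometric techniques for this is score equating. As a minimal illustrative sketch (the scores and the choice of method here are our own hypothetical example, not any awarding body’s actual process), mean-sigma linear equating maps a score earned on one question set onto the scale of another by aligning the means and standard deviations of the two score distributions:

```python
from statistics import mean, pstdev

def linear_equate(score_y, scores_x, scores_y):
    """Map a score from question set Y onto the scale of question set X
    using mean-sigma linear equating: assume the two candidate groups are
    equivalent, and align the score distributions' means and spreads."""
    mu_x, sd_x = mean(scores_x), pstdev(scores_x)
    mu_y, sd_y = mean(scores_y), pstdev(scores_y)
    return mu_x + (sd_x / sd_y) * (score_y - mu_y)

# Made-up score distributions: question set Y was slightly harder, so its
# raw scores run about four marks lower than set X's for comparable groups.
set_x_scores = [48, 55, 60, 62, 70, 75]
set_y_scores = [44, 51, 56, 58, 66, 71]

# A raw 58 on the harder set Y is lifted onto set X's scale.
equated = linear_equate(58, set_x_scores, set_y_scores)
```

Real operational equating is considerably more sophisticated (item response theory, anchor items, equivalent-groups designs), but the underlying idea is the same: adjust for the difficulty of the particular question set a candidate happened to face.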
On-demand testing at home (different time, different place)
Versions of this would be the most flexible of all. On-demand could literally mean just that, or it could mean a much larger number of possible dates or ranges of dates when exams could be taken. But the drawbacks of both the previous scenarios would need to be dealt with simultaneously.
Would online assessment make the system more robust?
Incorporating online assessment into the overall assessment for GCSEs and A Levels in England could increase robustness to some kinds of disruptive event, particularly those that prevent large numbers of students being examined in schools on the same date. But it is worth reflecting that online assessment has been ‘just around the corner’ for at least 25 years and yet, with a few exceptions, has not been incorporated into GCSEs and A Levels at all. Why is this? One reason may well be an anticipated lack of robustness! Computer and technology problems are a familiar part of home life, and even of work in well-equipped and well-supported office environments. It is easy to replace a pen that has run out, but harder to stop an unexpected update of your PC’s operating system. The preparations needed to deliver online assessments smoothly for all candidates in schools could be substantial and costly, the stress for those charged with administering them high, and the price of failure very high in terms of negative publicity and possibly even legal action. It is very natural now for us to focus on disruptive events like the pandemic, but other kinds of disruptive event could affect online systems more than paper-based ones – for example, cyber-attacks or solar flares that bring down the site(s) involved in running the online assessments. These are big issues that would need tackling alongside any move to greater online assessment if we want to increase the robustness of the exam system.
In the next blog we will consider whether more continuous assessment could help.