Why don’t we just put our high stakes exams on screen?

by Sarah Hughes, 06 January 2022

At Cambridge University Press & Assessment we could migrate our existing assessments to screen. We have an archive of thousands of assessments which have been developed using research evidence and quality assured through our principled approaches. So, given all the work which has gone into creating these assessments, why don’t we just put our existing assessments on screen? To answer that question, it helps to distinguish two approaches to digital assessment:

  1. a transformational approach – innovating and using technology to develop new assessments, assessment models and curricula.
  2. a migratory approach – putting existing assessments on screen without changing the assessment models, the curricula or the teaching and learning.

Migration of existing assessments to screen has value: our experience of developing Cambridge IGCSE Progression tests and GCSE Topic tests shows that migration can improve accessibility, release teacher time, provide rich data and recognise the digital literacy of learners, as well as building our organisational capabilities. But simply migrating paper tests to screen can miss opportunities, and it comes with risks.

Missed opportunities

Designing digital assessments from scratch means we can:

  • reflect effective teaching and learning in the internet age (for example, OCR offers a Further Maths A Level option ‘Further Pure with Technology’ in which learners can use a computer algebra system, spreadsheets, and graph plotters in the exam);
  • design curricula and assessments which develop skills and behaviours which better reflect real life and work;
  • use technology to assess constructs in ways not possible with paper assessments;
  • embed technology in teaching, learning and assessment to add value to the educational experience;
  • exploit technology to its full potential.

Risks of migrating paper tests to screen

There are three main risks in migrating paper tests to screen:

1. On-screen tools designed to replicate pen and paper may be invalid

Migrated tests often include on-screen tools which allow learners to respond in ways as close as possible to how they respond on paper. For example, an equation editor may allow them to write equations, or a digital protractor may be dragged across the screen to measure an angle. In some sense, this is like using a smartphone to send Morse code: it’s possible, but why would you do it? What value does it add and how meaningful is it for learners? The purpose of Morse code was to communicate immediately and securely – there are now more effective ways of doing that.

Importantly, let’s think about the quality criteria we apply to our assessments: validity, reliability, and fairness. Is using technology to replicate how a learner would work on paper fair, reliable, and valid? Does it have a positive impact on teaching and learning? The risk is that using tools to replicate pen and paper adds unintended demands and so threatens validity.

Another threat to validity is that migrated questions sometimes don’t work because the way learners interact with them on screen differs from how they interact with them on paper. For example, candidates who work on screen are less likely to use jotting paper, which changes the mental processes they use to reach their answers(1). The result can be that migrated assessments don’t cover the same range of content, item types or skills as the paper exam, or that what is assessed is subtly different. If on-screen and paper assessments don’t assess the same things when they are expected to, this can lead to wrong and unfair decisions.

2. Potential mismatch between learning and assessment

High stakes assessment is known to drive teaching and learning, and moving to digital assessment can have implications for both. A mismatch between the teaching and the assessment, including response types and the skills and behaviours assessed, puts validity and fairness at risk. For example, if learners are taught to construct graphs using paper methods and the assessment is on-screen, this could be unfair. Equally, if learners are taught using an on-screen graphing tool, we need to understand how using that tool affects their conceptual understanding.

3. Limitations to achieving comparability

We are interested in achieving comparability in educational assessment (verging on obsessed by it), yet comparability between on-screen and paper-based tests can be challenging to achieve. And requiring them to be comparable can hinder innovation. However, thinking digitally about assessment could help us reconceptualise comparability. For example, we could think about comparability in terms of the currency an assessment gives learners to progress (rather than the demand of the questions), or in terms of the transferable skills learners develop (rather than the content coverage). At Cambridge, we are thinking about what comparability means in the digital context.

What the research says

Some subject areas are more impacted by the risks of migration than others. Subjects with a lot of diagrams, annotation, graphs, and equations are most vulnerable. Research on migrating high stakes exams to screen has found:

  • learners perform slightly differently in different modes(2)
  • the gap between performance on paper and on-screen is largest for questions which require annotation or are visual or graphical(3)
  • the gap between paper and screen is bigger for lower-ability learners(2)

And, as the regulator of assessments in England, Ofqual(4), recognises, these issues can be a barrier to adoption.

Why does this matter?

Paper assessment has limitations which have shaped what high stakes exams assess, and how, for decades. On-screen assessments also have limitations. But if we try to replicate paper exams by moving them on-screen, we unnecessarily take on the limitations of both paper AND screen, and so fail to deliver the potential opportunities of using technology.

When we try to migrate paper-based assessments to screen, the focus becomes what technology can’t do. In contrast, a transformational approach embraces what technology can do and how this can transform curricula, teaching and learning, and, by extension, assessment.

We don’t think that the long-term future of digital assessment is lifting and shifting paper exams. We could migrate all of our existing high stakes assessments, but we know we would miss opportunities and create issues. The opportunities presented by digital are worth taking. In the UK we hear concerns that the current assessment system is not fit for purpose(5). Suggestions for improvement (for example, assessing transferable skills in authentic ways using a variety of assessment types) could be supported by new assessments and curricula which make innovative use of technology. The assessments we are developing now, in line with our transformational strategy, aim to do just that.

References:

1. Johnson, M. & Green, S. (2006). On-Line Mathematics Assessment: The Impact of Mode on Performance and Question Answering Strategies. Journal of Technology, Learning, and Assessment, 4(5).

2. Threlfall, J., Pool, P., Homer, M., & Swinnerton, B. (2007). Implicit aspects of paper and pencil mathematics assessment that come to light through the use of the computer. Educational Studies in Mathematics, 66(3), 335–348.

3. Hughes, S., Green, C., & Greene, V. (2011). Report on current state of the art in formative and summative assessment in IBE in STM – Part II.

4. Ofqual (2020). Online and on-screen assessment in high stakes, sessional qualifications: A review of the barriers to greater adoption and how these might be overcome. Ofqual/20/6723/1.

5. For example: HMC, The State of Education – time to talk; Independent Assessment Commission, Interim report: The Future of Assessments and Qualifications in the UK; and Bill Lucas, Rethinking Assessment in education: the case for change (Rethinking Assessment).
