Embarking on e-assessment: expectations, impediments and lessons learnt

by The Assessment Network, 15 February 2021

Many organisations are taking their first steps into the world of e-assessment, a move that has only been accelerated by the global pandemic. Last October, Sarah Hughes, Senior Research Manager at OCR, and Sanjay Mistry, Head of On-Screen Research at Cambridge International, delivered a workshop for the Cambridge Assessment Network exploring the considerations of embarking on e-assessment. The workshop focused on three case studies, two from Cambridge English and one from Cambridge International, which provided insights and lessons learnt in adopting e-assessment. In this blog, Sanjay outlines some of the key areas that the workshop covered.

Getting started

E-assessment can be defined as an ‘end-to-end electronic assessment process where ICT is used for the presentation of assessment activity and the recording of responses’[1].

When considering the first stages in developing an e-assessment, we need to consider the drivers and benefits. A starting point would be to think about what questions we need answers to when developing and delivering e-assessments. 

To focus our questioning, it is sensible to split the e-assessment process into the following areas:

  • Test Specification
  • Test Construction
  • Student Experience
  • Marking
  • Grading
  • Results and Feedback[2]

Benefits of e-assessment

E-assessments bring many benefits, not least the ability to assess skills and constructs that cannot be assessed on paper. In medicine, for example, virtual reality is used to assess keyhole surgery skills, removing any potential risk to a patient.

Other benefits include the ability within an e-assessment to use different item types, rich stimulus materials and enhanced interactivity, as well as the on-screen appeal to candidates, especially those who have varying accessibility requirements.

There are also perceived efficiency benefits after the assessment has been taken. These include reduced administration and logistics costs, fewer human errors, greater use of auto-marking, the gathering and storage of large amounts of data, faster turnaround of grades and the ability to provide instant feedback.

In one of the case studies presented, the benefits of having speaking tests as a computer-based e-assessment were that:

  • It allowed for consistent delivery of test content
  • It allowed for assessing integrated skills
  • On-demand testing was possible

Barriers and challenges

The barriers and challenges are also well known when considering e-assessment. 

These can be grouped around cultural and political barriers (rivalry within the industry, country differences, regulations, educational research lag), system or technology barriers (hardware availability, compatibility, network issues, security, technology driving the assessment) and across the process areas as described previously. 

Within Test Specification, an on-screen test could inadvertently assess skills not relevant to the test (e.g. keyboard and typing skills); other challenges include student preparation and familiarity with item and response types, and the perceived risk of an increase in cheating. For the marking and grading of e-assessments there could be difficulty in awarding partial credit and a potential change in the exam standard.

Accessibility is key

When constructing and delivering e-assessments it is important to make them accessible to as many candidates as possible. Four accessibility areas to consider are:

  • The assessment content
  • The test design and functionality
  • Use of Assistive Technology
  • Software or platform embedded accessibility tools

Transition to on-screen assessment

When awarding organisations are managing the transition from paper to on-screen e-assessment, it is important to consider how the assessment changes across the two mediums. The first step in this process is usually to translate paper content to screen, rather than designing an on-screen-first assessment.

The following questions may help to focus our thinking during this paper-based to onscreen content translation process:

  • How far is the intent to change the purpose and/or any of the core aspects of designing, developing, delivering and reporting the test?
  • What are the possible consequences to score meaning? How will that affect the decisions we want to make? 
  • What can we do to preserve score meaning across changes to ensure validity, fairness and credibility? 

It is also worth considering the impact of this change on the test specification: the apparent constraints, the quality criteria guiding the process, the tools that are going to be used, and the new processes that need to be put in place for a controlled transition.

Practical considerations

Some practical takeaway considerations for getting started in e-assessment, highlighted in the case studies presented, were:

  • Use medium agnostic, cognitive command words rather than medium specific technical command words when writing assessment content
  • Consider the exposure and familiarity of candidates to software and platforms being used for e-assessment
  • Bear in mind that learners’ ability to execute a function on-screen may be related to their age
  • Ensure that on-screen items are designed to remove excessive cognitive load for the candidate
  • Carefully design the user interface to ensure it is not cluttered
  • Ensure that assessment content is suitable to be used in a digital context
  • Consider carefully how best to use technology, for example in speaking tests always investigate the comparability of different forms of test delivery (e.g. face-to-face vs. online)
  • Consider contingency measures to plan for unexpected technical failures
  • Always trial and pilot products, iterate and change as you go
  • Think not only about the user requirements but also about the requirements of other stakeholders, for example teachers. Do they require familiarisation, training and professional development?

With the explosion of digital tools in the teaching and learning space as a result of the 2020 pandemic, and the increased familiarity with digital media that has followed, now is the time for e-assessment.

So my final piece of advice would be to consider carefully the approach you take and try not to do everything at once. Prioritise the parts of the e-assessment process that will most benefit learners and relevant stakeholders, tackling one stage at a time.

Embarking on e-assessment: expectations, impediments and lessons learnt, a workshop from Cambridge Assessment Network, will run again in September.

The Assessment Network is an accredited provider of assessment training and professional development and exists in order to provide high-quality assessment training, underpinned by research, to Cambridge Assessment staff and the wider assessment community.

Find out about our upcoming assessment practitioner workshops on our events pages.

Footnotes

[1] Llamas-Nistal, Fernández-Iglesias, González-Tato and Mikic-Fonte (2013)

[2] Whitelock, Reudel and Mackenzie (2006); Alruwais, Wills and Wald (2018); Hegarty (2018)

References

Alruwais, N., Wills, G. and Wald, M. (2018) Advantages and Challenges of Using e-Assessment. International Journal of Information and Education Technology, Vol. 8, No. 1, January 2018.

Hegarty, G. (2018) The Journey to e-assessment. Presentation at the RM Results ‘2020 and beyond’ event, Scottish Qualifications Authority, 2018. https://vimeo.com/299614645

Llamas-Nistal, M., Fernández-Iglesias, M. J., González-Tato, J. and Mikic-Fonte, F. (2013) Blended e-assessment: Migrating classical exams to the digital world. Computers & Education, Vol. 62, March 2013, pp. 72–87. https://doi.org/10.1016/j.compedu.2012.10.021

Whitelock, D., Reudel, C. and Mackenzie, D. (2006) E-assessment: Case studies of effective and innovative practice. JISC (Joint Information Systems Committee), 2006.

