Jo Ireland

Jo is a researcher in the Comparability and Change group of the Research Division. One recent strand of her work has focused on error prevention, including the development of a set of principles for minimising errors in assessment instruments, drawing on what can be learned from high-reliability industries such as aviation and medicine. She and colleagues have since used those principles in a project to improve question paper production processes by evaluating and redeveloping the checking materials used by assessors.

Jo is also involved in producing data management policies for the Research Division and increasing our knowledge of various aspects of research data management.

Publications

2024

Comparing curricula from different regions: a common practice revamped by using MAXQDA

Greatorex, J., & Ireland, J. (2024, February 28-March 1). Comparing curricula from different regions: a common practice revamped by using MAXQDA [Poster presentation]. MAXQDA International Conference, Berlin, Germany. https://www.maxqda.com/wp/wp-content/uploads/sites/2/JGJI_Indigenous-Knowledge.pdf

2023

Multiple marking using the Levels-only method for A level English Literature
Ireland, J., & de Groot, E. (2023, November 1-4). Multiple marking using the Levels-only method for A level English Literature [Paper presentation]. Annual conference of the Association for Educational Assessment – Europe (AEA-Europe), Malta. https://2023.aea-europe.net/
An example of redeveloping checklists to support assessors who check draft exam papers for errors

Vitello, S., Crisp, V., & Ireland, J. (2023). An example of redeveloping checklists to support assessors who check draft exam papers for errors. Research Matters: A Cambridge University Press & Assessment publication, 36, 46-58. https://doi.org/10.17863/CAM.101744

Assessment materials must be checked for errors before they are presented to candidates. Any errors have the potential to reduce validity. For example, in the most extreme cases, an error may turn an otherwise well-designed exam question into one that is impossible to answer. At Cambridge University Press & Assessment, assessment materials are checked by multiple assessment specialists across different stages during assessment development. While human checkers are critical to this process, we must acknowledge that there is ample research showing the shortcomings of being human (e.g., we have cognitive biases, and memory and attentional limitations). It is important to provide assessment checkers with tools that help overcome or mitigate these limitations.

This article is about one type of checking tool – checklists. We describe a research-informed, collaborative project to support assessors in performing their checks of exam papers. This project focused on redesigning the instructional, training and task materials provided to assessors. A key part of this was to design checklists for assessors to use when performing their checks. In this article, we focus primarily on the approach that we took for these checklists in order to draw readers’ attention to the complexity that is involved in designing them and to provide a practical example of how research can be used strategically to inform key design decisions.

Research Matters 36: Autumn 2023
  • Foreword Tim Oates
  • Editorial Tom Bramley
  • The prevalence and relevance of Natural History assessments in the school curriculum, 1858–2000: a study of the Assessment Archives Gillian Cooke
  • The impact of GCSE maths reform on progression to mathematics post-16 Carmen Vidal Rodeiro, Joanna Williamson
  • An example of redeveloping checklists to support assessors who check draft exam papers for errors Sylvia Vitello, Victoria Crisp, Jo Ireland
  • An analysis of the relationship between Secondary Checkpoint and IGCSE results Tim Gill
  • Synchronous hybrid teaching: how easy is it for schools to implement? Filio Constantinou
  • Research News Lisa Bowett

2022

A structure for analysing features of digital assessments that may affect the constructs assessed
Crisp, V. & Ireland, J. (2022). A structure for analysing features of digital assessments that may affect the constructs assessed. Cambridge University Press & Assessment.
Register of Change Part 2: 2010-2021
Rushton, N. & Ireland, J. (2022). Register of Change Part 2: 2010-2021. Cambridge University Press & Assessment Research Report. Cambridge, UK: Cambridge University Press & Assessment.

2021

Principles for minimising errors in examination papers and other educational assessment instruments
Suto, I., and Ireland, J. (2021). Principles for minimising errors in examination papers and other educational assessment instruments. International Journal of Assessment Tools in Education, 8(2), 310-325.
On reducing errors in assessment instruments
Suto, I., Williamson, J., Ireland, J., and Macinska, S. (2021). On reducing errors in assessment instruments. Research Papers in Education (ahead of print).

2020

Perspectives on curriculum design: comparing the spiral and the network models

Ireland, J. and Mouthaan, M. (2020). Perspectives on curriculum design: comparing the spiral and the network models. Research Matters: A Cambridge Assessment publication, 30, 7-12.

Does one approach fit all when it comes to curriculum design? In debates on curriculum design, educators have argued that a curriculum model should take into account the differing knowledge structures of different subjects. Subjects such as maths and science are generally defined as well-structured knowledge domains, characterised by a linearity in learning objectives, and well-defined and predictable learning outcomes. Less structured subjects such as the arts and humanities could, however, benefit from models that encompass a different approach to learning. Two competing perspectives on curriculum design have emerged: the spiral model developed by Bruner in 1960, and non-linear models based on processes of learning in different knowledge domains. Research on curriculum design has tended to focus on the needs of science, technology, engineering and maths (STEM) subjects. Many alternative models to the spiral have come from arts-based disciplines, in particular visual arts.

This article contributes to the ongoing debate about curriculum design in different subjects. It details the key characteristics of Bruner’s spiral model, and presents the main arguments made in favour of adopting flexible and non-linear curriculum models in specific subjects. We discuss a number of alternatives to the spiral model and analyse the relative strengths and weaknesses of these different approaches. The conclusion offers a discussion of implications of our findings for further research in curriculum design.

Research Matters 30: Autumn 2020
  • Foreword Tim Oates, CBE
  • Editorial Tom Bramley
  • A new Cambridge Assessment Archive collection exploring Cambridge English exams in Germany and England in JPLO Gillian Cooke
  • Perspectives on curriculum design: comparing the spiral and the network models Jo Ireland, Melissa Mouthaan
  • Context matters: Adaptation guidance for developing a local curriculum from an international curriculum framework Sinead Fitzsimons, Victoria Coleman, Jackie Greatorex, Hiba Salem, Martin Johnson
  • Setting and reviewing questions on-screen: issues and challenges Victoria Crisp, Stuart Shaw
  • A way of using taxonomies to demonstrate that applied qualifications and curricula cover multiple domains of knowledge Irenka Suto, Jackie Greatorex, Sylvia Vitello, Simon Child
  • Research News Anouk Peigne

2019

Re-heated meals: Revisiting the teaching, learning and assessment of practical cookery in schools
Elliott, G., and Ireland, J. (2019). Re-heated meals: Revisiting the teaching, learning and assessment of practical cookery in schools. Presented at the 20th Annual AEA-Europe conference, Lisbon, Portugal, 13-16 November 2019.
Research Matters 28: Autumn 2019
  • Foreword Tim Oates, CBE
  • Editorial Tom Bramley
  • Which is better: one experienced marker or many inexperienced markers? Tom Benton
  • "Learning progressions": A historical and theoretical discussion Tom Gallacher, Martin Johnson
  • The impact of A Level subject choice and students' background characteristics on Higher Education participation Carmen Vidal Rodeiro
  • Studying English and Mathematics at Level 2 post-16: issues and challenges Jo Ireland
  • Methods used by teachers to predict final A Level grades for their students Tim Gill
  • Research News David Beauchamp
Studying English and Mathematics at Level 2 post-16: issues and challenges

Ireland, J. (2019). Studying English and Mathematics at Level 2 post-16: issues and challenges. Research Matters: A Cambridge Assessment publication, 28, 26-33.

The view of the government in England is that good English and mathematics knowledge is key to employment and education prospects. Students who do not achieve a GCSE grade 4 or above in these subjects must continue to study English and mathematics alongside their chosen post-16 studies. This article outlines the background to this situation, then brings together findings from research literature on the challenges and issues faced by students and teachers affected by this requirement. Focusing on GCSE resits and Functional Skills qualifications, the article identifies ways in which post-16 students and teachers can be supported and whether their support needs differ according to the qualification students are working towards.

2017

Is the General Certificate of Secondary Education (GCSE) in England incongruous in the light of other jurisdictions’ approaches to assessment?
Elliott, G., Rushton, N. and Ireland, J. (2017). Is the General Certificate of Secondary Education (GCSE) in England incongruous in the light of other jurisdictions’ approaches to assessment? Presented at the 18th annual AEA-Europe conference, Prague, 9-11 November 2017.

2016

Research Matters Special Issue 4: Aspects of Writing 1980-2014
  • Variations in aspects of writing in 16+ English examinations between 1980 and 2014 Gill Elliott, Sylvia Green, Filio Constantinou, Sylvia Vitello, Lucy Chambers, Nicky Rushton, Jo Ireland, Jessica Bowyer, David Beauchamp
Employers' views on assessment design in vocational qualifications: a preliminary study
Vitello, S., Carroll, P., Greatorex, J. and Ireland, J. (2016). Employers' views on assessment design in vocational qualifications: a preliminary study. Paper presented at the European Conference on Educational Research (ECER), Dublin, Ireland, 23-26 August 2016.

2015

Piloting a method for comparing examination question paper demands
Chambers, L., Greatorex, J., Constantinou, F. and Ireland, J. (2015). Piloting a method for comparing examination question paper demands. Paper presented at the AEA-Europe annual conference, Glasgow, Scotland, 4-7 November 2015.
Piloting a method for comparing examination question paper demands
Greatorex, J., Chambers, L., Constantinou, F. and Ireland, J. (2015). Piloting a method for comparing examination question paper demands. Paper presented at the British Educational Research Association (BERA) conference, Belfast, UK, 14-17 September 2015.
Linking instructional verbs from assessment criteria to mode of assessment
Greatorex, J., Ireland, J., Carroll, P. and Vitello, S. (2015). Linking instructional verbs from assessment criteria to mode of assessment. Paper presented at the Journal of Vocational Education and Training (JVET) conference, Oxford, 3-5 July 2015.

2013

Do the questions from A and AS Level Economics exam papers elicit responses that reflect the intended construct?
Greatorex, J., Shaw, S., Hodson, P. and Ireland, J. (2013). Do the questions from A and AS Level Economics exam papers elicit responses that reflect the intended construct? Poster presented at the British Educational Research Association (BERA) conference, Brighton, 3-5 September 2013.
Using scales of cognitive demand in a validation study of Cambridge International A and AS level Economics

Greatorex, J., Shaw, S., Hodson, P., Ireland, J. and Werno, M. (2013). Using scales of cognitive demand in a validation study of Cambridge International A and AS level Economics. Paper presented at the British Education Studies Association conference, Swansea, 27-28 June 2013.

Using scales of cognitive demand in a validation study of Cambridge International A and AS level Economics

Greatorex, J., Shaw, S., Hodson, P. and Ireland, J. (2013). Using scales of cognitive demand in a validation study of Cambridge International A and AS level Economics. Research Matters: A Cambridge Assessment publication, 15, 29-37.

The research aims to map the cognitive demand of examination questions in A and AS Level Economics. To this end, we used the CRAS (complexity, resources, abstractness, strategy) framework, an established way of analysing the cognitive demand of examination questions. Six subject experts applied the CRAS framework to selected question papers which included multiple choice, essay and data response items. That is, each subject expert rated the level of cognitive demand of each question twice: once without reference to the mark scheme and once with reference to the mark scheme. Ratings without the mark scheme indicate how demanding the questions appear. Ratings with the mark scheme indicate the cognitive demands rewarded by the mark scheme. Analysis showed that the demands elicited by the questions were similar to those rewarded by the mark scheme, which is evidence of validity. The findings are used to explore using CRAS with different types of items (multiple choice, essay and data response).

2011

Assessment instruments over time

Elliott, G., Curcin, M., Johnson, N., Bramley, T., Ireland, J., Gill, T. & Black, B. (2011). Assessment instruments over time. Research Matters: A Cambridge University Press & Assessment publication, A selection of articles, 2-4. First published in Research Matters, Issue 7, January 2009.

As Cambridge Assessment celebrated its 150th anniversary in 2008, members of the Evaluation and Psychometrics Team looked back at question papers over the years. Details of the question papers and examples of questions were used to illustrate the development of seven subjects: Mathematics, Physics, Geography, Art, French, Cookery and English Literature. Two clear themes emerged from the work across most subjects: an increasing emphasis on real-world contexts in more recent years, and an increasing choice of topic areas and question/component options available to candidates.

2009

Assessment instruments over time

Elliott, G., Curcin, M., Bramley, T., Ireland, J., Gill, T. and Black, B. (2009). Assessment instruments over time. Research Matters: A Cambridge Assessment publication, 7, 23-25.

As Cambridge Assessment celebrated its 150th anniversary in 2008, members of the Evaluation and Psychometrics Team looked back at question papers over the years. Details of the question papers and examples of questions were used to illustrate the development of seven subjects: Mathematics, Physics, Geography, Art, French, Cookery and English Literature. Two clear themes emerged from the work across most subjects: an increasing emphasis on real-world contexts in more recent years, and an increasing choice of topic areas and question/component options available to candidates.

2008

Assessment Instruments over Time
Elliott, G., Black, B., Ireland, J., Gill, T., Bramley, T., Johnson, N. and Curcin, M. (2008). Assessment Instruments over Time. Paper presented at the International Association for Educational Assessment (IAEA) Conference, Cambridge.

Research Matters

Research Matters is our free biannual publication, which allows us to share our assessment research across a range of fields with the wider assessment community.