Jo Ireland

Since joining the Research Division in 1998, I have worked on various comparability and validation projects. Much of my work concerns comparison of assessments over time, between qualifications or between jurisdictions. Since 2004, I have been involved with the unique ‘Aspects of Writing’ research series, which periodically explores changes in students’ writing over time.

One recent strand of work has involved investigating approaches to error management in high reliability organisations and industries such as aviation, and what can be learnt from these approaches when improving assessment processes.

Publications

2020

Perspectives on curriculum design: comparing the spiral and the network models

Ireland, J. and Mouthaan, M. (2020). Perspectives on curriculum design: comparing the spiral and the network models. Research Matters: A Cambridge Assessment publication, 30, 7-12.

Does one approach fit all when it comes to curriculum design? In debates on curriculum design, educators have argued that a curriculum model should take into account the differing knowledge structures of different subjects. Subjects such as maths and science are generally defined as well-structured knowledge domains, characterised by a linearity in learning objectives, and well-defined and predictable learning outcomes. Less structured subjects such as the arts and humanities could, however, benefit from models that encompass a different approach to learning. Two competing perspectives on curriculum design have emerged: the spiral model developed by Bruner in 1960, and non-linear models based on processes of learning in different knowledge domains. Research on curriculum design has tended to focus on the needs of science, technology, engineering and maths (STEM) subjects. Many alternative models to the spiral have come from arts-based disciplines, in particular visual arts.

This article contributes to the ongoing debate about curriculum design in different subjects. It details the key characteristics of Bruner’s spiral model, and presents the main arguments made in favour of adopting flexible and non-linear curriculum models in specific subjects. We discuss a number of alternatives to the spiral model and analyse the relative strengths and weaknesses of these different approaches. The conclusion offers a discussion of implications of our findings for further research in curriculum design.

2019

Re-heated meals: Revisiting the teaching, learning and assessment of practical cookery in schools
Elliott, G., and Ireland, J. (2019). Re-heated meals: Revisiting the teaching, learning and assessment of practical cookery in schools. Presented at the 20th Annual AEA-Europe conference, Lisbon, Portugal, 13-16 November 2019.
Studying English and Mathematics at Level 2 post-16: issues and challenges

Ireland, J. (2019). Studying English and Mathematics at Level 2 post-16: issues and challenges. Research Matters: A Cambridge Assessment publication, 28, 26-33.

The view of the government in England is that good English and mathematics knowledge is key to employment and education prospects. Students who do not achieve a GCSE grade 4 or above in these subjects must continue to study English and mathematics alongside their chosen post-16 studies. This article outlines the background to this situation, then brings together findings from research literature on the challenges and issues faced by students and teachers affected by this requirement. Focusing on GCSE resits and Functional Skills qualifications, the article identifies ways in which post-16 students and teachers can be supported, and considers whether their support needs differ according to the qualification students are working towards.

2017

Is the General Certificate of Secondary Education (GCSE) in England incongruous in the light of other jurisdictions’ approaches to assessment?
Elliott, G., Rushton, N. and Ireland, J. (2017). Is the General Certificate of Secondary Education (GCSE) in England incongruous in the light of other jurisdictions’ approaches to assessment? Presented at the 18th annual AEA-Europe conference, Prague, 9-11 November 2017.

2016

Employers' views on assessment design in vocational qualifications: a preliminary study
Vitello, S., Carroll, P., Greatorex, J. and Ireland, J. (2016). Paper presented at the European Conference on Educational Research (ECER), Dublin, Ireland, 23-26 August 2016.

2015

Piloting a method for comparing examination question paper demands
Chambers, L., Greatorex, J., Constantinou, F. and Ireland, J. (2015). Paper presented at the AEA-Europe annual conference, Glasgow, Scotland, 4-7 November 2015.
Piloting a method for comparing examination question paper demands
Greatorex, J., Chambers, L., Constantinou, F. and Ireland, J. (2015).  Paper presented at the British Educational Research Association (BERA) conference, Belfast, UK, 14-17 September 2015.
Linking instructional verbs from assessment criteria to mode of assessment
Greatorex, J., Ireland, J., Carroll, P. and Vitello, S. (2015). Paper presented at the Journal of Vocational Education and Training (JVET) conference, Oxford, 3-5 July 2015.

2013

Do the questions from A and AS Level Economics exam papers elicit responses that reflect the intended construct?
Greatorex, J., Shaw, S., Hodson, P. and Ireland, J. (2013). Poster presented at the British Educational Research Association (BERA) conference, Brighton, 3-5 September 2013.
Using scales of cognitive demand in a validation study of Cambridge International A and AS level Economics

Greatorex, J., Shaw, S., Hodson, P., Ireland, J. and Werno, M. (2013). Paper presented at the British Education Studies Association conference, Swansea, 27-28 June 2013.

Using scales of cognitive demand in a validation study of Cambridge International A and AS level Economics

Greatorex, J., Shaw, S., Hodson, P. and Ireland, J. (2013). Using scales of cognitive demand in a validation study of Cambridge International A and AS level Economics. Research Matters: A Cambridge Assessment publication, 15, 29-37.

The research aims to map the cognitive demand of examination questions in A and AS Level Economics. To this end we used the CRAS (complexity, resources, abstractness, strategy) framework, an established way of analysing the cognitive demand of examination questions. Six subject experts applied the CRAS framework to selected question papers which included multiple choice, essay and data response items. Each subject expert rated the level of cognitive demand of each question twice: once without reference to the mark scheme and once with reference to it. Ratings without the mark scheme indicate how demanding the questions appear. Ratings with the mark scheme indicate the cognitive demands rewarded by the mark scheme. Analysis showed that the demands elicited by the questions were similar to those rewarded by the mark scheme, which is evidence of validity. The findings are used to explore using CRAS with different types of items (multiple choice, essay and data response).

2009

Assessment instruments over time

Elliott, G., Curcin, M., Bramley, T., Ireland, J., Gill, T. and Black, B. (2009). Assessment instruments over time. Research Matters: A Cambridge Assessment publication, 7, 23-25.

As Cambridge Assessment celebrated its 150th anniversary in 2008, members of the Evaluation and Psychometrics Team looked back at question papers over the years. Details of the question papers and examples of questions were used to illustrate the development of seven subjects: Mathematics, Physics, Geography, Art, French, Cookery and English Literature. Two clear themes emerged from the work across most subjects: an increasing emphasis on real-world contexts in more recent years, and an increasing choice of topic areas and question/component options available to candidates.

2008

Assessment Instruments over Time
Elliott, G., Black, B., Ireland, J., Gill, T., Bramley, T., Johnson, N. and Curcin, M. (2008). Paper presented at the International Association for Educational Assessment (IAEA) conference, Cambridge.

Research Matters

Research Matters 28: Autumn 2019

Research Matters is our free biannual publication which allows us to share our assessment research, in a range of fields, with the wider assessment community.