Jackie Greatorex

Since joining Cambridge Assessment I have researched a range of assessment topics including construct validation, comparability, reliability, grading, standardisation of assessors’ judgements (in academic and vocational settings), grade descriptors, examiners’ cognition, context in examination questions, and the cognitive demand of examination questions. I have also studied wider education and curriculum themes, such as teaching approaches in A Level Chemistry and how ‘Application of Number’ teaching was organised in schools and colleges.

Prior to joining Cambridge Assessment I researched the reliability of medics’ judgements when checking mammograms.

I am a psychologist and an Associate Fellow of the British Psychological Society. I hold an MEd from the University of Bristol and an MA from Cambridge University. For my PhD, which I obtained at the University of Derby, I investigated learning in healthcare degrees. The research drew on psychology, andragogy and curriculum theory.

My team’s research focuses on education and curriculum. The scope of our work is wide-ranging, encompassing all ages, subjects (academic and vocational) and jurisdictions. This builds on the curriculum theory I studied during my PhD and gives us the opportunity to research a variety of key education and curriculum matters.

Publications

2018

A review of instruments for assessing complex vocational competence

Greatorex, J., Johnson, M. & Coleman, V. (2017). A review of instruments for assessing complex vocational competence. Research Matters: A Cambridge Assessment publication, 23, 35-42.

The aim of the research was to explore the measurement qualities of checklists and Global Rating Scales (GRS) in the context of assessing complex competence. Firstly, we reviewed the literature about the affordances of human judgement and the mechanical combination of human judgements. Secondly, we reviewed examples of checklists and GRS which are used to assess complex competence in highly regarded professions. These examples served to contextualise and elucidate assessment matters. Thirdly, we compiled research evidence from the outcomes of systematic reviews which compared the advantages and disadvantages of checklists and GRS. Together the evidence provides a nuanced and firm basis for conclusions. Overall, the literature shows that mechanical combination can outperform the human integration of evidence when assessing complex competence, and that therefore a good use of human judgements is in making decisions about individual traits, which are then mechanically combined. The weight of evidence suggests that GRS generally achieve better reliability and validity than checklists, but that a high quality checklist is better than a poor quality GRS. The review is a reminder that including assessors in the design of assessment instruments and processes can help to maximise manageability.

2016

Extending educational taxonomies from general to applied education: Can they be used to write and review assessment criteria?
Greatorex, J. and Suto, I. (2016). Paper presented at the 8th Biennial Conference of the European Association for Research in Learning and Instruction (EARLI) SIG 1 - Assessment and Evaluation, Munich, Germany, 24-26 August 2016
Employers' views on assessment design in vocational qualifications: a preliminary study
Vitello, S., Carroll, P., Greatorex, J. and Ireland, J. (2016). Paper presented at the European Conference on Educational Research (ECER), Dublin, Ireland, 23-26 August 2016.

2015

Piloting a method for comparing examination question paper demands
Chambers, L., Greatorex, J., Constantinou, F. and Ireland, J. (2015). Paper presented at the AEA-Europe annual conference, Glasgow, Scotland, 4-7 November 2015.
Piloting a method for comparing examination question paper demands
Greatorex, J., Chambers, L., Constantinou, F. and Ireland, J. (2015).  Paper presented at the British Educational Research Association (BERA) conference, Belfast, UK, 14-17 September 2015.
Do experts’ views of specification demands correspond with established educational taxonomies?
Greatorex, J., Rushton, N., Mehta, S. and Grayson, R. (2015). Do experts’ views of specification demands correspond with established educational taxonomies? Online Educational Research Journal. (Advance online publication).
Linking instructional verbs from assessment criteria to mode of assessment
Greatorex, J., Ireland, J., Carroll, P. and Vitello, S. (2015). Paper presented at the Journal of Vocational Education and Training (JVET) conference, Oxford, 3-5 July 2015.

2014

Context in Mathematics questions

Greatorex, J. (2014). Context in Mathematics questions. Research Matters: A Cambridge Assessment publication, 17, 18-23.

For at least two decades educationalists have debated whether Mathematics examination questions should be set in context. The aim of this article is to revisit the debate to answer the following questions: 1. What are the advantages and disadvantages of examining Mathematics in context? 2. What are the features of a high quality context? Initially several taxonomies (categories or classification systems) of context are reviewed and the research methods for evaluating the effects of context are considered. Subsequently, the advantages and disadvantages of using context in Mathematics examination questions are explored, focusing on research about public examinations in secondary school Mathematics in England. The literature is used to make recommendations about context in Mathematics questions.

2013

How can major research findings about returns to qualifications illuminate the comparability of qualifications?

Greatorex, J. (2013). Paper presented at the Journal of Vocational Education and Training (JVET) conference, Oxford, 5-7 July 2013, and the British Educational Research Association (BERA) conference, Brighton, 3-5 September 2013.

Do the questions from A and AS Level Economics exam papers elicit responses that reflect the intended construct?
Greatorex, J., Shaw, S., Hodson, P. and Ireland, J. (2013). Poster presented at the British Educational Research Association (BERA) conference, Brighton, 3-5 September 2013.
Using scales of cognitive demand in a validation study of Cambridge International A and AS level Economics

Greatorex, J., Shaw, S., Hodson, P., Ireland, J. and Werno, M. (2013). Paper presented at British Education Studies Association conference, Swansea, 27-28 June 2013

Using scales of cognitive demand in a validation study of Cambridge International A and AS level Economics

Greatorex, J., Shaw, S., Hodson, P. and Ireland, J. (2013). Using scales of cognitive demand in a validation study of Cambridge International A and AS level Economics. Research Matters: A Cambridge Assessment publication, 15, 29-37.

The research aims to map the cognitive demand of examination questions in A and AS Level Economics. To this end we used the CRAS (complexity, resources, abstractness, strategy) framework, an established way of analysing the cognitive demand of examination questions. Six subject experts applied the CRAS framework to selected question papers which included multiple choice, essay and data response items. Each subject expert rated the level of cognitive demand of each question twice: once without reference to the mark scheme and once with reference to the mark scheme. Ratings without the mark scheme indicate how demanding the questions appear. Ratings with the mark scheme indicate the cognitive demands rewarded by the mark scheme. Analysis showed that the demands elicited by the questions were similar to those rewarded by the mark scheme, which is evidence of validity. The findings are used to explore using CRAS with different types of items (multiple choice, essay and data response).

2012

A method for comparing the demands of specifications
Greatorex, J. and Mehta, S. (2012) Paper presented at the British Educational Research Association Conference, Manchester, 4-6 September 2012 and the European Conference on Educational Research, Cádiz, 18-21 September 2012
The validity of teacher assessed Independent Research Reports contributing to Cambridge Pre-U GPR
Greatorex, J. and Shaw, S. (2012) Paper presented at the British Educational Research Association (BERA) conference, Manchester, 4-6 September 2012
The validity of teacher assessed Independent Research Reports contributing to Cambridge Pre-U Global Perspectives and Research

Greatorex, J. and Shaw, S. (2012). The validity of teacher assessed Independent Research Reports contributing to Cambridge Pre-U Global Perspectives and Research. Research Matters: A Cambridge Assessment publication, 14, 38-41.

This research considered the validity of tutor-assessed, pre-university independent research reports. Evidence of construct relevance in tutors’ interpretations of the levels awarded for candidates’ research processes was investigated. The research process included candidates designing, planning, managing and conducting their own research project using techniques and methods appropriate to the subject discipline. The research was conducted in the context of the Cambridge International Pre-U Global Perspectives and Independent Research qualification (the GPR), a pre-university qualification for 16-19 year olds which is designed to equip students with the skills required to make a success of their university studies. Tutors’ justifications for the levels they gave candidates were considered. In the first of two studies (Study 1), tutor justifications were qualitatively analysed for specific tutor behaviours that might indicate tutors interpreting levels in a construct-irrelevant way. In the second study (Study 2), external moderators (EMs) rated the justifications according to the extent to which they reflected the intended constructs. Study 1 showed little evidence of construct irrelevance, and Study 2 provided strong evidence of construct relevance in tutors’ interpretation of the levels they awarded candidates for the research process.

Piloting a method for comparing the demand of vocational qualifications with general qualifications

Greatorex, J. and Shiell, H. (2012). Piloting a method for comparing the demand of vocational qualifications with general qualifications. Research Matters: A Cambridge Assessment publication, 14, 29-38.

Frequently, researchers are tasked with comparing the demand of vocational and general qualifications, and methods of comparison often rely on human judgement. Therefore, the research aims to develop an instrument to compare vocational and general qualifications, pilot the instrument and explore how experts judge demand. Reading a range of OCR (Oxford Cambridge and Royal Society of Arts Examinations) level 2 specifications illustrated that they included knowledge, skills and understanding from five domains: the affective, cognitive, interpersonal, metacognitive and psychomotor domains. Therefore, these domains were included in the instrument. Four cognate units were included in the study. Four experts participated, each familiar with at least one unit. Each expert read pairs of unit specifications and judged which was more demanding in each domain (affective, cognitive, interpersonal, metacognitive and psychomotor). Subsequently, they completed a questionnaire about their experience. The results are presented. It was found that the demands instrument was suitable for comparing the demand of cognate units from vocational and general qualifications.

2011

Comparing specifications in a diverse qualifications system: instrument development
Greatorex, J., Rushton, N., Mehta, S. and Hopkin, R. (2011). Paper presented at the British Educational Research Association annual conference, University of London Institute of Education, September 2011.
Comparing different types of qualifications (e.g. vocational versus academic)
Greatorex, J. (2011) British Educational Research Association, London
Comparing specifications from diverse qualifications: instrument development
Greatorex, J., Rushton, N., Mehta, S. and Hopkin, R. (2011).  Paper presented at the Journal of Vocational Education and Training International conference, Oxford, July 2011.

2009

How are archive scripts used in judgements about maintaining grading standards?
Greatorex, J.  (2009) British Educational Research Association (BERA) Annual Conference

2008

A Quantitative Analysis of Cognitive Strategy Usage in the Marking of Two GCSE Examinations
Suto, W. M. I. and Greatorex, J. (2008) Assessment in Education: Principles, Policy and Practice, 15, 1, 73-89
What makes AS marking reliable? An experiment with some stages from the standardisation process
Greatorex, J. and Bell J. F. (2008) Research Papers in Education, 23, 3, 333–355
What do GCSE examiners think of ‘thinking aloud’? Findings from an exploratory study
Greatorex, J. and Suto, W.M.I. (2008). What do GCSE examiners think of ‘thinking aloud’? Findings from an exploratory study. Educational Research, 40, 4, 319-331
What attracts judges’ attention? A comparison of three grading methods
Greatorex, J., Novakovic, N. & Suto, I. (2008) International Association for Educational Assessment (IAEA) Conference, Cambridge
Exploring the role of human judgement in examination marking: findings from some empirical studies
Greatorex, J., Suto, I. & Nadas, R. (2008) Association of Language Testers in Europe (ALTE), Cambridge
What goes through an examiner's mind? Using verbal protocols to gain insights into the GCSE marking process
Suto, W. M. I. and Greatorex, J. (2008) British Educational Research Journal, 34, 2, 213-233
Judging Text Presented on Screen: implications for validity
Johnson, M. and Greatorex, J. (2008) E-Learning, 5, 1, 40-50

2007

What strategies do IGCSE examiners use to mark candidates' scripts?
Greatorex, J. (2007) International Schools Journal, 27, 1, 48-55
Assessors’ holistic judgements about borderline performances: some influencing factors
Johnson, M. and Greatorex, J. (2007) British Educational Research Association (BERA) Annual Conference
Exploring how the cognitive strategies used to mark examination questions relate to the efficacy of examiner training
Greatorex, J., Nádas, R., Suto, I. and Bell, J F. (2007) European Conference on Educational Research (ECER) Conference, Ghent, Belgium

2006

What do GCSE examiners think of 'thinking aloud'? Interesting findings from a preliminary study
Suto, I. and Greatorex, J. (2006) British Educational Research Association (BERA) Annual Conference
Do examiners’ approaches to marking change between when they first begin marking and when they have marked many scripts?
Greatorex, J. (2006) British Educational Research Association (BERA) Annual Conference
An empirical exploration of human judgement in the marking of school examinations
Greatorex, J. & Suto, I. (2006) International Association for Educational Assessment (IAEA) Conference, Singapore
Moderated e-portfolio project evaluation
Greatorex, J. (2006) Moderated e-portfolio project evaluation

2005

Assessing the evidence: different types of NVQ evidence and their impact on reliability and fairness.
Greatorex, J. (2005) Journal of Vocational Education and Training, 57, 2, 149-264
What goes through a marker’s mind? Gaining theoretical insights into the A-level and GCSE marking process
Greatorex, J. and Suto, I. (2005). Paper presented at the Association for Educational Assessment (AEA) - Europe, Dublin, Republic of Ireland, 3 November 2005.
What goes through an examiner’s mind? Using verbal protocols to gain insights into the GCSE marking process
Suto, I. and Greatorex, J. (2005) British Educational Research Association (BERA) Annual Conference
Judging learners’ work on screen. How valid and fair are assessment judgements?
Johnson, M. and Greatorex, J. (2005) British Educational Research Association (BERA) Annual Conference

2004

Does the gender of examiners influence their marking?
Greatorex, J. and Bell, J. F. (2004) Research in Education, 71, 25-36
From Paper to Screen: some issues on the way
Raikes, N., Greatorex, J. and Shaw, S. (2004). Presented at the 30th annual conference of the International Association for Educational Assessment (IAEA), Philadelphia, USA, 13-18 June 2004.
What makes marking reliable? Experiments with UK examinations.
Baird, J., Greatorex, J. and Bell, J. F. (2004) Assessment in Education: Principles, Policy and Practice, 11, 3, 331-348

2003

Developing and applying level descriptors
Greatorex, J. (2003) Westminster Studies in Education, 26, 2, 125-133
Examinations and assessment in curriculum 2000.
Greatorex, J. (2003) In: L. LeVersha and G. Nicholls (eds.) Teaching at post-16: Effective teaching in the A level, AS and VCE curriculum. London: Kogan Page
A Comparability Study in GCE A level Chemistry Including the Scottish Advanced Higher Grade
Greatorex, J., Hamnett, L. and Bell, J. F. (2003) A review of the examination requirements and a report on the cross moderation exercise. [A study based on the Summer 2002 Examinations and organised by the Research and Evaluation Division, UCLES for OCR on behalf of the Joint Council for General Qualifications.]
What happened to limen referencing? An exploration of how the Awarding of public examinations has been and might be conceptualised
Greatorex, J. (2003) British Educational Research Association (BERA) Annual Conference
How can NVQ assessors’ judgements be standardised?
Greatorex, J. and Shannon, M. (2003) British Educational Research Association (BERA) Annual Conference

2002

Writing and using level descriptors
Greatorex, J. (2002) Learning and Skills Research Journal, 6, 1, 36
A Comparability Study in GCE AS Chemistry Including parts of the Scottish Higher Grade Examinations
Greatorex, J., Elliott, G. and Bell, J. F. (2002) A review of the examination requirements and a report on the cross moderation exercise. [A study based on the Summer 2001 Examination and organised by the Research and Evaluation Division, UCLES for OCR on behalf of the Joint Council for General Qualifications.]
A fair comparison? The evolution of methods of comparability in national assessment
Elliott, G. and Greatorex, J. (2002) Educational Studies, 28, 3, 253-264
Back to the future: A methodology for comparing old A level and new AS standards
Elliott, G., Greatorex, J., Forster, M., and Bell, J.F. (2002) Educational Studies, 28, 2, 163-180
What makes a senior examiner?
Greatorex, J. and Bell, J F. (2002) British Educational Research Association (BERA) Annual Conference
Two heads are better than one: Standardising the judgements of National Vocational Qualification assessors
Greatorex, J. (2002) British Educational Research Association (BERA) Annual Conference
Does the gender of examiners influence their marking?
Greatorex, J. and Bell, J F. (2002) Learning Communities and Assessment Cultures: Connecting Research with Practice
Tools for the trade: What makes GCSE marking reliable?
Greatorex, J., Baird, J. and Bell, J F. (2002) Learning Communities and Assessment Cultures: Connecting Research with Practice

2001

Making the grade - developing grade profiles for accounting using a discriminator model of performance
Greatorex, J., Johnson, C. and Frame, K. (2001) Westminster Studies in Education, 24, 2, 167-181
Can vocational A levels be meaningfully compared with other qualifications?
Greatorex, J. (2001) British Educational Research Association (BERA) Annual Conference

2000

A Review of Research into Levels, Profiles and Comparability
Bell, J. F. and Greatorex, J. (2000) QCA
An accessible analytical approach for investigating what happens between the rounds of a Delphi study
Greatorex, J. and Dexter, T. (2000) Journal of Advanced Nursing, 32, 4, 1016-1024
Application of Number: an investigation into a theoretical framework for understanding the production and reproduction of pedagogical practices
McAlpine, M. and Greatorex, J. (2000) British Educational Research Association (BERA) Annual Conference
What research could an Awarding Body carry out about NVQs?
Greatorex, J. (2000). British Educational Research Association (BERA) Annual Conference.
Is the glass half full or half empty? What examiners really think of candidates’ achievement
Greatorex, J. (2000) British Educational Research Association (BERA) Annual Conference

1999

Generic Descriptors - a Health Check
Greatorex, J. (1999) Quality in Higher Education, 5, 2, 155-165
The Implementation of Application of Number
McAlpine, M. and Greatorex, J. (1999) British Educational Research Association (BERA) Annual Conference
The Application of Number Experience
McAlpine, M. and Greatorex, J. (1999) Researching Work and Learning, A First International Conference, School of Continuing Education

Research Matters

Research Matters is our free biannual publication which allows us to share our assessment research, in a range of fields, with the wider assessment community.