Nicholas Raikes

Unusually for Cambridge, I was born and brought up here, and started working for Cambridge Assessment as a temp when home for the summer from Liverpool John Moores University. I graduated with a BSc in Applied Physics, and now also have specialist qualifications in computing and education from the universities of Bradford and Bristol respectively.

After graduation I briefly took up a post in question paper preparation at Cambridge Assessment, but swiftly joined the Research and Evaluation division where I worked on, and in due course led, many studies in areas such as inter-board comparability, standards over time, equity and qualification evaluation. I subsequently worked extensively on the development and introduction of on-screen marking and on operationalising data analytics. With over twenty years’ experience in technical, research, and leadership roles at Cambridge Assessment, I now lead a programme of work in innovation and development.

Outside of work I enjoy foreign travel, hiking, theatre, and concerts.

Publications

2012

Making the most of our assessment data: Cambridge Assessment's Information Services Platform

Raikes, N. (2012). Making the most of our assessment data: Cambridge Assessment's Information Services Platform. Research Matters: A Cambridge Assessment publication, 13, 38-40.

As new technologies penetrate every part of educational assessment, data is being collected as never before. Traditionally, there were two methods of producing statistical information within Cambridge Assessment. Routine statistical information came from reports built into bespoke examination processing systems. Non-routine analyses and reports were produced by small teams of statistical experts, typically working within research units and using statistical software packages on personal computers. With increasing demand for flexible, high-volume statistical reporting, a new solution was required: one that combined the resilience and scalability of a server-based infrastructure with the flexibility of having statistical experts in charge of creating the statistical content. The Information Services Platform (ISP) is Cambridge Assessment’s solution to these requirements. It provides our statistical experts with access to operational assessment data, tools to automate and schedule analyses and reports, and the means to publish the resulting content on an Intranet Portal for use by colleagues across the organisation. In this paper, I discuss the thinking behind the ISP in more detail and give practical examples of its use.

2011

Making the most of our assessment data: Cambridge Assessment’s Information Services Platform
Raikes, N. (2011). Presented at the 37th annual conference of the International Association for Educational Assessment (IAEA), Manila, Philippines, 23-28 October 2011.
Evaluating Senior Examiners' use of Item Level Data

Shiell, H. and Raikes, N. (2011). Evaluating Senior Examiners' use of Item Level Data. Research Matters: A Cambridge Assessment publication, 12, 7-10.

Many of Cambridge Assessment's written examination scripts are now scanned and marked on screen by examiners working on computers. One benefit arising from on-screen marking is that the marks are captured at item or question-part level and are available for analysis in Cambridge within hours of being submitted by examiners. Cambridge Assessment now routinely analyses these item marks and provides subject staff and senior examiners with reports containing Item Level Data (ILD) for nearly all examinations marked on screen. In this article, we present findings from an evaluation of senior CIE and OCR examiners’ use of these Item Level Data reports.

2010

Must examiners meet in order to standardise their marking? An experiment with new and experienced examiners of GCE AS Psychology

Raikes, N., Fidler, J. and Gill, T. (2010). Must examiners meet in order to standardise their marking? An experiment with new and experienced examiners of GCE AS Psychology. Research Matters: A Cambridge Assessment publication, 10, 21-27.

When high-stakes examinations are marked by a panel of examiners, the examiners must be standardised so that candidates are not advantaged or disadvantaged according to which examiner marks their work.

It is common practice for Awarding Bodies’ standardisation processes to include a “Standardisation” or “Co-ordination” meeting, where all examiners meet to be briefed by the Principal Examiner and to discuss the application of the mark scheme in relation to specific examples of candidates’ work. Research into the effectiveness of standardisation meetings has cast doubt on their usefulness, however, at least for experienced examiners.

In the present study we addressed the following research questions:

1. What is the effect on marking accuracy of including a face-to-face meeting as part of an examiner standardisation process?
2. How does the effect on marking accuracy of a face-to-face meeting vary with the type of question being marked (short-answer or essay) and the level of experience of the examiners?
3. To what extent do examiners carry forward standardisation on one set of questions to a different but very similar set of questions?

2009

Must examiners meet in order to standardise their marking? An experiment with new and experienced examiners of GCE AS Psychology
Raikes, N., Fidler, J. and Gill, T. (2009). Presented at the British Educational Research Association (BERA) annual conference, Manchester, UK, 5 September 2009.

2008

Grading examinations using expert judgements from a diverse pool of judges
Raikes, N., Scorey, S. and Shiell, H. (2008). Presented at the 34th annual conference of the International Association for Educational Assessment (IAEA), Cambridge, UK, 7-12 September 2008.

2006

Item level examiner agreement
Massey, A.J. and Raikes, N. (2006). Presented at the British Educational Research Association (BERA) annual conference, Warwick, UK, 9 September 2006.
Quality control of marking: Some models and simulations
Bell, J.F., Bramley, T., Claessen, M.J.A. and Raikes, N. (2006). Presented at the 32nd annual conference of the International Association for Educational Assessment (IAEA), Singapore, 21-26 May 2006.

2004

Auto-marking 2: An update on the UCLES-Oxford University research into using computational linguistics to score short, free text responses
Sukkarieh, J.Z., Pulman, S.G. and Raikes, N. (2004). Presented at the 30th annual conference of the International Association for Educational Assessment (IAEA), Philadelphia, USA, 13-18 June 2004.
From Paper to Screen: some issues on the way
Raikes, N., Greatorex, J. and Shaw, S. (2004). Presented at the 30th annual conference of the International Association for Educational Assessment (IAEA), Philadelphia, USA, 13-18 June 2004.

2003

The Horseless Carriage Stage: replacing conventional measures
Raikes, N. and Harding, R. (2003). The Horseless Carriage Stage: replacing conventional measures. Assessment in Education, 10, 3, 268-277.
Auto-Marking: Using Computational Linguistics to Score Short, Free Text Responses
Sukkarieh, J., Pulman, S. and Raikes, N. (2003). Presented at the 29th annual conference of the International Association for Educational Assessment (IAEA), Manchester, UK, October 2003.

2002

On Screen Marking of Scanned Paper Scripts
Raikes, N. (2002). On Screen Marking of Scanned Paper Scripts. University of Cambridge Local Examinations Syndicate.

1998

Investigating A level mathematics standards over time
Bell, J.F., Bramley, T. and Raikes, N. (1998). Investigating A level mathematics standards over time. British Journal of Curriculum and Assessment, 8, 2, 7-11.

1997

Standards in A level Mathematics 1986-1996
Bell, J.F., Bramley, T. and Raikes, N. (1997). Presented at the British Educational Research Association (BERA) annual conference, York, UK, 11-14 September 1997.

Research Matters

Research Matters is our free biannual publication, through which we share our assessment research, across a range of fields, with the wider assessment community.