Research Matters 01

  • Research Matters 1 Foreword

    McLone, R. (2005). Foreword. Research Matters: A Cambridge Assessment publication, 1, 1.

    Welcome to the first issue of Research Matters, a biannual publication from Cambridge Assessment. The aim of this publication is to share assessment research in a range of fields with colleagues within Cambridge Assessment and in the wider assessment community, and to comment on prominent research issues.


  • Research Matters 1 Editorial

    Green, S. (2005). Editorial. Research Matters: A Cambridge Assessment publication, 1, 1.

    In this issue we report on a wide range of research topics from standards over time to auto-marking of short textual responses.


  • Comparability of national tests over time: a project and its impact

    Massey, A. (2005). Comparability of national tests over time: a project and its impact. Research Matters: A Cambridge Assessment publication, 1, 2-6.

    This article summarises the findings and discusses the impact of The Comparability Over Time (CoT) Project, which was commissioned by the Qualifications and Curriculum Authority (QCA) in 1999 and published in 2003. The project investigated the stability of national test standards at all key stages and in all subjects. National test standards are of considerable public interest, not least because of the political prominence these tests have been accorded, including government claims that the huge improvements in results since tests were introduced in the mid-1990s stem from the plethora of recent educational policy initiatives.


  • Accessibility, easiness and standards

    Bramley, T. (2005). Accessibility, easiness and standards. Research Matters: A Cambridge Assessment publication, 1, 6-7.

    This article is a summary of an article published in Educational Research in 2005. Discussions about whether one year's test is easier or more difficult than the previous year's can often get bogged down when the spectre of 'accessibility' raises its head. Is a 'more accessible' test the same as an 'easier' test? Are there any implications for where the cut-scores should be set if a test is deemed more accessible, as opposed to easier? Is there any way to identify questions which are 'inaccessible'? The main purpose of the article was to use a psychometric approach to attempt to answer these questions.


  • A rank-ordering method for equating tests by expert judgement

    Bramley, T. (2005). A rank-ordering method for equating tests by expert judgement. Research Matters: A Cambridge Assessment publication, 1, 7-8.

    This article is a summary of an article published in the Journal of Applied Measurement in 2005. It builds on much research carried out at UCLES over the past ten years on the use of judgements in scale construction. It introduces an extension of Thurstone's paired comparison method to rankings of more than two objects, in the context of mapping a cut-score from one test to another.
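    The idea behind the rank-ordering approach can be conveyed with a toy sketch. This is an illustration only, not the article's actual model: each expert ranking of scripts is unfolded into the paired comparisons it implies, and a latent quality scale is then fitted to the pairs. A simple Bradley-Terry fit (a named stand-in for Thurstone's model) is used below, and all script labels and rankings are invented.

```python
from itertools import combinations
from collections import defaultdict

def rankings_to_pairs(rankings):
    """Unfold each ranking (a best-to-worst list of script ids)
    into the pairwise 'wins' it implies."""
    wins = defaultdict(int)
    for ranking in rankings:
        for better, worse in combinations(ranking, 2):
            wins[(better, worse)] += 1
    return wins

def bradley_terry(wins, n_iter=200):
    """Estimate a latent quality scale from pairwise wins
    using a simple minorisation-maximisation update."""
    objects = {o for pair in wins for o in pair}
    p = {o: 1.0 for o in objects}
    for _ in range(n_iter):
        new_p = {}
        for i in objects:
            num = sum(w for (a, _), w in wins.items() if a == i)
            den = sum(
                (wins.get((i, j), 0) + wins.get((j, i), 0)) / (p[i] + p[j])
                for j in objects if j != i
            )
            new_p[i] = num / den if den else p[i]
        # rescale so the solution stays identified
        s = sum(new_p.values())
        p = {o: v * len(objects) / s for o, v in new_p.items()}
    return p
```

    In a real application the fitted scale values for scripts from both tests would sit on a common scale, so the cut-score location on one test could be mapped onto the other.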


  • A review of research about writing and using grade descriptors in GCSEs and A levels

    Greatorex, J. (2005). A review of research about writing and using grade descriptors in GCSEs and A levels. Research Matters: A Cambridge Assessment publication, 1, 9-11.

    This article describes current awarding practice and reviews the literature about writing and using grade descriptors for GCSEs and A levels. Grade descriptors are descriptions of the qualities anticipated at various levels of candidates' performance in an assessment. It is concluded that it is good practice to write grade descriptors based on empirical evidence. Grade descriptors for different domains and types of questions can be written by:

    1)    identifying questions where there is a statistically significant difference between the performance of students who achieve adjacent grades (e.g. A and B);
    2)    using Kelly’s Repertory Grid to interview examiners about the qualities which distinguish performance at these grades;
    3)    including these distinguishing qualities in grade descriptors.

    Furthermore, there is little research about how grade descriptors are used, or could be used, in preparing pupils for assessments, and there is room for further research in this area.
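    Step 1 of the procedure above can be sketched with a simple significance test. The article does not specify which test is used; as an assumed stand-in, the sketch below applies a two-proportion z-test to an item's facility (proportion correct) among candidates at two adjacent grades, and all numbers in the usage example are invented.

```python
from math import sqrt
from statistics import NormalDist

def facility_gap_significant(correct_a, n_a, correct_b, n_b, alpha=0.05):
    """Two-proportion z-test on an item's facility (proportion correct)
    for candidates at two adjacent grades, e.g. grade A vs grade B.
    Returns True if the difference is statistically significant."""
    p_a, p_b = correct_a / n_a, correct_b / n_b
    p_pool = (correct_a + correct_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return False
    z = (p_a - p_b) / se
    # two-tailed p-value from the standard normal distribution
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_value < alpha
```

    Items flagged by such a test would then be taken to interviews with examiners (step 2) to elicit the qualities that distinguish performance at the two grades.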


  • Can a picture ruin a thousand words? The effects of visual resources and layout in examination questions

    Crisp, V. and Sweiry, E. (2005). Can a picture ruin a thousand words? The effects of visual resources and layout in examination questions. Research Matters: A Cambridge Assessment publication, 1, 11-15.

    Visual resources, such as pictures, diagrams and photographs, can sometimes influence students’ understanding of an examination question and their responses (Fisher-Hoch, Hughes and Bramley, 1997).  If visual resources do have a disproportionately large influence on the development of mental models, this has implications in examinations where students’ ability to process material effectively is already compromised by test anxiety (Sarason, 1988). Students need to understand questions in the way intended in order to have a fair opportunity to display their knowledge and skills.

    This research explored the effects of visual resources in a number of exam questions. A total of 525 students aged 16 sat an experimental science test under examination conditions. The test included six questions involving graphical or layout elements. For most of the questions, two versions were constructed in order to investigate the effects of changes to visual resources on processing and responses. Some of the students were interviewed after they had taken the test.

    The analysis of the example questions in this study, along with others the authors have studied, suggests that two variables in particular play a decisive role in the effect of visual resources on the way examination questions are processed and answered. The first is the relative salience, or prominence, of the key elements. The second is whether the student believes the element is relevant to the answer; one factor in determining this is past test experience, which provides expectations about the circumstances under which visual resources are relevant.


  • Gold standards and silver bullets: assessing high attainment

    Bell, J. (2005). Gold standards and silver bullets: assessing high attainment. Research Matters: A Cambridge Assessment publication, 1, 16-19.

    One of the challenges facing those involved in the assessment and selection of high attainers is the fact that so many students get the same high grades (in measurement theory this is referred to as a lack of discrimination). This article discusses some of the issues and methods used in identifying (in order to select) high-attaining candidates.


  • Automatic marking of short, free text responses

    Sukkarieh, J. Z., Pulman, S. G., and Raikes, N. (2005). Automatic marking of short, free text responses. Research Matters: A Cambridge Assessment publication, 1, 19-22.

    Many of UCLES' academic examinations make extensive use of questions that require candidates to write one or two sentences. With the increasing penetration of computers into schools and homes, a system that could partially or wholly automate valid marking of short, free text answers typed into a computer would be valuable, but would seem to presuppose a currently unattainable level of performance in automated natural language understanding. However, recent developments in the use of so-called 'shallow processing' techniques in computational linguistics have opened up the possibility of automating the marking of free text without having to create systems that fully understand the answers. With this in mind, UCLES funded a three-year study at Oxford University. Work began in summer 2002, and in this paper we introduce the project and the information extraction techniques used. A further paper in a forthcoming issue of Research Matters will contain the results of our evaluation of the automatic marks produced by the final system.
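    The flavour of a 'shallow processing' approach can be conveyed with a toy sketch. This is not the Oxford system described in the paper: here each marking point in a hypothetical mark scheme is a set of surface patterns, and an answer earns a mark for each point it matches. The mark scheme, patterns, and answers below are all invented for illustration.

```python
import re

# Hypothetical mark scheme for a question about heating water: each
# marking point is a list of surface patterns, any one of which earns
# the point (a toy stand-in for hand-crafted information-extraction
# patterns over candidates' free text).
MARK_SCHEME = [
    [r"\bevaporat\w*", r"\bturns? (in)?to (a )?gas\b"],   # point 1: evaporation
    [r"\bheat\w*", r"\btemperature (rise|increase)\w*"],  # point 2: heating
]

def mark_answer(answer, scheme=MARK_SCHEME, max_marks=2):
    """Award one mark per marking point matched anywhere in the answer."""
    answer = answer.lower()
    marks = sum(
        1 for point in scheme
        if any(re.search(pattern, answer) for pattern in point)
    )
    return min(marks, max_marks)
```

    Pattern-based marking of this kind is brittle on paraphrase and negation, which is why the project evaluated its automatic marks against human markers before drawing conclusions.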


  • Research News

    The Research Division (2005). Research News. Research Matters: A Cambridge Assessment publication, 1, 23.

    A summary of recent conferences and seminars, and research articles published since the last issue of Research Matters.
