Are past paper questions always useful?

'Getting teachers and pupils interested in the really hard questions - an example from physics.' Register now.

Are past paper questions always informative in formative assessment?

Teachers constantly refer to the importance of past paper questions as an integral part of the resourcing of their teaching and learning. When I worked as a teacher I was no different and decided that immersing students in exam questions from the earliest opportunity must be "a good thing." I created weekly homework sheets and end of topic tests using the OCR software and felt well prepared.

I noticed that these selections of questions performed inconsistently. All students appeared particularly competent in some topics and equally incompetent in others, and grades derived using standardised boundaries fluctuated from test to test.

This demonstrated the need for a greater understanding of how to use questions in contexts other than those for which they were originally intended. What did I want to know about the students? What was I going to do with the data? Should I record it - the epitome of summative assessment? Could it help identify areas in need of further study - formative use?

The questions were written as items in a summative assessment. Each was designed to assess specific content, using a particular construct, and to combine with the other items in the examination to give specification coverage and a varied level of demand, as well as meeting Ofqual’s requirements for validity and comparability, whilst also allowing discrimination of candidates’ performance.

Once the items are taken out of that context the comparability and validity of the exam are lost. The items in themselves do not have the same integrity, so is it possible to use past paper questions to give meaningful outcomes when used out of context?

A second issue was the use of such questions to diagnose conceptual understanding. Again this may be possible, but in many cases a question is not focussed on one particular point: it is intended to test a wider range of knowledge, understanding and skills, and is frequently placed within a context which may or may not be familiar to the student and which may cloud the conceptual diagnosis.

The one unquestionable use of past paper questions is in forming an excellent basis for revision in preparation for exams. This is a consolidation of knowledge and rehearsal of specific skills required in the examination.

Thus past papers may be a great resource, but need some careful consideration in their use.

Using past paper questions in periodic testing and as an indicator of final outcome

The performance of students, teachers and institutions is data driven on ever shortening cycles, requiring frequent entry of summative data - monthly in my last teaching post. Thinking back to a time when changes in specification and assessment were less frequent, we used tests and mock exams based on historic past paper questions, but the grade boundaries used in reporting were based on comparing previous students' outcomes in the test with their final grades. In more recent times there has not been the stability to accrue enough consistent empirical data on which to base such judgements.

My observation was that, in physics at least, questions on different topics demand differing skills, cognitive understanding and problem solving ability, meaning that one topic may consistently present lower demand than another, resulting in a disparity between grades if standard boundaries were used. Forces appeared to be lower demand than electric circuits, for example. Methods exist, and are used by awarding organisations, to determine complexity and demand in their own analyses.

The variation in raw mark grade boundaries adds to the complexity of integrating questions from a range of years, which is exacerbated by changes in structure and in the usage of English over time. Repurposing multiple choice questions from the 1990s for use in current resources required rephrasing and changing non-technical terminology such as 'proceeding' (in a context similar to that of a 1960s policeman proceeding in a northerly direction) and 'traversing' (meaning to move from one side to the other).

Questions in formative assessment

The basis of formative assessment is to be able to give specific feedback to individuals or groups of students based on an observed or identified need for improvement. This can be considered diagnostic in the sense that the predictive nature of the question is not related directly to the reporting required for student, teacher or institution; rather, it indicates a need for action.

Diagnostic assessment as a form of formative assessment requires construct validity: the question must relate directly to an individual item or construct and the candidate's ability to demonstrate it, or even to identify the misconception hindering understanding.

Whereas the contextualisation of questions in exams may be a barrier to some students, or could give undue advantage to those familiar with the context, diagnostic questions are often free of context.

Specific conceptual tests have been devised, such as those from the Evidence Based Practice in Science Education (EPSE) project, which produced a sequence of questions on key conceptual topics in physics in which students were asked to answer a multiple choice question and to rate their certainty. Repeated use of the questions could demonstrate increasing confidence in knowledge and understanding.

An alternative set of conceptual questions is the "Next Time" questions of Paul Hewitt. These are simple contexts intended to make students think and either restructure or consolidate knowledge, as they are often counter-intuitive.

I used both of these in my A Level teaching, and although the EPSE materials were targeted at Key Stage 3, they demonstrated that many misconceptions carry through GCSE and into A Level.

When can past papers be used formatively?

I think that the formative use of exam questions requires some consideration of how we interpret the formative nature of education as compared to training. If education is the increase of knowledge and understanding and training is equipping in the demonstration of skills, then the repeated exhortation to include units at the end of a question or to show working is training rather than education. It is preparing the student to optimise the summative outcome of a test but is not contributing to their knowledge or understanding.

Much of the work I carried out with my classes, believing it to be formative, was to increase summative outcomes. We used sound educational methods in teaching and learning, with peer assessment and group contribution to improve initial responses, but in hindsight much of this was training.

The use of complex questions does prepare students for the synopticity present in exam questions. With synopticity defined as the linking of content from two or more topics or modules, it is apparent that this cannot take place early in the A Level course. It is also likely that many of the students may not have developed thinking skills to allow them to answer such questions until the second year of A Level.

The development of thinking skills using complex questions has to be an application for such past paper questions, where students may be scaffolded in their completion of an answer. In class we found a number of ways to do this by working in groups or as a whole class, selecting individuals to add a sentence to the answer and collaboratively refining the answer. The Isaac Physics project takes such questions and scaffolds by giving hints. Is that in itself formative, or does it require closing of the feedback loop with the teacher? Is the moment that a student sees their way to answer the question a defining formative moment?

Neil Wade
Subject Specialist, OCR

