Assessment from a social perspective

Martin Johnson from our research division writes about the Association for Educational Assessment-Europe (AEA-Europe) annual conference, held recently in Cyprus.

I’ve always been interested in seeing assessment from a social perspective, and in looking at how assessment involves and affects people. I therefore found the recent AEA-Europe conference particularly enjoyable, because it brought together people who engage with assessment in a number of ways, whether they came primarily from a test development background or primarily from a learner-oriented perspective. For me, the conference sessions that I attended could be summarised by three themes: the interplay between social and technical content coverage, information about assessment systems, and new assessment developments and methods.

The four keynote talks demonstrated the way that the conference organisers wanted to allow space on the programme for delegates to think about both the social and technical aspects of assessment. Gabriele Kaiser (University of Hamburg) and Gudrun Erickson (University of Gothenburg) both explored the social dimensions of assessment. In their respective talks they highlighted the importance, along with the challenges, of taking into account a wide social and contextual perspective when developing assessment tools.

For example, Gabriele outlined her ambition that tools which assess teachers should be able to assess their pedagogic competence (i.e. how expert teachers perceive and interpret events in a particular situation) as well as their subject content knowledge. Similarly, Gudrun outlined the ways that Swedish school test developers worked to ensure that as many stakeholder voices as possible (and student voices in particular) were involved in the test development process. This included the use of pre-tests to explore students’ perceptions of items, which provided an opportunity to consider why items that are found to be ‘easy’ according to statistical indicators may nonetheless appear demanding and unsettling to test takers.

The technical dimensions of assessment were covered in the remaining two keynotes. George Marcoulides (University of California, Santa Barbara) and Sebastiaan de Klerk (eX:plain, The Netherlands) outlined some of the challenges that test developers face when trying to make an assessment tool as efficient as possible. George spent some time discussing how novel algorithm development can help to narrow down the vast number of potential items that could be used for a test, whilst Sebastiaan addressed the challenges of producing a multimedia-based assessment that could capture the skills executed in an applied vocational performance.

Another strength of the conference was that it brought together assessment practitioners and researchers from diverse geographical settings. Cyprus has a strong Russian presence, and since it is still relatively rare for Russian researchers to present at conferences in the UK, it was refreshing to hear them talking about their system. Tatjana Kanonire, Elena Kardanova and colleagues (Institute of Education, National Research University) outlined their work on the attainment of younger learners in different parts of the Russian Federation. Their work explored the established links between academic achievement and a learner’s subjective wellbeing, and suggested that wellbeing encompasses elements such as the degree to which children collaborate with others, their number of friends, and their state of physical health.

The research team went on to discuss evidence they had collected from school assessments suggesting that inequality of learning outcomes between young children in poor rural schools and wealthier urban schools in Tatarstan was decreasing. The researchers then explored the extent to which the Family Investment Model (FIM), a concept well established in Western economies, applied to a large city in Siberia. The FIM has been used to explain the link between parents’ socio-economic status and children’s school readiness (using indicators such as the number of books in the home or the level of access provided to extra-curricular activities). The researchers found evidence that, in addition to these indicators, Russian parents’ beliefs about the worth of education also made a significant contribution to their child’s attainment.

The final theme that I noted was that the conference provided an opportunity to share information about new assessment developments and methods. Christine Merrell (CEM, University of Durham) talked about the challenges of gauging very young learners’ attitudes, feelings, and dispositions. The limited reading ability of these learners means that researchers often have to gather such data indirectly through adult intervention. Christine went on to describe a new approach for eliciting the perspectives of young learners that uses animated dynamic images.

Another interesting methodological development was outlined by Ayesha Ahmed (University of Cambridge) and Ruth Johnson (AQA). Ayesha and Ruth reported on a project that sought to assess collaborative problem-solving skills. This required them to look very carefully at the ways that individuals in a group interact during a learning task. To do this, the researchers used Conversation Analysis techniques to identify solution-critical utterances by task participants that seemed to help a group achieve a shared outcome.

These summaries highlight the diversity of the research presented at the AEA-Europe conference and its geographical reach. Cambridge Assessment also presented several pieces of work, and you can learn more about them here. Leave a comment below if you have any questions or would like to know more about one of our presentations.

Martin Johnson
Researcher, Cambridge Assessment

Research Matters

Research Matters is our free biannual publication which allows us to share our assessment research, in a range of fields, with the wider assessment community.