Assessment Horizons 2026

Date: 23 Apr 2026 - 24 Apr 2026
Venue: Cambridge University Press & Assessment - The Triangle, and Online
Time: 09:30 - 16:00
Type: Conference
Fee: Various - more details below

Attendance options:

In-person (includes recordings):
• £250 non-member – early bird before 27 Feb (save £50)
• £150 member – early bird before 27 Feb (save £50)

Online (includes recordings):
• £115 non-member – early bird before 27 Feb (save £35)
• Free for members (booking deadline: 17 Apr)

Join The Assessment Network as a member to save on attendance and get a 10% discount on all courses.

Book now

Early bird bookings close 27 Feb 2026 at 12pm (UK time).

Why join us at Assessment Horizons 2026?

Be part of the conversation shaping the future of assessment at Assessment Horizons on 23-24 April 2026. Book before 27 February and access an early bird discount.

Join us in person in Cambridge, or online for two days of insights, innovation, and networking.

Be inspired by keynotes on the power of feedback and success through failure, stay up to date on developments with assessment and AI, and upskill with access to practical workshops. Gain a Certificate of Attendance and access to recordings of the sessions after the conference.

This must-attend annual event is designed to connect you with the latest knowledge and learning in assessment.

Your tickets include lunch, refreshments, and, after a day of learning on Thursday, an exclusive drinks reception at Cambridge University Press & Assessment's Cass Centre. They also include recordings of the keynote, panel, and breakout presentation sessions so you can catch up afterwards.

Keynote presentations

Productive Failure - implications for feedback and assessment - Prof Manu Kapur


If learning from failure is intuitively compelling, why do we wait for it to happen? In this talk, Manu will describe his research programme on Productive Failure, exploring how, when, and why intentionally designing for failure in a safe way can lead to deep learning.

At the same time, an ever-increasing emphasis on testing and standardisation means that we ought to be seriously wary of the dangers of ‘unproductive success’—an illusion of learning in high performance.

The challenge, therefore, is not only to build deep knowledge, but to build it in a way that allows for flexible and creative use of that knowledge in novel contexts. How do we do that? How do we assess it? And how can technology help? Manu will highlight key trends that force us to rethink what and how we teach, how we assess deep knowledge, transfer, resilience, and creativity, and how well we prepare our children for the uncertain future that awaits them.


Professor Manu Kapur is widely known for his research on learning from Productive Failure and has delivered two TEDx talks on the topic. His contributions extend across high-profile journals and conferences, influencing educational policies and practices internationally.

Manu is currently the Director of the Singapore-ETH Centre, and a Professor of Learning Sciences and Higher Education at ETH Zurich, Switzerland.

With a strong technical background in engineering and statistics as well as doctoral training in the learning sciences, Manu brings a unique interdisciplinary skill set to the study of human learning, both in its fundamental mechanisms and in developing applications that translate these mechanisms into teaching and learning. For more information, visit www.manukapur.com.

The emotional journey of feedback - from reaction to regulation for learning - Prof Anastasiya Lipnevich


Receiving feedback often triggers immediate emotional reactions that can either support or hinder learning. This keynote explores how learners move from automatic, affective responses to more deliberate regulation of their emotions in feedback situations. Drawing on contemporary research in motivation, cognition, and emotion, Anastasiya will examine how students interpret feedback, the factors shaping their initial emotional responses, and strategies that help them shift toward productive regulation.

Building on this foundation, Anastasiya will present findings from her recent studies on emotion regulation in feedback contexts, including comparisons between teachers’ assumptions about how students manage their emotions and students’ own reports. To connect these perspectives, she will highlight receptivity to feedback and relate it to the control–value theory of achievement emotions. Drawing on her team’s large-scale study, Anastasiya will show how positive emotions, self-efficacy, and task value beliefs enhance receptivity, while suppression of emotions undermines engagement and performance.

Anastasiya will also share several new experimental studies on praise and its complex, often counterintuitive links to performance, illustrating how seemingly positive feedback can shape both emotional responses and subsequent self-regulation. By tracing the full progression from reaction to regulation, she will outline how feedback practices can be designed to foster resilience, agency, and sustained engagement with learning.


Professor Anastasiya Lipnevich is Principal Measurement Scientist at the National Board of Medical Examiners, where she leads innovative work at the intersection of formative assessment and the learning sciences. Prior to joining NBME, she served as a Full Professor of Educational Psychology at Queens College and the Graduate Center of the City University of New York, where she continues to supervise doctoral students and maintains her academic affiliation.

Her research centres on instructional feedback, formative assessment, alternative approaches to cognitive and noncognitive assessment, and the role of emotional and other psychosocial characteristics in academic and life success, with work spanning primary, secondary, tertiary, and medical education contexts.

Anastasiya holds master’s degrees in clinical psychology (MS), Italian language and literature (MA), and counselling psychology (MEd). She earned her PhD in educational psychology (learning, cognition, and development concentration) from Rutgers University, where she received the Graduate School of Education’s Excellence in Dissertation Award.

Following her doctorate, she completed a postdoctoral fellowship at Educational Testing Service in Princeton, New Jersey. Her scholarly contributions have been recognized with the New Investigator Award and the Best Article Award from the American Psychological Association’s Division 3 (Experimental Psychology). She is also a Fellow of the Institute for Advanced Study at the University of Surrey, United Kingdom.

Anastasiya has held visiting professorships at the University of Trento (Italy), the University of Konstanz (Germany), the University of Otago (New Zealand), and the National Institute of Education (Singapore), among others. She has delivered numerous keynote and invited addresses across the world, including in Europe, Asia, North America, and Oceania.

She is the co-author or co-editor of four books: Psychosocial Skills and School Systems in the 21st Century (Springer, 2016); The Cambridge Handbook of Instructional Feedback (Cambridge University Press, 2018); Instructional Feedback: The Power, the Promise, the Practice (Corwin, 2023); and Unpacking Students’ Engagement With Feedback Through Artefacts, Discourse, and Survey Findings (Routledge, 2023).

Panel: AI and assessment – where are we now?

Rose Luckin


Rose is an internationally respected academic and an influential communicator to multiple stakeholders about the future of education and technology, particularly Artificial Intelligence (AI). With over 30 years of experience, she is a recognised expert on AI in education, serving as an advisor to policymakers, governments, and industry globally. She is Professor Emerita at University College London and Founder and CEO of Educate Ventures Research Limited (EVR), a company that provides thought leadership, training, and consultancy to help the education sector leverage AI ethically and effectively.

Throughout her career, Rose has held key leadership roles in academia, including serving on the Director's Strategy Group at the UCL Institute of Education from 2011 to 2015 and, at the University of Sussex, as Pro-Vice-Chancellor, Director of Undergraduate Studies for Science and Technology, and Co-Founding Director of the Human Centred Technology research group.

In recognition of her contributions, Rose received the Bett Outstanding Achievement Award in 2025, was named a Leading Woman in AI EDU at the ASU-GSV AIR Show in 2024, and received the 2023 ISTE Impact Award, becoming the first person outside North America to receive their top honour. She was also awarded the International Francqui Chair in 2018 by the Francqui Foundation in Belgium, named one of the 20 most influential people in education in the 2017 Seldon List, and listed among the 50 most influential women in UK tech by Computer Weekly in 2024.

A prolific author, Rose has published extensively in academic journals, books, and conference proceedings. Her 2018 book, Machine Learning and Human Intelligence: The Future of Education for the 21st Century, available in English and Mandarin, describes how AI can be used effectively to support teaching and learning. Her most recent book, AI for Schoolteachers, published in 2022, is an essential and accessible guide to AI for anyone involved in education.

Rose regularly delivers keynotes and public lectures across the globe on AI, ethics, and the future of education. She engages with the public through a monthly column in the Times Educational Supplement and op-eds in the Financial Times, Guardian, and China Daily. Rose has also appeared on various media outlets, including BBC Radio 4, ITV News, and CNBC.

In addition to her academic and entrepreneurial roles, Rose serves as an advisor to Cambridge University Press and Assessment and is co-founder of the Institute for Ethical AI in Education. She is also President of The Self-Managed Learning Centre in Brighton and sits on a range of advisory boards within the education and training sector, including membership of the UK Department for Education Science Advisory Council.

Rose holds a PhD in Cognitive and Computing Sciences and a First Class Bachelor's degree in AI and Computer Science, both from the University of Sussex. Prior to her academic career, she achieved Associateship of the Chartered Institute of Bankers.

Bryan Maddox


Professor Bryan Maddox is Research Director for Digital Assessment Futures at the Digital Education Futures Initiative (DEFI), Hughes Hall, University of Cambridge, and Executive Director of Assessment Micro-Analytics Ltd. Bryan is an Honorary Professor at the University of East Anglia, where he was previously Professor of Educational Assessment. He has held Visiting Professor roles at the Centre for Educational Measurement, University of Oslo, and at the University of British Columbia.

Bryan is particularly interested in digital assessment futures from a cross-cultural perspective. A social anthropologist by training, he has conducted ethnographic, observational research in educational contexts as wide-ranging as Mongolia, Senegal, the UK, and France.

Paul Muir


With a career in the education and assessment industry spanning 25 years, Paul is currently the Chief Customer Officer at risr/. From roles delivering ‘country first’ education and curriculum reform projects across the world to transformational roles at awarding bodies and regulators in the UK, Paul has focused on delivering change, often through technology, to support widespread policy change and transformation in assessment. Paul is a frequent speaker and panellist at industry conferences on issues such as test security, the digital divide, and the impact of technology on assessment, with an increasing focus on how AI is reshaping the industry. Paul also volunteers extensively throughout the industry: as well as serving as a Director and Chair of the Board of the Association of Test Publishers (ATP), he is currently Chair of the ATP Security Committee, a founding member of the ATP AI Committee, and an ambassador for The Assessment Network at Cambridge membership scheme.

Dan Bray


Dan Bray is Director of Assessment Innovation and Transformation at Cambridge University Press & Assessment. He has extensive experience in assessment across a wide range of educational levels and contexts.

He has a strong track record in the delivery of large-scale education reform projects with governments, school groups, and NGOs. He also leads on innovative assessment methodologies for international qualifications at CUPA, which are used by more than 10,000 schools globally.

Prior to taking this role, Dan was a senior assessment advisor focusing on the design, delivery, and standardisation of IGCSE, A Level, O Level, and Checkpoint qualifications. He has also worked in publishing and as a classroom teacher.

Key areas of expertise include designing assessment frameworks, conducting research-based analysis of education systems, ensuring examination production quality, using assessment to inform classroom practice, developing diagnostic assessment models, and implementing protocols to ensure robust and fair assessment systems.

Breakout presentations

Hear about new research and leading insights in our breakout sessions.

Generative AI and the Student Experience in Higher Education: Evidence, Equity and Skills – Rose Stephenson


Rose Stephenson, Director of Policy and Strategy, Higher Education Policy Institute

Theme: AI and assessment – where are we now?

Generative AI is no longer knocking at the door of higher education — it is already in the lecture hall, the library and the assessment process. Drawing on the latest HEPI / Kortext Student Generative AI Survey of over 1,000 undergraduates, Rose Stephenson, Director of Policy and Strategy at the Higher Education Policy Institute, will explore how AI has become a near-universal study tool, reshaping how students learn, write and think.

While many students say AI improves their experience by saving time and boosting understanding, the findings also expose growing anxiety about assessment, fairness, skills loss and wellbeing. Rose will argue that the real challenge now is how teaching and learning can deliberately develop students’ AI skills, while ensuring fair and equal access to the tools that are rapidly becoming essential for success in higher education and beyond.

Re-imagining assignment-based assessment in the age of AI: A methodological shift - Kirsty Parkinson


Kirsty Parkinson, Head of Assessment Development, Chartered Institute of Procurement and Supply (CIPS)

Theme: AI and assessment – where are we now?

The rapid evolution of AI and large language models (LLMs) has disrupted traditional assessment practices—particularly those relying on written assignments.

This session will explore how CIPS are adapting their methodology to embrace AI as a tool for learning rather than a threat to assessment. In this presentation Kirsty will share how her team are rethinking task design to ensure assessments remain meaningful, authentic, and valid in a world where AI assistance is ubiquitous.


Kirsty is Head of Assessment Design and Development at the Chartered Institute of Procurement and Supply. She has over a decade of experience designing assessment methodologies for Transnational Education programmes. Kirsty holds a Postgraduate Advanced Certificate in Educational Assessment from the University of Cambridge and a Master of Education.

Her work focuses on evolving assessment strategies to meet the challenges and opportunities presented by AI, ensuring validity, fairness, and relevance in a rapidly changing educational landscape.

Guardians or Adversaries? Understanding (Gen)AI’s Role in Assessment Integrity – Paul Muir and Nadir Zanini

Paul Muir, Chief Customer Officer, risr/, and Nadir Zanini, Head of Assessment Measurement, Cambridge University Press & Assessment

Theme: Assessment integrity in an age of (Gen)AI

Artificial Intelligence (AI), and especially Generative AI, are reshaping the assessment landscape at unprecedented speed, offering powerful new tools while simultaneously introducing profound risks to test security. Guardians or Adversaries? Understanding GenAI’s Role in Assessment Integrity explores this duality head-on.

This session examines how rapidly advancing AI models can support secure assessment design, enhance item development, detect irregularities, and strengthen integrity measures through automated analysis and sophisticated pattern recognition. At the same time, it confronts the growing challenges: AI-enabled cheating, content harvesting at scale, item regeneration, and the erosion of traditional security safeguards.


Nadir Zanini

Nadir is Head of Assessment Measurement at Cambridge University Press & Assessment. With 15 years of experience in educational assessment, he has harnessed the power of data to address a range of pressing policy and technical issues related to the awarding of high-stakes assessments.

His research spans the setting and maintenance of assessment standards, the predictive validity of qualifications, and the evaluation of reforms, with a consistent focus on fairness.

He leads a multidisciplinary team of experts in psychometrics, data science, AI engineering, and language testing, overseeing both the operational delivery of valid test results and the development of new assessment capabilities powered by the latest technological advances and research in the field.


Paul Muir

Paul has a 25-year career in the education and assessment industry and is currently the Chief Customer Officer at risr/. From roles delivering 'country first' education and curriculum reform projects across the world to senior positions at awarding bodies and regulators in the UK, Paul has focused on delivering change, often through technology, to support large-scale policy transformation in assessment.

Paul is a frequent speaker and panellist at industry conferences on topics such as test security, the digital divide, and the impact of technology on assessment, with a growing emphasis on AI’s role in reshaping the sector.

He also volunteers extensively across the industry. Alongside serving as a Director and Chair of the Board of the Association of Test Publishers (ATP), he is Chair of the ATP Security Committee, a founding member of the ATP AI Committee, and an ambassador for The Assessment Network at Cambridge membership scheme.

Creating job-ready learners: Assessing essential skills for work in mainstream secondary and further education - Dr Rebecca Conway


Dr Rebecca Conway, Director of Research and Innovation at NCFE

Theme: Skills and competence assessment

Recent research from the CIPD (December 2024) highlights a potential skills gap among young people entering the workforce: only 28% of employers who had recently recruited a young person said that the recruit was prepared for the world of work.

Learners must develop essential skills such as communication and social interaction to thrive in the workplace. These skills are challenging to assess via many traditional assessment methods. This presentation explores the competencies that underpin workplace readiness and examines how we might effectively assess them.


Rebecca moved into the awarding sector in 2012 after starting her career in Higher Education as a PhD researcher and tutor. She has held technical assessment roles with several organisations, including Cambridge University Press & Assessment where she worked in an innovative and fast-paced international education development team. She moved from there to the Federation of Awarding Bodies where she led on policy and strategy during the Covid pandemic.

Rebecca joined NCFE, one of the largest technical and vocational awarding organisations in the UK and an educational charity, in August 2024, and has overall responsibility for the organisation’s research, insight, innovation, and social investment work. Rebecca has a longstanding interest in practitioner research and professional development in assessment and awarding.

She has trained hundreds of assessment and education professionals on CPD and postgraduate courses for The Assessment Network at Cambridge, international schools, universities and awarding organisations. She currently supports professional learners with practitioner research projects through her role as a part-time Academic Supervisor on the University of Cambridge’s Postgraduate Advanced Certificate in Educational Assessment.

Oracy: Assessing talk and assessing through talk - Dr Ayesha Ahmed


Dr Ayesha Ahmed, Associate Teaching Professor at the University of Cambridge Faculty of Education, By-Fellow of Hughes Hall and a founding member of Oracy Cambridge

Theme: Skills and competence assessment | AI and assessment - where are we now? 

How should we assess oracy in educational settings? Why should we assess oracy at all? These questions are being asked more frequently since the publication of the Curriculum and Assessment Review in the UK. Ayesha will outline current conceptualisations of the construct of oracy and outline some of the major challenges for designing oracy assessments, such as reliability of judgements, scalability of administration and capture of evidence.

She will then discuss how these might be addressed and whether we can defend the validity of oracy assessment in high stakes contexts, or only for formative purposes. She will go on to discuss assessment through talk in the form of high stakes oral assessments, and in particular how our understanding of oracy can inform the design of such assessments.

She will argue that while there are also major challenges with this mode of assessment, it has the potential to mitigate the threat of AI to the authentication of written work, and to enhance validity through dialogue between student and assessor.


Following her PhD in Developmental Psychology, Ayesha has been working in the field of educational assessment since 1997, first at UCLES (now Cambridge University Press & Assessment) and then freelance, before joining the Faculty in 2013 where she teaches undergraduate and postgraduate students.

Her research centres on validity issues in assessment design, and her current focus is on assessment of and through oracy. In 2024 she was awarded a Visiting Research Fellowship at the University of South Australia, where she furthered her work on oracy assessment.

Ayesha is a Fellow of the Association for Educational Assessment – Europe, Chair of AQA’s Research Advisory Committee, and a member of the Advisory Board of the journal Assessment in Education: Principles, Policy & Practice.

Designing performance-based assessments that make competencies visible - Verónica Floretta


Verónica Floretta, Educational Consultant

Theme: Skills and competence assessment

This session looks at how performance tasks can generate meaningful evidence of competence, drawing on work in Uruguay’s curriculum reform, where competency-based curricular design has been encouraged. Verónica will explore how task conditions, criteria, and scaffolding affect validity and inclusion, and participants will leave with a clear design protocol they can adapt to their own contexts.


Verónica is a researcher, educator, and international assessment consultant specializing in educational assessment, performance-based task design, competence-based curriculum implementation, and teacher development. She holds a master’s degree in Educational Assessment and has over a decade of experience training pre-service and in-service teachers in assessment literacy, reflective practice, curriculum design, and inclusive pedagogy.

As an Assessment Consultant, Lecturer, and TESOL Trainer, she has led professional development programmes across Uruguay and Latin America. In 2025–2026, her work has centred on supporting institutions navigating Uruguay’s national competence-based education reform.

She is the author of Validity in an English Diagnostic Test: How to Design Meaningful Diagnostic Tests and has contributed to international conferences such as LABCI, FHCE, and multiple events for The Assessment Network at Cambridge.

As an Ambassador for The Assessment Network at Cambridge, she collaborates with global experts to promote innovative, ethical, and teacher-centred approaches to assessment. Her work advocates for human-centred education in the AI era, emphasizing teacher professional judgement, ethical assessment design, and the responsible integration of technology to enhance pedagogical decision-making.

Any learner anywhere? - Dee Arp and David Towlson

Dee Arp, Chief Quality Officer NEBOSH and David Towlson, Director of Learning and Assessment, NEBOSH

Theme: Maintaining validity through inclusive assessment

This session considers the challenges of delivering online assessment authentically when learners globally have such different access to technology. Do we work on access for all learners, or has the dial shifted too far from assessment for learning towards solving technological challenges? If so, what alternative assessment strategies will tick the assessment-principle boxes? This case study session is based on a current qualification and its assessment.


Dee Arp

Dee is a Chartered Safety and Health Practitioner of IOSH and a Fellow of the Chartered Institute of Educational Assessors. Dee became a qualified health and safety practitioner 25 years ago whilst working at RoSPA, where she developed and taught a wide range of courses and helped several boards to implement safety governance.

As NEBOSH’s Chief Quality Officer, Dee has responsibility for providing leadership on qualification development and assessment and on all compliance matters.

Dee previously studied at the University of Cambridge for the Certificate of Continuing Education (Principles and Practice of Assessment) and is now undertaking further study there for a Master's in Education.

Dee is INSHPO Immediate Past President, a RoSPA Ambassador, a OneWiSH mentor and is a judge on several Industry Awards including the RoSPA Health and Safety Awards, the IIRSM Risk Excellence Awards and the SHE Excellence Awards.


David Towlson

David started his working life in the chemical manufacturing industry, working for several different multinationals (not all at the same time), first in research and development and later as a safety advisor. He then moved into health and safety training.

He now works as Director of Learning & Assessment for NEBOSH where his job, in a nutshell, is to make sure NEBOSH’s qualifications, courses, publications and assessments are designed and developed to be fit for purpose. He is the AI adoption lead within the organisation.

Beyond language and content: Unpacking hidden factors in CLIL assessment through a psychoanalytic lens - Aydil İnal


Aydil İnal, Science Teacher, The Koç School, Türkiye

Theme: Maintaining validity through inclusive assessment

In CLIL (Content and Language Integrated Learning) classrooms, assessment often appears to measure two things: language proficiency and content understanding. Yet classroom reality shows that a third layer, students’ cognitive-emotional dynamics, quietly shapes performance and can blur the distinction between what learners know and what they are able to express.

This session examines how affective factors such as language anxiety, fear of failure, learner identity, inner narratives, and subtle defence responses can become entangled with assessment outcomes. Drawing on classroom experience and psychoanalytic perspectives, the session explores how these hidden dynamics may unintentionally mask content learning or amplify linguistic challenges.


Aydil is an experienced educator working at the intersection of international and national curricula, currently serving as a science teacher at The Koç School in Istanbul. Over the past ten years, she has taught science in English using CLIL approaches across primary and lower secondary levels, designing inquiry-based and bilingual learning environments.

She holds a master’s degree in Educational Measurement and Evaluation; her thesis focused on standard setting for a hybrid Cognitive Skills Inventory using the Angoff, Contrasting Groups, and Bookmark methods. She is currently pursuing her PhD in Curriculum and Instruction.

Scaffolding as inclusive assessment practice - supporting SEN students in EFL classrooms through reflective pedagogy - Prof. Lic. Valeria Miño


Prof. Lic. Valeria Miño, Universidad de la República, Uruguay

Theme: Maintaining validity through inclusive assessment

This presentation draws on a qualitative, multi-case study exploring how English as a Foreign Language (EFL) teachers scaffold learning for students with special educational needs (SEN) in inclusive primary classrooms. Grounded in sociocultural theory (Vygotsky, 1978; Lantolf & Thorne, 2009), the study examines scaffolding (Gibbons, 2015) as a pedagogical and assessment tool that supports equitable access to learning (Packer, 2017) while preserving the validity of language assessment in diverse educational contexts.


Valeria is an English as a Foreign Language teacher and inclusive education specialist with experience in pedagogy, curriculum design, and educational technology. She is pursuing a Master’s in Teaching Foreign Languages (English track) at Universidad de la República, with ongoing research and thesis work. She holds a Bachelor’s degree in Pedagogy and graduated from the English Teaching Program at the Instituto de Profesores Artigas in Uruguay. Valeria has also completed international programs at Arizona State University (USA) and the Commonwealth Education Trust (UK), as well as postgraduate diplomas in digital technologies for inclusive education and teaching competencies for inclusion.

Valeria coordinates the English Department and teaches primary and secondary levels at Colegio Santa María in Montevideo. She leads workshops on inclusive education for teacher training courses. She has presented her research at national and international conferences, focusing on scaffolding and inclusive strategies for students with special needs. Her work combines pedagogy, technology, and language learning to create reflective and inclusive practices that support meaningful learning for all students.

The assessment we overlook: literacy, transparency, and the impact of the UK university admissions process - Rebecca Dowbiggin


Rebecca Dowbiggin, Educational Consultant

Theme: The value of assessment literacy

University admissions are among the most significant assessments students encounter, with wide-ranging implications for motivation, confidence, and engagement. These high-stakes processes position students and educators within complex evaluative relationships, highlighting the broader impact of assessment beyond the classroom.

This presentation will explore how understanding these dynamics can inform approaches to assessment literacy, encouraging reflection on how students interpret and respond to judgments and feedback in high-stakes contexts. By considering admissions through the lens of assessment literacy, the session will open up discussions about the psychological and pedagogical implications of evaluative processes in supporting students through critical transitions.


Rebecca studied as an undergraduate at the University of Cambridge and graduated with Distinction from the MSc in Educational Assessment at the University of Oxford. Her postgraduate research examined how existing university admissions structures and proposed reforms affect students’ self-efficacy and motivation.

Rebecca has over fifteen years of experience as an Education Consultant, specialising in university admissions advisory, with a particular focus on admissions assessments. Alongside her consultancy work, Rebecca is an Academic Tutor and Mentor, and designs and delivers curricula and assessments for students who benefit from alternative learning pathways outside mainstream schooling.

Rebecca has authored a wide range of educational resources and has collaborated with schools to develop virtual programmes that recognise students’ skills beyond traditional assessments, while fostering classroom cultures that prioritise well-being and emotional literacy. Having trained and worked as a professional actor, Rebecca is a strong advocate for arts-rich education and the role of the arts in holistic learning. She maintains a keen interest in professional development in assessment and serves as an Academic Supervisor and Tutor on the Postgraduate Advanced Certificate in Educational Studies: Educational Assessment, co-convened by Cambridge Assessment and the Faculty of Education at the University of Cambridge.

A tale of two countries: parental assessment literacy - Dr Simon Child

Dr Simon Child

Dr Simon Child, Head of Assessment Training, The Assessment Network at Cambridge

Theme: The value of assessment literacy

This presentation shares insights from initial findings of a study investigating the extent to which parents have the knowledge and skills to support their children with assessment activities, and what type of support (if any) they would like to receive to improve their assessment literacy.

The presentation will introduce our conceptual model for parental assessment literacy, and how it connects to the survey instrument Simon and Dr Ourania Ventista (University of the Aegean) used to gather data from 678 parents in two countries (England and Greece). Inspired by the findings from the study, Simon will make some suggestions for how parental assessment literacy can foster an effective school culture.


Dr Simon Child is Head of Assessment Training at The Assessment Network at Cambridge. Previously, he was a Senior Research Officer in the Assessment Research and Development Division of Cambridge Assessment. He's conducted research in the field of qualifications reform and development since 2012.

His other research interests include quality of marking processes, curriculum development, formative assessment and Higher Education. His background is in developmental psychology. In 2011, he received his PhD from the University of Manchester; his doctoral research focused on the development of symbolic cognition in pre-school children.

Who are ‘the public’ for educational assessment and do their views matter? - Isabel Nisbet and Stuart Shaw

Isabel Nisbet and Stuart Shaw, Educational Consultants

Theme: The value of assessment literacy

This session will explore the concepts associated with public confidence in (or public opinion about) assessment. Public confidence is a statutory objective of the assessment regulators in England and Wales. But who are the public and what does it mean for them to have confidence in assessment?

The session will start by exploring concepts including public confidence and public opinion in the context of educational assessment. It will then set out and consider arguments for and against the relevance of the public's views. Should we lead, follow, measure or ignore the views of the public? The public may be inexpert in assessment, most will be untutored in measurement theory and practice, and they may be influenced by inaccurate or partisan reporting. Should assessment professionals, researchers, teachers and lecturers take account of their views, engage in public debate and/or change what they do to gain public approval?


Isabel Nisbet

Isabel's professional career was in government and regulation. She was the first CEO of Ofqual, the regulator of exams and qualifications in England. She is now active in consultancy and non-executive roles and has continued her academic interest in educational assessment and in the philosophy of education.

From 2021-2023 Isabel served on a panel appointed to review all aspects of education in Northern Ireland. Their report was published in December 2023.

Isabel was the co-author, with Stuart Shaw, of Is Assessment Fair? published by SAGE in 2020 and Educational Assessment in a Changing World: Lessons Learned and the Path Ahead, published by Routledge in 2024.

Isabel has served on the Boards of Governors of four universities in the UK and is currently Vice-Chair of Governors of the University of Bedfordshire. She has served on the Board of Qualifications Wales and continues on its Research Advisory Group. Isabel also served on the Transition Board overseeing the establishment of Qualifications Scotland. She is a Trustee of the Methodist Independent Schools Trust.


Stuart Shaw

Stuart is an educational assessment researcher, consultant, and author. He is Honorary Professor of University College London in the Institute of Education - Curriculum, Pedagogy & Assessment.

Stuart has worked for international awarding organisations for over 20 years and is particularly interested in demonstrating how educational, psychological, and vocational tests seek to meet the demands of validity, reliability, and fairness. He has a wide range of publications in English second language assessment and educational/psychological research journals, as well as books. Stuart is Chair of the Board of Trustees of the Chartered Institute of Educational Assessors (CIEA). He is also a Fellow of the CIEA with Chartered Assessor Status.

Stuart is a Fellow of the Association for Educational Assessment in Europe (AEA-Europe), an elected member of the Council of AEA-Europe and is Chair of its Scientific Programme Committee. He is also an Honorary Lifetime Member of the Board of Trustees of the International Association for Educational Assessment (IAEA). He has recently been appointed Director on the e-Assessment Association Board and is Chair of the ‘Research’ Awards Panel for the e-Assessment International Conference & Awards.

Stuart regularly presents at British, European and international conferences and has given keynote presentations. He is currently engaged in a major writing project with research colleagues from Trinity College Dublin (Ireland), Prof. Damian Muchan and Dr Evgenia Likhovtseva, which focuses on externally moderated school-based assessment from an international perspective. The book, International practices in moderating high-stakes, school-based assessment, will be published by Palgrave in mid-2026.

Member case study sessions

New for 2026, we are pleased to be showcasing practical stories of innovation and sharing insights from members of The Assessment Network.

Nancy Prabhu, Chatrabhuj Narsee School, Mumbai

Nancy Prabhu

AI and the learner: Empowering reflection, reasoning, and growth

A case study of AI use in the classroom

"In my classroom, I integrate AI-driven approaches to enhance teaching, learning, and assessment. By using AI tools for feedback, reflection, and question design, I help students understand what quality looks like, develop stronger reasoning, and build the confidence to become reflective, self-regulated learners. Thoughtful and constructive feedback supported by both human insight and AI forms the foundation of the transformative pathways for my learners, so they grow into independent and confident thinkers.

Using AI for assessment design and constructing questions pushes the learning curve into higher-order thinking. AI reviews question papers for overlaps, appropriate difficulty levels, and balanced assessment objectives, ensuring they align with key assessment principles such as validity, reliability, fairness and comparability.

This process ensures questions measure what they are meant to measure, and students are assessed consistently across all groups. The standards remain consistent across both the components of business studies, and assessments become more transparent and instructionally meaningful.

When guided thoughtfully, the partnership between AI, the teacher and the learner transforms the assessment and reflection process into pathways for transformative and sustainable growth."

Takeaway for attendees:

  • How do we balance technology with the irreplaceable human element?
  • How can AI elevate metacognition and self-reflection in assessment preparation?

Nancy is an Upper Secondary IGCSE Business Studies teacher at Chatrabhuj Narsee School in Mumbai. She is a member of The Assessment Network at Cambridge, and a Cambridge Assessment Specialist with experience of more than eight assessment cycles.

Joseph Onyancha Mogunda and Caroline Baldwin, Yingya St. Peter’s School in Haikou, China

Joseph Onyancha Mogunda

Examiner feedback to student action: building feedback literacy for all

A case study of feedback literacy for mixed stakeholders

Joseph Onyancha Mogunda is Head of Humanities and a Further Mathematics teacher at Yingya St. Peter’s School in Haikou, China. In his case study session, "Examiner Feedback to Student Action: Building Feedback Literacy for All," Joseph draws on his classroom experience, examining background, and school-based innovation work to explore how feedback can be designed and delivered so that students understand it, value it, and use it.

The session focuses on building feedback literacy among teachers, students and parents, and on practical strategies that turn examiner-style feedback into concrete next steps for learning.

Joseph’s interest in developing a feedback ‘innovation’ at Yingya St. Peter’s School arose from a shared recognition that feedback must go beyond comments on work and become information that learners can act upon to make measurable progress. In response, the school has put a mechanism in place to ensure that feedback is captured and forms a key part of the weekly learning behaviour reports.

These reports include a specific segment on students’ responses to feedback, allowing teachers to monitor how learners engage with and act on the guidance they receive. The reports are also shared with parents, helping to keep the feedback loop open between school and home and to promote a consistent focus on improvement.

In Joseph’s leadership role he oversees curriculum planning and implementation across the Humanities department, ensures the quality of assessment design and marking, and leads professional development for colleagues. A keen assessment practitioner, Joseph is working towards the Advanced Assessment Practitioner Award and is currently pursuing an MA in Educational Leadership. He has been teaching in high schools for the past 14 years and serves as an examiner for an awarding organisation as well as a professional development facilitator.

Caroline BaldwinJoseph’s co-presenter for this session will be Caroline Baldwin, Dean of Academics at YSPS. Caroline has responsibility for overseeing assessment systems, curriculum design, and academic standards across bilingual and international pathways. She holds an MA in Educational Technologies & Instructional Design, with research grounded in data-informed teaching and instructional system design.

Over the past 15 years, Caroline has gained teaching and leadership experience in the UK, China, Myanmar, and Cambodia across public, private, and international settings. She has also worked within Montessori, PYP, Oxford International, and Cambridge IGCSE programmes, with particular focus on the development of coherent literacy pathways and data-driven intervention models.

At YSPS, the English curriculum has been strengthened through the introduction of reading interventions, vocabulary development frameworks, and structured self-study pathways for language acquisition.

Prior to senior school leadership, Caroline spent more than five years as a Project Manager and Instructional Designer. Caroline has also published comparative work examining post-colonial education policies in China, Singapore, and Myanmar, with a particular interest in the intersections between globalisation, curriculum reform, and national development agendas.

Her professional practice remains centred on data-driven school improvement, instructional design, and the creation of sustainable academic systems that raise student outcomes.

Practical workshops

Enhance your learning by choosing from a selection of practical taster workshops led by experts across Cambridge. The sessions will take place in person and online on Friday 24 April.

Session 1 - in-person attendees choose one of:

In person workshop: How do I know my assessments have worked well? - Dr Simon Child

Dr Simon Child
Dr Simon Child, Head of Assessment Training, The Assessment Network at Cambridge

How do you know your assessments have worked well? Build your confidence with this introduction to assessment validation and gain practical methods for collecting different sources of data to ensure your assessments meet their intended purposes.

Assessment validation is essentially about collecting evidence to support how assessment results are interpreted and used. It plays a crucial role in ensuring that decisions based on those results are fair and appropriate.

This taster workshop will help you understand the concept of validation in educational assessment. You’ll gain a clear overview of how to judge whether an assessment has measured what it set out to measure, and whether the results can be relied upon for their intended use.


Dr Simon Child is Head of Assessment Training at The Assessment Network at Cambridge. Previously, he was a Senior Research Officer in the Assessment Research and Development Division of Cambridge Assessment. He's conducted research in the field of qualifications reform and development since 2012.

His other research interests include quality of marking processes, curriculum development, formative assessment and Higher Education. His background is in developmental psychology. In 2011, he received his PhD from the University of Manchester; his doctoral research focused on the development of symbolic cognition in pre-school children.

In person workshop: Developing an approach to assessing competence – Margaret Cooze

Margaret Cooze
Margaret Cooze, Freelance language and assessment specialist

In this taster workshop, trainer Margaret Cooze will discuss the challenges of assessing competence, using a range of subjects as examples, before looking at how a competence framework can help to develop valid assessment methods and maintain reliability in a practical way.

The workshop will draw on different vocational areas as well as practical elements of school-based study, such as the assessment of language speaking skills. It will cover both generic behaviour-oriented competences and task-oriented competences, and will ultimately address the important question: ‘How can we claim someone is competent in this skill?’


Margaret Cooze is a freelance language and assessment specialist. She holds an MA in Applied Linguistics and an MSc in English Language Teaching Management. She has taught in Guatemala, Japan and the UK with a particular interest in the teaching and assessment of writing. 

Margaret has also worked for Cambridge Assessment managing the production of a range of assessments for language learners and teachers. She has also led on assessment reviews, benchmarking and comparability studies as well as supporting awarding bodies in gaining recognition for their assessments. 

She is now involved in the design, development and production of assessment materials, as well as acting as moderator and Principal Examiner for English assessments for Cambridge International. Alongside this, Margaret has recently written for Cambridge University Press, and designs and delivers training on the principles of assessment, question paper content writing, the operational aspects of assessment systems, and language awareness for teachers, with recent work in Sweden, Lesotho, Pakistan, Botswana, Eswatini, The Bahamas and Vietnam.

In person workshop: Authentic assessment and AI - James Beadle

James Beadle
James Beadle, Senior Professional Development Manager at The Assessment Network at Cambridge

How do our assessments need to adapt, if they are to remain authentic in an AI world?

This taster workshop will address critical concerns around authenticity and adaptive thinking, as well as the contexts that might include, or prohibit, the use of AI within an assessment. In this session we’ll look at:

  • How generative AI tools can facilitate students in demonstrating their own understanding through authentic assessment
  • How generative AI tools can act as a creative tool that can help students solve challenging problems and produce complex artefacts.

James is a Senior Professional Development Manager at The Assessment Network, where he designs and delivers training in Assessment for a wide range of customers, including schools, universities and educational ministries.

Originally trained as a mathematics teacher, he has worked in a wide range of contexts, both in England and internationally. He holds a Master’s in Mathematics Education from the Institute of Education, University College London and as part of his Master’s, he carried out a comparative analysis between the Shanghai Gaokao, A Level and IB Mathematics papers.

He is currently a tutor on the University of Cambridge Postgraduate Advanced Certificate in Education Studies: Educational Assessment course and has supported the development of the Assessment Network’s Embedding equity, diversity, inclusion and belonging in assessment practice online on-demand course.

Session 1 - online attendees choose one of:

Online workshop: AI and assessment – Niall McNulty

Niall McNulty
Niall McNulty, Product Lead in Education Futures at Cambridge

Niall is the Product Lead in Education Futures at Cambridge, where he drives product strategy and innovation at the intersection of AI and education. He develops technology solutions that prepare learners and educators for an AI-integrated future. Niall leads product discovery and development for digital learning tools, including AI-powered content creation platforms and intelligent learning assistants.

He also supports the Cambridge-HP EdTech Fellowship Programme, guiding ministry officials and policymakers on AI integration in education while developing specialised training materials on AI in education for government officials, school leaders and teachers. Additionally, as a member of the HP Futures Council on AI & Teachers, Niall contributes to strategies that help educators engage with AI tools and methodologies across diverse educational contexts, enhancing both teacher productivity and professional development.

Joseph Onyancha Mogunda and Caroline Baldwin, Yingya St. Peter’s School in Haikou, China

Examiner feedback to student action: building feedback literacy for all

A case study of feedback literacy for mixed stakeholders

Session details above in Member case study sessions.

Session 2 - in-person attendees choose one of:

In person workshop: Adaptive Comparative Judgement for testing creative and technical subjects – insights from a UK school's pilot - Victoria Merrick

Victoria Merrick, CEO and Founder of Merrick-Ed Limited

In this presentation session, Victoria will share learning from pilot projects in UK school groups that are implementing Adaptive Comparative Judgement methodologies to support authentic assessment and moderation of creative, practical, performance and technical subjects in standardised testing, at scale.

Victoria is Founder and Director of Merrick-Ed Limited, an education consultancy practice underpinned by 19 years’ experience of leading and teaching in the East Midlands, UK. Her work as Trust Lead for Assessment, Exams and Progression and as Trust Data Strategy Lead has directly improved outcomes for students across all phases, from EYFS to Sixth Form. Victoria’s belief in the power of trust-wide collaboration inspired her conception of a bespoke ‘Common Assessment Framework’.

In person workshop: Creating a framework to evaluate digital assessment readiness - Sanjay Mistry and Stephen Kemmery

Sanjay and Stephen will begin by setting the scene with a short introduction to the purpose of digital exams and why they matter, challenging assumptions about whether they are truly the future and what schools actually want.

Participants will then explore Cambridge’s research findings, including the Digital Readiness Framework, which outlines seven critical dimensions for successful implementation: hardware, connectivity, resources, student readiness, digital teaching and learning, readiness timescale, and risks and concerns. This segment also emphasises the distinction between readiness versus willingness, highlighting why technical capability does not always translate into intent.

Next, the session moves into an interactive phase where participants quickly assess their own organisation's readiness using the framework. This is followed by a discussion on barriers through the COM-B behavioural lens (Capability, Opportunity, and Motivation), helping participants identify which challenges resonate most and brainstorm practical solutions.

The workshop concludes with strategies for overcoming these barriers, including Cambridge’s phased approach and the Early Adopter Programme, and invites participants to commit to one actionable next step they can take back to their institutions.


Sanjay Mistry

Sanjay is Head of Digital Insight and Impact at Cambridge. He leads the definition and delivery of the research, insight and impact strategy that underpins the organisation’s strategic roadmap for its digital products and services portfolio, serving the international and UK markets, and works with schools, school groups and Associates to bring them on Cambridge’s digital journey.

Stephen Kemmery

Stephen is Digital Development Lead at Cambridge University Press & Assessment. He leads global strategies for implementing high-stakes digital exams, working with research, product, policy, schools and partners to design next-generation customer enablement for digital qualifications. Stephen brings extensive experience in education leadership and digital transformation, having previously driven innovation strategies within Further and Higher education across the UK and Europe, as well as leading on teaching, learning, and assessment initiatives. Passionate about advancing educational outcomes through technology, he focuses on equity of access to digital education, global readiness, and the integration of emerging technologies to shape the future of assessment.

Session 2 - online attendees choose one of:

Online workshop: Top tips for effective question writing - Rebecca Wall

Rebecca Wall
Rebecca Wall, Senior Assessment Manager at Cambridge

Rebecca is a Senior Assessment Manager, working in the Primary and Lower Secondary Assessments team. Prior to joining Cambridge, she spent nine years as Head of Science in an international school in the Cayman Islands. She began her teaching career in the UK before moving to Boston, USA to take on a leadership role in science education, including curriculum and assessment development.

At Cambridge, Rebecca oversees the production of primary and lower secondary science assessments. She is also involved with assessment development for overseas partners, and plays a key role in delivering item-writer training and supporting teachers in using assessment data to inform teaching and learning.

Online workshop: AI and assessment in the classroom with Nancy Prabhu, Chatrabhuj Narsee School, Mumbai

AI and the learner: Empowering reflection, reasoning, and growth

A case study of AI use in the classroom

Session details above in Member case study sessions.

Group bookings

Gain insights from Assessment Horizons 2026 with your colleagues through one of our group booking packages. Learn together at a discounted rate.

Group bookings info

Attend in-person as a group, stream to your team in the office or join online separately across multiple locations.

All participants will receive a conference certificate of attendance and will have the option to upgrade to membership of The Assessment Network.

We’re offering the following packages for online attendance. Our early bird rates offer a significant discount compared to both standard and individual ticket prices.

  • Up to 10 people: £1,035 early bird (until 27 February 2026); £1,350 standard
  • Up to 15 people: £1,460 early bird; £1,900 standard
  • Up to 20 people: £1,840 early bird; £2,400 standard

In-person attendance group booking packages will be priced on request.

To discuss your organisation's needs and for further information on pricing please get in touch stating how many places you are looking to book.

FAQs

What travel and transport options are available?

The conference is being held at The Triangle Building, Shaftesbury Road, Cambridge CB2 8EA. The Triangle is easily accessible via public transport. The building is situated 0.7 miles (an 8-15 minute walk, depending on your pace) from Cambridge Railway Station. The nearest parking facilities are:

  • Cambridge Railway Station (0.7 miles)
  • Cambridge Leisure Park (0.9 miles)
  • Trumpington Park & Ride (3 miles)
  • Babraham Road Park & Ride (3 miles)

By Air: Stansted Airport, approximately a 40-minute drive from The Triangle Building.
By Bus: Many buses, operated by Stagecoach, stop at the nearby Cambridge Railway Station. Whippet buses U1 and U2 stop at the top of Shaftesbury Road.
By Taxi: Taxi drop off point is in front of main reception.

If you have a Blue Badge and require a parking space (we have a limited number available) please get in touch.

What does my ticket price include?

The in-person ticket price includes two days of keynote and breakout presentation sessions, taster workshops as well as refreshments and lunch. Plus, there will be an informal networking and drinks opportunity on the evening of 23 April.

The online ticket price includes access to a live stream of sessions throughout the event, with the opportunity to ask questions.

All ticket holders will also have access to an online resource with videos from the conference after the event.

Where can I stay?

Accommodation options near to The Triangle Building include:

  • ibis Cambridge Central Station, 2 Station Square, Cambridge CB1 2GA
  • Clayton Hotel Cambridge, 27-29 Station Rd, Cambridge CB1 2FB
  • Travelodge Cambridge Central, Cambridge Leisure Park, Clifton Way, Cambridge CB1 7DY
  • The University Rooms site can help you find rooms within Cambridge Colleges. Please note these are not as close to The Triangle Building as the options above.

Please note Cambridge isn’t affiliated with these providers, so if you’d like to book or have any questions, please reach out to them directly.

What if I need to cancel?

Cancellations made via the course booking portal, or by written notification, at least five working days before the event date will incur no charge. For cancellations made less than five working days before the event, the full event fee will be chargeable.

In-person attendees who wish to swap to an online place and receive a partial refund must do so by 4pm on Friday 10 April 2026.

To enable us to offer your place to another delegate, please notify us of your cancellation as soon as possible.

Can I transfer my ticket to a colleague?

Yes, in-person or online attendees can transfer their ticket to a colleague. If you wish to do this, please let us know by Friday 10 April 2026 by emailing thenetwork@cambridgeassessment.org.uk.

Do you cater for specific dietary requirements?

If you need to let us know about a specific dietary requirement, please log into the course booking portal, and choose 'Update details' from the left menu. Then, update the information on your dietary requirements in your profile before Friday 10 April 2026.

Contact our team

Our team of experts are here to help you find the perfect course for your learning needs or answer any questions about what we offer.

thenetwork@cambridgeassessment.org.uk

Conference archive

Do you want to learn more about previous Assessment Horizons conferences? You can read all about our past events on our dedicated archive page.

Find out more

Become a Member

Gain free online access to Assessment Horizons and enhance your status as a recognised expert by becoming a member of The Assessment Network at Cambridge.

Find out more