Member spotlight: Vocational assessments with Phillip Bryant

AI in assessment design, the benefits of competency-based assessments and apprenticeships - vocational assessments with Phillip Bryant


"I currently work for the International Compliance Association (ICA), the leading professional body for the global regulatory and financial crime compliance community.

I joined ICA in 2016 following time working for UK-based awarding organisations AQA and OCR. At AQA I was part of the team developing a new range of technical and vocational qualifications, and I was at OCR (part of Cambridge University Press & Assessment) for 13 years, working in a number of roles – principally managing the team of subject specialists responsible for qualifications for 14-19 year olds in ICT, Computer Science, D&T and Health & Social Care.

When I joined ICA, my role was focussed on our suite of professional qualifications at level 2 through to level 7 which we award in association with the University of Manchester’s Alliance Manchester Business School.  In 2017 we became active in apprenticeship assessment as an extension of how we support the regulatory compliance sector, by offering end point assessment (EPA) in two compliance related apprenticeships.

Since then we have expanded the work that we do and now offer EPA for 18 different apprenticeships, and as part of that growth and expansion, my role became wholly focussed on EPA and apprenticeships."

How are your team – and the apprenticeship sector more broadly – responding to the opportunities and challenges afforded by AI?

“This is an interesting topic for us, as it is for most people in assessment, and I think everyone’s understanding of the opportunities and challenges relating to the use of AI is evolving. To begin with I think there was a general fear of misuse, and it was perceived as a threat to academic integrity. It continues to present some challenges, but the sector is also now beginning to understand some of the potential benefits it offers.

There are some acknowledged opportunities in the area of assessment design and in the writing of EPA policies or procedures – not necessarily to replace the human input into either of those areas, but to serve as a tool that aids practitioners' work. My experience is that when AI is used without human intervention it can go badly wrong. The research tends to acknowledge that AI is not yet developed to a level where it can be left to perform functions entirely independently of human control!

In terms of how this relates to the work that apprentices submit as part of their assessment, the key is for all awarding organisations to provide clear guidance to all stakeholders on what is or isn’t allowed. The industry consensus seems to be that apprentices are okay to use AI as part of their research but should not use it to generate content that they will present as their own for assessment - and that any work submitted must be their own. Not following this rule would then be seen as malpractice, and normal policies and procedures would then apply. 

The challenge for the vocational/apprenticeships assessment sector is to consider how to work with cases where AI might be the natural way of producing documents in their work setting.

From an assessment point of view, we in apprenticeships are not as prone to some of the potential misuse of AI (eg essay or assignment writing) that some sectors or organisations may face, as the substantive parts of our assessment are delivered through forms of oral assessment or observation of work-based activity.

If AI is used in the apprentice’s job role, then there is an argument that they should be able to reflect this in the content of their portfolio or in any projects, reports or other forms of written evidence they submit – in our assessment, they would then have an accompanying discussion where the assessor will be able to probe the apprentice’s true understanding and competency.

The reality however, is that most professional workplaces either are, or soon will be, making some use of AI, and so for many job roles (and therefore apprenticeships) being able to master the use of AI could be deemed to be a relevant skill."

What could assessment practitioners working in different disciplines learn from assessment methodologies used for apprenticeships and similar programmes?

“A lot of the methodologies we use are influenced by the nature of work-based learning, which will be very different to other areas and types of assessment. However, there are some ideas, principles or approaches that may be transferable to other disciplines.

I’m not saying that the points below are unique to EPA and apprenticeship assessment, but these are the things that I find most striking about the way in which skills-based assessment is carried out:

• EPA allows apprentices to evidence competencies in ways that are naturally occurring and relevant to them and their role, rather than using theoretical assessment or only assessing specific criteria in set questions;

• There are no set right or wrong responses and the assessment is based around the individual – assessors keep an open mind on what an appropriate response would be and look for the underpinning competency. It is possible that two apprentices, asked the same question about the same scenario, could come up with two opposing responses, both of which could be correct if there is a rationale and justification for the approach taken;

• EPA picks the assessment methodology that best assesses the competency. Each apprenticeship has its own assessment plan, in which the Trailblazer Group (the group of employers and stakeholders responsible for creating it) picks the methods of assessment best suited to the content/competency being assessed, rather than applying rigid formats or making every apprenticeship uniform. Most include some form of professional discussion based on a portfolio of evidence gathered during the apprenticeship, but beyond that there can be knowledge exams, projects, reports, presentations, practical observations etc. This approach means that the assessment is based on the naturally occurring and relevant ways of evidencing the knowledge, skills and behaviours;

• The use of face-to-face/1-2-1 assessment is great for allowing the apprentice to talk about their work, and to articulate their experience and competency. Oral assessments, where the context (ie the apprentice’s job role, organisation and sector) dictates the nature of the evidence, tend to generate responses from the apprentices that other forms might not – they approach the assessment with a lot more confidence, feeling that they can use their experience to justify their approach or arguments, and that the assessor is making a judgement based on their work, experience and competency;

• The assessment criteria for EPA are written by employers, so they focus on what is important to know or be able to do rather than those things that are easy to measure or assess – this can sometimes present challenges for the assessment organisation, but it gives us (and the apprentice) the assurance that we are assessing the right areas and assessing them in meaningful ways;

• All of the competencies required to carry out a job are expressed as pass criteria, which makes it easier for everyone concerned to understand the requirements;

• Apprentices are able to demonstrate skills and knowledge at the same time rather than there having to be a separate assessment of each type of competency.”

What do you find interesting about assessment learning? Have you had any ‘lightbulb moments’ over the years that you might like to share with the community?

“I think my biggest area of increased understanding and learning has been around accessibility in assessment design – this relates to supporting students with disabilities and additional needs, but also goes beyond that to removing any broader unconscious bias, assessing across different geographical regions/educational practice, providing fair access to everyone, and considering how the assessment should be equally accessible to people from a range of different professional sectors/backgrounds.

I have also recently managed a transition from traditional (paper-based) exams to online exams, and it was interesting to see how the difference in delivery models affected every stage of the assessment – from question writing, test design and rubrics, to the nature of the answers candidates wrote, the different types of questions candidates asked in advance, the different risks of malpractice, the way in which the papers were marked, and the QA processes around the assessment.

The best advice I can offer anyone about to embark on a similar journey is to speak to people who have already made that leap to benefit from their experience, and to think in advance about how each stage of the assessment needs to change.”

This discussion first appeared in Perspectives on Assessment, the Cambridge Assessment Network member newsletter, which features key voices from the assessment community along with other member-exclusive content. Would you like to feature in a future Member Spotlight? We'd love to hear from you - get in touch.

Assessment practitioner awards

Our awards have been designed to recognise and demonstrate a commitment to developing your assessment expertise.

Build and develop your practice


Focusing on an important element of assessment design and practice, our assessment practitioner workshops are designed to fit seamlessly around your work commitments.