
Admissions tests: Discovering candidates with the most potential

24 Mar 2021 (27:18)

Download this podcast (mp4, 24.8mb)
This is the fourth in a series of podcasts from Cambridge Assessment Network focusing on different aspects and forms of assessment. Today we are joined by Mike Housden, Senior Assessment Manager, Cambridge Assessment Admissions Testing, to discuss how admissions tests can be used to identify candidates with the most potential, give students an opportunity to demonstrate a core understanding of their subject, and help prepare students for university education.

Cambridge Assessment Admissions Testing offer a range of tests to support selection and recruitment for universities, governments and employers around the world. 

Covering skills such as problem solving, critical thinking, written communication and the application of subject knowledge, admissions tests can identify applicants with the greatest potential for success and give every person the opportunity to let their true potential shine.

Cambridge Assessment Network is an accredited provider of assessment training and professional development. We aim to bring together professionals from the assessment community to share best practice and the latest thinking. You can find out about our upcoming professional development opportunities on our events pages, or to stay up to date with our network, sign up to our newsletters.

You can also find this episode on Apple Podcasts and Spotify.

Podcast transcript

Alana: [00:00:08.53] Hello and welcome back to the Cambridge Assessment podcast. My name's Alana Walden and I'm here to introduce the next episode in our series of podcasts, guest hosted by Cambridge Assessment Network, on the different aspects and forms of assessment. In this episode, we're joined by Mike Housden, Senior Assessment Manager at Cambridge Assessment Admissions Testing, to discuss why admissions tests are important, what skills and knowledge they assess and how they can support widening access to educational opportunities.

Penelope: [00:00:44.17] So, Mike, thank you so much for joining us today. And before we start, would you like to just introduce yourself and say a bit about what you do at Cambridge Assessment?

Mike: [00:00:54.24] Hi. Yes, I'd like to introduce myself. I'm Mike Housden. I'm a Senior Assessment Manager as part of Cambridge Assessment Admissions Testing. Specifically, I'm a chemistry specialist and I work as part of a science team of mathematicians, a physicist and a biologist. We work on a variety of university entrance assessments that are used by highly competitive universities in their selection procedures. Our major headline products are the BioMedical Admissions Test, the BMAT, and the Test of Mathematics for University Admission, sometimes called the TMUA. We also work on a suite of tests for our own university in Cambridge, and deliver some tests for Oxford University for their own admissions processes.

Penelope: [00:01:49.90] So firstly, I'd like to ask, what is an admissions test, who uses them and why are they needed?

Mike: [00:01:59.03] OK, so the university admissions landscape is quite competitive at many universities, particularly in the UK, but we also work with universities across the world. Applications for competitive courses such as medicine, engineering and other physical and biological sciences require information about the people who are applying that covers a wide range of skills and their potential to thrive on a future course. So the point of an admissions test is to help the universities disentangle some of the information from their large numbers of applicants and make objective decisions about who they want to accept on their courses. Typically, for some of the courses with the universities that we deal with, there are at least five applications per place, and we aim to provide a consistent, objective yardstick for the universities to measure all the applicants against as part of that process. We really see ourselves as part of an overall picture, not an absolute pass or fail that decides whether you get a place, but we'd like to give at least one consistent piece of information that every single applicant in a particular year will have on their application.

Penelope: [00:03:18.61] I think you touched on it a bit there already, but how do the universities use the results of the admissions tests to make their selection decisions?

Mike: [00:03:29.58] OK, this varies between the different stakeholders and different universities that we work with, but I can give some examples for medicine courses, for Cambridge University itself and for a variety of others. Our tests are not used as a set cut-off score; they're generally used as a pre-filter, or not, by universities who would normally use interviews as part of their admissions processes. From that point of view, it's useful to have an internal ranking of their application cohorts if they're considering bringing lots of people to interview, and that can influence some decisions. Some of our tests are used to stretch the very top end of an application cohort, to try to differentiate more objectively between people who will have very, very similar grades in their high school education. And some universities will use this to be able to compare their cohorts to other similar universities. This is quite common in the UK with the Russell Group universities, several of which use our BMAT test. And we work with some stakeholders in other countries as well who use our tests as an international benchmark for their own application cohorts when they have people applying from all over the world.

Penelope: [00:04:56.37] And do you think admissions tests are more important now in a time when exams have been cancelled and we have had to rely on teacher assessed grades?

Mike: [00:05:07.80] So this is a very complicated, fast-moving situation that we're dealing with, with teacher-assessed grades, and obviously different countries are using different methods for assessing things. I feel that admissions tests are in a unique place right at the moment to help people with admissions cycles, not just this year, but also the next few years, because high school education has been disrupted at all sorts of levels, which will trickle down for years while people are applying to university. So, for example, GCSEs that have been disrupted recently will impact university admission cycles in a couple of years' time. As I said before, having a constant, objective yardstick for your whole application cohort is something that we can provide that even, say, different exam boards offering A Levels can't provide, because each will have slightly different variations across their application cohorts. When you have predicted grades, one of the main reasons why we've been involved with some of the universities, such as Cambridge University, is that many, many of the applicants have identical predicted grades for competitive courses. This is particularly true in sciences and maths, where the lion's share of all the applicants will have straight top grades in whatever qualification system they're working in, such that predicted grades become not particularly useful for differentiating the strongest within those cohorts. We also work off cycle, if you like, from the conventional standard qualification systems. We run sessions generally in the UK in October or November of Year 13 for UK students, and depending on which universities we're working with around the world, we have some other sessions that run throughout the year. So we can provide something a bit different, a slightly different measure, in a more agile way perhaps than standard government-controlled qualifications can.

Penelope: [00:07:24.03] So what skills and knowledge do your admissions tests look to assess and what makes them different to other high stakes testing?

Mike: [00:07:36.88] So our major focus is the university admissions cohort. We are generally testing people about a year before they start a specialist university course, such as medicine or engineering or any of the sciences. School qualifications such as A Levels will be focusing on a combination of acquiring new knowledge, then testing that knowledge and how you can use it in different problem-type scenarios.

Mike: [00:08:12.07] What we would hope to test, by contrast, is fluency and agility with those subjects, which is an important skill for going beyond A Level into a subject-specific higher education qualification. So we do a variety of tests, some of which are based on a curriculum. Where we do base a test on, say, a science curriculum, what we do is assess the landscape, comparing, say, Cambridge International and OCR, but also every other exam board in the UK and Scotland and around the world, and we design a curriculum which is purely the core material that people would have encountered up to the age of 16. So it's generally something that they should have become familiar with by the time they're taking these tests at the age of maybe 17 or 18, and we design questions that are based on something that should be really well established at that time. We also run some tests, such as the Thinking Skills Assessment, which are curriculum-free and assess understanding of arguments and wider problem-solving skills. So why these are a bit different is that we have a very strong focus on the application end of things, rather than assessment objectives around recall of facts.

Mike: [00:09:37.99] So we will design questions that, for example, use familiar topics but in a slightly unfamiliar scenario. We will combine topic areas, say from maths and chemistry at GCSE-type level, to solve problems which are slightly unfamiliar to students who have routine past-paper-type tests that they can get familiar with to start with. We're looking for the ability of a student to combine ideas together, maybe across disciplines and across specifications, which are the types of skills that are really important for progressing onto a university course in these subjects. That's our intention: to be something different. In terms of how the tests differ from school qualifications, A Level, for example, will be focusing on finding a national baseline of understanding of, say, chemistry, where the focus is on differentiating and spreading students out over an A* to E type grading system, whereas top universities and highly competitive courses will often find themselves with students with collections of the top grades in every subject applying all at once, and still in the ratio of five applicants to one place.

Mike: [00:11:09.97] So what we do is focus on: can we design a test that will really stretch the top end of this cohort, maybe in a way that they've never had before? The focus in designing our tests is, if you have a lot of people who are getting straight-A grades, for example, across all their different subjects at A Level, how can we design a test that really probes their fluency with that material, as opposed to routine processing of past papers or rote learning of material en masse? So we're trying to stretch some cognitive differences that are useful for people going forwards, where the techniques of rote learning will become less useful to them.

Penelope: [00:11:55.19] So what types of questions are used in the admissions tests? Is it often multiple choice questions or is it always multiple choice?

Mike: [00:12:05.08] It's not always multiple choice, but the vast majority of the products that we have in our admissions testing are multiple choice. That's partly because we need a fast turnaround for our stakeholders. For example, we run our UK tests in November; the UCAS application deadline for the UK universities, certainly some of the competitive ones, is the 15th of October, and they would like to be interviewing candidates in late November and early December. So it's important that we can get a quick turnaround of marking at that stage, and the multiple choice format is very useful for that. In terms of the design of the multiple choice questions, really the important factor is the design of those questions at the start. If you want that quick win at the end with the quick marking, you need multiple choice questions that function coherently as a test when you put them together as a collection. So we put a lot of effort into how we write these. One of the major factors in designing good multiple choice questions is thinking about what the incorrect options actually are. We consult with many teachers and other examiners from different exam boards across the UK, and more widely sometimes, to assess what sorts of things are a problem for students learning at a particular time.

Mike: [00:13:34.03] And we learn a lot about what the actual misconceptions of topics are, and we try to factor those into our assessments very actively. So if you have a multiple choice question testing chemistry, for example, we want to make sure that if someone is harbouring a major misconception about chemistry that would be a real hindrance to them when they start on a university course, which would then mean they struggle with that course and maybe drop out, they go through the test in such a way that they feel like they can find an answer that's based on their misconception, and then they will pick that. And if you collect together a whole load of test items, say in chemistry, based on balancing chemical equations, those types of candidates will tend to pick certain options and we can track that through the test. This becomes part of our general analysis for any of the tests that we do.

Mike: [00:14:35.95] So putting in the effort to design the questions is key. We do do one test that isn't a multiple choice test, which is a very high-level maths test: the Sixth Term Examination Papers, the STEP papers, which are used predominantly by Cambridge University for admissions to their maths Tripos, but also by a few other universities such as Warwick and Imperial in assessing applicants for really world-leading departments in mathematics, where competition is very high and the top grades at A Level really don't differentiate the very top candidates. This is particularly a problem in maths. So in STEP, for example, candidates will be exposed to much longer, free-response-type questions that are based on the material they've covered at school, but that really require a very deep appreciation of that material. We take on board the expectation that lots of candidates applying to Cambridge for maths will be averaging close to one hundred percent in all of their subjects and modules in mathematics. So how can we design tests that stretch that particular cohort, as opposed to trying to find where a grade boundary is, for example?

Penelope: [00:15:59.53] And can you tell me a bit more about how the admissions tests are developed?

Mike: [00:16:05.97] So some of the things we do are maybe a bit different to other parts of the organisation, in that we have quite a lot of stakeholder engagement, because our tests are designed to be somewhat bespoke for different universities, with different requirements than normal qualifications would have.

Mike: [00:16:23.42] So, for example, for our BMAT qualification, which is used by several medical schools in the UK, we work with the academics in the universities to review our tests as we go along, and they can have some influence on what types of things they would like to have in the tests, so that we can map a sort of disconnect between high-school-type science education and university-level requirements for science education. One important thing we also do with our stakeholders, because there are potentially some quite contentious issues to do with people's applications to universities and whether a test is fair at this level, is that once we've had our item writers and chairs writing some multiple choice questions, for example, in a similar way to how many other people in Cambridge Assessment would be developing questions, our next stage is to send the questions through what we call a science vetting process, where we get university academics to review all of the questions for the integrity of their scientific content and skills. Basically, we ask them to pick apart the question in the most pedantic way possible.

Mike: [00:17:58.06] Should there be any way that you could get the answer by using incorrect science, or the science involved is actually a simplification that's used in high school education but is not particularly accurate, we would like to eliminate those questions from our bank. We find this level of engagement between the academic world and our tests very useful for making sure that we don't disadvantage candidates who are very, very strong. If we're trying to test the very top end of, say, an A Level cohort, those students might well be aware of many of the limitations of their A Level curriculum, because a general interest in the subject has led to wider reading, participation in, say, Olympiads, and an awareness of a lot more depth in their subject. So it's an important part of our process, and one that I think is unique to us: this stress testing, if you like, for academic rigour. It also protects us from the challenges that people sometimes make to our testing, that maybe this is a simplification that's taught but not actually scientifically accurate.

Mike: [00:19:12.88] I think that's one unique thing that we bring to the playing field here.

Penelope: [00:19:19.08] And how can students prepare for an admissions test?

Mike: [00:19:24.96] So, as I mentioned before, the idea with these things is that we're not trying to supersede A Level and GCSE and what they're trying to do at school.

Mike: [00:19:32.82] I mean, there is a fundamental learning about your subject which those assess, and we're not trying to interfere with the school education system at all in any country. What we're trying to do is develop something where, even if there is a curriculum, it's material that the students should really be familiar with at this point. If they're thinking of applying to a science-type degree at university, for example, they should probably be pretty familiar with what Pythagoras's theorem is and how you might be able to use it. So in terms of preparation, what we do is make a lot of material freely available on our website, where we've set out very clearly the specification, which we define as a core specification of what we think students really should understand quite well the year before starting a university course. We encourage applicants to look at our website and go through that specification, just to highlight if there are one or two bits that they're a bit rusty with, or that maybe are missing from their particular course for whatever reason, so that they've got time to ask a teacher to go through that with them. We also have a free subject guide on our website for BMAT, which would also apply to other things. It fleshes out our specification and gives some examples of the types of ways we will be asking questions about those particular topics.

Mike: [00:21:02.92] And it's really important for us that this is all freely available. We don't endorse any training courses for these admissions tests, we want to make sure that these tests are accessible to absolutely everyone, and we want to make it very clear in the media that courses claiming to have secret knowledge of how to do well in our tests actually do not. So in terms of preparation, the main thing that we would encourage students to do is to continue working hard with their A Levels or equivalent qualifications. That, at the end of the day, is the core thing that we aim to be stress testing in our assessments. They also need to be familiar with the format, because the multiple choice format is maybe a bit unusual to some people, and particularly with a test where students won't be getting near 100 percent. We're taking a cohort of people who are used to getting everything right, for example, and giving them a test with an average score of 50 percent, and that can be quite uncomfortable for these particular students. Getting used to that is the best thing that they can do, along with practising under timed conditions. We have some research showing that that is the biggest predictor of being able to do our tests well, and it gives people the best opportunity to get a good score put onto their UCAS form by our tests.

Penelope: [00:22:36.96] And do you think that the preparation for the admissions tests can have a positive impact on a student's learning and understanding?

Mike: [00:22:47.53] Absolutely. When you're applying to university, there are a lot of unknowns while you're still at school, and preparing for these types of tests really does make you able to self-assess how well you really do understand some of these subjects, rather than necessarily just enjoying being surrounded by them. Because when you're studying independently at university, you really do have to have that individual motivation. So preparing for these tests really is one of the few things out there that can assess whether you have a broad knowledge base that's pretty strong, together with whether you can apply that knowledge, as opposed to more passive learning by reading textbooks, for example. So I think students can use this as complementary to, say, their A Level studies: just a slightly different way of assessing whether they have a deep core understanding of their subject, as opposed to just knowing some new things that they're learning as and when. I think the emphasis in high school education is very different to university education. So it does start to point towards ideas of: can you think about the problems in more flexible ways? And people might engage with that very well and actually really like that.

Mike: [00:24:15.52] And if they do, they're likely to do very well at university, and particularly in top courses at top universities.

Penelope: [00:24:24.85] Do you think admissions tests can support widening access to educational opportunities?

Mike: [00:24:32.09] So as I mentioned before, one of the main focuses of having admissions tests is not to be a sort of hurdle to get over in any sense. It's a way for people to put something very objective onto their UCAS forms that is comparable across all of the different candidates, rather than something that is a function of whether your school has more resources, or better teachers, for example, or more experience with preparing students for these universities. So in terms of widening access, that's partly why we've provided all of our materials for free on our website, particularly the sorts of revision guides that we have for science topics, and we want to make sure that everyone has equal access to all of that material. It's also why we don't tend to test material that might be a source of special knowledge, for example, that people are being taught at some schools and not others. We keep our curriculum content low, but we test whether people really understand it. I think that is different from some other assessments, and hopefully that helps. If someone is genuinely really, really strong in their subject, they can have a score that tells the university that, despite some difficulties that they've had in the past, say with their GCSEs, they're in the top 10 percent of all of the applicants, at a point just before they apply to university. It allows the university to judge their previous qualifications, which might have been disrupted, for example, in a lot more context. So the universities really want to be able to use these types of things for contextual information rather than a cut-off score, because perhaps your predicted grades aren't as strong based on something that's happened in your past, and we want those applicants to be able to show what they can really do at a relevant time, just before their application.

Penelope: [00:26:42.59] Mike, thank you very much. If you enjoyed listening to this podcast, you can find out more about the assessment network via the links in the description, join our community on LinkedIn and look out for the next podcast in this series.

Alana: [00:26:59.48] Thank you for listening to the Cambridge Assessment podcast. You can find more of our podcasts on our website, just search podcast gallery, or you can find us on Apple Podcasts or YouTube.

