Lucy Chambers

In 2012 I joined the Innovation and Development team of the Research Division at Cambridge Assessment, and have since worked on a number of projects, including developing methods and metrics to monitor the quality of marking, developing data and information reporting systems, and investigating examination comparability. My current interests include the moderation of school-based assessment and students’ examination writing.

Prior to working at Cambridge Assessment, I taught English in Japan, the Czech Republic and the UK, in both the private school and business sectors. I joined Cambridge Assessment in 2004 and initially worked for the Cambridge English Language Assessment group, conducting research in the areas of assessing speaking, examination impact, workplace and business English, and language benchmarking in the business sector.

I hold an MA in Applied Linguistics from Anglia Ruskin University, a PGDip in Health Psychology from City University and a BSc in Psychology from the University of Stirling.

Outside of work, I enjoy gardening, house renovation and a little bit of dancing. I volunteer for a medical charity and have recently become a lay member of its research grants panel.

Publications

2017

Alternative uses of examination data: the case of English Language writing
Chambers, L., Constantinou, F., Zanini, N. and Klir, N. (2017). Paper presented at the 18th annual AEA-Europe conference, Prague, Czech Republic, 9-11 November 2017.
Formality in students’ writing over time: empirical findings from the UK
Constantinou, F., Chambers, L., Zanini, N. and Klir, N. (2017). Paper presented at the annual European Conference on Educational Research, Copenhagen, Denmark, 22-25 August 2017.
Evaluating blended learning: Bringing the elements together

Bowyer, J. and Chambers, L. (2017). Evaluating blended learning: Bringing the elements together. Research Matters: A Cambridge Assessment publication, 23, 17-26.

This article provides a brief introduction to blended learning, its benefits and the factors to consider when implementing a blended learning programme. It then concentrates on how to evaluate a blended learning programme and describes a number of published evaluation frameworks. There are numerous frameworks and instruments for evaluating blended learning, although no particular one seems to be favoured in the literature. This is partly due to the diversity of reasons for evaluating blended learning systems, as well as the many intended audiences and perspectives for these evaluations. The article concludes by introducing a new framework which brings together many of the constructs from existing frameworks whilst adding new elements. Its aim is to encompass all aspects of the blended learning situation, permitting researchers and evaluators to easily identify the relationships between the different elements whilst still enabling focused and situated evaluation.

2016

Research Matters Special Issue 4: Aspects of Writing 1980-2014
Elliott, G., Green, S., Constantinou, F., Vitello, S., Chambers, L., Rushton, N., Ireland, J., Bowyer, J. and Beauchamp, D. (2016). Research Matters Special Issue 4: Aspects of Writing 1980-2014.

2015

Piloting a method for comparing examination question paper demands
Chambers, L., Greatorex, J., Constantinou, F. and Ireland, J. (2015). Paper presented at the AEA-Europe annual conference, Glasgow, Scotland, 4-7 November 2015.
Piloting a method for comparing examination question paper demands
Greatorex, J., Chambers, L., Constantinou, F. and Ireland, J. (2015). Paper presented at the British Educational Research Association (BERA) conference, Belfast, UK, 14-17 September 2015.

2012

The Hebei Impact Project: A study into the impact of Cambridge English exams in the state sector in Hebei province, China

Chambers, L., Elliott, M., and Jianguo, H. (2012). The Hebei Impact Project: A study into the impact of Cambridge English exams in the state sector in Hebei province, China. Research Notes, 50, 20-23.

An exploration of how independent research and project management skills can be developed and assessed among 16 to 19 year olds
Suto, I., Nadas, R. and Chambers, L. (2012). Paper presented at the British Educational Research Association (BERA) conference, Manchester, UK, 4-6 September 2012.
Test taker familiarity and speaking test performance: Does it make a difference?

Chambers, L., Galaczi, E., and Gilbert, S. (2012). Test taker familiarity and speaking test performance: Does it make a difference? Research Notes, 49, 33-40.

2011

Composition and revision in computer-based written assessment

Chambers, L. (2011). Composition and revision in computer-based written assessment. Research Notes, 43, 25-32.

The BULATS online speaking test

Chambers, L., and Ingham, K. (2011). The BULATS online speaking test. Research Notes, 43, 21-25.

2010

Composition and revision in CB written assessment
Chambers, L. (2010). Paper presented at the 43rd conference of the British Association for Applied Linguistics, Aberdeen, UK, 9-10 September 2010.

2009

Using the CEFR to inform assessment criteria development for Online BULATS speaking and writing

Chambers, L. (2009). Using the CEFR to inform assessment criteria development for Online BULATS speaking and writing. Research Notes, 38, 29-31.

Computer-based and paper-based writing assessment: A comparative text analysis
Chambers, L. (2009). Paper presented at the 42nd conference of the British Association for Applied Linguistics, Newcastle, UK, 3-5 September 2009.

2008

Computer-based and paper-based writing assessment: a comparative text analysis

Chambers, L. (2008). Computer-based and paper-based writing assessment: a comparative text analysis. Research Notes, 34, 9-15.

Research Matters

Research Matters is our free biannual publication which allows us to share our assessment research, across a range of fields, with the wider assessment community.