
Are international schools asking the right interview questions about assessment?
  • EAL

This blog is from one of COBIS' Supporting Associates.

Written by Dr Helen Wood, Head of School Partnerships, Password

When appointing a Deputy Head Academic or Head of Teaching and Learning, you probably ask about curriculum design, staff development or intervention strategies. You’ll almost certainly include questions about using data to inform planning and improve results.

But how often do you ask this: “What do you think are the key considerations for assessment practices in a school where learners come with highly varied linguistic backgrounds and levels of English proficiency?”

In most schools, the answer is: never.

Why this matters

Across international schools worldwide, multilingual learners now make up around 70 percent of the student population. Yet many assessment systems – from admission screening to baseline cognitive tests – continue to rest on design principles that assume a monolingual norm.

Take a widely used example: the Cambridge Insight suite. Few schools realise that the average linguistic demand of test items within each of its tests, from InCAS through to Alis and the CEM IBE, sits around B2 on the CEFR scale, according to CEM’s own mapping. The CEFR is a globally recognised framework for describing language proficiency; in plain terms, B2 represents an upper-intermediate level. Even so, every year, learners whose English proficiency is at A1 or A2 (beginner to elementary) are asked to complete such tests at the start of their time at a school. When those students underperform, the risk is that their academic potential is significantly underestimated, when in truth it is the language of the test that is beyond their reach.

There are no simple fixes for this. As Dr Eowyn Crisfield, Director of the Oxford Collaborative for Multilingualism in Education, warns, extracting verbal reasoning scores is only partially effective. “If a learner’s English is limited, no amount of mitigation is going to erase that disadvantage.”

For leaders responsible for data and the quality of teaching and learning, that statement should stop us in our tracks. Are we confident that the data we rely on to make high-stakes decisions truly reflects ability, or simply language proficiency? Are we adopting familiar benchmarking tests simply because of weaknesses in assessment literacy within our leadership teams?

Schools must scrutinise how the linguistic load of their assessments influences learner outcomes. Without this awareness, the risk is misclassifying learners, overestimating gaps and designing interventions that address the wrong problems – a classic example being the misidentification of multilingual learners as having SEND needs. Avoiding these hazards hinges on appointing school leaders with the assessment literacy skills necessary to evaluate both the tools being used and the results they produce.

The hidden challenge

Wide-scale longitudinal studies in the UK by Professor Steve Strand and colleagues have shown that English proficiency is the strongest predictor of academic attainment for learners with EAL – with new-to-English learners taking at least six years of English-medium instruction to reach full academic linguistic fluency.

During that journey, standardised scores and age-normed data can be dangerously misleading. This leads Crisfield to assert: “If we know the data is not a true reflection of these learners’ ability, the crucial question becomes, how can we interpret it to inform good decision-making?” This requires careful triangulation of multiple data sources.

Crisfield suggests schools should gather an accurate measure of academic English proficiency. “Not a general EFL test,” she clarifies, “but one specifically designed to assess curriculum access language, such as Password.” Tools of this kind give schools a clearer baseline, ensuring that what is being measured is access to learning, not simply command of everyday social English.

Your Head of EAL/Multilingualism’s professional expertise will be essential in making the right assessment choices. Holders of these roles have not always been given that degree of influence, but the landscape has changed: effective schools now place the expertise of their EAL/Multilingual team at the centre of such strategic decision-making, knowing that multilingual learners need strong advocates whose leadership and professional credibility give voice to their needs.

What this means for recruitment

Assessment literacy in international schools has become a defining leadership skill, shaping how schools design fair admissions processes, interpret data from baseline and ongoing assessment with accuracy, and plan interventions that genuinely meet learner need. Your Head of EAL/Multilingualism, your data-assessment lead and your Head of Teaching and Learning all need to ensure that assessment choices and interpretations are grounded in a thorough shared understanding of language development.

So, at your next academic leadership interview, consider asking: how well do candidates understand the relationship between language proficiency and attainment? How confident are they that widely used assessments are currently fair for multilingual learners? What steps will they take to help staff interpret data through a language-aware lens?

These questions matter. The answers will tell you whether your next appointment has the expertise to serve all your learners well.

If your school is reflecting on how language shapes assessment and learner outcomes, explore how Password’s academic English proficiency tests support fairer baselines for multilingual learners: www.englishlanguagetesting.co.uk

About the author:
Dr Helen Wood is a former EAL lead and SLT member. She works with international and British-curriculum schools to strengthen assessment literacy and language-aware practice: helen.wood@englishlanguagetesting.co.uk