Dr. Sarah Richardson, The verification of technical terminology - the case of engineering
25.11.2021

By Dr Sarah Richardson, Deputy CEO and Research Director of the Australian Council for Educational Research (ACER) UK

Between 2010 and 2012 the OECD implemented the Assessment of Higher Education Learning Outcomes (AHELO) Feasibility Study. The focus was on measuring the learning outcomes of final-year bachelor-degree (or equivalent) students in the domains of civil engineering, economics and generic skills. Overall project implementation of the first two strands was led by the Australian Council for Educational Research (ACER), with language equivalency quality assurance work implemented by cApStAn. This project posed an interesting set of challenges in relation to language equivalency between different versions of assessment tools, and required innovative solutions.

The engineering strand provided some of the most interesting challenges and is used here as an example of how language equivalency can be assured when technical terms are involved. Assessment items were developed by both the National Institute for Educational Policy Research (NIER) in Japan and ACER. These comprised multiple-choice items to assess civil engineering knowledge and constructed-response items to assess civil engineering application and reasoning, several of which utilised diagrams from real-world engineering contexts as their stimulus material.

Nine education systems implemented the civil engineering strand – Abu Dhabi in the United Arab Emirates; Australia; Colombia; Egypt; Japan; Mexico; the Ontario Province of Canada; Russia; and the Slovak Republic. All assessment materials were delivered online and it was essential that none of the versions would be any easier or more difficult than any other. As this list of education systems suggests, the range of languages—and indeed alphabets—was broad. Hence, minute verification work was required to ensure that translation and adaptation processes led to versions that were linguistically equivalent. As the AHELO Translation and Adaptation Guidelines stated:

… Each item should examine the same skills and invoke the same cognitive processes as the original version, while being culturally appropriate within the target country.

The gold standard to ensure linguistic equivalency involves a double translation design (in which translators work separately), a third translator reconciling the two translations and then an in-depth verification process undertaken by linguistic experts in which every element of the reconciled translation is checked for its accuracy and appropriateness within a particular education system.

In the case of AHELO, however, the level of technical specificity of terminology utilised in assessment items for those in the final year of their bachelor degrees in civil engineering went beyond that which a linguist would be expected to have command of. To give an indication of the kind of technical terminology that was used, sample civil engineering items included in Volume 1 of the AHELO Feasibility Report[1] include terms such as ‘arch-gravity’, ‘spillway’, ‘Warren truss’, ‘manometer’, ‘fracture plane’ and ‘Mohr-Coulomb’s failure criterion line’.

The solution to this challenge was to supplement the standard verification process—in which the translations submitted by National Centres were verified by linguists appointed and quality assured by cApStAn—with a parallel review by a domain expert from each education system.

Domain experts were university lecturers in the domain who were able to demonstrate a strong grasp of the up-to-date technical terminology commonly used within their education system, and hence with which students would be familiar. For the English-language versions used in Australia and Ontario, a local domain expert also verified the terminology used in the national version of the assessment items.

Since this was the first time that both domain and linguistic verification had taken place, several models were trialled, each of which was shown to have strengths and weaknesses. These included both parallel and consecutive processes, with all verifiers and reviewers working remotely.

Since all instruments were delivered online, an additional quality assurance step was an optical check to ensure that the agreed terminology was displayed correctly. A particular challenge involved labels used in diagrams: the number of characters in some versions took up much more space than in others, requiring diagrams to be edited to accommodate them.

Ultimately, the AHELO assessment and contextual instruments were successfully delivered to more than 23,000 students in 17 education systems. A combination of data analysis and detailed feedback from participating education systems led to the conclusion that the AHELO Feasibility Study had successfully “demonstrated that it is feasible to develop instruments with reliable and valid results across different countries, languages, cultures and institutional settings”. As in any feasibility study, much of the value lay in the lessons learned along the way, and the use of domain experts as part of the verification process was certainly one of these valuable lessons.

More information on the AHELO Feasibility Study can be found at http://www.oecd.org/education/skills-beyond-school/ahelo-main-study.htm and in Richardson, S. & Coates, H. (2014). Essential foundations for establishing equivalence in cross-national higher education assessment. Higher Education, 68: 825–836, DOI 10.1007/s10734-014-9746-9.


[1] Tremblay, K., LaLancette, D. & Roseveare, D. (2012). Assessment of Higher Education Learning Outcomes (AHELO) Feasibility Report. Volume 1: Design and Implementation. Paris: OECD.

[2] Tremblay, K., LaLancette, D. & Roseveare, D. (2012). Assessment of Higher Education Learning Outcomes (AHELO) Feasibility Report. Volume 2: Data Analysis and National Experiences. Paris: OECD.

About Dr Sarah Richardson

Dr Sarah Richardson is a globally recognised education expert with 26 years’ experience in education. She is currently the Deputy CEO and Research Director of the Australian Council for Educational Research (ACER) UK, where she leads a wide range of educational research projects across multiple educational levels and themes. Dr Richardson has a global outlook, having lived in Australia, India, the Netherlands and the UK, and having undertaken research projects all around the world. Dr Richardson regularly gives conference presentations, and her most recent book, ‘Cosmopolitan Learning for a Global Era: University Education in an Interconnected World’, was published by Routledge in 2016.

About ACER

ACER is one of the world’s leading educational research centres. Its goal is to support learners, learning professionals, learning institutions and the development of a learning society through its work. ACER has built a strong reputation as a provider of reliable support and expertise to education policymakers and professional practitioners since it was established in 1930. As an independent non-government organisation, ACER generates its entire income through contracted research and development projects, and through developing and distributing products and services, with its operating surplus directed back into research and development.

About ACER UK

ACER UK works closely with governments, schools and universities in the UK, international aid agencies, and research and not-for-profit organisations such as RAND, the UK Department for International Development and the Scottish Government. It works to advance educational research and to progress educational development in every project it takes on.