14.06.2021

Considerations for Designing Accessible Digital Assessments for Diverse Populations

by Dr. Maria Elena Oliveri

I would like to start by thanking Steve Dept for the invitation to write on the assessment and interpretation of data from digital assessments administered to culturally and linguistically diverse populations. This topic is ever more relevant as the populations participating in assessments designed for learning, placement, or summative decisions become increasingly diverse (Oliveri, Mislevy, & Elliot, 2020), and as the use of digital assessments to assess complex constructs rises (Mislevy & Oliveri, 2019).

Given the brief, digital nature of this article, I will provide only a high-level description of my perspective on current and forward-looking directions related to the complexities of designing assessments for culturally and linguistically diverse populations.

Several initiatives, frameworks, and guidelines have been written to guide test development practices. Examples include the development of:

(a) guidelines for assessing culturally and linguistically diverse populations (e.g., the International Test Commission [ITC] guidelines, 2018),

(b) sociocognitive frameworks designed to augment traditional approaches to conceptualize and operationalize fairness in assessments (Mislevy, 2018), and

(c) natural language processing tools for use in the design of digital assessments that take into account the linguistic needs of diverse populations (Oliveri, 2019).

Complementing a sociocognitive approach to assessment development are conceptual frameworks such as Universal Design for Learning, which supports, from the outset, the development of assessments that are accessible to the widest range of students. Kettler, Elliott, Beddow, and Kurz (2018) define accessibility as the extent to which a product (i.e., a test) eliminates barriers and permits equal use of its components or services by diverse populations.

These developments allow us to attend to key elements of task design, construct representation, and the resources and knowledge that culturally and linguistically diverse populations bring to the assessment situation (O’Sullivan & Weir, 2011).

They also guide decisions about the type of language, visuals, and scenarios used in traditional and next-generation assessments, so as to minimize construct-irrelevant variance and support the fair and valid assessment of diverse populations. For instance, they help test developers construct items that are accessible to populations such as English Language Learners.

They also specify the types of linguistic modifications to consider, such as using simpler vocabulary when linguistically complex terms are not part of the assessed construct, or reducing unnecessary wordiness in items; a rough sketch of how such checks might be automated appears below.
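As a concrete illustration, the short Python sketch below flags wordy sentences and long words in an item stem. It is a minimal stand-in for the kind of screening a tool like TextEvaluator performs; the thresholds and heuristics here are my own simplifying assumptions, not features of TextEvaluator or of the research described in this article.

```python
import re

# Illustrative thresholds -- assumptions for this sketch, not values
# taken from TextEvaluator or from the research described above.
MAX_WORDS_PER_SENTENCE = 15   # beyond this, flag the sentence as wordy
LONG_WORD_CHARS = 10          # crude proxy for vocabulary difficulty

def flag_item_text(item_text: str) -> list[str]:
    """Return human-readable flags for potentially inaccessible language."""
    flags = []
    # Split on sentence-ending punctuation; a real tool would parse properly.
    sentences = [s.strip() for s in re.split(r"[.!?]+", item_text) if s.strip()]
    for sentence in sentences:
        words = sentence.split()
        if len(words) > MAX_WORDS_PER_SENTENCE:
            flags.append(f"Wordy sentence ({len(words)} words): {sentence[:50]}...")
        for word in words:
            bare = word.strip(",;:'\"()")
            if len(bare) >= LONG_WORD_CHARS:
                flags.append(f"Potentially difficult word: {bare!r}")
    return flags

if __name__ == "__main__":
    stem = ("Notwithstanding the precipitation, the investigators "
            "subsequently endeavored to ascertain the approximate "
            "quantity of specimens collected during the expedition.")
    for flag in flag_item_text(stem):
        print(flag)
```

Flags like these would then go to subject matter experts, who decide whether the flagged complexity is part of the construct being measured or simply noise to be edited out.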

Oliveri (2019) demonstrates the use of natural language processing tools such as TextEvaluator (Sheehan, Kostin, Napolitano, & Flor, 2014) to supplement or augment the work carried out by subject matter experts. Alongside experts’ judgments, these tools can be used to inform which language to modify or scaffold. Both approaches can be used at the design stage of digital scenario-based tasks. To elaborate, Oliveri describes the work a multidisciplinary team of expert collaborators carried out when designing the Green Islands prototype.

The Green Islands is a third-grade scenario-based English Language Arts assessment contextualized in science. Following the principles of universal design for learning, this work involved the use of natural language processing tools and experts’ judgments to modify and scaffold complex language in the prototype. The work was conducted at the design stage to avoid unneeded retrofitting after development, which I argue is central to supporting accessibility from the outset.

Developing accessible assessments for multiple populations is important because the use of construct-irrelevant language in assessments may disadvantage some learners. Students may have a more difficult time understanding the questions, may require additional time to do so, or may face additional difficulties in building mental representations of the materials presented in the assessments (Sheehan et al., 2014). These issues may have unintended consequences: students may misinterpret the language presented in the tests because of construct-irrelevant factors (e.g., inadequate exposure to the language used in mainstream schools).

Steps may thus need to be implemented to modify less accessible construct-irrelevant text and to scaffold less accessible construct-relevant text; a minimal sketch of this routing logic follows. These issues become ever more central in digitally delivered assessments, which make greater use of interactive tasks and incorporate more visuals and avatar-based representations.
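To make the modify-versus-scaffold distinction concrete, here is a small Python sketch of how flagged text spans might be routed at the design stage. The FlaggedSpan structure and its field names are hypothetical, invented for this illustration; in practice, construct relevance would come from subject matter experts’ judgments, and complexity flags from an NLP screening pass such as the one sketched earlier.

```python
from dataclasses import dataclass

@dataclass
class FlaggedSpan:
    """A span of task text screened for linguistic complexity.

    Hypothetical structure for this sketch: `construct_relevant`
    would reflect subject matter experts' judgments, and
    `complexity_flag` the output of an NLP screening pass.
    """
    text: str
    construct_relevant: bool
    complexity_flag: bool

def design_stage_action(span: FlaggedSpan) -> str:
    """Route each span per the principle described above: modify
    construct-irrelevant complex text, scaffold construct-relevant
    complex text, and otherwise leave the text as is."""
    if not span.complexity_flag:
        return "keep as is"
    if span.construct_relevant:
        return "scaffold (e.g., add a gloss or visual support)"
    return "modify (e.g., simplify vocabulary, trim wordiness)"

if __name__ == "__main__":
    spans = [
        FlaggedSpan("photosynthesis", construct_relevant=True, complexity_flag=True),
        FlaggedSpan("notwithstanding the aforementioned", construct_relevant=False, complexity_flag=True),
        FlaggedSpan("the green island", construct_relevant=True, complexity_flag=False),
    ]
    for span in spans:
        print(f"{span.text!r}: {design_stage_action(span)}")
```

The design point worth noting is that the routing decision hinges on construct relevance, which no automated metric can supply on its own; this is one reason to involve multidisciplinary teams, as argued below.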

Given their complexity, these issues may best be addressed by engaging multidisciplinary teams of experts who understand different aspects of test design, development, and user experience, to name but a few (Oliveri & Mislevy, 2019).


References

International Test Commission. (2018). ITC guidelines for the large-scale assessment of linguistically and culturally diverse populations. Retrieved from https://www.intestcom.org/page/31

Kettler, R. J., Elliott, S. N., Beddow, P. A., & Kurz, A. (2018). Accessible instruction and testing today. In S. N. Elliott, R. J. Kettler, P. A. Beddow, & A. Kurz (Eds.), Handbook of accessible instruction and testing practices (pp. 1-16). Cham: Springer.

Mislevy, R. J. (2018). Sociocognitive foundations of educational measurement. London, UK: Routledge.

Mislevy, R. J., & Oliveri, M. E. (2019). Digital module 09: Sociocognitive assessment for diverse populations (Version 2.0). Available online at https://ncme.elevate.commpartners.com 

Oliveri, M. E. (2019). Evaluating text complexity to improve the design of formative educational scenario-based assessments for culturally and linguistically diverse populations. In M. Asil, K. Ercikan, & J. Gorin (Eds.), Cultural contexts and priorities in assessment [Special issue]. Frontiers in Education.

Oliveri, M. E., & Mislevy, R. (2019). Introduction to the special issue on challenges and opportunities in the design of next-generation assessments of 21st century skills. International Journal of Testing, 19, 97-102. https://doi.org/10.1080/15305058.2019.1608551

Oliveri, M. E., Mislevy, R., & Elliot, N. (2020). New horizons for postsecondary placement and admission practices in the United States. In M. E. Oliveri & C. Wendler (Eds.), Higher education admission practices: An international perspective (pp. 347-375). Cambridge University Press.

O’Sullivan, B., & Weir, C. J. (2011). Test development and validation. In B. O’Sullivan (Ed.), Language testing: Theories and practices (pp. 13-32). Basingstoke: Palgrave Macmillan.

Sheehan, K. M., Kostin, I., Napolitano, D., & Flor, M. (2014). The TextEvaluator tool: Helping teachers and test developers select texts for use in instruction and assessment. The Elementary School Journal, 115(2), 184-209.



About Dr Maria Elena Oliveri

Dr Maria Elena Oliveri has a PhD and a Master’s in Measurement, Evaluation, and Research Methodology and a second Master’s in Clinical Counselling Psychology from the University of British Columbia. Her research focuses on team dynamics and vocational training, validity, and the design of next-generation assessments of collaborative problem solving and workplace communication. She has over 60 publications, including the edited book Higher Education Admission Practices: An International Perspective. She has served in various leadership roles, including associate editor of the International Journal of Testing, chair of the International Test Commission Guidelines on the Fair and Valid Assessment of Linguistically Diverse Populations, and chair of AERA Division D.

Dr Maria Elena Oliveri on LinkedIn
