
Journal article

Language effects in international testing: the case of PISA 2006 science items

Abstract:
We investigate the extent to which language versions (English, French and Arabic) of the same science test are comparable in terms of item difficulty and demands. We argue that language is an inextricable part of the scientific literacy construct, whether or not the examiner intends it to be. This argument has considerable implications for the methodologies used to establish the equivalence of multiple language versions of the same assessment, including in international assessments where cross-cultural fairness is a concern. We also argue that none of the available statistical or qualitative techniques is capable of teasing out the language variable and neutralising its potential effects on item difficulty and demands. Exploring the use of automated text analysis tools at the quality control stage may help address some of these challenges.
Publication status:
Published
Peer review status:
Peer reviewed

Publisher copy:
10.1080/0969594X.2016.1218323

Authors


Institution:
University of Oxford
Division:
SSD
Department:
Education
Role:
Author

Institution:
University of Oxford
Division:
SSD
Department:
Education
Role:
Author


Publisher:
Taylor and Francis (Routledge)
Journal:
Assessment in Education
Volume:
23
Issue:
4
Pages:
427-455
Publication date:
2016-08-25
Acceptance date:
2016-07-24
DOI:
10.1080/0969594X.2016.1218323
EISSN:
1465-329X
ISSN:
0969-594X


Language:
English
Keywords:
Pubs id:
pubs:635368
UUID:
uuid:fa8b3fcc-ac71-4daf-aa06-1ec407d920eb
Local pid:
pubs:635368
Source identifiers:
635368
Deposit date:
2016-07-25
