Simulated Computer Adaptive Testing Method Choices for Ability Estimation with Empirical Evidence

Jumoke I. Oladele, Mdutshekelwa Ndlovu, Erica D. Spangenberg


Computer Adaptive Testing (CAT) is a technological advancement for educational assessment that requires thorough feasibility studies, typically through computer simulation, to establish strong testing foundations. This advancement is especially germane in Africa, where technology adoption should be grounded in empirical evidence rather than undertaken blindly. A quasi-experimental design was adopted for this study to establish methodological choices for CAT ability estimation. Five thousand candidates and 100 items were simulated under the three-parameter logistic (3PL) model. The simulation design stipulated a fixed-length test of thirty (30) items, with examinee abilities drawn from a normal distribution with a mean of 0 and a standard deviation of 1. Simulation controls were set either to leave item exposure uncontrolled or to use the progressive-restricted method. Data gathered were analysed using descriptive statistics (mean and standard deviation) and inferential statistics (two-way multivariate analysis of variance: MANOVA) to test the generated hypotheses. This study provides empirical evidence for choosing methods of ability estimation for CAT as part of efforts geared towards designing accurate testing programmes for use in higher education.
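The simulation design described above can be sketched in code. The following is a minimal illustrative sketch, not the study's actual implementation: it generates a 100-item 3PL bank, draws a true ability from N(0, 1), and administers a fixed-length 30-item CAT with no item-exposure control. Maximum-information item selection and expected a posteriori (EAP) ability estimation are assumptions chosen for illustration; the item-parameter distributions are likewise hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

def p3pl(theta, a, b, c):
    """3PL probability of a correct response at ability theta."""
    return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b, c):
    """Fisher information of a 3PL item at ability theta."""
    p = p3pl(theta, a, b, c)
    return (a ** 2) * ((1.0 - p) / p) * ((p - c) / (1.0 - c)) ** 2

def eap_estimate(responses, a, b, c):
    """EAP ability estimate over a grid, with a N(0, 1) prior."""
    grid = np.linspace(-4.0, 4.0, 81)
    post = np.exp(-0.5 * grid ** 2)          # N(0, 1) prior density (unnormalised)
    for r, ai, bi, ci in zip(responses, a, b, c):
        p = p3pl(grid, ai, bi, ci)
        post *= p if r else (1.0 - p)        # multiply in each item's likelihood
    return float(np.sum(grid * post) / np.sum(post))

def simulate_cat(theta_true, a, b, c, test_len=30):
    """Fixed-length CAT: maximum-information selection, EAP scoring."""
    administered, responses = [], []
    theta_hat = 0.0                          # start at the prior mean
    for _ in range(test_len):
        info = item_information(theta_hat, a, b, c)
        info[administered] = -np.inf         # never re-administer an item
        j = int(np.argmax(info))             # most informative remaining item
        administered.append(j)
        responses.append(bool(rng.random() < p3pl(theta_true, a[j], b[j], c[j])))
        theta_hat = eap_estimate(responses,
                                 a[administered], b[administered], c[administered])
    return theta_hat, administered

# Hypothetical 100-item bank: discrimination, difficulty, pseudo-guessing.
bank_a = rng.uniform(0.7, 2.0, 100)
bank_b = rng.normal(0.0, 1.0, 100)
bank_c = rng.uniform(0.0, 0.25, 100)

est, items = simulate_cat(theta_true=0.0, a=bank_a, b=bank_b, c=bank_c)
print(f"true theta 0.0 -> EAP estimate {est:.2f} after {len(items)} items")
```

In a full study, this loop would be repeated for each of the 5,000 simulated candidates and the resulting ability estimates compared across estimation methods, which is the comparison the MANOVA in the abstract addresses.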


CAT; methods; simulation; ability estimation; Item Response Theory






Copyright (c) 2022 Institute of Advanced Engineering and Science

International Journal of Evaluation and Research in Education (IJERE)
p-ISSN: 2252-8822, e-ISSN: 2620-5440


This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.