Validation of a digital tool for diagnosing mathematical proficiency

Putcharee Junpeng, Metta Marwiang, Samruan Chinjunthuk, Prapawadee Suwannatrai, Kanokporn Chanayota, Kissadapan Pongboriboon, Keow Ngang Tang, Mark Wilson

Abstract


This study aimed to validate a digital tool for diagnosing the mathematical proficiency of 1,504 Thai seventh-grade students in the Number and Algebra strand. The researchers employed a multidimensional approach, an extension of the Rasch model, to measure the tool's quality. A design-based research method was adopted to create the diagnostic tool, which consists of four components: a registration system, data input, a processing system, and a diagnostic feedback report. The diagnostic framework comprises 18 tasks: 11 tasks on the mathematical procedures dimension and 7 on the structure of the observed learning outcome (SOLO) dimension. The results revealed evidence of internal-structure validity based on the comparison of model fit and the Wright map. The results also indicated that the reliability evidence and item fit comply with the quality requirements of the digital tool, as shown by the analysis of the standard error of measurement and the infit and outfit statistics of the items. In conclusion, the developed digital tool can diagnose seventh-grade students' multiple mathematical proficiencies with accuracy, consistency, and stability. This implies that the digital tool can provide useful information, particularly for students at intermediate and high mathematical proficiency levels, because the error in estimating proficiency in each dimension was lowest there.
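The abstract refers to infit and outfit item-fit statistics from Rasch analysis. As a hedged illustration only (the study itself used a multidimensional extension of the Rasch model; all data and names below are hypothetical, not the authors' analysis), a minimal sketch of the standard unidimensional dichotomous Rasch infit and outfit mean squares:

```python
import numpy as np

def rasch_fit_statistics(responses, theta, b):
    """Compute infit and outfit mean-square statistics for a
    dichotomous Rasch model.

    responses : (n_persons, n_items) matrix of 0/1 scores
    theta     : (n_persons,) person proficiency estimates (logits)
    b         : (n_items,) item difficulty estimates (logits)
    """
    # Model-expected probability of a correct response per cell
    p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
    variance = p * (1.0 - p)                 # binomial variance per cell
    z_sq = (responses - p) ** 2 / variance   # squared standardized residuals

    # Outfit: unweighted mean of squared standardized residuals per item
    outfit = z_sq.mean(axis=0)
    # Infit: information-weighted version -- sum of squared raw residuals
    # divided by the sum of model variances per item
    infit = ((responses - p) ** 2).sum(axis=0) / variance.sum(axis=0)
    return infit, outfit

# Illustrative simulated data (not from the study)
rng = np.random.default_rng(0)
theta = rng.normal(0.0, 1.0, size=200)
b = np.array([-1.0, 0.0, 1.0])
p_true = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
responses = (rng.random((200, 3)) < p_true).astype(float)

infit, outfit = rasch_fit_statistics(responses, theta, b)
```

For responses that actually follow the model, both statistics hover near 1.0; values far above 1 flag misfitting (noisy) items, and values far below 1 flag overly predictable ones, which is the kind of check the abstract describes.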

Keywords


Construct modeling; Digital tool; Mathematical proficiency; Rasch analysis; Test development





DOI: http://doi.org/10.11591/ijere.v9i3.20503



Copyright (c) 2020 Institute of Advanced Engineering and Science

International Journal of Evaluation and Research in Education (IJERE)
p-ISSN: 2252-8822, e-ISSN: 2620-5440


This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.