A Computerized Adaptive Test for Measuring the Physics Critical Thinking Skills in High School Students

Aang Zainul Abidin, Edi Istiyono, Nunung Fadilah, Wipsar Sunu Brams Dwandaru

Abstract


Classical assessments that are not comprehensive and do not account for differences in students' initial abilities yield measurement results that are far from students' actual abilities. This study was conducted to produce a computerized adaptive test for physics critical thinking skills (CAT-PhysCriTS) that meets the feasibility criteria. The test was prepared in a two-tier multiple-choice format for the physics subject of 11th-grade high school students. This development research was based on the 4-D model combined with the test development model of Oriondo & Antonio. Eleven experts and 577 11th-grade high school students in Kulonprogo, Indonesia, participated. The feasibility of the CAT media and the content validity of the items were assessed by experts, while item and ability parameters were estimated using item response theory. The results show that: 1) the CAT media was declared very feasible and the 136 items were declared content-valid; 2) all items fit the partial credit model, item reliability was classified as good, and the item difficulty indices were good; and 3) the CAT-PhysCriTS results were equivalent to students' academic achievement. Based on these results, CAT-PhysCriTS fulfills the requirements of a measuring instrument, providing faster and more comprehensive measurement for large-scale assessments.
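As a reading aid only, and not the authors' implementation, the sketch below shows one common way the adaptive loop summarized in the abstract can be assembled: items calibrated under the partial credit model (PCM), the next item chosen by maximum Fisher information at the current ability estimate, and the ability re-estimated after each polytomous response with a simple grid-search maximum-likelihood step. The item bank, step-difficulty values, and function names are illustrative assumptions.

# Minimal sketch of a PCM-based CAT loop (illustrative values, not the
# calibrated CAT-PhysCriTS bank).
import numpy as np

def pcm_probs(theta, deltas):
    # P(X = 0..m | theta) under the PCM; `deltas` are the step difficulties.
    cum = np.concatenate(([0.0], np.cumsum(theta - np.asarray(deltas))))
    expc = np.exp(cum - cum.max())  # subtract max for numerical stability
    return expc / expc.sum()

def pcm_information(theta, deltas):
    # Fisher information of a PCM item at theta = variance of the item score.
    p = pcm_probs(theta, deltas)
    x = np.arange(len(p))
    return float(np.sum(x**2 * p) - np.sum(x * p) ** 2)

def mle_theta(scores, items, bank, grid=np.linspace(-4.0, 4.0, 161)):
    # Grid-search maximum-likelihood ability estimate from the scored responses.
    loglik = np.array([
        sum(np.log(pcm_probs(th, bank[i])[s]) for s, i in zip(scores, items))
        for th in grid
    ])
    return float(grid[np.argmax(loglik)])

def run_cat(bank, answer_fn, max_items=10, theta0=0.0):
    # Administer the most informative unused item, score it, update theta.
    theta, used, scores = theta0, [], []
    for _ in range(max_items):
        remaining = [i for i in bank if i not in used]
        if not remaining:
            break
        nxt = max(remaining, key=lambda i: pcm_information(theta, bank[i]))
        used.append(nxt)
        scores.append(answer_fn(nxt))  # polytomous score, e.g. 0-3
        theta = mle_theta(scores, used, bank)
    return theta, used, scores

# Tiny illustrative bank: item id -> PCM step difficulties (logits).
bank = {"Q1": [-1.0, 0.0, 1.0], "Q2": [-0.5, 0.5, 1.5], "Q3": [0.0, 1.0, 2.0]}
theta_hat, order, scores = run_cat(bank, lambda item: 2, max_items=3)
print(f"theta = {theta_hat:.2f}, order = {order}, scores = {scores}")

In the reported instrument the calibrated bank would hold the 136 validated items, and a production CAT typically stops on a standard-error or item-count criterion rather than the fixed count used in this toy example.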

Keywords


Assessment of physics learning; Critical thinking skills; Adaptive test; Two-tier multiple-choice; Item response theory

References


S. E. Saleh, “Critical Thinking as a 21st Century Skill: Conceptions, Implementation and Challenges in the EFL Classroom,” Eur. J. Foreign Lang. Teach., vol. 4, no. 1, pp. 1–16, 2019.

Mundilarto and H. Ismoyo, “Effect of Problem-Based Learning on Improvement Physics Achievement and Critical Thinking of Senior High School Student,” J. Balt. Sci. Educ., vol. 16, no. 5, pp. 761–779, 2017.

R. H. Ennis, “Critical Thinking Assessment,” Theory Pract., vol. 32, no. 3, pp. 179–186, 1993.

R. W. Elliott, “Understanding Faculty Engagement in Assessment through Feedback and Dialogues: A Mixed Methods Approach,” Int. J. Eval. Res. Educ., vol. 7, no. 3, pp. 167–175, 2018.

F. S. Putri and E. Istiyono, “The Development of Performance Assessment of STEM-Based Critical Thinking Skill in the High School Physics Lessons,” Int. J. Environ. Sci. Educ., vol. 12, no. 5, pp. 1269–1281, 2017.

R. S. Damayanti, A. Suyatna, Warsono, and U. Rosidin, “Development of Authentic Assessment Instruments for Critical Thinking Skills in Global Warming with a Scientific Approach,” Int. J. Sci. Appl. Sci. Conf. Ser., vol. 2, no. 1, pp. 289–299, 2017.

G. N. Khan, N. Ishrat, and A. Q. Khan, “Using Item Analysis on Essay Types Questions Given in Summative Examination of Medical College Students: Facility Value, Discrimination Index,” Int. J. Res. Med. Sci., vol. 3, no. 1, pp. 178–182, 2015.

K. A. Shaaban, “Assessment of Critical Thinking Skills Through Reading Comprehension,” Int. J. Lang. Stud., vol. 8, no. 2, pp. 117–140, 2014.

M. Baig, S. K. Ali, S. Ali, and N. Huda, “Evaluation of Multiple Choice and Short Essay Question items in Basic Medical Sciences,” Pakistan J. Med. Sci., vol. 30, no. 1, pp. 3–6, 2014.

D. D. Kerkman and A. T. Johnson, “Challenging Multiple-Choice Questions to Engage Critical Thinking,” InSight A J. Sch. Teach., vol. 9, pp. 92–97, 2014.

E. Istiyono, D. Mardapi, and Suparno, “Pengembangan Tes Kemampuan Berpikir Tingkat Tinggi Fisika (PysTHOTS) Peserta Didik SMA [Developing a Physics Higher-Order Thinking Skills Test (PysTHOTS) for High School Students],” J. Penelit. dan Eval. Pendidik., vol. 18, no. 1, pp. 1–12, 2014.

Winarti, Cari, Suparmi, W. Sunarno, and E. Istiyono, “Development of Two Tier Test to Assess Conceptual Understanding in Heat and Temperature,” in Journal of Physics: Conference Series, vol. 795, 2017, p. 012052.

F. S. Putri, E. Istiyono, and E. Nurcahyanto, “Pengembangan Instrumen Tes Keterampilan Berpikir Kritis dalam Bentuk Pilihan Ganda Beralasan (Politomus) di DIY [Developing a Critical Thinking Skills Test Instrument in Reasoned (Polytomous) Multiple-Choice Form in DIY],” Unnes Phys. Educ. J., vol. 5, no. 2, pp. 76–84, 2016.

Fajrianthi, W. Hendriani, and B. G. Septarini, “Pengembangan Tes Berpikir Kritis dengan Pendekatan Item Response Theory [Developing a Critical Thinking Test Using an Item Response Theory Approach],” J. Penelit. dan Eval. Pendidik., vol. 20, no. 1, pp. 45–55, 2016.

E. Istiyono, W. B. Dwandaru, and F. Rahayu, “The Developing of Creative Thinking Skills Test Based on Modern Test Theory in Physics of Senior High Schools,” Cakrawala Pendidik., vol. 37, no. 2, pp. 190–200, 2018.

A. A. Bichi and R. Talib, “Item Response Theory: An Introduction to Latent Trait Models to Test and Item Development,” Int. J. Eval. Res. Educ., vol. 7, no. 2, pp. 142–151, 2018.

O. A. Awopeju and E. R. I. Afolabi, “Comparative Analysis of Classical Test Theory and Item Response Theory Based Item Parameter Estimates of Senior School Certificate Mathematics Examination,” Eur. Sci. J., vol. 12, no. 28, pp. 263–284, 2016.

R. K. Hambleton and H. Swaminathan, Fundamentals of Item Response Theory. Newbury Park, CA: Sage Publications, 1991.

H. C. Bagus, “The National Exam Administration by Using Computerized Adaptive Testing (CAT) Model,” J. Pendidik. dan Kebud., vol. 18, no. 1, pp. 45–53, 2012.

G. Ling, Y. Attali, B. Finn, and E. A. Stone, “Is a Computerized Adaptive Test More Motivating Than a Fixed-Item Test?,” Appl. Psychol. Meas., vol. 41, no. 7, pp. 495–511, 2017.

E. C. Aybek and R. N. Demirtasli, “Computerized Adaptive Test (CAT) Applications and Item Response Theory Models for Polytomous Items,” Int. J. Res. Educ. Sci., vol. 3, no. 2, pp. 475–487, 2017.

M. Rezaie and M. Golshan, “Computer Adaptive Test (CAT): Advantages and Limitations,” Int. J. Educ. Investig., vol. 2, no. 5, pp. 128–137, 2015.

H. M. Wu, B. C. Kuo, and S. C. Wang, “Computerized Dynamic Adaptive Tests with Immediately Individualized Feedback for Primary School Mathematics Learning,” Educ. Technol. Soc., vol. 20, no. 1, pp. 61–72, 2017.

E. Istiyono, W. S. B. Dwandaru, and R. Faizah, “Mapping of Physics Problem-Solving Skills of Senior High School Students Using PhysProSS-CAT,” Res. Eval. Educ., vol. 4, no. 2, pp. 144–154, 2018.

S. Thiagarajan, D. S. Semmel, and M. I. Semmel, Instructional Development for Training Teachers of Exceptional Children. Bloomington, IN: Indiana University, 1974.

L. L. Oriondo and E. M. D. Antonio, Evaluating Educational Outcomes (Tests, Measurement and Evaluation). Quezon City: Rex Printing Company, Inc., 1998.

E. Muraki and R. D. Bock, PARSCALE: IRT Item Analysis and Test Scoring for Rating Scale Data. Chicago: Scientific Software International, Inc., 1998.

S. E. P. Widiyoko, Teknik Penyusunan Instrumen Penelitian [Techniques for Constructing Research Instruments]. Yogyakarta: Pustaka Pelajar, 2013.

L. R. Aiken, “Content Validity and Reliability of Single Items or Questionnaires,” Educ. Psychol. Meas., vol. 40, no. 4, pp. 955–959, 1980.

S. Krishnan and N. Idris, “Using PCM to Improve the Quality of an Instrument,” Int. J. Eval. Res. Educ., vol. 7, no. 4, pp. 313–316, 2018.

E. Istiyono, “The Analysis of Senior High School Students’ Physics HOTS in Bantul District Measured Using PhysReMChoTHOTS,” in AIP Conference Proceedings, vol. 1868, 2017, pp. 1–7.

R. J. Adams and S. T. Khoo, Quest: The Interactive Test Analysis System, Version 2.1. Victoria: The Australian Council for Educational Research, 1996.

B. Subali, Pengembangan Tes Beserta Penyelidikan Validitas dan Reliabilitas secara Empiris [Test Development with Empirical Investigation of Validity and Reliability]. Yogyakarta: UNY Press, 2016.

B. Sumintono and W. Widhiarso, Aplikasi Pemodelan Rasch pada Assessment Pendidikan [Application of Rasch Modeling in Educational Assessment]. Cimahi: Trim Komunikata, 2015.

K. Hidayati, Budiyono, and Sugiman, “Using Alignment Index and Polytomous Item Response Theory on Statistics Essay Test,” Eurasian J. Educ. Res., vol. 79, pp. 115–132, 2019.

R. K. Hambleton and H. Swaminathan, Item Response Theory. Boston, MA: Kluwer Nijhoff Publishing, 1985.

K. T. Han, “Maximum Likelihood Score Estimation Method With Fences for Short-Length Tests and Computerized Adaptive Tests,” Appl. Psychol. Meas., vol. 40, no. 4, pp. 289–301, 2016.

S. Hadi, Pengembangan Computerized Adaptive Test Berbasis Web [Development of a Web-Based Computerized Adaptive Test]. Yogyakarta: Aswaja Presindo, 2013.

R. H. Ennis, “A Logical Basis for Measuring Critical Thinking Skills,” J. Dep. Superv. Curric. Dev., vol. 43, no. 2, pp. 45–48, 1985.

E. Istiyono, “IT-based HOTS Assessment on Physics Learning as the 21st Century Demand at Senior High Schools: Expectation and Reality,” in AIP Conference Proceedings, 2018, art. no. 020014, pp. 1–6.




DOI: http://doi.org/10.11591/ijere.v8i3.19642


Copyright (c) 2019 Institute of Advanced Engineering and Science

International Journal of Evaluation and Research in Education (IJERE)
p-ISSN: 2252-8822, e-ISSN: 2620-5440


This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.