Testing Teacher-Made Assessment Validity by Profiling Demographic Variables

Mohammed Borhandden Musah

Abstract


Purpose - This study investigates the content validity of teacher-made assessments in selected Chinese elementary schools in Malaysia. It also examines teachers' understanding of the table of specification (TOS) in the sampled schools, and whether demographic variables contribute to the validity of teacher-made assessments. Method - A total of 660 questionnaires were randomly distributed to teachers drawn from 21 Chinese elementary schools; 381 completed surveys were returned and analysed. Findings - The results revealed an average level of teacher understanding of the TOS and an average level of teacher-made assessment validity. The k-group MANOVA showed that work experience and age significantly influenced teacher understanding in the sampled schools, but had no significant influence on teacher-made assessment validity. Teacher gender had no significant influence on either teacher understanding or teacher-made assessment validity. Recommendations for further improvement are also offered. Significance - The study provides empirical evidence on teacher-made assessment in Chinese elementary schools, a setting that has yet to be studied in significant depth.
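The k-group MANOVA mentioned above compares the mean vectors of the two dependent constructs (teacher understanding of the TOS and teacher-made assessment validity) across groups defined by a demographic variable such as age. As an illustration only, the sketch below shows how such a one-way MANOVA could be run in Python with statsmodels; the column names and data are hypothetical and are not drawn from the study's dataset.

```python
# Illustrative sketch only: a one-way (k-group) MANOVA testing whether a
# demographic factor (here, a hypothetical age grouping) is associated with
# two composite scores. Data and column names are invented for illustration.
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Hypothetical survey data: one row per teacher
df = pd.DataFrame({
    "age_group": ["<30", "30-39", "40-49", "<30", "30-39", "40-49",
                  "<30", "30-39", "40-49", "<30", "30-39", "40-49"],
    "tos_understanding": [3.2, 3.8, 4.1, 3.0, 3.9, 4.3,
                          3.4, 3.7, 4.0, 3.1, 3.6, 4.2],
    "assessment_validity": [3.5, 3.6, 3.9, 3.3, 3.7, 4.0,
                            3.6, 3.5, 3.8, 3.4, 3.6, 3.9],
})

# Both dependent variables are modelled jointly against the grouping factor.
manova = MANOVA.from_formula(
    "tos_understanding + assessment_validity ~ age_group", data=df
)
print(manova.mv_test())  # Wilks' lambda, Pillai's trace, etc. for each effect
```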

Keywords


Teacher-made assessment, Demographic variable, Table of specification, Subject-matter expert, Elementary school




DOI: http://doi.org/10.11591/ijere.v11i2.21992



Copyright (c) 2021 Institute of Advanced Engineering and Science

International Journal of Evaluation and Research in Education (IJERE)
p-ISSN: 2252-8822, e-ISSN: 2620-5440


This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.