Abstract

This study focused on three kinds of item analysis of achievement test items: item facility, item discrimination, and distractor efficiency. Validity and reliability analyses were also conducted as supporting measures. The study used quantitative data as the main data source, with qualitative explanation to elaborate on the findings. For the analysis, test papers and students' answer sheets were collected from the achievement tests of three schools, SHS X, SHS Y, and SHS Z, and the first-grade students of those schools served as the sample. The study revealed that (1) the mean item facility of the three achievement tests fell in the medium category (SHS X = 0.69; SHS Y = 0.55; SHS Z = 0.44), while the mean item discrimination of the SHS X test was categorized as good (0.326) and the mean item discrimination of the SHS Y and SHS Z tests as satisfactory (SHS Y = 0.245; SHS Z = 0.244). Moreover, about half of the distractors in those tests were found to be efficient. The validity and reliability of the achievement tests were also reported. It can therefore be concluded that the achievement tests need to be improved, since some items have high item facility and low item discrimination.
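For readers unfamiliar with the indices mentioned above, the conventional classical-test-theory formulations are sketched below; these are standard textbook definitions rather than formulas quoted from the article itself, and the symbols R, N, R_U, R_L, and n_g are introduced here only for illustration:

IF = \frac{R}{N}, \qquad ID = \frac{R_U - R_L}{n_g}

where R is the number of examinees answering an item correctly, N is the total number of examinees, R_U and R_L are the numbers of correct answers in the upper and lower scoring groups, and n_g is the number of examinees in each group. A distractor is commonly regarded as efficient when it is selected by a non-negligible share of examinees (a threshold of roughly 5% is often used).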

Keywords

items analysis; achievement test; senior high school

Article Details

How to Cite
Nurbaeti, A. P., Taufiqulloh, T., & Sulistyawati, A. E. (2019). Items Analysis of The Achievement Tests in EFL Classrooms. English Focus: Journal of English Language Education, 3(1), 32-42. https://doi.org/10.24905/efj.v3i1.69
