World Journal of Dentistry


VOLUME 13, ISSUE S1 (Supplementary Issue 1, 2022)

ORIGINAL RESEARCH

Test Item Analysis and its Relevance to a Dental Foundation Course at King Saud University

Khaled M Alqahtani

Keywords : Classroom assessment, Dental courses, Difficulty index, Discrimination index, Distractor analysis, Item analysis


DOI: 10.5005/jp-journals-10015-2141

License: CC BY-NC 4.0

Published Online: 01-10-2022

Copyright Statement: Copyright © 2022; The Author(s).


Abstract

Aim: To conduct an item analysis of a dental foundation course (biostatistics) taught in the first semester of the foundation year at King Saud University (KSU), Saudi Arabia.

Materials and methods: A total of 32 students completed the final examination for a dental foundation course (biostatistics) taught in the first semester of the foundation year at KSU. The exam consisted of 30 test items. Each item was evaluated for its level of difficulty, as measured by the difficulty index (p-value); its power of discrimination, as measured by the discrimination index (DI); and distractor analysis. In the data analysis, test reliability in terms of inter-item consistency, that is, how strongly the test items are interrelated, was determined using the Kuder–Richardson formula.

Results: The average test score was 18, the standard deviation was 6.9, and the standard error of the mean was 2.2. The skewness of the scores was 0.31, indicating that the distribution was positively skewed. The kurtosis was 1.82, indicating that the distribution was approximately normal. No correlation was found between the item DI and the item difficulty level.

Conclusion: The item DI and item difficulty level showed no relationship, indicating that the test items lacked practical, high discriminating power.

Clinical significance: Item analysis is a valuable tool for determining the accuracy and quality of multiple-choice items, and it should be utilized when creating exams and assessments for dental students.
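The three item statistics named above have standard textbook definitions: the difficulty index p is the proportion of examinees answering an item correctly, the discrimination index DI is the difference in p between the top and bottom scoring groups (commonly the upper and lower 27%), and KR-20 is the Kuder–Richardson estimate of inter-item reliability. The sketch below illustrates these computations on a small hypothetical 0/1 response matrix; the matrix values and the 27% cutoff are illustrative assumptions, not data from the study.

```python
import statistics

# Hypothetical response matrix: rows = students, columns = items (1 = correct).
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 0],
    [0, 0, 0, 1],
    [1, 1, 1, 1],
]

n_students = len(responses)
n_items = len(responses[0])

def difficulty_index(item):
    """Difficulty index p: proportion of examinees answering the item correctly."""
    return sum(row[item] for row in responses) / n_students

def discrimination_index(item, fraction=0.27):
    """DI: item p-value in the top group minus that in the bottom group,
    with groups formed from the upper and lower `fraction` of examinees
    ranked by total test score."""
    ranked = sorted(responses, key=sum, reverse=True)
    k = max(1, round(fraction * n_students))
    upper = sum(row[item] for row in ranked[:k]) / k
    lower = sum(row[item] for row in ranked[-k:]) / k
    return upper - lower

def kr20():
    """Kuder-Richardson formula 20: reliability from item p*q values
    and the variance of total scores."""
    totals = [sum(row) for row in responses]
    var_total = statistics.pvariance(totals)
    pq = sum(difficulty_index(i) * (1 - difficulty_index(i))
             for i in range(n_items))
    return (n_items / (n_items - 1)) * (1 - pq / var_total)
```

A common rule of thumb is that items with p between roughly 0.3 and 0.7 and DI above about 0.3 are considered acceptable, which is the kind of screening the study applies to its 30 items.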



© Jaypee Brothers Medical Publishers (P) LTD.