Research Article


Examination of Differential Item Functioning in PISA 2018 Mathematics Literacy Test with Different Methods

Year 2023, Volume: 14 Issue: 2, 128 - 153, 30.06.2023
https://doi.org/10.21031/epod.1122857

Abstract

This study aims to determine whether the items of the PISA 2018 Mathematical Literacy test show differential item functioning (DIF) according to gender and parental education level. The sample consisted of 521 students who took part in the administration in Turkey and answered booklets 1 and 7, and the analyses covered the 45 items in these two booklets. The Mantel-Haenszel (MH), Logistic Regression (LR), and Rasch Tree (RT) methods were applied to identify items showing DIF with respect to gender. The analyses revealed that two items in booklet 1 showed DIF in favour of girls and that one item in booklet 7, which is common with booklet 1, also showed DIF; this common item was flagged by all three methods when the MH, LR, and RT analyses were run separately. Thus, an item showing DIF in favour of girls was identified by both the MH and LR methods in booklets 1 and 7. In addition, when the items in booklets 1 and 7 were examined for DIF according to parental education level, one item in booklet 1 was found to be relatively easy for students whose mothers' education was at the high school level or above (high school, university and higher) but difficult for students whose mothers' education was below the high school level.
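
As an illustration of how such an analysis can be set up, the R sketch below runs the three DIF procedures named in the abstract with the difR and psychotree packages (both cited in the references). It is a minimal sketch, not the authors' analysis script: the objects items (a 0/1 scored item response matrix for one booklet) and gender (a two-level factor) are assumed placeholders, and steps such as missing-data handling, purification, and effect-size classification are omitted.

    library(difR)        # Mantel-Haenszel and logistic regression DIF methods
    library(psychotree)  # Rasch tree (Strobl, Kopf, & Zeileis, 2015)

    # Placeholder data: 'items' is a 0/1 item response matrix for one booklet,
    # 'gender' is a factor with levels "girl" and "boy" (assumed, not real data).

    # Mantel-Haenszel DIF with girls as the focal group
    mh <- difMH(Data = items, group = gender, focal.name = "girl")

    # Logistic regression DIF, testing uniform and non-uniform DIF together
    lr <- difLogistic(Data = items, group = gender, focal.name = "girl",
                      type = "both")

    # Rasch tree: recursive partitioning of item parameters over the covariate
    d <- data.frame(gender = gender)
    d$resp <- as.matrix(items)
    rt <- raschtree(resp ~ gender, data = d)

    print(mh)   # items flagged by the MH chi-square statistic
    print(lr)   # items flagged by the likelihood-ratio test
    plot(rt)    # a split in the tree indicates DIF with respect to gender

For the parental education comparison, an education-level factor would take the place of gender; raschtree also accepts several covariates in a single formula, so gender and parental education can be examined jointly.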

References

  • Agresti, A. (2012). Categorical data analysis (Vol. 792). John Wiley & Sons. https://doi.org/10.1002/0471249688
  • Alatlı, B. A., & Şenel, S. (2020). Değişen Madde Fonksiyonunun Belirlenmesinde “difR” R Paketinin Kullanımı: Ortaöğretime Geçiş Sınavı Fen Alt Testi. Ankara Üniversitesi Eğitim Bilimleri Fakültesi Dergisi, 1-37. https://doi.org/10.30964/auebfd.684727
  • Altıntaş, Ö., & Kutlu, Ö. (2019). Investigating Differential Item Functioning of Ankara University Examination for Foreign Students by Recursive Partitioning Analysis in the Rasch Model. International Journal of Assessment Tools in Education, 6(4), 602-616. https://dx.doi.org/10.21449/ijate.554212
  • Arslan, M. (2020). Teog Sınavının Yabancı Dil Alt Testine Ait Maddelerin Yanlılığının İncelenmesi. Yüksek Lisans Tezi, Ankara: Hacettepe Üniversitesi Eğitim Bilimleri Enstitüsü.
  • Asamoah, N. A. B. (2020). Assessing Differential Item Functioning in the Perceived Stress Scale. University of Arkansas. https://scholarworks.uark.edu/etd/3775
  • Ayan, C. (2011). PISA 2009 fen okuryazarlığı alt testinin değişen madde fonksiyonu açısından incelenmesi. Yayınlanmamış Yüksek Lisans Tezi, Ankara: Hacettepe Üniversitesi Sosyal Bilimler Enstitüsü.
  • Başman, M. (2017). Matematik başarısında cinsiyet ve duyuşsal özelliklerin etkileşimine göre Rasch ağacı yöntemi ile değişen madde fonksiyonunun belirlenmesi. Doktora Tezi, Ankara: Ankara Üniversitesi Eğitim Bilimleri Enstitüsü.
  • Camilli, G., & Shepard, L. (1994). Methods for identifying biased test items (Vol. 4). Sage. https://doi.org/10.1177/109821409701800108
  • Cattell, R. B. (1966). The scree test for the number of factors. Multivariate behavioral research, 1(2), 245-276. https://doi.org/10.1207/s15327906mbr0102_10
  • Chen, W. H., & Thissen, D. (1997). Local dependence indexes for item pairs using item response theory. Journal of Educational and Behavioral Statistics, 22(3), 265-289. https://doi.org/10.2307/1165285
  • Crocker, L., & Algina, J. (1986). Introduction to classical and modern test theory. Holt, Rinehart and Winston.
  • Cronbach, L. J. (1990). Essentials of Psychological Testing. Harper Collins Publishers, New York. https://doi.org/10.1002/sce.3730350432
  • DeVellis, R. F. (2017). Ölçek geliştirme kuram ve uygulamalar. (T. Totan, Çev.). Nobel Akademik Yayıncılık. https://doi.org/10.1177/109821409301400212
  • Doğan, N., & Öğretmen, T. (2008). Değişen madde fonksiyonunu belirlemede mantel‐haenszel, ki‐kare ve lojistik regresyon tekniklerinin karşılaştırılması. Eğitim ve Bilim, 33(148), 100-112.
  • Ellis, B. B., & Raju, N. S. (2003). Differential item and test functioning. Jossey-Bass/Wiley.
  • Field, A., Miles, J., & Field, Z. (2012). Discovering statistics using R. Sage publications. https://doi.org/10.1111/insr.12011_21
  • Gök, B., Kelecioğlu, H., & Doğan, N. (2010). Değişen madde fonksiyonunu belirlemede Mantel–Haenszel ve Lojistik Regresyon tekniklerinin karşılaştırılması. Eğitim ve Bilim, 35(156).
  • Hambleton, R. K., Swaminathan, H., & Rogers, H. J. (1991). Fundamentals of item response theory. Sage.
  • İrioğlu, Z., & Ertekin, E. (2011). İlköğretim İkinci Kademe Öğrencilerinin Zihinsel Döndürme Becerilerinin Bazı Değişkenler Açısından İncelenmesi. Journal of Educational and Instructional Studies in the World, 75.
  • Jodoin, M. G., & Gierl, M. J. (2001). Evaluating type I error and power rates using an effect size measure with the logistic regression procedure for DIF detection. Applied measurement in education, 14(4), 329-349. https://doi.org/10.1207/S15324818AME1404_2
  • Kan, A., Sünbül, Ö., & Ömür, S. (2013). 6.-8. sınıf seviye belirleme sınavları alt testlerinin çeşitli yöntemlere göre değişen madde fonksiyonlarının incelenmesi. Mersin Üniversitesi Eğitim Fakültesi Dergisi, 9(2), 207-222. https://doi.org/10.17860/efd.55452
  • Karami, H. R., Gramipour, M., & Minaei, A. (2021). Application of The Rasch Tree Model In The Detection Of Differential Item Functioning (Case Study: Recruitment Exams Of The Police Of The Islamic Republic Of Iran). https://doi.org/10.22054/jem.2021.61694.2190
  • Karasar, N. (2017). Bilimsel araştırma yöntemi (2. yazım, 32. Basım). Nobel Yayın Dağıtım.
  • Karip, E., & Köksal, K. (1996). Etkili eğitim sistemlerinin geliştirilmesi. Kuram ve Uygulamada Egitim Yönetimi Dergisi, 2(2), 245-257.
  • Koğar, E. Y., & Koğar, H. (2019). Investigation of scientific literacy according to different item types: PISA 2015 Turkey sample. Abant İzzet Baysal Üniversitesi Eğitim Fakültesi Dergisi, 19(2), 695-709. https://doi.org/10.17240/aibuefd.2019.19.46660-467271
  • Liu, M. (2017). Differential Item Functioning in Large-scale Mathematics Assessments: Comparing the Capabilities of the Rasch Trees Model to Traditional Approaches (Doctoral dissertation, University of Toledo).
  • Lord, F. M. (1980). Applications of item response theory to practical testing problems. Routledge. https://doi.org/10.4324/9780203056615
  • Millsap, R. E., & Everson, H. T. (1993). Methodology review: Statistical approaches for assessing measurement bias. Applied psychological measurement, 17(4), 297-334. https://doi.org/10.1177/014662169301700401
  • Nunnally, J. C. (1978). Psychometric theory. New York: McGraw Hill. https://doi.org/10.1177/014662169501900308
  • Osterlind, S. J., & Everson, H. T. (2009). Differential item functioning (Vol. 161). Sage Publications. https://doi.org/10.4135/9781412993913
  • Ozarkan, H. B., Kucam, E., & Demir, E. (2017). Merkezi ortak sınav matematik alt testinde değişen madde fonksiyonunun görme engeli durumuna göre incelenmesi. Current Research in Education, 3(1), 24-34.
  • Potenza, M. T., & Dorans, N. J. (1995). DIF assessment for polytomously scored items: A framework for classification and evaluation. Applied psychological measurement, 19(1), 23-37.
  • Robitzsch, A., & Lüdtke, O. (2020). A review of different scaling approaches under full invariance, partial invariance, and noninvariance for cross-sectional country comparisons in large-scale assessments. Psychological Test and Assessment Modeling, 62, 233-279. https://bit.ly/3ezBB05
  • Schwabe, F., McElvany, N., Trendtel, M., Gebauer, M. M., & Bos, W. (2014). Vertiefende Analysen zu migrationsbedingten Leistungsdifferenzen in Leseaufgaben. Zeitschrift für Pädagogische Psychologie.
  • Sharma, S. (1995). Applied multivariate techniques. John Wiley & Sons, Inc.
  • Spearman, C. (1905). Proof and disproof of correlation. The American Journal of Psychology, 16(2), 228-231.
  • Strobl, C., Kopf, J., & Zeileis, A. (2015). Rasch trees: A new method for detecting differential item functioning in the Rasch model. Psychometrika, 80(2), 289-316. https://doi.org/10.1007/s11336-013-9388-3
  • Şenferah, S. (2015). Seviye belirleme sınavı matematik alt testi için değişen madde fonksiyonlarının ve madde yanlılığının incelenmesi. Yayınlanmamış Yüksek Lisans Tezi. Gazi Üniversitesi. Eğitim Bilimleri Enstitüsü. Ankara.
  • Tabachnick, B. G., Fidell, L. S., & Ullman, J. B. (2007). Using multivariate statistics (Vol. 5, pp. 481-498). Boston, MA: Pearson.
  • Tavşancıl, E. (2018). Tutumların ölçülmesi ve SPSS ile veri analizi. Nobel Akademik Yayıncılık.
  • Zhang, M. (2009). Gender related differential item functioning in mathematics tests: A meta-analysis (Doctoral dissertation, Washington State University).
  • Zieky, M. (1993). Practical questions in the use of DIF statistics in test development. Lawrence Erlbaum Associates, Inc.
  • Zumbo, B. D., & Thomas, D. R. (1997). A measure of effect size for a model-based approach for studying DIF. Prince George, Canada: University of Northern British Columbia, Edgeworth Laboratory for Quantitative Behavioral Science.
  • Zumbo, B. D. (1999). A handbook on the theory and methods of differential item functioning (DIF). Ottawa: National Defense Headquarters, 1-57.

Details

Primary Language English
Journal Section Articles
Authors

Emre Kucam 0000-0002-4283-7103

Hamide Deniz Gülleroğlu 0000-0001-6995-8223

Publication Date June 30, 2023
Acceptance Date June 22, 2023
Published in Issue Year 2023 Volume: 14 Issue: 2

Cite

APA Kucam, E., & Gülleroğlu, H. D. (2023). Examination of Differential Item Functioning in PISA 2018 Mathematics Literacy Test with Different Methods. Journal of Measurement and Evaluation in Education and Psychology, 14(2), 128-153. https://doi.org/10.21031/epod.1122857