Research Article

Evaluations of Turkish Science Teacher Curriculum with Many-Facet Rasch Analysis

Year 2022, Volume: 18 Issue: 2, 27 - 42, 31.12.2022
https://doi.org/10.17244/eku.1180825

Abstract

Scientific and technological developments drive changes in educational programs and curricula. Science education in particular must meet today's needs and expectations. Changing only the K-12 science curriculum is not enough: the science teacher education curriculum must also change, since teachers are responsible for teaching these subjects. By 2018, all teacher education curricula in Turkey, including the science teacher education curriculum, had been revised in response to recent developments in science, technology, and education. This study investigated science teacher educators' evaluations of the Turkish science teacher education curriculum using Many-Facet Rasch Analysis. The program was evaluated along four curriculum dimensions: 1) aims and objectives, 2) subject matter, 3) learning experiences, and 4) evaluation approaches. The analyses covered general evaluations of the program, the academics' leniency and severity while rating it, and each criterion individually. The results confirmed the psychometric quality and unidimensionality of the criterion form, supporting the view in the literature that a Likert-type instrument can be developed and used to evaluate programs. The study also examined the academics' leniency and severity in rating the program, since the validity and reliability of each rater's behavior must be evaluated. Results indicated that rater bias, leniency, or severity did not affect the statistical confidence in the criterion form.
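The rater leniency and severity effects described above are what the many-facet Rasch model separates from item difficulty and person ability. As a minimal sketch, assuming the common rating-scale formulation of Linacre's model (log-odds of adjacent categories = ability − item difficulty − rater severity − step threshold; the function name and parameter values are illustrative, not from the study):

```python
import math

def mfrm_category_probs(theta, item_difficulty, rater_severity, thresholds):
    """Category probabilities under a many-facet rating scale model.

    For adjacent categories k and k-1:
        log(P_k / P_{k-1}) = theta - item_difficulty - rater_severity - thresholds[k-1]
    All parameters are in logits; `thresholds` has one step value per
    boundary between adjacent categories (len = n_categories - 1).
    """
    effective = theta - item_difficulty - rater_severity
    # Cumulative sums of the step logits give unnormalised log-probabilities.
    logits = [0.0]
    for tau in thresholds:
        logits.append(logits[-1] + (effective - tau))
    expd = [math.exp(l) for l in logits]
    total = sum(expd)
    return [e / total for e in expd]

# Same rater ability and item, different rater severity on a 4-category scale:
# a lenient rater (negative severity) shifts probability toward higher categories,
# a severe rater (positive severity) toward lower ones.
lenient = mfrm_category_probs(0.0, 0.0, -1.0, [-1.0, 0.0, 1.0])
severe = mfrm_category_probs(0.0, 0.0, +1.0, [-1.0, 0.0, 1.0])
```

Because severity enters the model as its own facet, expected ratings can be adjusted for each rater, which is why lenient or severe raters need not undermine the instrument's statistical confidence.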


Fen Bilgisi Öğretmenliği Programının Çok Yönlü Rasch Analizi ile Değerlendirilmesi


Abstract

Scientific and technological developments cause changes in education and training programs. Science education in particular must meet today's needs and expectations. Changing only the middle school science curriculum is not sufficient to meet them: because teachers are responsible for teaching the subjects, science teacher education programs must also change according to today's needs. In 2018, all teacher education programs, including the science teacher education program, were changed to incorporate recent developments in science, technology, and education. To this end, in this study the science teacher education program was evaluated by academics working in science teacher education, and these evaluations were examined with Many-Facet Rasch Analysis. The program was evaluated along four program dimensions: 1) aims and objectives, 2) subject matter, 3) learning experiences, and 4) measurement and evaluation. General evaluations of the program revealed the academics' leniency and severity while rating it, and each criterion was analyzed separately. The results of the analysis confirm the psychometric and unidimensional properties of the criterion form. It can therefore be said that the Likert-type instrument developed in this study can be used to evaluate the science teacher education program. In addition, this study addressed the academics' leniency and severity while evaluating the program; the validity and reliability of each academic's rating behavior were evaluated separately. The results showed that bias, leniency, or severity did not affect the statistical confidence in the criterion form.


Details

Primary Language English
Subjects Studies on Education
Journal Section Articles
Authors

Ilgım Özergun 0000-0002-2277-6016

Fatih Doğan 0000-0002-3088-886X

Göksel Boran 0000-0003-3060-3876

Serdar Arcagök 0000-0002-4937-3268

Early Pub Date January 6, 2022
Publication Date December 31, 2022
Submission Date September 27, 2022
Published in Issue Year 2022 Volume: 18 Issue: 2

Cite

APA Özergun, I., Doğan, F., Boran, G., Arcagök, S. (2022). Evaluations of Turkish Science Teacher Curriculum with Many-Facet Rasch Analysis. Eğitimde Kuram Ve Uygulama, 18(2), 27-42. https://doi.org/10.17244/eku.1180825