Research Article

Investigating the Effect of Item Order on the Psychometric Properties of a Self-Efficacy Perception Scale

Year 2023, Volume 14, Issue 5, 478-493, 22.10.2023
https://doi.org/10.19160/e-ijer.1362442

Abstract

In this study, the item order effect was investigated using a self-efficacy perception scale for computational thinking. The sample consisted of 946 eighth-grade students. Participants were first administered the original form of the scale, in which the dimensions were presented sequentially and each item appeared under its relevant subdimension. One month later, the same group completed a second form in which the items were ordered completely at random. Analyses revealed no significant difference between the two forms in the group's mean total score. The study also examined how much each individual participant's total and factor scores varied between the two forms; these differences were negligible. Another notable finding is that presenting items from the same dimension together contributes positively to the internal consistency reliability of the scale. Confirmatory factor analyses performed for both forms revealed good model fit with similar index values, indicating that randomizing the item order does not disrupt the factor structure of the scale. Differences in factor loadings between the two models were also examined. Finally, measurement invariance was tested, and strict measurement invariance was found to hold between the two forms. Future studies could use forms in which the item order is deliberately manipulated, rather than randomized, so that contextual effects can be examined. It is also suggested that the item order effect be examined across a range of demographic characteristics and different item types.
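Two of the analyses summarized above — the paired comparison of total scores across the two administrations and the internal consistency of each form — can be sketched in Python. The data below are simulated and all quantities except the sample size (946) are illustrative (the item count, score model, and variable names are assumptions, not the study's actual dataset or code).

```python
import numpy as np
from scipy import stats

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
n, k = 946, 36  # n matches the study's sample size; k is illustrative
ability = rng.normal(size=(n, 1))                # shared true score
form_a = ability + rng.normal(scale=0.8, size=(n, k))  # original (blocked) order
form_b = ability + rng.normal(scale=0.8, size=(n, k))  # randomized order

# Paired-samples comparison of total scores between the two forms
t, p = stats.ttest_rel(form_a.sum(axis=1), form_b.sum(axis=1))
print(f"paired t = {t:.2f}, p = {p:.3f}")
print(f"alpha (form A) = {cronbach_alpha(form_a):.3f}")
print(f"alpha (form B) = {cronbach_alpha(form_b):.3f}")
```

The confirmatory factor analyses and invariance tests reported in the study require a structural equation modeling package (e.g., lavaan in R) and are not reproduced here.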



Details

Primary Language: English
Subjects: Field Education (Other)
Section: Issue Articles
Authors

Elif Kubra Demır (ORCID: 0000-0002-3219-1644)

Publication Date: 22 October 2023
Published in: Year 2023, Volume 14, Issue 5

How to Cite

APA Demır, E. K. (2023). Investigating the Effect of Item Order on the Psychometric Properties of a Self-Efficacy Perception Scale / Madde Sıralamasının Bir Öz Yeterlik Algısı Ölçeğinin Psikometrik Özelliklerine Etkisinin İncelenmesi. E-Uluslararası Eğitim Araştırmaları Dergisi, 14(5), 478-493. https://doi.org/10.19160/e-ijer.1362442

Creative Commons License
Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0)


[email protected]        http://www.e-ijer.com       Address: Ege Üniversitesi Eğitim Fakültesi, Bornova/İzmir