A Comparative Study of the IELTS Academic Writing Test in Paper-Based versus Computer-Based Mode among Iranian IELTS Candidates at the University of Tehran
Subject Areas: All areas of language and translation
Mehdi Dastpak 1, Mohammad Javad Riasati 2, Ehsan Hadipourfard 3
1 - PhD Candidate in English Language Teaching, Shiraz Branch, Islamic Azad University, Shiraz, Iran
2 - Assistant Professor of English Language Teaching, Shiraz Branch, Islamic Azad University, Shiraz, Iran
3 - Assistant Professor of Applied Linguistics, Shiraz Branch, Islamic Azad University, Shiraz, Iran
Keywords: competitive writing test, paper-based mode, computer-based mode, computer familiarity
Abstract:
The present study was an attempt to investigate whether language learners perform differently in terms of task response/task achievement, coherence/cohesion, lexical resource, and grammatical range and accuracy on the writing test of the International English Language Testing System. In addition, it examined whether the computer familiarity of the students in the paper-based and computer-based groups differed. To this end, 108 out of a total of 144 candidates were selected at the University of Tehran, Iran, based on the results of the Oxford Placement Test. To collect the data, a retired IELTS academic writing sample and a computer familiarity questionnaire were used. The participants were divided into two equal groups. In the paper-based group, the students were given the writing test in the conventional paper format. In the computer-based group, the same test was administered; however, the students were asked to type their responses on a computer assigned to them in their classroom. All participants were also given the computer familiarity questionnaire. The collected data were analyzed through independent samples t-tests. The findings showed a significant difference between the paper-based and computer-based modes on both Writing Task 1 and Writing Task 2. Moreover, the analysis of the questionnaire revealed an effect of the candidates' computer familiarity on their writing performance.
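For readers interested in how the mode comparison could be reproduced, the sketch below illustrates the independent samples t-test named in the abstract, written in Python. It is a minimal sketch under stated assumptions: the band scores are randomly generated placeholders rather than the study's data, and the variable names (paper_scores, computer_scores) are introduced here purely for illustration.

```python
# Minimal sketch of the mode comparison described above, assuming the two
# groups' Writing Task 2 band scores are available as arrays. The scores
# below are randomly generated placeholders, NOT the study's actual data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)

# Hypothetical band scores (half-band increments) for 54 candidates per group,
# matching the study's group sizes but not its results.
paper_scores = np.round(rng.normal(loc=6.0, scale=0.75, size=54) * 2) / 2
computer_scores = np.round(rng.normal(loc=6.5, scale=0.75, size=54) * 2) / 2

# Independent samples t-test, the analysis reported in the abstract.
t_stat, p_value = stats.ttest_ind(paper_scores, computer_scores)
df = len(paper_scores) + len(computer_scores) - 2
print(f"t({df}) = {t_stat:.2f}, p = {p_value:.3f}")
```

A p-value below .05 in such a comparison would indicate a statistically significant difference between the paper-based and computer-based groups on that task.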