Peer-Reviewed

A Study on the Characteristics of Writing Test Items of English High School Entrance Examinations in China in 2020 and 2021

Received: 3 July 2022    Accepted: 24 July 2022    Published: 4 August 2022
Abstract

As the most important large-scale, high-stakes examination in compulsory education, the senior high school entrance examination (SHSEE) has long been a focus of language testing research. SHSEE papers are designed to examine students’ language achievement and proficiency according to the requirements of the 2011 English curriculum standards for compulsory education, while at the same time being influenced by the concept of core competencies in the 2018 senior high school English curriculum standards. The writing test, and especially the writing test items in the SHSEE, can reflect students’ English proficiency and exert washback on future test design and writing teaching. This study analyzes 102 SHSEE writing items from across the country in 2020 and 2021, focusing on their characteristics and the quality of their design. The paper adapts previous frameworks and analyzes the items along three dimensions and seven subdimensions: Test Content (Genre, Topics), Prompt (Form, Length), and Test Context Design (Authenticity, Interactivity, Openness). The results show that practical writing remains the most prevalent genre; prompt forms tend to diversify, with outline prompts remaining the most popular; and most items succeed in presenting authentic, interactive tasks with open space for students to write and create. Most test designs are in accordance with the concept of core competencies in the latest curriculum standards issued in 2022. The results have implications for, and offer some guidelines for, the design of future middle school English writing tests.
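
To make the coding framework above concrete, the minimal Python sketch below shows one way items could be recorded and tallied along the three dimensions and seven subdimensions named in the abstract. This is an illustrative assumption only: the class and function names (WritingItem, summarise), the field names, the category labels (e.g. "practical writing", "outline"), and the two sample items are hypothetical and do not represent the authors’ actual instrument or the 2020/2021 test data.

    from collections import Counter
    from dataclasses import dataclass

    @dataclass
    class WritingItem:
        # Test Content
        genre: str          # e.g. "practical writing", "narrative"
        topic: str          # e.g. "school life", "family"
        # Prompt
        prompt_form: str    # e.g. "outline", "picture", "table"
        prompt_length: int  # number of words in the prompt
        # Test Context Design
        authentic: bool     # task resembles a real communicative situation
        interactive: bool   # an audience and purpose are specified
        open_ended: bool    # leaves room for students' own ideas

    def summarise(items):
        """Tally how the coded items distribute over each subdimension."""
        return {
            "genre": Counter(i.genre for i in items),
            "topic": Counter(i.topic for i in items),
            "prompt_form": Counter(i.prompt_form for i in items),
            "authenticity": Counter(i.authentic for i in items),
            "interactivity": Counter(i.interactive for i in items),
            "openness": Counter(i.open_ended for i in items),
        }

    # Two invented sample items, not actual 2020/2021 test data.
    sample = [
        WritingItem("practical writing", "school life", "outline", 60, True, True, True),
        WritingItem("narrative", "family", "picture", 45, True, False, True),
    ]
    print(summarise(sample)["genre"])  # Counter({'practical writing': 1, 'narrative': 1})

In a full study, each of the 102 items would be coded this way by trained raters and the resulting counts compared across years and regions; the sketch only shows the shape such a tally could take.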

Published in International Journal of Language and Linguistics (Volume 10, Issue 4)
DOI 10.11648/j.ijll.20221004.13
Page(s) 239-247
Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.

Copyright

Copyright © The Author(s), 2024. Published by Science Publishing Group

Keywords

Senior High School Entrance Examination, English Writing Test, Test Design

Cite This Article
  • APA Style

    Lu Yiruo, Qian Xiaofang. (2022). A Study on the Characteristics of Writing Test Items of English High School Entrance Examinations in China in 2020 and 2021. International Journal of Language and Linguistics, 10(4), 239-247. https://doi.org/10.11648/j.ijll.20221004.13


    ACS Style

    Lu Yiruo; Qian Xiaofang. A Study on the Characteristics of Writing Test Items of English High School Entrance Examinations in China in 2020 and 2021. Int. J. Lang. Linguist. 2022, 10(4), 239-247. doi: 10.11648/j.ijll.20221004.13


    AMA Style

    Lu Yiruo, Qian Xiaofang. A Study on the Characteristics of Writing Test Items of English High School Entrance Examinations in China in 2020 and 2021. Int J Lang Linguist. 2022;10(4):239-247. doi: 10.11648/j.ijll.20221004.13



Author Information
  • Lu Yiruo, School of Foreign Languages and Literature, Beijing Normal University, Beijing, China

  • Qian Xiaofang, School of Foreign Languages and Literature, Beijing Normal University, Beijing, China
