| Peer-Reviewed

Assessment Techniques and Students’ Higher-Order Thinking Skills

Received: 23 January 2016    Accepted: 16 February 2016    Published: 6 March 2016
Abstract

Improving students’ higher-order thinking skills is a collective endeavor; no single teacher of one subject can develop these skills alone. Rather, it is a collaborative process among teachers of all subjects, and these skills can be taught at all levels of study (Lawson, 1993; Shellens & Valcke, 2005). Moreover, Benjamin (2008) argues that these skills develop cumulatively as students progress through their courses, subjects, and the other experiences their institutions provide. Likewise, enriching subjects with problem-solving, critical-thinking, and decision-making activities helps students strengthen their higher-order thinking skills. In this paper, a mathematics test on fractions was constructed and analyzed for grades 8 and 9 to examine how teacher-made tests are constructed and the extent to which they align with the levels of Bloom’s Taxonomy. The test consists of five sections, or content areas, and was analyzed according to a behavior matrix. The results showed that all test items measure the lower three levels of Bloom’s Taxonomy, which agrees with the finding of Stiggins, Griswold, and Wikelund (1989) that most teacher-made tests measure the lower levels of the taxonomy. Moreover, 57.14% of the test items are application items and 28.57% are recognition items. These numbers are consistent with Boyd’s (2008) study, which indicated that the majority of teachers’ assessment items focused on the lower levels of Bloom’s Taxonomy; Boyd found that 87% of the participating teachers’ items targeted level 1 of the taxonomy in 2003-2004, and 86% in 2005-2006. These figures reflect the tendency of assessment methods used in schools to ask students to recall information or answer routine questions, which does not help students improve their higher-order thinking skills.

Published in International Journal of Secondary Education (Volume 4, Issue 1)
DOI 10.11648/j.ijsedu.20160401.11
Page(s) 1-11
Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.

Copyright

Copyright © The Author(s), 2024. Published by Science Publishing Group

Keywords

Assessment, Higher-Order Critical Thinking, Mathematics

References
[1] Abrams, L. M., Pedulla, J. J., & Madaus, G. F. (2003). Views from the classroom: Teachers’ opinions of statewide testing programs. Theory into Practice, 42(1), pp. 18-29.
[2] Airasian, P. W. (1994). Classroom assessment (2nd ed.). New York: McGraw Hill.
[3] Anderson, T., and Elloumi, F. (2004). Theory and Practice of Online Learning [online]. Athabasca University [Accessed 11 May 2010]. Available at: http://cde.athabascau.ca/online_book.
[4] Appl, D. J. (2000). Clarifying the preschool assessment process: Traditional practices and alternative approaches. Early Childhood Education Journal, 27(4), pp. 219-225.
[5] Aschbacher, P. R. (1991). Performance assessment: State activity, interest, and concerns. Applied Measurement in Education, 4(4), pp. 275–288.
[6] Authentic Assessment Overview. (2001) Person development group. [Online]. [Accessed 11 April 2010]. Available at: www.ithaca.edu/jwiggles/t&m/check.htm.
[7] Benjamin, R. (2008). The Case for Comparative Institutional Assessment of Higher-Order Thinking Skills. Change, 40(6), pp. 51-55.
[8] Bennett, R. E., Morley, M., and Quardt, D. (2000). Three response types for broadening the conception of mathematical problem solving in computerized tests. Applied Psychological Measurement, 24(4), pp. 294-309.
[9] Bereiter, C. and Scardamalia, M. (1987). An attainable version of high literacy: Approaches to teaching higher-order thinking skills in reading and writing. Curriculum Inquiry, 17, pp. 9-30.
[10] Beyer, B. (1983). Common sense about teaching thinking. Educational Leadership, 41(3), pp. 44-49.
[11] Bol, L., & Strage, A. (1993). Relationships among teachers’ assessment practices and their student outcome and study skill development goals. (ERIC Document Reproduction Service No. ED 367 639) [online]. [Accessed 05 May 2010] available at: http://www.eric.ed.gov/ERICDocs/data/ericdocs2sql/content_storage_01/0000019b/80/15/5b/34.pdf.
[12] Bol, L. (1998). Influence of experience, grade level, and subject area on teachers’ assessment practices. The Journal of Educational Research, 91(6), pp. 323-330.
[13] Boyd, B. (2008). Effects of state tests on classroom test items in mathematics, School Science and Mathematics, 108(6), pp. 251-261.
[14] Black, P., & Wiliam, D. (1998). Inside the black box: Raising standards through classroom assessment. Phi Delta Kappan, 80(2), pp. 139-149.
[15] Bloom, B. S. (Ed.). (1956). Taxonomy of educational objectives: Handbook I, cognitive domain. New York: Longman.
[16] Bracey, G. W. (1987). Measurement-driven instruction: Catchy phrase, dangerous practice, Phi Delta Kappan, 68(9), pp. 683-686.
[17] Burns, M. (1985). The role of questioning. The Arithmetic Teacher, 32(6), pp. 14-17.
[18] Calfee, R. C. (1994). Cognitive assessment of classroom learning. Education and Urban Society, 26(4), pp. 340-351.
[19] Sereni, C. (2015). Teaching Strategies for Critical Thinking Skills. Academic Exchange Quarterly, 19(3). ISSN 1096-1453.
[20] Costa, A. (1981). Teaching for intelligent behavior. Educational Leadership, 39(1), pp. 29-32.
[21] Crenshaw, P., Hale, E., & Harper, S. L. (2011). Producing intellectual labor in the classroom: The utilization of a critical thinking model to help students take command of their thinking. Journal of College Teaching & Learning, 8(7), 13-26. Retrieved from http://search.ebscohost.com/.
[22] Crooks, T. J. (1988). The Impact of Classroom Evaluation Practices on Students. Review of Educational Research, 58(4), pp. 438-481.
[23] Dewey, J. (1966). Democracy and education: An introduction to the philosophy of education. New York: Collier-Macmillan.
[24] Doyle, W. (1983). Academic work. Review of Educational Research, 53, pp. 159-199.
[25] Doganay, A. and Bal, A. P. (2010). The Measurement of Students’ Achievement in Teaching Primary School Fifth Year Mathematics Classes. Educational Science: Theory & Practice, 10(1), pp. 199-215.
[26] Eisner, E. (1999). The uses and limits of performance assessment. Phi Delta Kappan, 80(9), pp. 658-660.
[27] Ennis, R. H. (1993). Critical Thinking Assessment. Theory into Practice, 32(3), pp. 179-186.
[28] Ennis, R. H. and Wheary, J. (1995). Gender Bias in Critical Thinking: Continuing the Dialogue. Educational Theory, 45(2), pp. 213-224.
[29] Facione, P. (2011). Think critically. Boston, MA: Prentice Hall
[30] Firestone, W. A., Mayrowetz, D., and Fairman, J. (1998). Performance-based assessment and instructional change: The effects of testing in Maine and Maryland. Educational Evaluation and Policy Analysis, 20(2), pp. 95-113.
[31] Fleming, M. & Chambers, B. (1983). Teacher-made tests: Windows on the classroom. In W. E. Hathaway (Ed.), Testing in the schools: New directions for testing and measurement, pp. 29-38. San Francisco, CA: Jossey-Bass.
[32] Fogarty, R. and McTighe, M. (1993). Educating Teachers for Higher Order Thinking: The three-Story Intellect. Theory into Practice, 32(3), pp. 161-169.
[33] Gronlund, N. E. (1998) Assessment of student achievement (8th ed.) Boston, Pearson/Allyn and Bacon.
[34] Hollander, S. K. (1978). A literature review: Thought processes employed in the solution of verbal arithmetic problems. School Science and Mathematics, 78, pp. 327-335.
[35] Haertel, E. H. (1991). New forms of teacher assessment. Review of Research in Education, 17(1), pp. 3-29.
[36] Kalyuga, S. (2006). Rapid assessment of learners’ proficiency: A cognitive load approach. Educational Psychology, 26(6), pp. 735-749.
[37] Khattri, N., Reeve, A., and Kane, M. (1998). Principles and practices of performance assessment. Mahwah, NJ: Lawrence Erlbaum.
[38] Lai, A. F. (2007). The development of computerized two-tier diagnostic test and remedial learning system for elementary science learning. Seventh IEEE International Conference on Advanced Learning Technologies (ICALT). [Online]. [Accessed 29 April 2010]. Available at: http://csdl2.computer.org/comp/proceedings/icalt/2007/2916/00/29160735.pdf.
[40] Lawson, A. (1993). At What Levels of Education is the Teaching of Thinking Effective? Theory into Practice, 32(3), pp. 170-178.
[41] Lewis, A. and Smith, D. (1993). Defining Higher Order Thinking. Theory into Practice, 32(3), pp. 131-137.
[42] Bol, L., Stephenson, P. L., and O’Connell, A. A. (1998). Influence of experience, grade level, and subject area on teachers’ assessment practices. The Journal of Educational Research, 91(6), pp. 323-331.
[43] Linn, R. L., Baker, E. L., & Dunbar, S. B. (1991). Complex Performance-based assessment: expectations and validation criteria. Educational Researcher, 20(8), pp. 15-21.
[44] Marzano, R. J. (1993) How classroom Teachers Approach the Teaching of Thinking, Theory into Practice, 32(3), 154-160.
[45] Meyer, C. A. (1992). What’s the difference between authentic and performance assessment? Educational Leadership, 49(8), pp. 39-42.
[46] Scriven, M., and Paul, R. (2004). Defining Critical Thinking. Retrieved 10/9/2016 from http://www.criticalthinking.org/aboutCT/definingCT.shtml.
[47] Miller, M. D., & Linn, R. L. (2000). Validation of performance –based assessments. Applied Psychological Measurement, 24(4), pp. 367-378.
[48] National Research Council (NRC). (2000). How People Learn: Brain, Mind, Experience, and School: Expanded Edition, National Academy Press, Washington, D. C.
[49] National Council of Teachers of Mathematics. (1995). Assessment standards for school mathematics. Reston, VA: National Council of Teachers of Mathematics.
[50] National Council of Teachers of Mathematics (NCTM). (1999). Developing Mathematical Reasoning in Grades K-12. Yearbook of the National Council of Teachers of Mathematics. Reston, VA: National Council of Teachers of Mathematics.
[51] Newman, F. M. (1990). Higher Order Thinking in Teaching Social Studies: A rationale for the assessment of Classroom Thoughtfulness. Journal of Curriculum Studies, 22(1), pp. 41-56.
[52] Nosich, G. (2012). Learning to think things through. A Guide to critical thinking across the curriculum. (4th ed). Boston, MA: Pearson.
[53] Paul, R., & Nosich, R. (1992). A model for the national assessment of higher order thinking. (ERIC Document Reproduction Service No. ED 353 296).
[54] Pellegrino, J, Chudowsky, N & Glaser, R (eds) (2001). Knowing what students know: The science and design of educational assessment: A report of the National Research Council, National Academy Press, Washington DC.
[55] Penta, M. Q., & Hudson, M. B. (1999). Evaluating a practice-based model to develop successful alternative assessment at instructionally innovative elementary Schools. Paper presented at the annual meeting of the American Educational Research Association (AERA): (pp. 1-22). Montreal, Quebec, Canada. [Online]. [Accessed 4 May 2010]. Available at: http://eric.ed.gov/ERICDocs/data/ericdocs2sql/content_storage_01/0000019b/80/15/e8/6f.pdf.
[56] Porter, A. C., Kirst, M. W., Osthoff, E. J., Smithson, J. S., & Schneider, S. A. (1993). Reform up close: An analysis of high school mathematics and science classrooms (Final Report to the National Science Foundation on Grant No. SPA-8953446 to the Consortium for Policy Research in Education). Madison, WI: University of Wisconsin– Madison, Wisconsin Center for Education Research.
[57] Sambell, K., and McDowell, L. (1998). The construction of the hidden curriculum: messages and meanings in the assessment of student learning. Assessment and Evaluation in Higher Education, 23 (4), pp. 391-402.
[58] Shellens, T., & Valcke, M. (2005). Collaborative learning in asynchronous discussion groups: What about the impact on cognitive process? Computers in Human Behavior, 21(6), pp. 957-975.
[59] Shepard, L. A. (1989). Why we need better assessments. Educational Leadership, 46(7), pp. 4-9.
[60] Shepard, L. A. (2000). The role of assessment in a learning culture. Educational Researcher, 29, pp. 4-14.
[61] Stiggins, R. J., Griswold, M. M., and Wikelund, K. R. (1989). Measuring thinking skills through classroom assessment. Journal of Educational Measurement, 26(3), pp. 233−246.
[62] Stiggins, R. J., Frisbie, D. A. and Griswold, P. A. (1989). Inside high school grading practices: Building a research agenda. Educational Measurement: Issues and Practices, 8(2), pp. 5-14.
[63] Stiggins, R. J. (2002). Assessment crisis: The absence of assessment for learning. Phi Delta Kappan, 83(10), 758-765.
[64] Stiggins, R. J. (1999). Assessment, student confidence, and school success. Phi Delta Kappan, 81(3), 191-198.
[65] Stiggins, R. J. (1997). Student-centered classroom assessment. Upper Saddle River, NJ: Merrill, an imprint of Prentice Hall.
[66] Struyven, K., Dochy, F., Janssens, S., Schelfhout, W., and Gielen, S. (2006). The overall effects of end-of-course assessment on student performance: A comparison between multiple-choice testing, peer assessments, case –based assessment and portfolio assessment. Studies in Educational Evaluation, 32(3), pp. 202-222.
[67] Sternberg, R. (1984). How can we teach intelligence? Educational Leadership, 42(1), pp. 38-48.
[68] Dorgu, T. E. (2015). Different Teaching Methods: A Panacea for Effective Curriculum Implementation in the Classroom. International Journal of Secondary Education, Special Issue: Teaching Methods and Learning Styles in Education, 3(6-1), pp. 77-87. doi: 10.11648/j.ijsedu.s.2015030601.13.
[69] Trevino, E. (2008, September 13). It's critical to learn how to be critical thinkers. El Paso Times, Retrieved from http://www.nwp.org/cs/public/print/resource/2821.
[70] University of South Florida (USF) (2010). Classroom Assessment. [Online]. [Accessed 20 April 2010]. Available at: http://fcit.usf.edu/assessment/selected/responsea.html.
[71] Wiggins, G. (1993). Assessment: Authenticity, context, and validity. Phi Delta Kappan, 75, 200−214.
[72] Wiggins, G (1989). ‘A true test: Toward more authentic and equitable assessment’, Phi Delta Kappan, 70(9), pp. 703–713.
[73] Viechniki, K. J., Barbour, N., Shaklee, B., Rohrer, J. & Ambrose, R. (1993). The impact of portfolio assessment on teacher classroom activities. Journal of Teacher Education, 44(5), 371-377.
[74] Wiggins, G. (1994). Toward more authentic assessment of language performances. In C. Hancock(Ed), Teaching, testing, and assessment: Making the connection. Northeast conference reports. Lincolnwood, IL: National Textbook Co.
[75] Williams, J. & Ryan, J. (2000). National testing and the improvement of classroom teaching: Can they coexist? British Educational Research Journal, 26(1), pp. 49-73.
[76] Wolf, P. J. (2007). Academic improvement through regular assessment. Peabody Journal of Education, 82(4), pp. 690–702.
[77] Wraga, W. G. (1994). Performance assessment: A golden opportunity to improve the future. NASSP Bulletin, 78(563), pp. 71-79.
Cite This Article
  • APA Style

    Yousef Abosalem. (2016). Assessment Techniques and Students’ Higher-Order Thinking Skills. International Journal of Secondary Education, 4(1), 1-11. https://doi.org/10.11648/j.ijsedu.20160401.11



Author Information
  • Department of Mathematics and Science, Preparatory Program, Khalifa University, Abu Dhabi, UAE
