Author: Li, Yi-Ching
Thesis title: A Comparison of English Reading Comprehension Questions on the BCT and the CAP Using Revised Bloom's Taxonomy
Advisor: Chen, Chiou-Lan
Degree: Master
Department: Department of English
Year of publication: 2017
Academic year of graduation: 106
Language: English
Number of pages: 68
Keywords: CAP, English reading comprehension test, the revised Bloom's taxonomy, PISA, PIRLS, senior high school entrance exams
DOI: https://doi.org/10.6345/NTNU202203140
Document type: Academic thesis
This study used the revised Bloom's taxonomy to classify the cognitive levels of the English reading comprehension items on the Basic Competence Test (BCT, 2011–2013) and the Comprehensive Assessment Program (CAP, 2014–2016) for junior high school students, and examined the frequency of each cognitive category and examinees' performance on it.
The major findings are summarized as follows:
Among the categories in Bloom's taxonomy, Remember, Understand, Apply, and Analyze appeared most often on the BCT and the CAP, while Evaluate and Create never appeared, possibly because these two categories are difficult to measure with multiple-choice items. Understand accounted for the majority of the items, and such items tended to be easy; most ranged from easy to average in difficulty, though a few were difficult. Apply and Analyze items were limited in number and were mostly difficult, with some exceptions. Over the six years examined, these two categories together accounted for 10.5% to 36% of all items, most of which were difficult and only a few of which were easy. Compared with the BCT, the CAP contained substantially more Apply items, demonstrating the growing importance of this category and echoing the CAP's aim of relating test items to students' daily lives and activating their learning. Likewise, the CAP contained more Apply and Analyze items than the BCT, indicating that it covered more items at higher cognitive levels.
Based on these results, it is suggested that teachers design more challenging Understand items and easier Apply and Analyze items to help students prepare for the CAP.
This research aimed to analyze, using the cognitive categories of the revised Bloom's taxonomy, the English reading comprehension items on the BCT (Basic Competence Test) and the CAP (Comprehensive Assessment Program) for junior high school students from 2011 to 2016, and to explore examinees' performance at each cognitive level.
The major results are as follows:
The study found that Remember, Understand, Apply, and Analyze appeared in both exams, whereas Evaluate and Create did not, possibly because these two categories are not easily measured with multiple-choice items. Understand items were the majority and tended to be easy or average in difficulty; only a few were difficult.
By contrast, Apply and Analyze items combined accounted for only about 20% of the test items, but they tended to be difficult.
When the two exams were compared, Apply items were found to have increased more rapidly on the CAP than on the BCT, indicating the rising importance of this category and matching the CAP's goal of relating to students' life experiences and activating their learning.
The CAP also tended to contain more Apply and Analyze items, showing that it included more items at higher cognitive levels than the BCT did.
It is suggested that teachers design more difficult Understand items and easier Apply and Analyze items to help students prepare for the CAP in the future.