
Graduate student: 簡佑達
Yu-Ta Chien
Thesis title: 運用立即反饋系統促進科學學習
Engaging Students in Science Learning with Clickers: The Good, the Bad, and the Future
Advisor: 張俊彥
Chang, Chun-Yen
Degree: Doctor
Department: Graduate Institute of Science Education (科學教育研究所)
Year of publication: 2015
Academic year of graduation: 103 (ROC calendar; 2014–2015)
Language: English
Number of pages: 133
Chinese keywords: 立即反饋系統、按按按、科學學習、雲端教室、同儕教學、小組討論
英文關鍵詞: instant response system, clicker, science learning, cloud classroom, peer instruction, group discussion
Thesis type: Academic thesis
    This dissertation investigates the benefits of using instant response systems (clickers) in science teaching and the strategies that make such use effective. A meta-analysis of previous empirical studies provides evidence that instant response systems help improve learning outcomes and indicates that pairing them with small-group discussion produces substantially larger learning gains. A follow-up empirical study compares how different models of clicker-integrated group discussion affect students' discussion processes and learning outcomes. Building on these findings, the dissertation develops a next-generation instant response system, CloudClassRoom (CCR), and refines the strategies for conducting group discussion in order to strengthen science teaching. Empirical results support the feasibility and effectiveness of combining CCR with group discussion. Drawing on the development of CCR and students' observed discussion patterns, the dissertation offers practical recommendations for clicker-integrated science teaching and proposes new directions for research.

    Clickers are widely advocated as a useful tool for breaking up the passive learning format of science lectures. The expectation is that clickers will actively engage students in thinking about the content presented during the lecture. Given schools' increasing investment in clickers, it is critical to examine whether clickers are worth that investment and how they can best be used to promote active learning. This dissertation addresses these two questions. A meta-analysis of empirical clicker studies is performed to compare the effectiveness of conventional and clicker-integrated lectures. It yields statistical evidence that clicker-integrated lectures are superior and, moreover, that using clickers with peer discussion is a particularly promising strategy for promoting learning gains. Following this line of research, an empirical study is conducted to compare different models of implementing peer discussion with clickers. Characteristics of productive clicker-integrated peer discussion are identified: it is explanation oriented, intellectually diverse, iteratively justified, and collectively participated in. Issues related to cultivating such discussion are then raised and discussed, including the balance between education and entertainment, the demonstration of the weakness of appeals to authority, tools for making explanations explicit, and heterogeneous grouping of students. Practical strategies for resolving these issues are discussed. An innovative design is then proposed to transform clickers into a more powerful pedagogical tool, and an empirical study demonstrating the feasibility and effectiveness of the design ideas in a real classroom setting closes the dissertation.
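    The meta-analysis reported in Chapter 2 synthesizes standardized mean differences across the selected studies (see the entries for Hedges, 1981, and Hedges & Vevea, 1998, in the Chapter 2 references). The dissertation's own analysis code is not reproduced in this record; the Python fragment below is only a minimal sketch, under standard assumptions, of how such a synthesis is typically computed: Hedges' g for each study, then a random-effects pooled estimate using the DerSimonian-Laird estimator. The function names and the study summaries are hypothetical and purely illustrative.

    import math

    def hedges_g(m_t, sd_t, n_t, m_c, sd_c, n_c):
        # Standardized mean difference (Hedges' g) for one study:
        # treatment = clicker-integrated lecture, control = conventional lecture.
        df = n_t + n_c - 2
        sd_pooled = math.sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2) / df)
        d = (m_t - m_c) / sd_pooled
        j = 1 - 3 / (4 * df - 1)                      # small-sample correction
        g = j * d
        var_g = j ** 2 * ((n_t + n_c) / (n_t * n_c) + d ** 2 / (2 * (n_t + n_c)))
        return g, var_g

    def random_effects_pool(effects):
        # DerSimonian-Laird random-effects pooling of (g, variance) pairs.
        g = [e for e, _ in effects]
        w = [1 / v for _, v in effects]
        k = len(effects)
        g_fixed = sum(wi * gi for wi, gi in zip(w, g)) / sum(w)
        q = sum(wi * (gi - g_fixed) ** 2 for wi, gi in zip(w, g))
        c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - (k - 1)) / c)            # between-study variance
        w_star = [1 / (v + tau2) for _, v in effects]
        g_pooled = sum(wi * gi for wi, gi in zip(w_star, g)) / sum(w_star)
        se = math.sqrt(1 / sum(w_star))
        return g_pooled, (g_pooled - 1.96 * se, g_pooled + 1.96 * se)

    # Hypothetical study summaries (mean, SD, n per condition), for illustration only.
    studies = [
        hedges_g(78.2, 10.5, 45, 72.4, 11.0, 47),
        hedges_g(64.0, 9.8, 30, 61.5, 10.2, 28),
        hedges_g(81.5, 12.1, 60, 74.9, 12.8, 62),
    ]
    print(random_effects_pool(studies))   # pooled g and its 95% confidence interval

    Whether the dissertation used this particular estimator is not stated in the abstract; the sketch only illustrates the general effect-size logic behind the comparison of conventional and clicker-integrated lectures.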

    LIST OF TABLES iv
    LIST OF FIGURES vi
    Chapter 1 Overview: Using Clickers to Facilitate Science Learning 1
    Chapter 2 Meta-analytic Review: Peer Discussion as a Promising Strategy 6
    2.1. Introduction 6
    2.2. Strategies and theoretical aspects 8
    2.2.1. Novelty effect 9
    2.2.2. Unequal-item exposure effect 10
    2.2.3. Testing effect 11
    2.2.4. Adjunct-question effect 12
    2.2.5. Feedback-intervention effect 14
    2.2.6. Explanation effect 17
    2.2.7. Summary 19
    2.3. Method 20
    2.3.1. Selection of studies 20
    2.3.2. Coding of study features 23
    2.3.3. Calculation and analysis of effect sizes 24
    2.4. Results and discussion 32
    2.4.1. Main characteristics of selected studies 32
    2.4.2. General outcomes of clicker-integrated instruction 32
    2.4.3. Clicker-integrated instruction is more than an amusing novelty 35
    2.4.4. Clicker-integrated instruction goes beyond rote memorization 36
    2.4.5. Testing/adjunct-question effect has little contribution 38
    2.4.6. Clicker-integrated peer discussion is a promising strategy 39
    2.5. Summary 41
    2.6. Limitations 46
    Chapter 3 To Display or Not? Different Designs of Clicker-integrated Discussion 47
    3.1. Introduction 47
    3.2. Method 51
    3.2.1. Participants and instructional materials 51
    3.2.2. Research design 51
    3.2.3. Outcome variables 52
    3.2.4. Instruments to obtain quantitative data 53
    3.2.5. Instruments to obtain qualitative data 54
    3.2.6. Data analysis 54
    3.3. Results and discussion 55
    3.3.1. Peer discussion improves correct response rates 55
    3.3.2. Voting display negatively influences discussion results 56
    3.3.3. Non-display session produces superior learning gains 57
    3.4. Summary 58
    Chapter 4 Voting Display Inhibiting Peer Discussion 60
    4.1. Introduction 60
    4.2. General data processing procedure 61
    4.3. Results and discussion 65
    4.3.1. Decrease in verbal exchanges related to learning material 65
    4.3.2. Lack of iterative justifications among clicker answers 68
    4.3.3. Monotonous discussion 70
    4.3.4. Passive participation of low prior knowledge students 73
    4.3.5. Authority of voting display 77
    4.3.6. Difference in goal setting between sessions 79
    4.3.7. Conformity to the majority 82
    4.4. Summary 84
    Chapter 5 Making Clicker Usage More Effective 86
    Chapter 6 Innovative Use of Clickers to Support Science Teaching 97
    6.1. Introduction 97
    6.2. Method 100
    6.2.1. Participants 100
    6.2.2. Procedure 101
    6.2.3. Interactive lecture 102
    6.2.4. Instruments 103
    6.2.5. Data analysis 103
    6.3. Results 107
    6.3.1. Supportive arguments 107
    6.3.2. Counter arguments 109
    6.3.3. Rebuttals 110
    6.4. Discussion 112
    6.5. Future directions 115
    REFERENCES 117

    Chapter 1
    Beatty, I. D., Gerace, W. J., Leonard, W. J., & Dufresne, R. J. (2006). Designing effective questions for classroom response system teaching. American Journal of Physics, 74(1), 31-39.
    Bonwell, C. C., & Eison, J. A. (1991). Active learning: Creating excitement in the classroom (ASHE–ERIC Higher Education Rep. No. 1). Washington, DC: The George Washington University, School of Education and Human Development.
    Bruff, D. (2009). Teaching with classroom response systems: Creating active learning environments. San Francisco, CA: Jossey-Bass.
    Caldwell, J. E. (2007). Clickers in the large classroom: Current research and best-practice tips. CBE-Life Sciences Education, 6(1), 9-20.
    Duncan, D. (2005). Clickers in the classroom: How to enhance science teaching using classroom response systems. San Francisco, CA: Pearson.
    Felder, R. M., & Brent, R. (2009). Active learning: An introduction. ASQ Higher Education Brief, 2(4).
    Fies, C., & Marshall, J. (2006). Classroom response systems: A review of the literature. Journal of Science Education and Technology, 15(1), 101-109.
    Gilbert, A. (2005). New for back-to-school: 'Clickers'. Retrieved May 27, 2015, from http://news.cnet.com/New-for-back-to-school-Clickers/2100-1041_3-5819171.html
    Kay, R. H., & LeSage, A. (2009). Examining the benefits and challenges of using audience response systems: A review of the literature. Computers & Education, 53(3), 819-827.
    MacArthur, J. R., & Jones, L. L. (2008). A review of literature reports of clickers applicable to college chemistry classrooms. Chemistry Education Research and Practice, 9(3), 187-195.
    Simpson, V., & Oliver, M. (2007). Electronic voting systems for lectures then and now: A comparison of research and practice. Australasian Journal of Educational Technology, 23(2), 187-208.
    Wieman, C., Perkins, K., Gilbert, S., Benay, F., Kennedy, S., Semsar, K., Knight, J., Shi, J., Smith, M., Kelly, T., Taylor, J., Yurk, H., Birol, G., Langdon, L., Pentecost, T., Stewart, J., Arthurs, L., Bair, A., Stempien, J., Gilley, B., Jones, F., Kennedy, B., Chasteen, S., & Simon, B. (2009). Clicker resource guide: An instructor’s guide to the effective use of personal response systems (clickers) in teaching. Vancouver, BC, Canada: University of British Columbia. (Available from http://www.cwsei.ubc.ca/resources/files/Clicker_guide_CWSEI_CU-SEI.pdf)

    Chapter 2
    References marked with an asterisk indicate studies included in the meta-analysis.
    *Agbatogun, A. O. (2012). Exploring the efficacy of student response system in a sub-saharan african country: A sociocultural perspective. Journal of Information Technology Education: Research, 11, 249-267.
    Aleven, V. A., & Koedinger, K. R. (2002). An effective metacognitive strategy: Learning by doing and explaining with a computer-based cognitive tutor. Cognitive Science, 26(2), 147-179.
    Andre, T. (1979). Does answering higher-level questions while reading facilitate productive learning? Review of Educational Research, 49(2), 280-318.
    Anthis, K. (2011). Is it the clicker, or is it the question? Untangling the effects of student response system use. Teaching of Psychology, 38(3), 189-193.
    Asch, S. E. (1951). Effects of group pressure upon the modification and distortion of judgments. In H. Guetzkow (Ed.), Groups, leadership and men: Research in human relations (pp. 177-190). Oxford, UK: Carnegie Press.
    Atkinson, R. K., Renkl, A., & Merrill, M. M. (2003). Transitioning from studying examples to solving problems: Effects of self-explanation prompts and fading worked-out steps. Journal of Educational Psychology, 95(4), 774-783.
    Azevedo, R., & Aleven, V. A. (2013). International handbook of metacognition and learning technologies (Vol. 26). New York, NY: Springer.
    *Bachman, L. R., & Bachman, C. M. (2011). A study of classroom response system clickers: Increasing student engagement and performance in a large undergraduate lecture class on architectural research. Journal of Interactive Learning Research, 22(1), 5-21.
    Bangert-Drowns, R. L., Kulik, C. L. C., Kulik, J. A., & Morgan, M. T. (1991). The instructional effect of feedback in test-like events. Review of Educational Research, 61(2), 213-238.
    Bangert-Drowns, R. L., Kulik, J. A., & Kulik, C. L. C. (1985). Effectiveness of computer-based education in secondary schools. Journal of Computer-Based Instruction, 12(3), 59-68.
    *Bartsch, R. A., & Murphy, W. (2011). Examining the effects of an electronic classroom response system on student engagement and performance. Journal of Educational Computing Research, 44(1), 25-33.
    Bielaczyc, K., Pirolli, P. L., & Brown, A. L. (1995). Training in self-explanation and self-regulation strategies: Investigating the effects of knowledge acquisition activities on problem solving. Cognition and Instruction, 13(2), 221-252.
    Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive domain. New York, NY: David McKay.
    Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. R. (2009). Introduction to meta-analysis. Chichester, UK: John Wiley.
    Boscardin, C., & Penuel, W. (2012). Exploring benefits of audience-response systems on learning: A review of the literature. Academic Psychiatry, 36(5), 401-407.
    Butler, D. L., & Winne, P. H. (1995). Feedback and self-regulated learning: A theoretical synthesis. Review of Educational Research, 65(3), 245-281.
    *Butler, M., Pyzdrowksi, L., Walker, V., & Yoho, S. (2010). Studying personal response systems in a college algebra course. Investigations in Mathematics Learning, 2(2), 1-18.
    Caldwell, J. E. (2007). Clickers in the large classroom: Current research and best-practice tips. CBE-Life Sciences Education, 6(1), 9-20.
    *Campbell, J., & Mayer, R. E. (2009). Questioning as an instructional method: Does it affect learning from lectures? Applied Cognitive Psychology, 23(6), 747-759.
    Chan, J. C. K. (2009). Long-term effects of testing on the recall of nontested materials. Memory, 18(1), 49-57.
    Chan, J. C. K., McDermott, K. B., & Roediger III, H. L. (2006). Retrieval-induced facilitation: Initially nontested material can benefit from prior testing of related material. Journal of Experimental Psychology: General, 135(4), 553-571.
    Cheung, A. C. K., & Slavin, R. E. (2013). The effectiveness of educational technology applications for enhancing mathematics achievement in K-12 classrooms: A meta-analysis. Educational Research Review, 9(0), 88-113.
    Chi, M. T. H., & Bassok, M. (1989). Learning from examples via self-explanations. In L. B. Resnick (Ed.), Knowing, learning, and instruction: Essays in honor of Robert Glaser (pp. 251-282). Hillsdale, NJ: Erlbaum.
    Chi, M. T. H., Bassok, M., Lewis, M. W., Reimann, P., & Glaser, R. (1989). Self-explanations: How students study and use examples in learning to solve problems. Cognitive Science, 13(2), 145-182.
    Chi, M. T. H., De Leeuw, N., Chiu, M. H., & Lavancher, C. (1994). Eliciting self-explanations improves understanding. Cognitive Science, 18(3), 439-477.
    Chi, M. T. H., & VanLehn, K. A. (1991). The content of physics self-explanations. Journal of the Learning Sciences, 1(1), 69-105.
    Chi, M., & VanLehn, K. (2010). Meta-cognitive strategy instruction in intelligent tutoring systems: How, when, and why. Educational Technology & Society, 13(1), 25-39.
    *Christopherson, K. M. (2011). Hardware or wetware: What are the possible interactions of pedagogy and technology in the classroom? Teaching of Psychology, 38(4), 288-292.
    Clark, R. E. (1983). Reconsidering research on learning from media. Review of Educational Research, 53(4), 445-459.
    Clark, R. E. (1994). Media will never influence learning. Educational Technology Research and Development, 42(2), 21-29.
    Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.
    Cooper, H. M. (1982). Scientific guidelines for conducting integrative research reviews. Review of Educational Research, 52(2), 291-302.
    Cooper, H. M. (1998). Synthesizing research: A guide for literature reviews (3rd ed.). Thousand Oaks, CA: Sage.
    Crouch, C. H., & Mazur, E. (2001). Peer instruction: Ten years of experience and results. American Journal of Physics, 69(9), 970-977.
    de Boer, H., Donker, A. S., & van der Werf, M. P. C. (2014). Effects of the attributes of educational interventions on students’ academic performance: A meta-analysis. Review of Educational Research, 84(4), 509-545.
    Dempster, F. N. (1996). Distributing and managing the conditions of encoding and practice. In E. L. Bjork & R. A. Bjork (Eds.), Memory: Handbook of perception and cognition (2nd ed., pp. 317-344). San Diego, CA: Academic Press.
    *Deslauriers, L., Schelew, E., & Wieman, C. (2011). Improved learning in a large-enrollment physics class. Science, 332(6031), 862-864.
    Donker, A. S., de Boer, H., Kostons, D., Dignath van Ewijk, C. C., & van der Werf, M. P. C. (2014). Effectiveness of learning strategy instruction on academic performance: A meta-analysis. Educational Research Review, 11(0), 1-26.
    *Doucet, M., Vrins, A., & Harvey, D. (2009). Effect of using an audience response system on learning environment, motivation and long-term retention, during case-discussions in a large group of undergraduate veterinary clinical pharmacology students. Medical Teacher, 31(12), E570-E579.
    Duval, S., & Tweedie, R. (2000). Trim and fill: A simple funnel-plot–based method of testing and adjusting for publication bias in meta-analysis. Biometrics, 56(2), 455-463.
    Egger, M., Smith, G. D., Schneider, M., & Minder, C. (1997). Bias in meta-analysis detected by a simple, graphical test. British Medical Journal, 315(7109), 629-634.
    *Elashvili, A., Denehy, G. E., Dawson, D. V., & Cunningham, M. A. (2008). Evaluation of an audience response system in a preclinical operative dentistry course. Journal of Dental Education, 72(11), 1296-1303.
    Fies, C., & Marshall, J. (2006). Classroom response systems: A review of the literature. Journal of Science Education and Technology, 15(1), 101-109.
    *FitzPatrick, K. A., Finn, K. E., & Campisi, J. (2011). Effect of personal response systems on student perception and academic performance in courses in a health sciences curriculum. Advances in Physiology Education, 35(3), 280-289.
    Frase, L. T. (1967). Learning from prose material: Length of passage, knowledge of results, and position of questions. Journal of Educational Psychology, 58(5), 266-272.
    Fritz, C. O., Morris, P. E., & Richler, J. J. (2012). Effect size estimates: Current use, calculations, and interpretation. Journal of Experimental Psychology: General, 141(1), 2-18.
    *Gebru, M. T., Phelps, A. J., & Wulfsberg, G. (2012). Effect of clickers versus online homework on students' long-term retention of general chemistry course material. Chemistry Education Research and Practice, 13(3), 325-329.
    Goldberg, B., & Spain, R. (2014). Creating the intelligent novice: supporting self-regulated learning and metacognition in educational technology. In R. A. Sottilare, A. C. Graesser, X. Hu, & B. S. Goldberg (Eds.), Design recommendations for intelligent tutoring systems - Volume 2: Instructional management (pp. 105-133). Orlando, FL: U.S. Army Research Laboratory.
    Graesser, A. C., McNamara, D. S., & VanLehn, K. (2005). Scaffolding deep comprehension strategies through Point&Query, AutoTutor, and iSTART. Educational Psychologist, 40(4), 225-234.
    *Gray, K., & Steer, D. N. (2012). Personal response systems and learning: It is the pedagogy that matters, not the technology. Journal of College Science Teaching, 41(5), 80-88.
    Hamaker, C. (1986). The effects of adjunct questions on prose learning. Review of Educational Research, 56(2), 212-242.
    Hamilton, R. J. (1985). A framework for the evaluation of the effectiveness of adjunct questions and objectives. Review of Educational Research, 55(1), 47-85.
    Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81-112.
    Hedges, L. V. (1981). Distribution theory for Glass's estimator of effect size and related estimators. Journal of Educational Statistics, 6(2), 107-128.
    Hedges, L. V. (1982). Estimation of effect size from a series of independent experiments. Psychological Bulletin, 92(2), 490-499.
    Hedges, L. V., & Olkin, I. (1985). Statistical methods for meta-analysis. Orlando, FL: Academic Press.
    Hedges, L. V., & Vevea, J. L. (1998). Fixed-and random-effects models in meta-analysis. Psychological Methods, 3(4), 486-504.
    Hoekstra, A. (2008). Vibrant student voices: Exploring effects of the use of clickers in large college courses. Learning, Media and Technology, 33(4), 329-341.
    James, M. C. (2006). The effect of grading incentive on student discourse in peer instruction. American Journal of Physics, 74(8), 689-691.
    James, M. C., & Willoughby, S. (2011). Listening to student conversations during clicker questions: What you have not heard might surprise you! American Journal of Physics, 79(1), 123-132.
    Järvelä, S., Kirschner, P., Panadero, E., Malmberg, J., Phielix, C., Jaspers, J., Koivuniemi, M., & Järvenoja, H. (2015). Enhancing socially shared regulation in collaborative learning groups: Designing for CSCL regulation tools. Educational Technology Research and Development, 63(1), 125-142.
    Kay, R. H., & LeSage, A. (2009). Examining the benefits and challenges of using audience response systems: A review of the literature. Computers & Education, 53(3), 819-827.
    Kirk, R. E. (1996). Practical significance: A concept whose time has come. Educational and Psychological Measurement, 56(5), 746-759.
    Kirschner, P. A., Kreijns, K., Phielix, C., & Fransen, J. (2015). Awareness of cognitive and social behaviour in a CSCL environment. Journal of Computer Assisted Learning, 31(1), 59-77.
    Kluger, A. N., & DeNisi, A. (1996). The effects of feedback interventions on performance: A historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychological Bulletin, 119(2), 254-284.
    *Knapp, F. A., & Desrochers, M. N. (2009). An experimental evaluation of the instructional effectiveness of a student response system: A comparison with constructed overt responding. International Journal of Teaching and Learning in Higher Education, 21(1), 36-46.
    Knight, J. K., Wise, S. B., & Southard, K. M. (2013). Understanding clicker discussions: Student reasoning and the impact of instructional cues. CBE-Life Sciences Education, 12(4), 645-654.
    Kreijns, K., Kirschner, P. A., & Vermeulen, M. (2013). Social aspects of CSCL environments: A research framework. Educational Psychologist, 48(4), 229-242.
    Kulik, C. L. C., & Kulik, J. A. (1991). Effectiveness of computer-based instruction: An updated analysis. Computers in Human Behavior, 7(1–2), 75-94.
    Kulik, J. A., Kulik, C. L. C., & Bangert-Drowns, R. L. (1985). Effectiveness of computer-based education in elementary schools. Computers in Human Behavior, 1(1), 59-74.
    Lantz, M. E. (2010). The use of ‘clickers’ in the classroom: Teaching innovation or merely an amusing novelty? Computers in Human Behavior, 26(4), 556-561.
    Lasry, N. (2008). Clickers or flashcards: Is there really a difference? The Physics Teacher, 46(4), 242-244.
    *Lim, K. H. (2011). Addressing the multiplication makes bigger and division makes smaller misconceptions via prediction and clickers. International Journal of Mathematical Education in Science and Technology, 42(8), 1081-1106.
    *Lin, Y. C., Liu, T. C., & Chu, C. C. (2011). Implementing clickers to assist learning in science lectures: The clicker-assisted conceptual change model. Australasian Journal of Educational Technology, 27(6), 979-996.
    *Liu, F. C., Gettig, J. P., & Fjortoft, N. (2010). Impact of a student response system on short- and long-term learning in a drug literature evaluation course. American Journal of Pharmaceutical Education, 74(1), Article 6.
    MacArthur, J. R., & Jones, L. L. (2008). A review of literature reports of clickers applicable to college chemistry classrooms. Chemistry Education Research and Practice, 9(3), 187-195.
    *Martyn, M. (2007). Clickers in the classroom: An active learning approach. EDUCAUSE Quarterly, 30(2), 71-74.
    Mathan, S. A., & Koedinger, K. R. (2005). Fostering the intelligent novice: Learning from errors with metacognitive tutoring. Educational Psychologist, 40(4), 257-265.
    *Mayer, R. E., Stull, A., DeLeeuw, K., Almeroth, K., Bimber, B., Chun, D., Bulger, M., Campbell, J., Knight, A., & Zhang, H. (2009). Clickers in college classrooms: Fostering learning with questioning methods in large lecture classes. Contemporary Educational Psychology, 34(1), 51-57.
    Mazur, E. (1997). Peer instruction: A user's manual. Upper Saddle River, NJ: Prentice Hall.
    *McCurry, M. K., & Hunter Revell, S. M. (2011). Evaluating the effectiveness of personal response system technology on millennial student learning. Journal of Nursing Education, 50(8), 471-475.
    *Miller, R. G., Ashar, B. H., & Getz, K. J. (2003). Evaluation of an audience response system for the continuing education of health professionals. Journal of Continuing Education in the Health Professions, 23(2), 109-115.
    Nelson, C., Hartling, L., Campbell, S., & Oswald, A. E. (2012). The effects of audience response systems on learning outcomes in health professions education. A BEME systematic review: BEME Guide No. 21. Medical Teacher, 34(6), e386-e405.
    Neuman, Y., & Schwarz, B. (1998). Is self-explanation while solving problems helpful? The case of analogical problem-solving. British Journal of Educational Psychology, 68(1), 15-24.
    *Patterson, B., Kilpatrick, J., & Woebkenberg, E. (2010). Evidence for teaching practice: The impact of clickers in a large classroom environment. Nurse Education Today, 30(7), 603-607.
    Perez, K. E., Strauss, E. A., Downey, N., Galbraith, A., Jeanne, R., & Cooper, S. (2010). Does displaying the class results affect student discussion during peer instruction? CBE-Life Sciences Education, 9(2), 133-140.
    Phielix, C., Prins, F. J., & Kirschner, P. A. (2010). Awareness of group performance in a CSCL-environment: Effects of peer feedback and reflection. Computers in Human Behavior, 26(2), 151-161.
    Phielix, C., Prins, F. J., Kirschner, P. A., Erkens, G., & Jaspers, J. (2011). Group awareness of social and cognitive performance in a CSCL environment: Effects of a peer feedback and reflection tool. Computers in Human Behavior, 27(3), 1087-1102.
    *Plant, J. D. (2007). Incorporating an audience response system into veterinary dermatology lectures: Effect on student knowledge retention and satisfaction. Journal of Veterinary Medical Education, 34(5), 674-677.
    *Pradhan, A., Sparano, D., & Ananth, C. V. (2005). The influence of an audience response system on knowledge retention: An application to resident education. American Journal of Obstetrics and Gynecology, 193(5), 1827-1830.
    *Radosevich, D. J., Salomon, R., Radosevich, D. M., & Kahn, P. (2008). Using student response systems to increase motivation, learning, and knowledge retention. Innovate: Journal of Online Education, 5(1), 7.
    Raudenbush, S. W. (2009). Analyzing effect sizes: Random-effects models. In H. Cooper, L. V. Hedges, & J. C. Valentine (Eds.), The handbook of research synthesis and meta-analysis (2nd ed., pp. 295-316). New York, NY: Russell Sage Foundation.
    Renkl, A. (1997). Learning from worked-out examples: A study on individual differences. Cognitive Science, 21(1), 1-29.
    Renkl, A., Stark, R., Gruber, H., & Mandl, H. (1998). Learning from worked-out examples: The effects of example variability and elicited self-explanations. Contemporary Educational Psychology, 23(1), 90-108.
    Rickards, J. P. (1979). Adjunct postquestions in text: A critical review of methods and processes. Review of Educational Research, 49(2), 181-196.
    Rittle-Johnson, B. (2006). Promoting transfer: Effects of self-explanation and direct instruction. Child Development, 77(1), 1-15.
    Roediger III, H. L., & Karpicke, J. D. (2006a). The power of testing memory: Basic research and implications for educational practice. Perspectives on Psychological Science, 1(3), 181-210.
    Roediger III, H. L., & Karpicke, J. D. (2006b). Test-enhanced learning: Taking memory tests improves long-term retention. Psychological Science, 17(3), 249-255.
    *Rubio, E. I., Bassignani, M. J., White, M. A., & Brant, W. E. (2008). Effect of an audience response system on resident learning and retention of lecture material. American Journal of Roentgenology, 190(6), W319-W322.
    Scammacca, N., Roberts, G., & Stuebing, K. K. (2014). Meta-analysis with complex research designs: Dealing with dependence from multiple measures and multiple group comparisons. Review of Educational Research, 84(3), 328-364.
    Shadish, W. R., & Haddock, C. K. (2009). Combining estimates of effect size. In H. Cooper, L. V. Hedges, & J. C. Valentine (Eds.), The handbook of research synthesis and meta-analysis (2nd ed., pp. 257-278). New York, NY: Russell Sage Foundation.
    *Shaffer, D. M., & Collura, M. J. (2009). Evaluating the effectiveness of a personal response system in the classroom. Teaching of Psychology, 36(4), 273-277.
    Shapiro, A., & Gordon, L. T. (2012). A controlled study of clicker-assisted memory enhancement in college classrooms. Applied Cognitive Psychology, 26(4), 635-643.
    Shute, V. J. (2008). Focus on formative feedback. Review of Educational Research, 78(1), 153-189.
    Simpson, V., & Oliver, M. (2007). Electronic voting systems for lectures then and now: A comparison of research and practice. Australasian Journal of Educational Technology, 23(2), 187-208.
    Surber, J. R., & Anderson, R. C. (1975). Delay-retention effect in natural classroom settings. Journal of Educational Psychology, 67(2), 170-173.
    Thurlings, M., Vermeulen, M., Bastiaens, T., & Stijnen, S. (2013). Understanding feedback: A learning theory perspective. Educational Research Review, 9, 1-15.
    *Tregonning, A. M., Doherty, D. A., Hornbuckle, J., & Dickinson, J. E. (2012). The audience response system and knowledge gain: A prospective study. Medical Teacher, 34(4), e269-e274.
    VanLehn, K., Jones, R. M., & Chi, M. T. H. (1992). A model of the self-explanation effect. Journal of the Learning Sciences, 2(1), 1-59.
    Wong, R. M. F., Lawson, M. J., & Keeves, J. (2002). The effects of self-explanation training on students' problem solving in high-school mathematics. Learning and Instruction, 12(2), 233-262.
    *Yourstone, S. A., Kraye, H. S., & Albaum, G. (2008). Classroom questioning with immediate electronic response: Do clickers improve learning? Decision Sciences Journal of Innovative Education, 6(1), 75-88.

    Chapter 3
    Beatty, I. D., Gerace, W. J., Leonard, W. J., & Dufresne, R. J. (2006). Designing effective questions for classroom response system teaching. American Journal of Physics, 74(1), 31-39.
    Bruff, D. (2009). Teaching with classroom response systems: Creating active learning environments. San Francisco, CA: Jossey-Bass.
    Caldwell, J. E. (2007). Clickers in the large classroom: Current research and best-practice tips. CBE-Life Sciences Education, 6(1), 9-20.
    Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.
    Deslauriers, L., Schelew, E., & Wieman, C. (2011). Improved learning in a large-enrollment physics class. Science, 332(6031), 862-864.
    Duncan, D. (2005). Clickers in the classroom: How to enhance science teaching using classroom response systems. San Francisco, CA: Pearson.
    Hestenes, D., Wells, M., & Swackhamer, G. (1992). Force concept inventory. The Physics Teacher, 30(3), 141-158.
    Hoekstra, A. (2008). Vibrant student voices: Exploring effects of the use of clickers in large college courses. Learning, Media and Technology, 33(4), 329-341.
    Mazur, E. (1997). Peer instruction: A user's manual. Upper Saddle River, NJ: Prentice Hall.
    Nielsen, K. L., Hansen-Nygård, G., & Stav, J. B. (2012). Investigating peer instruction: How the initial voting session affects students' experiences of group discussion. International Scholarly Research Notices, 2012.
    Perez, K. E., Strauss, E. A., Downey, N., Galbraith, A., Jeanne, R., & Cooper, S. (2010). Does displaying the class results affect student discussion during peer instruction? CBE-Life Sciences Education, 9(2), 133-140.
    Smith, M. K., Wood, W. B., Adams, W. K., Wieman, C., Knight, J. K., Guild, N., & Su, T. T. (2009). Why peer discussion improves student performance on in-class concept questions. Science, 323(5910), 122-124.
    Smith, M. K., Wood, W. B., Krauter, K., & Knight, J. K. (2011). Combining peer discussion with instructor explanation increases student learning from in-class concept questions. CBE-Life Sciences Education, 10(1), 55-63.

    Chapter 4
    Chi, M. T. H., Bassok, M., Lewis, M. W., Reimann, P., & Glaser, R. (1989). Self-explanations: How students study and use examples in learning to solve problems. Cognitive Science, 13(2), 145-182.
    Chi, M. T. H., De Leeuw, N., Chiu, M. H., & Lavancher, C. (1994). Eliciting self-explanations improves understanding. Cognitive Science, 18(3), 439-477.
    Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.

    Chapter 5
    Johnson, L., Adams Becker, S., Estrada, V., & Freeman, A. (2014). NMC horizon report: 2014 K-12 edition. Austin, TX: The New Media Consortium.

    Chapter 6
    Berland, L. K., & Hammer, D. (2012). Framing for scientific argumentation. Journal of Research in Science Teaching, 49(1), 68-94.
    Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.
    Cowen, R. (2013). The wheels come off Kepler. Nature, 497(7450), 417-418.
    Driver, R., Newton, P., & Osborne, J. (2000). Establishing the norms of scientific argumentation in classrooms. Science Education, 84(3), 287-312.
    Hedges, L. V., & Olkin, I. (1985). Statistical methods for meta-analysis. Orlando, FL: Academic Press.
    Jiménez-Aleixandre, M. P., Rodríguez, A. B., & Duschl, R. A. (2000). “Doing the lesson” or “doing science”: Argument in high school genetics. Science Education, 84(6), 757-792.
    Kolstø, S. D. (2000). Consensus projects: Teaching science for citizenship. International Journal of Science Education, 22(6), 645-664.
    Kortland, K. (1996). An STS case study about students' decision making on the waste issue. Science Education, 80(6), 673-689.
    Ministry of Education. (2001). The 1–9 grades science and life technology curriculum standards. Taipei, Taiwan: Ministry of Education.
    Ministry of Education. (2004). The 10–12 grades science and life technology curriculum standards. Taipei, Taiwan: Ministry of Education.
    Sadler, T. D. (2004). Informal reasoning regarding socioscientific issues: A critical review of research. Journal of Research in Science Teaching, 41(5), 513-536.
    Wu, Y. T., & Tsai, C. C. (2007). High school students’ informal reasoning on a socio-scientific issue: Qualitative and quantitative analyses. International Journal of Science Education, 29(9), 1163-1187.
