| Field | Value |
|---|---|
| Graduate Student | 許棣徭 Hsu, Ti-Yao |
| Thesis Title | 高中工程設計專題學習歷程檔案評量量表之發展 The Development of Portfolio Assessment Scale for Senior-High-School Students’ Engineering Design Project |
| Advisor | 林坤誼 Lin, Kuen-Yi |
| Oral Defense Committee | 游光昭 Yu, Kuang-Chao; 張美珍 Chang, Mei-Chen; 林坤誼 Lin, Kuen-Yi |
| Oral Defense Date | 2024/01/19 |
| Degree | Master (碩士) |
| Department | 科技應用與人力資源發展學系 Department of Technology Application and Human Resource Development |
| Year of Publication | 2024 |
| Graduation Academic Year | 112 |
| Language | Chinese |
| Number of Pages | 120 |
| Chinese Keywords | 工程設計專題、學習歷程檔案、評量標準、評量指標、表現標準、德懷術 |
| English Keywords | engineering design project, portfolio, assessment criteria, assessment indicators, performance criteria, Delphi technique |
| Research Method | Delphi technique |
| DOI URL | http://doi.org/10.6345/NTNU202400264 |
| Thesis Type | Academic thesis |
The Curriculum Guidelines of 12-Year Basic Education clearly state that the senior high school stage should emphasize the engineering design process, and they advocate using portfolio assessment to record students' learning; however, no consistent assessment criteria are provided. This study therefore aims to establish assessment criteria for senior high school engineering design project portfolios, to serve as a basis of assessment for senior high school Living Technology teachers. The study mainly adopts the Delphi technique, inviting 12 experts in technology and engineering, including 4 in-service Living Technology teachers, 4 university professors in the field of technology and engineering, and 4 authors of Living Technology textbooks. After two rounds of questionnaires, the mean (M), standard deviation (SD), quartile deviation (QD), and median (Md) of the responses are to be analyzed to examine whether the experts reach consensus on the assessment indicators. Finally, performance criteria corresponding to the indicators are formulated, and an expert consultation meeting is convened to review the content, thereby completing the portfolio assessment scale for senior high school engineering design projects. The expected benefits of this study are twofold: (1) to provide technology teachers with an assessment tool that addresses the problems of inconsistent assessment standards and insufficient time; (2) to make the production of engineering design project portfolios more systematic, promoting student reflection and improving learning outcomes.
The "Curriculum Guidelines of 12-Year Basic Education" clearly points out that the senior high school should focus on the process of engineering design, and also encourage using the portfolio to assess and record students' learning process, but no consistent assessment criteria are provided. Therefore, the purpose of this research is to formulate the assessment indicators and performance criteria for the senior high school students’ engineering design project so that senior high school living technology teachers may use it as the basis of assessment. This study invited 12 experts in the field of technology and engineering, including 4 current living technology teachers , 4 university professors in the field of technology and engineering and 4 authors of living technology textbooks, through Delphi technique, to compile expert opinions, analyze the Mean (M), Standard Deviation (SD), Quartile Deviation (QD), and Median (Md) from the two rounds of questionnaires, test the suitability of each indicator, and construct a complete portfolio assessment scale for senior high school students’ engineering design project. The expected benefits of this research are the following two points: (1) provide an assessment tool for living technology teachers to solve the problems of inconsistent assessment standards and lack of time; (2) make the production of portfolio for engineering design project more systematic and promote students reflect and increase learning outcomes.
吳清山、林天佑(2001)。德懷術。教育研究月刊,92,127。
范斯淳(2020)。高中生工程設計關鍵能力指標與學習進程之建構、評估與教學研究。科技部補助專題研究計畫報告(MOST 107-2511-H-017-004-MY2)，未出版。
范斯淳、游光昭(2016)。科技教育融入 STEM 課程的核心價值與實踐。教育科學研究期刊,61(2),153-183。https://doi.org/10.6209/JORIES.2016.61(2).06
張美玉(2000)。歷程檔案評量的理念與實施。科學教育月刊,231,58-63。https://doi.org/10.6216/SEM.200006_(231).0015
陳玫良(2002)。評量規準(Rubrics)在生活科技教學評量上之運用。生活科技教育,35(1),2-9。https://doi.org/10.6232/LTE.2002.35(1).2
陳璽宇(2020)。科技教育的測驗與評量之探討。科技與人力教育季刊,7(1),26-47。https://doi.org/10.6587/JTHRE.202009_7(1).0002
游光昭、洪國勳(2003)。網路化學習歷程檔案與科技的學習。生活科技教育月刊,36(5),55-64。https://doi.org/10.6232/LTE.2003.36(5).7
許宜婷(2014)。科技教育的教學評量---以 NAE 及 NRC 評量標準之多元評量為例。科技與人力教育季刊,1(1),55-69。https://doi.org/10.6587/JTHRE.2014.1(1).4
教育部(2018)。十二年國民基本教育課程綱要國民中小學暨普通型高級中等學校科技領域。臺北市:教育部。
鄭雅云、張奕華(2021)。大學入學制度的變革─ 學習歷程檔案之評析。臺灣教育評論月刊,10(4),24-30。
劉協成(2006)。德懷術之理論與實務初探。教師之友, 47(4),91-99。https://doi.org/10.7053/TF.200610.0091
Al-araibi, A. A. M., Mahrin, M. N. R. B., & Yusoff, R. C. M. (2019). Technological aspect factors of E-learning readiness in higher education institutions: Delphi technique. Education and Information Technologies, 24, 567-590. https://doi.org/10.1007/s10639-018-9780-9
Andrade, H., & Du, Y. (2005). Student perspectives on rubric-referenced assessment. Practical Assessment, Research, and Evaluation, 10(3), 1-11.
Arter, J., & McTighe, J. (2001). Scoring rubrics in the classroom: Using performance criteria for assessing and improving student performance. Corwin Press.
Asunda, P. A., & Hill, R. B. (2007). Critical features of engineering design in technology education. Journal of Industrial Teacher Education, 44(1), 25-48.
Barak, M. (2012). From 'doing' to 'doing with learning': Reflection on an effort to promote self-regulated learning in technological projects in high school. European Journal of Engineering Education, 37(1), 105-116. https://doi.org/10.1080/03043797.2012.658759
Barrett, H. (1997). Collaborative planning for electronic portfolios: Asking strategic questions. World Wide Web. http://transition.alaska.edu/www/portfolios/planning.html
Brookhart, S. M. (1999). The art and science of classroom assessment: The missing part of pedagogy. Washington, DC: The George Washington University Press. https://doi.org/10.5860/CHOICE.38-0425
Cole, D. J., Ryan, C. W., Kick, F., & Mathies, B. K. (2000). Portfolios across the curriculum and beyond. Corwin Press.
Dalkey, N. C. (1969). An experimental study of group opinion. Futures, 1(5), 408-426. https://doi.org/10.1016/S0016-3287(69)80025-X
Gustafson, D. H., Delbecq, A. L., & Van de Ven, A. H. (1986). Group techniques for program planning: A guide to nominal group and Delphi processes. Group & Organization Studies, 1(2), 256–256. https://doi.org/10.1177/002188637601200414
Eide, A., Jenison, R., Northup, L., & Mashaw, L. (2001). Introduction to engineering design and problem solving. McGraw-Hill Science/Engineering/Math.
Gattie, D. K., & Wicklein, R. C. (2007). Curricular value and instructional needs for infusing engineering design into K-12 technology education. Journal of Technology Education, 19(1), 6. https://doi.org/10.21061/jte.v19i1.a.1
Goldberg, G. (2011). Engineering Design Process Portfolio Scoring Rubric (EDPPSR): Scoring pilot final report. Unpublished report, University of Maryland.
Goodman, C. M. (1987). The Delphi technique: a critique. Journal of Advanced Nursing, 12(6), 729-734. https://doi.org/10.1111/j.1365-2648.1987.tb01376.x
Goodrich, H. (1997). Understanding rubrics: The dictionary may define "rubric," but these models provide more clarity. Educational Leadership, 54(4), 14-17.
Hasson, F., Keeney, S., & McKenna, H. (2000). Research guidelines for the Delphi survey technique. Journal of Advanced Nursing, 32(4), 1008-1015. https://doi.org/10.1046/j.1365-2648.2000.01567.x
Hailey, C., Erekson, T., Becker, K., & Thomas, M. (2005). National center for engineering and technology education. The Technology Teacher, 64(5), 23. https://doi.org/10.18260/1-2--15293
Holden, M. C., & Wedman, J. F. (1993). Future issues of computer-mediated communication: The results of a Delphi study. Educational Technology Research and Development, 41(4), 5-24. https://doi.org/10.1007/BF02297509
Dugger, W. (2000). Standards for technological literacy: Content for the study of technology. Technology Teacher, 59(5), 8-13.
Jaeger, M., & Adair, D. (2015). Using an evidential reasoning approach for portfolio assessments in a project-based learning engineering design course. European Journal of Engineering Education, 40(6), 638-652. https://doi.org/10.1080/03043797.2014.1001817
Katehi, L., Pearson, G., & Feder, M. (2009). The status and nature of K-12 engineering education in the United States. The Bridge, 39(3), 5-10.
Kelley, T. R. (2011). Engineer's notebook-A design assessment tool. Technology and Engineering Teacher, 70(7), 30-35.
Kelley, T. R. (2014). Construction of an engineer's notebook. Technology and Engineering Teacher, 73(5), 26-32.
Lewis, T. (2005). Coming to terms with engineering design as content. Journal of Technology Education, 16(2), 37-54. https://doi.org/10.21061/jte.v16i2.a.3
Mativo, J., & Wicklein, R. (2011). Learning effects of design strategies on high school students. Journal of STEM Teacher Education, 48(3), 66-92. https://doi.org/10.30707/JSTE48.3Mativo
McGourty, J., Sebastian, C., & Swart, W. (1998). Developing a comprehensive assessment program for engineering education. Journal of Engineering Education, 87(4), 355-361. https://doi.org/10.1002/j.2168-9830.1998.tb00365.x
Merrill, C., Custer, R. L., Daugherty, J., Westrick, M., & Zeng, Y. (2009). Delivering core engineering concepts to secondary level students. Journal of Technology Education, 20(1), 48-64. https://doi.org/10.21061/jte.v20i1.a.4
Moore, R., Alemdar, M., Lingle, J. A., Newton, S. H., Rosen, J. H., & Usselman, M. (2016, June). The engineering design log: A digital design journal facilitating learning and assessment (RTP) [Conference paper]. 2016 ASEE Annual Conference & Exposition, New Orleans, LA. https://doi.org/10.18260/p.26153
Moskal, B. M. (2000). Scoring rubrics: What, when and how? Practical Assessment, Research, and Evaluation, 7(1), 3. https://doi.org/10.7275/a5vq-7q66
National Research Council [NRC]. (2009). Engineering in K-12 education: Understanding the status and improving the prospects. The National Academies Press. https://doi.org/10.1002/inst.20101338
NGSS Lead States. (2013). Next generation science standards: For states, by states. National Academies Press.
National Science Foundation. (2017, August 31). Engineering Design Process Portfolio Scoring Rubric (EDPPSR). National Science Foundation. https://www.nsf.gov/awardsearch/showAward?AWD_ID=1118755&HistoricalAwards=false
Office for Standards in Education. (2000). OfStEd subject reports: Secondary design and technology, 1999-2000. The Stationery Office.
Paulson, F. L., Paulson, P. R., & Meyer, C. A. (1991). What makes a portfolio a portfolio? Educational Leadership, 48(5), 60-63.
Palmer, S., Holt, D., Hall, W., & Ferguson, C. (2011). An evaluation of an online student portfolio for the development of engineering graduate attributes. Computer Applications in Engineering Education, 19(3), 447-456. https://doi.org/10.1002/cae.20324
Petrina, S. (2007). Advanced teaching methods for the technology classroom. IGI Global. https://doi.org/10.4018/978-1-59904-337-1
Popham, W. J. (1997). What's Wrong--and What's Right--with Rubrics. Educational Leadership, 55(2), 72-75.
Reddy, Y. M. (2007). Effect of Rubrics on Enhancement of Student Learning. Educate, 7(1), 3-17.
Reddy, Y. M., & Andrade, H. (2010). A review of rubric use in higher education. Assessment & Evaluation in Higher Education, 35(4), 435-448. https://doi.org/10.1080/02602930902862859
Rogers, G., & Williams, J. (1999). Asynchronous assessment: Using electronic portfolios to assess student outcomes. In 1999 Annual Conference (pp. 4-101). https://doi.org/10.1109/FIE.1998.736818
Rojewski, J. W., & Meers, G. D. (1991). Directions for future research in vocational special needs education. University of Illinois.
Sadler, D. R. (2009). Indeterminacy in the use of preset criteria for assessment and grading. Assessment & Evaluation in Higher Education, 34(2), 159-179. https://doi.org/10.1080/02602930801956059
Tigelaar, D. E., Dolmans, D. H., Wolfhagen, I. H., & van der Vleuten, C. P. (2005). Quality issues in judging portfolios: Implications for organizing teaching portfolio assessment procedures. Studies in Higher Education, 30(5), 595-610. https://doi.org/10.1080/03075070500249302
Williams, J. M. (2002). The engineering portfolio: Communication, reflection, and student learning outcomes assessment. International Journal of Engineering Education, 18(2), 199-207.
Wright, V. H., Stallworth, B. J., & Ray, B. (2002). Challenges of electronic portfolios: Student perceptions and experiences. Journal of Technology and Teacher Education, 10(1), 49-61.