
Graduate student: Wung, Sheng-Hong (翁勝紘)
Title: Enhancing Offspring Generation in Large-scale Evolutionary Multi-objective Optimization Using Neural Networks (以類神經網路強化大規模多目標演化演算法之子代生成機制)
Advisor: Chiang, Tsung-Che (蔣宗哲)
Committee members: Chiang, Tsung-Che (蔣宗哲); Tsou, Ching-Shih (鄒慶士); Liaw, Rung-Tzuo (廖容佐)
Oral defense date: 2025/01/03
Degree: Master's
Department: Department of Computer Science and Information Engineering (資訊工程學系)
Year of publication: 2025
Graduation academic year: 113 (ROC calendar)
Language: Chinese
Number of pages: 51
Chinese keywords: large-scale multi-objective evolutionary algorithm, evolutionary algorithm, neural network
English keywords: Large-Scale, Multi-Objective, Evolutionary Algorithm, Neural Networks
DOI: http://doi.org/10.6345/NTNU202500374
Document type: Academic thesis
    Evolutionary algorithms have been applied to multi-objective optimization for many years, and the field offers a rich set of highly effective algorithms. However, as the dimensionality of the decision space grows, evolutionary algorithms suffer from slow convergence and struggle to solve large-scale multi-objective problems. Research on large-scale multi-objective evolutionary algorithms has therefore flourished in recent years, with researchers applying a variety of techniques to help evolutionary algorithms handle high-dimensional problems faster and better, improving convergence speed while maintaining the distribution of solutions. Neural networks are one such effective technique: through a learning mechanism they generate high-quality offspring, allowing the population to converge quickly and alleviating the slow-convergence problem. These methods converge rapidly at first, but once evolution reaches a certain stage the population can no longer keep supplying high-quality training data. To address this training-data problem, this thesis proposes the large-scale multi-objective evolutionary algorithm with neural-network-enhanced offspring generation (LSMOEA-NEO). It uses a sampling-based solution-generation strategy to build new data sets and aid the collection of training data, applies a model-splitting method that shrinks the model so it can be trained faster, and adds a usage-timing control that avoids generating solutions with a poorly trained model. With these mechanisms, LSMOEA-NEO achieves better results than existing state-of-the-art large-scale multi-objective evolutionary algorithms: in our experiments it wins on 6 of the 9 problems in the LSMOP benchmark.

    Evolutionary algorithms have been widely applied to multi-objective optimization problems, leading to the development of numerous effective and robust approaches. However, as the dimensionality of the decision space increases, these algorithms struggle with slow convergence, making it challenging to solve large-scale multi-objective optimization problems. In response, recent research has focused on enhancing their scalability, improving convergence speed, and maintaining solution diversity.
    Neural network-based techniques have emerged as a promising solution, leveraging learning mechanisms to generate high-quality offspring and accelerate convergence. Building on this approach, this thesis proposes the Large-Scale Multi-Objective Evolutionary Algorithm with Neural Network-Enhanced Offspring Generation (LSMOEA-NEO). While neural networks initially drive rapid progress, their effectiveness diminishes over time due to difficulties in continuously acquiring high-quality training data. To address this limitation, LSMOEA-NEO incorporates a sampling-based offspring generation strategy to improve training data collection, a model decomposition strategy to reduce model size and accelerate training, and a usage control mechanism to prevent poorly trained models from generating low-quality offspring. Experimental results on the LSMOP benchmark show that LSMOEA-NEO outperforms existing large-scale multi-objective evolutionary algorithms on 6 out of 9 test problems.
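    The learning-driven offspring generation described above can be sketched in a minimal, self-contained form. This is an illustration, not the thesis implementation: it substitutes a toy sphere function for the LSMOP benchmark, trains a one-hidden-layer perceptron to map dominated solutions toward better ones (in the spirit of ALMOEA-style learnable offspring generation), and uses a simple training-loss threshold as a stand-in for the usage-timing control. All names and constants here (`TinyMLP`, the `loss < 1.0` threshold, population sizes) are illustrative assumptions, and the model-splitting mechanism is not shown.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def sphere(x):
        """Toy minimization objective standing in for one LSMOP objective."""
        return np.sum(x ** 2, axis=-1)

    class TinyMLP:
        """One-hidden-layer perceptron trained by gradient descent on MSE."""
        def __init__(self, dim, hidden=16, lr=0.01):
            self.W1 = rng.normal(0.0, 0.1, (dim, hidden)); self.b1 = np.zeros(hidden)
            self.W2 = rng.normal(0.0, 0.1, (hidden, dim)); self.b2 = np.zeros(dim)
            self.lr = lr

        def forward(self, X):
            self.H = np.tanh(X @ self.W1 + self.b1)
            return self.H @ self.W2 + self.b2

        def fit(self, X, Y, epochs=200):
            for _ in range(epochs):
                P = self.forward(X)
                G = 2.0 * (P - Y) / len(X)                  # dMSE/dP
                GH = (G @ self.W2.T) * (1.0 - self.H ** 2)  # backprop through tanh
                self.W2 -= self.lr * self.H.T @ G; self.b2 -= self.lr * G.sum(0)
                self.W1 -= self.lr * X.T @ GH;    self.b1 -= self.lr * GH.sum(0)
            return float(np.mean((self.forward(X) - Y) ** 2))  # final training loss

    dim, pop_size = 30, 40
    pop = rng.uniform(-5.0, 5.0, (pop_size, dim))
    initial_best = sphere(pop).min()

    for gen in range(20):
        order = np.argsort(sphere(pop))
        good, poor = pop[order[:pop_size // 2]], pop[order[pop_size // 2:]]

        # Training pairs: map each poor solution to its nearest good solution.
        idx = np.argmin(((poor[:, None, :] - good[None, :, :]) ** 2).sum(-1), axis=1)
        model = TinyMLP(dim)
        loss = model.fit(poor, good[idx])

        # Usage control (illustrative threshold): generate offspring with the
        # network only when its training loss is small; otherwise fall back to
        # Gaussian perturbation of the paired good solutions.
        if loss < 1.0:
            offspring = model.forward(poor) + rng.normal(0.0, 0.1, poor.shape)
        else:
            offspring = good[idx] + rng.normal(0.0, 0.5, poor.shape)

        # Elitist environmental selection over parents + offspring.
        merged = np.vstack([pop, offspring])
        pop = merged[np.argsort(sphere(merged))[:pop_size]]

    best = sphere(pop).min()
    ```

    Because selection is elitist, the fallback branch keeps the search progressing even when the model is poorly trained, which mirrors the motivation for the usage-timing control: a badly trained network should never be the only source of offspring.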

    Acknowledgments
    Chinese Abstract
    Abstract
    Table of Contents
    List of Tables
    List of Figures
    Chapter 1 Introduction
      1.1 Background and Motivation
      1.2 Problem Definition
      1.3 Thesis Organization and Contributions
    Chapter 2 Literature Review
      2.1 Large-Scale MOEAs Based on Decision Variable Grouping
      2.2 Large-Scale MOEAs Based on Problem Transformation
      2.3 Large-Scale MOEAs Based on Novel Search Strategies
    Chapter 3 The LSMOEA-NEO Algorithm
      3.1 Mechanisms of ALMOEA
      3.2 Improved Multilayer Perceptron and Training Data
      3.3 Splitting the Multilayer Perceptron
      3.4 Usage Timing Control
      3.5 Overall Algorithm Flow After Improvement
    Chapter 4 Experimental Results and Analysis
      4.1 Experimental Settings
      4.2 Comparison of LSMOEA-NEO with Other Large-Scale MOEAs
      4.3 Effectiveness of the Improved Mechanisms
      4.4 Parameter Setting Experiments
      4.5 Applying the Sampling Method to Model Inputs
    Chapter 5 Conclusions and Future Work
    References

    S. Liu, Q. Lin, J. Li and K. C. Tan, “A survey on learnable evolutionary algorithms for scalable multiobjective optimization,” IEEE Transactions on Evolutionary Computation, vol. 27, no. 6, pp. 1941–1961, Dec. 2023.
    L. M. Antonio and C. A. C. Coello, “Use of cooperative coevolution for solving large scale multiobjective optimization problems,” in Proceedings of the 2013 IEEE Congress on Evolutionary Computation, 2013, pp. 2758–2765.
    X. Zhang, Y. Tian, R. Cheng, and Y. Jin, “A decision variable clustering-based evolutionary algorithm for large-scale many-objective optimization,” IEEE Transactions on Evolutionary Computation, vol. 22, no. 1, pp. 97–112, Feb. 2018.
    X. Ma, F. Liu, Y. Qi, X. Wang, L. Li, L. Jiao, M. Yin, and M. Gong, “A multiobjective evolutionary algorithm based on decision variable analyses for multiobjective optimization problems with large-scale variables,” IEEE Transactions on Evolutionary Computation, vol. 20, no. 2, pp. 275–298, April 2016.
    H. Zille, H. Ishibuchi, S. Mostaghim, and Y. Nojima, “A framework for large-scale multiobjective optimization based on problem transformation,” IEEE Transactions on Evolutionary Computation, vol. 22, no. 2, pp. 260–275, April 2018.
    R. Liu, R. Ren, J. Liu, and J. Liu, “A clustering and dimensionality reduction based evolutionary algorithm for large-scale multi-objective problems,” Applied Soft Computing, vol. 89, 106120, 2020.
    Y. Tian, C. Lu, X. Zhang, K. C. Tan, and Y. Jin, “Solving large-scale multiobjective optimization problems with sparse optimal solutions via unsupervised neural networks,” IEEE Transactions on Cybernetics, vol. 51, no. 6, pp. 3115–3128, 2021.
    Y. Tian, X. Zhang, C. Wang, and Y. Jin, “An evolutionary algorithm for large-scale sparse multiobjective optimization problems,” IEEE Transactions on Evolutionary Computation, vol. 24, no. 2, pp. 380–393, April 2020.
    X. Liu, Y. Du, M. Jiang, and X. Zeng, “Multiobjective particle swarm optimization based on network embedding for complex network community detection,” IEEE Transactions on Computational Social Systems, vol. 7, no. 2, pp. 437–449, April 2020.
    R. Cheng, Y. Jin, K. Narukawa, and B. Sendhoff, “A multiobjective evolutionary algorithm using Gaussian process-based inverse modeling,” IEEE Transactions on Evolutionary Computation, vol. 19, no. 6, pp. 838–856, Dec. 2015.
    Y. Tian, X. Zheng, X. Zhang, and Y. Jin, “Efficient large-scale multiobjective optimization based on a competitive swarm optimizer,” IEEE Transactions on Cybernetics, vol. 50, no. 8, pp. 3696–3708, Aug. 2020.
    R. Cheng and Y. Jin, “A competitive swarm optimizer for large scale optimization,” IEEE Transactions on Cybernetics, vol. 45, no. 2, pp. 191–204, Feb. 2015.
    B. Li, Y. Zhang, P. Yang, X. Yao, and A. Zhou, “A two-population algorithm for large-scale multi-objective optimization based on fitness-aware operator and adaptive environmental selection,” IEEE Transactions on Evolutionary Computation (early access).
    C. He, R. Cheng, and D. Yazdani, “Adaptive offspring generation for evolutionary large-scale multiobjective optimization,” IEEE Transactions on Systems, Man, and Cybernetics: Systems, vol. 52, no. 2, pp. 786–798, Feb. 2022.
    L. Li, C. He, R. Cheng, H. Li, L. Pan, and Y. Jin, “A fast sampling based evolutionary algorithm for million-dimensional multiobjective optimization,” Swarm and Evolutionary Computation, vol. 75, 101181, Dec. 2022.
    C. He, L. Li, Y. Tian, X. Zhang, R. Cheng, Y. Jin, and X. Yao, “Accelerating large-scale multiobjective optimization via problem reformulation,” IEEE Transactions on Evolutionary Computation, vol. 23, no. 6, pp. 949–961, Dec. 2019.
    S. Qin, C. Sun, Y. Jin, Y. Tan, and J. Fieldsend, “Large-scale evolutionary multiobjective optimization assisted by directed sampling,” IEEE Transactions on Evolutionary Computation, vol. 25, no. 4, pp. 724–738, Aug. 2021.
    Y. Wu, N. Yang, L. Chen, Y. Tian, and Z. Tang, “Directed quick search guided evolutionary framework for large-scale multi-objective optimization problems,” Expert Systems with Applications, vol. 239, 122370, April. 2024.
    S. Bandaru and K. Deb, “Automated discovery of vital knowledge from Pareto-optimal solutions: First results from engineering design,” in Proceedings of the IEEE Congress on Evolutionary Computation, 2010, pp. 1–8.
    K. Deb and A. Srinivasan, “Innovization: Innovating design principles through optimization,” in Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation, 2006, pp. 1629–1636.
    S. Mittal, D. K. Saxena, K. Deb, and E. D. Goodman, “A learning-based innovized progress operator for faster convergence in evolutionary multiobjective optimization,” ACM Transactions on Evolutionary Learning and Optimization (TELO), vol. 2, no. 1, pp. 1–29, Nov. 2021.
    R. Cheng, Y. Jin, M. Olhofer, and B. Sendhoff, “A reference vector guided evolutionary algorithm for many-objective optimization,” IEEE Transactions on Evolutionary Computation, vol. 20, no. 5, pp. 773–791, Oct. 2016.
    S. Mittal, D. K. Saxena, K. Deb, and E. D. Goodman, “Enhanced innovized progress operator for evolutionary multi- and many-objective optimization,” IEEE Transactions on Evolutionary Computation, vol. 26, no. 5, pp. 961–975, Oct. 2022.
    L. Breiman, “Random forests,” Machine Learning, vol. 45, no. 1, pp. 5–32, 2001.
    G. Guo, H. Wang, D. Bell, Y. Bi, and K. Greer, “KNN model-based approach in classification,” in On The Move to Meaningful Internet Systems 2003: CoopIS, DOA, and ODBASE (OTM Confederated International Conferences), Catania, Sicily, Italy, Nov. 2003, Springer, Berlin, Heidelberg, 2003.
    S. Mittal, D. K. Saxena, K. Deb, and E. D. Goodman, “A unified innovized progress operator for performance enhancement in evolutionary multi- and many-objective optimization,” IEEE Transactions on Evolutionary Computation, vol. 28, no. 6, pp. 1605–1619, Dec. 2024.
    D. Bhasin, S. Swami, S. Sharma, S. Sah, D. K. Saxena, and K. Deb, “Investigating innovized progress operators with different machine learning methods,” in M. Emmerich et al. (Eds.), Evolutionary Multi-Criterion Optimization, EMO 2023, Lecture Notes in Computer Science, vol. 13970, Springer, Cham, 2023.
    S. Liu, J. Li, Q. Lin, Y. Tian, and K. C. Tan, “Learning to accelerate evolutionary search for large-scale multiobjective optimization,” IEEE Transactions on Evolutionary Computation, vol. 27, no. 1, pp. 67–81, Feb. 2023.
    L. Li, Y. Li, Q. Lin, S. Liu, J. Zhou, Z. Ming, and C. A. C. Coello, “Neural net-enhanced competitive swarm optimizer for large-scale multiobjective optimization,” IEEE Transactions on Cybernetics, vol. 54, no. 6, pp. 3502–3515, June 2024.
    Q. Lin, J. Li, S. Liu, L. Ma, J. Li, and J. Chen, “An adaptive two-stage evolutionary algorithm for large-scale continuous multi-objective optimization,” Swarm and Evolutionary Computation, vol. 77, 101235, Mar. 2023.
    Z. Cui, Y. Wu, T. Zhao, W. Zhang, and J. Chen, “A two-stage accelerated search strategy for large-scale multi-objective evolutionary algorithm,” Information Sciences, vol. 686, 121347, Jan. 2025.
    Z. Zhan, J. Li, S. Kwong, and J. Zhang, “Learning-aided evolution for optimization,” IEEE Transactions on Evolutionary Computation, vol. 27, no. 6, pp. 1794–1808, Dec. 2023.
    C. He, S. Huang, R. Cheng, K. C. Tan, and Y. Jin, “Evolutionary multiobjective optimization driven by generative adversarial networks (GANs),” IEEE Transactions on Cybernetics, vol. 51, no. 6, pp. 3129–3142, June 2021.
    Z. Wang, H. Hong, K. Ye, G.-E. Zhang, M. Jiang, and K. C. Tan, “Manifold interpolation for large-scale multiobjective optimization via generative adversarial networks,” IEEE Transactions on Neural Networks and Learning Systems, vol. 34, no. 8, pp. 4631–4645, Aug. 2023.
    I. J. Goodfellow et al., “Generative adversarial nets,” in Advances in Neural Information Processing Systems, 2014, pp. 2672–2680.
    E. Zitzler, M. Laumanns, and L. Thiele, “SPEA2: Improving the strength Pareto evolutionary algorithm for multiobjective optimization,” in Proceedings of Evolutionary Methods for Design, Optimization and Control with Applications to Industrial Problems (EUROGEN), Athens, Greece, Sep. 2001.
    R. Cheng, Y. Jin, M. Olhofer, and B. Sendhoff, “Test problems for large-scale multiobjective and many-objective optimization,” IEEE Transactions on Cybernetics, vol. 47, no. 12, pp. 4108–4121, Dec. 2017.
    K. Deb and H. Jain, “An evolutionary many-objective optimization algorithm using reference-point-based nondominated sorting approach, part I: solving problems with box constraints,” IEEE Transactions on Evolutionary Computation, vol. 18, no. 4, pp. 577–601, Aug. 2014.
    H. Jain and K. Deb, “An evolutionary many-objective optimization algorithm using reference-point based nondominated sorting approach, part II: handling constraints and extending to an adaptive approach,” IEEE Transactions on Evolutionary Computation, vol. 18, no. 4, pp. 602–622, 2014.
    Y. Tian, R. Cheng, X. Zhang, and Y. Jin, “PlatEMO: A MATLAB platform for evolutionary multi-objective optimization [educational forum],” IEEE Computational Intelligence Magazine, vol. 12, no. 4, pp. 73–87, Nov. 2017.
    https://github.com/songbai-liu/ALMOEA
    https://github.com/ilog-ecnu/LSTPA
