BEAM: Bi-level Memory-adaptive Algorithmic Evolution for LLM-Powered Heuristic Design
1️⃣ One-Sentence Summary
This paper proposes BEAM, a new framework for intelligent algorithm design. Through bi-level optimization and an adaptive memory mechanism, it enables large language models to automatically design high-performing, complex optimization algorithms more efficiently, significantly outperforming existing methods on several classic problems.
The Large Language Model-based Hyper-Heuristic (LHH) has recently emerged as an efficient approach to automatic heuristic design. However, most existing LHHs perform well only when optimizing a single function within a pre-defined solver. Their single-layer evolution is not effective enough to produce a competent, complete solver. While some variants incorporate hyperparameter tuning or attempt to generate complex code through iterative local modifications, they still lack high-level algorithmic modeling, leading to limited exploration efficiency. To address this, we reformulate heuristic design as a bi-level optimization problem and propose BEAM (Bi-level Memory-adaptive Algorithmic Evolution). BEAM's exterior layer evolves high-level algorithmic structures with function placeholders through a genetic algorithm (GA), while the interior layer realizes these placeholders via Monte Carlo Tree Search (MCTS). We further introduce an Adaptive Memory module to facilitate complex code generation. To support the evaluation of complex code generation, we point out the limitations of starting LHHs from scratch or from code templates and introduce a Knowledge Augmentation (KA) Pipeline. Experimental results on several optimization problems demonstrate that BEAM significantly outperforms existing LHHs, notably reducing the optimality gap by 37.84% on aggregate in CVRP hybrid algorithm design. BEAM also designs a heuristic that outperforms the state-of-the-art Maximum Independent Set (MIS) solver KaMIS.
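To make the bi-level split concrete, here is a heavily simplified, hypothetical sketch (all names and the toy objective are invented for illustration, not the authors' code): the exterior layer evolves a high-level structure, an ordered list of function placeholders, while the interior layer fills each placeholder with a concrete implementation. Exhaustive enumeration over a tiny candidate pool stands in for MCTS, and the candidates stand in for LLM-generated code snippets.

```python
import random
from itertools import product

random.seed(0)  # make the GA's mutation step reproducible

# Candidate realizations per placeholder role: stand-ins for the
# LLM-generated code snippets the real framework would search over.
CANDIDATES = {
    "init":    [sorted, list],
    "improve": [lambda xs: xs[::-1], lambda xs: sorted(xs, reverse=True)],
}

def evaluate(solver, data):
    """Score a fully realized solver pipeline; lower is better.
    Toy objective: number of adjacent inversions in the output."""
    out = list(data)
    for fn in solver:
        out = list(fn(out))
    return sum(1 for a, b in zip(out, out[1:]) if a > b)

def interior_search(structure, data):
    """Interior layer: realize every placeholder in the structure.
    Brute-force enumeration stands in for the paper's MCTS."""
    best, best_score = None, float("inf")
    for combo in product(*(CANDIDATES[slot] for slot in structure)):
        score = evaluate(combo, data)
        if score < best_score:
            best, best_score = list(combo), score
    return best, best_score

def exterior_ga(data, generations=5, pop=4):
    """Exterior layer: a tiny GA over placeholder orderings,
    scoring each structure by its best interior realization."""
    population = [["init", "improve"], ["improve", "init"]]
    best_struct, best_score = None, float("inf")
    for _ in range(generations):
        scored = []
        for struct in population:
            _, score = interior_search(struct, data)
            scored.append((score, struct))
            if score < best_score:
                best_struct, best_score = struct, score
        scored.sort(key=lambda t: t[0])
        survivors = [s for _, s in scored[: pop // 2]]
        # "Mutation": reshuffle a survivor's placeholder ordering.
        population = survivors + [random.sample(s, len(s)) for s in survivors]
    return best_struct, best_score

best, score = exterior_ga([3, 1, 2, 5, 4])
print(best, score)
```

The key design point this sketch mirrors is that the outer fitness of a structure is defined by the best solver its inner search can realize, which is what makes the formulation bi-level rather than a single flat search over code.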
Source: arXiv: 2604.12898