BLADE

*(badges: PyPI version · Maintenance · Python 3.10+ · Codecov)*

BLADE (Benchmark suite for LLM-driven Automated Design and Evolution of algorithms) is a Python framework for benchmarking the LLM-assisted design and evolution of algorithms. It provides a standardized benchmark suite for evaluating automated algorithm design (AAD) methods, particularly those that use large language models (LLMs) to generate metaheuristics. BLADE focuses on continuous black-box optimization and integrates a diverse set of problems and methods, facilitating fair and comprehensive benchmarking.

🔥 News

  • 2025.03 ✨✨ iohblade v0.0.1 released!

Features

  • Comprehensive Benchmark Suite: Covers various classes of black-box optimization problems.

  • LLM-Driven Algorithm Design: Supports algorithm evolution and design using large language models.

  • Built-In Baselines: Includes state-of-the-art human-designed metaheuristics as well as LLM-driven AAD algorithms for comparison.

  • Automatic Logging & Visualization: Integrated with IOHprofiler for performance tracking.

Included Benchmark Function Sets

BLADE incorporates several benchmark function sets to provide a comprehensive evaluation environment:

| Name | Short Description | Number of Functions | Multiple Instances |
|------|-------------------|---------------------|--------------------|
| BBOB (Black-Box Optimization Benchmarking) | A suite of 24 noiseless functions designed for benchmarking continuous optimization algorithms. Reference | 24 | Yes |
| SBOX-COST | A set of 24 boundary-constrained functions focusing on strict box-constraint optimization scenarios. Reference | 24 | Yes |
| MA-BBOB (Many-Affine BBOB) | An extension of the BBOB suite, generating functions through affine combinations and shifts. Reference | Generator-based | Yes |
| GECCO MA-BBOB Competition Instances | A collection of 1,000 pre-defined instances from the GECCO MA-BBOB competition, evaluating algorithm performance on diverse affine-combined functions. Reference | 1,000 | Yes |

In addition, several real-world applications are included, such as a number of photonics problems.
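To give a feel for how such benchmark functions are consumed, the loop below evaluates a trivial random search on a stand-alone sphere function. This is a minimal sketch: in BLADE itself the BBOB-style problems come from the benchmark suite rather than being hand-coded like this.

```python
import random

def sphere(x):
    """Simplest BBOB-style test function: f(x) = sum(x_i^2), optimum at 0."""
    return sum(xi * xi for xi in x)

def random_search(func, dim, budget, bounds=(-5.0, 5.0), seed=0):
    """Sample uniformly inside the box and track the best-so-far value."""
    rng = random.Random(seed)
    best = float("inf")
    for _ in range(budget):
        x = [rng.uniform(*bounds) for _ in range(dim)]
        best = min(best, func(x))
    return best

best = random_search(sphere, dim=5, budget=1000)  # best value within the budget
```

Any candidate optimizer, whether human-designed or LLM-generated, plugs into the same black-box contract: it only sees function values, never the function's formula.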

Included Search Methods

The suite contains state-of-the-art LLM-assisted search algorithms:

| Algorithm | Description | Link |
|-----------|-------------|------|
| LLaMEA | Large Language Model Evolutionary Algorithm | code, paper |
| EoH | Evolution of Heuristics | code, paper |
| FunSearch | Google’s GA-like algorithm | code, paper |
| ReEvo | Large Language Models as Hyper-Heuristics with Reflective Evolution | code, paper |

Note

Some of these algorithms are not yet integrated; their integration is planned soon.
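Despite their differences, these methods share an outer loop: ask an LLM for a candidate, evaluate it on the benchmark, and feed the result back. The sketch below illustrates only that loop shape, with a stub in place of the LLM call (it "proposes" a step size rather than code) and hypothetical names throughout; the actual iohblade interfaces differ.

```python
import random

def stub_llm(feedback, rng):
    """Stand-in for an LLM call: proposes a mutation step size in [0, 1]."""
    return rng.uniform(0.0, 1.0)

def evaluate(step_size, rng, dim=5, budget=200):
    """Score a candidate by running a (1+1)-style local search with it."""
    x = [rng.uniform(-5, 5) for _ in range(dim)]
    best = sum(v * v for v in x)
    for _ in range(budget):
        y = [v + rng.gauss(0, step_size) for v in x]
        fy = sum(v * v for v in y)
        if fy < best:
            x, best = y, fy
    return best

def evolve(generations=10, seed=0):
    rng = random.Random(seed)
    best_cand, best_score, feedback = None, float("inf"), ""
    for _ in range(generations):
        cand = stub_llm(feedback, rng)        # 1. LLM proposes a candidate
        score = evaluate(cand, rng)           # 2. benchmark the candidate
        if score < best_score:                # 3. elitist selection
            best_cand, best_score = cand, score
        feedback = f"last score: {score:.3g}" # 4. feed the result back
    return best_cand, best_score
```

In the real methods, step 1 returns generated algorithm code and step 4 returns richer feedback (errors, performance profiles, reflections), but the ask-evaluate-select-feedback cycle is the common skeleton.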

Supported LLM APIs

BLADE supports integration with various LLM APIs to facilitate automated design of algorithms:

| LLM Provider | Description | Integration Notes |
|--------------|-------------|-------------------|
| Gemini | Google’s multimodal LLM designed to process text, images, audio, and more. Reference | Accessible via the Gemini API, compatible with OpenAI libraries. Reference |
| OpenAI | Developer of the GPT series of models, including GPT-4, widely used for natural language understanding and generation. Reference | Integration through OpenAI’s REST API and client libraries. |
| Ollama | A platform offering access to various LLMs, enabling local and cloud-based model deployment. Reference | Integration details can be found in the official documentation. |
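Because Gemini exposes an OpenAI-compatible endpoint, all of these providers can be driven through the same OpenAI-style chat-completions request shape. The snippet below only assembles such a request with the standard library and never sends it; the base URL is Gemini's documented OpenAI-compatible endpoint, while the model name and API key are placeholders.

```python
import json
import urllib.request

def build_chat_request(base_url, api_key, model, prompt):
    """Assemble (but do not send) an OpenAI-style chat-completions request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

req = build_chat_request(
    "https://generativelanguage.googleapis.com/v1beta/openai",
    api_key="YOUR_KEY",              # placeholder credential
    model="gemini-1.5-flash",        # placeholder model name
    prompt="Propose a mutation operator for a (1+1)-ES.",
)
```

Swapping providers then amounts to changing `base_url` and `model` (e.g. a local Ollama endpoint), which is what makes a single framework-side LLM abstraction practical.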

Evaluating against Human Designed Baselines

An important part of BLADE is the final evaluation of generated algorithms against state-of-the-art human-designed algorithms. In the iohblade.baselines part of the package, several well-known SOTA black-box optimizers are implemented to compare against, including but not limited to CMA-ES and DE variants.

For the final validation, BLADE uses IOHprofiler, providing detailed tracking and visualization of performance metrics.
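To illustrate what such a baseline looks like in miniature, below is a deliberately bare-bones DE/rand/1/bin, one of the classic human-designed optimizers mentioned above. This is a didactic sketch under a fixed evaluation budget; the baselines shipped in `iohblade.baselines` are full implementations, and the real validation runs through IOHprofiler rather than a loop like this.

```python
import random

def sphere(x):
    """Toy objective for the demo: f(x) = sum(x_i^2)."""
    return sum(v * v for v in x)

def minimal_de(func, dim, budget, pop_size=20, F=0.5, CR=0.9, seed=0):
    """Bare-bones DE/rand/1/bin; returns the best objective value found."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    fitness = [func(ind) for ind in pop]
    evals = pop_size
    while evals < budget:
        for i in range(pop_size):
            if evals >= budget:
                break
            # Mutation: combine three distinct population members (not i).
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            j_rand = rng.randrange(dim)  # guarantees at least one mutant gene
            # Binomial crossover between mutant and target vector.
            trial = [
                a[j] + F * (b[j] - c[j]) if (rng.random() < CR or j == j_rand)
                else pop[i][j]
                for j in range(dim)
            ]
            f_trial = func(trial)
            evals += 1
            if f_trial <= fitness[i]:  # greedy one-to-one replacement
                pop[i], fitness[i] = trial, f_trial
    return min(fitness)
```

A generated algorithm that cannot beat even this stripped-down DE on standard suites is unlikely to be competitive, which is why such human-designed references anchor the final evaluation.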

🤖 Contributing

Contributions to BLADE are welcome! Here are a few ways you can help:

  • Report Bugs: Use GitHub Issues to report bugs.

  • Feature Requests: Suggest new features or improvements.

  • Pull Requests: Submit PRs for bug fixes or feature additions.

Please refer to CONTRIBUTING.md for more details on contributing guidelines.

License

Distributed under the MIT License. See LICENSE for more information.

Cite us

If you use BLADE in your research, please consider citing the associated paper:

TBA
