
Bayesian optimization in high dimensions: a journey through subspaces and challenges

Author

Summary, in English

This thesis explores the challenges and advancements in high-dimensional Bayesian optimization (HDBO), focusing on understanding, quantifying, and improving optimization techniques in high-dimensional spaces.
Bayesian optimization (BO) is a powerful method for optimizing expensive black-box functions, but its effectiveness diminishes as the dimensionality of the search space increases due to the curse of dimensionality. The thesis introduces novel algorithms and methodologies to make HDBO more practical.
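To make the setting concrete, the following is a minimal sketch of a single BO step on a toy one-dimensional problem, using a zero-mean Gaussian-process surrogate with a fixed RBF kernel and the expected-improvement acquisition function. The kernel hyperparameters, candidate grid, and toy objective are illustrative assumptions, not settings from the thesis.

```python
import numpy as np
from scipy.stats import norm

def rbf(a, b, lengthscale=0.2):
    """Squared-exponential kernel between two sets of 1-D points."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-0.5 * d2 / lengthscale**2)

def objective(x):
    """Expensive black-box function (here a cheap stand-in)."""
    return np.sin(3 * x) + 0.5 * x

# Observations collected so far.
X = np.array([0.1, 0.4, 0.9])
y = objective(X)

# GP posterior on a dense candidate grid.
noise = 1e-6
K_inv = np.linalg.inv(rbf(X, X) + noise * np.eye(len(X)))
grid = np.linspace(0.0, 1.0, 500)
k_star = rbf(grid, X)                              # cross-covariances
mu = k_star @ K_inv @ y                            # posterior mean
var = 1.0 - np.sum(k_star @ K_inv * k_star, axis=1)
sigma = np.sqrt(np.clip(var, 1e-12, None))

# Expected improvement over the best observation (minimization).
best = y.min()
z = (best - mu) / sigma
ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

x_next = grid[np.argmax(ei)]                       # next point to evaluate
print(f"next evaluation at x = {x_next:.3f}")
```

In higher dimensions, the dense candidate grid used here becomes infeasible and the surrogate becomes harder to fit, which is the practical face of the curse of dimensionality described above.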
Key contributions include the development of the BAxUS algorithm, which leverages nested subspaces to optimize high-dimensional problems without estimating the dimensionality of the effective subspace.
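As a rough intuition for subspace-based methods of this kind, the sketch below builds a sparse, axis-aligned linear embedding: each ambient dimension is assigned to exactly one subspace dimension with a random sign, so a point proposed by BO in the low-dimensional subspace expands to a point in the full space. The nested-subspace growing used in BAxUS itself is more involved; the dimensions and the `lift` helper below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
D, d = 100, 5                                   # ambient and subspace dimension

# Assign every ambient dimension to one subspace dimension with a random sign.
bins = rng.integers(0, d, size=D)
signs = rng.choice([-1.0, 1.0], size=D)
S = np.zeros((D, d))
S[np.arange(D), bins] = signs                   # exactly one nonzero per row

def lift(z):
    """Map a subspace point z in [-1, 1]^d to the ambient space [-1, 1]^D."""
    return S @ z

z = rng.uniform(-1.0, 1.0, size=d)              # candidate found by BO in the subspace
x = lift(z)                                     # point actually sent to the objective
print(x.shape, x.min(), x.max())
```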
Additionally, the Bounce algorithm extends these techniques to combinatorial and mixed spaces, providing robust solutions for real-world applications.
The thesis also examines how much acquisition functions explore, proposing new measures of exploration and strategies for designing more effective optimization approaches.
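As a hypothetical illustration of what such a measure could look like, the sketch below scores a proposed query by its distance to the nearest evaluated point, normalized by the largest pairwise distance among the observations. This proxy and the `exploration_score` helper are assumptions made for illustration, not the exploration metric developed in the thesis.

```python
import numpy as np
from scipy.spatial.distance import cdist

def exploration_score(x_next, X_observed):
    """Return >= 0: zero means the proposal coincides with an observation."""
    x_next = np.atleast_2d(x_next)
    dists = cdist(x_next, X_observed).ravel()
    scale = cdist(X_observed, X_observed).max() + 1e-12
    return dists.min() / scale

rng = np.random.default_rng(1)
X_observed = rng.uniform(0, 1, size=(20, 10))   # 20 evaluated points in 10-D
x_next = rng.uniform(0, 1, size=10)             # candidate from the acquisition function
print(f"exploration score: {exploration_score(x_next, X_observed):.3f}")
```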
Furthermore, this work analyzes why simple BO setups have recently shown promising performance in high-dimensional spaces, challenging the conventional belief that BO is limited to low-dimensional problems.
This thesis offers insights and recommendations for designing more efficient HDBO algorithms by identifying and addressing failure modes such as vanishing gradients and biases in model fitting. Through a combination of theoretical analysis, empirical evaluations, and practical implementations, this thesis contributes to the field of BO by advancing our understanding of high-dimensional optimization and providing actionable methods to improve its performance in complex scenarios.
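One way to see such a failure mode numerically, under the assumption of a fixed-lengthscale RBF kernel, is that covariances between random points collapse toward zero as the dimension grows, leaving the surrogate's predictions and the acquisition surface nearly flat away from the data. The snippet below is a small sketch of that effect; the lengthscale of 1.0 is an illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(2)
lengthscale = 1.0

for D in (2, 10, 50, 200, 1000):
    a = rng.uniform(0, 1, size=(1000, D))
    b = rng.uniform(0, 1, size=(1000, D))
    sq_dist = np.sum((a - b) ** 2, axis=1)
    k = np.exp(-0.5 * sq_dist / lengthscale**2)
    print(f"D = {D:4d}: mean kernel value between random points = {k.mean():.2e}")
```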

Publishing year

2025

Language

English

Full text

  • 19 MB

Document type

Dissertation

Publisher

Computer Science, Lund University

Topic

  • Probability Theory and Statistics

Keywords

  • optimization
  • Bayesian optimization
  • Gaussian process
  • machine learning

Status

Published

ISBN/ISSN/Other

  • ISBN: 978-91-8104-547-5
  • ISBN: 978-91-8104-548-2

Defence date

12 June 2025

Defence time

13:15

Defence place

Lecture Hall E:1406, building E, Klas Anshelms väg 10, Faculty of Engineering LTH, Lund University, Lund.

Opponent

  • Roman Garnett (Assoc. Prof.)