Introduction to BoTorch
BoTorch is a library for Bayesian Optimization built on top of PyTorch, the popular machine learning framework. As an evolving project currently in beta, BoTorch targets researchers and advanced practitioners working on Bayesian Optimization and AI. For end-users who are not focused on research, the BoTorch developers recommend Ax, a platform built on BoTorch that manages optimization tasks through a user-friendly interface.
Key Features
BoTorch offers several compelling features:
- Modular and Extensible: BoTorch provides a flexible interface for Bayesian optimization primitives. Users can easily compose probabilistic models, acquisition functions, and optimizers.
- Powered by PyTorch: Utilizing PyTorch’s capabilities, including auto-differentiation and GPU support, BoTorch enables efficient computations.
- Monte Carlo Acquisition Functions: BoTorch supports Monte Carlo-based acquisition functions via techniques such as the reparameterization trick, which makes it straightforward to implement new ideas without imposing restrictive assumptions on the underlying model.
- Integration with PyTorch Architectures: It allows for smooth integration with deep and convolutional network architectures.
- State-of-the-Art Models: Deep integration with GPyTorch enables advanced probabilistic models, such as Gaussian Processes, multi-task GPs, and deep kernel learning.
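To illustrate the idea behind Monte Carlo acquisition functions, here is a BoTorch-independent sketch (not BoTorch's actual implementation; the function names are hypothetical). With the reparameterization trick, samples from a Gaussian posterior N(mu, sigma²) are written as mu + sigma·z with base samples z ~ N(0, 1), so a Monte Carlo estimate of Expected Improvement remains a smooth function of mu and sigma, and gradients can flow through it:

```python
import numpy as np

def mc_expected_improvement(mu, sigma, best_f, n_samples=100_000, seed=0):
    """Monte Carlo estimate of EI at a point with posterior N(mu, sigma^2).

    Reparameterization: draw z ~ N(0, 1) once and express posterior samples
    as mu + sigma * z, so the estimate is differentiable in (mu, sigma).
    """
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_samples)  # base samples, independent of mu/sigma
    samples = mu + sigma * z            # reparameterized posterior samples
    return np.maximum(samples - best_f, 0.0).mean()

def analytic_expected_improvement(mu, sigma, best_f):
    """Closed-form EI for a Gaussian posterior, for comparison."""
    from math import erf, exp, pi, sqrt
    u = (mu - best_f) / sigma
    pdf = exp(-0.5 * u * u) / sqrt(2 * pi)
    cdf = 0.5 * (1 + erf(u / sqrt(2)))
    return sigma * (u * cdf + pdf)
```

With enough samples, the Monte Carlo estimate closely matches the closed form; the payoff is that the same recipe extends to batched (q > 1) and composite acquisition functions that have no closed form.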
Audience
BoTorch’s primary users are researchers and sophisticated AI practitioners. For those not deeply engaged in optimization research, Ax serves as a more accessible tool, offering flexible management of Bayesian Optimization tasks.
Installation
Requirements:
- Python 3.10 or higher
- PyTorch 2.0.1 or higher
- Other required packages: gpytorch 1.13, linear_operator 0.5.3, pyro-ppl 1.8.4 or newer, scipy, and multipledispatch.
macOS users with Intel processors: install PyTorch manually before installing BoTorch to ensure it is linked against MKL, which accelerates linear algebra on Intel CPUs.
Installation Options:
- Via Anaconda (recommended):

  ```bash
  conda install botorch -c pytorch -c gpytorch -c conda-forge
  ```

- Via pip:

  ```bash
  pip install botorch
  ```

- Latest development version: to access cutting-edge features, install directly from GitHub:

  ```bash
  pip install git+https://github.com/pytorch/botorch.git
  ```

- Editable installation: suitable for contributors; local changes to the source are reflected immediately without reinstalling.
Getting Started
A simple Bayesian optimization loop in BoTorch consists of:
- Fitting a Gaussian Process (GP) model:

  ```python
  import torch
  from botorch.models import SingleTaskGP
  from botorch.fit import fit_gpytorch_mll
  from gpytorch.mlls import ExactMarginalLogLikelihood

  # Synthetic training data on [0, 2]^2 with a noisy toy objective
  train_X = torch.rand(10, 2, dtype=torch.double) * 2
  Y = 1 - (train_X - 0.5).norm(dim=-1, keepdim=True)
  Y += 0.1 * torch.rand_like(Y)  # observation noise

  gp = SingleTaskGP(train_X, train_Y=Y)
  mll = ExactMarginalLogLikelihood(gp.likelihood, gp)
  fit_gpytorch_mll(mll)
  ```

- Constructing an acquisition function:

  ```python
  from botorch.acquisition import LogExpectedImprovement

  logEI = LogExpectedImprovement(model=gp, best_f=Y.max())
  ```

- Optimizing the acquisition function:

  ```python
  from botorch.optim import optimize_acqf

  bounds = torch.stack([torch.zeros(2), torch.ones(2)]).to(torch.double)
  candidate, acq_value = optimize_acqf(
      logEI, bounds=bounds, q=1, num_restarts=5, raw_samples=20
  )
  ```
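The three steps above compose into an iterative loop: fit the surrogate, maximize the acquisition function, evaluate the objective at the chosen point, and append the new observation. As a self-contained illustration of this overall structure, here is a minimal NumPy sketch with a hand-rolled GP posterior and analytic EI; it is not BoTorch's API, and all names below are hypothetical:

```python
import numpy as np
from math import erf, sqrt

def rbf_kernel(A, B, lengthscale=0.2):
    """Squared-exponential kernel between row-vector sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(train_X, train_Y, test_X, noise=1e-4):
    """Posterior mean and stddev of a constant-mean GP at test_X."""
    m = train_Y.mean()
    K = rbf_kernel(train_X, train_X) + noise * np.eye(len(train_X))
    Ks = rbf_kernel(test_X, train_X)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, train_Y - m))
    mu = Ks @ alpha + m
    v = np.linalg.solve(L, Ks.T)
    var = 1.0 - (v ** 2).sum(axis=0)  # k(x, x) = 1 for this kernel
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best_f):
    """Closed-form EI for a Gaussian posterior (maximization)."""
    u = (mu - best_f) / sigma
    cdf = np.vectorize(lambda x: 0.5 * (1 + erf(x / sqrt(2))))(u)
    pdf = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)
    return sigma * (u * cdf + pdf)

def objective(X):
    """Toy objective with maximum 1.0 at (0.5, 0.5)."""
    return 1 - np.linalg.norm(X - 0.5, axis=-1)

rng = np.random.default_rng(0)
train_X = rng.random((5, 2))
train_Y = objective(train_X)
for _ in range(10):
    candidates = rng.random((256, 2))  # crude stand-in for optimize_acqf
    mu, sigma = gp_posterior(train_X, train_Y, candidates)
    ei = expected_improvement(mu, sigma, train_Y.max())
    x_next = candidates[np.argmax(ei)]
    train_X = np.vstack([train_X, x_next])            # observe the objective
    train_Y = np.append(train_Y, objective(x_next))   # and append the result
```

In real use, BoTorch replaces each piece with a more capable counterpart: the hand-rolled GP with a fitted `SingleTaskGP`, the analytic EI with (Monte Carlo or log-space) acquisition functions, and the random candidate search with gradient-based multi-start optimization via `optimize_acqf`.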
Contribution and Citation
BoTorch is MIT licensed, and contributions are welcome. For those using BoTorch in research, citation details are provided:
M. Balandat et al., "BoTorch: A Framework for Efficient Monte-Carlo Bayesian Optimization," Advances in Neural Information Processing Systems 33, 2020.
In summary, BoTorch is a robust toolkit for Bayesian Optimization, offering comprehensive support for advanced users while remaining flexible and extensible enough for ongoing research and development.