KerasTuner
KerasTuner is an easy-to-use, scalable hyperparameter optimization framework that solves the common pain points of hyperparameter search. It lets you configure your search space with a simple define-by-run syntax, and it ships with several built-in search algorithms, including Random Search, Bayesian Optimization, and Hyperband. The framework is also designed so that new search algorithms can be plugged in, making it a good fit for researchers who want to experiment with their own search methods.
Installation
KerasTuner requires Python 3.8+ and TensorFlow 2.0+. Install the latest release with:

```shell
pip install keras-tuner
```
For more information and access to earlier versions, you can visit the GitHub repository.
Getting Started
To get started, import KerasTuner and TensorFlow:

```python
import keras_tuner
from tensorflow import keras
```
Next, write a function that creates and returns a Keras model. Use the `hp` argument to define the hyperparameters while building the model:
```python
def build_model(hp):
    model = keras.Sequential()
    # The number of units in the hidden layer is a tunable hyperparameter.
    model.add(keras.layers.Dense(
        hp.Choice('units', [8, 16, 32]),
        activation='relu'))
    model.add(keras.layers.Dense(1, activation='relu'))
    model.compile(loss='mse')
    return model
```
Then, initialize a tuner, for example `RandomSearch`. Use the `objective` argument to specify the metric to optimize, and `max_trials` to set the number of different models to try:
```python
tuner = keras_tuner.RandomSearch(
    build_model,
    objective='val_loss',
    max_trials=5)
```
Start the search and get the best model:

```python
tuner.search(x_train, y_train, epochs=5, validation_data=(x_val, y_val))
best_model = tuner.get_best_models()[0]
```
To learn more about KerasTuner, see the starter guide.
Contribution and Community
If you would like to contribute to KerasTuner, see CONTRIBUTING.md for guidelines. The KerasTuner team thanks all of its contributors. For questions and community interaction, join the conversation in GitHub Discussions.
Citing KerasTuner
If KerasTuner helps your research, please cite it as follows:
```bibtex
@misc{omalley2019kerastuner,
  title        = {KerasTuner},
  author       = {O'Malley, Tom and Bursztein, Elie and Long, James and Chollet, Fran\c{c}ois and Jin, Haifeng and Invernizzi, Luca and others},
  year         = 2019,
  howpublished = {\url{https://github.com/keras-team/keras-tuner}}
}
```