Keras Tuner is a hypertuning framework made for humans. It aims to make the life of AI practitioners, hypertuner algorithm creators, and model designers as simple as possible by providing them with a clean and easy-to-use API for hypertuning. Keras Tuner makes moving from a base model to a hypertuned one quick and easy, requiring you to change only a few lines of code.
A hyperparameter tuner for Keras, specifically for tf$keras with TensorFlow 2.0.
Full documentation and tutorials are available on the Keras Tuner website.
Currently, the package is available on GitHub:
devtools::install_github('EagerAI/kerastuneR')
Then, install the Python module kerastuner:
kerastuneR::install_kerastuner(python_path = 'paste python path')
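To confirm that the Python module is visible from R, a quick check via reticulate can help (a sketch; the exact result depends on which Python environment was configured):
library(reticulate)
# Should return TRUE once kerastuner is installed in the active Python environment
py_module_available('kerastuner')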
Here’s how to perform hyperparameter tuning for a single-layer dense neural network using random search.
First, we define a model-building function. It takes an argument hp from which you can sample hyperparameters, such as hp$Int('units', min_value = 32L, max_value = 512L, step = 32L) (an integer sampled from a given range).
Sample data:
x_data <- matrix(data = runif(500, 0, 1), nrow = 50, ncol = 5)
y_data <- ifelse(runif(50, 0, 1) > 0.6, 1L, 0L) %>% as.matrix()
x_data2 <- matrix(data = runif(500, 0, 1), nrow = 50, ncol = 5)
y_data2 <- ifelse(runif(50, 0, 1) > 0.6, 1L, 0L) %>% as.matrix()
This function returns a compiled model.
library(keras)
library(tensorflow)
library(kerastuneR)
library(dplyr)

# Builds and compiles a model, sampling hyperparameters from hp
build_model = function(hp) {
  model = keras_model_sequential()
  model %>% layer_dense(units = hp$Int('units',
                                       min_value = 32L,
                                       max_value = 512L,
                                       step = 32L),
                        input_shape = ncol(x_data),
                        activation = 'relu') %>%
    layer_dense(units = 1, activation = 'sigmoid') %>%
    compile(
      optimizer = tf$keras$optimizers$Adam(
        hp$Choice('learning_rate',
                  values = c(1e-2, 1e-3, 1e-4))),
      loss = 'binary_crossentropy',
      metrics = 'accuracy')
  return(model)
}
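Besides hp$Int and hp$Choice, the underlying Keras Tuner hp object also exposes other sampling methods. A small sketch (the dropout and use_bias names are purely illustrative and not part of the example above):
# Inside a model-building function you could also sample, for example:
# a float from a continuous range
dropout_rate = hp$Float('dropout', min_value = 0.0, max_value = 0.5, step = 0.1)
# a boolean flag
use_bias = hp$Boolean('use_bias')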
Next, instantiate a tuner. You should specify the model-building function, the name of the objective to optimize (whether to minimize or maximize is automatically inferred for built-in metrics), the total number of trials (max_trials) to test, and the number of models that should be built and fit for each trial (executions_per_trial).
Available tuners are RandomSearch and Hyperband.
Note: the purpose of having multiple executions per trial is to reduce the variance of the results and therefore assess the performance of a model more accurately. If you want to get results faster, you could set executions_per_trial = 1 (a single round of training for each model configuration).
tuner = RandomSearch(
  build_model,
  objective = 'val_accuracy',
  max_trials = 5,
  executions_per_trial = 3,
  directory = 'my_dir',
  project_name = 'helloworld')
You can print a summary of the search space:
tuner %>% search_summary()
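As an alternative to RandomSearch, a Hyperband tuner can be instantiated in much the same way. A minimal sketch, assuming the Hyperband() wrapper follows the Python Keras Tuner arguments (max_epochs, factor):
tuner_hb = Hyperband(
  build_model,
  objective = 'val_accuracy',
  max_epochs = 10,
  factor = 3,
  directory = 'my_dir',
  project_name = 'helloworld_hyperband')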
Then, start the search for the best hyperparameter configuration. The call to search has the same signature as model %>% fit(), but here, instead of fit(), we call fit_tuner().
tuner %>% fit_tuner(x_data, y_data,
                    epochs = 5,
                    validation_data = list(x_data2, y_data2))
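After the search finishes, the best models and an overview of the trials can be retrieved. A short sketch, assuming the get_best_models() and results_summary() helpers exported by kerastuneR:
# Retrieve the two best models found during the search
best_models = tuner %>% get_best_models(2)
# Print an overview of the completed trials
tuner %>% results_summary()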