Recently, the reticulate package added one of its most anticipated features: the ability to write a Python class directly in R.
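Here is a minimal sketch of what reticulate::PyClass looks like on its own (the Counter class, its count attribute, and the increment method are invented purely for illustration and have nothing to do with tuning):

Counter <- reticulate::PyClass(
  "Counter",
  defs = list(
    `__init__` = function(self, start = 0L) {
      self$count <- start
      NULL
    },
    increment = function(self) {
      self$count <- self$count + 1L
      self$count
    }
  )
)

counter <- Counter(start = 10L)
counter$increment()   # returns 11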
Instead of a plain model-building function, we can pass kerastuneR a HyperModel subclass, which makes it easy to share and reuse hypermodels. A HyperModel subclass only needs to implement a build(self, hp) method, and that method must return a compiled model.
library(keras)
library(tensorflow)
library(kerastuneR)

# Define a HyperModel subclass in R via reticulate::PyClass.
# build(self, hp) must return a compiled model.
MyHyperModel <- reticulate::PyClass(
  "HyperModel",
  inherit = kerastuneR::HyperModel_class(),
  list(
    `__init__` = function(self, num_classes) {
      self$num_classes <- num_classes
      NULL
    },
    build = function(self, hp) {
      model <- keras_model_sequential()
      model %>%
        # the number of hidden units is a tunable hyperparameter
        layer_dense(units = hp$Int('units',
                                   min_value = 32L,
                                   max_value = 512L,
                                   step = 32L),
                    activation = 'relu') %>%
        layer_dense(as.integer(self$num_classes), activation = 'softmax') %>%
        compile(
          # the learning rate is tuned over a fixed set of choices
          optimizer = tf$keras$optimizers$Adam(
            hp$Choice('learning_rate',
                      values = c(1e-2, 1e-3, 1e-4))),
          # the labels generated below are integers, so use the sparse loss
          loss = 'sparse_categorical_crossentropy',
          metrics = 'accuracy')
      model
    }
  )
)
# generate some data
x_data <- matrix(runif(50 * 5), nrow = 50, ncol = 5)        # 50 x 5 training predictors
y_data <- ifelse(runif(50) > 0.6, 1L, 0L) %>% as.matrix()   # integer class labels
x_data2 <- matrix(runif(50 * 5), nrow = 50, ncol = 5)       # validation predictors
y_data2 <- ifelse(runif(50) > 0.6, 1L, 0L) %>% as.matrix()  # validation labels
# Random Search
hypermodel <- MyHyperModel(num_classes = 10)

tuner <- RandomSearch(
  hypermodel,
  objective = 'val_accuracy',
  max_trials = 10,
  directory = 'my_dir',
  project_name = 'helloworld')
# Run
tuner %>% fit_tuner(x_data,y_data,
epochs = 5,
validation_data = list(x_data2, y_data2))
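Once the search finishes, the trials can be inspected and the best model retrieved. The calls below assume kerastuneR exposes the usual Keras Tuner helpers (results_summary() and get_best_models()); if your version does not export them, the same methods are available on the underlying Python tuner object via reticulate. Treat this as a sketch rather than the package's guaranteed API:

# summarise the trials and pull out the best model (assumed helpers)
tuner %>% results_summary()
best_models <- tuner %>% get_best_models(num_models = 1)  # returns a list of models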