
h2otools: Machine Learning Model Evaluation for ‘h2o’ Package


Model evaluation

There are plenty of procedures for evaluating machine learning models, many of which are not implemented in the h2o platform. This repository provides additional functions for model performance evaluation that h2o does not offer.

The bootperformance function evaluates the model on n bootstrapped samples drawn from the testing dataset, instead of evaluating it on the testing dataset only once. This yields a confidence interval for the model's performance rather than a single point estimate.
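For example, here is a minimal sketch of bootstrapped evaluation. The h2o calls are standard, but the exact arguments of bootperformance() shown here are an assumption based on the description above:

library(h2o)
library(h2otools)

h2o.init()

# build a binary outcome from the iris data
dat <- iris
dat$virginica <- as.factor(dat$Species == "virginica")
dat$Species <- NULL

hf <- as.h2o(dat)
splits <- h2o.splitFrame(hf, ratios = 0.7, seed = 2024)
model <- h2o.gbm(y = "virginica", training_frame = splits[[1]])

# evaluate on n bootstrapped samples of the testing set to obtain a
# confidence interval of the performance metric (argument names assumed)
res <- bootperformance(model, splits[[2]], metric = "AUC", n = 100)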

These functions are briefly described below:

Function          Description
----------------  -----------
automlModelParam  Extracts model parameters from an AutoML grid
bootperformance   Bootstrap performance evaluation
checkFrame        Checks the data.frame format, which is useful before uploading it to the H2O cloud
Fmeasure          Evaluates the F-measure for any beta (e.g., F3, F4, F5); h2o only provides F0.5, F1, and F2
getPerfMatrix     Retrieves the performance matrix for all thresholds
kappa             Calculates kappa for all thresholds
performance       Provides performance measures (AUC, AUCPR, MCC, kappa, etc.) using objects from the h2o package
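Continuing the sketch above (with model and splits as defined there), the threshold-based evaluation functions operate on a performance object from h2o. The argument names passed to the h2otools functions are assumptions based on the table:

perf <- h2o.performance(model, newdata = splits[[2]])

Fmeasure(perf, beta = 3)   # F3 score, which h2o itself does not report
h2otools::kappa(perf)      # kappa for all thresholds (namespaced to avoid base::kappa)
getPerfMatrix(perf)        # performance matrix for all thresholds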

Installation

You can install the latest stable release from CRAN:

install.packages("h2otools")
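Once installed, load the package. As a quick sanity check before uploading data to the H2O cloud, checkFrame() can be used as described above; its exact arguments are an assumption:

library(h2otools)

# validate a plain data.frame before uploading it to the H2O cloud
dat <- data.frame(x = rnorm(20),
                  y = factor(sample(c("yes", "no"), 20, replace = TRUE)))
checkFrame(dat)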
