The goal of higrad is to implement the Hierarchical Incremental GRAdient Descent (HiGrad) algorithm. Like stochastic gradient descent (SGD), HiGrad is a first-order algorithm for finding the minimizer of a function in online learning; in addition, it attaches confidence intervals to its predictions to assess their uncertainty.
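To see the kind of update HiGrad builds on, here is a minimal sketch of plain SGD for least squares in base R. This is not the package's implementation (it has none of HiGrad's thread splitting or interval construction), and the step-size schedule gamma * t^(-alpha) and its defaults are illustrative assumptions only.

sgd_lm <- function(x, y, gamma = 0.1, alpha = 0.505) {
  # start from the zero vector
  theta <- rep(0, ncol(x))
  # one pass over the data: each sample is used exactly once
  for (t in seq_len(nrow(x))) {
    # gradient of the squared loss 0.5 * (x' theta - y)^2 at a single sample
    resid <- as.numeric(crossprod(x[t, ], theta)) - y[t]
    # decaying step size, a common choice for SGD
    theta <- theta - gamma * t^(-alpha) * resid * x[t, ]
  }
  theta
}

On the simulated data from the example below, sgd_lm(x, y) returns a point estimate of theta but, unlike higrad(), no measure of its uncertainty.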
This is a basic example which shows how to fit a linear regression with higrad on simulated data. The predictions obtained at the end come with 95% confidence intervals.
library(higrad)
# generate a data set for linear regression
n <- 1e6
d <- 50
sigma <- 1
theta <- rep(0, d)
x <- matrix(rnorm(n * d), n, d)
y <- as.numeric(x %*% theta + rnorm(n, 0, sigma))
# fit the linear regression with higrad using the default setting
fit <- higrad(x, y, model = "lm")
# predict for 10 new samples
newx <- matrix(rnorm(10 * d), 10, d)
pred <- predict(fit, newx)
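The exact structure of the fitted object and of pred depends on the package version; str() works on any R object, so it is a safe way to inspect what came back without assuming anything about higrad's return types.

# inspect the fitted model and the prediction object
str(fit)
str(pred)
# If, as the package manual suggests, pred is a list with elements `pred`,
# `upper`, and `lower` (an assumption here, not verified), the 95%
# intervals could be tabulated as:
# data.frame(fit = pred$pred, lower = pred$lower, upper = pred$upper)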