Title: | Header-Only C++ Mathematical Optimization Library for 'Armadillo' |
Version: | 0.2.22.1.1 |
Description: | 'Ensmallen' is a templated C++ mathematical optimization library (by the 'MLPACK' team) that provides a simple set of abstractions for writing an objective function to optimize. Provided within are various standard and cutting-edge optimizers, including full-batch gradient descent techniques, small-batch techniques, gradient-free optimizers, and constrained optimization. The 'RcppEnsmallen' package includes the header files from the 'Ensmallen' library and pairs them with the appropriate 'Armadillo' header files through the 'RcppArmadillo' package. Therefore, users do not need to install 'Ensmallen' or 'Armadillo' to use 'RcppEnsmallen'. Note that 'Ensmallen' is licensed under the 3-Clause BSD license, 'Armadillo' starting from 7.800.0 is licensed under the Apache License 2.0, and 'RcppArmadillo' (the 'Rcpp' bindings/bridge to 'Armadillo') is licensed under the GNU GPL version 2 or later; thus, 'RcppEnsmallen' is licensed under similar terms. Note that 'Ensmallen' requires a compiler that supports 'C++14' and 'Armadillo' 10.8.2 or later. |
Depends: | R (≥ 4.0.0) |
License: | GPL-2 | GPL-3 [expanded from: GPL (≥ 2)] |
URL: | https://github.com/coatless-rpkg/rcppensmallen, https://r-pkg.thecoatlessprofessor.com/rcppensmallen/, https://github.com/mlpack/ensmallen, https://ensmallen.org/ |
BugReports: | https://github.com/coatless-rpkg/rcppensmallen/issues |
Encoding: | UTF-8 |
LinkingTo: | Rcpp, RcppArmadillo (≥ 0.10.8.2.0) |
Imports: | Rcpp |
RoxygenNote: | 7.3.2 |
Suggests: | knitr, rmarkdown |
VignetteBuilder: | knitr |
NeedsCompilation: | yes |
Packaged: | 2024-12-03 09:28:03 UTC; ronin |
Author: | James Joseph Balamuta [aut, cre, cph], Dirk Eddelbuettel [aut, cph] |
Maintainer: | James Joseph Balamuta <balamut2@illinois.edu> |
Repository: | CRAN |
Date/Publication: | 2024-12-03 14:00:01 UTC |
RcppEnsmallen: Header-Only C++ Mathematical Optimization Library for 'Armadillo'
Description
'Ensmallen' is a templated C++ mathematical optimization library (by the 'MLPACK' team) that provides a simple set of abstractions for writing an objective function to optimize. Provided within are various standard and cutting-edge optimizers, including full-batch gradient descent techniques, small-batch techniques, gradient-free optimizers, and constrained optimization. The 'RcppEnsmallen' package includes the header files from the 'Ensmallen' library and pairs them with the appropriate 'Armadillo' header files through the 'RcppArmadillo' package. Therefore, users do not need to install 'Ensmallen' or 'Armadillo' to use 'RcppEnsmallen'. Note that 'Ensmallen' is licensed under the 3-Clause BSD license, 'Armadillo' starting from 7.800.0 is licensed under the Apache License 2.0, and 'RcppArmadillo' (the 'Rcpp' bindings/bridge to 'Armadillo') is licensed under the GNU GPL version 2 or later; thus, 'RcppEnsmallen' is licensed under similar terms. Note that 'Ensmallen' requires a compiler that supports 'C++14' and 'Armadillo' 10.8.2 or later.
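To illustrate the abstraction, below is a minimal C++ sketch of how an objective function can be optimized from R through 'RcppEnsmallen'. It follows the documented 'ensmallen' pattern of a class exposing Evaluate() and Gradient() that is passed to ens::L_BFGS::Optimize(); the names ShiftedQuadratic and minimize_shifted_quadratic are purely illustrative and are not part of the package.

// [[Rcpp::depends(RcppEnsmallen)]]
#include <RcppEnsmallen.h>

// Toy differentiable objective: f(x) = (x - a)' (x - a), minimized at x = a.
// Any class exposing Evaluate() and Gradient() with these signatures can be
// handed to ensmallen's differentiable-function optimizers.
class ShiftedQuadratic {
public:
  ShiftedQuadratic(const arma::vec& a) : a(a) { }

  // Objective value at x.
  double Evaluate(const arma::mat& x) {
    return arma::dot(x - a, x - a);
  }

  // Gradient of the objective at x, written into g.
  void Gradient(const arma::mat& x, arma::mat& g) {
    g = 2.0 * (x - a);
  }

private:
  const arma::vec& a;  // target point
};

// [[Rcpp::export]]
arma::mat minimize_shifted_quadratic(const arma::vec& a) {
  ShiftedQuadratic f(a);
  ens::L_BFGS lbfgs;                            // default optimizer settings
  arma::mat x(a.n_elem, 1, arma::fill::zeros);  // starting point
  lbfgs.Optimize(f, x);                         // x is updated in place
  return x;                                     // should be (close to) a
}

After compiling the file with Rcpp::sourceCpp(), calling minimize_shifted_quadratic(c(1, 2, 3)) from R should return values near (1, 2, 3).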
Author(s)
Maintainer: James Joseph Balamuta balamut2@illinois.edu (ORCID) [copyright holder]
Authors:
Dirk Eddelbuettel edd@debian.org (ORCID) [copyright holder]
See Also
Useful links:
https://github.com/coatless-rpkg/rcppensmallen
https://r-pkg.thecoatlessprofessor.com/rcppensmallen/
https://github.com/mlpack/ensmallen
https://ensmallen.org/
Report bugs at https://github.com/coatless-rpkg/rcppensmallen/issues
Linear Regression with L-BFGS
Description
Estimates the coefficients of a linear regression by minimizing the residual sum of squares (RSS) with the L-BFGS optimizer.
Usage
lin_reg_lbfgs(X, y)
Arguments
X | A matrix of predictor values (the design matrix), with one row per observation and one column per coefficient in \beta.
y | A vector of response values, with one entry per observation.
Details
Consider the Residual Sum of Squares, also known as RSS, defined as:
RSS\left( \beta \right) = \left( { \mathbf{y} - \mathbf{X} \beta } \right)^{T} \left( \mathbf{y} - \mathbf{X} \beta \right)
The objective function is defined as:
f(\beta) = \left( \mathbf{y} - \mathbf{X} \beta \right)^{T} \left( \mathbf{y} - \mathbf{X} \beta \right)
The gradient is defined as:
\frac{\partial RSS}{\partial \beta} = -2 \mathbf{X}^{T} \left(\mathbf{y} - \mathbf{X} \beta \right)
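As a concrete illustration, here is a minimal C++ sketch of how such a function could be written with 'RcppEnsmallen'. It follows the Evaluate()/Gradient() pattern shown in the package's vignette but is not necessarily the package's exact source; it simply translates the objective and gradient above into a class handed to ens::L_BFGS.

// [[Rcpp::depends(RcppEnsmallen)]]
#include <RcppEnsmallen.h>

// Differentiable objective for linear regression: the RSS and its gradient,
// matching the formulas given above.
class LinearRegressionFunction {
public:
  LinearRegressionFunction(const arma::mat& X, const arma::vec& y)
    : X(X), y(y) { }

  // RSS(beta) = (y - X beta)' (y - X beta)
  double Evaluate(const arma::mat& beta) {
    const arma::vec r = y - X * beta;
    return arma::dot(r, r);
  }

  // dRSS/dbeta = -2 X' (y - X beta)
  void Gradient(const arma::mat& beta, arma::mat& g) {
    g = -2.0 * X.t() * (y - X * beta);
  }

private:
  const arma::mat& X;  // design matrix (n x p)
  const arma::vec& y;  // responses (n x 1)
};

// [[Rcpp::export]]
arma::mat lin_reg_lbfgs(const arma::mat& X, const arma::vec& y) {
  LinearRegressionFunction f(X, y);
  ens::L_BFGS lbfgs;                               // default L-BFGS settings
  arma::mat beta(X.n_cols, 1, arma::fill::zeros);  // starting values
  lbfgs.Optimize(f, beta);                         // beta is updated in place
  return beta;
}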
Value
The estimated \beta parameter values for the linear regression.
Examples
# Number of observations
n = 1000
# Select beta parameters
beta = c(-2, 1.5, 3, 8.2, 6.6)
# Number of Predictors (including intercept)
p = length(beta)
# Generate predictors from a standard normal distribution
X_i = matrix(rnorm(n * (p - 1)), nrow = n, ncol = p - 1)
# Add an intercept
X = cbind(1, X_i)
# Generate responses with standard normal noise
y = X %*% beta + rnorm(n)
# Run optimization with lbfgs
theta_hat = lin_reg_lbfgs(X, y)
# Verify parameters were recovered
cbind(actual = beta, estimated = theta_hat)