optimflex provides a highly flexible suite of derivative-based non-linear optimization algorithms. It is designed for researchers who require rigorous convergence control, particularly in complex models such as structural equation models (SEM).

Standard optimization functions often rely on a single, fixed stopping rule. optimflex instead exposes several convergence criteria that can be enabled and combined individually through the `control` argument (see the table of flags below).
```r
# install.packages("devtools")
# devtools::install_github("yourusername/optimflex")

library(optimflex)
```
```r
rosenbrock <- function(x) {
  100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2
}

res <- double_dogleg(
  start     = c(-1.2, 1.0),
  objective = rosenbrock,
  control   = list(
    use_grad   = TRUE,
    use_rel_x  = TRUE,
    use_posdef = TRUE
  )
)

print(res$par)
#> [1] 0.9999955 0.9999910
```
The `control` list understands the following flags:

| Flag | Description |
|---|---|
| `use_abs_f` | Absolute function change |
| `use_rel_f` | Relative function change |
| `use_abs_x` | Absolute parameter change |
| `use_rel_x` | Relative parameter change |
| `use_grad` | Gradient infinity norm |
| `use_posdef` | Hessian positive-definiteness check |
| `use_pred_f` | Predicted function decrease |
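
The flags can be combined to make the stopping rule as strict as a given model requires. The sketch below reuses the `rosenbrock` objective from the quick-start example; it assumes that any flag left out of the `control` list keeps its package default and that the default tolerances are appropriate for this problem.

```r
# A stricter run: require a small relative change in the objective,
# a small gradient infinity norm, and a positive-definite Hessian
# before the algorithm is allowed to stop.
res_strict <- double_dogleg(
  start     = c(-1.2, 1.0),
  objective = rosenbrock,
  control   = list(
    use_rel_f  = TRUE,
    use_grad   = TRUE,
    use_posdef = TRUE
  )
)

# The Rosenbrock minimum is at (1, 1), so the estimate can be checked
# against the known optimum.
all.equal(res_strict$par, c(1, 1), tolerance = 1e-4)
```

Requiring several criteria at once typically costs a few extra iterations in exchange for a more defensible convergence claim.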