An optim-style implementation of the Stochastic Quasi-Gradient Differential Evolution (SQG-DE) optimization algorithm, first published by Sala, Baldanzini, and Pierini (2018; <doi:10.1007/978-3-319-72926-8_27>). The algorithm fuses the robustness of the population-based global optimizer 'Differential Evolution' with the efficiency of gradient-based optimization. It is derivative-free: stochastic gradient estimates are built from the population members themselves, without any additional objective function evaluations. Sala, Baldanzini, and Pierini argue that the algorithm is useful for 'difficult optimization problems under a tight function evaluation budget.' The package can run SQG-DE either sequentially or in parallel.
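To make the quasi-gradient idea concrete, below is a minimal, self-contained R sketch of how a gradient estimate can be assembled from population members that have already been evaluated, so no extra objective calls are needed. This is an illustration of the general idea only, not the package's code or the exact estimator of Sala, Baldanzini, and Pierini; all names, the population size, the number of sampled pairs, and the step size are hypothetical choices.

    ## Illustrative sketch only: NOT the graDiEnt implementation of SQG-DE.
    ## Shows how a stochastic quasi-gradient can be built from an
    ## already-evaluated population, with no additional objective calls.
    sphere_fn <- function(x) sum(x^2)          # toy objective

    n_pop  <- 20                               # population size (assumed)
    n_dim  <- 5
    pop    <- matrix(runif(n_pop * n_dim, -5, 5), nrow = n_pop)
    f_vals <- apply(pop, 1, sphere_fn)         # objective values already in hand

    ## Quasi-gradient at the current best member: average secant estimates
    ## formed from randomly sampled population members.
    best       <- which.min(f_vals)
    quasi_grad <- rep(0, n_dim)
    idx        <- sample(setdiff(seq_len(n_pop), best), 10, replace = TRUE)
    for (j in idx) {
      d <- pop[j, ] - pop[best, ]
      quasi_grad <- quasi_grad + (f_vals[j] - f_vals[best]) * d / sum(d^2)
    }
    quasi_grad <- quasi_grad / length(idx)

    ## One gradient-informed step (step size is an arbitrary illustrative value).
    candidate <- pop[best, ] - 0.5 * quasi_grad
    sphere_fn(candidate) < f_vals[best]        # often TRUE on this convex toy problem

In the package itself, estimates of this kind are combined with the usual Differential Evolution population updates behind an optim-style interface; see the reference manual (graDiEnt.pdf) linked below for the exported functions and their arguments.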
Version: 1.0.1
Depends: R (≥ 3.5.0)
Imports: stats, doParallel
Published: 2022-05-10
DOI: 10.32614/CRAN.package.graDiEnt
Author: Brendan Matthew Galdo [aut, cre]
Maintainer: Brendan Matthew Galdo <Brendan.m.galdo at gmail.com>
BugReports: https://github.com/bmgaldo/graDiEnt
License: MIT + file LICENSE
URL: https://github.com/bmgaldo/graDiEnt
NeedsCompilation: no
Materials: README NEWS
In views: Optimization
CRAN checks: graDiEnt results
Reference manual: graDiEnt.pdf
Package source: graDiEnt_1.0.1.tar.gz
Windows binaries: r-devel: graDiEnt_1.0.1.zip, r-release: graDiEnt_1.0.1.zip, r-oldrel: graDiEnt_1.0.1.zip
macOS binaries: r-release (arm64): graDiEnt_1.0.1.tgz, r-oldrel (arm64): graDiEnt_1.0.1.tgz, r-release (x86_64): graDiEnt_1.0.1.tgz, r-oldrel (x86_64): graDiEnt_1.0.1.tgz
Please use the canonical form https://CRAN.R-project.org/package=graDiEnt to link to this page.