Provides general-purpose tools that help users implement steepest gradient descent methods for function optimization; for details see Ruder (2016) <doi:10.48550/arXiv.1609.04747>. The methods currently implemented are Steepest 2-Groups Gradient Descent and Adaptive Moment Estimation (Adam); other methods will be implemented in the future.
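For orientation, the sketch below shows the Adam update rule described in Ruder (2016) written out in plain R. It is an illustration of the method only, not the package's own implementation; the objective function, parameter names, and default settings are assumptions chosen for the example.

```r
## Minimal Adam sketch (Ruder, 2016), for illustration only; not optimg's code.
## All names and defaults here are assumptions made for this example.
adam <- function(par, fn, gr, lr = 0.01, beta1 = 0.9, beta2 = 0.999,
                 eps = 1e-8, maxit = 1000, tol = 1e-8) {
  m <- v <- numeric(length(par))        # first and second moment estimates
  for (t in seq_len(maxit)) {
    g <- gr(par)                        # gradient at current parameters
    m <- beta1 * m + (1 - beta1) * g    # biased first moment
    v <- beta2 * v + (1 - beta2) * g^2  # biased second moment
    m_hat <- m / (1 - beta1^t)          # bias-corrected first moment
    v_hat <- v / (1 - beta2^t)          # bias-corrected second moment
    step <- lr * m_hat / (sqrt(v_hat) + eps)
    par <- par - step
    if (max(abs(step)) < tol) break     # stop once updates become negligible
  }
  list(par = par, value = fn(par), iterations = t)
}

## Example: minimise a simple quadratic, f(x) = sum((x - 3)^2)
fn <- function(x) sum((x - 3)^2)
gr <- function(x) 2 * (x - 3)
adam(par = c(0, 0), fn = fn, gr = gr)
```

In practice, users would call the package's own optimizer, which follows an `optim()`-style interface; see the reference manual (optimg.pdf) linked below for the exact function signature.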
| Version: | 0.1.2 |
| Imports: | ucminf (≥ 1.1-4) |
| Published: | 2021-10-07 |
| DOI: | 10.32614/CRAN.package.optimg |
| Author: | Vithor Rosa Franco |
| Maintainer: | Vithor Rosa Franco <vithorfranco at gmail.com> |
| License: | GPL-3 |
| URL: | https://github.com/vthorrf/optimg |
| NeedsCompilation: | no |
| CRAN checks: | optimg results |
| Reference manual: | optimg.pdf |
| Package source: | optimg_0.1.2.tar.gz |
| Windows binaries: | r-devel: optimg_0.1.2.zip, r-release: optimg_0.1.2.zip, r-oldrel: optimg_0.1.2.zip |
| macOS binaries: | r-release (arm64): optimg_0.1.2.tgz, r-oldrel (arm64): optimg_0.1.2.tgz, r-release (x86_64): optimg_0.1.2.tgz, r-oldrel (x86_64): optimg_0.1.2.tgz |
| Reverse imports: | skipTrack |
Please use the canonical form https://CRAN.R-project.org/package=optimg to link to this page.