transformer: Implementation of Transformer Deep Neural Network with Vignettes

The Transformer is a deep neural network architecture based, among other things, on the attention mechanism (Vaswani et al. (2017) <doi:10.48550/arXiv.1706.03762>).
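
For orientation, the following is a minimal base-R sketch of the scaled dot-product attention described in Vaswani et al. (2017), the mechanism the package builds on. The function and variable names here are illustrative only and are not necessarily the exports of the 'transformer' or 'attention' packages.

    # Row-wise, numerically stable softmax
    softmax_rows <- function(x) {
      x <- x - apply(x, 1, max)      # subtract each row's max for stability
      exp_x <- exp(x)
      exp_x / rowSums(exp_x)         # normalise so each row sums to 1
    }

    # Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V
    # (illustrative name, not the package's API)
    scaled_dot_product_attention <- function(Q, K, V) {
      d_k <- ncol(K)
      scores  <- (Q %*% t(K)) / sqrt(d_k)  # query-key similarities
      weights <- softmax_rows(scores)      # attention weights
      weights %*% V                        # weighted sum of values
    }

    set.seed(1)
    Q <- matrix(rnorm(12), nrow = 3)  # 3 queries, d_k = 4
    K <- matrix(rnorm(12), nrow = 3)  # 3 keys,    d_k = 4
    V <- matrix(rnorm(6),  nrow = 3)  # 3 values,  d_v = 2
    scaled_dot_product_attention(Q, K, V)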

Version: 0.2.0
Imports: attention (≥ 0.4.0)
Suggests: covr, testthat (≥ 3.0.0)
Published: 2023-11-10
DOI: 10.32614/CRAN.package.transformer
Author: Bastiaan Quast [aut, cre]
Maintainer: Bastiaan Quast <bquast at gmail.com>
License: MIT + file LICENSE
NeedsCompilation: no
Materials: README
CRAN checks: transformer results

Documentation:

Reference manual: transformer.pdf

Downloads:

Package source: transformer_0.2.0.tar.gz
Windows binaries: r-devel: transformer_0.2.0.zip, r-release: transformer_0.2.0.zip, r-oldrel: transformer_0.2.0.zip
macOS binaries: r-release (arm64): transformer_0.2.0.tgz, r-oldrel (arm64): transformer_0.2.0.tgz, r-release (x86_64): transformer_0.2.0.tgz, r-oldrel (x86_64): transformer_0.2.0.tgz
Old sources: transformer archive
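
The released version can be installed from CRAN in the usual way:

    # Install the released version from CRAN
    install.packages("transformer")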

Linking:

Please use the canonical form https://CRAN.R-project.org/package=transformer to link to this page.
