
HTLR: Bayesian Logistic Regression with Heavy-Tailed Priors

Efficient Bayesian multinomial logistic regression based on heavy-tailed (hyper-LASSO, non-convex) priors. The posterior of the coefficients and hyper-parameters is sampled with restricted Gibbs sampling, which exploits the high dimensionality of the coefficient vector, and Hamiltonian Monte Carlo, which handles the high correlation among coefficients. A detailed description of the method is given in Li and Yao (2018), Journal of Statistical Computation and Simulation, 88:14, 2827-2851, <doi:10.48550/arXiv.1405.3319>.
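
A minimal usage sketch in R, hedged: it assumes the package's main fitting function is htlr() with a design-matrix/response interface and that fitted objects have a predict() method, as indicated by the reference manual and the "intro" vignette; exact argument names and defaults should be checked there before use.

    library(HTLR)

    ## Toy 3-class problem: 100 cases, 20 features, labels coded 1..3.
    set.seed(1)
    n <- 100; p <- 20
    X <- matrix(rnorm(n * p), n, p)
    y <- sample(1:3, n, replace = TRUE)

    ## Fit Bayesian multinomial logistic regression with heavy-tailed priors;
    ## sampling settings (iterations, thinning, leapfrog steps, ...) are
    ## controlled by arguments documented in the reference manual (HTLR.pdf).
    fit <- htlr(X, y)

    ## Posterior predictive class probabilities for a few rows of new data.
    pred <- predict(fit, X[1:5, ])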

Version: 0.4-4
Depends: R (≥ 3.1.0)
Imports: Rcpp (≥ 0.12.0), BCBCSF, glmnet, magrittr
LinkingTo: Rcpp (≥ 0.12.0), RcppArmadillo
Suggests: ggplot2, corrplot, testthat (≥ 2.1.0), bayesplot, knitr, rmarkdown
Published: 2022-10-22
DOI: 10.32614/CRAN.package.HTLR
Author: Longhai Li [aut, cre], Steven Liu [aut]
Maintainer: Longhai Li <longhai at math.usask.ca>
BugReports: https://github.com/longhaiSK/HTLR/issues
License: GPL-3
URL: https://longhaisk.github.io/HTLR/
NeedsCompilation: yes
SystemRequirements: C++11
Citation: HTLR citation info
Materials: README NEWS
CRAN checks: HTLR results

Documentation:

Reference manual: HTLR.pdf
Vignettes: intro

Downloads:

Package source: HTLR_0.4-4.tar.gz
Windows binaries: r-devel: HTLR_0.4-4.zip, r-release: HTLR_0.4-4.zip, r-oldrel: HTLR_0.4-4.zip
macOS binaries: r-release (arm64): HTLR_0.4-4.tgz, r-oldrel (arm64): HTLR_0.4-4.tgz, r-release (x86_64): HTLR_0.4-4.tgz, r-oldrel (x86_64): HTLR_0.4-4.tgz
Old sources: HTLR archive
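
The CRAN release and the source tarball listed above can be installed with the standard R commands (generic R calls, not package-specific):

    ## Install the released version from CRAN.
    install.packages("HTLR")

    ## Or install the downloaded source tarball; a C++11-capable toolchain is
    ## needed because the package links to Rcpp and RcppArmadillo.
    install.packages("HTLR_0.4-4.tar.gz", repos = NULL, type = "source")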

Linking:

Please use the canonical form https://CRAN.R-project.org/package=HTLR to link to this page.
